fix: ensure AI provider config is saved correctly in admin

URGENT FIX: Admin form was not properly saving AI provider configuration,
causing 'AI_PROVIDER_TAGS is not configured' error even after setting OpenAI.

Changes:
- admin-settings-form.tsx: Added validation and error handling
- admin-settings.ts: Filter empty values before saving to DB
- setup-openai.ts: Script to initialize OpenAI as default provider

This fixes the critical bug where users couldn't use the app after
configuring OpenAI in the admin interface.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
sepehr 2026-01-12 23:16:26 +01:00
parent 8617117dec
commit 2393cacf35
4 changed files with 277 additions and 35 deletions

AI-PROVIDER-FIX.md Normal file

@@ -0,0 +1,167 @@
# 🔧 AI Provider Fix - Deployment Instructions
## ✅ What was fixed (3 full code reviews)
### **CRITICAL PROBLEM FOUND**
The application defaulted to **Ollama** even when OpenAI was configured in the admin interface!
### **Review #1: Factory with 'ollama' fallbacks**
```typescript
// ❌ BEFORE - dangerous code:
const providerType = (config?.AI_PROVIDER_TAGS || process.env.AI_PROVIDER_TAGS || 'ollama')
// ✅ AFTER - correct code:
const providerType = (config?.AI_PROVIDER_TAGS || process.env.AI_PROVIDER_TAGS);
if (!providerType) {
  throw new Error('AI_PROVIDER_TAGS is not configured...');
}
```
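Pulled out of the factory, the fail-fast lookup reduces to a small helper. This is a sketch only; `resolveTagsProvider` and its argument shapes are illustrative, not the project's actual factory signature:

```typescript
type ProviderConfig = { AI_PROVIDER_TAGS?: string };

// Resolve the tags provider: DB config first, then environment.
// There is deliberately no 'ollama' fallback; an unset provider is a
// configuration error and should fail loudly.
export function resolveTagsProvider(
  config: ProviderConfig | undefined,
  env: Record<string, string | undefined>
): string {
  const providerType = config?.AI_PROVIDER_TAGS || env.AI_PROVIDER_TAGS;
  if (!providerType) {
    throw new Error('AI_PROVIDER_TAGS is not configured');
  }
  return providerType;
}
```

The DB config wins over the environment, so what the admin saved in the interface always takes precedence.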
### **Review #2: docker-compose with a default OLLAMA_BASE_URL**
```yaml
# ❌ BEFORE - always defined Ollama:
- OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://ollama:11434}
# ✅ AFTER - only if explicitly set:
- OLLAMA_BASE_URL=${OLLAMA_BASE_URL}
```
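Compose reuses shell-style parameter expansion here: `${VAR:-default}` substitutes the default whenever the variable is unset or empty, while plain `${VAR}` expands to an empty string, so the container ends up with no Ollama URL at all. A quick demonstration in any POSIX shell:

```shell
unset OLLAMA_BASE_URL

# The :- form falls back to the default when the variable is unset or empty
echo "old compose line -> ${OLLAMA_BASE_URL:-http://ollama:11434}"

# The bare form stays empty, leaving the provider choice to the DB config
echo "new compose line -> '${OLLAMA_BASE_URL}'"
```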
### **Review #3: Default values in all endpoints**
- **8 API routes** fixed to stop falling back to 'ollama'
- **Config endpoint** returns 'not set' instead of 'ollama'
- **3 admin forms** updated
## 📋 Modified files
1. ✅ `lib/ai/factory.ts` - No more 'ollama' fallback, clear errors
2. ✅ `docker-compose.yml` - No more OLLAMA_BASE_URL default
3. ✅ `components/sidebar.tsx` - Memento (not Mento)
4. ✅ `README.md` - Memento
5. ✅ `deploy.sh` - Memento
6. ✅ `DOCKER_DEPLOYMENT.md` - Memento
7. ✅ `app/api/ai/config/route.ts` - Shows 'not set' instead of 'ollama'
## 🚀 Docker deployment
```bash
# On your Proxmox server
cd /path/to/Keep

# 1. Pull the latest changes
git pull

# 2. Check that the database has the right config
cd keep-notes
npx tsx scripts/debug-config.ts
# Should print:
# AI_PROVIDER_TAGS: "openai"
# AI_PROVIDER_EMBEDDING: "openai"

# 3. Stop the containers
cd ..
docker compose down

# 4. Rebuild (--no-cache is IMPORTANT so the changes are applied)
docker compose build --no-cache keep-notes

# 5. Start
docker compose up -d

# 6. Check the logs
docker compose logs -f keep-notes
```
## 🧪 Post-deployment tests
### 1. Check the config via the API
```bash
curl http://192.168.1.190:3000/api/debug/config
```
**Expected result:**
```json
{
  "AI_PROVIDER_TAGS": "openai",
  "AI_PROVIDER_EMBEDDING": "openai",
  "OPENAI_API_KEY": "set (hidden)"
}
```
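The `set (hidden)` value suggests the debug endpoint masks secrets before returning them to the client. A minimal version of that masking, as a hypothetical helper (the convention that keys ending in `_KEY` are secret is an assumption for illustration, not the project's actual rule):

```typescript
// Replace secret values with a placeholder before exposing config over HTTP.
// Assumes any key ending in '_KEY' holds a secret (illustrative convention).
export function maskSecrets(
  config: Record<string, string | undefined>
): Record<string, string> {
  const masked: Record<string, string> = {};
  for (const [key, value] of Object.entries(config)) {
    if (value === undefined || value === '') {
      masked[key] = 'not set';
    } else if (key.endsWith('_KEY')) {
      masked[key] = 'set (hidden)';
    } else {
      masked[key] = value;
    }
  }
  return masked;
}
```

Returning `'not set'` for missing keys is what makes the earlier "returns 'not set' instead of 'ollama'" behavior observable from `curl`.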
### 2. Test the AI endpoints
```bash
# Test embeddings
curl http://192.168.1.190:3000/api/ai/test
```
**Expected result:**
```json
{
  "success": true,
  "tagsProvider": "openai",
  "embeddingsProvider": "openai",
  "details": {
    "provider": "OpenAI",
    "baseUrl": "https://api.openai.com/v1"
  }
}
```
### 3. Test in the interface
Open http://192.168.1.190:3000 and test:
1. **Admin Settings** → **AI Settings**
   - Check: Tags Provider = "openai"
   - Check: Embeddings Provider = "openai"
2. **Create a note** and test:
   - ✨ **Title generation** (Ctrl+M)
   - 🏷️ **Auto-labels**
   - 📝 **Clarify / Shorten / Improve Style** (the functions that were failing!)
3. **Check the Docker logs** - NO more "OLLAMA error NOT found" errors
## ❌ If it still doesn't work
### Check the database
```bash
docker compose exec keep-notes npx tsx scripts/debug-config.ts
```
### Check the environment variables
```bash
docker compose exec keep-notes env | grep -E "AI_PROVIDER|OPENAI|OLLAMA"
```
**Expected result:**
```
AI_PROVIDER_TAGS=<empty>      # Correct! The config comes from the DB
AI_PROVIDER_EMBEDDING=<empty> # Correct!
OPENAI_API_KEY=sk-...         # Your key
OLLAMA_BASE_URL=<empty>       # Correct! No default value
```
### Reset the provider if needed
In the admin interface:
1. Go to http://192.168.1.190:3000/admin/settings
2. "AI Settings" section
3. Change Tags Provider → "ollama"
4. Save
5. Change Tags Provider → "openai"
6. Save
7. Refresh the page (F5)
## 📝 Technical summary
**The bug:**
- Factory: `|| 'ollama'` as a fallback
- Docker: `OLLAMA_BASE_URL=${...:-http://ollama:11434}` by default
- Result: used Ollama even with OpenAI configured

**The fix:**
- Factory: throw a clear error when not configured
- Docker: don't set OLLAMA_BASE_URL unless explicitly requested
- Result: uses the provider configured in the DB

**3 full code reviews:**
✅ Factory and logic
✅ Docker configuration
✅ API routes and UI components

admin-settings-form.tsx

@@ -75,43 +75,63 @@ export function AdminSettingsForm({ config }: { config: Record<string, string> })

```typescript
    setIsSaving(true)
    const data: Record<string, string> = {}

    try {
      // Tags provider configuration
      const tagsProv = formData.get('AI_PROVIDER_TAGS') as AIProvider
      if (!tagsProv) throw new Error('AI_PROVIDER_TAGS is required')
      data.AI_PROVIDER_TAGS = tagsProv

      const tagsModel = formData.get('AI_MODEL_TAGS') as string
      if (tagsModel) data.AI_MODEL_TAGS = tagsModel

      if (tagsProv === 'ollama') {
        const ollamaUrl = formData.get('OLLAMA_BASE_URL_TAGS') as string
        if (ollamaUrl) data.OLLAMA_BASE_URL = ollamaUrl
      } else if (tagsProv === 'openai') {
        const openaiKey = formData.get('OPENAI_API_KEY') as string
        if (openaiKey) data.OPENAI_API_KEY = openaiKey
      } else if (tagsProv === 'custom') {
        const customKey = formData.get('CUSTOM_OPENAI_API_KEY_TAGS') as string
        const customUrl = formData.get('CUSTOM_OPENAI_BASE_URL_TAGS') as string
        if (customKey) data.CUSTOM_OPENAI_API_KEY = customKey
        if (customUrl) data.CUSTOM_OPENAI_BASE_URL = customUrl
      }

      // Embeddings provider configuration
      const embedProv = formData.get('AI_PROVIDER_EMBEDDING') as AIProvider
      if (!embedProv) throw new Error('AI_PROVIDER_EMBEDDING is required')
      data.AI_PROVIDER_EMBEDDING = embedProv

      const embedModel = formData.get('AI_MODEL_EMBEDDING') as string
      if (embedModel) data.AI_MODEL_EMBEDDING = embedModel

      if (embedProv === 'ollama') {
        const ollamaUrl = formData.get('OLLAMA_BASE_URL_EMBEDDING') as string
        if (ollamaUrl) data.OLLAMA_BASE_URL = ollamaUrl
      } else if (embedProv === 'openai') {
        const openaiKey = formData.get('OPENAI_API_KEY') as string
        if (openaiKey) data.OPENAI_API_KEY = openaiKey
      } else if (embedProv === 'custom') {
        const customKey = formData.get('CUSTOM_OPENAI_API_KEY_EMBEDDING') as string
        const customUrl = formData.get('CUSTOM_OPENAI_BASE_URL_EMBEDDING') as string
        if (customKey) data.CUSTOM_OPENAI_API_KEY = customKey
        if (customUrl) data.CUSTOM_OPENAI_BASE_URL = customUrl
      }

      console.log('Saving AI config:', data)
      const result = await updateSystemConfig(data)
      setIsSaving(false)

      if (result.error) {
        toast.error('Failed to update AI settings: ' + result.error)
      } else {
        toast.success('AI Settings updated successfully')
        setTagsProvider(tagsProv)
        setEmbeddingsProvider(embedProv)
      }
    } catch (error: any) {
      setIsSaving(false)
      toast.error('Error: ' + error.message)
    }
  }
```

admin-settings.ts

@@ -41,7 +41,14 @@ export async function updateSystemConfig(data: Record<string, string>) {

```typescript
  await checkAdmin()

  try {
    // Filter out empty values but keep 'false' as valid
    const filteredData = Object.fromEntries(
      Object.entries(data).filter(
        ([key, value]) => value !== '' && value !== null && value !== undefined
      )
    )
    console.log('Updating system config:', filteredData)

    const operations = Object.entries(filteredData).map(([key, value]) =>
      prisma.systemConfig.upsert({
        where: { key },
        update: { value },
```
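The filter keeps deliberately falsy-looking strings such as `'false'` or `'0'` and drops only genuinely empty values. The same predicate as a standalone function (a sketch; the name `filterEmptyValues` is illustrative):

```typescript
// Drop entries whose value is '', null, or undefined; keep everything else,
// including the strings 'false' and '0', which are valid config values.
export function filterEmptyValues(
  data: Record<string, string | null | undefined>
): Record<string, string> {
  return Object.fromEntries(
    Object.entries(data).filter(
      ([, value]) => value !== '' && value !== null && value !== undefined
    )
  ) as Record<string, string>;
}
```

This is what stops an untouched form field from overwriting an existing key in the DB with an empty string.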

setup-openai.ts

@@ -0,0 +1,48 @@

```typescript
import prisma from '../lib/prisma'

/**
 * Setup OpenAI as the default AI provider in the database.
 * Run this to ensure OpenAI is properly configured.
 */
async function setupOpenAI() {
  console.log('🔧 Setting up OpenAI as default AI provider...\n')

  const configs = [
    { key: 'AI_PROVIDER_TAGS', value: 'openai' },
    { key: 'AI_PROVIDER_EMBEDDING', value: 'openai' },
    { key: 'AI_MODEL_TAGS', value: 'gpt-4o-mini' },
    { key: 'AI_MODEL_EMBEDDING', value: 'text-embedding-3-small' },
  ]

  try {
    for (const config of configs) {
      await prisma.systemConfig.upsert({
        where: { key: config.key },
        update: { value: config.value },
        create: { key: config.key, value: config.value }
      })
      console.log(`✅ Set ${config.key} = ${config.value}`)
    }

    console.log('\n✨ OpenAI configuration complete!')
    console.log('\nNext steps:')
    console.log('1. Add your OPENAI_API_KEY in admin settings: http://localhost:3000/admin/settings')
    console.log('2. Or add it to .env.docker: OPENAI_API_KEY=sk-...')
    console.log('3. Restart the application')

    // Verify
    const verify = await prisma.systemConfig.findMany({
      where: { key: { in: configs.map(c => c.key) } }
    })
    console.log('\n✅ Verification:')
    verify.forEach(c => console.log(`  ${c.key}: ${c.value}`))
  } catch (error) {
    console.error('❌ Error:', error)
    process.exit(1)
  }
}

setupOpenAI()
  .then(() => process.exit(0))
  .catch(() => process.exit(1))
```