fix: 8 getAIProvider() call sites were not using the configured provider

CRITICAL FIX: Auto-labels, notebook summaries, and other AI features
were not working because 8 call sites across 6 services were calling
getAIProvider() WITHOUT passing the config parameter.

This caused them to use the default 'ollama' provider instead of
the configured OpenAI provider from the database.

ROOT CAUSE ANALYSIS:
Working features (titles):
- title-suggestions/route.ts: getAIProvider(config) ✓

Broken features (auto-labels, summaries):
- contextual-auto-tag.service.ts: getAIProvider() ✗ (2x)
- notebook-summary.service.ts: getAIProvider() ✗
- auto-label-creation.service.ts: getAIProvider() ✗
- notebook-suggestion.service.ts: getAIProvider() ✗
- batch-organization.service.ts: getAIProvider() ✗
- embedding.service.ts: getAIProvider() ✗ (2x)

FIXED: All 8 call sites now properly call:
  const config = await getSystemConfig()
  const provider = getAIProvider(config)

This ensures ALL AI features use the provider configured in the admin
interface (OpenAI) instead of defaulting to Ollama.
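
For reference, the broken and fixed call patterns side by side (a sketch
assembled from the diffs below, not a file in this commit):

  import { getAIProvider } from '@/lib/ai/factory'
  import { getSystemConfig } from '@/lib/config'

  // BROKEN: no config argument, so the factory cannot see the configured
  // provider and silently falls back to its default ('ollama')
  const broken = getAIProvider()

  // FIXED: load the system config from the database first, then pass it in
  // so the admin-configured provider (OpenAI) is used
  const config = await getSystemConfig()
  const provider = getAIProvider(config)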

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
sepehr 2026-01-12 23:34:16 +01:00
parent 1cbada87f4
commit 3dd48e248c
7 changed files with 85 additions and 8 deletions

URGENT-FIX.md (new file)

@@ -0,0 +1,63 @@
# 🚨 URGENT FIX - DEPLOY IMMEDIATELY
## The problem that was fixed
The admin form was NOT correctly saving the AI configuration to the database!
## ✅ FIXES APPLIED
1. Admin form now has validation and logging
2. Saving filters out empty values (see the sketch after this list)
3. OpenAI initialization script created
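
The admin form code is not shown on this page, so as a rough illustration of point 2 only: a minimal sketch of filtering empty values before saving. The `filterEmptySettings` helper and the exact field list are assumptions, not the actual implementation.

```typescript
// Hypothetical sketch of the "filter empty values" behaviour described above.
// The settings keys are taken from the expected config below; the helper name
// is an assumption.
type AISettings = Record<string, string>

function filterEmptySettings(form: AISettings): AISettings {
  // Keep only keys with a non-blank value so an untouched field (for example
  // an empty OPENAI_API_KEY input) does not overwrite the value in the DB.
  return Object.fromEntries(
    Object.entries(form).filter(([, value]) => value.trim() !== '')
  )
}

// Example: only the two provider keys are sent, the blank key is dropped.
const payload = filterEmptySettings({
  AI_PROVIDER_TAGS: 'openai',
  AI_PROVIDER_EMBEDDING: 'openai',
  OPENAI_API_KEY: '',
})
console.log(payload) // { AI_PROVIDER_TAGS: 'openai', AI_PROVIDER_EMBEDDING: 'openai' }
```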
## 🚀 DEPLOY NOW (5 min)
```bash
# On your Proxmox server
cd /path/to/Keep
git pull
# OPTION 1: Initialize OpenAI in the DB (RECOMMENDED)
cd keep-notes
npx tsx scripts/setup-openai.ts
# OR OPTION 2: Configure via the UI
# Open http://192.168.1.190:3000/admin/settings
# AI Settings section
# - Tags Provider: openai
# - Embeddings Provider: openai
# - OPENAI_API_KEY: sk-proj-...
# Click "Save AI Settings"
# Restart the application
cd ..
docker compose down
docker compose up -d
# Check that it works
curl http://192.168.1.190:3000/api/debug/config
```
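
The contents of scripts/setup-openai.ts are not shown in this commit view. As a minimal sketch of what such an initialization script might look like, assuming a Prisma key/value model named `systemConfig` (the model name, field names, and log output are assumptions):

```typescript
// Hypothetical sketch only - the real scripts/setup-openai.ts may differ.
// Assumes a Prisma model `systemConfig` with `key` and `value` columns.
import { prisma } from '@/lib/prisma'

const values: Record<string, string> = {
  AI_PROVIDER_TAGS: 'openai',
  AI_PROVIDER_EMBEDDING: 'openai',
  AI_MODEL_TAGS: 'gpt-4o-mini',
  AI_MODEL_EMBEDDING: 'text-embedding-3-small',
  OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? '',
}

async function main() {
  for (const [key, value] of Object.entries(values)) {
    if (!value) continue // never overwrite existing config with an empty value
    await prisma.systemConfig.upsert({
      where: { key },
      update: { value },
      create: { key, value },
    })
    console.log(`set ${key}`)
  }
}

main()
  .catch((err) => { console.error(err); process.exit(1) })
  .finally(() => prisma.$disconnect())
```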
## ✅ EXPECTED RESULT
```json
{
"AI_PROVIDER_TAGS": "openai",
"AI_PROVIDER_EMBEDDING": "openai",
"AI_MODEL_TAGS": "gpt-4o-mini",
"AI_MODEL_EMBEDDING": "text-embedding-3-small",
"OPENAI_API_KEY": "set (hidden)"
}
```
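
The /api/debug/config route itself is not part of this page; a rough sketch of a handler that could produce the response above, assuming a Next.js App Router route and that `getSystemConfig()` returns an object with these keys (both assumptions):

```typescript
// Hypothetical sketch of app/api/debug/config/route.ts - only the response
// shape above is known from this commit.
import { NextResponse } from 'next/server'
import { getSystemConfig } from '@/lib/config'

export async function GET() {
  const config = await getSystemConfig()
  return NextResponse.json({
    AI_PROVIDER_TAGS: config.AI_PROVIDER_TAGS,
    AI_PROVIDER_EMBEDDING: config.AI_PROVIDER_EMBEDDING,
    AI_MODEL_TAGS: config.AI_MODEL_TAGS,
    AI_MODEL_EMBEDDING: config.AI_MODEL_EMBEDDING,
    // Never return the actual key, only whether it is present.
    OPENAI_API_KEY: config.OPENAI_API_KEY ? 'set (hidden)' : 'missing',
  })
}
```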
## 🧪 TESTS
Open http://192.168.1.190:3000 and:
1. Create a note
2. Test: Ctrl+M (title generation) ✅
3. Test: Auto-labels ✅
4. Test: Clarify/Shorten/Improve Style ✅
**NO more "AI_PROVIDER_TAGS is not configured" errors!** 🎉
## 💤 GOOD NIGHT!
Once the deployment is done, everything should work. Sleep well! 😴

auto-label-creation.service.ts

@@ -1,5 +1,6 @@
 import { prisma } from '@/lib/prisma'
 import { getAIProvider } from '@/lib/ai/factory'
+import { getSystemConfig } from '@/lib/config'

 export interface SuggestedLabel {
   name: string
@@ -102,7 +103,8 @@ export class AutoLabelCreationService {
     const prompt = this.buildPrompt(notes, existingLabelNames)

     try {
-      const provider = getAIProvider()
+      const config = await getSystemConfig()
+      const provider = getAIProvider(config)
       const response = await provider.generateText(prompt)

       // Parse AI response

batch-organization.service.ts

@@ -1,5 +1,6 @@
 import { prisma } from '@/lib/prisma'
 import { getAIProvider } from '@/lib/ai/factory'
+import { getSystemConfig } from '@/lib/config'

 export interface NoteForOrganization {
   id: string
@@ -100,7 +101,8 @@ export class BatchOrganizationService {
     const prompt = this.buildPrompt(notes, notebooks)

     try {
-      const provider = getAIProvider()
+      const config = await getSystemConfig()
+      const provider = getAIProvider(config)
       const response = await provider.generateText(prompt)

       // Parse AI response

contextual-auto-tag.service.ts

@@ -6,6 +6,7 @@
 import { prisma } from '@/lib/prisma'
 import { getAIProvider } from '@/lib/ai/factory'
+import { getSystemConfig } from '@/lib/config'

 export interface LabelSuggestion {
   label: string
@@ -73,7 +74,8 @@ export class ContextualAutoTagService {
     const prompt = this.buildPrompt(noteContent, notebook.name, availableLabels)

     try {
-      const provider = getAIProvider()
+      const config = await getSystemConfig()
+      const provider = getAIProvider(config)

       // Use generateText with JSON response
       const response = await provider.generateText(prompt)
@@ -155,7 +157,8 @@ export class ContextualAutoTagService {
     const prompt = this.buildNewLabelsPrompt(noteContent, notebook.name)

     try {
-      const provider = getAIProvider()
+      const config = await getSystemConfig()
+      const provider = getAIProvider(config)

       // Use generateText with JSON response
       const response = await provider.generateText(prompt)

embedding.service.ts

@@ -5,6 +5,7 @@
  */
 import { getAIProvider } from '../factory'
+import { getSystemConfig } from '@/lib/config'

 export interface EmbeddingResult {
   embedding: number[]
@@ -28,7 +29,8 @@ export class EmbeddingService {
     }

     try {
-      const provider = getAIProvider()
+      const config = await getSystemConfig()
+      const provider = getAIProvider(config)

       // Use the existing getEmbeddings method from AIProvider
       const embedding = await provider.getEmbeddings(text)
@@ -65,7 +67,8 @@ export class EmbeddingService {
     }

     try {
-      const provider = getAIProvider()
+      const config = await getSystemConfig()
+      const provider = getAIProvider(config)

       // Batch embedding using the existing getEmbeddings method
       const embeddings = await Promise.all(

notebook-suggestion.service.ts

@@ -1,5 +1,6 @@
 import { prisma } from '@/lib/prisma'
 import { getAIProvider } from '@/lib/ai/factory'
+import { getSystemConfig } from '@/lib/config'
 import type { Notebook } from '@/lib/types'

 export class NotebookSuggestionService {
@@ -31,7 +32,8 @@ export class NotebookSuggestionService {
     // 3. Call AI
     try {
-      const provider = getAIProvider()
+      const config = await getSystemConfig()
+      const provider = getAIProvider(config)
       const response = await provider.generateText(prompt)

notebook-summary.service.ts

@@ -1,5 +1,6 @@
 import { prisma } from '@/lib/prisma'
 import { getAIProvider } from '@/lib/ai/factory'
+import { getSystemConfig } from '@/lib/config'

 export interface NotebookSummary {
   notebookId: string
@@ -124,7 +125,8 @@ ${content}...`
     const prompt = this.buildPrompt(notesSummary, notebook.name)

     try {
-      const provider = getAIProvider()
+      const config = await getSystemConfig()
+      const provider = getAIProvider(config)
       const summary = await provider.generateText(prompt)
       return summary.trim()
     } catch (error) {