fix: make paragraph refactor service use configured AI provider

The paragraph-refactor service was reading OLLAMA_BASE_URL directly from
environment variables instead of using the AI provider configured in the
database. This caused "OLLAMA error" failures even when OpenAI was
configured in the admin interface.

Changes:
- paragraph-refactor.service.ts: Now uses getSystemConfig() and
  getTagsProvider() from factory instead of direct Ollama calls
- factory.ts: Added proper error messages when API keys are missing
- .env.docker.example: Updated with new provider configuration
  variables (AI_PROVIDER_TAGS, AI_PROVIDER_EMBEDDING)
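For reference, the new variables in .env.docker.example might look like this (the values shown are illustrative defaults, not prescribed ones):

```shell
# Select which configured provider handles tag generation and embeddings.
# Example values only; each must match a provider set up in the admin interface.
AI_PROVIDER_TAGS=openai
AI_PROVIDER_EMBEDDING=ollama
```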

This fixes the issue where AI reformulation features (Clarify, Shorten,
Improve Style) would fail with OLLAMA errors even when OpenAI was
properly configured in the admin settings.
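The service-side change can be sketched roughly as follows. Only getSystemConfig() and getTagsProvider() are named in this commit; every other name here is an illustrative stub standing in for the real implementation:

```typescript
// Sketch: route refactor requests through the configured provider
// instead of calling Ollama directly. TextProvider, refactorParagraph,
// and the stub bodies are hypothetical.
interface TextProvider {
  complete(prompt: string): Promise<string>;
}

// Stub standing in for the database-backed system config.
async function getSystemConfig(): Promise<Record<string, string>> {
  return { AI_PROVIDER_TAGS: 'openai' };
}

// Stub standing in for the real factory, which would construct an
// OpenAIProvider, CustomOpenAIProvider, or Ollama provider as configured.
function getTagsProvider(config: Record<string, string>): TextProvider {
  return { complete: async (prompt) => `refactored: ${prompt}` };
}

// The service no longer reads OLLAMA_BASE_URL itself; it asks the factory
// for whichever provider the admin configured.
async function refactorParagraph(text: string, instruction: string): Promise<string> {
  const config = await getSystemConfig();
  const provider = getTagsProvider(config);
  return provider.complete(`${instruction}: ${text}`);
}

refactorParagraph('some paragraph', 'Clarify').then(console.log);
```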

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
commit 5d315a6bdd (parent 58e486c68e)
Date: 2026-01-12 22:51:24 +01:00
10 changed files with 3025 additions and 3072 deletions

@@ -29,6 +29,7 @@ function createOpenAIProvider(config: Record<string, string>, modelName: string,
  const apiKey = config?.OPENAI_API_KEY || process.env.OPENAI_API_KEY || '';
  if (!apiKey) {
    throw new Error('OPENAI_API_KEY is required when using OpenAI provider');
  }
  return new OpenAIProvider(apiKey, modelName, embeddingModelName);
@@ -39,9 +40,11 @@ function createCustomOpenAIProvider(config: Record<string, string>, modelName: s
  const baseUrl = config?.CUSTOM_OPENAI_BASE_URL || process.env.CUSTOM_OPENAI_BASE_URL || '';
  if (!apiKey) {
    throw new Error('CUSTOM_OPENAI_API_KEY is required when using Custom OpenAI provider');
  }
  if (!baseUrl) {
    throw new Error('CUSTOM_OPENAI_BASE_URL is required when using Custom OpenAI provider');
  }
  return new CustomOpenAIProvider(apiKey, baseUrl, modelName, embeddingModelName);
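The guard pattern in the hunks above (resolve from database config, fall back to the environment, fail fast with a named error) generalizes to a small helper. A minimal sketch; resolveKey and requireKey are illustrative names, not the project's actual API:

```typescript
// Sketch of the fail-fast credential guard used in factory.ts.
type ProviderConfig = Record<string, string>;

// Database-stored config wins; the environment variable is the fallback.
function resolveKey(config: ProviderConfig, key: string): string {
  return config?.[key] || process.env[key] || '';
}

// Fail fast with a message naming both the missing key and the provider,
// instead of letting a downstream HTTP call fail with an opaque error.
function requireKey(config: ProviderConfig, key: string, provider: string): string {
  const value = resolveKey(config, key);
  if (!value) {
    throw new Error(`${key} is required when using ${provider} provider`);
  }
  return value;
}

// Example: a missing key produces a descriptive error.
try {
  requireKey({}, 'OPENAI_API_KEY', 'OpenAI');
} catch (e) {
  console.log((e as Error).message);
}
```

Raising at construction time means a misconfigured provider surfaces in the admin interface immediately, rather than as a runtime failure inside a reformulation request.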