fix: resolve Docker AI features connection to Ollama

Critical fix for AI features (reformulation, auto-labels) in Docker.
Only title generation was working because it had a hardcoded fallback.

Problem:
- Code uses OLLAMA_BASE_URL environment variable
- docker-compose.yml was setting OLLAMA_API_URL (wrong variable)
- .env.docker was missing on server
- Container couldn't connect to Ollama service

Root cause:
- Environment variable name mismatch
- Missing OLLAMA_BASE_URL configuration
- localhost:11434 doesn't work in Docker (need service name)
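The mismatch above can be illustrated with a minimal TypeScript sketch. This is hypothetical code (the project's actual Ollama client is not shown in this commit); `resolveOllamaBaseUrl` is an illustrative name for whatever lookup the app performs:

```typescript
// Hypothetical sketch: the app reads OLLAMA_BASE_URL, so a compose file that
// only sets OLLAMA_API_URL leaves the client on the localhost fallback.
// Inside a container, localhost is the app container itself, not Ollama --
// hence every Ollama call failed while titles used their own fallback path.
function resolveOllamaBaseUrl(env: Record<string, string | undefined>): string {
  return env.OLLAMA_BASE_URL ?? "http://localhost:11434";
}

// Old compose config: the intended URL never reaches the client.
const misconfigured = resolveOllamaBaseUrl({ OLLAMA_API_URL: "http://ollama:11434" });
// Fixed config: the Docker service name is used.
const fixed = resolveOllamaBaseUrl({ OLLAMA_BASE_URL: "http://ollama:11434" });
```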

Solution:
1. Update docker-compose.yml:
   - Change OLLAMA_API_URL → OLLAMA_BASE_URL
   - Add OLLAMA_MODEL environment variable
   - Default to http://ollama:11434 (Docker service name)

2. Update .env.docker:
   - Add OLLAMA_BASE_URL="http://ollama:11434"
   - Add OLLAMA_MODEL="granite4:latest"
   - Document AI provider configuration

3. Update .env.docker.example:
   - Add examples for different Ollama setups
   - Document Docker service name vs external IP
   - Add OpenAI configuration example

Result:
✓ All AI features now work in Docker (titles, reformulation, auto-labels)
✓ Proper connection to Ollama container via Docker network
✓ Clear documentation for different deployment scenarios

Technical details:
- Docker service name "ollama" resolves to container IP
- Port 11434 accessible within memento-network
- Fallback to localhost only for local development
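The connectivity described above can be sanity-checked from inside the app container. A hedged TypeScript sketch (`checkOllama` and `ollamaTagsUrl` are illustrative names, not existing project code) that probes Ollama's `/api/tags` model-listing endpoint:

```typescript
// Illustrative health check, not code from this repository.
// Builds the URL for Ollama's /api/tags endpoint, which lists installed models.
function ollamaTagsUrl(baseUrl: string): string {
  return new URL("/api/tags", baseUrl).toString();
}

// Returns true when the Ollama service answers. Inside memento-network the
// baseUrl should be http://ollama:11434 (the service name), not localhost.
async function checkOllama(baseUrl: string): Promise<boolean> {
  try {
    const res = await fetch(ollamaTagsUrl(baseUrl));
    return res.ok;
  } catch {
    return false;
  }
}
```

Running this with `http://localhost:11434` from inside the app container would fail, while `http://ollama:11434` succeeds, which is exactly the behaviour this commit fixes.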

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
sepehr 2026-01-12 22:13:35 +01:00
parent 2e4f9dcd83
commit 76c7cd97e9
3 changed files with 16 additions and 3 deletions

.env.docker:

@@ -8,7 +8,8 @@ NEXTAUTH_URL="http://192.168.1.190:3000"
 # AI Provider Configuration
 # ============================================
 AI_PROVIDER=ollama
-OLLAMA_API_URL="http://ollama:11434"
+OLLAMA_BASE_URL="http://ollama:11434"
+OLLAMA_MODEL="granite4:latest"
 # ============================================
 # Email Configuration (Optional)

.env.docker.example:

@@ -14,8 +14,19 @@ NEXTAUTH_URL="http://YOUR_SERVER_IP:3000"
 # ============================================
 # AI Provider Configuration
 # ============================================
+# For local Ollama in Docker (service name):
 AI_PROVIDER=ollama
-OLLAMA_API_URL="http://ollama:11434"
+OLLAMA_BASE_URL="http://ollama:11434"
+OLLAMA_MODEL="granite4:latest"
+# For external Ollama (on host or different server):
+# AI_PROVIDER=ollama
+# OLLAMA_BASE_URL="http://YOUR_SERVER_IP:11434"
+# OLLAMA_MODEL="granite4:latest"
+# For OpenAI:
+# AI_PROVIDER=openai
+# OPENAI_API_KEY="sk-..."
 # ============================================
 # Email Configuration (Optional)

docker-compose.yml:

@@ -26,7 +26,8 @@ services:
       # AI Providers
       - OPENAI_API_KEY=${OPENAI_API_KEY}
-      - OLLAMA_API_URL=${OLLAMA_API_URL:-http://ollama:11434}
+      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://ollama:11434}
+      - OLLAMA_MODEL=${OLLAMA_MODEL:-granite4:latest}
     volumes:
       - db-data:/app/prisma
       - uploads-data:/app/public/uploads