sepehr 76c7cd97e9 fix: resolve Docker AI features connection to Ollama
Critical fix for AI features (reformulation, auto-labels) in Docker.
Only title generation was working, because it had a hardcoded fallback URL.

Problem:
- Code uses OLLAMA_BASE_URL environment variable
- docker-compose.yml was setting OLLAMA_API_URL (wrong variable)
- .env.docker was missing on server
- Container couldn't connect to Ollama service

Root cause:
- Environment variable name mismatch
- Missing OLLAMA_BASE_URL configuration
- localhost:11434 doesn't work inside a container (the Docker service name is needed)

Solution:
1. Update docker-compose.yml:
   - Change OLLAMA_API_URL → OLLAMA_BASE_URL
   - Add OLLAMA_MODEL environment variable
   - Default to http://ollama:11434 (Docker service name)

2. Update .env.docker:
   - Add OLLAMA_BASE_URL="http://ollama:11434"
   - Add OLLAMA_MODEL="granite4:latest"
   - Document AI provider configuration

3. Update .env.docker.example:
   - Add examples for different Ollama setups
   - Document Docker service name vs external IP
   - Add OpenAI configuration example
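The compose change in step 1 can be sketched as follows (the `app` service name is an assumption; `ollama` and `memento-network` are taken from the commit):

```yaml
# docker-compose.yml (sketch)
services:
  app:
    environment:
      # Was OLLAMA_API_URL — the code reads OLLAMA_BASE_URL
      - OLLAMA_BASE_URL=http://ollama:11434
      - OLLAMA_MODEL=granite4:latest
    networks:
      - memento-network

  ollama:
    image: ollama/ollama
    networks:
      - memento-network

networks:
  memento-network:
```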

Result:
✓ All AI features now work in Docker (titles, reformulation, auto-labels)
✓ Proper connection to Ollama container via Docker network
✓ Clear documentation for different deployment scenarios

Technical details:
- Docker service name "ollama" resolves to container IP
- Port 11434 accessible within memento-network
- Fallback to localhost only for local development
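The resolution behaviour above can be verified from inside the app container with a short Python probe (helper names are assumptions; GET /api/tags is Ollama's standard model-listing endpoint):

```python
import os
import urllib.request

def ollama_health_url(default: str = "http://ollama:11434") -> str:
    # Resolve the base URL the same way the app does, then build a
    # cheap health-check URL. Inside the Compose network, "ollama"
    # resolves via Docker's embedded DNS to the container's IP.
    base = os.environ.get("OLLAMA_BASE_URL", default).rstrip("/")
    return f"{base}/api/tags"

def check_ollama(url: str) -> int:
    # Return the HTTP status of a GET against the Ollama API;
    # 200 means the app container can reach the Ollama service.
    with urllib.request.urlopen(url, timeout=5) as resp:
        return resp.status
```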

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-12 22:13:35 +01:00
