From 76c7cd97e958d46cd8cfa474c99f4c6ce5f950b0 Mon Sep 17 00:00:00 2001
From: sepehr
Date: Mon, 12 Jan 2026 22:13:35 +0100
Subject: [PATCH] fix: resolve Docker AI features connection to Ollama
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Critical fix for the AI features (reformulation, auto-labels) in Docker.
Only title generation was working, because it had a hardcoded fallback.

Problem:
- The code reads the OLLAMA_BASE_URL environment variable
- docker-compose.yml was setting OLLAMA_API_URL (wrong variable name)
- .env.docker was missing on the server
- The container couldn't connect to the Ollama service

Root cause:
- Environment variable name mismatch
- Missing OLLAMA_BASE_URL configuration
- localhost:11434 doesn't work in Docker (the service name is needed)

Solution:
1. Update docker-compose.yml:
   - Change OLLAMA_API_URL → OLLAMA_BASE_URL
   - Add the OLLAMA_MODEL environment variable
   - Default to http://ollama:11434 (Docker service name)

2. Update .env.docker:
   - Add OLLAMA_BASE_URL="http://ollama:11434"
   - Add OLLAMA_MODEL="granite4:latest"
   - Document the AI provider configuration

3. Update .env.docker.example:
   - Add examples for different Ollama setups
   - Document Docker service name vs external IP
   - Add an OpenAI configuration example

Result:
✓ All AI features now work in Docker (titles, reformulation, auto-labels)
✓ Proper connection to the Ollama container via the Docker network
✓ Clear documentation for different deployment scenarios

Technical details:
- The Docker service name "ollama" resolves to the container's IP
- Port 11434 is accessible within memento-network
- Fallback to localhost is used only for local development

Co-Authored-By: Claude Sonnet 4.5
---
 .env.docker         |  3 ++-
 .env.docker.example | 13 ++++++++++++-
 docker-compose.yml  |  3 ++-
 3 files changed, 16 insertions(+), 3 deletions(-)

diff --git a/.env.docker b/.env.docker
index bf24195..f097745 100644
--- a/.env.docker
+++ b/.env.docker
@@ -8,7 +8,8 @@ NEXTAUTH_URL="http://192.168.1.190:3000"
 # AI Provider Configuration
 # ============================================
 AI_PROVIDER=ollama
-OLLAMA_API_URL="http://ollama:11434"
+OLLAMA_BASE_URL="http://ollama:11434"
+OLLAMA_MODEL="granite4:latest"
 
 # ============================================
 # Email Configuration (Optional)
diff --git a/.env.docker.example b/.env.docker.example
index 658d282..2da8140 100644
--- a/.env.docker.example
+++ b/.env.docker.example
@@ -14,8 +14,19 @@ NEXTAUTH_URL="http://YOUR_SERVER_IP:3000"
 # ============================================
 # AI Provider Configuration
 # ============================================
+# For local Ollama in Docker (service name):
 AI_PROVIDER=ollama
-OLLAMA_API_URL="http://ollama:11434"
+OLLAMA_BASE_URL="http://ollama:11434"
+OLLAMA_MODEL="granite4:latest"
+
+# For external Ollama (on host or different server):
+# AI_PROVIDER=ollama
+# OLLAMA_BASE_URL="http://YOUR_SERVER_IP:11434"
+# OLLAMA_MODEL="granite4:latest"
+
+# For OpenAI:
+# AI_PROVIDER=openai
+# OPENAI_API_KEY="sk-..."
 
 # ============================================
 # Email Configuration (Optional)
diff --git a/docker-compose.yml b/docker-compose.yml
index 5d7788a..ea029f8 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -26,7 +26,8 @@ services:
       # AI Providers
       - OPENAI_API_KEY=${OPENAI_API_KEY}
-      - OLLAMA_API_URL=${OLLAMA_API_URL:-http://ollama:11434}
+      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://ollama:11434}
+      - OLLAMA_MODEL=${OLLAMA_MODEL:-granite4:latest}
 
     volumes:
       - db-data:/app/prisma
       - uploads-data:/app/public/uploads
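The "fallback to localhost only for local development" behaviour that this patch relies on can be sketched as follows. This is a minimal illustration, not the application's actual code: the helper name `resolveOllamaBaseUrl` and its exact fallback logic are assumptions, modelled on the defaults visible in the diff.

```typescript
// Hypothetical sketch of the env lookup the fix depends on: the app reads
// OLLAMA_BASE_URL (set to the Docker service name via docker-compose.yml)
// and only falls back to localhost when the variable is unset, which is
// appropriate for local development outside Docker.
function resolveOllamaBaseUrl(
  env: Record<string, string | undefined> = process.env
): string {
  return env.OLLAMA_BASE_URL ?? "http://localhost:11434";
}

// In Docker, with the variable set as in the patch:
console.log(resolveOllamaBaseUrl({ OLLAMA_BASE_URL: "http://ollama:11434" }));
// Outside Docker, with no variable set, the localhost fallback applies:
console.log(resolveOllamaBaseUrl({}));
```

The key point is that the variable name must match on both sides: because docker-compose.yml previously exported OLLAMA_API_URL while the code looked up OLLAMA_BASE_URL, the lookup always fell through to localhost, which does not resolve to the Ollama container inside Docker.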