Keep/docker-compose.yml
sepehr 76c7cd97e9 fix: resolve Docker AI features connection to Ollama
Critical fix for the AI features (reformulation, auto-labels) in Docker.
Only title generation was working, because it had a hardcoded fallback.

Problem:
- The code reads the OLLAMA_BASE_URL environment variable (see the sketch below)
- docker-compose.yml was setting OLLAMA_API_URL (the wrong variable)
- .env.docker was missing on server
- Container couldn't connect to Ollama service
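
The pattern behind this, as a minimal sketch (file path, helper name, and exact call shape are hypothetical; only the environment variables and the Ollama endpoint are taken as given):

  // lib/ai/ollama.ts (hypothetical path): how the app-side code is assumed
  // to resolve the Ollama connection from environment variables.
  const OLLAMA_BASE_URL =
    process.env.OLLAMA_BASE_URL ?? "http://localhost:11434"; // fallback only useful outside Docker
  const OLLAMA_MODEL = process.env.OLLAMA_MODEL ?? "granite4:latest";

  export async function generateText(prompt: string): Promise<string> {
    // Ollama's /api/generate endpoint; streaming disabled to get one JSON body.
    const res = await fetch(`${OLLAMA_BASE_URL}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: OLLAMA_MODEL, prompt, stream: false }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const data = await res.json();
    return data.response;
  }

With OLLAMA_API_URL set instead, process.env.OLLAMA_BASE_URL is undefined, so every request falls back to localhost:11434, which does not reach the Ollama container.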

Root cause:
- Environment variable name mismatch
- Missing OLLAMA_BASE_URL configuration
- localhost:11434 doesn't work inside the container, because localhost there is the container itself; the Docker service name is needed (see the check below)
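
A quick way to confirm the resolution from the running stack, assuming curl is present in the web image (the healthcheck in docker-compose.yml already uses it):

  # Fails: inside memento-web, localhost is the web container itself
  docker compose exec keep-notes curl -s http://localhost:11434/api/tags
  # Works: "ollama" is resolved by Docker DNS on memento-network
  docker compose exec keep-notes curl -s http://ollama:11434/api/tags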

Solution:
1. Update docker-compose.yml:
   - Change OLLAMA_API_URL → OLLAMA_BASE_URL
   - Add OLLAMA_MODEL environment variable
   - Default to http://ollama:11434 (Docker service name)

2. Update .env.docker:
   - Add OLLAMA_BASE_URL="http://ollama:11434"
   - Add OLLAMA_MODEL="granite4:latest"
   - Document AI provider configuration

3. Update .env.docker.example:
   - Add examples for different Ollama setups
   - Document Docker service name vs external IP
   - Add OpenAI configuration example (see the env sketch after this list)
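
For reference, a minimal sketch of the resulting entries; the Ollama values are the ones named above, while the external-IP address and the OpenAI key are illustrative placeholders:

  # .env.docker (values used by this deployment)
  OLLAMA_BASE_URL="http://ollama:11434"
  OLLAMA_MODEL="granite4:latest"

  # .env.docker.example (alternative setups, placeholder values)
  OLLAMA_BASE_URL="http://ollama:11434"          # Ollama as the bundled Docker service
  # OLLAMA_BASE_URL="http://192.168.1.50:11434"  # Ollama on an external host (use your own address)
  OLLAMA_MODEL="granite4:latest"
  # OPENAI_API_KEY="sk-your-key-here"            # OpenAI instead of, or alongside, Ollama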

Result:
✓ All AI features now work in Docker (titles, reformulation, auto-labels)
✓ Proper connection to Ollama container via Docker network
✓ Clear documentation for different deployment scenarios

Technical details:
- The Docker service name "ollama" resolves to the container IP via Docker's embedded DNS
- Port 11434 is reachable within memento-network
- The localhost fallback is only for local development outside Docker (see the model note below)
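
One operational note, assuming the granite4:latest model has not been pulled yet: the ollama/ollama image ships without models, so pull it once into the ollama container; it then persists in the ollama-data volume across restarts:

  docker compose exec ollama ollama pull granite4:latest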

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-12 22:13:35 +01:00


services:
  # ============================================
  # keep-notes - Next.js Web Application
  # ============================================
  keep-notes:
    build:
      context: ./keep-notes
      dockerfile: Dockerfile
    container_name: memento-web
    env_file:
      - .env.docker
    ports:
      - "3000:3000"
    environment:
      - DATABASE_URL=file:/app/prisma/dev.db
      - NEXTAUTH_SECRET=${NEXTAUTH_SECRET:-changethisinproduction}
      - NEXTAUTH_URL=${NEXTAUTH_URL:-http://localhost:3000}
      - NODE_ENV=production
      # Email Configuration (SMTP)
      - SMTP_HOST=${SMTP_HOST}
      - SMTP_PORT=${SMTP_PORT:-587}
      - SMTP_USER=${SMTP_USER}
      - SMTP_PASS=${SMTP_PASS}
      - SMTP_FROM=${SMTP_FROM:-noreply@memento.app}
      # AI Providers
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://ollama:11434}
      - OLLAMA_MODEL=${OLLAMA_MODEL:-granite4:latest}
    volumes:
      - db-data:/app/prisma
      - uploads-data:/app/public/uploads
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3000"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    networks:
      - memento-network

  # ============================================
  # mcp-server - MCP Protocol Server
  # ============================================
  mcp-server:
    build:
      context: ./mcp-server
      dockerfile: Dockerfile
    container_name: memento-mcp
    volumes:
      - db-data:/app/db
    depends_on:
      - keep-notes
    restart: unless-stopped
    networks:
      - memento-network

  # ============================================
  # Ollama - Local LLM Provider (Optional)
  # ============================================
  ollama:
    image: ollama/ollama:latest
    container_name: memento-ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama
    restart: unless-stopped
    networks:
      - memento-network

# ============================================
# Volumes - Data Persistence
# ============================================
volumes:
  db-data:
    driver: local
  uploads-data:
    driver: local
  ollama-data:
    driver: local

# ============================================
# Networks - Service Communication
# ============================================
networks:
  memento-network:
    driver: bridge