Keep/docker-compose.yml
sepehr 8617117dec fix: remove Ollama default fallbacks in factory and Docker
ROOT CAUSE: The factory defaulted to 'ollama' when no provider was
configured, and docker-compose.yml always set OLLAMA_BASE_URL even
when OpenAI was in use. As a result, the app tried to connect to
Ollama even when OpenAI was configured in the admin interface.

CRITICAL CHANGES:
1. lib/ai/factory.ts - Removed 'ollama' default fallback
   - getTagsProvider() now throws error if AI_PROVIDER_TAGS not set
   - getEmbeddingsProvider() now throws error if AI_PROVIDER_EMBEDDING not set
   - Forces explicit configuration instead of silent fallback to Ollama
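The change above can be sketched as follows. This is a hedged illustration, not the actual lib/ai/factory.ts source: the helper name requireProvider, the ProviderName type, and the exact error wording are assumptions; only the fail-fast behavior (throw instead of defaulting to 'ollama') comes from the commit.

```typescript
// Illustrative sketch of the factory change: a missing or unknown
// provider env var is now a hard error, never a silent Ollama fallback.
type ProviderName = "openai" | "ollama";

// Hypothetical helper (not from the real codebase): validate the
// env var and throw instead of falling back to "ollama".
function requireProvider(
  envVar: string,
  value: string | undefined
): ProviderName {
  if (value !== "openai" && value !== "ollama") {
    throw new Error(
      `${envVar} is not set; configure "openai" or "ollama" explicitly`
    );
  }
  return value;
}

function getTagsProvider(): ProviderName {
  return requireProvider("AI_PROVIDER_TAGS", process.env.AI_PROVIDER_TAGS);
}

function getEmbeddingsProvider(): ProviderName {
  return requireProvider(
    "AI_PROVIDER_EMBEDDING",
    process.env.AI_PROVIDER_EMBEDDING
  );
}
```

With this shape, a misconfigured deployment fails loudly at first use rather than quietly hitting an Ollama endpoint that was never intended.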

2. docker-compose.yml - Removed default OLLAMA_BASE_URL
   - Changed: OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://ollama:11434}
   - To: OLLAMA_BASE_URL=${OLLAMA_BASE_URL}
   - Only set if explicitly defined in .env.docker
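With the default removed, the Ollama settings live only in .env.docker. A hypothetical example (all values are placeholders, not from the repository):

```shell
# .env.docker -- only define the provider you actually use.
# Using OpenAI: leave the Ollama lines out entirely.
OPENAI_API_KEY=your-openai-key

# Or, using the bundled Ollama service instead (hypothetical values):
# OLLAMA_BASE_URL=http://ollama:11434
# OLLAMA_MODEL=llama3
```

Note that `${OLLAMA_BASE_URL}` without a `:-` default interpolates to an empty string when the variable is undefined, which the factory now treats as "not configured".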

3. Application name: Mento → Memento (correct spelling)
   - Updated in: sidebar, README, deploy.sh, DOCKER_DEPLOYMENT.md

4. app/api/ai/config/route.ts - Return 'not set' instead of 'ollama'
   - Makes it clear when provider is not configured
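The response-shape change can be sketched like this. A hedged illustration only: the real route handler in app/api/ai/config/route.ts may be structured differently, and describeProviderConfig is an invented name; the commit only establishes that unset providers are reported as 'not set' rather than 'ollama'.

```typescript
// Illustrative: report "not set" for unconfigured providers instead of
// pretending the provider is "ollama".
function describeProviderConfig(
  env: Record<string, string | undefined>
): { tagsProvider: string; embeddingsProvider: string } {
  return {
    tagsProvider: env.AI_PROVIDER_TAGS ?? "not set",
    embeddingsProvider: env.AI_PROVIDER_EMBEDDING ?? "not set",
  };
}
```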

IMPACT: The app will now properly use OpenAI when configured in the
admin interface, instead of silently falling back to Ollama.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-12 23:08:20 +01:00

services:
  # ============================================
  # keep-notes - Next.js Web Application
  # ============================================
  keep-notes:
    build:
      context: ./keep-notes
      dockerfile: Dockerfile
    container_name: memento-web
    env_file:
      - .env.docker
    ports:
      - "3000:3000"
    environment:
      - DATABASE_URL=file:/app/prisma/dev.db
      - NEXTAUTH_SECRET=${NEXTAUTH_SECRET:-changethisinproduction}
      - NEXTAUTH_URL=${NEXTAUTH_URL:-http://localhost:3000}
      - NODE_ENV=production
      # Email Configuration (SMTP)
      - SMTP_HOST=${SMTP_HOST}
      - SMTP_PORT=${SMTP_PORT:-587}
      - SMTP_USER=${SMTP_USER}
      - SMTP_PASS=${SMTP_PASS}
      - SMTP_FROM=${SMTP_FROM:-noreply@memento.app}
      # AI Providers
      # Only define these if you're using the corresponding provider
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL}
      - OLLAMA_MODEL=${OLLAMA_MODEL}
    volumes:
      - db-data:/app/prisma
      - uploads-data:/app/public/uploads
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3000"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    networks:
      - memento-network

  # ============================================
  # mcp-server - MCP Protocol Server
  # ============================================
  mcp-server:
    build:
      context: ./mcp-server
      dockerfile: Dockerfile
    container_name: memento-mcp
    volumes:
      - db-data:/app/db
    depends_on:
      - keep-notes
    restart: unless-stopped
    networks:
      - memento-network

  # ============================================
  # Ollama - Local LLM Provider (Optional)
  # ============================================
  ollama:
    image: ollama/ollama:latest
    container_name: memento-ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama
    restart: unless-stopped
    networks:
      - memento-network

# ============================================
# Volumes - Data Persistence
# ============================================
volumes:
  db-data:
    driver: local
  uploads-data:
    driver: local
  ollama-data:
    driver: local

# ============================================
# Networks - Service Communication
# ============================================
networks:
  memento-network:
    driver: bridge