4 Commits

8617117dec fix: remove Ollama default fallbacks in factory and Docker
ROOT CAUSE: The factory was defaulting to 'ollama' when no provider
was configured, and docker-compose.yml was always setting OLLAMA_BASE_URL
even when using OpenAI. This caused the app to try connecting to Ollama
even when OpenAI was configured in the admin interface.

CRITICAL CHANGES:
1. lib/ai/factory.ts - Removed 'ollama' default fallback
   - getTagsProvider() now throws an error if AI_PROVIDER_TAGS is not set
   - getEmbeddingsProvider() now throws an error if AI_PROVIDER_EMBEDDING is not set
   - Forces explicit configuration instead of a silent fallback to Ollama
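
A minimal sketch of the stricter lookup, assuming the factory reads the provider name from environment variables and constructs the actual provider elsewhere (the function names come from this commit; everything else is illustrative):

```typescript
// Hypothetical sketch of lib/ai/factory.ts after this change; the real
// functions likely return provider instances rather than names.
export function getTagsProvider(): string {
  const provider = process.env.AI_PROVIDER_TAGS;
  if (!provider) {
    // No silent 'ollama' fallback anymore: fail loudly instead.
    throw new Error(
      'AI_PROVIDER_TAGS is not set. Configure a tags provider explicitly.'
    );
  }
  return provider;
}

export function getEmbeddingsProvider(): string {
  const provider = process.env.AI_PROVIDER_EMBEDDING;
  if (!provider) {
    throw new Error(
      'AI_PROVIDER_EMBEDDING is not set. Configure an embeddings provider explicitly.'
    );
  }
  return provider;
}
```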

2. docker-compose.yml - Removed default OLLAMA_BASE_URL
   - Changed: OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://ollama:11434}
   - To: OLLAMA_BASE_URL=${OLLAMA_BASE_URL}
   - Only set if explicitly defined in .env.docker

3. Application name: Mento → Memento (correct spelling)
   - Updated in: sidebar, README, deploy.sh, DOCKER_DEPLOYMENT.md

4. app/api/ai/config/route.ts - Return 'not set' instead of 'ollama'
   - Makes it clear when provider is not configured
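
A sketch of the matching route behavior, assuming a Next.js App Router GET handler; the response field names are illustrative, and only the 'not set' reporting comes from this commit:

```typescript
// Hypothetical sketch of app/api/ai/config/route.ts after this change.
import { NextResponse } from 'next/server';

export async function GET() {
  return NextResponse.json({
    // Report 'not set' instead of a misleading 'ollama' default.
    tagsProvider: process.env.AI_PROVIDER_TAGS ?? 'not set',
    embeddingsProvider: process.env.AI_PROVIDER_EMBEDDING ?? 'not set',
  });
}
```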

IMPACT: The app will now properly use OpenAI when configured in the
admin interface, instead of silently falling back to Ollama.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-12 23:08:20 +01:00
58e486c68e fix: rename application from Keep to Mento in UI and docs
- Update sidebar.tsx to display "Mento" instead of "Keep"
- Update README.md title from "Keep Notes" to "Mento"
- Update DOCKER_DEPLOYMENT.md references to "Mento"
- Update deploy.sh script comments to use "Mento"
- Add DOCKER-SETUP.md with Docker configuration guide

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-12 22:37:38 +01:00
1678bcaced chore(docker): tweak deployment docs and scripts; update package metadata 2026-01-11 23:20:34 +01:00
0b258aef4e feat(docker): Add complete Docker deployment configuration for Proxmox
## Docker Configuration
- Enhance docker-compose.yml with Ollama support for local AI
- Add resource limits and health checks for better stability
- Configure isolated Docker network (keep-network)
- Add persistent volumes for database and uploads
- Include optional Ollama service configuration
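
A hypothetical docker-compose.yml fragment consistent with the bullets above. Only the network name (keep-network), the optional Ollama service, and the unset-by-default OLLAMA_BASE_URL come from these commits; every image name, path, port, and limit value is an assumption:

```yaml
# Illustrative sketch, not the actual docker-compose.yml from this commit.
services:
  app:
    image: memento:latest                   # assumed image name
    environment:
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL}  # no default; set in .env.docker
    volumes:
      - db-data:/app/data                   # persistent database
      - uploads:/app/uploads                # persistent uploads
    deploy:
      resources:
        limits:
          memory: 1g                        # assumed limit for VM/LXC hosts
    healthcheck:
      test: ["CMD", "wget", "-qO-", "http://localhost:3000/"]
      interval: 30s
      retries: 3
    restart: unless-stopped
    networks:
      - keep-network

  ollama:                                   # optional local AI service
    image: ollama/ollama
    networks:
      - keep-network

networks:
  keep-network:

volumes:
  db-data:
  uploads:
```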

## Deployment Files
- Add DOCKER_DEPLOYMENT.md with comprehensive deployment guide
- Add deploy.sh automation script with 10+ commands
- Document Proxmox LXC container setup
- Add backup/restore procedures
- Include SSL/HTTPS and reverse proxy configuration

## Docker Build Optimization
- Improve .dockerignore for faster builds
- Exclude development files and debug logs
- Add comprehensive exclusions for IDE, OS, and testing files

## Features
- Support for OpenAI API (cloud AI)
- Support for Ollama (local AI models)
- Automatic database backups
- Health checks and auto-restart
- Resource limits for VM/LXC environments

## Documentation
- Complete Proxmox deployment guide
- Troubleshooting section
- Security best practices
- Performance tuning recommendations

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

Or a shorter version, if you prefer:

feat(docker): Add Proxmox deployment config with Ollama support

- Enhance docker-compose.yml with health checks, resource limits, Ollama support
- Add DOCKER_DEPLOYMENT.md guide (50+ sections covering Proxmox, SSL, AI setup)
- Add deploy.sh script with build, start, backup, logs commands
- Improve .dockerignore for optimized builds
- Document backup/restore procedures and security best practices
- Support both OpenAI and local Ollama AI providers

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-01-11 22:58:56 +01:00