Critical fix for Docker deployments where AI features tried to connect
to localhost:11434 instead of using the configured provider (Ollama Docker
service or OpenAI).
Problems fixed:
1. Reformulation (clarify/shorten/improve) failing with ECONNREFUSED 127.0.0.1:11434
2. Auto-labels failing with same error
3. Notebook summaries failing
4. Could not switch from Ollama to OpenAI in admin
Root cause:
- Code had hardcoded fallback to 'http://localhost:11434' in multiple places
- .env.docker file not created on the server (it is gitignored)
- No validation that required environment variables were set
Changes:
1. lib/ai/factory.ts:
- Remove hardcoded 'http://localhost:11434' fallback
- Only use localhost for local development (NODE_ENV !== 'production')
- Throw an error if OLLAMA_BASE_URL is not set in production (see the sketch after this list)
2. lib/ai/providers/ollama.ts:
- Remove default parameter 'http://localhost:11434' from constructor
- Require baseUrl to be explicitly passed
- Throw error if baseUrl is missing
3. lib/ai/services/paragraph-refactor.service.ts:
- Remove 'http://localhost:11434' fallback (2 locations)
- Require OLLAMA_BASE_URL to be set
- Throw clear error if not configured
4. app/(main)/admin/settings/admin-settings-form.tsx:
- Add debug info showing current provider state
- Display database config value for transparency
- Help troubleshoot provider selection issues
5. DOCKER-SETUP.md:
- Complete guide for Docker configuration
- Instructions for .env.docker setup
- Examples for Ollama Docker, OpenAI, and external Ollama
- Troubleshooting common issues
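A minimal sketch of the intent behind changes 1-3; the function and class shapes here are illustrative, not the actual lib/ai exports:

```typescript
// lib/ai/factory.ts (sketch): resolve the Ollama base URL with no hardcoded
// production fallback.
function resolveOllamaBaseUrl(): string {
  const configured = process.env.OLLAMA_BASE_URL;
  if (configured) return configured;

  // Local development only: fall back to a locally running Ollama.
  if (process.env.NODE_ENV !== 'production') {
    return 'http://localhost:11434';
  }

  // In production (Docker), refuse to guess and fail loudly instead.
  throw new Error(
    'OLLAMA_BASE_URL is not set. In Docker, set it in .env.docker ' +
      '(e.g. http://ollama:11434) or switch the provider to OpenAI.'
  );
}

// lib/ai/providers/ollama.ts (sketch): no default parameter; the base URL
// must be passed explicitly by the factory.
class OllamaProvider {
  constructor(private readonly baseUrl: string) {
    if (!baseUrl) {
      throw new Error('OllamaProvider requires an explicit baseUrl');
    }
  }
}
```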
Usage:
On server, create .env.docker with proper provider configuration:
- Ollama in Docker: OLLAMA_BASE_URL="http://ollama:11434"
- OpenAI: OPENAI_API_KEY="sk-..."
- External Ollama: OLLAMA_BASE_URL="http://SERVER_IP:11434"
Then in admin interface, users can independently configure:
- Tags Provider (for auto-labels, AI features)
- Embeddings Provider (for semantic search)
Result:
✓ Clear errors if Ollama not configured
✓ Can switch to OpenAI freely in admin
✓ No more hardcoded localhost in production
✓ Proper separation between local dev and Docker production
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Fixes an issue where the interface always defaulted to English for new users.
Now the browser language is automatically detected and applied on first visit.
Changes:
- lib/i18n/LanguageProvider.tsx:
- Add browser language detection using navigator.language (see the sketch below)
- Check if detected language is in supported languages list
- Auto-save detected language to localStorage
- Update HTML lang attribute for proper font rendering
Behavior:
- First visit: Detects browser language (e.g., fr-FR → fr)
- Returning visits: Uses saved language from localStorage
- Fallback: Defaults to English if language not supported
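A sketch of the detection logic; the storage key and helper name are illustrative, and the language list is the supported set:

```typescript
// Sketch of first-visit language detection (constant names are illustrative).
const SUPPORTED_LANGUAGES = ['en', 'fr', 'es', 'de', 'fa', 'it', 'pt', 'ru', 'zh', 'ja', 'ko', 'ar', 'hi', 'nl', 'pl'];
const STORAGE_KEY = 'language';

function detectInitialLanguage(): string {
  // Returning visit: reuse the previously saved choice.
  const saved = localStorage.getItem(STORAGE_KEY);
  if (saved && SUPPORTED_LANGUAGES.includes(saved)) return saved;

  // First visit: map e.g. "fr-FR" to "fr" and check that it is supported.
  const browserLang = navigator.language.split('-')[0].toLowerCase();
  const detected = SUPPORTED_LANGUAGES.includes(browserLang) ? browserLang : 'en';

  // Persist the result and keep the <html lang> attribute in sync.
  localStorage.setItem(STORAGE_KEY, detected);
  document.documentElement.lang = detected;
  return detected;
}
```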
Result:
✓ French users see French interface automatically
✓ All supported languages work (en, fr, es, de, fa, it, pt, ru, zh, ja, ko, ar, hi, nl, pl)
✓ Better UX for international users
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Fixes an issue where the notebook dropdown showed the icon name (e.g., "folder")
instead of the actual icon component.
Problem:
- note-card.tsx was displaying {notebook.icon} as text
- Users saw "folder", "book", etc. instead of icons
Solution:
- Import Lucide icon components (Folder, Book, Briefcase, etc.)
- Add ICON_MAP matching icon names to components
- Use getNotebookIcon() function to resolve icon name to component
- Render component as <NotebookIcon className="h-4 w-4 mr-2" />
Changes:
- components/note-card.tsx:
- Add LucideIcon and icon imports
- Add ICON_MAP and getNotebookIcon() helper
- Update notebook dropdown to render icon components
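The helper added to note-card.tsx looks roughly like this (only a few map entries shown; the full map mirrors notebooks-list.tsx):

```typescript
import { Folder, Book, Briefcase, type LucideIcon } from 'lucide-react';

// Map stored icon names (e.g. "folder") to their Lucide components.
// Only a few entries are shown here; the real map covers all notebook icons.
const ICON_MAP: Record<string, LucideIcon> = {
  folder: Folder,
  book: Book,
  briefcase: Briefcase,
};

// Resolve a stored name to a component, falling back to Folder.
function getNotebookIcon(name?: string | null): LucideIcon {
  return (name && ICON_MAP[name]) || Folder;
}

// In the dropdown:
//   const NotebookIcon = getNotebookIcon(notebook.icon);
//   <NotebookIcon className="h-4 w-4 mr-2" />
```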
Result:
✓ Notebook icons now display correctly in dropdown menu
✓ Consistent with notebooks-list.tsx implementation
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Critical fix for production deployment on Proxmox/Docker.
Problem:
- Runtime error: Prisma Client could not locate Query Engine for "debian-openssl-1.1.x"
- Wrong binary target generated (Windows dll instead of Linux .so.node)
- Wrong OpenSSL version (3.0.x instead of 1.1.x for Debian 11)
Root cause:
- schema.prisma didn't specify binaryTargets
- Prisma auto-detected Windows during local development
- Debian 11 (bullseye) uses OpenSSL 1.1.x, not 3.0.x
Solution:
1. Add binaryTargets to schema.prisma:
- "debian-openssl-1.1.x" for Docker/Proxmox
- "native" for local development
2. Fix Prisma folder permissions in Docker:
- RUN chown -R nextjs:nodejs /app/prisma
- Ensures Query Engine binary is readable by app user
Changes:
- prisma/schema.prisma: Add binaryTargets = ["debian-openssl-1.1.x", "native"]
- keep-notes/Dockerfile: Add chown for /app/prisma folder
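For reference, the resulting generator block looks roughly like this (other generator settings stay as the project already has them):

```prisma
generator client {
  provider      = "prisma-client-js"
  // The Debian target matches the OpenSSL 1.1.x runtime in the Docker image;
  // "native" keeps local development working.
  binaryTargets = ["debian-openssl-1.1.x", "native"]
}
```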
Verification:
✓ libquery_engine-debian-openssl-1.1.x.so.node exists
✓ Permissions: nextjs:nodejs (readable)
✓ Prisma Client loads successfully in container
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Fixes runtime error where Prisma Client could not locate the Query Engine:
"libquery_engine-debian-openssl-1.1.x.so.node" not found
Root cause:
- Next.js standalone output does not include Prisma Query Engine binaries
- The .prisma folder in node_modules contains the required binary files
Solution:
- Copy node_modules/.prisma folder in Docker runner stage
- This includes libquery_engine-debian-openssl-1.1.x.so.node
- Prisma Client can now find and load the Query Engine at runtime
Tested:
✓ Docker build successful
✓ Container starts without Prisma errors
✓ Application ready in 40ms
Changes:
- keep-notes/Dockerfile: Add COPY for node_modules/.prisma folder
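The added runner-stage copy looks roughly like this (the builder stage name is an assumption; the nextjs:nodejs user matches the existing Dockerfile):

```dockerfile
# Runner stage: standalone output does not bundle the Prisma engines,
# so copy the generated .prisma folder (including
# libquery_engine-debian-openssl-1.1.x.so.node) alongside it.
COPY --from=builder --chown=nextjs:nodejs /app/node_modules/.prisma ./node_modules/.prisma
```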
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
## Docker Configuration
- Enhance docker-compose.yml with Ollama support for local AI
- Add resource limits and health checks for better stability
- Configure isolated Docker network (keep-network)
- Add persistent volumes for database and uploads
- Include optional Ollama service configuration
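A trimmed, illustrative sketch of the compose layout (ports, limits, and the health check command are placeholders; see docker-compose.yml for the actual values):

```yaml
services:
  app:
    build: ./keep-notes
    restart: unless-stopped
    mem_limit: 1g                   # resource limit for LXC/VM hosts
    networks: [keep-network]
    volumes:
      - app-data:/app/data          # database
      - app-uploads:/app/uploads    # user uploads
    healthcheck:
      test: ["CMD", "wget", "-qO-", "http://localhost:3000"]
      interval: 30s
      timeout: 5s
      retries: 3

  # Optional local AI: reachable from the app as http://ollama:11434
  ollama:
    image: ollama/ollama
    restart: unless-stopped
    networks: [keep-network]
    volumes:
      - ollama-data:/root/.ollama

networks:
  keep-network:

volumes:
  app-data:
  app-uploads:
  ollama-data:
```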
## Deployment Files
- Add DOCKER_DEPLOYMENT.md with comprehensive deployment guide
- Add deploy.sh automation script with 10+ commands
- Document Proxmox LXC container setup
- Add backup/restore procedures
- Include SSL/HTTPS and reverse proxy configuration
## Docker Build Optimization
- Improve .dockerignore for faster builds
- Exclude development files and debug logs
- Add comprehensive exclusions for IDE, OS, and testing files
## Features
- Support for OpenAI API (cloud AI)
- Support for Ollama (local AI models)
- Automatic database backups
- Health checks and auto-restart
- Resource limits for VM/LXC environments
## Documentation
- Complete Proxmox deployment guide
- Troubleshooting section
- Security best practices
- Performance tuning recommendations
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Add AI Provider Testing page (/admin/ai-test) with Tags and Embeddings tests
- Add new AI providers: CustomOpenAI, DeepSeek, OpenRouter
- Add API routes for AI config, models listing, and testing endpoints
- Add UX Design Specification document for Phase 1 MVP AI
- Add PRD Phase 1 MVP AI planning document
- Update admin settings and sidebar navigation
- Fix AI factory for multi-provider support
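DeepSeek and OpenRouter expose OpenAI-compatible APIs, so the new providers can reuse the OpenAI SDK with a different base URL. A sketch (function and map names are illustrative; the endpoint URLs are the providers' documented defaults and should be verified):

```typescript
import OpenAI from 'openai';

// Sketch: OpenAI-compatible providers differ mainly in base URL and API key.
const COMPATIBLE_ENDPOINTS = {
  deepseek: 'https://api.deepseek.com/v1',
  openrouter: 'https://openrouter.ai/api/v1',
} as const;

function createCompatibleClient(
  provider: keyof typeof COMPATIBLE_ENDPOINTS | 'custom',
  apiKey: string,
  customBaseUrl?: string // used for the CustomOpenAI provider
): OpenAI {
  const baseURL =
    provider === 'custom' ? customBaseUrl : COMPATIBLE_ENDPOINTS[provider];
  if (!baseURL) throw new Error(`No base URL configured for provider "${provider}"`);
  return new OpenAI({ apiKey, baseURL });
}
```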
## Bug Fixes
### Note Card Actions
- Fix broken size change functionality (missing state declaration)
- Implement React 19 useOptimistic for instant UI feedback
- Add startTransition for non-blocking updates
- Ensure smooth animations without page refresh
- All note actions now work: pin, archive, color, size, checklist
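The pattern behind these fixes, in sketch form (the Note type and updateNote server action are stand-ins for the app's real ones):

```typescript
'use client';

import { useOptimistic, startTransition } from 'react';
import { useRouter } from 'next/navigation';

// Sketch of the optimistic-update pattern used for pin/archive/color/size.
type Note = { id: string; pinned: boolean };

function usePinToggle(note: Note, updateNote: (id: string, data: Partial<Note>) => Promise<void>) {
  const router = useRouter();
  const [optimisticNote, setOptimisticNote] = useOptimistic(
    note,
    (current, patch: Partial<Note>) => ({ ...current, ...patch })
  );

  const togglePin = () => {
    startTransition(async () => {
      // 1. Update the UI immediately...
      setOptimisticNote({ pinned: !optimisticNote.pinned });
      // 2. ...then persist via the server action and refresh cached data.
      await updateNote(note.id, { pinned: !optimisticNote.pinned });
      router.refresh();
    });
  };

  return { optimisticNote, togglePin };
}
```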
### Markdown LaTeX Rendering
- Add remark-math and rehype-katex plugins
- Support inline equations with dollar sign syntax
- Support block equations with double dollar sign syntax
- Import KaTeX CSS for proper styling
- Equations now render correctly instead of showing raw LaTeX
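The rendering setup amounts to roughly this, assuming note content already goes through react-markdown (the component name is illustrative):

```tsx
import ReactMarkdown from 'react-markdown';
import remarkMath from 'remark-math';
import rehypeKatex from 'rehype-katex';
import 'katex/dist/katex.min.css'; // required for equations to be styled

// Sketch: $...$ renders inline math, $$...$$ renders block math.
export function NoteMarkdown({ content }: { content: string }) {
  return (
    <ReactMarkdown remarkPlugins={[remarkMath]} rehypePlugins={[rehypeKatex]}>
      {content}
    </ReactMarkdown>
  );
}
```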
## Technical Details
- Replace undefined currentNote references with optimistic state
- Add optimistic updates before server actions for instant feedback
- Use router.refresh() in transitions for smart cache invalidation
- Install remark-math, rehype-katex, and katex packages
## Testing
- Build passes successfully with no TypeScript errors
- Dev server hot-reloads changes correctly
- Added multi-provider AI infrastructure (OpenAI/Ollama)
- Implemented real-time tag suggestions with debounced analysis
- Created AI diagnostics and database maintenance tools in Settings
- Added automated garbage collection for orphan labels
- Refined UX with deterministic color hashing and interactive ghost tags
- Remove buggy Undo/Redo that saved character-by-character
- Simplify state to plain useState like Google Keep
- Create toast component with success/error/warning/info types
- Toast notifications auto-dismiss after 3s with smooth animations
- Add ToastProvider in layout
- Remove all JavaScript alert() calls
- Production-ready notification system
- Add debounced state updates for title and content (500ms delay)
- Immediate UI updates with delayed history saving
- Prevent one-letter-per-undo issue
- Add cleanup for debounce timers on unmount
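The debounce amounts to roughly this (the 500ms delay is from the commit; the hook itself is an illustrative sketch):

```typescript
import { useEffect, useRef } from 'react';

// Sketch: the input updates UI state immediately, but history/persistence only
// runs after 500ms of inactivity, so undo is not one letter at a time.
function useDebouncedCallback<T extends unknown[]>(
  callback: (...args: T) => void,
  delay = 500
) {
  const timer = useRef<ReturnType<typeof setTimeout> | null>(null);

  // Clear any pending timer when the component unmounts.
  useEffect(() => {
    return () => {
      if (timer.current) clearTimeout(timer.current);
    };
  }, []);

  return (...args: T) => {
    if (timer.current) clearTimeout(timer.current);
    timer.current = setTimeout(() => callback(...args), delay);
  };
}

// Usage sketch: setTitle(value) runs immediately for the input,
// while debouncedSave(value) commits after the user stops typing.
```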