The paragraph-refactor service was reading OLLAMA_BASE_URL directly from
the environment instead of using the AI provider configured in the
database. This caused an "OLLAMA error" even when OpenAI was configured
in the admin interface.
Changes:
- paragraph-refactor.service.ts: Now resolves the provider through
getSystemConfig() and getTagsProvider() from the factory instead of
calling Ollama directly (sketched below)
- factory.ts: Added proper error messages when API keys are missing
- .env.docker.example: Updated with new provider configuration
variables (AI_PROVIDER_TAGS, AI_PROVIDER_EMBEDDING)
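For reference, the new variables in .env.docker.example look roughly like
this; the example values are assumptions based on the providers mentioned
in this commit:

    # Select a provider per feature ("openai" and "ollama" are assumed values)
    AI_PROVIDER_TAGS="openai"
    AI_PROVIDER_EMBEDDING="ollama"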
This fixes the issue where AI reformulation features (Clarify, Shorten,
Improve Style) would fail with OLLAMA errors even when OpenAI was
properly configured in the admin settings.
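A minimal sketch of the new flow in paragraph-refactor.service.ts, assuming
getSystemConfig() returns the admin-configured settings and getTagsProvider()
returns a provider client; the import path, the signatures, and the
complete() method are illustrative, not the actual interface:

    // Sketch only: resolve the provider from the database-backed config
    // instead of reading OLLAMA_BASE_URL from the environment.
    import { getSystemConfig, getTagsProvider } from "./factory"; // path assumed

    export async function refactorParagraph(
      text: string,
      instruction: string,
    ): Promise<string> {
      const config = await getSystemConfig();         // admin-configured provider settings
      const provider = await getTagsProvider(config); // throws a clear error if an API key is missing
      return provider.complete(`${instruction}:\n\n${text}`); // hypothetical provider method
    }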
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Critical fix for AI features (reformulation, auto-labels) in Docker.
Only title generation was working because it had a hardcoded fallback.
Problem:
- Code uses OLLAMA_BASE_URL environment variable
- docker-compose.yml was setting OLLAMA_API_URL (the wrong variable name)
- .env.docker was missing on server
- Container couldn't connect to Ollama service
Root cause:
- Environment variable name mismatch
- Missing OLLAMA_BASE_URL configuration
- localhost:11434 doesn't work in Docker (need service name)
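In practice the lookup in the service code amounted to something like this
(the exact fallback expression is an assumption):

    // The app reads OLLAMA_BASE_URL, but docker-compose.yml only set OLLAMA_API_URL,
    // so the localhost fallback kicked in; inside the app container,
    // "localhost" is the container itself, not the Ollama service.
    const baseUrl = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";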
Solution:
1. Update docker-compose.yml (sketched after this list):
- Change OLLAMA_API_URL → OLLAMA_BASE_URL
- Add OLLAMA_MODEL environment variable
- Default to http://ollama:11434 (Docker service name)
2. Update .env.docker:
- Add OLLAMA_BASE_URL="http://ollama:11434"
- Add OLLAMA_MODEL="granite4:latest"
- Document AI provider configuration
3. Update .env.docker.example:
- Add examples for different Ollama setups
- Document Docker service name vs external IP
- Add OpenAI configuration example
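Roughly what step 1 looks like in docker-compose.yml; the "app" service name
and the ${VAR:-default} substitution style are assumptions, the values are
the ones listed above:

    # Placeholder service name "app"; only the relevant keys are shown.
    app:
      environment:
        # Renamed from OLLAMA_API_URL; defaults to the Docker service name.
        OLLAMA_BASE_URL: ${OLLAMA_BASE_URL:-http://ollama:11434}
        OLLAMA_MODEL: ${OLLAMA_MODEL:-granite4:latest}
      networks:
        - memento-network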
Result:
✓ All AI features now work in Docker (titles, reformulation, auto-labels)
✓ Proper connection to Ollama container via Docker network
✓ Clear documentation for different deployment scenarios
Technical details:
- Docker service name "ollama" resolves to container IP
- Port 11434 accessible within memento-network
- Fallback to localhost only for local development
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Fixes a redirect issue where the application redirects to localhost:3000
instead of the server IP when accessed from a remote browser.
Problem:
- Accessing http://192.168.1.190:3000 redirects to http://localhost:3000
- NEXTAUTH_URL was hardcoded to localhost in .env
- Remote users cannot access the application
Solution:
1. Create .env.docker with server-specific configuration
- NEXTAUTH_URL="http://192.168.1.190:3000"
- Ollama URL for container-to-container communication
2. Update docker-compose.yml (sketched after this list):
- Add env_file: .env.docker
- Loads Docker-specific environment variables
3. Create .env.docker.example as a template:
- Document NEXTAUTH_URL configuration
- Users can copy and customize for their server IP/domain
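A sketch of the docker-compose.yml change from step 2; the "app" service
name is a placeholder:

    app:
      # Loads Docker-specific variables such as NEXTAUTH_URL from .env.docker.
      env_file:
        - .env.docker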
Files:
- .env.docker: Docker environment (gitignored)
- .env.docker.example: Template for users
- docker-compose.yml: Loads .env.docker
- .env.example: Updated NEXTAUTH_URL documentation
Usage:
On the production server, edit .env.docker with your server IP:
NEXTAUTH_URL="http://YOUR_IP:3000"
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>