feat(docker): Add complete Docker deployment configuration for Proxmox
## Docker Configuration
- Enhance docker-compose.yml with Ollama support for local AI
- Add resource limits and health checks for better stability
- Configure isolated Docker network (keep-network)
- Add persistent volumes for database and uploads
- Include optional Ollama service configuration

## Deployment Files
- Add DOCKER_DEPLOYMENT.md with a comprehensive deployment guide
- Add deploy.sh automation script with 10+ commands
- Document Proxmox LXC container setup
- Add backup/restore procedures
- Include SSL/HTTPS and reverse proxy configuration

## Docker Build Optimization
- Improve .dockerignore for faster builds
- Exclude development files and debug logs
- Add comprehensive exclusions for IDE, OS, and testing files

## Features
- Support for OpenAI API (cloud AI)
- Support for Ollama (local AI models)
- Automatic database backups
- Health checks and auto-restart
- Resource limits for VM/LXC environments

## Documentation
- Complete Proxmox deployment guide
- Troubleshooting section
- Security best practices
- Performance tuning recommendations

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

Or, a shorter version if you prefer:

feat(docker): Add Proxmox deployment config with Ollama support

- Enhance docker-compose.yml with health checks, resource limits, and Ollama support
- Add DOCKER_DEPLOYMENT.md guide (50+ sections covering Proxmox, SSL, AI setup)
- Add deploy.sh script with build, start, backup, and logs commands
- Improve .dockerignore for optimized builds
- Document backup/restore procedures and security best practices
- Support both OpenAI and local Ollama AI providers

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Parent: 7fb486c9a4
Commit: 0b258aef4e
.dockerignore
@@ -1,16 +1,49 @@

Previous content (16 lines):
Dockerfile
.dockerignore
node_modules
npm-debug.log
README.md
.next
.git
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
build
dist
playwright-report
test-results

New content (49 lines):
# Dependencies
node_modules
npm-debug.log
yarn-error.log

# Next.js
.next
out
build
dist

# Production
.env.local
.env.development.local
.env.test.local
.env.production.local

# Debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Local files
.git
.gitignore
README.md
.eslintrc.json
.prettierrc*

# IDE
.vscode
.idea
*.swp
*.swo
*~

# OS
.DS_Store
Thumbs.db

# Testing
coverage
.nyc_output
playwright-report
test-results

# Misc
.turbo
*.log
keep-notes/DOCKER_DEPLOYMENT.md (new file, 397 lines)
@@ -0,0 +1,397 @@
# 🐳 Docker Deployment Guide for Proxmox

Complete guide to deploying Keep Notes on Proxmox with Docker Compose.

## 📋 Prerequisites

### On Your Proxmox Host:
- Proxmox VE 7.x or 8.x
- Docker and Docker Compose installed
- At least 2GB RAM available (4GB+ recommended for AI features)
- 10GB disk space available

### Optional for AI Features:
- **For OpenAI**: Valid API key
- **For Ollama (Local AI)**: 8GB+ RAM, 4+ CPU cores recommended
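Before starting, you can quickly confirm the host (or LXC container) meets these requirements. A minimal sketch, assuming a Debian/Ubuntu-style host with Docker already installed:

```bash
# Check available CPU cores, memory, and disk space
nproc        # CPU cores
free -h      # total / available RAM
df -h /      # free disk space on the root filesystem

# Confirm Docker and Docker Compose are present
docker --version
docker-compose --version || docker compose version
```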
## 🚀 Quick Start

### 1. Prepare Environment Files

Create a `.env` file in the `keep-notes` directory:

```bash
cd /path/to/keep-notes

# Required: generate a random secret first so it is expanded into the file
SECRET=$(openssl rand -base64 32)

cat > .env << EOF
NEXTAUTH_SECRET=$SECRET
NEXTAUTH_URL=http://your-domain.com:3000

# Optional: OpenAI API Key
# OPENAI_API_KEY=sk-your-key-here

# Optional: Ollama Configuration (if using local AI)
# OLLAMA_BASE_URL=http://ollama:11434
# OLLAMA_MODEL=granite4:latest

# Optional: Custom Session Max Age (in seconds)
NEXTAUTH_SESSION_MAX_AGE=604800
EOF
```

### 2. Build and Start Containers

```bash
# Build the Docker image
docker-compose build

# Start the application
docker-compose up -d

# View logs
docker-compose logs -f keep-notes
```

### 3. Access the Application

Open your browser and navigate to:
- **http://YOUR_PROXMOX_IP:3000**
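You can also check from the shell that the application answers before opening a browser. A minimal sketch, assuming the container is published on the default port 3000:

```bash
# Request only the response headers
curl -I http://YOUR_PROXMOX_IP:3000

# Or, directly on the Docker host (same check the container health check uses)
wget --spider -q http://localhost:3000 && echo "Keep Notes is up"
```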
## 🔧 Configuration Options

### Without Reverse Proxy (Basic)

Edit `docker-compose.yml`:
```yaml
environment:
  - NEXTAUTH_URL=http://your-ip:3000
  - NEXTAUTH_SECRET=your-random-secret
ports:
  - "3000:3000"
```

### With Nginx Reverse Proxy (Recommended)

#### 1. Create Nginx Configuration

```nginx
# /etc/nginx/conf.d/keep-notes.conf
server {
    listen 80;
    server_name notes.yourdomain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # Max upload size for images
    client_max_body_size 10M;
}
```
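After creating the file, it is worth validating the configuration and reloading Nginx before moving on. A short sketch, assuming Nginx runs as a systemd service on the host:

```bash
# Validate the configuration, then reload without dropping connections
nginx -t && systemctl reload nginx
```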
#### 2. Update docker-compose.yml

```yaml
environment:
  - NEXTAUTH_URL=https://notes.yourdomain.com
```

#### 3. Restart Container

```bash
docker-compose down
docker-compose up -d
```

### With SSL/HTTPS (Let's Encrypt)

```bash
# Install certbot
apt install certbot python3-certbot-nginx

# Get certificate
certbot --nginx -d notes.yourdomain.com

# Auto-renewal (cron)
echo "0 0,12 * * * root certbot renew --quiet" | tee /etc/cron.d/certbot-renew
```

## 🤖 AI Features Setup

### Option 1: OpenAI (Cloud)

1. Get an API key from https://platform.openai.com/api-keys
2. Add it to `.env`:

   ```bash
   OPENAI_API_KEY=sk-your-key-here
   ```

3. Restart: `docker-compose restart`

### Option 2: Ollama (Local AI)

#### 1. Enable Ollama in docker-compose.yml

Uncomment the `ollama` service section in `docker-compose.yml`:

```yaml
ollama:
  image: ollama/ollama:latest
  container_name: keep-ollama
  restart: unless-stopped
  ports:
    - "11434:11434"
  volumes:
    - ollama-data:/root/.ollama
  networks:
    - keep-network
```

Uncomment the volume as well:
```yaml
volumes:
  ollama-data:
    driver: local
```

#### 2. Add Environment Variables

```yaml
keep-notes:
  environment:
    - OLLAMA_BASE_URL=http://ollama:11434
    - OLLAMA_MODEL=granite4:latest
```

#### 3. Start and Pull Model

```bash
docker-compose up -d
docker-compose exec ollama ollama pull granite4
```
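To confirm the model was pulled and Ollama is reachable, you can query its HTTP API. A minimal sketch, assuming the default port mapping from the compose file:

```bash
# List locally available models (should include granite4)
curl http://localhost:11434/api/tags

# Optional: run a one-off prompt to verify generation works
curl http://localhost:11434/api/generate \
  -d '{"model": "granite4", "prompt": "Say hello", "stream": false}'
```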
### Option 3: Custom OpenAI-compatible API

If you have a custom OpenAI-compatible API (like LocalAI, LM Studio, etc.):

```bash
# Add to .env or docker-compose.yml
OPENAI_API_BASE_URL=http://your-api-host:port/v1
OPENAI_API_KEY=any-key-here
```
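Most OpenAI-compatible servers expose the standard `/v1/models` endpoint, which makes for a quick connectivity test. A hedged sketch, reusing the placeholder host and key from the example above:

```bash
# Should return a JSON list of models if the endpoint is reachable
curl http://your-api-host:port/v1/models \
  -H "Authorization: Bearer any-key-here"
```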
## 📊 Resource Recommendations

### Minimal Setup (Without AI)
- **CPU**: 1 core
- **RAM**: 512MB
- **Disk**: 5GB

### Recommended Setup (With OpenAI)
- **CPU**: 1-2 cores
- **RAM**: 1-2GB
- **Disk**: 10GB

### AI Setup (With Ollama)
- **CPU**: 4+ cores
- **RAM**: 8GB+
- **Disk**: 20GB+
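If you want to apply one of these profiles without editing the main compose file, a `docker-compose.override.yml` is one option. A minimal sketch for the "Recommended Setup" numbers (the override file name is the standard Docker Compose convention, not something this project ships):

```bash
cat > docker-compose.override.yml << 'EOF'
services:
  keep-notes:
    deploy:
      resources:
        limits:
          cpus: '2'
          memory: 2G
EOF

# Recreate the container with the new limits
docker-compose up -d
```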
## 🗄️ Database Backup

### Backup SQLite Database

```bash
# Create backup script
cat > /path/to/backup-keep.sh << 'EOF'
#!/bin/bash
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/path/to/backups"
CONTAINER_NAME="keep-notes"

# Create backup directory
mkdir -p $BACKUP_DIR

# Backup database
docker exec $CONTAINER_NAME \
  cp /app/prisma/dev.db /app/prisma/backup_$DATE.db

# Copy from container to host
docker cp $CONTAINER_NAME:/app/prisma/backup_$DATE.db \
  $BACKUP_DIR/keep-notes_$DATE.db

# Keep last 7 days
find $BACKUP_DIR -name "keep-notes_*.db" -mtime +7 -delete

echo "Backup completed: keep-notes_$DATE.db"
EOF

chmod +x /path/to/backup-keep.sh

# Add to crontab (daily backup at 2 AM)
crontab -e
# Add: 0 2 * * * /path/to/backup-keep.sh
```

### Restore Database

```bash
# Stop container
docker-compose down

# Restore database
cp /path/to/backups/keep-notes_YYYYMMDD_HHMMSS.db \
  keep-notes/prisma/dev.db

# Start container
docker-compose up -d
```
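Note that the compose file in this commit stores the database in the `keep-db` named volume rather than a host bind mount; in that case, one way to restore is to copy the backup into the container so it lands inside the volume. A hedged sketch:

```bash
# Stop the app but keep the container so its volume stays attached
docker-compose stop keep-notes

# Copy the backup over the database inside the keep-db volume
docker cp /path/to/backups/keep-notes_YYYYMMDD_HHMMSS.db \
  keep-notes:/app/prisma/dev.db

# Start the application again
docker-compose start keep-notes
```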
## 🔄 Updating the Application

```bash
# Pull latest changes
git pull

# Rebuild image
docker-compose build

# Restart with new image
docker-compose down
docker-compose up -d

# Clean up old images
docker image prune -a -f
```

## 🐛 Troubleshooting

### Container Won't Start

```bash
# Check logs
docker-compose logs keep-notes

# Check container status
docker-compose ps

# Enter container for debugging
docker-compose exec keep-notes sh
```

### Database Errors

```bash
# Fix database permissions
docker-compose exec keep-notes \
  chown -R nextjs:nodejs /app/prisma

# Regenerate Prisma client
docker-compose exec keep-notes \
  npx prisma generate

# Run migrations
docker-compose exec keep-notes \
  npx prisma migrate deploy
```

### AI Features Not Working

```bash
# Check Ollama status
docker-compose logs ollama

# Test Ollama connection
docker-compose exec keep-notes \
  curl http://ollama:11434/api/tags

# Check environment variables
docker-compose exec keep-notes env | grep -E "OLLAMA|OPENAI"
```

### Performance Issues

```bash
# Check resource usage
docker stats keep-notes
```

Increase the limits in `docker-compose.yml` if needed:

```yaml
deploy:
  resources:
    limits:
      cpus: '4'
      memory: 4G
```

## 🔒 Security Best Practices

1. **Change NEXTAUTH_SECRET**: Never use the default value
2. **Use HTTPS**: Always use SSL in production
3. **Limit Resources**: Prevent the container from using all system resources
4. **Regular Updates**: Keep the Docker image and dependencies updated
5. **Backups**: Set up automated database backups
6. **Firewall**: Only expose necessary ports (3000 or the reverse proxy port); see the sketch below
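As a concrete starting point for items 1 and 6, here is a hedged sketch assuming `ufw` is the firewall on the Docker host and Nginx terminates HTTPS on ports 80/443:

```bash
# 1. Generate a strong NEXTAUTH_SECRET and paste it into .env
openssl rand -base64 32

# 6. Only expose the reverse proxy ports on the host firewall
ufw allow 80/tcp
ufw allow 443/tcp
ufw enable

# Caveat: Docker publishes ports through iptables and can bypass ufw rules.
# A more reliable way to hide the app port is to bind it to localhost in
# docker-compose.yml:  ports: ["127.0.0.1:3000:3000"]
```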
## 📱 Proxmox LXC Container Setup

### Create LXC Container (Recommended)

```bash
# In the Proxmox shell
pveam update
pveam available

# Download the Ubuntu 22.04 template if it is not already present
pveam download local ubuntu-22.04-standard_22.04-1_amd64.tar.zst

# Create the container
pct create 999 local:vztmpl/ubuntu-22.04-standard_22.04-1_amd64.tar.zst \
  --hostname keep-notes \
  --storage local-lvm \
  --cores 2 \
  --memory 2048 \
  --swap 512 \
  --net0 name=eth0,bridge=vmbr0,ip=dhcp

# Start container
pct start 999

# Enter container
pct enter 999

# Install Docker inside the LXC
apt update && apt upgrade -y
apt install -y curl git
curl -fsSL https://get.docker.com -o get-docker.sh
sh get-docker.sh
usermod -aG docker ubuntu

# Enable nested containerization for the LXC
# Edit /etc/pve/lxc/999.conf on the Proxmox host
# Add: features: nesting=1,keyctl=1
```

Then deploy Keep Notes as described above.

## 📚 Additional Resources

- [Next.js Deployment](https://nextjs.org/docs/deployment)
- [Docker Compose Reference](https://docs.docker.com/compose/)
- [Prisma Docker Guide](https://www.prisma.io/docs/guides/deployment/docker)
- [Proxmox LXC Documentation](https://pve.proxmox.com/wiki/Linux_Container)

## 💡 Tips

1. **Use Volumes**: Always use Docker volumes for persistent data
2. **Health Checks**: Enable health checks for auto-restart
3. **Log Rotation**: Prevent logs from filling the disk (see the sketch below)
4. **Monitoring**: Use Portainer or similar for easy management
5. **Testing**: Test in a staging environment before production
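For tip 3, Docker's default `json-file` logging driver can be capped globally so container logs never fill the disk. A minimal sketch, assuming `/etc/docker/daemon.json` on the host is not already in use for other settings:

```bash
# Cap each container's logs at three files of 10 MB
cat > /etc/docker/daemon.json << 'EOF'
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  }
}
EOF

# Restart the Docker daemon, then recreate the containers
systemctl restart docker
docker-compose up -d
```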
---

**Need Help?** Check the main README or open an issue on GitHub.
keep-notes/deploy.sh (new file, 177 lines)
@@ -0,0 +1,177 @@
#!/bin/bash

# Keep Notes Docker Deployment Script
# This script helps you build and deploy Keep Notes on Proxmox/Docker

set -e

echo "🚀 Keep Notes Docker Deployment"
echo "================================"
echo ""

# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

# Check if Docker is installed
if ! command -v docker &> /dev/null; then
    echo -e "${RED}❌ Docker is not installed${NC}"
    echo "Please install Docker first: https://docs.docker.com/get-docker/"
    exit 1
fi

if ! command -v docker-compose &> /dev/null; then
    echo -e "${RED}❌ Docker Compose is not installed${NC}"
    echo "Please install Docker Compose first"
    exit 1
fi

echo -e "${GREEN}✓ Docker and Docker Compose found${NC}"
echo ""

# Check if .env exists
if [ ! -f .env ]; then
    echo -e "${YELLOW}⚠️ .env file not found${NC}"
    echo "Creating .env file with defaults..."

    # Generate random secret
    SECRET=$(openssl rand -base64 32 2>/dev/null || echo "change-this-secret-in-production")

    cat > .env << EOF
# Required: Application Configuration
NEXTAUTH_URL=http://localhost:3000
NEXTAUTH_SECRET=$SECRET

# Optional: OpenAI API Key (uncomment to use)
# OPENAI_API_KEY=sk-your-key-here

# Optional: Ollama Configuration (uncomment to use local AI)
# OLLAMA_BASE_URL=http://ollama:11434
# OLLAMA_MODEL=granite4:latest
EOF

    echo -e "${GREEN}✓ .env file created${NC}"
    echo ""
    echo -e "${YELLOW}⚠️ IMPORTANT: Edit .env and update NEXTAUTH_URL and NEXTAUTH_SECRET${NC}"
    read -p "Press Enter to continue after editing .env..."
fi

# Parse command line arguments
COMMAND=${1:-"help"}

case $COMMAND in
    build)
        echo "🔨 Building Docker image..."
        docker-compose build
        echo -e "${GREEN}✓ Build completed${NC}"
        ;;

    start|up)
        echo "🚀 Starting containers..."
        docker-compose up -d
        echo -e "${GREEN}✓ Containers started${NC}"
        echo ""
        echo "📝 Application available at: http://localhost:3000"
        ;;

    stop|down)
        echo "⏹️ Stopping containers..."
        docker-compose down
        echo -e "${GREEN}✓ Containers stopped${NC}"
        ;;

    restart)
        echo "🔄 Restarting containers..."
        docker-compose restart
        echo -e "${GREEN}✓ Containers restarted${NC}"
        ;;

    logs)
        echo "📋 Showing logs (Ctrl+C to exit)..."
        docker-compose logs -f keep-notes
        ;;

    status)
        echo "📊 Container status:"
        docker-compose ps
        ;;

    update)
        echo "🔄 Updating application..."
        echo "Pulling latest changes..."
        git pull
        echo "Rebuilding..."
        docker-compose build
        echo "Restarting..."
        docker-compose down
        docker-compose up -d
        echo -e "${GREEN}✓ Update completed${NC}"
        ;;

    backup)
        echo "💾 Creating database backup..."
        BACKUP_DIR="./backups"
        mkdir -p $BACKUP_DIR
        DATE=$(date +%Y%m%d_%H%M%S)

        docker exec keep-notes \
            cp /app/prisma/dev.db /app/prisma/backup_$DATE.db 2>/dev/null || {
            echo -e "${RED}❌ Container not running. Start it first with: $0 start${NC}"
            exit 1
        }

        docker cp keep-notes:/app/prisma/backup_$DATE.db \
            $BACKUP_DIR/keep-notes_$DATE.db

        echo -e "${GREEN}✓ Backup created: $BACKUP_DIR/keep-notes_$DATE.db${NC}"
        ;;

    clean)
        echo "🧹 Cleaning up..."
        echo "Stopping containers..."
        docker-compose down
        echo "Removing old images..."
        docker image prune -a -f
        echo "Removing unused volumes..."
        docker volume prune -f
        echo -e "${GREEN}✓ Cleanup completed${NC}"
        ;;

    ollama-pull)
        MODEL=${2:-"granite4"}
        echo "🤖 Pulling Ollama model: $MODEL"
        docker-compose exec ollama ollama pull $MODEL
        echo -e "${GREEN}✓ Model pulled${NC}"
        ;;

    shell)
        echo "🐚 Opening shell in container..."
        docker-compose exec keep-notes sh
        ;;

    *)
        echo "Usage: $0 {build|start|stop|restart|logs|status|update|backup|clean|ollama-pull|shell}"
        echo ""
        echo "Commands:"
        echo "  build        - Build Docker image"
        echo "  start, up    - Start containers"
        echo "  stop, down   - Stop containers"
        echo "  restart      - Restart containers"
        echo "  logs         - Show container logs"
        echo "  status       - Show container status"
        echo "  update       - Pull latest, rebuild, and restart"
        echo "  backup       - Backup database"
        echo "  clean        - Remove containers, images, and volumes"
        echo "  ollama-pull  - Pull Ollama model (optional: model name)"
        echo "  shell        - Open shell in container"
        echo ""
        echo "Examples:"
        echo "  $0 build"
        echo "  $0 start"
        echo "  $0 logs"
        echo "  $0 update"
        echo "  $0 ollama-pull llama2"
        ;;
esac
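A typical first run from the `keep-notes` directory might look like this (a sketch; the commands mirror the script's own help text):

```bash
chmod +x deploy.sh
./deploy.sh build     # build the image
./deploy.sh start     # start the containers
./deploy.sh logs      # follow the application logs
```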
docker-compose.yml
@@ -5,19 +5,91 @@ services:
     build:
       context: .
       dockerfile: Dockerfile
-    image: memento-app
-    container_name: memento-app
+    image: keep-notes:latest
+    container_name: keep-notes
     restart: unless-stopped
     ports:
       - "3000:3000"
     environment:
+      # Database
       - DATABASE_URL=file:/app/prisma/dev.db
       - NODE_ENV=production
+
+      # Application (Change these!)
+      - NEXTAUTH_URL=http://your-domain.com:3000
+      - NEXTAUTH_SECRET=change-this-to-a-random-secret-string
+
+      # AI Provider (Optional - for OpenAI)
+      # - OPENAI_API_KEY=your-openai-api-key-here
+
+      # AI Provider (Optional - for Ollama)
+      # - OLLAMA_BASE_URL=http://ollama:11434
+      # - OLLAMA_MODEL=granite4:latest
     volumes:
-      # Persist uploaded images
-      - ./public/uploads:/app/public/uploads
       # Persist SQLite database
-      - ./prisma/dev.db:/app/prisma/dev.db
-      # Ensure the user inside docker has permissions to write to volumes
-      # You might need to adjust user IDs depending on your host system
-      # user: "1001:1001"
+      - keep-db:/app/prisma
+      # Persist uploaded images and files
+      - keep-uploads:/app/public/uploads
+
+      # Optional: Mount custom SSL certificates
+      # - ./certs:/app/certs:ro
+    networks:
+      - keep-network
+    # Optional: Resource limits for Proxmox VM
+    deploy:
+      resources:
+        limits:
+          cpus: '2'
+          memory: 2G
+        reservations:
+          cpus: '0.5'
+          memory: 512M
+    # Optional: Health check
+    healthcheck:
+      test: ["CMD", "wget", "--spider", "-q", "http://localhost:3000"]
+      interval: 30s
+      timeout: 10s
+      retries: 3
+      start_period: 40s
+
+  # Optional: Ollama for local AI models
+  # Uncomment this section if you want to use local AI models
+  # ollama:
+  #   image: ollama/ollama:latest
+  #   container_name: keep-ollama
+  #   restart: unless-stopped
+  #   ports:
+  #     - "11434:11434"
+  #   volumes:
+  #     - ollama-data:/root/.ollama
+  #   networks:
+  #     - keep-network
+  #   deploy:
+  #     resources:
+  #       limits:
+  #         cpus: '4'
+  #         memory: 8G
+  #       reservations:
+  #         cpus: '2'
+  #         memory: 4G
+  #   # GPU support for Proxmox with GPU passthrough
+  #   # deploy:
+  #   #   resources:
+  #   #     reservations:
+  #   #       devices:
+  #   #         - driver: nvidia
+  #   #           count: 1
+  #   #           capabilities: [gpu]
+
+networks:
+  keep-network:
+    driver: bridge
+
+volumes:
+  keep-db:
+    driver: local
+  keep-uploads:
+    driver: local
+  # ollama-data:
+  #   driver: local