sepehr e6bcdea641 fix: remove hardcoded localhost fallbacks, require explicit config
Critical fix for Docker deployment where AI features were trying to connect
to localhost:11434 instead of using the configured provider (Ollama Docker service
or OpenAI).

Problems fixed:
1. Reformulation (clarify/shorten/improve) failing with ECONNREFUSED 127.0.0.1:11434
2. Auto-labels failing with same error
3. Notebook summaries failing
4. Could not switch from Ollama to OpenAI in admin

Root cause:
- Code had hardcoded fallback to 'http://localhost:11434' in multiple places
- .env.docker file not created on the server (it is gitignored)
- No validation that required environment variables were set

Changes:

1. lib/ai/factory.ts:
   - Remove hardcoded 'http://localhost:11434' fallback
   - Only use localhost for local development (NODE_ENV !== 'production')
   - Throw error if OLLAMA_BASE_URL is not set in production (see the sketch after this list)

2. lib/ai/providers/ollama.ts:
   - Remove default parameter 'http://localhost:11434' from constructor
   - Require baseUrl to be explicitly passed
   - Throw error if baseUrl is missing

3. lib/ai/services/paragraph-refactor.service.ts:
   - Remove 'http://localhost:11434' fallback (2 locations)
   - Require OLLAMA_BASE_URL to be set
   - Throw clear error if not configured

4. app/(main)/admin/settings/admin-settings-form.tsx:
   - Add debug info showing current provider state
   - Display database config value for transparency
   - Help troubleshoot provider selection issues

5. DOCKER-SETUP.md:
   - Complete guide for Docker configuration
   - Instructions for .env.docker setup
   - Examples for Ollama Docker, OpenAI, and external Ollama
   - Troubleshooting common issues
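
The pattern behind changes 1-3 can be sketched roughly like this (an illustrative sketch only; resolveOllamaBaseUrl and the constructor shape are assumptions, not the exact code in lib/ai/factory.ts or lib/ai/providers/ollama.ts):

    // Sketch: no hardcoded fallback in production, explicit baseUrl required.
    function resolveOllamaBaseUrl(): string {
      const baseUrl = process.env.OLLAMA_BASE_URL;
      if (baseUrl) return baseUrl;

      // localhost is acceptable only for local development
      if (process.env.NODE_ENV !== 'production') {
        return 'http://localhost:11434';
      }

      throw new Error(
        'OLLAMA_BASE_URL is not set. Configure it in .env.docker (e.g. "http://ollama:11434").'
      );
    }

    class OllamaProvider {
      constructor(private readonly baseUrl: string) {
        // No default parameter: callers must pass the base URL explicitly.
        if (!baseUrl) {
          throw new Error('OllamaProvider requires an explicit baseUrl.');
        }
      }
    }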

Usage:
On the server, create .env.docker with the proper provider configuration:
- Ollama in Docker: OLLAMA_BASE_URL="http://ollama:11434"
- OpenAI: OPENAI_API_KEY="sk-..."
- External Ollama: OLLAMA_BASE_URL="http://SERVER_IP:11434"

Then, in the admin interface, users can independently configure (as sketched below):
- Tags Provider (for auto-labels and other AI features)
- Embeddings Provider (for semantic search)
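
A rough sketch of that separation (the field names and the createProvider helper are hypothetical, for illustration only; the real keys live in the database config and admin form):

    // Hypothetical shape of the per-feature provider configuration.
    type ProviderKind = 'ollama' | 'openai';

    interface AiProviderConfig {
      tagsProvider: ProviderKind;       // auto-labels and other AI features
      embeddingsProvider: ProviderKind; // semantic search
    }

    // Each feature resolves its own provider from explicit environment config.
    function createProvider(kind: ProviderKind) {
      if (kind === 'openai') {
        const apiKey = process.env.OPENAI_API_KEY;
        if (!apiKey) throw new Error('OPENAI_API_KEY is not set.');
        return { kind, apiKey };
      }
      const baseUrl = process.env.OLLAMA_BASE_URL;
      if (!baseUrl) throw new Error('OLLAMA_BASE_URL is not set.');
      return { kind, baseUrl };
    }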

Result:
✓ Clear errors if Ollama not configured
✓ Can switch to OpenAI freely in admin
✓ No more hardcoded localhost in production
✓ Proper separation between local dev and Docker production

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

Keep Notes - Google Keep Clone

A beautiful and feature-rich Google Keep clone built with modern web technologies.

TypeScript · Tailwind CSS · Prisma

Features

  • 📝 Create & Edit Notes: Quick note creation with expandable input
  • ☑️ Checklist Support: Create todo lists with checkable items
  • 🎨 Color Customization: 10 beautiful color themes for organizing notes
  • 📌 Pin Notes: Keep important notes at the top
  • 📦 Archive: Archive notes you want to keep but don't need to see
  • 🏷️ Labels: Organize notes with custom labels
  • 🔍 Real-time Search: Instantly search through all your notes
  • 🌓 Dark Mode: Beautiful dark theme with system preference detection
  • 📱 Fully Responsive: Works perfectly on desktop, tablet, and mobile
  • ⚡ Server Actions: Lightning-fast CRUD operations with Next.js 16
  • 🎯 Type-Safe: Full TypeScript support throughout

🚀 Tech Stack

Frontend

  • Next.js 16 - React framework with App Router
  • TypeScript - Type safety and better DX
  • Tailwind CSS 4 - Utility-first CSS framework
  • shadcn/ui - Beautiful, accessible UI components
  • Lucide React - Modern icon library

Backend

  • Next.js Server Actions - Server-side mutations (see the sketch after this list)
  • Prisma ORM - Type-safe database client
  • SQLite - Lightweight database (easily switchable to PostgreSQL)
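
As a flavor of how these pieces fit together, a Server Action persisting a note through Prisma might look roughly like this (a hypothetical sketch; the actual actions and schema in this repo may differ):

    // app/actions/notes.ts (hypothetical path) - assumes a Note model with
    // title and content fields; adjust to the real Prisma schema.
    'use server';

    import { PrismaClient } from '@prisma/client';
    import { revalidatePath } from 'next/cache';

    const prisma = new PrismaClient();

    export async function createNote(title: string, content: string) {
      const note = await prisma.note.create({
        data: { title, content },
      });
      revalidatePath('/'); // refresh the notes list after the mutation
      return note;
    }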

📦 Installation

Prerequisites

  • Node.js 18+
  • npm or yarn

Steps

  1. Clone the repository

    git clone <repository-url>
    cd keep-notes
    
  2. Install dependencies

    npm install
    
  3. Set up the database

    npx prisma generate
    npx prisma migrate dev
    
  4. Start the development server

    npm run dev
    
  5. Open your browser and navigate to http://localhost:3000

Getting Started

First, run the development server:

npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev

Open http://localhost:3000 with your browser to see the result.

You can start editing the page by modifying app/page.tsx. The page auto-updates as you edit the file.
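
For example, a minimal app/page.tsx looks roughly like this (a simplified sketch, not this project's actual page, which renders the notes UI):

    // app/page.tsx - simplified example
    export default function Home() {
      return (
        <main className="p-8">
          <h1 className="text-2xl font-semibold">Keep Notes</h1>
        </main>
      );
    }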

This project uses next/font to automatically optimize and load Geist, a new font family for Vercel.

Learn More

To learn more about Next.js, take a look at the Next.js documentation and the interactive Learn Next.js tutorial.

You can also check out the Next.js GitHub repository - your feedback and contributions are welcome!

Deploy on Vercel

The easiest way to deploy your Next.js app is to use the Vercel Platform from the creators of Next.js.

Check out our Next.js deployment documentation for more details.