sarpel/bookmark-manager
BookmarkBrain

An intelligent, production-ready bookmark management platform with AI-powered categorization, RAG-based conversational search, and advanced organization features.

Features

Core Functionality

  • Smart Bookmark Management: Create, organize, and manage bookmarks with folders and tags
  • AI-Powered Categorization: Automatic categorization using OpenAI, Anthropic, Google Gemini, or local Ollama models
  • RAG-Powered Chat: Ask questions about your bookmarks using semantic search and AI
  • Health Monitoring: Automatic detection of dead/offline links with status tracking
  • Thumbnail Generation: Automatic screenshot capture for visual bookmark browsing

Technical Features

  • Multi-Provider AI Support: OpenAI GPT-4, Anthropic Claude, Google Gemini, or local Ollama
  • Vector Embeddings: Semantic search using pgvector for PostgreSQL
  • Secure Authentication: JWT with refresh token rotation, 2FA via TOTP
  • Role-Based Access Control: Admin, Manager, and User roles
  • RESTful API: Comprehensive API with OpenAPI/Swagger documentation
  • Docker-Ready: Complete containerization with docker-compose

Tech Stack

Backend

  • Node.js 20+ with TypeScript
  • Express.js for REST API
  • Prisma ORM with PostgreSQL
  • pgvector for vector embeddings
  • Bull for background job processing
  • Redis for caching and queues

Frontend

  • React 18 with TypeScript
  • Vite for build tooling
  • Tailwind CSS for styling
  • shadcn/ui component library
  • React Query for server state
  • Zustand for client state

Infrastructure

  • Docker & Docker Compose
  • Nginx reverse proxy
  • PostgreSQL 16 with pgvector
  • Redis 7

Quick Start

Prerequisites

  • Node.js 20 or higher
  • Docker and Docker Compose
  • Git
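A quick way to confirm the prerequisites are installed before starting:

```shell
# Print the installed version of each prerequisite, or flag it as missing
for tool in node docker git; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $("$tool" --version 2>/dev/null | head -n1)"
  else
    echo "$tool: not found"
  fi
done
```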

Development Setup

  1. Clone the repository

    git clone https://github.com/yourusername/bookmarkbrain.git
    cd bookmarkbrain
  2. Copy environment file

    cp .env.example .env
  3. Configure environment variables. Edit .env and set:

    • JWT_SECRET and JWT_REFRESH_SECRET (generate secure random strings)
    • At least one AI provider API key (OPENAI_API_KEY, ANTHROPIC_API_KEY, or GEMINI_API_KEY)
  4. Start with Docker Compose (recommended)

    docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d
  5. Run database migrations

    docker-compose exec backend npx prisma migrate deploy
  6. Access the application. Open it in your browser at the port published by the Nginx proxy (see docker-compose.yml).
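For step 3, the secrets can be generated with OpenSSL; a minimal sketch that prints lines ready to paste into .env:

```shell
# Generate strong random secrets for JWT signing (one per variable)
echo "JWT_SECRET=$(openssl rand -base64 48)"
echo "JWT_REFRESH_SECRET=$(openssl rand -base64 48)"
```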

Manual Development Setup

  1. Install dependencies

    npm install
  2. Start PostgreSQL and Redis

    docker-compose up -d postgres redis
  3. Run migrations

    cd backend
    npx prisma migrate dev
  4. Start development servers

    npm run dev

Project Structure

bookmarkbrain/
├── backend/                 # Express.js API server
│   ├── prisma/             # Database schema and migrations
│   ├── src/
│   │   ├── config/         # Configuration files
│   │   ├── controllers/    # Request handlers
│   │   ├── middleware/     # Express middleware
│   │   ├── routes/         # API routes
│   │   ├── services/       # Business logic
│   │   │   ├── ai/         # AI provider integrations
│   │   │   └── rag/        # RAG system (embeddings, chat)
│   │   └── index.ts        # Entry point
│   └── Dockerfile
├── frontend/               # React SPA
│   ├── src/
│   │   ├── components/     # UI components
│   │   ├── lib/            # Utilities and API client
│   │   ├── pages/          # Page components
│   │   └── stores/         # State management
│   └── Dockerfile
├── docker/                 # Docker configuration
│   ├── nginx/             # Nginx reverse proxy
│   ├── postgres/          # Database initialization
│   └── puppeteer/         # Screenshot service
├── docker-compose.yml      # Production compose file
├── docker-compose.dev.yml  # Development overrides
└── package.json            # Root monorepo config

API Documentation

The API is fully documented using OpenAPI/Swagger; the interactive documentation is served by the running backend.

Key Endpoints

| Endpoint | Method | Description |
| --- | --- | --- |
| /api/auth/register | POST | Create new account |
| /api/auth/login | POST | Authenticate user |
| /api/bookmarks | GET | List bookmarks with filtering |
| /api/bookmarks | POST | Create bookmark |
| /api/folders/tree | GET | Get folder hierarchy |
| /api/tags | GET | List all tags |
| /api/chat | POST | Send message to RAG chat |
| /api/health/check-all | POST | Check all bookmark health |
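As a sketch of typical usage with curl (the request fields, response shape, and port are assumptions for illustration — consult the Swagger docs for the exact schema; jq is used to extract the token):

```shell
# Authenticate, then create a bookmark with the returned access token.
# Field names (email, password, accessToken, url, title) are illustrative assumptions.
TOKEN=$(curl -s -X POST http://localhost:3000/api/auth/login \
  -H 'Content-Type: application/json' \
  -d '{"email":"me@example.com","password":"secret"}' | jq -r '.accessToken')

curl -s -X POST http://localhost:3000/api/bookmarks \
  -H "Authorization: Bearer $TOKEN" \
  -H 'Content-Type: application/json' \
  -d '{"url":"https://example.com","title":"Example"}'
```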

Configuration

Environment Variables

| Variable | Description | Default |
| --- | --- | --- |
| DATABASE_URL | PostgreSQL connection string | Required |
| REDIS_URL | Redis connection string | redis://localhost:6379 |
| JWT_SECRET | JWT signing secret | Required |
| JWT_REFRESH_SECRET | Refresh token secret | Required |
| OPENAI_API_KEY | OpenAI API key | Optional |
| ANTHROPIC_API_KEY | Anthropic API key | Optional |
| GEMINI_API_KEY | Google Gemini API key | Optional |
| OLLAMA_URL | Ollama server URL | http://localhost:11434 |
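A minimal .env for local development might look like this (all values are placeholders — the database credentials and name are assumptions; set at least one AI provider key):

```env
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/bookmarkbrain
REDIS_URL=redis://localhost:6379
JWT_SECRET=replace-with-a-long-random-string
JWT_REFRESH_SECRET=replace-with-a-different-long-random-string
OPENAI_API_KEY=sk-your-key-here
```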

AI Provider Configuration

BookmarkBrain supports multiple AI providers for categorization and chat:

  1. OpenAI (Default): Best quality, requires API key
  2. Anthropic: Claude models, requires API key
  3. Google Gemini: Gemini Pro, requires API key
  4. Ollama: Local models, no API key needed

Set your preferred provider in Settings > AI & Categorization.

Recommended Local Models (2025)

Based on current benchmarks and community consensus:

LLM Models (for chat & categorization):

| Model | VRAM | Best For |
| --- | --- | --- |
| qwen3:14b | 12GB | Best balance, recommended default |
| qwen3:8b | 8GB | Laptop-friendly, good all-rounder |
| qwen3:30b-a3b | 20GB | MoE model, excellent quality |
| qwen3:4b | 4GB | Edge devices, low resources |
| llama3.3:70b | 40GB+ | Maximum quality |

Embedding Models (for RAG/semantic search):

| Model | Dimensions | Best For |
| --- | --- | --- |
| mxbai-embed-large | 1024 | Highest accuracy, recommended |
| nomic-embed-text | 768 | Long documents, good speed |
| bge-m3 | 1024 | Multilingual, long context |
| snowflake-arctic-embed | 1024 | Code & technical docs |

Why Qwen3? It outperforms Gemma 3 in reasoning and coding benchmarks, and offers a 128K context window, an Apache 2.0 license, excellent multilingual support, and dual-mode reasoning (thinking/non-thinking).

Deployment

Docker Production Deployment

  1. Configure production environment

    cp .env.example .env
    # Edit .env with production values
  2. Generate SSL certificates (using Let's Encrypt)

    certbot certonly --webroot -w /var/www/certbot -d your-domain.com
  3. Enable HTTPS in Nginx config. Edit docker/nginx/conf.d/default.conf and uncomment the HTTPS server block.

  4. Deploy

    docker-compose up -d
    docker-compose exec backend npx prisma migrate deploy

Using Local Ollama

To use local AI models with Ollama:

  1. Start Ollama service

    docker-compose --profile ollama up -d
  2. Pull recommended models

    # LLM model (pick one based on your hardware)
    docker-compose exec ollama ollama pull qwen3:14b
    # or for lighter hardware:
    docker-compose exec ollama ollama pull qwen3:8b
    
    # Embedding model
    docker-compose exec ollama ollama pull mxbai-embed-large
  3. Configure in Settings. Select "Ollama" as your AI provider and choose the model.
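Before selecting a model in Settings, you can verify that the pulled models are available by querying Ollama's HTTP API directly (assuming the default port 11434):

```shell
# List models currently pulled into Ollama
curl -s http://localhost:11434/api/tags

# Request a test embedding from the embedding model
curl -s http://localhost:11434/api/embeddings \
  -d '{"model":"mxbai-embed-large","prompt":"hello world"}'
```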

Development

Running Tests

# Backend tests
cd backend
npm test

# Frontend tests
cd frontend
npm test

Code Quality

# Lint
npm run lint

# Type check
npm run typecheck

# Format
npm run format

Database Migrations

# Create migration
cd backend
npx prisma migrate dev --name your_migration_name

# Apply migrations
npx prisma migrate deploy

# Reset database (development only)
npx prisma migrate reset

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • shadcn/ui for the beautiful component library
  • Prisma for the excellent ORM
  • pgvector for vector similarity search
  • OpenAI, Anthropic, and Google for their AI APIs
