FirstbookLM is an AI-powered notebook application that allows you to create, manage, and interact with your documents through intelligent chat interfaces. Built with Next.js, it features file uploads, vector search, and support for multiple AI providers.
- 🤖 Multi-AI Provider Support - OpenAI, Anthropic, Google Gemini
- 📁 File Upload & Processing - PDF, Word, text files with automatic content extraction
- 🔍 Vector Search - Semantic search through your documents using embeddings
- 💬 Intelligent Chat - Context-aware conversations with your documents
- 🔐 Authentication - Google OAuth and email/password authentication
- ☁️ File Storage - Cloudflare R2 integration for file storage
- 🎨 Modern UI - Beautiful, responsive interface built with Tailwind CSS and shadcn/ui components
- Framework: Next.js 15 with App Router
- Database: PostgreSQL with Drizzle ORM
- Authentication: Better Auth
- AI: Vercel AI SDK with multiple providers
- Storage: Cloudflare R2 (S3-compatible)
- Styling: Tailwind CSS + shadcn/ui
- Package Manager: Bun
- Vector Search: pgvector
Before you begin, ensure you have the following installed:
- Node.js (v20 or higher - required for React 19)
- Bun (recommended package manager)
- PostgreSQL (v14 or higher with pgvector extension)
- Git
1. Clone the repository

   ```bash
   git clone https://github.com/raodevendrasingh/firstbook.git
   cd firstbook
   ```

2. Install dependencies

   ```bash
   bun install
   ```

3. Set up environment variables

   ```bash
   cp .env.example .env.local
   # Edit .env.local with your configuration
   ```

4. Set up the database

   ```bash
   # Enable pgvector extension in your PostgreSQL database
   bun run db:push
   ```

5. Start the development server

   ```bash
   bun run dev
   ```

6. Open your browser and navigate to http://localhost:3000
Create a `.env.local` file in the root directory with the following variables:

```env
# Database
DATABASE_URL="postgresql://username:password@localhost:5432/firstbook"

# Application URL
NEXT_PUBLIC_APP_URL="http://localhost:3000"

# Google OAuth (for authentication)
GOOGLE_CLIENT_ID="your-google-client-id"
GOOGLE_CLIENT_SECRET="your-google-client-secret"

# AI Providers (at least one required for AI features)
OPENAI_API_KEY="your-openai-api-key"
ANTHROPIC_API_KEY="your-anthropic-api-key"
GOOGLE_GENERATIVE_AI_API_KEY="your-google-ai-api-key"

# Search Provider
EXASEARCH_API_KEY="your-exa-search-api-key"

# File Storage (Cloudflare R2)
R2_S3_API_ENDPOINT="https://your-account-id.r2.cloudflarestorage.com"
R2_ACCESS_KEY_ID="your-r2-access-key"
R2_SECRET_ACCESS_KEY="your-r2-secret-key"
R2_PUBLIC_ACCESS_URL="https://your-custom-domain.com"
R2_PUBLIC_BUCKET="firstbook"

# Environment
NODE_ENV="development"
```
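Since several of these variables are required at startup, a quick preflight check can save debugging time later. The sketch below is illustrative and not part of the repository; the `checkEnv` helper and its rules simply mirror the variable list above, including the at-least-one-AI-provider requirement:

```typescript
// Illustrative startup check: the variable names mirror .env.local above,
// but this helper is a sketch, not code from the repository.
const REQUIRED = ["DATABASE_URL", "NEXT_PUBLIC_APP_URL"] as const;
const AI_KEYS = [
  "OPENAI_API_KEY",
  "ANTHROPIC_API_KEY",
  "GOOGLE_GENERATIVE_AI_API_KEY",
] as const;

function checkEnv(env: Record<string, string | undefined>): string[] {
  const problems: string[] = [];
  for (const key of REQUIRED) {
    if (!env[key]) problems.push(`Missing required variable: ${key}`);
  }
  // At least one AI provider key is required for AI features.
  if (!AI_KEYS.some((key) => env[key])) {
    problems.push(`Set at least one of: ${AI_KEYS.join(", ")}`);
  }
  return problems;
}
```

A helper like this could be called with `process.env` early in server startup and the problems logged before anything else runs.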
1. Install PostgreSQL:

   ```bash
   # Ubuntu/Debian
   sudo apt install postgresql postgresql-contrib

   # macOS (with Homebrew)
   brew install postgresql

   # Windows
   # Download from https://www.postgresql.org/download/windows/
   ```

2. Install the pgvector extension:

   ```bash
   # Ubuntu/Debian (PostgreSQL 14+)
   sudo apt install postgresql-14-pgvector
   # or for PostgreSQL 15+
   sudo apt install postgresql-15-pgvector

   # macOS
   brew install pgvector
   ```

3. Create the database and enable the extension:

   ```sql
   CREATE DATABASE firstbook;
   \c firstbook
   CREATE EXTENSION vector;
   ```
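To see what pgvector does at query time, here is a hedged sketch of an embeddings table and a similarity query. The table name, column names, and the tiny 3-dimensional vector are illustrative only; the app's real schema and embedding dimension will differ (real models typically produce vectors of 768 or 1536 dimensions):

```sql
-- Illustrative only: not the app's actual schema.
CREATE TABLE document_chunks (
    id        serial PRIMARY KEY,
    content   text NOT NULL,
    embedding vector(3)  -- dimension must match the embedding model
);

-- Nearest neighbours by cosine distance
-- (<=> is pgvector's cosine-distance operator):
SELECT content
FROM document_chunks
ORDER BY embedding <=> '[0.1, 0.2, 0.3]'::vector
LIMIT 5;
```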
Neon (Recommended)

- Sign up at neon.tech
- Create a new project
- Copy the connection string to your `DATABASE_URL`
- Enable pgvector in the SQL editor:

  ```sql
  CREATE EXTENSION vector;
  ```
Supabase

- Sign up at supabase.com
- Create a new project
- Go to Settings > Database
- Copy the connection string to your `DATABASE_URL`
- Enable pgvector in the SQL editor:

  ```sql
  CREATE EXTENSION vector;
  ```
- Go to Google Cloud Console
- Create a new project or select existing one
- Enable the Google Identity API (the legacy Google+ API is deprecated)
- Go to "Credentials" → "Create Credentials" → "OAuth 2.0 Client IDs"
- Set application type to "Web application"
- Add authorized redirect URIs:
  - `http://localhost:3000/api/auth/callback/google` (development)
  - `https://yourdomain.com/api/auth/callback/google` (production)
- Copy the Client ID and Client Secret to your `.env.local`
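Once the credentials are in `.env.local`, wiring them into Better Auth looks roughly like the following. This is a sketch based on Better Auth's documented `socialProviders` option; the file location is illustrative and the option names should be checked against the Better Auth version in use:

```typescript
// lib/auth.ts (illustrative location)
import { betterAuth } from "better-auth";

export const auth = betterAuth({
  socialProviders: {
    google: {
      clientId: process.env.GOOGLE_CLIENT_ID!,
      clientSecret: process.env.GOOGLE_CLIENT_SECRET!,
    },
  },
});
```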
- Sign up at platform.openai.com
- Go to API Keys section
- Create a new API key
- Add to `OPENAI_API_KEY` in `.env.local`
- Sign up at console.anthropic.com
- Go to API Keys section
- Create a new API key
- Add to `ANTHROPIC_API_KEY` in `.env.local`
- Go to Google AI Studio
- Create an API key
- Add to `GOOGLE_GENERATIVE_AI_API_KEY` in `.env.local`
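With at least one key set, the Vercel AI SDK picks up the key from the environment when you use the matching provider package. A minimal sketch of what a call looks like (the model name is an example; check the provider's current model list, and swap in `@ai-sdk/anthropic` or `@ai-sdk/google` for the other providers):

```typescript
// Illustrative Vercel AI SDK usage; requires OPENAI_API_KEY in the environment.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const { text } = await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Summarize this document in one sentence.",
});
console.log(text);
```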
- Sign up at Cloudflare
- Go to R2 Object Storage
- Create a new bucket
- Go to "Manage R2 API tokens"
- Create a new API token with R2 permissions
- Set up a custom domain (optional but recommended)
- Add credentials to your `.env.local`:

  ```env
  R2_S3_API_ENDPOINT="https://your-account-id.r2.cloudflarestorage.com"
  R2_ACCESS_KEY_ID="your-access-key"
  R2_SECRET_ACCESS_KEY="your-secret-key"
  R2_PUBLIC_ACCESS_URL="https://your-custom-domain.com"
  R2_PUBLIC_BUCKET="your-bucket-name"
  ```
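Because R2 is S3-compatible, uploads can go through the standard AWS SDK pointed at the R2 endpoint. A hedged sketch, not the app's actual upload code; the object key and file contents here are placeholders:

```typescript
// Illustrative R2 upload via the S3-compatible API.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const r2 = new S3Client({
  region: "auto", // R2 uses "auto" as the region
  endpoint: process.env.R2_S3_API_ENDPOINT,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

const fileBuffer = Buffer.from("example contents"); // placeholder payload

await r2.send(
  new PutObjectCommand({
    Bucket: process.env.R2_PUBLIC_BUCKET,
    Key: "uploads/example.pdf", // illustrative key
    Body: fileBuffer,
    ContentType: "application/pdf",
  }),
);
```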
- Sign up at exa.ai
- Get your API key
- Add to `EXASEARCH_API_KEY` in `.env.local`
After setting up your database, run the migrations:
```bash
# Generate migration files (if needed)
bun run db:generate

# Apply migrations to database
bun run db:migrate

# Or push schema directly (for development)
bun run db:push
```

The other available scripts:

```bash
# Start development server
bun run dev

# Build for production
bun run build

# Start production server
bun run start

# Format code
bun run format

# Fix linting issues
bun run fix

# Clean build artifacts
bun run clean

# Database operations
bun run db:generate  # Generate migrations
bun run db:migrate   # Run migrations
bun run db:push      # Push schema to database
bun run db:studio    # Open Drizzle Studio
```

- Connect your repository to Vercel
- Set environment variables in Vercel dashboard
- Deploy - Vercel will automatically build and deploy
Make sure to set all required environment variables in your production environment:
- `DATABASE_URL` - Your production PostgreSQL connection string
- `NEXT_PUBLIC_APP_URL` - Your production domain
- `GOOGLE_CLIENT_ID` and `GOOGLE_CLIENT_SECRET` - Production OAuth credentials
- AI provider API keys
- R2 storage credentials (if using file uploads)
1. Database connection errors
   - Verify `DATABASE_URL` is correct
   - Ensure PostgreSQL is running
   - Check that the pgvector extension is installed
2. Authentication not working
   - Verify Google OAuth credentials
   - Check that redirect URIs match your domain
   - Ensure `NEXT_PUBLIC_APP_URL` is set correctly
3. File upload issues
   - Verify R2 credentials are correct
   - Check bucket permissions
   - Ensure the custom domain is properly configured
4. AI features not working
   - Verify at least one AI provider API key is set
   - Check API key permissions and quotas
   - Ensure you have access to the configured models
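For the database items above, two checks can be run directly from a shell, assuming `psql` is installed and `DATABASE_URL` is exported; the queries are standard PostgreSQL:

```bash
# Can we reach the database at all?
psql "$DATABASE_URL" -c "SELECT version();"

# Is the pgvector extension installed and enabled in this database?
psql "$DATABASE_URL" -c "SELECT extname FROM pg_extension WHERE extname = 'vector';"
```

If the second query returns no rows, run `CREATE EXTENSION vector;` in that database.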
- Check the Issues page
- Review the Next.js documentation
- Check Better Auth documentation
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
This project is licensed under the GNU General Public License v3.0 - see the LICENSE file for details.