# Bookworm

Mood-based book recommendations: find your next read by how you feel.

Pitch: Instead of choosing by genre, pick a mood. We blend lightweight search (Open Library / Google Books) with a GPT-powered planner + reranker to surface books that feel right (e.g., Cozy, Adventurous, Melancholic). Each result includes a summary and one-click actions to Read free (Project Gutenberg), Borrow (Open Library), Buy, or Find locally (WorldCat).

## Contents
- Demo
- Why It Matters
- Features
- Tech Stack
- Quick Start
- API
- How Recommendations Work
- UI/UX Notes
- Firebase (Optional)
- Deployment
- Security & Privacy
- Troubleshooting
- Roadmap
- Contributing
- Credits
- License
## Demo

Replace with your own assets after recording.

- Screenshot: `docs/screenshot-ui.png`
- Screen recording: `docs/demo.mp4`
- Live (optional): Frontend (Vercel/Netlify) + Backend (Render/Fly/Heroku)
## Why It Matters

- Application / Impact: Tackles “what should I read next?” paralysis with a mood-first approach; nudges free access (Gutenberg), borrowing (Open Library), and local discovery (WorldCat).
- Functionality / Quality: Polished UI (dark mode, a11y), resilient backend with fallbacks (if GPT or a book API hiccups, results still appear).
- Creativity: Replaces blunt genre filters with mood + intention; blends planner + reranker with public datasets for serendipitous discovery.
- Technical Complexity: OpenAI Responses API (structured JSON), multi-API aggregation, ISBN deduping, metadata enrichment, optional personalization via Firebase.
- Presentation: Clear, end-to-end demo with future work and ablations listed.
## Features

- Mood grid (10–12 moods): e.g., Cozy, Adventurous, Melancholic, Whimsical, Thought-provoking, Uplifting, Dark, Romantic, Nostalgic, Fast-paced, Found-Family, Speculative.
- Elegant, library-ambience UI: serif headings, subtle textures, focus states; dark mode (persists in `localStorage`).
- Accessible: keyboard navigation, ARIA roles, focus traps in modals.
- Actionable results: Read (Gutenberg), Borrow (Open Library), Buy (store), Local (WorldCat).
- Robust fallbacks: deterministic keyword search + naive ranking if any upstream fails.
- (Optional) ratings & lightweight adaptive profile via Firebase (anonymous auth + Firestore).
## Tech Stack

- Frontend: React + Vite, Tailwind CSS v4 (`@tailwindcss/postcss`), small custom CSS
- Backend: Node.js + Express, `node-fetch`, CORS, `dotenv`
- AI: OpenAI Responses API (JSON output via `text.format: "json_schema"`)
- Data: Open Library Search, Google Books Volumes, Gutendex (Project Gutenberg), WorldCat links
- Optional: Firebase Auth (anonymous), Firestore (ratings)
- Dev: npm / Node ≥ 18, Git, (optional) Git LFS for media
## Quick Start

Prerequisites:

- Node.js ≥ 18 and npm ≥ 9
- Git (`git --version`)
- OpenAI API key
- (Optional) Git LFS (`git lfs install`)
```bash
# 1) Clone
git clone https://github.com/<you>/bookworm.git
cd bookworm

# 2) Backend (server)
cd server
cp .env.example .env   # or create it; see "Environment Variables"
npm i
npm run dev            # -> http://127.0.0.1:3001/api/health returns {"ok":true}

# 3) Frontend (client) in a new terminal
cd ../client
cp .env.example .env   # set VITE_API_BASE, see below
npm i
npm run dev            # open printed URL (e.g., http://127.0.0.1:5173)
```

Create `.env` files as shown below.
### Environment Variables

`/server/.env`:

```bash
# Required
OPENAI_API_KEY=your_openai_key

# Optional / sensible defaults
PORT=3001
OPENAI_MODEL=gpt-4o-mini   # or another JSON-capable model
OPEN_LIBRARY_BASE=https://openlibrary.org
GOOGLE_BOOKS_BASE=https://www.googleapis.com/books/v1
GOOGLE_BOOKS_API_KEY=      # optional; unauthenticated works with limits
GUTENDEX_BASE=https://gutendex.com
WORLD_CAT_SEARCH_BASE=https://worldcat.org
ALLOWED_ORIGIN=http://127.0.0.1:5173
```

`/client/.env`:

```bash
VITE_API_BASE=http://127.0.0.1:3001

# Optional Firebase (enable ratings)
VITE_FIREBASE_API_KEY=
VITE_FIREBASE_AUTH_DOMAIN=
VITE_FIREBASE_PROJECT_ID=
VITE_FIREBASE_APP_ID=
```

## API

### `GET /api/health`

Health check.
Response: 200 OK → {"ok": true}
### `POST /api/recommend`

Generates recommendations for the selected moods.
Request (JSON):

```json
{
  "moods": ["Cozy", "Adventurous"],
  "limit": 12,
  "profile": {
    "ageRange": "teen|adult",
    "contentNotes": ["no-graphic-violence"],
    "recentLikes": ["The Hobbit", "Anne of Green Gables"]
  }
}
```

Response (JSON):
```json
{
  "items": [
    {
      "title": "The Wind in the Willows",
      "authors": ["Kenneth Grahame"],
      "isbn13": "9780143039099",
      "cover": "https://covers.openlibrary.org/b/id/xxxxx-L.jpg",
      "score": 0.87,
      "moodTags": ["Cozy", "Whimsical"],
      "reasons": "Quiet riverside pacing, warm companionship, gentle stakes.",
      "summary": "A classic tale of friendship along the river bank...",
      "links": {
        "read": "https://www.gutenberg.org/ebooks/xxxxx",
        "borrow": "https://openlibrary.org/works/OLxxxxxxW",
        "buy": "https://books.google.com/books?id=...&buy=y",
        "local": "https://worldcat.org/search?q=The+Wind+in+the+Willows"
      }
    }
  ]
}
```

Error codes:
- 400: invalid payload (e.g., empty `moods`)
- 429: rate limited (provider or app)
- 502/503: upstream API error (fallback applied when possible)
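The 400 case above can be sketched as a small request validator. This is a hypothetical helper (`validateRecommendRequest` is not part of the server's actual code), assuming a default `limit` of 12 and an upper bound of 50:

```javascript
// Hypothetical validator for the documented 400 "invalid payload" cases.
// Returns null when the body is valid, or an error message to send back
// with a 400 status.
function validateRecommendRequest(body) {
  if (!body || typeof body !== "object") return "body must be a JSON object";
  if (!Array.isArray(body.moods) || body.moods.length === 0) {
    return "moods must be a non-empty array";
  }
  if (body.moods.some((m) => typeof m !== "string" || !m.trim())) {
    return "each mood must be a non-empty string";
  }
  const limit = body.limit ?? 12; // default from the request example above
  if (!Number.isInteger(limit) || limit < 1 || limit > 50) {
    return "limit must be an integer between 1 and 50"; // assumed bound
  }
  return null; // valid
}
```

In an Express handler this would run before the planner step, short-circuiting with `res.status(400)` when a message is returned.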
## How Recommendations Work

1. User input → Frontend posts `moods` (and optional `profile`) to `/api/recommend`.
2. Query Planner (GPT) → Transforms moods into 2–4 compact search strings, constrained by a JSON schema (`text.format: "json_schema"`).
3. Fetch & Aggregate → Search Open Library + Google Books; normalize records and dedupe by ISBN.
4. Reranker (GPT) → Scores candidates for mood fit; returns the top N with reasons and tags.
5. Enrichment → Adds cover art (Open Library), Read (Gutendex / Gutenberg), Borrow (Open Library), Buy (Google Books buyLink), Local (WorldCat).
6. Resiliency → If GPT or any data API fails, we fall back to deterministic keyword ranking so the UI never empties.
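The dedupe step can be sketched as a pure function. Field names here are illustrative, not the app's actual schema; records without an ISBN fall back to a title-plus-author key:

```javascript
// Sketch of the aggregate step: merge Open Library and Google Books hits
// and dedupe by ISBN-13, preferring whichever record carries a cover.
function dedupeByIsbn(records) {
  const seen = new Map();
  for (const r of records) {
    // Key by ISBN-13 when present, else a normalized title|authors key.
    const key =
      r.isbn13 ||
      `${(r.title || "").toLowerCase()}|${(r.authors || []).join(",").toLowerCase()}`;
    const prev = seen.get(key);
    // Merge duplicates so richer metadata (e.g., a cover URL) survives.
    if (!prev || (!prev.cover && r.cover)) seen.set(key, { ...prev, ...r });
  }
  return [...seen.values()];
}
```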
## UI/UX Notes

- Library ambience: subtle paper texture, soft vignette, gentle hover states; serif headings + legible body font.
- Dark mode: toggle persists via `localStorage`; `prefers-color-scheme` is honored on first load.
- Accessibility: focus rings, ESC to close modals, `aria-labelledby`/`aria-describedby`, logical tab order.
- Performance: debounced searches, lazy-loaded images, prefetch covers in viewport.
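The dark-mode bootstrap can be sketched as a pure decision function (a stored choice wins; otherwise `prefers-color-scheme` decides). This is an assumed shape, not the app's actual code:

```javascript
// A stored theme in localStorage takes precedence; otherwise fall back
// to the OS preference. Pure so the decision is easy to test.
function resolveTheme(storedTheme, prefersDark) {
  if (storedTheme === "dark" || storedTheme === "light") return storedTheme;
  return prefersDark ? "dark" : "light";
}

// Browser wiring (illustrative):
// const theme = resolveTheme(
//   localStorage.getItem("theme"),
//   window.matchMedia("(prefers-color-scheme: dark)").matches
// );
// document.documentElement.classList.toggle("dark", theme === "dark");
```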
## Firebase (Optional)

- Enable anonymous auth and create a Firestore collection `ratings`.
- Client writes `{ isbn13, rating, moods[], timestamp }`.
- Backend can read aggregates to lightly adjust the planner/reranker prompts (e.g., “boost books with high cozy-rating for user X”).
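A minimal sketch of the rating document above. `buildRatingDoc` is a hypothetical helper; the Firestore write is shown commented out because it needs the `firebase` SDK and an initialized app:

```javascript
// Builds the { isbn13, rating, moods[], timestamp } document described
// above, with light validation. Hypothetical helper, not the app's code.
function buildRatingDoc(isbn13, rating, moods, now = Date.now()) {
  if (!/^\d{13}$/.test(isbn13)) throw new Error("isbn13 must be 13 digits");
  if (!Number.isInteger(rating) || rating < 1 || rating > 5) {
    throw new Error("rating must be an integer 1-5"); // assumed scale
  }
  return { isbn13, rating, moods: [...moods], timestamp: now };
}

// With the modular Firebase SDK (v9+), the client write would look
// roughly like:
// import { getFirestore, collection, addDoc } from "firebase/firestore";
// await addDoc(collection(getFirestore(), "ratings"),
//   buildRatingDoc("9780547928227", 5, ["Cozy"]));
```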
## Deployment

Frontend:

- Set `VITE_API_BASE` to your backend URL.
- Build: `npm run build` (Vite) → deploy `dist/`.

Backend:

- Set env vars (`OPENAI_API_KEY`, etc.).
- Expose `PORT` (defaults to `3001`).
- Add a health check at `/api/health`.
- Consider a small in-memory cache (or KV) to reduce provider calls.
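The cache suggestion above can be as small as a TTL-stamped `Map`. This is an illustrative single-process sketch with no size cap; use an LRU or external KV for real deployments:

```javascript
// Minimal in-memory TTL cache to deduplicate repeated provider calls.
class TtlCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.store = new Map();
  }
  get(key, now = Date.now()) {
    const hit = this.store.get(key);
    if (!hit) return undefined;
    if (now - hit.at > this.ttlMs) {
      this.store.delete(key); // expired entry
      return undefined;
    }
    return hit.value;
  }
  set(key, value, now = Date.now()) {
    this.store.set(key, { value, at: now });
  }
}
```

Keyed by, say, the normalized search string, this can sit in front of the Open Library / Google Books fetches.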
## Security & Privacy

- No PII required; moods are non-identifying.
- Do not persist prompts/responses unless explicitly enabling analytics.
- Respect robots / rate limits for public APIs.
- Sanitize outbound links; open in a new tab with `rel="noopener noreferrer"`.
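The link-sanitizing rule above can be sketched as a hypothetical helper that only allows http(s) URLs and emits the `target`/`rel` attributes:

```javascript
// Returns safe anchor attributes for an outbound link, or null when the
// URL is invalid or uses a non-http(s) scheme (javascript:, data:, ...).
function safeLinkAttrs(href) {
  let url;
  try {
    url = new URL(href);
  } catch {
    return null; // not a valid absolute URL
  }
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    return null; // block non-web schemes
  }
  return { href: url.toString(), target: "_blank", rel: "noopener noreferrer" };
}
```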
## Troubleshooting

- CORS error: ensure `ALLOWED_ORIGIN` in `server/.env` matches the dev server URL.
- Empty results: check `OPENAI_API_KEY` and verify upstream APIs are reachable; the fallback should still show deterministic results.
- Covers missing: Open Library covers may be absent for some editions; use a placeholder.
- 429 / rate limit: reduce `limit`, add backoff, consider caching.
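The backoff suggestion above can be sketched as a retry wrapper; `fetchFn` is injected so it works with `node-fetch` or the global `fetch`, and delays double per attempt. Hypothetical helper, not the app's actual code:

```javascript
// Retries a request on HTTP 429 with exponential backoff.
async function fetchWithBackoff(url, fetchFn, retries = 3, baseDelayMs = 250) {
  for (let attempt = 0; ; attempt++) {
    const res = await fetchFn(url);
    if (res.status !== 429 || attempt >= retries) return res;
    const delay = baseDelayMs * 2 ** attempt; // 250, 500, 1000, ...
    await new Promise((resolve) => setTimeout(resolve, delay));
  }
}
```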
## Roadmap

- Personal shelves import (Goodreads/CSV) → seed profile
- Multi-mood blends with weights (e.g., 70% Cozy, 30% Adventurous)
- Offline mode with last results cached
- Fine-grained content notes/filters
- Internationalization (i18n)
- Mobile gestures & haptics
- “Why this book?” richer attributions (highlights, pull-quotes)
## Contributing

- Fork the repo and create a feature branch: `git checkout -b feat/<short-name>`
- Run both apps locally; add tests where applicable.
- Open a PR with a clear description, screenshots of UI changes, and test notes.
## Credits

- Open Library (covers, search)
- Google Books (volumes, buy links)
- Project Gutenberg / Gutendex (free reads)
- WorldCat (local library discovery)
- OpenAI Responses API (planner + reranker)
## License

MIT ©