A mini AI support agent for a live chat widget, built as a take-home assignment for Spur.
| Service | URL |
|---|---|
| Frontend | https://spursoftwareapp.vercel.app |
| Backend API | https://spur-software-api.onrender.com |
Note: The backend runs on Render's free tier and sleeps after 15 minutes of inactivity, so the first request may take ~30 seconds while the service wakes up.
- AI-Powered Support Chat: Integrated with Groq (Llama 3.3 70B) for intelligent, fast responses
- Conversation Persistence: Messages stored in SQLite and restored on page reload
- Modern UI: Beautiful dark theme with smooth animations and typing indicators
- Session Management: Automatic session tracking via localStorage (see the sketch after this list)
- FAQ Knowledge: Pre-loaded with TechGadgets Pro store information
- Robust Error Handling: Graceful error messages for all failure scenarios
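As a rough illustration of the session-tracking approach, a localStorage-backed Svelte store could look like the sketch below; the file path, storage key, and store name are assumptions rather than the actual code.

```ts
// e.g. packages/frontend/src/lib/stores/session.ts (illustrative)
import { writable } from 'svelte/store';

const STORAGE_KEY = 'chat_session_id'; // assumed key name

// Restore a previously saved session id so history survives a page reload.
const saved =
  typeof localStorage !== 'undefined' ? localStorage.getItem(STORAGE_KEY) : null;

export const sessionId = writable<string | null>(saved);

// Persist every change so the next visit reuses the same conversation.
sessionId.subscribe((id) => {
  if (typeof localStorage !== 'undefined' && id) {
    localStorage.setItem(STORAGE_KEY, id);
  }
});
```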
Prerequisites:

- Node.js 18+
- npm 9+
- Groq API key (FREE!)
Setup:

```bash
git clone <your-repo-url>
cd aiBot
npm install
```

Get a Groq API key:

- Go to https://console.groq.com/keys
- Sign up / sign in (free, takes 30 seconds)
- Click "Create API Key"
- Copy the key (starts with `gsk_...`)

Configure the environment:

```bash
cp .env.example .env
```

Edit `.env` and add your Groq API key:

```
GROQ_API_KEY=gsk_your-key-here
```

Then start both apps:

```bash
npm run dev
```

This starts:
- Backend: http://localhost:3001
- Frontend: http://localhost:5173
Open http://localhost:5173 in your browser to start chatting!
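Once both servers are up, a quick way to confirm the backend responds is a small script like the following (a sketch using Node 18's built-in fetch against the `POST /chat/message` endpoint documented below; this file is not part of the repo):

```ts
// smoke-test.ts (hypothetical; run with e.g. `npx tsx smoke-test.ts`)
const res = await fetch('http://localhost:3001/chat/message', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ message: 'Do you ship to Canada?' }),
});

const data = await res.json();
console.log(data.sessionId, data.reply); // expect an AI answer about shipping
```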
Project structure:

```
├── packages/
│   ├── backend/                      # Express + TypeScript API
│   │   ├── src/
│   │   │   ├── index.ts              # Server entry point
│   │   │   ├── routes/               # API route handlers
│   │   │   │   └── chat.ts           # POST /chat/message, GET /chat/history
│   │   │   ├── services/             # Business logic
│   │   │   │   ├── chat.service.ts   # Message processing
│   │   │   │   └── llm.service.ts    # Groq/LLM integration
│   │   │   ├── repositories/         # Data access layer
│   │   │   │   ├── conversation.repo.ts
│   │   │   │   └── message.repo.ts
│   │   │   ├── db/                   # Database setup & schema
│   │   │   ├── prompts/              # LLM system prompts & FAQ
│   │   │   └── middleware/           # Express middleware
│   │   └── package.json
│   └── frontend/                     # Svelte + Vite UI
│       ├── src/
│       │   ├── App.svelte            # Main app component
│       │   ├── lib/
│       │   │   ├── components/       # ChatWidget, MessageList, etc.
│       │   │   ├── stores/           # Svelte stores (chat state)
│       │   │   └── api/              # API client
│       │   └── app.css               # Global styles
│       └── package.json
├── .env.example                      # Environment template
└── package.json                      # Monorepo root
```
Backend request flow:

```
Routes → Services → Repositories → Database
             ↓
       LLM Service → Groq API
```
| Layer | Responsibility |
|---|---|
| Routes | HTTP handling, request validation (Zod) |
| Services | Business logic, orchestration |
| Repositories | Data access, SQL queries |
| LLM Service | Groq API wrapper, prompt management |
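To make the layering concrete, the service layer plausibly orchestrates the other pieces along these lines (a condensed sketch; the repository and LLM-service function names are hypothetical, not copied from the codebase):

```ts
// services/chat.service.ts, condensed sketch (names are hypothetical)
import { conversationRepo } from '../repositories/conversation.repo';
import { messageRepo } from '../repositories/message.repo';
import { getReply } from './llm.service';

export async function handleMessage(message: string, sessionId?: string) {
  // Repositories own the SQL; the service only coordinates them.
  const conversation = conversationRepo.findOrCreate(sessionId);
  messageRepo.save(conversation.id, 'user', message);

  // Hand recent history to the LLM wrapper, then persist its answer.
  const history = messageRepo.findRecent(conversation.id, 20);
  const reply = await getReply(history);
  messageRepo.save(conversation.id, 'ai', reply);

  return { reply, sessionId: conversation.id };
}
```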
- Monorepo with npm workspaces: Simple setup, easy to run together
- SQLite with sql.js: Zero infrastructure, pure JS (no native compilation)
- Zod validation: Runtime type safety for API requests (sketched below)
- Svelte stores: Reactive state management with localStorage persistence
- Hardcoded FAQ in system prompt: Fast to implement, easy to update
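For instance, the chat request validation probably looks something like the following (a sketch; the field names match the API examples later in this README, but the exact schema is an assumption):

```ts
import { z } from 'zod';

// Assumed shape of the POST /chat/message body.
export const messageSchema = z.object({
  message: z.string().min(1, 'Message cannot be empty'),
  // Kept loose on purpose: an unknown or invalid sessionId simply results
  // in a new conversation (see the error-handling table below).
  sessionId: z.string().optional(),
});

export type MessageRequest = z.infer<typeof messageSchema>;

// In the route handler, messageSchema.parse(req.body) throws on invalid input,
// which the error middleware turns into a 400 response. Over-long messages are
// not rejected here; they are truncated later.
```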
Groq with Llama 3.3 70B Versatile was chosen for:
- 100% FREE - No credit card required
- Fast inference - Groq's custom hardware
- High quality - State-of-the-art open model
The system prompt includes:
- Agent persona: Friendly, professional support agent for TechGadgets Pro
- Store knowledge:
- Shipping policy (free over $50, ships to USA/Canada/UK/EU)
- Return policy (30-day hassle-free, 90-day for defective)
- Support hours (Mon-Fri 9AM-6PM EST)
- Contact info (email, phone)
- Response guidelines: Concise, helpful, honest
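A rough sketch of how such a prompt might be assembled in `prompts/` (the wording, file name, and variable names here are illustrative, not the actual prompt):

```ts
// prompts/system.ts (illustrative; the real wording differs)
const STORE_FAQ = `
Shipping: free on orders over $50; we ship to the USA, Canada, the UK, and the EU.
Returns: 30-day hassle-free returns, 90 days for defective items.
Support hours: Mon-Fri 9AM-6PM EST.
Contact: support email and phone number (placeholders here).
`;

export const SYSTEM_PROMPT = `
You are a friendly, professional support agent for TechGadgets Pro.
Answer using only the store information below. Keep replies concise,
helpful, and honest; if you are unsure, say so rather than guessing.

${STORE_FAQ}
`.trim();
```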
| Setting | Value |
|---|---|
| Model | llama-3.3-70b-versatile |
| Max tokens | 500 |
| Temperature | 0.7 |
| Context | Last 20 messages |
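Putting those settings together, the LLM service call likely resembles the following (a sketch using the `groq-sdk` package; the wrapper function, message type, and prompt import are assumptions):

```ts
import Groq from 'groq-sdk';
import { SYSTEM_PROMPT } from '../prompts/system'; // hypothetical export

const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

type ChatMessage = { sender: 'user' | 'ai'; text: string };

export async function getReply(history: ChatMessage[]): Promise<string> {
  // Only the last 20 messages are sent as context.
  const recent = history.slice(-20);

  const completion = await groq.chat.completions.create({
    model: 'llama-3.3-70b-versatile',
    max_tokens: 500,
    temperature: 0.7,
    messages: [
      { role: 'system' as const, content: SYSTEM_PROMPT },
      ...recent.map((m) => ({
        role: m.sender === 'user' ? ('user' as const) : ('assistant' as const),
        content: m.text,
      })),
    ],
  });

  return completion.choices[0]?.message?.content ?? '';
}
```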
`POST /chat/message`: Send a chat message and receive an AI reply.

Request:

```json
{
  "message": "What's your return policy?",
  "sessionId": "optional-uuid-here"
}
```

Response:

```json
{
  "success": true,
  "reply": "We offer a 30-day hassle-free return policy...",
  "sessionId": "generated-or-same-uuid"
}
```

`GET /chat/history`: Fetch conversation history for a session.
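On the frontend, a thin client over these two endpoints could look roughly like this (a sketch; the `ChatResponse` shape follows the example above, while the base URL handling and the `sessionId` query parameter for history are assumptions):

```ts
// packages/frontend/src/lib/api/client.ts (illustrative)
const BASE_URL = 'http://localhost:3001'; // assumed; likely configurable in the real code

export interface ChatResponse {
  success: boolean;
  reply: string;
  sessionId: string;
}

export async function sendMessage(message: string, sessionId?: string): Promise<ChatResponse> {
  const res = await fetch(`${BASE_URL}/chat/message`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message, sessionId }),
  });
  if (!res.ok) throw new Error(`Chat request failed with status ${res.status}`);
  return res.json();
}

export async function getHistory(sessionId: string) {
  // The query parameter name is an assumption.
  const res = await fetch(`${BASE_URL}/chat/history?sessionId=${encodeURIComponent(sessionId)}`);
  if (!res.ok) throw new Error(`History request failed with status ${res.status}`);
  return res.json();
}
```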
| Scenario | Behavior |
|---|---|
| Empty message | 400 error, "Message cannot be empty" |
| Long message (>2000 chars) | Truncated, still processed |
| Invalid API key | 500 with friendly message |
| Rate limit | 503 with "try again" message |
| Invalid sessionId | Creates new conversation |
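One way to centralize this mapping is an Express error middleware along these lines (a sketch; the status codes follow the table above, while the error-detection details are assumptions):

```ts
import type { NextFunction, Request, Response } from 'express';
import { ZodError } from 'zod';

// Illustrative error middleware mapping failures to the table above.
export function errorHandler(err: unknown, _req: Request, res: Response, _next: NextFunction) {
  // Validation failures (e.g. an empty message) become a 400.
  if (err instanceof ZodError) {
    return res.status(400).json({ success: false, error: 'Message cannot be empty' });
  }

  // Upstream (Groq) errors often carry an HTTP status; this check is an assumption.
  const status = (err as { status?: number }).status;

  if (status === 429) {
    return res
      .status(503)
      .json({ success: false, error: 'We are receiving a lot of requests, please try again in a moment.' });
  }

  // Invalid API key and everything else fall through to a generic 500.
  return res
    .status(500)
    .json({ success: false, error: 'Something went wrong on our end. Please try again.' });
}
```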
Conversations:

| Column | Type | Description |
|---|---|---|
| id | TEXT (UUID) | Primary key |
| created_at | TEXT | ISO timestamp |
| metadata | TEXT | JSON (optional) |
Messages:

| Column | Type | Description |
|---|---|---|
| id | TEXT (UUID) | Primary key |
| conversation_id | TEXT | FK to conversations |
| sender | TEXT | "user" or "ai" |
| text | TEXT | Message content |
| created_at | TEXT | ISO timestamp |
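With sql.js, the schema can be created at startup roughly like this (a sketch; the actual db module may differ, and persisting the database to disk is omitted):

```ts
import initSqlJs from 'sql.js';

export async function createDatabase() {
  const SQL = await initSqlJs();
  const db = new SQL.Database(); // in-memory SQLite database

  // Mirrors the tables documented above.
  db.run(`
    CREATE TABLE IF NOT EXISTS conversations (
      id         TEXT PRIMARY KEY,
      created_at TEXT NOT NULL,
      metadata   TEXT
    )
  `);

  db.run(`
    CREATE TABLE IF NOT EXISTS messages (
      id              TEXT PRIMARY KEY,
      conversation_id TEXT NOT NULL REFERENCES conversations(id),
      sender          TEXT NOT NULL CHECK (sender IN ('user', 'ai')),
      text            TEXT NOT NULL,
      created_at      TEXT NOT NULL
    )
  `);

  return db;
}
```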
| Current | With More Time |
|---|---|
| SQLite (sql.js) | PostgreSQL for production |
| No auth | JWT + user accounts |
| localStorage session | HttpOnly cookies |
| Hardcoded FAQ | RAG with vector database |
| No streaming | SSE for token streaming |
| No rate limiting | Redis-based rate limiting |
| No WebSocket | WebSockets for real-time updates |
- Send "What's your return policy?" β Gets accurate FAQ answer
- Send "Do you ship to USA?" β Gets correct shipping info
- Refresh page β Conversation history restored
- Send empty message β Prevented by frontend
- Send while loading β Button disabled
- Invalid API key β Friendly error shown
License: MIT
Built with ❤️ for the Spur take-home assignment