A clean, focused chat application powered by AI. Ask questions, speak your queries, and get clear answers. Built with Next.js, React, and TypeScript for an optimal experience.
Features:
- 8 Powerful Models via OpenRouter:
  - Google: Gemma 3 4B, 12B, 27B
  - Meta: Llama 3.2 3B, Llama 3.3 70B
  - OpenAI: GPT OSS 20B, GPT OSS 120B (default)
- Provider Tabs: Click provider tabs to browse models by company
- Smooth Model Switching: Change models mid-conversation
- Full Conversation Context: Models get complete chat history for better responses
- Click-to-Record: Simple microphone button for hands-free input
- Real-time Transcription: Powered by Groq Whisper large-v3-turbo
- Error Handling: Clear feedback if transcription fails
- Seamless Integration: Transcribed text appears instantly in chat
- Dark/Light Mode: Toggle themes with circular theme switcher
- Responsive Design: Works perfectly on desktop, tablet, and mobile
- Clean Typography: Easy-to-read responses with markdown support
- Code Highlighting: Beautiful syntax highlighting for code blocks
- Auto-scrolling: Chat scrolls to latest messages automatically
- Node.js v18+ (Get it here)
- npm or yarn package manager
- Git for cloning
- Clone the repository

  ```bash
  git clone https://github.com/cidopenup/app0.git
  cd app0
  ```

- Install dependencies

  ```bash
  npm install
  ```

- Set up environment variables

  Create a `.env.local` file:

  ```bash
  # Windows
  echo. > .env.local

  # macOS/Linux
  touch .env.local
  ```
- Add API Keys

  Open `.env.local` and add:

  ```bash
  # OpenRouter API key (for all chat models)
  # Get it from: https://openrouter.ai/keys
  OPENROUTER_API_KEY=your_openrouter_key_here

  # Groq API key (for speech-to-text)
  # Get it from: https://console.groq.com/keys
  GROQ_API_KEY=your_groq_key_here
  ```
- Start the development server

  ```bash
  npm run dev
  ```

  Open http://localhost:3000 in your browser.
- Go to the chat page at `/chat`
- Click the bot icon to select a model and provider
- Type your question in the input box
- Press `Enter` or click the send button
- Click the microphone for voice input

- Visit `/chat/models` to see all available models
- Each model shows its provider, description, and capabilities
- Click "Use Model" to start a conversation with that specific model
```
app0/
├── app/
│   ├── api/
│   │   ├── chat/                  # Chat API endpoint
│   │   │   ├── route.ts           # Chat completions
│   │   │   └── models/
│   │   │       └── route.ts       # List available models
│   │   └── speech-to-text/        # Whisper transcription endpoint
│   ├── chat/
│   │   ├── page.tsx               # Chat interface
│   │   └── models/
│   │       └── page.tsx           # Models browser page
│   ├── layout.tsx                 # Root layout with navigation
│   ├── page.tsx                   # Landing page
│   └── globals.css                # Global styles
├── components/
│   ├── chat.tsx                   # Main chat component
│   ├── navigation.tsx             # Top navigation bar
│   └── ui/                        # Reusable components (Radix UI based)
│       ├── button.tsx
│       ├── card.tsx
│       ├── select.tsx
│       ├── theme-switch-circular.tsx
│       └── ... (30+ UI components)
├── lib/
│   └── utils.ts                   # Utility functions
└── hooks/
    └── use-mobile.tsx             # Mobile detection hook
```
`/api/chat` — Send a message and get an AI response.

Request:

```json
{
  "messages": [{"role": "user", "content": "What is JavaScript?"}],
  "model": "openai/gpt-oss-120b:free"
}
```

Response:

```json
{
  "response": "JavaScript is a programming language..."
}
```

`/api/chat/models` — Get a list of all available models.

Response:

```json
{
  "models": [
    {
      "id": "google/gemma-3-4b-it:free",
      "name": "Gemma 3 4B",
      "description": "Fast and efficient for everyday questions",
      "provider": "Google"
    }
  ],
  "total": 8
}
```

`/api/speech-to-text` — Convert audio to text.

Request: `FormData` with an audio file

Response: `{ "text": "transcribed text here" }`
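As a rough sketch, the chat endpoint can be called from client code like this. The `buildChatRequest` helper is illustrative and not part of the repository; only the payload shape (`messages` plus `model`) is taken from the request example above.

```typescript
// Illustrative helper (not in the repo): builds fetch options for the
// /api/chat endpoint using the payload shape documented above.
type ChatMessage = { role: "user" | "assistant"; content: string };

function buildChatRequest(messages: ChatMessage[], model: string) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // The endpoint expects a JSON body of { messages, model }
    body: JSON.stringify({ messages, model }),
  };
}

// Usage against a running dev server:
// const res = await fetch("http://localhost:3000/api/chat", buildChatRequest(
//   [{ role: "user", content: "What is JavaScript?" }],
//   "openai/gpt-oss-120b:free",
// ));
// const { response } = await res.json();
```

Passing the full `messages` array (not just the latest message) is what gives the models the complete conversation context described above.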
Edit `components/chat.tsx`:

```tsx
const [selectedModel, setSelectedModel] = useState('model-id-here');
```

To add a new model, update `app/api/chat/models/route.ts` and `components/chat.tsx`:

```tsx
{
  id: 'provider/model-name:free',
  name: 'Display Name',
  description: 'Model description',
  provider: 'Provider Name',
}
```

| Variable | Required | Description |
|---|---|---|
| `OPENROUTER_API_KEY` | Yes | OpenRouter API key for chat models |
| `GROQ_API_KEY` | Yes | Groq API key for speech-to-text |
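Both keys are read server-side. A minimal sketch of validating them in an API route — the `requireEnv` helper is hypothetical, not something the repository provides:

```typescript
// Hypothetical helper: fail fast if a required key from the table
// above is missing from process.env.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// e.g. inside a route handler:
// const openRouterKey = requireEnv("OPENROUTER_API_KEY");
// const groqKey = requireEnv("GROQ_API_KEY");
```

Failing fast like this surfaces a missing key as a clear server error instead of an opaque upstream 401.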
Set environment variables in the Vercel dashboard:

- `OPENROUTER_API_KEY`
- `GROQ_API_KEY`
The app is a standard Next.js project, so it can also be deployed to:
- Netlify
- Railway
- Render
Make sure to set environment variables in your hosting platform.
Contributions are welcome! Feel free to:
- Report bugs
- Suggest features
- Submit pull requests