This project is an open-source clone of the T3.Chat application, built as a submission for the T3 ChatCloneathon Competition.
This repository contains a modern, multi-LLM chat application featuring real-time streaming, robust authentication, and a secure, scalable backend. It's designed to be a high-performance, production-ready chat interface.
Live Demo (Coming Soon) | Cloneathon Page | Original T3.Chat
For the MVP, the focus was on a core feature set delivered with smooth UX and solid performance.
- 🤖 Multi-LLM Support: Built on the Vercel AI SDK v4, so any model supported by the SDK can be plugged in.
- 🔒 Secure & Private: End-to-end security with Clerk for authentication and Convex's Row-Level Security (RLS) to ensure data privacy.
- 🚀 Fast, local-first caching: custom hooks layered on top of Convex's queries.
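The idea behind that caching layer is stale-while-revalidate: show cached data instantly, refresh in the background. A minimal sketch in plain TypeScript (the names here are illustrative, not this repo's actual hook API):

```typescript
// Minimal stale-while-revalidate cache sketching the idea behind the
// custom hooks layered on Convex queries. Illustrative names only.
type Fetcher<T> = () => Promise<T>;

class CachedStore<T> {
  private cache = new Map<string, T>();

  // Return stale data immediately (if any), then refresh in the background
  // and notify the caller (e.g. to trigger a React re-render).
  read(key: string, fetch: Fetcher<T>, onFresh: (value: T) => void): T | undefined {
    const stale = this.cache.get(key);
    void fetch().then((fresh) => {
      this.cache.set(key, fresh);
      onFresh(fresh);
    });
    return stale;
  }
}
```

The first read misses and returns `undefined`; once the fetch resolves, subsequent reads return the cached value immediately while a fresh fetch runs behind the scenes.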
This project leverages a modern, type-safe, and scalable technology stack.
- Framework: Next.js 15 (React 19)
- Backend & Database: Convex (Real-time serverless backend)
- Authentication: Clerk
- AI Integration: Vercel AI SDK v4
- UI: ShadCN UI & Tailwind CSS
- Language: TypeScript (Strict Mode)
- Containerization (Dev): Docker & Docker Compose
Features I wanted to implement but didn't have time for:
- smooth streaming
- stream error handling
- input save per thread (in case of error, page reload etc)
- code colorization
- thread renaming
- stream resumability
- favorite threads
- web search
- better optimized Markdown rendering
- public shared threads
- a common set of primitive widget tools for the LLM to display structured data (lists, images, etc.)
- clean my absolute mess of a codebase because I rushed it
- Server-Side AI Operations: All AI SDK calls are handled in Next.js API Routes (`/api/chat`) to protect API keys and manage provider logic securely.
- Row-Level Security (RLS): Utilizes a custom RLS system within Convex (`queryWithRLS`, `mutationWithRLS`) to provide bulletproof data isolation between users automatically.
- Optimistic UI Updates: Messages appear instantly in the UI while being sent to the server in the background for a snappy user experience.
- Local Caching: All data returned by Convex query hooks is cached locally, so pages load instantly with stale data while fresh data streams in.
- Component-Based Architecture: A clean separation of concerns with reusable components for UI, chat logic, and authentication.
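Conceptually, the RLS wrappers post-filter every result set by the authenticated user before data leaves the backend. A plain-TypeScript sketch of the idea (hypothetical types; the real `queryWithRLS` / `mutationWithRLS` are built on Convex's query and mutation builders):

```typescript
// Conceptual sketch of row-level security: wrap a raw query so it can only
// ever return rows owned by the authenticated user. Types are illustrative.
type OwnedDoc = { userId: string };
type AuthCtx = { userId: string };

function queryWithRLS<T extends OwnedDoc>(
  rawQuery: (ctx: AuthCtx) => T[]
): (ctx: AuthCtx) => T[] {
  // Every wrapped query is filtered by ownership, so individual queries
  // cannot accidentally leak another user's rows.
  return (ctx) => rawQuery(ctx).filter((doc) => doc.userId === ctx.userId);
}

// Usage: a "list threads" query automatically scoped to the caller.
const allThreads = [
  { userId: "user_a", title: "hello" },
  { userId: "user_b", title: "secret" },
];
const listThreads = queryWithRLS(() => allThreads);
```

Because the filter lives in the wrapper rather than in each query, a forgotten `WHERE userId = ...` clause becomes impossible by construction.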
This project is fully containerized. The only local dependencies you need are Docker (with docker-compose) and make.
1. Clone the repository:

   ```bash
   git clone https://github.com/PaulSenon/t3-chat-cloneathon.git
   cd t3-chat-cloneathon
   ```
2. Set up environment variables: Copy the example environment file and fill in your keys for Convex, Clerk, OpenAI, and Anthropic.

   ```bash
   cp .env.example .env.local
   ```
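The authoritative variable names are in `.env.example`; for this stack (Convex + Clerk + AI providers) the file typically looks something like the following (illustrative names, check `.env.example` for the real ones):

```bash
# .env.local — illustrative variable names; see .env.example for the real set
NEXT_PUBLIC_CONVEX_URL=...
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=...
CLERK_SECRET_KEY=...
OPENAI_API_KEY=...
ANTHROPIC_API_KEY=...
```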
3. Install and Build: This command builds the Docker container and installs all `pnpm` dependencies inside it.

   ```bash
   make install
   ```

   > [!NOTE]
   > It will prompt you for Convex setup. Ctrl+C to exit.
4. Run the Development Server: This starts the Next.js development server inside the container.

   ```bash
   make dev
   ```
The application will be available at http://localhost:3000.
- `make help`: Show all available commands.
- `make dev`: Start the development server.
- `make run cmd="..."`: Run any command inside the development container (e.g., `make run cmd="pnpm add package-name"`).
- `make bash`: Get a shell inside the running container.
- `make clean`: Stop and remove all project-related containers and volumes.
This project is licensed under the MIT License. See the LICENSE file for details.
Developed by Paul Senon for the T3 ChatCloneathon. Connect with me on LinkedIn or Twitter/X.