# Askify

Askify is a modern, full-stack AI chat application for conversing with documents and code repositories. Upload PDF documents or connect GitHub repositories, then chat with your content using state-of-the-art AI models including GPT-4o, GPT-5, o3 Mini/Pro, and Google Gemini.
## Features

- 📄 Document Intelligence: Upload and chat with PDF, DOC, and DOCX files
- 🔍 Repository Analysis: Connect public GitHub repositories for code exploration
- 🤖 Multi-Model AI: Choose from GPT-4o, GPT-5, o3 series, and Gemini models
- 📊 Usage Analytics: Track daily/monthly usage with visual breakdowns
- 🎨 Modern UI: Beautiful dark/light theme with responsive design
- ⚡ Real-time Processing: Background job processing with live progress updates
- 🔐 Secure Authentication: JWT-based auth with protected routes
- 🚀 High Performance: Vector embeddings with Qdrant for fast retrieval
## Tech Stack

### Frontend

- React 19 with TypeScript for type safety
- Tailwind CSS + shadcn/ui for modern, accessible components
- Zustand for lightweight state management
- Server-side rendering with proper hydration handling
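The state layer follows the small-store pattern that Zustand popularized. As a rough, hand-rolled sketch of that pattern (this is not the Zustand API, and the store shape below is invented for illustration):

```typescript
// Minimal store sketch mimicking the Zustand pattern: one state object,
// a setter that merges partial updates, and change subscriptions.
type Listener<T> = (state: T) => void;

function createStore<T extends object>(initial: T) {
  let state = initial;
  const listeners = new Set<Listener<T>>();
  return {
    getState: () => state,
    setState: (partial: Partial<T>) => {
      state = { ...state, ...partial };
      listeners.forEach((l) => l(state));
    },
    subscribe: (l: Listener<T>) => {
      listeners.add(l);
      return () => listeners.delete(l); // unsubscribe handle
    },
  };
}

// Hypothetical chat UI state, for illustration only.
const uiStore = createStore({ theme: "dark", sidebarOpen: true });
uiStore.setState({ sidebarOpen: false }); // merge keeps theme untouched
console.log(uiStore.getState().sidebarOpen); // false
```

The real Zustand `create` additionally integrates with React's subscription model so components re-render on change; the sketch only shows the store mechanics.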
### Backend

- RESTful API with TypeScript
- Prisma ORM with PostgreSQL database
- Redis/Bull queues for background processing
- Passport JWT authentication strategy
- File upload with validation and processing
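Authenticated requests carry a JWT whose payload identifies the user. A minimal sketch of decoding that payload (illustration only: the real server verifies the signature via Passport's JWT strategy, which this sketch deliberately omits, and the claim names are assumptions):

```typescript
// Decode the payload segment of a JWT. No signature verification here;
// a real Passport JWT strategy validates the signature and expiry.
interface JwtPayload {
  sub: string;   // user id (assumed claim name)
  email: string; // assumed claim
  exp: number;   // expiry, seconds since epoch
}

function decodeJwtPayload(token: string): JwtPayload {
  const parts = token.split(".");
  if (parts.length !== 3) throw new Error("malformed token");
  const json = Buffer.from(parts[1], "base64url").toString("utf8");
  return JSON.parse(json) as JwtPayload;
}

// Build a fake unsigned token just to exercise the decoder.
const payload = { sub: "user-1", email: "a@b.co", exp: 1900000000 };
const fakeToken =
  Buffer.from(JSON.stringify({ alg: "HS256" })).toString("base64url") +
  "." +
  Buffer.from(JSON.stringify(payload)).toString("base64url") +
  ".signature";
console.log(decodeJwtPayload(fakeToken).sub); // "user-1"
```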
### AI & ML

- OpenAI GPT Models: GPT-4o, GPT-5, o3 Mini/Pro
- Google Gemini: Latest 2.0 Flash models
- LangChain: Document processing and retrieval
- Qdrant: Vector database for semantic search
- OpenAI Embeddings: text-embedding-3-small
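Semantic retrieval boils down to comparing embedding vectors by cosine similarity. A minimal sketch of the math (in practice Qdrant performs this comparison at scale server-side):

```typescript
// Cosine similarity between two embedding vectors: dot product divided
// by the product of the vector magnitudes. Result is in [-1, 1].
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy 3-dimensional "embeddings"; real text-embedding-3-small vectors
// have 1536 dimensions.
console.log(cosineSimilarity([1, 0, 0], [1, 0, 0])); // 1 (identical)
console.log(cosineSimilarity([1, 0, 0], [0, 1, 0])); // 0 (orthogonal)
```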
## Getting Started

### Prerequisites

- Node.js 18+ and pnpm
- Docker and Docker Compose
- PostgreSQL, Redis, and Qdrant (via Docker)
### Installation

```bash
git clone https://github.com/tushargr0ver/askify.git
cd askify
```

Create `.env` files in both the `client` and `server` directories.
Server `.env`:

```env
DATABASE_URL="postgresql://user:password@localhost:5432/askify"
JWT_SECRET="your-super-secret-jwt-key"
OPENAI_API_KEY="your-openai-api-key"
GEMINI_API_KEY="your-gemini-api-key"
```

Client `.env.local`:

```env
NEXT_PUBLIC_API_URL="http://localhost:3001"
```

Start the infrastructure services (PostgreSQL, Redis, and Qdrant):

```bash
docker-compose up -d
```

Then install dependencies, run the database migrations, and start the development servers:

```bash
# Install dependencies for both client and server
cd server && pnpm install
cd ../client && pnpm install

# Run database migrations
cd ../server && pnpm dlx prisma migrate dev

# Start development servers (run each from the repo root, in separate terminals)
cd server && pnpm run start:dev   # Backend on :3001
cd client && pnpm run dev         # Frontend on :3000
```

## Usage

### Accounts

- Sign up with email and password
- Login to access your personalized dashboard
- Manage preferences including preferred AI models
- Click "New Chat" β "Upload Document"
- Select PDF, DOC, or DOCX files (max 5MB)
- Wait for processing completion
- Start asking questions about your document content
- Click "New Chat" β "Fetch Repository"
- Enter a public GitHub repository URL
- Wait for code analysis and vectorization
- Explore your codebase through natural language queries
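Before vectorization, document and repository text is typically split into overlapping chunks so that context straddling a boundary survives in both neighbors (LangChain's text splitters handle this in the real pipeline). A simplified sketch of the idea, with chunk-size and overlap values chosen for illustration rather than taken from Askify's actual configuration:

```typescript
// Split text into fixed-size chunks with overlap. Each chunk starts
// (chunkSize - overlap) characters after the previous one, so the last
// `overlap` characters of a chunk reappear at the start of the next.
function chunkText(text: string, chunkSize = 500, overlap = 50): string[] {
  if (overlap >= chunkSize) throw new Error("overlap must be < chunkSize");
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // final chunk reached end
  }
  return chunks;
}

const doc = "x".repeat(1200);
console.log(chunkText(doc).length); // 3 (starts at 0, 450, 900)
```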
### AI Models

- GPT-4o: Balanced performance for general tasks
- GPT-5: Advanced reasoning capabilities
- o3 Mini/Pro: Optimized for coding and STEM
- Gemini 2.0: Multimodal with strong reasoning
### Usage Limits

| Plan | Daily Messages | Monthly Uploads | Monthly Repos |
|---|---|---|---|
| Free | 50 | 10 | 5 |
| Pro | Unlimited | Unlimited | Unlimited |
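Enforcing the table above comes down to a simple quota check. A sketch, with field and plan names invented for illustration (they are not taken from Askify's schema):

```typescript
// Illustrative quota check against the plan limits in the table above.
// `null` encodes "Unlimited" (the Pro plan).
interface PlanLimits {
  dailyMessages: number | null;
  monthlyUploads: number | null;
  monthlyRepos: number | null;
}

const LIMITS: Record<"free" | "pro", PlanLimits> = {
  free: { dailyMessages: 50, monthlyUploads: 10, monthlyRepos: 5 },
  pro:  { dailyMessages: null, monthlyUploads: null, monthlyRepos: null },
};

function canSendMessage(plan: "free" | "pro", usedToday: number): boolean {
  const limit = LIMITS[plan].dailyMessages;
  return limit === null || usedToday < limit; // null = unlimited
}

console.log(canSendMessage("free", 49));  // true  (one message left)
console.log(canSendMessage("free", 50));  // false (daily cap reached)
console.log(canSendMessage("pro", 5000)); // true  (unlimited)
```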
## API Endpoints

### Auth

- `POST /auth/signup` - User registration
- `POST /auth/login` - User login
- `GET /auth/me` - Get current user

### Chat

- `POST /chat` - Create new chat
- `GET /chat` - List user chats
- `POST /chat/:id/message` - Send message
- `DELETE /chat/:id` - Delete chat

### Files

- `POST /file-upload` - Upload document
- `GET /file-upload/job/:id` - Check processing status

### Repositories

- `POST /repository/process` - Process GitHub repo
- `GET /repository/job/:id` - Check processing status
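A thin client wrapper around these endpoints might build request options like the sketch below. The endpoint paths come from the list above; the Bearer auth scheme and JSON bodies are assumptions, and `RequestOptions` is a local type rather than anything from Askify's codebase:

```typescript
// Build options for an authenticated JSON API call. Pure function:
// performing the fetch is left to the caller.
interface RequestOptions {
  method: "GET" | "POST" | "DELETE";
  headers: Record<string, string>;
  body?: string;
}

function apiRequest(
  method: "GET" | "POST" | "DELETE",
  token: string,
  body?: unknown,
): RequestOptions {
  return {
    method,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`, // assumed auth scheme
    },
    ...(body !== undefined ? { body: JSON.stringify(body) } : {}),
  };
}

// e.g. fetch(`${API_URL}/chat/123/message`, apiRequest("POST", token, { content: "Hi" }))
const opts = apiRequest("POST", "my-jwt", { content: "Hi" });
console.log(opts.method); // "POST"
```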
## Project Structure

```
askify/
├── client/                 # Next.js frontend
│   ├── app/                # App router pages
│   ├── components/         # Reusable UI components
│   ├── hooks/              # Custom React hooks
│   └── lib/                # Utilities and API client
├── server/                 # NestJS backend
│   ├── src/                # Source code
│   │   ├── auth/           # Authentication module
│   │   ├── chat/           # Chat and AI processing
│   │   ├── file-upload/    # Document processing
│   │   ├── repository/     # GitHub repo handling
│   │   └── users/          # User management
│   └── prisma/             # Database schema and migrations
└── docker-compose.yml      # Infrastructure setup
```
## Development

```bash
# Server commands
cd server
pnpm run start:dev          # Development server
pnpm run test               # Run tests
pnpm run build              # Production build

# Client commands
cd client
pnpm run dev                # Development server
pnpm run build              # Production build
pnpm run lint               # ESLint check

# Database operations
pnpm dlx prisma studio      # Database GUI
pnpm dlx prisma migrate dev # Run migrations
pnpm dlx prisma generate    # Generate client
```

## Deployment

```bash
# Build applications
cd server && pnpm run build
cd client && pnpm run build

# Start production servers
cd server && pnpm run start:prod
cd client && pnpm start

# Or build and run with Docker Compose
docker-compose -f docker-compose.prod.yml up --build
```

Ensure all environment variables are set for production:
- Database connections
- API keys (OpenAI, Gemini)
- JWT secrets
- CORS origins
## Contributing

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments

- OpenAI for GPT models and embeddings
- Google for Gemini AI models
- Vercel for Next.js framework
- NestJS for the backend framework
- LangChain for AI orchestration
- Qdrant for vector search capabilities
Built with ❤️ by Tushar Grover