ClearTerms is a full-stack Generative AI application that protects you from predatory legal agreements. It uses multi-model AI agents (Llama, Qwen, Mixtral via Hugging Face) to scan Privacy Policies and Terms of Service instantly, highlighting red flags and summarizing your rights in plain English.
In a digital world where "I Agree" is the biggest lie, ClearTerms stands as your personal legal defense.
- 🔍 Deep Legal Analysis, Not Just Keywords: Unlike simple keyword Ctrl+F tools, our AI agent understands context. It can distinguish between "we sell your data" and "we transfer data to service providers".
- 🤖 Multi-Model Failsafe Intelligence: Built with a robust fallback system. If one AI model is busy, ClearTerms automatically switches providers (Llama → Qwen → Mixtral), so your analysis still completes even when a single provider is overloaded.
- ✨ Premium User Experience: Legal tools don't have to be boring. Enjoy a stunning, mobile-responsive Glassmorphism UI that makes reading contracts as easy as browsing social media.
- 🛡️ Privacy First: We analyze policies without storing your personal data. Your privacy is our priority while we check theirs.
- AI Risk Detection: Instantly identifies "High Severity" risks like Data Selling, IP Ownership, and Forced Arbitration.
- Transparency Score: Assigns a simple 0-100 score to every policy.
- Smart Fallback System: Automatically rotates between Llama 3.3, Qwen 2.5, and Mixtral to guarantee results.
- Anti-Bot Scraping: Uses a hybrid scraper (Jina Reader + Headless Fallback) to read policies even from difficult sites like Zomato.
- Performance Optimized:
- Fast Next.js rendering via App Router Server/Client Component splitting.
- High-concurrency FastAPI backend that offloads blocking HTTP scrapes to background threads.
- Modern Tech Stack: Built with Next.js 14, FastAPI (Python), and Tailwind CSS, featuring a responsive, premium Glassmorphism UI.
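The fallback behavior described above boils down to trying a chain of models in order. Below is a minimal sketch, not ClearTerms' actual `agent.py`; the model IDs and the injected `ask_model` callable are illustrative assumptions:

```python
# Minimal multi-model fallback sketch: try each provider in order and
# return the first successful answer. `ask_model` stands in for a real
# Hugging Face Inference call and is injected so the logic is testable.
from typing import Callable, List

MODEL_CHAIN = [
    "meta-llama/Llama-3.3-70B-Instruct",     # primary (assumed ID)
    "Qwen/Qwen2.5-72B-Instruct",             # first fallback (assumed ID)
    "mistralai/Mixtral-8x7B-Instruct-v0.1",  # last resort (assumed ID)
]

def analyze_with_fallback(prompt: str,
                          ask_model: Callable[[str, str], str],
                          models: List[str] = MODEL_CHAIN) -> str:
    """Rotate through models until one answers; raise if all fail."""
    last_error = None
    for model_id in models:
        try:
            return ask_model(model_id, prompt)
        except Exception as exc:  # rate limit, cold start, outage...
            last_error = exc
    raise RuntimeError(f"All models failed; last error: {last_error}")
```

In production `ask_model` would wrap a `huggingface_hub.InferenceClient` chat-completion call, but the rotation logic stays the same.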
- Frontend: Next.js 14, TypeScript, Tailwind CSS, Framer Motion, Lucide React.
- Backend: Python 3.9+, Hugging Face Hub, FastAPI (Serverless), Trafilatura.
- AI Infrastructure: Hugging Face Serverless Inference API (official `huggingface_hub` SDK).
- Deployment: Vercel (Frontend + Python Serverless Functions).
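The hybrid scraper mentioned in the features can be sketched as "Jina Reader first, local extraction second". `r.jina.ai` is Jina's real reader proxy; everything else here (function names, the injected `http_get`/`extract` callables) is an illustrative assumption, not the project's actual code:

```python
# Hybrid scraper sketch: try the Jina Reader proxy (which returns clean,
# readable text and copes with many bot walls), then fall back to a raw
# fetch plus local extraction (Trafilatura in the real backend). The
# http_get and extract callables are injected to keep the logic testable.
from typing import Callable

JINA_PREFIX = "https://r.jina.ai/"

def scrape_policy(url: str,
                  http_get: Callable[[str], str],
                  extract: Callable[[str], str]) -> str:
    try:
        return http_get(JINA_PREFIX + url)  # proxy already cleans the page
    except Exception:
        raw_html = http_get(url)            # direct fetch as a fallback
        return extract(raw_html)            # e.g. trafilatura.extract
```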
```
/
├── api/                # Python Serverless Functions (FastAPI entry)
├── backend/            # Core AI Logic & Agent Definitions
│   └── agent.py        # Hugging Face AI Agent & Multi-Model Config
├── frontend/           # Next.js Application
│   ├── app/            # App Router Pages
│   └── components/     # UI Components
├── requirements.txt    # Python Dependencies
└── vercel.json         # Vercel Build Config
```

- Clone the repo:
  ```bash
  git clone https://github.com/yourusername/clearterms.git
  ```
- Install the Python backend:
  ```bash
  pip install -r requirements.txt
  ```
- Install the frontend:
  ```bash
  cd frontend && npm install
  ```
- Set up keys: Create a `.env` file and add your `HF_TOKEN`.
- Run locally:
  - Backend:
    ```bash
    uvicorn api.index:app --reload
    ```
  - Frontend:
    ```bash
    npm run dev
    ```
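The "offloads blocking HTTP scrapes to background threads" claim from the performance notes comes down to handing blocking work to a worker thread so the async event loop stays free. A minimal sketch using stdlib `asyncio.to_thread` (Python 3.9+); the function names are illustrative, but a FastAPI endpoint can `await` this call exactly the same way:

```python
# Keep an async backend responsive while a blocking scrape runs: hand the
# blocking call to a worker thread with asyncio.to_thread (Python 3.9+).
import asyncio
import time

def blocking_fetch(url: str) -> str:
    # Stand-in for a blocking HTTP scrape (requests/Trafilatura in prod).
    time.sleep(0.05)
    return f"<html from {url}>"

async def analyze_policy(url: str) -> str:
    # The event loop keeps serving other requests while this thread works.
    html = await asyncio.to_thread(blocking_fetch, url)
    return html.upper()
```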
This project is optimized for Vercel.
- Push to GitHub.
- Import project in Vercel.
- Go to the Settings > Environment Variables tab in your Vercel Dashboard.
- Add `HF_TOKEN` in the Key field and your Hugging Face Access Token in the Value field.
- Deploy! (If already deployed, click the three dots on the latest deployment and click Redeploy so it picks up the token.)
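For reference, a repo that pairs a Next.js frontend with a Python function under `api/` often routes API traffic with a rewrite like the one below. This is only a common pattern, not this project's actual `vercel.json`:

```json
{
  "rewrites": [
    { "source": "/api/(.*)", "destination": "/api/index" }
  ]
}
```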
