AI assistant that analyzes a résumé and job description, detects skill gaps, retrieves learning resources, and generates a 4-week study plan.
Requirements: Python 3.11+ and a Groq API key.
- Install deps:
  ```bash
  python -m pip install --upgrade pip
  python -m pip install -r requirements.txt
  ```
  (Or `make install`.)
- Configure env (`.env` in repo root):
  ```
  GROQ_API_KEY=your_key_here
  # Optional overrides: RAG_MODEL, RAG_PERSIST_DIR, RAG_COLLECTION_NAME, RAG_EMBEDDING_MODEL, RAG_TOP_K, RESUME_PARSER_MODEL, RESUME_PARSER_TEMPERATURE
  ```
- Run backend:
  ```bash
  uvicorn backend.app.api.main:app --host 0.0.0.0 --port 8000 --reload
  ```
  (Or `make run-backend`.)
- Serve frontend:
  ```bash
  python -m http.server 3000 --directory frontend
  ```
  Visit http://localhost:3000 and upload a résumé PDF + paste a job description.
- Résumé → XML (`resume_parser.py`)
- Job skills (LLM) → `<jobSkills>` XML (`job_skill_eval.py`)
- Résumé skills (LLM) → `<skillsEvaluation>` XML (`resume_skill_eval.py`)
- Skill gaps → `<skillGaps>` XML (`skill_gap_eval.py`)
- RAG study plan: retrieve from Chroma (`VectorDB/`), generate 4-week plan (`rag_agent.py`)
- API (`backend/app/api/main.py`) returns: `study_plan` (markdown), `plan_structured` (weeks/tasks/resources), `plan_reasoning`, plus job/resume skills and gap XML
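The XML hand-offs above can be consumed with the standard library alone. A minimal sketch, assuming `<skillGaps>` wraps `<skill>` elements carrying `name` and `level` attributes (the actual schema lives in the service modules and may differ):

```python
import xml.etree.ElementTree as ET

def parse_skill_gaps(xml_text: str) -> list[dict]:
    """Extract skill-gap entries from a <skillGaps> document."""
    root = ET.fromstring(xml_text)
    return [
        {"name": skill.get("name"), "level": skill.get("level")}
        for skill in root.iter("skill")
    ]

# Hypothetical payload shaped like the gap XML described above.
sample = """<skillGaps>
  <skill name="PyTorch" level="intermediate"/>
  <skill name="SQL window functions" level="beginner"/>
</skillGaps>"""

print(parse_skill_gaps(sample))
```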
- `GET /health` — uptime probe
- `POST /analyze` — accepts JSON `{ resume_pdf_path, job_description }`
- `POST /analyze-upload` — multipart form (`resume_pdf`, `job_description`) used by the frontend
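Calling `POST /analyze` needs nothing beyond the standard library. A sketch, assuming the backend from the quickstart is listening on `localhost:8000` (the `demo/` filename here is hypothetical):

```python
import json
import urllib.request

def build_analyze_request(resume_pdf_path: str, job_description: str,
                          base_url: str = "http://localhost:8000") -> urllib.request.Request:
    """Build a POST /analyze request with the JSON body shown above."""
    payload = json.dumps({
        "resume_pdf_path": resume_pdf_path,
        "job_description": job_description,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/analyze",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_analyze_request("demo/sample_resume.pdf", "Data Scientist, NLP focus")

# Sending requires the backend to be running:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["study_plan"])
```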
Static HTML/CSS/JS (`frontend/index.html`), no build step. Fetches `POST /analyze-upload`. Views: Study Plan, Structured, Raw, Reasoning.
```
backend/app/api/       # FastAPI entrypoint
backend/app/workflow/  # Orchestrator wiring services
backend/app/services/  # Resume/job skill eval, skill gap, RAG agent
backend/app/utils/     # XML helpers
frontend/              # Static UI
VectorDB/              # Chroma persistence (local)
docs/                  # Brief, appendix, deliverables checklist
demo/                  # Example résumés and job descriptions for local tests and demos
Makefile               # install/run helpers
requirements.txt
```
Planned expansion:
- Software Engineering
- Finance
- Mechanical Engineering
- Additional technical and non-technical career paths
- Both entry-level and senior candidates
1. Skill Gap Detection

Users upload their résumé or answer domain-specific questions. Our system:
- Analyzes skills, tools, and domain knowledge
- Compares them to real job requirements
- Returns structured “knowledge gap” outputs
2. RAG-Based Knowledge Retrieval

The RAG database is built from credible sources:
- ML & statistics textbooks
- *Ace the Data Science Interview*
- Applied DS/ML/AI resources
- Domain-specific articles, videos, PDFs, and tutorials

The engine retrieves explanations, examples, and runnable insights—not generic fluff.
3. Personalized Study Plans

For every knowledge gap, the system generates:
- Sequenced study modules from videos, articles, and textbook excerpts
- Timelines and difficulty progression
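The `plan_structured` payload (weeks/tasks/resources) can be modelled with simple dataclasses. A sketch of one possible shape — the field names and the round-robin scheduling here are assumptions for illustration, not the API contract:

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    title: str
    url: str

@dataclass
class Week:
    number: int
    tasks: list[str] = field(default_factory=list)
    resources: list[Resource] = field(default_factory=list)

def plan_skeleton(gaps: list[str], weeks: int = 4) -> list[Week]:
    """Spread skill gaps round-robin across a fixed number of weeks."""
    plan = [Week(number=i + 1) for i in range(weeks)]
    for i, gap in enumerate(gaps):
        plan[i % weeks].tasks.append(f"Study {gap}")
    return plan

plan = plan_skeleton(["PyTorch", "SQL", "A/B testing"])
print([w.tasks for w in plan])
```

In the real system the LLM fills in tasks and resources; a typed skeleton like this keeps the frontend views (Structured, Raw) easy to render.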
Frontend: Built with HTML and JavaScript; communicates with the backend through HTTP calls to the FastAPI service.
Backend:
- FastAPI, Python
- Inference: Groq
- Model: llama-3.1-8b-instant
RAG Pipeline:
- Textbooks converted to Markdown using cloud GPUs
- Chunking & preprocessing by header
- Sentence-transformer embeddings (all-MiniLM-L6-v2)
- Storage in a vector DB (ChromaDB)
- LangChain
- Retrieval (k = 3) + generation
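The retrieval step above boils down to nearest-neighbour search over embeddings. A toy, pure-Python sketch of top-k cosine retrieval (the real pipeline uses all-MiniLM-L6-v2 vectors stored in ChromaDB; the 3-dimensional vectors here are made up for illustration):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def top_k(query: list[float], docs: dict[str, list[float]], k: int = 3) -> list[str]:
    """Rank document chunks by similarity to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
    return ranked[:k]

# Toy corpus: chunk title -> fake embedding.
corpus = {
    "gradient descent": [0.9, 0.1, 0.0],
    "p-values":         [0.1, 0.9, 0.2],
    "transformers":     [0.8, 0.3, 0.1],
    "cooking pasta":    [0.0, 0.1, 0.9],
}

print(top_k([1.0, 0.2, 0.0], corpus))  # ML-flavoured query drops the off-topic chunk
```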
Agentic Workflow

Agent 1 [PDF Converter]
- Input: Résumé PDF
- Output: XML of the résumé

Goes to:

Agent 2 [Job Description Skills]
- Input: Job description
- Output: Skill and proficiency requirements for the specific job

Goes to:

Agent 3 [Resume Skills]
- Input: XML of the résumé
- Output: Skill and proficiency assessment with respect to the skills outlined by Agent 2

Goes to:

Python function to compute the gaps
- Input: Skills
- Output: Gaps

Goes to:

Agent 4 [RAG + Study Plan]
- Input: Skill gaps
- Output: Study plan to become proficient in the necessary skills
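The gap step between Agent 3 and Agent 4 is plain set logic. A minimal sketch, assuming each side arrives as a mapping of skill → proficiency level (the level names are assumptions; the real levels come from the LLM evaluations):

```python
# Assumed ordinal proficiency scale, lowest to highest.
LEVELS = {"none": 0, "beginner": 1, "intermediate": 2, "advanced": 3}

def skill_gaps(required: dict[str, str], candidate: dict[str, str]) -> dict[str, str]:
    """Return the skills where the candidate falls short of the job requirement."""
    gaps = {}
    for skill, need in required.items():
        have = candidate.get(skill, "none")  # missing skill counts as "none"
        if LEVELS[have] < LEVELS[need]:
            gaps[skill] = need
    return gaps

job = {"Python": "advanced", "SQL": "intermediate", "Docker": "beginner"}
resume = {"Python": "advanced", "SQL": "beginner"}

print(skill_gaps(job, resume))  # → {'SQL': 'intermediate', 'Docker': 'beginner'}
```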
- Working prototype (steps above; Makefile included)
- Product brief: `docs/product_brief.md`
- Technical appendix: `docs/technical_appendix.md`
- Deliverables checklist: `docs/deliverables.md`
- Add your deck and demo video to `docs/` or link them in `docs/deliverables.md`
- Defaults use HuggingFace `all-MiniLM-L6-v2` embeddings with Chroma. Override paths/models via env vars if needed.
- `main` is protected; merge changes via PR.
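The env-var overrides from the quickstart can be read with fallbacks in one place. A sketch — the defaults shown match the ones documented here (`all-MiniLM-L6-v2`, `VectorDB/`, k = 3), but the exact names baked into the code are the source of truth:

```python
import os

def rag_settings() -> dict:
    """Collect RAG configuration from the environment, with documented defaults."""
    return {
        "embedding_model": os.getenv("RAG_EMBEDDING_MODEL", "all-MiniLM-L6-v2"),
        "persist_dir": os.getenv("RAG_PERSIST_DIR", "VectorDB"),
        "collection": os.getenv("RAG_COLLECTION_NAME", "default"),
        "top_k": int(os.getenv("RAG_TOP_K", "3")),
    }

print(rag_settings())
```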