
Resume Match - AI-Powered Resume Analysis & Job Matching

License: MIT · Docker · Python · Go · React

An intelligent resume analysis and job matching system that leverages AI/LLM technology to provide real-time feedback and suggestions for resume optimization.

🚀 Features

  • Multi-format Support: Parse resumes in PDF, DOCX, and TXT formats
  • AI-Powered Analysis: Leverage Ollama for local LLM inference, keeping resume data private
  • Real-time Streaming: Get instant feedback with Server-Sent Events (SSE)
  • Skill Extraction: Automatically identify and extract skills from resumes
  • Job Description Matching: Compare resume content against job requirements
  • Modern Web Interface: Beautiful React-based UI with Material-UI components
  • Docker Ready: Easy deployment with Docker Compose

πŸ—οΈ Architecture

The project consists of four main services:

┌─────────────┐    ┌─────────────┐    ┌─────────────┐    ┌─────────────┐
│  Frontend   │    │   Backend   │    │ ML Service  │    │   Ollama    │
│   (React)   │◄──►│    (Go)     │◄──►│  (Python)   │◄──►│   (LLM)     │
│  Port 3000  │    │  Port 8080  │    │  Port 8000  │    │ Port 11434  │
└─────────────┘    └─────────────┘    └─────────────┘    └─────────────┘

Service Details

  • Frontend: React 18 + Vite + Material-UI, responsive web interface
  • Backend: Go + Gin framework, handles file uploads and API routing
  • ML Service: Python + FastAPI, processes resumes and integrates with LLM
  • Ollama: Local LLM inference engine with GPU acceleration support
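
To make the data flow concrete: the backend forwards each upload to the ML service at the /api/score_file_stream path (see ML_URL under Configuration), and the ML service in turn streams tokens from Ollama back to the caller. The repository's actual app.py is not reproduced here; the following is a minimal sketch of that middle hop, assuming httpx for the Ollama call and treating the model name (llama3) and prompt wording as placeholders:

# Sketch of the ML service's streaming hop (illustrative, not the repository's app.py).
# Handles text resumes only; the real service also parses PDF/DOCX.
import json
import os

import httpx
from fastapi import FastAPI, File, Form, UploadFile
from fastapi.responses import StreamingResponse

OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://ollama:11434")
MODEL_NAME = "llama3"  # placeholder; the model used by the project is not stated in this README

app = FastAPI()

@app.post("/api/score_file_stream")
async def score_file_stream(jd_text: str = Form(...), resume_file: UploadFile = File(...)):
    resume_text = (await resume_file.read()).decode("utf-8", errors="ignore")
    prompt = (
        "Compare the following resume against the job description and suggest improvements.\n\n"
        f"Resume:\n{resume_text}\n\nJob description:\n{jd_text}"
    )

    async def token_stream():
        # Ollama's /api/generate returns newline-delimited JSON chunks when streaming.
        async with httpx.AsyncClient(timeout=None) as client:
            async with client.stream(
                "POST",
                f"{OLLAMA_BASE_URL}/api/generate",
                json={"model": MODEL_NAME, "prompt": prompt, "stream": True},
            ) as resp:
                async for line in resp.aiter_lines():
                    if not line:
                        continue
                    chunk = json.loads(line)
                    # Re-emit each token as a Server-Sent Events frame for the backend to relay.
                    yield f"data: {chunk.get('response', '')}\n\n"

    return StreamingResponse(token_stream(), media_type="text/event-stream")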

📋 Prerequisites

  • Docker & Docker Compose: For containerized deployment
  • Git: For cloning the repository
  • Modern Web Browser: Chrome, Firefox, Safari, or Edge

🛠️ Installation

1. Clone the Repository

git clone https://github.com/chenjunlin110/resume-match.git
cd resume-match

2. Start Services with Docker Compose

# Navigate to infrastructure directory
cd infra

# Start all services
docker compose -f docker-compose.mac.yml up -d

3. Verify Services

# Check service status
docker compose ps

# View logs
docker compose logs -f

🚀 Usage

Web Interface

  1. Open your browser and navigate to http://localhost:3000
  2. Upload a resume file (PDF, DOCX, or TXT format)
  3. Enter job description text in the provided field
  4. Submit and watch real-time AI analysis
  5. Review results including skill extraction and matching suggestions

API Endpoints

Traditional Response

POST /api/upload
Content-Type: multipart/form-data

Parameters:
- jd_text: Job description text
- resume_file: Resume file upload

Streaming Response (Recommended)

POST /api/upload_stream
Content-Type: multipart/form-data

Parameters:
- jd_text: Job description text
- resume_file: Resume file upload

Response: Server-Sent Events (SSE) stream
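
The streaming endpoint can also be consumed programmatically. A minimal Python client sketch, using the field names above and the sample resume from docs/samples/ (requests is the only dependency; run it from the repository root):

# stream_client.py -- illustrative SSE client for the streaming endpoint
import requests

URL = "http://localhost:8080/api/upload_stream"

with open("docs/samples/resume_sample.txt", "rb") as f:
    resp = requests.post(
        URL,
        data={"jd_text": "Go programming"},
        files={"resume_file": f},
        stream=True,  # keep the connection open so SSE frames arrive as they are produced
    )

resp.raise_for_status()
for line in resp.iter_lines(decode_unicode=True):
    # SSE frames are "data: ..." lines separated by blank lines.
    if line and line.startswith("data:"):
        print(line[len("data:"):].strip(), flush=True)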

Sample Files

Test the system with sample files in the docs/samples/ directory:

  • resume_sample.txt - Sample resume content
  • jd_sample.txt - Sample job description

🔧 Configuration

Environment Variables

Backend Service

ML_URL=http://ml:8000/api/score_file_stream  # ML service endpoint

ML Service

OLLAMA_BASE_URL=http://ollama:11434  # Ollama service URL

Ollama Service

OLLAMA_GPU_LAYERS=35              # GPU acceleration layers
OLLAMA_FLASH_ATTENTION=true       # Flash attention optimization
OLLAMA_METAL=1                    # Apple Metal GPU support (macOS)
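
Per the grouping above, OLLAMA_BASE_URL is the variable the Python service reads, while the OLLAMA_* tuning variables configure the Ollama container itself. When running the ML service outside Docker, it can be pointed at a local Ollama instance, as in this illustrative snippet (the default shown matches the Compose service name; the project's actual configuration code may differ):

# Illustrative configuration read in the ML service.
import os

# Inside Docker Compose the Ollama service is reachable as "ollama";
# outside Docker, export OLLAMA_BASE_URL=http://localhost:11434 before starting uvicorn.
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://ollama:11434")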

Docker Compose Configuration

The docker-compose.mac.yml file includes:

  • Port mappings for all services
  • Volume mounts for persistent data
  • Environment variable configurations
  • Service dependencies and health checks

🧪 Testing

Test Streaming API

# Test the streaming endpoint directly
curl -v -X POST http://localhost:8080/api/upload_stream \
  -F "jd_text=Go programming" \
  -F "resume_file=@../docs/samples/resume_sample.txt" \
  --no-buffer

Test Individual Services

# Test ML service
curl http://localhost:8000/health

# Test backend
curl http://localhost:8080/health

# Test Ollama
curl http://localhost:11434/api/tags
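
The same checks can be wrapped in a small script, for example to run after docker compose up. A sketch using requests against the URLs from the curl commands above:

# healthcheck.py -- pings the three HTTP services and prints their status
import requests

SERVICES = {
    "backend": "http://localhost:8080/health",
    "ml_service": "http://localhost:8000/health",
    "ollama": "http://localhost:11434/api/tags",
}

for name, url in SERVICES.items():
    try:
        resp = requests.get(url, timeout=5)
        status = "up" if resp.ok else f"HTTP {resp.status_code}"
    except requests.RequestException as exc:
        status = f"down ({type(exc).__name__})"
    print(f"{name:<12} {status}")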

🚀 Development

Frontend Development

cd frontend
npm install
npm run dev

Backend Development

cd backend
go mod tidy
go run main.go

ML Service Development

cd ml_service
pip install -r requirements.txt
uvicorn app:app --reload --host 0.0.0.0 --port 8000

πŸ“ Project Structure

resume-match/
├── frontend/                 # React frontend application
│   ├── src/
│   │   ├── components/       # React components
│   │   └── config.ts         # API configuration
│   ├── Dockerfile            # Frontend container
│   └── package.json
├── backend/                  # Go backend service
│   ├── main.go               # Main application
│   └── Dockerfile            # Backend container
├── ml_service/               # Python ML service
│   ├── app.py                # FastAPI application
│   ├── requirements.txt      # Python dependencies
│   └── Dockerfile            # ML service container
├── infra/                    # Infrastructure configuration
│   ├── docker-compose.mac.yml   # Docker Compose for macOS
│   └── docker-compose.yml       # Standard Docker Compose
├── docs/                     # Documentation and samples
│   └── samples/              # Sample files for testing
└── README.md                 # This file

🔒 Security & Privacy

  • Local LLM: All AI processing happens locally via Ollama
  • No External APIs: No data sent to third-party services
  • File Handling: Secure file upload and processing
  • CORS Configuration: Proper cross-origin resource sharing setup
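
The exact CORS settings are not shown in this README. As an illustration only, a FastAPI service restricted to the local frontend origin could be configured as below; the allowed origin and methods are assumptions, and the project's real CORS handling may live in the Go backend instead.

# Illustrative CORS setup for a FastAPI service; not necessarily the project's configuration.
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],  # assumed frontend origin
    allow_methods=["POST"],
    allow_headers=["*"],
)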

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

πŸ“ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

📞 Support


Made with ❤️ by Junlin Chen

Transform your resume with AI-powered insights and real-time feedback.
