pauxd26/ai-recruiter
AI Recruiter

An AI-driven recruitment tool that scrapes and analyzes GitHub and Google Scholar profiles to identify top candidates for AI and ML roles.

Project Structure

AIrecruiter/
├── src/
│   ├── core/           # Core data models and classes
│   │   ├── candidate.py
│   │   ├── person.py
│   │   ├── data.py
│   │   └── prof.py
│   ├── scrapers/       # Web scraping modules
│   │   ├── github.py
│   │   ├── googlescholar.py
│   │   ├── linkedin.py
│   │   └── authors.py
│   ├── scanners/       # Scanning and filtering modules
│   │   ├── scangit.py
│   │   ├── scangs.py
│   │   └── scanauth.py
│   └── utils/          # Utility functions
│       ├── llm.py
│       ├── normalise.py
│       ├── combiner.py
│       └── query_classifier.py
├── data/               # Data files and CSV outputs
├── templates/          # Flask HTML templates
├── static/             # Static CSS files
├── app.py              # Main Flask application
└── requirements.txt    # Python dependencies

Features

  • Query Classification: Automatically classifies recruitment queries
  • GitHub Profile Analysis: Scores GitHub profiles based on AI/ML relevance
  • Google Scholar Integration: Fetches and analyzes academic profiles
  • Author Filtering: Filters and classifies co-authors from research papers
  • Location-Based Search: Supports location-specific candidate searches
  • Web Interface: Flask-based web UI for easy interaction
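As a rough illustration of how GitHub profile scoring might work, here is a minimal keyword-weighting sketch. The keyword table, weights, and the `score_profile` signature are assumptions for illustration, not the actual logic in src/scanners/scangit.py.

```python
# Hypothetical keyword weights for AI/ML relevance; the real scanner
# may use different signals (repo contents, stars, commit activity).
AI_ML_KEYWORDS = {
    "tensorflow": 3, "pytorch": 3, "machine-learning": 2,
    "deep-learning": 2, "nlp": 2, "computer-vision": 2, "python": 1,
}

def score_profile(repo_topics, followers):
    """Score a profile from its repo topics, plus a small follower bonus."""
    score = sum(AI_ML_KEYWORDS.get(t.lower(), 0) for t in repo_topics)
    score += min(followers, 100) / 100  # follower bonus capped at 1.0
    return score
```

For example, `score_profile(["PyTorch", "NLP"], 250)` returns 6.0 (3 + 2 for the topics, plus the capped follower bonus of 1.0).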

Installation

  1. Clone the repository
  2. Create a virtual environment:
    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  3. Install dependencies:
    pip install -r requirements.txt

Usage

Running the Web Application

python app.py

The application will start at http://localhost:5000.
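A minimal sketch of how app.py might wire the web UI to the pipeline. The route, template name, and `run_pipeline` helper are assumptions for illustration; the actual app.py may be structured differently.

```python
from flask import Flask, render_template, request

app = Flask(__name__)

def run_pipeline(query):
    # Placeholder for the real classify -> fetch -> score -> filter chain.
    return []

@app.route("/", methods=["GET", "POST"])
def index():
    results = None
    if request.method == "POST":
        query = request.form["query"]
        results = run_pipeline(query)
    return render_template("index.html", results=results)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```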

Example Queries

  • "Find top 6 students who have worked on TensorFlow and have a strong GitHub presence in Boston."
  • "Recruit top 5 students in California who have worked on computer vision projects."
  • "Find top 8 programmers in Seattle who have worked on GPT-3 and have published papers on NLP."

Workflow

  1. Query Classification: Input query is classified to determine search type (GitHub, Scholar, Student)
  2. Profile Fetching: Relevant profiles are fetched based on classification
  3. Scoring: Profiles are scored using relevance algorithms
  4. Filtering: Results are filtered and normalized
  5. Output: Results are saved to CSV files in the data/ directory
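The five steps above can be sketched as one pipeline. Every helper here (`classify_query`, the keyword heuristic, the CSV columns) is an assumption about the modules under src/, not their actual interfaces.

```python
import csv

def classify_query(query):
    """Step 1: crude keyword classification into github / scholar / student."""
    q = query.lower()
    if "paper" in q or "published" in q:
        return "scholar"
    if "student" in q:
        return "student"
    return "github"

def run_pipeline(query, profiles, out_path="data/results.csv"):
    search_type = classify_query(query)                            # 1. classify
    fetched = [p for p in profiles if p["source"] == search_type]  # 2. fetch
    for p in fetched:                                              # 3. score
        p["score"] = len(p.get("keywords", []))
    ranked = sorted(fetched, key=lambda p: p["score"], reverse=True)  # 4. filter
    with open(out_path, "w", newline="") as f:                     # 5. output
        writer = csv.DictWriter(f, fieldnames=["name", "score"])
        writer.writeheader()
        for p in ranked:
            writer.writerow({"name": p["name"], "score": p["score"]})
    return ranked
```

Here `classify_query("Find top 8 programmers ... published papers on NLP")` routes to the Scholar path, matching the third example query above.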

Requirements

  • Python 3.7+
  • See requirements.txt for full dependency list

License

MIT License
