A question generator for coffee chats, powered by the Mistral 7B large language model. Scrapes information from provided LinkedIn profiles using BeautifulSoup and Playwright, then generates and displays relevant questions based on the collected information.


ConnectIn

About

If you've ever thought about having a 'coffee chat' with a cracked LinkedIn user but didn't know what to say, ConnectIn is for you!

Features

  • AI-powered coffee chat question generation
  • LinkedIn profile analysis
  • Personalized networking conversation starters
  • RESTful API backend
  • Modern web interface

Technology Stack

Backend

  • FastAPI
  • Mistral 7B AI Model (via llama-cpp-python)
  • Playwright for web scraping
  • BeautifulSoup for HTML parsing
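
The scraping flow pairs these two tools: Playwright renders the JavaScript-heavy LinkedIn page, and BeautifulSoup parses the resulting HTML. A minimal sketch of the parsing half (the function name and CSS selector here are illustrative assumptions, not the actual scraper.py API):

```python
# Illustrative sketch: BeautifulSoup parsing of Playwright-rendered HTML.
# The selector ".text-body-medium" is an assumption, not a documented one.
from bs4 import BeautifulSoup

def extract_headline(html: str) -> str:
    """Pull a profile headline out of rendered HTML (hypothetical helper)."""
    soup = BeautifulSoup(html, "html.parser")
    node = soup.select_one(".text-body-medium")
    return node.get_text(strip=True) if node else ""

# In the real pipeline, `html` would come from Playwright, e.g.:
#   page.goto(profile_url); html = page.content()
sample = '<div class="text-body-medium"> Software Engineering Student </div>'
print(extract_headline(sample))  # → Software Engineering Student
```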

Frontend

  • [frontend technologies here]

Developers

Backend Developer: Majock Bim

  • API development
  • AI model integration
  • LinkedIn scraping implementation

Frontend Developer: Carson Carrasco

  • [info here]

Getting Started

Prerequisites

  • Python 3 and pip
  • Node.js and npm (for the frontend)
  • LinkedIn account credentials

Installation

  1. Clone the repository:
git clone https://github.com/majockbim/connectin.git
cd connectin
  2. Backend setup:
cd backend
pip install -r requirements.txt
python -m playwright install chromium
  3. Environment configuration:
# Create .env file in backend directory
LINKEDIN_EMAIL=your_linkedin_email@example.com
LINKEDIN_PASSWORD=your_linkedin_password
  4. Download the AI model:
    • Follow the instructions in mistral_README.txt to install mistral-7b-instruct-v0.1.Q4_K_M.gguf in backend/app/models/
  5. Start the backend server:
cd connectin
python -m backend.main
  6. Frontend setup:
cd frontend
npm run dev

# frontend installation steps here
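
The backend needs to read the credentials from the .env file created above. A stdlib-only sketch of how that could work (the real project may use a library such as python-dotenv instead; `load_env` is a hypothetical helper):

```python
# Hypothetical sketch: parse simple KEY=VALUE lines from a .env file,
# skipping blank lines and # comments. Not the project's actual loader.
import tempfile

def load_env(path: str) -> dict:
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values

# Demo with a throwaway file (the real file lives in backend/.env):
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write("# credentials\nLINKEDIN_EMAIL=me@example.com\nLINKEDIN_PASSWORD=secret\n")
env = load_env(fh.name)
print(env["LINKEDIN_EMAIL"])  # → me@example.com
```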

API Documentation

Once the backend is running, visit:

  • API docs: http://localhost:8000/docs
  • Health check: http://localhost:8000

Main Endpoint

POST /generate
Content-Type: application/json

{
  "learner_url": "https://linkedin.com/in/student-profile",
  "mentor_url": "https://linkedin.com/in/mentor-profile"
}
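
With the backend running, the endpoint can be called from any HTTP client. A stdlib-only sketch (the response shape is an assumption; only the request body above is documented):

```python
# Hedged sketch of a client for POST /generate. Assumes the backend is
# listening on localhost:8000 as described in the docs above.
import json
import urllib.request

payload = {
    "learner_url": "https://linkedin.com/in/student-profile",
    "mentor_url": "https://linkedin.com/in/mentor-profile",
}
req = urllib.request.Request(
    "http://localhost:8000/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment once the server is running:
# with urllib.request.urlopen(req) as resp:
#     questions = json.loads(resp.read())
print(req.get_method(), req.full_url)  # → POST http://localhost:8000/generate
```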

Project Structure

connectin/
├── backend/
│   ├── app/
│   │   ├── prompt_engine.py
│   │   ├── models/
│   │   │   └── mistral-7b-instruct-v0.1.Q4_K_M.gguf
│   │   └── utils/
│   │       └── scraper.py
│   ├── main.py
│   └── requirements.txt
│
├── frontend/
│   └── [Frontend files]
└── README.md

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/new-feature)
  3. Commit your changes (git commit -am 'Add new feature')
  4. Push to the branch (git push origin feature/new-feature)
  5. Create a Pull Request

Acknowledgments

Honourable Mention: Junior Assani for creating the first (and only) pull request.
