ML Study Assistant

A locally hosted ML study assistant that runs entirely on your own machine: no API keys, no internet connection, and no data sent to external servers. Built with Python and Ollama, accelerated by a local NVIDIA GPU via CUDA.

Features

  • Conversational explanations of machine learning concepts
  • Analogies tailored to make abstract ideas concrete
  • On-demand quizzing with the `quiz` command
  • Fully offline after initial model download

Tech Stack

  • Python
  • Ollama
  • LLaMA 3.2 (local model)
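How the app wires Python to Ollama isn't shown in this README, but a Python program typically reaches a locally running Ollama instance through its REST API (by default at port 11434). A minimal sketch using only the standard library; `build_request` is a hypothetical helper, not necessarily how `main.py` does it:

```python
import json

# Ollama's default local endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="llama3.2"):
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

# Sending it (requires a running Ollama server, so commented out here):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=build_request("What is overfitting?").encode(),
#     headers={"Content-Type": "application/json"},
# )
# reply = json.loads(urllib.request.urlopen(req).read())["response"]
```

With `"stream": False` the server returns the full answer in a single JSON object; streaming token-by-token is the default and suits an interactive chat UI better.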

Requirements

  • Ollama installed
  • Python 3.x
  • NVIDIA GPU recommended for faster inference

Setup

  1. Clone this repository
  2. Install dependencies: `pip install -r requirements.txt`
  3. Pull the model: `ollama pull llama3.2`
  4. Run: `python main.py`

Usage

Type any machine learning concept to get an explanation. Type `quiz` at any time to be quizzed on the last topic discussed. Type `quit` or `exit` to close the assistant.
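The contents of `main.py` aren't shown here, but the command handling described above might look like the sketch below. `handle_input` and `ask_model` are hypothetical names, and the model call is passed in as a function so the control flow stands on its own:

```python
def handle_input(text, ask_model, last_topic=None):
    """Route one line of user input.

    Returns (reply, new_last_topic, keep_running). `ask_model` is any
    callable that takes a prompt string and returns the model's answer.
    """
    cmd = text.strip().lower()
    if cmd in ("quit", "exit"):
        return ("Goodbye!", last_topic, False)
    if cmd == "quiz":
        if last_topic is None:
            return ("Ask about a concept first, then type 'quiz'.", last_topic, True)
        # Quiz on the most recently discussed topic.
        return (ask_model(f"Write one quiz question about {last_topic}."), last_topic, True)
    # Anything else is treated as a concept to explain; remember it for `quiz`.
    return (ask_model(f"Explain the machine learning concept: {text}"), text, True)
```

In the real app, `ask_model` would wrap the call to the local llama3.2 model via Ollama, and a simple `while` loop would feed user input to this function until it returns `keep_running = False`.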
