# quantized-models

Here are 3 public repositories matching this topic...

Mistral Legal Advisor is a full-stack AI application that connects a fine-tuned Mistral language model running on Google Colab (GPU) with a local Flask-based frontend. The system generates comprehensive legal document requirements for startups based on user inputs, using an ngrok tunnel to securely bridge local and cloud environments.

  • Updated Dec 18, 2025
  • Jupyter Notebook
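The local-frontend-to-Colab bridge described above can be sketched in Python. Everything here is an assumption for illustration: the ngrok URL is a placeholder, and the JSON schema (`prompt`, `max_new_tokens`, a `/generate` route) is hypothetical, not the repository's actual API.

```python
# Hypothetical sketch: a local frontend posts a prompt to a fine-tuned
# Mistral backend running on Colab, reached through an ngrok tunnel.
import json
import urllib.request

# Placeholder: a real deployment would paste the ngrok forwarding
# address printed by the Colab notebook.
NGROK_URL = "https://example.ngrok-free.app/generate"

def build_request(startup_type: str, jurisdiction: str) -> urllib.request.Request:
    """Assemble the POST request the Colab backend is assumed to accept."""
    payload = {
        "prompt": (
            f"List the legal documents a {startup_type} startup "
            f"operating in {jurisdiction} must prepare."
        ),
        "max_new_tokens": 512,  # assumed generation parameter
    }
    return urllib.request.Request(
        NGROK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request requires the Colab tunnel to be live:
# with urllib.request.urlopen(build_request("fintech", "Delaware")) as resp:
#     print(json.loads(resp.read()))
```

The point of the tunnel is that the Flask frontend only ever talks to a public HTTPS URL, so no ports need to be opened on the local machine.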

A privacy-first, fully offline neural machine translation application supporting 200+ languages. Built on Meta's NLLB-200 model, this tool runs entirely in your browser without requiring server infrastructure or internet connectivity after initial model download.

  • Updated Dec 10, 2025
  • HTML
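One detail worth knowing when working with NLLB-200 is that it identifies its 200+ languages with FLORES-200 codes (an ISO 639-3 code plus a script tag, e.g. `eng_Latn` for English). A minimal sketch of mapping common two-letter codes to NLLB codes; the mapping table here is a small illustrative subset, not the repository's code:

```python
# FLORES-200 codes combine language and script, so the same language can
# appear twice (e.g. Chinese in Simplified vs Traditional script).
# Illustrative subset only; NLLB-200 defines 200+ such codes.
NLLB_CODES = {
    "en": "eng_Latn",
    "fr": "fra_Latn",
    "de": "deu_Latn",
    "es": "spa_Latn",
    "zh": "zho_Hans",
}

def to_nllb_code(iso2: str) -> str:
    """Map a two-letter ISO 639-1 code to an NLLB-200/FLORES-200 code."""
    try:
        return NLLB_CODES[iso2.lower()]
    except KeyError:
        raise ValueError(f"no NLLB-200 mapping for {iso2!r}") from None
```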

A complete, menu-driven AI model interface for Windows that simplifies running local GGUF language models with llama.cpp. This tool automatically manages dependencies, provides multiple interaction modes, and prioritizes user privacy through fully offline operation.

  • Updated Jan 30, 2026
  • PowerShell
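The GGUF container format that llama.cpp consumes has a simple, checkable header: the file begins with the 4-byte magic `GGUF`, followed by a little-endian `uint32` format version. A tool that manages local models might sanity-check files before launching llama.cpp; a minimal sketch (the validation helper itself is ours, not taken from the repository):

```python
# Quick sanity check of a GGUF file header before handing it to llama.cpp.
import struct

def read_gguf_version(path: str) -> int:
    """Return the GGUF format version, or raise if the file is not GGUF."""
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            raise ValueError(f"{path} is not a GGUF file (magic={magic!r})")
        # Version is a little-endian unsigned 32-bit integer.
        (version,) = struct.unpack("<I", f.read(4))
        return version
```

Catching a truncated or mislabeled download this way gives a clearer error than whatever llama.cpp reports when fed a bad file.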
