
LocalSLM

Hey πŸ‘‹, welcome to LocalSLM!
This repo is all about running Small Language Models (SLMs) locally – think LLMs like ChatGPT, Claude, or Gemini, but way more lightweight and privacy-friendly. Perfect for Chromebooks.


πŸš€ Features

  • Local Inference: Run SLMs on your own machine, no cloud needed.
  • Plug-n-Play: Add or swap out models easily.
  • Extensible: Designed for quick hacks, tweaks, and upgrades.
  • Privacy First: All your data stays on your device (I'm also too lazy to run a server).

πŸ› οΈ Getting Started

  • Download the HTML file, open it in your browser – done.
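
Some browsers restrict pages opened via file:// from fetching local model files, so if the page loads but models don't, serving the folder over HTTP usually fixes it. A minimal sketch (assumes Python 3 is installed; the port and filename are arbitrary, use whatever the repo actually ships):

```shell
# Serve the folder containing the downloaded HTML file in the background,
# then open http://localhost:8000/ in your browser.
python3 -m http.server 8000 &
SERVER_PID=$!
# ...browse to the page, then stop the server when you're done:
kill "$SERVER_PID"
```

Any static file server works the same way; `http.server` is just the zero-install option.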

βš™οΈ Configuration

  • Model Selection: Drop your models into models/ and point the config to 'em.
  • Custom Prompts: Tweak the prompts in the source or config for your use-case.
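
For illustration, a config entry tying those two together might look something like this – the field names here are hypothetical, not the repo's actual schema, so check the source for the real keys:

```json
{
  "model": "models/my-slm.gguf",
  "prompt": "You are a friendly assistant running fully on-device."
}
```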

πŸ™Œ Contributing

Got ideas? Found a bug? Wanna help out?

  1. Fork it 🍴
  2. Make your changes ✏️
  3. Submit a PR πŸš€

πŸ’¬ FAQ

Q: What even is a "Small Language Model"?
A: A language model small enough to run on everyday hardware – like a mini GPT, way lighter, and it runs locally!

Q: Can I use my own model?
A: Yup! Just drop it in /models and update the config.

Q: Is this production-ready?
A: Nah, this is for local AI and experiments. Use at your own risk!


πŸ“’ Credits

Made with πŸ’» by Henry and the Robots.


πŸͺͺ License

MIT – do whatever, just don’t sue me πŸ˜… (BCPS :| )


Stay curious, stay local!