
Light AI

Easy to use single-file executable to run LLMs locally on your machine

Getting Started

  1. Download the Light AI server binary for your operating system (see Download below).
  2. Run ./light-ai (light-ai.exe on Windows) from the command line.
  3. Open Swagger (http://0.0.0.0:8000/swagger) and pick the id of one of the available models from GET /v1/models.
  4. Restart the server, passing the model id (for example: ./light-ai -m llama3.2-1b-instruct).

That's it! Your personal AI server is ready to use. Try POST /v1/ask and POST /v1/completion to verify.
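
For example, a first session might look like this (a sketch; the model id is one of those returned by GET /v1/models):

./light-ai                                  # start with defaults
curl http://0.0.0.0:8000/v1/models          # list available model ids
./light-ai -m llama3.2-1b-instruct          # restart with the chosen model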

Download

macOS (Apple Silicon): light-ai
macOS (Intel): light-ai
Linux: light-ai
Windows: light-ai.exe

Usage

./light-ai -p 8000 -m llama3.2-1b-instruct

Argument        Explanation
-p, --port      Port to listen on (Optional)
-m, --model     Model name (Optional)

API Endpoints

POST /v1/ask: Get a quick reply for a given prompt

Options:

prompt: The prompt to get a reply for (Required)

model: Model name (Optional)

grammar: Grammar for grammar-based sampling (Optional)

schema: JSON schema to constrain the shape of the response (Optional)

For example:

curl http://0.0.0.0:8000/v1/ask --header 'Content-Type: application/json' --data '{"prompt": "Is an apple more expensive than a banana?"}'
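
The schema option can be used to force structured output. A minimal sketch, assuming schema accepts a standard JSON Schema object alongside the prompt:

curl http://0.0.0.0:8000/v1/ask --header 'Content-Type: application/json' --data '{"prompt": "Is an apple more expensive than a banana? Answer with a verdict.", "schema": {"type": "object", "properties": {"verdict": {"type": "string"}}, "required": ["verdict"]}}'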

POST /v1/completion: Return the predicted completion for a given prompt.

Options:

prompt: The prompt to complete, as a string (Required)

model: Model name (Optional)

For example:

curl http://0.0.0.0:8000/v1/completion --header 'Content-Type: application/json' --data '{"prompt": "Here is a list of sweet fruits:"}'

GET /v1/models: List the available models
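
For example:

curl http://0.0.0.0:8000/v1/models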

POST /v1/models/pull: Pull a model

Options:

model: Model name (Required)
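
For example (assuming the same JSON body convention as the other endpoints):

curl http://0.0.0.0:8000/v1/models/pull --header 'Content-Type: application/json' --data '{"model": "llama3.2-1b-instruct"}'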
