
LLMASP

LLMASP is a framework that tries to unify the potential of Large Language Models (LLMs) and the reasoning power of Answer Set Programming (ASP), a form of declarative programming oriented towards difficult search problems.

PyPI Installation

Install the package from PyPI using pip by following the instructions on the package page.
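A minimal install command, assuming the package is published on PyPI under the name llmasp (the name is inferred from the repository, not stated on the package page):

pip install llmasp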

Local Installation

Install poetry following the official documentation.

git clone https://github.com/LewAshby/llmasp.git
cd llmasp
poetry install

Download and install Ollama following the official documentation, then run

ollama serve
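To check that the server is up, you can query the default endpoint (this verification step is a suggestion, not part of the original instructions); if the server is running it answers with a short status message ("Ollama is running"):

curl http://localhost:11434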

Make sure you have downloaded the model before using it, e.g.

ollama pull llama3

See the Ollama library for all available models.

Usage

If you are running Ollama locally, the default address and port will be http://localhost:11434; see the Ollama documentation for more details about Ollama's OpenAI compatibility. If Ollama is installed on an external server, create an SSH local port forwarding.

ssh -L <local_port>:<remote_server>:<remote_port> <user>@<remote_server>

Ollama binds to 127.0.0.1 on port 11434 by default.
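For example, since Ollama binds to 127.0.0.1, forwarding the default port might look like this (user and host are placeholders):

ssh -L 11434:127.0.0.1:11434 <user>@<remote_server>

LLMASP can then reach the remote Ollama instance at http://localhost:11434 as if it were local.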

Use the command-line help for usage instructions.

cd llmasp
poetry run llmasp --help

Run the example case

poetry run llmasp --example -m llama3 -s http://localhost:11434/v1

Acknowledgments

License

Apache-2.0 license
