An edge agent framework built in pure Python.
- Connects to local language model servers: AnythingLLM, LM Studio, Nexa, and Ollama
- Easily extensible with custom tools
- Refer to the Setup Guide to configure the application.
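The custom-tool mechanism itself isn't documented in this README; as a rough sketch of what such an extension point typically looks like (the names `TOOLS`, `register_tool`, and `run_tool` below are illustrative, not this project's actual API), a tool can be a plain Python function registered under a name:

```python
# Hypothetical sketch of a tool registry -- names are illustrative,
# not this project's actual API.
TOOLS = {}

def register_tool(name):
    """Decorator that registers a function under a tool name."""
    def decorator(func):
        TOOLS[name] = func
        return func
    return decorator

@register_tool("add")
def add(a, b):
    """Example tool: add two numbers."""
    return a + b

def run_tool(name, *args):
    """Look up a registered tool by name and invoke it."""
    return TOOLS[name](*args)

print(run_tool("add", 2, 3))  # prints 5
```

A registry like this keeps the agent core decoupled from individual tools: adding a capability means registering one more function, not editing the dispatch logic.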
- Create a virtual environment and install Python dependencies:
```shell
python3 -m venv venv

# Windows PowerShell
.\venv\Scripts\Activate.ps1

# Mac/Linux
source venv/bin/activate

pip install openai httpx pyyaml requests pytest
```
- Run the tests to verify your configuration:
```shell
# Windows PowerShell
.\scripts\run_tests.ps1 -c -l

# Mac/Linux
./scripts/run_tests.sh -c -l
```
- Configure your local LLM server settings in `config.yaml`.
- Run your chosen model server (e.g., AnythingLLM, LM Studio, Nexa, or Ollama).
- Run the local agent:
```shell
python main.py
```
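The exact schema of `config.yaml` is defined in the Setup Guide; the fragment below is only a plausible shape to show the kind of settings involved (every key and value here is an assumption, not this project's actual schema):

```yaml
# Illustrative only -- consult the Setup Guide for the real schema.
server:
  provider: ollama               # anythingllm | lmstudio | nexa | ollama
  base_url: http://localhost:11434
  model: llama3
```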
I welcome contributions! Please follow these steps:
- Fork the repository.
- Create a new branch for your feature or bug fix.
- Make your changes, commit them, and push to your branch.
- Create a pull request explaining your changes.
Special thanks to these community contributors:
This project is licensed under the MIT License. See the LICENSE file for details.
