| title | emoji | colorFrom | colorTo | sdk | sdk_version | python_version | app_file | pinned |
|---|---|---|---|---|---|---|---|---|
| Permacore | 🌿 | green | blue | gradio | 6.2.0 | 3.13 | permacore/app.py | true |
This project implements a Retrieval-Augmented Generation (RAG) chatbot to provide accurate, context-aware answers about sustainable farming practices, ecological design principles, and regenerative systems.
It uses Mistral AI for the LLM backend, LangChain to implement the RAG system, and Gradio for the UI.
All data sources are either open-source or usage permissions have been granted by the respective publishers.
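For orientation, the sketch below shows one common way such a retrieval-augmented pipeline is wired together with LangChain and Mistral AI. It is a minimal illustrative example, not the project's actual `RAGAgent` implementation; the toy documents, prompt, and model name are assumptions.

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_mistralai import ChatMistralAI, MistralAIEmbeddings

# Toy knowledge base; the real project indexes permaculture source material.
texts = [
    "The three permaculture ethics are earth care, people care and fair share.",
    "Swales are level ditches on contour that slow, spread and sink rainwater.",
]

# MistralAIEmbeddings reads MISTRAL_API_KEY from the environment.
vector_store = InMemoryVectorStore.from_texts(texts, embedding=MistralAIEmbeddings())
retriever = vector_store.as_retriever()

def format_docs(docs):
    # Join retrieved chunks into a single context string for the prompt.
    return "\n\n".join(doc.page_content for doc in docs)

prompt = ChatPromptTemplate.from_template(
    "Answer the question using only the context below.\n\n"
    "Context:\n{context}\n\nQuestion: {question}"
)
llm = ChatMistralAI(model="mistral-small-latest")  # model choice is an assumption

# Retrieve relevant passages, stuff them into the prompt, and generate an answer.
chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)
print(chain.invoke("What are the three ethics of permaculture?"))
```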
The project aims to:
- Provide accessible information about permaculture principles and practices
- Support learning and decision-making in regenerative agriculture
- Offer evidence-based guidance on sustainable development techniques
- Enable natural language queries about complex ecological topics
Key features:
- Context-aware responses and source citations using the RAG architecture (a minimal retrieval sketch follows this list)
- A knowledge base covering permaculture design, soil health, water management, and more
- A conversational interface for exploring sustainable agriculture topics
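The sketch below illustrates how source citations can be recovered from the retrieval step: each indexed chunk carries metadata naming its origin, which is returned alongside the matched text. This is an assumed minimal example; the file names and document contents are placeholders, not the project's actual knowledge base.

```python
from langchain_core.documents import Document
from langchain_core.vectorstores import InMemoryVectorStore
from langchain_mistralai import MistralAIEmbeddings

# Each chunk records where it came from, so answers can cite their sources.
docs = [
    Document(
        page_content="Swales are level ditches on contour that slow, spread and sink rainwater.",
        metadata={"source": "water_management.md"},  # illustrative file name
    ),
    Document(
        page_content="The three permaculture ethics are earth care, people care and fair share.",
        metadata={"source": "ethics.md"},  # illustrative file name
    ),
]

# MistralAIEmbeddings reads MISTRAL_API_KEY from the environment.
store = InMemoryVectorStore.from_documents(docs, embedding=MistralAIEmbeddings())

for hit in store.similarity_search("How do swales manage water?", k=1):
    print(f"{hit.page_content}  [source: {hit.metadata['source']}]")
```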
An interactive demo is available at https://huggingface.co/spaces/mk-mccann/Permacore
A conda environment is recommended for installation. Then install the dependencies with `pip install -r requirements.txt`.

The UI is powered by Gradio. A local UI can be started by running `python app.py`.
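Roughly, the app wraps the agent in a Gradio chat interface. The sketch below is an assumed outline, not the contents of `permacore/app.py`; it reuses the `RAGAgent` class shown in the usage example that follows.

```python
import gradio as gr
from dotenv import load_dotenv

from permacore.rag_agent import RAGAgent

load_dotenv()  # makes MISTRAL_API_KEY available to the agent
agent = RAGAgent()

def respond(message, history):
    # Gradio manages the chat history; the agent is queried with the new message.
    return agent.query(message)

gr.ChatInterface(fn=respond, title="Permacore").launch()
```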
```python
# Minimal example usage
from os import getenv

from dotenv import load_dotenv

from permacore.rag_agent import RAGAgent

load_dotenv()
mistral_api_key = getenv("MISTRAL_API_KEY")

# Initialize the RAG agent
agent = RAGAgent()

# For a single query:
response = agent.query("What are the three ethics of permaculture?")
print(response)

# For interactive chat:
agent.cli_chat()
```

Contributions are welcome! Please open an issue or submit a pull request.
Suggestions for resources to share are also welcome!
This project is licensed under GPL-3.0.