A specialized AI service powered by Groq (Llama 3.1) and LangGraph, designed to answer astronomy-related questions with real-time web search capabilities via Tavily.
This repository houses the intelligent backend for the GalacticView ecosystem.
It provides two modes of interaction:
- CLI Tool: For testing and direct interaction in the terminal.
- REST API Server: A backend service that exposes the agent to the frontend application.
The core agent logic is encapsulated in the `galacticview_bot` package, leveraging LangGraph for reasoning loops and Groq for high-speed inference.
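Very roughly, such a reasoning loop is a decide-search-synthesize cycle. A minimal pure-Python sketch (illustrative only: `needs_search`, `tavily_search`, and `synthesize` are hypothetical stand-ins, not the actual LangGraph/Groq implementation in this package):

```python
def needs_search(question: str) -> bool:
    """Reasoning step: decide whether external information is required.

    Hypothetical keyword heuristic; the real agent lets the LLM decide
    inside a LangGraph loop.
    """
    return any(word in question.lower() for word in ("today", "news", "current"))


def tavily_search(question: str) -> list[str]:
    """Tool step: stand-in for a Tavily Search API call."""
    return [f"search result for: {question}"]


def synthesize(question: str, context: list[str]) -> dict:
    """Synthesis step: stand-in for the Groq-hosted LLM producing the
    structured answer (title, content, key_metrics)."""
    content = " ".join(context) if context else "answered from internal knowledge"
    return {"title": question[:40], "content": content, "key_metrics": []}


def run_agent(question: str) -> dict:
    """Input -> reasoning -> optional tool call -> synthesis, as one pass."""
    context = tavily_search(question) if needs_search(question) else []
    return synthesize(question, context)
```

Here `run_agent("any news today?")` would route through the search stand-in, while a general astronomy question would be answered from the model's internal knowledge.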
- Language: Python
- Web Framework: FastAPI (Server)
- Orchestration: LangGraph (LangChain)
- Inference Engine: Groq API
- Model: Llama 3.1 (via Groq)
- Tools: Tavily Search API
- Dependency Management: Poetry
Before running the agent, ensure you have the necessary tools and API keys.
- Python: Version >=3.12, <3.14
- Poetry: Installation Guide.
- Groq API Key: Sign up at console.groq.com.
- Tavily API Key: Sign up at tavily.com.
1. Clone the repository:

   ```shell
   git clone https://github.com/levilevente/GalacticView-agent.git
   cd GalacticView-agent
   ```

2. Install Python dependencies. Use Poetry to install the environment defined in `pyproject.toml`:

   ```shell
   poetry install
   ```
3. Configure environment variables. Create a `.env` file in the root directory and add your keys:

   ```shell
   # .env file content
   GROQ_API_KEY=gsk_your_groq_key_here
   TAVILY_API_KEY=tvly-your_tavily_key_here
   MODEL_NAME=llama-3.1-70b-versatile
   LLM_LOCAL=False  # Set to True if using a local Ollama instance
   ```
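In Python these variables can then be read from the environment. A minimal sketch using only the standard library (the variable names come from the `.env` example above; the actual loading logic inside `galacticview_bot` may differ, e.g. it may use `python-dotenv` or pydantic settings):

```python
import os


def load_settings() -> dict:
    """Read agent configuration from environment variables.

    Assumes the variable names from the .env example above; the actual
    loader in galacticview_bot may differ.
    """
    return {
        "groq_api_key": os.environ["GROQ_API_KEY"],      # required
        "tavily_api_key": os.environ["TAVILY_API_KEY"],  # required
        "model_name": os.environ.get("MODEL_NAME", "llama-3.1-70b-versatile"),
        # LLM_LOCAL is a string in the .env file; normalise it to a bool
        "llm_local": os.environ.get("LLM_LOCAL", "False").lower() == "true",
    }
```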
You can run the application in two ways using the scripts defined in `pyproject.toml`.
CLI mode, best for testing the agent logic directly in your terminal:

```shell
poetry run galacticview_cli
```

Server mode, which starts the web server (located in `server/serve.py`) to accept HTTP requests:

```shell
poetry run galacticview_app
```

When running the server (`poetry run galacticview_app`), the agent logic is exposed via a REST endpoint.
`POST /chat`
Receives a user question and returns the structured agent response.
```json
{
  "question": "How many stars are approximately between earth and moon?",
  "date": "2025-12-01"
}
```

Example response:

```json
{
  "title": "Stars and the Moon",
  "content": "There are no stars between Earth and the Moon. The closest star, Proxima Centauri, is over 4 light-years away. This means that the Moon is in the Earth's shadow and does not reflect the light of any nearby stars. The Moon's surface is illuminated by the Sun's light, which is the only star that is close enough to be visible from the Moon.",
  "key_metrics": [
    "4 light-years",
    "Proxima Centauri",
    "Earth's shadow"
  ]
}
```

Structure (for more details see `agents.py`):
1. Input: The agent receives a query via the CLI or API.
2. Reasoning (LangGraph): The agent determines if it has the internal knowledge to answer or if it needs external information.
3. Tool Usage (Tavily): If the topic requires current events (e.g., "news today"), it calls the Tavily Search API.
4. Synthesis (Groq): The LLM synthesizes the search results into a structured JSON format containing a summary, title, and key metrics.
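With the server running, the `/chat` endpoint can be exercised from Python using only the standard library. A sketch, assuming uvicorn's default host and port (`127.0.0.1:8000`, not specified by this repository):

```python
import json
import urllib.request


def build_chat_payload(question: str, date: str) -> bytes:
    """Encode a /chat request body matching the example in this README."""
    return json.dumps({"question": question, "date": date}).encode("utf-8")


def ask_agent(question: str, date: str,
              base_url: str = "http://127.0.0.1:8000") -> dict:
    """POST a question to the /chat endpoint and return the parsed JSON reply.

    base_url is an assumption (uvicorn's usual default bind); adjust it to
    match your deployment.
    """
    req = urllib.request.Request(
        f"{base_url}/chat",
        data=build_chat_payload(question, date),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Per the example response above, the returned dict carries `title`, `content`, and `key_metrics` keys.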
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.