A modern, Streamlit-powered chatbot that combines Groq's Llama 3.3 70B with Google Gemini 1.5 and on-demand web search to answer anything from trivia to deep technical questions.
It features multi-chat sessions, an elegant UI/UX, and secure API-key handling, all in a single Python file.
Live demo: try it instantly on Hugging Face at https://huggingface.co/spaces/Ashkchamp/General_Knowledge_Assistant
| Capability | Details |
|---|---|
| Dual-LLM pipeline | • Groq Llama 3.3 70B (via `langchain_groq`) for core reasoning and responses. • Gemini 1.5 Pro for meta-reasoning (deciding when to web-search) and safe-content filtering. |
| Smart Web Search | Uses DuckDuckGo through `langchain_community.tools.DuckDuckGoSearchRun` only when Gemini signals `<SEARCH>` (see the routing sketch below the table). |
| Persistent Chats | Auto-saves each conversation (name, messages, timestamp) in Streamlit Session State; switch, rename, or delete with one click. |
| Polished UI | Custom CSS for clean, mobile-friendly chat bubbles, avatars, and a subtle typing indicator. |
| Zero-backend setup | Runs locally with no database or server-side code required. |
| Secure Keys | API keys loaded from a local .env file (never stored in code or state). |
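The routing between the two models works roughly like the sketch below. This is a hedged illustration, not the exact code in `app.py`: names such as `router` and `answer`, and the exact decision prompt, are assumptions; only the libraries and the `<SEARCH>`/`NO_SEARCH` convention come from the description above.

```python
# Sketch of the dual-LLM routing described above (illustrative names).
import os
import google.generativeai as genai
from langchain_groq import ChatGroq
from langchain_community.tools import DuckDuckGoSearchRun

genai.configure(api_key=os.getenv("GEMINI_API_KEY"))
router = genai.GenerativeModel("gemini-1.5-pro")      # meta-reasoning: search or not?
llm = ChatGroq(model="llama-3.3-70b-versatile",
               api_key=os.getenv("GROQ_API_KEY"))     # core answer model
search = DuckDuckGoSearchRun()                        # DuckDuckGo tool

def answer(query: str) -> str:
    # Ask Gemini whether the query needs fresh web data.
    decision = router.generate_content(
        f"Reply with '<SEARCH> keywords' or 'NO_SEARCH' for this question: {query}"
    ).text
    context = ""
    if decision.strip().startswith("<SEARCH>"):
        keywords = decision.replace("<SEARCH>", "").strip()
        context = search.run(keywords)                # top DuckDuckGo results
    prompt = f"Search Results:\n{context}\n\nQuestion: {query}" if context else query
    return llm.invoke(prompt).content                 # Groq writes the final answer
```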
# 1. Clone
git clone https://github.com/<your-handle>/general-knowledge-assistant.git
cd general-knowledge-assistant
# 2. Create & activate a virtual environment (recommended)
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# 3. Install dependencies
pip install -r requirements.txt
# 4. Add your keys
cp .env.example .env # then edit .env
# ────────────────────────────────
# GROQ_API_KEY=your_groq_key_here
# GEMINI_API_KEY=your_gemini_key_here
# ────────────────────────────────
# 5. Run
streamlit run app.py
Open http://localhost:8501 in your browser and start chatting!
(Or just use the hosted version on Hugging Face if you don't have keys handy.)
├── app.py            # Streamlit application
├── requirements.txt  # Python dependencies
├── .env.example      # Sample environment file
└── README.md
| Variable | Description |
|---|---|
| GROQ_API_KEY | Obtain from https://console.groq.com/ |
| GEMINI_API_KEY | Obtain from https://aistudio.google.com/ |
| PORT (optional) | Override Streamlit's default port via `streamlit run app.py --server.port <PORT>` |
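At startup the keys can be pulled from `.env` roughly as follows. This is a minimal sketch assuming `python-dotenv` is listed in `requirements.txt`; the variable names mirror the table above.

```python
# Minimal sketch: load API keys from .env (assumes python-dotenv is installed).
import os
from dotenv import load_dotenv

load_dotenv()                                   # reads .env from the project root
groq_key = os.getenv("GROQ_API_KEY")
gemini_key = os.getenv("GEMINI_API_KEY")
if not (groq_key and gemini_key):
    raise RuntimeError("Set GROQ_API_KEY and GEMINI_API_KEY in .env")
```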
- Model choice: change `model="llama-3.3-70b-versatile"` in `setup_models` (see the sketch below this list).
- UI theme: edit `local_css()` for colours, fonts, and layouts.
- Search provider: swap DuckDuckGo for any other `langchain` tool.
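As an illustration of the first item, a `setup_models` along these lines is where the model string would change. The signature and body here are assumptions for the sketch; the real function in `app.py` may differ.

```python
# Illustrative setup_models(); the real one in app.py may differ.
from langchain_groq import ChatGroq
import google.generativeai as genai

def setup_models(groq_model: str = "llama-3.3-70b-versatile"):
    # Assumes GROQ_API_KEY is in the environment and
    # genai.configure(api_key=...) has already been called.
    llm = ChatGroq(model=groq_model)              # swap the model string here
    gemini = genai.GenerativeModel("gemini-1.5-pro")
    return llm, gemini
```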
- The user prompt is stored in the session history.
- Gemini decides whether the query needs a web search (`<SEARCH>` keywords) or not (`NO_SEARCH`).
- If a search is required, DuckDuckGo fetches the top results, which are passed to Groq with a "Search Results" prompt.
- Groq Llama 3.3 generates the final answer.
- The UI shows a typing indicator while processing, then streams the response.
- All messages and session metadata persist until the browser tab is closed (or the chat is deleted via the sidebar); see the session-state sketch after this list.
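A stripped-down version of the Streamlit session-state handling behind the last steps might look like this. The `chats`/`active_chat` keys and message layout are illustrative assumptions, not the exact structure used in `app.py`.

```python
# Hedged sketch of per-session chat persistence with Streamlit (illustrative names).
import streamlit as st
from datetime import datetime

if "chats" not in st.session_state:
    st.session_state.chats = {}                  # name -> {"messages": [...], "created": ...}
if "active_chat" not in st.session_state:
    st.session_state.active_chat = "Chat 1"
    st.session_state.chats["Chat 1"] = {"messages": [], "created": datetime.now()}

chat = st.session_state.chats[st.session_state.active_chat]

for msg in chat["messages"]:                     # replay history on every rerun
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask me anything"):
    chat["messages"].append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    # ... call the LLM pipeline here and append the assistant reply ...
```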
MIT (see LICENSE).