Interactive demo for ignfab/geocontext, based on Gradio ChatBot and LangGraph.
- uv (Python package and project manager)
- NodeJS (npx)
| Name | Description | Default |
|---|---|---|
| MODEL_NAME | The name of the model (see LangGraph `create_react_agent` / `init_chat_model`) | "anthropic:claude-sonnet-4-6" |
| ANTHROPIC_API_KEY | Required for `anthropic:*` models (https://console.anthropic.com/settings/keys) | |
| GOOGLE_API_KEY | Required for `google_genai:*` models (https://aistudio.google.com/api-keys) | |
| TEMPERATURE | Model temperature | 0 |
| DB_URI | URL of the PostgreSQL (e.g. postgresql://postgres:ChangeIt@localhost:5432/geocontext) or Redis (e.g. redis://default:ChangeIt@localhost:6379/0) database | None (use InMemorySaver) |
| CONTACT_EMAIL | Email for the contact button | "dev@localhost" |
| GEOCONTEXT_LOG_LEVEL | Log level for the Geocontext MCP | error |
| LOG_LEVEL | Log level for this application | INFO |
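For example, to persist conversation state across restarts instead of relying on the default InMemorySaver, you could point `DB_URI` at a local PostgreSQL instance. This sketch reuses the placeholder connection string from the table above; adapt the host, credentials and database name to your setup:

```shell
# persist LangGraph checkpoints in PostgreSQL (placeholder credentials)
export DB_URI="postgresql://postgres:ChangeIt@localhost:5432/geocontext"
# or, alternatively, in Redis:
# export DB_URI="redis://default:ChangeIt@localhost:6379/0"
```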
Note that "HTTP_PROXY", "HTTPS_PROXY", "NO_PROXY" are supported if you have to use a corporate proxy.
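If you are behind a corporate proxy, the variables can be set like this (the proxy host and port below are hypothetical; substitute your own):

```shell
# hypothetical corporate proxy configuration
export HTTP_PROXY="http://proxy.example.com:3128"
export HTTPS_PROXY="http://proxy.example.com:3128"
# hosts that must bypass the proxy
export NO_PROXY="localhost,127.0.0.1"
```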
# download repository
git clone https://github.com/ignfab/demo-geocontext
cd demo-geocontext
# configure model and credentials
export MODEL_NAME="anthropic:claude-sonnet-4-6"
export ANTHROPIC_API_KEY="YourApiKey"
# start demo on http://localhost:8000/
uv run demo_gradio.py

Compared to Linux, adapt the model and credentials configuration as follows with PowerShell:
#$env:MODEL_NAME="ollama:mistral:7b"
$env:MODEL_NAME="anthropic:claude-sonnet-4-6"
$env:ANTHROPIC_API_KEY="YourApiKey"
# start demo on http://localhost:8000/
uv run demo_gradio.py

See docker-compose.yaml:
# build image
docker compose build
# Use Google Gemini API
export MODEL_NAME="google_genai:gemini-2.5-flash"
export GOOGLE_API_KEY="YourApiKey"
# start demo on http://localhost:8000/
docker compose up -d
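Since the demo runs detached (`-d`), the usual Compose lifecycle commands apply for inspecting and stopping it:

```shell
# follow the application logs
docker compose logs -f
# stop and remove the containers
docker compose down
```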