A powerful, modular Django application that translates natural language questions into SQL queries, allowing users to interact with your database using plain English. Powered by Large Language Models and built with a clean, extensible architecture.
- Natural Language → SQL: Transform plain English questions into precise SQL queries
- Smart Schema Discovery: Automatically extracts schema and constraints from Django models
- Relationship Awareness: Understands model relationships and dependencies
- Enum Choice Handling: Correctly manages field choices with exact capitalization and spacing
- Multi-LLM Support: Works with both Anthropic Claude and OpenAI models
- Conversation Context: Maintains conversation history for follow-up questions
- Clean Architecture: Modular design makes extension and customization simple
- API-First Design: Easy to integrate into any Django application
- Python 3.8+
- Django 3.2+
- PostgreSQL database
- API key for either Anthropic or OpenAI (or both)
- Install the package:

  ```bash
  pip install django-db-chat  # Coming soon to PyPI

  # Or install from source
  pip install -e .
  ```
- Add `db_chat` to your Django `INSTALLED_APPS`:

  ```python
  INSTALLED_APPS = [
      # ...
      'db_chat',
  ]
  ```
- Configure your settings:

  ```python
  # Choose your LLM provider
  LLM_PROVIDER = "anthropic"  # or "openai"

  # API keys
  ANTHROPIC_API_KEY = "your_api_key_here"  # If using Anthropic
  OPENAI_API_KEY = "your_api_key_here"     # If using OpenAI

  # Specify which models to expose to the chatbot
  ALLOWED_MODELS = [
      "users.User",
      "projects.Project",
      # Add more models as needed
  ]
  ```
- Include the URLs:

  ```python
  urlpatterns = [
      # ...
      path('api/chat/', include('db_chat.urls')),
  ]
  ```
- Run migrations and start the server:

  ```bash
  python manage.py migrate
  python manage.py runserver
  ```
Send a POST request to `/api/chat/query/` with:

```json
{
  "query": "How many users are in the system?",
  "conversation_id": "optional-existing-conversation-id"
}
```

Response:

```json
{
  "reply": "There are 1,234 users in the system.",
  "conversation_id": "conversation-uuid"
}
```

db-chat offers optional real-time communication over WebSockets, powered by Django Channels. This enables features such as streaming responses as the LLM generates them.
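Before moving on to WebSockets, the plain HTTP endpoint can be exercised with a small client sketch. This is illustrative only: it assumes a local development server at `http://localhost:8000`; adjust the URL for your project.

```python
# Minimal client sketch for the /api/chat/query/ endpoint.
# Assumes a dev server at localhost:8000 (adjust as needed).
import json
import urllib.request

API_URL = "http://localhost:8000/api/chat/query/"

def build_payload(query, conversation_id=None):
    """Serialize the request body expected by /api/chat/query/."""
    body = {"query": query}
    if conversation_id:
        body["conversation_id"] = conversation_id
    return json.dumps(body).encode("utf-8")

def ask(query, conversation_id=None):
    """POST a natural-language question and return the parsed JSON reply."""
    request = urllib.request.Request(
        API_URL,
        data=build_payload(query, conversation_id),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())
```

Pass the `conversation_id` from a previous response to continue the same conversation.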
To enable WebSocket features, install the app with the `[realtime]` extra:

```bash
pip install "django-db-chat[realtime]"

# Or for local development from source:
pip install -e ".[realtime]"
```

This installs the necessary dependencies: `channels`, `daphne`, and `channels-redis`.
In your project's `settings.py`:
- Add `channels` to `INSTALLED_APPS` (before `db_chat`):

  ```python
  INSTALLED_APPS = [
      'channels',
      # ... other apps
      'db_chat',
      # ...
  ]
  ```
- Set the `DB_CHAT_ENABLE_WEBSOCKETS` flag:

  ```python
  DB_CHAT_ENABLE_WEBSOCKETS = True
  ```
- Configure `ASGI_APPLICATION`:

  ```python
  ASGI_APPLICATION = "your_project_name.asgi.application"  # Replace your_project_name
  ```
- Configure `CHANNEL_LAYERS` (example using Redis on localhost):

  ```python
  CHANNEL_LAYERS = {
      "default": {
          "BACKEND": "channels_redis.core.RedisChannelLayer",
          "CONFIG": {
              "hosts": [("localhost", 6379)],
          },
      },
  }
  ```

  (Ensure you have a Redis server running and `channels-redis` installed.)
- Update your project's `asgi.py` (e.g. `your_project_name/asgi.py`) to handle WebSocket routing conditionally (as shown in the example `db_chat_project/asgi.py`):

  ```python
  # your_project_name/asgi.py
  import os

  from django.core.asgi import get_asgi_application
  from django.conf import settings
  from channels.routing import ProtocolTypeRouter, URLRouter
  from channels.auth import AuthMiddlewareStack  # Optional, for auth

  os.environ.setdefault("DJANGO_SETTINGS_MODULE", "your_project_name.settings")

  # Initialize the Django ASGI application early to ensure the AppRegistry is populated
  django_asgi_app = get_asgi_application()

  websocket_routes = []
  if getattr(settings, 'DB_CHAT_ENABLE_WEBSOCKETS', False):
      try:
          from db_chat.routing import websocket_urlpatterns as db_chat_websocket_routes
          websocket_routes.extend(db_chat_websocket_routes)
          print("Loaded db_chat WebSocket routes.")
      except Exception as e:
          print(f"Failed to load db_chat WebSocket routes: {e}")

  application = ProtocolTypeRouter(
      {
          "http": django_asgi_app,
          # Add WebSocket routing only if enabled and loaded
          "websocket": AuthMiddlewareStack(URLRouter(websocket_routes))
          if websocket_routes
          else None,
      }
  )
  ```
Once configured, you can connect to the WebSocket endpoint at:

```
ws://your-server/ws/db_chat/query/
```
Send JSON messages with the following format:

```json
{
  "query": "Your natural language query here",
  "conversation_id": "optional-uuid-to-continue-conversation"
}
```

The server will send back JSON messages including:

- `{"type": "status", "message": "..."}`: Connection/processing status updates.
- `{"type": "chat_response", "reply": "...", "conversation_id": "...", ...}`: The final response, mirroring the HTTP API.
- `{"type": "error", "message": "..."}`: Any errors encountered.
(Note: Streaming of partial responses is planned for a future update.)
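The message flow above can be sketched with a minimal client. This uses the third-party `websockets` package (`pip install websockets`) purely as an example; any WebSocket client works, and the localhost URL is an assumption for a local dev server.

```python
# Sketch of a WebSocket client for the db_chat endpoint.
# Assumes the `websockets` package and a server at localhost:8000.
import asyncio
import json

WS_URL = "ws://localhost:8000/ws/db_chat/query/"

def encode_message(query, conversation_id=None):
    """Build the JSON text frame the consumer expects."""
    message = {"query": query}
    if conversation_id:
        message["conversation_id"] = conversation_id
    return json.dumps(message)

async def ask(query, conversation_id=None):
    """Send a query and read frames until the final reply or an error."""
    import websockets  # imported lazily so the helper above has no dependencies

    async with websockets.connect(WS_URL) as ws:
        await ws.send(encode_message(query, conversation_id))
        while True:
            event = json.loads(await ws.recv())
            if event["type"] in ("chat_response", "error"):
                return event
```

Run it with `asyncio.run(ask("How many users are there?"))`; `status` frames are skipped until a terminal `chat_response` or `error` arrives.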
The application follows a clean, modular architecture:
- Model Registry: Automatically discovers model schemas and relationships
  - Extracts field types, constraints, and choices from Django models
  - Builds a dependency graph for related models
- LLM Adapter: Provides a unified interface to different LLM providers
  - Adapter pattern with implementations for Anthropic and OpenAI
  - Factory method for selecting the appropriate adapter
- Prompt Templates: Cleanly separated prompt templates for different tasks
  - SQL generation prompts
  - Result interpretation prompts
- Chat Service: Core business logic
  - Handles the full query lifecycle
  - Manages conversation history
- Storage Adapters: Flexible storage for conversation history
  - Redis-based persistence
  - In-memory option for development
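The storage-adapter idea can be sketched as a small interface with swappable backends. The class and method names below are illustrative assumptions, not db_chat's actual API.

```python
# Illustrative sketch of swappable conversation-history storage.
# Names are assumptions; db_chat's real interface may differ.
from abc import ABC, abstractmethod

class ConversationStorage(ABC):
    @abstractmethod
    def append(self, conversation_id, role, content):
        """Record one message in a conversation."""

    @abstractmethod
    def history(self, conversation_id):
        """Return the ordered message list for a conversation."""

class InMemoryStorage(ConversationStorage):
    """Development backend: history lives in a process-local dict."""

    def __init__(self):
        self._store = {}

    def append(self, conversation_id, role, content):
        self._store.setdefault(conversation_id, []).append(
            {"role": role, "content": content}
        )

    def history(self, conversation_id):
        return list(self._store.get(conversation_id, []))
```

A Redis-backed implementation would expose the same two methods, letting the chat service stay oblivious to where history is persisted.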
View detected models and their relationships at `/api/chat/schema-info/`.
This shows:
- All detected models and their table names
- Fields with choices and their exact values
- Dependency relationships between models
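As a rough illustration of the third point, a dependency graph can be represented as a mapping from each model label to the labels it references. This hypothetical helper is not db_chat's implementation, which derives the graph from Django model metadata.

```python
# Hypothetical sketch of a model dependency graph; db_chat builds this
# from Django model metadata, which is not reproduced here.
def build_dependency_graph(foreign_keys):
    """foreign_keys maps a model label to the labels it references."""
    graph = {}
    for model, targets in foreign_keys.items():
        graph.setdefault(model, set()).update(targets)
        for target in targets:
            # Ensure referenced models appear even with no outgoing edges.
            graph.setdefault(target, set())
    return graph

graph = build_dependency_graph({
    "projects.Project": ["users.User"],
})
```

Here `projects.Project` depends on `users.User`, which itself has no outgoing dependencies.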
Extend the `LLMAdapter` class:

```python
class NewProviderAdapter(LLMAdapter):
    def __init__(self):
        # Initialize client for the new provider
        pass

    def generate_text(self, system_prompt, messages, max_tokens=1024):
        # Implement text generation using the new provider
        pass
```

Then update the `get_adapter` method in `LLMAdapter`.
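That dispatch might look like the following self-contained sketch. The mapping, class names, and stub adapters here are assumptions for illustration; the real `get_adapter` on `LLMAdapter` may be shaped differently.

```python
# Hypothetical sketch of the get_adapter factory dispatch.
# Class names and the registry dict are assumptions, not db_chat's code.
class LLMAdapter:
    @staticmethod
    def get_adapter(provider):
        """Return the adapter instance matching the configured LLM_PROVIDER."""
        adapters = {
            "anthropic": AnthropicAdapter,
            "openai": OpenAIAdapter,
            "newprovider": NewProviderAdapter,  # register your new adapter here
        }
        try:
            return adapters[provider]()
        except KeyError:
            raise ValueError(f"Unknown LLM provider: {provider}")

# Stub subclasses so the sketch runs standalone.
class AnthropicAdapter(LLMAdapter):
    pass

class OpenAIAdapter(LLMAdapter):
    pass

class NewProviderAdapter(LLMAdapter):
    pass
```

Keeping the provider registry in one place means adding a provider is a one-line change plus a new adapter subclass.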
Add new validation functions to the `utils.py` module, following the pattern of the existing validators.
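For example, a read-only check might look like the sketch below. The function name and raise-on-failure style are assumptions about that pattern, not the module's actual contents.

```python
# Illustrative validator in an assumed utils.py style; the real
# validators and their signatures may differ.
import re

def validate_read_only(sql):
    """Reject anything other than a single SELECT statement."""
    statement = sql.strip().rstrip(";")
    if ";" in statement:
        raise ValueError("Multiple SQL statements are not allowed")
    if not re.match(r"(?is)^\s*select\b", statement):
        raise ValueError("Only SELECT queries are allowed")
    return statement
```

Guards like this are a useful belt-and-braces layer on top of database permissions when executing LLM-generated SQL.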
Edit the templates in `prompts.py` to customize how the chatbot interacts with the LLM.
MIT License - See LICENSE file for details