Dwell helps triage tenant maintenance requests using an on-device LLM (Ollama llama3) and suggests nearby vendors via the Google Maps Places API.
- Ollama llama3 integration for maintenance triage and short tenant responses (see the triage sketch after this list)
- Google Maps Places API integration for nearby vendor suggestions
- FastAPI backend with async background processing
- React frontend for tenant reporting and manager dashboard
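As a rough picture of the triage step, the sketch below sends a tenant message to Ollama's `/api/generate` endpoint and parses the model's reply. The prompt wording, the `triage_message` helper, and the `category`/`urgency`/`reply` keys are illustrative assumptions rather than the project's actual code.

```python
# Minimal triage sketch against Ollama's /api/generate endpoint.
# The prompt, helper name, and JSON keys are assumptions, not Dwell's real code.
import json
import os

import requests

OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
OLLAMA_MODEL = os.environ.get("OLLAMA_MODEL", "llama3:latest")
OLLAMA_TIMEOUT_SECONDS = float(os.environ.get("OLLAMA_TIMEOUT_SECONDS", "120"))


def triage_message(message: str) -> dict:
    """Classify a tenant maintenance request and draft a short reply."""
    prompt = (
        "You are a property maintenance triage assistant. Classify the issue, "
        "rate its urgency (low/medium/high), and draft a short reply to the tenant. "
        "Answer as JSON with keys: category, urgency, reply.\n\n"
        f"Tenant message: {message}"
    )
    resp = requests.post(
        f"{OLLAMA_BASE_URL}/api/generate",
        json={
            "model": OLLAMA_MODEL,
            "prompt": prompt,
            "stream": False,
            "format": "json",  # ask Ollama to constrain the output to valid JSON
        },
        timeout=OLLAMA_TIMEOUT_SECONDS,
    )
    resp.raise_for_status()
    # The generated text lives in the "response" field of Ollama's reply.
    return json.loads(resp.json()["response"])
```

In the app, this kind of call runs as async background work behind `POST /api/triage`, which is why that endpoint can return a `work_order_id` right away.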
Screenshots: tenant view and manager dashboard.
Backend (PowerShell):

```powershell
cd backend
python -m venv .venv; .\.venv\Scripts\Activate.ps1
pip install -r requirements.txt
# Optional configuration (defaults shown; see the settings sketch after this block)
$env:OLLAMA_BASE_URL = "http://localhost:11434"
$env:OLLAMA_MODEL = "llama3:latest"
$env:OLLAMA_TIMEOUT_SECONDS = "120"
# Maps: set to "google" to use Places API; default is "mock"
$env:MAPS_MODE = "mock"
# If MAPS_MODE=google, provide your API key
# $env:GOOGLE_MAPS_API_KEY = "<your_key>"
uvicorn backend.app.main:app --reload --port 8000
```
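The backend reads these variables at startup and falls back to the defaults shown above. A minimal sketch of that settings resolution, assuming plain environment lookups (the `Settings` dataclass and its field names are hypothetical, not the backend's actual module):

```python
# Hypothetical settings loader mirroring the environment variables above;
# the Settings dataclass and its field names are assumptions.
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class Settings:
    ollama_base_url: str = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
    ollama_model: str = os.environ.get("OLLAMA_MODEL", "llama3:latest")
    ollama_timeout_seconds: float = float(os.environ.get("OLLAMA_TIMEOUT_SECONDS", "120"))
    maps_mode: str = os.environ.get("MAPS_MODE", "mock")  # "mock" or "google"
    google_maps_api_key: str = os.environ.get("GOOGLE_MAPS_API_KEY", "")


settings = Settings()
```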
Frontend (PowerShell):

```powershell
cd frontend
npm install
npm run dev
```

API endpoints:

- `POST /api/triage`: accept a tenant message and return a `work_order_id`
- `GET /api/work_orders/{id}`: fetch triage results and vendor suggestions
- `GET /health/ollama`: check Ollama connectivity and the resolved model
- `GET /debug/ollama_raw`: inspect raw LLM output (for debugging)
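Once both servers are up, the API can be smoke-tested end to end with a short script like the one below. The request body for `POST /api/triage` and the fields read from `GET /api/work_orders/{id}` (including the `status` value) are assumptions based on the endpoint descriptions above, so adjust them to the actual schema.

```python
# Hypothetical end-to-end check of the endpoints listed above;
# request/response field names ("message", "status", ...) are assumptions.
import time

import requests

BASE_URL = "http://localhost:8000"

# Confirm the backend can reach Ollama and see which model it resolved.
print(requests.get(f"{BASE_URL}/health/ollama", timeout=10).json())

# Submit a tenant message for triage; the response carries a work_order_id.
created = requests.post(
    f"{BASE_URL}/api/triage",
    json={"message": "The kitchen sink has been leaking since last night."},
    timeout=10,
).json()
work_order_id = created["work_order_id"]

# Triage runs in the background, so poll until results and vendor
# suggestions are attached to the work order.
for _ in range(30):
    order = requests.get(f"{BASE_URL}/api/work_orders/{work_order_id}", timeout=10).json()
    if order.get("status") != "pending":  # the status values are an assumption
        break
    time.sleep(2)

print(order)
```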
Notes:

- Ensure Ollama is running locally on port 11434 and that the llama3 model is installed (e.g., `ollama pull llama3`).
- To use Google Places, set `MAPS_MODE=google` and define `GOOGLE_MAPS_API_KEY` (a sample Nearby Search request is sketched below).
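When `MAPS_MODE=google`, vendor lookup presumably goes through the Places API's Nearby Search. The sketch below shows what such a request can look like; the keyword, radius, and the fields pulled from the response are assumptions and not necessarily how the backend builds its query.

```python
# Illustrative Places Nearby Search request for vendor suggestions;
# keyword, radius, and extracted fields are assumptions.
import os

import requests

GOOGLE_MAPS_API_KEY = os.environ["GOOGLE_MAPS_API_KEY"]


def nearby_vendors(lat: float, lng: float, keyword: str = "plumber") -> list[dict]:
    """Return name, address, and rating for vendors near the given coordinates."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/place/nearbysearch/json",
        params={
            "location": f"{lat},{lng}",
            "radius": 5000,  # metres
            "keyword": keyword,
            "key": GOOGLE_MAPS_API_KEY,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return [
        {
            "name": place.get("name"),
            "address": place.get("vicinity"),
            "rating": place.get("rating"),
        }
        for place in resp.json().get("results", [])
    ]
```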

