Download and install Ollama from https://ollama.com/download for your operating system.
Before starting the Ollama server, pull the required model:

```shell
ollama pull llama3.2:3b
```

See https://ollama.com/library for more details.
Run:

```shell
ollama serve
```

- The backend service connects to Ollama's API at `http://localhost:11434` by default.
- This URL can be configured via the `OLLAMA_API_URL` environment variable in `app/backend/.env`.
- Ensure Ollama is running before using narration features in the app.
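To confirm the server is up before launching the backend, you can probe Ollama's root endpoint, which responds when the server is running (the URL below is the default; adjust it if you changed `OLLAMA_API_URL`):

```shell
# Probe the Ollama server; -s silences progress output, -f makes curl
# exit non-zero on HTTP errors, --max-time bounds the wait.
if curl -sf --max-time 2 http://localhost:11434/ > /dev/null; then
  echo "Ollama is reachable"
else
  echo "Ollama is not running at http://localhost:11434"
fi
```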
- The Ollama API URL is configured in `app/backend/.env`: `OLLAMA_API_URL=http://localhost:11434`
- The `startup.sh` script automatically creates this file from `sample.env.txt` if it doesn't exist.
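The bootstrap step above amounts to a copy-if-missing. As a sketch (the actual contents of `startup.sh` are not shown in this document, so this only illustrates the behavior described):

```shell
# Sketch of the .env bootstrap: copy the sample config only when no .env exists yet,
# so an existing .env with local overrides is never clobbered.
cd app/backend
if [ ! -f .env ]; then
  cp sample.env.txt .env
  echo "Created .env from sample.env.txt"
fi
```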
- If the backend cannot connect, verify Ollama is running and accessible at the configured port.
- Check that the `OLLAMA_API_URL` value in `app/backend/.env` matches your Ollama server address.
- Review the backend logs for connection error details.