Ollama Setup Instructions

Install Ollama

Download and install Ollama from https://ollama.com/download for your operating system.

Pull Model(s)

Before starting the Ollama server, pull the required model:

ollama pull llama3.2:3b

See https://ollama.com/library for the full list of available models.
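To confirm the pull succeeded, you can list the models available locally:

```shell
ollama list
```

The pulled model (llama3.2:3b in this case) should appear in the output, along with its size and modification time.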

Start Ollama Server

Run:

ollama serve
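Note that ollama serve runs in the foreground, and on some installs (the macOS app, or a Linux install that registers a system service) Ollama may already be running in the background. Assuming the default port 11434, a quick way to check whether the server is up:

```shell
curl http://localhost:11434
```

A running server typically responds with a short status message such as "Ollama is running".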

API Usage

  • The backend service connects to Ollama's API at http://localhost:11434 by default.
  • This URL can be configured via the OLLAMA_API_URL environment variable in app/backend/.env.
  • Ensure Ollama is running before using narration features in the app.
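As a sketch of the kind of request the backend makes, a prompt can be sent directly to Ollama's /api/generate endpoint; the model name matches the one pulled above, and the prompt text is purely illustrative:

```shell
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2:3b",
  "prompt": "Narrate this scene in one sentence: a ship leaving harbor at dawn.",
  "stream": false
}'
```

With "stream": false, the server returns a single JSON object whose response field contains the generated text.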

Configuration

  • The Ollama API URL is configured in app/backend/.env:
    OLLAMA_API_URL=http://localhost:11434
  • The startup.sh script automatically creates this file from sample.env.txt if it doesn't exist.

Troubleshooting

  • If the backend cannot connect, verify Ollama is running and accessible at the configured port.
  • Check the OLLAMA_API_URL value in app/backend/.env matches your Ollama server address.
  • Review backend logs for connection error details.
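As a quick sanity check for the first two points, the following commands (run from the repository root, using the paths mentioned above) show the URL the backend will use and probe the default Ollama address:

```shell
# Show the configured Ollama URL
grep OLLAMA_API_URL app/backend/.env

# Probe the Ollama server at the default address
curl -s http://localhost:11434 || echo "Ollama is not reachable"
```

If the grep output and the address Ollama is actually listening on disagree, update one of them so they match.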