# Ollama WebUI Stack

This repository contains a Docker Compose setup for running Ollama (a local LLM server), OpenWebUI (a web interface for interacting with LLMs), and SearXNG (a privacy-focused search engine) together.

## Features
- 🚀 One-click deployment of the entire AI stack
- 🧠 Run open-source language models locally
- 🌐 Beautiful web interface for chatting with models
- 🔍 Integrated search capabilities
- 🎮 NVIDIA GPU acceleration support
- 💾 Persistent storage for all components
- 🔄 Easy update and backup scripts
## Prerequisites

- Docker and Docker Compose
- NVIDIA Docker runtime (for GPU acceleration)
- Git
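
Before continuing, you can confirm each prerequisite is installed from a terminal:

```bash
docker --version           # Docker
docker-compose --version   # Docker Compose
git --version              # Git
nvidia-smi                 # NVIDIA driver (only needed for GPU acceleration)
```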
## Installation

1. Clone this repository:

   ```bash
   git clone https://github.com/yourusername/ollama-webui-stack.git
   cd ollama-webui-stack
   ```
2. Start the stack:

   ```bash
   ./scripts/start-stack.sh
   ```
3. Access the interfaces:

   - OpenWebUI: http://localhost:3000
   - SearXNG: http://localhost:8080
   - Ollama API: http://localhost:11434 (see the quick check below)
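
Once the stack is running, a quick way to confirm the Ollama API is reachable is a direct request. This assumes a model such as `llama3` has already been downloaded (see the model section below):

```bash
# Request a single non-streaming completion from the Ollama API
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Why is the sky blue?", "stream": false}'
```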
## Configuration

Copy the example environment file and modify it as needed:

```bash
cp .env.example .env
```

Edit the `.env` file to configure (an illustrative example follows):

- Ports
- Base URLs
- GPU settings
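
For orientation, a filled-in `.env` might look like the sketch below. The variable names here are assumptions for illustration; `.env.example` in this repository is the authoritative reference:

```env
# Illustrative values only -- see .env.example for the real variable names
OPENWEBUI_PORT=3000
SEARXNG_PORT=8080
OLLAMA_PORT=11434
OLLAMA_BASE_URL=http://ollama:11434
```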
## Project Structure

```
ollama-webui-stack/
├── docker-compose.yml        # Main configuration file
├── .env.example              # Example environment variables
├── .gitignore                # Git ignore file
├── README.md                 # This file
├── data/                     # OpenWebUI data directory
├── ollama-data/              # Ollama data directory
├── searxng-data/             # SearXNG configuration directory
└── scripts/                  # Utility scripts
    ├── backup.sh             # Backup script
    ├── run-ollama-docker.sh  # Run Ollama standalone
    ├── start-stack.sh        # Start the entire stack
    └── update.sh             # Update Docker images
```
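
To show how the pieces fit together, here is a trimmed-down sketch of what `docker-compose.yml` wires up. Image tags, container paths, and environment variables here are plausible assumptions, not a copy of the actual file:

```yaml
# Sketch only -- the real docker-compose.yml in this repository is authoritative
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ./ollama-data:/root/.ollama        # persists downloaded models

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                        # UI served on host port 3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - ./data:/app/backend/data
    depends_on:
      - ollama

  searxng:
    image: searxng/searxng
    ports:
      - "8080:8080"
    volumes:
      - ./searxng-data:/etc/searxng
```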
## Usage

Start the entire stack:

```bash
./scripts/start-stack.sh
```

Update all Docker images:

```bash
./scripts/update.sh
```

Back up the data directories:

```bash
./scripts/backup.sh [backup_directory]
```
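
For reference, a minimal sketch of what a backup script like this could do, assuming it simply archives the three data directories; the actual `scripts/backup.sh` may differ:

```bash
#!/usr/bin/env bash
# Illustrative backup: archive the data directories into a timestamped tarball.
# Usage: ./backup.sh [backup_directory]   (defaults to ./backups)
set -euo pipefail

BACKUP_DIR="${1:-./backups}"
STAMP="$(date +%Y%m%d-%H%M%S)"

mkdir -p "$BACKUP_DIR"
tar -czf "$BACKUP_DIR/stack-backup-$STAMP.tar.gz" \
  data/ ollama-data/ searxng-data/

echo "Backup written to $BACKUP_DIR/stack-backup-$STAMP.tar.gz"
```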
## Downloading Models

To download and use models with Ollama:

1. Access OpenWebUI at http://localhost:3000
2. Navigate to the Models section
3. Choose from available models to download
Popular models include:
- llama3
- mistral
- phi
- gemma
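
Models can also be pulled from the command line instead of the UI. This assumes the Ollama container is named `ollama`; check `docker-compose ps` for the actual name:

```bash
# Pull a model inside the running Ollama container, then chat with it
docker exec -it ollama ollama pull llama3
docker exec -it ollama ollama run llama3
```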
## Troubleshooting

### GPU Issues

If you're having GPU issues:

- Ensure NVIDIA drivers are properly installed
- Verify the NVIDIA Docker runtime is configured (see the checks below)
- Check the Docker logs:

```bash
docker-compose logs ollama
```
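
Two quick checks confirm whether containers can see the GPU at all. The CUDA image tag below is just an example; any recent tag works:

```bash
# Verify the NVIDIA driver on the host
nvidia-smi

# Verify Docker can pass the GPU through to a container
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```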
### Connection Issues

If components can't connect to each other:

- Check that all containers are running:

  ```bash
  docker-compose ps
  ```

- Inspect the logs:

  ```bash
  docker-compose logs
  ```

- Verify the network configuration in docker-compose.yml (inspection commands below)
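
To confirm all three containers share a network, list and inspect it. The network name below assumes Compose's default `<project>_default` naming; `docker network ls` shows the real name:

```bash
docker network ls
docker network inspect ollama-webui-stack_default
```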
## Stopping and Logs

Stop and remove all containers:

```bash
docker-compose down
```

Follow the logs of all services in real time:

```bash
docker-compose logs -f
```

## License

This project is distributed under the MIT License. See the LICENSE file for more information.