
Open WebUI + Ollama + Qwen3 with Docker Compose


A local stack that runs Ollama (a custom image with the Qwen3 model preloaded) alongside Open WebUI, orchestrated with Docker Compose.

This setup provides a ready-to-use local AI chat environment without downloading models at runtime.


Quick architecture

  • ollama-qwen3 → Ollama runtime with model qwen3:4b preloaded
    API available at http://localhost:11434
  • open-webui → Web-based chat interface available at http://localhost:3000

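For reference, a minimal docker-compose.yaml consistent with the layout above might look like the following. This is a sketch, not the repository's actual file: the volume name, internal paths, and environment wiring are assumptions, while the container names match the log commands used later in this README.

```yaml
services:
  ollama-qwen3:
    image: ghcr.io/r3xakead0/qwen3-container/ollama-qwen3:latest
    container_name: ollama-qwen3
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama   # model storage (assumed volume name)

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:8080"                 # Open WebUI listens on 8080 inside the container
    environment:
      - OLLAMA_BASE_URL=http://ollama-qwen3:11434
      - WEBUI_SECRET_KEY=${WEBUI_SECRET_KEY}
      - ENABLE_SIGNUP=${ENABLE_SIGNUP}
    depends_on:
      - ollama-qwen3

volumes:
  ollama-data:
```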


Requirements

  • Docker (Docker Desktop on macOS)
  • Docker Compose v2 (docker compose)
  • Free ports:
    • 11434 (Ollama + Qwen3 API)
    • 3000 (Open WebUI)
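A quick way to confirm both ports are free before starting the stack (a sketch; it uses ss where available and falls back to lsof):

```shell
# Return success if nothing is listening on the given TCP port
port_free() {
  if command -v ss >/dev/null 2>&1; then
    ! ss -ltn 2>/dev/null | grep -q ":$1 "
  else
    ! lsof -iTCP:"$1" -sTCP:LISTEN >/dev/null 2>&1
  fi
}

for p in 11434 3000; do
  if port_free "$p"; then
    echo "port $p is free"
  else
    echo "port $p is in use"
  fi
done
```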

Installation

Install Docker

Fedora / RHEL / CentOS

sudo dnf install -y docker
sudo systemctl enable --now docker

Ubuntu / Debian

sudo apt update && sudo apt install -y docker.io

Note: on Ubuntu/Debian the docker.io package may not ship Compose v2. If "docker compose version" fails, also install the Compose plugin (docker-compose-v2 on recent Ubuntu releases, or docker-compose-plugin from Docker's official repositories).

macOS

brew install --cask docker

Verify installation:

docker --version
docker compose version

Configuration

Create a .env file in the same directory as docker-compose.yaml:

WEBUI_SECRET_KEY=change-me-to-a-long-random-secret
ENABLE_SIGNUP=true

Generate a secure secret:

openssl rand -hex 32
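The two steps above can be combined into a single command that generates the secret and writes the .env file in one go (a sketch; assumes openssl is installed):

```shell
# Generate a 64-character hex secret and write the .env file in one step
printf 'WEBUI_SECRET_KEY=%s\nENABLE_SIGNUP=true\n' "$(openssl rand -hex 32)" > .env
```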

Execution

From the project directory:

docker compose up -d

Access:

  • Open WebUI: http://localhost:3000
  • Ollama API: http://localhost:11434

On first access, Open WebUI will ask you to create the first user (admin). This is expected behavior.
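The containers can take a few seconds to become ready after docker compose up. A small helper like the one below (hypothetical; assumes curl is installed) polls an endpoint until it answers:

```shell
# Poll a URL until it responds, or give up after a retry limit (default 30 x 1s)
wait_for() {
  url=$1
  tries=${2:-30}
  i=0
  until curl -fsS "$url" > /dev/null 2>&1; do
    i=$((i + 1))
    if [ "$i" -ge "$tries" ]; then
      echo "timed out waiting for $url"
      return 1
    fi
    sleep 1
  done
  echo "$url is up"
}

# Example usage once the stack is starting:
# wait_for http://localhost:11434
# wait_for http://localhost:3000
```

Run wait_for http://localhost:11434 && wait_for http://localhost:3000 before opening the browser to avoid hitting a half-started service.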


Testing and validation

Verify Ollama

curl http://localhost:11434

Expected response:

Ollama is running

List models

curl http://localhost:11434/api/tags

Validate Open WebUI

Open in the browser:

http://localhost:3000

Create a user, log in, and test a chat with the model.

Logs

docker logs ollama-qwen3
docker logs open-webui

Stop and restart

Stop

docker compose stop

Start again

docker compose start

Teardown

Bring down containers (keeps data)

docker compose down

Full cleanup (⚠️ removes volumes)

docker compose down -v

This removes:

  • Downloaded models (Ollama)
  • Users and conversations (Open WebUI)

Image source

ghcr.io/r3xakead0/qwen3-container/ollama-qwen3:latest

Security recommendations

  • Change WEBUI_SECRET_KEY to a long, random secret
  • After creating the first user, disable signups:
    ENABLE_SIGNUP=false
    then recreate containers:
    docker compose up -d --force-recreate
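The signup toggle can be flipped in place with sed (a sketch: the sample .env line mirrors the Configuration section, and -i.bak keeps a backup so the same command works with both GNU and BSD sed):

```shell
# Example .env matching the Configuration section (replace the secret with your own)
printf 'WEBUI_SECRET_KEY=change-me\nENABLE_SIGNUP=true\n' > .env

# Disable signups once the first (admin) user exists
sed -i.bak 's/^ENABLE_SIGNUP=.*/ENABLE_SIGNUP=false/' .env
```

After editing, recreate the containers as shown above so the new value takes effect.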

Expected final state

  • Ollama responding on localhost:11434
  • Open WebUI accessible on localhost:3000
  • Model available and functional

Third-party components

This project uses the following open-source components:

  • Ollama
  • Open WebUI
  • Qwen3 (model weights)

License

This project is licensed under the MIT License. See the LICENSE file for details.
