Admin UI for the Chroma embedding database, built with Next.js
- GitHub Repo: https://github.com/flanker/chromadb-admin
- Chroma Official Website: https://docs.trychroma.com
- Vector similarity search with text queries
- Collection and record management
- Authentication support (Token, Basic Auth, No Auth)
- Modern UI built with Mantine
- Environment variable configuration
- Quick start command
Install the global command for easy access from anywhere:

```bash
./install-command.sh
source ~/.zshrc  # or restart your terminal
```

Then you can start the project from anywhere:

```bash
chromadb-admin
```

First, install dependencies:
```bash
yarn install
# or
npm install
```

Then, start the development server:

```bash
yarn dev
# or
npm run dev
# or
pnpm dev
# or
bun dev
```

Finally, open http://localhost:3001 in your browser to see the app.
Create a `.env.local` file in the project root (copy from `.env.example`):

```bash
cp .env.example .env.local
```

Edit `.env.local` with your configuration:

```bash
# OpenAI API Configuration
OPENAI_API_KEY=your-openai-api-key-here
OPENAI_BASE_URL=https://api.openai.com/v1

# ChromaDB Connection Configuration
# Chroma connection string (host:port or full URL)
CHROMA_API=http://localhost:8000

# Embedding Model Configuration
# Model name for embeddings (e.g., text-embedding-3-small, text-embedding-3-large, llama2)
EMBEDDING_MODEL=text-embedding-3-small
```

Note: The values in `.env.local` will be automatically filled in on the setup page when you first open it.
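To see how `OPENAI_BASE_URL` and `EMBEDDING_MODEL` fit together: an OpenAI-compatible embeddings call POSTs a JSON body containing the model name to the base URL plus `/embeddings`. A minimal sketch (the `buildEmbeddingRequest` helper is illustrative, not part of this project; the request shape follows the OpenAI embeddings API):

```typescript
// Hypothetical helper: build the URL and JSON body for an
// OpenAI-compatible /embeddings request from the env values above.
function buildEmbeddingRequest(
  baseUrl: string,
  model: string,
  input: string
): { url: string; body: string } {
  return {
    // Strip a trailing slash so "https://api.openai.com/v1/" also works.
    url: `${baseUrl.replace(/\/$/, "")}/embeddings`,
    body: JSON.stringify({ model, input }),
  };
}

// Example (not executed here): send it with fetch and the API key.
// const { url, body } = buildEmbeddingRequest(
//   process.env.OPENAI_BASE_URL!,
//   process.env.EMBEDDING_MODEL!,
//   "some text to embed"
// );
// fetch(url, {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
//   },
//   body,
// });
```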
When you first open the app, you'll be directed to the setup page where you can configure:

- Chroma Connection String: Your ChromaDB server address (e.g., `http://localhost:8000`)
- Tenant & Database: Multi-tenancy configuration
- Embedding Model URL: Supports various embedding services:
  - OpenAI: `https://api.openai.com/v1`
  - LM Studio: `http://localhost:1234/v1/embeddings`
  - Ollama (OpenAI mode): `http://localhost:11434/v1`
  - Ollama (native): `http://localhost:11434/api/embeddings`
- Embedding Model: Model name (e.g., `text-embedding-3-small`, `llama2`)
- Authentication: Token, Basic Auth, or No Auth
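Since the connection string accepts either bare `host:port` or a full URL, a small normalizer along these lines (hypothetical; not the project's actual parsing code) illustrates the accepted forms:

```typescript
// Hypothetical: normalize "host:port" or a full URL to a base URL string.
// A bare host:port is assumed to mean plain HTTP.
function normalizeChromaUrl(input: string): string {
  const withScheme = /^https?:\/\//.test(input) ? input : `http://${input}`;
  // new URL() validates the string; .origin yields scheme://host[:port].
  return new URL(withScheme).origin;
}
```

So `localhost:8000` and `http://localhost:8000` both resolve to the same server address.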
```bash
docker run -p 3001:3001 fengzhichao/chromadb-admin
```

Then visit http://localhost:3001 in your browser.
Note: Use http://host.docker.internal:8000 for the connection string if you want to connect to a ChromaDB instance running locally.
Build the Docker image:

```bash
docker build -t chromadb-admin .
```

Run the Docker container:

```bash
docker run -p 3001:3001 chromadb-admin
```

- `yarn dev` - Start development server on port 3001
- `yarn build` - Build for production
- `yarn start` - Start production server
- `yarn lint` - Run ESLint
- `yarn generate-mock-data` - Generate mock data for testing
```
chromadb-admin/
├── src/
│   ├── app/            # Next.js app directory
│   │   ├── api/        # API routes
│   │   └── setup/      # Setup page
│   ├── components/     # React components
│   └── lib/            # Utilities and helpers
├── script/             # Utility scripts
├── .env.example        # Environment variables template
└── .env.local          # Your local configuration (not committed)
```
If port 3001 is already in use, you can change it in `package.json`:

```json
"dev": "next dev -p 3001"
```

Make sure to:

- Create a `.env.local` file (not `.env`)
- Restart the development server after changing environment variables
- Check that variable names match exactly (case-sensitive)
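One way to catch a misnamed or missing variable early is to fail fast when reading it. A minimal sketch (the `requireEnv` helper and its error message are illustrative, not part of this project; Next.js does load `.env.local` into `process.env` on the server):

```typescript
// Hypothetical helper: throw immediately if a required variable is unset,
// instead of failing later with a confusing downstream error.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example usage with the variables from .env.local:
// const chromaApi = requireEnv("CHROMA_API");
// const embeddingModel = requireEnv("EMBEDDING_MODEL");
```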
If you encounter proxy connection errors, you can disable the proxy for this repository:

```bash
git config --local http.proxy ""
git config --local https.proxy ""
```

Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the terms of the MIT license.
This is NOT an official Chroma project.
