A local-first AI terminal assistant powered by Ollama: a personal pocket AI for conversations, coding, and work.
- Local-First - Runs entirely on your machine with Ollama
- Memory - Remembers past conversations with semantic search
- Fast - Single Go binary, instant startup
- Customizable - Wine and gold theme, configurable models
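Semantic search over past conversations is typically done by embedding each message and ranking stored vectors by cosine similarity against the query embedding. A minimal, dependency-free sketch of that ranking step (the names here are illustrative, not dvkcli's actual internals):

```go
package main

import (
	"fmt"
	"math"
)

// cosine returns the cosine similarity of two equal-length vectors.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	if na == 0 || nb == 0 {
		return 0
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

func main() {
	query := []float64{1, 0, 1} // embedding of the search query (toy vectors)
	stored := []struct {
		text string
		vec  []float64
	}{
		{"a message about Go", []float64{1, 0, 0.9}},
		{"a message about cats", []float64{0, 1, 0}},
	}
	for _, s := range stored {
		fmt.Printf("%s: %.2f\n", s.text, cosine(query, s.vec))
	}
}
```

In dvkcli the real embeddings come from the configured `embed_model` via Ollama and are persisted in SQLite; the similarity ranking itself is the simple part shown here.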
- Install Ollama
- Pull a model:

```
ollama pull qwen2.5:3b
```

Install with `go install`:

```
go install github.com/diiviikk5/dvkcli/cmd/dvkcli@latest
```

Or build from source:
```
git clone https://github.com/diiviikk5/dvkcli.git
cd dvkcli
go install ./cmd/dvkcli
```

Run `dvkcli`, then use slash commands inside the chat:

- /help - Show all commands
- /models - List available Ollama models
- /search <query> - Search past conversations
- /clear - Clear current conversation
- /export - Export chat to markdown
- Enter - Send message
- Ctrl+N - New conversation
- Ctrl+L - Load last conversation
- Ctrl+E - Export conversation
- Up/Down or j/k - Scroll
- PgUp/PgDown - Page scroll
- Ctrl+C - Quit
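dvkcli's TUI is built on Bubbletea, where key handling is typically a switch on the pressed key inside the update loop. A dependency-free sketch of how dispatch for bindings like the ones above might look (the `model` fields are hypothetical, not dvkcli's actual state):

```go
package main

import "fmt"

// model is a toy stand-in for the TUI state; dvkcli's real state differs.
type model struct {
	messages []string
	offset   int // scroll offset into the viewport
	quit     bool
}

// handleKey mimics a Bubbletea-style switch on the key name.
func handleKey(m model, key string) model {
	switch key {
	case "ctrl+n": // new conversation: drop history, reset scroll
		m.messages = nil
		m.offset = 0
	case "up", "k": // scroll up, clamped at the top
		if m.offset > 0 {
			m.offset--
		}
	case "down", "j": // scroll down
		m.offset++
	case "ctrl+c": // quit
		m.quit = true
	}
	return m
}

func main() {
	m := model{messages: []string{"hi"}, offset: 3}
	m = handleKey(m, "ctrl+n")
	fmt.Println(len(m.messages), m.offset)
}
```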
Config is stored in `~/.dvkcli/config.json`:

```json
{
  "ollama_url": "http://localhost:11434",
  "model": "qwen2.5:3b",
  "embed_model": "nomic-embed-text",
  "memory_enabled": true
}
```

Built with:

- Go
- Bubbletea (TUI framework)
- Lipgloss (styling)
- SQLite (memory storage)
- Ollama (local LLM)
License: MIT
