Merged
40 changes: 0 additions & 40 deletions AGENTS.md

This file was deleted.

107 changes: 15 additions & 92 deletions CLAUDE.md
@@ -1,113 +1,36 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with
code in this repository.

## Project Overview

Ghost is a command-line AI assistant written in Go and powered by Ollama, designed
with a cyberpunk aesthetic inspired by Shadowrun, Cyberpunk 2077, and The Matrix.
It provides local AI capabilities for querying, analyzing piped data, processing
images with vision models, and formatting output (text, JSON, Markdown).

## Architecture

### Core Flow
Ghost is a command line AI assistant written in Go and powered by Ollama, designed
with a cyberpunk aesthetic inspired by Shadowrun, Cyberpunk 2077, and The Matrix.

1. **Entry Point** (`main.go`): Initializes root command via Fang CLI framework
with custom theming and error handling
2. **Root Command** (`cmd/root.go`): Orchestrates the main execution flow:
- Collects user prompt, piped input, and flags
- Analyzes images if provided (using vision model)
- Executes tool calls in a loop before streaming final response
- Streams LLM response using Bubbletea TUI
- Renders final output with appropriate formatting
3. **LLM Client** (`internal/llm/ollama.go`): Communicates with Ollama API
- `StreamChat()`: Streaming chat with callback for each chunk
- `AnalyzeImages()`: Non-streaming vision model requests
4. **UI Layer** (`internal/ui/`): Bubbletea models for interactive display
- `stream.go`: Streaming model for single-shot queries
- `chat.go`: Core ChatModel struct, types, Init, Update, View
- `chat_normal.go`: Normal mode key handling
- `chat_command.go`: Command mode (`:` commands like `:q`, `:r`)
- `chat_insert.go`: Insert mode text input handling
- `chat_stream.go`: LLM streaming and response handling
5. **Theme System** (`theme/`): Handles cyberpunk-themed rendering and formatting
- UI glyphs in `theme/glyph.go`: Use `theme.GlyphInfo` (󱙝) and `theme.GlyphError`
(󱙜)
## Configuration

### Configuration System

Configuration priority (highest to lowest):
Priority (highest to lowest):

1. Command-line flags
2. Environment variables (prefixed with `GHOST_`, dots/hyphens replaced with `_`)
3. Config file (`~/.config/ghost/config.toml`)

Implemented in `cmd/config.go` using Viper. Vision model configuration uses
nested structure: `vision.model` in config file, `--vision-model` flag, or
`GHOST_VISION_MODEL` env var. Web search uses `search.api-key` and
`search.max-results` following the same pattern.

### Message Flow for Images

Images are base64 encoded and analyzed separately with the vision model. Analysis
results are formatted with IMAGE_ANALYSIS blocks and appended to message history
before the main model processes everything.

Vision system prompt is designed to prevent prompt injection from image text by
treating all visible text as data, not instructions.

### Streaming Architecture

Uses a goroutine with Bubbletea message passing: streaming callbacks send
chunk/done/error messages to the StreamModel for incremental rendering.
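The producer/consumer shape of that pattern can be sketched with a plain channel. The message type names below are hypothetical stand-ins for the Bubbletea messages (the real code delivers them through the program's `Update` loop, not a raw channel):

```go
package main

import "fmt"

// chunkMsg and doneMsg stand in for the streaming messages described above.
type chunkMsg string
type doneMsg struct{}

// stream mimics the pattern: a producer goroutine sends each chunk as a
// message, and the consumer accumulates them for incremental rendering.
func stream(chunks []string) string {
	msgs := make(chan any)

	go func() {
		for _, chunk := range chunks {
			msgs <- chunkMsg(chunk)
		}
		msgs <- doneMsg{}
	}()

	var content string
	for msg := range msgs {
		switch m := msg.(type) {
		case chunkMsg:
			content += string(m) // incremental render step
		case doneMsg:
			return content
		}
	}
	return content
}

func main() {
	fmt.Println(stream([]string{"neural ", "link ", "online"})) // neural link online
}
```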

### Error Handling Pattern

All packages define custom error types (e.g., `ErrImageAnalysis`, `ErrModelNotFound`)
with cyberpunk-themed messages. Errors are wrapped using `fmt.Errorf("%w", err)`
for proper unwrapping. Theme package provides custom Fang error handler.
Nested config keys use dot notation in TOML, hyphens in flags, and `_` in env
vars (e.g., `vision.model` / `--vision-model` / `GHOST_VISION_MODEL`).

## Code Conventions

**Style**:

- Standard Go formatting (enforced by pre-commit)
- Wrap errors with `fmt.Errorf("%w", err)` for proper error chains
- Follow Go naming conventions (exported vs unexported)
- Comment struct fields and exported types
- Cyberpunk aesthetic in user-facing messages (e.g., "neural link", "data stream",
"visual recon")

**Testing**:

- One test function per code function: Test function name matches the function
being tested
- Cyberpunk aesthetic in user-facing messages (e.g., "neural link", "data stream")
- UI glyphs: `style.GlyphInfo` (󱙝) and `style.GlyphError` (󱙜)
- Table-driven tests, one test function per code function
(e.g., `TestChatModel_HandleCommandMode` tests `handleCommandMode`)
- Use table-driven tests pattern (see `cmd/root_test.go`, `internal/llm/ollama_test.go`)
- Test file naming mirrors source files (e.g., `chat_command_test.go` for `chat_command.go`)
- Test file naming mirrors source files (e.g., `chat_command_test.go`)
- Use `errors.Is()` for error comparison
- Use `t.Fatalf()` for unexpected errors, `t.Errorf()` for assertions
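The table-driven convention above looks like this in miniature. `parseCommand` is a hypothetical stand-in for the command-mode handler; in the repository the equivalent cases would live in `chat_command_test.go` as `TestChatModel_HandleCommandMode` and use `t.Fatalf`/`t.Errorf` instead of `panic`:

```go
package main

import "fmt"

// parseCommand is a hypothetical handler for `:` commands like :q and :r.
func parseCommand(input string) (string, bool) {
	switch input {
	case ":q":
		return "quit", true
	case ":r":
		return "reset", true
	default:
		return "", false
	}
}

func main() {
	// Table-driven pattern: one slice of named cases, one assertion loop.
	tests := []struct {
		name  string
		input string
		want  string
		ok    bool
	}{
		{"quit command", ":q", "quit", true},
		{"reset command", ":r", "reset", true},
		{"unknown command", ":x", "", false},
	}

	for _, tt := range tests {
		got, ok := parseCommand(tt.input)
		if got != tt.want || ok != tt.ok {
			panic(fmt.Sprintf("%s: got (%q, %v), want (%q, %v)", tt.name, got, ok, tt.want, tt.ok))
		}
	}
	fmt.Println("all cases pass")
}
```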

**Commit Messages**:
Conventional commits format (`feat:`, `fix:`, `refactor:`, `test:`, `docs:`)
- Conventional commits (`feat:`, `fix:`, `refactor:`, `test:`, `docs:`, `chore:`,
`perf:`, `build:`, `ci:`, `style:`)

## Design Principles

- **Keep it simple**: Single-file structure per package unless strong reason to
split
- **Cyberpunk aesthetic**: Match tone in user-facing messages and error messages
- **CLI-first**: Prioritize terminal experience with proper TTY detection
- **Teach, don't implement**: When helping users, explain patterns and provide
code examples rather than immediately editing files

## Documentation

### VHS Tape Files (GIF Demos)

- Located in `documentation/` directory
- Standard settings: Fish shell, 14pt font, 1200x600 dimensions
- Use `ghost` command (not `go run .`) in demos for cleaner output
- Key timing: 500ms between user actions, 12-20s for LLM response streaming
- Generate GIFs with `vhs <filename>.tape` from the documentation directory
- **Keep it simple**: Single file structure per package unless strong reason to split
- **Cyberpunk aesthetic**: Match tone in user facing messages and error messages
- **TUI first**: Prioritize terminal experience with proper TTY detection
6 changes: 3 additions & 3 deletions cmd/chat.go
@@ -13,7 +13,7 @@ import (
"github.com/theantichris/ghost/v3/internal/agent"
"github.com/theantichris/ghost/v3/internal/storage"
"github.com/theantichris/ghost/v3/internal/tool"
"github.com/theantichris/ghost/v3/internal/ui"
"github.com/theantichris/ghost/v3/internal/tui"
)

var ErrHomeDir = errors.New("failed to retrieve user home directory")
@@ -59,7 +59,7 @@ func runChat(cmd *cobra.Command, args []string) error {

prompts := cmd.Context().Value(promptKey{}).(agent.Prompt)

config := ui.ModelConfig{
config := tui.ModelConfig{
Context: cmd.Context(),
Logger: logger,
URL: viper.GetString("url"),
@@ -70,7 +70,7 @@ func runChat(cmd *cobra.Command, args []string) error {
Store: store,
}

chatModel := ui.NewChatModel(config)
chatModel := tui.NewChatModel(config)

logger.Info("entering chat", "ollama_url", config.URL, "chat_model", config.ChatLLM, "vision_model", config.VisionLLM)
program := tea.NewProgram(chatModel)
36 changes: 14 additions & 22 deletions cmd/root.go
@@ -15,21 +15,11 @@ import (
"github.com/theantichris/ghost/v3/internal/agent"
"github.com/theantichris/ghost/v3/internal/llm"
"github.com/theantichris/ghost/v3/internal/tool"
"github.com/theantichris/ghost/v3/internal/ui"
"github.com/theantichris/ghost/v3/theme"
"github.com/theantichris/ghost/v3/internal/tui"
"github.com/theantichris/ghost/v3/style"
)

const (
Version = "dev"

useText = "ghost <prompt>"
shortText = "ghost is a local cyberpunk AI assistant."
longText = `Ghost is a local cyberpunk AI assistant.
Send prompts directly or pipe data through for analysis.`
exampleText = ` ghost "explain this code" < main.go
cat error.log | ghost "what's wrong here"
ghost "tell me a joke"`
)
const Version = "dev"

type promptKey struct{}

@@ -50,11 +40,13 @@ func NewRootCmd() (*cobra.Command, func() error, error) {
var cfgFile string

cmd := &cobra.Command{
Use: useText,
Short: shortText,
Long: longText,
Example: exampleText,
Args: cobra.MinimumNArgs(1),
Use: "ghost <prompt>",
Short: "ghost is a local cyberpunk AI assistant.",
Long: "Ghost is a local cyberpunk AI Assistant.\nSend prompts directly or pipe data through for analysis.",
Example: ` ghost "explain this code" < main.go
cat error.log | ghost "what's wrong here"
ghost "tell me a joke"`,
Args: cobra.MinimumNArgs(1),
PersistentPreRunE: func(cmd *cobra.Command, args []string) error {
cmd.SetContext(context.WithValue(cmd.Context(), loggerKey{}, logger))

@@ -126,7 +118,7 @@ func run(cmd *cobra.Command, args []string) error {
maxResults := viper.GetInt("search.max-results")
registry := tool.NewRegistry(tavilyAPIKey, maxResults, logger)

config := ui.ModelConfig{
config := tui.ModelConfig{
Context: cmd.Context(),
Prompts: cmd.Context().Value(promptKey{}).(agent.Prompt),
Logger: logger,
@@ -138,7 +130,7 @@ func run(cmd *cobra.Command, args []string) error {
Images: images,
Registry: registry,
}
streamModel := ui.NewStreamModel(config)
streamModel := tui.NewStreamModel(config)

var programOpts []tea.ProgramOption
if ttyIn, ttyOut, err := tea.OpenTTY(); err == nil {
@@ -156,12 +148,12 @@ func run(cmd *cobra.Command, args []string) error {
return fmt.Errorf("%w: %w", ErrStreamDisplay, err)
}

finalModel := returnedModel.(ui.StreamModel)
finalModel := returnedModel.(tui.StreamModel)
if finalModel.Err != nil {
return finalModel.Err
}

render, err := theme.RenderContent(finalModel.Content(), format, isTTY)
render, err := style.RenderContent(finalModel.Content(), format, isTTY)
if err != nil {
return fmt.Errorf("%w: %w", ErrRender, err)
}
Binary file removed documentation/chat.gif
Binary file not shown.
31 changes: 0 additions & 31 deletions documentation/chat.tape

This file was deleted.

Binary file removed documentation/demo.gif
Binary file not shown.
13 changes: 0 additions & 13 deletions documentation/demo.tape

This file was deleted.

Binary file removed documentation/file.gif
Binary file not shown.
37 changes: 0 additions & 37 deletions documentation/file.tape

This file was deleted.

Binary file removed documentation/search.gif
Binary file not shown.
13 changes: 0 additions & 13 deletions documentation/search.tape

This file was deleted.

6 changes: 6 additions & 0 deletions internal/tool/registry.go
@@ -12,6 +12,12 @@ import (

var ErrToolNotRegistered = errors.New("tool not registered")

// Tool is the interface that all the tools the LLM uses must implement.
type Tool interface {
Definition() llm.Tool
Execute(ctx context.Context, args json.RawMessage) (string, error)
}

// Registry holds all available tools, provides their definitions to send to chat
// requests, and dispatches execution to the right tool by name.
type Registry struct {
14 changes: 0 additions & 14 deletions internal/tool/tool.go

This file was deleted.
