Navy AI is a terminal-based AI assistant designed for local-first usage, with optional cloud providers.
It works entirely from the command line and prioritizes privacy, cost control, and clarity.
🟢 Default mode is FREE and OFFLINE using local AI.
- 🖥️ Clean, modern CLI
- 🧠 Local AI via Ollama (free, offline)
- ☁️ Cloud AI via Gemini (free tier available)
- 💳 Optional OpenAI support (paid, opt-in)
- 🔁 Argument mode & interactive mode
- 🎨 Styled terminal output
- 🔌 Extensible provider system
- 🔐 Secure by default (no keys in code)
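The "extensible provider system" could be modeled as a small plug-in interface. The sketch below is hypothetical — `Provider`, `EchoProvider`, and the `PROVIDERS` registry are illustrative names, not Navy AI's actual code:

```python
from abc import ABC, abstractmethod

class Provider(ABC):
    """Hypothetical base class for a pluggable AI provider."""

    @abstractmethod
    def ask(self, prompt: str, model: str) -> str:
        """Send a prompt to the backing model and return its reply."""

class EchoProvider(Provider):
    """Toy provider, included only to show the plug-in shape."""

    def ask(self, prompt: str, model: str) -> str:
        return f"[{model}] you asked: {prompt}"

# Registry mapping --provider names to implementations.
PROVIDERS = {"echo": EchoProvider()}

def ask(provider: str, model: str, prompt: str) -> str:
    return PROVIDERS[provider].ask(prompt, model)

print(ask("echo", "demo", "hi"))  # -> [demo] you asked: hi
```

Adding a new backend then only means implementing `ask` and registering the class — the CLI front end stays unchanged.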
- Python 3.9+
- (Optional) Ollama for local AI
```
pip install navy-ai
```

Verify Installation

After installing Navy AI, verify that the CLI is available:
```
navy-ai --help
```

Navy AI works in two modes:
- Argument mode – single command
- Interactive mode – chat-style session
Ask a question directly from the terminal:
```
navy-ai "what is zero trust security?"
```

Example output:
```
Zero Trust is a security model that assumes no implicit trust...
```

Start an interactive session:
```
navy-ai
```

You will see:
```
Navy AI >
```

Then type your questions:
```
Navy AI > what is a cpu
Navy AI > explain zero trust
Navy AI > exit
```

| Provider | Cost | Internet | Notes |
|---|---|---|---|
| Ollama | Free | ❌ No | Local, offline, recommended |
| Gemini | Free tier | ✅ Yes | Google AI Studio |
| OpenAI | Paid | ✅ Yes | Requires billing |
🟢 Default provider is Ollama (local-first).
Ollama allows you to run AI models locally and offline.
```
ollama pull mistral
ollama pull qwen2.5-coder:7b
ollama pull llama3
...
```

Explicit provider + model:
```
navy-ai --provider ollama --model mistral "what is cpu"
```

Or simply:
```
navy-ai "what is cpu"
```

➡️ Ollama is the default provider.
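Under the hood, a client like this presumably talks to the local Ollama server over its HTTP API, which listens on `http://localhost:11434`. A minimal standalone sketch using only the standard library (this is not Navy AI's source — `ask_ollama` and `build_generate_request` are names chosen here for illustration):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(prompt: str, model: str = "mistral") -> dict:
    """Payload for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str, model: str = "mistral") -> str:
    """Send one prompt to a locally running Ollama server and return the reply."""
    data = json.dumps(build_generate_request(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# Requires `ollama serve` running and the model pulled, e.g.:
# print(ask_ollama("what is cpu"))
```

Because the server runs locally, no request ever leaves your machine — which is what makes this mode free and offline.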
Keys must be created from Google AI Studio:
👉 https://aistudio.google.com/app/apikey
Windows (PowerShell)
```
setx GEMINI_API_KEY "AIzaSyXXXX"
```

Note: `setx` persists the variable but does not affect the current session — open a new terminal before use.

macOS (Terminal)

Set the variable (temporary – current session only):
```
export GEMINI_API_KEY="AIzaSyXXXX"
```

Make it persistent (recommended). For zsh (default on modern macOS):
```
echo 'export GEMINI_API_KEY="AIzaSyXXXX"' >> ~/.zshrc
```

For bash:
```
echo 'export GEMINI_API_KEY="AIzaSyXXXX"' >> ~/.bashrc
```

Restart the terminal (or run `source ~/.zshrc` / `source ~/.bashrc`).
Verify:
```
echo $GEMINI_API_KEY
```

Linux (Terminal)
Set the variable (temporary – current session only):
```
export GEMINI_API_KEY="AIzaSyXXXX"
```

Make it persistent:
For bash:
```
echo 'export GEMINI_API_KEY="AIzaSyXXXX"' >> ~/.bashrc
```

For zsh:
```
echo 'export GEMINI_API_KEY="AIzaSyXXXX"' >> ~/.zshrc
```

Restart the terminal (or run `source ~/.bashrc` / `source ~/.zshrc`).
Verify:
```
echo $GEMINI_API_KEY
```

Use Gemini:

```
navy-ai --provider gemini
```

Recommended model:
```
navy-ai --provider gemini --model gemini-2.5-flash
```

OpenAI requires billing to be enabled.
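For reference, "no keys in code" means the program reads the key from the environment at runtime. A minimal sketch of that pattern (Navy AI's internal handling may differ; `get_api_key` is a name chosen here, and the `AIzaSyXXXX` value is the placeholder from the steps above):

```python
import os

def get_api_key(name: str = "GEMINI_API_KEY") -> str:
    """Read an API key from the environment; fail loudly if it is missing."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(f"{name} is not set")
    return key

os.environ["GEMINI_API_KEY"] = "AIzaSyXXXX"  # placeholder for this demo only
print(get_api_key())  # -> AIzaSyXXXX
```

Failing with a clear message when the variable is unset is what lets the CLI stay "secure by default" without ever embedding a key in source or config files.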
👉 https://platform.openai.com/api-keys
👉 https://platform.openai.com/account/billing
Windows (PowerShell)
```
setx OPENAI_API_KEY "sk-xxxx"
```

macOS (Terminal)

Set the variable (temporary – current session only):
```
export OPENAI_API_KEY="sk-xxxx"
```

Make it persistent (recommended). For zsh (default on modern macOS):
```
echo 'export OPENAI_API_KEY="sk-xxxx"' >> ~/.zshrc
```

For bash:
```
echo 'export OPENAI_API_KEY="sk-xxxx"' >> ~/.bashrc
```

Restart the terminal (or run `source ~/.zshrc` / `source ~/.bashrc`).
Verify:
```
echo $OPENAI_API_KEY
```

Linux (Terminal)
Set the variable (temporary – current session only):
```
export OPENAI_API_KEY="sk-xxxx"
```

Make it persistent:
For bash:
```
echo 'export OPENAI_API_KEY="sk-xxxx"' >> ~/.bashrc
```

For zsh:
```
echo 'export OPENAI_API_KEY="sk-xxxx"' >> ~/.zshrc
```

Restart the terminal (or run `source ~/.bashrc` / `source ~/.zshrc`).
Verify:
```
echo $OPENAI_API_KEY
```

Use OpenAI:

```
navy-ai --provider openai --model gpt-3.5-turbo "explain zero trust"
```

If a request fails with `429 Too Many Requests`, you have hit the provider's rate limit — wait and retry, or check your quota and billing.

CLI usage:

```
navy-ai [OPTIONS] [PROMPT]
```

| Option | Description |
|---|---|
| `--provider` | AI provider to use (default: ollama) |
| `--model` | Provider-specific model |
| `--help` | Show help |
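The options above map naturally onto `argparse`. This is a hypothetical reconstruction of the interface for illustration, not the project's actual source (`build_parser` is a name chosen here; the defaults follow the table and the "Ollama is the default provider" note):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Parser mirroring the documented navy-ai interface (hypothetical)."""
    parser = argparse.ArgumentParser(prog="navy-ai")
    parser.add_argument("--provider", default="ollama",
                        choices=["ollama", "gemini", "openai"],
                        help="AI provider to use (default: ollama)")
    parser.add_argument("--model", help="Provider-specific model")
    parser.add_argument("prompt", nargs="?",
                        help="Question to ask; omit to start interactive mode")
    return parser

args = build_parser().parse_args(["--provider", "gemini",
                                  "--model", "gemini-2.5-flash", "what is cpu"])
print(args.provider, args.model, args.prompt)  # -> gemini gemini-2.5-flash what is cpu
```

Making the prompt optional (`nargs="?"`) is what lets one entry point serve both argument mode and interactive mode.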
```
navy-ai "hi!"
navy-ai --provider ollama --model qwen2.5-coder:7b
navy-ai --provider gemini --model gemini-2.5-flash
navy-ai --provider openai --model gpt-3.5-turbo
```
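If a cloud provider answers `429 Too Many Requests` (see the OpenAI section above), the usual client-side remedy is exponential backoff: wait, then retry with a doubling delay. A generic sketch of the pattern — not Navy AI's actual behavior, and using `RuntimeError` as a stand-in for a real HTTP 429 exception:

```python
import time

def with_backoff(call, retries: int = 3, base_delay: float = 1.0):
    """Retry `call` after a rate-limit failure, doubling the delay each time."""
    for attempt in range(retries):
        try:
            return call()
        except RuntimeError:  # stand-in for an HTTP 429 error
            if attempt == retries - 1:
                raise  # out of retries: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))

# Demo: a call that fails twice with a rate-limit error, then succeeds.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("429 Too Many Requests")
    return "ok"

print(with_backoff(flaky, base_delay=0.01))  # -> ok
```

Backing off instead of hammering the endpoint keeps you inside the free tier's limits rather than compounding the problem.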