Part of BlackRoad OS — Sovereign Computing for Everyone
ollama models is part of the BlackRoad OS ecosystem — a sovereign, distributed operating system built on edge computing, local AI, and mesh networking by BlackRoad OS, Inc.
| Org | Focus |
|---|---|
| BlackRoad OS | Core platform |
| BlackRoad OS, Inc. | Corporate |
| BlackRoad AI | AI/ML |
| BlackRoad Hardware | Edge hardware |
| BlackRoad Security | Cybersecurity |
| BlackRoad Quantum | Quantum computing |
| BlackRoad Agents | AI agents |
| BlackRoad Network | Mesh networking |
Website: blackroad.io | Chat: chat.blackroad.io | Search: search.blackroad.io
Local LLM inference with custom model configurations
```sh
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Pull BlackRoad models
./scripts/pull-all.sh

# Run a model
ollama run blackroad-coder
```

Build the custom models from their Modelfiles:

```sh
ollama create blackroad-coder -f models/blackroad-coder.Modelfile
ollama create blackroad-analyst -f models/blackroad-analyst.Modelfile
ollama create blackroad-agent -f models/blackroad-agent.Modelfile
```

| Model | Base | Size | Quantization | Purpose |
|---|---|---|---|---|
| blackroad-coder | codellama:34b | 19GB | Q4_K_M | Code gen |
| blackroad-analyst | mistral:7b | 4.1GB | Q4_0 | Analysis |
| blackroad-agent | llama3:8b | 4.7GB | Q4_0 | Tasks |
| blackroad-phi | phi3:mini | 2.3GB | Q4_0 | Fast inference |
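The `ollama create` commands above each reference a Modelfile under `models/`. Those files are not shown in this README; as a sketch only, `models/blackroad-coder.Modelfile` might look like the following, using standard Ollama Modelfile directives and the base model from the table (the parameter values and system prompt are illustrative, not confirmed by this repo):

```
# Base model from the table above
FROM codellama:34b

# Sampling parameters (illustrative values)
PARAMETER temperature 0.2
PARAMETER num_ctx 4096

# Hypothetical system prompt
SYSTEM """You are BlackRoad Coder, a code-generation assistant."""
```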
```python
import ollama

response = ollama.chat(model='blackroad-coder', messages=[
    {'role': 'user', 'content': 'Write a Python function for Fibonacci'}
])
print(response['message']['content'])
```

Deploy across all BlackRoad devices:
```sh
for host in cecilia lucidia alice aria; do
  ssh $host 'ollama pull mistral:7b'
done
```

BlackRoad OS - Local AI Power
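The same rollout can be driven from Python with a dry-run mode, which is handy for checking the commands before touching the fleet. A minimal sketch, assuming the hostnames from the loop above; `deploy` simply shells out to `ssh`:

```python
import subprocess

# Hostnames from the deploy loop above
HOSTS = ["cecilia", "lucidia", "alice", "aria"]

def pull_commands(model="mistral:7b"):
    """Build one ssh command per host to pull the given model."""
    return [["ssh", host, f"ollama pull {model}"] for host in HOSTS]

def deploy(model="mistral:7b", dry_run=True):
    """Print the commands (dry run) or execute them over ssh."""
    for cmd in pull_commands(model):
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, check=True)

deploy(dry_run=True)  # prints one ssh command per host
```

Flipping `dry_run=False` performs the actual per-host pulls, mirroring the shell loop.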