Labels: community-opportunity, enhancement, good first issue, help wanted
Description
The Problem
engine/openclaw.template.json is currently wired for NVIDIA's API (Kimi K2.5). This is a great default — free tier, 131K context, fast — but it's the single biggest setup friction point for users who don't have an NVIDIA account or want to use a different provider.
What's Needed
Tested, working openclaw.json config blocks for additional providers. Each one should be a drop-in replacement for the model + provider block in the main template.
High priority:
- OpenRouter — broadest model selection, single API key for everything
- Groq — fastest inference, free tier, good for Llama models
- Ollama — fully local, no API key, no cost. The "air-gapped" use case.
- Anthropic — Claude as the AI running the framework (meta, but valid)
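To illustrate the kind of drop-in block being requested for the providers above, here is a sketch of a local Ollama config. The field names (`model`, `provider`, `baseUrl`, `apiKey`) are assumed from the template's structure rather than a verified schema, and the model tag is just an example; Ollama's OpenAI-compatible endpoint does listen at `localhost:11434/v1` by default, with the API key ignored.

```json
{
  "model": "llama3.1:8b",
  "provider": {
    "baseUrl": "http://localhost:11434/v1",
    "apiKey": "ollama"
  }
}
```

Note the key is a placeholder: Ollama requires no authentication, but OpenAI-style clients often reject an empty key field.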
Format
Each provider config should be a standalone snippet showing:
- The `model` field value
- The `provider` block with `baseUrl` and auth pattern
- Any provider-specific quirks (context limits, rate limits, unsupported features)
- Which models are recommended and why
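A hypothetical OpenRouter entry following that format might look like the block below. The structure mirrors the template's model + provider layout, which is an assumption on my part; the base URL is OpenRouter's documented OpenAI-compatible endpoint, and the model slug is one example among many.

```json
{
  "model": "anthropic/claude-3.5-sonnet",
  "provider": {
    "baseUrl": "https://openrouter.ai/api/v1",
    "apiKey": "YOUR_API_KEY_HERE"
  }
}
```

A full PROVIDERS.md entry would pair a snippet like this with the quirks and model recommendations listed above.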
These would live in a new docs/PROVIDERS.md file and be referenced from CONFIG_REFERENCE.md.
How To Contribute
- Get the framework running on a non-NVIDIA provider
- Document your working config (sanitize your API key; use `YOUR_API_KEY_HERE`)
- Note anything that behaved differently from the NVIDIA setup
- Open a PR adding your provider block to `docs/PROVIDERS.md`
Even a single working provider config is a meaningful contribution — each one removes a setup blocker for a different group of users.
Status
- [x] NVIDIA (Kimi K2.5): shipped, production-validated
- [ ] OpenRouter
- [ ] Groq
- [ ] Ollama
- [ ] Anthropic