
Using local Ollama models doesn't return any results. #7083

@padsbanger

Question

Hello, I am running opencode with local LLMs from Ollama:

  • qwen2.5-coder:7b
  • qwen2.5-coder:32b
  • codellama:34b

OS: Omarchy
GPU: RTX 4070 Ti

When I run them on my projects (generic React CRUD apps) using the /init command, I get the following responses:

{"name": "todoread", "arguments": {}}

or

{"name": "read", "arguments": {"path": "/home/Projects/react-app/AGENTS.md"}}

Here is my opencode.json file:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (Local)",
      "options": {
        "baseURL": "http://127.0.0.1:11434/v1",
        "apiKey": "ollama"  // Optional dummy key; some setups need it
      },
      "models": {
        "qwen2.5-coder:7b": {
          "name": "Qwen2.5-Coder 7B"
        },
        "qwen2.5-coder:32b": {
          "name": "Qwen2.5-Coder 32B"
        },
        "deepseek-coder-v2": {
          "name": "DeepSeek-Coder-V2 16B"
        },
        "qwen3-coder": {
          "name": "Qwen3-Coder 30B"
        },
        "glm4": {
          "name": "GLM-4"
        },
        "codellama:34b": {
          "name": "CodeLlama 34B"
        },
        "codestral": {
          "name": "Codestral"
        },
        "gpt-oss": {
          "name": "GPT-OSS"
        }
      }
    }
  },
  "model": "ollama/qwen2.5-coder:32b" // Default: provider_id/model_id format
}
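
(The apiKey is just a dummy value that some OpenAI-compatible setups require, and "model" uses the provider_id/model_id format.) To rule out a mismatch between the keys under "models" and what Ollama actually serves, the same endpoint can list them (again a sketch with the openai package and the same assumptions as above):

# Sketch: list the model IDs Ollama serves through its OpenAI-compatible
# endpoint; these must match the keys under "models" exactly, tag included.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:11434/v1", api_key="ollama")
for m in client.models.list():
    print(m.id)  # e.g. "qwen2.5-coder:32b"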

When I switch to cloud providers such as Copilot or Grok Code Fast, I get the desired results (an AGENTS.md file is generated). Any suggestions on what I am doing wrong?
