Errors when using with VSCode GitHub Copilot #10

@Spitfire1900

Description

When using the Ollama API proxy as an upstream server, VSCode gets the error "Failed to fetch models from Ollama", with the following log:

[2/23/2026, 3:52:08 AM] [dotenv@17.3.1] injecting env (0) from .env -- tip: 🛡 auth for agents: https://vestauth.com
[2/23/2026, 3:52:08 AM] ✅ Loaded models from /application/models.json
[2/23/2026, 3:52:08 AM] 🚀 Ollama Proxy with Streaming running on http://localhost:11434
[2/23/2026, 3:52:08 AM] 🔑 Providers: openrouter
[2/23/2026, 3:52:08 AM] 📋 Available models: minimax-m2.5, free-then-cheap, glm-4.7-flash
[2/23/2026, 3:52:12 AM] ℹ GET /api/version
[2/23/2026, 3:52:12 AM] ℹ GET /api/tags
[2/23/2026, 3:52:12 AM] ℹ POST /api/show
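The log shows the client reaching `/api/version`, `/api/tags`, and `/api/show` before failing, and "Failed to fetch models" usually points at the `/api/tags` response not matching the shape Ollama returns (a top-level `"models"` array of objects). Here is a minimal sketch for checking a proxy's `/api/tags` body offline; `REQUIRED_KEYS` is an assumption about the minimal fields a client reads, not a confirmed Copilot schema:

```python
import json

# Ollama's /api/tags returns {"models": [{"name": ..., "model": ..., ...}]}.
# A proxy that returns a bare list, or omits the "models" key, can make
# clients report "Failed to fetch models".
# REQUIRED_KEYS is an assumption: the minimal per-model fields a client
# is likely to read; verify against the actual client parser.
REQUIRED_KEYS = {"name", "model"}

def check_tags_payload(raw: str) -> list[str]:
    """Return a list of problems found in an /api/tags response body."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"response is not valid JSON: {exc}"]
    if not isinstance(payload, dict) or "models" not in payload:
        return ['missing top-level "models" key']
    problems = []
    for i, entry in enumerate(payload["models"]):
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            problems.append(f"models[{i}] missing keys: {sorted(missing)}")
    return problems

# A bare list instead of {"models": [...]} is flagged:
print(check_tags_payload('[{"name": "glm-4.7-flash"}]'))
```

Running this against the proxy's actual `/api/tags` body (e.g. saved via `curl http://localhost:11434/api/tags`) would show whether the payload shape is the culprit.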
