Open
Description
VSCode, when using the Ollama API proxy as an upstream server, fails with the error "Failed to fetch models from Ollama" and produces the following log:
[2/23/2026, 3:52:08 AM] [dotenv@17.3.1] injecting env (0) from .env -- tip: 🛡 auth for agents: https://vestauth.com
[2/23/2026, 3:52:08 AM] ✅ Loaded models from /application/models.json
[2/23/2026, 3:52:08 AM] 🚀 Ollama Proxy with Streaming running on http://localhost:11434
[2/23/2026, 3:52:08 AM] 🔑 Providers: openrouter
[2/23/2026, 3:52:08 AM] 📋 Available models: minimax-m2.5, free-then-cheap, glm-4.7-flash
[2/23/2026, 3:52:12 AM] ℹ GET /api/version
[2/23/2026, 3:52:12 AM] ℹ GET /api/tags
[2/23/2026, 3:52:12 AM] ℹ POST /api/show