Create a VS Code extension that exposes GitHub Copilot through Ollama-compatible API endpoints, solving the token extraction problem by using VS Code's official Language Model API.
copilot-ollama-bridge/
├── package.json       # Extension manifest and dependencies
├── tsconfig.json      # TypeScript configuration
├── README.md          # Complete documentation
├── setup.sh           # Automated installation script
├── test-api.sh        # API testing script
├── .vscode/
│   └── launch.json    # VS Code development configuration
└── src/
    └── extension.ts   # Main extension implementation
- `GET /api/tags`: List available models
- `POST /api/generate`: Generate text completions
- `POST /api/chat`: Chat-style completions
- 100% compatible with existing Ollama clients
- Uses the official `vscode.lm.selectChatModels()` API
- Automatic GitHub Copilot authentication
- Status bar integration with server status
- Output channel for request logging
- Command palette controls
- No filesystem token hunting
- No OS-specific path handling
- No brittle storage format parsing
- Official authentication flow through VS Code
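A minimal sketch of how the extension can obtain and call a Copilot model through the Language Model API. The helper names `getCopilotModel` and `complete` are hypothetical, and this snippet only runs inside a VS Code extension host (it requires the `vscode` module):

```typescript
import * as vscode from "vscode";

// Pick the first available Copilot chat model. selectChatModels() triggers
// VS Code's built-in consent dialog the first time an extension uses it.
async function getCopilotModel(): Promise<vscode.LanguageModelChat> {
  const [model] = await vscode.lm.selectChatModels({ vendor: "copilot" });
  if (!model) {
    throw new Error("No Copilot model available - is Copilot signed in?");
  }
  return model;
}

// Send a prompt and collect the streamed reply into a single string.
async function complete(
  prompt: string,
  token: vscode.CancellationToken
): Promise<string> {
  const model = await getCopilotModel();
  const messages = [vscode.LanguageModelChatMessage.User(prompt)];
  const response = await model.sendRequest(messages, {}, token);
  let text = "";
  for await (const chunk of response.text) {
    text += chunk;
  }
  return text;
}
```

Because `selectChatModels()` handles authentication and consent, none of the token-extraction logic listed above is needed.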
1. Set up the extension:
   cd copilot-ollama-bridge
   ./setup.sh
2. Restart VS Code: the extension auto-starts
3. Configure Cline:
   - Ollama URL: http://localhost:11434
   - Model: copilot:latest
4. Test the API:
   ./test-api.sh
- ❌ Brittle filesystem token hunting
- ❌ OS-specific path variations
- ❌ Storage format changes breaking the app
- ❌ Authentication edge cases
- ❌ No user consent handling
- ✅ Official VS Code Language Model API
- ✅ Automatic authentication
- ✅ Proper user consent dialogs
- ✅ Stable, supported interface
- ✅ Future-proof implementation
1. Install and start the extension
2. Configure Cline with:
   - URL: http://localhost:11434
   - Model: copilot:latest
3. Use Cline normally: it now uses Copilot through the bridge
Cline → HTTP Request → VS Code Extension → Language Model API → GitHub Copilot
                              ↕
                  Ollama Format ↔ VS Code Format
The extension acts as a translation layer between Ollama's API format and VS Code's Language Model API.
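As a concrete illustration of that translation layer, the request/response mapping might look like the following. The Ollama-side field names follow its public chat API; the intermediate `{ role, text }` shape is a simplified stand-in for VS Code's `LanguageModelChatMessage`, and both function names are hypothetical:

```typescript
// Ollama-style /api/chat request and response envelopes.
interface OllamaChatRequest {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
}

interface OllamaChatResponse {
  model: string;
  created_at: string;
  message: { role: "assistant"; content: string };
  done: boolean;
}

// Flatten the conversation into the message list the VS Code API consumes.
function toVsCodeMessages(
  req: OllamaChatRequest
): { role: string; text: string }[] {
  return req.messages.map((m) => ({ role: m.role, text: m.content }));
}

// Wrap the accumulated model output back into Ollama's response envelope.
function toOllamaResponse(model: string, text: string): OllamaChatResponse {
  return {
    model,
    created_at: new Date().toISOString(),
    message: { role: "assistant", content: text },
    done: true,
  };
}

const reply = toOllamaResponse("copilot:latest", "Hello!");
console.log(reply.message.content); // prints "Hello!"
```

Keeping the mapping in pure functions like these makes the HTTP handlers thin: each endpoint just parses the Ollama request, calls the Language Model API, and wraps the result.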
- Start Copilot Bridge Server - Start the API server
- Stop Copilot Bridge Server - Stop the API server
- Restart Copilot Bridge Server - Restart the API server
- Status Bar: Shows server status (🤖 icon)
- Output Channel: "Copilot Bridge" for detailed logs
- Web Interface: http://localhost:11434 for documentation
- Reliable: Uses official, stable APIs
- Secure: Proper authentication flow
- Compatible: Drop-in Ollama replacement
- Maintainable: VS Code handles authentication
- Future-proof: Official API support
This approach completely eliminates the token extraction problem and provides a robust, officially supported way to use GitHub Copilot with Ollama-compatible tools like Cline.