Language: English | 中文
CopilotMask is a BASE_URL customization tool for GitHub Copilot, implemented as an Ollama proxy and designed specifically for the GitHub Copilot extension in VS Code. By running this project locally or remotely, you can use custom AI API providers with GitHub Copilot, bypassing the official restrictions and using your own API keys.
- API Proxy Forwarding: Forward GitHub Copilot requests to custom AI service providers
- Multi-Model Support: Support multiple AI models, including OpenAI, Claude, DeepSeek, and more
- Reasoning Process Display: Special handling for DeepSeek to show the complete reasoning process
- Powerful Logging System: Provide detailed request and response logs for easy debugging
- Smart Configuration Management: Support priority-based configuration with .env files and environment variables
- Break Free from Subscription Limits: Use powerful AI models without a paid GitHub Copilot membership
- Payment Alternative: A practical option for users who cannot easily purchase or add official payment methods
- Easy Debugging: Gain a deep understanding of Copilot's working mechanism through the detailed logging system
- Third-Party Relay Support: Works with various third-party API relay stations
- High Performance: Built on FastAPI with excellent performance
- Python 3.8+
- pip package manager
```bash
pip install -r requirements.txt
```

Copy .env.example to a .env file:

```bash
cp .env.example .env
```

Edit the .env file to configure your API keys and service addresses (see the Configuration section for details).

```bash
python main.py
```

The service will start at http://localhost:11434.
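Before pointing Copilot at the proxy, you can confirm it is reachable. The snippet below is a minimal sketch that assumes the proxy mirrors Ollama's standard `/api/tags` model-listing endpoint on the default port:

```python
import requests

# Query the proxy the same way the Copilot Ollama integration discovers models:
# /api/tags is Ollama's standard endpoint for listing available models.
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()

for model in resp.json().get("models", []):
    # Models exposed by the proxy are expected to carry the virtual- prefix.
    print(model.get("name"))
```

Once the proxy responds and the virtual models are listed, configure Copilot as follows: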
- Open GitHub Copilot settings
- Select "Manage Models"
- Choose "Ollama"
- Select the models starting with `virtual-`
- Start using your custom API!
Model Addition Failure: If you encounter failures when adding models in GitHub Copilot, the issue is typically on the VS Code Copilot side. When model addition fails, the backend logs show no registration requests from VS Code Copilot, indicating that Copilot never attempted to contact the service.

Solution:
- First register all available Ollama models in Copilot settings
- Then unregister all models
- Restart VS Code or refresh Copilot status
- Retry adding the desired virtual models
This operation will reset Copilot's internal state and typically resolves model addition failures.
The configuration file supports an intelligent priority system (sketched below):

- When a `.env` file exists, its configuration is used preferentially
- When `.env` doesn't exist, `.env.example` is used as the configuration template
- System environment variables can supplement the file configuration
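For illustration, the priority logic can be pictured roughly as follows. This is a hedged sketch only (the real implementation lives in `config.py`); the helper name `load_settings` and the `python-dotenv` dependency are assumptions:

```python
import os
from pathlib import Path

from dotenv import dotenv_values  # from python-dotenv (assumed dependency)


def load_settings():
    """Illustrative priority: .env first, then .env.example, supplemented by os.environ."""
    # Prefer .env when it exists; otherwise fall back to the example template.
    env_file = Path(".env") if Path(".env").exists() else Path(".env.example")
    settings = dict(dotenv_values(env_file))

    # System environment variables fill in anything the file left unset.
    for key, value in os.environ.items():
        settings.setdefault(key, value)
    return settings
```

A complete example configuration file looks like this: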
```bash
# Environment mode
ENVIRONMENT=development

# Server configuration
SERVER_HOST=0.0.0.0                         # Service binding address
SERVER_PORT=11434                           # Service port (keeping 11434 is recommended)

# Ollama configuration
REAL_OLLAMA_BASE_URL=http://1.2.3.4:11434   # Real Ollama service address
OLLAMA_TIMEOUT=10.0                         # Ollama source request timeout

# Logging configuration
LOG_LEVEL=INFO                              # Log level: DEBUG, INFO, WARNING, ERROR
LOG_FILE_NAME=ollama_proxy.log              # Log file name

# OpenAI configuration
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_MAX_TOKENS=4096
OPENAI_MODELS=gpt-4o-mini,gpt-4o,gpt-3.5-turbo
OPENAI_CAPABILITIES=completion,chat,tools,vision

# Claude configuration
CLAUDE_API_KEY=your_claude_api_key_here
CLAUDE_BASE_URL=https://api.anthropic.com
CLAUDE_MAX_TOKENS=25600
CLAUDE_MODELS=claude-sonnet-4-20250514,claude-opus-4-20250514
CLAUDE_CAPABILITIES=completion,chat,tools,vision

# DeepSeek configuration
DEEPSEEK_API_KEY=your_deepseek_api_key_here
DEEPSEEK_BASE_URL=https://api.deepseek.com
DEEPSEEK_MAX_TOKENS=30000
DEEPSEEK_MODELS=deepseek-chat,deepseek-reasoner
DEEPSEEK_CAPABILITIES=completion,chat,tools
```

CAPABILITIES configuration supports two formats (a parsing sketch follows the capability list below):
- One-dimensional array (all models share the same capabilities):
  `OPENAI_CAPABILITIES=completion,chat,tools,vision`
- Two-dimensional array (each model has its own capabilities, separated by `|`):
  `OPENAI_CAPABILITIES=completion,chat,tools,vision|chat,tools|completion,chat`
Supported capability types:
- `completion`: Text completion
- `chat`: Conversational chat
- `tools`: Tool calling
- `vision`: Visual understanding
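As an illustration of how the two formats can be interpreted, the sketch below maps each configured model to its capability list. The function name `parse_capabilities` is hypothetical; this is not the project's actual parser:

```python
def parse_capabilities(models_value, caps_value):
    """Map each configured model name to its list of capabilities."""
    models = [m.strip() for m in models_value.split(",") if m.strip()]
    groups = [g.strip() for g in caps_value.split("|") if g.strip()]

    if len(groups) == 1:
        # One-dimensional form: every model shares the same capability list.
        shared = [c.strip() for c in groups[0].split(",")]
        return {model: shared for model in models}

    # Two-dimensional form: pair each model with its own capability group.
    return {model: [c.strip() for c in group.split(",")]
            for model, group in zip(models, groups)}


print(parse_capabilities("gpt-4o-mini,gpt-4o,gpt-3.5-turbo",
                         "completion,chat,tools,vision|chat,tools|completion,chat"))
# {'gpt-4o-mini': ['completion', 'chat', 'tools', 'vision'],
#  'gpt-4o': ['chat', 'tools'],
#  'gpt-3.5-turbo': ['completion', 'chat']}
```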
The project is organized into the following modules:

- `main.py`: Main application entry point; creates and starts the FastAPI application
- `config.py`: Configuration management module; intelligent loading of environment variables
- `model_manager.py`: Model manager; handles virtual model registration and client initialization
- `routes.py`: Route handler; defines API routes and request forwarding logic
- `api_handlers.py`: API handlers; process API calls from different service providers
- `middleware.py`: Middleware; handles request logging and response processing
- `logger.py`: Logging system; provides structured logging
- `utils.py`: Utility functions; contains various helper functions
- Request Reception: GitHub Copilot sends OpenAI format requests
- Model Recognition: Detect if the model name in the request is a virtual model
- Request Forwarding (see the sketch after this list):
  - Virtual model → forward to the corresponding remote API service provider
  - Non-virtual model → forward to the Ollama source
- Format Conversion: Handle conversion between different API formats (e.g., Claude)
- Response Return: Convert response to OpenAI format and return to Copilot
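As a rough illustration of the forwarding step, a route handler along these lines could inspect the model name and pick an upstream target. This is a simplified sketch under assumed names (`VIRTUAL_PREFIX`, the hard-coded provider URL, and the omitted auth headers); the actual logic lives in `routes.py` and `api_handlers.py`:

```python
import httpx
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()
VIRTUAL_PREFIX = "virtual-"                    # assumed marker for proxy-managed models
REAL_OLLAMA_BASE_URL = "http://1.2.3.4:11434"  # real Ollama source from the config


@app.post("/v1/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()
    model = body.get("model", "")

    async with httpx.AsyncClient(timeout=60.0) as client:
        if model.startswith(VIRTUAL_PREFIX):
            # Virtual model: strip the prefix and forward to the remote provider
            # (provider lookup and authentication headers omitted in this sketch).
            body["model"] = model[len(VIRTUAL_PREFIX):]
            upstream = await client.post(
                "https://api.openai.com/v1/chat/completions", json=body)
        else:
            # Non-virtual model: pass the request through to the real Ollama source.
            upstream = await client.post(
                f"{REAL_OLLAMA_BASE_URL}/v1/chat/completions", json=body)

    return JSONResponse(status_code=upstream.status_code, content=upstream.json())
```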
The project includes special handling for DeepSeek and can display the complete reasoning process, letting you follow the model's chain of thought.
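For context, the DeepSeek API returns the chain of thought in a separate `reasoning_content` field on `deepseek-reasoner` responses, alongside the final answer in `content`. The snippet below is a minimal sketch using the OpenAI-compatible client directly, not the project's own handler:

```python
from openai import OpenAI

client = OpenAI(api_key="your_deepseek_api_key_here",
                base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[{"role": "user",
               "content": "Why might this recursive function overflow the stack?"}],
)

message = response.choices[0].message
# deepseek-reasoner exposes its reasoning separately from the final answer.
print("Reasoning:\n", getattr(message, "reasoning_content", None))
print("Answer:\n", message.content)
```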
The configuration system supports complex priority management, allowing flexible use of different configurations in different environments.
The project provides a powerful logging system (see the sketch after this list):
- Structured logging
- Detailed tracking of requests and responses
- Support for different log levels
- Dual output to file and console
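A setup in this spirit, with a structured format and dual file/console output, could look like the sketch below. It is illustrative only and is not the actual contents of `logger.py`:

```python
import logging


def setup_logger(name="copilotmask", log_file="ollama_proxy.log", level="INFO"):
    """Configure a logger that writes the same structured lines to a file and the console."""
    formatter = logging.Formatter(
        "%(asctime)s | %(levelname)s | %(name)s | %(message)s")

    logger = logging.getLogger(name)
    logger.setLevel(getattr(logging, level.upper(), logging.INFO))

    file_handler = logging.FileHandler(log_file, encoding="utf-8")
    file_handler.setFormatter(formatter)

    console_handler = logging.StreamHandler()
    console_handler.setFormatter(formatter)

    logger.addHandler(file_handler)
    logger.addHandler(console_handler)
    return logger


logger = setup_logger(level="DEBUG")
logger.debug("Forwarding request to provider: model=deepseek-chat")
```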
Planned improvements include:

- GUI Interface: Graphical configuration and management interface
- System Service: Support running as a system service
- Serverless Adaptation: Adapt to serverless platforms such as function compute services
- Forwarding Configuration Station: Create a server for user_id-based configuration mapping
Welcome to submit Issues and Pull Requests to help improve the project!
This project is licensed under the MIT License - see the LICENSE file for details.
If you encounter problems during use, please:
- Check if the configuration file is correct
- Check the log file for error information
- Submit an Issue describing the problem
Enjoy the freedom and convenience brought by CopilotMask!