# askdotmd

LLM chat directly in VS Code. No browser needed.
## Features

- Send the entire file or a selected region to an LLM
- Response is inserted at the cursor or replaces the selection
- Supports Claude, OpenAI, and LM Studio (local)
- Configure API keys and models in VS Code settings
## Requirements

- VS Code
- API keys: obtain keys from Anthropic for Claude or OpenAI for GPT models
- LM Studio: for local LLM use, ensure LM Studio is running at `http://localhost:1234`
- Node.js: required for building from source
## Installation

- Download the `.vsix` file from the latest release.
- Open VS Code and go to Extensions (`Ctrl+Shift+X`).
- Click `...` > **Install from VSIX** and select the downloaded file.
## Building from Source

- Clone the repository: `git clone https://github.com/your-repo.git`
- Install dependencies: `npm install`
- Build the extension: `npm run webpack`
- Package the extension: `npm run package`
- Install the generated `.vsix` file as described above.
## Configuration

Set your preferred LLM provider and API keys in VS Code settings (File > Preferences > Settings, or `Ctrl+,`):

- `askdotmd.defaultModel`: default LLM (`claude`, `openai`, or `lmstudio`)
- `askdotmd.claudeApiKey`: Anthropic API key for Claude
- `askdotmd.openaiApiKey`: OpenAI API key for GPT models
- `askdotmd.lmstudioApiKey`: (optional) API key for LM Studio
Example `settings.json`:

```json
{
  "askdotmd.defaultModel": "openai",
  "askdotmd.openaiApiKey": "your-openai-api-key",
  "askdotmd.claudeApiKey": "your-anthropic-api-key"
}
```

## Usage

- Open a file in VS Code
- Add a request in a comment (e.g., `// Generate a sorting function`)
- Trigger the command:
  - No selection: sends the entire file and inserts the response at the cursor
  - With a selection: sends only the selection and replaces it with the response
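The two response modes above can be sketched as a pure function over the document text. This is a hypothetical illustration of the behavior, not the extension's actual code; the function name and shape are assumptions.

```typescript
// Hypothetical sketch of the two response modes (not the extension's real code).
type Selection = { start: number; end: number } | null;

// With no selection, the response is inserted at the cursor offset;
// with a selection, the selected span is replaced by the response.
function applyResponse(
  doc: string,
  cursor: number,
  selection: Selection,
  response: string
): string {
  if (selection === null) {
    return doc.slice(0, cursor) + response + doc.slice(cursor);
  }
  return doc.slice(0, selection.start) + response + doc.slice(selection.end);
}
```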
Access the command:

- Command Palette (`Ctrl+Shift+P`): `askdotmd: Send Request`
- Default keybinding: `Ctrl+Shift+L` (Mac: `Cmd+Shift+L`)

Select your LLM when prompted (Claude, OpenAI, or LM Studio).
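If the default keybinding conflicts with another extension, you can rebind it in your user `keybindings.json` (File > Preferences > Keyboard Shortcuts, then open the JSON view). The command id `askdotmd.sendRequest` below is an assumption; verify the exact id in the extension's contributed commands.

```json
// keybindings.json — command id assumed; check the Keyboard Shortcuts UI for the real one
[
  {
    "key": "ctrl+alt+l",
    "command": "askdotmd.sendRequest",
    "when": "editorTextFocus"
  }
]
```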
## Limitations

- Requests should be in comments; automatic extraction is not yet implemented
- Large files may exceed token limits (select specific sections instead)
- Usage is subject to your LLM provider's rate limits and quotas