An open agent skill that brings DeepSeek's power to 15+ AI coding platforms. Uses DeepSeek V3.2 now — seamless, zero-config upgrade to V4 when it launches.
DeepSeek V3.2 is one of the most capable open-weight LLMs for coding, reasoning, and analysis. This skill makes it instantly available across every major AI coding platform — Claude Code, Cursor, GitHub Copilot, Codex CLI, Gemini CLI, Windsurf, and more.
When DeepSeek V4 launches, this skill upgrades automatically. No config changes, no reinstall, no downtime. Just better results.
Key value:
- One skill, 15+ platforms — install once, use everywhere
- OpenAI-compatible API — works with any tool that speaks OpenAI format
- Future-proof — automatic V3.2 → V4 upgrade path
- Cost-effective — $0.26/M input tokens, a fraction of GPT-4o pricing
- No rate limits — powered by Atlas Cloud's infrastructure
```
npx skills add thoughtincode/deepseek-v4-skill
```

Or clone manually:

```
git clone https://github.com/thoughtincode/deepseek-v4-skill.git
cd deepseek-v4-skill
```

Set your API key:

```
export ATLAS_API_KEY="your-api-key-here"
```

Get your API key at atlascloud.ai — 25% bonus on first top-up.
This skill works with 15+ AI coding platforms out of the box:
| Platform | Type | Status |
|---|---|---|
| Claude Code | CLI Agent | ✅ Full support |
| Cursor | IDE Agent | ✅ Full support |
| GitHub Copilot | IDE Extension | ✅ Full support |
| OpenAI Codex CLI | CLI Agent | ✅ Full support |
| Gemini CLI | CLI Agent | ✅ Full support |
| Windsurf | IDE Agent | ✅ Full support |
| Kiro | IDE Agent | ✅ Full support |
| OpenCode | CLI Agent | ✅ Full support |
| Cline | IDE Extension | ✅ Full support |
| Aider | CLI Agent | ✅ Full support |
| Continue | IDE Extension | ✅ Full support |
| Roo Code | IDE Extension | ✅ Full support |
| AugmentCode | IDE Extension | ✅ Full support |
| Amazon Q Developer | IDE Agent | ✅ Full support |
| Tabnine | IDE Extension | ✅ Full support |
| Custom Agents | Any OpenAI-compatible | ✅ Full support |
Claude Code
The CLAUDE.md file is automatically detected. Just add the skill to your project:
```
npx skills add thoughtincode/deepseek-v4-skill
export ATLAS_API_KEY="your-key"
```

Claude Code will use the skill when you ask it to run DeepSeek tasks.
Cursor
Add to your .cursor/rules or project settings:
```
Use DeepSeek V4 skill for code analysis and generation tasks.
API: POST https://api.atlascloud.ai/v1/chat/completions
Model: deepseek/deepseek-v3.2
```
GitHub Copilot / Codex CLI
The skill's OpenAI-compatible endpoint works directly:
```
export OPENAI_API_BASE=https://api.atlascloud.ai/v1
export OPENAI_API_KEY=$ATLAS_API_KEY
```

Gemini CLI / Windsurf / Kiro
Reference the SKILL.md file in your agent configuration. The OpenAI-compatible format means zero adapter code needed.
Aider
```
aider --openai-api-base https://api.atlascloud.ai/v1 \
  --openai-api-key $ATLAS_API_KEY \
  --model deepseek/deepseek-v3.2
```

Any OpenAI-Compatible Tool
Base URL: https://api.atlascloud.ai/v1
Model: deepseek/deepseek-v3.2
Auth: Bearer $ATLAS_API_KEY
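For tools without an SDK, a plain `fetch` call against the same endpoint works. A minimal sketch (the `buildRequest` and `callDeepSeek` helper names are illustrative, not part of the skill):

```typescript
// Minimal raw-HTTP sketch against the OpenAI-compatible endpoint.
// buildRequest is a pure helper; callDeepSeek does the actual network call.
const BASE_URL = process.env.ATLAS_BASE_URL ?? "https://api.atlascloud.ai/v1";

function buildRequest(prompt: string, apiKey: string) {
  return {
    url: `${BASE_URL}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: "deepseek/deepseek-v3.2",
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

async function callDeepSeek(prompt: string): Promise<string> {
  const { url, init } = buildRequest(prompt, process.env.ATLAS_API_KEY ?? "");
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```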
- Generate production-ready code in 50+ languages
- Explain complex algorithms and data structures
- Refactor and optimize existing code
- Convert code between languages
- Automated PR review with actionable feedback
- Security vulnerability detection
- Performance bottleneck identification
- Best practices enforcement
- Multi-step problem solving
- Architecture design decisions
- Algorithm complexity analysis
- System design and trade-off evaluation
- Generate API documentation
- Summarize large codebases
- Create technical specifications
- Write inline comments and docstrings
- 1M+ token context window — analyze entire repositories at once
- Enhanced coding benchmarks — expected top-tier performance
- Faster inference — reduced latency for real-time coding
- Improved instruction following — better at complex multi-step tasks
The skill includes a standalone CLI for direct interaction:
```
npx deepseek-v4 --prompt "Explain the difference between TCP and UDP"

npx deepseek-v4 --prompt "Write a Redis connection pool in Go with retry logic" \
  --system "You are an expert Go developer. Write production-ready code."

npx deepseek-v4 --prompt "Review this code for bugs and improvements: $(cat src/main.ts)" \
  --system "You are a senior code reviewer. Focus on bugs, security, and performance."

npx deepseek-v4 --prompt "Summarize this codebase architecture: $(find src -name '*.ts' -exec cat {} \;)" \
  --max-tokens 2000
```

| Option | Default | Description |
|---|---|---|
| `--prompt` | (required) | The prompt to send |
| `--model` | `deepseek/deepseek-v3.2` | Model identifier |
| `--temperature` | `0.7` | Sampling temperature (0-2) |
| `--max-tokens` | `4096` | Maximum response tokens |
| `--system` | (none) | System prompt for role/behavior |
| Variable | Required | Description |
|---|---|---|
| `ATLAS_API_KEY` | Yes | Your Atlas Cloud API key |
| `ATLAS_BASE_URL` | No | Custom API base URL (default: `https://api.atlascloud.ai/v1`) |
This skill follows the open agent skill pattern — a standardized way to add capabilities to AI coding agents. The skill provides:
- `SKILL.md` — Machine-readable skill description with trigger conditions and API details
- `CLAUDE.md` — Claude Code-specific configuration
- `src/cli.ts` — Standalone CLI for direct usage and agent tool-calling
- `package.json` — Standard npm metadata for discoverability
When an agent encounters a task like "analyze this code with DeepSeek", resolution works as follows:
```
User Request
    ↓
Agent reads SKILL.md / CLAUDE.md
    ↓
Matches trigger: "code analysis", "DeepSeek", "review"
    ↓
Calls Atlas Cloud API (OpenAI-compatible)
    ↓
Returns DeepSeek V3.2 response
    ↓
(Seamless V4 upgrade when available)
```
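The trigger-matching step can be sketched as a simple keyword check. The trigger list and helper below are illustrative, not the skill's actual implementation:

```typescript
// Illustrative trigger matching: case-insensitive substring check against
// the trigger phrases declared in SKILL.md.
const TRIGGERS = ["code analysis", "deepseek", "review"];

function matchesTrigger(request: string): boolean {
  const text = request.toLowerCase();
  return TRIGGERS.some((trigger) => text.includes(trigger));
}
```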
```typescript
// Use from any OpenAI-compatible client
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.ATLAS_API_KEY,
  baseURL: "https://api.atlascloud.ai/v1",
});

const response = await client.chat.completions.create({
  model: "deepseek/deepseek-v3.2",
  messages: [
    { role: "system", content: "You are a senior software engineer." },
    { role: "user", content: "Review this pull request for security issues..." },
  ],
  temperature: 0.7,
  max_tokens: 4096,
});

console.log(response.choices[0].message.content);
```

POST https://api.atlascloud.ai/v1/chat/completions
Authorization: Bearer YOUR_ATLAS_API_KEY
Content-Type: application/json
{
"model": "deepseek/deepseek-v3.2",
"messages": [
{
"role": "system",
"content": "You are a helpful coding assistant."
},
{
"role": "user",
"content": "Write a binary search implementation in Rust"
}
],
"temperature": 0.7,
"max_tokens": 4096,
"stream": true
}

{
"id": "chatcmpl-abc123",
"object": "chat.completion",
"created": 1710000000,
"model": "deepseek/deepseek-v3.2",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "Here's a binary search implementation in Rust:\n\n```rust\nfn binary_search<T: Ord>(arr: &[T], target: &T) -> Option<usize> {\n let mut low = 0;\n let mut high = arr.len();\n while low < high {\n let mid = low + (high - low) / 2;\n match arr[mid].cmp(target) {\n std::cmp::Ordering::Equal => return Some(mid),\n std::cmp::Ordering::Less => low = mid + 1,\n std::cmp::Ordering::Greater => high = mid,\n }\n }\n None\n}\n```"
},
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 25,
"completion_tokens": 150,
"total_tokens": 175
}
}

curl -X POST https://api.atlascloud.ai/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $ATLAS_API_KEY" \
-d '{
"model": "deepseek/deepseek-v3.2",
"messages": [{"role": "user", "content": "Write a quicksort in Python"}],
"temperature": 0.7,
"max_tokens": 4096
}'

curl -X POST https://api.atlascloud.ai/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $ATLAS_API_KEY" \
-d '{
"model": "deepseek/deepseek-v3.2",
"messages": [{"role": "user", "content": "Explain monads in Haskell"}],
"temperature": 0.7,
"stream": true
}'

import openai
client = openai.OpenAI(
api_key="your-atlas-api-key",
base_url="https://api.atlascloud.ai/v1",
)
response = client.chat.completions.create(
model="deepseek/deepseek-v3.2",
messages=[
{"role": "system", "content": "You are a Python expert."},
{"role": "user", "content": "Write an async web scraper with aiohttp"},
],
temperature=0.7,
max_tokens=4096,
)
print(response.choices[0].message.content)

import OpenAI from "openai";
const client = new OpenAI({
apiKey: process.env.ATLAS_API_KEY,
baseURL: "https://api.atlascloud.ai/v1",
});
async function generateCode(prompt: string): Promise<string> {
const response = await client.chat.completions.create({
model: "deepseek/deepseek-v3.2",
messages: [{ role: "user", content: prompt }],
temperature: 0.7,
max_tokens: 4096,
});
return response.choices[0].message.content ?? "";
}
const code = await generateCode("Write a REST API with Express and TypeScript");
console.log(code);

| Feature | DeepSeek V3.2 (Current) | DeepSeek V4 (Expected) |
|---|---|---|
| Context Window | 128K tokens | 1M+ tokens |
| Coding (HumanEval) | 90.2% | Expected 95%+ |
| Reasoning (MATH) | 90.0% | Expected 95%+ |
| Multilingual | 50+ languages | 100+ languages |
| Architecture | MoE | Enhanced MoE |
| Latency | Fast | Faster |
| Price (Input) | $0.26/M tokens | TBD |
| Price (Output) | $0.38/M tokens | TBD |
| API Compatibility | OpenAI-compatible | OpenAI-compatible |
| Upgrade Path | — | Automatic via this skill |
- Atlas Cloud adds DeepSeek V4 to their API
- This skill's model identifier updates to `deepseek/deepseek-v4`
- Your existing integration keeps working — no code changes needed
- You get V4's improvements immediately
Zero downtime. Zero config changes. Just better results.
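One way to keep the upgrade a no-op on your side is to read the model id from the `DEEPSEEK_MODEL` variable (listed in the configuration section) with the current default as a fallback. A sketch:

```typescript
// Resolve the model id from the environment, falling back to the current
// default. When V4 ships, either the default moves or you set the variable.
function resolveModel(env: Record<string, string | undefined>): string {
  return env.DEEPSEEK_MODEL ?? "deepseek/deepseek-v3.2";
}
```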
| Metric | Price |
|---|---|
| Input tokens | $0.26 / million tokens |
| Output tokens | $0.38 / million tokens |
| Context window | 128K tokens |
| Rate limits | None (Atlas Cloud) |
| Model | Input ($/M) | Output ($/M) | Savings vs GPT-4 |
|---|---|---|---|
| DeepSeek V3.2 | $0.26 | $0.38 | ~95% cheaper |
| GPT-4o | $2.50 | $10.00 | — |
| Claude 3.5 Sonnet | $3.00 | $15.00 | — |
| Gemini 1.5 Pro | $3.50 | $10.50 | — |
| Use Case | Tokens/Day | Daily Cost | Monthly Cost |
|---|---|---|---|
| Individual developer | ~100K | ~$0.03 | ~$1 |
| Small team (5 devs) | ~500K | ~$0.15 | ~$5 |
| CI/CD code review | ~1M | ~$0.32 | ~$10 |
| Enterprise (50 devs) | ~10M | ~$3.20 | ~$100 |
💡 Get 25% bonus on your first top-up at atlascloud.ai
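The per-use-case figures above are straightforward to reproduce from the per-million-token rates. A back-of-envelope estimator (the table's daily costs roughly assume input-only traffic):

```typescript
// Rough cost estimator from the published per-million-token rates.
const INPUT_USD_PER_M = 0.26;  // USD per million input tokens
const OUTPUT_USD_PER_M = 0.38; // USD per million output tokens

function estimateCostUSD(inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * INPUT_USD_PER_M +
    (outputTokens / 1_000_000) * OUTPUT_USD_PER_M
  );
}
```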
# .github/workflows/deepseek-review.yml
name: DeepSeek Code Review
on: [pull_request]
jobs:
review:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Review with DeepSeek
run: |
DIFF=$(git diff origin/main...HEAD)
npx deepseek-v4 \
--prompt "Review this PR diff for bugs, security issues, and improvements:\n$DIFF" \
--system "You are a senior code reviewer. Be thorough but constructive." \
--max-tokens 4096
env:
ATLAS_API_KEY: ${{ secrets.ATLAS_API_KEY }}

# Generate a complete CRUD API
npx deepseek-v4 \
--prompt "Generate a complete CRUD REST API for a blog platform with posts, comments, and users. Use Express.js, TypeScript, and Prisma ORM." \
--system "Generate production-ready code with error handling, validation, and types." \
--max-tokens 8192

# Generate architecture documentation from code
npx deepseek-v4 \
--prompt "Analyze this codebase and generate architecture documentation: $(find src -name '*.ts' | head -20 | xargs cat)" \
--system "You are a technical writer. Create clear, structured architecture docs." \
--max-tokens 4096

# Analyze error logs
npx deepseek-v4 \
  --prompt "Investigate this error and suggest fixes: $(cat error.log | tail -50)" \
  --system "You are a debugging expert. Identify root cause and provide fix."

| Variable | Required | Default | Description |
|---|---|---|---|
| `ATLAS_API_KEY` | Yes | — | Atlas Cloud API key |
| `ATLAS_BASE_URL` | No | `https://api.atlascloud.ai/v1` | API base URL |
| `DEEPSEEK_MODEL` | No | `deepseek/deepseek-v3.2` | Model identifier |
| `DEEPSEEK_TEMPERATURE` | No | `0.7` | Default temperature |
| `DEEPSEEK_MAX_TOKENS` | No | `4096` | Default max tokens |
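A reasonable precedence for these settings is CLI flag over environment variable over built-in default. A hypothetical sketch (the actual CLI may resolve options differently):

```typescript
// Hypothetical option resolution: flag > env var > default.
function resolveTemperature(
  flag: string | undefined,
  env: Record<string, string | undefined>
): number {
  return Number(flag ?? env.DEEPSEEK_TEMPERATURE ?? "0.7");
}
```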
Create a .deepseek-v4.json in your project root:
{
"model": "deepseek/deepseek-v3.2",
"temperature": 0.7,
"maxTokens": 4096,
"systemPrompt": "You are a senior software engineer specializing in TypeScript and React.",
"baseUrl": "https://api.atlascloud.ai/v1"
}

401 Unauthorized
Your API key is missing or invalid.
# Check whether the API key is set
echo $ATLAS_API_KEY

# Set it again
export ATLAS_API_KEY="your-key-here"

Get your key at atlascloud.ai.
Model not found
Ensure you're using the correct model identifier:
deepseek/deepseek-v3.2 ✅ Correct
deepseek-v3.2 ❌ Missing prefix
deepseek-v4 ❌ Not yet available
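A quick guard against the last two mistakes is checking for the provider prefix before sending a request (illustrative helper, not part of the skill):

```typescript
// Illustrative check: valid model ids carry the "deepseek/" provider prefix.
function hasProviderPrefix(model: string): boolean {
  return model.startsWith("deepseek/");
}
```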
Timeout errors
For large prompts, shorten the response to avoid timeouts:

# Use a smaller max-tokens value
npx deepseek-v4 --prompt "..." --max-tokens 2000

Streaming not working
Streaming is enabled by default. If your environment doesn't support it, the CLI falls back to non-streaming mode automatically.
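Consuming a stream amounts to concatenating delta chunks. A sketch with a fake async generator standing in for the real SSE stream:

```typescript
// Collect streamed delta chunks into one string. collectStream accepts any
// AsyncIterable<string>; fakeStream stands in for the network stream here.
async function collectStream(chunks: AsyncIterable<string>): Promise<string> {
  let out = "";
  for await (const delta of chunks) out += delta;
  return out;
}

async function* fakeStream(): AsyncGenerator<string> {
  yield "Hello, ";
  yield "world";
}
```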
Contributions are welcome! Please read our guidelines:
- Fork the repository
- Create your feature branch: `git checkout -b feature/my-feature`
- Commit your changes: `git commit -m "Add my feature"`
- Push to the branch: `git push origin feature/my-feature`
- Open a Pull Request
git clone https://github.com/thoughtincode/deepseek-v4-skill.git
cd deepseek-v4-skill
npm install
npm run dev

| Aspect | Detail |
|---|---|
| API Provider | Atlas Cloud — enterprise-grade AI infrastructure |
| Uptime | 99.9% SLA |
| Data Privacy | No training on your data |
| Compliance | SOC2, GDPR compliant |
| Support | 24/7 technical support |
| Rate Limits | None — scale without limits |
| Billing | Pay-as-you-go, no minimums |
- Atlas Cloud: atlascloud.ai — Get your API key (25% first top-up bonus)
- DeepSeek: deepseek.com — Model documentation
- GitHub: thoughtincode/deepseek-v4-skill
- Issues: Report a bug
MIT License — see LICENSE for details.
# 1. Get an API key
# Visit https://www.atlascloud.ai?ref=JPM683&utm_source=github&utm_campaign=deepseek-v4-skill (25% bonus on first top-up)

# 2. Set the key
export ATLAS_API_KEY="your-key"

# 3. Install the skill
npx skills add thoughtincode/deepseek-v4-skill

# 4. Start using it
npx deepseek-v4 --prompt "Write a binary search tree in Rust"

DeepSeek V3.2 today. V4 tomorrow. Zero changes needed.
Built for developers who want the best AI coding assistant at a fraction of the cost.
Get your Atlas Cloud API key ·
Report Bug ·
Request Feature