👋 Hi! I've created a fork with local model support
I've extended your Claude-Cowork project to work with local LLM models while maintaining all original functionality.
🔗 Repository
https://github.com/vakovalskii/Cowork-Local-LLM
1. OpenAI SDK Integration
- Replaced Anthropic SDK with OpenAI SDK for broader compatibility
- Works with any OpenAI-compatible endpoint (/v1/chat/completions)
- Tested with Qwen 3 (30B); works with vLLM, Ollama, LM Studio, etc. (see the sketch after this list)
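
For reference, a minimal sketch of pointing the OpenAI SDK at a local OpenAI-compatible server. The base URL, model name, and API key here are illustrative placeholders, not the fork's actual configuration.

```python
# Minimal sketch: using the OpenAI SDK against a local OpenAI-compatible server.
# The base URL, model name, and API key are placeholders; vLLM, Ollama, and
# LM Studio each expose /v1/chat/completions on their own host/port.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # e.g. a local vLLM or LM Studio endpoint
    api_key="not-needed-locally",         # most local servers accept any non-empty key
)

response = client.chat.completions.create(
    model="qwen3-30b",  # hypothetical local model id; match your server's model name
    messages=[{"role": "user", "content": "Hello from a local model!"}],
)
print(response.choices[0].message.content)
```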
2. New Features
- 🧠 Memory System - persistent storage of user preferences
- 🔄 Dynamic Memory Reload - memory updates take effect within the same session
- ✏️ Message Editing - edit and resend with history truncation
- 📌 Session Pinning - pin important chats with search
- 🔐 Permission Modes - ask/default for tool execution
- 📊 Token Tracking - display usage and API duration (see the sketch after this list)
- 📝 Raw Logging - full JSON request/response for debugging
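
As a rough illustration of the token-tracking idea: OpenAI-compatible chat completion responses carry a `usage` block, so the sketch below simply times the call and reads that field. The function name is mine and the code is not taken from the fork.

```python
# Sketch of token tracking: time the request and read the usage block
# returned by an OpenAI-compatible /v1/chat/completions endpoint.
# `client` is assumed to be configured as in the earlier sketch.
import time

def timed_completion(client, model, messages):
    start = time.monotonic()
    response = client.chat.completions.create(model=model, messages=messages)
    elapsed = time.monotonic() - start

    usage = response.usage  # prompt_tokens, completion_tokens, total_tokens
    print(f"API duration: {elapsed:.2f}s | "
          f"prompt: {usage.prompt_tokens} | "
          f"completion: {usage.completion_tokens} | "
          f"total: {usage.total_tokens}")
    return response
```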
3. Architecture Improvements
- Modular tool system (each tool in a separate file; see the sketch after this list)
- Compressed tool execution history
- Language consistency (responds in the user's language)
- Enhanced error handling and file protection
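
To give a sense of what a modular tool system can look like, here is a hedged sketch in which each tool lives in its own module and registers itself with a small registry. All names and the layout are illustrative assumptions, not the fork's actual interfaces.

```python
# Sketch of a modular tool registry: each tool module defines a Tool and
# registers it, so the agent loop can discover tools without hard-coding them.
# All names here are illustrative, not the fork's actual code.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Tool:
    name: str
    description: str
    run: Callable[[dict], str]

REGISTRY: Dict[str, Tool] = {}

def register(tool: Tool) -> None:
    REGISTRY[tool.name] = tool

# In practice each tool would sit in its own file, e.g. tools/read_file.py,
# and call register() at import time.
def _read_file(args: dict) -> str:
    with open(args["path"], "r", encoding="utf-8") as f:
        return f.read()

register(Tool(name="read_file",
              description="Read a UTF-8 text file and return its contents.",
              run=_read_file))
```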