Releases · ZeroTricks/lumo-tamer
v0.5.0
Changelog v0.5.0
This version doesn't have much new functionality, but it features the new ConversationStore (experimental) and a lot of restructuring and cleanup, paving the way for new features in future versions such as #47.
Breaking Changes
Vault format changed
Please re-authenticate after upgrading.
Renamed config options
- `conversations.sync.projectName` → `conversations.projectName`
- `conversations.sync.enabled` → `conversations.enableSync`
Removed config options
- `conversations.sync.projectId`: use `conversations.projectName` instead
- `conversations.sync.includeSystemMessages`
- `conversations.sync.autoSync` and the `/sync` chat command
- `conversations.maxInMemory`
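As a migration sketch, the renames and removals above might translate to a `config.yaml` change like this (the exact nesting of your file may differ):

```yaml
# Before (v0.4.x) — hypothetical fragment:
# conversations:
#   sync:
#     projectName: my-project
#     enabled: true

# After (v0.5.0):
conversations:
  projectName: my-project
  enableSync: true
```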
Major Features
Upstream Conversation Store Integration (PR #58)
- Added persistent conversation storage using Proton's Redux + IndexedDB + Sagas infrastructure.
- Added `useFallbackStore` config option (default: `true`) for gradual migration
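A sketch of opting out of the fallback store once you trust the new ConversationStore; the key's placement under `conversations` is an assumption, so check the docs for the actual location:

```yaml
conversations:
  useFallbackStore: false  # default is true (keep the old store as fallback)
```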
Package Restructuring
- Reorganized upstream Proton code into npm workspace packages:
- `packages/lumo/`: synced from https://github.com/ProtonMail/WebClients/applications/lumo/src/app/
- `packages/proton/`: synced from https://github.com/ProtonMail/WebClients/packages/
- Established patch strategy: "pull unchanged > patch > shim"
- Simplified sync-upstream workflow with better tracking
Native Tool Call Persistence (PR #59)
- Web search, weather, stock, and crypto tool calls now persist correctly
Bug Fixes
- Fixed invalid conversation continuations when API client changes tool call outputs (#52)
- Fixed inconsistencies between streamed and persisted messages
- Fixed tool call deduplication (don't drop semanticId)
- Fixed config.yaml saving issues (#53)
- Fixed rclone auth not showing messages on text input
- Fixed npm build not using project-local tsc
- Fixed Docker build dependencies and missing packages/ copy (#61)
- Fixed Lumo tool call format: normalize nested `arguments.parameters`
- Fixed README to avoid launching Docker without `config.yaml`
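The `arguments.parameters` normalization above can be sketched roughly like this; the type names and unwrapping rule are illustrative assumptions, not lumo-tamer's actual internals:

```typescript
// Some clients wrap tool-call arguments one level too deep, e.g.
// { parameters: { city: "Oslo" } } instead of { city: "Oslo" }.
// This helper unwraps that nesting when it is unambiguous.
type ToolCall = { name: string; arguments: Record<string, unknown> };

function normalizeToolCall(call: ToolCall): ToolCall {
  const args = call.arguments;
  const inner = args["parameters"];
  // Unwrap only when "parameters" is the sole key and holds a plain object.
  if (
    Object.keys(args).length === 1 &&
    typeof inner === "object" &&
    inner !== null &&
    !Array.isArray(inner)
  ) {
    return { ...call, arguments: inner as Record<string, unknown> };
  }
  return call;
}
```

Already-flat argument objects pass through unchanged, so the helper is safe to apply to every incoming tool call.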
Improvements
Auth Refactoring
- Unified vault structure and auth providers
- Login/rclone auth now generates fallback keys if no keys are available
- Browser auth decrypts session blob upon extraction and only saves keys
Config & Error Handling
- Improved config file handling: proper directory checks, config scheme validation, and handling of a missing auth section
- Better error messages for vault/keyfile misconfiguration
- Hide stack traces for known errors and invalid config values
- Show informative errors when db file can't be accessed
CLI & Commands
- `/save` now saves the current conversation only (#41)
- Changed obsolete "npm run auth" to "tamer auth" in docs & messages
Type Consolidation
- Consolidated Role and Status types
- Reuse @Lumo types: `ConversationPriv`, `ConversationPub`, `MessagePub`
- Unified `IncomingMessage` and `NormalizedMessage` into `MessageForStore`
- Improved naming of OpenAI chat/responses message types and converter functions
Code quality
- Console shim suppresses upstream store log floods and maps them to the pino logger
- Added `create-patch` helper script
- Added initial unit tests for ConversationStore
- Use fake-indexeddb in mock mode instead of fallbackStore
Documentation
- Added Home Assistant setup guide (#20)
- Added a heads-up about the official Lumo API in the README and HA how-to
- Updated docs on new ConversationStore and package structure
v0.4.0
Changelog
Breaking Changes
- Config renamed: `cli.localActions.fileReads.maxFileSizeKB` → `cli.localActions.fileReads.maxFileSize`
Features
- Initial OpenClaw support and OpenAI compatibility hardening (#45), thanks @wranglerdriver! OpenAI edge-field parity, improved error semantics, OpenClaw tool-call reliability, and configurable `api.bodyLimit`.
- Metrics endpoint: new `/metrics` endpoint with Prometheus-compatible stats on API requests, tool calls, and sync. Disabled by default. Grafana dashboard included.
- Commands wakeword: trigger commands without a slash using a wakeword (e.g., "tamer title").
- Instructions injection: instructions now prepend to user messages (matching the WebClient). New `instructions.injectInto` config for first/last message injection.
- Initial nanocoder support
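The first/last injection behavior can be sketched as follows; the message shape and function signature are assumptions for illustration, not lumo-tamer's actual code:

```typescript
// Prepend instructions to either the first or last user message,
// mirroring the instructions.injectInto options described above.
type Msg = { role: "user" | "assistant" | "system"; content: string };

function injectInstructions(
  messages: Msg[],
  instructions: string,
  injectInto: "first" | "last" = "first",
): Msg[] {
  const userIdxs = messages
    .map((m, i) => (m.role === "user" ? i : -1))
    .filter((i) => i >= 0);
  if (userIdxs.length === 0 || instructions === "") return messages;
  const target =
    injectInto === "first" ? userIdxs[0] : userIdxs[userIdxs.length - 1];
  return messages.map((m, i) =>
    i === target ? { ...m, content: `${instructions}\n\n${m.content}` } : m,
  );
}
```

Prepending (rather than sending a separate system message) matches the WebClient behavior the release notes reference.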
Bug Fixes
- Token refresh on startup - Expired tokens now auto-refresh at startup (#37)
- Tool call validation and parsing - Fixed valid tool calls being marked invalid in non-streaming requests. Regular JSON blocks without a name field are no longer treated as invalid tool calls.
- Log format - Fixed output format issues; now uses `singleLine` for pino-pretty (#38)
Refactoring
- Unified streaming/non-streaming handling in `/v1/responses` and `/v1/chat/completions`
- Simplified tool deduplication: removed `call_id` tracking from ConversationStore
Other
- Updated documentation for auth troubleshooting, wakeword usage, and API client compatibility (OpenClaw, nanocoder)
- Synced Proton upstream files to `6dfd1f79`
v0.3.0
Changelog
Breaking Changes
Commands & Binaries
- Consolidated all commands into `tamer`: `tamer-auth` → `tamer auth`, `tamer-server` → `tamer server`
- Docker: renamed the container from `lumo-tamer-app` to `lumo-tamer`
- Removed the Makefile: use `npm` commands instead
Configuration
- Instructions:
  - `instructions.default` → `instructions.fallback`
  - Removed `instructions.append`: use `instructions.template` instead
  - Removed `cli.instructions.default`: use `cli.instructions.template` instead
- Server:
  - `tools.enabled` → `customTools.enabled`
  - `tools.enableWebSearch` → `enableWebSearch`
- CLI:
  - `tools.enabled` → `localActions.enabled`
  - Moved `fileReads` and `executors` under `localActions`
Conversations
Stateless requests (without conversation identifiers) are no longer synced to Lumo.
New Features
- Tool call support on `/v1/chat/completions`
- Handlebars-based instruction templates: flexible instruction assembly with `instructions.template`
- Custom tool prefixes: add `customTools.prefix` to namespace user tools (default: `user:`)
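The prefix-based namespacing can be sketched like this; the helper names are hypothetical, only the `user:` default comes from the release notes:

```typescript
// Namespace user-defined tools with a prefix (default "user:") so they
// can be told apart from Lumo's native tools.
const DEFAULT_PREFIX = "user:";

function prefixToolName(name: string, prefix: string = DEFAULT_PREFIX): string {
  return name.startsWith(prefix) ? name : `${prefix}${name}`;
}

function isCustomTool(name: string, prefix: string = DEFAULT_PREFIX): boolean {
  return name.startsWith(prefix);
}

function stripPrefix(name: string, prefix: string = DEFAULT_PREFIX): string {
  return isCustomTool(name, prefix) ? name.slice(prefix.length) : name;
}
```

On the way in, tool names get prefixed; on the way out, the prefix both routes the call and is stripped before the user's tool sees its own name.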
Improvements
- Improved custom tool instructions to reduce misrouted calls
- Early bounce for misrouted tool calls: detect when Lumo routes custom tools through its native pipeline and bounce them back immediately instead of waiting for the full response
- CLI executors passed dynamically to instructions
- Better conversation handling: stateless requests are truly stateless, fixing duplicate conversations and auto-title generation
Bug Fixes
- Fix conversation format mismatches with tool call outputs
- Fix invalid conversation continuations
Documentation
- Split tools documentation into `custom-tools.md` and `local-actions.md`
- Reworked README sections on Custom Tools, Local Actions, and API clients
Notes
This is an early release. Expect rough edges. Only tested on Linux. See the README for setup instructions.
v0.2.0
Changelog
Breaking Changes
Renamed config options
- `tools.enableFileReads` → `fileReads.enabled`
- `conversation.sync` fields: `spaceName` → `projectName`, `spaceId` → `projectId`
- `conversations.deriveIdFromFirstMessage` → `conversations.deriveIdFromUser`. Same goal, different behavior.
Features
- Handle Lumo's "confused" tool calls: bounce custom tool calls mistakenly routed through Lumo's native SSE tool pipeline so Lumo can try again via text messages. (#27)
- Add tool calls to `/v1/responses` (non-streaming) (#26)
- CLI: limit file reads to small, supported (non-binary) files only (#12)
- Add Lumo API mock for testing, based on Proton WebClients mock (#5)
- Add Vitest test suite: unit, integration (OpenAI endpoints), and e2e tests (#28)
Improvements
- `/v1/responses` and `/v1/chat/completions`: comply with the OpenAI API streaming spec (#21)
- Be more verbose about Lumo errors: find & log error messages deep in the HTTP response and JSON body
- Show autorefresh config in auth status
- Change user-facing "space" wording to "project"
- Add up-to-date doc on tools
Bug Fixes
- Fix not sending Lumo full Turns when sending a tool call result
- Fix API returning empty response when raw JSON detector can't find JSON end (#25)
- Fix empty response on duplicate messages (#25)
- Fix `auth-status` crashing on uninitialized logger (#13)
- Clean up auth logger (#4)
- Always show warnings & errors in terminal, even when logging to a file (#18)
- Don't load existing conversations on startup (#6)
- Pass through Lumo API errors properly to logs and API client (c83b55b)
Notes
This is an early release. Expect rough edges. Only tested on Linux. See the README for setup instructions.
v0.1.0 - First public release
First public release of lumo-tamer.
Features
- OpenAI-compatible API (`/v1/responses`, `/v1/chat/completions`): use Lumo with any OpenAI-compatible client
- CLI: interactive mode with file read/create/edit support and command execution
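Since the server speaks the OpenAI wire format, any OpenAI-compatible client can point at it. A minimal request body for `/v1/chat/completions` might look like this (the model name is an assumption; check the docs for what the server expects):

```json
{
  "model": "lumo",
  "messages": [
    { "role": "user", "content": "Hello, Lumo!" }
  ],
  "stream": false
}
```

Official OpenAI SDKs should work by overriding their base URL to the local lumo-tamer server.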
- Conversation sync: encrypted conversation persistence, synced with Proton so you can continue on lumo.proton.me or mobile
- Tool support: experimental function calling via the API
- Multiple auth methods: browser-based, SRP login, or rclone token import
- Encrypted secret storage: session tokens stored in an AES-256-GCM vault, key in OS keychain or Docker secret
- Docker support: containerized setup with Docker Compose
Notes
This is an early release. Expect rough edges. Only tested on Linux. See the README for setup instructions.