Commit e76a8ce ("readme update")

1 parent b5a4d05

File tree

1 file changed: +2, -2 lines


README.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -3,7 +3,7 @@
 [![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/dansmolsky)
 [![npm version](https://img.shields.io/npm/v/@tarquinen/opencode-dcp.svg)](https://www.npmjs.com/package/@tarquinen/opencode-dcp)

-Automatically reduces token usage in OpenCode by removing obsolete tools from conversation history.
+Automatically reduces token usage in OpenCode by removing obsolete content from conversation history.

 ![DCP in action](assets/images/dcp-demo5.png)

```
```diff
@@ -50,7 +50,7 @@ LLM providers like Anthropic and OpenAI cache prompts based on exact prefix matc

 **Trade-off:** You lose some cache read benefits but gain larger token savings from reduced context size and performance improvements through reduced context poisoning. In most cases, the token savings outweigh the cache miss cost—especially in long sessions where context bloat becomes significant.

-> **Note:** In testing, cache hit rates were approximately 65% with DCP enabled vs 85% without.
+> **Note:** In testing, cache hit rates were approximately 80% with DCP enabled vs 85% without for most providers.

 **Best use case:** Providers that count usage in requests, such as Github Copilot and Google Antigravity, have no negative price impact.

```
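The trade-off described in the hunk above can be sanity-checked with some rough arithmetic. The sketch below uses entirely hypothetical prices, token counts, and a cached-token discount (none of these figures come from the README or from DCP itself); it only illustrates why a smaller context can cost less overall even at a lower cache hit rate.

```python
# Illustrative cost comparison for the prompt-caching trade-off.
# All numbers are assumptions, not measurements from DCP or any provider.

def request_cost(context_tokens, cache_hit_rate,
                 price_per_mtok=3.00, cached_discount=0.10):
    """Cost of one request: cached tokens billed at a fraction of full price."""
    cached = context_tokens * cache_hit_rate
    uncached = context_tokens - cached
    return (uncached * price_per_mtok
            + cached * price_per_mtok * cached_discount) / 1_000_000

# Without pruning: larger context, higher cache hit rate (e.g. 85%).
baseline = request_cost(context_tokens=120_000, cache_hit_rate=0.85)

# With pruning: smaller context, slightly lower cache hit rate (e.g. 80%).
pruned = request_cost(context_tokens=80_000, cache_hit_rate=0.80)

assert pruned < baseline  # the smaller context outweighs the cache misses
```

Under these assumed numbers the pruned request is cheaper despite the lower hit rate, matching the README's claim that token savings usually outweigh the cache miss cost in long sessions.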
