
Conversation

@Sahil5963
Contributor

Summary

  • Remove @yourgpt/copilot-sdk dependency from llm-sdk (now self-contained)
  • Add usage stripping from client-facing responses by default
  • Add { includeUsage: true } option for raw API access

Changes

llm-sdk now independent

  • Created src/core/stream-events.ts with all event types
  • Created src/core/utils.ts with ID generators
  • No longer requires copilot-sdk as dependency
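The new self-contained modules might look like the following sketch. The shapes here are hypothetical (the actual `stream-events.ts` event union and `utils.ts` ID generators in llm-sdk may differ); it only illustrates the idea of in-package event types plus an ID helper:

```typescript
// Hypothetical sketch of src/core/stream-events.ts and src/core/utils.ts;
// the real llm-sdk definitions may differ.

// A minimal discriminated union of stream event types.
type StreamEvent =
  | { type: "text-delta"; id: string; delta: string }
  | { type: "finish"; id: string; usage?: { inputTokens: number; outputTokens: number } };

// ID generator: prefix plus a random suffix, so events stay traceable.
function generateId(prefix: string): string {
  return `${prefix}_${Math.random().toString(36).slice(2, 10)}`;
}

const event: StreamEvent = { type: "text-delta", id: generateId("msg"), delta: "Hello" };
console.log(event.id.startsWith("msg_")); // true
```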

Usage stripping (security)

  • pipeToResponse() and collect() strip usage by default
  • onFinish callback always receives usage (for billing)
  • Pass { includeUsage: true } for raw access
```ts
// Copilot SDK (default - safe for clients)
const { text, messages } = await runtime.stream(body).collect();

// Raw access (includes usage)
const { text, usage } = await runtime.stream(body).collect({ includeUsage: true });
```
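The stripping behavior described above could be sketched roughly as follows. This is a simplified, hypothetical model of `collect()`'s finalization step, not the actual llm-sdk implementation; the `finalize` helper and its types are invented for illustration:

```typescript
// Hypothetical sketch of the default usage-stripping behavior;
// the real llm-sdk collect() internals may differ.
interface Usage { inputTokens: number; outputTokens: number }
interface CollectResult { text: string; messages: string[]; usage?: Usage }

function finalize(
  raw: CollectResult,
  opts: { includeUsage?: boolean; onFinish?: (usage: Usage | undefined) => void } = {}
): CollectResult {
  // onFinish always sees usage, e.g. for server-side billing.
  opts.onFinish?.(raw.usage);
  if (opts.includeUsage) return raw;
  // Default: strip usage before the result can reach a client.
  const { usage, ...safe } = raw;
  return safe;
}

const raw: CollectResult = { text: "hi", messages: ["hi"], usage: { inputTokens: 3, outputTokens: 1 } };
let billed: Usage | undefined;
const safe = finalize(raw, { onFinish: (u) => { billed = u; } });
console.log("usage" in safe, billed?.inputTokens); // false 3
```

The key property is that billing data flows only through the server-side callback, so a client response can never leak token counts by accident.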

Test plan

  • Build passes
  • Copilot endpoints don't expose usage
  • Raw endpoints with includeUsage: true expose usage
  • onFinish callback receives usage for billing

🤖 Generated with Claude Code

Sahil5963 and others added 2 commits January 28, 2026 14:10
- Add usage capture in adapters (OpenAI, Anthropic, Google)
- Add onFinish callback to runtime.stream() and runtime.chat()
- Strip usage from client responses (server-side only)
- Update server docs with usage tracking examples
- Bump llm-sdk version to 1.5.0

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Remove copilot-sdk dependency (now self-contained)
- Add stream-events.ts with all event types
- Add utils.ts with ID generators
- Strip usage from client-facing responses by default
- Add `includeUsage` option for raw API access
- Usage still available in onFinish callback for billing

BREAKING: None - API compatible, just internal changes

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@vercel

vercel bot commented Jan 28, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Review | Updated (UTC) |
| --- | --- | --- | --- |
| copilot-sdk-docs | Ready | Ready (Preview, Comment) | Jan 28, 2026 10:50am |

Request Review

- Clarify that usage is stripped by default from client responses
- Add documentation for { includeUsage: true } option
- Update examples to use onFinish callback pattern

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Resolve conflicts in express-demo (keep copilot/raw endpoint structure)
- Resolve conflicts in server.mdx (merge usage tracking + generate() docs)
- Keep llm-sdk version at 1.5.2
- Remove generate() endpoints from express-demo (not yet in npm)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@Rohitjoshi9023 Rohitjoshi9023 merged commit f63f57c into main Jan 28, 2026
2 of 3 checks passed