Problem Statement
When one LLM passes output to another (e.g., for summarization or transformation), any attached metadata is lost. There is no mechanism to preserve or verify origin across multiple LLM generations.
Proposed Solution
- Enable metadata passthrough and relaying between models.
- If Model A creates a text and Model B modifies it, the metadata should include both origins with timestamps and content hashes.
- Maintain the provenance chain with nested or chained signatures (a minimal sketch follows this list).
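
A minimal sketch of what such a chain could look like, assuming a Python implementation. The names (`ProvenanceEntry`, `extend_chain`) and the use of HMAC in place of a real signature scheme are illustrative assumptions, not existing API:

```python
# Hypothetical sketch of a chained provenance record; not part of the current codebase.
import hashlib
import hmac
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ProvenanceEntry:
    model_id: str          # e.g. "model-a"
    timestamp: float       # Unix time of this generation step
    content_hash: str      # SHA-256 of the text this model produced
    prev_signature: str    # signature of the previous entry ("" for the first)
    signature: str = ""    # signature over this entry's canonical form

def _canonical(entry: dict) -> bytes:
    # Canonical ordering: sorted keys, no whitespace, so every party
    # signs and verifies exactly the same bytes.
    return json.dumps(entry, sort_keys=True, separators=(",", ":")).encode()

def extend_chain(chain: list[ProvenanceEntry], model_id: str,
                 text: str, signing_key: bytes) -> list[ProvenanceEntry]:
    """Append an entry that covers `text` and links to the previous
    entry's signature, forming a tamper-evident chain."""
    prev_sig = chain[-1].signature if chain else ""
    entry = ProvenanceEntry(
        model_id=model_id,
        timestamp=time.time(),
        content_hash=hashlib.sha256(text.encode()).hexdigest(),
        prev_signature=prev_sig,
    )
    unsigned = asdict(entry)
    unsigned.pop("signature")
    # HMAC stands in for a real signature scheme (e.g. Ed25519) here.
    entry.signature = hmac.new(signing_key, _canonical(unsigned),
                               hashlib.sha256).hexdigest()
    return chain + [entry]

# Model A drafts, Model B edits: both origins stay in the chain.
chain = extend_chain([], "model-a", "draft text", b"key-a")
chain = extend_chain(chain, "model-b", "edited text", b"key-b")
```

Because each entry signs the previous entry's signature, dropping or reordering a step invalidates every later entry in the chain.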
Alternative Solutions
- Flatten metadata to “last model only” (loses history).
- Manual logging of provenance (non-standard, easy to omit).
Use Cases
- A content platform chains summarization, translation, and rewriting steps through multiple LLMs and wants to preserve full provenance.
- An enterprise uses one LLM to draft and another to edit; both steps need attribution.
Implementation Ideas
- Chain metadata with a signature tree.
- Use canonical ordering to preserve integrity and prevent tampering.
- Update the encoder to optionally accept prior metadata and extend it (see the verification sketch after this list).
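
To show how canonical ordering makes the chain verifiable, here is a hedged sketch. `verify_chain`, `keys_by_model`, and the dict layout of entries are assumptions for illustration, not part of the current encoder:

```python
# Hypothetical verification sketch; assumes each chain entry is a dict with
# "model_id", "timestamp", "content_hash", "prev_signature", "signature",
# and that keys_by_model maps model ids to their HMAC keys.
import hashlib
import hmac
import json

def _canonical(entry: dict) -> bytes:
    # Same canonical ordering used at signing time, so verification is byte-exact.
    return json.dumps(entry, sort_keys=True, separators=(",", ":")).encode()

def verify_chain(chain: list[dict], keys_by_model: dict[str, bytes]) -> bool:
    """Recompute each entry's signature and check that it links to the
    previous one, so any edit or reordering breaks verification."""
    prev_sig = ""
    for entry in chain:
        if entry["prev_signature"] != prev_sig:
            return False  # broken linkage: an entry was removed or reordered
        unsigned = {k: v for k, v in entry.items() if k != "signature"}
        expected = hmac.new(keys_by_model[entry["model_id"]],
                            _canonical(unsigned), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, entry["signature"]):
            return False  # entry contents were tampered with
        prev_sig = entry["signature"]
    return True
```

On the encoding side, the encoder could take an optional prior-metadata argument and append a new entry to it (e.g., via something like the `extend_chain` helper sketched above) instead of always starting a fresh record.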
Additional Context
Could become essential in regulated or high-trust AI pipelines.