Conversation

@flexiondotorg
Contributor
  • Add configurable constants to detect hot inputs: la2aHotInputTPThreshold, la2aHotInputTPSevere, la2aHotInputRatioReduction, la2aHotInputHeadroomReduction.
  • Back off the LA-2A ratio when the measured true peak exceeds the threshold, using a square-root scaled severity to apply a smooth, non-linear reduction up to the configured maximum (see the sketch after this list).
  • Reduce LA-2A headroom (raise the threshold) for loud inputs using the same severity curve, so the compressor applies gentler gain reduction.
  • Rationale: the downstream limiter handles peak control; backing off the compressor for already-hot material avoids unnecessary dynamics crushing and preserves perceived dynamics.
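
A minimal sketch of the described back-off, written in Go under stated assumptions: the constant values (-3 dBTP threshold, 0 dBTP severe point), the default ratio and headroom, and the helper name hotInputSeverity are all hypothetical, since the conversation shows the constant names but not their values or the surrounding code. Severity is 0 at la2aHotInputTPThreshold, follows a square-root curve above it, and saturates at 1 at la2aHotInputTPSevere; the ratio and headroom are then reduced by severity times their configured maxima.

```go
package main

import (
	"fmt"
	"math"
)

// Hypothetical values: the PR names these constants but does not show them.
const (
	la2aHotInputTPThreshold       = -3.0 // dBTP above which an input counts as "hot"
	la2aHotInputTPSevere          = 0.0  // dBTP at which severity saturates at 1.0
	la2aHotInputRatioReduction    = 1.5  // maximum amount taken off the ratio
	la2aHotInputHeadroomReduction = 2.0  // maximum dB taken off the headroom
)

// hotInputSeverity maps a measured true peak onto [0, 1] with a
// square-root curve: 0 at the threshold, 1 at the severe point.
func hotInputSeverity(truePeakDB float64) float64 {
	if truePeakDB <= la2aHotInputTPThreshold {
		return 0
	}
	linear := (truePeakDB - la2aHotInputTPThreshold) /
		(la2aHotInputTPSevere - la2aHotInputTPThreshold)
	return math.Sqrt(math.Min(linear, 1))
}

func main() {
	ratio, headroom := 4.0, 6.0   // hypothetical LA-2A defaults
	sev := hotInputSeverity(-1.2) // e.g. a file peaking at -1.2 dBTP
	fmt.Printf("severity=%.2f ratio=%.2f headroom=%.1f dB\n",
		sev,
		ratio-sev*la2aHotInputRatioReduction,
		headroom-sev*la2aHotInputHeadroomReduction)
}
```

Compared with a linear ramp, the square root applies most of its reduction soon after the peak crosses the threshold and flattens toward the severe point, which is one way to read "smooth, non-linear reduction up to the configured maximum".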

IMPACT: gentler compression on files with high true peaks; reduces over-compression without changing limiter behaviour. No breaking changes.

Signed-off-by: Martin Wimpress <martin@wimpress.org>

@cubic-dev-ai (bot) left a comment


No issues found across 1 file

Confidence score: 5/5

  • Automated review surfaced no issues in the provided summaries.
  • No files require special attention.

@flexiondotorg merged commit 87ca1be into main on Feb 7, 2026
5 checks passed
@flexiondotorg deleted the hot-compressor branch on February 7, 2026 at 12:12