
feat: protocol compliance, sync improvements, and FLAC fixes#15

Merged
chrisuthe merged 6 commits into master from feat/protocol-compliance
Mar 4, 2026

Conversation


@chrisuthe chrisuthe commented Mar 4, 2026

Summary

  • Reconnect stabilization: Suppress sync corrections for 2s after reconnect while Kalman filter re-converges
  • Reanchor cooldown: 5s minimum between re-anchors, matching Android/Python CLI behavior
  • FLAC 32-bit fix: Use the actual STREAMINFO bit depth instead of the bit depth reported in stream/start (fixes silence on 32-bit FLAC)
  • AudioFormat.BitDepth update: Propagate actual decoded bit depth from FLAC STREAMINFO to audio format
  • Buffer capacity: Increase from 8s to 30s to prevent overruns on track start
  • Artwork-only stream/start: Skip pipeline start when stream/start has no player key

Test plan

  • Verify audio plays correctly with FLAC streams (16-bit and 32-bit)
  • Test track changes don't cause silence
  • Test reconnect behavior — sync should stabilize within ~2s
  • Verify no re-anchor loops on track change

🤖 Generated with Claude Code

chrisuthe and others added 6 commits March 4, 2026 08:38
…s post-reconnect)

After a WebSocket reconnect, the Kalman clock synchronizer is reset and
needs ~2 seconds to re-converge. During this window, sync error
measurements are unreliable. Without suppression, the correction system
reacts to garbage measurements causing audible artifacts.

Adds NotifyReconnect() through the pipeline chain (AudioPipeline →
TimedAudioBuffer + IAudioPlayer → SyncCorrectionCalculator) to suppress
corrections during the stabilization period. Covers both Read() (deprecated)
and ReadRaw() (current) correction paths.

Matches Android client's RECONNECT_STABILIZATION_US = 2,000,000.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
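The suppression logic can be sketched as follows. This is an illustrative Python sketch, not the project's C# code; the class and method names (`SyncCorrectionCalculator`, `notify_reconnect`) mirror those mentioned above, but the control-law details are hypothetical. Only the stabilization constant is taken from the commit message.

```python
# Sketch: suppress sync corrections for a fixed window after reconnect,
# while the Kalman clock synchronizer re-converges.
RECONNECT_STABILIZATION_US = 2_000_000  # 2 s, matching the Android client


class SyncCorrectionCalculator:
    def __init__(self) -> None:
        self._reconnect_at_us: int | None = None  # monotonic time of last reconnect

    def notify_reconnect(self, now_us: int) -> None:
        """Called through the pipeline chain when the WebSocket reconnects."""
        self._reconnect_at_us = now_us

    def _suppressed(self, now_us: int) -> bool:
        """True while sync-error measurements are still unreliable."""
        if self._reconnect_at_us is None:
            return False
        return (now_us - self._reconnect_at_us) < RECONNECT_STABILIZATION_US

    def correction_for(self, sync_error_us: int, now_us: int) -> int:
        # During the stabilization window, return no correction rather than
        # react to garbage measurements (which causes audible artifacts).
        if self._suppressed(now_us):
            return 0
        return sync_error_us  # the real code applies a control law here
```

Both correction paths (`Read()` and `ReadRaw()`) would route through the same suppression check.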
…oid/Python CLI)

Without a cooldown, persistent clock sync error can trigger re-anchors every
~750ms (500ms grace + 250ms rebuffer), causing audio stuttering. The cooldown
timer persists across Clear() calls since reanchor itself calls Clear().

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
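The key subtlety is that the cooldown timer must survive `Clear()`, because a re-anchor clears the buffer itself. A minimal Python sketch of that behavior (hypothetical method names; the project is C#):

```python
# Sketch: a 5 s re-anchor cooldown that deliberately persists across clear(),
# preventing the ~750 ms re-anchor loop described above.
REANCHOR_COOLDOWN_US = 5_000_000  # 5 s minimum between re-anchors


class TimedAudioBuffer:
    def __init__(self) -> None:
        self._frames: list[bytes] = []
        self._last_reanchor_us: int | None = None  # NOT reset by clear()

    def clear(self) -> None:
        self._frames.clear()  # cooldown timer intentionally left untouched

    def try_reanchor(self, now_us: int) -> bool:
        if (self._last_reanchor_us is not None
                and now_us - self._last_reanchor_us < REANCHOR_COOLDOWN_US):
            return False  # still cooling down: ignore the re-anchor request
        self._last_reanchor_us = now_us
        self.clear()  # re-anchoring drops buffered audio and rebuilds
        return True
```

If the timer lived in state wiped by `clear()`, every re-anchor would immediately re-arm the next one, recreating the stutter loop.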
…/start)

The server's stream/start reports bit_depth=32 (PyAV's s32 container format)
but the actual FLAC STREAMINFO declares 24-bit precision. Using the wrong
bit depth for the scale factor (dividing by 2^31 instead of 2^23) produced
near-zero sample values, i.e. silence.

Changes:
- Use server codec_header (real STREAMINFO) instead of synthetic header
- Calibrate scale factor from SimpleFlac's actual BitsPerSample
- Fix integer overflow in scale factor calc (1<<31 -> 1L<<31)
- Add logging to FlacDecoder via ILoggerFactory
- Catch NotSupportedException in addition to InvalidDataException

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
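The silence bug is easiest to see numerically. A 24-bit FLAC sample occupies the range [-2^23, 2^23), so the normalization divisor must be 2^23; dividing by 2^31 leaves samples roughly 256x too quiet. Illustrative Python sketch (Python ints cannot overflow, so the `1 << 31` vs `1L << 31` overflow only arises in the C# code):

```python
# Sketch: scale factor calibrated from the decoder's actual bits-per-sample.
def scale_factor(bits_per_sample: int) -> float:
    # In C#, `1 << 31` overflows Int32; the commit's fix was `1L << 31`.
    return 1.0 / (1 << (bits_per_sample - 1))


sample = 4_194_304                   # a 24-bit sample at half full scale (2^22)
right = sample * scale_factor(24)    # 0.5: audible
wrong = sample * scale_factor(32)    # ~0.002: effectively silence
```

This is why the decoder must trust SimpleFlac's `BitsPerSample` (from the real STREAMINFO) over the bit depth the server advertises in stream/start.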
…rack start

The server paces data by compressed byte capacity (32MB), which for FLAC
with ~4:1 compression decodes to 30+ seconds of audio. The previous 8s
internal buffer overflowed on every track start, causing 230+ frame drops
and a fast-forward sound artifact.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Stats for Nerds and main UI were showing 32-bit (from stream/start's PyAV
container format) instead of the actual 24-bit FLAC precision. Now the
FlacDecoder updates Format.BitDepth on calibration so all downstream
displays reflect reality.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
When connecting to a server mid-track, the server sends an artwork-only
stream/start (no "player" key) before the real audio stream/start. Previously
this created a phantom Opus pipeline and forced "Playing" state, causing the
UI to show playback with no audio until the user skipped tracks.

Now StreamStartPayload.Format is nullable — artwork-only messages are detected
and skipped, so the pipeline only starts when actual audio format info arrives.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
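The detection described above can be sketched as follows. Field and function names here are hypothetical Python stand-ins for the C# `StreamStartPayload` with its now-nullable format:

```python
# Sketch: treat a stream/start with no "player" key as artwork-only and
# skip pipeline start until real audio format info arrives.
from dataclasses import dataclass
from typing import Optional


@dataclass
class AudioFormat:
    codec: str
    sample_rate: int
    bit_depth: int
    channels: int


@dataclass
class StreamStartPayload:
    format: Optional[AudioFormat]  # None for artwork-only messages


def parse_stream_start(msg: dict) -> StreamStartPayload:
    player = msg.get("player")
    if player is None:
        # Artwork-only stream/start: no audio format, no pipeline start.
        return StreamStartPayload(format=None)
    return StreamStartPayload(format=AudioFormat(
        codec=player["codec"],
        sample_rate=player["sample_rate"],
        bit_depth=player["bit_depth"],
        channels=player["channels"],
    ))


def should_start_pipeline(payload: StreamStartPayload) -> bool:
    return payload.format is not None
```

Previously the missing key was papered over with a synthetic format, which is what created the phantom Opus pipeline and the forced "Playing" state.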
@chrisuthe chrisuthe merged commit 726d041 into master Mar 4, 2026
2 checks passed