fix: add @MainActor isolation and detect mic contention #3
Conversation
WhisperTranscriber and AudioRecorder had @Published properties but no actor isolation. After async suspension points (e.g. `await whisper.transcribe`), objectWillChange could fire from background threads, corrupting SwiftUI's view graph. This led to a SIGSEGV in DesignLibrary during view rendering after prolonged use.

Changes:
- Mark WhisperTranscriber and AudioRecorder as @MainActor
- Add a nonisolated(unsafe) isCapturing flag for the audio tap callback to avoid a cross-isolation read of the @Published isRecording property
- Detect when the mic is in use by another process via CoreAudio's kAudioDevicePropertyDeviceIsRunningSomewhere and skip recording instead of disrupting the other app's audio
- Show a brief mic.slash / dimmed icon when the mic is busy
- Guard the isMicBusy timeout Task against clobbering active recording state
- Add LocalizedError conformance to AudioRecorderError

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Pull request overview
Adds main-actor isolation to UI-facing observable objects and introduces microphone-contention detection to prevent SwiftUI crashes and avoid disrupting other apps.
Changes:
- Annotate `WhisperTranscriber` and `AudioRecorder` (and related tests) with `@MainActor` to keep `@Published` updates on the main actor.
- Add mic-contention detection (CoreAudio `kAudioDevicePropertyDeviceIsRunningSomewhere`) and surface "mic busy" state through `AppState` + menu bar icon.
- Add `LocalizedError` conformance for `AudioRecorderError` and a cross-thread `isCapturing` flag for the tap callback.
Reviewed changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 5 comments.
Show a summary per file
| File | Description |
|---|---|
| FreeWispr/Tests/FreeWisprTests/WhisperTranscriberTests.swift | Runs transcriber tests on the main actor to match new isolation. |
| FreeWispr/Tests/FreeWisprTests/AudioRecorderTests.swift | Runs recorder tests on the main actor to match new isolation. |
| FreeWispr/Sources/FreeWispr/WhisperTranscriber.swift | Main-actor isolates transcriber observable state. |
| FreeWispr/Sources/FreeWispr/FreeWisprApp.swift | Plumbs isMicBusy into menu bar icon and updates icon rendering. |
| FreeWispr/Sources/FreeWispr/AudioRecorder.swift | Adds mic busy detection, introduces isCapturing, and improves error typing. |
| FreeWispr/Sources/FreeWispr/AppState.swift | Updates UI state when mic is busy and schedules automatic reset. |
Comments suppressed due to low confidence (1)
FreeWispr/Sources/FreeWispr/AudioRecorder.swift:139
`isCapturing = false` is written while the engine may still be running and the tap callback may still be reading `isCapturing`. This contradicts the safety comment on `isCapturing` and increases the likelihood of a data race. If you keep the non-atomic approach, move the flag transition to a point where the tap is guaranteed quiesced (e.g., stop engine/remove tap first), or (preferred) switch `isCapturing` to an atomic so ordering doesn't rely on engine timing.
```swift
private func resetEngine() {
    isCapturing = false
    audioEngine.stop()
    if isTapInstalled {
        audioEngine.inputNode.removeTap(onBus: 0)
        isTapInstalled = false
    }
}
```
```swift
} catch AudioRecorderError.micInUse {
    statusMessage = "Mic in use by another app"
    isMicBusy = true
    Task { [weak self] in
        try? await Task.sleep(for: .seconds(2))
        guard let self, self.isMicBusy else { return }
        self.isMicBusy = false
        if !self.isRecording && !self.isTranscribing {
            self.statusMessage = "Ready"
        }
    }
```
Task { ... } is not guaranteed to run/resume on the main actor, but it mutates @Published state (isMicBusy, statusMessage, etc.). This can reintroduce the same “background thread publishing changes” issue this PR is addressing. Make the task main-actor bound (e.g., Task { @MainActor in ... }) or do the post-sleep mutations via await MainActor.run { ... } (alternatively, annotate AppState as @MainActor if appropriate for the class).
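A minimal sketch of the main-actor-bound variant the review suggests, assuming the surrounding `AppState` type holds the `@Published` properties shown in the diff above:

```swift
// Sketch: the same timeout task, explicitly bound to the main actor so
// every @Published mutation below is guaranteed to run on the main thread.
Task { @MainActor [weak self] in
    try? await Task.sleep(for: .seconds(2))
    guard let self, self.isMicBusy else { return }
    self.isMicBusy = false
    if !self.isRecording && !self.isTranscribing {
        self.statusMessage = "Ready"
    }
}
```

If `AppState` itself were annotated `@MainActor`, the unstructured `Task` would inherit that isolation and the explicit `@MainActor` closure annotation would be redundant.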
```swift
// Written on main thread before engine start / after engine stop.
// Read from audio tap callback. Safe because writes happen only while
// the engine is stopped, creating a happens-before relationship.
nonisolated(unsafe) private var isCapturing = false
```
isCapturing is read from the audio tap callback thread while being written from the main actor. As implemented, writes do not consistently happen only while the engine is stopped (see stopRecording()/resetEngine()), and the flag is a plain Bool with nonisolated(unsafe), which can still produce a data race/undefined behavior. Prefer a lock-free atomic (e.g., Swift Atomics ManagedAtomic<Bool>) for the tap’s fast-path read/write, or otherwise ensure synchronization such that the tap thread never concurrently accesses the flag during mutations.
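One way to realize the atomic approach, sketched with the swift-atomics package as an assumed dependency (the `CaptureGate` wrapper type is hypothetical, not part of the PR):

```swift
import Atomics  // swift-atomics package (assumed dependency)

// Hypothetical wrapper: a lock-free flag that the main actor writes and
// the realtime audio tap thread reads without a data race.
final class CaptureGate: @unchecked Sendable {
    private let flag = ManagedAtomic<Bool>(false)

    func open()  { flag.store(true,  ordering: .releasing) }
    func close() { flag.store(false, ordering: .releasing) }

    // Fast-path check inside the tap callback.
    var isOpen: Bool { flag.load(ordering: .acquiring) }
}
```

With this, the ordering of `close()` relative to `audioEngine.stop()` no longer matters for memory safety; the tap simply observes the flag flip and drops buffers from that point on.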
```swift
func stopRecording() {
    guard isRecording else { return }
    isCapturing = false
    isRecording = false
    audioEngine.stop()
```
Same issue as resetEngine(): this writes isCapturing concurrently with the tap thread potentially reading it, since audioEngine.stop() happens after the write. This should be addressed together with the isCapturing design (atomic flag recommended) to avoid undefined behavior under load.
```swift
if Self.isMicInUseByAnotherProcess() {
    throw AudioRecorderError.micInUse
}
```
New behavior changes startRecording() to throw .micInUse based on a CoreAudio query, but the updated tests shown only add @MainActor and don’t cover this branch. Add a unit test that verifies the .micInUse path and that isRecording remains false when this error is thrown; to make this testable, consider injecting the “mic usage check” behind a protocol/closure rather than calling a static CoreAudio function directly.
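The injection idea could look roughly like the following sketch. The `AudioRecorderSketch` type and its initializer are hypothetical illustrations, not the PR's actual `AudioRecorder`; only `AudioRecorderError.micInUse` comes from the diff:

```swift
// Hypothetical refactor: inject the mic-usage probe as a closure so the
// .micInUse branch can be unit-tested without touching CoreAudio.
@MainActor
final class AudioRecorderSketch {
    private(set) var isRecording = false
    private let isMicInUse: () -> Bool  // injected probe

    init(isMicInUse: @escaping () -> Bool = { false }) {
        self.isMicInUse = isMicInUse
    }

    func startRecording() throws {
        if isMicInUse() { throw AudioRecorderError.micInUse }
        isRecording = true
    }
}

// In a test, force the busy path and assert state is untouched:
// let recorder = AudioRecorderSketch(isMicInUse: { true })
// XCTAssertThrowsError(try recorder.startRecording())
// XCTAssertFalse(recorder.isRecording)
```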
```swift
case .micInUse: "Microphone is in use by another app"
case .formatError: "Cannot create audio format converter"
```
LocalizedError conformance implies user-facing strings; returning hard-coded English strings makes localization harder and can create inconsistencies with the rest of the app. Consider using String(localized:) (or NSLocalizedString) for these messages so they can be localized and centrally managed.
```diff
-case .micInUse: "Microphone is in use by another app"
-case .formatError: "Cannot create audio format converter"
+case .micInUse:
+    return String(localized: "Microphone is in use by another app")
+case .formatError:
+    return String(localized: "Cannot create audio format converter")
```