SDK version: stream_video 1.3.1
Platform: iOS only
Description
When a speaker mutes or unmutes their microphone in an audio room, the playback volume of other active speakers briefly drops for ~1-2 seconds before recovering.
Root Cause
When the audio track is first published, _createAndPublishTrack calls publishAudioTrack without passing stopTrackOnMute, so it uses the parameter default of true, which gets baked into the track object:
```
_createAndPublishTrack()
  → publishAudioTrack(track: it.data)
      // publishAudioTrack has: bool stopTrackOnMute = true  ← origin of true
  → audioTrack.copyWith(stopTrackOnMute: true)               // stored on the track object
```
Then when muting:
```
call.setMicrophoneEnabled(enabled: false)
  → rtcManager.setMicrophoneEnabled(enabled: false)
  → _setTrackEnabled(trackType: audio, enabled: false)
  → _toggleTrackMuteState(track, muted: true)
  → muteTrack(trackId: track.trackId)              // no stopTrackOnMute override passed
  → originalTrack.copyWith(stopTrackOnMute: null)  // null = no change, keeps true
  → track.stopTrackOnMute == true
  → track.stop()  // ← releases hardware mic, deactivates AVAudioSession
```
track.stop() fully releases the hardware microphone and deactivates AVAudioSession. Since iOS uses a single shared audio session for both mic input and speaker output, this interruption briefly cuts the playback of all incoming remote audio streams.
The same issue occurs on unmute — unmuteTrack calls track.recreate() when stopTrackOnMute == true, which reinitializes the entire WebRTC audio pipeline, causing a second AVAudioSession disruption.
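The copyWith behavior driving this can be reduced to a few lines of plain Dart (a minimal sketch, not the SDK's actual track class): because a null argument means "no change", the true set at publish time survives every subsequent mute call.

```dart
// Sketch of the copyWith pattern that keeps the flag set forever.
// `Track` here is a stand-in, not the SDK's RtcLocalTrack.
class Track {
  final bool stopTrackOnMute;
  const Track({this.stopTrackOnMute = true});

  // null means "keep the current value" — there is no way to clear the flag
  // without explicitly passing stopTrackOnMute: false.
  Track copyWith({bool? stopTrackOnMute}) =>
      Track(stopTrackOnMute: stopTrackOnMute ?? this.stopTrackOnMute);
}

void main() {
  final published = Track(stopTrackOnMute: true);               // set at publish time
  final muted = published.copyWith(stopTrackOnMute: null);      // mute path passes null
  print(muted.stopTrackOnMute);                                 // true: flag never cleared
}
```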
Expected Behavior
Muting the local mic should call track.disable() only (sets mediaTrack.enabled = false at the WebRTC layer — stops sending audio without touching AVAudioSession). Remote audio playback should be completely unaffected.
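The distinction can be illustrated with a self-contained mock (hypothetical classes, not the SDK's or WebRTC's actual API): disable() flips only the track-level enabled flag, while stop() also tears down the shared session that remote playback depends on.

```dart
// Mock of the shared iOS audio session: one session serves both mic input
// and speaker output, so deactivating it interrupts remote playback too.
class MockAudioSession {
  bool active = true;
}

class MockLocalTrack {
  bool enabled = true;
  final MockAudioSession session;
  MockLocalTrack(this.session);

  // Desired mute path: silence outgoing audio only.
  void disable() => enabled = false;

  // 1.3.1 mute path with stopTrackOnMute == true: also kills the session.
  void stop() {
    enabled = false;
    session.active = false;
  }
}

void main() {
  final session = MockAudioSession();
  final track = MockLocalTrack(session);
  track.disable();
  print(session.active); // true: remote playback keeps running
}
```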
Suggested Fix
In _toggleTrackMuteState (lib/src/webrtc/rtc_manager.dart), pass stopTrackOnMute: false for audio tracks, or expose it as a parameter on setMicrophoneEnabled:

```dart
Future<RtcLocalTrack> _toggleTrackMuteState({
  required RtcLocalTrack track,
  required bool muted,
}) async {
  if (muted) {
    await muteTrack(
      trackId: track.trackId,
      // Audio: keep the shared AVAudioSession alive; video tracks unchanged.
      stopTrackOnMute: track.trackType == SfuTrackType.audio ? false : null,
    );
  }
  // ... unmute path unchanged
}
```
Note: muteTrack already accepts stopTrackOnMute as an optional override param in 1.3.1 — the plumbing exists, it just isn't wired up from the public API path.
Reproduction Steps
- Join an audio room as a speaker with 1+ other active speakers
- Confirm you can hear other participants clearly
- Tap mute
- Observe ~1-2 second dip in other participants' volume
Workaround
None available from user code. call.muteSelf() hits the same code path via a server round-trip and also requires muteUsers permission.