I am struggling to solve an audio issue on inbound calls using WebRTC. Audio works in both directions for outbound calls (made via WebRTC), but when receiving a call I cannot hear any audio from the caller.
Any ideas? I have tried changing configuration almost everywhere with no luck. Let me know what information you may need and I’d be happy to share!
Adding: I have narrowed it down - the packets are being received (as shown in chrome://webrtc-internals/), but they're all being discarded…
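For anyone who wants to confirm the same counters in code rather than in webrtc-internals, something like this getStats() poll shows them (assuming `pc` is the inbound call's RTCPeerConnection - the field names are the standard inbound-rtp stats):

```typescript
// Poll the inbound audio RTP stats and log received vs. discarded packet counts.
setInterval(async () => {
  const stats = await pc.getStats();
  stats.forEach((report) => {
    if (report.type === 'inbound-rtp' && report.kind === 'audio') {
      console.log('packetsReceived:', report.packetsReceived,
                  'packetsDiscarded:', report.packetsDiscarded);
    }
  });
}, 2000);
```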
The only thing that comes to mind here is that, for some reason, you are not attaching the MediaStream tracks from the PeerConnection to the audio element for inbound calls. But if it works when you make an outbound call, then it's odd that it doesn't when receiving one - one would assume the same code would be in play.
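Something like this is what I mean - a rough sketch, assuming `pc` is the RTCPeerConnection for the inbound call and you have an audio element with the id "remoteAudio":

```typescript
// Attach whatever remote audio track arrives on the PeerConnection to the audio element.
const remoteAudio = document.getElementById('remoteAudio') as HTMLAudioElement;

pc.ontrack = (event: RTCTrackEvent) => {
  // Use the stream the browser already associated with the track, or wrap the bare track.
  const [stream] = event.streams;
  remoteAudio.srcObject = stream ?? new MediaStream([event.track]);
};
```

Worth double-checking that this handler is registered on the PeerConnection you create when answering, not only the one you create when dialling.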
The only other preventative item would be that audio playback requires some kind of human interaction with the page before it will actually play out via the speakers - this is usually solved by requiring the “dial” button or the “answer” button to be clicked. But there is always a chance you failed to mention your system is on auto-answer with absolutely no human interaction with the page… and in that case there is a big glaring warning message in the Console Log that you would likely have already picked up on. Anyway, as I say, this is pretty much covering all bases - without any code samples, it's all just guesses.
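To rule that out, tying playback to the “answer” click is usually enough - a sketch, with the element and button ids just placeholders:

```typescript
// Start playback from inside the user's click handler so the autoplay policy allows it.
const answerButton = document.getElementById('answerBtn') as HTMLButtonElement;
const remoteAudio = document.getElementById('remoteAudio') as HTMLAudioElement;

answerButton.addEventListener('click', async () => {
  // ...answer the call / set the remote description here...
  try {
    await remoteAudio.play();
  } catch (err) {
    // If this rejects, the browser blocked playback - the Console will show the warning.
    console.warn('Audio playback blocked:', err);
  }
});
```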