I’ve pushed a fork of the Asterisk AudioFork module that enables real-time streaming of stereo audio (assuming duplex calls) to a WebSockets endpoint. We vetted the logic across more than 40,000 calls with one of MinervaCQ’s clients. It appears to work reliably, but if you use it, I’d suggest reviewing it closely for memory leaks or other concurrency pitfalls, as I’m not entirely familiar with all of Asterisk’s concurrency conventions. Thanks again to MinervaCQ for providing the operating context for these improvements to AudioFork. See: kodecharlie/asterisk-audiofork-with-stereo-streaming on GitHub (enable real-time streaming of stereo audio to a WebSockets endpoint).
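For readers unfamiliar with what "stereo from a duplex call" means here: the usual approach is to put the caller's (read) audio on the left channel and the callee's (write) audio on the right, interleaving the two mono 16-bit signed-linear streams sample by sample before sending each frame over the socket. The sketch below is illustrative only, not the module's actual C code; the function name and channel assignment are my own assumptions.

```python
import struct

def interleave_slin(left: bytes, right: bytes) -> bytes:
    """Interleave two mono 16-bit signed linear (SLIN, little-endian)
    buffers into one stereo buffer: L0 R0 L1 R1 ...
    Hypothetical helper: caller (read) audio on the left channel,
    callee (write) audio on the right."""
    assert len(left) == len(right), "duplex frames must be the same length"
    l = struct.unpack(f"<{len(left) // 2}h", left)
    r = struct.unpack(f"<{len(right) // 2}h", right)
    out = [0] * (len(l) * 2)
    out[0::2] = l   # even sample slots: left channel
    out[1::2] = r   # odd sample slots: right channel
    return struct.pack(f"<{len(out)}h", *out)

# Example: two 2-sample mono frames become one 4-sample stereo frame.
left = struct.pack("<2h", 100, 200)
right = struct.pack("<2h", -100, -200)
stereo = interleave_slin(left, right)
print(struct.unpack("<4h", stereo))  # (100, -100, 200, -200)
```

A WebSocket consumer receiving such frames can split the channels back out with the inverse slicing, which is handy when each leg of the call needs separate transcription.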