Hi,
I am working on a project to send an audio stream to a WebSocket and play back the received audio on the Asterisk channel. Below is my current setup:
- Asterisk Version: 16.9.0
- Components:
- ARI application (Java) to send RTP to an external media point (the externalMedia request it issues is sketched just after this list)
- UDP Server/WebSocket Client (Node.js) for receiving and sending audio to Asterisk
- External media processing server communicating over WebSocket
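My ARI application is written in Java, but for reference the external media channel is created with the equivalent of the following request against the ARI REST endpoint (TypeScript/Node sketch; host, credentials, app name and ports are placeholders):

```typescript
// Minimal sketch: create an external media channel via ARI's REST API.
// Requires Node 18+ for the global fetch; all values below are placeholders.
const ARI_BASE = "http://127.0.0.1:8088/ari";
const AUTH = "Basic " + Buffer.from("ariuser:aripass").toString("base64");

async function createExternalMedia(): Promise<void> {
  const params = new URLSearchParams({
    app: "my-stasis-app",             // same app name used by Stasis() in the dial plan
    external_host: "127.0.0.1:40000", // where my Node.js UDP server listens for RTP
    format: "ulaw",                   // format Asterisk sends and expects back
  });
  const res = await fetch(`${ARI_BASE}/channels/externalMedia?${params}`, {
    method: "POST",
    headers: { Authorization: AUTH },
  });
  if (!res.ok) throw new Error(`externalMedia failed: ${res.status}`);
  const channel = await res.json();
  console.log("external media channel id:", channel.id);
}
```

As far as I understand, the format passed here is the format Asterisk will both send to my UDP server and expect back on the return RTP stream.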
Current Progress:
- Successfully initiated a call using a softphone
- Answered the call in the dial plan and triggered the STASIS application
- Managed the call via ARI, created a new Stasis bridge, and added the channel to the bridge
- Integrated an external media endpoint (UDP Server) to send and receive audio
Successfully:
- Captured RTP audio from the softphone and sent it to the external media processing server
- Received processed audio from the external media server over WebSocket and forwarded it to Asterisk for playback (roughly as in the sketch after this list)
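For context, the forwarding path on the Node.js side looks roughly like the sketch below. It assumes the payload is already 8 kHz mu-law, and that the target host/port is the address Asterisk uses for the external media channel (in practice the source address of the RTP Asterisk sends me); names and addresses are placeholders:

```typescript
import * as dgram from "node:dgram";

// Address/port Asterisk expects return RTP on (placeholder values).
const ASTERISK_RTP = { host: "127.0.0.1", port: 10000 };
const udp = dgram.createSocket("udp4");

let seq = Math.floor(Math.random() * 0xffff);
let timestamp = 0;
const ssrc = 0x12345678;

// Wrap one 20 ms mu-law frame (160 bytes at 8 kHz) in an RTP header and send it.
function sendRtpFrame(payload: Buffer): void {
  const header = Buffer.alloc(12);
  header[0] = 0x80;                         // version 2, no padding/extension/CSRC
  header[1] = 0x00;                         // payload type 0 = PCMU (mu-law)
  header.writeUInt16BE(seq & 0xffff, 2);    // sequence number
  header.writeUInt32BE(timestamp >>> 0, 4); // timestamp in samples
  header.writeUInt32BE(ssrc, 8);            // SSRC
  udp.send(Buffer.concat([header, payload]), ASTERISK_RTP.port, ASTERISK_RTP.host);
  seq = (seq + 1) & 0xffff;
  timestamp = (timestamp + payload.length) >>> 0; // 1 byte = 1 sample for mu-law
}

// Pace frames at 20 ms instead of pushing a whole WebSocket message at once.
const queue: Buffer[] = [];
setInterval(() => {
  const frame = queue.shift();
  if (frame) sendRtpFrame(frame);
}, 20);

// Called for every audio message received on the WebSocket: split into 160-byte frames.
export function onWebSocketAudio(data: Buffer): void {
  for (let off = 0; off + 160 <= data.length; off += 160) {
    queue.push(data.subarray(off, off + 160));
  }
}
```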
Issue Faced:
When the audio is sent back to Asterisk and played on the channel, it is distorted and incomprehensible. I suspect the problem is related to the format of the audio received from the external media processing server (voice bot), or to how it is packetized into RTP before being sent to Asterisk. However, I have verified that the audio coming from the media server itself is correct: it plays properly in a web browser.
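To illustrate what I mean by a format mismatch: since the external media channel was requested in ulaw, my understanding is that the RTP payload going back to Asterisk must be 8 kHz G.711 mu-law, so any 16-bit linear PCM coming from the media server would first need to be encoded (and resampled if it is not at 8 kHz). A minimal sketch of that encoding step, assuming 16-bit signed little-endian PCM already at 8 kHz (function names are mine, just for illustration):

```typescript
// Sketch: encode one 16-bit signed linear PCM sample to a G.711 mu-law byte.
function linearToUlaw(sample: number): number {
  const BIAS = 0x84;
  const CLIP = 32635;
  let s = sample;
  const sign = s < 0 ? 0x80 : 0x00;
  if (s < 0) s = -s;
  if (s > CLIP) s = CLIP;
  s += BIAS;
  let exponent = 7;
  for (let mask = 0x4000; (s & mask) === 0 && exponent > 0; mask >>= 1) {
    exponent--;
  }
  const mantissa = (s >> (exponent + 3)) & 0x0f;
  return ~(sign | (exponent << 4) | mantissa) & 0xff;
}

// Convert a buffer of 16-bit little-endian PCM into a mu-law buffer
// (resampling to 8 kHz, if needed, would be a separate step).
function pcm16ToUlaw(pcm: Buffer): Buffer {
  const out = Buffer.alloc(pcm.length >> 1);
  for (let i = 0; i < out.length; i++) {
    out[i] = linearToUlaw(pcm.readInt16LE(i * 2));
  }
  return out;
}
```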
Any help or guidance on this would be really appreciated.
Thanks