I would like to know whether the idea I have in mind is feasible. When Asterisk receives an incoming call, is it possible to send the raw audio stream over a WebSocket? I have an endpoint that listens on WS, and I'd like to send the raw audio that way. I have seen that Asterisk has an option to send audio over WS, but then the client has to use libraries like JsSIP. I just want my client to receive the raw remote audio. Is it viable to create an Asterisk module that does that? Does what I'm saying make sense?
There is no built-in ability to send audio over WebSockets, not even in the case of JsSIP. Custom code would have to be written in Asterisk to do it. The foundations needed to write it should all be present, but you'd have to put them together: get the audio, then send it over an existing WebSocket or open a new one. It depends on what you want to do, and on how the audio should be wrapped, or whether it is literally a raw stream with no timestamps or sequence numbers.
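To illustrate the "how should it be wrapped" question, here is one hedged sketch of a framing helper. Nothing here is an Asterisk API; the header layout (sequence number, timestamp, payload length, all invented for this example) is just one possible way to wrap a chunk of signed-linear PCM into a WebSocket binary message, as opposed to sending the bare bytes.

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical framing: a 12-byte header (sequence number, timestamp in
 * milliseconds, payload length), big-endian, followed by raw PCM bytes.
 * Asterisk does not define this format; it is only an example. */
struct framed_chunk {
    uint8_t *data;  /* header + payload */
    size_t len;     /* total length in bytes */
};

static void put_be32(uint8_t *p, uint32_t v)
{
    p[0] = v >> 24; p[1] = v >> 16; p[2] = v >> 8; p[3] = v;
}

/* Wrap one PCM chunk for sending as a single WebSocket binary message.
 * Caller frees chunk->data. Returns 0 on success, -1 on allocation failure. */
static int wrap_chunk(struct framed_chunk *chunk, uint32_t seq, uint32_t ts_ms,
                      const uint8_t *pcm, uint32_t pcm_len)
{
    chunk->len = 12 + pcm_len;
    chunk->data = malloc(chunk->len);
    if (!chunk->data) {
        return -1;
    }
    put_be32(chunk->data, seq);
    put_be32(chunk->data + 4, ts_ms);
    put_be32(chunk->data + 8, pcm_len);
    memcpy(chunk->data + 12, pcm, pcm_len);
    return 0;
}
```

If your endpoint only ever consumes one stream and can tolerate loss/reordering as TCP presents it, you can skip the header entirely and send the raw payload; the header only matters if the receiver needs to detect gaps or reconstruct timing.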
Then the best approach would be to create a custom module. Is that correct?
Are there functions in Asterisk to pick up the raw audio of an inbound channel from inside a module?
Thanks!
That depends on what exactly you mean by "pick up the raw audio". If you want it in a blocking fashion, then an application gets it. If you want to spy/snoop on a channel, then the audiohooks API or the framehooks API can get the audio. Like I said, the foundation is there.
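The real framehook API lives in Asterisk's `include/asterisk/framehook.h` and needs the full Asterisk build tree, so the following is only a self-contained sketch of the pattern with stand-in types (these are not Asterisk's actual structs or enums): a hook sees every frame passing through the channel and picks out read-direction voice frames, whose payload is what you would hand to your WebSocket write.

```c
#include <stddef.h>
#include <stdint.h>

/* Stand-ins for Asterisk's frame machinery; the real definitions live in
 * include/asterisk/frame.h and include/asterisk/framehook.h. */
enum frame_type { FRAME_VOICE, FRAME_DTMF, FRAME_CONTROL };
enum hook_event { HOOK_EVENT_READ, HOOK_EVENT_WRITE };

struct frame {
    enum frame_type type;
    const uint8_t *payload;  /* raw audio bytes for voice frames */
    size_t payload_len;
};

/* Decide whether a frame should be forwarded over the WebSocket.
 * Read-direction voice frames carry the remote party's audio; everything
 * else (DTMF, control, our own write-direction audio) is ignored.
 * Returns the bytes to send and sets *out_len, or NULL if nothing to send. */
static const uint8_t *filter_frame(enum hook_event event,
                                   const struct frame *f, size_t *out_len)
{
    if (event == HOOK_EVENT_READ && f->type == FRAME_VOICE) {
        *out_len = f->payload_len;
        return f->payload;
    }
    *out_len = 0;
    return NULL;
}
```

In a real module, a callback of this shape would be attached to the channel with the framehook attach function, and the returned payload passed to whatever WebSocket-sending code you write.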
I would like to accept an incoming call, get the raw remote audio, and send it as messages over the WebSocket. There is no need to spy on the call; the same custom code would answer it.
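Since the custom code is meant to answer the call itself, the natural fit is a dialplan application registered by the module. As a sketch, an `extensions.conf` entry could look like the following, where `AudioToWebSocket` is a made-up name for the application your module would register (`Answer()` and `Hangup()` are real Asterisk applications):

```
[incoming]
exten => 100,1,Answer()
 same => n,AudioToWebSocket(ws://example.com/audio)
 same => n,Hangup()
```

The application would block for the duration of the call, reading frames from the channel and writing each voice payload out as a WebSocket message.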
Regards