To improve my project, I need to stop the audio playback when the person on the other side starts speaking during the play.
With ARI, I have a Stasis app to originate a call, and when the client asks a question, I have to send him the audio with the right answer using this.channel.play(media…).
But when the client speaks during the audio play (I get a signal when he is speaking), I can't stop it, because ARI waits for the audio play to finish before continuing…
Thanks a lot for your help, I have only been working with Asterisk for 3 weeks.
Thanks for the quick answer.
I am using ARI in nodejs. To play the audio I use:
this.channel.play(media…)
in the file 'file1.js'.
In the file 'file2.js' I have the speech-to-text from GitHub, and when a new word is detected, I send a signal to 'file1.js'.
But this signal only arrives when the play is finished…not during the play…
Also, this.channel.delete does not exist with ARI…
I don’t do javascript. The play returns a playback. You call delete on the playback to stop it. There is also a delete on channels, but that is for hanging them up.
Your problem doesn't really appear to be ARI related, but rather the JavaScript and how ARI is used in it.
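In Node.js with the ari-client library, that pattern would look roughly like this (a minimal sketch; the ari handle, the sound name, and where you react to the speaking signal are assumptions):

const playback = ari.Playback();

channel.play({media: 'sound:my-answer'}, playback)
  .catch(function(err) { console.error('play failed', err); });

// later, when your speech detection signals that the caller is talking,
// stopping the playback maps to DELETE /playbacks/{playbackId}
playback.stop()
  .catch(function(err) { console.error('failed to stop playback', err); });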
Thanks for your answer and sorry for the late response.
I tried a lot of options, and the easiest was playing audio with MOH, but during the conversation I can't play the playlist only once; it plays it without interruption…
I use an external media channel to perform speech to text, but when a play is running, the external channel does not send data to my websocket to perform the speech to text during the play…
EXTERNAL MEDIA → WEBSOCKET → SPEECH TO TEXT (transcription from the called number)
SPEECH TO TEXT → AUDIO RESPONSE PLAYED on the LOCAL CHANNEL to the called number
During PLAY, the voice from the CALLED NUMBER is not detected or recognized…
In addition, if I use:
localChannel.on('ChannelDtmfReceived', function(event, channel) {
  console.log(event.digit);
});
When the callee presses a digit, it is recognized.
You haven’t specified how things are connected together. For example if caller is connected to external media in a bridge and you call play on caller, then external media won’t receive audio. A snoop channel, which can passively listen to the audio from caller, would need to be used as it would continue to receive audio.
ChannelTalkingStarted is only raised if you have set the TALK_DETECT dialplan function on the channel.
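For illustration, a rough ari-client sketch of both points (inside an async function; 'sipChannel' stands for the SIP channel you want to keep hearing and 'externalMedia' for the Stasis app name, both assumptions based on this thread):

// create a snoop channel that passively spies on the incoming ('in') audio
// of sipChannel; it keeps receiving that audio even while play() runs there
const snoop = await sipChannel.snoopChannel({app: 'externalMedia', spy: 'in'});
// on the snoop channel's StasisStart, bridge it with the ExternalMedia
// channel instead of bridging sipChannel itself

// TALK_DETECT is set on the channel you want talking events from
// (the SIP channel), not on the ExternalMedia channel
await sipChannel.setChannelVar({variable: 'TALK_DETECT(set)', value: ''});
sipChannel.on('ChannelTalkingStarted', function() {
  console.log('party started talking');
});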
The external channel is added to the bridge:

externalChannel = ari.Channel();
externalChannel.on('StasisStart', (event, chan) => {
  bridge.addChannel({channel: chan.id});
});
The external channel is sent to a websocket:

resp = externalChannel.externalMedia({
  app: "externalMedia",
  external_host: host,
  format: format
});
Asterisk console output:
Activating Stasis app ‘externalMedia’
== WebSocket connection from ‘ADRESS’ for protocol ‘’ accepted using version ‘13’
== Using SIP RTP CoS mark 5
– Called CALEE_SIP
> 0x7f87ac0095f0 – Strict RTP learning after remote address set to: ADRESS
– Called ADRESS
– UnicastRTP/ADRESS-0x7f87ac007100 answered
> Launching Stasis(externalMedia) on UnicastRTP/ADRESS-0x7f87ac007100
– Channel UnicastRTP/ADRESS-0x7f87ac007100 joined ‘simple_bridge’ stasis-bridge <96218d5a-edcb-4652-9da3-275de5e93637>
== Using SIP RTP CoS mark 5
> 0x7f86e40125b0 – Strict RTP learning after remote address set to: CALLEIP:8000
– SIP/CALEE-SIP-00000136 is ringing
> 0x7f87a000dcd0 – Strict RTP learning after remote address set to: CALLEIP:8000
– SIP/CALEE-SIP-00000136 answered
> Launching Stasis(externalMedia) on SIP/CALEE-SIP-00000136
– Channel SIP/CALEE-SIP-00000136 joined ‘simple_bridge’ stasis-bridge <96218d5a-edcb-4652-9da3-275de5e93637>
> 0x7f87a000dcd0 – Strict RTP switching to RTP target address CALEEIP:8000 as source
– <SIP/CALEE-SIP-00000136> Playing ‘/root/file.slin’ (language ‘en’)
> 0x7f87a000dcd0 – Strict RTP learning complete - Locking on source address CALEEIP:8000
– <SIP/CALEE-SIP-00000136> Playing ‘/tmp/ANSWER.slin’ (language ‘en’)
So I just have to listen on a snoop channel, which I can create from the external channel, to listen without interruption?
For the second point, TALK_DETECT was set:
chan.setChannelVar({channelId: this.externalChannel.id,
    variable: {'TALK_DETECT(set)': '1500'}})
  .then(function(event) { console.log(event); })
  .catch(function(err) { console.error('failed to set channel var', err); });

const playback = await chan.play({media: 'sound:/root/mysound'});
I would suggest that, instead of trying to piece all of this together, you make small applications to learn and experiment with the specific individual components so you understand them better, and then piece that knowledge together.
So in the end, with Node.js, it works using the snoop channel.
During my call, I open a bridge between two channels; the channel used to send audio to the listening server is bridged to a third channel: the snoop channel spying on the 'in' direction.
That way I can listen to my callee even while a play is running.
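Roughly, that arrangement looks like this with ari-client (inside an async function; the channel names, app name, host and format are placeholders, error handling omitted):

const mediaBridge = await ari.bridges.create({type: 'mixing'});

// snoop channel spying on the callee's incoming ('in') audio
const snoop = await calleeChannel.snoopChannel({app: 'externalMedia', spy: 'in'});
snoop.on('StasisStart', (event, chan) => mediaBridge.addChannel({channel: chan.id}));
// (to avoid missing StasisStart, the snoop channel could instead be created
// with a known id via snoopChannelWithId and the listener registered first)

// the ExternalMedia channel joins the same bridge and streams the snooped
// audio to the transcription server, unaffected by play() on calleeChannel
const extMedia = ari.Channel();
extMedia.on('StasisStart', (event, chan) => mediaBridge.addChannel({channel: chan.id}));
extMedia.externalMedia({app: 'externalMedia', external_host: host, format: 'slin16'});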
One last question (maybe not the least): when we call externalMedia or snoopChannel, one of the arguments is the app. What does it mean? Because this app seems to do nothing…
At the beginning of the code we start an app with this.ari.start("externalMedia") and then we just pass it as an argument…So it looks like it is just a ghost argument…What is the real power of the app?