How to get real-time audio streams of both the calling party and the called party independently

Hi,
How can we access the real-time audio streams of both the calling party and the called party separately and send them to an external service? What is the recommended path to achieve this?

Scenario: Suppose a customer “A1” dials a customer “A2”. In order to get live transcripts of both participants, how can we access the live audio streams of “A1” and “A2” separately once customer A2 answers the call and the conversation starts?

PLEASE HELP.

The easiest option is to use ARI to snoop on each channel and bridge each snoop channel to an external media channel, which then sends the audio to your ARI application.
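As a rough illustration of what arrives at the application side (not from the thread): the external media channel sends plain RTP over UDP, so the receiver strips the 12-byte RTP header to recover the raw audio payload. A minimal sketch, assuming the common Asterisk case of no CSRC entries and no header extension:

```javascript
// Strip a basic 12-byte RTP header and return the audio payload.
// Assumes no CSRC entries and no header extension; a fuller parser
// would check the CC and X bits in the first header byte.
function rtpPayload(packet) {
  const RTP_HEADER_LEN = 12;
  if (packet.length < RTP_HEADER_LEN) {
    throw new Error('packet too short to be RTP');
  }
  return packet.subarray(RTP_HEADER_LEN);
}

// Example: 12 fake header bytes followed by 4 bytes of mu-law audio.
const packet = Buffer.concat([
  Buffer.alloc(12, 0x80),                 // fake RTP header
  Buffer.from([0xff, 0x7f, 0xfe, 0x7e]),  // payload
]);
console.log(rtpPayload(packet)); // <Buffer ff 7f fe 7e>
```

This is essentially what the `rtp-udp-server.js` helper from the asterisk-external-media repository does before handing the audio on.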


Getting good transcripts requires seeing both sides of the conversation. Unless there is a speech-to-text service that takes a split feed, I’d suggest that you will get better transcripts by feeding it the combined audio.
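Not from the thread, but if you do end up with two split μ-law feeds and the speech-to-text service wants one combined stream, you can decode G.711 μ-law to linear PCM and average the two streams sample by sample. A minimal sketch:

```javascript
// Decode one G.711 mu-law byte to a 16-bit linear PCM sample
// (standard G.711 expansion).
function ulawToPcm(u) {
  u = ~u & 0xff;
  const sign = u & 0x80;
  const exponent = (u >> 4) & 0x07;
  const mantissa = u & 0x0f;
  const sample = (((mantissa << 3) + 0x84) << exponent) - 0x84;
  return sign ? -sample : sample;
}

// Mix two mu-law buffers into one PCM Int16Array by averaging the
// decoded samples; averaging cannot overflow 16 bits.
function mixUlaw(a, b) {
  const out = new Int16Array(Math.min(a.length, b.length));
  for (let i = 0; i < out.length; i++) {
    out[i] = (ulawToPcm(a[i]) + ulawToPcm(b[i])) >> 1;
  }
  return out;
}

// 0xff is mu-law silence and decodes to 0.
console.log(ulawToPcm(0xff)); // 0
```

Whether mixing before transcription actually helps depends on the service, as the post above notes.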

I assume you don’t want just a Palantype transcript (purely phonetic).

Even with combined feeds, you need either to backtrack, or to transcribe somewhat behind real time, to allow for good recognition. You could see this in the live subtitling of news programmes, and that was even with Palantype capture, which, I suspect, still gives better phoneme recognition than machine capture.


Thanks for the details.

Hi,
We are trying the mentioned approach but are not getting the audio streams, so could you please check whether we missed or misused anything here. Thanks.

We are running this script on the same server that runs Asterisk.

AudioCapture.js

const client = require('ari-client');
const rtp = require('./rtp-udp-server');

async function connect() {
  try {
    const swap16 = false;
    const ariClient = await client.connect("http://127.0.0.1:8088", "asterisk", "asterisk");
    await ariClient.start("externalMedia");
    console.log("Connected to Asterisk ARI");

    const bridges = await ariClient.bridges.list();
    console.log("number of bridges:", bridges.length);

    const bridgeId = "bridgeId";   // will update the bridge id of the current call each time, for the time being
    if (bridgeId) {
      const bridge = await ariClient.bridges.get({ bridgeId });
      const channelsInBridge = bridge.channels;   // list of channel ids

      // Use a separate UDP port, external media channel and snoop bridge
      // per party, so each party's audio arrives independently.
      let port = 9999;
      for (const channel of channelsInBridge) {
        const audioServer = new rtp.RtpUdpServerSocket(`localhost:${port}`, swap16, `party-${channel}.raw`);

        // spy: 'both' captures both directions; use spy: 'in' if you want
        // only what this party says on this snoop channel.
        const snoopChannel = await ariClient.channels.snoopChannel({
          channelId: channel,
          app: 'externalMedia',
          spy: 'both',
          snoopId: 'snoop-' + channel,
        });
        console.log('Snooping on channel id', snoopChannel.id);

        // externalMedia is a top-level channels operation; it creates a new channel.
        const externalChannel = await ariClient.channels.externalMedia({
          app: 'externalMedia',
          external_host: `localhost:${port}`,
          format: 'ulaw',
        });

        // The snoop channel and its external media channel must be in the
        // same bridge, otherwise no audio reaches the UDP server.
        const snoopBridge = await ariClient.bridges.create({ type: 'mixing' });
        await ariClient.bridges.addChannel({ bridgeId: snoopBridge.id, channel: snoopChannel.id });
        await ariClient.bridges.addChannel({ bridgeId: snoopBridge.id, channel: externalChannel.id });

        port += 2;   // keep even ports and leave room for RTCP
      }
    }
    console.log('Waiting for audio streams ..... ');

  } catch (error) {
    console.error("Error connecting to Asterisk ARI:", error.message);
  }
}

// Call the connect function
connect();

rtp-udp-server.js → taken from: https://github.com/asterisk/asterisk-external-media/blob/master/lib/rtp-udp-server.js

I don’t do Javascript, sorry.

The steps we followed here are:

  • Get the bridgeId of the current active call

  • Take each channel in that bridge

  • For each channel, create a snoopChannel

  • Then create an externalMedia channel corresponding to each snoopChannel (while creating the externalMedia channel, provide the hostname and port of the rtp-udp-server)

  • Finally, create a new bridge and add these externalMedia channels to it

Are these the correct steps to get the audio streams?

Please help

You also have to place the snoop channels in the bridge that contains the external media channels, and afterwards verify that everything occurred as expected. You can also confirm things such as whether the snoop channel is working by recording it and playing the recording back, or play an audio file to an external media channel to have audio sent out.
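To illustrate the verification idea (a sketch, not from the thread; the recording name and channel ids are made up): the ARI operations involved are `channels.record` on the snoop channel and `channels.play` on the external media channel. Writing the helper against any connected ari-client instance also lets you exercise it with a stub:

```javascript
// Sanity-check a snoop/external-media pair via ARI (sketch, assuming a
// connected ari-client instance): record the snoop channel so you can
// listen to what it captured, and play a sound file to the external
// media channel so audio flows out to the UDP server.
async function verifyPair(ariClient, snoopChannelId, externalChannelId) {
  await ariClient.channels.record({
    channelId: snoopChannelId,
    name: 'snoop-check',            // hypothetical recording name
    format: 'wav',
    ifExists: 'overwrite',
  });
  await ariClient.channels.play({
    channelId: externalChannelId,
    media: 'sound:demo-congrats',   // any sound file installed on the server
  });
}

// Exercise the helper with a stub client that just records the calls made.
const calls = [];
const stubClient = {
  channels: {
    record: async (opts) => calls.push(['record', opts]),
    play: async (opts) => calls.push(['play', opts]),
  },
};
verifyPair(stubClient, 'snoop-1', 'ext-1').then(() => console.log(calls.length)); // 2
```

Against a real Asterisk you would then fetch the stored recording (or just listen to `party-*.raw`) to confirm audio is actually flowing.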


Thanks for the input. It worked.

