What I’m trying to achieve:
Using ARI, a call is originated from Asterisk to a customer’s phone number. Some audio is played and the call ends. Optionally, but very important in this case, another call is originated at the same time from Asterisk to a customer service rep. The rep should be able to talk to the customer and hear whatever audio Asterisk plays back.
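For concreteness, the originate-and-play flow above maps onto two ARI REST calls. A minimal sketch of their request parameters, assuming the app name and sound file are placeholders (the endpoint paths and parameter names follow the ARI `channels` resource):

```python
STASIS_APP = "myapp"  # placeholder Stasis application name


def originate_params(endpoint):
    """Query parameters for POST /ari/channels: originate a call
    (e.g. to the customer's number) into the Stasis app."""
    return {"endpoint": endpoint, "app": STASIS_APP}


def play_params(media="sound:hello-world"):
    """Query parameters for POST /ari/channels/{channelId}/play:
    play a sound file to a single channel."""
    return {"media": media}
```

These would be sent with something like `requests.post(f"{base}/channels", params=originate_params("PJSIP/15551234567"), auth=auth)`, where `base` and `auth` come from your ari.conf setup.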
What is actually happening:
The calls to the customer and to the rep both connect. They can answer and talk to each other, but only the customer can hear the audio played by Asterisk; the customer service rep cannot. I also tested sending DTMF from Asterisk, and the rep can’t hear that either.
What I’m doing:
- I create a bridge with type ‘mixing’
- I create two channels (endpoints are PJSIP), one to reach the customer and one to reach the customer service rep
- When the ‘StasisStart’ event arrives for each channel, I add that channel to the bridge with the ‘participant’ role, not muted, specifying the same app name.
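Sketched as ARI request parameters, steps 1 and 3 above look like the following (parameter names follow the ARI `bridges` resource; the channel id comes from the StasisStart event, and the originate call is the same POST /ari/channels request as for any outbound call):

```python
def create_bridge_params():
    """Query parameters for POST /ari/bridges (step 1):
    a mixing bridge, which Asterisk runs in softmix mode."""
    return {"type": "mixing"}


def add_channel_params(channel_id):
    """Query parameters for POST /ari/bridges/{bridgeId}/addChannel
    (step 3), sent from the StasisStart handler for each channel."""
    return {"channel": channel_id, "role": "participant"}
```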
What things I am seeing, what have I done:
- I see that the bridge is in softmix mode
- changed codec priorities (and even disabled some codecs) on both phones (Polycoms in this case)
- disabled direct media on the PJSIP endpoints
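For reference, disabling direct media on a PJSIP endpoint is done per endpoint in pjsip.conf; a minimal fragment, assuming the endpoint name is a placeholder:

```ini
; pjsip.conf (endpoint name is a placeholder)
[customer-phone]
type = endpoint
direct_media = no
```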
I’ve read the docs a couple of times and I’m sure I’m missing something; I just don’t know what.