I am working on an ARI application that is supposed to implement a conference room. It’s using the aricpp library.
The guy who worked on this project before me modified chan_alsa.c to support multiple ALSA devices. I’m not taking advantage of that at the moment, but I want to mention it in case it seems like a likely culprit in the issue I’ll describe:
I have one device, running Asterisk, which uses ARI to create a bridge, adds an ALSA channel to the bridge, then creates a SIP channel to dial another device and adds that channel to its bridge. I’ve been calling it “the dialer”.
Another device, also running Asterisk, uses ARI to create a bridge, adds an ALSA channel to the bridge, then adds incoming calls to its bridge. I’ve been calling it “the conference service”.
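For completeness, here is roughly the sequence of ARI calls each application makes, sketched as raw REST request URLs rather than the aricpp calls I actually use. The hostnames, bridge IDs, and channel IDs below are placeholders, not my real configuration:

```python
# Sketch of the ARI REST sequence (placeholder hosts/IDs; the real apps
# drive these endpoints through aricpp rather than raw HTTP).

def create_bridge_url(host: str, bridge_id: str) -> str:
    # POST: create a mixing bridge to carry the conference audio
    return f"http://{host}:8088/ari/bridges?type=mixing&bridgeId={bridge_id}"

def originate_url(host: str, endpoint: str, app: str) -> str:
    # POST: originate a channel (the ALSA channel, or the outbound SIP call)
    # and hand it to the Stasis application named `app`
    return f"http://{host}:8088/ari/channels?endpoint={endpoint}&app={app}"

def add_channel_url(host: str, bridge_id: str, channel_id: str) -> str:
    # POST: add an existing channel to the bridge
    return f"http://{host}:8088/ari/bridges/{bridge_id}/addChannel?channel={channel_id}"

if __name__ == "__main__":
    # The dialer's sequence: bridge, ALSA channel, then the outbound SIP call
    print(create_bridge_url("dialer.local", "dialer-bridge"))
    print(originate_url("dialer.local", "ALSA/default", "dialer"))
    print(originate_url("dialer.local", "SIP/conference-service", "dialer"))
    print(add_channel_url("dialer.local", "dialer-bridge", "alsa-channel-id"))
```

The conference service does the same, except that instead of originating a SIP call it adds each incoming channel (delivered to it via a StasisStart event) to its bridge.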
Eventually both applications will be run on TI Sitara embedded devices (ARM Cortex A processors, running Arago Linux). Currently, I am running the conference service on my laptop, and the dialer on the embedded device.
I expect that when the dialer calls the conference service, users on either “side” should be able to communicate by speaking into or listening to their respective ALSA devices.
Currently, this only works in one direction at a time. The dialer starts with its ALSA channel muted; the conference service does not. When the dialer is added to the conference service’s bridge, Wireshark confirms that an RTP stream immediately starts transmitting audio to the dialer, as expected.
If the dialer’s ALSA channel is unmuted, nothing happens. Wireshark shows there is no RTP traffic from the dialer to the conference service. I expected a stream would start, and that I would hear audio from the dialer’s ALSA device on the conference service’s ALSA device.
If, without unmuting the dialer, the conference service’s ALSA device is muted, the RTP traffic from the dialer to the conference service begins, and the audio from the dialer’s ALSA device can be heard on the conference service’s ALSA device.
In short, it only works when one side is muted.
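In case it matters, the muting and unmuting above is done through ARI’s channel mute resource. Sketched again as raw REST requests with placeholder IDs:

```python
# ARI mute/unmute endpoints, with placeholder host and channel IDs.
# `direction` can be "both", "in" (audio coming from the channel), or
# "out" (audio going to the channel).

def mute_request(host: str, channel_id: str, direction: str = "both"):
    # Muting is a POST to the channel's /mute resource
    return ("POST",
            f"http://{host}:8088/ari/channels/{channel_id}/mute?direction={direction}")

def unmute_request(host: str, channel_id: str, direction: str = "both"):
    # Unmuting is a DELETE against the same resource
    return ("DELETE",
            f"http://{host}:8088/ari/channels/{channel_id}/mute?direction={direction}")
```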
I have also added a little command line to each application, where an ‘h’ command will play “hello world” on the other device. When I expect the RTP stream to be transmitting audio from the dialer’s ALSA device to the conference service, and it isn’t, I can send an ‘h’ command to the dialer, and hear “hello” on the conference service (as well as see RTP traffic in both directions).
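The ‘h’ command boils down to an ARI playback request using one of the stock Asterisk prompts. A sketch, with placeholder host and channel IDs:

```python
# The 'h' command's playback request, sketched as a raw ARI REST call.
# "sound:hello-world" is a stock Asterisk prompt; host/channel are placeholders.

def play_request(host: str, channel_id: str, media: str = "sound:hello-world"):
    # POST to the channel's /play resource starts playback of `media`
    return ("POST",
            f"http://{host}:8088/ari/channels/{channel_id}/play?media={media}")
```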
I would appreciate any suggestions as to what the problem could be. The only things I can think of are that I have a setting misconfigured somewhere, or that either ALSA channel can only play back or record at any given time, so while the dialer is playing audio from the conference service, it’s unable to capture audio to send over its RTP stream.
I apologize for any silly mistakes, I’m new to VoIP applications.