Hi,
I have created the following WebRTC app using the ARI interface on Asterisk 15.4.0:
- WebRTC client1 calls Asterisk (using the jsSIP library).
- The call is received, a new bridge is created, and the incoming channel is added to the bridge.
- A new channel is originated to WebRTC client2; on success this channel is added to the same bridge (see the rough sketch of the ARI calls right after this list).
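For clarity, this is roughly the sequence of ARI REST calls I make (a minimal sketch, not my exact code; the host, credentials, app name "mystasisapp" and endpoint "PJSIP/client2" are placeholders, and error handling is trimmed):

```typescript
// Rough sketch of the ARI call flow (Node 18+, placeholders throughout).
const ARI = 'http://localhost:8088/ari';
const AUTH = 'Basic ' + Buffer.from('ariuser:aripass').toString('base64');

async function post(path: string, params: Record<string, string>) {
  const res = await fetch(`${ARI}${path}?` + new URLSearchParams(params), {
    method: 'POST',
    headers: { Authorization: AUTH },
  });
  if (!res.ok) throw new Error(`${path} failed: ${res.status}`);
  return res.status === 204 ? undefined : res.json();
}

async function handleIncomingCall(incomingChannelId: string) {
  // 1. Create the bridge -- this is where the options in question go.
  const bridge = await post('/bridges', {
    type: 'mixing,proxy_media,dtmf_events', // optionally plus ',video_sfu'
  });

  // 2. Add the incoming (client1) channel that entered the Stasis app.
  await post(`/bridges/${bridge.id}/addChannel`, { channel: incomingChannelId });

  // 3. Originate a channel towards client2 (in the real app I wait for its
  //    StasisStart/answer event before the next step), then bridge it too.
  const outgoing = await post('/channels', {
    endpoint: 'PJSIP/client2',
    app: 'mystasisapp',
  });
  await post(`/bridges/${bridge.id}/addChannel`, { channel: outgoing.id });
}
```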
The problem is that on the second client I am not able to get the correct remote video.
If I create the bridge with the options "mixing,proxy_media,dtmf_events", then both local and remote video on client1 are correct, but on client2 the local and remote video are the same.
It looks like the Asterisk bridge is sending the same video stream (from client2) to both clients.
But if I add "video_sfu" to the previous bridge options, video is not transferred to either client1 or client2. In webrtc-internals I can see that both clients send audio and video, but both clients receive only audio; the received bytes for video are 0 on both!
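The zero received video bytes also show up if I read the stats directly on the client. Roughly like this (a sketch, assuming `session` is the jsSIP RTCSession, so `session.connection` is the underlying RTCPeerConnection):

```typescript
// Periodically dump inbound RTP stats; with video_sfu enabled, the video
// entries report bytesReceived === 0 on both clients, while audio grows.
declare const session: { connection: RTCPeerConnection }; // jsSIP RTCSession

async function logInboundStats(pc: RTCPeerConnection) {
  const stats = await pc.getStats();
  stats.forEach((report: any) => {
    if (report.type === 'inbound-rtp') {
      console.log(report.kind, 'bytesReceived =', report.bytesReceived);
    }
  });
}

setInterval(() => logInboundStats(session.connection), 2000);
```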
It looks like the problem is with the bridge type, but I am not able to figure it out, so any hint or search direction is welcome.
best regards
edvin