Use MediaStream from getDisplayMedia in conference - Unable to find a codec translation path


#1

Besides getUserMedia, I would like to use getDisplayMedia as the source of the stream that is passed to SIP.js. It works fine when I call another SIP user directly, but when joining a conference, Asterisk seems to have problems playing the waiting sound:

*CLI>   == Setting global variable 'SIPDOMAIN' to '192.168.0.103'
   -- Executing [2222@default:1] Answer("PJSIP/1000-00000007", "") in new stack
   -- Executing [2222@default:2] ConfBridge("PJSIP/1000-00000007", "guest") in new stack
   -- Channel CBAnn/guest-00000007;2 joined 'softmix' base-bridge <b6d07cbd-dbcc-4a53-9beb-407794cd95da>
[Dec 20 11:12:44] WARNING[9332][C-00000008]: channel.c:5578 set_format: Unable to find a codec translation path: (slin) -> (vp8)
[Dec 20 11:12:44] WARNING[9332][C-00000008]: file.c:1252 ast_streamfile: Unable to open conf-onlyperson (format (vp8)): Function not implemented
   -- Started music on hold, class 'default', on channel 'PJSIP/1000-00000007'
   -- Channel PJSIP/1000-00000007 joined 'softmix' base-bridge <b6d07cbd-dbcc-4a53-9beb-407794cd95da>
[Dec 20 11:12:44] WARNING[9332][C-00000008]: translate.c:485 ast_translator_build_path: No translator path: (starting codec is not valid)
[Dec 20 11:12:44] WARNING[9332][C-00000008]: channel.c:5578 set_format: Unable to find a codec translation path: (slin) -> (vp8)
   -- Channel PJSIP/1000-00000007 left 'softmix' base-bridge <b6d07cbd-dbcc-4a53-9beb-407794cd95da>
   -- Stopped music on hold on PJSIP/1000-00000007
   -- <CBAnn/guest-00000007;1> Playing 'confbridge-join.slin' (language 'en')
   -- <CBAnn/guest-00000007;1> Playing 'confbridge-leave.slin' (language 'en')
   -- Channel CBAnn/guest-00000007;2 left 'softmix' base-bridge <b6d07cbd-dbcc-4a53-9beb-407794cd95da>
   -- Executing [2222@default:3] Hangup("PJSIP/1000-00000007", "") in new stack
 == Spawn extension (default, 2222, 3) exited non-zero on 'PJSIP/1000-00000007'
 == WebSocket connection from '127.0.0.1:58610' closed
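
For context, I hand the screen-capture stream to SIP.js instead of letting it call getUserMedia itself. The sketch below shows roughly how that wiring looks; the use of Web.defaultSessionDescriptionHandlerFactory with a custom media stream factory is an assumption about a typical SIP.js setup (not my exact code), and the URI and WebSocket address are placeholders:

import { UserAgent, Web } from "sip.js"

// Pre-acquire the screen capture (video only, no audio track)
const displayStream = await navigator.mediaDevices.getDisplayMedia({ audio: false, video: true })

// A media stream factory that always hands SIP.js the screen-capture stream
const mediaStreamFactory = () => Promise.resolve(displayStream)

const userAgent = new UserAgent({
  uri: UserAgent.makeURI("sip:1000@192.168.0.103"),
  transportOptions: { server: "wss://192.168.0.103:8089/ws" },   // placeholder WSS address
  sessionDescriptionHandlerFactory: Web.defaultSessionDescriptionHandlerFactory(mediaStreamFactory)
})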

My pjsip template:

[endpoint_webrtc](!)
disallow=all
allow=ulaw,slin,opus,vp8,vp9,h264
context=default
dtls_auto_generate_cert=yes
max_audio_streams=10
max_video_streams=10
rewrite_contact=yes ; necessary if endpoint does not know/register public ip:port
rtp_keepalive=5
rtp_timeout_hold=5
rtp_timeout=5
send_pai=yes
transport=transport-wss
type=endpoint
webrtc=yes
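
The endpoint 1000 from the log just inherits this template, along these lines (the auth/aor section names here are illustrative, not my actual config):

[1000](endpoint_webrtc)
auth=1000
aors=1000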

It seems a bit weird that Asterisk tries to translate audio (slin) to a video codec (vp8). Any idea how to work around this?


#2

Asterisk currently expects an audio stream to exist on a channel. If one doesn't, weird things can happen, including the translation path message you saw. Depending on how exactly you've set things up on the WebRTC side, the stream may not contain audio, so you'd need to look at your code and also check the SDP.
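
For example, from the browser you can check whether an audio track and an audio m-line are actually present (here `stream` is whatever MediaStream you pass to SIP.js and `peerConnection` the underlying RTCPeerConnection):

// Is there an audio track on the outgoing stream?
console.log(stream.getAudioTracks().length)   // 0 means no audio will be offered

// Does the generated offer contain an audio section?
console.log(peerConnection.localDescription.sdp.includes("m=audio"))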


#3

Thanks! Adding an audio track from getUserMedia did indeed fix it. Something like:

// Grab a microphone track so the channel has an audio stream
const userMedia = await navigator.mediaDevices.getUserMedia({ audio: true })
const audio = userMedia.getAudioTracks()[0]

// getDisplayMedia lives on navigator.mediaDevices; the capture itself stays video-only
const stream = await navigator.mediaDevices.getDisplayMedia({ audio: false, video: true })
stream.addTrack(audio)
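
The combined stream then carries both a video and an audio track, which is what Asterisk needs on the channel; a quick check before handing it to SIP.js:

console.log(stream.getTracks().map(t => t.kind))   // e.g. ["video", "audio"]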