Video not initializing on the callee device

Hello, I would like some help with an Asterisk server.

I’m quite new to this kind of server, and I’m developing a web project that calls a telephone extension.

The problem is, when I start the call to the phone extension, the audio stream works without a problem, but the video stream from my web app only shows up properly on the phone extension after some time, or after I cover the camera. When I cover the computer’s camera, the video stream starts showing up properly on the phone extension…

On the other hand, the web app receives the media streams correctly, both video and audio from the phone extension.

Has anybody run into this before?

If you want logs or anything like that, I can provide them.

The library that I’m using in the web app to call the phone extensions is SIPML5.

On Wednesday 06 March 2024 at 19:15:15, iago_barbosa_fadami via Asterisk
Community wrote:

Hello, I would like some help with an Asterisk server.

I’m quite new to this kind of server, and I’m developing a web project
that calls a telephone extension.

I think you’ll have to give us quite a bit more information about what you’re
trying to do, and also how you’re trying to do it, before we can be of much
help.

You say “a web project”. How does this interact with Asterisk?

You say “call one telephone extension”. What is on the other end of the call?

(You also say “extension”, and you’re using it in the completely normal way
that most people talk about telephone extensions. Be aware, though, that in
the Asterisk world the word “extension” has a completely different and highly
specific meaning. That is very unfortunate, but it means it’s probably best
to avoid the word altogether in your description and just say “telephone”.)

The problem is, when I start the call to the phone extension,

Please explain what this means in detail. How does Asterisk know you are
“starting a call”?

the audio stream works without a problem, but the video stream only works
properly after some time or after covering the camera.

I think we need to know things such as:

a) how is the camera connected to Asterisk?

b) which codec is being used for video?

c) what do you see in the Asterisk console with reasonably (-vvv at least)
verbose logging turned on:

  1. when the call gets answered

  2. when the audio starts

  3. when the video starts
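
For reference, here is a sketch of how that verbosity can be turned on (standard Asterisk CLI commands; the exact level is a matter of taste):

```shell
# attach a console to the running Asterisk daemon with verbose output
asterisk -rvvv

# or, from an already-connected Asterisk CLI, raise the verbosity level
core set verbose 5
```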

When I cover the camera, the video stream starts working properly…

I think we need to see a SIP packet capture (including SDP) to understand
what’s happening here.

On the other hand, the web app receives the media streams correctly, both
video and audio from the phone extension.

So, the end which initiates the connection gets good audio and video from the
start, whereas the end being called gets immediate audio, but the video takes
a while to become stable? Is that correct?

What happens if you place the call the other way around - from the
(video)phone to the web app?

The lib that I’m using on the Web App to call the Phone Extensions is
SIPML5.

Please show us the minimum amount of code necessary to show how this is
communicating with Asterisk.

Antony.


A: Because it messes up the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying way of replying to e-mail?

                                               Please reply to the list;
                                                     please *don't* CC me.

On Wednesday 06 March 2024 at 19:15:15, iago_barbosa_fadami via Asterisk
Community wrote:

Hello, I would like some help with an Asterisk server.

I’m quite new to this kind of server, and I’m developing a web project
that calls a telephone extension.

Oh - one additional question - what does Asterisk do in this setup?

If you have a SIP-compliant web application and a SIP-compliant phone, and
you’re developing a project which simply calls “one telephone extension”, why
is Asterisk needed at all?

Antony.



Hello Pooh!

Sorry, I’ll just say “telephone” from now on. I didn’t know it could be misunderstood, so thanks for the info.

I’m developing a web project that connects to Asterisk with the JavaScript library SIPML5. I don’t know all the details about how the library places the call, but I know that it’s through WebRTC.

In the project we have a workflow that goes something like this:

a) how is the camera connected to Asterisk?

The user uses the WebApp to make a video call to a phone. The WebApp runs on a computer that has a webcam connected to it. The microphone used is the webcam’s internal mic.

b) which codec is being used for video?

The video codecs that we use on both sides (Phone and WebApp) are H.264, VP9 and VP8, and the audio codecs that we use are alaw, ulaw, Opus, G.726, G.722, G.729 and GSM.

I think we need to see a SIP packet capture (including SDP) to understand
what’s happening here.

Later I will post the logs from the events here and try to capture the SIP packets. Can I use Wireshark, or do you recommend any other software?

So, the end which initiates the connection gets good audio and video from the
start, whereas the end being called gets immediate audio, but the video takes
a while to become stable? Is that correct?

Yes: the end which initiates the connection gets good audio and video from the start, while the other end gets audio from the start, but the video takes a while to appear. Alternatively, you can just cover up the camera and the video starts working on the telephone.

What happens if you place the call the other way around - from the
(video)phone to the web app?

The same thing happens.

#makeCall = () => {
    watch(() => this.#store.makeCall, () => {
        if (this.#store.makeCall) {
            this.#store.$patch({ showDialog: false, showCallDialog: true });
            setTimeout(() => {
                this.#sipCallConfiguration.audio_remote = document.getElementById("remote_audio");
                this.#sipCallConfiguration.video_remote = document.getElementById("remote_video");
                this.#sipCallConfiguration.video_local = document.getElementById("local_video");
                this.#currentSessionCall = this.#sipStack.newSession('call-audiovideo', this.#sipCallConfiguration);
                console.log('000', this.#sipCallConfiguration);
                this.#currentSessionCall.call(`${this.#siaConfigurationStore.ramalToCall}@${this.#siaConfigurationStore.sipServer}`);
                this.#endCallListener();
            }, 1000);
        }
    });
}

#acceptCall = () => {
    this.#store.$patch({ showDialog: false });
    this.#store.$patch({ showCallDialog: true });
    this.#stopRingbackTone();
    this.#endCallListener();
    setTimeout(() => {
        this.#sipCallConfiguration.audio_remote = document.getElementById("remote_audio");
        this.#sipCallConfiguration.video_remote = document.getElementById("remote_video");
        this.#sipCallConfiguration.video_local = document.getElementById("local_video");
        this.#currentSessionCall.accept(this.#sipCallConfiguration);
        console.log(this.#sipCallConfiguration);
    }, 2000);
}

#sipCallConfiguration = {
    audio_remote: document.getElementById("remote_audio"),
    video_local: document.getElementById("local_video"),
    video_remote: document.getElementById("remote_video"),
    bandwidth: { audio: 320, video: 320 },
    video_size: { minWidth: 150, minHeight: 150, maxWidth: undefined, maxHeight: undefined },
    events_listener: { events: '*', listener: this.#eventsListener },
    sip_caps: [
        { name: '+g.oma.sip-im' },
        { name: 'language', value: '"en,fr"' }
    ]
}

#createSipStack = () => {
    this.#sipStack = new SIPml.Stack({
        realm: 'fadami.local',
        impi: `${this.#siaConfigurationStore.ramalUser}`,
        impu: this.#siaConfigurationStore.serverConnectorUrl,
        password: this.#siaConfigurationStore.ramalUserPassword,
        display_name: `${this.#siaConfigurationStore.ramalNumber}`,
        websocket_proxy_url: `wss://${this.#siaConfigurationStore.sipServer}:8089/ws`,
        outbound_proxy_url: `tcp://${this.#siaConfigurationStore.sipServer}:5060`,
        enable_rtcweb_breaker: true,
        disable_video: false,
        events_listener: { events: '*', listener: this.#eventsListener },
        sip_headers: [
            { name: 'User-Agent', value: 'IM-client/OMA1.0 sipML5-v1.0.0.0' },
            { name: 'Organization', value: 'Fadami' }
        ]
    });
}

On Wednesday 06 March 2024 at 22:15:46, iago_barbosa_fadami via Asterisk
Community wrote:

The video codecs that we use on both sides (Phone and WebApp) are H.264,
VP9 and VP8, and the audio codecs that we use are alaw, ulaw, Opus, G.726,
G.722, G.729 and GSM.

Wow, that is a big range. I would recommend that you narrow things down
(especially on the video codecs) to see whether selecting just a single codec
on each end gives you better performance (or at least, better connectivity and
media setup times).

Why do you have such a wide variety of codecs?
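
By way of illustration, narrowing things down on the Asterisk side (with chan_pjsip) could look something like the fragment below; the endpoint name is just a placeholder, not something taken from your setup:

```ini
; pjsip.conf sketch - restrict an endpoint to one audio and one video codec
[webrtc-client]        ; placeholder endpoint name
type=endpoint
disallow=all
allow=ulaw
allow=h264
```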

Later I will post the logs from the events here and try to capture the SIP
packets. Can I use Wireshark, or do you recommend any other software?

Wireshark will do fine - you may want to investigate the application sngrep as
well, which is specifically for capturing and displaying SIP communications.
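
For example, either of these would get you a capture you can open in Wireshark (the interface and port here are assumptions; adjust them for your system):

```shell
# capture SIP signalling (including the SDP bodies) to a pcap file
sudo tcpdump -i any -s 0 -w sip.pcap port 5060

# or watch the SIP dialogs interactively with sngrep, saving a copy as well
sudo sngrep -d any -O sip.pcap port 5060
```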

Personally I’m not certain whether WebRTC is just another way of communicating
between endpoints using SIP, or whether WebRTC is an alternative to SIP. I
think other people here know a lot more about this topic than I do.

What happens if you place the call the other way around - from the
(video)phone to the web app?

The same thing

Sorry; your code trace / debug log made no sense to me.

Please simply explain:

  • if the call is placed from the videophone to the web application,

  • does the videophone get immediate audio and video, and the web application
    gets audio, but the video takes a while to appear, or

  • does the web application get the immediate audio and video (as is the case
    when it initiates the call), and the caller this time gets the video problem?

Also, as I asked a little belatedly - what is Asterisk doing in this setup?

Is it simply acting as a “proxy” between a web application which speaks WebRTC
and a videophone which speaks UDP/SIP, or is there more to it than that?
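
From what I understand, when Asterisk does sit in the middle like that, the WebRTC side is typically configured as a chan_pjsip endpoint behind a WSS transport; a minimal sketch (section names are placeholders, not taken from your setup):

```ini
; pjsip.conf sketch only
[transport-wss]
type=transport
protocol=wss
bind=0.0.0.0

[webrtc-client]   ; placeholder endpoint name
type=endpoint
webrtc=yes        ; enables ICE, DTLS-SRTP and AVPF in one option (Asterisk 15+)
```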

Antony.



Pooh, I’m using Asterisk because of the SIP protocol.
The company I work for sells software to toll plazas, and the tolls use those old phones that communicate over the SIP protocol, so we need to adapt ourselves to them.

We need to put a lot of computers running those web apps, to call some phones, in a central place.

Sorry for the late answer in some of your questions.

Why do you have such a wide variety of codecs?

I was using all those codecs for testing, and at the end of the project I would choose the best codecs to use, removing all the others.
Currently I’m using only H.264 for video.

Please simply explain:

  • if the call is placed from the videophone to the web application,
  • does the videophone get immediate audio and video, and the web application
    gets audio, but the video takes a while to appear, or
  • does the web application get the immediate audio and video (as is the case
    when it initiates the call), and the caller this time gets the video problem?

First, the call is placed from the web application to the videophone.
The videophone gets only audio at first; after I cover the camera, or wait some time (about 1 minute and 40 seconds), the video pops up on the videophone screen.

And the web application gets immediate audio when it calls the videophone, and even when the videophone calls the web application.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.