I have some questions about the stats in the RTCPSent and RTCPReceived AMI events that the wiki pages [1, 2] don’t seem to answer. If there is documentation elsewhere that gives more detail on how this data can be used in practice, please do share.
Am I understanding this right?
- RTCPReceived with a 200 packet type is the phone reporting to Asterisk its statistics for the RTP audio stream from Asterisk to the phone.
- RTCPSent with a 200 packet type is Asterisk reporting its statistics for the RTP audio stream from the phone to Asterisk.
- To report on call quality in real time, both are needed, because the channel carries two audio streams going in opposite directions, and each stream may take a different route to its destination.
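To make the two-direction point concrete, here is a minimal sketch of how I imagine combining the two events per channel. The event-dict shape, the 'Channel' and 'PT' keys, and the handler itself are my assumptions for illustration, not actual Asterisk AMI client code:

```python
# Track both directions of a call's audio per channel, assuming each
# AMI event has been parsed into a dict keyed by header name.
# Illustrative plumbing only, not a real AMI client.

call_quality = {}  # channel name -> {'to_phone': event, 'to_asterisk': event}

def on_rtcp_event(event_name, event):
    ch = call_quality.setdefault(event['Channel'],
                                 {'to_phone': None, 'to_asterisk': None})
    if event_name == 'RTCPReceived' and event['PT'].startswith('200'):
        # The phone's view of the Asterisk -> phone stream.
        ch['to_phone'] = event
    elif event_name == 'RTCPSent' and event['PT'].startswith('200'):
        # Asterisk's view of the phone -> Asterisk stream.
        ch['to_asterisk'] = event
```

A real-time quality report would then read both slots for a channel and flag the call only when either direction degrades.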
Does the 201 packet type really just mean that the sender of the report isn’t actually sending any RTP packets itself? I’m not quite sure what I am supposed to do with the 201 packet type as someone trying to report network quality to the end user.
It seems like RFC 5760 tries to provision for reporting on multiple audio streams, and the ReportCount and ReportX* fields allow for multiple report blocks. Did Asterisk actually implement code that will report on more than one? Under “Add support for multiple remote media sources” in the RTP task list there is a comment implying that we may never expect multiple audio sources on the same stream, so the author wasn’t quite sure what should be done. At this point, then, should I only ever expect one report block from Asterisk if one is provided?
Is the RTT value in the RTCPReceived event computed in response to a previous RTCPSent packet? Or is it something Asterisk calculates from the timestamps in RTCPReceived alone?
RTT is a little confusing: ReportXDLSR is expressed in 1/65536 seconds per RFC 3550, and RFC 5760 says RTT is expressed in 1/65536 seconds, yet the wiki page seems to imply that RTT is a decimal number of seconds.
Does the RTT on RTCPReceived already do the math to subtract the DLSR processing time? Or is that time still included in RTT, so that I need to convert the 1/65536 units and subtract DLSR myself to get the actual network latency? I understand that DLSR is basically the CPU/memory/scheduling delay on the phone while it generates the report. For example, some of the old Polycom phones we use have 125 MHz processors, so DLSR could represent quite a bit of time on top of the actual network round-trip latency. And that assumes the reply is immediate; the phone could just sit and wait for some other timeout before it decides to send the report packet.
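For reference, the standard round-trip calculation in RFC 3550 section 6.4.1 subtracts DLSR on the receiving side, so an implementation following it would already exclude the phone’s processing delay. A sketch of that math, assuming (but not knowing) that Asterisk follows the RFC; the function and variable names here are mine, not AMI fields:

```python
# RFC 3550 section 6.4.1: RTT = A - LSR - DLSR, where all three are
# 32-bit values in units of 1/65536 seconds (the middle 32 bits of an
# NTP timestamp).

def rtcp_rtt_seconds(arrival, lsr, dlsr):
    """Round-trip time in seconds, with the remote peer's
    report-generation delay (DLSR) already subtracted out."""
    # Masking to 32 bits handles timestamp wraparound.
    rtt_units = (arrival - lsr - dlsr) & 0xFFFFFFFF
    return rtt_units / 65536.0
```

Example: if the report arrives 0.5 s (32768 units) after the SR it echoes, and the phone says it held the report for 0.125 s (8192 units), the network round trip is 0.375 s; a slow phone only inflates RTT if its DLSR value understates the real delay.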
Since there is a CumulativeLost field, it is possible to calculate the fraction of lost packets for the entire call using SentPackets. That’s nice, because FractionLost only covers the time since the last report packet, and IAJitter is likewise only a snapshot since the last RTCP packet. I guess there isn’t a way to summarize jitter for the entire call length with these events?
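To show what I mean, a sketch of the whole-call loss fraction, plus one possible way to approximate call-long jitter by accumulating the per-report IAJitter snapshots myself (the accumulation is my own workaround, not something the events provide):

```python
def whole_call_loss_fraction(cumulative_lost, sent_packets):
    """Fraction of packets lost over the entire call so far,
    from the cumulative counters rather than FractionLost."""
    if sent_packets == 0:
        return 0.0
    return cumulative_lost / sent_packets

class JitterSummary:
    """Running mean and peak of per-report IAJitter snapshots,
    since each event only carries a point-in-time value."""
    def __init__(self):
        self.count = 0
        self.total = 0.0
        self.peak = 0.0

    def add(self, ia_jitter):
        self.count += 1
        self.total += ia_jitter
        self.peak = max(self.peak, ia_jitter)

    @property
    def mean(self):
        return self.total / self.count if self.count else 0.0
```

Note the mean weights every report equally regardless of its interval length, so it is only a rough summary; tracking the peak alongside it at least catches short bad spells.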
Any suggestions on how to use these stats in practice to report network reliability to the user in real time are welcome.
Thanks.