Voice & Video Performance SLAs: Do Yours Deliver?
Depending on how a service provider measures factors like latency, jitter, and packet loss, some SLAs simply don't deliver.
Many customers and providers think service-level agreements cover reliability and availability. This is true, but incomplete. An SLA needs to cover every component of the connection, including performance quality.
As I mentioned in my previous post, "Know Your SLAs," I recently attended a session on SLAs presented by Mark Lindsay, a partner with law firm LB3, and David Lee, technology director with telecom consultancy TechCaliber, at CCMI's 25th annual Negotiating Network and Infrastructure Deals conference. In that post I shared what they had to say about what's in an SLA and what to measure. Here I'll pick up with a discussion of performance quality.
SLAs for Quality Performance
Latency, jitter, and packet loss all affect voice and video quality, and any one of them on its own can impair a call. Traffic congestion, infrastructure design, and the connected endpoints all influence these factors.
Is the SLA a deliverable or an objective? Some SLAs don't deliver -- call these "almost SLAs." An SLA written as an objective is rather meaningless, offering the customer no recourse other than complaining.
Where is the SLA measured? A customer's primary goal should be to measure the SLA from the mouthpiece or camera to the earpiece, speaker, or screen at the other end. This includes everything the service provider delivers plus what the endpoints contribute to the performance, not just the network backbone.
What's in Your Codec?
The codec in use for digitizing voice will have an effect on the voice quality. There are standard and proprietary codecs in use today. The common PSTN codec is the G.711 standard (64 Kbps), which covers a frequency range of approximately 300 to 3.4 kHz. The high-definition codec for voice is G.722 (64 Kbps), which covers a range from about 50 Hz to 7 kHz. A common standard codec for voice compression is G.729, which primarily operates at 8 Kbps.
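Note that the codec bit rate is not what a call consumes on the wire. As a rough sketch -- assuming a typical 20 ms packetization interval and 40 bytes of IP/UDP/RTP header overhead, with no layer-2 framing counted -- the per-call bandwidth works out like this:

```python
# Illustrative sketch: one-direction, on-the-wire bandwidth per call for the
# codecs named in the text, assuming 20 ms packets and IP/UDP/RTP overhead.
# These are planning figures, not numbers from any provider's SLA.

CODEC_BITRATES_BPS = {"G.711": 64_000, "G.722": 64_000, "G.729": 8_000}

IP_UDP_RTP_OVERHEAD_BYTES = 40  # 20 (IP) + 8 (UDP) + 12 (RTP), assumed
PTIME_S = 0.020                 # 20 ms of audio per packet, assumed

def wire_bandwidth_bps(codec: str) -> int:
    """One-direction bandwidth on the wire, in bits per second."""
    payload_bytes = CODEC_BITRATES_BPS[codec] * PTIME_S / 8
    packet_bits = (payload_bytes + IP_UDP_RTP_OVERHEAD_BYTES) * 8
    packets_per_second = 1 / PTIME_S
    return int(packet_bits * packets_per_second)

for codec in CODEC_BITRATES_BPS:
    print(f"{codec}: {wire_bandwidth_bps(codec) / 1000:.0f} kbps on the wire")
```

The takeaway: a "64 Kbps" G.711 call actually needs about 80 Kbps each way, and even compressed G.729 needs roughly 24 Kbps once headers are counted.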
A codec's mean opinion score (MOS) is an indication of the voice quality it delivers. A MOS of 4.0, on a 1.0 to 5.0 scale, is considered acceptable. R-factor is another measure used to describe voice quality. Ask what your SLA covers.
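MOS and R-factor are related: the ITU-T G.107 E-model defines a standard mapping from the R-factor to an estimated MOS, which this small sketch implements:

```python
# The ITU-T G.107 E-model mapping from R-factor to estimated MOS.
# An R of roughly 80 lands near a MOS of 4.0 -- the "acceptable" threshold
# mentioned in the text.

def r_to_mos(r: float) -> float:
    """Estimated MOS for a given transmission rating factor R."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6

for r in (50, 70, 80, 90):
    print(f"R = {r:3d} -> MOS ~ {r_to_mos(r):.2f}")
```

So if your SLA quotes an R-factor instead of a MOS, you can translate: a contracted R of 80 promises roughly toll quality, while an R in the 60s promises calls many users will find poor.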
What's Your Latency?
Latency is the end-to-end delay, measured in milliseconds. The SLA should define where the measurement starts and ends; it may not cover the entire connection -- the local access lines, for example. Read the latency definition carefully. It may be a goal without any penalties if it is not met.
AT&T, for example, specifies network latency as "a monthly measure of the AT&T network-wide delay within a region, which is the average interval of time it takes during the applicable calendar month for test packets of data to travel between selected pairs of AT&T network nodes within a region. Latency for the month is the average of these measurements." This is really a measurement of backbone latency -- and it turns out to be an objective, set at 37 ms.
Measuring over a month does not tell you what happens during your call busy hour. The overall user latency goal is 125 ms one way (250 ms roundtrip), which includes local access lines and endpoint delays. Codec processing can contribute 20 to 40 ms or more at each end depending on the packet size and processing time.
The end result of the AT&T latency measurement is that it does not cover most of the contributors to the overall latency experienced.
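To see why, it helps to lay out a one-way latency budget. The sketch below uses the figures from the text (125 ms one-way target, the 37 ms backbone objective, 20 to 40 ms of codec processing per end); the access-line and jitter-buffer numbers are illustrative assumptions, not from any provider SLA:

```python
# Illustrative one-way latency budget for a VoIP call. Only the backbone
# line is what a backbone-only SLA actually covers; everything else is
# outside the measurement yet inside the user's experience.

ONE_WAY_TARGET_MS = 125  # overall user goal from the text (250 ms roundtrip)

budget_ms = {
    "codec processing, sending end":      30,  # 20-40 ms per the text; midpoint
    "access line, sending end":            5,  # assumed
    "provider backbone (SLA objective)":  37,  # the AT&T figure from the text
    "access line, receiving end":          5,  # assumed
    "jitter buffer + decode, receiving": 40,   # 20 ms buffer + ~20 ms decode, assumed
}

total_ms = sum(budget_ms.values())
for component, ms in budget_ms.items():
    print(f"{component:36s} {ms:3d} ms")
print(f"{'total one-way':36s} {total_ms:3d} ms (target {ONE_WAY_TARGET_MS} ms)")
```

Even with these forgiving assumptions, the backbone's 37 ms is well under a third of the delay the user actually experiences -- the rest is simply unmeasured by the SLA.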
Jitter in Your Connection
Jitter is the variation in latency measured in milliseconds. It needs to be corrected at the receiving endpoint, via a jitter buffer. The typical jitter buffer is one packet length, at 10 or 20 ms. When the jitter exceeds the jitter buffer capacity, packets get discarded. I had one experience where the jitter was so great that the IP phones discarded 30% of the packets.
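The mechanics are easy to sketch: a fixed jitter buffer absorbs only so much delay variation, and any packet arriving later than that is dropped. The arrival offsets below are invented for illustration, chosen to reproduce the kind of 30% discard rate described above:

```python
# Sketch of fixed-jitter-buffer behavior: packets arriving more than the
# buffer depth later than their nominal slot are discarded. Arrival
# offsets are made-up sample data, not a real trace.

JITTER_BUFFER_MS = 20  # typical one-packet buffer, per the text

# How many ms late (vs. nominal 20 ms spacing) each packet arrived:
arrival_offsets_ms = [0, 3, 8, 25, 2, 40, 1, 19, 22, 0]

discarded = [off for off in arrival_offsets_ms if off > JITTER_BUFFER_MS]
loss_pct = 100 * len(discarded) / len(arrival_offsets_ms)
print(f"discarded {len(discarded)} of {len(arrival_offsets_ms)} packets "
      f"({loss_pct:.0f}% loss from jitter alone)")
```

Note that none of these packets were lost by the network -- they arrived, just too late to be played out. That is why jitter shows up to the user as packet loss.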
The major culprit for network jitter is traffic congestion. As an example, AT&T's SLA provision covering jitter specifies network jitter as "a monthly measure of the AT&T network-wide IP packet delay variation within an applicable region, which is a measure of the average difference in the interval of time it takes during the applicable calendar month for selected pairs of test packets and data stream to travel between pairs of AT&T network nodes in the region." This means a 1 ms jitter average over a month.
The variations in jitter can be tremendous during heavy traffic periods. The AT&T SLA clause only covers its backbone and averages over a month, which is not indicative of the actual user experience. Most providers specify 0.5 to 2 ms of backbone jitter. It looks good, but the monthly averaging produces an attractive jitter measurement that does not tell you very much at all.
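A quick sketch shows how the arithmetic of monthly averaging hides busy-hour spikes. The sample values are invented for illustration: 23 quiet hours a day at 0.5 ms of jitter, plus one congested hour at 30 ms:

```python
# Why a monthly jitter average can look fine while busy-hour spikes wreck
# calls. All sample values are invented for illustration.

samples_ms = []
for day in range(30):
    samples_ms += [0.5] * 23 + [30.0]   # 23 quiet hours, 1 congested hour

average_ms = sum(samples_ms) / len(samples_ms)
peak_ms = max(samples_ms)
print(f"monthly average jitter: {average_ms:.2f} ms (looks great on the SLA report)")
print(f"busy-hour jitter:       {peak_ms:.1f} ms (blows through a 20 ms buffer)")
```

In this sketch the monthly average stays under 2 ms -- comfortably "within SLA" -- even though every single business day includes an hour of jitter that exceeds a typical 20 ms jitter buffer.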
Packet Loss Happens
Packet loss is sometimes expressed as a data delivery ratio (DDR): the percentage of packets delivered compared to the total packets sent. AT&T specifies a DDR of 99.9% (0.1% loss), measured over a month. The IP phone at the receiving endpoint can compensate for packet loss when it is not too great, but when multiple packets are lost in sequence, the user hears garbled voice. In the case of video, you'll see pauses and squares on the screen -- i.e., pixelation.
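That 0.1% sounds tiny until you count packets. As a rough sketch -- assuming a single G.711 call at 50 packets per second running around the clock for a 30-day month -- the arithmetic looks like this:

```python
# Illustrative sketch: how many lost packets a 99.9% DDR permits over a
# month for one always-on voice stream. The point is that the SLA says
# nothing about whether those losses bunch up into speech-garbling bursts.

PACKETS_PER_SECOND = 50          # G.711 with 20 ms packets, assumed
SECONDS_PER_MONTH = 30 * 24 * 3600
DDR = 0.999                      # 99.9% data delivery ratio

total_packets = PACKETS_PER_SECOND * SECONDS_PER_MONTH
allowed_losses = round(total_packets * (1 - DDR))
print(f"packets per month: {total_packets:,}")
print(f"losses within SLA: {allowed_losses:,}")
```

Under these assumptions the provider could drop well over 100,000 packets in a month and still meet the SLA -- and if even a few hundred of them are lost consecutively during your busy hour, users hear it.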
Assuming you do have a packet-loss problem, you must open a trouble ticket. In some SLAs, the provider then has 30 days to fix the problem; only if it isn't fixed after 30 days can you apply for credit. This is not much comfort to those continuing to experience garbled speech and distorted video.
Providers do not implement performance measurements that cover what is important to the user. But there are many VoIP testing tools on the market (a Google search came up with 648,000 hits) -- and you need to have your own.