Media Delivery Index
The Media Delivery Index (MDI) is a set of measures used to monitor the quality of a delivered video stream and to show system margin in IPTV systems, by providing an accurate measurement of jitter and delay at the network (Internet Protocol, IP) level, which are the main causes of quality loss. Identifying and quantifying such problems in these networks is key to maintaining high-quality video delivery and to warning system operators with enough advance notice to allow corrective action.
The Media Delivery Index (MDI) can help identify problems caused by network impairments such as packet jitter and packet loss.
If packets are delayed by the network, some arrive in bursts, with inter-packet delays shorter than when they were transmitted, while others are delayed so much that they arrive with greater inter-packet delays than when they left the source (see figure below). The difference between a packet's actual arrival time and its expected arrival time is defined as packet jitter or time distortion.
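As an illustration, per-packet jitter can be computed from arrival timestamps against the expected constant-rate schedule. This is a minimal sketch; the function name and the sample timings are hypothetical.

```python
# Minimal sketch: jitter as the deviation of each packet's actual arrival
# time from its expected arrival time under a constant inter-packet interval.

def packet_jitter(arrival_times, interval):
    """Return per-packet jitter in seconds relative to the expected schedule."""
    first = arrival_times[0]
    return [actual - (first + i * interval)
            for i, actual in enumerate(arrival_times)]

# Packets nominally sent every 10 ms; the third arrives 3 ms late,
# the fourth 2 ms early.
jitter = packet_jitter([0.000, 0.010, 0.023, 0.028], interval=0.010)
```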
A receiver displaying the video at its nominal rate must accommodate the varying arrival times of the input stream by buffering data that arrives early and ensuring that enough data is already stored to ride out delays in the incoming stream (for this reason, the buffer is filled before display begins).
The effects of these phenomena on the number of packets received at a specific point in the network can be seen in the following graphics:
Packet delay variation and packet loss have been shown to be the key characteristics in determining whether a network can transport good quality video. These features are represented as the Delay Factor (DF) and the Media Loss Rate (MLR), and they are combined to produce the Media Delivery Index (MDI), which is displayed as two values separated by a colon:

MDI = DF : MLR
The different components of the Media Delivery Index (MDI) are explained in this section.
Delay Factor (DF)
The Delay Factor is a time value, given in milliseconds, that indicates how long it would take to drain the virtual buffer at a given network node at a specific time. In other words, it is the number of milliseconds' worth of data that the buffers must be able to hold in order to eliminate time distortions (jitter).
It is calculated as follows:
1. At every packet arrival, the difference between the bytes received and the bytes drained is calculated. This determines the MDI virtual buffer depth:

   Δ = bytes received − bytes drained
2. Over a measurement interval, the difference between the maximum and minimum values of Δ is taken and divided by the media rate:

   DF = (max(Δ) − min(Δ)) / media rate
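The two steps above can be sketched in code. This is a hypothetical implementation (function and parameter names are illustrative, and the media rate is assumed constant over the interval):

```python
# Sketch of the DF calculation: track the virtual buffer depth at each
# packet arrival, then take the spread (max - min) divided by the media rate.

def delay_factor(packet_bytes, arrival_times, media_rate_bps):
    """Delay Factor in milliseconds over one measurement interval."""
    drain_bytes_per_s = media_rate_bps / 8.0   # buffer drains at the media rate
    received = 0.0
    start = arrival_times[0]
    deltas = []
    for size, t in zip(packet_bytes, arrival_times):
        received += size
        drained = (t - start) * drain_bytes_per_s
        deltas.append(received - drained)       # step 1: virtual buffer depth
    # Step 2: DF = (max(delta) - min(delta)) / media rate, in milliseconds.
    return (max(deltas) - min(deltas)) / drain_bytes_per_s * 1000.0
```

A perfectly paced stream yields a DF near zero; any packet delayed relative to the nominal schedule widens the max-to-min spread and raises the DF.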
Media Loss Rate (MLR)
The Media Loss Rate is the number of media packets lost over a certain time interval (typically one second).
It is computed by subtracting the number of media packets received during an interval from the number of media packets expected during that interval, and scaling the result to the chosen time period (typically one second):

MLR = (media packets expected − media packets received) / time interval in seconds
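This computation can be sketched as follows (a hypothetical helper; names are illustrative):

```python
def media_loss_rate(expected, received, interval_s, period_s=1.0):
    """Media packets lost per period (default: per second)."""
    return (expected - received) * (period_s / interval_s)

# 3 packets lost over a 10-second measurement interval -> 0.3 lost per second.
mlr = media_loss_rate(expected=10000, received=9997, interval_s=10.0)
```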
Maximum acceptable average MLR:
- SDTV: 0.004
- VOD: 0.004
- HDTV: 0.0005
- Channel zapping: 0

The maximum acceptable MLR depends on the use case. During channel zapping, a channel is generally viewed only briefly, so any packet loss would be noticeable; the maximum acceptable MLR in this case is 0, as stated above, because any greater value would mean the loss of one or more packets within a short viewing window (after the zap time).
Generally, the Media Delivery Index (MDI) can be used to install, modify, or evaluate a video network through the following steps:
- Identify, locate, and address any packet loss issues using the Media Loss Rate.
- Identify and measure jitter margins using the Delay Factor.
- Establish ongoing monitoring of both MDI components across the infrastructure to analyze any scenarios of interest.
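The steps above can be sketched as a simple threshold check. The MLR limits follow the values listed earlier; the DF limit of 50 ms is an assumed example figure, not a standard value, and all names are hypothetical.

```python
# Hypothetical monitoring check: flag streams whose measured MDI components
# exceed the service thresholds.

MLR_LIMITS = {"SDTV": 0.004, "VOD": 0.004, "HDTV": 0.0005}

def check_stream(service, df_ms, mlr, df_limit_ms=50.0):
    """Return a list of alert messages for a measured (DF, MLR) pair."""
    alerts = []
    if df_ms > df_limit_ms:
        alerts.append(f"DF {df_ms} ms exceeds limit {df_limit_ms} ms")
    if mlr > MLR_LIMITS[service]:
        alerts.append(f"MLR {mlr} exceeds limit {MLR_LIMITS[service]}")
    return alerts

# MDI is conventionally reported as the pair DF:MLR.
print(f"MDI = {9.2}:{0.0}")                        # prints "MDI = 9.2:0.0"
print(check_stream("HDTV", df_ms=9.2, mlr=0.001))  # one MLR alert
```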
Given these results, measures must be taken to address the problems found in the network, such as redefining system specifications or modifying network components so that the expected quality requirements (or number of users) can be met.
- Network Utilization. Tracking the instantaneous, minimum, and maximum overall network utilization verifies that sufficient raw bandwidth is available for a stream. A high utilization level also indicates that localized congestion is likely, owing to queue behavior in network components. The DF provides a measure of the effects of congestion on a given stream.
- Video stream statistics such as:
- Instantaneous Flow Rate (IFR) and Instantaneous Flow Rate Deviation (IFRD). The measured IFR and IFRD confirm a stream's nominal rate and, if not constant over time, give insight into how the stream is being corrupted.
- Average Rate in Mbit/s. This measure indicates whether the rate of the stream being analyzed conforms to its specified rate over the measurement time; it is the longer-term counterpart of the IFR.
- Stream Utilization in percent of network bandwidth. This measure indicates how much of the available network bandwidth is being consumed by the stream being analyzed.
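The last two statistics can be illustrated with a short sketch (function names and sample figures are assumptions for the example):

```python
def average_rate_mbps(total_bytes, interval_s):
    """Average stream rate in Mbit/s over the measurement interval."""
    return total_bytes * 8 / interval_s / 1_000_000

def stream_utilization_pct(stream_rate_mbps, link_capacity_mbps):
    """Share of network bandwidth consumed by the stream, in percent."""
    return 100.0 * stream_rate_mbps / link_capacity_mbps

# 4,687,500 bytes over 10 s -> 3.75 Mbit/s; 3.75% of a 100 Mbit/s link.
rate = average_rate_mbps(total_bytes=4_687_500, interval_s=10.0)
util = stream_utilization_pct(rate, link_capacity_mbps=100.0)
```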