Let’s Get Objective About QoE
By Simen K. Frostad, Chairman
Published in In Broadcast, March 2014
Everyone knows the joke about the tourist winding down his car window and asking a local man how to get to the coast. The local scratches his head and says: ‘Well I wouldn’t start from here.’
It’s good advice for anyone thinking about implementing a quality of experience monitoring system. QoE is deemed to be an essential facet of a good monitoring strategy for digital media operators, but the conventional approach to QoE for digital media has some surprising origins.
Monitoring QoE first began in the telecoms industry, when telcos wanted to assess how phone users felt about the sound they were hearing. They assembled panels of listeners who evaluated the aural quality, often by listening to a selection of stock phrases such as ‘You will have to be very quiet’, ‘There was nothing to be seen’, and ‘They worshipped wooden idols’. The listeners recorded their scores, rating impairments on a scale of 1-5, from ‘very annoying’ up to ‘imperceptible’. The scores were then averaged and weighted using statistical manipulations common in the social sciences and market research. This methodology was called MOS (Mean Opinion Score), and was standardized by the ITU-T, the standardization sector of the International Telecommunication Union.
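At its core the MOS calculation is simply an arithmetic mean over the panel’s individual ratings. A minimal sketch, assuming the five-point impairment scale described above (the function name and sample ratings are illustrative, not from any standard):

```python
# Illustrative sketch of a Mean Opinion Score calculation.
# Each panellist rates the impairment on the five-point scale,
# where 5 = 'imperceptible' and 1 = 'very annoying'.

def mean_opinion_score(ratings):
    """Return the arithmetic mean of individual opinion scores (1-5)."""
    if not ratings:
        raise ValueError("at least one rating is required")
    if not all(1 <= r <= 5 for r in ratings):
        raise ValueError("ratings must be on the 1-5 scale")
    return sum(ratings) / len(ratings)

# Five hypothetical panellists rate the same speech sample:
print(mean_opinion_score([4, 5, 3, 4, 4]))  # → 4.0
```

Real MOS procedures add statistical weighting on top of this mean, but the essential point stands: the input is human opinion, not a measurement.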
It certainly made sense to use this subjective approach for telephony, because the quality and intelligibility of sound over a phone line is difficult to evaluate using more precise and objective measures. But as a methodology for visual media and digital media transmissions? Well, you wouldn’t start from here…
And yet, that is precisely where most conventional QoE monitoring solutions for digital media do start. MOS may have been reinvented as VideoMOS, and the evaluation panel may have been robotised – so now it’s algorithms that attempt to simulate the subjective reactions of a range of ‘typical’ viewers – but the methodology is still based on the premise that subjectivity is the key to QoE.
Subjectivity is subtle: a human viewer watching a top-quality 1080i transmission of the Super Bowl, followed on the same channel by the 1940s B&W movie Casablanca, knows that the two pieces of content cannot be judged by the same criteria. Yet in a robotised QoE assessment based on MOS criteria, the Super Bowl would score highly, while Casablanca would be marked way down for blurriness, scratches and other artifacts, and for its lack of resolution and colour. These are spurious and misleading results, making the data from this kind of QoE system of questionable value to the provider.
Fortunately there are plenty of criteria inherent to digital media that are readily evaluated in a completely objective way: lost packets, timeouts, buffering, and so on. By basing QoE monitoring on these, it’s no longer necessary to confect an algorithmic ‘opinion’ about quality of experience.
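An objective approach like this reduces to comparing measured stream metrics against thresholds, with no opinion modelling involved. A minimal sketch, assuming hypothetical metric names and illustrative threshold values (none of these are drawn from a specific standard):

```python
# Illustrative sketch: flag QoE problems purely from objective,
# directly measurable stream metrics. Thresholds are examples only.

def check_stream(lost_packets, timeouts, buffering_events, total_packets):
    """Return a list of objective QoE issues detected in the stream."""
    issues = []
    # Packet loss ratio above an illustrative 0.1% threshold
    if total_packets and lost_packets / total_packets > 0.001:
        issues.append("packet loss above 0.1%")
    # Any timeout or buffering event is a directly observable impairment
    if timeouts > 0:
        issues.append("request timeouts observed")
    if buffering_events > 0:
        issues.append("playback buffering occurred")
    return issues

report = check_stream(lost_packets=5, timeouts=0,
                     buffering_events=2, total_packets=1000)
print(report)  # → ['packet loss above 0.1%', 'playback buffering occurred']
```

Every input here is a counter the monitoring probe can read directly from the transport layer or the player, which is exactly what makes the result reproducible and objective.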