Standards don’t answer all the test and monitoring questions

By Simen K. Frostad, Chairman

Published in Broadcast Bridge July 2015

Without standards, the world would be a very difficult place to live in. There are many kinds of standards that affect almost every aspect of life – technology is just one of those areas. We can consider language as a kind of standard that allows people in one part of the world to communicate with each other. International finance uses standardised methods of accounting to try to provide a consistent framework for doing business. Currency itself is a symbolic representation of value that we use as a standard for the exchange of goods and services.

So we need standards to get on with our daily lives. In our industry and many others, technological development is not regulated or centrally organised; it takes place in a free-for-all where commercial realities hold sway. But in order to build workable infrastructure for a national or international cellular phone system or a broadcasting network, these commercial interests have to be tempered by some kind of framework that allows competing energies to be channelled in roughly the same direction.

It’s here that the tension between competition and regulation creates compromises. A standards body usually includes among its members representation for the main competing entities in any field of development – the commercial organisations fighting it out to establish their way of doing things as the dominant way. The aim of the standards body is to establish some framework in which the customer for the technology can enjoy the benefits of an open market (the ability to buy from competing manufacturers) while also benefiting from enough market regulation to ensure that the manufacturers are all playing roughly in tune.

It’s an inherently conflicted endeavour, and the output from most standards committees reflects that conflict. The standard will provide just enough common ground to allow an industry sector to resist splintering and move forward in an agreed direction, but the compromises necessary for reaching an agreement tend to push the standard towards the lowest common denominator, while leaving wriggle room for the more powerful commercial entities to advance their proprietary technologies within the framework. Standards therefore usually represent not the best way to tackle a problem, but the best way that can be agreed on at this point, given the competing interests.

But standards do not provide a tool for dealing with every contingency in the real world. Language, for example, is a fluid, constantly evolving thing; and try as they might to provide a standard of ‘correct’ usage, the language academies in some European countries can only fight a losing battle against neologisms, foreign influences, and regional variations. Currency unions such as the eurozone are a good idea in some ways, but the standard they try to establish places strain on some of the economies adhering to it. An architect may produce construction drawings for every last detail of a project, but in the real world it’s the builders who create the edifice, and the real world is full of unforeseeable mishaps, nuances, improvisations and infinite shades of grey cement.

So while a standard is conceived in compromise and in some abstraction from the messiness of the real world, it’s also a snapshot of a moment in time. Given the average gestation period required to produce an industry standard, there’s more than a chance that by the time it emerges, it will already have been partly overtaken by events – by the relentless onward march of technological development.

To return to the language analogy, if your academy-sanctioned standard language does not contain words for new concepts and technologies invented elsewhere in the world, the normal solution is to borrow the foreign word for the invention, making it part of the language even if the standard prescribes that foreign words should be avoided. In effect, some divergence from the standard is necessary to maintain the usefulness of the language as a tool for describing the world.

So it’s important in any industry to recognise standards for what they are: a way of dealing with part – but only part – of the messy real world. They provide a baseline, not a safety net, and in practice there will always be important elements of day-to-day operations that fall outside the boundaries of the standard. A complacent, box-ticking mentality about standards is a trap to avoid, and the idea that a standards-compliant product is ipso facto a perfect tool for the job is a dangerous one.

In the digital media industry, ETR290 (ETSI TR 101 290) is the key standard for evaluating the quality of digital streams, underpinning the common approach to testing and monitoring. But like any standard, it has its limitations. There are grey areas, and a degree of complexity that makes it difficult for operational staff to grasp fully unless they have a rare level of expertise. So given the strain on staff resources and time in the real world of digital media services, monitoring based on ETR290 is often poorly calibrated, and as a result it delivers less accurate data.
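To give a flavour of what ETR290 monitoring involves, here is a minimal sketch (not a full analyser, and not how any particular product implements it) that scans a captured MPEG transport-stream buffer for two of the Priority 1 error indicators defined in ETSI TR 101 290: sync_byte_error (a 188-byte packet whose first byte is not 0x47) and continuity_count_error (the 4-bit continuity counter on a PID not incrementing as expected). Real analysers handle many more indicators and edge cases, such as permitted duplicate packets, which this sketch deliberately omits.

```python
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47  # every transport-stream packet must start with 0x47

def check_priority1(buf: bytes) -> dict:
    """Count two ETR290 Priority 1 indicators in a TS capture buffer."""
    errors = {"sync_byte_error": 0, "continuity_count_error": 0}
    last_cc = {}  # PID -> last continuity counter seen
    for off in range(0, len(buf) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = buf[off:off + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            errors["sync_byte_error"] += 1
            continue
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]      # 13-bit packet identifier
        cc = pkt[3] & 0x0F                          # 4-bit continuity counter
        has_payload = bool(pkt[3] & 0x10)
        # The counter increments only on packets carrying payload,
        # and null packets (PID 0x1FFF) are excluded from the check.
        if pid != 0x1FFF and has_payload:
            if pid in last_cc and cc != (last_cc[pid] + 1) % 16:
                errors["continuity_count_error"] += 1
            last_cc[pid] = cc
    return errors
```

Even this toy version hints at why calibration matters: deciding which PIDs to watch, and what error rates to alarm on, is left to the operator, and that is where poorly tuned ETR290 monitoring produces misleading results.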

Add to this the fact that many of the component parts of a digital media service are outside the scope of ETR290: if there’s a fault in the conditional access system, a malfunction in the programme guide, or the wrong language is presented, the effect on service quality can be serious. Yet none of these errors would be picked up by testing based on ETR290. So something in addition to the prevailing standard is needed.
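A check like the wrong-language case above sits entirely outside ETR290, but is easy to express once a probe has extracted the audio languages each service announces (for instance from the ISO 639 language descriptors in the PMT). The sketch below is purely illustrative – the function name and the data shapes are assumptions, not any real product's API – and simply compares what a service announces against what the operator expects to offer.

```python
def missing_languages(expected: dict, announced: dict) -> dict:
    """Per service, list expected audio languages that are not announced.

    expected:  service name -> list of ISO 639 codes the operator promises
    announced: service name -> list of ISO 639 codes actually seen in the stream
    """
    gaps = {}
    for service, langs in expected.items():
        missing = sorted(set(langs) - set(announced.get(service, [])))
        if missing:
            gaps[service] = missing
    return gaps
```

A monitoring system running a rule like this would raise an alarm the moment, say, a Norwegian audio track disappears from a service – a fault invisible to ETR290-based testing, since the transport stream itself remains perfectly valid.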

Our response to this is Gold TS Protection – a basis for monitoring and analysis of digital media services which includes high-quality ETR290 testing and extensions to it, but which also tests for other vital elements in a service, such as correct functioning of the CAS and EPG. It provides the safety net that is missing in T&M systems based solely on ETR290, and makes it very much simpler to create an accurate and useful calibration of the parameters for ensuring higher service quality. It’s a practical response to the recognition that digital media monitoring needs more than just the standard if it’s to be truly effective.
