The Word of the Day Is Interoperability

Written by Brenton Ough

June 11, 2021


Let’s face it: right now broadcast television has the upper hand. Why? Because it just works. People can sit down on the couch, hit the power button on the remote, and there are no issues. That’s because broadcast television is built on proven, established technologies based on government-backed standards. And why is that important? First, it ensures everything operates the same way. When consumers move from one city to another, or from one cable operator to the next, the experience stays essentially the same (except for the guide). Second, it ensures broadcasters can swap out technologies from different vendors without having to rebuild everything, which means the service can continually improve.

Streaming, though, isn’t like that. Maybe that’s because it’s relatively new. Broadcasting has been around for over 60 years; streaming, for a fraction of that. The technologies we take for granted today, such as chunked streaming over HTTP, are only about a dozen years old (Move Networks, working with Microsoft, brought the concept to market during the 2008 Summer Olympics). What’s more, there are dozens upon dozens of different technologies within the streaming video stack, all of them evolving and transforming at an astounding rate. That makes standardization, a process that often takes years, difficult: a standard could take three years to be ratified, and by that time the technology could have evolved past it.

The result of all this? Unlike broadcast, streaming technologies are fragmented. And that creates a much bigger problem for providing a consistent, reliable streaming service.

The Fragmentation of Streaming Extends Into the Data

The fragmentation of the technology stack isn’t all that bad. The lack of standardization reflects a lot of innovation: different technical approaches give providers real choice in how a challenge will ultimately be addressed. And although the lack of consistency within the stack may force providers to build their own middleware or ad-hoc solutions, they get it done. Streaming works.

Where fragmentation does pose a serious problem for providing a reliable, consistent service, though, is with the data.

Within the broadcast workflow, operators have clear visibility into every component, down to the set-top box in the viewer’s home. This helps ensure a very high-quality service because the operator can troubleshoot any component within the delivery chain. 

But with streaming, this kind of monitoring is very difficult to achieve. Technology vendors are not obligated to provide any particular data structure, normalization, or even access. In fact, some vendors require the streaming operator to use their custom dashboard, while others keep their data under lock and key (i.e., a black box).

What this means is that streaming providers must figure out a way to bring together whatever data they can gather from the different components in their technology stack to get a complete view of network performance (especially when using third-party CDNs) and the end-user experience (QoE). There are some really good visualization tools out there, such as Splunk, Tableau, Looker, and Datadog. But even with these, getting a complete picture of the workflow when wrestling with fragmented data can be very time-consuming. 
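
As a concrete illustration, consider what that stitching-together looks like in practice. The sketch below, in Python, normalizes two hypothetical payloads, one from a player-analytics API and one from a CDN log summary, into a single shared schema. Every field name and payload shape here is an assumption for illustration, not any particular vendor’s format.

```python
# A minimal sketch of pulling fragmented metrics into one shared schema.
# The vendor payload shapes below are hypothetical -- real APIs will differ.
from dataclasses import dataclass

@dataclass
class Metric:
    source: str            # which component of the stack produced this
    region: str            # normalized region name
    rebuffer_ratio: float  # QoE: fraction of watch time spent rebuffering
    error_rate: float      # fraction of failed requests or sessions

def from_player_analytics(payload: dict) -> Metric:
    """Hypothetical player-analytics API: reports percentages per geo."""
    return Metric(
        source="player",
        region=payload["geo"].lower(),
        rebuffer_ratio=payload["rebufferPct"] / 100.0,
        error_rate=payload["playbackFailurePct"] / 100.0,
    )

def from_cdn_summary(record: dict) -> Metric:
    """Hypothetical CDN log summary: reports raw counts per edge POP."""
    total = record["requests"]
    return Metric(
        source="cdn",
        region=record["pop"].lower(),
        rebuffer_ratio=0.0,  # CDN logs know nothing about player rebuffering
        error_rate=record["5xx"] / total if total else 0.0,
    )

# Both sources now land in one schema that a single dashboard can plot.
metrics = [
    from_player_analytics({"geo": "US-East", "rebufferPct": 1.8,
                           "playbackFailurePct": 0.4}),
    from_cdn_summary({"pop": "us-east", "requests": 120_000, "5xx": 240}),
]
for m in metrics:
    print(m)
```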

Still, it’s critical: this holistic picture can help identify the root cause of outages and other service degradation. 

How a Monitoring Harness Can Help with Interoperability

[Infographic: How a Monitoring Harness Boosts Interoperability in Streaming]

Given that this fragmentation isn’t going to end anytime soon, streaming providers should be looking at ways to mitigate its impact. For example, when a technology within the stack is swapped out, operations engineers must work out how to access the new component’s data. That could be through an API, through raw logs (which need to be normalized), or even through a proprietary visualization tool. All of this takes time to operationalize, which adds to the complexity of solving the delivery or user-experience issues that can lead to churn.
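
The raw-logs case is often the most labor-intensive. Here is a minimal sketch, assuming a made-up CDN log format (the field names and layout are hypothetical), of the kind of parsing and normalization each new source can require:

```python
# A sketch of the "raw logs" case: a hypothetical CDN log line that must be
# parsed and normalized before it can join the rest of the monitoring data.
import re
from typing import Optional

# Hypothetical log format: timestamp, edge POP, status code, latency in ms.
LOG_PATTERN = re.compile(
    r"(?P<ts>\S+) pop=(?P<pop>\S+) status=(?P<status>\d{3}) latency_ms=(?P<latency>\d+)"
)

def parse_line(line: str) -> Optional[dict]:
    match = LOG_PATTERN.match(line)
    if not match:
        return None  # unknown format: surface it rather than silently drop it
    fields = match.groupdict()
    return {
        "timestamp": fields["ts"],
        "region": fields["pop"].lower(),
        "is_error": fields["status"].startswith("5"),
        "latency_ms": int(fields["latency"]),
    }

print(parse_line("2021-06-11T10:00:00Z pop=LHR status=504 latency_ms=1200"))
```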

A monitoring harness, though, can solve some of that. Rather than building a one-off, custom link to the new data and figuring out how to visualize it, the harness can simply be configured to point to the new tool. The data is pulled in automatically, and operations continue to use the same dashboard as before. The key is the underlying software of the harness, which is already programmed to normalize data from various tools. With a few tweaks to the logic (such as mapping relationships from existing data sets onto the new ones), supporting a new technology becomes a much easier task. That ensures continuity of service for viewers while mitigating the inherent fragmentation of technologies within the overall streaming platform.
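
One common way to structure that underlying software is an adapter registry: each tool gets a small adapter that maps its output onto the harness’s shared schema, so swapping a vendor means writing one new adapter rather than rebuilding the dashboard. The sketch below assumes invented vendor names and payload shapes; it is one plausible design, not a description of any specific product.

```python
# A minimal sketch of the harness idea: each tool gets a small adapter that
# normalizes its output, and swapping vendors means registering a new adapter
# rather than rebuilding the dashboard. All names here are illustrative only.
from typing import Callable, Dict

Adapter = Callable[[dict], dict]
ADAPTERS: Dict[str, Adapter] = {}

def register(vendor: str):
    """Decorator that wires a vendor-specific adapter into the harness."""
    def wrap(fn: Adapter) -> Adapter:
        ADAPTERS[vendor] = fn
        return fn
    return wrap

@register("cdn_vendor_a")
def vendor_a(payload: dict) -> dict:
    return {"region": payload["pop"].lower(),
            "error_rate": payload["errPct"] / 100.0}

@register("cdn_vendor_b")  # the replacement vendor: only this adapter is new
def vendor_b(payload: dict) -> dict:
    total = payload["req"]
    return {"region": payload["edge"].lower(),
            "error_rate": payload["fail"] / total if total else 0.0}

def ingest(vendor: str, payload: dict) -> dict:
    """The dashboard only ever sees the normalized shape."""
    return ADAPTERS[vendor](payload)

print(ingest("cdn_vendor_a", {"pop": "FRA", "errPct": 0.3}))
print(ingest("cdn_vendor_b", {"edge": "fra", "req": 50_000, "fail": 150}))
```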

The Industry Must Embrace a Common Approach to Streaming Technology Data

In an ideal world, where the data coming from technologies in the streaming video stack is standardized, the monitoring harness is truly plug-and-play: just point it at the new API and everything functions as it did. But streaming is not in that ideal world right now. It is still evolving. As such, a monitoring harness implemented now can both mitigate fragmentation and lay the groundwork for that future state of conformity. It can be the linchpin of a streaming architecture and ensure operational excellence. When the industry settles down and standards begin to play a more important role, when there is more interoperability between technologies and vendors, a monitoring harness will become the glue that holds the entire workflow together.
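
To make “plug-and-play” concrete: with a standardized schema, onboarding a new tool could collapse to configuration alone. The snippet below is purely illustrative; the endpoints and the “standard-v1” format name are hypothetical stand-ins for a standard that does not yet exist.

```python
# If the industry standardized its monitoring data, onboarding a new tool
# could collapse to configuration alone -- no custom adapter code. The
# endpoints and field names below are hypothetical.
HARNESS_SOURCES = [
    {"name": "cdn-primary", "url": "https://api.cdn-a.example/metrics", "format": "standard-v1"},
    {"name": "cdn-backup",  "url": "https://api.cdn-b.example/metrics", "format": "standard-v1"},
    {"name": "origin",      "url": "https://origin.example/metrics",    "format": "standard-v1"},
]
# Swapping a vendor would mean editing one URL; the shared "standard-v1"
# schema means the harness already knows how to read the response.
```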
