Continual Transformation Is a Natural By-Product of Delivering Streaming Video

Written by

Brenton Ough

March 31, 2021


According to technology veteran Steve Anderson, it took 22 years for television to reach a mass audience (25% market access). It took the internet just seven years, and streaming video has likely followed a similar timescale. That speed of adoption, from the first streamed baseball game in the mid-1990s to global OTT platforms today, has driven exponential change and growth within the technology stack. But that isn’t necessarily a bad thing.

From Proprietary Protocols to Segmented HTTP

When streaming first began, it was built on proprietary protocols: first Real, then QuickTime (RTSP), and then Flash (RTMP). Each had its own pros and cons, but one thing they shared was adoption. From commercial CDNs to early telcos, these protocols promised an improved viewing experience. Then, in 2008, Move Networks introduced segmented HTTP streaming, which radically changed how streaming video was prepared and delivered. The specialized servers required by those initial protocols could be replaced with standard HTTP servers. Although there were tradeoffs, such as latency, Apple embraced the approach in its HLS specification, and segmented HTTP streaming went on to see massive adoption.
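What made that shift so powerful is easy to demonstrate. The sketch below (in Python, with a hypothetical playlist URL) shows a client pulling an HLS-style media playlist and its segments with nothing more than ordinary HTTP GET requests; any web server that can serve static files can do the job.

```python
# Minimal sketch: fetching an HLS media playlist and its segments over
# plain HTTP. The URL is hypothetical; any standard web server that can
# serve static files can fill this role, with no specialized streaming
# server required.
from urllib.parse import urljoin
from urllib.request import urlopen

PLAYLIST_URL = "https://example.com/live/stream.m3u8"  # hypothetical

def fetch_segments(playlist_url: str) -> list[bytes]:
    """Download every media segment listed in an HLS media playlist."""
    with urlopen(playlist_url) as resp:
        playlist = resp.read().decode("utf-8")

    segments = []
    for line in playlist.splitlines():
        line = line.strip()
        # Lines starting with '#' are HLS tags; everything else is a
        # segment URI, resolved relative to the playlist URL.
        if line and not line.startswith("#"):
            with urlopen(urljoin(playlist_url, line)) as seg:
                segments.append(seg.read())
    return segments
```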

The Evolution of Streaming Required Operators to Evolve Too

Whether in protocols, codecs, or security, streaming has been in a constant state of transformation. These improvements, sometimes small, sometimes large, have always been aimed at the viewing experience: ensuring the highest-quality streams with the least amount of bandwidth. But each change has also required streaming operators, whether telcos or stand-alone OTT platforms, to adapt as well.

From replacing technologies to modifying workflows and other processes, these operators have had to continually evaluate new streaming technologies while simultaneously running their platforms for paying subscribers. For telecommunications companies in particular, which have ventured into streaming as either a replacement for or an augmentation of their traditional broadcast operations, this continual transformation has also required a fundamental change in mindset.

At the Heart of the Constant Streaming Transformation Is This One Thing

Quality of Service (QoS) has always been the mainstay of cable operators’ broadcast offerings. With well-established processes and technologies, they could trace issues down to individual set-top boxes. They had complete control of the network and, as such, could guarantee four or five nines of service. 

But streaming doesn’t follow that playbook. It’s delivered over an unmanaged network, and since operators can’t control every link in the chain, they can’t guarantee service quality; instead, they have to focus on the user’s experience with the streams they receive. In short, QoS has become Quality of Experience (QoE), and that shift is at the heart of the continual streaming transformation. Even as technologies are evaluated, replaced, and operationalized, there must be a constant focus on QoE lest viewers churn to other streaming services.
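To make the distinction concrete, QoE is derived from player-side measurements rather than network-side guarantees. The sketch below (hypothetical field names, not any particular vendor’s schema) computes three common QoE metrics from a playback session: startup time, rebuffer ratio, and average delivered bitrate.

```python
# Minimal sketch of player-side QoE metrics. The session fields are
# hypothetical illustrations; real players and analytics SDKs define
# their own event schemas.
from dataclasses import dataclass

@dataclass
class PlaybackSession:
    startup_ms: int       # time from play request to first frame
    playing_ms: int       # total time spent playing
    rebuffering_ms: int   # total time spent stalled
    bytes_delivered: int  # total media bytes delivered

def qoe_summary(s: PlaybackSession) -> dict:
    watch_ms = s.playing_ms + s.rebuffering_ms
    return {
        "startup_time_s": s.startup_ms / 1000,
        # Share of watch time lost to stalls; a key churn predictor.
        "rebuffer_ratio": s.rebuffering_ms / watch_ms if watch_ms else 0.0,
        # Average delivered bitrate in kilobits per second.
        "avg_bitrate_kbps": (s.bytes_delivered * 8 / 1000)
                            / (s.playing_ms / 1000) if s.playing_ms else 0.0,
    }
```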

To ensure that QoE metrics can be constantly tracked, measured, and acted upon, streaming operators need foundational elements in their technology stack, such as a monitoring harness, that can support the continual transformation while remaining inherently stable.

Transform the Components But Keep Monitoring the Same

It’s inevitable that streaming operators will change the components of their technology stack and delivery workflow. Whether it’s a newer protocol like SRT or HESP, a better codec like VVC, or improved security, the pieces within will always be changing. But operators can embrace this continual transformation while still ensuring the highest QoE by adopting a monitoring harness. This framework within the streaming architecture lets operators connect any internal or external data source in the streaming workflow through programmatic interfaces and a consistent dashboard. Operations engineers then only need to train once, because the method by which they monitor QoE, the dashboard, remains the same even as components within the technology stack are swapped out and connected to the harness.
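A minimal sketch of that idea follows (the interfaces and names are hypothetical, not any vendor’s actual API): each component in the workflow is wrapped in a small adapter that normalizes its metrics into one schema, so the dashboard reads from a stable interface no matter what sits behind it.

```python
# Minimal sketch of a monitoring harness: pluggable adapters normalize
# metrics from any workflow component into one schema, so the dashboard
# side never changes when components are swapped. All names here are
# hypothetical illustrations.
from typing import Protocol

class MetricSource(Protocol):
    """Anything in the workflow that can report QoE-relevant metrics."""
    def poll(self) -> dict[str, float]: ...

class SrtIngestAdapter:
    def poll(self) -> dict[str, float]:
        # In practice this would query the SRT ingest's stats interface.
        return {"ingest.packet_loss_pct": 0.02, "ingest.rtt_ms": 38.0}

class CdnAdapter:
    def poll(self) -> dict[str, float]:
        # In practice this would call the CDN's reporting API.
        return {"cdn.error_rate_pct": 0.1, "cdn.throughput_mbps": 940.0}

class MonitoringHarness:
    def __init__(self) -> None:
        self.sources: list[MetricSource] = []

    def connect(self, source: MetricSource) -> None:
        # Swapping a component means swapping one adapter; the
        # dashboard keeps reading the same normalized schema.
        self.sources.append(source)

    def snapshot(self) -> dict[str, float]:
        metrics: dict[str, float] = {}
        for source in self.sources:
            metrics.update(source.poll())
        return metrics

harness = MonitoringHarness()
harness.connect(SrtIngestAdapter())
harness.connect(CdnAdapter())
print(harness.snapshot())
```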

Embrace Transformation While Embracing Stability

Continual technology change runs counter to what many operators have experienced with traditional broadcast. Television components, backed by government-approved standards, rarely change. Even when components improve over time, the operator is still providing a service within an accepted and understood range of parameters. That is not so with streaming.

For example, changing protocols requires a massive retooling of the entire workflow, from how the content is encoded to how it is delivered. This is a paradigm shift for operators, which is why it’s so critical to embrace the stability of a monitoring harness as part of the streaming technology stack. Let’s face it: viewers don’t care whether the stream is being delivered via RTMP or HTTP, or whether it requires an AV1 or HEVC decoder. They just want it to work. This means QoE has to be front-and-centre for streaming, just as QoS is front-and-centre for broadcast.

Change, even constant change, can be a good thing when it gives customers a better end product. And that’s what’s happening with each iteration of the components in the streaming technology stack. But that doesn’t mean everything needs to change all the time. With a monitoring harness, the way operators ensure a great viewer experience never changes, even in the midst of constant transformation.
