The rise of sustainable streaming is posing new challenges for streaming operators. Streaming is a 24/7/365 service that never turns off, and oftentimes, it scales at a moment’s notice to meet the needs of an expanding audience all tuning into a content stream, such as a live event.
Those same streaming services fight a constant battle with the quality of the viewer experience. They must ensure the best bitrate in a consistent, reliable manner, requiring a similar 24/7/365 approach to monitoring. Streaming operators must watch every stream to guarantee that reliable service, but monitoring every stream every second of the day isn’t sustainable. In fact, it’s incongruous with the growing focus on sustainability throughout the workflow.
So how can streaming operators strike a balance between diligent monitoring and reducing their carbon footprint?
Monitoring everything doesn’t mean monitoring everything
Consumption of streaming media grew significantly during the pandemic. According to Nielsen, while connected TV use skyrocketed in mid-2020 due to COVID restrictions, it soon returned to seasonal norms. Streaming, however, claimed a larger share of viewing, increasing by more than 21% between May 2021 and May 2022.
The demand for streaming had a ripple effect: streaming operators needed more infrastructure to run longer and, as they pushed their workflows into the cloud, cloud operators needed more infrastructure to support the demand. The ISPs may also have needed to make infrastructure changes to accommodate the increased demand. Finally, all of this has to be monitored.
Providing a high quality of experience for a streaming service isn’t just about the video bits themselves. It’s also about ensuring the infrastructure that delivers those bits is operating within acceptable tolerances. Streaming operators must deploy hardware or software probes throughout their infrastructure to gather the data needed, but how much data is enough? Does everything need to be monitored every second to identify and resolve the issues that can undermine QoE? The short answer is no.
When monitoring clashes with sustainable streaming
If a streaming operator’s monitoring services are running constantly, they are clashing with the need to reduce power and waste within the streaming workflow. Those hardware probes or software agents churning data about infrastructure, bitrates, video quality, and more every second might not be necessary, especially if the scale of users doesn’t warrant it. It may make sense to capture data every second when there are millions of users watching a live sporting event. It may not make sense to capture that same frequency of data for a few hundred users watching a long-tail content asset.
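One way to picture this trade-off is a scheduler that maps audience size to a sampling interval, so per-second telemetry is reserved for high-stakes, high-scale events. The tiers and thresholds below are purely illustrative, not drawn from any particular operator's configuration:

```python
def sampling_interval_seconds(concurrent_viewers: int) -> int:
    """Return how often (in seconds) probes should report,
    scaled to audience size. Tier boundaries are illustrative only."""
    if concurrent_viewers >= 1_000_000:  # major live event
        return 1                         # per-second telemetry
    if concurrent_viewers >= 10_000:
        return 10
    if concurrent_viewers >= 500:
        return 60
    return 300                           # long-tail asset: sample every 5 minutes
```

A few hundred viewers on a long-tail asset would be sampled 300x less often than a million-viewer live event, with a corresponding drop in probe workload.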
To align monitoring with other sustainable streaming efforts, operators must look at streaming with a different approach. They must assess the frequency and amount of data needed by operations to properly diagnose and resolve quality or performance issues. To do that requires a smarter monitoring system.
Leveraging AI to monitor smarter and more sustainably
Probes that are continually monitoring streaming components, even when there isn’t sustained demand, are wasting energy. They may be collecting data that operations teams don’t actually need to ensure the highest QoE. A smarter monitoring system, however, can eliminate those unneeded cycles and let monitoring itself contribute to improved sustainability across the workflow.
The first step is to programmatically align probe activity with audience scale. As scale decreases, the probes deliver data less frequently, requiring fewer cycles and reducing the demand for CPU or GPU usage. The second step is to use ML to look at past quality data to predict potential future issues. By embracing a predictive model, continuously running probes can be cycled down to a more intermittent schedule. In this way, not only do operations get a picture of where performance or quality impairments might happen, but the streaming service needs less power to sustain a high volume of data capture and reporting.
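The second step, cycling probes down when predicted risk is low, can be sketched with a simple stand-in for a real ML predictor. Here a naive forecast (recent mean error rate plus one standard deviation) gates whether a probe stays on continuous duty; the threshold and the statistic are hypothetical placeholders for a trained model:

```python
from statistics import mean, pstdev

def probe_should_run_continuously(recent_error_rates: list[float],
                                  risk_threshold: float = 0.02) -> bool:
    """Decide whether a probe stays on continuous duty.

    A stand-in for a real ML predictor: if the projected error rate
    (recent mean plus one population standard deviation) stays below
    the threshold, the probe can drop to an intermittent schedule.
    """
    if not recent_error_rates:
        return True  # no history yet: keep monitoring continuously
    projected = mean(recent_error_rates) + pstdev(recent_error_rates)
    return projected >= risk_threshold
```

In a production system the prediction would come from a model trained on historical quality data, but the control flow is the same: low predicted risk means fewer monitoring cycles and less CPU or GPU demand.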
IBC 2023: Discover Touchstream’s approach to sustainable streaming
Thanks to associations like Greening of Streaming, the industry continues to drive towards a consensus on measuring streaming sustainability. Whether it settles on bits delivered or computational cycles per kilowatt-hour, it will become just as important to reduce the infrastructure demands of operational services like monitoring as those of core workflow components like encoding and delivery.
Using technology like AI and ML to automate data capture will help streaming operators reduce the need for 24/7 monitoring without sacrificing visibility into issues that impact the viewer experience or contribute to subscriber churn.
Touchstream’s VQA Verde utilises existing data from its own ABR monitoring and CDN logs and takes a machine learning approach to predict possible VQ issues. Detected data anomalies then trigger the full deep VQA monitoring, but only for several minutes, to validate or reject the prediction. This approach significantly reduces the amount of processing time and resources required.
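The general pattern described, in which lightweight signals gate a short, time-boxed burst of deep inspection, might be sketched as follows. The anomaly test, the 20% tolerance, and the five-minute window are illustrative assumptions, not Touchstream's actual parameters:

```python
import time

DEEP_SCAN_SECONDS = 300  # run full deep VQA for ~5 minutes per trigger (illustrative)

def is_anomalous(metric: float, baseline: float, tolerance: float = 0.2) -> bool:
    """Flag a lightweight metric (e.g. a CDN-log error rate) that drifts
    more than `tolerance` (here 20%) from its learned baseline."""
    return baseline > 0 and abs(metric - baseline) / baseline > tolerance

def handle_sample(metric: float, baseline: float, run_deep_vqa) -> bool:
    """On an anomaly, trigger a time-boxed deep VQA scan to validate or
    reject the prediction, instead of running deep VQA continuously.
    Returns True if a scan was triggered."""
    if is_anomalous(metric, baseline):
        run_deep_vqa(until=time.time() + DEEP_SCAN_SECONDS)
        return True
    return False
```

Because the expensive deep analysis runs only in short validation windows rather than around the clock, processing time and energy use fall sharply while visibility into real impairments is preserved.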
To learn more about Touchstream’s approach to sustainable streaming monitoring, don’t miss CEO and Co-Founder Brenton Ough’s presentation at IBC 2023.