Most social media platforms today offer live streaming capabilities. The technology has permeated mainstream culture and become a staple tool for brands across various sectors to generate buzz and engagement – from the airing of regular e-sports and traditional sports matches to the broadcast of momentous occasions like a SpaceX launch. Live streaming has been widely embraced by the digital community, and platforms are now in an arms race to deliver low-latency live streaming to their users.

Going live has never been easier, but the technology behind the feature is anything but simple. As demand for live content continues to skyrocket, live streaming has become table stakes for any platform looking to build a following, and the solution that offers the broadest range of low-latency streaming capabilities will outcompete the rest. That said, not every live streaming application requires the lowest possible latency; several factors determine which solution best fits a given application. Big-name streamers will flock to the platform whose latency profile best suits their content, because that platform will provide the best viewing experience for their audience.

Low-latency streaming is a multidimensional problem

Latency in video isn’t quite the same as latency in other forms of internet media. For most web content, latency is the time it takes for a server to fetch content and send it to your browser. Video latency, on the other hand, is the lag between the moment a frame is captured and the moment it shows up on your screen.

As such, video latency can be a multidimensional problem. For example, imagine that you’re streaming an e-sports competition in which two well-known gamers are competing against each other. The stream doesn’t include just the action of the video game – it also includes camera shots of each gamer’s face, plus an additional video inset of the live commentators.

In order to present an optimal experience, four separate video streams – the game, the two players, and the commentators – need to be broadcast simultaneously to a vast audience. If one stream has higher latency than the others, the whole presentation suffers – the commentators could end up remarking on action that took place up to 30 seconds in the past, for instance.
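As a rough illustration (the latency figures and threshold below are hypothetical, not measurements from any real platform), the problem can be thought of as the skew between the fastest and slowest feed:

```python
# Hypothetical glass-to-glass latencies (in seconds) for each feed in the broadcast.
stream_latencies = {
    "game_feed": 4.0,
    "player_a_cam": 5.5,
    "player_b_cam": 4.5,
    "commentators": 9.0,
}

# Skew is the gap between the fastest and slowest feed. If it grows too large,
# the commentators appear to react to action the audience saw seconds earlier.
MAX_ACCEPTABLE_SKEW = 1.0  # assumed threshold for this illustration

skew = max(stream_latencies.values()) - min(stream_latencies.values())
print(f"Latency skew across feeds: {skew:.1f}s")

if skew > MAX_ACCEPTABLE_SKEW:
    print("Feeds are out of sync: delay the faster streams or reduce the slowest one.")
```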

As the streaming audience matures, its needs and expectations continue to evolve: beyond simply broadcasting, viewers expect a high degree of interaction and technical sophistication. A successful streamer needs a high-bandwidth fiber internet connection to deliver the best viewing experience, but equally important are their choices of video format and delivery protocol, which determine the latency of their streams.

HTTP-based chunk streaming works against low latency

Ultra-low latency in live streaming is generally defined as connections in the range of 0.2 to two seconds, while low-latency connections clock in at two to six seconds. By contrast, the default latency for delivering common HLS and DASH video formats can be anywhere from thirty to sixty seconds. With latency that high, it’s impossible to deliver the interactivity that audiences want.

The culprit behind this high latency is what’s known as HTTP-based chunk streaming. HTTP streaming formats such as MPEG-DASH and HLS break the video into small segments, or chunks, that must be buffered prior to playback. While it’s possible to shrink the chunks to reduce latency, making them too small increases the chance that viewers will experience rebuffering and other playback issues.
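As a back-of-the-envelope illustration (the segment lengths, buffer depths, and overhead below are assumptions, not figures for any particular player or CDN), the latency of segment-based delivery grows with both the segment duration and the number of segments buffered before playback begins:

```python
def chunked_latency(segment_duration_s: float,
                    segments_buffered: int,
                    encode_and_delivery_overhead_s: float = 2.0) -> float:
    """Approximate glass-to-glass latency for segment-based (HLS/DASH) delivery.

    Players typically wait for several complete segments before starting
    playback, so latency grows with both segment length and buffer depth.
    """
    return segment_duration_s * segments_buffered + encode_and_delivery_overhead_s

# Longer segments with a three-segment buffer: tens of seconds of delay.
print(chunked_latency(segment_duration_s=6.0, segments_buffered=3))   # ~20s

# Shorter segments cut latency but leave less margin for network jitter,
# which is why very small chunks raise the risk of rebuffering.
print(chunked_latency(segment_duration_s=1.0, segments_buffered=3))   # ~5s
```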

Complicating the video delivery workflow further, audiences watch streaming video on TVs, phones, laptops, tablets, and more, and each of these devices may prefer a different video format. A broadcaster must therefore take each video stream and transmux it into popular formats such as HLS and DASH before sending it out. In addition, each stream may need its own configuration to deliver the best possible picture quality for the viewer’s connection conditions.
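For illustration, here is a minimal sketch of repackaging a single live feed into both HLS and DASH by invoking ffmpeg from Python. The ingest URL, output paths, and segment length are placeholders, and exact flags can vary across ffmpeg versions:

```python
import subprocess

INPUT_URL = "rtmp://example.com/live/stream"  # hypothetical ingest endpoint

# Repackage (transmux) the feed into HLS with short segments, without re-encoding.
hls = subprocess.Popen([
    "ffmpeg", "-i", INPUT_URL,
    "-c", "copy",            # keep the original encode, only change the container
    "-f", "hls",
    "-hls_time", "2",        # 2-second segments: lower latency, less jitter margin
    "out/stream.m3u8",
])

# Package the same feed as DASH for devices and players that prefer it.
dash = subprocess.Popen([
    "ffmpeg", "-i", INPUT_URL,
    "-c", "copy",
    "-f", "dash",
    "out/stream.mpd",
])

hls.wait()
dash.wait()
```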

Sub-second latency solutions are the name of the game

As high-speed connectivity and growing consumption of live content continue to raise the benchmark for interactive applications, brands looking to upgrade their streaming solutions should consider whether the provider has a global content delivery network with the capacity, reach, and connectivity to deliver reliable, broadcast-quality, real-time video streaming at scale.

Furthermore, as the online population continues to fragment across channels and devices, it is imperative that the solution be natively supported on all major browsers, including Chrome, Safari, Firefox, and Opera, so that viewers can enjoy real-time streaming without special plug-ins.

Lastly, streaming solutions should offer capabilities that add value to the live experience, such as integrating real-time interactive data into the broadcast to enable richer interactivity between the content provider and the audience. Making live viewing a more interactive, social experience also creates additional opportunities for content distributors to monetize their live video content.

In short, the technical side of livestreaming matters almost as much as the on-screen talent.


Edwin Koh is Regional Director, SEA at Edgio.

TechNode Global INSIDER publishes contributions relevant to entrepreneurship and innovation. You may submit your own original or published contributions subject to editorial discretion.
