The WebRTC protocol has crept into video almost by stealth through its widespread adoption by major messaging platforms, including Google Hangouts and Facebook Messenger, as well as some aspects of WhatsApp. Its development was spearheaded by Google for release in 2013 to facilitate peer-to-peer communication between browsers, and subsequently apps, for messaging and other interactive applications requiring low-latency communication, notably Voice over IP and video conferencing.
Before that, browsers required dedicated plug-ins to participate in collaborative applications or universal messaging systems. Through WebRTC, browsers effectively have real-time communications embedded and can participate in group communication via video, voice and chat, with the ability to share screens and files. As all the leading browsers have now incorporated it, users can communicate from any web interface they choose.
This presence in messaging has taken WebRTC into video, and its emphasis on low latency has led to its adoption for some streaming services, though not always with optimum results. Having been designed for peer-to-peer (P2P) communication in two-way interactive applications, where low latency is non-negotiable, WebRTC inevitably lets video quality take second place. Meanwhile, as WebRTC deployment grew, video streaming services were also taking off and coalescing around two protocols focused more on balancing quality against available network bandwidth and device playback capabilities: Apple's HLS and MPEG-DASH. There was also Microsoft Smooth Streaming, but that was largely subsumed by DASH, leaving just those two.
With both of those well entrenched, it was clear that they would have to be accommodated within a common framework to minimize complexity and cost for distributors and content owners. Fortunately, Apple and Microsoft came together to develop the Common Media Application Format (CMAF) as a standard transport container for both HLS and DASH, reducing complexity within video streaming workflows. At first the main emphasis was on VoD, and therefore on video quality rather than latency. Neither HLS nor DASH was designed as a real-time protocol; both rely on buffering to protect against glitches or variations in bandwidth resulting from congestion, and deliver at different resolutions to ensure optimal quality. Latency can be reduced only by storing less data in buffers, which increases the probability of glitches occurring.
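The buffering trade-off can be made concrete with some rough arithmetic. In a segmented protocol such as HLS or DASH, the player typically sits behind live by roughly the segment duration multiplied by the number of segments it buffers, plus encode and delivery overhead. The function and figures below are a simplified illustration, not taken from either specification:

```python
# Illustrative arithmetic only: rough live latency for a segmented
# streaming protocol (HLS/DASH style). All numbers are hypothetical.
def approx_live_latency(segment_seconds: float, buffered_segments: int,
                        encode_and_cdn_seconds: float = 2.0) -> float:
    """Player latency grows with segment length and buffer depth."""
    return segment_seconds * buffered_segments + encode_and_cdn_seconds

# Traditional HLS guidance of 6-second segments with three buffered
# leaves the viewer roughly 20 seconds behind live.
legacy = approx_live_latency(6.0, 3)      # 20.0 seconds

# Shrinking segments and the buffer cuts latency, but leaves less
# headroom to absorb congestion, so glitches become more likely.
aggressive = approx_live_latency(2.0, 2)  # 6.0 seconds
```

This is why segment duration and buffer depth, not the protocol itself, set the latency floor for HLS and DASH deployments.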
WebRTC has been proposed as a complementary approach with lower latency, but it did not work well in streaming environments where video is broken into chunks. It is based on the connectionless UDP protocol, which on its own provides no guarantee of delivery and no scope for retransmission. Various enhancements have been added to WebRTC, allowing some degree of retransmission if requests are received within a given time window. But having been designed for operation within small groups, as is typical in messaging environments, it has not scaled well to larger streaming services.
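The windowed-retransmission idea can be sketched as follows. This is a minimal illustration in the spirit of the NACK-based recovery added to WebRTC, not its actual API: the sender caches recently sent packets, and honours a retransmission request only while the packet is still young enough to be useful for playback. Class and parameter names are invented for the example:

```python
# Minimal sketch of windowed retransmission over an unreliable
# transport, in the spirit of WebRTC's NACK/RTX enhancements.
# All names and the window length are illustrative assumptions.
class RetransmitBuffer:
    """Sender-side cache: packets are kept for a short window, and a
    retransmission request that arrives too late is simply ignored."""

    def __init__(self, window_seconds: float = 0.5):
        self.window = window_seconds
        self.sent = {}  # sequence number -> (payload, send time)

    def send(self, seq: int, payload: bytes, now: float) -> bytes:
        self.sent[seq] = (payload, now)
        return payload

    def handle_nack(self, seq: int, now: float):
        """Resend only if the packet is still inside the window;
        otherwise skipping it beats stalling live playback."""
        entry = self.sent.get(seq)
        if entry is None:
            return None
        payload, sent_at = entry
        if now - sent_at > self.window:
            return None  # too late to be useful
        return payload
```

The key design point is that recovery is bounded: a lost packet is either repaired quickly or abandoned, which keeps latency low but caps how much loss the scheme can absorb at streaming scale.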
The scaling challenge can be addressed to some extent by CDNs deploying edge servers that support WebRTC, so that communications are broken down into smaller groups, as Limelight has done, but then latency is not as low as in messaging environments. Also, unless this approach enjoys more widespread support, it would compromise the multi-CDN strategy being adopted by many service providers, which bears down on latency by selecting the best-performing CDN in real time.
Having been developed six years ago, WebRTC was inevitably not designed from the outset for low-latency streaming, and so, despite enhancements, it has been overtaken on this count by the slightly newer SRT (Secure Reliable Transport) protocol, the most advanced of those progressing as open rather than proprietary standards. Like WebRTC, but with greater success, SRT combines the robustness of the connection-oriented TCP protocol, with its built-in packet retransmission, and the inherent low latency of UDP, which simply sends packets into the ether without any acknowledgement.
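The receiver side of that combination can be sketched too. SRT-style transports reorder packets and wait for retransmissions, but only up to a fixed latency budget, after which a gap is skipped so playback never stalls. The sketch below illustrates that principle under assumed names and numbers; it is not SRT's actual algorithm or API:

```python
# Receiver-side sketch of a fixed latency budget over UDP delivery:
# wait briefly for missing packets, then skip them so playback keeps
# moving. Class name and budget value are illustrative assumptions.
class LatencyWindowReceiver:
    def __init__(self, budget_seconds: float = 0.12):
        self.budget = budget_seconds
        self.pending = {}   # sequence number -> (payload, arrival time)
        self.next_seq = 0

    def receive(self, seq: int, payload: bytes, now: float):
        self.pending[seq] = (payload, now)

    def deliver(self, now: float):
        """Release every in-order packet; give up on a missing one
        only once later packets have waited beyond the budget."""
        out = []
        while True:
            if self.next_seq in self.pending:
                out.append(self.pending.pop(self.next_seq)[0])
                self.next_seq += 1
            elif any(now - arrived > self.budget
                     for _, arrived in self.pending.values()):
                self.next_seq += 1  # abandon the lost packet
            else:
                break
        return out
```

The budget is the whole trade: a larger one tolerates more loss and jitter at the cost of latency, which is exactly the dial that separates a conferencing profile from a contribution-feed profile.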
Just to complicate matters, there is also a newer protocol gaining some traction: RIST (Reliable Internet Stream Transport). Like SRT, RIST is UDP-based; it is an effort by the Video Services Forum (VSF) to standardize the same low-latency streaming mechanisms embodied in SRT.
The main point, as we have said before, is that whichever protocol becomes dominant, be it SRT, RIST or even WebRTC, all are on the same page in embodying some form of enhanced UDP in place of TCP as the mechanism of choice for streaming latency-sensitive live or linear video content.