
12 December 2019

WebRTC emerges as winner for low latency in-game betting

In-sports betting might not seem directly related to video and TV, but it shares live streaming's requirement for low latency. Furthermore, online betting firms are under competitive pressure to provide live video feeds, and sometimes audio-only feeds, from major events where they hold the rights at as low a latency as possible. Most pertinently of all, in-sports betting has become an increasing feature of live OTT services, attracting eyeballs and generating additional revenue. The pressure to compete on features means that commissions charged could be cut to the bone, much as happens in Las Vegas casinos, aiming to generate significant profits from slim margins through volume.

For such reasons, in-sports betting now features prominently in discussions and conferences around low latency, alongside video. The latency challenge for betting within OTT services is fairly obvious when we consider, for example, that viewers of English Premier League matches streamed by Amazon just a few weeks ago often experienced delays of a minute behind live (see separate story). This means that any viewer with, say, a friend at a match could react to a goal and bet on the changing odds for the final score faster than those without such a facility.

This example highlights a contentious aspect of in-sports betting, since latency can never be taken out of the equation entirely. Anyone waiting to view a stream before placing a bet will always be at a disadvantage compared with someone at the actual event who can hit the button vital seconds earlier. This is already a significant issue for some sports such as tennis, where bets can be placed on the results of very short-term events, such as individual games within a longer match, lasting just a few minutes. In that case there is an additional source of latency: the time taken for the umpire to update the score, a result the punter at the court already knows.

Organized betting rings have moved in to exploit this by stationing large numbers of people at such events, often otherwise sparsely attended second-tier matches. Organizers and bookmakers have both attempted to counter this, the latter by banning customers caught indulging in such on-court betting and the former by ejecting from the grounds those caught making excessive use of mobile phones.

There is also scope for arbitrage, exploiting the short-term differences in odds between rival betting sites that can arise as a result of this latency. Just after a goal is scored at a football match the odds change, but the change may not be reflected at exactly the same time on different websites, yielding a short window in which, in theory, a punter can make a guaranteed profit by backing one outcome with one site and the opposite outcome with another.
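
To make the arithmetic concrete, here is a minimal sketch, in TypeScript and with made-up decimal odds, of the two-way check such a punter would be running: if the implied probabilities quoted by the two sites add up to less than one, splitting the stake in proportion locks in the same return whichever way the result goes.

```typescript
// Minimal sketch of a two-outcome arbitrage check, using made-up decimal odds.
// If the implied probabilities (1/odds) from the two sites sum to less than 1,
// splitting the total stake in proportion to those implied probabilities
// returns the same payout whichever outcome occurs.

function twoWayArbitrage(oddsA: number, oddsB: number, totalStake: number) {
  const impliedSum = 1 / oddsA + 1 / oddsB;
  if (impliedSum >= 1) return null; // no arbitrage window

  const stakeA = (totalStake * (1 / oddsA)) / impliedSum;
  const stakeB = totalStake - stakeA;
  const guaranteedReturn = stakeA * oddsA; // equals stakeB * oddsB
  return { stakeA, stakeB, profit: guaranteedReturn - totalStake };
}

// Example: one site is slow to shorten its price after a goal.
console.log(twoWayArbitrage(2.10, 2.05, 1000));
// => stakes of roughly 494 and 506, locking in a profit of about 3.7%
```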

However, that is risky, because large sums have to be bet to make significant profits on such fractional variations in odds, and there is scope for it going wrong, say when one site temporarily freezes, as sometimes happens while prices are being updated. In practice, therefore, the main risk for bookmakers is of punters exploiting stale odds that have not yet changed and so no longer reflect the real probability of the event.

To some extent these are separate issues from low latency streaming, but they are useful for setting context. For OTT providers it is more about encouraging some viewers to place small bets as a way of enhancing the experience and drawing them in. Latency matters more for avoiding customers being at a disadvantage than for helping them get rich quick.

As such, bookmakers have become an additional voice in the low latency debate, which, as we observed around IBC 2019, is in a state of flux, with several contending variants such as Low-Latency CMAF (Common Media Application Format) with Chunked Transfer Encoding, Apple’s Low Latency HLS, WebRTC, SRT (Secure Reliable Transport) and RIST (Reliable Internet Stream Transport).

We have compared and contrasted these before, and the main context here is that in-sports betting has become another force attempting to crash heads together, overcome short-term commercial resistance to convergence and accelerate trials or deployments. Essentially there is a trade-off between low latency, cost and quality, with low latency rising to the top for in-sports betting.

It is worth considering all the low latency options from the higher level of where the delays come from and what scope there is for reducing each of them. The first source of latency is raw signal transmission time imposed by the laws of physics, with two sub-categories. First, there is the theoretical minimum end-to-end transmission time along a single fiber without any amplifying components, if that were possible, which is about 5 microseconds per km. There is nothing that can be done to mitigate that other than placing source and destination as close together as possible, but fortunately it is not too significant, amounting to about 100 ms to reach the far side of the world, roughly 20,000 km away.

Then there are the additional delays associated with switching, rerouting and amplification within a fiber network, where there is scope for optimization and trade-off against cost, for example by using longer fiber spans and all-optical components. This might add 50% to 100% to the theoretical minimum delay.
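
As a rough illustration of those first two buckets, the sketch below (TypeScript, with illustrative distances and an assumed 50% to 100% network overhead) turns the 5 microseconds per km figure into end-to-end milliseconds.

```typescript
// Back-of-envelope latency for the physical path, using the figures quoted
// above: roughly 5 microseconds per km of fiber, plus 50-100% overhead for
// switching, rerouting and amplification. Distances here are illustrative.

const FIBRE_US_PER_KM = 5; // light in glass travels at roughly 200,000 km/s

function pathLatencyMs(distanceKm: number, networkOverhead = 0.75): number {
  const propagationMs = (distanceKm * FIBRE_US_PER_KM) / 1000;
  return propagationMs * (1 + networkOverhead);
}

console.log(pathLatencyMs(300));   // e.g. stadium to a regional data center: ~2.6 ms
console.log(pathLatencyMs(20000)); // opposite side of the world: ~175 ms with overhead
```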

The second broad category comprises delays within the video lifecycle associated with content creation and uploading, where there is again scope for optimization, although this is often outside the control of the operator.

Third and finally, there is the latency associated with distribution and playout, the lag between video capture and playback, including delays incurred over a CDN where relevant. These delays, especially those associated with adaptive bitrate streaming, are the focus of the low latency methods listed above.

The challenge for all these methods is to balance latency against quality. For on-demand traffic, low latency is almost irrelevant, so the TCP transport protocol with packet retransmission can be used, as it is by Netflix and YouTube. But TCP is too inefficient for live streaming, because every packet has to be acknowledged by the receiver so that the sender can retransmit any that are lost. This scales poorly for live transmission, because the network becomes flooded with acknowledgements, reducing bandwidth efficiency, while the buffering established between every sender and receiver in the workflow, in every router for example, imposes unacceptable delays.
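
The cost of that reliability is easy to put a rough number on: a single lost packet holds up everything queued behind it until the retransmission lands. The sketch below uses a deliberately simplified timing model and illustrative figures rather than measurements.

```typescript
// Rough sense of why one lost packet stalls a TCP stream: segments behind the
// gap sit in the receive buffer until the retransmission arrives. The timing
// model here is deliberately simplified and the figures are illustrative.

function tcpRecoveryDelayMs(rttMs: number, fastRetransmit: boolean, rtoMs = 200): number {
  // Fast retransmit: duplicate ACKs take about one round trip to trigger the
  // resend, so delivery of everything behind the gap slips by roughly one RTT.
  // Otherwise the sender waits out a full retransmission timeout instead.
  return fastRetransmit ? rttMs : rtoMs;
}

console.log(tcpRecoveryDelayMs(80, true));  // ~80 ms stall on an 80 ms RTT path
console.log(tcpRecoveryDelayMs(80, false)); // ~200 ms if the loss needs a timeout
```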

For this reason, TCP-based protocols such as RTMP are being superseded for live video streaming. Their successors, such as SRT and RIST, mostly use some form of Automatic Repeat reQuest (ARQ) for packet retransmission, aiming to make this more efficient and faster than TCP while achieving essentially the same goal of insulating the receiver against lost IP packets.
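
The NACK-style idea is straightforward, as the simplified receiver sketched below shows; the structure and names are illustrative rather than taken from SRT's or RIST's actual packet formats.

```typescript
// Minimal sketch of the NACK-style ARQ idea behind protocols such as SRT and
// RIST (simplified; names and structure here are illustrative, not either
// protocol's wire format). The receiver tracks sequence numbers and asks only
// for the packets it actually missed, instead of the sender waiting on
// positive acknowledgements for everything, as TCP does.

type SendNack = (missingSeqs: number[]) => void;

class ArqReceiver {
  private expectedSeq = 0;
  private buffered = new Map<number, Uint8Array>();

  constructor(private sendNack: SendNack) {}

  onPacket(seq: number, payload: Uint8Array): Uint8Array[] {
    if (seq < this.expectedSeq) return []; // late duplicate, already delivered

    if (seq > this.expectedSeq) {
      // A gap: request just the missing packets and hold this one until the
      // retransmissions fill the hole.
      const missing: number[] = [];
      for (let s = this.expectedSeq; s < seq; s++) {
        if (!this.buffered.has(s)) missing.push(s);
      }
      if (missing.length > 0) this.sendNack(missing);
    }
    this.buffered.set(seq, payload);

    // Deliver everything that is now contiguous.
    const deliverable: Uint8Array[] = [];
    while (this.buffered.has(this.expectedSeq)) {
      deliverable.push(this.buffered.get(this.expectedSeq)!);
      this.buffered.delete(this.expectedSeq);
      this.expectedSeq++;
    }
    return deliverable;
  }
}
```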

WebRTC stands out from the others by avoiding reliance on packet retransmission, instead leaning on the alternative redundant approach of Forward Error Correction (FEC), where extra data is added so that packets lost or corrupted in transit can be reconstructed at the receiving end. This adds to the network bandwidth required, and it can only cope with a certain degree of packet loss, with no scope for winding up error correction by increasing latency when network conditions deteriorate too far. WebRTC therefore depends on relatively reliable transmission and even then, at present, relies on H.264 Baseline encoding, which restricts content to 1080p HD quality and is not capable of transmitting 4K.
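
The principle behind FEC can be seen in the simplest possible scheme, XOR parity across a small group of packets, sketched below. Real schemes, including WebRTC's own, are more sophisticated, but the trade-off is the same: extra bandwidth up front instead of a round trip after a loss.

```typescript
// Minimal sketch of the FEC idea: alongside every group of media packets the
// sender emits one XOR parity packet, so the receiver can rebuild any single
// lost packet in the group without asking for a retransmission.

function xorParity(packets: Uint8Array[]): Uint8Array {
  const size = Math.max(...packets.map(p => p.length));
  const parity = new Uint8Array(size);
  for (const p of packets) {
    for (let i = 0; i < p.length; i++) parity[i] ^= p[i];
  }
  return parity;
}

// Rebuild the one missing packet from the survivors plus the parity packet.
function recoverMissing(received: Uint8Array[], parity: Uint8Array): Uint8Array {
  return xorParity([...received, parity]);
}

// Usage: one packet in a group of four is lost in transit.
const group = [1, 2, 3, 4].map(n => Uint8Array.from([n, n * 10, n * 20]));
const parity = xorParity(group);
const lost = group[2];
const survivors = group.filter(p => p !== lost);
console.log(recoverMissing(survivors, parity)); // same bytes as the lost packet
```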

WebRTC was originally conceived largely for interactive applications, including video conferencing and collaboration in general, with speed gained through simple APIs and direct browser-to-browser communication without any intermediary, in addition to avoiding retransmission delays. It achieves sub-second latency in many situations, but at the cost of video quality and resilience against varying network conditions.
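
For a flavor of why the browser-native API appeals, the sketch below shows a receive-only viewer built on the standard RTCPeerConnection interface. The signal() helper and the STUN server address are placeholders of our own, since WebRTC leaves signaling entirely to the application.

```typescript
// Minimal sketch of a browser-side WebRTC viewer using the standard
// RTCPeerConnection API. The signaling step (exchanging the offer and answer
// with the streaming backend) is deliberately hand-waved: signal() is a
// hypothetical helper, not part of WebRTC itself.

async function watchLowLatencyFeed(video: HTMLVideoElement) {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.example.com:3478" }], // placeholder STUN server
  });

  // Receive-only: the viewer only consumes the live feed.
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });

  // Attach incoming media straight to the <video> element.
  pc.ontrack = (event) => {
    video.srcObject = event.streams[0];
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Hypothetical signaling round trip to the streaming/betting backend.
  const answer = await signal(pc.localDescription!);
  await pc.setRemoteDescription(answer);
}

// Placeholder for whatever signaling the service actually uses
// (a WebSocket, an HTTP POST, and so on).
declare function signal(offer: RTCSessionDescription): Promise<RTCSessionDescriptionInit>;
```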

Initially, other low latency protocols such as low latency CMAF and Apple’s low latency HLS were also positioned for interactive applications such as webinars and indeed gambling, but WebRTC has now risen to the top for in-sports betting, precisely because it truly trades quality for low latency. When the video is subservient to the application, 1080p is good enough and the occasional buffering or glitches can be tolerated.

Now some sports broadcasters and betting sites are trialing WebRTC, so we can expect the first deployments during 2020. Among those believed to be looking at WebRTC are DAZN and Disney’s ESPN+.

We are also seeing partnerships between streamers and web betting firms, with BetMGM, the online sports betting app, having formed a content partnership with DAZN to provide gambling content around the latter’s broadcast of the recent heavyweight boxing rematch between Andy Ruiz Junior and Anthony Joshua. Live betting lines supplied by BetMGM were integrated into DAZN’s broadcast throughout the night of the fight. Again, we can expect more such collaborations next year.