Over the weekend, Netflix’s live boxing match sparked significant discussion on LinkedIn, fueled by widespread reports of buffering. Although we enjoyed the online jousting, we are the only publication to put hard numbers on the table to back our conclusions.
Internet Service Providers (ISPs) later shared traffic statistics, revealing the network load generated by the event. Netflix announced a peak of 65 million viewers worldwide for the Tyson versus Paul match, across 60 million households, and an average minute audience (AMA) of 108 million live viewers globally. AMA reflects the average number of active viewers during any given minute, not households, showing here an average of 1.8 viewers per household.
However, an AMA count can tally the same viewer from the same household multiple times, for instance if they watched the event on different devices or dropped in and out during the session. This may have been the case for numerous households whose first device fell over during the crucial Tyson walk-on, when concurrency peaked.
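The arithmetic behind the 1.8 figure is a simple division of the publicly reported numbers; the sketch below is a back-of-the-envelope check, not a Nielsen-grade AMA methodology:

```python
# Back-of-the-envelope check of Netflix's announced audience figures.
ama_viewers = 108_000_000      # average minute audience (people), as announced
households = 60_000_000        # households reached, as announced
peak_concurrent = 65_000_000   # peak concurrent viewers, as announced

viewers_per_household = ama_viewers / households
print(f"{viewers_per_household:.1f} viewers per household")  # 1.8
```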
Even so, Netflix has been testing live streaming for about a year, gradually scaling its events. Starting with stand-up comedy shows, the platform has since broadcast a golf match featuring Formula 1 drivers and a tennis exhibition starring Rafael Nadal and Carlos Alcaraz. The boxing match marked a significant milestone ahead of Netflix’s planned grand finale: an exclusive NFL game on Christmas Day.
Across social media, many viewers complained about buffering and low-resolution video. Such issues are typically linked to either content delivery network (CDN) problems or congestion within ISP networks. Downdetector, which tracks complaints primarily in the US, logged 100,000 reports during the event, underscoring the scale of disruptions.
Based on this data, and with 38 million peak viewers in the US, the failure rate works out to under 0.3%, far too low on its own to account for the global backlash.
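The estimate is simply Downdetector reports divided by the US peak audience. It assumes one report per affected viewer, which almost certainly understates the real impact, since most affected users never file a report:

```python
# Rough failure-rate estimate: Downdetector reports vs. US peak audience.
# Assumes one report per affected viewer, likely an undercount of the
# true number of impacted streams.
reports = 100_000
us_peak_viewers = 38_000_000

failure_rate = reports / us_peak_viewers
print(f"{failure_rate:.2%}")  # ~0.26%
```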
Netflix relies on its proprietary CDN, Open Connect, which it fully manages. Open Connect involves deploying Netflix-owned caches within ISP networks or at higher-tier Internet Exchange (IX) points. For video on-demand (VoD) content, about 60% of Netflix traffic is delivered through ISP caches, with the remaining 40% served via IX caches. However, data specific to live traffic remains unavailable.
Focusing on the US, let’s compare Netflix’s 38 million US peak viewers with other recent major livestreamed sports events. Peacock reached 23 million users during an NFL playoff game, with 16 million peak viewers, while Amazon Prime Video’s Thursday Night Football counted 18 million peak viewers per game during the 2023 season.
Some ISPs reported that Netflix traffic spiked to 2.5 times its usual level, reaching 66% of Peacock’s peak traffic. Both Peacock and Netflix stream in HD, with Peacock using AVC and Netflix probably using HEVC for capable devices. Based on measured traffic versus these other events, Netflix averaged roughly 4 Mbps per stream, suggesting that lower resolutions were served, that HEVC kept bitrates down, or both.
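A quick way to sanity-check the 4 Mbps figure is to look at the aggregate egress it implies at global peak. The multiplication below is deliberately naive, ignoring cache hit rates, device mix, and regional differences:

```python
# Aggregate egress implied by a ~4 Mbps per-stream average at global peak.
# Naive multiplication for scale intuition only; real delivery is spread
# across thousands of Open Connect caches.
avg_bitrate_mbps = 4
peak_viewers = 65_000_000

aggregate_tbps = avg_bitrate_mbps * peak_viewers / 1_000_000
print(f"~{aggregate_tbps:.0f} Tbps at global peak")  # ~260 Tbps
```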
Several factors likely contributed to the issues Netflix faced. A key challenge lies in the design of Open Connect, which was optimized for VoD traffic with a concurrency rate (the percentage of users streaming simultaneously) typically below 5%. Live streaming, in contrast, can see concurrency rates of 50% or higher, requiring Netflix to handle ten times more simultaneous sessions at peak.
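The ten-times figure falls straight out of the concurrency rates. The subscriber count below is illustrative, not a Netflix number; only the ratio matters:

```python
# Why live is ~10x harder than VoD for the same subscriber base:
# concurrency (share of users streaming at once) jumps from ~5% to ~50%.
subscribers = 100_000_000  # illustrative figure, not a Netflix number

vod_sessions = subscribers * 5 // 100    # ~5% concurrency, typical of VoD
live_sessions = subscribers * 50 // 100  # ~50% concurrency, big live event

print(live_sessions // vod_sessions)  # 10
```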
When ISP-level caches reach capacity, Netflix falls back on IX-level caches, requiring traffic to traverse ISP peering points. Robust ISP architectures can absorb this surge, but if peering networks lack capacity, they may become saturated, leading to resolution drops and buffering. Many viewers experienced these issues as Tyson entered the ring, coinciding with the audience peak.
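The fallback behavior described above can be sketched as a toy model. This is a hypothetical simplification for illustration, not Netflix’s actual Open Connect steering logic; all names and capacities are invented:

```python
# Toy model of cache-tier selection: serve from the ISP-embedded cache
# while it has headroom, then spill over to an IX-level cache, which
# must cross the ISP's peering links. Saturated peering means degraded
# quality (lower bitrates, buffering). All values are hypothetical.
def route_session(isp_cache_load, isp_cache_capacity,
                  peering_load_gbps, peering_capacity_gbps):
    if isp_cache_load < isp_cache_capacity:
        return "isp-cache"          # best case: traffic stays inside the ISP
    if peering_load_gbps < peering_capacity_gbps:
        return "ix-cache"           # spillover across peering links
    return "ix-cache-degraded"      # saturated peering: resolution drops

print(route_session(900, 1000, 400, 800))   # isp-cache
print(route_session(1000, 1000, 400, 800))  # ix-cache
print(route_session(1000, 1000, 800, 800))  # ix-cache-degraded
```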
Netflix has yet to comment publicly on the technical matter itself, and a detailed explanation may not be forthcoming.
However, Netflix CTO Elizabeth Stone issued a memo internally to staff, which—according to Bloomberg—said, “This unprecedented scale created many technical challenges, which the launch team tackled brilliantly by prioritizing stability of the stream for the majority of viewers. I’m sure many of you have seen the chatter in the press and on social media about the quality issues. We don’t want to dismiss the poor experience of some members, and know we have room for improvement, but still consider this event a huge success.”
Even though the event was seen as a success in the CTO’s eyes, the lessons are clear: relying solely on a proprietary CDN without adopting a multi-CDN approach, as other sports streaming services do, exposes scalability vulnerabilities. The upcoming NFL Christmas game may face similar US traffic levels, although based on other NFL numbers, concurrency might be lower than for the boxing match, which still does not preclude scalability issues at some ISPs.
To address these issues before the NFL game, Netflix has several short-term options:
- Increase peering network capacity at ISPs where failures occurred.
- Deploy public CDNs in regions where peering traffic proved unsustainable.
In the longer term, Netflix could explore technologies such as:
- Multicast ABR (Broadpeak): While effective, this solution requires ISPs to upgrade their gateways. Given Netflix’s partnerships with approximately 2,500 ISPs, this would be a monumental undertaking.
- P2P (Quanteec): Although promising, this approach has been tested only on a small scale for unencrypted content. Deploying it for DRM-protected streams would require significant updates to Netflix’s client software.
Given Netflix’s penchant for proprietary technologies and its fierce DIY culture, these two long-term options are serious wildcards.
Ultimately, this event has sparked broader discussions about scaling live sports streaming. It may deter other streaming providers from building proprietary CDNs and instead encourage reliance on proven, large-scale live CDNs.
The debate is also likely to reinvigorate discussions about edge CDN options, where all caches are embedded inside ISP networks, thus removing the dependency on peering points.