Haivision buys into perceptual quality in warning shot to SSIMWave

Secure Reliable Transport (SRT) founding member and overall video streaming darling Haivision has used its enviable growth momentum to acquire LightFlow Media Technologies, the video optimization arm of Epic Labs – a month after the two vendors joined forces to bring content-aware encoding to edge devices.

The similarities between Spain’s LightFlow and SSIMWave, the Canadian perceptual video quality specialist which Faultline has spoken with numerous times this year, are striking. In fact, during a conversation with SSIMWave at the recent IBC show, we suggested the vendor’s captivating technology and revered R&D record would surely attract acquisition interest soon.

Naturally, SSIMWave has upheld the position that it has very few direct rivals, yet Haivision has inadvertently exposed a clear-cut competitor and in doing so will draw additional attention to SSIMWave while also creating a more formidable rival.

Adding AI and machine learning capabilities to Haivision’s portfolio is the name of the game here, based on LightFlow’s media cloud orchestration architecture. In short, LightFlow uses proprietary machine learning algorithms to analyze video content on a per-title or per-scene basis, from which it produces a quality-bitrate report. But here comes the clever part. Enter the LightFlow Quality Index (LQI), a video quality metric built to represent how the human visual system perceives specific video content at different bitrates and resolutions.

Through this AI-based system, LightFlow claims to reduce CDN and transcoding costs by around 30% to 40% on average, while enhancing QoE by around 15% and increasing user engagement. It relies on actioning video analytics to optimize streaming delivery, capturing real-time context so playback is reproduced using optimal encoding, codec, network and storage settings to maximize KPIs, as demonstrated by the diagram below.

We have no way of telling which system is best. But we can tell you that SSIMWave’s big claim is having the only software out there capable of breaking through the 90% mark for correlation between computed objective scores and human subjective ratings. Its SSIMPlus technology churns out a scoring system as follows: Excellent (81-100), Good (61-80), Fair (41-60), Poor (21-40), and Bad (1-20) – whereby a score of 80+ is generally considered the equivalent of an HD TV broadcast.
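The band boundaries above map directly to score ranges, which a minimal sketch can make concrete (the bands come from the article; the function name is illustrative, not SSIMWave's API):

```python
def ssimplus_band(score: float) -> str:
    """Map a SSIMPlus score (1-100) to its quality band.
    Band boundaries as described by SSIMWave's scoring system."""
    if not 1 <= score <= 100:
        raise ValueError("SSIMPlus scores run from 1 to 100")
    if score >= 81:
        return "Excellent"
    if score >= 61:
        return "Good"
    if score >= 41:
        return "Fair"
    if score >= 21:
        return "Poor"
    return "Bad"

print(ssimplus_band(85))  # Excellent -- roughly HD broadcast territory
```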

Shots have been fired, however, as LightFlow claims SSIM (Structural Similarity Index Measurement) might not be all it’s cracked up to be. It says that more commonly used quality metrics like SSIM and PSNR (Peak Signal-to-Noise Ratio) simply compare differences between the original and the encoded video through arithmetic pixel-to-pixel calculations – without truly factoring in human perception.
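PSNR illustrates the point well: it is a pure pixel-to-pixel arithmetic comparison, weighting every pixel error equally regardless of whether a viewer would notice it. A minimal sketch over flat lists of 8-bit luma values:

```python
import math

def psnr(original, encoded, max_value=255):
    """Plain PSNR: mean squared error between two equally sized
    frames, converted to decibels. No model of human perception --
    every pixel difference counts the same wherever it lands."""
    if len(original) != len(encoded):
        raise ValueError("frames must be the same size")
    mse = sum((o - e) ** 2 for o, e in zip(original, encoded)) / len(original)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10(max_value ** 2 / mse)

# The same error magnitude scores the same no matter where it occurs,
# even though viewers notice artifacts in some regions far more than others.
frame = [100, 100, 100, 100]
print(psnr(frame, [101, 100, 100, 100]))
```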

Netflix’s VMAF (Video Multimethod Assessment Fusion), for example, improves on such metrics by folding in the perceptual opinions of real-life viewers – although gathering those opinions takes a substantial amount of time and computing cost.

So, LightFlow claims the secret sauce behind LQI – achieving ‘true’ human perceptual quality without using actual humans – lies in removing the need to produce a full set of encoded output streams before measuring. Encoding tasks are therefore taken out of the evaluation process to save time and computing resources. LightFlow’s machine learning-based system slides in before content is encoded, predicting what the perceptual quality will be once the title is played, based on different combinations of bitrates, screen resolutions and specific encoding parameters.
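The pre-encode prediction idea can be sketched as a model that maps content features plus candidate encoding parameters to a predicted score, with no encode in the loop. Everything below – the feature names and the toy formula standing in for a trained regressor – is an illustrative assumption, not LightFlow's actual algorithm:

```python
# Sketch of pre-encode quality prediction: content features plus a
# candidate (bitrate, resolution) pair go in, a predicted perceptual
# score comes out, without running any encodes. The features and the
# formula are invented stand-ins for a trained model.
from dataclasses import dataclass

@dataclass
class ContentFeatures:
    spatial_complexity: float   # e.g. edge density per frame
    temporal_complexity: float  # e.g. mean motion magnitude

def predict_quality(features: ContentFeatures, bitrate_kbps: float,
                    height: int) -> float:
    """Toy stand-in for a trained regressor: more complex and
    higher-resolution content needs more bits for the same score."""
    demand = 1 + features.spatial_complexity + features.temporal_complexity
    demand *= height / 1080
    score = 100 * bitrate_kbps / (bitrate_kbps + 3000 * demand)
    return round(score, 1)

talking_head = ContentFeatures(spatial_complexity=0.2, temporal_complexity=0.1)
sports = ContentFeatures(spatial_complexity=0.8, temporal_complexity=0.9)
# Same ladder rung, different predicted quality per title:
print(predict_quality(talking_head, 3000, 1080))
print(predict_quality(sports, 3000, 1080))
```

The design point is simply that easy content (a talking head) reaches a given predicted score at a lower bitrate than demanding content (sports), which is what makes per-title ladders possible without exhaustive test encodes.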

LQI’s underlying algorithm has been trained on millions of hours of video encoded with variable parameters to evaluate a video stream’s overall QoE performance, inclusive of quality and playback. As with SSIMWave, LightFlow stamps each rendition with an LQI score ranging from 0 to 100 (100 being the best).
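Per-rendition scores are what make the claimed CDN and transcoding savings possible: if a cheaper rung of the ladder already hits the target score, the pricier rungs can be dropped. A minimal sketch of that selection step, with invented scores for illustration (LightFlow's actual selection logic is not public):

```python
# Sketch of ladder trimming with per-rendition quality scores: keep the
# cheapest rendition whose predicted LQI meets a target. Scores below
# are invented for illustration.
def cheapest_rendition(ladder, target_lqi=80):
    """ladder: list of (bitrate_kbps, predicted_lqi) pairs, any order.
    Returns the lowest-bitrate entry meeting the target; if none
    qualifies, falls back to the highest-scoring entry."""
    qualifying = [r for r in ladder if r[1] >= target_lqi]
    if qualifying:
        return min(qualifying, key=lambda r: r[0])
    return max(ladder, key=lambda r: r[1])

ladder = [(6000, 95), (4500, 90), (3000, 82), (2000, 74)]
print(cheapest_rendition(ladder))  # the 3000 kbps rung already scores 82
```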

Significantly, LightFlow’s technology will support Haivision’s recently launched SRT Hub intelligent cloud media routing platform and the SRT open source initiative as a whole, with Haivision also inheriting the LightFlow team behind the development of the DASH.js implementation with low-latency CMAF (Common Media Application Format) support.

Haivision has been one of the most talked about vendors of 2019, releasing the aforementioned SRT Hub at NAB 2019 to critical acclaim. SRT Hub is a new media routing cloud service enabling low latency live streaming and fast file transfer – using SRT running on Microsoft Azure – by routing media from IoT-connected video encoders to production facilities.

The idea is that streamlined field contribution reduces complexity and shortens time-to-air for breaking news and live events – thereby bringing live video contribution, production and distribution workflows to broadcasters globally through a secure and reliable architecture. Unveiling SRT Hub in the run-up to NAB earlier this year was really a call to arms, with Haivision imploring companies to come forward to test the technology.

SSIMWave might have been under the illusion that it was unrivaled in the perceptual video quality landscape prior to this week, having told Faultline earlier this year that almost every tier 1 North American operator was either evaluating or already trialing its SSIMPlus technology. But now the company will have to figure out how to stop LightFlow riding Haivision’s scale and partnerships straight into its territory.