Harmonic EyeQ perception filter crushes mobile HD to 1.5 Mbps

It’s as if everyone at NAB sat down a year ago and decided to introduce their low-latency streaming options at this year’s show. We just got through talking about Haivision and Wowza partnering on lower-latency streaming, and how Akamai has improved its own Media Services Live to do the same, and now Harmonic is talking about how it plans to cut down latency in streaming.

At the NAB Show, Harmonic says it will showcase a real-time streaming workflow that matches the latency of live broadcast. Well, live broadcast latency is about seven seconds or so, so presumably this is about the same.

Harmonic says that it is able to do this simply by using CMAF packaging (the Common Media Application Format) and HTTP chunked transfer encoding. CMAF creates content that can be played back by both HLS and MPEG-DASH clients, allowing a single packager output. Chunked transfer encoding is a well-established way of dispensing with the HTTP Content-Length header altogether, sending the body as a series of length-prefixed chunks, so that live content of unknown length can be delivered as it is produced; otherwise you would need to know how long it is before you began sending.
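
The chunk framing itself is simple enough to sketch. The hypothetical Python snippet below shows the mechanics: an fMP4/CMAF segment is pushed to a client as a series of HTTP/1.1 chunks, with a Transfer-Encoding: chunked header in place of a Content-Length. The file name, chunk size and socket handling are all assumptions for illustration, not Harmonic’s code.

```python
import socket

CHUNK_SIZE = 16 * 1024  # illustrative; a real packager chunks on CMAF fragment boundaries

def send_chunk(conn: socket.socket, data: bytes) -> None:
    """Frame one HTTP/1.1 chunk: hex length, CRLF, payload, CRLF."""
    conn.sendall(f"{len(data):X}\r\n".encode() + data + b"\r\n")

def stream_cmaf_segment(conn: socket.socket, path: str) -> None:
    """Send an fMP4 segment over an open connection using chunked transfer encoding."""
    # No Content-Length header: the server can start sending before it
    # knows how long the finished segment will be.
    conn.sendall(
        b"HTTP/1.1 200 OK\r\n"
        b"Content-Type: video/mp4\r\n"
        b"Transfer-Encoding: chunked\r\n"
        b"\r\n"
    )
    with open(path, "rb") as segment:
        while True:
            data = segment.read(CHUNK_SIZE)
            if not data:
                break
            send_chunk(conn, data)
    # A zero-length chunk tells the player the segment is complete.
    conn.sendall(b"0\r\n\r\n")
```

The latency saving comes from exactly this: a low-latency packager pushes each CMAF chunk the moment the encoder closes it, so the player can begin decoding a segment long before that segment has finished being produced, rather than waiting for whole multi-second segments to land.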

Use of CMAF became possible in June last year after Apple said HLS would support the fragmented MP4 (fMP4) container, the same container CMAF uses, and pretty much everyone is now lining up to drop it into streaming platforms everywhere.

Harmonic is also showing off a genuine compression gain which it calls EyeQ, a form of perception filter: a set of algorithms that reduce the bitrate needed for video encoding based on the limits of what the human eye can actually perceive. Now Israel’s Beamr has been pushing this idea for four years, and was in stealth before that, but Harmonic says it has developed its version in parallel and has applied for its own patents.

The extra 50% compression this gives means that HD streams can be squeezed to under 1.5 Mbps, which Harmonic claims comes in under the data ceiling of US zero-rated mobile services and will also suit xDSL networks, meaning far better quality video can be sent over both. Rival V-Nova has recently claimed that it can produce fully monetizable HD for mobile at 1 Mbps and HD for DSL at 2 Mbps; on the other hand, V-Nova also says it can get a video stream to a phone in just 300 kbps, so it’s hard to compare the two without sitting down and looking at them side by side.

But suffice it to say that this is perhaps the first time that Harmonic has ever been close to the leading edge of what codecs are capable of for phones.
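
Harmonic has not published how EyeQ decides what the eye will and will not notice, so the sketch below is emphatically not its method; it is only the general shape of the idea, a perception-driven encoding loop that re-encodes a clip at progressively lower bitrates and keeps the cheapest rendition that still clears a perceptual quality floor. The ffmpeg commands are real, but the bitrate ladder, the SSIM threshold and the file names are assumptions for illustration.

```python
import re
import subprocess

# Illustrative values only: a production perception filter works frame by frame
# inside the encoder rather than brute-force re-encoding whole clips.
LADDER_KBPS = [3000, 2500, 2000, 1500, 1200]
SSIM_FLOOR = 0.95  # assumed "viewers won't notice the difference" threshold

def encode(src: str, dst: str, kbps: int) -> None:
    """Encode the source at a fixed video bitrate with x264, dropping audio."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx264",
         "-b:v", f"{kbps}k", "-an", dst],
        check=True, capture_output=True)

def mean_ssim(encoded: str, reference: str) -> float:
    """Score the encode against the reference using ffmpeg's ssim filter."""
    result = subprocess.run(
        ["ffmpeg", "-i", encoded, "-i", reference,
         "-lavfi", "ssim", "-f", "null", "-"],
        capture_output=True, text=True)
    match = re.search(r"All:(\d+\.\d+)", result.stderr)
    return float(match.group(1)) if match else 0.0

def cheapest_acceptable(src: str) -> int:
    """Walk down the ladder and return the lowest bitrate that still looks the same."""
    best = LADDER_KBPS[0]  # fall back to the top rung if nothing lower passes
    for kbps in LADDER_KBPS:
        out = f"probe_{kbps}.mp4"
        encode(src, out, kbps)
        if mean_ssim(out, src) >= SSIM_FLOOR:
            best = kbps  # still perceptually fine, try lower
        else:
            break
    return best

if __name__ == "__main__":
    print("Lowest acceptable bitrate:", cheapest_acceptable("source_hd.mp4"), "kbps")
```

The only point of the sketch is that “good enough for the eye” is a measurable target rather than a fixed bitrate; Harmonic and Beamr both claim to make that judgment with far finer-grained perceptual models running inside the encoder itself.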

Another area where Harmonic has been vocal of late is in encoding signals for 360-degree video, and at NAB it will also team up with Nokia and Ideal Systems to tell us how it provided a system to PCCW in Hong Kong to deliver an immersive VR experience at the 2017 Cathay Pacific HSBC Hong Kong Sevens rugby tournament.

Ideal Systems is an Asian broadcast systems integrator, and Nokia naturally provided its OZO cameras. The footage was compressed on an Electra VS software-only encoding system, which came out of the ViBE series from Thomson Video Networks, the business Harmonic acquired last year.

And finally, Harmonic said at NAB that it has extended its VOS cloud capabilities to include support for graphics, branding and digital video effects for both file-based and live workflows. It launched the VOS initiative in 2014 with encoding only.

Bart Spriester, senior VP of video products at Harmonic, said: “Bringing the playout capabilities found in our Spectrum media servers to the VOS Cloud opens up additional cost savings for operators, speeding up their operations and enabling additional types of monetization, such as brand reinforcement, pop-up channels, rapid platform deployment and expansion, graphic avails and more. As two of the industry’s first cloud-native solutions, the VOS offerings are making a huge impact on the way in which operators create and deliver video.”
