
9 August 2021

Verizon extends its support for open caching across its networks

Verizon has been a leader in developing technology, with various partners, for caching content close to the edge in order to improve quality of experience. A few years ago, it kicked off advanced trials of technologies for 5G video content, and it has been steadily extending that strategy to cover its Fios fiber network services and multi-network experiences.

The recent additions of video delivery experts Ateme, Broadpeak and Varnish Software to the Streaming Video Alliance’s (SVA) Open Caching project have been welcomed with open arms by high-profile members, including Disney and Verizon, as the alliance continues to encourage outsiders to help plug gaps in the network technology initiative.

In a recent webinar, the SVA welcomed Sanjay Mishra, associate fellow at Verizon, who singled out scale as the most important aspect of open caching. The technology is gaining traction among ISPs because it offers proven methods for leveling the playing field for all video subscribers, so that any consumer within the network footprint receives content with the same QoE.

It’s no wonder Verizon is currently working closely with Disney+ on a trial project that caches popular on-demand titles at the network edge, considering that Verizon’s initial work with open caching started with BAMTech, which is now part of Disney Streaming Services. Open caching is now available for Disney+ subscribers on the Verizon Fios network, using technology from open caching pioneer Qwilt, and feedback from Fios customers streaming Disney+ content has been positive so far, according to Mishra, although he sees a lot more to come from this project.

For Verizon video subscribers with DVR functionality, Mishra wants to extend the open caching environment to work more closely with content delivery networks (CDNs) and content providers so that content can be pre-positioned and pre-loaded onto consumers’ edge storage devices. This would make the DVR use case a lot more powerful, in his view, allowing consumers to record and cache subscription video as well as linear TV, with content already available on the device without any buffering.

This case for edge storage as a powerful enabler of DVR capabilities raises the idea of marrying content pre-positioning with content recommendations, so that suggested titles can be cached automatically during off-peak hours, just one of many future possibilities for open caching.
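As a rough illustration of how that kind of off-peak pre-positioning might be scheduled, the Python sketch below picks recommended titles to push to an edge storage device during a quiet window. Everything in it is an assumption made for illustration: the off-peak hours, the title fields and the storage budget are hypothetical, and this is not Verizon's or the SVA's logic.

```python
import datetime

# Illustrative off-peak window (assumption: 01:00-05:00 local time).
OFF_PEAK_START = datetime.time(1, 0)
OFF_PEAK_END = datetime.time(5, 0)


def in_off_peak(now: datetime.datetime) -> bool:
    """Return True if the current local time falls inside the off-peak window."""
    return OFF_PEAK_START <= now.time() <= OFF_PEAK_END


def plan_prefetch(recommended_titles, cached_titles, free_bytes):
    """Pick recommended titles to pre-position on an edge device, largest-first,
    until the device's free storage budget is exhausted."""
    plan = []
    for title in sorted(recommended_titles, key=lambda t: t["size_bytes"], reverse=True):
        if title["id"] in cached_titles:
            continue  # already on the device
        if title["size_bytes"] <= free_bytes:
            plan.append(title["id"])
            free_bytes -= title["size_bytes"]
    return plan


if __name__ == "__main__":
    recs = [
        {"id": "ep101", "size_bytes": 2_000_000_000},
        {"id": "ep102", "size_bytes": 1_500_000_000},
        {"id": "movie7", "size_bytes": 4_000_000_000},
    ]
    if in_off_peak(datetime.datetime.now()):
        print(plan_prefetch(recs, cached_titles={"ep101"}, free_bytes=5_000_000_000))
```

A real deployment would also weigh recommendation confidence and content expiry windows, but the largest-first packing above is enough to show the shape of the decision.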

In fact, the open caching community is already planning for people to extend and build on top of the existing available architecture, according to Eric Klein, director of content distribution at Disney Streaming Services. Klein explained that, from an application programming interface (API) design standpoint, extensibility is not only being thought about, but already being recommended and defined.

“This is about how to extend to help define undefinable things that each service offers as unique – each technology provider can bring their own things,” said Klein.

And speaking of bringing unique elements to the table, Ateme’s David Tencer, product owner for the NEA-CDN portfolio inherited from Anevia, provided the vendor perspective, noting how the SVA has upped the ante over the past six months in building out its specifications from an API point of view.

This approach is providing a big helping hand by putting a framework in place for companies like Ateme and those in its customer base to easily implement open caching into their own systems, with Tencer describing the current state of open caching as “solid ground to start building real systems”.

Some of Ateme’s network customers are now starting to use open caching APIs in their existing internal CDNs, based on the NEA-CDN technology, to deliver more content for external content providers. This early success with their own, non-open caches is driving ISPs to adopt standardized systems to win more content deals, since interoperability and scale create a better value proposition for networks than hosting isolated caches from individual providers. Tencer said there is a trend of ISPs wanting to become “masters of open caching”.

Meanwhile, Broadpeak’s Guillaume Bichot, principal engineer and head of the exploration team, cited an unnamed but “famous” content provider in the US as a great caching prospect, while the French CDN specialist has been working with Telefónica for two years now, using a concept and mechanism promoted by open caching. Orange is also exploring open caching with Broadpeak, but then Orange is open to most things.

“If someone wants to start using open caching tomorrow, they have to first open the implementation guideline document, which shows a set of simple scenarios, with pointers to different API documentation as follow up documents. Here you will find two big interfaces – the configuration interface, and the footprint and capacity interface – and if you understand these two then you can start coding the scenario,” explained an enthusiastic Bichot.
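For a flavour of what that coding might look like, here is a minimal Python sketch of a client touching the two interfaces Bichot describes: one call reads footprint and capacity information, the other pushes a content provider configuration. The base URL, endpoint paths and JSON field names are assumptions made for this sketch, not the actual SVA schemas, which live in the API documents he points to.

```python
import json
import urllib.request

# Placeholder base URL for an open caching network's API (assumption).
OC_BASE = "https://open-cache.example-isp.net/api"


def get_footprint_and_capacity():
    """Ask the open caching network which footprints it covers and how much
    capacity it currently advertises (footprint and capacity interface)."""
    with urllib.request.urlopen(f"{OC_BASE}/footprint-capacity") as resp:
        return json.load(resp)


def push_configuration(provider_id, origin_host, cache_policy):
    """Register a content provider's delivery configuration with the open
    caching network (configuration interface)."""
    body = json.dumps({
        "providerId": provider_id,
        "origin": origin_host,
        "cachePolicy": cache_policy,  # e.g. TTLs, cacheable path prefixes
    }).encode()
    req = urllib.request.Request(
        f"{OC_BASE}/configuration",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    caps = get_footprint_and_capacity()
    print("advertised footprints:", caps.get("footprints"))
    print("config accepted:", push_configuration(
        "example-svod", "origin.example-svod.com",
        {"ttlSeconds": 3600, "pathPrefixes": ["/vod/"]},
    ))
```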

Broadpeak has since demonstrated a set of open APIs that standardize the interface between an ISP’s hosted local cache infrastructure and content providers, as well as the party operating it.

Building on Verizon’s edge device comments, Disney Streaming Services’ Klein name-dropped the edge storage sub-group within the SVA’s Open Caching initiative, which is tasked with exploring the extension of open caching into the home. He views this as just another layer on top of the local node within an ISP network, used to delegate caching further down. This makes it sound simple, but the group has encountered challenges with how to signal for delegation: figuring out the right way to signal between two layers in a hierarchical chain.
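To make the delegation question concrete, the sketch below walks a hypothetical two-layer chain, an ISP open caching node with an in-home storage device beneath it, and decides which node should serve a request. The node fields, utilization threshold and decision rule are all assumptions for illustration; they are not the signalling mechanism the sub-group is still working out.

```python
from dataclasses import dataclass


@dataclass
class CacheNode:
    """A node in a hierarchical open caching chain (ISP node or in-home device).
    Fields and thresholds are illustrative assumptions, not SVA-defined."""
    name: str
    has_object: bool          # is the requested title already stored here?
    utilization: float        # 0.0-1.0 share of serving capacity in use
    downstream: "CacheNode | None" = None


def choose_serving_node(node: CacheNode, max_utilization: float = 0.8) -> CacheNode:
    """Walk down the chain and delegate to the deepest node that both holds the
    object and still has headroom; otherwise serve from the current layer."""
    best = node
    child = node.downstream
    while child is not None:
        if child.has_object and child.utilization < max_utilization:
            best = child
        child = child.downstream
    return best


if __name__ == "__main__":
    home_device = CacheNode("in-home edge storage", has_object=True, utilization=0.2)
    isp_node = CacheNode("ISP open caching node", has_object=True,
                         utilization=0.6, downstream=home_device)
    print("serve from:", choose_serving_node(isp_node).name)
```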

With the crushing demands placed on networks during the pandemic, the SVA has made adjustments over the last year to reduce the heavy lifting involved in deploying open caching, delivering a set of clear APIs along with an explanation of how to actually implement them.

Looking ahead, the project is busy finalizing the first version of its configuration API, which will allow publishers and CDNs to configure open caching networks and define specifications. A capacity element inside the interface will let open caching networks communicate current capacity upstream, giving both sides a shared understanding of the current state of the networks and the ability to delegate additional traffic. The specification is going out for review fairly soon and should be published by the end of the year, with members already implementing it.
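As a back-of-the-envelope illustration of what that capacity signal enables, the snippet below estimates how much extra traffic an upstream publisher or CDN might delegate, given a downstream open caching network's advertised load and ceiling. The throughput figures, headroom factor and formula are assumptions for the sketch, not part of the specification.

```python
def delegable_gbps(current_gbps: float, max_gbps: float, headroom: float = 0.9) -> float:
    """Given a downstream open caching network's advertised current load and
    ceiling, estimate how much additional traffic could be delegated to it
    while leaving a safety margin (values are illustrative)."""
    return max(0.0, max_gbps * headroom - current_gbps)


# Example: a network advertising 40 Gbps in use out of a 100 Gbps ceiling
# could take roughly 50 Gbps more with a 10% safety margin.
print(delegable_gbps(current_gbps=40.0, max_gbps=100.0))
```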

It wouldn’t be an open caching party without Qwilt, which has risen to prominence through its work with Cisco, nestling up to the network equipment heavyweight to get its technology as close to the ISP network as possible.

Yoav Gressel, VP of R&D at Qwilt, noted: “Covid showed us networks can become highly congested. Once you have a system deep in the network that overcomes that, to provide consistent experiences to customers, this allows service providers to partner with OTT video players, which is beneficial to all players within the ecosystem.”

Qwilt is an interesting case as it adheres to all the open caching specs, but licenses its own proprietary open caching software deep within service provider networks, and the company is aligning more closely with CDNs, as Gressel sees potential similarities between open caching and CDN APIs.

Interestingly, Swedish CDN specialist Varnish Software received plenty of mentions during the webinar session as the newest member of the Open Caching project. There is big hope among founding members that Varnish will be the one to pick up the torch and build plug-in code so that Varnish customers can easily turn on open caching support on boxes running the Varnish reverse HTTP proxy, which sits between the client and the server and is where caching can be carried out.
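For readers unfamiliar with the role a reverse proxy plays, the toy Python server below sits between a client and an origin and answers repeat requests from its own cache, which is the behaviour Varnish provides at production scale. It is purely illustrative: the origin URL and port are placeholders, and this is neither Varnish code nor the hoped-for open caching plug-in.

```python
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ORIGIN = "http://example.com"   # assumed upstream origin server
CACHE: dict[str, bytes] = {}    # request path -> cached response body


class CachingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        body = CACHE.get(self.path)
        if body is None:
            # Cache miss: fetch from the origin and keep a copy for next time.
            with urllib.request.urlopen(ORIGIN + self.path) as resp:
                body = resp.read()
            CACHE[self.path] = body
        # Cache hit or freshly fetched: answer the client directly.
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), CachingProxy).serve_forever()
```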