Verizon’s promise to revolutionize extended reality (XR) markets after completing tests of a GPU-based orchestration system was taken as gospel this week. The reality of the broader XR situation, however, is one of deep-seated unrest in a perpetually underperforming industry.
VR turbulence was underscored recently by Oculus CTO John Carmack openly admitting that the days were numbered for the Facebook-owned company’s smartphone-powered Gear VR headset, while only this week the BBC suffered a major blow as the UK public broadcaster was forced to shutter its VR Hub less than two years after the project was established. The BBC VR Hub joins Nokia’s Ozo and Magic Leap among the most infamous XR flops of the past two years (although Ozo reemerged as a pure 360-video software firm in early 2018, while Magic Leap has taken its billions of dollars in funding outside the digital entertainment ecosystem).
Could Verizon’s latest breakthrough combining 5G network and XR capabilities really provide some sort of cushion for VR and AR applications, preventing further casualties and more importantly establishing a solid platform for future success stories? All signs point to no – but strangely we found ourselves believing the US operator’s claims.
Ultimately, the plan is to empower app developers. Verizon wants to give devs a ride on its Intelligent Edge Network, a project launched some two and a half years ago designed to support Verizon’s 5G small cells deployment and existing 4G LTE network. Start with the developer community and the rest will follow, is how we interpret Verizon’s message.
So, Verizon’s 5G Lab has created eight specific services designed to equip developers with tools to create applications for use on its 5G edge technology. These are 2D computer vision, XR lighting, split rendering, real-time ray tracing, spatial audio, 3D computer vision, real-time transcoding, and, finally, asset caching.
What Verizon has set out to solve is the heavy imaging and graphics processing requirement of XR applications, which would benefit from the eight services listed above being shifted to the network edge. These workloads run significantly better on a GPU, it says, but are hindered by GPUs’ limited resource management capabilities – a single GPU cannot easily be shared across multiple services. Verizon therefore developed a prototype using GPU slicing and virtualization management, designed to support any GPU-based service.
Live network trials using the newly developed GPU orchestration technology in combination with edge services have proven successful. “In one test for computer vision as a service, this new orchestration allowed for eight times the number of concurrent users, and using a graphics gaming service, it allowed for over 80 times the number of concurrent users,” states the announcement.
Of the eight services created, a couple stand out. Real-time transcoding is a familiar term at Faultline for saving time and optimizing workflows, while real-time ray tracing is a less well-known technique. The latter takes traditional 3D scenes or objects and uses complex custom algorithms to calculate how light will fall and how the resulting colors will look. Verizon says that with real-time ray tracing capabilities, each pixel can receive accurate light coloring – greatly advancing the realism of 3D images.
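The per-pixel idea is easy to see in miniature: cast one ray per pixel, find where it hits the scene, and color that pixel from the angle between the surface and the light. The following is a minimal self-contained sketch – a single sphere, one light, Lambertian shading – with every scene value made up for illustration; production ray tracers add reflections, shadows, and many bounces per pixel:

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # quadratic 'a' is 1: direction is unit length
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(width=4, height=4):
    center, radius = (0.0, 0.0, 3.0), 1.2   # one sphere in front of the camera
    light = (0.0, 0.0, -1.0)                # light shining back at the camera
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a ray direction through an image plane at z = 1
            px = (x + 0.5) / width * 2 - 1
            py = (y + 0.5) / height * 2 - 1
            norm = math.sqrt(px * px + py * py + 1)
            d = (px / norm, py / norm, 1 / norm)
            t = ray_sphere_t((0.0, 0.0, 0.0), d, center, radius)
            if t is None:
                row.append(0.0)  # ray missed: background stays black
            else:
                hit = tuple(t * di for di in d)
                n = tuple((h - c) / radius for h, c in zip(hit, center))
                # Lambertian shading: brightness from the light/normal angle
                row.append(max(0.0, sum(a * b for a, b in zip(n, light))))
        image.append(row)
    return image

img = shade()  # per-pixel brightness values in [0, 1]
```

Even this toy version does an intersection test and a shading calculation for every pixel, which is why doing it at full resolution and frame rate is the kind of workload Verizon wants to push onto edge GPUs.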
Split rendering is another neat technique. It delivers PC/console-level graphics to any mobile device by breaking up the graphics processing workload of games, 3D models or other complex graphics, pushing the most demanding calculations onto the server while the lighter calculations remain on the device.
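Verizon has not described how its split renderer decides what runs where, but the partitioning step can be sketched as a simple cost-based split. The threshold, task names, and cost estimates below are all invented for illustration:

```python
# Hypothetical sketch of split rendering's partitioning step: tag each
# piece of per-frame work with a rough cost and send the expensive work
# to an edge server. Not Verizon's implementation.
COST_THRESHOLD_MS = 5.0  # anything slower than this goes to the edge

def split_workload(tasks):
    """Partition (name, estimated_ms) tasks into server and device queues."""
    server, device = [], []
    for name, cost_ms in tasks:
        (server if cost_ms > COST_THRESHOLD_MS else device).append(name)
    return server, device

frame_tasks = [
    ("global-illumination", 22.0),  # heavy: ray-traced lighting pass
    ("shadow-maps", 9.5),           # heavy: offload to the edge server
    ("ui-overlay", 0.8),            # light: stays on the handset
    ("head-pose-reproject", 1.5),   # light: latency-critical, stays local
]

server_q, device_q = split_workload(frame_tasks)
# server_q == ["global-illumination", "shadow-maps"]
# device_q == ["ui-overlay", "head-pose-reproject"]
```

A real system would also weigh latency sensitivity – head-pose reprojection must stay on-device regardless of cost, or motion-to-photon lag makes the headset unusable – which is one reason 5G's low round-trip times matter for the server-side half.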
On top of the obvious network technology clout, Verizon recently acquired a bunch of VR assets – picking up the remains of Jaunt XR earlier this month, a year after the start-up gave up the ghost. From what we gather, the assets mostly comprise AR software, but we suspect Jaunt held onto some VR assets (after shifting focus from VR to AR), such as its studio-grade cameras costing $100,000 a pop, along with a suite of VR software and a distribution platform.
As Faultline highlighted at the time, the deal was made with an eye to improving Verizon’s existing XR arm, Envrmnt. This division forms part of Verizon’s 5G Lab, focused on providing mobile XR applications for autonomous vehicles, smart cities and volumetric VR.
Critical to Verizon’s 5G-focused Intelligent Edge Network is the fiber network backbone. Verizon has laid thousands of miles of fiber this year, including 4,200 miles across 60 cities during the second quarter alone – an increase from the 3,000 miles laid in Q1.
“Creating a scalable, low cost, edge-based GPU compute capability is the gateway to the production of inexpensive, highly powerful mobile devices,” said Nicki Palmer, Chief Product Development Officer at Verizon. “This new innovation will lead to more cost effective and user friendly mobile mixed reality devices and open up a new world of possibilities for developers, consumers and enterprises.”
Once delivering XR applications over networks efficiently becomes a reality, only then does the industry stand a chance of breaking into the mainstream. Verizon’s progress is a rare positive and the onus is now on developers to get the ball rolling.