3 September 2020

Intel Studios unveils budget-slashing volumetric production stage

Intel has taken the wraps off a very out-there production technology that could help studios slash their content costs – one of the most pressing issues facing the emerging SVoD sector. With competition between services increasing, the monthly cost will quickly become the main differentiator.

To this end, Disney may well be forced to raise its Disney+ fees, or Netflix might feel more pressure to bring its monthly charges closer to Disney. The new entrants already know that Netflix is the price ceiling, and with many homes already having two or three SVoD subscriptions, the chance for a new service to break through is fading.

Similarly, the incumbent SVoD providers want to improve profit margins, and the time will come when new subscriber growth can no longer be relied on to prop up investor valuations. Once that pivot point is reached, content costs become the primary concern, and any percentage you can knock off here flows almost directly through to the gross margin.

Given the lower production costs overseas, and the award-winning US titles that make do with far smaller budgets, Hollywood needs to find a way to make its blockbusters on the cheap. In the past, CGI was a salve, allowing Hollywood to increase the spectacle without increasing the budgets for extras and set construction.

However, CGI hasn’t brought costs down far enough. The human workload in churning out the latest superhero ensemble is still colossal, meaning that these films are still too expensive, and that’s before you get to the alleged abuses of the thousands who work in these render farms.

This is why new production technologies are of particular interest to Faultline. They offer a way to improve the dollar-per-spectacle ratio, allowing Hollywood to bring its costs down without losing the quality of its output. For smaller movies, such technologies offer a way to approach the headiest heights of Hollywood productions without the financial resources traditionally required.

Of course, there’s a risk that these volumetric capture capabilities just descend into the realm of 3D movies, but it seems that innovative directors are going to be able to capture scenes ‘in real life’ that would be very expensive to produce using conventional CGI techniques.

Essentially, Intel has created a stage that has a complete 360-degree greenscreen backing, with an overhead array of lights and cameras. The 10,000 square-foot geodesic dome houses 96 separate 5K cameras, which then feed their 2D inputs into a software suite that lets directors manipulate those shots into what Intel calls a 3D virtual environment.

The processing demands for this are immense. Each of these 5K cameras captures nearly 15 million pixels per frame, meaning that each frame of footage from the 96-camera array comprises just over 1.4 billion pixels.

At 24 frames per second, this is just under 34 billion pixels per second of film. However, because VR requires much faster framerates, of between 90 and 120 fps to avoid the feeling of motion sickness, Intel’s software needs to be able to process just under 170 billion individual pixels per second of VR experience.
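As a quick sanity check on those figures, the following Python sketch reproduces the arithmetic, assuming a 5K frame of 5120 x 2880 pixels (Intel has not confirmed the exact sensor resolution, so the totals are illustrative rather than official).

# Rough throughput check for the 96-camera volumetric array.
# Assumes a 5K frame of 5120 x 2880 pixels; the real sensor format is undisclosed.
PIXELS_PER_FRAME = 5120 * 2880      # ~14.7 million pixels per camera frame
CAMERAS = 96

pixels_per_capture = PIXELS_PER_FRAME * CAMERAS   # ~1.4 billion pixels per synchronised frame

for fps in (24, 120):               # film frame rate vs the upper end of VR frame rates
    print(f"{fps} fps: {pixels_per_capture * fps / 1e9:.1f} billion pixels per second")

# Prints roughly 34.0 billion at 24 fps and 169.9 billion at 120 fps,
# in line with the figures quoted above.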

Technical details are still extremely scant, and Intel Studios is playing its cards very close to its chest. Of course, racks of Intel’s Xeon processors are going to be in the mix, but we have enquired about the software being layered on top. Intel has made a number of hardware-based acquisitions in the past few years that could be applied here, but it would be foolish to presume that Intel Studios would have been a primary driving factor behind any of them.

As an indication of the secrecy at play, not even the cameras have been disclosed, but earlier demonstrations also talked of an array of 100 8K cameras. Similarly, the tools used to construct backgrounds and props from the camera point-cloud captures are in-house, and the compression and rendering that can output to VR and 2D screens are tightly under wraps.

Intel Studios premiered two films produced using its volumetric capture technologies: Queerskins: ARK, made with Cloudred, and HERE, made with 59 Productions. The movies were unveiled at the Venice International Film Festival, and demonstrate the capabilities of the technology.

The first is a six-degrees-of-freedom (6DoF) VR experience, which tells a story of a mother imagining her deceased son’s potential life. It promises to let viewers move around a large space, while watching these imagined experiences, entering the mother’s imagination and ‘co-creating and controlling the experience through movement.’

The second feature, HERE, is another VR title, which is an adaptation of a graphic novel by Richard McGuire. The conceit is that the viewer remains in one place, and then experiences the plot through the passage of time – with different scenes overlaying the same physical space. In this case, this means remaining within the same room in a house, while watching plots weave in and out of time.

So, while both are meant to be viewed on VR headsets, we can see how this production technology could be used to create video for conventional TV or mobile screens. Capturing the movements of physical actors inside physical sets, and then manipulating the backgrounds and vantage points at will, could create incredibly immersive experiences without the level of post-production work normally needed to remove all trace of the film crews that would otherwise get in the way of the shot.

To some extent, the technique allows directors to capture the intimacy of a theatre production with the blockbuster scope that CGI brings to the screen. It is a hybrid approach, and one only made possible by the gargantuan leaps in processing power and camera technology over the past decade. While we are wary that its association with VR could lead to very gimmicky video, the adaptability for 2D screens remains a very intriguing proposition.

As it stands, Intel Studios has a sharp focus on new viewing experiences, but we expect this to change over time. For now, it is exploring what is technically possible with the new approach, and it has produced shows with musical comedian Reggie Watts and K-pop group NCT, a reproduction of ‘You’re the one that I want’ from Grease, and an eight-episode series with Emmy-winning sports producer John Brenkus, which examines elite athletic performance and stories in AR.

The ability to endlessly revisit captured footage is going to be a big draw for directors. For those on the marketing side of things, being able to edit shots to best fit the advertising medium they are carried on is another added benefit. Being able to offer behind-the-scenes promotional material, or added extras, would be a very powerful sales vehicle.

“Using the groundbreaking volumetric capture and production abilities of Intel Studios, whether it be through the unique movement-based experiences of Queerskins: ARK via six degrees of freedom or the ability to layer scenes from various time periods on top of one another in HERE, we are ushering in a new age of content creation and immersive experiences,” said Diego Prilusky, head of Intel Studios.