Dimension has teamed up with DNEG, Unreal Engine, ARRI, Mo-Sys, 80six, ROE, Brompton Technologies and Malcolm Ryan to explore using LED stages and real-time engines for virtual production. This test, featuring live actors and a creative team, builds on the work Dimension is already doing in virtual production using Unreal Engine.
This technology was previously used on Disney's The Mandalorian, where it replaced many traditional location shoots.
Virtual production technology not only improves on-set safety for crews but also gives filmmakers greater flexibility, and it can be a highly efficient solution for productions looking to restart as soon as it is safe to do so. LED stages running complex photo-real scenes in real-time game engines allow us to create vast sets with fully adjustable lighting and animated effects, with the added benefit that multiple environments can be shot on a single LED stage.
DNEG Co-Founder and Creative Director Paul Franklin recently directed a shoot with Dimension Studio, demonstrating how virtual production technology is revolutionizing the way we approach filmmaking. From framing and blocking to lighting, environments, and FX, creative decisions can be made in real-time to meet the needs of the production.
“It gives you the best of both worlds,” said Paul Franklin. “The flexibility of a visual effects process but with the immediate realism of actual photography.”
Here’s a behind-the-scenes look at the test shoot, with further information from Paul, the team at Dimension, and the other companies involved in the shoot:
Actors are filmed performing in the space created by the surrounding screens. This technology eliminates the need for a green screen that must be replaced in post-production, as well as for rear projection.
Teams are able to join remotely from anywhere in the world, viewing feeds from multiple cameras. In a world where production has been heavily affected, these new ways for the industry to work are driving both efficiency and brand new possibilities.
The results of this creative test will be shared in the coming weeks, when we will also explore the transmedia possibilities offered by combining virtual production and volumetric capture.