
Getting Real with Virtual Production

How often have we as Cinematographers felt disconnected from the final image when VFX is involved? Having shot the scene against green screen, we wait many months to see the final composite with a general sense of unease and no sense of control, and when we finally do, we almost always feel a little short-changed. Get a bunch of D.O.Ps together over a drink and it will soon turn into a crib fest about how long VFX took on their films, how they had no time during DI to fix anything, how they lit for day and it now looks like night, how the scale is off, the realism, perspective, depth, all off.. off.. off…


It is traditional industry practice for graphics-heavy films, scenes that require artificial environments, set extensions, historical/mythological stories, fantasy tales and heavy-duty action sequences, to use green screen cinematography and VFX to achieve the desired output. Principal photography happens against green screen, with lighting to match the background plates, markers on the screen for tracking moving shots, motion control cameras for exact, measured, repeatable movements, motion capture for animation and so on. The VFX team work with the H.O.Ds of the production (Director, D.O.P, Production Designer/Art Director) at the scene’s pre-vis stage, closely monitor the actual shoot, and after principal photography start work on the captured footage back in their studios to generate the final composite frames as visualised by the team. This is usually time- and labour-intensive, with several VFX artists working on a single frame day and night, a process the Cinematographer is not involved with till the final stage of delivery.


A few years back, the arrival of Virtual Production offered an intuitive technological alternative to green screen shooting, especially for CGI-heavy films. Covid also nudged the technology forward, as people were keen to shoot in controlled ‘safe spaces’ and not travel.


Virtual production, by definition, uses technology to join the digital world with the physical world in real time. Actors perform in front of 3D backgrounds displayed on LED walls inside the studio instead of green screens, negating the need for compositing: what you see is what you get. The LED walls, or arenas of walls known as Volumes, display spaces and environments created using world capture, which is the use of photography, video and other references to translate real-world spaces into digital assets. Digital assets are essentially environments created with graphics: 3D renders made using background plates, both photographs and videos shot from all angles in real locations, or virtual sets created digitally by the VAD (Virtual Art Department), led by the vision of the Production Designer. The VAD are the backbone of asset creation. Digital assets also include models and characters created digitally for virtual production.


Software used to create assets varies between production houses. Unreal Engine, which has been driving real-time imagery in gaming and has the capacity to create sets in real time, has expanded its horizon from gaming to cinema and is one of the key players in Virtual Production.



LED walls are key to the ecosystem of the VP studio. They are the building blocks, stacked to scale as per the focal length and coverage of the lenses on camera and the extent of camera movement planned for a shot. These LED screens, onto which the background assets are displayed, are fine-pitch panels that promise superior image quality, with resolution good enough to avoid pixelation and moiré. LED screens are also used as ceilings to help create ambient lighting and reflections. LED screen environments can range from a single wall to an arc or a complete dome/volume, as per the production’s requirement. It is important to know the pixel pitch of the LED walls being employed (pixel pitch is the distance between individual LEDs on the screen, measured in millimetres) as it correlates to how good the image looks; the recommended pixel pitch is between 1.5 and 3mm.
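
To get an intuition for what those numbers mean on the floor, a widely quoted rule of thumb (an industry heuristic, not a formal standard) is that the camera should stay at least roughly one to three metres away per millimetre of pixel pitch before individual pixels risk resolving. A minimal sketch, assuming that heuristic:

```python
# A back-of-envelope helper based on a common industry heuristic (not a
# formal standard): minimum camera-to-wall distance in metres is roughly
# the pixel pitch in millimetres times a factor of ~1-3. Real thresholds
# depend on the lens, sensor resolution and the panel itself.
def min_camera_distance_m(pixel_pitch_mm: float, factor: float = 2.0) -> float:
    return pixel_pitch_mm * factor

for pitch_mm in (1.5, 2.6, 3.0):  # pitches within the recommended 1.5-3mm band
    print(f"{pitch_mm}mm pitch -> keep the camera roughly "
          f"{min_camera_distance_m(pitch_mm):.1f}m or more from the wall")
```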



Inside a VP studio the crew can interact with the digital process in the same way they interact with a live-action production. The scene is planned, prepped, blocked and recorded, and the Director and D.O.P see what they are getting as they shoot, with minimal to no post production involved. D.O.Ps are in control of the lighting, with the interactive ability to match foreground and background to help create the exact shot they had visualised. The actors too get to interact with the environments; they get a real feel for the space instead of emoting in front of a green screen, which makes their job a lot more symbiotic and immersive. If they are running for their lives from a dinosaur, they can see it palpably and don’t have to imagine it. They get to react and act simultaneously.


The whole process, now termed ICVFX (in-camera VFX), is pushing industry giants like ARRI to facilitate the turn technology has taken. ARRI unveiled its virtual production stage in Uxbridge, London this year, touted as Europe's largest fixed virtual production facility.



Virtual Production is not new; it is simply technological evolution. It takes its roots from rear and front projection, which are decades old. All of us have seen that convertible with actors pretending to drive in front of a flickering screen with a road projected on it; just outside the frame are those big fans creating a breeze in their hair and a few men shaking the car to give the illusion of movement. It also harks back further to matte paintings and travelling-matte composite photography.


Barsaat, 1949
2001: A Space Odyssey, 1968

Virtual production uses this very same principle of displaying pre-recorded material in the background to create the illusion of reality; it is just more interactive and advanced, negating parallax and perspective problems, and more intuitive than green screen photography. The actual merging of foreground and background in camera, in real time, creates an authentic image, which for a Cinematographer is extremely useful. If the creative team on the floor doesn’t like what they are seeing through the camera, they can change it on the spot instead of having to come back and reshoot or rework it in post. Virtual Production also offers set extensions in real time, visualisation capabilities from end to end, and performance capture to create CG characters.



The technology has notably been used to create entire 3D worlds in the series ‘The Mandalorian’ and ‘1899’. Shooting entire episodes and seasons on floors, without ever stepping out, gave these productions a complete solution: pre-vis, tech-vis, post-vis, set development and creation, shooting, graphics and a near-final image on a daily basis.


“The Holy Grail of visual effects - and a necessity for The Mandalorian, according to co-cinematographer and co-producer Greig Fraser, ASC, ACS - was the ability to do real-time, in-camera compositing on set.” Fraser also says, “The volume is a difficult technology to understand until you stand there in front of it, and move the camera around. It’s hard to grasp. It’s not really rear projection; it’s not a TransLite because [it is a real-time, interactive image with 3D objects] and has the proper parallax; and it’s photo-real, not animated, but it is generated through a gaming engine.” (American Cinematographer, Feb 2020)


Inset - On location - '1899': The story takes place entirely at sea, and virtual production enabled the team to shoot entirely on set, with the sea becoming a living, breathing character in the frames.

There are some key things to consider when creating these worlds on LED walls. For the virtual environments to look completely believable, the following capture techniques are employed to ensure realism.


Photogrammetry – taking photographs of real-world objects, textures and locations so they can be reconstructed as 3D objects in the backgrounds.


Scanning – taking thousands of photographs of a location from different viewpoints and using them in the 3D render of the environment.


LIDAR – a survey method that illuminates a target with laser pulses and measures the reflected light via infrared sensors to derive a point cloud, which is used as part of asset creation to capture real-world locations (a minimal sketch of this step follows below).
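
To make that point-cloud step concrete, here is a minimal sketch of turning a scan into a mesh a VAD could refine further, using the open-source Open3D library; the file names are hypothetical placeholders, and real pipelines involve far more clean-up, retopology and texturing:

```python
# Minimal point-cloud-to-mesh sketch with the open-source Open3D library.
# "scan.ply" and "location_asset.obj" are hypothetical file names; real
# asset pipelines add heavy clean-up on top of a raw reconstruction.
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan.ply")   # load the LIDAR point cloud
pcd.estimate_normals()                      # normals are needed for meshing
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9)                           # Poisson surface reconstruction
o3d.io.write_triangle_mesh("location_asset.obj", mesh)
```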


For ‘The Mandalorian’ the supervisors travelled far and wide to shoot elements to create the Star Wars landscapes. The footage was shot primarily in overcast conditions so the backgrounds remained neutral, without too much light and shadow. DSLRs like the Canon EOS 5D Mark IV are commonly used for this capture.



The Virtual Art Department (VAD) work on the creation of environments and worlds once they have all the data required. They also light the spaces using lighting simulation software, including LUTs that they can share as references with the D.O.P. The D.O.P works with the VAD prior to shooting by being part of the asset creation, giving inputs on how he or she wants the lighting to be. Inside the studio during the shoot, when the need arises for the D.O.P to control the background lighting of the assets, the VAD act as virtual gaffers, moving lights and dimming or brightening them within the virtual space as required.


Noah Kadner, Virtual Production editor at American Cinematographer, recommends Digital Multiplex (DMX) lighting and pixel mapping for interesting and creative lighting possibilities in virtual productions. DMX can program specific lighting cues and colours through a lighting board, which offers a significant amount of utility, and pixel-mapping software can set any light on the set to mimic the colour and intensity of a portion of the 3D scene or asset.

“For example, if you’re doing a driving scene, you could set up pixel mapping to sample a portion of your background plate footage and connect it to lights above and to the sides of your picture vehicle,” he says. “You can mimic passing car headlights, street lamps, tail lights, you name it.”
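
As an illustration of the idea (a minimal sketch, not Kadner’s actual rig): sample a region of the plate each frame, average its colour, and push the result out as 8-bit DMX levels. The plate file name and the send_dmx stub below are hypothetical stand-ins for a real Art-Net/sACN sender.

```python
# Pixel-mapping sketch: average a region of background-plate footage per
# frame and convert it to DMX channel levels for an RGB fixture.
# "plate_footage.mov" and send_dmx() are hypothetical placeholders.
import cv2
import numpy as np

def region_to_dmx(frame: np.ndarray, x: int, y: int, w: int, h: int) -> list[int]:
    """Average an image region and return [R, G, B] as 8-bit DMX levels."""
    b, g, r = frame[y:y + h, x:x + w].reshape(-1, 3).mean(axis=0)  # OpenCV is BGR
    return [int(r), int(g), int(b)]

def send_dmx(channels: list[int]) -> None:
    print("DMX out:", channels)  # stand-in for a real Art-Net/sACN sender

cap = cv2.VideoCapture("plate_footage.mov")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Sample the top edge of the plate, roughly where street lamps sweep past
    send_dmx(region_to_dmx(frame, x=0, y=0, w=200, h=120))
cap.release()
```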


Having created accurate assets and lit the scene accordingly, it is ideal to have a Colourist present on set to creatively match the colour spaces between foreground and background.

Other important things to consider while shooting against LED walls are avoiding moiré and colour calibration. Moiré can happen when the camera focuses on the screen or the screen falls within the depth of field of the lens. The way to avoid it is to use good-quality screens or to have enough space in the studio to create distance between foreground and background, which may be impossible in smaller studio set-ups.
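
Whether the wall falls inside the depth of field can be sanity-checked with the standard thin-lens approximations (a rough sketch; the 0.025mm circle of confusion assumes full frame, and the distances are illustrative):

```python
# Quick moire sanity check using standard depth-of-field approximations:
# if the LED wall sits beyond the far limit of focus, it renders soft and
# the moire risk drops. The 0.025mm circle of confusion assumes full frame.
def dof_far_limit_m(focal_mm: float, f_stop: float, focus_m: float,
                    coc_mm: float = 0.025) -> float:
    """Far limit of acceptable sharpness in metres; inf beyond hyperfocal."""
    hyperfocal_m = (focal_mm ** 2) / (f_stop * coc_mm) / 1000.0
    if focus_m >= hyperfocal_m:
        return float("inf")
    return (hyperfocal_m * focus_m) / (hyperfocal_m - focus_m)

wall_m = 7.0                                                   # illustrative wall distance
far_m = dof_far_limit_m(focal_mm=50, f_stop=2.8, focus_m=3.0)  # actor focused at 3m
print("wall is soft, lower moire risk" if wall_m > far_m
      else "wall is within depth of field: moire risk")
```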


As for colour calibration, the colour cast of LED screens hugely impacts the faithful colour rendition of the assets in camera. If the LED panels are not colour-matched between walls and calibrated beforehand, it creates issues that sometimes cannot be corrected even in DI/post. It is best to do a test shoot in the studio before the actual shoot day to ensure this doesn’t happen. But even as I write this, advances are being made in LED screen technology, offering better and better colour spaces and rendition.
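
One common form such calibration takes (a generic sketch of the idea, not any vendor’s specific workflow) is to display known colour patches on the wall, record them through the production camera, and solve for a correction matrix that maps captured colours back to the intended values. The patch numbers below are invented for illustration:

```python
# Sketch of a 3x3 colour-correction fit: display known patches on the wall,
# measure what the camera records, and solve captured @ M = reference by
# least squares. All patch values here are invented for illustration.
import numpy as np

reference = np.array([[1.0, 0.0, 0.0],    # patch values sent to the wall
                      [0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0],
                      [0.5, 0.5, 0.5]])
captured = np.array([[0.92, 0.05, 0.03],  # what the camera actually recorded
                     [0.06, 0.88, 0.10],
                     [0.02, 0.08, 0.95],
                     [0.47, 0.49, 0.55]])

M, *_ = np.linalg.lstsq(captured, reference, rcond=None)
print(np.round(captured @ M, 3))  # corrected patches, close to the reference
```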


The Sphere in Las Vegas stunned audiences this year with its immersive visual experience and superior LED screen technology. The Sphere is a 516'-wide, 366'-tall geodesic dome that houses the world’s highest-resolution screen: a 160,000-square-foot LED wraparound. Its sonically transparent surface of LED panels has 500-nit brightness, producing a high-dynamic-range experience. Shot on 18K Big Sky cameras specially designed for the venue, the viewing experience has created a huge buzz across the world. While cinema projection largely remains 4K, however juiced-up the resolutions our cameras boast, the requirement for the Sphere is unique: the creators want to take the viewer on a spectacular journey, not tell a narrative story with actors.



In the context of virtual production, this technology may be borrowed for higher-end LED wall design and projection. Software is also being developed to match LED walls to specific camera colour spaces, to improve colour rendition and matching in virtual productions.

Now let’s pivot all this to an Indian industry context: how can it change things for us?


It would be nice to imagine a shoot that doesn’t involve long travel to a remote location or the need to fly the team to distant outdoor locations; instead, a controlled studio environment where most parameters are within our control. Hassle-free in many ways: barring traffic, everything remains in the production’s hands, discounting cast arrival times, which sometimes defy all call-sheets.


Then there is knowing that the sunlight for a scene can always be steady: if the need is an overcast day, that can be simulated and controlled; if it’s a sunny day we need, that’s possible too. The light simply remains constant. We often find ourselves shooting outdoor scenes over an entire day, and over many days, where it’s sometimes impossible to match lighting conditions and continuity. We are always chasing the sun, using natural light when it’s angular, then skimming, cutting, bouncing, augmenting and playing hide-and-seek with the sun to maintain lighting and contrast continuity within the scene. With air conditioning thrown into the virtual production set, the crew doesn’t have to fight the elements, and special effects can bring in rain, snow or blizzards only as required.


Cinematographers also find themselves fighting the dying light while shooting during magic hour to get those money shots. The whole crew, who otherwise seem reasonably calm, suddenly behave like a bunch of frazzled, frantic humans, screaming through those thirty minutes, which is all they have to can the shots before the sky goes dark. In a VP studio, magic hour is eternal…


Then there is the struggle with travel shots and heavy-duty gear: rigs for multiple cameras, rigs for lights, frames for silks, trusses on the principal car with the actors, and follow cars, none of which are conducive to Indian roads or traffic, especially when trying to shoot in narrow, realistic locations. Actors being recognised on the road leads to crowd problems; there will always be that one guy on a bike tailing the car and looking in, infuriating all the ADs.


VP also means that the ‘plates’ of the location for travel shots have to be shot prior to principal photography. Once shot, the plates are transformed from 2D footage into 3D environments; remapping 2D imagery into a 3D environment provides the illusion of depth. During principal photography inside the VP studio, game engines like Unreal Engine track and interact with the camera’s position in real time. This allows the movement of the camera in relation to the LED backgrounds, be it a pan/tilt or a repeatable move, to look realistic, as the perspective of the background keeps changing in relation to the camera’s position. All this happens in real time, creating the illusion of live location filming within a studio.
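
The geometry behind that perspective shift is essentially an off-axis (generalised) perspective projection: each frame, the engine re-derives a skewed view frustum from the tracked camera position to the wall’s corners. A simplified numeric sketch of the idea follows; Unreal’s internals differ in detail, and the wall coordinates are invented for illustration:

```python
# Simplified off-axis ("generalized perspective") projection sketch: given
# the tracked camera position and a flat wall's corners, derive the skewed
# frustum extents an engine would re-render the background with each frame.
# Wall coordinates below are invented; Unreal's internals differ in detail.
import numpy as np

def offaxis_frustum(eye, wall_ll, wall_lr, wall_ul, near=0.1):
    """Return (left, right, bottom, top) extents at the near plane."""
    vr = wall_lr - wall_ll; vr /= np.linalg.norm(vr)   # wall's right axis
    vu = wall_ul - wall_ll; vu /= np.linalg.norm(vu)   # wall's up axis
    vn = np.cross(vr, vu)                              # wall normal, toward eye
    d = np.dot(vn, wall_ll - eye)                      # signed eye-wall distance
    left   = np.dot(vr, wall_ll - eye) * near / -d
    right  = np.dot(vr, wall_lr - eye) * near / -d
    bottom = np.dot(vu, wall_ll - eye) * near / -d
    top    = np.dot(vu, wall_ul - eye) * near / -d
    return left, right, bottom, top

# A 6m x 4m wall, 4m in front of the camera line:
ll, lr, ul = (np.array([-3.0, 0.0, -4.0]), np.array([3.0, 0.0, -4.0]),
              np.array([-3.0, 4.0, -4.0]))
print(offaxis_frustum(np.array([0.0, 1.5, 0.0]), ll, lr, ul))   # centred camera
print(offaxis_frustum(np.array([-1.0, 1.5, 0.0]), ll, lr, ul))  # dollied left
```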


Mitesh Mirchandani, ISC threw some light on this process and spoke of his experience with virtual production for car travel shots on his feature ‘Bawaal’. For the travel shots through London and Poland, he shot plates of the streets that were sent to the VP team, who rendered and fed them onto the LED walls for the shoot day. He specifically shot the plates in overcast light, which was the look he wanted for the mood of the sequence. The plates were shot on Sony A7S III cameras. Mitesh was satisfied with the results and pointed out that what might have taken several shoot days on live locations was wrapped in a day inside the studio. He used the LED walls both to display the backgrounds seen through the car’s windows and to create reflections of passing buildings on the glass through which he shot the lead actors.


Bawaal, 2023

Mitesh also shot some key scenes of the same film virtually: scenes that take place in Auschwitz. The sequence follows the lead pair as they walk through the historic site and relive the nightmares of the concentration camp, a place he could never have actually shot in. This scene in the film is in black and white, a creative choice made during preproduction. As the actors walk into the empty camp, it comes alive with images from the past, stark and disturbing. It manages to evoke the historic events quite strikingly.


For his prep, Mitesh took extensive HD stills of the real location in Auschwitz and sent them off for the creation of the 3D assets. The Production Designer Aditya Kanwar created structures to match the virtual space; special mention goes to the barbed-wire fence that frames the faces of the prisoners and gets incorporated as a foreground element as well.

He shot over four days with ATM VIRTUAL at their studio facility in Warsaw, Poland.


Bawaal, 2023
Inset - On location - Bawaal at ATM VIRTUAL, Warsaw, Poland.
Bawaal, 2023
Inset - On location - Bawaal at ATM VIRTUAL, Poland.
Bawaal, 2023

Virtual Production’s main pillar and key collaborator, apart from the Director and D.O.P, is the Production Designer.


To get some insight into the process, a conversation with Production Designer Sonal Sawant was hugely helpful. She shared her expertise on what it involves to design for a virtual production.

“It always starts at the board, just like designing for real locations; sketches of the spaces are drawn up. In the case of virtual production, the graphics team take over the designs, translating my vision into 3D models. A virtual walkthrough follows with the Director and D.O.P, where we put our heads together and fine-tune, iron out and finalise details. There is a timeline involved for digital asset creation, and once that is ready and shoot dates are fixed, props are procured and real sets and set elements are created to place inside the VP studio according to the scene’s requirement.”


Nirav Shah, ISC spoke of VP, saying his run-in with it showed immense possibilities. He shot a short sequence at Qube’s virtual production studio, in partnership with Annapurna Studios in Hyderabad, the first and only studio in India fully equipped for large-scale virtual production. Nirav recommends testing as the best way for any D.O.P to figure out the process: once the assets are created, it is ideal to do a test shoot before principal photography to iron out any unforeseen glitches. And the magic sauce to the entire process, he says, is the quality of the assets, without which it will prove a waste of time, money and energy.


Virtual Production can be a game changer only with the right previsualisation and a planned workflow. It involves an intensive amount of prep and planning from all departments, with the onus on the Production Designer and Cinematographer to work in alignment with the virtual asset creation team, which includes the VAD, virtual gaffers and colourists, to realise a unified vision. The Director has to back this process by making his or her vision clear, taking vital creative decisions beforehand and locking everything before the shoot.


In a typical ICVFX workflow, the Director locks the scene during pre-vis, and a storyboard of the sequence is created with the shots he or she is planning. An animated render of the scene, with the camera movements, lensing, lighting changes and set design required, follows during tech-vis with the D.O.P, Director and Production Designer. Asset creation comes next, and once that is streamlined, the next step is to go on floors. A tech scout of the studio before the shoot ensures there are no last-minute gaffes.


This may be a new approach for Directors, as many are used to making scene changes on set or wanting options; they may complain that the process is counter-intuitive and prefer green screen if they are unable to adhere to the workflow.

Virtual Production can benefit big-budget television series, game shows and any show entirely set in one location or world, as once set up it means many hours of productive shooting. It is well suited to TV commercials and music video shoots, which tend to push creative boundaries and are graphics-heavy. Once different digital worlds are created, it also allows moving from one set-up to another within the studio, saving a lot of time.


For feature films, it offers a good workflow for specific scenes that are graphics-heavy, for replicating inaccessible locations where live shooting is complicated, for any location that is unaffordable to shoot in, and for travelling shots.


The timelines to build assets are variable, and the processing power of the production facility will be a major deciding factor in the speed and execution of complicated shots. Budgets will be another factor, as green matte photography is still an effective, viable option. Also key to VP’s success is the digital asset creation team, who need to be in sync with the H.O.Ds, share a unified vision and understand the timelines of the production. This circles back to VFX capabilities. Will real-time graphics pipelines manned by a few trained professionals replace hundreds of VFX artists working on each frame? Will AI saunter into this space and take things forward? Only time will tell.


The recently launched company Cuebric has created AI-driven software that promises camera-ready media in seconds, for playback in a volume in a VP studio. This might be the next frontier in virtual production.


The big stumbling block to consider is the trap of images looking fake or manufactured, which is bound to creep in when everything is controlled. Those Cinematographers who are nostalgic, who miss analog, who enjoy the stray streak of light that creeps in when shooting on a live location, who don’t want everyone in the frame to look perfectly back-lit all the time, will need to work this technology to retain their signature style and idea of beauty, work the system to make it more real, make the story more believable… do some ‘Jugaad’ to VP, to make it our own.


 
