PhysLight is becoming part of Weta Digital’s standard pipeline. It is not a single tool but a suite of tools, spanning from set to final render, that aims to match as closely as possible the lighting that was happening on set when adding visual effects.
Many artists are familiar with capturing an HDR to record the illumination at any point on set, but an HDR does not accurately record colour. Most productions only record illumination as a relative data set, not directly linked to the real world light levels or to their individual camera gear.
While a bracketed set of 360 degree stills can provide the range and intensity of directional light at a point, it does a poor job of helping to accurately match colour. There are several reasons for this:
- Often there is only a small Macbeth colour chart in shot, and this does not provide a rigorous, mathematically accurate colour sample. At best, it shows the colour of the light at that one point in the scene.
- The camera that takes the HDR is very different from the camera that films the plate photography. It would not be uncommon for a CMOS Canon 5D stills camera with an 8mm lens to capture the HDR on set, while the principal photography is captured with an Alexa and ARRI prime lenses.
- Even if one could adjust the RAW Canon to look like the Arri, neither of these will have the spectral response that the 3D renderer will simulate.
- This is not just a problem of balance: the entire gamut or colour space of the Canon will be different to the Arri, and different again to the renderer.
- Different lights have different spectral distributions. Skin can look quite different when filmed on the same camera but lit with different lights (LED vs. tungsten), even if they are the same colour temperature. White LEDs in particular have spikes in their spectral power distribution compared to other lights. This greatly changes the way colours are resolved by RGB cameras (via their CMOS colour matrixing).
- Finally, even if one focuses on the illumination levels and ignores colour, the HDR that feeds the IBL in 3D is a relative measurement. It captures relative differences in light levels, not an absolute record of them, and it is not linked to the actual production camera: it is only a record of what the HDR camera recorded. In other words, the values inside the HDR are accurate relative to each other, but there are no absolute units or measurements that definitively connect two different HDRs.
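The relative-versus-absolute point above can be made concrete. Below is a minimal sketch (not Weta's actual tooling; the constants and function names are illustrative) of how a relative, linear HDR could be scaled into absolute luminance in nits once the capture camera's exposure settings are known, using the standard reflected-light meter relation L = K·N²/(t·S):

```python
import numpy as np

# Hypothetical sketch: scale a relative, linear HDR into absolute
# luminance in cd/m^2 (nits) from the capture camera's exposure settings.

K = 12.5  # reflected-light meter calibration constant (ISO 2720)

def absolute_luminance(hdr_relative, f_number, shutter_s, iso, mid_grey=0.18):
    """Map relative linear HDR values to cd/m^2.

    Assumes a pixel value of `mid_grey` corresponds to a correctly
    exposed 18% grey card at these settings.
    """
    # Scene luminance that meters as mid-grey at this exposure
    l_mid = K * f_number ** 2 / (shutter_s * iso)
    return np.asarray(hdr_relative) * (l_mid / mid_grey)

hdr = np.array([0.18, 1.0, 45.0])  # relative linear pixel values
nits = absolute_luminance(hdr, f_number=8.0, shutter_s=1 / 60, iso=100)
print(nits)  # absolute values, now comparable across different HDRs
```

With values in real units, two HDRs shot with different cameras and different exposures become directly comparable, which is exactly what a purely relative IBL cannot offer.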
What is needed is a way to convert the HDR images into a colour space and spectral response that will match the main unit camera with the production’s lights, and then to faithfully emulate the results in the renderer. Until recently few renderers had the flexibility to provide full spectral rendering, and those that did still needed tools to calibrate them to the rest of the pipeline. A couple of years ago Weta’s own visual effects supervisors were unsure why the technical team were so insistent on making their new Manuka renderer a full spectral renderer (similar to Maxwell and only a few other renderers). It is easy to see why they would be unconvinced: if a production is rendering an entire CG film, then this aspect does not matter nearly as much. But if you want CG characters, in particular, to sit in live action photography, then having the correct colours is as important as having the correct levels of illumination.
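The earlier LED-versus-tungsten point can be sketched numerically. A camera's RGB response is an integral of illuminant spectrum, surface reflectance and sensor sensitivity, so a spiky LED spectrum and a smooth tungsten spectrum can render the same surface differently even after white balancing. All the curves below are invented stand-ins, not measured data:

```python
import numpy as np

wl = np.arange(400, 701, 10).astype(float)  # wavelength samples, nm

def band(mu, sigma):
    """Simple Gaussian spectral band (toy stand-in for measured curves)."""
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Crude three-band camera sensitivities (R, G, B)
cam = np.stack([band(600, 40), band(540, 40), band(460, 40)])

# A smooth tungsten-like ramp vs a spiky white-LED-like spectrum
tungsten = 0.3 + 0.7 * (wl - 400) / 300
led = 1.2 * band(450, 12) + band(560, 50)  # blue spike + phosphor hump

skin = 0.3 + 0.4 * (wl - 400) / 300  # smooth, reddish toy reflectance

def rgb(illuminant):
    """Integrate illuminant x reflectance against the camera bands."""
    out = cam @ (illuminant * skin)
    return out / out.max()  # normalise, as a crude white balance

print(rgb(tungsten))
print(rgb(led))  # different channel ratios -> a different rendered colour
```

A spectral renderer evaluates the full integral per wavelength, which is why it can reproduce these illuminant-dependent differences where an RGB renderer cannot.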
Manuka was designed from the outset to provide the power and control of spectral rendering, and now the rest of Weta’s pipeline uses the PhysLight tools to calibrate and balance their entire end-to-end process. The latest Planet of the Apes film was the first full PhysLight production, and the Oscar nominated visual effects benefit from the extra level of realism that Manuka brings to the CG characters, making them look as if they were really shot at the same time as the plate photography.
Below is our interview with War of the Planet of the Apes VFX supervisor Dan Lemmon discussing the use of PhysLight on the Oscar nominated film.
Part of the PhysLight solution is that the company now works in real world units such as lumens and nits. Dan Lemmon believes this is advantageous in its own right. “In the past the way most facilities shot their HDRs and built their IBL meant that the exposure data was very accurate, but it was only relative, it was not necessarily put into real world measurements such as lumens.” The old approach also did not account for the production camera: what ND filters were used, the specific iris, the camera ISO, the way the camera internally deals with colour temperature, and the effects all of these might have. On most productions only simple data is logged, such as shutter angle and frame rate. By comparison, Weta logs everything and records the actual serial number of each camera, lens and filter used. This is key to their closed loop system.
Dan Lemmon points out that one might not think this is such a big deal, but it makes a real difference to a high end film pipeline. Without it, artists are guessing, and that is why some shots just don’t look quite right: it is very hard to guess how bright a window should be, or how dark the sky should be in a comp. Weta can still do anything creatively, but it starts from an informed, accurate, known initial level. For example, the levels and actual colour of the blacks can be shifted by the use of external ND filters. This is a known issue. Without knowing what ND filters were used, an artist would not know to adjust the blacks accordingly, and even if they checked the camera’s metadata, that would only provide the T-stop or f-stop recorded post ND filter.
“In the past we would eyeball it, we couldn’t work in absolute calibrated real world values, we were working in relative values and dialing exposure to look correct but it was not locked into a closed loop system where the camera exposure and the exposure of the sky or a burning torch were all in the correct numerical space” he explains.
Today, Lemmon or his team can go on set with a calibrated Canon camera, perform an IBL, and know they will get the true real world colour temperature and lumen values of all the lights. These are converted correctly in the system from RGB to true spectral values, so that when an artist sets the correct exposure with all the properties of the camera, the renders match the cinematography very accurately.
Luca Fascione is Senior Head of Technology & Research at Weta Digital, where he oversees Weta’s core R&D efforts including Simulation and Rendering Research. He points out that one could still use an RGB renderer calibrated using PhysLight, and the results would be more accurate “but with a spectral renderer like Manuka, PhysLight really sings”.
The notion of matching the colour is never ignored in a traditional film pipeline but “the question is can you match the colour with a traditional 3×3 (LUT)? And the answer is no, there are large, human perceivable differences that you get by moving beyond a 3×3. So by being able to measure the correct thing, and then calculate the correct sensitivity actually gives you a very real and material advantage”.
To be honest Fascione and the team were not sure how much of a difference building an end to end colour matched spectral pipeline would make. “We always suspected that there would be an advantage and it would look a whole lot closer to the footage. As it turns out, it was much more than we even anticipated” he comments. Weta Digital had hoped to notice a difference on human skin, but in fact they noticed a visible difference in a huge range of shots. “This was a great satisfaction for me personally after spending a year convincing everyone that we should do a spectral solution and it would be a good idea” Fascione adds.
The implications of committing to PhysLight are substantial: all the reference material that is now collected needs to be documented, all the cameras Weta uses have to be spectrally characterised (measured), and it is now incredibly important to capture, maintain and preserve metadata. While maintaining metadata from set to post may seem a simple problem, almost no visual effects company in the world will claim to get correct metadata even half the time.
On War of the Planet of the Apes the production shot with the Alexa using the ARRI Lens Data System (LDS) metadata system. The ARRI LDS is part of the Electronic Control System (ECS), introduced in 2000 with the LDS Ultra Prime and Master Prime lens series. LDS describes all the digital lens settings: lenses with LDS functionality deliver information about their current settings (focus, iris, zoom) to the camera, and the Alexa interprets these values and records them along with the footage. Many 3D camera trackers hope to get shots without the zoom being adjusted or racked during the take as it makes solving more complex, but Weta easily copes with much more than this. For example, on Apes, the DOP liked to rack exposure. Most shots have a fixed exposure, and normally this T-stop can simply be noted by a data wrangler on set, but Michael Seresin liked to adjust exposure during the shot, which affects not just illumination but depth of field and a host of other camera aspects. By having the data recorded constantly, and encoding the metadata per frame, “we could see when the exposure changed exactly, we could see how quickly it adjusted. We used those curves and the focus curves to inform us how quickly, and in what direction things were changing” comments Lemmon.
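As a small sketch of how such per-frame metadata might be used (illustrative only, not Weta's pipeline), a per-frame T-stop curve converts directly into a per-frame exposure scale, since transmitted light goes roughly as 1/T²:

```python
import numpy as np

# Illustrative: turn a per-frame T-stop curve from lens metadata into a
# per-frame exposure multiplier. T-stop values here are made up.

t_stop = np.array([2.8, 2.8, 3.2, 4.0, 4.0])  # per-frame T-stops
t_ref = t_stop[0]                              # expose relative to frame 0

# Light transmission goes as 1/T^2, so the relative scale is (T_ref/T)^2
scale = (t_ref / t_stop) ** 2
print(scale)  # frame-by-frame multiplier, e.g. T4.0 ~ 0.49x of T2.8
```

Applying such a curve in lighting or comp keeps a CG element tracking an in-shot exposure rack frame by frame instead of guessing a single fixed value.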
While gels are already in the system, the PhysLight approach is so accurate it challenges base assumptions, such as how “neutral” an ND (Neutral Density) filter really is: does it lower light levels evenly, or does it filter unevenly across the spectrum of visible light? While a shot through an ND may look normal to our eyes, if the filter blocks more red wavelengths than blue as part of the ND process, this can be significant in fine-detail spectral matching. Not only do different ND brands differ, batches within the same brand may vary, and Weta is documenting all of this. Weta Digital is continuing its research and development.
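A toy example of a "not-quite-neutral" ND (all curves below are invented): apply a filter whose transmission tilts slightly across the visible range and compare the camera's channel ratios with and without it:

```python
import numpy as np

wl = np.arange(400, 701, 10).astype(float)  # wavelengths, nm

def band(mu, sigma):
    """Toy Gaussian camera band."""
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

cam = np.stack([band(600, 40), band(540, 40), band(460, 40)])
light = np.ones_like(wl)  # flat test illuminant

# Nominal 3-stop ND (x1/8) that passes slightly more blue than red
nd = 0.125 * (1.0 + 0.05 * (700 - wl) / 300)

rgb_open = cam @ light
rgb_nd = cam @ (light * nd)

# Channel ratios relative to green, with vs without the filter
shift = (rgb_nd / rgb_nd[1]) / (rgb_open / rgb_open[1])
print(shift)  # R and B drift away from 1.0 -> a small colour cast
```

A perfectly neutral ND would leave every ratio at exactly 1.0; the tilt pushes the blacks and the overall balance off in a way that only shows up once you measure the filter spectrally, batch by batch.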
PhysLight also has a component in compositing: Weta has noise profiles for each camera. In the case of the last Apes film, the cameras were Alexa 65s. In some dark scenes PhysLight made a real difference in compositing. The ALEXA 65 camera is a scaled-up version of an ALEXA SXT, able to capture uncompressed ARRIRAW 65 mm imagery. The camera is internally made up of essentially three sensors from a regular camera. If one shoots in low light with a high ISO, or the imagery is graded up in the comp, then artifacts can be seen. “In extremely low light, where the camera does very well, if you crank up the ISO, and then grade those shots up, you see noise and also artifacts from where the multiple sensors are stitched together…you see almost noise stripes or changes in black levels from one part of the image to the next. Those are things we actually modelled and put back into the comp. In the shots in the hidden fortress, where it was pitch black and the (CG) Apes walk in with torches, we had to have the torch light illuminate the sides of the set, which was a live action physical set – but also illuminating the CG Apes” explains Lemmon. “Getting the black levels right, which includes getting the noise level right, and the right colour and including those artifacts was all part of the process”.
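A minimal sketch of the grain side of this (parameters invented, not Weta's measured profiles): a signal-dependent read-plus-shot noise model applied to a clean linear CG element so it can sit with a noisy plate:

```python
import numpy as np

rng = np.random.default_rng(1)

def add_camera_noise(cg_linear, read_sigma=0.002, shot_gain=0.01):
    """Add a simple read + shot noise model to a linear CG image.

    In practice the two parameters would come from a per-camera,
    per-ISO noise profile; the values here are illustrative.
    """
    # Noise variance = constant read noise + signal-proportional shot term
    sigma = np.sqrt(read_sigma ** 2 + shot_gain * np.clip(cg_linear, 0, None))
    return cg_linear + rng.normal(0.0, sigma, size=cg_linear.shape)

cg = np.full((4, 4), 0.05)  # dark, perfectly clean CG patch
noisy = add_camera_noise(cg)
print(noisy.std())  # non-zero grain, strongest where the signal is dark-ish
```

A real profile would also capture the per-sensor stitching artifacts Lemmon describes, so black levels and stripe patterns can be reproduced region by region rather than as one uniform grain.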
Other films such as Valerian and Maze Runner have benefited from Manuka and its role in PhysLight, but currently War of the Planet of the Apes represents Weta’s most complete visual effects lighting pipeline.
Note: Weta Digital can get the following metadata from the Arri Alexa 65 LDS, per take, and this is some of the data that feeds PhysLight:
- Lens Model
- Lens Serial Number
- Lens Distance Unit
- Lens Focus Distance
- Lens Focal Length
- Lens Iris
- RawEncoderFocus RawLds
- RawEncoderFocus RawMotor
- RawEncoderFocal RawLds
- RawEncoderFocal RawMotor
- RawEncoderIris RawLds
- RawEncoderIris RawMotor
- Lds Lag Type
- Lds Lag Value (eg. lag in ALEXA cameras is one frame)
- Lens Linear Iris
- ND Filter Type
- ND Filter Density
- Camera Tilt
- Camera Roll
- Master Slave Setup Info
- 3D Eye Info