LEGO superheroes, wizards, and the Millennium Falcon coming to life on the big screen? Yes, please! The LEGO Movie opened on February 7, 2014, and has surpassed $200 million at the box office. This hilarious tale is a stunning achievement in animation and visual effects by the folks at Animal Logic. Intrepid reporter Barbara Robertson went behind the scenes at Animal Logic to interview the VFX team about their lighting and rendering process. Get all the insights and tips so you can apply them to your own work. Enjoy!
-The Animation Mentor Crew
Lighting the LEGO Movie
Usually when we think of digital doubles, we imagine stand-ins for actors in live-action films and consider the challenges of creating photorealistic CG humans. And when we think of digital environments, we imagine photorealistic landscapes and set extensions. It comes as a surprise, then, to realize that Craig Welsh of Animal Logic, the lighting supervisor on Warner Bros.’ The LEGO Movie, faced many of the same challenges a CG lighting artist working on a live-action film would.
Everyone knows what LEGO bricks look like. It was up to the texture, shading, and lighting artists to match that reality and bring the miniature characters and sets alive in a photorealistic world. Prior to this film, Welsh was a lighting artist on The Matrix Reloaded, Shrek 2, Madagascar, Monster House, Surf’s Up, I Am Legend, and Speed Racer, and lighting supervisor on Legend of the Guardians. For The LEGO Movie, his crew of 45 included three lighting leads, a handful of compositors, and six technical directors who wrote tools and supported the artists. Two of the TDs, Max Liani and Luke Emrose, developed Glimpse, a custom renderer.
If you want to match a LEGO brick, think beyond a plastic CG surface.
“We used only legal LEGO colors and emulated the plastic material with the correct diffusion and colors,” Welsh says. “LEGO plastic scatters light through it, so you need to do a lot of ray-traced calculations to get subsurface scattering. And every surface in the LEGO world is highly or somewhat reflective, so you need reflections and refractions. You have to pay a lot of attention to oxidation of the bricks, particularly for the exterior shots, as if they had been in the sun and the weather, or maybe in a sandbox. And for the interiors, as if they had been in a basement for 10 years. You have to add fingerprints and thumbprints, and make them look sticky with dust and hair. And you have to do that efficiently. We used Ptex to distribute thumbprints across multiple bricks.”
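To make those material notes concrete, here is a minimal Python sketch of the kinds of parameters such a plastic shader involves. It is purely illustrative and not Animal Logic's shader code: the index of refraction of about 1.55 is an assumed value for ABS-like plastic, and the Ptex grime map name is hypothetical. Schlick's approximation also shows why "every surface is somewhat reflective": even head-on, polished plastic reflects a few percent of the light.

```python
from dataclasses import dataclass
from typing import Optional

def schlick_f0(ior: float) -> float:
    """Reflectance at normal incidence for a dielectric, per Schlick's approximation."""
    return ((ior - 1.0) / (ior + 1.0)) ** 2

@dataclass
class BrickMaterial:
    albedo: tuple                     # one of the official LEGO palette colors
    ior: float = 1.55                 # assumed IOR for ABS-like plastic
    sss_radius_mm: float = 1.0        # how far light scatters inside the brick
    roughness: float = 0.1            # polished plastic; raise it for oxidized bricks
    grime_ptex: Optional[str] = None  # hypothetical Ptex map for fingerprints and dust

orange_brick = BrickMaterial(albedo=(0.95, 0.45, 0.10), grime_ptex="thumbprints.ptx")
print(f"Specular reflectance at normal incidence: {schlick_f0(orange_brick.ior):.1%}")
# Roughly 4.7 percent, which is why every brick picks up visible reflections.
```

The same handful of parameters, pushed toward higher roughness and dirtier grime maps, covers the oxidized, basement-aged bricks Welsh describes.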
To design photorealistic lighting, start with something real.
“We had a lot of concept art and a lot of LEGO in the building,” Welsh says. “People in the art department were modeling with the LEGO. So, we took photographs of the buildings and experimented with depth of field. To get a photorealistically convincing look, you have to underpin it with the concept of how lighting really works. We carried boards with LEGO-built streets outside and photographed them in the sun. We took HDR probes of the environments to replicate them in the computer and played around with different settings for the HDR maps. We considered how to make the cylinder heads look appealing. How to give them shape and convey emotion with lighting. We did it again and again until we got it right.”
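The HDR-probe workflow Welsh describes boils down to image-based lighting: a ray direction is mapped into the probe's lat-long image, and the pixel found there becomes the incoming light. The sketch below is a bare-bones, assumed version of that lookup (conventions differ between renderers); the exposure control he mentions is, at its simplest, just a multiplier on the probe's values.

```python
import math

def direction_to_latlong_uv(d):
    """Map a normalized direction (x, y, z), y-up, to equirectangular UVs in [0, 1]."""
    x, y, z = d
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi
    return u, v

def sample_environment(hdr_pixels, width, height, direction, exposure=1.0):
    """Nearest-neighbour lookup into a list-of-rows HDR image of RGB triples."""
    u, v = direction_to_latlong_uv(direction)
    px = min(int(u * width), width - 1)
    py = min(int(v * height), height - 1)
    r, g, b = hdr_pixels[py][px]
    return (r * exposure, g * exposure, b * exposure)

# A 1x4 stand-in probe: a single bright "window" pixel among dim surroundings.
probe = [[(0.1, 0.1, 0.1), (6.0, 5.0, 4.0), (0.1, 0.1, 0.1), (0.1, 0.1, 0.1)]]
print(sample_environment(probe, width=4, height=1, direction=(-1.0, 0.0, 0.0), exposure=2.0))
# This direction lands on the bright pixel, scaled by the exposure multiplier.
```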
When you’re making a film, reference interesting lighting from previous films.
“The first sequence I got my teeth into in terms of look was the interrogation room with Emmet and the bad cop,” Welsh says. “I created it with real LEGO bricks and talked with [production designer] Grant Freckelton about how to approach the lighting. He suggested looking at the interrogation scene in The Rock. I racked my brain to think of others, and a scene from Basic Instinct stood out because of the striking lighting coming down through a grate. I looked for LEGO pieces I could use and found one with a grating pattern. I put an LED bulb behind the grating and got a nice effect, so I photographed it from different angles and exposures. After Grant and I talked about what was visually appealing and what made it seem real, I went to town with the [digital] light rig. I made a gray-scale image in the shape of the grate and made a gobo out of that. Then we went back and forth with the renderer until we got the look.”
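The digital half of that gobo trick is easy to sketch. In the illustrative Python below, a procedurally generated grayscale grate stands in for Welsh's photographed grating, and the light's intensity is simply multiplied by the gobo value wherever the pattern projects. The bar count and widths are invented for the example, not taken from the production rig.

```python
import numpy as np

def make_grate_gobo(size=512, bars=8, bar_width=0.35):
    """Build a square grayscale grate: 1.0 where light passes, 0.0 at the bars."""
    coords = np.linspace(0.0, bars, size) % 1.0          # repeating cell coordinates
    open_1d = (coords > bar_width).astype(np.float32)    # gaps between the bars
    return np.outer(open_1d, open_1d)                    # crossing bars form the grate

def apply_gobo(light_intensity, gobo, u, v):
    """Scale a light's contribution by the gobo value at projected coords (u, v) in [0, 1)."""
    h, w = gobo.shape
    return light_intensity * gobo[int(v * h) % h, int(u * w) % w]

gobo = make_grate_gobo()
print(f"{gobo.mean():.0%} of the gobo area passes light")
```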
“We also had reference images and other examples of real world cinematography,” he says. “We’d emulate the way Michael Bay lights street scenes.”
If you spend time developing your visual language, you won’t panic if the story departs from the color script.
“Grant Freckelton and the art department developed a color script for the first two editing sections of the film, and the goal was to have scripts for the rest, but the story changed,” Welsh says. “We knew the general story arc, the time of day at various points, and the emotional content. So, we pretty much did ‘on the fly’ key lighting. It sounds ad hoc, but it wasn’t because we had done so many explorations. We could leverage those early conversations.”
Don’t be afraid to depart from reality if that works better to achieve your goal.
“We always tried to imagine how a scene with actual LEGO pieces might be lit and tried to stick to that as much as possible,” Welsh says. “We’d ask ourselves how we could make this wonderful thing, this model, look fantastic, as if it were built in the basement and lit by a whiz-bang miniature photographer.
“But if we really put a lipstick camera into a set built with LEGO and photographed it, the depth of field would be so shallow the backgrounds would be a blur. We had an enormous amount of detail in the background that we didn’t want people to miss. We tried to keep the sense of a miniature environment, but we departed from strict physical reality to enable the audience to appreciate the background. We also removed highlights and reflections on Emmet’s face if they interfered with his facial performance.”
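A quick thin-lens calculation backs up the depth-of-field point. The numbers below are assumptions chosen to mimic a small camera focused a few centimetres into a LEGO street, not the production's actual settings, yet the acceptably sharp zone still comes out at around a millimetre.

```python
def dof_limits(focal_mm, f_stop, focus_mm, coc_mm=0.015):
    """Near/far acceptable-focus distances from the standard thin-lens DoF formulas."""
    hyperfocal = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near = focus_mm * (hyperfocal - focal_mm) / (hyperfocal + focus_mm - 2 * focal_mm)
    far = (focus_mm * (hyperfocal - focal_mm) / (hyperfocal - focus_mm)
           if focus_mm < hyperfocal else float("inf"))
    return near, far

# Assumed setup: 25 mm lens at f/4, focused 80 mm (8 cm) into the set.
near, far = dof_limits(focal_mm=25.0, f_stop=4.0, focus_mm=80.0)
print(f"Sharp from {near:.1f} mm to {far:.1f} mm, about {far - near:.2f} mm of usable depth")
# Less than a millimetre of the set is in focus; everything behind it blurs away.
```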
When a fantastic scene calls for something special, push the boundaries, and rationalize later.
“When Emmet touches the Piece of Resistance the first time and goes into hallucinations, lighting artist Miguel Angel Perez Tejerina created some mind-blowingly spectacular looks,” Welsh says. “He used AOVs (arbitrary output variables), normal passes, and passes representing vectors of movement, and he sat there and noodled away with crazy ideas that weren’t photographically real, but they were so extraordinary that everyone embraced them. There was some talk about whether a look like that could be achieved with a kid in the basement, but we live in a world where kids do digital post processing. So, it isn’t hard to imagine.”
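As a taste of what noodling with those passes can mean, here is a small, hypothetical compositing sketch: a normals pass is re-lit in screen space to add a colored light that was never in the render. The arrays, pass layout, and numbers are stand-ins, not the production setup.

```python
import numpy as np

def relight_from_normals(beauty, normals, light_dir, tint=(0.2, 0.6, 1.0), gain=0.8):
    """Add a fake comp-time light: dot(N, L) masks a colored boost over the beauty pass."""
    light = np.asarray(light_dir, dtype=np.float32)
    light /= np.linalg.norm(light)
    ndotl = np.clip(normals @ light, 0.0, 1.0)           # per-pixel N.L facing ratio
    boost = ndotl[..., None] * np.asarray(tint, dtype=np.float32) * gain
    return beauty + beauty * boost                        # screen-space, non-photoreal by design

# Tiny stand-in images: 4x4 pixels, normals all facing the camera.
beauty = np.full((4, 4, 3), 0.5, dtype=np.float32)
normals = np.tile(np.array([0.0, 0.0, 1.0], dtype=np.float32), (4, 4, 1))
out = relight_from_normals(beauty, normals, light_dir=(0.0, 0.0, 1.0))
print(out[0, 0])   # the blue-tinted boost shows up strongest where N faces the fake light
```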
The first shots you light might show you how not to make the film.
“We knew we wanted a photorealistic look and we had a PRMan-based pipeline with physically plausible shading, so we began using that,” Welsh says. “It gave us a beautiful look, but by the time we finished a test shot we had worked long hours and we were exhausted. We learned that we needed to rethink our approach. We needed to accelerate the ray tracing to handle the heavy geometry. We had tens of millions of polygons, huge amounts of geometry. We needed to render it efficiently. But, there was already a strong agreement about the visual look we had established with PRMan. We couldn’t divert from that. We had to maintain the same look.”
Sometimes, a solution is right in front of you.
“[Technical Director] Max Liani had developed a ray tracer called Glimpse as a hobby,” Welsh says. “We saw the potential – it allowed us to handle complex ray tracing – so we folded it into the Animal pipeline. At first, we used Glimpse as a ray server for PRMan because we still relied on RenderMan’s point-cloud-based approach for subsurface scattering. We called our hybrid solution FrankenGlimpsen. But over the course of the production, Glimpse evolved into a production renderer, albeit one targeted to the needs of LEGO, that incorporated full ray-traced subsurface scattering. It doesn’t support things like hair primitives yet; it’s a brick renderer. But it can handle huge amounts of geometry that make most renderers choke and fall over.”
With rendering, one size doesn’t fit all, and when you switch to a new renderer, the workflow changes.
“Because the renderer evolved, it was an interesting time for the lighting crew,” Welsh says. “We could never really put a peg in the sand. But, everyone on the crew was flexible and engaged with the process of implementing and testing this new renderer. We tried to make the R&D changes as transparent to the lighting artists as possible, so the UI didn’t change. But, we had to re-learn how to get a high quality image, what parameters to tweak, and how to maximize efficiency without losing quality. We couldn’t work that out globally; it needed to be tailored to individual sequences and shots.”
Don’t render things you can’t see.
“We had vast amounts of geometry to render,” Welsh says. “So the Assets Department worked with R&D to develop techniques to render it efficiently. We’d take a street of buildings and bake sections into shells. If you look at a complex building made with LEGO bricks, you can determine which bricks are hidden and cull out that unwanted geometry. That leaves you with a shell that still looks like a solid building.”
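One way to picture that shell baking, as a sketch rather than the studio's actual tool, is a flood fill over a brick-sized voxel grid: any brick that never touches air reachable from outside the building can be culled, because no camera or light ray can ever see it.

```python
from collections import deque

def visible_shell(bricks):
    """bricks: set of (x, y, z) occupied cells. Returns only the bricks on the outer shell."""
    if not bricks:
        return set()
    xs, ys, zs = zip(*bricks)
    lo = (min(xs) - 1, min(ys) - 1, min(zs) - 1)   # bounding box padded by one cell of air
    hi = (max(xs) + 1, max(ys) + 1, max(zs) + 1)
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

    # Flood-fill the air surrounding the model, never stepping into a brick.
    outside, queue = {lo}, deque([lo])
    while queue:
        cell = queue.popleft()
        for d in steps:
            nxt = tuple(c + o for c, o in zip(cell, d))
            if (all(l <= n <= h for l, n, h in zip(lo, nxt, hi))
                    and nxt not in bricks and nxt not in outside):
                outside.add(nxt)
                queue.append(nxt)

    # A brick stays if any of its six faces borders outside air.
    return {b for b in bricks
            if any(tuple(c + o for c, o in zip(b, d)) in outside for d in steps)}

# A solid 4x4x4 stack of 64 bricks reduces to its 56-brick shell; the hidden core is culled.
cube = {(x, y, z) for x in range(4) for y in range(4) for z in range(4)}
print(len(visible_shell(cube)))
```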
Past experiences will help you focus on the big picture.
“There’s always a terrifying time when you can’t imagine how you will get a film done, and we certainly went through that on this film,” Welsh says. “That’s the other side of being a supervisor. You have to keep the crew apprised of the big vision because they focus on minutiae. You have to convey the big picture – not only where we are going creatively, but also technically. My experience on other films gave me confidence to move forward with the Glimpse development: I watched the development of Arnold at Imageworks. If you keep your eyes and ears open during your career, it serves you well later. You can back yourself up with what you observed previously.”