Kingdom of the Planet of the Apes: Stephen Unterfranz (Unter) and Phillip Leonhardt – (Sequence VFX Supervisors) – Weta FX

Stephen Unterfranz has dedicated more than 25 years to the visual effects industry, collaborating with studios such as Blue Sky Studios, Tippett Studio, and Weta FX. His extensive filmography includes projects like Prometheus, The Hobbit trilogy, War for the Planet of the Apes, and Rampage.

Phillip Leonhardt launched his career in 2000, gaining experience at various studios like Cinesite, DNEG, and Rising Sun Pictures, before becoming part of Weta FX in 2005. He has contributed to many films, including Dawn of the Planet of the Apes, Maze Runner: The Death Cure, Alita: Battle Angel, and Avengers: Endgame.

What was your feeling to be back in the Planet of the Apes universe?

Unter: I was really excited to return to the Planet of the Apes. For starters, the story of Kingdom is a fresh take on the franchise without abandoning any of the canon. It’s long enough after the life of Caesar that not only can all new characters be established, but also the world can be beautiful again; we described the visual language as post-post-apocalyptic. During Dawn I was the head of environments at Weta FX, where we did a lot of world building in and around San Francisco. On War I was head of lighting, and helped with the implementation of new toolsets for our TDs. It’s been really enjoyable to approach each of these films in a different role, and to tie together that past experience as a supervisor on Kingdom.

Phillip: It’s been almost 7 years since we worked on War, with a quick excursion to the Apes universe that introduced our beloved characters to The Planet of the Apes theme park ride. It’s been a thrill to be back and to bring a whole new set of characters to life in a future world dictated by primates.

How was the collaboration with Director Wes Ball?

Phillip: Having worked with Wes before on his Maze Runner installments, we knew it was going to be a real pleasure to collaborate with him again on this project. The scale and complexity of Kingdom of the Planet of the Apes is enormous. Wes’ visual guidance and feel for the world he created are excellent, and there was mutual trust and a strong belief that we were right on track to create something truly epic together.

Unter: I agree, Wes is great! I knew from past projects how technically and creatively savvy he is, both as a filmmaker and a visual artist. His fluency in VFX made our process really constructive, and his enthusiasm and attention to detail always made us feel appreciated. It’s exactly the kind of relationship you want with a director on a project this epic in scope and complexity. We always looked forward to our reviews with him.

Director Wes Ball on the set of 20th Century Studios’ KINGDOM OF THE PLANET OF THE APES. Photo by Jasin Boland. © 2024 20th Century Studios. All Rights Reserved.

How did you organize the work with your VFX Producer?

Phillip: Because of the scale and sheer amount of CG assets, there’s a lot of coordination involved: ensuring milestones are realised, work is executed to the highest standard, time frames are defined and honoured, and tasks are distributed efficiently and effectively amongst all the artists involved. We often sit down together, review our current progress, and adjust schedule parameters – filmmaking is a very creative process with a lot of variables that change as time moves on.

Unter: We also split the show across four teams that loosely followed the story chronologically. We got the machine firing on all cylinders relatively early to get a manageable distribution of complex sequences across the schedule. Some of our heavier simulations had long iteration cycles so we needed to get a head start on them to ensure we weren’t setting ourselves up for a major render crunch at the end. 

Could you elaborate on the key visual effects advancements made between Kingdom of the Planet of the Apes and its predecessor, and how these advancements contributed to the overall visual effects work of the film?

Unter: We’ve developed a lot of technology in the years since War… One obvious example is the focus on water for the Avatar sequel. Our FX framework for water (state machine) makes it possible to realistically define the transition from surface to blobbies to points as waves break on the coast. What determines the specific structure of those transitions within that volume on any given frame, and how those points and surfaces are rendered, is something we – how else to say it?! – dove even deeper into for Kingdom. Feather systems, muscle behaviour… the list of ongoing improvements reaches into every corner of the work. But the tools are only half the story. On every show, our talented crew gains new experience in deploying these tools and techniques, and brings that expertise to the next project.
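
To make the idea of a water “state machine” concrete, here is a toy sketch – our illustration, not Weta FX’s actual framework – of how per-point render states might be assigned each frame; the signals and thresholds are invented:

```python
import numpy as np

# Hypothetical render states for fluid elements, loosely mirroring the
# surface -> blobbies -> points transition described above.
SURFACE, BLOBBY, POINT = 0, 1, 2

def classify_fluid_points(neighbor_counts, speeds,
                          dense_threshold=40, sparse_threshold=8,
                          spray_speed=6.0):
    """Assign a render state to each fluid point for one frame.

    Points deep inside a well-resolved volume render as a connected
    surface; thinning regions become blobby metaball-style clusters;
    fast, isolated points render as individual spray particles.
    All thresholds here are illustrative placeholders.
    """
    states = np.full(len(neighbor_counts), BLOBBY)
    states[neighbor_counts >= dense_threshold] = SURFACE
    states[(neighbor_counts < sparse_threshold) & (speeds > spray_speed)] = POINT
    return states

# Example: 5 points with varying neighbourhood density and speed.
counts = np.array([60, 25, 3, 50, 2])
speeds = np.array([1.0, 2.0, 9.0, 0.5, 12.0])
print(classify_fluid_points(counts, speeds))  # [0 1 2 0 2]
```

In a production solver the classification would feed meshing and rendering decisions, but the surface-to-blobbies-to-points split is the core idea.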

Phillip: As Unter says, technology never stands still, particularly in VFX. To name just a few others, we made some key upgrades to our facial capture system, created efficiencies in our shader and texture pipeline, advanced with new procedural dressing techniques for our vast environments, applied new FX workflows for fire as well as water, and did some extensive lens profiling to capture and apply the characteristics of the anamorphic lenses used in the film to our CG characters, environments, and effects.

In what ways did the evolution of CG and animation technology influence the portrayal of the ape characters in the latest instalment compared to earlier films in the series?

Phillip: In many ways, we used the advancements in technology to give artists more time to spend on the creative aspect of their work. For example, facial solving now supports a deep learning solver that provides the animators with a consistent baseline that they can then adjust and refine. Having to commit less time to such baseline parameters essentially frees up precious time for creative fine tuning that adds those last few percent of perfection. As a result, our apes’ performances on Kingdom shine and are truly alive in every single shot of the movie.

Unter: Along with a lot more dialog, many of the apes in this film are wearing clothes. It’s not that long ago that a furry character wearing multiple layers of clothing would have been a tall order to set up and to process! As an example, to provide an outward expression of his obsession with the Romans, Proximus has the most elaborate outfit – a combination of cloth, copper, and leather, with each material needing to render, deform, and interact in a believable way.

Can you walk us through the initial concept behind creating hyper-realistic visual effects for the new Planet of the Apes film?

Unter: Replicating the imperfections of the Panavision anamorphic lenses used in principal photography was key to integrating every aspect of the visual effects for Kingdom. All the data gathered on set combined with advancements in our tools meant we could strip those imperfections from the plate, composite the CG elements in, and then reapply the look of each lens across the entire image. Lens warp, sensor noise, chromatic aberration, vignetting, edge smearing, PSF (point spread function), veiling glare… all were accounted for. The great thing was that the system was automated enough to establish accurate results while still enabling artists to make creative adjustments. That kind of control also meant that full CG shots (of which there are more than a few!) could get the same treatment based on the settings of the closest real world lens.
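
As an illustration of that strip, composite, reapply ordering, here is a minimal runnable sketch using vignetting as a stand-in for the full set of profiled characteristics (warp, PSF, aberration, and so on); the falloff model and all values are invented:

```python
import numpy as np

def radial_falloff(h, w, strength):
    """Simple vignette model: brightness falls off with radius."""
    y, x = np.mgrid[0:h, 0:w]
    r = np.hypot(x - w / 2, y - h / 2) / np.hypot(w / 2, h / 2)
    return 1.0 - strength * r**2

def integrate_cg(plate, cg_rgba, vignette_strength=0.3):
    falloff = radial_falloff(*plate.shape[:2], vignette_strength)[..., None]
    clean = plate / falloff                  # 1. strip the lens look from the plate
    cg, alpha = cg_rgba[..., :3], cg_rgba[..., 3:]
    comp = cg * alpha + clean * (1 - alpha)  # 2. merge CG over the rectified plate
    return comp * falloff                    # 3. reapply the look to the whole frame

plate = np.random.rand(270, 480, 3)
cg = np.zeros((270, 480, 4))
cg[100:170, 200:280] = [0.8, 0.2, 0.1, 1.0]  # a toy CG element with full alpha
print(integrate_cg(plate, cg).shape)         # (270, 480, 3)
```

The key point the sketch preserves is that CG is merged in “clean” rectified space, and the measured lens look is then applied uniformly so plate and CG share the same optical fingerprint.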

What were some of the key technological advancements that enabled the creation of these lifelike ape characters?

Phillip: For our final facial motion pass we used DeepShapes to simulate the overlap of skin wrinkle motion, based on fascia and epidermis layers. We essentially accounted for the skin’s depth profile to help inform where memory folds and facial shape variation would appear. It’s a subtle but very powerful effect that we didn’t have before, and it breathes life into our hero characters.

How closely did the visual effects team work with the film’s directors and producers to ensure the apes were portrayed authentically?

Unter: Extremely! In the full swing of things we were reviewing work with Wes and the producers almost every day of the week. Our overall VFX supervisor Erik Winquist is an industry veteran, with a metric ton of CG primate experience. He and Wes had a shared vision of the film, not to mention great rapport; Erik understood the story Wes wanted to tell, and Wes trusted Erik’s expertise. That collaborative mindset radiated through the team, and authenticity can’t help but emerge in that kind of environment.

Phillip: Agreed, we were extremely close. We combined the filmmakers’ vision of the characters with the expertise we’ve gathered over the decades bringing apes to life on the big screen. It all starts with the actors’ performances, so it’s important for them to learn about the behaviour of apes, which, once on set in a mocap suit, they craft into their performances. There was so much more dialog amongst the apes than in the previous films, and it’s exhilarating to see that we still entirely believe that there are real apes having a conversation on the big screen – even though everyone knows that they can’t actually talk.

Could you elaborate on the process of motion capture and how it was utilized in bringing the ape characters to life?

Phillip: Motion capture was the heart and soul of this project. To translate the actors’ performances to their ape characters, we put them in mocap suits that look like grey, full-body leotards with dots, patterns and optical markers on them. As long as the actors move within a predefined volume that was set up and covered by motion capture cameras, we can record their body motion and retarget it onto their ape characters’ skeletons. For capturing the facial performance, special head-mounted camera rigs were used to record even the smallest detail of facial muscle motion. There’s a specific dot pattern painted on every actor’s face to help solve and translate that motion to their characters.
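
Here is a deliberately simplified sketch of the retargeting step described above – a one-degree-of-freedom toy, not Weta FX’s solver, with invented joint names and offsets – just to illustrate mapping captured joint angles onto a differently proportioned skeleton:

```python
# Map each actor joint to an ape joint plus a rest-pose offset, since ape
# limbs sit in different neutral positions. Real solvers work in full 3D
# with IK and per-actor calibration; this only shows the mapping idea.
ACTOR_TO_APE = {
    "shoulder": ("ape_shoulder", 0.0),
    "elbow":    ("ape_elbow",   12.0),   # apes rest with more elbow flexion
    "hip":      ("ape_hip",     -8.0),   # shorter legs, different stance
}

def retarget_frame(actor_pose_deg):
    """Map one captured frame of joint angles onto the ape skeleton."""
    ape_pose = {}
    for joint, angle in actor_pose_deg.items():
        ape_joint, rest_offset = ACTOR_TO_APE[joint]
        ape_pose[ape_joint] = angle + rest_offset
    return ape_pose

captured = {"shoulder": 35.0, "elbow": 50.0, "hip": 10.0}
print(retarget_frame(captured))
# {'ape_shoulder': 35.0, 'ape_elbow': 62.0, 'ape_hip': 2.0}
```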

What were some of the biggest challenges faced during the creation and animation of the apes?

Unter: If you take a few minutes to really look at them, apes have very interesting hair and vary quite a bit, even within the same species. And the longer the hair, for example on an orangutan, the more difficult it is to wrangle it into behaving. Raka actually had a bespoke groom just for when he was in water. Another challenge is apes on horseback. Relative to humans, their legs are quite a bit shorter, so aligning their feet to real world stirrups and replacing the stunt riders required a bit of movie magic!

Phillip: Challenges also lie in creating a believable translation of the actor’s facial performance to their ape character. Ape faces move very differently compared to human faces. Their lips and cheeks are incredibly flexible, yet their eyebrows are rather stiff and firm. So frowning, for example, isn’t much of an ape thing to do due to anatomical restrictions; humans, however, use their eyebrows a lot. For example, if you want to make an inquisitive expression as a human, your gaze narrows and you frown, with your brows pulled together and raised toward the centre – an ape can’t do that. Instead they express this emotion more wide-eyed, with their mouth slightly open, lips rounded, etc. The actors and animators did a fantastic job understanding those differences and relaying them in their performances.

Were there any particular scenes or sequences in the film that presented unique challenges for your team?

Phillip: Besides the complex process of creating a whole new lineup of ape characters, creating the world they live in and making sure it’s as believable as the characters themselves posed numerous challenges. For the opening egg hunt scene, for example, we choreographed the action by previsualising the entire sequence, before building the whole world in all its lushness and detail for the apes to perform in, which was a huge undertaking. From there we moved onto extensive set extensions for the Eagle Clan Village, which we knew we later had to burn to the ground. There was a long chain of processes involved in building these sets in CG so that we could pipe them through our fire simulation framework to create photoreal pyro effects that portray all the dramatic events as if they were real and shot in camera.

Unter: The river scene is a big turning point in the film, and was all the more challenging as some really emotionally charged performances had to come through all the frenetic action. Unlike deep open ocean, the underlying terrain of a riverbed plays a key part in creating believable rapids and eddies, but simultaneously makes them more difficult to art direct. For the establishing shot, we needed to simulate an entire kilometre of water to turn the calm flow at the location into an angry torrent. From that, we found key features to graft into the very specific look of the water that Wes wanted immediately around the bridge, where all the action takes place. The actual bridge set piece was a tangle of old nets, rough-hewn timber, and rusted steel and we needed a few model variations of our CG bridge in order to align it to each plate for extension.

How did the team approach the balance between realism and artistic interpretation when designing the ape characters?

Unter: The characters on this film were more cast than created. Rather than go straight from concept art to a CG model, we homed in on specific reference images of individual apes to “cast” as our characters. Proximus is a good example of this; there was one ape in particular that we studied in great detail to create his face. Everything from the curvature of his brow to the cross-hatching of his muzzle wrinkles was faithfully replicated to capture the intensity of that particular animal’s appearance. Without direct anatomical correspondences, Kevin Durand’s phenomenal performance had to be carefully preserved in mapping it to a truly authentic ape facial structure.

Phillip: It was important to base every ape model on features from actual primates in the real world so we didn’t drift away from plausible anatomy. Another important step was to include some of the signature facial features of the actors playing the apes to help translate their acting onto their digital counterpart and really bring their DNA into the design.

Can you discuss the role of facial animation and expression in conveying emotion and character development for the apes?

Phillip: Given the emotional rollercoaster that is Noa’s journey throughout the movie, it was paramount to capture the actor’s most minute facial expressions and make sure they translated well to the ape character’s performance. Being able to tweak and fine tune these expressions so they sit well within the character’s anatomy and design, yet fully convey the actor’s intentions, was a key skill set that our facial animators perfected. The character emotion that you see develop throughout the movie is entirely driven by the actors’ performances, and carefully transplanted onto their ape characters.

Unter: There’s a great example of the lengths our team went to in this regard when Raka is in the river. Peter Macon originally performed the scene in the tank on set, where you see all the involuntary behaviour you’d expect – head turns, blinks, breath holds – to avoid splashing water. This scene evolved as we got further into post, and during pickups on our motion capture stage here in Wellington, Wes and Peter made some subtle changes to the performance, and ultimately changed a line of dialog to consolidate a few key ideas. Our motion team then combined this new performance capture with all the nuances of the original shoot to create something truly emotive and totally grounded in reality.

How did the team ensure consistency in the appearance and behaviour of the ape characters throughout the film?

Unter: Each actor’s performance is at the heart of every character’s behaviour. Their movement and mannerisms, expressions and emotions all flow from that performance. The actors invented a backstory for some of the key characters that serves to inform every choice they make. Why does Noa follow the law of the elders? What motivates Raka to preserve the history of Caesar? How does Proximus’ childhood affect his notion of what it means to be powerful? With that level of introspection, it’s really clear to the artists what is – and is not – true to each character.

Phillip: There is also the extremely complex challenge of costume continuity for all apes across the movie. This included different states of outfits, wounds, fur debris, and wet, damp, and dry looks for both hair and skin – the list is endless. We created costume bibles for our main characters that defined the progression of outfits and looks for the entire film. These were stored in a database which governed the assignments at a sequence, or even shot, level to ensure continuity across all the scenes. Feeding this database with new information to keep it up to date was a mammoth task and not for the faint hearted.
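
As a rough illustration of how such a continuity database might resolve a look, here is a small sketch under our own assumption of sequence-level defaults with shot-level overrides; the character, sequence, and state names are all invented:

```python
# Sequence-level defaults for each character's costume and grooming state.
COSTUME_BIBLE = {
    ("noa", "village_raid"): {"outfit": "clan_vest_v2", "fur": "dry", "wounds": []},
    ("noa", "river"):        {"outfit": "clan_vest_v2_torn", "fur": "soaked",
                              "wounds": ["brow_cut"]},
}

# Shot-level exceptions, e.g. fur drying out over the last shots of a scene.
SHOT_OVERRIDES = {
    ("noa", "river", "sh0470"): {"fur": "damp"},
}

def costume_state(character, sequence, shot):
    """Resolve the look for a character in a given shot."""
    state = dict(COSTUME_BIBLE[(character, sequence)])   # sequence default
    state.update(SHOT_OVERRIDES.get((character, sequence, shot), {}))
    return state

print(costume_state("noa", "river", "sh0470"))
# {'outfit': 'clan_vest_v2_torn', 'fur': 'damp', 'wounds': ['brow_cut']}
```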

Can you explain the collaborative process between the visual effects team and other departments?

Phillip: One of many pleasant collaborative processes was with Weta Workshop’s costume department. They designed and produced a vast array of clothing items, jewellery, and accessories the apes could wear and interact with. It was fantastic to have the actual samples in your hands to study the texture, detail and craftsmanship, which really shows in the final wardrobe items our apes wear in the movie.

The eagles in the film are not only visually stunning but also play a significant role in conveying emotion and narrative. What techniques were employed to imbue them with personality and expression?

Phillip: Because our main character Noa and his family are part of the Eagle Clan, it was important to understand the story and relationship behind these apes and their eagles. From a young age they raise and bond with their eagles for life in symbiotic fashion. That doesn’t mean any eagle gets along well with any given ape, however, as the relationship between Eagle Sun and Noa portrays nicely. When it came to performance we embraced the natural behaviour of eagles. Wes Ball and the main actors had the great opportunity to receive an eagle handling lesson in Australia, getting up close and personal with a wedge-tailed eagle. Our VFX Supervisor Erik Winquist shot that whole encounter on camera at a high frame rate. It was beautiful reference to study, and a lot of what we learned there flowed directly into the eagle behaviour you see in the movie.

Can you walk us through the process of creating visually stunning environments, especially those featuring lush vegetation or coastal settings?

Phillip: Organically grown environments are a tough nut to crack. There’s nowhere to hide; every detail has to be accounted for, simply because audiences know very well what this natural scenery looks like in real life. We were blessed to have access to a lot of approved concept art for our environments, which helped a great deal to get a feel for the look Wes was after. We backed this with a ton of reference photography of overgrown forests, ruins, derelict shipyards, etc. and built what we call an art reference pack. It contained all the approved reference photography for each scene that artists could refer to as they modelled plants, dressed foliage, painted textures, and shaded old concrete ruins and rusty ships.

When it comes to plant growth there are a lot of rules, governed by topography, sun exposure, wind, temperature, and erosion to name only a few. A lot of these growth rules can be wrapped into procedural dressing tools that have parameters defining the types of plants that can be rooted in certain areas – or not. These tools allow for a broad, believable first dressing pass of a scene. From there we move onto the shot-specific dressing, again examining some of the dressing placement more closely, adding or removing foliage and working on the framing for the final image. Because we often had to integrate our environments into existing plate photography, the on-set survey capture files and pictures taken from the shooting locations often guided and inspired our artistic choices.
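
A toy version of such a rule-driven first dressing pass might look like the following; the species, thresholds, and terrain attributes are all invented for illustration:

```python
import random

# Each rule gates where a species may root: (species, max slope in degrees,
# minimum and maximum sun exposure). Values are placeholders.
RULES = [
    ("gum_tree",   15, 0.5, 1.0),
    ("fern",       30, 0.0, 0.4),
    ("grass_tuft", 40, 0.2, 1.0),
]

def dress_point(slope_deg, sun_exposure, rng=random):
    """Pick a plant allowed by the growth rules at this point, or None."""
    candidates = [s for s, max_slope, lo, hi in RULES
                  if slope_deg <= max_slope and lo <= sun_exposure <= hi]
    return rng.choice(candidates) if candidates else None

random.seed(7)  # deterministic dressing so re-runs match between passes
points = [(5, 0.9), (25, 0.1), (50, 0.5)]
print([dress_point(*p) for p in points])  # e.g. ['gum_tree', 'fern', None]
```

Layout artists would then refine this broad pass per shot, which matches the two-stage approach described above.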

Unter: The ship graveyard on the beach was a collaboration across disciplines. Kingdom takes place further into the future than the Industrial Revolution is in our past, so the art department had to imagine what present-day rusted ship hulls would look like roughly 300 years from now. Matte painting extended the dunes to include the outer encampment, used DEM data to establish realistic terrain, added wind to grass and tents, and arranged and detailed ship models in various states of decay. Models, lookdev, and layout teamed up to extend existing rock features in the plates into concrete and steel ruins, while creating Proximus’ intricately detailed ship, a vessel larger than any ever made, complete with internal superstructure so that light and animation stayed true when glimpsed through the rotten hull.

How do visual effects artists tackle the challenge of seamlessly integrating CGI environments, such as dense forests or expansive ocean vistas, with live-action footage?

Phillip: Firstly, it was important to ensure that what we generated spoke the same visual language as the scenery and flora in the various locations in Australia. We modelled, textured and shaded a large variety of native Australian plants such as gum trees and bushes, shrubs and grasses. For the actual integration with live action footage, a lot of roto and paint work was required to define natural lines where these CG extensions could be transitioned into. And it was important for Compositors to emulate the characteristics of the lenses used in the film and apply the exact look that the plate photography had to our CG. In order to do that, we profiled each lens on this project to understand what the optical characteristics are, from chromatic aberration and lens vignetting to defocus bokeh shapes and point spread functions – anamorphic lenses are a lot of fun!  

Unter: As another example, we had our work cut out for us in a scene where Mae seeks refuge in the tall grass. We had to create enough plant variations to match the look of the field on location, but not so many that we’d lose the benefits of instancing. We used some predefined ambient motion, especially in the trees surrounding the field, but a lot of shots required bespoke FX simulations to match the complex behaviour of the wind across the field as a whole. And of course the interaction of the grass in contact with the apes had to be simulated per shot. Then depending on the camera angle and the position of CG apes and horses, we chose a split line to keep as much plate grass as possible while still completely embedding our characters into the environment. Finally, compositing would grade the renders to add regional variation.

In creating environments filled with intricate details like swaying trees or crashing waves, what role do advanced simulation software and procedural generation techniques play?

Phillip: For the environments dominated by vegetation, we simulated 1000 frames of loopable ambient wind motion based on Beaufort scale strengths 1-8 for each plant using our inhouse plant solver. Because this produced a lot of data to wrangle, we tapped into the Houdini software package and its procedural dependency graph to manage the motion data, which was then assigned by layout artists during the refined dressing stage and tweaked where necessary. When it comes to water simulation, it’s worth noting that Kingdom contained pretty much every type and state a body of water can be in. Rivers, oceans, waves, tears, waterfalls, tidal waves, you name it. Thanks to the continuous development of our water solving framework, also called ‘state machine’, we could define all the variables, scales, and properties water can have and realistically influence the outcome of its simulation. That’s not to say simulating a complex state of water in a hero FX moment is a quick thing to accomplish, but it does look extremely impressive once you see the final result.
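
One common way to guarantee a seamlessly loopable cycle like that is to build the sway signal purely from components that complete whole cycles over the loop length, so the last frame flows back into the first. The sketch below illustrates the idea with an invented Beaufort-based amplitude; it is not Weta FX’s plant solver:

```python
import math

LOOP_FRAMES = 1000  # a 1000-frame ambient cycle, as described above

def sway(frame, beaufort, cycles=(3, 7, 13)):
    """Branch sway angle (degrees) at a given frame, loopable over LOOP_FRAMES."""
    amplitude = 0.4 * beaufort**1.5          # stronger wind, larger excursions (invented)
    t = frame / LOOP_FRAMES                  # normalised phase in [0, 1)
    return amplitude * sum(
        math.sin(2 * math.pi * c * t + i) / (i + 1)  # whole cycles -> seamless loop
        for i, c in enumerate(cycles)
    )

for b in (1, 4, 8):
    assert abs(sway(0, b) - sway(LOOP_FRAMES, b)) < 1e-9  # loop point matches
print(round(sway(500, 4), 3), round(sway(500, 8), 3))
```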

Unter: That’s right. As an example, our heroes are drenched with ocean water when they travel along a rugged coast at night. Simulating those crashing waves required establishing the right terrain and oceanic conditions to generate additional water that not only matched but also extended the splashes in the plate. Procedural wetness on the rocks helped establish a match to the set where CG water had doused areas not touched by practical water. The cliffs they climb immediately afterwards were created using photogrammetry kit pieces generated from high resolution photography of much smaller cliffs on location. These were then arranged to correlate, as much as possible, to the hand and foot holds used by the actors on a climbing wall. The crashing waves below them feature complex interaction with the coast that was captured with drone footage and projected into place.

Did the visual effects team encounter any unexpected breakthroughs or discoveries during the production process?

Phillip: Nothing that wasn’t planned for or at least somewhat expected. On a project of this scale, you want to keep the ‘unexpected surprises’ to a minimum as you can’t plan for them. You sketch out your technology framework and eliminate as many unknowns as possible beforehand to guarantee a stable process. Having said that, we did course-correct our approach to simulating fire on this show a few months in due to very promising results that our inhouse fire solver produced. It was a bit of a gamble that really paid off in the end.

Nothing seems impossible for Weta FX in the current technological advancements, what could be your next frontier in visual effects technology?

Phillip: That’s a good question, and it’s asked frequently at Weta FX. What is next? A lot of envelopes have been pushed, boundaries have been moved, and new ground has been broken. To me, the refinement and optimisation of workflows is often the key to success, simply because this frees up more time for the artists to be creative and concentrate on the things that really matter.

Were there any memorable moments or scenes from the film that you found particularly rewarding or challenging to work on from a visual effects standpoint?

Unter: There’s a shot at the end of the movie where Eagle Sun lands on Noa’s arm; once you’ve seen their relationship develop across the movie, you realise how much this moment completes their arc together. The sun is low in the sky, and there’s this great lighting that ties them to Mae in time, but not in space. The environment is a completely different location, and all CG, right down to the grass that reacts to the draft of Sun’s wings. This shot evolved a lot over time and I really enjoyed the end result.

Phillip: For me, it’s the scene where Noa arrives back at his burning village. It comprises various takes stitched together into one long shot, combining all the drama and carnage inflicted on Noa’s clan and family into one big, extraordinary display of FX fireworks.

Looking back on the project, what aspects of the visual effects are you most proud of?

Phillip: The sheer amount and quality of work that went into this film overall. As we worked on all the various aspects of it, it was easy to quickly lose sight of how monumental this project was as a whole. After seeing the final release in the cinema when the end credits started rolling, I was reminded once more of the incredible effort every single artist put in to make this epic piece of adventure come true.

Unter: I’m really proud that there were no second-class shots despite the sheer volume of work and epic scope of the film. The team brought so much talent and energy to this project, and that never wavered. And I always love when visual effects supports a great story, in this case that of a compassionate and clever hero. It’s great to be a part of stories like that when they come along.

How long have you worked on this show?

Unter: A bit over a year.

Phillip: 18 months, give or take.

What’s the VFX shots count?

Unter: Just over 1500!

Phillip: 1502, to be precise.

What is your next project?

Unter: I’m actually taking a few weeks off to travel before my next show. I’ve found it an important part of the process to go out and look at the natural world with fresh eyes.

Phillip: I’ll be continuing to promote Kingdom of the Planet of the Apes; the rest is top secret!

A big thanks for your time.

WANT TO KNOW MORE?
Weta FX: Dedicated page about Kingdom of the Planet of the Apes on Weta FX website.

© Vincent Frei – The Art of VFX – 2024

Masters of the Air: VFX Breakdown by Whiskytree

Dive into the world of Masters of the Air with Whiskytree’s extraordinary visual effects. Witness the awe-inspiring recreation of B-17 bombers and the hauntingly beautiful depiction of Paris during World War II, brought to life with precision and artistry:

WANT TO KNOW MORE?
Stephen Rosenbaum: Here’s my interview with Production VFX Supervisor Stephen Rosenbaum.
Apple TV+: You can watch Masters of the Air on Apple TV+ now.

© Vincent Frei – The Art of VFX – 2024

All of Us Strangers: VFX Breakdown by Union VFX

Explore the subtle visual enhancements in All of Us Strangers, thanks to Union VFX’s invisible visual effects that blend seamlessly into the film, adding depth and realism:

WANT TO KNOW MORE?
Union VFX: Dedicated page about All of Us Strangers on Union VFX website.

© Vincent Frei – The Art of VFX – 2024

Beverly Hills Cop: Axel F

Experience the adrenaline of the 80s with this explosive new trailer for Beverly Hills Cop: Axel F! Axel Foley is back and better than ever in this Netflix movie being released today!

The VFX are made by:
Scanline VFX (VFX Supervisor: Micah Gallagher)

The Production VFX Supervisor is Bryan Grill.

Director: Mark Molloy
Release Date: July 3, 2024 (Netflix)

© Vincent Frei – The Art of VFX – 2024

Those About to Die

Prepare yourself for a groundbreaking journey into Rome’s darkest corners. Roland Emmerich’s Those About to Die explores the brutal reality behind the entertainment that captivated an empire:

The VFX are made by:
ReDefine (VFX Supervisor: Izet Buco)
Dream/Machine FX

Directors: Roland Emmerich, Marco Kreuzpaintner
Release Date: July 18, 2024 (Peacock)

© Vincent Frei – The Art of VFX – 2024

The Fall Guy: VFX Breakdown by Rising Sun Pictures

Step into the visual effects world of The Fall Guy, brought to life by the teams at Rising Sun Pictures. From explosive car chases to gravity-defying stunts, every frame is meticulously crafted to deliver a cinematic spectacle like no other:

WANT TO KNOW MORE?
Rising Sun Pictures: Dedicated page about The Fall Guy on Rising Sun Pictures website.

© Vincent Frei – The Art of VFX – 2024

Unfrosted: VFX Breakdown by Crafty Apes

Step back into 1963 with the visual effects crafted by Crafty Apes in Jerry Seinfeld’s Unfrosted. Set against the fierce rivalry between Kellogg’s and Post, watch as their team brings the nostalgic era to life with stunning detail:

© Vincent Frei – The Art of VFX – 2024

Bob Marley – One Love: Roni Rodrigues – Production VFX Supervisor – Outpost VFX

Coming from Brazil, Roni Rodrigues launched his visual effects career in 2011 at BlueBolt, where he quickly made his mark on shows such as Spectre, Fast & Furious Presents: Hobbs & Shaw, News of the World, and The Wheel of Time.

What is your background?

I’m from Brazil and my background is in video editing. I’ve had quite an interesting journey into the visual effects world. I started in the industry back in 1999 as a Line Producer, then in 2001 I became a Video Editor and I was lucky enough to work on films like City of God, Elite Squad and City of Men, which were Brazilian projects.

In 2007 I moved to the UK, mainly to learn English, and I basically started again from zero. My first job in the UK was as a toilet cleaner, and then I became a bartender and a waiter. In 2010 I was accepted for a place at the National Film and Television School to do a Master’s in visual effects.

My first VFX project was The Iron Lady with Meryl Streep, and then I worked on Snow White and the Huntsman and The Monuments Men. I was lucky enough to be part of the team that did The Imitation Game. My career kind of took off from there.

For the past four years I’ve been working mainly as a client-side VFX Supervisor, starting with the Paul Greengrass and Tom Hanks movie News of the World. I’ve been client side since then.

How did you get involved on this movie?

I am part of the Outpost VFX team and they were in touch with Paramount, who were looking for a suitable person to take on the role of Production VFX Supervisor. The Director, Reinaldo Marcus Green, wanted to get someone that wasn’t just able to achieve his vision but also someone with the sort of life experience that could really understand the background of the story. Someone who would have a better connection with the narrative. We all knew it would be a delicate project, not just for the Director but for the studio and for the Marley family – they wanted to make sure that everyone involved had an emotional connection to Bob Marley’s story.

I had talks with the Producers and they both really liked my story, particularly my connection with City of God, and then I had a nice conversation with Reinaldo the Director and we talked about films and my background, but most importantly what I thought I could bring to the project. After all those meetings I was delighted to be invited to be part of the team. I was part of the project from the pre-production phase, so I started back in October 2022 and we went for some scouting in Jamaica.

How was the collaboration with director Reinaldo Marcus Green?

Reinaldo and I had a lot of conversations at the beginning of the project during pre-production. He wanted to make this film feel as real as possible and really wanted to make sure we achieved historical accuracy, not just for the crowd work and the concerts but also the city environments like London and Kingston.

The collaboration was very important because it gave VFX a seat at the table early on, not just with the Director but also with the DOP and Production Designer, which gave us the best chance to replicate history and be super accurate. We did several tests and agreed to shoot as much practically as possible, including shooting elements, references and tiles. Having the DOP on board from an early stage really helped as well because we had more flexibility to collect as much information as possible.

So I would say it was a very collaborative process and a very creative process as well. I think that’s what made this project so special – everyone there was connected by the love of the Bob Marley story and his music. It was like no one was there on this job just because they were being paid – everyone wanted the best for this project and was on the same page. They wanted to tell this story in the right way.

(L to R) Kingsley Ben-Adir and Director Reinaldo Marcus Green

How did you organize the work with your VFX Producer?

From the beginning we had this big responsibility to portray such a well-known figure from recent history. Bob Marley has been dead for over 40 years but at the same time he’s very much alive.

From the beginning, myself and the VFX Producer, Gil James, worked hard to plan out exactly what we wanted to achieve. We started the process with a script breakdown and then we had several conversations and readings with the Director, the Producers and also the Production Designer to make sure we were all agreed on how we’d achieve the film’s visual effects in the most realistic way.

For One Love we had to rely a lot on invisible effects, and sometimes these are harder to plan for than when you’re creating big, otherworldly effects because the audience’s eyes can easily pick up when something that should look like real life doesn’t look real. We started the process of breaking down the script and also going out on location while the Art Department were still building sets out in order to do several element tests. I put together loads of reference to be able to show the Director and the DOP – the whole idea was to narrow down the vision.

How did you choose the various vendors and split the work amongst them?

From the beginning we decided to work with Outpost as a sole vendor. Initially this film was supposed to have 300+ VFX shots, largely focusing on the concert sequences with crowd, the vision sequence and some work on the Trenchtown environment. As we started shooting the project grew a lot. For us the priority was to give maximum flexibility to the Editorial team, we wanted them to have the freedom to be creative. The project grew from around 300 shots to over 700 shots. With the show growing so quickly and with the deadline remaining the same we decided to share out some of the work with other vendors. In this way we could make sure Outpost could focus on all the big sequences and hero shots and focus on creating stellar work. A lot of the additional work was clean up, beauty work, some de-ageing, which is all still demanding work and requires a lot of skill but was the easiest to place with other vendors so Outpost could focus on the major beats.

We took a lot of things into consideration when choosing our additional vendors. We had an in-house team run by Crafty Apes that could work closely with myself and the Director during the making of the Director’s cut. It was also useful because where appropriate they could create temps and send those to Outpost to work up the final shots based on that immediate feedback from myself and the Director.

We also worked with a company called Exceptional Minds that employs people on the autism spectrum, which we saw as a way to have a positive impact and their work was superb. It’s a wonderful academy and studio, and they offer a really inclusive route into the industry for people with autism. I tried to work as closely with them as I could during production to give them creative direction and support.

We did some work with Midas VFX – they specialise in beauty and de-ageing, so we gave a good chunk of beauty work to them. We also worked with Soho VFX over in Canada.

What is your role on set and how do you work with other departments?

I would say my role on set as a VFX Supervisor is to make sure the collaboration and communication works well and is aligned across the whole project. My responsibility isn’t just to stay on set and make sure we shoot plates correctly and help the Director and DOP complete principal photography in a way that’s ready to receive visual effects, but also to keep working closely with other departments.

We were constantly in communication with the Art Department, making sure we had everything practical that we needed to achieve the end result. We were always working closely with the SFX department to collaborate with them on all the different effects in the movie, the atmospherics and the explosions.

I also spent quite a bit of time with the Camera Department to understand their lens choices and also advised them where I could on composition for certain shots to ensure the plates we had to work with later were VFX-friendly.

The amount of collaboration during a production like this is significant. I had to collaborate with the Wardrobe Department to ensure costumes worked with the blue screens we were using and for flexibility when dealing with crowd elements. The Makeup Department were in constant communication with me to make sure wigs and makeup worked well on camera and in post.

We dealt a lot with the Extras Coordinator because we had to spend a lot of time directing extras for the various crowd sequences. We had a very close collaboration with the Location crew – their job is absolutely crucial and it’s important for someone like me to work closely with them, especially on a project like this where we were dealing with many units. We had teams going out to several different locations and collecting footage, LIDAR scans and more. I really believe that the more VFX can collaborate with other departments the better the final result will be.

How did the use of invisible visual effects contribute to the immersive depiction of Jamaican landscapes and London set in the 1970s?

From very early conversations with the Director we knew that this was the sort of project where we had to always think about authenticity. The Marley family was involved in the production as well, and everyone was committed to making the film feel as accurate as we could. We wanted to send the audience back to the 70s, to these iconic concerts and events.

We did a huge amount of research into London and Jamaica in the 70s. We had thousands of pictures and videos of Trenchtown from that era. We did the same with London, and we also managed to rent the London house that Bob Marley stayed in at the time, to be as close to reality as possible.

We studied the individual concerts we were representing on screen and the venues that they took place in. We managed to re-create these iconic venues that don’t exist anymore, like the Paris Pavilion. We rebuilt this venue digitally, paying attention to the smallest details. We did the same in London with the Rainbow Theatre, which was a very famous venue. Nowadays that venue has become a church, so through using a lot of photos and archive footage we re-created every detail of that venue with VFX.

London was probably more intense in terms of the invisible VFX that were required to send audiences back in time. There were some changes we made to Trenchtown and we also needed to paint a lot of things out, but it has developed a lot less than London since the 1970s. We had whole sequences in London where the skyscrapers in the background didn’t exist back in the 70s.

We also used London as Paris as well, so we didn’t just have to make London period-accurate, but we also had to use VFX to change London into Paris, including changing the River Thames to look like the Seine.

What challenges did you face in accurately recreating the ambiance and architectural nuances of London?

London is obviously a very recognisable and iconic city, and we were fortunate to shoot on location there as it meant there was less of an overhead in terms of building environment extensions. There was a lot of work to do in making modern London look like the 1970s, however. There are many shots where we’ve had to remove the modern London skyline as it’s changed so much since then, and then there was the usual overhead of paying attention to road markings, period-appropriate vehicles and much more. There are many shots in there where we had to use CG vehicles to populate the roads. For the Rainbow Theatre, an iconic venue that closed back in the 80s, we were able to shoot the building as it is now and then worked from reference to replace all the signage so it looked how it did when Marley played there in the 70s, and then we added CG vehicles and integrated crowd elements to show people waiting to go into the concert. The fact that the building is still standing made it easier for us to achieve the authenticity we were looking for.

What was the most complicated location to recreate?

I would say the One Love show Marley played when he returned to Jamaica following his exile in London. The venue for the show was the Jamaican national stadium, Independence Park. We weren’t able to shoot anything at the real stadium so instead we shot the sequence at a big, open park, and we had to use VFX to re-create the stadium in as much detail as possible. We went to Independence Park with our LIDAR team who did drone scanning as well as LIDAR. As an environment that was the most complex to create because we also had to adapt the data we got as the stadium has changed a lot since the 70s – once again we had to use archive footage to understand things like what the scoreboard should look like, what the floodlights looked like. The level of detail was incredible. Obviously the concert happened at night so we lost some of those details due to the lighting, but it was really important to us that we were as accurate as possible. We also had to take into account a crowd of around 30,000 people. When we shot the plates we had about 100 to 150 extras available for crowd rep in other sequences, but the lighting we went with to shoot the plates ended up not matching the lighting we wanted to use for that sequence specifically, so we had to rely entirely on CG crowd simulation for this concert.

Could you elaborate on the creative challenges involved in rendering for a compelling bushfire scene?

That sequence was always a very delicate sequence for the Director. He wanted to give the audience an abstract glimpse into Bob Marley’s background and how that affected the person he became. The Director decided to use Jamaica’s famous sugar cane fields and a process that they still use today where they burn the sugar cane, and it burns super quick. The tricky part of that sequence was we had to have a horse, a rider, a child, a sugar cane field and loads of fire. The whole idea was that the kid was running through the sugar cane field and arrives in a clearing, where it becomes a ring of fire. We knew from the beginning we’d never be able to shoot that practically due to health and safety issues, but also because we wouldn’t have the flexibility to position the camera exactly where we wanted. We decided to work in collaboration with the SFX team to use some gas bars in very specific places in this clearing. The clearing didn’t exist in reality and we asked the Art Department to build two walls, one in front and one behind the boy, and then we replaced these digitally later. The smoke and the fire from the gas bars were crucial for us to generate good lighting reference. We strategically placed them at a safe distance from the child actor but it still gave us a nice lighting wrap on the kid, the horse and the rider. We’d then move the gas bars according to the DOP’s framing to make sure we got a realistic result. It then helped us to replicate and digitally extend the rest of the fire and create an asset of a perfect ring around the performers.

One weekend while I was in Jamaica and had a small amount of downtime, I managed to wrangle two cameras and went with a visual effects team to burn a sugar cane field for real, which is a normal process – the farmers need to burn the field before collecting the sugar cane anyway. It was an amazing process. We had two fields, each about 200 metres long, and the farmers set fire to them. In a matter of minutes the whole field is completely on fire, but it only takes about 15 minutes for the fire to burn all the dry leaves and then extinguish itself. It’s a really well controlled fire and normal practice.

It was crucial for us to have two cameras shooting two fields burning so we had enough data. We managed to get a lot of practical embers, smoke and fire, and this was incredibly helpful for shots where the fire was in the distance and we could use the footage as elements. It was also great footage to give to the FX crew at Outpost as they could replicate it in Houdini.

What animation techniques were employed to simulate the dynamic behavior of flames and smoke in the bushfire scene?

As I’ve discussed we had the results of a number of element shoots that gave the team at Outpost a good foundation to work from, but where we needed to expand the fire and control its behaviour, we worked with the FX team to develop and integrate fire and smoke simulations out of Houdini. This was especially important when we were working on the wide shot where the fire extends out into a ring around the boy and the horse and rider.

How do you tackle the challenge of creating and animating realistic crowds to fill stadiums and concert halls, while ensuring each individual appears unique and believable?

The Director was determined to make the crowds feel as real as possible, so we decided to approach it in two ways. We had the photography, where we shot up to 200 extras dancing to the right music that would align with what was playing in the film. Then after principal photography we asked the Camera Department to shoot all of this again but with tiling, so we could shoot the extras, then shuffle them and move them out of place, before shooting again with the same camera movement. This allowed us to have a foreground of real crowd rather than just using a digital crowd. As well as this we also set up some blue screens on the studio stage to shoot some 2D elements. We worked with the Lighting Department to replicate the same lighting that was used during principal photography and used the same music so the dancing was consistent. We shot elements of single people, of groups of two or three people, and then a bigger crowd. We had two cameras rolling: one on the ground at eyeline, and another pointing 45 degrees down, which was perfect for integrating into crane footage.

Once we had the principal photography and the 2D elements, we had enough footage in most sequences for the foreground and the midground to be filled with real people, which was the realistic result we were looking for. Then we started to look at digi doubles and crowd simulation for the background crowd. We invited Visual Skies to join us on the project to do a lot of the LIDAR on the venues, but when it came to the crowd we built a cyberscan booth in London where we managed to scan all the extras, and then we asked them to join us in Jamaica so we could do the same thing with the extras there so we had enough scans to build diverse crowds based on where the concerts were taking place. We also wanted to make sure the animators had the best starting point when it came to animating all the dancing, clapping and idling, so we worked with the mocap company Centroid and they spent the whole day recording several types of performance that we could use for our digi doubles.
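
To illustrate how a background crowd might be assembled from such scan and mocap libraries, here is a small sketch – our assumption, not the production setup – where each agent deterministically derives its body, outfit, and motion clip from a seed, so the crowd re-renders identically while still looking varied; the library contents are invented placeholders:

```python
import hashlib

BODY_SCANS  = ["scan_%03d" % i for i in range(40)]
COSTUMES    = ["kingston_a", "kingston_b", "london_coat", "london_scarf"]
MOCAP_CLIPS = ["dance_loop_01", "clap_loop_02", "sway_idle_03", "cheer_04"]

def build_agent(agent_id, shot="sh0100"):
    """Deterministically assemble one crowd agent for a shot."""
    seed = int(hashlib.md5(f"{shot}/{agent_id}".encode()).hexdigest(), 16)
    return {
        "body":   BODY_SCANS[seed % len(BODY_SCANS)],
        "outfit": COSTUMES[(seed // 7) % len(COSTUMES)],
        "clip":   MOCAP_CLIPS[(seed // 11) % len(MOCAP_CLIPS)],
        "clip_offset": seed % 240,  # desync loops so neighbours don't move in step
    }

crowd = [build_agent(i) for i in range(30000)]    # ~30,000-seat stadium
print(crowd[0], len({a["body"] for a in crowd}))  # same result on every run
```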

In terms of lighting, how do you manage to create dynamic lighting within virtual crowds?

Like most things, this began with reference so we could understand what the Lighting Technicians were doing when Bob played these concerts back in the 70s. There was then the task of re-creating these techniques practically on set, and then matching this with CG lighting that could interact with our crowd agents in the same way it would if it were all real.

Can you discuss any specific scenes where visual effects were particularly challenging or innovative?

There is a montage sequence in the movie where the Director wanted to show Bob Marley’s tour after the release of Exodus. The whole idea was to shoot several concert shots, but then we needed to create an environment for the whole European tour that went to places like Denmark, Germany, London and Paris.

The main issue we had was we wanted to be consistent with the real venues that Marley actually played on the tour, but we shot everything in the same place due to schedule issues. We shot all of this at Alexandra Palace, London. We decided to shoot on blue screens but with a lot of atmospherics, which was very challenging, and then we needed to create different environments for each of the different venues around the world to a very high level of detail, which were then used in a single montage-style shot.

Because it’s meant to look like the 70s, we found it a challenge to find a lot of archive photos of all these different venues to use as reference. It was a lot of work to create those venues so they looked different enough to create the sense of montage, as well as maintaining that high level of accuracy that we were trying to achieve.

Looking back on the project, what aspects of the visual effects are you most proud of?

I really like the result we achieved for the vision sequence. I think the fire on the sugar cane field looks absolutely gorgeous and we managed to capture the idea of the vision with beautiful lighting and realistic fire. I’m very proud of this sequence, the team at Outpost did a fantastic job.

Tricky question, what is your favorite shot or sequence?

My favourite sequence was the One Love concert in Jamaica where we created Independence Park. There was a lot of work involved in bringing that to life and it made such a big impact on the final film.

How long have you worked on this show?

I was working on One Love from October 2022 so it ended up being a little over a year in total.

What’s the VFX shots count?

We created just over 700 shots for the movie.

What is your next project?

I’ve got a few potential projects lining up at the moment but I can’t discuss them just yet!

What are the four movies that gave you the passion for cinema?

The movies that had the biggest impact on me were City of God; Gulliver’s Travels because it really opened my eyes to visual effects; Jason and the Argonauts was so impressive and had a big impact on my childhood, especially the animation of the skeletons; and the fourth one would be The NeverEnding Story. I was just a child when I watched that and it was so magical.

A big thanks for your time.

WANT TO KNOW MORE?
Outpost VFX: Dedicated page about Bob Marley: One Love on Outpost VFX website.

© Vincent Frei – The Art of VFX – 2024

Twisters

Dive into the whirlwind of filmmaking with this featurette, taking you on a thrilling journey behind the scenes of Twisters:

The VFX are made by:
ILM (VFX Supervisors: Florian Witzel, Charles Lai, Bill Georgiou)

The Production VFX Supervisor is Ben Snow.

Director: Lee Isaac Chung
Release Date: July 19, 2024 (USA)

© Vincent Frei – The Art of VFX – 2024

House of the Dragon – Season 2

Unveil the magic of House of the Dragon with this new featurette that dives deep into the visual effects of episode 3, with insights from VFX Supervisor Dadi Einarsson and VFX Producer Tom Horton:

The VFX are made by:
Pixomondo
Weta FX
Rodeo FX
RVX
Zoic Studios
Red Visual Effects

The Production VFX Supervisor is Dadi Einarsson.
The Production VFX Producer is Tom Horton.

Creators: Ryan J. Condal, George R.R. Martin
Release Date: June 16, 2024 (HBO)

© Vincent Frei – The Art of VFX – 2024
