With a career spanning 35 years, Stephen Rosenbaum's journey in visual effects began at ILM. Over the years, he has lent his expertise to esteemed studios like Sony Pictures Imageworks, Weta FX, and Cinesite, working on many films such as Contact, The Perfect Storm, Avatar, and Kong: Skull Island. His mastery of the craft was duly recognized with two Oscars for Best Visual Effects, honoring his contributions to cinematic milestones such as Forrest Gump and Avatar.
Can you provide an overview of your background in visual effects and how it led to your involvement in Masters of the Air?
I started my career as an Assistant TD (the first!) at ILM in the early days of the Computer Graphics department (The Abyss, Terminator 2, Jurassic Park, Forrest Gump), and eventually worked my way up to becoming a VFX Sup. Over the last 20 years, I’ve bounced between Supervising at VFX vendors and on the studio side as a Production VFX Sup, mostly focused on character animation and digital human projects such as Avatar, Jack The Giant Slayer, Michael Jackson, Kong Skull Island, and an ABBA music video. We heavily leveraged Virtual Production workflows on those projects, so when Masters of the Air started Development, Production called and asked me to assess whether using Virtual Production could benefit this show.
How was the collaboration experience with the showrunner, directors, and what were their expectations and approaches towards the visual effects?
For the nine episodes in the series we had four sets of directors. We also had the benefit of the writer John Orloff being present with us in the UK during the yearlong Prep and Shoot. Gary Goetzman is the Showrunner, and through his magical expertise he worked with the directors to maintain the character through-lines that Orloff built upon with each successive episode.
Armed with Orloff’s scripts and vast and comprehensive research bible, Gary entrusted me to work with the directors to choreograph Previz for the various Mission sequences so that they were not just historically and technically accurate, but also fun to watch. Typically the directors would begin by sketching storyboards, and then I would interpret these boards with our Previz Sup Matthew Smart at The Third Floor. TTF would then start blocking the action and we would regularly review and refine shots into dynamic action sequences. Once we had a sequence that was a visually exciting representation of the scripted narrative, I would review with the directors, get notes and then discuss with Matt on implementing updates. All production departments would then reference the approved Previz as a guide to prep their work and be ready for shooting.
How did you coordinate with your VFX Producer to organize and execute the visual effects work for Masters of the Air?
While prepping and shooting in the UK with our VFX Producer Leslie Hough, the scope of the show increased substantially, and as a consequence so did the schedule. As often happens, once we translated a few script pages into a Previz sequence that accounted for complex action with characters spread between multiple planes and multiple compartments within those planes, and then added coverage angles and wide “geography” shots, we suddenly had to involve lots of actors who had to be shot on location and on different sets on multiple stages. Coupled with filming during the height of COVID, with strict crew interaction protocols, inevitable outbreaks, temporary shutdowns, and the collateral stress of working under these conditions, we experienced a number of extensions and delays to our shooting schedules. Also due to COVID, planned location shoots were pulled back to local, simpler environments augmented with whatever the Art Department could build as a partial set in the foreground. The VFX work for the show grew exponentially, so we had to continually expand our on-set support teams, extend our delivery timelines, and just roll with the inevitable drift in prospective vendors who had the capacity and capability to take on this kind of complicated VFX work.
As we moved into Post, our VFX Producer Bruce Franklin and I discovered another problematic effect of COVID: having to manage and direct an entirely remote workforce of over 1,000 VFX artists. In the past, we would always visit a vendor, spending time at their office to interface directly with the supervisors and the artist teams. Now, because nobody was physically in the offices, all communications were done via numerous daily and nightly virtual meetings with individual artists scattered across time zones ranging from India and Europe to North America and Oceania. This “new normal” style of production management often caused misunderstandings about work expectations, but we tried to stay supportive and acknowledge our vendors’ own internal struggles to keep well-intentioned artists productive and on track.
Could you explain the process of choosing and assigning work to different vendors for the visual effects?
Prior to starting Post, we rationalized that it made the most creative and financial sense to have one of our large vendors (DNEG) undertake all the aerial work. The logic was that once they figured out how to make a B-17 look real, and mastered the formation layouts and flight dynamics, the work would look more consistent and be produced more systematically, ultimately saving us from paying another vendor to go through the same development process. As the VFX shot count grew by more than 50% and remote work efficiencies lumbered, we realized the need to bring on more vendors. We identified certain Missions and other plane work on the ground that we could divert to Rodeo FX, Weta FX, and Whiskytree. We were able to pre-bake generic animations (e.g. spinning propellers) into Alembic files and share plane assets to a limited extent, but most of the model topologies, rigs, and shaders had to be reconfigured to work within each vendor’s pipeline.
Where were the various sequences of the series filmed, and how did the locations impact the visual effects work?
The 100th Bombardment Group was stationed at Thorpe Abbotts in East Anglia. There’s a museum at the original site, but the airfield itself is now farmland, so we shot the exterior airfield scenes at Abingdon Airfield in Oxfordshire. The Art Department built the Thorpe Abbotts Control Tower, a few surrounding buildings, and some hardstands at Abingdon, and we were able to stage a lot of scenes in this area of the airfield. VFX would then extend shots where we needed to see the rest of Thorpe Abbotts. We did pay attention to the historically correct squadron parking locations, so when necessary we would correct the building, hardstand, and taxiway layouts to match Thorpe Abbotts.
The more significant challenge both the Art Department and VFX encountered was the cancellation of shooting in more comprehensive and camera-ready European locations because of COVID restrictions. Instead, the vast Stalag Luft POW camps and various German and Belgian villages had to be partially built by the Art Department at a local airfield and then extended by VFX.
For terrain environments and towns seen from the air, sometimes we had the benefit of multi-camera array background plates, but most of the time the German, French, Italian, Swiss, Dutch, and Greenland environments were fabricated entirely as CG terrains, regardless of altitude.
All the aerial shots where we see a foreground character close-up were shot on stage against an LED wall, and then inter-cut with exterior angle plane extensions or wider, all CG shots.
How did the collaboration between the visual effects team and other departments, like art direction and cinematography, contribute to the overall aesthetic of the series?
The enormous number of characters, complex action, and numerous environments across the nine episodes was the equivalent of producing three large scale movies back-to-back (and at times, on top of each other!). Fortunately, every department on the production had many years of big show experience and proven expertise, so with no insecurities and egos to get in the way, we all synced very quickly and collaborated throughout the design, planning, and shooting schedules. As the saying goes (though it’s often neglected), “It’s all in the Prep.” The overall rich aesthetic and production value presented throughout the entire series has its foundation in having been allowed to do a proper Prep. The tone was initially set by our Production Designer Chris Seagers and the Art Department, and with our daily scouts and solution-oriented discussions with Stunts, SFX, Grips, Makeup, Costume, Props, Locations, and our invaluable researchers, we were able to sensibly distribute our workloads and support each other along the way. We would present our ideas and plans to each episode’s director, DP, and AD, and naturally they would want to adjust some things to their schedule and style of shooting, but for the most part, the solid trust and positive spirit within the Prep periods directly influenced a consistently high quality look across the entire series.
The B-17 Flying Fortress is a significant element in the series. How did your team bring this iconic aircraft to life, and what specific technologies were employed for this?
Most of the B-17’s flown in WW2 were either B-17F or B-17G models. The Olive Drab F’s preceded the natural metal G’s, and there were some subtle structural differences between the F and G models, most notably the substitution of the single interior Nose gun on the F with twin 50 caliber “chin” guns mounted externally underneath the Nose on the G. Stuart Heath and BGI Supplies built two full scale model B-17F’s for the show. They had access to the original Boeing construction drawings, so the exteriors were precise and perfectly detailed down to the last rivet. Neither model was flight worthy, but one had electric motors attached to the wheels so it could taxi and navigate a hardstand. The other could be used as a background parked plane. For the CG B-17’s, we LiDAR scanned the practical models and were also able to LiDAR scan a few real B-17’s (both F and G models), including the famous (B-17F) Memphis Belle, to compare (very) slight differences against the full scale production models. We also took comprehensive texture photos and shot HDRI’s under varying natural light conditions. There was nothing technically special about the CG rigs built for the B-17, with the exception of getting the propellers to spin at the right RPM’s and look correct. The wing and cowl flaps, bomb bay doors, wheels, landing gear, gun turrets, and guns all moved to the true B-17 specifications.
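As a rough illustration of why spinning propellers at the right RPM’s is trickier than it sounds (the RPM and frame-rate numbers below are illustrative assumptions, not production values): at film frame rates a fast prop turns a large fraction of a revolution between frames, and because a three-bladed prop is 120° symmetric, the motion the audience perceives is only the remainder of that rotation modulo 120°.

```python
# Sketch: apparent per-frame rotation of a 3-bladed propeller at 24 fps.
# RPM and fps values here are illustrative assumptions, not production numbers.

def rotation_per_frame(rpm: float, fps: float = 24.0) -> float:
    """True degrees of rotation between two consecutive frames."""
    return rpm * 360.0 / (60.0 * fps)

def apparent_rotation(rpm: float, blades: int = 3, fps: float = 24.0) -> float:
    """Perceived per-frame rotation: blade symmetry folds motion into 360/blades degrees."""
    symmetry = 360.0 / blades
    return rotation_per_frame(rpm, fps) % symmetry

# A prop at 1000 RPM really turns 250 degrees per frame at 24 fps,
# but a 3-bladed prop only *appears* to advance 10 degrees per frame.
print(rotation_per_frame(1000))  # 250.0
print(apparent_rotation(1000))   # 10.0
```

This aliasing (the “wagon wheel” effect) is why a physically correct RPM can still read as wrong on screen, and why RPM and motion blur have to be tuned together.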
Generally for ground-based shots, our job was to populate hardstands, taxiways, and runways with CG B-17’s, whether static or moving. For render efficiency, we built four different model (poly count) resolutions to lay out based on distance to camera, and usually had at least one real B-17 model to match somewhere in the shot. Neither of the model B-17’s had working propellers, so we would typically remove the physical propellers and add CG versions for shots that required them to be spinning.
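A distance-based switch between the four model resolutions can be sketched as follows (the threshold distances and resolution names are hypothetical, purely for illustration):

```python
# Sketch: choosing one of four B-17 model resolutions by distance to camera.
# Threshold distances (in meters) and LOD names are made-up illustrative values.
LOD_THRESHOLDS = [50.0, 250.0, 1000.0]          # boundaries between the 4 LODs
LOD_NAMES = ["hero", "high", "mid", "distant"]  # hypothetical resolution names

def pick_lod(distance_to_camera: float) -> str:
    """Return the cheapest model resolution that still holds up at this distance."""
    for threshold, name in zip(LOD_THRESHOLDS, LOD_NAMES):
        if distance_to_camera < threshold:
            return name
    return LOD_NAMES[-1]  # beyond the last threshold: lowest poly count

print(pick_lod(20.0))    # hero
print(pick_lod(3000.0))  # distant
```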
All shots of any plane taking off, flying, and landing in the show are CGI. Bummer, I know, but we could not use real planes because of the time suck we would have encountered to rehearse with multiple planes, set up, shoot a take, then re-set start positions, make adjustments, and shoot another take, and another, etc. With several hours of needed action and coverage for the series, combined with very tight production schedules, it was just not a practical option, especially in England where light and weather conditions change five times over any given day. Also, the remaining flight-worthy B-17’s are now too expensive to insure and fly over numerous weeks with no certainty of what would be usable by the end of the shoot schedule. We did manage, however, to squeak in one day of shooting a single real P-51D just to placate our own desires, and it was wonderful for all to watch (and hear!).
Eventually we see the B-17G’s in action, but because of schedule restrictions, other than a silver paint job, we were not able to modify the existing F models to look like G models, including the tell-tale chin guns. We chalked this up to a conscious production compromise where we wish we could have made the changes, but time didn’t allow for it. Once we established these silver “G’s” using the real full scale models at the airfield, we were then committed to that look for continuity in later scenes when they’re airborne. So when we see the “G” models, they will (sadly) look like silver model F’s, because that’s what they are. I can rationalize it by arguing that historically there actually were a limited number of silver B-17F’s put into service before the B-17G’s arrived, so technically it could be accurate, although not really what we wanted to show.
For interior angles, sectioned set pieces of the B-17 were built and detailed to perfection also by BGI. These set pieces were positioned on a variety of huge motion bases designed and animated by SFX Sup Neil Corbould and his team. Using a manual hand controlled Waldo device to drive the motion base servos in real time, a SFX technician could effectively “fly” the set piece and react to action while watching the content played-back on the LED walls. So for example, if there was a flak explosion directly off the port side pilot window, in response the technician would jolt the plane toward starboard. This bolstered actor performances, and gave interior set details life as things jiggled and swayed in the background.
In terms of research, how did your team ensure the authenticity of the details related to aircraft and air battles during the Second World War?
We went into the project with a mandate to get it right. Every decision we made had to account for being technically and historically accurate. There are definitely some creative cheats in there, but I could confidently say that 95% of it (and there’s A LOT of “it”) is faithfully correct, even down to plane serial numbers and squadron ID’s. Preemptive callout: Yes, I know “Wild Cargo” should not have nose art at the point in time of the Episode 1 Mission. It was a creative choice. But “Our Baby” is correct!
Everything started with Donald Miller’s tome “Masters of the Air” and John Orloff’s scripts and research bible. We also had a dedicated researcher and a B17 technical advisor present throughout the Prep and Shoot. From them and watching thousands of hours of historical video reference, listening to oral histories, and my own scouring of YouTube and websites like the 100th Bomb Group Foundation, I learned nearly everything there is to know about the 100th Bombardment Group, a B17, its payload types, the Norden Bombsight, and its airmen. I studied combat formation flying, Luftwaffe fighters (Bf109, Bf110, Fw190, and Ju88) and their attack strategies, and the P-40, P-47 and the pivotal P-51 Mustang counteroffensive. I also studied German flak guns (typically 88mm) and how flak can devastate the B17’s and their crew.
As a funny story, one of my first encounters with one of our tech advisors, Taigh Ramey, was him confronting me early in Prep, begging me not to have the front of an engine on fire. He explained that “Hollywood always gets this wrong,” and that we would almost never see fire shooting from the front part of an engine. “There’s no fuel there,” he would exclaim, “It’s in the wing!” Of course, we unavoidably failed a few times on this one, but only because of how the action was cut together and needing to see fire from a pilot POV, not because we weren’t aware it was wrong. I bet he will still curse at me through the TV screen.
Aerial combat scenes are crucial in Masters of the Air. Can you share the challenges of recreating authentic and visually striking aerial combat?
My approach to choreographing the Mission sequences was to ensure, first, that the action was physically plausible and, second, that it was fun to watch. We made sure the American and German planes were traveling at realistic air speeds, and flight dynamics were accurately simulated. There are certainly some creative embellishments, however, such as how many German planes one might glimpse in a given shot when a B17 is traveling in one direction at 180 mph and a Bf109 is traveling at 320 mph in the opposite direction. Also, making fire, explosions, and gun muzzle blasts more visually exciting than what an accurate simulation would actually let us see can all be blamed on needing to maintain the “energy” of a sequence.
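To make that head-on pass concrete, here is the closure arithmetic (the speeds come from the interview; the pass distance and frame rate are illustrative assumptions):

```python
# Sketch: how briefly a head-on fighter would really be on screen.
# Speeds are from the interview; pass distance and fps are illustrative assumptions.
MPH_TO_MS = 1609.344 / 3600.0  # miles per hour -> meters per second

def closure_speed_ms(speed_a_mph: float, speed_b_mph: float) -> float:
    """Closing speed of two aircraft flying directly at each other, in m/s."""
    return (speed_a_mph + speed_b_mph) * MPH_TO_MS

def frames_visible(pass_distance_m: float, closure_ms: float, fps: float = 24.0) -> float:
    """Number of frames the fighter takes to cover the visible pass distance."""
    return pass_distance_m / closure_ms * fps

closure = closure_speed_ms(180.0, 320.0)         # B-17 vs. Bf109, head-on
print(round(closure, 1))                          # ~223.5 m/s
print(round(frames_visible(100.0, closure), 1))   # ~10.7 frames for a 100 m pass
```

At a combined 500 mph, a fighter crossing even a 100-meter window of visibility is gone in well under half a second, which is why a literal simulation of such passes reads as almost subliminal on screen.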
Because there were always numerous planes in the sky, with the B17’s typically navigating a more linear path and German fighters attacking from various angles at great speeds, we liberally used contrails to help the audience track the action more easily. We also tried to ground the camera positions in or on a plane as much as possible, and avoid magic camera moves that travel through space at hypersonic speeds. I felt it was much more exciting to experience the plane ride and be immersed within the action, not just a gratuitous observer to it. Editorially, however, we would sometimes need to pop wide for a geography shot to remind the audience of the state of things, but for the most part, we made a point of staying with just a few key characters (and never German ones) so the audience would not get confused or lost as the action evolved.
One of the main defensive strategies for the B17’s flying on a Mission was to maintain a tight flying formation known as a Combat Box. Each B17F Flying Fortress had ten 50 caliber guns, which multiplied to at least 180 guns when formed into a defensive Box arranged in a minimum of three squadrons of six planes each, totaling 18 (sometimes more) B17’s. The Luftwaffe had a variety of attack techniques, the most well known being attacking from a 12 o’clock high position, but they had many other effective strategies, including boldly flying straight through a formation in an effort to draw friendly fire and, ideally, split off planes on the outer perimeter of the lower squadron to isolate them for a follow-on attack. I read one account from a German pilot that flying through a Combat Box of B17 tracer fire was like flying through a water sprinkler of bullets.
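The Combat Box firepower figures quoted above follow directly from the formation math:

```python
# Sketch: minimum defensive firepower of a Combat Box, using figures from the text.
GUNS_PER_B17F = 10       # ten .50 caliber guns per B-17F
PLANES_PER_SQUADRON = 6
SQUADRONS_PER_BOX = 3    # minimum box

planes = PLANES_PER_SQUADRON * SQUADRONS_PER_BOX
guns = planes * GUNS_PER_B17F
print(planes, guns)  # 18 180
```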
In generating exciting Previz action sequences, it’s all about storytelling, and I had the good fortune of having PJ Harling as my editor. Daily, he would help me assemble individual Previz shots, continually adjusting timings and cadence, and ensuring that the action flowed, built upon itself, and was easy to follow.
Filming on LED screens is innovative. Can you explain how it differs from traditional methods and how it impacted the visual effects process in Masters of the Air?
Traditionally, VFX shots of actors in planes are done either against a bluescreen or greenscreen, or with camera angles aimed upwards showing just infinite sky. The issue with these approaches is that the actors don’t really know where to look or how to perform to the surrounding action (other than to some VFX geek like me waving a tennis ball on a stick). Lighting is more restricted because there’s a giant green or blue backing in the way, often reflecting an undesirable blue or green cast on the actors and set. Composing reverse camera angles looking over the actors, out the windows at the surrounding action, is nothing more than a best guess at where the action will eventually be animated into the shots. Directors, DP’s, and actors generally don’t like it because it presents a very limited understanding of the unseen action, and that usually translates to flat performances and boring action sequences. The upshot is that it now more commonly results in excessive all-CGI conjured shots during Post as an editorial effort to compensate for uninspired action sequences. Also, editors initially won’t have the surrounding animations to cut with, so they have to make educated guesses in connecting foreground actor performances with background action, timings, and continuity. Really, the only positive of shooting against a green or bluescreen is that it provides the VFX compositors with solid matte extractions. This is significant and not to be quickly dismissed. Think about extracting clean mattes for the window glass and airmen reflections, dirt, lens flares, and of course motion blurred edges. Not having a way to cleanly extract these mattes as we could with a good blue or greenscreen means articulate roto for everything, and then adding back window reflections as CG elements in the final composite; a potentially big creative and costly compromise.
Alternatively, employing a Virtual Production workflow during a shoot provides real time (proxy) visual effects to the production crew, and one toolset of VP we decided to leverage on MOTA was to install giant LED walls in place of where a green or bluescreen would have otherwise been positioned. One of our main LED “volumes” was a nine meter high by 25 meter diameter horseshoe-shaped vertical wall and ceiling of seamless LED panels. We placed a ten ton, 6-axis motion base in the middle that could “fly” the Cockpit and Nose compartments of a B-17 set piece. On the walls we would play back Previz content through Unreal Engine, which gave the directors control in tweaking or even re-staging the surrounding action. The actors could always see and react to everything out the windows, and the DP could dynamically control the lighting and look, including weather conditions and interactive FX. And the cameras understood exactly how to compose the shots to the entire action. Additionally, the editors were receiving temp composited plates that effectively represented the directors’ vision, which allowed them to more confidently cut the action together more quickly and get VFX working sooner.
We knew from the start there was no time during Prep to design, Previz, and then light, render, and comp more than five hours of final pixel content, nor would it have been a wise spend considering there were multiple directors on the series who would have to be involved with approving the work before it could be used on the Walls. Doing this while they were madly prepping for the rest of their live action shoots was not logistically feasible, so we committed to rotoscoping most of the plates and adding window reflections in Post. Fortunately, there are new machine learning tools that can auto-rotoscope a high percentage of a given shot, and this saved a lot of otherwise manual effort. And in some cases the focus was so shallow that the backgrounds out the windows were soft enough that one couldn’t tell a Previz exterior from a VFX composited background, so where we could, we took the win and just used the plates as they were shot.
Given the scope and complexity of action throughout the episodes, in the end, using the Walls offered a far more creatively flexible approach to shooting.
A big thanks for your time.
WANT TO KNOW MORE?
DNEG: Dedicated page about Masters of the Air on DNEG website.
Rodeo FX: Dedicated page about Masters of the Air on Rodeo FX website.
Weta FX: Dedicated page about Masters of the Air on Weta FX website.
Apple TV+: You can watch Masters of the Air on Apple TV+ now.
© Vincent Frei – The Art of VFX – 2024