Let’s go for a ride through the frozen world aboard the train of SNOWPIERCER with FuseFX, who unveil their work on the series:
© Vincent Frei – The Art of VFX – 2020
Let’s take a trip to a galaxy far, far away with this VFX Breakdown by Pixomondo about their work on THE MANDALORIAN:
© Vincent Frei – The Art of VFX – 2020
Australian studio Rising Sun Pictures takes us onto the battlefield of Shanghai with this VFX Breakdown about their work on the epic Chinese film THE EIGHT HUNDRED:
WANT TO KNOW MORE?
Rising Sun Pictures: Dedicated page about THE EIGHT HUNDRED on Rising Sun Pictures website.
© Vincent Frei – The Art of VFX – 2020
Grant Walker began his career in visual effects in 2006 at The Mill. He moved to MPC in 2009 before joining Framestore in 2010.
What is your background?
I come from a family of construction workers. My path has also been in construction, just of digital creatures and characters rather than physical buildings. I had a passion for art and films in my childhood, and after a few years on building sites, I decided to try out VFX. I went to the University of Wales Trinity Saint David (formerly Swansea Institute) and got a 3D Animation degree before moving to London and becoming a VFX artist. I worked in a number of the big VFX companies in London before finding Framestore 10 years ago. Framestore’s culture of looking after artists has kept me there ever since.
How did you and Framestore get involved on this show?
Framestore landed the work through an existing relationship with Executive Producer Bill Carraro, with whom we had previously worked on (the Oscar-winning) BLADE RUNNER 2049.
How was the collaboration with the various directors and VFX Supervisor Kevin Blank?
We benefited from being around the team, which included showrunner Misha Green, when we were on-set for episodes 101, 102 and 108. Those chance conversations, and getting direct thoughts and ideas about what the Shoggoth is and its importance in the show, were invaluable. We had direct access to Kevin Blank over the last 14 months, on-set in Atlanta and with regular video calls and cineSync reviews to present and review animation and WIPs. These relationships, and getting clear notes and direction, really helped us run the show efficiently and deliver our best work.
What were their expectations and approach regarding the visual effects?
The expectation was to create an absolutely terrifying monster capable of ripping people limb from limb. HBO is renowned for its incredibly high-quality content, so we knew we would have to pull out all the stops to create something truly spectacular.
How did you organize the work with your VFX Producer?
Stephen Nixon (VFX Producer) and I have worked as a partnership for the last 14 months. To plenty of people’s annoyance, I don’t like rigid restrictions between people’s roles, so we’ve worked to figure things out together. That said, Stephen definitely kept a firm grip on the crew, schedule and budget, and I had my eyes on the methodology and the images. The shift to remote working due to the Covid lockdown in March was a big challenge that everyone met admirably. Stephen and the production team kept communication open and helped maintain our focus on priorities and the time we had. Production did solid work to build flexibility into the schedule, which maintained pace while supporting the crew and the move to working from home during the pandemic.
How did you split the work amongst the Framestore offices?
This project was carried out by the London office, although it has been completed in spare rooms, home offices and garden sheds due to lockdown measures.
What are the sequences made by Framestore?
Framestore’s chief responsibility was all things Shoggoth. We completed sequences in episodes one, two, eight and ten. The later episodes are of particular note due to a special variation on the Shoggoth already seen.
How did you work with the art department for the design of The Shoggoth?
Designs were passed to us from concept artists, including the very talented team in Framestore’s art department, and further design changes were made by me in collaboration with our Animation Supervisor, Jed Fisher. The internal skeleton and muscles of the creature were also figured out at this stage, which helped inform the rigger on joint positions and other anatomical decisions.
Can you explain in detail about the creation of The Shoggoth?
For the most part, the initial build of the Shoggoth was fairly straightforward. The ZBrush sculpt was remeshed in Maya and textured in Mari. The rig (also created in Maya, by rigger Leo Schreiber) was a fun and technical challenge. Creatures with large mouths generally need fairly complex rigging setups to deal with the large deformations. A combination of joints and extensive blendshape work was used to give the face a broad freedom of movement while maintaining a fleshy feel. After animation, the creature lives in Houdini, which gives a huge amount of control over both the geometry and how the geometry can affect the shading.
Can you tell us more about its rigging and animation?
Jed Fisher, Animation Supervisor // The Shoggoth was rigged in a way that allowed for both an animation-friendly version and a highly detailed version of the rig, to see more accurately how the model would look in shot. A modular approach was taken to the rig and, in part, to the way animation was dealt with. The rig had the flexibility to turn segments and features on and off so work could be more streamlined. A similar approach was taken in the early stages of animation, where a library of modular clips was created: for example, limb and claw animations, tongue animation, locomotion cycles, and passive and aggressive behaviours. These became great starting blocks for quickly fleshing out ideas in a shot.
Did you receive specific indications and references for the animation?
Jed Fisher, Animation Supervisor // The brief was that the creature was otherworldly. It had a mysteriousness to it but was also built as a perfect killing machine, because everything was a weapon: its jaws, its claws, its insect-like meta arms and its spiked, club-like tail. It could display adaptive behaviours, so it could stalk like a big cat but also charge and posture like a gorilla. We looked at a lot of reference from the animal kingdom. It was about drawing on that reference and animating in a way that felt correct for the Shoggoth’s motivation in any given shot.
Can you elaborate about the face creation and animations?
Jed Fisher, Animation Supervisor // The face had to look fleshy and be able to stretch according to how the Shoggoth opened its jaw. It needed to feel as if it was sliding over bone and being triggered by muscle. Several shapes were created to form one motion, giving the impression of flesh gliding over bone and muscle. The eyes, extending from what you would call its face right over to its back, were not a classical eye rig setup: they were animated so the whole of the eye would be swallowed by a membrane, with the actual eye underneath still moving and looking. This animation would then drive a more complex FX pass, adding nuances to each eye blink performance.
Can you tell us more about its mouth and the drool?
Hernan Llano, CG Supervisor // We knew that doing sticky, viscous fluids on very fast-moving creatures is always tricky and heavy to compute, so we started doing RnD very early on. As with any complex task, we broke it up into smaller problems that were easier to manage.
The drool is divided into three main setups: an internal drool that creates all the nice detail between the teeth and the interior of the mouth, a stickier and more viscous drool consisting of tendrils hanging from the gums and mouth, and finally a more liquid drool able to leave the mouth as drops and sheets of saliva. The base of all three setups was fully procedural, allowing us to achieve the shapes using non-dynamic approaches for maximum art directability. Once we were happy with the general look, we would run them through simulations to create the final behaviour that you can see in the show.
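To make the "procedural first, simulate later" idea a little more concrete, here is a minimal standalone Python sketch of how hanging drool tendrils could be roughed out as static, art-directable curves before any solver is involved. The sag model, point counts and noise values are invented for illustration; this is not Framestore's actual Houdini setup.

```python
# Illustrative only: rough out non-dynamic drool tendril shapes that a
# simulation could later pick up and animate. All parameters are hypothetical.
import math
import random

def build_tendril(anchor, length, segments=10, sag=0.4, wobble=0.02):
    """Return a chain of points hanging from `anchor` (x, y, z).

    The chain droops along -Y with a simple quadratic sag plus a little
    noise, giving an art-directable rest shape.
    """
    ax, ay, az = anchor
    points = []
    for i in range(segments + 1):
        t = i / segments                              # 0 at the gum, 1 at the tip
        drop = length * t + sag * length * t * t      # accelerate the droop
        points.append((
            ax + random.uniform(-wobble, wobble),
            ay - drop,
            az + random.uniform(-wobble, wobble),
        ))
    return points

# Scatter a few anchors along a (made-up) gum line and grow tendrils from them.
gum_line = [(math.sin(a) * 0.5, 0.0, math.cos(a) * 0.5)
            for a in [i * 0.3 for i in range(8)]]
tendrils = [build_tendril(p, length=random.uniform(0.05, 0.2)) for p in gum_line]
print(f"built {len(tendrils)} tendrils, {len(tendrils[0])} points each")
```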
How did you handle so many eyes on the same creature?
Hernan Llano, CG Supervisor // It was a fairly complex problem that had to be solved between rigging, animation and FX/CFX. Animation would take care of the rotation and scale of the eyes, and then in FX we had to make sure that we could keep those rotations and changes in scale but at the same time update the eyes’ positions so they would remain attached to the skin after the muscles were simulated.
That was just one part of the problem because the eyes should also affect how the skin behaved, so we ended up making a setup that would create some sort of feedback loop where the skin affected the position of the eyes, and the eyes would affect the behavior and wrinkling of the skin.
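As an illustration of that feedback idea, here is a tiny standalone Python sketch reduced to a 1D stand-in for the skin: the eyes are re-attached to the simulated surface, and the surface is then bulged around them, alternating for a few iterations. Everything in it (the geometry, the bulge falloff, the iteration count) is hypothetical and only meant to show the coupling, not Framestore's FX/CFX setup.

```python
# Illustrative only: a two-way coupling where eyes ride on the skin and the
# skin in turn bulges around the eyes. The "skin" is a simple height field.

def couple_eyes_and_skin(skin_heights, eye_positions, bulge=0.05, iterations=3):
    """skin_heights: height per skin sample; eye_positions: sample indices."""
    heights = list(skin_heights)
    eyes = []
    for _ in range(iterations):
        # 1) Re-attach each eye to the current skin surface.
        eyes = [(i, heights[i]) for i in eye_positions]
        # 2) Let the eyes push the surrounding skin outward (simple bulge).
        for i, _height in eyes:
            for j in range(max(0, i - 1), min(len(heights), i + 2)):
                heights[j] += bulge * (1.0 - 0.5 * abs(j - i))
    return heights, eyes

skin = [0.0] * 10                  # flat "skin" after a muscle simulation
updated_skin, eyes = couple_eyes_and_skin(skin, eye_positions=[2, 6])
print(updated_skin, eyes)
```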
Can you tell us more about its textures and shaders?
As mentioned before, the textures were created in Mari and the look development was carried out in Houdini (rendering with Arnold). The shading on the creature was a complicated setup due to its translucent skin. We used multiple maps plus combinations of attributes on the geometry to drive the levels of transmission; for example, as the skin stretches it becomes more translucent. We also knew the Shoggoth was going to get pretty messy at times, so we used attributes on the geometry to drive additional geometry, textures and procedural shaders for the blood, mud and dust that landed on it. Different combinations of these layers were used depending on the shot’s requirements.
Owen Braekke Carroll, 2D Supervisor // The Shoggoth’s translucency was look-dev’d in CG, but rendered in multiple layers to allow for flexibility on a shot-by-shot basis. This allowed artists to control how much of the Shoggoth’s internal structure was visible as it moved, allowing for creative application of the effect to sell different lighting, locations and action. Dust, blood and mud could be controlled after render in the same fashion, letting us tweak the visibility and degree of coverage across sequences for continuity and dramatic visual effect.
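For readers curious how a stretch-driven transmission attribute might work in principle, here is a minimal standalone Python sketch: it compares rest and deformed edge lengths and remaps the stretch into a 0-1 transmission weight. The mesh, edge list and remap values are made up; a production setup would compute something like this per point inside Houdini and pass it to the shader.

```python
# Illustrative only: derive a "translucency" weight from how much the skin
# has stretched relative to its rest shape. Parameters are hypothetical.
import math

def stretch_to_transmission(rest_pts, deformed_pts, edges, base=0.2, gain=0.6):
    """Return a per-edge transmission weight in [0, 1]."""
    weights = []
    for i, j in edges:
        rest = math.dist(rest_pts[i], rest_pts[j])
        curr = math.dist(deformed_pts[i], deformed_pts[j])
        stretch = max(0.0, curr / rest - 1.0)        # 0 when un-stretched
        weights.append(min(1.0, base + gain * stretch))
    return weights

rest = [(0, 0, 0), (1, 0, 0), (1, 1, 0)]
deformed = [(0, 0, 0), (1.4, 0, 0), (1.4, 1.1, 0)]   # the skin has stretched
edges = [(0, 1), (1, 2)]
print(stretch_to_transmission(rest, deformed, edges))
```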
How did you simulate their presence on-set?
There was a prosthetic head and arms that were based on the Shoggoth designs. We chopped up the creature’s ZBrush model which the show’s art department in Atlanta used to mock up a foam approximation of the Shoggoth to use on set. It looked like a big green Kermit the Frog, but it was the right size and scale to give the actors – and our animation team – a reference. Animation Supervisor Jed Fisher and I helped block out one specific shot for a later episode, using the foam Shoggoth arms and head, in a car park in Atlanta at 1am. Our producer captured what we were doing on an iPhone and we put it in front of Kevin (Blank) and Misha (Green) to help inform staging of the action required.
The creatures are mostly seen in the dark. How did that affect your lighting work?
Henrique Campanha, Lighting Lead // The first word that comes to mind is NOISE. Working in low-light conditions is a concern for any lighter, as the render time needed to obtain a noise-free image can increase considerably. At this point, having 6,000 procs on the farm available to us was a great help (thanks, Framestore, for that). Photographically speaking, the generally dark mood brought us a good challenge. Just as almost everything rare in the universe tends to be more valuable, the general absence of light in most of the shots made every light added to the scene stand out. So we needed to evaluate and carefully direct each beam of light that we put in the scene. It was intense work that required fine-tuning from our amazing lighting team to get the best look for our beloved Shoggoth.
Which sequence or shot was the most challenging?
Episode eight has a sequence which is very exciting and very violent. It was particularly challenging due to the interactivity with the shot elements. We pre-visualised the sequence precisely and this proved to be essential to deliver a coherent edit that maintains the energy and performance required for Misha’s vision. It has some very long shots which required many layers of blood and destruction.
Is there something specific that gives you some really short nights?
Stephen Nixon, VFX Producer // There was nothing quite as terrifying as reading a call-sheet which includes a list of creatures that could be lurking in the woods we were shooting in; especially Chiggers (or berry bugs). And the smell of DEET lingers for a very very long time.
What is your favorite shot or sequence?
It has to be episode eight again; it’s a sequence that’s a lot of fun, and there are some really nice moments where you get to see the Shoggoth up close and personal. It has to be my favourite sequence. That said, we are still working on episode ten, which has a really nice confrontation between two Shoggoths and is also very exciting.
What is your best memory on this show?
For me, every time I see a difficult shot that’s really starting to work, I feel a lot of happiness and relief. Some shots have so many elements, and balancing them all correctly can be very challenging; the work is a testament to the team’s talent and ability. I also have to mention how incredibly hard the whole Lovecraft team has worked in some very difficult times. When Covid struck and we all started working from home, I was concerned that the team dynamic would be negatively affected, but we soldiered on and have delivered some of our best work.
How long have you worked on this show?
The project, start to finish, was about 70 weeks, with half the work delivered while our teams were working remotely.
What’s the VFX shots count?
It was a total of 66 shots from Framestore. This isn’t a huge amount, but the complexity of the shots in which we animated the Shoggoth made it a pretty demanding project.
What was the size of your team?
There were approximately 50 people working on our sequences for Lovecraft Country from start to finish. Plus the team in the art department who worked on the Shoggoth and on a lot of the show’s other creatures and beings.
What is your next project?
Urrrr? I haven’t really thought about that yet. Something less violent I think.
What are the four movies that gave you the passion for cinema?
STAR WARS – A NEW HOPE
TERMINATOR 2
JURASSIC PARK
LORD OF THE RINGS – THE FELLOWSHIP OF THE RING
A big thanks for your time.
WANT TO KNOW MORE?
Framestore: Dedicated page about LOVECRAFT COUNTRY on Framestore website.
© Vincent Frei – The Art of VFX – 2020
Netflix has released this VFX Breakdown featuring Everett Burrell (Senior VFX Supervisor, Co-Producer and 2nd Unit Director) talking about the visual effects work (made by Spin VFX) on the final battle for the second season of THE UMBRELLA ACADEMY:
WANT TO KNOW MORE?
Everett Burrell: Here is my interview with Everett Burrell, Senior VFX Supervisor, Co-Producer and 2nd Unit Director.
Netflix: You can watch THE UMBRELLA ACADEMY on Netflix now.
© Vincent Frei – The Art of VFX – 2020
In 2012, Mitchell Drain explained the work of Method Studios on THIS MEANS WAR. He then joined Shade VFX in 2015 and Digital Domain in 2018. He has worked on projects such as KONG: SKULL ISLAND, DOWNSIZING, WESTWORLD and THE TWILIGHT ZONE.
What is your background?
I have been involved in visual effects for over 30 years. I came to Los Angeles from Chicago with an Art degree and found myself at pioneering effects studio, Robert Abel and Associates. I learned all manner of optical effects techniques there. Ultimately, I ended up doing motion graphics and visual effects using the Quantel Paintbox, Harry and finally the Domino, Quantel’s 3k film compositing system. This led to more work as a Flame/Inferno artist, and finally to visual effects supervision.
How did you and Digital Domain get involved on this show?
Executive producer of episodics, Eric Robertson, had a history with Justin Ball, the client-side visual effects supervisor. After the initial bidding process, we met and discussed the requirements of the show. I have a great interest in historic LA, and the idea of recreating the Angels Flight area of Bunker Hill, circa 1932, was a very exciting prospect.
How was the collaboration with the various directors and VFX Supervisor Justin Ball?
We worked primarily with Justin Ball. He has a great eye for detail and a complete grasp of what the Showrunner wanted for the visual effects. His understanding of the story arc and overall aesthetic of the show greatly informed every aspect of what we were creating.
What were their expectations and approach about the visual effects?
First priority was to be as accurate to the period and historical locations as possible. Once we had the historically correct models built, we focussed on serving the story. Sometimes this meant subtle variations in the composition and lighting to best serve the narrative.
How did you organize the work with your VFX Producer?
We were very fortunate to have Gayle Munro as our visual effects producer. She has a long history of managing large teams on big shows. Although “Perry Mason” had a more modest shot count, the complexity and detail of the work required the same attention as a show many times its size. We had two CG supervisors, Jonathan Green and Brian Creasy. Jon has an extensive background in digital environments and matte painting, and Brian has a wealth of knowledge as a CG Generalist. Eric M. Beaver was our compositing supervisor. Of course, there was quite a bit of overlap between disciplines, and the team worked together as one to deliver beautiful VFX work.
What are the sequences made by Digital Domain?
Digital Domain created the entire area of Angels Flight and the surrounding Bunker Hill environment seen in several episodes. It is a complex world that included period cars, digital extras in time-appropriate clothing and the trolley system that ran through the city at that time. We also created a rooftop vista complete with period buildings, signs, oil derricks and period-correct fireworks. There are also various set extensions, plate enhancements and even a brutal knife attack that required a CG knife and the requisite blood enhancement.
What was the biggest challenge in recreating the Los Angeles of the 1930s?
The biggest challenge was creating the level of detail necessary to sell the reality of these long lost locations. Angels Flight and the surrounding buildings are very iconic and recognizable. Attention had to be paid to every electrical cable, building material, period-specific levels of decay and even creating the proper growth and composition of foliage that existed at the time. Accuracy to time and place was a top priority.
Can you explain in detail about the creation of 1930 Los Angeles?
To accurately recreate the locations, it was necessary to reference archival photographs and vintage film clips. We were fortunate that production supplied us with LIDAR data of the filming locations. This, coupled with photographs and measurements that we gathered, helped us to create models as accurately as possible. The team would compose these models and align them with the archival photos to adjust for any disparity in space or scale. Modelling was done in Maya, while Substance Painter was used for textures. DD has a vast library of buildings, props and human characters that were re-dressed for 1932 and used to fill out the environments. The renders were done in V-Ray and compositing was done in Nuke.
What was your main source for the references and inspirations?
Primarily archival photographs and vintage movie clips. Angels Flight was a popular shooting location in its day, so we were able to use that as reference material. We also drew inspiration from principal photography. The sets and lighting are authentic and true to the noir aesthetic. This gave us a great base from which to build our visual effects.
Where were the various parts of the show filmed?
The existing Angels Flight is in downtown Los Angeles at Pershing Square. Various downtown streets which still have some of the original architecture were also used. Any modern structures were digitally replaced with period buildings and props.
Can you tell us more about the creation of the Angels Flight environment?
Angels Flight was torn down in 1969 as part of an urban renewal project. The original location is unrecognizable today; nothing is left there. The iconic archway, funicular cars and tracks have been reassembled at Pershing Square, a few blocks from the original location at 3rd Street and Hill Street. As noted earlier, we had LIDAR scans of the existing location. We also took our own reference images, which we used to create some models through photogrammetry. The environment is quite extensive. Details were included that were not seen in the final VFX shots but were nevertheless necessary to create an overall accurate environment. The Clay Street area is a good example of this: it runs halfway down the hill, next to what was known as the “McCoy” house. It was important to include that, since any subtle change, whether for story or composition, could reveal unfinished or incomplete areas.
This environment is seen in various light conditions. How does that affect your work?
That is an excellent point and one that we were fully aware of at the outset. Ultimately, it meant we needed to be sure the farthest corners of the environment were fully realized, since they would show in the daylight. Conversely, areas like the interiors of the apartments would need to be illuminated at night. Rooms were created inside the structures, some relatively bland, some with Christmas trees and other props.
The streetlights in 1930 were different than today. Can you tell us more about the lighting work?
It is always good to start from reality. The production’s Art Department built out the immediate area where the actors were working, which gave us an excellent reference for what our digital streetlights should do. There was a bit more creativity involved in lighting areas where we had storefronts, cars, trolleys or Christmas lights to rationalize the illumination, and for some sequences we had to match the existing lighting exactly. Starting with HDRs, it was then up to the lighter to creatively interpret what was needed.
How did you populate the streets with people and cars?
Digital Domain has a vast library of car models from all eras. We created various paint schemes that would have been used in the time period, and the cars were animated in Maya as needed. The digital extras were also library elements from the DD vaults, and we have a library of motion-capture walk cycles that were used for basic animation. Some custom animation was done for characters walking up stairs or interacting. For sparsely populated scenes the characters were hand-placed, while for more crowded scenes Houdini was used to reference the animation and handle any problems with collision or interpenetration. The period clothing was created in Maya.
Did you use procedural tools for crowd and cars animation?
The cars were hand-placed and animated. Larger crowds were populated using Houdini to reference animation cycles from our motion capture library.
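As a rough illustration of that general approach, here is a small standalone Python sketch: it scatters agents along a sidewalk with rejection sampling to keep a minimum spacing (avoiding interpenetration) and assigns each one a cycle from a hypothetical motion-capture library. It is a conceptual sketch under those assumptions, not Digital Domain's Houdini crowd setup; the names, spacing and cycle list are made up.

```python
# Illustrative only: scatter crowd agents, keep them apart, and pick a
# walk cycle for each from a (hypothetical) mocap library.
import math
import random

CYCLE_LIBRARY = ["walk_slow", "walk_brisk", "stand_idle", "walk_up_stairs"]

def populate_sidewalk(length, width, count, min_spacing=1.2, seed=1932):
    rng = random.Random(seed)
    agents = []
    attempts = 0
    while len(agents) < count and attempts < count * 50:
        attempts += 1
        candidate = (rng.uniform(0, length), rng.uniform(0, width))
        # Rejection sampling keeps agents from interpenetrating.
        if all(math.dist(candidate, (a["x"], a["z"])) >= min_spacing
               for a in agents):
            agents.append({
                "x": candidate[0],
                "z": candidate[1],
                "cycle": rng.choice(CYCLE_LIBRARY),
                "offset": rng.uniform(0.0, 1.0),   # de-sync the cycles
            })
    return agents

crowd = populate_sidewalk(length=60.0, width=4.0, count=40)
print(len(crowd), crowd[0])
```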
Which sequence or shot was the most challenging?
It is hard to pick a favorite! There is a shot in the second episode that has a lingering technocrane move that is over a minute long. The shot begins at street level with a small crowd of people listening to a street preacher, then the camera slowly rises to look down the crowded street clear to the horizon. It would be difficult to overstate the complexity of this shot. The devil is in the details as they say, and there were so many seemingly small details to account for in this shot that it is hard to recount them all. To start, the camera track needed to be spot on as we were adding many elements to the existing location. The rotoscoping team had the Herculean task of isolating every character on the street and any element that would intersect with the VFX. All modern elements had to be removed and a trolley car, cables and tracks needed to be added, which meant that the entire street needed to be replaced digitally to accommodate the tracks and to remove modern traffic markings. Cars had to be placed onto the new street with all of the associated interactive lighting, the sidewalks were populated with digital extras that needed to be interspersed with the existing crowd and a matte painting was utilized to allow for distant period buildings and lights. Lastly, this was all seen as the camera rose through trolley car cables and digital Christmas lights. I’m sure I have forgotten something!
Is there something specific that gives you some really short nights?
Anyone who takes this career seriously is going to have some sleepless nights. Personally, I agonize over the creative decisions. Technical issues can be overcome. It is the creative call and the ripple effect that can be caused by an incorrect interpretation of a client request, or just a plain old wrong decision that keeps me up at night.
What is your favorite shot or sequence?
I am very proud of the Angels Flight work. It is both beautiful and historically correct. It serves the story narrative and allows for a compelling backdrop. It provides a small window into a time and place that no longer exists, that none of us will get to experience. That makes me happy.
What is your best memory on this show?
I would have to say the interaction with the team. The COVID-19 crisis broke just as we were gaining steam in shot production. The way that DD efficiently handled the transition to working from home was amazing, and the can-do attitude of the team was inspiring.
How long have you worked on this show?
This was a surprisingly quick turnaround. We first met with the VFX team in December of 2019, and delivered our final shots by the third week of June 2020.
What’s the VFX shots count?
The final VFX shot count for Digital Domain was 138 shots.
What was the size of your team?
We had an amazing team of 78 talented folks.
What is your next project?
I am presently prepping post VFX work on a feature film project that I am not at liberty to discuss.
What are the four movies that gave you the passion for cinema?
Awesome question! It is hard to narrow down. I am a great fan of classic movies, mostly horror films from the ‘30s, ‘40s and ‘50s. FRANKENSTEIN, THE WOLF MAN... as long as it has a monster! AN AMERICAN WEREWOLF IN LONDON was a bit of an obsession for me when I first saw it. I wanted to be a monster maker. BLADE RUNNER made me want to create other worlds. Since I have an education in fine art, VFX seemed a natural progression.
A big thanks for your time.
PERRY MASON – VFX BREAKDOWN – DIGITAL DOMAIN
WANT TO KNOW MORE?
Digital Domain: Official website of Digital Domain.
© Vincent Frei – The Art of VFX – 2020
Come have a look at the seamless work done by Pixomondo for THE KING OF STATEN ISLAND:
© Vincent Frei – The Art of VFX – 2020
Important Looking Pirates has released this VFX Breakdown about their work on the HBO series WATCHMEN:
© Vincent Frei – The Art of VFX – 2020