VFX Supervisor Jordi Cardus Civit details how the studio balanced real-time workflows in Unreal Engine and complex simulations to help tackle snowstorms, CG creatures, and virtual environments in Netflix’s ambitious live-action adaptation of the iconic Latin American sci-fi graphic novel.
Originally published between 1957 and 1959 as a serialized comic strip of the same name by Héctor Germán Oesterheld and Francisco Solano López, The Eternaut follows the residents of Buenos Aires struggling to survive an alien invasion that began with a mysterious, deadly snowfall. A six-episode live-action adaptation, written and directed by Bruno Stagnaro, is now streaming on Netflix.
Ezequiel Rossi served as post-production supervisor on the show; ReDefine was brought on as lead visual effects vendor, contributing 388 shots across Episodes 101 through 104. “The Eternaut stood out in several key ways,” says Jordi Cardus Civit, VFX Supervisor at ReDefine. “First and foremost, it was one of the first major visual effects-centric productions of this scale to come out of Argentina, which made it both exciting and culturally significant.
There was a sense of breaking new ground, not just in terms of technical ambition, but also in bringing a deeply iconic Latin American graphic novel to the screen. Another major distinction was the central role Unreal Engine played throughout production. It wasn’t just used for previews but was fully integrated into pre-production and production.
That level of real-time visualization brought both creative freedom and technical precision, allowing departments to align early and iterate faster. Also unique was the way atmosphere, particularly the snow, became a narrative element. The snow wasn’t just a background effect; it was treated like a character.
That meant we had to think about it visually, narratively, and interactively, which made for a layered and unusual challenge.” Visual development was guided by the original graphic novel and Stagnaro’s verbal direction. “One of the biggest creative challenges was environmental continuity: how the snowstorms behaved,” Civit explains. “We had to design rules for snow accumulation, wind direction, and storm intensity that could evolve naturally across sequences while remaining visually coherent.
The snow wasn’t just an effect – it was part of the story and needed to feel intentional. Another challenge was animating the Cascarudos. The director was clear that these creatures weren’t mindless monsters but had purpose behind their actions, whether they were hunting, observing, or attacking.
We ran extensive animation tests to explore behavioral nuance and develop a distinct motion language that reflected their nature and role in the story. Recreating Buenos Aires under apocalyptic conditions also required close collaboration with the client’s internal teams. Their local knowledge was invaluable.
Every location, street detail, and skyline element was carefully art directed to reflect real-world references. We wanted viewers, especially those familiar with the city, to feel the authenticity, even when the environments were heavily stylized or partially destroyed.” Unreal Engine was central to the show’s environmental work. “All layout was created in Unreal, based on scene setups provided by the client,” says Civit.
“They delivered full sequence-level environments for each major location in the series, which we adjusted as needed. Lighting was also handled in Unreal. While base setups were established per environment, we refined them per sequence and shot to ensure continuity and mood, especially given how important snow, atmosphere, and weather effects were to the storytelling.
One of the major benefits of using Unreal was the ability to conduct interactive reviews in-engine with the director. We could adjust lighting, layout, and even camera placement on the spot.”
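Civit describes the lighting approach as per-environment base setups refined per sequence and shot. As a rough, purely illustrative sketch of that kind of layered override resolution (not ReDefine’s actual Unreal tooling; the function name and the example values below are made up), the idea can be expressed as:

```python
def resolve_lighting(env_base, seq_overrides=None, shot_overrides=None):
    """Merge lighting settings in order of increasing specificity:
    environment base, then sequence, then shot overrides."""
    settings = dict(env_base)
    for layer in (seq_overrides or {}, shot_overrides or {}):
        settings.update(layer)
    return settings

# Hypothetical values: a snowy base environment, a stormier sequence,
# and a single shot that dims the sun for mood.
base = {"sun_intensity": 2.0, "fog_density": 0.15, "sky_tint": (0.7, 0.75, 0.85)}
storm_sequence = {"fog_density": 0.4}
shot_0420 = {"sun_intensity": 0.8}
print(resolve_lighting(base, storm_sequence, shot_0420))
```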
For snow atmospherics, the FX team led by Luca Vitali used Houdini. “We leveraged Houdini’s procedural capabilities to create a variety of looks based on wind direction, strength, and whether wind was blowing toward or away from the camera,” Civit notes. “We created variations for foreground, midground, and background layers, each carrying a range of attributes we leveraged at render time to give compositors control over snow density per layer. An internal automated setup allowed us to select and combine these elements per shot.” A key technical development was integrating Unreal Engine with ReDefine’s USD-based pipeline, Helix. “We called this system ‘Asset Courier,’” Civit explains.
“It extracted Unreal Engine levels, broke them into independent USD assets, and reassembled them accurately on a USD stage, maintaining correct structure and positioning. This was entirely new to our toolset and became a core piece of our pipeline that enabled Unreal to co-exist with the rest of our workflow.”
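The internals of Asset Courier aren’t public, but as a loose illustration of the workflow described above, the sketch below uses Pixar’s USD Python API to split a flattened level file into standalone per-asset USD files and rebuild a master stage that references each asset back at its original location. The file paths, the one-asset-per-top-level-prim assumption, and the /World root are hypothetical choices, not ReDefine’s Helix pipeline.

```python
from pxr import Usd, UsdGeom, Sdf

def split_level_into_assets(level_usd, asset_dir, master_usd):
    """Split a flattened level into one USD file per top-level prim, then
    rebuild a master stage referencing each asset at its original path."""
    src_stage = Usd.Stage.Open(level_usd)
    src_layer = src_stage.Flatten()  # single Sdf.Layer with all opinions baked in

    master = Usd.Stage.CreateNew(master_usd)
    world = UsdGeom.Xform.Define(master, "/World")
    master.SetDefaultPrim(world.GetPrim())

    for prim in src_stage.GetPseudoRoot().GetChildren():
        name = prim.GetName()
        asset_file = f"{asset_dir}/{name}.usd"

        # Copy the prim's entire subtree (including its transform) into a
        # standalone asset layer so it can live independently in the pipeline.
        asset_layer = Sdf.Layer.CreateNew(asset_file)
        Sdf.CopySpec(src_layer, prim.GetPath(), asset_layer, Sdf.Path(f"/{name}"))
        asset_layer.defaultPrim = name
        asset_layer.Save()

        # Reference the asset back onto the master stage; because the copied
        # prim keeps its authored transform, placement is preserved.
        master.DefinePrim(f"/World/{name}").GetReferences().AddReference(asset_file)

    master.GetRootLayer().Save()
```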
“We also adapted animation tools for Unreal compatibility. Certain assets and sequences required bringing animation data into Unreal, which, though not unusual in real-time workflows, required internal consistency to ensure round-tripping was possible when needed. From a compositing perspective, it required a shift in mindset rather than a technical overhaul. Since we didn’t generate the same depth of AOVs and render layers typical of offline renderers like RenderMan or Arnold, we had to be more decisive in lighting and look development upstream and more surgical in compositing adjustments. The result was leaner renders and faster iteration.”
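One way to picture the comp-side control this implies: with fewer AOVs, per-layer snow density can ride on the render elements themselves and be dialed at composite time. The NumPy sketch below is a generic premultiplied “over” of background, midground, and foreground snow layers with a density multiplier per layer; it is an illustration of the concept, not ReDefine’s Nuke setup, and the argument names are invented.

```python
import numpy as np

def comp_snow_layers(bg_plate, layers, densities):
    """Premultiplied 'over' of snow layers onto a plate, with a per-layer
    density multiplier (the kind of control compositors were given per shot).

    bg_plate : (H, W, 3) RGB plate
    layers   : list of (H, W, 4) premultiplied RGBA snow renders,
               ordered back to front (bg, mg, fg)
    densities: list of scalars in [0, 1], one per layer
    """
    out = bg_plate.astype(np.float32).copy()
    for rgba, density in zip(layers, densities):
        rgb = rgba[..., :3] * density      # scale premultiplied colour
        alpha = rgba[..., 3:4] * density   # and coverage together
        out = rgb + out * (1.0 - alpha)    # standard premultiplied over
    return out
```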
Given the importance of blanketing Buenos Aires in snow, the FX team developed a procedural simulation system. “For snow on the Cascarudos and their interaction with the environment, our FX team developed a procedural system that allowed us to achieve several key effects,” Civit describes. “We designed the system to procedurally add snow on top of any creature, controlling its distribution and density dynamically.
Additionally, we implemented both procedural and manual controls to determine which areas of the creature would react to movement, for example causing the snow to fall off when movement exceeded a certain threshold. The system also enabled procedural interaction between the snow on the creatures and the snow on the ground, allowing for realistic displacement and integration. We converted these effects into volumetric data, which were then rendered using Pixar’s RenderMan. Finally, these rendered snow elements were seamlessly integrated on top of our Unreal Engine environment work, ensuring visual consistency and a high level of realism in the final shots.”
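As a minimal sketch of the movement-threshold idea, assuming per-point surface velocities are available (ReDefine’s actual setup was a Houdini system with both procedural and manual controls), the snippet below releases bound snow points wherever the underlying surface moves faster than an assumed speed; the names and the threshold value are illustrative.

```python
import numpy as np

def shed_snow(bound_pts, surface_vel, shed_speed=1.5, fps=24.0):
    """Release snow points whose underlying surface exceeds a speed threshold.

    bound_pts   : (N, 3) positions of snow points stuck to the creature
    surface_vel : (N, 3) velocity of the surface under each point (m/s)
    shed_speed  : illustrative threshold above which snow breaks loose
    """
    speed = np.linalg.norm(surface_vel, axis=1)
    shed = speed > shed_speed

    # Released points inherit the surface velocity and become free particles
    # that can then be advected, simulated, and converted to volumes.
    released_pts = bound_pts[shed] + surface_vel[shed] / fps
    released_vel = surface_vel[shed]

    # Remaining points stay bound to the creature for the next frame.
    return bound_pts[~shed], released_pts, released_vel
```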
The team built a snow simulation library that contained hundreds of gigabytes of pre-cached elements with different intensities, wind directions, and depth layers, which Civit notes “let us quickly select appropriate setups per shot without re-simulating from scratch every time. Even specific interactions, like Cascarudos disturbing ground snow or shedding snow, used lightweight, procedural setups that rendered quickly and didn’t require days of caching, which was critical for keeping the production moving across so many shots. Modular systems and thoughtful caching let us deliver complex atmospherics efficiently, with a high level of control.”
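A library like that is only useful if the right cache can be found quickly for each shot. The sketch below shows one plausible way to index and query pre-cached elements by depth layer, intensity, and wind direction; the metadata fields, file names, and scoring are assumptions for illustration, not ReDefine’s internal selection tool.

```python
import json
from pathlib import Path

def load_index(index_path):
    """Read a JSON index of cached snow elements. Each entry is assumed to
    look like: {"file": "snow_fg_012.bgeo.sc", "layer": "fg",
                "intensity": 0.7, "wind_dir_deg": 225.0}"""
    return json.loads(Path(index_path).read_text())

def angular_diff(a, b):
    """Smallest difference between two compass angles, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def pick_element(index, layer, intensity, wind_dir_deg):
    """Return the cached element whose metadata best matches the shot brief,
    instead of re-simulating from scratch."""
    candidates = [e for e in index if e["layer"] == layer]
    return min(
        candidates,
        key=lambda e: abs(e["intensity"] - intensity)
        + angular_diff(e["wind_dir_deg"], wind_dir_deg) / 180.0,
    )

# Hypothetical usage: one element per depth layer for a given storm setup.
# index = load_index("/caches/snow/index.json")
# fg = pick_element(index, "fg", intensity=0.8, wind_dir_deg=240.0)
```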
The insectoid Cascarudos are also responsible for creating cocoon-like webs. “The concept evolved as we explored how they would physically make cocoons,” Civit says. “We refined the layers making up the cocoon: tight inner wraps, outer webbing, translucency, and slime. Tentacles used by the creatures during cocooning saw significant design iterations for shape and material characteristics, needing to match their biology while remaining cinematic.
We started by simulating web wrapping on live actors, which informed how we depicted soldiers cocooned in real time. Realism was key. The cocoons had to feel visceral and plausible, particularly in shots with challenging camera angles or actor poses where the wrapping had to interact believably with limbs, clothing, and set geometry.” Most action was captured on set with minimal use of bluescreens.
“While we had some physical sets, many environments were heavily augmented or replaced in post,” Civit says. “Roto and prep became major pipeline tasks: isolating actors, removing partial set elements, and creating clean plates for full digital environment replacements.” Digital doubles appeared mainly in battle scenes between the military and the Cascarudos. “These were used for stunts too dangerous or complex to film practically,” he explains.
“We also created crowd simulations of Cascarudos in Houdini. These allowed for large, coordinated swarms with individual behavior, adding realism. The crowd system allowed us to retain a sense of individual behavior within a broader collective motion, which helped keep the action grounded.
It also added the Cascarudos’ snow effects, their snow interaction on the ground, and extra cocoons being dragged by Cascarudos. We didn’t rely heavily on face or limb replacements, as most performances were captured practically. But the few full digital doubles we used were integrated seamlessly into environments using the same snow, lighting, and interaction systems developed for the rest of the show.”
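The article doesn’t detail the crowd rig, but “individual behavior within a broader collective motion” is the classic flocking recipe. Below is a minimal, self-contained sketch of that general technique (not ReDefine’s Houdini crowd system), with per-agent noise standing in for individuality; all parameter values are illustrative.

```python
import numpy as np

def step_swarm(pos, vel, dt=1.0 / 24, neighbor_r=3.0,
               w_sep=1.5, w_align=0.8, w_coh=0.5, w_noise=0.3, max_speed=4.0):
    """One step of a simple flocking model: separation, alignment, and
    cohesion give the shared swarm motion; per-agent noise keeps individuals
    from moving in lockstep."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        offsets = pos - pos[i]
        dist = np.linalg.norm(offsets, axis=1)
        near = (dist > 0.0) & (dist < neighbor_r)
        if near.any():
            sep = -(offsets[near] / dist[near, None] ** 2).sum(axis=0)  # push apart
            align = vel[near].mean(axis=0) - vel[i]                     # match neighbours
            coh = offsets[near].mean(axis=0)                            # pull to local centre
            new_vel[i] += dt * (w_sep * sep + w_align * align + w_coh * coh)
        new_vel[i] += w_noise * dt * np.random.standard_normal(3)       # individuality
        speed = np.linalg.norm(new_vel[i])
        if speed > max_speed:                                           # clamp top speed
            new_vel[i] *= max_speed / speed
    return pos + new_vel * dt, new_vel

# Hypothetical usage: 200 agents starting in a loose cluster.
# pos = np.random.standard_normal((200, 3)) * 5.0
# vel = np.zeros((200, 3))
# pos, vel = step_swarm(pos, vel)
```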
“Each shot posed different challenges,” Civit concludes. “Some were technically demanding, like Cascarudos cocooning soldiers. Making the interaction between creature, actor, and effects elements feel physically believable was no small task. Others seemed simple at first glance, with no big action beats, but Bruno, our director, had specific visuals in mind, requiring iteration and precision.
One standout shot involved following a car with a moving camera, panning up to show Cascarudos destroying nearby vehicles. The shot climaxes with a bus getting slammed and thrown into the air, surrounded by a burst of effects-driven snow. It was tricky because it had a bit of everything: CG creatures, heavy effects, vehicle destruction, and atmospheric integration, and we had to simulate camera movement relative to the car within Unreal Engine.
We also had interior train shots that involved full CG set extensions, including reflective surfaces like glass windows, which demanded precision in lighting and compositing to sell the realism. Every shot had its own challenge, whether technical, creative, or both. The difficulty often came from balancing all those elements together.”
Trevor Hogg is a freelance video editor and writer best known for composing in-depth filmmaker and movie profiles.