How I Made New Specimen in Unreal Engine 5

Dimitri Vallein, a real-time 3D director and technologist, shares a comprehensive breakdown of his short film New Specimen and how it was created, animated, simulated, and rendered using UE5.

Directed by Dimitri Vallein | 4 min 30 s | Self-produced | France

Set in the future, the award-winning film tells the story of a laboratory that creates a new specimen through unethical experiments; a sinister entity takes control, forcing the subject to survive deadly challenges where reality blurs with illusion.

In this article, he explains how it was created, animated, simulated, and rendered using Unreal Engine 5.

Introduction

Hi everyone! I’m Dimitri Vallein, a real-time 3D director working on sci-fi animated films. Before filmmaking, I built projects around immersive 3D technologies: I reached 500M+ views creating Augmented Reality experiences and developed video games that hit #1 on the French App Store.

A few years ago, I turned to Unreal Engine with a clear goal: to direct the sci-fi films I’d always dreamed of seeing.

Since then, I’ve written, directed, and self-produced two UE5 shorts. The first (2022) screened at Dances With Films LA at the TCL Chinese Theatre and was selected for the BMVAs (Best Animation) alongside The Weeknd, Pharrell Williams, and Muse. The second, New Specimen (2024), was a semi-finalist at the Rhode Island International Film Festival and later led to an invitation to join a panel with Ron Dyens (Oscar-winning producer) to discuss real-time 3D, the exact process I break down below.

Starting Point

Everything began with this single render from my friend Qtn.cls: a lone figure in a field at night.

I felt instantly connected to it; I could feel the silence, the cold breath, the mystery. In that moment, the world of New Specimen started to take shape in my mind.

Research and Script

I like to start picturing my films with strong images before anything is written.

Those visuals are what excite me the most at the beginning, and, like a chef in a kitchen, they form the recipe for a great short film. Great input usually means great output. The other advantage of this method is that you don’t constrain yourself too early in the process.

Images have the power to keep growing inside you and to encapsulate far more story and ideas than words ever can. This is what the early gathering of references looked like.

Then, I called my childhood friend Vincent Guyon and laid out for him the vision of where this could go. We bounced ideas back and forth until the story snapped into shape.

Together, we drafted a 4-minute outline without going too deep into the details. The goal here was mostly to set the overall direction; nothing was set in stone, and everything remained open to new ideas and shifts along the way. That’s how I usually make my best discoveries.

I always discover more about the film while making it than while writing it. After that, I jumped straight into production.

Creation of the Main Character

One key point for me was to create a unique, distinctive look for the main character.

Most of the time, when you watch Unreal Engine short films, you can spot MetaHumans immediately, and I feel this often subtly pulls you out of the story. The problem is usually that people use the default MetaHuman look, which still feels quite uncanny. It often lacks the personality and style you look for when you try to create something distinctive.

That’s why I chose a 3DScanStore head scan to avoid this trap and go for something unique. Then I MetaHumanized it while keeping the surface details from the 3DScanStore scan by adapting the original UVs to the UVs of the MetaHuman I had just created. This step mattered because I still wanted to use the MetaHuman animation features, especially the facial rig, which were crucial to my process.
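The article doesn’t detail the tooling behind this UV adaptation, but the underlying idea, re-sampling a texture authored in one UV layout into another layout of the same surface, can be illustrated with a deliberately crude nearest-vertex sketch. Real baking tools rasterize triangles and interpolate with barycentric coordinates; the function name and data shapes here are hypothetical:

```python
import numpy as np

def rebake_nearest_vertex(src_tex, src_uvs, dst_uvs, out_size):
    """Toy texture transfer between two UV layouts of the same mesh.

    For each texel of the new (MetaHuman-style) layout, find the nearest
    vertex in that layout, then sample the scan texture at that vertex's
    original UV. src_uvs/dst_uvs are (V, 2) per-vertex UV arrays in [0, 1].
    """
    h = w = out_size
    ys, xs = np.mgrid[0:h, 0:w]
    texels = np.stack([(xs + 0.5) / w, (ys + 0.5) / h], axis=-1).reshape(-1, 2)
    # index of the closest vertex (in target UV space) for every texel
    d2 = ((texels[:, None, :] - dst_uvs[None, :, :]) ** 2).sum(-1)
    src_uv = src_uvs[d2.argmin(axis=1)]
    # sample the source texture at that vertex's original UV
    sh, sw = src_tex.shape[:2]
    px = np.clip((src_uv[:, 0] * sw).astype(int), 0, sw - 1)
    py = np.clip((src_uv[:, 1] * sh).astype(int), 0, sh - 1)
    return src_tex[py, px].reshape(h, w, -1)
```

A quick sanity check: with identical source and target UVs, the rebaked image reproduces the source texture at the vertex locations.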

Facial Animation

One of the biggest challenges was getting believable facial expressions and, above all, making them land emotionally for the viewer (curiosity → apprehension → anxiety → fear → desperation). I didn’t have an experienced actor around me, so I did it myself. I used my iPhone with the Live Link Face application to capture takes for each sequence, processed the data in Unreal Engine, and synced it with the body animation in Sequencer.
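Hand-keyed corrections layered over captured curves behave like an additive animation layer: the capture stays intact, and sparse offset keys are summed on top. A minimal NumPy sketch of that blending (the curve and key values are invented for illustration):

```python
import numpy as np

def apply_corrections(times, capture, key_times, key_deltas):
    """Layer sparse hand-keyed offsets on top of a dense captured curve.

    times/capture: the recorded facial curve (e.g. a jaw-open channel)
    sampled per frame. key_times/key_deltas: a few corrective keys,
    linearly interpolated between keys. Bracket the keyed region with
    explicit zero keys so the correction vanishes elsewhere.
    """
    return capture + np.interp(times, key_times, key_deltas)

t = np.linspace(0.0, 2.0, 9)  # 2 s of curve, coarsely sampled
cap = np.array([0.0, 0.1, 0.3, 0.5, 0.6, 0.5, 0.3, 0.1, 0.0])
# push the peak of the expression a little harder around t = 1 s
out = apply_corrections(t, cap, [0.75, 1.0, 1.25], [0.0, 0.15, 0.0])
```

Because `np.interp` clamps to the edge keys, the zero keys at 0.75 s and 1.25 s keep the rest of the take untouched.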

I added light facial keyframes to push certain expressions or correct specific muscle movements, but about 90% of the result came straight from the capture.

Body Animation

For my character’s body animation, I decided to explore the possibilities of markerless video-to-motion options and chose Move.AI for its accuracy. I set up four iPhones on tripods around a small capture area, ran the standard calibration, and recorded all the performances myself in my living room.
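Markerless capture typically arrives with high-frequency noise riding on the joint curves. The cleanup itself happens per-curve in the engine, but the core operation can be sketched as a centered moving average over a joint’s position samples (a simplified stand-in, not the actual Sequencer filter):

```python
import numpy as np

def smooth_jitter(samples, window=5):
    """Centered moving average over per-frame joint positions.

    samples: (frames, 3) array of world positions for one joint (e.g. a
    hand). Edge-padding keeps the output the same length as the input.
    A small odd window knocks out high-frequency capture noise while
    preserving the broad motion.
    """
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(samples, ((pad, pad), (0, 0)), mode="edge")
    return np.stack(
        [np.convolve(padded[:, axis], kernel, mode="valid")
         for axis in range(samples.shape[1])],
        axis=1,
    )
```

Larger windows smooth more but start eating legitimate fast motion, which is why this kind of filter is applied lightly and per problem area rather than globally.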

The best part of this process was the iteration speed: if a take didn’t work properly, I could re-record the animation immediately. I exported FBX files from Move.AI, retargeted them in Unreal with the IK Retargeter, and did a light cleanup in Sequencer (foot contacts, hand jitter, etc.) until it felt right on screen with the lighting and environment.

Cloth Simulation

One thing I always aim for in 3D is to keep every pixel in motion.

CGI gets static and fake-looking pretty fast, so I like to keep everything moving on screen, even subtly; the result feels more believable to the human eye. That’s why I decided to take the hard road: no static rigged clothes on my characters, and simulated cloth movement for every shot of the short film.

For that, Jon Sanchez, who worked with me on an earlier project, designed a hospital-style gown in Marvelous Designer. He also created a torn, distressed variant of the garment with rips along the hem and sleeves. I love this choice because it lets the outfit evolve with the story.

Once the garments were designed and optimized, the next challenge was simulation. I exported the MetaHuman animations and brought them into Marvelous Designer. Some simulations came out right after a few iterations (walks, look-arounds). Others were bigger challenges; the hardest were probably sprints, shoulder rolls, and sudden stops.
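Fast, violent motions like these are exactly where cloth solvers tend to explode. The reason raising sub-steps helps can be shown on a single damped spring, a toy stand-in for one stiff cloth constraint (the integrator and constants here are illustrative, not Marvelous Designer’s internals):

```python
def settle(stiffness, frames, frame_dt, substeps):
    """Semi-implicit Euler on one damped spring.

    Explicit integrators are only stable when the per-step dt is small
    relative to the spring frequency; splitting each frame's dt into
    sub-steps is what rescues an exploding simulation.
    """
    x, v = 1.0, 0.0
    h = frame_dt / substeps
    for _ in range(frames * substeps):
        v += (-stiffness * x - 0.2 * v) * h   # spring force + light damping
        x += v * h
    return x

# 24 frames at 8 fps with a stiff spring (omega = 20 rad/s):
# one step per frame blows up, eight sub-steps stay bounded.
exploded = settle(400.0, 24, 0.125, 1)
stable = settle(400.0, 24, 0.125, 8)
```

The same trade-off applies in production: more sub-steps cost simulation time, so they are raised only on the shots that need them.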

They needed troubleshooting: collision-thickness tweaks, pattern adjustments, higher sub-steps, and occasional re-caches until everything fell into place. I then exported the cloth back to Unreal as Alembic (Ogawa) and integrated it into each sequence.

Environment Design & Lighting

The environment and lighting were created by Qtn.cls.

I had already worked with him before, and we’ve always been closely aligned on the aesthetics we like. It was great to work with him again. Our process leverages the power and versatility of Lumen.

Usually, in CG filmmaking, it’s really hard to get a quick sense of how your scene will look with correct lighting: the iteration process is long in traditional software, and lighting usually comes at the end. Here, our goal was to set up the lighting as one of the first steps, before what usually comes first: modeling, texturing, rigging, animation, etc.

Animation First New York

Once it was released, the film quickly got traction from the 3D community.

I was contacted by the Animation First Festival to screen and present my film to their audience in New York City! After the screening, I sat on stage for an hour to discuss real-time workflows and the use of AI in animation. I shared the panel with Ron Dyens, the Oscar-winning producer of an animated feature film made in Blender with the same first principles as New Specimen.

VIEW Conference 2025

Two weeks ago, I gave an exclusive talk at the VIEW Conference 2025 in Turin, Italy, where I explored how a new generation of creators is reinventing the language of cinema through real-time 3D engines, artificial intelligence, and immersive technologies.

I shared my vision of how these tools enable us to prototype the future of storytelling: faster, more freely, and with machines augmenting human vision. Rooted in my experience crafting short films in Unreal Engine, I invited filmmakers, artists, and storytellers to discover how real-time creativity opens the door to radically new ways of imagining and building cinematic worlds. The same event also featured many other animation and visual effects projects, from studios such as DreamWorks, Pixar, and Industrial Light & Magic, among others.

Conclusion

I hope you’ve enjoyed this breakdown of my creative process; here’s a making-of and my final film trailer! Dimitri Vallein is a real-time 3D director working on sci-fi animated films.