Pitch & Produce | Remnants: Post-Apocalyptic Short with iClone, Character Creator & Unreal Engine 5

This story is featured in 80 Level

Stanislav Petruk

My name is Stanislav (but I prefer Stan). I am a self-taught artist working in the game-dev industry. I started as a motion designer in a small Siberian town and eventually moved to a bigger city, chasing my goal of making games. The first job I landed was on a startup mobile game, and I think it was the perfect project to help me transition from video production to a real-time environment. My next job was at Sperasoft, a big outsourcing company that has co-developed a lot of great AAA games. I started there as a VFX artist and worked on several big titles: WWE Immortals (mobile), Mortal Kombat (mobile), Agents of Mayhem, Overkill’s The Walking Dead, and Saints Row. It was a great time, and it helped me grow a lot as a professional artist.

After over four years at Sperasoft, it was time to move on, and in 2019 I moved to Poland. I spent almost two years working at Techland Warsaw. In 2021 I moved once again, this time to Stockholm, and I currently work at Avalanche Studios as a senior VFX artist.

In iClone 8 I was able to do all the mocap clean-up previously done in MotionBuilder, but without the complexity or the higher cost. The inertial sensors of Xsens capture movement well but have no idea where they are actually located in 3D space.

Stan Petruk – Senior VFX Artist at Avalanche Studios

About the Remnants Project

Besides making games, I love films, and in parallel to my real-time VFX career I am trying to develop myself as a filmmaker. I have attended several filmmaking courses, most recently a short online course from Vancouver Film School. I am also learning from books and trying to break down the films I watch. But it was all theory, and at some point I realized that I needed to get my hands dirty and make my first film. That is when I started to develop an idea for a short film with a simple story. I had already explored a similar theme in a previous project, a VFX contest entry two years ago, but that was not a film yet.

The production of Remnants started maybe a year ago, but in the beginning it was mostly research and development. I took a pack of sticky notes and started to break the project down into small elements: first the story structure acts, then the events in between, then a list of the software and techniques I had to learn to bring the project to life. Everything was glued to the wall, and visually it looked like there was no way I could make it alone with the skills I had. At the same time, I didn’t want the production to turn into just a technical presentation; my goal was to make a film, and the main focus had to be film direction.

So, I started to dig into how I could optimize my work. Going real-time in every possible way was the obvious answer. Merging my two passions, game dev and films, into one was the core idea that got production going.

Discovering the Reallusion Tools

I was very concerned about the characters in my film: they are very important, they must look alive, and it must be possible to empathize with them. So, I googled how to make a character without being a character artist, and after some research the Reallusion products seemed like a perfect solution. They are simple to use, and with some basic knowledge of CG you can get awesome results from them.

Reallusion software works very well with other programs, especially with Unreal Engine. All the file formats I needed are supported, and the export presets are there, too. You just use it as part of a solid pipeline. It simply does the job and saves a massive amount of time.

The creation process in Character Creator is as simple as it can be: you just pull the sliders and get the shapes you want. But to get the right result, you must understand exactly what you need. So, I held a virtual film casting: I used an AI tool that generated a bunch of photos and simply chose the ones I liked best. After that, maybe an hour per person and you are ready for the next production stages.

Clothes and Hair

I did not want the clothes on the characters to look like a standard game asset, where they are part of the skeleton. So, I decided to use Marvelous Designer and simulate the clothes as a separate element. The pipeline was the following:

  • Export an animated character from Unreal Engine
  • Import it into Marvelous Designer
  • Create a 3D model of the clothes in Marvelous Designer
  • Add UVs and textures
  • Simulate the clothes on top of the animated character
  • Export as Alembic
  • Import into Unreal Engine as a skeletal asset, using the 3ds Max conversion preset

Eventually, I used a mix of simulated clothes and standard skinned meshes to save some time.
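If you end up re-importing the simulated clothes many times, the Unreal import step can also be scripted. The following is only a rough sketch using Unreal’s Python API: the file and content paths are placeholders, and the settings (skeletal import with the 3ds Max conversion preset) follow the list above, so treat it as a starting point rather than the exact production setup.

```python
import unreal

# Placeholder paths – adjust to your simulation output and project layout.
ABC_FILE = "C:/Remnants/sim/jacket_sim.abc"
DEST_PATH = "/Game/Characters/Clothes"

# Alembic import settings: skeletal import using the 3ds Max conversion preset,
# mirroring the manual import described in the list above.
abc_settings = unreal.AbcImportSettings()
abc_settings.import_type = unreal.AlembicImportType.SKELETAL

conversion = unreal.AbcConversionSettings()
conversion.preset = unreal.AbcConversionPreset.MAX
abc_settings.conversion_settings = conversion

# Wrap the import in an automated task so no dialog pops up.
task = unreal.AssetImportTask()
task.filename = ABC_FILE
task.destination_path = DEST_PATH
task.automated = True
task.save = True
task.options = abc_settings

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```

Run it from Unreal’s Python console (or as a startup script) once per exported Alembic file.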

Hair was a tricky part, mostly because I wanted to use UE hair and fur. A lot of hair is also added to the characters’ clothes, and there are reasons for it: it looks great and it hides the imperfections of the simulated cloth. Here is the pipeline:

  • Export the character (or the clothes) from Unreal
  • Add particle hair in Blender, which has a lot of great tools for styling hair
  • Export the hair as an Alembic file (only the first frame, not the whole simulation; see the sketch after this list)
  • (An extra stage, necessary only if the fur in UE does not bind to the mesh correctly; I used it only for the clothes.) Export the first frame of the mesh as Alembic and import it as a skeletal mesh in UE
  • Import the fur into Unreal Engine
  • Create the hair binding (use the additionally exported mesh as the source skeletal mesh if required)
  • Add the groom component to the mesh
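The Blender side of this can be scripted too. Below is a rough sketch of the single-frame Alembic export: the object and file names are placeholders, and it assumes the particle hair lives on a single mesh that gets selected before the export.

```python
import bpy

# Placeholder names – adjust to your scene and output folder.
HAIR_OBJECT = "CharacterJacket"   # mesh that carries the particle hair
OUT_FILE = "//export/jacket_fur.abc"

# Select only the object that carries the groom.
bpy.ops.object.select_all(action='DESELECT')
obj = bpy.data.objects[HAIR_OBJECT]
obj.select_set(True)
bpy.context.view_layer.objects.active = obj

# Export a single frame (not the whole simulation) with the hair curves included.
bpy.ops.wm.alembic_export(
    filepath=bpy.path.abspath(OUT_FILE),
    selected=True,
    start=1,
    end=1,
    export_hair=True,
)
```

The resulting .abc can then be imported as a groom in Unreal and bound to the skeletal mesh as described above.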

The real problem with the hair is that it does not always sort correctly: if, for example, there is a transparent particle in front of the fur, the fur will be rendered on top anyway. Because of that, I had to remove or tweak a lot of the smoke and fog.

Creating Animation in iClone

Before making the final animations, I made the whole film as a simplified cinematic. I used a mannequin in UE and just moved it around the scene, and all the cameras were already there as well. Thus, I had a very good plan: I knew which animations I needed and even how long every movement should last (roughly, of course). A lot changed in the final version, but the core idea came from that initial cinematic. For the motion capture I used Xsens, which is really great but still requires a bit of cleanup and tweaking.

In iClone I was able to fix most of those issues and also some of the mistakes I had made during the pre-production stage. The most complicated scenes were the ones involving interaction, especially because I had to perform both characters myself. Such things required a lot of manual fixing: for example, there is a scene where one character takes a cigarette from the other, or throws and catches a potato, and none of that originally moved correctly. I didn’t have access to finger tracking, so the finger animation was done in iClone.

For the facial mocap I used an iPhone, which also required a bit of tweaking later; for example, chewing the potato was animated manually, as the tracking could not make sense of the recorded movement. I also used the bridge to export the characters to Unreal, and it is very simple and smooth, just a click of a button. The animations were imported manually because I needed to clean up and fix the movements before using them in production.

VFX used in Remnants

All of the effects are animated flipbooks, mostly fire and smoke. This was the simplest part for me because I am a VFX artist. The pipeline is standard: make a simulation in Houdini, render it as a flipbook, and build a particle effect in Niagara.
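If you render the simulation as individual frames, packing them into a flipbook texture is easy to automate. This is a generic sketch (placeholder paths, Pillow required) that tiles numbered PNG frames into a square atlas suitable for Niagara’s SubUV animation; it is not tied to any particular Houdini setup.

```python
from math import ceil, sqrt
from pathlib import Path

from PIL import Image  # pip install Pillow

# Placeholder paths – the render is assumed to write numbered PNG frames.
FRAME_DIR = Path("render/smoke_sim")
OUT_FILE = Path("render/T_Smoke_Flipbook.png")

frames = sorted(FRAME_DIR.glob("*.png"))
if not frames:
    raise SystemExit(f"No frames found in {FRAME_DIR}")

# Square grid, e.g. 64 frames -> an 8x8 sub-image layout.
grid = ceil(sqrt(len(frames)))
cell_w, cell_h = Image.open(frames[0]).size

atlas = Image.new("RGBA", (grid * cell_w, grid * cell_h))
for i, frame in enumerate(frames):
    col, row = i % grid, i // grid
    atlas.paste(Image.open(frame), (col * cell_w, row * cell_h))

atlas.save(OUT_FILE)
print(f"Packed {len(frames)} frames into a {grid}x{grid} flipbook: {OUT_FILE}")
```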

Conclusion

The production started quite a long time ago (over a year ago), mostly because in the beginning it was research and development for me. To put that into perspective, the last 5 minutes of the film were made in maybe a couple of months. Once I had developed a good pipeline, finished all the preproduction, and had a good plan, the production started to go very fast. You basically have a scene set up, add the lights, drop the characters into the scene, and roll the camera. I really loved this approach, because it actually simulates film production and shifts most of your attention to the actual filmmaking.
I think with a traditional approach I would most likely never have made it alone. Now, with all the knowledge I have, I have started a new short animated film with a slightly more complicated story. Currently it is in preproduction, but soon I will be able to share more information about it.

Follow Stan

ArtStation:
https://www.artstation.com/illusorisch

YouTube:
https://www.youtube.com/user/stas170187/videos

LinkedIn:
https://www.linkedin.com/in/stanislav-petruk-13624868/
