Winning Tips & Tricks: The Making of Petar Puljiz’s “Dear Rosie” Showcase

The “Winning Tips & Tricks” series covers practical workflows and techniques shared by winners of the “2021 iClone Lip Sync Animation Contest”. To show the full spectrum of the iClone facial pipeline, we are introducing projects rendered with different 3D engines, including Unreal, Unity, iClone, Blender, and Cinema 4D. Let’s now take a look at Petar Puljiz’s “Dear Rosie”, which combines Character Creator, iClone, Marvelous Designer, Blender, and Unreal Engine in a single workflow.

Petar Puljiz

Hello! I’m Petar (Peter) Puljiz, originally from Split, Croatia. I describe myself as a visual producer and artist, but I work in many roles. Currently, my main role is Visual Artist at Lowly from Trap Nation, where I produce and publish experimental visual content for music. My current focus is virtual humans and their implementation and representation on digital and virtual platforms. One of my virtual humans is mister visual, which I will write more about soon.

Why did I create this showcase? 

The goal of this project was to explore the most fun and painless way to animate a complex character such as a MetaHuman with a variety of tools. I used to believe I could get finished character animation straight from motion capture, but that was a bit naive: captured motion data always needs a lot of cleanup, and that’s just the way it is. Since I’m getting into a new storytelling format – virtual influencers – I need a one-stop-shop solution and a more streamlined process for animating MetaHumans. The iClone Lip Sync Animation Contest was a great motivation to free up time and dive into these tools. So I set up some criteria, took a deep breath, and dove into iClone!

A quick summary of my workflow

  • After refining the Rokoko mo-cap data, I used iClone’s Motion Layer Editor to create several natural body movements that emphasize the character’s talking animation, including hand contact on the table and grabbing and releasing the cup prop.
  • Like the other contestants, I used AccuLips to drive the main lip-sync animation, and observed the Unreal render in real time via the iClone-to-Unreal Live Link.
  • Instead of using facial mo-cap, I chose to blend subtle emotional expressions with iClone’s Face Puppet, and made the final detailed tweaks via MetaHuman’s facial controls.

I relied on Marvelous Designer to create the character’s unique dress and gave it dynamic animation within an environment built in Blender. Now that you know the outline, let me break down the details of each production phase.

How I do it with Reallusion Tools

Preparation: Concepting my character

First, I step into the shoes of my female character, Rosie, and describe her personality, situation, and anatomy. I try to look at the world from her point of view. It helps me understand how she acts and behaves in certain situations, which ultimately shows in the 3D animation.

Step 1: Rokoko – mocap suit and gloves: preparation work and motion data capture

I used a Rokoko Smartsuit and its Smartgloves for motion capture. My first reminder for you: get used to the suit! Because it is very tight, it will restrict your movement when performing. Don’t be afraid to stretch the costume to extend your body movement; you need to feel comfortable first!
The mocap animation provided me with an awesome block-out for the movement; it gave me great timing and a canvas to animate on. After the mocap session was over, we prepared the data for retargeting from Rokoko to iClone via 3DXChange. It’s easier if you have a plugin, but this is a good way to do it too.
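
As a side note, Rokoko Studio also exposes a local HTTP Command API, which is handy when you capture alone and cannot reach the keyboard in the suit. The sketch below is not part of my original setup; it assumes the documented defaults (port 14053 and the “1234” API key), so verify both in your version of Studio before relying on it.

```python
# Minimal sketch: remote-triggering a Rokoko Studio take over its local
# Command API. The port (14053), API key ("1234"), and endpoints are the
# documented defaults -- verify them in Studio's Command API settings.
import time
import requests

BASE = "http://127.0.0.1:14053/v1/1234"

requests.post(f"{BASE}/calibrate")        # run the straight-pose calibration
time.sleep(8)                             # wait out the calibration countdown
requests.post(f"{BASE}/recording/start")  # start recording the take
time.sleep(30)                            # ...perform...
requests.post(f"{BASE}/recording/stop")   # stop and save the clip in Studio
```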

Step 2: iClone – cleanup, additive layers for the body, AccuLips for mouth animation editing, and Face Puppet for the rest of the face

Here we are, the moment of truth! I started clicking around to see all the tools, and took a brief look at the official documentation to get a much clearer picture. I began by placing a MetaHuman dummy with an ExPlus face in the scene, along with the additional props my character would interact with, and set up an additive layer to manipulate the mocap animation data and fix the root motion. I used IK controls for the bigger moves and polished them with FK controls later on. I parented the cup of coffee to the hand bone, and deactivated the link when she leaves it on the table. I fell in love with Reach Targets, ain’t gonna lie! For hand placement, I used Reach Targets so the hands stay on the table, then deactivated them whenever an arm needed to be raised.

For the fingers, Gestures helped me create the main keys, which I then polished manually. After the body, I started on the face, beginning with the mouth, where AccuLips created the main layer of the animation. Then I used the iPhone LIVE FACE app to get a good foundation for the rest of the face.

After the mouth, I did the eyes, eyebrows, and nose. The whole time, I had a second monitor running Unreal Engine with Live Link turned on, so I could see exactly what I was getting instead of relying solely on the iClone viewport.

Step 3: UE4 – recording the animation, Control Rig cleanup, and additive layers

Once I finished the character animation in iClone, I transferred the animation data to Unreal via Live Link and Take Recorder. Disable the Reduce Keys option, because we want to preserve all of that “juicy” raw animation data. I also set all my keyframes to Linear, because every frame is keyframed, and you don’t want auto-smoothing or curve tangents creating errors.
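
If you would rather script that Linear pass than click through every track, Unreal’s Python API can apply it across a whole recorded sequence. This is a hedged sketch rather than my actual setup: the asset path is hypothetical, and the sequencer scripting classes it relies on may vary slightly between engine versions.

```python
import unreal

# Hypothetical path to the Take Recorder output -- adjust to your project.
TAKE_PATH = "/Game/Cinematics/Takes/Rosie_Take_01"

level_sequence = unreal.load_asset(TAKE_PATH)

# Walk every binding -> track -> section -> channel and force Linear
# interpolation, so the dense per-frame keys recorded from Live Link
# are not re-smoothed by auto tangents.
for binding in level_sequence.get_bindings():
    for track in binding.get_tracks():
        for section in track.get_sections():
            for channel in section.get_all_channels():
                for key in channel.get_keys():
                    # Some channel types (bool, enum) have no interpolation.
                    if hasattr(key, "set_interpolation_mode"):
                        key.set_interpolation_mode(
                            unreal.MovieSceneKeyInterpolation.LINEAR)
```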

In Unreal, I imported the animation onto our character, baked the data to Control Rig, and played with it a little. I spent some time here just cleaning up what we didn’t need and adding some fine-tuning, especially on the face.

Step 4: Import the baked animation sequence from Unreal Engine and convert it to Alembic with a fixed table prop

Once I was satisfied with the animation editing, I baked the Control Rig animation into an Animation Sequence asset and exported it in FBX format. From there, I imported the FBX into Blender and converted it to an Alembic cache, which is what Marvelous Designer reads for the cloth simulation.

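The Blender hop can be done by hand through the import/export menus, or scripted with two operators. Here is a minimal sketch; the file paths and frame range are placeholders, not values from my project.

```python
# Run inside Blender's Scripting tab, or headless via:
#   blender --background --python fbx_to_abc.py
import bpy

FBX_IN = "C:/Dear_Rosie/rosie_anim.fbx"   # placeholder paths -- use your own
ABC_OUT = "C:/Dear_Rosie/rosie_anim.abc"

# Start from an empty scene so only the character ends up in the cache.
bpy.ops.wm.read_factory_settings(use_empty=True)

# Import the baked animation exported from Unreal.
bpy.ops.import_scene.fbx(filepath=FBX_IN)

# Match the export range to the take's length before caching.
scene = bpy.context.scene
scene.frame_start = 0
scene.frame_end = 600   # placeholder -- set to your take's last frame

# Write an Alembic cache, sampled per frame, for Marvelous Designer.
bpy.ops.wm.alembic_export(filepath=ABC_OUT, start=scene.frame_start,
                          end=scene.frame_end)
```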

Step 5: Marvelous Designer garment and simulation

The goal here was to test the pipeline and create small, interactive cloth details. After importing the Alembic file, I applied the motion to simulate the cloth animation.

Step 6: Transferring from Unreal Engine 4 to Unreal Engine 5

With the simulation done, it was ready to be imported into Unreal. Once every setting was in place, I went on to finish the scene and the lighting. After a couple of hours of adjustment, voila! That’s how I completed it, and I was then ready to send all the animated files off for the full render.

Hope you enjoyed reading my tips and tricks!
