April 19, 2022

Character Creator 4 – Create Animated Personalities

Work in Progress: Part Three

In the latest WIP III video, we’d like to show how artists can create accentuated characters, each with a unique set of expression styles. With the newly introduced Expression Profile Editor, designers can use native modeling morphs and bone tools to reshape expressions. Artists can also take advantage of the FBX workflow between Blender, Maya, and 3ds Max to create custom expression morphs and make batch facial updates back to Character Creator.

Furthermore, Delta Mush helps you smooth out messy parts of the mesh, whether to fix cloth distortion or to clean up facial morphs. Character Creator 4 enables everyone to produce high-quality, animation-ready models with well-crafted personalities.

Expression Editor: Native Morph Slider Editing

With a well-constructed interface, users can easily adjust characters’ facial features from different angles. Not only does the succinct thumbnail design make editing more intuitive – what you see is what you get – but the Modify panel also lets users tweak each morph slider in detail. Without the aid of any third-party tool, artists are encouraged to use Edit Mesh to design their own customized morph sliders in Character Creator 4.

The red dots in each thumbnail mark the parts controlled by the morph slider. Artists can edit the mesh of each slider to customize a character’s stylized expressions.
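Under the hood, each morph slider is a blendshape: the mesh you see is the neutral mesh plus a weighted sum of per-slider vertex offsets. The NumPy sketch below illustrates that evaluation in general terms; the array shapes and names are illustrative assumptions, not Character Creator’s internals.

```python
import numpy as np

def apply_sliders(base, morphs, weights):
    """Evaluate blendshapes: neutral mesh plus weighted per-slider offsets.

    base    : (N, 3) float array of neutral vertex positions
    morphs  : dict mapping slider name -> (N, 3) vertex offsets
    weights : dict mapping slider name -> slider value, typically in [0, 1]
    """
    result = base.copy()
    for name, delta in morphs.items():
        result += weights.get(name, 0.0) * delta
    return result
```

Editing a slider’s mesh, as described above, amounts to replacing its offset array while leaving the weighting scheme untouched.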

Blink animations are probably the most challenging area when it comes to making characters unique and bringing them to life. In Character Creator 4, blink animations rely on five distinct morph sliders: Eye Blink, Eyelid, Eyelash, Eyeballs, and Eye Look.

By adjusting the source morph target shape of these sections, each expression morph slider can be revised, and with the Edit Mesh tool one can quickly overcome these obstacles. Lastly, having finished the eye animation for one side of the face, artists can use the Mirror tool to instantly apply the same slider settings to the other eye.

The Mirror function can be seen at 3:11.
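Conceptually, the Mirror tool copies each vertex offset to its counterpart reflected across the character’s symmetry plane and flips the offset’s X component. A simplified NumPy sketch of that idea, assuming a symmetric mesh and hypothetical vertex arrays (this is not CC4’s internal implementation):

```python
import numpy as np

def mirror_morph(rest, delta, tol=1e-4):
    """Transfer per-vertex morph offsets across the X = 0 symmetry plane.

    rest  : (N, 3) rest-pose vertex positions
    delta : (N, 3) morph offsets authored on one side of the face
    """
    flip = np.array([-1.0, 1.0, 1.0])
    mirrored = np.zeros_like(delta)
    reflected = rest * flip  # rest positions reflected across X = 0
    for i, p in enumerate(reflected):
        # Find the vertex whose rest position matches vertex i's reflection.
        j = int(np.argmin(np.linalg.norm(rest - p, axis=1)))
        if np.linalg.norm(rest[j] - p) < tol:
            mirrored[j] = delta[i] * flip  # flip the X component of the offset
    return mirrored
```

A production tool would precompute the vertex pairing once and reuse it for every slider, rather than searching per vertex.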

Besides modeling morph targets, Character Creator 4’s Bone Adjustment feature can be used to modify specific facial expressions and mouth motions that are often inextricably linked to jaw movements. Freely design nuanced emotions by customizing the mesh of each morph slider. Mesh edits, bone adjustments, and morph target models can all be saved to custom sliders.

The Quick Update feature refreshes the iClone Muscle Edit data along with facial mocap, giving your characters unique styles of animation that are hard to emulate.

Bone adjustment – Jaw

The Highlight feature can promptly update the viewport to preview changes made to the morph sliders, which is very useful for fixing morph shapes while checking specific expressions and performances. In addition, morph sliders that are activated during playback are clearly listed under the Currently Used section of the interface.

Besides the built-in Expression Editor, Character Creator 4 provides Mesh and Expression Set options for exporting individual facial morph sliders in sequential frames, editable in whichever third-party 3D software you are familiar with. Consequently, artists taking advantage of the seamless GoZ pipeline between Character Creator and ZBrush can update their custom expression shapes on the fly, with a simple click of a button.

Single slider update – GoZ  

Ⅰ. Once you’ve made your adjustments using expression sliders, you can bake your work as a component of the character’s expression profile. 

Ⅱ. Press the GoZ button to export it from Character Creator (CC) to ZBrush for advanced editing.

Ⅲ. Export the edited profiles back to CC using the Re-Align function to update the newly designed expressions on your characters.

Expression Editor: External Blendshape Editing (FBX/OBJ)

Beyond fixing CC3+ characters, you can take a custom-rigged character from Blender or another 3D application and add an entire personality within hours. Directly import FBX blendshape files from Blender, Maya, 3ds Max, Unity, Unreal, MotionBuilder, and Cinema4D, just to name a few.

Most importantly, the Batch Update feature of Character Creator 4 lets you import and export large sets of expression files all at once. Less tedious than ever, this painless new workflow delivers precise results in a very short period of time.
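To give a feel for the Blender side of such a pipeline, here is a minimal bpy sketch that adds a custom shape key and exports the mesh with its blendshapes to FBX. The object name, shape-key name, vertex edit, and file path are all hypothetical stand-ins; the actual slider names expected by Character Creator come from its exported Expression Set.

```python
import bpy

obj = bpy.data.objects["CC_Base_Body"]  # hypothetical name of the imported CC mesh

# Make sure a Basis shape key exists, then add a custom expression morph.
if obj.data.shape_keys is None:
    obj.shape_key_add(name="Basis")
key = obj.shape_key_add(name="Brow_Raise_L", from_mix=False)

# Stand-in for real sculpting: nudge the first 100 vertices upward slightly.
for point in key.data[:100]:
    point.co.z += 0.002

# Export with the shape keys intact; mesh modifiers are disabled so the
# blendshapes survive the FBX export.
bpy.ops.export_scene.fbx(
    filepath="//cc4_expressions.fbx",
    use_mesh_modifiers=False,
)
```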

Batch update – FBX Pipelines : Blender
Batch update – FBX Pipelines : Maya
Batch update – FBX Pipelines : 3ds Max

Mesh Smoothing Tool: Delta Mush

Besides the Expression Editor, Character Creator 4’s Delta Mush helps artists smooth the mesh of a character’s facial expressions and body topology, including outfits, shoes, and even the patterns on their clothes! The whole mesh, or just part of it, can be cleaned up to achieve proper topology. This is especially useful for characters with faulty or irregular meshes resulting from intensive model processing.

• Fix cloth distortion

• Clean facial morphs
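For background, Delta Mush is a well-known deformation clean-up technique: smooth the deformed mesh with Laplacian averaging, then add back the per-vertex detail offsets (“deltas”) recorded from the rest pose. The simplified NumPy sketch below conveys that smooth-then-restore idea; production implementations store each delta in a local tangent frame so it rotates with the deformation, which this global-space version omits.

```python
import numpy as np

def laplacian_smooth(verts, neighbors, iterations=10, alpha=0.5):
    """Repeatedly move each vertex toward the average of its neighbors."""
    v = verts.copy()
    for _ in range(iterations):
        avg = np.array([v[n].mean(axis=0) for n in neighbors])
        v += alpha * (avg - v)
    return v

def delta_mush(rest, deformed, neighbors):
    """Simplified, global-space Delta Mush.

    rest      : (N, 3) rest-pose vertex positions
    deformed  : (N, 3) positions after skinning/morphs, possibly distorted
    neighbors : list of N index lists, one per vertex
    """
    # 1. Record detail deltas: the rest pose minus its smoothed version.
    deltas = rest - laplacian_smooth(rest, neighbors)
    # 2. Smooth the deformed mesh, removing pinching along with fine detail.
    smoothed = laplacian_smooth(deformed, neighbors)
    # 3. Add the deltas back to restore the surface detail.
    return smoothed + deltas
```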

You can check the 3D Character Creation & Animation webpage for more information about the free upgrade offer, feature comparison, and FAQs. If you’re interested in the Character Creator 4 mottos of “Enlivening any Character” and “Animating any Rig”, you can refer to the article “Work in Progress: Part One”. It’s also recommended to watch the Sneak Peek review narrated by Russell Midfield, who precisely explains each feature. For more technical details, please have a look at our latest New Features Introduction.

If you have further questions, please contact our Support Team, or join the discussion at Reallusion Forum.

Winning Tips & Tricks: The Making of Petar Puljiz’s “Dear Rosie” Showcase

The “Winning Tips & Tricks” series covers practical workflows and techniques shared by winners of the “2021 iClone Lip Sync Animation Contest”. To let users see the full spectrum of the iClone facial pipeline, we are introducing projects rendered with different 3D engines, including Unreal, Unity, iClone, Blender, and Cinema4D. Let’s now take a look at Petar Puljiz’s “Dear Rosie”, which combines workflows in Character Creator, iClone, Marvelous Designer, Blender, and Unreal Engine.

Petar Puljiz

Hello! I’m Petar (Peter) Puljiz, originally from Split, Croatia. I describe myself as a visual producer and artist, but I work in many roles. Currently, my main role is Visual Artist at Lowly from Trap Nation, where I produce and publish experimental visual content for music. My current focus is virtual humans, and their implementation and representation on digital and virtual platforms. One of my virtual humans is mister visual, which I will write more about soon.

Why did I create this showcase? 

The goal of this project was to explore the most fun and painless way to animate complex characters such as MetaHumans with a variety of tools. I used to believe I could get finished animation straight from motion capture, but that was a bit naive: captured motion data requires a lot of clean-up, and that’s just the way it is. Since I’m getting into a new storytelling format – virtual influencers – I need a lot of tools in a one-stop-shop solution, and a more streamlined process to animate MetaHumans. The iClone Lip Sync Animation Contest was great motivation to free up time and dive into their tools. So I set up some criteria, took a deep breath, and dove into iClone!

A quick summary of my workflow

  • After refining the Rokoko mo-cap data, I used iClone’s Motion Layer Editor to create several natural body movements emphasizing the character’s talking animation, including hand contact on the table, and grabbing and releasing the cup prop.
  • Like the other contestants, I used AccuLips to drive the main lip-sync animation and observed the Unreal render in real-time via the iClone to Unreal Live Link.
  • Instead of using a facial mo-cap, I chose to blend subtle emotional expressions using iClone’s Face Puppet and made final detailed tweaks via MetaHuman’s facial controller.

I relied on Marvelous Designer to create the character’s unique dress and gave it dynamic animation within an environment built in Blender. Now that you know the outline, let me break down the details of each production phase.

How I do it with Reallusion Tools

Preparation: Concepting my character

First, I step into the shoes of my female character, Rosie, and describe her personality, situation, and anatomy. I try to look at the world from her point of view. It helps me understand how she acts and behaves in certain situations, which will ultimately be reflected in the animation.

Step 1: Rokoko – mocap, gloves, and suit: preparation work and motion data capturing

I used a Rokoko Smartsuit and its Smartgloves for motion capture. The first reminder for you: get used to the suit! Because it is very tight, it will restrict your movement when performing. Don’t be afraid to stretch the costume to extend your body movement — you need to feel comfortable first!
Mocap provided me with an awesome block-out for movement; it gave me great timing and a canvas to animate on. After the mocap session was over, we prepared the data and retargeted it from Rokoko to iClone via 3DXChange. It’s easier if you have a plugin, but this is a good way to do it too.

Step 2: iClone – clean-up, an additive layer for the body, AccuLips for mouth animation, and Face Puppet for the rest of the face

Here we are, the moment of truth! I started clicking around to see all the tools and took a brief look at the official documentation to get a clearer picture. I began by placing a MetaHuman dummy and an ExPlus face in the scene, along with the additional props my character would interact with, and set up an additive layer to manipulate the mocap data and fix the root motion, using IK controls for the bigger moves and polishing them with FK controls later on. I parented the cup of coffee to the hand bone and deactivated the link when she leaves the cup on the table. I fell in love with Reach Targets — ain’t gonna lie! For hand placement, I used Reach Targets so the hands stay on the table, then deactivated them when an arm needed to be raised.

For the fingers, Gestures helped me create the main keys, which I then polished manually. After the body, I started on the face, beginning with the mouth and using AccuLips to create the main layer of animation. Then I used the iPhone LIVE FACE app to get a good foundation for the rest of the face.

After the mouth, I did the eyes, eyebrows, and nose. The whole time, I had a second monitor running Unreal Engine with Live Link turned on, so I could see exactly what I was getting instead of relying solely on the iClone viewport.

Step 3: UE4 – Record animation, then ControlRig clean-up and an additive layer

Unreal Engine

Once I finished the animation in iClone, I transferred the animation data to Unreal via Live Link and the Take Recorder. Disable the Reduce Keys option, because we want to preserve all of that “juicy” raw animation data. I also set all my keyframes to Linear: since every frame is keyframed, you don’t want auto-smoothing or curves, as they can create errors.

In Unreal, I imported the animation onto our character, baked the data to Control Rig, and played with it a little. I spent some time here cleaning up what we didn’t need and applying some fine-tuning, especially on the face.

Step 4: Import the baked animation sequence from Unreal Engine and convert it to Alembic with a fixed table prop

Once I was done and satisfied with the animation, I baked the ControlRig animation sequence to get an animation asset from it. With that, I exported the animation in FBX format. From there, I imported it into Blender to convert it to an Alembic file, which was then brought into Marvelous Designer.
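For reference, the Blender leg of this conversion can be scripted with a couple of operator calls. A minimal bpy sketch, with hypothetical file paths and frame range:

```python
import bpy

# Import the animation that was baked out of Unreal Engine as FBX.
bpy.ops.import_scene.fbx(filepath="/tmp/rosie_anim.fbx")

# Export the scene as an Alembic cache for Marvelous Designer.
bpy.ops.wm.alembic_export(
    filepath="/tmp/rosie_anim.abc",
    start=1,
    end=250,  # match the animation's actual frame range
)
```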


Step 5: Marvelous Designer garment and simulation

The goal here was to test the pipeline and create small, interactive cloth details. After importing the Alembic file, I applied the motion to simulate the cloth animation.

Step 6: Transferring from Unreal Engine 4 to Unreal Engine 5

Our simulation is done and ready to be imported into Unreal. Once everything is set up in Unreal, I continue by finishing the scene and lighting. And after a couple of hours of adjustments, voila! That’s how I completed it, and now I’m ready to send all the animated files off for the full render.

Hope you enjoyed reading my tips and tricks!