4 One-minute Tips to turn your MetaHumans from Unreal into Real

Using iClone to animate Unreal MetaHumans.

The exquisitely detailed MetaHuman has captured the eye of the media and entertainment industry. That said, how does one bring MetaHumans to life with talking animations, emotes, and tailor-made performances, all at production speed and without breaking realism? This is where iClone MetaHuman Live Link comes in, providing designers with a comprehensive and highly efficient way to animate full-body MetaHumans, complete with lip-sync.

The following diagram is a quick overview of using this method to create natural talking animations in the shortest time possible:

iClone’s suggested process for natural facial performances.

In addition, you can refer to the video on the making of “Lieutenant Thompson”.

Here are some important tips for newcomers to this procedure:

Tip 1: Accurate & Smooth Lip-sync

Simply import your audio or record directly in iClone, and AccuLips will automatically detect and generate precise text and visemes for your talking characters. AccuLips also has a text-to-speech capability that generates the waveform data from the provided text, which can be changed on the fly. The expressiveness of each viseme can be adjusted on a per-component basis covering the lips, tongue, and jaw. What’s more, you can choose from a large library of talking styles, such as yelling or whispering, to layer on top. Users will no longer be plagued by mechanical mouth and jaw juddering during high-frequency or high-speed dialogue, because AccuLips transitions from word to word in a way that simulates natural slurring.
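
To make the per-component idea concrete, here is a minimal Python sketch of how viseme keys with independent lip, tongue, and jaw strengths might be represented and scaled. The data structures and function names are illustrative assumptions, not the actual iClone or AccuLips API.

```python
# Illustrative sketch only -- these structures are assumptions,
# not the actual iClone/AccuLips API.
from dataclasses import dataclass

@dataclass
class VisemeKey:
    time: float       # seconds into the audio clip
    viseme: str       # e.g. "AA", "Oh", "B_M_P"
    lips: float       # per-component expressiveness, 0.0-1.0
    tongue: float
    jaw: float

def scale_component(keys: list[VisemeKey], component: str, factor: float) -> None:
    """Scale one articulation component (lips/tongue/jaw) across all keys,
    clamping to the 0-1 range the sliders expose."""
    for key in keys:
        value = getattr(key, component) * factor
        setattr(key, component, max(0.0, min(1.0, value)))

# Example: soften jaw motion on fast dialogue to reduce juddering.
track = [VisemeKey(0.10, "AA", 0.8, 0.5, 0.9),
         VisemeKey(0.22, "B_M_P", 0.6, 0.2, 0.7)]
scale_component(track, "jaw", 0.6)
```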

Tip 2: Powerful & Flexible Facial Mo-cap

Once accurate lip shapes are obtained via AccuLips, you’ll want to breathe life into the performance of your digital actors, and nothing is more effective than adding facial expressions on top of your talking animations. Here are some important tips for iPhone facial capture that you should be aware of.

Tracking Data Multiplier

First, control the strength of the tracking data with the strength Multiplier attribute while the actor is performing. This lets you increase or decrease the expression strength for a more convincing speech delivery.
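
Conceptually, the multiplier simply scales the incoming ARKit blendshape weights before they are recorded. Here is a hedged sketch of that idea; the frame format and function name are assumptions for illustration, not the actual Motion LIVE API.

```python
# Conceptual sketch: scaling live ARKit blendshape weights by a
# strength multiplier before recording. Names are illustrative,
# not the actual iClone Motion LIVE API.
def apply_multiplier(frame: dict[str, float], multiplier: float) -> dict[str, float]:
    """Scale each tracked blendshape weight, clamped to ARKit's 0-1 range."""
    return {name: max(0.0, min(1.0, weight * multiplier))
            for name, weight in frame.items()}

# Example: tone down an over-acted take to 70% strength.
raw_frame = {"jawOpen": 0.9, "browInnerUp": 0.8, "eyeBlinkLeft": 0.1}
recorded = apply_multiplier(raw_frame, 0.7)
```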

Multi-pass Recording

Second, take advantage of the flexibility afforded by multi-pass recording. For example, you can record one part of the face in a given pass and layer another recording on top for a different part of the face, over and over again. Instead of redoing the whole performance with a new take, you can focus on a single part each time and simply add the passes together to compose the most refined expressions with subtle nuances: a cheeky smirk, a brooding frown, a scrutinizing squint, and so on.
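
One way to picture multi-pass layering: each pass records curves only for a chosen set of facial channels, and the passes are merged channel by channel. A simplified sketch follows; the channel groupings are illustrative assumptions, not iClone internals.

```python
# Simplified sketch of multi-pass layering: later passes overwrite
# only the channels they recorded. Channel groupings are illustrative.
Pass = dict[str, list[float]]  # channel name -> per-frame weights

def composite(passes: list[Pass]) -> Pass:
    """Merge recording passes; each later pass replaces only the
    channels it actually captured, leaving the rest untouched."""
    result: Pass = {}
    for p in passes:
        result.update(p)
    return result

mouth_pass = {"jawOpen": [0.2, 0.5, 0.3], "mouthSmileLeft": [0.1, 0.4, 0.6]}
eye_pass   = {"eyeBlinkLeft": [0.0, 1.0, 0.0], "eyeLookDownLeft": [0.2, 0.2, 0.1]}
final_take = composite([mouth_pass, eye_pass])
```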

Convincing Performance

Third, avoid fixing your eyes on your phone while recording, as this can create a sort of “dead stare”. You will be better off pretending the phone does not exist and performing naturally, relaxing your eyes with subtle eyeball movements by looking at different spots around your periphery. And, of course, don’t forget to blink!

Not everyone can muster the skills and dexterity of a professional stage actor, so to mitigate stiff performances, you should resort to multi-pass recording. We recommend recording facial movements in the first pass and adding natural eye movements and performances in the second pass. Most importantly, do consider the proper reactions and intentions of your digital actors according to the situation and mood they are dealing with. Try to inhabit the scene and act as though you are the performing MetaHuman. For further tips on producing a compelling performance, watch this tutorial: iClone Tutorial – Quick Tips for the Best Facial Animation Performance.

Live Smoothing

Finally, remember to utilize the Smooth feature, either for real-time tracking or for direct application to the recorded data, to fix the unwanted neck juddering that usually occurs with head rotations.
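
If you are curious what such smoothing does mathematically, a low-pass filter over the rotation channel is one common approach. Below is a minimal sketch using an exponential moving average; iClone does not document the exact algorithm behind its Smooth feature, so treat this purely as an illustration.

```python
# Minimal low-pass filter sketch for head-rotation smoothing.
# An exponential moving average is one common choice; the actual
# algorithm behind iClone's Smooth feature is not documented here.
def smooth(samples: list[float], alpha: float = 0.3) -> list[float]:
    """Exponentially smooth a channel (e.g. head yaw in degrees).
    Lower alpha = heavier smoothing, but more lag."""
    out = [samples[0]]
    for s in samples[1:]:
        out.append(alpha * s + (1.0 - alpha) * out[-1])
    return out

# A jittery head-yaw track (degrees): the spike at frame 4 is damped.
head_yaw = [0.0, 1.2, -0.8, 6.5, 1.0, 0.9]
print(smooth(head_yaw))
```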

To learn more about the iPhone facial tracking features, see the following link: https://mocap.reallusion.com/iclone-motion-live-mocap/iphone-live-face.html

Tip 3: Face Puppet with Intuitive Mouse Control

iClone’s Face Puppet tool is an effective expression mixer for the eyes, brows, lips, and tongue, which allows you to intuitively create facial expressions with the mouse. In addition, iClone provides 7 ExpressionPlus ARKit profiles that offer a slew of subtle facial movements with solo or multi-area tweaking, adjustable expression strength, and head rotation and tilting. The movements of the face and head are range-bound, so they can never exceed their natural range of motion; flicking the mouse will never produce an awkward movement.
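
The range-bound behavior amounts to mapping mouse movement into a control value and clamping it to each control’s natural limits. A brief sketch of that idea follows; the control names and ranges are made-up illustrative values, not iClone’s real ones.

```python
# Illustrative sketch of range-bound puppeteering: mouse deltas map
# to expression values clamped to each control's natural limits.
# The controls and ranges below are made-up examples, not iClone's.
CONTROL_RANGES = {
    "brow_raise": (0.0, 1.0),
    "jaw_open":   (0.0, 1.0),
    "head_yaw":   (-40.0, 40.0),   # degrees
}

def puppet(control: str, value: float, mouse_dx: float,
           sensitivity: float = 0.01) -> float:
    """Nudge a control by mouse movement, never exceeding its range."""
    lo, hi = CONTROL_RANGES[control]
    return max(lo, min(hi, value + mouse_dx * sensitivity))

# Even a wild mouse flick lands inside the natural range.
yaw = puppet("head_yaw", 10.0, mouse_dx=50000)   # clamped to 40.0
```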

Tip 4: Face Key Editing to Fine-tune Details

Once your MetaHumans have good lip-sync and suitable expressions on their faces, the next step is to fine-tune the facial details to turn performances from good to great. Human expressions walk a fine line: a digital actor can go from looking joyous one second to furious the next due to very minute shifts in the facial muscles, and if one is not careful, unintended expressions can send mixed signals to the audience. iClone’s Face Key editor, expression presets, and variety of facial morph sliders can help you get it right each and every time. Expression presets comprise 7 emotion categories with 12 variations each and 4 mixed ExPlus categories with 15 variations each. A total of 60 standard and 63 ExPlus morph sliders are available to faithfully form multi-layered expressions, including plenty of blendshapes for the tongue as well as one-to-one human-scanned ARKit data, and each slider can be adjusted from subtle to exaggerated.
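
As a mental model, an expression preset is a vector of morph-slider values, and fine-tuning scales it from subtle to exaggerated while layering hand-keyed corrections on top. Here is a brief sketch under that assumption; the slider names and values are illustrative, not iClone’s actual preset data.

```python
# Mental-model sketch: a preset is a vector of morph-slider values;
# fine-tuning scales it and layers per-slider corrections on top.
# Slider names and values are illustrative assumptions.
joy_preset = {"mouthSmileLeft": 0.7, "mouthSmileRight": 0.7,
              "cheekSquintLeft": 0.4, "browInnerUp": 0.2}

def apply_preset(preset: dict[str, float], intensity: float,
                 overrides: dict[str, float] | None = None) -> dict[str, float]:
    """Scale a preset from subtle (<1.0) to exaggerated (>1.0),
    then layer per-slider corrections over the result."""
    out = {k: max(0.0, min(1.0, v * intensity)) for k, v in preset.items()}
    if overrides:
        out.update(overrides)
    return out

# A subtle smile, but with the brows kept fully relaxed.
face_keys = apply_preset(joy_preset, intensity=0.5,
                         overrides={"browInnerUp": 0.0})
```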

All in all, giving a natural human-like performance to your MetaHumans may not be easy, but with iClone’s dedicated solutions, the process can be painless, and even fun. Most importantly, having personable, life-like performances for your digital actors can be a very rewarding experience for creators and audiences alike.
