MasterClass #3: Blender Veteran Introduces a New Character Creation & Animation Pipeline

In this four-part tutorial series, Marko documents the process of creating a fully-rigged character in Character Creator 3, animating the character with iPhone facial mocap and the Reallusion motion library in iClone, and importing the character and motion into Blender. Marko showcases two methods for working with motion data in Blender: one uses Auto-Rig Pro, the other uses the Nonlinear Animation (NLA) editor.

Marko Matosevic has been a Blender enthusiast for over eight years. He runs his own YouTube channel (YouTube.com/Markom3D) and is a multi-award winner for several short films made with the Reallusion suite and Blender.

PART #3 – FACIAL MOTION CAPTURE FOR BLENDER

Facial animation is one of the most time-consuming aspects of animation, and it is what will make or break your work. iClone gives you the ability to capture facial animation directly: simply put an iPhone X or above in front of your actor and you are set. iClone then does all the hard work for you, saving you time and money. Within Blender, you could try using hooks and painted dots on your actor's face to track markers, but that approach comes with its own challenge of failed tracks.

In this article, we will go through the process of capturing facial motion data, exporting it from iClone, and importing it into Blender. From there, we will combine body and facial motion capture on one character in Blender to complete the animation cycle.

Generate Facial Animation with iClone, Motion LIVE and iPhone

The first requirement is that your PC and your iPhone (X or above) can talk to each other: they both need to be on the same Wi-Fi network, or the iPhone needs to be plugged directly into the PC via USB.

Once your character has been imported from Character Creator 3, select the character in iClone, open the Motion tab, then click Motion LIVE.

The Motion LIVE window will appear. Open the LIVE Face app on your iPhone and it will display an IP address. Enter that IP address in the connection settings for LIVE Face and press the green dot next to it.

If you would like to record audio (strongly recommended if you are doing facial motion capture), ensure that Record Audio for Viseme Track is checked and your microphone is selected. This can only be a microphone attached to the PC.

For the best result with facial motion capture, it is important to set the zero pose. To do this, look straight ahead with an expressionless face and click the Set Zero Pose option.

If you would just like to test your setup, select the Preview button and press Spacebar to start; you will see a live feed of your facial expressions. When you are ready to record, select the Record button and press Spacebar to start recording.

If you are unhappy with your recording, move the playback cursor back to the start and record again. If you would like to re-record only part of it, place the cursor at the start of the section you want to redo and follow the same steps.

If you are going to add body motion capture, I highly recommend doing it in iClone as well, since this speeds up the process of getting everything into Blender.

Export Facial Animation from iClone

Once you are happy with your facial animation, you will need to export the model and the audio. To export the model, click File > Export > Export FBX. Set the Target Tool Preset to Blender, and set the FPS to whatever your finished scene will be. It is important to set Export Range to Range and type in the start and end frames of the animation you would like to export. Lastly, it is a personal preference, but untick the Embed Textures box: this keeps the .fbx smaller and places all the textures into one main folder. Then press Export.

Next, export the audio by clicking Render > Export Audio. Set the Format to Video and the file format to .WAV. Change the Export Range to Range and input the same range you selected for the facial motion. Hit Export and save it to any location.

Import Facial Animation in Blender

From here we jump into Blender, select File > Import > FBX, find our .fbx file, and double-click it. The blend file will take its scene FPS from the first character you import: by default, iClone animation is at 60 FPS, so if you changed the FPS in the export settings, Blender will update accordingly. Your character will now be fully functional in Blender.
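
If you prefer to script the import, a minimal bpy sketch looks like the one below. The file paths are placeholders, and dropping the exported .WAV onto the Video Sequencer is an optional extra step, not something iClone does for you:

```python
import bpy

# Import the character FBX exported from iClone.
# Placeholder path -- point it at your own export.
bpy.ops.import_scene.fbx(filepath="C:/exports/MyCharacter.fbx")

# Match the scene frame rate to the iClone export (60 FPS by default).
scene = bpy.context.scene
scene.render.fps = 60

# Optionally place the exported .WAV on the Video Sequencer so the
# viseme audio stays in sync with the facial animation.
scene.sequence_editor_create()
scene.sequence_editor.sequences.new_sound(
    name="FacialAudio",
    filepath="C:/exports/MyCharacter.wav",  # placeholder path
    channel=1,
    frame_start=1,
)
```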

Combine Body Motion and Facial Capture

If you would like to append the body animation to the character we just imported, you will need to do a few things for everything to work smoothly. Ensure that the armature with the body motion has the same bone structure and naming convention as the character we just imported. Select the Animation tab at the top of Blender, and change the editor type of the top-left window to Nonlinear Animation.
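
Before appending, it is worth confirming that the two rigs really do share bone names. A quick sanity check you could run in Blender's Python console (the armature object names are placeholders for your own scene):

```python
import bpy

# Placeholder object names -- replace with your two armatures.
face_rig = bpy.data.objects["FaceCapture_Armature"]
body_rig = bpy.data.objects["BodyMotion_Armature"]

face_bones = {b.name for b in face_rig.data.bones}
body_bones = {b.name for b in body_rig.data.bones}

# Bones present in one rig but not the other will break action reuse.
print("Only in face rig:", sorted(face_bones - body_bones))
print("Only in body rig:", sorted(body_bones - face_bones))
```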

Once in the NLA Editor, select one of the armatures, select an NLA track, press Shift+A on the track, and choose the corresponding action from the second character. The catch here is that whichever animation strip is on top wins: if both actions control the arms, the top strip (in this case, WalkStart) is the animation that will play.
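
The same layering can be done in Python. A sketch, assuming the body action is named "WalkStart" and using a placeholder armature name; new tracks are added above existing ones, so the new strip overrides any overlapping channels on the tracks below it:

```python
import bpy

arm = bpy.data.objects["FaceCapture_Armature"]  # placeholder name
body_action = bpy.data.actions["WalkStart"]     # action from the body rig

# Reuse existing animation data, or create it if the rig has none yet.
ad = arm.animation_data or arm.animation_data_create()

# nla_tracks.new() stacks the track on top, so its strip takes priority.
track = ad.nla_tracks.new()
track.name = "BodyMotion"
track.strips.new("WalkStart", 1, body_action)
```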

Select the action strip you just brought in and press Tab to enter Edit Mode; the strip should turn green. On the armature with the body animation, select all the bones in the head, making sure you also select the Head bone, which sits between the facial bones and the neck bones.

In the Dope Sheet at the bottom, switch the mode from Dope Sheet to Action Editor, and you should see a list of keyframes for the selected bones. Select everything and press Delete.

This removes all animation for those bones from this action. In the NLA Editor, press Tab to exit Edit Mode. Going back to our earlier point, these bones now fall through to the next layer in the NLA stack and play the correct animation.
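
The same cleanup can be scripted by stripping every F-curve that drives a head or facial bone out of the body action. A sketch, where the action name and the bone-name prefixes are assumptions to adapt to your own rig's naming convention:

```python
import bpy

body_action = bpy.data.actions["WalkStart"]  # placeholder action name

# Bones whose keyframes should be removed from the body action so the
# facial capture on the other NLA layer shows through. Adjust these
# prefixes to match your rig.
head_prefixes = ("Head", "CC_Base_Head", "CC_Base_FacialBone")

for fc in list(body_action.fcurves):
    # F-curve data paths look like: pose.bones["Head"].rotation_quaternion
    if fc.data_path.startswith('pose.bones["'):
        bone_name = fc.data_path.split('"')[1]
        if bone_name.startswith(head_prefixes):
            body_action.fcurves.remove(fc)
```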

We have now successfully captured facial motion with iClone 7, exported it to Blender, and combined it with a second action containing body animation.

PART #1 – Character Creator to Blender with Auto-Rig Pro

PART #2 – Animating Characters in Blender

PART #4 – Combining Facial Shape Keys and Armature Animation in Blender with Auto-Rig Pro
