Get mocapping with iClone’s latest Faceware integration!
iClone Facial Mocap Plugin with Faceware Realtime for iClone
iClone7’s new real-time facial mocap system integrates Faceware’s industry-standard markerless motion capture with iClone in a uniquely powerful and customisable way, bringing new tools and techniques that can accelerate workflow and improve output quality. This tutorial takes you through the process step by step, from start to finish: best practices for setup, performance and post editing, plus advanced techniques for calibration, modifying mapping parameters, and reworking and improving a character’s animation morphs. The steps run in sequence from initial character preparation and mocap setup, through the many ways to customise the mocap system, to recording and, finally, post cleanup using iClone7’s core facial animation tools.
Step 01 – Character preparation
You can prepare characters for iClone in Reallusion’s dedicated Character Creator, or bring in custom characters from Daz3d and other applications via 3DXchange, as long as they have a suitable bone rig and morphs for facial animation. For Daz3d Genesis characters, Reallusion provides *.duf pose files which automate facial morph generation and expression setup on export – whilst for other custom models you can prepare compatibly named facial morph sets and set up expressions in 3DXchange’s expression editor.
Step 02 – Set up for mocap
Whether using a static or head-mounted camera, correct framing and balanced lighting are vital for best results. Your camera should support at least 30fps. A static camera should be positioned face-on, approximately at the user’s eye level, whilst a head-mounted camera should point approximately at the user’s nose. Select the appropriate tracking model (static or headcam) in Faceware Realtime, and choose either the staticcam, headcam or the more general Faceware.json file in the mocap plugin – use the json which works best for your character.
Step 03 – Calibration
Calibration maps your neutral expression to the character’s default expression, and all subsequent animation is relative to that expression. Use the grid overlay in Faceware Realtime to position your face correctly: for a static camera, your eyebrows and mouth should be balanced within the central box – whilst for a head mounted camera, your nose should be roughly central, with your mouth within the central box. Take a neutral, relaxed expression and press the calibrate button. When using staticcam and headcam jsons, open your mouth slightly for calibration – which can improve lipsync. Calibrate regularly to get best results.
Step 04 – Preview
Once calibrated, ensure your target character is selected in the Mocap Plugin and press the preview button followed by the keyboard spacebar to see live mocap. Test how well all facial animation, including lipsync, is working. Try different *.json files if needed, or customise the json (see Step 7) to get the results you want. With a static camera, keep your head rotations within around 30 degrees from the default position to maintain good tracking and minimise feature distortion. You’ll also find that higher framerate cameras will give smoother results, especially when moving very quickly between one expression and another.
Step 05 – Strength and Smooth
When checked in the mocap plugin, the smooth head checkbox softens head rotation, reducing head jitters. This control can also be used to smooth other parts of the face by changing the values in FaceSmootherSetting.ini, found in the iClone7/Resource/FacewareFacialMocap folder. Strength sliders let you control global as well as per-feature and head strength – useful for exaggerating animation for stylised characters, or reducing it for more naturalistic results. Adjust these during preview to see their effect before recording. You can save and load specific strength profiles for individual characters if you wish.
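To give a feel for the kind of per-feature values involved – the key names below are purely illustrative assumptions, not the plugin’s documented format, so always check the FaceSmootherSetting.ini shipped with your install – such a config typically pairs each facial region with a smoothing strength:

```ini
; Hypothetical sketch only - section and key names are assumptions,
; not the actual format of FaceSmootherSetting.ini. Refer to the
; copy in iClone7/Resource/FacewareFacialMocap before editing.
[Smoothing]
Head=5     ; higher value = heavier smoothing, less jitter, more lag
Brows=2
Eyes=0     ; 0 leaves a feature unsmoothed for maximum responsiveness
Mouth=1    ; keep mouth smoothing low to preserve lipsync detail
```

Whatever the real key names are, the trade-off is the same: heavier smoothing removes jitter but adds latency, so keep it lowest on the mouth, where responsiveness matters most for lipsync.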
Step 06 – Custom calibration
If a character has very particular features – e.g. low or high eyebrows, wide or squinting eyes – try mirroring the character’s features with your own during calibration: if the character has low brows, make your own brows a little lower; if the character is squinting or frowning, squint or frown a little for calibration. These techniques can help balance the animation. Conversely, if you have particular features yourself – e.g. low or high eyebrows – and the animation appears limited, try doing the opposite: if you have low eyebrows, raise them a little for calibration.
Step 07 – Modify mapping parameters
With a character selected, you can modify every expression using the sliders in the plugin’s mapping panel. Use the control name drop-down to select an expression and edit its blended morph and bone values to change that expression. Whilst you can edit individual expressions on a static character, it’s better to do this with visual feedback during preview, since expressions are blended together during mocap. Open the tracking data inspector for reference, to see which expressions fire when you pull a particular face – and when you make an edit, test fully by performing other expressions too during preview.
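For a rough mental model of what the *.json mapping profiles contain – the field names below are hypothetical assumptions for illustration, not the actual Faceware.json schema – a mapping of this kind associates each tracked expression with weighted morph and bone targets:

```json
{
  "_comment": "Hypothetical sketch only - names are assumptions, not the real schema",
  "mouthSmile_L": {
    "morphs": { "Smile_L": 0.8, "CheekRaise_L": 0.3 },
    "bones": { "Jaw": { "rotateX": -2.0 } }
  }
}
```

The mapping panel’s sliders are effectively editing weights like these: the same tracked input can drive several morphs and bones at once, which is why testing one edit against a range of other expressions during preview matters.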
Step 08 – Modify character morphs
If you need an expression which you’re unable to achieve via calibration or mapping parameters – e.g. a crooked smile or a particular cartoon effect – you can modify any of the blended animation morphs externally. Send the character to 3DXchange and, from the Face Setup Panel, export the morph you want to change. Modify it in an external package, then in 3DXchange replace the original with the modified morph and return the character to iClone. NB: modifying animation morphs will not only affect facial mocap, it will also affect iClone’s other facial animation tools where those morphs are used.
Step 09 – Lipsync
Staticcam and headcam *.jsons can provide better lipsync, especially for iClone characters, than the Faceware json. Calibrate with your mouth slightly open when using these jsons and test during preview. Spend some time getting the best calibration to find the sweet spot for lipsync before recording, as this can save hours of cleanup later. Select ‘record audio to viseme track’ and your chosen microphone to auto record and sync audio, as well as produce visemes on the timeline. Tongue motion is generated by default, but you can edit visemes by re-enabling lips and jaw strength on the Lip Options track.
Step 10 – Record performance
Now you’re ready to record facial mocap, it’s time to focus on performance. Whilst the system has many ways to control how the animation comes through, the most powerful control is your own face. Exactly how you perform is up to you, but it’s worth bearing in mind that film and TV acting is generally quite subtle, and even cartoons rarely have characters pulling faces all of the time – so taking a ‘less is more’ approach can be helpful. And now that the shortest acting lesson in history is over – hit record and press the keyboard space bar…
Step 11 – Multipass record and post cleanup
iClone facial mocap is not just a single-pass, ‘one shot’ deal. Feature masking tools can toggle features on and off – use these, with ‘blend data on next recording’ checked or unchecked, to blend or overwrite the animation for particular features. You can also blend in passes of Face Puppet if you wish. Once recording is done, use Face Key to clean up and polish, using the Muscle and Modify panels to adjust any part of the face. The Modify panel gives access to all available morph and bone values, and you can also use negative values here if you wish.
Step 12 – Pipeline to apps and engines
iClone is a pipeline tool. Whether you’re working with custom or native characters, they can all be exported along with animation to other 3d applications and game engines. First use the ‘collect clip’ track on the character’s timeline to export the whole animation as an *.iMotionPlus file. Send the character to 3DXchange, then import the animation to the Motion Library and click ‘Add to Perform’. Next, export as *.fbx with the appropriate settings for your target application – be it Max or Maya, Unity or Unreal – or any 3d application which supports *.fbx including morph and bone animation.
Character Creator 1.5 Tutorial – Exporting Talking Animations to Maya via Character Creator
iClone Animation to Unity3D Part Five: Export Character and Animation
Character Creator to Unreal Part 4: Export & Material Settings
The Author – Mike Sherwood (aka 3Dtest)
Mike has over 20 years in CGI, specialising in real-time character modelling, animation and tools R&D, with clients including broadcast, games and 3d software developers.