Winning Tips & Tricks: The Making of Jason Taylor’s “Animal Farm” Showcase

The “Winning Tips & Tricks” series covers practical workflows and techniques shared by winners of the “2021 iClone Lip Sync Animation Contest”. To show the full spectrum of the iClone facial animation pipeline, we are introducing projects rendered with different 3D engines, including Unreal, Unity, iClone, Blender, and Cinema 4D. Let’s take a look at Jason Taylor’s “Animal Farm”, which combines a Character Creator (CC3) and iClone workflow with a Unity render.

Jason Taylor

My name is Jason Taylor. I’m a filmmaker and multimedia artist, running my own digital media studio, Jumping Guy Productions in Ontario, Canada.

I’m the creator and producer of a puppet-based video series for kids, which I created in 2016 using real-life puppets placed in a 3D animated world via Unity. Since then, I have been developing professional 3D character animation skills to diversify my skill set and tell stories that I couldn’t with puppets. Reallusion’s Character Creator and iClone have been real game-changers for me in this regard.

Why I created this showcase 

Working part-time over a few evenings or so, I was able to bring a scene from Animal Farm to life. The story needed a variety of characters, from talking pigs and a talking horse to a number of rough-looking farmers. With relatively little effort, I was able to create eleven original AAA characters for my video in a matter of hours using Character Creator. From there, once I exported the characters to iClone, I was able to easily add facial expressions, animate the head, and apply high-quality lip-syncing before exporting them to Unity for composition and the final render.

How I do it with iClone

Step 1: Building Five Animal Characters

To create the characters, I started with CC3-compatible base characters from the Reallusion Marketplace and content stores. Rather than reinventing the wheel, I was able to take a simple, clean-looking pig character, add facial hair, eyebrows, and clothing, and morph the facial features to create the human-like pigs that the story called for.

Step 2: Building Five Human Characters

The process for creating the human characters was similar to the pigs. I started with simple base characters close to what I had in mind for the story, then morphed their faces and added facial hair, unique hairstyles, and the proper clothing. For one of the characters, a farmer, I was even able to add dirt to his face, as if he had just come in from working in the field.

Step 3: Morphing Horse Character from Deer

Blender helped with pretty much everything else: using simple shapes, I was able to sculpt jewelry, pants, and even a braid-looking thing. As for the creation of the horse character, this one stumped me at first, as I couldn’t find the style of horse I was looking for in any of the Reallusion stores. I had a specific style in mind, and my horse also needed to be compatible with CC3’s facial morph system so that I could animate the mouth using AccuLips. My solution was to purchase a compatible deer model that came with an isolated head model, which I was able to stretch, morph, and shape into the horse base that I had in mind. I then added custom textures and materials that I put together in Photoshop from an actual image of a horse. For the final touch, I added some stylish hair purchased from the Marketplace.

Step 4: Lip-sync with AccuLips

The facial emotions and lip-syncing abilities, paired with live facial mocap from LIVE FACE, the iPhone mocap plugin for iClone 7, have really blown me away and have allowed me to take my animations to the next level, imbuing my characters with personality and emotion. As a general rule, I start with lip-syncing and add emotions afterward. To create the lip-syncing, I simply import the audio of my character speaking along with the matching text, and iClone’s AccuLips gives me a solid starting point that I can tweak to my liking: more or less jaw motion, more or less lip motion, and the ability to add, remove, or move details. There are also presets, such as “Singing” or “Whispering”, that I can apply as-is or tweak to my heart’s desire.
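To make the idea of scaling AccuLips output more concrete, here is a small, purely illustrative Python sketch. The keyframe structure and the scale_viseme_track function are hypothetical stand-ins of my own; they are not part of iClone’s actual API, which exposes these jaw and lip controls through the AccuLips UI.

```python
# Hypothetical sketch: scaling the jaw/lip strength of a lip-sync track.
# iClone exposes these controls in the AccuLips UI; this only illustrates the idea.

def scale_viseme_track(track, jaw_strength=1.0, lip_strength=1.0):
    """Return a copy of the track with jaw and lip motion scaled up or down."""
    scaled = []
    for key in track:
        scaled.append({
            "time": key["time"],
            "jaw_open": min(1.0, key["jaw_open"] * jaw_strength),
            "lip_shape": min(1.0, key["lip_shape"] * lip_strength),
        })
    return scaled

# Example: keep the lips as-is but exaggerate the jaw motion by 20%.
acculips_keys = [
    {"time": 0.00, "jaw_open": 0.10, "lip_shape": 0.30},
    {"time": 0.08, "jaw_open": 0.45, "lip_shape": 0.60},
    {"time": 0.16, "jaw_open": 0.25, "lip_shape": 0.40},
]
tweaked = scale_viseme_track(acculips_keys, jaw_strength=1.2, lip_strength=1.0)
```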

Step 5: Mocap and Facial Expression

The scenes I picked to recreate in 3D had little to no character animation. Nevertheless, I took advantage of the Motion Puppet tool to create some simple movements and avoid having the character stand in place like a statue the whole time. In the first sequence, for example, I positioned his hands above his head. Once I am committed to the finalized result of the lip-sync, I add emotions using iClone’s library of emotions, e.g., “Happy”, “Sad”, or “Angry”. I can keyframe these emotions in, blend them from one to the other, and adjust the intensity, in terms of how happy or how sad, to bring my character and story to life.

On top of my keyframed emotions and the lip-syncing from AccuLips, I like to add facial mocap data to the mix to give things a more human-like, realistic look and feel. I keep it subtle, though. Using my iPhone connected to LIVE FACE in iClone, I can add a layer of head rotations and another layer of facial expressions from the mocap data, blended together with the original animation.
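Conceptually, this is a weighted layering of animation values: the lip-sync forms the base, the keyframed emotion sits on top, and the mocap detail is mixed in last at a low weight. The following Python sketch is only a hypothetical illustration of that idea; the function and the weights are my own and do not reflect how iClone blends its tracks internally.

```python
# Hypothetical illustration of layered facial animation:
# base lip-sync + keyframed emotion + a deliberately subtle mocap layer.
# Each value stands for a single blendshape weight in the range 0..1.

def blend_facial_layers(lipsync, emotion, mocap, mocap_weight=0.2):
    """Combine the layers, keeping the mocap contribution small."""
    value = lipsync + emotion + mocap * mocap_weight
    return max(0.0, min(1.0, value))  # clamp to a valid blendshape weight

# Example frame: mostly lip-sync, a "Happy" emotion layer, faint mocap detail.
frame_value = blend_facial_layers(lipsync=0.55, emotion=0.25, mocap=0.40)
print(frame_value)  # 0.88
```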

Step 6: Animate the Head

For this project, in addition to the head movements from the mocap data, I also animated some of the larger head movements using the “Look at” option in iClone. This allowed me to give the character something to look at directly, which in the end made for a more believable performance. To do this, I animated a cube to the various points I wanted my character to look at. I then tweaked and keyframed how much the character’s eyes tracked the cube on their own, without the neck and head moving, versus how much he turned his whole head to look at the object.
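That eye-versus-head balance can be thought of as splitting the total angle toward the target between the eyes and the head. The sketch below is a simplified, hypothetical 2D illustration of the idea in Python, not how iClone’s “Look at” feature is actually implemented.

```python
import math

# Simplified, hypothetical sketch: split the yaw toward a look-at target
# between the head and the eyes, controlled by a single head_weight value.

def split_look_at(character_pos, target_pos, head_weight=0.3):
    """Return (head_yaw, eye_yaw) in degrees for a target in the XZ plane."""
    dx = target_pos[0] - character_pos[0]
    dz = target_pos[1] - character_pos[1]
    total_yaw = math.degrees(math.atan2(dx, dz))  # full angle toward the target
    head_yaw = total_yaw * head_weight            # portion covered by the head
    eye_yaw = total_yaw - head_yaw                # the eyes cover the rest
    return head_yaw, eye_yaw

# Example: the "cube" target sits off to the character's front-right.
head, eyes = split_look_at(character_pos=(0.0, 0.0), target_pos=(1.0, 2.0))
```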

Final Step: Tips to do Composition in Unity

And finally, it was time to add all the cool effects, such as smoke, lighting, magical butterflies, and of course, floating dust particles, using effects I had already bought as a Unity user. I was happy with how great the characters looked once I exported them from CC3 and imported them into Unity. I used the Unity / CC3 Auto Setup add-on, which gave me good results on the materials in Unity’s HDRP shader mode. After a bit of custom tweaking to the materials and hair, I was off to the races. If something came up in Unity with one of the characters that I wasn’t happy with as my scene was building out, e.g., a character’s nose, head shape, eyes, or arms, it was no trouble to re-export the character from CC3 and re-import it into Unity without losing references to the 3D character animations. The same went for the facial animation and lip-syncing: once I saw things come to life in Unity, I could go back to iClone, adjust the timing, expressions, and lip-syncing accordingly, and re-import everything back into Unity seamlessly.

Final Thoughts and Conclusions

With 11 different characters in a 30-second clip, this process could have taken me weeks or months, and it’s not a stretch to think that this would be the kind of endeavor a studio team would take on. Instead, I was able to do it on my own in a very timely manner, working part-time in between my main project. Overall, I’m glad I gave this project a shot, and I’m happy it was even possible thanks to Reallusion’s professional tools. I’m excited to see what comes next from Reallusion!
