There is a lot of hype about the performance and speed of creation with today’s digital filmmaking tools, and we continually promote iClone with an emphasis on how much time you can save and how much more productive you can be with a real-time engine. However, it’s really more than that. It’s more than just real-time; it’s about how well the tools within the real-time engine can make you a better animator and storyteller. So, yes… real-time is paramount, but real tools with real innovation and real power are what really make the difference. This is my experience with the facial puppeteering inside iClone4. I’ve seen our engineers knock out some amazing things, but this lone feature in iClone4 is astonishing.
So to explore this new feature I grabbed my paperback of Charles Bukowski’s ‘The Last Night of the Earth Poems’ and searched for a short monologue. I have many bookmarks placed in this book for whenever the animation muse may hit, but this time I was looking for something specific. I found it: ‘Splashing,’ right there on page 238 with a crisp corner I’d not yet bent.
I had already created the character, which took just a few minutes. I spent most of that time on the head deformation sliders, changing the shape of the face, and on the skin texture, tuning the specular settings to give the skin a moist, greasy sheen. The specular settings on the skin, combined with the normal bump map, can be seen when the actor turns his head: the pores and wrinkles in the skin are revealed as the light changes across the moving face.
The voice recording was done inside iClone. I did not use any external audio editor to capture or edit the voice; I cut it raw, in one take, and went straight to animating. I think where you record and what you record with are the keys. I am using a Blue Snowflake USB mic that sells for around $60 USD. When I first watched the lip-sync automatically generated by iClone I was impressed. Then I started working with the facial animation and puppeteering tools. First, I assigned an ‘Angry’ emotive to the overall timeline as a general mood for my character. That gave me a good base for how my actor would emote. Next came my first production experience with the facial puppeteering panel in iClone. Whoa! Again, these engineers are insane.
With my project playhead set at the beginning, I used the preview function on the puppeteering panel to audition a few facial puppet actions. You can select from a list of facial emotions, or you can select your own group of muscles or individual facial features to affect during the puppeteering session. Once you have the combination selected, you hit ‘Record’ and watch as the playhead advances and plays your project in real-time; while this happens, you use your mouse to puppet the face along with your audio. All your puppeteering is recorded in real-time as a motion clip for each session. For a CG animator, this kind of speed mixed with total custom control is amazingly valuable. It’s not presets and looped movements… it’s really you pulling the strings; the artist is in control. This is also where any knowledge of non-verbal facial performance, general acting technique and a good mirror can come in very handy. The fundamentals will always be part of what makes us better, no matter the tool.
For the Bukowski poem I made two passes with this technique. First I focused on the jaw and brows, and next on some head turning and overall grimacing. I previewed my results instantly and saw what would previously have taken me a very long time completed in just a few minutes. Of course, that was just the first draft. I tried a few more times and saved each result as a CTS file so I could choose the best of a few attempts. The only thing that slowed me down here was the fact that it was fun, and I was likely doing a few more takes than normal because I was playing. Finally, I went back to a few spots and did some editing with the Face Key window. This tool is just like the puppeteering panel, but it is meant for single-frame editing of the face pose. What’s cool is that it can act as an overriding layer that blends with the action already recorded to your face. In short, you can spot-edit the face pose… anywhere.
I threw in a camera and used some camera ‘look-at’ commands to get the actor looking at the camera and away from it at specific points of the monologue. The actor’s motions are a blend of key-pose animation (the ‘jesus christ’ arm swing) and mocap clips (the other hand gestures), broken up and aligned with easing to match the movement of my actor.
In the end I decided to keep it simple, with only lighting on a black soundstage. I decided to forgo the set, secondary characters, weather, etc. I wanted this one to be all about the actor, the face and the ‘real’ time it took to make it… This was how I spent two hours in iClone.