
Professional Actors Stress Test AccuFACE Facial Mocap

To showcase the role of AccuFACE, the new iClone facial mocap profile, within a real production environment and stress test its capabilities before release, Reallusion invited seasoned performers Lucas Gilbertson and Carol-Anne Day to rigorously explore the AccuFACE facial mocap technology.

Both Lucas and Carol-Anne boast over two decades of experience as professional voice actors. Lucas has made his mark on renowned franchises like the Mega Man series and Sonic the Hedgehog. Carol-Anne Day is recognized for her extensive voice acting for anime and video games produced by Viz Media, Bandai Entertainment, Capcom, Geneon, and Bandai-Namco. Since 2020, they have ventured into game and animation creation, utilizing iClone, Character Creator, and Unreal Engine. Their first-hand experiences have helped them foster leading expertise in the field of facial capture technology.

In a bid to unravel the full potential of AccuFACE, they set about portraying cartoon characters, orc monsters, and lifelike digital humans. Their experiments involve testing scripts of varying inflections — ranging from subtle to intense — to gauge AccuFACE’s ability to deliver dramatic, natural, and nuanced performances.

Here is the timeline for a quick overview of the AccuFACE test:
00:00 Project background
00:53 How we plan to test it (with beard and nose ring)
01:16 Quick test without calibration
01:52 After calibration
02:00 Impersonate a vicious orc or cartoon character
02:29 Combining facial editing for stylized details
03:00 Tongue twister and dialog test
04:00 Comparing to iPhone facial tracking
04:42 Individual brow tracking with AccuFACE
05:22 Proper lighting requirements for AccuFACE
05:36 An extreme performance test (with good eyeball and nose tracking)

Learn more about AccuFACE-related production:
Try AccuFACE facial mocap free for 30 days
Easily produce professional animations with iClone
Automatically generate lip-sync animation with AccuLips

Animated 3D Events for the Metaverse, Easier than ever

ActorCore & iClone integrated with EEVEE real-time in Blender

In the burgeoning realm of metaverses and the emergence of virtual events on these platforms, we find ourselves delving into immersive experiences, surrounded by avatars who are interconnected through virtual reality. However, the key to designing successful events lies in the integration of genuine human connections within these virtual spaces, necessitating a sturdy and lifelike framework to deliver impactful virtual content. This is precisely where ActorCore comes into play—an indispensable resource for procuring top-tier 3D assets, including lifelike human models, animations, motion capture capabilities, and real-time rendering, all conveniently accessible via the web.

It’s the boundless prospects offered by this remarkable tool that have captured my attention. Coupled with iClone 8, it provides unparalleled flexibility and simplicity in seamlessly incorporating crowd animations and staging for the web metaverse.

In the following sections, I will share my endeavors in Blender, where I’ve worked on integrating realistic individuals into a 3D environment simulating a party, achieving exceptional levels of visualization and detail.

So, let’s embark on this journey and delve into the process!

About the Artist

I am José Ripalda, a video game designer and metaverse artist, serving as the Co-Director of Marco Virtual MX. Our dedicated creative team focuses on delivering top-notch experiences within virtual events and the metaverse. Leveraging Blender as my primary tool, I craft diverse thematic environments tailored for web performance.

At Marco Virtual MX, our business model revolves around providing premium digital services for advertising purposes. Among our offerings are comprehensive 3D environment designs for the metaverse, video games, and various visual media. Moreover, our specialization lies in seamlessly integrating 3D content into the web using WebGL. We also excel in developing bespoke interactive systems for clients seeking personalized digital spaces. Furthermore, our expertise extends to crafting digital replicas of individuals, achieved through scanning or meticulous modeling. These services collectively define our role as a creative team dedicated to harnessing 3D technology.

Behind the Scenes

This tutorial will cover the entire process, commencing with ActorCore. It will elucidate the diverse possibilities of visualizing actors or 3D assets directly on the web, showcasing their animations and accessories. Subsequently, we’ll proceed to acquire these assets, integrating them seamlessly into our library within iClone 8. This integration marks the pivotal step in assembling multiple actors within a virtual space, which could be perceived as a stage or a setting where the events will unfold.

Throughout our exploration of iClone 8, we’ll delve into various tools, including Motion Layer, a comprehensive system designed to facilitate seamless actor control. Additionally, we’ll discover the functionality of the Timeline, which enables the amalgamation of motions, along with other remarkable features tailored to animate with meticulous attention to detail.

Moreover, as part of our supplementary learning, we’ll delve into the CC Auto Setup for Blender. This tool significantly streamlines the process of importing animated actors from iClone to Blender, enhancing the efficiency of our workflow.

ActorCore Animated 3D Characters

My aim in orchestrating virtual events was to source lifelike performances from ActorCore’s expansive repertoire. First, I sought actors dancing skillfully or engaged in lively conversation, followed by a DJ responsible for overseeing the DJ booth. Lastly, I enlisted waitstaff to curate a personalized and intimate ambiance for the event.

Following this introduction, a comprehensive breakdown of the ActorCore interface showcases the array of options available for discovering and previewing actors. This interface allows users to manipulate various visual elements, including lighting and selected animations.

Through this web interface, we have the capability to designate selected actors as “FAVORITES” for the party. Leveraging ActorCore’s extensive library, I effortlessly located all necessary components characterized by their realistic appearance, PBR shading, and readiness for animation through the Motion Animation library. This vast selection ensured that everything I required was readily available.

After preparing our selected actors, the next step involves downloading them for integration with Character Creator or iClone. This seamless integration capability enables us to effortlessly incorporate these actors into various Reallusion systems, game engines, and 3D applications—specifically, Blender.

An important aspect to emphasize here is the actors’ optimization for seamless integration into the metaverse. Each actor is tailored with an intermediate polygon count ranging between 15K-20K, flexibly accommodating events hosting multiple actors without significantly impacting performance. Furthermore, iClone 8 offers a streamlined method to import these actors directly from ActorCore. Its robust and efficient system ensures a swift and solid process, requiring just a few clicks for acquisition and integration.

iClone Advanced Motion Editing

Incorporating animations is as effortless as a simple drag-and-drop action onto the desired avatar, offering a visual interface that expedites the process without the need for extensive, time-consuming steps. This intuitive system not only enables users to seamlessly adjust the position and orientation of the actor but also ensures steady foot-and-floor contact. Initiating the animation setup with an established position facilitates better control, allowing users to define precisely where the actor will commence the animation sequence from the outset.

After grasping the process of integrating actors into iClone 8, the subsequent crucial step for our event is to incorporate the apartment stage. I highly suggest utilizing FBX files (including textures) to ensure seamless importation into iClone 8 without encountering any issues.

Once the stage is set, the process becomes straightforward—simply select and drag our actors to designated areas, such as the DJ section, which is prepped with mixing equipment. This setup allows us to effortlessly test out the Motion Layer tool for smooth execution.

The Motion Layer represents an innovative system seamlessly integrated into iClone 8, offering meticulous control over every aspect of our actor’s performance. This includes configuring individual finger movements, manipulating hand gestures such as opening and closing, editing various body parts, transitioning between IK and FK seamlessly, and accessing a comprehensive panel for selecting specialized gizmos.

In addition to working with the DJ, I managed to implement specific modifications for two guests engaging in a handshake. Motion Layer makes it easy to adjust hand positions. Moreover, within this tool, we explore a practical method enabling actors to maintain eye contact while conversing. Under the ‘LOOK AT’ menu, options like Look at Camera or Pick Target can significantly enhance the realism of our scenes by facilitating natural gaze and interaction between characters.

Furthermore, beyond the guest actors, it’s important to integrate featured actors. In a recent scenario where I required waiters, I found ActorCore particularly useful. Specifically, the “CATERING SERVICE STAFF” set offers a diverse range of actor types ideally suited to serve as waitstaff for our virtual party.

Once all our actors are prepared and animated using the tools showcased in the video, a crucial step involves transferring the actors and animations to Blender (the program chosen for creating this virtual scene). In iClone 8, I opted to export all avatars in FBX format, selecting the “BLENDER” preset and configuring a comprehensive array of animation and other parameters as depicted in the following image.

Stunning Rendering Results in Blender

In under 5 minutes, I efficiently grouped each avatar, seamlessly integrated textures, animations, and more. While this process is undeniably efficient, I’d like to delve into some aesthetic nuances I implemented during the creation of the apartment in Blender.

Blender offers unparalleled flexibility and practicality in crafting virtual environments, allowing for effortless texturing, lighting manipulation, and the creation of an extensive camera system with easy control. In my case, aiming for a realistic yet swiftly generated rendering, I utilized techniques like “Irradiance Volume” and “Reflection Cubemap,” both highly effective within EEVEE. These, combined with effects such as bloom and ambient occlusion, strike a fine balance between achieving good quality visuals and optimizing the overall realism of the scene.

Once our actors are primed, the Blender add-on called CC Auto Setup for Blender comes into play. This add-on allows for direct selection of the FBX files exported from iClone 8, simplifying the import process into Blender. Once imported, each avatar will be neatly separated with its corresponding ARMATURE and integrated animation. This practical feature significantly hastens various processes involved in animating the party scene.

The camera system integrated within Blender offers remarkable flexibility in capturing stunning shots that vividly depict the virtual event, providing a clear preview of its appearance within the metaverse. Regarding the equipment used for this work, it comprised a single workstation equipped with an i9 12900KF processor and an RTX 3080 graphics card. I’m delighted to have explored these fantastic tools offered by Reallusion, and I eagerly anticipate continuing to share and learn from experiences with such excellent and practical platforms. Thank you for joining in this exploration of innovative possibilities!

Conclusion

ActorCore and iClone complement each other perfectly, offering a seamless integration of animations and lifelike characters ideal for virtual events or VR metaverse settings. The addition of CC Auto Setup for Blender further enhances this workflow, swiftly importing renderable, animated characters into Blender, eliminating the complexities of additional material settings. This intuitive process doesn’t require specialized technical expertise, making it accessible for all. I wholeheartedly endorse this workflow for anyone looking to create immersive VR metaverse productions, as it simplifies the process while ensuring impressive results.

More info about Marco Virtual MX

WarLord’s Beginners Guide: Creating a Simple Scene with AI, iClone 8 & CC4 

In this installment of the Beginners Guide, I am going to demonstrate how easy it is to incorporate AI tools into your workflow so you can leverage both the timesaving aspect of AI and the creative tools of iClone and Character Creator 4 (CC4) to make a simple but effective scene. 

I’ll be focusing on iClone and CC4 while making use of the free-tier AI services relevant to this type of work.

Specifically, I’ll use: 

– Leonardo AI (free tier) to generate a headshot image.  

– Bing Create (free) for the Headshot 2 image. 

– ChatGPT 3 (free tier) for dialog and scene-setting ideas. 

– Eleven Labs (free tier) AI-powered voice generation for the character.  

Below is an image of what we will end up with: a pleasant and hopefully somewhat charming female host doing a podcast in a 3D cartoon bedroom environment. The headshot (the AI-generated image) and the voice are important for adding a quality like charm to a character. I picked Eleven Labs for its ability to make a voice emote instead of delivering a monotone or rhythmic recitation. Leonardo AI is an excellent image generator for my tastes. There are many of these AI tools to choose from.

VIDEO TUTORIALS:
I have created two short YouTube tutorials, found near the end of this article, that cover both character creation and scene creation, along with using AI-generated dialog. They also show how to enlarge the mesh for big cartoon-type eyes in Character Creator 4.

Below is the setup view with the background turned off. 

As you can see in the image above the scene is extremely simple when you leave out the AI-generated background image. The “desk” is the Door prop that comes with iClone and I didn’t bother with a chair either. It would be simple enough to add one if you desire to do so. You could also dress up the desktop a little more with other objects as I did in a later render with the headshot image on the desk.  

You could also put a plant prop in the foreground at the side of the desk to add more 3D depth to the scene as it goes into the 2D background. A proper combination of image in the background and props in the foreground or intermediate space can help to blend the 2D background image into the scene eliminating some of the contrast between the flat image and the 3D objects. 

CHATGPT 3 

The first thing I did was go over to ChatGPT 3 at chat.openai.com and ask it to write a dialog for “a fan talking to the camera about how much she loves Taylor Swift”. I received the following response:

 
I didn’t follow the script at all in terms of direction because that wasn’t the creative direction I was looking for. I wanted to keep this a simple scene with a simple character, so I used the Podcast concept from the bedroom. The dialog was all I was interested in.  

ELEVEN LABS – VOICE GENERATION 

For this, I went over to ElevenLabs.io and used the free tier (requires signup) to create the dialog. Like most tasks of this nature, it took longer to decide on a voice from the voice library than it did to copy and paste the text into the dialog box, process, and download the audio file. 

At first, I did a sentence or two of dialog with the voice I had selected to find out how it interprets the text and how pauses and other formatting worked. It was much easier than expected.  Commas and the “…” created different timing and pauses in the audio. Using two blank lines between paragraphs seemed to work to keep the paragraphs from running together when spoken. 

Now, of course, this is where I pushed the envelope too far in terms of how much of the script dialog I sent into the speech generator at a time. After a couple of small tests with just a line or two, I proceeded to dump all of the remaining text into the voice generator.

It worked, but it had some pronunciation errors that would require correcting and running through the AI again. This burns up credits in an extremely wasteful manner, as you get 10,000 characters on the free tier. Not words, characters. If I had done it a sentence or two at a time, I wouldn’t have burned so many credits.
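To avoid that waste, the script can be fed to the voice generator a sentence or two at a time. Here is a rough Python sketch of that idea; the function name and the naive sentence splitting are my own, and the 10,000-character figure is the free-tier budget mentioned above:

```python
import re

FREE_TIER_BUDGET = 10_000  # characters, per the Eleven Labs free tier

def split_into_chunks(script: str, max_chars: int = 500) -> list[str]:
    """Split a script into sentence-sized chunks so each TTS request stays small.

    A naive sentence split on ., !, and ? -- good enough for auditioning a
    line or two before committing credits to a full paragraph.
    """
    sentences = re.split(r"(?<=[.!?])\s+", script.strip())
    chunks, current = [], ""
    for sentence in sentences:
        if current and len(current) + 1 + len(sentence) > max_chars:
            chunks.append(current)
            current = sentence
        else:
            current = f"{current} {sentence}".strip()
    if current:
        chunks.append(current)
    return chunks

script = ("Welcome back to the podcast! Today we talk about my favorite "
          "artist. I can hardly wait. Let's get started.")
for chunk in split_into_chunks(script, max_chars=60):
    print(len(chunk), chunk)
```

Generating one chunk, listening to it, and only then sending the next means a mispronunciation costs you a sentence’s worth of credits rather than the whole script’s.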

In this case, it didn’t matter as I wasn’t going to use it all anyway in the final render.  I just needed enough to do a good demonstration. 

Getting the dialog from Eleven Labs into iClone was easy. It was only a matter of minutes to copy, paste, process, download, import into iClone, and have the entire script’s audio lip-synced and loaded into the character. I also copied the dialog to the clipboard so I could paste it into AccuLips to go along with the audio file for near-perfect lip-syncing with no intervention on my part. I’m not doing anything advanced, and I think you’ll be surprised at the final product.

TIP: When generating audio from text, think about how the text will be read aloud. If you want a character to say iClone, write it out as “eye Clone”. “AI” is another stumper for generated speech: if you leave it as AI, you’ll probably get an “aye”, so write it out as “A I” with a space in the middle. You may have to spell some words phonetically to get the desired results.
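A tiny helper can apply these respellings automatically before text is sent to the voice generator. This is just an illustrative sketch using the tip’s own examples; the helper name and table are hypothetical, not part of any Eleven Labs API:

```python
import re

# Phonetic respellings from the tip above; extend as you find more stumpers.
PHONETIC_FIXES = {
    "iClone": "eye Clone",
    "AI": "A I",
}

def respell_for_tts(text: str) -> str:
    """Replace words the speech generator mispronounces with phonetic spellings."""
    for word, spoken in PHONETIC_FIXES.items():
        # Whole-word match only, so "AI" inside a longer word is left alone.
        text = re.sub(rf"\b{re.escape(word)}\b", spoken, text)
    return text

print(respell_for_tts("I built this AI demo in iClone."))
# → I built this A I demo in eye Clone.
```

Keeping the fixes in one table means the on-screen script stays readable while the text you paste into the generator is the phonetic version.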

LEONARDO AI 

I couldn’t get the face image I wanted out of Leonardo, but it was a snap getting the background image of the bedroom. Feel free to use the image in your own project, but note that it is kept at low resolution for the purposes of this article so as not to take up too many resources.

BING CREATE 

I used Bing Create to generate the character head image to use with Headshot 2 (or Headshot 1). I tried Leonardo and got some great-looking images but few that faced forward like I needed. So, I cruised over to Bing Create and used the prompt: 

Portrait of a bald, Pacific Islander cartoon girl with no hair, facing the camera, white background 

I got several good images from Bing and was actually very pleased with the outcome. I had already spent a frustrating time with Leonardo without the desired results so when Bing Create immediately started returning usable results, I was surprised, to say the least. 

Bing Create generated the head image. In Windows, right-click and choose “Save Image As” to download it.

It didn’t take me long to choose a good headshot image and I was off to CC4 to create the character. I already knew there was going to be a problem with the large eyes in the image, but I decided to “brute force” my way through that by pushing the eye and eyeball sliders way past the intended maximum. I love this feature about the slider. If you go to the end and it doesn’t do enough… just type in a number in the input box and press enter. If 100 or –100 isn’t enough, then type in 200 or –200.  

TIP: The input boxes do not stop at the end of the slider. You can manually enter 200 (or any number) in the input box even if the slider only goes to 100. This is a very powerful feature that pushes intended use past its boundaries, and experienced users know how helpful it can be.

CHARACTER CREATOR 4 

The AI work was very enjoyable, and I looked forward to putting it all together but I still had to make the character and it needed some personality. The Bing Create image certainly helped in that department. 

I needed a stylized female body to further enhance the cartoon effect. You could load the free Alika character that comes with iClone and CC4, but I chose to load a neutral base and then use the Alika slider in the slider section to morph the body to a more cartoonish shape. I also used the Alika head morph. 

HEADSHOT 2 

Next up was jumping over to Headshot 2 to wrap the head onto the character. Since I’m talking basics here, I will only use the Image and Auto features and a little bit of sculpting around the eyes. Since I used the Auto feature, it also created some hair, which I discarded so I could use the Alika cartoonish hair instead.

Now we see some freaky things happening like the strange-shaped head (compared to the image) and really weird eyes because of the huge eye size on the AI-generated image compared to the mesh. Like I said though, we are going to use brute force to enlarge the mesh eyes and eyeball to cover up the eyes on the Headshot image. 

TIP: If the remnants of the eye colors around the edges and the bottom of the eye bother you, take the diffuse image into something like Photoshop and clone over the lower two-thirds of the painted eye, leaving only the top, or clone over all of it if you want a clean look.

I then dressed the character in a top, jeans, and tennis shoes, changing the original colors to a shade of pink where appropriate.

CLOSING

As I showed you in the first and second images, the scene itself is very simple. Use a prop for a desktop, then add a microphone or whatever you want to place in the foreground. Add your AI-generated background image and the scene is set. All you need to do now is light the scene. Lighting is a personal preference, but I would keep it bright and airy for this type of scene.

Creating a Toon Girl Character in Character Creator 4

Creating the scene and animating the character, including using AI-generated dialog.

MD McCallum – WarLord

Digital Artist MD “Mike” McCallum, aka WarLord, is a longtime iClone user. Having authored free tutorials for iClone in its early years and selected to write the iClone Beginners Guide from Packt Publishing in 2011, he was fortunate enough to meet and exchange tricks and tips with users from all over the world and loves to share this information with other users. He has authored hundreds of articles on iClone and digital art in general while reviewing some of the most popular software and hardware in the world. He has been published in many of the leading 3D online and print magazines while staying true to his biggest passion, 3D animation. For more information click here

Anyone can Create Professional Animations with Cartoon Animator 5.2 Motion Pilot

Veteran Cartoon Animator user and developer Garry Pye has been working with Cartoon Animator (CTA) for over a decade and has made the journey from casual animator to the role of 2D Community Manager for Reallusion. Now working alongside his 15-year-old daughter Maddison, they sat down together to showcase just how easy the new Cartoon Animator 5.2 Motion Pilot makes 2D animation.

With a layered scene set up in Stage Mode, Garry and Maddison demonstrate the tools and features of Motion Pilot and how anyone can now produce smooth, professional results in real time, in minutes.

Cartoon Animator has long been the easiest and fastest way to create 2D animation. With Cartoon Animator you can produce quality projects in no time, regardless of your skill level in animation. Cartoon Animator makes animating easy for everyone, and today we’re going to show you how.

With the latest update to Cartoon Animator 5.2, we see the introduction of Motion Pilot, which gives you total control over the movement of your actors and props, controlled in real time using mouse movements, with variable parameters set by you. 

Motion Pilot gets rid of the need for hours of keyframing your animation, because now you can control your characters like puppets using your mouse, while you record the motion. Motion Pilot is more like acting than animating.  It makes everything easier and faster, but more importantly it makes animating so much fun for people of all ages and skill levels. Without any previous knowledge or experience of 2D animation, anyone can use Motion Pilot to create smooth results in minutes. 

In the screenshot above, we have a scene set up in Stage Mode. This scene was brought in as a Photoshop PSD file, maintaining its layers so we can move our actors around inside the scene, both in front of and behind objects, and then move our camera about to create a sense of depth in our scene.

You can instantly switch to 3D view to see where all your characters and prop elements sit on the stage, as shown below.

To introduce an actor to your scene, simply drag and drop them from your Content Manager. By adding any one of the hundreds of premade motions to your actor, you can have your scuba diver swim instantly, without the need to keyframe his movement or draw individual frames. Cartoon Animator‘s premade motions do all the hard work and give perfect results every time.

With Motion Pilot, you can now make the scuba diver automatically swim as you pilot him around the screen using your mouse. In Motion Pilot, there is a section called Motion. Check the Blend Motion box, then drag and drop your swim motion onto the text box. If you only want him to swim forwards, scroll up to Face Cursor, and turn that off so he doesn’t flip direction as he swims.

Position your scuba diver where he will begin in your animation, and before recording you can select Preview to rehearse your motion. Once you click Preview, move your mouse over the scuba diver, hit the space bar, and watch him come to life and swim, following your mouse movement in real time.

You can change the speed at which an actor moves using the Speed slider. 

Once you have rehearsed your animation, do the same again, only this time hit Record to capture all of your character’s animation in real time. Moving your character around feels more like playing than animating, as you become the actor and put your performance into their motions.

From frame zero, hit Play on the timeline and watch your new animation play back. What would have taken hours of tedious keyframing and animation skill has been completed in mere seconds thanks to Motion Pilot, producing the smoothest result possible.

Even after you have completed many frames of animation, Cartoon Animator allows you to rewind through your project and add new animation or even alter the animation you have created so far. 

The octopus actor in our project has Spring Bones built into his body rig, which you can apply yourself to any CTA actor. Spring Bones allow character parts like limbs, hair and accessories to wobble and bounce about in real time automatically, with no need for keyframing from the animator. Spring Bones dramatically reduce the amount of time it takes to produce 2D animation and at the same time provide realistic and smooth motion results.

Instead of adding a motion to the octopus like we did with the scuba diver, this time we were able to be the puppeteer and move him about while recording his movements in real time. After recording the motion of the octopus, we could play back our animation to see our octopus and scuba diver swimming together.

Animation has never been so easy or fun! You can control any character or prop using Motion Pilot, moving it about like a puppet and recording the movement to create an animation in real time.

We were able to pilot the fish in our scene using the same real-time puppetry applied to the octopus. In addition, using a Motion Pilot feature called Flock Settings, we could make duplicates of the fish and have them follow the leader. Flock Settings give the option to select multiple items as the children, following the parent with preset time delays and in various patterns. This is the ideal way to create animations of groups like swarming insects or random swirling leaves blowing about in the wind. Again, the type of animation that would otherwise require a high degree of skill and a large amount of time to execute can be done in seconds.
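Conceptually, the follow-the-leader behavior that Flock Settings automates can be pictured as each child replaying the leader’s recorded path a fixed number of frames late. A minimal Python sketch of that idea (function and parameter names are mine, not Cartoon Animator’s):

```python
def flock_positions(leader_path, num_children, frame_delay):
    """For each frame, return the leader's position plus its trailing children.

    Each child replays the leader's recorded path late: child i lags by
    i * frame_delay frames, and before its delay elapses it simply holds
    the leader's starting position.
    """
    frames = []
    for t in range(len(leader_path)):
        positions = [leader_path[t]]  # the leader itself
        for i in range(1, num_children + 1):
            lagged = max(0, t - i * frame_delay)
            positions.append(leader_path[lagged])
        frames.append(positions)
    return frames

# Leader swims right one unit per frame; two children follow 2 frames apart.
path = [(x, 0.0) for x in range(6)]
print(flock_positions(path, num_children=2, frame_delay=2)[5])
# → [(5, 0.0), (3, 0.0), (1, 0.0)]
```

Varying the per-child delay or offsetting each child’s path sideways would give the “various patterns” the feature offers; the core idea is just delayed playback.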

Even though Motion Pilot makes animation fast, fun and easy, it is still an incredibly powerful tool that gives you total control over your animation. For instance, we saw that Maddison and I could animate the Octopus in real time just by moving the mouse around. But even after the motion is recorded, we can still go back and add more character animation to our actors. 

Using Cartoon Animator‘s 360 Head function, we were able to make the octopus turn its head in the direction it’s swimming. We could keyframe the rotation of the actor’s head and even add facial animation like blinking and lip-syncing to voiceover tracks.

Spring Bones were easily added to the flat PNG fish props to make them move in a more dynamic way, as shown in the image below. Then, using the Wave function of Motion Pilot, we were able to have the fish move automatically, controlled by parameters we set. For instance, by adjusting the Y-axis of the Move Amplitude, the fish will now bob up and down in the water on its own, and changing the Loop Duration determines how fast or slow this motion is. You can even tell Cartoon Animator whether you want the fish’s movement to be constant or random.
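The up-and-down bobbing that Move Amplitude and Loop Duration produce is essentially a sine wave over time. A small Python sketch of that relationship, with parameter names loosely mirroring the UI (this is my reading of the behavior, not Reallusion’s implementation):

```python
import math

def wave_offset(frame, fps=30.0, move_amplitude_y=20.0, loop_duration_s=2.0):
    """Vertical offset for a bobbing prop: one full up-down cycle per loop.

    move_amplitude_y is the peak displacement; loop_duration_s controls how
    quickly the bob repeats (a shorter loop means faster bobbing).
    """
    t = frame / fps
    return move_amplitude_y * math.sin(2 * math.pi * t / loop_duration_s)

# With a 2-second loop at 30 fps, the fish peaks a quarter-loop in (frame 15).
print(wave_offset(15))  # → 20.0
```

Randomizing the amplitude or phase per fish would approximate the “random” option, so schooling props don’t all bob in lockstep.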

As well as puppeting your actors in real time, you can create a specific path that your actors can follow, so you have total control over their movement. After plotting a path with mouse clicks, you can link your actor to the path, which it will follow as the animation plays, and you can determine how long it takes for the actor to progress along that path. In the image below, the red line is the path that has been manually drawn for the fish to follow. 
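Path following of this kind boils down to interpolating along a polyline at constant speed, where the animation’s progress (0 at the start, 1 at the end) picks the point. A hedged Python sketch of that idea, with names of my own choosing:

```python
import math

def point_on_path(path, progress):
    """Position along a polyline at `progress` in [0, 1] of its total length.

    `path` is the list of points plotted by mouse clicks; progress 0 is the
    start, 1 the end, moving at constant speed along the line segments.
    """
    lengths = [math.dist(a, b) for a, b in zip(path, path[1:])]
    total = sum(lengths)
    target = max(0.0, min(1.0, progress)) * total  # distance to travel
    for (x0, y0), (x1, y1), seg in zip(path, path[1:], lengths):
        if target <= seg and seg > 0:
            f = target / seg  # fraction of the way along this segment
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
        target -= seg
    return path[-1]

# A right-angle route 20 units long: where is the actor at the halfway mark?
route = [(0, 0), (10, 0), (10, 10)]
print(point_on_path(route, 0.5))  # → (10.0, 0.0)
```

Setting the travel duration then just maps elapsed time to progress (progress = elapsed / duration) before each frame’s lookup.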

Your actors don’t have to just move side to side and up and down either. In Cartoon Animator you can build your scenes so they have camera depth. All of the elements in your scene are layered, from close to camera, to all the way off in the distance. This means you can have your actors move in front of and behind objects and control this during live recording with Motion Pilot by adjusting the Z-axis. We were even able to use the Scale function to make actors become larger as they move closer to camera, and smaller as they move further away. 

There are lots of different ways you can control your actors with Motion Pilot, and changing the parameters for how they move is fast and easy, giving instant results that you can preview before recording your final movie.

Finally, you can decide whether you want your actor to move in time with your mouse or, using the Lazy Mouse function, have the actor follow along at a slower, smoother pace.

You are not limited to just the actors in your scene being controlled by Motion Pilot. You can control other scene elements as well, to help bring your scene to life in a dynamic way. When we set up our scene and added the sponge as a static prop, we were able to add our own custom bones to the sponge to give it a simple skeleton. Then we turned some of those bones into Spring Bones, giving them flexibility. Using Motion Pilot, we were able to set the rotation on the prop so it gently moves side to side, as if affected by the water current, and it does this automatically as our scene plays. 

By making the Tween frames random, the motion looks more natural, and then we can set the Loop Duration to make the sponge gently sway about during our scene while the rest of the action takes place around it.

You can apply this style of automated motion to any item in your scene, meaning everything can move in its own unique way. We could even apply the wave motion to the water current prop at the top, creating the illusion of constantly moving water. After we applied Spring Bones and a simple Wave motion to this prop, it appeared to flex and wobble about, helping to bring our scene to life.

Getting the exact look you want is as easy as changing the values in the menu, for instant results. You can make them as realistic or as exaggerated as you want! 
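The randomized-tween idea generalizes beyond Cartoon Animator. As a rough illustration only (hypothetical Python, not CTA's actual implementation), a looping side-to-side sway can be generated by alternating rotation targets while jittering each tween's duration, so no two cycles take exactly the same time:

```python
import random

def sway_keyframes(amplitude_deg, loop_frames, cycles, jitter=0.3, seed=42):
    """Generate side-to-side rotation keyframes with randomized tween
    lengths, so each sway cycle takes a slightly different time and the
    looping motion reads as natural rather than mechanical."""
    rng = random.Random(seed)
    frame, keys = 0, [(0, 0.0)]  # (frame number, rotation in degrees)
    for _ in range(cycles):
        for target in (amplitude_deg, -amplitude_deg):
            # Randomize the tween duration around the base loop length.
            tween = max(1, round(loop_frames * (1 + rng.uniform(-jitter, jitter))))
            frame += tween
            keys.append((frame, target))
    # Return to rest so the clip can loop cleanly.
    frame += loop_frames
    keys.append((frame, 0.0))
    return keys
```

The same pattern applies to any parameter you want to automate: keep the target values fixed, randomize only the timing, and the repetition stops being obvious.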

Maddison and I had so much fun working together on this Motion Pilot project. Anyone at any animation skill level, whether advanced, amateur, or a complete beginner, can produce quality animation instantly with CTA5. You don’t have to do anything but have fun, because Cartoon Animator and Motion Pilot do all the work for you, making the process feel more like playing than animating.

Learn about Cartoon Animator:
https://www.reallusion.com/cartoon-animator/

Discover CTA Motion Pilot:
https://www.reallusion.com/cartoon-animator/puppet-animation.html

Follow Garry Pye:
https://www.facebook.com/garrypyeanimations/

Turn Your Manga into Animated 2D Characters with Cartoon Animator and After Effects

The Winner Tips & Interview series covers practical workflows and techniques shared by winners from the “2023 Animation At Work Contest”. To let users see the full spectrum of the Cartoon Animator pipeline, we are introducing projects that received attention and credit from the community. Let’s now take a look at “Valiant Seven” to see how Astrovique worked her magic with Reallusion Cartoon Animator (CTA).

About Astrovique

Greetings! My name is Astrovique. I’m a full-time illustrator and hobbyist animator. I often create fan art to exhibit and sell at conventions in Australia such as SMASH!, Oz Comic Con, and Supanova. On the sidelines, I also have a webcomic that I’m working on known as Valiant Seven and oftentimes find myself animating characters from my series. I’m also a massive memester and would just about animate anything that I find myself laughing at and sharing that laughter with others.

How I Discovered Cartoon Animator and why I used it

I stumbled upon this software through the Animation at Work competition. About a month after the competition had already started, I saw a sponsored ad on Instagram and thought, “Why not give it a shot?” I’m always eager to learn new software and see what I can create. I selected Cartoon Animator 5 as one of my production tools due to its free-form deformation tool and spring dynamics. I was intrigued by the idea of picking up a new program on the fly, and it turned out to be a user-friendly choice with educational resources readily available to guide me through the process. I particularly appreciated the character rig tool, which allowed me to easily manipulate my character’s limbs and keyframe them, while the applied spring dynamics responded to the movement. The deformation tool proved invaluable for creating quick and simple motions for my characters.

Why did I choose this topic?

I was drawn to the Comics and Art Animation category because I have an existing webtoon series that I wanted to explore further in various digital media, including illustrations and animation. This contest presented an opportunity to create a short animated trailer for my series, and I am satisfied with the result.

My Creation Process

Step 1 – Storyboard

To begin, I crafted a rough storyboard outlining the composition and timing of each scene in sync with the trailer’s music. Underneath each rough sketch were general notes on how it would be timed to the beat of the trailer music.

Step 2 – Drawing Unique Characters

After finalizing the composition, I worked on detailed character drawings, ensuring that various parts like hair, head, eyes, and clothing were on separate layers for animation. I dedicated extra time to make the final shot in the trailer as detailed as possible, given the limited timeframe that I had.

Step 3 – Animation

I imported each scene into Cartoon Animator 5, utilizing different layers that needed animation. I used the free deform tool and character rig tool to manipulate and rig them in the program. Additionally, I applied spring dynamics to smaller details, such as strands of hair, to respond to specific keyframes rather than animating them manually.

Step 4 – VFX Composition

Once all the scenes were animated, I exported them as a PNG sequence and imported them into After Effects. There, I edited each shot to match the music, and I used the puppet tool to animate complex layers and create unique distortion effects during scene transitions.

Interview with Astrovique

Q: What do you think makes a winning entry to competitions like this? How does someone make their entry stand out in the crowd and really grab the judges’ attention?

A: Honestly, having something that is uniquely yours and fits whatever category you’ve entered is the best approach in a competition like this. On top of that, remember to have fun with what you create! I participated in this competition because I wanted to create an exciting trailer featuring my characters in action scenes, all while learning new software and enhancing my drawing and editing skills.

Q: Some of you are quite new to Cartoon Animator, while some of you would now be considered veterans of the software. For those of you who are new to Cartoon Animator, what surprised you most about working with this 2D animation software?

A: I’ve only been using Cartoon Animator 5 for about a month, so I’m relatively new to the program. However, I managed to pick up a lot in a short time thanks to its user-friendliness. Despite being a new user, I found it straightforward to learn, with abundant resources and tutorials available. The most exciting feature I discovered was the ability to capture mouth movements for lip-syncing and body animations for 2D characters, which I think is a fantastic addition.

Q: Did you find the process of creating characters and content for your entry easy and enjoyable? And how did you find the experience of animating your project with Cartoon Animator? Were there any specific challenges you had to overcome? And if so, how did you overcome them?

A: Creating characters was relatively straightforward since they were already established. My experience with Cartoon Animator has been a positive one! The learning curve isn’t too steep, and anyone can pick it up. My main challenge was finding time to work on the animation alongside my regular job and daily life while learning new software. Additionally, creating initial storyboards and coming up with compelling scene compositions for events that hadn’t occurred in my webtoon were specific challenges. I wish I could have added more details to my drawings, but I’m still happy with the end results.

Q: As a Cartoon Animator user, if you had to choose only one tool or feature of CTA that is your absolute favorite, what would it be?

A: If I had to pick one feature out of them all I would choose the deformation tool. A very simple and straightforward tool to use that allows you to capture movement quickly and easily.

Q: What is your best piece of advice to other Cartoon Animator users who might be considering entering contests like this in the future?

A: Familiarize yourself with the program as much as you can and have characters or illustrations ready beforehand so you don’t rush towards the end. Have a clear plan of what style you want for your animation, the music you’re going to use, and the voice acting you’ll need in your animation, so you are able to schedule everything efficiently and maintain consistent quality in your work.

Q: Now that this year’s contest is finalized, what does the year ahead hold for your work? Are there specific projects you are working on? How does Cartoon Animator fit into your future plans?

A: Some plans I have for the future are continuing my webcomic on the sidelines, drawing more original illustrations and animating some of these illustrations!

Follow Astrovique

Instagram | https://www.instagram.com/astrovique/

YouTube | https://www.youtube.com/@astrovique

TikTok | https://www.youtube.com/@astrovique

WarLord’s Workshop Review: A Look at CC4 ZBrush Face Tools

ZBrush has long been recognized as a powerhouse of 3D sculpting and painting with innovative tools, techniques, and a user interface that is very unconventional but very usable. Once you’ve spent some time with ZBrush you can begin to appreciate its value to the 3D community.

ZBrush can sculpt the finest details into a mesh, whether it be an aging, realistic character with weathered leathery skin, detailed armor, or finely engraved jewelry. It’s long been recognized as a 3D tool that punches way above its weight class in terms of pure artistic “enablement”, if that is a word.

While Character Creator 4 does indeed create 3D mesh characters, it does so from base or original meshes, reducing the need to rely on pure artistic sculpting or technical rigging. It does not produce character meshes from scratch, which is the domain of tools like ZBrush and can often require advanced skills. On the other hand, ZBrush is not known for its character rigging, which is one of the strong suits of Character Creator 4.

Put them both together and you have basically supercharged Character Creator 4.

ROBUST GoZ FUNCTION

Face Tools is the kind of GoZ functionality we’ve always wanted. It packs a lot into the trip back and forth from ZBrush to iClone, made with the click of a button. The Face Tools control panel is packed with one-click morphs for testing and editing, along with detail and diffuse functionality. The diffuse feature alone is a great improvement, and that is not counting the rest of the features in the Face Tools control panel.

Left and Middle: Face Tools Menu (in ZBrush). Right: A few of the layers that import from CC4.

EASY INSTALLATION

Face Tools installation is a very simple process. Use the Reallusion Hub to install the plug-in and you should see a Face Tools button (with a character loaded) on the modify menu. Do your thing in Character Creator 4 with the sliders to define your character, add Skin Gen, and things like that if desired. Once your character is ready to go into ZBrush for heavy-duty work you press the Face Tools button.

The first time you press that button, it will check whether the GoZ functionality is installed and, if not, install it. It will also ask which version of ZBrush to use if you have more than one version installed.

That’s it. It worked flawlessly for me, and there was no need to manually copy GoZ files, as many users had to do with the original GoZ feature.

ZBRUSH LAYERS FUNCTION

The tool relies heavily on the Layers function in GoZ to keep track of changes within the mesh. When you go to Layers, you will see several layers there already. Since Layers is being used, you need to use an existing layer or create a new one to make changes. The Face Tools control panel image above shows just a few of the available layers that transfer over to ZBrush.

MY FIRST TEST

Just to see what Face Tools was about, I loaded a base mesh and then jumped over to an online AI image generator to create a facial image of an older Native American male. I used Leonardo AI for this particular image, and it really made the character pop without a lot of interaction from me.

I sent the mesh over to ZBrush with every option checked that I could find or that made sense, as I had not read the manual yet and wanted to see how far I could get just by using it. There is very little learning curve with this tool if you are proficient with ZBrush and CC4. By proficient I mean able to follow instructional steps to move forward, as you don’t have to be a power user.

As I said earlier, layers rule the day as you need to create a new layer or use an existing layer to make changes in ZBrush. This is nothing new to seasoned ZBrush users. Layers are powerful and record your changes.

At this stage, I can add more detail to wrinkles or add more skin texturing if desired. Enhance what is already there or cut in new wrinkles that match some of the AI-generated head image. I never ran into a snag along the way. It was one of the few times the experience was frustration-free.

Left: CC4 Character, Middle: CC4 Character with Detail, Right: CC4 Character Diffuse Polypainting

I will list Kai’s quick start tutorial in the resources section at the bottom of this article. When I said I didn’t read anything, that didn’t mean I didn’t WATCH anything. So with Kai as my guide, I flew through the process like I knew what I was doing, and that is a plus when you can do that on your first run with any new plugin or tool. I made the clothing and accessories in Marvelous Designer.

CHARACTER CREATOR 4 TOON WOLF TOUCHUP

Since the early days of Character Creator, I have created several toon sliders and some extreme reshaping sliders to morph characters like the Toon Wolf I’m using in this example. I loaded a generic base in Character Creator 4 and applied the slider to its full extent. The slider was based on reshaping the existing base mesh at the original low poly count for GoZ. It might have been created as far back as CC1, since it seems I’ve always had these morphs around to test with.

PHOTOSHOP RETOUCHING

I sent all the different maps, like diffuse and normal, over to Photoshop to remove things like the mouth, nose, and ear sections so they wouldn’t interfere with the anthropomorphic wolf skin. If you don’t remove these details, they can show up on the character in the wrong places, or show a badly distorted human ear where the original ear was located on the base mesh.

Removing some of the skin features through the various channels in Photoshop. I used the Clone tool to clone clean areas over the features that were removed.

You can use the image editor of your choice for the same task. In this example, I didn’t change the texture until the end of the process.

TIP: If you still see two black nostrils after editing the maps in the various channels then you need to change the shader from Human Skin to PBR and the black nostrils will disappear.

TO ZBRUSH

Next up was sending the head over to ZBrush with the Face Tools plugin so I could correct some problems with the teeth sticking through the lips, due to the extreme reshaping and low poly count. The Face Tools menu in ZBrush makes it easy to quickly select a morph like Mouth Smile or Mouth Frown to check on the teeth penetrating the skin.

This is Face Tools in its basic usage. We could also create more wrinkles, and/or modify existing wrinkles while adding other details like skin textures, wounds, scrapes and bruises if needed, and much more.

RESHAPING THE SKIN OVER THE TEETH

As I went through the morph buttons in the Face Tools menu, I used the Move tool with horizontal symmetry turned on to pull the skin over teeth or gums when they poked through.

Selecting a morph in the Face Tools panel and using the move tools to bring the lips out over the teeth and gums.

This requires using the aforementioned layers in ZBrush to make edits. You can edit the wrinkles, make other modifications, or add more detail to the skin surfaces, as long as you do not alter the polycount, just as with any GoZ function. You cannot increase or decrease the polycount, as it has to match the model it will be transferred to.

TIP: When I needed to do body shaping and head shaping, I did the body first with the regular GoZ function then used Face Tools to Create a new head mesh instead of Relinking to the original GoZ mesh.

DIFFUSE WORK

The next step was changing the diffuse map from a human skin tone to a tone more suitable for a toon wolf. To do this I turned on the Diffuse channel in the Face Tools ZBrush menu then went to Polypaint -> Paint from Texture (in ZBrush) to transfer the existing skin texture which we could alter or paint over.

I turned off the Diffuse button in the ZBrush Face Tools palette and noticed the remaining original coloring was now painted on the head. I selected the material in ZBrush I was going to use and got to work. For the purpose of this review, I kept it very simple.

Simple two-material texture painting.

After transferring back to iClone, I dressed the Toon Wolf in clothing similar to what a jazz musician might wear, completing the look of a laid-back, saxophone-playing anthropomorphic wolf with his nose and mouth scrunched up, playing the saxophone with no penetration from the teeth or gums. I dragged and dropped in a 2D AI-generated background of a stage to dress things up a bit.

What used to be a tedious character to work with, because of the extreme shapes forced on the low-poly base mesh, had become a pleasure to work with, with no penetration problems even when using the wrinkle system heavily.

Saxophone model “Saxophone” (https://skfb.ly/oo97B) by Matt Caddie is licensed under Creative Commons Attribution (http://creativecommons.org/licenses/by/4.0/). Beard from the Beard and Brows Builder pack.

I’ve been using ZBrush for several years and consider myself a typical user, not a power user by any means, and I found Face Tools to be simple to use, easy to grasp, and somewhat intuitive, which is not a word generally associated with ZBrush. This GoZ function is much more powerful, and the ability to paint in ZBrush and transfer to iClone without mapping will be a time saver, eliminating some of the buzzkill that type of housekeeping work can bring with it.

SUMMARY

If you use ZBrush with iClone then Face Tools is worth a look. Take the trial for a test drive and decide for yourself.

MD McCallum – WarLord

Digital Artist MD “Mike” McCallum, aka WarLord, is a longtime iClone user. Having authored free tutorials for iClone in its early years and been selected to write the iClone Beginners Guide for Packt Publishing in 2011, he was fortunate enough to meet and exchange tricks and tips with users from all over the world, and he loves to share this information with other users. He has authored hundreds of articles on iClone and digital art in general while reviewing some of the most popular software and hardware in the world. He has been published in many of the leading 3D online and print magazines while staying true to his biggest passion, 3D animation. For more information click here.

Make a Compelling Promo Video with Animated 2D Characters For Cartoon Animator

The Winner Tips & Interview series covers practical workflows and techniques shared by winners from the “2023 Animation At Work Contest”. To let users see the full spectrum of the Cartoon Animator pipeline, we are introducing projects that received attention and credit from the community. Let’s now take a look at “Bringing to live” to see how Nandor Toth worked his magic with Reallusion Cartoon Animator (CTA).

About Nandor Toth

Hi, I’m Nandor Toth, and I am from Hungary. I used to study veterinary medicine in Budapest, and treating dogs, cats, and horses was my daily life. But a decade ago, I was lucky to switch to another interest, which is animation. I love my job, and I am grateful to have a job that is also my hobby and where I can express my creativity. I think about my work almost all the time, even when I’m walking, swimming, or mowing the lawn.

My main strengths are creativity, a sense of beauty, drawing, and creating cartoon characters. I have worked with major clients including BOSCH, Liqui Moly, and RTL. My main areas of work include Motion Graphics, Explainer videos, Visual effects, Cartoon Animations, Logo animation, Mascot animation, Opener videos, and Whiteboard animation. I like challenges, and I wanted to challenge myself in the competition as I had never participated in any competition before. The “Business & Commercial” category seemed the closest to my background, as I have created animations for previous clients that required a similar mindset.

How I Discovered Cartoon Animator

I’ve been working in animation for 13 years, and I’ve been improving my skills all the time. Every day, I think about which field of animation lets me express myself better, and how. I first found Cartoon Animator on YouTube, in a video comparing it with two other programs; I liked Cartoon Animator 5 the most, and I installed it on my computer in March 2023.

When animating, I think the ease of adding movements and modifying them is really important. And making professional movements is also vital for my work. Besides Cartoon Animator, I finished the project with the help of other software, such as Procreate, Illustrator, After Effects, and Audacity. Though it took me some time to finish, I am glad to share how I made it with Cartoon Animator.

My Creation Process

Step 1 – Ideation and Script Writing

I immediately started brainstorming when the competition kicked off. My goal was to showcase the software’s user-friendliness and my own skills. It wasn’t the first scene that came to mind, but the core storyline: a cute superhero doing magic on a laptop. Then came the challenge: someone had to present the superhero character. This gave me the idea of having a cartoon artist draw him, but the artist can’t animate him; instead, Cartoon Animator comes into the picture, which he gets as a gift from his lover. Then the ideas started coming like an avalanche; later I had to take some things out of the script because I had so many ideas.

Step 2 – Drawing Unique Characters

After several days of scripting, I began drawing the characters on my iPad using Procreate. I first sketched their simplified skeleton in blue, followed by a black outline sketch, and finished with detailed lines and coloring.

Step 3 – Vectorization

The next step involved vectorization. I opened the character template file and imported the JPGs exported from Procreate into an Illustrator file, reducing the visibility of the imported JPGs to 20-30%. I made the lines of the template file invisible, leaving the enclosing layers empty, where I drew the body parts. I positioned the bones and then exported them to an SVG file.

Step 4 – Animation

I launched Cartoon Animator 5, imported the background graphics and characters, and placed the background elements in 3D space. I adjusted the characters one by one in Composer Mode. For the facial animation setup, I made the necessary modifications, followed by 360 Head Creator adjustments. I recolored the sprite hands in SVG Color Adjustment. In Spring Editor, I removed hair and some clothing elements. Then, I proceeded with character animation.

Step 5: Adding Effects

In the next phase, I added effects to enhance the video’s impact. Most of these were free downloadable effects, while some were part of the standard package. To achieve subtle movements of the iPad in the stock video, I used Motion Tracking to analyze and connect the animated Cartoon characters from CTA5 to the Null.

Interview with Nandor

Q: What do you think makes a winning entry to competitions like this? How does someone make their entry stand out in the crowd and really grab the judges’ attention?

A: For an animation to be a winner in the “Business & Commercial” category, I think it has to meet several factors (I’ve ranked them in order of importance, with the most important ones at the top):
-be special and unique: different from the rest
-present the story in a creative way
-convey the clearest possible message with the simplest possible graphics
-it is advantageous if the main character is likable to the majority of the audience (especially if he is cute)
-be pleasing to the eye, with no distracting graphic elements (e.g. too much vibration, flickering, or flashing)
-it can be spectacular, but be careful with it, because it’s like spice: if you put too much in, people run away screaming 🙂

Q: Some of you are quite new to Cartoon Animator, while some of you would now be considered veterans of the software. For those of you who are new to Cartoon Animator, what surprised you most about working with this 2D animation software?

A: As I just started using it in March 2023, I consider myself a newcomer to CTA. I have never come across such good animation software that moves characters in such a professional way, and I can save the movements and add new movements made by others so easily, and even edit them back and forth. For me, this is probably the most important benefit of the program. There are many other advantages, but the one I mentioned is one that I use for all my animations.

Q: Did you find the process of creating characters and content for your entry easy and enjoyable? And how did you find the experience of animating your project with Cartoon Animator? Were there any specific challenges you had to overcome? And if so, how did you overcome them?

A: Yes, I found the job relatively easy compared to the end result. In other similar software, it would have been much harder and slower. A challenge for me was the scene where the superhero is facing us and pointing the magic wand at us. As the deadline was approaching, I didn’t have time to make the character in another view, so I took advantage of the 360 head rotation I had done beforehand and turned the head towards us. I also made a version of the magic wand facing us, while the half-sideways-turned body is covered by the laptop monitor in front of the character 🙂

Q: As a Cartoon Animator user, if you had to choose only one tool or feature of CTA that is your absolute favorite, what would it be?

A: 360 Head Creator, Spring Editor and the latest Motion Pilot (end result looks amazing!)

Q: What is your best piece of advice to other Cartoon Animator users who might be considering entering contests like this in the future?

A: First, learn to use at least a basic level of this wonderful software. Then: unleash your creativity, and free your mind: don’t bother to come up with a story, go for a walk, a bike ride, or as I do: swim a lot, because that’s when the best ideas come 🙂

Q: Now that this year’s contest is finalized, what does the year ahead hold for your work? Are there specific projects you are working on? How does Cartoon Animator fit into your future plans?

A: I want to learn more about the software and will use this knowledge to make “Demo Reel” videos. Videos that I will be proud of. But what I will be MOST PROUD of in my whole life is winning the first prize in the “Animation@Work Contest 2023 Business & Commercial” category! Thank you so much!

Follow Nandor Toth

Website | https://SuperAnimation.net

LinkedIn | https://www.linkedin.com/in/n%C3%A1ndor-t%C3%B3th-2285a128b/

From AccuRIG to Custom Bone and Facial Blendshape Design – a Comprehensive Character Creator Workflow

About the Artist

I am Marcel Brynard, an artist with a background in architecture. I worked for some years at a bronze sculpture studio in Cape Town and have worked as a draftsman and set designer, 3D modeling sets for popular series such as Netflix’s One Piece. In my personal art, I am inspired by my visits to various archeological sites.

My love for sculptural forms intertwines with a fascination for the chimeric and arcane, resulting in art that exudes story-telling and mysticism. 

Embracing digital tools, I intend to explore new possibilities, intersecting architecture and sculpture. Fueled by a love for these things, I am working on an animation project titled “Transcendental Rebirth,” narrating the transformative journey of two shamans on a pilgrimage. One of the main characters for this project is featured in my following workflow.

Workflow Introduction

The art of character animation has witnessed a remarkable transformation in recent years, thanks to the integration of advanced software tools and the creative synergy of various platforms. This article explores a comprehensive workflow for crafting a stylized and emotive humanoid character, focusing on the creation of detailed facial expressions. 

By harnessing the combined power of Character Creator 4 (CC4), ZBrush, Adobe Substance Painter, and Blender, artists can unlock new possibilities in character animation, resulting in characters that truly come to life.

For me, animating a rigged character has in the past seemed a daunting and tedious task, especially when it comes to realistic, mathematical movement. Becoming acquainted with CC4, however, has made the process far speedier and far less laborious. Now it is super simple and quick to rig my character, most of the weights are automatically painted by CC, and I have access to a large range of complex human movements which I can sync to my stylized character in a relatively easy process.

I. Character Sculpting and Modeling in ZBrush

Creating a compelling humanoid character begins with ZBrush, a digital sculpting software known for its powerful and intuitive tools. In this initial stage, I breathe life into my character by sculpting and detailing the character’s form, from head to toe.

I start the process by sculpting the character’s base mesh, establishing the character’s anatomy and overall proportions. With multiple subdivision levels I can add details, and ZBrush excels at capturing them. I can add subtle features like wrinkles, blemishes, and scars, making the character’s face more realistic and believable. 

A very useful feature of CC4 is its GoZ compatibility with ZBrush. I can export all low-res versions of my subtools directly to CC4 with the click of a button; then in CC4 I can very quickly and easily rig my character and access various poses and animations in the CC4 library. Then, again with the GoZ button in CC4, I can send a posed version of my character back to ZBrush, where I can maintain all the high-res details on the newly posed model.

I could, for example, send from CC4 to ZBrush a version of my T-pose character holding a staff or a weapon, or with a frightened facial expression. This is a very powerful workflow for 2D concept art, 3D renders, and figurine design for 3D printing.

II. Texturing in Substance Painter

Once the character’s form and details are sculpted in ZBrush, it’s time to move to Adobe’s Substance Painter, a tool that specializes in creating stunning and detailed textures for characters and objects. I can easily export both the high-res and low-res versions of the sculpture as FBX, and in Substance Painter I can bake the high-res details of the high-poly mesh onto the low-res mesh, optimizing the performance of the software while maintaining the sculpted details. The FBX export plugin for ZBrush makes the transition seamless.

I use Substance Painter’s extensive texture libraries and tools to apply realistic textures to the character. For the skin, clothing, and any other materials I use a combo of smart-materials and various generators and masks. For this specific character I want it to have the look of a ceramic sculpture that has come to life.

III. Character Rigging and Animation in Character Creator 4

With the character’s textures applied, it’s time to prepare it for animation. CC4 is an invaluable tool for rigging and animating. In CC4, I establish the character’s rig, using AccuRig mode to define bones and joints that will control movement. I place the facial bones roughly into position, as I will further manipulate and prepare them in Blender.

IV. Further Bone Manipulation in Blender

Blender, an open-source and versatile 3D software, is where I further manipulate the rig generated with CC4. I export an FBX with its rig from CC4 and import it into Blender. Here I can add additional bones, for example the ear bones. Once I’ve created the extra bones and placed the facial bones in their positions, I check the weight painting for the various vertex groups and make appropriate changes and additions in Weight Paint Mode. I then reimport the FBX into CC4, where I can review and assign the new bones as spring bones (in the case of the ears), and I download an HIK expressions profile from CC4, which I then load onto the character. I export a new FBX, this time “with facial expressions”, and when I import it into Blender, it comes in with Shape Key information.

V. Shape Keys in Blender

Using my Expressions Frame Map (generated from CC4) as a reference, I work on the expressions along the timeline in Blender’s Dope Sheet. For example, I read “Brow Raise Inner Right” for frame 16 and sculpt that expression, making sure to set the Shape Key’s value from 0 to 1 before sculpting the change. I can combine sculpting with bone animation and weight painting to achieve the various expressions: for some, sculpting alone is enough; for others, such as “Jaw Open”, bone animation works well. I create Shape Keys to define a variety of facial expressions, from smiles to frowns and everything in between, working through the list in CC4’s Expression Map and checking inside CC4 what each expression should look like on a human for reference when needed.
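The discipline described above — one frame per expression, with exactly one Shape Key dialed from 0 to 1 while all others stay zeroed — can be sketched in a few lines of Python. Note that the frame numbers and key names below (apart from the two mentioned in the text) are illustrative placeholders, not the actual contents of CC4’s Expression Map:

```python
# Hypothetical excerpt of a CC4-style Expressions Frame Map:
# each mapped frame has exactly one expression fully dialed in.
FRAME_MAP = {
    15: "Brow_Raise_Inner_Left",   # placeholder entry
    16: "Brow_Raise_Inner_Right",  # mentioned in the article
    17: "Brow_Drop_Left",          # placeholder entry
    20: "Jaw_Open",                # mentioned in the article
}

def expression_at(frame):
    """Return the expression to be sculpted at this frame, or None."""
    return FRAME_MAP.get(frame)

def shape_key_values(frame):
    """Value of every known shape key at this frame:
    1.0 for the active expression, 0.0 for all the rest."""
    active = expression_at(frame)
    return {name: (1.0 if name == active else 0.0)
            for name in FRAME_MAP.values()}

values = shape_key_values(16)
print(values["Brow_Raise_Inner_Right"])  # 1.0, the key being sculpted
print(values["Jaw_Open"])                # 0.0, every other key stays zeroed
```

In Blender itself, the same idea would be applied through the Dope Sheet: scrub to the mapped frame, push the corresponding Shape Key to 1.0, then sculpt.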

Once I’ve sculpted, animated, and weight-painted the various Shape Keys, I export a single FBX from Blender and import it into CC4 again; this time it contains all the expression information. I can check the results using the “Edit Facial Expressions” function in CC4. If there are any mistakes or flaws, I can return to Blender and improve the Shape Keys.

VI. Animations in CC4

When I am happy with the expression results, I can choose from an extensive library of animations within CC4. I preview them in the animation previewer to confirm that all my facial expressions work along with the various body motions. I can then export any animation of my choice in FBX format, with both the model and the animation included.

VII. Render in Blender

In Blender, I then import the FBX with the full animation included and set up my camera and lights as desired. I render the frames in Cycles and bring them into Adobe Premiere, where I compile the final video.

Conclusion

The integration of Character Creator 4, ZBrush, Substance Painter, and Blender into a comprehensive workflow has transformed my character animation process. I can now create stylized humanoid characters that convey a rich range of emotions through facial expressions with a relatively straightforward workflow. Being able to download a whole range of animations from CC4 also saves me a great deal of time and frustration.

The result is a character that comes to life with vibrant facial expressions, adding depth and authenticity to storytelling, gaming, and other forms of visual media. The possibilities are endless, and the evolution of character animation continues to astound and captivate audiences and artists worldwide.

More Info about Character Creator 4: https://www.reallusion.com/character-creator/

Renowned Director Debuts Mech vs. Giant Slugfest with iClone & Character Creator

The following is an interview with Cesar Turturro, an accomplished film director who received the prestigious 2022 Epic MegaGrant in recognition of his exceptional work on “Invasion 2040.”

Q: Hi Cesar, please tell us a little bit about yourself.

Hi, let me start from the beginning. Back in 2004, I embarked on my filmmaking journey with my inaugural war feature film, delving into the 1982 Malvinas War. For this project, I crafted VFX close-ups, seamlessly blending real images with 3D animation. Subsequently, I directed my maiden documentary on the same subject for The History Channel, featuring over 60 minutes of compelling air combat animations.

Continuing my creative exploration, I ventured into the realm of fan films, attracting worldwide attention with a production centered around Robotech. Notably, the film drew substantial interest in Japan, accumulating over 200,000 views within a single day.

Cesar Turturro and Fernando Acuña as Co-Directors at Premiere of 1982, estuvimos ahi in 2005

At the panel of Robotech Academy 2014 at Comic Con in San Diego

Q: How did you come up with the idea of creating the science fiction film “Nick 2040”?

Building on my previous experiences, I took a significant leap into creating my own science fiction realm named “Invasion 2040”. This transformative endeavor has spanned over seven years. Along the way, I secured an Epic MegaGrant, and our project caught the attention of Reallusion, leading to our participation in their Pitch & Produce program.

Taking yet another stride forward, I elevated the production to a cinematic level with “Nick 2040”, shooting in 6K using RED cameras. The post-production was meticulously rendered in 4K, thereby completing the trilogy of short films. This final installment serves as a proof of concept for an ambitious feature film, currently in the pre-production stage.

Shooting “Nick 2040”

Achieving this is contingent upon having a significant portion of the VFX groundwork completed before filming. Presently, I am actively engaged in this preparatory phase — every scene, character, animation, and VFX test crafted for the short film is seamlessly integrated into the overarching feature film. This dual role enables me to serve as both co-director and VFX director.

Shooting “Nick 2040”

Q: Please describe your workflow with Reallusion software.

Phase 1 – Create realistic characters with Headshot and Dynamic Wrinkles

I begin with a facial scan of the robot using RealityScan. At the same time, I use Headshot to generate a face from a photo. I then employ ZBrush to merge both models into a mesh that closely resembles the target appearance, combining it with the texture generated by the Headshot plug-in in Character Creator. Once finalized, I transfer all the components to Unreal Engine to build the robot’s face within a Metahuman framework. I also scanned the clothing to enhance the overall realism of the final render.

Highly detailed head of the giant using ZBrush and AccuRIG

I meticulously crafted the antagonist, a towering giant, by intricately detailing the facial features. Drawing inspiration from the expressions of Metahumans, I ventured beyond conventional methods. Employing Blender, I seamlessly merged an AccuRIG body with a Metahuman face, yielding a remarkably satisfying outcome.

Our breakthrough involved generating facial animations through Dynamic Wrinkles. This allowed us to achieve lifelike expressions on both the protagonist’s face and the characters featured in the invasion scenes.

Certain scenes necessitated the recreation of the real actress’s face, which was impractical using conventional shooting methods. Leveraging the Headshot plug-in, I scanned the actress’s face and refined the mesh using ZBrush in conjunction with CC4. Subsequently, I animated it in iClone 8 and seamlessly transferred all data to Unreal Engine via Live Link, producing an astounding result.

Harnessing the advanced capabilities of CC4’s custom character Dynamic Wrinkles and employing facial mocap, I effortlessly executed intricate facial expressions within mere minutes.

Phase 2 – Animate the fighting scene with iClone and ActorCore

The integration of iClone played a pivotal role in refining the transformation animation in “Invasion 2040”. This was particularly crucial as the character rises and places its hands on the ground while simultaneously undergoing a dynamic metamorphosis into a robotic form.

The invasion scene in “Nick 2040” was meticulously crafted in Unreal Engine, featuring a blend of characters from different platforms: some from Metahumans and others from iClone. The practical, easy-to-process ActorCore scans, brought in through iClone’s Live Link, proved ideal for incorporating the various animations from the “Run For Your Life” pack, which perfectly complemented the intensity of the invasion shots. To achieve a cohesive scene, I retargeted the iClone animations onto the skeletons of the Metahuman characters in Unreal.

The climactic final combat between the robot and the Jumper was a two-part process. Initially, I choreographed the movements of both characters through Mocap, personally executing the choreography and refining the contacts for impactful hits within iClone. These animations were then transmitted via Live Link to Unreal for rendering. Simultaneously, the fight scene featured additional sequences sourced from fight sequences available on Reallusion’s Marketplace, enhancing the overall dynamic and intensity of the confrontation.

Reallusion and iClone 8 have been instrumental in empowering me to create intricate animated scenes, featuring epic battles between giants and robots. These scenes, serving as the essence of my films, are brought to life through the robust features integrated into my workflow pipeline. The capabilities offered by these tools, particularly in the realm of CG and animation, play a pivotal role in realizing the vision and complexity inherent in my cinematic creations.

Phase 3 – Immediately preview the results with Unreal Live Link

The Live Link enhancement stands out as truly remarkable. Sending animations to Unreal is both seamless and stable. Correcting any movement is straightforward: I make adjustments in iClone, adhering to the specified start and end times on the timeline, then replace the animation through the Unreal Sequencer with just a few clicks. In a matter of moments, the updated animation is ready for rendering.

The timecode sync render has also revolutionized my workflow. Now, I can seamlessly transfer objects to iClone, animate characters, and use those objects as references. The remarkable part is the speed at which I can send the animation to Unreal, taking mere seconds. This streamlined process has made animation exceptionally simple and fast, marking a significant improvement in my efficiency and creative capabilities.

Q: Why choose Reallusion?

We are currently finalizing the storyboard, shooting, budget, and VFX plan for “Nick 2040”, an essential undertaking that forms the backbone of our feature film. In a compact animation studio, where tasks are inherently complex, our small team of artists has found immense capability in this pipeline:

  • Stability: The entire process is marked by absolute stability, with zero instances of crashes.
  • Speed: Animations exceeding 1000 frames are transferred to Unreal in less than a minute. Remarkably, the processor and GPU often remain unburdened during this swift process.
  • Organization: I streamline my workflow by saving diverse animations in iClone’s Content Browser. This allows for quick retrieval and utilization in various scenes of the film, as virtually all necessary sequences are conveniently stored there.
  • Practicality: Incorporating animations from various sources, including Noitom mocaps, Mixamo animations, and Unreal character animations, is a breeze. A simple drag-and-drop action from the Explorer to iClone seamlessly integrates these animations into the project.

As we near completion, the trailer provides a glimpse of our work, and we eagerly anticipate the release of the final short.

Learn More