Create Your Own Stylized Character – An Easter Bunny: Character Creator to Blender Workflow

In the high-stakes world of digital media, the ability to create your own character with speed and artistic precision is a vital skill. Whether you are an independent artist or a professional character designer, bridging the gap between a standard human creator tool and a fully customized, stylized asset is the key to a modern 3D pipeline.

This workflow breakdown explores how we leverage the combined power of Character Creator 5 (CC5) and Blender to build a stylized Easter Bunny. By utilizing a solid character base and advanced 3D animation software techniques, you can achieve cinematic results without sacrificing production efficiency.


1. Establishing the Silhouette: Base Proportions & Pipeline

Every iconic character designer knows that a successful design begins with a readable silhouette. The process begins inside Character Creator 5 using the Proportion Editor on a standard neutral character base.

By dialing in the general body shape—adjusting limb length, torso width, and head scale—we establish the foundational cartoon proportions before ever touching a sculpting brush.

The Character Creator Blender Pipeline Export

Once the rough shape is set, the model is sent over to Blender using the free Blender Auto Setup add-on. A key choice here is exporting the character as a fully rigged armature rather than just a morph. This is a critical step for any character designer because it allows far more complex geometry modifications and advanced texture painting later in the pipeline without breaking the underlying rigging.


2. Pushing the Geometry: Stylized Facial Sculpting

When the model arrives in Blender, the real work begins. For this stylized bunny, working at SubD1 provides the perfect “sweet spot” of resolution.

Sculpting Fur and Features

Working at this level offers enough geometry to grab and pull distinct tufts of fur directly out of the facial mesh, yet remains low enough to easily manage broad structural changes. While this aggressive mesh-pulling isn’t suited for realistic humans, it is highly effective for exaggerated designs in 3D character animation.
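To see why a single subdivision level is such a sweet spot, it helps to remember that each Catmull-Clark subdivision level quadruples the quad count. A quick sketch in Python (the base count below is a made-up figure for illustration, not CC5's actual mesh density):

```python
# Rough quad counts per Catmull-Clark subdivision level.
# Illustrative only: the base-mesh count is assumed, not an official figure.
def subd_quads(base_quads: int, level: int) -> int:
    """Each Catmull-Clark subdivision level splits every quad into four."""
    return base_quads * 4 ** level

base = 14_000  # hypothetical quad count for a character base mesh
for level in range(3):
    print(f"SubD{level}: {subd_quads(base, level):,} quads")
```

At SubD1 you have four times the base resolution to grab tufts of fur, while SubD2 and beyond quickly become too dense to push around for broad shape changes.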

Custom Jaw and Teeth Modifications

One of the most drastic modifications involves the character’s mouth. The jaw is heavily manipulated, and standard teeth are hidden to make way for custom-modeled cartoon buck teeth.

Important Note: Extreme facial modifications require great care to ensure they don’t break when accommodating CC5’s wide range of standard facial expressions and lip-sync animation. There is a significant difference between prepping a character for a specific project and prepping one for distribution on the Reallusion Content Store.


3. Dynamic Accessories: Ear Construction & Rigging

To give the rabbit its signature look, the large ears are modeled from scratch in Blender. Starting from a simple flat plane, the ear shape is mapped out, inset to create the inner ear cavity, and extruded for thickness.

The 0,0,0 Origin Rule

Before these ears are rigged for 3D character animation, a crucial pipeline step must be taken: the geometry is dropped down to the absolute center of the scene (the 0,0,0 origin).

If you build an armature with an offset origin, the bones will often float off-axis when you attempt to set up spring bone physics back in CC5. By centering the asset first, smoothing the vertex weights, and applying a gradient tool for a clean transition, we ensure flawless dynamic secondary animation once the ears are parented to the character’s head.
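The two steps above — centering the mesh before rigging and painting a gradient of vertex weights — can be sketched in plain Python. This is conceptual only; inside Blender you would use the origin tools, 3D cursor, and weight-paint gradient rather than code, and the ear coordinates here are made up:

```python
# Conceptual sketch (plain Python, not Blender's bpy API) of two steps from
# the article: moving an accessory mesh to the 0,0,0 origin before rigging,
# and assigning a linear vertex-weight gradient along its length.

def center_at_origin(verts):
    """Translate vertices so the mesh sits centered on X/Y with its base at Z=0."""
    cx = sum(v[0] for v in verts) / len(verts)
    cy = sum(v[1] for v in verts) / len(verts)
    z_min = min(v[2] for v in verts)          # drop the base to the floor
    return [(x - cx, y - cy, z - z_min) for (x, y, z) in verts]

def gradient_weights(verts):
    """Weight 0 at the ear's base, 1 at its tip, for a smooth spring falloff."""
    z_lo = min(v[2] for v in verts)
    z_hi = max(v[2] for v in verts)
    return [(v[2] - z_lo) / (z_hi - z_lo) for v in verts]

ear = [(2.0, 1.0, 3.0), (2.5, 1.0, 5.0), (1.5, 1.0, 7.0)]  # hypothetical verts
centered = center_at_origin(ear)
weights = gradient_weights(centered)
```

The point is that once the geometry is at the world origin, the armature and the spring-bone setup in CC5 agree on the same reference frame, so the bones stay on-axis.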


4. Procedural Texturing: Achieving the Stylized Look

For texturing the stylized fur and skin, the workflow uses the Ucupaint add-on directly inside Blender. While external texturing programs are the industry standard, Ucupaint offers a remarkably robust layer system right in the native viewport.

By stacking procedural effects, generating noise, and baking the lighting directly into the base color, we can achieve a highly appealing, hand-painted aesthetic. Once the textures are baked and saved, the procedural nodes are removed to keep the asset clean for export.


5. Building the Scene: Custom Easter Props

No Easter Bunny is complete without his deliveries. We jump back into Blender’s modeling tools to create a custom wicker basket from a subdivided cube.

Using edge loops and basic shaping tools, a handle is formed, and a woven procedural texture is applied. The basket is then filled with a cluster of smooth, colorful eggs, giving the character immediate context and charm for future 3D character animation sequences.


6. Final Assembly

The final step brings all the custom Blender assets back home to Character Creator 5. The custom ears are parented to the head with spring physics applied, the baked textures are slotted into the materials, and the basket is placed in the character’s hand.

The result is a fully rigged, highly stylized, and animation-ready character created in a fraction of the time a traditional scratch-build would take.


Conclusion: Elevating Your Character Pipeline

By following this structured pipeline—from the initial character base in CC5 to the stylized sculpting in Blender—you can create your own character that bridges the gap between technical reliability and artistic flair. Whether it’s mastering 3D character animation or perfecting custom geometry, this hybrid workflow is the most efficient way to build high-quality characters in 2026.

Author: Mythcons

Greetings, my name is Peter Alexander. In this demonstration, I’m going to walk you through how you can leverage Character Creator 5’s new ActorMixer as a powerful stepping stone to create unique, stylized characters. We’ll use the CC Base Mesh as our foundation, and ActorMixer will provide the next layer from which to build, making professional character creation that much easier. For this demo, I’ll be creating a stylized version of Arnold, using the Blender Auto Setup pipeline, which is both cost-effective and efficient for creating characters and assets.

Visit Mythcons’ ArtStation

FAQs

Q: Why use a Character Creator base as the starting point for Blender? Why not model directly in Blender? A: Starting with a CC5 character base saves a massive amount of time on technical fundamentals. While you can model directly in Blender, using the base ensures you have professional topology, UV maps, and a fully functional skeleton right away. This allows you to skip the tedious rigging and skinning process and jump straight into the creative sculpting and design.

Q: What is the function of Blender Auto Setup Plugin? Is it free? A: Yes, it is a free plugin provided by Reallusion. Its main function is to automate the import/export process between Character Creator and Blender. Beyond setting up complex shaders and mapping the rig, it features a powerful Data Link function. This allows you to sync and transfer your character, scene, lighting, and camera with one click, ensuring total consistency between both environments.

Q: How do I get Character Creator & iClone for my Blender project? A: You can download Character Creator (CC) and iClone (IC) from the Reallusion website. Starting in 2026, Reallusion offers flexible options: you can choose from various subscription plans (monthly or annual) for low upfront costs or purchase a perpetual license for permanent ownership. Trial versions are also available to test the workflow, and the Blender Auto Setup tool can be found on their official plugin page.

Related Article

AI Video Mocap Brings Studio-Level Productivity to Indie Game Developers

Creating game NPCs with Character Creator and iClone

Indie and double-A games are booming, but one challenge remains: animation is expensive. NPCs require massive amounts of motion, including idles, interactions, walk cycles, daily routines, and countless variations. Traditional mocap can help, but equipment costs only pay off with repeated use. Pre-made motion packs save time initially, yet they rarely match your exact needs, and manual adjustments can erase those time gains.

Enter Video Mocap: a fast, lightweight solution that captures motion from everyday video. This article walks through the full process, from building NPCs in Character Creator 5 to generating custom motions for them using iClone’s Video Mocap.

José Antonio Tijerín

I’m José Antonio Tijerín, a digital illustrator and 3D sculptor. I’m the creator of the video game “Dear Althea” on Steam and the content pack “We’re Besties”, which is available in the Reallusion Content Store.

Generate a Full NPC Roster from a Single Base

We’ll start by rapidly building a roster of NPCs in Character Creator. With ActorMixer, you can take a single base character and generate a wide range of variations in body shape and facial structure while keeping the style consistent. Layer in texture changes, and you can scale NPC diversity quickly, which is ideal for towns, street scenes, and large management-style populations.

Easily Switch to Game-Ready Low-Poly Characters

Character Creator also solves the production side of NPC scale. You are not locked into one quality level. Start with high-fidelity characters for close-ups, then convert the same NPC into lighter, game-ready versions for crowds and background use without rebuilding. For more aggressive optimization, Optimize and Decimate can further reduce complexity by unifying the character and add-ons into a single low-poly result.

Capture Motion Whenever Production Needs It

The Video Mocap workflow is so lightweight that motion capture stops being a tedious process and becomes a tool you can use anytime to generate NPC actions. Footage can be recorded with your phone, and it doesn’t need to be high resolution. Avoid uploading oversized videos, as they can slow down the generation process. What matters is a clear performance: the performer should be fully visible, the camera should remain static (eye level works best), and the motion should reflect natural human movement. Video Mocap is designed for live performances, not for cartoon characters or stylized animation.

Turn Raw Capture into Game-Ready Motion Fast

Video Mocap gives you a strong motion base fast, but what makes the workflow production-ready is iClone’s motion editing. Like any mocap approach, you may see issues such as minor joint jitter, finger vibration, or unstable ground contact in sitting and leaning poses. The difference is you’re not stuck with raw output. With iClone, you can quickly reduce unnecessary keys, stabilize posture and contact, and refine timing so the motion loops cleanly and behaves predictably in-game. That combination turns Video Mocap from “AI capture” into a complete animation pipeline where you can fix what matters and keep iterating without losing days to cleanup.
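The key-reduction idea can be illustrated with a tiny sketch. This is a generic decimation pass, not iClone's actual algorithm: drop any key that linear interpolation between its kept neighbors already reproduces within a tolerance, so jittery raw capture collapses to the keys that actually carry the motion.

```python
# A minimal sketch of keyframe reduction, the kind of cleanup described above.
# Generic technique, not iClone's implementation.

def reduce_keys(keys, tol=0.01):
    """keys: list of (time, value) sorted by time. Returns a reduced list."""
    if len(keys) <= 2:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        (t0, v0), (t1, v1), (t2, v2) = kept[-1], keys[i], keys[i + 1]
        # value predicted by interpolating from the last kept key to the next key
        predicted = v0 + (v2 - v0) * (t1 - t0) / (t2 - t0)
        if abs(v1 - predicted) > tol:   # keep only keys that carry new shape
            kept.append(keys[i])
    kept.append(keys[-1])
    return kept

# A near-linear ramp with capture noise collapses to its two endpoints.
noisy = [(0, 0.0), (1, 0.51), (2, 1.0), (3, 1.49), (4, 2.0)]
print(reduce_keys(noisy, tol=0.05))
```

Real motion editors do this per channel with rotation-aware error metrics, but the principle — discard keys the curve can already explain — is the same.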

Blend Captured Motion to Fix Missing Data

Sometimes reference footage isn’t ideal, especially when legs aren’t visible. In a one-button pipeline, that would be a dead end. A practical workaround is pairing captured upper-body performance with a compatible motion from iClone’s library, including premium motions from ActorCore, one of the largest motion libraries in the industry, then retiming and blending so the final animation feels coordinated. It’s a very game-dev way to work: take the best part of each source and assemble a result that ships.
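Conceptually, this kind of upper/lower-body blend is just a per-joint mask with a soft seam. A minimal sketch, using made-up joint names and a single rotation value per joint rather than real mocap data:

```python
# Hypothetical pose format (joint name -> rotation value), not iClone's data.
# Upper-body joints come from the video capture, the rest from a library
# motion, and seam joints (e.g. the spine) are linearly blended.

def blend_poses(upper_pose, lower_pose, upper_joints, blend_joints, t=0.5):
    """Composite one pose from two sources with a masked, blended seam."""
    out = {}
    for joint in lower_pose:
        if joint in blend_joints:
            out[joint] = (1 - t) * lower_pose[joint] + t * upper_pose[joint]
        elif joint in upper_joints:
            out[joint] = upper_pose[joint]
        else:
            out[joint] = lower_pose[joint]
    return out

capture = {"spine": 10.0, "arm_l": 45.0, "arm_r": -30.0, "leg_l": 0.0, "leg_r": 0.0}
library = {"spine": 2.0, "arm_l": 0.0, "arm_r": 0.0, "leg_l": 25.0, "leg_r": -25.0}
pose = blend_poses(capture, library,
                   upper_joints={"arm_l", "arm_r"},
                   blend_joints={"spine"}, t=0.5)
```

Production tools blend quaternions per frame rather than single values, but the design choice is identical: mask hard where each source is trustworthy, and feather only the seam.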

Source Video As The Ground-Truth Reference

It’s tempting to treat video input as disposable once motion is extracted, but the footage remains useful. It’s your ground truth for timing, direction, orientation, and interaction accuracy. If your character needs to align with an object like a lean, a prop, or a surface, the original performance is still the best guide for making the motion feel intentional rather than procedurally close enough.

A Smooth Handoff to Unreal Engine 5

The handoff to Unreal Engine 5 is designed to be smooth, so moving characters and animations across tools doesn’t become a pipeline headache. With the updated Live Link and Auto Setup workflow, you can transfer from iClone to UE5 with minimal friction, keeping materials prepared and animations intact so your NPCs are ready to drop into a playable scene fast.

Studio-Level Leverage for Indie and Small Teams

Video Mocap is built to give smaller teams studio-level production leverage, so you can generate believable NPC motion on demand without big budgets, specialized gear, or infrequent capture sessions. Combined with Character Creator’s ability to generate and optimize NPCs at scale, you get a workflow that keeps up with how indie and mid-size teams build games today: fast iteration, reusable assets, and practical results.

Related Posts

The Liberty League: 48-Hour Superhero iClone Film

Ruben Ybarra – 3D Animator / Writer / Director

Ruben Ybarra

When most of us think of superhero films, we imagine months, if not years, of planning, production, and polish. But for filmmaker Ruben Ybarra and his girlfriend, the challenge was to create one in just 48 hours. Enter THE LIBERTY LEAGUE, a hilariously unconventional superhero parody featuring a villain so powerful he can control anyone’s mind… except for one “hero” whose brain is so empty he’s completely immune.

Armed with iClone, Unreal Engine, and a running list of gags, Ruben raced against the clock to bring this wild idea to life. The result? A short film that not only placed 2nd citywide in the 48 Hour Film Project, but also picked up Best FX and earned festival recognition along the way.

In this interview, Ruben talks about how the story came together, the tools that saved precious hours, and the lessons he’s learned about speed, collaboration, and creativity under pressure.

https://www.facebook.com/share/v/1BpbFPbogX

Hey Ruben. Can you tell us about the short film you created? What’s the story, and how did you approach the superhero genre within such a tight time frame?

The story is a simple superhero parody. Once we got the genre, I had a few ideas floating around, and then I thought, hey, how about there’s a villain so powerful at controlling people’s minds that he defeats the Justice League heroes in minutes? However, they have a new interim team member who is so dumb that his mind can’t be controlled because it’s basically empty.

When I pitched it to my girlfriend and one of my friends I’ve written a ton of films with, we all chuckled and knew there would be room to put in funny bits to hash out a full story. I wrote the script with my girlfriend and we had a blast. We figured out a simple beginning, middle, and end to the story. At that point we throw everything we can at the wall and see what sticks as far as gags and bits. Once we have a few things that make us laugh, we rank them and quickly cut the parts that either take too long to develop or aren’t as funny. Then we put the pieces together logically where the gags make the most sense.

It’s pretty simple, if it makes us laugh we’ll keep it. If only one of us is laughing… then the idea is cast aside.

What tools in iClone helped you animate so quickly under a 48-hour deadline?

The first is the crazy ease of animation in iClone. You have mocap as an option and also the ability to drag and drop premade professional hand keyed animations on the timeline, transitions to blend between animation clips, and then fine tuning with key framing. It can’t get simpler than that.

“The crazy ease of animation in iClone is what makes everything possible under a 48-hour deadline. You can drag and drop professional hand-keyed animations, blend them seamlessly, layer in mocap, and fine-tune with keyframing — it honestly can’t get simpler than that.”

Ruben Ybarra – 3D Animator / Writer / Director

I use iClone’s AccuFACE plugin for facial mocap and Perception Neuron Studio for body mocap. It works seamlessly with iClone. It’s so simple to apply mocap to multiple characters in a scene. Even faster if they’re sitting, since I can mask their lower bodies and just record my upper body. I’ll usually complete all the mocap animation for the film in less than 2 hours.

I’m especially fond of the handmade cartoon animation series that Reallusion has released. Even if the animation isn’t exactly what you need, there are bits in one that fit perfectly and bits of another that work well. It’s as simple as splitting stock animations into pieces, cobbling them together, and adding smooth transitions. That’s some of the real power within iClone — I can mix and match different stock animation to get close to what I want, then polish it with keyframing. Simple tools like hand gestures to hold items are so fast and easy to use.

“It’s so simple to apply mocap to multiple characters in iClone. With AccuFace and Perception Neuron Studio working through Motion Live, I can animate an entire film’s performances in just a couple of hours.”

Ruben Ybarra – 3D Animator / Writer / Director

Another massive timesaver is the iClone to Unreal Live Link and Character Creator setup with Unreal. I’ve been using it since the early days when syncing assets was tricky and materials weren’t always consistent. I even missed deadlines on a few films back then because Unreal was always an adventure with last-minute problems.

Now all I can say is WOW! It just works. I have my sets in Unreal per scene, transfer landmark items like walls and floors into iClone so I know where to place my actors, and animate everything in iClone without even looking at Unreal. There’s zero chance I’d be able to make a film in Unreal on a 48-hour deadline without the iClone/Unreal link. Unless maybe the film was about two characters lost in a cave in total darkness! I kid… I kid…

What made your film stand out, and how did iClone save you from major headaches during the project?

From the technical side, we stood out because the only limitation we had telling our story was our imagination. I was able to literally put everything in the film that I wanted. When writing the script, my girlfriend would have an idea for a gag and ask, “Can you animate that?” The answer was always, “Yep, I sure can.” Thanks Reallusion!

iClone really saved me everywhere. I needed cartoon-style facial animation? AccuFACE handled it. I needed characters talking quickly? Just add audio files and you’re done. I needed to animate a bunch of characters on the screen fast? Mocap and iClone Motion LIVE took care of it.

What used to be a nightmare in Unreal was syncing animation to audio. In iClone, audio naturally works perfectly in renders, but Unreal doesn’t take iClone audio. With iClone Timecode sync, all I have to do is render in Unreal and match the same audio file I used in iClone inside Premiere. Voila! They’re in perfect sync. That makes the final edit extremely fast, since I’ve already done 99% of my camera cuts in iClone.

What did you learn about speed and decision-making during a time-crunched project like this? Any tips for keeping momentum?

You have to sacrifice perfection for “good enough.” You can spend hours upon hours just on camera work making a scene near perfect, but you’ll never finish. Keep it simple and keep moving forward. Make a schedule and stick to it. If you’re falling behind, you’ll need to take that time away from the tasks you still have left. Also, the story is key. Even if the animation is a little rough, as long as the story is a good one, the audience will go along for the ride.

Your film placed 2nd citywide, won Best FX, and got Festival Recognition. What did that success feel like after such a fast and furious project?

From Friday to Sunday I literally had 2 hours of sleep. I remember thinking, I can’t do this to my body. It isn’t worth it. After the awards and hearing the audience reaction at the screening, we signed up for another one. It was beyond rewarding. Working with and sharing this with someone you love is just the icing on the cake. We’re excited for the next one.

What advice would you give to someone new to 3D animation or anyone thinking of trying a speed animation challenge?

Reallusion has some fantastic tutorials. Learn the basics, then give yourself a simple project — animate just one full scene. Force yourself to finish it. Inevitably you’ll hit challenges you can’t figure out, but instead of giving up, take to the internet. The Reallusion forums are AMAZING. The “UE 5 Reallusion Tips and Tricks” thread has saved me so many times I can’t even begin to count. Use those resources, find the technical solutions, and finish your project.

If you’re preparing for a speed challenge, the same principles apply. Keep the story small enough that you can animate it quickly: three or so sets, a limited cast of characters, nothing sprawling. Prebuild generic sets and premake characters you know you’ll use. It’s no different than scouting film locations or casting actors ahead of time. And give yourself a dry run. Set aside a weekend, pull a random genre, character, prop, and line of dialogue, and see if you can hit the deadline. If you miss the mark, recalibrate and try again until you know where the time sinks are.

What’s one iClone feature that you think every new user should learn first and why?

The basic animation technique of using stock motions but cutting/splicing them together while incorporating the edit motion layer tools to create totally different and unique animations. It is one of the simplest and most powerful ways to use iClone. Once you master this, you can honestly make any character animation your heart desires.

“One of the most powerful techniques in iClone is cutting and splicing stock motions together, then refining them with Edit Motion Layer tools. Once you master that workflow, you can honestly make any character animation your heart desires.”

Ruben Ybarra – 3D Animator / Writer / Director

What’s next for you in animation? Any future projects, goals, or lessons from this experience that you’ll carry forward?

We just completed the Houston 48Hr Film Project with a fun Vampire Sitcom film. We feel good about it, so we’ll wait to see what the judges think and maybe shop it around the film festival circuit. We plan to shop the Superhero film to more film festivals as well. We signed up for another 48Hr Festival in mid-August, so no rest for the weary!

https://www.facebook.com/reel/837608819299204

Follow Ruben Ybarra:

Facebook:
https://www.facebook.com/groups/1684100315159672/user/1592285587

YouTube:
https://www.youtube.com/@alphaflightfilms64/videos

Related Topics:

iClone Video Mocap Delivers a Studio-Free Lifesaver for Indie Artists

The Making of 3D Gorilla Animations: Capturing the Beast in Motion

Dom Fred Unveils Cutting-Edge 3D Animation at Cannes 2025

Master the ZBrush to Character Creator 5 Pipeline: Get the “Veyra” Masterclass Free with Your CC5 Trial

Elevate your character production by unifying ZBrush and Character Creator 5 (CC5) with Oscar Fernandez, the ZBrush master. Known for his renowned series “Master of the Orc” and “Master in ZBrush for 3D Printing,” Oscar now brings his expertise to a revolutionary CC5 x ZBrush pipeline.

In an exclusive collaboration with Libel Academy, we are offering the complete “The Making of Veyra” masterclass—a premium $60 USD value—absolutely FREE when you sign up for the Character Creator 5 (CC5) Trial.

This is your chance to achieve efficiency and professional-grade results without struggling with manual retopology and weight painting. Instead, let Character Creator 5 do the heavy lifting for you.

Why “Veyra” is the Must-Watch Tutorial for Your Pipeline

This series isn’t just about making a cool creature; it’s about mastering a professional ecosystem. Oscar Fernandez demonstrates how to unify ZBrush and Character Creator 5 (CC5) into a single, powerhouse workflow.

  • Sculpt with Freedom: Use ZBrush for world-class anatomy, armor, and organic detailing.
  • Rig with Speed: Leverage CC5’s auto-rigging and total compatibility to bring your sculpts to life in minutes.
  • Optimized Exporting: Learn precise workflows to ensure your high-poly details translate perfectly to functional assets.
  • Professional Efficiency: Designed for artists who need high-end results without the traditional technical bottlenecks.

20 Chapters of Expert Instruction

This comprehensive course guides you through every facet of the Veyra project—from the initial CC5 base mesh to the final rendered masterpiece.

  1. Before You Begin
  2. CC5 Base Mesh
  3. Basic Facial Structure
  4. Basic Body Structure
  5. Helmet & Horns
  6. Armor Pieces
  7. Creating Clothing
  8. Creating Ornaments
  9. Basic Wing Structure
  10. Refining Anatomy
  11. Detailing The Hands
  12. Detailing The Horns
  13. Helmet Ornaments
  14. Detailing The Wings
  15. Detailing The Armor
  16. Preparing For Export To CC5
  17. Texturing Accessories
  18. Sending Accessories To CC5
  19. Final Render
  20. Workflow For 3D Collectibles

Claim Your Free Masterclass Today

Stop letting technical hurdles slow down your creativity. Learn how to bridge the gap between sculpting and animation with the industry’s most efficient tools.

  1. Download and Start your Character Creator Trial HERE.
  2. Check Your Inbox: You will receive a follow-up email immediately after registration. The link to the tutorial will be embedded in this email.
  3. Unlock Your Training: You can then access the private link of The Making of Veyra tutorial series.

Headshot 3 is Coming: Pre-launch Offer Live Now!

We are excited to announce that Headshot 3 for Character Creator 5 is officially arriving this April. To mark this next leap in digital human creation, Reallusion is kicking off a special pre-launch offer starting today.

To significantly improve the accuracy of 3D head creation from images, Reallusion has invested extensive effort in training a proprietary AI model designed specifically for high-precision facial reconstruction. This custom-built engine ensures a level of detail and likeness that sets a new industry standard.

New Key Features of Headshot 3

1. Prompt to 3D Model

To obtain neutral-expression facial images suitable for constructing 3D head models, Headshot 3 includes an integrated AI Image Generation tool. Simply enter a text prompt to generate optimized front, side, and full-body photos. These images can then be automatically converted into an accurate 3D model. This streamlined workflow removes the need for tedious reference collection and enables the rapid creation of production-ready assets.

2. Higher Accuracy Across All Demographics

Headshot 3 delivers much more precise generation results tailored to specific ethnicities and ages. Whether you are creating a digital double or a unique character, the AI ensures a faithful representation of diverse human features.

3. Intuitive Bezier Controls

To achieve an even more faithful digital double, you can refine the front and side profiles with intuitive Bezier controls. This allows for granular adjustment of the head shape to match your reference perfectly. When combined with CC5 characters, this feature allows users to precisely define eyelid structures, refine mouth curves, and align the flow of laugh lines to achieve greater facial likeness.

4. Professional Lens Correction

Built-in lens correction compensates for distortion in source photos. This ensures that the generated 3D head maintains correct proportions, even when working with photos taken from different focal lengths.

5. Automatic Texture Enhancement

The new De-light function intelligently removes shadows and highlights from source photos to produce a clean albedo texture for accurate rendering. It can also generate roughness and normal maps automatically, giving users a full PBR texture set for more realistic results.

6. Advanced 3D Brush & Masking

The 3D mask brush allows users to remove unwanted blemishes and photo artifacts, such as stray hair or eyelashes captured from the source image, while enabling edits on selected texture channels or across all channels at once.

Headshot 2 vs Headshot 3

A comparison of head-shape accuracy between Headshot 2 and Headshot 3, using the same source image for automatic head generation.

Exclusive Pre-Launch Offers Live Now

To celebrate the upcoming release, we are pleased to offer special entry points for every creator. Whether you are beginning your journey or upgrading your studio pipeline, there is a dedicated path for you:

A. New Users (Don’t own Headshot 1 or 2 yet)

Purchase Headshot 2 now for $129 (Special Offer) and get Headshot 3 for FREE at launch! This offer is available from now until the official April release.

B. Existing Users (Own CC5 + Headshot 1 & 2)

Current Headshot 1 & 2 owners can secure Headshot 3 for a Loyalty Upgrade price of only $99! This special pricing will be available starting on launch day in April.

C. Get the Full Suite (For non-CC5 users)

Don’t have Character Creator 5 yet? Pick up the CC5 + Headshot 2 Bundle during this pre-launch period to secure the best pricing and ensure your pipeline is ready for the Headshot 3 revolution.

Special Bonus for All Users

To show our appreciation for our community, all three types of users will receive 100 AI Image Generations. (Note: The pricing and details for Reallusion AI Generations will be officially announced once the service is ready.)

How it Works

  1. Purchase: Buy Headshot 2 during the pre-launch period (now through April).
  2. Automatic Enrollment: Your Headshot 3 license will be automatically added to your Reallusion account.
  3. Launch Day (April): Simply open the Reallusion Hub to install Headshot 3 and start creating immediately!

→ Learn more in Store FAQ

*Note: Headshot 3 requires Character Creator 5 (CC5) to operate. If you don’t have CC5 yet, now is the best time to upgrade!

We can’t wait to see the stunning characters you’ll create with the combined power of Headshot and AI!

Related Links

Actor Capture Brings Raphael to Life with Motion Capture for Director John Likens

When director and creator John Likens set out to reimagine Raphael in a darker, grittier cinematic world, he wasn’t interested in nostalgia alone. He wanted intensity. Weight. Brutality. A version of the Teenage Mutant Ninja Turtles grounded in shadow, steel, and subway grit — something long-time fans have been hungry to see on the big screen.

The result? A high-impact action short centered on Raphael battling The Foot in a dimly lit underground station — a film that surpassed 2 million views on YouTube within its first few weeks.

At the heart of that realism: performance-driven motion capture from Actor Capture.

Building a Darker TMNT Universe

Likens had already built momentum with previous shorts spotlighting Donatello and Leonardo before embarking on his most visceral installment yet — Raphael. Known as the most volatile and physically dominant of the Turtles, Raphael demanded a fight sequence that felt raw, grounded, and punishing.

Likens collaborated with Oni Stunts and Fight Coordinator James Newman to design a brutal, tightly choreographed battle between Raphael and multiple Foot Soldiers. Every strike, counter, and environmental interaction was carefully mapped out before stepping into the volume.

With the fight language established, Likens turned to motion capture veterans Actor Capture to translate that choreography into digital performance.

From FBX to Fully Characterized Digital Fighter

The pipeline began with Likens delivering his Raphael and Foot Soldier FBX models to Actor Capture. Using Reallusion Character Creator, the team characterized and prepared the models for animation, ensuring clean skeletal structures, proper retargeting setups, and compatibility with the capture system.

Quick motion tests were captured and sent back for creative approval — allowing Likens and Newman to evaluate performance timing, scale, and physicality before committing to the full shoot. Thanks to the efficiency of the Reallusion pipeline, turnaround was rapid. Once approved, Actor Capture moved into full production mode at its Atlanta studio.

Two Days Inside the Capture Volume

The action shoot brought together:

  • Director John Likens
  • Fight Coordinator James Newman
  • The stunt team from Oni Stunts
  • Performers including Alan Silva, Bee Xiong, Joshua Davidson, and Tony Sre
  • The Actor Capture technical crew

Inside a volume powered by 40 optical motion capture cameras and a 13×25 LED wall, the team recorded the entire fight sequence. The LED wall provided immersive lighting reference and spatial awareness, while the optical system ensured high-fidelity body tracking — crucial for preserving the nuance of stunt-driven choreography.

Performances weren’t just about combat — they were about character. Raphael’s aggression had to feel personal. The Foot Soldiers had to move with disciplined precision. Subtle weight shifts, pauses between blows, and grounded landings all contributed to the realism that fans responded to.

From Capture to Final Frame

After capture, the data was processed and cleaned in OptiTrack Motive to ensure accurate solves and skeletal performance. The animation was then transferred into Maxon Cinema 4D for final refinement, lighting, and rendering.

For visualization during production, 3D models were also sent into Unreal Engine for mocap previs — allowing the creative team to see the action unfold in real time and make on-set decisions about pacing, framing, and choreography adjustments.

This hybrid workflow — optical capture, real-time visualization, and high-end 3D rendering — gave Likens cinematic control while preserving the authenticity of live performance.

Why It Resonated with Fans

The success of Raphael wasn’t just about nostalgia. It was about:

  • Performance authenticity driven by stunt professionals.
  • High-end motion capture fidelity.
  • Cinematic lighting and environment design.
  • A grounded, mature tone many TMNT fans have long requested.

By blending live stunt choreography with advanced motion capture and digital animation tools, Actor Capture helped bridge the gap between indie fan film and theatrical-quality action.

A Glimpse of What Could Be

Likens’ trilogy of TMNT shorts demonstrates what’s possible when passionate storytelling meets production-grade performance capture. With over two million views and strong fan demand, Raphael proves there’s an audience ready for a darker take on the Turtles. And behind the digital shell and sais? Real performers. Real choreography. Real motion capture. Actor Capture didn’t just capture Raphael and The Foot. They brought them to life.

More About Actor Capture: www.ActorCapture.com

More about John Likens: www.JohnLikens.com


How Three Developers Without an Art Team Built a Cinematic Horror Game

Overview

Building a Cinematic Survival Horror Game Without AAA Resources

Creating a cinematic survival horror game is demanding under any circumstances. For indie developers working nights and weekends, it can feel almost impossible.

Dragon Level Studio — a three-brother team balancing full-time jobs — set out to build Connection: The Nightmare Within. Without a large animation team, without outsourced character artists, and without AAA budgets, they still delivered a game featuring fully voiced cutscenes, dynamic camera systems, and high-quality characters using Character Creator, iClone, and Unity Auto Setup.

The key question was not whether they could build the game. It was whether they could finish it.

The Team Behind Dragon Level

Dragon Level Studio is composed of three brothers based in LA, USA:

David – the eldest, with a cinematic background. He handles lighting, coding, and is also a music producer, shaping the game’s atmosphere both visually and sonically.

Armen – an engineer fluent in C#, which naturally led the team toward Unity as their engine of choice.

Paul – supporting across multiple areas and helping wherever production required. He also handled much of the playtesting, helping the team catch bugs earlier.

All three maintain full-time jobs while dedicating significant personal time to the game. Their motivation? A shared love of horror games, especially the classic era of survival horror.

That passion eventually turned into a serious production effort.

About the Game

Connection: The Nightmare Within

Connection: The Nightmare Within is a narrative-driven survival horror game inspired by classics such as Resident Evil, Silent Hill, and Alone in the Dark.

Players take on the role of Detective Stone, who becomes entangled in a shadow conspiracy while investigating a mysterious case. To uncover the truth, Stone uses experimental technology that allows investigators to enter the subconscious minds of criminals.

Inside these distorted mindscapes, players explore atmospheric environments, solve puzzles, and battle creatures that embody the fears and memories of the suspect. As the investigation progresses, fragments of the conspiracy begin to surface, gradually revealing a larger story hidden behind the crimes.

Two Generations, Two Camera Philosophies

One of the game’s distinctive features is its camera system. Players can switch between a classic fixed-camera mode and a modern over-the-shoulder perspective. The idea actually originated from a creative disagreement between the brothers themselves. 

David preferred the cinematic tension created by fixed camera angles, which allow scenes to be carefully framed to control pacing and atmosphere. Armen, who is nearly 12 years younger, grew up playing FPS and modern action games and strongly preferred an over-the-shoulder perspective that gave players more direct control. 

Rather than forcing a compromise, the brothers decided to support both approaches and let players choose their preferred style. In the end, the hybrid system became one of the game’s distinctive features. As the team put it, “it turned out quite well.”

The Reality of Low-Budget Development

Building a cinematic horror game as a part-time team introduced two major constraints: time and budget.

Time

Since all three brothers maintained full-time jobs, development mostly happened at night and on weekends. This meant every part of the production pipeline had to remain efficient. Long iteration cycles, complex asset rebuilding, or repeated outsourcing revisions would quickly slow progress and threaten the completion of the project.

Cost

Character production quickly became one of the most expensive aspects of development. Initially, the team experimented with purchasing premade characters, but customization often required additional work, such as new shape keys, rig adjustments, and other modifications.

Outsourcing a base character might cost roughly $300, but once custom shape keys, full rigging, and additional adjustments are included, the total can easily rise by nearly $3,000 per character.

For three protagonists, the cost could approach $10,000 USD — even before animation and revision cycles begin. For a small studio working part-time, that level of investment represented a significant financial risk.
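
The arithmetic behind that estimate can be sketched directly. This is a minimal illustration using the ballpark figures quoted above (the variable names and the helper function are ours, not the studio's):

```python
# Rough per-character outsourcing cost, using the ballpark figures above.
BASE_CHARACTER = 300    # outsourced base model (USD)
CUSTOMIZATION = 3_000   # custom shape keys, full rigging, adjustments (USD)

def outsourcing_cost(num_characters: int) -> int:
    """Estimated outsourcing cost before animation and revision cycles."""
    return num_characters * (BASE_CHARACTER + CUSTOMIZATION)

# Three protagonists land at $9,900 -- approaching $10,000 before
# a single animation or revision cycle begins.
print(outsourcing_cost(3))
```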

That’s when they discovered Character Creator.

Integrating Character Creator & iClone into the Pipeline

After discovering Character Creator, the team adopted it as the core of their character production pipeline. Characters were created and customized directly in Character Creator, with clothing assembled from assets in the Reallusion Content Store and Marketplace.

For animation, the team combined several sources. Body motion was captured using a Rokoko suit, while additional animations came from the Unity Asset Store and Mixamo.

Dialogue scenes were handled in iClone. After recording voice performances with actors, lip sync was generated in iClone before exporting the characters and animations to Unity using Auto Setup. This workflow allowed the small team to produce fully voiced cinematic cutscenes without relying on a large animation department.

Why Character Creator Worked for Them

1. Clothing & Rapid Styling

The Content Store allowed the team to quickly assemble believable outfits without starting from scratch.

For example, the detective’s clothing was built directly from store assets, drastically reducing design time. With built-in styling tools like the Conform to Body feature, garments automatically adjust to the character’s proportions, eliminating time-consuming manual corrections.

For a small team with limited hours each week, this ability to style characters quickly made a significant difference.

2. Morph Precision

The protagonist was inspired by actor Josh Holloway. The team wanted a rugged facial structure with strong contours that conveyed experience and toughness.

Character Creator’s morph system allowed them to sculpt those features precisely within the same environment. Instead of exporting to a separate sculpting pipeline and iterating repeatedly, they could refine facial proportions directly until the character matched the vision they had in mind.

This greatly simplified the iteration process.

3. Mesh Editing Flexibility

When importing accessories such as belts or small props, Character Creator’s automatic conforming handled most of the fitting.

However, when finer adjustments were needed, the Mesh Editing tools allowed the team to quickly reshape elements without breaking the character pipeline. A belt could be narrowed or repositioned in minutes, avoiding additional modeling steps in external software.

This level of flexibility kept the workflow efficient.

4. Polygon Optimization for Real-Time Performance

Another advantage came during the optimization phase. Because the characters were built with high detail, not every polygon was necessary for real-time gameplay.

Using Character Creator’s polygon reduction tools, the team was able to significantly lower triangle counts while maintaining visual quality. Hair assets in particular benefited from this process — large amounts of geometry could be removed without noticeably affecting the final appearance.

This allowed the characters to remain visually strong while running smoothly inside Unity.

Auto Setup for Unity: Removing Technical Friction

None of the brothers comes from a traditional artist background. Shader setup and hair rendering in Unity initially led to disappointing results.

Unity Auto Setup significantly simplified:

  • Shader configuration
  • Hair material setup
  • Character integration

What had previously resulted in broken visuals became predictable and production-ready.

Measurable Impact

Character Creator and iClone affected three core areas:

Quality

They achieved character quality that matched their horror vision, elevating immersion. Most critically, without CC and iClone, fully voiced cinematic cutscenes would likely not have been feasible.

Time

The team estimates they saved approximately seven months of development time, primarily by eliminating outsourcing cycles and iterative rework.

Cost

Instead of spending nearly $10,000 on character outsourcing alone, they maintained control within a structured pipeline.

What’s Next for Dragon Level?

After completing such an ambitious title, the brothers are evaluating a more focused scope for their next project. The experience of building Connection: The Nightmare Within provided something even more valuable than the finished game itself — a validated pipeline.

They now understand exactly how to structure character production, animation, and Unity integration in a way that balances quality with limited time resources. The uncertainty that often paralyzes indie teams has been replaced with confidence.

Cinematic storytelling no longer feels like something reserved for AAA studios. With the right workflow and tools in place, it has become a practical reality — even for three brothers building horror games after work.


AI-Generated Characters & iClone/CC5

AI-generated character and prop meshes are improving, and some are operating on at least their fifth or sixth generation of AI models that drive the mesh creation. As a longtime character and prop creator I have relied heavily on my own creations for personal and professional projects, but in the past year and a half or so I have added Meshy AI to keep up with the current trend of AI mesh generation. While it has improved, AI-generated characters are not a full replacement for iClone or CC5 characters, but they do have their place.

Right now, that place is not in front of or close to the camera if you are wanting cinematic quality. Props may be a different story, but they too have flaws that need to be dealt with or hidden from camera view.

High Poly Count

As the AI models behind Meshy have evolved, so has the poly count. It is not unusual for Meshy 6 to produce character meshes in the millions of polygons. I have seen many characters between three and four million.

This requires the extra step of remeshing, and during this process we can convert the model to quads for somewhat cleaner topology and better animation performance, but it is not a perfect solution. The Meshy AI remesh offers fixed poly counts of 3K, 10K, 30K, and 100K, or a custom poly count. This does solve some of the problems, but the meshes lose a little quality after remeshing.
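
As a back-of-the-envelope illustration (not Meshy's actual remeshing algorithm), the fraction of polygons that survives a remesh down to one of those fixed targets can be computed like this:

```python
def decimation_ratio(source_polys: int, target_polys: int) -> float:
    """Fraction of polygons kept when remeshing down to a target count."""
    if target_polys >= source_polys:
        return 1.0  # target is already at or above the source; keep everything
    return target_polys / source_polys

# A 3.5M-poly Meshy character remeshed to the 100K fixed target keeps
# under 3% of its polygons -- which is why some detail is inevitably lost.
ratio = decimation_ratio(3_500_000, 100_000)
print(f"{ratio:.3%}")
```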

To get a decent character that looks good in front of the camera, you still need at least 130K polygons in most cases. Character Creator, by contrast, can deliver a character with a working mouth, expressions primed for easy lip-syncing, and more advanced features like wrinkles, all at a lower poly count.

Atrocious UV Maps

In an iClone Facebook group, one user commented on a Meshy AI-generated UV Diffuse map that it looked like someone “ate the map and threw it up,” and I have no argument with that.

For beginners this is particularly problematic, maybe even a showstopper, and for pros it’s just a pain in the butt to deal with. It’s frustrating, confusing, and so chopped up it looks like it came from an app tripping on mushrooms.

Also, for beginners, being able to use an app like Photoshop to alter or change the clothing and other aspects of a character is a defining moment and a necessary milestone: a unique character just by editing the diffuse map.

Sadly, choosing quads doesn’t solve the UV map problem in Meshy, but it does tame it a bit. The quad UV map is a bit more coherent in that we can more easily recognize certain features, but those features will still be chopped up into multiple pieces most of the time.

For example, the UV map can and usually does make it almost impossible to locate the face, let alone all its scattered parts. With quads there is a better chance of the face being chopped up into far fewer pieces, making some alterations possible but still frustratingly limited.

This image vividly demonstrates the difference in UV mapping between Meshy AI and iClone/Character Creator.

Mesh Binding and Collapsing

In a lot of cases, the generated mesh will be too poly-heavy for use in animation or even a game engine. This requires optimizing the mesh to a lower poly count while retaining as much of the original detail as possible. Optimization chops the mesh up into triangles even if it was a quad mesh to start with, which can lead to mesh binding and collapsing in tight joint areas like elbows and knees.

If needed for a crowd or army off in the distance you might get away with flat arms or pinched elbows, but this won’t work for anything closer to the camera. Stylized characters with thin arms and legs are more susceptible to this type of binding or collapse. With iClone and CC5 characters you can optimize character meshes down as low as 800 kb each with just the click of a button even if you don’t know anything about optimizing mesh. This same button can cause binding and collapse when used on some AI-generated meshes though.

Poor Quality Textures

Not only is the UV mapping horrible, so is the diffuse texture map, which is the beauty shot of the maps. Just as I was writing this article, Meshy announced Meshy 6 Texturing. Up to this point we could make a mesh in Meshy 4, 5, or 6 Preview, but the texture was generated with Meshy 5.

Meshy 6 Preview texturing is an improvement but still not as good as Character Creator or most Content Store/Marketplace vendor items.

Quads Improve Mesh Quality Somewhat

In general, quad meshes use four-sided polygons, while triangle meshes use three-sided ones. Quads are generally better because they deform more smoothly during animation, are easier to edit, and work best with subdivision tools. Triangles can cause messy deformations and are harder to manage for detailed modeling, which is why many 3D artists prefer quads for character creation in particular.
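
The bookkeeping behind that preference is simple: triangulating a quad mesh splits every four-sided face into two triangles, doubling the face count the rig has to deform cleanly. A minimal sketch (the function name is ours, for illustration only):

```python
def triangulated_face_count(quad_faces: int, tri_faces: int = 0) -> int:
    """Face count after triangulation: each quad becomes two triangles,
    while existing triangles pass through unchanged."""
    return quad_faces * 2 + tri_faces

# A 30K all-quad remesh becomes 60K triangles once a game engine
# triangulates it -- worth remembering when budgeting poly counts.
print(triangulated_face_count(30_000))
```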

No Working Mouth

This is arguably one of the biggest drawbacks to using something like Meshy for characters. We can create characters with an open mouth and then manually skin that area to the jawbone in CC, but this is an advanced skill generally beyond beginners.

Or we can simply use Character Creator to provide a unique character with all the bells and whistles including AI-assisted lip sync.

No Expressions or Emoting

Applications like Character Creator have spoiled us with great characters at low to reasonable poly counts, complete with expressions. AI-generated characters have the expressions of a rock. The expression will be whatever comes out of the mesh generation, and we are stuck with it.

On the other hand, iClone and CC5 have dedicated expression and wrinkle systems to improve realism. Experienced 3D modelers can create and rig a working jaw/mouth in CC5, though it is an advanced skill. It is not possible, however, to create a wrinkle system on AI-generated meshes with CC5 Advanced or ActorCore’s free AccuRIG, which leaves these characters rather lifeless compared to iClone/CC5 characters.

You’ll never get that superhero on the right to smile like Aaron does.

Mesh Editing/Clean up

This is annoying… AI-generated characters can be very messy, with “blobs” and other detached parts floating around that must be dealt with either in Character Creator or another application. In CC5 we can usually isolate these blobs and hide them, but that is not always possible. ZBrush and similar applications can handle the edits, but they add extra steps to the workflow.

What You See is Not Always What You Get

If you have used a mesh generator then you have probably experienced some form of this problem. For those of you that haven’t been through this I will try to explain the problems in practical terms.

The untextured preview looks good. The final texture looks even better in the application interface. You download the character only to find a significant number of flaws not visible in the previous stages of generation.

These flaws can include both mesh and UV map problems. What looked like a great face may turn out to have a “ghost nose” or bump near the nose area, strange ears, or a misaligned texture requiring attention from a mesh editing app. This means the user needs to be up to speed on correcting character details like hands with too many fingers or multiple thumbs. Bumps where there shouldn’t be bumps… wrinkles where… well… where it’s just embarrassing. These must be dealt with, or the character is unusable.

Lifeless Eyes

This is a big one in terms of realism, and there is no substitute for it. Back in the early days of gaming and animation it was not unusual for a character to be looking through you instead of at you, and that defined the experience for the user. The mocap and acting were getting better, but those lifeless eyes would ruin the entire experience by pulling focus from the scene. The animation and storytelling might have been great, but the fixation on those lifeless eyes overshadowed the emotion of the scene.

iClone, and Character Creator in particular, has gone to great lengths to solve this problem with eye tracking and textural realism. Using partially rigged AI-generated characters takes us back to those early days: in most cases they won’t have working eyes, since the eyes are part of the main mesh instead of separate meshes we can control.

What you see below on the left is all you will ever see with that character. The look of the eyes NEVER CHANGES versus the CC5 characters on the right that have layers such as occlusion and tearline for realism. Eyelashes and eyebrows add more realism too.

Left: Eyes are part of head mesh. Right: Eyes are separate items with reactive layers such as Eye Occlusion and Tearline.

Now let’s take a look at what AI-Generated characters can be used for.

Crowds

This is an obvious use, but you will have to decimate (optimize) them severely to get a crowd of any size. Depending on the distance from the camera, highly decimated characters will work just fine and should work well with the Crowd system in iClone.

Even with crowds, Character Creator still excels with high-quality optimization that starts with good, solid topology before decimation, which means a smoother character at the end of the process and, in turn, smoother animation.
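
One way to make "depending on the distance from the camera" concrete is a simple LOD (level-of-detail) rule. The thresholds and keep-ratios below are hypothetical illustrations, not iClone defaults:

```python
# Hypothetical LOD picker for crowd characters: the farther a character
# is from the camera, the more aggressively its mesh can be decimated.
# Distance thresholds (meters) and keep-ratios are illustrative only.
LOD_TABLE = [
    (10.0, 1.00),          # hero distance: full mesh
    (30.0, 0.25),          # mid-ground: keep 25% of polygons
    (float("inf"), 0.05),  # distant crowd: keep 5%
]

def keep_ratio(distance_m: float) -> float:
    """Return the fraction of polygons to keep for a given camera distance."""
    for max_dist, ratio in LOD_TABLE:
        if distance_m <= max_dist:
            return ratio
    return LOD_TABLE[-1][1]

print(keep_ratio(50.0))  # a distant crowd member needs only 5% of its mesh
```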

Storyboarding

This is another somewhat obvious use for AI-generated characters as storyboards are usually static and only require posing. Add in AccuPOSE and creating a storyboard is almost child’s play. From simple blocking shots to complicated scenes, iClone can punch out a storyboard very quickly and it can be animated if need be.

This is a great way to develop a shot list by recreating the scene in iClone to whatever level of realism is needed and moving the camera around to get a preview of how the camera shot list should be laid out.

Comic Books, Graphic Novels

Just like the storyboards above, iClone is a natural for comic books and graphic novels whether they are realistic, cartoony or somewhere in between. And just like the storyboards, AccuPOSE makes it so simple you will most likely have time left over to tinker with the project to refine it even more! If you don’t need dialog or expressions, then AI-generated characters can work in that situation.

A great example of this is Antareus Studio’s Operation Cobra: The Burma Incident, a WWII-era graphic novel produced entirely with iClone and Character Creator. Read the full story to see how they did it.

Scene Filler

This is a very important role for AI-generated characters, just as I mentioned in the Crowd section and with those caveats. The meshes have to be optimized until they are lean and mean, even if you are only going to use them in small Actor Groups, but when these groups are deployed they can add life to a scene.

A great ready-made solution for this is the Crowd Sim: Social Actions pack from the Content Store, which lets you drop pre-animated groups of characters into any scene with just a few clicks.

Content Store and Marketplace

These are still the best places to find quality characters, props, hair, accessories and other 3D assets for iClone and Character Creator 5 that will perform as expected. The Content Store is Reallusion’s curated, professional-grade market — every asset is held to a high standard, making it a reliable source for production-ready content. The Marketplace, on the other hand, is a more community-driven platform where independent creators share a wide and ever-growing variety of assets. Assets from both will look great in front of the camera, from close-up shots to intermediate crowds. The superior mesh topology, UV mapping and mesh density from these Reallusion and vendor products will work for most situations while making your production look as good as possible. With all the promotional events, sales and discounts available we can build a great library of 3D assets just from these two stores.

Summary

I did notice while writing this article that I use Meshy AI almost exclusively for non-standard, one-off characters and anthropomorphic creatures as I had a hard time finding a standard human Meshy AI character to use as a demo next to a standard CC5 character like Aaron.

The one thing that concerns me with AI-generated characters is the fact that they are not fully functional, and in some cases, the mesh is bad enough that the hands or fingers are not functional either. They have to be rigged as a 2-finger mitten to keep them from being stiff.

With AI-generated characters we sacrifice a lot of the features that Character Creator has spoiled us with over the years. If you are an expert at sculpting and rigging you can do your own mods, but it’s still a time-consuming and at times tedious task.

AI-generated characters certainly have their place but at the present time they are rather limited in scope in terms of cinematic quality.

MD McCallum - WarLord


Digital Artist MD “Mike” McCallum, aka WarLord, is a longtime iClone user. Having authored free tutorials for iClone in its early years and selected to write the iClone Beginners Guide from Packt Publishing in 2011, he was fortunate enough to meet and exchange tricks and tips with users from all over the world and loves to share this information with other users. He has authored hundreds of articles on iClone and digital art in general while reviewing some of the most popular software and hardware in the world. He has been published in many of the leading 3D online and print magazines while staying true to his biggest passion, 3D animation. For more information click here.


Melis Caner remakes Jackson Wang’s "Cruel" music video, blending Character Creator 5’s stylized base models with Blender. By combining Rokoko motion capture and scans of Turkish ancient ruins, she showcases a seamless professional 3D workflow.

Stylized Music Video Remake in Character Creator and Blender

Author: Melis Caner

Melis Caner / Melnverse

Hi, I’m Melis Caner, a 3D Generalist and filmmaker with a Film degree from Dokuz Eylul University. After graduating with a focus on cinematography, editing, and sound design, I discovered 3D animation during the pandemic. Amazed by its limitless creative possibilities, what began as a hobby quickly became my career and a new form of self-expression. My lifelong love for drawing, dancing, and acting now comes together in 3D animation, blending all my passions into one art form.

Melis Caner | LinkedIn / Melniverse | Instagram

Character Creator 5 HD Base and ActorMIXER

I’ve been obsessed with this music video, so I wanted to create a similar scene, but in my style. The character concept is a remake of Jackson Wang’s music video “Cruel.” Watch the full video side by side with my mood reference clippings.

I start by using the new HD stylized base model from Character Creator 5.

With ActorMIXER, you can combine different models and swap individual actor parts seamlessly. Just drag it onto your character, and the skin details will already look incredible. From there, I refined the makeup and ran a few expression tests, and the results turned out great as well.

ActorMIXER: a brand new way for character editing

If you are working on projects with unique character designs, using CC5 HD characters from the Reallusion Content Store will be more efficient. In the store, you will currently find several HD characters ready for use with ActorMIXER.

I would also recommend the “Digital Soul 100+” pack for detailing character expressions. It saves me a tremendous amount of time when emoting the character.

Rokoko Body Capture and Character Animation in iClone

I captured my motion with the Rokoko suit and cleaned the motion up in iClone. I added facial animation with my favorite “Digital Soul” pack: I simply drag and drop the selected expression, and it is applied to my character.

3D Cloth Customization and Cloth Simulation

For the outfit, I purchased a 3D suit on CLO Connect. I designed the cropped sleeves of the clothing, and then simulated everything in Marvelous Designer.

Scene Composition in Blender

Finally, I built the scene in Blender using 3D scans of an ancient city in Turkey, captured by Blendreams. Although I cannot move as coolly as the dancer queen, it is fun to see CC5 fit into my workflow, and Character Creator is truly a good companion to Blender.
