
The Making of 3D Gorilla Animations: Capturing the Beast in Motion

Over 130 lifelike gorilla movements, professionally performed and motion-captured.

About ActorCore

Reallusion ActorCore is a premium motion capture library designed for professional 3D production, offering over 4,700 high-quality animations spanning a wide range of themes, from everyday actions and conversations to combat, stunts, and paired performances. Constantly pushing creative boundaries, ActorCore holds a reputation for delivering mocap animations that are often difficult, time-consuming, or costly for studios to produce on their own, while making them accessible and affordable to everyone.

Gorilla Series

Following the acclaimed success of the Wire Stunts and Legendary Heroes series, Reallusion is now venturing beyond standard human motion with a new collection that explores primeval behaviors. Inspired by films such as Planet of the Apes, Tarzan, King Kong, and Warcraft, and supported by industry expertise and the production-scale resources of a major studio, this ambitious project captures authentic gorilla movements. While primate themes often appear on the big screen, this kind of motion data has been largely out of reach for 3D creators… until now.

In collaboration with Quantum Stage, a professional motion capture studio in Switzerland renowned for its directing expertise and stunt performers capable of simulating realistic primal behaviors, Reallusion has created the Gorilla Series: a distinctive collection of high-quality 3D animations available on the ActorCore 3D asset store. These animations authentically capture the strength, movement, and personality of hulking gorillas, making them perfect for wildlife documentaries, zoo simulations, or cinematic storytelling.

See how the Quantum Stage process and professional mocap acting replicate lifelike gorilla movement:

Research and Planning

To achieve natural movement, Reallusion and Quantum Stage conducted extensive research on gorilla behavior. The study organized actions into key categories: daily routines such as walking, running, crawling, and jumping; alert behaviors like scanning the environment and responding to threats; terrain navigation across different levels; and social interactions, including attacking, taunting, defending, and escaping. Together, these categories cover the full range of gorilla motion and served as the production reference.

Imitation of Gorilla Motion

Creating realistic gorilla animation involved two key steps. First, the mocap studio captured the human performer’s body and skeletal motion, much like a traditional motion capture session. The second step focused on accurately adapting these movements to a gorilla’s anatomy. Although gorillas share a similar skeletal structure with humans, their body proportions differ significantly: their hips, chest, and shoulders are much broader, their upper arms and forearms are considerably longer, and their legs are noticeably shorter. By carefully adjusting these proportions, the final animation preserves both anatomical accuracy and authentic primate movement.
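To make the adaptation step concrete, here is a minimal, hypothetical sketch of proportion retargeting in Python. The bone names, lengths, and scale ratios are illustrative placeholders for the idea described above, not measured anatomy and not Reallusion's actual pipeline:

```python
# Illustrative sketch: remap captured human bone lengths to gorilla
# proportions by scaling each segment. All values are hypothetical.

HUMAN_BONE_LENGTHS = {          # centimeters, example performer
    "upper_arm": 30.0,
    "forearm": 27.0,
    "thigh": 45.0,
    "shin": 42.0,
    "shoulder_width": 40.0,
}

GORILLA_SCALE = {               # hypothetical gorilla-to-human ratios
    "upper_arm": 1.35,          # considerably longer arms
    "forearm": 1.40,
    "thigh": 0.70,              # noticeably shorter legs
    "shin": 0.65,
    "shoulder_width": 1.50,     # much broader shoulders
}

def retarget_lengths(human, scale):
    """Return gorilla-proportioned bone lengths from human capture data."""
    return {bone: round(length * scale[bone], 1) for bone, length in human.items()}

gorilla = retarget_lengths(HUMAN_BONE_LENGTHS, GORILLA_SCALE)
print(gorilla["upper_arm"])  # 40.5
```

In a real retargeting step, joint rotations would also be remapped onto the new proportions; the scaling above only captures the skeletal-proportion half of the problem.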

“When we shot this animation we specifically observed gorilla motion and behavior from movies such as the King Kong series and Planet of the Apes, and also nature documentaries about how the primates behave with other animals and with each other. It is a very big part of the whole makeup of a gorilla and that shows in the performance. We tried to replicate these as closely as possible,”

Tobias Baumann, Founder, Quantum Stage

Preparing for Action

The most crucial element in capturing authentic gorilla motion is the performer. The actor must combine physical prowess with the creative ability to understand, visualize, and embody gorilla behavior and movement. With high-quality motion capture, every subtle emotion and intention of the performer is recorded, translating directly into the animated character’s movements and expressions.

For this project, the studio collaborated with Ophir Raray, a stunt actor and motion capture artist specializing in creature performances, wire stunts, and movement-based roles. With a background in gymnastics, capoeira, and military training, he brought both life and realism to the gorilla performance. The capture process was physically demanding and technically challenging, requiring the actor to maintain consistent energy, posture, and motion, so that all movements flowed naturally from a precise starting point.

“The preparation involves a lot of research, including researches of animals, apes, and gorillas, their movements, also their psychology, sounds and breath, which are some of the most important things for the performance, to give a life-like performance that the audience can really feel,”

Ophir Raray, creature performer and movement expert

The Right Tool for the Job

To ensure accurate kinesthetics, the team used arm extensions, allowing Raray to mimic the true skeletal proportions of a gorilla. This tool enabled him to move with the correct weight, rhythm, and balance, fully transforming his performance into that of a believable primate. These techniques and tools were essential in bridging human expression and performance with animal motion and behavior, ensuring that the animation captured not only anatomical accuracy but also conveyed personality and emotion.

“Using arm extensions is a very specialized skill, and it requires the body parts, muscles and movements that you normally don’t use. It’s a behavior that you as a human never really experience but the tools allow you to actually move like a gorilla and you can really feel that you are becoming a gorilla,”

Ophir Raray, creature performer and movement expert

Available Now on ActorCore

A product of hours of dedication and passion, the Gorilla Series is now available on ActorCore. The pack includes four motion categories: Routines, Locomotion, Aggression, and Evasion. The full collection features 131 mocap motions designed for jungle scenes, wildlife simulations, educational content, or any project that requires realistic gorilla movement and behavior.

Whether you’re creating games, cinematic animations, or storytelling sequences, this motion set provides the versatility needed to bring your scenes to life. The collection also includes core poses, instinctual reactions, expressive gestures, combat responses, and terrain-based movements. The Gorilla Series offers the perfect mocap content to bring wildlife adventures into your productions. 

Related Posts

The Ultimate CC5 3D Lighting Guide: Create Cinematic Realism with Free Presets

Lighting is one of the most powerful tools for defining mood, realism, and style in Character Creator 5 (CC5). The free 3D lighting presets give you an excellent foundation to start with — but to truly make your character shine, you’ll want to move beyond simply applying a preset. Fine-tune the lighting to match your character, theme, and camera angle for the best visual results.

This guide walks you through how to make the most of CC5’s Lightrooms and how to adjust presets so they perfectly complement your scene.

Corresponding Lightrooms to CC5 Key Visuals

Reallusion’s Lightrooms are designed to reflect the aesthetics of CC5’s promotional visuals, making them ideal reference points for achieving professional-quality lighting. Each Lightroom offers a ready-made setup that you can adapt to your own scene instead of starting from scratch.

Before making adjustments, take a moment to study the Lightroom you’re using — observe the direction of the key light, how the fill balances the shadows, and whether rim lights are adding separation and depth to your character.

Refer to the Content Resource Page for related examples and presets.

Drag and Drop to Switch Light Presets

Switching between lighting setups in Character Creator 5 is as simple as dragging and dropping. Each preset instantly updates your lighting, letting you preview different moods and angles without manual tweaking.

Apply the Same Light Preset to Different Characters

Apply one light preset to multiple characters to keep a consistent visual tone. Subtle differences in skin tone, facial shape, and materials will still stand out—keeping your renders unified yet distinctive.

Using Lightrooms Effectively

1. Break Down the Lighting Setup

Understanding the default lighting setup is essential. Open the Scene Manager and toggle each light on or off to observe how it shapes the scene. Notice which light defines the face, which adds depth to the shoulders, and how they interact to create balance. Once you understand each light’s role, refining the look becomes much easier.

2. Troubleshoot with On/Off Toggles

If your character appears too bright, flat, or lacks shadow definition, toggle off one light at a time to identify the cause. This method helps isolate problem lights quickly and prevents overcorrecting with unnecessary adjustments.

3. Adjust Light to Match Character Height

Lighting presets are designed for average proportions. If your character is taller, shorter, or posed differently, the light may hit at the wrong angle. Raise or lower the lights so they illuminate the character’s face naturally.

4. Rotate Lights Around the Character

Pivot your lights around the subject to refine highlight and shadow balance. Even a subtle 15–30° rotation can dramatically shift the mood and emphasize your character’s best features.
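For anyone scripting a light sweep rather than dragging it in the viewport, the geometry of that pivot is simple. This is a hedged sketch with made-up example positions, not a CC5 API call:

```python
import math

def rotate_light_y(light_pos, pivot, degrees):
    """Rotate a light position around the character's vertical (Y) axis.
    light_pos and pivot are (x, y, z) tuples; returns the new position."""
    rad = math.radians(degrees)
    x, y, z = light_pos[0] - pivot[0], light_pos[1], light_pos[2] - pivot[2]
    nx = x * math.cos(rad) - z * math.sin(rad)
    nz = x * math.sin(rad) + z * math.cos(rad)
    return (round(nx + pivot[0], 2), y, round(nz + pivot[2], 2))

# A 30-degree sweep of a key light placed 200 cm in front of the character:
print(rotate_light_y((0.0, 150.0, 200.0), (0.0, 0.0, 0.0), 30))
# (-100.0, 150.0, 173.21)
```

Keeping the light's height constant while rotating preserves the vertical angle onto the face, so only the highlight/shadow balance shifts.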

5. Extend Portrait Lighting to Full Body

Most presets are tuned for upper-body shots. For full-body renders, extend the light range downward so the legs, feet, and shadows are properly illuminated. This ensures the character doesn’t appear to be floating under inconsistent lighting.

5-1. Duplicate Existing Lights with Dummies

If one light works well but doesn’t cover enough area, duplicate it and link the copy to a light dummy. This extends coverage while keeping the original lighting balance intact.

5-2. Add a Shadow Catcher

Ground shadows add realism. Use a shadow catcher plane to anchor your character in the scene—without it, even good lighting can look artificial.

5-3. Adjust Shadow Darkness and Range

Avoid setting shadow darkness to zero or your scene will appear flat. Increase intensity for more weight, and extend the range so shadows fade naturally to the ground.

5-4. Duplicate the Full-Body Light to Add a Fill Light

Duplicate the main light to create a soft fill. This secondary light reduces harsh shadows and adds surface detail while keeping the key light dominant.

5-5. Reduce Fill-Light Shadow and Adjust Intensity

Lower the fill light’s shadow strength to prevent double shadows, then fine-tune brightness until it complements the key light smoothly.

5-6. Use PCSS as an Optional Choice

PCSS (Percentage-Closer Soft Shadows) softens shadows farther from the object for a more natural falloff. It may, however, cause visual artifacts, so use it selectively and check results from different angles before final rendering.

6. Enhance Eye Reflections

Eyes bring characters to life. Slightly adjust the light position to create strong catchlights in the eyes. This subtle touch adds realism and enhances the emotional depth of your character.

7. Post Effects

Lightrooms and iAtmosphere include built-in post effects for instant visual enhancement. Free LUT resources can be added, removed, or adjusted to refine color tone and elevate the overall mood and atmosphere.

8. Use Depth of Field (DOF) for Focus

DOF not only adds a photographic touch but also draws attention to your character’s face. A subtle background blur enhances the mood and helps separate your subject from the scene.

9. Don’t Forget Shader Adjustments

Lighting alone won’t achieve realism if shaders aren’t properly set. Adjust skin, hair, and clothing shaders to respond correctly to your light setup. For example, fine-tuning roughness or specular levels helps prevent skin from looking plastic or overly shiny.

  • Grease: Micro Roughness Scale -0.4
  • Matte: Micro Roughness Scale 0.2
  • Natural: Micro Roughness Scale -0.2

Final Thoughts

The free lighting presets in Character Creator 5 are more than quick solutions—they’re powerful starting points for creative customization. By breaking them down, adjusting for character proportions, extending their range, and fine-tuning shadows, reflections, and shaders, you can achieve cinematic, expressive lighting that feels uniquely yours.


CC5 – Big Step Up for Beginners & Pros

Character Creator 5 has been released, and it was well worth the wait: a welcome release for beginners and pros alike, with the ease of use we have come to expect from Reallusion products. Over the past five releases, Reallusion has brought some of the most time-consuming and difficult animation tasks, character creation and rigging, within reach of anyone who wants to make a character.

No technical experience necessary.

This release brings characters to a new level, from gaming to cinematic needs, as it includes High-Definition Characters, High-Definition morphs, and High-Definition facial animation with Subdivision levels from 0 (lowest) to 2 (highest), providing three levels of mesh density to suit a wide range of needs.

With three levels, this also brings up another interesting and much-needed new feature: baking normal maps to transfer features from a high-poly character to a low-poly character (projecting) using maps, not mesh. I’ll get more into this later on in the article, but I can say they have made it a simple process as usual.

CC5 Launch Bundle

For those not familiar with all this, the CC5 Launch Bundle consists of CC5 Deluxe and the new ActorMIXER Pro plugin with its core library.

ActorMIXER Core Library includes 44 HD scanned head shapes and textures across four ethnicities: Asian, Caucasian, African, and Middle Eastern. (Purchasing comparable 3D-scanned heads on the market would cost you hundreds.)

HD Ultimate Morphs differs from the Ultimate Morphs Pack currently available for CC3+: it includes fine-tuned, realistic body parts and facial features. Users can refine everything from facial details to full-body shape with built-in sliders, and even bake the details into normal maps on lower base meshes.

HD Human Anatomy Set includes 12 animation-ready avatars with diverse anatomical features, all customizable with ActorMIXER.

HD Characters & Morphs

Character Creator 5 introduces a major advancement with its new HD Character Base that supports up to 16 times more mesh detail through subdivision for cinematic-quality visuals and real-time performance. The system bridges game-ready characters with film-quality rendering by combining enhanced shaders, high-resolution eyes with a new eyelash system, and full support for displacement maps to achieve real geometric depth even in close-ups.

The upgraded HD morph system allows detailed anatomical adjustments without changing the character’s bone structure, enabling variations like muscular builds, feminine curves, or aging effects.

Character Creator 5 also features a next-generation facial animation system with an expanded HD Expression Profile that drives more blend shapes for subtle micro-expressions. A corrective constraint system ensures all facial movements remain anatomically accurate, preventing distortion and providing more lifelike emotional detail.

Overall, Character Creator 5 sets a new standard for real-time HD character creation, combining flexibility and fidelity for games, film, and VFX production.

HD Facial Animation

Reallusion’s Character Creator 5 introduces a next-generation HD Facial Animation system designed to deliver cinematic realism. Built on 262 blendshapes and 128 corrective morphs, the new rig captures everything from subtle micro-expressions to complex lip-sync with film-quality precision. The result is more lifelike characters that move beyond the uncanny valley and bring digital performances to a new level of believability.

Seamless integration with iClone tools like Face Puppet, Face Key Editor, and Motion LIVE ensures intuitive control and flexibility. The system also supports mocap pipelines such as iPhone and AccuFACE, and aligns with Epic’s MetaHuman Animator for high-fidelity performance capture. Even legacy CC3+ models benefit from the Extended-Plus Profile, which adds corrective morphs and mocap compatibility, making Character Creator 5 a powerful hub for animators seeking both realism and efficiency.

  • Next-gen HD Facial Animation with 262 blendshapes + 128 corrective morphs capturing detailed micro-expressions and lip-sync
  • Seamless iClone Face Puppet, Face Key, and Motion LIVE mocap support
  • Compatibility with iPhone mocap, AccuFACE, and MetaHuman Animator
  • Extended-Plus Profile upgrades legacy CC3+ models, boosting corrective morphs and mocap compatibility
  • Enhanced mesh detail, HD morphs, and displacement maps for cinematic realism

Baking and Projecting High Poly Maps

Baking maps refers to the process of pre-calculating and storing complex information about an object’s surface into texture maps. These maps can include details like lighting, shadows, and other surface characteristics. By embedding this information into the maps, the rendering process becomes more efficient, as the computer doesn’t have to calculate these details in real-time. This technique helps achieve high-quality visuals while optimizing performance, significantly reducing the computational load when scenes are rendered.

Projecting was at one time an advanced skill. I use it in ZBrush, and the process had already been simplified a lot by then. Now Reallusion has made it even easier by giving us a simple, straightforward method of projecting details from, for example, a 500K mesh to a 50K mesh with little to no loss of detail. Let that sink in: fewer polys, same look.

The general idea behind projecting is taking at least two versions of the same mesh—one very high poly and the other very low poly. We do the work on the high poly mesh with the detailed character, then use the High-Definition maps to create detail instead of mesh for use on the low poly character. This allows us to dramatically alter and lower the mesh density (poly count).

In simpler terms, imagine you have a very detailed 3D model (high poly) with lots of tiny details. This model can be quite complex and heavy for computers to handle in real-time. To make it easier to work with, we create a simpler version (low poly) with fewer mesh details and more map details. The projection process transfers the detailed information from the high poly model onto the low poly model by creating texture maps that capture intricate details like bumps, shadows, and textures. These maps are then applied to the low-poly model, making it look as detailed as the high-poly model but with much less computational load.

By using these techniques, artists can create highly detailed and realistic 3D models without the need for complex and resource-intensive calculations during rendering. Character Creator now has a simple process to create up to three levels of mesh density. If you can click a button or check a box, you can bake. Overall, baking and projecting maps are crucial techniques in 3D animation and computer graphics that enhance both the visual quality and performance of rendered scenes.
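The core idea of storing surface detail as pixels rather than polygons can be shown in a few lines of Python. This toy height-to-normal conversion is only an analogy for the kind of data a baked normal map holds; it is not CC5's actual baking code:

```python
def bake_normals(height, strength=1.0):
    """Convert a 2D height field (list of lists) into per-texel
    tangent-space normals, the data a baked normal map stores as RGB."""
    h, w = len(height), len(height[0])
    normals = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Slope from neighboring texels (clamped at the edges)
            slope_x = height[y][max(x - 1, 0)] - height[y][min(x + 1, w - 1)]
            slope_y = height[max(y - 1, 0)][x] - height[min(y + 1, h - 1)][x]
            nx, ny, nz = strength * slope_x, strength * slope_y, 1.0
            length = (nx * nx + ny * ny + nz * nz) ** 0.5
            normals[y][x] = (nx / length, ny / length, nz / length)
    return normals

# A perfectly flat surface bakes to straight-up normals:
flat = [[0.0] * 3 for _ in range(3)]
print(bake_normals(flat)[1][1])  # (0.0, 0.0, 1.0)
```

A renderer then lights the low-poly mesh using these stored normals, which is why the surface appears bumpy without any extra geometry.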

ActorMIXER Plugin

ActorMIXER is a valuable tool in itself and worth the upgrade for many creators who need unique characters, particularly if they can build those new characters from existing assets. It is beginner-friendly and easy to use. We can even drag in custom morphs for all sorts of possibilities.

Reallusion released a free and a Pro version of the plugin. This opens up new possibilities of combining our current characters to create even more characters with the mixer wheel. It is a very simple and intuitive way to mix existing character parts including heads for more unique characters.

We can mix the body or facial parts, including the Head, Head Shape, Eyes, Nose, Mouth, Forehead, Ears, and Chin. With this enhanced morphing via the ActorMIXER wheel, we can easily and quickly produce unique characters from our existing character base. ActorMIXER Pro comes with random face generation too.

Not only can we use the ActorMIXER wheel to create unique heads, but we can also apply the same principles to the body morph. This includes adjustments to the Body, Bone Scale, and Body Shape. By manipulating these parameters, we can achieve a wide range of body types and proportions, allowing for the creation of truly unique and diverse characters.

Whether you’re aiming for a more muscular build, a slender frame, or anything in between, the body morph options provide the flexibility needed to bring your vision to life. This level of customization ensures that each character can be tailored to fit specific artistic or narrative requirements, enhancing the overall creative process.

Auto Setup for Marmoset, Maya, and Blender

Maya – Auto Setup for Maya is a free plugin designed to streamline the import and rendering of high-definition characters and animations from Character Creator, iClone, and ActorCore directly into Autodesk Maya. The tool automates rigging, facial animation controls, material assignments, and offers integrated lighting presets, helping artists create cinematic-quality renders with minimal manual setup.

Blender – Auto Setup for Blender is a free plugin that connects Blender with Character Creator, iClone, and ActorCore to enable a smooth workflow for 3D character and animation development. The tool eliminates the need for manual FBX imports by providing real-time, two-way data synchronization called Data Link, allowing artists to transfer character models, morphs, poses, animations, lights, and cameras between platforms with a single click.

Marmoset Toolbag – Auto Setup for Marmoset Toolbag is a free plugin that simplifies the process of importing and rendering high-definition characters and animations from Character Creator, iClone, and ActorCore into Marmoset Toolbag 5.0 and above. This tool automates the assignment and baking of Reallusion’s Digital Human Shaders, enabling photorealistic ray-traced rendering with full control over skin, wrinkles, hair, eyes, and teeth materials.

MetaHuman Support

Reallusion offers a comprehensive MetaHuman integration pipeline that connects its Character Creator and iClone software with Unreal Engine’s MetaHuman Creator and animation system.

This pipeline enables seamless transfer of customized head shapes, skin textures, high-resolution body textures, and dynamic wrinkles from Character Creator to MetaHuman, ensuring ultra-realistic character representation in Unreal Engine.

iClone 8.6 Update

The iClone 8.6 update is packed with features and improvements for character realism, animation control, and export flexibility, especially when paired with Character Creator 5 (CC5). Now, you can import MetaHuman facial animations and use CC5’s HD and Extended-Plus profiles for more detailed corrective shapes and better subdivision control. The eye shaders got an upgrade too, with new occlusion and tearline systems that get rid of mesh intersections and make tear ducts look super realistic.

On the animation side, there’s a new facial parts strength option and expanded Motion LIVE functionality, which lets you do non-linear curve editing for characters with corrective facial profiles. Exporting is smoother now with new FBX presets for Marmoset Toolbag, Unreal Engine 5 skeletons, and UEFN. Other updates include better eye modeling, optimized file compression, and bug fixes for crash scenarios and Unreal control rig mappings.

Summary

There is far too much to cover with the CC5 release and subsequent articles will look at some of these tools in more detail. CC5 is so feature-packed that it just can’t be adequately described in one article.

Whether you are a beginner or a seasoned professional, CC5 can help you achieve even more unique characters than previous versions, and it is a next-level step above CC4, which was a character powerhouse in its own right.

If you have never tried Character Creator, then version 5 is a perfect time to start. You will be creating characters almost as soon as you download and install it. Then you will be wondering why you wasted so much time on other methods.

Don’t wait… try the free trial and explore a new world of 3D character possibilities all with full facial and lip sync features!


MD McCallum – WarLord

Digital Artist MD “Mike” McCallum, aka WarLord, is a longtime iClone user. Having authored free tutorials for iClone in its early years and selected to write the iClone Beginners Guide from Packt Publishing in 2011, he was fortunate enough to meet and exchange tricks and tips with users from all over the world and loves to share this information with other users. He has authored hundreds of articles on iClone and digital art in general while reviewing some of the most popular software and hardware in the world. He has been published in many of the leading 3D online and print magazines while staying true to his biggest passion, 3D animation. For more information click here.

From CC5 to Marmoset Viewer: Bring Characters to Life on the Web

ÓSCAR FERNÁNDEZ / DIGITAL SCULPTOR

Óscar Fernández is a freelance digital sculptor from Spain, specializing in creating figures for 3D printing. With deep expertise in ZBrush, he is known for crafting highly expressive characters that capture both personality and motion. His work stands out through the meticulous attention to facial expression, muscle definition, and dynamic posing, giving each sculpt a strong sense of tension and storytelling power.

Check Oscar’s ArtStation

Oscar Fernandez’s HD Alien MIXER pack is available now!

Overview

Character Creator 5 (CC5) continues to expand its creative ecosystem with the addition of new free plugins that streamline workflows with leading industry tools. One of the latest integrations, Auto Setup for Marmoset, makes it easier than ever to render and showcase your characters with professional quality.

In this article, we’ll take a closer look at how Marmoset Toolbag—and specifically Marmoset Viewer—can enhance your CC5 projects. From real-time rendering to interactive presentation, we’ll explore how these tools work together to bring your creations to life with both speed and style.

Setting Up Our Scene

We’ll start by setting up the scene: importing the mesh using Auto Setup for Marmoset, applying its materials, and adjusting the lighting, camera, and post-processing effects, just as we did in the previous video. 

It’s important to keep in mind that Marmoset Viewer is designed to run smoothly on a wide variety of devices, so there are some limitations. For example, shadow-casting lights are limited to three (you can use more lights, but only the first three will cast shadows), omni lights don’t cast shadows, and certain shading models, post-processing effects, and rendering features are not yet supported.

We’ll make sure the camera is centered to make navigation and scene rotation easier. Usually, to do this, we select the main mesh and go to View → Frame Selected (Ctrl+F).

Export the .mview File

To export the .mview file, go to File → Export → Viewer, or simply press Shift + Ctrl + V.

You can also add artist credits and a link to your portfolio by filling out the Title, Author, and Link fields.

As you’d expect, the texture quality setting has a significant impact on the file size. Here you can choose between Low, High, or Unreasonable quality; the last, fittingly named for this viewer’s purpose, corresponds to 4K textures.

Once everything is ready, just click the Export button to generate the file.

After exporting, you can view the file locally using Marmoset Viewer, which is installed for free alongside Marmoset Toolbag, but can also be downloaded separately from the Marmoset website.

Once opened, you can switch to full-screen mode and also isolate different layers such as normal maps, color information, roughness, metalness, and mesh topology independently, all with the advantage of being able to view your model in real-time 3D.

Upload to ArtStation

This same file can be shared directly on several 3D communities, such as ArtStation, just as easily as adding an image to a new project, and with the same viewing advantages we saw earlier.

Embedding On Our Website 

When exporting, you can also choose the HTML option. Once you’ve defined the size and a few additional parameters, you can hit Export. However, if you’re going to embed the viewer inside an iframe, it’s important to select Full Frame. This way, Marmoset Viewer will ignore the height and width values and automatically fit the iframe size.

This will generate a simple HTML file that displays your scene. It’s very handy if you want to embed or showcase the scene on your personal website. You’ll just need some basic HTML knowledge, but it’s very straightforward.

I’ve created this super simple HTML file, and we’re going to embed the code by simply copying and pasting it where we want:

<iframe src="yourscene1.html" allowfullscreen="true" height="467" width="830"></iframe>

We replace the example HTML with our own file, adjust the iframe dimensions, and that’s it! 

Keep in mind that for this to work, the file must be uploaded to a server, since it won’t display correctly when viewed locally.  Another very interesting option is to create a gallery on your own website. 
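Before uploading anywhere, you can test the page over HTTP with Python's built-in web server. This is just a convenience sketch, not part of Marmoset's tooling; run it from the folder containing your exported files (the `yourscene1.html` name is the example file from above):

```python
# Minimal sketch: serve the export folder over HTTP so the Marmoset
# Viewer page loads correctly (it won't display from a file:// URL).
import http.server
import socketserver
import threading

def serve_folder(port=0):
    """Start a background HTTP server for the current folder.
    port=0 lets the OS pick any free port."""
    httpd = socketserver.TCPServer(("", port), http.server.SimpleHTTPRequestHandler)
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    return httpd

server = serve_folder()
port = server.server_address[1]
print(f"Open http://localhost:{port}/yourscene1.html in your browser")
# Call server.shutdown() when you're finished testing.
```

Once it looks right locally, upload the same files to your real web host.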

We’ll create a pose for each character in Character Creator to give them a bit more personality, export each one as FBX files (as we already know how to do), and then load them into Marmoset using the Auto Setup plugin, on a previously configured but empty scene, just like we did in the previous video. Finally, we’ll export a Marmoset Viewer (.mview) for each character using the HTML export option we just saw. We’ll also render some images of each character to use as thumbnails for our gallery.

Using the same template as before, we’ll copy the following base code:

<iframe src="yourscene1.html" name="viewerframe" allowfullscreen="true" height="467" width="830"></iframe>

Here, we can assign any name we like — in this case, “viewerframe”.

We replace the example HTML with the scene we want to display by default, make sure allowfullscreen is set to "true", and define the viewer size. To create clickable thumbnails, we’ll copy the following code:

<a href="yourscene1.html" target="viewerframe"><img src="yourscene1.jpg" alt="yourscene1"></a>
<a href="yourscene2.html" target="viewerframe"><img src="yourscene2.jpg" alt="yourscene2"></a>
<a href="yourscene3.html" target="viewerframe"><img src="yourscene3.jpg" alt="yourscene3"></a>

We replace each HTML reference with one of our own, make sure the target matches the name we defined earlier (viewerframe), and replace the images with our custom thumbnails.

I’ve also added a stylesheet to the gallery to make it look a bit more modern and appealing. This is what the final result looks like once uploaded to our website.

<style>
.gallery-section {
  display: flex;
  flex-direction: column;
  align-items: center; /* align the iframe horizontally */
  gap: 10px; /* space between the iframe and the thumbnails */
  margin-top: 20px;
}
.thumbnail-bar {
  display: flex;
  justify-content: center; /* center the thumbnails */
  flex-wrap: wrap; /* allows thumbnails to wrap to the next line if there are many */
  gap: 10px; /* space between images */
}
.thumbnail-bar a img {
  width: 295px; /* thumbnail size */
  height: auto;
  border-radius: 8px;
  cursor: pointer;
  transition: transform 0.2s, box-shadow 0.2s;
}
.thumbnail-bar a img:hover {
  transform: scale(1.05);
  box-shadow: 0 0 10px rgba(0,0,0,0.3);
}
iframe {
  border: none;
  border-radius: 12px;
  box-shadow: 0 0 20px rgba(0,0,0,0.3);
}
</style>

And finally, this is the part of the code connected to the style sheet:

<div class="gallery-section">
  <iframe src="yourscene1.html" name="viewerframe" allowfullscreen height="600" width="900"></iframe>
  <div class="thumbnail-bar">
    <a href="yourscene1.html" target="viewerframe"><img src="yourscene1.jpg" alt="yourscene1"></a>
    <a href="yourscene2.html" target="viewerframe"><img src="yourscene2.jpg" alt="yourscene2"></a>
    <a href="yourscene3.html" target="viewerframe"><img src="yourscene3.jpg" alt="yourscene3"></a>
  </div>
</div>

I hope this tutorial has helped you enhance your project presentations using the new tools in Character Creator 5.

Related Posts

iClone Video Mocap Converts Videos into Editable 3D Motion

Reallusion introduces iClone Video Mocap, an AI motion capture service that turns ordinary videos into precise, editable 3D motions, right inside iClone. This new solution streamlines motion creation for filmmakers, game developers, and creators seeking natural, production-ready animation without the need for mocap hardware or costly subscriptions.

Precise AI Mocap from Any Video

By partnering with QuickMagic, Reallusion brings one of the most accurate and widely adopted AI mocap solutions on the market into iClone. With iClone Video Mocap, creators can instantly convert any video — whether it’s a YouTube clip, a movie scene, or even a quick phone recording — into precise, editable 3D motion data that is ready to apply to any 3D character. This is all made possible by QuickMagic’s advanced AI engine, which analyzes 2D footage and reconstructs 3D motion to deliver stable, lifelike animations.

Recreate Group Actions to Talking Scenes

From high-energy choreography to intimate conversations, iClone Video Mocap gives users precise control over every capture. Visually trim clips and define the detection range to match the performance you need. For multi-actor footage, simply duplicate the task for each performer and separate the detections to ensure clean, accurate motion for every single character.

The system adapts to a variety of performances, from full-body movement to upper-body gestures with detailed finger tracking. Paired with iClone’s AccuFACE and LIVE FACE, it transforms any talking clip into a fully synchronized facial-and-body performance with lifelike realism.

Core Motion Editing Support

While AI mocap can capture realistic movement, it also introduces common issues like foot sliding, depth misalignment, or unnatural twists. Most tools require exporting the data to another 3D editor for cleanup, adding extra steps to the process. With iClone Video Mocap, all fixes can be done directly inside iClone. Animators can correct motion offsets, smooth out jitters, and fine-tune details right in the timeline using tools like Foot Contact, Auto Motion Alignment, Curve Editor, and Reach Target.

Once a motion is generated, the reference video automatically loads in the viewport as a 2D background, letting you compare the animation with the original footage in real time.

Reallusion also provides over 35 professional mocap tutorials, helping users quickly master motion refinement and achieve production-quality results — all without leaving iClone.

Custom Motion for Any Character

iClone Video Mocap expands the creative reach of Reallusion’s character ecosystem. For AccuRIG, it offers a quick way to animate freshly rigged characters with personalized motions. For ActorCore users, it’s the perfect add-on, enabling unique performances that complement existing motion packs.

With Auto Retargeting, generated motions can be instantly adapted to characters of different body types with precise fitting, ensuring consistency and accuracy across diverse models.

Production-Level Mocap for Everyone

iClone Video Mocap is built to fit the needs of both indie creators and professional studios, offering a flexible, cost-efficient, and production-level approach to motion capture. For indie creators, it removes the need for expensive mocap suits or studio rentals — just record with your phone and start animating right away. For production houses, it supports a hybrid workflow for quick shots or background actions while reserving traditional mocap for large, complex scenes. The precise motion data makes it easy to integrate with existing pipelines, boosting efficiency without sacrificing quality. From solo artists to full teams, iClone Video Mocap adapts to any workflow with ease.

Pay-Per-Use for Maximum Flexibility

Unlike most AI mocap services that lock users into monthly subscriptions, iClone Video Mocap adopts a pay-per-use model. This gives creators full control over their costs. Spend just 250 DA Points (~$2.50) per motion generation, with no ongoing fees or usage limits.

Each upload can be up to 60 seconds long, and you can process up to 10 videos simultaneously, making it a fast, scalable solution for both individual artists and production teams. This on-demand model is both affordable and flexible.
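As a quick budgeting sketch of the pay-per-use model described above — note the per-point dollar rate is an assumption implied by 250 points ≈ $2.50, and actual pricing may differ:

```python
# Rough cost estimate for the pay-per-use model.
# Assumes the rate implied by 250 DA Points ~= $2.50 per generation.
POINTS_PER_GENERATION = 250
USD_PER_POINT = 2.50 / 250  # ~= $0.01 per point (assumed)

def batch_cost_usd(generations: int) -> float:
    """Total estimated cost in USD for a batch of motion generations."""
    return generations * POINTS_PER_GENERATION * USD_PER_POINT

# Ten 60-second clips processed in parallel:
print(batch_cost_usd(10))  # 25.0
```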

Top up DA Points to start

Convert Footage to Finished Animation in iClone

iClone Video Mocap marks a new chapter in Reallusion’s AI production ecosystem that combines intuitive AI automation with iClone’s precision motion tools. Creators can now generate, refine, and apply realistic motion entirely within one platform, redefining how animation is made.

Up to 100 free AI mocap generations — Limited-Time Video Mocap Access. Learn more

Copresence 3D Scans into Character Creator 5 Workflow

Copresence, a Switzerland-based deep tech company, is transforming how we interact in digital spaces. By combining AI-driven deep learning models with robust capture technology, Copresence enables anyone to generate photorealistic 3D avatars in just minutes. Their platform supports industries spanning gaming, VR/AR, telepresence, and digital communication, giving creators and businesses tools to build hyper-realistic avatars at scale.

In this article, we’ll walk through a step-by-step workflow shared by Anderson Rohr, a 3D artist from Brazil, who demonstrates how to bring a Copresence face scan into Character Creator 5 (CC5). Whether you’re a 3D animator, content creator, or production studio, this guide shows how to unlock the full potential of Copresence + Reallusion for lifelike character creation.

Why Use Copresence for 3D Face Scans?

The value of Copresence lies in its accessibility and accuracy. Unlike traditional 3D scanning setups that require expensive hardware and studio environments, Copresence only requires your smartphone. Within minutes, you can create:

  • High-fidelity 3D geometry of your face
  • 4K texture maps for photorealistic detail
  • Strand-based hair in Alembic format (not yet supported in CC5)

This means creators no longer need to rely solely on manual modeling or limited avatar generators. Instead, they start from a true-to-life digital double, ready to be customized in Character Creator 5.

Step 1: Scanning with the Copresence App

Anderson begins the tutorial by introducing the Copresence mobile app, available in app stores. The process is straightforward:

  1. Download the app and follow the built-in tutorial.
  2. Capture your face scan using your smartphone camera.
  3. Once complete, upload the scan to the Copresence web app.
  4. From there, download the CC5-compatible export package.

This package includes:

  • avatar.fbx → the 3D mesh of your face
  • Texture maps (diffuse, normal, roughness, etc.)
  • Hair in Alembic format (not currently supported in CC5)

At this point, you’re ready to bring your scan into Character Creator 5.

Step 2: Importing into Character Creator 5

Once the scan is downloaded, the workflow moves into Character Creator 5, Reallusion’s powerful tool for designing customizable characters. Anderson breaks it down:

  1. Unzip the Copresence download folder.
  2. Open Character Creator 5.
  3. Go to File → Import → Import.
  4. Select your avatar.fbx file.
  5. When prompted, choose:
    • Prop
    • Keep Original
    • Convert All

This imports the 3D scan as a base mesh.

Step 3: Using the Headshot 2 Plugin for Head Generation

With the Copresence model imported, the next step is refinement using Character Creator’s Headshot 2 plugin.

  • Navigate to the Scene Tab.
  • Select the root node of your imported mesh.
  • Open Headshot 2.
  • Select Mesh as input and click Start Head Generation.

Headshot 2 analyzes the Copresence scan and regenerates it as a fully rigged CC5-compatible head, with improved topology optimized for animation and morphing.

This is where the scan truly comes alive—transitioning from a static 3D model into a character that can be animated.

Step 4: Refining Textures and Hair

Since CC5 doesn’t currently support Alembic-based hair imports, Anderson chooses a preset hairstyle directly within Character Creator.

For textures, Copresence provides 4K maps that can be re-applied:

  • Base Color → for skin tone accuracy
  • Normal Map → for fine detail like pores and wrinkles
  • Roughness Map → for realistic lighting and skin reflection

The result is a character that looks remarkably close to the scanned subject—while retaining full CC5 editability.

Step 5: Full Customization in Character Creator 5

One of the advantages of CC5 is that the imported scan isn’t locked. Anderson demonstrates how you can still modify:

  • Facial features (nose, eyes, jawline, lips, etc.)
  • Skin tones and makeup
  • Clothing, accessories, and hairstyles
  • Expression profiles for animation

This flexibility makes CC5 a powerful extension of Copresence’s scan. The workflow combines realism from AI-driven capture with creative control in CC5.

Importing CC5 Characters into MetaHuman Animator

One of the most powerful aspects of Character Creator 5 is its flexibility to integrate with pipelines beyond iClone and Unreal Engine. For creators who want to leverage MetaHuman Animator (MHA) for advanced facial performance capture, CC5 provides a streamlined way to bridge assets into the workflow.

By exporting characters from CC5 in Unreal-ready formats (FBX or USD), users can bring their fully rigged and textured avatars directly into Unreal Engine. Once inside, these characters can be linked with the MetaHuman Animator system, enabling detailed facial animation through Epic Games’ powerful capture pipeline.

The benefits of this integration are significant:

  • Freedom in character design – While MetaHuman focuses on ultra-realistic humans, CC5 lets you design stylized, fantasy, or hybrid characters, which can still benefit from MHA’s facial fidelity.
  • Faster production workflow – You don’t need to build characters from scratch in Unreal. Instead, you can take advantage of CC5’s extensive morph sliders, clothing systems, and asset libraries before plugging into MHA.
  • Cross-pipeline compatibility – This means you can combine the best of both worlds: CC5 for creative freedom, MetaHuman Animator for performance capture realism, and Unreal Engine as the final rendering hub.

For filmmakers, indie studios, and animators, this compatibility opens doors to creating highly expressive performances without being locked into a single ecosystem.

Applications for Creators and Studios

The combination of Copresence and Reallusion tools opens new possibilities for:

  • Game Developers → Build lifelike avatars for NPCs or player characters.
  • Animation Studios → Reduce time spent on modeling and rigging digital doubles.
  • Content Creators → Clone yourself in 3D for YouTube, TikTok, or VR content.
  • Production Houses → Use real faces in pre-visualization and cinematic pipelines.

By eliminating the need for expensive 3D scanners or mocap setups, this workflow lowers the barrier for high-quality avatar creation.

Unlock Flawless 3D Scans

Copresence just released a next-gen pipeline using new deep learning models to eliminate shadows and artifacts. Dive into the update now → https://www.copresence.tech/blog/platform-release/

Conclusion: The Future of Avatar Creation

Copresence and Reallusion’s Character Creator 5 together represent a new standard for digital human creation. With Copresence, anyone can create a realistic 3D scan in minutes. With CC5, that scan becomes a fully editable, animatable avatar ready for games, films, and immersive experiences.

For animators, studios, and creators, this workflow means less time modeling, more time creating. As Anderson Rohr shows, the future of avatars is here—and it’s accessible to everyone.

FAQ

Do I need professional equipment for Copresence scans?

No. Copresence works with your smartphone camera and processes everything in the app.

Can I use Copresence hair in Character Creator 5?

Currently, CC5 doesn’t support Alembic hair files. You’ll need to use built-in hairstyles.

Is Headshot 2 required for this workflow?

Yes. Headshot 2 is necessary to convert the Copresence mesh into a fully rigged CC5-compatible head.

Can I animate the imported Copresence avatar in iClone?

Yes. Once processed in CC5, the character can be exported to iClone for full animation.

What industries benefit most from Copresence + CC5?

Gaming, film, VR/AR, and digital content creation all gain from faster, more realistic avatar production.

Related Posts

Breathe Life into Your CC5 Characters with Marmoset Toolbag Auto Setup

ÓSCAR FERNÁNDEZ / DIGITAL SCULPTOR

Óscar Fernández is a freelance digital sculptor from Spain, specializing in creating figures for 3D printing. With deep expertise in ZBrush, he is known for crafting highly expressive characters that capture both personality and motion. His work stands out through the meticulous attention to facial expression, muscle definition, and dynamic posing, giving each sculpt a strong sense of tension and storytelling power.

Check Oscar’s ArtStation

Oscar Fernandez’s HD Alien MIXER pack is available now!

Overview

An exciting update in Character Creator 5 (CC5) is the addition of more free add-ons designed to streamline your workflow across leading industry tools. With the new UDIM integration in Marmoset Toolbag 5, it’s now possible to achieve a direct, hassle-free transition between CC5 and your rendering software using Auto Setup for Marmoset. We’ll start off with an existing project for a set of alien characters and finish with a high-quality final render.

Installation

The installation is super easy: Download the Auto Setup plugin for Marmoset from the Reallusion website and extract the files. Open Marmoset and go to Edit > Plugin, then click Show User Plugin Folder. When the folder opens, simply drag in the Auto Setup files you just extracted. Go back to the Plugin menu and click Refresh to reload the plugin data. Once it’s installed correctly, CC Auto Setup for Marmoset will appear under the Plugin menu. Now you can click on it to open the Auto Setup panel — but before that, head over to CC5.

Character Export from CC5

In our case, we don’t have any clothing or accessories, but we are going to apply a pose to our character before exporting. Adjust the body posture, and you can also add facial expressions if you want. Save the pose in case you want to use it later. Select the character, go to the File menu, choose Export, and select FBX. Pick the Clothed Character option — even though in this case our character isn’t wearing any clothes. Once the export window opens, you’ll see a bunch of presets depending on the target software, but in our case, we’ll choose Marmoset Toolbag to automatically set up compatibility with Marmoset. 

Under FBX Options, make sure Mesh and Motion is enabled to include the character’s animation data. For Motion Settings, select Current Pose, leave the rest of the settings at their default values, and hit Export. Once the export is complete, you’ll find the FBX file along with its associated textures and a JSON file. This JSON file is very important because it links the textures to the model, defining which part of the mesh uses which texture.

Character import to Marmoset Toolbag

Now, in Marmoset, go to the Plugins menu as we did before and open Character Creator and iClone Auto Setup. The plugin couldn’t be easier to use: 

Select the FBX file you just exported from CC5, and the JSON file will automatically link to it. Click Import and wait for the character to load. The loading process is usually quite fast, but it may take a bit longer if you selected Subdivision Level 2, since that makes the file larger. Once it’s loaded, you can examine the character from every angle before moving on. 

Adjusting lighting and environment (Lookdev settings)

The plugin doesn’t just load the file and textures — it also provides a set of presets to light your scene. Any imported object or character will automatically be illuminated using the Full Body Light presets. When switching between presets, you’ll notice that a different HDRI image is also loaded in the Sky section, which helps create a more vibrant and coherent render. Similarly, if you want to focus on the face, you can choose one of the configurations included under Headlighting. This gives you a high-quality starting setup that you can easily tweak and play around with. 

Directly from the plugin, you can adjust the HDRI brightness and rotation to preview how the lighting affects your model from different angles. Ray tracing is enabled by default for high-quality rendering with optimized preset settings, but of course, you can modify them if needed.

Once our character is imported, we can continue tweaking all the typical Marmoset settings to customize the scene to our liking — adjusting the camera settings, such as focal length, depth of field, adding post-processing effects, and more. As mentioned earlier, we can play around with the HDRI values. The lighting setup we get from the presets is also fully customizable — you can adjust the color, brightness, light softness, and of course, its position. With all these options, you can create a completely personalized scene and fine-tune it precisely to match your character and the mood or story you want to convey.

Materials preparation

The first material we can experiment with is the background, adjusting parameters like color, roughness, or metalness to change the atmosphere of our scene. Thanks to the Auto Setup plugin, the Digital Human shader is applied automatically — meaning all the necessary parameters for each texture are transferred to Marmoset so that everything looks exactly as it did in CC5. 

We won’t go into every material in detail, but for each one we can modify all the settings to fully customize it. For example, making the skin look wetter or more metallic, changing the eye color and all related parameters, adjusting the tiling of the normal map to control how skin details behave, tweaking skin transparency values, and more. We can fine-tune the details endlessly. 

Animations

Another advantage of working with Marmoset is that we can also render animations, and the process is just as simple as what we saw for a static pose. In our case, we’ll choose a simple animation, such as a walking cycle, and enable position locking so the character walks in place. Then we go to the export settings as we did before. We select Subdivision Level 0 so the import into Marmoset is faster and the animation runs more smoothly. We also make sure to check Current Animation and All, so the full animation is exported. Additionally, we ensure the frame rate matches the project settings in Marmoset (in our case, 30 fps), and then we just export the FBX.

Now, we launch Marmoset and load our character through the plugin as we saw in the previous example, and then all we have to do is hit Play to run the animation. With Ray Tracing enabled, the animation may not look smooth in the viewport, but it will render perfectly. Next, we can slightly adjust the scene by changing the background color and some lighting parameters. I’m going to place the character walking on a simple base I created in ZBrush. I’ll adjust the size and position of the base, as well as the backdrop, so they align with the bottom of the character. Now the scene is ready. I’ll apply a metallic material to the base and tweak the material settings slightly so it doesn’t overshadow the character. 

To keep the renders of our characters consistent with each other, I’m going to save this empty scene. So I delete the character but keep the lighting setup and the base. Then I delete all materials except for the backdrop and base materials, place them in a folder, and save the project. Now I can load another character using the Auto Setup plugin, but this time I’ll uncheck Apply Look Dev so the scene configuration is preserved. We can also uncheck this option if we want to apply our own lighting setup.

This animation was exported with Subdivision Level 2, and my PC struggles a bit to handle it, so I go to the body mesh and uncheck the Subdivide option… as we can see, there’s hardly any difference since all the shader information is still applied, but everything runs much more smoothly. Now we have everything ready for export, so let’s go for it.

Final render

Finally, let’s look at several cases that make Marmoset Toolbag an excellent choice for rendering. We’ll start with rendering a still image: 

After posing our character, we import it and adjust the focal length and camera angle for the render. We tweak the lighting slightly to highlight the character’s details and adjust the depth of field to give it a more cinematic look. In the Render tab, we choose the final image dimensions and increase the samples to 512 for higher quality. We check the Safe Frame option to ensure the framing is correct and verify that the render pass we’re generating is the final composition. Now all that’s left is to press the Render button so the program produces the image. 

We can also do basic post-processing directly in Marmoset by adjusting exposure, contrast, sharpening, etc., to achieve a more polished image — but I prefer to make these adjustments in Photoshop. For that, we can export additional render passes to aid in post-processing in Photoshop. Once in Photoshop, we play with the background texture, locally expose or overexpose certain areas, apply extra effects using Camera Raw, or add atmosphere with smoke, floating particles, or blurred elements to enhance depth in the composition. All these small touches make the final image feel much more organic and dramatic.

Turntable 

Another super useful option for showcasing our characters is turntable videos. We can create a 360° video of either a static pose or an animation. In our case, we’ll do it with our walking character, so we can also consider some extra points. Once the FBX is loaded, we simply create a turntable by right-clicking in the Scene panel. At first, it seems like nothing happens — our character keeps walking without turning… This is because the turntable works like a folder: everything inside it will rotate, while anything outside stays static. 

In my case, I only want the character and the base to rotate, not the lights or backdrop, so I put just these two elements inside the folder. To ensure we get a seamless loop, I check in CC5 how many frames the animation lasts — in this case, 327 frames — and set that value as the number of frames in Marmoset’s timeline. This gives me the total time, which I can then use to calculate how many degrees it should rotate per second. 

Using a calculator, I divide 360° (a full rotation) by 10.9 seconds (the length of the animation) and change the value accordingly. Now we can check that the first and last frames match perfectly. Once that’s sorted, we can export. Go to the Render tab, choose the video dimensions and sample count, make sure we’re exporting the final composition, and render. This gives us the final result. 
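The loop arithmetic above can be double-checked with a few lines, using the frame count and 30 fps frame rate from this walkthrough:

```python
# Verify the turntable rotation speed for a seamless loop.
frames = 327  # animation length reported by CC5
fps = 30      # Marmoset project frame rate

duration_s = frames / fps          # length of one loop in seconds
deg_per_second = 360 / duration_s  # one full rotation per loop

print(duration_s)                # 10.9
print(round(deg_per_second, 2))  # 33.03
```

Setting the turntable to roughly 33°/s makes the first and last frames line up, so the rotation and the walk cycle loop together.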

Don’t miss the next tutorial, where we’ll explore some other very interesting examples of what Marmoset Toolbag and CC5 can do.

Related Posts

Fail Forward Faster: Why Nick Shaheen Recommends CC4D Plugin for CC Cinema 4D Pipeline

Nick Shaheen: From Medicine to Motion

I’m Nick Shaheen, a radiologist by profession and a CG artist by passion. My journey into animation started in an unexpected place—medical illustration. Early in my career, I spent a fair bit of time creating motion graphics and anatomical visuals to help explain complex medical concepts. That experience taught me the power of visual storytelling and deepened my appreciation for animation as a way to communicate ideas dynamically.

Over time, my work expanded beyond medical visualization into character animation and motion design. I’m always looking for ways to refine my process and bring more life to my work, and discovering Reallusion’s tools opened up new creative possibilities that I hadn’t explored before.

Nick’s Instagram / YouTube Channel

Overview

Art is all about experimentation and learning from failure. The faster you can move through that cycle, the more chances you have to refine your work. That’s why I’m excited about CC4D, a powerful plugin that bridges Character Creator and Cinema 4D with Redshift.

What is CC4D

Developed by Reallusion partner Benjamin Broschinski, CC4D extends Character Creator’s ecosystem into Cinema 4D through an extensive toolbox designed for seamless character import, material setup, and animation management. It’s the ultimate bridge for artists who love Cinema 4D’s creative environment but want to harness Character Creator’s digital human quality and rigging precision.

Why CC4D Matters

1. Instant Redshift Setup

Bringing characters into Cinema 4D used to mean lots of time spent on manual node cleanup. CC4D automates that entire process in just a few clicks. Diffuse, roughness, subsurface scattering — all the essentials are correctly assigned without touching a single node.

2. More Time for Creativity

Instead of fighting technical bottlenecks, you can focus on stylizing characters, experimenting with lighting, and pushing the overall look of your scene.

3. Animation-Friendly Pipeline

CC4D doesn’t stop at materials. It can also import face and body rigs, giving artists who prefer Cinema 4D’s character tools a clean way to handle animation from start to finish.

4. Iterate at Speed

Creative work is messy. You try things, fail, adjust, and try again. CC4D accelerates that loop, letting you experiment more often — and every extra cycle makes the final result stronger.

For my recent project Envy, CC4D made the difference between spending hours on technical prep versus spending those hours crafting mood, stylization, and storytelling.

Final Thoughts

Failure is part of every artist’s journey, but the right tools can help you fail forward — faster. With CC4D, you can skip technical roadblocks and spend more time where it matters most: crafting compelling stories, experimenting with style, and bringing your characters to life.

Related Posts

Libertas Review: Laser Sword Pack for Epic Cinematic Fights

Featured Developer Erik Larson (Libertas Armory) shares his honest opinion on Reallusion's Laser Sword Content Pack. He explains how he had used it for his combat motion, and shares his tips and tricks on how to use this pack to your advantage.

As a 3D artist, I often combine Blender, Unreal Engine, and iClone to tell stories and bring digital characters to life. I’m always searching for tools that make animation both faster and more expressive. I was genuinely intrigued by Laser Sword Pack’s potential, and after putting it to the test, I can confidently say: the paired motion system is an absolute game-changer.

Erik Larson (Libertas)

Hailing from Chicago, Libertas developed a passion for filmmaking early on. His love for the AAA game Assassin’s Creed led him to create videos about the game, which can be found on his channel, Libertas Video. He also produces a series of Assassin’s Creed-inspired micro-short films, such as “Modern Assassin Training Session”.

Despite his shoestring budget, Libertas enjoys crafting characters and costumes, bringing them to life in his YouTube short films. During the day, he is the Director of 3D at a production house; by night, he runs Libertas Video as a solo 3D generalist, spending his free time creating characters, costumes, and props for his digital actors.

In this article, Erik takes a moment out of his precious time to introduce the “Laser Sword Pack”.

Inside the Laser Sword Motion Pack

This Laser Sword Pack offers 41 animations across four categories: “Sword Duels”, “Win or Lose”, “2v1 Battles”, and “Combat Stances”. The pack also comes with two types of laser sabers that support RGB customization for lighting adjustment. The motions were performed by Monkey Chow Animation Studios’ professional stunt team of Hollywood veterans, delivering unbeatable quality.

Matched Motions: True Choreography from the Pros

The pack comes with paired combat animations—each move precisely matched between two characters, one attacking and one defending. It’s similar in approach to Reallusion’s Hand-to-Hand Combat Pack, which I reviewed a few years back, but this time the focus is on weapon-based combat. The result feels cinematic, coordinated, and dynamic right out of the box.

How the Laser Sword Pack was Made by Monkey Chow Animation Studio: A Side-by-Side Comparison.

While I often use my Sony Mocopi Pro for solo motion capture, recording both sides of a duel can be slow and repetitive. This pack solves that problem completely. Every sequence comes from professional stunt performers, delivering real choreography with flips, spins, and reactive movement that’s difficult to capture solo.

Animated Props: The Secret Weapon

One feature that really surprised me was iClone’s animated props. I’ll admit—I hadn’t paid much attention to them before. But in this pack, they truly shine. Every time you apply a motion to a character, a corresponding laser sword is automatically added and animated in sync. That means the weapon stays perfectly aligned in the character’s hands during all those high-speed swings and clashes. Even better, it helps when aligning characters for impact points, ensuring that every strike connects visually and believably.

At first, I thought having “laser swords” might limit my options since I’m not creating Star Wars fan films. But it turns out, this feature is incredibly versatile. Simply line up your own weapon model with the laser sword, parent it to the prop, and then hide the original blade. The result? Your sword moves with the exact same fidelity—giving you dynamic, realistic motion in a fraction of the time.

Ideal for Cinematic Duels

Although these motions are stylized for laser sword duels, they’re easily adaptable for two-handed weapons—katanas, longswords, or even spears for the double-bladed sequences. With a few tweaks, the choreography can fit any fantasy or sci-fi project.

The Verdict

The Laser Sword Motion Pack is a must-have for anyone aiming to create fluid, cinematic fight scenes without spending weeks in mocap or manual animation. Just drop in your custom characters, apply the motions, fine-tune a few alignments, and render your sequence—it’s that simple.

If you liked the look of the characters in my demo, many of their outfits come from my own Libertas Armory packs, which you can find linked below. And if this pack caught your attention, I’ve reviewed several other Reallusion motion packs on my channel—definitely worth checking out if you’re building your own 3D action library.

Related Posts