Architect-Artist Innovates CG Film Production with CC AccuRIG, Redshift SSS and iClone

Kay John Yim is a Chartered Architect at Spink Partners based in London. He has worked on a wide range of projects across the UK, Hong Kong, Saudi Arabia, and Qatar, including property development and landscape design. His work has been featured on Maxon, ArtStation, CG Record, and 80.LV.

Yim’s growing passion for crafting unbuilt architecture with technology has gradually driven him to take on the role of a CGI artist, delivering visuals that serve not only as client presentations but also as a means of communication among the design and construction team. Since the 2021 COVID lockdown, he has challenged himself to take courses in CG disciplines beyond architecture, and has since won more than a dozen CG competitions.

“Masquerade” tells a story of a girl with a concealed identity who enters an empty theater filled with statues. As the statues come to life, a mysterious gentleman joins her and they waltz to the rhythm of enchanted flames, lost in a world of magic and fantasy.

This project was my debut short CG film, and it allowed me to develop my skills in storyboarding while working with AccuRIG and motion capture data to create intricate character animations. It was also an opportunity to refine my lighting and rendering techniques. I would also like to express my sincere gratitude to Reallusion and GarageFarm for their sponsorship; their support made this project possible.

Workflow Overview

The software and workflow I used throughout every stage of the project can be summarized as follows:

Ⅰ. MODELING

▪ Rhino > MOI > Cinema 4D > Redshift

Ⅱ. CHARACTER CREATION

▪ Character Creator 4 (CC4) > iClone 8 (iC8) > Houdini > Marvelous Designer > Houdini

Ⅲ. AccuRIG (STATUE ANIMATION)

▪ Cinema 4D > Houdini > CC4 AccuRIG > iC8.1 > Cinema 4D > Embergen > Houdini (retime) > Cinema 4D > Redshift

Ⅳ. CHARACTER ANIMATION

 ▪ iC8.1 > Embergen > Houdini (retime) > Cinema 4D > Redshift

Ⅴ. CLOTH SIMULATION

▪ Marvelous Designer > Houdini > Cinema 4D (C4D)

Ⅵ. ASSEMBLY, LOOK-DEV & SHOT FRAMING

▪ iClone > C4D

Ⅶ. RENDERING

▪ Redshift > Neat Video (denoise) > After Effects (Heatwave) > Premiere (Magic Bullet Looks) > RIFE-App (interpolation)

1. MODELING

The scene consists of two spaces, a hallway and a theater, designed and modeled in reference to the Louvre Palace and the Opéra Garnier theater, respectively. While they are not complicated spaces in terms of architecture, the ornaments are highly detailed and could easily have resulted in extremely heavy models that would have caused render times to skyrocket. Because of this, optimization became a core principle throughout the making of “Masquerade”.

The following iconic buildings served as references throughout the modeling process. 

Using my past projects “Ballerina”, “The Magician”, and “The Gallery of the Great Battles” as a foundation, I recycled many of the ornaments I had modeled for the Amalienburg hunting lodge in Munich. With the main architectural space blocked out, I then switched between Rhino and C4D, using the following Rhino commands and C4D functions for modeling:

  • Flow (Rhino): Conforms objects to a specified curve; useful for accurately bending ornament meshes to architraves, bullnoses, etc.
  • Flow Along Surface (Rhino): Conforms objects from one surface to another; useful for accurately transferring ornaments from flat surfaces to curved counterparts, e.g., balcony balustrades.
  • Spline Wrap (C4D): Similar to Flow in Rhino but procedural; it cannot handle meshes as heavy as Flow can, but it is extremely versatile when the design keeps changing.
  • Volume Builder + Volume Mesher + Quad Remesher (C4D): Useful for retopologizing heavy meshes that require close-up detail but need to maintain their 3D silhouettes.
  • Mirror Instance (C4D): I set my scene origin at (0,0,0), placed all of the symmetric geometry under a null, created an Instance of that null, and scaled the instance’s X by -1, which mirrored it procedurally.

While I reused many ornaments I had previously modeled in ZBrush and retopologized in C4D, I also used 3D ornaments from textures.com. I recycled meshes and used “Render Instances” throughout the project wherever possible to keep the file size down.
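The Mirror Instance setup above can also be scripted. Below is a minimal Cinema 4D Python sketch of the idea, assuming a null named “Ornaments_L” already holds one symmetric half of the geometry; the object name and the Script Manager context are my own assumptions, not part of the original project files.

```python
# Sketch of the mirror-instance trick in Cinema 4D's Python Script Manager:
# instance a null that holds one symmetric half of the geometry, then scale
# the instance's X by -1 so the copy mirrors procedurally around the origin.
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()
    source = doc.SearchObject("Ornaments_L")   # hypothetical null holding one half
    if source is None:
        raise RuntimeError("Null 'Ornaments_L' not found in the scene")

    mirror = c4d.BaseObject(c4d.Oinstance)     # create an Instance object
    mirror.SetName("Ornaments_R_mirrored")
    mirror[c4d.INSTANCEOBJECT_LINK] = source   # link the instance to the null
    mirror.SetRelScale(c4d.Vector(-1.0, 1.0, 1.0))  # mirror across X at (0,0,0)

    doc.InsertObject(mirror)
    c4d.EventAdd()

if __name__ == '__main__':
    main()
```

Enabling the Render Instance option on the same Instance object, as mentioned above, also keeps memory usage low at render time.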

Figure 1. Rhino time lapse
Figure 2. C4D & Rhino time lapse

2. CHARACTER CREATION

The female character I used for this project was originally created in Character Creator for a short called “Ballerina”. I have since updated her using the integrated SkinGen library in CC4.1 and added an extended facial profile for animation in iClone 8.1.

My CG character workflow sped up drastically when I began to integrate CC4 with Redshift’s Random Walk Subsurface Scattering. Not only did CC4’s integrated SkinGen features allow for faster procedural texturing, but my overall workflow was also simplified into a series of drag-and-drop processes, with textures automatically exported alongside the FBX files.

The male lead ended up being a modified version of a default character in CC4.1. While the character creation process had become a lot easier in CC4, making convincing characters remained a challenge for me, so I used the following websites as my go-to references while adjusting character bone structures, facial silhouettes, and skin textures.

  • This site generates a non-existent person every time the page is refreshed.
  • This site provides a series of high-res portrait photos.

After creating the male and female leads, I exported both characters to iClone 8.1 for animation.

Figure 3. Face morph
Figure 4. CC4 extended facial profile & Digital Soul test – 1
Figure 5. CC4 extended facial profile & Digital Soul test – 2

3. AccuRIG in CHARACTER CREATOR 4

The statues played a huge role in setting the tone for the scene; in fact, the final theater model and shot framing were designed around the movement of the statues.

Figure 6. Theater clay render

AccuRIG in CC 4.1 lets inexperienced artists rig their models and turn them into animatable characters. For AccuRIG to work its magic, I first had to remove the skirt on the ballerina statue, which would have interfered with the rigging procedure. I then reposed the statue into a T-pose using a combination of Cinema 4D (C4D) and Houdini.

Figure 7.  CC AccuRIG preparation
Figure 8.  CC AccuRIG time lapse

For the animation, I used Edit Motion Layer to create three varied movements for use in my scene. I then exported the animated statues into Houdini, removed all of the body parts except for the sleeves, and imported them into Embergen for pyro simulation. Exporting the pyro VDBs from Embergen became an overnight process, much like a final render.

At the time of this writing, Embergen (v0.7.5.8) did not support VDB export at 24 fps, which would have been the preferred frame rate for most of my projects, including “Masquerade”. Since I eventually uploaded the project to GarageFarm for final rendering, it was just as important to optimize file sizes as render times. Exported VDB files can easily take up dozens of gigabytes of space (approximately 150 GB in total for this project), so to reduce the total file size, I used Houdini’s Retime node to adjust the timing of the VDBs from 60 or 30 fps down to 24 fps, which effectively reduced the total file size by 25%.
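The frame-count arithmetic behind the retime is straightforward: if the cache should keep its original real-time duration, the retime speed is the source frame rate divided by the target frame rate, and the dropped frames translate directly into saved disk space. The sketch below illustrates this in plain Python; the cache sizes are made-up examples, not the project’s actual figures.

```python
# Frame-count arithmetic behind retiming VDB caches down to 24 fps,
# assuming the cache keeps its original real-time duration.

def retime_savings(source_fps: float, target_fps: float, cache_gb: float):
    """Return (retime speed factor, estimated cache size in GB after retiming)."""
    speed_factor = source_fps / target_fps      # e.g. 60 / 24 = 2.5
    kept_fraction = target_fps / source_fps     # fraction of cached frames kept
    return speed_factor, cache_gb * kept_fraction

# Hypothetical cache sizes for a 60 fps and a 30 fps Embergen export.
for src_fps, size_gb in [(60.0, 40.0), (30.0, 60.0)]:
    factor, new_size = retime_savings(src_fps, 24.0, size_gb)
    print(f"{src_fps:.0f} fps -> 24 fps: speed factor {factor:.2f}, "
          f"cache ~{new_size:.0f} GB (was {size_gb:.0f} GB)")
```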

Figure 9.  Embergen pyro sim
Figure 10.  Retiming VDBs in Houdini

I then imported the animated Alembic files, FBXs, and simulated pyro VDBs into C4D. I used Constraints to bind the skirt back onto the FBX skeleton’s pelvis so that it would transform along with the animation. Finally, I duplicated the statues into an array and offset every statue’s animation slightly to create a gradual rhythm across the group.
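The staggered offsets across the statue array follow a simple pattern. The sketch below shows the idea in plain Python; the statue count and per-statue delay are hypothetical values chosen for illustration, not the ones used in the project.

```python
# Staggering animation start times across an array of statues so the motion
# ripples through the group instead of playing in perfect unison.

FPS = 24
NUM_STATUES = 12        # hypothetical number of statues in the array
STAGGER_FRAMES = 6      # hypothetical delay between neighbouring statues

offsets = [i * STAGGER_FRAMES for i in range(NUM_STATUES)]

for index, offset in enumerate(offsets):
    print(f"statue {index:02d}: start offset {offset:3d} frames ({offset / FPS:.2f} s)")
```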

Figure 11.  Animated statue with pyro effect

4. CHARACTER ANIMATION

The character animations were created using a mix of Xsens motion capture data and ActorCore premade motions. For the entrance sequence, I first applied a looping “Cat Walk” cycle from the ActorCore motion library to the female lead. This gave me the foundation for animating the girl’s entry into the theater hall. I then animated her translation through the hall to match the pace of the walk cycle, using iClone 8.1’s Motion Correction feature to compensate for foot sliding. I continued to use Edit Motion Layer to add a head turn mid-walk and used features from “Digital Soul” to apply facial animations.

Figure 12.  Catwalk

With the entrance sequence animation complete, I exported the animated character as an Alembic file to Houdini, where I removed intersecting body parts, particularly the arms, which would have interfered with the cloth simulation in Marvelous Designer (MD).

Figure 13.  Dress simulation

For the second half of the animation, I motion-captured real-life actors in Xsens suits performing a “Cinderella” waltz and transitioned the animation data into a slow dance motion from Reallusion’s “Studio Mocap Series: Motion for Lovers”.

Figure 14.  Motion-Capturing a Cinderella Waltz

With the Xsens raw data imported into iClone 8.1, I used “Digital Soul” to apply facial animations and locked the eyes of my actors on one another. I used Reach Target to correct the glitchy parts of the interactions and Edit Motion Layer to readjust the intersecting body parts wherever necessary.

Figure 15.  Editing Motion Data in iClone 8 (mocap clean-up)
Figure 16.  Using ‘Reach Target’ feature in iClone 8

5. CLOTH SIMULATION

While C4D and Houdini have both made significant improvements in cloth simulation lately, I still find that Marvelous Designer offers a level of control, speed, and quality that surpasses every other cloth simulation package I have tried. That said, while realistic most of the time, MD’s simulation rarely gives a perfect result. I generally tinker with the Friction and Simulation Quality settings to deal with the minor glitches I encounter along the way, then export several versions of the simulation into Houdini and use a Blendshape to blend them together.
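Conceptually, blending simulation variants with a Blendshape is just a per-point linear interpolation between caches that share the same topology. The NumPy sketch below illustrates that idea outside of Houdini; the point arrays and the blend weight are made up for illustration.

```python
# Conceptual sketch of blending two cloth-simulation variants, as a
# Blendshape does: per-point linear interpolation of positions, assuming
# both caches share the same topology and point order.
import numpy as np

def blend_simulations(points_a: np.ndarray, points_b: np.ndarray, weight: float) -> np.ndarray:
    """Blend two (N, 3) point-position arrays; weight=0 returns A, weight=1 returns B."""
    if points_a.shape != points_b.shape:
        raise ValueError("Simulation variants must share the same point count")
    return (1.0 - weight) * points_a + weight * points_b

# Tiny example with made-up positions for a four-point patch of cloth.
sim_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 0.1]])
sim_b = np.array([[0.0, 0.1, 0.0], [1.0, 0.1, 0.0], [0.0, 1.1, 0.0], [1.0, 1.0, 0.3]])
print(blend_simulations(sim_a, sim_b, weight=0.5))
```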

Figure 17.  Dance dress simulation
Figure 18.  Tux simulation
Figure 19.  Adjusting blendshapes in Houdini
Figure 20.  Dance animation after mocap clean-up

To add some mystery to the protagonist, I had her wear a cloak in the opening sequence, which gradually vanishes to reveal the character’s face and dress. This was created using C4D’s PolyFX with an improved setup carried over from a previous project of mine called “Kagura”.

Figure 21.  Cloak simulation
Figure 22.  Cloak reveal

6. ASSEMBLY, LOOK-DEV, AND SHOT FRAMING

Returning to the entrance sequence, I used iClone’s camera in combination with C4D’s Camera Morph to precisely control the tracking camera. First, I set up a camera in iClone 8.1 and aimed it toward the female lead’s eyes, creating the illusion of the character looking into the camera and breaking the fourth wall. I then exported the camera as an FBX file for use in C4D, positioned a total of four cameras around the entrance, and used Camera Morph to transition smoothly among them.

Figure 23.  Camera animation

With the character in place, I played the sequence repeatedly and added three extra cameras along her walking path, with two of them eventually passing her and switching their focus toward the statues in the theater. The final Camera Morph was controlled and timed with Greyscalegorilla’s “Signal” plugin, which allowed me to use a curve (rather than keys) to control the timing of the camera transitions. For the dance sequence, I used Align to Spline with a circle spline centered on the characters to create consistent, controlled framing, as if they were filmed with actual cameras mounted on dolly tracks.
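To illustrate what a curve-driven camera morph does, the sketch below blends among four camera positions using a smoothstep-shaped ease instead of keyframes. The positions and the easing function are placeholders for illustration, not the actual Signal setup used in the project.

```python
# Rough illustration of a curve-driven camera morph: a normalized time t is
# reshaped by an easing curve and used to blend along a chain of cameras.
import numpy as np

CAMERAS = np.array([        # hypothetical camera positions along the walking path
    [0.0, 1.6, -10.0],
    [2.0, 1.7,  -6.0],
    [3.5, 1.8,  -2.0],
    [4.0, 2.0,   2.0],
])

def smoothstep(t: float) -> float:
    """Ease-in/ease-out curve used in place of linear keyframes."""
    return t * t * (3.0 - 2.0 * t)

def morph_camera(t: float) -> np.ndarray:
    """Blend along the camera chain for a normalized time t in [0, 1]."""
    t = smoothstep(float(np.clip(t, 0.0, 1.0)))
    segment = min(int(t * (len(CAMERAS) - 1)), len(CAMERAS) - 2)
    local = t * (len(CAMERAS) - 1) - segment
    return (1.0 - local) * CAMERAS[segment] + local * CAMERAS[segment + 1]

for frame in range(0, 121, 24):     # sample a five-second move at 24 fps
    print(frame, morph_camera(frame / 120.0))
```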

Figure 24.  Dance in the assembled scene

I kept the lighting setup as simple as possible, using the RS Sun and the chandeliers as the primary light sources and the pyro VDBs as a secondary source of illumination. I added an RS Environment driven by a Maxon Noise shader to give the final render an additional layer of atmosphere. I also kept the materials simple, using only five materials in total to maintain a distinguishable color palette.

Since version 3.5.06, Redshift’s updated Random Walk SSS has provided more realistic subsurface scattering without sacrificing render speed. It simplifies the setup of skin materials and produces better results under a variety of lighting conditions. Prior to this update, Redshift’s ray-traced SSS required multiple texture layers and manual adjustments to create realistic skin materials, a time-consuming process that required constant tweaking on animation sequences with significant lighting changes. While Arnold has offered Random Walk SSS for some time, Redshift’s implementation has made it much more efficient and practical for use in animation.

For the Subsurface settings of the characters’ skin materials, I used the Skin Diffuse map from CC 4.1 as the color and set the Radius to a salmon color (similar to the color of one’s dermis when viewed under direct illumination). I set the Scale to 0.1 to represent the thickness of the skin, and used Random Walk mode with the Include Mode set to “All Objects”.
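For quick reference, the Subsurface values described above can be summarized as follows. This is only a plain-Python summary of the settings mentioned in the text, not Redshift API code, and the salmon RGB value is my own approximation.

```python
# Plain-Python summary of the Redshift Random Walk SSS settings described
# above. The salmon RGB value is an approximation, not an exact project value.
skin_subsurface = {
    "color": "CC 4.1 Skin Diffuse map",     # plugged in as the SSS color
    "radius_rgb": (0.98, 0.53, 0.44),       # approximate salmon tone for the dermis
    "scale": 0.1,                           # represents the thickness of the skin
    "mode": "Random Walk",
    "include_mode": "All Objects",
}

for setting, value in skin_subsurface.items():
    print(f"{setting:>13}: {value}")
```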

Figure 25.  Redshift skin material graph

As for the look of the fire, I aimed for a more fantastical, smokeless appearance. To ensure that the colors and movements of the fire would match the lighting and color palette of the rest of the animation, I dedicated a lot of time at the start of the project to testing different render and pyro simulation settings.

Figure 26.  Fire dress test

7. RENDERING

At the time of writing, there was a persistent NVIDIA driver issue (a VRAM allocation bug) that consistently caused Redshift to crash on long sequence renders. The issue was widely discussed on Redshift’s Facebook group and official forum. Some 3D artists found that downgrading their NVIDIA drivers to 462.59 worked well, but the only fix that worked for me was to disable half of my GPUs for rendering (two out of four in my case).

The scene for “Masquerade” was one of the heaviest among all the animation projects I have done: the architectural model alone totaled 4 GB, while the entire scene (including VDBs, textures, characters, and cloth simulations) totaled over 500 GB. To optimize render times, I converted all static objects, including Matrix objects, into Redshift proxies by groups, for example, the ceiling as one proxy and the walls as another. This drastically reduced geometry loading times during final renders. While I used to shy away from Redshift proxies for animations due to render farm limitations, I had the pleasure of using GarageFarm, which fully supports Redshift proxies as long as the proxy material properties are set to “Materials from Object” or “Materials from Scene (Match name and prefix)”.

Aside from its support for Redshift proxies, GarageFarm was one of the best render farm experiences I have ever had. The 24/7 support from the GarageFarm team was invaluable, and they were always available to answer any questions I had. One of the standout features was the flexibility it offered for rendering: the ability to render single images in strips and to choose among three priority levels for render jobs allowed me to easily manage my budget and render times. I would highly recommend GarageFarm to anyone looking for a user-friendly and reliable render farm solution for their CGI animation projects.

For the final rendering, each shot was saved as its own project, which included the following:

  • Static geometry as RS Proxies, for the architecture and furniture.
  • Matrix objects as RS Proxies, for the flower petals scattered across the ground and in the air.
  • Characters and animated garments in Alembic format.
  • Simulated pyro in VDB format.

I experimented with multiple optimization techniques and found the following four to have the most drastic impact on render times:

  1. Deleting everything outside the camera view: I created a unique scene for every shot so that each scene only contained what was visible to the camera. This significantly reduced my final render times (by up to 80%), especially for the statue close-up shots.
  2. Turning off motion blur for Matrix/Cloner objects with RS Object tags.
  3. Keeping render samples low: I kept mine at the default, with Automatic Sampling set to a threshold of 0.03, and denoised with Neat Video in post-processing.
  4. Rendering only every other frame and using “RIFE-App” to interpolate the missing frames (see the sketch below).
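Technique 4 amounts to doubling the frame step and letting the interpolator restore the missing frames afterwards. Below is a minimal Cinema 4D Python sketch of one way to set this up locally; a render farm submission may handle the frame step differently, so treat this as an illustration rather than the project’s actual setup.

```python
# Sketch: render only every other frame by setting Frame Step to 2 in the
# active render settings, then bring the sequence back to full frame rate
# in post with 2x interpolation (e.g. RIFE-App). One possible local setup;
# a render farm submission may expose its own frame-step option instead.
import c4d

def main():
    doc = c4d.documents.GetActiveDocument()
    rd = doc.GetActiveRenderData()
    rd[c4d.RDATA_FRAMESTEP] = 2          # render frames 0, 2, 4, ...
    c4d.EventAdd()

    fps = doc.GetFps()
    start = rd[c4d.RDATA_FRAMEFROM].GetFrame(fps)
    end = rd[c4d.RDATA_FRAMETO].GetFrame(fps)
    rendered = len(range(start, end + 1, 2))
    print(f"Rendering {rendered} of {end - start + 1} frames; "
          f"2x interpolation restores the full {fps} fps sequence.")

if __name__ == '__main__':
    main()
```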

Note:

Technique 3 resulted in relatively noisy renders, which I then imported as image sequences into Premiere Pro and denoised with Neat Video 5. I left most of the Neat Video settings at their default and “automatic” values, but it is important to right-click on the Premiere Pro viewport and make sure the Playback Resolution is set to “Full”; this ensures that Neat Video samples the renders at full resolution. The Neat Video user interface is easy to navigate, but I recommend checking out the official tutorials to get the most out of the software. Technique 4 worked for the first half of this particular project but did not work for some of my other projects, so make sure to do test renders before proceeding with the final renders.

Figure 27.  Premiere “Playback Resolution”
Figure 28.  Neat Video

For the final touches, I used Red Giant Universe’s Heatwave in After Effects to add heat distortion to the fire, and Magic Bullet Looks in Premiere for color correction, chromatic aberration, and film grain. The resulting look was reminiscent of movies filmed with detuned 70s lenses, adding a layer of nostalgia and enigma to the final aesthetic.

Figure 29.  Applying Heatwave effect
Figure 30.  Magic Bullet Looks
Figure 31.  Raw render
Figure 32.  Final still with heat distortion & Magic Bullet Looks

Conclusion

In conclusion, the creation of “Masquerade” was a challenging but rewarding journey of self-learning and artistic exploration. The process of bringing this character animation project to life allowed me to develop my skills in storyboarding, character design, character animation, and rendering.

This project was created amid the rise of text-to-image artificial intelligence (AI) systems, which have the potential to disrupt the field of visual arts and threaten the livelihood of artists. While these technologies offer new possibilities for automation and efficiency, they also raise important questions about the role of creativity and human expression in an increasingly automated world. In the face of these challenges, I believe that it is more important than ever to celebrate and support the work of artists and creators.

By sharing my own process and insights through this article, I hope to inspire others to pursue their own artistic endeavors and to value the unique perspectives and talents of human artists. Despite the advances of AI, we must remember that it is through the passion and dedication of artists like ourselves that the world can continue to be enriched by the beauty and complexity of human creativity.

Learn more:

• Kay John Yim’s personal site https://johnyim.com/

• Kay John Yim’s ArtStation https://www.artstation.com/johnyim

• Character Creator https://www.reallusion.com/character-creator/download.html

• iClone https://www.reallusion.com/iclone/download.html
