
3D Render Engines for iClone: Choosing The Best (Part 1)


Comparison of Omniverse, Unreal Engine 5, and Maya V-Ray for 3D Real-time Animation

Introduction

José Tijerín is a digital illustrator and 3D sculptor who has created the video game “Dear Althea”, to be launched on Steam. His “We’re Besties” content pack is currently on sale in the Reallusion Content Store.

In this tutorial, José shares his workflow with Character Creator and iClone for getting fantastic results in the Omniverse, Unreal Engine 5, and Maya V-Ray render engines. He then compares the results and gives his personal opinion on each one. This tutorial will help users discover new possibilities for showcasing their 3D sculpting and animation work.

Prepare A Character

The first step would be to create a character in Character Creator. However, I’m going to skip this step because I have already written an article, with a tutorial video, on how to create a cartoon-style character.

Now we are going to add movements to this very same character. In order to bring it to life, we can take advantage of ActorCore’s fantastic motion library.

Once we have selected the desired character animations, we just have to apply them to the character and adapt them to the scene we want to represent in iClone. In my case, I selected an animation of a cook stirring the contents of a pan and another motion for taste-testing with a spoon. To make the 3D animations fit the scene, I had to attach the spoon to one hand, fix the other hand to the pan handle, and adjust all of the movements to fit the character I was working with. Finally, I animated the character’s gestures and eyes to bring her to life, which is, without a doubt, the most fun part of the whole process.

NVIDIA Omniverse

Let’s start with the NVIDIA Omniverse platform, an application that is easy to download and completely free to use: just register an account on the NVIDIA website and download the program. I’m adamant that everyone should try this software at least once because of how simple, useful, and fun it is.

Once we have it installed, go back to the open project in iClone and look for the USD export button at the top of the program. In the pop-up window that follows, you can configure various aspects of the export, but in our case the default settings will work as they are. Next, click on the Export button to take the whole project along. We could click the Send to Server button to send it directly to Omniverse, but in my case, I’ll just work off of a scenario already prepared in the program.

Export an iClone project to NVIDIA Omniverse (image credit: José Tijerín)

Now, we will export our files to a folder and open the NVIDIA Omniverse Launcher if it is not already open. The Omniverse Create application can be found and launched from the Library menu.

For my project, I just need to load a scene into which the character will be introduced. To do this, go to the File menu and click on the Add Reference option. Don’t worry if the character comes in with whitish skin; thanks to iClone’s intelligent export system for Omniverse, all of the skin texture parameters will be adapted to the program.
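For context, the Add Reference operation is standard USD composition: the exported character layer gets referenced into the open stage. The sketch below writes a minimal .usda layer by hand to show what that reference looks like on disk. The file names (kitchen.usda, character.usda) are hypothetical examples, not the actual files iClone’s exporter produces.

```python
# Sketch: what a USD reference looks like in plain .usda text.
# File names are hypothetical; iClone's exporter writes its own layers.
scene = """#usda 1.0
(
    defaultPrim = "Kitchen"
)

def Xform "Kitchen"
{
    # Equivalent of Omniverse Create's "Add Reference": pull the
    # exported character layer into this stage as a child prim.
    def Xform "Character" (
        references = @./character.usda@
    )
    {
    }
}
"""

with open("kitchen.usda", "w") as f:
    f.write(scene)

print("wrote kitchen.usda referencing character.usda")
```

Because .usda is plain text, you can always open the exported files in a text editor to check which layers a stage is pulling in.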

To apply textures to a model in Omniverse, we first have to select the model, by clicking either the 3D model itself or its name in the list on the right, and apply a new material. Afterward, you’ll notice that the lower-right window is filled with the material properties applied to the model. These properties become editable after clicking on the material symbol.

Apply textures in Omniverse Create (image credit: José Tijerín)

Applying the textures to the material is very simple: we only have to look for the folder with the textures in the Content window, drag the textures onto the different sections that make up the material, and configure it in an intuitive way.

On other occasions, as in the case of cartoon-style hair, we’ll need a special material. To apply a different material to an object, we have to right-click on it. In the floating menu that appears, click on the Create menu and, inside it, look for the Materials menu. For this example, I’ll choose one of the predefined Presets of the OmniHair material.

Character hair creation (image credit: José Tijerín)

If you run into problems with the 3D animation, such as objects in Attach mode not moving correctly, you should go back to iClone and bake the animation. To do this, go to the timeline. In the CC3 Base Plus section, go to the Collect Clip track and select the entire animation track. Once it is selected, right-click and choose Add MotionPlus to Library from the pop-up menu. Then find the custom animation in the content menu and apply it to the character. If the facial expressions don’t work, you will have to update the programs and re-export the character.

The Three Different Render Modes in Omniverse

The next topic is the Omniverse render modes, which are really impressive. If we go to the top of the viewport, we can see the lightbulb symbol. Click on it to see the three types of rendering that Omniverse has to offer:

  • RTX Real-Time is the one we have been seeing so far, which corresponds more or less to the traditional lighting of video games.
  • RTX Path-Traced is the strong point of the program.
  • Iray Photoreal is the most accurate of the group, but it is slow and computationally heavy.

One of the main differences between the rendering modes is the rendering time. For my project, the scene is complex in terms of lighting and textures, so to avoid noise in the video, the render times become quite long compared to the incredible speed that Omniverse usually offers.

Omniverse render mode: RTX Real-Time (image credit: José Tijerín)

The RTX Path Traced result is excellent, with a rendering speed so fast and a quality so high that you can work on the project while it renders without a problem. This gives you the ability to tweak the scene without having to wait for the render to update, something that can speed up production considerably.

Omniverse render mode: RTX Path Traced (image credit: José Tijerín)

Finally, Iray Photoreal is NVIDIA’s bid to become one of the major rendering engines dedicated to the big screen. Compared to other industry-standard render engines, it can be plagued by small shader problems in certain situations. However, judging by the color transmission present in the left-over food, it tends to give better results than the other modes and can be quite impressive for some scenes.

Omniverse render mode: Iray Photoreal (image credit: José Tijerín)

As you may have noticed, unlike in the other modes, the rendering of translucent materials does not work by default. To make it work, we have to go to the Render Settings section of the Iray menu and find the Caustic Sampler option. When we activate it, we get a result that is quite impressive, even for a very simple scene.

Comparison of render modes of Omniverse (image credit: José Tijerín)

Render with RTX Path-Traced 

However, given the complexity of the operations required to achieve this result, the time this mode needs to completely remove noise and reach high definition is very long compared to the other modes. So I recommend working in the intermediate mode for good, responsive results.

Now that we have chosen a render mode, let’s see how to go about rendering a scene. If we go to the Render Settings menu, we can find the Path Tracing rendering menu. Inside it are two important sections: Path-Tracing and Denoising. To explain these parameters, we are going to render a complicated material, like the predefined material for honey.

Comparison of render modes of Omniverse (image credit: José Tijerín)

In the first section, we find the Total Samples per Pixel parameter. The higher the number, the greater the number of samples and the higher the quality, but at the cost of longer render times.

The second section is the Denoiser, a system that many 3D programs have adopted in recent years. It detects the noisy areas in the render and smooths them out. However, bear in mind that the more samples the denoiser has to work with, the better its results will be.
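The effect of Total Samples per Pixel can be illustrated with a toy Monte Carlo estimate: a path tracer averages many random light samples per pixel, so pixel noise falls off roughly as 1/sqrt(N). The sketch below is purely illustrative and is not Omniverse code.

```python
import random
import statistics

def pixel_noise(samples, trials=200):
    """Noise (standard deviation) of a toy path-traced pixel that
    averages `samples` random light samples. Illustrative only."""
    rng = random.Random(123)  # fixed seed for repeatability
    estimates = [
        sum(rng.gauss(0.5, 0.2) for _ in range(samples)) / samples
        for _ in range(trials)
    ]
    return statistics.stdev(estimates)

# More samples per pixel -> less noise, roughly as 1/sqrt(N),
# which is exactly why higher sample counts cost so much render time.
for n in (1, 16, 256):
    print(f"{n:4d} samples -> noise ~ {pixel_noise(n):.4f}")
```

This 1/sqrt(N) falloff is also why the Denoiser is so valuable: it recovers a clean image from a modest sample count instead of forcing you to quadruple the samples to halve the noise.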

Now that we have the basics set up, let’s render the animation by going to the Rendering menu and opening Movie Capture from the drop-down menu. In this window, we can configure parameters such as frame rate and resolution. Just select the option to Use Current (RTX Path Traced) and set the number of samples per pixel. When we are done, we just have to assign a destination folder for the output images and hit the Capture Sequence button.
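Before hitting Capture Sequence, it is worth estimating the size of the job: the frame count is simply duration times frame rate, and path-tracing cost grows roughly linearly with samples per pixel. The numbers in this back-of-envelope sketch are hypothetical; measure a single frame of your own scene to calibrate it.

```python
def estimate_render(seconds, fps, spp, sec_per_frame_at_64spp=5.0):
    """Back-of-envelope estimate for an offline Movie Capture job.
    All timing numbers are hypothetical; benchmark your own scene."""
    frames = int(seconds * fps)
    # Path-tracing cost scales roughly linearly with samples per pixel.
    sec_per_frame = sec_per_frame_at_64spp * (spp / 64)
    return frames, frames * sec_per_frame

frames, total_seconds = estimate_render(seconds=10, fps=30, spp=256)
print(f"{frames} frames, roughly {total_seconds / 3600:.1f} hours")
```

Running a quick estimate like this before capturing helps you decide between a high sample count and leaning on the Denoiser with fewer samples.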

The final result is quite good and we have obtained it in a very short amount of time compared to traditional renders.

Comparison of render modes of Omniverse (image credit: José Tijerín)

Unreal Engine 5

Downloading and installing Unreal 5 is completely free, although we must have enough space on our hard disk and some prior knowledge of 3D programs to learn the engine.

Once Unreal is downloaded and installed, we are going to import the animations from iClone. To do this it is advisable to put both programs (iClone and Unreal) side-by-side on the same screen to optimize workflow. Once we have opened both programs, we must go to the Unreal Settings menu and click on the Plugins menu. In the pop-up window, type “Live” in the Search section to check if Live Link and iClone Live Link plugins are activated. 

You’ll notice that the new Live Link menu resides next to the Details menu. If we click on the Source button in this menu, we can access the drop-down menu and select the iClone Live Link option and then click Ok to accept the port number.

 iClone Live Link with Unreal 5 (image credit: José Tijerín)

iClone Unreal Live Link

After everything is ready on the Unreal side, we’ll go back to iClone for a moment to make the connection. To do this, we need to open the Unreal Live Link menu from the Plugins menu. After clicking on it, you should see the Unreal Live Link menu with the options for Transfer and Link. Make sure the Transfer option is selected, then go to the bottom of the window and click on Transfer File.

After waiting a few minutes, we’ll have the imported character with all of its materials already configured. Next, we’ll need to link the animation to the transferred character by going back to the Unreal Live Link window in iClone and clicking on the Link button. Again, we have to go to the bottom of the window and click on the Active Link option. If you come across problems with the linking process, you can click on the character and go to the Details window to diagnose the errors. Under the Animation section, make sure Animation Mode is set to Use Animation Blueprint.

Recording Unreal Animation

Now it’s time to record character animation in Unreal. To do so, go to the Sequence Recorder window where we will introduce the models we want to record into the Actors list. Once this is done, simply press the Record button and, when the countdown is over, press the Play button in iClone. When the animation has finished, we have to press the StopAll button in Unreal.

When we do this, a pop-up window will appear at the bottom right of the screen so that we can open the sequence directly. Now that the iClone portion of the process is complete, we can close the application to focus on the Unreal scene. As you can see, in addition to the models we have imported, the models linked to the animation sequence now appear in the scene. We can remove them from the scene so that they don’t get in the way while we work. 

If there are textures that you forgot to apply in iClone, you can still apply them to the materials in Unreal. If we press Ctrl+Spacebar, we can access the project content menu. Just import the textures into a folder and create a material. Double-click on the material to access the Material Graph, a graph view where you can drag the textures in and connect them to the appropriate channels.

Record animation in Unreal (image credit: José Tijerín)

Once we have connected the textures, we just have to save the material and drag it to the Materials section of the corresponding element of the model. Now that we have all the materials ready and the scene prepared, we are going to render the video. In the Sequencer window, we have to click on the film slate button to access the Render Movie Settings menu, where we can configure parameters such as the output format or the frame size. When we have finished configuring the parameters, we just have to give it a destination folder and click on the Capture Movie button.

The result is really amazing for a video game rendering engine thanks to the revolutionary Lumen technology.

Unreal 5 render result with Lumen (image credit: José Tijerín)

If we remove Lumen, the same scene would look like the image below, because the light would not bounce off of the walls in the room. Without a system that bounces light into the shadows, the light contrast is too abrupt, even if the image is later retouched in post-processing.
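The difference global illumination makes can be sketched numerically: with direct light only, a point in shadow receives nothing, while each diffuse bounce off a lit wall leaks some energy into the shadow. The numbers below are made up for illustration; this is not how Lumen actually computes lighting.

```python
def bounce_light(bounces, wall_albedo=0.6, reach=0.25):
    """Indirect light reaching a shadowed point after diffuse wall bounces.
    All coefficients are hypothetical; this only illustrates the idea of GI."""
    energy = 1.0     # direct light hitting a nearby wall
    indirect = 0.0
    for _ in range(bounces):
        energy *= wall_albedo       # absorption at each bounce
        indirect += energy * reach  # fraction scattered toward our point
    return indirect

print("shadow brightness, no bounces :", bounce_light(0))  # pitch black
print("shadow brightness, 1 bounce   :", bounce_light(1))
print("shadow brightness, 2 bounces  :", bounce_light(2))
```

With zero bounces the shadow stays pure black, which is the abrupt contrast seen in the render above; even one or two bounces lift the shadow enough to read as natural light.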

Unreal 5 render result without Lumen (image credit: José Tijerín)

Continue reading in Part 2…
