Reallusion tools help bring the 2049 TV drama to the Netflix streaming platform

Xanthus Animation Studio
Established in 2004 in Taiwan, Xanthus Animation Studio focuses on producing high-quality film and TV content for clients worldwide. Led by supervisor Bruce Yao, the team handles both animation conceptualization and execution, and has quickly become the leading animation studio in Taiwan.
Xanthus’ core business includes original animation development, high-end animation, and VFX production. Their contracted project “All is Love” won a Golden Bell Award in 2017, the most prestigious honor for a Taiwanese television production crew. Besides developing their original IP animation project “Yameme”, Xanthus has also delivered several iconic projects, including 2049 and 2049+ Voice of Rebirth, as well as the animated mascot for the 2017 Taipei Summer Universiade.
About 2049
2049: The Fortune Telling is a Taiwanese, made-for-Netflix science fiction thriller series. The story centers on a family in 2049 whose mundane existence is disrupted by a sudden pregnancy. In this future setting, big data is used to predict one’s fate, and the unborn child is predicted to have a high probability of becoming a criminal, leading to a series of dramatic events. Besides conventional VFX, the series’ crew used Reallusion’s Character Creator (CC) and iClone (IC) 3D character animation software for the first time to quickly create high-quality animation for the main character, Xiao Xu. In this article, we talk about how the production team overcame various obstacles and share some of their production secrets.

3D avatars help filmmakers break the limits of reality
For productions large and small, from shorts to feature-length films, there is often a need to express surreal, larger-than-life concepts. Yet, perhaps due to the age or personality of the characters, translating the spirit of the script or realizing the screenwriter’s vision can be a challenge. With the latest advancements in animation software, however, creating realistic virtual 3D character animation is no longer an insurmountable feat. Reallusion continues to innovate intuitive, high-quality real-time character animation tools, and to this day remains the prime choice for several professional animation studios. Reallusion’s 3D character animation software helps bring screenplays to life by removing roadblocks and saving precious production time. A perfect example is 2049: The Fortune Telling, a Netflix series based on a high-quality script. Read on to find out more about Reallusion’s solution for creating the show’s virtual actor.
Q: Hello Bruce, and welcome to our Reallusion Feature Stories. Kindly tell us about Xanthus Animation Studio and the 2049 TV Series on Netflix.
We are Xanthus Animation Studio, and we have been in operation for 15 years, specializing in the production and development of 3D animation, with contributions to many films and TV shows. It’s been 6 years since we started working with Reallusion, and lately, we’ve benefited most from using CC to quickly create virtual actors, combined with motion capture technology, to achieve realistic film-quality special effects.
In the course of our partnership, Reallusion has helped us become an outsourcing partner for Japanese animation, produce our very own animated series, and create virtual characters. In the Netflix series 2049, we created Xiao Xu, the digital infant, using textures from Character Creator’s SkinGen plugin along with various modeling tools and features.

“In the process, we’ve shortened the production period and delivered a product that was more pleasing to our client. We’d like to share our experience making 2049: The Fortune Telling and how the 3D virtual infant and star of the show was made.”
Bruce Yao, Supervisor at Xanthus Animation Studio
Q: Why was it necessary to use a 3D virtual infant to act out the scenes?
Casting children involves many pain points; this is especially evident in Hollywood, where 90 to 95 percent of the work for children is given to twins and triplets. Making 2049 was particularly problematic, as it was nearly impossible for a child to act out a scene with a pair of scissors in a safe manner. Furthermore, the story required the parent to physically abuse the child in a fit of rage. It’s obviously not feasible to put children in danger, so we had to rely on CGI to create these shots.

Besides this, we had to consider that a child actor’s acting skills are limited, and 2049’s script called for scenes that may be too difficult for an eight-month-old to perform, like having Xiao Xu stand up to kiss his older sister. Instead of using a real child, we needed to create a 3D virtual infant to simplify and round out the production process.

Q: How did the production team use Character Creator to make Xiao Xu?
We used the Headshot plugin in CC to quickly create Xiao Xu’s 3D model from photographs, giving us more time to work on the character’s silhouette and appeal—several times faster than modeling from scratch!
While sculpting the character, we were able to preview various motions, performances, and expressions thanks to the non-destructive workflow of the Headshot face maker tool, which preserves the character’s bone rig and realistic skin textures throughout. The finished CC character can be used directly in iClone without the need to transfer files or adjust materials. All we had to do was apply mo-capped expressions to the animated character to get a performance, saving us a lot of time and effort.

Q: Once the character is finished, how do you go about making him move naturally with iClone?
In preparation for the character’s final performance, we had to bind the character to a complete skeletal system, create a full set of facial expressions, and so on.

Once we had the 3D model ready, we sent Xiao Xu directly into iClone and used the Face Puppet tool to drive the expressions and create realistic facial animations. In combination with the Motion Live plugin and iPhone’s facial mo-cap hardware, we received instant feedback from our live performances—again, saving us a lot of time in production.

Feedback from the production house – Greener Grass Production
Here at Greener Grass Production, we were very happy with the finished product once we reviewed it; the results were much better than expected! Our biggest concern was the virtual infant: whether it would look realistic or come across as overly animated. In the end, we were pleasantly surprised to find that every little detail had been crafted with great care.
The use of new body and facial animation technologies on this project was both a learning process and a breakthrough for us. Inspired by this experience, we look forward to using more real-time techniques to unlock further possibilities in production.
