December 23, 2021

Reallusion’s 2021 iClone Lip Sync Animation Contest – Meet the winners!

Reallusion’s 2021 iClone Lip Sync Animation Contest launched in July 2021 with the aim of delivering the best 30-second talking animations that showcase lip-sync, facial animation, and great characters. Within three months, 468 submissions arrived from 60 countries. Contestants drew inspiration from comedy, movies, music videos, and storytelling, remaking them into detailed talking-animation performances. Besides using iClone and Character Creator software, the contest also welcomed diverse entries using different renderers and engines, including Blender, Cinema 4D, Unreal, Unity, and more.

The final results exemplified iClone’s game-changing facial animation workflow for accurate voice lip-sync, puppet emotive expressions, muscle-based face key editing, and professional iPhone facial capture. With a third of submissions using Blender and MetaHuman, the contest demonstrated that iClone is a highly compatible facial animation pipeline tool for Unreal MetaHuman and Blender users, while Replica’s AI-voice plugin made talking animations easier than ever.

“At Reallusion we are proud to see an influx of high-level artists that are adopting Character Creator and iClone into their pipelines. The contest proved the viability of using iClone’s powerful lip-syncing and facial animation tools in professional production, when time and budget are of the essence. We congratulate all who participated, and look forward to bringing more innovations to the industry.”

– Enoc Burgos, Reallusion Director of Partnership Marketing

The 2021 iClone Lip Sync Animation Contest was hosted by Reallusion and sponsored by industry leaders including NVIDIA, Epic Games, Noitom, Rokoko, Replica Studios, Pixologic, CLO Virtual Fashion Inc, FXHome, Boris FX, and Razer. Through this contest, iClone and Character Creator proved to be one of the most powerful facial and body animation pipelines for all animators.

Watch Overview Video:

WINNERS

The iClone Lip-Sync Animation Contest 2021 offered cash and prizes valued at over USD $40,000 thanks to A-list sponsors that partnered with Reallusion.

BEST CHARACTER ANIMATION CATEGORY

1st Place: Ice Queen

By Loic Bramoulle, Cinematics for indies and AA studios. Render: Blender

See their process>

“I used an iPhone XR for capturing a base performance, while simultaneously speaking over the audio clip, and video capturing my face, which allowed me to manually polish the expressions more easily while aiming for hand-animated, keyframed facial animations. The process was pleasingly fast, as the motion capture was essentially used as an animation blocking, but without all the manual work. Polishing it by hand wasn’t as fast, but it seems much more efficient overall than animating from scratch.” – Loic Bramoulle

2nd Place: Do Right, Jessica Rabbit

By Jasper Hesseling, Story-loving professional 3D Artist. Render: iClone

See their process>

“I had a blast working on this piece with the lip-sync tools in iClone. I still have much to discover, but I already learned a lot.” – Jasper Hesseling

3rd Place: Love, Rosie

By Petar Puljiz, Unreal 3D Artist. Render: Unreal

See their process>

“Thank Heaven for these animation tools, which you need to have in your pipeline, no matter if you are a professional or a beginner.” – Petar Puljiz

BEST CHARACTER DESIGN CATEGORY

Entries that showcased exceptional character design, character-setting, and voice matching.

1st Place: The Pumpkin Cave

By Varuna Darensbourg, Artist & Game Developer. Render: Unreal 

See their process>

“Using Reallusion’s tools to breathe life into my characters has been a thrilling experience. They can save time, which is a huge boon, but they work so well that it makes daunting tasks enjoyable. Using iClone AccuLips for the 2021 contest was incredible! The accuracy blew my mind and combining it with iClone Motion LIVE and the Live Face iPhone app for layering allowed me to do far more than I expected.” – Varuna Darensbourg

2nd Place: Judar

By Gergana Hristova, Freelance 3D Artist. Render: iClone

See their process>

“I never knew that animating in 3D would turn out to be so enjoyable and quick, thanks to iClone!” – Gergana Hristova

3rd Place: Animal Farm

By Jason Taylor, Videographer and Director. Render: Unity

See their process>

“iClone has been a game-changer for me. I love the ease of use for lip-syncing and the ability to fine-tune facial animations using facial keys and facial mocap. Amazing stuff. Thank you.” – Jason Taylor

SPECIAL AWARDS & HONORABLE MENTIONS

The judging process for the Lip-Sync Animation Contest 2021 proved to be quite difficult and took many hours of deliberation. Besides the top three placements in Best Character Animation and Best Character Design, Reallusion also selected 41 winners for special awards and honorable mentions. See the Winner Page for winner details and showcases.

How to animate 3D characters as a music video audience

This article is featured on Creativebloq: https://www.creativebloq.com/advice/how-to-animate-3d-characters-as-a-music-video-audience

Discover how director Solomon Jagwe transformed ActorCore’s 3D people into an all-singing and dancing music video audience.

(Image credit: Solomon Jagwe)

Solomon W. Jagwe is the co-founder of Sowl Studios and the creator and director of The Adventures of Nzoka and Nankya, an app and animated series that helps children and adults learn Luganda and other Ugandan languages. Jagwe has a passion for writing stories and creating 3D animated short films and cinematics, with his most recent endeavours seeing him use Unreal Engine and iClone to create his content.

Jagwe’s latest project saw him transform ActorCore’s 3D people into an all-singing, all-dancing crowd animation for a music concert. Here he reveals how…

The challenge

“I was super excited when Reallusion released the first batch of the ActorCore Characters, mainly because I had this vision of creating a music video featuring a MetaHuman lead singer with a fun audience dancing to the song he was performing.”

“My idea was inspired by the Bob Marley track, No Woman No Cry. I wanted the video to feature a group of Non-MetaHuman characters attending a concert, with a song encouraging them not to worry or cry because they weren’t MetaHumans. So I wanted to play on that idea, and have fun with the line, “No Metahuman, No Cry”.” 

The final result

Before we get into how Jagwe created the video, let’s take a look at the final result showing ActorCore characters as a 3D audience:

The characters that you see in the background are all from the ActorCore series, animated in iClone using ActorCore mocap and Perception Neuron Studio mocap-suit data, and rendered in Unreal Engine.

Creating the song

“The first step to creating the final video (above), was to come up with the ‘No MetaHuman, No Cry’ reggae song. At first I tried to create the song myself, but it didn’t have quite the sound I was going for. 

“So I partnered with a very talented Ugandan artist, Andereya Baguma, whose awesome music I chanced upon when doing some research on YouTube. I wrote the lyrics, and then sent him a recording of myself doing a rough beatboxing version so he could understand the idea. He did such an awesome job with the production, creating an unbelievably cool track where he sang the lyrics and matched them up perfectly.

“Once the song was finished, I was able to turn my attention to bringing the music to life. The time had come to create a 3D audience – and that’s where the ActorCore characters came in.”

Creating the 3D audience

“To gather the characters for the music video audience, I first logged into my Reallusion account; then, after typing the ActorCore website address into my browser, I was automatically logged into the ActorCore catalogue. This is a key step if you want to browse and purchase the ActorCore characters.

“The ActorCore catalogue offers a wide range of beautifully diverse, textured and fully rigged characters. I started by choosing the Party People Vol. 1 and 2 and the Casual People Vol. 1 packs.

“Probably the coolest thing about ActorCore is that it allows you to select characters you like, then preview them and test the different mocap animation files in the ActorCore catalogue, right there in the browser. You can also download the characters in several file formats.”

“In the top right-hand corner of the ActorCore interface there is an Info tab which, when selected, shows more details about the chosen character. The ActorCore photo-scanned characters are between 7K and 10K polygons, which makes them perfect for a crowd animation setting. I was so relieved when I saw that, because I needed my audience to have at least 30 characters.

“Another really important vision for my video was the characters being able to sing along to the ‘No MetaHuman, No Cry’ song. So I was pleasantly surprised to find out that the ActorCore characters were already prepared and rigged for facial animation.

“My plan was to edit the body and facial animation of the characters in iClone, so I chose the iClone download option for all the characters I needed. The iClone option makes it possible to download your character selection into your Smart Gallery.”

“Once downloaded, the characters become available for pulling into the iClone interface for facial and body animation, so I went ahead and loaded all of the ones I wanted.”
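As a rough back-of-envelope check, the figures quoted above (7K–10K polygons per photo-scanned character, an audience of at least 30) can be turned into a total crowd polygon count. The character counts and audience size come from the article; the calculation itself is just an illustrative sketch, not an engine budget from Reallusion or Epic.

```python
# Back-of-envelope crowd polygon estimate using the article's figures:
# ActorCore photo-scanned characters are 7K-10K polygons each, and the
# audience needed at least 30 characters. Nothing here is an official
# engine limit; it just shows why low-poly characters suit crowds.

def crowd_polygons(num_characters: int, polys_per_character: int) -> int:
    """Total polygons contributed by a crowd of identical characters."""
    return num_characters * polys_per_character

audience = 30
low = crowd_polygons(audience, 7_000)    # lower bound per character
high = crowd_polygons(audience, 10_000)  # upper bound per character

print(f"30-character audience: {low:,} to {high:,} polygons")
```

Even at the upper bound, the whole 30-character audience stays in the low hundreds of thousands of polygons, which is why the characters work well for real-time crowd scenes.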

Animating the characters’ bodies

“I needed some of the characters in the audience to dance in rhythm to the song, and for that I turned to my Perception Neuron Mocap suit. I suited up, played the song, and acted out the full song while recording in Axis Studio. I did multiple takes so I could have some variation of the background characters waving their arms in the air.

(Image credit: Solomon Jagwe)

“For the foreground ActorCore characters, I relied on the ActorCore nightlife mocap pack. The files are nicely looped, which works very well during playback in Unreal Engine. After I created the mocap animation files in Axis Studio, I recorded the motion capture to a stand-in character in iClone using Motion LIVE and the Perception Neuron plugin.”

“I saved the iClone mocap files from the Perception Neuron suit session into the Motion folder so I could repurpose them and apply them to each character. I then created an entire scene, set up to mimic what the set was going to look like in Unreal Engine, and applied mocap data to each character.”

“I was also able to take advantage of the 3D animation tools in iClone to adjust the Perception Neuron data. This gave me the ability to alter the height and spacing of the mocap data of the characters waving in the air.”

Animating the characters’ faces

“For the facial animation, I used an iPhone X, together with the Live Face plugin in Motion LIVE, to record myself singing the ‘No MetaHuman, No Cry’ song from beginning to end.

“I played back the stand-in character with the dancing mocap while I was recording the facial motion capture. For now, I had to use the Default Facial profile instead of the ARKit version for the ActorCore characters; however, Reallusion said they will make it available in early 2022.”

“I saved the edited facial motion capture, along with the body motion, into the MotionPlus folder so I could use it on all the other characters in the audience.”

Sending the ActorCore Characters to Unreal Engine

“After perfecting the crowd animation of the ActorCore characters in iClone, I proceeded to export them all as FBX files with animation enabled, covering the full range of the timeline.

“I made sure I had the CC Setup/Autoconvert plugin installed in Unreal Engine, so the materials and character rigs would be imported and formatted properly in the software.”

(Image credit: Solomon Jagwe)

“After successfully importing all the characters into Unreal Engine, I moved them into their different locations to face the main MetaHuman lead singer on the stage. I then created a level sequence and dragged all the characters onto the timeline so they could line up with the soundtrack.

“All the animations were tested to make sure they were working properly in Unreal Engine 4.26. I then moved to Unreal Engine 5, as I wanted to take advantage of the beautiful lighting offered by Lumen in the software.”

(Image credit: Solomon Jagwe)

“After setting up all the lighting in the scene, I was able to do a test to see if the animation was still working. When I saw the ActorCore audience dancing away in realtime in front of the MetaHuman lead singer, I was super-thrilled! All my hard work had paid off.

“The ActorCore characters looked awesome in the Lumen lighting, and the facial and full-body animation worked very well while the song was playing in Unreal Engine. I was really impressed to see all this come together from ActorCore, to iClone, and then to Unreal Engine.”

(Image credit: Solomon Jagwe)

The final render

“Once happy with the entire animation, I added the sequence to the Movie Render Queue and rendered out the final animation at 1920×1080. The render speed blew my mind, and the final result speaks for itself.”

(Image credit: Solomon Jagwe)

“I am extremely grateful to Reallusion for creating a brilliant library of low polygon yet fully rigged ActorCore characters. It means filmmakers and storytellers like myself have access to an awesome resource for easily creating background characters for our projects.

“I highly recommend the ActorCore characters for any artist working on similar projects. So go ahead and give them a try. Reallusion offers a number of free versions for you to try before you buy. Check out the amazing new characters and mocap files from the ActorCore Library here.”