How to animate 3D characters as a music video audience

This article is featured on Creative Bloq:

Discover how director Solomon Jagwe transformed ActorCore’s 3D people into an all-singing and dancing music video audience.

(Image credit: Solomon Jagwe)

Solomon W. Jagwe is the co-founder of Sowl Studios and the creator and director of The Adventures of Nzoka and Nankya, an app and animated series that helps children and adults learn Luganda and other Ugandan languages. Jagwe has a passion for writing stories and creating 3D animated short films and cinematics, with his most recent endeavours seeing him use Unreal Engine and iClone to create his content.

Jagwe’s latest project saw him transform ActorCore’s 3D people into an all-singing and dancing crowd animation for a music concert. Here he reveals how…

The challenge

“I was super excited when Reallusion released the first batch of the ActorCore Characters, mainly because I had this vision of creating a music video featuring a MetaHuman lead singer with a fun audience dancing to the song he was performing.”

“My idea was inspired by the Bob Marley track, No Woman No Cry. I wanted the video to feature a group of non-MetaHuman characters attending a concert, with a song encouraging them not to worry or cry because they weren’t MetaHumans. So I wanted to play on that idea, and have fun with the line, ‘No MetaHuman, No Cry’.”

The final result

Before we get into how Jagwe created the video, let’s take a look at the final result showing ActorCore characters as a 3D audience:

The characters that you see in the background are all from the ActorCore series, animated in iClone using ActorCore mocap and Perception Neuron Studio mocap suit data, and rendered in Unreal Engine.

Creating the song

“The first step to creating the final video (above), was to come up with the ‘No MetaHuman, No Cry’ reggae song. At first I tried to create the song myself, but it didn’t have quite the sound I was going for. 

“So I partnered with a very talented Ugandan artist, Andereya Baguma, whose awesome music I chanced upon when doing some research on YouTube. I wrote the lyrics, and then sent him a recording of myself doing a rough beatboxing version so he could understand the idea. He did such an awesome job with the production, creating an unbelievably cool track where he sang the lyrics and matched them up perfectly.

“Once the song was finished, I was able to turn my attention to bringing the music to life. The time had come to create a 3D audience – and that’s where the ActorCore characters came in.”

Creating the 3D audience

“To gather the characters for the music video audience, I first logged into my Reallusion account. Then, when I typed the ActorCore website address into my browser, I was automatically logged into the ActorCore catalogue. This is a key step if you want to browse and purchase the ActorCore characters.

“The ActorCore catalogue offers a wide range of beautifully diverse, textured and fully rigged characters. I started by choosing the Party People Vol. 1 and 2 and the Casual People Vol. 1 packs.

“Probably the coolest thing about ActorCore is that it allows you to select characters you like, then preview them and test the different mocap animation files right there in the browser. You can also download the characters in several file formats.”

“In the top right-hand corner of the ActorCore interface, there is an Info tab, which, when selected, shows more details about the chosen character. The ActorCore photo-scanned characters are between 7K and 10K polygons, which makes them perfect for a crowd animation setting. I was so relieved when I saw that because I needed my audience to have at least 30 characters.
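As a quick sanity check on those numbers, the total polygon budget for the crowd is easy to estimate. The 7K–10K per-character range and the 30-character minimum come from the article; the script itself is purely illustrative:

```python
# Estimate the total polygon budget for an ActorCore crowd.
# Figures (7K-10K polys per character, 30 characters) are from the
# article; this calculation is just an illustration.
ACTORS = 30
POLYS_LOW, POLYS_HIGH = 7_000, 10_000

low_total = ACTORS * POLYS_LOW    # best case
high_total = ACTORS * POLYS_HIGH  # worst case

print(f"Crowd budget: {low_total:,} - {high_total:,} polygons")
# A 210K-300K polygon crowd is well within real-time budgets.
```

Even at the high end, the whole audience costs roughly as much as a single film-resolution hero character, which is why these scans suit crowd work.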

“Another really important vision for my video was the characters being able to sing along to the ‘No MetaHuman, No Cry’ song. So I was pleasantly surprised to find out that the ActorCore characters were already prepared and rigged for facial animation.

“My plan was to edit the body and facial animation of the characters in iClone, so I chose the iClone download option for all the characters I needed. The iClone option makes it possible to download your character selection into your Smart Gallery.”

“Once downloaded, the characters become available for pulling into the iClone interface for facial and body animation, so I went ahead and loaded all of the ones I wanted.”

Animating the characters’ bodies

“I needed some of the characters in the audience to dance in rhythm to the song, and for that I turned to my Perception Neuron mocap suit. I suited up, played the song, and acted out the full song while recording in Axis Studio. I did multiple takes so I could have some variation in the background characters waving their arms in the air.
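The idea behind recording multiple takes is to break up the crowd's uniformity: each background character gets a different take and a slightly different start time. Jagwe did this interactively in iClone; the sketch below is only a hypothetical illustration of that assignment logic (character and take names are invented):

```python
import random

def assign_takes(characters, takes, max_offset_frames=24, seed=7):
    """Give each background character a mocap take and a small start
    offset so the crowd doesn't move in lockstep. Illustrative only -
    in practice this is done by hand in the iClone timeline."""
    rng = random.Random(seed)  # fixed seed -> reproducible layout
    plan = {}
    for name in characters:
        plan[name] = {
            "take": rng.choice(takes),                 # which recording
            "offset": rng.randrange(max_offset_frames) # desync start
        }
    return plan

crowd = [f"actor_{i:02d}" for i in range(30)]
takes = ["wave_take_01", "wave_take_02", "wave_take_03"]
plan = assign_takes(crowd, takes)
```

A fixed seed keeps the crowd layout reproducible between renders, which matters when you need to re-export after a fix.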

(Image credit: Solomon Jagwe)

“For the foreground ActorCore characters, I relied on the ActorCore Nightlife mocap pack. The files are nicely looped, which works very well during playback in Unreal Engine. After I created the mocap animation files in Axis Studio, I recorded the motion capture onto a stand-in character in iClone, using Motion LIVE and the Perception Neuron plugin.”

“I saved the iClone mocap files from the Perception Neuron suit session into the Motion folder so I could repurpose them and apply them to each character. I then created an entire scene, set up to mimic what the set was going to look like in Unreal Engine, and applied mocap data to each character.”

“I was also able to take advantage of the 3D animation tools in iClone to adjust the Perception Neuron data. This gave me the ability to alter the height and spacing of the mocap data for the characters waving their arms in the air.”

Animating the characters’ faces

“For the facial animation, I used an iPhone X together with the Live Face plugin in Motion LIVE to record myself singing the ‘No MetaHuman, No Cry’ song from beginning to end.

“I played back the stand-in character with the dancing mocap while I was recording the facial motion capture. For now, I had to use the Default facial profile instead of the ARKit version for the ActorCore characters; however, Reallusion said they will make ARKit profiles available in early 2022.”

“I saved the edited facial motion capture along with the body animation into the MotionPlus folder, so I could use it on all the other characters in the audience.”

Sending the ActorCore Characters to Unreal Engine

“After perfecting the crowd’s 3D animation of the ActorCore characters in iClone, I proceeded to export them all as FBX files with animation enabled, covering the full range of the timeline.

“I made sure I had the CC Setup/Autoconvert plugin installed in Unreal Engine, so the materials and character rigs would be formatted properly once imported into the software.”

(Image credit: Solomon Jagwe)

“After successfully importing all the characters into Unreal Engine, I moved them into their different locations to face the main MetaHuman lead singer on the stage. I then created a level sequence and dragged all the characters onto the timeline so they could line up with the sound track.

“All the animations were tested to make sure they were working properly in Unreal Engine 4.26. I then moved to Unreal Engine 5 as I wanted to take advantage of the beautiful lighting offered by Lumen in the software.”
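Positioning thirty characters to face a single performer is mostly a matter of arc geometry. Jagwe placed his audience by hand in the Unreal editor; the following is only a hypothetical sketch of that layout step (all parameters are invented, units arbitrary):

```python
import math

def crowd_placements(n, stage=(0.0, 0.0), radius=600.0,
                     arc_degrees=120.0, row_gap=150.0, per_row=10):
    """Place n audience members in concentric arcs facing a stage
    point, returning (x, y, yaw_degrees) per character. A geometry
    sketch only - the article's layout was done by hand in Unreal."""
    placements = []
    for i in range(n):
        row, slot = divmod(i, per_row)
        r = radius + row * row_gap          # each row sits further back
        # Spread slots evenly across the arc, centred on the stage.
        t = slot / max(per_row - 1, 1)      # 0..1 across the arc
        angle = math.radians(-arc_degrees / 2 + t * arc_degrees)
        x = stage[0] + r * math.sin(angle)
        y = stage[1] + r * math.cos(angle)
        # Yaw so the character looks back toward the stage.
        yaw = math.degrees(math.atan2(stage[1] - y, stage[0] - x))
        placements.append((x, y, yaw))
    return placements

spots = crowd_placements(30)
```

Computing placements like this, rather than scattering characters randomly, guarantees every audience member has a sightline to the singer.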

(Image credit: Solomon Jagwe)

“After setting up all the lighting in the scene, I was able to do a test to see if the animation was still working. When I saw the ActorCore audience dancing away in realtime in front of the MetaHuman lead singer, I was super-thrilled! All my hard work had paid off.

“The ActorCore characters looked awesome in the Lumen lighting, and the facial and full-body animation worked very well while the song was playing in Unreal Engine. I was really impressed to see all this come together from ActorCore, to iClone, and then to Unreal Engine.”

(Image credit: Solomon Jagwe)

The final render

“Once happy with the entire animation, I added the sequence to the Movie Render Queue and rendered out the final animation at 1920×1080. The render speed blew my mind and the final result speaks for itself.”

(Image credit: Solomon Jagwe)

“I am extremely grateful to Reallusion for creating a brilliant library of low polygon yet fully rigged ActorCore characters. It means filmmakers and storytellers like myself have access to an awesome resource for easily creating background characters for our projects.

“I highly recommend the ActorCore characters for any artist working on similar projects. So go ahead and give them a try. Reallusion offers a number of free versions for you to try before you buy. Check out the amazing new characters and mocap files in the ActorCore Library.”
