Sowl Studios uses Reallusion Production Pipeline to Create Animations that Teach Kids about Their African Heritage


During this period of quarantine and lockdown due to the coronavirus pandemic, parents need fun and educational activities to keep their kids engaged and entertained more than ever.

Sowl Studios, a startup art and animation studio based in Alexandria, Virginia, is harnessing the power of Reallusion’s iClone software to create animated app content for African parents in the diaspora. The aim is to help their kids learn more about their cultural heritage and languages through animation and storytelling. The studio is run by a husband-and-wife team, Solomon W. Jagwe and Kim Jagwe (COO), who are currently working from home.

They are combining motion capture and hand-keyed animation to bring their stories to life, relying on powerful and seamless iClone plugins and profiles such as Motion Live, LIVE FACE, Perception Neuron, Faceware, and the iPhone X profile.

Below, Solomon W. Jagwe, creator and director of The Adventures of Nkoza & Nankya, explains the pipeline behind the show.

Solomon W. Jagwe: As a startup animation studio with a small animation team, we are constantly looking for efficient ways to execute the animation for our children’s series. Discovering and using iClone has been an exciting and amazing adventure, which has saved us countless hours in our production of Nkoza & Nankya.

We start our animation process by importing our custom characters, which are modeled, textured, and rigged in Maya and 3ds Max, into 3DXchange. We use 3DXchange to configure and connect all the morphs for the body and face to match those in the Expression Editor.

We then send our custom character to iClone as an avatar, ready for facial and full-body animation.

For the facial animation, we use several of iClone’s powerful facial animation plugins that are part of the Motion Live interface. We use Motion Live with the iPhone X profile, as well as Faceware’s Live plugin, to capture the actor’s facial animation. My wife plays a major part as a performance actor, and so does our daughter, who is the voice of little Nankya.

This plugin takes advantage of the iPhone X’s depth-sensing camera to drive the morphs on the characters. This saves us so much time because it gets us much closer to the finished animation, which we then refine with keyframe animation. iClone offers a powerful animation editor through the Curve Editor plugin.

We also use the Face Puppet to add subtle edits, layering on top of the cleaned mocap animation data. This tool is very useful in refining things like blinks and offers a real-time recording option that one can use to layer additional facial animation.

Another fun animation tool we use in iClone is the Face Key, which allows us to refine parts of the face like smiles, cheek puffs, eyebrow raises, and so much more.

For the full-body animation, we use Noitom’s Perception Neuron Mocap suit and the iClone Perception Neuron plugin in Motion Live. This suit is wireless, which gives the actor freedom of motion and works very well with iClone. The data is pretty stable and requires little to no editing.

We use both the 2.0 version and the Pro version. The 2.0 version includes finger tracking, which saves us countless hours: where we would otherwise spend hours hand-animating the fingers, the capture data gives us a very good foundation along with the performer’s full-body motion.

Working from home means we have to use our living space for the performances. Having a wireless motion-capture suit has made it easier to capture multiple takes for the animation we end up using on our cartoon characters.

We capture the performances using Noitom’s Axis Neuron software, which connects to iClone through a live plugin that records the data in real time. We can choose to record in real time while iClone is running, or simply record the session in Axis Neuron.

The advantage iClone offers is that we can see, in real time, how the mocap data from the suit and performer is driving the character within the environment inside iClone. This makes it much easier to make adjustments on the spot.

Once the animation is done, we export it to 3ds Max and Maya as an FBX file for scene setup, lighting, and rendering with the Octane render engine. We also use Maya and 3ds Max for additional animation such as cloth simulation. We render out the videos and prepare them as compressed clips for inclusion in our Nkoza and Nankya app for iOS and Android.

The content is made accessible globally through the app, which is available on both the App Store and Google Play, as well as through our YouTube channel and Facebook page.

We are very grateful to Reallusion, Noitom, and Faceware for creating such a seamless suite of animation tools, which have made a huge difference in our production workflow. We have received many encouraging messages from parents around the world who are at home with their children, sharing how much the show has made a difference in keeping their kids engaged, learning, laughing, and dancing to the Nkoza & Nankya clips in the app and on our YouTube channel.
