
TV Production Company CREAM creates digital double Survivorman Les Stroud with Character Creator

Andrew Macdonald – Executive Creative Producer at Cream Productions

Andrew Macdonald

A serial innovator throughout his filmmaking career, Andrew has always been the first to bring a new camera or technology to set. This led him to co-found Cream Productions’ digital department with Cream CEO David Brady in 2015. The venture has grown from a one-person incubator into an award-winning, full XR production department.

Andrew has become well versed in concept definition, interactive UX design, directing motion capture, game development, and the innovation of XR production workflows. He is currently directing an applied research machine learning project on digital humans at Sheridan’s Screen Industries Research and Technology (SIRT) centre.

Cream Productions is a Toronto-based producer of documentary television, specials, and digital media that has created over 20 pieces of original XR content, including 360 3D series, AR apps, and interactive volumetric VR, for networks, studios, agencies, and brands all over the world. They also developed SurvivormanVR, a single-player interactive VR game for Quest and PCVR that challenges players to survive dynamic, sophisticated, real-world life-or-death experiences in remote, wild, natural locations, under the watchful guidance of real-world expert and TV celebrity Les Stroud.

For SurvivormanVR, the team at Cream Productions contacted the SIRT Centre for the best pipeline to create digital doubles, which led them to the Dynamic Digital Human (DDH) system, which employs Reallusion’s Character Creator (CC) software.

“Reallusion software is affordable, gives you multiple output options, and because it’s procedural, it allows creators to iterate quickly.”

Andrew Macdonald – Executive Creative Producer at Cream Productions

Q: Greetings Andrew, and welcome to our Reallusion Feature stories. Please introduce yourself, Cream and the SurvivormanVR project.

Hi, I’m Andrew MacDonald, the Executive Creative Director at Cream Productions, a TV production company based in Toronto with many high-profile clients. Cream Digital is the skunkworks department within Cream (if you will).

Dave Brady, the CEO, decided to set up a digital department back in 2016 to keep up with future trends in media creation. We began with 360 video back when 360 video was “neat” and some of our clients were commissioning ancillary content and promotional 360 video. We produced 360 content for some major networks over the next few years: Hulu, Discovery, and Smithsonian, to name a few. We were even invited to Cannes and won a CSA (the Canadian Oscars) for “best immersive experience”. As the heyday of 360 VR video came to an end, we had already begun working in game engines and building apps, and now here we are in 2022 with several full VR games in production, SurvivormanVR being one of them.

Interestingly, the original TV series Survivorman was created by Cream with Les Stroud back in 2001.

Survivorman Les Stroud

Q: For SurvivormanVR, Cream contacted Sheridan’s Screen Industries Research and Technology (SIRT) centre, for the best pipeline to create digital doubles, which led them to the Dynamic Digital Human system (DDH). Can you explain what this pipeline is?

Yes, with SurvivormanVR we are taking a format we know well, host-driven factual entertainment TV production, and applying it to a choose-your-own-adventure-style VR experience for the Meta (Oculus) Quest. We found ourselves in a situation where we needed to create a digital double of a well-known person and have them appear and perform in a game. This presented a few challenges. Number one: we are selling a celebrity, in this case Survivorman Les Stroud, and as such we needed to ensure that the virtual reality character really represented Les. Number two: we needed it to run on the limited processing power of a Quest.

We had an initial exploratory project with SIRT to try projecting a facial performance onto a static head, like a 3D video sprite of a face, and it produced some promising results. We then decided to start an applied research project at SIRT to see if we could project an HMC facial performance onto an animated 3D face and get it all to run in sync in the game engine.

Over the course of this project at SIRT we solved a few key challenges and came up with some exciting results: a pretty accurate digital double that was lightweight enough to run on a Quest. From there we filed for two patents around the process and developed the pipeline into a full-on product with an installer.

Q: Your team wanted to see how well the DDH pipeline would translate to singing performances, which led to the creation of “Jasmine”. Can you elaborate on how Jasmine was created and the challenges along the way?

Jasmine was a breakthrough for us. We wanted to see how the DDH pipeline would perform on a realistic character that emotes through song. There were a lot of complexities to consider: the animated character has viseme activation, jaw-to-lip movement variation, vocal projection details, and an emotional component, all of which had to align perfectly to create an emotional performance.

Most importantly, we had an idea to incorporate a digital human maker, such as the Character Creator (CC3) pipeline, into ours to create a procedural, topologically consistent character production pipeline. This would eventually reduce the overall time needed to create our DDH characters for VR applications.

Using all the benefits of CC3 characters (topologically consistent vertices, identical UV layouts, skin weighting, and the standard set of blend shapes), combined with a single structured-light scan to capture Jasmine’s facial proportions, we were able to accurately attain all the facial positions and subtle skin actions required for Jasmine’s realistic singing performance. This was invaluable, as we are now able to incorporate CC3 into our pipeline and eliminate the need for a photogrammetry studio, the dozens of scans for expressions and range of motion, and the hours of cleanup that a traditional animated character pipeline requires.

By incorporating Character Creator as the digital human maker in the DDH pipeline, we were able to save a huge amount of time and money building realistic characters for VR on the Oculus Quest and beyond.

“Character Creator 3 is a one-stop companion to our DDH facial animation pipeline – CC3 does everything else, from body, to rigs, to clothing and hair. A one-stop solution!”

Andrew Macdonald – Executive Creative Producer at Cream Productions

Q: You created a digital double of celebrity Les Stroud with the DDH system. What would you say were the most difficult tasks, and how was Character Creator able to help your team with this?

One of the challenges for Les Stroud was combining body actions with facial performances. We had been focusing most of our attention on the reprojection of facial performances, and by adding CC3 to our pipeline, we got the added bonus of all the other features of the Character Creator pipeline.

Automated skin weighting for clothing and accessories allowed us to add custom clothing. InstaLOD, which combines materials and textures, allowed us to reduce draw calls on characters for VR performance. And the added benefit of integration with iClone allowed us to borrow performances from ActorCore and combine them with custom motion capture performances for our character.

Q: As digital doubles gradually take center stage in mainstream television and big-screen productions, what other digital human projects can the community look forward to from Cream Productions?

Great question. Yes, we see lots of opportunities for digital characters, not only in the VR titles we are producing but also in the linear 2D TV we produce. In the kind of TV Cream makes, there is lots of room for “recon” or “recre”: the dramatic recreation of an event. With the adoption of game engines like Unreal, there are new opportunities to create any kind of set or location we need.

It’s often referred to as Virtual Production. Sure, there is the Mandalorian kind of VP, where 3D assets are projected on walls with real actors in the foreground, but this has its limitations. There is also fully in-engine VP, where there is much more freedom. Once you take a production fully into the game engine with virtual cameras, you also need virtual people. That’s where the quick-to-produce, economical, high-fidelity digital doubles that DDH can generate really come in useful.

Follow Cream Productions:

Website:
https://www.creamproductions.com/

Facebook:
https://www.facebook.com/CreamProductions/

LinkedIn:
https://www.linkedin.com/company/creamproductions/

Instagram:
https://www.instagram.com/cream_productions/

YouTube:
https://www.youtube.com/c/CreamInc
