Pitch & Produce | EVERY SINGLE NIGHT: The Making of Vicky Rai’s Music Video with Real-time Software

This story is featured in Creativebloq.

Michael T Morrow – Director / Documentary Filmmaker


Michael T Morrow is an award-winning television director, cameraman, and editor with over 22 years of experience in documentaries, series television, music videos, and commercials.

After graduating from OCAD’s Integrated Media Program, Morrow spent his first few years directing television commercials, music videos and corporate films before self-financing his first documentary feature film, “Hot Rod, The Movie,” about small-town demolition derby drivers. The film played in festivals worldwide before airing on OLN and IFC. Morrow then worked with Summerhill Entertainment to extend the concept into a half-hour series, redubbed “Crash Addicts,” for OLN. Morrow directed and story-produced each of the 26 episodes.

Morrow has become a sought-after director, cameraman and editor of factual television featuring larger-than-life people living on the social fringe. Notable directing work includes the Gemini Award-winning “Fire Jammers”; Discovery Channel’s “Licence to Drill,” “Cold Water Cowboys,” “Nerve Center,” and “Mighty Planes”; History Channel’s “Ice Road Truckers”; and second-unit directing and editing on four seasons of “Political Blind Date” for TVO.

Morrow’s first attempt at 3D/virtual production came with the music video “Every Single Night” for Canadian artist Vicky Rai. The video enjoyed international recognition, winning Best Pop Music Video at the Munich Music Video Awards, Best Music Video at the Madras Independent Film Festival, Best VFX at the International Music Video Awards, Best Music Video Finalist at the Kosice International Film Festival, Best Music Video Finalist at the 4th Dimension Independent Film Festival, and Official Selection at the High Tatras Film and Video Festival. For this project he made extensive use of Reallusion’s character animation suite with Character Creator 3, iClone 7, and ActorCore, with the final render in Unreal Engine.

Morrow aspires to make a meaningful contribution to Canadian culture by continuing to work with top production houses and networks on programs that educate, inspire and entertain a broad audience.

“Reallusion’s one-two punch of Character Creator 3 and iClone 7 is a powerful suite of digital human creation, rigging, and animation tools. I found them intuitive to use, massively scalable, and I was able to generate very high-quality digital assets with little time and effort.”

Michael T Morrow – Director / Documentary Filmmaker

Q: Congratulations, Michael, on being part of the Reallusion Pitch & Produce program. Could you tell us how you started the Every Single Night music video project?

I appreciate that many gifted artists applied for the Pitch & Produce program, so I was humbled and honored to be chosen. The Every Single Night music video began with a call from my good friend Michael Hanson. Michael is a Grammy-nominated and multi-Juno-award-winning recording artist, composer and music producer. He asked me to listen to a track from the new album of a young artist he was producing. The artist was the gifted 15-year-old Vicky Rai. Michael sincerely believed in her talents and wanted to promote her new single in a big way. Sadly, they did not have a big budget for this, but I listened to the track and bounced around a few ideas.

My process is pretty “arty” at first; I listen to the song ten or twenty times, let the music inspire imagery, and with more and more listens, those images construct themselves into some kind of narrative. It’s not scientific, but it’s a process I’ve refined over the years and have grown to trust. Once I get to a place where I can just watch the video in my head, I tweak this “blue sky” vision to align with the budget and practicalities of production. In this case, the song inspired movement and a neo-noir lighting aesthetic, with Vicky on a motorbike racing through pink and blue neon streets.

I wrote up a little treatment with style boards and sent it to Michael and Vicky. They loved the concept and the aesthetic sensibilities and wanted to go ahead. I didn’t want to promise something I couldn’t do, so I asked them to give me a few days to research viable options to execute this. My concept demanded many visual effects, which we couldn’t afford, so it meant if I wanted to do something, I’d need to do it myself.

So I did some initial testing and research and priced out the kind of world I wanted to make, keeping in mind the limited time and cash resources we had. With the tools available in Character Creator 3 and iClone 7, combined with the world building, cost-effective assets and real-time rendering of Unreal Engine 4, this big idea seemed feasible. So I agreed to go ahead and begin shooting.

Munich Music Video Award – Best Pop Music Video:

Q: In the past you have used tools like Blender, what big differences / advantages do you feel real-time tools like Character Creator, iClone and Unreal Engine bring you?

Blender has a physically-based real-time render engine in EEVEE. The most significant benefit of these real-time tools is the instant feedback, which allows for faster iteration and refinement of ideas. The creative process always has an element of trial and error, and the faster you can see a result, the faster you can tweak that result until you call it final.

We had over 45 shots in Every Single Night, all of them special effects shots of varying complexity. I rendered each of these shots an average of six times. With each pass I’d fix glitches, enhance lighting, or add elements to improve the image’s production value. I have a pretty capable PC with a 24-core Threadripper, 64 GB of RAM, and an RTX 3090, so it can churn out frames pretty fast, even with the quality settings set very high. So the iteration time for each shot became minutes, not hours or days.

Q: What were the most significant challenges when planning to create a digital double of Vicky Rai? Can you share a little of your workflow with Character Creator?

I did a very early digital double of Vicky for an early dance test. I used Character Creator 3 and the Headshot plug-in to match her facial structure and textures. For this first test, I had only low-quality still images to work with, which generated less than favourable results. So when we did the main video shoot, we scheduled time to take high-resolution stills with a DSLR and even lighting. These high-quality stills were critical to the process.

It was apparent that the “real Vicky” and the digital double would need to wear the same clothes and have the exact same hairstyle. In a traditional workflow, one might match the digital double to the actor, creating custom clothing and hair assets. In the interest of simplifying the process where I could, I found the digital assets first: a plain white shirt, a leather jacket, pants and a high-quality hair asset in Character Creator 3. Vicky approved the look, and our make-up artist owned a leather jacket that matched very closely, so we were all set. We cloned the real Vicky from the digital double.

It just seemed like the most efficient approach at the time. Throughout the editorial process, and in feedback from viewers online, there has not been a single note about the digital double not matching the real Vicky. There is still one shot on the stage near the end where we see digital Vicky singing, and it is not 100% convincing; that bugs me every time I watch it.

Q: ActorCore proved helpful when animating your dancing CG robots for the music video. Would you recommend this online mocap library to others, and why?

To be honest, I spent far too much time doing “tests” with the robot dancers. It was just too much fun. Granted, I learned a lot about Unreal Engine Sequencer, blending animations across bones and what-not, but yeah, I spent too much time playing in that sandbox, just because it was so much fun.

“I am not a character animator, but I had very high-quality animations of very high-quality characters in real-time in a few steps. ActorCore is reasonably priced, very well organized, and the asset downloads are speedy and support the primary digital creator tools, so I ABSOLUTELY RECOMMEND ACTORCORE.”

Michael T Morrow – Director / Documentary Filmmaker

Q: With iClone, you were able to edit your motions and control characters directly in Unreal Engine. All in all, how long did it take you to learn this entire ecosystem? And what advice would you share with others who are thinking about using it?

I was learning a lot on this project and consciously decided to learn new aspects of the pipeline as I needed them to execute new elements. The iClone-to-Unreal Auto Setup happened very early in the process; it was just a matter of watching a few tutorial videos readily available for free from Reallusion. The iClone-to-Unreal Live Link was a little trickier to sort out. At the time, there were several videos on the topic, and it seemed each of them left out a vital element of the process, but once I was able to cross-reference the information, the pipeline was rock solid.

But for reference, we are talking a matter of minutes or hours. At the risk of being unfairly critical of free video tutorials, the one aspect I see missing is a clear overview of the process before venturing deep into the woods. For me, that was:

  1. Send the character from iClone to Unreal Engine using the Auto Setup plug-in.
  2. Once your character is in both iClone and Unreal Engine, establish a link using the Unreal Live Link plug-in.

Q: You mentioned that you will continue to consider real-time software tools in your future projects. Do you have any upcoming projects the community can look forward to?

This experience has certainly opened my eyes to the capabilities and possibilities of the Character Creator 3 / iClone / Unreal Engine workflow for future projects. I’m very excited about an animated comedy series I’m actively pitching called Galactic Pub. It’s an irreverent, Adult Swim-style comedy. Imagine Family Guy or Rick and Morty set inside the Star Wars cantina. It has stylized characters bumbling through fantastic sci-fi environments with kooky aliens, all rendered with physically based textures, cinematic lighting and filmic camera techniques. It’s a combination of genre and aesthetic we haven’t seen before, because it has been cost-prohibitive in the past. But with these real-time tools and workflows, it is now a feasible and attainable approach for series and episodic television.

The pilot has gone through many revisions, and I have a 44-page treatment/bible. It is a self-referential homage to animated series television and the sci-fi and fantasy genres. The project is still in pre-production, and I’m seeking funding for the pilot and shopping the series to broadcasters. Once green-lit, we will be looking for talented concept artists, 3D character artists, environment artists, animators, and Unreal Engine geniuses familiar with this pipeline. As I was frantically scrambling to make “Every Single Night,” I turned to many resources for information and support. The many user groups and various tutorial videos were an incredible resource, and I would love to pay that forward somehow.
