Pitch & Produce | RIFT: Director/Producer Creates Multiverse Film & Game with Transmedia Assets

Hasraf ‘HaZ’ Dulull – Filmmaker / Director / Animator

Hasraf ‘HaZ’ Dulull

HaZ started his career in video games before moving into visual effects, and later transitioned to directing and producing with his breakout feature film THE BEYOND. The indie sci-fi film was released in 2018 and reached number 2 on the iTunes charts before being licensed to Netflix and becoming a profitable movie.

His second feature film, 2036 Origin Unknown, starring Katee Sackhoff (Battlestar Galactica, The Mandalorian), earned a limited theatrical release before landing on Netflix. He was then hired by Disney to helm the pilot for the action-comedy series Fast Layne, where he also served as EP / Creative Consultant on the entire series.

HaZ is the co-founder of the production company HaZimation, which produces animated feature films, series, and video games based on their proprietary pipeline utilizing Reallusion's Character Creator and iClone with Unreal Engine. He is represented in Hollywood by The Vendetta Group.

“The Reallusion toolset empowers me as a filmmaker to have hands-on control over the animation, thanks to the intuitiveness of the tools available, from iClone’s Face Puppet to AccuLips, Hand Gestures, and more, which I use on a daily basis when creating shots and collaborating with my team. I don’t think we would have been able to have this creative flexibility and high-quality output for our feature film and video game simultaneously without the integration of Character Creator and iClone in our pipeline.”

Hasraf ‘HaZ’ Dulull – Filmmaker / Director / Animator

Q: Hello HaZ, and congratulations on being part of the Reallusion Pitch & Produce program! Kindly introduce HaZimation Studios, and tell us about your RIFT film and game project.

Firstly, thank you Reallusion for the support and for creating such amazing tools. HaZimation is a production company developing and creating our own IP: animated feature films, series, and video games, with final pixels rendered in Unreal Engine through a Reallusion pipeline.

RIFT is the first project to be produced out of HaZimation. It’s a sci-fi action animated feature film and video game that follows a character called Leon, who is continuously trying to rescue his little brother Max. Max is held at a facility that experiments on him because he has the ability to reset the fabric of space and time and jump to alternate realities. It’s ‘Edge of Tomorrow’ meets ‘Akira’.

The video game takes place in the same universe as the animated feature film, but leans strongly into the multiverse aspect by embracing the branching-narrative approach you get in video games, something we couldn’t really do in a 90-minute animated feature film.

Q: RIFT deals with a multiverse which requires your team to create different versions of characters. How has Character Creator helped you with this part of production? Can you share your typical creation process?

When we started this project, we were not really character artists, so we needed a jumpstart, and that’s where Character Creator came in. We had used it for one character in the Mutant Year Zero project last year, and we fell in love with how fast we were able to get such high quality just by being artistic, as opposed to facing a steep learning curve with tools and processes. So we knew Character Creator was going to drive the creation of all the characters in the film.

Andrea Tedeschi, the CG Supervisor on this project, was responsible for creating those awesome characters based on designs, references, and direction I supplied to him. The first thing we established was the overall tone and style of the film, which was to be reflective of anime. That meant a lot of work on stylizing features like the characters’ eyes, hair, and clothes, but also on their shaders, which I wanted to be a mix of cel shading and graphic lines while still embracing RTX elements such as reflections, GI, and shadows.

Fully rigged Character Creator 3 avatars

Because this film is set in the multiverse, we needed a modular way of creating our characters so that Andrea could then create all the various alternates (ALTs) of each character. We also tried to ensure that the variations were not drastically different from each other geometry-wise, in order to use the same skeleton throughout, so that our character animation workflow was not disrupted by the new ALTs.

Our production is agile in nature, meaning it’s very interactive: we didn’t wait until all the characters were created before I could go in and create shots. Andrea would create a first pass of the main characters for me to start creating shots with, and then at some point in the production do another pass on the characters, once he could see what I was doing with them in the shots. He would update the shaders and the hair geometry, push the changes via our source control, and voilà! Everything updated in my shots.

Q: Your project utilized a combination of pre-made ActorCore animations, along with custom Xsens and Manus motion captures with professional performers. How do you decide when you need to create specific motions, and what is the most difficult part of this whole process?

We knew that this ambitious project would need specific motion capture sessions using an Xsens suit, but we also knew we didn’t have a huge budget or the time to motion capture every single movement in the film. So I broke down the script and went straight into blocking out shots. Anything specific, like a choreographed action scene, was motion captured by our mocap artists Gabriella Krousaniotakis (who also helped us lock down a pipeline for this) and Ace Ruele.

But all the other mocap, such as idles, running, kneeling, and generic conversations, we were able to grab from the ActorCore libraries and download as Unreal Engine FBX, and it retargeted nicely onto our characters. In the last few scenes of the movie we recently used a machine-learning approach to motion capture via Move.AI, which uses GoPro footage to capture the motion without the need for our actors to wear suits, and again fed the same FBX workflow into our pipeline.
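
That FBX hand-off can be scripted with Unreal’s editor Python API. Below is a minimal sketch of the idea, not HaZimation’s actual tooling: it assumes a Character Creator skeleton already imported into the project, and all paths and names are hypothetical.

```python
import unreal

# Batch-import a mocap FBX as animation onto an existing skeleton.
# All paths below are hypothetical placeholders, not HaZimation's setup.
def import_fbx_motion(fbx_path,
                      dest_path="/Game/Mocap",
                      skeleton_path="/Game/Characters/Leon/Leon_Skeleton"):
    options = unreal.FbxImportUI()
    options.import_as_skeletal = True
    options.import_mesh = False      # animation only; the mesh is already in-engine
    options.import_animations = True
    options.skeleton = unreal.load_asset(skeleton_path)

    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = dest_path
    task.automated = True            # suppress import dialogs
    task.replace_existing = True     # re-imports update existing assets in place
    task.options = options

    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return list(task.imported_object_paths)

# e.g. import_fbx_motion("D:/mocap/actorcore_idle_01.fbx")
```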

For the hand animation we used the Manus gloves with Xsens, and Gabriella did pretty much all of those shots in one take.

Gabriella Krousaniotakis using Manus hand motion capture

But as I was creating the shots in Unreal and bouncing back and forth with my edit, I realized that I needed more specific hand gestures for other shots, and I didn’t want to go back for another session just to capture hands. So we used a plugin inside iClone called Hand Gestures. With this iClone plugin I can just move the mouse around and the fingers move to create the gestures I want. It was such a quick process, a matter of minutes, and I then exported the results as FBX into Unreal and applied them to the hands later in our character animation pipeline. That’s the beauty of the way we redesigned the pipeline: it allows each of us to contribute to various parts of a character’s performance.

Q: How did you handle the mouth animations on the characters?

Lip-sync has always been tricky to get right, as well as time-consuming if the audio changes during editorial. So we needed something that worked well with the iterative way our shots are created, and a solution that didn’t involve animating a mouth rig against a video reference. iClone’s AccuLips tool pretty much saved the day, and I do literally mean saved days of work! Even more efficient was that I tasked myself with handling all the lip-sync work in iClone, so that I could jump between editing and creating lip-sync from the exported audio of the editorial timeline (we edit in DaVinci Resolve).

Q: How did iClone help with the facial animation of your characters?

For facial animation we used the iClone LIVE FACE plugin. Our talented animator Alex Kong actually had fun performing many of the facial expressions and core moments of the characters himself, then adjusted them further inside iClone.

Animator Alex Kong using iClone’s LIVE FACE iPhone mocap plugin

For other generic facial expressions, I was able to dive into the iClone Face Puppet panel, where, by just selecting parts of the face and moving my mouse around, I was able to create organic-looking facial expressions and keyframe them all in iClone’s timeline before exporting them as FBX to Unreal Engine. There, our proprietary tools let us control face, hands, body, and lip-sync as individual character animation tracks in Sequencer.

iClone Face Puppet panel

This meant that the Unreal Engine artists we bring in later in production, such as filmmaker/animator Mark Cheng, who joined the team to help create shots with me, were able to hit the ground running on day one, diving into shots and animating with the FBX exports of lip-sync, body animation, and facial animation from iClone in Unreal Engine Sequencer.
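
HaZimation’s Sequencer tools are proprietary, but the layering idea can be approximated with stock Unreal editor Python: one skeletal animation track per performance layer on a single character binding. A hedged sketch; the sequence path, the “Leon” binding name, and the animation asset paths are all invented for illustration.

```python
import unreal

# Attach body / face / hand FBX layers from iClone to one character
# binding in a Level Sequence. All asset paths are hypothetical.
sequence = unreal.load_asset("/Game/Shots/Shot_010")  # a LevelSequence
binding = sequence.find_binding_by_name("Leon")       # existing possessable

layers = {
    "Body":  "/Game/Mocap/Leon_Body_Run",
    "Face":  "/Game/Mocap/Leon_Face_Take03",
    "Hands": "/Game/Mocap/Leon_Hands_Gesture",
}

for layer_name, anim_path in layers.items():
    track = binding.add_track(unreal.MovieSceneSkeletalAnimationTrack)
    track.set_editor_property("display_name", layer_name)

    section = track.add_section()
    # Struct properties come back as copies in Unreal Python, so build
    # the params struct and assign it whole rather than mutating in place.
    params = unreal.MovieSceneSkeletalAnimationParams()
    params.animation = unreal.load_asset(anim_path)
    section.set_editor_property("params", params)
    section.set_range(0, 240)  # frames; match the shot length
```

In a real setup the layers would still need weighting or bone masking so the face and hand data don’t fight the body track; the sketch only shows the track plumbing.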

Q: Once your motions were decided, how did you use iClone’s specialized tools to fine-tune everything? How much time do you feel real-time software saves you in comparison to using traditional software like MotionBuilder?

The great thing about iClone is that its timeline is intuitive and reflective of most animation timelines anyway, so as a hands-on director I can go in and fine-tune things very easily in the timeline and keyframe pretty much every parameter available in the tools. I love that our characters are built, rigged, and animated all under the same Reallusion software roof, and we don’t have to export to different software to get things done.

The only thing we did externally was use Substance Painter, and sometimes ZBrush, for additional details. But the translation from Reallusion to Unreal Engine is very smooth and allows us to take things further with the shaders in Unreal Engine when integrating the characters into the UE scenes. The great thing is that this smooth process of fine-tuning with iClone and Character Creator is not disruptive and fits the iterative creative process we embrace in our productions at HaZimation.

Q: Using modular creation tools like Character Creator and iClone allows you to build parallel assets for both your film and video game. In your opinion, how does this change the industry for cost-effective studios like yours?

Totally! We are embracing real-time engines, and Reallusion tools enable us to do things that would otherwise be considered expensive and insane to attempt. As a small team we are being smart with the approach we take to creating content in the character animation and video game space.


Assets have always been one of the most expensive items in a project budget, especially characters, which usually means huge teams and specialized skills are required. But as a team we are all using Reallusion tools as part of our process because of how intuitive they are, empowering us to really focus on pushing our creativity rather than being bogged down in the technicalities of the tools. This is a game changer for sure, because now the only thing that keeps me up at night is how cool I can make the shot, and I can focus on the storytelling instead of bottlenecks and limitations.

Q: When will the RIFT film and game be launched, and what else can the community look forward to from HaZimation Studios?

The RIFT film has been deep in production since January 2021, and we aim to have it finished by mid-2022. The great thing about real-time is that we can create previews very quickly to show distributors early on, in order to start conversations about sales, distribution, and PR while sharing the project with early audiences. The video game is still at the prototype stage, but we aim to release a pre-alpha early access in the middle of 2022 as well, to start engaging with the community for feedback (it’s one of the things I love about game dev compared to film production).

We are also in early pre-production on Mutant Year Zero, the animated feature film, as the script is being developed. We are involved in some video game cinematics at the moment too, but due to NDAs we can’t mention details yet, so stay tuned via our social media. One thing is for sure: we are excited to see what’s next with Reallusion tools, because they are so integral to our production pipeline for all our projects.

Follow HaZimation Studios:

Website: www.hazimation.com
ArtStation: https://www.artstation.com/hazimation
Instagram: Hazdazzle
Twitter: HaZ_Dulull
Facebook: HaZFilmStudio
LinkedIn: HaZFIlm
