Robert is an independent filmmaker as well as an experienced editor and post-production manager who has a passion for not only the creative process but for all things technical.
He has written and directed three feature films. His first film, “After the Flood,” shot in 35mm film, was awarded the Grand Prize for Best Director at the 2002 Rhode Island International Film Festival. “White of Winter,” Robert’s second feature was an official selection to the 2003 Sundance Film Festival, and his third film, “Godspeed” – co-written with Cory Knauf – was awarded the Special Jury Award for Exceptional Artistic Achievement at the 2009 CineVegas Film Festival. The film was subsequently invited to screen at such festivals as the Austin Film Festival, also in competition, and the prestigious Stockholm International Film Festival. “Godspeed” was also awarded The Golden Ring for Best Film at the 2010 Ravenna Nightmare Film Festival in Italy.
Robert’s background also includes producing and editing high profile music videos for such artists as Keith Urban and he has also directed three music videos for singer-songwriter George Adrian. His latest project is entitled “Wicked Flower,” a short film to be made entirely using Unreal Engine, motion capture, and other 3D animation software such as Reallusion’s Character Creator and iClone.
“In addition to how good scenes could look in Unreal, the ease of creating great looking characters, including adding wardrobe and animating them, using Character Creator and iClone, made the whole process possible for someone like me without prior experience or training. The ease of adding, adjusting, and tweaking Character Creator characters in Unreal was also crucial to making the workflow and pipeline one that was both fluid and possible for essentially a beginner into this world.”
Robert Saitzyk – Independent filmmaker, Editor, Post-Production Manager
Q: Hello Robert, and welcome to our Reallusion Feature Stories! Kindly introduce yourself, your background, and your work on “Wicked Flower.”
First off, thanks so much for inviting me to be part of the Feature Stories and for letting me talk a bit about this project! So, I come from a more traditional filmmaking background, having directed three independent feature films (“After the Flood,” “White of Winter,” and “Godspeed”) and also worked for many years in the post-production industry. But I never really delved into the world of 3D animation until recently.
Prior to working with Reallusion’s software and Unreal Engine, I had directed and animated a music video for my friend and incredibly talented singer-songwriter George Adrian using Element 3D and After Effects, which really introduced me to some of the basics like using materials and textures, modeling, and lighting and moving cameras in a “virtual” environment.
The song took place in an empty warehouse with a spinning zoetrope (basically a cylinder with images that appear to animate as it rotates) as the centerpiece, and I think we were both very happy with the results. I also really enjoyed the whole experience and the challenge of working in a medium I wasn’t too familiar with.
Then, another friend of mine – Jeffery Morse, a talented filmmaker and artist as well – introduced me to what Neill Blomkamp and his team at Oats Studios had done with the “Adam” series using Unity. I was blown away with the results and, more than that, with all the possibilities of this new workflow using a game engine.
The “Adam” series was very inspiring and paved the way for me to begin working more in the 3D animation space and this is also when I began using Reallusion’s Character Creator and iClone (versions 3 and 7 at the time) to create, clothe, and animate characters using Unity.
When Unreal Engine began introducing strand-based hair grooms, I began to migrate over to this engine, feeling that I might get more of the results I felt were better suited to the cinematics I wanted to create, especially for an animated short film or project.
“Wicked Flower” was originally conceived as a low budget indie feature, but the film just didn’t get the financing we needed. A lot of the themes, characters, and certain elements of the story continued to resonate with me, and I began trying to re-think it as a shorter, animated film made with Reallusion’s software and within Unreal Engine. At its core, “Wicked Flower” is essentially a psychological thriller, following a young woman’s dreamlike journey as she grapples with whether or not she may have been the victim of a deadly attack.
All the films I’ve made have been somewhat challenging, both in terms of content and themes, and financing independent films – especially those that might not be easily categorized – has never been easy, so as the pandemic hit, the prospect of being able to make another film seemed even further away.
But now, more than ever, the idea of using 3D animation to create a new project seemed like the right path to take. I still hadn’t quite figured out just what “Wicked Flower” would look like as a short film, let alone, one that would be completely animated, so I decided to create a short “proof-of-concept” demo to see what I could build on my own. This was when I really began diving deeper into using and learning Character Creator, iClone, Unreal Engine and of course, all things 3D animation.
Q: As a filmmaker, you mentioned that you have no training in character modeling, rigging, or animation, but that with the help of Character Creator and iClone you have been able to test scenes, try different scenarios, and write the script by working with animation. Kindly elaborate more on this.
For me, I was so amazed at how quickly you could create a scene, light, and set up some shots in Unreal, and how good that scene could look, especially when rendered out as a cinematic. Understandably, you may not achieve AAA game or high-end VFX studio quality, but you certainly can get some amazing results, certainly shots that look way better than basic pre-viz. For this project, I’m not creating a world full of spaceships or dragons, so of course, that helps. But there is absolutely a degree of photorealism I’m going for, if anything to make sure an audience can engage with these 3D characters and with this particular world they exist in.
So, of course, in addition to how good scenes could look in Unreal, the ease of creating great looking characters, including adding wardrobe and animating them, using Character Creator and iClone, made the whole process possible for someone like me without prior experience or training. Ultimately, I felt really good about the results I had gotten with the “proof-of-concept,” especially with how the characters looked and moved. The ease of adding, adjusting, and tweaking Character Creator characters in Unreal was also crucial to making the workflow and pipeline one that was both fluid and possible for essentially a beginner into this world. Were there concepts about subsurface skin scattering, retargeting animation, skin weights, etc… that I had to either learn about or begin understanding how to accomplish? Absolutely, but again, much less so than if this was all a more traditional pipeline without the help of Reallusion’s software.
In the end, going full force into creating this 9-minute proof-of-concept helped me understand what I could do and also what I couldn’t do on my own, or with the limited budget and resources I would have – certainly influencing where I thought I could take this version of the story.
Also, the ability to work fairly quickly, try different things, and see them in front of you in high-quality renders inspired me to take the story in a slightly different direction and helped me understand what might work best as a short film version. The most specific example is the decision to add a sci-fi element to the story, which was really inspired by how much I liked one character I created for the demo and the possibilities she opened up story-wise – possibilities that were never part of the original version.
I think, ultimately, it’s like being able to go onto a location or potential set with your actors and play with a few ideas, only this time it’s a virtual world with digital humans. And instead of just blocking out and then later maybe sketching out some storyboards, you can really get close to perhaps how this scene or sequence might actually look and feel in the final version, down to rendering out specific camera angles and moves.
Q: During the process you initially used Unreal’s MetaHuman Creator with fantastic results, but eventually gravitated towards Character Creator (CC4). Can you share how CC4 increased the quality and customization of your MetaHuman characters without using other tools?
The initial demo I created was made before the release of MetaHuman Creator, and like many others, I wanted to both play with and see how characters might appear and work as MetaHumans, especially as I was going to continue creating this project in the Unreal Engine environment.
So, in the interim, I did revise some of the characters that were initially created entirely in CC3. But now the latest feature in CC4 – the ability to enhance our MetaHuman characters with even more detailed textures, including CC4’s own dynamic wrinkles – has taken these characters to another level. We can now not only get more detailed and customized face textures, we can also apply more nuanced textures to the bodies, so things like scars and tattoos are WAY easier to apply.
Before, you could certainly take the MetaHuman base textures into Substance Painter, but again, that requires another software subscription and, of course, another set of skills to achieve the same results you can now get with CC4.
In Character Creator 4, you can even really individualize specific wrinkles, so that your MetaHumans don’t all have the same base dynamic wrinkles as all the others. And of course, we all know the MetaHuman bodies lacked the ability to add proper details and felt pretty “vanilla” for the most part, especially if they were aged and older characters.
“Before, you could certainly take the MetaHuman base textures into Substance Painter, but again, that requires another software subscription and, of course, another set of skills to achieve the same results you can now get with Character Creator 4 – where you can even really individualize specific wrinkles, so that your MetaHumans don’t all have the same base dynamic wrinkles as all the others. And of course, we all know the MetaHuman bodies lacked the ability to add proper details and felt pretty “vanilla” for the most part, especially if they were aged and older characters.”
Robert Saitzyk – Independent filmmaker, Editor, Post-Production Manager
Comparing the “enhanced” skin detail and Character Creator 4’s dynamic wrinkles applied to my MetaHuman character with the base, default MetaHuman textures and wrinkles, you can really see an all-important extra level of expression, even on the character’s neck muscles when they contract. So nice!
Q: Using the iClone Unreal LIVE LINK, Data Link, and timecode features, how were you able to use iClone 8 to work with complex Unreal Engine environments like “The Matrix” city?
“The Matrix” city project (or City Sample) is simply an amazing environment and world to play in, with so many different areas to place characters and scenes. I do want to customize the city a bit more (especially using some KitBash models and other elements) but, as with the demo, I thought it might be great to test out a couple of shots using the new iClone 8.
For this, I wanted to mimic a big crane move, starting on one of the highway bridges with some moving traffic, then easing down to the project’s main character, Justine, who has been jogging through the city and comes to a stop in a parking lot just underneath the bridge. Because the shot incorporates a lot of the set – from the bridge to the sidewalk below with meters, street lamps, etc., then to the parking lot – you really have to direct the animation’s motion so we don’t hit or collide with anything.
I currently don’t have the most high-end system (like a lot of indie creators out there!), so this was also a test of what I could do with the gear I have at the moment, especially since the City Sample project is such a complex and large environment, even if you “load” only so much of the city in Unreal (i.e. loading selected regions within the map’s World Partition) and focus in on a certain area. Additionally, I’ve been testing with the Small City map to help keep things as simple and manageable as possible on my system.
I was able to use the iClone to Unreal Data Link feature to select certain landmarks (or reference meshes) – the fencing surrounding the lot, the cars within the lot, and even portions of the bridge and its pillars – giving me a solid blueprint of the main “obstacles” the character would move through. Honestly, I was surprised that it all transferred over to iClone without any issue, being that it was such a large area with a lot of assets. Using the “Merged” option to transfer the reference meshes seemed to help the process as well as keep the meshes organized and easier to manage in the iClone project.
With that done, I could then use the MetaHuman dummy from the most recent Unreal MetaHuman kit that matched the current character’s body (in this case, the “female tall underweight” version). Utilizing some stock jogging assets (with the hope of replacing them with custom mocap from an actor) and the Live Link feature, I was able to adjust and direct the motion of the animation, seeing how it looked with the actual MetaHuman character within the City Sample project, all linking directly to the MetaHuman dummy over in the iClone 8 project at the same time.
And, of course, I could try different camera angles or versions of the proposed “crane shot,” while still adjusting the animation, and seeing how it all might look. Certainly, this was without the City Sample’s AI crowd and traffic for the moment, but that of course, would all be activated when I rendered the shot. Additionally, after this test, my hope is to also use some of the great Reallusion ActorCore characters as bystanders and a few crowd “extras” that I can better control (for example, maybe one who almost collides with the main jogging character).
I then used the latest iClone Timecode Sync feature to capture the animation using Unreal’s Take Recorder, and as expected, the animation was recorded with frame accuracy and matched the animation in iClone. One of the best things about Timecode Sync and recording with the Take Recorder was that I could bypass any retargeting of the animation onto this character’s MetaHuman skeleton – a process that can be a bit intensive and in my experience has been a bit hit or miss, especially when trying to retarget fingers. From here, you can continue tweaking in Unreal if need be, baking the recorded animation onto the MetaHuman’s control rig.
Again, one of the best things about using Reallusion’s software and its integration into Unreal Engine is the fact that you can continue the process of refining in your Unreal project without having to go back into iClone to finesse or redo any animation should you choose not to.