Virtual Production Company adapts Live Action Projects to Real-time Animation

Piero Varda – Filmmaker, Creative Director, Co-Founder

Piero Varda – La Escena Virtual

My name is Piero Varda and I’m a filmmaker based in Lima, Peru, with more than 15 years of experience in the advertising and film industries. I originally started as an editor but then moved on to cinematography. I’m now co-founder and creative director at La Escena Virtual, a virtual production company. We mainly work with LED wall technology and Unreal Engine, creating photorealistic 3D environments and tracking real cameras to virtual ones for FX purposes.

As part of our work, we recently started to upgrade and focus on real-time and motion capture workflows. Our main goal is to become a fully real-time production company. La Escena Virtual has also recently begun using Reallusion tools, primarily iClone and Character Creator, along with various Reallusion plugins and animation bundles, as a hub for character creation and mocap animation.

Before we started our company, I wrote and filmed a sci-fi live action movie project called “The Within” that is deeply inspired by Andean mythology and sacred practices. Today La Escena Virtual uses the project to test techniques and pipelines, as it’s a fantasy series that includes FX scenes with opportunities for digital doubles.

A couple of years back, with regular post production, this was a highly ambitious project that would have required co-producers to join in to make it happen. Today, thanks to virtual production and the doors it opens for collaboration, we believe we will be able to finish The Within in house. Passionate filmmakers like ourselves have been waiting for these kinds of procedural tools, capable of working with other industry-standard software to help reduce time frames and regain creative control of the filmmaking process.

“We learned that iClone and Character Creator were tools that worked seamlessly with Unreal Engine. Additionally, we eventually discovered iClone’s LIVE LINK and how it lets all the animation tools we were now using, work in conjunction with Unreal Engine — and after that we were totally hooked.”

Piero Varda – Filmmaker, Creative Director, Co-Founder

Q: Greetings Piero, and welcome to our Reallusion Feature Stories. Your company La Escena Virtual provides services for film and advertisement. Can you share with us some of the previous projects you have worked on?

Hello, and thank you for the opportunity to showcase what we do. La Escena Virtual is a virtual production company providing a wide range of services. Our clients have come from the advertising and film industries. We initially started by providing LED wall services, which consisted of creating virtual backdrops, generating scenery to be placed behind actors in a film studio.

This is our company reel, which includes all of the virtual production services we now offer. Today our focus has shifted to mocap and character creation, areas we see as emerging markets for our clients.

These are commercials and behind-the-scenes footage from advertisements we have done in the past.

Q: Your film, The Within, started as a live action movie project that later morphed into virtual production. How did Reallusion tools help during this process and what advantages did they bring in terms of production costs?

At the beginning, our project The Within seemed destined to be entirely live action. After realizing that we needed to create animatics in order to understand how to budget our film, we discovered virtual production, and that’s how we came upon Reallusion and its character creation and animation software tools.

As we started creating stand-in doubles for animatic purposes, we discovered that with the right shading and morphing we could get a very realistic digital double for each of our main characters. We also realized that these characters could be animated with many different intuitive tools, without needing professional animators on the team.

We learned that iClone and Character Creator were tools that worked seamlessly with Unreal Engine. Additionally, we eventually discovered iClone’s LIVE LINK and how it lets all the animation tools we were now using, work in conjunction with Unreal Engine — and after that we were totally hooked.

After many experiments and tests with our project, we decided to go to the next level and buy a mocap suit. With the suit, the pre-made mocap animations, and the puppet tools iClone offered, we started treating Reallusion as a key part of our pipeline. Today our movie project is based on what we can previsualize through animatics.

Our production costs have been considerably lowered, as this process has allowed us to try out our scenes beforehand and understand their production costs. It has also provided us with a detailed rough cut of our movie, which lets us decide what works and what will eventually be shot as live action, on LED walls, or produced as a fully rendered cinematic. On a normal live action production, we wouldn’t have those things beforehand, and probably not at the same cost.

Q: You recently won an Unreal Engine Real-time Short Challenge at the Festival Internacional de Cine en Guadalajara. Kindly tell us about this experience and how Character Creator 4 and iClone 8 helped you meet this challenge.

The experience was really something new for us. Our team had never been part of a short competition of this kind, so we were not sure what to expect. Our first obstacle was time: the short had to be completed in about two weeks. We had many assets ready, like basic locations and basic characters, but we had to block the scenes as fast as we could, and iClone was the best tool for that. We started by exporting a low-poly version of our virtual location in order to have a perfect match with what would be rendered in Unreal Engine.

https://ficg.mx/es/noticia/146

Basically, we made the characters in Character Creator, and iClone served as the animation tool to bring them to life. We started with basic positions and camera angles to get a sense of how the story would flow in terms of visual narrative. We knew that, because it was a film festival, narrative would be the most important factor, so we had to be able to tell the story correctly. We also had to create a virtual storyboard with iClone. This evolved into several shots that had to be exported to DaVinci Resolve in order to get an idea of timing through an edited animatic. Our second biggest obstacle was that the short could not be longer than a minute. This required the editing to be super precise, and that we work in parallel on the animation and blocking process.

If something didn’t work, we had to re-export and work it out. We exported every single shot from iClone to see whether we had the right animation timings. When the animation started fitting inside the editing time frame, we decided to start doing mocap with the Xsens suit. It was here that iClone’s Motion LIVE plugin let us work with LIVE LINK perfectly. Thanks to this, we were able to match our mocap in real time to the animations that had determined the shots’ durations. Without this, it would have been really hard to match the new mocap to the shots.

We also used LIVE LINK with an iPhone and iClone’s LIVE FACE plugin to get facial mocap. After we got all the acting we needed, we started the cleanup phase. This was our third and final obstacle. iClone’s Curve Editor and all of the puppet tools were really handy for correcting the positions of characters’ body parts, fixing body mesh breakthroughs, and achieving natural, fluent body movements in our characters.

After all of this, we lit and rendered the final shots in Unreal Engine and replaced iClone’s animatic shots with the final ones in DaVinci Resolve.

Q: Previously you also won an Honorable Mention in our 2021 iClone Lip Sync Animation contest where you animated a MetaHuman character with iClone. Now that you are using Character Creator 4 (CC4), in your opinion how has it enabled you to create highly customized digital humans compared to MetaHuman Creator? What advantages do you see in CC4 ?

Yes, I had the chance to be part of the 2021 iClone Lip Sync Animation contest. At that time, I was working with a MetaHuman head which was puppeteered via LIVE LINK and iClone. It was a very interesting learning curve. That contest was probably the company’s starting point into mocap, and it introduced me to a digital human creator community that I wasn’t aware of. People like JSFILMZ, Style Marshal, Matt Workman, Salomon Jagwe, Jonathan Winbush and Francisco Perez are, in my opinion, the biggest references out there. If someone wants to follow this path, those are the names one has to know. Basically, amazing, generous, and talented professionals who are at the top of their game. Without their tutorials and their constant collaboration in places like YouTube and the Reallusion forum, learning to master all the different tools needed to make animatics with digital characters would have been a very daunting task.

During the contest itself, MetaHumans had just come out and I, like many others, was very interested in understanding what could be achieved with them. MetaHumans were very popular then because of their blend shapes, which gave them a very realistic performance. Today we still use them in our company, but Character Creator 4 has definitely stepped up its game. The new blend shapes that come with the CC4 upgrade are incredible. Aside from that, MetaHumans don’t share the same foundation that Character Creator offers. Reallusion is focused on absolute creative control, providing direct export and import pipelines to other software in the trade.

ZBrush and Substance Painter are tools we constantly use at La Escena Virtual. They are key when specific textures or shapes need to be added to our own characters. The character morph and skin tools inside Character Creator are designed so that an average user or artist can have absolute control over the creative process. The software’s name says it all: you can create any sort of character, from a humanoid alien to a very precise digital double based on a real person. That used to be something only achievable with high-end photogrammetry studios. Today, procedural tools like Character Creator 4 are giving us indie studios the chance to enter markets that were far from our reach. We are very grateful for that.

“MetaHumans were very popular because of their blend shapes, which gave them a very realistic performance. Today we still use them in our company, but Character Creator 4 has definitely stepped up its game. Aside from that, MetaHumans don’t share the same foundation that Character Creator offers, as Reallusion is focused on absolute creative control. The character morph and skin tools inside Character Creator are designed so that an average user or artist can have absolute control over the creative process. Today procedural tools like Character Creator 4 are giving us indie studios the chance to enter markets that were far from our reach. We are very grateful for that.”

Piero Varda – Filmmaker, Creative Director, Co-Founder

Q: La Escena Virtual also incorporates the iClone 8 real-time animation tool and the ActorCore motion collection. Can you guide us through your typical animation process and how you end up connecting everything to Unreal Engine?

Yes, we have been using different animation options from Reallusion. We first started with dance animation packs and used them for our tests and motion capture video clips, many of which we have posted online. ActorCore came afterwards, and it was definitely a better way of finding the right animations and building our own pack by picking the ones that really fit what we were looking for. ActorCore’s interface is great, as it lets you preview the animations and even view them from different angles, which is very useful for planning a complex animation. The animations are also well captured and well organized.

Before, there used to be a learning curve in mastering how to clean and edit mocap animations together. Foot sliding and character positions were hard to blend, and fixing them was a very common hassle for everyone. Today, iClone 8 has improved greatly in that regard. There are now several techniques and tools that can easily be used to create fluent, complex animations, or simple animation blockouts for later refinement. It all depends on what you are looking for.

In our case, we have done several dance videos that were later edited to try out our virtual camera. We developed a virtual camera based on the tracking system we use for our LED wall service, and we use it on set to record 3D characters dancing. The process is similar to a regular shoot with a real camera, except this one is handled in the real world but records inside the virtual one. The best thing about working with it is that the animations can run in loops, so we can get as many shots as we want with different lenses and camera angles. It’s very easy to anticipate the characters’ movements this way, because one gets used to knowing what the character is about to do. We do this inside Unreal Engine after we have edited a dance sequence in iClone.

We put a lot of effort into these videos because they are the testing ground for how we plan to shoot our cinematics in the future. Today the system is gradually being adapted for storytelling, but we started developing it while trying to recreate how we would shoot a video clip of a dance performance sequence.

Well, that’s it… thank you for reading this far, and thank you again for giving us the chance to share everything we do at La Escena Virtual!

Follow La Escena Virtual:

Website:
https://www.pierovarda.com/

YouTube:
https://www.youtube.com/watch?v=fEiiaQCaiAA

Instagram:
https://www.instagram.com/la_escena_virtual/?hl=es
