OrpheusVR | re:Naissance Opera brings live virtual performance with iClone

OrpheusVR is an indie VR game prototype that I have been working on in conjunction with Debi Wong of re:Naissance Opera, Brian Lee Topp, and Youhan Guan. The project was conceived as a way to engage new virtual reality platforms with the ancient art form of opera. Live From The Underworld is a series of interactive live-stream performances featuring mythological avatars brought to life by real-time motion capture technology and live opera singers.

In 2018, a team of digital and opera artists came together to ask, “What would opera look like if it were invented today?” From that initial question, OrpheusVR, an interactive, choose-your-own-adventure virtual-reality opera, was born. After completing the prototype in 2021, the OrpheusVR team started working in the digital live-stream space, using iClone real-time animation software, iClone’s LIVE FACE facial mocap, Xsens body motion capture, and CG avatars from the OrpheusVR world to create live, interactive operatic performances that can be streamed from any device.

Debi Wong – Founding Artistic Director

Debi Wong

Debi Wong is an interdisciplinary performance artist who loves to weave together sound, text, movement, and technology to create contemporary stories. Her performances have been recognized as “unique and magical” (Rondo Classic) and “electric” (Schmopera).

As a soloist, Debi has performed with leading orchestras and ensembles across Europe and North America, most notably with her lute-and-voice duo, White Sparrow, at The Stockholm Early Music Festival, The Copenhagen Renaissance Music Festival, The BRQ Vantaa Festival (Helsinki), Garrick’s Temple to Shakespeare (UK), and the Venice Biennale.

Debi is the founding artistic director of re:Naissance Opera, a Vancouver-based indie-opera company. With re:Naissance she has created #DidoAndAeneas (2014), an interactive social-media opera, and Acis & Galatea, which was recognized by Vancouver Classical Music as one of the city’s best operas of 2017. In 2019, Debi spearheaded OrpheusVR, an interactive virtual-reality opera that was featured at the VIFF Immersed exhibit at the 2020 Vancouver International Film Festival, and in 2020 she co-created the award-winning podcast The Apocrypha Chronicles (finalist, New York Festivals Radio Awards).

Debi is a graduate of Yale University (M.Mus.) and the Yale Institute of Sacred Music (Diploma in Sacred Music), where she studied vocal performance and received the 2010 Margot Fassler Award for outstanding performances in sacred music. She is currently a doctoral candidate at the University of the Arts Helsinki.

Conrad Sly – Art Director

Conrad Sly

Conrad is a 3D artist with a deep passion for the medium and its many platforms. He specializes in real-time graphics, has spent four years producing architectural visualization for virtual-reality projects, and worked on several iterations of the VR Theatre at SIGGRAPH 2018 and 2019. He is especially interested in the crossroads between emergent technologies such as VR and AR and traditional theatrical and art contexts.

Conrad and Debi began collaborating in 2018, when Debi took an interest in the digital art and virtual-reality games Conrad was creating at the Centre for Digital Media. Their initial conversations led to the 2020 prototype of OrpheusVR, an interactive VR opera for Oculus Quest, and to Live From The Underworld (2021), a new interactive live stream that uses mythological CG avatars, motion capture, dancers, and singers to bring innovation to live opera and online performance.

“It is pretty complicated, but at the same time we’ve entered into these partnerships with, for example, Reallusion Software, and they’ve really created these user-friendly tools that have really helped us as indie producers.”

Debi Wong – Founding Artistic Director

Q: Greetings Debi and Conrad, and welcome to our Reallusion Feature Stories! Kindly introduce OrpheusVR and what your team is trying to achieve with Live From The Underworld.

OrpheusVR is a mythological world we have been creating over the past three years. Inside this world, we’re exploring the ways real-time gaming technologies and live motion capture can bring something new to traditional opera performance.

We’ve developed a prototype of a choose-your-own-adventure VR opera and we are now experimenting with live-streaming our operatic avatars to mobile and personal devices. We’ve been having a lot of fun exploring what kinds of design aesthetics we can create that you might not see in a traditional theatre, and how livestreaming to personal devices allows for interactivity and new kinds of engagement from audiences.

Q: Please share with us how you were able to combine different performances to bring your avatars to life for a live experience. What are the biggest challenges in doing this in real time?

One of the biggest challenges of real-time motion capture is capturing high-quality data from live performances. We use Xsens, which produces excellent results but still has responsiveness limitations, and because everything is live, we don’t have the luxury of going back to clean anything up or do any detailed work. Coupled with an indie setup for the face, an iPhone running iClone’s LIVE FACE app, it starts to get pretty complicated to manage the intake of data in a performant way for real-time graphics.
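To give a sense of what that intake problem looks like, here is a minimal sketch (ours, not the team’s actual code) of one common pattern for live pipelines: poll each source without blocking once per render tick and keep only the newest packet, so a burst or hiccup on one stream never stalls the frame. The UDP ports and the two-stream setup are assumptions for illustration.

```python
import select
import socket

# Hypothetical local UDP ports for the two illustrative streams:
# one for body mocap data, one for facial capture data.
BODY_PORT, FACE_PORT = 9763, 9764

def make_socket(port: int) -> socket.socket:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    sock.setblocking(False)  # never let a slow stream stall the render loop
    return sock

body_sock, face_sock = make_socket(BODY_PORT), make_socket(FACE_PORT)
latest = {body_sock: None, face_sock: None}  # keep only the newest packet

def pump_mocap() -> None:
    """Drain pending packets each tick, discarding all but the most recent."""
    readable, _, _ = select.select([body_sock, face_sock], [], [], 0)
    for sock in readable:
        while True:
            try:
                latest[sock], _ = sock.recvfrom(65535)
            except BlockingIOError:
                break  # socket drained; latest[sock] holds the newest frame
```

The render loop would call pump_mocap() once per tick and apply whatever the latest body and face frames happen to be, rather than queueing every packet.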

All of this meant we had to “rehearse” our avatars in very specific ways: we had to discover which kinds of movements read well with the suit and which did not produce great real-time results. We went through a similar process with facial capture, and we also learned that mounting head rigs on opera singers is extremely cumbersome and limiting for their technique.

We decided to approach these challenges by compartmentalizing our motion capture and live performance art, dedicating certain workstations to specific tasks. We also started treating the avatars like puppets. We used dancer-choreographers to create vocabularies of movement for us, and our singers did the same for facial gestures. In the end, we actually puppet each avatar with two performers: a dancer plays the body of the character and an opera singer plays the face and voice. This lets us have a really diverse and expressive range of choreography and dynamic vocal acrobatics.
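Conceptually, the two-performer puppet is a composite of two data streams onto one rig: the dancer’s stream drives the skeleton, the singer’s drives the face. A hedged Python sketch, with hypothetical frame types standing in for decoded mocap packets:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# Hypothetical frame types; real packets would be decoded from the streams.
@dataclass
class BodyFrame:
    joint_rotations: Dict[str, Tuple[float, float, float, float]]  # bone -> quaternion

@dataclass
class FaceFrame:
    blendshapes: Dict[str, float]  # blendshape name -> weight in 0..1

@dataclass
class AvatarFrame:
    joint_rotations: Dict[str, Tuple[float, float, float, float]]
    blendshapes: Dict[str, float]

def puppet_avatar(dancer: BodyFrame, singer: FaceFrame) -> AvatarFrame:
    """Composite two performers onto one avatar: the dancer drives the
    skeleton while the singer drives the face rig and voice."""
    return AvatarFrame(
        joint_rotations=dict(dancer.joint_rotations),
        blendshapes=dict(singer.blendshapes),
    )
```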

Q: As a producer, how have iClone and its LIVE FACE mocap helped your production? What combination of tools and hardware did you use for the live actor performances?

We’re an indie studio working in real-time, so the great things about iClone are its ease of use for our team and its ability to bring in many different data streams. For our livestreams, we use an Xsens suit with Rokoko Smartgloves to capture the dancer’s body and finger data, and the LIVE FACE app running on an iPhone 12 mini to capture facial expressions from our singer. It works great with our custom Character Creator (CC3) avatars and lets us deliver really exciting live results.

LIVE FACE has also been a really powerful tool for us because it means any of our opera singers can take out their iPhones and capture data for us. We tested a few different options for facial capture, because opera singers make pretty distinct facial movements when they sing, and we discovered that calibrating facial movements and using LIVE FACE produced the most expressive real-time results for our singers.
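As a rough illustration of what such a calibration can look like (our sketch, not the team’s actual setup): record each blendshape’s maximum weight while the singer runs through a range-of-motion pass, then rescale live weights so that singer’s personal extremes drive the avatar’s full expression range. The class and the ARKit-style shape names below are illustrative.

```python
from collections import defaultdict
from typing import Dict

class FaceCalibrator:
    """Per-blendshape calibration: learn a performer's range, then rescale."""

    def __init__(self) -> None:
        # Tiny floor avoids division by zero for shapes never observed.
        self.max_seen: Dict[str, float] = defaultdict(lambda: 1e-3)

    def observe(self, weights: Dict[str, float]) -> None:
        """Feed frames from the calibration pass to learn each shape's range."""
        for shape, w in weights.items():
            self.max_seen[shape] = max(self.max_seen[shape], w)

    def apply(self, weights: Dict[str, float]) -> Dict[str, float]:
        """Rescale live weights into the avatar's full 0..1 morph range."""
        return {s: min(w / self.max_seen[s], 1.0) for s, w in weights.items()}

# Example with ARKit-style blendshape names:
cal = FaceCalibrator()
cal.observe({"jawOpen": 0.7, "browInnerUp": 0.5})        # calibration pass
live = cal.apply({"jawOpen": 0.35, "browInnerUp": 0.5})  # -> 0.5 and 1.0
```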

iClone’s LIVE FACE app for iPhone facial motion capture

Both iClone and LIVE FACE are integral parts of our livestream production pipeline, and they have been second to none.

Facial key editing inside of iClone animation software

Q: After Live From The Underworld, what other projects can the community look forward to from re:Naissance Opera?

We have a few dreams we are entertaining. We’re curious about streaming live in 360 and how our Reallusion tools can help us accomplish that in Unreal Engine. We are also excited to go further with our live performances and audience interactions; we want our audiences to feel like they are participating in gameplay when they join our livestreams.

We are continuing to explore and discover the ways these emerging and increasingly affordable technologies can infuse the world of opera with new life.

Live streaming rendered inside of Unreal Engine
OrpheusVR live virtual performances with iClone and Unreal Engine

Follow OrpheusVR:

Website:
https://reopera.ca/
https://orpheusvr.ca/

YouTube:
https://www.youtube.com/channel/UCCJ_d9ADrBdNfZsRtkwFvow
