These two pieces explore the largely untapped expressive power of Virtual Reality technologies for staging unconventional music performances. Using non-invasive VR technology, Dario Mazzanti and I developed a system that presents artists and interactive virtual objects together, in audio/visual choreographies on the same real stage. Following the VR literature, I call these live shows Hybrid Reality Performances.
Virtual_Real was the first Hybrid Reality Performance I designed and developed. Born from a collaboration with the electronic composer Useless_Idea [Cesare Bignotti], the performance stemmed from the artist's passion for both music and graphics as expressive means, which were combined to transform a music concert into an experimental audio/video venue. The performance features five original tracks, composed specifically for the event. Each track is associated with an immersive 3D choreography, evoking visual atmospheres directly connected to the sounds and the music. Through body tracking, both the artist and the audience can interact with the 3D visuals, which are perceived as coming out of the screen; these actions produce joint modifications of graphics and sound. The artist actively participated in all the steps leading up to the final show, explaining his motivations and messages to guide the refinement of algorithms, controls, and contents. Virtual_Real was showcased three times in the VR room of the IIT ADVR department.
Dissonance is a Hybrid Reality audio/visual performance in which a progressive soundtrack is created through the exploration of an interactive virtual environment. The piece lasts about ten minutes and features two on-stage performers [Dario Mazzanti and me] alternating between real instruments, placed at the side of the stage, and virtual instruments, drawn onto a rear-projected screen. The performers take part in a multimodal journey across different living virtual worlds, where their presence is sensed, creating meaningful relationships between the artists, their gestures, and the surrounding environment. Music generated by the real instruments affects and animates the projected worlds, while the performers can move virtual objects and change their physical features, such as shape and color; the objects respond to these stimuli by influencing the music and creating new sounds. Dissonance was selected for the official concert program of the NIME Conference 2011. It was showcased at Betong, one of the venues in the Chateau Neuf building of The Norwegian Students' Society (University of Oslo).
Both pieces feature reactive virtual environments designed and developed in XVR, using OpenGL shaders for the visual effects. Real-time audio and predefined sequences are controlled through Ableton Live and Max/MSP, via MIDI and OSC [using the LiveAPI and LiveOSC]. The artists' musical instruments and gear are connected directly to a single workstation that handles the projection of the virtual environment; in this way, full bidirectional communication between the visual and the audio systems is achieved. For correct perception of the 3D objects, the audience is equipped with active shutter glasses.
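To give a flavor of the OSC traffic this bidirectional link carries, the sketch below packs an OSC 1.0 message with float arguments using only the Python standard library. This is an illustration, not the production code: the actual shows used XVR, Max/MSP and LiveOSC, and the address name in the usage example is hypothetical.

```python
import struct

def _pad(data: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, as OSC 1.0 requires."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Build a minimal OSC 1.0 packet carrying 32-bit float arguments.

    Layout: padded address string, padded type-tag string (',' plus one
    'f' per argument), then each float as big-endian IEEE 754.
    """
    packet = _pad(address.encode("ascii"))
    packet += _pad(("," + "f" * len(args)).encode("ascii"))
    for value in args:
        packet += struct.pack(">f", value)
    return packet

# Hypothetical address: a shader parameter driven by the audio system.
packet = osc_message("/visuals/bloom", 0.75)
# The packet would then be sent over UDP to the workstation's OSC port.
```

A packet like this can be fired over a plain UDP socket in either direction, which is what makes the audio and visual systems symmetric peers rather than a master and a slave.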
For Virtual_Real, the ADVR VR room was transformed into a stage and a seating area. The virtual choreographies are projected onto a Powerwall, and the 12-camera motion capture system available in the VR room tracks both the artist and the audience.
Dissonance was designed to be easily performed on regular stages, in theaters or music clubs. In Oslo, projections were drawn onto a foldable, self-standing screen, and, thanks to standard data protocols, the system was connected to the venue's IR tracking system, which streamed the artists' positions and gestures.
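On the receiving side, position streams of this kind can be unpacked in the same spirit. The sketch below decodes a simple float-only OSC packet with the standard library; the "/head" address and the single-coordinate payload in the test case are hypothetical, since the actual address layout depended on the venue's tracking system.

```python
import struct

def parse_osc_floats(packet: bytes):
    """Extract the address and float arguments from a float-only OSC packet."""
    # Address: null-terminated ASCII, padded to a multiple of 4 bytes.
    end = packet.index(b"\x00")
    address = packet[:end].decode("ascii")
    offset = (end // 4 + 1) * 4
    # Type-tag string: ',' followed by one tag character per argument.
    tag_end = packet.index(b"\x00", offset)
    tags = packet[offset:tag_end].decode("ascii")
    offset = (tag_end // 4 + 1) * 4
    values = []
    for tag in tags[1:]:
        if tag != "f":
            raise ValueError("only float arguments are handled in this sketch")
        values.append(struct.unpack(">f", packet[offset:offset + 4])[0])
        offset += 4
    return address, values
```

Because the wire format is this simple and self-describing, any tracker or sequencer that speaks OSC can be swapped in without touching the rendering code, which is what made the performance portable across venues.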