
Hybrid Reality Performances

2009-2012, Ph.D. Thesis _ Department of Advanced Robotics |IIT|

_description

During my Ph.D., I explored the use of projected Virtual Reality (VR) technologies for immersive live musical performances.

Unlike previous VR music research, my work aimed at maximizing the spectators’ feeling of presence. Ever since the first experimentations in the early ’90s, artists have recognized the potential of VR as a means to explore unprobed musical territories and innovative audio/visual experiences. However, until the time I started my Ph.D., research in the field had always primarily targeted the musicians’ experience, building virtual instruments and virtual worlds exclusively designed for the performers on stage. This was due to technological limitations, both in terms of rendering and tracking capabilities; in other words, there is not much you can do to invite the audience into the virtual world when your workstation supports only one low-res screen [most likely an HMD] and your tracking system is a cyber-glove. But you can still make very good music.

When in the late 2000s I started to work with projected VR, I was immediately struck by the incredibly strong perceptual effects this technology has on our brain. The simple possibility of seeing my hand superimposed on the virtual environment, directly interacting with virtual objects, made every experience in the ADVR VR room feel so real, no matter how simple it was. In the beginning, however, I focused like everyone else on the classic user-centered paradigm, designing virtual instruments and installations. It took me a whole year of my Ph.D. and several studies and experiments to gradually realize the unexplored musical potential of projected VR. In particular, it was a dance+projections performance I randomly stumbled upon on Saturday, February 20th, 2010 that made me think VR could be used in a different way from what had been done so far.

From that point on, the leitmotiv of my Ph.D. research became the search for novel visualization and interaction paradigms for VR musical performances. Throughout the following years, this plan translated into the design of setups displaying interactive virtual environments that surround the audience, the performer, as well as the on-stage physical instrumentation. This paradigm is defined in the literature as Hybrid Reality, as proposed by Milgram and Kishino in the paper “A Taxonomy of Mixed Reality Visual Displays”. To use the authors’ words, in Hybrid Reality setups “real physical objects in the user’s environment play a role in (or interfere with) the computer generated scene”. However [and not surprisingly], Milgram and Kishino described only user-centered displays, while my research focused on exocentric setups. In this scenario, the primary visual feedback is moved from the user [the performer] to an external observer [the audience], while interaction is still mainly carried out by the user. This feedback decoupling leads to a number of issues, which I tried to address through a series of user studies on the role of multimodality in interaction. The outcome was a set of metaphors for Hybrid Reality Performances, which allow both the performer and the audience to engage in musical interaction with virtual objects while immersed in an audio/visual scenography.
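To make the feedback decoupling concrete, here is a minimal Python sketch [not code from the thesis; every class and method name is hypothetical] of how a single performer interaction could be routed to two separate feedback channels: the audience-facing projection receives the primary visual rendering, while the performer receives only secondary, non-visual cues.

from dataclasses import dataclass


@dataclass
class VirtualObject:
    """A sound-producing virtual object shared by performer and audience views."""
    name: str
    position: tuple = (0.0, 0.0, 0.0)
    excited: bool = False


class HybridRealityStage:
    """Routes one interaction event to decoupled feedback channels:
    the audience gets the primary (exocentric) visual rendering,
    the performer gets secondary non-visual cues (audio/haptics)."""

    def __init__(self):
        self.objects = {}

    def add(self, obj: VirtualObject):
        self.objects[obj.name] = obj

    def on_performer_touch(self, name: str):
        obj = self.objects[name]
        obj.excited = True
        self.render_audience_view(obj)  # primary feedback: projected scene
        self.cue_performer(obj)         # secondary feedback: sound/vibration

    def render_audience_view(self, obj: VirtualObject):
        print(f"[projection] {obj.name} flares up at {obj.position}")

    def cue_performer(self, obj: VirtualObject):
        print(f"[audio/haptic] trigger sample and vibrotactile pulse for {obj.name}")


stage = HybridRealityStage()
stage.add(VirtualObject("drone_pad", position=(1.2, 0.0, 3.5)))
stage.on_performer_touch("drone_pad")

In a real setup the two methods would drive very different hardware [projectors versus wearable actuators and stage monitors], which is exactly where the issues mentioned above arise: the performer acts on objects whose main visual representation is oriented toward someone else.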

This long journey is documented in my dissertation, entitled “Multimodal VR Interface for Music and Visuals Creation”.
It condenses three intense years of studying the human body, experimenting with software and hardware, running user studies on perception and musical interaction, and building installations and performances; all of which would not have been possible without the team of engineers and psychologists I was part of and, above all, without the collaboration with the talented musicians of the underground music scene in Genova. Besides being my first step into the world of academia and research, this experience also deeply influenced my vision as an avant-garde musician and new media artist.

[For the sake of curiosity, the dance performance that triggered my search for novel paradigms was by a famous Italian dancer/TV host, and it also inspired Beyoncé’s choreography for the 2011 Billboard Music Awards. You can watch excerpts of both here]

_media and links

Hybrid Reality interaction examples:

Excerpts from Virtua_Real Hybrid Reality performance:

_related artworks
_related publications
  • Mazzanti, D., Zappi, V., Brogni, A. and Caldwell, D., “Point Clouds Indexing in Real Time Motion Capture”, Proceedings of the 18th International Conference on Virtual Systems and Multimedia, 2012, Milan, Italy.
  • Gaudina, M., Zappi, V., Brogni, A. and Caldwell, D., “Haptic, Audio, Visual: Multimodal Distribution for Interactive Games”, IEEE Transactions on Instrumentation & Measurement, 2012, Volume 61, Number 11.
  • Zappi, V., Mazzanti, D., Brogni, A. and Caldwell, D., “Concatenative Synthesis Unit Navigation and Dynamic Rearrangement in vrGrains”, Proceedings of the Sound and Music Computing Conference, 2012, Copenhagen, Denmark.
  • Zappi, V., Pistillo, A., Calinon, S., Brogni, A. and Caldwell, D., “Music Expression with a Robot Manipulator Used as a Bidirectional Tangible Interface”, EURASIP Journal on Audio, Speech, and Music Processing, Volume 2012, Issue 1.
  • Zappi, V., Mazzanti, D., Brogni, A. and Caldwell, D., “Design and Evaluation of a Hybrid Reality Performance”, Proceedings of the International Conference on New Interfaces for Musical Expression, 2011, Oslo, Norway.
  • Zappi, V., Gaudina, M., Brogni, A. and Caldwell, D., “Virtual Sequencing with a Tactile Feedback Device”, Proceedings of the 5th International Haptic and Auditory Interaction Design Workshop, 2010, Copenhagen, Denmark.
  • Gaudina, M., Zappi, V., Brogni, A. and Caldwell, D., “Distributed Multimodal Interaction Driven Framework: Conceptual Model and Game Example”, Proceedings of the IEEE International Symposium on Haptic Audio-Visual Environments and Games, 2010, Phoenix, Arizona.
  • Zappi, V., Brogni, A. and Caldwell, D., “OSC Virtual Controller”, Proceedings of the International Conference on New Interfaces for Musical Expression, 2010, Sydney, Australia.
  • Zappi, V., Brogni, A. and Caldwell, D., “Passive Hand Pose Recognition”, Proceedings of the IEEE Symposium on 3D User Interfaces, 2010, Boston, Massachusetts.
  • Zappi, V., Brogni, A. and Caldwell, D., “A Multimodal Platform for Audio Manipulation in Virtual Reality”, Proceedings of the 4th International Haptic and Auditory Interaction Design Workshop, 2009, Dresden, Germany.
_development and updates