2010 - Interactive audio/visual installation


Virtual_Sequencing is a cross between a virtual instrument and an audio/visual installation, designed together with Dario Mazzanti and Marco Gaudina. The core of the work is a virtual step sequencer that allows the creation of rhythmic and melodic patterns through direct interaction with virtual objects. The main idea behind the piece is to invite the audience to try out an immersive VR music experience, in an imaginary environment where real and virtual elements carry the same weight during composition. This work was selected to be part of the program of roBOt Festival 2010.


Different versions of the installation have been developed. A first version includes a tactile actuator that provides pressure feedback each time the user’s finger touches the note grid of the sequencer.
A later version of Virtual_Sequencing combines the sequencer with other interactive graphic elements and with real music instruments to engage two users in the audio/visual exploration of a reactive virtual environment. The first user is immersed inside the virtual scene, and her/his hand is tracked to allow the creation of patterns on the sequencer. The second user has access to a physical music controller programmed to trigger audio loops, access audio effects and modify the graphic appearance of the virtual environment.


The virtual environment has been developed in XVR and is visualized using active stereoscopy and shutter glasses. The audio part relies on an Ableton Live set and a Max/MSP patch. The three applications communicate via MIDI and OSC; Ableton Live is extended with LiveOSC to expose the Live API over the network.
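To give an idea of what travels between the applications, the sketch below hand-encodes an OSC 1.0 message with only the Python standard library. The address `/live/play` and port 9000 are illustrative assumptions about the LiveOSC setup, not the installation's actual code:

```python
import struct

def osc_message(address, *args):
    """Encode a minimal OSC 1.0 message: a null-terminated, 4-byte-padded
    address pattern, a type-tag string, then big-endian int32/float32 args."""
    def pad(b):
        # null-terminate and pad to the next 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    tags = "," + "".join("i" if isinstance(a, int) else "f" for a in args)
    data = pad(address.encode()) + pad(tags.encode())
    for a in args:
        data += struct.pack(">i" if isinstance(a, int) else ">f", a)
    return data

# A packet like this, sent over UDP to the LiveOSC port, would reach Live:
packet = osc_message("/live/play")
# e.g.:
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 9000))
```

Tracked hand positions can be shipped the same way, as float arguments on a dedicated address pattern.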
The first version was meant to be exhibited inside the VR room of the IIT ADVR department. It capitalizes on the high-end equipment available there, including inertial-ultrasonic trackers, a set of 12 infrared mocap cameras and a Powerwall. The actuator is a custom prototype, designed and developed by Marco around an Arduino board.
The two-user version has been designed to run on a portable setup. The virtual environment is projected onto a foldable self-standing screen, and tracking is carried out using a custom IR tracker based on a Wiimote hack, running in a Linux virtual machine and streaming data to the main Windows host.

_research project

Teaser trailer for the festival installation proposal:

Related publications:
  • Gaudina, M., Zappi, V., Brogni, A. and Caldwell, D., “Haptic, Audio, Visual: Multimodal Distribution for Interactive Games”, IEEE Transactions on Instrumentation and Measurement, 2012, Volume 61, Number 11.
  • Zappi, V., Gaudina, M., Brogni, A. and Caldwell, D., “Virtual Sequencing with a Tactile Feedback Device”, Proceedings of the 5th International Haptic and Auditory Interaction Design Workshop, 2010, Copenhagen, Denmark.
  • Gaudina, M., Zappi, V., Brogni, A. and Caldwell, D., “Distributed Multimodal Interaction Driven Framework: Conceptual Model and Game Example”, Proceedings of the IEEE International Symposium on Haptic Audio-Visual Environments and Games, 2010, Phoenix, Arizona.