
The Blue Regen Project

2012 - Audio/visual performance

_concept

The range of unconventional media that can inspire and support artists’ creativity is extremely wide. In the field of music content processing, the availability of haptic interfaces offers interesting possibilities for performer-instrument interaction. With the help of Antonio Pistillo and Sylvain Calinon, in this project I investigated the use of a robotic arm as a bidirectional tangible interface for musical expression, actively modifying the compliant control strategy to create a link between gestural input and musical output. The outcome is a Hybrid Reality Performance [see next works – link to section 5], in which the robot becomes a novel musical interface, fully connected to standard musical instruments and Virtual Reality equipment.

_description

The core of The Blue Regen Project is a robotic music controller that allows the instantiation of low-frequency oscillators by gradually refining periodic movements executed on a robot manipulator. The user can grasp the robotic arm and locally modify the executed movement, which is learned online; the profile of each trajectory is locally converted into standard music control signals and can be routed to all the connected hardware and software devices. After the arm is released, the robot continues executing the movement in consecutive loops. During the interaction, the impedance parameters of our robot controller are modified to produce haptic feedback that guides the user during the modulation task.
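As a rough illustration of this mapping, the sketch below turns one looped joint trajectory into a low-frequency control signal. It is a minimal reconstruction under assumptions of our own, not the project code: the learned movement is stored as uniformly sampled joint angles, a single joint drives a single oscillator, and the output is quantized to a 0-127 MIDI-style control value. All names (TrajectoryLfo, controlValueAt, ...) are hypothetical.

```cpp
// trajectory_lfo.cpp -- hypothetical sketch: a looped joint trajectory
// acting as a low-frequency oscillator (LFO) for music control.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

class TrajectoryLfo {
public:
    // samples: joint angles recorded over one movement period [rad].
    // periodSec: duration of one loop of the movement [s].
    TrajectoryLfo(std::vector<double> samples, double periodSec)
        : samples_(std::move(samples)), periodSec_(periodSec) {
        lo_ = *std::min_element(samples_.begin(), samples_.end());
        hi_ = *std::max_element(samples_.begin(), samples_.end());
    }

    // Joint angle at time t, looping over the stored period
    // with linear interpolation between samples.
    double valueAt(double t) const {
        double phase = std::fmod(t, periodSec_) / periodSec_;  // [0,1)
        double x = phase * (samples_.size() - 1);
        std::size_t i = static_cast<std::size_t>(x);
        double frac = x - i;
        return samples_[i] * (1.0 - frac) + samples_[i + 1] * frac;
    }

    // Map the angle into a 0-127 control value (MIDI CC style),
    // normalizing over the observed range of the movement.
    int controlValueAt(double t) const {
        double span = hi_ - lo_;
        if (span <= 0.0) return 0;  // flat movement: no modulation
        double norm = (valueAt(t) - lo_) / span;
        return static_cast<int>(std::lround(norm * 127.0));
    }

private:
    std::vector<double> samples_;
    double periodSec_, lo_, hi_;
};

int main() {
    const double kPi = 3.14159265358979323846;
    // Fake "learned" movement: one sinusoidal period on a single joint.
    std::vector<double> traj;
    for (int k = 0; k <= 100; ++k)
        traj.push_back(0.5 * std::sin(2.0 * kPi * k / 100.0));

    TrajectoryLfo lfo(traj, 4.0);  // 4 s per loop -> a 0.25 Hz LFO
    for (double t = 0.0; t < 4.0; t += 0.5)
        std::printf("t=%.1f s  cc=%d\n", t, lfo.controlValueAt(t));
}
```

The loop in main mimics the robot replaying the learned movement after the arm is released, with each pass over the trajectory producing one period of the control signal.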
The Blue Regen Project takes its name from the title of the first music track ever composed using our robot manipulator as the main controller. The talented composer and performer Valerio Visconti [aka K] joined our team for the whole design and testing process of the instrument. Eventually, he composed and performed live Blue Regen, a new track produced specifically to take advantage of the robot controller’s features. During the performance, the musician is tracked and, together with the controller, immersed in a reactive Hybrid Reality choreography; the reactive projections make the show more transparent to the audience and, at the same time, serve as secondary visual feedback for the musician.

_technology

The robot employed in this work is a Barrett WAM, a backdrivable arm with 7 revolute DOFs, controlled by inverse dynamics solved with the recursive Newton-Euler algorithm [coded in C++].
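The compliant interaction described above can be sketched, in a much simplified single-joint form, as an inverse-dynamics feedforward term plus a variable-impedance feedback term whose stiffness drops while the user grasps the arm. This is only an illustration of how such controllers are typically structured, not the project’s actual WAM controller; the one-link pendulum model, all gains, and all dynamic parameters are invented.

```cpp
// impedance_joint.cpp -- hypothetical sketch of variable-impedance control
// on a single revolute joint (one-link pendulum), not the WAM controller.
#include <cmath>
#include <cstdio>

struct JointState { double q, dq; };  // angle [rad], velocity [rad/s]

// Inverse dynamics of a one-link pendulum (the trivial Newton-Euler case):
// tau = I*ddq + b*dq + m*g*l*cos(q). All parameters are assumed values.
double inverseDynamics(const JointState& s, double ddqDes) {
    const double I = 0.12, b = 0.05, m = 1.5, g = 9.81, l = 0.3;
    return I * ddqDes + b * s.dq + m * g * l * std::cos(s.q);
}

// Impedance feedback: stiffness and damping are lowered while the user
// grasps the arm, so the robot yields to the demonstrated correction.
double impedanceTorque(const JointState& s, const JointState& ref,
                       bool userGrasping) {
    double kp = userGrasping ? 5.0 : 60.0;  // [Nm/rad], assumed gains
    double kd = userGrasping ? 0.5 : 6.0;   // [Nm s/rad]
    return kp * (ref.q - s.q) + kd * (ref.dq - s.dq);
}

int main() {
    JointState s{0.0, 0.0};
    const double dt = 0.002;  // 500 Hz control loop, assumed rate
    for (int k = 0; k < 1000; ++k) {
        double t = k * dt;
        // Reference: slow periodic movement (the looped trajectory).
        JointState ref{0.4 * std::sin(t), 0.4 * std::cos(t)};
        bool grasped = (t > 0.8 && t < 1.4);  // simulated user contact
        double tau = inverseDynamics(s, 0.0) +
                     impedanceTorque(s, ref, grasped);
        // The plant itself is not simulated here; we only log the command.
        if (k % 250 == 0)
            std::printf("t=%.2f s  tau=%.3f Nm  grasped=%d\n",
                        t, tau, static_cast<int>(grasped));
    }
}
```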
A script written in XVR communicates bidirectionally with the robot via UDP, handling the data related to each trajectory. These data are converted into OSC messages and forwarded to Ableton Live/LiveOSC and Max/MSP to drive software and hardware music instruments and sound generators. In the other direction, data coming from software and hardware devices pass through XVR to reach the robot and modify its behavior.
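For the OSC leg of this pipeline, a minimal sender might look like the sketch below: an OSC 1.0 message (NUL-padded address, ",f" type tag, big-endian float32 argument) dispatched over a UDP socket with POSIX calls. The address pattern /wam/lfo/1 and the destination port are assumptions for illustration; the actual XVR routing and the LiveOSC address space are not documented here.

```cpp
// osc_send.cpp -- hypothetical sketch: sending one OSC float over UDP,
// standing in for the XVR -> LiveOSC/Max leg of the pipeline.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Append a string padded with NULs to a 4-byte boundary (OSC 1.0 rule).
static void padString(std::vector<uint8_t>& buf, const std::string& s) {
    buf.insert(buf.end(), s.begin(), s.end());
    buf.push_back('\0');
    while (buf.size() % 4 != 0) buf.push_back('\0');
}

// Build "<address> ,f <float32 big-endian>".
static std::vector<uint8_t> oscFloatMessage(const std::string& address,
                                            float value) {
    std::vector<uint8_t> buf;
    padString(buf, address);
    padString(buf, ",f");  // type tag string: one float argument
    uint32_t bits;
    std::memcpy(&bits, &value, sizeof bits);
    bits = htonl(bits);    // network byte order
    const uint8_t* p = reinterpret_cast<const uint8_t*>(&bits);
    buf.insert(buf.end(), p, p + 4);
    return buf;
}

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return 1;

    sockaddr_in dst{};
    dst.sin_family = AF_INET;
    dst.sin_port = htons(9000);  // assumed LiveOSC listening port
    inet_pton(AF_INET, "127.0.0.1", &dst.sin_addr);

    // Hypothetical address pattern for one robot-driven LFO value.
    std::vector<uint8_t> msg = oscFloatMessage("/wam/lfo/1", 0.42f);
    sendto(sock, msg.data(), msg.size(), 0,
           reinterpret_cast<sockaddr*>(&dst), sizeof dst);
    close(sock);
}
```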
The same XVR script handles the logic behind the Hybrid Reality choreography [tracking of the performer and projection of the virtual environment].

_video

K performing the track Blue Regen live:

_photo