Gesture Wall

The Gesture Wall uses electric field sensors to measure the position and movement of the user's hands and body in front of a projection screen; the projected video and musical timbres change accordingly. To start the experience, the performer steps onto a plate that has a harmless low-frequency (50 kHz), low-voltage (10 V) RF signal applied to it. This signal couples through the performer's shoes and is broadcast through the body to a set of four pickup antennas mounted on goosenecks around the perimeter of the screen. The received signals vary with the performer's distance from the respective sensors (an LED mounted in each sensor glows with increasing intensity as the performer's body approaches). The sensor data is transferred to a PC running ROGUS, where it is analyzed for gestural characteristics.

Before starting, the user must calibrate out the coupling strength of their shoes and body mass, which vary considerably from person to person. This is done by touching a reference pickup electrode, which adjusts the transmitted signal so that every participant radiates equally; the hand-on-calibrator state is detected by breaking an IR beam directed across the calibrator surface. As with the Sensor Chair, the sensing hardware is a Fish sensor with linearizing log amplifiers.
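
The signal chain described above (log-amplifier linearization, per-user calibration against the reference electrode, and position recovery from the four antenna amplitudes) can be illustrated with a short sketch. This is a minimal illustration only: the function names, the antenna coordinates, and the weighted-centroid position estimate are assumptions made for this example, not the actual Fish sensor interface or the ROGUS analysis code.

    import math

    # Assumed coordinates of the four gooseneck pickup antennas around
    # the screen perimeter (screen units, origin at screen center).
    ANTENNAS = {
        "upper_left":  (-1.0,  1.0),
        "upper_right": ( 1.0,  1.0),
        "lower_left":  (-1.0, -1.0),
        "lower_right": ( 1.0, -1.0),
    }

    def linearize(raw_reading):
        # Undo the log amplifier's compression to recover a value
        # proportional to the received field strength.
        return math.exp(raw_reading)

    def calibration_gain(reference_reading, target=1.0):
        # Taken while the hand touches the reference electrode (the
        # broken IR beam flags this state): scale the channel so that
        # every participant appears to radiate equally.
        return target / linearize(reference_reading)

    def estimate_position(readings, gain):
        # Weighted centroid of the antenna positions; a stronger
        # (closer) coupling pulls the estimate toward that antenna.
        x = y = total = 0.0
        for name, raw in readings.items():
            w = gain * linearize(raw)
            ax, ay = ANTENNAS[name]
            x += w * ax
            y += w * ay
            total += w
        return (x / total, y / total) if total > 0 else (0.0, 0.0)

With the gain captured during calibration, each frame of four channel readings maps to a rough 2-D hand position; the real system instead forwards the channels to ROGUS, which extracts richer gestural characteristics to drive the video and music.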

Papers about this system:

A background paper about the capacitive technology behind this system:
J.A. Paradiso and N. Gershenfeld, "Musical Applications of Electric Field Sensing," Computer Music Journal, Vol. 21, No. 2, Summer 1997, pp. 69-89.

A description of the Gesture Wall is given here:
J.R. Smith, T. White, C. Dodge, D. Allport, J. Paradiso, and N. Gershenfeld, "Electric Field Sensing for Graphical Interfaces," IEEE Computer Graphics and Applications, Vol. 18, No. 3, May-June 1998, pp. 54-60.

A further description of the Gesture Wall is given here:
J.A. Paradiso, "The Brain Opera Technology: New Instruments and Gestural Sensors for Musical Interaction and Performance," Journal of New Music Research, Vol. 28, No. 2, 1999, pp. 130-149.

See also:
J.A. Paradiso, "Several Sensor Approaches that Retrofit Large Surfaces for Interactivity" Presented at the UbiComp 2002 Workshop on Collaboration with Interactive Walls and Tables, Gothenburg, Sweden, September 29, 2002.

Credits:

Joe Paradiso - Sensor and hardware design
Josh Smith - Embedded code
Rick Ciliberto - Electronics Fabrication
Joel Rosenberg - Electronics Support
Kai-Yuh Hsiao - Music Software
Chris Dodge - Video Software
Sharon Daniel - Imagery and Video
Pete Rice and Tod Machover - Inspiration
