OVERVIEW
Gesture Piano is a "piano" controlled by finger movements detected by the computer's camera.
DOMAIN
Physical Computing
Wearable technology
Remote Control
GOAL
Gesture Piano is a project for the course Interaction Lab. The goal is to experience and practice physical computing while getting a first taste of wearable technology.
TOOL
Arduino / Processing /
3D modeling / Laser Cutting
ROLE (Team size 2)
Ideation
Engineer
Installation design
Background
I have always been interested in wearable technology and remote control, so I enrolled in Interaction Lab to experience physical computing and installation building. Gesture Piano is an attempt to explore these fields. We were inspired by the way people interact with a piano: is it possible to make music without any object, using only our gestures? We wanted to build a gesture-controlled piano as an evolution in how we interact with musical instruments.
How It Works

Users wearing a glove with green stickers (the same color as a green screen, a color rarely found in everyday environments) move their fingers in front of the web camera. I used a color-detection technique in Processing: the camera detects the position and movement of each finger. Once a finger moves over a certain region, the webcam registers its position and the computer triggers the corresponding note. At the same time, through serial communication with an Arduino, the self-fabricated "physical" piano keys, each connected to a motor, are triggered to move. In this way we create the effect of remotely controlling the piano, and the gesture piano works.
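The core detection loop can be summarized in a minimal Processing sketch like the one below. It only shows the idea for a single green dot; the tracked green value, the distance threshold, and the five-key mapping are illustrative assumptions, not our exact values.

import processing.video.*;

Capture video;
color trackColor;            // the green-sticker color (assumed value)
float threshold = 40;        // color-distance threshold (assumed value)

void setup() {
  size(640, 480);
  video = new Capture(this, 640, 480);
  video.start();
  trackColor = color(0, 255, 0);
}

void draw() {
  if (video.available()) {
    video.read();
  }
  video.loadPixels();
  image(video, 0, 0);

  // Search every pixel for the one closest to the tracked green color
  float closestDist = 500;   // larger than any possible RGB distance
  int bestX = -1;
  int bestY = -1;
  for (int y = 0; y < video.height; y++) {
    for (int x = 0; x < video.width; x++) {
      color c = video.pixels[x + y * video.width];
      float d = dist(red(c), green(c), blue(c),
                     red(trackColor), green(trackColor), blue(trackColor));
      if (d < closestDist) {
        closestDist = d;
        bestX = x;
        bestY = y;
      }
    }
  }

  // If a close-enough green pixel was found, map its x position to one of five keys
  if (closestDist < threshold) {
    int key = constrain(int(map(bestX, 0, video.width, 0, 5)), 0, 4);
    ellipse(bestX, bestY, 20, 20);   // mark the detected finger
    // here the sketch would play the note for `key`
    // and send `key` to the Arduino over serial
  }
}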
Process
Code
Following Daniel Shiffman's tutorial on multiple color detection, we decided to use color tracking to detect finger movement. The most significant step for me was figuring out how to match the fingers to the keys in order, using a 2D array in Processing. We had not discussed 2D arrays much in class, so I did not understand the syntax very well. As the engineer on the team, I spent afternoons in the lab, studied tutorials, and consulted the fellows. Once the 2D array worked, I was able to store the positions of the finger dots and reorder them before assigning them to newly mapped positions on the screen. Here is the gist of the 2D-array syntax I finally figured out; the video shows the testing outcome.
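In rough form, the idea looks like this. It is a simplified sketch of the approach, not necessarily the exact code: each row of the 2D array holds one finger dot, and the rows are sorted by x position so that the leftmost dot maps to the leftmost key.

// fingers[i][0] = x position of finger i, fingers[i][1] = y position
float[][] fingers = new float[5][2];

// Reorder the rows by x position (simple bubble sort) so that
// fingers[0] is the leftmost dot and fingers[4] the rightmost
void sortFingersByX(float[][] f) {
  for (int i = 0; i < f.length - 1; i++) {
    for (int j = 0; j < f.length - 1 - i; j++) {
      if (f[j][0] > f[j + 1][0]) {
        float[] tmp = f[j];
        f[j] = f[j + 1];
        f[j + 1] = tmp;
      }
    }
  }
}

// After sorting, finger i can be assigned to key i on the screen.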

Fabrication
In production, we used five motors to trigger the piano keys' movements by sending values from Processing to the Arduino. We used a glove with colored stickers on the fingers for color tracking. We tested blue tape first, but it turned out to be too dark for the camera to read accurately, so we finally chose the green-screen color, since it is rarely found in everyday environments. For the body of the piano, we 3D-modeled a box and laser-cut it.
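On the communication side, the Processing-to-Arduino link can be as simple as writing one key number per press over serial; the Arduino sketch on the other end reads the byte and drives the matching motor. The snippet below is a simplified Processing-side sketch; the port index, baud rate, and one-byte protocol are assumptions for illustration.

import processing.serial.*;

Serial arduino;

void setup() {
  // Assumed: the Arduino shows up as the first serial port, at 9600 baud
  arduino = new Serial(this, Serial.list()[0], 9600);
}

// Call this when a finger lands on a key (0-4);
// the Arduino reads the byte and moves the corresponding motor
void pressKey(int key) {
  arduino.write(key);
}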



