
OVERVIEW

Gesture Piano is a "piano" controlled by finger movements detected through the computer's camera.

DOMAIN

Physical Computing

Wearable Technology

Remote Control

GOAL

Gesture Piano is a project for the course Interaction Lab. The goal is to experience and practice physical computing while getting a first taste of wearable technology.

TOOL 

Arduino / Processing

3D Modeling / Laser Cutting

ROLE (Team of 2)

Ideation

Engineer

Installation Design

How It Works

Background
 

I have always been interested in wearable technology and remote control, so I enrolled in Interaction Lab to experience physical computing and installation building. Gesture Piano is an attempt to explore these fields. We were inspired by the way people interact with a piano: is it possible to make music without any object, using only our gestures? We wanted to build a gesture-controlled piano as an evolution of how we interact with musical instruments.

How It Works


 

Users wear a glove with green stickers (the same color as a green screen, which is rare in everyday scenes) and move their fingers in front of the web camera. I used the color detection technique in Processing: the camera detects the movement and position of the fingers, and once a finger moves into a certain region, its position is detected and the corresponding note is triggered on the computer. At the same time, through serial communication with Arduino, the self-fabricated "physical" piano keys, each connected to a motor, are triggered to move. In this way we create the effect of remotely controlling the piano, and the gesture piano works.
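Conceptually, the Processing side looks something like the minimal sketch below, which tracks a single color and averages the matching pixels into one finger position. The trackColor and threshold values are illustrative placeholders rather than our exact settings, and the real project tracks five separate dots.

```java
// A minimal sketch of the color-tracking idea in Processing (single color).
// trackColor and threshold are illustrative, not the project's exact values.
import processing.video.*;

Capture video;
color trackColor = color(0, 255, 0);  // green-screen green on the glove stickers
float threshold = 40;                 // how close a pixel must be to count as a match

void setup() {
  size(640, 480);
  video = new Capture(this, width, height);
  video.start();
}

void captureEvent(Capture v) {
  v.read();
}

void draw() {
  image(video, 0, 0);
  video.loadPixels();

  // Average the positions of all pixels close enough to the tracked color.
  float sumX = 0, sumY = 0;
  int count = 0;
  for (int y = 0; y < video.height; y++) {
    for (int x = 0; x < video.width; x++) {
      color c = video.pixels[x + y * video.width];
      float d = dist(red(c), green(c), blue(c),
                     red(trackColor), green(trackColor), blue(trackColor));
      if (d < threshold) {
        sumX += x;
        sumY += y;
        count++;
      }
    }
  }

  if (count > 0) {
    // The averaged position stands in for one finger; a note would be
    // triggered when this point enters a key's region of the screen.
    ellipse(sumX / count, sumY / count, 16, 16);
  }
}
```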

Process

Code

Following Daniel Shiffman's tutorial on multiple color detection, we decided to use color tracking to mimic finger-movement detection. The most significant step for me was figuring out how to match the fingers to the keys in order, using a 2D array in Processing. We hadn't discussed 2D arrays much in class, so I didn't understand the syntax well at first. As the team's engineer, I stayed in the lab for several afternoons, studied tutorials, and consulted fellows. Once the 2D array worked, I was able to store the positions of the finger dots and reorder them before assigning them to newly mapped positions on the screen. Below is a sketch of the 2D-array logic I finally figured out; the video shows the testing outcome.
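Roughly, the 2D-array part looks like the snippet below: each row stores one finger's (x, y) dot, the rows are sorted left to right by x so that finger 0 always maps to the leftmost key, and each finger is then mapped to a key index. The helper names (storeFinger, sortFingersByX, keyForFinger) are illustrative, not the exact code from the project.

```java
// A sketch of the 2D-array idea: each row is one finger, columns hold (x, y).
int fingerCount = 5;
float[][] fingers = new float[fingerCount][2];

// Store a detected dot's position in the array.
void storeFinger(int i, float x, float y) {
  fingers[i][0] = x;
  fingers[i][1] = y;
}

// Reorder the fingers from left to right by their x position,
// so finger 0 always maps to the leftmost key and finger 4 to the rightmost.
void sortFingersByX() {
  for (int i = 0; i < fingerCount - 1; i++) {
    for (int j = 0; j < fingerCount - 1 - i; j++) {
      if (fingers[j][0] > fingers[j + 1][0]) {
        float[] temp = fingers[j];
        fingers[j] = fingers[j + 1];
        fingers[j + 1] = temp;
      }
    }
  }
}

// Map an ordered finger to one of the key regions across the screen width.
int keyForFinger(int i) {
  return constrain((int) map(fingers[i][0], 0, width, 0, fingerCount), 0, fingerCount - 1);
}
```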

​Fabrication

For production, we used five motors to move the piano keys, triggered by values sent from Processing to Arduino. We used a glove with colored stickers on the fingers for color tracking. We tested blue tape first, but it turned out to be too dark for the computer to read accurately, so we finally chose the green-screen color, since it is the least common color in real life. For the body of the piano, we 3D-modeled a box and laser-cut it.
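On the Processing side, the triggering reduces to writing a key index over serial whenever a finger enters its key region; the Arduino reads that byte and moves the matching motor. The sketch below assumes a 9600-baud connection on the first listed port, and triggerKey is a hypothetical helper name rather than our exact code.

```java
// A minimal sketch of the Processing-to-Arduino side of the project.
// Assumes the Arduino is on the first serial port and reads bytes 0..4 as key indices.
import processing.serial.*;

Serial port;

void setup() {
  // Serial.list()[0] is an assumption; the actual port depends on the machine.
  port = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  // ...color tracking and key mapping happen here...
}

// Called when a tracked finger crosses into its key's region of the screen.
void triggerKey(int keyIndex) {
  port.write(keyIndex);  // the Arduino moves the motor for this key
}
```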


Reflection and Takeaway

Due to time constraints, we were not able to scale the Processing window to full screen, so the interaction experience is not fully optimized. Glitchy detection is also a troublesome problem; we haven't figured out the cause, and we might try a Kinect camera next time. In addition, the movement of the physical "piano keys" is not accurate enough, which required testing and adjustment over and over again. However, as a first trial in physical computing, we basically achieved what we wanted, and we are both proud of this project.

The takeaway from this project is hands-on experience with physical computing and remote control. I believe human-computer interaction shouldn't be limited to the screen. I also practiced physical production skills like laser cutting and 3D printing, which gives me more room to explore HCI.

Let's Chat!

4th Floor, 370 Jay Street, Brooklyn, NY 11201 cy1503@nyu.edu  |  Tel: +1 914 447 2577
