This week we’re looking at some of the projects shown at NYU’s ITP Winter Show.
Jack Kalish designed a musical interface that puts your face in control. When I tried it, it only detected smiles and mouth-agape expressions of happiness through facial mapping, but in principle it could track a much wider range of emotions. Above is a performance of the piece, which he explains:
What exactly is music, and why is it that it can instill such strong emotions in people? This is the question I sought to explore when I approached this project. One theory about music is that it simulates the natural cycles and rhythms in our bodies. For instance, when you hear a loud, fast bass line, it excites and arouses you because it simulates a loud, fast heartbeat you might feel in your own body. With this project, I wish to take this idea that music instills emotion and turn it on its head by having emotion create music.
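The core idea — detected emotion driving musical parameters, with the heartbeat analogy setting the pace — can be sketched in a few lines. This is a hypothetical illustration of the mapping concept, not Kalish's actual implementation; the emotion labels and tempo/pitch values are invented for the example:

```python
# Hypothetical sketch: map a detected emotion to musical parameters,
# echoing the heartbeat analogy (excited states -> faster tempo).
# These labels and values are illustrative, not from Kalish's project.

EMOTION_TO_MUSIC = {
    # emotion: (tempo in BPM, base MIDI pitch)
    "happy":     (140, 72),  # fast and bright
    "surprised": (160, 76),  # faster, higher
    "neutral":   (90, 60),   # resting-heart-rate territory
    "sad":       (60, 55),   # slow and low
}

def music_params(emotion: str):
    """Return (tempo, pitch) for an emotion, defaulting to neutral."""
    return EMOTION_TO_MUSIC.get(emotion, EMOTION_TO_MUSIC["neutral"])

print(music_params("happy"))    # (140, 72)
print(music_params("unknown"))  # unrecognized input falls back to (90, 60)
```

A real system would feed the `emotion` argument from a facial-expression tracker and hand the resulting tempo and pitch to a synthesizer, but the mapping layer itself can stay this simple.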
It’d be amazing to see more work like this for the Kinect. Titles like Dance Central capture the kinesthetic nature of, well, dance, and while the Kinect may not match the facial mapping in Kalish’s work, connecting emotional states to music could point to an interesting way forward. Opera Hero, anyone?
See more of Kalish’s work here.