Ever wonder what exactly your friend, or boss, was thinking while you talked to them? If Rosalind Picard has her way, soon you’ll be able to find out.
Picard is a Professor of Media Arts and Sciences at MIT. Together with Rana el Kaliouby, a research scientist who gave a TED lecture in June 2010 titled “Improving Lives with Emotionally Intelligent Technology,” she has created a prototype for a pair of “social x-ray specs” that tell the wearer how people are responding to them mid-conversation. Does your listener say, “I understand,” even though they’re confused? A light inside the frames blinks red, and a tiny voice relays the message through attached headphones.
But how does it work?
Inside the glasses is a camera the size of a grain of rice, connected by a wire snaking down to a piece of dedicated computing hardware about the size of a deck of cards. The camera tracks 24 “feature points” on your conversation partner’s face, and software developed by Picard analyses the myriad micro-expressions they form, noting how often each appears and for how long. It then compares that data against a bank of known expressions.
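To get a feel for the idea, here is a minimal, entirely hypothetical sketch of matching an observed set of facial feature points against a bank of known expressions. The real system's algorithms are not described in detail; this toy version simply uses nearest-neighbour comparison on flattened (x, y) coordinates, and all names and numbers are illustrative.

```python
# Toy sketch (not the actual system): classify an expression by
# nearest-neighbour comparison of 24 facial "feature point"
# coordinates against a bank of known expressions.
import math

NUM_POINTS = 24  # feature points tracked per face, per the article


def distance(a, b):
    """Euclidean distance between two flattened feature-point vectors."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))


def classify(observed, expression_bank):
    """Return the label of the known expression closest to `observed`."""
    return min(expression_bank,
               key=lambda label: distance(observed, expression_bank[label]))


# Made-up bank: two "expressions" with placeholder coordinates.
bank = {
    "understanding": [0.5] * (2 * NUM_POINTS),
    "confusion": [0.8] * (2 * NUM_POINTS),
}

print(classify([0.78] * (2 * NUM_POINTS), bank))  # → confusion
```

A real classifier would also weigh how long each micro-expression lasts and how often it recurs, as the article notes, rather than matching a single static snapshot.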
The device was originally developed to help people with autism recognize social cues. The potential applications are vast, from more effective salespeople to augmented-reality computer games. No word yet on whether “Rowdy” Roddy Piper has pre-ordered a pair.