Microsoft’s robot assistants will follow your every move

In an effort to prove that humans and robots can interact on a more meaningful level, Microsoft has showcased its Situated Interaction project, an immersive experience led by distinguished scientist Eric Horvitz and his colleague Dan Bohus. The experience features elevators that can predict whether you need a ride, and robot secretaries that rely on work calendars to allow or deny appointments.

The project relies on intensive integration of multiple computational competencies and methods, including machine vision, natural language processing, machine learning, automated planning, speech recognition, acoustical analysis, and sociolinguistics. It also pushes into a new area of research: how to automate processes and systems that understand multiparty interaction.

“We’re addressing core challenges in artificial intelligence,” Horvitz says. “The goal is to build systems that can coordinate and collaborate with people in a fluid, natural manner.”

Horvitz’s assistant, for example, can access his online calendar, detect whether he’s in the office, infer how busy he is, predict when he’ll finish a certain task based on his past habits, and even estimate when he’ll conclude a conversation, drawing on the length of his past conversations.

“Intelligent, supportive assistants that assist and complement people are a key aspiration in computer science,” Horvitz says, which is why he expects fierce competition from other companies working in the space.

While Microsoft claims the applications could be as wide-ranging as aerospace, medicine, and disaster relief, the focus right now seems to be on making mobile devices more intuitive. As with many projects that aim to replace people with robots, the question to ask is: why are we in such a hurry to replace ourselves?

This post was originally written by Ross Brooks for PSFK.