Versions is the essential guide to virtual reality and beyond. It investigates the rapidly deteriorating boundary between the real world and the one behind the screen. Versions launched in 2016 at the eponymous conference dedicated to creativity and VR with the New Museum’s incubator NEW INC.

Pitches, questions, and concerns can be directed to info@killscreen.com



Autism Glass wants to help kids on the autism spectrum
What happened to Google Glass? Between non-wearers' worries about being secretly recorded and the fact that the headset was a little ugly, Glass plummeted from the latest tech hype to totally forgotten. But the abandoned augmented reality eyewear has found new purpose at Stanford University. Stanford Medicine recently revealed its project Autism Glass, an app for Google Glass that uses the device as a behavioral aid, helping kids with Autism Spectrum Disorder (ASD) communicate with others in their daily lives.

About one in 68 children in the United States has ASD, according to the Centers for Disease Control and Prevention. In most cases, people with ASD struggle with social interactions that others probably take for granted. A person with ASD might have a harder time interpreting facial expressions (like a smile or a frown), leaving them feeling uncomfortable and ultimately leading them to avoid eye contact, or social situations altogether. Much of the hardship of ASD stems from feeling alienated, and Autism Glass wants to help remedy that.

The glasses effectively code emotions for the wearer

“We have developed a system using machine learning and artificial intelligence to automate facial expression recognition that runs on wearable glasses and delivers real-time social cues,” reads the Autism Glass website. Linked to an app on an Android device (an iPhone version is planned), Google Glass's outward-facing camera records the person the wearer is talking to, and the app's AI interprets their expressions. Rather than clogging the wearer's view with long, complicated words, the glasses opt for a simpler route: emojis. They effectively code emotions for the wearer, be it happy, sad, angry, or any other emotion, and an instantly recognizable, color-coded emoji pops up on the lens's screen.
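The cue-display step described above can be pictured as a simple lookup from a classifier's emotion label to an emoji and a color. This is an illustrative sketch only; the labels, emoji, and colors below are assumptions, not the project's actual classifier output or interface.

```python
# Hypothetical mapping from a recognized emotion label to the
# (emoji, color) cue shown on the lens display.
EMOTION_DISPLAY = {
    "happy":     ("🙂", "green"),
    "sad":       ("🙁", "blue"),
    "angry":     ("😠", "red"),
    "surprised": ("😮", "yellow"),
}

def display_cue(emotion_label):
    """Return the (emoji, color) cue to render for a classifier label."""
    # Fall back to a neutral gray cue when the label is unrecognized,
    # so the wearer is never shown a misleading signal.
    return EMOTION_DISPLAY.get(emotion_label, ("😐", "gray"))

print(display_cue("angry"))  # the red angry-face cue
```

A color-first fallback like this reflects the design rationale Kline describes: color registers before text, so even a glanced-at cue carries information.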

“With vision, as you start on the outside and work your way in, color gets registered before text does and a long word like ‘disgusted’ may not be fully understood, or in some cases can’t even be read by some people,” lead mobility engineer for Autism Glass, Aaron Kline, told Motherboard. “Having a color-coded system with expressive emojis has shown so far to make the most sense to users.” Following a successful 40-person pilot study, in which kids ages 6-16 tested Autism Glass in their daily lives, the team now seeks to embark on a larger 100-person clinical at-home study with more families.

While the project isn't coming to market anytime soon, you can read more about Autism Glass from Stanford.

Versions is brought to you by Nod Labs,
Precision wireless controllers for your virtual, augmented and actual reality.