Versions is the essential guide to virtual reality and beyond. It investigates the rapidly deteriorating boundary between the real world and the one behind the screen. Versions launched in 2016 at the eponymous conference dedicated to creativity and VR with the New Museum’s incubator NEW INC.

The next stage in AR will let Pokémon Go interact with the real world

I’m walking to my local corner store, phone firmly in my grasp, to grab some milk and who knows what else, maybe an iced tea if it’s on sale. Since the release of Pokémon Go in early July, the flippant app has cemented itself as a new part of my walking-anywhere routine. I’m not at an absurdly high level in the game (in fact, I barely inched to level 18 just the other day). Yet I still open it once or twice daily, fling a few Pokéballs in my Zubat-infested neighborhood, and usually call it a day.

As I lackadaisically roam the small shop, my phone buzzes—there’s a Staryu nearby. I glance around, trying to spot the starfish-esque creature, only to find it right in front of me in the candy aisle. Then, as I’m trying to catch it, a stranger walks right through the Staryu, and it doesn’t budge at all. Talk about a break in AR immersion.

Luckily, though, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) are hard at work figuring out a way to create fully interactive dynamic video, and even exploring the potential for dynamic augmented reality. By observing how objects respond to forces people can apply (such as shaking) and carrying that motion over into other tech, dynamic video and augmented reality could one day make that Staryu leap out of the way, or at least rustle some nearby candy bars.

a Caterpie rustling a bush

The problem in AR that researchers are studying is that when the digital world overlays the physical one, it simply sits on top of it, not actually interacting with the environment at all. With MIT CSAIL’s proposal, if AR technology catches up that is, digital objects could cause movements in the scenes we see on screen: a Caterpie rustling a bush, say, or a Pikachu bouncing across a wobbly playground fixture. The core idea behind dynamic augmented reality, specifically, is taking an actual video of a mostly static object, like a plant, and turning it into something interactive and dynamic; something that is still movable and reacts to other objects, if only slightly.
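At a high level, the technique from the paper builds a small set of vibration modes from video and treats each one as a damped oscillator: a virtual force gets projected onto the modes, and the ringing response is mapped back to pixel displacements. As a rough illustration only (the function name, mode shapes, and numbers below are all made up for the sketch, not taken from Davis’ code), a toy version of that modal simulation might look like:

```python
import numpy as np

def modal_response(force, shapes, freqs, damping, dt=1/30.0, steps=90):
    """Toy modal simulation: pixel displacements from a virtual poke.

    shapes: (num_modes, num_pixels) made-up image-space mode shapes
    freqs: natural frequency of each mode (rad/s)
    damping: damping ratio (how quickly the ringing dies out)
    force: (num_pixels,) impulse applied at t = 0
    Returns per-frame displacements, shape (steps, num_pixels).
    """
    q = np.zeros(shapes.shape[0])      # modal displacement
    q_dot = shapes @ force             # impulse projected onto each mode
    frames = np.zeros((steps, shapes.shape[1]))
    for t in range(steps):
        # Semi-implicit Euler step of q'' + 2*z*w*q' + w^2*q = 0 per mode
        q_ddot = -2.0 * damping * freqs * q_dot - freqs**2 * q
        q_dot += dt * q_ddot
        q += dt * q_dot
        frames[t] = shapes.T @ q       # back to image-space displacement
    return frames

# Toy example: two invented modes over four "pixels", poked at pixel 0.
shapes = np.array([[1.0, 0.5, 0.2, 0.1],
                   [0.3, -0.4, 0.6, -0.2]])
frames = modal_response(np.array([1.0, 0.0, 0.0, 0.0]),
                        shapes, freqs=np.array([6.0, 12.0]), damping=0.1)
```

The real system estimates its mode shapes from tiny motions in the source video; here they are hard-coded just to keep the sketch runnable. The poke ringing down over a few seconds is what would make a bush visibly rustle and then settle.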

Pikachu actually interacting with a bush like he’s a squirrel or somethin’.

The project’s being led by researcher Abe Davis for his PhD dissertation at MIT. Davis wrote a paper, “Image-Space Modal Bases for Plausible Manipulation of Objects in Video,” alongside fellow researchers Justin G. Chen and Frédo Durand, in addition to launching a website and a series of videos detailing the MIT-patented research. Davis insists this is not a commercial endeavor at this time (nor a collaboration whatsoever with Niantic or The Pokémon Company), but a mere exercise in academia.

Want to know more about the workings of interactive dynamic video and augmented reality? You can read more on Davis’ website, and watch some video examples while you’re at it.

Versions is brought to you by Nod Labs,
Precision wireless controllers for your virtual, augmented and actual reality.