Using AikenLabs’ software, programmers and users can connect any number of pre-defined real-world actions to on-screen activity. I used it to play a rudimentary PC game. When I looked around in the real world, my view on screen changed. I could look left, right, up and down in the CES show hall and my on-screen view looked all around in the virtual world, as well. With the two sensors on my hands, I targeted virtual trees with one hand and pushed them over with the other. It made me feel just a little bit like a Jedi Master.
Will games finally get the kind of immersive, responsive headgear promised by defunct products like the Virtual Boy? I'll admit the Immersive Motion Sensory System still makes you look a little silly when you wear it.
The new technology is slated for release sometime this spring, both as a development tool and as consumer products: a desktop version, plus $149 Bluetooth-enabled motion sensors for use with smartphone games.