The idea is a blending of hardware and software: a headset that looks a little like the one from Strange Days, the 1995 film written by James Cameron, complete with a set of sensors built to read your brain waves.
The software then interprets those brain waves in such a way as to let users manipulate objects onscreen with nothing but their minds.
So that's why I've come to this office in downtown San Francisco, where I'm face-to-face with this little orange cube. It's kind of mocking me, daring me to make it disappear.
Here's how it works: The software offers several actions you can take. Taking the disappearing cube as an example, once you're hooked up to the headset, you're directed to run a short, six-second test in which you concentrate on doing something, anything, with your mind: relax, focus, whatever.
Then, once you've completed the test, it's you against the cube. The challenge is to see if you can reproduce what you were doing with your mind during the test. If you can, the cube slowly disappears.
In my case, it disappeared, then came back, then disappeared again and then came back. Repeat.
They also ran me through another example, this time trying to pull the cube forward. This one was harder because the mental activity I chose to pair with the challenge demanded more concentration: I tensed up my head, so to speak, and imagined the act of pulling the cube forward. It didn't work very well.
But with the disappearing act, I simply relaxed my mind, with much better results.
Of course, there's no inherent relationship between brain activity that is consciously trying to "pull" the cube forward and what happens onscreen. That is to say, it doesn't matter what you're doing with your mind, so long as what you do during the six-second calibration matches what you do when you try to trigger the action.
So really, the software is just looking for a pattern match. It's not all that complicated a concept, though I'm sure it's a pretty difficult engineering feat.
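To make the pattern-matching idea concrete, here's a rough sketch of how such a system could work. This is not Emotiv's actual algorithm; the function names, the choice of frequency bands, and the 128 Hz sampling rate are all my own assumptions. The idea is simply: extract a feature vector from the six-second calibration clip, then score live brain-wave windows against that template.

```python
import numpy as np

def band_powers(window, fs=128):
    """Crude EEG feature: signal power in a few frequency bands, via FFT."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    bands = [(4, 8), (8, 13), (13, 30)]  # theta, alpha, beta (Hz)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in bands])

def calibrate(signal, fs=128):
    """Average band-power features over the six-second training clip."""
    windows = signal.reshape(-1, fs)  # split into one-second windows
    return np.mean([band_powers(w, fs) for w in windows], axis=0)

def match_score(template, live_window, fs=128):
    """Cosine similarity between the calibration template and live features."""
    live = band_powers(live_window, fs)
    return float(np.dot(template, live) /
                 (np.linalg.norm(template) * np.linalg.norm(live) + 1e-12))

# Toy demo with synthetic data: "relaxed" = a strong 10 Hz alpha rhythm.
fs = 128
t = np.arange(6 * fs) / fs
rng = np.random.default_rng(0)
relaxed = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
template = calibrate(relaxed, fs)

live_relaxed = np.sin(2 * np.pi * 10 * t[:fs]) + 0.3 * rng.standard_normal(fs)
live_tense = rng.standard_normal(fs)  # broadband noise, no alpha peak
# The relaxed window should score closer to the template than the noisy one,
# and the game would map that score onto, say, the cube's opacity.
print(match_score(template, live_relaxed))
print(match_score(template, live_tense))
```

In this framing, the cube fading in and out as I watched would just be the similarity score drifting above and below some threshold as my mental state wandered from the calibrated pattern.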
Emotiv has also built technology designed to read your facial expressions and emotions. So while there, I saw a demonstration where someone wearing the headset would smile, frown, smile again, and so forth. And a goofy-looking face on the monitor would repeat the expression.
For now, this is all still in the prototype phase. But Emotiv promised me that the headset will be available in time for Christmas this year, at a price of US$299. It'll come bundled with a game geared toward using the technology, and presumably more games will follow. The success of this product, I think, will hinge on how easy it is for developers to build the technology into their games. And that, presumably, is why the product is being showcased during this week's Game Developers Conference in San Francisco.
Emotiv also said that the company is working on a partnership with IBM to integrate the brain control interface technology with Big Blue's virtual worlds projects.
To be perfectly honest, I think this technology is a ways from being ready for any hard-core application. Based on what I saw, it's very interesting and even quite impressive. But I just don't know if it can improve fast enough to make a real difference in the market in the next year. Perhaps it can, and if so, that would be fantastic.
Nintendo's Wii and Guitar Hero have opened people's eyes to all-new interfaces, and I'm sure this would fit into that category. But what has made the Wii and the Guitar Hero controller so successful is that they are easy and intuitive to use. Whether Emotiv's technology is equally so is something I'll have to reserve judgment on.
Still, I was able to make that cube disappear without using my hands. And that's something.