One aspect of augmented reality is the human-computer interface we will need to control our computers. We use a mouse and keyboard for our desktop PCs, and touch for our iPhones, but those methods may become outdated as the technology advances.
Last week a University of Wisconsin-Madison research team successfully posted to Twitter with their minds. The benefit for those with debilitating diseases that destroy the body is obvious. I’ve previously explored the available technology in reference to my childhood friend Bill, who was quadriplegic due to a serious brain tumor.
Currently the technology, as seen in the video, is cumbersome to use. That is not much of an obstacle if you cannot type manually, so the opportunities are huge.
For those of us without difficulties in accessing computers, the technology has a long way to go before it becomes usable, particularly in the mobility department. The computer’s ability to recognize the correct brainwaves has to go beyond hunt-and-peck letter picking for the technology to gain a foothold in a wider scope.
Having said that, I don’t want to lose sight of the wonderful tool this will become for those who need it most. When I see these technologies, I sometimes wonder if Bill died because he didn’t have much to live for; his mind had given up. Stephen Hawking has lived with ALS (Lou Gehrig’s disease) for decades past the average lifespan of three years, and I believe his strong will in staying interested in his surroundings has contributed to his extended lifespan. With a tool like this, maybe Bill could have stayed connected with the world and lived a longer, fuller life.