
The Telepathy Interface

Our great-grandchildren won’t remember a time they couldn’t use telepathy across the ARNet, just as children today can’t fathom a world without text messaging.

Neuroscience researchers at the University of California, Berkeley can translate recorded patterns of neural activity to determine which picture from a sample set a subject is viewing.  In simpler terms: they can read your mind.

While they report that finer details can be ascertained from the scans, this kind of calibrated mind-reading demands enormous scanning power.  If instead they focused on calibrating lower-powered scanners to detect a smaller sub-set of images, or “commands,” we could use those as a command language for thought-controlled computers.

Current neural-interface devices like the NeuroSky MindSet or the OCZ Neural Impulse Actuator have limited degrees of freedom.  Adapting the scanning ability from this research could let us issue a diverse sub-set of commands to a computer: some as simple to fire off as a text message, others complex enough to drive advanced controls.
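To make the idea concrete, here is a minimal sketch of what such a "command language" layer might look like. Everything here is hypothetical: the command names, the confidence scores, and the dispatch logic are invented for illustration, and no real headset API is assumed.

```python
# Hypothetical sketch of a thought-command dispatcher.
# A low-powered scanner is calibrated against a small vocabulary of
# "commands" (detected thought patterns); each maps to a computer action.
# All names below are invented for illustration.

COMMAND_ACTIONS = {
    "select": lambda: "item selected",
    "scroll": lambda: "scrolled down",
    "open":   lambda: "application opened",
    "cancel": lambda: "action cancelled",
}

def dispatch(detected_command, confidence, threshold=0.8):
    """Run the action for a detected command, ignoring noisy or unknown readings."""
    if confidence < threshold or detected_command not in COMMAND_ACTIONS:
        return None  # reading too uncertain, or not in the calibrated vocabulary
    return COMMAND_ACTIONS[detected_command]()
```

The confidence threshold is the key design choice: a small, well-separated vocabulary lets a weak scanner reject ambiguous readings rather than misfire, which is exactly the trade-off suggested above.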

While this type of interface sounds like science fiction, so, once, were rocket ships that went to the moon.  And if we ever get thought-controlled fighting robots like the ones Ichikawa is using for the 16th Annual Robo-One Gladiatorial Combat tournament, I want mine to look like Bruce Willis in Die Hard (and not the toupee-wearing version from the new movie Surrogates).

Yippee-ki-yay, Motherf###er!

[Image: Bruce Willis in Die Hard]

Fighting robots via @peterhorvath