Thursday, April 24, 2008

Future Science: Augmented Wetware Processor

During a recent hallway conversation a friend and I were considering heads-up displays in aircraft and automobiles, and the difficulty of doing the same thing with spectacles. A lot of work has already been done in these areas, and some remarkable things have been built that add information to the optical channel. Since the eyes have the highest signal-processing bandwidth of our senses, projecting information onto the retinas (either indirectly, by annotating the "window" we look through, or directly, by imaging into the eye) seems like the best way to push more immediate knowledge into the brain.

But is it really? I suspect we'll bypass the eyes completely in the future, instead augmenting the brain's wetware directly. After gaining a much deeper understanding of our neural networks and the parallel processing algorithms in our brain, we ought to be able to determine what has our attention from moment to moment. Using that as the driving input and factoring in enough context, we should be able to add information to what we are conscious of in real time by stimulating the brain artificially from within, using processors and a wetware interface device. Imagine that whenever you looked at, heard, tasted, smelled, or touched something, you would immediately just "know" all sorts of extra information about the experience, as if it were just "there" already, without ever having had to learn it the hard way through the senses and reasoning.
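To make the loop concrete, here's a toy sketch in Python. Every piece of it is imaginary: the attention decoder, the context gathering, the knowledge lookup, and the "stimulate" step are all stand-ins I made up for illustration, not real interfaces. It only shows the shape of the idea: read what the brain is attending to, combine it with context, fetch related knowledge, and feed the result back in.

```python
# Hypothetical sketch only -- none of these interfaces exist.
# It illustrates the loop: attention -> context -> knowledge -> injection.

from dataclasses import dataclass


@dataclass
class Percept:
    modality: str  # "sight", "sound", "taste", "smell", or "touch"
    label: str     # what the attention decoder thinks you are attending to


def read_attention() -> Percept:
    # Stand-in for decoding "what has our attention from moment to moment".
    return Percept(modality="sight", label="red-tailed hawk")


def gather_context() -> dict:
    # Stand-in for location, time of day, recent activity, and so on.
    return {"location": "hiking trail", "time": "morning"}


def lookup_knowledge(percept: Percept, context: dict) -> str:
    # Stand-in for the "Wetipedia" query: facts relevant to the attended
    # object, filtered by the current context.
    return (f"{percept.label}: common raptor, often seen soaring "
            f"near a {context['location']} in the {context['time']}.")


def stimulate(annotation: str) -> None:
    # Imaginary wetware interface: the knowledge would simply be "there".
    print(f"[injected] {annotation}")


if __name__ == "__main__":
    percept = read_attention()
    context = gather_context()
    stimulate(lookup_knowledge(percept, context))
```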

Abuse by malevolent powers aside, a virtually-connected, immediate-access Wetipedia could extend perception far beyond what any one person is capable of on their own. There are lots of issues, and science has a long way to go before any of this is possible at even the lowest level. But I think it'd be pretty cool.
