Tuesday, February 19, 2013

Touching infrared, feeling sound

The Washington Post recently ran an article about the FDA's approval of the first artificial retina, an implant used to treat a degenerative condition known as retinitis pigmentosa. It's a big step for health care, especially if you've ever known a sighted person who was gradually plunged into darkness during the latter part of their life, when it's much more difficult to adapt to the loss. It's also an advance that raises questions about human augmentation.

Right now the technology is basic. At best it restores shades of gray, and it requires a set of external glasses. However, the rapid pace of advances in both implant technology and miniaturized sensors means that the prospect of artificial retinas that can see into the infrared, ultraviolet, and radio portions of the spectrum isn't all that far-fetched anymore.
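
To give a sense of just how little information the current generation conveys, here's a back-of-the-envelope sketch of the signal chain: the glasses' camera frame gets reduced to a coarse grid of stimulation levels, one per electrode. The grid size, number of brightness levels, and function name below are my own illustration, not the actual device's specs.

```python
import numpy as np

def frame_to_electrode_levels(frame, grid_rows=6, grid_cols=10, levels=8):
    """Downsample a grayscale camera frame to a coarse electrode grid.

    frame: 2D array of pixel intensities (0-255) from the glasses' camera.
    Returns a grid_rows x grid_cols array of quantized stimulation levels,
    one per electrode -- roughly the 'shades of gray' the implant conveys.
    """
    h, w = frame.shape
    # Crop so the frame divides evenly, then average the pixels falling
    # into each electrode's patch of the image.
    patches = frame[:h - h % grid_rows, :w - w % grid_cols]
    patches = patches.reshape(grid_rows, h // grid_rows, grid_cols, w // grid_cols)
    means = patches.mean(axis=(1, 3))
    # Quantize to the handful of brightness levels the electrodes can express.
    return np.round(means / 255 * (levels - 1)).astype(int)

# Example: a simulated 480x640 camera frame reduced to 60 stimulation levels.
frame = np.random.randint(0, 256, size=(480, 640))
print(frame_to_electrode_levels(frame))
```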

That said, could the brain actually make sense of the extra information transmitted down the optic nerve? More than likely, yes.

Interpreting Augmented Senses

A recent study at Duke University has demonstrated the ability of the brain to interpret entirely new types of sensory data through its existing sensory cortices. The study subjects (rats, which are blind to IR) were fitted with infrared sensors that were wired into the centers that process tactile information. At first the animals touched their faces in response to the presence of infrared light, but in short order they demonstrated an ability to locate infrared sources.
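
The core mapping is simple enough to sketch: the stronger the infrared signal at the head-mounted detector, the faster the stimulation pulses delivered to the tactile cortex. The threshold and pulse rates below are illustrative numbers of my own, not parameters from the actual study.

```python
def ir_to_stim_rate(ir_reading, threshold=0.1, max_rate_hz=400.0):
    """Map a normalized IR photodetector reading (0.0-1.0) to a
    stimulation pulse rate for the tactile (whisker) cortex.

    The gist of the Duke experiment: stronger infrared signal ->
    faster stimulation, which the brain initially reads as touch.
    """
    if ir_reading < threshold:
        return 0.0  # below the detection threshold, no stimulation at all
    # Scale linearly between the threshold and full intensity.
    return max_rate_hz * (ir_reading - threshold) / (1.0 - threshold)

for reading in (0.05, 0.3, 0.9):
    print(f"IR {reading:.2f} -> {ir_to_stim_rate(reading):.0f} Hz")
```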

Another type of sensory augmentation I've seen referenced is a system that transmits bat-like echolocation signals into the brain, where they are translated into touch. This allows the user to feel the position of objects around him or her, though it apparently takes a good deal of training to develop this augmented sense. The oddest aspect of the system, however, is that it uses a device that bridges the gap between the machine and the brain via tongue stimulation.

In other words, far from being a GUI (graphical user interface), the user bites down on an electrical stimulator to expand their perception.
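
As a rough sketch of what the translation layer has to do: take a sweep of sonar ranges and turn it into a pattern of intensities on the tongue's electrode grid, with nearer objects producing a stronger tingle. The grid width, sensor range, and function name are assumptions for illustration, not the actual device's design.

```python
def echoes_to_tongue_grid(echo_distances, max_range_m=5.0, grid_width=12):
    """Map a sweep of sonar echo distances to one row of intensities on a
    tongue electrode grid (closer obstacle -> stronger tingle).

    echo_distances: ranges in meters, one per bearing across the sweep,
    left to right.
    """
    row = []
    for d in echo_distances:
        # Clamp to the sensor's range, then invert: near objects feel strong.
        d = min(max(d, 0.0), max_range_m)
        row.append(1.0 - d / max_range_m)
    # Resample the sweep onto the electrode columns.
    step = len(row) / grid_width
    return [row[min(int(i * step), len(row) - 1)] for i in range(grid_width)]

# Example: an open doorway dead ahead, walls close by on either side.
print(echoes_to_tongue_grid([1.0, 1.0, 4.5, 4.5, 4.5, 1.0, 1.0]))
```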

It turns out that the organ of taste is a fat pipe of direct-to-the-brain bandwidth for carrying sensory signals. While that's a "well, duh" observation from an evolutionary biology standpoint, it's something of a dramatic revelation as far as human-machine interfaces go.

That same pathway of direct brain stimulation also holds potential for medical applications. A similar device is currently entering trials for treating traumatic brain injuries.

Android vs. Terminator Vision

So how will soldiers integrate extra sensory information into their situational awareness?

The present explosion of sensor platforms on the battlefield -- and the desire to push that information down to the level of individual shooters -- is already challenging the existing intelligence architecture. How do you integrate so much data into the flow of someone who's already immersed in an environment of shoot / no-shoot decisions?

The short-term solution for the operators of Special Operations Command deployed in Afghanistan has been Android. Or rather, SOCOM personnel have grown adept at using Android OS phones to swap data over encrypted, peer-to-peer local networks.
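
I don't know the details of the actual kit, but the basic pattern is straightforward. Here's a minimal sketch in Python (the phones themselves would be running Android, of course): position reports encrypted with a pre-shared key and broadcast to peers on the local network. The key handling, port number, and message format here are all illustrative.

```python
import json
import socket
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the shared key would be provisioned before the mission.
PRESHARED_KEY = Fernet.generate_key()
PORT = 47474  # arbitrary port for this sketch

def broadcast_report(callsign, lat, lon):
    """Encrypt a position report and broadcast it to peers on the LAN."""
    token = Fernet(PRESHARED_KEY).encrypt(
        json.dumps({"callsign": callsign, "lat": lat, "lon": lon}).encode()
    )
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(token, ("255.255.255.255", PORT))

def receive_report():
    """Listen for one broadcast and decrypt it with the shared key."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind(("", PORT))
        token, _addr = s.recvfrom(4096)
        return json.loads(Fernet(PRESHARED_KEY).decrypt(token))
```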

In the long run, however, augmented reality technologies like Google Glass offer the promise of delivering both interactive real-time maps and some types of enhanced sensory data. The latter includes sound-based bullet tracking and wall-penetrating ultrasound, in addition to IR and thermal imaging. Other types of augmented sense information, such as the echolocation sense of remote touch or 360-degree vision, will likely require some means of direct machine-to-brain interface.

So will it look like something out of the Terminator or Ghost in the Shell films? I hope not.

As traditionally depicted in science fiction, augmented military reality is visually cluttered with all kinds of superfluous but cool-looking data feeds. It's also usually accompanied by useless and distracting audio -- beeps and other 'computer' noises that don't seem to convey anything useful, and that primarily serve to block out background sounds that could mean the difference between life and death.

I suspect that the heads-up displays of first-person shooter games are closer to the mark for future conventional data displays. The direct-to-brain information, however, will likely be its own type of experience -- the closest thing that humanity has ever come to possessing a genuine sixth sense.

