A blind woman sits in a chair holding a video camera focused on a scientist sitting in front of her. She has a device in her mouth, touching her tongue, and there are wires running from that device to the video camera. The woman has been blind since birth and doesn't really know what a rubber ball looks like, but the scientist is holding one. And when he suddenly rolls it in her direction, she puts out a hand to stop it. The blind woman saw the ball. Through her tongue.
Well, not exactly through her tongue, but the device in her mouth sent visual input through her tongue in much the same way that sighted individuals receive visual input through the eyes. In both cases, the initial sensory input mechanism -- the tongue or the eyes -- sends the visual data to the brain, where that data is processed and interpreted to form images. What we're talking about here is electrotactile stimulation for sensory augmentation or substitution, an area of study that involves using encoded electric current to represent sensory information -- information that a person cannot receive through the traditional channel -- and applying that current to the skin, which sends the information to the brain. The brain then learns to interpret that sensory information as if it were being sent through the traditional channel for such data. In the 1960s and '70s, this process was the subject of groundbreaking research in sensory substitution at the Smith-Kettlewell Institute led by Paul Bach-y-Rita, MD, Professor of Orthopedics and Rehabilitation and Biomedical Engineering at the University of Wisconsin, Madison. Now it's the basis for Wicab's BrainPort technology (Dr. Bach-y-Rita is also Chief Scientist and Chairman of the Board of Wicab).
Most of us are familiar with the augmentation or substitution of one sense for another. Eyeglasses are a typical example of sensory augmentation. Braille is a typical example of sensory substitution -- in this case, you're using one sense, touch, to take in information normally intended for another sense, vision. Electrotactile stimulation is a higher-tech method of achieving somewhat similar (although more surprising) results, and it's based on the idea that the brain can interpret sensory information even if it's not provided via the "natural" channel. Dr. Bach-y-Rita puts it this way:
... we do not see with the eyes; the optical image does not go beyond the retina where it is turned into spatio-temporal nerve patterns of [impulses] along the optic nerve fibers. The brain then recreates the images from analysis of the impulse patterns.
The multiple channels that carry sensory information to the brain, from the eyes, ears and skin, for instance, are set up in a similar manner to perform similar activities. All sensory information sent to the brain is carried by nerve fibers in the form of patterns of impulses, and the impulses end up in the different sensory centers of the brain for interpretation. To substitute one sensory input channel for another, you need to correctly encode the nerve signals for the sensory event and send them to the brain through the alternate channel. The brain appears to be flexible when it comes to interpreting sensory input. You can train it to read input from, say, the tactile channel, as visual or balance information, and to act on it accordingly. In JS Online's "Device may be new pathway to the brain," University of Wisconsin biomedical engineer and BrainPort technology co-inventor Mitch Tyler states, "It's a great mystery as to how that process takes place, but the brain can do it if you give it the right information."
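To make the encoding step above concrete, here's a minimal sketch of how a camera frame might be translated into per-electrode stimulation levels for a tongue display. The grid size, the brightness-to-intensity mapping, and the function names are illustrative assumptions for demonstration, not Wicab's actual BrainPort parameters or algorithm:

```python
# Illustrative sketch only: encoding a grayscale camera frame into
# per-electrode stimulation intensities for a hypothetical tongue
# display. Grid size and mapping are assumptions, not BrainPort specs.

def encode_frame(pixels, grid=12, max_brightness=255):
    """Downsample a square grayscale image (a list of rows of 0-255
    values) to a grid x grid electrode array, mapping the average
    brightness of each image region to a stimulation level 0.0-1.0."""
    size = len(pixels)
    block = size // grid  # pixels per electrode along each dimension
    levels = []
    for r in range(grid):
        row = []
        for c in range(grid):
            # Average brightness over this electrode's image region
            total = sum(
                pixels[r * block + i][c * block + j]
                for i in range(block)
                for j in range(block)
            )
            row.append(total / (block * block * max_brightness))
        levels.append(row)
    return levels

# Example: a 24x24 frame, bright on the left half, dark on the right.
frame = [[255] * 12 + [0] * 12 for _ in range(24)]
levels = encode_frame(frame)
# Electrodes under the bright half are driven at full intensity (1.0);
# those under the dark half stay off (0.0).
```

The idea is simply that spatial patterns of brightness become spatial patterns of stimulation on the skin; the brain, given consistent input over time, learns to read those patterns as images.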
In the next section, we'll look more closely at the concepts of electrotactile stimulation.