
Copyright © New Scientist. Archived transcript of June 4, 1994 article by Rosie Mestel, pages 20 - 23.

Hearing pictures, seeing sounds

Ear and eye implants for the deaf and blind are still not living up to their promise.
Rosie Mestel asks whether researchers would do better to exploit the potential of the unimpaired senses.

PETER MEIJER is demonstrating his device - a machine that transforms pictures into patterns of sound. "This is a bright line, running upward, from left to right," he says, and the machine chirps out a sound that starts with a low note and smoothly slides up to a high one, then repeats itself again and again. Next, Meijer plays a tune representing a square (a discordant chunk, chunk). Then he plays a sound representing a circle slowly moving towards the listener: a series of slowly changing blips and blares.

All the sounds seem strange and alien, like warning signals for some unspecified disaster. But Meijer, a physicist-engineer at the Philips Research Laboratories in The Netherlands, hopes that one day they might become familiar, and that people who are blind might one day "see" by interpreting sounds.

By tapping into a sense that remains intact, Meijer's machine and others like it could give blind and deaf people glimpses and whispers of a sensory realm denied them at the moment. For blind people, there are devices like Meijer's, and others that turn pictures into patterns of vibrations on the skin. For deaf people, there are machines that turn sound into vibrations or sounds into pictures.

At the very best, these aids might let blind people recognise cars, houses, trees, even specific faces, as they go about their day-to-day business. And deaf people might understand speech from vibrations on the skin. At the very least, people who are deaf may learn better speaking and lip-reading skills, and blind people may gain access to the world of computer graphics.

Sensory aids such as Meijer's differ from the cochlear implants available for deaf people, or the various implants being developed for blind people, both of which seek to repair the damaged sense directly. Cochlear implants, for instance, detect sound and crudely sort it into several frequency bands. Then they stimulate cells in the auditory nerve via a set of electrodes embedded in the inner ear, providing rudimentary hearing for those who lack the sensory cells that normally do this job.

Meanwhile, scientists at the National Institutes of Health in Bethesda, Maryland, have plans to produce implants for blind people. One approach uses special spectacles that convert patterns of light into electrical stimuli which travel to electrodes sitting in the brain's visual centre, stimulating nerve cells in precise patterns.

Restoring a damaged sense to full working order seems attractive. But today's cochlear implants do not restore normal hearing, and the language comprehension of people implanted with them is variable. Of 85 adults who became deaf after learning to speak, and had cochlear implants fitted at Los Angeles' House Ear Clinic, only 35 percent can understand enough speech from sound alone to hold a simple telephone conversation, and only one or two can chat on the phone for long periods. The statistics are worse for children born deaf. And the surgery is both invasive and expensive (between $14,000 and $29,000 per implant). The implant debate is charged with emotion among the deaf community, many of whom consider poor hearing a poor option, one that would cut off deaf children from the rich, visual world of sign language. Meanwhile, implants for blind people are still in the early stages of development.

Good vibrations

So the time is ripe for people who have alternative ideas. Paul Bach-y-Rita, a neurophysiologist at the University of Wisconsin in Madison, has developed another means of improving existing senses. While Meijer's machine converts images into sounds, Bach-y-Rita's device converts images into a pattern of vibrations on the skin. The work dates back to 1969, when a paper in Nature described his first prototype. In that setup, blind volunteers wore a camera on the head. The camera was attached to a computer that encoded video images into a 20-pixel by 20-pixel grid. This information was then fed to a 400-point grid of plastic spikes (like teeth on a comb) placed in contact with the volunteer's back. If a pixel was bright, the corresponding spike would vibrate; if it was dark, it would not. Thus, the device would "vibrate" the shape of the image onto the skin, and with practice some volunteers could distinguish facial images like those of Twiggy the model and Khrushchev the Soviet statesman. "They could recognise faces and say, for instance: 'Oh, that's Mary and she's wearing her hair down today,'" says Bach-y-Rita.
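The prototype's mapping from image to tactile grid can be sketched in a few lines of code. This is an illustrative reconstruction from the article's description (a 20-by-20 grid; bright pixels vibrate, dark ones do not); the function name, the brightness threshold and the block-averaging step are assumptions for the sketch, not details of Bach-y-Rita's hardware.

```python
def image_to_tactile_grid(image, grid_size=20, threshold=128):
    """Downsample a grayscale image (a list of rows of 0-255 values) to a
    grid_size x grid_size grid of booleans: True means the corresponding
    spike vibrates (bright region), False means it stays still (dark).
    Each tactile element covers one rectangular block of pixels, which is
    averaged before thresholding."""
    h, w = len(image), len(image[0])
    grid = []
    for gy in range(grid_size):
        row = []
        for gx in range(grid_size):
            # Pixel block covered by this tactile element
            y0, y1 = gy * h // grid_size, (gy + 1) * h // grid_size
            x0, x1 = gx * w // grid_size, (gx + 1) * w // grid_size
            block = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(block) / len(block) >= threshold)
        grid.append(row)
    return grid
```

Fed one frame per moment, a grid like this is all a 400-point array of spikes needs: each True drives one vibrating element, tracing the image's shape onto the skin.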

Since then, Bach-y-Rita and his colleagues have fine-tuned their device. Now they can break the video image into more than a thousand pixels, and they have switched from vibrations to painless jolts of electricity, with different amounts of stimulation corresponding to different intensities of light. They have also used their device with young schoolchildren. "This has been real fun to do," he says. Suppose a child asks to see a candle flame. "These kids have never 'seen' a lighted candle before because you can't touch it," Bach-y-Rita says. "All of them are surprised by how small the flame is, because they feel the heat well above the candle. And they're surprised that there's a space between the candle and the flame itself." There is a wealth of detail about the world we live in that blind people never experience, he says, but with pictures on the skin, they might.

With this in mind, Bach-y-Rita's colleague Kamal Sesalem is busy scaling down the bulky hardware (a video camera which connects to a computer that runs a pixel conversion program) to something that teachers could carry from school to school. It might be particularly useful for teaching blind children about science, which is very visual, says Sesalem. For instance, it might enable kids to see samples down microscopes. "We would like to see blind children deal with scientific information as well as anyone else in school," he says.

But Bach-y-Rita admits there are limitations. Blind volunteers did learn to recognise faces. "But it was not an immediate, snap recognition like with the visual system," he says. "Interpreting a face took a minute or two-and this was in an environment where we cut out all the clutter. It wasn't one face standing out in a crowd of faces. It was one face on a white background."

Now Bach-y-Rita is planning to fit babies with the device, in the hope that younger, nimbler brains might do much more with tactile images than his blind, adult college students. He and psychologist Eliana Sampaio at the University of Paris have funding from the French government to strap cameras to babies' heads and test just that. "If you're ever going to have people develop useful artificial vision or tactile substitution vision I think it's most likely it will work if you start with very, very young blind children," he says. The babies could gain a lot as well, since their lack of sight can cause developmental delays that last well into childhood.

The sound of light

Meijer's system, while similar to Bach-y-Rita's, is harder to visualise. It consists of a video camera that takes a picture which is converted into a digitised image made up of 64 by 64 pixels. The image is then converted into sounds by a computer, following two simple rules. First, pixels of light situated "high" in the picture are converted into high tones; those that are low are converted into low tones. Secondly, the brighter the pixel, the louder the sound. So a bright dot near the top of the pixel grid would be high-pitched and loud.

If you were to "hear" a picture with Meijer's device, you wouldn't hear the whole image instantly: rather, you would hear a column at a time, from left to right. A bright, diagonal line stretching upward to the right produces a loud "ooiieep" sound and another stretching downward to the right makes the opposite sound - "eeiioop". After one entire scan, which takes about a second, the scan begins again. If the image changes, so will the next pattern of sound.
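The two rules and the left-to-right column scan can be sketched as a short program. This is a hedged illustration of the mapping the article describes, not the vOICe's actual implementation: the frequency range, sample rate and exponential pitch spacing are all values assumed for the example.

```python
import math

def image_to_soundscape(image, duration=1.0, rate=8000,
                        f_low=200.0, f_high=2000.0):
    """vOICe-style mapping sketch: scan a grayscale image (rows of 0-255
    values, row 0 = top) column by column, left to right, over `duration`
    seconds. Row height sets pitch (top = high), brightness sets loudness.
    Returns a list of float samples in [-1, 1]."""
    n_rows, n_cols = len(image), len(image[0])
    # One exponentially spaced frequency per pixel row (top row = highest)
    freqs = [f_low * (f_high / f_low) ** ((n_rows - 1 - r) / (n_rows - 1))
             for r in range(n_rows)]
    samples_per_col = int(duration * rate / n_cols)
    samples = []
    for col in range(n_cols):
        for i in range(samples_per_col):
            t = (col * samples_per_col + i) / rate
            # Mix one sinusoid per row, weighted by that pixel's brightness
            s = sum((image[r][col] / 255.0) * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(n_rows))
            samples.append(s / n_rows)  # normalise the mix into [-1, 1]
    return samples
```

Run on a bright diagonal line rising to the right, the dominant frequency of successive columns climbs from f_low towards f_high, which is exactly the sliding "ooiieep" the article describes.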

Meijer's machine is simply a prototype for now. But if one day he can persuade a company to develop his idea and make it portable, Meijer imagines a blind person with a portable camera, scanning things as she or he walks down the street. The images would be converted into repeated one-second blasts of noise that would change as objects grew nearer or receded from sight. "Blind people have their cane, which is a very useful thing," says Meijer. "But they don't have the ability to detect buildings from a distance, or to recognise buildings they have encountered before. I hope that a system like this would help with orientation in particular."

It is easy enough, with Meijer's machine, to "hear" a straight line. But as the images become more complex, so too do the signals. Nobody trying out the machine for the first time could immediately "hear" a face or a tree and know it was a face or tree - especially if the images were cluttered with other faces, trees, buildings and more. But how good might someone become, given time and training? Could any of us ever learn to see via blasts of sound, or weird jiggles of the skin?

Nobody knows the answer yet. But we do know that our brains are fabulously plastic, especially early on: each one is moulded by the events of our lives. Blind people, for instance, use areas of their brain normally reserved for vision when touching or hearing. Mike Merzenich, a researcher in brain plasticity at the University of California at San Francisco, found that monkeys trained to do manual tasks in return for food quickly harness more of their brains for analysing touch sensations from their fingers. In one mind-boggling experiment at the Massachusetts Institute of Technology, a ferret's optic nerve was surgically rerouted to the auditory portion of its brain with the result that the animal could still see.

Merzenich believes that people could gain valuable information from Meijer and Bach-y-Rita's systems, but he doubts if it would ever be much like vision. Nor does he think that people will ever come to grips with sounds by sensing them through touch, especially when it comes to learning a language. "It's not clear that the machinery in the touch system in the higher reaches of the nervous system is up to the job [of language]," he says.

Others are more hopeful. Geoffrey Plant, who teaches deaf people and works at MIT, points to the incredible feats of people who are both blind and deaf, some of whom can understand speech simply by feeling the mouth and throat of the speaker. To develop this ability, scientists are building devices that turn sounds into vibrations.

Today, about 400 deaf people worldwide are using Tactaid 7, a device that sorts sound into seven frequency channels which are linked to seven vibrators along the wearer's arm. With it, people who could once hear can learn to understand much of speech with lip-reading. Also, children who went deaf before they could speak learn to enunciate better and can more easily distinguish between words like "cat" and "bat". Tactaid 7 was designed by the Audiological Engineering Corporation in Massachusetts, where Plant also works.

Tactile hearing

How does the device compare with a cochlear implant? One study at the University of Miami's Mailman Center for Child Development found that children who went deaf before learning language do as well with Tactaid 7 combined with a hearing aid as they do with implants. But a group led by Richard Miyamoto at Indiana University found that while Tactaid 7 was clearly helpful, the performance of children using it reached a plateau. Meanwhile, the speech skills of children with cochlear implants continue to improve.

The Miami group, consisting of Rebecca Eilers, Kim Oller and Özcan Özdamar, thinks that higher precision tactile devices are the answer. They have built a 16-channel tactile aid which emphasises more detailed sound signals. The aid digitises sounds which are then manipulated by computer to produce more subtle vibrations. These emphasise the cues people use to recognise speech. Özdamar, who is a biomedical engineer, is trying to make it portable so it can be used all the time.

Meanwhile, Plant and David Franklin, president of Audiological Engineering, are moving towards simpler aids with the help of Gustaf Soderlund, a 53-year-old Swedish man deaf since the age of eight. Soderlund's father was very attentive to his son and would let him climb on his lap and feel the gentle vibrations of his body while he spoke. "To meet him is a rather startling experience," says Franklin. "What he does is he loosely throws his hand on your shoulder and he feels vibrations and lip-reads. Yet when I try it I can't feel a thing."

Many deaf people could benefit from Soderlund's method, but draping one's arms around strangers is not always socially acceptable. Plant and Franklin's solution is to build a hand-held device - a box small enough to strap to the wrist which incorporates a microphone or a radio transceiver. This picks up sounds from transmitters worn by people speaking and converts them into the frequencies that Soderlund uses to understand speech. At the moment, the researchers are running tests on Soderlund to find out which frequencies these are.

Visual cues might also help the deaf improve their speaking skills. Lionel Tarassenko and Jake Reynolds at the University of Oxford have developed a device that extracts certain information from speech - the changes in resonances of the vocal tract as a sound is made - and then displays it graphically on a computer screen. In future, deaf students could study the patterns created when they speak. "They would try to adjust the way they pronounce a word or subword to make it more like the pattern created by their teachers," says Tarassenko.

Best of both worlds?

None of these researchers are suggesting that deaf children use their aids instead of learning sign language. Instead, they want the best of both worlds for deaf children: sign and speech. But Moise Goldstein, professor of biomedical engineering at Johns Hopkins University, worries that parents who are desperate for their children to speak and lip-read will latch onto tactile aids. As a result, they may neglect the child's essential first language - signing. "I started out in this field with the hope of making the skin into an ear," says Goldstein. "But right now what I'm trying to do is work with young engineers to see if we can make it easier for the parents to learn sign language."

The issue, then, is more than discovering what is feasible: it is deciding what is helpful. Here, not surprisingly, opinions differ - for the blind as well as the deaf. "You can pull 100 blind people off the street and ask them what ought to be done and you'll get 95 different answers, probably," says James Gashel, director of government affairs with the National Federation of the Blind.

Gashel, for instance, doesn't think he needs devices to help him get around town, especially noisy ones. "Noises are distracting," he says. "I could be listening for a pole or a bush and run into a person walking down the street." Meanwhile, Larry Scadden, director of a National Science Foundation programme promoting education for disabled students (and perhaps the only blind person with a PhD in visual sciences) is more open to such devices - as long as they can be turned off at will.

But more pressing by far, both men stress, is something more workaday: finding a way to give blind people access to the world of computer graphics. Many blind people were counselled into computing careers, and the new trend toward graphics is leaving them high and dry. Meijer and Bach-y-Rita hope their devices will help here too: either by giving sound clues to the computer operator or by providing a grid of tactile information. Other researchers around the world are pursuing similar lines of research.

One day, perhaps, technology will deliver excellent sight to the blind and perfect hearing to the deaf - either through devices such as cochlear implants, or by less orthodox methods that harness other sensations to do the job. But until then, there is a wealth of smaller, more modest ways in which technology can help: teaching a blind child what a candle flame looks like, or how to find the "trash" icon on her computer; showing a deaf child how to say "cat", or coaxing his parents to learn how to sign. "You would not want to sit around just waiting for the day somebody's going to develop a device to 'make you see' - you've got to get on with your life," says Gashel. "Being sighted may be nice but it's not the greatest thing in the world."