Communications in the 21st Century – Part 3: Helping the Blind See Through Sound

Using sound to see is not new to humans. After all, we have been using sonar for a century to locate objects in water, and we use the same principle in ultrasound to visualize internal organs and fetal development. But using sound to see the world around us seems to be something only other animal species, such as dolphins and toothed whales, bats, some species of shrews and a number of cave birds, can do.

In fact, however, people born without sight, or who have developed blindness, can see using sound, allowing them to interact with our sighted world in ways never thought possible. In this blog we will explore how technology contributes to giving the blind sight through sound, and how some of us can actually switch on our brains to become echolocators.

Generating Sound Through a Device to Visualize the World

I started thinking about writing this blog after reading an article on EyeMusic, a device that converts images into music, recently reported in the journal Restorative Neurology and Neuroscience.

EyeMusic doesn’t use echolocation, but it does use sound generated by a sensory device that lets the blind see using music. Images are converted into musical phrases called soundscapes, in which different musical instruments represent different colours.

  • Hear vocal sounds and see white.
  • Hear a trumpet and see blue.
  • Hear a reggae organ and see red.
  • Hear a synthesizer reed and see green.
  • Hear a violin and see yellow.
  • Hear nothing and see black.

EyeMusic soundscapes span five octaves. Musical phrases that ascend or descend provide horizontal cues, while high- and low-pitched notes mark vertical position. Listen to some of the sample sounds produced by the device to get a sense of how it works.
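To make the idea concrete, here is a toy sketch of a sensory-substitution mapping in the spirit of EyeMusic. Only the instrument-to-colour pairs come from the list above; the left-to-right scan order, the pitch scale and the event format are illustrative assumptions, not the device's actual algorithm.

```python
# Toy sensory-substitution sketch: colours -> instruments (from the list
# above); scan order, pitch scale and event format are assumptions.
INSTRUMENT_FOR_COLOUR = {
    "white": "voice",
    "blue": "trumpet",
    "red": "reggae organ",
    "green": "synthesizer reed",
    "yellow": "violin",
    "black": None,  # silence
}

# Five octaves of pitches (MIDI note numbers), lowest first.
PITCHES = [36 + 12 * octave for octave in range(5)]

def image_to_soundscape(grid):
    """Scan a grid of colour names column by column (left to right = time);
    the row index selects a pitch, with the top row mapped to the highest
    note. Returns a list of (time_step, pitch, instrument) events."""
    events = []
    n_rows = len(grid)
    for t, column in enumerate(zip(*grid)):       # columns, left to right
        for row, colour in enumerate(column):
            instrument = INSTRUMENT_FOR_COLOUR[colour]
            if instrument is None:                # black = no sound
                continue
            pitch = PITCHES[n_rows - 1 - row]     # higher row -> higher pitch
            events.append((t, pitch, instrument))
    return events

# A 3x3 "image": a blue dot at middle-left, a red dot at bottom-right.
image = [
    ["black", "black", "black"],
    ["blue",  "black", "black"],
    ["black", "black", "red"],
]
print(image_to_soundscape(image))
# → [(0, 48, 'trumpet'), (2, 36, 'reggae organ')]
```

The blue dot sounds early (left of the image) on a trumpet at a middle pitch; the red dot sounds later and lower, on the reggae organ.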

EyeMusic is a sensory substitution device that uses different musical sounds to help the blind locate objects. This allows the user to visualize mentally what he or she cannot see. Source: Maxim Dupliy, Amir Amedi and Shelly Levy-Tzedek

EyeMusic, although not echolocation, does mimic its results, which brings me to the question: can humans learn to echolocate?

Humans Can Echolocate

In an article that appeared in Science Daily in 2009, researchers reported that humans can develop echolocation to identify objects around them without sight. The study, done at the Superior Polytechnic School of the University of Alcala de Henares, near Madrid, Spain, examined seeing by means of palate clicks, sounds made by placing the tip of the tongue behind the front teeth and moving it backward. Where dolphins produce up to 200 clicks per second, the human subjects in this study could manage only 3 or 4. But even with that limited number of clicks, the researchers found that humans trying echolocation could visualize objects in space.

In learning to echolocate, the subjects had several skills to master, including:

  1. Distinguishing their own unique sounds from others doing echolocation.
  2. Learning to listen for their own echo responses.
  3. Interpreting and visualizing the objects and spatial dimensions from those echo responses.

The study showed that humans could train themselves to begin echolocating within a couple of weeks, sensing large objects such as another person or a tree in front of them.

Other studies have shown that blind echolocators use the visual parts of their brains to process the clicks and echoes they create. In the May 2011 edition of the journal PLoS ONE, researchers at the University of Western Ontario’s Centre for Brain and Mind, in London, Ontario, published results showing blind people using echolocation to do activities otherwise thought impossible. Using MRI scans of the brain, the researchers played echolocation recordings of different objects back to blind echolocators, who, listening to the clicks and echoes, could identify the objects by sound. As a control, sighted people took part in the study as well and demonstrated no echolocation capability whatsoever.

Can Different Echolocating Species Communicate With Each Other?

The answer is no. A good example of the complexity of echolocation can be found in dolphins. Every dolphin pod has its own unique set of clicks and whistles. This allows dolphins within the pod to share a rich vocabulary of sounds. But individual dolphin pods share only a small number of clicks and whistles with other pods. These sounds are limited to indicating distress or danger.

Dolphins use sound for echolocation as well as for communication within their social groups and with other dolphins. Humans can learn to do the same things as dolphins, according to recent studies. Source: Wired Science

We humans do something very similar when we try to communicate with other human groups that speak a different language. We use the full range of body motion, and some of our words have common derivative origins, so the sounds are similar enough to be recognized. Dolphins do much the same, but their body language is expressed with tails, fins and swimming motions, and clicks and whistles replace words. Their bodies lack arms to point and heads that can nod.

Dolphins and bats cannot talk to each other even though they both use high-frequency sound to visualize their worlds. Interestingly, their physiology and chemistry indicate convergence between the species, but they are captives of very different environments. As a result, bat echolocation through air images objects only up to 3 meters (10 feet) away, while dolphins and whales echolocate objects through water as far away as 100 meters (330 feet) or more. That’s because sound travels much farther, and up to four times faster, through water than through air.
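A quick back-of-the-envelope sketch shows what that speed difference means for echo timing. The speeds of sound used here are approximate round figures (about 343 m/s in air at room temperature, about 1,480 m/s in seawater); the distances are the ranges mentioned above.

```python
# Rough round-trip echo times using approximate speeds of sound.
SPEED_AIR = 343.0     # m/s, air at roughly room temperature
SPEED_WATER = 1480.0  # m/s, seawater (about 4.3x faster than air)

def echo_delay(distance_m, speed_m_s):
    """Time for a click to reach an object and for its echo to return."""
    return 2 * distance_m / speed_m_s

# A bat imaging an object 3 m away in air:
print(f"bat, 3 m in air: {echo_delay(3, SPEED_AIR) * 1000:.1f} ms")       # ~17.5 ms
# A dolphin imaging an object 100 m away in water:
print(f"dolphin, 100 m in water: {echo_delay(100, SPEED_WATER) * 1000:.1f} ms")  # ~135.1 ms
```

Even at 100 meters, the dolphin's echo returns in well under a second, which is why the greater speed and lower attenuation of sound in water make such long-range echolocation practical.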


Len Rosen lives in Toronto, Ontario, Canada. He is a researcher and writer who has a fascination with science and technology. He is married with a daughter who works in radio, and a miniature red poodle who is his daily companion on walks of discovery. More...