How Blind People Use Bat-Like Echolocation to Orient Themselves

Image credit: Illustration of the virtual corridor. Echo-acoustic orientation performance was tested at two positions on the midline of the corridor (M1, M2) and two positions along the left lateral wall (L1, L2) / L. Wallmeier & L. Wiegrebe, Royal Society 2014

To capture insects in total darkness, bats emit precisely timed acoustic signals and rapidly analyze the returning echoes. Whales likewise use biological sonar to navigate through murky waters.

Using a virtual echo-acoustic space technique, scientists have now demonstrated that people can also learn to orient themselves successfully with sonar, and that this skill does not depend on supersensitive hearing. The findings were published in the latest edition of Royal Society Open Science.

Scientists have long been interested in the ability of blind humans to use echo-acoustics to understand the space around them. Some blind people can listen to the echoes of self-produced sounds with remarkable precision in daily life, assessing distances, detecting obstacles, and discriminating between objects of different textures and sizes. Until now, however, scientists have known little about the sensory-motor interactions behind efficient echo-acoustic orientation. Unlike bats, humans lack large, mobile ears that can swivel like radar dishes to pinpoint echoes from tiny flying insects.

To examine whether people could learn to do the same despite these limitations, Ludwig Wallmeier and Lutz Wiegrebe of Ludwig Maximilian University in Munich enrolled eight sighted volunteers to explore a long corridor with concrete walls and PVC flooring using only tongue clicks. After several weeks of training, most of them performed quite well: they could reliably orient themselves and pass through the corridor without colliding with the walls, guided by nothing but their clicks and the returning echoes.

To determine how important head and body movements were to this accuracy, as opposed to hearing alone, Wallmeier and Wiegrebe designed a virtual version of the corridor, called the virtual echo-acoustic space (VEAS). To ensure that the VEAS was realistic, the research team recruited two blind professional echolocation teachers who had taught themselves to echolocate as young children.

Seated in a chair, the blind teachers produced tongue clicks that were picked up by a headset microphone and played back through headphones as if they were standing in the real corridor. The goal, as before, was to align their bodies with the center of the corridor, and they were tested in two conditions. In the first, they rotated the virtual corridor with a joystick without moving their bodies. In the second, the corridor was fixed and they were free to swivel their chairs and move their heads.

When they were not allowed to move their bodies or heads, the participants teetered down the virtual corridor and failed to correct course before running into a virtual wall. When the corridor was fixed and they were free to move their bodies and heads, however, the blindfolded recruits were able to correct themselves quickly.

It is hoped that a virtual reality program like this one could prove valuable in teaching blind people to use echolocation.

Journal reference: Wallmeier, Ludwig, and Lutz Wiegrebe. “Self-motion facilitates echo-acoustic orientation in humans.” Royal Society Open Science 1.3 (2014): 140185.