
Look at What I’m Saying: Brain Depends on Vision to Hear, Study Says

Recently, bioengineers from the University of Utah discovered that our understanding of language is more dependent on vision than we previously thought. Under the right conditions, what you see will override what you hear. The new findings suggest that speech-recognition software and artificial hearing devices could benefit not only from a microphone, but also from a camera.

“For the first time in history, we were able to connect the auditory signal in the brain to what a person said they heard when they actually heard something different. We discovered that vision is affecting the hearing part of our brain to change our perception of reality, and you cannot turn this illusion off,” says Elliot Smith, the first author of the new study. “People think there is a tight coupling between the physical phenomena in the world and our subjective experience, but this is not the case.”

Our brain considers both sound and sight when it processes speech. However, if the two are slightly different, visual cues dominate over sound. This phenomenon, called the McGurk effect, has been known for several decades, but its origin remains elusive.

The researchers recorded electrical signals from the brain surfaces of four severely epileptic patients who volunteered to serve as study subjects during their surgical treatment. Several button-sized electrodes were placed on the surface of each subject’s brain, and the subjects were then asked to watch a video, focusing on a person’s mouth as they said the syllables “ba”, “va”, “ga”, and “tha”; the sound either matched or did not match the motion of the mouth.

There were three possible combinations of the sound and the motion of the mouth:

  1. The motion of the mouth matched the sound. For instance, the video showed “ba” and the audio played “ba”, and in this case the patients saw and heard “ba” easily.
  2. The motion of the mouth obviously didn’t match the corresponding sound. For example, the video showed “ga” while the audio was “tha”. Under this condition, the test subjects could perceive the disconnect and correctly heard “tha”.
  3. The third situation is the interesting one. There was only a slight mismatch between the motion of the mouth and the corresponding sound. For example, the mouth in the video said “ba” but the audio was “va”, and the patients heard “ba” even though the sound was actually “va”. This demonstrates the McGurk effect: vision overrides hearing.
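The three conditions above can be summarized as a simple decision rule. The sketch below is purely illustrative (it is not the study's analysis code), and the set of "slightly mismatched" syllable pairs is an assumption based only on the article's single example:

```python
# Illustrative sketch, NOT the study's code: predicting the perceived
# syllable under the three audio-visual conditions described above.

# Visually similar mouth movements where vision tends to win.
# Assumed from the article's single example ("ba" vs. "va").
SLIGHT_MISMATCHES = {frozenset({"ba", "va"})}

def predicted_percept(visual: str, audio: str) -> str:
    """Return the syllable a listener is predicted to perceive."""
    if visual == audio:
        # Condition 1: match -> hear the sound as played
        return audio
    if frozenset({visual, audio}) in SLIGHT_MISMATCHES:
        # Condition 3: slight mismatch -> McGurk effect, vision overrides
        return visual
    # Condition 2: obvious mismatch -> the disconnect is noticed, sound wins
    return audio

print(predicted_percept("ba", "ba"))   # -> ba
print(predicted_percept("ga", "tha"))  # -> tha (obvious mismatch)
print(predicted_percept("ba", "va"))   # -> ba (vision overrides)
```

The rule captures the paper's key asymmetry: only when the lip movement is a plausible visual match for the audio does vision win.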

Smith and co-workers measured the electrical signals in the brain while each video was being watched, and from those signals they could pinpoint whether auditory or visual brain activity was used to identify the syllable in each video. When the mouthed syllable perfectly matched the sound, or did not match it at all, the increased brain activity correlated with the sound being heard. However, when the McGurk effect was at play, the activity pattern changed to resemble what the person saw rather than what they heard. Statistical analyses confirmed this effect in all test subjects.

“Our results show that neural signals in the brain that should be driven by sound are being overridden by visual cues,” says co-author Bradley Greger. “The brain essentially ignores the physics of sound in the ear and follows what is happening through vision.”

Greger said that the new findings could help researchers understand the driving forces behind human language processing, especially in a developing infant brain trying to link sounds with lip movements to learn language. The findings could also help researchers sort out how language processing goes wrong when auditory and visual inputs are not correctly integrated, such as in dyslexia.


Title image: Shutterstock