How Your Eyes Communicate With Your Ears

Research reveals the sounds produced in your ears when you move your eyes.

Key points

  • The brain integrates visual and auditory information to better understand its environment.
  • New research shows that this sensory integration can happen at the earliest stages of processing.
  • Each time we move our eyes, a sound is produced in our ear canals that can signal which way our eyes moved.
  • The brain may use these auditory signals to quickly align visual and auditory reference frames.

I recently attended the International Multisensory Research Forum (IMRF) in Reno, Nevada, where scientists from around the world gathered to share their research on how our multiple senses (vision, hearing, touch, etc.) interact with one another. There were fascinating talks about all sorts of sensory interactions, with a focus on visual-auditory interactions—how our visual and auditory systems share information to process events more efficiently.

As it turns out, there are many brain regions, and many stages of perceptual processing, where information is combined, or integrated, between senses. Some studies show evidence of late integration—integration that happens after the visual cortex has processed the visual information in a scene and the auditory cortex has processed the auditory information in a scene.

But there is also plenty of evidence that sensory information can combine between senses much earlier than that. In a fascinating keynote presentation by Duke University Professor Jennifer Groh, I learned that our eyes may actually be communicating directly with our ears every time we make an eye movement.

To understand why such eye-to-ear communication would even be useful, it is important to understand how the brain combines visual and auditory information to estimate where things are. Visual information that enters our eyes produces a retinotopic map of the world based on how light hits our retinas. Depending on where our eyes are pointed at any given moment, our retinas receive a particular image of the world that is specific to that direction of gaze.

As soon as we make a new eye movement, all the visual information gets remapped to different areas of our retinas. Yet we are able to maintain a stable perception of the world, in part because our brain knows how much our eyes move each time, and can counteract that motion to keep a stable representation of our surroundings.

Our ears work differently. When we hear a sound, we are able to estimate the direction of the sound source based on how the sound wave hits our two ears. For example, a sound coming from your right side will arrive at your right ear a fraction of a millisecond sooner, and a tiny bit louder, than it arrives at your left ear. Based on these differences in timing and loudness between the two ears, the brain can estimate the general direction from which a sound came, relative to the position of our head.
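
To get a feel for the size of that timing difference, here is a rough back-of-the-envelope sketch (not from the article) using the simple straight-line model ITD = (d / c) × sin(azimuth), with an assumed head width and speed of sound:

```python
# Back-of-the-envelope interaural time difference (ITD), using the simple
# straight-line model ITD = (d / c) * sin(azimuth). The head width and speed
# of sound below are illustrative assumptions, not values from the article.
import math

HEAD_WIDTH_M = 0.18         # assumed straight-line distance between the ears
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 degrees C

def interaural_time_difference_ms(azimuth_deg: float) -> float:
    """ITD in milliseconds for a source at the given azimuth
    (0 = straight ahead, 90 = directly to the right)."""
    itd_s = (HEAD_WIDTH_M / SPEED_OF_SOUND_M_S) * math.sin(math.radians(azimuth_deg))
    return itd_s * 1000.0

# A source directly to the right reaches the right ear about half a
# millisecond before the left ear; smaller angles give smaller differences.
print(f"{interaural_time_difference_ms(90):.2f} ms")  # ~0.52 ms
print(f"{interaural_time_difference_ms(30):.2f} ms")  # ~0.26 ms
```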

But in order to integrate information from vision and audition, our brain needs to somehow combine visual information that is encoded in a retinal reference frame with auditory information that is encoded in a head-centered reference frame. How the brain does this so quickly and efficiently is still a mystery, but research by Professor Groh's team offers compelling new clues.
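
As a toy illustration of that reference-frame problem (a one-dimensional simplification of my own, not a model from the research), knowing the current eye position is what lets a retinal, eye-centered direction be converted into the head-centered frame that audition uses:

```python
# Toy one-dimensional illustration of the reference-frame problem: a visual
# target is encoded relative to gaze (retinal coordinates), while a sound is
# encoded relative to the head. Adding the current eye-in-head position puts
# both estimates into the same frame. A simplification for illustration only.

def retinal_to_head_centered(retinal_deg: float, eye_position_deg: float) -> float:
    """Convert an eye-centered (retinal) direction to a head-centered one
    by adding the current horizontal eye-in-head position."""
    return retinal_deg + eye_position_deg

# The eyes are rotated 20 degrees to the right, and an object's image lands
# 5 degrees to the left of the fovea: the object sits 15 degrees right of the
# head's midline, which can be compared directly with an auditory estimate.
visual_head_centered = retinal_to_head_centered(retinal_deg=-5.0, eye_position_deg=20.0)
auditory_head_centered = 15.0  # hypothetical sound-localization estimate

print(visual_head_centered, auditory_head_centered)  # 15.0 15.0 -> same event
```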

Her research team showed that every time our eyes move, a tiny sound is produced in our ear canals, which they call an "eye-movement-related eardrum oscillation" (EMREO). They discovered these sound waves by placing tiny microphones inside people's ear canals, where the sounds could be clearly picked up. This video presents an amplified version of these eye-movement-related sounds:

Intriguingly, these sounds are not only present but also informative about the eye movements that produced them. The research team was able to decode how much the eyes moved, and in what direction, based on the amplitude and frequency of these EMREOs. In other words, they were able to reconstruct the direction and magnitude of an eye movement from just the sounds produced in the ear canal.
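
To make the idea of decoding concrete, here is a minimal sketch using made-up synthetic numbers; it is not the team's actual data or analysis pipeline, just an illustration of how a simple linear fit could recover saccade size and direction from an ear-canal amplitude measure:

```python
# Minimal decoding sketch with synthetic numbers: NOT the researchers'
# analysis, only an illustration of fitting a linear mapping from a
# made-up EMREO amplitude measure back to saccade size and direction.
import numpy as np

rng = np.random.default_rng(0)

# Pretend experiment: 200 horizontal saccades of various sizes (degrees),
# and a synthetic ear-canal amplitude that scales with saccade size plus noise.
saccade_deg = rng.uniform(-20, 20, size=200)
emreo_amplitude = 0.03 * saccade_deg + rng.normal(0.0, 0.05, size=200)

# Least-squares line: predicted saccade size = slope * amplitude + intercept
slope, intercept = np.polyfit(emreo_amplitude, saccade_deg, deg=1)
predicted_deg = slope * emreo_amplitude + intercept

# How often is the direction (left vs. right) recovered from the ear signal?
direction_correct = np.mean(np.sign(predicted_deg) == np.sign(saccade_deg))
print(f"direction decoded correctly on {direction_correct:.0%} of synthetic trials")
```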

Further research is needed to determine whether the brain actually relies on these auditory signals to aid its representation of the world, but Professor Groh's work reveals that multisensory integration can start at the very earliest stages of sensory processing.

References

Gruters, K. G., Murphy, D. L., Jenson, C. D., Smith, D. W., Shera, C. A., & Groh, J. M. (2018). The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing. Proceedings of the National Academy of Sciences, 115(6), E1309-E1318.

Lovich, S. N., King, C. D., Murphy, D. L., Landrum, R. E., Shera, C. A., & Groh, J. M. (2023). Parametric information about eye movements is sent to the ears. Proceedings of the National Academy of Sciences, 120(48), e2303562120.
