Jul 1, 2021

Hear ye, hear ye? How cognition impacts hearing and its loss

The band wasn’t playing songs, exactly. Instead, they were layering ever louder guitar riffs over high-pitched beeps so that even I, a flouter of common sense when it came to my ears, knew I should be nervous. Still, instead of springing for the $1 ear plugs that the venue was selling, I just stuffed toilet paper into my ears. I started getting bouts of tinnitus shortly thereafter, and it was hard not to wonder whether that night was the tipping point: had I finally done perceptible damage after years of disregarding my hearing?

Maybe, maybe not. Research suggests that hearing loss isn’t confined to the ears, where we typically think of hearing as taking place. While congenital deafness is usually caused by anomalies in the inner ear, there is a cognitive component to hearing that not only describes how sounds get processed by the brain but may also account for some hearing problems.

It’s difficult to provide a succinct description of hearing without misrepresenting how it works. Indeed, as the National Science Foundation acknowledges, “human hearing is a process full of unanswered questions,” especially from a cognitive perspective.

But we do know that, when a noise is made, sound waves travel down the ear canal and set the eardrum vibrating, and those vibrations pass through the tiny bones of the ear (the malleus, incus, and stapes). From there they cross the oval window, a membrane that separates the middle ear from the inner ear, which houses the fluid-filled, snail-shaped cochlea.

Next, the cochlea’s hair-like receptor cells convert ripples in its fluid into electrical signals bound for the brain. During this process, different frequencies (which we perceive as differences in pitch) register at different places along the basilar membrane, which spirals around the coil of the cochlea.
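
(If you like to see the numbers: one widely used approximation for this frequency-to-place mapping is the Greenwood function. The short Python sketch below uses the commonly cited human parameter values, with position expressed as a fraction of the basilar membrane’s length measured from the apex; it is only a rough illustration of the idea, not something taken from the sources listed below.)

    import math

    def greenwood_frequency(x, A=165.4, a=2.1, k=0.88):
        """Approximate characteristic frequency (Hz) at position x, where x is
        the fractional distance along the basilar membrane from apex (0.0) to
        base (1.0). Parameter values are the commonly cited human fits."""
        return A * (10 ** (a * x) - k)

    def greenwood_place(frequency_hz, A=165.4, a=2.1, k=0.88):
        """Invert the Greenwood function: the fractional position (apex = 0.0,
        base = 1.0) whose characteristic frequency matches frequency_hz."""
        return math.log10(frequency_hz / A + k) / a

    # Low-pitched sounds peak near the apex, high-pitched sounds near the base.
    for f in (100, 1000, 10000):
        print(f"{f:>6} Hz -> about {greenwood_place(f):.2f} of the way to the base")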

The frequency map picked up by nerve fibers along the basilar membrane is relayed to the brain’s thalamus, which initiates the cognitive aspect of hearing: it is the connection point between the ear and the auditory cortex, where sound is processed in the brain.

Pure tones are processed in the core of the auditory cortex, with the regions surrounding the core handling more complex sound qualities (like duration, direction, and loudness), plus the memory, decision-making, and auditory associations that help us learn and recognize language.
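
(To make “pure tone” concrete: it is simply a single sine wave at one frequency. As a generic illustration, not tied to any of the studies referenced here, the Python sketch below uses only the standard library to write a one-second 440 Hz tone to a WAV file; the frequency, amplitude, and filename are arbitrary choices.)

    import math
    import struct
    import wave

    SAMPLE_RATE = 44100   # samples per second
    FREQUENCY = 440.0     # Hz; the orchestral tuning "A"
    DURATION = 1.0        # seconds
    AMPLITUDE = 0.3       # fraction of full scale, kept low to protect your ears

    with wave.open("pure_tone.wav", "wb") as wav_file:
        wav_file.setnchannels(1)           # mono
        wav_file.setsampwidth(2)           # 16-bit samples
        wav_file.setframerate(SAMPLE_RATE)
        frames = bytearray()
        for n in range(int(SAMPLE_RATE * DURATION)):
            sample = AMPLITUDE * math.sin(2 * math.pi * FREQUENCY * n / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767))
        wav_file.writeframes(bytes(frames))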

Obviously, hearing is complex, and most hearing problems in young people arise when one part of the physical structure fails. (1) But that’s not necessarily the case for adults and older people, in whom the connection between the brain and the ears is often responsible for hearing loss.

Tinnitus, for instance, affects 15 to 20 percent of adults and is most often experienced as a high-pitched tone, buzzing, or ringing in the ears. Tinnitus may be brought on by damage to the ear caused by exposure to loud noise over time, but the noises heard during a tinnitus episode probably originate in the brain. Here’s why:

The brain will normally filter out a lot of noise during the day. We might live on a busy road or have a clanking furnace, but it quickly becomes background noise that we’re not conscious of. This filtering is done by the thalamic reticular nucleus (TRN), which acts as a gatekeeper for sensory information and prevents certain sounds from reaching the auditory cortex.

In tinnitus, hearing loss may prevent the TRN from doing its job, because sounds that we expect to hear are no longer present. At the same time, a loss of volume in the medial prefrontal cortex (mPFC) seems to accompany tinnitus. Both findings suggest that the mPFC and the TRN are part of a noise-suppression system in the brain. When that system fails, the auditory system becomes hyperactive at the frequencies it can no longer detect, as if compensating for the missing input, and that hyperactivity creates the ringing that’s characteristic of tinnitus.

Another example showcasing the cognitive component of hearing is that of older adults whose hearing is unimpaired yet who are unable to efficiently process language. As Samira Anderson of the University of Maryland noted, "Often we will hear an older person say, 'I can hear you, I just can't understand you.'" This is partly because the ability to distinguish sounds (not just hear them) requires more brain power in older people.

Various areas of the brain involved in hearing and language processing begin to erode with age. For example, the “cocktail party effect” (the ability to pick out and listen to a particular voice in a noisy room) is the result of the midbrain coordinating with the auditory cortex. As we get older, the inhibitory and excitatory processes that sync the two (filtering out background noise versus attending to the “important” sounds of human speech, respectively) begin to degrade, and it takes longer to process language. That’s why, if someone is speaking quickly, an older listener’s processing may not be able to keep up. In this scenario, raising your voice won’t help, because the problem is not an inability to detect sound. Instead, speaking slowly in a quiet environment can help older adults with language processing problems isolate your voice and keep pace with the conversation.

Recent research into hearing problems like these is helping to untangle the complex relationship between hearing and cognition and to make clear that hearing isn’t all in the ears, but more research is still needed to understand how hearing loss affects different areas of cognition. In the meantime, protect your ears from unnecessary damage; that will eliminate at least one factor in the puzzle.

(1) Interestingly, some animals whose ears are much simpler than the human ear have much better hearing abilities. The wax moth, for example, far outstrips all other creatures in its hearing frequency range. Whereas people can hear in the range of 20 to 20,000 hertz—with the numbers roughly corresponding to how low or high-pitched a sound is—wax moths hear frequencies up to 300,000 hertz in “ears” that take the form of eardrums on the sides of their bodies. There’s an interesting chart of frequency hearing ranges for various animals here: https://www.lsu.edu/deafness/HearingRange.html
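
(For a rough sense of scale, based on my own arithmetic with the figures above rather than the linked chart: each octave doubles the frequency, so the human range of 20 to 20,000 hertz spans about ten octaves, while the wax moth’s reported 300,000 hertz ceiling sits roughly four octaves above our upper limit. In Python:)

    import math

    HUMAN_LOW_HZ = 20
    HUMAN_HIGH_HZ = 20_000
    WAX_MOTH_HIGH_HZ = 300_000

    # Each octave doubles the frequency, so the span in octaves is a log base 2.
    human_octaves = math.log2(HUMAN_HIGH_HZ / HUMAN_LOW_HZ)
    moth_headroom = math.log2(WAX_MOTH_HIGH_HZ / HUMAN_HIGH_HZ)

    print(f"Human hearing spans about {human_octaves:.1f} octaves")
    print(f"The wax moth's ceiling is about {moth_headroom:.1f} octaves above ours")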

References:

https://www.nidcd.nih.gov/health/how-do-we-hear

https://www.nsf.gov/discoveries/disc_summ.jsp?cntn_id=297993&org=NSF&from=news

https://www.sciencedirect.com/topics/medicine-and-dentistry/auditory-cortex

https://www.ncbi.nlm.nih.gov/books/NBK10900/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4691684/

http://www.ssc.education.ed.ac.uk/courses/deaf/dnov10i.html

https://www.nature.com/news/moth-smashes-ultrasound-hearing-records-1.12941#:~:text=Even%20though%20its%20ears%20are,than%20any%20bat%20can%20squeak.

https://www.lsu.edu/deafness/HearingRange.html

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5675031/

https://www.sciencedaily.com/releases/2016/10/161018141152.htm

https://www.mayoclinic.org/diseases-conditions/tinnitus/symptoms-causes/syc-20350156#:~:text=Tinnitus%20is%20when%20you%20experience,especially%20common%20in%20older%20adults.

https://fortune.com/2021/03/22/texas-roadhouse-ceo-kent-taylor-suicide-tinnitus-covid/

https://www.nidcd.nih.gov/health/auditory-neuropathy#:~:text=Auditory%20neuropathy%20is%20a%20hearing,ages%2C%20from%20infancy%20through%20adulthood.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4906307/
