
The Brain's Ability to Hear in Different Ways

Did you know that sound, light pulses, and touch can all stimulate the same area of the brain?

It’s almost as if our brains can hear in different ways. Often a deaf individual’s brain will still use the auditory cortex—but in response to touch. This fascinating concept can give some insight into how both hearing and deaf people learn and communicate.


In my American Sign Language (ASL) fingerspelling class, I had an interesting experience with my own “hearing” of the language. About an hour into our session, I began losing focus—I felt myself drifting off while our Deaf instructor stood at my desk fingerspelling something. I completely missed it. She signed WATCH, indicating for me to focus. Once I gave my full attention, she signed THREE. Then she moved her hand ever so slightly higher and to the right, and she signed FOUR. Then, moving her hand slightly lower, she signed TWO.


AH! I saw the number 342! She explained to the class, “Your eyes get tired because you are not used to hearing with your eyes. It will get easier.”





In my interview with audiologist Angie Lederman, MS, CCC-A, of Hear Now Audiology & Tinnitus Center, we talked about the Deaf person's experience of auditory fatigue. Just as a hearing person's eyes tire, as mine did in my fingerspelling class, a Deaf person can be exhausted by listening. In an interview with the Guardian, Sara Novic said, "sometimes I turn off my hearing aids and dip below the surface of the sound." This is because our brains adapt to the senses we have been given.


Evelyn Glennie is a musician who became deaf as a young girl—after her auditory cortex had already learned much about sound, especially music. She’d wanted to be a percussionist. Her story is told in a beautiful picture book, Listen: How Evelyn Glennie, a Deaf Girl, Changed Percussion by Shannon Stocker. Determined to play, Evelyn forged a way. Shannon writes the following about Evelyn’s audition for the Royal Academy of Music in London:

“If she could learn to think about listening in a whole new way, they [the Academy judges who turned her down] could too. Evelyn knew that sound and touch weren’t that different. The air vibrated, traveling through the judges’ ears before turning into sound. For Evelyn, those vibrations traveled through her body. Her brain just listened differently.”


How do our brains enable us to hear and see with different abilities? Research helps explain these accounts of hearing, seeing, and knowing. A news release from the National Institute on Deafness and Other Communication Disorders (NIDCD), part of the National Institutes of Health (NIH) (2015), reports that an "NIH study shows the deaf brain processes touch differently." One researcher, Christina M. Karns, Ph.D., of the Brain Development Lab at the University of Oregon, and her colleagues tested the effects of touch and light stimuli on the auditory cortex. They found that "deaf people use the auditory cortex to process touch and visual stimuli to a much greater degree than occurs in hearing people."


How Was It Tested?


In my blog article "Shannon Stocker Shares a Very Personal Story," Shannon describes how Evelyn Glennie developed her auditory cortex to play percussion. Evelyn had been devastated by her hearing loss as a young girl. Fortunately, a music teacher sent her home with a drum so she could explore all its parts and vibrations. Together, they discovered how Evelyn could use her whole body to hear the vibrations that a hearing person understands as sound.


Dr. Karns and her colleagues’ research showed that the auditory cortex of the deaf subjects’ brains responded to soundless puffs of air that touched the right eyebrow and the cheek below the right eye.

In their study, a magnetic resonance imaging (MRI) scanner recorded brain activity caused by air touching the face. In both hearing and deaf groups, visual stimuli were also tested with brief light pulses delivered through a cable mounted directly below the air-puff nozzle.


The research built on a known perceptual illusion in hearing people: when two sounds accompany a single flash of light, a hearing person experiences multiple flashes. The study sought to discover whether deaf participants would respond to a similar illusion produced with tactile stimulation.


Dr. Karns states, "We designed this study because we thought touch and vision might have stronger interactions in the auditory cortices of deaf people." The experiment confirmed this hypothesis.


“The finding suggests that since the developing auditory cortex of profoundly deaf people is not exposed to sound stimuli, it adapts and takes on additional sensory processing tasks,” as stated in the NIH study.


Why Is This Knowledge So Important?


What are some ways this discovery may help people—both deaf and hearing—to learn?


The discovery that touch and vision interact directly with the auditory cortices of deaf people invites further research and opens the door to new possibilities. American Sign Language is a very tactile language, rich with body language that expresses nuance and tone. Dr. Karns's research team suggested that visual and tactile techniques could be used to help deaf students learn math and reading.


To give you an idea, consider that the numbers 0 through 999 can be produced on one hand in ASL. Just as you can touch your nose with your eyes closed, a fluent signer can run through the numbers 0 to 999 with little thought or effort. The hand movements and contact of the fingers actually engage the signer’s auditory cortex. One feels with their fingers and “hears” each number with their mind!


Imagine you are deaf or hard of hearing and trying to pronounce words you may have never seen or heard before. Short vowels (such as the a in "cap") and long vowels (such as the a in "cape") are examples of how the same letters make different sounds. This is overwhelming for many children, and especially for deaf children. Providing a visual and tactile tool to help sound out words offers multiple ways to learn and, in turn, activates the auditory cortex. In the same way, hearing students have found that learning to fingerspell their spelling words is a successful and unique way to practice.

Visual Phonics helps deaf and hard-of-hearing children learn to see and say the sounds they do not hear clearly. The student can sign these visual sounds to provide the tactile sensation of sound their auditory cortex is waiting to receive. For more information, watch this short video by Dr. Hartwell on Visual Phonics.


The NIH research findings may also help individuals who receive cochlear implants, "especially among congenitally deaf children who are implanted after the ages of 3 or 4," according to NIH (2015). Children who have lacked auditory input since birth often have auditory cortices that have adapted to primarily process other sensations, such as touch and vision. Because of this adaptation, these children's brains may have more difficulty recovering auditory processing function after cochlear implantation. If healthcare providers could measure how much of the auditory cortex has been taken over by other senses, they could tailor intervention programs to help the patient's brain retrain and devote more capacity to auditory processing.


The brain is magnificent. Its capabilities are beyond our comprehension. How much can the auditory cortex process? With science, life experience, collaboration, and exploration, the possibilities are limitless.


In the NIH article, James F. Battey, Jr., M.D., Ph.D., director of the NIDCD, said, “This research shows how the brain is capable of rewiring in dramatic ways.”


In her author note in Listen, Shannon Stocker quotes Evelyn Glennie:


“Hearing is basically a specialized form of touch. Sound is simply vibrating air, which the ear picks up and converts to electrical signals, which are then interpreted by the brain. The sense of touch does this too.”

Whether through research or life experiences, the more you know, the more you know there is to learn. What we learn from each other unites us. As Sophie Grégoire Trudeau once said, “The differences that separate human beings are nothing compared to the similarities that bond us.” We are all humans seeking the same thing in different ways. One meaningful way we can connect is through language, whether it is heard or signed.


References

“Feeling Sound with Evelyn Glennie.” YouTube, July 12, 2019. Accessed May 28, 2022. https://www.youtube.com/watch?v=Gl2a6w6sTAs&ab_channel=EvelynGlennie.


“NIH Study Shows the Deaf Brain Processes Touch Differently.” NIDCD, August 18, 2015. https://www.nidcd.nih.gov/news/2012/nih-study-shows-deaf-brain-processes-touch-differently.


“Visual Phonics Cues.” YouTube, October 20, 2020. Accessed January 17, 2023. https://youtu.be/ItlOW4o1nnA.