Study Finds a Way to Determine Whether People Understand What They Hear


One of the greatest struggles people have in caring for a person who is nonverbal is attempting to communicate with that person. For a parent of a nonverbal autistic child, or for the children of an elderly person with advanced dementia, or for anyone else who has a nonverbal person in their life, communication can be difficult.

In most cases, we are taught (and rightfully so) to presume competence in nonverbal people. We say what we have to say in the most understandable way possible without talking down to the person; we talk to them as if we know for sure that they understand. This way, we avoid the pitfall of potentially hurting a person who is capable of understanding our words.

But what if we could know whether or not a nonverbal person was understanding what they heard? What if we could learn more about a person’s needs and capabilities by testing whether or not they comprehended spoken words? It may, in fact, be possible.

Scientists from Trinity College Dublin and the University of Rochester, New York, have recently discovered a brain wave associated with the conversion of speech into meaning. The signal appears over the mid-back part of the scalp in EEG recordings when a piece of speech is understood, and it is absent when the listener either doesn’t understand or isn’t paying attention to what is said.

The researchers performed EEG tests on participants as they listened to audiobooks. The books were sometimes played normally and sometimes in reverse so that they could not be understood. In examining each participant’s brainwave patterns, the team discovered a specific brain response that reflected how similar or different a word’s meaning was from that of the preceding words. However, when the listener did not comprehend a word or stopped paying attention, the signal disappeared.
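
To give a rough sense of the kind of word-by-word calculation being described, here is a minimal, purely illustrative sketch in Python. It is not the researchers’ actual analysis pipeline; it simply assumes we already have word-meaning vectors (embeddings) and scores each new word by how far its meaning sits from the average of the words that came before it.

import numpy as np

# Illustrative sketch only, NOT the study's actual code or data.
# The tiny hand-made vectors below stand in for real word embeddings
# (which would normally come from a model trained on large text corpora).
embeddings = {
    "the":    np.array([0.1, 0.0, 0.2]),
    "dog":    np.array([0.9, 0.1, 0.3]),
    "barked": np.array([0.8, 0.2, 0.4]),
    "loudly": np.array([0.7, 0.3, 0.5]),
}

def semantic_dissimilarity(word, preceding_words):
    """Score how different a word's meaning is from its recent context:
    1 minus the cosine similarity between the word's vector and the
    average vector of the preceding words (higher = more unexpected)."""
    context = np.mean([embeddings[w] for w in preceding_words], axis=0)
    v = embeddings[word]
    cosine = np.dot(v, context) / (np.linalg.norm(v) * np.linalg.norm(context))
    return 1.0 - cosine

# Example: how "barked" relates in meaning to the words that came before it.
print(semantic_dissimilarity("barked", ["the", "dog"]))

A score like this, computed for every word as it arrives, is the sort of moment-by-moment measure of meaning that the brain response in the study appeared to track.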

This rapid calculation of how similar words are to one another in meaning is believed to be a large part of how we manage to listen and speak at rates of about 120 to 200 words per minute. The discovery is also helping scientists develop new technologies that make use of the same signal. Professor Ed Lalor, the study’s lead author, has high hopes for the future:

“We hope the new approach will make a real difference when applied in some of the ways we envision.”

And they’re envisioning a lot of ways to apply this discovery. It could help us better understand the needs of young children learning to speak, people with nonverbal autism, and people with advanced dementia who can no longer talk. It may assist in the development of a test that shows whether a person is developing Alzheimer’s disease or whether a young child may have autism or a learning disability. It could also help determine whether comatose or unresponsive patients can understand speech, and it could serve as a testing tool for people in demanding, fast-paced jobs to confirm that important instructions have been understood.

For now, it’s best to presume competence in any person who cannot tell you whether or not they understand you. But this technology brings hope for a future that could provide better diagnosis, treatment, and understanding of people who may not be able to speak for themselves.



Elizabeth Nelson is a wordsmith, an alumna of Aquinas College in Grand Rapids, a four-leaf-clover finder, and a grammar connoisseur. She has lived in west Michigan since age four but loves to travel to new (and old) places. In her free time, she... wait, what’s free time?