New technologies help bridge the communication gap for patients with language impairments triggered by brain injuries or stroke
Oct. 29, 2010
When a stroke patient or someone with a traumatic brain injury doesn’t respond to questions as expected, everyone from family members to clinicians might assume that the person has lost cognitive abilities. But those assumptions often are incorrect. Aphasia—a neurological condition brought on by brain disease or injury—impairs people’s ability to process language but doesn’t affect their intelligence. Unfortunately, there’s no easy way to determine how much individuals with aphasia understand.
Brooke Hallowell, professor of communication sciences and disorders at Ohio University, has been working for the last 20 years to change that.
“Everyone has experienced situations where they have strengths that are overlooked or where people make judgments about their abilities that are based on incorrect information,” Hallowell says. “For patients with aphasia, people may be incorrectly assuming that these patients are not cognizant of everything going on around them.”
Most tools currently used in clinical settings to assess language comprehension can yield inaccurate diagnoses when other medical conditions interfere with testing. With a clearer picture of patients’ abilities, health care providers can work with patients and their families to select the social interventions, living situations, and therapies best suited to improving their lives, Hallowell says.
“It’s very important to figure out a patient’s abilities as well as their disabilities. Otherwise, these individuals could be misdiagnosed with a cognitive disorder, which could have a negative effect on their quality of life,” notes Hallowell, who first became interested in helping people with communication problems when, as a child, she volunteered at a nursing home where her mother worked as the activities director.
This effort could become more important in the coming years, as the incidence of stroke and brain injuries is expected to increase as the population ages. Aphasia currently affects more than 1 million Americans, making it more prevalent than Parkinson’s disease, cerebral palsy, or muscular dystrophy. According to the National Aphasia Association, more than 100,000 Americans acquire the disorder each year.
Over the last several years, Hallowell has conducted experiments on how eye-tracking methods may be used to assess language comprehension, working with both people with aphasia and control groups of adults. The systems monitor where on a computer screen people’s eyes look in response to different stimuli and instructions—for example, as participants listen to stories and questions while viewing paintings on the screen.
Eye-tracking systems are one of the most reliable means of measuring understanding in and communicating with people with linguistic disorders, Hallowell says, because eye movement is often the best preserved system in the human body when neurological disorders occur. Even among people who cannot speak or move, many can move their eyes in response to commands.
Using a camera mounted to the bottom of a computer screen and a chin rest to steady a patient’s head, the eye-tracking system Hallowell uses records eye position 60 times each second. The system takes two measurements—one looking at the pupil of the eye and another looking at the cornea. Customized computer software uses those two points to do a vector calculation to determine a third point, which allows Hallowell to get an accurate reading of where a patient’s eyes were fixated over time during an experiment. By matching those fixation points to the verbal cues individuals received, researchers can get a better understanding of how much the person comprehends.
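The kind of computation described above can be sketched in simplified form. The Python sketch below is illustrative only, not the team’s actual software: the linear calibration, the fixed 60 Hz indexing, and the region-scoring helper are assumptions made for this example.

```python
# Illustrative sketch of the pupil/cornea vector idea: map the vector
# between two measured eye points to a screen position, then match gaze
# samples against the time window of a verbal cue. All parameter values
# here are invented for demonstration purposes.

SAMPLE_RATE_HZ = 60  # the system records eye position 60 times per second


def gaze_point(pupil, cornea, gain=(1.0, 1.0), offset=(0.0, 0.0)):
    """Turn the pupil-to-cornea vector into a screen coordinate using a
    simple linear calibration (real systems calibrate per participant)."""
    vx = pupil[0] - cornea[0]
    vy = pupil[1] - cornea[1]
    return (gain[0] * vx + offset[0], gain[1] * vy + offset[1])


def samples_in_window(samples, start_s, end_s):
    """Return the gaze samples recorded during a verbal cue's time window,
    assuming samples are stored in recording order at SAMPLE_RATE_HZ."""
    i0 = int(start_s * SAMPLE_RATE_HZ)
    i1 = int(end_s * SAMPLE_RATE_HZ)
    return samples[i0:i1]


def proportion_on_target(samples, target_box):
    """Share of gaze samples that fall inside a rectangular screen region,
    a rough proxy for whether the participant looked where the cue said."""
    x0, y0, x1, y1 = target_box
    hits = sum(1 for (x, y) in samples if x0 <= x <= x1 and y0 <= y <= y1)
    return hits / len(samples) if samples else 0.0


# Example: with identity calibration, a pupil at (10, 8) and a cornea
# point at (7, 5) give a gaze estimate of (3.0, 3.0).
print(gaze_point((10, 8), (7, 5)))
```

In practice, eye-tracking software also groups consecutive nearby samples into fixations and calibrates the mapping per participant; this sketch skips both steps to keep the vector-and-matching idea visible.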
That’s not only been valuable information for the researchers, but also for the loved ones and caregivers of patients in Hallowell’s studies.
“It’s most common for a spouse or partner of the patient to say something like, ‘This test is so good because I know he understands so much more than his test scores suggest, but he just can’t show people what he understands,’” Hallowell says. “On the other hand, a caregiver might also be surprised to learn that an eye-tracking test confirms poor comprehension, having had no clear way of knowing for sure what a loved one was taking in.”
With experiences like these, Hallowell isn’t content to keep the technology in the laboratory. She has recruited a research team that includes Hans Kruse, professor of information and telecommunications systems at Ohio University, and LC Technologies, a Virginia-based firm that has already developed eye-tracking technology that helps people with disabilities use computers to communicate.
Together, the team successfully applied for a $700,000 Small Business Innovation Research (SBIR) grant through the National Institutes of Health to determine the feasibility of commercializing this new technology. In partnership with Kruse, who oversees software development, and LC Technologies, Hallowell is working to develop a more user-friendly interface for the technology that will provide clinicians with the information they need to quickly and easily assess patients in a health care setting.
Over the last year, the research team has used the SBIR funds to refine its hardware and software systems and has filed for a patent on the technology both in the United States and internationally. This summer, the team began testing a new version of LC Technologies’ system that uses two cameras instead of one to record eye movements, which will provide even more accurate data readings. The team will then examine whether the increased accuracy is worth the increase in cost for clinicians to use the technology.
This fall, the research team plans to apply for a National Institutes of Health SBIR Phase II grant to continue moving toward commercialization. Hallowell also is exploring how the technology might be used to examine other factors that are difficult to assess in people with neurological disorders, such as attention and working memory.
Although that will require more study, Hallowell says she’s hopeful about the technology’s potential to help improve the quality of life of people with language disorders.
“Our goal is to help others—health care workers, family, significant others—understand that just because patients don’t respond when you give them a command, that doesn’t necessarily mean that they don’t ‘get it,’” Hallowell says. “We just need a better way to understand how much they ‘get.’”
By Linda Knopp
This article appears in the Autumn/Winter 2010 issue of Perspectives magazine.