Dye Seeks to Understand Deafness at the Most Basic Levels

Professor Matthew Dye uses American Sign Language to converse with Geo Kartheiser, the Visiting Research Coordinator in Dye’s Cross-modal Plasticity Lab at the Department of Speech and Hearing Science.

His experience as a lecturer at the Center for Deaf Studies in his native England helped shape Beckman Institute researcher Matthew Dye’s work today on deafness and visual function.

Beckman Institute researcher Matthew Dye had a trial by fire as a lecturer at the Center for Deaf Studies at the University of Bristol. With just a few years of study in the areas of psychology, linguistics, and deafness – and much less time learning British Sign Language – Dye was asked to lecture at the Center, using only sign language.

“When I went to work, I wasn’t allowed to speak. I had to use sign language,” Dye said. “When I taught, I had to use sign language. I could sign my way out of a paper bag, but not much further. Being in that environment, where every day I’m there from 8 in the morning to 5 or 6 at night and my social life became hanging out with the deaf colleagues, it was like, just do it. Use the language. It’s the only real way to learn.”

That experience of being immersed with deaf students and colleagues in his native England helped inform Dye’s research then as a Ph.D. student in psychology analyzing data on deafness. It continues to inform his efforts today as he does research in the areas of deafness and visual function and translational work for educational programs and potential interventions.

Dye is a professor and director of the American Sign Language (ASL) track in the Department of Speech and Hearing Science, which also houses his Cross-modal Plasticity Lab. The lab is sign language only when Deaf people are present and at group meetings, where an ASL interpreter is included.


“I try to make it an accessible lab, so if there’s a deaf graduate student or deaf undergraduate who wants to come work in the lab, they can do so and get full and immediate access to what’s going on,” Dye said. “You see all these smart deaf kids who have been held back, not by their own motivation, not by their own intelligence, but because they’re getting delayed access to information, and they’re getting mediated access to information where everything’s going through an interpreter, and it must be so frustrating.

“Plus, I work with deaf adults and deaf children, and the idea that I would do research with that population and then not give that population direct and immediate access to opportunities in my lab would make no sense at all.”

Dye began college as a psychology major at Manchester Polytechnic interested in neural computational modeling, but a course on the psychology of hearing impaired children led him toward his current path.

“I took these adult education classes and studied British Sign Language and was absolutely fascinated by it,” he said. “I loved it, really, really enjoyed those classes. What I do now has its origins in what I did then. It was a circuitous path to where I am now but the seed was in my undergraduate studies in Manchester.”

Dye, a member of the Human Perception and Performance group, has a research focus on the effects of altered sensory experience on the development of visual cognition skills.

“My academic calling is to do this work on cross-modal plasticity and deafness and the translational implications for education,” Dye said. “I find the theoretical work fascinating, intellectually stimulating, but if I’m going to do this, I want to make some kind of difference, make it have some kind of meaning, so the translational side of it is very important.

“I have a whole set of studies that are interested in how these cross-modal changes in deaf people play out in education. What’s the impact of this stuff for deaf kids learning to read? What’s the impact for deaf classrooms?”

Dye said both his research data and his experiences at the Center for Deaf Studies had a profound impact on his current perspectives.

“It gave me certain insights because I met deaf people who acquired (sign language) from their deaf parents and I met deaf people who didn’t learn this language until they were 16, 17, or 18 years old,” he said. “Both groups are very, very fluent users of the language, but there are subtle differences in how easily they comprehend. When I analyzed my data, I didn’t just lump all of my deaf people together in one group and compare them to hearing people.

“Deaf people are highly variable as a group,” he added. “Some people use sign language, and amongst those, some acquire that from their parents. Others acquire it later in life. Some people don’t use sign language, they use speech. Some of them have a cochlear implant, some of them don’t. It’s just such a variable group and we think that there are a lot of things going on.”

Dye’s experience as a lecturer at the Center in Bristol also led him to pursue a more traditional university position of teaching and doing research after earning a Ph.D. in Psychology from the University of Southampton. A colleague suggested a postdoctoral position in the United States and he ended up at the University of Rochester, where he stayed for seven years, eventually becoming a senior lecturer.


Three years ago Dye heard about an opportunity at the University of Illinois and it turned out to be the perfect fit for both him and the Department of Speech and Hearing Science.

“They wanted somebody who did research related to deafness and sign language, and somebody who could also direct the American Sign Language program,” Dye said. “All through my career, I’ve had people who’ve looked out for me and who’ve alerted me to opportunities. You just have to take the opportunity.”

The mission of the Cross-modal Plasticity Lab is to “understand the effects of deafness on visual functions at the behavioral and neural levels, and to explore the implications for learning in K-12 and higher education settings.” The research employs behavioral, neuroimaging, and classroom studies, and examines the impact of cross-modal plasticity on cognition and learning in deaf populations.

“I’m interested in why the different parts of your brain do what they do and how they ended up doing that,” Dye says of his research. “Why is it that in some people, the same parts of the brain do something completely different? Why is it that in blind people, the visual cortex is reading braille, and how is it in deaf people that the auditory cortex can actually help them see the world?

“My work kind of helps challenge these notions that these different parts of the brain are dedicated to certain things and explores just how flexible the brain is.”

And that flexibility has to do with the concept of cross-modal brain plasticity that is at the heart of his research.

“Cross-modal plasticity is really the flexibility of the brain and how the different senses interact such that, in any system, when you take something away, you change the equilibrium of the system and it reorganizes itself,” Dye said.

As an example, Dye cites one of his research projects that looks at the effects of language delays in deaf children who have a wide variety of experiences, from those who learn sign language at an early age, to those who receive cochlear implants and intensive speech therapy.

“When I look at deaf children, they seem to be struggling, and then we look at children who are a little bit older and they’re OK,” he said. “Then we look at young adults and they’re doing really, really well. It’s almost like this reorganization of the brain is taking place, but it takes a while for the children to learn.

“So, it seems like these changes in visual attention have to be coupled with, this is my hypothesis, changes in attentional control in order for the organism, in this case the human child, to obtain some benefit from it. It’s not automatic. There is a point in development where control is sufficient to allow the child to benefit from this functional reorganization. So it’s a dynamic system and what we think is going on is their attention gets redistributed.”

In order to include more imaging data in his work, Dye recently began working with Beckman colleagues Gabriele Gratton and Monica Fabiani, using their diffuse optical imaging system for studies of various populations in a project with the NSF-funded Science of Learning Center at Gallaudet, a university for deaf and hard of hearing students in Washington, D.C.

“This imaging project is allowing us to really look at different competing hypotheses about what changes in deaf individuals,” Dye said. “My work tends to be very behavioral, psychophysics work, and we have an opportunity to explore this which these imaging methods allow us to do.”

Dye is just as interested in the potential translational outcomes of his work, such as interventions and educational programs. He is engaged in studies on how cross-modal changes in deaf people affect education in the classroom and how deaf students learn to read. One potential intervention is introducing American Sign Language early in a deaf child’s life.

“Deaf children are all very highly variable and some of them don’t really have access to language and communication early on in life,” Dye said. “Some children are fortunate enough, in that they’re exposed to natural language in infancy.

“Based upon some of our data, we think that while other research groups are proposing that deafness leads to these general cognitive deficits, we think it’s the lack of language input early on that leads to these deficits. That’s because you don’t see them in deaf children born to deaf families, who are very, very deaf. They never get a cochlear implant, many times they never get exposed to spoken language, but they are absolutely fine on these measures.

“So what we think is going on is that a lot of deficits stem from these early delays in access to language and that by introducing American Sign Language as early as you can, you’re going to help the child with those cognitive abilities.”

For Dye, members of the deaf community aren’t simply subjects for study. They are part of his teaching, his basic and interventional research, his educational outreach efforts, and his lab, where Geo Kartheiser serves as the Visiting Research Coordinator.

“I couldn’t do my work without the participation and involvement of the deaf community, deaf parents, deaf children,” Dye said. “And from a purely non-altruistic point of view, it’s hard for me to do that research if I don’t have a deaf person on my team. Geo has been here for two-and-a-half years, he’s a smart guy and he’s learned a lot, to the point where he’s now been accepted for a Ph.D. in linguistics at Gallaudet University.

“To me he is my first success, to have this deaf guy who has a degree in English and public relations but has worked in my lab for two-and-a-half years because he’s fascinated with that field. And we’ve got it to the point where he has the knowledge and the motivation to get a graduate degree in linguistics.”