Her work focuses on language acquisition, specifically how very young children interpret words and sentences. While conventional wisdom says that knowing the meaning of each word is central to that task, Fisher says sentence structure, or syntax, plays a primary role as well—even for those as young as 15 months old.
“The language we learn is not just words but a formal system of combining the finite set of words or expressions for our own purposes as a way to say new things,” said Fisher, professor of psychology and linguistics, and a member of Beckman’s Illinois Language and Learning Initiative. “It’s the grammar that allows us to make new combinations and figure out an unlimited set of sentences.”
“One of the questions I’m most interested in is: ‘How is it possible that the structures of sentences affect how very young children interpret their meaning even before they have learned much about the grammar of their language?’”
The answer to that question begins with an understanding of “syntactic bootstrapping,” the theory that young learners use their preliminary knowledge of syntax to infer the meaning of words. This idea was developed in 1985 by Lila Gleitman, Fisher’s graduate advisor at the University of Pennsylvania, and has informed Fisher’s work since she wrote her dissertation.
The idea is that once toddlers know a handful of nouns, like “Mommy,” “baby,” and “cup,” for instance, “they have a tiny bit of structure that can be used as building blocks for more structure,” said Fisher. “A simple set of nouns is sufficient to learn new things about the language,” such as identifying verbs.
“To study the learners’ sensitivity to syntax, we use invented verbs, like ‘pilking’ or ‘kradding,’” said Fisher. “Simple syntactic cues, like how many nouns occur in the sentence, help babies determine meaning. A sentence with two nouns, for instance, tells the learner that the sentence’s meaning involves two participants.”
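The noun-count cue Fisher describes can be sketched as a toy heuristic. This is an illustration only, not the lab’s actual model; the vocabulary list and sentences are invented for the example:

```python
# Toy sketch of the noun-count cue: the number of familiar nouns in a
# sentence hints at how many participants the described event involves.
# The vocabulary and sentences here are invented for illustration.

KNOWN_NOUNS = {"mommy", "baby", "cup", "ball", "box"}

def guess_participants(sentence: str) -> int:
    """Count familiar nouns as a proxy for the number of event participants."""
    words = sentence.lower().replace(".", "").split()
    return sum(1 for w in words if w in KNOWN_NOUNS)

print(guess_participants("Mommy is pilking the baby"))  # 2 nouns -> two-participant event
print(guess_participants("Mommy is pilking"))           # 1 noun  -> one-participant event
```

Even this crude count captures the contrast in the article: the transitive frame supplies two nouns, the intransitive frame only one.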
Some Words About Verbs
How does Fisher test the theory when the children are too young to express their knowledge in speech?
She explains it this way. “Imagine a 16-month-old is sitting in front of a television with two video ‘windows.’ One video shows a two-participant event, such as a box bumping another box along. The other shows the action of one participant, perhaps a ball jumping up and down. While watching these two events, the child hears an invented verb: either transitive, with a direct object, like ‘Mommy is pilking the baby,’ or intransitive, without a direct object, like ‘Mommy is pilking.’ We videotape the kids and track their eye gaze, coding which video they’re looking at. What we find is that the children who hear the transitive verb look longer at the two-participant event than the kids who hear the intransitive verb do.”
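The gaze coding described above boils down to comparing looking-time proportions between the two conditions. A minimal sketch of that analysis, with entirely made-up frame codes rather than real data:

```python
# Toy sketch of preferential-looking analysis: for each child, compute
# the fraction of coded video frames spent looking at the two-participant
# event. All gaze codes below are invented for illustration.

def looking_proportion(frames: list[str]) -> float:
    """Fraction of coded frames on which gaze was on the two-participant video."""
    on_target = sum(1 for f in frames if f == "two-participant")
    return on_target / len(frames)

# Frame-by-frame gaze codes for two hypothetical children.
transitive_child = ["two-participant"] * 7 + ["one-participant"] * 3
intransitive_child = ["two-participant"] * 4 + ["one-participant"] * 6

print(looking_proportion(transitive_child))    # 0.7
print(looking_proportion(intransitive_child))  # 0.4
```

A higher proportion in the transitive condition, as in the invented numbers here, is the pattern Fisher reports: hearing the two-noun sentence draws looks toward the two-participant event.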
A Beckman collaboration is advancing this work on “baby knowledge of syntax.” Dan Roth, professor of computer science and a member of the Illinois Language and Learning Initiative, is a pioneer in the use of advanced machine learning methods in natural language processing. With current postdoctoral research associate Christos Christodoulopoulos, Roth and Fisher have developed a computational model that allows for further experimentation as well as refinement of the syntactic bootstrapping theory.
The model, called BabySRL (SRL for Semantic Role Labeling), verifies that it is possible for very young children to begin learning sentence-level semantics, and to identify verbs, once they can identify a small number of nouns. Future work with the model will test how the theory can be extended to more complex sentence structures.
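The core idea behind semantic role labeling from shallow cues can be sketched in a few lines. This is a drastically simplified illustration in the spirit of BabySRL, not the actual model: with only a small noun vocabulary, the first of two nouns is guessed to be the agent and the second the patient.

```python
# Drastically simplified sketch of semantic role labeling from shallow
# positional cues: the first of two familiar nouns is labeled the agent,
# the second the patient. An illustration only, not the BabySRL model.

KNOWN_NOUNS = {"mommy", "baby", "cup"}

def label_roles(sentence: str) -> dict[str, str]:
    """Assign toy semantic roles to familiar nouns by word order."""
    words = sentence.lower().replace(".", "").split()
    nouns = [w for w in words if w in KNOWN_NOUNS]
    roles = {}
    if len(nouns) >= 1:
        roles[nouns[0]] = "agent"
    if len(nouns) >= 2:
        roles[nouns[1]] = "patient"
    return roles

print(label_roles("Mommy is kradding the baby"))
# {'mommy': 'agent', 'baby': 'patient'}
```

The point of the sketch is the bootstrapping claim itself: a handful of known nouns plus their order is already enough to start assigning sentence-level roles, with no prior knowledge of the verb.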
Fisher’s expertise in language acquisition and infant cognition has also added to other Beckman collaborations, including research being conducted by psychology professors Gabriele Gratton and Monica Fabiani of the Mechanisms of Cognitive Control Group. The team is exploring how a technology that Gratton and Fabiani have developed for studying brain function, the Event-Related Optical Signal (EROS), can be used to measure the brain activity of infants during learning tasks.
Making Sense of Sentences
Collaborations beyond Beckman examine whether the theory of syntactic bootstrapping applies best to English or holds up equally well in other languages. Kyong-sun Jin, who earned a Ph.D. in psychology from Illinois, is examining that question in her research as a postdoctoral fellow at Yonsei University.
Fisher says Korean is an excellent choice for testing the theory because it is one of several languages in which speakers can drop nouns from sentences. Results from the same methods of video testing and eye-gaze monitoring used in Fisher’s lab in the United States indicate that “despite the many differences between languages, there are more similarities in the kinds of information that input sentences provide than you might expect,” said Fisher.
Whatever the language, the research continues to provide insights into how children assign meaning to words and sentences.
“It’s an exciting notion that the structure of sentences can be intrinsically meaningful to young children, helping to push them into new interpretations and new learning about the grammar,” said Fisher. “There is a relationship between structure and meaning, and that’s something that even very young children can discover.”