Researchers study early stages of infant word learning
Researchers at Indiana University, supported by a grant from the U.S. National Science Foundation, have published the results of their research on how infants put names to objects, a key step in language development. Past research has usually focused on "naming moments" -- when a baby sees an object and hears its name spoken at the same time.
What happens, though, when those two things don't occur in tandem, and how does that affect infant word learning and language development? The researchers found that the brain's hippocampal memory system may not be developed enough for infants to form durable memories that pair objects with names in a single naming moment.
"Our study shows that a different perspective is potentially needed to explain how infants are making these links," Elizabeth Clerkin, one of the authors of the study, said. "We focused on understanding how infants are developing their memories for objects and categories more generally."
The results suggest that the connection between an object and its name could come from object-name memories built up over time. The researchers cataloged infants' encounters with everyday objects across 67 hours of audiovisual recordings made at mealtimes. They found that while babies sometimes heard an object's name as they looked at it, they far more often saw everyday objects without hearing them named.
"The idea is that over long periods of time, traces of memory for visual objects are being built up slowly in the neocortex," Clerkin said. "When a word is spoken at a specific moment and the memory trace is reactivated close in time to the name, this mechanism allows infants to make a connection rapidly."
Study co-author Linda Smith added, "When scientists think about how infants manage to learn words, they've traditionally focused on internal cognitive mechanisms. We need to study the structures of learning environments, not just the internal cognitive mechanisms, because that will tell us more about what needs to be in place for children to learn language."
Understanding more about how infants learn language from the verbal and visual cues in their environment could lead to earlier and more effective interventions for children with delayed speech and other language development difficulties.
"This study provides insights into how young children can learn to generalize from a few examples," says Soo-Siang Lim, director of NSF's Science of Learning and Augmented Intelligence program. "This work will also help inform new methods for artificial intelligence where massive data sets are required for machine learning."