New study shows that by 15 months, babies can learn words without seeing objects

A groundbreaking study from Northwestern University and Harvard University in the United States reveals that infants as young as 15 months can begin to grasp the meaning of entirely new words — even when the objects those words refer to are not visible. The findings show that babies are able to extract meaning from the context of adult conversations, forming mental representations of unseen items based solely on language. This early ability to infer meaning without direct visual experience marks a significant milestone in cognitive and language development, challenging traditional assumptions about how and when children begin to understand the world through words.


How Babies Learn Words for Things They Can’t See

The Research Behind the Discovery

According to ScienceDaily, a team of developmental psychologists from Northwestern University and Harvard University has provided compelling new evidence of infants' remarkable capacity to learn language. Published in PLOS ONE on April 23, 2025 (https://doi.org/10.1371/journal.pone.0321775), the study was led by Sandra Waxman, Louis W. Menk Professor of Psychology and director of the Infant and Child Development Center at Northwestern, and co-authored by Elena Luchkina, a former postdoctoral fellow at Northwestern now working at Harvard.

The researchers designed a novel three-part experiment to test how well infants could form a mental “gist” of a new word’s meaning, even when the object it referred to was hidden. They enrolled 134 infants in total — 67 who were 12 months old and 67 who were 15 months old.

The researchers first presented familiar words, like “banana” or “apple,” alongside pictures of the well-known objects they name, to establish a baseline of understanding. Then they introduced a completely new word — for example, “kumquat” — but didn’t show the object it referred to. Instead, the word was spoken while the related object remained hidden, simulating how babies often hear unfamiliar words in everyday conversations without seeing the item in question.

In the final part of the task, the infants were shown two unfamiliar objects side by side — one was a kumquat (a small, orange-like fruit) and the other was something unrelated, like a whisk. The researchers asked, “Where is the kumquat?” and tracked where the babies looked. They found that 15-month-old infants spent more time looking at the fruit than the whisk, suggesting they had connected the new word with the likely object based on context. Meanwhile, 12-month-olds didn’t show a clear preference, pointing to a developmental leap that typically occurs between 12 and 15 months.
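The preferential-looking logic described above can be sketched in a few lines of code. Everything here is illustrative: the looking times, trial counts, and the simple proportion-based preference score are hypothetical stand-ins, not the study's actual data or statistical analysis.

```python
# Sketch of a preferential-looking analysis. All numbers below are
# hypothetical; the study's real data and statistics are not reproduced here.

def looking_preference(target_ms, distractor_ms):
    """Proportion of total looking time spent on the target object.

    0.5 means no preference; values above 0.5 favor the target.
    """
    total = target_ms + distractor_ms
    return target_ms / total if total else 0.5

# Hypothetical trials: looking times in milliseconds at the target
# (kumquat) vs. the distractor (whisk) for each infant.
trials_15mo = [(4200, 2800), (3900, 3100), (4500, 2500)]
trials_12mo = [(3400, 3600), (3550, 3450), (3300, 3700)]

def mean_preference(trials):
    """Average preference score across a group of trials."""
    scores = [looking_preference(t, d) for t, d in trials]
    return sum(scores) / len(scores)

print(mean_preference(trials_15mo))  # above 0.5: looks longer at the target
print(mean_preference(trials_12mo))  # near 0.5: no clear preference
```

In this toy setup, the 15-month-old group's score lands above 0.5 (a target preference) while the 12-month-old group's hovers near chance, mirroring the pattern the researchers report.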

What Makes This Study Unique: Moving Beyond Direct Word-Object Pairings

Previous studies in infant language acquisition primarily focused on situations where a new word was introduced alongside a visible object. These traditional “word-object mapping” scenarios — where a parent points and says, “Look, a dog!” — have shaped our understanding of how babies learn vocabulary.

This new study breaks with that convention. It asks: what happens when babies hear a new word, but don’t see the object it describes?

The innovation lies in showing that babies can learn without direct visual experience. By isolating the linguistic context alone — without physical cues — the researchers created a more realistic, everyday learning environment. After all, in real life, people frequently mention objects that aren’t immediately present.

This makes the study one of the first to directly investigate how infants construct meaning purely from language.

Key Findings from the Study

1. 15-month-olds can identify unseen objects through language context

Babies in this age group looked longer at the object most likely associated with a novel word they had heard earlier, even if they hadn’t seen the object at the time.

Example: A parent says, “I bought kumquats today — they’re like oranges,” while the fruit is still in a grocery bag. Later, when the baby sees two unfamiliar items, they’re more likely to recognize the kumquat based on what was said earlier.

2. 12-month-olds did not yet show this ability

One-year-olds did not display a consistent preference, suggesting that their language systems might not yet be robust enough to infer meaning from abstract linguistic cues.

Example: A 12-month-old hearing the word “kumquat” in a conversation would likely not associate it with anything specific until they actually see the fruit.

3. Infants build a “gist” of the word’s meaning

Researchers concluded that babies do not need detailed knowledge of the object itself — even a general category, like “fruit,” is enough to guide them later.

Example: Even if the baby doesn’t know what a kumquat looks like, they might expect it to be something edible or round based on the conversation context.

4. Language alone can drive learning before object recognition

The study showed that language exposure itself can create mental placeholders for concepts.

Example: Hearing about a “platypus” in a storybook before ever seeing one might still allow a baby to later distinguish it from other unknown animals.

5. Babies’ brains are primed for contextual learning much earlier than previously thought

This pushes back the assumed timeline for when infants begin to connect words and meanings based on indirect experience.

Example: When adults casually chat around babies, even those seemingly irrelevant conversations may shape how babies learn about the world.

What Does This Mean for Cognitive Development: Language as a Cognitive Catalyst

This study illustrates a key milestone in cognitive development: the ability to create symbolic mental representations of things that are not immediately visible. It shows that by 15 months, infants are already moving beyond simple stimulus-response learning and engaging in more abstract, inferential reasoning.

Such abilities are foundational for higher-order thinking. Forming a mental representation of an unseen object involves memory, attention, categorization, and predictive reasoning — all core components of cognitive development.

It also shows that infants are not passive listeners. Instead, their brains are actively constructing knowledge from the language they hear, even when there’s no physical referent. This hints at an early integration of linguistic and conceptual skills, paving the way for more complex forms of reasoning, imagination, and problem-solving later in life.

Why It Matters: New Clues for Early Learning, Diagnostics, and Education

The findings from this study offer more than just insight into how babies acquire language — they could shape real-world practices in science, medicine, and education.

In developmental research, the discovery that infants form abstract concepts through language alone pushes the boundaries of what was previously thought possible in early cognition. It challenges long-standing assumptions that direct visual experience is required for word learning and opens new pathways for studying how symbolic thinking develops.

For educators and caregivers, the implications are equally significant. The results suggest that even very young children benefit from rich verbal environments. Talking to babies — even when they aren’t yet speaking — becomes not just a bonding activity but a crucial foundation for building future language and thinking skills. Storytime, everyday conversations, and exposure to descriptive language could all accelerate cognitive growth long before school begins.

In healthcare, the research may help improve early screening tools for language delays or developmental disorders. If a child at 15 months struggles to respond to linguistic context, it could serve as an early warning sign, allowing for quicker intervention and support.

From a social perspective, the study reinforces the importance of equitable access to language-rich environments, especially in under-resourced communities. Initiatives that encourage reading and conversation with infants could play a critical role in reducing developmental disparities.

Overall, this work highlights the extraordinary power of everyday language in shaping a child’s mind — even when the object of discussion is nowhere in sight.

Final Insight: Rethinking How and When Babies Start to Understand Language

This study marks a turning point in how researchers and parents alike understand the earliest stages of language development. By showing that 15-month-old infants can use context alone to grasp the meaning of words — even when the objects are out of sight — scientists have uncovered a powerful new dimension of early learning.

It reveals that babies aren’t simply reacting to what they see; they’re constantly processing what they hear, forming mental models of the world through language long before they can express themselves. This insight redefines what it means to “talk to your baby” — it’s not just communication, it’s early education.

For parents curious about how their child is progressing, tools like the BabyBright by CogniFit app can help track whether development is aligned with the child’s age — offering a simple way to stay engaged and attentive during early growth. While babies absorb and process more than we often realize, having a resource that reflects age-appropriate milestones can support caregivers in nurturing early learning through everyday interaction, language, and play.

In essence, the study affirms a hopeful truth: babies understand more than we think. Every word spoken around them has the potential to shape how they perceive, remember, and learn — even before they speak a single word.