The science of early language acquisition just got a surprising plot twist.
Most theories say babies only start forming abstract language categories after they tune into the sounds of their native language, usually around 6–10 months. But what if they’ve been doing it all along, even before that?
A new study by Eylem Altuntas & colleagues at Western Sydney University suggests exactly that. Using a clever experiment with “mini-languages” & cartoon crabs, the researchers found that 4–6-month-old infants don’t just hear language: they’re already analysing it.
How do we know?
Infants were trained on two sets of nonsense words: one set used consonants made with the lips (like /b/ & /v/), the other consonants made with the tongue tip (like /d/ & /z/), and each mini-language was paired with its own cartoon image. Later, the babies watched silent videos of people saying new words. When the lip or tongue movements matched the mini-language they’d learned earlier, the babies looked longer.
This wasn’t just recognition; it was abstraction. The babies were generalising phonological features across consonants & across sensory modes (sound → sight). In other words, they weren’t just mimicking what they heard: they were forming abstract, amodal representations.
Why does this matter?
This finding challenges major frameworks such as:
- WRAPSA (Word Recognition and Phonetic Structure Acquisition, Jusczyk, 1993), which assumes abstraction follows perceptual attunement. In this model, infants first become attuned to the sounds of their native language before they can build more complex, abstract phonological categories.
- PRIMIR (Processing Rich Information from Multidimensional Interactive Representations, Werker & Curtin, 2005), which places abstraction after vocabulary growth. According to PRIMIR, infants need a bank of known words before they can start generalising sound patterns across them.
- Lexical Restructuring Hypothesis (Walley, 1993), which ties abstraction to word learning. It suggests that as children learn more words, they shift from holistic representations to more segmental, phoneme-based ones, driven by the need to differentiate similar-sounding items.
Instead, this research aligns with the Perceptual Assimilation Model (Best & Tyler, 2007), which emphasises the importance of articulatory information as a foundation for perception.
Teacher Takeaways?
These findings apply to babies far too young to be in our classrooms, but keeping up with this kind of research helps us understand the foundations of language learning that older learners are still building on. It also reminds us how much linguistic ability is hardwired & how astonishingly capable young brains are. Even before they’ve “tuned in” to their mother tongue, infants are already doing the mental heavy lifting of a linguist.
- Language learning starts earlier than you think. These babies weren’t speaking or even babbling, yet they were already building abstract linguistic systems. Our definitions of ‘pre-verbal’ might need a rethink.
- Multisensory input matters. Seeing & hearing language in sync may not just help comprehension; it might build linguistic categories. Use videos, gestures, & facial cues where possible.
- Don’t underestimate the young learner. This research highlights just how cognitively rich early development is. What’s “too early” to teach may be more about how than when.