Toddlers Can Master Computers

In the last couple of years, toddlers and even babies have begun to be able to use computers. This may seem like the sort of minor news that shows up in the "lifestyle" section of the paper and in cute YouTube videos. But it actually presages a profound change in the way human beings live.

Touch and voice interfaces have become ubiquitous only very recently; it's hard to remember that the iPhone is only eight years old. For grown-ups, these interfaces are a small additional convenience. But they completely transform the way that young children interact with computers. For the first time, a toddler can directly control a smartphone or tablet.

And they do. Young children are fascinated by these devices and they are remarkably good at getting them to do things. In recognition of this, in 2015, the American Academy of Pediatrics issued a new report about very young children and technology. For years the Academy had recommended that children younger than two should have no access to screens at all. The new report recognizes that this recommendation has become completely impracticable. It focuses instead, sensibly, on ensuring that when young children look at screens, they do it in concert with attentive adults, and that adults supervise what children see.

But this isn't just news for anxious parents; it's important for the future of the entire human species. There is a substantial difference between the kind of learning we do as adults, or even as older children, and the kind of learning we do before we are five. For adults, learning mostly requires effort and attention; for babies, learning is ubiquitous and automatic. Grown-up brains are more "plastic" than we once thought (neural connections can still rewire), but very young brains are far more plastic: young children's brains are designed to learn.

In the first few years of life we learn how the physical, biological, and psychological worlds work. Even though our everyday theories of the world depend on our experience, by the time we're adults we simply take them for granted; they're part of the unquestioned background of our lives. When technological, culturally specific knowledge is learned early, it becomes part of that background too. In our culture, children learn how to use numbers and letters before they are five; in rural Guatemala, they learn how to use a machete. These abilities require subtle and complicated knowledge, but it's a kind of knowledge that adults in the culture hardly notice (though it may startle visitors from another culture).

Until now, we couldn't assume that people would know how to use a computer in the way we assume they know how to count. Our interactions with computational systems depended on first acquiring the skills of numeracy and literacy. You couldn't learn how a computer worked without first knowing how to use a keyboard. That ensured that people learned about computers with relatively staid and inflexible old brains. We think of millennial high-school tech whizzes as precocious "digital natives." But even they really began to learn about computers only after they'd reached puberty. And that is just the point when brain plasticity declines precipitously.

The change in interfaces means that the next generation really will be digital natives. They will be soaked in the digital world and will learn about computers the way previous generations learned language—even earlier than previous generations learned how to read and add. Just as every literate person’s brain has been reshaped by reading, my two-year-old granddaughter’s brain will be reshaped by computing.

Is this a cause for alarm or celebration? The simple answer is that we don't know, and we won't for at least another twenty years, when today's two-year-olds grow up. But the history of our species should make us hopeful. After all, those powerful early learning mechanisms are exactly what allowed us to collectively accumulate the knowledge and skill we call culture. We can develop new kinds of technology as adults because we mastered the technology of the previous generation as children. From agriculture to industry, from stone tools to alphabets to printed books, we humans reshape our world, and our world reshapes our brains. Still, the emergence of a new player in this distinctively human process of cultural change is the biggest news there can be.