
Cognitive Scientist; Author, Guitar Zero: The New Musician and the Science of Learning

Within my lifetime (or soon thereafter) scientists will finally decode the language of the brain. At present, we understand a bit about the basic alphabet of neural function, how neurons fire, and how they come together to form synapses, but haven't yet pieced together the words, let alone the sentences.  Right now, we're sort of like Mendel, at the dawn of genetics: he knew there must be something like genes (what he called "factors"), but couldn't say where they lived (in the protein? in the cytoplasm?) or how they got their job done. Today, we know that thought has something to do with neurons, and that our memories are stored in brain matter, but we don't yet know how to decipher the neurocircuitry within. 

Doing that will require a quantum leap. The most popular current techniques for investigating the brain, like functional magnetic resonance imaging (fMRI), are far too coarse. A single three-dimensional "voxel" in an fMRI scan lumps together the activity of tens or even hundreds of thousands of neurons — yielding a kind of rough geography of the brain (emotion in the amygdala, decision-making in the prefrontal cortex) but little in the way of specifics. How does the prefrontal cortex actually do its thing? How does the visual cortex represent the difference between a house and a car, or a Hummer and a taxi? How does Broca's area know the difference between a noun and a verb?

To answer questions like these, we need to move beyond the broad scale geographies of fMRI and down to the level of individual neurons.

At the moment, that's a big job.  For one thing, in the human brain there are billions of neurons and trillions of connections between them; the sheer amount of data involved is overwhelming. For another, until recently we've lacked the tools to understand the function of individual neurons in action, within the context of microcircuits.

But there's good reason to think all that's about to change. Computers continue to advance at a dizzying pace. Then there's the truly unprecedented explosion in databases like the Human Genome and the Allen Brain Atlas, enormously valuable datasets that are shared publicly and instantly available to all researchers, everywhere; even a decade ago there was nothing like them. Finally, genetic neuroimaging is just around the corner — scientists can now induce individual neurons to fire and (literally) light up on demand, allowing us to understand individual neural circuits in a brand new way.

Technical advances alone won't be enough, though — we'll need a scientist with the theoretical vision of Francis Crick, who not only helped identify the physical basis of genes — DNA — but also the code by which the individual nucleotides of a gene get translated (in groups of three) into amino acids. When it comes to the brain, we already know that neurons are the physical basis of thinking and knowledge, but not the laws of translation that relate one to the other.

I don't expect that there will be one single code. Although every creature uses essentially the same translation between DNA and amino acids, different parts of the brain may translate between neurons and information in different ways. Circuits that control muscles, for example, seem to work on a system of statistical averaging; the angle at which a monkey extends its arm seems, as best we can tell, to be a kind of statistical average of the actions of hundreds of individual neurons, each representing a slightly different angle of possible motion: 44 degrees, 44.1 degrees, and so forth. Alas, what works for muscles probably can't work for sentences and ideas — so-called declarative knowledge, like the proposition that "Michael Bloomberg is the Mayor of New York" or the idea that my flight to Montreal leaves at noon. It's implausible that the brain would have a vast population of neurons reserved for each specific thought I might entertain ("my flight to Montreal leaves at 11:58 am", "my flight to Montreal leaves at 11:59 am", etc.). Instead, the brain, like language itself, needs some sort of combinatorial code, a way of putting together smaller pieces (Montreal, flight, noon) into larger elements.
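The statistical-averaging idea above can be made concrete with a small sketch. This is a toy model of population-vector decoding, not actual recorded data: the tuning curve, firing rates, and the evenly spaced preferred angles are all invented for illustration. Each simulated neuron fires most strongly near its own preferred arm angle, and the decoded angle is simply the firing-rate-weighted average of those preferences.

```python
import math

def tuning(preferred_deg, actual_deg, max_rate=50.0, width=40.0):
    """Hypothetical bell-shaped tuning curve: a neuron's firing rate
    peaks when the arm's actual angle matches its preferred angle."""
    diff = (actual_deg - preferred_deg + 180) % 360 - 180  # shortest angular distance
    return max_rate * math.exp(-(diff / width) ** 2)

def decode(preferred_angles, rates):
    """Rate-weighted circular mean of the population's preferred angles."""
    x = sum(r * math.cos(math.radians(a)) for a, r in zip(preferred_angles, rates))
    y = sum(r * math.sin(math.radians(a)) for a, r in zip(preferred_angles, rates))
    return math.degrees(math.atan2(y, x)) % 360

# A population of neurons whose preferred angles tile the circle in
# half-degree steps (44.0, 44.5, ... in the spirit of the essay).
preferred = [i * 0.5 for i in range(720)]
true_angle = 44.1
rates = [tuning(p, true_angle) for p in preferred]
estimate = decode(preferred, rates)
# The population's average lands very close to 44.1 degrees, even though
# no single neuron is tuned to exactly that angle.
```

The point of the sketch is the one the essay makes: the represented angle lives in the whole population, not in any individual cell — which is also why this scheme can't scale to open-ended propositions like flight times.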

When we crack that nut — when we figure out how the brain manages to encode declarative knowledge — an awful lot is going to change. For one thing, our relationship to computers will be completely and irrevocably altered; clumsy input devices like mice, windows, keyboards, and even heads-up displays and speech recognizers will go the way of typewriters and fountain pens; our connection to computers will be far more direct. Education, too, will fundamentally change, as engineers and cognitive scientists begin to leverage an understanding of brain code into ways of directly uploading information into the brain. Knowledge will become far cheaper than it already has become in the Internet era; with luck and wisdom, we as a species could advance immeasurably.