2014 : WHAT SCIENTIFIC IDEA IS READY FOR RETIREMENT?


Computer Scientist, Yale University; Chief Scientist, Mirror Worlds Technologies; Author, America-Lite: How Imperial Academia Dismantled our Culture (and ushered in the Obamacrats)
The Grand Analogy

Today computationalists and cognitive scientists—those researchers who see digital computing as a model for human thought and the mind—are nearly unanimous in believing the Grand Analogy and teaching it to their students. And whether you accept it or not, the analogy is a milestone of modern intellectual history. It partly explains why a solid majority of contemporary computationalists and cognitive scientists believe that eventually you will be able to give your laptop a (real, not simulated) mind by downloading and executing the right software app. Whereupon if you tell the machine, "imagine a rose," it will conjure one up in its mind, just as you do. Tell it to "recall an embarrassing moment" and it will recall something and feel embarrassed, just as you might. In this view, embarrassed computers are just around the corner.

But no such software will ever exist. The analogy is false, and it has slowed our progress in grasping the actual phenomenology of mind. We have barely begun to understand the mind from the inside. But what's wrong with this suggestive, provocative analogy? My first reason is old; the other three are new.

1. The software-computer system relates to the world in a fundamentally different way from the mind-brain system. Software moves easily among digital computers, but each human mind is (so far) wedded permanently to one brain. The relationship between software and the world at large is arbitrary, determined by the programmer; the relationship between mind and world is an expression of personality and human nature, and no one can re-arrange it.

There are computers without software, but no brains without minds. Software is transparent. I can read off the precise state of the entire program at any time. Minds are opaque—there is no way I can know what you are thinking unless you tell me. Computers can be erased; minds cannot. Computers can be made to operate precisely as we choose; minds cannot. And so on. Everywhere we look we see fundamental differences.

2. The Grand Analogy presupposes that minds are machines, or virtual machines—but a mind has two equally important functions, doing and being; a machine is only for doing. We build machines to act for us. Minds are different: yours might be wholly quiet, doing ("computing") nothing; yet you might be feeling miserable or exalted—or you might merely be conscious.

Emotions in particular are not actions, they are ways to be. And emotions—states of being—play an important part in the mind's cognitive work. They allow you, for instance, to feel your way to a cognitive goal. ("He walked to the window to recollect himself, and feel how he ought to behave." Jane Austen, Persuasion.) Thoughts contain information, but feelings (mild wistfulness, say, on a warm summer morning) contain none. Wistfulness is merely a way to be.

Until we understand how to make digital computers feel (or experience phenomenal consciousness), we have no business talking up a supposed analogy between mind:brain and software:computer.

(Those who note that computers-that-can-feel are not credible are sometimes told: "You assert that many billions of tiny, meaningless computer instructions, each unable to feel, could never create a system that feels. Yet neurons are also tiny and 'meaningless' and feel nothing, yet a hundred billion of them yield a brain that does feel." Which is irrelevant: a hundred billion neurons yield a brain that supports a mind, but a hundred billion sand grains or used tires yield nothing. You need billions of the right article arranged in the right way to get feeling.)

3. The process of growing up is innate to the idea of a human being. Social interactions and body structure change over time, and the two sets of changes are intimately connected. A toddler who can walk is treated differently from an infant who can't. No robot could acquire a human-like mind unless it could grow and change physically, interacting with society as it did.

But even if we focus on static, snapshot minds, a human mind requires a human body. Bodily sensations create mind-states that cause physical changes that create further mind-changes. A feedback loop. You are embarrassed; you blush; feeling yourself blush, your embarrassment increases. Your blush deepens.

We don't think with our brains only. We think with our brains and bodies together. We might build simulated bodies out of software—but simulated bodies can't interact in human ways with human beings. And we must interact with other people to become thinking persons.

4. Software is inherently recursive; recursive structure is innate to the idea of software. The mind is not and cannot be recursive.

A recursive structure incorporates smaller versions of itself: an electronic circuit made of smaller circuits, an algebraic expression built of smaller expressions.
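The algebraic case can be made concrete in a few lines of code; the names here (Num, Add, Mul, evaluate) are my own illustrations, not anything from the essay. An expression literally contains smaller expressions, and the function that evaluates it mirrors that shape by calling itself on the parts:

```python
# A minimal sketch of a recursive structure: an algebraic expression
# built out of smaller expressions. All names are illustrative.
from dataclasses import dataclass
from typing import Union

@dataclass
class Num:
    value: int          # a bare number: the smallest expression

@dataclass
class Add:
    left: "Expr"        # an expression containing two sub-expressions
    right: "Expr"

@dataclass
class Mul:
    left: "Expr"
    right: "Expr"

Expr = Union[Num, Add, Mul]

def evaluate(e: Expr) -> int:
    # The evaluator mirrors the structure: it calls itself on the
    # smaller expressions the larger one incorporates.
    if isinstance(e, Num):
        return e.value
    if isinstance(e, Add):
        return evaluate(e.left) + evaluate(e.right)
    return evaluate(e.left) * evaluate(e.right)

# (2 + 3) * 4, built from smaller versions of itself
expr = Mul(Add(Num(2), Num(3)), Num(4))
print(evaluate(expr))  # 20
```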

Software is a digital computer realized by another digital computer. (You can find plenty of definitions of digital computer.) "Realized by" means made-real-by or embodied-by. The software you build is capable of exactly the same computations as the hardware on which it executes. Hardware is a digital computer realized by electronics (or some equivalent medium).

Suppose you design a digital computer; you embody it using electronics. So you've got an ordinary computer, with no software. Now you design another digital computer: an operating system, like Unix. Unix has a distinctive interface—and, ultimately, the exact same computing power as the machine it runs on. You run your new computer (Unix) on your hardware computer. Now you build a word processor (yet another dressed-up digital computer) to run on Unix. And so on, ad infinitum. The same structure (a digital computer) keeps recurring. Software is inherently recursive.
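That layering—one digital computer realized by another—can be sketched minimally. Below, a toy stack machine (the instruction set is invented for illustration) is realized by a Python program; Python itself is realized by an interpreter written in C; that in turn is realized by hardware. The same structure recurs at every level:

```python
# A toy digital computer: a stack machine with three instructions,
# realized by Python—which is itself realized by a lower-level
# computer, and so on down to the hardware. (Instruction names are
# invented for this sketch.)
def run(program):
    stack = []
    for op, *args in program:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# Compute (2 + 3) * 4 on the toy machine.
program = [("push", 2), ("push", 3), ("add",), ("push", 4), ("mul",)]
print(run(program))  # 20
```

Nothing stops us from writing yet another interpreter in the toy machine's own instruction set and running it on this one: the tower of computers-realizing-computers can continue indefinitely, which is the recursion the essay describes.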

The mind is not and cannot be. You cannot "run" another mind on yours, and a third mind on that, and a fourth atop the third.

In conclusion: much has been gained by mind science's obsession with computing. Computation has been a useful lens to focus scientific and philosophic thinking on the essence of mind. The last generation has seen, for example, a much clearer view of the nature of consciousness. But we have always known ourselves poorly. We still do. Your mind is a room with a view, and we still know the view (objective reality) a lot better than the room (subjective reality). Today subjectivism is re-emerging among those who see through the Grand Analogy. Computers are fine, but it's time to return to the mind itself, and stop pretending we have computers for brains; we'd be unfeeling, unconscious zombies if we had.