"Answer is the betrayal of the open spirit of Question."
Edge 270—January 7, 2009
IN THE NEWS
New tools equal new perceptions. Through science we create technology, and in using our new tools we recreate ourselves. But until very recently in our history, no democratic populace, no legislative body, ever indicated by choice, by vote, how this process should play out.
Nobody ever voted for printing. Nobody ever voted for electricity. Nobody ever voted for radio, the telephone, the automobile, the airplane, television. Nobody ever voted for penicillin, antibiotics, the pill. Nobody ever voted for space travel, massively parallel computing, nuclear power, the personal computer, the Internet, email, cell phones, the Web, Google, cloning, sequencing the entire human genome. We are moving towards the redefinition of life, to the edge of creating life itself. While science may or may not be the only news, it is the news that stays news.
And our politicians, our governments? Always years behind, the best they can do is play catch up.
Nobel laureate James Watson, co-discoverer of the DNA double helix, and genomics pioneer J. Craig Venter were recently awarded Double Helix Awards from Cold Spring Harbor Laboratory as the founding fathers of human genome sequencing. They are the first two human beings to have their complete genetic information decoded.
Watson noted during his acceptance speech that he doesn't want government involved in decisions concerning how people choose to handle information about their personal genomes.
Venter is on the brink of creating the first artificial life form on Earth. He has already announced transplanting the information from one genome into another. In other words, your dog becomes your cat. He has privately alluded to important scientific progress in his lab, the result of which, if and when realized, will change everything.
151 Contributors (107,000 words): Alan Alda, Chris Anderson, Alun Anderson, Stephon H. Alexander, Mahzarin R. Banaji, John D. Barrow, Patrick Bateson, Gregory Benford, Yochai Benkler, Jesse Bering, David Berreby, Jamshed Bharucha, Susan Blackmore, David Bodanis, Stefano Boeri, Lera Boroditsky, Nick Bostrom, Stewart Brand, Rodney Brooks, David Buss, William Calvin, Leo Chalupa, Nicholas A. Christakis, Andy Clark, Gregory Cochran, M. Csikszentmihalyi, Austin Dacey, David Dalrymple, Paul Davies, Richard Dawkins, Aubrey de Grey, Emanuel Derman, Daniel C. Dennett, Keith Devlin, Betsy Devine, Eric Drexler, Freeman Dyson, George Dyson, David Eagleman, Brian Eno, Juan Enriquez, Daniel Everett, Paul Ewald, Christine Finn, Eric Fischl, Helen Fisher, Kenneth W. Ford, Richard Foreman, Howard Gardner, Joel Garreau, James Geary, David Gelernter, Neil Gershenfeld, Marcelo Gleiser, Daniel Goleman, Dominique Gonzalez-Foerster, Brian Goodwin, Alison Gopnik, April Gornik, John Gottman, Jonathan Haidt, Haim Harari, Henry Harpending, Sam Harris, Marc D. Hauser, Marti Hearst, Roger Highfield, W. Daniel Hillis, Gerald Holton, Donald D. Hoffman, Verena Huber-Dyson, Nicholas Humphrey, Marco Iacoboni, Eric Kandel, Stuart Kauffman, Kevin Kelly, Marcel Kinsbourne, MD, Brian Knutson, Terence Koh, Bart Kosko, Stephen M. Kosslyn, Kai Krause, Laurence Krauss, Andrian Kreye, A. Garrett Lisi, Seth Lloyd, Gary Marcus, Ian McEwan, Thomas Metzinger, Oliver Morton, David G. Myers, P.Z. Myers, Steve Nadis, Monica Narula, Randolph Nesse, Tor Nørretranders, Hans Ulrich Obrist, James J. O'Donnell, Gloria Origgi, Dean Ornish, M.D., Mark Pagel, Bruce Parker, Philippe Parreno, Gregory Paul, Irene Pepperberg, Clifford A. Pickover, Steven Pinker, Ernst Pöppel, Corey S. Powell, Robert R. Provine, Lisa Randall, Ed Regis, Howard Rheingold, Carlo Rovelli, Douglas Rushkoff, Karl Sabbagh, Paul Saffo, Scott Sampson, Robert Sapolsky, Dimitar Sasselov, Roger Schank, Stephen H. 
Schneider, Peter Schwartz, Charles Seife, Gino Segrè, Tino Sehgal, Terrence Sejnowski, Martin Seligman, Robert Shapiro, Rupert Sheldrake, Michael Shermer, Kevin Slavin, Barry Smith, Laurence C. Smith, Lee Smolin, Dan Sperber, Maria Spiropulu, Paul J. Steinhardt, Nassim Nicholas Taleb, Timothy Taylor, Max Tegmark, Frank J. Tipler, John Tooby & Leda Cosmides, Joseph F. Traub, Sherry Turkle, Alexander Vilenkin, J. Craig Venter, Frank Wilczek, Ian Wilmut, Anton Zeilinger
...It is the next period, the fourth, starting about 1969, that has to be considered his mature phase, the moment when he did, in fact, find his metier, recognize it with blinding clarity and apply himself to exploring it exhaustively. The discovery was, as Byars called it, "Question." Question was primarily an immaterial mode of art. (Materiality would be more a statement than a question.) It could be a very minimal performance or even less. Byars once described it by saying, "I create atmospheres." His pursuit of the immaterial through the ephemeral is shown by a work of the mid-'60s for which he released 100 pink helium-filled balloons to rise toward the sphere of the moon.6 When I began my first writing about him, for an Artforum article of 1981 called "James Lee Byars and the Atmosphere of Question," I sat Byars down, took out a pencil and pad, and started to ask him questions.
Evincing impatience with the prosaic clerical approach, he exclaimed, "Oh, Thomas, just make me up!" Truly he would rather remain a question than allow anyone to turn him into an answer. I saw the point, and I made him up. Specifically, I made him up in the way that I made sense of his work, the way it made sense for me: in terms of philosophy. I made him up with references to Democritus, Sextus Empiricus, Edmund Husserl and other thinkers whom he had heard of barely or not at all. I did not claim that these references came from him; I simply allowed my way of understanding his work to enter into my picture of it. It seems a mistake to describe Byars as trained in philosophy, though certain writers have done so. Byars was, as he himself put it, "interested in philosophy" but had actually read almost nothing but some scraps of Wittgenstein (Zettel and bits of the Blue Book and Brown Book). He was aware that Wittgenstein had become a cult figure in the 1960s, and that he was antisocial or socially perverse, and Byars liked that; but he could not have begun to explain what Wittgenstein had meant. Question is after all not an answer; in fact, in Byars's way of seeing it, a question is smudged, polluted, cancelled out by having an answer. Answer is the betrayal of the open spirit of Question.
The answer to which we cannot aspire ("Is is?") is the true doorway to the openness or emptiness he exalted. The many works of Question constitute the center of Byars's career, his highest insight, the principle to which he was most committed. But the immateriality presented problems. Like many other artists of his generation, Byars worked constantly but rarely, if ever, had anything to sell. In the U.S. his reputation was mostly that of a charlatan or mountebank. Village Voice critic Kim Levin recalls once hearing him described as "the Liberace of the art world," which is a way of mocking the vulgarity of his pretensions and costumes and special airs.7 But in Europe he had been more or less accepted from the beginning (meaning from about 1970-71). The Europeans had no problem with his denial that he was an American. They could see perfectly well he was not an American. Perhaps not a European either; surely, a prince of an imaginary kingdom.
After 1970, a long period ensued in which Byars shuttled more or less constantly between Europe and the United States; like many artists and intellectuals of earlier centuries, he went wherever a patron offered him a place to stay for a while, usually in conjunction with some show or work in the neighborhood. These were the years of Byars's artistic maturity. They fall into two phases, but one must bear in mind that in Byars's career, stylistic or thematic phases tended to overlap; there were not clean breaks between them.
The first was the period of Question—from around 1969 until 1985 or 1990—when Byars rarely made any concrete object-works; the very idea made him literally shudder. ("The artist comes in," he said with withering contempt, "carrying his little Kunst ...") The point was that any formed object would be an answer, and thus the antithesis of Question. Question was open because it had not yet received any form; it was a kind of prime matter, or a substance that existed in a realm of potentiality, an indefinite state that had not yet become anything in particular and maybe never would. Any formed object, on the other hand, would deny all that: once it has received form it is over, closed, ended; it has slid from the vague cloud of potentiality into a collision with the flat wall of fact that lay hidden behind it. Byars was far more amenable to the mode of becoming than to that of being: an object whose form was somehow always changing, so that you could never say that at a certain moment it was exactly this or that, was acceptable to him because it avoided certainty.
During the Question phase, Byars claimed to renounce more or less all his earlier work. He seemed genuinely to have no interest in it anymore. He fell in love with the lightness of Question; he swore that thenceforth nothing material (nothing "heavy") would pollute his consciousness. Byars never employed religio-sentimental terms like "spirituality," but it would perhaps not be offensive to him to say that Question, for him, was the most precious manifestation of spirit.
At Documenta V in 1972, Byars shouted German names from the top of the Fridericianum and passed out tiny Eucharist-like paper discs with the letter Q printed in "the smallest print that twenty-twenty vision can read," Byars claimed. Sometimes the performance element got bigger, as in the World Question Center (1969), where elaborate efforts were made to elicit from a group of world intellectuals their most interesting questions. In one situation, a group of students sat at desks with telephones as in a telethon and attempted to reach people, mostly scientists, from a list of names that Byars had assembled. The event was broadcast live on Belgian TV. These procedures didn't always work out, as when one famous scientist said, "What do you mean, questions?" though in other instances the spirit seemed to pass over, as when physicist John Wheeler said, "Axiology?" (Axiology is a method of studying how values are determined, and in the form of a question it seems to imply a questioning of value judgments.) Sometimes the object took on a larger role, as in The Black Book, 1971, "a one page book with one hundred questions printed in tiny gold letters on black tissue paper with imaginary covers." Indeed, books became the quintessential genre of Question, most often as volumes containing brief philosophical passages or abbreviations of phrases he had made up ("QR," for example, meant "Question is in the Room"). Sometimes the books were made pure Question by being wordless. ...
Continue to "James Lee Byars: A Study In Posterity" by Thomas McEvilley, an illustrated profile of the late artist who founded The World Question Center.
Further reading on Edge on James Lee Byars and The World Question Center: "He Confuses One And Two"; "The 200 I.Q.: Mr. Byars By Mr. Brockman"
SELF AWARENESS: THE LAST FRONTIER
By V.S. Ramachandran
An Edge Original Essay
One of the last remaining problems in science is the riddle of consciousness. The human brain—a mere lump of jelly inside your cranial vault—can contemplate the vastness of interstellar space and grapple with concepts such as zero and infinity. Even more remarkably, it can ask disquieting questions about the meaning of its own existence. "Who am I?" is arguably the most fundamental of all questions.
It really breaks down into two problems—the problem of qualia and the problem of the self. My colleagues, the late Francis Crick and Christof Koch, have done a valuable service in pointing out that consciousness might be an empirical rather than philosophical problem, and have offered some ingenious suggestions. But I would disagree with their position that the qualia problem is simpler and should be addressed first before we tackle the "Self." I think the very opposite is true. I have every confidence that the problem of self will be solved within the lifetimes of most people reading this column. But not qualia.
V.S. RAMACHANDRAN is a Neuroscientist, Director, Center for Brain and Cognition, University of California, San Diego; Author, Phantoms in the Brain.
THE REALITY CLUB: Marc D. Hauser, V.S. Ramachandran, Timothy D. Wilson, Arnold Trehub, Robert Provine
The qualia problem is well known. Assume I am an intellectually highly advanced, color-blind Martian. I study your brain and completely figure out, down to every last detail, what happens in your brain—all the physico-chemical events—when you see red light of wavelength 630 nanometers and say "red." You know that my scientific description, although complete from my point of view, leaves out something ineffable and essentially non-communicable, namely your actual experience of redness. There is no way you can communicate the ineffable quality of redness to me short of hooking up your brain directly to mine without air waves intervening. (Bill Hirstein and I call this the qualia-cable; it will work only if my color blindness is caused by missing receptor pigments in my eye, with the brain circuitry for color being intact.) We can define qualia as that aspect of your experience that is left out by me—the color-blind Martian. I believe this problem will never be solved, or will turn out (from an empirical standpoint) to be a pseudo-problem. Qualia and so-called "purely physical" events may be like two sides of a Moebius strip that look utterly different from our ant-like perspective but are in reality a single surface.
So to understand qualia, we may need to transcend our ant-like view, as Einstein did in a different context. But how to go about it is anybody's guess.
The problem of self, on the other hand, is an empirical one that can be solved—or at least explored to its very limit—by science. If and when we do it will be a turning point in the history of science. Neurological conditions have shown that the self is not the monolithic entity it believes itself to be. It seems to consist of many components each of which can be studied individually, and the notion of one unitary self may well be an illusion. (But if so we need to ask how the illusion arises; was it an adaptation acquired through natural selection?)
Consider the following disorders which illustrate different aspects of self.
David also had difficulty abstracting across successive encounters with a new person seen in different contexts to create an enduring identity for that person. Without the flash of recognition he ought to have experienced on the second, third or nth exposure, he couldn't bind the experiences together into a single person. Even more remarkably, David sometimes duplicated his own self! He would often refer to "the other David, who is on vacation." It was as if even successive episodes of his own self were not bound together the way they are in you and me.
This is not to be confused with MPD ("multiple personality disorder") seen in psychiatric contexts. MPD is often a dubious diagnosis made for medico-legal and insurance purposes and tends to fluctuate from moment to moment. (I have often been tempted to send two bills to an MPD patient to see if he pays both.) Patients like David, on the other hand, may give us genuine insight into the neural basis of selfhood.
We will now consider two aspects of self that are considered almost axiomatic. First, its essentially private nature. You can empathize with someone but never to the point of experiencing her sensations or dissolving into her (except in pathological states like folie à deux and romantic love). Second, it is aware of its own existence. A self that negates itself is an oxymoron. Yet both these axioms can fall apart in disease, without affecting other aspects of self. An amputee can literally feel his phantom limb being touched when he merely watches a normal person being touched. A person with Cotard's syndrome will deny that he exists, claiming that his body is a mere empty shell. Explaining these disorders in neural terms can help illuminate how the normal self is constructed.
To account for some of these syndromes we need to invoke the mirror neurons discovered by Giacomo Rizzolatti, Vittorio Gallese and Marco Iacoboni. Neurons in the prefrontal cortex send sophisticated signals down the spinal cord that orchestrate skilled and semi-skilled movements such as putting food in your mouth, pulling a lever, pushing a button, etc. These are "ordinary" motor command neurons, but some of them, known as mirror neurons, also fire when you merely watch another person perform a similar act. It's as if the neuron (more strictly, the network of which the neuron is part) was using the visual input to do a sort of "virtual reality simulation" of the other person's actions—allowing you to empathize with her and view the world from her point of view.
In a previous Edge essay I also speculated that these neurons not only can help simulate other people's behavior but can be turned "inward"—as it were—to create second-order representations or metarepresentations of your own earlier brain processes. This could be the neural basis of introspection, and of the reciprocity of self awareness and other awareness. There is obviously a chicken-or-egg question here as to which evolved first, but that is tangential to my main argument. (See also Nick Humphrey's contributions to Edge.) The main point is that the two co-evolved, mutually enriching each other to create the mature representation of self that characterizes modern humans. Our ordinary language illustrates this, as when I say "I feel a bit self-conscious," when I really mean that I am conscious of others being conscious of me. Or when I speak of being self-critical or experiencing "self-pity." (A chimp could—arguably—feel pity for a begging chimp, but I doubt whether it would ever experience self-pity.)
I also suggest that although these neurons initially emerged in our ancestors to adopt another's allocentric visual point of view, they evolved further in humans to enable the adoption of another's metaphorical point of view. ("I see it from his point of view" etc.) This, too, might have been a turning point in evolution although how it might have occurred is deeply puzzling.
There are also "touch mirror neurons" that fire not only when your skin is touched but when you watch someone else being touched. This raises an interesting question: how does the neuron know what the stimulus is? Why doesn't the activity of these neurons lead you to literally experience the touch delivered to another person? There are two answers. First, the tactile receptors in your skin tell the other touch neurons in the cortex (the non-mirror neurons) that they are not being touched, and this null signal selectively vetoes some of the outputs of mirror neurons. This would explain why our amputee experienced touch sensations when he watched our student being touched; the amputation had removed the vetoing. It is a sobering thought that the only barrier between you and others is your skin receptors!
A second reason why your mirror neurons don't lead you to mime everyone you watch or to literally experience their tactile sensations might be that your frontal lobes send feedback signals to partially inhibit the mirror neurons' output. (They can't completely inhibit them; otherwise there would be no point in having mirror neurons in the first place.) As expected, if the frontal lobes are damaged you do start miming people ("echopraxia").
Recent evidence suggests that there may also be mirror neurons for pain, disgust, facial expression—perhaps for all outwardly visible expressions of emotion. (We call these "empathy" neurons or Gandhi neurons.) Some of these are in the anterior cingulate, others in the insula.
I mention these to emphasize that despite all the pride that your self takes in its individuality and privacy, the only thing that separates you from me is a small subset of neural circuits in your frontal lobes interacting with mirror neurons. Damage these and you "lose your identity"—your sensory system starts blending with those of others. Like the proverbial Mary of philosopher's thought experiments, you experience their qualia.
We suggest that many otherwise inexplicable neuropsychiatric symptoms may arise from flaws in these circuits, leading to "you-me" confusion and impoverished ego-differentiation. Lindsay Oberman, Eric Altschuler and I have seen strong preliminary hints that autistic children have a paucity of mirror neurons, which would not only explain their poor imitation, empathy and "pretend play" (which requires role-playing) but also why they sometimes confuse the pronouns I and You, and have difficulty with introspection. Even Freudian phenomena like "projection," seen in all of us, may have similar origins; "I love you" turns into "You love me" to make me feel safer.
Let us return to Cotard's syndrome—the ultimate paradox of the self negating its own existence (sometimes claiming "I am dead," "I can smell my body rotting," etc.). We postulate that this arises from a combination of two lesions. First, a lesion analogous to the one in Capgras but far more pervasive: instead of emotion being disconnected from just the visual centers, it is disconnected from all sensations and even memories of sensations. So the entire world becomes an imposter—unreal (not just the mother). Second, there may be a dysfunctional interaction between the mirror neurons and frontal inhibitory structures, leading to a dissolution of the sense of self as being distinct from others (or indeed from the world). Lose the world and lose yourself—and it's as close to death as you can get. This is not a fully developed explanation by any means; I mention it only to indicate the style of thinking that we may need to explain these enigmatic syndromes.
Now imagine these same circuits become hyperactive as sometimes happens when you have seizures originating in the temporal lobes (TLE or temporal lobe epilepsy). The result would be an intense heightening of the patient's sensory appreciation of the world and intense empathy for all beings to the extent of seeing no barriers between himself and the cosmos—the basis of religious and mystical experiences. (You lose all selfishness and become one with God.) Indeed many of history's great religious leaders have had TLE. My colleague, the late Francis Crick, has suggested that TLE patients as well as priests may have certain abnormal transmitters in their brains that he calls "theotoxins". (He once told philosopher Pat Churchland that he had nothing against religion per se, so long as it was a private arrangement between consenting adults.)
I hasten to add that the involvement of the temporal lobes in mystical experiences does not in itself negate the existence of an abstract God, who, in Hindu philosophy, represents the supreme dissolution of all barriers. Perhaps the TLE patient has seen the truth and most of us haven't. I don't have TLE myself but have personally had epiphanies when listening to a haunting strain of music, watching the aurora borealis, or looking at Jupiter's moons through a telescope. During such epiphanies I have seen eternity in a moment and divinity in all things. And, indeed, felt one with the Cosmos. There is nothing "true" or "false" about such experiences—they are what they are; simply another way of looking at reality.
Let us turn now to out-of-body experiences. Even a normal person—such as you, the reader—can at times adopt a "detached" allocentric stance toward yourself (employing something like mirror neurons), but this doesn't become a full-blown delusion because other neural systems (e.g., inhibition from frontal structures and skin receptors) keep you anchored. But damage to the right fronto-parietal regions, or ketamine anesthesia (which may influence the same circuits), removes the inhibition and you start leaving your body, even to the extent of not feeling your own pain. You see your pain "objectively," as if someone else were experiencing it. Some such opossum-like detachment also occurs in dire emergencies, when you momentarily leave yourself and watch your body being raped or mauled by a lion. This reflex is normally protective (lying still to fool predators), but a vestige of it in humans may manifest as "dissociative" states under conditions of extreme stress.
The purported "unity" or internal consistency of self is also a myth. Most patients with left-arm paralysis caused by a right-hemisphere stroke complain about it, as indeed they should. But a subset of patients who have additional damage to the "body image" representation in the right SPL (and possibly the insula) claim that their paralyzed left arm doesn't belong to them. The patient may assert that it belongs to his father or spouse. (As if he had a selective "Capgras" for his arm.) Such syndromes challenge even basic assumptions such as "I am anchored in this body" or "This is my arm." They suggest that "belongingness" is a primal brain function, hardwired through natural selection because of its obvious selective advantage to our hominoid ancestors. It makes one wonder if someone with this disorder would deny ownership of (or damage to) the left fender of his car and ascribe it to his mother's car.
There appears to be almost no limit to this. An intelligent and lucid patient I saw recently claimed that her own left arm was not paralyzed and that the lifeless left arm on her lap belonged to her father who was "hiding under the table". Yet when I asked her to touch her nose with her left hand she used her intact right hand to grab and raise the paralyzed hand—using the latter as a "tool" to touch her nose! Clearly somebody in there knew that her left arm was paralyzed and that the arm on her lap was her own, but "she"—the person I was talking to—didn't know. I then lifted her "father's hand" up toward her, drawing attention to the fact that it was attached to her shoulder. She agreed and yet continued to assert it belonged to her father. The contradiction didn't bother her.
Her ability to hold mutually inconsistent beliefs seems bizarre to us but in fact we all do this from time to time. I have known many an eminent theoretical physicist who prays to a personal God; an old guy watching him from somewhere up there in the sky. I might mention that I have long known that prayer was a placebo; but upon learning recently of a study that showed that a drug works even when you know it is a placebo, I immediately started praying. There are two Ramachandrans—one an arch skeptic and the other a devout believer. Fortunately I enjoy this ambiguous state of mind, unlike Darwin who was tormented by it. It is not unlike my enjoyment of an Escher engraving.
In the last decade there has been a tremendous resurgence of interest among neuroscientists in the nature of consciousness and self. The problem has been approached from many angles—ranging from single-neuron electrophysiology to macroscopic brain anatomy (including hundreds of brain imaging studies). What has been missing, though, is what might be called "psycho-anatomy," whose goal is to explain specific details of certain complex mental capacities in terms of equally specific activity of specialized neural structures. As an analogy, consider the discovery of the genetic code. Crick and Watson unraveled the double helix and saw in a flash that the complementarity of the two strands of the helix is a metaphor of the complementarity of parent and offspring in heredity. (Pigs give birth to pigs—not to donkeys.) In other words, the structural logic of DNA dictates the functional logic of heredity. No such radical insight has emerged in neuroscience that would allow us to precisely map function onto structure.
One way of achieving this goal, as we have seen in this essay, might be to explore syndromes that lie at the interface between neurology and psychiatry. Given the inherent complexity of the human brain, it is unlikely that there will be a single climactic solution like DNA (although I don't rule it out). But there may well be many instances where such a synthesis is possible on a smaller scale, and these may lead to testable predictions and novel therapies. They may even pave the way for a grand unified theory of mind of the kind physicists have been seeking in trying to unify gravitation, relativity and quantum mechanics.
When such a theory finally emerges we can either accept it with equanimity or ask "Now that we have solved the problem of self, what else is there?"
MARC D. HAUSER: In case you glossed it, here it is, word for word: "One of the last remaining problems in science is the riddle of consciousness." Really? One of the LAST problems in Science, capital S Science, as in not only psychology, but evolutionary biology, anthropology, and molecular biology, not to mention physics and chemistry? Now that is a claim! Let's keep at bay all the unsolved problems in chemistry and physics, and focus instead on some a bit closer to psychology, say evolutionary biology. We still don't understand how sex evolved, really!...
V.S. RAMACHANDRAN: But if Hauser were to poll his colleagues in all sciences, asking them what the last remaining mysteries are, he would find that consciousness and self-awareness are on everyone's list. It doesn't follow that there wouldn't be plenty of other scientific problems on the list as well, such as recursiveness in language, linking quantum mechanics with gravity, the viability of string theory, etc., but consciousness and the precise nature of time (in physics) would rank very near the top, simply because we don't even know where to begin.
TIMOTHY D. WILSON: The message seems to be that major puzzles about the mind—such as the nature of the "self"—will be solved by the field of neuroscience. There is nary a mention of the vast areas of research on the self from psychology, particularly social psychology, that have contributed much more to our understanding of the nature of the human self than neuroscience ever has or, in my opinion, ever will.
ARNOLD TREHUB: The key questions of how the neuronal mechanisms and systems of our brain create the phenomenal experience of our self as the subjective origin of a surrounding world, and how we are able to parse and analyze the world to do our science, are now being addressed in a detailed neuronal model that relates our phenomenal experience of a personal world-space to our metaphorical theater of consciousness.
ROBERT PROVINE: So far, mirror neurons concern the disembodied neurological correlates of the action of others; they have not been shown to produce actual behavior. While we wait for these data, investigators lacking electrophysiological laboratories and fMRIs can explore mirror-like contagious acts such as yawning and laughing that are familiar to everyone, are associated with actual behavior, and are typically neglected by investigators of mirror neurons.
MARC D. HAUSER
I love Rama. Such a wonderful writer, so provocative, so engaging. That said, I have to make a confession: I didn't get past the first sentence of this essay! In case you glossed it, here it is, word for word: "One of the last remaining problems in science is the riddle of consciousness." Really? One of the LAST problems in Science, capital S Science, as in not only psychology, but evolutionary biology, anthropology, and molecular biology, not to mention physics, and chemistry? Now that is a claim! Let's keep at bay all the unsolved problems in chemistry and physics, and focus instead on some problems a bit closer to psychology, say evolutionary biology. We still don't understand how sex evolved, really! There are several fascinating theories, but no consensus, and no knockdown empirical evidence that lights the way. Here's another: are there non-carbon-based life forms? Maybe, but we don't know, yet. These are genuine problems, and like self-awareness, we are only beginning to have hints regarding their solutions.
Perhaps Ramachandran really meant to say that consciousness is one of the last remaining problems in the sciences of the mind, that is psychology and neurobiology. But that can't be right either. To take a problem near and dear, does anyone really believe that we have a genuine understanding of how the brain creates linguistic representations? If so, I would like to hear the account. But perhaps we shouldn't go for such a difficult problem.
How about an account of how the brain of a bee creates the representations of its own language, what von Frisch described in the 1960s as the bee's dance language. We certainly know that the dance is, in some respects, symbolic, in that it stands for or provides information about the location and quality of food. But we don't know how electrical activity creates this information in a format that can be read out and followed. And I don't think we are even close to understanding this problem. Scale it up, and ask how the brain creates the representations that enable us to appreciate the grammaticality of "colorless green ideas sleep furiously," while also appreciating the lack of intelligible meaning, and we come up remarkably short.
None of this is to say that the problem of self-consciousness isn't a genuine problem, or that the material Ramachandran discusses isn't interesting. It is a problem; we don't have clear solutions; and studies of patient populations, such as individuals with Cotard's or Capgras syndrome, will certainly increase our understanding. All of this material is deeply fascinating (see, I did read past the first sentence), and in Rama's able hands, we will make terrific progress.
But even when we understand how the brain gives us an understanding of who we are, and how we experience the world, there will be many more problems left to solve. No scientist should fear the day when the problem of self-awareness is cracked. No scientist should think that they will be forced into early retirement [Leave it to our economy instead!]. No child should think that science is a dead-end career because there are no more interesting problems to chew on. There are dozens and dozens of big problems to solve in every science, even the mind sciences. And the more we learn, the more we create new problems. That is the beauty of science. That is the beauty of our ever inquisitive minds.
V.S. RAMACHANDRAN
Marc Hauser is wise and scholarly; I always listen carefully to what he has to say. On this occasion he doesn't say much that I would disagree with. In fact he shows keen insight. He agrees with me that new insights into the nature of consciousness can come from studying the selective disturbances of different components of self that arise in neuropsychiatry. We believe this approach, mapping (not merely correlating) function onto structure, is analogous to the mapping of point mutations onto chromosomes by Muller and Morgan (thereby linking chromosomes to heredity). The "monsters" he produced are analogous to neuropsychiatric syndromes. Another example is the discovery of homeotic mutants by Bateson in Drosophila (legs replacing antennae), anomalies which lived up to their promise of revolutionizing developmental and evolutionary biology long before the detailed nitty-gritty mechanisms were figured out.
Yet even while applauding my enterprise, Hauser is uncomfortable with my opening line ("One of the last remaining problems in Science is the riddle of consciousness," etc.) and my closing sentence ("When such a theory finally emerges we can either accept it with equanimity or ask, 'Now that we have solved the problem of self, what else is there?'"). These were only intended to be playfully rhetorical (opening) and ironic (concluding) lines. Of course I don't mean that once we figure out Self, scientists will have nothing to do. Of course there are innumerable problems in every area of science to keep us busy for a long time. No one doubts that.
But if Hauser were to poll his colleagues in all sciences, asking them what the last remaining mysteries are, he would find consciousness and self-awareness on everyone's list. It doesn't follow that there wouldn't be plenty of other scientific problems on the list as well, such as recursiveness in language, linking quantum mechanics with gravity, the viability of string theory, etc., but consciousness and the precise nature of time (in physics) would rank very near the top simply because we don't even know where to begin. It is precisely because they seem to border on the metaphysical ("Who am I?" or "Why does time have an arrow?", etc.) that their solution in empirical terms would be especially exciting. I was certainly not trying to belittle other areas of research. What the "most important" problems are is a matter of taste; I know people who find counting the exact number of hair cells in the ear important and endlessly fascinating and I certainly wouldn't wish to deny them their privilege. On the other hand the question of which primates have "a theory of mind" (a topic which has been elegantly tackled by Hauser, Povinelli and others) is likely to be of interest to anyone who is not wholly devoid of common sense.
I have had this problem with Hauser before, where he takes rhetorical or provocative opening lines literally, even while recognizing, by his own admission, that they have no bearing on either the spirit or the substance of my argument. Nonetheless, I welcome his suggestion that my concluding sentence could have been worded more carefully.
TIMOTHY D. WILSON
Ramachandran has done it again, presenting fascinating case histories worthy of Ripley's Believe It or Not and extracting profound insights about the human mind. But there is a subtext to his essay (whether intended or not I don't know) that is alarming. The message seems to be that major puzzles about the mind—such as the nature of the "self"—will be solved by the field of neuroscience. There is nary a mention of vast areas of research on the self from psychology, particularly social psychology, that have contributed much more to our understanding of the nature of the human self than neuroscience ever has or, in my opinion, ever will.
Here is a sampling of key principles already established by social psychological research:
Although the neurological underpinnings of these psychological phenomena would be interesting to explore, the phenomena themselves were not discovered by neuroscientists, and could not be easily deduced from neuroimaging or observations of brain-damaged patients. They required clever experimental manipulations and measurements of people's self-reports and behavior.
This isn't a horse race, of course. The "problem of the self" will not be solved by one subdiscipline alone. Studies of brain-damaged patients and images of the "normal" brain will teach us a lot, as will behavioral research. More progress will be made by combining these fields of inquiry than by focusing on one alone.
ARNOLD TREHUB
There are certainly many remaining problems in science, but overwhelming evidence tells us that without our human brain as the source of our phenomenal experience (consciousness), science and all the questions of science would not exist. This is why understanding how the brain generates consciousness may be the most fundamental scientific problem. I doubt if Marc Hauser and Timothy Wilson actually believe that the pursuit of their scientific specialties and all the other scientific endeavors could have occurred without the evolution of the conscious human brain.
I have suggested that the hallmark of consciousness is a transparent representation of the world from a privileged egocentric perspective. The key questions of how the neuronal mechanisms and systems of our brain create the phenomenal experience of our self as the subjective origin of a surrounding world, and how we are able to parse and analyze the world to do our science, are now being addressed in a detailed neuronal model that relates our phenomenal experience of a personal world-space to our metaphorical theater of consciousness. Science must start within this phenomenal world.
ROBERT PROVINE
Self and Other: A Ticklish Solution
V. S. Ramachandran's essay is another of his original and provocative contributions to neuropsychology and neurophilosophy, this time concerning self-awareness, "the last frontier." In considering the related problems of qualia and self, he correctly observes that self is the more tractable research problem and presents an ingenious position based on neuropathological case studies and recent breakthroughs in neuroscience, including mirror neurons. To Ramachandran's list of the exotic, I suggest the addition of the mundane—tickle. Tickle provides an answer to his concluding statement, "Now that we have solved everything, what else is there?" The answer is "other."
The same mechanism that detects non-self, ticklish stimuli generates the sense of self. Although our sense of identity involves more than self/nonself discrimination, such a mechanism is at its foundation and a first step toward the evolution of personhood and the neurological computation of its boundaries. Pathology of the self/nonself discriminator may play a role in anomalous social behavior (e.g., touch aversion in autism) and body perception of the sort considered by Ramachandran. The computation of other also provides a bridge linking the often estranged disciplines of social psychology and neuroscience.
The full importance of mirror process will be appreciated when the chain of events from sensory input to motor output is established and we can understand how a behavior is initiated, controlled and produced. So far, mirror neurons concern the disembodied neurological correlates of the action of others; they have not been shown to produce actual behavior. While we wait for these data, investigators lacking electrophysiological laboratories and fMRIs can explore mirror-like contagious acts such as yawning and laughing that are familiar to everyone, are associated with actual behavior, and are typically neglected by investigators of mirror neurons.
That's a bit extreme, she thought, as well as hard to prove. "If I wanted to run a bus ad saying ‘Beware — there is a giant lion from London Zoo on the loose!' or ‘The "bits" in orange juice aren't orange but plastic — don't drink them or you'll die!' I think I might be asked to show my working and back up my claims," Ms. Sherine wrote in a commentary on the Web site of The Guardian.
And then she thought, how about putting some atheist messages on the bus, as a corrective to the religious ones?
And so were planted the seeds of the Atheist Bus Campaign, an effort to disseminate a godless message to the greater public. When the organizers announced the effort in October, they said they hoped to raise a modest $8,000 or so.
But something seized people's imagination. Supported by the scientist and author Richard Dawkins, the philosopher A. C. Grayling and the British Humanist Association, among others, the campaign raised nearly $150,000 in four days. Now it has more than $200,000, and on Tuesday it unveiled its advertisements on 800 buses across Britain.
"There's probably no God," the advertisements say. "Now stop worrying and enjoy your life."
...Following in the wake of Snow, and perhaps trying to repair the betrayal Benda spoke of, John Brockman founded the Edge Foundation (www.edge.org) in 1988, an organization that seeks to reintegrate scientific and humanistic discourse under the idea of a "Third Culture," and to ensure that science plays a key role in the discussion of public affairs. ...
Many of the changes of mind are just changes of opinion or an evolution of values. One contributor, a past supporter of manned spaceflight, now thinks it's pointless, while another no longer has moral objections to cognitive enhancement through drugs. An anthropologist is now uncomfortable with cultural relativism (as in, study the Inca practice of human sacrifice non-judgmentally). Other changes of mind have to do with busted predictions, such as that computer intelligence would soon rival humans'. ...
THE CHRONICLE OF HIGHER EDUCATION
December 15, 2008
Not So Smart: Aliens, Computers, and Universities
By Josh Fischman
Just because you're smart doesn't mean you get things right the first time. That's the premise behind What Have You Changed Your Mind About? (Harper Perennial), a new anthology. In it, 150 "big thinkers" describe what they now think they were wrong about earlier in their lives. Much of this has to do with technology and education. Among the highlights:
Ray Kurzweil no longer thinks that intelligent aliens exist. The oft-cited futurist and inventor, a pioneer in artificial intelligence and in making reading machines for the blind, says that conventional thinking holds there should be billions of such civilizations and a number of them should be ahead of us, "capable of vast, galaxy-wide technologies. So how can it be that we haven't noticed" all of the signals they should be creating? "My own conclusion is that they don't exist."
Roger C. Schank used to say "we would have machines as smart as we are within my lifetime." Now Mr. Schank, a former Yale University professor and director of Yale's artificial-intelligence project, says: "I no longer believe that will happen… I still believe we can create very intelligent machines. But I no longer believe that those machines will be like us." Chess-playing computers that beat people are not good examples, he says. Playing chess is not representative of typical human intelligence. "Chess players are methodical planners. Human beings are not." We tend, Mr. Schank says, "to not know what we know."
Randolph M. Nesse "used to believe that truth had a special home at universities." Mr. Nesse, professor of psychiatry at the University of Michigan and an expert on evolution and medicine, now thinks "universities may be the best show in town for truth pursuers, but most of them stifle innovation and constructive engagement of real controversies — not just sometimes but most of the time, systematically." Faculty committees, he complains, make sure that most positions "go to people just about like themselves." Deans ask how much external financing new hires will bring in. "No one with new ideas … can hope to get through this fine sieve."
THE CHRONICLE OF HIGHER EDUCATION
December 15, 2008
Not So Smart II: The Internet Doesn't Work So Well
By Josh Fischman
Yesterday I listed a few flip-flops by leading thinkers chronicled in a new anthology, What Have You Changed Your Mind About? (Harper Perennial). Whether universities were really that great was one of them. But there are more.
One of the major things that bright minds have rethought is that the Internet will be a boon to humanity. Here is why:
It does not fight authority. Nicholas Carr, who wrote the recent best seller The Big Switch: Rewiring the World, From Edison to Google, used to believe the Internet would shift the bulk of power to the little people, away from big companies and governments. But "its technical and commercial workings actually promote the centralization of power and control," he says. Although the overall number of Web sites increased from 2002 through 2006, the concentration of traffic at the 10 most popular sites grew from 31 percent to 40 percent of all page views. Further, "look at how Google continues to expand its hegemony over Web searching," Mr. Carr says. "To what end will the Web giants deploy their power? They will, of course, seek to further their own commercial or political interests."
A few bad people counteract many good people, and machines can't fix that. Xeni Jardin, co-editor of the tech blog Boing Boing, says comments on the blog were useful and fun, originally. But as the blog grew more popular, so did antisocial posts by "trolls," or "people for whom dialogue wasn't the point." Things got so nasty that Boing Boing editors finally removed the ability for readers to comment. Now she has reinstated comments, because "we hired a community manager. … If someone is misbehaving, she can remove all the vowels from their screed with one click." There is no automated way to do this, Ms. Jardin says, and "the solution isn't easy, cheap, or hands-free. Few things of value are."
NEW YORK TIMES MAGAZINE
January 4, 2009
By Joe Nocera
One Saturday a few months ago, Taleb, a trim, impeccably dressed, middle-aged man — inexplicably, he won't give his age — walked into a lobby in the Columbia Business School and headed for a classroom to give a guest lecture. Until that moment, the lobby was filled with students chatting and eating a quick lunch before the afternoon session began, but as soon as they saw Taleb, they streamed toward him, surrounding him and moving with him as he slowly inched his way up the stairs toward an already-crowded classroom. Those who couldn't get in had to make do with the next classroom over, which had been set up as an overflow room. It was jammed, too.
It's not every day that an options trader becomes famous by writing a book, but that's what Taleb did, first with "Fooled by Randomness," which was published in 2001 and became an immediate cult classic on Wall Street, and more recently with "The Black Swan: The Impact of the Highly Improbable," which came out in 2007 and landed on a number of best-seller lists. He also went from being primarily an options trader to what he always really wanted to be: a public intellectual. When I made the mistake of asking him one day whether he was an adjunct professor, he quickly corrected me. "I'm the Distinguished Professor of Risk Engineering at N.Y.U.," he responded. "It's the highest title they give in that department." Humility is not among his virtues. On his Web site he has a link that reads, "Quotes from ‘The Black Swan' that the imbeciles did not want to hear."
"How many of you took statistics at Columbia?" he asked as he began his lecture. Most of the hands in the room shot up. "You wasted your money," he sniffed. Behind him was a slide of Mickey Mouse that he had put up on the screen, he said, because it represented "Mickey Mouse probabilities." That pretty much sums up his view of business-school statistics and probability courses. ...
December 29, 2008
Darwin shouldn't be hijacked by New Atheists - he is an ethical inspiration
...The fear is that the anniversary will be hijacked by the New Atheism as the perfect battleground for another round of jousting over the absurdity of belief (a position that Darwin pointedly never took up). Many of the prominent voices in the New Atheism are lined up to reassert that it is simply impossible to believe in God and accept Darwin's theory of evolution; Richard Dawkins and the US philosopher Daniel Dennett are among those due to appear in Darwin200 events. It's a position that infuriates many scientists, not to mention philosophers and theologians. ...
God has had a lot of bad press recently. The four horsemen of atheism, Richard Dawkins, Daniel Dennett, Sam Harris, and Christopher Hitchens, have all published books sharply critical of belief in God: respectively, The God Delusion, Breaking the Spell, The End of Faith, and God Is Not Great. Dawkins, Harris, and Hitchens pile on the greatest amount of scorn, while Dennett takes the role of good cop. But despite differences of tone and detail, they all agree that belief in God is a kind of superstition. As Harris puts it, religion "is the denial—at once full of hope and full of fear—of the vastitude of human ignorance."
Sapolsky is one of 130-plus scientists and "thinkers" who have contributed highly personal revelations to What Have You Changed Your Mind About?, due next month.
Book marketing seems to demand sensational subtitles, but Today's Leading Minds Rethink Everything turns out to be an accurate guide to the content. In almost 400 pages, the contributors cover frontier aspects of all three scientific arenas: physical, biomedical and social.
It should come with a warning: "Reading this book may be dangerous to your cherished myths and perceptions."
COLUMBIA JOURNALISM REVIEW
December 19, 2008
CS: One of the things that I've noticed with criticisms of the Internet is that very often they're displaced criticisms of television. There are a lot of people, Nick Carr being a recent addition to the canon, wringing their hands over the end of literary reading, and they're laying that at the foot of the Internet. It seems to me, in fact, from the historical record, that the idea of literary reading as a sort of broad and normal activity was done in by television, and it was done in forty years ago.
The funny thing, though, is that when television came along, it became, to a degree literally unprecedented in the history of media, not just the dominant medium compared to other media, but really the dominant activity in life outside of sleeping and working. Yet a curious bargain was struck whereby television still genuflected to the idea of literary reading. The notion was that there was somehow this sacred cathedral of the great books and so forth. It was just that no one actually participated in it, and so it was a kind of Potemkin village. What the Internet has actually done is not decimate literary reading; that was really a done deal by 1970. What it has done, instead, is bring back reading and writing as a normal activity for a huge group of people.
Many, many more people are reading and writing now as part of their daily experience. But, because the reading and writing has come back without bringing Tolstoy along with it, the enormity of the historical loss to the literary landscape caused by television is now becoming manifest to everybody. And I think as people are surveying the Internet, a lot of what they're doing is just shooting the messenger.
December 24, 2008
It's not often that a science writer gets to say this, but the Pope is right. It's not as if he's always right: where scientific matters are concerned, Benedict XVI has displayed precious little infallibility. He has shown a disquieting sympathy for the rebranded creationism of intelligent design, and his views on embryonic stem cells, IVF and contraception are inimical to medical progress. But in attacking the notion that sex roles are invariably ordained by culture and not biology, the Holy Father has said something that needed saying.
As the Pope is finding out, anyone who criticises this "gender theory" invites vitriol from its liberal champions. Scientists such as Simon Baron-Cohen and Steven Pinker, who suggest that differences between typical male and female behaviour may be biologically influenced, have been accused of rationalising patriarchy and discrimination.
The work of these researchers and others shows that gender theory is built on sand. Anatomical variations between the sexes are not the only ones with natural roots. Women tend to be better at empathising, while men are more likely to excel at understanding systems from motorbike engines to offside laws, and there is growing evidence that these traits are influenced by testosterone exposure in the womb. They may also be linked to the recent discovery of hundreds of variations in the way that genes are switched on and off in male and female brains. If social factors are important in shaping gender roles, it is increasingly apparent that biology matters too, and recognising this in no way justifies sexism. Sex differences in behaviour apply only on average, across populations, and people should be considered as individuals. ...
THE NEW YORK TIMES
December 24, 2008
They came for the Moon, and for the first three orbits it was to the Moon that the astronauts of Apollo 8 devoted their attention. Only on their fourth time round did they lift their eyes to see their home world, rising silently above the Moon's desert plains, blue and white and beautiful. When, later on that Christmas Eve in 1968, they read the opening lines of Genesis on live television, they did it with a sense of the heavens and the Earth, of the form and the void, enriched by the wonder they had seen rising into the Moon's black sky.
...Along with everything else they have done, the financial meltdown and attendant economic slump have spurred unprecedented political attention and participation on the part of economists.
"In my lifetime as an economist I've never seen economists so engaged by what's going on," says Richard Thaler of the University of Chicago. "At the University of Chicago people always talk economics at lunch, but for the last three months they've all been talking about the crisis and the bailout, and writing op-eds."
This is something of a change. The topics economists study often have little to do with the average person's economic life - as in most any academic field, practical relevance can have little to do with what questions are deemed most interesting and rewarding. This divergence was exacerbated, many economists say, during the span of almost uninterrupted economic growth that began in the late 1980s, a period when many of the more practical questions in economic policy-making came to be seen as having been settled. For years, leading economic figures like Larry Summers and Alan Greenspan argued that the United States had more or less brought the business cycle to heel. ...
The Science of Spore--The "Evolution" of Gaming
When Will Wright was developing Spore, his much acclaimed computer game, he interviewed several life scientists. He asked them how nature had actually done what he was attempting to simulate in the game—which was, among other things, the development of the earliest stages of life and its evolution. (Some billboard advertisements for the game feature the slogan "Evolution Begins at Spore.com.") Among the scientists Wright consulted were Michael Levine, a geneticist at the University of California, Berkeley; Neil H. Shubin, a paleontologist at the University of Chicago; and Hansell Stedman, a surgeon at the University of Pennsylvania School of Medicine.
But for all the research that went into it, Spore comes off as a mixed success at replicating the inner workings of evolution by natural selection. On the plus side, in both the game and the real world, there is competition among individuals: Darwin's well-known "struggle for existence." In both, the more fit survive, and the less so die out, duplicating the basic evolutionary principle of survival of the fittest. In the game and in real life, simple entities develop into more complex ones, a pattern that is a common, though not an inevitable, feature of Darwinian evolution. Finally, in both Spore and nature, life-forms tend to be bilaterally symmetrical, even though exceptions occur in real-life creatures such as amoebas as well as in some of Spore's unicellular organisms. ...
Evolution of the Mind: 4 Fallacies of Psychology
...The most notable representatives of Pop EP are psychologists David M. Buss (a professor at the University of Texas at Austin and author of The Evolution of Desire and The Dangerous Passion) and Steven Pinker (a professor at Harvard University whose books include How the Mind Works and The Blank Slate). Their popular accounts are built on the pioneering theoretical work of what is sometimes referred to as the Santa Barbara school of evolutionary psychology, led by anthropologists Donald Symons and John Tooby and psychologist Leda Cosmides, all at the University of California, Santa Barbara. ...
3 QUARKS DAILY
December 22, 2008
THE UNION OF EVOLUTION AND DESIGN
...The Left — the party of science, environmentalism, equality, and choice — would do well to understand what this job does and does not include. First, as Oliver Morton explained a couple of years ago on Edge.org, it does not include saving the planet. Earth and its biosphere are resilient enough in the long term to take what we are giving them: fresh water depletion, species losses, a boosted greenhouse effect, and more. Nothing we can do (or at least, are at all likely to do) can stop biological and geological evolution on Earth. But while the planet can adapt, humans, especially the poorest, could be greatly harmed. The strongest arguments for cutting greenhouse gas emissions start by honoring human solidarity, not the intrinsic value of sea ice. ...
Individual versus Group in Natural Selection
Richard Dawkins, whose writings have reached millions, maintains that selection might not even reach such a high level of biological organization as the individual organism. Instead, he claims, selection operates on genes—the individual is the embodiment of the selection of thousands of selfish genes, each trying to perpetuate itself.
In the past few decades, however, group selection has made a quiet comeback among evolutionary theorists. E. O. Wilson of Harvard University and David Sloan Wilson (no relation) of Binghamton University are trying to give group selection full-fledged respectability. They are rebranding it as multilevel selection theory: selection constantly takes place on multiple levels simultaneously. And how do you figure the sum of those selections in any real-world circumstance? "We simply have to examine situations on a case-by-case basis," Sloan Wilson says. ...
December 18, 2008
Why we are, as we are
Social psychologists have long observed that, on first meeting, people automatically classify each other in three ways: by sex, by age and by race. But Dr Cosmides and Dr Tooby pointed out that before long-distance transport existed, only two of those would have been relevant. People of different ages and sexes would meet; people of different races would not.
The two researchers argue that modern racial discrimination is an overstimulated response to what might be called an "alliance" detector in the human brain. In a world where the largest social unit is the tribe, clan or what-you-will of a few hundred people, your neighbours and your other allies will normally look a lot like you, and act similarly. However, it is known from the study of modern hunter-gatherers, and inferred from archaeological evidence about ancient ones, that neighbouring tribes are often hostile.
I am a fan of Christopher Hitchens. There's that delightful disdain with which he impales his opponents, his flashing wit—and the hints of seriousness to show that it's all more than just a jousting game.
But sometimes he gets things very wrong, and his attitude to the ten commandments—one he shares with many modern atheists—is one such mistake. They represent little more, he argues, than the rantings of an angry, vain and vengeful God. Who would possibly want to follow their "vague pre-Christian desert morality," which shows every sign of being invented by a "Bronze Age demagogue"?
A TALK WITH LISA RANDALL
By Samuel P. Jacobs
HARVARD'S JEFFERSON LABORATORY, home to the physics department since 1884, has seen its share of firsts; 10 Nobel Laureates have made their discoveries there. Today, leading theoretical physicist Lisa Randall is working on another improbable first for the department: She's writing an opera. ...
...Writing a book for a general audience connected Randall with a new set of people in fields outside of physics. One of them, the Spanish composer Hector Parra, intrigued Randall by asking if she would try writing a libretto for an opera about her work. The resulting piece, a collaboration with the artist Matthew Ritchie, is scheduled to debut in Paris at the Georges Pompidou Centre this summer, then travel throughout Europe in the fall.
Society must respond to the growing demand for cognitive enhancement. That response must start by rejecting the idea that 'enhancement' is a dirty word, argue Henry Greely and colleagues.
Today, on university campuses around the world, students are striking deals to buy and sell prescription drugs such as Adderall and Ritalin — not to get high, but to get higher grades, to provide an edge over their fellow students or to increase in some measurable way their capacity for learning. These transactions are crimes in the United States, punishable by prison.
Many people see such penalties as appropriate, and consider the use of such drugs to be cheating, unnatural or dangerous. Yet one survey estimated that almost 7% of students in US universities have used prescription stimulants in this way, and that on some campuses, up to 25% of students had used them in the past year. These students are early adopters of a trend that is likely to grow, and indications suggest that they're not alone.
WHAT HAVE YOU CHANGED YOUR MIND ABOUT?
"An Intellectual Treasure Trove"
Contributors include: STEVEN PINKER on the future of human evolution • RICHARD DAWKINS on the mysteries of courtship • SAM HARRIS on why Mother Nature is not our friend • NASSIM NICHOLAS TALEB on the irrelevance of probability • ALUN ANDERSON on the reality of global warming • ALAN ALDA considers, reconsiders, and re-reconsiders God • LISA RANDALL on the secrets of the Sun • RAY KURZWEIL on the possibility of extraterrestrial life • BRIAN ENO on what it means to be a "revolutionary" • HELEN FISHER on love, fidelity, and the viability of marriage…and many others.
Praise for the online publication of
"The splendidly enlightened Edge website (www.edge.org) has rounded off each year of inter-disciplinary debate by asking its heavy-hitting contributors to answer one question. I strongly recommend a visit." The Independent
"A great event in the Anglo-Saxon culture." El Mundo
"As fascinating and weighty as one would imagine." The Independent
"They are the intellectual elite, the brains the rest of us rely on to make sense of the universe and answer the big questions. But in a refreshing show of new year humility, the world's best thinkers have admitted that from time to time even they are forced to change their minds." The Guardian
"Even the world's best brains have to admit to being wrong sometimes: here, leading scientists respond to a new year challenge." The Times
"Provocative ideas put forward today by leading figures." The Telegraph
"The world's finest minds have responded with some of the most insightful, humbling, fascinating confessions and anecdotes, an intellectual treasure trove. ... Best three or four hours of intense, enlightening reading you can do for the new year. Read it now." San Francisco Chronicle
"As in the past, these world-class thinkers have responded to impossibly open-ended questions with erudition, imagination and clarity." The News & Observer
"A jolt of fresh thinking...The answers address a fabulous array of issues. This is the intellectual equivalent of a New Year's dip in the lake—bracing, possibly shriek-inducing, and bound to wake you up." The Globe and Mail
"Answers ring like scientific odes to uncertainty, humility and doubt; passionate pleas for critical thought in a world threatened by blind convictions." The Toronto Star
"For an exceptionally high quotient of interesting ideas to words, this is hard to beat. ...What a feast of egg-head opinionating!" National Review Online
"The optimistic visions seem not just wonderful but plausible." Wall Street Journal
"Persuasively upbeat." O, The Oprah Magazine
"Our greatest minds provide nutshell insights on how science will help forge a better world ahead." Seed
"Uplifting...an enthralling book." The Mail on Sunday
"Danger – brilliant minds at work...A brilliant book: exhilarating, hilarious, and chilling." The Evening Standard (London)
"A selection of the most explosive ideas of our age." Sunday Herald
"Provocative" The Independent
"Challenging notions put forward by some of the world's sharpest minds" Sunday Times
"A titillating compilation" The Guardian
"Reads like an intriguing dinner party conversation among great minds in science" Discover
"Whether or not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." LA Times
"Belief appears to motivate even the most rigorously scientific minds. It stimulates and challenges, it tricks us into holding things to be true against our better judgment, and, like scepticism, its opposite, it serves a function in science that is playful as well as thought-provoking. Whether or not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." The Times
"John Brockman is the PT Barnum of popular science. He has always been a great huckster of ideas." The Observer
"An unprecedented roster of brilliant minds, the sum of which is nothing short of an oracle—a book to be dog-eared and debated." Seed
"Scientific pipedreams at their very best." The Guardian
"Makes for some astounding reading." Boston Globe
"Fantastically stimulating...It's like the crack cocaine of the thinking world.... Once you start, you can't stop thinking about that question." BBC Radio 4
"Intellectual and creative magnificence" The Skeptical Inquirer
Edge Foundation, Inc. is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.