PD Smith on What We Believe But Cannot Prove
What We Believe But Cannot Prove: Today's Leading Thinkers on Science in the Age of Certainty edited by John Brockman (Pocket Books, £7.99)
According to Richard Dawkins, science proceeds by hunches. John Brockman's cybersalon, Edge.org, invited members of the "third culture" - the scientists whom he considers to be the "pre-eminent intellectuals of our time" - to contribute their most cherished intuitions. As Ian McEwan (a rare non-scientist here) points out, this is rather intriguing because scientists, unlike "literary critics, journalists or priests", don't just believe things. They need proof. Indeed, Simon Baron-Cohen dismisses "ideas that cannot in principle be proved or disproved". But mathematician John Barrow is happy to believe that "our universe is infinite in size, finite in age, and just one among many", all "unprovable in principle". The nature of consciousness, meanwhile, proves more controversial. Daniel Dennett argues that animals and prelinguistic children are not truly conscious, whereas Alison Gopnik claims young children are more conscious than adults: "every wobbly step is skydiving, every game of hide-and-seek is Einstein in 1905, and every day is first love in Paris". Scientific pipedreams at their very best.
The new age of ignorance
We take our young children to science museums, then as they get older we stop. In spite of threats like global warming and avian flu, most adults have very little understanding of how the world works. So, 50 years on from CP Snow's famous 'Two Cultures' essay, is the old divide between arts and sciences deeper than ever?
Here we ask a celebrity panel to answer some basic scientific questions
It is an immutable law of nature that acute embarrassment can make a few short seconds last pretty much for ever. The longest two minutes of my life occurred in the company of James Watson, one half of the famous double act who discovered the double helix. I was interviewing Watson, then in his late seventies, at his lab in Cold Spring Harbor on Long Island. At one point, I referred blithely to the 'perfect simplicity' of his and Francis Crick's findings about the code of life.
Watson is a mischievous, famously prickly man and that phrase seemed to get under his skin. He raised an eyebrow. He sat back. He thought he would have some fun. Seeing as it was all so perfectly simple, he suggested, maybe I could briefly run through my understanding of DNA base pairing, say, or chromosome mapping.
What followed - a tangled, stuttering stream of consciousness reflecting distant O-level biology and recent half-understanding of Watson's brilliant books, punctuated with words like 'replication' and 'mutation' and meaning nothing much - gave new resonance to the notion of floundering.
Watson, resisting the temptation to laugh, correct or comment, simply moved on, having categorically established our respective levels of evolution. I can still cringe now at the brief pause that concluded my ill-judged aside on the significance of the genome.
Given that science informs so much of our culture, and so many of us have such patchy knowledge, it is surprising that such embarrassments are not routine. It's half a century since CP Snow put forward the idea of the 'Two Cultures', the intractable divide between the sciences and the humanities, first in an article in the New Statesman, then in a lecture series at Cambridge and finally in a book. Back then, Snow, who was both a novelist and a physicist, used to employ a test at dinner parties to demonstrate his argument.
'A good many times,' he suggested, 'I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice, I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold; it was also negative. Yet I was asking something which is the scientific equivalent of: have you ever read a work of Shakespeare's?'
Fifty years and an exponential scientific advance later, it seems unlikely that the response of dinner guests would be much different. I was reminded of Snow's test when reading the new book by Natalie Angier, science writer at the New York Times. Angier's book is called The Canon, and subtitled 'A Whirligig Tour of the Beautiful Basics of Science'. It is not a long book and it contains, as the title suggests, a breathless Baedeker of the fundamental scientific knowledge Angier believes is the minimum requirement of an educated person.
In many places, I found myself cringing all over again. I've read a fair amount of popular science, tried to follow the technical arguments that underpin debates about global warming, say, or bird flu, listened religiously to Melvyn Bragg's In Our Time, but still I discovered large black holes in my elementary understanding of how our world works. Angier divides her book into basic disciplines - biology, chemistry, geology, physics and so on - and each chapter offers an animated essay on the current established thinking.
The result is the kind of science book you wish someone had placed in front of you at school - full of aphorisms that help everything fall into place. For geology: 'This is what our world is about: there is heat inside and it wants to get out.' For physics: 'Almost everything we've come to understand about the universe we have learned by studying light.' Along the way there are all sorts of facts that stick: 'You would have to fly on a commercial aircraft every day for 18,000 years before your chances of being in a crash exceeded 50 per cent', for example; or, if you imagined the history of our planet as a single 75-year human life span: 'The first ape did not arrive until May or June of the final year... and Neil Armstrong muddied up the Moon at 20 seconds to midnight.'
Angier also gives as clear an insight as I have read of CP Snow's culture-dividing Second Law of Thermodynamics, the law of entropy, the one that states that in any system inefficiency is inevitable and eventually overwhelming. 'Entropy,' Angier writes, 'is like a taxi passing you on a rainy night with its NOT IN SERVICE lights ablaze, or a chair in a museum with a rope draped from arm to arm, or a teenager.'
Entropy, unusable energy, leads to the law that states that everything in time must wear out, become chaotic, die. 'The darkest readings of the Second Law suggest that even the universe has a morphine drip in its vein,' Angier suggests, 'a slow smothering of all spangle, all spiral, all possibility.' No wonder CP Snow thought we should know about it.
For all of its infectious analogies and charged curiosity, the most telling fact about Angier's book is that it seems to have been written out of sheer desperation. It is something of a cry from the wilderness; impassioned, overwrought in places. It is written in the voice of someone who has spent her whole award-winning career evangelising about this amazing stuff and is facing up to the fact that most people have not even begun to 'get' any of it.
Angier's tipping point, the reason she came to write the book, was a decision made by her sister. When the second of her two children turned 13, her sister decided that it was time to let the family's memberships lapse at two familiar haunts: the science museum and the zoo. They were, the implication went, ready to put away childish things, ready to go to the theatre and the art gallery, places where there was none of this 'mad pinball pinging from one hands-on science exhibit to the next, pounding on knobs to make artificial earthquakes'. They had grown out of science.
Angier believes this idea - that science is something for kids - still pervades much of our thinking, and characterises the presentation of science in culture. Part of it is the notion that science is just a bunch of facts with no overarching coherence. Just as bad are the media, which insist on ghettoising science and serving it up as cliches: scientists as boffins, with permanent bad-hair days; science as controversy, always set up for polarised clashes with religion.
'Science is rather a state of mind,' Angier argues and, as such, it should inform everything. 'It is a way of viewing the world, of facing reality square on but taking nothing for granted.' It would be hard to argue that this state of mind was advancing across the globe. We no longer make and mend, so we no longer know how anything works.
One of Angier's interviewees, Andrew Knoll, a professor of natural history at Harvard's earth and planetary sciences department, suggests that 'the average American adult today knows less about biology than the average 10-year-old living in the Amazon, or the average American of 200 years ago'. I spoke to Angier to find out why she thought that this might be the case.
To some extent, she suggested, that was a political question. 'Here in the US we have had the last seven years of this administration which has made everything about the two-cultures divide seem worse.' But it is not just that. 'Newspapers are getting rid of all their science pages; they are jettisoning all their science staff. The feeling is people don't want to read it.'
The implications of this, and the resultant general scientific illiteracy, she believes, are possibly catastrophic. Forty-two per cent of Americans in a recent survey said they believed that humans had been on Earth since the beginning of time. 'A geophysicist friend suggests we are at a critical crossroads just like the start of the Renaissance,' Angier says, 'where you couldn't just leave reading and writing to the kings and priests anymore. Ordinary people have to keep up. In the world we live in, the new economy, you have to become scientifically literate or you will fall quickly from view.'
It is, apparently, not just America that does not want to hear this news. Foreign rights to Angier's book have been snapped up in auctions by publishers across Asia and Eastern Europe, 'countries that see themselves as the economic future', but she has not, for example, sold her book in the UK, a place, we might remember, where 20 per cent of people still believe that the Sun revolves around Earth. 'I tend to see that as a tiny little sign that some of these more aggressive competitive nations are more aware of what the future looks like,' Angier suggests.
She believes this persistent apathy in matters of science in America and Britain comes in part from a lack of interest in what the future might hold. 'In the 1960s, we had the space race, we had these world fairs and the whole idea of the future was very exciting. Science was something they wanted to be involved in.' You could hope that the apocalyptic panic that attends climate change, the front pages of floodwaters rising, might have a similar effect. 'Whatever you think of him, Al Gore has been great for science,' she says.
Angier's initiation into the 'beautiful basics' was brought about by a professor at the University of Michigan, who taught a 'physics of music' class. The walls between the two cultures came tumbling down every week. 'There were kids from the engineering and physics departments and then there were kids from the music departments. I was just in there on my own. But the way he brought us together was an extraordinary thing,' she recalls. 'Both groups were kind of ecstatic; this guy would get standing ovations at the end of every lecture. So I guess I saw that bridging that gap might be something to strive for in life in terms of engaging people.'
This kind of engagement, a sense of a bigger picture in science, its poetry and mystery, is no doubt all too rare. In a 2005 survey of British teenagers at school conducted by the exam board OCR, more than half said they thought science classes were 'boring', 'confusing' and 'difficult'. Just 7 per cent believed that scientists were 'cool' and when asked to pick out a famous scientist from a list including Isaac Newton and Albert Einstein, a fair few chose Christopher Columbus.
Some of this, Angier believes, has to do with the way science is taught - 'I go through these science books for kids and they are so dull compared to the novels that children read... I think that you have to make it an epic journey, a narrative with heroes and villains, molecules engaging in this struggle for life.' A lot of it, however, is cultural, she believes. The number of students still studying science at 18 is falling in Britain and America, perhaps because we are becoming generally less motivated to address difficulty.
As a culture, we allow ourselves too many excuses. 'Western parents are quite comfortable saying their children have a predilection for art or for writing or whatever, and allow them just to pursue that. In the Asian education system, if you are not good at something, it's because you are lazy and you just have to work harder at it. Just because things are hard does not mean they are not worth doing.'
That idea of difficulty, I suggest, cannot really be helped in the States in particular, when all of the presidential candidates of one party stand up in televised debate and say they believe in 'intelligent design' and suggest that the world could well have been created by a bearded God a few thousand years ago. Angier laughs, somewhat bleakly.
'I see all that as a macho kind of posturing. It's like, I can believe the impossible: look, I can lift a tree! It is a Republican initiation ritual, like having a hook pulled through your cheek and not flinching.' But no, she concedes, it doesn't help much.
Some people would suggest that Natalie Angier's enlightenment utopia, in which everyone might one day agree on the fundamentals of the universe, the beautiful basics, is a false ideal; the masses have always believed in mumbo-jumbo. One of these people is John Brockman. Brockman has probably done more than anyone to break down CP Snow's cultural divide. He is the PT Barnum of popular science, a great huckster of ideas. In the Sixties, he hung out with John Cage and Andy Warhol, got an MBA and then made his first fortune selling psychedelia to corporations, turning on marketing executives with 'multikinetic happenings' and showing them how their profits could levitate.
These days, he acts as literary agent for many of the world's greatest minds, including Richard Dawkins, Daniel Dennett and Steven Pinker, and achieves for some of them the kind of publishing advances that it takes great mathematicians to compute. It is Brockman who invented the publishing market for quarks and quantum theory and black holes in the 1990s, and it is he who is behind the current boom in atheism. The universe may be infinite, but Brockman takes 15 per cent of it.
He also runs a kind of global online Royal Society called Edge. Edge promotes what he calls the Third Culture, a marriage of physics and philosophy, astronomy and art. The name itself derives from a phrase of CP Snow's outlining his personal hope for the future. Brockman, when launching his Third Culture in 1991, had significant ambition for the project, much of which has been realised. 'The Third Culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are,' he suggested, grandly.
Though Brockman borrowed Snow's phrase, he did not employ it in the same way: Snow had hoped for a kind of detente between the rival mindsets; Brockman perceived a third way. 'Literary intellectuals are not communicating with scientists,' he suggested. 'Scientists are communicating directly with the general public. Traditional intellectual media played a vertical game; journalists wrote up and professors wrote down. Today, Third Culture thinkers tend to avoid the middleman and endeavour to express their deepest thoughts in a manner accessible to the intelligent reading public.'
Brockman's cross-fertilising club, the most rarefied of chatrooms, has its premises on his website www.edge.org. Eavesdropping is fun. Ian McEwan, one of the few novelists who has contributed to Edge's ongoing debates, suggests that the project is not so far removed from the 'old Enlightenment dream of a unified body of knowledge, when biologists and economists draw on each other's concepts and molecular biologists stray into the poorly defended territory of chemists and physicists'.
Brockman is at the hub of this conversation. When I phone him, he is waiting for a call from maverick geneticist Craig Venter about an invention that will 'put new operating mechanisms into genes' and radically change our idea of life; earlier, he has been speaking to George Smoot, the Nobel-winning astrophysicist whose mapping of the minute fluctuations in the background radiation of the Big Bang helped turn cosmology into a precision science.
From where he is sitting, the Two Cultures divide no longer applies; the Third Culture has long since prevailed.
'Basically, in terms of whatever war has been going on, I think it has finished,' he says. 'I don't characterise it by saying we've won. I think everybody has won. We are living in a profound science culture and the big events that are affecting people's lives are scientific ones.'
What about Natalie Angier's anxiety that these ideas have not trickled down, that, if anything, scientific thought seems to be on the retreat?
'Since when have the masses of people had any ideas anyway?' Brockman asks. 'It is always a certain percentage of people who do the thinking for everybody else. What is changing,' he argues, contrary to Angier's perception, 'is that the media people, who used to have no thoughts of science, now sit up. Science makes the news.'
I wonder why there are still so few literary contributors to Edge, which has remained a predominantly scientific and philosophical forum. Is there not some evidence there that the divide persists?
Brockman explains how Edge evolved out of a group called the Reality Club that held actual meetings with scientists, artists, architects, musicians. Ten of the leading novelists in America were invited to participate. Not one accepted.
'We are talking about Vonnegut, Updike, Mailer, John Irving,' Brockman says. 'Ian McEwan is one of the first writers to jump feet-first into the world of science and embrace it wholeheartedly. But we still have never had a novelist come to one of these events. Neither have we had a major business person. Maybe getting up in front of a group of Nobel-winning scientists to talk might be intimidating for these people. Maybe they are too busy.'
Brockman's optimism is infectious, and, at his elite level, the battle may have been won, but further down the food chain, the forces of reason are still compromised by the culture.
When I had recovered a little of my composure with James Watson, back in Cold Spring Harbor, I asked him how he thought the climate of scientific research had changed since he made his fateful discovery of the structure of life in 1953. As ever, he came at the question from an unusual angle. He doubted, he said, that in today's world, he and Francis Crick would ever have had their Eureka moment.
'I recently went to my staircase at Clare College, Cambridge and there were women there!' he said, with an enormous measure of retrospective sexual frustration. 'There have been a lot of convincing studies recently about the loss of productivity in the Western male. It may be that entertainment culture now is so engaging that it keeps people satisfied. We didn't have that. Science was much more fun than listening to the radio. When you are 16 or 17 and in that inherently semi-lonely period when you are deciding whether to be an intellectual, many now don't bother.'
Watson raised an eyebrow, fixed me again with a look. 'What you have instead are characters out of Nick Hornby's very funny books, who channel their intellect in pop culture. The hopeless male.'
As James Watson knows perhaps more clearly than anyone alive, biology works in mysterious ways.
One of these claims is sure to make your blood boil: the assertion that humans have no soul. Or that we are alone in the universe. Or that the search for the origin of life is pointless.
In What Is Your Dangerous Idea? Today's Leading Thinkers on the Unthinkable (Harper Perennial, $13.95), John Brockman, founder of Edge (www.edge.org), an online salon, asks 108 thinkers and scientists to describe their "most dangerous idea." Harvard University cognitive scientist Steven Pinker sets the tone in the introduction: "Science in particular has always been a source of heresy, and today the galloping advances in touchy areas like genetics, evolution, and the environmental sciences are bound to throw unsettling possibilities at us," he writes.
Essentially a compendium of short essays, the book reads like an intriguing dinner party conversation among great minds in science—some of whom, of course, talk right past each other. String theory king Brian Greene contends that our universe is just one of many. On the next page, quantum theory proponent Carlo Rovelli shoots down the multiverse as "audacious scientific speculation."
Bold ideas aren't limited to the hard sciences; there's something here to provoke everyone, including the suggestion that evil emerges in all of us. Geneticist-provocateur J. Craig Venter proposes that we are not all created equal; the unorthodox psychology writer Judith Rich Harris undermines parenting by claiming that parents don't have much influence over the ultimate character of their children.
Don't expect to find answers here. Brockman will have you asking more questions than when you started—and may even change your mind about the ideas you've always been convinced are right. After reading What Is Your Dangerous Idea? even know-it-alls will realize how little they know for sure.
If your notion of a dangerous idea is handing the car keys to Lindsay Lohan or entering a biker bar and calling its patrons a bunch of pansies, you might want to steer clear of this book.
What Is Your Dangerous Idea? Today's Leading Thinkers On The Unthinkable deals strictly with the bigger-picture stuff, gathering 108 bright lights from around the world to proffer theories and opinions on everything from the meaning of life and our relevance in the universe (or absence thereof) to the erosion of democracy.
These "what if" scenarios have been compiled by John Brockman, founder of the "third culture" website The Edge (www.edge.org), an online forum for fellow eggheads and a community – to quote the site – "of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are."
Walter Isaacson's "Einstein: His Life and Universe" (Simon & Schuster, 2007) is quite properly drawing praise for its thorough, step-by-step chronicle of the great man's long and eventful life, but if you want a briefer, quirkier, more multifaceted picture of Einstein, try "My Einstein" (Pantheon, 2006). The latter, edited by John Brockman, is a bouquet of brilliant essays by Einstein's intellectual peers: top scientists who know their way around a quark. My quarrel with the Isaacson biography is that it occasionally feels padded, as if the author is just adding anecdotes to thicken the sauce, whereas the essays in the Brockman book are zippy and personal.
In 1959, C. P. Snow delivered a famous lecture at Cambridge entitled The Two Cultures and the Scientific Revolution, deploring the academic and professional split between the sciences and the humanities. In 1991, the literary agent John Brockman popularised the concept of the third culture, to refer to the arrival on the scene of the scientist-writers. Thus a new humanism would be born: no longer so much the classical humanism as a new hybrid of sciences and letters.
Where philosophy is concerned, this new humanism should attend not only to science but to the widest possible range of living currents of thought. The point is that philosophy must not be shut away in a professional academic department, but practised at an interdisciplinary crossroads and in 'conversation' (as the recently deceased Richard Rorty put it) with all the other sciences. Philosophy has to draw maps of reality. The philosopher is, in Plato's words, 'the one who has the overall view (synoptikos)': the one who organises the most relevant part of our 'stored information' (culture) and sketches new worldviews, provisional but coherent. Nor should the founding intuition of the 'analytic' philosophers, who were the first to point out the importance of avoiding the traps language sets for us, be thrown away.
I think, then, that a new humanism should take on certain linguistic reforms. Consider, for example, how much we are still conditioned by the old Aristotelian construct of subject, verb and predicate, which is also the Cartesian subject-object model of cognition. This convention is responsible, as both the Buddha and David Hume pointed out, for the fallacy of believing that there is a mind, when all we can be sure of is that there are mental acts.
The trouble is that in the philosophical genre words have to convey concepts, which leaves little room for flourishes. In philosophy it is very difficult to step outside a given grammatical model. Martin Heidegger explained that he had to abandon the second part of Being and Time because of the inadequacy of the language of metaphysics, which always identifies Being with beings, forgetting the ontological difference. Today, when philosophy tends to blur into literature, what other resources are left? Gregory Bateson used to say that we must get used to a new way of thinking that replaces objects with relations. But to replace objects with relations is to tell stories. So Gregory Bateson was inviting us to tell stories.
In any case, although the 'linguistic turn' has taken place, our syntactic habits have changed little. And, as I say, that is understandable. Heidegger, in his later period, championed poetry (its supreme example being Hölderlin) as a model of non-objectifying language, not reduced to a mere instrument of information. The trouble is that Heidegger became so intoxicated with 'poetic obscurity' that he was scarcely possible to follow. As for the formal languages used by the hard sciences, in the end they are accessible only to a tiny group of specialists. Educated people could, in their day, still digest Newton's theory of gravitation, and even Einstein's relativity (though less easily: the constancy of the speed of light is strictly counterintuitive); but who can follow the diabolical mathematical complexity of superstring theory?
And yet there is, to my mind, an irreversible path here. For whatever language one uses, the hour has come to free ourselves from the tyranny of intuition, common sense and similar deceptions.
Then again, why should reality be completely intelligible? To begin with, Gödel's theorem challenges the very notion of a complete theory of nature: any moderately complex system of axioms raises questions that the axioms cannot answer. The theory of evolution, for its part, confirms our darkness. Nothing obliges us to think that the world must be completely intelligible. At least not to us, thinking apes. At least not in terms of what we, thinking apes, understand by intelligibility.
In short: a new humanism should begin with a cure of modesty, perhaps even abjuring the arrogant concept of humanism itself, the one that places the human animal at the centre and reference point of everything that exists. A new humanism, compatible with metaphysical sensibility, cannot turn its back on science. Naturally, this does not mean falling into the pseudoscientific obscurantism denounced by Alan Sokal and Jean Bricmont in their well-known book Intellectual Impostures. Scientific jargon should not be used in contexts where it does not belong. Nor does it mean lapsing into a radical epistemic relativism (born of a poor digestion of the works of Kuhn and Feyerabend), or believing that science is mere narrative, or a pure social construction. Nor does it mean seeking scatterbrained syntheses of Science and Mysticism. The task comes before all that, and is more respectful of the autonomy of science. It is a matter of truly knowing our essential conditionings; of letting scientific paradigms genuinely fertilise philosophical and even literary discourse.
The point is that it is the whole of culture that is permanently at stake and permanently renewed, renewed through the cross-fertilisation of the different disciplines. Today it is even fitting to work out a new concept of 'sacred texts', which need not be sought where the springs have already run dry. Will the day ever come, for instance, when some Supreme Pontiff of the Catholic Church writes something truly inspired, something real, without the horrible mannerisms of official documents? It seems unlikely, and it is not needed. The true 'sacred texts' of the Western tradition have for centuries been those of the great authors: Plato and Aristotle, Dante and Shakespeare; but also Victoria, Bach, Handel, Beethoven; and Giotto, Fra Angelico, Rembrandt; and Archimedes, Pascal, Newton, Darwin, Einstein, Heisenberg; and Paul Celan and Béla Bartók. And so on. All of them are 'sacred authors'. Canonical. Quantum physics is a monument no less inspired than the Bible. Nor any less ambiguous. As the scientist Arthur I. Miller writes: 'Like a great work of literature, quantum theory is open to a multitude of interpretations.'
Those who set science against the sacred texts, or science against art, are therefore mistaken. Respecting their respective spheres of autonomy, everything forms part of the same prodigious struggle: the pursuit of the real, which in a sense is also the pursuit of the absolute, the absolute that is sensed even though it remains inaccessible. Certainly, a Renaissance-style fusion of all knowledge is no longer possible; the mountain of specialisation is too high. But the different branches of knowledge can be made to 'communicate', to communicate without being 'reduced' to one another. That is the heart of what Edgar Morin has called 'transdisciplinarity', which, without seeking a unitary principle behind all knowledge (which would itself be reductionism), aspires to communication between the disciplines on the basis of 'complex' thought. Not everything is physics, nor biology, nor sociology, nor anthropology; but these areas can be linked cybernetically.
Encyclopedism? Rather a looping together of the physical/biological/social/anthropological circle. The point is that the great questions are being renewed, the question of the human condition is at stake, and permeability between the sciences, the arts and letters has become a central demand of our time.
What is Your Dangerous Idea?: Today’s Leading Thinkers on the Unthinkable Edited by John Brockman (Simon and Schuster, £12.99)
JOHN Brockman is a kind of entrepreneur of ideas. He runs edge.org, a website for boffins, and writes and edits clever books on subjects such as the future and God. Here, he has had what might be his whizziest idea yet. He simply asked the cleverest scientists in the world to tell him one thing: what is the most dangerous idea they can think of? And they did. And it's really good.
When you ask clever people about dangerous ideas, it turns out, they normally say one of two things. Some say that we, as a species, are becoming too clever for our own good - that our ideas are excellent, and that, pretty soon, life will get much worse as a result.
Others say quite the opposite - that the human race has no idea about anything, and that, pretty soon, we'll realise this fact, and that, as a result, life will be much worse. Of course I'm simplifying.
But not much.
Let's start with John Horgan, of the Stevens Institute of Technology. What, he asks quite reasonably, would happen if we managed to get to the bottom of the "neural code", and understood exactly how the brain works? "Will we be liberated or enslaved by this knowledge?" he asks. Quite possibly enslaved, because nobody would be able to believe in the soul any more.
And David Buss, the Darwinian psychologist famous for his research into human mating behaviour, wonders what might happen if we understood ourselves so well that we could grasp the concept "that evil has evolved".
That, in other words, lots of us are descended from tyrants such as Attila the Hun. And that, therefore, he has passed on some of his evil genes to us.
In the end, says Buss, we need to face up to this. "The danger," he says, "comes from people who refuse to recognise that there are dark sides to human nature."
The geneticist Craig Venter has similar worries - understanding the fact that we are all different, genetically speaking, challenges the cosy, politically correct world we have got used to.
There's more of this - the fear that, in the end, good ideas might actually have bad consequences. What will happen, asks the psychologist Diane Halpern, when we know enough to be able to choose the sex of our children? Too many boys, she believes. She's done the research, and it doesn't look promising.
On the other hand, what if we don't know anything? The Stanford physicist Leonard Susskind wonders about the effect of the "landscape" idea on the future of physics. What if the universe is so big that, "rather than being a homogeneous, mono-colored blanket, it is a crazy-quilt patchwork of different environments"? In this case, we might realise that we only have knowledge of an infinitely small part of it. And then, dispirited, we might give up the ghost.
Maths in the digital age, writes the Cornell mathematician Steven Strogatz, has entered a troublesome new world. These days, we are able to prove theorems by crunching numbers in unearthly quantities. But we have no insight - we may know that something is true, but not why. Scary, no? And psychologist Geoffrey Miller gives us a good reason why we haven't had signals from other life-forms - because, if they ever did exist, they got so good at sating themselves with junk food and video games that they died out.
A brilliant book: exhilarating, hilarious, and chilling. But is anything else out there? Quite possibly. As the physicist W Daniel Hillis says: "I don't share my most dangerous ideas."
If you think the web is full of trivial rubbish, you will find the intellectual badinage of edge.org to be a blessed counterpoint. This online magazine from the eponymous foundation links to the latest articles by the likes of scientists Richard Dawkins and Steven Pinker: heralds of the new "third culture" who are "rendering visible the deeper meanings of our lives".
NORMAL, Ill. -- To get some idea of the brouhaha currently enveloping linguists, occupants of a usually quiet corner of the ivory tower, suppose a high-school physics teacher found a hole in the theory of relativity.
Students of language consider Noam Chomsky the Einstein of their discipline. Linguistics is a very old science, but beginning in the 1950s, Chomsky so revolutionized the field that linguists refer to the time prior to his work as B.C., or before Chomsky.
They may have to add another marker: A.D., after Dan.
Daniel Everett, a faculty member at Illinois State University, has done field work among a tiny tribe in the Amazon. He reports that their obscure language lacks a fundamental characteristic that, according to Chomsky's theory, underlies all human language.
With that declaration, Everett pitted himself against a giant in the field, and modest ISU against the nation's elite universities. In the process, he drew national attention to this arcane field and enveloped scholars around the world in a battle that plays out over and over in -- this is academia, after all -- conferences and seminars. ...
The son of a Boston wholesale flower seller, he adapted his father's business methods in his work as a pop publicist and management consultant. He went on to become a successful literary agent, specialising in top science writers and — with an online 'intellectual salon' — building a reputation as a tireless promoter of influential ideas. Interview by Andrew Brown
In 1968 John Brockman was promoting a film called Head, starring the Monkees. His idea of publicity was simply to have the whole town covered in posters showing a head, with no caption. Naturally, the chosen head was his. Grotesquely solarised, with blue-grey lips and scarlet spectacles, fashionable, suggestive of intellectual power, impossible to decipher, there he stood against a thousand walls, looking down on the city of New York.
The posters have long since faded, but Brockman's position remains the same, gazing inscrutably on anything interesting in Manhattan. Now he is one of the most successful literary agents in the world, but to his friends and clients he is much more: an impresario and promoter of scientific ideas who is changing the way that all educated people think about the world. Richard Dawkins, his friend and client, says, "his Edge web site has been well described as an online salon, for scientists and for other intellectuals who care about science. John Brockman may have the most enviable address book in the English-speaking world, and he uses it to promote science and scientific literature in a way that nobody else does."
Portrait copyright © by Eamonn McCabe
Anyone today who thinks that scientists are the unacknowledged legislators of the world has been influenced by Brockman's taste. As well as Dawkins, he represents Daniel Dennett, Jared Diamond, and Sir Martin Rees, as well as three Nobel prize winners and almost all the other famous popular scientists. His old friend Stewart Brand, the publisher of the Whole Earth Catalog and later the promoter of the Clock of the Long Now, which is intended to run for 10,000 years, says: "It's so easy to think the guy's just a high-class pimp that it's quite easy to ignore the impact on the intellectual culture of the west that John has enabled by getting his artist and scientist friends out to the world. There is a whole cohort of intellectuals who are interacting with each other and would not [be able to] without John."
Brockman himself says, "Confusion is good. Then try awkwardness. Then you fall back on contradiction. Those are my three friends." Fortunately, they are not his only friends. When asked for photographs of himself as a young man, he sends one where he is standing with Bob Dylan and Andy Warhol on the day Dylan visited Warhol's Factory. In the course of a couple of hours' conversation, he brings up encounters with (amongst others) John Cage; Robert Rauschenberg; Sam Sheppard; Larry Page and Sergei Brin, the founders of Google, with whom he had just had lunch along with his client Craig Venter, the genome researcher; "Rupert" (Murdoch); Stewart Brand; Elaine Pagels, an influential historian of religion; Hunter S Thompson; Richard Dawkins; Daniel Dennett; Nicholas Humphrey, the psychologist; Murray Gell-Mann, the Nobel-winning physicist; the actor Dennis Hopper; and Steve Case of AOL.
He even mentions Huey P Newton, the Black Panther. "Sometime around 1987 or '88, I get a call from Huey, who was a close friend of mine, who I was trying to avoid, because it had been revealed that he was actually gratuitously murdering people . . . you know, shooting them. He was flipping out. He wasn't talking about revolution or anything. Newton's message said: 'Me and my buddy Bob Trivers — we're going to write a book on deceit and self-deception.'" Robert Trivers was one of the most important evolutionary biologists of the past 50 years, and came up with the hugely influential idea of "reciprocal altruism" as a graduate student at Harvard in the early 70s before his career was interrupted by psychological problems and he went off to live in the Jamaican jungle for some years. (He is now back at Harvard, in a chair funded by a friend of Brockman's.) Brockman continues: "Soon after that, he [Newton] died a very nasty death: just a crummy sidewalk dope deal. This was no way for a real revolutionary . . .
"A couple of years ago, I made a rare visit to LA and was doing my favourite thing: watching the movie stars round the pool, and I got a message: Bob Trivers called. 'John. It's Bob Trivers. As I was saying. I've got the proposal ready. It's for a book on deceit and self-deception.'" Such a book will be ideal Brockman fodder. It takes science out to the edges of society yet deals with subjects of eternal importance. It captures a theory at the stage when it is most vigorously fighting for its life. It is written by the man who made the discovery, which is an important point.
Though Brockman has made some journalists a lot of money, his truly unique selling point is that he has made real scientists far more. In 1999, for example, at the height of the pop science boom, he sold the world rights to a book by the theoretical physicist Brian Greene for $2m. Some of his books have proved initially trickier. Gell-Mann had to return an advance of $500,000 for a book, The Quark and the Jaguar, delivered late, that Bantam rejected. Brockman subsequently sold it to WH Freeman for a reported $50,000.
Many would agree that at least half his clients are truly remarkable thinkers, but there is room for disagreement about which half. For instance, he represents Sir John Maddox, the former editor of Nature, but also Rupert Sheldrake, whose heretical ideas about biology were denounced by Maddox in a Nature editorial that suggested Sheldrake's book A New Science of Life be burnt. Brockman has sold most of Richard Dawkins' books, but also The Bible Code by Michael Drosnin, which claimed that everything significant in the world up to the death of Princess Diana could have been predicted by reading every seventh letter in the Hebrew Bible, and the novel The Diary of a Manhattan Call Girl by Tracy Quan, which was the first account of a prostitute's life to be serialised on the internet.
"He likes proposals to be about two pages long, no more, and then he likes to get an auction going," one of his authors says. "You'll get a call from him, and he's walking down Fifth Avenue on his cellphone, saying that he's got Simon & Schuster to bid 100,000 and now he will see what happens. A quarter of an hour later, Bantam has bid 125,000 and then he says he'll go back to S&S and see if he can get 150,000. But he's got an attention span of about half an hour. If the book isn't sold within a week, forget it."
Tom Standage, the technology editor of the Economist, had his first book sold by Brockman on the basis of an outline one paragraph long. He sent it off in a speculative spirit and the next thing he heard was the rustle of a contract crawling towards him from the fax machine. Standage says: "He feels he's failed if a book earns out its advance and pays royalties because that means he hasn't got as much from the publishers as he could have done."
This is how the young Brockman learned from his father, a broker in the wholesale flower market in Boston, to hustle sales. "He dominated the carnation industry. He would go to the Boston flower market, which was owned by the growers, who formed a cooperative. All these Swedes and Norwegians would be growing gladiolas and carnations and they'd bring them in at three in the morning and leave them like a long aisle. There'd be thousands of flowers, and you had to sell them, or they died. He said to me 'you gotta move them, they're going to die'. And one day, 40 years later, I'm on the phone, and I had a chilling feeling as I felt my father's voice coming through me, like, 'they're going to die'. So, why am I always so fixated on closing the deal, getting the next book in? It comes from that experience. That was a pure market situation. So, that's the way I run my business. It's not literary. It's not publishing. It's business. I have got properties to sell, on behalf of my clients.
"My job is to do the best I can for them and I do it by making a market. The market decides. But knowing how to make a market involves . . . some capacities." The capacities are at the heart of his business, but it's hard to describe them. He has a keen sense for interesting ideas, but also for the ways in which they fit into society. For instance, he would never call himself an atheist, he says, in America: "I mean I don't believe: I'm sure there's no God. I'm sure there's no afterlife. But don't call me an atheist. It's like a losers' club. When I hear the word atheist, I think of some crummy motel where they're having a function and these people have nowhere else to go. That's what it means in America. In the UK it's very different."
The Brockmans were the children of immigrants — John's father's family had come from Austria — and grew up in a largely poor and Catholic neighbourhood of Boston; he remains extremely sensitive to anti-Semitism. "There were no books in our house. My father could barely read. He was a brilliant man but he was on the streets working at eight years old. My mother read a little bit, but, you know, it was a little encyclopedia.
"My parents were poor. My father started a business the day I was born which became a successful business. But we grew up in a tough neighbourhood called Dorchester, which was an Irish-Catholic bastion, where this radical right-wing priest went up and down the streets telling people to kill Jews. So that's how my brother and I grew up." He has one brother, a retired physicist, who is three years older. "We quickly found out, going to school, that . . . we were personally responsible for the death of Jesus Christ. We had a lot of fighting to do, and most of it on the losing end, because there were always 30 of them to two of us. My brother got the worse of it. My mother was a tough cookie. She would kick him out of the house if he didn't fight hard enough. Luckily in those days you didn't get killed; you just got a bloody nose. But it was tough."
"Confusion is good. Then try awkwardness. Then you fall back on contradiction. Those are my three friends."
This mixture of pugnacity and sensitivity about ethnicity can still surface. When he was upset by a profile in the Sunday Times magazine, which he thought played to an anti-Semitic stereotype, he complained straight to Rupert Murdoch (using Murdoch's banker, another of his contacts, as an intermediary).
Brockman was a poor student in high school and was turned down for 17 colleges before studying business, finishing up with an MBA from Columbia University in New York. He worked selling tax shelters for a while, but in the evenings he was hanging out with all the artists he could find. He stacked chairs in the theatre with the young Sam Sheppard; he went to dinner parties with John Cage; he started to put on film festivals and then multi-media extravagances at about the same time as Ken Kesey's Merry Pranksters in San Francisco and Andy Warhol in New York. This early attraction to the art world seems to have set his style. The art that he was involved with qualified as art simply because everyone involved decided it was.
In this flux, it seemed the only certainty was scientific truth, but he was early attracted to the idea of science, of computing as a metaphor for everything. Stewart Brand first met him in the early 60s: "I was in the army as an officer and spending the weekends in New York — he was in the thick of the multimedia scene that was the cutting edge of performance pop art. He was an impresario, who could help organise events and people and media and be essential to the process, but unlike a lot of people he was actually alert to what the art was about, just as later, as an agent, he was alert to what the books were about. So far as I was concerned he was another artist in the group of artists I was running with."
Life at a glance
Born: February 16 1941 Boston, Massachusetts.
Educated: Babson Institute of Business Administration; Columbia University, New York.
Employment: 1965-69 Multimedia artist; '74-present, literary agent; founder Brockman, Inc; chairman Content.com.
Married: Katinka Matson (one son, Max, 1981).
Some books: 1969 By the Late John Brockman; '88 Doing Science: The Reality Club; '95 The Third Culture; '96 Digerati: Encounters with the Cyber Elite; '03 The Next Fifty Years: Science in the First Half of the Twenty-First Century; '04 Science at the Edge.
In 1967 Brockman discovered how to sell flower power while it was still fresh. A business school friend who had gone to work for a paper company asked Brockman to help motivate the sales force for their line of sanitary towels. This was at a time when the New York Times was solemnly explaining that "Total environment" discothèques, such as Cheetah and The Electric Circus in New York, were turning on their patrons with high-decibel rock'n'roll combined with pulsing lights, flashing slide images, and electronic "colour mists". Brockman asked — and got — a fee of $15,000 despite having no consulting experience. He put on a multimedia show for the salesmen: they lay on the floor of a shiny vinyl wigwam while four sound systems played them Beatles songs, bird calls, company advertising slogans with an executive shouting about market statistics and competitive products, and a film showed a young woman wearing a dress made of the company's paper which she ripped down to her navel. In the 60s it was cutting-edge art, an "intermedia kinetic experience", and the salesmen exposed to it reportedly sold an additional 17% of feminine hygiene products in the next quarter. Brockman took the show around nine cities for the company, energising its sales force nationwide, and was established as a consultant who could sell his services to anyone.
But it was not enough. His book By the Late John Brockman was unfavourably reviewed, but he was not discouraged and continued to write and edit books — 18 at last count. One, Einstein, Gertrude Stein, Wittgenstein and Frankenstein, had to be hurriedly withdrawn after portions were found to have been plagiarised from an article by James Gleick, the author of Chaos, one of the first big pop science hits and not a Brockman client. Brockman blamed one of his assistants.
Brockman's later books have mostly been collections of interviews with friends and clients, salted and sometimes vinegared as well with their opinions of each other. He has made a Christmas tradition of asking questions of 100 or so people and circulating their responses. "What do you believe to be true, but cannot prove?" was the most recent one, in 2004, and is a fine example of Brockman's method as an editor or curator of thought. The question was supplied by Nicholas Humphrey, but it was Brockman who spotted its potential, and then knew 120 interesting people who were prepared to answer it. Humphrey's own answer is characteristically thought-provoking: "I believe that human consciousness is a conjuring trick, designed to fool us into thinking we are in the presence of an inexplicable mystery . . . so as to increase the value we each place on our own and others' lives." Philip Anderson, the Nobel Prize-winning physicist, believes that string theory is a waste of time. Randolph Nesse, an evolutionary biologist, believes, but cannot prove, that believing things without proof is evolutionarily advantageous; Ian McEwan that no part of his consciousness will survive death.
Brockman has constantly reinvented himself. He has been at the leading edge of intellectual fashion for the past 30 years. In the late 90s, just before the dot.com bubble popped, he told an interviewer from Wired magazine that he wanted to be "post-interesting". Looking back on all the ideas he has enthused about you glimpse a mind that rushes around like a border collie — tirelessly and gracefully pursuing anything that moves, but absolutely uninterested in things that stay still, and liable, if shut up in a car, to get bored and eat all the upholstery. Like a lot of successful salesmen, part of his secret is that he is interested in people for their own sake as well as for what they can do for him, and can study them with extraordinary concentration, solemnly placing out, beside the journalist's machine, two tape recorders of his own at the beginning of an interview. To be under his attentive, almost affectionate gaze, is to know how a sheep feels in front of a collie.
Twice in the course of a couple of hours' chat he says "you ought to write a book about that". He became a book agent by accident. He was talking about God to the scientist John Lilly, a friend of Brand's, whose research into dolphins and LSD was one of the first tendrils of a scientific study of consciousness, and he realised Lilly had a book there. He sold the proposal and found a new business where his talents and his interests coincided.
He has been in the vanguard of the trend towards larger advances at the expense of royalties, and a model of rewards in which a few superstars make gigantic sums and almost everyone else makes next to nothing. His first enormous commercial success came in the early 80s, as personal computers started to appear. He understood that software manuals would need publishing just as normal books do. In the end, the idea of software publishers didn't work out, but not before Brockman had made a fortune from the idea. He started an annual dinner for the other players in the business, called the millionaires' dinner. Later, when this seemed unimpressive, he renamed it the billionaires' dinner; then the scientists' dinner — whatever worked to bring lively people round him.
"Throughout history, only a small number of people have done the serious thinking for everybody"
He works with and is married to Katinka Matson, the daughter of a New York literary agent who was AD Peters's partner in the 50s. "She actually makes the wheels turn in the office," says Tom Standage. The Brockmans have one son, Max, who works in the family business as a third-generation agent, and who was blessed in his crib by a drunken dance performed round it by Hunter S Thompson, Dennis Hopper and Gerd Stern, a multi-media artist from the avant-garde scene.
After the first boom in personal computers and their software blew out, Brockman was perfectly placed for the next boom, in writing about the people who made it. The house magazine of that boom was Wired, which sold itself to Conde Nast as "The magazine which branded the digital age"; it is almost an obligation on the editor of Wired to be a Brockman client. He set out his manifesto in the early 90s for what he called the Third Culture: "Traditional American intellectuals are, in a sense, increasingly reactionary, and quite often proudly (and perversely) ignorant of many of the truly significant intellectual accomplishments of our time. Their culture, which dismisses science, is often non-empirical. By contrast, the Third Culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are."
Everything was speeding up, too. Brockman had always been quick to close a deal. Now he demanded pinball-fast reactions from the editors he sold to. One trick was to watch the front page of the New York Times and get a quick book proposal out of every science story that appeared there. This meant that if you were a Brockman client on the staff of the New York Times, a front page splash was not just professionally gratifying, but also a potential route to a large cheque. There was a danger that this constituted a temptation to hype. When one New York Times journalist, Gina Kolata, followed a "cure for cancer" splash with a book contract the next day, there was an outcry and the book was eventually cancelled.
For Brockman, America now is the intellectual seedbed for Europe and Asia. He wrote: "The emergence of the Third Culture introduces new modes of intellectual discourse and reaffirms the pre-eminence of America in the realm of important ideas. Throughout history, intellectual life has been marked by the fact that only a small number of people have done the serious thinking for everybody else. What we are witnessing is a passing of the torch from one group of thinkers, the traditional literary intellectuals, to a new group, the intellectuals of the emerging Third Culture. Intellectuals are not just people who know things but people who shape the thoughts of their generation. An intellectual is a synthesiser, a publicist, a communicator." Brockman, it hardly needs saying, is the true intellectual's agent.
Of course, this was angrily resented by those outside the magic circle, especially if they were themselves intellectuals in every respect save being represented by him. But any anger or ridicule stays off the record. Who knows when they will need to deal with him? Who knows when he could bless them with a million dollars, and whisk them into the magic circle?
Yet it is a tribute to Brockman's personality that people who have known him a long time like him a great deal. Stewart Brand says: "The salon-keeper has an interesting balancing act between highlighting the people they're attracted to and also having a strong enough personality so that they are taken seriously as a peer. People do not feel threatened by him or competitive with him. They either admire him or profess to be amused by him. But you look behind that, and you realise that they don't look down on him at all."
The magic circle has gone by different names, with different degrees of formality. In the 90s it was manifested in a physical gathering, run with Heinz Pagels, called the Reality Club. The elite would come together and talk about the work that interested them. They didn't have to be his clients, and many of them weren't. But all invested their time in ideas he was promoting. Pagels died in an accident and Brockman says he didn't have the heart to go on by himself. Instead, he set up his Edge website, where he puts up new interviews every month, which can be read as transcripts or watched as videos, with commentaries.
It all reinforces his idea that reality is essentially social. Even the name, the Reality Club, goes right back to his earliest big idea: that reality is what the smart people, who should be friends of John Brockman, decide to make of the world: "It's an argument that I have with all my scientist friends, and I lose it every time. They don't buy it at all. It's very primitivistic, I'm told, or even solipsism, but it works for me."
In 2005, the American anthropologist Daniel Everett published an article in Current Anthropology in which he presented his insights into Pirahã life, acquired over years spent living with the tribe. Pirahã culture, Everett claimed, was unique: it was totally focused on immediate experience and it lacked basic number skills, a vocabulary for colours, a past perfect tense and a creation myth....
Genres crumble, divisions fade in light of tragedy
By Julia Keller
Tribune cultural critic
...Contemporary culture is a blur, a haze, a hodgepodge, a constant shuffle play on the natural-born iPod known as the human consciousness. The old hierarchies -- high art, low art, enlightenment, junk -- are dead. The ancient demarcations of poem and story and painting are pointless.
Genres are dissolving. Boundaries are disintegrating. Old lines of stratification and division and roping-off of subject areas, gone. Next thing you know, they'll be taking the 9/11 commission's austere and straightforward exegesis of the defining national tragedy of our lifetimes and turning it into a comic book. ...
... Modern technology, then, may have been almost as urgent a target for the 9/11 terrorists as were the helpless humans they murdered. The audacity of the attacks may have arisen from a desire to splash the world with the ghastly imagery of technology run amok, of technology outsmarting itself to bring about chaos and death. Thus the arts -- still our chief means of engaging with ideas, even the heinous ideas of terrorists -- must grapple with technology's double-edged sword: Some of us see it as redemptive and positive, while others see it as threateningly negative.
John Brockman, founder of a Web site illuminating the interplay of science and culture (www.edge.org), believes technological advances are always beneficial, despite the lethal misgivings that certain groups harbor. Science "figures out how things work and thus can make them work better," he wrote in an e-mail. "As an activity, as a state of mind, it is fundamentally optimistic."
And so here we stand, clutching a comic book in one hand and a copy of "Hamlet" in the other, listening to an aria through one headphone and a Dixie Chicks ballad through the other, looking out at a landscape that seems ancient and exhausted -- and bright and new. A world in which we are, every second, individuals and vital parts of communities as well.
Philip Zimbardo used to be one of my heroes, but no longer. The psychologist dreamt up the Stanford Prison experiment, in which 24 male students were randomly assigned roles as either captive or guard in a mock prison. Guards were given uniforms and power; prisoners were stripped of their names and privileges, and were ordered to remain largely silent. The nightly toilet run saw the prisoners blindfolded and shackled together before being marched to the bathroom.
The experiment, in 1971, was stopped after just six days because the guards had become sadists and the prisoners depressives. Remember, they all started off as nice, normal college kids. The experiment became a totem of a thing called “situational” evil: good people, when put into bad situations, could become brutes. It has furnished an explanation — but not exoneration — for atrocities ranging from the Holocaust to Abu Ghraib (Professor Zimbardo appeared as a defence witness at the trial of a soldier charged with torture at the Iraqi prison).
I had always assumed it was Professor Zimbardo who called time. In fact, it was a young psychologist called Christina Maslach. Professor Zimbardo, who had just started dating Dr Maslach, had invited her over to impress her. Instead, after witnessing the toilet run, she fled in horror, telling Professor Zimbardo she no longer wanted to know him. The experiment, she said, had dehumanised its instigator as well as its participants.
So, Professor Zimbardo stopped the experiment because he risked losing the woman he loved. He calls Dr Maslach a hero for challenging the wisdom that the experiment was a justifiable study of human nature. And it has led him, he tells the Edge website (www.edge.org), to consider the flip side of evil: the psychology of heroism.
Just as some people can be made to grow horns, others grow haloes. Yet, so little is known about heroes, other than that they often say, in the face of mountainous evidence to the contrary, that they didn’t do anything special. Do heroes ever contemplate the risks? Or do they consider them and then override them? Such basic research, Professor Zimbardo says, has never been conducted but should be, ideally in the immediate aftermath of a heroic act.
We must also cultivate a different heroic imagination in the young. Dangerously, children grow up believing that heroism is the preserve of the legendary rather than the ordinary: Achilles or Superman. He says: “The secondary consequence is for us to say, ‘I could never be that . . . or bear such a burden’. I think, on the other hand, we each could say, ‘I could do what Christina Maslach did’.” Indeed: we need heroes who will stop another Enron, another Abu Ghraib, another questionable psychology experiment.
By the way, Professor Zimbardo and the now Professor Maslach celebrate their 35th wedding anniversary this year.
Slide into evil. In the Stanford Prison Study in 1971, university students were randomly assigned to be prisoners or guards and then placed in a mock prison setting in the basement of the campus psych building. The guards became so oppressive and sadistic, and the prisoners so passive and depressed, that the two-week study was ended after six days. Lead researcher Philip Zimbardo is featured on edge.org in a lengthy discussion of evil and heroism. He calls the study a "cautionary tale of the many ways in which good people can be readily and easily seduced into evil. . . . Those who sustain an illusion of invulnerability are the easiest touch for the con man, the cult recruiter or the social psychologist ready to demonstrate how easy it is to twist such arrogance into submission."
In an original EDGE essay, Wikipedia co-founder Larry Sanger claims that the Web's ability to aggregate public opinion and knowledge into some form of "collective intelligence" is leading to a new politics of knowledge. According to Sanger, the power to establish what "we all know" is shifting out of the hands of a small elite group and becoming more of a conversation open to anyone with a Net connection. However, Sanger is also the founder of Citizendium, a competitor to Wikipedia that, according to its Web site, "aims to improve on (the Wikipedia) model by adding 'gentle expert oversight' and requiring contributors to use their real names." In this essay, titled "Who Says We Know: On The New Politics Of Knowledge," Sanger argues that a lack of "expert" oversight leads to unreliable information, something he sees as a major flaw in knowledge egalitarianism. I'm sure this essay will spark as much fiery debate as the previous essay in this EDGE series, Jaron Lanier's "Digital Maoism." From Sanger's essay:
Today's Establishment is nervous about Web 2.0 and Establishment-bashers love it, and for the same reason: its egalitarianism about knowledge means that, with the chorus (or cacophony) of voices out there, there is so much dissent, about everything, that there is a lot less of what "we all know." Insofar as the unity of our culture depends on a large body of background knowledge, handing a megaphone to everyone has the effect of fracturing our culture.
I, at least, think it is wonderful that the power to declare what we all know is no longer exclusively in the hands of a professional elite. A giant, open, global conversation has just begun—one that will live on for the rest of human history—and its potential for good is tremendous. Perhaps our culture is fracturing, but we may choose to interpret that as the sign of a healthy liberal society, precisely because knowledge egalitarianism gives a voice to those minorities who think that what "we all know" is actually false. And—as one of the fathers of modern liberalism, John Stuart Mill, argued—an unfettered, vigorous exchange of opinion ought to improve our grasp of the truth.
This makes a nice story; but it's not the whole story.
As it turns out, our many Web 2.0 revolutionaries have been so thoroughly seized with the successes of strong collaboration that they are resistant to recognizing some hard truths. As wonderful as it might be that the hegemony of professionals over knowledge is lessening, there is a downside: our grasp of and respect for reliable information suffers. With the rejection of professionalism has come a widespread rejection of expertise—of the proper role in society of people who make it their life's work to know stuff. This, I maintain, is not a positive development; but it is also not a necessary one. We can imagine a Web 2.0 with experts. We can imagine an Internet that is still egalitarian, but which is more open and welcoming to specialists. The new politics of knowledge that I advocate would place experts at the head of the table, but—unlike the old order—gives the general public a place at the table as well.
In 1992, John Brockman defined the concept of the third culture in his essay entitled "The Emerging Third Culture": "The third culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are."
For John Brockman, the strength of the third culture is precisely that it can tolerate disagreements about which ideas are to be taken seriously. Unlike previous intellectual pursuits, the achievements of the third culture are not the marginal disputes of a quarrelsome mandarin class: they will affect the lives of everybody on the planet. Scientific subjects now receive outstanding treatment in the pages of newspapers and magazines.
Molecular biology, artificial intelligence, artificial life, chaos theory, neural networks, the inflationary universe, fractals, complex adaptive systems, superstrings, biodiversity, nanotechnology, the human genome, virtual reality and the like are among the new scientific subjects transmitted to society through the new metaphors created by the intellectuals of the third culture.
In the third culture a new philosophy of nature is being born, grounded in an understanding of the complexity of evolution. According to Brockman, highly complex systems such as organisms, brains, the biosphere or the universe itself were not constructed according to a deterministic design; all are evolutionary processes, and interpreting them through images and metaphors has been the task of the intellectuals of the third culture, who in this way attempt to express their deepest reflections in a form accessible to the intelligent reading public.
In spite of its critics, the third culture is alive and thriving. Books by Richard Dawkins, Daniel C. Dennett, Jared Diamond, Brian Greene, Steven Pinker, Martin Rees and others are indispensable not only for their information but are also great publishing successes. Their subjects address the main controversies of the western world in recent decades: abortion and euthanasia, demographic policy, the widening gap between rich and poor countries, pacifism, migration, racism and xenophobia, the causes of the ecological crisis, and the implications of technology, which together point towards an ethics of responsibility and the social control of science policy.
The worldwide phenomenon of the third culture is not merely the irruption of natural scientists into the postmodern intellectual scene but a movement towards a global intellectual vision, driven by the intensive use of images and hypermedia in human communication, which has allowed the scientific knowledge of the second half of the 20th century to permeate all of society, making information available for confronting the great universal challenges of the 21st century.
However, in spite of the serious warnings of the natural scientists, the mainstream political leaders of the world have failed to grasp that present political action must be focused on preserving the habitat of human beings. Although the contribution of scientific knowledge is falsifiable, ephemeral and almost always probabilistic, it is always helpful in making important decisions, indicating what ought not to be done. As Machiavelli wrote: "To know the ways that lead to hell is to avoid them".
Has a remote Amazonian tribe upended our understanding of language?
Dan Everett believes that Pirahã undermines Noam Chomsky’s idea of a universal grammar.
[ED. NOTE: Thanks to the New Yorker for making available the link to John Colapinto's article.]
Great reading in George Dyson's essay "Turing's Cathedral," found at edge.org. It connects the impulses of the original computer pioneers to the age of Google.
Life is full of surprises, but it's rare to reach for a carafe of wine and find your hand clutching a bottle of milk -- and even rarer, you'd think, to react by deciding the milk was actually what you wanted all along.
Yet something like that happened when scientists in Sweden asked people to choose which of two women's photos they found most attractive. After the subject made his choice (we'll call the chosen photo Beth), the experimenter turned it face down. Sliding it across the table, he asked the subject the reasons he chose the photo he did. But the experimenter was a sleight-of-hand artist. A copy of the unchosen photo, "Grizelda," was tucked behind Beth's, so what he actually slid was the duplicate of Grizelda, palming Beth.
Few subjects batted an eye. Looking at the unchosen Grizelda, they smoothly explained why they had chosen her ("She was smiling," "she looks hot"), even though they hadn't.
In 1966, Time magazine asked, "Is God Dead?" Even then, the answer was no, and with the rise of religion in the public square, the question now seems ludicrous. In one of those strange-bedfellows things, it is science that is shedding light on why belief in God will never die, at least until humans evolve very different brains, brains that don't (as they did with Beth and Grizelda) interpret unexpected and even unwanted outcomes as being for the best.
"Belief in God," says Daniel Gilbert, professor of psychology at Harvard University, "is compelled by the way our brains work."
As shown in the Grizelda-and-Beth study, by scientists at Lund University and published this month in Science, brains have a remarkable talent for reframing suboptimal outcomes to see setbacks in the best possible light. You can see it when high-school seniors decide that colleges that rejected them really weren't much good, come to think of it.
You can see it, too, in experiments where Prof. Gilbert and colleagues told female volunteers they would be working on a task that required them to have a likeable, trustworthy partner. They would get a partner randomly, by blindly choosing one of four folders, each containing a biography of a potential teammate. Unknown to the volunteers, each folder contained the same bio, describing an unlikable, untrustworthy person.
The volunteers were unfazed. Reading the randomly chosen bio, they interpreted even negatives as positives. "She doesn't like people" made them think of her as "exceptionally discerning." And when they read different bios, they concluded their partner was hands-down superior. "Their brains found the most rewarding view of their circumstances," says Prof. Gilbert.
The experimenter then told the volunteer that although she thought she was choosing a folder at random, in fact the experimenter had given her a subliminal message so she would pick the best possible partner. The volunteers later said they believed this lie, agreeing that the subliminal message had led them to the best folder. Having thought themselves into believing they had chosen the best teammate, they needed an explanation for their good fortune and experienced what Prof. Gilbert calls the illusion of external agency.
"People don't know how good they are at finding something desirable in almost any outcome," he says. "So when there is a good outcome, they're surprised, and they conclude that someone else has engineered their fate" -- a lab's subliminal message or, in real life, God.
Religion used to be ascribed to a wish to escape mortality by invoking an afterlife or to feel less alone in the world. Now, some anthropologists and psychologists suspect that religious belief is what Pascal Boyer of Washington University, St. Louis, calls in a 2003 paper "a predictable by-product of ordinary cognitive function."
One of those functions is the ability to imagine what Prof. Boyer calls "nonphysically present agents." We do this all the time when we recall the past or project the future, or imagine "what if" scenarios involving others. It's not a big leap for those same brain mechanisms to imagine spirits and gods as real.
Another God-producing brain quirk is that although many things can be viewed in multiple ways, the mind settles on the most rewarding. Take the Necker cube, the line drawing that shifts orientation as you stare at it. (A cool version is at dogfeathers.com/java/necker.html.) If you reward someone for seeing the cube one way, however, his brain starts seeing it that way only. The cube stops flipping.
There are only two ways to see a Necker cube, but loads of ways to see a hurricane or a recovery from illness. The brain "tends to search for and hold onto the most rewarding view of events, much as it does of objects," Prof. Gilbert writes on the Web site Edge. It is much more rewarding to attribute death to God's will, and to see in disasters hints of the hand of God.
Prof. Gilbert once asked a religious colleague how he felt about helping to discover that people can misattribute the products of their own minds to acts of God. The reply: "I feel fine. God doesn't want us to confuse our miracles with his."
Michael Wright enjoys a eureka moment at the edge of knowledge, as scientists ponder the imponderable
Some of the presentations are available to watch as QuickTime movies, if you prefer not to read, and keen thinkers can have a bimonthly e-mail of the latest discussions delivered to their inbox.
Each year, John Brockman, the site’s American editor, also sends a big, open-ended question to all the notable thinkers he knows, then publishes their responses online. This year’s little teaser — “What do you believe is true, even though you cannot prove it?” — prompted 60,000 words in reply, on subjects including particle physics, consciousness, artificial intelligence, global warming and tedious sophistry.
I like the belief of Alun Anderson, the editor-in-chief of New Scientist, that cockroaches are conscious, but cannot comment on the theoretical physicist who denies that black holes destroy information or the computer scientist who believes the continuum hypothesis is false.
Visiting Edge will make pseudo-scientists feel cleverer, and the rest of us more than usually stupid, as we discover, with a jolt of pleasure, how little we really know about the world.