MIT robotics professor Rodney Brooks helped bring about a paradigm shift in robotics in the late 1980s when he advocated a move away from top-down programming (which required complete control of the robot's environment) toward a biologically inspired model that helped robots navigate dynamic, constantly changing surroundings on their own. His breakthroughs paved the way for Roomba, the vacuuming robot disc that uses multiple sensors to adapt to different floor types and avoid obstacles in its path. (Brooks is chief technology officer and cofounder of Roomba's parent company, iRobot.) Brooks talked to NEWSWEEK's Katie Baker about the challenges involved in creating robots that can interact in social settings. Excerpts:
NEWSWEEK: Sociologists talk about the importance of culture and sociability in humans, and why [it should be equally important] in robots. Do roboticists consider things such as culture when thinking about how to integrate robots into human lives?
Rodney Brooks: Some of us certainly do, absolutely. My lab has been working on gaze direction. This is the one thing that you and I don't have right now [over the telephone], but if we were doing some task together, working in the same workspace, we would continuously be looking up at each other's eyes, to see what the other one was paying attention to. Certainly that level of integration with a robot has been of great interest to me. And if you're going to have a robot doing really high-level tasks with a person, I think you will want to know where its eyes are pointing, what it's paying attention to. Dogs do that with us and we do that with dogs, it happens all the time. Somehow cats don't seem to bother.
So are there ethical implications involved when you think about developing sociable robots, in terms of how they might change human behavior?
Well, every technology that we build changes us. There's a great piece on Edge.org by Kevin Kelly, I think it was, talking about how printing changed us, reading changed us. Computers have changed us, and robots will change us, in some way. It doesn't necessarily mean it's bad.
What are some of the more interesting robots that you've seen, or that you're developing or have developed?
I think what gets interesting is when robots have an ongoing existence. And Roomba has an ongoing existence, [though] less than [that of] an insect. But the ones that I have in my lab here, that I've scheduled to come out and clean every night, they do it day after day and recharge themselves. And they just sort of live as other entities in the lab and we never see them doing anything, except every so often we go and empty their dirt bins. So they've got an ongoing existence, but at a very, very primitive level. All the robots that you see from Honda and all those places don't even have that level of ongoing-ness. They're demo robots. But up until now, people haven't been building robots to have an ongoing existence, so they're sitting in the world, ready to do their thing when the situation is appropriate. So I think that's where the really interesting things will start to happen.
When you don't have to completely control the robot's environment?
When it becomes [something that] can have an ongoing existence with people … that is where things get interesting. We've done a few things like that here, starting back with [MIT professor] Cynthia Breazeal and [her sociable robot] Kismet, and in her new lab she's got some fantastic new robots where she's pushing towards that. We've had other robots here in my lab—Mertz, which was trying to recognize people from day to day as their looks change, and know about them. And some of our robots, like Domo, will interact with a person for 10 minutes or so, and [it] has face detectors and things like that. There are other projects in Europe—the RobotCub, which is focused in Genoa [Italy], is building these robots that many labs in Europe now have, which are all about emulating child development.
Obviously, we can tell when something's a robot and when something's a human. But when a robot is too humanlike, do we get concerned?
Our robots that we've built in my lab and the European robots, [if you] show a picture of that robot—just a static picture—to a kid, they'll tell you, "That's a robot." There's no question in their mind. And nevertheless, when the robots are acting and interacting, they can draw even skeptics into interacting with them—for a few seconds at least—where the person responds in a social way. And some people will go on responding for minutes and minutes. Then there are these super-realistic robots that a couple of different groups, one in Asia and one in the U.S., are building. One of them looks like Albert Einstein, and [the other] looks like this television reporter. And there it gets a little weirder. Because very quickly, you realize that they're not Albert Einstein or they're not the television reporter. But they look so much like it, you get this--some of the researchers in Japan call it the Uncanny Valley, I think. There's this dissonance in your head, because the expectations go so far up when it looks so realistic.
What else is important to understand about the robotics field today?
There are two typical reporter questions that you haven't asked me, and I'm glad you haven't. [The first] is: but a robot can't do anything it's not programmed to do anyway, so it's totally different from us. And my answer to that is that it's an artificial distinction, I think. Because my belief is that we are machines. And I think modern science treats us as machines. When you have a course in molecular biology, it's all mechanistic, and likewise in neuroscience. So if we are machines, then at least it seems to me, in principle, there could exist machines built out of other stuff, silicon and steel maybe, which are lawfully following the laws of physics and chemistry, just as we are, but could in principle have as much free will and as much soul as we have. Whether we humans are smart enough to build such machines is a different question. Maybe we're just not smart enough. That pisses off the scientists when I say that.
Well don't physicists say that, in a way? That there may be things that our brains are just not configured to understand about the universe?
Yes. Actually, Patrick Winston, who is a professor here—I used to co-teach his AI [artificial intelligence] course many years ago—he'd always start the first lecture on artificial intelligence with the undergrads here, talking about his pet raccoon he'd had as a kid, growing up in the Midwest. And it was very dexterous with its hands. But, he said, it never occurred to him that that raccoon was going to build a robot replica of itself. The raccoon just isn't smart enough. And maybe there are flying saucers up there, with little green men or green entities from somewhere and they're looking down at my lab and saying "What, he's trying to build robot replicas of himself? Isn't that funny! He'll never make it!"
And you said there was one other [typical reporter] question.…
When? When are we going to have them in our homes? When, when, when?
Jason Calacanis announced on Friday that he was retiring from blogging. There was a very mixed reaction to the news, with most believing it to be a publicity stunt. Jason said in his farewell post that instead of blogging, he would be posting to a mailing list made up of his followers, capped at 750 subscribers. That subscriber limit was reached very quickly, and today Jason sent out his first new 'post' to that mailing list, which we have included below.
We expect that moving his posts to a mailing list will not achieve what he has set out to do: have a conversation with the top slice of his readers. Instead, his emails will likely be re-published, probably on a blog, and probably with comments and everything else.
From: "Jason Calacanis"
Date: July 13, 2008 11:16:15 AM PDT
To: email@example.com
Subject: [Jason] The fallout (from the load out)

Brentwood, California
Sunday, July 12th 11:10AM PST.
Word Count: 1,588
Jason's List Subscriber Count: 1,095
List: http://tinyurl.com/jasonslist

Team Jason,

Wow, it's been an amazing 24 hours since I officially announced my retirement from blogging (http://tinyurl.com/jasonretires). .... John Brockman explained to me at one time that some of the most interesting folks he's met have, over time, become less vocal. He explained that there was an inverse correlation between your success and your ability to tell the truth. When I met John I was nobody and I promised myself I would never, ever censor myself if I become successful. ... Comments on blogs inevitably implode, and we all accept it under the belief that "open is better!" Open is not better. Running a blog is like letting a virtuoso play for 90 minutes at Carnegie Hall, and then seconds after their performance you run to the back alley, grab the most inebriated homeless person, drag them on stage, and ask them what they think of the performance they overheard in the alley. They then take a piss on the stage and say "F-you" to the people who just had a wonderful experience for 90 or 92 minutes. That's openness for you... my how far we've come! We've put the wisdom of the deranged on the same level as the wisdom of the wise.

You and I now have a direct relationship, and I'm cutting the mailing list off today so it stays at 1,000 folks. I'll add selectively to the list, but for now I'm more interested in a deep relationship with the few of you who have chosen to make a commitment with me. Perhaps some of you will become deep, considered colleagues and friends... something that doesn't happen for me in the blogosphere any more.

Much of my inspiration for doing this comes from what I've seen with John Brockman's Edge.org email newsletter. When it enters my inbox I'm inspired and focused. I print it, and I don't print anything. The people that surround him are epic, and that's my inspiration... to be surrounded by exceptional people.

...
The game designer was one of some thirty paradigm-shifting thinkers and doers who took the stage at this year's Pop!Tech conference
by Jessie Scanlon
A Global Who's Who
Five hundred entrepreneurs, thinkers, designers, educators, and inventors attended this year's conference, which closed Saturday, and which focused on the theme of Dangerous Ideas. ...
While a glance at the Pop!Tech program suggests an eclectic, almost random assortment of interesting people—co-founder of the Global Business Network Stewart Brand and New York Times columnist Thomas Friedman...the conference held together surprisingly well, in part because one particular "dangerous idea" kept coming up again and again. ... In the kick-off session, Brian Eno, the British experimental-music pioneer and theorist, presented an idea which shocked society when it was first introduced and which, although now widely accepted, continues to reverberate through culture and business: the theory of evolution.
... Pop!Tech isn't the only one to emphasize community and the power of the network, but it walks the walk more than some. Its focus is less on high-power networking—there's no equivalent of the exclusive "Billionaire's Dinner" that publisher John Brockman hosts for TED muckety-mucks every year—and more on the network. ...
John Brockman, the straw-hatted literary agent who looks after the fortunes of the world's major science writers, has had a smart idea. He's contacted 100-odd scientists, psychologists, evolutionary biologists and laboratory-based thinkers and asked them, "What Is Your Dangerous Idea?" The results, published next month, are provocative, if not exactly scary. It seems the most alarming idea is the possibility that the laws of physics may turn out to be local phenomena - that they hold true only in certain circumstances (like, say, living on Earth, specifically in south London) but might be completely different in a potentially infinite number of different universes - and that the world is (dammit) fundamentally inexplicable to the human brain. This is called "the anthropic principle" and you'll hear it being aired at a pretentious London dinner party, any day now, by the kind of person who used to bang on about Heisenberg's Uncertainty Principle....
...My favourite Dangerous Idea, however, comes from Simon Baron-Cohen, a Cambridge psychopathologist, who suggests we try a political system based on empathy. He points out that parliaments and congresses across the world base their systems on combat, from waging war to the dirty-tricks campaigns currently enfuming the US airwaves. Isn't it time, he asks, that we tried the principle of empathising? It would mean "keeping in mind the thoughts and feelings of other people" rather than riding roughshod over them. It would mean acquiring completely different politicians and election strategies. Instead of choosing party leaders and prime ministers because of their kick-ass, "effective" leadership traits, we'd choose them for their readiness to understand other people's feelings, to ask genuinely interested questions and respond "flexibly" to different points of view.
The whiff of Sixties hippiedom and Nelson Mandela saintliness is, I'm sure, unconscious. Mr Baron-Cohen is a serious psychologist and his theory deserves sober reflection by political scientists, provided they can stop corpsing at the image of Prime Minister's Questions as a murmurous chamber of thoughtful, non-adversarial debaters, muttering, "How interesting - I never thought of it that way before," as their leader, no longer forced to behave like a stag at bay, tells the leader of the Opposition, "I wouldn't dream of arguing over this point because I know you're very sensitive to contradiction..." If media journalists joined in, Newsnight would become a Shavian dialogue with no conclusions, and Radio 4's Today a warm and fuzzy group hug in which John Humphrys and John Reid strove to find their common humanity in the maelstrom of ideas. I don't know about dangerous, but Mr Baron-Cohen's idea is certainly radical. If only I could stop thinking it's all a spoof masterminded by Simon's cousin Sacha...
Some things mix poorly: water and oil, for example. Or natural scientists and humanities scholars. Yet they are drawing closer together.
You can force natural scientists and humanities scholars together, but if you then leave them alone for a while, they tend to separate and end up neatly apart again. That, at least, is the widespread verdict on the academic landscape in Germany.
It may be, however, that the resistance is waning. A conference on the theme of "New Humanism" recently took place in Nuremberg. Scientists and interested laypeople had gathered at Nuremberg Castle to discuss the Enlightenment and atheism, Epicurus and evolution, hedonism and humanism. Remarkably, despite the "humanities" theme, many natural scientists were present.
Eckart Voland, for example. He is a biologist and professor of the philosophy of the life sciences in Giessen, and describes himself as a "lateral entrant into philosophy." Voland considers it important that natural scientists also take part in ethical discussions: "Many blueprints for society start from false premises," he says. Loving one's enemies, for example. For millennia the Bible has preached it, but Voland considers it incompatible with human nature. Man is, after all, a product of evolution, and loving one's enemies offers few advantages in the wild.
He sees another problem in the recourse to dualistic thinking, in the fact that "evolution is always declared no longer responsible as soon as mental states are at issue." Yet the brain is just as subject to evolution as any other organ. His conclusion: "The natural scientist can help describe the foundation on which any successful politics must stand." Otherwise there is the danger of "an ethics that passes the world by."
The Cologne physicist Bernd Vowinkel attended the conference "because the new technologies are, after all, encroaching on the traditional image of man." By this Vowinkel means developments such as robots, brain chips, artificial intelligence, and prostheses that enhance human capabilities. "The humanities and the natural sciences must grow together more closely," he demands.
He is not alone in this. As early as 1959 the physicist and writer Charles Percy Snow lamented that the humanities and the natural sciences were drifting apart. Snow coined the still-current phrase "the two cultures." At the same time he predicted a "third culture," a shared culture of humanities scholars and natural scientists.
In the mid-1990s the American literary agent John Brockman found this third culture. It looked different from what Snow had imagined: Brockman observed that natural scientists such as the biologist Richard Dawkins or the physicist Roger Penrose were addressing the public directly, taking on an explanatory role that literarily educated humanities scholars had previously filled. Brockman called this the third culture.
Meanwhile, however, Snow's original idea is slowly becoming reality as well. A second third culture is emerging. In Germany, humanities scholars and natural scientists are moving toward one another. The will on both sides to understand the other is growing, despite practical problems.
The biologist Josef Reichholf sees deficits in communication. Above all, "the rigid clinging to words that are only seemingly well defined" makes conversation difficult. The universities are also to blame: "In Germany, the training of young academics does not proceed across disciplines," he complains. For all that, his conclusion is optimistic: "Promising beginnings are there." Kai Kupferschmidt
Gloria Origgi on why a second language is the best antidote to intolerance
By rejecting the Lisbon Treaty, Ireland's voters may have thrown the European Union into crisis, but in a more profound way I am optimistic about Europe. A while ago, I took the train from Paris to Brussels for a meeting at the headquarters of the European Commission. The train was full of people my age - the late thirties - going to Brussels to participate in various EU projects.
I started chatting with my neighbours. Most of the people I spoke with came from more than one cultural background, with two or more nationalities in the family. All of us were at least bilingual, many trilingual or more. My neighbours epitomised the deep cultural change now taking place in Europe. A new generation has grown up, of people born more than a quarter of a century after the end of the Second World War and now moving around Europe to study and work - meeting, dating, marrying and having children with people from other European countries and doing so as a matter of course.
More and more European children are growing up multilingual. They are unlike immigrants born in one culture and having to grow up in another. They are unlike children growing up in a monolingual, monocultural family that happens to be located in a wider multicultural environment. For these children, cultural and linguistic diversity is not just a part of the society at large, it is a part of themselves, a novel kind of identity. Multilingualism is becoming an existential condition in Europe, good news for a continent in which national identities have been so powerful and have caused so much tragedy and pain in the past.
This condition also affects our cognitive life. Recent research in developmental psychology shows that bilingual children are quicker to develop an ability to understand the mental states of others. A likely interpretation of these findings is that bilingual children have a more fine-grained ability to understand their social environment and, in particular, a greater awareness that different people may represent reality in different ways. My bilingual six-year-old son makes mistakes in French and Italian but never confuses contexts in which it is more appropriate to use one language than the other.
I believe that European multilingualism will help produce a new generation of children whose tolerance of diverse cultures will be built from within, not learned as a social norm.
All this may be wishful thinking, projecting my own personal trajectory on the future of Europe. But I can't help thinking that being multilingual is the best and cheapest antidote to cultural intolerance, as well as a way of going beyond the empty label of multiculturalism by experiencing a plural culture from within. And, of course, this is not just a European issue.
As people arrived from all over the world to attend the opening weekend of the Reykjavik Arts Festival and participate in Hans-Ulrich Obrist and Olafur Eliasson’s “Experiment Marathon Reykjavik,” the mood resembled a summer camp—albeit one attended by Björk, who was on my flight from London, and the country’s president, Olafur Ragnar Grímsson. Festivities kicked off with receptions at both the president’s residence and at Reykjavik city hall, where mayor Ólafur F. Magnússon spoke with guests. Iceland’s intimate social landscape, along with its intimidating physical landscape, brought the eclectic crowd together, and it seemed that whenever someone was mentioned in conversation, they appeared just around the corner.
The marathon began Friday morning at the Reykjavik Art Museum–Hafnarhús and featured a diverse lineup including artificial-intelligence expert Luc Steels, physicist Thorsteinn Sigfússon, artists Tomas Saraceno and Hreinn Fridfinnsson, and architects Neri Oxman and David Adjaye. The most successful presentations were often the most straightforward. For example, Indian artist Abhishek Hazra plotted a sine curve by laughing and crying into crescendos of hysteria. Another highlight was the touching performance Table Piece One, in which filmmaker Jonas Mekas, his son, Sebastian, and actor-filmmaker Benn Northover ate lunch and made toasts to elves and trolls; the whole thing resembled a hall of mirrors as a giant video of Mekas shushing the audience was projected above while the performance was simulcast on a smaller screen to the side.
Left: Brian Eno. (Photo: Karl Petersson) Right: Dorrit Moussaieff with Ólafur Ragnar Grímsson, president of Iceland. (Photo: Cathryn Drake)
That evening Frida Bjork Ingvarsdóttir, culture editor of the daily Morgunbladid, improvised a cozy last-minute dinner at her home, partly in honor of her daughter, Elín Hansdóttir, whose immersive, mazelike installation was featured in the exhibition “Art Against Architecture” opening later that night at the National Gallery of Iceland. Arriving with Obrist and Eliasson, our posse was soon followed by Rebecca Solnit, writer-in-residence at the Library of Water in Stykkishólmur, as well as marathon participants John Brockman, Marina Abramovic, and Carolee Schneemann. On hearing the song “Sveitin Milli Sanda” (The Land Between the Sands), performed by Ellý Vilhjálms in the late 1950s, Abramovic proclaimed that she would use it in her performance the next day.
Later, at the National Gallery, guests lounged and swung on Monica Bonvicini’s leather and chain hammocks. Finnbogi Pétursson’s calming poetic installation used magnifiers to project quivering flames on four walls, while outside in the Tjörnin pond, the evocative Atlantis, a sunken little red house by Tea Mäkipää and Halldór Úlfarsson, squared architecture against nature—and the winner seemed clear.
Afterward, collector Ingunn Wernersdóttir led us to the gritty Hressingarskálinn restaurant, where we were serenaded by a deadpan Icelandic duo's stiff renditions of classic rock tunes. Between bites of City's Best hot dogs, designer Gudrun Lilja Gunnlaugsdóttir informed me that in Reykjavik, it's not unusual to wander into places at random, following the common philosophy that "it is about the journey, not the destination." Putting that into practice, we later stumbled into a party sponsored by I8 gallery in honor of Ernesto Neto's exhibition, where we again spotted Björk and reeled to the live music while balancing bags of greasy fish and chips.
On Saturday, our troupe flew northward by propeller plane to Akureyri (pop. 17,300), the country’s second-largest city. President Grímsson sat in the first row reading his newspaper while his Israeli-born wife, Dorrit Moussaieff, recommended her favorite Icelandic fashion designers. Arriving in the city at midday, we visited the exhibition “Facing China” at the Akureyri Art Museum, then moved on to the lovely Safnasafnid folk-art museum, where contemporary installations by “outsider” artists were juxtaposed with traditional cultural artifacts. From there, we flew on to Egilsstadir, making our way to the Eidar Art Center, where we were greeted by young dancers running about and posing in the grass, then hiked through the mud to Paul McCarthy and Jason Rhoades’s 2004 installation of a Macy’s in the middle of a field.
After a visit to the Slaughterhouse Culture Centre, we drove through the snow-covered peaks above Lake Lagarfljót, haunted by the legendary Worm monster, to Seydisfjördur, the small-town home of the Skaftfell Centre for Visual Art, founded in memory of former resident Dieter Roth. After being greeted at the door with handshakes and hugs from Gudni Gunnarsson and Lieven Dousselaere of the art collective Skyr Lee Bob, we gawked as dancer Erna Omarsdóttir growled, twitched, and scratched at the walls from within a glass room. Outside, Pétur Kristjánsson used his tractor to “Paint by Numbers,” lining up milk cartons containing various liquid foods on the pavement and running them over to create a splatter pattern, eventually moving on to crushing vacuum cleaners while children danced on the sidelines. “Welcome to Iceland,” a local resident commented.
Left: Artist Abhishek Hazra and Dr. Ruth. (Photo: Cathryn Drake) Right: Carolee Schneemann's performance. (Photo: Karl Petersson)
Bringing together art and science, the experiment marathon seemed like an inspirational DIY manual for life itself. Describing reality as a nonlinear process of input and output in which we ourselves are the instruments, Brockman noted, “You are not creating the world, you are inventing it.” In “Laughing at Leonardo,” filmmaker-composer Tony Conrad made a sort of Vitruvian Man joke using his own body as a stringed musical instrument. Brian Eno led the audience in a sing-along of “Can’t Help Falling in Love” and proposed choral singing as the key to civilization: “In a group you stop being me and start being us. I encourage you all to start your own a cappella group and change the world.” He added, “The three keys to happiness and a healthy old age are dancing, singing, and camping.”
In the end, the marathon also demonstrated that experiments can be most interesting when they fail, as when a curious collaboration between Abramovic and Dr. Ruth Westheimer was canceled due to a blowout between the two personalities. After screening a video explaining how she had been rejected by the elderly sex adviser, Abramovic led the audience in breathing exercises, then instructed everyone to hug each other. Hugs may do it for some, but it wasn’t until Sunday night’s closing party at the Blue Lagoon that our group came upon the true secret to Iceland’s famously high happiness rate: relaxing in a volcanic hot pool under the midnight sun.
Left: Dancer Erna Omarsdóttir. (Photo: Cecilia Alemani) Right: Sebastian Mekas, Jonas Mekas, and Benn Northover. (Photo: Cathryn Drake)
Left: Thorgerdur Katrín Gunnarsdóttir, Iceland's minister of culture, with the Sugarcubes's Einar Örn Benediktsson. Right: Dagur B. Eggertsson, former mayor of Reykjavík, with Ólafur F. Magnússon, mayor of Reykjavik. (Photos: Björn Blöndal)
Syracuse University professor Arthur Brooks's new book, "Gross National Happiness," advances the provocative hypothesis that conservatives are happier than liberals: "Political conservatives take the happiness prize hands down." Why? For one thing, they are more likely to be married, which generally correlates with happiness. (Although having children does not.) Also, they are more likely to be religious, which, Richard Dawkins/Christopher Hitchens/Sam Harris notwithstanding, has its own rewards.
More to the point, conservatives like things the way they are. The status quo is perfectly all right with them, although the status quo ante would be even better. Haven't you noticed that right-wing lunatics like Rush Limbaugh affect a jolly, contented tone, while left-wing lunatics like Al Franken always sound angry? Look at our current president: distanced, out of it, but smugly satisfied with his disengagement. It may be that his last day in office will be the happiest day of his life. Ours, too.
What is it with happiness, anyway? It's like being thin; everybody wants it, no one can have it. Happiness, of course, is the animal that disappears in the pursuit. "Those only are happy," John Stuart Mill wrote, "who have their minds fixed on some object other than their own happiness."
I keep an eye on the nebulous science of happiness, or "hedonometrics," which is not so unlike the nebulous science of political polling, or of bunting with men on second and third and one out. Happiness studies prove to be a full-employment program for economists, psychologists, and psychiatrists offering pabulum for people almost as miserable as they.
So who's happy? Not people in midlife, according to data extrapolated from 500,000 responses to the General Social Survey in America, and from the Eurobarometer across the Atlantic. In a paper posted on the National Bureau of Economic Research website, economists David Blanchflower of Dartmouth and Andrew Oswald of Warwick University report that "well-being reaches a minimum, on both sides of the Atlantic, in people's mid to late 40s." After that, the U-shaped index rises again.
Their paper also shows that, generation after generation, Americans (like Japanese) are becoming more unhappy. De Tocqueville knew as much more than 150 years ago: "So many lucky men, restless in the midst of abundance." With Europeans, it is the opposite; the younger ones enjoy more "well-being" than their parents.
What about the undeserving rich? Research shows that it's better to be middle class than poor. Things get complicated as you move further out on the "swinishly wealthy" axis, because $100 million doesn't buy a hundred times the pleasure of $1 million. Best-selling happiness monger ("Stumbling on Happiness") Daniel Gilbert compares accumulating wealth to eating pancakes. "The first one is delicious, the second one is good, the third OK," he told Harvard magazine. "By the fifth pancake you're at a point when an infinite number more pancakes will not satisfy you to any degree. But no one stops earning money or striving for more money."
The hedonometricians even came up with the notion of a "hedonic set point," or baseline. This is like the body weight set point, meaning that if you weigh 175 pounds now, you will probably weigh about that much for the rest of your life. Hedonically speaking: This is about as happy as you will ever be.
Harvard psychologist Nancy Etcoff has asserted that this happiness baseline notion is wrong: "Personality is much less stable than body weight, and happiness levels are even less stable than personality." So, there is an upside: A certain number of people can become more happy. But wait! "For every person who shows a substantial lasting increase in happiness, two people show a decrease," Etcoff wrote on a website called edge.org.
Suddenly, this is an ethical dilemma. For me to be happier, both you and your friend have to bum out. Of course, your being unhappy might raise my spirits. Speaking of which, I think I'll have that second Negra Modelo.
I'm OK, but you two are not. I am happy with that.
On this subject
I know, I am becoming like a broken record on the subject of Justin Cartwright. But his 2004 novel, "The Promise of Happiness," is very good. Stewart O'Nan can pull this off, too, writing about intimacy and family dynamics in a non-syrupy fashion. Hie thee to the library, or to the internets. I think you'll be happy that you did.
Descending through the clouds over Iceland, the land looks like cauliflower, or something growing in a giant petri dish. Driving from the airport, which is basically out in the wilderness a dozen or so miles from Reykjavik, the interminable rockiness of the earth becomes obvious: rock everywhere, volcanic black gnawed and gnarly masses smeared with a thin film of moss, stretching back to the horizon in incredible sliding perspective (as you drive by), before it's stopped short by a wall of squat, tempting mountains. I'm here for the Reykjavik Art Festival, which began last night, and my knee-jerk thought riding through the countryside was: how does culture, let alone a thriving triennial of visual art (this is the second after Bjorn Roth (son of Dieter) and Jessica Morgan's effort in 2005) get a toe-hold here in the midst of such overwhelming, isolating and intimidating nature?
Easy. At the packed opening reception for the festival, hosted by the Reykjavik Art Museum (a mixture of brutalist concrete and steel-and-glass elegance), Hans Ulrich Obrist speculated that Iceland is possibly the only country in the world where the president and his wife would come to a performance by Emily Wardill, the emerging London-based film artist. President Ólafur Ragnar Grímsson – a big supporter of the arts – was indeed one of those watching in the small auditorium as Wardill kicked off the crowning event of the festival, Obrist and Olafur Eliasson's Experiment Marathon. This is a new iteration of the exhilarating event – a series of presentations, performances and interactions – that was first tried out in the Serpentine pavilion during Frieze last year. (And Obrist revealed that this summer's marathon at the Serpentine will be a Manifesto Marathon – for an era without manifestos – inside Frank Gehry's pavilion.)...
..."Try saying your brain is a computer in the 1970s, and you'd get a lot of flak. Now it's old hat," said cultural entrepreneur and founder of edge.org John Brockman in an on-stage interview with Obrist. "Who we are is a changing game." Let's hope art can keep up. At the end of the short interview, Brockman quoted James Lee Byars, who is perhaps the father of this kind of polyphonous, multi-disciplinary thinking in the contemporary art world with his World Question Center (1968): "It's Einstein, Gertrude Stein, Wittgenstein and Frankenstein" – you need all four in order to think; a man can't live on art alone.
Brian Eno, up next, demonstrated how man can't live alone either. Singing helps, and we don't do enough of it. Eno has been campaigning for a compulsory five minutes of singing in English schools every day, and it looks like he's succeeding. With a small group of volunteers leading us on stage, Eno soon got everyone in the audience (which was overflowing today) happily singing 'I can't help falling in love with you' a cappella. It was a joyous, silly, profound moment. ...
...This is not the first intellectual iconoclasm of a practical science. In the early nineties, the so-called Third Culture arose under the patronage of New York literary agent John Brockman. Since then, in bestsellers and in the online magazine Edge.org, scientists have begun to conquer the realm that traditionally belonged to philosophy and theology. With enormous success, Steven Pinker destroyed the great myths of the Enlightenment with his book The Blank Slate, Daniel Dennett reduced free will to biological processes, and Richard Dawkins supported the core beliefs of millions with his onslaught against religious faith in The God Delusion. ...
As the world wages war over geographical, religious and historical turf - a growing number of big note scientists want religious faith put under the microscope. Uber philosopher of mind and popular provocateur, Daniel Dennett, author of Darwin's Dangerous Idea, is one of them. He joins Natasha Mitchell to discuss his latest controversial offering, Breaking the Spell. Be provoked...
Presented by Natasha Mitchell
Who still remembers the "third culture"? The catchphrase that the American literary agent John Brockman tried to promote into a trademark almost a decade and a half ago was a case of deceptive packaging. It picked up on the talk of two intellectual cultures put into circulation by C. P. Snow in 1959, cultures said to face each other in mutual estrangement and incomprehension. The gap between the literary-humanistic and the technical-scientific ways of thinking, the now rarely used label suggested, would be bridged by a third.
Yet the supposedly third culture had decidedly more in common with the scientific than with the humanistic way of viewing man and world, nature and society. All in all, the enterprise amounted to winning more currency, in intellectual discourse and in the shaping of public opinion, for a broadly naturalistic understanding of the world. This can also be read off the web journal "Edge", which, committed to a demanding popularization of science, is what remains of the campaign of those days (www.edge.org). Many signs, not least the spreading brain-talk, suggest that naturalism in its many varieties has indeed become a worldview of considerable momentum.
A new project from the house of Suhrkamp is also moving across the wide, fissured terrain of the two-to-three-cultures debate. The Frankfurt publisher has launched a book series that (in partnership with Spiegel Online) explicitly wants to be challenged by the "interpretive authority" that the natural sciences are said to have attained in intellectual life. The series bears the name of the house's former publisher, who died a few years ago. Siegfried Unseld did not personally distinguish himself with expeditions across that wide field of the two or more cultures, but he did at least have a soft spot for Goethe. One would not want to misread this, however, as a pointer toward a Goethean, holistic science of nature-and-mind of the "third" kind. As Ulla Unseld-Berkéwicz hints in the preview of the first program of the "edition unseld", the aim is rather dialogues and exchanges of perspective between the natural sciences and the humanities, and a discussion of the "pros and cons of a naturalistic worldview".
The first eight volumes of the new edition have been on the market since yesterday. Apart from the handy format and the name of the series, outwardly little recalls the "edition suhrkamp" founded forty-five years ago. Nor do the bright colors that change from volume to volume, for they do not fan out across the spectrum of the rainbow. As far as themes and substance are concerned, however, some of the booklets, each costing ten euros, would fit easily into the venerable "edition suhrkamp", which has nevertheless always striven for the crests of its times, and the rest might well belong in Suhrkamp's academic paperbacks ("stw"). On the other hand, there is little to object to in generating attention by multiplying series.
Number one in the "eu" series carries a line of thought by the philosopher of science Sandra Mitchell, who works in Pittsburgh. Its title is, fittingly, foundational and programmatic: "Komplexitäten. Warum wir erst anfangen, die Welt zu verstehen" ("Complexities: Why We Are Only Beginning to Understand the World"). The author argues for an "integrative pluralism" of perspectives and principles. She has no "anything goes" in mind, however. For Mitchell, scientific explanations and theories, when well supported, remain "representative pictures of the world". The philosopher merely considers mistaken the assumption, common though it is in the sciences, that there can be only one single true explanation. Depression, for example, can be grasped only as a "complex combination of biochemical, neurological, psychological and bodily states". What emerges more clearly than any answer to the question of how the interplay of different explanatory models would work in detail is the anti-reductionist impetus that fuels the scientific will to complexity.
Restrictions on research freedom
In theoretical physics, which hunts for elementary and most elementary particles and for the fundamental forces that hold the world together, reductionism naturally still has a home. For some time, however, an anti-reductionist countermovement has been making itself felt. Robert B. Laughlin placed himself at its head with his "Abschied von der Weltformel" ("Farewell to the World Formula", German edition Piper 2007). In the "edition unseld", though, the Stanford Nobel laureate is represented with a different topic. In his essay "Das Verbrechen der Vernunft" ("The Crime of Reason") he grapples with the restrictions that the so-called knowledge society imposes on the acquisition and use of knowledge, for the sake of security or of economic exploitability. It is thus about freedom of research and information, about intellectual property, patent and copyright law; and some of it, a form of complexity-increase in its own right, also gets tangled up.
Less complex, indeed more focused, is the conversation between the Frankfurt neurophysiologist Wolf Singer and Matthieu Ricard, once a molecular biologist but long since a Buddhist monk. It corresponds most closely to the new edition's mandate for dialogue. In exploring their commonalities and their differences, however, the "contemplative science" of meditation and the analytical science of brain research are at times so focused that they end up marking time.
The Munich zoologist and ecologist Josef H. Reichholf denounces the false notions of "natural" equilibria that haunt our heads. In nature, at best, there are flow equilibria in which disequilibria temporarily stabilize. His brisk plea for an "ecology of the future" operates with methodologically rather uncontrolled entanglements of social and ecological perspectives; that too may be one way of closing the gap between the cultures of thought: "Nature needs disequilibria so that something new can arise. So does society!"
Even sketchier and more speculative is a machine-storming "polemic" by Dietmar Dath, the versatile author and former "FAZ" editor. Its core diagnostic sentence runs: "We live as we live only because machines exist, but at the same time we live as if we could give no direction to what they do." Whether that direction can really be found and given by reactivating a socialist body of thought that remains vague is another question. The Paris cultural theorist Bernard Stiegler, who apparently sets store by his readers knowing that he was imprisoned for armed robbery from 1978 to 1983, likewise targets the fatal effects of technology and registers, seeming a little old-fashioned as well, the "loss of enlightenment through technology and media". Drive-governed, unfocused, weak-egoed, immature beings, he argues, are being bred by the short circuit between the psychic and the electronic apparatus.
The surprise: Descartes
Rolf Landua of the European Laboratory for Particle Physics (Cern) in Geneva has more cheering news to report from the human-machine interface. In his fictional dialogue "Am Rand der Dimensionen" ("At the Edge of the Dimensions") he explains to a visitor that while no individual can survey the vast apparatus of an accelerator, the collective, and only the collective, is "in a position to understand such a device and use it properly". Large-scale scientific research, the physicist speculates, and the reader is reminded of Teilhard de Chardin, may mark something like "the beginning of the next stage" of the cosmic "development of consciousness". That even God can find a place at Cern is therefore no surprise.
The house of the edition unseld, too, has many rooms. The master key that would open them all to the reader has, however, not been left at the gate; it remains to be found. In a garret under the roof the poet and essayist Durs Grünbein has set up his writing desk and put down on paper three "loose meditations" on Descartes. In sometimes surprising fashion he clears a path for the thought that precisely with this Renatus Cartesius, the philosopher of mind-body dualism so ill-reputed among friends of nature, the "cultural dichotomy" between poetry and science can be "playfully" overcome.
Not all of the first eight volumes of the "edition unseld" testify to a sovereign, playful intellect; not all are fully matured, and none is indispensable. Yet each one makes contact with what the moment calls for.
Napster in 1999. MySpace in 2004. YouTube in 2006. Experts from the tech community look ahead to the innovations that will change how we work, play and communicate in 2007...
All computing, all the time
John Brockman is publisher and editor of Edge (edge.org)
WE WILL SEE migration of social applications as user-generated content moves to the WiFi environment. YouTube, MySpace and multi-user games will be available on hand-held devices, wherever you go. People will carry their digital assets much like their bacteria. Israeli tech guru Yossi Vardi calls it "continuous computing."
The nanotechnology world foreseen by K. Eric Drexler arrives in the form of MEMS, or microelectronic mechanical systems. Very inexpensive moving parts will be mass-produced like semiconductors. But unlike semiconductors, they move. They will be useful for anything that employs moving parts.
Synthetic Biology pioneer George Church of Harvard University expects $3,000 personal genomics kits in stores.
"Pop Atheism" might include popular atheist TV and movie characters, professional athletes, political figures, etc. Look for the first billion-dollar IPO for the Web service that gets atheists together for "rituals," dating and political and business networking.
Rod Brooks, director of MIT's Computer Science and Artificial Intelligence Laboratory, is looking at new Web services aimed at the baby boomer age group, who realize that, in terms of IT use, they've been passed by, missing out on IM, text-messaging, MySpace, etc.
But don't put much stock in predictions. Consider that YouTube/MySpace/Napster didn't change the real world for most people very much. MySpace became TheirSpace and YouTube became TheirTube faster than you can say "2006."
This is where big brains hang out online. Its membership includes 'some of the most interesting minds in the world' debating intellectual, philosophical and artistic issues. Sounds heavy, but it's always full of wise words to steal.
Writer, editor and architect of many of recent years' scientific bestsellers, the American John Brockman recounts how the project came about: to summon a hundred brilliant minds, mostly scientists, and ask them a provocative question each year, synthesizing, in a way, contemporary thought. The answers are striking.
By Juana Libedinsky
NEW YORK — "It was July and so hot that you could fry an egg on Park Avenue. I went out to do some errands, driving around the city in an air-conditioned taxi, when I was distracted by the news on the radio: the war in Iraq was going from bad to worse; Bush was, well, being Bush (and let me clarify that among the many hundreds of science-minded thinkers I know, I can count three who are Republicans). It was then that I had the idea: the question of the year could only be 'What are you optimistic about?'"
Sitting in his magnificent office on Central Park, with the St. Patrick's Day parade going by below, John Brockman, a writer, editor and the agent behind nearly every major scientific bestseller in recent years (such as books by Richard Dawkins, Jared Diamond and Nassim Taleb, among others) talks about how the idea came about for his latest compilation entitled, obviously "What are you optimistic about?" ...
Tajik Muslims praying. Photograph: Alexei Vladykin/AP
An atmosphere of moral panic surrounds religion. Viewed not so long ago as a relic of superstition whose role in society was steadily declining, it is now demonised as the cause of many of the world's worst evils. As a result, there has been a sudden explosion in the literature of proselytising atheism. A few years ago, it was difficult to persuade commercial publishers even to think of bringing out books on religion. Today, tracts against religion can be enormous money-spinners, with Richard Dawkins's The God Delusion and Christopher Hitchens's God Is Not Great selling in the hundreds of thousands. For the first time in generations, scientists and philosophers, high-profile novelists and journalists are debating whether religion has a future. The intellectual traffic is not all one-way. There have been counterblasts for believers, such as The Dawkins Delusion? by the British theologian Alister McGrath and A Secular Age by the Canadian Catholic philosopher Charles Taylor. On the whole, however, the anti-God squad has dominated the sales charts, and it is worth asking why.
The abrupt shift in the perception of religion is only partly explained by terrorism. The 9/11 hijackers saw themselves as martyrs in a religious tradition, and western opinion has accepted their self-image. And there are some who view the rise of Islamic fundamentalism as a danger comparable with the worst that were faced by liberal societies in the 20th century.
For Dawkins and Hitchens, Daniel Dennett and Martin Amis, Michel Onfray, Philip Pullman and others, religion in general is a poison that has fuelled violence and oppression throughout history, right up to the present day. The urgency with which they produce their anti-religious polemics suggests that a change has occurred as significant as the rise of terrorism: the tide of secularisation has turned. These writers come from a generation schooled to think of religion as a throwback to an earlier stage of human development, which is bound to dwindle away as knowledge continues to increase. In the 19th century, when the scientific and industrial revolutions were changing society very quickly, this may not have been an unreasonable assumption. Dawkins, Hitchens and the rest may still believe that, over the long run, the advance of science will drive religion to the margins of human life, but this is now an article of faith rather than a theory based on evidence.
It is true that religion has declined sharply in a number of countries (Ireland is a recent example) and has not shaped everyday life for most people in Britain for many years. Much of Europe is clearly post-Christian. However, there is nothing that suggests the move away from religion is irreversible, or that it is potentially universal. The US is no more secular today than it was 150 years ago, when De Tocqueville was amazed and baffled by its all-pervading religiosity. The secular era was in any case partly illusory. The mass political movements of the 20th century were vehicles for myths inherited from religion, and it is no accident that religion is reviving now that these movements have collapsed. The current hostility to religion is a reaction against this turnabout. Secularisation is in retreat, and the result is the appearance of an evangelical type of atheism not seen since Victorian times.
As in the past, this is a type of atheism that mirrors the faith it rejects. Philip Pullman's Northern Lights - a subtly allusive, multilayered allegory, recently adapted into a Hollywood blockbuster, The Golden Compass - is a good example. Pullman's parable concerns far more than the dangers of authoritarianism. The issues it raises are essentially religious, and it is deeply indebted to the faith it attacks. Pullman has stated that his atheism was formed in the Anglican tradition, and there are many echoes of Milton and Blake in his work. His largest debt to this tradition is the notion of free will. The central thread of the story is the assertion of free will against faith. The young heroine Lyra Belacqua sets out to thwart the Magisterium - Pullman's metaphor for Christianity - because it aims to deprive humans of their ability to choose their own course in life, which she believes would destroy what is most human in them. But the idea of free will that informs liberal notions of personal autonomy is biblical in origin (think of the Genesis story). The belief that exercising free will is part of being human is a legacy of faith, and like most varieties of atheism today, Pullman's is a derivative of Christianity.
Zealous atheism renews some of the worst features of Christianity and Islam. Just as much as these religions, it is a project of universal conversion. Evangelical atheists never doubt that human life can be transformed if everyone accepts their view of things, and they are certain that one way of living - their own, suitably embellished - is right for everybody. To be sure, atheism need not be a missionary creed of this kind. It is entirely reasonable to have no religious beliefs, and yet be friendly to religion. It is a funny sort of humanism that condemns an impulse that is peculiarly human. Yet that is what evangelical atheists do when they demonise religion.
A curious feature of this kind of atheism is that some of its most fervent missionaries are philosophers. Daniel Dennett's Breaking the Spell: Religion as a Natural Phenomenon claims to sketch a general theory of religion. In fact, it is mostly a polemic against American Christianity. This parochial focus is reflected in Dennett's view of religion, which for him means the belief that some kind of supernatural agency (whose approval believers seek) is needed to explain the way things are in the world. For Dennett, religions are efforts at doing something science does better - they are rudimentary or abortive theories, or else nonsense. "The proposition that God exists," he writes severely, "is not even a theory." But religions do not consist of propositions struggling to become theories. The incomprehensibility of the divine is at the heart of Eastern Christianity, while in Orthodox Judaism practice tends to have priority over doctrine. Buddhism has always recognised that in spiritual matters truth is ineffable, as do Sufi traditions in Islam. Hinduism has never defined itself by anything as simplistic as a creed. It is only some western Christian traditions, under the influence of Greek philosophy, which have tried to turn religion into an explanatory theory.
The notion that religion is a primitive version of science was popularised in the late 19th century in JG Frazer's survey of the myths of primitive peoples, The Golden Bough: A Study in Magic and Religion. For Frazer, religion and magical thinking were closely linked. Rooted in fear and ignorance, they were vestiges of human infancy that would disappear with the advance of knowledge. Dennett's atheism is not much more than a revamped version of Frazer's positivism. The positivists believed that with the development of transport and communication - in their day, canals and the telegraph - irrational thinking would wither away, along with the religions of the past. Despite the history of the past century, Dennett believes much the same. In an interview that appears on the website of the Edge Foundation (edge.org) under the title "The Evaporation of the Powerful Mystique of Religion", he predicts that "in about 25 years almost all religions will have evolved into very different phenomena, so much so that in most quarters religion will no longer command the awe that it does today". He is confident that this will come about, he tells us, mainly because of "the worldwide spread of information technology (not just the internet, but cell phones and portable radios and television)". The philosopher has evidently not reflected on the ubiquity of mobile phones among the Taliban, or the emergence of a virtual al-Qaida on the web.
The growth of knowledge is a fact only postmodern relativists deny. Science is the best tool we have for forming reliable beliefs about the world, but it does not differ from religion by revealing a bare truth that religions veil in dreams. Both science and religion are systems of symbols that serve human needs - in the case of science, for prediction and control. Religions have served many purposes, but at bottom they answer to a need for meaning that is met by myth rather than explanation. A great deal of modern thought consists of secular myths - hollowed-out religious narratives translated into pseudo-science. Dennett's notion that new communications technologies will fundamentally alter the way human beings think is just such a myth.
In The God Delusion, Dawkins attempts to explain the appeal of religion in terms of the theory of memes, vaguely defined conceptual units that compete with one another in a parody of natural selection. He recognises that, because humans have a universal tendency to religious belief, it must have had some evolutionary advantage, but today, he argues, it is perpetuated mainly through bad education. From a Darwinian standpoint, the crucial role Dawkins gives to education is puzzling. Human biology has not changed greatly over recorded history, and if religion is hardwired in the species, it is difficult to see how a different kind of education could alter this. Yet Dawkins seems convinced that if it were not inculcated in schools and families, religion would die out. This is a view that has more in common with a certain type of fundamentalist theology than with Darwinian theory, and I cannot help being reminded of the evangelical Christian who assured me that children reared in a chaste environment would grow up without illicit sexual impulses.
Dawkins's "memetic theory of religion" is a classic example of the nonsense that is spawned when Darwinian thinking is applied outside its proper sphere. Along with Dennett, who also holds to a version of the theory, Dawkins maintains that religious ideas survive because they would be able to survive in any "meme pool", or else because they are part of a "memeplex" that includes similar memes, such as the idea that, if you die as a martyr, you will enjoy 72 virgins. Unfortunately, the theory of memes is science only in the sense that Intelligent Design is science. Strictly speaking, it is not even a theory. Talk of memes is just the latest in a succession of ill-judged Darwinian metaphors.
Dawkins compares religion to a virus: religious ideas are memes that infect vulnerable minds, especially those of children. Biological metaphors may have their uses - the minds of evangelical atheists seem particularly prone to infection by religious memes, for example. At the same time, analogies of this kind are fraught with peril. Dawkins makes much of the oppression perpetrated by religion, which is real enough. He gives less attention to the fact that some of the worst atrocities of modern times were committed by regimes that claimed scientific sanction for their crimes. Nazi "scientific racism" and Soviet "dialectical materialism" reduced the unfathomable complexity of human lives to the deadly simplicity of a scientific formula. In each case, the science was bogus, but it was accepted as genuine at the time, and not only in the regimes in question. Science is as liable to be used for inhumane purposes as any other human institution. Indeed, given the enormous authority science enjoys, the risk of it being used in this way is greater.
Contemporary opponents of religion display a marked lack of interest in the historical record of atheist regimes. In The End of Faith: Religion, Terror and the Future of Reason, the American writer Sam Harris argues that religion has been the chief source of violence and oppression in history. He recognises that secular despots such as Stalin and Mao inflicted terror on a grand scale, but maintains the oppression they practised had nothing to do with their ideology of "scientific atheism" - what was wrong with their regimes was that they were tyrannies. But might there not be a connection between the attempt to eradicate religion and the loss of freedom? It is unlikely that Mao, who launched his assault on the people and culture of Tibet with the slogan "Religion is poison", would have agreed that his atheist world-view had no bearing on his policies. It is true he was worshipped as a semi-divine figure - as Stalin was in the Soviet Union. But in developing these cults, communist Russia and China were not backsliding from atheism. They were demonstrating what happens when atheism becomes a political project. The invariable result is an ersatz religion that can only be maintained by tyrannical means.
Something like this occurred in Nazi Germany. Dawkins dismisses any suggestion that the crimes of the Nazis could be linked with atheism. "What matters," he declares in The God Delusion, "is not whether Hitler and Stalin were atheists, but whether atheism systematically influences people to do bad things. There is not the smallest evidence that it does." This is simple-minded reasoning. Always a tremendous booster of science, Hitler was much impressed by vulgarised Darwinism and by theories of eugenics that had developed from Enlightenment philosophies of materialism. He used Christian antisemitic demonology in his persecution of Jews, and the churches collaborated with him to a horrifying degree. But it was the Nazi belief in race as a scientific category that opened the way to a crime without parallel in history. Hitler's world-view was that of many semi-literate people in interwar Europe, a hotchpotch of counterfeit science and animus towards religion. There can be no reasonable doubt that this was a type of atheism, or that it helped make Nazi crimes possible.
Nowadays most atheists are avowed liberals. What they want - so they will tell you - is not an atheist regime, but a secular state in which religion has no role. They clearly believe that, in a state of this kind, religion will tend to decline. But America's secular constitution has not ensured a secular politics. Christian fundamentalism is more powerful in the US than in any other country, while it has very little influence in Britain, which has an established church. Contemporary critics of religion go much further than demanding disestablishment. It is clear that Dawkins wants to eliminate all traces of religion from public institutions. Awkwardly, many of the concepts he deploys - including the idea of religion itself - have been shaped by monotheism. Lying behind secular fundamentalism is a conception of history that derives from religion.
AC Grayling provides an example of the persistence of religious categories in secular thinking in his Towards the Light: The Story of the Struggles for Liberty and Rights That Made the Modern West. As the title indicates, Grayling's book is a type of sermon. Its aim is to reaffirm what he calls "a Whig view of the history of the modern west", the core of which is that "the west displays progress". The Whigs were pious Christians, who believed divine providence arranged history to culminate in English institutions, and Grayling too believes history is "moving in the right direction". No doubt there have been setbacks - he mentions nazism and communism in passing, devoting a few sentences to them. But these disasters were peripheral. They do not reflect on the central tradition of the modern west, which has always been devoted to liberty, and which - Grayling asserts - is inherently antagonistic to religion. "The history of liberty," he writes, "is another chapter - and perhaps the most important of all - in the great quarrel between religion and secularism." The possibility that radical versions of secular thinking may have contributed to the development of nazism and communism is not mentioned. More even than the 18th-century Whigs, who were shaken by the French Terror, Grayling has no doubt as to the direction of history.
But the belief that history is a directional process is as faith-based as anything in the Christian catechism. Secular thinkers such as Grayling reject the idea of providence, but they continue to think humankind is moving towards a universal goal - a civilisation based on science that will eventually encompass the entire species. In pre-Christian Europe, human life was understood as a series of cycles; history was seen as tragic or comic rather than redemptive. With the arrival of Christianity, it came to be believed that history had a predetermined goal, which was human salvation. Though they suppress their religious content, secular humanists continue to cling to similar beliefs. One does not want to deny anyone the consolations of a faith, but it is obvious that the idea of progress in history is a myth created by the need for meaning.
The problem with the secular narrative is not that it assumes progress is inevitable (in many versions, it does not). It is the belief that the sort of advance that has been achieved in science can be reproduced in ethics and politics. In fact, while scientific knowledge increases cumulatively, nothing of the kind happens in society. Slavery was abolished in much of the world during the 19th century, but it returned on a vast scale in nazism and communism, and still exists today. Torture was prohibited in international conventions after the second world war, only to be adopted as an instrument of policy by the world's pre-eminent liberal regime at the beginning of the 21st century. Wealth has increased, but it has been repeatedly destroyed in wars and revolutions. People live longer and kill one another in larger numbers. Knowledge grows, but human beings remain much the same.
Belief in progress is a relic of the Christian view of history as a universal narrative, and an intellectually rigorous atheism would start by questioning it. This is what Nietzsche did when he developed his critique of Christianity in the late 19th century, but almost none of today's secular missionaries have followed his example. One need not be a great fan of Nietzsche to wonder why this is so. The reason, no doubt, is that he did not assume any connection between atheism and liberal values - on the contrary, he viewed liberal values as an offspring of Christianity and condemned them partly for that reason. In contrast, evangelical atheists have positioned themselves as defenders of liberal freedoms - rarely inquiring where these freedoms have come from, and never allowing that religion may have had a part in creating them.
Among contemporary anti-religious polemicists, only the French writer Michel Onfray has taken Nietzsche as his point of departure. In some ways, Onfray's In Defence of Atheism is superior to anything English-speaking writers have published on the subject. Refreshingly, Onfray recognises that evangelical atheism is an unwitting imitation of traditional religion: "Many militants of the secular cause look astonishingly like clergy. Worse: like caricatures of clergy." More clearly than his Anglo-Saxon counterparts, Onfray understands the formative influence of religion on secular thinking. Yet he seems not to notice that the liberal values he takes for granted were partly shaped by Christianity and Judaism. The key liberal theorists of toleration are John Locke, who defended religious freedom in explicitly Christian terms, and Benedict Spinoza, a Jewish rationalist who was also a mystic. Yet Onfray has nothing but contempt for the traditions from which these thinkers emerged - particularly Jewish monotheism: "We do not possess an official certificate of birth for worship of one God," he writes. "But the family line is clear: the Jews invented it to ensure the coherence, cohesion and existence of their small, threatened people." Here Onfray passes over an important distinction. It may be true that Jews first developed monotheism, but Judaism has never been a missionary faith. In seeking universal conversion, evangelical atheism belongs with Christianity and Islam.
In today's anxiety about religion, it has been forgotten that most of the faith-based violence of the past century was secular in nature. To some extent, this is also true of the current wave of terrorism. Islamism is a patchwork of movements, not all violently jihadist and some strongly opposed to al-Qaida, most of them partly fundamentalist and aiming to recover the lost purity of Islamic traditions, while at the same time taking some of their guiding ideas from radical secular ideology. There is a good deal of fashionable talk of Islamo-fascism, and Islamist parties have some features in common with interwar fascist movements, including antisemitism. But Islamists owe as much, if not more, to the far left, and it would be more accurate to describe many of them as Islamo-Leninists. Islamist techniques of terror also have a pedigree in secular revolutionary movements. The executions of hostages in Iraq are copied in exact theatrical detail from European "revolutionary tribunals" in the 1970s, such as that staged by the Red Brigades when they murdered the former Italian prime minister Aldo Moro in 1978.
The influence of secular revolutionary movements on terrorism extends well beyond Islamists. In God Is Not Great, Christopher Hitchens notes that, long before Hizbullah and al-Qaida, the Tamil Tigers of Sri Lanka pioneered what he rightly calls "the disgusting tactic of suicide murder". He omits to mention that the Tigers are Marxist-Leninists who, while recruiting mainly from the island's Hindu population, reject religion in all its varieties. Tiger suicide bombers do not go to certain death in the belief that they will be rewarded in any postmortem paradise. Nor did the suicide bombers who drove American and French forces out of Lebanon in the 80s, most of whom belonged to organisations of the left such as the Lebanese communist party. These secular terrorists believed they were expediting a historical process from which will come a world better than any that has ever existed. It is a view of things more remote from human realities, and more reliably lethal in its consequences, than most religious myths.
It is not necessary to believe in any narrative of progress to think liberal societies are worth resolutely defending. No one can doubt that they are superior to the tyranny imposed by the Taliban on Afghanistan, for example. The issue is one of proportion. Ridden with conflicts and lacking the industrial base of communism and nazism, Islamism is nowhere near a danger of the magnitude of those that were faced down in the 20th century. A greater menace is posed by North Korea, which far surpasses any Islamist regime in its record of repression and clearly does possess some kind of nuclear capability. Evangelical atheists rarely mention it. Hitchens is an exception, but when he describes his visit to the country, it is only to conclude that the regime embodies "a debased yet refined form of Confucianism and ancestor worship". As in Russia and China, the noble humanist philosophy of Marxism-Leninism is innocent of any responsibility.
Writing of the Trotskyite-Luxemburgist sect to which he once belonged, Hitchens confesses sadly: "There are days when I miss my old convictions as if they were an amputated limb." He need not worry. His record on Iraq shows he has not lost the will to believe. The effect of the American-led invasion has been to deliver most of the country outside the Kurdish zone into the hands of an Islamist elective theocracy, in which women, gays and religious minorities are more oppressed than at any time in Iraq's history. The idea that Iraq could become a secular democracy - which Hitchens ardently promoted - was possible only as an act of faith.
In The Second Plane, Martin Amis writes: "Opposition to religion already occupies the high ground, intellectually and morally." Amis is sure religion is a bad thing, and that it has no future in the west. In the author of Koba the Dread: Laughter and the Twenty Million - a forensic examination of self-delusion in the pro-Soviet western intelligentsia - such confidence is surprising. The intellectuals whose folly Amis dissects turned to communism in some sense as a surrogate for religion, and ended up making excuses for Stalin. Are there really no comparable follies today? Some neocons - such as Tony Blair, who will soon be teaching religion and politics at Yale - combine their belligerent progressivism with religious belief, though of a kind Augustine and Pascal might find hard to recognise. Most are secular utopians, who justify pre-emptive war and excuse torture as leading to a radiant future in which democracy will be adopted universally. Even on the high ground of the west, messianic politics has not lost its dangerous appeal.
Religion has not gone away. Repressing it is like repressing sex, a self-defeating enterprise. In the 20th century, when it commanded powerful states and mass movements, it helped engender totalitarianism. Today, the result is a climate of hysteria. Not everything in religion is precious or deserving of reverence. There is an inheritance of anthropocentrism, the ugly fantasy that the Earth exists to serve humans, which most secular humanists share. There is the claim of religious authorities, also made by atheist regimes, to decide how people can express their sexuality, control their fertility and end their lives, which should be rejected categorically. Nobody should be allowed to curtail freedom in these ways, and no religion has the right to break the peace.
The attempt to eradicate religion, however, only leads to it reappearing in grotesque and degraded forms. A credulous belief in world revolution, universal democracy or the occult powers of mobile phones is more offensive to reason than the mysteries of religion, and less likely to survive in years to come. Victorian poet Matthew Arnold wrote of believers being left bereft as the tide of faith ebbs away. Today secular faith is ebbing, and it is the apostles of unbelief who are left stranded on the beach.
· John Gray's Black Mass: Apocalyptic Religion and the Death of Utopia will be out in paperback in April (Penguin)
In the last edition of John Brockman's always-provocative EDGE, Harvard MD and sociologist Nicholas Christakis talked about social networks. But instead of delving into well-trodden social network phenomena like viral videos, Christakis studies a variety of unexpected things that can spread through social networks, such as obesity, happiness, altruism, and, oddly, the taste for privacy. From the essay:
For me, social networks are like the eye. They are incredibly complex and beautiful, and looking at them begs the question of why they exist, and why they come to pass. Do we need a kind of just-so story to explain them? Do they just happen to be there, for no particular reason? Or do they serve some purpose — some ontological and also pragmatic purpose?
Along with my collaborator James Fowler, I have been wrestling with the questions of where social networks come from, what purpose they serve, what rules they follow, and what they mean for our lives. The amazing thing about social networks, unlike other networks that are almost as interesting — networks of neurons or genes or stars or computers or all kinds of other things one can imagine — is that the nodes of a social network — the entities, the components — are themselves sentient, acting individuals who can respond to the network and actually form it themselves.
As EDGE is a conversation, the new edition includes two insightful responses to Christakis's essay, from Douglas Rushkoff and Alan Alda (yes, that Alan Alda), and, finally, Christakis's response to them. Also in this EDGE edition, photos from the annual EDGE Dinner where big thinkers meet, eat, and somehow avoid being suffocated by the massive amount of smarts in the room.
ON HIS website, www.edge.org, John Brockman has been asking his contributors an annual question and publishing the results in book form. This year's question is: what are you optimistic about? The new offering collects almost 150 contributions from an array of Nobel laureates, professors, Pulitzer Prize winners and bestselling authors. Global warming, space travel, international terrorism, religious intolerance, stay-at-home dads, the increasing numbers of women in politics and other harder-to-understand medical and technological advances are some of the topics covered in this impressive book.
Each Christmas, those who know what makes me happiest usually give me the gift of knowledge in the form of a few good books. This year one of these gifts was What Are You Optimistic About?, edited by John Brockman. It contains a collection of answers by some of the world's leading scientists and thinkers to the third "annual www.edge.org question." It had me considering my own answer to the question. I also got to thinking about what answers might be given by members of the Webdiary community. So, here's my answer and then it's over to you: What are you optimistic about?
My third wish could begin to come true.
At the end of the year, Fiona Reynolds proposed that every Webdiarist have three wishes: one for the world, one for our dear ones, and one for ourselves. I reversed the order and made my third wish a wish for the world:
For all of us: An increased desire to understand and make good use of what unites us.
On reading John Brockman's collection I was delighted to see that more than one of the world's leading thinkers expressed an optimism about the prospects of what I'd wished for becoming real.
For example, David Berreby, science writer and author of Us and Them: Understanding Your Tribal Mind, explains why he's optimistic about the diminishing influence of what he calls "the zombie concept of identity", which is "the intuition that people do things because of their membership in a collective identity or affiliation". In other words, he sees signs that the incorrect assumption that people are obedient zombies who do what identity ordains is being overcome. I share Berreby's optimism that:
As we become more comfortable with the idea that people have multiple identities whose management is a complex psychological phenomenon, there will be more research on the central questions: What makes a particular identity convincing? What makes it come to the fore in a given context?
My optimism is also encouraged by Philip G. Zimbardo, Professor of Psychology emeritus at Stanford University and famous for the Stanford Prison Experiment:
In trying to understand human behavior that violates our expectations, there is a tendency to 'rush to the dispositional.' We seek to explain behavior in terms of the personal qualities of the actor. In individualistic cultures, this means searching for genetic, personality, or pathological characteristics that can be reasonably linked as explanatory constructs. It also has come to mean discounting or ignoring aspects of the behavioral context - situational variables - that may be significant contributors to behavior. Dramatists, philosophers, and historians, as well as clergy and physicians, all tend toward the dispositional and away from the situational in their views of human nature.
Social psychologists have been struggling to modify this bias toward inner determinants of behavior by creating a large body of research highlighting the importance of outer determinants. Rules, responsibility, anonymity, role-playing, group dynamics, authority pressures, and more have been shown to have a dramatic effect on individual behavior across a variety of settings.
The social psychologist Stanley Milgram's classic demonstration of blind obedience to authority showed that most ordinary Americans would follow orders given by an authority even if it led to severely harming an innocent person. My Stanford prison experiment extended this notion of situational power to demonstrate that institutional settings - prisons, schools, businesses - exert strong influences over human behavior. Nevertheless, the general public (and even intellectuals from many fields) still buys the dispositional and dismisses the situational as mere mitigating circumstance.
I am optimistic that this bias will be rebalanced in the coming year, as new research reveals that the situational focus is to an enhanced public-health model as the dispositional is to the old medical model in trying to understand and change the behavior of people in communities. The focus of public health on identifying vectors of disease can be extended to systemic vectors of health and success in place of individual ambition and personal failure or success.
This analysis will be important in meeting the challenges posed by international terrorism through new efforts to build community resilience instead of focussing on individual coping. It will also change the blame game of those in charge of various institutions and systems - from identifying the 'few bad apples' to actively trying to understand how the apple barrel is corrupting good apples. I have shown how this dispositional thinking operated in analyzing the causes of the abuse and torture at Abu Ghraib by the military and civilian chains of command. Dispositional thinking is no different than the search for evil by identifying and destroying the 'witches' in Salem. Although the foundations of such thinking run deep and wide in most of us, I am optimistic that we will acquire a more balanced perspective on how good people may turn evil and bad people can be guided toward good.
My optimism that we can make good use of the knowledge of what makes us human, and then also what unites us, is bolstered by the optimism of founder and CEO of Neoteny, Joichi Ito:
I am optimistic that open networks will continue to grow and become available to more and more people. I am optimistic that computers will continue to become cheaper and more available. I am optimistic that the hardware and software will become more open, transparent, and free. I am optimistic that the ability to create, share, and mix works will provide a voice to the vast majority of people.
I believe the Internet, open source, and a global culture of discourse and sharing will become pillars of democracy for the 21st century. Whereas those in power – as well as terrorists, who are not – have used broadcast technology and the mass media of the 20th century against the free world, I am optimistic that the Internet will enable the collective voice of the people, and that it will be a voice of reason and goodwill.
One of the most interesting developments of the last sixty years in the popularization of intellectual concerns and higher culture has been the appearance of “public intellectuals.” They are, for the most part, academics who use a variety of means of access to a wide audience to disseminate ideas that are sometimes an integral part of their expertise, and sometimes very far from their professional field.
There were, indeed, at an earlier time, occasional purveyors of scientific ideas either to a cultured public or as part of a conscious attempt to educate the working class. Thomas Henry Huxley was not only a major popularizer of Darwin for an educated English reading public in the 1860s, but also gave workingmen’s lectures on various biological questions. In pursuit of his own ideological program, J. B. S. Haldane, one of the founders of modern evolutionary genetics in the 1930s, wrote on science for the British Daily Worker. In the more conventional press, the feuilleton pages of French and Italian newspapers have long been the outlet for occasional articles on scientific and cultural issues by prominent academics. It has only been since World War II, however, that there has arisen a moderately large class of academics for whom a major preoccupation has been the popular explication and interpretation of either their body of technical knowledge or their theories about almost anything.
The rise of the public intellectual as a regular career category, bringing esoteric knowledge and overarching theories to a wide audience, as well as fame and fortune to the practitioner, began when the most esoteric science intruded itself onto the public consciousness with a very loud bang on July 16, 1945. In high school I was a typically nerdy science enthusiast, part of a small, more or less socially isolated coterie that met after school to trade Freudian interpretations of our dreams at the local soda fountain. But when the school year began in the fall of 1946 I found myself on the assembly hall platform, a public-intellectual-in-training, explaining the mysteries of nuclear physics to an audience of the entire school.
The Manhattan Project and the development of radar during World War II provided the impetus for a major reorientation of the relationship between the state and the academic world. It became obvious to policymakers like Vannevar Bush, head of the wartime Office of Scientific Research and Development, that a regular major investment in scientific research would be necessary for the future security and financial prosperity of the country and that, given the competitive demands for profit, private capital could not be adequate for the purpose. The result has been that the annual federal expenditure for research and development (in constant dollars) has been multiplied by a factor of ten since 1947. The relevance of this immense increase in the funding of science to our understanding of changes in culture is twofold.
First, universities and colleges have been a major beneficiary of the investment in science, their total share having risen …