Napster in 1999. MySpace in 2004. YouTube in 2006. Experts from the tech community look ahead to the innovations that will change how we work, play and communicate in 2007...
All computing, all the time
John Brockman is publisher and editor of Edge (edge.org)
WE WILL SEE migration of social applications as user-generated content moves to the WiFi environment. YouTube, MySpace and multi-user games will be available on hand-held devices, wherever you go. People will carry their digital assets much like their bacteria. Israeli tech guru Yossi Vardi calls it "continuous computing."
The nanotechnology world foreseen by K. Eric Drexler arrives in the form of MEMS, or microelectronic mechanical systems. Very inexpensive moving parts will be mass-produced like semiconductors. But unlike semiconductors, they move, making them useful for anything that employs moving parts.
Synthetic Biology pioneer George Church of Harvard University expects $3,000 personal genomics kits in stores.
"Pop Atheism" might include popular atheist TV and movie characters, professional athletes, political figures, etc. Look for the first billion-dollar IPO for the Web service that gets atheists together for "rituals," dating and political and business networking.
Rod Brooks, director of MIT's Computer Science and Artificial Intelligence Laboratory, is looking at new Web services aimed at the baby boomer age group, who realize that, in terms of IT use, they've been passed by, missing out on IM, text-messaging, MySpace, etc.
But don't put much stock in predictions. Consider that YouTube, MySpace and Napster didn't change the real world for most people very much. MySpace became TheirSpace and YouTube became TheirTube faster than you can say "2006."
This is where big brains hang out online. Its membership includes 'some of the most interesting minds in the world' debating intellectual, philosophical and artistic issues. Sounds heavy, but it's always full of wise words to steal.
Writer, editor and agent behind many of recent years' scientific bestsellers, the American John Brockman recounts how the project came about: summon a hundred brilliant minds, mostly scientists, and each year ask them a provocative question, synthesizing, in a way, contemporary thought. The answers are striking.
By Juana Libedinsky
NEW YORK — "It was July and so hot that you could fry an egg on Park Avenue. I went out to do some errands, driving around the city in an air-conditioned taxi, when I was distracted by the news on the radio: the war in Iraq was going from bad to worse; Bush was, well, being Bush (and let me clarify that among the many hundreds of science-minded thinkers that I know, I can count three who are Republicans). It was then that I had the idea: the question of the year could only be 'What are you optimistic about?'"
Sitting in his magnificent office on Central Park, with the St. Patrick's Day parade going by below, John Brockman, a writer, editor and the agent behind nearly every major scientific bestseller in recent years (such as books by Richard Dawkins, Jared Diamond and Nassim Taleb, among others), talks about how the idea came about for his latest compilation, entitled, obviously, "What Are You Optimistic About?" ...
Tajik Muslims praying. Photograph: Alexei Vladykin/AP
An atmosphere of moral panic surrounds religion. Viewed not so long ago as a relic of superstition whose role in society was steadily declining, it is now demonised as the cause of many of the world's worst evils. As a result, there has been a sudden explosion in the literature of proselytising atheism. A few years ago, it was difficult to persuade commercial publishers even to think of bringing out books on religion. Today, tracts against religion can be enormous money-spinners, with Richard Dawkins's The God Delusion and Christopher Hitchens's God Is Not Great selling in the hundreds of thousands. For the first time in generations, scientists and philosophers, high-profile novelists and journalists are debating whether religion has a future. The intellectual traffic is not all one-way. There have been counterblasts for believers, such as The Dawkins Delusion? by the British theologian Alister McGrath and A Secular Age by the Canadian Catholic philosopher Charles Taylor. On the whole, however, the anti-God squad has dominated the sales charts, and it is worth asking why.
The abrupt shift in the perception of religion is only partly explained by terrorism. The 9/11 hijackers saw themselves as martyrs in a religious tradition, and western opinion has accepted their self-image. And there are some who view the rise of Islamic fundamentalism as a danger comparable with the worst that were faced by liberal societies in the 20th century.
For Dawkins and Hitchens, Daniel Dennett and Martin Amis, Michel Onfray, Philip Pullman and others, religion in general is a poison that has fuelled violence and oppression throughout history, right up to the present day. The urgency with which they produce their anti-religious polemics suggests that a change has occurred as significant as the rise of terrorism: the tide of secularisation has turned. These writers come from a generation schooled to think of religion as a throwback to an earlier stage of human development, which is bound to dwindle away as knowledge continues to increase. In the 19th century, when the scientific and industrial revolutions were changing society very quickly, this may not have been an unreasonable assumption. Dawkins, Hitchens and the rest may still believe that, over the long run, the advance of science will drive religion to the margins of human life, but this is now an article of faith rather than a theory based on evidence.
It is true that religion has declined sharply in a number of countries (Ireland is a recent example) and has not shaped everyday life for most people in Britain for many years. Much of Europe is clearly post-Christian. However, there is nothing that suggests the move away from religion is irreversible, or that it is potentially universal. The US is no more secular today than it was 150 years ago, when De Tocqueville was amazed and baffled by its all-pervading religiosity. The secular era was in any case partly illusory. The mass political movements of the 20th century were vehicles for myths inherited from religion, and it is no accident that religion is reviving now that these movements have collapsed. The current hostility to religion is a reaction against this turnabout. Secularisation is in retreat, and the result is the appearance of an evangelical type of atheism not seen since Victorian times.
As in the past, this is a type of atheism that mirrors the faith it rejects. Philip Pullman's Northern Lights - a subtly allusive, multilayered allegory, recently adapted into a Hollywood blockbuster, The Golden Compass - is a good example. Pullman's parable concerns far more than the dangers of authoritarianism. The issues it raises are essentially religious, and it is deeply indebted to the faith it attacks. Pullman has stated that his atheism was formed in the Anglican tradition, and there are many echoes of Milton and Blake in his work. His largest debt to this tradition is the notion of free will. The central thread of the story is the assertion of free will against faith. The young heroine Lyra Belacqua sets out to thwart the Magisterium - Pullman's metaphor for Christianity - because it aims to deprive humans of their ability to choose their own course in life, which she believes would destroy what is most human in them. But the idea of free will that informs liberal notions of personal autonomy is biblical in origin (think of the Genesis story). The belief that exercising free will is part of being human is a legacy of faith, and like most varieties of atheism today, Pullman's is a derivative of Christianity.
Zealous atheism renews some of the worst features of Christianity and Islam. Just as much as these religions, it is a project of universal conversion. Evangelical atheists never doubt that human life can be transformed if everyone accepts their view of things, and they are certain that one way of living - their own, suitably embellished - is right for everybody. To be sure, atheism need not be a missionary creed of this kind. It is entirely reasonable to have no religious beliefs, and yet be friendly to religion. It is a funny sort of humanism that condemns an impulse that is peculiarly human. Yet that is what evangelical atheists do when they demonise religion.
A curious feature of this kind of atheism is that some of its most fervent missionaries are philosophers. Daniel Dennett's Breaking the Spell: Religion as a Natural Phenomenon claims to sketch a general theory of religion. In fact, it is mostly a polemic against American Christianity. This parochial focus is reflected in Dennett's view of religion, which for him means the belief that some kind of supernatural agency (whose approval believers seek) is needed to explain the way things are in the world. For Dennett, religions are efforts at doing something science does better - they are rudimentary or abortive theories, or else nonsense. "The proposition that God exists," he writes severely, "is not even a theory." But religions do not consist of propositions struggling to become theories. The incomprehensibility of the divine is at the heart of Eastern Christianity, while in Orthodox Judaism practice tends to have priority over doctrine. Buddhism has always recognised that in spiritual matters truth is ineffable, as do Sufi traditions in Islam. Hinduism has never defined itself by anything as simplistic as a creed. It is only some western Christian traditions, under the influence of Greek philosophy, which have tried to turn religion into an explanatory theory.
The notion that religion is a primitive version of science was popularised in the late 19th century in JG Frazer's survey of the myths of primitive peoples, The Golden Bough: A Study in Magic and Religion. For Frazer, religion and magical thinking were closely linked. Rooted in fear and ignorance, they were vestiges of human infancy that would disappear with the advance of knowledge. Dennett's atheism is not much more than a revamped version of Frazer's positivism. The positivists believed that with the development of transport and communication - in their day, canals and the telegraph - irrational thinking would wither away, along with the religions of the past. Despite the history of the past century, Dennett believes much the same. In an interview that appears on the website of the Edge Foundation (edge.org) under the title "The Evaporation of the Powerful Mystique of Religion", he predicts that "in about 25 years almost all religions will have evolved into very different phenomena, so much so that in most quarters religion will no longer command the awe that it does today". He is confident that this will come about, he tells us, mainly because of "the worldwide spread of information technology (not just the internet, but cell phones and portable radios and television)". The philosopher has evidently not reflected on the ubiquity of mobile phones among the Taliban, or the emergence of a virtual al-Qaida on the web.
The growth of knowledge is a fact only postmodern relativists deny. Science is the best tool we have for forming reliable beliefs about the world, but it does not differ from religion by revealing a bare truth that religions veil in dreams. Both science and religion are systems of symbols that serve human needs - in the case of science, for prediction and control. Religions have served many purposes, but at bottom they answer to a need for meaning that is met by myth rather than explanation. A great deal of modern thought consists of secular myths - hollowed-out religious narratives translated into pseudo-science. Dennett's notion that new communications technologies will fundamentally alter the way human beings think is just such a myth.
In The God Delusion, Dawkins attempts to explain the appeal of religion in terms of the theory of memes, vaguely defined conceptual units that compete with one another in a parody of natural selection. He recognises that, because humans have a universal tendency to religious belief, it must have had some evolutionary advantage, but today, he argues, it is perpetuated mainly through bad education. From a Darwinian standpoint, the crucial role Dawkins gives to education is puzzling. Human biology has not changed greatly over recorded history, and if religion is hardwired in the species, it is difficult to see how a different kind of education could alter this. Yet Dawkins seems convinced that if it were not inculcated in schools and families, religion would die out. This is a view that has more in common with a certain type of fundamentalist theology than with Darwinian theory, and I cannot help being reminded of the evangelical Christian who assured me that children reared in a chaste environment would grow up without illicit sexual impulses.
Dawkins's "memetic theory of religion" is a classic example of the nonsense that is spawned when Darwinian thinking is applied outside its proper sphere. Along with Dennett, who also holds to a version of the theory, Dawkins maintains that religious ideas survive because they would be able to survive in any "meme pool", or else because they are part of a "memeplex" that includes similar memes, such as the idea that, if you die as a martyr, you will enjoy 72 virgins. Unfortunately, the theory of memes is science only in the sense that Intelligent Design is science. Strictly speaking, it is not even a theory. Talk of memes is just the latest in a succession of ill-judged Darwinian metaphors.
Dawkins compares religion to a virus: religious ideas are memes that infect vulnerable minds, especially those of children. Biological metaphors may have their uses - the minds of evangelical atheists seem particularly prone to infection by religious memes, for example. At the same time, analogies of this kind are fraught with peril. Dawkins makes much of the oppression perpetrated by religion, which is real enough. He gives less attention to the fact that some of the worst atrocities of modern times were committed by regimes that claimed scientific sanction for their crimes. Nazi "scientific racism" and Soviet "dialectical materialism" reduced the unfathomable complexity of human lives to the deadly simplicity of a scientific formula. In each case, the science was bogus, but it was accepted as genuine at the time, and not only in the regimes in question. Science is as liable to be used for inhumane purposes as any other human institution. Indeed, given the enormous authority science enjoys, the risk of it being used in this way is greater.
Contemporary opponents of religion display a marked lack of interest in the historical record of atheist regimes. In The End of Faith: Religion, Terror and the Future of Reason, the American writer Sam Harris argues that religion has been the chief source of violence and oppression in history. He recognises that secular despots such as Stalin and Mao inflicted terror on a grand scale, but maintains the oppression they practised had nothing to do with their ideology of "scientific atheism" - what was wrong with their regimes was that they were tyrannies. But might there not be a connection between the attempt to eradicate religion and the loss of freedom? It is unlikely that Mao, who launched his assault on the people and culture of Tibet with the slogan "Religion is poison", would have agreed that his atheist world-view had no bearing on his policies. It is true he was worshipped as a semi-divine figure - as Stalin was in the Soviet Union. But in developing these cults, communist Russia and China were not backsliding from atheism. They were demonstrating what happens when atheism becomes a political project. The invariable result is an ersatz religion that can only be maintained by tyrannical means.
Something like this occurred in Nazi Germany. Dawkins dismisses any suggestion that the crimes of the Nazis could be linked with atheism. "What matters," he declares in The God Delusion, "is not whether Hitler and Stalin were atheists, but whether atheism systematically influences people to do bad things. There is not the smallest evidence that it does." This is simple-minded reasoning. Always a tremendous booster of science, Hitler was much impressed by vulgarised Darwinism and by theories of eugenics that had developed from Enlightenment philosophies of materialism. He used Christian antisemitic demonology in his persecution of Jews, and the churches collaborated with him to a horrifying degree. But it was the Nazi belief in race as a scientific category that opened the way to a crime without parallel in history. Hitler's world-view was that of many semi-literate people in interwar Europe, a hotchpotch of counterfeit science and animus towards religion. There can be no reasonable doubt that this was a type of atheism, or that it helped make Nazi crimes possible.
Nowadays most atheists are avowed liberals. What they want - so they will tell you - is not an atheist regime, but a secular state in which religion has no role. They clearly believe that, in a state of this kind, religion will tend to decline. But America's secular constitution has not ensured a secular politics. Christian fundamentalism is more powerful in the US than in any other country, while it has very little influence in Britain, which has an established church. Contemporary critics of religion go much further than demanding disestablishment. It is clear that they want to eliminate all traces of religion from public institutions. Awkwardly, many of the concepts they deploy - including the idea of religion itself - have been shaped by monotheism. Lying behind secular fundamentalism is a conception of history that derives from religion.
AC Grayling provides an example of the persistence of religious categories in secular thinking in his Towards the Light: The Story of the Struggles for Liberty and Rights That Made the Modern West. As the title indicates, Grayling's book is a type of sermon. Its aim is to reaffirm what he calls "a Whig view of the history of the modern west", the core of which is that "the west displays progress". The Whigs were pious Christians, who believed divine providence arranged history to culminate in English institutions, and Grayling too believes history is "moving in the right direction". No doubt there have been setbacks - he mentions nazism and communism in passing, devoting a few sentences to them. But these disasters were peripheral. They do not reflect on the central tradition of the modern west, which has always been devoted to liberty, and which - Grayling asserts - is inherently antagonistic to religion. "The history of liberty," he writes, "is another chapter - and perhaps the most important of all - in the great quarrel between religion and secularism." The possibility that radical versions of secular thinking may have contributed to the development of nazism and communism is not mentioned. More even than the 18th-century Whigs, who were shaken by the French Terror, Grayling has no doubt as to the direction of history.
But the belief that history is a directional process is as faith-based as anything in the Christian catechism. Secular thinkers such as Grayling reject the idea of providence, but they continue to think humankind is moving towards a universal goal - a civilisation based on science that will eventually encompass the entire species. In pre-Christian Europe, human life was understood as a series of cycles; history was seen as tragic or comic rather than redemptive. With the arrival of Christianity, it came to be believed that history had a predetermined goal, which was human salvation. Though they suppress their religious content, secular humanists continue to cling to similar beliefs. One does not want to deny anyone the consolations of a faith, but it is obvious that the idea of progress in history is a myth created by the need for meaning.
The problem with the secular narrative is not that it assumes progress is inevitable (in many versions, it does not). It is the belief that the sort of advance that has been achieved in science can be reproduced in ethics and politics. In fact, while scientific knowledge increases cumulatively, nothing of the kind happens in society. Slavery was abolished in much of the world during the 19th century, but it returned on a vast scale in nazism and communism, and still exists today. Torture was prohibited in international conventions after the second world war, only to be adopted as an instrument of policy by the world's pre-eminent liberal regime at the beginning of the 21st century. Wealth has increased, but it has been repeatedly destroyed in wars and revolutions. People live longer and kill one another in larger numbers. Knowledge grows, but human beings remain much the same.
Belief in progress is a relic of the Christian view of history as a universal narrative, and an intellectually rigorous atheism would start by questioning it. This is what Nietzsche did when he developed his critique of Christianity in the late 19th century, but almost none of today's secular missionaries have followed his example. One need not be a great fan of Nietzsche to wonder why this is so. The reason, no doubt, is that he did not assume any connection between atheism and liberal values - on the contrary, he viewed liberal values as an offspring of Christianity and condemned them partly for that reason. In contrast, evangelical atheists have positioned themselves as defenders of liberal freedoms - rarely inquiring where these freedoms have come from, and never allowing that religion may have had a part in creating them.
Among contemporary anti-religious polemicists, only the French writer Michel Onfray has taken Nietzsche as his point of departure. In some ways, Onfray's In Defence of Atheism is superior to anything English-speaking writers have published on the subject. Refreshingly, Onfray recognises that evangelical atheism is an unwitting imitation of traditional religion: "Many militants of the secular cause look astonishingly like clergy. Worse: like caricatures of clergy." More clearly than his Anglo-Saxon counterparts, Onfray understands the formative influence of religion on secular thinking. Yet he seems not to notice that the liberal values he takes for granted were partly shaped by Christianity and Judaism. The key liberal theorists of toleration are John Locke, who defended religious freedom in explicitly Christian terms, and Benedict Spinoza, a Jewish rationalist who was also a mystic. Yet Onfray has nothing but contempt for the traditions from which these thinkers emerged - particularly Jewish monotheism: "We do not possess an official certificate of birth for worship of one God," he writes. "But the family line is clear: the Jews invented it to ensure the coherence, cohesion and existence of their small, threatened people." Here Onfray passes over an important distinction. It may be true that Jews first developed monotheism, but Judaism has never been a missionary faith. In seeking universal conversion, evangelical atheism belongs with Christianity and Islam.
In today's anxiety about religion, it has been forgotten that most of the faith-based violence of the past century was secular in nature. To some extent, this is also true of the current wave of terrorism. Islamism is a patchwork of movements, not all violently jihadist and some strongly opposed to al-Qaida, most of them partly fundamentalist and aiming to recover the lost purity of Islamic traditions, while at the same time taking some of their guiding ideas from radical secular ideology. There is a deal of fashionable talk of Islamo-fascism, and Islamist parties have some features in common with interwar fascist movements, including antisemitism. But Islamists owe as much, if not more, to the far left, and it would be more accurate to describe many of them as Islamo-Leninists. Islamist techniques of terror also have a pedigree in secular revolutionary movements. The executions of hostages in Iraq are copied in exact theatrical detail from European "revolutionary tribunals" in the 1970s, such as that staged by the Red Brigades when they murdered the former Italian prime minister Aldo Moro in 1978.
The influence of secular revolutionary movements on terrorism extends well beyond Islamists. In God Is Not Great, Christopher Hitchens notes that, long before Hizbullah and al-Qaida, the Tamil Tigers of Sri Lanka pioneered what he rightly calls "the disgusting tactic of suicide murder". He omits to mention that the Tigers are Marxist-Leninists who, while recruiting mainly from the island's Hindu population, reject religion in all its varieties. Tiger suicide bombers do not go to certain death in the belief that they will be rewarded in any postmortem paradise. Nor did the suicide bombers who drove American and French forces out of Lebanon in the 80s, most of whom belonged to organisations of the left such as the Lebanese communist party. These secular terrorists believed they were expediting a historical process from which will come a world better than any that has ever existed. It is a view of things more remote from human realities, and more reliably lethal in its consequences, than most religious myths.
It is not necessary to believe in any narrative of progress to think liberal societies are worth resolutely defending. No one can doubt that they are superior to the tyranny imposed by the Taliban on Afghanistan, for example. The issue is one of proportion. Ridden with conflicts and lacking the industrial base of communism and nazism, Islamism is nowhere near a danger of the magnitude of those that were faced down in the 20th century. A greater menace is posed by North Korea, which far surpasses any Islamist regime in its record of repression and clearly does possess some kind of nuclear capability. Evangelical atheists rarely mention it. Hitchens is an exception, but when he describes his visit to the country, it is only to conclude that the regime embodies "a debased yet refined form of Confucianism and ancestor worship". As in Russia and China, the noble humanist philosophy of Marxism-Leninism is innocent of any responsibility.
Writing of the Trotskyite-Luxemburgist sect to which he once belonged, Hitchens confesses sadly: "There are days when I miss my old convictions as if they were an amputated limb." He need not worry. His record on Iraq shows he has not lost the will to believe. The effect of the American-led invasion has been to deliver most of the country outside the Kurdish zone into the hands of an Islamist elective theocracy, in which women, gays and religious minorities are more oppressed than at any time in Iraq's history. The idea that Iraq could become a secular democracy - which Hitchens ardently promoted - was possible only as an act of faith.
In The Second Plane, Martin Amis writes: "Opposition to religion already occupies the high ground, intellectually and morally." Amis is sure religion is a bad thing, and that it has no future in the west. In the author of Koba the Dread: Laughter and the Twenty Million - a forensic examination of self-delusion in the pro-Soviet western intelligentsia - such confidence is surprising. The intellectuals whose folly Amis dissects turned to communism in some sense as a surrogate for religion, and ended up making excuses for Stalin. Are there really no comparable follies today? Some neocons - such as Tony Blair, who will soon be teaching religion and politics at Yale - combine their belligerent progressivism with religious belief, though of a kind Augustine and Pascal might find hard to recognise. Most are secular utopians, who justify pre-emptive war and excuse torture as leading to a radiant future in which democracy will be adopted universally. Even on the high ground of the west, messianic politics has not lost its dangerous appeal.
Religion has not gone away. Repressing it is like repressing sex, a self-defeating enterprise. In the 20th century, when it commanded powerful states and mass movements, it helped engender totalitarianism. Today, the result is a climate of hysteria. Not everything in religion is precious or deserving of reverence. There is an inheritance of anthropocentrism, the ugly fantasy that the Earth exists to serve humans, which most secular humanists share. There is the claim of religious authorities, also made by atheist regimes, to decide how people can express their sexuality, control their fertility and end their lives, which should be rejected categorically. Nobody should be allowed to curtail freedom in these ways, and no religion has the right to break the peace.
The attempt to eradicate religion, however, only leads to it reappearing in grotesque and degraded forms. A credulous belief in world revolution, universal democracy or the occult powers of mobile phones is more offensive to reason than the mysteries of religion, and less likely to survive in years to come. Victorian poet Matthew Arnold wrote of believers being left bereft as the tide of faith ebbs away. Today secular faith is ebbing, and it is the apostles of unbelief who are left stranded on the beach.
· John Gray's Black Mass: Apocalyptic Religion and the Death of Utopia will be out in paperback in April (Penguin)
In the latest edition of John Brockman's always-provocative EDGE, Harvard MD and sociologist Nicholas Christakis talked about social networks. But instead of delving into well-trodden social network phenomena like viral videos, Christakis studies a variety of unexpected things that can spread through social networks, such as obesity, happiness, altruism, and, oddly, the taste for privacy. From the essay:
For me, social networks are like the eye. They are incredibly complex and beautiful, and looking at them begs the question of why they exist, and why they come to pass. Do we need a kind of just-so story to explain them? Do they just happen to be there, for no particular reason? Or do they serve some purpose — some ontological and also pragmatic purpose?
Along with my collaborator James Fowler, I have been wrestling with the questions of where social networks come from, what purpose they serve, what rules they follow, and what they mean for our lives. The amazing thing about social networks, unlike other networks that are almost as interesting — networks of neurons or genes or stars or computers or all kinds of other things one can imagine — is that the nodes of a social network — the entities, the components — are themselves sentient, acting individuals who can respond to the network and actually form it themselves. Link
As EDGE is a conversation, the new edition includes two insightful responses to Christakis's essay, from Douglas Rushkoff and Alan Alda (yes, that Alan Alda), and, finally, Christakis's response to them. Also in this EDGE edition, photos from the annual EDGE Dinner where big thinkers meet, eat, and somehow avoid being suffocated by the massive amount of smarts in the room. Link
ON HIS website, www.edge.org, John Brockman has been asking his contributors an annual question and publishing the results in book form. This year's question is: what are you optimistic about? The new offering collects almost 150 contributions from an array of Nobel laureates, professors, Pulitzer Prize winners and bestselling authors. Global warming, space travel, international terrorism, religious intolerance, stay-at-home dads, the increasing numbers of women in politics and other harder-to-understand medical and technological advances are some of the topics covered in this impressive book.
Each Christmas, those who know what makes me happiest usually give me the gift of knowledge in the form of a few good books. This year one of these gifts was What Are You Optimistic About?, edited by John Brockman. It contains a collection of answers by some of the world's leading scientists and thinkers to the third "annual www.edge.org question." It had me considering my own answer to the question. I also got to thinking about what answers might be given by members of the Webdiary community. So, here's my answer and then it's over to you: What are you optimistic about?
My third wish could begin to come true.
At the end of the year, Fiona Reynolds proposed that every Webdiarist have three wishes: one for the world, one for our dear ones, and one for ourselves. I reversed the order and made my third wish a wish for the world:
For all of us: An increased desire to understand and make good use of what unites us.
On reading John Brockman's collection I was delighted to see that more than one of the world's leading thinkers expressed an optimism about the prospects of what I'd wished for becoming real.
For example, David Berreby, science writer and author of Us and Them: Understanding Your Tribal Mind explains why he's optimistic about the diminishing influence of what he calls "the zombie concept of identity", which is "the intuition that people do things because of their membership in a collective identity or affiliation". In other words, he sees signs that the incorrect assumption that people are obedient zombies who do what identity ordains is being overcome. I share Berreby's optimism that:
As we become more comfortable with the idea that people have multiple identities whose management is a complex psychological phenomenon, there will be more research on the central questions: What makes a particular identity convincing? What makes it come to the fore in a given context?
My optimism is also encouraged by Philip G. Zimbardo, Professor of Psychology emeritus at Stanford University and famous for the Stanford Prison Experiment:
In trying to understand human behavior that violates our expectations, there is a tendency to 'rush to the dispositional.' We seek to explain behavior in terms of the personal qualities of the actor. In individualistic cultures, this means searching for genetic, personality, or pathological characteristics that can be reasonably linked as explanatory constructs. It also has come to mean discounting or ignoring aspects of the behavioral context - situational variables - that may be significant contributors to behavior. Dramatists, philosophers, and historians, as well as clergy and physicians, all tend toward the dispositional and away from the situational in their views of human nature.
Social psychologists have been struggling to modify this bias toward inner determinants of behavior by creating a large body of research highlighting the importance of outer determinants. Rules, responsibility, anonymity, role-playing, group dynamics, authority pressures, and more have been shown to have a dramatic effect on individual behavior across a variety of settings.
The social psychologist Stanley Milgram's classic demonstration of blind obedience to authority showed that most ordinary Americans would follow orders given by an authority even if it led to severely harming an innocent person. My Stanford prison experiment extended this notion of situational power to demonstrate that institutional settings - prisons, schools, businesses - exert strong influences over human behavior. Nevertheless, the general public (and even intellectuals from many fields) still buys the dispositional and dismisses the situational as mere mitigating circumstance.
I am optimistic that this bias will be rebalanced in the coming year, as new research reveals that the situational focus is to an enhanced public-health model as the dispositional is to the old medical model in trying to understand and change the behavior of people in communities. The focus of public health on identifying vectors of disease can be extended to systemic vectors of health and success in place of individual ambition and personal failure or success.
This analysis will be important in meeting the challenges posed by international terrorism through new efforts to build community resilience instead of focussing on individual coping. It will also change the blame game of those in charge of various institutions and systems - from identifying the 'few bad apples' to actively trying to understand how the apple barrel is corrupting good apples. I have shown how this dispositional thinking operated in analyzing the causes of the abuse and torture at Abu Ghraib by the military and civilian chains of command. Dispositional thinking is no different than the search for evil by identifying and destroying the 'witches' in Salem. Although the foundations of such thinking run deep and wide in most of us, I am optimistic that we will acquire a more balanced perspective on how good people may turn evil and bad people can be guided toward good.
My optimism that we can make good use of the knowledge of what makes us human, and then also what unites us, is bolstered by the optimism of founder and CEO of Neoteny, Joichi Ito:
I am optimistic that open networks will continue to grow and become available to more and more people. I am optimistic that computers will continue to become cheaper and more available. I am optimistic that the hardware and software will become more open, transparent, and free. I am optimistic that the ability to create, share, and mix works will provide a voice to the vast majority of people.
I believe the Internet, open source, and a global culture of discourse and sharing will become pillars of democracy for the 21st century. Whereas those in power – as well as terrorists, who are not – have used broadcast technology and the mass media of the 20th century against the free world, I am optimistic that the Internet will enable the collective voice of the people, and that it will be a voice of reason and goodwill.
One of the most interesting developments of the last sixty years in the popularization of intellectual concerns and higher culture has been the appearance of “public intellectuals.” They are, for the most part, academics who use a variety of means of access to a wide audience to disseminate ideas that are sometimes an integral part of their expertise, and sometimes very far from their professional field.
There were, indeed, at an earlier time, occasional purveyors of scientific ideas either to a cultured public or as part of a conscious attempt to educate the working class. Thomas Henry Huxley was not only a major popularizer of Darwin for an educated English reading public in the 1860s, but also gave workingmen’s lectures on various biological questions. In pursuit of his own ideological program, J.B.S. Haldane, one of the founders of modern evolutionary genetics in the 1930s, wrote on science for the British Daily Worker. In the more conventional press, the feuilleton pages of French and Italian newspapers have long been the outlet for occasional articles on scientific and cultural issues by prominent academics. It has only been since World War II, however, that there has arisen a moderately large class of academics for whom a major preoccupation has been the popular explication and interpretation of either their body of technical knowledge or their theories about almost anything.
The rise of the public intellectual as a regular career category, bringing esoteric knowledge and overarching theories to a wide audience, as well as fame and fortune to the practitioner, began when the most esoteric science intruded itself onto the public consciousness with a very loud bang on July 16, 1945. In high school I was a typically nerdy science enthusiast, part of a small, more or less socially isolated coterie that met after school to trade Freudian interpretations of our dreams at the local soda fountain. But when the school year began in the fall of 1946 I found myself on the assembly hall platform, a public-intellectual-in-training, explaining the mysteries of nuclear physics to an audience of the entire school.
The Manhattan Project and the development of radar during World War II provided the impetus for a major reorientation of the relationship between the state and the academic world. It became obvious to policymakers like Vannevar Bush, head of the wartime Office of Scientific Research and Development, that a regular major investment in scientific research would be necessary for the future security and financial prosperity of the country and that, given the competitive demands for profit, private capital could not be adequate for the purpose. The result has been that the annual federal expenditure for research and development (in constant dollars) has been multiplied by a factor of ten since 1947. The relevance of this immense increase in the funding of science to our understanding of changes in culture is twofold.
First, universities and colleges have been a major beneficiary of the investment in science, their total share having risen …
Daily Brew: Valuable Reasons to Check Your Kid's Closet
RacketBoy.com: They must be lying around the house somewhere. (Try your kid's closet). The rarest and most valuable Super Nintendo video games.
NYTimes.com: In the country of record debt and credit card lovin', how do Americans spend their money?
LATimes.com: The upside of pollution--all our man-made junk is giving life to a new breed of organism.
Edge.org: From the existence of ghosts to losing faith in equality, the world's top scientific thinkers change their minds on some provocative issues.
SmashingMagazine.com: 10 principles of effective web design in the age of A.D.D.
--Kevin Maney and Andrea Chalupa
In its roundup of best books of 2007, The Economist claimed that "there is something for everyone" -- but there wasn't.
There was not a single science title, which is curious, even for a business and political affairs periodical, given not only the technology-invention-business connection but also the fact that we are currently in a golden age of literary science writing.
That we are is affirmed by British science journalist Matt Ridley in his introduction to a recent collection of essays on evolution. Scientists, says Ridley, "(are) writers and their currency (is) words: poetic flights of fancy, ample use of metaphor, and personal appeals to the reader."
Many editors, reviewers and other publicists don't seem to have heard the news, however. Not only The Economist but also the Globe & Mail and the New York Times snubbed 2007's science titles.
In Britain, 2007 saw the release of Richard Mabey's well-applauded Beechcombings: The Narratives of Trees, Robert Macfarlane's The Wild Places and Roger Deakin's Wildwood: A Journey Through Trees, all of which combine autobiography, history and travel with nature literature.
These writers are actually heirs to the tradition of the Nature Poets, otherwise known as "the Romantics," and they all warn that we are still shooting the albatross by polluting, clear-cutting and overpopulating our planet.
The "romance" of science itself is their other subject, and this is just what some reviewing organs have missed, still thinking "science book" means dry, technical and difficult. Yet many readers have discovered the lyrical interdisciplinary pens of Steve Jones (Coral: A Pessimist in Paradise), J.M. Adovasio and Olga Soffer (The Invisible Sex: Uncovering the True Roles of Women in Prehistory), and Piotr Naskrecki (The Smaller Majority). The smaller majority are the 99 per cent of animal species smaller than a human finger. Naskrecki provides more than 400 images revealing the wonders of insects, worms, pond and tidepool creatures and others that live in fur, soil and plants.
New Canadian science books include the paperback edition of Candace Savage's revelatory Crows: Encounters With the Wise Guys of the Avian World and Stephen Marshall's Insects: Their Natural History and Diversity, which won the latest Canadian Science Writers' Association's "Science in Society Book Award."
Quill & Quire's Top 10 Canadian 2007 titles included only one science book, and that a practical manual rather than a literary work. Ecoholic: Your Guide to the Most Environmentally Friendly Information, Products and Services is itself recyclable right down to its binding, and that is the kind of detail it focuses on in an effort to help people take small steps toward environmental responsibility.
B.C.'s notable 2007 nature books include Operation Orca, by Daniel Francis, editor of Harbour Publishing's Encyclopedia of British Columbia (2000). This describes the two juvenile whales, Springer and Luna, who appeared off our coast in 2000 and 2001, alone and lost. Francis tells the inside stories behind the efforts to help them -- the planning of biologists and the scheming of interest groups, the arrival of tourists and media and the human jostling and bumbling that followed.
While humans squabbled, Luna met his end at a boat's propeller. Operation Orca goes from this to the sordid beginnings of whale capture off B.C. in 1965, and refers to other abuses that characterize our interaction with these fellow mammals, such as how we inundate their underwater world with the sonar booms of naval traffic and pollution from spills and industry.
Also unnerving but more poetically presented is Terry Glavin's Waiting for the Macaws, and Other Stories from the Age of Extinctions. Glavin tells us that one fifth of bird species existing 20,000 years ago are now extinct, and he makes us feel the pathos of it by describing the fate of the crested mynah, which a few decades ago flourished in Vancouver. Thanks to habitat loss, by 2003 only one pair was left, and then one of those was hit by a car "at 2nd Avenue and Columbia Street." The other kept a faithful vigil for two weeks until it too was hit by a car, "and then there were none."
In his Christmas Day sermon, the Archbishop of Canterbury praised his compatriot Richard Dawkins for expressing humanity's "amazement and awe" at nature, and urged people to treat nature with "reverence." It seems that for some, the famous long cultural war between science and the humanities can now be over, and that "science literature" can now be literature.
That is certainly the opinion of editor John Brockman, whose exhilarating science site "edge.org" profiles dozens of groundbreaking scientists by asking them an annual New Year's Big Question. This year's is "What Have You Changed Your Mind About?"
Their answers add up to, roughly, "everything." That is what science frees thinkers to do: change their theories as new evidence comes in. Most responders one way or another emphasized the ethical demands of good science, and described scientific work as subjective, dynamic and creative -- rather like the humanities, in fact.
Contemplating species extinctions, Terry Glavin emphasizes this by urging that we "reclaim the legacy of the Enlightenment" and "strengthen conditions for the diversity of living things" by preserving multiplicity and diversity in our ideas.
For a guiding principle he quotes a poet, William Blake: "Everything that lives is holy."
Barbara Julian is a freelance writer who suspects her Fairfield house might be an ecosystem.
Beechcombings, by Richard Mabey; Chatto; 304 pages; $40
Coral: A Pessimist in Paradise, by Steve Jones; Little Brown; 256 pages; $24
Crows: Encounters with the Wise Guys of the Avian World, by Candace Savage; Douglas & McIntyre; 120 pages; $19.95
Ecoholic: Your Guide to the Most Environmentally Friendly Information, Products and Services in Canada, by Adria Vasil; Vintage Canada; 333 pages; $24.95
Insects: Their Natural History and Diversity, by Stephen Marshall; Firefly Books; 736 pages; $95
The Invisible Sex: Uncovering the True Roles of Women in Prehistory, by J.M. Adovasio & Olga Soffer; HarperCollins; 320 pages; $31.95
Operation Orca: Springer, Luna and the Struggle to Save West Coast Killer Whales, by Daniel Francis and Gil Hewlett; Harbour Publishing; 266 pages; $34.95
The Smaller Majority, by Piotr Naskrecki; Belknap Press; 288 pages; $29.50
Waiting for the Macaws, and Other Stories from the Age of Extinctions, by Terry Glavin; Penguin Canada; 284 pages; $19
The Wild Places, by Robert Macfarlane; Granta; 352 pages; $40
Wildwood: A Journey Through Trees, by Roger Deakin; Hamish Hamilton; 416 pages; $25
How do you predict the future without making a fool of yourself? You can extrapolate current trends to their logical next steps, but unless you stick to the weather -- hurricanes a-comin' next year! -- you're likely to be wrong. Human beings should have been cloned by now. Gasoline should be pumping at $5 a gallon. California, to the disappointment of many, has yet to collapse into the sea along its fault lines, metaphorical or otherwise. What, then, is the point of predicting the future at all?
On the evidence of the more nuanced forecasting in "What's Next" and "What Are You Optimistic About?," looking ahead is best undertaken not as a guessing game but as a way of glimpsing humanity's most realistic yet provocative possibilities, good or bad.
For "What's Next," Jane Buckingham, the founder of the trend-forecasting consultancy The Intelligence Group, asked big-brain scientists and pop-culture achievers to think out loud about where they see things going. What's next for pro sports, says the book's first contributor, Seattle Seahawks running back Shaun Alexander, is that future sports stars will need to be much more media friendly -- well spoken, well rounded -- and that watching games will be just one part of a larger and more immersive online fan experience. Granted, those predictions are not exactly risky, but they have more authority, and interest, coming from a current star than from an ESPN chatterbox.
In soliciting essays for "What Are You Optimistic About?," literary agent and science writer John Brockman looked for pearls of hope from "today's leading thinkers" -- some of them, such as human-genome entrepreneur J. Craig Venter and "The God Delusion" author Richard Dawkins, drawn from Mr. Brockman's client list. More than one of the optimists here note that human violence is in steady decline. As bad as it is in Darfur, relatively few people in the world today are likely to die at the hands of others. The trend will almost certainly continue, says the book -- it just won't seem like that when you watch the news.
What does the future look like as a roll-up of both books' predictions and hopes? Bad news: The environment is going to get worse before it gets better. The process is a natural part of civilization growing up, says John Passacantando, Greenpeace's executive director. But even he thinks things are turning around, citing President Bush's 2006 State of the Union address ("America is addicted to oil") as a milestone. Both books predict that technological advances will cut greenhouse gases, replenish the ocean's overfished stock and move civilization forward in more sustainable ways. Unlike 40 years ago, there are no Paul Ehrlich-style predictions of overpopulation and mass starvation. The world-wide baby boom of the 20th century will subside, say today's thinkers, as developing nations' birth rates drop to match those of the industrialized world.
Both books are notably lacking in business forecasts, but Dov Seidman, founder of the business-ethics consultancy LRN, does offer an insight, in "What's Next," that happens to mesh nicely with his company's mission. In the future, he predicts, businesses will need to focus more and more on how their behavior is perceived by an increasingly networked and informed public. To outperform your competition, he says, you'll need to outbehave them in customer and partner relationships. The same holds true internally: Companies that treat employees well will steal workers away from companies that mistreat them -- as news of the abuse spreads more quickly and more widely than ever before.
Of course, the problem with envisioning the future is that one man's utopia is another's nightmare. Does a world of a million video channels on your iPhone sound exciting to you, or like a living hell of mindless dreck? Do you think stem-cell therapies will lead to better lives, or just prolong a painful and expensive process of aging and dying?
Most advances in technology or civilization can be seen as dual-use. Their goodness or badness depends on whose hands they fall into. The predictions in "What's Next" and "What Are You Optimistic About?" are most entertaining when experts see the flip side of the coin. Mr. Brockman probably wouldn't ask gossip columnist Liz Smith what she's optimistic about, but her essay in Ms. Buckingham's book would probably delight him: "You could stop the taking of pictures, the intrusions into private life, the nonstop gossip and speculation only if you stopped the democratic idea. People are always looking for their betters -- people who are richer, better looking, sexier, more athletic, more famous than themselves." In other words, Ms. Smith sees TMZ.com as progress: If we must have personality cults, better Britney than Hitler.
Not surprisingly, the most detailed predictions in both books come from information technologists. Second-guessing current trends is, after all, an integral part of their work. Taken together, the optimistic visions of several of Mr. Brockman's Net-savvy essayists seem not just wonderful but plausible: The Internet, for all it has brought so far, is only the first step before a much bigger leap in information and interconnectivity between people. One contributor to "What Are You Optimistic About?" worked briefly with the editors at Encyclopedia Britannica; they honestly believed, he claims, that they had captured nearly all the cultural information anyone could reasonably want to know. By contrast, Wikipedia's millions of entries in more than 100 languages aren't as meticulously researched and edited, but the sheer volume of information they contain is awe-inspiring and dwarfs what Britannica has on offer.
Now take it one step further: The Internet has been built and used by only a fraction of the Earth's population. What happens when, like telephones and televisions, Internet-connected computers make their way into most of the world's homes and ever more gadgets become Net-ready? Not only will we better understand our neighbors on the other side of the planet, but I may also finally be able to Google my lost car keys.
This conference is unique in its mix. For the fourth time, Internet entrepreneurs, scientists, artists, executives and bloggers gathered in picturesque Munich at the invitation of the small media mogul Hubert Burda for the DLD Conference (DLD = Digital, Life, Design), to do nothing less than save the world. Or at least to make it a "better place," as one says in English, the official conference language. The art-loving and broadly curious publisher of rather lightweight magazines such as "Focus" and "Bunte" devotes himself, together with his guests, to the really big questions of our time. And because this is Munich, the whole affair is seasoned with air-kissing glitz.
Brazilian bestseller machine Paulo Coelho recounts what inspires his esoteric-kitsch doorstoppers. Italian photographer Oliviero Toscani reveals why he paints HIV tattoos on models to move sweaters. Wikipedia founder Jimmy Wales explains why he wants to rebuild his ingenious invention, the nonprofit online encyclopedia, as a commercial version. And Sir Martin Sorrell, boss of WPP, the world's largest advertising group, explains why he would be living in Beijing today if he were 25 again. For three days the conference brought the spirit of Davos, where the World Economic Forum convenes immediately afterwards, and the eternally fresh pioneering mood of Silicon Valley to the cozy home town of laptop and lederhosen.
A seat always kept free for Hubert
In the front row a seat is always kept free for "Hubert," "Dr. Burda" or even "Professor Burda." Americans address him by his first name; Germans and his employees use the academic title of the art historian with a doctorate. At some talks his wife Maria Furtwängler, 26 years his junior, sits beside the 67-year-old. The actress, known as "Tatort" detective Charlotte Lindholm, is a moderately glamorous television star. She sits there with dignity beside her daughter, freshly coiffed and straight-backed, not at all as plainly tomboyish as in her television role. She towers over her husband. Again and again she fiddles with her BlackBerry.
The patriarch of "Hubert Burda Media" and his family hold court. Exceedingly polished young staffers swarm around Burda. A second group of the DLD team, in gray hoodies, by contrast looks visibly like the casual wing of the Google generation. Everything is photographed, filmed, blogged and put online practically live. A conference in real time: you can attend only by invitation, but you can easily follow the discussions on the Web.
When there is not enough room on the podium, the patron sometimes steps in personally. "Here is another chair," he says. In the conference's opening session of very rich old white men, Burda delights in how his children from his second marriage keep him in touch with the pulse of their generation's digital lifestyle. Anecdotes about once pinching someone else's stretch limousine in Davos prove only one thing: we are established, we are rich, and we no longer have anything to prove.
Hubert Burda treats himself to the DLD Conference every year. © Alexander von Spreti/Action Press
It must be a satisfying feeling to invite, as Burda does, a great many interesting people who then deliver a kind of studium universale, a high-level crash course in the burning issues of business, science and technology.
The disenchanter and the decoder
When Richard Dawkins, evolutionary biologist and debunker of the God delusion, and Craig Venter, first decoder of the human genome, meet, the listener feels privileged to be allowed to eavesdrop, and strains to follow trains of thought that are not exactly easy. The two thinkers agree. "Genetics has become a part of information technology," Dawkins observes. The growing understanding of the composition of our genes and their complex interplay, he says, is "the greatest revolution in the history of humanity's self-knowledge."
Craig Venter wants to email life (Craig Venter will Lebewesen e-mailen)
By Christian Stöcker
A pioneer in the field of genetics can envision a fantastic future in which genetic codes are sent by email and then reassembled as living beings at the other end. Or so Craig Venter forecast at an Internet conference in Munich. He also hopes to solve the problem of global warming—with designer microbes. ...
It is a dense network. At the annual gathering of the digital elite, organized by Burda Media in Munich, the cell phone networks barely have enough capacity. WLAN and UMTS are groaning under full load as everyone calls, surfs the Internet, and types—everywhere you look people are on their smartphones and laptops, and the crowds of BlackBerry devotees now have an iPhone handy as well.
The event is called DLD. Previously this stood for "Digital Lifestyle Day," but it is now "Digital Life, Design." The attendees are first-rate—in part because the event is so opportune: many of the international business stars to whom the publisher pays tribute in Munich will subsequently travel on to Davos for the World Economic Forum. And so this year we are running into people like Richard Dawkins and Marissa Mayer of Google in the hallways. And blog-network pioneer Jason Calacanis chatted with Wikipedia founder Jimmy Wales—oh yeah, and even Naomi Campbell will make an appearance today.
Bio-revolutionaries amidst technology fans
The excitement is palpable, latching on to topics like the new markets in India and China, social networks, and above all the mobile network. Though it is possible that this last issue seems especially urgent only because everyone is constantly trying to get on the Internet, and failing.
Amidst all the enthusiasm for technology, one conversation had more explosive potential than the talking points of all the old and new digital entrepreneurs put together. Only hardly anybody noticed. DLD is always so crowded that you have to stand for the interesting events. But when genetics entrepreneur Craig Venter and genetics revolutionary Richard Dawkins, who took on the entire religious Right with his antireligious tome The God Delusion, got up on stage yesterday to talk about a "gene-centric world view," noticeably fewer people were standing than is often the case. And this even though their talk contained by far more revolutionary statements and wild forecasts than the other presentations looking toward the future.
Venter, who last made headlines when he published his personal genome in full on the Internet, made brazen claims, but nobody reacted. Venter insisted that climate change represents a much greater risk to humanity than genetic engineering, which could actually help fight it. For example, with genetically manipulated microbes capable of absorbing CO2: "We can change the environment through genetic engineering." John Brockman, who is the literary agent of both Dawkins and Venter, had the role of moderator but let Dawkins take over. When Venter began to speak of specific genetically engineered correctives for the environment, however, Brockman abruptly woke up. Somebody once explained to him that when you talk about these subjects in Germany, "it causes an uproar—but everyone appears so calm!" And he is right.
"Life is becoming technology"
The momentum was building and, always one to provoke, Venter was on the ball. Dawkins inevitably played the role of devil's advocate, asking whether Venter considers all life to be technology. "Life is machinery," he answered, "which, as we learn how to manipulate it, becomes a technology." Dawkins, who wore shirtsleeves and an eccentric white-and-gray tie, and who came across a bit like a friendly math teacher, suddenly found himself delivering a tentative warning: the unchecked intermingling of gene pools could have unforeseen consequences. He drew a parallel to the unforeseen devastation that introducing new microbes, plants, or animal species can cause to ecosystems.
Dawkins knows what he is talking about—in the '70s he achieved fame with his book The Selfish Gene. At the start of his talk, he declared that "genes are information." From this Venter transitioned into the depiction of a future in which genetic information could be sent over email for the receiver to reassemble as a living being: "We can already reconstruct a chromosome in the laboratory." Last October the Guardian reported that Venter would soon be the first to create an entirely artificial life form—something he is accomplishing even as he speaks of a future in which genes are software and humans, at their discretion, can produce life that conforms to their wishes. The question of what happens when genes, which behave all too selfishly in Dawkins' own portrayal of them, breed freely did not come up.
While this staggering conversation between a radical genetic engineer and a mastermind of the science of genetics took place on the podium—evoking a future of artificially designed life and DNA printers that is already emerging from their current scientific revolution—directly next door a group of web entrepreneurs and venture capitalists were engaged in a heated discussion about social networks and earning opportunities. But next to the two dignified grey-haired figures onstage, they suddenly seemed a little colorless—almost even a little outdated.
Translated by Karla Taylor
Digital or biological? At the Munich futurology conference DLD (Digital Life Design) last Monday there was a moment reminiscent of the baton handoff in a relay race: after a rather sluggish discussion about the point and purpose of social platforms on the Internet, a stocky man stepped onto the stage, introduced himself as John Brockman, and announced that for the next hour the talk would be about nothing but biology.
John Brockman is not just any moderator. In late summer 2007 he hosted at his country house in Connecticut the now legendary "Life: What a Concept!" symposium, at which six pioneers of the natural sciences jointly proclaimed an entirely new scientific era: after the decoding of the human genome, it would soon be possible to write genome sequences of one's own. And with that, the biological age would dawn.
To the Munich conference Brockman had now brought, in Craig Venter, the most important mind from that earlier meeting. The American entrepreneur, molecular biologist and first decoder of the genome is the personified future of biotechnology. Not only has Venter more than doubled the number of known genes in recent years; even before the Connecticut meeting he had filed a patent on the first artificial life form ever. His Mycoplasma laboratorium is meant to become the very first artificial chromosome to reproduce through its own cell division. And "one day," in Venter's world, still means within the calendar year 2008.
Brockman's second guest was thrilled by these prospects. The British evolutionary biologist Richard Dawkins, lately best known for his books "The Selfish Gene" and "The God Delusion," invoked how seamlessly the possibilities of a "synthetic biology" fit into Darwin's theory of evolution. For Dawkins, everything from human reproduction to the laboratory creation of new microbes is one great test run of nature: uncorrectable, but above all unstoppable. In evolution's headlong rush forward, humanity has no choice anyway, and therefore need not shy away from any experiment in genetic biology.
Craig Venter, whose delight in his institute's rapid progress was plain to see, argued more cautiously. Aware of European reservations about genetic manipulation, he stressed above all the urgent necessity of accelerated interventions in nature's blueprint: human destruction of the environment has done such irreversible damage that only a flight forward can help. One day he wants to fit his artificial chromosomes with tailor-made genes that could, for example, simply suck the surplus carbon dioxide out of the air or convert light into hydrogen.
Venter focht seine Sache gut, prangerte die restriktiven Gen-Gesetzgebungen vieler Nationen an und beschrieb die Zukunft detailliert als Selektionsvorgang, der wenigstens nicht ganz so chaotisch wie bisher ablaufen müsse. Zur Einführung hatte Conferencier Brockman noch im Scherz postuliert, dass sich mit Venters Forschungen jede Hauskatze schon bald in einen Haushund verwandelt werden könne - Venter distanzierte sich scharf von Manipulationen an Säugetieren und sprach ausschließlich von Eingriffen im Molekularbereich.
Verständlicherweise aber wollte er sich nicht einmal auf dieser Ebene zum Herrn über eigene Kreaturen abstempeln lassen. Angesichts unzähliger sich ununterbrochen transformierender Lebewesen sei jeder Schöpfergedanke bloße Mystifikation. Lachend verbeugte er sich vor Dawkins religionskritischer Polemik "Der Gotteswahn": Wo es keinen Gott gebe, könne man auch nicht Gott spielen.
Das Netz ist dicht. Bei der jährlichen Versammlung der digitalen Eliten, die Burda Media in München organisiert, ist kaum noch Platz in den Funkzellen. W-Lan und UMTS ächzen unter Vollast, überall wird telefoniert, gesurft und getippt, auf jedem Stuhl sitzt jemand mit einem Smartphone oder Laptop, und so mancher habituelle Blackberry-User hat jetzt zusätzlich auch noch ein iPhone dabei.
DLD heißt die Veranstaltung. Früher stand das für "Digital Lifestyle Day", inzwischen für "Digital, Life, Design". Sie ist hochkarätig besetzt - nicht nur, aber auch deshalb, weil sie günstig liegt: Viele der internationalen Business-Stars, die dem Verleger hier die Ehre erweisen, reisen anschließend gleich weiter nach Davos zum Weltwirtschaftsforum. So kann man dieses Jahr Menschen wie Richard Dawkins und Marissa Mayer von Google auf dem Gang treffen. Und Jason Calacanis, der das Profi-Bloggen erfand, diskutiert mit Wikipedia-Gründer Jimmy Wales - ach ja, auch Naomi Campbell soll heute noch vorbeischauen.
Bio-Revolutionär inmitten von Technikfans
Die Erregung ist groß, Themen sind die neuen Märkte in Indien und China, immer noch Social Networks und vor allem das mobile Netz. Vielleicht erscheint letzteres Thema manchem auch deshalb so dringlich, weil fast alle Anwesenden ständig versuchen, ins Internet zu kommen und immer wieder daran scheitern.
Inmitten all der Technologiebegeisterung findet dann ein Gespräch statt, das mehr Sprengkraft birgt als all die Pläne der alten und neuen digitalen Unternehmer zusammen - aber das merkt kaum jemand. Es ist immer zu voll beim DLD, bei interessanten Veranstaltungen muss man stehen. Aber als der Gen-Unternehmer Craig Venter und der Gen-Revolutionär Richard Dawkins, der mit seinem religionskritischen Wälzer "Der Gotteswahn" gerade die gesamte religiöse Rechte gegen sich aufgebracht hat, gestern gemeinsam auf die Bühne gingen, um über ein "gen-zentrisches Weltbild" zu sprechen, stehen weniger als sonst. Und das, obwohl in dieser Veranstaltung revolutionärere Sätze gesagt und wildere Prognosen formuliert werden als sonst auf dieser an Zukunftsvisionen nicht armen Veranstaltung.
Venter, der zuletzt Schlagzeilen machte, als er sein persönliches Genom vollständig ins Netz stellte, sagt ständig Ungeheuerliches - aber keiner reagiert. Der Klimawandel, so Venter, sei eine viel größere Bedrohung für die Menschheit als die Gentechnik. Die aber könne dagegen helfen: Mit genmanipulierten Mikroben, die CO2 fressen zum Beispiel: "Wir können die Umwelt durch gezielte Gestaltung verändern." John Brockman, der als Literaturagent sowohl Dawkins als auch Venter unter Vertrag hat, soll eigentlich moderieren, überlässt das aber dann doch weitgehend Dawkins. Als Venter dann vom gezielten gentechnischen Gestalten der Umwelt spricht, wacht er kurz auf. Man habe ihm einmal erklärt, wenn man in Deutschland solche Themen anschneide, "dann gibt es einen Aufstand - aber Sie scheinen alle so ruhig!". Und er hat recht.
"Leben wird zu Technologie"
Aufregung will sich einfach nicht einstellen, also setzt Venter - wie immer ganz Provokateur - noch einen drauf. Dawkins - notgedrungen nun in der Rolle des Advocatus Diaboli - fragt, ob Venter denn alles Leben als Technologie betrachte. "Das Leben ist Maschinerie", antwortet der, "und während wir lernen, es zu beeinflussen, wird es zu Technologie". Dawkins, der in Hemdsärmeln und mit einer sehr eigenwillig gemusterten weiß-grauen Krawatte ein bisschen wirkt wie ein freundlicher Mathematiklehrer, sieht sich nun doch zu einer zaghaften Warnung genötigt: Das wilde Vermischen von Genpools könne unabsehbare Folgen haben. Er zieht eine Parallele zu eingeschleppten Mikroben, Pflanzen oder Säugetierarten, die in unvorbereiteten Ökosystemen Verheerendes anrichten können.
Dawkins weiß, wovon er redet - er ist in den Siebzigern mit einem Buch namens "Das egoistische Gen" berühmt geworden. Zu Anfang des Gesprächs hat er gesagt: "Gene sind Information." Darauf aufbauend skizziert Venter nun eine Zukunft, in der genetische Information per E-Mail verschickt und beim Empfänger wieder zu einem Lebewesen zusammengebaut werden kann: "Wir können ein Chromosom jetzt schon im Labor rekonstruieren." Dem "Guardian" hatte Venter schon im vergangenen Oktober berichtet, er werde demnächst das erste vollständig künstliche Lebewesen erschaffen - nun führt er aus, wie er sich diese Zukunft vorstellt, in der Gene Software sind und Menschen nach Gutdünken Lebewesen nach ihren Wünschen schaffen. Was passiert, wenn diese Zuchtgene sich, frei nach Dawkins, allzu egoistisch verhalten sollten, bleibt ungefragt.
Während auf dem Podium diese unerhörte Konversation stattfindet, während ein gentechnologisch Radikaler und ein Vordenker der sich gerade vollziehenden wissenschaftlichen Revolution eine Zukunft inmitten von Designer-Wesen und DNA-Druckern ausmalen, unterhalten sich direkt daneben einige Web-Unternehmer und Venture-Kapitalisten lautstark über soziale Netzwerke und Verdienstmöglichkeiten. Neben den in Ehren ergrauten Herren auf der Bühne, sind sie es, die in diesem Moment ein bisschen farblos wirken - fast schon ein wenig gestrig.
IN a couple of days, Obama mania will reach new heights.
The US President-elect will gaze across to the Lincoln Memorial in Washington and deliver his inaugural speech, grandly titled the New Birth of Freedom.
The speech will certainly contain multiple references to change and hope for a better world.
It will undoubtedly be an event of monumental historical significance - nothing can match a US presidential inauguration for star-studded razzmatazz and fulsome displays of faith. But will anything really change?
Possibly. The cynics may disagree, but Barack Obama seems capable of inspiring the world right now. He reaches out to something deep-seated in human nature - the need to believe, hope and love.
Obama's job won't be easy. In the words of writer Ron Rolheiser, we are a culture rich in everything except clarity.
We are drowning in information, discoveries, competing ideologies, values and personal options. Our psyches and souls are shaped by an explosion of technology and information that renders almost everything we learn obsolete almost as soon as we learn it. Nothing seems permanent.
Anyone who watches Oprah or Jerry Springer knows the culture - long on openness, but short on trust.
We are a world suffering allergies. About a third of us are allergic to cat fur, peanuts, dust mites, seafood, selected chemicals or something else. There's a lot to fear.
The Edge, a website that regularly poses big questions, recently asked a select group of thinkers: What will change everything?
The scientists, philosophers and writers came up with some interesting answers.
Some argued that everything would change with the invention of cheap and powerful artificial intelligence that would improve itself.
Others opted for advances in molecular technology, discovery of intelligent life elsewhere, an end to war and human misery, mastering death, accidental nuclear war, a web-powered revolution, the breakdown of all computers and the invention of a laptop quantum computer.
A playwright suggested nothing needed to happen to bring about change; real changes, he said, had always happened, and always would.
Actor Alan Alda said: "I find it hard to believe that anything will change everything. The only exception might be if we suddenly learned how to live with one another. But, does anyone think that will come about in a foreseeable lifetime?
"Even if we were visited by weird little people from another planet and were forced to band together, I doubt if it would be long before we'd find ways to break into factions again, identifying those among us who are not quite people."
American author and philosopher Sam Keen believes real change comes when we thoughtfully question our existence.
Keen calls himself a recovering Presbyterian and a trustful agnostic. He wears a question mark rather than a cross around his neck.
He believes the path of spirituality is not the path of religion. Religion begins with the answers, but spirituality begins with the questions.
In his view, you never arrive at the end of this journey. Human life is a journey whose end is not in sight.
Keen says to maintain our sanity in today's world, we all need a spiritual bulldust detector.
"In a world of cults, gurus, and self-help programs, we need to be mindful of how accepted beliefs often get in the way of true understanding," he says. As he sees it, real wisdom is born of "epistemological humility", of bewilderment in the face of life's enduring mysteries.
Keen recognises a worldwide longing for answers that cannot be satisfied by traditional religion. And the statistics seem to confirm his view.
Church attendances are down, but spiritual searchers - those who want something more than paying lip service to God or attending a church on the weekend - are increasing.
"The spiritual craving of our time is triggered by the perennial human need to connect with something that transcends the fragile self, to surrender to something bigger and more lasting than our brief moment in history," he writes in his book Hymns to an Unknown God.
"Spirituality is in," he writes. "Millions who have become disillusioned with a secular view of life, but are unmoved by established religion in any of its institutional forms, are setting out on a quest for something - some missing value, some absent purpose, some new meaning, some presence of the sacred."
History Shows That Famous Thinkers Also Get It Wrong. And They Admit It
Cover Story, Sunday Magazine
One hundred and sixty-five eminent thinkers, researchers, and communicators, at the annual request of the edge.org website, answered the following question: "What Have You Changed Your Mind About? Why?"
From particle physics to evolutionary theory, from the atomic bomb to global warming, from the battle of the sexes to the equality of human beings, from God and the paranormal to the dogmatism of scientists themselves, dozens of the world's big thinkers explained online, at the start of 2008, the most important things they have changed their minds about during their lives.
The project takes place on the website www.edge.org, a kind of informal think tank, a forum for ideas and scientific debates (see adjoining article), which asks such questions annually online and later publishes the result in book form.
Many of the names here are well known to the interested public—the physicist Freeman Dyson, the "genome decoder" Craig Venter, the biologist Richard Dawkins (author of the controversial book The God Delusion), the Nobel laureate physicist Leon Lederman. Other participants, such as the actor Alan Alda or the musician Brian Eno, may be surprising inclusions, but are just as interesting. And there are a number of science journalists as well, including Steve Connor of the Independent, Roger Highfield of the Telegraph, and Philip Campbell, editor of Nature. The following are some examples of the ideas that they are re-evaluating.
The atomic bomb won the war
Freeman Dyson, renowned physicist and mathematician, Institute for Advanced Study, Princeton
I changed my mind about an important historical question: did the nuclear bombings of Hiroshima and Nagasaki bring World War Two to an end? Until this year I used to say, perhaps. Now, because of new facts, I say no.
We have stopped evolving
Steven Pinker, experimental psychologist, Harvard University
Ten years ago I wrote, "Are we still evolving? Biologically, probably not much." The completion of the Human Genome Project was still several years away. But new results have suggested that thousands of genes, perhaps as much as ten percent of the human genome, have been under strong recent selection, and the selection may even have accelerated during the past several thousand years. Currently, evolutionary psychology assumes that any adaptation to post-agricultural ways of life is 100% cultural. If these results hold up, and apply to psychologically relevant brain function, then that simplifying assumption might have to be reconsidered.
The paranormal exists
Susan Blackmore, psychologist, consultant to the journal Skeptical Inquirer
When I was a student at Oxford in 1970, I became fascinated with occultism, mediumship and the paranormal. I did the experiments. I tested telepathy, precognition, and clairvoyance; I got only chance results. I trained fellow students in imagery techniques and tested them again; chance results. I tested twins in pairs; chance results. I worked in play groups and nursery schools with very young children (their naturally telepathic minds are not yet warped by education, you see); chance results. I trained as a Tarot reader and tested the readings; chance results. I was lying in the bath trying to fit my latest null results into paranormal theory, when it occurred to me for the very first time that I might have been completely wrong, and my tutors right. Perhaps there were no paranormal phenomena at all. I had hunted ghosts and poltergeists, trained as a witch, attended spiritualist churches, and stared into crystal balls. But all of that had to go. Once the decision was made it was actually quite easy.
We are all equal
Simon Baron-Cohen, psychologist, Autism Research Center, Cambridge University
When I was young I believed in equality as a guiding principle in life. My mind has been changed. I still believe in some aspects of the idea of equality, but I can no longer accept the whole package. Striving to give people equality of social opportunity is still a value system worth defending, but we have to accept that equality has no place in the realm of biology.
The obligation of a scientist to do science
Leon Lederman, Nobel Laureate in Physics (author of The God Particle)
I have always believed that the scientist’s most sacred obligation is to continue to do science. Now I know that I was dead wrong. I am driven to the ultimately wise advice of my Columbia mentor, I.I. Rabi, who, in our many corridor bull sessions, urged his students to run for public office and get elected. He insisted that to be an advisor (he was an advisor to Oppenheimer at Los Alamos, later to Eisenhower and to the AEC) was ultimately an exercise in futility and that the power belonged to those who are elected. Then, we thought the old man was bonkers. But today... A Congress which is overwhelmingly dominated by lawyers and MBAs makes no sense in this 21st century in which almost all issues have a science and technology aspect.
Men are at the top because they are smarter
Helena Cronin, philosopher, London School of Economics
I used to think that these patterns of sex differences resulted mainly from average differences between men and women in innate talents, tastes and temperaments. After all, in talents men are on average more mathematical, more technically minded, women more verbal; in tastes, men are more interested in things, women in people; in temperaments, men are more competitive, risk-taking, single-minded, status-conscious, women far less so. But I have now changed my mind. It is not a matter of averages, but of extremes. Females are much of a muchness, clustering round the mean. But, among males, the variance—the difference between the most and the least, the best and the worst—can be vast. So males are almost bound to be over-represented both at the bottom and at the top. I think of this as 'more dumbbells but more Nobels'.
It is possible to unify the forces of physics
Marcelo Gleiser, Brazilian physicist and astronomer, Dartmouth College
I was always fascinated by the idea of unification of the forces of nature. I wrote dozens of papers related to the subject of unification, even my Ph.D. dissertation was on the topic. I was fascinated by the modern approaches to the idea, supersymmetry, superstrings, a space with extra, hidden dimensions. A part of me still is. But then, a few years ago, I started to doubt unification, finding it to be the scientific equivalent of a monotheistic formulation of reality, a search for God revealed in equations. Of course, had we the slightest experimental evidence in favor of unification, of supersymmetry and superstrings, I'd be the first popping the champagne open. But it's been over twenty years, and all attempts so far have failed.
Global warming is not an urgent problem
Craig Venter, human genome decoder, J. Craig Venter Institute
Like many or perhaps most I wanted to believe that our oceans and atmosphere were basically unlimited sinks with an endless capacity to absorb the waste products of human existence. I wanted to believe that solving the carbon fuel problem was for future generations and that the big concern was the limited supply of oil not the rate of adding carbon to the atmosphere. The data is irrefutable. We are conducting a dangerous experiment with our planet. One we need to stop. Now.
Humans emerged because they began to eat meat
Richard Wrangham, British anthropologist, student of Jane Goodall, Harvard University
I used to think that human origins were explained by meat-eating. But I now think that cooking was the major advance that made us human. Cooked food allows our guts, teeth and mouths to be small, while giving us abundant food energy and freeing our time. Cooked food, of course, requires the control of fire; and a fire at night explains how Homo erectus dared sleep on the ground. So, in a roast potato and a hunk of beef we have a new theory of what made us human.
Races do not exist
Mark Pagel, evolutionary biologist, Reading University
There is an overbearing censorship to the way we are allowed to think and talk about the diversity of people on Earth. Officially we are all the same: there are no races. Flawed as the old ideas about race are, modern genomic studies reveal a surprising, compelling and different picture of human genetic diversity. What this all means is that, like it or not, there may be many genetic differences among human populations—including differences that may even correspond to old categories of 'race'—that are real differences in the sense of making one group better than another at responding to some particular environmental problem. This in no way says one group is in general 'superior' to another, or that one group should be preferred over another. But it warns us that we must be prepared to discuss genetic differences among human populations.
John Brockman intersects the cultures
Edge: brilliant, essential and addictive
Edge is a bimonthly newsletter and a website, a singular publication run by the American John Brockman, a literary agent whose clients form a constellation of world-famous scientists (most, but not all, from the Anglo-Saxon world). Brockman, born in Boston in 1941, now resides in New York. He is the author and editor of 19 books, including The Third Culture: Beyond the Scientific Revolution.
Brockman writes in his presentation of the site that the "traditional intellectual", i.e. one with a 1950s education "in Freud, Marx, and modernism" no longer has sufficient qualifications to be a thinking person in the world today. One cannot be just a "literary intellectual"—that self-defined term used in the 1930s by "men of letters" to the exclusion of scientists such as Einstein, Bohr, and Heisenberg. "The traditional American intellectuals are, in a sense, increasingly reactionary, and quite often proudly (and perversely) ignorant of many of the truly significant intellectual accomplishments of our time", he says.
"The third culture" is defined by Brockman as consisting of "those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are." The mandate of the Edge Foundation is "to promote inquiry into and discussion of intellectual, philosophical, artistic, and literary issues, as well as to work for the intellectual and social achievement of society."
The online world of Edge clearly benefits from a suspension of the fear of not being politically correct or addressing issues that are not the specialty of the participant. All the invited participants play the game, presenting controversial ideas, confessing doubts, casting proposals for the future. "There is no canon of acceptable ideas" notes Brockman. "The strength of the Third Culture is precisely that it can tolerate those disagreements." The result of this ambitious venture, for those who have already experienced navigating the web pages of edge.org, is not only brilliant, but addictive. It interprets, it interrogates, it provokes. Each text can be a world in itself.
Though still little known to the broader European public, Edge has become, as a glance at the list of periodical articles referenced on the website's press page shows, an indispensable point of passage for all, specialists and fans alike, who like to perceive and reflect on the great scientific, social, cultural, and policy questions shaped by the arguments of these "new intellectuals", who work and think "at the edge of the world's knowledge" (Brockman's words, of course). Ana Gerschenfeld
The film adaptation of English writer Ian McEwan’s prize-winning novel Atonement opened last month to widespread critical acclaim. Winners of the Golden Globes will be announced this weekend, and Atonement sits on top of the field, with the most nominations of any film. Isaac Chotiner spoke with McEwan about letting go, growing up, and why atheists need to speak out.
Was it hard to watch Atonement be adapted to film by other people? Did you feel possessive?
I’m fairly used to the process. I think this is the fifth or sixth of my stories or novels that have been made into films. I’m sure I’d be possessive if I allowed myself to get involved in the writing of the script. There’s a lot to be said for not doing that. I did it once with The Innocent and John Schlesinger, and it was a fairly difficult process because everyone--the director, the designers, actors, everyone--had their own ideas and came piling in. And you are suddenly knocked off your perch as the God in this machine. It is better to have someone take a free run at it. But I can’t quite walk away, so I like to stay involved. I like film sets, and I enjoy the collaborative process. I’m not sure if I had the worst of both worlds or the best.
One of the great things about the book is the way you get inside the head of Briony Tallis, a 13-year-old girl. Were you worried that film is a medium in which it is harder to get inside a character’s head?
Well, it is impossible for a movie to give you what a novel can give you, which is the flavor of rolling thoughts and consciousness. But you have to do the best with what you’ve got, which with movies is a high dependence on actors to somehow let us feel the illusion that we can follow a thought process. And I think the casting of Briony with Saoirse Ronan was really astute. She is a very watchful girl, a completely intuitive young actress.
Earlier in your career, you were known as “Ian Macabre.” Though there is less of what you call the darkness and violence that marked your stories 25 years ago, your newer work still has a level of intensity and discomfort. I’m thinking particularly of the sex scene in your latest novel, On Chesil Beach.
Some of the dark-hearted stuff from those short stories still lives on, whether it is the beginning of Enduring Love or the scene toward the end of Saturday or even elements of Atonement. But it is bound to change. One passes the usual milestones in life: You have children, you find that whether you like it or not, you have a huge investment in the human project somehow succeeding. You become maybe a little more tolerant as you get older. Pessimism begins to feel something like a badge that you perhaps do not wear so easily. There is something delicious and reckless about the pessimism of being 21. And when you get older you feel maybe a little more delicate and hope that things will flourish. You don’t want to take a stick to it.
I want to read you a quote from James Wood in The New Yorker about Philip Roth’s latest book: “How much of any self is pure invention? Isn't such invention as real to us as reality? But then how much reality can we bear? Roth knows that this kind of inquiry, far from robbing his fiction of reality, provokes an intense desire in his readers to invest his invented characters with solid reality.” A lot of Atonement is about the question of what is real in fiction, and I was curious for your thoughts about literary realism these days.
The kind of fiction I like and the kind of fiction I most often want to write does have its feet on the ground of realism, certainly psychological realism. I have no interest in magical realism and the supernatural--that is really an extension, I guess, of my atheism. I think that the world, as it is, is so difficult to capture that some kind of enactment of the plausibly shared reality that we inhabit is a very difficult task. But it is one that fascinates me. I have just re-read a couple of Saul Bellow novels, Mr. Sammler’s Planet and The Dean’s December. I really get a thrill from his engagement with the momentous task of what it is like to be in the 20th century in Chicago or even Bucharest, what the condition is, what it’s like, how it is now. This is something that modernism shied away from--the pace of things, the solid achievement of weight in your hand. So I remain rather committed to that. But also to what is psychologically real--the small print of consciousness, the corners and vagaries of thinking that when you read them in another writer, and they are done well, you just know they are right. Not only because you had this thought to yourself, but because that way of thinking seems so ineradicably human.
You mentioned Bellow. Who are the writers you are particularly drawn to now, people you have stuck with?
Really, your amazing triptych, one now dead, of Bellow, Roth, and Updike. They have been voices all the way through my writing life, from the time I started writing. I read Portnoy’s Complaint, Rabbit, Run, and Mr. Sammler, and there was nothing like that happening in Britain or for that matter in Europe, so far as I could tell. It has something to do with a largeness of ambition, a generosity of imagination, and a wicked sense of humor, particularly in Portnoy. It comes back to that kind of realism, with that wish to engage with conditions as they are now, to capture the city or the moment in time. We had nothing so sparkling. So, yes, I have kept faith with those guys.
What are your online habits? Do you surf the web?
Do you read any online reviews?
I don’t read the blogs much. I don’t like the tone--the rather in-your-face road-rage quality of a lot of exchange on the Internet. I don’t like the threads that come out of any given piece of journalism. It seems that when people know they can’t be held accountable, when they don’t have eye contact, it brings out a rather nasty, truculent, aggressive edge that I think doesn’t quite belong in the world of book reviewing.
I just read a quote of yours, “Atheists have as much conscience, possibly more, than people with deep religious convictions,” and I have noticed that recently you have been talking a little more about atheism. You also contributed an essay to a new book called The Portable Atheist. What are your thoughts on the “New Atheist” movement, which has gotten so much publicity and sold so many books in the last year or so? Do you think it differs from strains of atheism in the past?
I am a little baffled as to why it is called the “New Atheism.” There is a very long tradition of free thinking, and the arguments made against religion tend to be the same but made over and over again. But I think what has happened is that there have been a number of good, articulate books--Hitchens, Dawkins, Dennett, Sam Harris, and so on. What they have discovered to their own great surprise is that in the United States, and right across the South too, there are an enormous number of people who also think this way. I don’t think they have suddenly been persuaded by this rash of books--the feelings were there anyway--but they didn’t have a voice, they didn’t have a focus. When Hitchens took his book across the Bible Belt and debated with Baptist ministers in churches, there were huge audiences, most of whom, it seems, from when they spoke to him afterwards, were somewhat irritated that the place in the United States that they lived in was called the Bible Belt. I think there was something there that people had not taken into account. Quite heartening really, given that America is meant to be a secular republic with a strong tradition of upholding all freedom of thought.
Do you see religion as ineradicable, or do you think there is a chance to change people’s minds on religion?
I think it is ineradicable, and I think it is a terrible idea to suppress it, too. We have tried that and it joins the list of political oppression. It seems to be fairly deeply stitched into human nature. It seems to be part of all cultures, so I don’t expect it to vanish. And yet at the same time, if it is built into human nature, why are there so many people who don’t believe in it? I think it is important that people with no religious beliefs speak up and speak for what they value. It is a bit of a problem, the title “Atheist”--no one really wants to be defined by what they do not believe in. We haven’t yet settled on a name, but you wouldn’t expect a Baptist minister to go around calling himself a Darwinist. But it is crucial that people who do not have a sky god and don’t have a set of supernatural beliefs assert their belief in moral values and in love and in the transcendence that they might experience in landscape or art or music or sculpture or whatever. Since they do not believe in an afterlife, it makes them give more valence to life itself. The little spark that we do have becomes all the more valuable when you can’t be trading off any moments for eternity.
"What Are You Optimistic About? Today's Leading Thinkers on Why Things Are Good and Getting Better," edited by John Brockman, Harper Perennial, $14.95, 374 pages.
If that "bah, humbug" mood lingers, ponder the observations of an odd assortment of academics and other intellectuals who choose to see that mug of hot cider as half full. "What Are You Optimistic About?" recognizes that Americans have a deepening morale problem, and its 150 essays of hope are offered as an antidote to societal despair.
Contributors -- quantum physicist David Deutsch of Oxford, former Time magazine editor James Geary, musician/record producer Brian Eno -- tend to use logic, not sap or divine intervention, to make their arguments. "I am a short-term pessimist but a long-term optimist," writes Paul Saffo, technology forecaster at Stanford. "History is on my side, because the cause of today's fashionable pessimism lies much deeper than the unpleasant surprises of the last half-decade."
I've been traveling in Central America for the past few weeks, so I'm late on blogging a number of things -- including this. Each year, EDGE.org's John Brockman asks a new question, and a bunch of tech/sci/internet folks reply. This year's question: What have you changed your mind about?
Science is based on evidence. What happens when the data change? How have scientific findings or arguments changed your mind?
I was one of the 165 participants, and wrote about what I learned from Boing Boing's community experiments, under the guidance of our community manager Teresa Nielsen Hayden: Link to "Online Communities Rot Without Daily Tending By Human Hands."