Edge in the News

PROSPECT MAGAZINE [5.31.07]

In 2005, the American anthropologist Daniel Everett published an article in Current Anthropology in which he presented his insights into Pirahã life, acquired over years spent living with the tribe. Pirahã culture, Everett claimed, was unique: it was totally focused on immediate experience and it lacked basic number skills, a vocabulary for colours, a past perfect tense and a creation myth....

Chicago Tribune [5.8.07]

Genres crumble, divisions fade in light of tragedy

By Julia Keller
Tribune cultural critic

...Contemporary culture is a blur, a haze, a hodgepodge, a constant shuffle play on the natural-born iPod known as the human consciousness. The old hierarchies -- high art, low art, enlightenment, junk -- are dead. The ancient demarcations of poem and story and painting are pointless.

Genres are dissolving. Boundaries are disintegrating. Old lines of stratification and division and roping-off of subject areas, gone. Next thing you know, they'll be taking the 9/11 commission's austere and straightforward exegesis of the defining national tragedy of our lifetimes and turning it into a comic book. ...

... Modern technology, then, may have been almost as urgent a target for the 9/11 terrorists as were the helpless humans they murdered. The audacity of the attacks may have arisen from a desire to splash the world with the ghastly imagery of technology run amok, of technology outsmarting itself to bring about chaos and death. Thus the arts -- still our chief means of engaging with ideas, even the heinous ideas of terrorists -- must grapple with technology's double-edged sword: Some of us see it as redemptive and positive, while others see it as threateningly negative.

John Brockman, founder of a Web site illuminating the interplay of science and culture (www.edge.org), believes technological advances are always beneficial, despite the lethal misgivings that certain groups harbor. Science "figures out how things work and thus can make them work better," he wrote in an e-mail. "As an activity, as a state of mind, it is fundamentally optimistic."

And so here we stand, clutching a comic book in one hand and a copy of "Hamlet" in the other, listening to an aria through one headphone and a Dixie Chicks ballad through the other, looking out at a landscape that seems ancient and exhausted -- and bright and new. A world in which we are, every second, individuals and vital parts of communities as well.

[...continued]

The Times [4.22.07]

Philip Zimbardo used to be one of my heroes, but no longer. The psychologist dreamt up the Stanford Prison experiment, in which 24 male students were randomly assigned roles as either captive or guard in a mock prison. Guards were given uniforms and power; prisoners were stripped of their names and privileges, and were ordered to remain largely silent. The nightly toilet run saw the prisoners blindfolded and shackled together before being marched to the bathroom.

The experiment, in 1971, was stopped after just six days because the guards had become sadists and the prisoners depressives. Remember, they all started off as nice, normal college kids. The experiment became a totem of a thing called “situational” evil: good people, when put into bad situations, could become brutes. It has furnished an explanation — but not exoneration — for atrocities ranging from the Holocaust to Abu Ghraib (Professor Zimbardo appeared as a defence witness at the trial of a soldier charged with torture at the Iraqi prison).

I had always assumed it was Professor Zimbardo who called time. In fact, it was a young psychologist called Christina Maslach. Professor Zimbardo, who had just started dating Dr Maslach, had invited her over to impress her. Instead, after witnessing the toilet run, she fled in horror, telling Professor Zimbardo she no longer wanted to know him. The experiment, she said, had dehumanised its instigator as well as its participants.

So, Professor Zimbardo stopped the experiment because he risked losing the woman he loved. He calls Dr Maslach a hero for challenging the wisdom that the experiment was a justifiable study of human nature. And it has led him, he tells the Edge website (www.edge.org), to consider the flip side of evil: the psychology of heroism.

Just as some people can be made to grow horns, others grow haloes. Yet, so little is known about heroes, other than that they often say, in the face of mountainous evidence to the contrary, that they didn’t do anything special. Do heroes ever contemplate the risks? Or do they consider them and then override them? Such basic research, Professor Zimbardo says, has never been conducted but should be, ideally in the immediate aftermath of a heroic act.

We must also cultivate a different heroic imagination in the young. Dangerously, children grow up believing that heroism is the preserve of the legendary rather than the ordinary: Achilles or Superman. He says: “The secondary consequence is for us to say, ‘I could never be that . . . or bear such a burden’. I think, on the other hand, we each could say, ‘I could do what Christina Maslach did’.” Indeed: we need heroes who will stop another Enron, another Abu Ghraib, another questionable psychology experiment.

By the way, Professor Zimbardo and the now Professor Maslach celebrate their 35th wedding anniversary this year.

Atlanta Journal-Constitution [4.21.07]


Slide into evil. In the Stanford Prison Study in 1971, university students were randomly assigned to be prisoners or guards and then placed in a mock prison setting in the basement of the campus psych building. The guards became so oppressive and sadistic, and the prisoners so passive and depressed, that the two-week study was ended after six days. Lead researcher Philip Zimbardo is featured on edge.org in a lengthy discussion of evil and heroism. He calls the study a "cautionary tale of the many ways in which good people can be readily and easily seduced into evil. . . . Those who sustain an illusion of invulnerability are the easiest touch for the con man, the cult recruiter or the social psychologist ready to demonstrate how easy it is to twist such arrogance into submission."

...

boingboing [4.18.07]


In an original EDGE essay, Wikipedia co-founder Larry Sanger claims that the Web's ability to aggregate public opinion and knowledge into some form of "collective intelligence" is leading to a new politics of knowledge. According to Sanger, the power to establish what "we all know" is shifting out of the hands of a small elite group and becoming more of a conversation open to anyone with a Net connection. However, Sanger is also the founder of Citizendium, a competitor to Wikipedia that, according to its Web site, "aims to improve on (the Wikipedia) model by adding 'gentle expert oversight' and requiring contributors to use their real names." In this essay, titled "Who Says We Know: On The New Politics Of Knowledge," Sanger argues that a lack of "expert" oversight leads to unreliable information, something he sees as a major flaw in knowledge egalitarianism. I'm sure this essay will spark as much fiery debate as the previous essay in this EDGE series, Jaron Lanier's "Digital Maoism." From Sanger's essay:

Today's Establishment is nervous about Web 2.0 and Establishment-bashers love it, and for the same reason: its egalitarianism about knowledge means that, with the chorus (or cacophony) of voices out there, there is so much dissent, about everything, that there is a lot less of what "we all know." Insofar as the unity of our culture depends on a large body of background knowledge, handing a megaphone to everyone has the effect of fracturing our culture.

I, at least, think it is wonderful that the power to declare what we all know is no longer exclusively in the hands of a professional elite. A giant, open, global conversation has just begun—one that will live on for the rest of human history—and its potential for good is tremendous. Perhaps our culture is fracturing, but we may choose to interpret that as the sign of a healthy liberal society, precisely because knowledge egalitarianism gives a voice to those minorities who think that what "we all know" is actually false. And—as one of the fathers of modern liberalism, John Stuart Mill, argued—an unfettered, vigorous exchange of opinion ought to improve our grasp of the truth.

This makes a nice story; but it's not the whole story.

As it turns out, our many Web 2.0 revolutionaries have been so thoroughly seized with the successes of strong collaboration that they are resistant to recognizing some hard truths. As wonderful as it might be that the hegemony of professionals over knowledge is lessening, there is a downside: our grasp of and respect for reliable information suffers. With the rejection of professionalism has come a widespread rejection of expertise—of the proper role in society of people who make it their life's work to know stuff. This, I maintain, is not a positive development; but it is also not a necessary one. We can imagine a Web 2.0 with experts. We can imagine an Internet that is still egalitarian, but which is more open and welcoming to specialists. The new politics of knowledge that I advocate would place experts at the head of the table, but—unlike the old order—gives the general public a place at the table as well.

El Norte [4.16.07]


Alfonso Elizondo

In 1992, John Brockman defined the concept of the third culture in his essay entitled "The Emerging Third Culture": "The third culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are."

For John Brockman, the strength of the third culture is precisely that it can tolerate disagreements about which ideas are to be taken seriously. Unlike previous intellectual pursuits, the achievements of the third culture are not the marginal disputes of a quarrelsome mandarin class: they will affect the lives of everybody on the planet. Scientific subjects now receive outstanding treatment in the pages of newspapers and magazines.

Molecular biology, artificial intelligence, artificial life, chaos theory, neural networks, the inflationary universe, adaptive fractals, complex systems, superstrings, biodiversity, nanotechnology, the human genome, virtual reality and the like are among the new scientific subjects transmitted to today's society through new metaphors created by the intellectuals of the third culture.

In the third culture, a new philosophy of nature is being born, grounded in an understanding of the complexity of evolution. According to Brockman, very complex systems, such as organisms, brains, the biosphere or the universe itself, were not constructed according to a deterministic design; all are evolutionary processes, whose interpretation through images and metaphors has been the task of the intellectuals of the third culture, who attempt to express their deepest reflections in a way accessible to the intelligent reading public.

In spite of its critics, the third culture is alive and in full development. Books by Richard Dawkins, Daniel C. Dennett, Jared Diamond, Brian Greene, Steven Pinker, Martin Rees and others are indispensable not only for the information they contain but are also great successes in the bookstores. Their subjects deal with the main controversies of the Western world in recent decades: abortion and euthanasia, demographic policies, the widening gap between rich and poor countries, pacifism, migration, racism and xenophobia, the causes of the ecological crisis, and the implications of technology, which lead to calls for an ethics of responsibility and for social control of science policy.

The worldwide phenomenon of the third culture is not only the irruption of the natural scientists onto the postmodern intellectual scene, but a movement toward a global intellectual vision, driven by the intensive use of images and hypermedia in human communication, which has allowed the scientific knowledge of the second half of the 20th century to permeate all of society, providing information for confronting the great universal challenges of the 21st century.

However, in spite of the serious warnings of the natural scientists, the mainstream political leaders of the world have not managed to grasp that present political action must be focused on preserving the habitat of human beings. Although the contribution of scientific knowledge is falsifiable, ephemeral and almost always probabilistic, it is always helpful in making important decisions, if only by indicating what should not be done. As Machiavelli wrote: "To know the ways that lead to hell is to avoid them".

New Yorker [4.15.07]

 

Has a remote Amazonian tribe upended our understanding of language?

Dan Everett believes that Pirahã undermines Noam Chomsky’s idea of a universal grammar.

[ED. NOTE: Thanks to the New Yorker for making available the link to John Colapinto's article.]

Blog Watch: A Mainstream Look At Weblogs
Newsweek [4.10.07]

Great reading in George Dyson's essay "Turing's Cathedral," found at edge.org. It connects the impulses of the original computer pioneers to the age of Google.


The Wall Street Journal [4.9.07]

Life is full of surprises, but it's rare to reach for a carafe of wine and find your hand clutching a bottle of milk -- and even rarer, you'd think, to react by deciding the milk was actually what you wanted all along.

Yet something like that happened when scientists in Sweden asked people to choose which of two women's photos they found more attractive. After the subject made his choice (the chosen woman we'll call Beth), the experimenter turned the photo face down. Sliding it across the table, he asked the subject the reasons he chose the photo he did. But the experimenter was a sleight-of-hand artist. A copy of the unchosen photo, "Grizelda," was tucked behind Beth's, so what he actually slid was the duplicate of Grizelda, palming Beth.

Few subjects batted an eye. Looking at the unchosen Grizelda, they smoothly explained why they had chosen her ("She was smiling," "she looks hot"), even though they hadn't.

In 1966, Time magazine asked, "Is God Dead?" Even then, the answer was no, and with the rise of religion in the public square, the question now seems ludicrous. In one of those strange-bedfellows things, it is science that is shedding light on why belief in God will never die, at least until humans evolve very different brains, brains that don't (as they did with Beth and Grizelda) interpret unexpected and even unwanted outcomes as being for the best.

"Belief in God," says Daniel Gilbert, professor of psychology at Harvard University, "is compelled by the way our brains work."

As shown in the Grizelda-and-Beth study, conducted by scientists at Lund University and published this month in Science, brains have a remarkable talent for reframing suboptimal outcomes to see setbacks in the best possible light. You can see it when high-school seniors decide that colleges that rejected them really weren't much good, come to think of it.

You can see it, too, in experiments where Prof. Gilbert and colleagues told female volunteers they would be working on a task that required them to have a likeable, trustworthy partner. They would get a partner randomly, by blindly choosing one of four folders, each containing a biography of a potential teammate. Unknown to the volunteers, each folder contained the same bio, describing an unlikable, untrustworthy person.

The volunteers were unfazed. Reading the randomly chosen bio, they interpreted even negatives as positives. "She doesn't like people" made them think of her as "exceptionally discerning." And when they read different bios, they concluded their partner was hands-down superior. "Their brains found the most rewarding view of their circumstances," says Prof. Gilbert.

The experimenter then told the volunteer that although she thought she was choosing a folder at random, in fact the experimenter had given her a subliminal message so she would pick the best possible partner. The volunteers later said they believed this lie, agreeing that the subliminal message had led them to the best folder. Having thought themselves into believing they had chosen the best teammate, they needed an explanation for their good fortune and experienced what Prof. Gilbert calls the illusion of external agency.

"People don't know how good they are at finding something desirable in almost any outcome," he says. "So when there is a good outcome, they're surprised, and they conclude that someone else has engineered their fate" -- a lab's subliminal message or, in real life, God.

Religion used to be ascribed to a wish to escape mortality by invoking an afterlife or to feel less alone in the world. Now, some anthropologists and psychologists suspect that religious belief is what Pascal Boyer of Washington University, St. Louis, calls in a 2003 paper "a predictable by-product of ordinary cognitive function."

One of those functions is the ability to imagine what Prof. Boyer calls "nonphysically present agents." We do this all the time when we recall the past or project the future, or imagine "what if" scenarios involving others. It's not a big leap for those same brain mechanisms to imagine spirits and gods as real.

Another God-producing brain quirk is that although many things can be viewed in multiple ways, the mind settles on the most rewarding. Take the Necker cube, the line drawing that shifts orientation as you stare at it. (A cool version is at dogfeathers.com/java/necker.html.) If you reward someone for seeing the cube one way, however, his brain starts seeing it that way only. The cube stops flipping.

There are only two ways to see a Necker cube, but loads of ways to see a hurricane or a recovery from illness. The brain "tends to search for and hold onto the most rewarding view of events, much as it does of objects," Prof. Gilbert writes on the Web site Edge. It is much more rewarding to attribute death to God's will, and to see in disasters hints of the hand of God.

Prof. Gilbert once asked a religious colleague how he felt about helping to discover that people can misattribute the products of their own minds to acts of God. The reply: "I feel fine. God doesn't want us to confuse our miracles with his."

The Sunday Times [4.7.07]

Michael Wright enjoys a eureka moment at the edge of knowledge, as scientists ponder the imponderable

Here is a good-news story: a website that will expand your mind. Edge.org is a forum for science, philosophy and culture that maps the boundary fence over which today’s big thinkers, standing on tiptoes, are peering. Well-known scientists and assorted eggheads can post their opinions on hotly debated topics of the moment — from the evolutionary biologist Richard Dawkins, discussing why science has more in common with literature than we might think, to the leading geneticist and human-genome maverick J Craig Venter on why he wants to create life.

Some of the presentations are available to watch as QuickTime movies, if you prefer not to read, and keen thinkers can have a bimonthly e-mail of the latest discussions delivered to their inbox.

Each year, John Brockman, the site's American editor, also sends a big, open-ended question to all the notable thinkers he knows, then publishes their responses online. This year's little teaser — "What do you believe is true, even though you cannot prove it?" — prompted 60,000 words in reply, on subjects including particle physics, consciousness, artificial intelligence, global warming and tedious sophistry.

I like the belief of Alun Anderson, the editor-in-chief of New Scientist, that cockroaches are conscious, but cannot comment on the theoretical physicist who denies that black holes destroy information or the computer scientist who believes the continuum hypothesis is false.

Visiting Edge will make pseudo-scientists feel cleverer, and the rest of us more than usually stupid, as we discover, with a jolt of pleasure, how little we really know about the world.

Townsville Bulletin [4.6.07]

IN this special anthology, leading public thinkers (scientists, writers and philosophers such as Richard Dawkins, Howard Gardner, Freeman Dyson, Jared Diamond and Ray Kurzweil) respond to a question proposed by Steven Pinker: 'What is your dangerous idea?'

John Brockman clarifies the question in his introduction: he wanted 'statements of fact or policy that are defended with evidence and argument by serious scientists and thinkers but which are felt to challenge the collective decency of an age.'

Good ideas really shouldn't be thought of as dangerous, so several writers shadow-box around the question a bit, but nearly all of them come up with something original and thought-provoking.

One of my own favourites was about the lab rats that learned to prefer Schoenberg to Mozart, but there is something here for every interest. Common topics are religion (especially its troubled relationship to science), psychology (especially free will), politics, and the impact of technological change (genetic engineering, and the clash between our instincts and our computer-dominated culture).

Contributions are all quite short, ranging from less than a page up to perhaps five pages, which makes it all too easy to give oneself mental indigestion. Other than that, however, it is a veritable feast of ideas.

In a word: Zesty.

The Independent [4.5.07]

If you stroll along the "infinite shingle" of Chesil Beach in Dorset, as Ian McEwan did while composing his new novel, you will find that millennia of tides and winds have "graded the size of pebbles" along its 18-mile length, "with the bigger stones at the eastern end". The writer went to check this out, and felt - as he weighed the pebbles in his palms - that it was true.

Already, critics have lauded On Chesil Beach as a major achievement from a painstaking micro-historian of the inner life. Edward and Florence, its loving but fatally innocent couple, stumble into a wedding-night disaster in the "buttoned-up", respectable England of July 1962, the victims not merely of "their personalities and pasts" but of "class, and history itself". Yet long-haul admirers of McEwan will detect some even deeper rhythms at work here. Once again, he traces the ominous crossing of a threshold from one human state to another: a step into the dark framed - as often in his fiction - by the inexorable onward movement of maturing and ageing bodies, of biological evolution, of climate and even geology itself.

We talk in a restaurant in Fitzrovia, a short walk for McEwan from the handsome house in a Georgian square that he fictionally lends to the neurosurgeon Henry Perowne in Saturday - another novel that pivots on momentous changes, all the way from the medical to the military realms. Upstairs, there seems to be a meeting of the revived Bonzo Dog Doo-Dah Band, exactly the kind of wacky pop pranksters that Edward, in the lonely hippie-era limbo where McEwan's epilogue leaves his stubborn hero, might have promoted in his Camden record shop. Outside, the sunshine signals another kind of transition, from winter into spring. And McEwan, a model of quietly spoken exactitude with words and ideas alike, stresses that On Chesil Beach aims at more than just the scrutiny of that early-Sixties cusp of change between - as Philip Larkin and almost all the reviewers have put it - "the end of the Chatterley ban/ And The Beatles' first LP".

For all the pin-sharp evocation of a time when "youthful energies were pushing to escape, like steam under pressure", this last gasp of British sexual inhibition gave his story a starting point and not a terminus. "I never really thought of it as a historical novel," he explains, "because I was interested in another aspect: which is when young people cross this line - the Conradian shadow-line - from innocence to knowledge. You're also dealing with a human universal. So I was rather interested to discover what young people would make of this. And I was quite relieved, for example, that my sons took to it avidly - even though they're living at a time when they not only have girlfriends, but they have lots of friends who happen to be girls: another world."

The book also survived a test-run beyond McEwan's family (his wife is the journalist and author Annalena McAfee, and he has two early-twenties sons from his first marriage). He read an extract at Hunter College in New York, to the sort of student body who might have been forgiven for failing to sympathise with the bedroom blunderings of a pair of virginal Home Counties 22-year-olds in the summer before the Cuban missile crisis. "This is a community college," the author says, "and the kids are - tough is not the word, they're really lovely, but they're not protected. They've clearly been out there." Would this street-smart audience think: why don't Edward and Florence just get on with it? What's the problem? On the contrary: they seemed deeply engaged.

"So there have to be two elements running side by side," McEwan continues. "One is that, this is particular: these are characters frozen in history, limited by psychology, by class, by private experience. But on the other hand, this is a universal experience that is differently dressed up by different people at different times." Youth always has to cross that line, even if it would no longer run through the starched sheets of a marriage bed in a dowdy Dorset hotel.

Always the punctilious realist, McEwan nonetheless skirts the seas of parable, or myth. Yet for this, the 12th work of fiction since his 1975 debut with the luridly memorable tales of First Love, Last Rites, he wanted to avoid wading in too deep. "This particular beach offered so many metaphorical possibilities," he says. "They could kill the novel! So I really had to row back quite hard on that. The fact that impersonal forces have created order; the fact that the last scene is played out on a tongue of shingle, so you're stranded on both sides; the sense that they sit down to dinner on an evening when they both hope to gain knowledge, which clearly relates to being on the edge of the known world... It was so rich, that I had to keep the volume down."

McEwan's fiction strikes so hard and lingers so long in the imagination precisely because he keeps the interpretative volume down. "Readers will rebel," he believes, "when they spot an overriding, determining metaphor." Or, perhaps, a determining cause. On Chesil Beach hints at a specific reason for Florence's "visceral dread" of sexual experience, one that throws a line from this work back to the toxic households of those earliest stories. Her creator reveals that "in an early draft, it was all too clear". The finished work allows more space for the reader: we can join the dots through the past ourselves, just as we can fill in the futures to be enjoyed or endured by both after the act, or failure to act, that will mould them. Edward, the promising historian, now seems headed for a life of amiable counter-cultural drift; Florence, the driven violinist, stands on the brink of a solitary musical destiny.

Florence plays in a rising string quartet, and the novel that tells her story has a densely wrought, compacted, chamber-music quality. A central movement - the wedding night itself - is interspersed with chapters that delve into the characters' past and, at the finale, the future as well. "One of the first things that I wrote about it when I was making notes," McEwan recalls, "was a simple direction: five times eight - five chapters of about 8,000 words. A wedding night seemed to me perfect for a short novel."

The author of other compressed but resonant pieces, such as The Comfort of Strangers, Black Dogs and the Booker-winning Amsterdam, points out that "I've always liked that form: the novel that can be read in three hours, at a sitting, like a movie or an opera". A chamber opera will be McEwan's next project, due for its premiere at next year's Hay festival. He has almost completed a small-scale, "easily exportable" collaboration with the composer Michael Berkeley (who was his partner more than 20 years ago on the anti-nuclear oratorio Or Shall We Die?). It has a Don Giovanni-style seducer for its protagonist: "We thought that sexual obsession would be a very good subject for an opera."

And sexual obsession, in the form of longing or loathing rather than action, makes an equally compelling motif for On Chesil Beach. For McEwan, the book's microscopically observed convergence of social embarrassment and erotic misery "is not great tragedy. But it's something I always have an interest in: how something small, like not saying the right thing or not making the right gesture, could then send you down a slightly different path in life. It must happen to us countless times, but we barely notice."

In the pre-permissive shadowland of 1962, McEwan himself was a 13-year-old schoolboy, the itinerant, Aldershot-born son of a career army officer from Glasgow. Famously, his father's ordeal at Dunkirk helped to shape the wartime scenes of Atonement, the 2001 novel that, for many of his readers, ranked McEwan first-among-equals in that gifted cohort of novelists (Amis, Barnes and Rushdie among them) born into the aftermath of global war. Now, a few readers wonder if the poignant road-not-taken theme in On Chesil Beach might connect with his rediscovered brother, David Sharp. The son born to McEwan's parents while his mother was still married to her first husband (later killed in action), David was given up for adoption in 1942. McEwan first encountered him in 2002, and they periodically meet, but he says that this reconfigured family history has not (yet) found its way into his work.

The novelist may not enlist people into fiction so directly, but he does recruit places. Just as Saturday more or less gave Mr Perowne his creator's own address, so On Chesil Beach has Edward grow up in a Chilterns cottage that McEwan once almost rented, while Florence's chilly family occupies the north Oxford house he lived in during the 1980s. "I've come to it late," he says, "and it's such a standard thing in the English novel: a sense of place. Which I've always rather lacked, I think, being an army brat, going to boarding school, then a modern university": Sussex, followed by his pioneering stint as the first creative-writing student at East Anglia. "I've never been very rooted but, cumulatively, I guess, I do have a 30-year experience of the Chilterns." McEwan now draws on that intimacy in Edward's memories of an idyllic corner of those hills which has, he says, "withstood the onslaught of modernity reasonably well".

McEwan conjures up his terrain with a walker's close-to-the-ground eye. Plants thrust, creatures breed (or refuse to), and even hills or beaches shift according to the overlapping cycles that push on beyond the limited history that persons or societies know. He is also deeply immersed in ecological debates. In 2005, he joined a trip to the Svalbard archipelago, 79 degrees north in the Arctic, for the Cape Farewell project led by the artist David Buckland, which aims to raise cultural awareness of the issues of global warming. He reads widely in scientific literature and, just before we met, had travelled to Hamburg for a public dialogue with John Schellnhuber, the German government's adviser on climate change.

Yet McEwan the engaged intellectual (as he was during an earlier wave of doomsday anxiety, in the nuclear arms race of the early 1980s) and McEwan the novelist remain separate beings. "Fiction hates preachiness," he affirms. "Nor does it much like facts and figures or trends or curves on graphs. Nor do readers much like to be hectored." He says that in spite of "all the reading that I've done around climate change, none of it suggests anything useful in the way of approaching this novelistically".

What about one more fictional dystopia, with marauding survivors once more trekking through a blasted wasteland? "That doesn't interest me at all. We've had so many dystopias that we're brain-dead in that direction. Also, you can go to certain parts of the world - say, Sudan. There is a dystopia. You don't have to launch these things into the future."

Still, he can just about envisage a fiction that would do artistic justice to a perilously warming world: "Something small and fierce, that would unwind in a way that's intrinsically interesting... It's got to be fascinating, in the way that gossip is. It's got to be about ourselves. Maybe it needs an Animal Farm. Maybe it needs allegory. But if you're going in that direction, then you need a lot of wit."

Meanwhile, ventures such as Cape Farewell (whose exhibition will reach the Barbican gallery in January 2008) may trigger an urge to cherish as well as to lament. In an Arctic cold snap, "I did two or three long hikes that just took my breath away," he says. "Many others have thought this too: that one way forward is not doom-and-gloom but celebration; of what we are, what we have, and what we don't want to lose." On his return, he wrote a fable about the boot-room of the expedition ship, with its all-too-human rows over purloined kit. Artists may not refine the theory or advance the technology that will grapple with climate change, but they can deepen the self-knowledge of the selfish but potentially co-operative beasts who have crossed a fateful, collective shadow-line. "How do you talk about the state we've got ourselves into," he asks, "as a very successful, fossil-fuel-burning civilisation? How do we stop? That really does become a matter of human nature. There's all the science to consider, but finally there is a massive issue of politics and ethics."

McEwan, who shadowed a leading neurosurgeon while researching Saturday, likes the company and outlook of scientists as an antidote to lazy arts-faculty despair. "Among cultural intellectuals, pessimism is the style," he says with a tinge of scorn. "You're not a paid-up member unless you're gloomy." But when it comes to climate change, he finds (quoting the Italian revolutionary Gramsci) that scientists can combine "pessimism of the intellect" with "optimism of the will". "Science is an intrinsically optimistic project. You can't be curious and depressed. Curiosity is itself a sure stake in life. And science is often quite conscious of intellectual pleasure, in a way that the humanities are not."

He loves the spirited playfulness evident in places such as John Brockman's celebrated website Edge, where "neuroscientists might talk to mathematicians, biologists to computer-modelling experts", and in an accessible, discipline-crossing language that lets us all eavesdrop. "In order to talk to each other, they just have to use plain English. That's where the rest of us benefit." Science may also now "encroach" on traditional artistic soil. McEwan recently heard a lecture on the neuroscience of revenge, in which the rage to get even - that inexhaustible fuel for tragedy and comedy alike - illuminated parts of the brain via "real-time, functional MRI [magnetic resonance imaging]. What was demonstrated was that people were prepared to punish themselves in order to punish others: negative altruism."

For all the storytelling confidence of scientists who try to uncover the biological roots of personal emotions and social beliefs, McEwan keeps faith in the special tasks of art. "I hold to the view that novelists can go to places that might be parallel to a scientific investigation, and can never really be replaced by it: the investigation into our natures; our condition; what we're like in specific circumstances." On Chesil Beach, it strikes me, shows at its infinitely sad conclusion an example of self-punishing "negative altruism" at work. Here, a vengeful righteousness that wrecks the "injured" party takes shape not in the colour-coded neural maps of MRI - but through a vigilant writer's heartbreaking empathy with the twisted feelings of a child in its time.

If human communication and solidarity can founder so totally in this novel's little pool of fear and frustration, what are its prospects in the great ocean of social behaviour? We talk of the carbon-cutting, resource-saving sacrifices this generation may have to make on behalf of its successors, and McEwan comments that such long-term altruism "does go against the grain a bit". All the same, he adds: "I cheer myself up with the thought of medieval cathedral builders, who built for the future - or 18th-century tree-planters, who planted sapling oaks which they would never enjoy. Here, it's much more dire; but we're bound to think of our children, or at least our grandchildren.

"It is difficult to do favours to people you have never met," he says. "But we give money to Oxfam, to charities, to victims of the tsunami and so forth. These are not people who are ever going to repay those favours, or even know who bestowed them." Unlike his characters, doomed to a kind of soul-extinction in their solitude, McEwan believes in making the last-ditch gesture that might save a world. "The worst fate would be to conclude that there's nothing we can do about this, and so let's party to the end."

What is Your Dangerous Idea? Today's Leading Thinkers on the Unthinkable
Weekend Australian [3.23.07]


BRAIN stretch is an exciting concept, the more so as John Brockman's anthology pushes everything to the extreme. Can our brains exist without bodies? If, as Ray Kurzweil says, "we need only 1 per cent of 1 per cent of the sunlight to meet all our energy needs", why are we pouring billions into Middle East wars over oil and not into research on nano-engineered solar panels and fuel cells? Read these 100 or so mini-essays and realise how lacking in vision most politicians are.


http://www.ireland.com/newspaper/weekend/2007/0317/1173880409261.html [3.16.07]

Reading room: a surfers' guide

The Dublin Review of Books will boast a regular blog where readers can carry on live discussion of particular articles or topics between issues.

But it isn't the only online magazine vying for the attention of literary audiences - there are dozens of sassy outfits out there, each with its own distinctive perks and quirks. ...

www.edge.org has established itself as a major force on the intellectual scene in the US and as required reading for humanities heads who want to keep up to speed with the latest in science and technology. Current debates on the site feature stellar contributors Noam Chomsky, Scott Atran and Daniel C Dennett.

...

The New York Times [3.12.07]

When Robert R. Provine tried applying his training in neuroscience to laughter 20 years ago, he naïvely began by dragging people into his laboratory at the University of Maryland, Baltimore County, to watch episodes of "Saturday Night Live" and a George Carlin routine. They didn't laugh much. It was what a stand-up comic would call a bad room.

So he went out into natural habitats — city sidewalks, suburban malls — and carefully observed thousands of "laugh episodes." He found that 80 percent to 90 percent of them came after straight lines like "I know" or "I'll see you guys later." The witticisms that induced laughter rarely rose above the level of "You smell like you had a good workout."

"Most prelaugh dialogue," Professor Provine concluded in "Laughter," his 2000 book, "is like that of an interminable television situation comedy scripted by an extremely ungifted writer."

He found that most speakers, particularly women, did more laughing than their listeners, using the laughs as punctuation for their sentences. It's a largely involuntary process. People can consciously suppress laughs, but few can make themselves laugh convincingly.

"Laughter is an honest social signal because it's hard to fake," Professor Provine says. "We're dealing with something powerful, ancient and crude. It's a kind of behavioral fossil showing the roots that all human beings, maybe all mammals, have in common."

The New York Times Magazine [3.10.07]

How neuroscience is transforming the legal system.

...Two of the most ardent supporters of the claim that neuroscience requires the redefinition of guilt and punishment are Joshua D. Greene, an assistant professor of psychology at Harvard, and Jonathan D. Cohen, a professor of psychology who directs the neuroscience program at Princeton. Greene got Cohen interested in the legal implications of neuroscience, and together they conducted a series of experiments exploring how people's brains react to moral dilemmas involving life and death. In particular, they wanted to test people's responses in the f.M.R.I. scanner to variations of the famous trolley problem, which philosophers have been arguing about for decades. ...

...Michael Gazzaniga, a professor of psychology at the University of California, Santa Barbara, and author of "The Ethical Brain," notes that within 10 years, neuroscientists may be able to show that there are neurological differences when people testify about their own previous acts and when they testify to something they saw. "If you kill someone, you have a procedural memory of that, whereas if I'm standing and watch you kill somebody, that's an episodic memory that uses a different part of the brain," he told me. ...

...In a series of famous experiments in the 1970s and '80s, Benjamin Libet measured people's brain activity while telling them to move their fingers whenever they felt like it. Libet detected brain activity suggesting a readiness to move the finger half a second before the actual movement and about 400 milliseconds before people became aware of their conscious intention to move their finger. Libet argued that this leaves 100 milliseconds for the conscious self to veto the brain's unconscious decision, or to give way to it — suggesting, in the words of the neuroscientist Vilayanur S. Ramachandran, that we have not free will but "free won't." ...

...The legal implications of the new experiments involving bias and neuroscience are hotly disputed. Mahzarin R. Banaji, a psychology professor at Harvard who helped to pioneer the I.A.T., has argued that there may be a big gap between the concept of intentional bias embedded in law and the reality of unconscious racism revealed by science. When the gap is "substantial," she and the U.C.L.A. law professor Jerry Kang have argued, "the law should be changed to comport with science" — relaxing, for example, the current focus on intentional discrimination and trying to root out unconscious bias in the workplace with "structural interventions," which critics say may be tantamount to racial quotas. ...

...Others agree with Greene and Cohen that the legal system should be radically refocused on deterrence rather than on retribution. Since the celebrated M'Naughten case in 1843, involving a paranoid British assassin, English and American courts have recognized an insanity defense only for those who are unable to appreciate the difference between right and wrong. (This is consistent with the idea that only rational people can be held criminally responsible for their actions.) According to some neuroscientists, that rule makes no sense in light of recent brain-imaging studies. "You can have a horrendously damaged brain where someone knows the difference between right and wrong but nonetheless can't control their behavior," says Robert Sapolsky, a neurobiologist at Stanford. "At that point, you're dealing with a broken machine, and concepts like punishment and evil and sin become utterly irrelevant. Does that mean the person should be dumped back on the street? Absolutely not. You have a car with the brakes not working, and it shouldn't be allowed to be near anyone it can hurt." ...

John Markoff, The New York Times [3.8.07]

SAN FRANCISCO, March 8 — A new company founded by a longtime technologist is setting out to create a vast public database intended to be read by computers rather than people, paving the way for a more automated Internet in which machines will routinely share information.

The company, Metaweb Technologies, is led by Danny Hillis, whose background includes a stint at Walt Disney Imagineering and who has long championed the idea of intelligent machines.

He says his latest effort, to be announced Friday, will help develop a realm frequently described as the “semantic Web” — a set of services that will give rise to software agents that automate many functions now performed manually in front of a Web browser.

The idea of a centralized database storing all of the world’s digital information is a fundamental shift away from today’s World Wide Web, which is akin to a library of linked digital documents stored separately on millions of computers where search engines serve as the equivalent of a card catalog.

In contrast, Mr. Hillis envisions a centralized repository that is more like a digital almanac. The new system can be extended freely by those wishing to share their information widely.

On the Web, there are few rules governing how information should be organized. But in the Metaweb database, to be named Freebase, information will be structured to make it possible for software programs to discern relationships and even meaning.

For example, an entry for California’s governor, Arnold Schwarzenegger, would be entered as a topic that would include a variety of attributes or “views” describing him as an actor, athlete and politician — listing them in a highly structured way in the database.

That would make it possible for programmers and Web developers to write programs allowing Internet users to pose queries that might produce a simple, useful answer rather than a long list of documents.

Since it could offer an understanding of relationships like geographic location and occupational specialties, Freebase might be able to field a query about a child-friendly dentist within 10 miles of one’s home and yield a single result.
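To make the idea concrete, here is a minimal, hypothetical sketch in Python of what the article describes: topics carrying typed attribute "views" that a program can query directly. The schema, names and data below are invented for illustration; they are not Metaweb's actual data model or API.

```python
# Hypothetical sketch of a Freebase-style structured "topic" store.
# Schema, field names and data are invented for illustration only.

from dataclasses import dataclass, field

@dataclass
class Topic:
    """A topic carries several typed 'views' (actor, politician, dentist...)."""
    name: str
    views: dict = field(default_factory=dict)  # view name -> attribute dict

# One topic, many views -- as in the article's Schwarzenegger example.
governor = Topic("Arnold Schwarzenegger", views={
    "politician": {"office": "Governor of California"},
    "actor": {"known_for": "film"},
    "athlete": {"sport": "bodybuilding"},
})

dentists = [
    Topic("Dr. A", views={"dentist": {"child_friendly": False, "miles_away": 2.0}}),
    Topic("Dr. B", views={"dentist": {"child_friendly": True, "miles_away": 4.5}}),
]

def child_friendly_dentist_within(topics, max_miles):
    """Named, typed attributes let a query yield one answer, not a document list."""
    for topic in topics:
        attrs = topic.views.get("dentist")
        if attrs and attrs["child_friendly"] and attrs["miles_away"] <= max_miles:
            return topic.name
    return None

print(child_friendly_dentist_within(dentists, max_miles=10))  # -> Dr. B
```

The contrast with a search engine is the point: because the attributes are named and typed, a program can compute a single answer itself rather than returning a list of pages for a human to read.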

The system will also make it possible to transform the way electronic devices communicate with one another, Mr. Hillis said. An Internet-enabled remote control could reconfigure itself automatically to be compatible with a new television set by tapping into data from Freebase. Or the video recorder of the future might stop blinking and program itself without confounding its owner.

In its ambitions, Freebase has some similarities to Google — which has asserted that its mission is to organize the world’s information and make it universally accessible and useful. But its approach sets it apart.

“As wonderful as Google is, there is still much to do,” said Esther Dyson, a computer and Internet industry analyst and investor at EDventure, based in New York.

Most search engines are about algorithms and statistics without structure, while databases have been solely about structure until now, she said.

“In the middle there is something that represents things as they are,” she said. “Something that captures the relationships between things.”

That addition has long been a vision of researchers in artificial intelligence. The Freebase system will offer a set of controls that will allow both programmers and Web designers to extract information easily from the system.

“It’s like a system for building the synapses for the global brain,” said Tim O’Reilly, chief executive of O’Reilly Media, a technology publishing firm based in Sebastopol, Calif.

Mr. Hillis received his Ph.D. in computer science while studying artificial intelligence at the Massachusetts Institute of Technology.

In 1985 he founded one of the first companies focused on massively parallel computing, Thinking Machines. When the company failed commercially at the end of the cold war, he became vice president for research and development at Walt Disney Imagineering. More recently he was a founder of Applied Minds, a research and consulting firm based in Glendale, Calif. Metaweb, founded in 2005 with venture capital backing, is a spinoff of that company.

Mr. Hillis first described his idea for creating a knowledge web he called Aristotle in a paper in 2000. But he said he did not try to build the system until he had recruited two technical experts as co-founders. Robert Cook, an expert in parallel computing and database design, is Metaweb’s executive vice president for product development. John Giannandrea, formerly chief technologist at Tellme Networks and chief technologist of the Web browser group at Netscape/AOL, is the company’s chief technology officer.

“We’re trying to create the world’s database, with all of the world’s information,” Mr. Hillis said.

All of the information in Freebase will be available under a license that makes it freely shareable, Mr. Hillis said. In the future, he said, the company plans to create a business by organizing proprietary information in a similar fashion.

Contributions already added into the Freebase system include descriptive information about four million songs from Musicbrainz, a user-maintained database; details on 100,000 restaurants supplied by Chemoz; extensive information from Wikipedia; and census data and location information.

A number of private companies, including Encyclopaedia Britannica, have indicated that they are willing to add some of their existing databases to the system, Mr. Hillis said.

Skeptical Inquirer [2.28.07]

Those who wonder what cutting-edge scientists might ponder outside of their classrooms and laboratories need wonder no more. In What We Believe But Cannot Prove, "intellectuals in action" speculate on the frontiers of science, both hard and soft. Skeptics, however, should not be deceived by the title. An ample majority of the more than 100 teasingly short essays included will sate the intellect's appetite for both facts and reasoned theory. John Brockman's new collection features the world's most celebrated and respected scientists and their musings on everything from human pre-history to cosmology and astrophysics, from evolution to extraterrestrial intelligence, and from genetics to theories of consciousness. ....

...What We Believe But Cannot Prove offers an impressive array of insights and challenges that will surely delight curious readers, generalists and specialists alike. Science is intimidating for the vast majority of us. But John Brockman has grown deservedly famous in recent years for his ability to lure these disciplines and their leading practitioners back to Earth, where terrestrials are afforded all-too-rare opportunities to marvel at the intellectual and creative magnificence of science in particular, and at our species' immeasurable potential in all pursuits more generally.

[...continue]

Prospect [2.28.07]


Brian Eno, musician

Interventionists vs laissez-faireists
One of the big divisions of the future will be between those who believe in intervention as a moral duty and those who don't. This issue cuts across the left/right divide, as we saw in the lead-up to the invasion of Iraq. It asks us to consider whether we believe our way of doing things to be so superior that we must persuade others to follow it, or whether, on the other hand, we are prepared to watch as other countries pursue their own, often apparently flawed, paths. It will be a discussion between pluralists, who are prepared to tolerate the discomfort of diversity, and those who feel they know what the best system is and feel it is their moral duty to encourage it.

Globalists vs nationalists
How prepared are we to allow national governments the freedom to make decisions which may not be in the interests of the rest of the world? With issues such as climate change becoming increasingly urgent, many people will begin arguing for a global system of government with the power to overrule specific national interests.

Communities of geography vs communities of choice
At the same time, some people will feel less and less allegiance to "the nation," which will become an increasingly nebulous act of faith, and more allegiance to "communities of choice" which exist outside national identities and geographical restraints. We see the beginnings of this in transnational pressure groups such as Greenpeace, MoveOn and Amnesty International, but also in the choices that people now make about where they live, bank their money, get their healthcare and go on holiday.

Real life vs virtual life
Some people will spend more and more of their time in virtual communities such as Second Life. They will claim that their communities represent the logical extension of citizen democracy. They will be ridiculed and opposed by "First Lifers," who will insist that reality with all its complications always trumps virtual reality, but the "Second Lifers" in turn will insist that they live in a world of their own design and therefore are by definition more creative and free. This division will deepen and intensify, and will develop from just a cultural preference into a choice about how and where people spend their lives.

Life extension for all vs for some
There will be an increasingly agonised division between those who feel that new life-extension technologies should be available only to those who can afford them and those who feel they should be available to everyone. Life itself will be the resource over which wars will be fought: the "have nots" will feel that there is a fundamental injustice in the prospect of some people enjoying conspicuously longer and healthier lives simply because they happen to be richer.

Anthony Giddens, sociologist

"The future isn't what it used to be," George Burns once said. And he was right. This century we are peering over a precipice, and it's an awful long way down. We have unleashed forces into the world that it is not certain that we can control. We may have already done so much damage to the planet that by the end of the century people will live in a world ravaged by storms, with large areas flooded and others arid. But you have to add in nuclear proliferation, and new diseases that we might have inadvertently created. Space might become militarised. The emergence of mega-computers, allied to robotics, might at some point also create beings able to escape the clutches of their creators.

Against that, you could say that we haven't much clue what the future will bring, except it's bound to be things that we haven't even suspected. Twenty years ago, Bill Gates thought there was no future in the internet. The current century might turn out much more benign than scary.

As for politics, left and right aren't about to disappear—the metaphor is too strongly entrenched for that. My best guess about where politics will focus would be upon life itself. Life politics concerns the environment, lifestyle change, health, ageing, identity and technology. It may be a politics of survival, it may be a politics of hope, or perhaps a bit of both.

Nicholas Humphrey, scientist

How can anyone doubt that the faultline is going to be religion? On one side there will be those who continue to appeal for their political and moral values to what they understand to be God's will. On the other there will be the atheists, agnostics and scientific materialists, who see human lives as being under human control, subject only to the relatively negotiable constraints of our evolved psychology. What makes the outcome uncertain is that our evolved psychology almost certainly leans us towards religion, as an essential defence against the terror of death and meaninglessness.

Marek Kohn, science writer

The right, of course, is still with us; robust structures remain to uphold individualism and the pursuit of wealth. There is also plenty of room in the current orthodoxy for liberalism and conservatism of all stripes. What's left out? Equality and solidarity—which takes us back to the égalité and fraternité of the French revolution, where the terms "left" and "right" came in. These seem to be fundamental values, intuitively recognised as the basis of fair and healthy social relations, so we may expect that they will reassert themselves. But as dominant ideologies fail to give them their fair dues, they will reappear in marginal and often disagreeable guises. Social solidarity may be advanced within narrow group solidarities; equality may be appropriated by demagogues.

Recent manifestations in central Europe and South America have been overlooked because they are accompanied by tendencies that rightly affront liberals. It is hard to imagine what could restore social solidarity and equality to the heart of political discourse, so we must expect that collectivist tendencies in our kind of polity will be largely confined to the bureaucratic management of resources placed under ever-growing pressure by economic growth and its environmental consequences.

Mark Pagel, scientist

Modern humans evolved to live in small co-operative groups with extensive divisions of labour among unrelated people linked only by their common culture. Co-operation is fragile, being the contented face of trust, reciprocity and the perception of a shared fate—when they go, the mask can quickly fall. The psychology of the co-operative group, of how we can maintain it and equally how we can control its dangerous tendencies—parochialism, xenophobia, exclusion and warfare—will often be at the front door of 21st-century politics.

The reasons are clear. The politics of the 20th century were expansive and hopeful, enlivened by growing prosperity. In the 21st century, increasing multiculturalism and widespread movements of people will repeatedly challenge the trust and sense of equity that binds together co-operative groups, unleashing instincts for selfish preservation. For politicians and thinkers, a pressing task at all levels of politics is to seek ways to manage these issues that somehow draw all of the actors into the elaborate and fragile reciprocity loops of the co-operative society. It sounds impossible, it won't be easy and there are no simple recipes. But if we fail, we risk sliding into xenophobic hysteria, clashes of culture, and the frenzied and dangerous grabbing of natural resources.

Lisa Randall, scientist

Debates today have descended into those between the lazy and the slightly less lazy. We are faced with urgent issues, yet the speed with which lawmakers approach them is glacial—actually slower than that: glaciers are melting faster than we are attacking the issues.

Steven Rose, biologist

Last century's alternatives were socialism or barbarism. This century's prospects are starker: social justice or the end of human civilisation—if not our species. To achieve that justice it is imperative that we retain the utopian dream of "from each according to their abilities, to each according to their needs," but needs and abilities are constantly being refashioned by runaway sciences and technologies harnessed ever more closely to global industry and imperial power and embedded within a degraded and degrading environment. This century's "left," just as that of the last century, is constituted by those groups, old or newly constituted, struggling against these hegemonic powers.

[...more]

Juan Enriquez, Reforma [2.18.07]

Individual tragedies, says Anderson, sell many more newspapers and attract many more viewers than general trends

Often, after opening the newspaper, watching the news or living through some especially sad event, one ends up with the idea that the world was much better before and that we are headed toward decadence, loneliness, rot and extreme violence. In some places and at some times that is indeed so. But it is not so in general... Two friends of mine reminded me, in year-end pieces, that there is much to criticise, confront and change, but also much to celebrate. Chris Anderson wrote about the extreme over-reporting that occurs whenever there is a terrorist incident, mass-casualty accident or natural disaster. This happens because, in most of the world, violent deaths of this kind are not commonplace; they get big coverage precisely because they are exceptional events. Individual tragedies, Anderson says, sell many more newspapers and attract many more viewers than general trends. "Dog attacks innocent infant" is far more powerful than "poverty fell by 1 per cent". Yet although the second story is much less attractive in media terms, it means saving and improving many more lives.

Much has been written about how the net, Google, Yahoo, Skype and YouTube eliminate distance and cut the cost of communicating and of obtaining global information to nearly zero. The result of being always connected, everywhere, at all hours, is that distances shrink and individual dramas from around the world enter our homes daily, more and more. We can learn, 24/7, about fires, bombs, assaults, torture, disappearances, rapes and political scandals in any of the planet's nearly 200 countries. A photo, a testimonial, a 15-second video clip bring us closer to more and more individual dramas. And each story convinces us, a little bit more, that we live in a cruel, hard and violent world...
