Edge in the News

WHAT ARE YOU OPTIMISTIC ABOUT?
THE MAIL ON SUNDAY [12.29.08]

"The planet's overheating, the icecaps are melting, the population is exploding, there's a bird-flu epidemic waiting to get us and even if we avoid a terrorist Armageddon, there's bound to be an asteroid up there with all our names on it. We are, to quote Private Frazer, doomed.

"Nonsense, say the 150 leading scientists assembled by John Brockman in this uplifting anthology.

"Asked the title's question, the world's best brains examined our prospects - and all of them found reasons to be very cheerful indeed. Once again, the scientific community seems to challenge our instinctive, common-sense assumption. First they told us the Earth isn't flat. Then, that solid objects are made up of empty space. ...

"...This is an enthralling book that delivers two very significant truths: we've never had it so good and things can only get better. Global warming — and asteroids — permitting."


THE NEW YORK TIMES [12.27.08]

Several months ago, Christopher Hitchens was sent an article about a young soldier, Mark Jennings Daily, who had been killed in Iraq. Daily was improbably all-American — born on the Fourth of July, an honors graduate from U.C.L.A., strikingly handsome. He’d been a Democrat with reservations about the war. But, “somewhere along the way, he changed his mind,” the article said. “Writings by author and commentator Christopher Hitchens on the moral case for war deeply influenced him.”

“I don’t exaggerate by much when I say I froze,” Hitchens wrote about reading that sentence.

His essay in the November issue of Vanity Fair is a meditation on his own role in Daily’s death, and a description of the family Daily left behind. Hitchens asks painful questions and steps on every opportunity to be maudlin, and yet for all its tightly controlled intellectualism, the essay packs a bigger emotional wallop than any other this year.

Daily took books by Thomas Paine, Tolstoy, John McCain and Orwell to Iraq.

“Anyone who knew me before I joined,” Daily wrote from the front, “knows that I am quite aware and at times sympathetic to the arguments against the war in Iraq. If you think the only way a person could bring themselves to volunteer for this war is through sheer desperation or blind obedience, then consider me the exception (though there are countless like me)... . Consider that there are 19-year-old soldiers from the Midwest who have never touched a college campus or a protest who have done more to uphold the universal legitimacy of representative government and individual rights by placing themselves between Iraqi voting lines and homicidal religious fanatics.”

Hitchens spent a day with the Daily family and then was asked to speak at a memorial service. He read a passage from “Macbeth” and later reflected: “Here we are to perform the last honors for a warrior and hero, and there are no hysterical ululations, no shrieks for revenge, no insults hurled at the enemy, no firing into the air or bogus hysterics. Instead, an honest, brave, modest family is doing its private best.”

Hitchens also wrote “God Is Not Great,” which Ross Douthat reviewed provocatively in The Claremont Review of Books. Douthat noted that Hitchens specializes in picking out crackpot quotations rather than trying to closely observe the nature of spiritual experience: “Like most apologists for atheism, he evinces little interest in the topic of religion as it is actually lived, preferring to stick to the safer ground of putting the godly in the dock and cataloging their crimes against humanity.” Douthat, the believer, comes off as more curious about the world than any skeptic.

One of the best pieces of career advice I ever got is: Interview three people every day. If you try to write about politics without interviewing policy makers, you’ll wind up spewing all sorts of nonsense. John Mearsheimer and Stephen Walt wrote an entire book on the Israel Lobby without ever interviewing any of their subjects.

Jeffrey Goldberg dissected their effort in The New Republic. Goldberg usefully describes Judeocentrism, the belief that Jews play a central role in world history. Walt and Mearsheimer have a tendency, Goldberg writes, to bring the vectors of recent world history back to the Jews — the rise of radical Islam, shifts in U.S. foreign policy, Sept. 11. He then offers a piece-by-piece dissection of their historical claims.

Wonks talk about inequality, but voters talk about immigration. Christopher Jencks wrote an essay on immigration in The New York Review of Books that was superb not because he took a polemical stance, but because he clarified a complex issue in an honest way.

He shows how fluid public opinion is. Certain poll questions suggest that 69 percent of Americans want to deport illegal immigrants. Others indicate the true figure is only 14 percent. He ends up at the nub of the current deadlock. Conservatives, having learned from past failures, demand “enforcement first.” Employers, fearing bankruptcy, demand the legalization of the current immigrants first. Neither powerful group will budge.

Three other essays are worth your time. In the online magazine Edge, Jonathan Haidt wrote “Moral Psychology and the Misunderstanding of Religion,” an excellent summary of how we make ethical judgments. In the Chronicle of Higher Education, J. Bradford DeLong wrote “Creative Destruction’s Reconstruction” on why Joseph Schumpeter matters to the 21st century. In her essay, “The Abduction of Opera” in The City Journal, Heather MacDonald wonders why European directors now introduce mutilation, rape, masturbation and urination into lighthearted operas like “The Abduction from the Seraglio.” She argues that a resurgent adolescent culture has allowed directors there to wallow in all manner of self-indulgence.

TORONTO STAR [12.27.08]

When politicians change their minds, they're often lambasted for flip-flopping by other politicians, the media and the public. When scientists change their minds, their fellow scientists eventually see it as progress, integral to the self-correcting discipline of their vocation.

Unfortunately the public usually notices only a marginal subset of this phenomenon: how the futurists and short-term forecasters so often get it wrong.

After all, where is the paperless office? Or the Jetsons' flying car? And remember how hurricane forecasters used computer models to predict – wrongly – that the last two seasons would be monsters?

For a spectacularly bad computer projection, look at the early 1970s, when a study commissioned by the Club of Rome warned that the world would run out of many essential minerals before the end of the century. Skeptical researchers picked apart the naïve assumptions of "The Limits to Growth," but not before world leaders, including Pierre Elliott Trudeau, had jetted off to an Austrian castle for a summit.

Yet the truly important self-corrections of science often escape public attention because they escape the media's attention. That's mainly because journalism exists on the time scale of mayflies while scientific consensus evolves over elephantine decades.

A personal example: When I was squeaking through university science in the mid-1960s, we were taught that the adult brain does not make new neurons.

But even then, unbeknown to us, a few researchers were arguing that the adult brain did continue to manufacture neurons. They were dismissed as crackpots, just as Alfred Wegener was in 1915 when he proposed that the continents drifted, and as Mario Molina and F. Sherwood Rowland were in 1974 when they warned that CFCs were destroying the ozone layer.

Molina and Rowland were vindicated in just a few years and went on to win the Nobel Prize in chemistry. But it wasn't until the 1960s that continental drift was accepted as the consensus theory.

The neuron "crackpots" were finally declared correct by their fellow brain scientists in the 1990s, and today adult neurogenesis – the fancy name for making new neurons – is a burgeoning field of study for people such as Stanford neuroscientist Robert Sapolsky, who originally dismissed the idea.

Sapolsky is one of 130-plus scientists and "thinkers" who have contributed highly personal revelations to What Have You Changed Your Mind About?, due next month.

Book marketing seems to demand sensational subtitles, but Today's Leading Minds Rethink Everything turns out to be an accurate guide to the content. In almost 400 pages, the contributors cover frontier aspects of all three scientific arenas: physical, biomedical and social.

It should come with a warning: "Reading this book may be dangerous to your cherished myths and perceptions." For example:

  • Helena Cronin says it's not primarily bias and barriers that give men the top positions and prizes. After analyzing the statistical evidence, the philosopher at the London School of Economics has come to accept that there will be (as she puts it) more dumbbells and more Nobels among males, because there is much greater variance in ability among men as a group than among women, even though the two groups are similar on average. (A toy calculation after this list illustrates the variance argument.)
  • There is probably no intelligent life elsewhere in the universe because we would have detected a stray electromagnetic signal by now, argues technologist Ray Kurzweil, who wanted to believe in E.T.
  • Until a few years ago, neuroscientist Joseph LeDoux thought that a memory is something stored in the brain into which we could tap again and again. Then a researcher in his lab at New York University did an experiment that convinced LeDoux – and is convincing others – that each time a memory is used, it has to be stored again in the brain as a new memory to be accessible later. This concept of memory "reconsolidation" is now being tested in treating drug addiction and post-traumatic stress disorder.
  • Danish science writer Tor Nørretranders changed his mind about his body, which he now considers closer to software than hardware. It's been known for decades that 98 per cent of the atoms in the human body are replaced every year, but only recently was Nørretranders able to come up with the concept of permanent reincarnation, like music moving from vinyl LPs to audio tapes to CDs and now iPods.
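
Cronin's variance point is easy to check numerically. The sketch below is a toy illustration under invented assumptions (the score distributions are made up for the example and are not Cronin's data): two groups with identical averages but different spreads diverge sharply at both tails.

```python
# Toy illustration of the variance argument: equal averages, unequal spreads.
# The score distributions are invented for this example, not Cronin's data.
from statistics import NormalDist

men = NormalDist(mu=100, sigma=16)    # same mean, higher variance
women = NormalDist(mu=100, sigma=12)  # same mean, lower variance

high, low = 140, 60
print(f"above {high}: men {1 - men.cdf(high):.3%} vs women {1 - women.cdf(high):.3%}")
print(f"below {low}:  men {men.cdf(low):.3%} vs women {women.cdf(low):.3%}")
# Both tails, the "Nobels" above and the "dumbbells" below, come out several
# times larger for the higher-variance group despite identical averages.
```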

Many other contributors challenge conventional wisdom to write about, among other things, a finite universe; the brain creating a soul; and the Internet as a powerful tool for centralized state control.

Nor do all these deep thinkers agree. Computer scientist Rudy Rucker has come around to thinking that a computer program will be able to emulate the human mind so that self-aware robots could even believe in God. But computer scientist Roger Schank, who once said he would see machines as smart as humans within his lifetime, now believes that won't happen within the lifetime of his grandchildren.

The book's most important contribution, however, is to drive home the lesson that in science being wrong occasionally is a good thing, not least because it renews curiosity and reminds the scientists that they don't know everything.

As Discover magazine columnist Jaron Lanier writes in the book, "Being aware of being wrong once in a while keeps you young."

And since admitting they've been wrong and changing their minds works well for rational decision-making by scientists, perhaps politicians, the media and others might give it a try.

3 QUARKS DAILY [12.21.08]

How ought we, in this historical moment, to use science and technology to remake the world?

Americans have been talking about what to do about climate change. Two of the leading voices recently have been Jeffrey Sachs of Columbia University’s Earth Institute and Joseph Romm of the Center for American Progress Action Fund. Romm recommends a federal cap-and-trade plan that would immediately make carbon emissions more expensive. Sachs, however, believes that the economic costs of cap-and-trade are prohibitive without radically advanced technologies to make a low-carbon economy actually possible. Thus, Sachs proposes large-scale federal investment in the development and demonstration of new energy technologies. This is what the climate change debate is now about: whether we need new inventions to build a low-carbon economy, or whether our existing tools are good enough to get started right away.

But what is the debate over? The question of whether man will conquer nature occupied Enlightenment philosophers, but today the best progressive thinkers have moved on to other questions. The new conversation takes it as given that humans have been remaking the world for millennia.

The first part of the Hebrew Bible expresses this part of human life well, depending on how one reads it. God implicitly invites humankind to be creative: Of this earth you were made, and likewise, you shall remake it. Robert Harrison, in his new book [2], ascribes to the human condition what he calls the vocation of care, of which the act of tending a garden is the best example. In Harrison’s way of reading Genesis, the fall from the Garden of Eden was more of a blessing to be cherished than a loss to be mourned: Adam and Eve were granted the privilege of caring about the world. And what if they were originally created in God’s image? In that case, says theologian Ted Peters, they must participate in the ongoing creation of the world. Peters says we are “created cocreators.”

In any case, Earth’s crust is a dynamic place, and we might as well help it along. Though Sachs and Romm offer different suggestions for climate policy, they are responding to the same question: How ought we, in this historical moment, to remake the world again? If one feels alarmed by declining biodiversity, then one understands the importance of this moment. The energy technologies we select, whether they are old, emerging, or not yet developed, will have consequences for the continuing evolution of terrestrial life. The job of democratic citizens, as Walter Truett Anderson has been saying for years, includes governing evolution itself [1]. There is no turning back.

 

The Left — the party of science, environmentalism, equality, and choice — would do well to understand what this job does and does not include. First, as Oliver Morton explained a couple of years ago on Edge.org, it does not include saving the planet. Earth and its biosphere are resilient enough in the long term to take what we are giving them: fresh water depletion, species losses, a boosted greenhouse effect, and more. Nothing we can do (or at least, are at all likely to do) can stop biological and geological evolution on Earth. But while the planet can adapt, humans, especially the poorest, could be greatly harmed. The strongest arguments for cutting greenhouse gas emissions start by honoring human solidarity, not the intrinsic value of sea ice.

Second, our job does not include protecting the natural from the unnatural. It is too late, except in some of Earth’s remote polar regions, to preserve any “natural” ecosystems that remain unaffected by the conscious vita activa of men and women. The natural–unnatural distinction now serves no useful purpose. Moreover, it distracts us from other distinctions that do matter for our actual lives, like sustainable and unsustainable development.

The irrelevance of the natural–unnatural distinction matters, too, for our health and our ability to control our own bodies. Consider the rhetorical value of the word “nature.” Some philosophers use the word to mask moral norms. Leon Kass, for instance, goes to great lengths to explain why the assisted reproductive technologies that he finds repugnant are also “unnatural” [3]. One benefit of decisively discarding the language of the natural and the unnatural is that doing so will prevent people like Kass from using the word “nature” as a way of inserting private morals into public politics.

Third, our job does not include transcending the planet or our bodies. This should go without saying, but some writers have acquired some fame by demanding, in the name of the Enlightenment, human enhancement technologies that can deliver immortality and cognitive and emotional bliss. Dale Carrico has explained that these desires ignore both the fact of human vulnerability and the fact that technological progress does not happen without political progress to enable it.

Saving the planet, protecting the natural, and achieving technological transcendence are projects with which many persons of the Left have burdened themselves. Each of these projects is misguided, unnecessary, and counter-productive. By pushing these ideas firmly and permanently aside, we can more easily grasp the challenges we are really confronting.

The climate debate demonstrates this different way of talking about nature. Despite the contrasts between Sachs’s and Romm’s plans, neither is a defense of nature. Instead, both are proposals for how, in essence, to better integrate blind evolution and conscious design, ecology and technology, nature and art.

Consider the biological history of life. An unknown number of millennia ago, creativity on Earth was blind; intentionality and purposefulness, as we know them, had not yet been invented. Later, genetic evolution gave rise to the modern human. From that time forward, design, technology, and art variously complemented and commandeered the original program of evolution, ecology, and nature. The transition is irrevocable, and now, the challenge is to make it work. Modernity may overflow with excess, but we have little else to build upon. Even if we cannot directly counter Heidegger’s objections to industrial technology, we can still design a humane home on Earth good enough for anyone but the most rabid intellectual opponents of the Gestell. Sadly, we haven’t really started trying.

We must begin with design, which is everything we make and everything else that happens amidst the private and public relationships between human beings. Human consciousness is the key ingredient of design. Sachs and Romm offer different answers to the question: Which specific products of human conscious design should interact with the products of blind evolution, and how, and when, and where, and in what combinations? For the foreseeable future, this is the question that policymakers concerned with science and technology would be wise to ask. The very ability to ask the question suggests that the question itself is urgent. And we shall be better off if philosophers, policymakers, and scientists are not the only ones asking it.

Anyone who lives in a clothed or bejeweled body confronts the question constantly, though most often without realizing it. However, the development of human modification medicine and of nanoscale, biological, information, and cognitive technologies reveals the question’s importance more conspicuously. Here, the politics of choice and self-determination — which the Left played no small part in developing — is indispensable. Again, the question is not how to protect the natural body from unnatural adulteration. Rather, the question is how to enable a new kind of human and extra-human diversity. Carrico calls it “lifeway diversity”: the varieties of ways not of transcending one’s body, but of transforming it.

There are right ways and wrong ways to use science and technology for the transformation of our selves and the world. Unintended consequences of science and technology are inevitable. We need to minimize them, respond to them, and learn from them. And while we must not concern ourselves with the categories of natural and unnatural, we also should not forget that many things in the world that came before us — most especially ecosystems — are crucial to a well-functioning technological biosphere. In most cases, public policies should preserve ancient ecological balances that give rise to services we depend on. Technologies will play an ever-expanding role: first, in understanding what is actually happening in ecosystems, and second, in intervening appropriately.

What is true for ecosystems is also true for the biological and psychological systems of human bodies and minds. This point is most easily understood through the idea (popularized by the author Richard Ogle) of the “extended mind.” Mental life — cognition, emotion, and creativity — does not happen within the confines of the brain. Instead, it depends on complex interactions between bodies, environments, and culture. Some of these components are evolved; others, designed. Evolution and design interact every time we put our shoes on, read a map, or press the keys on a piano. The better interactions are those that give rise to what the economist Amartya Sen calls human capabilities, such as bodily health, practical reason, and play. Just as technologies should maximize ecosystem services, they should also maximize human capabilities.

The form of economic growth that is implicated in this dual focus on ecosystems and human beings should be the main concern of decision-makers and governments everywhere. Every single one of the U.N. Millennium Development Goals can be understood, at least didactically, as an effort to synchronize design and evolution. Sustainable economic growth comes from technologies that enable ecosystems and humans to do what they have been doing for millions or billions of years — and to do so more abundantly and with more freedom than is currently possible. Imagine everything that is beautiful in the world, and imagine lots more of it, but imagine still being vulnerable. That is the imperfect world that is ready to be slowly forged, whether it is made from tools we have now or tools still uninvented.

LETRAS LIBRES [12.15.08]

Humanism today limps along while Spain ostensibly despises science. González and Salazar Férriz describe a new and commendable effort to remedy that Spanish ignorance: Culture 3.0.

In the preface to the recent reissue of Julien Benda's 1927 The Treason of the Intellectuals (Galaxia Gutenberg), Fernando Savater writes that "perhaps the greatest paradox among the paradoxes of the twentieth century is this: never in human history was the capacity so developed to produce tools and to know the inner structure of reality in every field. Never, then, was there more scientific and technical brilliance. Yet never were there so many ideological movements founded (or better, foundered) on the irrational, the dogmatic, or the unverifiable; above all, never was there such an abundance of devotees of rapture or of intuitive certainty among the elite entrusted with the highest spiritual functions." In Benda's words, the men "whose function is the disinterested defense of eternal values such as justice and reason, and whom I call intellectuals," have betrayed that role for practical interests, which often turn the intellectual into a mere ideologue aspiring to a share of power...

...Following in the wake of Snow, and probably trying to repair the treason Benda described, John Brockman founded the Edge Foundation (www.edge.org) in 1988, an organization that seeks to reintegrate scientific and humanistic discourse under the idea of a "third culture," and to ensure that science plays a key role in the discussion of public affairs. ...


THE CHRONICLE OF HIGHER EDUCATION [12.14.08]

Just because you’re smart doesn’t mean you get things right the first time. That’s the premise behind What Have You Changed Your Mind About? (Harper Perennial), a new anthology. In it, 150 “big thinkers” describe what they now think they were wrong about earlier in their lives. Much of this has to do with technology and education. Among the highlights:

Ray Kurzweil no longer thinks that intelligent aliens exist. The oft-cited futurist and inventor, a pioneer in artificial intelligence and in making reading machines for the blind, says that conventional thinking holds there should be billions of such civilizations and a number of them should be ahead of us, “capable of vast, galaxy-wide technologies. So how can it be that we haven’t noticed” all of the signals they should be creating? “My own conclusion is that they don’t exist.”

Roger C. Schank used to say “we would have machines as smart as we are within my lifetime.” Now Mr. Schank, a former Yale University professor and director of Yale’s artificial-intelligence project, says: “I no longer believe that will happen… I still believe we can create very intelligent machines. But I no longer believe that those machines will be like us.” Chess-playing computers that beat people are not good examples, he says. Playing chess is not representative of typical human intelligence. “Chess players are methodical planners. Human beings are not.” We tend, Mr. Schank says, “to not know what we know.”

Randolph M. Nesse “used to believe that truth had a special home at universities.” Mr. Nesse, professor of psychiatry at the University of Michigan and an expert on evolution and medicine, now thinks “universities may be the best show in town for truth pursuers, but most of them stifle innovation and constructive engagement of real controversies — not just sometimes but most of the time, systematically.” Faculty committees, he complains, make sure that most positions “go to people just about like themselves.” Deans ask how much external financing new hires will bring in. “No one with new ideas … can hope to get through this fine sieve.” —Josh Fischman

THE CHRONICLE OF HIGHER EDUCATION [12.14.08]

Yesterday I listed a few flip-flops by leading thinkers chronicled in a new anthology, What Have You Changed Your Mind About? (Harper Perennial). Whether universities were really that great was one of them. But there are more.

One of the major things that bright minds have rethought is the idea that the Internet will be a boon to humanity. Here is why:

It does not fight authority. Nicholas Carr, who wrote the recent best seller The Big Switch: Rewiring the World, from Edison to Google, used to believe the Internet would shift the bulk of power to the little people, away from big companies and governments. But "its technical and commercial workings actually promote the centralization of power and control," he says. Although the overall number of Web sites increased from 2002 through 2006, the concentration of traffic at the 10 most popular sites grew from 31 percent to 40 percent of all page views. Further, "look at how Google continues to expand its hegemony over Web searching," Mr. Carr says. "To what end will the Web giants deploy their power? They will, of course, seek to further their own commercial or political interests."

A few bad people counteract many good people, and machines can't fix that. Xeni Jardin, co-editor of the tech blog Boing Boing, says comments on the blog were useful and fun, originally. But as the blog grew more popular, so did antisocial posts by "trolls," or "people for whom dialogue wasn't the point." Things got so nasty that Boing Boing editors finally removed the ability for readers to comment. Now she has reinstated comments, because "we hired a community manager. … If someone is misbehaving, she can remove all the vowels from their screed with one click." There is no automated way to do this, Ms. Jardin says, and "the solution isn't easy, cheap, or hands-free. Few things of value are."
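
The one-click vowel removal Jardin mentions is a known moderation trick called disemvoweling. A minimal sketch, assuming plain ASCII text (the function name and sample comment are invented for illustration):

```python
# Disemvoweling: strip the vowels so a troll's comment stays visible but
# becomes nearly unreadable. A toy version of the general idea; Boing Boing's
# actual moderation tool is not public.
import re

def disemvowel(comment: str) -> str:
    """Remove all ASCII vowels, upper- and lowercase, from a comment."""
    return re.sub(r"[aeiouAEIOU]", "", comment)

print(disemvowel("You are all wrong and I alone am right!"))
# prints: Y r ll wrng nd  ln m rght!
```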

TECHNOLOGY REVIEW [11.30.08]

With little notice from the outside world, the community-written encyclopedia Wikipedia has redefined the commonly accepted use of the word "truth."

Why should we care? Because ­Wikipedia's articles are the first- or second-ranked results for most Internet searches. Type "iron" into Google, and Wikipedia's article on the element is the top-ranked result; likewise, its article on the Iron Cross is first when the search words are "iron cross." Google's search algorithms rank a story in part by how many times it has been linked to; people are linking to Wikipedia articles a lot.
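
The link-counting idea is simple enough to sketch. Here is a toy PageRank-style ranker in Python, an illustration of the general approach rather than Google's actual algorithm; the miniature "web" below is invented for the example.

```python
# A toy PageRank-style ranker: a page's score grows with the number and
# rank of the pages linking to it. Illustrative only; not Google's code.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages   # a dangling page spreads rank evenly
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# An invented miniature web in which most pages link to "wikipedia":
web = {
    "wikipedia": ["news"],
    "blog_a": ["wikipedia"],
    "blog_b": ["wikipedia", "blog_a"],
    "news": ["wikipedia"],
}
ranks = pagerank(web)
print(sorted(ranks, key=ranks.get, reverse=True))  # "wikipedia" ranks first
```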

This means that the content of these articles really matters. Wikipedia's standards of inclusion (what's in and what's not) affect the work of journalists, who routinely read Wikipedia articles and then repeat the wikiclaims as "background" without bothering to cite them. These standards affect students, whose research on many topics starts (and often ends) with Wikipedia. And since I used Wikipedia to research large parts of this article, these standards are affecting you, dear reader, at this very moment.

Essays and Opinion
ARTS & LETTERS DAILY [11.19.08]

Witch hunters in Africa lynch “thieves” who rob men of their masculinity. Many people’s grasp of economics is at the same level. The Edge economics course is a curative... more» ... Class no. 1 ...


SUEDDEUTSCHE ZEITUNG [11.10.08]

It was one of those dreamlike moments in science at which one would have loved to be present. Last summer, in Sonoma, California, three generations of behavioral economics, the research field that tries to approach the mechanisms of the market from the human point of view, met for an Edge Foundation master class. Daniel Kahneman was the eldest of the three prominent guests: actually a professor of psychology at Princeton, but also a Nobel laureate in economics for his ...

PUBLISHERS WEEKLY [11.2.08]

In this wide-ranging assortment of 150 brief essays, well-known figures from every conceivable field demonstrate why it's a prerogative of all thoughtful people to change their mind once in a while. Technologist Ray Kurzweil says he now shares Enrico Fermi's question: if other intelligent civilizations exist, then where are they? Nassim Nicholas Taleb (The Black Swan) reveals that he has lost faith in probability as a guiding light for making decisions. Oliver Morton (Mapping Mars) confesses that he has lost his childlike faith in the value of manned space flight to distant worlds. J. Craig Venter, celebrated for his work on the human genome, has ceased to believe that nature can absorb whatever abuses we subject it to, and now holds that world governments must move quickly to prevent global disaster. Alan Alda says, “So far, I've changed my mind twice about God,” going from believer to atheist to agnostic. Brockman, editor of Edge.org and numerous anthologies, has pulled together a thought-provoking collection of focused and tightly argued pieces demonstrating the courage to change strongly held convictions. (Jan.)

HUFFINGTON POST [11.1.08]

...Jaron Lanier takes on the debate about the role and power of computers in shaping human finances, behavior and prospects from a radically different vantage point, faulting, in an article published on the Edge web site, "cybernetic totalists" who absolve of responsibility for "whatever happens" the individual people who do specific things: "I think that treating technology as if it were autonomous is the ultimate self-fulfilling prophecy. There is no difference between machine autonomy and the abdication of human responsibility. ... There is a real chance that evolutionary psychology, artificial intelligence, Moore's law fetishizing, and the rest of the package will catch on in a big way, as big as Freud or Marx did in their times."

[Also: Nathan Myhrvold, George Dyson, Ray Kurzweil]

...

THE BOSTON GLOBE [11.1.08]

RESERVE ANOTHER LAUREL for Edward O. Wilson, the Pellegrino University Professor emeritus at Harvard, serial Pulitzer winner, and prominent intellectual: online celebrity.


Forget Charlie Rose - Wilson has Google for a soapbox. Amid the amateur-hour piffle of YouTube "talent" and skateboarding dogs, the famed biologist stands in bold relief, with more than 500 Google video search results to his credit: interviews ranging far afield of TV shows to a spate of appearances on several Web-only video platforms such as Meaningoflife.tv, Bigthink.com, Fora.tv, and the online home of the Technology Entertainment Design (TED) conference.

It was through a TED presentation that Wilson chose to unveil his proposal for the Encyclopedia of Life, a Wikipedia of biodiversity, and a few short months later he secured the funding necessary to launch it. Hitting the talk show circuit never looked so passé.

The rise this year of a host of new Web video sites targeting high-minded, edifying content suggests that today's marketplace of ideas is rapidly moving online. "The Last Lecture," a 76-minute video by the late engineering professor Randy Pausch recorded late last year, became a crossover phenomenon - viewed by at least 7 million - and easily one of the most widely watched academic events in history. The buzzy presentation shorts of TED surged past 50 million viewings on only their second birthday.

Newly minted video start-ups Fora.tv and Bigthink.com, boasting auspicious programming starring top-shelf public intellectuals, each pledged this year to become a thinking person's YouTube: with combined inventories in the tens of thousands of clips and numerous partnerships with major media properties, that viewership is only expanding. And iTunes U, a multimedia channel of free higher education content at the iTunes Store, continued to amass its increasingly Amazonian stockpile of labs and lectures from schools around the world.

From interviews with obscure geniuses to splashy marquee names, and from hoary conference proceedings and drowsy first-year survey classes to passionate debates at exclusive, invite-only affairs like the Aspen Institute, an entire back catalog of cerebral Web video is steadily accumulating online. How do the various offerings rate?

The Oprah: TED

The TED Talks program single-handedly popularized the phenomenon of brainy programming. It's an online repository of zippy, often provocative presentations delivered by speakers at the eponymous conference. Topics range widely across the arts and sciences: inventors in today's Africa, the nature of happiness, and an evolutionary theory of technology.

TED has become, in no small part due to its videos' viral popularity, a high rooftop in academia for world authorities to report on the latest thinking in their fields, an Oprah of the intelligentsia. Like Oprah, it wields considerable kingmaking power through its presentation schedule, whose speakers graduate to greater buzz, and sometimes lasting celebrity, in the wider media.

The reason TED videos work? Their winning production values and packaging translate so well online. Even the stuffiest subject - say, Swedish global health professor Hans Rosling talking about how to visualize statistics - proves, through the meticulously observed conventions of a TED presentation (brisk pacing, humor, strong visuals), a reliably entertaining break from the tedium and rigors of high church academic discourse. TED is not a venue for speakers to go deep on a subject; instead, it's one for teasing out the more bravura elements of their work.

The Public Broadcaster: Fora.tv

Amassing video from public events and colloquia around the world, the online video storehouse Fora.tv is the wide angle lens to TED's close-up. Giving up TED-like uniform polish for the capaciousness of its collection, Fora.tv is quickly expanding its inventory through a deep roster of partners: a number of schools, the New Republic, the Brookings Institution, the Oxonian Society, the Long Now Foundation, the Hoover Institution, and the Aspen Institute.

The speakers themselves include a wide range of Nobel laureates, cultural notables, politicians, and policy wonks - but also plenty of the unexpected. Settle in for a lively food fight between Christopher Hitchens and the Rev. Al Sharpton, tussling over religion at the New York Public Library, and then rove over to see Christian Lander, author of the blog Stuff White People Like, gawking meekly and stopping to photograph the large crowd attending his Washington, D.C., reading.

Fora.tv's features are more specialized than the others', including the ability for users to compile their own video libraries, as well as to upload and integrate their own contributions, public-access-cable style. Production quality hardly warrants demerits in the face of the catalog collected here - one could sink into this C-SPAN of the public sphere for days of unexpected viewing.

The Talk Show: Bigthink.com

For those who think small is better, the video clip library of Bigthink.com delivers. Its shorts are studio-shot, first-person interviews. Each clip features the interviewee answering a single question or waxing on a single topic: for example, UCLA law professor Kal Raustiala explaining his "piracy paradox," the puzzle that intellectual property protection may be inhibiting creative progress in culture and industry.

Like TED Talks, Bigthink.com video can be searched by themes and ideas as well as speakers. It also provides text transcripts, which are easy to scan and search for those just researching.

Its edge is in the notoriety of its interviewees - world leaders, presidential candidates, and prestigious thinkers - combined with its prolific production schedule. It maintains the clip of new content one would expect of a daily television show, not an Internet video site. Still, for longer sittings, its quick-take editing style can feel thin.

The Op/Ed: Bloggingheads.tv

Following from an earlier video series concept, Meaningoflife.tv, the cultural critic Robert Wright created with prominent blogger Mickey Kaus a head-to-head video debate series called Bloggingheads.tv. Advertisements tout it as being like "Lincoln-Douglas . . . with lower production values."

Nearly every day, it publishes a new "diavlog", or two-way video debate, and chapters allow users to surf quickly between conversation points. The diavlog is a millennial take on another press mainstay, the opinion editorial. (Disclosure: Bloggingheads.tv has a partnership with the New York Times Co. through the Times' opinion pages online. The Globe is owned by the New York Times Co.)

The roster of speakers is heavy on journalists and public-policy types, and the daily subject matter skews - no doubt, somewhat wearingly for some of us - toward headline political news and opinion. While Wright and Kaus star as the mainstays, the stable of commentators is populous enough to keep Bloggingheads.tv from the whiff of celebrity - the topic is always the star.

Graduate Studies: Edge.org

For those seeking substance over sheen, the occasional videos released atEdge.org hit the mark. The Edge Foundation community is a circle, mainly scientists but also other academics, entrepreneurs, and cultural figures, brought together by the literary agent John Brockman.

Edge's long-form interview videos are a deep-dive into the daily lives and passions of its subjects, and their passions are presented without primers or apologies. It is presently streaming excerpts from a private lecture, including a thoughtful question and answer session, by Nobel laureate Daniel Kahneman to Edge colleagues on the importance of behavioral economics.

It won't be to everyone's taste. Unvarnished speakers are filmed in casual lecture: Sendhil Mullainathan, a MacArthur recipient with intriguing insights on poverty, unspools his thoughts in the mode of someone not preoccupied with clarity or economy of expression. The text transcripts are helpful in this context.

Regardless, the decidedly noncommercial nature of Edge's offerings, and the egghead imprimatur of the Edge community, lend its videos a refreshing air, making one wonder if broadcast television will ever offer half the off-kilter sparkle of their salon chatter.

And as for Charlie Rose? Perhaps he's voted with his feet. Excerpts of his PBS show are now streaming daily at the online magazine Slate.

Jeffrey MacIntyre (jeffmacintyre.com), who writes on culture, science and technology, is also a consultant to digital publishers (predicate-llc.com). He lives in New York.

Correction: An article about video lectures ("U Tube") in the Ideas section of Sunday, Nov. 2, misidentified Edward O. Wilson of Harvard University. He is a biologist.

THE RECORD (WATERLOO) [10.31.08]

I was watching a PBS production the other day entitled Dogs That Changed the World, and wondered about our contemporary fascination with things "That Changed the World."

The Machine That Changed the World (a 1991 book about automotive mass production). Cod: A Biography of The Fish That Changed the World (a 1998 book about, well, cod). The Map That Changed The World (2002 book about geologist William Smith). 100 Photographs That Changed the World (Life, 2003). Bridges That Changed the World (book, 2005). The Harlem Globetrotters: The Team That Changed the World (book, 2005). How William Shatner Changed the World (documentary, 2006). Genius Genes: How Asperger Talents Changed the World (book on brilliant people with autism, 2007). The Book That Changed the World (2008 article in the Guardian, about The Origin of Species).

This "Changed the World" stuff is getting to be a bit tedious, isn't it? Now that we have Dogs That Changed the World, can Cats That Changed the World be far behind? ...

...Bill Bean notes that there is already a place to read about People Who Changed the World and Then Changed Their Minds. Every year, the people at the Edge Foundation ask writers, thinkers, psychologists, historians and others what major ideas they have changed their minds about. Go to www.edge.org. It's good reading.

...

THE NEW YORK TIMES [10.10.08]

“BEWARE of geeks bearing formulas.” So saith Warren Buffett, the Wizard of Omaha. Words to bear in mind as we bail out banks and buy up mortgages and tweak interest rates and nothing, nothing seems to make any difference on Wall Street or Main Street. Years ago, Mr. Buffett called derivatives “weapons of financial mass destruction” — an apt metaphor considering that the Manhattan Project’s math and physics geeks bearing formulas brought us the original weapon of mass destruction, at Trinity in New Mexico on July 16, 1945.

In a 1981 documentary called “The Day After Trinity,” Freeman Dyson, a reigning gray eminence of math and theoretical physics, as well as an ardent proponent of nuclear disarmament, described the seductive power that brought us the ability to create atomic energy out of nothing.

“I have felt it myself,” he warned. “The glitter of nuclear weapons. It is irresistible if you come to them as a scientist. To feel it’s there in your hands, to release this energy that fuels the stars, to let it do your bidding. To perform these miracles, to lift a million tons of rock into the sky. It is something that gives people an illusion of illimitable power, and it is, in some ways, responsible for all our troubles — this, what you might call technical arrogance, that overcomes people when they see what they can do with their minds.”

The Wall Street geeks, the quantitative analysts (“quants”) and masters of “algo trading” probably felt the same irresistible lure of “illimitable power” when they discovered “evolutionary algorithms” that allowed them to create vast empires of wealth by deriving the dependence structures of portfolio credit derivatives.

What does that mean? You’ll never know. Over and over again, financial experts and wonkish talking heads endeavor to explain these mysterious, “toxic” financial instruments to us lay folk. Over and over, they ignobly fail, because we all know that no one understands credit default obligations and derivatives, except perhaps Mr. Buffett and the computers who created them.

Somehow the genius quants — the best and brightest geeks Wall Street firms could buy — fed $1 trillion in subprime mortgage debt into their supercomputers, added some derivatives, massaged the arrangements with computer algorithms and — poof! — created $62 trillion in imaginary wealth. It’s not much of a stretch to imagine that all of that imaginary wealth is locked up somewhere inside the computers, and that we humans, led by the silverback males of the financial world, Ben Bernanke and Henry Paulson, are frantically beseeching the monolith for answers. Or maybe we are lost in space, with Dave the astronaut pleading, “Open the bank vault doors, Hal.”

As the current financial crisis spreads (like a computer virus) on the earth’s nervous system (the Internet), it’s worth asking if we have somehow managed to colossally outsmart ourselves using computers. After all, the Wall Street titans loved swaps and derivatives because they were totally unregulated by humans. That left nobody but the machines in charge.

How fitting then, that almost 30 years after Freeman Dyson described the almost unspeakable urges of the nuclear geeks creating illimitable energy out of equations, his son, George Dyson, has written an essay (published at Edge.org) warning about a different strain of technical arrogance that has brought the entire planet to the brink of financial destruction. George Dyson is an historian of technology and the author of “Darwin Among the Machines,” a book that warned us a decade ago that it was only a matter of time before technology out-evolves us and takes over.

His new essay — “Economic Dis-Equilibrium: Can You Have Your House and Spend It Too?” — begins with a history of “stock,” originally a stick of hazel, willow or alder wood, inscribed with notches indicating monetary amounts and dates. When funds were transferred, the stick was split into identical halves — with one side going to the depositor and the other to the party safeguarding the money — and represented proof positive that gold had been deposited somewhere to back it up. That was good enough for 600 years, until we decided that we needed more speed and efficiency.

Making money, it seems, is all about the velocity of moving it around, so that it can exist in Hong Kong one moment and Wall Street a split second later. “The unlimited replication of information is generally a public good,” George Dyson writes. “The problem starts, as the current crisis demonstrates, when unregulated replication is applied to money itself. Highly complex computer-generated financial instruments (known as derivatives) are being produced, not from natural factors of production or other goods, but purely from other financial instruments.”

It was easy enough for us humans to understand a stick or a dollar bill when it was backed by something tangible somewhere, but only computers can understand and derive a correlation structure from observed collateralized debt obligation tranche spreads. Which leads us to the next question: Just how much of the world’s financial stability now lies in the “hands” of computerized trading algorithms?
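
That last sentence can at least be gestured at in code. Below is a hedged, toy version of a one-factor Gaussian copula, the generic technique behind deriving default correlation for tranched debt; it is a sketch of the method, not any bank's model, and every parameter is invented for illustration.

```python
# A toy one-factor Gaussian copula, the generic correlation machinery behind
# CDO tranche pricing. Invented parameters, illustration only; not any
# firm's actual model.
import random
from statistics import NormalDist

def tranche_expected_loss(n_loans=100, p_default=0.05, rho=0.3,
                          attach=0.03, detach=0.07, trials=10000):
    """Expected loss on a tranche absorbing portfolio losses in [attach, detach)."""
    threshold = NormalDist().inv_cdf(p_default)   # a loan defaults if latent < this
    total = 0.0
    for _ in range(trials):
        market = random.gauss(0.0, 1.0)           # one shared "market" factor
        defaults = 0
        for _ in range(n_loans):
            idiosyncratic = random.gauss(0.0, 1.0)
            latent = rho**0.5 * market + (1 - rho)**0.5 * idiosyncratic
            if latent < threshold:
                defaults += 1
        loss = defaults / n_loans                 # fraction of the portfolio lost
        eaten = min(max(loss - attach, 0.0), detach - attach)
        total += eaten / (detach - attach)        # fraction of the tranche lost
    return total / trials

# The same loans yield very different tranche losses as the assumed
# correlation moves, which is why those assumptions mattered so much.
for rho in (0.0, 0.3, 0.6):
    print(f"rho={rho}: expected tranche loss {tranche_expected_loss(rho=rho):.1%}")
```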

Here’s a frightening party trick that I learned from the futurist Ray Kurzweil. Read this excerpt and then I’ll tell you who wrote it:

But we are suggesting neither that the human race would voluntarily turn power over to the machines nor that the machines would willfully seize power. What we do suggest is that the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines’ decisions. ... Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won’t be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.

Brace yourself. It comes from the Unabomber’s manifesto.

Yes, Theodore Kaczynski was a homicidal psychopath and a paranoid kook, but he was also a bloodhound when it came to scenting all of the horrors technology holds in store for us. Hence his mission to kill technologists before machines commenced what he believed would be their inevitable reign of terror.

We are living, we have long been told, in the Information Age. Yet now we are faced with the sickening suspicion that technology has run ahead of us. Man is a fire-stealing animal, and we can’t help building machines and machine intelligences, even if, from time to time, we use them not only to outsmart ourselves but to bring us right up to the doorstep of Doom.

We are still fearful, superstitious and all-too-human creatures. At times, we forget the magnitude of the havoc we can wreak by off-loading our minds onto super-intelligent machines, that is, until they run away from us, like mad sorcerers’ apprentices, and drag us up to the precipice for a look down into the abyss.

As the financial experts all over the world use machines to unwind Gordian knots of financial arrangements so complex that only machines can make — “derive” — and trade them, we have to wonder: Are we living in a bad sci-fi movie? Is the Matrix made of credit default swaps?

When Treasury Secretary Paulson (looking very much like a frightened primate) came to Congress seeking an emergency loan, Senator Jon Tester of Montana, a Democrat still living on his family homestead, asked him: “I’m a dirt farmer. Why do we have one week to determine that $700 billion has to be appropriated or this country’s financial system goes down the pipes?”

“Well, sir,” Mr. Paulson could well have responded, “the computers have demanded it.”

Richard Dooling is the author of “Rapture for the Geeks: When A.I. Outsmarts I.Q.”

EL PAIS [10.9.08]

Internet ya es para muchos el mayor canal de información. Cada vez es superior el tiempo empleado en navegar, ya sea para leer las noticias, revisar el correo, ver vídeos y escuchar música, consultar enciclopedias, mapas, conversar por teléfono y escribir blogs. En definitiva, la Red filtra gran parte de nuestro acceso a la realidad. El cerebro humano se adapta a cada nuevo cambio e Internet supone uno sin precedentes. ¿Cuál va a ser su influencia? Los expertos están divididos. Para unos, podría disminuir la capacidad de leer y pensar en profundidad. Para otros, la tecnología se combinará en un futuro próximo con el cerebro para aumentar exponencialmente la capacidad intelectual.

Uno de los más recientes en plantear el debate ha sido el ensayista estadounidense Nicholas G. Carr, experto en Tecnologías de la Información y la Comunicación (TIC), y asesor de la Enciclopedia británica. Asegura que ya no piensa como antes. Le sucede sobre todo cuando lee. Antes se sumergía en un libro y era capaz de zamparse páginas y páginas hora tras hora. Pero ahora sólo aguanta unos párrafos. Se desconcentra, se inquieta y busca otra cosa que hacer. "La lectura profunda que solía suceder de forma natural se ha convertido en un esfuerzo", señala Carr en el provocador artículo Is Google making us stupid? (¿Está Google volviéndonos tontos?), publicado en la revista The Atlantic. Carr achaca su desorientación a una razón principal: el uso prolongado de Internet. Está convencido de que la Red, como el resto de medios de comunicación, no es inocua. "[Los medios] Suministran el material del pensamiento, pero también modelan el proceso de pensar", insiste.

"Creo que la mayor amenaza es su potencial para disminuir nuestra capacidad de concentración, reflexión y contemplación", advierte Carr, a través del correo electrónico. "Mientras Internet se convierte en nuestro medio universal, podría estar readiestrando nuestros cerebros para recibir información de manera muy rápida y en pequeñas porciones", añade. "Lo que perdemos es nuestra capacidad para mantener una línea de pensamiento sostenida durante un periodo largo".

El planteamiento de Carr ha suscitado cierto debate en foros especializados, como en la revista científica online Edge.org, y de hecho no es descabellado. Los neurólogos sostienen que todas las actividades mentales influyen a un nivel biológico en el cerebro; es decir, en el establecimiento de las conexiones neuronales, la compleja red eléctrica en la que se forman los pensamientos. "El cerebro evolucionó para encontrar pautas. Si la información se presenta en una forma determinada, el cerebro aprenderá esa estructura", detalla desde Londres Beau Lotto, profesor de neurociencia en el University College de Londres. Y añade una precisión: "Luego habría que ver si el cerebro aplica esa estructura en el modo de comportarse frente a otras circunstancias; no tiene por qué ser así necesariamente, pero es perfectamente posible".

Lo que queda por ver es si esta influencia va a ser negativa, como vaticina Carr, o si va a ser el primer paso para integrar la tecnología en el cuerpo humano y ampliar las capacidades del cerebro, como predice el inventor y experto en inteligencia artificial Raymond Kurzweil. "Nuestras primeras herramientas ampliaron nuestro alcance físico, y ahora extienden nuestro alcance mental. Nuestros cerebros advierten de que no necesitan dedicar un esfuerzo mental (y neuronal) a aquellas tareas que podemos dejar a las máquinas", razona Kurzweil desde Nueva Jersey. Y cita un ejemplo: "Nos hemos vuelto menos capaces de realizar operaciones aritméticas desde que las calculadoras lo hacen por nosotros hace ya muchas décadas. Ahora confiamos en Google como un amplificador de nuestra memoria, así que de hecho recordamos peor las cosas que sin él. Pero eso no es un problema porque no tenemos por qué prescindir de Google. De hecho, estas herramientas se están volviendo más ubicuas, y están disponibles todo el tiempo".

Oponer cerebro y tecnología es un enfoque erróneo, según coincide con Kurzweil el profesor JohnMcEneaney, del Departamento de Lectura y Artes lingüísticas de la Universidad de Oakland (EE UU). "Creo que la tecnología es una expresión directa de nuestra cognición", discurre McEneaney. "Las herramientas que empleamos son tan importantes como las neuronas de nuestros cráneos. Las herramientas definen la naturaleza de la tarea para que las neuronas puedan hacer el trabajo".

Carr insiste en que esta influencia será mucho mayor a medida que aumente el uso de Internet. Se trata de un fenómeno incipiente que la neurología y la psicología tendrán que abordar a fondo, pero de momento un informe pionero sobre hábitos de búsqueda de información en Internet, dirigido por expertos del University College de Londres (UCL), indica que podríamos hallarnos en medio de un gran cambio de la capacidad humana para leer y pensar.

El estudio observó el comportamiento de los usuarios de dos páginas web de investigación, uno de la British Library y otro del Joint Information Systems Comittee (JISC), un consorcio educativo estatal que proporciona acceso a periódicos y libros electrónicos, entre otros recursos. Al recopilar los registros, los investigadores advirtieron que los usuarios "echaban vistazos" a la información, en vez de detenerse en ella. Saltaban de un artículo a otro, y no solían volver atrás. Leían una o dos páginas en cada fuente y clicaban a otra. Solían dedicar una media de cuatro minutos por libro electrónico y ocho minutos por periódico electrónico. "Está claro que los usuarios no leenonline en el sentido tradicional; de hecho, hay indicios de que surgen nuevas formas de lectura a medida que los usuarios echan vistazos horizontalmente a través de títulos, páginas y resúmenes en busca de satisfacciones inmediatas", constata el documento. "Casi parece que se conectan a la Red para evitar leer al modo tradicional".

Los expertos inciden en que se trata de un cambio vertiginoso. "La Red ha provocado que la gente se comporte de una manera bastante diferente con respecto a la información. Esto podría parecer contradictorio con las ideas aceptadas de la biología y la psicología evolutivas de que el comportamiento humano básico no cambia de manera súbita", señala desde Londres el profesor David Nicholas, de la Facultad de Información, Archivos y Bibliotecas del UCL. "Hay un consenso general en que nunca habíamos visto un cambio a esta escala y rapidez, así que éste podría muy bien ser el caso [de un cambio repentino]", añade, citando su ensayo Digital consumers.

Se trata de una transformación sin precedentes porque es un nuevo medio con el potencial de incluir a todos los demás. "Nunca un sistema de comunicaciones ha jugado tantos papeles en nuestras vidas ?o ejercido semejante influencia sobre nuestros pensamientos? como Internet hace hoy", incide Carr. "Aun así, a pesar de todo lo que se ha escrito sobre la Red, se ha prestado poca atención a cómo nos está reprogramando exactamente".

Esta alteración de las maneras de buscar información y de leer no sólo afectaría a los más jóvenes, a los que se les supone mayor número de horas conectado, sino a individuos de todas las edades. "Lo mismo les ha sucedido a maestros, profesores y médicos de cabecera. Todo el mundo muestra un comportamiento de saltos y lecturas por encima", precisa el informe.

Carr insists that one of the key issues is the "shallow" mode of reading that is gaining ground. "In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, we make our own associations, draw our own inferences and analogies, foster our own ideas." The problem is that blocking deep reading blocks deep thinking, since the one is indistinguishable from the other, writes Maryanne Wolf, a reading and language researcher at Tufts University (US) and author of Cómo aprendemos a leer (Ediciones B), the Spanish edition of Proust and the Squid. Her worry is that "unguided information may create a mirage of knowledge and thereby curtail the long, difficult, crucial thought processes that lead to genuine knowledge," Wolf says from Boston.

Beyond the warnings about the Internet's hypothetical effects on cognition, scientists such as Kurzweil welcome its influence: "The more we rely on the non-biological part of our intelligence (that is, the machines), the less the biological part works, but the combination as a whole grows more intelligent." Others dispute this prediction: greater dependence on the Web, they argue, makes users lazy and, among other acquired habits, leads them to trust search engines completely, as if they were the grail. "They use it as a crutch," says Professor Nicholas, who doubts that the tool frees the brain from search chores so that it can be put to other uses.

Carr goes further, arguing that "glance" reading benefits the companies involved. "Their revenues rise as we spend more time connected and as we increase the number of pages and bits of information we view," he reasons. "The companies have a strong economic interest in speeding up our intake of information," he adds. "That does not mean they deliberately want us to lose our capacity for concentration and contemplation; it is just a side effect of their business model."

Other experts heavily qualify Carr's forecast. The technology writer Edward Tenner, author of Our Own Devices: How Technology Remakes Humanity, shares Carr's concern but adds that the damage need not be irreversible. "I agree with the concern about superficial use of the Internet, but I see it as a cultural problem, reversible through better teaching and better search software, not as a neurological deformation," he explains from New Jersey (US). "It is like people who are used to cars and recliners but understand the importance of exercise."

In the end, scientists such as Kurzweil stress the Internet's potential as a tool of knowledge. "The Web offers the chance to host all the computation, knowledge and communication there is. Ultimately it will far exceed the capacity of biological human intelligence." And he concludes: "Once machines can do everything humans can do, it will be a powerful combination, because it will be joined to the ways in which machines are already superior. And we will merge with this technology to make ourselves smarter."

FINANCIAL TIMES [9.29.08]

At last, we have a black swan. The credit crisis began last year soon after the publication of Nassim Nicholas Taleb's bestselling Black Swan, which tackled the impact of unexpected events, such as the discovery of black swans in Australia by explorers who had thought all swans were white. ...

...Prediction markets, summing the market's wisdom, had it wrong. Last week, the Intrade market put the odds that the Tarp would have passed by now at more than 90 per cent.

Models using market extremes to predict political interventions were also fooled. When volatility rises as high as in the past few weeks, it has in the past been a great bet that the government will do something—which is in part why spikes in volatility tend to be great predictors of a subsequent bounce.

Taleb himself suggested recently that investors should rely least on normal statistical methods when they are in the "fourth quadrant"—when there is a complex range of possible outcomes and where the distribution of responses to those outcomes does not follow a well understood pattern.

Investors were in that quadrant on Monday morning. They were vulnerable to black swans and should not have relied on statistics as a guide.

One prediction for the future does look safe, however: investors will spend much more time making qualitative assessments of political risk.

...

BLOGGINGHEADS.TV [9.19.08]

JOHNSON: To get back to Taleb again, obviously, this piece on Edge is really food for thought. He mentioned Mandelbrot sets and fractals and power laws, where you have a rare number of extreme events and a lot of smaller, less extreme events. But his point underlying this is that he didn't believe for a moment that these mathematical models actually explained reality, or financial-market reality in any case. They are ways to think about it, ways to get a handle on it, but basically it's too complex for us to understand.

I found that very refreshing, since the thing that strikes me sometimes about the universe, when we get to the ultimate questions, is that we have these wonderful tools that are very helpful, but you can't mistake the map for the reality, that old saw about the map and the territory.

HORGAN: We're all bozos on this bus....

FRANKFURTER ALLGEMEINE ZEITUNG [9.14.08]


295 grams for 359 dollars: the Kindle reading device stores up to 200 books

When the seventh and final volume of the Harry Potter series appeared on July 21, 2007, more than ten million copies were sold within twenty-four hours. People spoke of a logistical masterstroke and of a triumph for a venerable medium whose imminent end has been announced again and again for many years. Never before in the centuries-long history of printing had a single book spread at such speed. But what would have happened if everyone had been able to download the work from the Internet as an electronic book? How many people would have taken advantage of that option: twenty million, perhaps fifty million? However large the number, for the traditional book trade, for which only a Potter year is a good year anymore, it describes a catastrophe scenario. For e-books are sold exclusively over the Internet; brick-and-mortar bookstores earn nothing from them.

That is only one of many reasons why the Kindle, as Amazon has christened its new electronic reading device, has stirred great fears alongside enthusiasm for technological progress. For the first time since such devices were introduced about ten years ago, the Kindle and its kind, such as Sony's Portable Reader, the Cybook or the iLiad, threaten to offer the printed book serious competition. Thanks to a new screen technology that works with a sort of electronic ink, reading on the new devices is just as easy as on a computer screen, and that even outdoors in daylight.

An electronic Potter fire nipped in the bud

One could dismiss these devices as useful accessories, as digital pack mules and a profane alternative to the beautiful book. It is, after all, genuinely convenient on a journey to have a veritable reference library of several hundred volumes stored in a small, rather ugly plastic box that is scarcely larger or heavier than a paperback and offers such useful functions as full-text search besides. But while some speak joyfully of the greatest evolutionary step since the invention of printing, others ask anxiously what upheavals may await the most important medium in the history of culture. So far only one thing seems certain: all those who have anything to do with the book, whether they write it, print it, bind it, publish it, haul it across the country, sell it or read it, are likely to be touched in one way or another by the new technology.

Literally translated, "to kindle" means to ignite or set aflame. Joanne K. Rowling nipped in the bud the electronic Potter fire the Kindle might have lit when she decided that her works may not appear as e-books. The former teacher's motives are the subject of fierce speculation, including on the website of Amazon's e-book division, the "Kindle Store." Potter fans who are also Kindle fans vent their anger there, suspect the author of anti-American intentions, or simply call her the "incarnation of evil."

Does the book trade face the fate of the music industry?

For some days now, a post has stood there from a man of whom nothing is known but his name, and even that may be invented. We know nothing about John Newton except that he owns a Kindle and lends his voice to all those who would deny perhaps the world's best-selling author the rights to her intellectual property. For John Newton maintains that it is entirely irrelevant whether Joanne K. Rowling consents to the Harry Potter books appearing as e-books or not. His sole argument is brutally sober: "These e-books already exist." All the author can still decide, he says, is whether she wants to give her readers the chance to obtain a Harry Potter e-book by "legal means" as well.

That is a barely veiled call to piracy, the form of everyday Internet theft that brought the music industry to the brink of ruin after the rise of the iPod. Does a similar fate now await the book trade?

The most beautiful invention in the world

"What is wrong with the e-book?" asks the Turkish Nobel laureate Orhan Pamuk, who immediately wants to know how many Kindle users there already are in Germany. His agent firmly counsels restraint, he says, but he himself has no objection in principle. Pamuk is an international author, with nearly as many readers in Germany or the United States as in Turkey, and he is open to new markets: "The e-book will most likely be nothing more than another supplement to the hardcover, much like the paperback or the audiobook. I would have nothing against finding perhaps another 100,000 readers this way." That the Kindle Store already offers four of his titles, the Nobel laureate seems not to know. Many of the writers represented there are probably no different.

Pamuk's German publisher takes a decidedly different view from his author. Michael Krüger, active in the German book trade for four decades and long one of its defining figures, cannot imagine how anyone could "voluntarily renounce the most beautiful invention in the history of the world": "The book is the only object of our civilization of which we can truly be proud. If it is now to be turned into a multifunctional storage device, that suits the course of a time that wants to make of our civilization an electronic hell. So one must resist. But since these things cannot be wished out of the world, and intellectual property evaporates in the electronic networks anyway, we will of course license our rights." How, and on what terms, that might happen, Krüger is as unclear about as his colleagues at Suhrkamp or the S. Fischer Verlag. Wherever one asks in the German publishing world these days, whether in fiction, nonfiction or children's books, the answer sounds much as it does at Carlsen, Joanne K. Rowling's German publisher: they have only just begun to look into the matter.

When the medium calls with force

Yet the Kindle has been on the market in the United States for ten months. Amazon is said to have sold about 300,000 of the devices so far and currently offers 166,000 electronic titles: new releases, bestsellers, classics, plus a number of magazines and daily newspapers. Prices range from 99 cents for a classic such as Emily Dickinson's poems to 9.99 dollars for a current bestseller. Orhan Pamuk believes authors will have to be paid better in the virtual world of the e-book than in the real one, where they typically receive between ten and fifteen percent of the sale price of their books. After all, Pamuk argues, e-books save publishers much of the cost of producing and distributing books. Michael Krüger naturally sees it otherwise: "We all make the greater part of our revenue on the hardcover. If revenue there is cut in half because the e-book costs only half as much as the printed book, then sooner or later every publishing house will shrink to half its present size."

But the e-book threatens far more than the present economic foundations of our book culture. It could change the act of reading itself. As if Amazon's engineers had Heidegger's dictum in the back of their minds, that technology itself prevents any experience of its essence, they practice determined mimesis of the book. The Kindle sits in a leather case as if between two covers, and its screen displays, page by page, nothing other than what the printed book does. Vittorio Klostermann, Heidegger's publisher, himself uses an electronic reading device when traveling: "As a working instrument for heavy readers, the e-book could gain greatly in importance, but it is also likely to accelerate the process of estrangement from the book. Already we can observe at every university that more and more students register only those texts that are available to them online at their electronic workstations." That Heidegger's works might appear as e-books, Klostermann does not rule out: "If the medium really calls with force, we cannot stand aside."

Our reading tools, too, work on our thoughts

Amazon reportedly plans soon to offer a larger Kindle for students and all readers of technical books, one better able to display tables, charts and illustrations of all kinds. The full-text search the Kindle allows makes it attractive, by itself, to anyone who wants to work with books. But reading is something other than searching, which always aims only at an excerpt, at the quotation as fragment, and which fragments the act of reading itself. What does it mean, then, when the linearity of reading, practiced for millennia, is suspended further and further? Must not concentrated reading, the immersion in a single work, at some point become an impossibility?

Many observers believe the e-book could gain ground fastest in nonfiction. John Brockman and his wife, Katinka Matson, are among the most influential players in American publishing. The literary agents, who specialize in scientific publications and popular nonfiction, predict a great future for the new technology. She herself still prefers to read a printed book, says Katinka Matson, but the Kindle is simply "much more practical, a really cool device: I can lie in bed and download any book from the Net." Brockman and Matson are convinced the Kindle will revolutionize our reading habits. "But books need not be written differently because of it, and authors should not have the Kindle in mind when conceiving their works."

If Nietzsche's observation holds that our "writing tools" work on our thoughts, then the media with which we read also shape the process of our reading. The sweeping changes that came with the invention of the typewriter arrived on the high-buttoned shoes of the secretaries, as the cultural historian Bruce Bliven put it. Amazon's digital reading device steals in silently over Whispernet: it is the book that came out of the ether.

Stephen King pricks up his ears

Over the Whispernet connection, any book offered in the Kindle Store can be downloaded within sixty seconds. So far, it is said, mostly light fiction has sold: thrillers and titles from the current bestseller lists. Stephen King, with a total print run of four hundred million books one of the most widely read authors in the world, has always been interested in what new media can offer him. Eight years ago he put his short story "Riding the Bullet" online for download in its entirety before it appeared as a book.

Now, for his fans, whom he likes to address as "Uncle Stevie," he has put Amazon's reading device to the test. His verdict: the Kindle serves readers well and will therefore catch on. But will it ever displace the printed book? King's answer: "No. The immutability of the printed book underscores the importance of the ideas and stories we find in it. It is the book that lends permanence and stability to a fleeting, fragile medium." Still, he has held all his life that the story being told matters more than the system that delivers it, the author included. It is telling that King, in this context, expressly praises the advantages of the audiobook over the printed book, as if to point out that the anthropological constant of people telling one another stories is older than writing and may well outlast it.

Decisive for every writer

Felicitas Hoppe has just cleared a new path for a very old story. Her latest book ventures to retell, for our time, an eight-hundred-year-old verse epic by Hartmann von Aue. "Iwein Löwenritter" has appeared as the opening title of a new children's series from the S. Fischer Verlag. The books are linen-bound, thread-sewn and richly illustrated. "We want to make bibliophile books for bibliophile young readers," the publisher says.

This bold leap into a hitherto uninhabited niche of the children's book market is meant to awaken young readers' feeling for the aesthetic beauty of the book. Even so, Felicitas Hoppe would have nothing against "Iwein Löwenritter" appearing as an e-book, so long as the printed edition were not displaced by it: "Whether a work is printed or not is decisive for every writer. For it is the book as object that, in our eyes, gives the text its value. I do not believe that will ever change."

Already there are attempts to break into the closed system

But does not much suggest, after all, that the e-book will impose a new three-class model on the endlessly differentiated book market? Perhaps within a few years there will be titles that appear only as e-books; others will be offered in both digital and printed form; and a third group could consist of especially beautifully made books that derive their air of exclusivity not least from the fact that they are not available on the Net. The e-book would then disenchant the book and, at the same time, contribute to its re-auratization.

No one can yet say how and to what extent the e-book will change our book-shaped world, or how far it may influence reading habits and the act of reading itself. How will children react to the new medium? Amazon has evidently already taken firm aim at this group: the improved standard version of the Kindle announced for next year is to have a color screen, raising the product's appeal for young readers. And how will the battle over the undisguised drive for monopoly by the world's largest online bookseller end?

While Sony's Portable Reader and the iLiad handle the common e-book formats such as PDF, the Kindle accepts only Amazon's own format, erecting a closed system that shuts out other suppliers and keeps the customer on a short leash. The customer's path into the seemingly boundless new world of books is to lead exclusively through the digital needle's eye of the Kindle Store. What is not offered there is out of reach for Kindle users and leads no digital life. Small wonder that instructions for cracking the Kindle's code are already being discussed on the relevant websites.

The wings of the butterfly

It is not only the economics of the book market that faces change. On the long road of the profanation of the written word, an unheard-of caesura looms. The gospel book of Augustine of Canterbury, a sixth-century manuscript from northern Italy, was kept for centuries not in the monastery library but, like a relic, on the altar of the church itself. The fifteenth century celebrated printing as divina ars, a divine art given to mankind so that the word of God might spread the faster. Yet as early as around 1550 the complaint arose that the mass production of printed books was damaging the arcanum that had surrounded the rare and precious manuscripts of the Middle Ages.

The Kindle's aim is not trivialization but the disappearance of the book as a sensuous object that smells, ages and can be held. The veneration of the book has its deepest and oldest roots in religion, and the processes of enlightenment and secularization have not destroyed that veneration but let it put down further roots. Whoever picks up Amazon's reading device today will, in the first hours and days of the experience, be impressed by its technical possibilities. Stepping back to the bookshelf afterward, he will find the aura of the book as delicate and vulnerable as the wing of a butterfly.

LAS VEGAS SUN [9.13.08]

The question is as maddening as it is quadrennial: How do the amoeba middle, the undecided, the independent, the low-information voters make up their minds?

What on earth goes through their brains?

We’re not talking about the partisans, the roughly 80 percent of voters who lean toward one party or the other and will generally go that way, assuming the candidate meets a reasonable threshold.

We’re talking about the other 20 percent. Some of these people like to guard their independence — studying issues, weighing candidates’ resumes and proposed solutions. That’s only a tiny percentage of the 20 percent, though. For the most part, this group doesn’t know much about public policy, or its knowledge is instinctual, with basic hard-wired ideas of justice.

This is not meant to be an insult. They’re busy. They hate politics. Can’t blame them.

“The decisive bloc of voters may indeed be people who don’t follow issues, so character may matter quite a bit,” said Michael McDonald, an expert on voter behavior at the Brookings Institution and George Mason University.

Within this context of “character,” how do voters make up their minds?

There are many theories, none conclusive.

“Why People Vote Republican,” a recent essay by University of Virginia psychologist Jonathan Haidt, offers some clues, tying “why” to the origins of morality.

He posits that liberal-leaning Americans tend to subscribe to social contract ethics: You and I agree we’re equals, and we won’t get in each other’s way. The basic values: fairness, reciprocity and helping those in need.

Social contract liberals tend not to care so much about victimless crime.

Traditional morality, however, arises out of a more ancient need to quell selfish desires and thus create strong groups, Haidt argues. Its adherents do this by demanding loyalty to the group, respecting authority and revering sanctified traditions, symbols, etc.

People often fall into one or the other category, and a candidate who speaks to these instincts will win over these voters.

Democrats who disrespect or fail to understand the second type of morality — the morality of sanctified symbols, authority and loyalty to group — do so at their peril, as when Sen. Barack Obama talked about bitter Americans clinging to God and guns.

Other theories:

University of California, Berkeley, linguist George Lakoff thinks conservatives are more aware of the importance of metaphor and language, and thus frame political debates to their advantage. So, for example, President Bush proposed “tax relief,” which made the current tax structure seem like an affliction. Who could oppose that? Examples are endless.

New York Times columnist David Brooks pointed this year to Princeton University psychologist Alexander Todorov, who claims he can predict the votes of 70 percent of test subjects by their facial reactions when seeing a candidate for the first time.

Other theorists think voters have an emotional response to candidates, and then create post-hoc rational reasons for supporting a candidate. They personally and emotionally like Arizona Sen. John McCain, and then come up with some reason: He’s against earmarks.

Brookings’ McDonald tends to work in the other direction. He thinks issues do matter, and quite a lot. In his view, people make a determination about whether they agree or disagree with the candidate on issues, and then fill in character blanks.

Such as: Candidate X wants to cut taxes, and I want a tax cut. Suddenly Candidate X seems more like someone I'd like to have a beer with, as the (in)famous saying goes.

Or, Candidate Y wants to raise taxes, and suddenly the voter starts to think the candidate is a wine-swiller who’s out of touch with his values.

In this determination, I fill in character traits based on whether I agree or disagree with the candidate on the issues.

But what if I have little or no information on the candidate and his issues?

Now we’re back to that 20 percent of voters, and again trying to figure out what moves them.

Or as McDonald put it: “The dirty little secret of American politics is that the least informed decide the winner.”

Sun reporter Joe Schoenmann contributed to this story.
