Edge in the News: 2010
Curator Hans Ulrich Obrist invited 50 artists on stage in a 20-hour event in London: a marathon on the maps of the 21st century. It was about more than just Google Maps.
[Translation:] With pith helmets and ice axes, these men went into battle against the unknown; the victors now hang in the hall of fame. David Livingstone, the missionary and explorer who was the first European to see Victoria Falls and who is celebrated as its "discoverer", looks down graciously from his lavish stucco frame at the guests of the Royal Geographical Society in London.
The marathon was to be taken literally: over 20 hours, some 50 artists, architects, philosophers, scientists and musicians took the stage. After about 15 minutes each was usually shooed off again, because the next speaker was already waiting. The British architect David Adjaye, for instance, presented his typology of African architecture, for which he has spent ten years photographing buildings in the continent's 53 states.
Skepticism about the truth of maps
The philosopher Rosi Braidotti, who teaches at the Human Rights Institute of the University of Utrecht, pointed out that we owe our skepticism about the truth of maps to postmodernism. And the American literary agent John Brockman, founder of the influential Internet platform "Edge", showed what the maps used in science look like. Many of them, as the developmental biologist Lewis Wolpert made clear, are really diagrams, but in London nobody let that spoil the fun.
Thus one saw an intricate network with countless points of intersection, with which the sociologist Nicholas A. Christakis and the political scientist James Fowler made visible the relationship between obesity and social contacts, and the gene pioneer J. Craig Venter presented what looked like an endless series of colorful building blocks: the map of the first synthetically assembled genome.
It was the fifth time that Hans Ulrich Obrist has chosen this strength-sapping marathon format, in which a theme is considered from as many points of view as possible. After manifestos and poetry, the themes of the past two years, Obrist has landed perfectly with this year's focus.
Few technologies seem to characterize the young century as much as the new cartography. That was already apparent at this year's DLD Conference in Munich, where the curator held a symposium on maps for a small group: whether designers, astronomers or Internet artists, all spoke of the great changes that digital cartography has set in motion.
In London, Obrist greatly expanded his guest list, and the marathon reflected a global development. Since 2005, when the U.S. internet company Google launched its map service, Google Maps has become almost omnipresent. From a once precious resource reserved for heroes and rulers, the medium has become readily available to the masses, used by experts and lay people alike.
In the run-up to the "Map Marathon", Obrist asked contemporary artists for their personal map of the 21st century. Because Obrist is simply the greatest networker in the art world, which is why he has been called the world's leading curator, a great deal of mail duly arrived at the Serpentine Gallery. ...
What's the connection between Kevin Kelly's habits on the internet, Louise Bourgeois's contented view of France, and Craig Venter's genome?
They can all be represented as maps. And this weekend, they all were -- along with hundreds of maps of experimental art, of the world's oldest-known words, and of the steampunk-and-superheroes content of BoingBoing.
Oh, and not forgetting dozens more maps celebrating this magazine's fascination with turning data into stunningly beautiful visualisations.
Let me explain. Over the weekend I took part in an epic project organised by our friends at the Serpentine Gallery in London -- the Map Marathon, a live two-day event at the Royal Geographical Society in Kensington. It was the fifth in the annual series of Serpentine Gallery Marathons, conceived by super-curator Hans Ulrich Obrist, the gallery's extraordinarily energetic co-director of exhibitions, with gallery director Julia Peyton-Jones.
The Swiss-born Obrist, featured in Wired in February, was once called by Art Review "the art world's most powerful figure". After seeing the impressive cast list for the two-day event, you'll understand why -- with contributions from the likes of Anish Kapoor, David Adjaye and Gilbert & George.
In previous years, the Serpentine's marathons have been curated around themes such as interviews and poetry. But this year's theme was just up Wired's street (well, perhaps a short walk away up the A315): maps, in all their forms and beauty, from literal representations of physical landscapes to abstract conceptualisations by scientists. The overall aim was "to challenge notions of art, culture, science, technology, and methods of public discourse and debate" -- and in that it more than succeeded.
From noon until 10pm on both Saturday and Sunday, there were non-stop live presentations by more than 50 artists, scientists, poets, writers, philosophers, musicians, architects and designers. There were also special collaborations with the Edge community and with the DLD conference community run by Steffie Czerny and Marcel Reichart, whose excellent events I've written about here before.
On Sunday lunchtime, I shared the stage with Hal Bertram of ITO, the smart visualisers who worked with us on our "Data into Information" feature in September's issue of Wired. Our conversation was titled: When Data Meets Maps: How Datavisualisation is Changing the World. Hal showed some of our favourite visualisations, including examples of how OpenStreetMap was used to save lives after the Haiti earthquake in January, plus examples of how open data streams can be used effectively to visualise traffic flow.
Also up on stage were Wired friends such as Eric Rodenbeck of Stamen, whose work we featured a few months ago; and Aaron Koblin, who presented on "Re-Embodied Data: Mapping the Unseeable". But data visualisers were just one thread running through this constantly surprising event. There was Marina Abramović presenting on Body Maps; the writer Russell Hoban; and Marcus du Sautoy talking about Mathematical Maps.
One of my favourite panels was run by John Brockman, the literary super-agent who runs the EDGE community of "some of the most interesting minds in the world". Together with Lewis Wolpert and Armand Leroi, he presented maps submitted by members of the EDGE community. So we got to see Kevin Kelly's internet, plus philosopher Eduardo Salcedo-Albarán's map of interconnections between Mexican drugs cartels.
That was followed by two strikingly contrasting but equally compelling sessions - C. E. B. Reas, the co-inventor (with Ben Fry) of the Processing software language, who explored the beautiful patterns it creates out of data; and architect David Adjaye, who showed some of the 35,000 photos he took in Africa.
Well done to the @WiredUK Twitter followers who got to go free of charge - the rest of you really will need to follow us on Twitter so you get early warning next time. And congratulations to all at the Serpentine for a rich and brain-expanding weekend.
"We are fixated on technology and technological success, and we have no sustained or systematic approach to field-based social understanding of our adversaries' motivation, intent, will, and the dreams that drive their strategic vision, however strange those dreams and vision may seem to us."—Anthropologist Scott Atran, who believes the quest to end violent political extremism needs more science. (edge.org)
When I claimed that the John Templeton Foundation was engaged in bribing journalists, I didn’t mean that they directly paid off those journalists for writing articles that blurred the lines between science and faith. It’s nothing so crass as that. What I meant was that Templeton creates a climate in which journalists who take a certain line in their writings can expect sizable monetary and career rewards:
As I said, The Templeton Foundation is smart—or rather wily. They realize that few people, especially underpaid journalists and overworked academics, are immune to the temptation of dosh, and once those people get hooked on the promise of money and prestige, they forever have a stall in the Templeton stable. And, in the hopes of future Templeton funding, perhaps they’ll continue to write pieces congenial to the Foundation’s mission.
It’s a subtle way of using writers to promulgate your own views, though of course none of those writers would ever admit that they had been bought off.
Rod Dreher is an example of how the Templeton system works. Dreher was a columnist at the Dallas Morning News, and author of Crunchy Cons (2006), a book about those conservatives who think as righties and live as lefties. Last year, Dreher won a Templeton-Cambridge Journalism Fellowship, one of Templeton's most important vehicles for conflating science and faith. Since he got his fellowship, Dreher has written not only for the Dallas paper, but also on beliefnet, a religion/spirituality website. His columns have pretty much been aligned with the Templeton Foundation's own views.
Last August, for example, either at or near the end of his Fellowship, Dreher wrote a piece for the Dallas Morning News describing his wonderful experience at Cambridge, decrying “atheist fundamentalism,” and asserting that the horrors of Nazi Germany were part of “atheism’s savage legacy.” He then touted a NOMA-like solution:
We ought to reject the shibboleth, advocated by both religious and secular fundamentalists, that religion and science are doomed to be antagonists. They are both legitimate ways of knowing within their limited spheres and should both complement and temper each other. The trouble comes when one tries to assert universal hegemony over the other. . .
Contrary to the biases of our time, the importance of science does not exceed that of art and religion. As the poet Wendell Berry writes, the sacredness of life “cannot be proved. It can only be told or shown.” Fortunate are those whose minds are free enough to recognize it.
This kind of stuff is like cream to the cats at Templeton. How they must have licked their whiskers when they read it!
In a beliefnet column posted last week, Dreher decried the coming "Age of Wonder" touted by physicist Freeman Dyson, in which science may play an increasingly important role in our lives:
This, in the end, is why science and religion have to engage each other seriously. Without each other, both live in darkness, and the destruction each is capable of is terrifying to contemplate — although I daresay you will not find a monk or a rabbi prescribing altering the genetic code of living organisms for the sake of mankind’s artistic amusement. What troubles me, and troubles me greatly, about the techno-utopians who hail a New Age of Wonder is their optimism uncut by any sense of reality, which is to say, of human history. In the end, what you think of the idea of a New Age of Wonder depends on what you think of human nature. I give better than even odds that this era of biology and computers identified by [Freeman] Dyson and celebrated by the Edge folks will in the end turn out to have been at least as much a Dark Age as an era of Enlightenment. I hope I’m wrong. I don’t think I will be wrong.
Over at Pharyngula, P. Z. Myers took apart Dreher’s arguments against biotechnology, giving a dozen examples of Dreher’s ignorance and misstatement. And although Dreher wrote
The truth of the matter is that I turned up in Cambridge knowing a lot about religion, but not much about science. What I saw and heard during those two-week seminars, and what I learned from my Templeton-subsidized research that summer (I designed my own reading program, which compared Taoist and Eastern Christian views of the body and healing) opened my mind to science. It turned out that I didn’t know what I didn’t know until I went on the fellowship.
it appears that he still doesn’t know what he doesn’t know.
On Sept. 26 of last year, five days before Templeton started accepting applications for their journalism fellowships, Dreher promoted the Templeton Journalism Fellowships on beliefnet, encouraging people to apply.
On November 30 of last year, Dreher announced that he was leaving the Dallas Morning News to become director of publications at the John Templeton Foundation. That’s where he is now. He’s still publishing on beliefnet, though, where, a week ago, he wrote a heated column attacking my contention that Templeton bribes journalists. It’s the usual stuff—outraged assertions that journalists could be bought, attacks on “atheist fundamentalists,” and what Dreher calls a “brave, contrarian position” that we should all be “nice” to each other. You can read it for yourself, and I urge you to do so.
The curious thing, though, is that while decrying the idea that Templeton “buys off” journalists, Dreher is himself a beneficiary of Templeton’s practice of rewarding those who, after entering the system, perform well. Dreher was a journalism fellow just last year. Other journalism fellows have been promoted to the advisory committee for the fellowships. And several members of the Templeton Foundation’s Board of Advisors have, after their service, gone on to win the million-pound Templeton Prize itself. The lesson, which seems transparently obvious, is that if you clamber aboard the Templeton gravy train and keep repeating that science and faith are complementary “ways of knowing,” good things will happen to you.
Oh, one last point. The Templeton website says this about Dreher’s credentials:
A seven-time Pulitzer Prize nominee, Rod has spent most of the past two decades as an opinion journalist, having worked as a film and television critic and news columnist at the New York Post and other newspapers. He has appeared on National Public Radio, ABC News, Fox News Channel, CNN, and MSNBC.
That seemed odd to me. Seven-time Pulitzer nominee? That’s big stuff! But a bit of sleuthing showed that it’s not what it seems. Nearly any journalist can be a Pulitzer “nominee” for journalism. All somebody has to do is fill out a form, submit a few of the “nominee’s” articles, and write a $50 check to Columbia University/Pulitzer Prizes. As the Pulitzer website says:
By February 1, the Administrator’s office in the Columbia School of Journalism has received more than 1,300 journalism entries. Those entries may be submitted by any individual based on material coming from a text-based United States newspaper or news site that publishes at least weekly during the calendar year and that adheres to the highest journalistic principles.
Editors do this all the time for their writers, but you don’t have to be an editor to nominate someone: anybody can do it.
And the thing is, the Pulitzer organization does not recognize the category of “nominee” for those who get nominated this way—it recognizes the category of “nominated finalist,” those three individuals whose submissions make the cut and get considered for the Pulitzer Prize itself. The Pulitzer organization, in fact, discourages the use of the term “nominee,” presumably because any newspaper or news site journalist who has a friend with fifty bucks can be a nominee. From their website:
22. What does it mean to be a Pulitzer Prize Winner or a Pulitzer Prize Nominated Finalist?
- A Pulitzer Prize Winner may be an individual, a group of individuals, or a newspaper’s staff.
- Nominated Finalists are selected by the Nominating Juries for each category as finalists in the competition. The Pulitzer Prize Board generally selects the Pulitzer Prize Winners from the three nominated finalists in each category. The names of nominated finalists have been announced only since 1980. Work that has been submitted for Prize consideration but not chosen as either a nominated finalist or a winner is termed an entry or submission. No information on entrants is provided.
Pulitzer also says this:
The three finalists in each category are the only entries in the competition that are recognized by the Pulitzer office as nominees.
I checked the Pulitzer list of nominated finalists, and I didn’t find Dreher’s name on it. I guess Templeton is calling Dreher a “nominee” against the recommendations of the Pulitzer organization. If I’m right here, Dreher and Templeton may want to correct his credentials.
So you’re an organization whose mission is to blur the lines between faith and science, and you have huge wads of cash to do this. What’s the best strategy?
Well, if you’re smart, you find a bunch of journalists who are not averse to being bribed to write articles consonant with your mission, give them a lot of money to attend “seminars” on reconciling faith and science (you also give a nice emolument to the speakers), enlist a spiffy British university to house these journalists, on whom you bestow the fancy title of “fellows,” cover all their expenses (including housing) to go to the UK for a couple of months, and even give them a “book allowance.” What could be more congenial to an overworked journalist than a chance to play British scholar, punting along the lovely Cam or enjoying a nice pint in a quant pub, all the while chewing over the wisdom of luminaries like John Polkinghorne and John Haught, and pondering the mysteries of a fine-tuned universe and the inevitable evolution of humans?
And the best part is this: forever after, those journalists are in your camp. Not only can you use their names in your advertising, but you’ve conditioned them, in Pavlovian fashion, to think that great rewards come to those who favor the accommodation of science and faith. They’ll do your job for you!
The John Templeton Foundation may be misguided, but it’s not stupid. The Templeton-Cambridge Journalism Fellowships in Science and Religion pay senior and mid-career journalists $15,000 (plus all the perks above) to come to Cambridge University for two months, listen to other people talk about science and religion, study a religion/science topic of their own devising, and then write a nifty paper that they can publish, so getting even more money. What a perk! Imagine sitting in a medieval library, pondering the Great Questions. And you get to be called a fellow! And write a term paper! Isn’t that better than cranking out hack pieces for people who’d rather be watching American Idol? Sure, you have to apply, and write an application essay stating how you intend to relate science and religion, but, hey, it’s only 1500 words, and once you’re in, you’re golden. You may even get to be on the advisory board, and have a chance to come back to the trough.
As I said, The Templeton Foundation is smart—or rather wily. They realize that few people, especially underpaid journalists and overworked academics, are immune to the temptation of dosh, and once those people get hooked on the promise of money and prestige, they forever have a stall in the Templeton stable. And, in the hopes of future Templeton funding, perhaps they’ll continue to write pieces congenial to the Foundation’s mission.
The Templeton Foundation is wily, but they're not exactly honest. Look at this:
After decades during which leading voices from science and religion viewed each other with suspicion and little sense of how the two areas might relate, recent years have brought an active pursuit of understanding how science may deepen theological awareness, for example, or how religious traditions might illuminate the scientific realm. Fellowship organizers note that rigorous journalistic examination of the region where science and theology overlap – as well as understanding the reasoning of many who assert the two disciplines are without common ground – can effectively promote a deeper understanding of the emerging dialogue.
Now if you’re interested in seeing how science and religion “illuminate” one another, what’s the first thing you think of? How about this: is there any empirical truth in the claims of faith? After all, if you’re trying to “reconcile” two areas of thought, and look at their interactions, surely you’d be interested if there’s any empirical truth in them. After all, why “reconcile” two areas if one of them might be only baseless superstition? Is the evidence for God as strong as it is for evolution? Does the “fine-tuning” of physical constants prove Jesus? Was the evolution of humans inevitable, thereby showing that we were part of God’s plan?
It’s not that there’s nothing to say about this. After all, one of the speakers in the Fellows’ symposia is Simon Conway Morris, who has written a popular-science book claiming that biology proves that the evolution of human-like creatures was inevitable. It’s just that the Templeton Foundation doesn’t want to promote, or have its Fellows write about, the other side, the Dark Side that feels that no reconciliation is possible between science and faith. John Horgan, who was once a Journalism Fellow, talks about his experience:
My ambivalence about the foundation came to a head during my fellowship in Cambridge last summer. The British biologist Richard Dawkins, whose participation in the meeting helped convince me and other fellows of its legitimacy, was the only speaker who denounced religious beliefs as incompatible with science, irrational, and harmful. The other speakers — three agnostics, one Jew, a deist, and 12 Christians (a Muslim philosopher canceled at the last minute) — offered a perspective clearly skewed in favor of religion and Christianity.
Some of the Christian speakers’ views struck me as inconsistent, to say the least. None of them supported intelligent design, the notion that life is in certain respects irreducibly complex and hence must have a divine origin, and several of them denounced it. Simon Conway Morris, a biologist at Cambridge and an adviser to the Templeton Foundation, ridiculed intelligent design as nonsense that no respectable biologist could accept. That stance echoes the view of the foundation, which over the last year has taken pains to distance itself from the American intelligent-design movement.
And yet Morris, a Catholic, revealed in response to questions that he believes Christ was a supernatural figure who performed miracles and was resurrected after his death. Other Templeton speakers also rejected intelligent design while espousing beliefs at least as lacking in scientific substance.
The Templeton prize-winners John Polkinghorne and John Barrow argued that the laws of physics seem fine-tuned to allow for the existence of human beings, which is the physics version of intelligent design. The physicist F. Russell Stannard, a member of the Templeton Foundation Board of Trustees, contended that prayers can heal the sick — not through the placebo effect, which is an established fact, but through the intercession of God. In fact the foundation has supported studies of the effectiveness of so-called intercessory prayer, which have been inconclusive.
One Templeton official made what I felt were inappropriate remarks about the foundation’s expectations of us fellows. She told us that the meeting cost more than $1-million, and in return the foundation wanted us to publish articles touching on science and religion. But when I told her one evening at dinner that — given all the problems caused by religion throughout human history — I didn’t want science and religion to be reconciled, and that I hoped humanity would eventually outgrow religion, she replied that she didn’t think someone with those opinions should have accepted a fellowship. So much for an open exchange of views.
So, the Foundation doesn’t really want the hard light of science cast upon faith. It wants its journalists (and nearly everyone it funds) to show how faith and science are compatible. Those who feel otherwise, like Victor Stenger, Richard Dawkins, Anthony Grayling, Steven Weinberg, well, those people don’t have a say. (In fact, the Foundation’s history of intellectual dishonesty has made many of them unwilling to be part of its endeavors.) If a miscreant sneaks in by accident, as did John Horgan, he’s told that he doesn’t belong. The Foundation may pay lip service to dissenters, as in this statement (my emphasis),
Fellowship organizers note that rigorous journalistic examination of the region where science and theology overlap – as well as understanding the reasoning of many who assert the two disciplines are without common ground – can effectively promote a deeper understanding of the emerging dialogue.
but you won’t see Templeton giving Journalism Fellowships to people who have a track record of such views. Instead, the Fellows spend their time pondering, “Now how on earth could those poor people think that science and faith are incompatible?”
These journalism fellowships are nothing more than a bribe—a bribe to get journalists to favor a certain point of view. The Foundation’s success at recruiting reputable candidates proves one thing: it doesn’t cost much to buy a journalist’s integrity. Fifteen thousand bucks, a “book allowance,” and a fancy title will do it.
Could this explain why those journalists who trumpet every other achievement on their websites keep quiet when they get a Templeton Fellowship?
"I am being devoured," complained FAZ publisher Frank Schirrmacher months ago in his book "Payback". His problem: the information explosion brought on by the Internet, Twitter and company. According to Schirrmacher, it is turning us into new people. It "changes our memory, our attention and our mental abilities; our brain is being physically altered, comparable only to the changes in muscle and body that people underwent in the age of the industrial revolution."
All of that, admittedly, could already be read a year earlier in the work of the American science writer Nicholas Carr. Carr caused a sensation in 2008 with his article "Is Google Making Us Stupid?". The new media undermine our capacity for concentration and contemplation, Carr claimed, invoking a neuroscientific phenomenon, "neuronal plasticity": the fact that synapses, nerve cells, indeed whole brain regions can be changed by human experience.
Now these theses are being debated again in the United States, because Carr has expanded them into a book, "The Shallows: What the Internet Is Doing to Our Brains". In it he cites, among others, the psychiatrist Gary Small, whose research suggests that using the new media "gradually strengthens new neural pathways in our brain and weakens old ones". The Internet, in other words, is effectively rewiring the brain.
So what? In cognitive neuroscience, "such talk only makes people roll their eyes," says the Canadian psychologist Steven Pinker, who teaches at Harvard. The brain, after all, rewires itself with every new experience or skill; "the information is not stored in the pancreas," he wrote in the New York Times ("Mind Over Mass Media"; a German version of the article appeared on Monday in the Süddeutsche Zeitung).
But experience, he argues, does not reorganize the brain's basic capacities for processing information: "Speed-reading programs have long claimed that they can do exactly that. But Woody Allen delivered the definitive verdict on them after reading 'War and Peace' in one sitting: 'It was about Russia.'" Genuine multitasking, too, has long since been exposed as a myth, "not only by laboratory studies but by the familiar sight of an SUV weaving between lanes while its driver conducts business on a cell phone."
This year, however, it was not a scientist who did the most to advance the debate about the cognitive effects of the Internet, but a literary agent. John Brockman, who represents authors such as Richard Dawkins and Jared Diamond, asked: "Is the Internet changing the way you think?" The more than 100 answers from well-known scientists, artists and thinkers at www.edge.org show one thing above all: there is no single answer.
According to media columnist Michael Wolff, the name Clay Shirky is "now uttered in technology circles with the kind of reverence with which left-wingers used to say, 'Herbert Marcuse'." Wolff is right. Shirky has emerged as a luminary of the new digital intelligentsia, a daringly eclectic thinker as comfortable discussing 15th-century publishing technology as he is making political sense of 21st-century social media.
In his 2008 book, "Here Comes Everybody," Shirky imagined a world without traditional economic or political organizations. Two years later and Shirky has a new book, "Cognitive Surplus," which imagines something even more daring -- a world without television. To celebrate the appearance of the revered futurist's latest volume, we're delighted to share a February discussion between Shirky, Barnes & Noble Review editor in chief James Mustich, and BNR contributor Andrew Keen. What follows is an edited transcript of their conversation about the future of the book, of the reader and the writer, and, most intriguingly, the future of intimacy.
Or, how the annual networking session of America's nerd elite became the world's most important and influential talking shop. MICHAEL WOLFF reports on the technology, entertainment and design conference that's the global power summit for the new super-wealthy, tech-savvy, hyper-connected intelligentsia
...But TED, which launched first in 1984 and then became an annual event from 1990, was always a little different. It was a pageant of nerdiness, in a sense combining the key forms of nerd social life: summer camp, talent show and adult education class. Physicists competed with juggling acts. Magicians with New Yorker writers. Quincy Jones followed Richard Dawkins (who gave one of his first talks about atheism at TED). Cellist Yo-Yo Ma shared a stage with superstring theorist Brian Greene.
Most elementally, it attracted the world's biggest nerds. Bill Gates, Steve Jobs, the Yahoo! boys, the Google boys and everybody else who ever made a billion dollars. They, in turn, attracted Hollywood royalty, who in turn attracted the media moguls. TED is where I first went drinking with Rupert Murdoch and first flirted with American television personality Martha Stewart.
If there was a theme at TED, then it was "insider-ism". Everybody present was somebody. And everybody knew everybody. (For several dotcom years, TED was the main driver of my social life.) The tech business was the Mafia and TED was the biggest Mafia wedding of the year.
A key feature and sought-after invitation at TED, hosted on the second night by the literary agent John Brockman, is the Billionaires' Dinner — row upon row of the world's most successful (and richest) human beings (Murdoch, in my first conversation with him at TED, was grouchy about some of the people who were implying they were billionaires who, according to him, were most definitely not!). ...
—Theoretical physicist Freeman Dyson on the Venter synthetic biology paper in Science, quoted in Edge.org.
—Daniel C. Dennett, Tufts University philosopher on the Venter synthetic biology paper in Science, quoted in Edge.org.
—In “The Tears of Strangers Are Only Water,” a Big Think blog post by David Berreby about research probing the physiology of empathy.
—David Willetts in his first press briefing as the UK’s new Conservative minister for universities and science.
—Birdlife International’s Leon Bennun on the recent extinction of Madagascar’s Alaotra grebe.
Max Brockman (ed.): "Die Zukunftsmacher. Die Nobelpreisträger von morgen verraten, worüber sie forschen", S. Fischer Verlag, Frankfurt am Main 2010, 270 pages
Eighteen younger scientists show which issues society will have to grapple with in the future. At the center stands the question of the nature of the human being.
"What's next?": Früher hätte man den Seufzer Zukunftsforschern überlassen - in diesem neuen Buch widmen sich 18 jüngere Wissenschaftler dieser Frage. Sie definieren damit, so der Herausgeber Max Brockman, "mit welchen Themen sich die Gesellschaft in Zukunft auseinandersetzen muss".
With their basic research, quite a few of them also take aim at a question long left unasked, one that until recently seemed rather dusty: the question of the nature of the human being. They want to help ensure "that we redefine who and what we are."
Seemingly harmless and academically dry research questions often turn out to be explosive. Take, for example, the question of how the various components of an everyday experience are processed in time. Acoustic, visual, tactile and other stimuli are each processed by different brain regions, and these do not operate simultaneously.
So how does our brain coordinate the different components so that the stimuli are perceived as a single event, interpreted and judged for their relevance; so that they are matched against other contents of memory and stored as patterns for future action?
Could it be that certain disorders, dyslexia for example, the impaired ability to read, stem not from defects in the language faculty but from disturbed temporal processing? Perhaps acoustic and visual representations are not being properly coordinated in time, suspects the neuroscientist David Eagleman.
Or another example: differences in language demonstrably shape the structures of our thinking, stresses the linguist Lera Boroditsky. Language is not merely an expression of content; it has the power to define. Analogously, cultural values and concepts each steer different patterns of evolution, shows the Oxford philosopher Nick Bostrom.
And anthropologists have long since shown that, conversely, different biological patterns, genetic ones for instance, in turn give rise to different cultural and social value preferences.
That Buddhism and Confucianism took hold in the East and Christianity in the West is no accident, claims the neuropsychologist Matthew Lieberman, but a kind of bio-cognitive consequence: evolutionarily grown, genetically conditioned, hormonally steered by the neurotransmitter serotonin.
In view of the growing possibilities for intervening in nature, a surprising number of researchers call for a conscious steering of evolution. Experiments on animals show that humans can bring about genetic changes within a few generations simply by altering a living environment, even without directly manipulating the genetic material, reports the biologist Brian Hare. Desirable human types have long been bred in any case: education is nothing other than an attempt at just such evolutionary steering.
Reviewed by Eike Gebhardt
Max Brockman (ed.): Die Zukunftsmacher. Die Nobelpreisträger von morgen verraten, worüber sie forschen
Translated from the American by Sebastian Vogel
S. Fischer Verlag, Frankfurt am Main 2010
270 pages, 19.95 euros
I was digging through some files the other day and found this document from 1997. It gathers a set of quotes from issues of Wired magazine in its first five years. I don't recall why I created this (or even if I did compile all of them), but I suspect it was for our fifth anniversary issue. I don't think we ever ran any of it. Reading it now it is clear that all predictions of the future are really just predictions of the present.
Here it is in full:
We as a culture are deeply, hopelessly, insanely in love with gadgetry. And you can't fight love and win.
Jaron Lanier, Wired 1.02, May/June 1993, p. 80
No class in history has ever risen as fast as the blue-collar worker and no class has ever fallen as fast.
Peter Drucker, Wired 1.03, Jul/Aug 1993, p. 80
In the world of immersion, authorship is no longer the transmission of experience, but rather the construction of utterly personal experiences.
Brenda Laurel, Wired 1.06, Dec 1993, p. 107
I expect that within the next five years more than one in ten people will wear head-mounted computer displays while traveling in buses, trains, and planes.
Nicholas Negroponte, Wired 1.06, Dec 1993, p. 136
Pretty soon you'll have no more idea of what computer you're using than you have an idea of where your electricity is generated.
Danny Hillis, Wired 2.01, Jan 1994, p. 103
If we're ever going to make a thinking machine, we're going to have to face the problem of being able to build things that are more complex than we can understand.
Danny Hillis, Wired 2.01, Jan 1994, p. 104
Computers are the metaphor of our time.
Jim Metzner, Wired 2.02, Feb 1994, p. 66
Yesterday, we changed the channel; today we hit the remote; tomorrow, we'll reprogram our agents/filters. Advertising will not go away; it will be rejuvenated.
Michael Schrage, Wired 2.02, Feb 1994, p. 73
The scarce resource will not be stuff, but point of view.
Paul Saffo, Wired 2.03, Mar 1994, p. 73
The idea of Apple making a $200 anything was ridiculous to me. Apple couldn't make a $200 blank disk.
Bill Atkinson, Wired 2.04, Apr 1994, p. 104
Roadkill on the information highway will be the billions who will forget there are offramps to destinations other than Hollywood, Las Vegas, the local bingo parlor, or shiny beads from a shopping network.
Alan Kay, Wired 2.05, May 1994, p. 77
The future is bullshit.
Jay Chiat, Wired 2.07, Jul 1994, p. 84
Money is just a type of information, a pattern that, once digitized, becomes subject to persistent programmatic hacking by the mathematically skilled.
Kevin Kelly, Wired 2.07, Jul 1994, p. 93
In a world where information plus technology equals power, those who control the editing rooms run the show.
Hugh Gallagher, Wired 2.08, Aug 1994, p. 86
Some functions require domesticated robots -- wild robots that have been bribed, tricked, or evolved into household roles. But the wild robot has to come first.
Mark Tilden, Wired 2.09, Sep 1994, p. 107
Immortality is mathematical, not mystical.
Mike Perry, Wired 2.10, Oct 1994, p. 105
As the world becomes more universal, it also becomes more tribal. Holding on to what distinguishes you from others becomes very important.
John Naisbitt, Wired 2.10, Oct 1994, p. 115
Marc Andreessen will tell you with a straight face that he expects Mosaic Communications's Mosaic to become the world's standard interface to electronic information.
Gary Wolf, Wired 2.10, Oct 1994, p. 116
Life is not going to be easy in the 21st century for people who insist on black-and-white descriptions of reality.
Joel Garreau, Wired 2.11, Nov 1994, p. 158
Take Bugs Bunny and Elmer Fudd. In mere seconds, you get an entire war -- the strategy, the attack, the retreat, the recapitulation. The whole military-industrial complex is reduced to a bunny and a stuttering guy zipping across the landscape.
Brian Boigon, Wired 2.12, Dec 1994, p. 94
The very distinction between original and copy becomes meaningless in a digital world -- there the work exists only as a copy.
Daniel Pierehbech, Wired 2.12, Dec 1994, p. 158
It's hard to predict this stuff. Say you'd been around in 1980, trying to predict the PC revolution. You never would've come and seen me.
Bill Gates, Wired 2.12, Dec 1994, p. 166
For a long time now, America has seemed like a country where most people watch television most of the time. But only recently are we beginning to notice that it is also a country where television watches us.
Phil Petton, Wired 3.01, Jan 1995, p. 126
What gives humans access to the symbolic domain of value and meaning is the fact that we die.
Regis Debray, Wired 3.01, Jan 1995, p. 162
The scary thing isn't that computers will match our intelligence by 2008; the scary thing is that this exponential curve keeps on going, and going, and going.
Greg Blonder, Wired 3.03, Mar 1995, p. 107
The future won't be 500 channels -- it will be one channel, your channel.
Scott Sassa, Wired 3.03, Mar 1995, p. 113
In the future, you won't buy artists' works; you'll buy software that makes original pieces of "their" works, or that recreates their way of looking at things.
Brian Eno, Wired 3.05, May 1995, p. 150
It's important to regard technology in the long sweep of history as being one with history.
Vernor Vinge, Wired 3.06, Jun 1995, p. 161
Sufficiently radical optimism -- optimism that more and more seems to be technically feasible -- raises the most fundamental questions about consciousness, identity, and desire.
Vernor Vinge, Wired 3.06, Jun 1995, p. 161
I believe human nature is vastly more conservative than human technologies.
Newt Gingrich, Wired 3.08, Aug 1995, p. 109
We're using tools with unprecedented power, and in the process, we're becoming those tools.
John Brockman, Wired 3.08, Aug 1995, p. 119
If the Boeing 747 obeyed Moore's Law, it would travel a million miles an hour, it would be shrunken down in size, and a trip to New York would cost about five dollars.
Nathan Myhrvold, Wired 3.09, Sep 1995, p. 154
Isn't it odd how parents grieve if their child spends six hours a day on the Net but delight if those same hours are spent reading books?
Nicholas Negroponte, Wired 3.09, Sep 1995, p. 206
The human spirit is infinitely more complex than anything that we're going to be able to create in the short run. And if we somehow did create it in the short run, it would mean that we aren't so complex after all, and that we've all been tricking ourselves.
Douglas Hofstadter, Wired 3.11, Nov 1995, p. 114
What the Net is, more than anything else at this point, is a platform for entrepreneurial activities -- a free-market economy in its truest sense.
Marc Andreessen, Wired 3.12, Dec 1995, p. 236
3-D isn't an interface paradigm. 3-D isn't a world model. 3-D isn't the missing ingredient. 3-D is an attribute, like the color blue.
F. Randall Farmer, Wired 4.01, Jan 1996, p. 117
Without a deep understanding of the many selves that we express in the virtual, we cannot use our experiences there to enrich the real.
Sherry Turkle, Wired 4.01, Jan 1996, p. 199
The annoyance caused by spammers grows as the square of the size of the Net.
Ray Jones, Wired 4.02, Feb 1996, p. 96
We're born, we live for a brief instant, and we die. It's been happening for a long time. Technology is not changing it much -- if at all.
Steve Jobs, Wired 4.02, Feb 1996, p. 106-107
Just as there is religious fundamentalism, there is a technical fundamentalism.
Paul Virilio, Wired 4.05, May 1996, p. 121
When I want to do something mindless to relax, I reinstall Windows 95.
Jean-Louis Gassee, Wired 4.05, May 1996, p. 190
It is doubtful that the [computer industry] as a whole has yet broken even.
Peter Drucker, Wired 4.08, Aug 1996, p. 116
The most successful innovators are the creative imitators, the Number Two.
Peter Drucker, Wired 4.08, Aug 1996, p. 118
We have a predisposition in Western culture for "just do it," whereas, I think that part of the future will be built much more around "just be it."
Watts Wacker, Wired 4.09, Sep 1996, p. 168
Revolutions aren't made by gadgets and technology. They're made by a shift in power, which is taking place all over the world.
Walter Wriston, Wired 4.10, Oct 1996, p. 205
Wires warp cyberspace. The two points at opposite ends of a wire are, for informational purposes, the same point, even if they are on opposite sides of the planet.
Neal Stephenson, Wired 4.12, Dec 1996, p. 98
The Web is alive. Not as a sentient being or mega-meta-super-collective consciousness, but more like a gigantic, sprouting slime mold.
Steven Alan Edwards, Wired 5.04, Apr 1997
Of all the prospects raised by the evolution of digital culture, the most tantalizing is the possibility that technology could fuse with politics to create a more civil society.
Jon Katz, Wired 5.04, Apr 1997
Technology is not the nameless Other. To embrace technology is to embrace, and face, ourselves.
David Cronenberg, Wired 5.05, May 1997, p. 185
Community precedes commerce.
John Hagel, Wired 5.08, Aug 1997, p. 84
Modern technology is a major evolutionary transition. It would be astonishing if that occurred without disrupting existing life.
Gregory Stock, Wired 5.09, Sep 1997, p. 128
Pollution is a measure of inefficiency, and inefficiency is lost profit.
Joe Maceda, Wired 5.10, Oct 1997, p. 138
For email, the old postcard rule applies. Nobody else is supposed to read your postcards, but you'd be a fool if you wrote anything private on one.
Miss Manners, Wired 5.11, Nov 1997
The American government can stop me from going to the US, but they can't stop my virus.
Dark Avenger, Wired 5.11, Nov 1997 (from a side-bar item on p.270 which does not appear in the Wired digital archives, excerpting from an interview by Sarah Gordon)
It is the arrogance of every age to believe that yesterday was calm.
Tom Peters, Wired 5.12, Dec 1997
Four science books explain quite simply how life works
Children are born scientists. They are curious. They ask questions. They want to know how things work, and why things are the way they are and not otherwise. They would rather go to the zoo than to an art exhibition. But at some point, usually in puberty, most of that gets lost. Anyone still interested in the natural sciences becomes a pimply nerd; the others become cool teenagers. In the clash of cultures, the humanities always have the upper hand when it comes to gaining distinction. You win no prizes for knowing how the Internet works, but you can earn a lot of money with a book describing its harmful effects on your ability to concentrate. Skepticism about science is hip; scientific knowledge only gets in the way. Where have all the curious children gone?
Presumably they were first scared off by school, by teaching that does everything to punish curiosity: instead of answering the basic question of what holds the world together and how everything is connected, it divides knowledge into sealed-off subjects. Be that as it may: if, looking back on your school lessons, you remember at best the smell of hydrogen sulphide and a chalkboard full of formulas, Natalie Angier's "Naturwissenschaft" is the right book for you.
The English original is called, challengingly, "The Canon", and promises a tour through the "beautiful basics of science". An imperative has crept into the German subtitle: "Was man wissen muss, um die Welt zu verstehen" (what you need to know to understand the world). Don't let that put you off. Angier is a science journalist at the New York Times and has mastered the Anglo-Saxon art of explaining complicated things simply. The book is guaranteed formula-free, for in Angier's view mathematics is "a language, not the language, and its symbols can be explained in other idioms, including that beautiful language called plain speech".
Unlike school teaching, which often begins with the supposedly concrete and then quickly loses itself in the abstract, Angier works her way from the supposedly abstract to the concrete. She opens with a brilliant essay on scientific thinking, then explains the concepts of probability and orders of magnitude, and only then begins her tour of the sciences with atomic physics. On this foundation, the subsequent chapters on chemistry, evolutionary biology, molecular biology, geology and astronomy really are easier to grasp. It would be worth at least a try to restructure school science teaching along the same lines. At the very least, Angier's book should be declared required reading for teachers, and not only for those who teach science.
In Germany, Stefan Klein may be the only science writer who can hold his own against the great Anglo-Saxon models. The title of his new book, "Wir alle sind Sternenstaub" (We Are All Stardust), is presumably a tribute to his great predecessor, the science popularizer Hoimar von Ditfurth, whose first major work was called "Kinder des Weltalls" (Children of the Universe). And indeed every atom of our bodies was at some point formed in the center of a star, as the British Astronomer Royal Martin Rees explains in conversation with Klein, before adding: "If you are less romantically inclined, you can also describe human beings as stellar nuclear waste."
On assignment for the Zeit-Magazin, Klein travelled the world to talk with some of its leading scientists, and not only with natural scientists such as Martin Rees or the Romanian-German neurobiologist Hannah Monyer, but also with the Swiss economist Ernst Fehr and the speculative civilization theorist Jared Diamond in Los Angeles. He spoke with them not only about the results of their research but, quite literally, about God and the world. Asked whether he believes what is preached in church, Rees, a professed Anglican and churchgoer, answers: "No. I know that we don't even understand the hydrogen atom; how could I believe in dogmas? I am a practicing but not a believing Christian." Klein's book contains many such illuminating moments; it is pure reading pleasure and, like Angier's "Canon", a seduction into scientific thinking.
If there is one man who has done more than anyone else to popularize the natural sciences despite our built-in feuilletonistic preferences, it is the literary agent John Brockman, whose stable of authors includes such stars as the aforementioned Jared Diamond and the enfant terrible of evolutionary biology and critic of religion, Richard Dawkins. Every year Brockman poses a question in his online magazine "The Edge", which is answered by his dauntingly far-flung network of corresponding scientists. In 2005, for example, the question was: "What do you believe is true even though you cannot prove it?" In 2006 it was: "What is your dangerous idea?" The Fischer Taschenbuch Verlag has commendably undertaken to translate the answers, published in English on the Internet, into German year after year. Perfect reading for a short flight: after an hour you feel most agreeably stimulated and smarter than your fellow passengers.
Brockman's son Max has followed in his father's footsteps (and into his firm) and, in "Die Zukunftsmacher", has gathered essays by 18 young scientists on their fields of research, covering among other things the multiverse, dark energy, mirror neurons and the evolution of morality, the imagination, the spread of good ideas, and the relation of scientific thinking to reality, which brings us back to Angier's starting point. So you don't have to be a pimply nerd to get excited about science. Reading a few of these books is enough to become a child again.
Natalie Angier: Naturwissenschaft. C. Bertelsmann, Munich. 382 pages, 22.95 euros.
Stefan Klein: Wir alle sind Sternenstaub. S. Fischer, Frankfurt/M. 269 pages, 8.95 euros.
John Brockman (ed.): Das Wissen von morgen. S. Fischer. 287 pages, 9.95 euros.
Max Brockman (ed.): Die Zukunftsmacher. S. Fischer, Frankfurt/M. 270 pages, 19.95 euros.
From books to boardroom
Q: We all need advice as we seek success in our careers and lives. What are your five favorite business books, and why? What advice wasn't so helpful?
I believe there are three "must reads" for business.
The first is "Crucial Conversations: Tools for Talking When Stakes are High" (Patterson et al, 2002). Based on the psychology of dialogue, this book provides concrete tools and examples for handling difficult conversations in business.
Most business books provide, at best, cursory coverage of effective business conversations. In fact, the business world assumes that interpersonal skills and dialogue techniques are not important enough to teach in an MBA program. And yet, when asked to share their most difficult business situations, my advanced business majors offer issues around conflict, negotiation, and verbal problem solving -- not finance, economics, or marketing. Patterson's book outlines a model that is invaluable to anyone who wants to go beyond the director level.
The second required book is "Primal Leadership" (Goleman, 2002). This book builds off of Goleman's initial two texts, Emotional Intelligence (1997) and Working with Emotional Intelligence (1998). Based on extensive research over ten years (tracking hundreds of leaders), Goleman proves that a distinctive set of core "soft skills" actually leads to bottom-line success.
Goleman's early work caused excitement in the world of organizational psychology. Yet his latest work -- demonstrating a significant statistical difference in financial success -- rocked the business world. For some executives, it was the first time they seriously began to consider that there might be something to this "soft stuff" such as empathy, emotional self-awareness, reality testing, and adaptability.
Last is a quasi-business book entitled "This Will Change Everything" (Brockman, 2010). This book compiles the thoughts of great thinkers of our time from every walk of life, including business, art, neuroscience, physics, chemistry, education, computers, etc. Every business person should read this book in order to maintain the big perspective and to hone one's thinking in strategic and synergistic ways. The best business people are those who can balance several yet seemingly contrasting concepts at once and, like a silver bullet, make the best decisions for overall effectiveness.
Stay away from quick-fix books. They are fun to read on airplanes or when you need to fall asleep. Yet in the complex business world, it takes energy and thought to continually develop and perfect the art of leadership and business success. Read books that challenge and force you to think beyond your daily grind. Or pick up the paper and read Dilbert. Laughing is always good.
Liz Else, associate editor and Shaun Gamble, contributor
"If you're confused by climate change, baffled by biodiversity and puzzled by particle physics, join us at Speakers' Corner to cut out the middle man and get the truth behind the headlines."
That was the invitation and challenge from the Zoological Society of London, the folks that run London Zoo. Just show up at the few square metres in London's Hyde Park that have become synonymous with freedom of expression, and look out for a bunch of scientists on soapboxes.
Fifteen scientists and science popularisers turned up on Monday to help invent a new form of science communication. This was the kind of public exposure that would make even an experienced stand-up comedian anxious, so wisely they all came armed with props, from a giant plastic ladybird to a blow-up globe.
The speakers' remit was to talk about the science the public care about most - or perhaps, more honestly, ought to care about most. So the kick-off session was Earth Evolution with talks including "Life on Mars from life on Earth", "Where do species come from anyway?" and "Pheromones: Smells at the heart of life".
Up next was Earth Challenges: "Bees in crisis: Well known fact or widely held belief?", "Why deforestation in the tropics should worry us" and "Global warming and a cold winter".
Last, and cut a bit short because of organisational glitches, came Earth Solutions: "Why we need science like we never needed it before", "Lessons from Ban the Bulb" and "Conservationists must learn Chinese".
Five at a time, the soapbox scientists were left to their own devices, giving mini-lectures, or asking questions to drag in the punters in true Speakers' Corner fashion.
Warmed-up by members of Team ZSL clutching questions sent via Twitter, the public began to quiz the speakers. Is global warming real? If it is, what can we do about it? Will humans evolve? Are polar bears becoming cannibals?
The speakers had questions of their own. Why fight to preserve the British green belt, but not the foreign rainforests? Is development worth the price of diversity?
This kind of getting down and dirty with the public is a rare thing in the UK, where we tend to prefer our science on TV shows with David Attenborough, Brian Cox or Kate Humble, in public lectures featuring Richard Dawkins or in occasional forays to the more populist Cafés Scientifique.
But apart from the odd crying child and a completely confused elderly Japanese couple, everyone seemed to find scientists on soapboxes a most agreeable way of whiling away a few summer hours.
Is this something that ought to happen more often? Chatting afterwards, one of the speakers, Exeter University professor Stephan Harrison, said he had come round to the view that engaging with the public was not just an important thing to do but a scientist's obligation.
New Scientist's own senior consultant Alun Anderson - whose "Vanishing Arctic" talk was guaranteed to appeal to a public in love with polar bears - agreed, adding that this kind of one-on-one connection could be positively "life-changing".
The problem remains convincing the public that there are pressing issues that deserve their attention, without allowing an unbalanced presentation of the facts. Polar bears may be resorting to cannibalism, but will people remember why?
Some topics are clearly more appealing to the public than others. It speaks volumes that Jonathan Baillie, director of conservation programmes at ZSL, literally had to shout to draw an audience to hear about endangered creatures that aren't as cute and cuddly as polar bears or pandas.
The truth is that science is seldom a majority sport, especially in times of economic stress, yet in an increasingly technological age when new jobs will depend on it, the need for science popularisation is arguably greater than ever.
Robin Dunbar, head of the Institute of Social and Cultural Anthropology at the University of Oxford and a professor of evolutionary anthropology, is fired up by the challenge.
He was burning with factoids guaranteed to disturb even the most bullish. Did we know, for example, that the number of UK-based applicants applying to study philosophy and English at university was holding steady but that the number applying to study chemistry and biology was showing such a linear decline year on year that by 2030 these departments would have no students at all?
That's all the more reason to hold more of these events, as some of the speakers are apparently now considering, according to the event's organiser Seirian Sumner, whose team also included Charlotte Walters and Kate Jones. Sumner is a featured essayist on how social insects got to be social in Max Brockman's book What's Next?, a who's who of science's next generation.
Given the modest £6000 funding (from Research Councils UK) it took to organise, such a simple event could hardly represent better value for money to a severely cash-strapped government. Assuming, of course, that everyone is really serious about the horribly tough job of communicating science without patronising the general public: not the niche public that picks up science magazines on newsstands, but the real masses who walk through places like Hyde Park, their minds reeling with daily concerns and science generally nowhere among them.
(Images: Aidan Weatherill)
The Economist has recently featured an interesting article on the behavioral effects that the parasitic protozoan Toxoplasma gondii has on its mammalian hosts. Many of these effects have been recognized for years, and some of us here at Medgadget have been privy to Toxoplasma news, thanks to a friend at Stanford who works with Dr. Robert Sapolsky, a leading researcher in the field. First of all, there is strong evidence that urine from cats is sexually attractive to rats infected with Toxoplasma. Then there seems to be a connection between Toxoplasma gondii and schizophrenia, a lack of interest in the novelties of life, and a noted correlation with people getting into more car accidents. It seems that the nature of this parasite's life cycle has created a strange symbiotic, psychological relationship between it and its typical feline and rodent hosts. The Economist provides a handy overview of the latest knowledge on this topic.
If an alien bug invaded the brains of half the population, hijacked their neurochemistry, altered the way they acted and drove some of them crazy, then you might expect a few excitable headlines to appear in the press. Yet something disturbingly like this may actually be happening without the world noticing....
One reason to suspect [that some people have their behaviour permanently changed] is that a country's level of Toxoplasma infection seems to be related to the level of neuroticism displayed by its population. Another is that those infected seem to have poor reaction times and are more likely to be involved in road accidents. A third is that they have short attention spans and little interest in seeking out novelty. A fourth, possibly the most worrying, is that those who suffer from schizophrenia are more likely than those who do not to have been exposed to Toxoplasma.
Nor is any of this truly surprising. For, besides humans, Toxoplasma has two normal hosts: rodents and cats. And what it does to rodents is very odd indeed.
An essay on how language influences thought from the pop-science anthology "What's Next: Dispatches on the Future of Science" has been posted on The Edge. Author Lera Boroditsky, an assistant professor of psychology, neuroscience and symbolic systems at Stanford, makes the case that the languages we speak shape the way we think.
She brings up experiments and other examples involving use of language and direction, time, color and gender, all of which seem to demonstrate that yes, language shapes how we think.
But my favorite is this example above. Only a linguist -- or perhaps a social scientist -- would put Chomsky in a hypothetical.
-- Carolyn Kellogg
When we miss someone, we console ourselves with things in order to cope with the longing. This has nothing to do with fetishism.
Interestingly, Linus van Pelt appears to be the least neurotic character among the Peanuts. Interesting, because he is seen without his security blanket in only a few, extremely cruel scenes of the comics. The bond between Linus and his blanket is very close. In one episode, when his sister Lucy stages a ceremonial cremation of the blanket, the helpless Linus suffers torments identical to the pain of missing a beloved person.
Among adults, Linus would be called a fetishist; his love for an object would be regarded as misdirected. It is, admittedly, unclear how old Linus, Lucy or Charlie Brown are at the time the strips take place - but the fact that they may be regarded as children secures them the bonus of innocence. And a comfort blanket seems harmless, not like a fetish, which is subliminally always linked with the aura of the perverse. Charles M. Schulz, who dreamed up the ageless Peanuts, also coined the lovely term "security blanket", by which such a blanket has been known in the English-speaking world ever since.
But what security does his indispensable blanket offer Linus?
Since it may be assumed that Linus is a child, he is in a transitional phase - not yet detached from the mother, in whose body, after all, he lived and slept for more than nine months. The frosty notion of psychologically cutting the cord comes into play here. From a certain age a mama's boy is supposed to become simply a child. It is, in a sense, a second birth, though this one, too, is not experienced by every child as painless. The idea of a psychological severing of the umbilical cord sounds irritatingly easy to carry out (one clean cut) - but since there has long ceased to be any material connection between mother and child, it is in fact a sometimes protracted process that can even turn traumatic.
"Let the mother smear her breast with bitter salve" runs one of the child-rearing tips of the Old Testament: this, the text claims, is how a child's psychological weaning from its mother is to be accomplished. Of the phantom pains a child may suffer there is, of course, no mention. Psychology is an achievement of the 19th century - of the moment when faith no longer helped.
In the vocabulary of that young science, Linus van Pelt's "security blanket" is called a "transitional object". The term describes the function of, say, a comfort blanket as a substitute for the longed-for affection of the mother in that queasy limbo between the sweet flow of milk and the bitterness of being alone (mutterseelenallein - utterly, motherlessly alone). Although it is only an object, the growing child breathes into the blanket, drinks in the scent of its fabric, nestles into its folds. The pancake-like quality of the blanket recalls a dazzling coinage of the French psychoanalyst Jacques Lacan: for the figure of desire he invented the term "hommelette" - evidently a cross between a human being and a pancake, which, by means of pliancy and flatness, could slip under the door into the bedroom and spread itself over the nostrils of the person dreaming there. Seen in this light, the child snuggling its blanket is meeting its desire for the mother from whom it is supposed to be psychologically weaned.
When it is not the mother but a woman or a man, when the person is no longer a child but an adult, love for a transitional object loses its innocent character. If, at the first stirrings of inner unrest, someone needs to pull on a particular sweater (because that garment recalls an episode with the missed person), to use a particular cream (because it is the same cosmetic the missed person uses), or to order a particular dish at once (because it is the missed person's favourite meal), these are all acts identical to reaching for the security blanket. And yet - the hell of adult existence - that, "the real stuff" so to speak, is out of the question. Regression, unfortunately, is equated with mental disorder. Besides, in the desirable case it is no longer one's own mother whom one painfully misses, but a - at least potential - sexual partner.
What the textbook fetishist, who must think of shoes, say, in order to reach ecstasy, shares with the user of a transitional object is the secrecy in which he wants to carry out his act.
Viewed from outside, admittedly, none of this is really visible - yet eating a Sauerbraten with noodles and dumplings can be a moment of the greatest possible intimacy if that dish is the missed person's aforementioned favourite meal; the Sauerbraten then takes on the function of a transitional object to which the eater devotes himself lovingly, so to speak - and not only in the act of eating, but already in seeking out the restaurant, in ordering the only thing that could possibly be ordered, et cetera. Attending to all these rituals around the transitional object is experienced as attending to the absent person, whose closeness would in fact be of the greatest importance, for whatever reason it happens to be impossible just then.
This feat of the brain is the supreme discipline of the imagination. Writing novels, the silly dreaming-up of however opulently furnished phantasmagorias à la "The Lord of the Rings" or "Alice in Wonderland", is respectable, but nothing compared with a child's self-hypnosis by means of a square piece of diaper cloth. Or with the soothing of tormenting longing through devotion to a Sauerbraten with dumplings and noodles. Or with spraying oneself with the other person's perfume; with merely looking at the advertisement for that perfume; with listening to a song ("our song").
This year, the annual survey of the science platform edge.org asked for answers to the question "What will change everything?"
The 87-year-old Princeton physicist Freeman Dyson regretted that he would not live to see it himself, but he believes in a revolution through radiotelepathy. With this technology, the brain would be sheathed in microwave sensors that register every neural activity and can transmit it to another person. It would then be possible to experience what and how another person thinks, what and how another person feels. One would be intimately connected to another human being in a way that is scarcely imaginable today. One would, at the very least - such is Freeman Dyson's hope - actually understand him.
Before the first tests can take place, according to Dyson, two marginal technologies still have to be invented. He assumes it will be another 80 years before the first radiotelepathy test subjects are left wide-eyed.
But as far as this cerebral feat of transmission is concerned, users of transitional objects are already pretty close.