Has a remote Amazonian tribe upended our understanding of language?
Dan Everett believes that Pirahã undermines Noam Chomsky’s idea of a universal grammar.
[ED. NOTE: Thanks to the New Yorker for making available the link to John Colapinto's article.]
Great reading in George Dyson's essay "Turing's Cathedral," found at edge.org. It connects the impulses of the original computer pioneers to the age of Google.
Life is full of surprises, but it's rare to reach for a carafe of wine and find your hand clutching a bottle of milk -- and even rarer, you'd think, to react by deciding the milk was actually what you wanted all along.
Yet something like that happened when scientists in Sweden asked people to choose which of two women's photos they found more attractive. After the subject made his choice (a photo we'll call Beth), the experimenter turned the chosen photo face down. Sliding it across the table, he asked the subject the reasons he chose the photo he did. But the experimenter was a sleight-of-hand artist. A copy of the unchosen photo, "Grizelda," was tucked behind Beth's, so what he actually slid across was the duplicate of Grizelda, palming Beth.
Few subjects batted an eye. Looking at the unchosen Grizelda, they smoothly explained why they had chosen her ("She was smiling," "she looks hot"), even though they hadn't.
In 1966, Time magazine asked, "Is God Dead?" Even then, the answer was no, and with the rise of religion in the public square, the question now seems ludicrous. In one of those strange-bedfellows things, it is science that is shedding light on why belief in God will never die, at least until humans evolve very different brains, brains that don't (as they did with Beth and Grizelda) interpret unexpected and even unwanted outcomes as being for the best.
"Belief in God," says Daniel Gilbert, professor of psychology at Harvard University, "is compelled by the way our brains work."
As shown in the Grizelda-and-Beth study, conducted by scientists at Lund University and published this month in Science, brains have a remarkable talent for reframing suboptimal outcomes, seeing setbacks in the best possible light. You can see it when high-school seniors decide that colleges that rejected them really weren't much good, come to think of it.
You can see it, too, in experiments where Prof. Gilbert and colleagues told female volunteers they would be working on a task that required them to have a likeable, trustworthy partner. They would get a partner randomly, by blindly choosing one of four folders, each containing a biography of a potential teammate. Unknown to the volunteers, each folder contained the same bio, describing an unlikable, untrustworthy person.
The volunteers were unfazed. Reading the randomly chosen bio, they interpreted even negatives as positives. "She doesn't like people" made them think of her as "exceptionally discerning." And when they read different bios, they concluded their partner was hands-down superior. "Their brains found the most rewarding view of their circumstances," says Prof. Gilbert.
The experimenter then told the volunteer that although she thought she was choosing a folder at random, in fact the experimenter had given her a subliminal message so she would pick the best possible partner. The volunteers later said they believed this lie, agreeing that the subliminal message had led them to the best folder. Having thought themselves into believing they had chosen the best teammate, they needed an explanation for their good fortune and experienced what Prof. Gilbert calls the illusion of external agency.
"People don't know how good they are at finding something desirable in almost any outcome," he says. "So when there is a good outcome, they're surprised, and they conclude that someone else has engineered their fate" -- a lab's subliminal message or, in real life, God.
Religion used to be ascribed to a wish to escape mortality by invoking an afterlife or to feel less alone in the world. Now, some anthropologists and psychologists suspect that religious belief is what Pascal Boyer of Washington University, St. Louis, calls in a 2003 paper "a predictable by-product of ordinary cognitive function."
One of those functions is the ability to imagine what Prof. Boyer calls "nonphysically present agents." We do this all the time when we recall the past or project the future, or imagine "what if" scenarios involving others. It's not a big leap for those same brain mechanisms to imagine spirits and gods as real.
Another God-producing brain quirk is that although many things can be viewed in multiple ways, the mind settles on the most rewarding. Take the Necker cube, the line drawing that shifts orientation as you stare at it. (A cool version is at dogfeathers.com/java/necker.html.) If you reward someone for seeing the cube one way, however, his brain starts seeing it that way only. The cube stops flipping.
There are only two ways to see a Necker cube, but loads of ways to see a hurricane or a recovery from illness. The brain "tends to search for and hold onto the most rewarding view of events, much as it does of objects," Prof. Gilbert writes on the Web site Edge. It is much more rewarding to attribute death to God's will, and to see in disasters hints of the hand of God.
Prof. Gilbert once asked a religious colleague how he felt about helping to discover that people can misattribute the products of their own minds to acts of God. The reply: "I feel fine. God doesn't want us to confuse our miracles with his."
Michael Wright enjoys a eureka moment at the edge of knowledge, as scientists ponder the imponderable.
Some of the presentations are available to watch as QuickTime movies, if you prefer not to read, and keen thinkers can have a bimonthly e-mail of the latest discussions delivered to their inbox.
Each year, John Brockman, the site’s American editor, also sends a big, open-ended question to all the notable thinkers he knows, then publishes their responses online. This year’s little teaser — “What do you believe is true, even though you cannot prove it?” — prompted 60,000 words in reply, on subjects including particle physics, consciousness, artificial intelligence, global warming and tedious sophistry.
I like the belief of Alun Anderson, the editor-in-chief of New Scientist, that cockroaches are conscious, but cannot comment on the theoretical physicist who denies that black holes destroy information or the computer scientist who believes the continuum hypothesis is false.
Visiting Edge will make pseudo-scientists feel cleverer, and the rest of us more than usually stupid, as we discover, with a jolt of pleasure, how little we really know about the world.
IN this special anthology, leading public thinkers - scientists, writers and philosophers such as Richard Dawkins, Howard Gardner, Freeman Dyson, Jared Diamond and Ray Kurzweil - respond to a question proposed by Steven Pinker: "What is your dangerous idea?"
John Brockman clarifies the question in his introduction: he wanted "statements of fact or policy that are defended with evidence and argument by serious scientists and thinkers but which are felt to challenge the collective decency of an age."
Good ideas really shouldn't be thought of as dangerous, so several writers shadow-box around the question a bit, but nearly all of them come up with something original and thought-provoking.
One of my own favourites was about the lab rats that learned to prefer Schoenberg to Mozart, but there is something here for every interest. Common topics are religion (especially its troubled relationship to science), psychology (especially free will), politics, and the impact of technological change (genetic engineering, and the clash between our instincts and our computer-dominated culture).
Contributions are all quite short, ranging from less than a page up to perhaps five pages, which makes it all too easy to give oneself mental indigestion. Other than that, however, it is a veritable feast of ideas.
In a word: Zesty.
If you stroll along the "infinite shingle" of Chesil Beach in Dorset, as Ian McEwan did while composing his new novel, you will find that millennia of tides and winds have "graded the size of pebbles" along its 18-mile length, "with the bigger stones at the eastern end". The writer went to check this out, and felt - as he weighed the pebbles in his palms - that it was true.
Already, critics have lauded On Chesil Beach as a major achievement from a painstaking micro-historian of the inner life. Edward and Florence, its loving but fatally innocent couple, stumble into a wedding-night disaster in the "buttoned-up", respectable England of July 1962, the victims not merely of "their personalities and pasts" but of "class, and history itself". Yet long-haul admirers of McEwan will detect some even deeper rhythms at work here. Once again, he traces the ominous crossing of a threshold from one human state to another: a step into the dark framed - as often in his fiction - by the inexorable onward movement of maturing and ageing bodies, of biological evolution, of climate and even geology itself.
We talk in a restaurant in Fitzrovia, a short walk for McEwan from the handsome house in a Georgian square that he fictionally lends to the neurosurgeon Henry Perowne in Saturday - another novel that pivots on momentous changes, all the way from the medical to the military realms. Upstairs, there seems to be a meeting of the revived Bonzo Dog Doo-Dah Band, exactly the kind of wacky pop pranksters that Edward, in the lonely hippie-era limbo where McEwan's epilogue leaves his stubborn hero, might have promoted in his Camden record shop. Outside, the sunshine signals another kind of transition, from winter into spring. And McEwan, a model of quietly spoken exactitude with words and ideas alike, stresses that On Chesil Beach aims at more than just the scrutiny of that early-Sixties cusp of change between - as Philip Larkin and almost all the reviewers have put it - "the end of the Chatterley ban/ And The Beatles' first LP".
For all the pin-sharp evocation of a time when "youthful energies were pushing to escape, like steam under pressure", this last gasp of British sexual inhibition gave his story a starting point and not a terminus. "I never really thought of it as a historical novel," he explains, "because I was interested in another aspect: which is when young people cross this line - the Conradian shadow-line - from innocence to knowledge. You're also dealing with a human universal. So I was rather interested to discover what young people would make of this. And I was quite relieved, for example, that my sons took to it avidly - even though they're living at a time when they not only have girlfriends, but they have lots of friends who happen to be girls: another world."
The book also survived a test-run beyond McEwan's family (his wife is the journalist and author Annalena McAfee, and he has two early-twenties sons from his first marriage). He read an extract at Hunter College in New York, to the sort of student body who might have been forgiven for failing to sympathise with the bedroom blunderings of a pair of virginal Home Counties 22-year-olds in the summer before the Cuban missile crisis. "This is a community college," the author says, "and the kids are - tough is not the word, they're really lovely, but they're not protected. They've clearly been out there." Would this street-smart audience think: why don't Edward and Florence just get on with it? What's the problem? On the contrary: they seemed deeply engaged.
"So there have to be two elements running side by side," McEwan continues. "One is that, this is particular: these are characters frozen in history, limited by psychology, by class, by private experience. But on the other hand, this is a universal experience that is differently dressed up by different people at different times." Youth always has to cross that line, even if it would no longer run through the starched sheets of a marriage bed in a dowdy Dorset hotel.
Always the punctilious realist, McEwan nonetheless skirts the seas of parable, or myth. Yet for this, the 12th work of fiction since his 1975 debut with the luridly memorable tales of First Love, Last Rites, he wanted to avoid wading in too deep. "This particular beach offered so many metaphorical possibilities," he says. "They could kill the novel! So I really had to row back quite hard on that. The fact that impersonal forces have created order; the fact that the last scene is played out on a tongue of shingle, so you're stranded on both sides; the sense that they sit down to dinner on an evening when they both hope to gain knowledge, which clearly relates to being on the edge of the known world... It was so rich, that I had to keep the volume down."
McEwan's fiction strikes so hard and lingers so long in the imagination precisely because he keeps the interpretative volume down. "Readers will rebel," he believes, "when they spot an overriding, determining metaphor." Or, perhaps, a determining cause. On Chesil Beach hints at a specific reason for Florence's "visceral dread" of sexual experience, one that throws a line from this work back to the toxic households of those earliest stories. Her creator reveals that "in an early draft, it was all too clear". The finished work allows more space for the reader: we can join the dots through the past ourselves, just as we can fill in the futures to be enjoyed or endured by both after the act, or failure to act, that will mould them. Edward, the promising historian, now seems headed for a life of amiable counter-cultural drift; Florence, the driven violinist, stands on the brink of a solitary musical destiny.
Florence plays in a rising string quartet, and the novel that tells her story has a densely wrought, compacted, chamber-music quality. A central movement - the wedding night itself - is interspersed with chapters that delve into the characters' past and, at the finale, the future as well. "One of the first things that I wrote about it when I was making notes," McEwan recalls, "was a simple direction: five times eight - five chapters of about 8,000 words. A wedding night seemed to me perfect for a short novel."
The author of other compressed but resonant pieces, such as The Comfort of Strangers, Black Dogs and the Booker-winning Amsterdam, points out that "I've always liked that form: the novel that can be read in three hours, at a sitting, like a movie or an opera". A chamber opera will be McEwan's next project, due for its premiere at next year's Hay festival. He has almost completed a small-scale, "easily exportable" collaboration with the composer Michael Berkeley (who was his partner more than 20 years ago on the anti-nuclear oratorio Or Shall We Die?). It has a Don Giovanni-style seducer for its protagonist: "We thought that sexual obsession would be a very good subject for an opera."
And sexual obsession, in the form of longing or loathing rather than action, makes an equally compelling motif for On Chesil Beach. For McEwan, the book's microscopically observed convergence of social embarrassment and erotic misery "is not great tragedy. But it's something I always have an interest in: how something small, like not saying the right thing or not making the right gesture, could then send you down a slightly different path in life. It must happen to us countless times, but we barely notice."
In the pre-permissive shadowland of 1962, McEwan himself was a 13-year-old schoolboy, the itinerant, Aldershot-born son of a career army officer from Glasgow. Famously, his father's ordeal at Dunkirk helped to shape the wartime scenes of Atonement, the 2001 novel that, for many of his readers, ranked McEwan first-among-equals in that gifted cohort of novelists (Amis, Barnes and Rushdie among them) born into the aftermath of global war. Now, a few readers wonder if the poignant road-not-taken theme in On Chesil Beach might connect with his rediscovered brother, David Sharp. The son born to McEwan's parents while his mother was still married to her first husband (later killed in action), David was given up for adoption in 1942. McEwan first encountered him in 2002, and they periodically meet, but he says that this reconfigured family history has not (yet) found its way into his work.
The novelist may not enlist people into fiction so directly, but he does recruit places. Just as Saturday more or less gave Mr Perowne his creator's own address, so On Chesil Beach has Edward grow up in a Chilterns cottage that McEwan once almost rented, while Florence's chilly family occupies the north Oxford house he lived in during the 1980s. "I've come to it late," he says, "and it's such a standard thing in the English novel: a sense of place. Which I've always rather lacked, I think, being an army brat, going to boarding school, then a modern university": Sussex, followed by his pioneering stint as the first creative-writing student at East Anglia. "I've never been very rooted but, cumulatively, I guess, I do have a 30-year experience of the Chilterns." McEwan now draws on that intimacy in Edward's memories of an idyllic corner of those hills which has, he says, "withstood the onslaught of modernity reasonably well".
McEwan conjures up his terrain with a walker's close-to-the-ground eye. Plants thrust, creatures breed (or refuse to), and even hills or beaches shift according to the overlapping cycles that push on beyond the limited history that persons or societies know. He is also deeply immersed in ecological debates. In 2005, he joined a trip to the Svalbard archipelago, 79 degrees north in the Arctic, for the Cape Farewell project led by the artist David Buckland, which aims to raise cultural awareness of the issues of global warming. He reads widely in scientific literature and, just before we met, had travelled to Hamburg for a public dialogue with John Schellnhuber, the German government's adviser on climate change.
Yet McEwan the engaged intellectual (as he was during an earlier wave of doomsday anxiety, in the nuclear arms race of the early 1980s) and McEwan the novelist remain separate beings. "Fiction hates preachiness," he affirms. "Nor does it much like facts and figures or trends or curves on graphs. Nor do readers much like to be hectored." He says that in spite of "all the reading that I've done around climate change, none of it suggests anything useful in the way of approaching this novelistically".
What about one more fictional dystopia, with marauding survivors once more trekking through a blasted wasteland? "That doesn't interest me at all. We've had so many dystopias that we're brain-dead in that direction. Also, you can go to certain parts of the world - say, Sudan. There is a dystopia. You don't have to launch these things into the future."
Still, he can just about envisage a fiction that would do artistic justice to a perilously warming world: "Something small and fierce, that would unwind in a way that's intrinsically interesting... It's got to be fascinating, in the way that gossip is. It's got to be about ourselves. Maybe it needs an Animal Farm. Maybe it needs allegory. But if you're going in that direction, then you need a lot of wit."
Meanwhile, ventures such as Cape Farewell (whose exhibition will reach the Barbican gallery in January 2008) may trigger an urge to cherish as well as to lament. In an Arctic cold snap, "I did two or three long hikes that just took my breath away," he says. "Many others have thought this too: that one way forward is not doom-and-gloom but celebration; of what we are, what we have, and what we don't want to lose." On his return, he wrote a fable about the boot-room of the expedition ship, with its all-too-human rows over purloined kit. Artists may not refine the theory or advance the technology that will grapple with climate change, but they can deepen the self-knowledge of the selfish but potentially co-operative beasts who have crossed a fateful, collective shadow-line. "How do you talk about the state we've got ourselves into," he asks, "as a very successful, fossil-fuel-burning civilisation? How do we stop? That really does become a matter of human nature. There's all the science to consider, but finally there is a massive issue of politics and ethics."
McEwan, who shadowed a leading neurosurgeon while researching Saturday, likes the company and outlook of scientists as an antidote to lazy arts-faculty despair. "Among cultural intellectuals, pessimism is the style," he says with a tinge of scorn. "You're not a paid-up member unless you're gloomy." But when it comes to climate change, he finds (quoting the Italian revolutionary Gramsci) that scientists can combine "pessimism of the intellect" with "optimism of the will". "Science is an intrinsically optimistic project. You can't be curious and depressed. Curiosity is itself a sure stake in life. And science is often quite conscious of intellectual pleasure, in a way that the humanities are not."
He loves the spirited playfulness evident in places such as John Brockman's celebrated website Edge, where "neuroscientists might talk to mathematicians, biologists to computer-modelling experts", and in an accessible, discipline-crossing language that lets us all eavesdrop. "In order to talk to each other, they just have to use plain English. That's where the rest of us benefit." Science may also now "encroach" on traditional artistic soil. McEwan recently heard a lecture on the neuroscience of revenge, in which the rage to get even - that inexhaustible fuel for tragedy and comedy alike - illuminated parts of the brain via "real-time, functioning MRI [magnetic resonance imaging]. What was demonstrated was that people were prepared to punish themselves in order to punish others: negative altruism."
For all the storytelling confidence of scientists who try to uncover the biological roots of personal emotions and social beliefs, McEwan keeps faith in the special tasks of art. "I hold to the view that novelists can go to places that might be parallel to a scientific investigation, and can never really be replaced by it: the investigation into our natures; our condition; what we're like in specific circumstances." On Chesil Beach, it strikes me, shows at its infinitely sad conclusion an example of self-punishing "negative altruism" at work. Here, a vengeful righteousness that wrecks the "injured" party takes shape not in the colour-coded neural maps of MRI - but through a vigilant writer's heartbreaking empathy with the twisted feelings of a child in its time.
If human communication and solidarity can founder so totally in this novel's little pool of fear and frustration, what are its prospects in the great ocean of social behaviour? We talk of the carbon-cutting, resource-saving sacrifices this generation may have to make on behalf of its successors, and McEwan comments that such long-term altruism "does go against the grain a bit". All the same, he adds: "I cheer myself up with the thought of medieval cathedral builders, who built for the future - or 18th-century tree-planters, who planted sapling oaks which they would never enjoy. Here, it's much more dire; but we're bound to think of our children, or at least our grandchildren.
"It is difficult to do favours to people you have never met," he says. "But we give money to Oxfam, to charities, to victims of the tsunami and so forth. These are not people who are ever going to repay those favours, or even know who bestowed them." Unlike his characters, doomed to a kind of soul-extinction in their solitude, McEwan believes in making the last-ditch gesture that might save a world. "The worst fate would be to conclude that there's nothing we can do about this, and so let's party to the end."
BRAIN stretch is an exciting concept, the more so as John Brockman's anthology pushes everything to the extreme. Can our brains exist without bodies? If, as Ray Kurzweil says, "we need only 1 per cent of 1 per cent of the sunlight to meet all our energy needs", why are we pouring billions into Middle East wars over oil and not into research on nano-engineered solar panels and fuel cells? Read these 100 or so mini-essays and realise how lacking in vision most politicians are.
The Dublin Review of Books will boast a regular blog where readers can carry on live discussion of particular articles or topics between issues.
But it isn't the only online magazine vying for the attention of literary audiences - there are dozens of sassy outfits out there, each with its own distinctive perks and quirks. ...
www.edge.org has established itself as a major force on the intellectual scene in the US and as required reading for humanities heads who want to keep up to speed with the latest in science and technology. Current debates on the site feature stellar contributors Noam Chomsky, Scott Atran and Daniel C Dennett.
When Robert R. Provine tried applying his training in neuroscience to laughter 20 years ago, he naïvely began by dragging people into his laboratory at the University of Maryland, Baltimore County, to watch episodes of "Saturday Night Live" and a George Carlin routine. They didn't laugh much. It was what a stand-up comic would call a bad room.
So he went out into natural habitats — city sidewalks, suburban malls — and carefully observed thousands of "laugh episodes." He found that 80 percent to 90 percent of them came after straight lines like "I know" or "I'll see you guys later." The witticisms that induced laughter rarely rose above the level of "You smell like you had a good workout."

"Most prelaugh dialogue," Professor Provine concluded in "Laughter," his 2000 book, "is like that of an interminable television situation comedy scripted by an extremely ungifted writer."

He found that most speakers, particularly women, did more laughing than their listeners, using the laughs as punctuation for their sentences. It's a largely involuntary process. People can consciously suppress laughs, but few can make themselves laugh convincingly.
"Laughter is an honest social signal because it's hard to fake," Professor Provine says. "We're dealing with something powerful, ancient and crude. It's a kind of behavioral fossil showing the roots that all human beings, maybe all mammals, have in common."
How neuroscience is transforming the legal system.
...Two of the most ardent supporters of the claim that neuroscience requires the redefinition of guilt and punishment are Joshua D. Greene, an assistant professor of psychology at Harvard, and Jonathan D. Cohen, a professor of psychology who directs the neuroscience program at Princeton. Greene got Cohen interested in the legal implications of neuroscience, and together they conducted a series of experiments exploring how people's brains react to moral dilemmas involving life and death. In particular, they wanted to test people's responses in the f.M.R.I. scanner to variations of the famous trolley problem, which philosophers have been arguing about for decades. ...
...Michael Gazzaniga, a professor of psychology at the University of California, Santa Barbara, and author of "The Ethical Brain," notes that within 10 years, neuroscientists may be able to show that there are neurological differences when people testify about their own previous acts and when they testify to something they saw. "If you kill someone, you have a procedural memory of that, whereas if I'm standing and watch you kill somebody, that's an episodic memory that uses a different part of the brain," he told me. ...
...In a series of famous experiments in the 1970s and '80s, Benjamin Libet measured people's brain activity while telling them to move their fingers whenever they felt like it. Libet detected brain activity suggesting a readiness to move the finger half a second before the actual movement and about 400 milliseconds before people became aware of their conscious intention to move their finger. Libet argued that this leaves 100 milliseconds for the conscious self to veto the brain's unconscious decision, or to give way to it — suggesting, in the words of the neuroscientist Vilayanur S. Ramachandran, that we have not free will but "free won't." ...
...The legal implications of the new experiments involving bias and neuroscience are hotly disputed. Mahzarin R. Banaji, a psychology professor at Harvard who helped to pioneer the I.A.T., has argued that there may be a big gap between the concept of intentional bias embedded in law and the reality of unconscious racism revealed by science. When the gap is "substantial," she and the U.C.L.A. law professor Jerry Kang have argued, "the law should be changed to comport with science" — relaxing, for example, the current focus on intentional discrimination and trying to root out unconscious bias in the workplace with "structural interventions," which critics say may be tantamount to racial quotas. ...
...Others agree with Greene and Cohen that the legal system should be radically refocused on deterrence rather than on retribution. Since the celebrated M'Naughten case in 1843, involving a paranoid British assassin, English and American courts have recognized an insanity defense only for those who are unable to appreciate the difference between right and wrong. (This is consistent with the idea that only rational people can be held criminally responsible for their actions.) According to some neuroscientists, that rule makes no sense in light of recent brain-imaging studies. "You can have a horrendously damaged brain where someone knows the difference between right and wrong but nonetheless can't control their behavior," says Robert Sapolsky, a neurobiologist at Stanford. "At that point, you're dealing with a broken machine, and concepts like punishment and evil and sin become utterly irrelevant. Does that mean the person should be dumped back on the street? Absolutely not. You have a car with the brakes not working, and it shouldn't be allowed to be near anyone it can hurt." ...
SAN FRANCISCO, March 8 — A new company founded by a longtime technologist is setting out to create a vast public database intended to be read by computers rather than people, paving the way for a more automated Internet in which machines will routinely share information.
The company, Metaweb Technologies, is led by Danny Hillis, whose background includes a stint at Walt Disney Imagineering and who has long championed the idea of intelligent machines.
He says his latest effort, to be announced Friday, will help develop a realm frequently described as the “semantic Web” — a set of services that will give rise to software agents that automate many functions now performed manually in front of a Web browser.
The idea of a centralized database storing all of the world’s digital information is a fundamental shift away from today’s World Wide Web, which is akin to a library of linked digital documents stored separately on millions of computers, where search engines serve as the equivalent of a card catalog.
In contrast, Mr. Hillis envisions a centralized repository that is more like a digital almanac. The new system can be extended freely by those wishing to share their information widely.
On the Web, there are few rules governing how information should be organized. But in the Metaweb database, to be named Freebase, information will be structured to make it possible for software programs to discern relationships and even meaning.
For example, an entry for California’s governor, Arnold Schwarzenegger, would be entered as a topic that would include a variety of attributes or “views” describing him as an actor, athlete and politician — listing them in a highly structured way in the database.
That would make it possible for programmers and Web developers to write programs allowing Internet users to pose queries that might produce a simple, useful answer rather than a long list of documents.
Since it could offer an understanding of relationships like geographic location and occupational specialties, Freebase might be able to field a query about a child-friendly dentist within 10 miles of one’s home and yield a single result.
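The idea the article describes, structured topics with typed attributes that a program can query for a direct answer, can be sketched in a few lines. Everything below is hypothetical: the field names, the sample topics and the `find` helper are illustrative assumptions, not the actual Freebase schema or API.

```python
# Hypothetical sketch of a Freebase-style structured store: each topic
# carries named "views" (politician, actor, dentist...), and each view
# holds typed attributes. None of this is the real Freebase schema.

topics = {
    "arnold_schwarzenegger": {
        "name": "Arnold Schwarzenegger",
        "views": {
            "politician": {"office": "Governor of California"},
            "actor": {"notable_film": "The Terminator"},
            "athlete": {"sport": "bodybuilding"},
        },
    },
    "jane_doe_dds": {
        "name": "Jane Doe, DDS",
        "views": {
            "dentist": {"child_friendly": True, "miles_from_home": 4},
        },
    },
}

def find(view, **criteria):
    """Return the names of topics whose given view matches every criterion."""
    results = []
    for topic in topics.values():
        attrs = topic["views"].get(view)
        if attrs and all(attrs.get(k) == v for k, v in criteria.items()):
            results.append(topic["name"])
    return results

# Because the data is structured, a query can yield one direct answer
# instead of a long list of documents:
print(find("dentist", child_friendly=True))  # prints ['Jane Doe, DDS']
```

The contrast with a keyword search engine is the point: the query asks about a relationship ("is this dentist child-friendly?") rather than matching words in documents, which is what lets a system like the one Mr. Hillis describes return a single useful result.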
The system will also make it possible to transform the way electronic devices communicate with one another, Mr. Hillis said. An Internet-enabled remote control could reconfigure itself automatically to be compatible with a new television set by tapping into data from Freebase. Or the video recorder of the future might stop blinking and program itself without confounding its owner.
In its ambitions, Freebase has some similarities to Google — which has asserted that its mission is to organize the world’s information and make it universally accessible and useful. But its approach sets it apart.
“As wonderful as Google is, there is still much to do,” said Esther Dyson, a computer and Internet industry analyst and investor at EDventure, based in New York.
Most search engines are about algorithms and statistics without structure, while databases have been solely about structure until now, she said.
“In the middle there is something that represents things as they are,” she said. “Something that captures the relationships between things.”
That addition has long been a vision of researchers in artificial intelligence. The Freebase system will offer a set of controls that will allow both programmers and Web designers to extract information easily from the system.
“It’s like a system for building the synapses for the global brain,” said Tim O’Reilly, chief executive of O’Reilly Media, a technology publishing firm based in Sebastopol, Calif.
Mr. Hillis received his Ph.D. in computer science while studying artificial intelligence at the Massachusetts Institute of Technology.
In 1985 he founded one of the first companies focused on massively parallel computing, Thinking Machines. When the company failed commercially at the end of the cold war, he became vice president for research and development at Walt Disney Imagineering. More recently he was a founder of Applied Minds, a research and consulting firm based in Glendale, Calif. Metaweb, founded in 2005 with venture capital backing, is a spinoff of that company.
Mr. Hillis first described his idea for creating a knowledge web he called Aristotle in a paper in 2000. But he said he did not try to build the system until he had recruited two technical experts as co-founders. Robert Cook, an expert in parallel computing and database design, is Metaweb’s executive vice president for product development. John Giannandrea, formerly chief technologist at Tellme Networks and chief technologist of the Web browser group at Netscape/AOL, is the company’s chief technology officer.
“We’re trying to create the world’s database, with all of the world’s information,” Mr. Hillis said.
All of the information in Freebase will be available under a license that makes it freely shareable, Mr. Hillis said. In the future, he said, the company plans to create a business by organizing proprietary information in a similar fashion.
Contributions already added into the Freebase system include descriptive information about four million songs from Musicbrainz, a user-maintained database; details on 100,000 restaurants supplied by Chemoz; extensive information from Wikipedia; and census data and location information.
A number of private companies, including Encyclopaedia Britannica, have indicated that they are willing to add some of their existing databases to the system, Mr. Hillis said.
Brian Eno, musician
Interventionists vs laissez-faireists
One of the big divisions of the future will be between those who believe in intervention as a moral duty and those who don't. This issue cuts across the left/right divide, as we saw in the lead-up to the invasion of Iraq. It asks us to consider whether we believe our way of doing things to be so superior that we must persuade others to follow it, or whether, on the other hand, we are prepared to watch as other countries pursue their own, often apparently flawed, paths. It will be a discussion between pluralists, who are prepared to tolerate the discomfort of diversity, and those who feel they know what the best system is and feel it is their moral duty to encourage it.
Globalists vs nationalists
How prepared are we to allow national governments the freedom to make decisions which may not be in the interests of the rest of the world? With issues such as climate change becoming increasingly urgent, many people will begin arguing for a global system of government with the power to overrule specific national interests.
Communities of geography vs communities of choice
At the same time, some people will feel less and less allegiance to "the nation," which will become an increasingly nebulous act of faith, and more allegiance to "communities of choice" which exist outside national identities and geographical restraints. We see the beginnings of this in transnational pressure groups such as Greenpeace, MoveOn and Amnesty International, but also in the choices that people now make about where they live, bank their money, get their healthcare and go on holiday.
Real life vs virtual life
Some people will spend more and more of their time in virtual communities such as Second Life. They will claim that their communities represent the logical extension of citizen democracy. They will be ridiculed and opposed by "First Lifers," who will insist that reality with all its complications always trumps virtual reality, but the "Second Lifers" in turn will insist that they live in a world of their own design and are therefore by definition more creative and free. This division will deepen and intensify, developing from a mere cultural preference into a choice about how and where people spend their lives.
Life extension for all vs for some
There will be an increasingly agonised division between those who feel that new life-extension technologies should be available only to those who can afford them and those who feel they should be available to everyone. Life itself will be the resource over which wars will be fought: the "have nots" will feel that there is a fundamental injustice in some people being able to enjoy conspicuously longer and healthier lives simply because they happen to be richer.
Anthony Giddens, sociologist
"The future isn't what it used to be," George Burns once said. And he was right. This century we are peering over a precipice, and it's an awful long way down. We have unleashed forces into the world that it is not certain that we can control. We may have already done so much damage to the planet that by the end of the century people will live in a world ravaged by storms, with large areas flooded and others arid. But you have to add in nuclear proliferation, and new diseases that we might have inadvertently created. Space might become militarised. The emergence of mega-computers, allied to robotics, might at some point also create beings able to escape the clutches of their creators.
Against that, you could say that we haven't much clue what the future will bring, except it's bound to be things that we haven't even suspected. Twenty years ago, Bill Gates thought there was no future in the internet. The current century might turn out much more benign than scary.
As for politics, left and right aren't about to disappear—the metaphor is too strongly entrenched for that. My best guess about where politics will focus would be upon life itself. Life politics concerns the environment, lifestyle change, health, ageing, identity and technology. It may be a politics of survival, it may be a politics of hope, or perhaps a bit of both.
Nicholas Humphrey, scientist
How can anyone doubt that the faultline is going to be religion? On one side there will be those who continue to appeal for their political and moral values to what they understand to be God's will. On the other there will be the atheists, agnostics and scientific materialists, who see human lives as being under human control, subject only to the relatively negotiable constraints of our evolved psychology. What makes the outcome uncertain is that our evolved psychology almost certainly leans us towards religion, as an essential defence against the terror of death and meaninglessness.
Marek Kohn, science writer
The right, of course, is still with us; robust structures remain to uphold individualism and the pursuit of wealth. There is also plenty of room in the current orthodoxy for liberalism and conservatism of all stripes. What's left out? Equality and solidarity—which takes us back to the égalité and fraternité of the French revolution, where the terms "left" and "right" came in. These seem to be fundamental values, intuitively recognised as the basis of fair and healthy social relations, so we may expect that they will reassert themselves. But as dominant ideologies fail to give them their fair due, they will reappear in marginal and often disagreeable guises. Social solidarity may be advanced within narrow group solidarities; equality may be appropriated by demagogues.
Recent manifestations in central Europe and South America have been overlooked because they are accompanied by tendencies that rightly affront liberals. It is hard to imagine what could restore social solidarity and equality to the heart of political discourse, so we must expect that collectivist tendencies in our kind of polity will be largely confined to the bureaucratic management of resources placed under ever-growing pressure by economic growth and its environmental consequences.
Mark Pagel, scientist
Modern humans evolved to live in small co-operative groups with extensive divisions of labour among unrelated people linked only by their common culture. Co-operation is fragile, being the contented face of trust, reciprocity and the perception of a shared fate—when they go, the mask can quickly fall. The psychology of the co-operative group, of how we can maintain it and equally how we can control its dangerous tendencies—parochialism, xenophobia, exclusion and warfare—will often be at the front door of 21st-century politics.
The reasons are clear. The politics of the 20th century were expansive and hopeful, enlivened by growing prosperity. In the 21st century, increasing multiculturalism and widespread movements of people will repeatedly challenge the trust and sense of equity that binds together co-operative groups, unleashing instincts for selfish preservation. For politicians and thinkers, a pressing task at all levels of politics is to seek ways to manage these issues that somehow draw all of the actors into the elaborate and fragile reciprocity loops of the co-operative society. It sounds impossible, it won't be easy and there are no simple recipes. But if we fail, we risk sliding into xenophobic hysteria, clashes of culture, and the frenzied and dangerous grabbing of natural resources.
Lisa Randall, scientist
Debates today have descended into those between the lazy and the slightly less lazy. We are faced with urgent issues, yet the speed with which lawmakers approach them is glacial—actually slower than that: glaciers are melting faster than we are attacking the issues.
Steven Rose, biologist
Last century's alternatives were socialism or barbarism. This century's prospects are starker: social justice or the end of human civilisation—if not our species. To achieve that justice it is imperative that we retain the utopian dream of "from each according to their abilities, to each according to their needs," but needs and abilities are constantly being refashioned by runaway sciences and technologies harnessed ever more closely to global industry and imperial power and embedded within a degraded and degrading environment. This century's "left," like that of the last century, is constituted by those groups, old or newly formed, struggling against these hegemonic powers.
Those who wonder what cutting-edge scientists might ponder outside of their classrooms and laboratories need wonder no more. In What We Believe But Cannot Prove, "intellectuals in action" speculate on the frontiers of science, both hard and soft. Skeptics, however, should not be deceived by the title. An ample majority of the more than 100 teasingly short essays included will sate the intellect's appetite for both facts and reasoned theory. John Brockman's new collection features the world's most celebrated and respected scientists and their musings on everything from human pre-history to cosmology and astrophysics, from evolution to extraterrestrial intelligence, and from genetics to theories of consciousness. ....
...What We Believe But Cannot Prove offers an impressive array of insights and challenges that will surely delight curious readers, generalists and specialists alike. Science is intimidating for the vast majority of us. But John Brockman has grown deservedly famous in recent years for his ability to lure these disciplines and their leading practitioners back to Earth, where terrestrials are afforded all-too-rare opportunities to marvel at the intellectual and creative magnificence of science in particular, and at our species' immeasurable potential in all pursuits more generally.
Individual tragedies, says Anderson, sell many more newspapers and attract many more viewers than general trends
Often, after opening the newspaper, watching the news or living through some especially sad event, one ends up with the idea that the world was much better before and that we are headed for decadence, loneliness, rot and extreme violence. In some places and periods that is indeed the case. But it is not so in general... Two friends of mine reminded me, in end-of-year pieces, that there is much to criticize, confront and change, but also much to celebrate. Chris Anderson wrote about the extreme over-reporting that occurs whenever there is a terrorist incident, a mass-casualty accident or a natural disaster. This happens because, in most of the world, such violent deaths are not commonplace. They get big coverage precisely because they are exceptional events. Individual tragedies, says Anderson, sell many more newspapers and attract many more viewers than general trends. "Dog attacks innocent infant" is far more powerful than "poverty fell by 1 per cent." Yet although the second story is far less attractive in media terms, it means saving and improving many more lives.
Much has been written about how the net, Google, Yahoo, Skype and YouTube eliminate distance and cut the cost of communicating and of obtaining global information to nearly zero. The result of being always connected, everywhere, at all hours, is that distances shrink and individual dramas from around the world enter our homes daily, more and more. We can learn, 24/7, about fires, bombs, assaults, torture, disappearances, rapes and political scandals in any of the planet's nearly 200 countries. A photo, a testimonial, a 15-second video clip bring us closer to more and more individual dramas. Each story convinces us, a little more, that we live in a cruel, hard and violent world...
A sober look at history shows that optimism is fundamentally justified, for violence as a defining force in human history is today in retreat. The psychologist Steven Pinker of Harvard University points this out in the internet forum edge.org, where he and 160 other colleagues answer the question of what makes them optimistic. It may come as a surprise, Pinker says, but violence has declined drastically over the centuries. Genocide as a common means of conflict resolution, assassination as a way of settling succession, execution and torture as punishment, slavery out of laziness and greed: today these are rarities and, where they do occur, the object of fierce criticism. What went right here? Pinker asks, and observes that we have little by way of an answer. That is probably because we always ask why there is war, and never why there is peace. ...Almost all the answers in the collection, which will soon appear as a book, are carried by this kind of optimism. The geographer and biologist Jared Diamond is optimistic because business sometimes makes decisions that are also good for humanity. Brian Eno is optimistic because acceptance of global warming has exposed the market's greatest failure. J. Craig Venter expects a revolution in the culture of decision-making once science's newest methods, which rest above all on recognizing irrelevant information, are adopted outside science. The future, then, is no surveillance state. Information technology in particular is in vogue among the optimists. Even Africa, the lost continent, is experiencing a boom here that will change a great deal.
Only the Nobel laureate Frank Wilczek holds out hope that the theory that explains everything, that world formula known as "Einstein's dream," will never exist. One should choose one's words more carefully, the theoretical physicist suggests. In this he shows a humility toward creation that is rare among his peers, a thought he is unwilling to sacrifice for the hope of a scientific moment of redemption.
Martin Rees, whose Royal Society, incidentally, once wrongly decided the priority dispute between Newton and Leibniz over the calculus in the Englishman's favour, also responded: he has received many letters saying that his book is still too rosy and that he himself is an incorrigible optimist. That, he now writes, is what he intends to remain. Dennett concedes that on bad days he can subscribe to his colleague's bleak scenarios. But he identifies a different greatest danger than the physicist does: good old overreaction.
It has been a strange year for science books. Some authors have presented new ideas about science — there has been a tussle over string theory, for example, and in Moral Minds Marc Hauser has suggested that morality is as innate as language (see Nature 443, 909–910; 2006). But perhaps the dominant theme running through many of the popular science books published this year has been, surprisingly, religion.
The continuing debate about the teaching of creationism in schools has no doubt fuelled this preoccupation. Many scientists, particularly those in the United States, have been moved to take a stand against proponents of creationism and intelligent design. Intelligent Thought, edited by John Brockman, is a collection of essays from the likes of Jerry Coyne and Tim White who provide elegantly expressed scientific arguments to counter the claims of intelligent design. This book should appeal to "those who already see evolutionary biology as a science", according to John Tyler Bonner (see Nature 442, 355–356; 2006). Michael Shermer's Why Darwin Matters is perhaps more accessible for the public, but neither book is likely to sway creationists from their belief.
Many of the scientists who made it to the top of the bestseller lists focused specifically on religion. Daniel Dennett's book Breaking the Spell provides essentially a natural history of religion but skirts around the cultural reasons why religion has developed and become such a dominant force in politics today, in the view of reviewer Michael Ruse (see Nature 439, 535; 2006).
......But Richard Dawkins isn't interested in reconciling science and religion. In The God Delusion, which has topped the bestseller lists in both the United States and Britain this autumn, Dawkins argues with the fervour of a preacher that religion has no place in the modern world, and that atheism is the 'true path' (see Nature 443, 914–915; 2006).
Dawkins' domination of the genre of popular science books was celebrated earlier in the year with the publication by Oxford University Press of a thirtieth-anniversary edition of his book The Selfish Gene, and Richard Dawkins: How A Scientist Changed the Way We Think, a collection of comments and testimonials edited by Alan Grafen and Mark Ridley (see Nature 441, 151–152; 2006).
Physicists have also been questioning our place in the Universe. Cosmologist Alex Vilenkin's Many Worlds in One takes a look at the multiverse theory — the idea that many different universes exist and explanations for how we came to be in this one (see Nature 443, 145–146; 2006). Paul Davies' The Goldilocks Enigma gives the topic a more popular treatment (see Nature 444, 423–424; 2006). ...
After a spate of books on string theory in 2005, the hottest hope for a 'theory of everything' came in for criticism this year, with the appearance of Lee Smolin's The Trouble with Physics ... (see Nature 443, 482, 491 ... 2006).
AN IDEA may be dangerous either to its conceiver or to others, including its proponents. Four hundred years ago, heliocentricity was acutely dangerous to Galileo, whom it led before the Holy Inquisition. Two and a half centuries later, Darwin's notions on natural selection and the evolution of species jeopardised the certainties and imperilled the livelihoods of many professional Christians. To this day, the idea that God does not exist is dangerous enough to get atheists murdered in America.
The editor of this anthology of dangerous ideas, John Brockman, is, among other things, the publisher of Edge, the "Third Culture" website (www.edge.org). He has already published What We Believe but Cannot Prove, to which this volume is a companion. Each year, Brockman asks a question of his contributors. Last year's was: "What is your dangerous idea?" He meant not necessarily a new idea, or even one which they had originated, but one which is dangerous "not because it is assumed to be false but because it might be true". This volume, with an introduction by Steven Pinker and an afterword by Richard Dawkins, publishes the responses given in 2006 by 108 of "Today's Leading Thinkers on the Unthinkable".
...There is much in many of these brief essays to astonish, to be appalled at, to mull over or to wish for. Some of them suffer from galloping emailographism, that mannerism of the hasty respondent whose elliptical prose can make even the most pregnant idea indigestible. But most of them, from the three-sentence reminder by Nicholas Humphrey of Bertrand Russell's dangerous idea ("That it is undesirable to believe in a proposition when there is no ground whatever for supposing it true") to the five pages of V.S. Ramachandran on Francis Crick's "Astonishing Hypothesis" (that what we think of as our self is merely the activity of 100 billion bits of jelly, the neurons which constitute the brain), are vitally engaging to anyone with an ounce of interest in matters of being, or whatever.
...Mind you, there is one glimpse of the future which rings grotesque enough to be plausible, Gerald Holton's "Projection of the Longevity Curve", in which we see a future matriarch, 200 years old, on her death bed, surrounded by her children aged about 180, her grandchildren of about 150, her great-grandchildren of about 120, their offspring aged in their 90s, and so on for several more generations. A touching picture, as the author says, "But what are the costs involved?"
Personality, the connection between the Massachusetts Institute of Technology and the military, Norbert Wiener and cybernetics, Henry A. Murray and the LSD experiments at Harvard and crazy old Mr. Kaczynski with his terror of mind control and supercomputers.
Are you lost yet? I've watched the film a few times, and I'm still not quite sure what it all means, or if it means anything at all. Like the Internet itself, the bewildering density of information requires careful sorting.
But one idea does jump out. John Brockman paraphrases a quote from Doubt and Certainty in Science: A Biologist's Reflections on the Brain by J.Z. Young that states: "We create tools and then we mould ourselves through our use of them."
In the brave new world of Google Video, YouTube, MySpace, et al., what does this mean? If we create technology and then become what we have created, have we now succeeded in making Jackass World?...
...So, are you being controlled by an elite group of cyber-hippies and ex-CIA military types without even knowing it? Or are you, as Theodor Adorno believed, lulled into a state of passivity and pseudo-individualization by pop culture? Or are you part of what Marshall McLuhan heralded as the new dawn in which "we have extended our central nervous system itself in a global embrace, abolishing both space and time as far as our planet is concerned"?
[Ed. Note: See the trailer]