Edge 288—June 4, 2009
In the industrial model of student mass production, the teacher is the broadcaster. A broadcast is by definition the transmission of information from transmitter to receiver in a one-way, linear fashion. The teacher is the transmitter and the student is a receptor in the learning process. The formula goes like this: "I'm a professor and I have knowledge. You're a student, you're an empty vessel and you don't. Get ready, here it comes. Your goal is to take this data into your short-term memory and through practice and repetition build deeper cognitive structures so you can recall it to me when I test you."... The definition of a lecture has become the process in which the notes of the teacher go to the notes of the student without going through the brains of either.
IMPENDING DEMISE OF THE UNIVERSITY
In his Edge feature "Gin, Television, and Cognitive Surplus", Clay Shirky noted that after WWII we were faced with something new: "free time. Lots and lots of free time. The amount of unstructured time among the educated population ballooned, accounting for billions of hours a year. And what did we do with that time? Mostly, we watched TV."
In "The End of Universal Rationality", Yochai Benkler, who has been looking at the social implications of the Internet and network societies since the early 90s, saw the end of an era.
Benkler believes that these "phenomena on the Net are not ephemeral". And he has spent the last 20 years trying to get his head around what is transpiring.
In a Reality Club discussion "On 'Is Google Making Us Stupid' By Nicholas Carr" W. Daniel Hillis, Kevin Kelly, Nicholas Carr, Jaron Lanier, Douglas Rushkoff and others explored the future of the printed book.
Enter Don Tapscott, who is looking at the challenges the digital revolution poses to the fundamental aspects of the University.
"Universities are finally losing their monopoly on higher learning", he writes. "There is a fundamental challenge to the foundational modus operandi of the University — the model of pedagogy. Specifically, there is a widening gap between the model of learning offered by many big universities and the natural way that young people who have grown up digital best learn."
Contrary to Nicholas Carr's proposition that Google is making us stupid, Tapscott counters that growing up digital has prepared this generation for the challenges of the digital age.
There's a new kind of conversation taking place among the younger generation, and our universities have yet to embrace it. This is a topic worthy of serious conversation by the Edge community, and I hope to present comments from contributors in future Edge editions.
— John Brockman
IMPENDING DEMISE OF THE UNIVERSITY
For fifteen years, I've been arguing that the digital revolution will challenge many fundamental aspects of the University. I've not been alone. In 1998, none other than Peter Drucker predicted that big universities would be "relics" within 30 years.
Flash forward to today and you'd be reasonable to think that we have been quite wrong. University attendance is at an all-time high. The percentage of young people enrolling in degree-granting institutions rose over 115% from 1969-1970 to 2005-2007, while the percentage of 25- to 29-year-old Americans with a college degree doubled. The competition to get into the greatest universities has never been fiercer. At first blush the university seems to be in greater demand than ever.
Yet there are troubling indicators that the picture is not so rosy. And I'm not just talking about the decimation of university endowments by the current financial meltdown.
Universities are finally losing their monopoly on higher learning, as the web inexorably becomes the dominant infrastructure for knowledge, both as a container for knowledge and as a global platform for knowledge exchange between people.
Meanwhile on campus, there is a fundamental challenge to the foundational modus operandi of the University — the model of pedagogy. Specifically, there is a widening gap between the model of learning offered by many big universities and the natural way that young people who have grown up digital best learn.
The old-style lecture, with the professor standing at the podium in front of a large group of students, is still a fixture of university life on many campuses. It's a model that is teacher-focused, one-way, one-size-fits-all and the student is isolated in the learning process. Yet the students, who have grown up in an interactive digital world, learn differently. Schooled on Google and Wikipedia, they want to inquire, not rely on the professor for a detailed roadmap. They want an animated conversation, not a lecture. They want an interactive education, not a broadcast one that might have been perfectly fine for the Industrial Age, or even for boomers. These students are making new demands of universities, and if the universities try to ignore them, they will do so at their peril.
The model of pedagogy, of course, is only one target of criticism directed toward universities.
The Many Challenges to the University
Most resources of large universities are directed towards research, not learning. The universities are not primarily institutes of higher learning but institutes for science and research. In his book Rethinking Science, Michael Gibbons developed a scathing critique of the current model of science as conducted in the university.
Recently the questioning has heated up on other fronts. In the New York Times last month, Mark Taylor, chairman of Columbia University's religion department, whipped up a storm of academic controversy with a provocative Op-Ed page article called "The End of the University as We Know It".
"Graduate education," he began, "is the Detroit of higher learning. Most graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist) and develop skills for which there is diminishing demand (research in subfields within subfields and publication in journals read by no one other than a few like-minded colleagues), all at a rapidly rising cost (sometimes well over $100,000 in student loans)." The key problem, he noted, began with Kant in his 1798 work, "The Conflict of the Faculties." Kant argued that universities should "handle the entire content of learning by mass production, so to speak, by a division of labor, so that for every branch of the sciences there would be a public teacher or professor appointed as its trustee."
Taylor argued that graduate education must be restructured at a fundamental level to move away from the ultra-narrow scholarship. Among other things, he called for more cross-disciplinary inquiry, the creation of problem-focused programs, with a sunset clause, as well as more collaboration between all educational institutions, and the abolition of tenure. One week later, the outcry from fellow academics filled the entire letters page on the Sunday New York Times. One of his own colleagues at Columbia said it was "alarming and embarrassing" to hear "crass anti-intellectualism" emerge from his own institution. Another academic accused Taylor of "poisoning the waters of higher education."
The Model of Pedagogy
Whatever the merits of Taylor's call to restructure higher education, I think he is right to call for a deep debate on how universities function in a networked society. Yet I think he misses the most fundamental challenge to the university as we know it. The basic model of pedagogy is broken. "Broadcast learning" as I've called it is no longer appropriate for the digital age and for a new generation of students who represent the future of learning.
In the industrial model of student mass production, the teacher is the broadcaster. A broadcast is by definition the transmission of information from transmitter to receiver in a one-way, linear fashion. The teacher is the transmitter and the student is a receptor in the learning process. The formula goes like this: "I'm a professor and I have knowledge. You're a student, you're an empty vessel and you don't. Get ready, here it comes. Your goal is to take this data into your short-term memory and through practice and repetition build deeper cognitive structures so you can recall it to me when I test you."
The New Generation of Students
This model might have been perfectly adequate for the baby boomers, who grew up in broadcast mode, watching 24 hours a week of television (not to mention being broadcast to as children by parents, as students by teachers, as citizens by politicians, and, when they entered the workforce, as employees by bosses). But young people who have grown up digital are abandoning one-way TV for the higher stimulus of interactive communication they find on the Internet. In fact, television viewing is dropping, and TV has become nothing more than ambient media for youth — akin to Muzak. Sitting mutely in front of a TV set — or a professor — doesn't appeal to or work for this generation. They learn differently, best through non-sequential, interactive, asynchronous, multi-tasked and collaborative learning.

Some critics, of course, think that Google makes you stupid; it's so hard to concentrate and think deeply amid the overwhelming amount of information online, they contend. Mark Bauerlein, an English professor at Emory University, even calls them the "dumbest generation" in his recent book on the topic.
My research suggests these critics are wrong. Growing up digital has changed the way their minds work in a manner that will help them handle the challenges of the digital age. They're used to multi-tasking, and have learned to handle the information overload. They expect a two-way conversation. What's more, growing up digital has encouraged this generation to be active and demanding enquirers. Rather than waiting for a trusted professor to tell them what's going on, they find things out on their own, using everything from Google to Wikipedia.
If universities want to adapt their teaching to their current audience, they should, as I've been saying for years, make significant changes to the pedagogy. And the new model of learning is not only appropriate for youth, but increasingly for all of us. In this generation's culture lies the new culture of learning.
The professors who remain relevant will have to abandon the traditional lecture, and start listening and conversing with the students — shifting from a broadcast style and adopting an interactive one. Second, they should encourage students to discover for themselves, and learn a process of discovery and critical thinking instead of just memorizing the professor's store of information. Third, they need to encourage students to collaborate among themselves and with others outside the university. Finally, they need to tailor the style of education to their students' individual learning styles.
Because of technology, this is now possible. But this is not fundamentally about technology per se. Rather, it represents a change in the relationship between students and teachers in the learning process.
Most Vulnerable Universities
But the same cannot be said of many of the big universities that regard their prime role to be a centre for research, with teaching as an inconvenient afterthought and class sizes so large that the only way they can "teach" is through lectures. These universities are vulnerable, especially at a time when students can watch lectures online for free by some of the world's leading professors on sites like Academic Earth. They can even take the entire course online, for credit. According to the Sloan Consortium, a recent article in the Chronicle of Higher Education tells us, "nearly 20 per cent of college students — some 3.9 million people — took an online course in 2007, and their numbers are growing by hundreds of thousands each year. The University of Phoenix enrolls over 200,000 each year."
Interactive learning produces real results. An evaluation study of 350 Cornell students found that those who were asked "deep questions" (that elicit higher-order thinking) with frequent peer discussion scored noticeably higher on their math exams than students who were not asked deep questions or who had little to no chance for peer discussion. Dr. Terrell explains: "It's when the students talk about what they think is going on and why, that's where the biggest learning occurs for them…. You can hear people sort of saying, 'Oh I see, I get it.' … And then they're explaining to somebody else … and there's an authentic understanding of what's going on. So much better than what would happen if I, as the teacher person, explain it. There's something that happens with this peer instruction."
The statistics course I took from Dr. Hunka had no lectures. Just as well: the statistics lecture is by definition a bust. There is no "one-size-fits-all" for statistics – everyone in the lecture hall is either bored or doesn't get it. Instead, we got face-to-face time with Dr. Hunka, who was freed up from being a transmitter of data to become someone who customized a learning experience for each of us, one on one.
Challenge to Teaching
But if campuses are seen as places where learning is inferior to other models, or, worse, as places where learning is restricted and stifled, the role of the campus experience will be undermined as well. Universities that embrace the new models become more effective learning environments and more desirable places. Even something as simple as putting lectures online does not undermine the value of on-campus education; it has enhanced it. The video lectures allow students to absorb the course content online — whenever it's convenient — and then get together to tinker, invent new things, or discuss the material. The experience has shown MIT that the real value of what it offers is not the lecture per se, but rather the whole package — the content tied to the human learning experience on campus, plus the certification. Universities, in other words, cannot survive on lectures alone.
A Challenge to the Relationship of the University to Other Institutions
"The time has come for some far reaching changes to the university, our model of pedagogy, how we operate, and our relationship to the rest of the world," says Luis M. Proenza, president of the University of Akron.
He asks a provocative question: Why should a university student be restricted to learning from the professors at the university he or she is attending? True, students can obviously learn from intellectuals around the world through books, or via the Internet. Yet in a digital world, why shouldn't a student be able to take a course from a professor at another university? Proenza thinks universities should use the Internet to create a global centre of excellence. In other words, choose the best courses you have and link them with the best at a handful of universities around the world to create an unquestionably best-in-class program for students. Students would get to learn from the world's greatest minds in their area of interest — either in the physical classroom, or online. This global academy would also be open to anyone online. This is a beautiful example of the collaboration I described in the book I co-authored, Wikinomics.
So why hasn't it happened yet? "It's the legacy of established human and educational infrastructure," says Proenza. The analogy is not the newspaper business, which has been weakened by the distribution of knowledge on the Internet, he notes. "We're more like health care. We're challenged by obstructive, non-market-based business models. We're also burdened by a sense that doctor knows best, or professor knows best."
"There are a lot of sacred cows," he said. Why, for example, are universities judged by the number of students they exclude, or by how much they spend? Why aren't they judged by how well they teach, and at what price?
Paradigms Die Hard
Changing the model of pedagogy for this generation is crucial for the survival of the university. If students turn away from a traditional university education, this will erode the value of the credentials universities award, their position as centers of learning and research, and their role as campuses where young people get a chance to "grow up."
"A captivating collection of essays ... a medley of big ideas." — Amanda Gefter, New Scientist
"If these authors are the future of science, then the science of the future will be one exciting ride! Find out what the best minds of the new generation are thinking before the Nobel Committee does. A fascinating chronicle of the big, new ideas that are keeping young scientists up at night." — Daniel Gilbert, author of Stumbling on Happiness
"A preview of the ideas you're going to be reading about in ten years." — Steven Pinker, author of The Stuff of Thought
"Brockman has a nose for talent." — Nassim Nicholas Taleb, author of The Black Swan
"Capaciously accessible, these writings project a curiosity to which followers of science news will gravitate." — Booklist
I'm eager to hear Smith's perspective on the Northern Rim as a climate driver. As the permafrost melts and the boreal forest marches north, what happens with methane and CO2 emissions? What happens with snow and vegetation albedo? What happens with cloud and precipitation regimes? How are coastal areas different from the vast inlands?
And what does Smith think of ecologist Sergei Zimov's effort to restore the "mammoth tundra steppe" in northeastern Siberia?
This is a fun essay if you read it backwards. The real conclusion towards the end is that we are talking about a "conversion from land that is hardly livable to land that is somewhat livable" which is perhaps not such a big change.
I don't think that the small change in winter low temperatures as climate warms is the constraint on the development of the Arctic.
The fundamental constraints are the world price of natural resources and the strength of the environmental lobby (outside of Russia). There has been development in the Arctic already, long before the temperature warmed a little bit. It is just very very expensive.
A good way to understand the remoteness of the Arctic is to consider Canada's territory of Nunavut. It is three times the size of California and has a population of 30,000 people. To put that in perspective, imagine if the entire population of the United States was just 150,000, or only 3,000 people lived in the whole of Great Britain. These nations would then have the same population density as Nunavut (might sound like heaven to some). There are no roads, railroads or useful ports in Nunavut. You face distance, cold, no infrastructure and an unbelievably sparse population. Warming undermines hunting and merely lengthens the ice-free season for supply by ship by a small period. That's it.
To take out natural mineral resources from these areas you have to start with high-value-for-weight materials like diamonds and gold which can be flown out. But these mines don't really bring development, just short-lived boom and bust mining towns. When you move onto heavier stuff, like iron ore, that can run for many decades, you have staggering transport investment costs. Mary River in Baffin Island is home to perhaps the biggest deposit of high grade iron ore in the world but to get it out requires a railway across the tundra, a new port and a fleet of ice breaking bulk carriers. The ambitious BaffinLand company is all ready and waiting for a $4.1 billion investment to get going and I hope they succeed. But such big opportunities are rare and few have been taken, like the Red Dog Mine in Alaska and Norilsk in Russia.
Oil and gas are the only large sources of long-term natural resource wealth across the Arctic. In Alaska, offshore development, where the big fields are, has come to a halt in the face of environmental groups concerned about the risks to wildlife and fisheries, already under strain from climate change. Only in Russia is really rapid development under way. That's where the innovation is right now, in the Shtokman field and out around Yamal. Is this really going to change as we face the realities of a warming world? Are we going to say, let's go for lots more high-priced Arctic oil in Alaska? More hydrocarbons please and don't worry about oil spills and polar bears? I doubt it, or if we do, it won't be for long. The race will go to the brilliant innovators who show us how to replace high-priced oil.
Still I very much hope that some development will come to the Arctic, but not any more people. Taking the arc of land from Alaska to Greenland, the Inuit lands, the situation of the indigenous people is very tough. There's a booming population (60 per cent of the population is under 25 in Nunavut, while in the US, 32 per cent of the population is under 25), high unemployment, staggering suicide rates for young men (it's not the dark cold winters; 80 per cent of suicides are in the 24-hour summer light), and low education levels. They desperately need jobs. One Inuit regional development official put it like this to me: "When Inuit are making a meaningful living, it's a lot better. You see the community being much more vibrant, everybody feels much better about themselves and life is good."
So I'd rather not think of the Arctic as a place southerners might settle but as a place where we southerners might help to bring a life that is good.
LAURENCE C. SMITH
The prospect of southern refugees pouring into the Arctic — or even wanting to — is minuscule; time will tell whether coming decades see the rapid growth of human activities in the North. But the pressures are there, and climate change is just one of several, along with demographic, political, and resource-based forces. Aboriginal people are in a surprisingly good position to advance northern development and, with it, themselves.
Cloud physics is poorly captured in our coarse-scale climate models, and the future of cloud radiative forcing is a very active research subject. However, the models express near-unanimity when it comes to precipitation: that is going to increase. It probably already has, if only our lousy snow-gauges could measure it well enough. Other far-reaching effects of a warming North include global sea-level rise (from melting land-based glaciers, not sea ice), an easing of extreme winter cold (allowing northward penetration of southern biota, pests, and disease), and — as Brand notes — the potential unleashing of new greenhouse gas sources from thawing, carbon-rich soils.
Because huge amounts of carbon constantly exit and enter northern soils — the net balance is but a tiny residual of two huge numbers of opposing sign — this question has frustrated us for years. But just this week, Ted Schuur and others may have discovered the answer. In their paper published in Nature (i) they learned, by radiocarbon dating the ages of carbon released from recent vs. not-so-recently thawed permafrost, that the vegetation grab-back is likely a temporary, transient effect. So it seems, Mr. Brand, the CO2 compost heap remains very much on the table.
There is a warm fuzzy moment near the end of the movie “Angels & Demons,” starring Tom Hanks and directed by Ron Howard.
Mr. Hanks as the Harvard symbologist Robert Langdon has just exposed the archvillain who was threatening to blow up the Vatican with antimatter stolen from a particle collider. A Catholic cardinal who has been giving him a hard time all through the movie and has suddenly turned twinkly-eyed says a small prayer thanking God for sending someone to save them.
Mr. Hanks replies that he doesn’t think he was “sent.”
Of course he was, he just doesn’t know it, the priest says gently. Mr. Hanks, taken aback, smiles in his classic sheepish way. Suddenly he is not so sure.
This may seem like a happy ending. Faith and science reconciled or at least holding their fire in the face of mystery. But for me that moment ruined what had otherwise been a pleasant two hours on a rainy afternoon. It crystallized what is wrong with the entire way that popular culture regards science. Scientists and academics are smart, but religious leaders are wise.
...In a recent interview, Mr. Howard said that he didn’t think there was any conflict between science and religion. Both are after big mysteries, but whatever science finds, he said, “There’s still going be that question: ‘And before that?’ ”
But I can’t help being bugged by that warm, fuzzy moment at the end, that figurative pat on the head. After all is said and done, it seems to imply, having faith is just a little bit better than being smart. ...
Can Admitting a Wrong Make It Right?
When the president of the United States of America stands before a huge crowd at Cairo University and makes his long-anticipated speech to the Muslim world Thursday, will he say that he's sorry? Will he, for instance, offer to make amends for the blind support some of his predecessors have shown for Israel's occupation of Arab lands? Will he ask forgiveness for the CIA coup in Iran that overthrew a democratically elected government there in 1953? Will Barack Obama try to talk directly to the people and apologize for the many decades Washington has spent supporting Arab dictators, including the one who rules in Egypt, the country where he is speaking?
You see the problem. Yet there is a body of evidence to suggest that the most vital element in Middle East peacemaking may lie in questions of language and symbols–what social anthropologist Scott Atran calls a "moral logic" based on "sacred values." And sometimes what that boils down to, essentially, is saying you're sorry. As Atran sees it, this is not really a theological question. It's more fundamental than fundamentalism. The need for dignity and respect—a craving for recognition and vindication—is at the heart of the region's most intractable conflicts.
Even when ballots replace bullets, these factors that Atran calls "intangible" remain important. An obvious reason that extremists have done so well in the region's elections in recent years, whether among the Arabs, Iranians or Israelis, is that they have addressed emotional and moral questions head on. ...
Scientific ideas are exciting, yet the scientific literature is far from lively. John Maddox's achievement was to sidestep the drabness of scientific writing by emphasizing the ideas that thrived beneath the leaden prose. In doing so he made Nature a compellingly interesting journal whose success forced others in the staid world of scientific publishing to adopt many of his ideas.
Though trained as a physical chemist, Maddox was a journalist at heart, having spent his formative early career as a science writer for the Manchester Guardian. On becoming editor of Nature in 1966, he recognized that the dry format of the scientific article could not be greatly changed but that the excitement of scientific ideas could be conveyed by other kinds of articles. Maddox expanded the News and Views section of Nature into a lively commentary on the scientific issues of the day.
Long is the list of original ideas that have been rejected by Nature or Science but later proved correct. My guess is that far fewer of these mistaken rejections occurred while Maddox was editor. He loved new ideas and was always ready to take a chance on a bold paper.
His other great virtue as an editor was that he thought like a publisher. Instead of waiting for interesting papers to come in, he went around asking for them, and people responded to his enthusiasm. The more visible Nature became under its unorthodox new editor, the more its prestige and circulation grew, especially in the United States. ...
Black Swan Fund Makes a Big Bet on Inflation
A hedge fund firm that reaped huge rewards betting against the market last year is about to open a fund premised on another wager: that the massive stimulus efforts of global governments will lead to hyperinflation.
The firm, Universa Investments L.P., is known for its ties to gloomy investor Nassim Nicholas Taleb, author of the 2007 bestseller "The Black Swan," which describes the impact of extreme events on the world and financial markets.
Funds run by Universa, which is managed and owned by Mr. Taleb's long-time collaborator Mark Spitznagel, last year gained more than 100% thanks to its bearish bets. Universa now runs about $6 billion, up from the $300 million it began with in January 2007. Earlier this year, Mr. Spitznagel closed several funds to new investors....
Mr. Taleb doesn't have an ownership interest in the Santa Monica, Calif., firm, but he has significant investments in it and helps shape its strategies.
The term "black swan," which has become a market catchphrase in the last few years, alludes to the once-widespread belief in the West that all swans are white. The notion was proven false when European explorers discovered black swans in Australia. A black-swan event, according to Mr. Taleb, is something that is extreme and highly unexpected. ...
A Human Language Gene Changes the Sound of Mouse Squeaks
People have a deep desire to communicate with animals, as is evident from the way they converse with their dogs, enjoy myths about talking animals or devote lifetimes to teaching chimpanzees how to speak. A delicate, if tiny, step has now been taken toward the real thing: the creation of a mouse with a human gene for language.
The gene, FOXP2, was identified in 1998 as the cause of a subtle speech defect in a large London family, half of whose members have difficulties with articulation and grammar. All those affected inherited a disrupted version of the gene from one parent. FOXP2 quickly attracted the attention of evolutionary biologists because other animals also possess the gene, and the human version differs significantly in its DNA sequence from those of mice and chimpanzees, just as might be expected for a gene sculpted by natural selection to play an important role in language.
Researchers at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, have now genetically engineered a strain of mice whose FOXP2 gene has been swapped out for the human version. Svante Paabo, in whose laboratory the mouse was engineered, promised several years ago that when the project was completed, "We will speak to the mouse." He did not promise that the mouse would say anything in reply, doubtless because a great many genes must have undergone evolutionary change to endow people with the faculty of language, and the new mouse was gaining only one of them. So it is perhaps surprising that possession of the human version of FOXP2 does in fact change the sounds that mice use to communicate with other mice, as well as other aspects of brain function. ...
Let's Talk About God
The atheist writers Sam Harris, Richard Dawkins and Christopher Hitchens have presented us with a choice: either you don't believe in God or you're a dope. "It is perfectly absurd for religious moderates to suggest that a rational human being can believe in God, simply because that belief makes him happy," writes Harris in the 2005 "Atheist Manifesto" now posted on the Web site of his new nonprofit, The Reason Project. Their brilliance, wit and (general) good humor have made the new generation of atheists celebrities among people who like to consider themselves smart. We enjoy their books and their telegenic bombast so much that we don't mind their low opinion of us. Dopey or not, 90 percent of Americans continue to say they believe in God. ...
...Robert Wright's The Evolution of God, which comes out next week, is about to reframe this debate. Wright doesn't argue one side or other of the "Is God real?" question. He leaves that aside. Instead, he grapples with God as an idea that has changed—evolved—through history. ...
...Though he never comes right out and declares that the human propensity for morality—and, by extension, truth and love—is given by God (or is God), he comes awfully close. In an imaginary debate with a scientist, he compares God to an electron. You know it's there, but you don't know anything real about what it looks like or what its properties are. Scientists believe in electrons because they see the effects of electrons on the world. "You might say," he writes in his afterword, "that love and truth are the two primary manifestations of divinity in which we can partake, and that by partaking in them we become truer manifestations of the divine. Then again, you might not say that. The point is just that you wouldn't have to be crazy to say it." (I can already hear Steven Pinker typing like mad.)
With those three sentences, Wright gives relief and intellectual ballast to those believers weary of the punching-bag tone of the recent faith-and-reason debates. ...
By NICHOLAS D. KRISTOF
...This came up after I wrote a column earlier this year called "The Daily Me." I argued that most of us employ the Internet not to seek the best information, but rather to select information that confirms our prejudices. To overcome that tendency, I argued, we should set aside time for a daily mental workout with an ideological sparring partner. Afterward, I heard from Jonathan Haidt, a psychology professor at the University of Virginia. "You got the problem right, but the prescription wrong," he said.
Simply exposing people to counterarguments may not accomplish much, he said, and may inflame antagonisms....
...Some evolutionary psychologists believe that disgust emerged as a protective mechanism against health risks, like feces, spoiled food or corpses. Later, many societies came to apply the same emotion to social "threats." Humans appear to be the only species that registers disgust, which is why a dog will wag its tail in puzzlement when its horrified owner yanks it back from eating excrement.
Psychologists have developed a "disgust scale" based on how queasy people would be in 27 situations, such as stepping barefoot on an earthworm or smelling urine in a tunnel. Conservatives systematically register more disgust than liberals. (To see how you weigh factors in moral decisions, take the tests at www.yourmorals.org.) ...
THE WILD SIDE
"In the spring," wrote Tennyson, "a young man's fancy lightly turns to thoughts of love." And so in keeping with the spirit of the season, this week's column looks at love affairs — mathematically. The analysis is offered tongue in cheek, but it does touch on a serious point: that the laws of nature are written as differential equations. It also helps explain why, in the words of another poet, "the course of true love never did run smooth."
To illustrate the approach, suppose Romeo is in love with Juliet, but in our version of the story, Juliet is a fickle lover. The more Romeo loves her, the more she wants to run away and hide. But when he takes the hint and backs off, she begins to find him strangely attractive. He, on the other hand, tends to echo her: he warms up when she loves him and cools down when she hates him.
What happens to our star-crossed lovers? How does their love ebb and flow over time? That's where the math comes in. By writing equations that summarize how Romeo and Juliet respond to each other's affections and then solving those equations with calculus, we can predict the course of their affair. The resulting forecast for this couple is, tragically, a never-ending cycle of love and hate. At least they manage to achieve simultaneous love a quarter of the time. ...
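The column above describes the model in words rather than symbols, but the dynamic it sketches (Romeo echoes Juliet, Juliet recoils from Romeo) can be written as a pair of coupled linear differential equations. Below is a minimal simulation sketch; the response rates `a` and `b`, the starting feelings, and the use of simple Euler integration are illustrative assumptions, not details from the column:

```python
import math

# Assumed response rates: Romeo's echo of Juliet, Juliet's recoil from Romeo.
a, b = 1.0, 1.0
dt = 0.001
steps = int(2 * math.pi / dt)  # integrate over one full cycle of the affair

R, J = 1.0, 0.0  # assumed start: Romeo smitten, Juliet indifferent
both_in_love = 0
for _ in range(steps):
    dR = a * J * dt   # Romeo warms when Juliet loves him, cools when she hates him
    dJ = -b * R * dt  # Juliet pulls away the more Romeo loves her
    R, J = R + dR, J + dJ
    if R > 0 and J > 0:
        both_in_love += 1

print(round(both_in_love / steps, 2))  # prints 0.25
```

With these parameters the solution is a never-ending sinusoidal cycle of love and hate, and the two feelings are positive at the same time for exactly a quarter of each cycle, matching the "simultaneous love a quarter of the time" forecast in the excerpt.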
BOOKS OF THE TIMES
"Catching Fire" is a plain-spoken and thoroughly gripping scientific essay that presents nothing less than a new theory of human evolution.
Human beings are not obviously equipped to be nature's gladiators. We have no claws, no armor. That we eat meat seems surprising, because we are not made for chewing it uncooked in the wild. Our jaws are weak; our teeth are blunt; our mouths are small. That thing below our noses? It truly is a pie hole.
To attend to these facts, for some people, is to plead for vegetarianism or for a raw-food diet. We should forage and eat the way our long-ago ancestors surely did. For Richard Wrangham, a professor of biological anthropology at Harvard and the author of "Catching Fire," however, these facts and others demonstrate something quite different. They help prove that we are, as he vividly puts it, "the cooking apes, the creatures of the flame."
The title of Mr. Wrangham's new book — "Catching Fire: How Cooking Made Us Human" — sounds a bit touchy-feely. Perhaps, you think, he has written a meditation on hearth and fellow feeling and s'mores. He has not. "Catching Fire" is a plain-spoken and thoroughly gripping scientific essay that presents nothing less than a new theory of human evolution, one he calls "the cooking hypothesis," one that Darwin (among others) simply missed. ...
"For those seeking substance over sheen, the occasional videos released at Edge.org hit the mark. The Edge Foundation community is a circle, mainly scientists but also other academics, entrepreneurs, and cultural figures. ... Edge's long-form interview videos are a deep-dive into the daily lives and passions of its subjects, and their passions are presented without primers or apologies. The decidedly noncommercial nature of Edge's offerings, and the egghead imprimatur of the Edge community, lend its videos a refreshing air, making one wonder if broadcast television will ever offer half the off-kilter sparkle of their salon chatter." — Boston Globe
Mahzarin Banaji, Samuel Barondes, Yochai Benkler, Paul Bloom, Rodney Brooks, Hubert Burda, George Church, Nicholas Christakis, Brian Cox, Iain Couzin, Helena Cronin, Paul Davies, Daniel C. Dennett, David Deutsch, Denis Dutton, Jared Diamond, Freeman Dyson, Drew Endy, Peter Galison, Murray Gell-Mann, David Gelernter, Neil Gershenfeld, Anthony Giddens, Gerd Gigerenzer, Daniel Gilbert, Rebecca Goldstein, John Gottman, Brian Greene, Anthony Greenwald, Alan Guth, David Haig, Marc D. Hauser, Walter Isaacson, Steve Jones, Daniel Kahneman, Stuart Kauffman, Ken Kesey, Stephen Kosslyn, Lawrence Krauss, Ray Kurzweil, Jaron Lanier, Armand Leroi, Seth Lloyd, Gary Marcus, John Markoff, Ernst Mayr, Marvin Minsky, Sendhil Mullainathan, Dennis Overbye, Dean Ornish, Elaine Pagels, Steven Pinker, Jordan Pollack, Lisa Randall, Martin Rees, Matt Ridley, Lee Smolin, Elisabeth Spelke, Scott Sampson, Robert Sapolsky, Dimitar Sasselov, Stephen Schneider, Martin Seligman, Robert Shapiro, Clay Shirky, Dan Sperber, Paul Steinhardt, Steven Strogatz, Seirian Sumner, Leonard Susskind, Nassim Nicholas Taleb, Timothy Taylor, Richard Thaler, Robert Trivers, Neil Turok, J. Craig Venter, Edward O. Wilson, Lewis Wolpert, Richard Wrangham, Philip Zimbardo
WHAT HAVE YOU CHANGED YOUR MIND ABOUT?
Praise for the online publication of
"The splendidly enlightened Edge website (www.edge.org) has rounded off each year of inter-disciplinary debate by asking its heavy-hitting contributors to answer one question. I strongly recommend a visit." The Independent
"A great event in the Anglo-Saxon culture." El Mundo
"As fascinating and weighty as one would imagine." The Independent
"They are the intellectual elite, the brains the rest of us rely on to make sense of the universe and answer the big questions. But in a refreshing show of new year humility, the world's best thinkers have admitted that from time to time even they are forced to change their minds." The Guardian
"Even the world's best brains have to admit to being wrong sometimes: here, leading scientists respond to a new year challenge." The Times
"Provocative ideas put forward today by leading figures." The Telegraph
"The world's finest minds have responded with some of the most insightful, humbling, fascinating confessions and anecdotes, an intellectual treasure trove. ... Best three or four hours of intense, enlightening reading you can do for the new year. Read it now." San Francisco Chronicle
"As in the past, these world-class thinkers have responded to impossibly open-ended questions with erudition, imagination and clarity." The News & Observer
"A jolt of fresh thinking...The answers address a fabulous array of issues. This is the intellectual equivalent of a New Year's dip in the lake—bracing, possibly shriek-inducing, and bound to wake you up." The Globe and Mail
"Answers ring like scientific odes to uncertainty, humility and doubt; passionate pleas for critical thought in a world threatened by blind convictions." The Toronto Star
"For an exceptionally high quotient of interesting ideas to words, this is hard to beat. ...What a feast of egg-head opinionating!" National Review Online
"The optimistic visions seem not just wonderful but plausible." Wall Street Journal
"Persuasively upbeat." O, The Oprah Magazine
"Our greatest minds provide nutshell insights on how science will help forge a better world ahead." Seed
"Uplifting...an enthralling book." The Mail on Sunday
"Danger – brilliant minds at work...A brilliant book: exhilarating, hilarious, and chilling." The Evening Standard (London)
"A selection of the most explosive ideas of our age." Sunday Herald
"Provocative" The Independent
"Challenging notions put forward by some of the world's sharpest minds" Sunday Times
"A titillating compilation" The Guardian
"Reads like an intriguing dinner party conversation among great minds in science" Discover
"Whether or not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." LA Times
"Belief appears to motivate even the most rigorously scientific minds. It stimulates and challenges, it tricks us into holding things to be true against our better judgment, and, like scepticism, its opposite, it serves a function in science that is playful as well as thought-provoking." The Times
"John Brockman is the PT Barnum of popular science. He has always been a great huckster of ideas." The Observer
"An unprecedented roster of brilliant minds, the sum of which is nothing short of an oracle—a book to be dog-eared and debated." Seed
"Scientific pipedreams at their very best." The Guardian
"Makes for some astounding reading." Boston Globe
"Fantastically stimulating...It's like the crack cocaine of the thinking world.... Once you start, you can't stop thinking about that question." BBC Radio 4
"Intellectual and creative magnificence" The Skeptical Inquirer
Edge Foundation, Inc. is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.