Edge Dinners
Event Date: [ 2.27.06 ]
Location:
Monterey, CA
United States

"The dinner party was a microcosm of a newly dominant sector of American business." — Wired


 

Edge Dinners
Event Date: [ 1.24.06 12:45 PM ]
Location:
United States

 

Laws of attraction in action
January 31, 2006

This year's Scientists Meet the Media gathering at the Royal Society showed that boffins know how to party, too. Nic Fleming reports

...The heady mix of scholarly gossip and highbrow chitchat was also enriched by the contributions of Nature, Science and New Scientist writers, the authors Dr Matt Ridley and Sanjida O'Connell, the New York agent John Brockman, Craig Venter, the legendary American "bad boy" of genomics, the zoological sex therapist Dr Tatiana (Olivia Judson), the mutant expert Armand Leroi and the Sky TV meteorologist, Lisa Burke.


THE TELEGRAPH/NOVARTIS MEET THE MEDIA EVENT AT THE ROYAL SOCIETY [1.25.06]


Photo © Eleanor Bentall

John Bryant, Editor, The Telegraph; Craig Venter; Olivia Judson;
Max Brockman; Lord (Martin) Rees, President, The Royal Society & Astronomer Royal
at The Telegraph/Novartis Meet The Media Event, Royal Society



Photo © Eleanor Bentall

Roger Highfield, The Telegraph; Armand Leroi


EDGE LONDON SCIENCE DINNER  [1.24.06]

Oliver Morton, Nature; David Goodhart, Prospect


Science notebook by Anjana Ahuja
Doctors, athletes and prostitutes: the deadly common denominator
January 30, 2006

On to cheerier matters. When people turn up to a dinner before the appointed 7pm start, you know it's going to be fun. And so it was on Tuesday when the literary agent John Brockman hosted a gathering in Soho. I showed up at 7:10pm, depriving myself of ten minutes of serious schmoozing.

Brian Eno was there, as were Richard Dawkins and Simon Baron-Cohen, the autism researcher. Colin Blakemore, the head of the Medical Research Council, came along, joining the authors Olivia Judson, Matt Ridley, Armand Leroi and David Bodanis (the fastest talker I've ever met). Ian McEwan dropped by. The editors of Nature, New Scientist and Prospect mingled amiably. I ended up sharing a pudding plate with Craig Venter, the Celera Genomics entrepreneur who helped to unravel the human genome and in whose honour the dinner was held.

[Subscription Required]

 

 

Special Events
Event Date: [ 5.16.05 ]
Location:
United States

 

...on the research on mind, brain, and behavior that may be relevant to gender disparities in the sciences, including the studies of bias, discrimination and innate and acquired difference between the sexes.

Harvard University • Mind/Brain/Behavior Initiative

The Mind Brain and Behavior Inter-Faculty Initiative (MBB), under the leadership of Co-Directors Marc D. Hauser and Elizabeth Spelke, is a university-wide community that studies the structure, function, evolution, development, and pathology of the nervous system, in relation to decision-making and behavior.


Introduction

On April 22, 2005, Harvard University's Mind/Brain/Behavior Initiative (MBB) held a defining debate on the public discussion that began on January 16th with the public comments by Lawrence Summers, president of Harvard, on sex differences between men and women and how they may relate to the careers of women in science. The debate at MBB, "The Gender of Gender and Science" was "on the research on mind, brain, and behavior that may be relevant to gender disparities in the sciences, including the studies of bias, discrimination and innate and acquired difference between the sexes".

It's interesting to note that since the controversy surrounding Summers' remarks began, there has been an astonishing absence of discussion of the relevant science...you won't find it in the hundreds and hundreds of articles in major newspapers; nor will you find it in the Harvard faculty meetings where the president of the leading university in America was indicted for presenting controversial ideas.

Scientists debate continually, and reality is the check. They may have egos as large as those possessed by the iconic figures of the academic humanities, but they handle their hubris in a very different way. They can be moved by arguments, because they work in an empirical world of facts, a world based on reality. There are no fixed, unalterable positions. They are both the creators and the critics of their shared enterprise. Ideas come from them and they also criticize one another's ideas.

Through the process of creativity and criticism and debates, they decide which ideas get weeded out and which become part of the consensus that leads to the next level of discovery.

But unlike just about anything else said about Summers' remarks, the debate, "The Science of Gender and Science", between Harvard psychology professors Steven Pinker and Elizabeth Spelke, focused on the relevant scientific literature. The two largely agreed on the facts but differed in their interpretation.

Both presented scientific evidence with the realization and understanding that there was nothing obvious about how the data was to be interpreted. Their sharp scientific debate informed rather than detracted. And it showed how a leading University can still fulfill its role of providing a forum for free and open discussion on controversial subjects in a fair-minded way. It also had the added benefit that the participants knew what they were talking about.

Who won the debate? Make up your own mind. Watch the video, listen to the audio, read the text and check out the slide presentations.

There's a lesson here: let's get it right, and when we do, we will adjust our attitudes. That's what science can do, and that's what Edge offers by presenting Pinker vs. Spelke to a wide public audience.

JB

STEVEN PINKER is the Johnstone Family Professor in the Department of Psychology at Harvard University. His research has won prizes from the National Academy of Sciences and the Royal Institution of Great Britain, and he is the author of six books, including The Language Instinct, How the Mind Works, Words and Rules, and The Blank Slate.
Steven Pinker's Edge Bio Page

ELIZABETH S. SPELKE is Berkman Professor of Psychology at Harvard University, where she is Co-Director of the Mind, Brain, and Behavior Initiative. A member of the National Academy of Sciences and the American Academy of Arts and Sciences, she is cited by Time Magazine as one of America's Best in Science and Medicine.

Elizabeth Spelke's Edge Bio Page


THE SCIENCE OF GENDER AND SCIENCE
PINKER VS. SPELKE
A DEBATE

[EDITOR'S NOTE: Pinker and Spelke each made presentations of about 40 minutes, without interruption from each other or from the audience. They then responded to each other's presentations. By mutual agreement, Pinker made the first presentation.

This Edge presentation includes: the transcribed text; streaming audio of the full debate; 6-minute video clips from Pinker's and Spelke's opening statements; a 20-minute video clip of their closing discussion; and online versions of the speakers' slide presentations. There are two options for viewing the slides: clicking on the links immediately below brings up the file of either Pinker's or Spelke's complete slide presentation. Or, the individual slides are also included for reference as expandable thumbnails in the margin of the transcript.]

The complete video, in .avi format, is also available for download through Harvard's MBB website (click here).



Steven Pinker

(STEVEN PINKER:) Thanks, Liz, for agreeing to this exchange. It's a privilege to be engaged in a conversation with Elizabeth Spelke. We go back a long way. We have been colleagues at MIT, where I helped attract her, and at Harvard, where she helped to attract me. With the rest of my field, I have enormous admiration for Elizabeth's brilliant contributions to our understanding of the origins of cognition. But we do find ourselves with different perspectives on a recent issue.

For those of you who just arrived from Mars, there has been a certain amount of discussion here at Harvard on a particular datum, namely the under-representation of women among tenure-track faculty in elite universities in physical science, math, and engineering. Here are some recent numbers:

As with many issues in psychology, there are three broad ways to explain this phenomenon. One can imagine an extreme "nature" position: that males but not females have the talents and temperaments necessary for science. Needless to say, only a madman could take that view. The extreme nature position has no serious proponents. 

There is an extreme "nurture" position: that males and females are biologically indistinguishable, and all relevant sex differences are products of socialization and bias.

Then there are various intermediate positions: that the difference is explainable by some combination of biological differences in average temperaments and talents interacting with socialization and bias.

Liz has embraced the extreme nurture position. There is an irony here, because in most discussions in cognitive science she and I are put in the same camp, namely the "innatists," when it comes to explaining the mind. But in this case Liz has said that there is "not a shred of evidence" for the biological factor, that "the evidence against there being an advantage for males in intrinsic aptitude is so overwhelming that it is hard for me to see how one can make a case at this point on the other side," and that "it seems to me as conclusive as any finding I know of in science."

Well, we certainly aren't seeing the stereotypical gender difference in confidence here! Now, I'm a controversial guy. I've taken many controversial positions over the years, and, as a member of Homo sapiens, I think I am right on all of them. But I don't think that in any of them I would say there is "not a shred of evidence" for the other side, even if I think that the evidence favors one side. I would not say that the other side "can't even make a case" for their position, even if I think that their case is not as good as the one I favor. And as for saying that a position is "as conclusive as any finding in science" — well, we're talking about social science here! This statement would imply that the extreme nurture position on gender differences is more conclusive than, say, the evidence that the sun is at the center of the solar system, the evidence for the laws of thermodynamics, for the theory of evolution, for plate tectonics, and so on.

These are extreme statements — especially in light of the fact that an enormous amount of research, summarized in these and many other literature reviews, in fact points to a very different conclusion. I'll quote from one of them, a book called Sex Differences in Cognitive Abilities by Diane Halpern. She is a respected psychologist, recently elected as president of the American Psychological Association, and someone with no theoretical axe to grind. She does not subscribe to any particular theory, and has been a critic, for example, of evolutionary psychology. And here is what she wrote in the preface to her book:

"At the time I started writing this book it seemed clear to me that any between sex differences in thinking abilities were due to socialization practices, artifacts, and mistakes in the research. After reviewing a pile of journal articles that stood several feet high, and numerous books and book chapters that dwarfed the stack of journal articles, I changed my mind. The literature on sex differences in cognitive abilities is filled with inconsistent findings, contradictory theories, and emotional claims that are unsupported by the research. Yet despite all the noise in the data, clear and consistent messages could be heard. There are real and in some cases sizable sex differences with respect to some cognitive abilities. Socialization practices are undoubtedly important, but there is also good evidence that biological sex differences play a role in establishing and maintaining cognitive sex differences, a conclusion I wasn't prepared to make when I began reviewing the relevant literature."

This captures my assessment perfectly.

Again for the benefit of the Martians in this room: This isn't just any old issue in empirical psychology. There are obvious political colorings to it, and I want to begin with a confession of my own politics. I am a feminist. I believe that women have been oppressed, discriminated against, and harassed for thousands of years. I believe that the two waves of the feminist movement in the 20th century are among the proudest achievements of our species, and I am proud to have lived through one of them, including the effort to increase the representation of women in the sciences.

But it is crucial to distinguish the moral proposition that people should not be discriminated against on account of their sex — which I take to be the core of feminism — and the empirical claim that males and females are biologically indistinguishable. They are not the same thing. Indeed, distinguishing them is essential to protecting the core of feminism. Anyone who takes an honest interest in science has to be prepared for the facts on a given issue to come out either way. And that makes it essential that we not hold the ideals of feminism hostage to the latest findings from the lab or field. Otherwise, if the findings come out as showing a sex difference, one would either have to say, "I guess sex discrimination wasn't so bad after all," or else furiously suppress or distort the findings so as to preserve the ideal. The truth cannot be sexist. Whatever the facts turn out to be, they should not be taken to compromise the core of feminism.

Why study sex differences? Believe me, being the Bobby Riggs of cognitive science is not my idea of a good time. So why should I care about them, especially since they are not the focus of my own research?

First, differences between the sexes are part of the human condition. We all have a mother and a father. Most of us are attracted to members of the opposite sex, and the rest of us notice the difference from those who do. And we can't help but notice the sex of our children, friends, and our colleagues, in every aspect of life.

Also, the topic of possible sex differences is of great scientific interest. Sex is a fundamental problem in biology, and sexual reproduction and sex differences go back a billion years. There's an interesting theory, which I won't have time to explain, which predicts that there should be an overall equal investment of organisms in their sons and daughters; neither sex is predicted to be superior or inferior across the board. There is also an elegant theory, namely Bob Trivers' theory of differential parental investment, which makes highly specific predictions about when you should expect sex differences and what they should look like.

The nature and source of sex differences are also of practical importance. Most of us agree that there are aspects of the world, including gender disparities, that we want to change. But if we want to change the world we must first understand it, and that includes understanding the sources of sex differences.

Let's get back to the datum to be explained. In many ways this is an exotic phenomenon. It involves biologically unprepared talents and temperaments: evolution certainly did not shape any part of the mind to do the work of a professor of mechanical engineering at MIT, for example. The datum has nothing to do with basic cognitive processes, or with those we use in our everyday lives, in school, or even in most college courses, where indeed there are few sex differences.

Also, we are talking about extremes of achievement. Most women are not qualified to be math professors at Harvard because most men aren't qualified to be math professors at Harvard. These are extremes in the population.

And we're talking about a subset of fields. Women are not under-represented to nearly the same extent in all academic fields, and certainly not in all prestigious professions.

Finally, we are talking about a statistical effect. This is such a crucial point that I have to discuss it in some detail.

Women are nowhere near absent even from the field in which they are most under-represented. The explanations for sex differences must be statistical as well. And here is a touchstone for the entire discussion:

These are two Gaussian or normal distributions; two bell curves. The X axis stands for any ability you want to measure. The Y axis stands for the proportion of people having that ability. The overlapping curves are what you get whenever you compare the sexes on any measure in which they differ. In this example, if we say that this is the male curve and this is the female curve, the means may be different, but at any particular ability level there are always representatives of both genders.

So right away a number of public statements that have been made in the last couple of months can be seen as red herrings, and should never have been made by anyone who understands the nature of statistical distributions. This includes the accusation that President Summers implied that "50% of the brightest minds in America do not have the right aptitude for science," that "women just can't cut it," and so on. These statements are statistically illiterate, and have nothing to do with the phenomena we are discussing.

There are some important corollaries of having two overlapping normal distributions. One is that a normal distribution falls off according to the negative exponential of the square of the distance from the mean. That means that even when there is only a small difference in the means of two distributions, the more extreme a score, the greater the disparity there will be in the two kinds of individuals having such a score. That is, the ratios get more extreme as you go farther out along the tail. If we hold a magnifying glass to the tail of the distribution, we see that even though the distributions overlap in the bulk of the curves, when you get out to the extremes the difference between the two curves gets larger and larger.

For example, it's obvious that distributions of height for men and women overlap: it's not the case that all men are taller than all women. But while at five foot ten there are thirty men for every woman, at six feet there are two thousand men for every woman. Now, sex differences in cognition tend not to be so extreme, but the statistical phenomenon is the same.
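The tail-ratio arithmetic Pinker describes is easy to check numerically. The sketch below is a minimal illustration, not his actual height data: the half-standard-deviation gap between means is an assumed parameter chosen only to show the effect. It uses the normal tail probability to show how a modest difference in means produces increasingly lopsided ratios at higher cutoffs:

```python
import math

def tail_fraction(cutoff, mean, sd):
    """Fraction of a normal(mean, sd) population scoring above `cutoff`."""
    z = (cutoff - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# Illustrative parameters (not real data): two normal distributions
# whose means differ by half a standard deviation.
mean_a, mean_b, sd = 0.5, 0.0, 1.0

for cutoff in (1, 2, 3, 4):
    ratio = tail_fraction(cutoff, mean_a, sd) / tail_fraction(cutoff, mean_b, sd)
    print(f"cutoff {cutoff}: about {ratio:.1f} to 1")
```

Even though the two curves overlap almost everywhere, the ratio of one group to the other keeps growing as the cutoff moves out along the tail.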

A second important corollary is that tail ratios are affected by differences in variance. And biologists since Darwin have noted that for many traits and many species, males are the more variable gender. So even in cases where the mean for women and the mean for men are the same, the fact that men are more variable implies that the proportion of men would be higher at one tail, and also higher at the other. As it's sometimes summarized: more prodigies, more idiots. 
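The variance point can be illustrated the same way. In this sketch (illustrative numbers only; the 10% variance difference is an assumption, not a figure from the talk), two groups share a mean of zero, but the more variable group is over-represented at both tails:

```python
import math

def tail_fraction(cutoff, mean, sd):
    """Fraction of a normal(mean, sd) population scoring above `cutoff`."""
    z = (cutoff - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# Equal means, but one group is 10% more variable (assumed, illustrative).
sd_more, sd_less = 1.1, 1.0

for cutoff in (1, 2, 3):
    ratio = tail_fraction(cutoff, 0.0, sd_more) / tail_fraction(cutoff, 0.0, sd_less)
    # By symmetry, the identical over-representation appears below -cutoff.
    print(f"|score| > {cutoff}: about {ratio:.2f} to 1 in each tail")
```

With equal means, every disparity at the high tail is mirrored exactly at the low tail: more prodigies, more idiots.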

With these statistical points in mind, let me begin the substance of my presentation by connecting the political issue with the scientific one. Economists who study patterns of discrimination have long argued (generally to no avail) that there is a crucial conceptual difference between difference and discrimination. A departure from a 50-50 sex ratio in any profession does not, by itself, imply that we are seeing discrimination, unless the interests and aptitudes of the two groups are equated. Let me illustrate the point with an example, involving myself.

I work in a scientific field — the study of language acquisition in children — that is in fact dominated by women. Seventy-five percent of the members of the main professional association are female, as are a majority of the keynote speakers at our main conference. I'm here to tell you that it's not because men like me have been discriminated against. I decided to study language development, as opposed to, say, mechanical engineering, for many reasons. The goal of designing a better automobile transmission does not turn me on as much as the goal of figuring out how kids acquire language. And I don't think I'd be as good at designing a transmission as I am in studying child language.

Now, all we need to do to explain sex differences without invoking the discrimination or invidious sexist comparisons is to suppose that whatever traits I have that predispose me to choose (say) child language over (say) mechanical engineering are not exactly equally distributed statistically among men and women. For those of you out there — of either gender — who also are not mechanical engineers, you should understand what I'm talking about.

Okay, so what are the similarities and differences between the sexes? There certainly are many similarities. Men and women show no differences in general intelligence or g — on average, they are exactly the same, right on the money. Also, when it comes to the basic categories of cognition — how we negotiate the world and live our lives; our concept of objects, of numbers, of people, of living things, and so on — there are no differences.

Indeed, in cases where there are differences, there are as many instances in which women do slightly better than men as ones in which men do slightly better than women. For example, men are better at throwing, but women are more dexterous. Men are better at mentally rotating shapes; women are better at visual memory. Men are better at mathematical problem-solving; women are better at mathematical calculation. And so on.

But there are at least six differences that are relevant to the datum we have been discussing. The literature on these differences is so enormous that I can only touch on a fraction of it. I'll restrict my discussion to a few examples in which there are enormous data sets, or there are meta-analyses that boil down a literature.

The first difference, long noted by economists studying employment practices, is that men and women differ in what they state are their priorities in life. To sum it up: men, on average, are more likely to chase status at the expense of their families; women give a more balanced weighting. Once again: Think statistics! The finding is not that women value family and don't value status. It is not that men value status and don't value family. Nor does the finding imply that every last woman has the asymmetry that women show on average or that every last man has the asymmetry that men show on average. But in large data sets, on average, an asymmetry is what you find.

Just one example. In a famous long-term study of mathematically precocious youth, 1,975 youngsters were selected in 7th grade for being in the top 1% of ability in mathematics, and then followed up for more than two decades. These men and women are certainly equally talented. And if anyone has ever been encouraged in math and science, these kids were. Both genders: they are equal in their levels of achievement, and they report being equally satisfied with the course of their lives. Nonetheless there are statistical differences in what they say is important to them. There are some things in life that the females rated higher than males, such as the ability to have a part-time career for a limited time in one's life; living close to parents and relatives; having a meaningful spiritual life; and having strong friendships. And there are some things in life that the males rated higher than the females. They include having lots of money; inventing or creating something; having a full-time career; and being successful in one's line of work. It's worth noting that studies of highly successful people find that single-mindedness and competitiveness are recurring traits in geniuses (of both sexes).

Here is one other figure from this data set. As you might expect, this sample has a lot of people who like to work Herculean hours. Many people in this group say they would like to work 50, 60, even 70 hours a week. But there are also slight differences. At each one of these high numbers of hours there are slightly more men than women who want to work that much. That is, more men than women don't care about whether they have a life.

Second, interest in people versus things and abstract rule systems. There is a staggering amount of data on this trait, because there is an entire field that studies people's vocational interests. I bet most of the people in this room have taken a vocational interest test at some point in their lives. And this field has documented that there are consistent differences in the kinds of activities that appeal to men and women in their ideal jobs. I'll just discuss one of them: the desire to work with people versus things. There is an enormous average difference between women and men in this dimension, about one standard deviation.

And this difference in interests will tend to cause people to gravitate in slightly different directions in their choice of career. The occupation that fits best with the "people" end of the continuum is "director of a community services organization." The occupations that fit best with the "things" end are physicist, chemist, mathematician, computer programmer, and biologist.

We see this consequence not only in the choice of whether to go into science, but also in the choice of which branch of science the two sexes tend to go into. Needless to say, from 1970 to 2002 there was a huge increase in the percentage of university degrees awarded to women. But the percentage still differs dramatically across fields. Among the Ph.D.s awarded in 2001, for example, in education 65% of the doctorates went to women; in the social sciences, 54%; in the life sciences, 47%; in the physical sciences, 26%; in engineering, 17%. This is completely predictable from the difference in interests between people and living things, on the one hand, and inanimate objects, on the other. And the pattern is pretty much the same in 1980 and 2001, despite the change in absolute numbers.

Third, risk. Men are by far the more reckless sex. In a large meta-analysis involving 150 studies and 100,000 participants, in 14 out of 16 categories of risk-taking, men were over-represented. The two sexes were equally represented in the other two categories, one of which was smoking, for obvious reasons. And two of the largest sex differences were in "intellectual risk taking" and "participation in a risky experiment." We see this sex difference in everyday life, in particular, in the following category: the Darwin Awards, "commemorating those individuals who ensure the long-term survival of our species by removing themselves from the gene pool in a sublimely idiotic fashion." Virtually all — perhaps all — of the winners are men.

Fourth, three-dimensional mental transformations: the ability to determine whether the drawings in each of these pairs depict the same 3-dimensional shape. Again I'll appeal to a meta-analysis, this one containing 286 data sets and 100,000 subjects. The authors conclude, "we have specified a number of tests that show highly significant sex differences that are stable across age, at least after puberty, and have not decreased in recent years." Now, as I mentioned, for some kinds of spatial ability, the advantage goes to women, but in "mental rotation," "spatial perception," and "spatial visualization" the advantage goes to men.

Now, does this have any relevance to scientific achievement? We don't know for sure, but there's some reason to think that it does. In psychometric studies, three-dimensional spatial visualization is correlated with mathematical problem-solving. And mental manipulation of objects in three dimensions figures prominently in the memoirs and introspections of most creative physicists and chemists, including Faraday, Maxwell, Tesla, Kekulé, and Lawrence, all of whom claim to have hit upon their discoveries by dynamic visual imagery and only later set them down in equations. A typical introspection is the following: "The psychical entities which seem to serve as elements in my thought are certain signs and more or less clear images which can be voluntarily reproduced and combined. This combinatory play seems to be the essential feature in productive thought before there is any connection with logical construction in words or other kinds of signs." The quote comes from this fairly well-known physicist.

Fifth, mathematical reasoning. Girls and women get better school grades in mathematics and pretty much everything else these days. And women are better at mathematical calculation. But consistently, men score better on mathematical word problems and on tests of mathematical reasoning, at least statistically. Again, here is a meta-analysis, with 254 data sets and 3 million subjects. It shows no significant difference in childhood; this is a difference that emerges around puberty, like many secondary sexual characteristics. But there are sizable differences in adolescence and adulthood, especially in high-end samples. Here is an example of the average SAT mathematical scores, showing a 40-point difference in favor of men that's pretty much consistent from 1972 to 1997. In the Study of Mathematically Precocious Youth (in which 7th graders were given the SAT, which of course ordinarily is administered only to older, college-bound kids), the ratio of those scoring over 700 is 2.8 to 1 male to female. (Admittedly, and interestingly, that's down from 25 years ago, when the ratio was 13 to 1, and perhaps we can discuss some of the reasons.) At the 760 cutoff, the ratio nowadays is 7 males to 1 female.

Now why is there a discrepancy with grades? Do SATs and other tests of mathematical reasoning aptitude underpredict grades, or do grades overpredict high-end aptitude? At the Radcliffe Forum Liz was completely explicit about which side she takes, saying, quote, "the tests are no good," unquote. But if the tests are really so useless, why does every major graduate program in science still use them — including the very departments at Harvard and MIT in which Liz and I have selected our own graduate students?

I think the reason is that school grades are affected by homework and by the ability to solve the kinds of problems that have already been presented in lecture and textbooks. Whereas the aptitude tests are designed to test the application of mathematical knowledge to unfamiliar problems. And this, of course, is closer to the way that math is used in actually doing math and science.

Indeed, contrary to Liz, and the popular opinion of many intellectuals, the tests are surprisingly good. There is an enormous amount of data on the predictive power of the SAT. For example, people in science careers overwhelmingly scored in the 90th percentile on the SAT or GRE math test. And the tests predict earnings, occupational choice, doctoral degrees, the prestige of one's degree, the probability of having a tenure-track position, and the number of patents. Moreover this predictive power is the same for men and for women. As for why there is that underprediction of grades — a slight under-prediction, one-tenth of a standard deviation — the Educational Testing Service did a study on that phenomenon, and was able to explain the mystery by a combination of the choice of major, which differs between the sexes, and the greater conscientiousness of women.

Finally there's a sex difference in variability. It's crucial here to look at the right samples. Estimates of variance depend highly on the tails of the distribution, which by definition contain smaller numbers of people. Since people at the tails of the distribution in many surveys are likely to be weeded out for various reasons, it's important to have large representative samples from national populations. In this regard the gold standard is the Science paper by Hedges and Nowell, which reported six large stratified probability samples. They found that in 35 out of 37 tests, including all of the tests in math, space, and science, the male variance was greater than the female variance.

One other data set meeting the gold standard is displayed in this graph, showing the entire population of Scotland, who all took an intelligence test in a single year. The X axis represents IQ, where the mean is 100, and the Y axis represents the proportion of men versus women. As you can see these are extremely orderly data. In the middle part of the range, females predominate; at both extremes, males slightly predominate. Needless to say, there is a large percentage of women at both ends of the scale — but there is also a large sex difference.
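The arithmetic behind this pattern is worth making explicit: even when two normal distributions have identical means, a modest difference in standard deviation produces male/female ratios near 1 in the middle of the range but increasingly lopsided ratios at the extremes. A minimal sketch in Python, using round illustrative parameters (the means and standard deviations here are assumed for illustration, not the actual Scottish figures):

```python
from statistics import NormalDist

# Two normal distributions with the same mean but slightly different
# spread, as in the greater-male-variability claim. Parameters are
# hypothetical round numbers, not estimates from any real data set.
female = NormalDist(mu=100, sigma=14.1)
male = NormalDist(mu=100, sigma=14.9)

for cutoff in (115, 130, 145):
    # Proportion of each group scoring above the cutoff,
    # and the resulting male-to-female ratio at that threshold.
    m = 1 - male.cdf(cutoff)
    f = 1 - female.cdf(cutoff)
    print(f"IQ > {cutoff}: male/female ratio = {m / f:.2f}")
```

The ratio grows as the cutoff moves rightward, which is why a sex difference that is invisible in a comparison of averages can still matter for the thin tail from which elite departments draw.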

Now the fact that these six gender differences exist does not mean that they are innate. This of course is a much more difficult issue to resolve. A necessary preamble to this discussion is that nature and nurture are not alternatives; it is possible that the explanation for a given sex difference involves some of each. The only issue is whether the contribution of biology is greater than zero. I think that there are ten kinds of evidence that the contribution of biology is greater than zero, though of course it is nowhere near 100 percent.

First, there are many biological mechanisms by which a sex difference could occur. There are large differences between males and females in levels of sex hormones, especially prenatally, in the first six months of life, and in adolescence. There are receptors for hormones all over the brain, including the cerebral cortex. There are many small differences in men's and women's brains, including the overall size of the brain (even correcting for body size), the density of cortical neurons, the degree of cortical asymmetry, the size of hypothalamic nuclei, and several others. 

Second, many of the major sex differences — certainly some of them, maybe all of them — are universal. The idea that there are cultures out there somewhere in which everything is the reverse of here turns out to be an academic legend. In his survey of the anthropological literature called Human Universals, the anthropologist Donald Brown points out that in all cultures men and women are seen as having different natures; that there is a greater involvement of women in direct child care; more competitiveness in various measures for men than for women; and a greater spatial range traveled by men than by women.

In personality, we have a cross-national survey (if not a true cross-cultural one) in Feingold's meta-analysis, which noted that gender differences in personality are consistent across ages, years of data collection, educational levels, and nations. When it comes to spatial manipulation and mathematical reasoning, we have fewer relevant data, and we honestly don't have true cross-cultural surveys, but we do have cross-national surveys. David Geary and Catherine DeSoto found the expected sex difference in mental rotation in ten European countries and in Ghana, Turkey, and China. Similarly, Diane Halpern, analyzing results from ten countries, said that "the majority of the findings show amazing cross-cultural consistency when comparing males and females on cognitive tests."

Third, stability over time. Surveys of life interests and personality have shown little or no change in the two generations that have come of age since the second wave of feminism. There is also, famously, resistance to change in communities that, for various ideological reasons, were dedicated to stamping out sex differences, and found they were unable to do so. These include the Israeli kibbutz, various American Utopian communes a century ago, and contemporary androgynous academic couples.

In tests of mental rotation, the meta-analysis by Voyer et al found no change over time. In mathematical reasoning there has been a decline in the size of the difference, although it has certainly not disappeared.

Fourth, many sex differences can be seen in other mammals. It would be an amazing coincidence if these differences just happened to be replicated in the arbitrary choices made by human cultures at the dawn of time. There are large differences between males and females in many mammals in aggression, in investment in offspring, in play aggression versus play parenting, and in range size, which predicts a species' sex differences in spatial ability (such as in solving mazes), at least in polygynous species, which is how the human species is classified. Many primate species even show a sex difference in their interest in physical objects versus conspecifics, a difference seen in their patterns of juvenile play. Among baby vervet monkeys, the males even prefer to play with trucks and the females with other kinds of toys!

Fifth, many of these differences emerge in early childhood. It is said that there is a technical term for people who believe that little boys and little girls are born indistinguishable and are molded into their natures by parental socialization. The term is "childless."

Some sex differences seem to emerge even in the first week of life. Girls respond more to sounds of distress, and girls make more eye contact than boys. And in a study that I know Liz disputes and that I hope we'll talk about, newborn boys were shown to be more interested in looking at a physical object than a face, whereas newborn girls were shown to be more interested in looking at a face than a physical object.

A bit later in development there are vast and robust differences between boys and girls, seen all over the world. Boys far more often than girls engage in rough-and-tumble play, which involves aggression, physical activity, and competition. Girls engage far more often in cooperative play. Girls engage much more often in play parenting. And yes, boys the world over turn anything into a vehicle or a weapon, and girls turn anything into a doll. There are sex differences in intuitive psychology, that is, how well children can read one another's minds. For instance, several large studies show that girls are better than boys in solving the "false belief task," and in interpreting the mental states of characters in stories.

Sixth, genetic boys brought up as girls. In a famous 1970s incident called the John/Joan case, one member of a pair of identical twin boys lost his penis in a botched circumcision (I was relieved to learn that this was not done by a mohel, but by a bumbling surgeon). Following advice from the leading gender expert of the time, the parents agreed to have the boy castrated, given female-specific hormones, and brought up as a girl. All this was hidden from him throughout his childhood.

When I was an undergraduate the case was taught to me as proof of how gender roles are socially acquired. But it turned out that the facts had been suppressed. When "Joan" and her family were interviewed years later, it turned out that from the youngest ages he exhibited boy-typical patterns of aggression and rough-and-tumble play, rejected girl-typical activities, and showed a greater interest in things than in people. At age 14, suffering from depression, he was finally told the truth by his father. He underwent further surgery, married a woman, adopted two children, and got a job in a slaughterhouse.

Nor is this a unique instance. In a condition called cloacal exstrophy, genetic boys are sometimes born without normal male genitalia. When they are castrated and brought up as girls, in 25 out of 25 documented instances they have felt that they were boys trapped in girls' bodies, and showed male-specific patterns of behavior such as rough-and-tumble play.

Seventh, a lack of differential treatment by parents and teachers. These findings come as a shock to many people. One key result comes from Lytton and Romney's meta-analysis of sex-specific socialization involving 172 studies and 28,000 children, in which they looked both at parents' reports and at direct observations of how parents treat their sons and daughters — and found few or no differences among contemporary Americans. In particular, there was no difference in the categories "Encouraging Achievement" and "Encouraging Achievement in Mathematics."

There is a widespread myth that teachers (who of course are disproportionately female) are dupes who perpetuate gender inequities by failing to call on girls in class, and who otherwise have low expectations of girls' performance. In fact Jussim and Eccles, in a study of 100 teachers and 1,800 students, concluded that teachers seemed to be basing their perceptions of students on those students' actual performances and motivation.

Eighth, studies of prenatal sex hormones: the mechanism that makes boys boys and girls girls in the first place. There is evidence, admittedly squishy in parts, that differences in prenatal hormones make a difference in later thought and behavior even within a given sex. In the condition called congenital adrenal hyperplasia, girls in utero are subjected to an increased dose of androgens, which is neutralized postnatally. But when they grow up they have male-typical toy preferences — trucks and guns — compared to other girls, male-typical play patterns, more competitiveness, less cooperativeness, and male-typical occupational preferences. However, research on their spatial abilities is inconclusive, and I cannot honestly say that there are replicable demonstrations that CAH women have male-typical patterns of spatial cognition.

Similarly, variations in fetal testosterone, studied in various ways, show that fetal testosterone has a nonmonotonic relationship to reduced eye contact and face perception at 12 months, to reduced vocabulary at 18 months, to reduced social skills and greater narrowness of interest at 48 months, and to enhanced mental rotation abilities in the school-age years.

Ninth, circulating sex hormones. I'm going to go over this slide pretty quickly because the literature is a bit messy. Though it's possible that all claims of the effects of hormones on cognition will turn out to be bogus, I suspect something will be salvaged from this somewhat contradictory literature. There are, in any case, many studies showing that testosterone levels in the low-normal male range are associated with better abilities in spatial manipulation. And in a variety of studies in which estrogens are compared or manipulated, there is evidence, admittedly disputed, for statistical changes in the strengths and weaknesses in women's cognition during the menstrual cycle, possibly a counterpart to the changes in men's abilities during their daily and seasonal cycles of testosterone.

My last kind of evidence: imprinted X chromosomes. In the past fifteen years an entirely separate genetic system capable of implementing sex differences has been discovered. In the phenomenon called genetic imprinting, studied by David Haig and others, a chromosome such as the X chromosome can be altered depending on whether it was passed on from one's mother or from one's father. This makes a difference in the condition called Turner syndrome, in which a child has just one X chromosome, but can get it either from her mother or her father. When she inherits an X that is specific to girls, on average she has a better vocabulary and better social skills, and is better at reading emotions, at reading body language, and at reading faces.

A remark on stereotypes, and then I'll finish.

Are these stereotypes? Yes, many of them are (although, I must add, not all of them — for example, women's superiority in spatial memory and mathematical calculation). There seems to be a widespread assumption that if a sex difference conforms to a stereotype, the difference must have been caused by the stereotype, via differential expectations for boys and for girls. But of course the causal arrow could go in either direction: stereotypes might reflect differences rather than cause them. In fact there's an enormous literature in cognitive psychology which says that people can be good intuitive statisticians when forming categories and that their prototypes for conceptual categories track the statistics of the natural world pretty well. For example, there is a stereotype that basketball players are taller on average than jockeys. But that does not mean that basketball players grow tall, and jockeys shrink, because we expect them to have certain heights! Likewise, Alice Eagly and Jussim and Eccles have shown that most of people's gender stereotypes are in fact pretty accurate. Indeed the error people make is in the direction of underpredicting sex differences.

To sum up: I think there is more than "a shred of evidence" for sex differences that are relevant to statistical gender disparities in elite hard science departments. There are reliable average differences in life priorities, in an interest in people versus things, in risk-seeking, in spatial transformations, in mathematical reasoning, and in variability in these traits. And there are ten kinds of evidence that these differences are not completely explained by socialization and bias, although they surely are in part.

A concluding remark. None of this provides grounds for ignoring the biases and barriers that do keep women out of science, as long as we keep in mind the distinction between fairness on the one hand and sameness on the other. And I will give the final word to Gloria Steinem: "there are very few jobs that actually require a penis or a vagina, and all the other jobs should be open to both sexes." 



Elizabeth Spelke

(ELIZABETH SPELKE:) Thanks, especially to Steve; I'm really glad we're able to have this debate, I've been looking forward to it.

I want to start by talking about the points of agreement between Steve and me, and as he suggested, there are many. If we got away from the topic of sex and science, we'd be hard pressed to find issues that we disagree on. Here are a few of the points of agreement that are particularly relevant to the discussions of the last few months.

First, we agree that both our society in general and our university in particular will be healthiest if all opinions can be put on the table and debated on their merits. We also agree that claims concerning sex differences are empirical, they should be evaluated by evidence, and we'll all be happier and live longer if we can undertake that evaluation as dispassionately and rationally as possible. We agree that the mind is not a blank slate; in fact one of the deepest things that Steve and I agree on is that there is such a thing as human nature, and it is a fascinating and exhilarating experience to study it. And finally, I think we agree that the role of scientists in society is rather modest. Scientists find things out. The much more difficult questions of how to use that information, live our lives, and structure our societies are not questions that science can answer. Those are questions that everybody must consider.

So where do we disagree?

We disagree on the answer to the question, why in the world are women scarce as hens' teeth on the mathematics faculty at Harvard and other similar institutions? In the current debate, two classes of factors have been said to account for this difference. In one class are social forces, including overt and covert discrimination and social influences that lead men and women to develop different skills and different priorities. In the other class are genetic differences that predispose men and women to have different capacities and to want different things.

In his book, The Blank Slate, and again today, Steve argued that social forces are over-rated as causes of gender differences. Intrinsic differences in aptitude are a larger factor, and intrinsic differences in motives are the biggest factor of all. Most of the examples that Steve gave concerned what he takes to be biologically based differences in motives.

My own view is different. I think the big forces causing this gap are social factors. There are no differences in overall intrinsic aptitude for science and mathematics between women and men. Notice that I am not saying the genders are indistinguishable, that men and women are alike in every way, or even that men and women have identical cognitive profiles. I'm saying that when you add up all the things that men are good at, and all the things that women are good at, there is no overall advantage for men that would put them at the top of the fields of math and science.

On the issue of motives, I think we're not in a position to know whether the different things that men and women often say they want stem only from social forces, or in part from intrinsic sex differences. I don't think we can know that now.

I want to start with the issue that's clearly the biggest source of debate between Steve and me: the issue of differences in intrinsic aptitude. This is the only issue that my own work and professional knowledge bear on. Then I will turn to the social forces, as a lay person as it were, because I think they are exerting the biggest effects. Finally, I'll consider the question of intrinsic motives, which I hope we'll come back to in our discussion.

Over the last months, we've heard three arguments that men have greater cognitive aptitude for science. The first argument is that from birth, boys are interested in objects and mechanics, and girls are interested in people and emotions. The predisposition to figure out the mechanics of the world sets boys on a path that makes them more likely to become scientists or mathematicians. The second argument assumes, as Galileo told us, that science is conducted in the language of mathematics; it holds that males are intrinsically better at mathematical reasoning, including spatial reasoning. The third argument is that men show greater variability than women, and as a result there are more men at the extreme upper end of the ability distribution from which scientists and mathematicians are drawn. Let me take these claims one by one.

The first claim, as Steve said, is gaining new currency from the work of Simon Baron-Cohen. It's an old idea, presented with some new language. Baron-Cohen says that males are innately predisposed to learn about objects and mechanical relationships, and this sets them on a path to becoming what he calls "systematizers." Females, on the other hand, are innately predisposed to learn about people and their emotions, and this puts them on a path to becoming "empathizers." Since systematizing is at the heart of math and science, boys are more apt to develop the knowledge and skills that lead to math and science.

To anyone as old as I am who has been following the literature on sex differences, this may seem like a surprising claim. The classic reference on the nature and development of sex differences is a book by Eleanor Maccoby and Carol Jacklin that came out in the 1970s. They reviewed evidence for all sorts of sex differences, across large numbers of studies, but they also concluded that certain ideas about differences between the genders were myths. At the top of their list of myths was the idea that males are primarily interested in objects and females are primarily interested in people. They reviewed an enormous literature, in which babies were presented with objects and people to see if they were more interested in one than the other. They concluded that there were no sex differences in these interests.

However, that conclusion was drawn in the early 1970s. At that time, we didn't know much about babies' understanding of objects and people, or how their understanding grows. Since Baron-Cohen's claims concern differential predispositions to learn about different kinds of things, you could argue that the claims hadn't been tested in Maccoby and Jacklin's time. What does research now show?

Let me take you on a whirlwind tour of 30 years of research in one PowerPoint slide. From birth, babies perceive objects. They know where one object ends and the next one begins. They can't see objects as well as we can, but as they grow their object perception becomes richer and more differentiated.

Babies also start with rudimentary abilities to represent that an object continues to exist when it's out of view, and they hold onto those representations longer, and over more complicated kinds of changes, as they grow. Babies make basic inferences about object motion: inferences like, the force with which an object is hit determines the speed with which it moves. These inferences undergo regular developmental changes over the infancy period.

In each of these cases, there is systematic developmental change, and there's variability. Because of this variability, we can compare the abilities of male infants to those of female infants. Do we see sex differences? The research gives a clear answer to this question: We don't.

Male and female infants are equally interested in objects. Male and female infants make the same inferences about object motion, at the same time in development. They learn the same things about object mechanics at the same time.

Across large numbers of studies, occasionally a study will favor one sex over the other. For example, girls learn that the force with which something is hit influences the distance it moves a month earlier than boys do. But these differences are small and scattered. For the most part, we see high convergence across the sexes. Common paths of learning continue through the preschool years, as kids start manipulating objects to see if they can get a rectangular block into a circular hole. If you look at the rates at which boys and girls figure these things out, you don't find any differences. We see equal developmental paths.

I think this research supports an important conclusion. In discussions of sex differences, we need to ask what's common across the two sexes. One thing that's common is that infants don't divide up the labor of understanding the world, with males focusing on mechanics and females focusing on emotions. Male and female infants are both interested in objects and in people, and they learn about both. The conclusions that Maccoby and Jacklin drew in the early 1970s are well supported by research since that time.

Let me turn to the second claim. People may have equal abilities to develop intuitive understanding of the physical world, but formal math and science don't build on these intuitions. Scientists use mathematics to come up with new characterizations of the world and new principles to explain its functioning. Maybe males have an edge in scientific reasoning because of their greater talent for mathematics.

As Steve said, formal mathematics is not something we have evolved to do; it's a recent accomplishment. Animals don't do formal math or science, and neither did humans back in the Pleistocene. If there is a biological basis for our mathematical reasoning abilities, it must depend on systems that evolved for other purposes, but that we've been able to harness for the new purpose of representing and manipulating numbers and geometry.

Research from the intersecting fields of cognitive neuroscience, neuropsychology, cognitive psychology, and cognitive development provides evidence for five "core systems" at the foundations of mathematical reasoning. The first is a system for representing small exact numbers of objects — the difference between one, two, and three. This system emerges in human infants at about five months of age, and it continues to be present in adults. The second is a system for discriminating large, approximate numerical magnitudes — the difference between a set of about ten things and a set of about twenty things. That system also emerges early in infancy, at four or five months, and continues to be present and functional in adults.

The third system is probably the first uniquely human foundation for numerical abilities: the system of natural number concepts that we construct as children when we learn verbal counting. That construction takes place between about the ages of two and a half and four years. The last two systems are first seen in children when they navigate. One system represents the geometry of the surrounding layout. The other system represents landmark objects.

All five systems have been studied quite extensively in large numbers of male and female infants. We can ask, are there sex differences in the development of any of these systems at the foundations of mathematical thinking? Again, the answer is no. I will show you data from just two cases.

The first is the development of natural number concepts, constructed by children between the ages of two and four. At any particular time in this period, you'll find a lot of variability. For example, between the ages of three and three and a half years, some children have only figured out the meaning of the word "one" and can only distinguish the symbolic concept one from all other numbers. Other kids have figured out the meanings of all the words in the count list up to "ten" or more, and they can use all of them in a meaningful way. Most kids are somewhere in between: they have figured out the first two symbols, or the first three, and so forth. When you compare children's performance by sex, you see no hint of a superiority of males in constructing natural number concepts.

The other example comes from studies that I think are the closest thing in preschool children to the mental rotation tests conducted with adults. In these studies, children are brought into a room of a given shape, something is hidden in a corner, and then their eyes are closed and they're spun around. They have to remember the shape of the room, open their eyes, and figure out how to reorient themselves to find the object where it was hidden. If you test a group of four-year-olds, you find they can do this task well above chance but not perfectly; there's a range of performance. When you break that performance down by gender, again there is not a hint of an advantage for boys over girls.

These findings and others support two important points. First, indeed there is a biological foundation to mathematical and scientific reasoning. We are endowed with core knowledge systems that emerge prior to any formal instruction and that serve as a basis for mathematical thinking. Second, these systems develop equally in males and females. Ten years ago, the evolutionary psychologist and sex difference researcher, David Geary, reviewed the literature that was available at that time. He concluded that there were no sex differences in "primary abilities" underlying mathematics. What we've learned in the last ten years continues to support that conclusion.

Sex differences do emerge at older ages. Because they emerge later in childhood, it's hard to tease apart their biological and social sources. But before we attempt that task, let's ask what the differences are.

I think the following is a fair statement, both of the cognitive differences that Steve described and of others. When people are presented with a complex task that can be solved through multiple different strategies, males and females sometimes differ in the strategy that they prefer.

For example, if a task can only be solved by representing the geometry of the layout, we do not see a difference between men and women. But if the task can be accomplished either by representing geometry or by representing individual landmarks, girls tend to rely on the landmarks, and boys on the geometry. To take another example, when you compare the shapes of two objects at different orientations, there are two different strategies you can use. You can attempt a holistic rotation of one of the objects into registration with the other, or you can do point-by-point featural comparisons of the two objects. Men are more likely to do the first; women are more likely to do the second.

Finally, the mathematical word problems on the SAT-M very often allow multiple solutions. Both item analyses and studies of high school students engaged in the act of solving such problems suggest that when students have the choice of solving a problem by plugging in a formula or by doing Venn-diagram-like spatial reasoning, girls tend to do the first and boys tend to do the second.

Because of these differences, males and females sometimes show differing cognitive profiles on timed tests. When you have to solve problems fast, some strategies will be faster than others. Thus, females perform better at some verbal, mathematical and spatial tasks, and males perform better at other verbal, mathematical, and spatial tasks. This pattern of differing profiles is not well captured by the generalization, often bandied about in the popular press, that women are "verbal" and men are "spatial." There doesn't seem to be any more evidence for that than there was for the idea that women are people-oriented and men are object-oriented. Rather the differences are more subtle.

Does one of these two profiles foster better learning of math than the other? In particular, is the male profile better suited to high-level mathematical reasoning?

At this point, we face a question that's been much discussed in the literature on mathematics education and mathematical testing. The question is, by what yardstick can we decide whether men or women are better at math?

Some people suggest that we look at performance on the SAT-M, the quantitative portion of the Scholastic Assessment Test. But this suggestion raises a problem of circularity. The test is composed of many different types of items. Some of those items are solved better by females. Some are solved better by males. The people who make the test have to decide how many items of each type to include. Depending on how they answer that question, they can create a test that makes women look like better mathematicians, or a test that makes men look like better mathematicians. What's the right solution?

Books are devoted to this question, with much debate, but there seems to be a consensus on one point: The only way to come up with a test that's fair is to develop an independent understanding of what mathematical aptitude is and how it's distributed between men and women. But in that case, we can't use performance on the SAT to give us that understanding. We've got to get that understanding in some other way. So how are we going to get it?

A second strategy is to look at job outcomes. Maybe the people who are better at mathematics are those who pursue more mathematically intensive careers. But this strategy raises two problems. First, which mathematically intensive jobs should we choose? If we choose engineering, we will conclude that men are better at math because more men become engineers. If we choose accounting, we will think that women are better at math because more women become accountants: 57% of current accountants are women. So which job are we going to pick, to decide who has more mathematical talent?

These two examples suggest a deeper problem with job outcomes as a measure of mathematical talent. Surely you've got to be good at math to land a mathematically intensive job, but talent in mathematics is only one of the factors influencing career choice. It can't be our gold standard for mathematical ability.

So what can be? I suggest the following experiment. We should take a large number of male students and a large number of female students who have equal educational backgrounds, and present them with the kinds of tasks that real mathematicians face. We should give them new mathematical material that they have not yet mastered, and allow them to learn it over an extended period of time: the kind of time scale that real mathematicians work on. We should ask, how well do the students master this material? The good news is, this experiment is done all the time. It's called high school and college.

Here's the outcome. In high school, girls and boys now take equally many math classes, including the most advanced ones, and girls get better grades. In college, women earn almost half of the bachelor's degrees in mathematics, and men and women get equal grades. Here I respectfully disagree with one thing that Steve said: men and women get equal grades, even when you only compare people within a single institution and a single math class. Equating for classes, men and women get equal grades.

The outcome of this large-scale experiment gives us every reason to conclude that men and women have equal talent for mathematics. Here, I too would like to quote Diane Halpern. Halpern reviews much evidence for sex differences, but she concludes, "differences are not deficiencies." Men and women have equal aptitude for mathematics. Yes, there are sex differences, but they don't add up to an overall advantage for one sex over the other.

Let me turn to the third claim, that men show greater variability, either in general or in quantitative abilities in particular, and so there are more men at the upper end of the ability distribution. I can go quickly here, because Steve has already talked about the work of Camilla Benbow and Julian Stanley, focusing on mathematically precocious youth who are screened at the age of 13, put in intensive accelerated programs, and then followed up to see what they achieve in mathematics and other fields.

As Steve said, students were screened at age 13 by the SAT, and there were many more boys than girls who scored at the highest levels on the SAT-M. In the 1980s, the disparity was almost 13 to 1. It is now substantially lower, but there still are more boys among the very small subset of people from this large, talented sample who scored at the very upper end. Based on these data, Benbow and Stanley concluded that there are more boys than girls in the pool from which future mathematicians will be drawn. But notice the problem with this conclusion: It's based entirely on the SAT-M. This test, and the disparity it revealed, are themselves in need of explanation; we need a firmer yardstick for assessing and understanding gender differences in this talented population.

Fortunately, Benbow, Stanley and Lubinski have collected much more data on these mathematically talented boys and girls: not just the ones with top scores on one timed test, but rather the larger sample of girls and boys who were accelerated and followed over time. Let's look at some of the key things that they found.

First, they looked at college performance by the talented sample. They found that the males and females took equally demanding math classes and majored in math in equal numbers. More girls majored in biology and more boys in physics and engineering, but equal numbers of girls and boys majored in math. And they got equal grades. The SAT-M not only under-predicts the performance of college women in general, it also under-predicted the college performance of women in the talented sample. These women and men have been shown to be equally talented by the most meaningful measure we have: their ability to assimilate new, challenging material in demanding mathematics classes at top-flight institutions. By that measure, the study does not find any difference between highly talented girls and boys.

So, what's causing the gender imbalance on faculties of math and science? Not differences in intrinsic aptitude. Let's turn to the social factors that I think are much more important. Because I'm venturing outside my own area of work, and because time is short, I won't review all of the social factors producing differential success of men and women. I will talk about just one effect: how gender stereotypes influence the ways in which males and females are perceived. 

Let me start with studies of parents' perceptions of their own children. Steve said that parents report that they treat their children equally. They treat their boys and girls alike, and they encourage them to equal extents, for they want both their sons and their daughters to succeed. This is no doubt true. But how are parents perceiving their kids?

Some studies have interviewed parents just after the birth of their child, at the point where the first question that 80% of parents ask — is it a boy or a girl? — has been answered. Parents of boys describe their babies as stronger, heartier, and bigger than parents of girls. The investigators also looked at the babies' medical records and asked whether there really were differences between the boys and girls in weight, strength, or coordination. The boys and girls were indistinguishable in these respects, but the parents' descriptions were different.

At 12 months of age, girls and boys show equal abilities to walk, crawl, or clamber. But before one study, Karen Adolph, an investigator of infants' locomotor development, asked parents to predict how well their child would do on a set of crawling tasks: Would the child be able to crawl down a sloping ramp? Parents of sons were more confident that their child would make it down the ramp than parents of daughters. When Adolph tested the infants on the ramp, there was no difference whatever between the sons and daughters, but there was a difference in the parents' predictions.

My third example, moving up in age, comes from the studies of Jackie Eccles. She asked parents of boys and girls in sixth grade, how talented do you think your child is in mathematics? Parents of sons were more likely to judge that their sons had talent than parents of daughters. A panoply of objective measures, including math grades in school, performance on standardized tests, teachers' evaluations, and children's expressed interest in math, revealed no differences between the girls and boys. Still, there was a difference in parents' perception of their child's intangible talent. Other studies have shown a similar effect for science.

There's clearly a mismatch between what parents perceive in their kids and what objective measures reveal. But is it possible that the parents are seeing something that the objective measures are missing? Maybe the boy getting B's in his math class really is a mathematical genius, and his mom or dad has sensed that. To eliminate that possibility, we need to present observers with the very same baby, or child, or Ph.D. candidate, and manipulate their belief about the person's gender. Then we can ask whether their belief influences their perception.

It's hard to do these studies, but there are examples, and I will describe a few of them. A bunch of studies take the following form: you show a group of parents, or college undergraduates, video-clips of babies that they don't know personally. For half of them you give the baby a male name, and for the other half you give the baby a female name. (Male and female babies don't look very different.) The observers watch the baby and then are asked a series of questions: What is the baby doing? What is the baby feeling? How would you rate the baby on a dimension like strong-to-weak, or more intelligent to less intelligent? There are two important findings.

First, when babies do something unambiguous, reports are not affected by the baby's gender. If the baby clearly smiles, everybody says the baby is smiling or happy. Perception of children is not pure hallucination. Second, children often do things that are ambiguous, and parents face questions whose answers aren't easily readable off their child's overt behavior. In those cases, you see some interesting gender labeling effects. For example, in one study a child on a video-clip was playing with a jack-in-the-box. It suddenly popped up, and the child was startled and jumped backward. When people were asked, what's the child feeling, those who were given a female label said, "she's afraid." But the ones given a male label said, "he's angry." Same child, same reaction, different interpretation.

In other studies, children with male names were more likely to be rated as strong, intelligent, and active; those with female names were more likely to be rated as little, soft, and so forth.

I think these perceptions matter. You, as a parent, may be completely committed to treating your male and female children equally. But no sane parents would treat a fearful child the same way they treat an angry child. If knowledge of a child's gender affects adults' perception of that child, then male and female children are going to elicit different reactions from the world, different patterns of encouragement. These perceptions matter, even in parents who are committed to treating sons and daughters alike.

I will give you one last version of a gender-labeling study. This one hits particularly close to home. The subjects in the study were people like Steve and me: professors of psychology, who were sent some vitas to evaluate as applicants for a tenure track position. Two different vitas were used in the study. One was a vita of a walk-on-water candidate, best candidate you've ever seen, you would die to have this person on your faculty. The other vita was a middling, average vita among successful candidates. For half the professors, the name on the vita was male, for the other half the name was female. People were asked a series of questions: What do you think about this candidate's research productivity? What do you think about his or her teaching experience? And finally, Would you hire this candidate at your university?

For the walk-on-water candidate, there was no effect of gender labeling on these judgments. I think this finding supports Steve's view that we're dealing with little overt discrimination at universities. It's not as if professors see a female name on a vita and think, I don't want her. When the vita's great, everybody says great, let's hire.

What about the average successful vita, though: that is to say, the kind of vita that professors most often must evaluate? In that case, there were differences. The male was rated as having higher research productivity. These psychologists, Steve's and my colleagues, looked at the same number of publications and thought, "good productivity" when the name was male, and "less good productivity" when the name was female. Same thing for teaching experience. The very same list of courses was seen as good teaching experience when the name was male, and less good teaching experience when the name was female. In answer to the question would they hire the candidate, 70% said yes for the male, 45% for the female. If the decision were made by majority rule, the male would get hired and the female would not.

A couple other interesting things came out of this study. The effects were every bit as strong among the female respondents as among the male respondents. Men are not the culprits here. There were effects at the tenure level as well. At the tenure level, professors evaluated a very strong candidate, and almost everyone said this looked like a good case for tenure. But people were invited to express their reservations, and they came up with some very reasonable doubts. For example, "This person looks very strong, but before I agree to give her tenure I would need to know, was this her own work or the work of her adviser?" Now that's a perfectly reasonable question to ask. But what ought to give us pause is that those kinds of reservations were expressed four times more often when the name was female than when the name was male.

So there's a pervasive difference in perceptions, and I think the difference matters. Scientists' perception of the quality of a candidate will influence the likelihood that the candidate will get a fellowship, a job, resources, or a promotion. A pattern of biased evaluation therefore will occur even in people who are absolutely committed to gender equity.

I have little doubt that all my colleagues here at Harvard are committed to the principle that a male candidate and a female candidate of equal qualifications should have equal chance at a job. But we also think that when we compare a more productive scholar to a less productive one, a more experienced teacher to a less experienced one, a more independent investigator to a less independent one, those factors matter as well. These studies say that knowledge of a person's gender will influence our assessment of those factors, and that's going to produce a pattern of discrimination, even in people with the best intentions.

From the moment of birth to the moment of tenure, throughout this great developmental progression, there are unintentional but pervasive and important differences in the ways that males and females are perceived and evaluated.

I have to emphasize that perceptions are not everything. When cases are unambiguous, you don't see these effects. What's more, cognitive development is robust: boys and girls show equal capacities and achievements in educational settings, including in science and mathematics, despite the very different ways in which boys and girls are perceived and evaluated. I think it's really great news that males and females develop along common paths and gain common sets of abilities. The equal performance of males and females, despite their unequal treatment, strongly suggests that mathematical and scientific reasoning has a biological foundation, and this foundation is shared by males and females.

Finally, you do not create someone who feels like a girl or boy simply by perceiving them as male or female. That's the lesson that comes from the studies of people of one sex who are raised as the opposite sex. Biological sex differences are real and important. Sex is not a cultural construction that's imposed on people.

But the question on the table is not, Are there biological sex differences? The question is, Why are there fewer women mathematicians and scientists? The patterns of bias that I described provide four interconnected answers to that question. First, and most obviously, biased perceptions produce discrimination: When a group of equally qualified men and women are evaluated for jobs, more of the men will get those jobs if they are perceived to be more qualified. Second, if people are rational, more men than women will put themselves forward into the academic competition, because men will see that they've got a better chance for success. Academic jobs will be more attractive to men because they face better odds, will get more resources, and so forth.

Third, biased perceptions earlier in life may well deter some female students from even attempting a career in science or mathematics. If your parents feel that you don't have as much natural talent as someone else whose objective abilities are no better than yours, that may discourage you, as Eccles's work shows. Finally, there's likely to be a snowball effect. All of us have an easier time imagining ourselves in careers where there are other people like us. If the first three effects perpetuate a situation where there are few female scientists and mathematicians, young girls will be less likely to see math and science as a possible life.

So by my personal scorecard, these are the major factors. Let me end, though, by asking, could Steve also be partly right? Could biological differences in motives — motivational patterns that evolved in the Pleistocene but that apply to us today — propel more men than women towards careers in mathematics and science?

My feeling is that where we stand now, we cannot evaluate this claim. It may be true, but as long as the forces of discrimination and biased perceptions affect people so pervasively, we'll never know. I think the only way we can find out is to do one more experiment. We should allow all of the evidence that men and women have equal cognitive capacity to permeate through society. We should allow people to evaluate children in relation to their actual capacities, rather than our sense of what those capacities ought to be, given their gender. Then we can see, as those boys and girls grow up, whether different inner voices pull them in different directions. I don't know what the findings of that experiment will be. But I do hope that some future generation of children gets to find out.


Steven Pinker & Elizabeth Spelke: Concluding Discussion

PINKER: Thanks, Liz, for a very stimulating and apposite presentation. A number of comments.

I don't dispute a lot of the points you made, but many have lost sight of the datum that we're here to explain in the first place. Basic abilities like knowing that an object is still there when you put a hankie over it, or knowing that one object can't pass through another, are not the kinds of things that distinguish someone who's capable of being a professor of physics or math from someone who isn't. And in many of the cases in which you correctly said that there is no gender difference in kids, there is no gender difference in adults either — such as the give-a-number task and other core abilities.

Also, a big concern with all of the null effects that you mentioned is statistical power. Bob Rosenthal 20 years ago pointed out that the vast majority of studies that psychologists do are incapable of detecting the kinds of results they seek, which is why it's so important to have meta-analyses and large sample sizes. I question whether all of the null results that you mentioned can really be justified, and whether they are comparable to the studies done on older kids and adults.

One place where I really do disagree with you is in the value of the SAT-M, where the circularity you describe has amply been broken. This is what people at the College Board are obsessed with. What you are treating as the gold standard is performance in college courses. But the datum we are disputing is not how well boys and girls do in school, or how well men and women do in college, because there we agree there is no male advantage. The phenomenon we really are discussing is performance at the upper levels: getting a tenure-track job, getting patents, and so on. And here the analyses have shown that the SAT is not biased against girls. That is, a given increment in SAT score predicts a given increment in the variable of interest to the same extent whether you're male or female.

I think there may be a slight difference in which finding each of us is alluding to in talking about differences in grades. I was not suggesting that girls' better grades come about because they take easier courses; they really do get better grades holding courses constant. Rather it's the slight underprediction of grades by the SAT that can be explained in part by class choice and in part by conscientiousness.

SPELKE: Well the most recent thing that I've read about this issue is the Gallagher and Kaufman book, Gender Differences in Mathematics, which just came out about a month ago. They report that equating for classes and institutions, and looking just at A students, there's a 21 point SAT math differential; that is to say, for two students getting the same grade of A, the average for the girls on the SAT will have been 21 points lower. That differential is there at every grade level and in all the courses.

The SAT people have discussed it as a problem. One of the discussions reached the conclusion that the SAT is still useful, because although it under-predicts girls' performance in college, girls' grades over-predict their performance in college, and if you use the two together you are okay. In fact, they advised that people never take account of the SAT simply by itself, but consider it in relation to grades. When you spoke earlier about the use of GREs in admitting people to grad school, that's in fact what graduate programs do: We consider both grades and GREs.

Interestingly, though, in all of the public discussion of the relative advantages of men versus women for math and science, over the last two months, people have not used the SAT in conjunction with grades. When talking about relative ability, they've used the SAT by itself. I think that has led to a distorted conversation about this issue.

PINKER: It nonetheless remains true that in the most recent study by Lubinski and Benbow, which showed a fantastic degree of predictive power of the SAT given in 7th grade, there was no difference in predictive power in boys and girls in any of these measures.

But let me return to the datum that is at issue here, namely the differential representation of the sexes in physical sciences, mechanical engineering, and mathematics. The fact that men and women are equal overall in spatial abilities, and overall in mathematical abilities, is irrelevant to this. It may be that the particular subtalents in which women excel make them more likely to go into accounting. But the datum we are discussing is not a gender difference in accounting. The datum we are discussing is a gender difference in the physical sciences, engineering, and mathematics. And I suspect that when you look at a range of professions, the size of the sex discrepancy correlates with how much spatial manipulation (not just any kind of spatial cognition) and how much mathematical reasoning (not just any kind of mathematical ability) each of those jobs requires.

What about parents' expectations? In the 1970s the model for development was "as the twig is bent, so grows the branch": the idea that subtle differences in parents' perceptions early in life can have a lasting effect. You nudge the child in a particular direction and you'll see an effect on his trajectory years later. But there is now an enormous amount of research, spearheaded by the behavioral genetics revolution, suggesting that that is not true. There may be effects of parental expectations and parental treatment on young children while they're still in the home, but most follow-up studies show that short of outright abuse and neglect, these effects peter out by late adolescence. And studies of adoption and of twins and other sibs reared apart suggest that any effects of the kinds of parenting that are specific to a child simply reflect the preexisting genetic traits of the child, and the additional effect of parenting peters out to nothing.

SPELKE: Can I respond to that? I think one thing is different about the gender case, compared to the early socialization effects for other kinds of categories, different styles of parenting, and so forth. The gender differences that we see reflected in parents' differing perceptions are mirrored by differing perceptions that males and females experience throughout their lives. It's not the case that idiosyncratic pairs of parents treat their kids one way, but then as soon as the children leave that environment, other people treat them differently. Rather, what we have in the case of gender is a pervasive pattern that just keeps getting perpetuated in different people. I'm rather a nativist about cognition, and I am tempted to look at that pattern and wonder, did Darwin give us some innately wrong idea about the genders? Professionals in professional contexts show the same patterns of evaluation that parents show in home contexts, and children face those patterns of evaluation, not just when they're young and at home, but continuing through high school, college, and finally with their colleagues on academic faculties. We're dealing here with a much more pervasive effect than the effects of socialization in the other studies that you've written and talked about.

PINKER: Regarding bias: as I mentioned at the outset, I don't doubt that bias exists. But the idea that the bias started out from some arbitrary coin flip at the dawn of time and that gender differences have been perpetuated ever since by the existence of that bias is extremely unlikely. In so many cases, as Eagly and the Stereotype-Accuracy people point out, the biases are accurate. Also, there's an irony in these discussions of bias. When we test people in the cognitive psychology lab, and we don't call these base rates "gender," we applaud people when they apply them. If people apply the statistics of a group to an individual case, we call it rational Bayesian reasoning, and congratulate ourselves for getting them to overcome the cognitive illusion of base rate neglect. But when people do the same thing in the case of gender, we treat Bayesian reasoning as a cognitive flaw and base-rate neglect as rational! Now I agree that applying base rates for gender in evaluating individual men and women is a moral flaw; I don't think that base rates ought to be applied in judging individuals in most cases of public decision-making. But the fact that the statistics of a gender are applied does not mean that their origin was arbitrary; it could be statistically sound in some cases.

SPELKE: Let me reply to that, because I agree that the origin is not arbitrary, and that the bias is there for an objective reason, but I think you're drawing the wrong conclusion about it. I think the reason there's a bias to think that men have greater natural talent for math and science is that when we look around the world and ask, who's winning the Nobel Prizes and making the great advances in science, what we see, again and again, is men.

Although Linda Buck received this year's Nobel Prize in physiology or medicine, for the most part it's overwhelmingly men who are reaching the upper levels of math and science. It's natural to look at that and think, there must be some reason, some inner difference between men and women, which produces this enormous disparity. And I quite agree with you that good statistical reasoning should lead you to think, the next student who comes along, if male, is more likely to join that group of Nobel Prize winners.

What I would like to suggest is that we have good reasons to resist this kind of conclusion, and the reasons aren't only moral. Let me just use an analogy, and replay this debate over the biological bases of mathematics and science talent 150 years ago.

Let's consider who the 19th century mathematicians and scientists were. They were overwhelmingly male, just as they are today, but also overwhelmingly European, not Asian. You won't see a Chinese face or an Indian face in 19th century science. It would have been tempting to apply this same pattern of statistical reasoning and say, there must be something about European genes that give rise to greater mathematical talent than Asian genes do. If we go back still further, and play this debate in the Renaissance, I think we would be tempted to conclude that Catholic genes make for better science than Jewish genes, because all those Renaissance scientists were Catholic. If you look at those cases, you see what's wrong with this argument.

What's wrong with the argument is not that biology is irrelevant. If Galileo had been switched at birth with some baby from the Pisan ghetto, the baby raised by Galileo's parents would not likely have ended up teaching us that the language of physics is mathematics. I think that Galileo's genes had something to do with his achievement, but so did Galileo's cultural and social environment: his nurturing. Genius requires huge amounts of both. If, in that baby switch, Galileo had found himself growing up in the Pisan ghetto, I bet he wouldn't have ended up being the example in this discussion today either. So yes, there are reasons for this statistical bias. But I think we want to step back and ask, why is it that almost all Nobel Prize winners are men today? The answer to that question may be the same reason why all the great scientists in Florence were Christian.

PINKER: I think you could take the same phenomenon and come to the opposite conclusion! Say there really was such a self-reinforcing, self-perpetuating dynamic: a difference originates for reasons that might be arbitrary; people perceive the difference; they perpetuate it by their expectations. Just as bad, you say, is the fact that people don't go into fields in which they don't find enough people like themselves. If so, the dynamic you would expect is that the representation of different genders or ethnic groups should migrate to the extremes. That is, there is a positive feedback loop where if you're in the minority, it will discourage people like you from entering the field, which will mean that there'll be even fewer people in the field, and so on. On either side of this threshold you should get a drift of the percentages in opposite directions.

Now, there is an alternative model. At many points in history, arbitrary barriers against the entry of genders and races and ethnic groups to various professions were removed. And as soon as the barrier was removed, far from the statistical underrepresentation perpetuating or exaggerating itself, as you predict, the floodgates open, and the formerly underrepresented group reaches some natural level. It's the Jackie Robinson effect in baseball. In the case of gender and science, remember what our datum is. It's not that women are under-represented in professions in general or in the sciences in general: in many professions women are perfectly well represented, such as being a veterinarian, in which the majority of recent graduates are women by a long shot. If you go back fifty years or a hundred years, there would have been virtually no veterinarians who were women. That underrepresentation did not perpetuate itself via the positive feedback loop that you allude to.

SPELKE: I'm glad you brought up the case of the basketball and baseball players. I think it's interesting to ask, what distinguishes these cases, where you remove the overt discrimination and within a very short period of time the differential disappears, from other cases, where you remove the overt discrimination and the covert discrimination continues? In the athletic cases where discrimination disappears quickly, there are clear, objective measures of success. Whatever people think about the capacities of a black player, if he is hitting the ball out of the park, he is going to get credit for a home run. That is not the case in science.

In science, the judgments are subjective, every step of the way. Who's really talented? Who deserves bigger lab space? Who should get the next fellowship? Who should get promoted to tenure? These decisions are not based on clear and objective criteria. These are the cases where you see discrimination persisting. You see it in academia. You see it in Claudia Goldin's studies of orchestra auditions, which also involve subtle judgments: Who's the more emotive, sensitive player? If you know that the players are male or female, you're going to pick mostly men, but if the players are behind a screen, you'll start picking more women.

PINKER: But that makes the wrong prediction: the harder the science, the greater the participation of women! We find exactly the opposite: it's the most subjective fields within academia — the social sciences, the humanities, the helping professions — that have the greatest representation of women. This follows exactly from the choices that women express in what gives them satisfaction in life. But it goes in the opposite direction to the prediction you made about the role of objective criteria in bringing about gender equity. Surely it's physics, and not, say, sociology, that has the more objective criteria for success.

SPELKE: Let me just say one thing, because I didn't say much in the talk at all, about this issue of motives, and biological differences in motives. That's been a less controversial issue, but I think it's an important one, and most of your examples were concerned with it. I think it's a really interesting possibility that the forces that were active in our evolutionary past have led men and women to evolve somewhat differing concerns. But to jump from that possibility into the present, and draw conclusions about what people's motives will be for pursuing one or another career, is way too big a stretch.

As we both agree, the kinds of careers people pursue now, the kinds of choices they make, are radically different from anything that anybody faced back in the Pleistocene. It is anything but clear how motives that evolved then translate into a modern context. Let me just give one example of this. You've suggested, as a hypothesis, that because of sexual selection and also parental investment issues, men are selected to be more competitive, and women are selected to be more nurturant. Suppose that hypothesis is true. If we want to use it to make predictions about desires for careers in math and science, we're going to have to answer a question that I think is wide open right now. What makes for better motives in a scientist?

What kind of motives are more likely to lead to good science: Competitive motives, like the motive J. D. Watson described in The Double Helix, to get the structure of DNA before Linus Pauling did? Or nurturant motives of the kind that Doug Melton has described recently to explain why he's going into stem cell research: to find a cure for juvenile diabetes, which his children suffer from? I think it's anything but clear how motives from our past translate into modern contexts. We would need to do the experiment, getting rid of discrimination and social pressures, in order to find out. 

Edge Dinners
Event Date: [ 2.24.04 ]
Location:
Monterey, CA
United States

"This goes beyond all known schmoozing. 
This is like some kind of virtual-intellectual conspiracy-in-restraint-of-trade."
— Bruce Sterling, "Third Culture Schmoozing"

"The dinner party was a microcosm of a newly dominant sector of American business." — Wired

There's no such thing as a free lunch, or a free Billionaires' Dinner.

Ariane de Bonvoisin - Daniel Gilbert - Eva Wisten (en route to The Billionaires' Dinner, 2004)

This year, a downsized (or, if you like, more exclusive) Edge dinner was convened in Monterey at the Indian Summer Restaurant.

The dinner, which for the past few years has been held during the annual TED Conference, always has a name attached to it. It began in 1984 as "The Millionaires' Dinner" (thanks to a page one article in The Wall Street Journal) in a Las Vegas Mexican restaurant during COMDEX. Eventually it evolved into "The Digerati Dinner", then "The World Domination, Corporate Cubism, and Alien Mind Control Dinner", and then "The Billionaires' Dinner". Last year we tried "The Science Dinner". Everyone yawned. So this year, it's back to the money-sex-power thing with "The Billionaires' Dinner". I realize that "Billionaire" is tired and very '90s, but the name worked for this year's dinner. It was a coincidence that during the dinner, Google cofounder Larry Page received a message on his pager informing him that he and cofounder Sergey Brin had made the Forbes Magazine list of 157 billionaires.

The communications revolution occurring in the age of information and computation has not stopped, nor has it even slowed down. The markets crashed. The innovation continues. And a number of people who showed up for the dinner are really cooking: Jeff Bezos of Amazon; Google's CEO Eric Schmidt, Larry, Sergey, Lori Park, and Megan Smith; Pierre Omidyar, founder of eBay; Dean Kamen, inventor of the Segway; Steve Case, former Chairman of AOL Time-Warner who is now on to new adventures; and Jeffrey Epstein, who recently endowed The Program for Evolutionary Dynamics at Harvard University which is involved in researching applications of mathematics and computer science to biology.

They were mixing it up with the cosmologists Alan Guth (inflationary universe), Leonard Susskind (the landscape of universes), and Paul Steinhardt (the cyclic universe); the physicist Seth Lloyd (quantum computing); the applied mathematician Steve Strogatz (synchronicity in nature); and the psychologists Mike Csikszentmihalyi (flow), Nancy Etcoff (perception of faces), Martin Seligman (positive psychology), Dan Gilbert (mis-wanting), as well as a number of technology and media journalists.

Also attending were Alisa Volkman of the literary-erotic website nerve.com, book packager Ariane de Bonvoisin, and Swedish journalist Eva Wisten. They spent the dinner in rapt conversation with the three cosmologists. "Where were they? I never saw them," said Kevin Kelly. But then Kevin was busy: he and Jeff Bezos, who attended with his mother Jackie, were producing a wall of sound from a table in the middle of the room that made quiet conversation impossible.

An interesting aspect of the dinner was that Seth Lloyd flew in from Tokyo (where he is spending a year) to join us. Seth was the only student of the late Heinz Pagels (who helped to start Edge, and was deeply involved in all its activities). Although I never met Seth when Heinz was alive, I vividly recall Heinz's descriptions of him as the brightest of the bright young physicists...of any generation. Heinz and I had several conversations about how Heinz was attempting to harness Seth's intelligence since he was one of those trans-categorematic individuals. In other words, Heinz was telling me that Seth was unemployable.

Over the years things have worked out for Seth. His seminal work in the fields of quantum computation and quantum communications—including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon's noisy channel theorem, and designing novel methods for quantum error correction and noise reduction—has gained him a reputation as an innovator and leader in the field of quantum computing. He has made the front pages of the world's newspapers several times; collaborates with Murray Gell-Mann; and is now Professor of Quantum-Mechanical Engineering at MIT.

My idea was to use the platform of "The Billionaires' Dinner" and Seth's visit to announce "The Quantum Internet" but I became so caught up in the high energy of the occasion that I forgot all about it. I also forgot I had a new digital camera in my pocket and didn't take any pictures. Rather than deprive Edge readers of an inside look at the dinner, I sent the following email to the dinner guests:

"Sing for your supper!"

Instead of photos, I plan to run a text portrait. You can help out by responding to the following Edge question (a paragraph or two will do):

"Who were you sitting with? What interesting things were discussed? What did you learn?"

I can recount my own conversation with Lenny Susskind, the father of string theory, who walked in wearing a new sports jacket. I looked at the jacket admiringly, and Lenny told me a story:

"I'm going to Holland next week where I'll have an honorary professorship. Three weeks ago the host called me up and said 'please, get yourself a nice set of clothes, because you're going to meet the queen.' "

" 'Wow,' I said, 'the Queen?'"

"'Yes, the Queen. She wants to meet a physicist,' said my host."

"'That's fantastic,' I replied. 'I'm going to be a guest of honor at a dinner given by the Queen of Holland!' "

"And all of a sudden on the other end of the phone, there's silence. And he says, 'No, Lenny, you don't understand; Brian Greene is going to be the guest of honor.' "

 


SETH LLOYD

I arrived in Monterey that evening tired and sweaty: my route there from Japan had included climbing a mountain in LA that morning and I hadn't had time to change.

JB immediately tossed me in a corner of the restaurant with Sergey Brin and Larry Page, who grilled me on the potential applications of quantum computation. They were shockingly knowledgeable on the subject and quickly pushed me like a novice sumo wrestler to the edge of the ring that marks the boundary between the known and the unknown. That boundary is always closer than one thinks.

We agreed that quantum internet searches are a few years off. I had spent the afternoon in Jeff Kimble's lab at Caltech contemplating the first node of the quantum internet — a single atom trapped in an optical cavity, capable of exchanging entangled photons with any other nodes, as soon as they are brought into existence. But when the quantum internet has only one node, containing one bit, Q-Google (Quoogle?) is not yet necessary. Sergey and Larry and I noted that when it is up and running, the quantum internet should offer all sorts of wacky possibilities for quantum internet search. Searches could be made significantly more efficient, for example, by using quantum parallelism to explore every node of the quantum internet simultaneously. Problems arise, however, from the fact that quantum bits can't be cloned. I cannot go further into our discussion as that would involve proprietary information concerning quantum internet protocols (e.g., Q-TCPIP).
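The quantum parallelism Lloyd mentions is what powers Grover's algorithm, the standard quantum speedup for unstructured search. A minimal pure-Python sketch (the qubit count and marked index below are arbitrary illustrations, not anything discussed at the dinner) classically simulates the amplitude amplification:

```python
import math

def grover_search(n_qubits, marked):
    """Classically simulate Grover's algorithm on a 2**n_qubits state
    vector, returning the probability of measuring the marked item."""
    N = 2 ** n_qubits
    amp = [1 / math.sqrt(N)] * N           # uniform superposition over all items
    iterations = round(math.pi / 4 * math.sqrt(N))
    for _ in range(iterations):
        amp[marked] = -amp[marked]         # oracle: phase-flip the marked item
        mean = sum(amp) / N
        amp = [2 * mean - a for a in amp]  # diffusion: invert about the mean
    return amp[marked] ** 2

# After ~(pi/4)*sqrt(N) iterations the marked item dominates the distribution.
print(grover_search(6, marked=42))  # ≈ 0.997, versus 1/64 for a random guess
```

A classical search needs about N/2 probes on average; Grover needs only about the square root of N. As Lloyd notes, though, the no-cloning theorem constrains how such quantum amplitudes could be shipped around a network.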

Sergey broached the subject of massive entanglement and decoherence, a hot topic in quantum information these days. (Entanglement is a peculiarly quantum-mechanical effect in which a bunch of quantum systems such as atoms share more information with each other than is possible classically. Entanglement is the branch of quantum weirdness that allows quantum computers to function. Decoherence is a process that destroys entanglement. As I said, these guys were really on top of their quanta.)

We discussed recent experiments that Dave Cory and I had done at MIT, and Sergey made a rather fine suggestion for an experiment to test whether gagillions of entangled nuclear spins decohere faster than gagillions of unentangled nuclear spins. Dave and I will check it out.

At this point Jeffrey Epstein joined the conversation and demanded to know whether weird quantum effects had played a significant role in the origins of life. That question pushed me way out of the sumo ring into the deep unknown. We tried to construct a version of the question that could be answered. I was pushing my own personal theory of everything (the universe is a giant quantum computer, and to understand how things like life came into existence, we have to understand how atoms, molecules, and photons process information). Jeffrey was pushing back with his own theory (we need to understand what problem was being solved at the moment life came into being). By pushing from both sides, we managed to assemble a metaphor in which molecules divert the flow of free energy to their own recreational purposes (i.e., literally recreating themselves) somewhat in the way Jeffrey manages to divert the flow of money as it moves from time-zone to time-zone, using that money for his own recreational purposes (i.e., to create more money). I'm not saying it was the right way to describe the origins of life: I'm just saying that it was fun.

SETH LLOYD is Professor of Quantum-Mechanical Engineering, MIT.


PAUL STEINHARDT

Three of the people I spoke with at the Dinner were Alisa Volkman from Nerve, Steve Petranek of Discover, and Jeff Bezos from Amazon. Alisa and I spoke about the cosmos and film, and creativity in our respective work. Jeff is a Princeton graduate who spent his first three years as a physics major. We talked about physics and engineering at Princeton and the challenge of mentoring young people and helping them excel. Steve Petranek and I talked about dark matter, dark energy and gravity.

PAUL STEINHARDT, father of "The Cyclic Theory of the Universe", is the Albert Einstein Professor in Science and on the faculty of both the Departments of Physics and Astrophysical Sciences at Princeton University.


DANIEL DUBNO

Well, all the good-looking women were sitting at the physicists' table (go figure!) so I had to settle for sitting next to Steve Case. Later I worked the room and had the terrific luck to sit next to Martin Seligman, of UPenn, and Dean Kamen.

He can invent a self-balancing wheelchair and the Segway, but is Dean Kamen happy? That's what Marty Seligman wanted to know. At the University of Pennsylvania, Marty has been researching optimal experiences... studying what makes people happy for the last two decades. At the Edge dinner he asked Dean to describe his happiest moment. Perhaps America's most famous inventor could have mentioned any one of the "Aha!" moments in his life: when he first had the Stirling engine working, or when he saw the stents or portable dialysis machines he designed saving some of the hundreds of thousands of lives they have saved.

But, with a faraway look, Dean thought back to a company holiday he shared with his DEKA colleagues several years ago. "It was a long weekend ahead and no work was going to get done. So I just figured we'd take all the kids and their families and go to Disneyworld. We chartered a jet, had a bunch of buses, and all the families of people I work with were soon headed down the highway." But the joyous moment was not that Dean could afford to take all these people on a vacation. "So everybody was kind of hungry, and I had the three buses pull in to the takeout window at McDonalds. It was pretty funny as I ordered three hundred shakes, and hundreds of fries, and plenty of burgers. And the very young son of one of the people who works with me was sitting there happy as can be. He's eating a burger the size of his head and holding on to it for dear life. And I came over and said that looked good. And he stopped eating, and with a big smile, held out his prized burger and just offered it to me." And Dean smiled remembering a tiny moment of optimal joy.

DANIEL DUBNO is producer and technologist for CBS News in New York, where he coordinates Special Events Unit coverage of major national and international news stories.


LINDA STONE

I sat next to Jackie Bezos and Tom Reilly and across from Jeff Bezos and Kevin Kelly. The laughter was so loud, so continuous and so infectious that the conversation gently threaded its way through the laughter. We talked about Asia, physics and space, outsourcing to India and China, and a whole lot of other things that gave us endless pleasure.

It was a magical evening.

LINDA STONE is a former Apple and Microsoft executive.


DANIEL GILBERT

I had the pleasure of being seated at the end of a long table and across from Lenny Susskind, so everyone else was pretty much outside my sonic reach. Lenny told me a bit about physics, I told him a bit about psychology, and then we spent the rest of the evening talking about the many odd coincidences in our personal histories. We both had checkered pasts that included more than a little aimlessness, delinquency, truancy, bad grades, and youthful marriages. No one would have bet that we'd ever go to college, much less become professors. Indeed, Lenny and I had so much in common that the only way the waiter could tell us apart was that Lenny invented string theory and I didn't. Good thing I held back on that one, otherwise there would have been some confusion about who got the chicken curry.

DANIEL GILBERT is Professor of Psychology at Harvard University.


JEAN PAUL SCHMETZ

I sat at a table with Eva Wisten, Paul Steinhardt, Lenny Susskind, Dan Gilbert and Alisa Volkman. It was a wonderful dinner. I talked a lot about media with Eva Wisten (she's a journalist and we publish some 250 magazines). I also discovered that Lenny does not like religion at all (me neither, and I cannot remember how we got to talk about this). Later, I talked to Steven Strogatz and Stephen Petranek, but I forgot what we talked about. I remember talking to my old friend Megan Smith for a long time about Space Camp, which I plan to go to with my kids soon.

I feel very happy about having been to the dinner and at the same time a bit unhappy not having talked to more people (I guess Dan Gilbert is right about the twisted relationship between choice and happiness).

JEAN PAUL SCHMETZ is Managing Director of CyberLab Interactive Productions GmbH, a subsidiary of the Burda Media Group and a Member of the Executive Board of Burda New Media GmbH.


LEONARD SUSSKIND

Larry Page, who told me about his experiences taking a physics course from me. The psychologist Daniel Gilbert. We talked a lot about life, love and the pursuit of the ladies. I explained physics and cosmology to him, and he explained a lot of interesting psych phenomena to me. I loved it. The two young women, Eva and Alisa. We talked about you. I hope they remember more because I don't. But it was for sure the most interesting dinner company that I've had since the old days with my physicist friends Sidney Coleman, Dick Feynman and Jack Goldberg. It could become addictive.

LEONARD SUSSKIND, the father of string theory, is Felix Bloch Professor in theoretical physics at Stanford University.


STEVEN STROGATZ

Steve Petranek, the editor of Discover magazine, was sitting on my right. Rodney Brooks, the MIT artificial intelligence researcher who makes little insect-like robots, was on my left. Both are fun and easy to be around. I always like to hear people's life stories, and without too much effort, I managed to get both of them to tell how they got to where they are today. Petranek told of his days as a cub reporter at various small newspapers, covering all sorts of different areas, from finance to energy (where he did some investigative reporting and once broke a story about some shenanigans at a nuclear power plant, if I remember right). Brooks told charming stories of his days as a kid in Australia, playing out in his shed in the backyard, trying to build computers and other contraptions from spare parts and assorted junk, and nearly electrocuting himself or blowing himself up from time to time. It made me think about the importance of tinkering and fooling around.

It was a real treat—a night to cherish.

STEVEN STROGATZ is an applied mathematician at Cornell University and the author of Sync: The Emerging Science of Spontaneous Order.


CHRIS W. ANDERSON

After reading of the quantum depths to which Sergey and Larry took Seth, I'm ashamed to recount that I spent most of my dinner asking Alan Guth beginner's questions about quantum communications. He patiently explained the pros and cons of electron vs. photon methods, and the difference between truly encrypted communications and those that simply reveal if they've been tapped by a third party.

Given that this isn't even his field, it was a virtuoso performance of clarity and deduction from first principles. Even better, he then followed up a day or two later with this email:

As I was leaving Monterey I met Robert Gelfond, the CEO of MagiQ Technologies Inc., which is in the business of quantum encryption.

It turned out that almost all of my guesses were right. The currently working systems are not what I would call a true quantum encryption device, which disguises each bit by flipping it or not flipping it according to the spin of an entangled particle. Instead they are quantum intruder-detection devices, which send photons on a light tube. The signal is mixed with a stream of photons with a predetermined pattern of polarizations, which are then verified at the other end. Since an intruder cannot measure the polarization of a single photon, he cannot intercept photons and retransmit them in an identical polarization state. I think Robert said that they can send photons up to 50 km with complete security, and up to about 100 km with security that is safe as long as the intruder is limited to present technology. If one wants to go further, one must send the signal in steps of this length, with a secure box at each step which receives the message and retransmits it. There is no quantum algorithm that can detect an intruder who breaks open this box, so it must be secured by ordinary means.

The one point that I didn't foresee is that apparently it is not practical to send all bits by this method. Instead they use the protected photon signal only to distribute frequently changing encryption keys. Then the signal is transmitted separately, using ordinary transmission lines and ordinary encryption, such as perhaps DES. As long as the key is changed frequently, this is regarded as safe.
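The intruder-detection scheme Guth describes is in the spirit of the BB84 quantum key distribution protocol. A toy simulation (a sketch, assuming an intercept-and-resend eavesdropper and an otherwise noiseless channel; the function and its parameters are illustrative, not MagicQ's actual system) shows why measuring single photons gives the intruder away:

```python
import random

def bb84_error_rate(n_photons, eavesdrop, seed=0):
    """Simulate sifted-key errors in a BB84-style polarization scheme.
    An intercept-and-resend eavesdropper must guess each photon's basis
    and, half the time, disturbs it, producing ~25% sifted-key errors."""
    rng = random.Random(seed)
    errors = kept = 0
    for _ in range(n_photons):
        bit = rng.randint(0, 1)
        basis = rng.randint(0, 1)             # 0 = rectilinear, 1 = diagonal
        sent_bit, sent_basis = bit, basis
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            if eve_basis != basis:            # wrong basis randomizes the bit
                sent_bit = rng.randint(0, 1)
                sent_basis = eve_basis        # Eve re-sends in her basis
        recv_basis = rng.randint(0, 1)
        if recv_basis != basis:
            continue                          # sifting: discard mismatched bases
        measured = sent_bit if recv_basis == sent_basis else rng.randint(0, 1)
        kept += 1
        errors += measured != bit
    return errors / kept

print(bb84_error_rate(20000, eavesdrop=False))  # → 0.0
print(bb84_error_rate(20000, eavesdrop=True))   # ≈ 0.25: the intruder shows up
```

Comparing a random sample of the sifted bits over an ordinary channel reveals the elevated error rate, which is exactly the verification step Guth's email describes.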

Most rewarding dinner conversation I've had for ages!

CHRIS W. ANDERSON is Editor-in-Chief of Wired.


ALAN GUTH, father of the inflationary theory of the Universe, is Victor F. Weisskopf Professor of Physics at MIT; author of The Inflationary Universe.


STEVE PETRANEK

I sat next to Steve Strogatz, the Cornell mathematician who wrote one of my favorite books, Sync. He was to my left. To my right was the lyrical, mystical and charming Ariane de Bonvoisin. I sat across from Nancy Etcoff, the research psychologist from Harvard Medical School who has actually defined the word happiness in such depth that it continues to astound me. I kept trying to get some free shrinking from Nancy but she effortlessly shifted the subject of my childhood back to her present research, which was nearly as fascinating.

When others swept Nancy's attention away I turned to Ariane, who unfortunately knows how to work a room full of smart people and kept disappearing. However, I learned enough about her project to publish books that get people through the first 30 days of a crisis (divorce, death of a spouse, realization there isn't a Santa Claus) to know that I'd probably buy each and every one of them even if the crisis didn't match my circumstances (one always has friends in crisis).

Strogatz pulled a reversal by interviewing me before I got to interview him. His descriptions of fireflies along riverbanks syncing up their flashes were far more mesmerizing in person (and after a couple of glasses of wine) than they are in the book. Of course, I did my best to stay as far away from Dan Dubno as I could, which turned out to be fairly easy because he was toadying up to Steve Case all night.

As the soiree ended, physicist Paul Steinhardt asked me a few pointed questions about my definition of gravity as I had referred to it in my TED presentation. Besides being embarrassed by my inability to be as clear as a cosmologist can be on what holds us down to this planet, I was fascinated to be led by him to the idea that gravity might be quite different on at least four planes: gravity in very large circumstances, like everything in the universe flying away from everything else at an accelerated pace; gravity on the scale I'm used to as I move across the Earth; gravity in very tiny (quantum) circumstances; and gravity under intense pressure and heat circumstances. All of which made me sorry that dinner came to an end.

STEVE PETRANEK is editor in chief of Discover.


Attendees: Pam Alexander, Alexander Ogilvy; Chris Anderson, TED; Chris Anderson, Wired; Jeff Bezos, amazon.com; Jackie Bezos, amazon.com; Adam Bly, Seed; Stewart Brand, Long Now Foundation; Sergey Brin, Google; Patti Brown, New York Times; Steve Case; Mihaly Csikszentmihalyi, Claremont; Steffi Czerny, Burda Media; Susan Dawson, Sapling Foundation; Ariane De Bonvoisin; Dan Dubno, CBS News; Jeffrey Epstein, Epstein Assoc.; Nancy Etcoff, Harvard Medical School; Daniel Gilbert, Harvard; Alan Guth, MIT; Katrina Heron; Kevin Kelly, Wired; Seth Lloyd, MIT; Pam Omidyar, Omidyar Foundation; Pierre Omidyar, eBay; Larry Page, Google; Steve Petranek, Discover; Ryan Phelan, DNA Direct; Tom Rielly, TED; Forrest Sawyer, MSNBC; Eric Schmidt, Google; Martin Seligman, UPenn; Megan Smith, Google; Paul Steinhardt, Princeton; Cyndi Stivers, Time Out New York; Linda Stone; Steven Strogatz, Cornell; Leonard Susskind, Stanford; Kara Swisher, Wall Street Journal; Yossi Vardi, ICQ; Alisa Volkman, Nerve; Eva Wisten, Bon Magazine; Michael Wolff, Vanity Fair


Edge Dinners
Event Date: [ 2.27.03 ]
Location:
United States

Seminars
Event Date: [ 7.21.02 ]
Location:
United States

The metaphors of information processing and computation are at the center of today's intellectual action. A new and unified language of science is beginning to emerge.

 

Participants:

Seth Lloyd: The Computational Universe
Paul Steinhardt: The Cyclic Universe
Alan Guth: The Inflationary Universe
Marvin Minsky: The Emotion Universe
Ray Kurzweil: The Intelligent Universe

On July 21, Edge held an event at Eastover Farm which included the physicists Seth Lloyd, Paul Steinhardt, and Alan Guth, the computer scientist Marvin Minsky, and the technologist Ray Kurzweil. This year, I noted, there are a lot of "universes" floating around: Seth Lloyd, the computational universe (or, if you prefer, the it-and-bit, itty-bitty universe); Paul Steinhardt, the cyclic universe; Alan Guth, the inflationary universe; Marvin Minsky, the emotion universe; Ray Kurzweil, the intelligent universe. I asked each of the speakers to comment on their "universe". All, to some degree, were concerned with information processing and computation as central metaphors. See below for links to their talks and streaming video.

Concepts of information and computation have infiltrated a wide range of sciences, from physics and cosmology to cognitive psychology, evolutionary biology, and genetic engineering. Such innovations as the binary code, the bit, and the algorithm have been applied in ways that reach far beyond the programming of computers, and are being used to understand such mysteries as the origins of the universe, the operation of the human body, and the working of the mind.

What's happening in these new scientific endeavors is truly a work in progress. A year ago, at the first REBOOTING CIVILIZATION meeting in July, 2001, physicists Alan Guth and Brian Greene, computer scientists David Gelernter, Jaron Lanier, and Jordan Pollack, and research psychologist Marc D. Hauser could not reach a consensus about exactly what computation is, when it is useful, when it is inappropriate, and what it reveals. Reporting on the event in The New York Times ("Time of Growing Pains for Information Age", August 7, 2001), Dennis Overbye wrote:

Mr. Brockman said he had been inspired to gather the group by a conversation with Dr. Seth Lloyd, a professor of mechanical engineering and quantum computing expert at M.I.T. Mr. Brockman recently posted Dr. Lloyd's statement on his Web site, www.edge.org: "Of course, one way of thinking about all of life and civilization," Dr. Lloyd said, "is as being about how the world registers and processes information. Certainly that's what sex is about; that's what history is about.

Humans have always tended to try to envision the world and themselves in terms of the latest technology. In the 17th and 18th centuries, for example, workings of the cosmos were thought of as the workings of a clock, and the building of clockwork automata was fashionable. But not everybody in the world of computers and science agrees with Dr. Lloyd that the computation metaphor is ready for prime time.

Several of the people gathered under the maple tree had come in the hopes of debating that issue with Dr. Lloyd, but he could not attend at the last moment. Others were drawn by what Dr. Greene called "the glimmer of a unified language" in which to talk about physics, biology, neuroscience and other realms of thought. What happened instead was an illustration of how hard it is to define a revolution from the inside.

Indeed, exactly what computation and information are continue to be subjects of intense debate. But less than a year later, in the "Week In Review" section of the Sunday New York Times ("What's So New In A Newfangled Science?", June 16, 2002) George Johnson wrote about "a movement some call digital physics or digital philosophy — a worldview that has been slowly developing for 20 years."...

Just last week, a professor at the Massachusetts Institute of Technology named Seth Lloyd published a paper in Physical Review Letters estimating how many calculations the universe could have performed since the Big Bang — 10^120 operations on 10^90 bits of data, putting the mightiest supercomputer to shame. This grand computation essentially consists of subatomic particles ricocheting off one another and "calculating" where to go.

As the researcher Tommaso Toffoli mused back in 1984, "In a sense, nature has been continually computing the `next state' of the universe for billions of years; all we have to do — and, actually, all we can do — is `hitch a ride' on this huge ongoing computation."

This may seem like an odd way to think about cosmology. But some scientists find it no weirder than imagining that particles dutifully obey ethereal equations expressing the laws of physics. Last year Dr. Lloyd created a stir on Edge.org, a Web site devoted to discussions of cutting edge science, when he proposed "Lloyd's hypothesis": "Everything that's worth understanding about a complex system can be understood in terms of how it processes information."*....

[*See "Seth Lloyd: How Fast, How Small, and How Powerful: Moore's Law and the Ultimate Laptop"]
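Lloyd's figure of roughly 10^120 operations follows from the Margolus-Levitin bound (a system of energy E can perform at most 2E/πħ elementary operations per second) applied to the mass-energy and age of the observable universe. A back-of-the-envelope sketch, using round-number inputs that are my assumptions rather than Lloyd's exact ones:

```python
import math

hbar = 1.05e-34   # reduced Planck constant, J*s
c = 3.0e8         # speed of light, m/s
mass = 1e53       # rough mass within the horizon, kg (assumed round number)
age = 4.3e17      # age of the universe in seconds (~13.7 billion years)

energy = mass * c ** 2
# Margolus-Levitin bound: at most 2E/(pi*hbar) operations per second.
ops_per_second = 2 * energy / (math.pi * hbar)
total_ops = ops_per_second * age
# Lands within an order of magnitude of the 10^120 quoted above.
print(f"~10^{math.log10(total_ops):.0f} operations")
```

The companion figure of 10^90 bits comes from a similar counting argument, based on the thermodynamic entropy of the matter and radiation within the horizon.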

Dr. Lloyd did indeed cause a stir when his ideas were presented on Edge in 2001, but George Johnson's recent New York Times piece caused an even greater stir, as Edge received over half a million unique visits the following week, a strong confirmation that something is indeed happening here. (Usual Edge readership is about 60,000 unique visitors a month). There is no longer any doubt that the metaphors of information processing and computation are at the center of today's intellectual action. A new and unified language of science is beginning to emerge.


THE COMPUTATIONAL UNIVERSE: SETH LLOYD [9.19.02]

Every physical system registers information, and just by evolving in time, by doing its thing, it changes that information, transforms that information, or, if you like, processes that information. Since I've been building quantum computers I've come around to thinking about the world in terms of how it processes information.

 

SETH LLOYD is Professor of Mechanical Engineering at MIT and a principal investigator at the Research Laboratory of Electronics. He is also adjunct assistant professor at the Santa Fe Institute. He works on problems having to do with information and complex systems, from the very small (how do atoms process information? how can you make them compute?) to the very large: how does society process information, and how can we understand society in terms of its ability to process information?

His seminal work in the fields of quantum computation and quantum communications — including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon's noisy channel theorem, and designing novel methods for quantum error correction and noise reduction — has gained him a reputation as an innovator and leader in the field of quantum computing. Lloyd has been featured widely in the mainstream media including the front page of The New York Times, The LA Times, The Washington Post, The Economist, Wired, The Dallas Morning News, and The Times (London), among others. His name also frequently appears (both as writer and subject) in the pages of Nature, New Scientist, Science and Scientific American.


THE CYCLIC UNIVERSE: PAUL STEINHARDT [9.16.02]

...in the last year I've been involved in the development of an alternative theory that turns the cosmic history topsy-turvy. All the events that created the important features of our universe occur in a different order, by different physics, at different times, over different time scales—and yet this model seems capable of reproducing all of the successful predictions of the consensus picture with the same exquisite detail.


THE INFLATIONARY UNIVERSE: ALAN GUTH [9.16.02]

Inflationary theory itself is a twist on the conventional Big Bang theory. The shortcoming that inflation is intended to fill in is the basic fact that although the Big Bang theory is called the Big Bang theory it is, in fact, not really a theory of a bang at all; it never was.


THE EMOTION UNIVERSE: MARVIN MINSKY [9.16.02]

To say that the universe exists is silly, because it's saying that the universe is one of the things in the universe. There's something wrong with that idea. If you carry that a little further, then it doesn't make any sense to have a predicate like, "Where did the universe come from?" or "Why does it exist?"


THE INTELLIGENT UNIVERSE: RAY KURZWEIL [9.16.02]

The universe has been set up in an exquisitely specific way so that evolution could produce the people that are sitting here today and we could use our intelligence to talk about the universe. We see a formidable power in the ability to use our minds and the tools we've created to gather evidence, to use our inferential abilities to develop theories, to test the theories, and to understand the universe at increasingly precise levels.


 

WHICH UNIVERSE WOULD YOU LIKE?

Five stars of American science meet in Connecticut to explain first and last things.

By Jordan Mejias

August 28, 2002

 

They begin a free-floating debate, which drives them back and forth across the universe. Guth encourages the exploration of black holes, not to be confused with cosmic wormholes, which Kurzweil — just like the heroes of Star Trek — wants to use as a shortcut for his intergalactic excursions and as a means of overtaking light. Steinhardt suggests that we should realize that we are not familiar with most of what the cosmos consists of and do not understand its greatest force, dark matter. Understand? There is no such thing as a rational process, Minsky objects; it is simply a myth. In his cosmos, emotion is a word we use to circumscribe another form of our thinking that we cannot yet conceive of. Emotion, Kurzweil interrupts, is a highly intelligent form of thinking. "We have a dinner reservation at a nearby country restaurant," says Brockman in an emotionally neutral tone.

Edge Dinners
Event Date: [ 2.21.02 ]
Location:
Monterey, CA
United States

This media life
My Dinner with Rupert

How do you get the king of all media to break bread with you when — truth be told — you've often said unkind things about him? Michael Wolff on his precious moments with the gossipy mogul.
By Michael Wolff

In an unlikely turn of events — and thanks to some shameless maneuvering to achieve (and protect) proximity — our Murdoch-deconstructing media columnist breaks bread with the man himself.

Rupert kept talking. He grew more expansive, more conspiratorial, even (although it did seem like he'd conspire with anyone), his commentary more intimate. We proposed that he come with us to the dinner we were scheduled to go to — John Brockman's Billionaire's dinner, a TED ritual...


"The TED Conference: 3 Days in the Future"
By Patricia Leigh Brown
February 28, 2002
(free registration required)

MONTEREY, Calif., Feb. 23 — What preternatural power can prompt Rupert Murdoch, Jeffrey Katzenberg, Richard Dawkins, Neil Simon, Art Buchwald, Frank Gehry and Quincy Jones to sit for hours in a hot room contemplating the nano-sized split ends on gecko toes? ...

...Where else but at TED would Mr. Katzenberg, standing Armani-deep in sawdust with Spirit, his stallion and the namesake of his new animated film, be upstaged by Rex, a biologically inspired robot with springy legs and gecko-like feet capable of navigating the outer reaches of the Amazon — specifically, the leg of the Amazon.com founder, Jeff Bezos, a longtime Tedster?

It can get deep. Very deep. Steven Pinker, the eminent cognitive psychologist, found himself deep in conversation with the singer Naomi Judd about the role of the amygdala, the part of the brain that colors memory with emotion; something, he aptly noted, "that would not happen at the meeting of the Cognitive Neuroscience Society."

It happened here one night last week over chicken and polenta at the annual private dinner, given by the New York literary agent John Brockman, formerly called the Millionaires' and Billionaires' Dinner after the rich techies who traditionally flocked to TED. There were still a few members of that endangered species scattered about, among them Nathan Myhrvold, the retired Microsoft chief technology officer, who gave an electrifying discourse at the 1997 TED about dinosaur sex...

 

Seminars
Event Date: [ 9.10.01 ]
Location:
United States

Everything is up for grabs. Everything will change. There is a magnificent sweep of intellectual landscape right in front of us.

One aspect of our culture that is no longer open to question is that the most significant developments in the sciences today (i.e. those that affect the lives of everybody on the planet) are about, informed by, or implemented through advances in software and computation. This Edge event presented an opportunity for people in the various fields of computer science, cosmology, cognition, evolutionary biology, etc., to begin talking to each other, to become aware of interesting and important work in other fields.

Participants:

Marc D. Hauser Lee Smolin Brian Greene Jaron Lanier Jordan Pollack David Gelernter Alan Guth

HAUSER, SMOLIN, GREENE, LANIER, POLLACK, GELERNTER, GUTH at the Edge "REBOOTING CIVILIZATION" meeting at Eastover Farm. Opening comments [12,000 words] and streaming video. 


Software and computation are reinventing the civilized world — "rebooting civilization," in the words of David Gelernter. "It's a software-first world," notes Stanford AI expert Edward Feigenbaum, chief scientist of the U.S. Air Force in the mid-nineties. "It's not a mistake that the world's two richest men are pure software plays. Or that the most advanced fighter planes in the U.S. Air Force are bundles of software wrapped in aluminum shells, or that the most advanced bomber is run by computers and cannot be flown manually." Everybody in business today is in the software business. But what comes after software?

Experimental psychologist Steven Pinker speaks of "a new understanding that the human mind is a remarkably complex processor of information." To Pinker, our minds are "organs of computation." To philosopher Daniel C. Dennett, "the basic idea of computation, as formulated by the mathematicians John von Neumann and Alan Turing, is in a class by itself as a breakthrough idea." Dennett asks us to think about the idea that what we have in our heads is software, "a virtual machine, in the same way that a word processor is a virtual machine." Pinker and Dennett are talking about our mental life in terms of the idea of computation, not simply proposing the digital computer as a metaphor for the mind. Other scientists such as physicist Freeman Dyson disagree, but most recognize that these are big questions.

Physicist David Deutsch, a pioneer in the development of the quantum computer, points out that "the chances are that the technological implications of quantum computers, though large by some standards, are never going to be the really important thing about them. The really important thing is the philosophical implications, epistemological and metaphysical. The largest implication, from my point of view, is the one that we get right from the beginning, even before we build the first quantum computer, before we build the first qubit. The very theory of quantum computers already forces upon us a view of physical reality as a multiverse."

Computer scientist and AI researcher Rodney Brooks is puzzled that "we've got all these biological metaphors that we're playing around with — artificial immunology systems, building robots that appear lifelike — but none of them come close to real biological systems in robustness and in performance. They look a little like it, but they're not really like biological systems." Brooks worries that in looking at biological systems we are missing something that is already there — that has always been there. 
To Brooks, this might be called "the essence of life," but he is talking about a biochemical phenomenon, not a metaphysical one. Brooks is searching for a new conceptual framework that, like computation, does not involve any new physics or chemistry — a framework that gives us a different way of thinking about the stuff that's there. "We see the biological systems, we see how they operate," he says, "but we don't have the right explanatory modes to explain what's going on and therefore we can't reproduce all these sorts of biological processes. That to me right now is the deep question."


— JB


MARC HAUSER

Some of the problems that we've been dealing with in the neurosciences and the cognitive sciences concern the initial state of the organism. What do animals, including humans, come equipped with? What are the tools that they have to deal with the world as it is? There's somewhat of an illusion in the neurosciences that we have really begun to understand how the brain works. That point was put quite nicely in a recent talk by Noam Chomsky. The title of the talk was "Language and the Brain."

Everybody was very surprised to hear him mention the brain, since he has mostly referred to the mind. The talk was a warning to the neuroscientists about how little we know, especially when it comes to understanding how the brain actually does language. Here's the idea Chomsky played with, which I think is quite right. Let's take a very simple system that is actually very good at a kind of computation: the honey bee. Here is this very little insect, tiny little brain, simple nervous system, that is capable of transmitting information about where it's been and what it's eaten to a colony, and that information is sufficiently precise that the colony members can go find the food. We know that that kind of information is encoded in the signal because people in Denmark have created a robotic honey bee that you can plop in the middle of a colony, programmed to dance in a certain way, and the hive members will actually follow the information precisely to that location. Researchers have been able to understand the information processing system to this level, and consequently can actually transmit it through the robot to other members of the hive. But when you step back and ask what we know about how the brain of a honey bee represents that information, the answer is: we know nothing. Thus, our understanding of the way in which a bee's brain represents its dance, its language, is quite poor. And this lack of understanding comes from the study of a relatively simple nervous system, especially when contrasted with the human nervous system.

So the point that Chomsky made, which I think is a very powerful one, and not that well understood, is that what we actually know about how the human brain represents language is at some level very trivial. That's not to say that neuroscientists haven't made quite a lot of impact on, for example, what areas of the brain when damaged will wipe out language. For example, we know that you can find patients who have damage to a particular part of the brain that results in the loss of representations for consonants, while other patients have damage that results in the loss of representations for vowels.

But we know relatively little about how the circuitry of the brain represents the consonants and vowels. The chasm between the neurosciences today and understanding representations like language is very wide. It's a delusion that we are going to get close to that any time soon. We've gotten almost nowhere in understanding how the bee's brain represents the simplicity of the dance language. Although any good biologist, after several hours of observation, can predict accurately where the bee is going, we currently have no understanding of how the brain actually performs that computation.

The reason there have been some advances in the computational domain is that there have been a lot of systems where the behavior showcases what the problem truly is, ranging from echolocation in bats to long-distance navigation in birds. For humans, Chomsky's insights into the computational mechanisms underlying language really revolutionized the field, even though not all would agree with the approach he has taken. Nonetheless, the fact that he pointed to the universality of many linguistic features, and the poverty of the input for the child acquiring language, suggested that an innate computational mechanism must be at play. This insight revolutionized the field of linguistics, and set much of the cognitive sciences in motion. That's a verbal claim, and as Chomsky himself would quickly recognize, we really don't know how the brain generates such computation.

One of the interesting things evolution has been telling us more and more is that even though evolution has no direction, you can see, for example, within the primates that a part of the brain that actually stores the information for a representation, the frontal lobes, has undergone quite a massive change over time. So you have systems like the apes, who probably don't have the neural structures that would allow them to do the kind of computations you need for language processing. In our own work we've begun to look at the kinds of computations that animals are capable of, as well as the kinds of computations that human infants are capable of, to try to see where the constraints lie.

Whenever nature has created systems that seem to be open-ended and generative, they've used some kind of system with a discrete set of recombinable elements. The question you can begin to ask in biology is, what kind of systems are capable of those kinds of computational processes. For example, many organisms seem to be capable of quite simple statistical computations, such as conditional probabilities that focus on local dependencies: if A, then B. Lots of animals seem capable of that. But when you step up to the next level in the computational hierarchy, one that requires recursion, you find great limitations both among animals and human infants. For example, an animal that can do if A then B, would have great difficulty doing if A to the N, then B to the N. We now begin to have a loop. If animals lack this capacity, which we believe is true, then we have identified an evolutionary constraint; humans seem to have evolved the capacity for recursion, a computation that liberated us in an incredible way.
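The computational contrast Hauser draws, local "if A, then B" dependencies versus the "A to the N, then B to the N" pattern, can be sketched with a toy recognizer. The function names and the two-letter alphabet below are illustrative choices of mine, not from the talk: the local dependency is a regular pattern that a finite-state machine (here, a regex) can check symbol by symbol, while matching counts of As and Bs requires keeping a count, which is exactly what finite-state systems cannot do for unbounded N.

```python
import re

def matches_local_dependency(s):
    """Local 'if A, then B' dependency: every 'a' is immediately followed
    by a 'b'. This is a regular pattern, so a finite-state machine (a regex
    engine) suffices to recognize it."""
    return re.fullmatch(r"(ab)*", s) is not None

def matches_a_n_b_n(s):
    """The pattern a^n b^n requires matching counts of a's and b's, which
    no finite-state machine can track for unbounded n; a counter (or stack,
    i.e. recursion) is needed."""
    n = len(s) // 2
    return s == "a" * n + "b" * n

assert matches_local_dependency("ababab")      # local pairs: fine
assert not matches_local_dependency("aabb")    # counts match, pairs don't
assert matches_a_n_b_n("aaabbb")               # a^3 b^3: needs counting
assert not matches_a_n_b_n("ababab")           # pairs, but not a^n b^n
```

The point of the sketch is that the two recognizers need different machinery: the first never counts, the second must.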

It allows us to do mathematics as well as language. And this system of taking discrete or particulate elements and recombining them is what gives genetics and chemistry their open-ended structure. Given this pattern, an interesting question then is: what were the selective pressures that led to the evolution of a recursive system? Why is it that humans seem to be the only organisms on the planet, the only natural system, that has this capacity? What were the pressures that created it? Thinking about things like artificial intelligence, what would be the kinds of pressures on an artificial system that would get it to that end point?

An interesting problem for natural biological systems as well as artificial systems is whether the two can meet, to try to figure out what kinds of pressures lead to a capacity for recursion, what are the building blocks that must be in place for the system to evolve? Comparative biology doesn't provide any helpful hints at present because we simply have two end points, humans that do it, and other organisms that don't. At this point in time, therefore, this evolutionary transition is opaque.

MARC D. HAUSER, a cognitive neuroscientist, is a professor in the departments of Psychology and the Program in Neurosciences at Harvard, where he is also a fellow of the Mind, Brain, and Behavior Program. He is the author of The Evolution of Communication, The Design of Animal Communication (with M. Konishi), and Wild Minds: What Animals Really Think.


LEE SMOLIN

As a theoretical physicist, my main concern is space, time and cosmology. The metaphor about information and computation is interesting. There are some people in physics who have begun to talk as if we all know that what's really behind physics is computation and information, who find it very natural to say things like anything that's happening in the world is a computation, and all of physics can be understood in terms of information. There's another set of physicists who have no idea what those people are talking about. And there's a third set — and I'm among them — who begin by saying we have no idea what you're talking about, but we have reasons why it would be nice if it was useful to talk about physics in terms of information.

I can mention two ways in which the metaphor of information and computation may be infiltrating into our thinking about fundamental physics, although we're a long way from really understanding these things. The first is that the mathematical metaphor and the conceptual metaphor of a system of relationships which evolves in time is something which is found in physics. It is also something that we clearly see when we talk to computer scientists and biologists and people who work on evolutionary theory, that they tend to model their systems in terms of networks where there are nodes and there are relationships between the nodes, and those things evolve in time, and they can be asking questions about the time evolution, what happens after a long time, what are the statistical properties of subsystems.

That kind of idea came into physics a long time ago with relativity theory and general relativity. The idea that all the properties of interest are really about relationships between things and not a relationship between some thing and some absolute fixed background that defines what anything means is an important idea and an old idea in physics. In classical general relativity, one sees the realization of the idea that all the properties that we observe are about relationships. Those of us who are interested in quantum gravity are thinking a lot about how to bring that picture, in which the world is an evolving network of relationships, into quantum physics.

And there are several different aspects of that. There are very interesting ideas around but they're in the stage of interesting ideas, interesting models, interesting attempts — it is science in progress.

That's the first thing. To the extent that our physics turns out to look like a network of relationships evolving in time, physics will look like the kind of system that computational people, or biologists using the computational metaphor, may be studying. Part of that is the question of whether nature is really discrete — whether, underlying the continuous notion of space and time, there's really some discrete structure. From different points of view, when we work on quantum gravity we find evidence that space and time are really discrete and are really made up of processes which may have some discrete character. But again, this is something in progress.

One piece of evidence that nature is discrete is something called the holographic principle. This leads some of us physicists to use the word information even when we don't really know what we're talking about, but it is interesting and worth exposing. It comes from an idea called the Bekenstein Bound, a conjecture of Jacob Bekenstein for which there is more and more theoretical evidence. The Bekenstein Bound says that if I have a surface and I'm making observations on that surface — that surface could be my retina, or it could be some screen in front of me through which I observe the world — then at any one moment there's a limitation to the amount of information that could be observed on that screen.

First of all that amount of information is finite, and it's four bits of information per Planck area of the screen, where a Planck area is 10 to the minus 66 centimeters squared. And there are various arguments that if that bound were to be exceeded, in a world where there is relativity and black holes, then we would violate the Second Law of Thermodynamics. Since none of us wants to violate the Second Law of Thermodynamics, I think it's an important clue, and it says something important about the underlying discreteness of nature. It also suggests that information, although we don't know what information is, may have some fundamental place in physics.
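As a back-of-the-envelope sketch of the numbers just quoted: the constants and function below are illustrative, using the talk's figures as stated (four bits per Planck area, with the Planck area of order 10 to the minus 66 square centimeters; textbook statements of the bound differ in the order-one factors and conventions). The point is only that the bound is finite yet astronomically large even for an everyday surface.

```python
# Figures as quoted in the talk above (order-of-magnitude only).
PLANCK_AREA_CM2 = 1e-66       # Planck area, ~10^-66 cm^2
BITS_PER_PLANCK_AREA = 4.0    # capacity per Planck area, as stated

def max_bits_on_screen(area_cm2):
    """Upper bound on the information observable through a surface of the
    given area, per the Bekenstein-style bound described in the text."""
    return BITS_PER_PLANCK_AREA * area_cm2 / PLANCK_AREA_CM2

# Even a 1 cm^2 "screen" bounds a finite, though enormous, number of bits:
print(f"{max_bits_on_screen(1.0):.1e} bits")
```

A bound of roughly 4 × 10^66 bits per square centimeter is far beyond anything observable in practice, which is why the interesting content of the conjecture is its finiteness, not its size.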

The holographic principle, of which there are several versions by different people — the idea was invented by Dutch theoretical physicist Gerard 't Hooft — is that the laws of physics should be rewritten, or could be rewritten, including dynamics, how things evolve in time, so that we're no longer talking about things happening out there in the world in space; we're talking about representing systems that we observe in terms of the information as it evolves on the screen. The metaphor is that there's a screen through which we're observing the world. There are various claims that this idea is realized, at least partly, in several different versions of string theory or quantum gravity. This is an idea there's a lot of interest in, but we really don't know whether it can be realized completely or not.

One extreme form of it, which I like, is that perhaps the way to read the Bekenstein Bound is not that there are two different things, geometry and the flow of information, with a law that relates them, but that somehow we could try to envision the world as one of these evolving networks. What happens is a set of processes in which "information", whatever information is, flows from event to event, and geometry is defined by saying that the measure of the information capacity of some channel by which information is flowing, from the past to the future, would be the area of a surface. Geometry, that is, space, would turn out to be a derived quantity, like temperature or density; just as temperature is a measure of the average energy of some particles, the area of a surface would turn out to be an approximate measure of the capacity of some channel, and the world would fundamentally be information flow. It's an idea that some of us like to play with, but we have not yet constructed physics on those grounds, and it's not at all clear that it will work. This is a transition to a computational metaphor in physics — it's something which is in progress, and may or may not happen.

LEE SMOLIN, a theoretical physicist, is a founding member and research physicist at the Perimeter Institute in Waterloo, Canada. He is the author of The Life of the Cosmos and Three Roads to Quantum Gravity.


BRIAN GREENE

Physics and everything we know in the world around us may really be tied to processes whose fundamental existence is not here around us, but rather exists in some distant bounding surface like some thin hologram, which by virtue of illuminating it in the right way can reproduce what looks like a 3-dimensional world. Perhaps our three dimensional world is really just a holographic illumination of laws that exist on some thin bounding slice, like that thin little piece of plastic, that thin hologram. It's an amazing idea, and I think is likely to be where physics goes in the next few years or in the next decade, at least when one's talking about quantum gravity or quantum string theory.

[Opening comments to come.]

BRIAN GREENE, professor of physics and of mathematics at Columbia University, is widely regarded for a number of groundbreaking discoveries in superstring theory. He is the author of The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory.


JARON LANIER

One of the striking things about being a computer scientist in this age is that all sorts of other people are happy to tell us that what we do is the central metaphor of everything, which is very ego-gratifying. We hear from various quarters that our work can serve as the best way of understanding - if not in the present, then any minute now, because of Moore's law - everything from biology to the economy to aesthetics, child-rearing, sex, you name it. I have found myself being critical of what I view as this overuse of the computational metaphor. My initial motivation was that I thought there was naive and poorly constructed philosophy at work. It's as if these people had never read philosophy at all and had no sense of epistemological or other problems.

Then I became concerned for a different reason, which was pragmatic and immediate: I became convinced that the overuse of the computational metaphor was actually harming the quality of present-day computer system design. One example: the belief that people and computers are similar - the artificial intelligence mindset - has a tendency to create systems that are naively and overly automated. Think of the Microsoft word processor that attempts to retype what you've just typed. The agenda of making computers into people is treated as so important that jumping the gun has to be for the greater good, even if it makes the current software stupid.

There's a third reason to be suspicious of the overuse of computer metaphors, and that is that it leads us by reflection to have an overly simplistic view of computers. The particular simplification of computers I'm concerned with is imagining that Moore's Law applies to software as well as hardware. More specifically, that Moore's Law applies to things that have to have complicated interfaces with their surroundings as opposed to things that have simple interfaces with their surroundings, which I think is the better distinction.

Moore's Law is truly an overwhelming phenomenon; it represents the greatest triumph of technology ever, the fact that we could keep on this track that was predicted for all these many years and that we have machines that are a million times better than they were at the dawn of our work, which was just a half century ago. And yet during that same period of time our software has really not kept pace. In fact not only could you argue that software has not improved at the same rate as hardware, you could even argue that it's often been in retrograde. It seems to me that our software architectures have not even been able to maintain their initial functionality as they've scaled with hardware, so that in effect we've had worse and worse software. Most people who use personal computers can experience that effect directly, and it's true in most situations.

But I want to emphasize that the real distinction that I see is between systems with simple interfaces to their surroundings and systems with complex interfaces. If you want to have a fancy user interface and you run a bigger thing it just gets awful. Windows doesn't scale.

One question to ask is, why does software suck so badly? There are a number of answers to that. The first thing I would say is that I have absolutely no doubt that David Gelernter's framework of streams is fundamentally and overwhelmingly superior to the basis on which our current software is designed. The next question is, is that enough to cause it to come about? It really becomes a competition between good taste and good judgment on the one hand, and legacy and corruption on the other - which are effectively two words for the same thing. What happens with software systems is that the legacy effects end up being the overwhelming determinants of what can happen next as the systems scale.

For instance, there is the idea of the computer file, which was debated up until the early 80s. There was an active contingent that thought that the idea of the file wasn't a good thing and we should instead have a massive distributed database with a micro-structure of some sort. The first (unreleased) version of the Macintosh did not have files. But Unix jumped the fence from the academic to the business world and it had files, and Macintosh ultimately came out with files, and the Microsoft world had files, and basically everything has files. At this point, when we teach undergraduates computer science, we do not talk about the file as an invention, but speak of it as if it were a photon, because in effect it is more likely to still be around in 50 years than the photon.

I can imagine physicists coming up with some reasons not to believe in photons any more, but I cannot imagine any way that we can tell you not to believe in files. We are stuck with the damn things. That legacy effect is truly astonishing, the sort of non-linearity of the costs of undoing decisions that have been made. The remarkable degree to which the arrow of time is amplified in software development in its brutalness is extraordinary, and perhaps one of the things that really distinguishes software from other phenomena.

Back to the physics for a second. One of the most remarkable and startling insights in 20th century thought was Claude Shannon's connection of information and thermodynamics. For all of these years working with computers I've been looking at these things and thinking, "Are these bits the same bits Shannon was talking about, or is there something different?" I still don't know the answer, but I'd like to share my recent thoughts because I think this all ties together. If you wish to treat the world as being computational - if you wish to say that the pair of sunglasses I am wearing is a computer that has sunglass input and output - then you would have to say that not all of the bits that are potentially measurable are in practice having an effect. Most of them are lost in statistical effects, and the situation has to be rather special for a particular bit to matter.

In fact, bits really do matter. If somebody says "I do" in the right context that means a lot, whereas a similar number of bits of information coming in another context might mean much less. Various measurable bits in the universe have vastly different potentials to have a causal impact. If you could possibly delineate all the bits you would probably see some dramatic power law where there would be a small number of bits that had tremendously greater potential for having an effect, and a vast number that had very small potentials. It's those bits that have the potential for great effect that are probably the ones that computer scientists are concerned with, and probably Shannon doesn't differentiate between those bits as far as he went.

Then the question is how do we distinguish between the bits; what differentiates one from the other, and how can we talk about them? One speculation is that legacy effects have something to do with it. If you have a system with a vast configuration space, as is our world, and you have some process, perhaps an evolutionary process, that's searching through possible configurations, then rather than just a meandering random walk, perhaps what we see in nature is a series of stair steps where legacies are created that prohibit large numbers of configurations from ever being searched again, and there's a series of refinements.

Once DNA has won out, variants of DNA are very unlikely to appear. Once Windows has appeared, it's stuck around, and so forth. Perhaps what happens is that the legacy effect arises from the non-linearity of the tremendous expense of reversing certain kinds of systems. Legacies that are created are like lenses that amplify certain bits to be more important. This suggests that legacies are similar to semantics on some fundamental level. And it suggests that the legacy effect might have something to do with the syntax/semantics distinction, to the degree that might be meaningful. And it's the first glimmer of a definition of semantics I've ever had, because I've always thought the word didn't mean a damn thing except "what we don't understand". But I'm beginning to think what it might be is the legacies that we're stuck with.

To tie the circle back to the "Rebooting Civilization" question, what I'm hoping might happen is that as we start to gain a better understanding of how enormously difficult, slow, expensive, tedious and rare an event it is to program a very large computer well, we can overcome the sort of intoxication that overcomes us when we think about Moore's Law, and start to apply computational metaphors more soberly, both to natural science and to metaphorical purposes for society and so forth. A well-appreciated computer, one that included the difficulty of making large software well, could serve as a far more beneficial metaphor than the cartoon computer, which is based only on Moore's Law: all you have to do is make it fast and everything will suddenly work - the computers-will-become-smarter-than-us-if-you-just-wait-20-years sort of metaphor that has been prevalent lately.

The really good computer simulations that do exist in biology and in other areas of science, and I've been part of a few that count, particularly in surgical prediction and simulation, and in certain neuroscience simulations, have been enormously expensive. It took 18 years and 5,000 patients to get the first surgical simulation to the point of testable usability. That is what software is, that's what computers are, and we should de-intoxicate ourselves from Moore's Law before continuing with the use of this metaphor.

JARON LANIER, a computer scientist and musician, is best known for his work in virtual reality. He is the lead scientist for the National Tele-Immersion Initiative, a consortium of universities studying the implications and applications of next-generation Internet technologies.


JORDAN POLLACK

The limits of software engineering have been clear now for about 20 years. We reached a limit in the size of the programs that we could build, and since then we've essentially just been putting them together in different packages and adding wallpaper. Windows is little more than just DOS with wallpaper - it doesn't really add any more fundamental complexity or autonomy to the process.

Being in AI, I see this "scale" of programming as the problem, not the speed of computers. It's not that we don't understand some principles; it's that we just can't write a program big enough. We have really big computers. You could hook up a Beowulf cluster or a Cray supercomputer to the smallest robot, and if you knew how to make the robot be alive, it would be alive, if all you needed was computer time. Moore's Law won't solve AI. Computer time is not what we need; we need to understand how to organize systems of biological complexity.

What I've been working with for the past decade or so has been this question of self-organization. How can a system of chemicals heated by the sun dissipate energy and become more and more complex over time? If we really understood that, we'd be able to build it into software, we'd be able to build it into electronics, and if we got the theory right, we would see a piece of software that ran and wasted energy in the form of computer cycles and became more and more complex over time, and perhaps would bust through the ten-million-line code limit. In this field, which I've been calling co-evolutionary learning, we have had limited successes in areas like games, problem-solving, and robotics, but no open-ended self-organizing reaction. Yet.
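The co-evolutionary dynamic Pollack describes, two populations whose fitnesses are defined only against each other, can be sketched in a few lines. This is a toy illustration only: the "learners" and "tests" are reduced to single integers in an invented arms-race game, not any system from his lab.

```python
import random

def coevolve(generations=200, pop=30, seed=0):
    """Toy co-evolutionary arms race.  'Learners' (integers) are scored by
    how many 'tests' (thresholds) they pass; tests are scored by how many
    learners they fail.  Selection on both sides drives an escalation,
    with no fixed external fitness function anywhere."""
    rng = random.Random(seed)
    learners = [rng.randint(0, 10) for _ in range(pop)]
    tests = [rng.randint(0, 10) for _ in range(pop)]

    def next_gen(popn, fit):
        # Binary tournament selection plus a small +/-1 mutation.
        out = []
        for _ in range(len(popn)):
            a, b = rng.randrange(len(popn)), rng.randrange(len(popn))
            winner = popn[a] if fit[a] >= fit[b] else popn[b]
            out.append(max(0, winner + rng.choice([-1, 0, 1])))
        return out

    for _ in range(generations):
        # A learner's fitness: how many current tests it passes.
        lfit = [sum(l >= t for t in tests) for l in learners]
        # A test's fitness: how many current learners it fails.
        tfit = [sum(l < t for l in learners) for t in tests]
        learners = next_gen(learners, lfit)
        tests = next_gen(tests, tfit)
    return learners, tests

learners, tests = coevolve()
```

Each side only "improves" relative to the other, yet after a few hundred generations both populations have climbed far beyond their starting range, which is the open-ended escalation the passage is after, in miniature.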

It looks like biological complexity comes from the interplay of several different fields: physics, evolution, and game theory. What's possible is determined by a set of rules. There are the immutable rules, and systems that obey these rules and create new systems which operate with rule-like behaviors that we think of as computation. Evolution enables a kind of exploration of the possible, putting together components in various ways, exploring a constrained space of possible designs. What was the fitness, what was the affordance, what was the reason that something arose? But that's only part of it. Physics (the rules) determines what's possible. Evolution (the variation) explores the possible. The game determines what persists.

When we look at the results of our evolution and co-evolution software, we see that many early discoveries afford temporary advantages, but when something gets built on top of something else, because it's part of a larger configuration, it persists, even though it may be less optimal than other competing discoveries.

I believe we're never going to get rid of the DOS file system, we're never going to get rid of the Notepad text editor, we're never going to get rid of the QWERTY keyboard, because there are systems built on top of them. Just as the human eye has a blind spot.

At one point in human evolution there arose a bundle of nerves that twisted in the wrong direction and, even though it blocked a small bit of the visual field, nevertheless added some kind of advantage, and then layered systems built on top of that and locked the blind spot into being, and there's no way you can get rid of the blind spot now because it's essentially built in and supportive of other mechanisms.

We must look at the entire game to see what, among the possible, persists. Winners are not determined by the best technology or the best design. They are determined by a set of factors that says: whatever this thing is, it's part of a network, and that network supports its persistence. And as evolution proceeds, the suboptimal systems tend to stay in place. Just as vision has a blind spot, we're going to be stuck with technology upon which economic systems depend.

Having studied more economics than the usual computer scientist, and reflecting back on the question of how software changes society (John and I have called it a solvent, a social solvent), I note that our naive notion of free enterprise, our naive notion of how competition works in an economy, is that there are supposed to be checks and balances. Supposedly a durable-goods monopoly is impossible. To increase market share, the tractor monopoly makes better tractors, which sell more and last longer, until the used market in perfectly good tractors at half the price stops it. As a company's products become more widely used, instead of locking into monopoly, there's supposed to be a negative limiting effect: more competition comes in, the monopoly is stuck on its fat margins and stumbles, while competitors chase profit down to a normal profit.

The way I describe this stumbling is "monopoly necrosis." You become so dependent on the sales and margins of a product in demand that you can't imagine selling another one, even though technology is driving prices down. There are some great examples of this. The IBM PCjr was built not to compete with the Selectric typewriter, by putting a rate limiter on the keyboard so a kid couldn't type more than two characters a second! This was really the end of IBM. It wasn't Microsoft; it was this necrosis of not exploiting new technologies that might erode the profit margins on the old ones. Ten years ago you could see that the digital camera and the ink-jet printer were going to come together and let you print pictures for pennies apiece. But Polaroid was getting a dollar a sheet for silver-based instant film, and they couldn't see how to move their company in front of the wave that was coming. The storage companies, the big million-dollar-terabyte disk companies, are going to run into the same sort of thing: a terabyte for $5,000.

From the point of view of software, what I've noticed is that software companies don't seem to suffer monopoly necrosis as traditional durable-goods theory would predict. They seem to just get bigger and bigger, because while a telephone company gets locked into its telephones or its wires or its interfaces, and a tractor company starts to compete against its own excellent used tractors, a software company can buy back old licences for more than they would sell for on the street, destroying the secondary market.

We see this in software, and it's because of the particular way that software has changed the equation of information property. The "upgrade" is the idea that you've bought a piece of software and now there's a new release, and so, as a loyal customer, you should be able to trade the old one in and buy the new one. But since what you bought wasn't the software itself but a permanent right to use the software, what you actually do when you upgrade is forfeit your permanent right and purchase it again. If you don't upgrade, your old software will soon stop working, so that permanent right you thought you owned will be worthless.

It seems to me that what we're seeing in software, and this is the scary part for human society, is the beginning of a kind of dispossession. People talk about dispossession that comes only from piracy, like Napster and Gnutella, where the rights of artists are being violated by people sharing their work. But there's another kind of dispossession, which is the inability to actually BUY a product. The idea is already here: you can't buy this piece of software, you can only licence it on a day-by-day, month-by-month, year-by-year basis. As this idea spreads from software to music, films, and books, human civilization based on property fundamentally changes.

The idea we hear of the big Internet in the sky, with all the music we want to listen to, all the books and movies we want to read and watch on demand, all the software games and apps we want to use, sounds real nice, until you realize it isn't a public library, it is a private jukebox. You could download whatever you wanted over high-speed wireless 3G systems into portable playing devices and pay only $50 a month. But you can never own the ebook, you can never own the DivX movie, you can never own the ASP software.

By the way, all the bookstores and music stores have been shut down.

It turns out that property isn't about possession after all; it is about a relationship between an individual and their right to use something. Ownership is just the right to use until you sell. Your right to a house, or your liquid wealth, is stored as bits in an institutional computer, whether that computer is at the bureau of deeds in your town, at the bank, or at the stock transfer agent. And property only works when transfers are not duplicative or lossy.

There is a fundamental difference between protecting the encryption systems for real currency and securities, and protecting encryption systems for unlimited publishing. If the content industries prevail in gaining legal protection for renting infinite simultaneous copies, if we don't protect the notion of ownership, which includes the ability to loan, rent, and sell something when you're done with it, we will lose the ability to own things. Dispossession is a very real threat to civilization.

One of the other initiatives I've been working on is trying to get my university, at least, to see that software and books and patents are really varieties of the same thing, and that it should normalize its intellectual-property policy. Most universities give professors the copyright on their books and let them keep all the royalties, even on a $1,000,000 book. But if you write a piece of software or hold a patent that earns $50,000, the university tries to claim the whole thing. Very few universities get lucky with a cancer drug or Vitamin D, yet their IP policies drive innovation underground.

What I'm trying to do is separate academe from industry by giving academics back all their intellectual properties, and accept a tithe, like 9 percent of the value of all IP created on campus, including books and software and options in companies. I call it the "commonwealth of intellectual property," and through it a community of diverse scholars can share in some way in the success, drive, and luck of themselves and their colleagues. Most people are afraid of this, but I am certain it would lead to greater wealth and academic freedom for smaller universities like my own.

JORDAN POLLACK is a computer science and complex systems professor at Brandeis University. His laboratory's work on AI, Artificial Life, Neural Networks, Evolution, Dynamical Systems, Games, Robotics, Machine Learning, and Educational Technology has been reported on by the New York Times, Time, Science, NPR, Slashdot.org and many other media sources worldwide.


DAVID GELERNTER

Questions about the evolution of software in the big picture are worth asking. But it's important not to lose sight of the fact that some of the key issues in software have nothing to do with big strategic questions; they have to do with the fact that the software that's becoming ubiquitous, and that so many people rely on, is so crummy, and that for so many people software, and in fact the whole world of electronics, is a constant pain. The computers we're inflicting on people are more a cause of irritation, confusion, dissatisfaction and angst than a positive benefit. One thing that's going to happen is clearly tactical: we're going to throw out the crummy, primitive software on which we rely, and see a completely new generation of software very soon.

If you look at where we are in the evolution of the desktop computer today, the machine is about 20 to 25 years old. Relatively speaking, we're roughly where the airplane was in the late 1920s: a lot of work had been done, but we had yet to see the first even quasi-proto-modern airplane, the DC-3 of 1935. In the evolution of desktop computing we haven't even reached the DC-3 level. We're a tremendously self-conscious and self-aware society, and yet we have to keep in mind how much we haven't done, and how crummy and primitive much of what we've built is. For most people a new electronic gadget is a disaster: another incomprehensible user's manual or help system, things that break and don't work, things people can never figure out, features they don't need and don't understand. All of these are just tactical issues, but they are important to the quality of life of people who depend on computers, which increasingly is everybody.

When I look at where software is heading and what it is really doing, I see the emergence of a new generation of information-management systems. As we discard Windows and NT, these 1960s and 1970s systems on which we rely today, we'll see a transition similar to what happened during the 19th century, when people's sense of space suddenly changed. If you compare the world of 1800 to the world of 1900, people's sense of space in 1800 was tremendously limited, local and restricted. You can see this dramatically in a New England village of the time: everything is on site, a small cluster of houses in which everything that needs to be done is done, and fields beyond, and beyond the fields a forest.

People traveled to some extent, but not often; most people rarely traveled at all. The picture of space outside people's own local space was exceptionally fuzzy. Today, our picture of time is equally fuzzy; we have an idea of our local time, of what happened today and yesterday, of what's going to happen next week and what happened the last few weeks, but outside of this, our view of time is as restricted and local as people's view of space was around 1800. If you look at what happened in the 19th century as transportation became available, cheap and ubiquitous, all of a sudden people developed a sense of space beyond their own local spaces, and the world changed dramatically. It wasn't just that people got around more and the economy changed and wealth was created. There was a tremendous change in the intellectual status of life. People moved outside their intellectual burrows; religion collapsed; the character of the arts changed during the 19th century far more than it has during the 20th century or during any other century, as people's lives became fundamentally less internal, less spiritual, because they had more to do. They had places to go, they had things to see. When we look at the collapse of religion in the 19th century, it had far less to do with science than with technology, the technology of transportation that changed people's view of space and put the world at people's beck and call, in a sense. In 1800 this country was deeply religious; in 1900 religion had already become a footnote. And art had fundamentally changed in character as well.

What's going to happen, what software will do over the next few years (this has already started and will accelerate), is that our software will be time-based rather than space-based. We'll deal with streams of information rather than chaotic file systems based on a 1940s idea of desks and file cabinets. The transition to a software world where we have a stream with a past, present and future is a transition to a world in which people have a much more acute sense of time outside their own local week or month, in which they have a clear idea of what was different, of why February of 1997 was different from February of 1994, which most people today don't have a clear picture of.

When we ask ourselves what the effect will be of time coming into focus the way space came into focus during the 19th century, we can count on the consequences being big. It won't cause the kind of change in our spiritual life that space coming into focus did, because we've moved about as far outside as we can get. We won't see any further fundamental changes in our attitude towards art or religion; all that has happened already. But we're apt to see other incalculably large effects on the way we deal with the world and with each other, and looking back, this world today will look more or less the way 1800 did from the vantage point of 1900: not just a world with fewer gadgets, but a world with a fundamentally different relationship to space and time. From the small details of our crummy software to the biggest and most abstract issues of how we deal with the world at large, this is a big story.

"Streams" is a software project I've been obsessed with. In the early '90s it was clear to me that the operating system, the standard world in which I lived, was collapsing. For me and the academic community it was Unix; but it was the same in the world of Windows or the world of Mac or whatever world you were in. In the early 90s we'd been online solidly for at least a decade; I was a graduate student in the early 80s when the first desktop computers hit the stands. By the early 90s there was too much, it was breaking down. The flow of email, the number of files we had because we kept making more and they kept accumulating, we no longer threw them out every few years when we threw out the machine, they just grew to a larger and larger assemblage.

In the early 90s we were seeing electronic images, electronic faxes and stuff like that. The Web hadn't hit yet but it was clear to some of us what was coming and we talked about it and we wrote about it. The Internet was already big in the early 90s, and it was clear that the software we had was no good. It was designed for a different age. Unix was built at Bell Labs in the 1970s for a radically different technology world where computing power was rare and expensive, memories were small, disks were small, bandwidth was expensive, email was non-existent, the net was an esoteric fringe phenomenon. And that was the software we were using to run our lives in 1991, 1992. It was clear it was no good, it was broken, and it was clear that things were not going to get any better in terms of managing our online lives. It seemed to us at that point that we needed to throw out this 60s and 70s stuff.

The Unix idea of a file system was copied faithfully from the 1941 Steelcase file cabinet, with its files and its folders; the Xerox idea of a desktop, with its icons of wastepaper baskets and the like, copied the very offices we were supposed to be leaving behind us. All of this was carried over from the pre-electronic age. It was a good way to get started, but it was no good anymore. We needed something designed for computers: forms and ways of doing business that were electronic and software-based, as opposed to being cribbed from what people knew how to do in 1944. Those forms served well in 1944, but by 1991 they were no longer the way to operate in a software- and electronics-based world.

It seemed to us that we wanted to arrange our stuff in time rather than in space. Instead of spreading it out on a virtual desktop in front of us we wanted all our information to accumulate in a kind of time line, or a diary or narrative with a past, present and future, or a stream, as we called the software. Every piece of information that came into my life, whether it was an email, or eventually a URL, or a fax or an image or a digital photo or a voice mail, or the 15th draft of a book chapter, all pieces of information would be plopped down at the end of a growing stream.

By looking at this stream I'd be looking at my entire information life. I would drop the absurd idea of giving files names; the whole idea of names and directories had rendered itself ridiculous, and a burden. If we dropped everything into the stream, and we provided powerful searching and indexing tools and powerful browsing tools, and we allowed time itself to guide us, we'd have a much better tool than trying to remember whether I'm looking for "letter to John number 15B" or "new new new letter to John prime". Instead I could say I'm looking for the letter to John I wrote last week, and go to last week and browse. It was clear that by keeping our stuff in a time line we could throw away the idea of names, we could throw away the idea of files and folders, we could throw away the desktop. Instead we'd have the stream, a virtual object that we could look at using any computer, no longer having to worry whether I put the file at work or at home, or in the laptop or the Palm Pilot. The stream was a virtual structure, and by looking at it, tuning it in, I tuned in my life, and I could tune it in from any computer. It had a future as well, so if I was going to do something next Friday, I'd drop it into the future, and next Friday would flow to the present, and the present would flow to the past.
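The stream idea can be sketched as a tiny data structure: everything is dropped at a timestamp, there are no names or folders, and retrieval is by time browsing or content search. This is a toy illustration only; the class and method names below are invented for the sketch, not taken from the actual Streams software.

```python
import bisect
from dataclasses import dataclass, field
from datetime import datetime

@dataclass(order=True)
class Item:
    when: datetime               # the only organizing principle is time
    text: str = field(compare=False)

class Stream:
    """Minimal sketch of a lifestream: documents accumulate in one
    time-ordered sequence with a past, present, and future."""
    def __init__(self):
        self._items = []         # kept sorted by timestamp

    def drop(self, when, text):
        # Every new piece of information is plopped into the time line.
        bisect.insort(self._items, Item(when, text))

    def browse(self, start, end):
        # Everything that landed in [start, end), oldest first.
        return [i for i in self._items if start <= i.when < end]

    def search(self, phrase):
        # Content search replaces file names entirely.
        return [i for i in self._items if phrase in i.text]

    def future(self, now):
        # Items dated after 'now' flow toward the present.
        return [i for i in self._items if i.when > now]
```

Finding "the letter to John I wrote last week" becomes `stream.search("letter to John")` or a `browse` over last week's interval, with no file name ever assigned.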

To make a long story short, we built the software, and the software was the basis of a world view, an approach to software and a way of dealing with information. It was also a commercial proposition. That has intellectual content of its own, because so many of us have been challenged, asked whether the intellectual center of gravity in technology has not moved away from the university into the private sector. I thought that was a rotten idea; I resisted it heavily. I had a bet with my graduate students in the mid-'90s: I would try to fund this project through the usual government channels, they would try to fund it through private investors, and whoever got the money first, that's the way we would go. I thought there was no contest; I had all sorts of Washington funding contacts. But they beat me hands down. While I was trying to wangle invitations to Washington to talk about this stuff, they would get private investors to hop on a plane and fly to New Haven to see it. The difference in energy level between the private sector and Washington was enormous. And bigots like myself, who didn't want to hear about private industry or private spending or the private sector, who believed in the university, as I still do in principle, were confronted with the fact that there was a radically higher energy level among people who had made a billion dollars and wanted to make another billion than among people who had got tenure and were now bucking for what? A chair, or whatever.

The academic world was more restricted in terms of what it could offer greedy people, and greed drives the world, one of the things you confront, reluctantly, as you get older. So this story is a commercial story too, and it raises questions about the future of the university: where the smart people are, where the graduate students go, where the dollars are, where the energy is, where the activity is. The question hasn't been raised in quite the same way in some of the sciences as it has in technology, though it has certainly become a big issue in biology and in medicine. The university, forgetting about software, forgetting about the future of the stream, fiddling while Rome burns, thinks that it's going to come to grips with the world by putting course notes on the Web. But we're dealing with something much bigger and much deeper than that.

What Yale charges for an education, as you know, is simply incredible. What it delivers is not worth what it charges; it gets by today on its reputation, and in fact it can get good jobs for its graduates. But we're resting on our laurels. All these are big changes, and big changes will come in this nation's intellectual life when the university as we know it today collapses. The Yales and the Harvards will do okay, but when the 98% of the nation's universities that are not the Yales and Harvards and MITs collapse, intellectual life will be different, and that will be a big change, too. We're not thinking about this enough. And I know the universities are not.

DAVID GELERNTER is a professor of computer science at Yale and chief scientist at Mirror Worlds Technologies (New Haven). His research centers on information management, parallel programming, and artificial intelligence. The "tuple spaces" introduced in Nicholas Carriero and Gelernter's Linda system (1983) are the basis of many computer communication systems worldwide. Dr. Gelernter is the author of Mirror Worlds, The Muse in the Machine, 1939: The Lost World of the Fair, and Drawing Life: Surviving the Unabomber.


ALAN GUTH

Even though cosmology doesn't have that much to do with information, it certainly has a lot to do with revolution and phase transitions. In fact, it is connected to phase transitions in both the literal and the figurative sense of the phrase.

It's often said — and I believe this saying was started by the late David Schramm — that today we are in a golden age of cosmology. That's really true. Cosmology at this present time is undergoing a transition from being a bunch of speculations to being a genuine branch of hard science, where theories can be developed and tested against precise observations. One of the most interesting areas of this is the prediction of the fluctuations, the non-uniformities, in the cosmic background radiation, an area that I've been heavily involved in. We think of this radiation as being the afterglow of the heat of the Big Bang. One of the remarkable features of the radiation is that it's uniform in all directions, to an accuracy of about one part in a hundred thousand, after you subtract the term that's related to the motion of the earth through the background radiation.

I've been heavily involved in a theory called the inflationary universe, which seems to be our best explanation for this uniformity. The uniformity is hard to understand. You might think initially that maybe the uniformity could be explained by the same principles of physics that cause a hot slice of pizza to get cold when you take it out of the oven; things tend to come to a uniform temperature. But once the equations of cosmology were worked out, so that one could calculate how fast the universe was expanding at any given time, then physicists were able to calculate how much time there was for this uniformity to set in.

They found that, in order for the universe to have become uniform fast enough to account for the uniformity that we see in the cosmic background radiation, information would have to have been transferred at approximately a hundred times the speed of light. But according to all our theories of physics, nothing can travel faster than light, so there's no way that this could have happened. So the classical version of the Big Bang theory had to simply start out by assuming that the universe was homogeneous — completely uniform — from the very beginning.
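The arithmetic behind the "hundred times the speed of light" statement can be sketched in a back-of-the-envelope form (order of magnitude only, not the full general-relativistic calculation):

```latex
% In a radiation-dominated universe the scale factor grows as
a(t) \propto t^{1/2}
% so the particle horizon, the farthest distance a light signal
% can have traveled since t = 0, is finite:
d_H(t) \;=\; a(t)\int_0^t \frac{c\,\mathrm{d}t'}{a(t')} \;=\; 2\,c\,t
% Regions on opposite sides of today's CMB sky were separated at
% decoupling by roughly 100\,d_H, so bringing them to a common
% temperature by ordinary transport would have required signals
% traveling at roughly 100\,c.
```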

The inflationary universe theory is an add-on to the standard Big Bang theory, and basically what it adds on is a description of what drove the universe into expansion in the first place. In the classic version of the Big Bang theory, that expansion was put in as part of the initial assumptions, so there's no explanation for it whatever. The classical Big Bang theory was never really a theory of a bang; it was really a theory about the aftermath of a bang. Inflation provides a possible answer to the question of what made the universe bang, and now it looks like it's almost certainly the right answer.

Inflationary theory takes advantage of results from modern particle physics, which predicts that at very high energies there should exist peculiar kinds of substances which actually turn gravity on its head and produce repulsive gravitational forces. The inflationary explanation is the idea that the early universe contains at least a patch of this peculiar substance. It turns out that all you need is a patch; it can actually be more than a billion times smaller than a proton. But once such a patch exists, its own gravitational repulsion causes it to grow, rapidly becoming large enough to encompass the entire observed universe.

The inflationary theory gives a simple explanation for the uniformity of the observed universe, because in the inflationary model the universe starts out incredibly tiny. There was plenty of time for such a tiny region to reach a uniform temperature and uniform density, by the same mechanisms through which the air in a room reaches a uniform density throughout the room. And if you isolated a room and let it sit long enough, it would reach a uniform temperature as well. For the tiny universe with which the inflationary model begins, there is enough time in the early history of the universe for these mechanisms to work, causing the universe to become almost perfectly uniform. Then inflation takes over and magnifies this tiny region to become large enough to encompass the entire universe, maintaining this uniformity as the expansion takes place.

For a while, when the theory was first developed, we were very worried that we would get too much uniformity. One of the amazing features of the universe is how uniform it is, but it's still by no means completely uniform. We have galaxies, and stars and clusters and all kinds of complicated structure in the universe that needs to be explained. If the universe started out completely uniform, it would just remain completely uniform, as there would be nothing to cause matter to collect here or there or any particular place.

I believe Stephen Hawking was the first person to suggest what we now think is the answer to this riddle. He pointed out — although his first calculations were inaccurate — that quantum effects could come to our rescue. The real world is not described by classical physics, and even though this was very "high-brow" physics, we were in fact describing things completely classically, with deterministic equations. The real world, according to what we understand about physics, is described quantum-mechanically, which means, deep down, that everything has to be described in terms of probabilities.

The "classical" world that we perceive, in which every object has a definite position and moves in a deterministic way, is really just the average of the different possibilities that the full quantum theory would predict. If you apply that notion here, it is at least qualitatively clear from the beginning that it gets us in the direction that we want to go. It means that the uniform density, which our classical equations were predicting, would really be just the average of the quantum mechanical densities, which would have a range of values which could differ from one place to another. The quantum mechanical uncertainty would make the density of the early universe a little bit higher in some places, and in other places it would be a little bit lower.

So, at the end of inflation, we expect to have ripples on top of an almost uniform density of matter. It's possible to actually calculate these ripples. I should confess that we don't yet know enough about the particle physics to actually predict the amplitude of these ripples, the intensity of the ripples, but what we can calculate is the way in which the intensity depends on the wavelength of the ripples. That is, there are ripples of all sizes, and you can measure the intensity of ripples of different sizes. And you can discuss what we call the spectrum — we use that word exactly the way it's used to describe sound waves. When we talk about the spectrum of a sound wave, we're talking about how the intensity varies with the different wavelengths that make up that sound wave.

We do exactly the same thing in the early universe, and talk about how the intensity of these ripples in the mass density of the early universe varied with the wavelengths of the different ripples that we're looking at. Today we can see those ripples in the cosmic background radiation. The fact that we can see them at all is an absolutely fantastic success of modern technology. When we were first making these predictions back in 1982, at that time astronomers had just barely been able to see the effect of the earth's motion through the cosmic background radiation, which is an effect of about one part in a thousand. The ripples that I'm talking about are only one part in a hundred thousand — just one percent of the intensity of the most subtle effect that it had been possible to observe at the time we were first doing these calculations.
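The sound-wave analogy can be made concrete with a small sketch (my construction, not part of the talk): build a toy one-dimensional "density field" from ripples of known wavelengths, then recover the intensity at each wavelength with a Fourier transform, exactly as one would analyze a sound recording. The field, its wavelengths, and the amplitudes are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D "density field": two ripples of known wavelengths plus noise.
n = 4096
x = np.arange(n)
field = (3.0 * np.sin(2 * np.pi * x / 64)     # strong ripple, wavelength 64
         + 1.0 * np.sin(2 * np.pi * x / 16)   # weaker ripple, wavelength 16
         + 0.1 * rng.standard_normal(n))      # a little noise

# The power spectrum: intensity as a function of spatial frequency.
power = np.abs(np.fft.rfft(field)) ** 2
freqs = np.fft.rfftfreq(n)                    # cycles per grid point
wavelengths = 1.0 / freqs[1:]                 # skip the zero-frequency bin

# The strongest peak sits at the wavelength of the dominant ripple.
peak = wavelengths[np.argmax(power[1:])]
print(peak)  # 64.0
```

The cosmological analysis is of course far more involved, but the underlying idea is the same: decompose the ripples by wavelength and ask how much intensity each wavelength carries.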

I never believed that we would ever actually see these ripples. It just seemed too far-fetched that astronomers would get to be a hundred times better at measuring these things than they were at the time. But, to my astonishment and delight, these ripples were first detected in 1992 by a satellite called COBE, the Cosmic Background Explorer. COBE had an angular resolution of about 7 degrees, which meant that it could only see the longest-wavelength ripples. Now we have far better measurements, which go down to a fraction of a degree, and we're getting very precise measurements of how the intensity varies with wavelength, with marvelous success.
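A standard rule of thumb in CMB analysis (this conversion is not stated in the text) relates an angular scale of theta degrees to a spherical-harmonic multipole of roughly l ~ 180 / theta, so finer angular resolution means access to shorter-wavelength ripples:

```python
# Rough rule of thumb: angular scale (degrees) -> multipole l ~ 180 / theta.
# This conversion is standard in CMB work but is an addition here, not
# something quoted in the talk.
def approx_multipole(theta_degrees: float) -> float:
    return 180.0 / theta_degrees

print(approx_multipole(7.0))   # ~26: COBE saw only the longest ripples
print(approx_multipole(0.5))   # 360.0: sub-degree experiments reach much finer ripples
```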

About a year and a half ago, there was a spectacular set of announcements from experiments called BOOMERANG and MAXIMA, both balloon-borne experiments, which gave very strong evidence that the universe is geometrically flat, which is just what inflation predicts. (By flat I don't mean two-dimensional; I just mean that the three-dimensional space of the universe is not curved, as it could have been, according to general relativity.) You can actually see the curvature of space in the way that the pattern of ripples has been affected by the evolution of the universe. A year and a half ago, however, there was an important discrepancy that people worried about; and no one was sure how big a deal to make out of it. The spectrum they were measuring was a graph that had, in principle, several peaks. These peaks had to do with successive oscillations of the density waves in the early universe, and a phenomenon called resonance that makes some wavelengths more intense than others. The measurements showed the first peak beautifully, exactly where we expected it to be, with just the shape that was expected. But we couldn't actually see the second peak.

In order to fit the data with the theories, people had to assume that there were about ten times as many protons in the universe as we actually thought, because the extra protons would lead to a friction effect that could make the second peak disappear. Of course every experiment has some uncertainty — if an experiment is performed many times, the results will not be exactly the same each time. So we could imagine that the second peak was not seen purely because of bad luck. However, the probability that the peak could be so invisible, if the universe contained the density of protons that is indicated by other measurements, was down to about the one percent level. So, it was a very serious-looking discrepancy between what was observed and what was expected. All this changed dramatically for the better about three or four months ago, with the next set of announcements, based on more precise measurements. Now the second peak is not only visible, but it has exactly the height that was expected, and everything about the data now fits beautifully with the theoretical predictions. Too good, really. I'm sure it will get worse before it continues to get better, given the difficulties in making these kinds of measurements. But we have a beautiful picture now which seems to be confirming the inflationary theory of the early universe.

Our current picture of the universe has a new twist, however, which was discovered two or three years ago. To make things fit, to match the observations, which are now getting very clear, we have to assume that there's a new component of energy in the universe that we didn't know existed before. This new component is usually referred to as "dark energy." As the name clearly suggests, we still don't know exactly what this new component is. It's a component of energy which in fact is very much like the repulsive gravity matter I talked about earlier — the material that drives the inflation in the early universe. It appears that, in fact, today the universe is filled with a similar kind of matter. The antigravity effect is much weaker than the effect that I was talking about in the early universe, but the universe today appears very definitely to be starting to accelerate again under the influence of this so-called dark energy.

Although I'm trying to advertise that we've understood a lot, and we have, there are still many uncertainties. In particular, we still don't know what most of the universe is made out of. There's the dark energy, which seems to comprise about 60% of the total mass/energy of the universe. We don't know what it is. It could in fact be the energy of the vacuum itself, but we don't know that for a fact. In addition, there's what we call dark matter, which is another 30%, or maybe almost 40%, of the total matter in the universe; we don't know what that is, either. The difference between the two is that the dark energy causes repulsive gravity and is smoothly distributed; the dark matter behaves like ordinary matter in terms of its gravitational properties — it's attractive and it clusters; but we don't know what it's made of. The stuff we do know about — protons, neutrons, ordinary atoms and molecules — appears to comprise only about 5% of the mass of the universe.
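These round figures can be tallied against the flatness result mentioned earlier: a geometrically flat universe requires the components to add up to the critical density, so the fractions should sum to 1. The sketch below uses the numbers quoted in the talk (taking 35% as a middle value for the dark matter range); modern surveys give somewhat different figures.

```python
# Illustrative mass/energy budget using the round figures from the text.
budget = {
    "dark energy": 0.60,
    "dark matter": 0.35,      # "30%, or maybe almost 40%"
    "ordinary matter": 0.05,  # protons, neutrons, atoms, molecules
}

# Flatness requires the fractions of the critical density to total 1.
total = sum(budget.values())
print(f"{total:.2f}")  # 1.00
```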

The moral of the story is that we have a great deal to learn. At the same time, the theories that we have developed so far seem to be working almost shockingly well.

ALAN GUTH, father of the inflationary theory of the Universe, is Victor F. Weisskopf Professor of Physics at MIT and author of The Inflationary Universe: The Quest for a New Theory of Cosmic Origins.


Edge Dinners
Event Date: [ 2.22.01 ]
Location:
CA
United States

February 2000: "A few TEDs ago, John Brockman began hosting an annual Millionaires' Dinner in honor of his acquaintances at the conference whose net worth exceeded seven figures. But rising equity values prompted Brockman to rename his party the Billionaires' Dinner. Last year, Steve Case, Jeff Bezos, and Nathan Myhrvold joined such comparatively impoverished multimillionaires as Barnes & Noble's Steve Riggio, EarthLink's Sky Dayton, and Marimba's Kim Polese. The dinner party was a microcosm of a newly dominant sector of American business." — Gary Wolf, Wired


January 8, 2001: "These days, it's open season on the Web. Where that will take us now is anybody's guess, but it won't be back to headier times, says John Brockman, a New York literary agent who became known in Silicon Valley over the past several years for throwing an annual "Billionaires Dinner".....He wants to change the name of the event. "This year," he says. "It's the 'Joy of the Ordinary Income Dinner.' .....Bon appetit and pass the Rolaids." — Kara Swisher, The Wall Street Journal


 
