In the late summer of 1914, as European civilization began its extended suicide, dissenters were scarce. On the contrary: From every major capital, we have jerky newsreel footage of happy crowds, cheering in the summer sunshine. More war and oppression followed in subsequent decades, and there was never a shortage of willing executioners and obedient lackeys. By mid-century, the time of Stalin and Mao and their smaller-bore imitators, it seemed urgent to understand why people throughout the 20th century had failed to rise up against masters who sent them to war, or to concentration camps, or to the gulag. So social scientists came up with an answer, which was then consolidated and popularized into something every educated person supposedly knows: People are sheep—cowardly, deplorable sheep.
This idea, that most of us are unwilling to "think for ourselves," instead preferring to stay out of trouble, obey the rules, and conform, was supposedly established by rigorous laboratory experiments. ("We have found," wrote the great psychologist Solomon Asch in 1955, "the tendency to conform in our society is so strong that reasonably intelligent and well-meaning young people are willing to call white black.") Plenty of research papers still refer to one or another aspect of the sheep model as if it were a truth universally acknowledged, and a sturdy rock on which to build new hypotheses about mass behavior. Worse yet, it's rampant in the conversation of educated laypeople—politicians, voters, government officials. Yet it is false. It makes for bad assumptions and bad policies. It is time to set it aside.
Some years ago, the psychologists Bert Hodges and Anne Geyer examined one of Asch's own experiments from the 1950s. He'd asked people to look at a line printed on a white card and then say which of three comparison lines was the same length. Each volunteer sat in a small group whose other members were actually confederates of the experimenter, deliberately picking wrong answers. Asch reported that when the group chose the wrong match, many individuals went along, against the evidence of their own senses.
But the experiment actually involved 12 separate comparisons for each subject, and most subjects did not side with the majority most of the time. In fact, on average, each person agreed with the majority three times and insisted on his own view the other nine. To make those results all about the evils of conformity is to say, as Hodges and Geyer note, that "an individual's moral obligation in the situation is to 'call it as he sees it' without consideration of what others say."
To explain their actions, the volunteers didn't indicate that their senses had been warped or that they were terrified of going against consensus. Instead, they said they had chosen to go along that one time. It's not hard to see why a reasonable person would do so.
The "people are sheep" model sets us up to think in terms of obedience or defiance, dumb conformity versus solitary self-assertion (to avoid being a sheep, you must be a lone wolf). It does not recognize that people need to place their trust in others, and win the trust of others, and that this guides their behavior. (Stanley Milgram's famous experiments, where men were willing to give severe shocks to a supposed stranger, are often cited as Exhibit A for the "people are sheep" model. But what these studies really tested was the trust the subjects had in the experimenter.)
Indeed, questions about trust in others—how it is won and kept, who wins it and who doesn't—seem to be essential to understanding how collectives of people operate, and affect their members. What else is at work?
It appears that behavior is also susceptible to the sort of moment-by-moment influences that were once considered irrelevant noise (for example, in an experiment by John M. Darley and Dan Batson, divinity students in a rush were far less likely to help a stranger than were divinity students who were not running late). And then there is mounting evidence of influences that discomfit psychologists because there doesn't seem to be much psychology in them at all. For example, Neil Johnson of the University of Miami, Michael Spagat of Royal Holloway, University of London, and their colleagues have found that the severity and timing of attacks in many different wars (different actors, different stakes, different cultures, different continents) adhere to a power law. If that's true, then an individual fighter's motivation, ideology, and beliefs make much less difference than we think to the decision to attack next Tuesday.
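To make the power-law claim concrete: it means that small attacks are common and catastrophic ones rare, in a fixed statistical ratio. The short Python sketch below is purely illustrative, using synthetic numbers and a standard textbook estimator rather than the researchers' actual data or methods. It draws fake "attack severities" from a power law with exponent 2.5 (roughly the range reported across many conflicts) and then recovers that exponent from the sample, which is the kind of regularity found in real event data.

```python
# Illustrative sketch only: synthetic numbers, not the researchers' data or code.
# "Severity follows a power law" means the chance of an attack of size x falls off
# as x**(-alpha): small attacks are common, huge ones rare, in a fixed ratio.
import numpy as np

rng = np.random.default_rng(seed=42)

alpha_true = 2.5    # density exponent; values near 2.5 are reported for many conflicts
x_min = 1.0         # smallest event size modeled (e.g., one casualty)
n = 50_000          # number of synthetic "attacks"

# Inverse-transform sampling from p(x) proportional to x**(-alpha), x >= x_min.
u = 1.0 - rng.random(n)            # uniform on (0, 1], avoids a zero
severities = x_min * u ** (-1.0 / (alpha_true - 1.0))

# Standard maximum-likelihood estimate of the exponent (Clauset-Shalizi-Newman form).
alpha_hat = 1.0 + n / np.sum(np.log(severities / x_min))
print(f"true exponent {alpha_true}, estimated {alpha_hat:.3f}")

# The heavy tail in plain numbers: thousands of small events, a handful of huge ones.
print(f"median severity {np.median(severities):.1f}, largest {severities.max():,.0f}")
```

The point of the exercise is that a single one-parameter curve of this kind can describe wildly different conflicts, which is exactly why the finding makes individual-level explanations look less decisive.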
Or, to take another example, if, as Nicholas Christakis' work suggests, your risks of smoking, getting an STD, catching the flu, or being obese depend in part on your social network ties, then how much difference does it make what you, as an individual, feel or think?
Perhaps the behavior of people in groups will eventually be explained as a combination of moment-to-moment influences (like waves on the sea) and powerful drivers that work outside of awareness (like deep ocean currents). All the open questions are important and fascinating. But they're only visible after we give up the simplistic notion that we are sheep.