In 1964 Robert Fantz published a brief paper in Science that revolutionized the study of cognitive development. Building on the idea that infants’ gaze can tell you something about their processing of visual stimuli, he demonstrated that babies respond differently to familiarity and novelty. When infants see the same thing again and again, they look for less and less time—they habituate. When infants next see a new stimulus, they regain their visual interest and look longer. Habituation establishes the status quo—the reality you no longer notice or attend to.
Subsequent generations of developmental psychologists have expanded on this methodological insight to probe the building blocks of human thinking. Capitalizing on the idea that babies get bored of the familiar and start to look to the novel, researchers can test how infants categorize many aspects of the world as same or different. From this, scientists have been able to investigate humans’ early perceptual and conceptual discriminations of the world. Such studies of early thinking can help reveal signatures of human thinking that can persist into adulthood.
The basic idea of “habituation” is exceedingly simple. And humans are not the only species to habituate with familiarity—around the same time as Fantz’s work, related papers were published on habituation in the infants of other animal species. An associated literature on the neural mechanisms of learning and memory similarly finds that neural responses decrease after repeated exposures to the same stimulus. The punch line is clear: Organisms, and their neural responses, get bored.
This intuitive boredom is etched in our brains and visible in babies’ first visual responses. But the concept of habituation can also scale up to explain a range of people’s behaviors, their pleasures, and their failures. In many domains of life, adults habituate too.
If you think about eating an entire chocolate cake, the first slice is almost certainly going to be more pleasurable than the last. It is not hard to imagine being satiated. Indeed, the economic law of diminishing marginal utility describes a related idea. The first slice has a high utility or value to the consumer. The last one does not (and may even have negative utility if it makes you sick). Adults’ responses to pleasing stimuli habituate.
People are often unaware of how much they will habituate. A seminal study of lottery winners by psychologists Philip Brickman, Dan Coates, and Ronnie Janoff-Bulman found that, over time, the winners’ happiness returned to baseline. The thrill of winning—and the pleasure associated with new possessions—wore off. Even among non-lottery winners, people overestimate the positive impact that acquiring new possessions will have on their lives. Instead, people habituate to a new status quo of having more things, and those new things become familiar and no longer bring them joy.
Behavioral economists such as Shane Frederick and George Loewenstein have shown that this “hedonic adaptation,” or reduction in the intensity of an emotional response over time, can occur for both positive and negative life events. In addition to shifting their baseline of what is perceived as normal, people start to respond with less intensity to circumstances to which they are habituated. Over time, highs become less exhilarating, but lows also become less distressing.
Habituation may serve a protective function in helping people cope with difficult life circumstances, but it can also carry a moral cost. People get used to many circumstances, including those that (without prior experience) would otherwise be considered morally repugnant. Think of the frog in boiling water—it is only because the temperature is raised little by little that it does not jump out. In the (in)famous Milgram studies, participants were asked to shock a confederate by increasing the voltage in small increments. They were not asked to give a potentially lethal shock right at the outset. If you have already given many smaller shocks, adding just one more may not register as a violation of your moral compass.
Future research exploring the human propensity toward habituation may help explain the situations that lead to moral failures—to Hannah Arendt’s “banality of evil.” Literature on workplace misconduct finds that large transgressions in business contexts often start with small wrongdoings, subtle moral breaches that grow over time. New studies are testing the ways in which our minds and brains habituate to dishonesty.
From the looking of babies to the actions of adults, habituation can help explain how people navigate their worlds, interpret familiar and new events, and make both beneficial and immoral choices. Many human tendencies—both good and bad—are built from small increments of familiarity, slippery slopes to which people gradually habituate.