
Psychologist, MacroCognition LLC; Author, Seeing What Others Don't
Exchanging Your Mind

It's generally a bad idea to change your mind, and an even worse idea to do it publicly. Politicians who get caught changing their minds are labeled "flip-floppers." Managers who change their minds about what they want risk losing credibility and frustrate subordinates who find that much of their work has been wasted. Researchers who change their minds may be regarded as sloppy, shooting from the hip rather than delaying publication until they nail down all the loose ends in their data.

Clearly the Edge Annual Question for 2008 carries some danger of disclosure: "What have you changed your mind about? Why?" Nevertheless, I'll take the bait and describe a case where I changed my mind about the nature of the phenomenon I was studying.

My colleagues Roberta Calderwood, Anne Clinton-Cirocco, and I were investigating how people make decisions under time pressure. Obviously, under time pressure people can't canvass all the relevant possibilities and compare them along a common set of dimensions. So what are they doing instead?

I thought I knew what happened. Peer Soelberg had investigated the job-choice strategy of students. In most cases they quickly identified a favorite job option and evaluated it against a single alternative, a choice comparison, trying to show that their favorite was as good as or better than this comparison case on every relevant dimension. This strategy seemed like a very useful way to handle time pressure. Instead of systematically assessing a large number of options, you compare only two until you're satisfied that your favorite dominates the other.

To demonstrate that people used this strategy to handle time pressure I studied fireground commanders. Unhappily, the firefighters had not read the script. We conducted interviews with them about tough cases, probing them about the options they considered. And in the great majority of cases (about 81%), they insisted that they only considered one option.

The evidence obviously didn't support my hypothesis. Still, I wasn't convinced that my hypothesis was wrong. Perhaps we hadn't phrased the questions appropriately. Perhaps the firefighters' memories were inaccurate. At this point I hadn't changed my mind. I had just conducted a study that didn't work out.

People are very good at deflecting inconvenient evidence. There are very few facts that can't be explained away. Facts rarely force us to change our minds.

Eventually my frustration about not getting the results I wanted was replaced by a different emotion: curiosity. If the firefighters weren't comparing options, just what were they doing?

They described how they usually knew what to do once they sized up the situation. This claim generated two mysteries: How could the first option they considered have such a high likelihood of succeeding? And how could they evaluate an option except by comparing it to another?

Going back over the data, we resolved each of these mysteries. The commanders were using their years of experience to rapidly size up situations. The patterns they had acquired suggested typical ways of reacting. But they still needed to evaluate the options they identified. They did so by imagining what might happen if they carried out the action in the context of their situation. If it worked, they proceeded. If it almost worked, they looked for ways to repair any weaknesses, or else considered other typical reactions until they found one that satisfied them.

Together, these form a recognition-primed decision strategy: one based on pattern recognition, with the results tested through deliberate mental simulation. This strategy is very different from my original hypothesis of comparing a favorite option against a choice comparison.

I had an advantage in that I had never received any formal training in decision research. One of my specialty areas was the nature of expertise. Therefore, the conceptual shift I made involved peripheral constructs rather than core constructs about how decisions are made. The notions of Peer Soelberg that I was testing weren't central to my understanding of skilled performance.

Changing one's mind isn't merely revising the numerical value of a fact in a mental database or changing the beliefs we hold. Changing my mind also means changing the way I will then use my mind to search for and interpret facts. When I changed my understanding of how the fireground commanders were making decisions, I altered the way I viewed experts and decision makers. I altered the ways I collected and analyzed data in later studies. As a result, I began looking at events with a different mind, one that I had exchanged for the mind I previously had been using.