All Videos

How the Brain Is Computing the Mind

[2.12.16]

The history of science has shown us that you need the tools first. Then you get the data. Then you can make the theory. Then you can achieve understanding.

ED BOYDEN is a professor of biological engineering and brain and cognitive sciences at the MIT Media Lab and the MIT McGovern Institute. He leads the Synthetic Neurobiology Group. Ed Boyden's Edge Bio Page.


 

The Crusade Against Multiple Regression Analysis

[1.21.16]

A huge range of science projects are done with these multiple regression things. The results are often somewhere between meaningless and quite damaging. ...                             

I hope that in the future, if I’m successful in communicating with people about this, there’ll be a kind of upfront warning in New York Times articles: These data are based on multiple regression analysis. This would be a sign that you probably shouldn’t read the article, because you’re quite likely to get non-information or misinformation.
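
Nisbett's worry is easier to see with a concrete case. The sketch below is a minimal simulation, not an example from his talk, and every variable name in it is hypothetical: an unmeasured confounder drives both a predictor and an outcome, so a multiple regression that omits the confounder reports a sizable, significant-looking coefficient on a predictor that has no causal effect at all.

```python
# Hypothetical illustration (not from Nisbett's talk) of how an omitted
# confounder can make a multiple-regression coefficient misleading.
# All variable names are invented for the example.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

confounder = rng.normal(size=n)                   # e.g. unmeasured background factor
predictor = confounder + rng.normal(size=n)       # "treatment" people self-select into
outcome = 2.0 * confounder + rng.normal(size=n)   # caused only by the confounder

# Naive regression of the outcome on the predictor alone.
X_naive = np.column_stack([np.ones(n), predictor])
beta_naive = np.linalg.lstsq(X_naive, outcome, rcond=None)[0]

# Regression that also controls for the confounder (rarely possible in
# practice, because the confounder is usually unmeasured).
X_full = np.column_stack([np.ones(n), predictor, confounder])
beta_full = np.linalg.lstsq(X_full, outcome, rcond=None)[0]

print("coefficient on predictor, confounder omitted: ", round(beta_naive[1], 2))  # ~1.0
print("coefficient on predictor, confounder included:", round(beta_full[1], 2))   # ~0.0
```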

RICHARD NISBETT is a professor of psychology and co-director of the Culture and Cognition Program at the University of Michigan. Richard Nisbett's Edge Bio Page.


 

What is Reputation?

[11.5.15]

That is basically what interests me—the double question of understanding our own biases, and also understanding the potential of using this indirect information, these indirect cues of quality and reputation, to navigate this enormous amount of knowledge. What is interesting about the Internet, and especially about the Web, is that the Internet is not only an enormous reservoir of information, it is a reputational device. It accumulates tons of evaluations from other people, so the information you get is pre-evaluated. This makes you go much faster. It is an evolutionary heuristic that we have probably had since the birth of the human mind.

Follow the people who know how to treat information. Don't go looking for the solution yourself; follow those who have it. This is a super strong drive—to learn faster. Children know this drive very well. Of course it can lead you to conformism and have very negative side effects, but it can also make you know faster. We know faster, not because there is a lot of information around, but because the information that is around has been evaluated; it carries a reputational label.

GLORIA ORIGGI is a philosopher and researcher at the Centre National de la Recherche Scientifique in Paris. Gloria Origgi's Edge Bio Page.


 

Choosing Empathy

[10.20.15]

If you believe that you can harness empathy and make choices about when to experience it versus when not to, it adds a layer of responsibility to how you engage with other people. If you feel like you're powerless to control your empathy, you might be satisfied with whatever biases and limits you have on it. You might be okay with not caring about someone just because they're different from you. I want people to not feel safe empathizing in the way that they always have. I want them to understand that they're doing something deliberate when they connect with someone, and I want them to own that responsibility.

JAMIL ZAKI is an assistant professor of psychology at Stanford University and the director of the Stanford Social Neuroscience Lab. Jamil Zaki's Edge Bio Page.

 


 

Edge Master Class 2015: A Short Course in Superforecasting, Class V

Condensing it All Into Four Big Problems and a Killer App Solution
[9.22.15]

The beauty of forecasting tournaments is that they’re pure accuracy games that impose an unusual monastic discipline on how people go about making probability estimates of the possible consequences of policy options. It’s a way of reducing escape clauses for the debaters, as well as reducing motivated reasoning room for the audience.

Tournaments, if they’re given a real shot, have the potential to raise the quality of debates by incentivizing competition to be more accurate and reducing the functionalist blurring that makes it so difficult to figure out who is closer to the truth.
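
To make "pure accuracy game" concrete: tournament forecasts are graded with a proper scoring rule. The Good Judgment Project used the Brier score, essentially the squared gap between the stated probability and what actually happened, so that lower scores mean more accurate probability judgments. The sketch below is a hypothetical illustration of that kind of scoring, with invented forecasters and outcomes; it is not the tournament's actual code.

```python
# Hypothetical Brier-score ranking for a toy forecasting tournament.
# Forecaster names, probabilities, and outcomes are invented for illustration.

def brier_score(forecasts, outcomes):
    """Mean squared error between forecast probabilities and 0/1 outcomes (lower is better)."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Each forecaster states a probability that each of four events will occur.
forecasts = {
    "cautious_fox":       [0.7, 0.3, 0.6, 0.2],
    "confident_hedgehog": [1.0, 0.0, 1.0, 0.0],
}
outcomes = [1, 0, 0, 0]  # what actually happened (1 = event occurred)

for name, probs in sorted(forecasts.items(), key=lambda kv: brier_score(kv[1], outcomes)):
    print(f"{name}: {brier_score(probs, outcomes):.3f}")
# cautious_fox scores 0.145, confident_hedgehog 0.250: extreme forecasts only
# pay off when they are right, which is exactly the discipline described above.
```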


[24:43 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class V

Condensing it All Into Four Big Problems and a Killer App Solution
[9.22.15]

The beauty of forecasting tournaments is that they’re pure accuracy games that impose an unusual monastic discipline on how people go about making probability estimates of the possible consequences of policy options. It’s a way of reducing escape clauses for the debaters, as well as reducing motivated reasoning room for the audience.

Tournaments, if they’re given a real shot, have the potential to raise the quality of debates by incentivizing competition to be more accurate and reducing the functionalist blurring that makes it so difficult to figure out who is closer to the truth.


[29:26 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class IV

Skillful Backward and Forward Reasoning in Time: Superforecasting Requires "Counterfactualizing"
[9.15.15]

A famous economist, Albert Hirschman, had a wonderful phrase, "self-subversion." Some people, he thought, were capable of thinking in self-subverting ways. What would a self-subverting liberal or conservative say about the Cold War? A self-subverting liberal might say, "I don’t like Reagan. I don’t think he was right, but yes, there may be some truth to the counterfactual that if he hadn’t been in power and doing what he did, the Soviet Union might still be around." A self-subverting conservative might say, "I like Reagan a lot, but it’s quite possible that the Soviet Union would have disintegrated anyway because there were lots of other forces in play."
        
Self-subversion is an integral part of what makes superforecasting cognition work. It’s the willingness to tolerate dissonance. It’s hard to be an extremist when you engage in self-subverting counterfactual cognition. That’s the first example. The second example deals with how regular people think about fate and how superforecasters think about it, which is, they don’t. Regular people often invoke fate, "it was meant to be," as an explanation for things.


[34:14 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class IV

Skillful Backward and Forward Reasoning in Time: Superforecasting Requires "Counterfactualizing"
[9.15.15]

A famous economist, Albert Hirschman, had a wonderful phrase, "self-subversion." Some people, he thought, were capable of thinking in self-subverting ways. What would a self-subverting liberal or conservative say about the Cold War? A self-subverting liberal might say, "I don’t like Reagan. I don’t think he was right, but yes, there may be some truth to the counterfactual that if he hadn’t been in power and doing what he did, the Soviet Union might still be around." A self-subverting conservative might say, "I like Reagan a lot, but it’s quite possible that the Soviet Union would have disintegrated anyway because there were lots of other forces in play."

        

Self-subversion is an integral part of what makes superforecasting cognition work. It’s the willingness to tolerate dissonance. It’s hard to be an extremist when you engage in self-subverting counterfactual cognition. That’s the first example. The second example deals with how regular people think about fate and how superforecasters think about it, which is, they don’t. Regular people often invoke fate, "it was meant to be," as an explanation for things.


[33:47 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class III

Counterfactual History: The Elusive Control Groups in Policy Debates
[9.1.15]

There's a picture of two people on slide seventy-two: one is one of the most famous historians of the 20th century, E.H. Carr, and the other is a famous economic historian at the University of Chicago, Robert Fogel. They could not have had more different attitudes toward the importance of counterfactuals in history. For E.H. Carr, counterfactuals were a pestilence; they were a frivolous parlor game, a methodological rattle, a sore loser's history. It was a waste of cognitive effort to think about counterfactuals. You should think about history the way it did unfold and figure out why it had to unfold the way it did—almost a prescription for hindsight bias.

Robert Fogel, on the other hand, approached it more like a scientist. He quite correctly recognized that if you want to draw causal inferences from any historical sequence, you have to make assumptions about what would have happened if the hypothesized cause had taken on a different value. That's a counterfactual. So you had this interesting tension. Many historians do still agree, in some form, with E.H. Carr. Virtually all economic historians would agree with Robert Fogel, who’s one of the pivotal figures in economic history; he won a Nobel Prize. But there’s this very interesting tension between people who are more open or less open to thinking about counterfactuals. Why that is, is something worth exploring.
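
Fogel's point has a standard formalization in the potential-outcomes framework from statistics (a framing added here for clarity; Tetlock does not spell it out in this excerpt): every causal claim about a single historical case compares the observed outcome with an unobserved counterfactual one.

```latex
% Neyman–Rubin potential-outcomes notation for Fogel's argument.
% For a historical case i, let D_i = 1 if the hypothesized cause was present
% and D_i = 0 if it was not.
\[
  \tau_i \;=\; Y_i(1) - Y_i(0),
  \qquad
  Y_i^{\mathrm{obs}} \;=\; D_i\,Y_i(1) + (1 - D_i)\,Y_i(0)
\]
% \tau_i is the causal effect on case i, but history reveals only Y_i^{obs},
% one of the two potential outcomes. Estimating \tau_i therefore always
% requires assumptions about the unobserved branch, which is Fogel's
% counterfactual and the "elusive control group" of the session's title.
```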


[31:07 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class III

Counterfactual History: The Elusive Control Groups in Policy Debates
[9.1.15]

There's a picture of two people on slide seventy-two: one is one of the most famous historians of the 20th century, E.H. Carr, and the other is a famous economic historian at the University of Chicago, Robert Fogel. They could not have had more different attitudes toward the importance of counterfactuals in history. For E.H. Carr, counterfactuals were a pestilence; they were a frivolous parlor game, a methodological rattle, a sore loser's history. It was a waste of cognitive effort to think about counterfactuals. You should think about history the way it did unfold and figure out why it had to unfold the way it did—almost a prescription for hindsight bias.

Robert Fogel, on the other hand, approached it more like a scientist. He quite correctly recognized that if you want to draw causal inferences from any historical sequence, you have to make assumptions about what would have happened if the hypothesized cause had taken on a different value. That's a counterfactual. So you had this interesting tension. Many historians do still agree, in some form, with E.H. Carr. Virtually all economic historians would agree with Robert Fogel, who’s one of the pivotal figures in economic history; he won a Nobel Prize. But there’s this very interesting tension between people who are more open or less open to thinking about counterfactuals. Why that is, is something worth exploring.


[46:31 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 
