The Lectern: Leadership and overconfidence
As I've mentioned in some earlier posts, I am teaching several new classes this year in my introductory public management course in the master's core curriculum at the Kennedy School at Harvard -- we're trying to add more material under the rubric of "leadership."
In one of my new classes, which I taught on Thursday, we introduce students to the literature on biases that tend to occur when people make decisions. We illustrate a number of them with simulations and exercises, and this year we included an Internet poll in which students answered a number of questions.
In one of the exercises, students were asked to give a low and high estimate for answers to a number of numerical questions involving information people are unlikely to know for certain -- such as the length of the Nile River, the gestation period for an Indian elephant, or the empty weight of a 747 airplane. The instructions were that students should estimate the low and high numbers within which they were 90 percent confident that the true number lay.
There were ten questions. If students' ranges really captured the true value 90 percent of the time, respondents should on average have missed only 10 percent of the estimates -- one question out of ten.
When we calculated the results of the Internet poll, in fact students on the whole had missed the correct answer 63 percent of the time.
(I will publish the correct answers to these three questions in about a week, after people have had a chance to read this blog and make low and high estimates of their own.)
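For readers who want to score themselves, the check is simple arithmetic: count how many true answers fall outside your low-high intervals. A minimal sketch in Python, using made-up placeholder numbers (these are not the quiz answers):

```python
# Each entry is (low_estimate, high_estimate, true_value).
# All numbers below are invented placeholders, not the actual quiz answers.
responses = [
    (100, 500, 450),
    (10, 20, 25),
    (1000, 5000, 7500),
    (5, 15, 12),
    (200, 300, 250),
]

# A "miss" is a true value that falls outside the stated interval.
misses = sum(1 for low, high, true in responses if not (low <= true <= high))
miss_rate = misses / len(responses)

print(f"Missed {misses} of {len(responses)} intervals ({miss_rate:.0%})")
```

With well-calibrated 90 percent intervals, the miss rate should hover around 10 percent; a much higher figure, like the 63 percent in our poll, is the signature of overconfidence.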
Results such as these are typical. They illustrate the presence of overconfidence as a normal human failing -- we are more sure that we are right about things than is warranted. This can produce a number of decision failures, including cutting off information-gathering too quickly and being less willing than we should be to reconsider decisions.
This means all of us.
Decision biases are a rapidly growing research field. In discussion during the class, my students were especially interested in what we could do to reduce the impact of these biases, a good thing to be thinking about. This area of research (on "de-biasing") is less developed, and conclusions so far suggest that de-biasing is not easy, but it is something important to work on if we want to do better as individuals and societies.
Posted by Steve Kelman on Sep 20, 2008 at 12:10 PM