Last Friday in Part 1, I started to explore Daniel Kahneman’s book, Thinking, Fast and Slow, by looking at some of the sections I had underlined. Today I conclude with the final set of quotations. I hope you can see just how much there is to learn from this book.
This is one of many practical suggestions for thinking well in spite of systematic errors. In a recent article, Louis Alloro discussed setting up cultural expectations that people point out flawed thinking as a way to avoid the dangers of the confirmation bias. Kahneman recognizes that it becomes harder to see flaws after you’ve heard someone else speak. A term he uses over and over is WYSIATI: what you see is all there is. When someone has spoken, that’s what everyone tends to see.
5. “To derive the most useful information from multiple sources of evidence, you should always try to make these sources independent of each other.”
“A simple rule can help: before an issue is discussed, all members of the committee should be asked to write a very brief summary of their position. This procedure makes good use of the value of the diversity of knowledge and opinion in the group. The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.”
Using Kahneman’s simple rule of writing brief summaries is one way to remove the impact of status differences, which can have a remarkable impact on who gets heard, as I explored earlier in High Status, Low Status: What Difference Does it Make?
Thanks for reminding us that there’s a simple way to improve group thinking.
6. “As we navigate our lives, we normally allow ourselves to be guided by impressions and feelings, and the confidence we have in our intuitive beliefs and preferences is usually justified. But not always. We are often confident even when we are wrong, and an objective observer is more likely to detect our errors than we are.”
“We documented systematic errors in the thinking of normal people, and we traced these errors to the design of the machinery of cognition rather than to the corruption of thought by emotion.”
This is a book about how people think. It describes some ways that thinking works well, but places considerably more stress on systematic errors and biases. Jeremy McCarthy’s recent article, You’re not as smart as you think you are, sketches some of the ways Kahneman shows our thinking goes wrong.

One of these systematic errors is ignoring regression to the mean. Kahneman describes trying to teach Israeli air force instructors that reward for improved performance works better than punishment for mistakes. The instructors flat out disbelieved him. If they yelled at a pilot for a terrible performance, the next flight was almost invariably better, and the opposite happened if they complimented the pilot for a great performance. Of course, they were ignoring the fact that ANY extreme case is highly likely to be followed by a less extreme case. They thought their yelling made the difference.
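For readers who like to see the statistics at work, here is a minimal simulation of the instructors’ situation. The model and its numbers are purely illustrative, not from the book: each flight’s score is a pilot’s stable skill plus independent luck, and nobody yells or praises anyone between flights.

```python
import random

random.seed(1)

# Hypothetical model (illustrative numbers): a flight's score is the
# pilot's stable skill plus fresh, independent luck on every flight.
def flight(skill):
    return skill + random.gauss(0, 1.0)  # luck term

skill = 0.0
bad_flights = 0
improved_after_bad = 0
for _ in range(100_000):
    first = flight(skill)
    second = flight(skill)  # no yelling, no praise; nothing changed
    if first < -1.5:        # the instructor sees a "terrible" flight
        bad_flights += 1
        if second > first:
            improved_after_bad += 1

# The next flight is "almost invariably better" by chance alone.
print(f"{improved_after_bad / bad_flights:.0%} of terrible flights "
      "were followed by a better one")
```

Even with no feedback at all, the vast majority of terrible flights are followed by better ones, simply because an extreme draw of luck is unlikely to repeat.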
What would you conclude if you heard that small schools are over-represented among top-performing schools by a factor of 4? That it’s time to reduce the size of all schools? Before you act, check whether small schools are also over-represented among the bottom-performing schools. Smaller samples tend to be more variable than larger ones.
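The school example above can also be checked with a quick simulation. The setup is hypothetical (the school sizes and score distribution are invented for illustration): every student in every school draws from the same score distribution, so no school is genuinely better than any other, yet small schools still crowd both extremes.

```python
import random

random.seed(2)

# Hypothetical data: every student draws from the SAME distribution,
# so any difference between schools is pure sampling noise.
STUDENTS_SMALL, STUDENTS_LARGE, N_SCHOOLS = 10, 250, 500

def school_mean(n_students):
    scores = [random.gauss(100, 15) for _ in range(n_students)]
    return sum(scores) / len(scores)

schools = [("small", school_mean(STUDENTS_SMALL)) for _ in range(N_SCHOOLS)]
schools += [("large", school_mean(STUDENTS_LARGE)) for _ in range(N_SCHOOLS)]
schools.sort(key=lambda pair: pair[1], reverse=True)

top = [size for size, _ in schools[:50]]
bottom = [size for size, _ in schools[-50:]]
print("small schools in top 50:", top.count("small"))
print("small schools in bottom 50:", bottom.count("small"))
# Small schools dominate BOTH ends: small samples simply vary more.
```

The mean of 10 students swings far more than the mean of 250, so the best-looking and worst-looking schools are almost all small ones, with no real quality difference anywhere in the data.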
Thank you for helping me become more cautious about jumping to conclusions.
7. “Professional controversies bring out the worst in academics. Scientific journals occasionally publish exchanges, often beginning with someone’s critique of another’s research, followed by a reply and a rejoinder. I have always thought that these exchanges are a waste of time. Especially when the original critique is sharply worded, the reply and the rejoinder are often exercises in what I have called sarcasm for beginners and advanced sarcasm. The replies rarely concede anything to a biting critique, and it is almost unheard of for a rejoinder to admit that the original critique was misguided or erroneous in any way.”
We have had some sharpness and sarcasm in the comment streams here on PPND from time to time, and several people have expressed concern that the sharpness and sarcasm reduce the willingness of other people to participate in discussions.
What if, instead, we try the approach Kahneman used with Klein of exploring why we see things differently? Perhaps like them, we might find new insights in a failure to disagree.
8. “The question that is most often asked about cognitive illusions is whether they can be overcome. The message of these examples is not encouraging. … As a way to live your life, continuous vigilance is not necessarily good, and it is certainly impractical. … The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people’s mistakes than our own.”
Here is one more reason to stay connected to others and to be open to the feedback they give you about the way you think. None of us are immune.
Kahneman, D. (2011). Thinking, Fast and Slow. London: Allen Lane; New York: Farrar, Straus and Giroux.
Kahneman, D. & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515-526.
Kahneman, D. & Tversky, A. (1984). Choices, values, and frames. American Psychologist, 39(4), 341-350.
Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.
Group Discussion courtesy of Nonviolent Peace Force
Pilot Training courtesy of UK Ministry of Defence