Mom Psych



Louis Cozolino: The Social Neuroscience of Education

Richard J. Davidson and Sharon Begley: The Emotional Life of Your Brain

Daniel Kahneman: Thinking, Fast and Slow

Michael S. Gazzaniga: Who's In Charge? Free Will and the Science of the Brain

Daniel J. Siegel: Mindsight

Daniel J. Siegel and Tina Payne Bryson: The Whole-Brain Child: Revolutionary Strategies to Nurture Your Child's Developing Mind

Aaron Ben-Ze'ev and Ruhama Goussinsky: In the Name of Love: Romantic Ideology and Its Victims

Daniel Goleman: Social Intelligence





Daniel Kahneman: Unveiling the Two-Faced Brain

Thinking, Fast and Slow
Daniel Kahneman. 2011. Farrar, Straus and Giroux, New York. 512 pages.

March 20, 2012—How many of us are really open to the possibility of shattering our cherished biases and illusions, especially those that support the trust we maintain toward our own mind? Well, don't read Daniel Kahneman's latest book unless that is precisely what you are prepared to do. In Thinking, Fast and Slow, Kahneman reveals what he has learned as a result of his Nobel Prize–winning research in judgment and decision making: human beings (and that includes you and me) are not the rational agents economists and decision theorists have traditionally assumed.

This is not to say that humans are irrational. Rather, says Kahneman, they “are not well described by the rational-agent model.” Although a psychologist, he was awarded the Nobel Prize in Economic Sciences precisely because his research challenges traditional economic theory. Old-school economists have held that the test of rational thinking is whether a person’s beliefs and preferences are internally consistent. For instance, rational thinkers would not be subject to reversing their preferences based on the words in which the choice is framed, but real people are. According to Kahneman, expecting people to think the way economists have traditionally theorized is “impossibly restrictive” and “demands adherence to rules of logic that a finite mind is not able to implement.”

In other words, we cannot be internally consistent because, far more than we imagine, we are ruled by hidden influences when making judgments and decisions. This is the human mind’s System 1—in Kahneman’s terminology, the fast process that operates automatically and usually outside our awareness. It’s that impressionable gut feeling that looks for patterns and coherence and that makes us feel complacent unless challenged. And it does its best to ignore challenges if it can. Most of our thoughts originate here, but it’s the job of System 2 (the logical, controlled process) to step in when the input becomes more demanding or when effortful self-control is required. Normally, says Kahneman, System 2 gets the last word, but only if it can get one in edgewise; and it is often too busy or lazy to intervene. As long as we feel the sense of “cognitive ease” that System 1 is so willing to provide, we don’t call in System 2 forces.

System 1 is what allows us to make snap decisions in situations that call for them. But it is highly vulnerable to error. Because it operates on emotions, impressions, intuitions and intentions, says Kahneman, it is “gullible and biased” toward believing whatever it is confronted with, and it searches its experience for information that confirms these biases and beliefs. Its job is to jump to conclusions, but these are necessarily based on limited evidence because System 1 operates on the basis of what Kahneman dubs “What you see is all there is.” Trusting this notion, we construct whatever reasonably coherent story we can using whatever information might be within easy reach of our intuition (and our intuition does not allow for information that fails to come to mind, much less information it never had in the first place). Among other problems, this leads to overconfidence. Fallible though it may be, we prefer the “evidence” of our own memory and experience to any kind of fact-checking from outside ourselves.

To illustrate the fallibility of experience, Kahneman tells of the time he taught Israeli Air Force flight instructors the science of effective training. After he explained that rewards work better than punishment in improving performance, one of the instructors objected that his long experience as an instructor had taught him otherwise. Whenever he had praised a flight cadet for excellence, the cadet did worse on the next try. Conversely, when he reprimanded a cadet for bad execution, the next try was better. “So please don’t tell us that reward works and punishment does not,” the instructor said, “because the opposite is the case.”

Any of us might be tempted to jump to the same conclusion, but as Kahneman explains, the experience described by the instructor did not, in fact, teach a lesson about reward and punishment but about a principle known as “regression to the mean.” Both high and low performances are usually followed by an attempt that is closer to average, simply because significant variances from the mean usually have more to do with luck than anything else. Rather than viewing the individual’s performance over an extended period, the instructor was making judgments about the effects of praise based on the cadet’s single next performance, which statistically would almost certainly be closer to the average than his more memorable attempt. The instructor’s experience was true, but the conclusion his intuition drew from the experience was wrong.
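Regression to the mean is easy to see in a quick simulation (a minimal sketch, not from the book; the cadets, threshold values, and noise level here are illustrative assumptions). Each attempt is modeled as a fixed skill plus random luck, with no praise or punishment in the model at all; yet an unusually good attempt is typically followed by a worse one, and an unusually bad attempt by a better one:

```python
import random

random.seed(1)

def performance(skill, luck_sd=1.0):
    """One observed attempt: stable skill plus random luck."""
    return skill + random.gauss(0, luck_sd)

# Simulate many pairs of consecutive attempts by cadets of identical skill.
skill = 0.0
trials = [(performance(skill), performance(skill)) for _ in range(100_000)]

# Follow-ups to unusually good ("praised") and unusually bad
# ("reprimanded") first attempts. Thresholds are arbitrary.
after_excellent = [nxt for first, nxt in trials if first > 1.5]
after_poor = [nxt for first, nxt in trials if first < -1.5]

def mean(xs):
    return sum(xs) / len(xs)

print(f"after an excellent attempt, the next averages {mean(after_excellent):+.2f}")
print(f"after a poor attempt, the next averages {mean(after_poor):+.2f}")
# Both follow-up averages sit near the true mean (0.0), even though
# nothing in the model rewards or punishes anyone: the apparent
# "effect" of praise or reprimand is pure regression to the mean.
```

Because the second attempt is statistically independent of the first, its average is the same near-zero value whether the first attempt was excellent or poor, which is exactly the pattern the flight instructor misread as an effect of his feedback.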

Those who have a hard time believing this or any of Kahneman’s other claims about the quirks of System 1 need only try his examples at home; doing so will make a believer out of even the most severe skeptic. It is humbling as well as enlightening to see oneself defeated by the wiles of System 1 despite having been warned in advance of its methods.

How, then, can we ever put trust in our thoughts and judgments?

Only by being scrupulously vigilant about how we are thinking, researchers tell us. And even then, “trust” is a strong word. As Kahneman puts it, “little can be achieved without a considerable investment of effort. . . . System 1 is not readily educable.” Still, he offers a seemingly simple principle: “Recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2.” Unfortunately, System 2 is often sluggish in coming to our aid. Illusions and biases in our thinking are hard to recognize, and we would rather not recognize them if we can help it. As a result, we fail to exercise caution when we need it most. If we do happen to stumble upon our mind’s biases in a mirror, we quickly look the other way.

And who can blame us for avoiding that inward mirror? It can be very unpleasant to identify our self-deceptions, or even to entertain the possibility that we may have them. We feel much safer avoiding the risk of inconvenient self-revelations. But the downside of safety is that shrouding the mirror in a cloak of self-deception leaves us unable to compare who we are to who we could be.

“Changing one’s mind about human nature is hard work,” Kahneman observes, “and changing one’s mind for the worse about oneself is even harder.”





First Published: Spring 2012 Issue Vision Journal
