Adam Grant
Think Again: The Power of Knowing What You Don't Know
In a classic paper, sociologist Murray Davis argued that when ideas survive, it’s not because they’re true—it’s because they’re interesting. What makes an idea interesting is that it challenges our weakly held opinions. Did you know that the moon might originally have formed inside a vaporous Earth out of magma rain? That a narwhal’s tusk is actually a tooth? When an idea or assumption doesn’t matter deeply to us, we’re often excited to question it. The natural sequence of emotions is surprise (“Really?”) followed by curiosity (“Tell me more!”) and thrill (“Whoa!”). To paraphrase a line attributed to Isaac Asimov, great discoveries often begin not with “Eureka!” but with “That’s funny . . .”
When a core belief is questioned, though, we tend to shut down rather than open up. It’s as if there’s a miniature dictator living inside our heads, controlling the flow of facts to our minds, much like Kim Jong-un controls the press in North Korea. The technical term for this in psychology is the totalitarian ego, and its job is to keep out threatening information.
It’s easy to see how an inner dictator comes in handy when someone attacks our character or intelligence. Those kinds of personal affronts threaten to shatter aspects of our identities that are important to us and might be difficult to change. The totalitarian ego steps in like a bodyguard for our minds, protecting our self-image by feeding us comforting lies. They’re all just jealous. You’re really, really, ridiculously good-looking. You’re on the verge of inventing the next Pet Rock. As physicist Richard Feynman quipped, “You must not fool yourself—and you are the easiest person to fool.”
Our inner dictator also likes to take charge when our deeply held opinions are threatened. In the Harvard study of attacking students’ worldviews, the participant who had the strongest negative reaction was code-named Lawful. He came from a blue-collar background and was unusually precocious, having started college at sixteen and joined the study at seventeen. One of his beliefs was that technology was harming civilization, and he became hostile when his views were questioned. Lawful went on to become an academic, and when he penned his magnum opus, it was clear that he hadn’t changed his mind. His concerns about technology had only intensified:
The Industrial Revolution and its consequences have been a disaster for the human race. They have greatly increased the life-expectancy of those of us who live in “advanced” countries, but they have destabilized society, have made life unfulfilling, have subjected human beings to indignities . . . to physical suffering as well . . . and have inflicted severe damage on the natural world.
That kind of conviction is a common response to threats. Neuroscientists find that when our core beliefs are challenged, it can trigger the amygdala, the primitive “lizard brain” that breezes right past cool rationality and activates a hot fight-or-flight response. The anger and fear are visceral: it feels as if we’ve been punched in the mind. The totalitarian ego comes to the rescue with mental armor. We become preachers or prosecutors striving to convert or condemn the unenlightened. “Presented with someone else’s argument, we’re quite adept at spotting the weaknesses,” journalist Elizabeth Kolbert writes, but “the positions we’re blind about are our own.”
I find this odd, because we weren’t born with our opinions. Unlike our height or raw intelligence, we have full control over what we believe is true. We choose our views, and we can choose to rethink them any time we want. This should be a familiar task, because we have a lifetime of evidence that we’re wrong on a regular basis. I was sure I’d finish a draft of this chapter by Friday. I was certain the cereal with the toucan on the box was Fruit Loops, but I just noticed the box says Froot Loops. I was sure I put the milk back in the fridge last night, but strangely it’s sitting on the counter this morning.
The inner dictator manages to prevail by activating an overconfidence cycle. First, our wrong opinions are shielded in filter bubbles, where we feel pride when we see only information that supports our convictions. Then our beliefs are sealed in echo chambers, where we hear only from people who intensify and validate them. Although the resulting fortress can appear impenetrable, there’s a growing community of experts who are determined to break through.
ATTACHMENT ISSUES
Not long ago I gave a speech at a conference about my research on givers, takers, and matchers. I was studying whether generous, selfish, or fair people were more productive in jobs like sales and engineering. One of the attendees was Daniel Kahneman, the Nobel Prize–winning psychologist who has spent much of his career demonstrating how flawed our intuitions are. He told me afterward that he was surprised by my finding that givers had higher rates of failure than takers and matchers—but higher rates of success, too.
When you read a study that surprises you, how do you react? Many people would get defensive, searching for flaws in the study’s design or the statistical analysis. Danny did the opposite. His eyes lit up, and a huge grin appeared on his face. “That was wonderful,” he said. “I was wrong.”
Later, I sat down with Danny for lunch and asked him about his reaction. It looked a lot to me like the joy of being wrong—his eyes twinkled as if he was having fun. He said that in his eighty-five years, no one had pointed that out before, but yes, he genuinely enjoys discovering that he was wrong, because it means he is now less wrong than before.
I knew the feeling. In college, what first attracted me to social science was reading studies that clashed with my expectations; I couldn’t wait to tell my roommates about all the assumptions I’d been rethinking. In my first independent research project, I tested some predictions of my own, and more than a dozen of my hypotheses turned out to be false.* It was a major lesson in intellectual humility, but I wasn’t devastated. I felt an immediate rush of excitement. Discovering I was wrong felt joyful because it meant I’d learned something. As Danny told me, “Being wrong is the only way I feel sure I’ve learned anything.”
Danny isn’t interested in preaching, prosecuting, or politicking. He’s a scientist devoted to the truth. When I asked him how he stays in that mode, he said he refuses to let his beliefs become part of his identity. “I change my mind at a speed that drives my collaborators crazy,” he explained. “My attachment to my ideas is provisional. There’s no unconditional love for them.”
Attachment. That’s what keeps us from recognizing when our opinions are off the mark and rethinking them. To unlock the joy of being wrong, we need to detach. I’ve learned that two kinds of detachment are especially useful: detaching your present from your past and detaching your opinions from your identity.
Let’s start with detaching your present from your past. In psychology, one way of measuring the similarity between the person you are right now and your former self is to ask: which pair of circles best describes how you see yourself?
In the moment, separating your past self from your current self can be unsettling. Even positive changes can lead to negative emotions; evolving your identity can leave you feeling derailed and disconnected. Over time, though, rethinking who you are appears to become mentally healthy—as long as you can tell a coherent story about how you got from past to present you. In one study, when people felt detached from their past selves, they became less depressed over the course of the year. When you feel as if your life is changing direction, and you’re in the process of shifting who you are, it’s easier to walk away from foolish beliefs you once held.
My past self was Mr. Facts—I was too fixated on knowing. Now I’m more interested in finding out what I don’t know. As Bridgewater founder Ray Dalio told me, “If you don’t look back at yourself and think, ‘Wow, how stupid I was a year ago,’ then you must not have learned much in the last year.”
The second kind of detachment is separating your opinions from your identity. I’m guessing you wouldn’t want to see a doctor whose identity is Professional Lobotomist, send your kids to a teacher whose identity is Corporal Punisher, or live in a town where the police chief’s identity is Stop-and-Frisker. Once upon a time, all of these practices were seen as reasonable and effective.
Most of us are accustomed to defining ourselves in terms of our beliefs, ideas, and ideologies. This can become a problem when it prevents us from changing our minds as the world changes and knowledge evolves. Our opinions can become so sacred that we grow hostile to the mere thought of being wrong, and the totalitarian ego leaps in to silence counterarguments, squash contrary evidence, and close the door on learning.
Who you are should be a question of what you value, not what you believe. Values are your core principles in life—they might be excellence and generosity, freedom and fairness, or security and integrity. Basing your identity on these kinds of principles enables you to remain open-minded about the best ways to advance them. You want the doctor whose identity is protecting health, the teacher whose identity is helping students learn, and the police chief whose identity is promoting safety and justice. When they define themselves by values rather than opinions, they buy themselves the flexibility to update their practices in light of new evidence.
THE YODA EFFECT: “YOU MUST UNLEARN WHAT YOU HAVE LEARNED”
On my quest to find people who enjoy discovering they were wrong, a trusted colleague told me I had to meet Jean-Pierre Beugoms. He’s in his late forties, and he’s the sort of person who’s honest to a fault; he tells the truth even if it hurts. When his son was a toddler, they were watching a space documentary together, and Jean-Pierre casually mentioned that the sun would one day turn into a red giant and engulf the Earth. His son was not amused. Between tears, he cried, “But I love this planet!” Jean-Pierre felt so terrible that he decided to bite his tongue instead of mentioning threats that could prevent the Earth from even lasting that long.
Back in the 1990s, Jean-Pierre had a hobby of collecting the predictions that pundits made on the news and scoring his own forecasts against them. Eventually he started competing in forecasting tournaments—international contests hosted by Good Judgment, where people try to predict the future. It’s a daunting task; there’s an old saying that historians can’t even predict the past. A typical tournament draws thousands of entrants from around the world to anticipate big political, economic, and technological events. The questions are time-bound, with measurable, specific results. Will the current president of Iran still be in office in six months? Which soccer team will win the next World Cup? In the following year, will an individual or a company face criminal charges for an accident involving a self-driving vehicle?
Participants don’t just answer yes or no; they have to give their odds. It’s a systematic way of testing whether they know what they don’t know. They get scored months later on accuracy and calibration—earning points not just for giving the right answer, but also for having the right level of conviction. The best forecasters have confidence in their predictions that come true and doubt in their predictions that prove false.
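The text doesn’t name the tournament’s exact scoring rule, but the idea of rewarding “the right level of conviction” can be illustrated with the Brier score, a standard calibration metric: the squared gap between your stated probability and what actually happened, averaged across forecasts. A minimal sketch, assuming simple yes/no questions with outcomes coded as 1 or 0:

```python
def brier_score(forecasts, outcomes):
    """Average squared error between predicted probabilities (0.0-1.0)
    and actual outcomes (1 = it happened, 0 = it didn't). Lower is better."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A confident forecast that comes true beats a hedged one,
# but a confident forecast that fails is punished hardest.
confident_right = brier_score([0.9], [1])  # roughly 0.01
hedged_right    = brier_score([0.6], [1])  # roughly 0.16
confident_wrong = brier_score([0.9], [0])  # roughly 0.81
```

This captures why the best forecasters pair confidence with calibration: saying 90 percent and being right scores far better than saying 60 percent, while saying 90 percent and being wrong is the costliest outcome of all.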
On November 18, 2015, Jean-Pierre registered a prediction that stunned his opponents. A day earlier, a new question had popped up in an open forecasting tournament: in July 2016, who would win the U.S. Republican presidential primary? The options were Jeb Bush, Ben Carson, Ted Cruz, Carly Fiorina, Marco Rubio, Donald Trump, and none of the above. With eight months to go before the Republican National Convention, Trump was largely seen as a joke. His odds of becoming the Republican nominee were only 6 percent according to Nate Silver, the celebrated statistician behind the website FiveThirtyEight. When Jean-Pierre peered into his crystal ball, though, he decided Trump had a 68 percent chance of winning.
Jean-Pierre didn’t just excel in predicting the results of American events. His Brexit forecasts hovered in the 50 percent range when most of his competitors thought the referendum had little chance of passing. He successfully predicted that the incumbent would lose a presidential election in Senegal, even though the base rates of reelection were extremely high and other forecasters were expecting a decisive win. And he had, in fact, pegged Trump as the favorite long before pundits and pollsters even considered him a viable contender. “It’s striking,” Jean-Pierre wrote early on, back in 2015, that so many forecasters are “still in denial about his chances.”
Based on his performance, Jean-Pierre might be the world’s best election forecaster. His advantage: he thinks like a scientist. He’s passionately dispassionate. At various points in his life, Jean-Pierre has changed his political ideologies and religious beliefs.* He doesn’t come from a polling or statistics background; he’s a military historian, which means he has no stake in the way things have always been done in forecasting. The statisticians were attached to their views about how to aggregate polls. Jean-Pierre paid more attention to factors that were hard to measure and overlooked. For Trump, those included “Mastery at manipulating the media; Name recognition; and A winning issue (i.e., immigration and ‘the wall’).”
Even if forecasting isn’t your hobby, there’s a lot to be learned from studying how forecasters like Jean-Pierre form their opinions. My colleague Phil Tetlock finds that forecasting skill is less a matter of what we know than of how we think. When he and his collaborators studied a host of factors that predict excellence in forecasting, grit and ambition didn’t rise to the top. Neither did intelligence, which came in second. There was another factor that had roughly triple the predictive power of brainpower.
The single most important driver of forecasters’ success was how often they updated their beliefs. The best forecasters went through more rethinking cycles. They had the confident humility to doubt their judgments and the curiosity to discover new information that led them to revise their predictions.
A key question here is how much rethinking is necessary. Although the sweet spot will always vary from one person and situation to the next, the averages can give us a clue. A few years into their tournaments, typical competitors updated their predictions about twice per question. The superforecasters updated their predictions more than four times per question.
Think about how manageable that is. Better judgment doesn’t necessarily require hundreds or even dozens of updates. Just a few more efforts at rethinking can move the needle. It’s also worth noting, though, how unusual that level of rethinking is. How many of us can even remember the last time we admitted being wrong and revised our opinions accordingly? As journalist Kathryn Schulz observes, “Although small amounts of evidence are sufficient to make us draw conclusions, they are seldom sufficient to make us revise them.”
That’s where the best forecasters excelled: they were eager to think again. They saw their opinions more as hunches than as truths—as possibilities to entertain rather than facts to embrace. They questioned ideas before accepting them, and they were willing to keep questioning them even after accepting them. They were constantly seeking new information and better evidence—especially disconfirming evidence.
On Seinfeld, George Costanza famously said, “It’s not a lie if you believe it.” I might add that it doesn’t become the truth just because you believe it. It’s a sign of wisdom to avoid believing every thought that enters your mind. It’s a mark of emotional intelligence to avoid internalizing every feeling that enters your heart.
[Cartoon: Ellis Rosen/The New Yorker Collection/The Cartoon Bank]
Another of the world’s top forecasters is Kjirste Morrell. She’s obviously bright—she has a doctorate from MIT in mechanical engineering—but her academic and professional experience wasn’t exactly relevant to predicting world events. Her background was in human hip joint mechanics, designing better shoes, and building robotic wheelchairs. When I asked Kjirste what made her so good at forecasting, she replied, “There’s no benefit to me for being wrong for longer. It’s much better if I change my beliefs sooner, and it’s a good feeling to have that sense of a discovery, that surprise—I would think people would enjoy that.”
Kjirste hasn’t just figured out how to erase the pain of being wrong. She’s transformed it into a source of pleasure. She landed there through a form of classical conditioning, like when Pavlov’s dog learned to salivate at the sound of a bell. If being wrong repeatedly leads us to the right answer, the experience of being wrong itself can become joyful.
That doesn’t mean we’ll enjoy it every step of the way. One of Kjirste’s biggest misses was her forecast for the 2016 U.S. presidential election, where she bet on Hillary Clinton to beat Donald Trump. Since she wasn’t a Trump supporter, the prospect of being wrong was painful—it was too central to her identity. She knew a Trump presidency was possible, but she didn’t want to think it was probable, so she couldn’t bring herself to forecast it.
That was a common mistake in 2016. Countless experts, pollsters, and pundits underestimated Trump—and Brexit—because they were too emotionally invested in their past predictions and identities. If you want to be a better forecaster today, it helps to let go of your commitment to the opinions you held yesterday. Just wake up in the morning, snap your fingers, and decide you don’t care. It doesn’t matter who’s president or what happens to your country. The world is unjust and the expertise you spent decades developing is obsolete! It’s a piece of cake, right? About as easy as willing yourself to fall out of love. Somehow, Jean-Pierre Beugoms managed to pull it off.
When Donald Trump first declared his candidacy in the spring of 2015, Jean-Pierre gave him only a 2 percent chance of becoming the nominee. As Trump began rising in the August polls, Jean-Pierre was motivated to question himself. He detached his present from his past, acknowledging that his original prediction was understandable, given the information he had at the time.
Detaching his opinions from his identity was harder. Jean-Pierre didn’t want Trump to win, so it would’ve been easy to fall into the trap of desirability bias. He overcame it by focusing on a different goal. “I wasn’t so attached to my original forecast,” he explained, because of “the desire to win, the desire to be the best forecaster.” He still had a stake in the outcome he actually preferred, but he had an even bigger stake in not making a mistake. His values put truth above tribe: “If the evidence strongly suggests that my tribe is wrong on a particular issue, then so be it. I consider all of my opinions tentative. When the facts change, I change my opinions.”