The Difference Between “Can I…” and “Must I…”

Jonathan Haidt is a social psychologist and professor at NYU’s business school. His 2012 bestseller The Righteous Mind continues to make a big impact on me. What follows is one of my favorite takeaways from it, where Haidt explains how we process information one way if we want to believe something and a different way if we don’t:

The difference between can and must is the key to understanding the profound effects of self-interest on reasoning. It’s also the key to understanding many of the strangest beliefs—in UFO abductions, quack medical treatments, and conspiracy theories.

The social psychologist Tom Gilovich studies the cognitive mechanisms of strange beliefs. His simple formulation is that when we want to believe something, we ask ourselves, “Can I believe it?” Then, we search for supporting evidence, and if we find even a single piece of pseudo-evidence, we can stop thinking. We now have permission to believe. We have a justification, in case anyone asks.

In contrast, when we don’t want to believe something, we ask ourselves “Must I believe it?” Then we search for contrary evidence, and if we find a single reason to doubt the claim, we can dismiss it. You only need one key to unlock the handcuffs of must…

The difference between a mind asking “Must I believe it?” versus “Can I believe it?” is so profound that it even influences visual perception. Subjects who thought they’d get something good if a computer flashed up a letter rather than a number were more likely to see [an] ambiguous figure [as] the letter B, rather than the number 13.

As another example, let’s say that you don’t like marijuana. You think it’s leading society into decline. Because of this stance, when you hear news about the benefits of marijuana, Haidt argues that you are likely to ask, “Must I believe it?” From there, you will instinctively seek out evidence that confirms your bias against marijuana. Mining your personal history, social network, or favorite publications, you will find solace in any number of references to marijuana as the scourge of civility. And, inevitably, the answer to “Must I believe it?” will be, “No, you mustn’t.”

Marijuana is simply one modern, high-profile, frequently contested example. The same idea—which psychologists call confirmation bias—applies to everything under the sun, from the inconsequential (Was that joke funny?) to the consequential (Is this approach to foreign policy humane?).

Living in a world where we can find a perspective or statistic to back up any of our arguments makes it easy to shrink into our own versions of reality, overlook potentially good ideas, and alienate ourselves from a more objective truth.

So, what do we do about all this?

To ignore our inherent bias is to risk a future of tribalism and suboptimal decision-making. So, how do we fight our hardwiring? How do we push ourselves to be more open to alternative points of view when we are so naturally biased against them?

When it comes to the workplace, David Rock has a simple answer: Let your teammates help.

Rock co-founded the NeuroLeadership Institute, a group that uses brain science to help people become better leaders. As he put it in an interview with NPR, “Mitigating bias is one of the hardest things in human existence… [To do it,] you’ve got to shift the focus from individuals trying not to be biased to teams being able to catch bias… There are decades of research showing [this approach] is the best format for behavior change and habit formation.”

NPR’s Yuki Noguchi summed up Rock’s advice: “In other words, [we need to] create structures that don’t rely on the individual to change.”

This is all the more reason to make an effort to foster psychological safety on your team. Our best chance at avoiding confirmation bias is the feedback and perspectives of our teammates. So we need to create environments where people feel safe enough to speak up when they see someone evaluating a person or an idea from a potentially limiting angle. We also need to surround ourselves with people who keep us vigilant and question our logic—all so we see things from different angles and make better decisions.

This strategy of using team dynamics to mitigate bias adds urgency to the case for building more diverse teams. Surrounding ourselves with people who come at things from different reference points increases our chances of mitigating bias, gaining a broader perspective, and doing better work.

Hope this helps!

—Max

 

This is Max’s note—a weekly message from Lessonly’s CEO about learning, leadership, and doing Better Work. Sign up below to subscribe via email. No spam, we promise!