Nowadays we hear a lot about how important it is to be data-driven in our decision-making. We also hear a lot of criticism aimed at those who are driven more by intuition than by data. Like most things in life, however, there’s a big difference between theory and practice.
It’s easy to say that we will go where data drives us, but what happens if data is driving us to a destination that we’re uncomfortable with? What happens when data calls into question some of our long-standing beliefs?
We like to think that we are all natural data scientists who are ready, willing and able to be swayed by evidence presented by new data. And in a big data world we certainly do not suffer from a dearth of new data.
However, whether or not we want to admit it (especially to others), our minds are often already made up before we look at the data. And big data makes a very good yes-man, amplifying our natural tendency to seek out only data that supports our viewpoints, so that we find further evidence for what we already believe.
This is known as confirmation bias, which, as Chip and Dan Heath, co-authors of Decisive: How to Make Better Choices in Life and Work, explained, “leads us to hunt for information that flatters our existing beliefs.” They cited a meta-analysis of 91 psychological studies involving over 8,000 participants that concluded we are twice as likely to favor confirming information as disconfirming information.
This is particularly troublesome when an organization is debating a strategic business decision and each decision maker is armed with confirming information to support their idea. One way to encourage the analysis of each alternative, according to Roger Martin, author of The Opposable Mind: Winning Through Integrative Thinking, is to ask:
“What would have to be true for this option to be the right answer?”
This question encourages the analysis of the logical underpinnings of each alternative by using confirmation bias to your advantage. If an option is a bad one, then the conditions it would require to succeed should quickly prove implausible.
As the Heaths explained, “the search for disconfirming information might seem, on the surface, like a thoroughly negative process: We try to poke holes in our own arguments or the arguments of others. But Martin’s question adds something constructive: What if our least favorite option were actually the best one? What data might convince us of that?”
As Martin explained, “if you think an idea is the wrong way to approach a problem and someone asks you if you think it’s the right way, you’ll reply no—and defend that answer against all comers. But if someone asks you to figure out what would have to be true for that approach to work, your frame of thinking changes. This subtle shift gives people a way to back away from their beliefs and allow exploration by which they give themselves the opportunity to learn something new.”
“What makes Roger Martin’s technique so effective,” the Heaths concluded, “is that it allows people to disagree without becoming disagreeable. It goes beyond merely exposing ourselves to disconfirming evidence; it forces us to imagine a set of conditions where we’d willingly change our minds, without feeling that we lost the debate.”
What do you think? Have you encountered a situation where you or a colleague was all too willing to believe confirming information? What happened? Share your thoughts below.