Can data change an already made-up mind?


Nowadays we hear a lot about how important it is to be data-driven in our decision-making. We also hear a lot of criticism aimed at those who are driven more by intuition than by data. Like most things in life, however, there’s a big difference between theory and practice.

It’s easy to say that we will go where data drives us, but what happens if data is driving us to a destination that we’re uncomfortable with? What happens when data calls into question some of our long-standing beliefs?

We like to think that we are all natural data scientists who are ready, willing, and able to be swayed by the evidence new data presents. And in a big data world we certainly do not suffer from a dearth of new data.

However, whether or not we want to admit it (especially to others), our minds are often already made up before we look at data. And big data makes a very good yes-man, amplifying our natural tendency to only search out data that supports our viewpoints so that we find further evidence for what we already believe.

This is known as confirmation bias, which, as Chip and Dan Heath, co-authors of Decisive: How to Make Better Choices in Life and Work, explained, “leads us to hunt for information that flatters our existing beliefs.” They cited a recent meta-analysis of more than 91 psychological studies involving over 8,000 participants, which concluded that we are twice as likely to favor confirming information as disconfirming information.

This is particularly troublesome when an organization is debating a strategic business decision and each decision maker comes armed with confirming information to support their preferred option. One way to encourage honest analysis of each alternative, according to Roger Martin, author of The Opposable Mind: Winning Through Integrative Thinking, is to ask:

“What would have to be true for this option to be the right answer?”

This question encourages analysis of the logical underpinnings of each alternative by turning confirmation bias to your advantage: if an option is bad, it should be easy to find information confirming that fact.

As the Heaths explained, “the search for disconfirming information might seem, on the surface, like a thoroughly negative process: We try to poke holes in our own arguments or the arguments of others. But Martin’s question adds something constructive: What if our least favorite option were actually the best one? What data might convince us of that?”

As Martin explained, “if you think an idea is the wrong way to approach a problem and someone asks you if you think it’s the right way, you’ll reply no—and defend that answer against all comers. But if someone asks you to figure out what would have to be true for that approach to work, your frame of thinking changes. This subtle shift gives people a way to back away from their beliefs and allow exploration by which they give themselves the opportunity to learn something new.”

“What makes Roger Martin’s technique so effective,” the Heaths concluded, “is that it allows people to disagree without becoming disagreeable. It goes beyond merely exposing ourselves to disconfirming evidence; it forces us to imagine a set of conditions where we’d willingly change our minds, without feeling that we lost the debate.”

What do you think? Have you encountered a situation where you or a colleague was all too willing to believe confirming information? What happened? Share your thoughts below.


About Author

Jim Harris

Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ)

Jim Harris is a recognized data quality thought leader with 25 years of enterprise data management industry experience. Jim is an independent consultant, speaker, and freelance writer. Jim is the Blogger-in-Chief at Obsessive-Compulsive Data Quality, an independent blog offering a vendor-neutral perspective on data quality and its related disciplines, including data governance, master data management, and business intelligence.


  1. Charles Harbour


    Thank you for posting this. I have come to know your work over the past couple of years, and always look forward to hearing what you have to say. You've really hit a home run here - your insight on this topic is very helpful to folks out here in the bleachers who are trying to make a difference from a DQ/Governance perspective. While the cost/benefit analysis and resource allocation priorities seem obvious to me (baking quality into the system), they're not so obvious to others. I continue to struggle to find the terms and emotions that will resonate with the management folks, and your note here opens up a whole new perspective - Under what circumstances would it be a priority for the company to provide xyz? I can come up with a whole new list of possible/probable scenarios that support the DQ cause, scenarios that are fundamentally different from the usual arguments (higher confidence, saves time, etc.). Things like "When the auditors came back with follow-up questions, how hard was it to produce what they were looking for? Did you have to promise to make improvements? What kind of improvements, and what happens if we don't make them?"

    Thanks for your insight!

    • Jim Harris

      Thanks for your comment, Charles.

      I am happy to hear that the post provided you with a new perspective on communicating the data quality and data governance message to management.

      Best Regards,


