Innovation needs contamination


In his book Where Good Ideas Come From: The Natural History of Innovation, Steven Johnson explained that “error is not simply a phase you have to suffer through on the way to genius. Error often creates a path that leads you out of your comfortable assumptions. Being right keeps you in place. Being wrong forces you to explore.”

Mistaking signal for noise

“The trouble with error,” the psychologist Kevin Dunbar noted, “is that we have a natural tendency to dismiss it.” Dunbar’s research found that many scientific experiments produced genuinely unexpected results: more than half of the data collected by scientists deviated significantly from what they had predicted they would find. Dunbar found that scientists tended to treat these surprising outcomes as the result of flaws in their experimental method, a mechanical malfunction in their laboratory equipment, or an error in data processing. In other words, the scientists assumed the result was noise, not signal.

Even though poor data quality is rightfully regarded as bad, as Johnson explained, “paradigm shifts begin with anomalies in the data, when scientists find that their predictions keep turning out to be wrong.” In other words, what appears to be poor data quality – in this case in the data recording the result of an experiment – reveals the need to challenge the assumptions that the experiment was based on.

Being deliberately noisy

Johnson also cited the research of psychologist Charlan Nemeth, which included experiments that deliberately introduced noise into the decision-making process, and what she found ran directly counter to our intuitive assumptions about truth and error. According to Johnson, her research suggests “a paradoxical truth about innovation: good ideas are more likely to emerge in environments that contain a certain amount of noise and error. You would think that innovation would be more strongly correlated with accuracy, clarity, and focus. A good idea has to be correct on some basic level, and we value good ideas because they tend to have a high signal-to-noise ratio. But that doesn’t mean you want to cultivate those ideas in noise-free environments, because they end up being too sterile and predictable in their output. The best innovation labs are always a little contaminated.”

Thriving on useful mistakes

“Innovative environments thrive on useful mistakes,” Johnson concluded, “and suffer when the demands of quality control overwhelm them. While big organizations like to follow perfectionist regimes like Six Sigma and Total Quality Management, it’s no accident that one of the mantras of the web startup world is fail faster. It’s not that mistakes are the goal – they’re still mistakes after all, which is why you want to get through them quickly. But those mistakes are an inevitable step on the path to true innovation.”

The ability of algorithms to accept mistakes as part of the evolutionary path of problem solving is why they play such a big role in big data analytics. People also have a big role to play, and the ones who will do best are those who realize they are playing in a sandbox, not a clean room.
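The way such algorithms accept mistakes can be sketched with a minimal (1+1) evolutionary algorithm. This is a toy illustration, not any specific analytics product: random bit-flip mutations are the "mistakes," most of them make the candidate solution worse and are discarded, but the occasional useful mistake is kept, and the accumulation of useful mistakes is the path to the optimum. The fitness function (`onemax`) and all parameter values here are assumptions chosen for the demonstration.

```python
import random

def onemax(bits):
    """Toy fitness function: count of 1-bits; the optimum is all ones."""
    return sum(bits)

def evolve(n_bits=40, generations=2000, seed=42):
    """A minimal (1+1) evolutionary algorithm.

    Each generation makes random 'mistakes' (bit-flip mutations) in a
    copy of the current solution and keeps the copy only if it is at
    least as fit. Most mutations fail, but those failures are the
    search process: without them there is no path to the optimum.
    """
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n_bits)]
    for _ in range(generations):
        # Flip each bit with probability 1/n_bits -- a deliberate injection of noise.
        child = [b ^ (rng.random() < 1.0 / n_bits) for b in parent]
        # Useful mistakes survive; harmful ones are discarded.
        if onemax(child) >= onemax(parent):
            parent = child
    return parent

best = evolve()
print(f"fitness: {onemax(best)} of 40")
```

A random starting string scores around 20; after a couple of thousand generations of mostly failed mutations, the survivors carry it to (or very near) the optimum of 40, which is the "fail faster" dynamic in miniature.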


About Author

Jim Harris

Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ)

Jim Harris is a recognized data quality thought leader with 25 years of enterprise data management industry experience. An independent consultant, speaker, and freelance writer, Jim is the Blogger-in-Chief at Obsessive-Compulsive Data Quality, an independent blog offering a vendor-neutral perspective on data quality and its related disciplines, including data governance, master data management, and business intelligence.
