The Second Law of Data Quality


The First Law of Data Quality explained the importance of understanding your Data Usage, which is essential to the proper preparation required before launching your data quality initiative.

This makes sure that you neither climb every data mountain, nor make a mountain out of every data molehill—but instead focus on the data being used to make critical business decisions.


The Second Law of Data Quality

“A data quality program in progress tends to incrementally improve data quality.

A data quality project at completion tends to make data quality easily forgotten.”

Data quality must always be understood as an iterative process.

A successful data quality initiative requires a program—and not a one-time project.

For incremental improvements to build momentum toward sustained success, you need to continue investing your time in a data quality program.

The Second Law of Data Quality is about the need for continuous improvement and maintaining your Data Quality Inertia.
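
To make the program-versus-project distinction concrete, here is a minimal sketch in Python. It is purely illustrative, not anything this post prescribes: the customer record layout, the metrics, and the check_quality() helper are all hypothetical. The point is simply that a program recomputes the same quality metrics on every cycle, so regressions surface instead of being declared solved once and then forgotten.

# Illustrative sketch only: a recurring quality check (a "program"),
# as opposed to a one-time cleanup script (a "project").
# The record layout, metrics, and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class QualityReport:
    total: int
    missing_email: int
    duplicate_ids: int

    @property
    def completeness(self) -> float:
        # Share of records with a populated email field.
        return 1 - self.missing_email / self.total if self.total else 1.0

def check_quality(records: list[dict]) -> QualityReport:
    # Recompute the same metrics every cycle; in a real program this
    # would run on a schedule and the results would be tracked over time.
    seen, dupes, missing = set(), 0, 0
    for rec in records:
        if rec["id"] in seen:
            dupes += 1
        seen.add(rec["id"])
        if not rec.get("email"):
            missing += 1
    return QualityReport(len(records), missing, dupes)

customers = [
    {"id": 1, "email": "kirk@enterprise.example"},
    {"id": 2, "email": ""},
    {"id": 2, "email": "spock@enterprise.example"},  # duplicate id
]
report = check_quality(customers)
print(f"completeness={report.completeness:.0%}, duplicates={report.duplicate_ids}")

Run once, this is a project: it finds one missing email and one duplicate, someone fixes them, and everyone walks away. Run every cycle against live data, it is a program: the same checks catch the next regression before it becomes a crisis.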


Data Quality Inertia

As every high school physics student could easily tell you (hopefully), Isaac Newton's First Law of Motion states that:

“An object in motion tends to stay in motion.

An object at rest tends to stay at rest.”

Newton defined inertia as the resistance an object has to any attempted change in its motion.

So, what causes a data quality initiative in progress to lose its momentum?

What can counteract data quality's inertia?

As every Star Trek fan worth at least a single bar of gold-pressed latinum could easily tell you (or anyone who has read the book The Physics of Star Trek), the answer is inertial dampeners.

(Star Trek fans—yes, I know inertial dampeners are used to protect the crew against the effects of a starship's acceleration—and not to sustain its momentum.)

(And physics purists—yes, I know Star Trek actually uses inertia negation.)


Data Quality Inertial Dampeners

Many data quality starships deactivate their inertial dampeners—always a bad idea.

In case your Star Trek metaphor safeguards are offline, this means that many data quality initiatives are approached with a “declare victory and walk away” mentality.

In other words, most initiatives are reactive data quality projects driven by business triage of the most critical data problems requiring near-term prioritization.

When the project is over, the assembled project team disbands, returning to previous activities believing “we are all done with that data quality stuff.”

Organizations following this approach to data quality will be forced into triage once again when the next inevitable crisis occurs and poor quality attacks critical data.


Can your data quality initiative boldly keep going?

Executing disconnected one-off projects to deal with data issues when they become too big to ignore is far from the sustained program of continuous improvement that a successful approach to data quality truly requires.

A data quality initiative is easy to start.

Can your data quality initiative boldly keep going?


About the Author

Jim Harris

Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ)

Jim Harris is a recognized data quality thought leader with 25 years of enterprise data management industry experience. He is an independent consultant, speaker, and freelance writer, and the Blogger-in-Chief at Obsessive-Compulsive Data Quality, an independent blog offering a vendor-neutral perspective on data quality and its related disciplines, including data governance, master data management, and business intelligence.

4 Comments

  1. I like the post, Jim. I would argue that walking away from DQ initiatives is a huge mistake, with one caveat:

    * until the organization's culture embraces DQ and DG and moves to what Tony Fisher calls the governed stage.

    At that point, perhaps DQ initiatives cease to exist because DQ is now part of the organization's DNA.

    I suppose it's kind of like changing your habits. Psychologists claim that it takes people three weeks to adopt a new routine or habit, such as going to the gym. After that, you don't have to remind yourself to do 30 minutes of cardio; you just do it.

    Sorry for the non-Star Trek allusion. Hey, at least this wasn't Rush-based!

  2. Thanks for the (especially non-Rush-based) comment, Phil.

    Good point about organizations that reach the final stage of the data governance maturity model.

    Upon reaching that stage, they exemplify the forward-thinking quality culture necessary to so seamlessly integrate DG/DQ best practices that everyone just assumes "this is how we do things here" and couldn't even imagine it being any other way.

    It's kind of like when a planet develops warp-drive technology and becomes capable of interstellar travel, meriting an official first contact and possible inclusion within the United Federation of Planets. The paradigm shift this brings to their world can never be undone.

    What? You didn't think I was going to use another Star Trek analogy?

  3. Nice post, Jim.

    I think a data migration project is one of the examples where companies traditionally put a time box around the data quality efforts: we have a go-live date, and come hell or high water we have to shut down our legacy environment and get that data moved. So some could argue that is an example of a one-stop data quality effort.

    However, even in that situation I would urge a company to look at ongoing data quality in the target system. Conceptually, this is still the same data "furniture"; it has simply "moved house", so it still needs to be quality controlled and maintained, particularly as you've gone to all the trouble of migrating high-quality data in the first place.

    I've seen this issue a few times in the past where a sponsor starts to wriggle uncomfortably in their chair at the words "continuous improvement". I often use the Human Resources capability of an organisation to illustrate why data quality is not a one-stop effort.

    When we've spent 6 months trying to hire that perfect recruit, we don't then simply walk away from our responsibilities to them as an asset to the company. We continue to support them, ensure there are no underlying issues, and monitor their progress to ensure they increase the quality of service in their role.

    Data quality is no different, in my view. Yes, we must strive to eliminate defects at source and build in defect prevention mechanisms, but once we create a data asset, surely we must continue to analyse, measure, improve and control it throughout its life cycle?

    I think where confusion has arisen is that so many people now mistakenly equate data quality with the functions of cleansing, de-duping, matching, profiling, etc. These are all "bit-part players" and not to be confused with the broader goal, which is continuous improvement.

  4. Thanks for the excellent comment, Dylan.

    I really like the data as “furniture” moving from one house to another as a simile for a data migration project.

    (For those playing along at home -- it was a simile and not a metaphor because Dylan used “as” to compare data to furniture -- I am not just a Star Trek geek :-) )

    I also completely agree with the Human Resources analogy for data quality management and continuous improvement.

    Cheers,

    Jim

