When poor data quality calls


It has been a while since I shared a story about DQ-IRL (i.e., Data Quality in Real Life; a few of the past stories included The Seven Year Glitch and Data Quality, 50023).

While listening to a recent broadcast of NPR’s weekly news quiz program Wait Wait... Don't Tell Me!, I learned about a DQ-IRL story recently reported by Lawrence Mower in the Las Vegas Review-Journal.

For the past two years, a 59-year-old Las Vegas retiree has been pestered by people showing up at his house at all hours of the day and night demanding their cell phones, as well as by local police hunting down criminals via their cell phones or responding to 911 calls made from cell phones.

The root cause of the problem is a data quality glitch in the GPS tracking feature used by a major mobile provider, a feature primarily intended to help its customers locate their missing cell phones. Please note that these phone-locating services do not provide exact positions; instead, they triangulate the signal between nearby cell towers to provide a general area in which to begin your search.
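To make the triangulation idea concrete, here is a minimal sketch of one simplified approach: estimating a phone's rough position as the signal-strength-weighted centroid of nearby cell towers. This is purely illustrative and not the provider's actual algorithm; the tower coordinates and weights are made up, and real systems use far more sophisticated methods.

```python
# Illustrative sketch only: a weighted-centroid estimate of a phone's
# rough position from nearby cell towers. The towers and weights below
# are hypothetical; real carrier positioning systems are more complex.

def estimate_location(towers):
    """towers: list of (latitude, longitude, relative_signal_strength)."""
    total = sum(strength for _, _, strength in towers)
    lat = sum(la * s for la, _, s in towers) / total
    lon = sum(lo * s for _, lo, s in towers) / total
    return lat, lon

# Three hypothetical Las Vegas-area towers.
towers = [
    (36.10, -115.17, 0.5),
    (36.12, -115.15, 0.3),
    (36.11, -115.20, 0.2),
]

lat, lon = estimate_location(towers)
```

The key point the sketch illustrates is that the output is only a general area, never an exact address, which is why a systematic error (such as a default coordinate) could send every search party to the same front door.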

However, for some reason, one specific location is being provided for all Las Vegas area missing cell phone traces — the house of the 59-year-old retiree, who has had to resort to posting a sign next to his front door telling people that he doesn’t have their cell phone and advising them to call police.

Tracing the root cause of a data quality issue is difficult.  And often, when poor data quality calls, no one wants to answer because no one wants to be blamed for causing the data quality issue.  But just imagine if poor data quality caused everyone to literally come knocking on your door.

Share Your DQ-IRL Story

Please share your DQ-IRL story by posting a comment below.

Alternatively, post it on your own blog, then let us all know about it via a comment, a trackback, or if you use Twitter, then please share it via the #DQ-IRL hashtag.


About Author

Jim Harris

Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ)

Jim Harris is a recognized data quality thought leader with 25 years of enterprise data management industry experience. Jim is an independent consultant, speaker, and freelance writer. Jim is the Blogger-in-Chief at Obsessive-Compulsive Data Quality, an independent blog offering a vendor-neutral perspective on data quality and its related disciplines, including data governance, master data management, and business intelligence.

1 Comment

  1. Great article Jim!

    I agree that if there is a glitch in the data system, all manner of things start to go wrong. It's imperative that the data being used by businesses is accurate and relevant if it's to be useful. If there is a problem with the data, it will only reflect poorly on the product being produced or the results being pursued. I read this the other day; you might want to take a look over it. http://www.dnb.co.uk/solutions/data-integration-solutions/data-intelligence/b2b-data-lists

    R.
