Kimberly Nevala's Australian tour with Rick Andrews last week has renewed interest in data quality. Kimberly is the Director of Business Solutions for Information Management at SAS. Rick is a Senior Information Quality Analyst at Telstra. A key takeaway for me from the breakfast sessions was the question put to Kimberly and Rick about the return on investment of data quality initiatives.
When building the business case for data quality it is important to link the data in question to a business objective. We can then break the business initiative down to see where improving data quality will enhance the outcome, linking that outcome to the key information points that support it. The business driver for a data quality initiative is therefore not to correct bad data; it is to enhance existing, or new, business processes. By “enhance” we normally mean three things:
- Increasing income.
- Decreasing costs.
- Mitigating risk/achieving regulatory compliance.
In order to do this we need to agree that the reason data exists within an organisation is to support business processes (e.g. taking an order from a customer, activating a service). Some processes bring data into the organisation while others consume it (e.g. financial planning, stock ordering, assessing the credit risk of a customer).
Here are some of the customer examples from the IM breakfast session:
- Increased Revenue: A 5% improvement in marketing response rates, achieved using the same business process and only improving the quality of the data fed into it.
- Reduced Cost: A retailer improved the onboarding of new products from their suppliers, going from 3-4 weeks to 48 hours by better defining the data they required. They also reduced the duplication of data entry across their systems. Faster onboarding brings forward the revenue recognised from selling those products (Increased Revenue), and the reduced duplication and rework frees up resources for other tasks.
- Regulatory Compliance: A US-based bank spent US$14.5 million to settle a claim where customer information was improperly distributed and used for marketing purposes. We need to calculate how much it will cost to protect our customer information and contrast this with the cost of non-compliance.
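As a back-of-the-envelope sketch of that last comparison, we can weigh the annual cost of controls against the expected cost of non-compliance. All figures below are hypothetical placeholders, apart from the US$14.5 million settlement cited above:

```python
# Hypothetical figures for illustration only; substitute your own estimates.
cost_of_controls = 2_000_000       # annual spend on protecting customer information
potential_penalty = 14_500_000     # e.g. the settlement cited above
probability_of_breach = 0.20       # estimated annual likelihood without controls

expected_cost_of_non_compliance = potential_penalty * probability_of_breach

print(f"Annual cost of controls:         ${cost_of_controls:,.0f}")
print(f"Expected cost of non-compliance: ${expected_cost_of_non_compliance:,.0f}")
if expected_cost_of_non_compliance > cost_of_controls:
    print("On expected-loss grounds alone, the controls pay for themselves.")
```

The probability estimate is the contentious input here; even a rough range agreed with the risk function is enough to frame the conversation in financial terms.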
Rick Andrews mentioned that the data quality firewall had already delivered value in the form of process improvement efficiencies. It had also improved the ability to maintain separation between wholesale and retail data, an important regulatory requirement for Telstra.
In a purely reporting and analytics context we should focus on improving the process of consuming information. Improved data quality reduces the time we spend defending and validating our results. Who amongst us has spent hours in meetings arguing over which department's figure for the total number of customers is correct? And then in follow-up meetings trying to reconcile each other's figures and reach agreement before presenting to the board? There is huge scope for process improvement efficiencies in this area. From there we can link the data to the business context in which it is used. For example, after analysing our supplier data we identify four separate contracts with the same supplier. Realising that our total business with that supplier is 10 times larger than any single contract suggested, we are able to negotiate another 3% off our standard agreement. It all adds up.
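A minimal sketch of that supplier consolidation, assuming a simple list of contract records where the same supplier appears under inconsistent name variants (all names and figures below are hypothetical):

```python
from collections import defaultdict

# Hypothetical contract records; one supplier appears under four name variants.
contracts = [
    {"supplier": "Acme Pty Ltd",     "annual_spend": 120_000},
    {"supplier": "ACME Pty. Ltd.",   "annual_spend": 450_000},
    {"supplier": "Acme",             "annual_spend": 95_000},
    {"supplier": "Acme Pty Limited", "annual_spend": 535_000},
]

def normalise(name: str) -> str:
    """Crude name standardisation: lowercase, strip punctuation and legal suffixes."""
    name = name.lower().replace(".", "").replace(",", "")
    for suffix in (" pty ltd", " pty limited", " ltd", " limited"):
        name = name.removesuffix(suffix)  # Python 3.9+
    return name.strip()

# Aggregate spend under the standardised supplier name.
spend_by_supplier = defaultdict(int)
for contract in contracts:
    spend_by_supplier[normalise(contract["supplier"])] += contract["annual_spend"]

for supplier, total in spend_by_supplier.items():
    print(f"{supplier}: ${total:,} total annual spend")
    # The 3% discount on the consolidated total, per the example above.
    print(f"  potential saving at 3%: ${total * 0.03:,.0f}")
```

Real supplier matching would use proper standardisation and fuzzy matching tools rather than a hand-rolled suffix list, but the business logic is the same: the consolidated spend figure is what gives us negotiating leverage.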
Once we have made the link to business objectives, and especially if we quantify them in financial terms, finding funding for our data quality initiative becomes much easier. It will also help us set priorities for where to start - based on greatest business benefit.
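If we want to make that prioritisation explicit, a trivial sketch might rank candidate initiatives by estimated net benefit (all initiative names and figures below are hypothetical):

```python
# Hypothetical candidate initiatives: (name, estimated annual benefit, estimated cost).
initiatives = [
    ("Customer de-duplication for marketing", 350_000, 80_000),
    ("Supplier master data consolidation",    150_000, 40_000),
    ("Product onboarding data standards",     500_000, 120_000),
]

# Rank by net benefit so the business case leads with the biggest win.
for name, benefit, cost in sorted(initiatives, key=lambda i: i[1] - i[2], reverse=True):
    print(f"{name}: net benefit ${benefit - cost:,}")
```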