Data quality is a topic that is often discussed in insurance, yet it frequently plays a subordinate role in day-to-day project work. I asked Karen Prillwitz about the importance of data quality at large insurers. Karen has advised insurance companies for many years and, as a project manager at a large German insurance company, has felt the effects of poor data quality first-hand. She has a clear opinion on this dilemma.
Why is professional data quality management for insurance so important?
The issue of data quality is as old as the IT industry itself. Quality problems and their effects are complex. IT departments use programmed rules to ensure that data input is technically correct, but technically valid content is still far from being right. Who hasn’t seen the same column headings with different content? This has already “enriched” many a board meeting with a discussion about which number is the right one. It sounds funny, but it is not, because it undermines confidence in the numbers. Apply this to everyday life: imagine your car warns you that the tank is almost empty, but at the same time reports a remaining range of 400 km. Which is right?
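To make the distinction concrete, here is a minimal, purely hypothetical Python sketch (no specific DQ tool is implied): both records pass a typical programmed input rule, yet their content is not comparable, because the two source systems report the premium in different units.

```python
def is_technically_valid(record: dict) -> bool:
    """A typical programmed input rule: premium must be a non-negative number."""
    premium = record.get("premium")
    return isinstance(premium, (int, float)) and premium >= 0

# Hypothetical extracts: both columns are called "premium",
# but one system reports euros and the other thousands of euros.
system_a = {"policy_id": "A-1001", "premium": 1200.0}  # euros
system_b = {"policy_id": "B-2002", "premium": 1.2}     # thousands of euros

for record in (system_a, system_b):
    print(record["policy_id"], is_technically_valid(record))  # both print True
```

Both checks pass, so any report that sums these columns silently mixes units – exactly the kind of “same heading, different content” problem described above.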
Of course, you refuel. But you’re right: it sounds trivial, but it can have unintended consequences.
Exactly, and when quality problems cause an entire project to fail, it becomes clear what priority the topic deserves. Poor data quality costs money, can lead to considerable project delays and, in the worst case, to seriously bad decisions. Data is the raw material, the foundation, for many important decisions in insurance. Tariff calculation based on telematics data, risk analyses based on natural catastrophe (Nat Cat) claims and weather data, next best action/offer (NBA/NBO), Net Promoter Score – just to name a few data-driven topics. You would not build your house on sand, but on a solid foundation. Professional data quality management ensures that decisions rest on such a foundation.
True, and the number of data-driven decisions is growing. Don’t regulatory requirements make the issue of data quality even more important?
Correct, new legal requirements such as Solvency II and IFRS contain requirements for data governance and data quality. But innovative new projects – in the area of digitisation, for example – should not be underestimated either. Data scientists are tapping many new data sources from the social media and big data environment, and you have to understand the data in order to analyse it meaningfully. These agile projects are about speed, and the quality of the data determines what results can be achieved. Why waste time unnecessarily on analyses that aren’t possible with the available data?
And how can software support this? Rules are difficult to apply to unknown data.
DQ software supports profiling, which is especially important at the beginning of a new analysis. Appropriate technology is needed to ensure high performance for these tasks – within SAS, for example, with “in-Hadoop” profiling. An important advantage: the resulting rules can also be used in production.
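To illustrate what profiling typically delivers, here is a small, hypothetical Python/pandas sketch – not the SAS in-Hadoop implementation mentioned above – that computes a few standard profile metrics per column:

```python
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Compute a few standard profiling metrics for each column."""
    rows = []
    for col in df.columns:
        s = df[col]
        rows.append({
            "column": col,
            "dtype": str(s.dtype),
            "null_rate": round(s.isna().mean(), 2),
            "distinct": s.nunique(dropna=True),
            "sample": s.dropna().unique()[:3].tolist(),
        })
    return pd.DataFrame(rows)

# Hypothetical policy extract with a suspicious 'premium' column.
df = pd.DataFrame({
    "policy_id": ["A-1001", "A-1002", "B-2001"],
    "premium": [1200.0, None, 1.2],
})
print(profile(df))
```

A profile like this quickly surfaces suspicious null rates or value ranges, from which concrete validation rules can be derived – and, as Karen notes, reused in production.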
There are so many reasons to treat data quality as a top priority and to set it up professionally. What do you mean by professional data quality management? Is it mainly software?
The right solution plays an important role, of course. It should support the responsible quality managers optimally and also be suitable for worldwide use. And it is, of course, the basis for a meaningful glossary and the legally required monitoring of data quality. Much more important, however, is that an understanding of the importance of data quality emerges within the company – and that this understanding is also reflected in organisational processes.
Are insurers aware of all this? Do they know how important data quality is and how much trouble they could save themselves?
I believe there is a certain awareness, but the impact of poor data quality is difficult to measure, and the organisational hurdles have not been sufficiently identified. Historically, IT is still responsible for data quality, but it mostly tests the data only formally; the content is evaluated in the business areas, where, once again, the resources to ensure good data quality are missing. And that must change. Let me put it this way: every insurance company has a data protection officer. Data quality managers are rare, and at upper management level they hardly exist at all.
To me, this means that professional data quality management amounts to establishing a data governance process. Is that right?
That’s exactly how it is. People responsible for DQ are needed in every relevant business area, and they must also be given the means to fulfil their roles.
For example, to create and maintain the business rules themselves?
Exactly, and a really important point, because the existing procedures are far too cumbersome and prevent the business specialists from taking real responsibility. Defined transfer processes ensure that only approved and tested rules go into production. This creates a new kind of cooperation between the DQ managers and IT – a new role concept. On the management side, a data governance board ensures the necessary prioritisation and the provision of the necessary funds.
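As a purely illustrative sketch – the interview names no specific tooling – such a hand-over process could pair every business rule with its own test cases, plus a promotion step that accepts only approved rules whose tests pass:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class DQRule:
    """A business rule owned by a DQ manager, with built-in test cases."""
    name: str
    check: Callable[[dict], bool]  # the rule itself
    passing_example: dict          # a record the rule must accept
    failing_example: dict          # a record the rule must reject
    approved: bool = False         # set by the data governance board

def promote_to_production(rule: DQRule, production_rules: list) -> None:
    """Transfer step: only approved rules whose tests pass go live."""
    if not rule.approved:
        raise ValueError(f"Rule '{rule.name}' has not been approved yet.")
    if not rule.check(rule.passing_example) or rule.check(rule.failing_example):
        raise ValueError(f"Rule '{rule.name}' failed its own test cases.")
    production_rules.append(rule)

# Example rule: an annual premium below 10 EUR is implausible and
# is probably reported in thousands of euros.
premium_rule = DQRule(
    name="plausible_annual_premium",
    check=lambda r: isinstance(r.get("premium"), (int, float)) and r["premium"] >= 10,
    passing_example={"premium": 1200.0},
    failing_example={"premium": 1.2},
    approved=True,
)

production_rules: list = []
promote_to_production(premium_rule, production_rules)
print([rule.name for rule in production_rules])  # ['plausible_annual_premium']
```

The point of such a design is the separation of duties: the business specialist owns the rule and its test cases, while the automated transfer step gives IT the assurance that nothing untested reaches production.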
This sounds like a complicated step. Is that really necessary?
It sounds unusual, but successes come quickly.
How do you start as a company?
It is important to proceed with a sense of proportion and an eye for reasonable pragmatism. Huge reconciliation exercises and bureaucracy help no one. Small, effective teams with the right practical solutions and a lean decision-making and prioritisation process are essential for success. Properly applied, this even saves time, as the functional correctness of the data is ensured by business specialists, who have the necessary know-how thanks to their business responsibility. This replaces the elaborate preparation of design concepts and the many coordination rounds they entail. Properly implemented, it gives insurance companies a much more solid basis for their decisions.
In summary, professional data quality management is of great importance to insurance companies, both in terms of cost and as a basis for regulatory processes and important business decisions. It is time we gave it the priority it deserves.
Thank you, Karen. In our next interview, we will discuss what else is important for establishing a professional data governance process.
This interview was originally published in German on the regional SAS blog Mehr Wissen.