Is good…good enough?

“Garbage in, garbage out” is more than a catchphrase – it’s the unfortunate reality of many analytics initiatives. For most analytical applications, the biggest problem lies not in the predictive modeling, but in gathering and preparing data for analysis. When an analytics initiative seems to be underperforming, the problem almost invariably lies in the data.

Entity resolution
Insurers typically have multiple legacy transactional systems, often one for each line of business. When an insurer has an individual with profiles across multiple systems, it needs to be able to identify them as the same person and resolve the different data variations into a single entity. In many cases entity resolution can be solved using simple business rules based on data items like date of birth and phone numbers, but this is not always sufficient. Insurers are now using advanced analytical techniques such as probabilistic matching to determine the actual statistical likelihood that two entities are the same.
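To illustrate the idea, here is a minimal sketch of probabilistic matching using only Python’s standard library; the profile fields, weights and threshold are illustrative assumptions, not a production matching engine:

```python
from difflib import SequenceMatcher

def match_score(a, b):
    """Rough likelihood that two customer profiles are the same person,
    blending fuzzy name similarity with exact date-of-birth agreement."""
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    dob_match = 1.0 if a["dob"] == b["dob"] else 0.0
    # Weights are illustrative: names carry most of the signal, DOB confirms it.
    return 0.7 * name_sim + 0.3 * dob_match

p1 = {"name": "Jonathan Smith", "dob": "1975-04-02"}
p2 = {"name": "Jon Smith",      "dob": "1975-04-02"}
p3 = {"name": "Mary Jones",     "dob": "1980-11-19"}

print(match_score(p1, p2))  # high score: likely the same person
print(match_score(p1, p3))  # low score: likely different people
```

A real matching engine would weigh many more fields (address, phone, policy history) and calibrate the weights statistically, but the principle is the same: a score, not a hard rule.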

However, when it comes to data quality there is one small anomaly: data that is being used for fraud analytics. Insurance companies should be careful not to over-cleanse their data. In some cases, an error such as a transposition in a phone number or ID number may be intentional; that’s how fraudsters generate variations of their data.
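For instance, a fraud-aware matching process might flag pairs of records that differ by exactly one transposed pair of adjacent digits, rather than silently correcting them as dirty data. A minimal sketch (the phone numbers are made up):

```python
def is_single_transposition(a, b):
    """True if b can be produced from a by swapping one adjacent pair of characters."""
    if len(a) != len(b) or a == b:
        return False
    diffs = [i for i in range(len(a)) if a[i] != b[i]]
    return (len(diffs) == 2 and diffs[1] == diffs[0] + 1
            and a[diffs[0]] == b[diffs[1]] and a[diffs[1]] == b[diffs[0]])

# Two phone numbers that differ only by one transposed digit pair:
print(is_single_transposition("5551234", "5551324"))  # True
print(is_single_transposition("5551234", "5559999"))  # False
```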

To learn more download the white paper “Fraud Analytics: Data Challenges and Business Opportunities”.

Insurance companies should not underestimate the amount of work required for data management. It’s not uncommon for over 50 percent of the implementation effort to be dedicated to data integration and data quality. However, they should not overthink the problem. While poor data will deliver poor results, perfect data is an unrealistic expectation. In most cases good data is good enough!

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.



How insurers can take a giant leap forward with big data

Oh, how times have changed during my 20-plus years in the insurance industry. Data wasn’t a word we used much back in the 80s and 90s, unless of course you worked in those arcane and mysterious IT data centres.

Even amidst the computerisation of the insurance industry in the 80s, many policy documents and related files were still paper-based. In those days the data being captured came from the company’s own staff and was keyed into a mainframe terminal. The most sophisticated ones received a bordereaux file from their insurance brokers, transferred through a kind of electronic data interface or courier process – and more often than not this was just a floppy disk received in the post.


Today, insurance companies have more data than they can realistically handle. And the sources are numerous, including: in-house generated, aggregators, fraud bureaus, government agencies, telematics boxes and of course social media sites. This much data presents a pretty significant headache for insurance companies: where do you store it all, and how do you access and understand what that data can tell you – quickly? Most important of all, how do you profit from it?

What are the questions around “big data” that insurers today are grappling with?

Storing big data
First of all, let’s define the size of the problem. We know that many insurers and insurance brokers produce anywhere between 300,000 and 700,000 quotes per day for the UK aggregator sites, and that telematics boxes can generate hundreds of data items per second. Deciding how much of this data to store and where to store it cheaply has become a serious issue for insurers. Cost is the primary challenge, but any storage also has to meet regulatory compliance needs, so you can’t just store this data anywhere.
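To get a feel for the scale, a back-of-envelope calculation helps; the record sizes below are illustrative assumptions, not measured figures:

```python
# All record sizes here are illustrative assumptions.
quotes_per_day = 700_000                 # upper end of the quoted range
quote_record_kb = 2                      # assumed size of one stored quote
quote_gb_per_year = quotes_per_day * quote_record_kb * 365 / 1_000_000
print(f"Quote data: ~{quote_gb_per_year:.0f} GB per year")

readings_per_second = 100                # assumed telematics sampling rate
reading_bytes = 50                       # assumed size of one reading
box_mb_per_hour = readings_per_second * reading_bytes * 3600 / 1_000_000
print(f"One telematics box: ~{box_mb_per_hour:.0f} MB per hour of driving")
```

Even with these modest assumptions, a single book of aggregator quotes runs to hundreds of gigabytes a year before you add a telematics fleet – hence the storage question.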

Accessing and analysing big data
Storing all this data effectively and cheaply is only worthwhile if the business users can access it quickly. Only then can they run the analysis and reports that they need. How do you do that when you have terabytes’ worth of data and many millions of records? Traditional data warehouse systems and Excel (the typical analyst’s tool of choice) struggle with these volumes, the speeds required and the growth in unstructured data. As such, insurers need to look at new ways to do this, such as Hadoop.

Visualising big data
For end users, one of the biggest problems with big data is simply being able to see the data fields and structures themselves. A few insurers have invested the time and effort to build good data dictionaries that help users to understand the data and associated metadata. But for everyone else, clearly labelled tables, fields and data values are extremely important if you want to get fast insights.

However, knowing the question is only half the problem. We also need to look at what insurers can do to address these issues. The first thing to know is that the answers are here today and are available for all insurance companies.

The data can now be quickly and cheaply stored in Hadoop and, with tools like SAS Visual Analytics, can be easily accessed and analysed by anyone with a mouse and a web browser. The ability to review and analyse vast quantities of data received from insurance aggregators in real time makes data insight immediately actionable. If you wanted to improve your quote to policy conversion, you could reduce your price by a percentage point. Or increase your price if, for example, you’ve reached your quota for young drivers. With technologies like SAS Event Stream Processing, this is a reality.

In the world of data, times have changed a great deal, but users need not fear the terms “big data” or “Big Data Analytics”. In those immortal words “we have the technology…”

If you want to know more about how to properly exploit big data using Hadoop, follow the eight-point checklist set out in this report.


No more excuses... Analytics IS a game changer

Good news... a survey last year found that 72% of insurance executives agreed that analytics will be the biggest game-changer in the next two years. Bad news... the adoption rate of analytics in insurance has lagged other industries.

To reverse this trend and travel down the path to becoming an Analytic Insurer, insurance companies must consider four things:


- What data is accessible and obtainable?
- What type of analytic tools are needed and available?
- What type of resources are required? What are the right skill sets?
- What best practices or processes will facilitate the implementation of an analytic strategy?


Organizations new to analytics should start with small proof-of-concept and pilot projects and build on success to gain support throughout the organization. This will lead to buy-in from senior executives, increased funding and more ambitious projects. A win-win for everyone.

There is no “one size fits all” when it comes to analytic execution. To find out how you can transform your organization into the Analytic Insurer download a copy of the white paper “Building a strategic analytics culture: a guide for the insurance industry”.

To survive and thrive in the competitive insurance industry, carriers must implement analytics.

There are no more excuses.

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.


Help wanted

The insurance industry is heading for a crisis. Depending on which report you read, the industry is facing a shortfall of anywhere from 40,000 to nearly half a million workers in the next few years. Baby boomers in specialized jobs like underwriting and claims adjusting are retiring, and insurers are struggling to replace these employees.

Insurers face a classic good news/bad news scenario. On the positive side, the industry is financially stable, has great career paths, offers good benefits and is almost recession-proof. On the negative side, the perception is that insurance is an industry of laggards. The decision makers at the top are old-school and struggle to relate to the millennial generation graduating from college. Insurance companies have to be able to connect with these individuals and look cool, because that is the “new normal”.

In an earlier Analytic Insurer blog I wrote about how data scientist is considered the sexiest job of the 21st century. Ironically, the insurance industry is no stranger to the importance of analytics and has historically been successful in attracting mathematicians and statisticians to work as actuaries. But as more and more industries and companies develop data-driven initiatives, the demand for individuals with analytical skills has increased. Attracting and retaining employees with such specialized capabilities therefore remains a significant hurdle to overcome – and a potential differentiator for those that do.

One insurance company that is reversing this trend is XL Group. To get a better handle on pricing, XL Group created an analytical team, led by SVP of Strategic Analytics Kimberly Holmes, to work on predictive, multivariate analytics using internal and external data. After four years, XL has seen its claims per dollar of premium written fall significantly in those lines of business which were early adopters of predictive models. To read more about this case study and how analytical talent is driving a competitive advantage, download the MITSloan report "The Talent Dividend".

The question for all insurers concerned with building their analytics capabilities is this: What is your plan for cultivating analytics talent?

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.



Location, Location, Location

If you are buying or selling a house, the realtor will tell you the value of the property is all about location, location, location. For insurance companies, location is just as important.

For an underwriter assessing the risk on a property, it is essential to consider the property’s location. How close is it to the coastline? Is it in a flood plain? Where is the nearest fire station? It is even important to know the types of businesses in surrounding properties. A clothing manufacturer next to a fireworks store is a greater risk than one whose neighboring property sells sports equipment. Traditionally, underwriters have relied on quotation data from the insured or agent, plus information from local adjusters. Now underwriters are supplementing that information with geo-spatial data.
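A question like “where is the nearest fire station?” can be answered directly from coordinates with the haversine great-circle formula; the property and station coordinates below are hypothetical:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Hypothetical insured property and nearby fire stations.
property_loc = (51.5074, -0.1278)
stations = {"Soho": (51.5136, -0.1365), "Lambeth": (51.4936, -0.1170)}
nearest = min(stations, key=lambda s: haversine_km(*property_loc, *stations[s]))
print(nearest, round(haversine_km(*property_loc, *stations[nearest]), 1), "km")
```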



Of course, it is impossible to predict exactly where a natural disaster will hit, but quickly predicting the exposure is essential when catastrophe strikes. When torrential rain inundated the city of Calgary in June 2013, Aviva Canada used SAS Visual Analytics to quickly determine which customers were most affected. By combining policy data with geocoding software, SAS provided Aviva Canada executives with a dashboard to easily monitor the amount of claims coming in. This resulted in speedier claim fulfillment, lower fraud rates and better customer retention. Read more about the AVIVA Canada case study.


There are many other business applications for location analytics within insurance. One example is marketing. Insurance companies have always created customer segmentation, but now they can create micro-segmentation down to a hyperlocal level. By overlaying internal customer data with behavioral and demographic information, marketing analysts can identify sales gaps and untapped opportunities.

Another example is using geo-spatial data to quickly find emerging trends in fraudulent and suspicious activities. This video shows how quickly and effectively an insurance company could reveal anomalies in the data. In this instance, by analyzing the data the insurance company was able to discover that the spike in claims was related to unscrupulous, out-of-state contractors targeting elderly policyholders for unnecessary property repairs.

SAS Visual Analytics and location analysis empower underwriters, adjusters and marketing specialists to better assess risk, work more efficiently and increase revenue. Are you using them?

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.


Customer experience conundrum

Who is your best customer? The answer to this question can vary dramatically depending on your industry. A retailer’s best customer is someone who comes back to their store over and over again. A gym owner’s best customer could be the consumer who pays their monthly fee on time but never uses the gym. It could be argued that an insurance company’s best customer is closer to the gym analogy than the retail one: the policyholder who pays their premiums and never makes a claim.

Acquiring new customers is expensive, so keeping the ones you have and increasing wallet share is critical to success. It’s a multifaceted challenge that often comes down to customer experience. The conundrum for insurers is how to improve the customer experience when you have little interaction with your best customers.

A key insight is to know your audience; understand your customers and what drives their behavior. While demographic variables give you a good idea of what a customer looks like, many insurers are finding that behavioral data is a much better predictor. Examples of behavioral data would be events such as getting married, moving to a new house or having a baby. Just as important as identifying these event-driven variables and who to target is knowing who NOT to target. Removing non-buying customers from marketing campaigns helps increase acquisition rates and reduce marketing spend, saving millions of dollars per year.

One organization with a reputation for exceptional customer service is USAA, a financial services company providing insurance to the US military and their families. Doug Mowen, Executive Director of USAA’s Data and Analytics Office, recently shared his insights into using analytics to improve the customer experience. To read these insights, download the conclusion paper “Five Keys to Marketing Excellence”.

Measuring the impact of customer experience efforts is difficult. Most insurance companies struggle to tie customer experience investments to the bottom line. But ultimately insurance companies that embrace analytically driven customer experience management can get an edge on the competition by enabling better customer experiences, which in turn creates value for their organizations.

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.


Time is precious, so are your analytical models

The analytical lifecycle is iterative and interactive in nature. The process is not a one-and-done exercise; insurance companies need to continuously evaluate and manage their growing model portfolios. In the last of four articles on the analytical lifecycle, this blog will cover the model management process.

Model management is a critical step in the analytics lifecycle. It involves validating a model, selecting a champion model, and monitoring the performance of your models on a constant basis to decide whether to retire or upgrade current models or create new ones.

Let’s explore some important best practices for model management:

Model Validation and Comparison
With advances in modeling software and computing performance, it’s both easy and feasible to build multiple candidate models for use in analyses. Analysts can efficiently experiment with different binning definitions, select alternate input variables and modeling techniques, and build multiple models. But before using these models, insurance companies need to evaluate them to determine which ones provide the greatest benefit.
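One common yardstick for comparing candidate models is the area under the ROC curve (AUC); this sketch computes it from scratch with the rank-sum formula, over hypothetical claim labels and model scores:

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formula."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels  = [1, 1, 1, 0, 0, 0, 1, 0]       # 1 = claim occurred (hypothetical)
model_a = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2, 0.6, 0.5]
model_b = [0.9, 0.4, 0.7, 0.8, 0.3, 0.2, 0.6, 0.5]
champion = "A" if auc(labels, model_a) >= auc(labels, model_b) else "B"
print("champion model:", champion)
```

Here model A ranks every claimant above every non-claimant, so it wins the champion slot; in practice validation would also use holdout data, lift charts and business constraints.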


Model Monitoring
To be effective over time, analytical models need to be monitored regularly and adjusted to optimize their performance. Outdated, poorly performing models can be dangerous, as they can generate inaccurate projections that could lead to poor business decisions. As a best practice, once a model is in a production environment and being executed at regular intervals, the champion model should be centrally monitored through a variety of reports, as its performance will degrade over time. When performance degradation hits a certain threshold, the model should be replaced with a new model that has been recalibrated or rebuilt.
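A widely used drift measure behind this kind of threshold is the Population Stability Index (PSI); the score-band proportions below are hypothetical, and the 0.25 cut-off is a common rule of thumb rather than a fixed standard:

```python
from math import log

def psi(expected, actual):
    """Population Stability Index between two sets of score-band proportions."""
    return sum((a - e) * log(a / e) for e, a in zip(expected, actual))

baseline = [0.25, 0.25, 0.25, 0.25]      # score-band mix when the model shipped
current  = [0.10, 0.15, 0.30, 0.45]      # hypothetical mix observed this month
drift = psi(baseline, current)
# Rule of thumb: PSI above 0.25 signals the model needs recalibrating.
print("recalibrate" if drift > 0.25 else "ok")
```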

Model Governance
For insurance companies, greater regulatory scrutiny increases the need for model transparency. However, keeping records of the entire modeling process – from data preparation through to model performance – can be time consuming and unproductive. As a best practice, deploy model governance tools that enable you to automate model governance – complete with documentation to create qualitative notes, summaries and detailed explanations, as needed. Having an efficient model governance approval process can also save you time and money, as it can help insurers avoid audits and regulatory fines.

To learn more about the model management process, download the white paper “Manage the analytical lifecycle for continuous innovation”.

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.


Putting predictive analytics to work

Insurance relies on the ability to predict future claims or loss exposure based on historical information and experience. However, insurers face an uncertain future due to spiraling operational costs, escalating regulatory pressures, increasing competition and greater customer expectations.

More than ever, insurance companies need to optimize their business processes. But what does that mean in practice?

In the third part of a series of articles on the analytical lifecycle, this blog focuses on an analytics ecosystem for decision making.

Traditionally, analytics has been seen as a back-office function, implemented in silos by different departments and lines of business. Today it is becoming essential that insurance companies embed analytics into real-time business decisions by deploying predictive models into transactional systems like policy administration, claims management or even call centers.

Whether they are underwriters, claims adjusters or risk managers, insurance personnel make hundreds of thousands of operational decisions each day that impact an insurance company. For insurance companies that rely on analytical models in the decision process, SAS Decision Manager provides a single point of control to integrate sophisticated analytical models with their business rules. The solution takes data, business rules and analytical models and turns them into consistent, automated actions that drive faster, better operational decisions. In addition, these business decisions are now governed, traceable and fully documented, which is essential in the heavily regulated insurance industry.

One organization that recognized that to stay ahead of the competition it had to manage its information as a strategic asset is Australian insurance company IAG. It uses SAS High-Performance Data Mining to analyze its growing database; analytical models that used to take hours to run can now be completed in minutes. Read more about the IAG case study.

Today, insurance companies are becoming data intoxicated as they consume more and more data. But the true value of big data lies not just in having it, but in being able to use it for fast, fact-based decisions that lead to real business value.

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.

Demystifying analytics

There is no doubt that analytics is an overused and often abused term. So what does analytics really mean?

In part 2 of a series of articles on the analytical lifecycle, this blog will highlight some of the common and emerging techniques used to analyze data and build predictive models.


Exploratory data analysis
The science of extracting insight from data is constantly evolving. But regardless of how much data you have, one of the best ways to determine important information is through exploratory data analysis. Some of the most valuable ways you can do this are:

  • Line graphs
  • Bar and pie charts
  • Scatter and bubble plots
  • Correlation matrices
  • Clustering

Predictive modeling techniques
To gain an edge in today’s competitive market, insurance companies need advanced analytics solutions that reveal hidden patterns and insights. A few of the most valuable predictive modeling techniques are:

  • Regression models
  • Generalized linear modeling
  • Decision trees
  • Forecasting

Emerging analytical techniques
The digital age has brought with it a quantum increase in the amount of data available to companies. The vast majority of new data being created is semi-structured and unstructured data, both of which require new analytical techniques to generate insights. Several of the most important, emerging analytical techniques used to analyze big data are:

  • Link analysis
  • Textual analytics
  • Word clouds
  • Sentiment analysis

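Behind word clouds and much of textual analytics sits something as simple as a word-frequency count; a sketch over made-up claim notes:

```python
from collections import Counter

# Hypothetical claim notes; counting words is the first step behind
# word clouds and simple textual analytics.
notes = [
    "Water damage to kitchen after storm",
    "Storm damage to roof, water leaking into bedroom",
    "Minor scratch on vehicle door",
]
words = Counter(w.strip(",.").lower() for note in notes for w in note.split())
print(words.most_common(3))
```
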
Analytics is no longer a luxury. It has become fundamental to the success and growth of insurance organizations worldwide because it supports better decision making about customers, prices, offers and more. The key is turning the massive amounts of data your company has collected – and continues to collect – into timely, trusted insights and competitive advantage. The best way to do this is through proven analytical techniques. To learn more about the available analytical capabilities download the white paper “Demystifying Analytics: Proven Analytical Techniques and Best Practices for Insurers”.

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.


Data is King

In my last blog I detailed the four primary steps within the analytical lifecycle. The first and most time consuming step is data preparation.

Many consider the term “Big Data” overhyped, and certainly overused. But there is no doubt that the explosion of new data is turning the insurance business model on its head. There is more data and more access to data than there has ever been, and it’s growing. The challenge for insurers is how to take advantage of all this data to price better, expand your market and improve the business of underwriting risk and handling claims.

In the past, insurance companies have relied on “old data”: information from policy administration solutions, claims management applications and billing systems, often supplemented by third-party data such as census data, motor vehicle records (MVRs), medical reports and Dun & Bradstreet, to name but a few. Today, however, insurance companies are incorporating “new data” such as credit scores, social media sites like Facebook and Twitter, telematics from in-car data recording devices and geo-spatial information like Google Maps. Unfortunately, many insurers are drowning in this vast amount of data and struggling to digest it for meaningful insight.


To combat the existing silo approach and to alleviate problems with the growing amount and diversity of data, insurers are undertaking enterprisewide data management projects. One organization that successfully implemented an enterprise data warehouse is Zenith Insurance. Due to a period of rapid growth, the management team was relying on reports and information pulled from numerous departments and disparate systems. The manual process, inconsistency in the data, and the time and resources required to collate the reports were proving inefficient. To resolve this problem, Zenith rolled out a coherent data-management strategy and brought all the company’s data into a SAS Enterprise Data Warehouse. Read more about the Zenith Insurance success story.

The foundation of a successful analytics operation is quality data, and superior data management. There are many examples of where data defects and inaccessibility to data can result in increased operational costs, potential customer dissatisfaction and even missed revenue opportunities. Clearly there is a business case for creating a single, unified environment for integrating, sharing and centrally managing data for business analytics.

To learn more, download the white paper “Data is King”.

Unleashing the full power of business analytics should be on the short list for every insurer. According to a recent survey, 63% of insurers are planning to increase their spending on analytics in the next 12 months. The path to maximizing the investment in business analytics is data. Data management and data quality are no longer optional components of an analytical environment; they are essential.

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.
