The final countdown… and beyond

It’s rather appropriate that the rock band Europe recorded the hit “The Final Countdown”, because today, September 22nd, marks 100 days until the much-anticipated (and delayed) European insurance legislation Solvency II comes into effect on January 1st, 2016.

Designed to introduce harmonized, EU-wide insurance regulation, Solvency II demands a more comprehensive approach to risk management. To not only ensure Solvency II compliance, but also to help insurance companies anticipate the regulatory and risk changes ahead and deal with them efficiently and proactively, these essential steps are recommended:

Step 1 – Data Management
Data management – or, more precisely, managing the quality and consistency of data – is fundamental to Solvency II compliance. A good data management strategy is a prerequisite for meeting the regulations, but an excellent one does more than ensure compliance: it can also increase risk awareness and improve business decision-making processes throughout the organization.

Step 2 – Risk Calculations
Insurers must develop a central risk calculation “engine” that analyzes risks and calculates capital requirements in line with both Solvency II and company strategy. It must take into account all quantifiable risks an insurer may encounter, including underwriting risk, market risk, credit risk, liquidity risk and operational risk.
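
The aggregation step behind such an engine can be sketched numerically. Below is a minimal illustration in Python of how standalone capital charges for the risk modules listed above might combine into a diversified requirement via a correlation matrix; the figures and correlation values are invented for the example, not the regulatory standard-formula parameters.

```python
import numpy as np

risks = ["underwriting", "market", "credit", "liquidity", "operational"]
standalone = np.array([120.0, 80.0, 40.0, 20.0, 30.0])  # capital per risk, in $M

# Illustrative correlation matrix: 1.0 on the diagonal, 0.25 elsewhere.
corr = np.full((5, 5), 0.25)
np.fill_diagonal(corr, 1.0)

# Diversified requirement: sqrt(s^T C s), lower than the simple sum
# because the risks are assumed not to materialise simultaneously.
diversified = float(np.sqrt(standalone @ corr @ standalone))
print(f"Sum of standalone charges: {standalone.sum():.1f}")
print(f"Diversified requirement:  {diversified:.1f}")
```

The diversification benefit (here roughly 290 vs. 197) is exactly why the engine must be central: it only emerges when all risk modules are calculated in one place.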

Step 3 – ORSA
Own Risk and Solvency Assessment (ORSA) is a relatively new concept: a forward-looking assessment aimed at enhancing insurers’ awareness and understanding of significant risks and their interdependencies. With an integrated environment, users can align business planning processes and capital projects with income statements and balance sheets.

Step 4 – Reporting
Greater transparency, through public disclosure and reporting requirements, is one of the central foundations of Solvency II. Insurers will be expected to produce more reports than ever, and managing separate reporting systems for regulatory and management purposes will not be practical or cost-efficient. Insurers can save time and money by integrating their existing reporting system with their Solvency II implementation to produce consistent, timely and relevant information for all stakeholders.

The SAS approach to Solvency II and risk management is a flexible framework leading companies through the minefield of data, risk models and reporting. Our framework includes the solutions SAS Firmwide Risk for Solvency II and SAS Capital Planning and Management.

The lyrics of Europe’s “The Final Countdown” include the lines “Will things ever be the same again? It’s the final countdown.” There is no doubt, ensuring Solvency II compliance has been a long and difficult journey for many insurance companies. However January 2016 should not represent the finish line; it’s just the beginning of a new risk management playbook and things may never be the same again.

I’m Stuart Rose, Global Insurance Marketing Director at SAS. For further discussions, connect with me on LinkedIn and Twitter.



Flipping the data equation

Big Data has become a technology buzzword. But how is Big Data changing insurance?

Historically, insurance companies have used SMALL data to make BIG decisions. Today, insurers are using BIG data for SMALL decisions.

What does this mean?

Traditionally, insurance companies have aggregated data to group risks into broad categories based on basic factors such as gender, age and marital status. For example, let’s consider auto insurance. Insurers have assumed that all young, single male drivers are reckless and that middle-aged, married, female drivers are more cautious, and priced their products accordingly. While the law of averages might back up this conclusion, we all know that each individual driver is different.

Fortunately things are changing.

With modern technology, insurers are able to gauge risk more precisely, down to a micro level. For example, insurance companies are now using telematics data to assess each driver based on driving behaviors such as hard braking, acceleration, speed and distance traveled. But the use of big data is not exclusive to auto insurance. Many life insurers are beginning to use data from wearable devices as an extra rating variable, while property insurers are using data from sensors (the Internet of Things) and working with vendors like Nest to recognize potential claims and, hopefully, prevent losses.
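
As a rough illustration of how telematics events might roll up into a rating variable, here is a hypothetical Python sketch. The event weights and the per-100-miles scale are invented for the example; real insurers calibrate such scores against claims history.

```python
from dataclasses import dataclass

@dataclass
class Trip:
    miles: float
    hard_brakes: int
    rapid_accels: int
    seconds_over_limit: int

def risk_score(trips):
    """Weighted driving events per 100 miles: higher means riskier.

    Weights (2.0, 1.5, 0.01) are illustrative only.
    """
    miles = sum(t.miles for t in trips)
    if miles == 0:
        return 0.0
    events = sum(
        2.0 * t.hard_brakes + 1.5 * t.rapid_accels + 0.01 * t.seconds_over_limit
        for t in trips
    )
    return round(100 * events / miles, 2)

trips = [Trip(12.0, 1, 0, 30), Trip(8.0, 0, 2, 0)]
print(risk_score(trips))  # prints 26.5
```

A score like this could then feed the pricing model as one rating variable alongside the traditional demographic factors.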

To learn more about how insurance companies are flipping the data equation and moving from experimentation to innovation download the research paper “Big Data in Insurance”.


Is good…good enough?

“Garbage in, garbage out” is more than a catchphrase – it’s the unfortunate reality in many analytics initiatives. For most analytical applications, the biggest problem lies not in the predictive modeling, but in gathering and preparing data for analysis. When the analytics seems to be underperforming, the problem almost invariably lies in the data.

Insurers typically have multiple legacy transactional systems, often one for each line of business. Hence, when an insurer has an individual with profiles across multiple systems, it needs to be able to identify them as the same person and resolve the different data variations into a single entity. In many cases entity resolution can be handled with simple business rules based on data items like date of birth and phone numbers, but this is not always sufficient. Insurers are now using advanced analytical techniques, such as probabilistic matching, to determine the actual statistical likelihood that two entities are the same.
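
A toy sketch of the idea, using simple string similarity from the Python standard library rather than a true probabilistic model; the field weights and the 0.85 threshold are invented for illustration.

```python
from difflib import SequenceMatcher

# Illustrative field weights: a name match counts for more than a phone match.
WEIGHTS = {"name": 0.5, "dob": 0.3, "phone": 0.2}

def field_sim(a, b):
    """String similarity in [0, 1], tolerant of typos and reformatting."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec_a, rec_b):
    return sum(w * field_sim(rec_a[f], rec_b[f]) for f, w in WEIGHTS.items())

a = {"name": "Jonathan Smith", "dob": "1975-04-02", "phone": "919-555-0142"}
b = {"name": "Jon Smith", "dob": "1975-04-02", "phone": "9195550142"}

score = match_score(a, b)
print(f"match score: {score:.2f}")
print("same entity" if score > 0.85 else "needs manual review")
```

Production entity resolution replaces the ad hoc similarity and threshold above with calibrated match probabilities, but the shape of the computation is the same: compare fields, weight them, decide.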

However, when it comes to data quality there is one small anomaly: when the data is being used for fraud analytics. Insurance companies should be careful not to over-cleanse their data. In some cases an error, such as a transposition in a phone number or ID number, may be intentional; that is how fraudsters generate variations of data.

To learn more download the white paper “Fraud Analytics: Data Challenges and Business Opportunities”.

Insurance companies should not underestimate the amount of work required for data management. It’s not uncommon for over 50 percent of the implementation effort to be dedicated to data integration and data quality. However, they should not overthink the problem. While poor data will deliver poor results, perfect data is an unrealistic expectation. In most cases, good data is good enough!


How insurers can take a giant leap forward with big data

Oh, how times have changed during my 20-plus years in the insurance industry. Data wasn’t a word we used much back in the 80s and 90s, unless of course you worked in those arcane and mysterious IT data centres.

Even amidst the computerisation of the insurance industry in the 80s, many policy documents and related files were still paper-based. In those days, the data being captured came from the company’s own staff and was keyed into a mainframe terminal. The most sophisticated insurers received a bordereau file from their insurance brokers, transferred through a kind of electronic data interchange or courier process – and more often than not this was just a floppy disk received in the post.


Today, insurance companies have more data than they can realistically handle. And the sources are numerous, including in-house generated data, aggregators, fraud bureaus, government agencies, telematics boxes and of course social media sites. This much data presents a pretty significant headache for insurance companies: where do you store it all, and how do you access it and understand what it can tell you – quickly? Most important of all, how do you profit from it?

What are the questions around “big data” that insurers today are grappling with?

Storing big data
First of all, let’s define the size of the problem. We know that many insurers and insurance brokers produce anywhere between 300,000 and 700,000 quotes per day for the UK aggregator sites, and that telematics boxes can generate hundreds of data items per second. Deciding how much of this data to store, and where to store it cheaply, has become a serious issue for insurers. Cost is the primary challenge, but any storage also has to meet regulatory compliance requirements, so you can’t just store this data anywhere.

Accessing and analysing big data
Storing all this data effectively and cheaply is only worthwhile if the business users can access it quickly. Only then can they run the analysis and reports that they need. How do you do that when you have terabytes’ worth of data and many millions of records? Traditional data warehouse systems and Excel (the typical analyst’s tool of choice) struggle with these volumes, the speeds required and the growth in unstructured data. As such, insurers need to look at new ways to do this, such as Hadoop.

Visualising big data
For end users, one of the biggest problems with big data is simply being able to see the data fields and structures themselves. A few insurers have invested the time and effort to build good data dictionaries that help users to understand the data and associated metadata. But for everyone else, clearly labelled tables, fields and data values are extremely important if you want to get fast insights.

However, knowing the question is only half the problem. We also need to look at what insurers can do to address these issues. The first thing to know is that the answers are here today and are available for all insurance companies.

The data can now be quickly and cheaply stored in Hadoop and, with tools like SAS Visual Analytics, can be easily accessed and analysed by anyone with a mouse and a web browser. The ability to review and analyse vast quantities of data received from insurance aggregators in real time makes data insight immediately actionable. If you wanted to improve your quote-to-policy conversion, you could reduce your price by a percentage point. Or increase your price if, for example, you’ve reached your quota for young drivers. With technologies like SAS Event Stream Processing, this is a reality.
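
The pricing rules just described can be sketched in a few lines of plain Python. This is an illustration of the logic only, not SAS Event Stream Processing; all rates, quotas and field names are invented for the example.

```python
class QuotaPricer:
    """Price each incoming quote event, tracking a young-driver quota."""

    def __init__(self, base_rate=500.0, young_driver_quota=2):
        self.base_rate = base_rate
        self.quota = young_driver_quota
        self.young_drivers_taken = 0

    def price(self, event):
        p = self.base_rate * event["risk_factor"]
        if event["age"] < 25:
            if self.young_drivers_taken >= self.quota:
                p *= 1.10  # quota reached: load the price by 10%
            else:
                self.young_drivers_taken += 1
        elif event["channel"] == "aggregator":
            p *= 0.99      # shave 1% to improve quote-to-policy conversion
        return round(p, 2)

stream = [
    {"age": 22, "risk_factor": 1.4, "channel": "aggregator"},
    {"age": 45, "risk_factor": 0.9, "channel": "aggregator"},
    {"age": 23, "risk_factor": 1.3, "channel": "direct"},
    {"age": 21, "risk_factor": 1.5, "channel": "aggregator"},
]
pricer = QuotaPricer()
prices = [pricer.price(e) for e in stream]
print(prices)  # [700.0, 445.5, 650.0, 825.0]
```

A real streaming engine applies exactly this kind of stateful rule, but continuously and at hundreds of thousands of events per day.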

In the world of data, times have changed a great deal, but users need not fear the terms “big data” or “Big Data Analytics”. In those immortal words “we have the technology…”

If you want to know more about how to properly exploit big data using Hadoop, follow the eight-point checklist set out in this report.


No more excuses… Analytics IS a game changer

Good news... an analytics survey last year found that 72% of insurance executives agreed that analytics will be the biggest game-changer in the next two years. Bad news... the adoption rate of analytics in insurance has lagged other industries.

To reverse this trend, and to travel down the path to becoming an Analytic Insurer, insurers must consider four things:

Business Strategy

- What data is accessible and obtainable?
- What type of analytic tools are needed and available?
- What type of resources are required? What are the right skill sets?
- What best practices or processes will facilitate the implementation of an analytic strategy?


Organizations new to analytics should start with small proof-of-concept and pilot projects, then build on success to gain support throughout the organization. This will lead to buy-in from senior executives, increased funding and more ambitious projects. A win-win for everyone.

There is no “one size fits all” when it comes to analytic execution. To find out how you can transform your organization into the Analytic Insurer download a copy of the white paper “Building a strategic analytics culture: a guide for the insurance industry”.

To survive and thrive in the competitive insurance industry, carriers must implement analytics.

There are no more excuses.


Help wanted

The insurance industry is heading for a crisis. Depending on which report you read, the industry is facing a shortfall of anywhere from 40,000 to nearly half a million jobs in the next few years. Baby boomers in specialized jobs, like underwriters and claims adjusters, are retiring, and insurers are struggling to replace them.

Insurers face a classic good news/bad news scenario. On the positive side, the industry is financially stable, has great career paths and good benefits, and is almost recession-proof. On the negative side is the perception that insurance is an industry of laggards. The decision makers at the top are old-school, and they struggle to relate to the millennial generation graduating from college. Insurance companies have to be able to connect with these individuals and look cool, because that is the “new normal”.

In an earlier Analytic Insurer blog I wrote about how data scientist is considered the sexiest job of the 21st century. Ironically, the insurance industry is no stranger to the importance of analytics and has historically been successful in attracting mathematicians and statisticians to work as actuaries. But as more and more industries and companies develop data-driven initiatives, the demand for individuals with analytical skills has increased. Therefore, attracting and retaining employees with such specialized capabilities remains a significant hurdle to overcome, and a potential differentiator for those that do.

One insurance company that is reversing this trend is XL Group. To get a better handle on pricing, XL Group created an analytical team, led by SVP of Strategic Analytics Kimberly Holmes, to work on predictive, multivariate analytics using internal and external data. After four years, XL has seen its claims per dollar of premium written fall significantly in those lines of business that were early adopters of predictive models. To read more about this case study and how analytical talent is driving competitive advantage, download the MIT Sloan report "The Talent Dividend".

The question for all insurers concerned with building their analytics capabilities is this: What is your plan for cultivating analytics talent?


Location, Location, Location

If you are buying or selling a house, the realtor will tell you that the value of the property is all about location, location, location. For insurance companies, location is just as important.

For an underwriter assessing the risk on a property, it is essential to consider its location. How close is it to the coastline? Is it on a flood plain? Where is the nearest fire station? It is even important to know the types of businesses in surrounding properties: a clothing manufacturer next to a fireworks store is a greater risk than one whose neighbor sells sports equipment. Traditionally, underwriters have relied on quotation data from the insured or agent, plus information from local adjusters. Now underwriters are supplementing that information with geospatial data.

Of course, it is impossible to predict exactly where a natural disaster will hit, but quickly predicting the exposure is essential when catastrophe strikes. When torrential rain inundated the city of Calgary in June 2013, Aviva Canada used SAS Visual Analytics to quickly determine which customers were most affected. By combining policy data with geocoding software, SAS provided Aviva Canada executives with a dashboard to easily monitor the volume of claims coming in. This resulted in speedier claim fulfillment, lower fraud rates and better customer retention. Read more about the Aviva Canada case study.


There are many other business applications for location analytics within insurance. One example is marketing. Insurance companies have always segmented their customers, but now they can create micro-segmentation down to a hyperlocal level. By overlaying internal customer data with behavioral and demographic information, marketing analysts can identify sales gaps and untapped opportunities.

Another example is using geospatial data to quickly find emerging trends in fraudulent and suspicious activities. This video shows how quickly and effectively an insurance company could reveal anomalies in the data. In this instance, by analyzing the data the insurance company was able to discover that a spike in claims was related to unscrupulous out-of-state contractors targeting elderly policyholders for unnecessary property repairs.

SAS Visual Analytics and location analysis empower underwriters, adjusters and marketing specialists to better assess risk, be more efficient and increase revenue. Are you using them?


Customer experience conundrum

Who is your best customer? The answer to this question can vary dramatically depending on your industry. A retailer’s best customer is someone who comes back to their store over and over again. A gym owner’s best customer could be considered the member who pays their monthly dues on time but never uses the gym. For insurance companies, it could be argued that their best customer is closer to the gym analogy than the retail one: the policyholder who pays their premiums and never makes a claim.

Acquiring new customers is expensive, so keeping the ones you have and increasing wallet share is critical to success. It’s a multifaceted challenge that often comes down to customer experience. The conundrum for insurers is how to improve the customer experience when you have little interaction with your best customers.

A key insight is to know your audience: understand your customers and what drives their behavior. While demographic variables give you a good idea of what a customer looks like, many insurers are finding that behavioral data is a much better predictor. Examples of behavioral, event-driven data would be getting married, moving to a new house or having a baby. But just as important as identifying these event-driven variables and who to target is knowing who NOT to target. Removing non-buying customers from marketing campaigns helps increase acquisition rates and reduce marketing spend, saving millions of dollars per year.
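
In its simplest form, the “who NOT to target” step reduces to filtering a campaign list on a modelled propensity score. A hypothetical Python sketch, with invented customer IDs, scores and cutoff:

```python
# (customer_id, modelled propensity to buy) - values invented for illustration
customers = [
    ("C001", 0.72),
    ("C002", 0.05),
    ("C003", 0.38),
    ("C004", 0.11),
]

CUTOFF = 0.2  # illustrative threshold below which contacting is not worthwhile

target_list = [cid for cid, propensity in customers if propensity >= CUTOFF]
excluded = len(customers) - len(target_list)

print(target_list)
print(f"excluded {excluded} of {len(customers)} customers from the campaign")
```

The analytical work, of course, is in producing a propensity score worth filtering on; the filter itself is trivial once the model exists.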

One organization with a reputation for exceptional customer service is USAA, a financial services company providing insurance to the US military and their families. Doug Mowen, Executive Director of USAA’s Data and Analytics Office, recently shared his insights into using analytics to improve the customer experience. To read these insights, download the conclusion paper “Five Keys to Marketing Excellence”.

Measuring the impact of customer experience efforts is difficult. Most insurance companies struggle to tie customer experience investments to the bottom line. But ultimately insurance companies that embrace analytically driven customer experience management can get an edge on the competition by enabling better customer experiences, which in turn creates value for their organizations.


Time is precious, so are your analytical models

The analytical lifecycle is iterative and interactive in nature. The process is not a one-and-done exercise; insurance companies need to continuously evaluate and manage their growing model portfolios. In the last of four articles on the analytical lifecycle, this blog covers the model management process.

Model management is a critical step in the analytics lifecycle. It involves validating a model, selecting a champion model, and monitoring the performance of your models on a constant basis to decide whether to retire or upgrade current models, or create new ones.

Let’s explore some important best practices for model management:

Model Validation and Comparison
With advances in modeling software and computing performance, it is now feasible to build multiple candidate models for use in analyses. By experimenting with different binning definitions, analysts can efficiently select alternative input variables and modeling techniques, and build multiple models. But before using these models, insurance companies need to evaluate them to determine which ones provide the greatest benefit.


Model Monitoring
To be effective over time, analytical models need to be monitored regularly and adjusted to optimize their performance. Outdated, poorly performing models can be dangerous, as they can generate inaccurate projections that lead to poor business decisions. As a best practice, once a model is in a production environment and being executed at regular intervals, the champion model should be centrally monitored through a variety of reports, as its performance will degrade over time. When performance degradation hits a certain threshold, the model should be replaced with a new model that has been recalibrated or rebuilt.
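
One widely used degradation report is the Population Stability Index (PSI), which compares the score distribution the model saw at build time with what production is seeing now. A minimal Python sketch follows; the bucket proportions are invented, and the 0.25 “replace the model” threshold is a common rule of thumb, not a regulatory figure.

```python
import math

def psi(expected, actual):
    """Population Stability Index between two bucketed score distributions.

    expected/actual: proportions per score bucket (each list sums to 1).
    """
    return sum(
        (a - e) * math.log(a / e)
        for e, a in zip(expected, actual)
        if e > 0 and a > 0  # skip empty buckets to avoid log(0)
    )

baseline = [0.10, 0.20, 0.40, 0.20, 0.10]  # score mix when the model was built
current = [0.05, 0.10, 0.30, 0.30, 0.25]   # score mix observed in production

drift = psi(baseline, current)
print(f"PSI = {drift:.3f}")
if drift > 0.25:
    print("significant drift: recalibrate or rebuild the champion model")
```

A report like this, run at each scoring interval, gives the monitoring threshold mentioned above a concrete, automatable form.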

Model Governance
For insurance companies, greater regulatory scrutiny increases the need for model transparency. However, keeping records of the entire modeling process – from data preparation through to model performance – can be time consuming and unproductive. As a best practice, deploy tools that enable you to automate model governance, complete with documentation to create qualitative notes, summaries and detailed explanations as needed. Having an efficient model governance approval process can also save you time and money, as it can help insurers avoid audits and regulatory fines.

To learn more about the model management process, download the white paper “Manage the analytical lifecycle for continuous innovation”.


Putting predictive analytics to work

Insurance relies on the ability to predict future claims or loss exposure based on historical information and experience. However, insurers face an uncertain future due to spiraling operational costs, escalating regulatory pressures, increasing competition and greater customer expectations.

More than ever, insurance companies need to optimize their business processes. But what does that mean in practice?

In the third part of this series of articles on the analytical lifecycle, this blog focuses on an analytics ecosystem for decision making.

Traditionally, analytics has been seen as a back-office function, implemented in silos by different departments and lines of business. Today it is becoming essential that insurance companies embed analytics into real-time business decisions by deploying predictive models into transactional systems like policy administration, claims management or even call centers.

Whether they are underwriters, claims adjusters or risk managers, insurance personnel make hundreds of thousands of operational decisions each day that impact an insurance company. For insurance companies that rely on analytical models in the decision process, SAS Decision Manager provides a single point of control to integrate sophisticated analytical models with their business rules. The solution takes data, business rules and analytical models and turns them into consistent, automated actions that drive faster, better operational decisions. In addition, these business decisions are now governed, traceable and fully documented, which is essential in the heavily regulated insurance industry.
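
The rules-plus-model pattern can be sketched generically. This is illustrative Python, not the SAS Decision Manager API; the thresholds, field names and the stand-in scoring function are all invented for the example.

```python
def fraud_score(claim):
    """Stand-in for a deployed predictive model's score in [0, 1]."""
    score = 0.1
    if claim["days_since_policy_start"] < 30:
        score += 0.4  # very new policies are higher risk in this toy model
    if claim["amount"] > 3 * claim["typical_amount"]:
        score += 0.3  # unusually large claim for this policy type
    return min(score, 1.0)

def decide(claim):
    """Combine hard business rules with the model score.

    Every branch returns a traceable reason code, so each automated
    decision is documented.
    """
    if claim["amount"] > 50_000:
        return ("refer", "R1: exceeds auto-approval limit")
    score = fraud_score(claim)
    if score >= 0.5:
        return ("investigate", f"R2: fraud score {score:.2f}")
    return ("auto_pay", f"R3: fraud score {score:.2f}")

claim = {"amount": 9_000, "typical_amount": 2_500,
         "days_since_policy_start": 12}
print(decide(claim))
```

The value of a central decision engine is that the rules, the model and the reason codes live in one governed place instead of being re-implemented in each transactional system.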

One organization that recognized that, to stay ahead of the competition, it had to manage its information as a strategic asset is the Australian insurance company IAG. It uses SAS High-Performance Data Mining to analyze its growing database; analytical models that used to take hours to run can now be completed in minutes. Read more about the IAG case study.

Today, insurance companies are becoming data intoxicated as they consume more and more data. But the true value of big data lies not just in having it, but in being able to use it for fast, fact-based decisions that lead to real business value.
