The Role of Data Management in Risk and Compliance

As a Data Management expert, I am increasingly being called upon to talk to risk and compliance teams about their specific and unique data management challenges. It’s no secret that high quality data has always been critical to effective risk management, and SAS’ market-leading Data Management capabilities have long been an integrated component of our comprehensive Risk Management product portfolio. Having said that, the amount of interest, project funding and inquiries around data management for risk has reached new heights in the last twelve months and is driving a lot of our conversations with customers.

It seems that not only are organisations getting serious about data management, but governments and regulators are also getting in on the act, enforcing good data management practices in order to promote the stability of the global financial system and to avoid future crises.

As a customer of these financial institutions, I am happy knowing that these regulations will make these organisations more robust and resilient in the event of future crises by instilling strong governance and best practices around how data is used and managed.

On the other hand, as a technology and solution provider to these financial institutions, I can sympathise with their pain and trepidation as they prepare and modernise their infrastructure to support their day-to-day operations while also complying with these new regulations.

Globally, regulatory frameworks such as BCBS 239 are putting the focus and attention squarely on how quality data needs to be managed and used in support of key risk aggregation and reporting.

Locally in Australia, APRA's CPG 235 provides principles-based guidance outlining the roles, internal processes and data architectures needed to build a robust data risk management environment and to manage data risk effectively.

Now I must say, as a long-time data management professional, this latest development is extremely exciting to me and long overdue. Speaking to some of our customers in risk and compliance departments, however, I find that the same enthusiasm is definitely not shared by those charged with implementing these new processes and capabilities.

Whilst the overall level of effort involved in terms of process, people and technology should not be underestimated in these compliance-related projects, there are things organisations can do to accelerate their efforts and get ahead of the regulators. One piece of good news is that a large portion of the compliance-related data management requirements map well to traditional data governance capabilities. Most traditional data governance projects have focused on the following key deliverables:

•      Common business definitions

•      Monitoring of key data quality dimensions (a minimal sketch of this follows the list)

•      Data lineage reporting and auditing
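
To make the data quality monitoring deliverable concrete, here is a minimal sketch of what checking a couple of common quality dimensions (completeness and validity) can look like. It is a generic pandas illustration, not SAS code, and the column names and the email validity rule are assumptions made purely for the example.

```python
import pandas as pd

# Hypothetical customer extract; the column names are illustrative only.
customers = pd.DataFrame({
    "customer_id": [101, 102, 103, 104],
    "state":       ["NSW", "N.S.W.", None, "VIC"],
    "email":       ["a@example.com", "bad-address", "c@example.com", None],
})

# Completeness: share of non-missing values per column.
completeness = customers.notna().mean()

# Validity: share of emails matching a simple pattern (the rule is an assumption).
valid_email = customers["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)

print("Completeness by column:\n", completeness)
print("Email validity rate:", valid_email.mean())
```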

These are also the very items that regulators are asking organisations to deliver today. SAS’ mature and proven data governance capabilities have been helping organisations with data governance projects and initiatives over the years and are now helping financial institutions tackle risk and compliance related data management requirements quickly and cost effectively.

Incidentally, our strong data governance capabilities, along with our market-leading data quality capabilities, were cited as the main reasons SAS was selected as a category leader in Chartis Research’s first Data Management and Business Intelligence for Risk report.

The combination of our risk expertise and proven data management capabilities means we are in a prime position to help our customers with these emerging data management challenges. Check out the following white papers to get a better understanding of how SAS can help you on this journey.

•      BCBS 239: Meeting Regulatory Obligations While Optimizing Cost Reductions

•      Risk and Compliance in Banking: Data Management Best Practices

•      Best Practices in Enterprise Data Governance

 


Treat Yourself with Text

Did you check your email and/or favourite social media site before leaving for work this morning? Did you do it before getting out of bed? Don’t worry, you’re not alone. I do it every morning as a little treat while I “wake up”, and whether I realise it or not, it sometimes sets the tone for the rest of the day.

The other day I was looking at my Facebook news feed and a couple of things drew my attention. One of them was an abridged transcript of a Conan interview with George R. R. Martin, the writer of the A Song of Ice and Fire books on which the TV series Game of Thrones is based. Because I don’t have much time in the mornings, if the article had been much longer I would probably have stopped partway through. If I had, I would have missed this quote right at the end: “I hate some of these modern systems where you type a lowercase letter and it becomes a capital. I don’t want a capital. If I’d wanted a capital, I would’ve typed a capital. I know how to work the shift key.” This put a smile on my face and put me in a good mood for the rest of the day.

This made me think.

What if I had woken up just five minutes too late to have read the whole thing, or the article at all? How many other interesting things could I have missed because I didn’t have time or had a “squirrel” moment? And as everybody with a social media account knows, there is far too much out there to delve into everything.

With all that data, how can I be sure that I’m exposing myself to the most interesting information? Maybe it’s just a matter of looking at how many people have viewed something. But then I like to think I’m unique so that doesn’t really work. Maybe I can only look at things my closest friends recommend. But that’s more about being a good friend than being interested. Sorry friends.

Let’s take this situation to your organisation. How do you know what information is relevant to your business? There are a myriad of techniques to analyse the structured data that the data warehouse folks have invested a large amount of time designing and the business folks have spent vetting. But how about the unstructured data – the text-based data like survey responses, call centre feedback and social media posts? This data accounts for up to 90% of all information available for analysis.

How can we make use of this text-based data? You could have people read through all the documents and give their opinions on themes and trends, but there is inherent bias and unreliability because people have different backgrounds and perspectives, and it’s unlikely that one person could read everything in a timely manner. On top of all this, what we need is more than just a word search. It’s more than word counts or word clouds. It’s more than just discovering topics. What we really need is to attach a measure of prediction to words, phrases and topics, so that we can, for example:

  • Escalate an incoming customer service call to the client relations team because the caller has used 3 key “future churn” phrases in the right order.
  • Redesign a product because very negative feedback always contain words that are classified under “aesthetic features”.
  • Discover the true customer service differentiators which give a positive Net Promoter Score (NPS).
  • Identify the areas where law enforcement should increase its presence to protect the public from activities being promoted on social media that are likely to have dangerous outcomes.
  • In a B2B scenario, understand and communicate the gaps in knowledge of the client organisation based on the volume, topics and severity of support calls they put through to the service organisation.
  • Determine the root cause of employee concerns and the best methods to manage them.

We need Text Analytics to structure unstructured, text-based data in an objective way.

There are two sides to Text Analytics:

  • A statistical approach where text data is converted into numerical information for analysis, and words and phrases are grouped by their common pattern across documents. The converted data and groupings can then be used alone or combined with structured data in a statistical model to predict outcomes.
  • A linguistic approach, or Natural Language Processing (NLP), where logical text-based rules are created to classify documents and measure their polarity (e.g. of sentiment).
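
As a rough illustration of the statistical side (with a hint of the rule-based side), here is a minimal Python sketch using scikit-learn. It is not SAS Text Miner; the tiny training set, the TF-IDF features and the single polarity rule are all assumptions made for the example.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny, made-up call-centre snippets with a churn label (illustrative only).
texts = [
    "I want to close my account, this fee is ridiculous",
    "Thanks, the new app works great",
    "Third outage this month, I'm switching banks",
    "Very happy with the service today",
]
churned = [1, 0, 1, 0]

# Statistical approach: convert text into numeric features, then fit a predictive model.
vectoriser = TfidfVectorizer(ngram_range=(1, 2))
X = vectoriser.fit_transform(texts)
model = LogisticRegression().fit(X, churned)

new_text = ["the app is great but the fees are ridiculous"]
print("Predicted churn probability:",
      model.predict_proba(vectoriser.transform(new_text))[0, 1])

# Linguistic approach (sketch): a hand-written rule assigning polarity to a document.
negative_terms = {"ridiculous", "outage", "switching"}
polarity = -1 if negative_terms & set(new_text[0].lower().split()) else 1
print("Rule-based polarity:", polarity)
```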

Both sides are equally important because, despite how far computing algorithms have advanced, there is still a lot of nuance in the way people speak, such as sarcasm and colloquialisms. By using techniques from both sides in an integrated environment, we can create a whole-brained analysis that includes clustering of speech behaviour, prediction of speech and other behaviour against topics, and prediction of the severity of sentiment towards a product, person or organisation.

One organisation which has been using Text Analytics from SAS for a number of years to provide pro-active services to their clients is the Hong Kong Efficiency Unit. This organisation is the central point of contact for handling public inquiries and complaints on behalf of many government departments.  With this comes the responsibility of managing 2.65 million calls and 98,000 e-mails a year.

"Having received so many calls and e-mails, we gather substantial volumes of data. The next step is to make sense of the data. Now, with SAS®, we can obtain deep insights through uncovering the hidden relationship between words and sentences of complaints information, spot emerging trends and public concerns, and produce high-quality, visual interactive intelligence about complaints for the departments we serve." Efficiency Unit's Assistant Director, W. F. Yuk.

Whatever size your organisation is, and whatever purpose it has, there are many sources of text-based data that are readily available and may already sit alongside the structured data in your data warehouse, in Hadoop or on a network drive. By using this data to supplement the structured data many people are already analysing, we can better pinpoint not only what is driving behaviour but how we can better serve our customers and our employees. Wouldn’t it be great to know what is relevant to people in and out of your organisation without having to manually read thousands of documents?

Applying Text Analytics to your documents is treating yourself with Text because amongst the masses of words you will find nuggets which will brighten up your (and your company’s) day.

If you’re interested in treating yourself with Text and are in the Sydney region this July, sign up for the SAS Business Knowledge Series course Text Analytics and Sentiment Mining Using SAS taught by renowned expert in the field Dr Goutam Chakraborty from Oklahoma State University. Dr Chakraborty will also be speaking of his experiences in the field at the next Institute of Analytics Professionals of Australia (IAPA) NSW Chapter meeting.

Happy Texting!


Are you riding the analytics curve?

Ever looked into what being “behind the curve” means? I’ll save you the time – it is being less advanced or getting somewhere slower than other people. Remember grade school? No-one likes being behind the curve. That’s where people tend to get crunched. So how do you know if you’re ahead or behind the curve? Measurement. Thanks to the poll we ran in our last newsletter, you’ll be interested to know that most of your peers in Australia and New Zealand stated that they were either “behind the curve or comparable”.

Coincidentally, around the time we ran our poll question, MIT Sloan Management Review and SAS surveyed 2,000 executives globally, resulting in a report titled “The Analytics Mandate”. Results showed that, “organisations have to do more to stay ahead of the curve,” as Pamela Prentice, SAS chief research officer, put it. “Nine in ten believe their organisations need to step up their use of analytics. This is true even among those who report having a competitive advantage.”

For us in the antipodes, there’s good news and bad news. The good news is that you are not alone! Don’t feel bad - if you think your organisation is behind the curve with your analytics mandate, so do your global peers. Now’s the right time to shine and open that window of opportunity to get ahead of the curve!

Unfortunately, there’s bad news too. Isn’t there always?! Being behind the curve might not seem like a big deal. Your organisation is still profitable, right? Assuming you are and that you’re not going anywhere, have you heard the saying, “if you snooze you lose”? That’s the real issue. It’s not about what you’re not doing. It’s about what your competitors are doing. How will your business survive if this is not part of your future strategic goal?

I refer back to the Analytics Mandate research, which states that an analytics culture is the most significant driving factor in achieving a competitive advantage from data and analytics.

Changing the culture of an organisation does not happen overnight. In fact, it is probably the hardest transition to make as you try to juggle people, process, technology and the big O - Opinion! But once it is achieved, the business outcomes and competitive advantage come rewardingly fast. Here’s proof from Australian Natural Care, a medium-sized online retailer of vitamins and supplements: “We have taken the results we’ve been getting to board level to report on the wins and everyone is pleased. We went from having no analytic capabilities to building analytical models within four weeks of implementation. This has had a direct impact on our entire business.”

How does that happen? To drive analytical aspiration and encourage an analytics culture, the Analytics Mandate study recommends you answer the following:

1. Is my organisation open to new ideas that challenge current practice?
2. Does my organisation view data as a core asset?
3. Is senior management driving the organisation to become more data driven and analytical?
4. Is my organisation using analytical insights to guide strategy?
5. Are we willing to let analytics help change the way we do business?

How did you go? Did you score five out of five? If so, then you’re on your way to analytical competitive genius!

If you need some guidance, then read Evan Stubbs’ book, Delivering Business Analytics: Practical Guidelines for Best Practice, to get some ideas on mandating the Analytics Mandate.

And if you were analytically enthusiastic enough to read this complete blog and are among the first 25 readers to complete this Business Analytics Assessment, I will send you a signed copy of his book. Natalie.Mendes@sas.com

See you on the other side of the curve!


Capital Management - Driving Business Evolution with Enterprise Risk Management

For many years enterprise risk management (ERM) was more of a concept than a reality, a visionary strategy that spanned every department within an organisation at every level. In recent years, however, ERM programmes have indisputably been implemented by organisations of all sizes, with treasury central to this process. This article asks what constitutes an effective ERM programme and how a company goes about creating a solid risk framework, with treasury now playing a vital role.

ERM is no longer aligned to an academic-based vision with only limited application in the real world. Investment in ERM infrastructure and analytics is now viewed as integral to the management decision process, especially in financial services firms. A successful bank must conduct stress testing and create a capital planning approach, integrating risk and financial data. While regulation plays a part in this change, it is only a fraction of what banks must do to remain profitable.

An ERM vision must have an IT infrastructure and risk culture that support cross-functional collaboration, such as the partnership of risk and finance. Collaboration requires understanding all portfolio risks and how to manage those risks to meet the long-term goals of the bank. While a chief financial officer (CFO) is responsible for balance sheet growth, the chief risk officer (CRO) is in charge of risk control objectives, such as minimising balance sheet risk and calibrating this with the bank’s risk appetite.

The CRO must also keep balance sheet exposure in check at all times. For the treasury function, Basel III regulations demand comprehensive assessments that affect businesses. For example, banks may examine each instrument in terms of its contribution to the balance sheet, leverage, liquidity, profit and loss (P&L), regulatory and economic capital, and risk-adjusted profitability. Getting a holistic view of different portfolios from one place is both a big data and a systems challenge for banks.
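
As a simple illustration of examining instruments on a risk-adjusted basis, here is a hedged Python sketch. The figures are invented, and the RAROC-style formula (net income over economic capital) is a common textbook simplification, not SAS’ or any bank’s actual methodology.

```python
# Illustrative instrument-level view: risk-adjusted return on capital (RAROC-style).
instruments = [
    # name, revenue, funding_cost, expected_loss, economic_capital (all invented)
    ("Corporate loan A", 1.20, 0.40, 0.25, 5.0),
    ("Mortgage pool B",  0.90, 0.35, 0.05, 2.5),
    ("Trading book C",   2.10, 0.80, 0.60, 9.0),
]

for name, revenue, funding_cost, expected_loss, eco_capital in instruments:
    net_income = revenue - funding_cost - expected_loss
    raroc = net_income / eco_capital
    print(f"{name}: net income {net_income:.2f}, RAROC {raroc:.1%}")
```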

Re-examining Capital Planning

New approaches to capital planning have elevated the role of treasury beyond solvency and into strategic planning, helping banks evaluate long-term capital under forward-looking projections as well as stressed scenarios. Regulation has focused attention back on capital efficiency and allocating increasingly scarce capital to businesses that outperform on a risk-adjusted basis.

Data management and quality remain key challenges to the delivery of these new approaches. Specifically for asset liability management (ALM) and liquidity risk, data is necessary to maintain a comprehensive balance sheet. For a large bank, the balance sheet can comprise trillions of dollars in assets, and the data requirement for asset liability management is huge, since analysis must be performed on all assets and liabilities.

The data challenge is obvious when integrating information from the risk side (for regulatory and economic capital) with the finance side (for financial statements, general ledger accounts, liquidity, Generally Accepted Accounting Principles and International Financial Reporting Standards (GAAP/IFRS) compliance).

Historically, these departments operated independently and had no business incentive to work together. But imperative regulations - such as balance sheet stress testing, IAS 39, IFRS 9 and Dodd-Frank - are now motivating risk and finance groups to work together for compliance. An integrated view of risk and finance data conveys far more information than a disjointed view. Furthermore, banks can use that information for effective future strategic planning - the main goal when optimizing capital planning and management.

An ERM architecture must have the ability to process each data item required. The best practice is to establish automatic metadata management that tracks the lineage of all models, stress scenarios, and data outputs.

The goal for ERM is to calculate analyses only once and use the results in various applications. The planning systems evolve from pure recipients of data into new suppliers of data. Within a ‘data universe’, these systems must provide information of overriding interest (e.g., cash flow plans) as a central service. When implemented consistently, the result is an integrated bank control architecture with a single point of truth.

Stepping Up Stress Testing

Another powerful ERM tool is the increasing use of stress testing to help business executives gain enterprise and business unit level oversight.

Financial institutions (FIs) are implementing enterprise stress testing systems that can be used for various analyses and reporting. Stress testing a bank’s portfolio requires several dimensions. Macroeconomic factors must be linked to the portfolio’s loan performance (credit risk), liquidity profile and market risk. This is a difficult modeling exercise that banks were not previously required to do.

Since the development of Basel II, stress testing has been hotly debated and, in many instances, inadequately implemented. The topic is relevant both from a regulatory perspective and for internal risk assessment in the context of the internal capital adequacy assessment process (ICAAP). The Capital Requirements Directive (CRD IV) consultations, as the European implementation of Basel III, have already addressed the existing shortcomings in both stress testing and back testing.

Stress tests represent only one form of simulation that highlights the effects of extremely negative developments. However, the simulation capability is important for applications that go far beyond this, especially for opportunity identification such as economic development planning and control. A simulation can, and should, make it possible to evaluate the effects of decisions.

In stress tests, simulating different environmental conditions (scenarios) requires the financial position of the bank to be investigated. Here, negative economic developments are simulated by means of risk parameters to assess their potential impact on the portfolio. The goal is early detection of serious changes in the risk structure to estimate bank stability, especially in periods of crisis. For planning timely countermeasures, both the executive board and the supervisory board need to know and understand the risks.
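
To show the mechanics in miniature, here is an illustrative Python sketch that stresses credit risk parameters under a few scenarios and recomputes expected loss (EL = PD × LGD × EAD). The portfolio, the scenarios and the multipliers are all assumptions; real stress testing links them to macroeconomic models.

```python
# Tiny illustrative portfolio: probability of default, loss given default, exposure.
portfolio = [
    {"segment": "Mortgages", "pd": 0.010, "lgd": 0.20, "ead": 500.0},
    {"segment": "SME loans", "pd": 0.030, "lgd": 0.45, "ead": 200.0},
    {"segment": "Cards",     "pd": 0.050, "lgd": 0.75, "ead": 80.0},
]

# Scenario definitions expressed as PD multipliers (invented for illustration).
scenarios = {"Baseline": 1.0, "Mild downturn": 1.5, "Severe recession": 3.0}

for name, pd_multiplier in scenarios.items():
    expected_loss = sum(min(1.0, p["pd"] * pd_multiplier) * p["lgd"] * p["ead"]
                        for p in portfolio)
    print(f"{name}: portfolio expected loss = {expected_loss:.1f}")
```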

To use stress tests continuously and regularly as a management tool, a process must transparently document the assumptions, models, and the results. The risk process must also be established across departments so that the entire bank’s risk can be determined, which influences the calculation of the risk-bearing capacity.

Benefits of ERM

ERM implementations attempt to develop an integrated method of approaching risk. This means integration of the different approaches to risks and solutions, as well as approaches to identify and manage risk effectively. ERM can provide firms with the means to decide how much uncertainty (risk) to accept while achieving growth, with new stress testing and capital planning capabilities playing an increasingly important role in senior management decision making.

The key to meeting ERM objectives is to develop a risk infrastructure to obtain and then maintain a holistic view of risk while dealing with ever-evolving data, complexity in measurement and timely management reporting. However, the reality for many firms is that following such an approach depends heavily on how the IT infrastructure has grown.

How has your approach to capital planning changed in light of today’s regulatory and financial environment?

Learn more about leading practices in capital planning and risk-adjusted performance management in the SAS whitepaper Capital Management for Banking: Dawn of a New Era and by visiting us at SAS Capital Planning & Management.

- by Larry Roadcap and Dave Rogers, SAS


Analytics in Government: The new era

“Evidence-based policy development” is a term which has been in use for a number of years. However, in most cases it remains more an aspiration than a practical reality. The consciousness of using more data to underpin policy has risen, but the change in practice has been modest. The question is whether agencies are using all available internal and external data at their disposal, exploring that data to gain valuable insights, using predictive modelling to determine how different options play out over the life of the program and balancing budgets and optimal outcomes.

Typically, the basis of policy proposals consists of historical data, analytical models from other agencies (such as treasuries), accepted wisdom and professional opinion. While there may be valid reasons for ministers to sometimes take a path different from what the data might suggest, it is even more important in these circumstances to model the likely outcomes in order to protect the government from unintended consequences.

New generations of policy developers are likely to be far more receptive to “scientific” approaches to policy development. However, the imperative is to provide them with access to all relevant data and tools which enable them to explore it. Another aspect to consider is that the greater quantity of accessible data (often enabled by government ‘open data’ policies) potentially exposes governments to challenges by opposition parties, individuals, academics and advocacy groups who, based on their own analysis of the available data, may question the declared expected outcomes of policies and programs. It is therefore important for ministers and their agencies to be able to demonstrate the strong factual foundation of their policies and programs. Importantly, the inflection point is now, since all of this is already possible – the question is whether the will to fundamentally change the current approach to policy development exists.

Pro-active Compliance Management signals a very significant shift in the approach to fraud, waste, error and debt management in the public sector. The current post-fact management of these issues – where money is allowed to “leak out of the system” through a “pay now – check later” approach and where agencies then spend additional funds and resources to recover some of this money – is shifting to a proactive, preventative approach. Commercial organisations, like banks, cannot afford the high costs of the “double whammy” of the post-fact approach, and there is a growing recognition that governments can’t either.

The drive from governments to return budgets to surplus has heightened this consciousness. The move to on-line systems has meant that analytics can be applied to transactions and inputs to verify their validity in real time. This means that fraud and error can be trapped at the point of interaction and be effectively prevented – very much the same as financial institutions already do at the point of transactions.

An important point to be made in this context is that analytical approaches to these areas are vastly superior to business rules-based approaches. The latter creates rules based on the repetition over time of an identified activity; however, it is increasingly apparent that perpetrators are just as quick to change their behaviour to circumvent detection – and meanwhile considerable financial loss or debt has already occurred. By contrast, analytical approaches focus on the detection of anomalies which suggest error or malfeasance. Even where the activity is not a known or repeated one, the anomaly flags something that is outside normal or expected behaviour. In the sphere of social services, a proactive approach translates into better social outcomes. For instance, if the vast majority of welfare recipients are entitled because of an economic need, then the recovery of over-payments exacerbates their economic hardship. Better to have prevented an overpayment in the first place.
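
The contrast between rule-based and anomaly-based detection can be sketched in a few lines of Python. This is a generic scikit-learn illustration with synthetic payment data, not the approach of any particular agency or SAS product.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic payment records: [claim amount, claims lodged this month].
normal = rng.normal(loc=[300, 2], scale=[80, 1], size=(500, 2))
suspect = np.array([[295, 14], [2500, 2]])      # unusual frequency / unusual amount
payments = np.vstack([normal, suspect])

# Rule-based approach: only catches what the rule was written for.
rule_flags = payments[:, 0] > 1000               # "flag claims over $1,000"

# Anomaly-based approach: flags anything far from normal behaviour,
# which can pick up the high-frequency case the rule misses.
model = IsolationForest(contamination=0.01, random_state=0)
anomaly_flags = model.fit_predict(payments) == -1

print("Flagged by rule:   ", int(rule_flags.sum()))
print("Flagged as anomaly:", int(anomaly_flags.sum()))
```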

To recover current debt carried by governments – which in the Federal Government alone is in the tens of billions of dollars – analytical approaches allow for the differentiation of treatment strategies to optimise recovery, allowing agencies to recover the most money in the shortest time from those individuals and entities in the best position to pay off their debts.

Citizen-centric Service Delivery represents a fundamental shift away from payment-centric approaches towards service delivery that is built around a holistic understanding of individual citizens. This differentiated approach looks at all the relevant factors in an individual’s life and the identification of the next best action to improve their personal, social and economic wellbeing. This is a laudable vision towards which a number of agencies are already progressing.

It is reasonable to expect that over time this will result in a more engaged and participative relationship between governments and citizens, which will in turn produce better outcomes for both. It recognises that the various functions supporting citizens in need exist in silos – within larger agencies, across governments and across layers of government – and that this does not serve citizens well. Experience to date demonstrates that with a more holistic and cross-functional approach, citizens feel more cared for and are therefore more inclined to take greater ownership of outcomes and begin collaborating with agencies in what becomes a shared endeavour.

A benefit to governments is that they are able to reduce their service delivery costs while ensuring the citizen receives the most beneficial services. From a relevant technology perspective, it is interesting to observe that many of the analytical tools developed for the commercial sector work equally well in government services. For example, a customer intelligence capability designed to use all the known data about an individual consumer to determine the next best product to sell to them is reversed to determine the next best action to assist a citizen. The challenge for major agencies is that their systems are built around eligibility assessment and payments and have very little, if any, customer intelligence, case management or service planning capability. The need for a solid eligibility assessment and payment system remains, but these additional capabilities are required if agencies are to effectively transition to citizen-centric service delivery.


The Australian Analytics 2014 skills and salary survey

I recently attended an event where a speaker from LinkedIn presented. He mentioned an interesting trend – the demand for analytical professionals has increased in the last five years. We all know analytics is the hottest field around. Better than mining, even - as of last year, the median analyst in Australia was earning over twice the median Australian salary. Although, some evidence suggests that salary growth may be cooling somewhat overseas.

Big data’s hot, then it’s not. Machine learning’s cool, then it’s old school. LASR, Spark, in-memory, and good old data mining; what’s a person to focus on?! Given the rate that everything’s changing out there, have you ever wondered how the skills you choose affect your salary?

Don’t guess, find out. The Institute of Analytics Professionals of Australia is running their annual skills and salary survey for the second time. As with last year, it covers:
• The industries that people work in
• The tools people have used and are using
• The challenges people are facing

New to this year, it also covers:
• The degree to which analytics is being centralised or federated in organisations
• Whether people are interested in formal or informal education on business analytics

With the recent launch of a Masters of Data Science at the University of South Australia, the Masters of Business Analytics at Deakin University, the Masters of Analytics at RMIT and the Masters of Analytics by Research at UTS, there’s clearly a lot of pent-up demand for learning.

Respondents have the option of receiving the report once it’s completed. And, given what was in last year’s report, it’s sure to be interesting reading.

Regardless of what you use or where your focus lies, as long as you’re an analyst working in Australia, make sure to complete it before it closes. Tell your friends, tell your enemies, tell anyone and everyone you know who works in the field. Tweet it, blog it, talk about it over the water cooler - as the most comprehensive survey of its kind in Australia, it represents a unique opportunity to really understand what’s hot, what’s not, and what’s trending.

Complete the survey

SAS is an active sponsor of IAPA and supports the broad development of skills across the industry.


Don't Let Bad Quality Data Kill Your Data Visualisation Project

Next generation business intelligence and visualisation tools, such as SAS Visual Analytics, are revolutionising insight discovery by offering a truly self-service platform powered by sophisticated visualisations and embedded analytics. It has never been easier to get hold of vast amounts of data, visualise that data, uncover valuable insights and make important business decisions, all in a single day’s work.

On the flip side, the speed and ease of getting access to data, and then uncovering and delivering insights via powerful charts and graphs, have also exacerbated the issue of data quality. It is all well and good when the data being used by analysts is clean and pristine. More often than not, though, when the data being visualised is of poor quality, the output and results can be telling and dramatic, but in a bad way.

Let me give you an example from a recent customer discussion to illustrate the point (I have, of course, synthesised the data here to protect the innocent!).

Our business analyst at ACME bank has been tasked with analysing customer deposits to identify geographically oriented patterns, as well as identifying the top 20 customers in terms of total deposit amount. These are simple but classic questions that are perfectly suited to a data visualisation tool such as SAS Visual Analytics.

We will start with a simple cross-tab visualisation to display the aggregated deposit amount across the different Australian states:

[Image 1: cross-tab of aggregated deposit amounts by state, showing non-standardised state values]

Oops, the problem around non-standardised state values means that this simple crosstab view is basically unusable. The fact that New South Wales (a state in Australia) is represented nine different ways in our STATE field presents a major problem whenever the state field is used for the purpose of aggregating a measure.

In addition, the fact that the source data contains only a full address field (FULL_ADDR) means that we are also unable to build the next level of geographical aggregation using city, as the city is embedded in the FULL_ADDR free-form text field.

[Image 2: customer records showing the FULL_ADDR free-form address field]

It would be ideal if FULL_ADDR were parsed out so that street number, street name and city were all individual, standardised fields that could be used as additional fields in a visualisation.
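
SAS Data Management handles this with pre-built standardisation and parsing definitions; purely to illustrate the idea, here is a minimal Python sketch using a lookup table and a regular expression. The state variants, the address pattern and the field names are assumptions made for the example.

```python
import re
import pandas as pd

customers = pd.DataFrame({
    "STATE": ["New South Wales", "N.S.W.", "nsw", "Victoria"],
    "FULL_ADDR": ["12 George St, Sydney", "7 Pitt Street, Parramatta",
                  "3 Crown Ln, Wollongong", "45 Collins St, Melbourne"],
})

# Standardise the non-standard state values via a simple lookup (illustrative only).
state_map = {"new south wales": "NSW", "n.s.w.": "NSW", "nsw": "NSW", "victoria": "VIC"}
customers["DQ_State_Std"] = customers["STATE"].str.lower().map(state_map)

# Parse the free-form address into street number, street name and city.
pattern = re.compile(r"^\s*(?P<number>\d+)\s+(?P<street>.+?),\s*(?P<city>.+)$")
parsed = customers["FULL_ADDR"].str.extract(pattern)
customers = customers.join(parsed.rename(columns={"number": "DQ_Street_No",
                                                  "street": "DQ_Street",
                                                  "city": "DQ_City_Std"}))
print(customers)
```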

How about our top 20 customers list table?

[Image 3: top 20 customers list table sorted by deposit amount]

Whilst a list table sorted by deposit amount should easily give us what we need, a closer inspection reveals troubling signs that we have duplicated customers (with names and addresses typed slightly differently) in our customer table. This is a major problem that will prevent us from building a true top 20 customers list table unless we can confidently match up all the duplicated customers and work out what their true total deposits are with the bank.
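
Matching duplicated customers is a fuzzy-matching problem. As a toy illustration of the idea behind a cluster ID such as DQ_CL_ID (not how the SAS matching nodes work internally), here is a sketch using Python’s standard-library difflib to group records whose name-and-address strings are highly similar.

```python
from difflib import SequenceMatcher

records = [
    "Alan Davies, 12 George St, Sydney NSW",
    "A. Davies, 12 George Street, Sydney N.S.W.",
    "Philip McBride, 45 Collins St, Melbourne VIC",
    "Phillip McBride, 45 Collins Street, Melbourne VIC",
]

def similar(a, b, threshold=0.8):
    """Crude similarity test on lowercased name-and-address strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Assign a cluster ID: reuse an earlier record's ID if the strings are similar enough.
cluster_ids = []
for i, rec in enumerate(records):
    match = next((cluster_ids[j] for j in range(i) if similar(rec, records[j])), None)
    cluster_ids.append(match if match is not None else len(set(cluster_ids)))

for rec, cid in zip(records, cluster_ids):
    print(cid, rec)
```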

All in all, you probably don’t want to share these visualisations with key executives using the dataset you were given by IT. The scariest thing is that these are the data quality issues that are very obvious to the analyst. Without a thorough data profiling process, other surprises may just be around the corner.

One of two things typically happens from here on. Organisations might find it too difficult and give up on the dataset, the report or the data visualisation tool altogether. The second option typically involves investing significant cost and effort in hiring an army of programmers and data analysts to code their way out of their data quality problems – something that is often done without a detailed understanding of the true cost involved in building a scalable and maintainable data quality process.

There is, however, a third and better way. In contrast to other niche visualisation vendors, SAS has always believed in the importance of high quality data in analytics and data visualisation. SAS offers mature and integrated Data Quality solutions within its comprehensive Data Management portfolio that can automate data cleansing routines, minimise the costs involved in delivering quality data, and ultimately unleash the true power of visualised data.


Whilst incredibly powerful and flexible, our Data Quality solution is also extremely easy for business users to pick up, requiring minimal training and no detailed knowledge of data cleansing techniques. Without the need to code or program, powerful data cleansing routines can be built and deployed in minutes.

I built a simple data quality process using our solution to illustrate how easy it is to identify and resolve data quality issues described in this example.

Here is the basic data quality routine I built using SAS Data Management Studio. It essentially involves a series of data quality nodes that resolve each of the data quality issues we identified above via pre-built data quality rules and a simple drag-and-drop user interface.

[Image 4: the data quality job in SAS Data Management Studio, showing the chain of data quality nodes]

For example, here is the configuration for the "Address standardisation" data quality node. All I had to do was define which locale to use (English Australia in this case), which input fields I want to standardise (STATE, DQ_City), which data quality definitions to use (City - City/State and City) and what the output fields should be called (DQ_State_Std and DQ_City_Std). The other nodes take a similar approach to automatically parse the full address field, and match similar customers using their name and address to create a new cluster ID field called DQ_CL_ID (we’ll get to this in a minute).

[Image 5: configuration of the "Address standardisation" data quality node]

I then loaded the newly cleansed data into SAS Visual Analytics to try to tackle the questions I was tasked to answer in the first place.

The cross-tab now looks much better, and I now know for sure that the best performing state from a deposit amount point of view is New South Wales (now standardised as NSW), followed by Victoria and Queensland.

[Image 6: cross-tab of aggregated deposit amounts by standardised state]

As a bonus for getting clean, high quality address data, I am also now able to easily visualise the geo-based measures on a map, down to the city level, since we now have access to the parsed-out, standardised city field! Interestingly, our customers are spread out quite evenly across the state of NSW, something I wasn’t expecting in the first place.

[Image 7: map visualisation of deposit amounts down to the city level]

As for the top 20 customer list table, I can now use the newly created cluster field, DQ_CL_ID, to group similar customers together and add up their total deposits to work out who my top 20 customers really are. As it turns out, a number of our customers have multiple deposit accounts with us and go straight to the top of the list when their various accounts are combined.

[Image 8: top 20 customers list table after grouping duplicate customers by DQ_CL_ID]

I can now clearly see that Mr. Alan Davies is our number one customer with a combined deposit amount of $1,621,768, followed by Mr. Philip McBride, both of whom will get the special treatment they deserve whenever they are targeted for marketing campaigns.
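
The aggregation behind this view is straightforward once every record carries a cluster ID. Here is a pandas sketch of the idea; the field names follow the example above and the figures are invented (chosen so the first cluster sums to the $1,621,768 mentioned).

```python
import pandas as pd

accounts = pd.DataFrame({
    "DQ_CL_ID": [0, 0, 1, 1, 2],
    "customer": ["Alan Davies", "A. Davies", "Philip McBride", "Phillip McBride", "Jane Chen"],
    "deposit":  [900_000, 721_768, 650_000, 500_000, 400_000],
})

# Group duplicate customers by their cluster ID, sum deposits, and rank the result.
top = (accounts.groupby("DQ_CL_ID")
               .agg(customer=("customer", "first"), total_deposit=("deposit", "sum"))
               .sort_values("total_deposit", ascending=False)
               .head(20))
print(top)
```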

All in all, I can now comfortably share my insights and visualisations with business stakeholders in the knowledge that any decisions made are based on sound, high quality data. And I was able to do all this with minimal support and all in a single day’s work!

Is your poor quality data holding you back in data visualisation projects? Interested in finding out more about SAS Data Quality solutions? Come join us at the Data Quality hands-on workshop and discover how you can easily tame your data and unleash its true potential.


Banking: Using enterprise data for business decisions at the right time

With their extensive volumes of customer and transaction information, banks in Australia and New Zealand are leveraging their data assets to transition to more customer-centric organisations. They are focusing on the best customer experience and positioning themselves as relevant and valued providers of financial services. In doing so, Australian and New Zealand banks are leading the charge to become fact-based decision makers driven by big and small data.

Even though opportunities around customer engagement are plentiful, regulatory change and new prudential standards and guidelines are requiring increased agility to accommodate new data management, credit reporting and risk aggregation obligations.  Increased focus on addressing and responding to fraud and cybercrime in more challenging economic conditions is also top of mind for the industry.

Data on its own is not valuable. Value is only realised when organisations employ genuine and sustainable approaches to analytics and insight to generate business outcomes.  I thought it would be worth considering what SAS has seen in some recent customer use cases.

Customer data – doing good and right things
We all want to do what is good and right. There are two interesting use cases where customer data is being used to derive actionable insight and deliver valued business outcomes for SAS customers and their customers. Westpac’s KnowMe project is using customer data to better understand how to serve and interact with its customers and to deliver offers that are relevant, appropriate and more likely to be accepted, based on preferences and behaviour. Not only are customers rating Westpac 10% higher on Net Promoter Score, but in knowing and understanding their specific needs, Westpac has attained a 37% improvement in new business acceptance in branches and a 60% improvement in call centre interactions. With such impressive early gains and ROI at more than double initial targets, you cannot challenge the age-old saying that knowledge is power! Power to the customer and bank alike for good outcomes.

Commonwealth Bank is leveraging customer data for a considerably different purpose. Understanding customer behaviour from the insights gained from extensive customer transaction data can also be used for good in responding to a need to both detect and prevent unexpected and adverse customer activity due to fraud or cybercrime.

If security is not maintained, trust is the first casualty, and reputational damage and regulatory intervention are likely. Commonwealth Bank has used real-time analytics and SAS’ proprietary hybrid technology for right-time fraud prevention and detection. With rules, predictive models and neural learning brought together, they assess growing transaction volumes in both real time and right time at speeds exceeding 250 transactions per second, with an average response time of 40 milliseconds per transaction assessed. This approach not only optimises coverage but actively assesses unexpected activity using a risk-based model, with higher levels of overall coverage and lower dollar values for fraud.
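
The hybrid idea of combining business rules with a statistical model score can be sketched generically in Python. This is an illustration of the concept only, not SAS’ proprietary hybrid technology or Commonwealth Bank’s implementation; the rule, the toy model and the threshold are all assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic transaction features: [amount, minutes since previous transaction].
X_train = rng.normal(loc=[120, 240], scale=[60, 120], size=(1000, 2))
y_train = (X_train[:, 0] > 200).astype(int)          # toy fraud label for illustration

model = LogisticRegression().fit(X_train, y_train)

def assess(amount, minutes_since_last):
    """Combine a hard rule with a model score into a single real-time decision."""
    rule_hit = amount > 5000 or minutes_since_last < 1      # illustrative rule
    score = model.predict_proba([[amount, minutes_since_last]])[0, 1]
    return "refer" if rule_hit or score > 0.8 else "approve"

print(assess(amount=310, minutes_since_last=0.5))
print(assess(amount=90, minutes_since_last=300))
```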

Against a backdrop of exponential increases in customer transaction activity and revenue, the ROI and value to both customers and CBA – as a trusted and reliable bank with real-time detection and prevention based on actionable insight and scoring – give Commonwealth Bank the knowledge and power to meet expectations to protect depositors and shareholders. Knowing what is normal and what is abnormal proves the value of rich customer information.

Risk-reward remains a key focus for CROs
We all know that pricing risk should be in the DNA of all banks, particularly in today’s environment of subdued demand and lower growth rates for retail lending. Lenders must remain focused on risk-reward trade-offs and on writing business which is priced right and reflects the quality and capacity of borrowers to repay. There are continuing demands from both global and domestic regulators to implement and comply with new capital adequacy standards such as Basel III. This means that the sourcing and sustainable allocation of funding requires increasingly sophisticated and transparent capabilities in advanced analytics to better support and manage credit and financial risks. Optimising available capital and funds for lending purposes and attracting and retaining customer profitability over the life of the banking relationship mean that risk-reward optimisation is the main game for CROs.

Seeing the right story in complex data
Making better business decisions earlier and at lower cost is what financial services executives continue to ask for, as both time and funds remain scarce and markets change quickly. Having the right information at the right time is important, but deriving actionable insights which are capable of being understood is more important. New capabilities for business intelligence, reporting and data visualisation enable all available data – internal and external – to be analysed and presented clearly, accurately and reliably for real-time insights that support decisions and create meaningful conversations. Making complex information simple and meaningful means the time to value is reduced and the opportunity for competitive advantage improved.


Taking the big data dive with Hadoop

Demand for analytics is at an all-time high. Monster.com has rated SAS as the number one skill to have to increase your salary, and Harvard Business Review continues to highlight why data scientist is the sexiest job of the 21st century. It is clear that if you want to be sexy and rich, we are in the right profession! Jokes aside, I have spent the past five weeks travelling around Australia, Singapore and New Zealand discussing the need to modernise analytical platforms to help meet the sharp increase in demand for analytics to support better business and social outcomes.

While there are many aspects to modernisation, the most prolific discussion during the roadshow was around Hadoop. About 20% of the 150-plus companies were already up and running with their Hadoop playpen. Questions had moved beyond “What is Hadoop?” to “How do I leverage Hadoop as part of my analytical process?”. Within the region we have live customers using Hadoop in various ways:

  • Exploring new text-based data sets like customer surveys and feedback.
  • Replicating core transaction system data to perform ad hoc queries faster, removing the need to grab extra data not currently supported in the EDW.
  • Establishing an analytical sandpit to explore relationships that can have an impact on marketing, risk, fraud and operations by looking at new data sets and combining them with traditional data sets.

The key challenge discussed was unanimous: while Hadoop provides a low-cost way to store and retrieve data, it is still a cost without an obvious business outcome. Customers were looking at how to plug Hadoop into their existing analytical processes, and quickly discovering that Hadoop comes with a complex zoo of capabilities and, consequently, skills gaps.

The SAS/Hadoop ecosystem

Be assured that this was and is a top priority in our research and development labs. In response to our customers' concerns, our focus has been to reduce the skills needed to integrate Hadoop into the decision-making value chain. SAS offers a set of technologies that enable users to bring the full power of business analytics functionality to Hadoop. Users can prepare and explore data, develop analytical models with the full depth and breadth of techniques, and execute the analytical model in Hadoop. It can best be explained using the four key areas of the data-to-decision lifecycle process:

  • Managing data – there are a couple of gaps to address in this area. Firstly, if you need to connect to Hadoop, read and write file data or execute a MapReduce job, Base SAS lets you use the FILENAME statement to read and write file data to and from Hadoop. This can be done from your existing SAS environment. Using PROC HADOOP, users can submit HDFS commands and Pig scripts, as well as upload and execute MapReduce tasks.
    SAS 9.4 is able to use Hadoop to store SAS data through the SAS Scalable Performance Data (SPD) Engine within Base SAS. With SAS/ACCESS to Hadoop, you can connect, read and write data to and from Hadoop as if it were any other source that SAS can connect to. From any SAS client, a connection to Hadoop can be made, and users can analyse data with their favourite SAS procedures and the DATA step. SAS/ACCESS to Hadoop supports explicit HiveQL calls. This means that rather than extracting the data into SAS for processing, SAS translates these procedures into the appropriate HiveQL, which resolves the results on Hadoop and returns only the results back to SAS (the sketch after this list illustrates the push-down idea). SAS/ACCESS to Hadoop allows the SAS user to leverage Hadoop just like they do an RDBMS today.
  • Exploring and visualising insight – with SAS Visual Analytics, users can quickly and easily explore and visualise large amounts of data stored in the Hadoop distributed file system using the SAS LASR Analytic Server. This is an extremely scalable, in-memory processing engine that is optimised for interactive and iterative analytics. This engine addresses the gaps in MapReduce-based analysis by persisting data in memory and taking full advantage of computing resources. Multiple users can interact with data in real time because there is no need to re-lift data into memory for each analysis or request, there is no serial sequence of jobs, and the available computational resources can be fully exploited.
  • Building models – SAS High-Performance Analytics (HPA) products (Statistics, Data Mining, Text Mining, Econometrics, Forecasting and Optimisation) provide a highly scalable in-memory infrastructure that supports Hadoop. Enabling you to apply domain-specific analytics to large data on Hadoop, it effectively eliminates data movement between the SAS server and Hadoop. SAS provides a set of procedures that enable users to manipulate, transform, explore, model and score data all within Hadoop. In addition, SAS In-Memory Statistics for Hadoop is an interactive programming environment for data preparation, exploration, modelling and deployment in Hadoop, with an extremely fast, multi-user environment; users can leverage SAS Enterprise Guide to connect and interact with LASR, or take advantage of SAS’ modern new web editor, SAS Studio.
  • Deploying and executing models – conventional model scoring requires the transfer of data from one system to SAS, where it is scored and then written back. In Hadoop, the movement of data from the cluster to SAS can be prohibitively expensive. Instead, you want to keep data in place and integrate SAS scoring processes on Hadoop. The SAS Scoring Accelerator for Hadoop enables analytic models created with Enterprise Miner or with core SAS/STAT procedures to be processed in Hadoop via MapReduce. This requires no data movement and is performed on the cluster in parallel, just as SAS does with other in-database accelerators.
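
To illustrate the push-down idea referenced in the "Managing data" item above (keep the heavy lifting in the cluster and return only results), here is a generic Python sketch using the PyHive client. It is not SAS/ACCESS; the connection details and the table name are assumptions for illustration only.

```python
from pyhive import hive  # assumes a reachable HiveServer2; details are illustrative

conn = hive.Connection(host="hadoop-edge-node", port=10000, username="analyst")
cursor = conn.cursor()

# Anti-pattern: pull every detail row out of Hadoop, then aggregate on the client.
# cursor.execute("SELECT state, deposit_amt FROM bank.deposits")
# rows = cursor.fetchall()   # potentially billions of rows moved across the network

# Push-down pattern: let Hive resolve the aggregation and return only the summary.
cursor.execute("""
    SELECT state, SUM(deposit_amt) AS total_deposits
    FROM bank.deposits
    GROUP BY state
""")
for state, total in cursor.fetchall():
    print(state, total)
```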

To be ahead of competitors we need to act now to leverage the power of Hadoop. SAS has embraced Hadoop and provided a flexible architecture to support deployment with other data warehouse technologies.  SAS now enables you to analyse large, diverse and complex data sets in Hadoop within a single environment – instead of using a mix of languages and products from different vendors.

 Click here to find out how SAS can help you innovate with Hadoop.


SAS users bask at SUNZ

SAS users at the SUNZ conference in Wellington

With a record-high 309 in attendance at SUNZ 2014, it seems every SAS user in New Zealand was at Te Papa Museum in Wellington last month for the flagship user conference. If you were unable to attend, you missed a great presentation by me and lots of good ones by everyone else.

But in all seriousness, the conference was a success, and we were honored to have a distinguished lineup of speakers from both the public and private sectors.

The day opened with Minister for Social Development Paula Bennett discussing her department’s use of SAS to protect vulnerable children. Bennett told how predictive risk modeling is being used by the state to summarize information about children quickly so frontline workers can better protect those in need.

“We found it’s possible to pinpoint which children are most at risk before harm is done,” said Bennett. For more on this story, watch SAS Analytics helps the Ministry of Social Development in New Zealand. You can also watch Bennett's SUNZ speech or read the transcript.

Next came a complementary keynote from James Mansell, Director of Innovation and Strategy at the Ministry of Social Development. Mansell added that analytics is essential in the public sector – however, data must be shared for benefits to be fully realized. “Until we can join up data across the various organizations looking after citizens, we cannot act as effectively and deliver better outcomes,” said Mansell.

He was followed by Kiwibank Senior Economist Donna Purdue, who energized the room with news of New Zealand’s bright economic future, noting 2014 is shaping up to be a bumper year.

With the morning sessions in the bag, the group enjoyed some tea before splitting off into the various business, technology and SAS update streams. One I’ll mention was by Loyalty New Zealand Analytics Manager Vince Morder, who described his company’s use of SAS® Customer Intelligence to understand shopping patterns and increase partner involvement with its popular Fly Buys program. Another was by Carl Rajendram, National Manager of Commercial Pricing and Analytics with NZI Insurance. Rajendram delivered a passionate testimony of his company’s use of SAS® Data Quality to evaluate risk and optimize insurance pricing.

Select presentations are available on the SUNZ website.

Much like the braided streams that converge into the Pacific, attendees convened once again in the Soundings Theatre for Ken Quarrie’s locknote presentation, “The Game is Changing.” Quarrie is Senior Scientist for New Zealand Rugby, and he held the room in rapture as he showed old footage, remarking how analytics has been instrumental in helping the All Blacks prevent injury, improve player performance and lead to an overall exceptional win ratio.

And finally, it wouldn’t be a SUNZ conference without cold beers to end it – a fitting conclusion for a family of SAS users who form a close community of bright, passionate professionals – all working to make New Zealand a smarter and safer place.

***

For press coverage of SUNZ 2014, see the links below.

  • About this blog

    The Asia Pacific region provides a unique set of challenges (ความท้าทาย, cabaran) and opportunities (peluang, 机会). Our diverse culture, rapid technology adoption and positive market has our region poised for great things. One thing we have in common with the rest of the world is the need to be globally competitive while staying locally relevant. On this blog, our key regional thought leaders provide an Asia Pacific perspective on doing business, using analytics to be more effective, and life left of the date line.