How does big data power your electricity?

The energy & utilities industry as a whole has experienced a seismic shift over the past five years due to rising costs and price pressures, and has become a priority discussion on the political and media agenda. Falling overall demand combined with “peakier” peaks is making supply, forecasting and public perception management more difficult by the day.

For the gentailers, this is a data-rich environment driven by the need for marketing, loyalty and retention, but also by the need to ensure forecasts are accurately aligned with demand and that sound, profitable trading decisions are made. Competition is fierce, as Australia has moved to a contestable market in most jurisdictions. This competitive pressure has produced some of the highest customer churn rates in the world – pushing towards 30 percent, including connects and disconnects, in some states. It has forced retailers to focus sharply on what they need to do to maintain customer profitability and prevent churn.

An industry driven by customer data

The industry comprises two types of organisations – the network businesses and the gentailers (generator-retailers) – which approach their customers in vastly different ways.

On the network side of the business there are continuing cost pressures. The recent modernisation of network infrastructure has been the largest contributor to price increases over the past five years, particularly in the larger jurisdictions such as NSW. This period of increased investment coincided with the Global Financial Crisis, as well as other environmental, economic and technology factors altering the country’s energy profile in a way never experienced before. The result is a strong push, and the political will, to drive down costs on the network side of the business and for companies to engage better with their stakeholders.

The ability for utilities companies to better understand their customers, optimise pricing models, and manage demand to meet individual requirements is the driving force for keeping customers.

Data and analytics are helping predict customer behaviour and identify the customers most likely to churn, and the points at which churn is likely to happen. For us as consumers, it means retailers will use the insight gleaned from every channel – call centre, digital and other forms of marketing – to really understand us and ensure our pricing plans are based on our needs.

Video: Hear how Spanish energy company Endesa has used SAS to improve its performance, reducing churn and increasing profitability.

And the proof is in the data. Endesa, a Spanish utilities organisation, operates in a highly competitive market following the move from a regulated to a deregulated industry. SAS Analytics helped it get to the core of what its customers were thinking, predict the likelihood of churn and intervene before it happened. Not only has this meant increased customer satisfaction and loyalty, but over two years churn has been reduced by 50 percent.

Understanding the customer to drive retention is just one part of the complexity faced by energy and utilities organisations. The other part is understanding a customer’s behaviour, and in particular how they generate demand for electricity. Interestingly, demand in Australia is actually falling and becoming increasingly concentrated around peak times. Utilities need to understand exactly when those peaks will happen so that trading decisions are optimised, while network companies are focused on predicting requirements more accurately and identifying where to apply their capital budget to assets in the short, medium and longer term.

The changing energy landscape in Australia

The uptake of solar and other renewable installations across Australia is also contributing to the change, particularly in peak load forecasting, in ways traditional forecasting methods struggle to predict. Through advanced analytics, gentailers are using the available data to better understand the profile of solar and renewable users and how it affects load predictions, particularly on peak days. For the company, the service provided is much more closely aligned with reality, and investment isn’t wasted on unnecessary infrastructure.

As an example, one utilities organisation in this region has been able to trade much more effectively by significantly reducing its MAPE (Mean Absolute Percent Error). In ANZ this figure has often been 10 percent or higher; this SAS customer has reduced it to considerably less than 3 percent. For the company this translates to millions on the bottom line and a much clearer understanding of what energy users actually require. Higher accuracy means utilities don’t overbuy electricity – which in turn has a positive effect on consumer electricity bills.
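For readers unfamiliar with the metric, MAPE simply expresses average forecast error as a percentage of actual demand. Here is a minimal Python sketch of the calculation on invented demand figures (not the customer’s data) – shaving this number from 10 percent down to under 3 percent means buying far closer to what is actually consumed.

```python
# Minimal sketch of Mean Absolute Percent Error (MAPE) on illustrative data.
# The demand figures below are invented for demonstration only.

def mape(actual, forecast):
    """Mean absolute percent error across paired observations."""
    errors = [abs(a - f) / a for a, f in zip(actual, forecast) if a != 0]
    return 100 * sum(errors) / len(errors)

actual_mw   = [1200, 1350, 1100, 1800, 1650]   # observed half-hourly demand (MW)
forecast_mw = [1150, 1400, 1180, 1700, 1600]   # forecast demand (MW)

print(f"MAPE: {mape(actual_mw, forecast_mw):.1f}%")
```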

Thinking like the customer: the value in mapping the customer journey

Data-driven marketing is all about how marketers can harness data and analytics to create a more customer-centric, fact-based approach to customer engagement. This, combined with quality execution, leads to better customer experiences and improved customer equity.

However, when we look at customer-brand interactions in silos – the call centre in one place, online in another – we don’t gain an accurate view of the customer lifecycle. We need to be able to look at the entire lifecycle and every touch point it involves across all channels, including call centres, in-store and online.

Now, it’s easy to say all of this in theory. However, when it comes to actually implementing it, or even obtaining data on all these touch points – especially when you have thousands, if not millions, of customers – the process becomes complex. It’s no wonder some marketers struggle to get a holistic view of how they should be engaging with their customers.

This is where customer intelligence comes into play.

Data plays a huge role in mapping out the customer journey, which is one of the key enablers for understanding individual preferences and motivations to create that feeling of appreciation and recognition. Not only does this build long-term value into the conversation, it ensures consistency in messaging across sales and service efforts.

It’s important to ask ourselves, as professional marketers, how we could go about our job differently in an attempt to further enhance our perception and knowledge of customer lifecycles.

It is now widely accepted that customer centricity is a priority in the boardroom. Recognising that customers are the most important asset to any organisation is crucial, and many have come to this conclusion through practice. However, the challenge is determining how to develop this holistic relationship with individual customers. If marketing is unable to gain this understanding, they will struggle to position themselves strategically in the boardroom.

So, how do you avoid the struggle and embed customer centricity in the company’s DNA?

Think of how you approach personal interactions and relationships – when and how did you last speak with that person? What was the context? Was it on the phone, email or in person? Was it a defining moment for them, and if so was it positive or negative? All of these are valid questions that can, and should be considered when creating a mutually valuable interaction with another individual, whether it be personal or with a customer. Customer intelligence helps accomplish this. It allows you to scale that understanding into a holistic communication program across multiple channels, ultimately enabling you to take advantage of those ‘defining moments’ and make tailored offers or decisions that are relevant and timely to that individual.

The relationship you form with a customer needs to reflect the interpersonal relationships you have formed in your own life. Things like common interests, past conversations and preferences – all that jazz – have to be accessible and utilised in order to build a long-term relationship with that particular customer. Customer intelligence does exactly that: it provides readily available information that can be used at one’s own discretion to specifically target that persona’s needs.

Whatever happened to customer segmentation?

I recall that in the not too distant past customer segmentation, in its many guises, was a flavour of the times. Segmentation was (and still is), however, a greatly misused term, with organisations confusing the (correct) approach of offering a strategic view of a customer base with the more tactical initiatives of predictive modelling.

I have become aware that ‘true’ customer segmentation seems to be disappearing as an input to the planning cycle.

To recap, traditionally there have been a number of ways for organisations to segment their customer base. A few examples of how segmentation could be performed are by:

  • Profitability.
  • Life-stage.
  • Behaviour.
  • Attitude.

The choice of segmentation would always depend on the business application.

Behavioural segmentation uses a large number of behaviour-related data fields to define a relatively small number of roughly homogeneous segments, each of which is distinct in terms of its dominant characteristics.

Such systems are quite different from traditional segmentation schemes, which are largely intuition-led (rather than data-led) and defined in terms of very few fields (Recency-Frequency-Value is typical of this class).
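To make the contrast concrete, here is a minimal sketch of how little an intuition-led Recency-Frequency-Value scheme needs – just three fields, scored into terciles. The data and field names are invented for illustration; a behavioural segmentation would instead draw on dozens of fields.

```python
# Minimal Recency-Frequency-Value (RFV) sketch on invented data.
# Each customer is scored 1-3 on each dimension by tercile, then labelled
# with a three-digit cell such as "331".
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5, 6],
    "days_since_last_purchase": [5, 40, 200, 15, 90, 365],
    "purchases_last_year": [24, 6, 1, 12, 3, 2],
    "total_spend": [2400, 300, 50, 900, 150, 80],
})

# Higher score = more recent, more frequent, more valuable.
customers["R"] = pd.qcut(-customers["days_since_last_purchase"], 3, labels=[1, 2, 3]).astype(int)
customers["F"] = pd.qcut(customers["purchases_last_year"].rank(method="first"), 3, labels=[1, 2, 3]).astype(int)
customers["V"] = pd.qcut(customers["total_spend"].rank(method="first"), 3, labels=[1, 2, 3]).astype(int)
customers["rfv_cell"] = customers[["R", "F", "V"]].astype(str).agg("".join, axis=1)

print(customers[["customer_id", "rfv_cell"]])
```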

Segmentation should always have been considered as a powerful tool within customer relationship management and its role is at the heart of customer relationship building.

As previously stated, a statement such as “derive segments for my next campaign” is completely at odds with what a ‘true’ customer segmentation should stand for. Customer segmentation should not be confused with specific campaign activity.

So what does segmentation stand for?

In my view, it should be to use available data or information to create natural groupings of customers or prospects which:

  • Simplify the description of a large heterogeneous customer base or market.
  • Suggest tailored marketing strategies, objectives, hypotheses and idea generation.

Importantly, it is worth remembering that there is NO UNIQUE solution and what is best will depend on the application.

Applications and benefits

Segmentation should be considered a planning tool, usually for marketing, and lends itself to:

  • Improved relationship building through tailored customer management and marketing strategies.
  • Segment-specific targets & performance monitoring.
  • Cross-selling plans.
  • Broad brush targeting.
  • A new sampling frame.
  • Framework to measure overall success of customer strategies, as opposed to campaign-by-campaign evaluation.

So the words “planning” and “strategy” are key to what segmentation is meant to deliver against. The reference to “targeting” has been prefaced by “broad-brush” as segmentation as an approach will deliver an inexact allocation of customers to segments.

Although historically segmentation systems have been derived through the use of complex multivariate statistical methods, a more sensible approach has been to combine logical rules with approaches such as clustering.
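As a rough sketch of that combined approach (the data, field names and rule threshold are all invented), the example below applies a simple logical rule first – very new customers form their own segment – and then lets k-means find natural groupings among the rest.

```python
# Sketch: combine a logical rule with k-means clustering for segmentation.
# Data and field names are invented for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Columns: tenure_months, transactions_per_month, avg_balance
behaviour = rng.normal(loc=[36, 12, 5000], scale=[18, 6, 3000], size=(500, 3)).clip(min=0)

# Logical rule first: very new customers form their own segment regardless of behaviour.
is_new = behaviour[:, 0] < 6
segments = np.full(len(behaviour), -1)
segments[is_new] = 0  # "new customer" segment

# Then cluster the remaining customers on standardised behavioural fields.
established = behaviour[~is_new]
scaled = StandardScaler().fit_transform(established)
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42).fit(scaled)
segments[~is_new] = kmeans.labels_ + 1  # segments 1-4

for seg in sorted(set(segments)):
    print(f"Segment {seg}: {np.sum(segments == seg)} customers")
```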

If segmentation is a planning and strategic tool then targeting and predictive models can be viewed as a series of “knitting needles” that cut through the segments selecting individuals for specific marketing activity. True, there may be more customers selected from certain segments but customers are likely to be selected from ALL segments.

There may still be significant variation between customers within each segment; therefore communicate with individual customers, not segments.

Customer segmentation as a planning tool

This weapon in the marketing planners’ armoury appears to have fallen out of favour in recent times – more and more organisations appear to ignore this powerful strategic tool.

Why might this be the case?

  • The term “customer centricity”, for one thing, might have steered organisations towards thinking solely about more tactical activity, focused at the individual customer level and actioned through specific targeting criteria. However, that activity should be driven from a thorough understanding of the customer base, with a key part of that coming through identifying the customer segments that exist
  • Organisations talk in terms of the “segment of one”. This is not an advancement on traditional segmentation applications; it is getting to know a customer as an individual, which of course requires investment in new technologies and advanced analytics. But it is not “segmentation” in the true meaning of the term
  • The approach to planning has changed. Have marketing planners and strategists simply turned their back on segmentation as a tool to help them? This may be due to:
    • Lack of understanding as to what segmentation is and how it can help
    • Perceived other priorities, with the headlong rush to be reactive (to competitors) rather than be proactive and drive activity through segmenting the customer base and “knowing your customers”
    • Lack of analytical support and skill to develop a tailor made segmentation system

Why is segmentation still important?

Customer segmentation as described here should still be considered an important part of the marketing planning process and will give organisations an enhanced ability to tailor marketing and management strategies.

Segmentation can be an even more powerful tool given the increase in the volume and variety of data now available for analysis. The ability to incorporate unstructured, web and social media data, campaign response and channel data as drivers of segmentation systems means that segments can be understood in even greater detail, making them even more powerful inputs into the planning cycle.

Historically it could be difficult for the analyst to explain segments – what constitutes the typical features of an individual segment and what makes one segment different from another.

With the advent of powerful visualisation tools on the market (for example SAS Visual Analytics) it is possible to easily demonstrate the key features of, and differences between, segments. The ability to report on segment migration and membership through a range of segment-specific reports is also available through such tools.

These will undoubtedly make the understanding of segments far easier and hopefully reinforce the value and acceptance of segmentation at the heart of customer relationship building.

The Role of Data Management in Risk and Compliance

As a Data Management expert, I am increasingly being called upon to talk to risk and compliance teams about their specific and unique data management challenges. It’s no secret that high quality data has always been critical to effective risk management, and SAS’ market-leading Data Management capabilities have long been an integrated component of our comprehensive Risk Management product portfolio. Having said that, the amount of interest, project funding and inquiries around data management for risk has reached new heights in the last twelve months and is driving a lot of our conversations with customers.

It seems that not only are organisations getting serious about data management; governments and regulators are also getting into the act, enforcing good data management practices in order to promote the stability of the global financial system and avoid future crises.

As a customer of these financial institutions, I am happy knowing that these regulations will make them more robust and better able to withstand future crises by instilling strong governance and best practices around how data is used and managed.

On the other hand, as a technology and solution provider to these financial institutions, I can sympathise with their pain and trepidation as they prepare and modernise their infrastructure to support their day-to-day operations while remaining compliant with these new regulations.

Globally, regulatory frameworks such as BCBS 239 are putting the focus squarely on how quality data needs to be managed and used in support of key risk aggregation and reporting.

Locally in Australia, APRA's CPG-235 provides principles-based guidance outlining the roles, internal processes and data architectures needed to establish a robust data risk management environment and manage data risk effectively.

Now I must say, as a long-time data management professional, this latest development is extremely exciting to me and long overdue. Speaking to some of our customers’ risk and compliance departments, however, it is clear the same enthusiasm is not shared by those charged with implementing these new processes and capabilities.

Whilst the overall level of effort involved in terms of process, people and technology should not be underestimated in these compliance-related projects, there are things organisations can do to accelerate their efforts and get ahead of the regulators. One piece of good news is that a large portion of the compliance-related data management requirements map well to traditional data governance capabilities. Most traditional data governance projects have focused on the following key deliverables:

  • Common business definitions
  • Monitoring of key data quality dimensions
  • Data lineage reporting and auditing

These are also the very items that regulators are asking organisations to deliver today. SAS’ mature and proven data governance capabilities have been helping organisations with data governance projects and initiatives over the years, and are now helping financial institutions tackle risk and compliance related data management requirements quickly and cost effectively.
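As a rough illustration of one of those deliverables – monitoring key data quality dimensions – here is a minimal Python sketch that checks completeness and validity on an invented customer extract. It is not SAS code; the field names and rules are assumptions for the example.

```python
# Minimal sketch: monitoring two common data quality dimensions
# (completeness and validity) on an invented customer extract.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [101, 102, 103, 104, None],
    "state":       ["NSW", "VIC", "New South Wales", None, "QLD"],
    "exposure":    [250000, -100, 980000, 120000, 45000],
})

valid_states = {"NSW", "VIC", "QLD", "SA", "WA", "TAS", "NT", "ACT"}

completeness = df.notna().mean()                         # share of non-missing values per field
state_validity = df["state"].isin(valid_states).mean()   # share of state codes in the reference list
exposure_validity = (df["exposure"] >= 0).mean()         # business rule: exposures cannot be negative

print("Completeness by field:\n", completeness)
print(f"State code validity:    {state_validity:.0%}")
print(f"Exposure rule validity: {exposure_validity:.0%}")
```

Tracked over time, measures like these become the evidence of data quality that regulators increasingly expect to see.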

Incidentally, our strong data governance capabilities, along with our market-leading data quality capabilities, were cited as the main reasons SAS was selected as a category leader in Chartis Research’s first Data Management and Business Intelligence for Risk report.

The combination of our risk expertise and proven data management capabilities means we are in a prime position to help our customers with these emerging data management challenges. Check out the following white papers to get a better understanding of how SAS can help you on this journey.

  • BCBS 239: Meeting Regulatory Obligations While Optimizing Cost Reductions
  • Risk and Compliance in Banking: Data Management Best Practices
  • Best Practices in Enterprise Data Governance

Treat Yourself with Text

Did you check your email and/or favourite social media site before leaving for work this morning? Did you do it before getting out of bed? Don’t worry, you’re not alone. I do it every morning as a little treat while I “wake up” and whether I realize it or not, sometimes it sets the tone for the rest of the day.

The other day I was looking at my Facebook news feed and a couple of things drew my attention. One of them was an abridged transcript of a Conan interview with George R. R. Martin, writer of the A Song of Ice and Fire books on which the TV series Game of Thrones is based. Because I don’t have much time in the mornings, if the article had been much longer I would probably have stopped partway through. If I had, I would have missed this quote right at the end: “I hate some of these modern systems where you type a lowercase letter and it becomes a capital. I don’t want a capital. If I’d wanted a capital, I would’ve typed a capital. I know how to work the shift key.” This put a smile on my face and me in a good mood for the rest of the day.

This made me think.

What if I had woken up just five minutes too late to have read the whole thing, or the article at all? How many other interesting things could I have missed because I didn’t have time or had a “squirrel” moment? And as everybody with a social media account knows, there is too much out there to delve into everything.

With all that data, how can I be sure that I’m exposing myself to the most interesting information? Maybe it’s just a matter of looking at how many people have viewed something. But then I like to think I’m unique so that doesn’t really work. Maybe I can only look at things my closest friends recommend. But that’s more about being a good friend than being interested. Sorry friends.

Let’s take this situation to your organisation. How do you know what information is relevant to your business? There are a myriad of techniques to analyse the structured data that the data warehouse folks have invested a large amount of time designing and the business folks have spent vetting. But how about the unstructured data – the text-based data like survey responses, call centre feedback and social media posts? This data accounts for up to 90% of all information available for analysis.

How can we make use of this text-based data? Should you have people read through all the documents and give their opinions on themes and trends? There is an inherent bias and unreliability in that, as people have different backgrounds and perspectives, and it’s unlikely that one person could read everything in a timely manner. On top of all this, what we need is more than just a word search. It’s more than word counts or word clouds. It’s more than just discovering topics. In fact, what we really need is to attach a measure of prediction to words, phrases and topics:

  • Escalate an incoming customer service call to the client relations team because the caller has used 3 key “future churn” phrases in the right order.
  • Redesign a product because very negative feedback always contains words that are classified under “aesthetic features”.
  • Discover the true customer service differentiators which give a positive Net Promoter Score (NPS).
  • Identify the areas where law enforcement should increase its presence to protect the public from activities being promoted on social media that are likely to have dangerous outcomes.
  • In a B2B scenario, understand and communicate the gaps in knowledge of the client organisation based on the volume, topics and severity of support calls they put through to the service organisation.
  • Determine the root cause of employee concerns and the best methods to manage them.

We need Text Analytics to structure the unstructured text-based data in an objective way.

There are two sides to Text Analytics:

  • A statistical approach where text data is converted into numerical information for analysis, and words and phrases are grouped by their common pattern across documents. The converted data and groupings can then be used alone or combined with structured data in a statistical model to predict outcomes.
  • A linguistic approach or Natural Language Processing (NLP) where logical text-based rules are created to classify and measure the polarity (e.g. of sentiment) of documents.

Both sides are equally important because, despite how far computing algorithms have advanced, there is still a lot of nuance in the way people speak, like sarcasm and colloquialism. By using techniques from both sides in an integrated environment, we can create a whole-brained analysis which includes clustering of speech behaviour, prediction of speech and other behaviour against topics, and prediction of the severity of sentiment towards a product, person or organisation.
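To make the statistical side concrete, here is a minimal sketch (not SAS code – it uses scikit-learn on a handful of invented comments and churn labels) of turning free text into numeric features and fitting a simple model that attaches predictive weight to words and phrases.

```python
# Sketch: convert unstructured text into numeric features (TF-IDF) and
# fit a simple model that attaches predictive weight to words and phrases.
# The comments and labels below are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "billing was wrong again and no one called me back",
    "great service, the technician arrived on time",
    "I am switching providers, the fees keep going up",
    "easy to use website and the plan suits my needs",
    "still waiting on my refund, very disappointed",
    "friendly call centre staff resolved my issue quickly",
]
churned = [1, 0, 1, 0, 1, 0]  # 1 = customer later churned

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, churned)

new_comment = ["fees went up again and nobody called back"]
print("Predicted churn probability:", model.predict_proba(new_comment)[0][1].round(2))
```

The linguistic side would add hand-crafted rules on top of this, for example to catch negation or sarcasm that a purely statistical model misses.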

One organisation which has been using Text Analytics from SAS for a number of years to provide proactive services to its clients is the Hong Kong Efficiency Unit. This organisation is the central point of contact for handling public inquiries and complaints on behalf of many government departments. With this comes the responsibility of managing 2.65 million calls and 98,000 e-mails a year.

"Having received so many calls and e-mails, we gather substantial volumes of data. The next step is to make sense of the data. Now, with SAS®, we can obtain deep insights through uncovering the hidden relationship between words and sentences of complaints information, spot emerging trends and public concerns, and produce high-quality, visual interactive intelligence about complaints for the departments we serve." Efficiency Unit's Assistant Director, W. F. Yuk.

Whatever size your organisation is, and whatever purpose it serves, there are many sources of text-based data that are readily available and may already sit amongst the structured data in your data warehouse, in Hadoop or on a network drive. By using this data to supplement the structured data many people are already analysing, we can better pinpoint not only what is driving behaviour but how we can better serve our customers and our employees. Wouldn’t it be great to know what is relevant to people in and out of your organisation without having to manually read thousands of documents?

Applying Text Analytics to your documents is treating yourself with Text because amongst the masses of words you will find nuggets which will brighten up your (and your company’s) day.

If you’re interested in treating yourself with Text and are in the Sydney region this July, sign up for the SAS Business Knowledge Series course Text Analytics and Sentiment Mining Using SAS taught by renowned expert in the field Dr Goutam Chakraborty from Oklahoma State University. Dr Chakraborty will also be speaking of his experiences in the field at the next Institute of Analytics Professionals of Australia (IAPA) NSW Chapter meeting.

Happy Texting!

Are you riding the analytics curve?

Ever looked into what being “behind the curve” means? I’ll save you the time – it is being less advanced than, or getting somewhere more slowly than, other people. Remember grade school? No-one likes being behind the curve – that’s where people tend to get crunched. So how do you know if you’re ahead of or behind the curve? Measurement. Thanks to the poll we ran in our last newsletter, you’ll be interested to know that most of your peers in Australia and New Zealand stated they were either “behind the curve” or “comparable”.

Coincidentally, around the time we ran our poll question, MIT Sloan Management Review and SAS surveyed 2,000 executives globally, resulting in a report titled “The Analytics Mandate”. The results showed that “organisations have to do more to stay ahead of the curve,” as Pamela Prentice, SAS chief research officer, put it. “Nine in ten believe their organisations need to step up their use of analytics. This is true even among those who report having a competitive advantage.”

For us in the antipodes, there’s good news and bad news. The good news is that you are not alone! Don’t feel bad - if you think your organisation is behind the curve with your analytics mandate, so do your global peers. Now’s the right time to shine and open that window of opportunity to get ahead of the curve!

Unfortunately, there’s bad news too. Isn’t there always?! Being behind the curve might not seem like a big deal. Your organisation is still profitable, right? Assuming you are and that you’re not going anywhere, have you heard the saying, “if you snooze you lose”? That’s the real issue. It’s not about what you’re not doing. It’s about what your competitors are doing. How will your business survive if this is not part of your future strategic goal?

I refer back to the Analytics Mandate research, which states that an analytics culture is the most significant factor in achieving a competitive advantage from data and analytics.

Changing the culture of an organisation does not happen overnight. In fact, it is probably the hardest transition to make as you try to juggle people, process, technology and the big O – Opinion! But once it is achieved, the business outcomes and competitive advantage accelerate rewardingly. Here’s proof from Australian Natural Care, a medium-sized online retailer of vitamins and supplements: “We have taken the results we’ve been getting to board level to report on the wins and everyone is pleased. We went from having no analytic capabilities to building analytical models within four weeks of implementation. This has had a direct impact on our entire business.”

How does that happen? To drive analytical aspiration and encourage an analytics culture, the Analytics Mandate study recommends you answer the following:

1. Is my organisation open to new ideas that challenge current practice?
2. Does my organisation view data as a core asset?
3. Is senior management driving the organisation to become more data driven and analytical?
4. Is my organisation using analytical insights to guide strategy?
5. Are we willing to let analytics help change the way we do business?

How did you go? Did you score five out of five? If so, then you’re on your way to analytical competitive genius!

If you need some guidance, read Evan Stubbs’ book, Delivering Business Analytics: Practical Guidelines for Best Practice, to get some ideas on mandating the Analytics Mandate.

And if you were analytically enthusiastic enough to read this complete blog and are among the first 25 readers to complete this Business Analytics Assessment, I will send you a signed copy of his book. Natalie.Mendes@sas.com

See you on the other side of the curve!

Capital Management - Driving Business Evolution with Enterprise Risk Management

For many years enterprise risk management (ERM) was more of a concept than a reality – a visionary strategy that spanned every department within an organisation at every level. In recent years, however, ERM programmes have indeed been implemented by organisations of all sizes, with treasury central to this process. This article asks what constitutes an effective ERM programme, and how a company goes about creating a solid risk framework with treasury now playing a vital role.

ERM is no longer aligned to an academic-based vision with only limited application in the real world. Investment in ERM infrastructure and analytics is now viewed as integral to the management decision process, especially in financial services firms. A successful bank must conduct stress testing and create a capital planning approach, integrating risk and financial data. While regulation plays a part in this change, it is only a fraction of what banks must do to remain profitable.

An ERM vision must have an IT infrastructure and risk culture that support cross-functional collaboration, such as the partnership of risk and finance. Collaboration requires understanding all portfolio risks and how to manage those risks to meet the long-term goals of the bank. While a chief financial officer (CFO) is responsible for balance sheet growth, the chief risk officer (CRO) is in charge of risk control objectives, such as minimising balance sheet risk and calibrating this with the bank’s risk appetite.

The CRO must also keep balance sheet exposure in check at all times. For the treasury function, Basel III regulations demand comprehensive assessments that affect businesses. For example, banks may examine each instrument in terms of its contribution to the balance sheet, leverage, liquidity, profit and loss (P&L), regulatory and economic capital, and risk-adjusted profitability. Getting a holistic view of different portfolios from one place is both a big data and a systems challenge for banks.

Re-examining Capital Planning

New approaches to capital planning have elevated the role of treasury beyond solvency and into strategic planning, helping banks evaluate long-term capital under forward-looking projections as well as stressed scenarios. Regulation has focused attention back on capital efficiency and allocating increasingly scarce capital to businesses that outperform on a risk-adjusted basis.

Data management and quality remain key challenges to the delivery of these new approaches. Specifically for asset liability management (ALM) and liquidity risk, data is necessary to maintain a comprehensive view of the balance sheet. For a large bank, the balance sheet can comprise trillions of dollars in assets, and the data requirement for asset liability management is huge since analysis must be performed on all assets and liabilities.

The data challenge is obvious when integrating information from the risk side (for regulatory and economic capital) with the finance side (for financial statements, general ledger accounts, liquidity, Generally Accepted Accounting Principles and International Financial Reporting Standards (GAAP/IFRS) compliance).

Historically, these departments operated independently and had no business incentive to work together. But imperative regulations – such as balance sheet stress testing, IAS 39, IFRS 9 and Dodd-Frank – are now motivating risk and finance groups to work together for compliance. An integrated view of risk and finance data conveys far more information than a disjointed view. Furthermore, banks can use that information for effective future strategic planning – the main goal when optimizing capital planning and management.

An ERM architecture must have the ability to process each data item required. The best practice is to establish automatic metadata management that tracks the lineage of all models, stress scenarios, and data outputs.

The goal for ERM is to calculate analyses only once and use the results in various applications. The planning systems evolve from pure recipients of data into new suppliers of data. Within a ‘data universe’, these systems must provide information of overriding interest (e.g., cash flow plans) as a central service. When implemented consistently, the result is an integrated bank control architecture with a single point of truth.

Stepping Up Stress Testing

Another powerful ERM tool is the increasing use of stress testing to help business executives gain enterprise and business unit level oversight.

Financial institutions (FIs) are implementing enterprise stress testing systems that can be used for various analyses and reporting. Stress testing a bank’s portfolio involves several dimensions: macroeconomic factors must be linked to the portfolio’s loan performance (credit risk), liquidity profile and market risk. This is a difficult modeling exercise that banks were not previously required to do.

Since the development of Basel II, stress testing has been hotly debated and, in many instances, inadequately implemented. The topic is relevant both from a regulatory perspective and for internal risk assessment in the context of the internal capital adequacy assessment process (ICAAP). The Capital Requirements Directive (CRD IV) consultations, as the European implementation of Basel III, have already addressed the existing shortcomings in both stress testing and back testing.

Stress tests represent only one form of simulation that highlights the effects of extremely negative developments. However, the simulation capability is important for applications that go far beyond this, especially for opportunity identification such as economic development planning and control. A simulation can, and should, make it possible to evaluate the effects of decisions.

In stress tests, the financial position of the bank is investigated by simulating different environmental conditions (scenarios). Negative economic developments are simulated by means of risk parameters to reveal their potential effect on the portfolio. The goal is early detection of serious changes in the risk structure in order to assess bank stability, especially in periods of crisis. For planning timely countermeasures, both the executive board and the supervisory board need to know and understand the risks.

To use stress tests continuously and regularly as a management tool, a process must transparently document the assumptions, models, and the results. The risk process must also be established across departments so that the entire bank’s risk can be determined, which influences the calculation of the risk-bearing capacity.
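As a toy illustration of the simulation idea (all parameters and the macro-to-default link below are invented), the sketch runs a Monte Carlo comparison of a baseline and a stressed unemployment scenario against a loan portfolio.

```python
# Sketch: a toy Monte Carlo stress test linking one macro factor
# (unemployment) to credit losses on a loan portfolio.
# All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
n_scenarios = 10_000
portfolio_exposure = 5_000_000_000  # total exposure in dollars
loss_given_default = 0.45

# Baseline vs stressed unemployment paths (mean, std of the simulated rate).
scenarios = {"baseline": (5.5, 0.5), "severe_stress": (9.0, 1.0)}

for name, (mean_u, std_u) in scenarios.items():
    unemployment = rng.normal(mean_u, std_u, n_scenarios)
    # Simple (invented) link: default rate rises 0.4% per point of unemployment above 4%.
    default_rate = np.clip(0.01 + 0.004 * (unemployment - 4.0), 0, 1)
    losses = portfolio_exposure * default_rate * loss_given_default
    var_99 = np.percentile(losses, 99)
    print(f"{name:>14}: expected loss ${losses.mean()/1e6:,.0f}m, 99th percentile ${var_99/1e6:,.0f}m")
```

A real enterprise stress test adds many more risk factors, portfolio segments and liquidity effects, but the documented, repeatable scenario-to-loss chain is the same.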

Benefits of ERM

ERM implementations attempt to develop an integrated method of approaching risk. This means integration of the different approaches to risks and solutions, as well as approaches to identify and manage risk effectively. ERM can provide firms with the means to decide how much uncertainty (risk) to accept while achieving growth, with new stress testing and capital planning capabilities playing an increasingly important role in senior management decision making.

The key to meeting ERM objectives is to develop a risk infrastructure that obtains, and then maintains, a holistic view of risk while dealing with ever-evolving data, complexity in measurement and timely management reporting. The reality for many firms, however, is that following such an approach depends heavily on how the IT infrastructure has grown.

How has your approach to capital planning changed in light of today’s regulatory and financial environment?

Learn more about leading practices in capital planning and risk-adjusted performance management in the SAS whitepaper Capital Management for Banking: Dawn of a New Era, and by visiting us at SAS Capital Planning & Management.

- by Larry Roadcap and Dave Rogers, SAS

Analytics in Government: The new era

“Evidence-based policy development” is a term which has been in use for a number of years. In most cases, however, it remains more an aspiration than a practical reality. The consciousness of using more data to underpin policy has risen, but the change in practice has been modest. The question is whether agencies are using all the internal and external data at their disposal, exploring that data to gain valuable insights, using predictive modelling to determine how different options play out over the life of a program, and balancing budgets against optimal outcomes.

Typically the basis of policy proposals consists of historical data, analytical models from other agencies (such as treasuries), accepted wisdom and professional opinion. While there may be valid reasons for ministers to sometimes take a path different from what the data might suggest, it is even more important in these circumstances to model the likely outcomes in order to protect the government from unintended consequences.

New generations of policy developers are likely to be far more receptive to “scientific” approaches to policy development. The imperative, however, is to provide them with access to all relevant data and the tools which enable them to explore it. Another aspect to consider is that the greater quantity of accessible data (often enabled by government ‘open data’ policies) potentially exposes governments to challenges from opposition parties, individuals, academics and advocacy groups who, based on their own analysis of the available data, may question the declared expected outcomes of policies and programs. It is therefore important for ministers and their agencies to be able to demonstrate the strong factual foundation of their policies and programs. Importantly, the inflection point is now, since all of this is already possible – the question is whether the will exists to fundamentally change the current approach to policy development.

Proactive Compliance Management signals a very significant shift in the approach to fraud, waste, error and debt management in the public sector. The current after-the-fact management of these issues – where money is allowed to “leak out of the system” through a “pay now, check later” approach and where agencies then spend additional funds and resources to recover some of this money – is shifting to a proactive, preventative approach. Commercial organisations, like banks, cannot afford the high costs of the “double whammy” of the after-the-fact approach, and there is a growing recognition that governments can’t either.

The drive from governments to return budgets to surplus has heightened this consciousness. The move to on-line systems has meant that analytics can be applied to transactions and inputs to verify their validity in real time. This means that fraud and error can be trapped at the point of interaction and be effectively prevented – very much the same as financial institutions already do at the point of transactions.

An important point in this context is that analytical approaches to these areas are vastly superior to business rules-based approaches. The latter create rules based on the repetition over time of an identified activity; however, it is increasingly apparent that perpetrators are just as quick to change their behaviour to circumvent detection – and in the meantime considerable financial loss or debt has occurred. By contrast, analytical approaches focus on the detection of anomalies which suggest error or malfeasance. Even where the activity has not been seen before and is not repetitive, it is flagged as outside normal or expected behaviour. In the sphere of social services, a proactive approach translates into better social outcomes. For instance, if the vast majority of welfare recipients are entitled because of an economic need, then the recovery of over-payments exacerbates their economic hardship. Better to have prevented an overpayment in the first place.
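A minimal sketch of the difference, on invented payment data: a fixed business rule only catches what it was written for, while a simple statistical anomaly flag picks up anything far outside normal behaviour, whatever its absolute value.

```python
# Sketch: a fixed business rule vs a simple statistical anomaly flag
# on invented fortnightly payment data.
import numpy as np

rng = np.random.default_rng(3)
payments = rng.normal(480, 60, 1000)               # typical fortnightly payments
payments = np.append(payments, [495, 2100, 1450])  # a few unusual claims slipped in

# Business rule: flag anything above a fixed threshold a perpetrator can learn and stay under.
rule_flags = payments > 2000

# Anomaly approach: flag anything far outside normal behaviour.
z_scores = (payments - payments.mean()) / payments.std()
anomaly_flags = np.abs(z_scores) > 3

print(f"Flagged by fixed rule: {rule_flags.sum()}")
print(f"Flagged as anomalies:  {anomaly_flags.sum()}")
```

Here the fixed rule misses the $1,450 claim entirely, while the anomaly flag catches both unusual payments without a hand-set threshold.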

To recover current debt carried by governments – which in the Federal Government alone is in the tens of billions of dollars – analytical approaches allow for the differentiation of treatment strategies to optimise recovery, allowing agencies to recover the most money in the shortest time from those individuals and entities in the best position to pay off their debts.

Citizen-centric Service Delivery represents a fundamental shift away from payment-centric approaches towards service delivery that is built around a holistic understanding of individual citizens. This differentiated approach looks at all the relevant factors in an individual’s life and the identification of the next best action to improve their personal, social and economic wellbeing. This is a laudable vision towards which a number of agencies are already progressing.

It is reasonable to expect that over time this will result in a more engaged and participative relationship between governments and citizens, which will in turn produce better outcomes for both. It recognises that the various functions supporting citizens in need exist in silos – within larger agencies, between agencies and across layers of government – and that this does not serve citizens well. Experience to date demonstrates that with a more holistic and cross-functional approach, citizens feel more cared for and are therefore more inclined to take greater ownership of outcomes and begin collaborating with agencies in what becomes a shared endeavour.

A benefit to governments is that they are able to reduce their service delivery costs while ensuring the citizen receives the most beneficial services. From a relevant technology perspective, it is interesting to observe that many of the analytical tools developed for the commercial sector work equally well in government services. For example, a customer intelligence capability designed to use all the known data about an individual consumer to determine the next best product to sell to them is reversed to determine the next best action to assist a citizen. The challenge for major agencies is that their systems are built around eligibility assessment and payments and have very little, if any, customer intelligence, case management or service planning capability. The need for a solid eligibility assessment and payment system remains, but these additional capabilities are required if agencies are to effectively transition to citizen-centric service delivery.

The Australian Analytics 2014 skills and salary survey

I recently attended an event where a speaker from LinkedIn presented. He mentioned an interesting trend – demand for analytical professionals has increased in the last five years. We all know analytics is the hottest field around. Better than mining, even – as of last year, the median analyst in Australia was earning over twice the median Australian salary. Although, some evidence suggests that salary growth may be cooling somewhat overseas.

Big data’s hot, then it’s not. Machine learning’s cool, then it’s old school. LASR, Spark, in-memory, and good old data mining; what’s a person to focus on?! Given the rate that everything’s changing out there, have you ever wondered how the skills you choose affect your salary?

Don’t guess, find out. The Institute of Analytics Professionals of Australia is running its annual skills and salary survey for the second time. As with last year, it covers:
• The industries that people work in
• The tools people have used and are using
• The challenges people are facing

New to this year, it also covers:
• The degree to which analytics is being centralised or federated in organisations
• Whether people are interested in formal or informal education on business analytics

With the recent launch of a Masters of Data Science at the University of South Australia, the Masters of Business Analytics at Deakin University, the Masters of Analytics at RMIT and the Masters of Analytics by Research at UTS, there’s clearly a lot of pent-up demand for learning.

Respondents have the option of receiving the report once it’s completed. And, given what was in last year’s report, it’s sure to be interesting reading.

Regardless of what you use or where your focus lies, as long as you’re an analyst and you’re working in Australia make sure to complete it before it closes. Tell your friends, tell your enemies, tell anyone and everyone you know who works in the field. Tweet it, blog it, talk about it over the water cooler - as the most comprehensive survey of its kind in Australia, it represents a unique opportunity to really understand what’s hot, what’s not, what’s trending.

Complete the survey

SAS is an active sponsor of IAPA and supports the broad development of skills across the industry.

Don't Let Bad Quality Data Kill Your Data Visualisation Project

Next generation business intelligence and visualisation tools, such as SAS Visual Analytics, are revolutionising insight discovery by offering a truly self-service platform powered by sophisticated visualisations and embedded analytics. It has never been easier to get hold of vast amounts of data, visualise that data, uncover valuable insights and make important business decisions, all in a single day’s work.

On the flip side, the speed and ease of getting access to data, and then uncovering and delivering insights via powerful charts and graphs, have also exacerbated the issue of data quality. It is all well and good when the data being used by analysts is clean and pristine. More often than not, though, when the data being visualised is of poor quality, the output and results can be telling and dramatic – but in a bad way.

Let me give you an example from a recent customer discussion to illustrate the point (I have, of course, synthesised the data here to protect the innocent!).

Our business analyst in ACME bank has been tasked with analysing customer deposits to identify geographically oriented patterns, as well as identifying the top 20 customers in terms of total deposit amount. These are simple but classic questions that are perfectly suited to a data visualisation tool such as SAS Visual Analytics.

We will start with a simple cross-tab visualisation to display the aggregated deposit amount across the different Australian states:

[Figure: cross-tab of aggregated deposit amount by state, showing non-standardised state values]

Oops, the problem around non-standardised state values means that this simple crosstab view is basically unusable. The fact that New South Wales (a state in Australia) is represented nine different ways in our STATE field presents a major problem whenever the state field is used for the purpose of aggregating a measure.

In addition, the fact that the source data contains only a full address field (FULL_ADDR) means we are also unable to build the next level of geographical aggregation using city, as the city is embedded in the FULL_ADDR free-form text field.

[Figure: source data showing the free-form FULL_ADDR field]

It would be ideal if FULL_ADDR were parsed out so that street number, street name and city were all individual, standardised fields that could be used as additional fields in a visualisation.

How about our top 20 customers list table?

[Figure: list table of customers sorted by deposit amount]

Whilst a list table sorted by deposit amount should easily give us what we need, a closer inspection reveals troubling signs that we have duplicated customers (with names and addresses typed slightly differently) in our customer table. This is a major problem that will prevent us from building a true top 20 customer list unless we can confidently match up all the duplicated customers and work out their true total deposits with the bank.

All in all, you probably don’t want to share these visualisations with key executives using the dataset you were given by IT. The scariest thing is that these are the data quality issues that are very obvious to the analyst. Without a thorough data profiling process, other surprises may just be around the corner.

One of two things typically happens from here. Organisations might find it too difficult and give up on the dataset, the report or the data visualisation tool altogether. The second option typically involves investing significant cost and effort in hiring an army of programmers and data analysts to code their way out of the data quality problems – something often done without a detailed understanding of the true cost involved in building a scalable and maintainable data quality process.

There is, however, a third and better way. In contrast to niche visualisation vendors, SAS has always believed in the importance of high quality data in analytics and data visualisation. SAS offers mature and integrated Data Quality solutions within its comprehensive Data Management portfolio that can automate data cleansing routines, minimise the costs involved in delivering quality data, and ultimately unleash the true power of visualised data.

Whilst incredibly powerful and flexible, our Data Quality solution is also extremely easy for business users to pick up, requiring minimal training and no detailed knowledge of data cleansing techniques. Without the need to code or program, powerful data cleansing routines can be built and deployed in minutes.

I built a simple data quality process using our solution to illustrate how easy it is to identify and resolve data quality issues described in this example.

Here is the basic data quality routine I built using the SAS Data Management Studio. The data cleansing routine essentially involves a series of data quality nodes that resolve each of the data quality issues we identified above via pre-built data quality rules and a simple drag and drop user interface.

[Figure: data quality job flow built in SAS Data Management Studio]

For example, here is the configuration for the “Address standardisation” data quality node. All I had to do was define which locale to use (English, Australia in this case), which input fields I want to standardise (STATE, DQ_City), which data quality definitions to use (City – City/State and City) and what the output fields should be called (DQ_State_Std and DQ_City_Std). The other nodes take a similar approach to automatically parse the full address field and match similar customers using their name and address, creating a new cluster ID field called DQ_CL_ID (we’ll get to this in a minute).

[Figure: configuration of the Address standardisation node]
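For readers who want a feel for what such a routine does under the hood, here is a rough Python equivalent of the same ideas – standardising state values, pulling the city out of the full address, and grouping near-duplicate customer names under a shared cluster ID. It is not the SAS solution itself; the data, mappings and similarity threshold are invented for illustration, and a production tool uses curated reference data and far more robust matching.

```python
# Rough Python sketch of the cleansing ideas above (not the SAS solution itself):
# standardise state values, pull the city out of a full address, and group
# near-duplicate customer names under a shared cluster id. Data is invented.
from difflib import SequenceMatcher

state_map = {"new south wales": "NSW", "n.s.w.": "NSW", "nsw": "NSW",
             "victoria": "VIC", "vic": "VIC", "queensland": "QLD", "qld": "QLD"}

customers = [
    {"name": "Alan Davies",    "full_addr": "12 George St, Sydney, New South Wales"},
    {"name": "Allan Davies",   "full_addr": "12 George Street, Sydney, NSW"},
    {"name": "Philip McBride", "full_addr": "88 Collins St, Melbourne, Victoria"},
]

def standardise(record):
    """Parse city out of the full address and map the state to a standard code."""
    parts = [p.strip() for p in record["full_addr"].split(",")]
    record["city"] = parts[-2] if len(parts) >= 2 else ""
    record["state"] = state_map.get(parts[-1].lower(), parts[-1].upper())
    return record

def assign_clusters(records, threshold=0.85):
    """Give near-identical names the same DQ_CL_ID."""
    clusters = []
    for rec in records:
        for cluster_id, representative in enumerate(clusters):
            if SequenceMatcher(None, rec["name"].lower(), representative.lower()).ratio() >= threshold:
                rec["DQ_CL_ID"] = cluster_id
                break
        else:
            rec["DQ_CL_ID"] = len(clusters)
            clusters.append(rec["name"])
    return records

for rec in assign_clusters([standardise(c) for c in customers]):
    print(rec)
```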

I then loaded the newly cleansed data into SAS Visual Analytics to try to tackle the questions I was tasked to answer in the first place.

The cross-tab now looks much better, and I now know (for sure) that the best performing state from a deposit amount point of view is New South Wales (now standardised as NSW), followed by Victoria and Queensland.

[Figure: cleansed cross-tab of deposit amount by standardised state]

As a bonus of having clean, high quality address data, I am now also able to easily visualise the geo-based measures on a map, down to the city level, since we now have access to the parsed-out, standardised city field! Interestingly, our customers are spread quite evenly across the state of NSW – something I wasn’t expecting.

[Figure: map of deposit amounts by city across NSW]

As for the top 20 customer list table, I can now use the newly created cluster field DQ_CL_ID to group similar customers together and add up their total deposits to work out who my top 20 customers really are. As it turns out, a number of our customers have multiple deposit accounts with us and go straight to the top of the list when their various accounts are combined.

[Figure: top 20 customers list table grouped by DQ_CL_ID]

I can now clearly see that Mr. Alan Davies is our number one customer with a combined deposit amount of $1,621,768, followed by Mr. Philip McBride – both of whom will get the special treatment they deserve whenever they are targeted for marketing campaigns.

All in all, I can now comfortably share my insights and visualisations with business stakeholders, knowing that any decisions made are based on sound, high quality data. And I was able to do all this with minimal support, all in a single day’s work!

Is your poor quality data holding you back in data visualisation projects? Interested in finding out more about SAS Data Quality solutions? Come join us at the Data Quality hands on workshop and discover how you can easily tame your data and unleash its true potential.
