Banker, turned technologist, turns banker again a decade later!

I have decided to re-enter the banking profession after ten years of great personal and professional growth and opportunity here at SAS.

New role and focus

My new focus area will be credit model validation, which sits in the more general realm of model risk management.  Six years ago, my co-author Dr. Mingyuan (Sunny) Zhang and I devoted Chapter 8 of our first book to the topic of credit model validation.  Over the years, model validation remained an area of focus for our collaborative research, leading to a patent on a new credit model validation method.  The patent was issued to SAS nine days ago.

Model risk is explored in my recent post entitled "What's a model?"  I have also been contributing over the past two months to a discussion on GRC and decision-making with posts on the OCEG blog site, and I invite you to visit there regularly to hear views from a variety of practitioners and thought leaders in the GRC space on topics of current interest.

Great timing, great people, great fun!

I was fortunate to join SAS five years prior to the financial crisis, and I am leaving now five years post-crisis.  During the past 10 years I have tried to help bankers from the sidelines in the areas of risk and compliance.  It turned out to be an unprecedented period in the history of the financial markets -- one where improvements in lending systems were most needed and sought.  It was thrilling to work on different solution approaches to important business problems, and I have had wonderful collaborations with so many capable and generous people in the field that there is no space here to give proper thanks.

Whatever measure of success SAS and I have enjoyed is due in large part to the opportunity afforded to me by SAS and to those many friends and colleagues who provided me with support, encouragement and, on so many occasions, a significant contribution.

My thanks to SAS!

My decision to leave SAS was a difficult one.  I love working for SAS and have great respect and genuine affection for my teammates here.  The amazing campus facility in Cary and all of the wonderful privileges that SAS affords its employees are really great.  However, it is the people at SAS, here at the Cary main campus and all around the globe that are its greatest blessing!   I will miss them the most.

My thanks to you!

Thank you for your attention and comments on this blog series, which began with post #1 on Friday, February 1, 2008.  It has been my privilege to have you accompany me on my journey, which spanned fair lending, community development, HMDA, new lending systems, credit scoring, governance, risk and compliance (GRC), boards and directors, high performance analytics, and decision-making!

Parting words!

I wish you every success as you further your professional pursuits!   And remember to aim high!

Our aspirations are our possibilities.

                                                                      Robert Browning


No "one-size-fits-all" decision process -- no worries -- NACD, OCEG & SAS can help!

An OCEG blog post that went live earlier today describes a concept that Chuck Re Corr and I refer to as decision triage.  The post connects the concept to OCEG's GRC Capability Model (OCEG, the Open Compliance and Ethics Group, is a non-profit think tank founded by Scott Mitchell in December 2002 with global membership exceeding 40,000).

In the absence of a crystal ball, the best that decision makers can hope for is a process that will lead them in the right direction and guide them through the necessary steps to a good conclusion.

Naturally, a one-size-fits-all process for landing on what you should decide to do does not exist! That is because decisions can occur at all levels within an organization and may vary in terms of direct impact, urgency, scope, long-term implications, and so on.


The Re Corr et al. Decision Process is designed so that deciding what you are deciding (the first part) lands you on a sensible approach to take before proceeding to making the decision (the second part).  That process emphasizes the more practical aspects of decision making, and it can leverage the well-vetted OCEG GRC Capability Model for greater decision comprehensiveness, efficiency, and quality.

Effectively integrating the content and process afforded by NACD (National Association of Corporate Directors) and OCEG into a cohesive and workable decision-making process is a straightforward exercise for those with sufficient familiarity with both source documents (for the GRC Capability Model, that would be the OCEG Red Book Version 2.1 publication).

If you're interested in formalizing or enhancing your enterprise decision making, I suggest you check out my latest post on the OCEG website and consider using it as a base to build upon with additional details versus starting from square one!


You can also enhance decision making efficiency and effectiveness through the use of technology.  In fact, SAS offers an enterprise decision management solution that can help you make better decisions on everything from profitability to customer satisfaction.  It is able to do so by helping you to:

1) Establish a formal decision process for your organization,
2) Enforce internal rules and policies supporting that process,
3) Facilitate greater collaboration,
4) Enable better compliance with industry regulations, and
5) Accelerate the decision life cycle.

I suggest you contact SAS today in order to better understand and explore in greater depth possible options for enhancing your decision making.  Even small gains in this area can pay huge dividends.

Why you should care

Just imagine what 2% more good decisions and 3% fewer bad decisions would translate to for your organization.  Would that be a big number?  I'll bet so!
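To make that "big number" concrete, here is a back-of-the-envelope calculation.  Every figure below (decision volume, average gain, average cost) is an assumption chosen purely for illustration; plug in your own organization's numbers.

```python
# Back-of-the-envelope value of marginally better decision making.
# All figures below are assumptions for illustration only.
decisions_per_year = 10_000    # assumed count of significant decisions org-wide
avg_gain_per_good  = 50_000    # assumed average value of a good decision ($)
avg_cost_per_bad   = 80_000    # assumed average cost of a bad decision ($)

extra_good = 0.02 * decisions_per_year * avg_gain_per_good   # 2% more good
fewer_bad  = 0.03 * decisions_per_year * avg_cost_per_bad    # 3% fewer bad
annual_impact = extra_good + fewer_bad

print(f"Estimated annual impact: ${annual_impact:,.0f}")
```

Even with modest assumptions, small percentage shifts compound into eight-figure annual swings.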




What's a model?

Is a calculation a model? Is a spreadsheet a model? Is the computer-based implementation of a mathematical solution to a problem a model?

Model Identification

These are questions that rest heavily on the minds of bankers these days. "Why?" you ask.


The answer is found in federal regulatory guidance on model risk management.  If you are a banker, depending on your definition of what constitutes a model, you may or may not need to do some extra work.

Let me explain, but first some definitions are required.  My source document is SR 11-7 guidance from the Federal Reserve Board, issued April 4, 2011. (The OCC issued similar guidance for national banks, federal savings associations, and thrifts).

Regulatory Defn: Model -- "a quantitative method, system, or approach that applies statistical, economic, financial or mathematical theories, techniques and assumptions to process input data into quantitative estimates." This definition encompasses "quantitative approaches whose inputs are partially or wholly qualitative or based on expert judgment, provided that the output is quantitative in nature."

That is a pretty open-ended definition, which is not surprising since oftentimes regulatory guidance leaves room for interpretation that varies due to differing facts and circumstances.


Models are useful things to have around and bankers have come to rely on them to a great extent for certain applications, some of which expose the bank to significant risks. Predictive models fall into this category. Examples include loan approval using credit scoring, hedging models using swaps and options to manage the balance sheet while protecting liquidity, determining capital adequacy, etc. Regulators have come up with a definition for this risk exposure.

Regulatory Defn: Model Risk -- "the potential for adverse consequences from decisions based on incorrect or misused model outputs and reports."

Banks are tasked by the regulators to manage model risk "both from individual models and in the aggregate." That means they need to decide what falls into the model category, and which models have the greatest potential adverse impact.  Relative to model risk, bankers need to be able to:

  • Recognize it
  • Quantify it
  • Know when a model goes wrong
  • Know what to do when a model goes wrong
  • Know when additional capital should be allocated to cover it

On the last point, banks have regulatory capital models and they perform stress testing to demonstrate the degree to which their capital levels are sufficient under a variety of economic and market conditions. In March of last year, the Fed published its examination methodology and results relating to a comprehensive capital analysis and review (CCAR) program for the nineteen largest and most complex bank holding companies in the US. Stress testing for CCAR is an area where technology can help bankers to better manage their process through workflow automation, capital planning framework deployment, iterative scenario results aggregation, visualization, exploration and reporting.
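As a toy illustration of scenario-based stress aggregation of the kind described above: shock base default probabilities under each scenario, then roll expected losses up across the portfolio.  The portfolio segments, base PDs, and shock multipliers below are invented numbers, not CCAR parameters.

```python
# Toy stress test: scale base probabilities of default (PD) per scenario
# and aggregate expected losses across portfolio segments. Inputs assumed.
portfolio = {                       # segment -> (exposure $, base PD)
    "prime":    (1_000_000, 0.02),
    "subprime": (  200_000, 0.10),
}
scenarios = {"baseline": 1.0, "adverse": 2.0, "severely adverse": 3.5}  # PD multipliers

expected_loss = {
    name: sum(exposure * min(pd * mult, 1.0) for exposure, pd in portfolio.values())
    for name, mult in scenarios.items()
}
for name, loss in expected_loss.items():
    print(f"{name:>17}: ${loss:,.0f}")
```

Real capital-planning frameworks layer in loss-given-default, correlations, and multi-period paths, but the aggregate-per-scenario shape is the same.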


Models need to be scrutinized and challenged by staff who are not involved in their development or use -- i.e. experts who do not bear any negative consequences for finding fault with a model or the manner in which it is being used. There is a significant cost to performing that exercise, which is referred to as model validation.

Regulatory Defn: Model Validation -- "the set of processes or activities intended to verify that models are performing as expected, in line with their design objectives and business uses."

Fundamentally, model validators must determine whether or not a given model is fit for the purpose intended.  Model validation is not a purely statistical exercise. That is because almost all input data used in business modeling is biased due to policy rules, inconsistencies in business practices, differences across markets and geographies, data collection and sampling rules pertaining to what gets included versus excluded, how values are translated and standardized, inconsistencies in data definitions, variation in interpretation of the data, and so on.  Model validators must understand the business environment in which the model will operate and the business objectives it was designed to support.  They must also gauge the uncertainty due to unobservable or unreliable inputs.

In addition to its inputs, the validity of a model hinges on its processing and outputs. The processing depends upon the choice of algorithm (i.e., solution method), calibration or tuning parameters, a set of assumptions, a set of limitations, agreed-upon objectives, etc. The output consists of estimates and error bounds, business reports, and sufficient information to allow for outcome measurement, monitoring, and assessment adequate to gauge robustness, stability, and accuracy.
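Two statistics commonly used in outcome monitoring can make this concrete: the Brier score for the accuracy of probability estimates, and the population stability index (PSI, a widely used industry metric, not one named in the guidance) for score stability. A minimal sketch using NumPy on simulated scores:

```python
import numpy as np

def brier_score(outcomes, predicted_probs):
    """Mean squared error between predicted probabilities and 0/1 outcomes."""
    y = np.asarray(outcomes, dtype=float)
    p = np.asarray(predicted_probs, dtype=float)
    return float(np.mean((p - y) ** 2))

def psi(expected, actual, bins=10):
    """Population Stability Index of 'actual' scores vs. an 'expected' baseline."""
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf              # catch out-of-range scores
    e = np.histogram(expected, edges)[0] / len(expected)
    a = np.histogram(actual, edges)[0] / len(actual)
    e, a = np.clip(e, 1e-6, None), np.clip(a, 1e-6, None)  # avoid log(0)
    return float(np.sum((a - e) * np.log(a / e)))

rng = np.random.default_rng(0)
dev_scores = rng.normal(600, 50, 5000)   # development-sample scores (simulated)
stable     = rng.normal(600, 50, 5000)   # recent scores, same population
shifted    = rng.normal(570, 50, 5000)   # recent scores after population drift
```

Here the drifted population produces a PSI well above the conventional 0.1 "investigate" threshold, while the stable one stays below it; thresholds and interpretation remain a judgment call for the validator.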

Finally, let's not forget about usage. Models that are perfectly developed and implemented can be applied to the wrong population of customers, the wrong type of financial instrument, the wrong set of transactions, etc. Model validators must pay careful attention to all of these areas, again keeping in mind the business context as they perform their assessments.


Banks must develop and maintain effective model governance.  Doing so entails creating a model risk management framework made up of a supportive corporate culture and values, a clear vision articulated by executive management, risk appetite, policies, procedures, a testing regimen, a validation process, well-orchestrated lines of defense against failure to detect problems, clear definition of roles, responsibilities, and resource needs, and documentation.  An inventory of models should also be maintained, and sufficient resources allocated, to ensure that models are understood, that the risk exposures they represent are quantified for present and future operation, and that all models, their input data, and key underlying assumptions are continuously verified and properly managed and maintained.

This is a tall order, and it rightfully involves the board of directors and executive management who must establish and direct an enterprise-wide program that addresses model risk management (MRM).  How can directors decide on the proper allocation of resources to MRM?  Much of the answer is coming from the regulators these days, but practicalities and experience will ultimately dictate what is truly necessary and what constitutes a waste of time.  I suspect that regulatory expectations will continue to rise until such time that banks can demonstrate that the bar needs to be lowered.

Bankers need to have a robust model risk framework in place that promotes consistent model risk management standards across the firm that:

  • Is efficient
  • Reports on the entire program (aggregate as well as detailed performance measures from the bottom-up)
  • Establishes appropriate limits on model risk
  • Can perform stress testing that encompasses extreme use cases
  • Facilitates risk mitigation and measurement of model risk before and after mitigation
  • Measures residual model risk directly based on model performance and traced to sources of risk
  • Avoids cherry-picking and overly optimistic projections


Overall model complexity is on the increase and banks are taking greater model risk due to increasing reliance and expanded use of models to:

  • Value instruments and positions
  • Quantify exposures
  • Measure and manage all forms of risk
  • Refine and further automate credit underwriting models
  • Determine capital levels and reserve adequacy
  • Perform profitability and performance analysis

The trend of increasing complexity will likely continue fueled by greater computer processing power, more sophisticated and powerful business solution software, the pace of change in business, and the ever-present pressure for better and faster decisions.

Model Management & High Performance Computing

In response to these demands, technology can play an important supporting role to hasten the collection of proof points for the value of MRM, which transcends regulatory mandated testing to tangible business benefits stemming from model and process improvement.

An example that comes immediately to mind is where, due to time pressure to produce models quickly, developers may not go the extra mile in performing variable screening, sub-setting, and clustering to determine the choice that covers all relevant cases while making the best business sense.  Often, the extra time spent revisiting this area pays significant dividends.  To be sure, creating a predictive model entails far more than throwing a few hundred variables into a stepwise selection algorithm.  This is because issues such as quasi-complete separation, non-linearity, and redundancy (collinearity) crop up, which can complicate model interpretation, affect convergence of the estimation algorithm, and ultimately lead to incorrect decisions regarding variable selection.  These issues can be addressed using a variety of techniques, such as collapsing the problem based on chi-square reduction in association testing (Greenacre's method), the use of logit plots and Spearman and Hoeffding correlation coefficients to screen model inputs, and ranking of alternative models based on the Bayesian Information Criterion (BIC) and a variety of other statistics (AIC, adjusted R2, area under the ROC curve, and the Brier score), just to name a few.
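As a sketch of the Spearman screening step mentioned above: rank-correlate each candidate input with the target and keep only those showing a non-trivial monotone association.  The data here is simulated, and the 0.1 cutoff is an illustrative assumption, not a standard.

```python
import numpy as np

def midranks(x):
    """Ranks 1..n, with ties assigned their average (mid) rank."""
    x = np.asarray(x)
    order = np.argsort(x)
    r = np.empty(len(x))
    r[order] = np.arange(1, len(x) + 1, dtype=float)
    for v in np.unique(x):
        mask = x == v
        r[mask] = r[mask].mean()
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the midranks."""
    rx, ry = midranks(x), midranks(y)
    rx -= rx.mean(); ry -= ry.mean()
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

# Simulated screening: one input drives default, one is irrelevant noise.
rng = np.random.default_rng(42)
n = 1000
debt_ratio = rng.uniform(0.1, 0.9, n)
shoe_size  = rng.normal(9, 1.5, n)                       # irrelevant by design
default    = (debt_ratio + rng.normal(0, 0.2, n) > 0.8).astype(float)

candidates = {"debt_ratio": debt_ratio, "shoe_size": shoe_size}
kept = [name for name, x in candidates.items()
        if abs(spearman(x, default)) > 0.1]              # assumed cutoff
```

In practice, screening is paired with the logit plots, Hoeffding's D (which also catches non-monotone dependence), and information-criterion model ranking noted above.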

There are many other remedies that can be explored and may prove worthwhile, given some extra time.  Additional areas where improvement can be realized include strategies for splitting data for model training and validation, and model tuning and fitting.  If you are interested in learning more, SAS Education offers a course entitled Predictive Modeling Using Logistic Regression that covers the bases.  I invite you to enroll.

I suspect adoption of technological advances to spur MRM efforts will occur on an application-by-application and bank-by-bank basis.  Those who choose to invest in technology will find there is significant help available.

Notably, all SAS solutions and tools are self-documenting white boxes; i.e., they provide transparency into the modeling process, the options elected, the assumptions made, and the results obtained -- all in an intuitive and thoroughly documented computing environment.

Time is running short and regulatory expectations are high.  You may have noticed that in the US, Basel III and Dodd-Frank rollouts have picked up steam, and Tuesday's Senate approval of Richard Cordray to direct the Consumer Financial Protection Bureau (CFPB) will certainly spur that agency's bank oversight program.

What is your institution's strategy on the MRM front? More specifically:

  1. Have you wrestled down the definition of a model? (Does Internal Audit agree?!)
  2. Do you have a complete inventory of your models to show your regulator?
  3. Can you quantify the exposure that each model represents?
  4. How confident are you that you have sufficient controls in place to manage the risks?
  5. Have you established, and has your board approved, an MRM framework?
  6. In the aggregate, how much of your institution's capital could be wiped out due to bad/misused models?

Responsible development and use of models requires knowing the risks they pose in addition to the rewards they offer.  Modeling success rests on the quality of the data used to build and to run them, the assumptions they rely on, the reliability of the process used to deploy them, the appropriateness of the way in which they are used, and the controls used to monitor their performance.  MRM encompasses a lot of moving parts!

[My thanks to Naeem Siddiqi for his thought leadership emphasizing the critical need for scorecard developers, users, and validators to constantly keep in mind the business considerations that come into play all along the model life-cycle and model value chain. Failure to consider the full business context in model development and usage is a huge contributor to model risk. Business models are solutions to business problems that often try to predict human or market behavior as a critical component. They are not math or stat solutions to laboratory experiments that can be nearly perfectly controlled and measured! If you deal with business models and have not done so already, I encourage you to pick up a copy of his book entitled Credit Risk Scorecards -- Developing and Implementing Intelligent Credit Scoring. It provides a step-by-step guide that can, and should, be generalized for any modeling exercise. Naeem also teaches a two-day course on the same subject through SAS Education that is definitely worth the investment to learn how to better manage model development and usage in order to achieve the business objectives models are designed to deliver.]


Making principled decisions rooted in GRC

In a recent post on the OCEG blog page (entitled "Decision-making and GRC capabilities are inextricable") I note the leverage to be gained by having a systematic decision process in place that is rooted in a GRC framework (e.g. OCEG GRC Capability Model).  I encourage you to check out my latest post on OCEG to better visualize how the two work hand in hand.  There is no need to re-invent wheels on either front.

And, speaking of re-inventing wheels, I can say from experience as a senior risk manager reporting directly to a bank board that it makes sense to leverage a business solution rather than attempting to build one in-house.  I just happen to know of a great one -- SAS Enterprise GRC!  I showcased the SAS solution in a 9-part blog series on SteadyBank towards the end of last year.


That series is now approaching 25,000 views (over 9,000 page views alone on the final one)!  Based on these results, perhaps I should consider creating another series -- possibly a continuation of the SteadyBank saga and its cast of characters -- what do you think?!  Seriously, if you are in the market for a GRC solution, please reach out to your local SAS account executive to learn more about our solution -- its breadth and depth in the realm of GRC are truly remarkable.

I am pretty engaged these days preparing to deliver some great risk and compliance information through course instruction -- the near-term focus is on credit scorecard development and implementation (CSDI).  I encourage those interested in the subject to explore the course further -- one of many great offerings from SAS Education.



Context -- problem-solving linchpin

I recently completed a video series on high-performance analytics (HPA) where I introduced a problem having a quadrillion decision variables, with each decision variable having eleven subscripts.  Now that is what I call big context!

Solving problems having big data, big analytics, big context

The problem in question actually calls for some sophisticated analytics, but it really does not require big data because it is a super-sparse application (in the parlance of linear optimization, less than one percent of the coefficient matrix has non-zero entries).  So you could think of problems as falling into one of eight classes, based on combinations of type of analytics (routine, sophisticated), data (normal, big), and context (typical, big).  You need especially powerful and efficient supporting technology to tackle problems that are big on any or all of the three, and this is especially the case when sophisticated analytics are in play.  Making decisions oftentimes entails a lot of nested conditions, and numerous multi-way effects may be required to effectively capture and deal with the business context around a complex problem.
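The eight-class taxonomy and the sparsity claim are both easy to make concrete.  The matrix dimensions and fill count below are invented for illustration, far smaller than the quadrillion-variable problem described above:

```python
from itertools import product
import numpy as np

# 2 x 2 x 2 = 8 problem classes
classes = list(product(("routine", "sophisticated"),   # analytics
                       ("normal", "big"),              # data
                       ("typical", "big")))            # context

# Super-sparse coefficient matrix: store only non-zero cell coordinates,
# never the dense m-by-n array itself.
rng = np.random.default_rng(7)
m, n, nnz = 2_000, 2_000, 8_000
cells = set(zip(rng.integers(0, m, nnz).tolist(),
                rng.integers(0, n, nnz).tolist()))
density = len(cells) / (m * n)

print(f"{len(classes)} classes; matrix density = {density:.4%}")
```

Sparse storage is exactly why such problems need powerful but not "big data" infrastructure: the non-zero coordinates fit comfortably in memory even when the dense matrix would not.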

I find the intersection of decision-making and context sufficiency especially intriguing, and so I am in the process of connecting the body of work from my recent co-authored e-book on decision-making at the top of the house with my research on context-preserved credit scoring, which takes place in the trenches of operating units!  I will share some of the more important findings that will undoubtedly occur during this journey, so please stay tuned for further developments in this area!

New role

On June 1, I joined the SAS Education faculty as an instructor in the statistical training and technical services department.  I will be teaching credit scorecard development and implementation (2-day course) later this year.  As I earn additional certifications, I may also teach courses on data mining, predictive analytics and statistics.  I am currently doing deep dives into the software tools.  I plan to leverage SAS core technology in my research efforts to better address some particularly high-impact business problems that currently challenge the business community.

I am especially excited about my new role.  A core passion of mine has always been problem-solving.  I am happiest and feel most alive when I am struggling with a very difficult business problem.  I also enjoy sharing insights and helping others improve their problem-solving skills.  It has been said that the best way to learn a subject is to teach it to someone else -- there is a great deal of preparation in the teaching profession and I am now learning this first-hand!   I am very much a student myself -- the more I learn, the more I realize how little I know!  I am privileged to be in the company of such an outstanding group of professional instructors and I hope that some portion of their knowledge will find its way into my head through informal hallway conversations, in addition to classroom instruction!  (Yes, I have moved offices from Building C to Building H on the SAS Campus).

In my spare time, I am working on some ideas for new course offerings, e.g. context-preserved scoring, fair lending statistical analysis and self-testing methods, and model governance and validation (both statistical and logical).  Also in consideration, and at a purely conceptual stage, are C-level continuing education offerings on methodologies related to successful strategy development and execution, including decision-making, risk evaluation and policy formulation, and general problem-solving.  SAS has some wonderful technology and solutions (e.g. SAS Enterprise Miner, SAS Real-Time Decision Manager, etc.), which have direct application in these areas.

Governance, Risk, and Compliance (GRC)

GRC remains an area of focus (SAS possesses a fantastic Enterprise GRC solution) and I will continue my involvement with OCEG on their Policy Management Council.  Another area of interest I am pursuing with the Open Compliance and Ethics Group (OCEG) deals with principled decision-making.  More on that in future posts.

I look forward to continuing with the Principled Achiever blog, and my change in roles will have little or no impact.  Looking ahead, I may blog on good processes for making better decisions, corporate governance best practices, character-defining moments, topics in risk and compliance, model life-cycle management, predictive modeling, the challenges of credit access in countries having low data availability, or taking calculated versus uncalculated risks—knowing what you don’t know!

I invite you to follow me on my journeys, which will take me in many different directions, and which will hopefully shed light on some challenges that you are facing in your day-to-day experiences.


Borrower versus iBorrower, more context please!

On Sunday I will travel to Ontario to present at the 20th Annual Conference of the Credit Scoring and Risk Strategy Association.  The conference agenda is action-packed!  My talk is on context preserved scoring, a term I recently coined to describe an enhanced credit scoring approach that is described in detail in the book Sunny Zhang and I co-authored on credit risk assessment.   

As you probably are aware, credit scoring buckets loan applicants into risk-homogeneous score ranges, yet people occupying the same score bucket (or even having the exact same score) are not necessarily similarly situated.  With context preserved scoring, people who receive the same score are also similarly situated.  As a result, context preserved scoring offers some significant advantages.
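A toy example of the point: two applicants can land in the same fixed-width score bucket while their broader contexts differ materially.  The bucket width and applicant attributes below are invented for illustration.

```python
def bucket(score, width=20):
    """Assign a score to the lower edge of its fixed-width bucket."""
    return score // width * width

applicants = [
    {"name": "A", "score": 655, "savings_rate": 0.15, "insured": True},
    {"name": "B", "score": 648, "savings_rate": 0.00, "insured": False},
]

same_bucket = bucket(applicants[0]["score"]) == bucket(applicants[1]["score"])
print("Same bucket:", same_bucket)   # both fall in the 640-659 bucket
```

Traditional scoring treats A and B as interchangeable risks; the premise of context preserved scoring is that the differing savings behavior and insurance coverage are signal, not noise.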

Borrower > iBorrower, so mind the gap!

Context preserved scoring (CPS for short) leverages the best that data, scientific methods, and judgment have to offer in granting credit.  CPS applies to loan underwriting the same basic process and agreed-upon principles used at the highest levels in corporations and in boardrooms.  Trust in that lending process is crucial, and it goes both ways.  Lenders seek assurance that they can trust a borrower's pledge to re-pay the loan.  Borrowers want the lender to understand them and the context around their loan request, and they seek greater transparency into the credit qualification process and outcome.  CPS seeks a richer borrower context that includes alternative data and payment vehicles, insurance coverage, borrower traits, circumstances, behaviors, and even borrower values, wants, and needs.

Borrower - iBorrower = Lending Information Gap

We must not lose sight of the people behind the numbers.  There is definitely a gap between what gets captured and what's relevant.  Knowing more about the borrower is an area of opportunity for lenders today.

Those lenders who can make inroads and incorporate more borrower information of the type described on the slide will find it is a win-win situation in many respects, not the least of which is that they could not only better predict, but actually influence, borrower behavior in a positive way for everyone!

For example, if having insurance were a factor in loan underwriting, borrowers might seek affordable coverage to better protect themselves and their family against financial strain if unforeseen adverse circumstances should materialize.  Or, if average monthly savings rate over the past year was a factor, borrowers might seek to save more regularly for a rainy day.
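A minimal sketch of how such factors might enter a points-based scorecard; every point weight below is hypothetical, chosen only to illustrate the mechanism, and is not drawn from any real model.

```python
# Hypothetical point weights -- illustrative only, not a real scorecard.
BASE_POINTS = 600
INSURANCE_POINTS = 25          # assumed reward for having coverage
SAVINGS_POINTS_PER_PCT = 3     # assumed points per percentage point of savings rate

def context_score(insured, avg_monthly_savings_rate):
    """Toy score combining a base level with alternative-data attributes."""
    score = BASE_POINTS
    if insured:
        score += INSURANCE_POINTS
    score += round(SAVINGS_POINTS_PER_PCT * avg_monthly_savings_rate * 100)
    return score

print(context_score(insured=True,  avg_monthly_savings_rate=0.05))  # saver, insured
print(context_score(insured=False, avg_monthly_savings_rate=0.00))  # neither
```

The incentive effect described above falls out directly: because the factors are visible and controllable, a borrower can raise their own score by insuring or by saving more regularly.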

My presentation explores the nature of a CPS system, its development, distinguishing features, operation, validation, maintenance and uses.  Comparisons with traditional scoring systems are made and discussed.  CPS system advantages are noted relative to numerous points along the lending value chain.  Preserving and enriching context has far-reaching consequences, such as greater inclusion and credit access, elimination of system overrides, more effective marketing of credit products, improved credit account management strategies, and more effective loan portfolio management (including securitization, security rating and investor reporting) and more pro-active and accurate asset quality monitoring.

I hope to see you there!


Buy HPA Now!

SAS Global Forum is taking place this year in my home town -- San Francisco.  I feel especially proud and moved by the words of Dr. Jim Goodnight, SAS CEO, who spoke at the event last night, telling customer attendees: "Our mission is to keep innovating to support the great work that you do."  A great example of the innovation that Dr. Goodnight is referring to is the SAS High-Performance Risk solution.  I have had the good fortune to support that solution from a product marketing standpoint, and it is amazing what it can do!  Among other things, it empowers risk professionals to quickly obtain precise answers to very tough questions concerning their current risk exposures so that they can make well-informed decisions.  Those decisions include what they must buy, sell, hold, or insure to properly lock-in gains or attractive funding and limit market and credit risk, while protecting liquidity of their positions -- even in highly volatile and stressed markets.  This is an area where I spent five years of my career as a balance sheet management analyst and fixed income portfolio strategist.  Boy, if only I could have had this solution back in the day! (Note: I explain why in video #4 in this post.)

Senior Vice President and Chief Marketing Officer Jim Davis also addressed the gathering, noting that: “This is a significant moment in SAS’ history as we introduce high-performance analytics in-memory capability. It is tremendous what we can do today with this technology.”  Yes, these are exciting times at SAS and very promising times for our customers, who are now able to harness the power of HPA to solve their most vexing and complex business problems.  Those companies that adopt this high-powered approach will be rewarded not only immediately, with the means to re-think the manner in which they operate, but also in that they will be preparing for what the future holds as exascale computing becomes commercially available in the years to come.

In two past posts, I sought to explain what HPA means to modelers and what HPA means to CEOs.  I would like to invite you to check out videos 3-5 in a series, continuing from video number two on decision making and analytics, which appeared in my April 17th post entitled: Decision-making for the boardroom and beyond.  

In video #3, which follows, I talk about HPA and highlight the evolution of computing storage and power over the past three decades, which has led us to the brink of exascale computing.



In case you were still pondering the sort of HPA-enabled problem solving I alluded to in my earlier post on "What's in it for modelers?", the next video (#4) is for you!  It covers that ground and shares a real-world risk and financial problem that is as relevant, and even more complicated, now than it was back in the mid-80's when I first encountered it.

 HPA & Problem-Solving


I'll conclude this post with what I see as perhaps the greatest promise of HPA as a means to spur imagination and help executives not only develop better strategies for achieving their business goals, but also imagine goals which would prove even more fruitful to pursue in the first place!  I hope you find the final video (#5) in the series to be thought-provoking!

HPA & Imagination


Director decision making -- a sensible approach

Decision making in the boardroom is an interesting topic indeed!  In her blog post today entitled Guidance for Director Decisions, Alex Lajoux, NACD Chief Knowledge Officer, addresses the philosophical starting point for a new NACD publication entitled Director Decision Making -- A Sensible Approach by putting a twist on the famous line from Shakespeare's The Tragedy of Hamlet, Prince of Denmark:

"To decide, or not to decide, that is the question!"

Oftentimes decision makers feel that their situation, or company, is unique.  However, closer inspection and some reflection will almost certainly expose common threads.  That is where the realization occurs that a well-thought-out and carefully crafted process can come into play to advantage.  Decision-making lies at the heart of a company's operating model, and the quality of decisions often spells the difference between a successful venture and a losing proposition.

And, of course, everyone loves a winner and we congratulate leaders all of the time when they experience a successful outcome.  But consider for a moment:

Question: If we have a successful outcome, does that mean we made good decisions?
Answer: Maybe, but maybe not.

Question: Conversely, if we have a bad outcome, does that mean we made bad decisions?
Answer: Not necessarily.

Execution is a critical component to be sure, but the seeds of failure in execution can be sown in flawed planning and decision making that fails to uncover gaps between the capabilities and capacity to perform and the goals themselves.

I have some additional questions for you!

  • Question: Has anyone seen a national statistic on what bad decisions cost annually?
  • Question: What would it mean if you could make 2% more good decisions and 3% fewer bad decisions at your company?
  • Question: Would that be a big number?

Why do directors need an approach to decision making?

In the limited time they spend meeting face to face at board and committee meetings, directors may make many important decisions that affect shareholders and other stakeholders, such as employees, customers, regulators, and local communities.  Despite the board’s vital role as a decision-making body, directors rarely employ tools to support the decision-making process. At the same time, we know that:

1. Collective wisdom, even if qualified, does not always ensure good decisions.
2. Experience and common sense will not always be sufficient to see the board through.
3. There is a tendency to define the quality of a decision by its outcome.

Some additional points about this publication

This publication is structured as a series of progressive questions for decision makers’ consideration. There is a worksheet listing the questions located in an appendix. Not all the questions will be applicable to all situations, but together they form a systematic approach to board decision making.

Directors can work on the questions independently and compare notes at a meeting, or they can answer them collectively with the board chair or lead director facilitating. Directors can also use these questions in reviewing management’s decision-making process.

The process described in this book is intended to chart a practical path for reasoning through decisions in a deliberative manner. We offer it as a sensible approach that can help directors make the best possible decisions for their organizations in a changing and challenging world.

The role of technology

Naturally, technology can play an enabling role in situations where decisions are highly complex, involving many rules, assumptions, constraints, multiple objectives, and a considerable variety and quantity of relevant information that should be considered.  In my prior post on "Quality information for the board" I pointed out that asking the right questions leads to better decisions.  That's where the process comes in, as described in the NACD publication on decision making.  But what about the answer to the question I posed: "How can boards more effectively make decisions, such that all relevant factors and attractive alternatives are identified and taken into account, and so that risk, value and time preferences are weighted, and all information, models and probability assignments are validated and well-documented?"  Well, on that front, SAS has solutions in the area of SAS® Decision Management that afford:

  • Integrated decisions. By embedding rich information and analytics services directly within operational applications, SAS brings the value of information and analytics to the point of decision. In addition, SAS provides a closed loop continuum that cycles analytical results back into the information and decision life cycle.
  • Rich analytics. While there are many solutions available that help organizations manage their processes, none include the depth and breadth of SAS Analytics. Decisions based on analytics applied at the moment they are needed can lead to superior results that enable competitive advantage.
  • Ability to manage business processes, workflow and collaboration. With SAS, you can streamline interactions that relate to the decision process or analytics life cycle through the use of business rules, including the ability to make investigative processes more efficient to reduce costs or prevent fraud and waste.
  • Shortened response times to real-time events. By monitoring and analyzing real-time streaming events, SAS is able to identify anomalies, threats and opportunities faster than ever. Real-time analytical capabilities automatically provide the most appropriate responses to mitigate risks or exploit opportunities. The ability to correct for events relies on analytics being embedded into organizational processes.

I welcome any/all comments or questions on this topic.  Don't be shy!


Decision-making for the boardroom and beyond

Last Friday I co-presented a session on decision-making in the boardroom at the Research Triangle Chapter, NACD Directors College. There was a great line-up of speakers on a number of topics that corporate directors must deal with on a day-in and day-out basis. The NACD Directors College Agenda consisted of the following sessions:

I was able to attend the full day, which proved to be a great learning experience. I am a firm believer in continuing education. In fact, I picked up several points from the first presentation relating to duty of care, business judgment rule, and duty of loyalty that I referred to as I delivered my portion of the second session!

Decision-making and the value of process

Good decision-making is a key driver of sustainable success for any company.  Companies possess vision, seek to instill a core set of values, and set performance goals.

The question is "How do they achieve those goals?" -- The answer is "One decision at a time!"

How decisions are made is of great importance, and that is where process comes into play.  A process can help where memory fails, where attention wanes or is interrupted, or where thoroughness falls short of sufficiency.  A process for decision-making can ensure that the necessary and sufficient questions are addressed before a decision is rendered.  It can be customized, so that only certain questions apply in a particular situation, but it forces the decision makers to make a conscious choice about what is, or is not, required.

No process can guarantee a good decision, but a good process can reduce the likelihood of a bad decision.
Charles Re Corr

So, you may be wondering "What would a good process for decision-making look like?"  Well, actually I have been working on a project for three and a half years to devise a sensible and practical approach for making decisions. 

Decision-making book due out in May

Many years ago, Charles (Chuck) Re Corr conceived of writing a book that would set forth a high-level characterization of a simple yet comprehensive process, including a checklist of basic decision-making steps that any decision-maker or decision-making body could easily comprehend and follow.  Born of experience, Chuck’s thinking was that a simple, well-thought-out process could help people navigate complex situations.  At his invitation, I joined him in his effort to come up with a decision-making methodology that was simple and high-level, yet thorough and effective.  The result was a sequence of twenty questions that decision makers need to consider.  The first ten questions deal with defining the problem, and the remaining ten deal with making the decision.  In our session at the NACD Directors College, Frank Gozzo covered problem definition, and I focused on the ten questions around making the decision.

The entire process is described in an NACD e-book publication that is scheduled for May 2013.  The book also delineates those decisions which are the board's alone to make.  It will be available online from the NACD Bookstore, and possibly also on Amazon (not confirmed).  Be on the lookout for another post in May with some additional thoughts about the importance of good decision-making in the boardroom (and a link to where you can download the e-book).

Connecting the dots between decision-making and credit scoring

It turns out that the same basic principles that apply to good decision-making in the boardroom also apply beyond that realm, to the C-suite and right on down to the fundamental business operating units.  In future posts, I will tie these results back to my earlier work, including the pioneering approach Dr. Zhang and I have described in our books on lending, and advances in the area of credit scoring and loan decisioning.  I am actually presenting on this subject in Ontario, Canada in late May.  My talk is entitled: Context Preserved Scoring -- Enhancing Loan Decision Quality and Transparency.  Expect to see a mid-May post with more details on that subject!

Decision-making and analytics

It seems appropriate to close with part two of my five-part Buy HPA Now! video series that deals specifically with the topic of decision-making and analytics.  The video explores the give-and-take between data-driven decision-making, analytical decision-making, and judgmental decision-making.  Sometimes there are decisions in business and in life that have extreme consequences.  In those situations, it may come down to answering the fundamental question: 

Which do you trust, science or your gut?!

The video explores alternative answers to that question and makes the case for balanced decision making that leverages the best that science and experience-based reasoning and knowledge can provide. 


Please invest a little time to view this and let me know what you think, based on your own analysis or straight from your gut -- your choice!


Imagine HPA in an exascale world!

I will be sharing thoughts about high-performance analytics (HPA) next week in Southfield, Michigan at the Great Lakes BI & Big Data Summit.  In particular, I will point to advances in computing technology that offer opportunities to re-think how people, processes, data, and systems can combine to create better outcomes for an organization. High-performance analytics enables clearer vision, better and more timely decisions, and improved execution.

My talk provides a glimpse into the future where exascale computing (billion-way concurrency) will enable corporate executives to more fully understand and optimize all aspects of their operations and form decisions based upon a unified, interactive, and near real-time system that captures the essence of the business reality.  This has implications for business leaders today, who must either prepare to leverage this major development or risk losing market position to those who invest sufficient time and resources to understand, acquire and begin using HPA now, versus down the road.

What my HPA talk will cover...


HPA -- it's about more than speed

I will illustrate with real world examples the numerous benefits of HPA that extend beyond pure solution speed.   The ability to conceptualize and communicate complex and large-scale business problems is a key challenge that I will also address and illustrate with a very concrete example.

The upshot of HPA in an era of exascale computing will be a new generation of decision-makers who, liberated from their waiting-for-answers mode, will experience far greater operational agility as they exercise their ability to re-frame problems on-the-fly, and gauge the joint sensitivity of results to a multitude of assumptions and big data.
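To make the joint-sensitivity idea concrete, here is a toy sketch in Python (not SAS code; the outcome model, parameter names, and all figures are invented for illustration).  It sweeps a joint grid of planning assumptions and reports the best- and worst-case outcomes; on an HPA-class system, a far larger grid could be fanned out across thousands of cores and re-run interactively as assumptions are re-framed.

```python
from itertools import product

def projected_margin(growth_rate, default_rate):
    """Toy outcome model: revenue growth less credit losses (all figures invented)."""
    revenue = 100.0 * (1 + growth_rate)   # hypothetical base revenue of 100
    losses = 40.0 * default_rate          # hypothetical exposure of 40
    return revenue - losses

# Joint grid of assumptions; an HPA system would evaluate these in parallel
growth_rates = [0.01, 0.03, 0.05]
default_rates = [0.02, 0.05, 0.10]
results = {(g, d): projected_margin(g, d)
           for g, d in product(growth_rates, default_rates)}

# Gauge joint sensitivity: which assumption pairs drive the extremes?
best = max(results, key=results.get)
worst = min(results, key=results.get)
print("best case:", best, round(results[best], 1))    # prints: best case: (0.05, 0.02) 104.2
print("worst case:", worst, round(results[worst], 1))  # prints: worst case: (0.01, 0.1) 97.0
```

The pattern is embarrassingly parallel, which is exactly why in-memory, many-core HPA changes the experience: the grid can grow by orders of magnitude without pushing the decision-maker back into waiting-for-answers mode.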

HPA offers more business options and opportunities

The implications for how executives will operate are significant: their emphasis will shift from figuring out how to achieve goals to conceptualizing an even better, yet achievable, set of goals to pursue.

I envision that this type of high-performance-enabled analytic system might function for a business leader much like a walking stick that, after a period of time, the brain fully accepts as part of the anatomy!

This has the potential to greatly accelerate learning cycles, speed decisions and execution, and facilitate the sort of imagination and innovation that results in greater vision. This, in turn, opens up a larger set of options and opportunities for organizations as they strive to achieve their goals.

I will share perspectives and several examples drawn from SAS solution successes and personal industry experience.

I look forward to seeing those of you who can make it to this event, so please come over and introduce yourself!

