It has become almost cliché to say that financial services firms are navigating one of the most challenging business climates in recent memory. The confluence of economic, political, cultural and public health factors has produced a volatile environment, placing a premium on being adaptable and nimble. This is clearly driving the need for step-change improvements in risk modeling at financial institutions – particularly when we consider how banks can benefit by transitioning their fast-evolving risk modeling life cycle processes to the cloud.

My consulting work with multiple banks reflects how vital it is for all firms, big and small, to modernize their risk modeling processes.

Gridlocked model development

A Tier 1 bank worked to develop models for its mortgage business, where the process was anything but straightforward. Models were developed in SAS and deployed on a grid, which seemed to be a sound approach. However, partitioning the data and writing code for parallel processing was left to individual developers, and their efficiency varied wildly.

The same grid supported multiple workloads, including some production processes. That meant developers either could not run their jobs at certain times or saw extensive slowdowns in performance. Documentation was manual, and it took a team of dozens several weeks to produce. To top it off, developers emailed approved models to a deployment team that recoded them in C++ before they could be used.

As you can imagine, a long process of testing followed. Rather than being a well-managed, agile process, model development required many months.

First steps on the model life cycle innovation journey

While this may sound like a relic from another age, my recent experience on dozens of other projects across the globe suggests otherwise. In case after case, firms rely on technologies and operating models from another era. These tend to entail manual operations, lack of repeatability and long cycle times.

Many companies have recently turned to the cloud to address these shortcomings. While this is a step in the right direction, it is only the first step of a journey. That’s because taking full advantage of model life cycle innovation on the new platform requires more than simply moving the code. It requires rethinking the ways in which data and models are provided and used.

Rethinking cloud: A modern architecture

Consider what the cloud signifies. At its core, the cloud is a set of servers and management protocols providing flexible performance and data storage. When properly configured, a cloud-based environment allows easy expansion and contraction (“elasticity”), robustness and failover.

A modern cloud architecture extends well beyond that foundation.

With containers and container orchestration tools, you can introduce new and updated elements without disruption. Microservices provide common functionality to support both technical functions (e.g., authentication, governance, access control) and analytical ones (e.g., model performance, data requests, model fit metrics). By taking an API-based approach to solution development, you can access these elements from virtually any technology. This approach avoids duplicating (and potentially corrupting) information as it moves between systems.
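
To make the API-based approach concrete, here is a minimal sketch in Python. The service URL, endpoint and response fields are hypothetical illustrations, not an actual SAS or Azure interface; the point is simply that every consumer reads the same metrics through a service rather than keeping its own copy.

```python
import requests

# Hypothetical model-performance microservice; in practice the platform's
# service registry or configuration would supply this URL.
METRICS_SERVICE = "https://risk-platform.example.com/api/v1/model-metrics"

def fetch_fit_metrics(model_id: str, version: str) -> dict:
    """Request fit metrics (e.g., AUC, KS) for one model version.

    Developers, validators and monitoring jobs all read the same record
    through the API instead of passing around private copies that can
    drift out of date.
    """
    response = requests.get(
        f"{METRICS_SERVICE}/{model_id}/versions/{version}",
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

metrics = fetch_fit_metrics("mortgage-pd-model", "1.4.0")
print(metrics.get("auc"), metrics.get("ks_statistic"))
```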

When combined, the components of a modern cloud architecture open the possibility of automating multiple activities across the model life cycle. Forward-thinking firms have already begun this journey.

Adopting the new model development paradigm

One large bank (with more than $500 billion in assets) is investing heavily to support this new paradigm – and the benefits are becoming clear.

Now, model developers can start with a development template that includes adjustable parameters (variables to include, training data, etc.), and they can review model assessment statistics to see how their models perform as development progresses.
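
As a rough illustration (my own sketch, not the bank's actual template), such a template can reduce a model build to a small, declarative set of parameters that the platform turns into a full training run:

```python
from dataclasses import dataclass, field

@dataclass
class ModelTemplate:
    """Hypothetical development template: the developer adjusts parameters,
    while the platform handles data access, parallelism and logging."""
    training_data: str                              # registered dataset name
    target: str                                     # outcome variable to predict
    features: list = field(default_factory=list)    # variables to include
    algorithm: str = "logistic_regression"
    assessment_metrics: tuple = ("auc", "ks_statistic", "gini")

# A mortgage probability-of-default model might be specified as:
pd_template = ModelTemplate(
    training_data="mortgage_originations_2018_2022",
    target="default_within_12m",
    features=["ltv", "dti", "fico", "loan_age", "region"],
)
```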

Having a standardized approach to model development not only reduces the need for specialized coding skills. It also enables automation of model documentation. These efficiency gains extend beyond individual developers as they share results across the modeler community and with others – like model validators and data scientists.
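
To show why documentation becomes automatable, here is a hedged sketch: once the model specification and its assessment statistics live in structured form, a first draft of the documentation is just a rendering step (the fields and wording below are illustrative):

```python
def render_model_doc(spec: dict, metrics: dict) -> str:
    """Assemble a first-draft model document from a structured spec and
    its fit statistics, the draft that once took a team weeks to write."""
    lines = [
        f"Model documentation: {spec['target']}",
        f"Training data: {spec['training_data']}",
        f"Algorithm: {spec['algorithm']}",
        "Candidate variables: " + ", ".join(spec["features"]),
        "",
        "Assessment statistics:",
    ]
    lines += [f"  - {name}: {value:.3f}" for name, value in metrics.items()]
    return "\n".join(lines)

spec = {
    "target": "default_within_12m",
    "training_data": "mortgage_originations_2018_2022",
    "algorithm": "logistic_regression",
    "features": ["ltv", "dti", "fico", "loan_age", "region"],
}
print(render_model_doc(spec, {"auc": 0.81, "ks_statistic": 0.42}))
```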

Models are stored in a common repository where they’re accessible as microservices. These microservices can be pulled into a container for multiple purposes, such as validation, backtesting or real-time decisioning. Data and results – whether model metadata, model results or execution status – are available via API. This eliminates the need to reproduce them in downstream systems.
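
One way to picture that flow, again with hypothetical endpoints rather than a documented platform API: a validation, backtesting or decisioning container pulls a registered model by name and scores against it over the same API layer.

```python
import requests

# Hypothetical model-repository service; a real deployment would supply
# the URL and credentials through the platform's configuration.
REPOSITORY = "https://risk-platform.example.com/api/v1/models"

def score(model_id: str, applicant: dict) -> float:
    """Send one record to a deployed model's scoring endpoint and return
    its predicted probability of default."""
    response = requests.post(
        f"{REPOSITORY}/{model_id}/score",
        json=applicant,
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["probability_of_default"]

# The same endpoint serves validation, backtesting and real-time decisioning,
# so results never need to be re-created in another system.
pd = score("mortgage-pd-model", {"ltv": 0.85, "dti": 0.38, "fico": 702})
print(pd)
```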

Looking to the future

Our strategic partnership with Microsoft will bring SAS risk modeling onto the Azure cloud, presenting the opportunity to accelerate innovation. I believe we will see tremendous progress toward the type of adaptable, nimble risk modeling life cycle ecosystems that firms must have to ensure continued success of their fast-moving businesses. For the risk modeling community, the future holds great promise as more technologies become available on the cloud – enabling even greater innovation across the entire model life cycle.

About Author

Anthony Mancuso

Director, Global Head of Risk Modeling and Decisioning Lifecycle

Anthony Mancuso is Head of Risk Modeling and Decisioning in the RQS division of SAS. In that capacity, he has ultimate responsibility for market strategy, product requirements, project implementation, and advisory consulting for all Risk solutions relating to model development, deployment, and governance. In addition, he works closely with internal and external colleagues to develop a product pipeline, support presales activities, and provide training and knowledge transfer. In his 15 years in financial services, he has worked globally in credit risk, stress testing, ALM, regulatory capital, and general risk analytics. Anthony has a Master's in Statistics and a Ph.D. in Economics (econometrics concentration), both from North Carolina State University.
