There have been a lot of retrospectives marking the 10th anniversary of the USA PATRIOT Act, but what I find most compelling is how dissatisfied the industry is with the status quo. KPMG’s ‘Global Anti-Money Laundering Survey 2011’ confirmed a number of trends: the cost of compliance has increased 45 percent since 2007; clients are raising more questions about transaction monitoring; and there’s an increased emphasis on testing and monitoring of controls. With modest budgets, staff reductions and increased regulatory expectations, the old way of monitoring your customers just doesn’t work anymore.
Reading the survey in more detail, North American respondents’ satisfaction with transaction monitoring declined from 3.9 to 3.2 (on a scale of 1 to 5) since the 2007 survey. The staffing costs associated with high false positive rates, and the systems expenditures required to support transaction monitoring, are cited as the biggest contributors to rising costs. It’s fair to say that North American banks have the longest tenure with automated monitoring systems. Too much money has been spent for too little effect.
Tackling transaction monitoring
One conventional approach to the problem is “tuning of scenario parameters.” There are numerous articles in the public domain on how to adjust parameter settings based on upper and lower ranges of transaction amounts and volumes. But this approach usually fails to consider qualitative data and creates the risk of over-tuning, which may result in false negatives. There’s no doubt that some institutions tune their scenarios to an expected alert volume that can reasonably be managed by their current investigations staff. That’s the concern causing the regulatory community to ask more detailed questions about monitoring governance.
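To make the over-tuning risk concrete, here is a minimal sketch of threshold tuning for a hypothetical cash-structuring scenario. The transaction amounts and the “known suspicious” flags are invented for illustration; the point is simply that raising the dollar threshold cuts alert volume but can begin silently missing suspicious activity:

```python
# Hypothetical "above-the-line" threshold tuning for a cash-structuring
# scenario. Each tuple is (transaction amount, known-suspicious flag);
# both are invented for illustration.
transactions = [
    (9500, True), (9200, True), (8800, True),   # structured deposits
    (7000, False), (6500, False), (9100, False),
    (9900, False), (5000, False), (8700, False),
]

def alert_stats(threshold):
    """Return (alert count, missed-suspicious count) at a dollar threshold."""
    alerts = [amt for amt, _ in transactions if amt >= threshold]
    missed = [amt for amt, bad in transactions if bad and amt < threshold]
    return len(alerts), len(missed)

for threshold in (7000, 8500, 9300):
    alerts, missed = alert_stats(threshold)
    print(f"threshold={threshold}: {alerts} alerts, {missed} suspicious missed")
```

On this toy data, moving the threshold from 7000 to 8500 trims alert volume with no loss, but pushing it to 9300 halves the workload while missing two structured deposits outright, which is exactly the false-negative exposure that purely volume-driven tuning invites.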
Larger US banks have had the pleasure of a visit from the OCC’s RAD team (Research Analysis Division). RAD examiners are statistically savvy and have recently applied the Office of the Comptroller of the Currency’s ‘Supervisory Guidance on Model Risk Management’ (OCC 2011-12) to AML (anti-money laundering) transaction monitoring. (Whether scenarios, sentinels or business logic units are models is a debate for another article.) RAD examiners are interested in how institutions set their monitoring thresholds, how frequently they tune them, and how institutions can be assured that all requisite data has been sourced for the monitoring process. The majority of US compliance departments will struggle with this line of questioning (top 10 banks not included!).
On average, consultants document five to eight pages of information describing the objective, performance history and rationale for modifications for each scenario. Given that most medium to large banks run 50 to 100 rules, this process is overly burdensome for most compliance departments. If compliance departments have to enhance their AML transaction monitoring governance to meet the expectations on model development, model validation, ongoing monitoring and outcome analysis per the OCC’s supervisory guidance, costs will rise beyond current levels. But there is a better way …
Applying the analytic approach
As an analytics company, SAS has spent the past three years refining an approach we call “Compliance Analytics” that is designed to reduce the burden of tuning individual rules, significantly increase the quality of alert output and automate the triaging of alerts to cases. We collaborated with several large US banks to mine their case management histories to understand the data elements that have the greatest impact on whether an alert will lead to a productive investigation. This approach is similar to how we’ve solved problems in fraud detection, database marketing, credit scoring and portfolio risk management.
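One common way to turn case-history mining into an alert score is a logistic model over the data elements found to be predictive. The sketch below is a hypothetical illustration of that idea, not SAS’s actual methodology: the feature names and weights are invented, whereas in practice they would be estimated from the bank’s own case-management history.

```python
import math

# Invented feature weights standing in for coefficients that would be
# estimated from historical alert dispositions (productive vs. unproductive).
WEIGHTS = {
    "intercept": -2.0,
    "prior_sar_on_customer": 1.8,     # customer had an earlier SAR filing
    "high_risk_geography": 0.9,
    "rapid_movement_of_funds": 1.2,
}

def score_alert(features):
    """Modeled probability that an alert yields a productive investigation."""
    z = WEIGHTS["intercept"] + sum(
        WEIGHTS[name] for name, present in features.items() if present
    )
    return 1.0 / (1.0 + math.exp(-z))   # logistic link

p = score_alert({
    "prior_sar_on_customer": True,
    "high_risk_geography": False,
    "rapid_movement_of_funds": True,
})
print(f"probability of productive investigation: {p:.3f}")
```

An alert on a customer with a prior SAR and rapid movement of funds scores around 0.73 here, while one with no risk indicators scores near 0.12, which is the kind of separation that lets an FIU rank its queue by expected investigative value.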
The results have been impressive. At a top 10 US bank, we reduced work items by 46 percent so that investigators could focus more time on egregious alerts. The FIU (financial intelligence unit) supervisor said her team’s morale had improved because they knew that every other alert was going to yield a useful investigation. Even structuring alerts saw an improvement in hit rates from 42 percent to 72 percent! More recently, we completed a pilot on cash alerts for a bank that used an industry-leading activity monitoring system from another vendor. We were able to suppress approximately 20 percent of the alerts because of their marginal quality. More importantly, we triaged 30 percent of cash alerts directly for investigation, and another 20 percent were auto-SARs that went straight to a Quality Assurance analyst for final review. By using advanced models to decision alerts, we eliminated unnecessary manual processes, saving the bank the equivalent of seven full-time positions per year.
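The triage described above amounts to routing each alert by its model score into one of three queues. A minimal sketch, with entirely hypothetical score bands (the real cutoffs would be set from validated model performance, not picked by hand):

```python
def route_alert(score, suppress_below=0.15, auto_sar_above=0.90):
    """Assign an alert to a disposition queue based on its model score.

    The band boundaries here are illustrative assumptions, not recommended
    operating thresholds.
    """
    if score < suppress_below:
        return "suppress"            # marginal quality: no human review
    if score >= auto_sar_above:
        return "auto-SAR to QA"      # straight to Quality Assurance review
    return "investigator queue"      # standard manual investigation

for s in (0.05, 0.50, 0.95):
    print(s, "->", route_alert(s))
```

Decisioning alerts this way is what removes the human touchpoints: low-scoring alerts never reach an investigator, and the highest-scoring ones skip the investigation stage entirely and land with QA.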
With our latest release, SAS Anti-Money Laundering™ 5.1, the “Compliance Analytics” methodology is generally available to our customers. Anticipating increased regulatory scrutiny on governance, we packaged the data management routines that build the signatures required for our analytics. A visual user interface displays the process flow for preparing the data and alerts systems administrators in the event of a data quality issue. This makes it easier to show an auditor or examiner that you haven’t had any data leakage.
This approach will enable compliance departments to effectively manage a handful of models instead of 50 to 100 rule sets. Our system provides constant feedback on model performance, the ongoing monitoring called for in the OCC guidance. Perhaps advanced scoring approaches like this bring us closer to the intent of the risk-based approach?
What will the next ten years be like? Bob Dylan was right when he sang in ‘The Times They Are A-Changin’’ that ‘the order is rapidly fadin’.’ AML compliance departments need a methodology that stands the test of regulatory scrutiny while making their monitoring processes more effective and less costly, now! SAS’ Compliance Analytics approach is the answer.