At our latest Power Series event in Manhattan, one of the teams took a futuristic approach to addressing data challenges. They described the need for a “compliance box” that front-ends the data management process, integrating, cleansing and ensuring data integrity across a number of different, typically siloed data sources. Unbeknownst to them, they had actually identified another use case for what we refer to as “store and score” – an approach we can leverage today, since SAS Data Integration (DI) Studio can integrate analytical model processing directly into a DI job flow.
For example, a rich analytical model could “score” each record as it comes into the data management pipeline and apply logic that determines how the record should be processed. In this use case, each incoming record could be analyzed for various compliance factors. And these compliance checks don’t have to be limited to simple business logic; they can leverage the entire breadth and depth of SAS analytics capabilities. For example, predictive analytics can score incoming records based on the likelihood that they will result in compliance or regulatory issues down the road.
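As a rough sketch of the idea, the snippet below scores each incoming record and routes it based on that score. The scoring rules and thresholds here are purely illustrative stand-ins for a real predictive model, and none of this is SAS DI Studio syntax – in practice the model would be trained and deployed through your analytics platform.

```python
# Hypothetical sketch of "store and score": each record is scored before it
# is loaded, and the score drives how the record is processed.

def compliance_score(record: dict) -> float:
    """Toy stand-in for a predictive model: returns an estimated
    likelihood (0..1) that the record will cause a compliance issue."""
    score = 0.0
    if not record.get("customer_id"):
        score += 0.5   # missing key identifier
    if record.get("amount", 0) > 10_000:
        score += 0.3   # large transactions get extra scrutiny
    if record.get("country") not in {"US", "CA"}:
        score += 0.2   # outside the approved jurisdictions
    return min(score, 1.0)

def route(record: dict) -> str:
    """Apply logic based on the score: load, review, or reject."""
    s = compliance_score(record)
    if s >= 0.7:
        return "reject"   # quarantine for compliance review
    if s >= 0.3:
        return "review"   # flag for a data steward
    return "load"         # clean record, load directly

records = [
    {"customer_id": "C1", "amount": 250, "country": "US"},
    {"customer_id": "", "amount": 50_000, "country": "KY"},
]
print([route(r) for r in records])  # → ['load', 'reject']
```

The key design point is that the record never reaches its target until the model has had its say – the score becomes just another attribute that the pipeline's routing logic can branch on.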
This pattern is becoming increasingly important as organizations confront data volumes that cannot be managed by the typical store-then-analyze model. Think about massive amounts of text data, or massive amounts of data that is mostly noise. If you can apply an analytics-based filter on the front end, you are essentially moving from a traditional data management approach to intelligent information management.
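To make the front-end filter concrete, here is a minimal sketch, assuming a hypothetical relevance-scoring heuristic in place of a real text-analytics model: only records that score above a threshold ever reach storage, so the noise is discarded before it becomes a data management burden.

```python
# Illustrative analytics-based filter on the front end of a pipeline.
# The keyword heuristic is a hypothetical stand-in for a trained text model.
from typing import Iterable, Iterator

KEYWORDS = {"payment", "account", "transfer", "dispute"}

def signal_score(text: str) -> float:
    """Toy relevance score: fraction of words that are domain keywords."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w in KEYWORDS for w in words) / len(words)

def intelligent_filter(stream: Iterable[str],
                       threshold: float = 0.2) -> Iterator[str]:
    """Yield only the records worth storing; drop the rest as noise."""
    for text in stream:
        if signal_score(text) >= threshold:
            yield text

incoming = [
    "customer dispute over payment on account",  # signal
    "lorem ipsum dolor sit amet",                # noise
]
print(list(intelligent_filter(incoming)))
```

Because the filter is a generator, it can sit in front of any downstream loader and process records as they stream in, which is exactly where a store-then-analyze model breaks down at scale.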
Instead of simply thinking about data preparation for analytics, you are applying analytics directly to the data preparation process. Just as applying analytics to a business process can be transformative, applying analytics to a technical process can deliver groundbreaking results!