Do you ever have stress dreams? You know, where you’re taking an exam for which you haven’t studied, or you’re forced to wait tables in a sea of angry restaurant customers?
For many of us, the stress nightmare of the modern era involves trying to make sense of a never-ending stream of data. Being asked to make decisions based on a constant flow of ever-changing variables isn’t just stressful; it’s the downside of the information barrage that is the Internet of Things. And the stakes are higher than an exam you haven’t studied for. It’s like trying to navigate a boat at the edge of Niagara Falls.
That’s an apt analogy because more than 3,000 tons of water crash over those falls every second. The volume of streaming data is even bigger. With the Internet of Things pushing out data from more and more devices, streaming data is transmitted continuously at rates of millions of events per second. Storing this data and then analyzing it later is just not practical in this new world.
Big data, big challenge
When it comes to crunching all that data, and doing it fast, the challenge is immense. The brilliant people in SAS R&D saw what was coming and knew what was needed: software that processes and analyzes data in milliseconds or even microseconds – before it is stored. We call it SAS® Event Stream Processing. To date, this solution has helped many of our customers remain competitive and maximize the value of their goods and services.
One example I like to share is a consumer bank’s fight against fraud. The bank is using SAS Event Stream Processing to detect, monitor and address suspicious transactions and fraudulent behavior in 8 million online banking accounts. With 1 million payments per day, the bank sees an average of 35 transactions per second, with a peak of 230 transactions per second. Thanks to real-time alerting, these suspicious transactions are put into an investigation queue and addressed within 30 minutes. As I’ve noted before, fighting fraud is something that has to happen in the moment. Catch it after the fact, and you’re way too late.
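To make the idea of real-time alerting concrete, here is a minimal sketch in plain Python of the kind of rule a streaming engine might evaluate on every payment as it arrives. The thresholds, field names, and the `investigation_queue` list are all illustrative assumptions, not the bank's actual rules or SAS's implementation.

```python
from collections import deque
from datetime import datetime, timedelta

# Hypothetical rule-based check: flag a payment when its amount is
# unusually large or the account's recent transaction velocity is high.
# All thresholds are assumptions for illustration.
WINDOW = timedelta(minutes=10)
MAX_TXNS_PER_WINDOW = 5
MAX_AMOUNT = 10_000

history = {}              # account_id -> deque of recent timestamps
investigation_queue = []  # stand-in for the bank's alert queue

def process(txn):
    """Evaluate one transaction as it streams in; queue it if suspicious."""
    ts = txn["timestamp"]
    recent = history.setdefault(txn["account"], deque())
    # Drop timestamps that have aged out of the sliding window.
    while recent and ts - recent[0] > WINDOW:
        recent.popleft()
    recent.append(ts)
    if txn["amount"] > MAX_AMOUNT or len(recent) > MAX_TXNS_PER_WINDOW:
        investigation_queue.append(txn)

now = datetime(2024, 1, 1, 12, 0)
process({"account": "A1", "amount": 25_000, "timestamp": now})
print(len(investigation_queue))  # 1: the large payment is queued
```

The point of evaluating the rule in-stream, rather than after the data lands in a warehouse, is exactly the 30-minute response window described above: the alert exists before the transaction is ever stored.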
And in some cases, midstream processing is actually saving the world. None of us want to see another offshore drilling accident, right? That’s why a leading energy company is using SAS Event Stream Processing to continuously monitor the performance of its electrical pumps in offshore oil platforms. More than 2 million sensors on hundreds of platforms yield about 3 trillion rows of data per minute. Crunching all that data, the SAS solution was able to predict – and thus prevent – a pump failure, protecting the ocean and saving the customer millions of dollars in the process. I’m proud of that.
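Predicting a pump failure from streaming sensor data usually comes down to spotting a reading that drifts away from its recent baseline. Below is a toy sketch of one common approach, a rolling-window outlier check; the window size, threshold, and readings are assumptions for illustration, not the energy company's actual model.

```python
from collections import deque

# Illustrative sketch: flag a sensor reading as anomalous when it sits
# more than `k` standard deviations from a rolling baseline.
class RollingDetector:
    def __init__(self, window=50, k=3.0):
        self.buf = deque(maxlen=window)
        self.k = k

    def update(self, value):
        """Return True if `value` is an outlier vs. the rolling window."""
        anomalous = False
        if len(self.buf) >= 10:  # wait for a minimal baseline
            mean = sum(self.buf) / len(self.buf)
            var = sum((x - mean) ** 2 for x in self.buf) / len(self.buf)
            std = var ** 0.5
            anomalous = std > 0 and abs(value - mean) > self.k * std
        self.buf.append(value)
        return anomalous

det = RollingDetector()
readings = [10.0, 10.1, 9.9, 10.05, 9.95] * 4 + [25.0]  # spike at the end
flags = [det.update(r) for r in readings]
print(flags[-1])  # True: only the spike is flagged
```

Because the detector keeps only a small in-memory window, it can score each event as it arrives, which is what makes acting before the failure possible at these data rates.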
Cleaning while streaming
So how do we actually do it? Half the battle is cleaning the data as it’s streaming. Sensors can give false readings, and data can be inconsistently formatted. Normalizing data while it’s flowing is a key part of success because that’s when you’re able to detect patterns and define priorities. SAS Event Stream Processing determines what data is relevant, whether and when it needs immediate action, where it should be surfaced for situational monitoring, and where it should be stored for more in-depth analysis. And it all happens in the blink of an eye.
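The clean-normalize-route flow described above can be sketched with ordinary Python generators. This is only a conceptual illustration under assumed field names, units, and routing rules; a real event stream processing engine does the same work in memory at rates of millions of events per second.

```python
# Minimal sketch of clean-while-streaming: normalize formats, drop
# false sensor readings, then route events by priority. All field
# names and thresholds are assumptions for illustration.

def normalize(events):
    """Coerce inconsistent payloads into one schema (Fahrenheit -> Celsius)."""
    for e in events:
        temp_f = e.get("temp_f")
        if temp_f is not None:
            e = {**e, "temp_c": (temp_f - 32) * 5 / 9}
            e.pop("temp_f")
        yield e

def drop_false_readings(events, lo=-40.0, hi=125.0):
    """Discard physically impossible values (a stuck or failed sensor)."""
    for e in events:
        if lo <= e["temp_c"] <= hi:
            yield e

def route(events, alert_threshold=90.0):
    """Split the stream: urgent events to alerts, the rest to storage."""
    alerts, archive = [], []
    for e in events:
        (alerts if e["temp_c"] >= alert_threshold else archive).append(e)
    return alerts, archive

raw = [
    {"sensor": "p1", "temp_c": 21.0},
    {"sensor": "p2", "temp_f": 212.0},   # inconsistently formatted unit
    {"sensor": "p3", "temp_c": 9999.0},  # false reading
]
alerts, archive = route(drop_false_readings(normalize(iter(raw))))
print(len(alerts), len(archive))  # 1 1
```

Note that each stage sees one event at a time and nothing is stored first, which is the essence of the "before it is stored" approach described earlier.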
Pretty cool stuff.
The reason I’m so pumped about event stream processing is that the stream is only going to get bigger. Analyst firm Gartner Inc. predicts that the Internet of Things will grow to 26 billion units by 2020 – an almost 30-fold increase from 2009. Not sure what that really means? Consider it this way: We’re already drinking from a fire hose. Soon we’ll be trying to sip from Niagara Falls.
I, for one, am glad we’ve got help.
More on event stream processing
For more information about how SAS can turn streaming data into actionable insight, check out the following resources: