In my previous post, I talked about how the Internet of Things promises new ways to use sensor and machine data by creating a highly efficient world that demands constant analysis and evaluation of the state of events across everything that surrounds us.
I also explained why real-time decision making on streaming data requires a technology that can process data at very high speed: an event stream processing (ESP) engine, sometimes also referred to as complex event processing.
An event stream processing engine processes the data stream before the data is stored in the cloud or in any high-performance repository. By applying analytics as close to the device as possible, it can decipher streaming data, creating new knowledge for many industries and reducing the latency of the information being analyzed for decision making.
Because the data is continuously processed and analyzed in real time, in memory, it can be streamed to other analytic applications for deeper analysis: for example, pattern or behavior analysis over a rolling period of time, or correlations that draw on additional information.
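To make the idea of rolling-period analysis concrete, here is a minimal sketch in Python (not SAS ESP; the window size and the temperature readings are hypothetical) of keeping only the most recent readings in memory and computing a statistic over them as the stream arrives:

```python
from collections import deque

class RollingWindow:
    """Keep the last `size` readings in memory and compute simple statistics."""

    def __init__(self, size):
        self.readings = deque(maxlen=size)  # old readings fall off automatically

    def add(self, value):
        self.readings.append(value)

    def mean(self):
        return sum(self.readings) / len(self.readings) if self.readings else 0.0

# Feed a stream of (hypothetical) temperature readings through the window.
window = RollingWindow(size=3)
for reading in [20.0, 21.0, 40.0, 41.0]:
    window.add(reading)

print(round(window.mean(), 2))  # mean of the last 3 readings: 34.0
```

A real ESP engine works on the same principle at far higher volume: state is bounded (the window), so analysis can run continuously without ever landing the raw stream in storage first.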
These capabilities set event stream processing apart from other approaches by revealing what’s happening now, not just what happened in the past, so you can take action immediately. It can also be used to predict what will happen next and to determine where process optimizations can be applied for better outcomes.
Within such a vast and emerging domain as the Internet of Things, it is possible to identify a broad range of applications of streaming data processing, and no doubt many applications that are yet to be imagined. Nevertheless, we can already identify some familiar examples that will generate high return value:
- Detect events of interest and trigger appropriate action: Event stream processing can spot, in real time, complex patterns in a user’s behavior on their devices or in an equipment’s state that represent risks or opportunities, such as a system failure, potential fraud, or an opening to send a personalized marketing offer, and propagate the information to dedicated systems for immediate attention.
- Aggregate information for monitoring: Sensor data from equipment can be continuously aggregated and monitored for trends or correlations that indicate a problem, alerting an operator to take immediate action before equipment damage occurs.
- Sensor data cleansing and validation: Sensor data is notoriously "dirty." Time-stamps are often missing for a single sensor, usually because of network issues rather than sensor failure. As a result, sensor data can be incomplete or contain inconsistent values.
In conclusion, we have to keep in mind that event processing, though core, isn't enough. Streaming data processing is most powerful when analytics are also used to understand historical patterns, adding what’s been observed in the past to what's currently being observed. This allows us to add a third element, what can be anticipated, enabling real-time predictive and optimization operations.
Interested in learning more about event stream processing and the SAS solutions for the Internet of Things? Read the Understanding Data Streams in IoT white paper and visit the SAS and the Internet of Things page.