ESP can determine if big data is eventful


Many recent posts on this blog have discussed various aspects of event stream processing (ESP), where data is continuously analyzed while it's still in motion, within what are referred to as event streams. This differs from traditional data analytics, where data is not analyzed until after it has stopped moving and been stored.

Instead of running queries against stored data, ESP enables the continuous querying of streaming data for filtering, normalizing and aggregating data in real time. ESP captures the real-time value of data before it’s lost – not only in the time lag between creation and storage, but also before it’s lost in the time lag between analysis and action.
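The filtering, normalizing, and aggregating described above can be sketched as a continuous query over an in-flight stream. This is a minimal illustration, not any particular ESP product's API; the event shape and window size are assumptions for the example:

```python
from collections import deque

def continuous_query(events, window_size=3):
    """Filter, normalize, and aggregate events as they arrive,
    emitting results before anything is ever stored."""
    window = deque(maxlen=window_size)
    for event in events:
        if event.get("value") is None:        # filter: drop malformed events
            continue
        reading = float(event["value"])       # normalize: coerce to a common type
        window.append(reading)
        yield sum(window) / len(window)       # aggregate: rolling mean over the window

# Simulated stream; in production this would be a socket, message queue, etc.
stream = [{"value": 10}, {"value": None}, {"value": 20}, {"value": 30}]
print(list(continuous_query(stream)))  # rolling means: [10.0, 15.0, 20.0]
```

Because the query runs as each event arrives, results are available in the gap between creation and storage that the paragraph above describes.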

The goal of ESP is to detect meaningful patterns within event streams by continuously analyzing an ever-changing stream of data-driven events to determine if an individual event within the stream should trigger an immediate action. As such, ESP provides immediate situational awareness and empowers proactive responses to changing conditions, to improve business operations and facilitate better customer interactions.
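The "individual event triggers an immediate action" idea can be shown with a simple threshold rule. The sensor schema and threshold here are hypothetical, chosen only to make the sketch concrete:

```python
def detect_and_act(events, threshold=100, alert=print):
    """Scan an event stream and trigger an action the moment
    a single event matches a meaningful pattern."""
    for event in events:
        if event["temperature"] > threshold:  # pattern: out-of-range reading
            alert(f"ALERT: sensor {event['sensor']} at {event['temperature']}")

alerts = []
detect_and_act(
    [{"sensor": "a1", "temperature": 72},
     {"sensor": "a2", "temperature": 118}],
    alert=alerts.append,
)
# alerts now holds one message, for sensor a2
```

The action fires mid-stream, without waiting for the rest of the data, which is what gives ESP its situational-awareness value.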

Real-time data: Be wary of commitment

While the pre-storage processing capabilities of ESP empower the enterprise to gain as much insight as possible while data is still in motion, ESP can also help determine whether real-time data should be allowed to come to rest within some form of data storage. All data has an expiration date, but real-time data often has a very short half-life. Because data falls under the purview of data management and governance once it's stored, it's best not to make a long-term commitment to real-time data before you know if it's worth putting a ring on any of its digits (so to speak).

This rule of thumb is especially important when it comes to big data, which arrives in such massive volumes that it's often impractical to store it all. Moreover, a whole lot of big data is irrelevant for any analysis, action or even archiving. Consider social media data, for example. It definitely has a lot more noise than signal.

Event stream processing can determine if big data is eventful. That's because ESP standardizes big data streaming from various sources as it arrives (such as big data emanating from the Internet of Things). And it applies simple transformations and rules to determine whether this data is relevant enough to warrant further downstream processing and/or eventual storage. If not, the data can be quickly discarded. You, in turn, can avoid using additional processing bandwidth or incurring the costs of long-term storage, management and governance for big – but largely worthless – data.
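That triage step can be sketched as a set of simple relevance rules applied at the edge. The rules below (keep IoT telemetry, keep high-priority events) are illustrative assumptions, not a prescribed policy:

```python
def triage(events, rules):
    """Apply simple relevance rules as events arrive: events that pass
    any rule move downstream; everything else is discarded pre-storage."""
    kept, discarded = [], 0
    for event in events:
        if any(rule(event) for rule in rules):
            kept.append(event)
        else:
            discarded += 1
    return kept, discarded

rules = [
    lambda e: e.get("source") == "iot",    # keep IoT telemetry
    lambda e: e.get("priority", 0) >= 5,   # keep high-priority events
]
kept, dropped = triage(
    [{"source": "iot", "priority": 1},
     {"source": "social", "priority": 0},
     {"source": "social", "priority": 7}],
    rules,
)
# kept has 2 events; 1 noisy social-media event never reaches storage
```

Everything the rules reject is dropped before it incurs storage, management, or governance costs, which is precisely the saving the paragraph above describes.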

Read more in: Channeling Streaming Data for Competitive Advantage

About Author

Jim Harris

Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ)

Jim Harris is a recognized data quality thought leader with 25 years of enterprise data management industry experience. Jim is an independent consultant, speaker, and freelance writer. Jim is the Blogger-in-Chief at Obsessive-Compulsive Data Quality, an independent blog offering a vendor-neutral perspective on data quality and its related disciplines, including data governance, master data management, and business intelligence.

