Why data quality becomes critical to CEP


Over the last few years Complex Event Processing (CEP) has gained ground as more and more business applications for it are realised.

The key to CEP is the real-time aspect of the processing combined with the ability to infer meaning from multiple information sources. The net result should be far better and more reactive decision making than most companies currently have available to them.

Let’s take a Telecoms example.

A railway engineering team is installing a new section of track signalling cable and inadvertently slices through a fibre optic transmission cable that is routed along the track and owned by a major telecoms provider.

The fibre optic cable carries voice traffic for households and businesses, plus a considerable amount of trunk capacity for mobile operators, both the provider’s own mobile network and those of other operators.

In a split second of unplanned shovel work, potentially thousands of people are impacted and events start to happen in real time.

First, the telecoms network management centre spots the break in the circuit and identifies the location as being between two junction points. Unfortunately, the break is serious and in a very difficult spot to work on; service will be impacted for several hours.

Fortunately the network has redundancy. Traffic can be re-routed, but via a slower circuit, so there will be a service impact.

People start to notice their internet connections dropping in performance and tweet about the low speeds they’re experiencing.

Some people find they can’t get a mobile connection or their signal keeps dropping out mid-conversation.

Businesses find that their dedicated high-speed connections between regional stores are getting slower and slower as traffic is shunted down failover circuits.

Call centres start to take more calls than normal. Customers are also complaining on Twitter and Facebook.

When these types of incidents occur it is critical to monitor and react to the flood of complex events taking place. Why? Because as consumers we demand it. We crave better service and of course if we don’t get it, we simply churn over to another provider.

Historically, we might look at complaints and account cancellations over a period of time to determine whether action needs to be taken. We might get even more sophisticated and use a data warehouse to consolidate events over time and merge them with other data to identify correlations and trends. Unfortunately, we don’t always have that luxury anymore; businesses increasingly need to “think” in real time and react accordingly.

In my Telecoms example above, CEP could help by scanning the social media airwaves for comments by disgruntled customers – but of course we also need to listen beyond our own customers, because our service failure impacts other communications providers too. Events streaming from our customer support systems, network management systems and partner systems can all be correlated by CEP to help us respond to what is happening on the ground in real time.
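To make that a little more concrete, here’s a minimal Python sketch of the kind of correlation rule a CEP engine might apply. It is purely illustrative: the event sources, the five-minute window and the complaint threshold are my own assumptions, not a real engine or any vendor’s API.

```python
from collections import deque
from dataclasses import dataclass
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)      # assumed correlation window
COMPLAINT_THRESHOLD = 20           # assumed complaint volume that signals real customer impact

@dataclass
class Event:
    source: str         # e.g. "network_mgmt", "social_media", "call_centre"
    kind: str           # e.g. "circuit_down", "complaint"
    region: str         # where the event was observed
    timestamp: datetime

class IncidentCorrelator:
    """Toy CEP-style rule: a circuit alarm plus a burst of complaints
    from the same region inside the window means a confirmed incident."""

    def __init__(self):
        self.recent = deque()   # sliding window of recent events, oldest first

    def on_event(self, event: Event):
        self.recent.append(event)

        # Evict anything that has fallen outside the correlation window.
        cutoff = event.timestamp - WINDOW
        while self.recent and self.recent[0].timestamp < cutoff:
            self.recent.popleft()

        alarms = [e for e in self.recent
                  if e.source == "network_mgmt" and e.kind == "circuit_down"]
        complaints = [e for e in self.recent
                      if e.kind == "complaint"
                      and any(a.region == e.region for a in alarms)]

        if alarms and len(complaints) >= COMPLAINT_THRESHOLD:
            self.raise_incident(alarms[0].region, len(complaints))

    def raise_incident(self, region: str, complaint_count: int):
        # In a real deployment this would open a ticket and trigger the
        # customer and partner alerts discussed below.
        print(f"Confirmed incident in {region}: {complaint_count} complaints within the window")
```

The value is in the combination: neither a single circuit alarm nor a handful of tweets means much on its own, but together, inside a tight time window, they confirm a customer-impacting incident.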

Customers need to know we’re listening, so we could start responding with tweets and alerts about the situation, not just blanket “group alerts” but messages targeted at specific customers and partners.

To make this work we need accurate contact details. We need to know which customers have a particular service that will be impacted. Silos need to be aligned so that we can find which circuits are connected to the ruptured fibre optic cable. Which customers are related to those circuits? Which billing accounts relate to those customer accounts? What’s the best way to contact the customer related to that account?
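As a rough sketch of what answering those questions looks like, the snippet below walks the chain from the damaged cable to the customers we can actually reach. The silo names and keys (circuits_on_cable, customers_on_circuit and so on) are invented for illustration; the real sources would be the network inventory, CRM and billing platforms. The point is that one stale key or missing contact record in any silo silently drops a customer from the notification list.

```python
# Hypothetical reference data from three different silos.
circuits_on_cable = {"FIBRE-0042": ["CCT-101", "CCT-102"]}                     # network inventory
customers_on_circuit = {"CCT-101": ["CUST-9001"], "CCT-102": ["CUST-9002"]}    # CRM
accounts_for_customer = {"CUST-9001": "ACC-5501"}                              # billing (CUST-9002 missing)
contact_for_account = {"ACC-5501": {"channel": "sms", "number": "+44 7700 900123"}}

def contacts_to_notify(cable_id: str):
    """Walk cable -> circuits -> customers -> accounts -> contact details.
    Any gap or stale key in any silo loses a customer from the list."""
    reachable, unreachable = [], []
    for circuit in circuits_on_cable.get(cable_id, []):
        for customer in customers_on_circuit.get(circuit, []):
            account = accounts_for_customer.get(customer)
            contact = contact_for_account.get(account) if account else None
            if contact:
                reachable.append((customer, contact))
            else:
                unreachable.append(customer)   # data quality gap: no usable contact
    return reachable, unreachable

reachable, unreachable = contacts_to_notify("FIBRE-0042")
print(f"Can notify {len(reachable)} customers; {len(unreachable)} lost to data gaps: {unreachable}")
```

Run against this toy data, CUST-9002 never gets a message – not because CEP failed, but because the billing silo has no account record for that customer.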

We can quickly see that information can only be turned into action if it is well governed and of the highest quality.

CEP is clearly capable of delivering tangible gains for the data-driven enterprise, but we can’t overlook the importance of high-quality traditional data to add meaning and context to the real-time streams that CEP processes.

Only when we provide the best of both worlds do we create real innovation and competitive advantage.


About Author

Dylan Jones

Founder, Data Quality Pro and Data Migration Pro

Dylan Jones is the founder of Data Quality Pro and Data Migration Pro, popular online communities that provide a range of practical resources and support to their respective professions. Dylan has an extensive information management background and is a prolific publisher of expert articles and tutorials on all manner of data-related initiatives.

