Whenever I travel, my trail shoes go with me and I try to find the time to run some unfamiliar path. Why? There are the usual health and stress-relief reasons, but the main driver is the excitement of what I might discover. You really don’t know what is just around the bend.
The new world of big data, advanced computing architectures, and new and sophisticated software environments – collectively known as big data analytics – offers the same opportunity to discover and explore the unknown. And it offers the same level of excitement for those who want to make better decisions across an array of functional areas.
In my last post I wrote about using this new approach to radically improve how you engage with and deliver value to your customers. But it’s not just about finding ways to improve business processes. Big data analytics can be used to prevent catastrophic losses too.
Every private sector company and public sector agency is worried about cybersecurity. In 2014, the Center for Strategic and International Studies, a Washington, DC think tank, estimated the likely annual cost of cybercrime globally at $445 billion. Regardless of industry or size, security is at or near the top of the list of executive concerns.
The last thing an executive team wants to experience is a hack that causes real and reputational damage – and puts them on the front page.
How do these hacks occur? Computer networks are constantly bombarded with access attempts that originate both inside network boundaries and outside them, from sources around the world. Traditional countermeasures include perimeter firewall defenses, intrusion detection sensors, and router security. Some companies augment these solutions by correlating event data across these systems, but more sophisticated attacks are always underway.
One challenge is the variability in the nature of attacks. “Low-and-slow” attacks exploit subtle system and data relationships, with muted patterns of behavior spread over long periods of time, making them hard to detect. Another challenge is that the disconnected nature of network protection systems makes it difficult to view the entire environment holistically. Yet another is the high variability in the data sources involved and the many analytical techniques needed to interpret that data – at extreme scale and speed – which places a premium on architecture design and quality. And, finally, things are always changing as perpetrators change tactics.
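To make the “low-and-slow” problem concrete, here is a toy sketch (my own illustration, with made-up thresholds and data, not anything from a real product): a drip of failed logins that never trips a short, per-minute alert window can still stand out when the same events are counted over a month-long horizon.

```python
from collections import defaultdict

SHORT_WINDOW = 60          # seconds; a typical per-minute alert window
LONG_WINDOW = 30 * 86400   # seconds; a month-long horizon
SHORT_LIMIT = 5            # failed logins per minute that trigger an alert
LONG_LIMIT = 100           # failed logins per month that trigger an alert

def alerts(failed_logins):
    """failed_logins: list of (timestamp_seconds, source) tuples.
    Returns the sources caught by the short window and by the long window."""
    short_hits, long_hits = set(), set()
    by_source = defaultdict(list)
    for ts, src in failed_logins:
        by_source[src].append(ts)
    for src, times in by_source.items():
        times.sort()
        for i, t in enumerate(times):
            # count events inside each trailing window ending at time t
            in_short = sum(1 for u in times[: i + 1] if t - u < SHORT_WINDOW)
            in_long = sum(1 for u in times[: i + 1] if t - u < LONG_WINDOW)
            if in_short > SHORT_LIMIT:
                short_hits.add(src)
            if in_long > LONG_LIMIT:
                long_hits.add(src)
    return short_hits, long_hits

# One failed login every 6 hours for 30 days: 120 events, never more than
# one per minute -- invisible to the short window, caught by the long one.
drip = [(i * 6 * 3600, "attacker") for i in range(120)]
short_hits, long_hits = alerts(drip)
```

The point of the sketch is the asymmetry: the same event stream is innocuous at one time scale and damning at another, which is why retaining and analyzing events over long horizons matters.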
An approach that yielded great insights for one forward-thinking company used Hadoop and big data analytics. This company:
- Centralized 25 billion machine-to-machine communication events into Hadoop over a four-day period using event stream processing technology.
- Explored the data with advanced visualization software, identifying new connections and patterns of interaction between machines that could represent aberrant behavior.
- Developed new and combined analytical models to identify and score interactions that represent high-probability security events. This focuses attention on the most important events while reducing false positives.
- Provided a real-time, integrated view of what is happening so that threat mitigation actions can be taken sooner and with greater effect.
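The scoring step above can be sketched in miniature. The snippet below is a hypothetical illustration – not the company’s actual models – showing one simple way to score interactions: aggregate connection events per source host and flag hosts whose event volume is a statistical outlier relative to the rest of the fleet, so analysts see a short ranked list instead of a flood of false positives.

```python
from collections import Counter
from statistics import mean, stdev

def score_hosts(events):
    """events: list of (source_host, destination) tuples.
    Returns a z-score per source host: how far its event count
    deviates from the fleet average."""
    counts = Counter(src for src, _dst in events)
    mu = mean(counts.values())
    sigma = stdev(counts.values()) if len(counts) > 1 else 0.0
    return {
        host: (n - mu) / sigma if sigma else 0.0
        for host, n in counts.items()
    }

# Hypothetical sample: host "10.0.0.9" is far chattier than its peers.
events = [("10.0.0.1", "db"), ("10.0.0.2", "db"), ("10.0.0.3", "web")]
events += [("10.0.0.9", "db")] * 50
scores = score_hosts(events)
suspects = [host for host, s in scores.items() if s > 1.0]
```

Real deployments would score far richer features than raw volume, but the shape is the same: a model turns billions of events into a handful of high-probability candidates.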
This is another business case that we’ll explore more deeply this week in New York City. Join us to learn more about how these new technologies can help you protect your valuable assets better than ever before. And bring your shoes – exploring our big data world and finding new ways of doing things – fast – is going to take a lot of tread!