It’s time to say goodbye to your father’s data. Today's data sources are large, fast and constantly flowing. Between the Internet of Things and always-on customers, it's hard to keep up with the data being generated all day long. “The goal is to find a way to harness the massive volumes of connectivity,” says Randy Guard, Vice President of Product Management at SAS.
And, says SAS Advisory Business Solutions Manager Roger Shears, “We have to address not just the data, but the amount, the complexity and the pace at which it’s being generated.”
Guard and Shears spoke at a recent SAS event in Chicago about analyzing data in Hadoop. Attendees ranged from SAS customers who are already running production applications in Hadoop clusters to those with research and development projects currently underway. The event was the first stop in a multi-city tour.
“We’ve spent our energy creating a path for you, our customers, to work directly within the Hadoop clusters being created,” said Guard. That path has led to a new methodology, where data, discovery and deployment come together for better decision making.
Data serves as the foundation; discovery is the art and science component, where innovation and experimentation happen; and deployment puts what is known into a regulated environment that’s trusted, scalable and continuously evaluated. Scale and speed are critical to making business decisions, says Guard.
A cultural shift in the making
Shears went on to explain the “why” behind the analytical cultural shift taking place, attributing it to four key factors: disruptive technology, infinite data volume and variety, unrivaled processing power and the new problem-solving mindset.
It’s the mindset of the millennial generation that’s helping organizations uncover a new way of thinking and problem-solving. As a generation that has always had technology at its fingertips, millennials are unafraid to take on problems quickly, test them and create new solutions, Shears said. “We have to give them the space to innovate and discover what’s beneath the surface.”
That innovative way of thinking is critical in moving away from the previous Hadoop mentality of “build it and they will come.” Looking only through the lower-cost-of-ownership lens is a limited viewpoint. “We have to look at creating value, and that’s where analytics comes into play,” said Shears.
By putting Hadoop and analytics together, many organizations are modernizing legacy BI strategies, growing a culture of innovation, scaling data and analytics, analyzing all data sources and leveraging open source effectively.
Lowering risks with Hadoop
Is Hadoop shifting from a storage and computing solution to an environment ready for advanced analytics, multiple workloads, hybrid cloud environments and data security? Alan Saldich, Vice President of Marketing for Cloudera, says yes: Hadoop is answering tough business problems with advanced, interactive analytics.
For example, some customers are using Hadoop and SAS together to address data breaches and cyber security. Specifically, Saldich addressed the limitations of conventional SIEM (security information and event management) architecture. Because of its storage restrictions, organizations have only a limited window of data history to view, and by not analyzing data holistically, they create blind spots. “The risk is the inability to detect risk,” he said.
With Hadoop, that risk is lowered because all data is stored, whether generated internally or externally. The result: earlier detection of advanced threats, dark data unlocked through immediate access to relevant data, and more.
“Cyber security is one of the most important ways to deploy Hadoop, but it’s not the only way,” said Saldich, noting that Hadoop is at work in every corner of the business, from operations to compliance to marketing.
The next stop on the SAS and Hadoop roadshow is New York City on July 14, followed by Santa Clara on July 28. After that it’s the global leg of the tour, with stops in Toronto, São Paulo, Rome, London, Frankfurt and Amsterdam.