Staying ahead of the big data curve


With all of the great information coming out of this week’s Gartner Business Intelligence Summit and the recent wave of vendors arriving on the big data scene, we decided to take a hard look at how we stack up.

First, here’s our take on big data: Data – big or not – will always be a problem if you don’t have predictive analytics to provide insights for forward-looking decisions, and you need those insights in time to act on them. SAS has been offering business analytics for more than 36 years. Even as data sources have changed to include social and mobile data, real-time stock analyses, call center data, computerized medical records and more, the real issue is still having the power to know what your data says so that you can make better decisions.

The need for speed

High-performance analytics and high-performance computing are terms used in the technology industry to describe software and hardware combinations that allow organizations to competitively manage big data. You hear a lot about speed – and speed is important – but its importance is sometimes overplayed and underexplained.

The value of speed lies in getting timely analytical insights and then integrating those insights into the decision-making process. Are you sampling data when looking at all of your data would be more valuable? Can you quickly embed those analytical insights into business processes to empower those at the point of decision (e.g., traders, credit managers, call center representatives, logistics managers)? Or does it take so long to get results that the situation has changed by the time the answer comes back?

With high-performance analytics, you can be more confident in those decisions because you can now see your results in minutes or seconds and fine-tune the models – changing variables and asking more what-if questions – to provide multiple options for decision makers.

The evolution to high-performance analytics

If I had to guess, one of the questions you are constantly asking yourself is “How can I stay ahead of my competition?” Leveraging data is a big part of competing effectively, but how do you decide when it’s time to jump on the high-performance bandwagon?

The deciding factors are really intuitive: Are you being out-maneuvered and out-gunned – or is it just a matter of time until you are? Would the ability to ask questions of today’s data put you out in front? At this point, you need to evaluate your organization’s analytical and infrastructure constraints.

  • Can your organization analyze all of its data?
  • Can you incorporate more variables and ‘what if’ scenarios and then rerun the model – now?
  • Can you tackle complex problems instead of just solving simple problems or a subset of a problem?
  • Can you use more sophisticated modeling techniques instead of settling for simple ones?
  • Can you perform more model iterations in a given day to improve accuracy?

No? These constraints may soon put you at a competitive disadvantage. Through the years, our customers have come to us and said that they have a LOT of data in data warehouses but are unable to unlock the value in that data. Moving the data in and out of the warehouses for analysis was time-consuming and risky. We agreed and found what we believed to be the key – analysis inside the database.
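The idea of pushing the analysis to the data can be sketched in a few lines. This is not SAS code – just a minimal, illustrative example using Python's built-in sqlite3 module, with a hypothetical `customers` table and made-up model coefficients – showing a scoring expression evaluated inside the database rather than extracting every row for external processing.

```python
# Illustrative sketch (not SAS code): scoring a simple model inside the
# database with SQL, instead of extracting rows for external processing.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER, balance REAL, late_payments INTEGER)"
)
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, 1200.0, 0), (2, 300.0, 4), (3, 5000.0, 1)],
)

# Hypothetical risk score: the coefficients would come from a trained model.
# The scoring expression runs where the data lives - no bulk data movement.
rows = conn.execute("""
    SELECT id, 0.001 * balance + 0.5 * late_payments AS risk_score
    FROM customers
    ORDER BY risk_score DESC
""").fetchall()

for cust_id, score in rows:
    print(cust_id, round(score, 2))
```

Only the small result set – one score per customer – ever leaves the database, which is the point: the cost of moving raw data out for analysis disappears.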

We didn’t stop there because our customers’ needs won’t stop there. We’re continuing to build high-performance capabilities including concurrent, in-memory analytics, visual analytics, and seamless integration with Hadoop. According to Keith Collins, SAS is coming out with new innovations in high-performance analytics at a pace of about every six months – not just product usability upgrades, but new products, tools, engines and solutions that “change the way products are developed and revolutionize the way organizations work with their data.”

Who’s there?

As hardware vendors have built better systems to give organizations like yours increased storage, speed and availability, software vendors have made changes to the analytics architecture to leverage these improvements.

Some vendors have adapted through acquisition – stacking various pieces together like a giant Jenga game. In these situations, the glue binding the stack is the specialized hardware on which it runs - creating a one-size-fits-all approach. High-performance analytics from SAS provides a flexible infrastructure that is not tied to a single architecture implementation.

Unlike other solutions that leverage in-memory database approaches, where processing is optimized for data storage, SAS has designed an in-memory solution optimized specifically for analytic workloads. This allows your organization to use the database of its choice, including Hadoop, and doesn’t force you to abandon your existing database.

Other great benefits:

  • The ability to analyze complete data sets, or to use a stream-score-and-store approach that applies analytics on the front end to determine the most relevant data.
  • Comprehensive information management capabilities that allow organizations to effectively manage the entire data-to-decision lifecycle.
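The stream-score-and-store pattern mentioned above can be sketched simply. This is a minimal, hypothetical example (not a SAS implementation): records are scored as they arrive, and only those above a relevance threshold are stored, so downstream analysis sees a focused subset rather than the raw firehose. The scoring function and threshold are made up for illustration; a real system would use a trained model.

```python
# Illustrative sketch (not SAS code): a "stream, score and store" pattern.
# Each record is scored on arrival; only relevant records are stored.

def score(record):
    # Hypothetical relevance score; a real system would use a trained model.
    return record["amount"] * (2.0 if record["flagged"] else 1.0)

def stream_score_store(stream, threshold):
    store = []
    for record in stream:
        s = score(record)
        if s >= threshold:  # keep only the most relevant records
            store.append({**record, "score": s})
    return store

incoming = [
    {"id": 1, "amount": 50.0, "flagged": False},
    {"id": 2, "amount": 40.0, "flagged": True},
    {"id": 3, "amount": 10.0, "flagged": False},
]

kept = stream_score_store(incoming, threshold=60.0)
print([r["id"] for r in kept])  # → [2]
```

Applying analytics on the front end like this is what lets the approach scale: storage and later analysis only pay for the records the model deems relevant.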

You’ve got a lot to think about. I think the biggest ‘aha’ will be, “It’s not just the speed; it’s the combination of precision and forward-looking answers at speeds never before possible.”

Now, if you are starting to think about looking for a high-performance vendor, read Alison Bolen’s most recent post, 12 questions to ask your high-performance vendor.


About Author

Tapan Patel

Global Product Marketing Manager, SAS

Tapan Patel is Principal Product Marketing Manager at SAS. With more than 13 years of experience in the enterprise software market, Patel leads product marketing for Predictive Analytics and Data Mining as well as High-Performance Analytics, In-memory Computing and In-Database Processing.
