How are Hadoop deployments like snowflakes?


Because every single Hadoop deployment, like the structure of a snowflake, is different. Vendors offer different distributions of the open source big data framework, and every implementation is, or should be, tailored to that specific organization's needs.

And given this infinite variety of Hadoop foundations, the analytics deployed to pull value from the data must embody three critical characteristics:

  1. The analytics must be accurate. It's tough to make good business decisions if you aren't sure whether the analytic results are valid.
  2. The analytics must be scalable. After all, this is big data, and you might not know how much data you'll be working with a few years down the road.
  3. The analytics must support governance so that large amounts of data don’t have to be unnecessarily moved or replicated.

SAS, with its “laser focus” on analytics, big data and Hadoop, stands out for its dedication to the three critical characteristics of accuracy, scalability and data governance.

To get a quick overview of how SAS supports these vital customer requirements, watch the video below, produced by the same folks who made the intro-to-Hadoop video I recently included in the post "Has the Hadoop hype left you confused?"

Highly committed to enabling the big data framework, SAS continues to invest heavily in Hadoop, as shown by its recent product announcements.

These advanced software products are already being used by customers to create powerful analytic ecosystems around Hadoop. And you can expect SAS to unveil more products, some this year, that further exploit the potential of Hadoop resources. Stay tuned.


About Author

Steve Polilli

I've worked in SAS media relations since 2008. Prior to that I held PR positions at several other technology companies. Earlier in my career I was a news reporter and editor.

