Contributed by Kim Truong, Sr. Marketing Manager, Hortonworks, where she is responsible for partner marketing with SAS.
At Hortonworks, we’ve been working with SAS engineering teams since 2013, allowing our customers to leverage the inherent scale-out compute and storage capabilities of Hadoop in combination with the richness of SAS analytics. We’re going to be at SAS Global Forum 2015 in Dallas, April 26-29 and would love to show you all the innovative integrations on the SAS and Apache Hadoop front.
Visit our booth in The Quad to learn more about how SAS and Hortonworks can help you unlock the value of your data stored in Hadoop. You’ll see live demonstrations of how to:
- access and integrate data with SAS from Hadoop
- push SAS processing into the Hadoop cluster
- lift data into memory and perform highly scalable, distributed computing
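The first two steps above can be sketched in SAS. This is a minimal illustration, assuming the SAS/ACCESS Interface to Hadoop is licensed; the hostname, port, schema, and table names are placeholders, not actual Sandbox values:

```sas
/* 1. Access and integrate Hadoop data: a LIBNAME pointing at Hive.
      Server, port, and schema here are illustrative placeholders. */
libname hdp hadoop server="sandbox.hortonworks.com" port=10000 schema=default;

/* 2. Push processing into the cluster: because both the source and the
      target live in the hdp library, this aggregation can be passed
      through to Hive rather than pulling rows back to the SAS server. */
proc sql;
  create table hdp.sales_summary as
    select region, sum(amount) as total_amount
    from hdp.sales
    group by region;
quit;
```

The third step, lifting data into memory for distributed computing, is handled by the SAS in-memory components and is best explored hands-on in the Sandbox trial described below.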
Get the most out of your Hadoop deployment
We’re especially excited that Hadoop developers and data scientists can now try SAS® Data Loader for Hadoop in the Hortonworks Sandbox! This recent launch is one of the many ways we’re investing with SAS to enable the adoption of Hadoop as a component of a Modern Data Architecture (MDA).
With SAS Data Loader for Hadoop, Hadoop developers and data scientists can now transform, query, profile and analyze big data inside the Hortonworks Data Platform – spending more time developing innovative models and less time preparing data.
- For business analysts, SAS Data Loader for Hadoop on the Hortonworks Data Platform eliminates the complexity of writing MapReduce code: a simple, point-and-click interface empowers them to prepare, integrate and cleanse big data faster and more easily than ever.
- For data scientists and programmers, it runs SAS code on Hadoop in parallel for better performance and greater productivity.
Can’t wait until SAS Global Forum? You can try SAS Data Loader for Hadoop for 90 days with the Hortonworks Sandbox! To get started, install the Hortonworks Sandbox 2.2 and SAS Data Loader for Hadoop today.
Building a Modern Data Architecture
At Hortonworks, our focus has always been making Apache Hadoop an open core component of a Modern Data Architecture that integrates and interoperates with key technologies and skills within the data center.
The collaboration between SAS and Hortonworks is essential to building a Modern Data Architecture, allowing organizations to collect, store, analyze and manipulate massive quantities of data on their own terms – regardless of where that data comes from, how old it is, where it is stored, or what format it is in. We’re putting new capabilities alongside existing technologies to provide new opportunities for the enterprise.