SAS ODPi Interoperability helps reduce risk, simplify testing, accelerate development on Hadoop

Just in time for the Strata + Hadoop World Conference, SAS became the first software vendor to achieve ODPi Interoperability with our Base SAS® and SAS/ACCESS® Interface to Hadoop products. Now, that's a lot to digest – so let me back up a second and give some background as to what this means and why it's important.
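To make the announcement concrete: SAS/ACCESS Interface to Hadoop lets SAS programs read Hive tables on a Hadoop cluster as if they were SAS data sets. Here's a minimal sketch of what that looks like – the server name, port, credentials, and table name below are placeholders, not a real cluster:

```sas
/* Assign a SAS library that points at a Hive schema on a Hadoop cluster.
   Hostname, port, and credentials are placeholder values. */
libname hdp hadoop server="hadoop-cluster.example.com" port=10000
        user=sasdemo password=XXXXXX schema=default;

/* Once the library is assigned, Hive tables can be queried like SAS data sets. */
proc sql;
   select count(*) from hdp.web_logs;  /* hypothetical Hive table */
quit;
```

Because the connection goes through a standard interface, the same code can run against any ODPi-interoperable distribution.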

Why is Hadoop so important?

These are exciting times in the world of open software. Apache Hadoop – with its disruptive, low-cost storage and processing capabilities – has taken the world by storm. SAS has embraced Hadoop, releasing many solutions that run on it. These include SAS Data Loader for Hadoop (used for data preparation) and SAS Visual Analytics (used for visualization).

One reflection of this rapid growth is the projected attendance for the Strata + Hadoop World Conference in NYC September 27 - 29. SAS has been a major sponsor and supporter of this event for several years now. It has been incredible to watch attendance increase and to observe the high quality of leading-edge presentations that address challenges in the big data ecosystem.

But there are still challenges

Building on open source can sometimes be like fixing a car while it's being driven.

Despite this incredible growth, or possibly because of it, we and other solution providers quickly realized that developing applications on open source technologies such as Hadoop can be like building or fixing a car while it's moving. Different distributions often adhere to slightly different standards and patch requirements, which makes testing across distributions unreliable. Given that Hadoop is only 10 years old, there is still a great need for stability in this space.

The Open Data Platform (ODP) initiative (ODPi) was formed to address these concerns and to accelerate development of new applications that use the big data ecosystem, including Apache Hadoop. Members create specifications, then products are certified to meet those specifications. This approach reduces how much testing is needed and standardizes execution across multiple Hadoop distributions. It's like hitting a pause button while you're driving your car so you can get under the hood and change the spark plugs (no pun intended) without having to stop. As a result, you can drive the car on different types of roads, in much the same way you can run an application on different Hadoop distributions (for example, Hortonworks and IBM).

SAS and ODPi Interoperability

SAS is one of the founders of the ODP initiative. One of only seven Platinum members, we were involved from the start in the creation of the ODPi specification. In fact, Craig Rubendall, VP of Platform R&D at SAS, was recently installed as the Chairman of the ODPi Board of Directors. (For a great explanation, visit Craig's blog about Why SAS joined the ODP initiative.) Here's a quote from Craig that summarizes what happened:

“SAS is pleased to announce that we are the first software vendor to achieve the distinction of ODPi Interoperability,” said Craig. “By declaring that SAS interfaces with Apache Hadoop in demonstrable, standard ways, we can reduce our customers' risk, simplify testing complexity, and speed time to value for anyone building or deploying SAS applications.”


To learn more about SAS and how it has embraced Hadoop, download this e-book:
An Early Adopter's Guide to Hadoop


About Author

Matthew Magne

Principal Product Marketing Manager

@bigdatamagnet - Matthew is a TEDx speaker, musician, and Catan player. He is currently the Global Product Marketing Manager for SAS Data Management focusing on Big Data, Master Data Management, Data Quality, Data Integration and Data Governance. Previously, Matthew was an Information Management Solutions Architect at SAS, worked as a Certified Data Management Consulting IT Professional at IBM, and is a recovering software engineer and entrepreneur. Mr. Magne received his BS, cum laude, in Computer Engineering at Boston University, has done graduate work in Object Oriented Development and completed his MBA at UNCW.
