Oliver Schabenberger likes things that go fast.
Perched on a shelf in his office at SAS world headquarters in Cary, North Carolina, are five scale models of Formula One race cars, which symbolize Schabenberger’s fascination with speed and technology. The models are replicas of the life-size, aerodynamic machines that can travel at speeds up to 220 miles per hour.
“I’m fascinated by the technology associated with these cars, and how you can push things to make them go faster,” said Schabenberger, Lead Architect for High Performance Analytics. “But it’s not just about going fast. It’s about achieving extreme performance when building to specifications.”
Schabenberger’s description of Formula One racing sounds awfully similar to another passion of his – high-performance analytics.
High-performance analytics applies software tools and statistical methods across a group of clustered or distributed computers to solve complex computational problems. The approach spreads the analytic workload over many linked machines, improving both the speed and efficiency of the computation while minimizing data movement.
According to Schabenberger, a key to achieving speed and efficiency is collocating the data and analytics. “You have to get the analytics to where the data are,” he said.
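The two ideas above, spreading work across machines and shipping back only small summaries instead of raw data, can be sketched in plain Python. This is an illustrative sketch only, not SAS code: the thread pool stands in for a cluster of nodes, and the function names are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

# Pretend each "node" holds one chunk of the data locally.
data_on_nodes = [
    [12.0, 15.5, 11.2],   # node 1's local rows
    [14.1, 13.3],         # node 2's local rows
    [10.9, 16.0, 12.4],   # node 3's local rows
]

def local_summary(chunk):
    """Runs on the node that owns the chunk. Only (sum, count) is
    shipped back, not the raw rows -- that keeps data movement small."""
    return sum(chunk), len(chunk)

# Fan the work out to the "nodes" in parallel.
with ThreadPoolExecutor(max_workers=len(data_on_nodes)) as pool:
    partials = list(pool.map(local_summary, data_on_nodes))

# Combine the tiny per-node summaries into the global answer.
total, count = map(sum, zip(*partials))
overall_mean = total / count
print(round(overall_mean, 3))  # -> 13.175
```

The same pattern, compute locally, move only aggregates, underlies most distributed analytics: the cost of the job is dominated by how much data crosses the network, not by the arithmetic itself.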
But like the race cars, high-performance analytics isn’t merely about going faster. It’s about developing tools, products and solutions that meet customers’ needs.
“Our approach has always been to address the problem that the customer needs to solve. Then we’ll work on a way to make it run fast. That approach fits well with our customer-driven focus,” Schabenberger said.
“With high-performance analytics, we want to transform the way a customer can operate their business,” Schabenberger said. “It’s not about setting a world record. We want to make it useful to our customers.”
The transformative power of high-performance analytics was on display at the Disney Analytics and Optimization Summit in Orlando, Florida, in August. There, Schabenberger and CEO Jim Goodnight demonstrated three new SAS high-performance solutions and highlighted how SAS has helped a major US retailer analyze and optimize prices for 270 million items each week, reducing a job that once took more than 30 hours of computation to about two.
“What would you do with the extra time if your code ran in five minutes instead of hours or days? I want to reset how you think about business problems,” Goodnight said during the event.
Such significant reductions allow customers to perform further analysis of their data, which represents the transformative power of high-performance analytics. “If you can reduce a computation from 10 hours down to 50 seconds, now you have the capability to ask more ‘what if’ questions,” Schabenberger said. “Instead of waiting long periods of time, you can adjust your models and run them again. Now you can perform more analysis on your data in a few minutes than you could have done in a week previously.”
Applications for high-performance analytics are not limited to retail. For example, it allows banks such as United Overseas Bank to assess risks more quickly and, hence, make decisions that better protect the business and its customers. Telecommunications companies like Telstra use grid computing to keep pace with the needs and preferences of their customers, helping them retain existing customers, attract new ones and increase sales.
And it’s likely that you routinely encounter high-performance analytics in your daily life.
“Catalina Marketing uses SAS Scoring Accelerator for Netezza,” said Senior Director of Technology Product Management Ryan Schmiedl. “I don’t know if you’re familiar with Catalina, but when you shop at the grocery store and the cashier hands you coupons when you check out, that’s Catalina Marketing.”
Schmiedl said that Catalina keeps track of and runs analytical models on the purchase history of nearly 140 million customers. Each time you make a purchase and scan your customer loyalty card, the models help determine which coupons are most likely to entice you to buy products.
“It used to take them several hours to run the models because of the sheer volumes of data,” Schmiedl said. “Now, they run the model inside the database, so things that used to take hours are now taking minutes, which means they are able to regenerate and rescore models on customers throughout the day, as opposed to doing it once or twice a day. So it helps them get that extra level of lift in model predictability, which translates into you getting a more appropriate coupon which, in turn, translates into you buying that product the next time you go in the store.”
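The in-database scoring Schmiedl describes can be sketched with Python and SQLite. This is a toy illustration, not how the Netezza accelerator or Catalina's actual models work: the table, the scoring formula and its coefficients are all made up for the example. The key idea is that the model runs inside the database engine, so only the scores leave the database, never the raw purchase history.

```python
import sqlite3

# A toy shopper table standing in for the purchase-history warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shoppers (id INTEGER, visits INTEGER, avg_spend REAL)")
conn.executemany("INSERT INTO shoppers VALUES (?, ?, ?)",
                 [(1, 24, 52.0), (2, 3, 18.5), (3, 40, 75.0)])

def coupon_score(visits, avg_spend):
    """A stand-in propensity score (invented coefficients); in a real
    deployment the published model function runs inside the engine."""
    return 0.01 * visits + 0.002 * avg_spend

# Register the model as a SQL function, then score inside the database:
# each row is scored where it lives, and only the scores come out.
conn.create_function("coupon_score", 2, coupon_score)
rows = conn.execute(
    "SELECT id, coupon_score(visits, avg_spend) AS score "
    "FROM shoppers ORDER BY score DESC"
).fetchall()
print(rows)  # shopper 3 scores highest, shopper 2 lowest
```

Contrast this with the slower pattern the quote alludes to: extracting every row over the network, scoring it in a separate analytics process, and writing the results back.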
Bill Abbate also contributed to this post.