Changing the rules of the game: Why in-memory computing is like hydrofoils


After SAS® Global Forum in Denver, I took a holiday in San Francisco. Like all good tourists, I obviously went to visit the Golden Gate Bridge. While I was walking across it, I noticed a windsurfer getting what I thought was a bit close to the mouth of San Francisco Bay, especially given that the tide was going out. I do a bit of sailing, and it really looked like he was in trouble. The strong wind and the current were moving him rapidly towards the open sea, and much faster than he could surf upwind. I was just wondering whether I ought to do something when the scenario suddenly changed.

The surfer managed to gain enough speed to lift his board onto its hydrofoils. Suddenly he was flying. The waves and current were not bothering him anymore and he was gaining tremendous speed. Within a couple of seconds he was gone, back towards San Francisco and Alcatraz.


A real game-changer

This was not the first time I had seen sailboats or windsurfers with hydrofoils. At Lake Neusiedl, where I sail in summer, a few smaller dinghies and boards use this technology. The 34th America's Cup was the first large public sailing event where big catamarans were equipped with hydrofoils, allowing them to "come out of the water" on reaching a certain speed and virtually fly over it. It was, however, the first time that I really appreciated how much foils had changed the game, by allowing surfers and sailors to stand up against forces of nature that would previously have held them back.

This started me thinking about the similarity between this technology and distributed in-memory computing for machine learning and artificial intelligence. Over the last couple of years, computing technology and high-speed surfing and sailing have both evolved massively. Today’s possibilities, in terms of speed and power, are far beyond what we were discussing even just 10 years ago. There are some interesting parallels.

First of all, neither technology is especially new, but they are being used in new ways to deliver huge increases in capability. Foils were first patented back in 1898, but new materials and new control systems have made them viable at long last. With high-performance in-memory computing, distributed systems and GPUs, we can solve problems that were largely the stuff of science fiction when I started working in data science 20 years ago. We are not talking about a speed gain of 20, 50 or 100 percent, but multiples of computing performance.

This technology allows us to run analyses far more quickly than before. We can also apply advanced machine learning algorithms to large data volumes, something that was impossible in the past, thanks to growth in both the data available and our capacity to handle it.

Finally, for both hydrofoils and machine learning, the hardware is important: the carbon-foil, or the RAM and CPUs in a server. However, these would not be usable without the control systems or surrounding technology, which make the technology manageable, even for relative amateurs. Foils are now available not just for America’s Cup yachts, but for other dinghies and boards. Machine learning and AI methods have been made available in more usable forms via options like SAS® Viya®, our distributed high-performance environment.


Giving our customers the chance to fly over the water

As I watched that surfer getting smaller and smaller towards the horizon, I had a feeling that what I had just seen might be a metaphor for what we provide for our customers. We want to give them the chance to reach areas in their data lakes that were impossible to reach before, just as the surfer was able to get nearer the mouth of the bay. We allow them to see features of their customer base that were inaccessible and give them the opportunity to process these findings with tremendous speed, before safely returning to a point from where they can start new analyses.


About Author

Gerhard Svolba

Principal Solutions Architect

Dr. Gerhard Svolba is a senior solutions architect and analytic expert at SAS Institute Inc. in Austria, where he specializes in analytics across different business and research domains, with a particular focus on customer analysis, risk prediction and demand forecasting. His project experience ranges from business and technical conceptual considerations to data preparation and analytic modeling across industries. He has been with SAS since 1999 and has written for SAS Press since 2004: his books include Data Preparation for Analytics Using SAS® (2007) and Data Quality for Analytics Using SAS® (2012). He also teaches the SAS training course "Building Analytic Data Marts," supports the user group "SAS Club," and lectures on international marketing management at the University of Applied Sciences Steyr. He holds a doctorate in statistics and a master's degree in business informatics.
