Artificial intelligence (AI) is a natural evolution of analytics. Over the years, AI has added learning and automation capabilities to the predictive and prescriptive tasks of analytics.
We have been building AI systems for decades, but a few things have changed to make today's AI systems more powerful. The first wave of AI systems was rules-based and deterministic. The tasks we automated were routine, like recommending the next movie to watch or the next product to buy.
We have built expert systems like these for decades, and still do. They are very good at reasoning over a narrow class of problems. But they cannot learn or abstract, they do not handle uncertainty well, they are inflexible, and they are poor at perceiving the natural world.
The current wave of AI systems learns patterns from data, using machine learning and deep learning. Most of the learning is supervised in the sense that during training, we tell the model the ground truth: this is the thing we want you to detect, recognize, predict, or classify. Through exposure to many training samples, the system, typically a deep neural network, draws the connection between the pattern in the input data and the outcome. When exposed to new data for which the ground truth is not available, it can then predict the outcome, for example detecting disease in a medical scan.
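The train-on-labeled-data, predict-on-new-data loop described above can be sketched in a few lines. This is a minimal illustration using scikit-learn and synthetic data, not the systems or tools discussed in this article:

```python
# Minimal supervised-learning sketch: fit a model on examples with known
# ground-truth labels, then predict labels for data it has never seen.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic "training samples" with known ground-truth labels y.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small neural network learns the connection between input patterns
# and outcomes during training.
model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# For new data, where the ground truth is withheld, the trained model
# predicts the outcome.
predictions = model.predict(X_test)
accuracy = model.score(X_test, y_test)
```

The same pattern (fit on labeled data, predict on unlabeled data) underlies the detection, recognition, and classification tasks mentioned above, whatever the model architecture.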
Deep learning systems can do something the handcrafted expert systems are not good at: perceiving the world. This wave of AI gave us voice recognition, speech-to-text, and computer vision. It gave us the ability to classify, differentiate, and recommend.
Hand-crafted knowledge systems for voice recognition and computer vision existed in the first wave, before deep learning took off. Their accuracy was acceptable, but not good enough for commercial applications; they could not really replace a human performing the task. Today, thanks to data-driven AI, they can.
Why deep learning now?
Massive compute power allows us to build deeper, more complex models that solve more complex tasks. With increasing depth of the model comes better abstraction of the problem. Massive data allows us to train these models well. As a result, autonomous driving, medical diagnosis, legal review, and language generation and translation are now possible at human or super-human levels.
So why are we seeing so many examples of deep learning success in computer vision and natural language processing? Because we have identified types of neural networks that work particularly well for those types of data: convolutional neural networks for images and video, and recurrent neural networks for text and speech.
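To see why convolutional networks suit images, consider the core operation: a small filter slides across the input and responds wherever a local pattern appears. Here is a pure-NumPy sketch of that idea (real convolutional networks stack many learned filters; the vertical-edge filter below is a hand-picked illustration):

```python
# A 2-D convolution slides one small filter over the whole image,
# detecting the same local pattern at every position.
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation, as in most deep learning libraries)."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny image: dark on the left, bright on the right.
image = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

# A vertical-edge filter: it responds where brightness jumps left-to-right.
edge_filter = np.array([[-1.0, 1.0],
                        [-1.0, 1.0]])

response = conv2d(image, edge_filter)
# The response is strongest along the column where the edge sits,
# and zero in the flat regions.
```

Because the filter weights are shared across all positions, the network needs far fewer parameters than a fully connected model and can recognize a pattern wherever it appears; recurrent networks achieve an analogous weight sharing across time steps in text and speech.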
And today, GPU computing technologies allow these complex models to run quickly even on large amounts of data.
As a result, we can automate tasks that were previously considered non-routine and not automatable, like making a medical diagnosis, understanding sentiment in text, and driving a car. These tasks all require a lot of data and many passes over that data.
The question, “Tell me how you did that” has been replaced with “Do you have the data that a machine can use to learn this?”
Investing in GPUs
As SAS continues to answer that question and invest heavily in deep learning technology, these workloads map well to the GPU computing paradigm. Using GPUs, we can achieve a high degree of parallelism during both training and inference.
We are also investing in using GPUs as math co-processors to support general analytical workloads. Problem sizes keep increasing: data volumes grow, and models use more features. The goal is to bring GPU computing capabilities directly into our calculations, so the power is built into the math and you do not have to code for it.
Our advancements are not aimed only at neural networks and certain machine learning algorithms, for which GPU-specific implementations are being developed. They also apply to the general mathematics underpinning many of our analytic tools and solutions.
With this effort, we are making it easy and transparent for our customers to benefit from their GPU investment, whether it is in the data center or the cloud.
To learn more about deep learning and the importance of GPUs, watch the video below or download the paper, Deep Learning with SAS.