Three virtualization myths and how to dispel them

Hardware virtualization was developed to reduce resource needs, utilize idle hardware and address compatibility issues that come with multiple layers of hardware, software and middleware.

While virtualization solves many problems, it has a reputation for creating others. Here, I’d like to address three of the concerns I most often hear about virtualization, and explain why they’re not true.

Myth number 1: Virtualization is slow.
We recently ran a series of tests and found that SAS running in a virtualized environment performs within two to seven percent of its performance on equivalent physical hardware. The tests used an analytic workload with multiple calculations, large data sets and heavy input/output. Not only is performance reasonable for most applications, but provisioning times can also improve drastically. When you use the hypervisor to host a repository of images (predefined installations of an environment with content), you can get SAS up and running on Linux in six minutes. This approach works well when an installation is needed quickly, such as in a disaster recovery scenario.
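As a rough sketch of how image-based provisioning can work, the example below assumes a libvirt/KVM host that already holds a predefined template VM; the names sas-template and sas-analytics-01 are hypothetical, and this is not the specific tooling used in our tests. It clones the template's disk image and boots the copy with the standard virt-clone and virsh utilities.

# Provisioning sketch: clone a predefined template image and boot the copy.
# Assumes a libvirt/KVM host with virt-clone and virsh installed and a
# template VM that is shut off; the VM names below are hypothetical.
import subprocess

TEMPLATE = "sas-template"     # predefined installation held in the image repository
NEW_VM = "sas-analytics-01"   # instance to provision

def provision(template: str, name: str) -> None:
    # Copy the template's disks and register a new domain with its own identity.
    subprocess.run(
        ["virt-clone", "--original", template, "--name", name, "--auto-clone"],
        check=True,
    )
    # Boot the freshly cloned virtual machine.
    subprocess.run(["virsh", "start", name], check=True)

if __name__ == "__main__":
    provision(TEMPLATE, NEW_VM)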

Myth number 2: I’ve tried, and my application won’t run in a virtual environment.
Most people who make this claim are basing their decision on tests or numbers from five years ago. A typical virtual environment used to carry 30 to 60 percent overhead; today that's down to 2 to 10 percent. As the overhead has shrunk, this objection has lost most of its force. Today's virtualization technology has matured to the point where you can readily reach the performance characteristics that most applications need.
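Rather than rely on numbers from five years ago, you can measure the overhead for your own workload by timing the same job on comparable physical and virtual hosts. The sketch below is a generic harness, not SAS-specific; run_workload is a placeholder for whatever job you want to compare.

# Overhead check sketch: time the same workload on a physical and a virtual
# host, then express the difference as a percentage of the physical time.
import time

def run_workload() -> None:
    # Placeholder: replace with the real job you want to compare,
    # e.g. an analytic batch run against your own data.
    sum(i * i for i in range(10_000_000))

def timed_run() -> float:
    start = time.perf_counter()
    run_workload()
    return time.perf_counter() - start

def overhead_percent(physical_seconds: float, virtual_seconds: float) -> float:
    return (virtual_seconds - physical_seconds) / physical_seconds * 100

# Example: a 300-second physical run versus a 315-second virtual run
# works out to 5 percent overhead, within the 2 to 10 percent range above.
print(overhead_percent(300.0, 315.0))   # -> 5.0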

Myth number 3: I can’t troubleshoot in a virtual environment.
It can be harder to diagnose problems in virtual environments, but you can overcome that with thorough monitoring. Instrumentation throughout the stack is essential. If you monitor each layer, you can understand how your virtual layer is performing and know quickly when things are going wrong. Using the operating system alone for monitoring won't give you a complete picture; you need tools that look at each new piece of the stack.
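To illustrate the guest-level piece of that instrumentation, here is a minimal sketch that samples CPU, memory and disk I/O from inside the operating system using the psutil library. As noted above, this shows only the guest's view; hypervisor-side counters (for example, how long the VM waits for a physical CPU) have to come from the virtualization platform's own tools.

# Guest-level monitoring sketch using psutil (pip install psutil).
# It captures only what the operating system inside the VM can see;
# hypervisor-layer metrics are not visible here and need their own tooling.
import time
import psutil

def sample_guest_metrics() -> dict:
    io = psutil.disk_io_counters()
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),    # averaged over one second
        "memory_percent": psutil.virtual_memory().percent,
        "disk_read_bytes": io.read_bytes,
        "disk_write_bytes": io.write_bytes,
    }

if __name__ == "__main__":
    # Poll every 10 seconds and feed the samples into whatever collector
    # you run alongside the hypervisor's own counters.
    while True:
        print(sample_guest_metrics())
        time.sleep(10)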

Conclusion
At SAS, we’re committed to understanding the impact of virtualization on every product we deliver. High-performance computing is an important initiative for SAS, as customer problems become more complex and require more data to process – and it’s difficult to do high-performance computing without virtualization.

If you’re one of the many people who have perpetuated the myths above, reconsider your environment with today’s technologies. We’ve set up virtual environments to run applications for customers in many different industries. You might be surprised at the performance levels for your application.

About Author

Craig Rubendall

Vice President of Platform R&D, SAS

As the head of Platform R&D at SAS, Rubendall manages research and development for core SAS products, including the broad areas of data storage and access, computing infrastructure, and management and application integration services.
