Server Virtualization vs. Cloud Computing

Author: Craig Pollack | Date: Dec 13, 2012 | Topics: General Business Owner Blogs

Server virtualization and cloud computing are often talked about, even by service providers, as if the terms were interchangeable.

This is in part because both were developed to solve a similar business problem: how to provide more computing power at a lower cost. It's also in part because many of these terms are bandied about by marketing folks who don't know that they don't know the difference.

As it turns out, virtualization is pretty much THE key component of the underlying technology used to provide cloud computing services. So, I thought I'd take a little time to detail some of the key differences between server virtualization and cloud computing...

Q: What is virtualization?

Virtualization is a way to make more efficient use of today’s high-performance CPUs by letting you run multiple (virtual) servers on the same (physical) hardware. One or more virtual servers share the computing resources provided by the physical machine, all under the control of a "hypervisor" - the controlling operating system layer that runs the virtual servers on top of it.
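
To make the hypervisor idea more concrete, here's a minimal sketch of what "multiple virtual servers on one physical box" looks like from the management side. It assumes a Linux host running the KVM hypervisor with the libvirt Python bindings installed - an assumption for illustration, not something every shop will have:

```python
# Minimal sketch: list the virtual servers running under one hypervisor.
# Assumes a KVM/QEMU host with the libvirt Python bindings installed.
import libvirt

# Connect to the local hypervisor (the controlling layer described above).
conn = libvirt.open("qemu:///system")

# Each "domain" is one virtual server sharing this physical machine.
for dom in conn.listAllDomains():
    state, max_mem_kb, mem_kb, vcpus, _cpu_time = dom.info()
    status = "running" if dom.isActive() else "stopped"
    print(f"{dom.name()}: {vcpus} vCPUs, {mem_kb // 1024} MB RAM, {status}")

conn.close()
```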

The more virtual servers you can fit onto a physical machine, the fewer physical servers you need. If you consolidate five physical servers onto one physical server, you've reduced your physical server count by four. This reduces hardware, space, cooling, and power costs. Virtual servers can also be moved among different physical machines to further align available resources as demand on those servers changes.
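
Here's a quick back-of-the-envelope sketch of that consolidation math. All of the dollar figures are made-up placeholders - plug in your own hardware, power, and cooling numbers:

```python
# Back-of-the-envelope consolidation savings.
# Every dollar figure below is a hypothetical placeholder.
servers_before = 5   # physical servers today
servers_after = 1    # physical hosts after virtualization

annual_cost_per_server = (
    1500   # hardware amortization per year (assumed)
    + 600  # power and cooling per year (assumed)
    + 300  # rack space per year (assumed)
)

eliminated = servers_before - servers_after
savings = eliminated * annual_cost_per_server
print(f"Eliminating {eliminated} physical servers saves "
      f"roughly ${savings:,} per year.")
# -> Eliminating 4 physical servers saves roughly $9,600 per year.
```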

Q: What is cloud computing?

Cloud computing is a service that relies on a highly virtualized physical infrastructure. In the cloud, applications generally run on virtual servers that are independent of the underlying hardware. (Indeed, a virtual server environment for your application can be one of the services a cloud computing provider offers.) But there’s more to the cloud than virtualization: cloud computing is based on the concept of a "utility computing" service, where RAM, CPU cycles, storage, and network bandwidth are commodities to be consumed on a "pay per use" basis, like water or electricity.
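
To see what "utility computing" billing looks like in practice, here's a toy metering calculation. The unit rates are invented purely for illustration - real providers publish their own pricing:

```python
# Toy pay-per-use bill, metered like water or electricity.
# All unit rates are invented for illustration only.
RATES = {
    "cpu_hours": 0.05,     # $ per vCPU-hour
    "ram_gb_hours": 0.01,  # $ per GB-hour of RAM
    "storage_gb": 0.10,    # $ per GB-month of storage
    "egress_gb": 0.09,     # $ per GB of outbound bandwidth
}

def monthly_bill(usage: dict) -> float:
    """Sum metered consumption times the unit rate for each resource."""
    return sum(amount * RATES[resource] for resource, amount in usage.items())

# One month of consumption for a small application (hypothetical):
usage = {"cpu_hours": 1440, "ram_gb_hours": 5760,
         "storage_gb": 200, "egress_gb": 50}
print(f"This month's bill: ${monthly_bill(usage):,.2f}")  # -> $154.10
```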

A cloud computing environment relies on many physical and virtual servers, configured in both hardware and software to provide high reliability and availability. Clouds are also very flexible and scalable, in the sense that an application can simply consume resources as needed.

Q: What are the basic pros and cons of virtualization?

Virtualization lets you reduce the cost and complexity of your IT infrastructure by maximizing the utilization of your physical computing resources. It also reduces your reliance on physical machines, reducing the number (and therefore the risk) of physical failure points. But keep in mind - you still need to purchase and maintain servers and software. Multiple virtual servers also increase the complexity of the initial configuration and require a knowledgeable and experienced staff (whether in-house or outsourced) to manage.

Q: What are the basic pros and cons of cloud computing?

The primary benefit of cloud computing is that you basically remove the physical layer from your control and responsibility. The provider now takes care of the infrastructure your application runs on. You eliminate the initial startup costs associated with purchasing all the needed hardware and software. Your IT infrastructure costs now become operating expenses rather than capital expenses. Everything becomes an ongoing monthly fee.

But this is only a benefit if the provider does a good job. And your internet connection becomes uber-critical! While hardware failures may become a thing of the past, you are more dependent on your internet connection and bandwidth than ever before. Also, when your data is outside of your "four walls", your ability to control and secure it changes (and usually significantly). It's now possible (and sometimes probable) to lose access to business-critical services as well as data when cloud services fail.

Q: What is a quick way to tell if a vendor is really talking about true cloud computing services or about virtualization?

As Salesforce.com CEO Marc Benioff recently pointed out, virtualization is a software-based technology. "They have versions with numbers after it. That is when you know you are dealing with software; if you hear about versions, you know you are not in the cloud." Cloud computing is a service that goes beyond what software alone can provide.

Q: Should we choose virtualization or cloud computing?

Which approach is right for your application? Virtualization can certainly save companies money in both the short- and long-term. But it is still necessary to purchase and provision hardware and software upfront in order to run an application on virtualized infrastructure. The IT costs associated with managing the virtualized application are also a factor.

Cloud computing, in contrast, costs less upfront because you don’t have to buy and manage the infrastructure. But the more cloud-based resources you use, the higher your costs will be. Ultimately, cloud computing might cost more than running virtual servers on your own hardware, depending on your requirements, workload, and many other factors.
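
One way to reason about that trade-off is a simple cumulative-cost comparison. Every number in this sketch is a placeholder assumption - your actual capital costs, staffing, and cloud spend will differ:

```python
# Hypothetical cumulative-cost comparison: on-premise virtualization
# (big upfront cost, lower monthly cost) vs. cloud (no upfront cost,
# higher monthly cost). All figures are placeholder assumptions.
onprem_upfront = 25_000  # servers, hypervisor licensing, setup
onprem_monthly = 800     # power, cooling, maintenance, admin time
cloud_monthly = 1_800    # pay-per-use fees at a steady workload

for month in (12, 24, 36, 48):
    onprem = onprem_upfront + onprem_monthly * month
    cloud = cloud_monthly * month
    cheaper = "cloud" if cloud < onprem else "on-premise"
    print(f"Month {month}: on-prem ${onprem:,} vs. cloud ${cloud:,} "
          f"-> {cheaper} is cheaper")

# With these made-up numbers, cloud wins until roughly month 25,
# after which the upfront on-premise investment pays off.
```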

Another key choice factor is data security. In a virtual environment, you control the hardware, the access permissions, the backup/recovery, etc. In a cloud computing environment the service provider handles those concerns, for better or worse. And many times you're just stuck with whatever their approach is. For instance, what's their approach to backup? Many times cloud providers only retain backups for 14 days. If you want to save End of Month backups, good luck!
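
If your provider only keeps two weeks of backups, one workaround is to pull your own End of Month copy before it ages out. Here's a sketch of just the date logic for spotting which backups to archive - the backup dates are hypothetical, and wiring this up to a real provider's API is left to you:

```python
# Sketch: flag End of Month backups to copy out of the cloud before a
# 14-day retention window purges them. The dates below are hypothetical.
import calendar
from datetime import date

def is_end_of_month(d: date) -> bool:
    """True if d is the last calendar day of its month."""
    return d.day == calendar.monthrange(d.year, d.month)[1]

backup_dates = [date(2012, 11, 28), date(2012, 11, 30),
                date(2012, 12, 1), date(2012, 12, 31)]

to_archive = [d for d in backup_dates if is_end_of_month(d)]
print("Archive these before they expire:", to_archive)
# -> [datetime.date(2012, 11, 30), datetime.date(2012, 12, 31)]
```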

Q: What about application performance?

Whether you choose virtualization or cloud computing, or maybe even both, the performance of your applications is paramount. Both cloud and server virtualization place new and often hard-to-predict demands on the network.

To ensure application performance levels for your distributed users, you need to be able to efficiently manage and troubleshoot network performance. Factors like bandwidth, latency, packet loss, and jitter can play havoc with both virtualized and cloud-based applications.
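
A simple way to get a feel for the latency and jitter between your office and a cloud service is to time repeated TCP connections. This sketch uses only the Python standard library; the target host and port are placeholders to replace with your actual service:

```python
# Rough latency/jitter probe using timed TCP connects (stdlib only).
# HOST and PORT are placeholders -- point them at your actual service.
import socket
import statistics
import time

HOST, PORT, SAMPLES = "example.com", 443, 10

rtts = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=3):
        pass  # connection established; we only care about the timing
    rtts.append((time.perf_counter() - start) * 1000)  # milliseconds
    time.sleep(0.5)

print(f"average latency: {statistics.mean(rtts):.1f} ms")
print(f"jitter (std dev): {statistics.stdev(rtts):.1f} ms")
```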

Whichever path you choose, the ability to ensure network performance is a prerequisite for acceptable application performance.

Author

Craig Pollack

Craig is the Founder & CEO of FPA Technology Services, Inc. Craig provides the strategy and direction for FPA, ensuring its clients, business owners, and key decision makers leverage technology as efficiently and effectively as possible. With over 30 years of experience building the preeminent IT Service Provider in the Southern California area, Craig is one of the area’s leading authorities on how small to mid-sized businesses can best leverage and secure their technology to achieve their business objectives.
