Is there an optimal adoption curve for server virtualization?

Posted by Staff   |   Thursday, September 17th, 2009

Guest author: Ron Oglesby, Practice Executive, Global Infrastructure Consulting Services at Dell.

One of the things that has consistently interested me is the rate at which customers adopt server virtualization. What makes it so interesting is the huge difference I see between organizations that often have the same goals and are about the same size. Some organizations have huge implementations, with CEOs or CIOs driving everyone to virtualize, while other organizations have a ‘grass roots’ push for virtualization, squeaking it in wherever they think they can get away with it. Sometimes I have even found that ‘early adopters’ of the technology are only 30% virtualized after 3 or 4 years, while a competitor that started its project just last year is already 40% virtualized.

All of this led me to wonder whether we could ‘grade’ where someone is in their adoption based solely on the percentage of their environment that is virtualized and the number of years they have been using server virtualization. And if we can grade someone, then we need to define a possible “optimal adoption curve” for server virtualization, again based on the pure potential within an environment.

Before we get into the numbers behind the optimal adoption curve, we have to dispose of a couple of items and make a few assumptions.

The first item to dispose of is the politics of server virtualization. I often hear from organizations that they “simply cannot move any faster due to push back from app owners”. While tactically this is an issue we have to deal with, from a strategic point of view it is just an excuse.

The first reason to do away with this ‘excuse’ is that MOST organizations do not virtualize a server that is only a year or two old. Generally, a move to a virtual platform (at a micro level) happens once a server is depreciated or up for refresh and is changing to new hardware anyway. In that case there is NO REASON not to move to virtualization, as the VM will often have more compute power than the existing server. So while this is a real tactical issue, it can usually be solved with enough C-level push, policy changes, and IT standardization on VMs.

The second item we need to dispose of (or create a new assumption about) is the idea that not everything can run in a VM… Yes, you heard me: everything can run in a VM today. Does that mean you will run out and virtualize everything starting tomorrow? No. But for 5+ years I have been doing virtualization assessments, and in each one I have generally recommended candidates that utilize less than one processor, less than 2GB of memory, and have equally low disk and network utilization. Using this type of criteria, I consistently find that 80+% of any customer’s environment can be virtualized.

So, since 2004 I have been telling customers that 80% of their environment could be virtualized, yet most of these companies (5 years later) are nowhere near that. Why?

Running something outside of a VM today should be the exception, not the rule. CAPEX and OPEX for a standalone physical server can be two or three times higher than for a virtual server, and this alone should make VMs the dominant platform in any environment today.

So all of this thinking led me to create three adoption models and then layer them on top of each other to create a way for you to grade your organization and its use of virtualization. The assumptions used to build the models (shown below) are not unrealistic and let you see where you sit on the curve.

Environmental Assumptions:

  • 10% growth rate per year
  • 4 Year replacement/depreciation cycle
  • Project Starts in Year 1
  • Rate of adoption (measured as a % of new/refreshed servers virtualized) increases each year in all models

The following table shows all three models and their rate of adoption by project year:

| Model                           | Year 1    | Year 2    | Year 3 | Year 4 |
|---------------------------------|-----------|-----------|--------|--------|
| Ahead of the curve / Power Band | up to 70% | up to 85% | —      | —      |
| Average                         | 40%       | 75%       | —      | —      |
| Behind the curve                | ≤ 25%     | 33%       | —      | —      |
To better understand this, let’s use the Average path numbers. The 40% rate of adoption in Year 1 means that during the first year of the project, 40% of NEW servers and 40% of refreshed servers are deployed as VMs instead of physical machines. Then in year 2 this customer will deploy 75% of new and refreshed servers as VMs, so after year 1 only 25% of the servers being deployed are still physical.
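The environmental assumptions above can be turned into a rough simulation. This is a sketch of one plausible reading of the model (10% annual fleet growth, a 4-year cycle so roughly 25% of the fleet is up for refresh each year); the article’s exact arithmetic isn’t spelled out, so the outputs are illustrative rather than a reproduction of the charted figures:

```python
def simulate(adoption_by_year, fleet=1000.0):
    """adoption_by_year: fraction of new/refreshed servers deployed as VMs
    in each project year. Returns the cumulative VM share of the fleet
    at the end of each year."""
    vms = 0.0
    shares = []
    for rate in adoption_by_year:
        new = fleet * 0.10                    # 10% annual growth
        refreshed = (fleet - vms) * 0.25      # 4-year cycle: 25% of the
                                              # remaining physicals refresh
        vms += (new + refreshed) * rate       # this year's conversions
        fleet += new                          # fleet grows by the new servers
        shares.append(vms / fleet)
    return shares

# "Average" path rates from the text: 40% in year 1, 75% in year 2.
print(simulate([0.40, 0.75]))
```

Under these assumptions the Average path lands at roughly 13% virtualized after year 1 and about 33% after year 2 — in the same ballpark as the figures on the chart, which suggests a model along these lines is what sits behind it.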

Obviously, “Behind the curve” is a slower adoption rate (and not all that unique, really), with 25% or less of new servers deployed as VMs in year 1, increasing to 33% in year 2. “Ahead of the curve”, or the Power Band, is what I would call an “optimal” adoption rate that fully leverages virtualization, with up to 70% adoption for new servers in year 1 and up to 85% in year 2.

So what does all this mean? What does it mean to you? Well, if you know when you started your virtualization project (the year) and can estimate fairly closely the percentage of x86 servers in your environment that are virtualized, then you can rate yourself on the curve. Below is the curve without a customer rating:

[Chart: the three adoption curves, % of environment virtualized by project year]
In this chart the smaller numbers (7.3, 16.2, 29.4 and 44.5) represent the upper end of the “Behind the curve” model. To get into the “Average” range, your environment has to surpass that percentage virtualized. The larger numbers (11.6, 32.3, 51.4 and 68.9) are the upper end of the Average path, or the bottom of the ‘Power Band’. If you want to say you are fully leveraging virtualization and are in the third year of your use of the technology, then by the end of that third year more than 51% of your environment should be virtualized. If you are in year 3 of your project and are 25% virtualized, then you can see (based on the previous table) that you are using virtualization for less than 40 or 50% of your new servers. Given today’s technology, that should be completely unacceptable.

Grading a company
To show you how to score yourself we have created two fictional companies as samples below:

Company A: using virtualization since 2007 and has 40% of their environment virtualized:


As you can see, this customer is in the third year of their project (nearing its end) and sits just below the optimal/Power Band adoption curve. They are tracking in the middle, about average, but are not fully leveraging the technology. It will be several more years (potentially another 3 or 4) before they are near 80% virtualized.

Company B: has been using virtualization since 2006 and also has 40% of their environment virtualized:


Company B has virtualized the same percentage of their environment as Company A but is behind the adoption curve because they are not leveraging virtualization to its fullest. For example, this company has been using VMs for less than 50% of their new machines. At this rate they will never reach the 80% virtualized mark they have been capable of since 2006, and they have been missing out on a large percentage of their potential savings.
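The “never get to 80%” claim can be checked with a plateau calculation. This is my own simplified model (fleet grows 10% a year, 25% of the remaining physicals refresh each year, nothing is ever retired), not the article’s exact math, but it shows the same effect: hold the adoption rate at 50% and the VM share stalls just under 78%, never reaching 80%.

```python
def long_run_share(rate, years=30, fleet=1000.0):
    """VM share of the fleet after many years at a fixed adoption rate."""
    vms = 0.0
    for _ in range(years):
        new = fleet * 0.10                       # 10% annual growth
        refreshed = (fleet - vms) * 0.25         # only physicals convert
        vms += (new + refreshed) * rate          # fixed adoption rate
        fleet += new
    return vms / fleet

print(round(long_run_share(0.50), 2))   # ≈ 0.78 after 30 years
```

In this model the share converges to 0.35r / (0.1 + 0.25r) for adoption rate r, so anything below roughly 53% adoption plateaus under 80% virtualized no matter how long you wait.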

So where are you on this curve? Are you leveraging virtualization as much as you could be? Or are you on track to keep buying physical servers even 2 or 3 years into your project? Remember, a physical server purchased today will be powered and cooled in your DC for at least 4 years. And if you buy it today, it will be 2013 before you even consider making it a VM. Shouldn’t your adoption rate (from the first table) be closer to the Power Band?

As you may have noticed, the aggressive path/Power Band seems, well… aggressive. But it is very realistic, and not only completely doable with today’s technology, but completely doable with technology from 2006! This is why the Power Band is the optimal path for getting a large majority of your environment onto VMs within just a few years. Below that band you are essentially ‘bailing water’ to keep the ship afloat while still not plugging the holes that are filling it.

The question you have to ask yourself if you are not within the “Oglesby” Power Band is WHY? What is slowing down your adoption? It’s not the technology…

In later articles I will lay out some of the keys to accelerating your adoption rate when it comes to standardizing on virtualization, and over the next few weeks I will be blogging about this topic over at Direct2Dell.
I would love to read your comments on where you are on the adoption curve. If you are in the Power Band, how did you get there? And if not, what has held you back?
