Benchmarks: Citrix XenDesktop 2.1 vs VMware View 3.0

For the fourth time in a few days, benchmarks of Citrix and VMware desktop virtualization solutions (VDI, presentation virtualization and application virtualization) take center stage.
Is it a sign that somebody is getting nervous?

The first (non-sponsored) analysis came out from two independent enterprise architects, Ruben Spruijt and Jeroen van de Kamp, who evaluated how XenServer, ESX and Hyper-V perform in VDI scenarios.

After an immediate reaction from VMware, a XenDesktop 2.1 Scalability Analysis popped up from Citrix (to be fair, this document was released on Jan 12, days before the Spruijt/van de Kamp work, and further updated on Jan 27).

Then an independent performance comparison (commissioned by VMware) between Microsoft App-V, Symantec/Altiris SVS, VMware ThinApp and Citrix XenApp was released by the Exo Performance Network team.

The last episode of this saga came out last week from the Tolly Group.

The test lab performed an independent performance comparison (once again commissioned by VMware) between Citrix XenDesktop 2.1 Enterprise and VMware View 3.0 Premier.

As with any sponsored analysis, the results are easy to guess:

The VMware View 3 VDI solution deployed more simply and more rapidly than Citrix XenDesktop 2.1. VMware provided more comprehensive, efficient image and storage management of virtual desktops. It provides end-users with a quality of experience on the LAN that matches or exceeds that offered by the Citrix solution.

Citrix promptly answered on its corporate blog, dismissing the analysis because it covers unrealistic scenarios and evaluates an outdated product (XenDesktop 3.0 was released just two weeks ago):

There’s a prominent sidebar in the report that states that Citrix declined to participate in the testing – this is true, and I was the one that actually made that call and discussed it with Tolly Group. To their credit, Tolly Group did call us prior to beginning the testing and informed us of the project and shared the statement of work prepared for VMware. We asked some questions and provided some feedback about the testing methodology. I had serious concerns that the proposed tests did not reflect true customer use cases. For example, the user experience testing was only for a few productivity applications in a LAN environment – that was all that was planned, and it didn’t seem too realistic based on what we’ve seen in real customer environments. Tolly took note of our concerns and asked VMware as the sponsor of the paper whether they would alter their approach. Later we learned that VMware (not surprisingly) had rejected our suggestions and was not open to changing the proposed tests. At that point, it was clear that it made no sense to participate because…