Virtualization lacks management tools

Techworld published a nice article about the lack of mature management tools for various aspects of virtualization, a point I strongly agree with:


But as virtual machine technology moves out of development labs and into production server environments in large numbers, some administrators are finding that the growth of virtual servers is getting ahead of the tools available to effectively manage them.

Existing server-monitoring tools are increasingly aware of virtual servers, but most aren’t yet sophisticated enough to interpret feedback in a virtual machine context.

“Some of the things you monitor no longer mean the same thing,”

For many organisations, identifying the root cause of virtual server problems and rectifying them remains largely a manual process. As the number of virtual machines in the data centre increases, solving those problems in an automated way becomes more urgent.

Performance monitoring is just one aspect of virtual machine management. Other tasks include optimising the mix of virtual machines that should reside on each physical server to achieve the best possible performance; automating virtual machine provisioning, load balancing, patch management, configuration management and fail-over; and enabling policy-based orchestration to automatically trigger the appropriate responses to events…

Read the whole article at source.
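The policy-based orchestration the article closes with is worth making concrete. Below is a minimal sketch of how such an engine could map monitored events to automated responses; every class name, threshold and action here is hypothetical, and none of the vendors mentioned necessarily implement it this way.

```python
# Minimal sketch of policy-based orchestration: policies map monitored
# conditions to automated responses. All names are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class VMMetrics:
    name: str
    host: str
    cpu_percent: float
    memory_percent: float

@dataclass
class Policy:
    description: str
    condition: Callable[[VMMetrics], bool]
    action: Callable[[VMMetrics], None]

def migrate_to_least_loaded_host(vm: VMMetrics) -> None:
    # Placeholder: a real tool would call the hypervisor's management API.
    print(f"Migrating {vm.name} away from overloaded host {vm.host}")

policies = [
    Policy(
        description="Rebalance VMs on sustained CPU pressure",
        condition=lambda vm: vm.cpu_percent > 90.0,
        action=migrate_to_least_loaded_host,
    ),
]

def evaluate(vms: list[VMMetrics]) -> None:
    """Trigger the configured response for every VM matching a policy."""
    for vm in vms:
        for policy in policies:
            if policy.condition(vm):
                policy.action(vm)

evaluate([VMMetrics("web01", "host-3", cpu_percent=95.0, memory_percent=60.0)])
```

In a real product the action would invoke live migration, provisioning or patching through a management API instead of printing, and policies would come from configuration rather than being hard-coded.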

Processor published an article on the same theme:


“The trend toward server consolidation is an attempt on the part of IT departments to try to reduce the number of servers to reduce some of the complexity,” says Susan Davis, a vice president of strategy at Egenera. “However, as virtual machine technology becomes more widely used, the complexity problem could actually get worse. As an example, if today a company is managing 100 servers, in the future they may need to manage 1,000 virtual machines—thus adding complexity.”

Vishria notes that virtualization costs only add to the perception of complexity. According to Vishria, a driving force behind virtualization is that, while hardware costs have plummeted in a six-to-nine-fold reversal from just a few years ago, labor costs have risen three times as high as they were in 2000. Meanwhile, while SMEs are investing about 3% of their budgets on hardware each year, labor costs have grown to 10% of budgets and will continue to escalate. Virtualization addresses the rising labor cost issues, but the confusion comes in when data centers consider the fact that adding a new Dell PowerEdge server is cheaper than ever…

Read the whole article at source.

A third one appeared on SearchOpenSource:


Key vendors like IBM, Microsoft, VMware Inc., XenSource Inc., Virtual Iron Software Inc. and others are hard at work on management tools, and several are on the verge of releasing beta versions. But fully formed server virtualization management tools will be unavailable until probably sometime next year.

Specific gaps identified by analysts and users include tools to facilitate patch management, x86-based server aggregation, backup and restore management, and workload balancing optimized for virtual servers. With the gaps identified, forthcoming management tools have been scheduled for release…

Read the whole article at source.

IBM releases a tool for billing virtualization usage

Quoting from the IBM official announcement:

IBM today announced new software that assists customers in virtualizing their entire technology infrastructure by tracking the use of computing assets across different types of systems. The software also supplies companies and IT outsourcing providers with a simple method to bill internal departments or individual clients for the amount of computing they actually consume.

The Tivoli Usage and Accounting Manager is designed to simplify how IT outsourcing companies track virtualized data centers used in outsourcing engagements, and accurately bill each of their clients. The software allows for virtualization of shared servers, and it eliminates the need for outsourcing companies to provide separate servers to each customer to help meet service level agreements.

The software is also aimed at individual companies who manage their own virtualized IT by providing a simple way to bill internal departments that use shared computing resources. For example, a marketing department may have increased usage during a particular month because of a special promotion, and the IT department can accurately bill them for increased computing needs.

The IBM Tivoli Usage and Accounting Manager is currently available through IBM or IBM Business Partners for IBM x86 servers and the mainframe. The IBM Tivoli Usage and Accounting Manager will be available for System p server later this year. Pricing for the IBM Tivoli Usage and Accounting Manager for System X begins at $599 per server and $75,000 for mainframe customers in the United States…
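Under the hood, this kind of usage-based chargeback reduces to metering consumption per department and applying internal rates. The sketch below illustrates that arithmetic only; the record layout, rates and function names are hypothetical and do not reflect Tivoli's actual data model or API.

```python
from collections import defaultdict

# Hypothetical metered usage records: (department, resource, units consumed).
usage_records = [
    ("marketing", "cpu_hours", 420.0),   # spike from a special promotion
    ("marketing", "gb_storage", 150.0),
    ("finance", "cpu_hours", 80.0),
    ("finance", "gb_storage", 300.0),
]

# Hypothetical internal rates per unit, in dollars.
rates = {"cpu_hours": 0.12, "gb_storage": 0.05}

def monthly_bills(records, rate_table):
    """Aggregate metered usage and price it per department."""
    bills = defaultdict(float)
    for department, resource, units in records:
        bills[department] += units * rate_table[resource]
    return dict(bills)

for department, amount in monthly_bills(usage_records, rates).items():
    print(f"{department}: ${amount:.2f}")
```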

Parallels launches Release Candidate 2 for Desktop for Mac OS X and includes Compressor

Quoting from the Parallels official announcement:

Parallels today announced the immediate availability of the release candidate 2 (RC2) of Parallels Desktop for Mac, the first virtualization software that gives Apple users the ability to simultaneously run Windows, Linux or any other operating system and their applications in a stable, secure, high-performing virtual machine alongside Mac OS X on any Intel-powered Apple computer.

As part of Parallels’ commitment to simplicity, and in response to many user requests, the company has integrated Parallels Compressor Server – a powerful management tool that can reduce the size of Windows 2000, XP or 2003 virtual hard disks by 50% or more – directly into Parallels Desktop.

The enhanced product will be available for sale at $79.99, $150 less than the cost of buying stand-alone copies of Parallels Desktop for $49.99 and Parallels Compressor Server for $179.99. As an extended promotion, Parallels plans to offer the enhanced product at $49.99 for 30 days following its release…

Read the full Release Notes and download it here.

The virtualization.info Virtualization Industry Roadmap has been updated accordingly.

More rumors on Mac OS X 10.5 virtualization

Quoting from Mac OS Rumors:

In recent weeks, the core feature set and low-level changes to the Mac OS X codebase have been firmed up in preparation for focused efforts to produce a “WWDC Preview” release in early August to be shared with developers in attendance of Apple’s World Wide Developer Conference (Aug. 7-11).

This Preview Release will not include all of the high-level features and extra software that will be present in the final release due out next summer; nor will it be anywhere near production quality in terms of hardware support or crash-free reliability.

Apple is soon to introduce its “Mac Pro” line, which will sport Intel’s “Conroe” desktop Core 2 Extreme processors with up to two four-core processors for a total of eight CPUs.

Core OS changes: Although many details of the new Darwin/Core OS changelog are heavily embargoed for obvious reasons, with the WWDC Release still being two months away and the final Leopard release still a year off, we can summarize a few of the less closely guarded improvements being made, according to sources in Cupertino:

  • Simultaneous (i.e. not dual-boot) operating system virtualization technology derived from quiet efforts in this area at Apple over the past five years will allow Leopard owners to run OS X, Windows, Linux, Solaris and other operating systems simultaneously with near-native performance and no need for third party software. This may help explain Microsoft’s lack of interest in developing VirtualPC for MacIntel…

Read the whole article at source.

Double-Take and PlateSpin partner to protect entire Windows servers

Quoting from the PlateSpin official announcement:

The recovery of an entire Windows® server has never been an easy task. Today, Double-Take® Software and PlateSpin formed a partnership to offer centralized backup of entire Windows servers with a full recovery solution that is fast, flexible and hardware-independent. Customers using Double-Take by Double-Take Software coupled with PlateSpin PowerConvert experience continuous, whole-server protection and bare-metal recovery.

The complementary solution from Double-Take Software and PlateSpin is optimal for disaster recovery because it creates a copy of an entire server, including the OS, applications and data. The server and its applications can be replicated locally or across long-distance WAN connections, to a central backup repository. Systems can then be quickly restored to the same or different hardware, or even to a virtual machine. The simple recovery process leverages Double-Take real-time protection capability for a continuously updated copy of the data and PlateSpin PowerConvert’s OS Portability technology which provides hardware independent restore, eliminating the need to reinstall or reconfigure a failed system.

Traditional imaging or backup solutions require recovery to identical hardware or data protection involving point-in-time based image captures. This results in out-of-date image archives requiring administrators to manually reinstall and reconfigure the server’s OS and applications before recovery can occur. The time to recovery can usually take hours and even days in some situations…
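The key contrast here is between point-in-time image captures and continuous replication, which propagates each change as it happens. The loop below is a deliberately naive, file-level sketch of the continuous approach, with hypothetical paths; products like Double-Take replicate at the block level, hook writes in real time and work across WAN links.

```python
import shutil
import time
from pathlib import Path

SOURCE = Path("/srv/protected")      # hypothetical protected server data
REPLICA = Path("/backup/replica")    # hypothetical central repository

def replicate_changes(last_seen: dict) -> dict:
    """Copy any file whose modification time changed since the last pass."""
    for path in SOURCE.rglob("*"):
        if path.is_file():
            mtime = path.stat().st_mtime
            if last_seen.get(path) != mtime:
                target = REPLICA / path.relative_to(SOURCE)
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(path, target)   # propagate the change
                last_seen[path] = mtime
    return last_seen

state: dict = {}
while True:
    state = replicate_changes(state)
    time.sleep(1)   # near-real-time; true CDP hooks writes instead of polling
```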

Nearly 60 percent of the Fortune 500 is virtualizing

Quoting from World Peace Herald:


Experts tell UPI’s Networking that more than 45 percent of servers in corporate networks purchased in the coming year will be “virtualized,”

According to another research firm, New York-based TheInfoPro Inc., nearly 60 percent of Fortune 500 companies are now in the process of “virtualizing” their servers, and another 30 percent are developing plans to do so. That means nearly all large companies are pushing the trend.

The market, consequently, for software to service this movement is big, and will be $15 billion by 2009, according to research from International Data Corp…

Read the whole article at source.

VMware wins Network Computing Well-Connected Award

Quoting from the VMware official announcement:

VMware, Inc., the global leader in virtual infrastructure software for industry-standard systems, today announced that VMware GSX Server has won Network Computing’s 12th Annual Well-Connected award, taking top honors in the Virtual Machine Software category.

VMware GSX Server, first introduced five years ago, enables users to quickly provision new server capacity by partitioning a physical Windows or Linux server into multiple virtual machines. The successor to VMware GSX Server is the recently-introduced, feature-packed, entry-level VMware Server.

VMware Server is the first commercially available server virtualization product with support for 64-bit virtual machines and support for Intel Virtualization Technology. In addition, VMware Server supports Virtual SMP, a technology that enables a single virtual machine to span multiple physical processors…

Wikipedia reviewers censor virtualization.info

I’m very sorry to soil this technical blog with a polemic, but there are things I dislike and that I think should be mentioned.

As you know, Wikipedia, the self-declared Free Encyclopedia, offers an impressive amount of information on an impressive number of topics.
Anyone can start a new topic or expand and correct an existing one. For every topic there is also the chance to add one or more external links that contributors consider relevant for readers.

To avoid an uncontrolled amount of spam, Wikipedia counts not only on occasional readers’ contributions but also on a number of volunteer users who act as reviewers, systematically analyzing page modifications and removing undesired content.
Since Wikipedia declares itself free, meaning that everybody is free to contribute, removing added content should be acceptable only when it represents spam or an evident error.

I, probably not so humbly, consider virtualization.info (which has existed and covered modern virtualization since well before the world turned its head and started paying attention), the Virtualization Industry Roadmap and the What is Virtualization webcast relevant to every virtualization topic.
So several months ago, like other bloggers and vendors, I proceeded to add these three entries to many Wikipedia virtualization topics as External Links. I acted in good faith, and I thought and still think it was a good idea.

A week ago two different users, at two different moments, deleted my three entries from everywhere, judging the mentioned virtualization.info content to be spam.
The users are Wmahan and Discospinster.

While trying to re-add my content, I received a message from the latter explaining that Wikipedia is not a place to post your own website or spam, but that I was free to add more to the topic.

Now I have two problems, which I pose as questions:

  • if the reviewers took just a few minutes to analyze the links I added as External Links, they would agree that virtualization.info content adds value and is not spam. Unless they think they have better judgment than 100,000 users per month…
  • in some cases my links were deleted while others, even identical in name, are still there.
    I’m talking in particular about the link to What is Virtualization: mine was deleted while the ones provided by ZDnet or KernelLinux remain, for example in the topic Virtualization.

Why? I immediately thought it depended on the advertising present on this blog, but after a quick check I found that both of the contents spared from censorship carry the same or an even greater amount of advertising.
So what are the criteria? Do I have the right to ask and receive an answer?

The latter reviewer answered me as follows:

These links were taken off the articles because they weren’t directly related to the subject. Most of the articles were about specific brands of virtualization technology, so the external links were specific as well. However, in the general article Virtual Machine, the link to the blog is relevant.

This answer has three problems:

  • it doesn’t answer the question of why similar content (in some cases with the same name) is still there while mine was removed;
  • the What is Virtualization webcast seems to me pertinent enough to both the Virtual Machine topic and the Virtualization topic, yet it’s not available in either;
  • the Virtualization Industry Roadmap seems to me pertinent to any virtualization topic, including the Wikipedia pages about virtualization vendors.

I asked for further explanations but received none.

So the questions now are: is Wikipedia really free? Are its reviewers really competent about what they edit?
Nicholas Carr recently expressed a very interesting opinion on this.

At the end of the story, virtualization.info is still not present in Wikipedia (I refuse to re-add the links until I receive a satisfactory explanation for the censorship), and I have personally changed my opinion of the project, considering that what happened to the virtualization topics could happen to any listed topic, influencing users’ interests in every way but a free one.

Update: the discussion has continued at length in this post’s comments, so I suggest you read them before reading further.

One or more passionate virtualization.info readers, after reading this post, re-added my links to the Virtualization topic, but once again one of the mentioned censors (Wmahan in this case) removed the link:

Your opinion about the site’s relevance would be more credible if it didn’t appear to be self-promotion. Your actions make it difficult to assume you take a neutral point of view: you added links to many articles, you added multiple links within a single article, and you didn’t contribute any new information on the subject, as far as I can tell.

If you have a vested interest in the site, it might be better to wait and let someone else add links to it, if they find it relevant.

Also, you said above that the virtualization articles “are incomplete or totally missing.” Any help you can provide in improving the articles is welcome. — Wmahan. 18:11, 24 May 2006 (UTC)
Now User:XXX.XXX.XXX.XXX is continuing to add links to virtualization.info without discussion.

Now this thing is getting comical:

  • This time I didn’t add the links myself; someone else did it for me (even if the submitting IP address is the same, that doesn’t mean I’m the submitter…).
    Why, if someone else from my same network adds links pointing to my site, am I still not credible?
  • Why do I have to add content to articles in exchange for placing External Links? Did the other external link providers do so? What if (and this is exactly the case with the Virtualization Industry Roadmap) I believe my content is valuable and pertinent but cannot fit inside the article itself?

The main point the reviewers still have to address is not why I’m not allowed to post (which is debatable), but why my links were removed while others are still there.