Sage Research finds significant customer familiarity with hardware-assisted virtualization

Quoting from ComputerWorld:


Moreover, other results of the survey, which involved 265 IT decision-makers at companies with 500 or more employees, show that organizations that are already using server virtualization or that are interested in the technology are doing so mostly to increase efficiency and utilization of their servers (84%) and lower data center costs (72%) — the basic and successful marketing mantra espoused by current market leader VMware Inc.

Eighteen percent of respondents said they were “very familiar” with chip-assisted virtualization technology, such as Intel Corp.’s VT or Advanced Micro Devices Inc.’s AMD-V. Meanwhile, a third of respondents claimed to be “very familiar” with either hardware-assisted virtualization, which promises to offer faster performance than VMware’s more software-heavy approach, or open-source virtualization software such as Xen, Open VZ or Virtual Iron…

Read the whole article at source.

It’s notable to see so much familiarity with hardware-assisted virtualization when VMware just reported it doesn’t improve performance at all. Quite the contrary.

Stream Theory obtains another software streaming patent

Quoting from the Stream Theory official announcement:

Stream Theory, Inc., a leading developer of patented virtualized software delivery and digital rights management solutions, today announced it has been awarded U.S. Patent number 7,096,253 for “Method and Apparatus for Streaming Software.” The patent, which covers core technology related to streaming remotely located software programs and data to a local computer, underpins the Company’s worldwide leadership role in virtualized software delivery…

Stream Theory is the company that filed a lawsuit against Softricity (and AppStream and Exent) immediately after its acquisition by Microsoft, for infringement of its patent number 6,453,334.

This new patent could further strengthen its position against Microsoft and other vendors working on streaming technologies.

The risk of using free virtualization products

Since the launch of VMware Player, the first free desktop virtualization solution, and Microsoft Virtual Server, the first free server virtualization solution, the IT world has never been the same.
A revolution in the way we think about computing resources has started, and it will greatly accelerate now that VMware Server 1.0 and Microsoft Virtual PC 2004 have also been released as free products.

At this moment the IT world is being shaken by two concurrent phenomena: on one side, server virtualization technology itself; on the other, the fact that virtualization became completely free before it ever had the chance to become mainstream.

While free virtualization is a huge benefit for the whole industry, obtaining it so fast could introduce a lot of problems.

The problem
The problems free virtualization could raise in the coming years mainly depend on three factors: the complexity of the technology, its critical role in business, and its ease of adoption.

A virtualized datacenter involves new challenges, and IT staff have to handle technical incompatibilities, performance penalties, gaps in product support, interoperability, accountability, and much more.
Professionals and companies have not had enough time to become truly expert at handling all of this in the new scenarios. There are many aspects still to be fully understood and much experience to collect before reaching the level of confidence we have today with physical servers.

While desktop virtualization is widely diffused but has a limited impact on the way business services are offered, server virtualization completely changes the approach to the datacenter, from hardware purchasing to resource management.
And while desktop virtualization is a technology companies can decide to forgo at any moment if it doesn’t meet expectations, server virtualization is, most of the time, an adoption with no way back.

The fact that today’s free solutions were commercial products yesterday, advertised as enterprise-grade, implies that companies from small businesses to large enterprises will embrace them, both because they cost nothing and because they are trusted as reliable. And when a much-desired technology suddenly becomes free, a mass of professionals approaches it, with or without the required knowledge.

Where’s the risk? The biggest one is for small and medium companies, which naturally see in free server virtualization the biggest opportunity to lower costs.
In these environments, the time and budget allocated for IT staff training, outside consulting, and testing are small or non-existent, and technologies often end up thrown into production without adequate skills and experience.

Here the complexity of the technology comes into play, along with multiple factors that can compromise a virtualization project: poor capacity planning, superficial host and guest OS configuration, missing policies for virtual machine provisioning, lack of knowledge of needed third-party tools, poor investigation of supported configurations. All of these lead to disappointing performance, virtual machine sprawl, and increased management effort.
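Capacity planning in particular is often reduced to guesswork. As a purely illustrative sketch (the host specifications, per-VM requirements, and headroom factor below are hypothetical examples, not vendor sizing data), even a back-of-envelope estimate of how many virtual machines a host can support forces the planner to identify the scarcest resource:

```python
# Back-of-envelope consolidation estimate (illustrative only).
# All figures are hypothetical examples, not vendor sizing data.

def max_vms(host_cpu_ghz, host_ram_gb, vm_cpu_ghz, vm_ram_gb, headroom=0.25):
    """Estimate how many identical VMs fit on one host, reserving a
    fraction of resources as headroom for the hypervisor and load spikes."""
    usable_cpu = host_cpu_ghz * (1 - headroom)
    usable_ram = host_ram_gb * (1 - headroom)
    # The scarcer resource dictates the consolidation limit.
    return min(int(usable_cpu // vm_cpu_ghz), int(usable_ram // vm_ram_gb))

# Example: a host with 6 GHz of total CPU and 16 GB of RAM,
# hosting VMs that each average 0.5 GHz and 1 GB.
print(max_vms(6.0, 16.0, 0.5, 1.0))  # CPU-bound in this case: 9 VMs
```

Real capacity planning must also account for storage and network I/O, peak versus average load, and failover capacity; that is exactly the analysis that gets skipped when free products are rushed into production.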

Such bad results will not only translate into significant money spent to correct deficiencies or revert back to physical servers, but will also become the reason companies stay away from virtualization as much as possible, believing the technology is much less useful and reliable than expected.

At the end of the day, surrendering to the mirage of a complex solution such as server virtualization being available at no cost will damage companies in the short and medium term.

Future trends
It’s almost certain that server virtualization will remain free, will extend to datacenter-class solutions (still a profitable part of vendors’ offerings), and will become pervasive, included in every operating system.

The biggest contribution in this direction will come from Microsoft, which announced it will embed a new virtualization technology called Windows Server Virtualization inside upcoming versions of its server operating system, codenamed Longhorn.

Within two years or little more, virtualization as a commodity will appear in millions of installations, becoming a de-facto standard in datacenter architectures.

Investing in training or consulting today is not just a way to ensure free virtualization delivers its supposed benefits; it’s also a way to build knowledge and stay ahead of the competition in the near future.

Conclusion
Free virtualization could appear to be a very simple technology solving very complex problems, and this appearance could lead companies to consider an investment in training or outside help unnecessary.

The reality is that today’s virtualization is very hard to handle and requires new capabilities most IT staff don’t have.
Companies adopting free virtualization too casually could hit blocking issues at a point in the project where correcting course or reverting back to physical servers results in a big waste of money.

This article originally appeared on SearchServerVirtualization.

Release: Scalent V/OE 2.0

Quoting from the Scalent official announcement:

Scalent Systems today announced general availability of Scalent™ Virtual Operating Environment (V/OE) version 2.0, the industry’s next-generation server infrastructure repurposing technology.

Serving as virtualization middleware, V/OE enables data center operations owners to rapidly change entire systems and associated topologies: which servers are running, what software is running on them, and how they’re connected to network and storage, without altering physical infrastructure.

Scalent V/OE 2.0 extends Scalent’s broad hardware and OS support, with the introduction of additional enterprise extensibility, including:

  • Support for Solaris 10 on x86 and SPARC
  • Support for enterprise-class bladed Ethernet switches (for example, the Cisco 65xx)
  • Addition of a programmatic interface for third-party systems integration

Virtual Strategy Magazine also published a podcast about this release with the company’s Vice President of Marketing, Kevin Epstein.

The virtualization.info Virtualization Industry Roadmap has been updated accordingly.

Podcast: TechNet Radio interviews Jeff Woolsey of Microsoft

Channel 9 published a TechNet Radio interview with Jeff Woolsey, Lead Program Manager Windows Virtualization at Microsoft, about the upcoming Windows Server Virtualization (WSV).

Listen to the whole interview at source.

If you missed the WinHEC 2006 presentation, be sure to check the webcasts and the virtualization.info Q&A about Windows Server Virtualization with Mike Neil, Virtual Machine Technologies Product Unit Manager at Microsoft.

Ron Oglesby on hypervisors’ future ubiquity

Brian Madden published an article by Ron Oglesby about the medium-term future of virtualization.
Ron focused on which upcoming change in virtualization could further revolutionize the IT world, predicting it will be the hypervisor’s binding to hardware and its ubiquity in desktop and server machines:


Right now I believe that the real race going on in the virtualization space isn’t about who can Vmotion or support four processor VMs, etc. The real race is about who has the first lightweight fully integrated hypervisor that is OEM’ed on servers and desktops.

The Future is a thin layer that is OEM’ed that can work with and control all these devices. It will not be as bulky as any Windows or Linux OS you have ever seen and will more closely resemble a glorified piece of firmware that boots and starts dividing up resources to whatever number of VMs you have running on the machine. Of course it will still have some type of interface while the server and its VMs are running, but it will be extremely lightweight and self-sustaining. This will come with every x86 server and desktop. What you will buy is not the hypervisor but the management tools that wrap around it…

Read the whole article at source.

Whitepaper: Roadmap to Virtual Infrastructure: Practical Implementation Strategies

VMware published a very interesting paper about the steps CIOs should take to gradually implement virtualization technologies in a company, from the pilot program to adoption in the production environment:


We cover organizational charter, stakeholder buy-in strategies to ease the common non-technical resistance that can affect virtualization rollouts. We also highlight key areas of IT infrastructure and operations most impacted by virtualization.

We include some actionable next steps and templates for how to build an effective virtualization support team, assess readiness of your organization to adopt virtualization, and scope initial projects to help ensure success and develop your organization’s capabilities for broader virtualization deployment…

Read the whole paper at source.