Oracle to announce Sun merger plans next week

Next week is going to be one of the most important weeks of 2010, at least in terms of announcements.

The consumer side of the IT world is going to enjoy the Apple presentation that will take place on Jan. 27, where Steve Jobs is expected to launch the iTablet/iPad or whatever he decides to call it.
The business side of the same IT world, instead, is probably expecting a significant announcement from VMware, Cisco and NetApp at their joint presentation on Jan. 26, despite the absence of Cisco CEO John Chambers (who probably doesn't want to repeat the show staged to launch the VMware-Cisco-EMC alliance).

But next week there's another, even more important presentation that corporations may want to attend: Oracle's announcement of its strategy for the Sun assets.

Oracle announced the acquisition of Sun in April 2009. The US Department of Justice approved it just four months later. The European Union, instead, took until today to approve the $7.4 billion deal without conditions.

Read more

Is over-capacity inevitable in cloud computing?

A lot of discussion is going on these days around the performance issues that Amazon customers are suffering with the Elastic Compute Cloud (EC2).

The discussion was triggered by Alan Williamson, a prominent voice in the Java community, who posted an interesting description of his three years of experience with EC2.
Williamson suggests that Amazon is allowing EC2 over-subscription to the point that the cloud is so crowded it generates serious latency in the internal network, which impacts the performance of any multi-tier application spread across multiple virtual machines.

Another Amazon customer, David Mok, CTO at OleOle.com, suggests instead that the overall performance degradation depends on differences (in the CPU) in the physical hardware underneath the cloud, differences that the cloud platform, Amazon's implementation of Xen, is unable to fully abstract away.
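Both hypotheses can be sanity-checked from inside a running instance. The sketch below is purely illustrative and not part of either analysis: it reads the CPU model the hypervisor exposes to the guest and times TCP connections to a sibling instance; the hostname app-tier.internal is a placeholder, not a real Amazon name.

# Illustrative sketch: inspect the CPU model exposed to the guest and time TCP
# connections to another virtual machine in the same deployment.
# "app-tier.internal" is a placeholder hostname, not a real Amazon endpoint.
import socket
import time

def cpu_model(path="/proc/cpuinfo"):
    """Return the CPU model string visible to the guest (Linux only)."""
    with open(path) as f:
        for line in f:
            if line.startswith("model name"):
                return line.split(":", 1)[1].strip()
    return "unknown"

def avg_connect_ms(host, port=22, samples=5):
    """Average time, in milliseconds, to open a TCP connection to another VM."""
    total = 0.0
    for _ in range(samples):
        start = time.time()
        conn = socket.create_connection((host, port), timeout=5)
        conn.close()
        total += time.time() - start
    return (total / samples) * 1000

if __name__ == "__main__":
    print("CPU exposed to this instance:", cpu_model())
    print("Avg connect time to app tier: %.2f ms" % avg_connect_ms("app-tier.internal"))

Identical instance types reporting different CPU models would support Mok's theory, while consistently high connect times between tiers would point to the network contention Williamson describes.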

Read more

VMware releases Python and Java open source SDKs for its vCloud APIs

In September 2008 VMware announced the future release of a set of APIs to manipulate vSphere-based cloud computing infrastructures.

More than a year later the vCloud APIs are still at version 0.8 (Citrix is not doing much better with the Xen Cloud Platform) and only five hosting providers in the world are using them, through a beta implementation of vCloud Express.

Nonetheless, VMware is working hard to make sure that its vision of cloud computing is the one the industry at large will embrace.

First of all, the company submitted the APIs to the Distributed Management Task Force (DMTF) in September 2009.
VMware has held a very relevant position in this standards body since June 2008, when it hired DMTF President Winston Bumpus as its Director of Standards Architecture.
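Under the hood the vCloud APIs are plain REST calls over HTTPS, so developers who don't want to wait for the SDKs to mature can poke at an endpoint directly. The fragment below is a minimal sketch only: the base URL is a placeholder and the /api/versions path is an assumption borrowed from typical REST layouts, not a confirmed detail of the 0.8 specification.

# Minimal sketch of querying a vCloud endpoint directly over HTTPS.
# The base URL is a placeholder and the /api/versions path is an assumption,
# not a confirmed part of the vCloud API 0.8 specification.
import urllib.request

VCLOUD_BASE = "https://vcloud.example.com"

def list_supported_versions():
    """Fetch the XML document describing the API versions the endpoint supports."""
    with urllib.request.urlopen(VCLOUD_BASE + "/api/versions") as response:
        return response.read().decode("utf-8")

if __name__ == "__main__":
    print(list_supported_versions())

The point of the new Python and Java SDKs is to hide this kind of plumbing behind proper language bindings.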

Read more

Xen Cloud Platform alpha expected for early February 2010

The Xen Cloud Platform (XCP), announced in August 2009, is the Citrix answer to the VMware vCloud initiative that a few hosting providers are implementing worldwide.

The first XCP implementation (version 0.1) emerged in November 2009.

This week the Xen.org community announced a small step forward that moves the platform to version 0.1.1, which includes a number of improvements.
The platform is now based on Xen 3.4.2 and its Dom0 is now based on CentOS 5.4.

The most important news, anyway, is that the team expects to deliver the alpha no later than early February.
That's good news, but at this pace customers won't have anything concrete (like an XCP 1.0 GA) before next year. And considering that the VMware partners have frozen their vCloud Express implementations in an "unlimited beta" status, maybe we should all reconsider the idea that 2010 is the year of private clouds. More likely 2012.

Lanamark extends its offering to include capacity planning as a service

Lanamark is a Canadian startup that entered the capacity planning market in June 2008. It is led by Mark Angelo, who comes from PlateSpin and VMLogix.

Unlike most of its competitors, Lanamark offers a hosted capacity planning suite that is available only to its partners.
VMware is the only other company in this space that restricts the use of its hosted Capacity Planner to the channel.

So far the Lanamark platform has been available only on a per-project basis: a PSO needs access to the suite to complete a certain project, such as a desktop virtualization assessment, and is charged accordingly. Any new activity at the same customer's site requires starting a new project.

Read more

5nine launches Migrator for Hyper-V 1.0 beta

The startup 5nine Software, which emerged from stealth mode in June 2009, announced its third product: Migrator for Hyper-V.

Its first solution, P2V Hyper-V Planner, is a physical-to-virtual migration engine that also performs some capacity planning, comparing the consolidation plans you would obtain with VMware or Microsoft hypervisors.

Migrator is a superset of P2V Hyper-V Planner that attempts to merge capacity planning, P2V migration and ongoing placement optimization into a single product. It makes a lot of sense.

Read more

Microsoft integrates virtualization capabilities in System Center Essentials 2010

Along with System Center Data Protection Manager (SCDPM) 2010 and Visual Studio Team System (VSTS) 2010 Lab Management, Microsoft is about to release another product that will have extended virtualization capabilities: System Center Essentials (SCE) 2010.

Essentials (not to be confused with Citrix Essentials, which is also a management suite for Hyper-V) is a bundle of multiple System Center products, integrated through a single management console, that supports up to 50 servers and 500 clients.

The current version, SCE 2007, merges System Center Operations Manager (SCOM) 2007 and Windows Server Update Services (WSUS) 3.0.
It doesn’t have any capability to manage Hyper-V because it doesn’t include the System Center Virtual Machine Manager (SCVMM) engine.

Read more

Microsoft helps Visual Studio 2010 developers to fully automate their labs with VM Factory

As most virtualization.info readers know by now, Microsoft is finally approaching .NET developers with a virtualization-friendly edition of its upcoming IDE, Visual Studio 2010.

The product will be called Visual Studio Team System 2010 Lab Management, and will integrate Hyper-V R2 and System Center Virtual Machine Manager (SCVMM) 2008 R2 to offer a virtual lab automation platform that competes against products like VMware Lab Manager, VMLogix LabManager, Surgient Virtual Automation Platform and others.

Microsoft took forever to leverage its huge MSDN community to let Hyper-V slip into new customers' sites.
Ironically, the company is doing so right when VMware, which rules the developer world thanks to Workstation, seems to have lost interest in it.

On top of the new Visual Studio 2010, which may be released in Q2 2010, Microsoft recently released another tool called VM Factory:

Read more

Tech: How to automatically protect new Hyper-V virtual machines with Data Protection Manager 2010

One area where Microsoft is extremely weak in the virtualization realm is VM backup and restore.

The market doesn’t offer many choices to Hyper-V customers, and even Microsoft’s own enterprise disaster recovery solution, System Center Data Protection Manager (SCDPM) 2007, leaves much to be desired.

Things are slowly changing. In part because existing partners like NetApp are more committed to releasing products that support Hyper-V now that Microsoft is gaining some concrete market share.
In part because the VMware strategy is pushing even its most loyal partners into Microsoft’s arms.
In part because Microsoft is working on better integration between Hyper-V and most of its other products.

Read more

Microsoft and HP agree to jointly invest $250M over the next 3 years. For what?

Yesterday Microsoft and HP announced a 3-year agreement to spend $250M on several fronts: Hyper-V and System Center, Windows Azure, Exchange, SQL Server and more.

The problem with announcements like this (see the 3-year alliance with EMC for another example) is that very few people (not to say nobody) really understand what the difference is between before and after the deal.
The language used in the press announcements never helps.

Microsoft and HP are already very good partners. Customers expect the option to have Microsoft products inside their brand new HP servers at purchase time, because it has been that way for many years.
So this deal requires some clarification (of course we’ll focus on Hyper-V, System Center and Azure):

Read more