Cloud Computing Trends: An interview with Steve Herrod

Posted by virtualization.info Staff   |   Tuesday, October 7th, 2014

As the majority of virtualization.info readers already know, Steve Herrod is one of the most influential CTOs, not only for VMware but for the entire virtualization and cloud computing landscape.

Steve joined VMware as one of its very first engineering directors in the early days of the company and left in February 2013. He is currently Managing Director at General Catalyst Partners, a venture capital firm focused on early-stage and XIR (Executive in Residence)/growth investments.

Having the opportunity to speak with a professional of his caliber isn’t something that happens every single day and, being the proper analyst I claim to be, I came prepared with an outline and five precise, well-researched and obviously clever questions.

Eventually we ended up with a nice informal talk, touching on about two million different topics and jumping from one to another, ignoring any semblance of the good intentions I had at the beginning.

What I’ve done in this article is reassemble our conversation after the fact so that it fits my original questions (how convenient!). In the end, what you will read here is more the result of an exchange of ideas than straight answers.

Can you give us a general opinion about the current Cloud Computing Trends in your experience?

Companies want the ability to choose where to deploy their workloads among on-premises, public and hybrid environments. The path that many companies are now following is to build a standardized infrastructure layer on which to offer value-added services. In this evolution, customers, especially enterprises, have to plan for the co-existence of traditional and cloud-oriented IT because, for example, we cannot expect legacy applications to run inside containers, just as they cannot run in a cloud computing consumption model. The co-existence of the old world and this new world should be handled by a management tool that can provide a single pane of glass across all environments.

Presumably, in the future we will see different kinds of workloads, some running inside containers and others on virtual machines.

One interesting thing is that in the past you never heard IT-related topics discussed in a board meeting. Today, cloud computing has drawn more attention to these topics because everyone perceives himself as a consumer of this kind of technology.

We know that enterprise adoption is crucial for a technology to become established. What are the main challenges or frictions that you have seen among your enterprise customers in adopting a software-defined approach or, more generally, a private/hybrid/public cloud strategy?

The problems you can encounter in an enterprise are mainly three:

  • Costs: both implementation and running costs
  • Developers becoming more and more central: IT operations now have to redesign their processes in order to provide developers with the tools that enable them to work at their best
  • Security: both in terms of infrastructure security and data sovereignty

Today, these kinds of issues are discussed not only within IT but are also relevant at a business level, and on some occasions even at the top management level, thanks to the rising adoption of cloud computing models.

What do you think about OpenStack? What challenges do customers face in implementing it?

The two main problems that OpenStack faces today are its complexity and the difficulty potential customers have in identifying a leader among OpenStack providers, from both a technological and a vision point of view.

The vast majority of OpenStack implementations require a lot of customization, and companies, especially those whose core business is not technology, don’t want to bear the cost of that customization. They would probably prefer a ready-to-install product, as long as it retains a modular nature that can follow their specific needs.

From a leadership perspective, everyone is now betting on Red Hat, but there is still a bit of confusion, and enterprise-grade companies in particular tend to be wary.

Generally speaking, OpenStack suffers from the “Kardashian syndrome”: being famous for being famous. Most of the implementations actually claimed are for testing purposes only, but it is also true that VMware itself now recognizes the potential behind this technology.

The VMware OpenStack distribution was born to support best-of-breed virtualization and SDN technologies, in other words vSphere and NSX, and will be offered as an option, not necessarily in competition with vCloud. VMware Integrated OpenStack will offer a shallower integration with VMware’s proprietary technologies but will protect customers against vendor lock-in.

Open source is gaining more and more traction among vendors; this trend emerged even at VMware during VMworld this year. What is your opinion?

Most of the “new world” applications gravitate around Linux and, more generally, the open source world. From a vendor’s perspective, there is a growing trend of moving all the components considered “commodities” to a “community-developed” model.

We also have to consider that an open source model gives the advantage of closer cohesion with the developer community, which in turn helps a vendor build the products the community wants, the way the community expects them.

From a business model perspective, there are three revenue streams, and there is nothing new in the pipeline: advisory and support on open source products, the provisioning of product packages based on open source technologies, and training and certification on open source technologies. That said, the question from an economic perspective is who benefits most from the open source model. Is it the companies that provide services built on top of open source technologies? They can benefit from the support provided by the community, yet they have no obligation to contribute back.

A very hot topic has emerged in recent months: VMs versus containers. What is your opinion? Do you see a possible co-existence, at least at the beginning, or more likely parallel paths for the two technologies?

Few projects have had a vertical take-off like Docker; maybe OpenStack, but probably more as a brand than as a truly adopted technology. What people like about Docker is the simplicity of its basic concepts (as the name suggests with its shipping-container analogy) and its ability to standardize applications (fulfilling one of the key requirements of the cloud computing deployment model), along with the ease of moving applications from your own laptop (or staging environment) to production while leveraging all the characteristics of the deployment environment.
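To make that standardization idea concrete, here is a minimal, hypothetical Dockerfile for a small Python web service (the app name and file layout are illustrative, not from the interview). The same image definition builds and runs identically on a laptop, a staging server, or a production cluster, which is exactly the portability being described:

```dockerfile
# Hypothetical example: package a small Python app into a portable image.
FROM python:3-slim

# Copy the application and its dependency list into the image.
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY app.py .

# The container always starts the app the same way, wherever it runs.
CMD ["python", "app.py"]
```

Built once with `docker build -t myapp .`, the resulting image can then be run anywhere Docker is available with `docker run myapp`, with no environment-specific setup baked into the host.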

As is often said, virtual machines and containers are different technologies that answer different needs; in a medium-term scenario (3 to 5 years) we will probably see both in the IT world.

We also have to consider that containers are a different type of virtualization: OS-level virtualization, as opposed to the hardware virtualization represented by VMs. And while it is true that containers have proven to perform better on bare metal, it is absolutely meaningful to think of a strategy based on containers running on top of VMs in order to add more management layers.