Companies that develop software, whether for internal use or for customers, know how complex, expensive and time-consuming releasing a new product can be.
Development team members have to work on code independently and share it when needed, building and rebuilding it on the same or on different environments, while QA engineers have to test it across multiple configurations and scenarios, all the way to the final deployment in production, where several factors are out of control and can undermine stability and reliability.
IT managers have traditionally had little or no ability to mitigate these technical issues and smooth the release path. But server virtualization changed everything, becoming one of the first choices for accelerating the process.
In this article we'll see how deploying a VMware virtual infrastructure can remove most of the problems a development department encounters, raising its capability to deliver new products to new, unexpected levels.
VMware is not the only company offering virtualization solutions, but its wide range of products and its capability to seamlessly migrate a virtual machine from one platform to another make it the best choice for this scenario.
Typical problems
The very first issue in software development is environment integrity.
For many software engineers it is normal to keep their development tools on their everyday workstation. When a new project starts, the large majority of them simply begin coding on the same environment they use for browsing the web, reading email, watching videos or presentations, and often even gaming.
Such systems should be perfectly clean, like the fresh installation on which customers are supposed to host the product we are developing. Unfortunately, this is rarely the case.
Daily use for so many tasks implies a lot of installed software, which injects libraries, requires high-level permissions, modifies environment variables and so on. And that is without considering possible malware infections.
Developed code could easily run, or fail to run, because of these variations; moving it to different machines, where operating systems have been compromised in different ways, will produce different results, leading to complex and time-consuming troubleshooting.
Another issue that frequently slows down complex projects is environment portability.
Software architects, engineers and product managers have to verify how a product is evolving throughout the development process, or have to collaborate on it to improve or debug its routines.
Gathering many people around the same monitor, or granting remote access to the development environment, is highly impractical. On the other hand, moving code from one operating system to another is not simple either.
This is not only a matter of environment integrity, which cannot be guaranteed in any way, but also the simple fact of delivering all the parts needed to run a piece of code.
Any application based on database access, for example, is very hard to share if the developer, as often happens, has installed the database instance on his own machine or relies on a remote instance on a dedicated network segment that not everybody can access.
But even without a database, the development team could need libraries or, in the case of multi-tiered applications, other services which are not moved along with the code.
A third typical problem is the lack of physical resources.
When software engineers are savvy enough not to rely on locally installed services, they need remotely accessible services, which have to be deployed, configured and announced to the team.
This requires time but, most of all, implies machine availability, which cannot be taken for granted.
Similarly, it often happens that hardware configurations have to be modified during development, adding for example more memory or another network interface card.
Adding new components can be even more complex in big companies where hardware is acquired in bulk from OEMs.
But the number of machines needed for software development is not limited to those hosting required services. It also depends on how many systems the company wants to support.
QA engineers have to try the same code on several versions of the same operating system to verify that it works as expected in all possible scenarios: with different permission levels in several Windows editions, or with different library availability in several Linux distributions.
A dedicated machine is expected to be available for each of these configurations, and things become very complex when multiple new applications are in development at the same time.
It's worth noting that the lack of physical machines, once solved for developers and testers, can soon turn into a problem for IT managers.
Once the big project is finished, they are left with a certain number of computers which will sit idle until the next one, and which could become obsolete in the meantime, forcing the replacement of some or all of them.
Even with enough resources, software engineers and testing staff still have to face the most frightening risk: running out of time.
A long series of logistical operations can severely slow down development, distracting coders from their focus.
For example, taking environment integrity seriously means always debugging code on a fresh installation, which is impractical unless developers reinstall the whole operating system from scratch every time.
But even without that level of rigor, when the developed code includes an installer it is critical to work on a virgin OS.
Time shortage also affects testers, who not only need multiple physical machines for every platform on which our code has to be certified as working, but also need to reinstall the same operating system several times, perhaps because the last installation failed or simply because they have to test different languages, service packs or concurrent applications.
Fundamentally, every test case should be conducted in a dedicated environment, and this implies a notable effort. Even when disk imaging solutions are in place, they help only to a limited extent, considering restore times and the dependency on the underlying hardware, which could change and require saving a whole new set of images.
Improving the development phase
The most popular and oldest product from VMware is also the most important in the whole solution chain: Workstation.
Workstation offers a wide range of features able to address most of the problems mentioned above.
The probability that a software engineer tries it and still sticks with traditional tools is near zero.
The first problem Workstation addresses is environment integrity: developers and testers can count on the Snapshot feature, which allows saving a virtual machine's state and reverting to it whenever needed.
A savvy use of the Snapshot feature works like this: developers install a brand-new operating system in the virtual machine, fit it with all the tools they need to produce new code, and finally save a snapshot.
This operation guarantees a pure environment, completely isolated from the everyday workstation.
In this scenario a developer is able to browse the internet, read his email and even play games on his own machine without jeopardizing the development workspace.
For maximum security, the virtual machine can be completely disconnected from the physical network card, so there is no chance the workspace can be compromised by a remote attack or virus infection, and there is no hassle of continuously patching the operating system or installing an antivirus to maintain security.
But even if the workspace cannot be compromised, it can still become cluttered with libraries, utilities and other leftovers during a project.
In this case our software engineer can revert to a clean state as soon as the project is closed, simply by recalling the first snapshot, within seconds.
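As an illustration, recent Workstation releases bundle a command-line utility called vmrun that can drive snapshots from a script. The following Python sketch shows how a developer might freeze a clean workspace and later revert to it; the VM path is hypothetical, and it assumes vmrun is installed and reachable on the PATH.

    import subprocess

    VMX = "/vms/dev/dev.vmx"  # hypothetical path to the development VM

    def vmrun(*args):
        # vmrun ships with VMware Workstation; "-T ws" selects the
        # Workstation host type.
        subprocess.run(["vmrun", "-T", "ws", *args], check=True)

    # Freeze the clean state right after installing the OS and the tools.
    vmrun("snapshot", VMX, "clean-workspace")

    # When the project closes, discard everything accumulated since then
    # and bring the pristine workspace back in seconds.
    vmrun("revertToSnapshot", VMX, "clean-workspace")
    vmrun("start", VMX)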
The Snapshot feature is quite evolved, and it is an indispensable tool for QA as well.
When compatibility testing is in progress, testers need to ensure the new application works correctly alongside several different products, from our company or from third parties.
The Snapshot Manager permits saving multiple states of the same virtual machine, allowing testers to install one application after another without reinstalling the whole environment every time.
For example, in a scenario where our new prototype application has to be tested for compatibility with Microsoft Office and several service packs, the best approach is to save a first snapshot of the freshly installed operating system, another after the unpatched version of Office has been installed, and still another once the service pack is in place.
At this point testers are able to proceed with the installation of our code.
If something goes wrong, or if they want to test the same installation with a different service pack, they just need to revert to the snapshot taken before the service pack installation.
Trying to do the same without virtualization, or without a lot of different physical machines, would take hours or days.
This process can be further improved thanks to two other Workstation features: multiple snapshot branches and linked clones.
The multiple snapshot branches feature permits setting an already taken snapshot as the new starting point and taking further snapshots from there.
Linked clones act in a similar way, but detach the new branch into a separate virtual machine that still references the original image.
Both features are particularly useful for QA since they don't copy what already exists in the original virtual machine, but only reference it, recording what will be done from that point on.
To clarify, we can reconsider the previous example: a tester who needs to verify the compatibility of a new application against multiple Microsoft Office versions and their service packs can start by creating a snapshot of the brand-new operating system.
After installing Office 2003 on top of this snapshot, the QA engineer can set the starting point back to the snapshot he already took of the fresh OS, opening a second branch for Office 2000.
At this point he can take a new snapshot on each branch before installing Service Pack 2 on Office 2003 and Service Pack 1 on Office 2000.
Our application can be tested against all these environments, while the Snapshot Manager makes it easy to create and discard snapshots and linked clones.
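A minimal sketch of how such a snapshot tree could be scripted, again through the same hypothetical vmrun wrapper used above; the actual Office and service pack installations happen interactively inside the guest between the commands.

    import subprocess

    VMX = "/vms/qa-windows/qa-windows.vmx"  # hypothetical QA virtual machine

    def vmrun(*args):
        subprocess.run(["vmrun", "-T", "ws", *args], check=True)

    vmrun("snapshot", VMX, "fresh-os")
    # ...install Office 2003 inside the guest, then:
    vmrun("snapshot", VMX, "office-2003")
    # ...apply Service Pack 2, then:
    vmrun("snapshot", VMX, "office-2003-sp2")

    # Go back to the bare OS and grow a second branch for Office 2000;
    # each snapshot branches from whatever state is current, so reverting
    # and snapshotting again builds a tree rather than a straight line.
    vmrun("revertToSnapshot", VMX, "fresh-os")
    # ...install Office 2000 and Service Pack 1, snapshotting each step.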
Snapshots and linked clones not only drastically reduce the time needed to prepare a new development or testing environment, but also address another critical problem we already discussed: the lack of physical resources.
With them, QA engineers no longer need a new machine for every single environment to test, just enough disk space to contain several branches of snapshots and clones.
Another great Workstation feature is Teaming, useful both in development and in testing.
Teaming allows logically linking multiple virtual machines together and launching them at the same time.
It also allows users to create virtual network connections between them, with simulated link speeds.
So, for example, a software engineer developing a multi-tier application can check how his code performs when used over a modem or a broadband connection, and a usability tester can verify how much bandwidth a networked application needs to run without degrading the user experience.
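Teams themselves, including the simulated bandwidth of each virtual network segment, are defined in the Workstation interface rather than from the command line, but scripting can still help bring the tiers up in the right order. A rough sketch under that assumption, with hypothetical paths and the same vmrun wrapper:

    import subprocess
    import time

    # Hypothetical members of a three-tier team, listed in boot order.
    TIERS = [
        "/vms/db/db.vmx",
        "/vms/app/app.vmx",
        "/vms/client/client.vmx",
    ]

    def vmrun(*args):
        subprocess.run(["vmrun", "-T", "ws", *args], check=True)

    # Start each tier headless and stagger the boots so back-end services
    # are listening before the next tier comes up.
    for vmx in TIERS:
        vmrun("start", vmx, "nogui")
        time.sleep(60)  # crude delay; a real script would poll the service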
Addressing portability
As already said, the biggest benefit of VMware software is the capability to seamlessly share virtual machines between different products.
This not only permits developers to move their work without modifications and show it to teammates or product managers, but also allows porting applications to other virtualization facilities, where the code will be tested or even put into production.
So, as simply as copying a folder, the virtual machine containing the new software can be moved from Workstation to Server, the enterprise virtualization product VMware offers for free.
There it can be tested for compatibility and usability, and its performance can be verified with stress tests.
After the QA phase, the same virtual machine can be moved again to the VMware product aimed at datacenter deployment, ESX Server, where it will be put into production.
Anytime a problem appears, the virtual machine can be moved back and forth between these platforms to patch errors or test new configurations.
And if a customer wants an onsite demonstration of the new product, the same virtual machine can be moved once again into Player, the free VMware virtualization product for desktops, and distributed to sales managers worldwide.
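Because a virtual machine is just a directory of files, the move can be as simple as a scripted copy. A minimal sketch, with hypothetical paths, assuming the guest is shut down cleanly first so the virtual disks are consistent:

    import shutil
    import subprocess
    from pathlib import Path

    SRC = Path("/vms/new-product")           # hypothetical Workstation VM folder
    DST = Path("/mnt/qa-share/new-product")  # hypothetical share used by Server

    def vmrun(*args):
        subprocess.run(["vmrun", "-T", "ws", *args], check=True)

    # Shut the guest down cleanly so the disk files are consistent, then
    # the whole environment travels as an ordinary folder copy.
    vmrun("stop", str(SRC / "new-product.vmx"), "soft")
    shutil.copytree(SRC, DST)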
Conclusion
Virtualization is not revolutionizing just datacenter planning and deployment. It is also touching the most critical part of the IT industry: application development.
VMware saw this before its competitors and is building a whole ecosystem that improves software engineers' efficiency by cutting away unproductive time.
As side benefits, companies fully adopting virtualization gain safer environments and flexible ways to reach new customers. And this is just the beginning: today all these operations are performed manually, but in the near future VMware will automate some of them with a new product called Virtual Lab Manager, expected before the end of this year.
This will greatly simplify the control and optimization of software production phases in big companies, where multiple departments adopt different development tools but need to leverage virtual machine images in mandatory testing and production virtual infrastructures.
Automation is just around the corner, and with it a new dimension in the software development lifecycle.
This article originally appeared on SearchServerVirtualization.