The more developers use virtualization, the more they find new uses for it. Discover what you’re missing and how virtualization can help you get more done.
by Andrew Binstock
Many developers today are finding that virtualization enables them to develop, test, and debug software in ways they never could before. This is due in large part to the horsepower of today’s dual- and quad-core systems, which lets developers run multiple virtual machines (VMs) and simulate entire environments on their existing development platforms. This article discusses the ways that developers at some leading IT sites are using virtualization. I suspect you’ll find ways to apply their practices within your own site’s development lifecycle.
An important activity, especially for teams working on projects that require several interoperable technologies, is the evaluation of new development tools. Frequently, these tools take the form of plug-ins to existing IDEs or stand-alone products that must be configured to fit into an existing environment. Wisdom dictates that these tools should not initially be evaluated in the developer’s existing programming environment, because they frequently disrupt the tools already installed there. Almost every developer has had the experience of installing a plug-in, deciding later to remove it, and discovering that the platform or IDE has been changed in ways that are difficult to undo. To avoid this problem, sites should keep copies of VMs set up with a developer’s basic toolset and configured as site policy dictates. New tools should be evaluated initially in such a VM to make sure that they do what they claim to do, that they have no unexpected drawbacks, and that they truly integrate with the site’s existing toolchain. Only when tools have passed this initial test should they be allowed provisional deployment in active development environments.
The benefit of this approach is that if the tool reconfigures the VM or mangles its registry entries, the VM can simply be thrown out. As long as an instance of the original VM exists, a new one can be cloned at any time for further tool testing.
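The clone-and-discard workflow can be sketched in a few lines of Python. This is an illustrative sketch only, not a real VMware API: the golden-image directory, the evaluate() callback, and the plain file copy are hypothetical stand-ins for whatever cloning mechanism a site actually uses.

```python
"""Sketch of a disposable-VM workflow for tool evaluation.

Assumes a 'golden' VM image directory that is never modified; each
evaluation gets a fresh copy that is deleted afterward, no matter
what the tool under test did to it.
"""
import shutil
import tempfile
from pathlib import Path


def evaluate_in_clone(golden_image: Path, evaluate) -> None:
    """Clone the golden VM image, run the evaluation, then discard the clone."""
    workdir = Path(tempfile.mkdtemp(prefix="tool-eval-"))
    clone = workdir / golden_image.name
    shutil.copytree(golden_image, clone)  # the golden image itself stays untouched
    try:
        evaluate(clone)                   # e.g. boot the clone, install the plug-in
    finally:
        shutil.rmtree(workdir)            # mangled registry or not, just throw it away
```

The point of the try/finally is that the clone is destroyed even if the evaluation blows up, which is exactly the "throw the VM out" guarantee described above.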
Development teams generally use systems that are configured substantially differently from those of regular employees. Consequently, developers can have difficulty making sure that they are running their tests on a system that matches the authorized employee platforms. Virtualization helps in this instance. IT managers can create VMs that reproduce all the permitted platforms in use at the firm. Then developers and QA engineers can use those VMs to make sure that they are testing their software against the exact configuration that the users have on their desktops and laptops. In this scenario, the VM is discarded after testing, so that the developers and test engineers always know that they are working on a fresh, and therefore true, version of the platform.
This approach has other benefits. As the user platforms evolve over time, there are always pockets of employees who, for one reason or another, lag behind in adopting the prescribed configuration. By keeping a library of VMs that reflect all historical configurations, testers can make sure that they are maintaining backward compatibility whenever that is important.
Virtualization also offers an additional benefit that is very difficult to reproduce with hardware. How do you test on a standard platform that has reduced resources? For example, suppose that the standard specified desktop at a firm is Microsoft Windows XP with 512MB of RAM. Testing on a VM with this configuration shows no particular problem. Now, suppose the user runs the application on a system that is already running eight other memory-intensive applications. Does the new app still run correctly? VMs today make it simple to vary the amount of RAM allocated for use. By rerunning the tests while lowering this number, it becomes possible to emulate a standard but overburdened platform and to determine at what point the application under test breaks down from lack of RAM. This information has obvious benefits for developers, testers, and help-desk staff.
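The step-down test can be automated with a small script. In the sketch below, the memsize key is the genuine VMware .vmx setting for guest RAM; the generator that walks the allocation downward is an illustrative assumption about how a site might drive its test runs, not part of any VMware tool.

```python
import re


def set_vmx_memory(vmx_text: str, megabytes: int) -> str:
    """Return a copy of a VMware .vmx config with its RAM allocation changed.

    memsize is the real .vmx key for guest RAM, expressed in megabytes.
    """
    return re.sub(r'(?m)^memsize\s*=\s*"\d+"',
                  f'memsize = "{megabytes}"', vmx_text)


def ram_step_down(start_mb: int, floor_mb: int, step_mb: int = 64):
    """Yield successively smaller RAM sizes for a constrained-resource run."""
    mb = start_mb
    while mb >= floor_mb:
        yield mb
        mb -= step_mb
```

A test harness would rewrite the .vmx at each size yielded by ram_step_down(), boot the VM, and run the suite until it finds the allocation at which the application breaks down.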
Virtualization gives QA engineers an ability that they have historically lacked: capturing the entire state of the machine at the point where a bug manifests. This is possible because the entire VM can be saved at any time in its current state to a single large file. Once saved this way, the VM can be made available to developers who can now see for themselves how the problem manifests. At sites that use defect-tracking software, it is easy to place a URI to the saved VM in the defect report, and thereby greatly improve the quality of the information exchanged between QA and the development teams.
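Linking the saved VM into a defect report is mostly a matter of recording a URI. The sketch below shows the idea; the field names are illustrative, not those of any particular defect tracker.

```python
from pathlib import Path


def defect_with_snapshot(defect_id: str, summary: str, snapshot: Path) -> dict:
    """Build a defect-tracker entry (fields are illustrative) that links the
    saved-VM file, so developers can load the exact machine state."""
    return {
        "id": defect_id,
        "summary": summary,
        "vm_snapshot": snapshot.resolve().as_uri(),  # file:// URI the tracker stores
    }
```

In practice the snapshot file would live on a share reachable by both QA and development, so the URI in the report resolves for everyone.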
Sometimes, however, it’s not sufficient to capture a single VM. For the developer to run the VM with the saved defect, other VMs (such as servers) might be necessary. In such cases, the test engineer needs to be able to capture a snapshot of several VMs that work together. Today, packages from VMware (notably, the company’s VMware Lab Manager) and Surgient (its VQMS product) enable capturing multi-VM configurations in a single snapshot.
Both companies also add a crucial element to the snapshot: the ability to run it correctly. This seems intuitive, but it refers to a specific problem that occurs when snapshots are run. Suppose a configuration that contains three VMs is captured: one VM each for the client, the application server, and the database. Now, the developer wants to run this snapshot, but the system on which he wants to run it is currently running the database VM that was copied to the snapshot. Without special software, a problem will occur, as the snapshot VM will have the same IP address and, crucially, the same MAC address as the original database VM. The network will detect this conflict and the snapshot VM will not have network access. Both VMware and Surgient solve this problem by building a virtual network switch into the saved configuration. This switch performs network-address translation (NAT) on the configuration when it is started up, thereby avoiding IP address conflicts. A similar translation process occurs for MAC addresses. Inside the snapshot configuration, the IP and MAC addresses remain unchanged, while to the outside world they’re completely different. (This is needed in case licensing or other configuration records rely on these values.) The solution works so well that multiple instances of the snapshot can be run simultaneously on the same system without conflict.
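The address-translation idea can be illustrated with a toy example. This sketch only builds the mapping table — fresh, locally administered MAC addresses that a virtual switch might present to the outside world while the VMs keep their original addresses internally. It is an assumption-laden illustration, not how VMware or Surgient actually implement the feature.

```python
import random


def external_mac(rng: random.Random) -> str:
    """Generate a random, locally administered, unicast MAC address."""
    # Set the locally-administered bit, clear the multicast bit (first octet).
    first = (rng.randrange(256) & 0b11111100) | 0b00000010
    rest = [rng.randrange(256) for _ in range(5)]
    return ":".join(f"{b:02x}" for b in [first] + rest)


def nat_table(internal_macs, seed=0):
    """Map each VM's internal MAC to a unique external one, as a virtual
    switch might when a multi-VM snapshot is started up."""
    rng = random.Random(seed)
    table = {}
    for mac in internal_macs:
        ext = external_mac(rng)
        while ext in table.values():      # guarantee uniqueness within the snapshot
            ext = external_mac(rng)
        table[mac] = ext
    return table
```

Using locally administered addresses (second-lowest bit of the first octet set) is the conventional way to mint MACs that cannot collide with vendor-assigned hardware addresses on the LAN.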
Sites that rely on offshore development use virtualization to provide some intellectual property (IP) protection and to make sure that developers are using the correct and properly configured tools. These sites require offshore developers to do their work through remote-access software that logs into a VM hosted at the local site. It is easy to see how this provides developers with the site-required toolset. But the IP protection needs a bit more explanation. The hosted VMs do not allow files to be copied out to the remote developer’s own machine and can be blocked from copying files to other locations. This makes it impossible—or at least far more difficult—for offshore workers to exfiltrate IP. They have no way of copying the files to a device at their physical site or of sending the IP to an unauthorized network location. At best, they can screen scrape or spend a lot of time trying to circumvent the security on the VM host, all of which is visible, and therefore detectable, at the hosting site. Tools such as VMware ACE can further automate the lockdown of a VM.
This arrangement has limitations and benefits. The benefits, beyond those already mentioned, are that the local site can monitor what the remote developers are doing, track their daily progress, and more closely manage the development process. The downside is that remote access to a hosted VM is hardly an optimal way to get development done, due primarily to the latency of long-distance networking and the limitations of remote-desktop software.
Portability testing is the original development use case for virtualization. A developer who is programming on a Windows workstation installs, let us say, Microsoft Virtual PC on his system and then creates a Linux VM. When code is developed and tested to a point of satisfaction on Windows, the developer fires up the Linux VM and recompiles the code there to make sure it runs correctly as well. This model is particularly easy for many sites because Virtual PC is available from Microsoft as a free download (see References at the end of this article). Sites should also consider VMware Workstation (not free, but ideal for the software-testing use case) and VMware Server, which runs on Windows and is free but transforms an entire system into a virtualization server. Sites that develop primarily on Linux will likely use the open-source Xen product, which can host VMs running Microsoft Windows XP and Windows 2003 Server.
The advantage of developers doing this portability testing, rather than passing the responsibility to QA, is the immediacy of the feedback. If the code does not work correctly, the developer knows what has changed since the last working build and can figure out what needs to be fixed. If the testing is done by QA, however, the lag between testing and feedback makes it far harder to locate the non-portable code quickly.
An additional benefit of this approach is that developers can develop for multiple platforms with no additional purchase of hardware.
As I’ve shown here, there are many good reasons for developers to adopt virtualization. The testing benefits alone (portability, verification of the correct platform, testing under constrained physical resources, and so on) make virtualization compelling, even for Java developers who might find little benefit in the portability testing. The defect-snapshot capability is an additional productivity boost. Because of these advantages, some pundits expect that within a few years almost all development shops will make significant use of virtualization.
References
VMware Server from VMware: http://www.vmware.com/products/server/
VMware ACE from VMware: http://www.vmware.com/solutions/desktop/datasecurity.html
Microsoft Virtual PC 2007 (free download): http://www.microsoft.com/windows/products/winfamily/virtualpc/default.mspx
Xen: Citrix XenServer Express Edition (available as a free download)
About the Author
Andrew Binstock is the principal analyst at Pacific Data Works LLC, a firm that specializes in technology white papers. He will be lecturing at the InfoWorld Virtualization Executive Forum in New York City in September 2007. He can be reached at email@example.com