Snapshot: Virtualization

Service Virtualization Could be Big for DevOps

DevOps is often considered the future for software development. Central to the process is collaboration between software engineering, quality assurance and IT operations. An integrated team manages the entire application lifecycle.

It’s a new arena for government, but the desire to adopt is there. In a survey of more than 150 federal IT managers, MeriTalk found 63 percent of respondents think DevOps will help speed application delivery and migration to the cloud. A similar number thought it would produce faster application testing.

Virtualization, particularly service virtualization, could play a big part in how agencies embrace DevOps. With continuous testing being a major element of DevOps, service virtualization can provide an “always there” test environment for when physical test environments might not be available.

“Service virtualization enables development and test teams to statefully simulate and model their dependencies of unavailable or limited services and data that can’t be easily virtualized by conventional server or hardware virtualization means,” according to voke, a market analyst firm.

Service virtualization removes the constraints and wait times frequently experienced by development and test teams that need to access components, architectures, databases, mainframes, mobile platforms and so on, according to voke analysts.

Using virtualization for software development is certainly not new. In 2009, for example, the Defense Information Systems Agency (DISA) set up a program whereby DOD users could quickly and easily buy virtualized server space for their application development and testing. That Infrastructure-as-a-Service initiative was slated to sunset in FY 2015, to be replaced by milCloud, which DISA describes as “an integrated suite of capabilities designed to drive agility” into DOD application development.

Service virtualization takes that several steps further. It closely mimics the behavior of an application — how it makes database calls, updates values, reacts to security protocols and policies, and so on — just as a physical testbed would. However, everything is emulated in software and available at all times on a local server, so the development team always has access to the virtual testbed.

This is different from techniques such as mocks and stubs. A tester tells a mock object which interactions to expect; after the code under test runs, the mock verifies that those interactions actually happened in the specified manner. Likewise, a stub can be told to return a specific fake value when a particular call is made.
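To make the distinction concrete, here is a minimal sketch using Python's standard `unittest.mock` library. The payment-gateway names are hypothetical, invented for illustration only:

```python
from unittest.mock import Mock

# A stub: told up front to return a specific fake value when called.
stub_gateway = Mock()
stub_gateway.get_balance.return_value = 100.0
assert stub_gateway.get_balance("acct-1") == 100.0

# A mock: the tester states the expected interaction, exercises the
# code, then verifies the interaction actually happened as specified.
mock_gateway = Mock()
mock_gateway.charge("acct-1", 25.0)   # the code under test would make this call
mock_gateway.charge.assert_called_once_with("acct-1", 25.0)
```

Both techniques fake a single dependency one call at a time, which is exactly the limitation service virtualization is meant to remove.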

Service virtualization tools, on the other hand, can test entire networks, either with pre-determined responses or using actual recordings of network traffic. Testers can change various network elements, attached servers, domain settings and other things to test application response under different circumstances. In all respects, the application will react as if it’s operating on a real network.
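The record-and-replay idea at the heart of those tools can be sketched in a few lines. This is a toy illustration, not how any particular commercial product is implemented; the class and method names are assumptions made for the example:

```python
class VirtualService:
    """Replays recorded responses so tests never need the live backend."""

    def __init__(self):
        # Maps a (method, path) request to a canned response captured
        # from real traffic or authored by the test team.
        self.recordings = {}

    def record(self, method, path, response):
        self.recordings[(method, path)] = response

    def handle(self, method, path):
        # Serve the recording if one exists; otherwise simulate a miss,
        # which testers could reconfigure to model failure scenarios.
        return self.recordings.get(
            (method, path), {"status": 404, "body": "no recording"}
        )


svc = VirtualService()
svc.record("GET", "/users/42", {"status": 200, "body": {"id": 42}})
assert svc.handle("GET", "/users/42")["status"] == 200
assert svc.handle("GET", "/users/99")["status"] == 404
```

A real service virtualization tool adds stateful behavior, protocol emulation and traffic capture on top of this basic lookup, but the always-available, deterministic response is the core of the idea.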

Gone are the days when software development teams could take their time producing and testing applications. The demands of today’s IT users require a fast and agile response — especially in constantly shifting environments like the cloud. DevOps promises the continuous integration and delivery they need.

Delivering complex applications under those kinds of time constraints is difficult at best if testing is left until the end. Testing can no longer be put off until a fully functional application, or a complete operating environment for end-to-end testing, is available. Continuous testing that runs through the entire development cycle, enabled by service virtualization, is the key to continuous delivery.