The perils and promise of software-defined data centers

Proponents say SDDC could revolutionize the data center, but real-world proof of its effectiveness is hard to find.


The software-defined data center is an idea whose time will eventually come. SDDCs virtualize storage, processing and network resources and package them into on-demand units that allow non-IT managers to provision only what they need through a self-service portal. The savings in time and money are obvious.

There is no clear path to that destination, yet every federal agency has been directed to get there.

Why it matters

The White House's Data Center Optimization Initiative (DCOI) memo could not be clearer: Agencies must increase "the use of virtualization to enable pooling of storage, network and compute resources, and dynamic allocation on-demand."

SDDC checks all those boxes. Given that DCOI effectively turns off the funding spigot for any non-compliant data center expenditures, investment in SDDC suddenly becomes considerably more attractive.

DCOI compliance aside, though, SDDC could be cheaper and easier than an agency's current state. End users can order what they need via a browser and have it provisioned quickly or even immediately. IT managers might spend a little more time on capacity management but far less on vendor management and other IT service management tasks.

To be clear, the point of data center consolidation in government is to save taxpayers' money. Whether an agency does that by moving workloads to public, private or hybrid clouds or to a better-optimized infrastructure, SDDC enables all of those solutions.

The fundamentals

With SDDC, computing, storage and network resources are virtualized and bundled together into units that non-techies can understand. They can dial the quantity up or down depending on their needs, which means less hardware, software and labor must be procured. And data security protocols are infused throughout the infrastructure.
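A minimal sketch can make the idea concrete. Assuming a hypothetical self-service portal (none of the field names or quota figures below come from any specific product), a request might reduce to a handful of declarative parameters that the orchestration layer checks against the agency's remaining quota before anything is provisioned:

```python
from dataclasses import dataclass

# Hypothetical self-service request: field names and quota values are
# illustrative only, not tied to any particular SDDC product.
@dataclass
class ResourceRequest:
    project: str        # business unit or program requesting the capacity
    vcpus: int          # virtual CPUs to allocate from the compute pool
    memory_gb: int      # memory for the requested unit
    storage_gb: int     # size of the virtual storage volume
    network_tier: str   # e.g., "isolated" or "shared" (illustrative labels)

def validate(request: ResourceRequest, quota: dict) -> bool:
    """Check the request against remaining quota before handing it to the
    orchestration layer for provisioning."""
    return (request.vcpus <= quota["vcpus"]
            and request.memory_gb <= quota["memory_gb"]
            and request.storage_gb <= quota["storage_gb"])

if __name__ == "__main__":
    quota = {"vcpus": 64, "memory_gb": 256, "storage_gb": 2048}
    req = ResourceRequest("benefits-portal", vcpus=8, memory_gb=32,
                          storage_gb=500, network_tier="isolated")
    print("Approved for provisioning" if validate(req, quota) else "Over quota")
```

The point of the sketch is the shape of the transaction: the requester states what is needed in business terms, and the software-defined layer decides how to carve it out of pooled hardware.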

Another benefit that should not get lost in the hype is that on-demand IT provides a context for integrating application development and the delivery of IT operations. According to a Gartner study, SDDC "enables increased levels of automation and flexibility that will underpin business agility and enable modern IT approaches such as DevOps."

The hurdles

However, the Gartner study goes on to say — and these two points are critically important — that IT managers "can't just buy a ready-made SDDC from a vendor," and "most organizations are not ready to begin adoption and should proceed with caution."

The fact is that, as of summer 2016, SDDC has rarely been fully and satisfactorily implemented at any sizable organization in either the public or private sector. I served as a consultant to a major commercial bank that was attempting a large-scale transformation to SDDC. The effort failed despite the bank's extensive resources. For this article, I asked numerous vendors to recommend federal clients who could comment for attribution; ultimately, they all declined.

In the bank's case, the hurdles were mainly organizational. Top managers disagreed about which solutions should be used at the upper levels of the software stack and which firms should implement the work. As the project fell behind schedule, talented members of the design team found other work.

Government, of course, is different. When national security or individuals' personally identifiable information is at stake, security is paramount. As with anything else in this age of hacktivism and cyberwarfare, every effort must be made to avoid data breaches in an SDDC infrastructure. IT departments must be able to verify that data is secure at rest, in transit and at any endpoint.
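One narrow slice of that verification, confirming that an endpoint actually negotiates TLS with a valid certificate, can be automated with standard tooling. The sketch below uses only Python's standard library; the host name is a placeholder, and a real program would walk an inventory of the agency's own endpoints rather than a single address:

```python
import socket
import ssl

def check_tls(host: str, port: int = 443) -> None:
    """Connect to an endpoint and report the negotiated TLS version and
    certificate subject. A failed handshake raises ssl.SSLError."""
    context = ssl.create_default_context()  # verifies the certificate chain
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            print(f"{host}: {tls.version()}, subject={cert.get('subject')}")

if __name__ == "__main__":
    check_tls("example.com")  # placeholder host, not an agency endpoint
```

Encryption at rest and endpoint posture need their own checks, but the same principle applies: if the infrastructure is defined in software, its security properties should be testable in software.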

Still, some challenges are not unique to any sector. And many of them stem from the tug-of-war between two philosophies of SDDC. The first, as represented by VMware and its partners, is that tasks originally handled by hardware will inevitably be performed by software, so the hardware choice ought to be inconsequential.

Alternatively, Cisco and others start their solutions from the bare metal and use that native understanding of networking to inform their software.

For those reasons, it is difficult to define the optimal solution, especially at the orchestration level, which enables everything within the infrastructure to play together nicely. Legacy hardware might be expensive to write off, and vendor lock-in is a serious issue.

And, as suggested above, there is a dearth of real-world proof that SDDC works.

Next steps

There might well be such proof someday. But so far nobody has completed the journey and used an SDDC infrastructure long enough to say that the projected savings and efficiencies have been attained.

There is no "SDDC in a box" solution. It is an evolutionary process. Typically, an agency will start by virtualizing its servers and, when that new state has solidified, go on to virtualizing storage. At that juncture, the major roadblock to true SDDC is the network.

The decision point then is whether to go with the hardware-and-up route touted by Cisco or the orchestration layer-and-down route that VMware offers. The former provides an opportunity to use existing hardware or the next scheduled refresh, thereby avoiding a big write-off. It also means the investment in training Cisco Certified Network Associates and Technicians will not go to waste.

But the agency risks hardware vendor lock-in, scalability could become an issue, and the long-term savings might not be as pronounced as what the VMware solution offers.

VMware's top-down approach, for its part, would probably require a steep learning curve on the part of the IT department. The hardware vendor lock-in disappears but is replaced by software vendor lock-in, which could be even harder to uncouple from.

It is clear that the arc of history bends toward software, but even so, VMware is not the only game in town. There are any number of open-source solutions specific to particular domains, but it would be difficult to find a suite whose elements work together optimally, let alone a team of IT architects sophisticated enough to build the custom solution.

So the IT manager contemplating a move to SDDC, willingly or not, must understand that it is a long road. Virtualizing computing, storage and network resources will probably need to proceed sequentially rather than in parallel.

And above all else, before even thinking about the design phase, managers are well advised to determine what their true functional and security requirements are. Only then can they adequately define the target state and perform a feasibility study. Those efforts will lead to decisions about which vendors to screen, negotiate with and ultimately select.