Even in Early Stages, HCI Has Bright Future

HCI has already demonstrated the flexibility and reliability that help ensure eventual widespread adoption.

As demand for integrated systems continues to grow, driven by the push for consolidation, greater data center performance, and cost savings, hyperconverged infrastructure (HCI) faces a rapidly evolving future.

Market researcher Gartner sees several forces driving the positive climate for HCI:

  • IT departments are under increasing pressure to provision resources instantly to meet demand for new applications and services.
  • Applications are becoming more dependent on “scale-out” systems based on low-cost commodity components such as Intel x86 processors.
  • As data volumes grow quickly and unpredictably, the software-defined storage that HCI delivers offers the most efficient way to manage that growth.
  • All this is moving the market toward what Gartner calls the third phase of integrated systems. This will be characterized by continuous delivery of applications and microservices using HCI platforms, and will rely on a “dynamic, composable, and fabric-based” infrastructure. This is leading to what has been called IT-as-a-Service and even HCI-as-a-Service.

Composable infrastructure envisions the entire IT infrastructure as a single, virtualized entity. This is even a step or two beyond HCI’s current convergence of compute, storage, and networking resources into a single integrated unit. As with HCI, however, composable infrastructures are centrally managed using software.

The idea is that composable “infrastructure as code” will allow the entire infrastructure, not just individual components, to be provisioned quickly to support applications as needed, so that on-premises private clouds can provision resources as quickly as public cloud providers do today.
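As a rough illustration of the idea, the sketch below describes a small slice of infrastructure as declarative code. It is a minimal Python example; the resource names, sizes, and provision() helper are hypothetical stand-ins for whatever tooling an agency actually uses, not any particular product’s API.

    from dataclasses import dataclass

    @dataclass
    class ComputeSpec:
        vcpus: int
        memory_gb: int
        count: int

    @dataclass
    class StorageSpec:
        capacity_tb: int
        replicas: int

    @dataclass
    class AppEnvironment:
        name: str
        compute: ComputeSpec
        storage: StorageSpec

    def provision(env: AppEnvironment) -> None:
        # A real controller would call the platform's management API here;
        # this stand-in just reports the desired state it would reconcile.
        print(f"Provisioning '{env.name}':")
        print(f"  {env.compute.count} nodes x {env.compute.vcpus} vCPU / {env.compute.memory_gb} GB RAM")
        print(f"  {env.storage.capacity_tb} TB storage, {env.storage.replicas} replicas")

    if __name__ == "__main__":
        # The whole environment, not individual components, is described and requested at once.
        provision(AppEnvironment(
            name="case-management-pilot",
            compute=ComputeSpec(vcpus=8, memory_gb=64, count=3),
            storage=StorageSpec(capacity_tb=10, replicas=2),
        ))

The point of the pattern is that the description itself becomes the unit of provisioning: change the spec, rerun it, and the platform reconciles the infrastructure to match.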

Others have also pointed to a maturing concept of the cloud. Certainly in the federal government, the “cloud first” mandate has pushed agencies toward broad adoption of the cloud as a way to quickly provide new applications and services for their users. It’s also becoming clear, however, that the public cloud is not the cheapest or most efficient answer for every workload. Workloads that require flexible storage parameters, or that are constrained by strict government regulations on how their data may be used, may be better off staying on internal resources in agency data centers.

Along those lines, HCI is already seen as well suited to operating private clouds; in fact, this has become one of its early major use cases. As agencies become more familiar with its architecture, HCI is expected to be the basis for even more private and hybrid clouds, leading the way for future deployment of truly composable infrastructure.

Over time, the infrastructure capabilities and software services that need to operate on HCI itself will move up the stack and embrace a different level of virtualization, primarily provided through containers, says Kirk Kern, chief technology officer for NetApp’s U.S. public sector division. At that point, the container control framework will start to take on a more next-generation management role for scaled applications.

Because containers use operating system-level virtualization, they can run applications without requiring a separate virtual machine for each one. Each container packages everything an application needs to run, and because containers share the host operating system’s kernel rather than each booting a full guest OS, they are often considered more efficient than virtual machines.
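For a concrete, if minimal, sense of what that looks like in practice, the short Python sketch below launches a throwaway application in a container with explicit CPU and memory limits using the standard Docker command line. It assumes Docker is installed locally; the image name and limits are purely illustrative.

    import subprocess

    # Run a small Python job in its own container, with no separate VM required.
    # --rm removes the container when it exits; --cpus and --memory cap the
    # resources the container may use on the shared host kernel.
    result = subprocess.run(
        [
            "docker", "run", "--rm",
            "--cpus", "1",
            "--memory", "256m",
            "python:3.11-slim",  # illustrative base image
            "python", "-c", "print('hello from a container')",
        ],
        capture_output=True,
        text=True,
        check=True,
    )

    print(result.stdout.strip())

Because the container starts in seconds and carries its own dependencies, the same approach scales from one-off jobs to the fleets of microservices that a container control framework would manage on HCI.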

When all of that happens, we’ll see HCI technology morph into something that can dynamically adapt to workload requirements, rather than serving mainly as a scale-out option. As the software evolves to support this, says Kern, HCI will continue to become more prevalent in enterprise IT environments.