Virtualization Drives HCI’s Flexibility and Power
Virtualization technology plays a key role in making HCI a reality.
While virtualization has been used for years to consolidate data center resources, it has found an increasingly critical role in hyperconverged infrastructure (HCI), where it is arguably being used most effectively. In fact, without virtualization, HCI would not be possible.
In a converged infrastructure, virtualization is used to combine various compute, networking, and storage elements into more manageable units. However, IT staff must manage each of these technology stacks separately. That provides a large degree of flexibility in scaling each element, but still requires complicated provisioning to satisfy various workload demands.
HCI uses virtualization to bring all of those elements together within a single management platform. Using a software-defined approach, IT staff can manage compute, storage, and network resources through policy as a single shared resource, rather than having to work within physical or logical constraints. That becomes even more important in this increasingly complicated world, where data must be moved continuously back and forth between on-premises systems and the cloud, and automated processes are required to accomplish this effectively.
Data storage probably best demonstrates the value of this technology. In a typical enterprise, users have to copy data multiple times across individual storage devices. This is particularly true for data not required for primary, mission critical applications. HCI manages this secondary data as a single resource agencies can allocate for all uses.
Most of a storage administrator's time today is spent provisioning the data infrastructure to meet the specific needs of various workloads. Any required upgrades must be done manually, which takes time and typically demands the attention of skilled storage specialists. With HCI, admins can set policies for each workload in advance, and resources are then allocated appropriately and automatically.
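To make the idea concrete, here is a minimal sketch of policy-driven allocation from a shared pool. The class names, fields, and numbers are illustrative assumptions for this article, not any vendor's actual API.

```python
# Hypothetical sketch: a workload policy defined in advance draws its
# resources automatically from a single shared pool.
from dataclasses import dataclass


@dataclass
class WorkloadPolicy:
    name: str
    vcpus: int
    memory_gb: int
    storage_gb: int


@dataclass
class ResourcePool:
    vcpus: int
    memory_gb: int
    storage_gb: int

    def allocate(self, policy: WorkloadPolicy) -> bool:
        """Grant the workload its policy-defined share, if available."""
        if (policy.vcpus <= self.vcpus
                and policy.memory_gb <= self.memory_gb
                and policy.storage_gb <= self.storage_gb):
            self.vcpus -= policy.vcpus
            self.memory_gb -= policy.memory_gb
            self.storage_gb -= policy.storage_gb
            return True
        return False  # pool exhausted; in practice this triggers scale-out


pool = ResourcePool(vcpus=64, memory_gb=512, storage_gb=10_000)
db = WorkloadPolicy("database", vcpus=16, memory_gb=128, storage_gb=2_000)
granted = pool.allocate(db)
```

The point of the sketch is the shape of the workflow: the policy is declared once, and allocation happens without a specialist touching individual devices.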
The value of HCI also becomes obvious when storing this secondary data on a range of devices throughout an enterprise, from solid state drives and regular rotating disk drives to cloud storage and even tape. The single control plane HCI provides allows for a much clearer view of this widely dispersed data, and enables more intelligent data placement.
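A tiering decision of this kind can be sketched as a simple policy function. The thresholds below are illustrative assumptions, not figures from the article; a real control plane would weigh many more signals.

```python
# Hypothetical tiering policy: map how recently secondary data was
# accessed to one of the storage tiers mentioned in the text.
def choose_tier(days_since_access: int) -> str:
    """Return a storage tier for data of a given 'temperature'."""
    if days_since_access <= 7:
        return "ssd"    # hot: keep on flash
    if days_since_access <= 90:
        return "disk"   # warm: rotating media
    if days_since_access <= 365:
        return "cloud"  # cool: object storage
    return "tape"       # cold: archive
```

Because the policy is software, changing where a whole class of data lives means editing one rule, not re-provisioning devices.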
The leading-edge use of virtualization in HCI has also helped deploy technologies that struggled under more traditional virtualization schemes. The Army, for example, several years ago decided to replace its traditional physical desktop solutions with a virtual desktop implementation, running on a traditional server and storage area network (SAN) infrastructure.
However, in pilot tests it proved excruciatingly slow. So the Army turned to HCI. Since the compute, storage, and networking resources needed for a typical Army desktop user could be identified ahead of time, Army IT could build an HCI specifically for those requirements and manage it through software. When extra capacity was needed, all they had to do was add an identical HCI node.
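That scale-out arithmetic is simple precisely because every node is identical. The sketch below assumes hypothetical per-node and per-desktop figures; the Army's actual sizing is not given in the article.

```python
# Illustrative scale-out math: identical nodes, known per-desktop demand.
NODE_VCPUS = 32  # assumed capacity of one identical HCI node


def nodes_needed(desktop_count: int, vcpus_per_desktop: int = 2) -> int:
    """Nodes required when each desktop's demand is known in advance."""
    demand = desktop_count * vcpus_per_desktop
    return -(-demand // NODE_VCPUS)  # ceiling division


def nodes_to_add(current_nodes: int, desktop_count: int) -> int:
    """How many identical nodes to add to meet demand (zero if covered)."""
    return max(0, nodes_needed(desktop_count) - current_nodes)
```

Capacity planning collapses to one question: how many more of the same box?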
Future HCI solutions should become even more flexible and powerful as virtualization itself moves from being based around the virtual machine, with each application requiring its own VM, to one based on containers, which share a single operating system kernel, can host many applications on one machine, and are widely seen as more efficient than VMs.
Even before that happens, HCI will upend the idea of the modern data center and what it can accomplish. Instead of a compilation of separately managed resources, which in the past led to expensive sprawl, HCI collects these into a centrally managed, policy-driven pool of shared resources configured to support the virtual machine as the focus of the modern data center.
That's a far more flexible and better-performing environment for modern application and service delivery than the traditional, expensive, and relatively cumbersome data center.