Why 2010 is the year of the virtual data center

Agencies focused on virtualized servers for greater efficiency are seeing only half of the picture, according to Christopher Poelker, vice president of Enterprise Solutions at FalconStor Software. A look at why 2010 will emerge as the year of virtualized storage and the virtual data center.

In 2009, many government agencies were focused on realizing the benefits of virtualized servers. Perhaps chief among those benefits was the opportunity to make better use of existing budgets by consolidating physical infrastructure and commoditizing server hardware. Software and hardware vendors, meanwhile, have been equally focused on selling the benefits of server virtualization. That's reflected in the success of VMware and in the moves by major operating system vendors (Microsoft Hyper-V), database vendors (Oracle VM), server vendors (Hewlett-Packard, IBM and Dell) and even network vendors such as Cisco, with its push toward network convergence.

What many agencies don't realize is that they are missing the other half of the story: the huge opportunity to increase the overall efficiency of their systems, including storage, while optimizing the use of their assets and improving data mobility.

That story will become more evident in 2010, as more agencies move to address the data side of information technology and recognize the value of storage virtualization.

Storage virtualization is accomplished by creating a virtual abstraction layer between physical or virtual servers and the existing physical storage (see illustration).

[Illustration: a virtual abstraction layer sits between servers and physical storage across an IP network.]

Once the storage is virtualized and the physical location of data is abstracted from the hosts, some interesting opportunities for savings become apparent. The storage is instantly commoditized. Provisioning can be done the same way across all storage vendors. Tiers of storage can be created, and data can be moved between tiers automatically while applications continue running, resulting in zero downtime.
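
To make the mechanism concrete, here is a minimal sketch of the idea in Python, not any vendor's product: hosts address a stable virtual volume ID, and the abstraction layer maps that ID to whichever physical array and tier currently hold the data. Provisioning then looks the same regardless of back-end vendor, and migrations are invisible to applications. All class, array and volume names are illustrative.

    class PhysicalLun:
        """A back-end LUN on some vendor's array (names are illustrative)."""
        def __init__(self, array, tier):
            self.array = array   # e.g., "vendor-a-san", "vendor-b-san"
            self.tier = tier     # e.g., "ssd", "fc", "sata"

    class VirtualizationLayer:
        """Maps host-visible virtual volume IDs to physical LUNs."""
        def __init__(self):
            self._map = {}       # virtual volume ID -> PhysicalLun

        def provision(self, vol_id, lun):
            # One provisioning call, regardless of the back-end vendor.
            self._map[vol_id] = lun

        def migrate(self, vol_id, new_lun):
            # In a real system the data is copied first, then the mapping
            # flips. Hosts keep addressing vol_id throughout: zero downtime.
            self._map[vol_id] = new_lun

        def resolve(self, vol_id):
            # Every host I/O passes through this layer of indirection.
            return self._map[vol_id]

    layer = VirtualizationLayer()
    layer.provision("vol-001", PhysicalLun("vendor-a-san", "fc"))
    layer.migrate("vol-001", PhysicalLun("vendor-b-san", "ssd"))  # invisible to hosts
    print(layer.resolve("vol-001").tier)  # -> ssd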

Cloud vendors will be able to provide outsourced capabilities for data services such as continuous protection and recovery, which in turn will enable organizations to focus more on core applications and perhaps avoid building an expensive dedicated data center for disaster recovery.
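
Continuous protection differs from traditional backup in one essential way: every write is journaled as it happens, so a volume can be rolled back to any point in time rather than to the last nightly copy. The sketch below illustrates that idea under simplifying assumptions; the journal structure and method names are hypothetical, not any product's API.

    import time

    class CdpJournal:
        """Journals every write so a volume can be rebuilt at any moment."""
        def __init__(self):
            self._log = []  # list of (timestamp, block_id, data) tuples

        def record_write(self, block_id, data):
            # Called on every write, instead of waiting for a backup window.
            self._log.append((time.time(), block_id, data))

        def recover(self, point_in_time):
            """Rebuild the volume image as of the requested moment."""
            image = {}
            for ts, block_id, data in self._log:
                if ts > point_in_time:
                    break                  # stop replaying at the target time
                image[block_id] = data     # later writes overwrite earlier ones
            return image

Recovering to five minutes ago is then just journal.recover(time.time() - 300); no separate backup job is involved.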

Virtualization in 2010 will also provide better data mobility. That will help the government create the plumbing required for internal service-oriented computing clouds. Virtualization will also enable cloud service providers to offer the solutions needed to outsource computing work into the cloud more efficiently. And industry will focus increasingly on the technology needed to securely connect internal clouds to external clouds, which will further reduce IT costs, enhance productivity and improve the value of IT within organizations.

Efforts are already underway to provide the technology for enhanced provisioning of storage resources for structured and unstructured data. For instance, the virtual abstraction layer is being enhanced to provide the metadata required for policy management, data placement, performance, and service levels required by the business.
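
As a hedged illustration of what that metadata might drive, the sketch below tags each volume with a simple policy and derives a placement decision from it. The policy fields and tier thresholds are invented for the example, not drawn from any shipping product.

    from dataclasses import dataclass

    @dataclass
    class VolumePolicy:
        owner: str             # business owner of the data
        max_latency_ms: float  # service-level target for reads
        replicate: bool        # whether a second copy is required

    def place(policy):
        """Pick a storage tier from the policy metadata alone."""
        if policy.max_latency_ms <= 1.0:
            return "ssd"   # latency-sensitive structured data
        if policy.max_latency_ms <= 10.0:
            return "fc"
        return "sata"      # bulk unstructured data

    print(place(VolumePolicy("payroll", 0.5, True)))     # -> ssd
    print(place(VolumePolicy("archive", 100.0, False)))  # -> sata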

But there are also a number of other developments in the works this year that will make storage virtualization a greater part of the overall value equation in IT data services, including:

  • Prompt, automated provisioning for both structured and unstructured data.
  • Ubiquitous data deduplication as part of the abstraction layer (see the sketch after this list).
  • Cloud-based global file systems.
  • Intelligence in the cloud for application integration and platforms (VMware SRM, Microsoft Geo-Clusters, Oracle RAC, Oracle VM).
  • Continuous availability and protection (the final elimination of the backup process!).
  • Inter-cloud data transport and WAN optimization.
  • Data security and encryption.
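
On the deduplication point above, the following sketch shows the core idea as it might look inside the abstraction layer: identical blocks are stored once, keyed by a content hash, and a volume becomes a list of references. Real implementations add reference counting and hash-collision handling; this is only an assumption-laden illustration.

    import hashlib

    class DedupStore:
        """Stores each unique block once; volumes hold only block hashes."""
        def __init__(self):
            self._blocks = {}   # content hash -> unique block data
            self._volume = []   # the volume is an ordered list of hashes

        def write(self, block):
            digest = hashlib.sha256(block).hexdigest()
            self._blocks.setdefault(digest, block)  # store new blocks only
            self._volume.append(digest)

        def read(self, index):
            return self._blocks[self._volume[index]]

        def unique_blocks(self):
            return len(self._blocks)

    store = DedupStore()
    for block in (b"A" * 4096, b"B" * 4096, b"A" * 4096):
        store.write(block)
    print(store.unique_blocks())  # -> 2 unique blocks held for 3 writes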

Acceleration of I/O performance in the cloud will be addressed by making more effective use of solid-state drives and by providing data caching at memory speeds within the abstraction layer. And all data recovery will be based on specific service-level agreements for files, databases, e-mail, application servers and entire sites.
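
The caching half of that claim reduces to a familiar structure: a small, fast store in front of a slower back end, with hot blocks promoted and cold ones evicted. Here is a minimal least-recently-used read cache standing in for SSD or memory-resident cache hardware; the class and parameter names are illustrative.

    from collections import OrderedDict

    class ReadCache:
        """A small, fast store in front of a slower backing tier."""
        def __init__(self, backend, capacity=1024):
            self._backend = backend      # slow tier: any dict-like store
            self._cache = OrderedDict()  # insertion order tracks recency
            self._capacity = capacity

        def read(self, block_id):
            if block_id in self._cache:
                self._cache.move_to_end(block_id)  # hit: memory speed
                return self._cache[block_id]
            data = self._backend[block_id]         # miss: go to the slow tier
            self._cache[block_id] = data
            if len(self._cache) > self._capacity:
                self._cache.popitem(last=False)    # evict least recently used
            return data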

With the world focusing more on adding software-based intelligence at the abstraction layer, hardware vendors will add value by moving software into application-specific integrated circuits (ASICs) and by making faster, more reliable components. Network vendors will work on centralizing management of the cloud and providing the required security for the data.

The movement toward virtualization on the server side and the storage side will turn both into commodity products, and reduce the relevance of the physical location of both. This will enable the creation of internal clouds and virtual organizations, making it extremely simple to collapse physical data center infrastructure into more manageable virtual entities. That in turn will simplify the movement and sharing of data in a secure and optimized manner.

Because the physical locality of data will no longer be as important (as long as the data is secure), some of the functions typically provided internally by IT departments, such as backup and data recovery, can be easily outsourced to external cloud service providers, allowing IT departments to focus more on their core applications and functions.

The role of storage virtualization, in conjunction with server virtualization, is to create an end-state vision for the chief information officer: an efficient, optimized IT infrastructure that reduces costs and provides enhanced services for the provisioning, protection, replication and recovery of data. The fact that much less infrastructure is required also reduces costs for continuity of operations, power, cooling and data center floor space. When all these benefits are tied together, the message becomes clear: storage virtualization will be the next big thing in IT.
