Research Report: The Virtual Public Sector

Agencies face daunting storage challenges

Government agencies at all levels will need robust data storage technologies to avoid being buried under an avalanche of data that is expected to increase at a rate of nearly one-third (31 percent) each year, according to a survey conducted by the 1105 Public Sector Media Group. This data avalanche brings with it a number of inherent challenges, such as the need to maintain a high level of service delivery and data protection.
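To put that 31 percent annual growth rate in perspective, the following toy calculation (not from the report; the 100-terabyte starting point is an arbitrary example) projects how quickly storage needs compound at that rate:

```python
# Toy illustration: compound growth at the survey's projected
# 31 percent annual rate. The 100 TB baseline is hypothetical.
ANNUAL_GROWTH = 0.31

def projected_volume(current_tb: float, years: int) -> float:
    """Project data volume after `years` of 31% annual growth."""
    return current_tb * (1 + ANNUAL_GROWTH) ** years

# An agency holding 100 TB today would need capacity for:
for years in (1, 2, 4):
    print(f"Year {years}: {projected_volume(100, years):.0f} TB")
```

At that rate, an agency's data volume roughly triples in four years, which is why the report frames the growth as an "avalanche" rather than a planning nuisance.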

IT professionals who participated in the survey work for agencies that gather substantially more operational data with each passing year. Specifically, 13 percent work for an agency that gathers 10 petabytes or more, 13 percent for an agency that gathers between one and nine petabytes, 24 percent for an agency that gathers 100 terabytes to one petabyte, and 20 percent for an agency that gathers 30 to 99 terabytes.

As is increasingly the case these days, there is constant tension between performance and cost. A clear majority of respondents—77 percent—believe their agency must strive to improve service delivery and data protection while at the same time reducing storage management costs.

Figure 1


But performance is not just about the volume of data. IT managers also are dealing with a number of technology trends that are shifting how users access data.

That is the case with virtualization. The survey found that the existing array of storage options available to government agencies may still be unable to cope with the demand and myriad challenges associated with virtualized environments. In many cases, the problem is that agencies have not upgraded their existing infrastructure. A majority of respondents, 56 percent, indicated that outdated technologies were preventing their agency from making more efficient use of its storage resources.

One of the key drivers in the demand for more robust storage tools is the need for better approaches to disaster recovery and continuity of operations (COOP) planning, according to the survey. Another key driver is the advent of big data.

A total of 72 percent of respondents said their agency has either deployed enhancements to disaster recovery and COOP or is planning to deploy them within the next 12 months. Similarly, 59 percent of respondents said their agency has either already embarked on a big data initiative or is planning one within the next 12 months.

Figure 2


A wide array of advanced storage solutions is available to address the needs of government agencies. Among the advanced storage technologies are software-defined storage, flash storage and object storage, according to data storage experts.

Defense Department agencies and the military services are expected to face a steeper challenge with regard to data capture, storage and management than other agency types, according to the survey.

Since 9/11, the amount of data the DOD has captured with sensors on unmanned vehicles and other surveillance technologies has increased 1,600 percent, according to the March 13, 2014, article “Enabling battlefield big data on the move” in Defense Systems. What’s more, the DOD is now using as many as seven million computing devices, and is expected to have more than twice that number in use by 2020.

In Afghanistan, the Army tackled its avalanche of intelligence data by building with open source technologies its own secure cloud known as the Distributed Common Ground System-Army (DCGS-A) Standard Cloud (DSC), according to the October 8, 2013, GCN article “How troops in Afghanistan get a clear view of intell.”

DSC currently indexes and stores text and visual data based on 75 million intelligence records from as many as 600 separate feeds, including unmanned aerial vehicles, satellite imagery and ground sensors.

To handle this vast library of virtual data, DSC uses integrated cloud computing technology, a specialized data ingestion system and an open computing framework employing open source technologies such as Hadoop, Hadoop Core, Accumulo, Condor and SolrCloud.
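The core of that text-indexing workload can be sketched with a toy inverted index: each token in an ingested record maps back to the records that contain it, which is the basic structure tools like Accumulo and SolrCloud implement at scale. The record fields below are hypothetical examples, not the DSC schema:

```python
# Toy sketch only: a minimal inverted index illustrating the kind of
# text indexing DSC delegates to Accumulo and SolrCloud at scale.
# Record IDs and text fields here are invented for illustration.
from collections import defaultdict

def build_index(records: list[dict]) -> dict[str, set[str]]:
    """Map each lowercase token to the set of record IDs containing it."""
    index: dict[str, set[str]] = defaultdict(set)
    for rec in records:
        for token in rec["text"].lower().split():
            index[token].add(rec["id"])
    return dict(index)

records = [
    {"id": "r1", "text": "UAV feed over sector four"},
    {"id": "r2", "text": "ground sensor report sector nine"},
]
index = build_index(records)
print(sorted(index["sector"]))  # IDs of records mentioning "sector"
```

A query then reduces to a set lookup (and intersections of sets for multi-term queries), which is what makes warfighter searches over 75 million records fast once the index is built.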

DSC’s cloud technology was modeled after the robust clouds used by the National Security Agency and other intelligence agencies, the GCN article said.

As for the user interface, the Army sought to make it as simple for warfighters to use as the Internet when making queries, said Col. Charles Wells, DCGS-A program manager.

“What’s really key to DCGS-A is the data,” Wells said. “It doesn’t matter how great the tool is; if we don’t have the data behind it, then it’s really meaningless. A lot of the power behind the cloud is the data behind the answer.”

Methodology and survey demographics

Between February 21 and March 1, 2014, 107 subscribers of FCW, GCN and other Public Sector Media Group publications responded to an e-mail survey about networking and storage trends in government agencies. Respondents were those involved with networking and storage operations for their department or agency. Beacon Technology Partners developed the methodology, fielded the survey and compiled the results.

Approximately 88 percent of respondents were technology decision-makers (CIOs or other IT managers or professionals), while 12 percent were senior managers, program managers or other business decision-makers. Approximately 67 percent came from the federal government (34 percent civilian, 33 percent defense) and 33 percent from state or local government agencies.

About this Report

This report was commissioned by the Content Solutions unit, an independent editorial arm of 1105 Public Sector Media Group. Specific topics are chosen in response to interest from the vendor community; however, sponsors are not guaranteed content contribution or review of content before publication. For more information about 1105 Public Sector Media Group Content Solutions, please email us at [email protected].