Cloud computing is not always helpful in data recovery

Post-disaster data recovery has become easier now that storage is automated

Newer technologies such as cloud computing can be a boon for post-disaster data recovery, but they don't always help much, Dennis Heretick, former chief information security officer for the Justice Department, said at a FOSE trade show session today.

“Cloud computing can provide more reliability, but that should not be assumed,” Heretick said. How a specific cloud application fits within an agency’s or company’s disaster recovery strategy should be assessed by each organization individually, he added.

Overall, in the last five years, disaster recovery and business continuity planning have become easier and less costly because of the availability of automated electronic storage processes for critical data, Heretick said.

Even so, there are hurdles to overcome in developing and implementing a disaster recovery plan. Chief among them are winning management support for disaster recovery goals and identifying the roles individuals must fill to execute the plan, then securing their commitment to those roles, Heretick said.

For the typical manager, “disaster recovery planning is important, but not as important as the day-to-day operations,” Heretick said. He suggested gaining support for continuity plans by linking them to specific high-priority missions of the company or agency. For example, a business impact assessment can show what the loss or unavailability of specific types of data would mean for the agency's mission.
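To make that kind of business impact assessment concrete, here is a minimal sketch that ranks data types by the estimated mission impact of losing them during an outage. Every data type and score below is an invented example, not a figure from the session.

    # Illustrative business-impact scoring: rank data types by the
    # estimated mission impact of losing them during an outage.
    # All data types and scores are hypothetical.
    impact_per_day = {
        "case management records": 9,  # mission-impact score, 0 (none) to 10 (severe)
        "payroll data": 7,
        "public web content": 3,
    }

    def rank_by_impact(outage_days: float) -> list[tuple[str, float]]:
        """Rank data types by score * outage length, highest first."""
        return sorted(
            ((name, score * outage_days) for name, score in impact_per_day.items()),
            key=lambda item: item[1],
            reverse=True,
        )

    for name, impact in rank_by_impact(outage_days=2):
        print(f"{name}: estimated mission impact {impact}")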

Heretick and Bill Nichols, senior systems engineer for Mitre Corp., outlined seven tiers of disaster recovery. In the lowest tiers, there is a loss of data, little or no backup, and limited recovery. In the middle tiers, there is manual or automated backup of data. In the upper tiers, there is fully automated backup of data and of applications.
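The session gave only those rough groupings, but a minimal sketch of a tier model along these lines might look like the following. The individual tier names and the recovery-point thresholds are illustrative assumptions, not details from the talk.

    # A sketch of a seven-tier recovery model matching the groupings above.
    from enum import IntEnum

    class RecoveryTier(IntEnum):
        # Lowest tiers: loss of data, little or no backup, limited recovery.
        NO_BACKUP = 1
        AD_HOC_BACKUP = 2
        # Middle tiers: manual or automated backup of data.
        MANUAL_DATA_BACKUP = 3
        SCHEDULED_DATA_BACKUP = 4
        AUTOMATED_DATA_BACKUP = 5
        # Upper tiers: fully automated backup of data and applications.
        AUTOMATED_DATA_AND_APPS = 6
        CONTINUOUS_REPLICATION = 7

    def minimum_tier(max_data_loss_hours: float) -> RecoveryTier:
        """Return the lowest tier that plausibly meets a recovery-point
        objective, i.e., the hours of data loss an agency can tolerate.
        The thresholds are invented for illustration."""
        if max_data_loss_hours <= 0.25:
            return RecoveryTier.CONTINUOUS_REPLICATION
        if max_data_loss_hours <= 4:
            return RecoveryTier.AUTOMATED_DATA_AND_APPS
        if max_data_loss_hours <= 24:
            return RecoveryTier.AUTOMATED_DATA_BACKUP
        return RecoveryTier.MANUAL_DATA_BACKUP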

One of the most important first steps in planning is accurately classifying the data by its importance. The next step is drawing up a plan and identifying roles. Too often, people are assigned a role without being knowledgeable about it or committed to performing it. Those issues can be worked out through discussion and exercises, Heretick said.
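As one way to make those first steps concrete, a data inventory could record each asset's classification alongside a named recovery role and whether that role has been rehearsed. The fields, names, and example entries below are hypothetical illustrations, not anything from Heretick's talk.

    # Hypothetical sketch of a data inventory with recovery-role owners.
    from dataclasses import dataclass

    @dataclass
    class DataAsset:
        name: str
        classification: str       # e.g. "mission-critical", "important", "routine"
        recovery_owner: str       # person assigned to the recovery role
        role_rehearsed: bool = False  # validated through discussion or exercises?

    def unrehearsed(assets: list[DataAsset]) -> list[DataAsset]:
        """Flag assets whose recovery role has never been exercised
        (the 'identified but not committed' gap described above)."""
        return [a for a in assets if not a.role_rehearsed]

    inventory = [
        DataAsset("case files", "mission-critical", "J. Smith", role_rehearsed=True),
        DataAsset("HR records", "important", "unassigned"),
    ]
    for asset in unrehearsed(inventory):
        print(f"Needs an exercise: {asset.name} (owner: {asset.recovery_owner})")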

A simple strategy that can be executed effectively is more worthwhile than a complicated strategy for which managers and IT employees are not fully on board, Heretick said.

“The pitfalls of a disaster recovery plan are too much detail, too much information, and people don’t ‘own’ their roles,” Heretick said.



About the Author

Alice Lipowicz is a staff writer covering government 2.0, homeland security and other IT policies for Federal Computer Week.


Reader comments

Thu, Mar 25, 2010 Chris Canada

Thanks Alice for this article. Heretick said it best: simple strategies that work will always outperform overly complex strategies. Cloud computing definitely provides us with great options for data storage and redundancy, but it also carries a far greater risk that accidentally deleted files will be overwritten as the provider moves or defragments drive space, which can greatly complicate recovery. I would suggest cloud computing as one part of a recovery scheme for any large organization, but having complete data stored at multiple locations is always a priority, and keeping multiple copies is of great importance. Ultimately, though, the single most important factor in any data loss prevention plan is execution: having the people at the ground level verify that backup processes are in place and working, having them test their backups on a regular basis, and auditing any errors or failures in a way that holds them accountable for problems. Most data backup plans that fail do so simply due to lack of follow-through once a plan is enacted. I would be a poor man if this were not true, as I specialize in data recovery. Chris, Armor-IT Data Recovery, Saskatchewan, Canada
