Data Management

4 crucial steps to manage data

Jim McGann is vice president of information discovery at Index Engines, an e-discovery provider based in New Jersey.

Managing data according to ever-changing regulatory and compliance policies is complex. Enormous volumes of sensitive files and e-mail messages are scattered across every organization. That data flows through massive networks and sits locked away in proprietary repositories and archives, which makes access even more of a challenge. As a result, information management strategies are nearly impossible to design and deploy.

Government agencies in particular face information management challenges every day. The related regulations require detailed knowledge, access, reporting, and management of user data and e-mail in order to remain compliant.

Those policies and regulations have motivated government agencies to rethink how they manage user files and e-mail. Agencies need to understand what data exists, develop a plan to manage it, and take action to protect the organization from long-term risk and liability. Here’s how your organization can create a sound and flexible platform that can adapt to new strategies and environments.

1. Find out what you’re dealing with. Knowledge is crucial to creating sound information management and compliance policies and to meeting the demands of initiatives such as cloud on-ramping, intelligent management of big data, deduplication strategies, e-mail categorization, data retention and e-discovery, information governance, and more.

To better support those initiatives, you must know what you have and where it is so you can take the appropriate action. Data resides on desktops, networks, servers and repositories that are dynamic and constantly changing. Both that primary data and the legacy data archived on backup tapes for business continuity are important sources of content.

Understanding user files and e-mail across the entire network is crucial to developing and applying policies. Without detailed knowledge of the content, creating a sound policy is overwhelming, and executing it is impossible. Understanding and profiling that data will drive efficiency and management of the content.

2. Map it out. Data mapping allows government agencies to determine the locations of sensitive data and develop a sound policy and logical plan of action. A data map can provide information such as the age of the data, when it was last accessed or modified, the owner, the location, an e-mail message’s sender and receiver, and even keywords. A data map will deliver the knowledge required to make “keep or delete” decisions for files and e-mail messages and can help you act on those decisions by defensibly deleting what is no longer required and archiving what must be kept.
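The metadata fields described above can be gathered with standard filesystem calls. As an illustrative sketch only (not any specific product's data-mapping tool), the following Python function walks a directory tree and records each file's size, last-modified and last-accessed dates, and owner:

```python
import os
import pwd   # Unix-only; used to resolve the numeric owner ID to a name
import time

def build_data_map(root):
    """Walk a directory tree and collect basic metadata for each file."""
    records = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip unreadable or vanished files
            records.append({
                "path": path,
                "size_bytes": st.st_size,
                "modified": time.strftime("%Y-%m-%d", time.localtime(st.st_mtime)),
                "accessed": time.strftime("%Y-%m-%d", time.localtime(st.st_atime)),
                "owner": pwd.getpwuid(st.st_uid).pw_name,
            })
    return records
```

The resulting records can then feed "keep or delete" rules, for example flagging every file not modified in seven years for defensible deletion review. E-mail metadata (sender, receiver, keywords) would require a mail-store API rather than filesystem calls.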

3. Develop a plan. Breaking down the data into logical segments makes the process manageable and achievable. Defining a plan that targets the highest-risk data first is essential. The highest-risk data environments are typically e-mail servers and legacy tapes. Your policy can initially focus on the riskiest data and continue down to lower-risk items. That approach makes a monumental task more manageable.
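One way to make that ordering concrete is to assign each data source a risk tier and work through the queue from highest risk to lowest. A minimal sketch, where the tier assignments are hypothetical examples rather than prescribed values:

```python
# Hypothetical risk tiers: lower number = higher risk, processed first.
RISK_TIERS = {
    "email_server": 1,
    "legacy_tapes": 1,
    "file_shares": 2,
    "desktops": 3,
}

def prioritize(sources):
    """Order data sources so the riskiest segments are handled first."""
    return sorted(sources, key=lambda s: RISK_TIERS.get(s, 99))
```

Unknown sources fall to the back of the queue (tier 99), so new environments can be added to the plan without blocking work on the known high-risk segments.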

4. Make use of software tools. Federal and state agencies are using indexing software to help inventory data assets and apply policies that ensure the data is protected and managed appropriately. Summary reports can be generated to profile the content, providing a high-level view or a drilled-down summary of specific owners, servers or date ranges. Metadata information can also be exported and loaded into reporting tools to generate comprehensive summary reports.
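Once metadata has been exported, a drilled-down summary by owner, server or date range is a simple aggregation. An illustrative sketch, not tied to any particular indexing product, assuming records shaped like those a data map produces:

```python
from collections import defaultdict

def summarize_by_owner(records):
    """Roll up file counts and total size per data owner."""
    summary = defaultdict(lambda: {"files": 0, "bytes": 0})
    for rec in records:
        entry = summary[rec["owner"]]
        entry["files"] += 1
        entry["bytes"] += rec["size_bytes"]
    return dict(summary)
```

The same pattern works for grouping by server or by modification-date range; in practice the exported metadata would be loaded into a reporting tool rather than summarized by hand.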

Hoarding data for years is no longer a suitable practice. Implementing a defensible deletion methodology saves time and expense while ensuring that you keep only the data you need.

About the Author

Jim McGann is vice president of information discovery at Index Engines, an e-discovery provider based in New Jersey. McGann is a frequent writer and speaker on the topics of big data, backup tape remediation, e-discovery and records management.

