Big data and disasters

As an emergency management tool, big-data analytics can improve situational awareness, accurately predict outcomes and offer improved security, according to a new report from the National Security Telecommunications Advisory Committee.

However, major technical and legal impediments bar the way to the successful sharing of data across disparate systems, the report states.

NSTAC, which is made up of 30 private-sector executives in communications and technology, is seeking protection and immunity for companies that share data during emergencies such as weather disasters, cyberattacks on critical infrastructure and physical attacks by terrorists or other adversaries.

The report cites instances of big-data analytics being used to optimize the delivery of critical services at scale in health care, transportation and city management. The committee members found three common elements among successful users of big-data analytics: well-defined industry terminology, a focus on infrastructure and data systems, and well-trained personnel.

Furthermore, interoperable technologies and analytics can help government detect, protect against, prepare for and respond to emergencies more rapidly and effectively through improved situational awareness and more accurately projected outcomes.

However, challenges remain: standardizing the terminology, policies and plans needed for efficient data sharing during a national security crisis; providing greater clarity on privacy and security best practices; and addressing a skills gap in the workforce.

Ignoring the potential value of big-data analytics would be a mistake, and even though a more expansive implementation would be challenging, it is achievable, the report states.

NSTAC recommended developing a framework that would offer companies immunity from consumer protection, antitrust, and other laws and legal frameworks that govern data sharing.

"By relying on a framework of 'Good Samaritan' protections, both government and private entities would have a clear understanding of rules regarding the protection of privacy, data use, ownership, storage, retention, accidental disclosure and deletion," the report states.

It also encourages the federal government to collaborate with the private sector to establish community-driven standards, improve the capacity and effectiveness of industry-provided services that can aid national security, increase data sharing and integration across government agencies, clearly define data assurance plans for big-data contracts and increase the IT talent pool by improving data science skills and training requirements.

The report also offers three case-study scenarios -- a natural disaster, a man-made disaster and a cyberattack on critical infrastructure -- to measure the potential impact of big-data analytics on non-military emergency response procedures.

About the Author

Chase Gunter is a former FCW staff writer.
