Big data and disasters
- By Chase Gunter
- May 11, 2016
As an emergency management tool, big-data analytics can improve situational awareness, accurately predict outcomes and offer improved security, according to a new report from the National Security Telecommunications Advisory Committee.
However, major technical and legal impediments bar the way to the successful sharing of data across disparate systems, the report states.
NSTAC, which is made up of 30 private-sector executives in communications and technology, is seeking protection and immunity for companies that share data during emergencies such as weather disasters, cyberattacks on critical infrastructure and physical attacks by terrorists or other adversaries.
The report cites instances of big-data analytics being used to optimize the delivery of critical services at scale in health care, transportation and city management. The committee members found three common elements among successful users of big-data analytics: well-defined industry terminology, a focus on infrastructure and data systems, and well-trained personnel.
Furthermore, interoperable technologies and analytics can help government detect, protect against, prepare for and respond to emergencies more rapidly and effectively through augmented situational awareness and more accurately projected outcomes.
However, challenges remain: the need to standardize terminology, policies and plans so data can be shared efficiently during a national security crisis; to provide greater clarity on privacy and security best practices; and to address a skills gap in the workforce.
Ignoring the potential value of big-data analytics would be a mistake, and even though a more expansive implementation would be challenging, it is achievable, the report states.
NSTAC recommended developing a framework that would offer companies immunity from consumer protection, antitrust and other laws and legal frameworks that govern data sharing.
"By relying on a framework of 'Good Samaritan' protections, both government and private entities would have a clear understanding of rules regarding the protection of privacy, data use, ownership, storage, retention, accidental disclosure and deletion," the report states.
It also encourages the federal government to collaborate with the private sector to establish community-driven standards, improve the capacity and effectiveness of industry-provided services that can aid national security, increase data sharing and integration across government agencies, clearly define data assurance plans for big-data contracts and increase the IT talent pool by improving data science skills and training requirements.
The report also offers three case-study scenarios -- a natural disaster, a man-made disaster and a cyberattack on critical infrastructure -- to measure the potential impact of big-data analytics on non-military emergency response procedures.
Chase Gunter is a staff writer covering civilian agencies, workforce issues, health IT, open data and innovation.
Prior to joining FCW, Gunter reported for the C-Ville Weekly in Charlottesville, Va., and served as a college sports beat writer for the South Boston (Va.) News and Record. He started at FCW as an editorial fellow before joining the team full-time as a reporter.
Gunter is a graduate of the University of Virginia, where his emphases were English, history and media studies.