Data tools outstrip laws, policies

Since the 2001 terrorist attacks, the government has enacted a range of information-based programs and policies designed to improve information sharing and help analysts identify patterns and connections. But programs that employ tactics such as data mining and behavioral surveillance have stoked worries that laws, policies and oversight might not be keeping pace with new technology.

A widely anticipated report released earlier this month found that laws have not kept pace and recommends that the next administration and Congress review the current legal framework for such programs.

“The laws and regulations in this field are out of date,” said William Perry, a former Defense secretary and co-chairman of a panel that authored the report on counterterrorism technologies and privacy. “It is not enough to say that the government must obey its own laws in this field if we recognize that the laws are inadequate and the laws need to be changed.”

Perry made the comments Oct. 7 during a press conference where the report, which was co-sponsored by the Homeland Security Department and the National Science Foundation, was released. The government did not review the report, issued by the National Research Council of the National Academies, before its release.

The report, called “Protecting Individual Privacy in the Struggle Against Terrorists,” also laid out an extensive framework for how the government should assess and oversee data and information-based programs used to fight terrorism.

The panel also said that just because a data-mining tool works in commercial applications does not mean it will work to combat terrorism, which the committee called a vastly more difficult problem.

The panel did not make recommendations on any one program and only looked at unclassified programs. However, the group said its findings were also applicable to classified programs.

Providing a framework
The panel also laid out a detailed framework for how programs should be evaluated for their effectiveness and effect on privacy.

Fred Cate, a panel member and director of Indiana University’s Center for Applied Cybersecurity Research, said the framework is based on the conviction that using data to fight terrorism has unavoidable effects on privacy.

Cate, speaking at the report’s release, also said the framework was designed to address the tendency to throw solutions at a problem without first testing them and evaluating their impact.

“The framework is not designed simply to — if you will — balance privacy and security but rather with the intention of accomplishing both privacy and security within [the] technological means available to the government,” he said.

The panel’s recommendations addressed issues concerning the actual data used in the programs, the information-based programs that are being used for counterterrorism and the government’s oversight and organizational requirements. The panel emphasized the quality of the information over the quantity.

“More data does not mean better data,” Perry said.

Cate echoed that sentiment at the report’s release. “If you combine bad data, you just get worse data,” he said.

Civil liberties advocates not involved with the panel have voiced similar concerns during the past several years.

“The right analogy here is a haystack in which we seem to think if you pour more hay on the stack, you can find the needles,” said Barry Steinhardt, director of the Technology and Liberty Project at the American Civil Liberties Union. “It defies all logic.”

Perry said the panel agreed that pattern-based analysis techniques, such as data mining, can potentially be a useful way to help analysts determine where to look but concluded they should never be used as an automated terrorist identifier.

He said the panel had similar reservations about modern behavioral surveillance techniques. Behavioral surveillance could help detect individuals whose behavior in one way or another deviates from the norm, but it should not be a tool for concluding that a person is a terrorist.

“These kinds of techniques, to the extent that they’re used, have to be used very carefully, and [we] have to understand that there is going to be a very high probability of false positives,” Perry said.
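Perry’s warning about false positives reflects the base-rate problem: when true targets are extremely rare, even a very accurate screening tool flags far more innocent people than real targets. The sketch below works through that arithmetic with purely hypothetical numbers chosen only to illustrate the effect; they do not come from the report.

```python
# Illustrative base-rate arithmetic: even an accurate screening tool
# produces mostly false positives when true targets are rare.
# All numbers below are hypothetical, chosen only to show the effect.

def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """Probability that a flagged individual is a true target (Bayes' rule)."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Suppose 100 true targets in a population of 100 million (prevalence 1e-6),
# and a tool that catches 99% of targets with only a 0.1% false-positive rate.
ppv = positive_predictive_value(1e-6, 0.99, 0.001)
print(f"{ppv:.4%}")
```

Under these assumed numbers, fewer than one in a thousand people flagged would actually be a target, roughly a thousand false alarms for every real one, which is why the panel cautioned against treating such flags as identifications.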

Defining terms
One challenge in developing policies about data mining is defining exactly what it is.
In its 2007 annual report to Congress on the department’s data-mining activities, DHS’ privacy office said no consensus existed then on what constitutes data mining.

“In colloquial use, data mining generally refers to any predictive, pattern-based technology,” DHS officials wrote. Different government reports have used different definitions, some narrower than others, according to the DHS report.

However, the Data Mining Reporting Act, passed as part of a major anti-terrorism law in 2007, defines the term as “a program involving pattern-based queries, searches, or other analyses of one or more electronic databases.” In February, DHS released a letter on its use of data mining that applied this definition.

In its 2008 updated report on the department’s data-mining activities, DHS identified four programs that include some measures of data mining: Customs and Border Protection’s Automated Targeting System Inbound and Outbound Cargo Analysis; Immigration and Customs Enforcement’s Data Analysis and Research for Trade Transparency System; the Law Enforcement Analysis Data System; and the Transportation Security Administration’s Freight Assessment System.

DHS spokeswoman Caroline Dierker said the committee’s suggestions were in line with the steps the department takes to ensure privacy is protected in its programs.

“What the report suggests gels quite well with what we are already doing,” she said.

Timothy Edgar, deputy for civil liberties for the Office of the Director of National Intelligence and a former ACLU employee, also said he welcomed the report’s findings and said that the intelligence community undertakes these types of evaluations for data mining and other programs. However, he added that national security systems are not required to undergo the same types of privacy assessments as other programs.

Edgar said data mining plays a lesser role in the government’s counterterrorism efforts than many people think.

The government more broadly uses a number of link analysis tools that look across several different types of data to uncover the connections that terrorist networks and groups have to each other, Edgar said.

But the Data Mining Reporting Act wouldn’t describe those activities as data mining, Edgar said.

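The distinction Edgar draws is that link analysis starts from a known entity and follows its recorded connections outward, rather than running pattern-based queries across a whole database. A minimal sketch of that idea, using an invented toy graph (all names and links are hypothetical), might look like this:

```python
from collections import deque

# Toy link analysis: starting from a known entity, follow recorded
# connections (e.g., calls or transactions) outward a fixed number of
# hops. All names and links here are invented for illustration.
links = {
    "suspect_a": ["person_b", "person_c"],
    "person_b": ["person_d"],
    "person_c": [],
    "person_d": ["person_e"],
    "person_e": [],
}

def within_hops(graph, start, max_hops):
    """Return every entity reachable from `start` in at most `max_hops` links."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if dist == max_hops:
            continue  # do not expand beyond the hop limit
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return seen - {start}

print(sorted(within_hops(links, "suspect_a", 2)))
# → ['person_b', 'person_c', 'person_d']
```

Because the search is anchored to a specific starting point, it examines only records connected to that entity, which is why such tools fall outside the act’s pattern-based definition of data mining.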
Edgar added that although laws were written before many of these tools existed, new laws for data mining or other information-based programs need to be specific.

However, the ACLU’s Steinhardt said the laws need to be amended.

“The laws are just outdated; they don’t take into account this kind of searching,” he said.

At the core of the report is a framework for evaluating the programs for effectiveness and consistency with U.S. law, said Charles Vest, one of the panelists.

Perry added that the response itself can undermine the nation’s values: “The terrorists can win in two different ways: they can win by having successful attacks or they can win by our response.”

About the Author

Ben Bain is a reporter for Federal Computer Week.

