Data Analytics

What the NSA can't do with your data (probably)


The National Security Agency probably isn't spying on you. It's just measuring you for risk, according to two experts on the science of predictive analytics and data mining.

The NSA's Prism program, it has now been revealed, collects communications data from leading online commercial services, and the agency also gathers metadata and envelope information from mobile providers, including Verizon.

Neither of the experts has direct knowledge of classified intelligence work, but they do understand the capabilities and limitations of the data the NSA is gathering. They believe the NSA uses the information to mine for connections between individuals and to pick up whiffs of plots that might be directed at the U.S. homeland or its interests abroad.

While the amount of information subject to NSA's analysis is almost inconceivably vast, the methods used to analyze it are likely similar to the commercially available applications that marketers use to predict behavior or that auditors use to detect fraud, according to Eric Siegel, founder of Prediction Impact and author of the book "Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die."


It is useful to think of analytics as a winnowing process, Siegel said. Rather than finding a needle in a haystack, it makes smaller and smaller haystacks, and decides which of those are most likely to be concealing the hidden needles. "It's all about tipping the odds, and making more efficient use of human investigators," Siegel said.
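
To make the winnowing idea concrete, here is a minimal sketch in Python of that kind of triage: score every record, then hand only the riskiest slice to human investigators. Every field name, weight and threshold below is invented for illustration; nothing here reflects any actual NSA system.

```python
# Minimal sketch of analytic winnowing: score every record, then pass only
# the top-scoring slice to human investigators. All field names, weights and
# thresholds are hypothetical illustrations.

def risk_score(record):
    """Combine a few flags into one number; higher means more suspicious."""
    score = 0.0
    score += 2.0 if record.get("contacts_flagged_number") else 0.0
    score += 1.5 if record.get("unusual_travel") else 0.0
    score += 0.5 * record.get("late_night_call_ratio", 0.0)
    return score

def winnow(records, keep_fraction=0.01):
    """Shrink the haystack: keep only the top-scoring fraction of records."""
    ranked = sorted(records, key=risk_score, reverse=True)
    cutoff = max(1, int(len(ranked) * keep_fraction))
    return ranked[:cutoff]
```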

It's also about teaching the computers how to look, said Dean Abbott, a data mining expert and founder of Abbott Analytics. "From a pure data standpoint, you can see where all those connections lead – that's where you find the unexpected," Abbott said.
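
A toy illustration of following those connections: treat call metadata as a graph and walk outward from a number of interest. The call records below are invented, and real link-analysis tools are far more elaborate, but the principle is the same.

```python
from collections import defaultdict, deque

# Toy link analysis over call metadata: build a contact graph from
# (caller, callee) pairs and find every number within N hops of a
# number of interest. The records are invented for illustration.

calls = [("555-0001", "555-0002"), ("555-0002", "555-0003"),
         ("555-0003", "555-0004"), ("555-0009", "555-0010")]

graph = defaultdict(set)
for a, b in calls:
    graph[a].add(b)
    graph[b].add(a)  # treat contact as symmetric

def within_hops(start, max_hops):
    """Breadth-first search: all numbers reachable within max_hops calls."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return seen - {start}

print(within_hops("555-0001", 2))  # {'555-0002', '555-0003'}
```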

For instance, patterns might emerge from the historical data trail left by the Boston marathon bombers that could help identify threats in the future. "Now we can label everything they did, and roll back the data clock and replay what those communications were like, and ask whether there were tell-tale signs that something was wrong," Abbott said.

This kind of "supervised learning" is fundamental to the operation of technologies most Americans use every day, from e-mail spam filters to Netflix movie recommendations.

Spam filters come equipped with settings that identify junk messages based on how an email is addressed, who sent it and what it contains. Individual users train their filters by deciding which emails to mark as suspect. Netflix users, meanwhile, get movie recommendations based on individual taste profiles and the historical likes and dislikes of similar users. The more movies a user rates, the more refined the profile becomes and, in theory, the more likely it is that a Netflix suggestion will fit the user's taste. Abbott said he can imagine something similar at work on the NSA's computers, parsing connections among phone calls, geo-location data, web activity, social network contacts and more.
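
The supervised-learning mechanics behind such a filter can be sketched in a few lines. The naive Bayes classifier below is one common technique, not necessarily the one any particular filter uses; it learns word frequencies from labeled examples and scores new messages. The training messages are invented.

```python
import math
from collections import Counter

# Bare-bones naive Bayes spam filter: supervised learning from labeled
# examples, of the kind the article describes. All messages are invented.

spam = ["win cash now", "cheap pills now"]
ham = ["meeting at noon", "lunch at noon tomorrow"]

def word_counts(docs):
    counts = Counter()
    for d in docs:
        counts.update(d.split())
    return counts

spam_counts, ham_counts = word_counts(spam), word_counts(ham)
vocab = set(spam_counts) | set(ham_counts)
total_docs = len(spam) + len(ham)

def log_prob(msg, counts, n_class):
    """log P(class) plus log P(word|class) per word, Laplace-smoothed."""
    total_words = sum(counts.values())
    lp = math.log(n_class / total_docs)
    for w in msg.split():
        lp += math.log((counts[w] + 1) / (total_words + len(vocab)))
    return lp

def is_spam(msg):
    return log_prob(msg, spam_counts, len(spam)) > log_prob(msg, ham_counts, len(ham))

print(is_spam("cheap cash now"))  # True
print(is_spam("lunch meeting"))   # False
```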

"Think of it as a nefarious activity filter," Abbott said. The scale of the enterprise doesn't present any hurdles. "There's no technical problem with being able to crunch through all this data. With enough resources, you can do just about anything," Abbott said. In his career working with commercial and government clients, Abbott has not worked with petabytes worth of data, but private companies like the telecom providers and Google are certainly set up to handle that level of processing power, and the NSA with its computing power is surely equipped to do more.


Marketers use data mining in social networks to identify individuals who are likely to exert influence over the consumer choices of their friends, family and followers. By keying in on people who fit a certain profile, marketers can target them with messages designed to filter down into their networks. "Figuring out one of those items isn't rocket science," Siegel said. "The rocket science is in the ability to learn how to weight those things, in combination, to make one predictive number as an output for the model."
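
One common way to produce that single predictive number is logistic regression: each input gets a learned weight, and a sigmoid function squashes the weighted sum into a probability. The features and weights below are made up for illustration; in practice they would be learned from labeled historical data.

```python
import math

# Sketch of combining weighted inputs into one predictive number, as Siegel
# describes. Feature names and weights are invented for illustration; real
# models learn the weights from labeled historical data.

weights = {
    "follower_count_log": 0.8,  # larger audiences carry more influence
    "reshare_rate": 1.6,        # how often their posts get passed along
    "topic_match": 1.1,         # overlap with the product category
}
bias = -3.0

def influence_probability(features):
    """Logistic regression: sigmoid of the weighted feature sum."""
    z = bias + sum(weights[k] * features.get(k, 0.0) for k in weights)
    return 1.0 / (1.0 + math.exp(-z))

print(influence_probability(
    {"follower_count_log": 4.0, "reshare_rate": 0.5, "topic_match": 1.0}))
# ~0.89: flagged as a likely influencer
```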

Separate and apart from any legal, ethical or constitutional problems that arise from the alleged data collection by the NSA, the agency's computer scientists are trusted with making sure their models are accurate. "Algorithms don't know about common sense," Abbott said. "If your data is bad, they'll infer the wrong thing."

If the press reports are accurate, however, the NSA is getting "very highly potent data," Siegel said. He doesn't doubt that the NSA could produce models that would yield valuable insights into the risks posed by individuals whose contacts, social data and interactions suggest links to criminal activity. "It can be any little arcane aspect of your behavior," Siegel said. "People in this field discover things that are quite telling, that tip the balance one factor at a time. Data is really predictive."


Reader comments

Tue, Jun 18, 2013

Maybe we're getting too fancy trying to do more with less. I say hire more people to do this data gathering with some reliance on filters, but let's never fool ourselves that filters can do what "boots on the ground" can do. "Doing more with less" has its limits.

Tue, Jun 18, 2013

Even if it is shown that plots have been foiled using these filters, should we then simply accept that everything's OK and give our government the green light to proceed? Even though ours is a government "of the people, by the people and for the people," there are many in appointed or elected positions who don't realize that. I thought I'd never live to see it in the US, but the government is starting to act like "them" vs. the people ("us").

Tue, Jun 18, 2013

"Algorithms don't know about common sense," Abbott said. "If your data is bad, they'll infer the wrong thing."

Well, this goes in the "duh" category. Another way of saying this is, "garbage in, garbage out." And the seemingly least important piece of erroneous "metadata" can result in a garbage inference. But, and here's the catch, these algorithms are so complex that over time no one will be able to refute their inferences (conclusions) and will accept them as fact. Now this is something we should all be very wary of. The paradigm will shift to, "garbage in, correct conclusion out."

Tue, Jun 18, 2013

What has happened to our government? I am getting more and more nervous about trusting what "they" say, primarily the president.

Thu, Jun 13, 2013

PW3N obviously didn't read "NSA director says surveillance programs thwarted ‘dozens’ of attacks" in the Washington Post.
