What the NSA can't do with your data (probably)
- By Adam Mazmanian
- Jun 12, 2013
The National Security Agency probably isn't spying on you. It's just measuring you for risk, according to two experts on the science of predictive analytics and data mining.
The NSA's Prism program, it has now been revealed, collects communications data from leading online commercial services and gathers metadata and envelope information from mobile providers, including Verizon.
Neither of the experts has direct knowledge of classified intelligence work, but they do understand the capabilities, and limitations, of the data the NSA is gathering. They believe the NSA uses the information to mine for connections between individuals and to pick up whiffs of plots that might be directed at the U.S. homeland or its interests abroad.
While the amount of information subject to NSA's analysis is almost inconceivably vast, the methods used to analyze it are likely similar to the commercially available applications that marketers use to predict behavior or that auditors use to detect fraud, according to Eric Siegel, founder of Prediction Impact and author of the book "Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die."
It is useful to think of analytics as a winnowing process, Siegel said. Rather than finding a needle in a haystack, it makes smaller and smaller haystacks, and decides which of those are most likely to be concealing the hidden needles. "It's all about tipping the odds, and making more efficient use of human investigators," Siegel said.
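To make the winnowing idea concrete, here is a minimal sketch in Python. It is purely illustrative: the population, the 0.1 percent base rate and the model scores are invented, and the "model" is just a noisy number that correlates weakly with the true label. The point is only that ranking by score and reviewing the top slice concentrates the needles into a much smaller haystack.

```python
import random

random.seed(0)

# Hypothetical population: each record has a true label (1 = needle)
# and a score from an imagined model that only weakly tracks the label.
population = []
for _ in range(100_000):
    needle = random.random() < 0.001          # 0.1% base rate
    score = random.random() + (0.5 if needle else 0.0)
    population.append((score, needle))

# Winnowing: rank by score and hand only the top 1% to human analysts.
population.sort(reverse=True)
top = population[:1000]

base_rate = sum(n for _, n in population) / len(population)
top_rate = sum(n for _, n in top) / len(top)
print(f"base rate:   {base_rate:.4f}")
print(f"top-1% rate: {top_rate:.4f}")   # the smaller haystack is denser
```

The top 1 percent ends up many times richer in needles than the population as a whole, which is exactly the "tipping the odds" that makes human investigators more efficient.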
It's also about teaching the computers how to look, said Dean Abbott, a data mining expert and founder of Abbott Analytics. "From a pure data standpoint, you can see where all those connections lead – that's where you find the unexpected," Abbott said.
For instance, patterns might emerge from the historical data trail left by the Boston marathon bombers that could help identify threats in the future. "Now we can label everything they did, and roll back the data clock and replay what those communications were like, and ask whether there were tell-tale signs that something was wrong," Abbott said.
This kind of "supervised learning" is fundamental to the operation of technologies most Americans use every day, from e-mail spam filters to Netflix movie recommendations.
Spam filters come equipped with settings that identify junk messages based on the way email is addressed, the sender and the content. Individual users train their filters by making decisions on which emails to mark as suspect. Netflix users, meanwhile, get movie recommendations based on individual taste profiles, and the historical likes and dislikes of similar users. The more movies a user rates, the more refined the profile becomes and theoretically the more likely it is that a Netflix suggestion will fit with a user's taste. Abbott said he can imagine something similar at work on the NSA computers, parsing connections between phone calls, geo-location data, web activity, social network contacts, and more.
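A spam filter of the kind described above can be sketched in a few lines. This is a toy example, not anything the NSA is known to run: the six training messages stand in for a user marking mail as spam or legitimate, and the classifier is a multinomial naive Bayes model with add-one smoothing, a standard textbook choice for spam filtering.

```python
from collections import Counter
import math

# Toy training data, analogous to a user marking messages spam (1) or ham (0).
train = [
    ("win money now", 1), ("cheap money offer", 1), ("win big prize", 1),
    ("meeting at noon", 0), ("lunch at noon tomorrow", 0), ("project meeting notes", 0),
]

# Count word frequencies per class.
word_counts = {0: Counter(), 1: Counter()}
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for c in word_counts.values() for w in c}

def score(text, label):
    # Log-probability of the message under one class, with add-one smoothing.
    total = sum(word_counts[label].values())
    logp = math.log(class_counts[label] / len(train))
    for w in text.split():
        logp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
    return logp

def classify(text):
    return 1 if score(text, 1) > score(text, 0) else 0

print(classify("win money prize"))   # 1: resembles the spam examples
print(classify("meeting tomorrow"))  # 0: resembles the legitimate mail
```

Every message the user labels adds to the counts, so the filter's judgments sharpen over time, the same feedback loop that refines a Netflix taste profile.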
"Think of it as a nefarious activity filter," Abbott said. The scale of the enterprise doesn't present any hurdles. "There's no technical problem with being able to crunch through all this data. With enough resources, you can do just about anything," Abbott said. In his career working with commercial and government clients, Abbott has not worked with petabytes worth of data, but private companies like the telecom providers and Google are certainly set up to handle that level of processing power, and the NSA with its computing power is surely equipped to do more.
Marketers use data mining in social networks to identify individuals who are likely to exert influence over the consumer choices of their friends, family, and followers. By keying in on people who fit a certain profile, they can be targeted with messages designed to filter down into their networks. "Figuring out one of those items isn't rocket science," Siegel said. "The rocket science is in the ability to learn how to weight those things, in combination, to make one predictive number as an output for the model."
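The "one predictive number" Siegel describes can be illustrated with a simple logistic model. Everything here is hypothetical: the feature names and weights are invented for illustration, and in a real system the weights would be learned from labeled historical data rather than set by hand. The sketch only shows the mechanics of combining weighted factors into a single score between 0 and 1.

```python
import math

# Hypothetical features and hand-set weights, purely for illustration;
# a real model would learn these weights from labeled historical data.
weights = {"odd_hours_activity": 1.2, "new_contact_burst": 0.8,
           "flagged_keyword": 2.0}
bias = -4.0

def risk_score(features):
    # Weighted sum of the factors, squashed to a 0-1 score by the
    # logistic function -- the single predictive number the model outputs.
    z = bias + sum(weights[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-z))

low = risk_score({"odd_hours_activity": 0, "new_contact_burst": 0,
                  "flagged_keyword": 0})
high = risk_score({"odd_hours_activity": 1, "new_contact_burst": 1,
                   "flagged_keyword": 1})
print(round(low, 3), round(high, 3))   # more factors present, higher score
```

No single factor decides the outcome; as Siegel says, the model's value is in how the factors are weighted in combination.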
Separate and apart from any legal, ethical, or constitutional problems arising from the NSA's alleged data collection, there is the question of whether its computer scientists can ensure their models are accurate. "Algorithms don't know about common sense," Abbott said. "If your data is bad, they'll infer the wrong thing."
If the press reports are accurate, however, the NSA is getting "very highly potent data," Siegel said. He doesn't doubt that the NSA could produce models yielding valuable insights into the risks posed by individuals whose contacts, social data, and interactions suggest links to criminal activity. "It can be any little arcane aspect of your behavior," Siegel said. "People in this field discover things that are quite telling, that tip the balance one factor at a time. Data is really predictive."
Adam Mazmanian is executive editor of FCW.
Before joining the editing team, Mazmanian was an FCW staff writer covering Congress, government-wide technology policy and the Department of Veterans Affairs. Prior to joining FCW, Mazmanian was technology correspondent for National Journal and served in a variety of editorial roles at B2B news service SmartBrief. Mazmanian has contributed reviews and articles to the Washington Post, the Washington City Paper, Newsday, New York Press, Architect Magazine and other publications.
Connect with him on Twitter at @thisismaz.