
For NGA, key to open source might be embracing uncertainty

By Sean Lyngaas

NGA Deputy Director Susan Gordon says open-source data presents a challenge about "how to treat it as seriously" as traditional forms of data. (Image: NGA photographer Tony Boone / Flickr)

The National Geospatial-Intelligence Agency has made a point of hitching its operations to the recent boom in open-source data. In the last few months alone, the intelligence community’s mapping agency has released a software toolkit for crowdsourcing on GitHub and created a website for sharing unclassified information on the Arctic.

With that work in full swing, the next step for the agency’s push to leverage open source information could be philosophical rather than technical.

NGA has traditionally treated locational accuracy as “one of our zero-tolerance businesses,” with expectations of 100 percent accuracy, Deputy Director Susan Gordon said in a recent interview. “I think it’s a challenge, but I think we need to be less binary about perfect precision for every single one of our customers.”

In other words, imprecise data is better than none at all, and embracing uncertainty might become an imperative in a world suffused with open-source data whose absolute accuracy can be challenging to verify.
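
What "less binary" might look like in practice can be sketched in a few lines of code. The example below is purely illustrative, not NGA software: each open-source report carries a coordinate plus an explicit error radius, and independent reports are fused by inverse-variance weighting, so both the estimate and its stated uncertainty improve as reports accumulate.

```python
from dataclasses import dataclass
from math import sqrt

@dataclass
class Report:
    """One open-source location report with an explicit error estimate."""
    lat: float      # degrees
    lon: float      # degrees
    sigma_m: float  # 1-sigma positional uncertainty, in meters

def fuse(reports: list[Report]) -> Report:
    """Combine independent reports by inverse-variance weighting.

    Assumes the reports cluster tightly enough to treat lat/lon as
    planar, and that errors are independent and roughly Gaussian.
    """
    weights = [1.0 / r.sigma_m ** 2 for r in reports]
    total = sum(weights)
    lat = sum(w * r.lat for w, r in zip(weights, reports)) / total
    lon = sum(w * r.lon for w, r in zip(weights, reports)) / total
    # The fused uncertainty shrinks as reports accumulate.
    return Report(lat, lon, sqrt(1.0 / total))

# Three imprecise crowd reports of the same event:
est = fuse([Report(38.8895, -77.0353, 500.0),
            Report(38.8902, -77.0341, 300.0),
            Report(38.8898, -77.0360, 800.0)])
print(f"{est.lat:.4f}, {est.lon:.4f} ± {est.sigma_m:.0f} m")
```

No single report here is precise, but the fused estimate is tighter than the best of them – the "imprecise data is better than none at all" argument in miniature.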

The GEOINT Pathfinder project is the agency’s way of planting its flag in the open-source space. The project, announced in March, draws on both in-house and outside expertise to deliver unclassified geospatial intelligence to customers on mobile devices.

NGA Director Robert Cardillo’s statement announcing the project was telling. While classified information will always be valuable to the agency, he said, “we cannot always view unclassified information as supplemental. Moving forward the reverse is more likely to be true – that which is exquisite but classified will supplement an ever broader and richer unclassified base.”

For Gordon, Pathfinder’s purpose is “to teach us how to really use the power of open source without the crutch of the data we’ve always used.” The “crutch” Gordon referred to was the data collection regime built around the National Photographic Interpretation Center, an image-analysis outfit established at the CIA in 1961. NPIC was eventually folded into the agency that would become the NGA, and its in-house model of collecting geospatial intelligence is a far cry from the multi-source collection regime of today.

The NPIC collection regime is “what we know, that’s what I grew up with and that’s what’s taught,” said Gordon, who spent more than 25 years at the Central Intelligence Agency before moving to NGA in January. “This explosion in open source is an interesting challenge about how to treat it as seriously, use it as surely as the stuff that was collected for our use.”

New postures

One doesn’t need to look hard to find signs of intelligence agencies’ greater emphasis on open-source information, and even the traditionally clandestine are in on the act. CIA Director John Brennan is the intelligence community’s designated “functional manager” for open-source intelligence, for example.

There was also an entire conference dedicated to open-source intelligence at Georgetown University last week. Gary Dunow, director of NGA’s analysis directorate, took note of the rise of the “volunteered geographic information” community – people who contribute locational data to a map of a protest going on overseas, for example. “In order for us to take advantage of that, we need to move away from our closed intelligence architecture and move into the open world where everybody else is right now,” Dunow said.
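
Dunow’s “volunteered geographic information” lends itself to a concrete sketch. The snippet below is a hypothetical illustration (the field layout follows the GeoJSON convention, not any NGA system): crowd-submitted point features are binned into a coarse grid, and only cells where several independent contributors agree are kept – a crude consensus filter for open, unvetted data.

```python
from collections import Counter

def hotspots(features: list[dict], cell_deg: float = 0.01, min_reports: int = 3) -> dict:
    """Bin GeoJSON point features into a lat/lon grid and keep the cells
    where at least `min_reports` contributors agree (illustrative only).
    """
    cells = Counter()
    for f in features:
        lon, lat = f["geometry"]["coordinates"]  # GeoJSON order is [lon, lat]
        cells[(round(lat / cell_deg), round(lon / cell_deg))] += 1
    return {(i * cell_deg, j * cell_deg): n
            for (i, j), n in cells.items() if n >= min_reports}

# Four volunteers report roughly the same spot; one cell clears the bar.
reports = [{"geometry": {"coordinates": [30.52, 50.45]}} for _ in range(4)]
print(hotspots(reports))
```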

The head of the intelligence community’s R&D arm offered another example of just how crucial open-source intelligence can be.

“One of the virtues of open source is that it’s especially relevant for the kinds of events that the intelligence community cares about that don’t involve adversaries, in the traditional sense,” said IARPA Director Jason Matheny. “We don’t … have a traditional intelligence collection posture against disease. We have to use non-traditional, open-source data in order to detect disease outbreaks at the earliest moment” and even predict them, he said.
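
Matheny’s disease example can also be made concrete. A minimal, hypothetical sketch of the idea (not IARPA’s actual method): track a daily open-source signal, say counts of symptom-related posts, and flag any day that jumps well above a rolling baseline.

```python
import statistics

def flag_anomalies(counts: list[int], window: int = 14, k: float = 3.0) -> list[int]:
    """Return the indices of days whose count exceeds the rolling
    baseline mean by more than k standard deviations (toy detector).
    """
    flagged = []
    for day in range(window, len(counts)):
        baseline = counts[day - window:day]
        mu = statistics.fmean(baseline)
        sd = statistics.pstdev(baseline) or 1.0  # guard against a flat baseline
        if counts[day] > mu + k * sd:
            flagged.append(day)
    return flagged

# Two quiet weeks, then a spike on day 14:
counts = [5, 6, 4, 5, 7, 5, 6, 5, 4, 6, 5, 7, 6, 5, 40]
print(flag_anomalies(counts))  # [14]
```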

About the Author

Sean Lyngaas is an FCW staff writer covering defense, cybersecurity and intelligence issues. Prior to joining FCW, he was a reporter and editor at Smart Grid Today, where he covered everything from cyber vulnerabilities in the U.S. electric grid to the national energy policies of Britain and Mexico. His reporting on a range of global issues has appeared in publications such as The Atlantic, The Economist, The Washington Diplomat and The Washington Post.

Lyngaas is an active member of the National Press Club, where he served as chairman of the Young Members Committee. He earned his M.A. in international affairs from The Fletcher School of Law and Diplomacy at Tufts University, and his B.A. in public policy from Duke University.


