Open government initiatives might not work, CRS suggests

New report suggests transparency, openness hold risks

President Barack Obama’s open government programs might be ineffective or could interfere with government operations, according to a new report from the Congressional Research Service, an agency that advises Congress.

Congress should consider evaluating whether the White House programs are effective and worth the cost and whether they slow down government operations and negatively affect privacy and security, wrote Wendy Ginsberg, an analyst in government organization and management at CRS, in the research report published online Feb. 14 by the Federation of American Scientists.

“Congress may decide that some of the policies in President Obama’s Dec. 8, 2009, memorandum increase transparency, public participation and collaboration in a way that improves the effectiveness of federal government,” Ginsberg wrote.

Related stories:

Open government plans updated, criticized

Open government beats FOIA for information access

“Conversely, Congress may find that increased transparency and public attention make the federal government more susceptible to information leaks of sensitive materials," she wrote. "Additionally, increased collaboration and participation may make the sometimes slow process of democratic deliberation even slower. Congress may also choose to evaluate the monetary costs associated with implementation of the open government policies.”

Obama issued three memos related to transparency, followed by the Open Government Directive in December 2009. In response, agencies have developed open government plans and websites and published datasets, among other activities.

However, the memos and directives left room for interpretation and might not be producing the expected results, Ginsberg wrote.

“The Open Government Directive (OGD) did not explain the consequences for ignoring or disobeying the directive’s requirements,” she wrote. “It is, therefore, unclear what may happen to agencies that did not meet the requirements set out in the Dec. 8, 2009, OGD or to an agency that did not complete the requirements in a manner that is consistent with the spirit of the memorandum. It is also unclear whether there will be consequences for agencies that do not maintain the OGD requirements or allow certain elements of the OGD to lapse.”

The report also questions the value of specific aspects of the administration's transparency agenda, including the release of large amounts of federal data to the public. Ginsberg said there might be risks involved.

“Releasing these datasets to the public also assumes that the public will have the knowledge, capacity, and resources to evaluate the data, offer valid insights, and reach replicable results and verifiable conclusions," she wrote. "Irresponsible manipulation of the datasets may allow certain groups or individuals to present unclear or skewed interpretations of government datasets, or come to questionable conclusions.”

Releasing too much data could also hinder transparency, she added.

“Counterintuitively, agencies that release vast amounts of datasets may become even less transparent because the public will be unable to decipher which data are important to their needs,” Ginsberg wrote. "Users may have to sift through thousands of datasets to determine which ones include the information they seek. It may be difficult for a researcher to pinpoint the dataset he or she needs in a collection of similarly titled datasets."

Therefore, Congress might need to consider ways to help the public determine the authenticity of federal data and analysis.

“Congress may want to create ways to make clear to the public when data analysis is performed by the federal government as opposed to when analysis is performed by a private group or individual with its own goals and mission,” Ginsberg concluded. "Congress may choose to require certain government agencies to perform reviews and analyses of the data that is released to the public. Congress may also decide to hold hearings in which Members themselves determine the value and validity of agencies’ datasets and analysis."

About the Author

Alice Lipowicz is a staff writer covering government 2.0, homeland security and other IT policies for Federal Computer Week.


Reader comments

Thu, Feb 17, 2011 Technocrat

The taxpayer who funds government's aggregation of information cannot be trusted in its interpretation. This thesis is unlikely to play well in Peoria.

Wed, Feb 16, 2011 Interested Party

Even within agencies it can be difficult to validate conclusions reached based on data that was not intended for specific uses. Typically, assumptions used to gather the data are not clearly expressed, leaving unintentional data gaps. Raw data sets are exciting for researchers but will ultimately produce inaccurate conclusions. A case in point is all of the interpretations of data regarding federal employee compensation compared with private-sector compensation, when the titles may be similar but the duties and skill sets may be vastly different.
