Researchers recommend Wikis for government information
E-government researchers have suggested that collaborative Wiki software may be the best avenue for getting public information to the citizenry.
They advocate building two-layered Web pages, with the agency providing a base layer of information and interactive pages layered on top that domain experts, volunteers and others could use to annotate and link the data.
The approach is described in a paper titled "Building Semantic Webs for e-government with Wiki technology," published in the January issue of the scholarly journal Electronic Government. The paper was authored by Christian Wagner, Karen Cheung and Rachael Ip, all of the City University of Hong Kong, and Stefan Böttcher of the University of Paderborn in Germany.
The research lends support to some of the Semantic Web work done in the Federal CIO Council’s Semantic Interoperability Community of Practice, according to Brand Niemann, chair of SICoP.
The researchers are confronting the problem of how agencies can manage the great volume of material they post online. Through a Google search, the research group found there are 368 million Web pages under the federal .gov domain alone. The authors propose using Wiki software to ease the burden of handling all this material.
Their idea is this: in addition to standard Web pages, a second, interactive layer could be added to let outside parties contribute contextual information and pull together disparate strands of data. Wiki is collaborative software that lets viewers edit Web pages; most Wiki engines use a combination of a server-side scripting language and a database.
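To make the wiki model concrete, here is a minimal sketch of the core a wiki engine keeps behind its scripting layer: a page store where any viewer can overwrite a page's text, with revision history so bad edits can be rolled back. The class and method names are illustrative assumptions, not taken from any real wiki engine.

```python
# Minimal sketch of a wiki engine's page store. Every edit appends
# a revision rather than destroying the old text, which is what makes
# open editing recoverable. Names here are illustrative only.

class WikiStore:
    def __init__(self):
        self.pages = {}  # title -> list of revisions, newest last

    def edit(self, title, text, author):
        # Any viewer may edit; the old revision is kept for rollback.
        self.pages.setdefault(title, []).append(
            {"text": text, "author": author}
        )

    def read(self, title):
        revisions = self.pages.get(title)
        return revisions[-1]["text"] if revisions else None

    def revert(self, title):
        # Drop the newest revision, restoring the previous one.
        if len(self.pages.get(title, [])) > 1:
            self.pages[title].pop()

store = WikiStore()
store.edit("DataReferenceModel", "Draft 1", author="agency")
store.edit("DataReferenceModel", "Draft 1, annotated", author="volunteer")
print(store.read("DataReferenceModel"))  # prints "Draft 1, annotated"
```

In a real engine this store would live in a database and be exposed through server-side scripts, but the append-and-revert pattern is the essential mechanism behind "anyone can edit."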
The two-layer design owes a debt to database design, the authors concede: the database itself holds the raw data, while indexes placed on top parse the data in various ways. The agencies would "rely on a community of users to maintain the semantic relationships in the form of a Wiki web," according to the paper.
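The two-layer separation the authors describe could be sketched as follows: an authoritative, agency-maintained base layer that communities never touch, plus an editable overlay that holds annotations and cross-links. The record IDs, field names and helper functions below are assumptions for illustration, not part of the paper.

```python
# Sketch of the two-layer design: the agency's base records stay
# read-only, while a community-editable overlay accumulates context.
# All identifiers and structure here are illustrative assumptions.

base_layer = {  # agency-maintained, authoritative
    "doc-17": "Air quality report, 2005",
    "doc-42": "Water quality report, 2005",
}

overlay = {}  # community-maintained semantic layer

def annotate(doc_id, note, links=()):
    # Volunteers and domain experts add notes and link related records.
    entry = overlay.setdefault(doc_id, {"notes": [], "links": []})
    entry["notes"].append(note)
    entry["links"].extend(links)

def view(doc_id):
    # Readers see the base record plus whatever the community added;
    # the base layer itself is never modified.
    return {"base": base_layer[doc_id], **overlay.get(doc_id, {})}

annotate("doc-17", "Uses the same monitoring sites as doc-42",
         links=["doc-42"])
print(view("doc-17"))
```

The point of the split, as with database indexes, is that the overlay can be rebuilt, corrected or discarded without ever putting the underlying government data at risk.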
This approach “allows you to federate information, spread it around,” said Mills Davis, managing director of Washington consultancy Project10X. Davis is a frequent contributor to SICoP activities. Communities of interest, such as domain experts from different agencies, can get together to explain the data and how it could be used.
“You really need to involve communities of interest. It takes a community to make sense out of the content,” Davis said. What the Wiki offers is an easy means to collaborate.
SICoP uses a Wiki to organize its meeting materials and documents, Niemann said. A Wiki was also used in developing the second draft of the Federal Enterprise Architecture's Data Reference Model. On both of those project pages, the numbers in purple are links to other static Web pages or to subparts of the same page.
SICoP is also exploring other approaches, and possible pilots, for repurposing Wiki and other online content through additional semantic layers. Niemann points to a pilot medical search engine developed by SemanTx Life Sciences Inc. of Waltham, Mass. There, a user types in a question and the system "builds an ontology so you can see if that is really what you mean and then uses the ontology to structure the answers to your question," he explained.
Popular public Wikis, such as the volunteer-driven Wikipedia encyclopedia, have come under criticism for the mistakes and biased information they contain, usually added by malicious or uninformed users.
Such information is easily corrected, Wikis' defenders maintain. Additional research could allow agencies to "provide trusted versions of Wikis, with all elements of an overlay structure and the information contained in content pages being government-checked and verified," the researchers note.

Editor's note: Brand Niemann will discuss the promise of Semantic Wikis, along with the newly released Data Reference Model and other topics, during a GCN online forum this Wednesday from 10 a.m. to 11 a.m. EST.