Better management key to making fusion technology work
- By Ronald Elliott
- Dec 06, 1998
The federal government has developed a number of intelligence programs and initiatives, collectively known as fusion algorithms, systems and processes, that rely on computer systems to produce coherent intelligence products. These systems use complex methods to sift through enormous amounts of data, finding and assimilating obscure information.
Although Congress this year supported programs and efforts that assist these fusion systems, the national intelligence community lacks a coherent management capability to coordinate the diverse development activities. Such coordination is needed to make these systems workable, to ensure that the proper efforts are being undertaken and to avoid duplication of effort.
Intelligence is a complex discipline that requires its practitioners to analyze enormous amounts of information in a variety of forms, categories and languages, occurring in diverse contexts and disciplines and across different geospatial and temporal frames. Combing through this web of information to identify pieces that can be tied together into needed knowledge is time-consuming and requires large staffs. With fusion, which has been under development for at least 15 years, computers can help sift through the mountains of information by rapidly examining data to find relationships. Taken individually, these bits of information are meaningless, but when compiled they give intelligence consumers valuable knowledge.
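The core idea, stripped of any real tradecraft, can be sketched in a few lines: two records that mean little on their own are associated through a shared attribute to yield a fact neither source held by itself. Every name and data value below is hypothetical, invented purely to illustrate the principle.

```python
# Hypothetical illustration of data fusion: individually meaningless
# fragments from two separate sources, joined on a shared key, yield
# a useful association. All identifiers and values are made up.

source_a = [
    {"id": "ship-42", "location": "Port X", "date": "1998-11-02"},
]
source_b = [
    {"id": "ship-42", "cargo": "machine parts", "date": "1998-11-02"},
]

def fuse(a_records, b_records, key):
    """Associate records from two sources that share the same key value."""
    index = {r[key]: r for r in a_records}   # fast lookup by the shared key
    fused = []
    for r in b_records:
        match = index.get(r[key])
        if match is not None:
            fused.append({**match, **r})     # merged view of both fragments
    return fused

print(fuse(source_a, source_b, "id"))
```

The fused record links a location to a cargo, an association absent from either source alone; real fusion systems face the same join problem multiplied across thousands of incompatible formats.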
But information technology can only do so much. Although computers can sort information rapidly when given clear instructions, the federal government, namely the Defense Department and the intelligence community, has not established a common set of parameters that can "automatically" identify and appropriately associate the myriad data types according to agreed-upon security classification rules. Common parameters are necessary to guide the development of interoperable information systems that "fuse" bits of information from the mass of data stored in government computers or flowing across its networks.
Neither has the intelligence community developed a common data model for designing or operating its many projects, programs and initiatives. As a result, the systems are incompatible, which weakens their ability to efficiently cull information and make associations among elements of the vast databases.
Only through adoption of a common model of information elements and standardization of processes, which can be applied to computer and communications systems, can the government achieve the potential benefits of enterprisewide distributed collaboration and effectively manage the dissemination of information.
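In software terms, a "common model of information elements" amounts to a single agreed record structure that every participating system produces and consumes. The sketch below is one possible shape, with entirely hypothetical field names; the point is only that interoperability follows from agreement on the structure, not from any particular technology.

```python
# Sketch of a common data model: one agreed record structure shared by
# all fusion systems. Field names here are hypothetical illustrations.

from dataclasses import dataclass, asdict

@dataclass
class IntelRecord:
    record_id: str
    classification: str   # agreed-upon security marking
    source: str
    timestamp: str        # agreed temporal format
    content: str

def to_wire(record: IntelRecord) -> dict:
    """Serialize to the shared interchange form any compliant system can read."""
    return asdict(record)

r = IntelRecord("r-001", "UNCLASSIFIED", "open-source",
                "1998-12-06T00:00:00Z", "example content")
print(to_wire(r))
```

Once every system emits and accepts this one form, records from any program can be pooled and cross-referenced, which is precisely the enterprisewide collaboration the article argues for.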
This problem represents a management challenge, not a technical one. Investments by the government in IT projects and programs intended to automatically fuse information must be managed from an enterprisewide level. This would ensure that each program is guided toward the common objective of enterprisewide information fusion.
A consistent and coherent intelligence enterprisewide perspective is essential to address and justify expenditures for fusion. Congress should be applauded for requiring, in the 1999 Intelligence Authorization Act, that the offices of the Director of Central Intelligence and the Secretary of Defense provide a long-term strategy for coordinating and developing the various intelligence data fusion efforts.
At the least, this requirement should expedite discussion and clarification of terms and concepts, making some progress toward managing the wealth of information resources.
It is regrettable that millions of dollars will continue to be poured down the "black hole" of fusion before a coherent and cost-effective management structure and plan can be developed.
-- Elliott, a retired federal executive with more than 30 years' experience in the national security arena, served as director of the Intelligence Systems Secretariat.