Federal IT program failures: It's the content, stupid

Many federal IT programs founder because agencies focus on systems rather than substance, writes consultant Barry Schaeffer

Barry Schaeffer is principal consultant at Content Life-cycle Consulting and senior consultant at the Gilbane Group.

In a world increasingly dependent on automation, it's little wonder that IT spending is exploding, along with interest in what is spent and what we're getting for it. The fact that no community spends more for less in return than the federal government has sparked numerous studies, including one currently under way, into the reasons for and remedies to the historically abysmal success rate of federal IT projects.

If history holds serve, those efforts will focus on the acquisition process, procurement rules, contract types and budgeting vehicles but ignore what is arguably the most important factor: failure to recognize the critical role of content in many programs. In effect, the designation of many major programs as information “technology” projects can help doom them to failure.

That is counterintuitive for many IT people. How can the content be the problem when technology presents the most complex challenges and accounts for most of the wasted funds? Yet in government, complex, trusted content and its life cycle from creation to delivery are the foundation variables that, if ignored in the design phase, can compromise even the most expansive — and expensive — program.

An agency can spend millions trying to integrate disparate or even incompatible systems — often with marginal success — and end up missing the point. The goal is not to make systems connect but to enable content to be managed, accessed and shared. To design a system without thinking about the content is the equivalent of building the railroad locomotives and cars before deciding the track gauge.

When engineers start by focusing on the technology, they often end up making an entire range of architectural, hardware and software design decisions that saddle the project with excessive costs and with tools and vendors poorly suited to the real problems.

It doesn’t have to be that way. A more content-centric approach allows projects to define the content architecture, interchange forms, process requirements and logical transfer pathways first. That approach can then encourage participating organizations to meet those requirements in their own ways, within the limits of their technology resources. A growing number of successful efforts are completed this way, with technology supporting but not dictating the content life cycle.
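The content-first approach described above can be sketched in miniature. In this hypothetical example (the element names and functions are illustrative, not drawn from any actual agency system), participating organizations agree only on a shared XML interchange form; each one is free to produce or consume records with whatever internal technology it already has:

```python
import xml.etree.ElementTree as ET

def make_assessment(doc_id: str, author: str, body: str) -> str:
    """Produce an assessment record in the agreed content-level
    interchange form. The XML structure, not any vendor stack,
    is what the participants standardize on."""
    root = ET.Element("assessment", id=doc_id)
    ET.SubElement(root, "author").text = author
    ET.SubElement(root, "body").text = body
    return ET.tostring(root, encoding="unicode")

def read_assessment(xml_text: str) -> dict:
    """Any participating system can consume the record, regardless
    of how it stores or processes content internally."""
    root = ET.fromstring(xml_text)
    return {
        "id": root.get("id"),
        "author": root.findtext("author"),
        "body": root.findtext("body"),
    }

record = make_assessment("A-001", "analyst-1", "Sample finding.")
print(read_assessment(record))
```

The point of the sketch is that interoperability lives entirely in the agreed content format; no system needs to know anything about another system's architecture.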

For example, the Defense Intelligence Agency, under a 2002 mandate from Vice Adm. Lowell Jacoby to “standardize at the content instead of system level,” succeeded in building a Library of National Intelligence based on Extensible Markup Language assessments drawn from across the agency worldwide. Likewise, the Office of Management and Budget has significantly improved the federal budgeting process at modest cost by adopting an XML content architecture.

The transition will not be easy. Technology professionals are most comfortable with what they know, and many CIOs are actually CTOs with little experience in or respect for the world of content.

Likewise, the IT industry, ruled by huge firms still mired in 1980s thinking, when technology was the major challenge and revenue producer, will not willingly relinquish center stage and the huge revenues that go with it. That is why the change must originate on the buying side. Bidders are unlikely to propose content-based solutions, no matter how appropriate, in competitions in which the evaluation team is thinking in terms of technology approaches and will favor them.

But change must come. Until we learn that, in an increasingly content-centric world, automation efforts cannot be based exclusively on a technology-centric model, agencies will continue to invest millions of dollars in programs and have little to show for it.





Reader comments

Mon, Sep 27, 2010 Barry Schaeffer responds Virginia

John Weiler makes an important point: failure to focus on content, however important, is not the only cause of government IT failures. Moreover, even where it is the foundation cause, that failure is often not the only component of the failure. My commentary was intended to highlight the fact that failure to put content issues in their proper place in an automation effort can virtually ensure that the effort goes badly. One of the most important factors in a content focus is that it forces designers to get down in the weeds where the people work and the processes actually take place (as mentioned by another commenter). In fact, an immediate focus on technology tends to encourage glossing over these sometimes messy issues, concentrating instead on the neat and structured issues of the technology itself. In my experience, a content focus, while not the only issue to be dealt with, is often the best way to make sure that all the cans of worms inherent in the project are opened and included in the planning process.

Fri, Sep 24, 2010

Amen to the previous comment. Automating a bad process does not improve it. In any major new AIS project, step 1 should be to examine the business process and workflows and see if there is a simpler way to do things. 'We have to do it that way' is often just another way of saying 'That's the way we've always done it.'

Thu, Sep 23, 2010 John Weiler Alex VA

I would like to up the ante on this root cause analysis. Federal agencies need to first focus on the problem they are solving, the outcome they seek, and the degree to which IT can improve that outcome. Sounds simple, but it is often overlooked by inexperienced PMs and CIOs who only chase technology. I remember a statement from a CIO who retired from a large civil agency, who said that his top priority was to make sure his agency had the latest technology. Contractors, trade associations and FFRDCs fuel this problem to get their share of the $100B pie. What is interesting is that the agencies with the smallest IT budgets get the most done, such as the USMC, GPO and PTO. They are forced to focus on the bottom line and on the use of technology as a tool. The IT Acquisition Advisory Council has pooled the collective expertise of several hundred SMEs, the experiences of 21 agencies, four universities, three think tanks, and a half dozen research institutes to establish an actionable Roadmap for Sustainable IT Reform. Focusing on content is a start, but far from being THE root cause.

