Rethinking government's biggest IT ideas

Sometimes IT executives in the private sector look with envy at the federal government’s nearly $80 billion in annual technology spending and imagine what such a titanic budget could accomplish: deep volume discounts, influence over industry standards and future products, and ambitious and innovative large-scale programs.

The reality, of course, is that the federal government is not a monolithic entity but a collection of independent actors with widely varying missions and priorities.

But that hasn’t stopped federal executives and political leaders from trying to capitalize on the government’s size when possible to save money, increase efficiency and improve operations.

Below are four areas where government leaders have ventured forth with big ideas for doing IT better. Not all the efforts have met with complete success, and the challenge has become how to apply what we've learned to what comes next.

Change 1:
Cybersecurity: FISMA’s follow-up

The original plan: Congress enacted the Federal Information Security Management Act in 2002 to create a formal, coordinated strategy for implementing cybersecurity across the federal government rather than leaving such critical safeguards to individual agencies' discretion. To demonstrate compliance, agencies had to document efforts in a multitude of cybersecurity categories. Results were originally consolidated into annual grades issued by Congress, but they are now reported via the online CyberScope tool.

The path ahead: Security experts give FISMA mixed grades. On the one hand, the law gets high marks for elevating the importance of cybersecurity by holding department secretaries responsible for security efforts.

“That was very powerful because it created a very strong partnership between the CIOs and the secretaries,” said Lee Holcomb, former CIO at NASA, former chief technology officer at the Homeland Security Department and now vice president of strategic initiatives at Lockheed Martin.

But some argue that agencies are expending too many resources on regulatory compliance rather than security activities. Assessments “mostly focused on process, policy and paperwork and very little on the actual effectiveness of controls,” said John Pescatore, a vice president and research fellow at Gartner.

The annual audits specified in the original rules are another negative, he added. Indeed, many experts say assessments need to happen more frequently to address a constantly changing threat landscape.

Others complain that FISMA compliance requirements are so broad that agencies can’t focus on the most serious threats to their organizations. “Five percent of the controls add true security to your system, but you have this other 95 percent [of requirements] that you are audited against,” said Patrick Cronin, executive consultant at systems integrator CGI Group.

FISMA has seen numerous updates since its introduction, including outlines for continuous security monitoring to augment the annual reviews. In addition, some FISMA oversight responsibility has shifted from the Office of Management and Budget to DHS, a move that could give agencies additional resources for complying with audit requirements. The House and Senate are working on more updates, although it’s not clear how soon a new law might come up for a vote.

Lawmakers and security officials face challenges in keeping FISMA relevant for traditional security threats and the latest vulnerabilities, which include advanced persistent threats, wider adoption of cloud computing and ubiquitous mobile devices.

Alan Paller, director of research at the SANS Institute, advised following the lead of the Australian government, which encourages agencies to focus on four key areas: patching applications, patching operating systems, minimizing administrative privileges and whitelisting applications — the practice of denying executable files unless they’re on an approved list.

“These four things cut through everything,” Paller said. “They’re what you need to do in cloud, in mobile, on desktops and with servers.”
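The whitelisting approach Paller describes can be sketched in a few lines: compute a cryptographic digest of an executable and allow it to run only if that digest appears on an approved list. The sketch below is illustrative only, assuming a SHA-256-based approved list; it is not drawn from any agency's actual implementation.

```python
# Illustrative application-whitelisting check (hypothetical example):
# an executable is permitted only if its SHA-256 digest is on an approved list.
import hashlib


def sha256_of(path):
    """Return the hex SHA-256 digest of the file at `path`, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_whitelisted(path, approved_digests):
    """Allow execution only when the file's digest is in `approved_digests`."""
    return sha256_of(path) in approved_digests
```

A real deployment would enforce this check in the operating system or an endpoint agent rather than in application code, but the core decision — deny by default, allow by known-good digest — is the same.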

Change 2:
Telecom: From FTS to Networx

The original plan: The first series of Federal Telecommunications System contracts — FTS 2000 and FTS 2001 — sought to provide agencies with a governmentwide contract option for telecom and network services that would keep pace with the carriers’ commercial offerings and offer better prices than what agencies traditionally paid. The General Services Administration, which manages the program, structured later iterations of the program so that it could include new services as technologies evolved and lower prices as market conditions changed.

The path ahead: GSA designed the latest contract, Networx, to be more expansive and robust than FTS 2001. Networx offers a much broader array of services and is easily expandable as new technologies emerge.

IP services were becoming more prevalent when GSA was designing Networx, which was awarded to service providers in 2007, said John Johnson, who was assistant commissioner of service development and delivery at GSA’s Federal Technology Service at the time and led the Networx program until he left GSA in 2009. He added that the agency wanted a contract that could accommodate new and emerging technologies.

The original FTS contract offered six services, FTS 2001 had 26, and Networx debuted with more than 50.

More than four years after it was awarded, Networx still has not replaced FTS 2001 completely, but it’s very close, said Karl Krumbholz, GSA’s director of network services programs.

Officials at 89 percent of the agencies have made fair-opportunity decisions, and 94 percent have completed statements of work, he said. Those are preliminary steps that allow agencies to then choose providers under the contract.

For the agencies that have completed the transition and moved their services to Networx, things are running smoothly, Krumbholz said. Many agencies and organizations kept the same services and providers, while others took the opportunity to make changes.

Networx will expire in 2017, and GSA is already working on its successor, tentatively titled NS2020. However, the work is in the early stages. GSA has interviewed about 100 stakeholders — including agencies, service providers and others — to develop an understanding of how the next contract should be structured.

“The environment is complex, and there’s a lot going on,” Krumbholz said.

Johnson said he questions whether another overarching network services contract is the best way forward. Instead, perhaps it would be better to offer services through a collection of smaller existing and new contract vehicles. In that case, the new network “contract” would actually be more of a catalog that links customers to the various vehicles, he said.

Change 3:
Collaboration: AKO to enterprise e-mail

The original plan: The Army launched the first version of its Army Knowledge Online Web portal in 1998. Since then, AKO has grown to offer centralized communications services such as e-mail (or at least centralized access to them), along with a wide variety of knowledge management and personnel management applications. The site currently handles about 16 million log-ins monthly.

The path ahead: In 2010, the Army and the Defense Information Systems Agency began planning a single, cloud-based enterprise solution that would provide shared e-mail, Active Directory and global address book services to Defense Department users. It is designed to improve collaboration by enabling better access to information across the enterprise. It will also help establish single-identity user access rather than the current setup, which can result in multiple e-mail addresses for a single user.

“We’re going to fundamentally change the way we deliver this network,” said Army CIO Lt. Gen. Susan Lawrence. “We have to direct the standards, configurations and common operating environment, and set forth the discipline and metrics. It’s the right thing to do.”

But enterprise e-mail might not replace AKO.

In May, the House Armed Services Committee’s Emerging Threats and Capabilities Subcommittee stripped 98 percent of the project’s funding until the secretary of the Army furnishes business-case and cost/benefit analyses for moving the service’s e-mail to the DISA cloud. The business-case analysis has been submitted, and a decision on funding will likely come when the fiscal 2012 budget appropriations are made.

Meanwhile, an internal June 21 Army memo outlined the gradual drawdown of AKO and a shift of funding to enterprise e-mail, but since then, some Army officials have indicated that AKO could continue to exist in a different form.

Then in July, the migration process to the new enterprise e-mail system was temporarily halted to address technical complications in what Lawrence called a “dirty network.” The process cautiously resumed in early September. So far, more than 90,000 accounts have been migrated, and officials said they expect to have about 1.2 million accounts on the new system by spring 2012.

It remains unclear whether the Navy, Marine Corps and Air Force will also adopt the DISA e-mail solution. But according to DOD CIO Teri Takai, that’s less important than getting in line with the reasoning behind it.

“Enterprise e-mail doesn’t mean everybody goes to DISA,” Takai said. “What it does mean is we have to get to a common identity management structure, and we have to get to a common directory structure. We have to be able to collaborate. That’s really the infrastructure that is critical here. That’s the architecture we’re looking at now.”

Change 4:
The networked desktop: NMCI to NGEN

The original plan: The Navy Marine Corps Intranet (NMCI), which launched in 2000, put all the responsibility for provisioning, supporting and upgrading the service’s desktop computers and network connectivity into the hands of one contractor, originally Electronic Data Systems. The program's goals included achieving standard configurations for desktop computers and reducing total cost of ownership through standardization and better economies of scale. With more than 700,000 users, NMCI is one of the government’s largest IT programs.

The path ahead: Hewlett-Packard has been operating NMCI since it acquired EDS in 2008, most recently under a $3 billion continuity-of-services contract. That will continue until the successor to NMCI, the Next Generation Enterprise Network (NGEN), is ready to go in May 2014. A final request for NGEN proposals is due in December, and the contract award or awards are likely to be made in late 2012.

NMCI was originally conceived to reduce the burden of end-to-end IT operations and maintenance on the Navy and Marine Corps while improving technology and reinvesting the savings in the afloat environment, said Warren Suss, president of Suss Consulting.

The Navy succeeded in transferring the burden to industry in perhaps the largest IT outsourcing contract ever awarded, but in implementing the program, both sides encountered much greater challenges and costs than anticipated, he added.

“The Navy has always been decentralized,” Suss said. “The governance challenges in consolidating and managing the infrastructure have haunted the program from the beginning until today.”

The Navy is taking more control of the network and looking to NGEN as the foundation of its future Naval Networking Environment, which will integrate the afloat and onshore environments.

“I think the Navy has learned that [it] needed to re-engage and take more responsibility for the architecture and the technical direction of where the network is going,” Suss said.

But growing budget pressures are already forcing Navy officials to postpone major IT upgrades and will increasingly target DOD's biggest-ticket line items, including NGEN.

In March, the Government Accountability Office issued a report urging the Navy to reconsider its efforts on the grounds that it has failed to properly evaluate, plan and resource the project. Those concerns were shared by the House Appropriations Committee's Defense Subcommittee in its fiscal 2012 budget, which recommended that the defense secretary conduct an independent cost estimate — a recommendation DOD has rejected.

Some experts have also warned against moving forward with NGEN. Loren Thompson, chief operating officer of the Lexington Institute think tank, called it ill-conceived and flawed. But NGEN's program manager, Capt. Shawn Hendricks, is confident that it will be a success.

“There is no more important program in the Navy and Marine Corps than NGEN — certainly none that touch so many parts of [our] communications systems,” Hendricks said. “Now is the right time.”

Read more of the 2011 Federal List.
