Models of mayhem

Since last September's terrorist attacks, government agencies have increasingly used sophisticated modeling and simulation tools to play out mock disasters, from major power outages and crippled telecommunications nodes to the dramatic spread of pneumonic plague.

Yet few of those models take into account the "interdependencies" that shape the outcome when a disaster in one industry wreaks havoc on the dependent infrastructures of other sectors.

The electronic simulation of those interdependencies and relationships has emerged as a field begging for more federal research and development.

"We are looking for new types of capabilities to prove the robustness of infrastructures and to better equip decision-makers and policy analysts," said Steve Rinaldi, joint program director of the National Infrastructure Simulation and Analysis Center (NISAC), led by Sandia and Los Alamos national laboratories.

In some cases, existing solutions are being modified into new applications. For instance, government officials are tweaking airport-modeling programs to simulate worst-case scenarios at the nation's seaports. Another push has vendors scrambling to adapt simulation tools — usually used for planning and analysis beforehand — to double as command and control centers that can help manage infrastructure if an anticipated crisis actually unfolds.

All the while, emphasis is being placed on interdependencies. In the future, more simulation efforts will be designed to enable officials to answer often confounding questions, such as what happens to the water, telecommunications and financial infrastructures after a massive failure in a power grid, or what steps responders should take when catastrophic events ripple across multiple sectors at once.


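The kind of cascade question officials face can be sketched, at its simplest, as a dependency graph in which a failure propagates to every sector that depends, directly or indirectly, on the stricken one. The sector names and dependency links below are invented for illustration; real interdependency models capture far richer behavior.

```python
# Hypothetical dependency graph: an edge from a to b means b depends on a.
DEPENDENCIES = {
    "power": ["water", "telecom", "finance"],
    "telecom": ["finance"],
    "water": [],
    "finance": [],
}

def cascade(failed_sector):
    """Return every sector affected, directly or indirectly, by one failure."""
    affected, frontier = set(), [failed_sector]
    while frontier:
        sector = frontier.pop()
        if sector not in affected:
            affected.add(sector)
            frontier.extend(DEPENDENCIES.get(sector, []))
    return affected

print(sorted(cascade("power")))  # ['finance', 'power', 'telecom', 'water']
```

Even this toy traversal hints at why single-sector models fall short: the answer depends on the edges between sectors, not on any one sector's internals.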
However, the federal government's disaster-modeling capabilities currently present a split picture: optimistic on one side, more cautious on the other.

"Each sector has the phenomenal ability to model their own individual infrastructure," said Brenton Greene, deputy manager of the National Communications System (NCS). "What is less robust and newer science — and frankly more challenging science — is the ability to accurately and predictively model the interdependencies among infrastructures."

NCS is co-managed by the White House and the Defense Information Systems Agency and assists the president, the National Security Council and federal agencies with their telecommunications functions. But it also serves an important homeland security function by coordinating communications for the government's national security and emergency preparedness initiatives.

Formidable challenges are tied to the fact that modeling interdependencies requires the use of complex algorithms capable of processing large volumes of data. On top of that, officials must overcome the technology and equipment differences among established efforts in separate industries, such as the work NCS has done in the telecommunications sector.

NCS' infrastructure modeling partnership with the telecom sector is well advanced (see "Project eyes global early warning system for the Internet," below). And its work is an example of the established single-sector modeling efforts that the push for interdependency modeling, led by NISAC, must not only build on, but also overcome.

Addressing Interdependence

To foster more comprehensive infrastructure modeling, President Bush in his National Strategy for Homeland Security pledged additional research in areas such as analysis, technical planning and decision support environments.

This need was also cited extensively in a June National Academies report that suggested ways to harness science and technology to fight terrorism.

"New and improved simulation design tools should be developed," the report recommended. "These efforts would include simulations of interdependencies between the energy sector and key infrastructures such as communication, transportation and water supply systems."

To make sure that happens, Bush tapped NISAC to lead this redoubled effort, which will pull in the private sector using R&D incentives.

"By funding modeling and simulation across critical infrastructures, we are trying to get at the complexities brought about by interdependence," Rinaldi said.

Currently endowed with a $20 million budget that flows from the Pentagon's Defense Threat Reduction Agency, NISAC has ramped up R&D efforts significantly since its formal inception in 2000. At that point, the center had only $500,000 in funding and was a mere joint effort between Sandia and Los Alamos officials.

NISAC officials are now on a campaign to wrap in many of the modeling and simulation efforts scattered across government.

"The plan is to take tools that have been developed and incorporate them into a common platform within NISAC," said Steve Fernandez, manager of the Infrastructure Protection Systems division at the Idaho National Engineering and Environmental Laboratory (INEEL). "That way, when there is a problem or threat, officials will have a virtual menu of different modeling capabilities."

As a sign that this consolidation has begun, Fernandez referenced increased NISAC involvement in national labs' efforts to model potential weaknesses in supervisory control and data acquisition (SCADA) systems.

Such systems are a crucial part of the control mechanism used to manage critical energy infrastructures because they direct the flow of energy and molecules, he said.

The architecture and interfaces between the scattered SCADA systems have become more open through advances in the technology industry, particularly public networks — and this has proved to be a mixed blessing. "The evolving SCADA systems are becoming more efficient and cost-effective, but arguably less secure," an April Sandia report concluded.

The National Academies report also suggested that SCADA systems "pose a special problem" and recommended encryption techniques, improved firewalls and intrusion-detection systems to reduce the risk of disruption.

Indeed, increasing the security of SCADA systems is now a top R&D priority for the labs, and simulations play a key role in that effort, Fernandez said.

"There are many different models as to how you should tie the SCADA systems together," he said. The INEEL staff has put together SGI 64-bit processors in a 32-node cluster to complete much of the modeling.

Means and Methods

To better simulate the interaction among industries, NISAC will pursue a series of modeling approaches.

The center is working to advance screening tools to stitch together existing simulators to form an early warning system. The system will rely on the development of algorithms and technologies that offer a composite view of all the nation's critical infrastructures.
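One way to picture such a screening layer, purely as a sketch, is a fusion step that collects a health score from each sector's simulator and grades the composite picture. The sectors, scores and thresholds below are invented; NISAC's actual algorithms are not public in this detail.

```python
SECTOR_STATUS = {          # stand-ins for outputs of per-sector simulators
    "power": 0.9,          # 0 = failed, 1 = fully operational
    "telecom": 0.7,
    "water": 0.95,
    "finance": 0.85,
}

def composite_warning(statuses, alert_below=0.8):
    """Flag sectors under threshold and grade the overall picture."""
    degraded = sorted(s for s, v in statuses.items() if v < alert_below)
    overall = min(statuses.values())
    level = "red" if overall < 0.5 else "yellow" if degraded else "green"
    return level, degraded

print(composite_warning(SECTOR_STATUS))  # ('yellow', ['telecom'])
```

The value of such a layer lies less in any one score than in presenting all the sectors on a single screen, the "composite view" the early warning system aims for.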

Another approach is what NISAC's Rinaldi termed a "stocks and flows approach" that will show goods and services flowing through and across infrastructures.

"We built some pretty sophisticated models around the California energy crisis," he said.

Using the California models, NISAC was able to show the compounding effect of power outages. The models displayed the drain that the energy crisis put on other industries, such as the agricultural community, which is heavily reliant on hydro-electrical power.
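The flavor of a stocks-and-flows model can be suggested with a deliberately crude sketch: electricity generation is a flow that constrains a dependent sector, here irrigation pumping, and the cumulative water pumped is the stock being tracked. All figures below are invented.

```python
def pumped_water(days, generation_mw, demand_mw, pump_share=0.1):
    """Cumulative pumping output (arbitrary units) over a run of days."""
    stock = 0.0
    for _ in range(days):
        available = min(generation_mw, demand_mw)   # an outage clips supply
        stock += available * pump_share * 24        # MW routed to pumps, x24h
    return stock

normal = pumped_water(days=7, generation_mw=1000, demand_mw=900)
crisis = pumped_water(days=7, generation_mw=600, demand_mw=900)  # outages
print(f"irrigation shortfall: {100 * (1 - crisis / normal):.0f}%")  # 33%
```

Cutting available generation by a third shows up directly as an irrigation shortfall, the kind of compounding effect the California models captured in far greater detail.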

NISAC is also testing agent-based simulation. "An agent is an encapsulated piece of software that acts as a decision-making piece of a physical infrastructure," Rinaldi explained.

For instance, in a simulation of a stock market, agents could be individual traders, acting separately but working to create the total functioning of the infrastructure. In an electric power plant, the agents could be generators or any objects working separately but impacting the whole.
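The stock market example lends itself to a minimal sketch of the technique: each agent encapsulates its own decision rule, and the market's behavior emerges from the agents' aggregate choices. The trading rule and parameters below are invented for illustration.

```python
import random

class Trader:
    def __init__(self, threshold):
        self.threshold = threshold  # personal price trigger

    def decide(self, price):
        # Buy (+1) below the trader's threshold, sell (-1) above it.
        return 1 if price < self.threshold else -1

def step(traders, price):
    """One market tick: net demand from all agents nudges the price."""
    net_demand = sum(t.decide(price) for t in traders)
    return price + 0.01 * net_demand

random.seed(0)
traders = [Trader(random.uniform(90, 110)) for _ in range(100)]
price = 80.0
for _ in range(200):
    price = step(traders, price)
print(round(price, 2))  # drifts toward the band of trader thresholds
```

No single agent computes the final price; it emerges from their interactions, which is the hallmark of agent-based simulation.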

A fourth area involves physics-based models, which will simulate operations occurring within infrastructures. For instance, within an oil or gas system, the models could be the detailed operation of a pipeline.

In both agent-based and physics-based modeling, the impact of disasters on infrastructures is more accurately simulated because of the level of detail. In the former case, the different behavior of agents is factored in. In the latter, the behavior of elements such as electricity or gas is built into the models.
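A toy version of the physics-based pipeline example might compute pressure along a line of segments, with each segment's loss growing with the square of flow, loosely in the spirit of standard pipe-friction formulas. All pipe parameters here are invented.

```python
def pressure_profile(inlet_psi, flow_factor, segments):
    """Pressure at the end of each segment, given per-segment loss factors."""
    pressures, p = [], inlet_psi
    for loss in segments:
        p -= loss * flow_factor ** 2   # loss grows with the square of flow
        pressures.append(p)
    return pressures

# Three segments with differing loss coefficients; double the flow and compare.
print(pressure_profile(800, 1.0, [20, 35, 25]))   # [780, 745, 720]
print(pressure_profile(800, 2.0, [20, 35, 25]))   # [720, 580, 480]
```

Because the physical relationship is built in, the model predicts how an upset at one point, say a doubled flow demand, changes conditions all along the line.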

Finally, Rinaldi described population mobility modeling, which the center is also exploring. "We are looking at how entities, namely people, move through a geographic area," he said.

As individuals move through an area, they impact, for example, the financial infrastructure through use of an automated teller machine or energy systems by fueling and driving a car. "This is a very microscopic view of how an individual moves through and interacts with infrastructures," Rinaldi said.
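A stripped-down sketch of that microscopic view: simulated individuals take random walks across a small grid, and each visit to a cell holding an ATM or a gas station registers one interaction's worth of load on that infrastructure. The locations, grid size and rates below are invented.

```python
import random

INFRASTRUCTURE_AT = {(1, 1): "atm", (3, 2): "gas_station"}

def simulate_day(people=50, steps=100, seed=1):
    random.seed(seed)
    load = {"atm": 0, "gas_station": 0}
    positions = [(random.randrange(5), random.randrange(5)) for _ in range(people)]
    for _ in range(steps):
        for i, (x, y) in enumerate(positions):
            # Random walk on a 5x5 grid, clipped at the edges.
            x = min(4, max(0, x + random.choice((-1, 0, 1))))
            y = min(4, max(0, y + random.choice((-1, 0, 1))))
            positions[i] = (x, y)
            site = INFRASTRUCTURE_AT.get((x, y))
            if site:
                load[site] += 1  # one interaction with that system
    return load

print(simulate_day())
```

Tallying such interactions over many simulated days yields the kind of load profile a mobility model can feed into infrastructure assessments.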

The common thread in the five areas is infrastructure assessment, he concluded. "We are looking for vulnerabilities, areas of mitigation and methods of response," he said.

Leaning on Industry

Along with the centralization of modeling efforts among the labs, NISAC is also getting more aggressive in its efforts to include private industry.

"One of the things that will be absolutely essential is for NISAC to work closely with the owners and operators of the infrastructures," Rinaldi said. "We are also working with the vendor and academic communities, which have expertise in the operational characteristics and the network topologies in place."

Vendors such as computer simulation specialist SGI have long worked with government to develop high-performance computing systems, visualization tools and advanced algorithms. That corporate history is now playing into homeland security opportunities, said David Shorrock, SGI's marketing development manager for government industries.

A key solution will be immersive visualization tools, which allow large amounts of data to be processed and simulated, Shorrock said. "We have honed [our] tool so that it is available to officials not only to practice response but to use operationally as a command and control center," he said.

As the proposed Homeland Security Department continues to take shape, R&D dollars are still flowing from various agencies. For instance, the Defense Advanced Research Projects Agency recently embarked on a "conceptual studies" initiative to address holes in simulation capabilities.

"Current trends in commercial high-performance computing are creating technology gaps that threaten continued U.S. superiority in important national security applications," DARPA officials reported during a June unveiling of the effort.

To explore the gaps, Cray Inc., IBM Corp., SGI and Sun Microsystems Inc. will each get $3 million to develop ways to analyze areas such as the dispersion of chemical or biological agents and to work on advanced intelligence and surveillance capabilities.


Shorrock also predicted that data fusion will be an area of intensive government R&D focus. "Nobody has conceived the breadth of this problem," he said. "Research like this has not been done to the depth now needed to satisfy the government's efforts to model disaster scenarios."
Jones is a freelance writer based in Vienna, Va.

