How HPCs crunch rivers of water data

Lawrence Livermore National Laboratory has found a way to use its supercomputing power to speed the sprawling data analysis behind managing highly complex public works.


Lawrence Livermore National Laboratory is tapping its supercomputing power to speed the data analysis needed to manage the allocation of Colorado River water to millions of stakeholders. The lab's researchers have teamed with nonprofit research organization Rand Corp. to use the lab's High Performance Computing Innovation Center (HPCIC) to perform water allocation tasks in minutes instead of weeks.

Managing the flow of the river, which winds almost 1,500 miles through seven states and some of the most arid land in the country, has been a big job for the past century. Like many other U.S. waterways, the Colorado River is under increasing pressure from a growing throng of consumers. Lab officials said the river provides water for 30 million people.

The federal government has been slow to track water consumption data for the country as a whole. Its most recent account of water consumption and availability covers 2010, before Texas entered a four-year drought (which has since ended) and before California's drought crisis even began.

Using the lab's supercomputer, researchers said they ran 12,000 "alternative futures" for the river -- candidate plans for allocating water among the states, cities and American Indian tribal communities that depend on it.

Traditional computers would have taken six weeks to complete the study, but officials said the supercomputer crunched the data in 45 minutes.
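That kind of speedup is possible because each "alternative future" is an independent model run, so thousands of them can be farmed out across processors at once. The Python sketch below illustrates the pattern in miniature; the scenario parameters and the evaluate_scenario() model are hypothetical stand-ins, not the actual RAND/LLNL water-management simulation.

```python
# A minimal sketch of a parallel scenario sweep, assuming a toy model.
# All parameters and scoring here are hypothetical illustrations.
from multiprocessing import Pool
from itertools import product

def evaluate_scenario(scenario):
    """Stand-in for one slow-running simulation of a water-allocation plan."""
    demand_growth, drought_severity, reservoir_policy = scenario
    # A real model would simulate decades of river flow, storage and
    # deliveries; here we just combine the inputs into a toy shortage score.
    shortage = demand_growth * drought_severity / reservoir_policy
    return scenario, shortage

if __name__ == "__main__":
    # The Cartesian product of uncertain inputs yields many "alternative futures".
    scenarios = list(product(
        [1.0, 1.1, 1.2, 1.3],        # demand growth factors
        [0.8, 1.0, 1.5, 2.0, 3.0],   # drought severity multipliers
        [0.5, 0.75, 1.0],            # candidate reservoir policies
    ))
    # Each scenario is independent, so runs spread trivially across cores --
    # the same property that lets an HPC cluster finish in minutes what a
    # single workstation would grind through for weeks.
    with Pool() as pool:
        results = pool.map(evaluate_scenario, scenarios)
    worst = max(results, key=lambda r: r[1])
    print(f"evaluated {len(results)} futures; worst shortage score {worst[1]:.2f}")
```

On a cluster, the same pattern scales out by replacing the local process pool with a batch scheduler or MPI-style job distribution.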

They added that the approach has promise for other data-centric policy issues.

"These same methods and resources can improve the decision-making process for a broad spectrum of national and corporate challenges," HPCIC Director Fred Streitz said. "The types of resource allocation and policy questions that can be answered with high-fidelity simulation show up regularly [in energy, agriculture and transportation]. There are many, many other areas where complex, slow-running models inform analysts, who, in turn, inform policymakers."

Adding high-performance computing to the mix compresses "the time frame between asking questions and getting answers," he added.

The analytical capabilities build on previous work done by Rand's researchers in 2012 and 2014 on the Colorado River Basin, including a joint workshop that used high-performance computer analytics to fuel the collaborative "deliberation with analysis" method. That process brings together stakeholders and experts to assess complex problems and find alternative solutions using scientific methods, including data analysis.

"In the latest workshop, we performed and evaluated about 60,000 simulations over lunch," said Ed Balkovich, senior information scientist at Rand, in the lab's statement. "What would have taken about 14 days of continuous computations in 2012 was completed in 45 mins -- about 500 times faster."