Big Data

Big data brings new vigor to health research


Two of the top federal agencies that deal with health care and medical research took center stage at FCW's big data conference on June 20, explaining the many different ways the government uses big data tools and analytics to improve researcher collaboration and early recognition of health trends.

Dr. Jack Collins, director of the Advanced Biomedical Computing Center at the National Cancer Institute, said that sharing big data among medical researchers allows experts to see and analyze experiments within their specialty.

"What I look at big data as – it's really asking the right questions," Collins said. "It's about being able to ask the questions that we couldn't do in the past because we either didn't have the data, we couldn't collect it or we didn't have the means to analyze it."

To illustrate the scale of the data involved, Collins said that a genetic analysis of 20 types of cancer produces roughly one petabyte of data. Improved connectivity allows that data to be accessed directly by the appropriate specialists within the medical community. While Collins acknowledged that connecting experimental data with the appropriate experts has not yet reached its potential, government programs have begun building the integration needed for that kind of data analysis.

The National Cancer Institute and the Advanced Biomedical Computing Center, for example, teamed up with Oracle to create bioDBnet, which integrates biological databases. Researchers are using that data mining to analyze cancer cells, as well as protein signals that can predict post-traumatic stress disorder for the Army.
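As a rough illustration of what that kind of integration involves, the sketch below joins two toy identifier tables, one mapping gene symbols to Entrez Gene IDs and one mapping those IDs to UniProt protein accessions, the way a service like bioDBnet resolves identifiers across databases. The tables, values and pandas-based approach are illustrative assumptions, not bioDBnet's actual data or implementation.

    import pandas as pd

    # Minimal sketch of cross-database identifier mapping, the kind of join a
    # service such as bioDBnet performs at much larger scale. The tables and
    # values below are illustrative, not drawn from bioDBnet itself.
    gene_db = pd.DataFrame({
        "gene_symbol": ["TP53", "BRCA1", "EGFR"],
        "entrez_id": [7157, 672, 1956],
    })

    protein_db = pd.DataFrame({
        "entrez_id": [7157, 672, 1956],
        "uniprot_id": ["P04637", "P38398", "P00533"],
    })

    # Integration amounts to a join on the shared identifier, letting a
    # researcher start from a gene symbol and end at a protein record.
    merged = gene_db.merge(protein_db, on="entrez_id", how="left")
    print(merged)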

"Having somebody who knows how to do data and run a query is great, but until it gets to the person who understands what's going on and what to do with that information, it doesn't really matter," Collins said.

Niall Brennan, director of the Office of Information Products and Data Analytics, within the Office of Enterprise Management at the Centers for Medicare and Medicaid Services, said big data analysis can also predict and explain healthcare trends.

In 2011, almost 20 percent of Medicare beneficiaries were readmitted after an initial hospital stay, a rate similar to previous years. When the readmission rate dropped in early 2012, Brennan said, CMS was able to use big data analysis and algorithms to try to explain the decrease.
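To give a rough sense of the arithmetic behind such a figure, the sketch below counts 30-day readmissions in a handful of synthetic admission records. The 30-day window, field names and data are assumptions made for illustration, not CMS's actual measures or methodology.

    import pandas as pd

    # Hypothetical sketch of a readmission-rate calculation on synthetic
    # admission records; CMS's real measures are far more involved.
    admissions = pd.DataFrame({
        "beneficiary": ["A", "A", "B", "C", "C", "D"],
        "admit_date": pd.to_datetime([
            "2011-03-01", "2011-03-20",  # A is readmitted within 30 days
            "2011-05-10",                # B has a single stay
            "2011-06-01", "2011-08-15",  # C returns, but outside 30 days
            "2011-07-04",                # D has a single stay
        ]),
    })

    admissions = admissions.sort_values(["beneficiary", "admit_date"])
    # Days since the same beneficiary's previous admission (NaN for first stays).
    gap_days = admissions.groupby("beneficiary")["admit_date"].diff().dt.days
    is_readmission = gap_days <= 30

    rate = is_readmission.sum() / len(admissions)
    print(f"{is_readmission.sum()} of {len(admissions)} admissions were "
          f"30-day readmissions ({rate:.0%})")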

"Health reform is about outcomes and outcomes can be hard to identify in changes," Brennan said. "Health and health behavior can take years to manifest themselves. So we don't know if it's account of the organizations, we don't know if it's the readmission penalties that hospitals are facing for certain conditions."

"Both the government and the private sector are barely scratching the surface here," Brennan said.

About the Author

Reid Davenport is an FCW editorial fellow. Connect with him on Twitter: @ReidDavenport.
