Agencies embrace research analytics in push for greater efficiency

Analytical and reporting tools help agencies improve the delivery and performance of public services

During the early 1980s, the on-time delivery rate for first-class mail handled by the U.S. Postal Service was about 80 percent, according to Maryellen Clarke, manager of customer knowledge management at USPS.

Today, the on-time rate exceeds 94 percent. A major factor in that improvement was the decision by USPS leaders to collect more data about mail delivery and to effectively analyze it, Clarke said today during a panel discussion on the use of analytics in government.

Research analytics technology is becoming an important tool in agencies’ efforts to become more efficient and transparent, several experts on the panel said.

Using research analytics to help formulate policy has “gained speed” under the new administration, said Edward Szymanoski, associate deputy assistant secretary of economic affairs for the Housing and Urban Development Department’s (HUD) Office of Policy Development and Research.

For example, HUD officials use analytics to determine what insurance premiums should be to guarantee government-backed reverse mortgages, Szymanoski said.

Data on reverse mortgages was sparse 20 years ago, according to Szymanoski. So HUD officials developed analytical models and collected more data to ensure insurance premiums covered expected losses.

In response to the growing interest in analytics, IBM, one of the sponsors of the event, announced today that it is opening an analytics solution center in Washington dedicated to helping federal agencies better use data.

Analytics played a key role in USPS improving its on-time delivery rate, Clarke said.

Under the Transit-Time Measurement System, test pieces of mail are put into the system every day to track how long it takes letters, boxes and postcards to be delivered. That simple system has expanded across the Postal Service over nearly 20 years, Clarke said.

“What we found is this data is very useful to the operations and processing folks because what started as a measurement system has become a very robust diagnostics system,” Clarke said.

“We started with a minimum of 40 reports quarterly, and we now produce more than 10,000 on an annual basis,” she said. “Now the organization has sufficient data to identify where the deficiencies are in the system and to make adjustments.”

Meanwhile, Justice Department officials hope analytics will give them a better view of how the department's investments relate to achieving its objectives, according to Jolene Lauria-Sullens, deputy assistant attorney general and controller at Justice.

For example, Justice officials are often asked how much the department spends on conferences per year, Lauria-Sullens said. After reconciling data from multiple financial systems, officials are able to come up with a number.

“We are able to say this is what we spent, but what does that mean?” she said. “What do we want to say about the numbers and how they reflect on the department’s work?”

An effort to unify Justice's financial and acquisition systems should make it easier to answer those questions, she said. Analytical tools will be an important part of the unified system.

“As we build it we must be sure we have the reporting and analytical tools to be able to answer questions in a way we think is reflective of the important work we do,” Lauria-Sullens said.

About the Author

Doug Beizer is a staff writer for Federal Computer Week.
