- By Paula Shaki Trimble
- Apr 09, 2001
Most people can find something about the way computers work that seems unnecessarily difficult, unintuitive or just plain annoying. Bad design elements are grudgingly accepted as a fact of technical life.
But if you depend on computers to control the comings and goings of planes at a busy airport, a system that's hard to use becomes more than a minor irritant. The Federal Aviation Administration learned that lesson in 1997 when air traffic controllers assessing the Standard Terminal Automation Replacement System (STARS) found 98 usability problems with the system. The controllers' concerns ranged from opaque drop-down menus that blocked their view of critical aircraft data on displays to keyboards unlike those they had been using.
The concerns prompted the FAA to make a better effort to keep users and maintenance technicians in the loop as systems are developed. A key component of that effort is the Human Factors Branch at the FAA's William J. Hughes Technical Center in Atlantic City, N.J., which advises the agency on systems development and commercial off-the-shelf acquisitions.
"Often they don't come to us until it's too late, but it's changing," said Earl Stein, manager of the Human Factors Branch. "Many people think of human-factors people as egghead researchers who like to get in their way. We are information sources. If you don't include us, we'll find problems."
The failure to address how people will actually use a system can put a program behind schedule and over budget. In the case of STARS, which is being developed by Raytheon Co., fixing the human factors has slowed deployment of the system and boosted the price tag from $940 million to $1.4 billion.
Last month, Lockheed Martin Corp. offered its Common Automated Radar Terminal System (ARTS) as an alternative to STARS. Common ARTS, already delivered to 136 facilities, has been a contingency system during delays in STARS. Common ARTS is similar to STARS but does not include the human-factors changes sought by controllers, and the FAA will likely stick with the Raytheon system, which will be implemented starting in 2003.
"The largest challenge we faced in the last two years was to complete the development of the new [STARS] software to incorporate the computer/human interface changes identified by our workforce," said Steven Zaidman, FAA associate administrator for research and acquisitions. "That challenge is mostly behind us."
The development of STARS provided valuable lessons about the need to consider exactly how new tools will be used, said Kenneth Mead, the Department of Transportation inspector general.
"The FAA and DOT found out you can't just have a meeting on human-factors issues," Mead said during a hearing last month before the House Transportation and Infrastructure Committee's Aviation Subcommittee. "It's a scientific process. That was not known before STARS."
Since April 1997, the FAA's Acquisition Management System has required that "human factors will be considered during architectural and engineering design to achieve effective human performance during operations, maintenance and support."
The Human Factors Branch employs eight engineering research psychologists and an air traffic controller who report to the FAA's chief scientist for human factors. The group works with the integrated product teams responsible for systems acquisition and with developers of new air traffic control concepts.
Working at the Research Development and Human Factors Laboratory in Atlantic City, branch researchers conduct simulations and perform computer/human interface analyses. The goal is systems that work better because they optimize the strengths of people and machines.
"There is a desire of technologists to implement technology because they can, whether or not it improves performance or reduces workload," Stein said. "If you put technology into a system, do it for the right reasons."