Senses working overtime
Air Force researchers aim to help pilots and others operate increasingly complex aircraft and mission support systems.
- By Brian Robinson
- Aug 01, 2008
Flying aircraft has never been easy. The Wright Brothers spent several years after their inaugural 1903 flight perfecting the three-axis control that would enable a person to fly a fixed-wing aircraft with any precision.
Since then, aircraft have become immeasurably more complex. Modern fighter pilots use a wide range of sensors to fly their craft with split-second reactions, while the new fleets of unmanned aerial vehicles (UAVs) are guided remotely by ground-based pilots who rely entirely on machine sensors.
Flying successfully means pilots have to make sense of a jumble of images and other data that come at them simultaneously. Making sure they can do so, and with constantly improving performance, is the job of the Air Force Research Laboratory’s Human Effectiveness Directorate.
One of the three mission units of the AFRL’s new 711th Human Performance Wing at Wright-Patterson Air Force Base in Ohio, the directorate is focused on developing the technology that allows humans to operate the systems that make up the modern combat-ready Air Force.
“The challenge with flying airplanes is how safely you can incorporate people into them,” said Maris Vikmanis, chief of plans and programs for the 711th. “Since World War II, airplanes have become so complicated that the human-machine interface is now critically important.”
The Human Effectiveness Directorate is near the top of the world’s laboratories in the breadth of its activities, he said. It must deal with a complex array of technology, a wide variety of missions and a national security environment that allows no room for error. At the same time, directorate officials are also trying to transform deeply rooted development processes that place human considerations second.
Traditionally, the machine and the technology are designed first and then the pilot has to deal with what’s left over, usually through training, Vikmanis said.
“We’d like to be able to design things in from the beginning that fit [the human] better,” he said.
The growing prevalence of UAVs on the modern battlefield is helping to spur this change, said Dan Goddard, chief of the directorate’s Warfighter Interface Division. The desire now to have a single pilot operate multiple UAVs simultaneously means a well-designed human-machine interface is essential, he said.
That requirement has carried over into other Air Operations Center weapons systems.
“There’s now so much reconnaissance data flowing down into the AOC that it’s creating information overload,” Goddard said. “You need a much better human-machine interface to be able to get actionable information out of this very quickly.”
For example, computer displays are basic equipment for any data-related task, but some situations require more than the regular 2-D view that standard devices provide. Analyzing columns and rows of numbers is fine with a spreadsheet application, but doctors evaluating a knee injury would do much better if they could see a ligament tear from all angles using a 3-D display.
The same is true of certain command-and-control tasks. There’s only so much you can get out of a 2-D display, said Paul Havig, senior engineering psychologist for the Battlespace Visualization Branch of the directorate’s Warfighter Interface Division. A third dimension lets users see more of the data on the screen and gives them a better way of interacting with it, he said.
“Being able to interact with the data means you get a much better understanding of them and what they mean,” he said. “With map displays, for example, you get much better situational awareness.”
In some cases, true 3-D may not be required. Havig’s branch is experimenting with so-called 2 1/2-D displays, which use perspective cues to give viewers a compelling sense of depth. For example, an image of a person standing in front of a wall on such a display may appear as though the person is separated from the wall, even though it’s not a real 3-D display.
The advantage of 2 1/2-D is that images can be rendered on regular, inexpensive flat-panel displays, Havig said. True 3-D displays are much more expensive.
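The depth effect Havig describes rests on ordinary perspective: nearer objects project larger on a flat screen, and the eye reads the size difference as separation in depth. A minimal sketch of that cue, using the standard pinhole-projection formula (the function and parameters here are illustrative assumptions, not the lab’s actual rendering software):

```python
# Illustrative sketch of the perspective depth cue behind "2 1/2-D"
# rendering: farther objects are drawn smaller, so a flat panel can
# still convey which object sits in front. Pinhole projection model;
# not the AFRL software.

def projected_size(true_size_m, distance_m, focal_length_m=0.05):
    """Apparent size on the image plane of an ideal pinhole camera."""
    return true_size_m * focal_length_m / distance_m

# A person 2 m from the viewpoint vs. a same-height wall section 5 m away:
person = projected_size(1.8, 2.0)
wall = projected_size(1.8, 5.0)

# The nearer figure projects larger, which the eye reads as depth,
# even though both are drawn on the same flat screen.
assert person > wall
```

Because the computation is just a scaling of 2-D sprites, it runs on any inexpensive flat panel, which is exactly the cost advantage Havig cites over true 3-D hardware.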
The high cost of 3-D displays is not the only problem. The few 3-D displays that exist now are drastically different from each other, making it difficult to know how they would perform with different types of data, images and situations.
“I can easily spec out to my folks what’s needed in a high-definition display, and if they went to the local Best Buy, they could pick out a fair number of displays that would meet the specs,” Havig said. “But we only have two 3-D displays in our lab, and if we try to spec out the contrast, luminance and other things, they are such different beasts.”
Other research issues involve such tasks as deciding the best interface for people to use in interacting with on-screen data. A regular mouse turns out not to be so good for this; it would be better if someone could actually reach into the data to interact with it, which means devising more tangible interfaces.
“But we don’t even know when the first 3-D displays will be used in an operational setting,” Havig said. “We’re still trying to work out when they would be useful and for what tasks.”
Power of listening
Sound perception can play an equally important role in combat scenarios. On the battlefield, people often pick up aural cues about what’s happening before they see it. Developing technology that can take advantage of that is the goal of the Battlespace Acoustics Branch of the Warfighter Interface Division.
“Sound sources are all around and people are good at telling what they are and where they come from,” said Doug Brungart, technology adviser at the Battlespace Acoustics Branch. “They turn their heads to where the sounds come from, and we are trying to use the same concepts for use by fighter pilots.”
Pilots must track numerous sensors that have visual displays as they fly the aircraft and operate the weapons systems. Sound cues can be used to alert them to the most important area they need to pay attention to at a given time.
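One of the basic cues that such audio displays can exploit is the interaural time difference: a sound arrives at the nearer ear slightly before the farther one, and listeners perceive that delay as direction. The article doesn’t name a model, but the classic Woodworth spherical-head approximation gives a feel for the numbers involved (head radius and the formula itself are assumptions for illustration):

```python
import math

# Illustrative sketch of the interaural time difference (ITD), one cue
# a spatial-audio alert system can render so a pilot hears roughly where
# a warning "comes from." Uses the Woodworth spherical-head
# approximation; not the AFRL branch's actual model.

HEAD_RADIUS_M = 0.0875    # assumed average adult head radius
SPEED_OF_SOUND = 343.0    # m/s in air at room temperature

def itd_seconds(azimuth_deg):
    """Woodworth ITD for a source at the given azimuth (0 = straight ahead)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source directly to one side arrives at the far ear roughly
# two-thirds of a millisecond late -- enough for the brain to localize it.
print(round(itd_seconds(90) * 1000, 2))  # ~0.66 ms
```

Delaying an alert tone by that amount between the left and right headphone channels is one simple way a system could steer a pilot’s attention toward the relevant display.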
That takes advantage of what Brungart called the cocktail party effect. While a crowded room full of voices and other sounds can seem like a cacophony, he said people are actually pretty good at perceiving individual sources.
“Even if they are not aware of it at the time, people know when sounds change or are switched off in that kind of environment,” he said. “Most people are able to distinguish around six different sounds at any one time.”
Simplifying the picture
Human factors considerations are also important to military operations further from the front lines, such as coordinating global fuel tanker support for warfighters. The Air Force flies about 100,000 such sorties each year.
It’s not an easy task. Controllers must juggle available ramp space at various airfields, heed the diplomatic clearances needed for flying over foreign airspace, and accommodate changes caused by weather and other factors.
Current systems used for this are based on applications such as Microsoft Office, which don’t allow much insight into situational aspects of the operation, such as knowing the changing capacities at various airports for handling tanker missions.
The Work-centered Interface Distributed Environment (WIDE) program in the Cognitive Systems Branch of the Warfighter Interface Division is looking at designing more intuitive systems that will give the controllers who handle these tasks a better grasp of situations.
“Guys can keep track of the capacities at particular airports and try to fix the problems there,” said Jeff Wampler, program manager for the WIDE program. “But missions can go through three or four different airports, crews need rest and so on. Problems have ripple-type effects on future missions, so people are in constant crisis mode trying to deal with this.”
WIDE’s goal is to bring a more visual sense to the task of coordinating these flights. The first step is to understand the cognitive aspects involved by observing how people perform these operations.
“The current dashboards that these people use are littered with red and yellow alerts, for example,” Wampler said. “We want to know how to bring the temporal contrasts of these out graphically, so operators can see the problems at a glance.”
As helpful as many of these solutions sound, it’s not a simple matter of building them and expecting users to embrace them. It often seems to take as much effort to persuade people to use them as it does to develop them in the first place.
There’s been a lack of awareness about the importance of the human-machine interface in the early designs of weapons systems, Goddard said. The directorate is constantly having to market itself and its work, to show what the challenges are and how its technologies and solutions can ultimately save money.
The historical challenge with human interface technology has always been in putting a number on it, Vikmanis said.
“We might intuitively know that a better interface leads to making better decisions,” he said. “But measuring effectiveness is a very hard thing to do.”