Army lab builds AI robots for situational awareness

The Army Research Laboratory is tinkering with unmanned AI-driven robots that can scan a scene for enhanced warfighter awareness.

The Army is designing artificial intelligence for the battlefield with two prototype drones aimed at gathering situational awareness intelligence for soldiers on the ground.

Raghuveer Rao, the image branch chief at the Army Research Laboratory, part of the Combat Capabilities Development Command, supervises the lab's AI scene perception research, which builds bots that can recognize real-world objects.

Rao's lab has developed drone prototypes for ground and aerial use to enhance situational awareness. It is also working with the Joint AI Center and Army AI Task Force to further develop and define priority areas, namely intelligent scene perception and mission learning.

"They're one of a kind at this stage," Rao said. "We're interested in is having algorithms and processes onboard that integrate perception with navigation. We want to be able to understand, … what's going on below when you fly a UAV. And then on that basis [have the UAV] decide if it wants to do a closer examination."

That means the drone would be able to detect people and then decide to investigate what else might be happening in the scene, such as the activities people are engaged in.

"You would want the machine to make its own decisions and take a closer look," Rao said.

ARL is planning to demonstrate the UAV platform in August or September, when the team will look at whether the drone can fly autonomously, discern objects of interest and make decisions about what to do next.

Later this year, Rao said his team will explore multiplatform operations, where ground and aerial drones pool resources to get a better sense of the battlefield. Another challenge will be looking at adversarial environments and conditions because "it's one thing to make sense of what's going on when everything is clearly visible to our cameras, but what if you have impairments? Objects in the way, smoke, fog. What do you do?"

ARL, which now sits under Army Futures Command, is also tinkering with next-generation facial recognition technology using heat signatures to identify targets in the dark. For the face recognition prototype, there's a handheld device that can take thermal images -- recognizing the heat a person or organism emits -- and match them to images in a database, Rao said.

Priya Narayanan, a mechanical engineer in ARL's Signal and Image Processing Division, demonstrated thermal recognition abilities using a land-based robot that stood nearly four feet tall and weighed more than 100 pounds. On a laptop screen, green and blue boxes outlined objects the robot could see in the room: people, chairs, tables. But the robot didn't recognize the wallet offered for inspection by a reporter.

It's not part of the database, Narayanan said, noting the database held only about 20 items the robot could recognize. If the wallet were added, the robot could see it, she said.

The database is the key starting point for identification, Rao said; it is the foundation for analyzing thousands of facial images to make the best association between those images and a thermal signature. And before releasing the technology for battlefield use, the researchers must have a high degree of certainty that the association between image and thermal signature is correct.
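As a rough illustration of that matching step, the sketch below compares a feature vector extracted from a thermal capture against embeddings enrolled from visible-light facial images and returns an identity only when the similarity clears a confidence threshold. The embeddings, names and threshold are made-up placeholders, not ARL's models or data.

```python
# Illustrative gallery-matching sketch: thermal probe embedding vs. enrolled
# visible-light face embeddings. All values below are hypothetical.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_thermal_probe(probe_embedding, gallery, threshold=0.85):
    """gallery maps identity -> embedding built from enrolled facial images.
    Returns (identity, score), or (None, score) when certainty is too low."""
    best_id, best_score = None, -1.0
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe_embedding, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

if __name__ == "__main__":
    gallery = {"subject_A": [0.9, 0.1, 0.3], "subject_B": [0.2, 0.8, 0.5]}
    probe = [0.88, 0.15, 0.28]  # embedding computed from a thermal image
    print(match_thermal_probe(probe, gallery))
```

The threshold stands in for the "high degree of certainty" the researchers say they need before fielding the system; anything below it is reported as no match rather than a guess.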

FCW got to interact with the technology, sitting for a personal thermal recognition reading. But the machine only managed to bring up male targets that bore little resemblance to the reporter, again because of the thin database.

"It's basically a heat profile that comes out of the fact that we are warm bodies emanating heat, and on that basis trying to recognize who's there," Rao said. "In that sense, it's unique."