Should robots kill?
- By Chase Gunter
- Aug 21, 2017
Over 100 robotics and artificial intelligence experts worldwide warned the United Nations about a future of war that includes autonomous killing machines.
In a letter to the U.N. Convention on Certain Conventional Weapons, the 116 signatories, representing companies from 26 countries -- including Tesla and SpaceX founder Elon Musk and Mustafa Suleyman, co-founder of Google's DeepMind -- urge the U.N.'s Group of Governmental Experts to ban the use of such weapons internationally.
"As companies building the technologies in artificial intelligence and robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm," the signatories write, adding that researchers and engineers from their companies are "eager to offer technical advice to your deliberations."
These autonomous weapons include drones, robots and other forms of weaponry that can be controlled by artificial intelligence.
At its December 2016 meeting in Geneva, the U.N. Review Conference of the Convention on Conventional Weapons agreed to begin discussions on prohibiting lethal autonomous weapons. A follow-up meeting originally scheduled for August has been pushed to November.
"Lethal autonomous weapons threaten to become the third revolution in warfare," they write. "Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend."
"These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways," the letter states. "We do not have long to act. Once this Pandora's box is opened, it will be hard to close."
U.S.-based government contractors specializing in defense technologies are absent from the list of signatories.
Andrew Hunter, director of the Center for Strategic and International Studies' Defense-Industrial Initiatives Group, said that crafting such an international policy or standard is trickier than an outright ban would suggest.
"The goal advanced in the letter… is not something that a simple UN ban can completely prevent," he said. "In addition to the fact that some states are likely to ignore such a policy, it is no easy thing to define the kind of autonomy you are trying to ban."
James Lewis, senior vice president and director of the Strategic Technologies Program at CSIS, termed the call for such a prohibition "pious imbecility."
"It would be nice if there were no more wars, [but] since that's unlikely to happen, I want our guys to have the best stuff, and I'd rather put some machine at risk rather than one of our folks," Lewis said. "How hard is this?"
Chase Gunter is a staff writer covering civilian agencies, workforce issues, health IT, open data and innovation.
Prior to joining FCW, Gunter reported for the C-Ville Weekly in Charlottesville, Va., and served as a college sports beat writer for the South Boston (Va.) News and Record. He started at FCW as an editorial fellow before joining the team full-time as a reporter.
Gunter is a graduate of the University of Virginia, where his emphases were English, history and media studies.
Connect with Gunter on Twitter: @WChaseGunter