ENTER SARAH CONNOR

With no laws to stop them, defense firms are on track to make killer robots a reality

Human oversight is crucial, say experts.
Image: REUTERS/Maxim Shemetov

Defense manufacturers are building weapons that can think for themselves, and those weapons are getting smarter, which means the much-feared killer robot could become a reality sooner rather than later. That’s the warning contained in a new report from Pax, a nonprofit based in the Netherlands that campaigns for peace around the world.

Killer robots, or lethal autonomous weapons systems, are designed to make life-or-death decisions on their own, without human control. It’s a worrying leap that’s been called the “third revolution in warfare,” after gunpowder and the atomic bomb. Both activists and military leaders have called for international regulations to govern these weapons, or even ban them outright, but key governments—like the United States and Russia—have so far resisted.

As far as anyone knows, militaries have yet to actually deploy killer robots on the battlefield, at least offensively. But Pax has identified at least 30 global arms manufacturers that don’t have policies against developing these kinds of weapons systems and that are reportedly developing them at a pace outstripping regulation.

The companies include US defense firms Lockheed Martin, Boeing, and Raytheon, the Chinese state-owned conglomerates AVIC and CASC, Israeli firms IAI, Elbit, and Rafael, Rostec of Russia, and Turkey’s STM.

“As long as states haven’t agreed to collectively come up with some kind of regulatory regime, or ideally, a preemptive ban, the fear is very real that companies will be crossing this line and will develop and produce and eventually field weapons that lack sufficient human control,” the report’s author, Frank Slijper, told Quartz.

Activists don’t believe that military use of some degree of artificial intelligence is problematic in itself. The US military already employs full autonomy in some of its defensive weapons platforms, like the US Navy’s Aegis shipboard missile defense system, which is designed to intercept enemy fire on its own. The US Army is developing an AI-capable cannon, which would select and engage targets on its own, as well as AI-assisted tanks that, as Quartz first reported, will be able to “acquire, identify, and engage targets” at least three times faster than any human. But these systems still all require a person to pull the trigger, so to speak.

Pax is more concerned about the potential deployment of AI in offensive systems that would select and attack targets on their own without human oversight. The group questions how these weapons would distinguish between combatants and civilians, or judge proportional responses. Legal experts still don’t know who would be held responsible if an autonomous weapon broke international law. And without lives on the line, these weapons could make it easier to go to war, and for those wars to escalate more quickly.

The report warns that such weapons would “violate fundamental legal and ethical principles and would destabilize international peace and security.”

What they’re building

Defense firms don’t produce weapons in a vacuum, Slijper said. Instead, he said, these weapons are developed because companies believe that’s what militaries want in their arsenals.

And unlike Google or Amazon, which have both faced public and internal backlash for their work on military systems, companies like Lockheed Martin and Raytheon do almost all of their business with militaries, so they face little risk from the negative reaction of consumers.

For its report, Pax sent questionnaires to 50 arms manufacturers that produce military systems, asking each if it had policies regarding autonomous weapons. Just eight firms said they had principles in place guiding their AI work. The rest did not reply.


Of the weapons that exist now, Slijper said he is particularly worried about “loitering munitions.” Pax describes these as hybrids between drones and guided missiles, which can “loiter” in the air for two hours or more before attacking their targets. Because they are small, cheap, and relatively easy to produce, the number of companies developing these weapons has grown considerably in the last 10 years, Slijper said. With so much availability, it’s only a matter of time before they are deployed on a large scale by state and non-state actors alike.

The Pax report singled out two companies that are now manufacturing such weapons:

  • STM, a Turkish state-owned defense company, produces an AI-equipped loitering munition called KARGU. Complete with facial recognition capabilities, KARGU can autonomously select and attack targets using coordinates pre-selected by an operator. Turkey is reportedly set to use these “kamikaze drones” in Syria.
  • The Harpy, a “fire and forget” loitering munition manufactured by state-owned Israel Aerospace Industries, has a range of 62 miles and can stay aloft for two hours. IAI states that the system “loiters in the air waiting for the target to appear and then attacks and destroys the hostile threat within seconds.”

What’s next

While development of autonomous weapons continues apace, Pax believes there is still time to head off eventual catastrophe. The group said companies can play a crucial role in this, and should first make a public pledge against the manufacture of fully autonomous lethal weapons. As far as AI-assisted weapons systems go, Pax believes defense firms must “establish a clear corporate policy with implementation measures” that include:

  • Ensuring each new project is assessed by an ethics committee;
  • Ensuring the principle of meaningful human control is an integral part of the design and development of weapon systems;
  • Adding a clause in contracts, especially in collaborations with ministries of defense and arms producers, stating that the technology developed may not be used in lethal autonomous weapon systems;
  • Ensuring employees are well informed about what they work on and allowing open discussions on any related concerns.

Aside from a German arms industry association, which called for a ban on fully autonomous weapons systems earlier this year, most companies have not committed to any regulations, according to Pax.

It is important for nations to immediately take “bold steps to stop lethal autonomous weapons from becoming reality,” the report says. Yet, while Australia, Brazil, Chile, and Peru have been outspoken in their opposition to fully autonomous weapons, the US and Russia have so far stymied any attempts to pass a unified international treaty.

“Also countries such as Pakistan, Egypt, and Iraq have been supporters of a ban treaty,” Slijper said. “Probably quite understandable that some of the countries that over the past two decades have experienced drone warfare are probably anxious for what the future might bring to them.”