Dozens of defense companies are developing lethal autonomous weapon systems (LAWS), even as humanitarian groups seek to build international support for a treaty to ban them, according to a recent report.

LAWS, so-called “killer robots,” would rely on artificial intelligence (AI) to remove the human from targeting decisions, but how close such systems are to maturity and deployment readiness is a matter of debate in technology circles.

“As part of an imminent arms race to develop increasingly autonomous weapons, states rely on and involve arms producing companies in those efforts,” according to Slippery Slope: The Arms Industry and Increasingly Autonomous Weapons by the Dutch non-governmental organization PAX for Peace. “While digital technology, especially artificial intelligence, can be beneficial in many ways, countless AI and robotics experts have warned that the technology must not be used to develop lethal autonomous weapons. The research however shows the clear proliferation of increasingly autonomous weapon systems. Not only is there a growing number of companies in a growing number of countries developing such weapons, these technologies are also applied to an ever-expanding range of military systems, in the air, on the ground and at sea.”

The United Nations Convention on Conventional Weapons (CCW) has been discussing concerns about LAWS and is to begin devising a “normative and operational framework” for such weapons at meetings in Geneva on June 22-26 and August 10-14. But humanitarian groups are frustrated that the CCW has not made faster progress on the issue and has yet to take up a legally binding instrument to ban or significantly restrict LAWS.

In a message to CCW’s Group of Government Experts convened for a meeting on emerging LAWS technologies last March, U.N. Secretary General António Guterres wrote that “autonomous machines with the power and discretion to select targets and take lives without human involvement are politically unacceptable, morally repugnant and should be prohibited by international law.”

“This reflects what I see as the prevailing sentiment across the world,” Guterres wrote. “I know of no state or armed force in favor of fully autonomous weapon systems empowered to take human life.”

Last February, Russia’s ZALA Aero Group, the unmanned aircraft systems (UAS) division of Kalashnikov, unveiled a “kamikaze” drone — the KUB-BLA — at the International Defense Exhibition and Conference (IDEX) in Abu Dhabi.

The small UAS is designed to have a maximum speed of about 80 miles per hour, an endurance of 30 minutes, and an explosive payload of 7 pounds against “remote ground targets.”

Loitering munitions can have a dwell time of up to six hours and carry sensors that allow the drones to detect and attack targets independently. Examples date to the 1980s, including Israel Aircraft Industries’ Harpy suppression of enemy air defenses (SEAD) drone and the U.S. Air Force’s AGM-136 “Tacit Rainbow” SEAD system by Northrop Grumman, a $4 billion development program canceled in 1991.

“Especially significant are the developments related to loitering munitions, which are able to operate for longer amounts of time and over larger areas in order to select and attack targets,” according to last month’s PAX for Peace report. “Major efforts related to swarm technologies multiply the potential of such weapons. These developments raise serious questions of how human control is guaranteed over these weapon systems.”

The Turkish state-owned firm STM is “improving the capabilities of its KARGU loitering munitions through using AI, including facial recognition,” the report said. “According to the company, the KARGU can ‘autonomously fire-and-forget through the entry of target coordinates.’ It has been suggested that these systems will be deployed on the border with Syria.”

A September article in New Scientist reported that the KARGU positions Turkey “to become the first nation to use drones able to find, track and kill people without human intervention.” The Turkish newspaper Hürriyet has reported that some 30 STM “kamikaze” drones will be deployed to the Turkish-Syrian border region early next year.

The PAX for Peace report on LAWS listed 30 “high concern” companies: those that “work on increasingly autonomous weapon systems and do not appear to have a policy in place [to ensure meaningful human control over such weapons] and did not respond in a meaningful way to our survey.”

Such companies include Lockheed Martin, Boeing, and Raytheon in the United States; China’s AVIC and CASC; Russia’s Rostec; Israel’s IAI, Elbit Systems, and Rafael; and Turkey’s STM, according to the report.

Source: Avionics Today

ARTICLE COURTESY OF UAS VISION
