Nations dawdle on agreeing rules to control ‘killer robots’ in future wars

  • Jan 18, 2020
  • Categories: Counter Drone News

Nations are investing in developing lethal autonomous weapons systems which can identify, target, and kill a person all on their own – but there are no international laws governing their use

By Nita Bhalla

NAIROBI, Jan 17 (Thomson Reuters Foundation) – Countries are rapidly developing “killer robots” – machines with artificial intelligence (AI) that independently kill – but are moving at a snail’s pace on agreeing global rules over their use in future wars, warn technology and human rights experts.

From drones and missiles to tanks and submarines, semi-autonomous weapons systems have been used for decades to eliminate targets in modern-day warfare – but they all have human supervision.

Nations such as the United States, Russia and Israel are now investing in developing lethal autonomous weapons systems (LAWS) which can identify, target, and kill a person all on their own – but to date there are no international laws governing their use.

“Some kind of human control is necessary … Only humans can make context-specific judgements of distinction, proportionality and precautions in combat,” said Peter Maurer, President of the International Committee of the Red Cross (ICRC).

“(Building consensus) is the big issue we are dealing with and unsurprisingly, those who have today invested a lot of capacities and do have certain skill which promise advantages to them, are more reluctant than those who don’t.”

The ICRC oversaw the adoption of the 1949 Geneva Conventions, which define the laws of war and the rights of civilians to protection and assistance during conflicts, and it engages with governments to adapt these rules to modern warfare.

AI researchers, defence analysts and roboticists say LAWS such as military robots are no longer confined to the realm of science fiction or video games, but are fast progressing from graphic design boards to defence engineering laboratories.

Within a few years, they could be deployed by state militaries to the battlefield, they add, painting dystopian scenarios of swarms of drones moving through a town or city, scanning and selectively killing their targets within seconds.

DEATH BY ALGORITHM

This has raised ethical concerns among human rights groups and some tech experts, who say giving machines the power of life and death violates the principles of human dignity.

Not only are LAWS vulnerable to interference and hacking, which could result in increased civilian deaths, they add, but their deployment would raise questions over who would be held accountable in the event of misuse.

“Don’t be mistaken by the nonsense of how intelligent these weapons will be,” said Noel Sharkey, chairman of the International Committee for Robot Arms Control.

“You simply can’t trust an algorithm – no matter how smart – to seek out, identify and kill the correct target, especially in the complexity of war,” said Sharkey, who is also an AI and robotics expert at Britain’s University of Sheffield.

Experts in defence-based AI systems argue such weapons, if developed well, can make war more humane.

They will be more precise and efficient, not fall prey to human emotions such as fear or vengeance and minimise deaths of civilians and soldiers, they add.

“From a military’s perspective, the primary concern is to protect the security of the country with the least amount of lives lost – and that means its soldiers,” said Anuj Sharma, chairman of India Research Centre, which works on AI warfare.

“So if you can remove the human out of the equation as much as possible, it’s a win because it means less body bags going back home – and that’s what everyone wants.”


Read the full article courtesy of Thomson Reuters Foundation.
