Oni Science

Lessons From Ukraine Are Escalating Research Into Developing Killer Robots

February 22, 2023 by admin 0 Comments


The US military is intensifying its commitment to the development and use of autonomous weapons, as confirmed by an update to a Department of Defense directive.

The update, released on 25 January 2023, is the first in a decade to focus on autonomous weapons systems that incorporate artificial intelligence. It follows a related implementation plan released by NATO on 13 October 2022 aimed at preserving the alliance’s “technological edge” in what are sometimes called “killer robots”.

Both announcements reflect a crucial lesson militaries around the world have learned from recent combat operations in Ukraine and Nagorno-Karabakh: Weaponized artificial intelligence is the future of warfare.

“We know that commanders are seeing a military value in loitering munitions in Ukraine,” Richard Moyes, director of Article 36, a humanitarian organization focused on reducing harm from weapons, told me in an interview.

These weapons, which are a cross between a bomb and a drone, can hover for extended periods while waiting for a target. For now, such semi-autonomous missiles are generally being operated with significant human control over key decisions, he said.

Pressure of war

But as casualties mount in Ukraine, so does the pressure to achieve decisive battlefield advantages with fully autonomous weapons – robots that can choose, hunt down and attack their targets all on their own, without needing any human supervision.

This month, a key Russian manufacturer announced plans to develop a new combat version of its Marker reconnaissance robot, an uncrewed ground vehicle, to augment existing forces in Ukraine.

Fully autonomous drones are already being used to defend Ukrainian energy facilities from other drones. Wahid Nawabi, CEO of the US defense contractor that manufactures the semi-autonomous Switchblade drone, said the technology needed to convert these weapons into fully autonomous ones is already within reach.

“‘Android Technology’ and the ‘Foundation for Advanced Research’ (FPI) test the Marker UGV carrying out patrol duties at the Russian spaceport (Vostochny Cosmodrome).”

— Melanie Rovery (@MelanieRovery), October 8, 2021

Mykhailo Fedorov, Ukraine’s digital transformation minister, has argued that fully autonomous weapons are the war’s “logical and inevitable next step” and recently said that soldiers might see them on the battlefield in the next six months.

Proponents of fully autonomous weapons systems argue that the technology will keep soldiers out of harm’s way by keeping them off the battlefield. Such weapons would also allow military decisions to be made at superhuman speed, enabling radically improved defensive capabilities.

Currently, semi-autonomous weapons, like loitering munitions that track and detonate themselves on targets, require a “human in the loop.” They can recommend actions but require their operators to initiate them.

By contrast, fully autonomous drones, like the so-called “drone hunters” now deployed in Ukraine, can track and disable incoming unmanned aerial vehicles day and night, with no need for operator intervention and faster than human-controlled weapons systems.

Calling for a timeout

Critics like The Campaign to Stop Killer Robots have been advocating for more than a decade to ban research and development of autonomous weapons systems. They point to a future where autonomous weapons systems are designed specifically to target humans, not just vehicles, infrastructure and other weapons.

They argue that wartime decisions over life and death must remain in human hands. Turning them over to an algorithm amounts to the ultimate form of digital dehumanization.

Together with Human Rights Watch, The Campaign to Stop Killer Robots argues that autonomous weapons systems lack the human judgment necessary to distinguish between civilians and legitimate military targets. They also lower the threshold to war by reducing the perceived risks, and they erode meaningful human control over what happens on the battlefield.

The organizations argue that the militaries investing most heavily in autonomous weapons systems, including the US, Russia, China, South Korea and the European Union, are launching the world into a costly and destabilizing new arms race. One consequence could be this dangerous new technology falling into the hands of terrorists and others outside of government control.

The updated Department of Defense directive tries to address some of the key concerns. It declares that the US will use autonomous weapons systems with “appropriate levels of human judgment over the use of force”.

Human Rights Watch issued a statement saying that the new directive fails to make clear what the phrase “appropriate level” means and doesn’t establish guidelines for who should determine it.

But as Gregory Allen, an expert at the Center for Strategic and International Studies, a national defense and international relations think tank, argues, this language establishes a lower threshold than the “meaningful human control” demanded by critics.

The Defense Department’s wording, he points out, allows for the possibility that in certain cases, such as with surveillance aircraft, the level of human control considered appropriate “may be little to none”.

The updated directive also includes language promising ethical use of autonomous weapons systems, specifically by establishing a system of oversight for developing and employing the technology, and by insisting that the weapons will be used in accordance with existing international laws of war.

But Article 36’s Moyes noted that international law currently does not provide an adequate framework for understanding, much less regulating, the concept of weapon autonomy.

The current legal framework does not make it clear, for instance, that commanders are responsible for understanding what will trigger the systems that they use, or that they must limit the area and time over which those systems will operate.

“The danger is that there is not a bright line between where we are now and where we have accepted the unacceptable,” said Moyes.

Impossible balance?

The Pentagon’s update demonstrates a simultaneous commitment to deploying autonomous weapons systems and to complying with international humanitarian law. How the US will balance these commitments, and whether such a balance is even possible, remains to be seen.

The International Committee of the Red Cross, the custodian of international humanitarian law, insists that the legal obligations of commanders and operators “cannot be transferred to a machine, algorithm or weapon system.” Right now, human beings are held responsible for protecting civilians and limiting combat damage by making sure the use of force is proportional to military objectives.

If and when artificially intelligent weapons are deployed on the battlefield, who should be held responsible when needless civilian deaths occur? There isn’t a clear answer to that very important question.

James Dawes, Professor of English, Macalester College

This article is republished from The Conversation under a Creative Commons license. Read the original article.

This article was originally published by Sciencealert.com. Read the original article here.
