[Ci4cg-announce] Fwd: [liberationtech] Killer Robots Aren't Regulated. Yet.

Doug Schuler douglas at publicsphereproject.org
Sat Dec 14 11:26:09 PST 2019


Not for the common good....

---------- Forwarded message ---------
From: Yosem Companys <ycompanys at gmail.com>
Date: Sat, Dec 14, 2019 at 9:38 AM
Subject: [liberationtech] Killer Robots Aren't Regulated. Yet.
To: LT <lt at lists.liberationtech.org>


“Killing in the Age of Algorithms” is a New York Times documentary
examining the future of artificial intelligence and warfare.
By Jonah M. Kessel
Dec 13 2019
https://www.nytimes.com/2019/12/13/technology/autonomous-weapons-video.html

https://www.nytimes.com/video/technology/100000006082083/lethal-autonomous-weapons.html

Times reporters traveled to Russia, Switzerland, California and Washington,
D.C., talking to experts in the commercial tech, military and A.I.
communities. Below are some key points and analysis, along with extras from
the documentary.

Do I need to worry about a Terminator knocking on my door?

Most experts say you can rest easy, for now. Weapons that can operate like
human soldiers are not something they see in our immediate future. Although
there are varying opinions, most agree we are far from achieving artificial
general intelligence, or A.G.I., which would allow for Terminators with the
kind of flexibility necessary to be effective on today’s complex
battlefield.

However, Stuart J. Russell, a professor of computer science at the
University of California, Berkeley, who wrote an influential textbook on
artificial intelligence, says achieving A.G.I. that is as smart as humans
is inevitable.

So where are we now?

There are many weapons systems that use artificial intelligence. But
instead of thinking about Terminators, it might be better to think about
software transforming the tech we already have.

Weapons that use artificial intelligence are already in active use, including
some that can search for, select and engage targets on their own, attributes
often used to define a lethal autonomous weapon system (a.k.a. a killer
robot).

In his book “Army of None: Autonomous Weapons and the Future of War,” the
Army Ranger turned policy analyst Paul Scharre explained, “More than 30
nations already have defensive supervised autonomous weapons for situations
in which the speed of engagement is too fast for humans to respond.”

Perhaps the best known of these weapons is the Israel Aerospace Industries
Harpy, an armed drone that can hang out high in the skies surveying large
areas of land until it detects an enemy radar signal, at which point it
crashes into the source of the radar, destroying both itself and the target.

The weapon needs no specific target to be launched, and a human is not
necessary to its lethal decision making. It has been sold to Chile, China,
India, South Korea and Turkey, Mr. Scharre said, and the Chinese are
reported to have reverse-engineered their own variant.

“We call them precursors,” Mary Wareham, advocacy director of the arms
division at Human Rights Watch, said in an interview between meetings at
the United Nations in Geneva. “We’re not quite there yet, but we are coming
ever closer.”

So when will more advanced lethal autonomous weapons systems be upon us?

“I think we’re talking more about years, not decades,” she said.

But for the moment, most weapons that use A.I. have a narrow field of use
and aren’t flexible. They can’t adapt to different situations.

“One of the things that’s hard to understand unless you’ve been there is
just the messiness and confusion of modern warfare,” Mr. Scharre said in an
interview.

“In all of those firefights,” he explained, “there was never a point where
I could very clearly say that it was 100 percent that the person I was
looking at down the scope of my rifle was definitely a combatant.

“Soldiers are constantly trying to gauge — is this person a threat? How
close can they get to me? If I tell them to stop, does that mean that they
didn’t hear me or they didn’t understand? Maybe they’re too frightened to
react? Maybe they’re not thinking? Or maybe they’re a suicide bomber and
they’re trying to kill me and my teammates.”

Mr. Scharre added, “Those can be very challenging environments for robots
that have algorithms they have to follow to be able to make clear and
correct decisions.”

Although current A.I. is relatively brittle, that isn’t stopping militaries
from incorporating it into their robots. In his book, which was published
in 2018, Mr. Scharre wrote that at least 16 countries had armed drones,
adding that more than a dozen others were working on them.

[snip]

-- 
Liberationtech is public & archives are searchable from any major
commercial search engine. Violations of list guidelines will get you
moderated: https://lists.ghserv.net/mailman/listinfo/lt. Unsubscribe,
change to digest mode, or change password by emailing
lt-owner at lists.liberationtech.org.


-- 
Douglas Schuler
douglas at publicsphereproject.org
Twitter: @doug_schuler

------------------------------------------------------------------------------
Public Sphere Project
     http://www.publicsphereproject.org/

Mailing list ~ Collective Intelligence for the Common Good
     http://lists.scn.org/mailman/listinfo/ci4cg-announce

Creating the World Citizen Parliament

http://interactions.acm.org/archive/view/may-june-2013/creating-the-world-citizen-parliament

Liberating Voices!  A Pattern Language for Communication Revolution
(project)
     http://www.publicsphereproject.org/patterns/lv

Liberating Voices!  A Pattern Language for Communication Revolution (book)

 http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=11601

