[ExI] Thousands of scientists pledge not to help build killer AI robots

Adrian Tymes atymes at gmail.com
Wed Aug 1 09:02:30 UTC 2018


On Tue, Jul 31, 2018 at 6:50 PM,  <spike at rainier66.com> wrote:
> Humanity is clearly on the verge of humanoid robots in form: ones which are
> bipedal and look a lot like humans from a moderate distance.  We probably
> aren’t that close to AI, but imagine if… we had androids which could be
> programmed to walk into some public space and start shooting.  They would
> use ordinary firearms, the kind which exist by the billions, and they
> could be entirely untraceable, so there would be no negative repercussions
> to whoever turned it loose.  If there are no repercussions to murder, plenty
> of people in civilized society would do it now, for the most trivial of
> motives.  When I say plenty, I mean 1%, which is plenty.

Consider the quadcopter drone.  It is an increasingly common sight, so
one buzzing around might attract looks but not alarm - until it began
letting loose with a pistol concealed among its machinery.  (Perhaps a
pistol with an extended magazine, whether or not those are illegal at
the location of the shooting spree.)

Let's say the drone's builder was careful: handled it with gloves at
all times so as to leave no fingerprints, paid cash for all the
components, and used an unregistered gun.  Say the builder flew it by
onboard camera from outside visual range - from the back of a parked
van where no one would see the control setup, and which could be
calmly driven away once the drone was taken down.  (3D printed
firearms do not make the unregistered gun that much easier to obtain,
though some would try to claim they are a critical enabling technology
for this scenario even if this particular gun was provably not 3D
printed.)

How would the shooter be found?  And if not, then what consequences
(other than financial outlay, some hours of careful work, and the need
to not do it again in the same area for a while) would the shooter
suffer?
