[ExI] Is AGI development going to destroy humanity?

spike at rainier66.com
Sat Apr 2 16:55:23 UTC 2022


 

 

…> On Behalf Of Adrian Tymes via extropy-chat
Subject: Re: [ExI] Is AGI development going to destroy humanity?

 

On Sat, Apr 2, 2022 at 6:56 AM spike jones via extropy-chat <extropy-chat at lists.extropy.org> wrote:

He is far more convinced than most of us that
unfriendly AI will destroy humanity.

 

>…His main argument seems to be that AI will be unimaginably smarter than humans (achieving superintelligence near-instantaneously through the Technological Singularity process); therefore AI can do literally anything it wants with effectively infinite resources (including time, since it will act so much faster than humanity), and unfriendly AI will have the same advantage over friendly AI, since it is easier to destroy than to create…

 

Ah, kind of an advanced version of Core War?  Apologies to anyone who is too young to have gone to college in the 70s.  Core War was a kind of machine-code analog of the old Battleship game, in which you wrote a piece of code as compact and elusive as a PT boat but as destructive as a carrier (in software terms).  It was how we had fun before actual video games and sex came along.
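
For anyone who never played, here is a rough Python sketch of the idea, a toy mock-up only: real Core War uses the Redcode language running on a MARS virtual machine, so the opcode set (MOV, ADD, JMP, MOVI, DAT), the 80-cell core, and the two little warriors below are simplified stand-ins of my own, not the genuine rules.

# A toy, heavily simplified Core-War-style battle (a sketch only; real Core War
# uses Redcode and a MARS virtual machine, so the names and rules here are
# illustrative assumptions, not the real thing).

CORE_SIZE = 80                                   # circular memory both warriors share
DAT, MOV, ADD, JMP, MOVI = "DAT", "MOV", "ADD", "JMP", "MOVI"

core = [[DAT, 0, 0] for _ in range(CORE_SIZE)]   # executing DAT kills a process

def load(program, at):
    """Copy a warrior's instructions into the core starting at address `at`."""
    for i, ins in enumerate(program):
        core[(at + i) % CORE_SIZE] = list(ins)

def step(pc):
    """Execute one instruction; return the next pc, or None if the process died."""
    op, a, b = core[pc]
    if op == DAT:
        return None
    if op == MOV:                                # copy the instruction at pc+a to pc+b
        core[(pc + b) % CORE_SIZE] = list(core[(pc + a) % CORE_SIZE])
    elif op == ADD:                              # add the immediate a to the b-field at pc+b
        core[(pc + b) % CORE_SIZE][2] += a
    elif op == MOVI:                             # copy pc+a to wherever the b-field at pc+b points
        cell = (pc + b) % CORE_SIZE
        core[(cell + core[cell][2]) % CORE_SIZE] = list(core[(pc + a) % CORE_SIZE])
    elif op == JMP:                              # relative jump
        return (pc + a) % CORE_SIZE
    return (pc + 1) % CORE_SIZE

# "Imp": the PT boat.  A single instruction that copies itself one cell ahead forever.
imp = [(MOV, 0, 1)]

# "Dwarf"-style bomber: drops DAT bombs every four cells, hoping to land one on the enemy.
dwarf = [(ADD, 4, 3), (MOVI, 2, 2), (JMP, -2, 0), (DAT, 0, 0)]

load(imp, 0)
load(dwarf, 40)
pcs = {"imp": 0, "dwarf": 40}

for tick in range(400):                          # alternate turns until someone dies
    for name in list(pcs):
        nxt = step(pcs[name])
        if nxt is None:
            print(f"{name} hit a DAT and died at tick {tick}")
            del pcs[name]
        else:
            pcs[name] = nxt
    if len(pcs) < 2:
        break
print("survivors:", list(pcs) or "none")

Whichever process is first forced to execute a DAT loses; at this tiny core size the outcome depends heavily on where the warriors start, which was half the fun.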

 

Well OK, one can see some plausibility in that argument.  A malicious AI could theoretically wreck the existing software ecosystem.  That would be bad.  At some point we need to think hard about how we would deal with it if a foreign bad actor were to take down the internet or make it not work right.  Consider the US election system alone (never mind our system of commerce for now).  If a bad guy were to make all those voting machines crash the morning of the election and there were not sufficient paper ballots, total chaos would ensue, providing a golden opportunity for China to grab Taiwan while the would-be opposition screamed at each other over who won the election.

 

Nah.  That can’t happen.  Besides, that is off topic, for it isn’t unfriendly artificial intelligence but rather unfriendly biological intelligence doing the damage.  That’s different.

 

spike

 

 

 


