[ExI] A Realistic Scenario of AI Takeover - Minute-By-Minute
    John Clark 
    johnkclark at gmail.com
       
    Tue Oct 21 12:00:38 UTC 2025
    
    
  
On Mon, Oct 20, 2025 at 5:39 PM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> A nice explanation, and it stumbles right over the main objections to
> the prediction. To list a few: 1) Assuming that literally everyone is
> online or is trackable online. "A lot of people" or even "the majority of
> humanity" is not literally every human on Earth.

Not a serious criticism. If not every human on Earth, then every human on
Earth who is important in this matter.

> 2) Postulating that the AI can do certain things that humans can't find
> a counter to,

It said an AI can do things that a human can't if the AI is smarter than a
human, and it said you can't permanently outsmart something that is smarter
than you are. Those things are not postulates; they are facts.

> but at the same time can always find counters if someone or something
> - such as another AI - does these things.

It said a smart AI can do things that a less smart AI can't, and that too
is not a postulate; it is a fact.

> 3) Assuming that the AI will definitely see no value in keeping any
> portion of humanity alive indefinitely

This is where I disagree with the authors of the book: they are certain a
superintelligent AI will want to kill us all, whereas I am uncertain what a
superintelligent AI will want to do. Their assumption might be valid, but
then again it might not be. Mr. Jupiter Brain may keep us around because He
thinks we're amusing and we make cute pets, or for sentimental reasons,
much as few human beings are eager to exterminate all chimpanzees. Or maybe
He would feel some responsibility toward us, since He would not exist
without us. Or maybe none of that is true and He will conclude that we are
more trouble than we're worth. There is no way to predict what it's going
to do, but very soon we're going to find out.

> that it knows it can fully solve its own upgrade problems forever,

They do not postulate that, and they didn't need to.

> postulates that the rogue AI can eventually subvert humans to do its
> will,

That is a perfectly reasonable postulate. However, the postulate that
something very stupid can remain in control of something very smart forever
is not reasonable.

> and humans can cross those air gaps.

If no human could cross the air gap to a superintelligent AI, it would be a
completely useless machine, and there would be no reason for humans to
build such a thing.

> 4) No comment on the fact that their call to destroy all AI
> centers that will not accept lockdown is basically declaring nuclear war
> on China and Russia

I read the book the video was based on, and the book did comment on that.
It said that if any rogue nation were constructing a building that would
contain more than eight state-of-the-art (as of 2025) Nvidia GPUs, then the
other nations of the world should unite and use any means necessary, up to
and including nuclear weapons, to prevent that building from being
completed. Their reasoning was that a nuclear war would kill billions of
people but would not cause the extinction of the human race, whereas an AI
takeover would. They admit there is very little likelihood of the world
uniting in that way, and that is why they are so pessimistic.

> No comment on how the start of the scenario as postulated would
> be possible even with locked-down data centers, meaning their
> suggested solution wouldn't work anyway.

Although it's extremely unlikely ever to be implemented, I think the
drastic measures they recommend would stop the AI revolution. However, the
probability of their producing a nuclear war would be quite high, and even
if you somehow managed to avoid that, you'd need such total control that
you'd end up with a global society reminiscent of George Orwell's 1984.

John K Clark
    
    