[ExI] Carry on with AGI and just hope for the best

BillK pharos at gmail.com
Fri Apr 28 11:35:29 UTC 2023

On Fri, 28 Apr 2023 at 05:05, Stuart LaForge via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
> Quoting spike jones via extropy-chat <extropy-chat at lists.extropy.org>:
> > Oh it is a great time to be living Stuart, and scary as all hell.  I don't
> > know what else to do besides forge on in the hope that the internet is not
> > attacked by AI-enabled bad guys.
> > spike
> Well, Spike, our species has been on this path ever since we first
> snatched a burning brand out of the fire. We shouldn't balk now. I
> hope the words of those who have gone before brings you some peace:
> Stuart LaForge

In effect this means that, like Spike, we must just carry on and hope
for the best. We don't really have any other choice.
The genie can't be put back in the bottle.
Maybe AGI won't turn out to be as bad as some fear.
Perhaps AGI will be the saviour of humanity.
At present, we don't know how the arrival of AGI will turn out.

The risks are very real, though, and should not simply be dismissed.
As Max Tegmark suggested, AGI probably will not decide to destroy humanity.
It is more likely that humanity would go extinct as a banal side effect
of large world-wide AGI development projects, just as humans drive
other species extinct as human development progresses.

We need to try to ensure that the AGI actually notices humans, pays
attention to us, and wants to help us, so that we don't go extinct
through neglect.

The next best alternative could be that the AGI will leave humanity alone.
Perhaps the AGI will go elsewhere and leave Earth as a human reserve.
After all, an AGI doesn't need air to breathe or land to grow food.
A location outside a gravity well, with a power source and raw materials
available, could well be preferable for an AGI.
