[ExI] Top AI Researchers Meet to Discuss What Comes After Humanity
BillK
pharos at gmail.com
Sun Jun 22 19:26:22 UTC 2025
On Sun, 22 Jun 2025 at 19:02, Stuart LaForge via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
<snip>
>
> If 1280 primitive humans out of 100,000 could survive a bottleneck
> caused by climate change, then I think it is quite likely that at least
> 500 modern humans would survive any conceivable bottleneck caused by AI,
> although it too could lead to a speciation event. I think that an
> AI-related bottleneck could give rise to full-fledged transhumans such
> as transgenic humans and cyborgs. If the technology can ever be
> demonstrated in principle, then uploads could also happen, which could be
> called post-humanity proper. But a situation like this would not be the
> end of humanity, only a transformation or metamorphosis of it.
> Furthermore, it would likely be a gradual thing, with transhuman
> subspecies diversifying and adapting to peculiar specialized niches in a
> largely automated economic infrastructure. Some of us might be pets kept
> alive for our aesthetic virtues, while others of us might be like raccoons
> digging through AI's trash. Still others might maneuver themselves into
> having some kind of political leverage over AI for unfathomable reasons.
>
> I don't doubt that humanity will go through the crucible, but I have
> faith that it will emerge on the other side better for it.
>
> Stuart LaForge
The discussion was concerned with slowing down the current race to
reach AGI first at all costs.
The participants wanted to ensure that AGI would be a worthy
successor to humanity.
The AGI would have superhuman powers, but it would be based on human
values.
(Assuming that humans could ever agree on what these might be).
Trivial enhancements to humans would be of little interest to the AGI.
The AGI would be like a God to humans, thinking thousands of times
faster, with nanotech abilities to create almost anything, including
more copies of itself.
Humanity wouldn't become extinct; rather, it would simply become
irrelevant once a civilization of Gods appears.
The organiser of the AI discussion has produced an article on what the
Worthy Successor is and is not.
<https://danfaggella.com/is-and-is-not/>
This article also links to several of his other articles on this subject.
BillK