[ExI] How Close is the AI Intelligence Explosion?

Keith Henson hkeithhenson at gmail.com
Sat Mar 22 19:28:14 UTC 2025


On Sat, Mar 22, 2025 at 11:30 AM Ben Zaiboc via extropy-chat
<extropy-chat at lists.extropy.org> wrote:

snip

> This is just like global warming. It's happening, and there's nothing we can, or rather, nothing we will, do about it.

That's currently the case, but since 1990 I have been talking about
humans pulling so much CO2 out of the atmosphere that it causes an ice
age.  Current humans want houses; it's not hard to project
nanomachines that suck carbon out of the air and grow houses out of
diamond.  How far off is this level of nanotechnology?

> And, like global warming, I reckon that anyone with an ounce of sense should not be concentrating on how to avoid it (because that's pointless), but on how to survive it (because that's slightly less pointless).
>
> 'Keep the future human' is sad, distasteful and just wrong. The future will not be 'human', that's pretty much certain (and I'd be disappointed if it were. It would mean, essentially, that we'd failed).

I don't know.  AIs may need humans and keep them like we keep cats.

Keith

> What's more important, I think, is that we do what we can to make our mind-children the best they can be, regardless of what happens to the human race.
>
> --
> Ben
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
