[ExI] Existential risk of AI
BillK
pharos at gmail.com
Tue Mar 14 18:18:18 UTC 2023
On Tue, 14 Mar 2023 at 17:03, Tara Maya via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
> If AI loved us as much as our dogs love us, it would be a wonderful Singularity.
>
Yes, but.....
Some people worry that, after the Singularity, the AGI and its
robot helpers might kill humanity with kindness.
Humans don't do well when everything is provided for them.
The AGI would probably have to provide some form of virtual reality
where humans could go on quests and have adventures without ever
dying or coming to harm.
Keeping humanity happy would be a major task for the AGI caretaker.
BillK