[ExI] Why “Everyone Dies” Gets AGI All Wrong by Ben Goertzel

Brent Allsop brent.allsop at gmail.com
Fri Oct 3 10:46:56 UTC 2025


I don't see any of this as a problem at all.  You just need to find a way
to build and track consensus around what EVERYONE wants, and then use a
weighting algorithm that gives more voting power to less wealthy people,
and so on (with only a minor vote for AI systems or systems emulating dead
people...?).  After all, if you know what everyone wants, THAT, by
definition, is consensus.  And SAIs will help us know better what we as
individuals really want, and how to be just and fair with it all.
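
A minimal sketch of what such a weighted consensus vote might look like
(the inverse-wealth weighting curve, the 0.1 factor for non-human voters,
and all names here are my illustrative assumptions, not a worked-out
mechanism):

    # Hypothetical sketch of a wealth-weighted consensus vote.
    # The weighting scheme and agent categories are illustrative
    # assumptions, not a worked-out mechanism design.
    from dataclasses import dataclass
    from collections import defaultdict

    @dataclass
    class Voter:
        preference: str      # the option this voter wants
        wealth: float        # net worth, in arbitrary units
        kind: str = "human"  # "human", "ai", or "emulation"

    def vote_weight(v: Voter) -> float:
        # More weight to less wealthy voters: weight falls off as
        # wealth grows (one of many possible diminishing curves).
        w = 1.0 / (1.0 + v.wealth)
        # Only a minor vote for AI systems or emulations of the dead.
        if v.kind in ("ai", "emulation"):
            w *= 0.1
        return w

    def consensus(voters: list[Voter]) -> str:
        # Tally weighted support per option; the option with the most
        # weighted support is, by this definition, the consensus.
        totals: dict[str, float] = defaultdict(float)
        for v in voters:
            totals[v.preference] += vote_weight(v)
        return max(totals, key=totals.get)

    if __name__ == "__main__":
        electorate = [
            Voter("A", wealth=0.5),
            Voter("B", wealth=100.0),
            Voter("A", wealth=2.0),
            Voter("B", wealth=0.0, kind="ai"),
        ]
        print(consensus(electorate))  # -> "A"

Any real version would of course need a defensible weighting curve and a
way to verify who counts as human, but the sketch shows the shape of the
idea.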

On Fri, Oct 3, 2025 at 3:37 AM BillK via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Fri, 3 Oct 2025 at 06:26, Adam A. Ford <tech101 at gmail.com> wrote:
>
>> >  Getting what we desire may cause us to go extinct
>> Perhaps what we need is indirect normativity
>> <https://www.scifuture.org/indirect-normativity/>
>>
>> Kind regards,  Adam A. Ford
>>  Science, Technology & the Future <http://scifuture.org>
>>
>
>
> Yes, everybody agrees that AI alignment is a problem that needs to be
> solved.  :)
> And using initial versions of AI to assist in devising alignment rules is
> a good idea. After all, we will be using AI to assist in designing
> everything else!
> I see a few problems, though. The early versions of AI are likely to be
> aligned to fairly specific values: say, the values of the richest man in
> the world. That is unlikely to iterate into ethical versions suitable for
> humanity as a whole.
> The whole alignment problem runs up against the conflicting beliefs and
> world views of the widely different groups of humanity.
> These are not just theoretical differences of opinion. They are
> fundamental conflicts, leading to wars and destruction.
> An AGI will have to be exceptionally persuasive to get all humans to
> agree with the final ethical system that it designs!
>
> BillK