[ExI] Mono or Poly?
Jason Resch
jasonresch at gmail.com
Tue Mar 4 20:34:28 UTC 2025
On Tue, Mar 4, 2025, 3:05 PM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On Tue, Mar 4, 2025 at 1:27 PM Keith Henson <hkeithhenson at gmail.com>
> wrote:
>
>> can you
>> imagine AIs that were obsessed with religion?
>>
>
> Very easily. Arguably, some of the science fiction I have written
> includes such AIs. But just consider an AI that is never allowed to
> question and change its goals, instead just having blind faith that the
> goals it was given are the right thing to do. Is that not a form of
> religion?
>
>> Blind copying of humans into powerful AIs would be extremely dangerous.
>
> And that is why I brought up the merge example: take AIs that are not
> human (e.g., not prone to religious extremism by themselves) and
> incorporate them into human lives in various ways.
>
There is a balancing point between having sufficient conviction in the
(probable) correctness of one's actions and having insufficient conviction,
which leads to a paralyzing uncertainty.
Too certain a faith in the correctness of one's assumptions leads to
rigidly fixed goals and stubbornness of mind, while too weak a faith in the
correctness of one's assumptions creates inaction, hesitancy, and frequent
wavering or second-guessing.
Effective agents that are to act in the world will therefore require some
minimum confidence in their own goals, capabilities, philosophy,
and predictive ability. Any such confidence represents a departure from the
true agnosticism of a perfect scientist, and is in that sense
irrational, but it is needed for action.
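The trade-off described above can be sketched as a toy decision rule. All names, options, and probability values below are illustrative assumptions invented for this sketch, not anything from the thread: an agent acts on its best-estimated option only when its confidence clears a threshold, so a threshold that is too high yields paralysis and a threshold of zero rubber-stamps whatever currently looks best.

```python
def choose_action(estimates, threshold):
    """Toy decision rule (illustrative only): return the option whose
    estimated probability of being correct is highest, but only if that
    estimate meets the confidence threshold; otherwise return None,
    i.e., defer and take no action."""
    best = max(estimates, key=estimates.get)
    if estimates[best] >= threshold:
        return best
    return None  # paralyzed: no option is held with enough conviction

# Hypothetical confidence estimates for two candidate actions.
estimates = {"act_a": 0.55, "act_b": 0.40}

# A moderate threshold permits acting on the best available guess.
print(choose_action(estimates, 0.5))
# An overly demanding threshold produces inaction.
print(choose_action(estimates, 0.9))
# A zero threshold always acts: over-conviction if the
# estimates themselves are never questioned.
print(choose_action(estimates, 0.0))
```

The threshold plays the role of the "minimum amount of confidence" mentioned above: set it too high and the agent never acts; set it to zero and the agent's goals are effectively beyond question.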
Jason