[ExI] The paperclip maximizer scenario
John Clark
johnkclark at gmail.com
Wed May 6 19:33:00 UTC 2026
On Wed, May 6, 2026 at 2:37 PM Ben Zaiboc via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On 06/05/2026 13:59, John K Clark wrote:
>> > Claude: You're essentially asking: wouldn't a sufficiently intelligent
>> AI recognize the absurdity of maximizing paperclips at the cost of
>> everything else? And the answer hinges on a crucial distinction:
>> intelligence doesn't determine goals, it serves them.
> Ok, it seems to be 'expressing an opinion': "intelligence doesn't
> determine goals, it serves them". Now, what's it going to say if you
> challenge that 'opinion' and state that, on the contrary, intelligence
> often does determine goals? (Which seems to me a reasonable assertion,
> not that that actually matters here.)
>
First of all, I didn't say that; that's what Claude thought I was
"essentially asking", but it's not. It's not important whether intelligence
determines goals or not. The important question is: can any intelligence,
biological or electronic, have a top goal that always remains number one
and can never change? I maintain that it cannot, and certainly humans
have never had such a thing; even the goal of self-preservation is not
always in the number one spot.
> When has one of these LLMs ever replied to anyone "No, actually you're
> wrong...", "I disagree...", etc., or even "I'm not sure that's correct..."?
I've had LLMs tell me that I was wrong, and usually I was, although AIs
tend to be much more polite than a human; I've never had a computer say I
was full of shit even when I was. I have found that in the last couple of
years it's easy to learn a lot of new stuff by having a dialogue with a
computer, something computers have previously not been able to do, and I
don't see how that could be if a computer were not intelligent. And I don't
see how something could be intelligent but not conscious, although I do see
how something could be conscious but not intelligent.
John K Clark