[ExI] thought experiment part 1

John Clark johnkclark at gmail.com
Wed Dec 10 12:00:42 UTC 2025


On Tue, Dec 9, 2025 at 4:53 PM Colin Hales via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Perhaps rethink your ideas on money? The only thing that makes a rich
> man rich is a belief,
>

The only reason most people believe that having a lot of money is more
desirable than having very little money is that a rich man has a greater
ability than a poor man to induce other people to do what he wants done.
And the only reason money has this power over people is the near-universal
belief (perhaps "agreement" would be a better word) that money has value.
However, technology is rapidly approaching the point where machines alone
can do what somebody wants done, completely bypassing human labor, and thus
also bypassing the abstract concept of "money". That alone would produce a
semi-singularity, but it is only a very small part of the cosmic upheaval
we are rapidly heading towards.

John K Clark

> by a market, that the 'idea' of what money is has attached to that person.
> Money is not real. Acting as if it is real is the only thing that reifies
> it.
>
> Then there's the matter of the impact of AI automating humans out of
> everything. Consider what happens if you defund all your customers, who no
> longer have jobs and are on subsistence incomes. Those customers ARE your
> market. The fatcat with all the $ thereby kills all the actual value. A
> trillionaire with no market and no humans to buy what he sells is a deluded
> pauper with nothing but a large number in a computer, and will starve into
> misery & death like everyone else.
>
> We either share all the benefits of AI or we sink into misery, mediocrity,
> war and religiosity.
>
> What I am hoping is that money itself can go away, perhaps a poison to be
> carried by the AIs, who can squabble over it and screw each other over and
> leave us out of it while they organize supply.
>
>
> On Tue, 2 Dec 2025, 12:08 am John Clark via extropy-chat, <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Wed, Nov 26, 2025 at 6:48 PM <spike at rainier66.com> wrote:
>>
>>
>>> > John opined that a few thousand people will shortly become
>>> > trillionaires, with an implied opinion that this would be a bad thing.
>>>
>>
>> By itself that's not a bad thing; however, if all the astronomically huge
>> amount of new wealth generated by AI went to just a few thousand people,
>> then objectively it would be a bad thing, assuming that a bloody Civil War
>> would be a bad thing.
>>
>>
>>
>>> > Take it to the absurd but absolutely ultimate extreme: one guy owns
>>> > not just a trillion dollars, but aaaaaalllll the daaaaaaammm money in
>>> > the world. One guy. Somehow he gets AI working for him before anybody
>>> > else, and now he owns it all, one guy with all the money in the world.
>>>
>>> > OK. Then what?
>>>
>>
>> Good question. Then Mr. One Guy would have no further use for anybody
>> else, and so the future of the entire human race would depend on the whim
>> of just one human being. So what happens when he is in a bad mood? I would
>> prefer Mr. Jupiter Brain have that power because I very much doubt Mr. One
>> Guy managed to accumulate all the wealth that exists by being Mr. Nice Guy,
>> and Mr. Jupiter Brain would be one hell of a lot smarter. From history we
>> have learned that when it comes to human beings, power corrupts and
>> absolute power corrupts absolutely, but perhaps a Jupiter Brain would
>> behave differently.
>>
>>