[extropy-chat] How to be copied into the future?

Lee Corbin lcorbin at rawbw.com
Mon Apr 23 03:17:16 UTC 2007


Mike writes


> Samantha wrote:
>> Lee Corbin wrote:
>> > That's a truism.  But I take you to mean that you'd be willing to
>> > defer to its terms.  No?  As for me, if it is cheap in terms of time
>> > and resources, and the advanced entities could save us---but
>> > choose not to (for whatever reasons)---then I'd just as soon
>> > they not exist either.  The bastards.
>>
>> You would prefer no intelligence at all in the local universe if that
>> intelligence wasn't human or formerly human?

That's right.  At least under the condition that they could have saved
us---at no expense, even!---but did not deign to.  To hell with them.

You can't let people push you around.

> If They are so full of themselves that they can't be bothered to run
> LeeCorbin as a screen saver, then Lee is justifiably bitter

It's the least they could do.  And I can't believe that anyone would
fail to be at least a bit miffed at their actions:

Suppose an AI takes over the world, answers all our math, physics,
and general-knowledge questions, and then suddenly says, "Oh, by the
way, I've decided I don't want you around anymore.  Come Tuesday---
that's it!"

> and is (verbally) spitting on them.  :)

Yeah---and don't forget, I'm making a fuss for the rest of you too!
Let's plant the fear right now:  fear that it's all a test and that we have
a secret way to pull the plug---or that they're the ones in a simulation,
and we're waiting to see whether they pass a test of basic gratitude!  :-)

Lee



