[extropy-chat] A brief transhuman life, anyone?

Eliezer Yudkowsky sentience at pobox.com
Wed Feb 9 21:08:28 UTC 2005


Damien Broderick wrote:
> 
>>>> I think that in the end I would choose the ten years of transhuman
>>>> life, for I cannot imagine I would choose forty years at average
>>>> intelligence over twenty years continued as myself.
> 
> [Paraphrase: Eliezer currently stands halfway between average human and 
> transhuman and would prefer it that way even at a cost of abbreviated 
> life span]

Nnnoo...

I'm saying that I wasn't sure which option I would take, until I realized 
how much I value the intelligence I already have.  This being the case, if 
I were smarter, I would probably value that too much to give it up.  So I 
should choose the short but smart life.

I don't know how you get the "halfway" figure.

>>> You missed my point: I was saying that in the real world there are
>>> always options C->infinity ... because you can always move the
>>> goalposts. By constraining yourself to visualizing dilemmas based on a
>>> given set of constraints, you deny your (transhumanist) ability to
>>> engineer a better set of constraints to live within.
> 
> [Paraphrase: Reason thinks part of the definition of being transhuman is 
> the capacity to alter human-scale constraints such as life span]
> 
> [Eliezer again:]
> 
>> "I do not know what I may appear to the world; but to myself I seem to 
>> have been only like a boy playing on the seashore, while the great 
>> ocean of truth lay all undiscovered before me."  So Newton said, and 
>> he was right; he knew how little he knew.  And then he died, and worms 
>> ate his brain, and he never learned the answers to his questions.
>>
>> Tell me how he could have moved the goalposts.
> 
> By becoming (mysteriously, somehow) an accelerated transhuman with the 
> ability to move the goalposts--the initial posit in this rather daffy 
> thread.

I don't think so.  There could always be a bigger superintelligence 
imposing limits on you.

> Back in the real world: by choosing not to work with mercury, which ate 
> his brain before the worms did.

He would still have died.

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


