[extropy-chat] A brief transhuman life, anyone?
Eliezer Yudkowsky
sentience at pobox.com
Wed Feb 9 13:58:37 UTC 2005
Reason wrote:
>
> Eliezer Yudkowsky <sentience at pobox.com>
>
>> I think that in the end I would choose the ten years of transhuman
>> life, for I cannot imagine that I would choose forty years at average
>> intelligence over twenty years continued as myself.
>>
>> Some have replied: "Oh, you can't present me with that dilemma, there
>> *must* be an option C". I think they don't realize how cruel the
>> real universe can be. Tell it to humanity's dead. The question is
>> fair.
>
> You missed my point: I was saying that in the real world there are
> always options C->infinity ... because you can always move the
> goalposts. By constraining yourself to visualizing dilemmas based on a
> given set of constraints, you deny your (transhumanist) ability to
> engineer a better set of constraints to live within.
"I do not know what I may appear to the world; but to myself I seem to have
been only like a boy playing on the seashore, while the great ocean of
truth lay all undiscovered before me." So Newton said, and he was right;
he knew how little he knew. And then he died, and worms ate his brain, and
he never learned the answers to his questions.
Tell me how he could have moved the goalposts.
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence