[extropy-chat] frozen in fire
Eliezer S. Yudkowsky
sentience at pobox.com
Sun Jan 21 23:20:43 UTC 2007
Samantha Atkins wrote:
>
> Elegant and poetic, perhaps, but likely? I have doubts. Precisely why
> should we expect some post-Singularity being, of a nature utterly
> unknowable to us today, to choose to reanimate pre-Singularity beings?
If the future is *utterly* unknowable you might as well write it over
with random noise. That's what a maximum-entropy ignorance prior means.
If the future is *utterly* unknowable, there's no point in wanting it
to be full of the descendants of humanity, rather than lifeless stars
shining until they go out. If the future is not utterly unknowable, and
we have any part in shaping it - if there is meaning to our lives - then
empathy and sympathy are surely among the most important things of all
to preserve. They may change form, but I *choose* - for I know of no
way for a nice future to come into existence, except by choices we make
along the way - that they *must* be preserved.
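
(A quick gloss on the probability jargon above, taking for concreteness
N mutually exclusive candidate futures: with no constraint beyond
normalization, the entropy-maximizing prior is the uniform one,

$$ p_i = \frac{1}{N}, \qquad H(p) = -\sum_{i=1}^{N} p_i \log p_i = \log N, $$

which is the largest value H can take. Under that prior every future,
random noise included, gets equal weight, and no action changes the
expected outcome - which is the precise sense in which an *utterly*
unknowable future might as well be written over with noise.)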
> What if the pre-Singularity being would require a near-full rewrite to
> begin to comprehend the world as it had become? Would reanimation of
> the old self be considered good quite so automatically?
Meh, so it takes a few thousand years to grow up, whatever.
> For your sake? Just because you are a sentient? Is that how we act
> now?
It's how I act. It's not universal, but I'm not exactly a unique
specimen either, Samantha.
> No? Then why do we have such an assurance that these future
> beings will act thus?
Assurance? No. I *choose* that future beings will act thus. I've made
my decision - now I just have to implement it.
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence