[ExI] singularity summit on foxnews
stefano.vaj at gmail.com
Fri Sep 14 20:54:55 UTC 2007
On 9/14/07, hkhenson <hkhenson at rogers.com> wrote:
> Consider the plant kudzu.
Yes. I will be the first to admit that I do not care much about the
success of kudzu, nor do I identify much with its destiny. I suspect a
kudzu individual would take a different view, however.
> >but above all why you or
> >I should care, especially if we were to be (physically?) dead anyway
> >before the coming of such an AI.
> You can't count on it, not unless you take steps to die real
> soon. It is very likely someone will be alive at the point AIs reach
> takeoff. The problem with AIs thinning out the world's excess
> population is that it's hard to imagine a situation where unfriendly
> AIs didn't make a clean sweep.
Why don't we make as clean a sweep as possible of other species, or
for that matter of silicon crystals? Because if they are not in our
way, we basically do not care (even setting aside the living species
or other chemical configurations we actively like). Why should AIs be
any different?
> The assumption on this list used to be that people intend to live a
> very long time so there were no problems in the future they were not
> concerned about. (A lot, if not most, of the early Extropians were
> signed up for cryonic suspension in the event they needed it.)
Fine with me. But the definition of "survival", once we exceed our
current lifespan and limited extensions thereto, is very much an
open question. The memory of you? Your memories? Your genes? Your
species? A biologically- or otherwise-based "identity" which is
somehow related to your present self, even though ever more
tenuously with the passing of time? Successors who carry forward your
culture? Future entities sharing your extropian views?