lcorbin at rawbw.com
Wed Apr 11 17:57:33 UTC 2007
> Ok let's think about this logically and unemotionally. Let's suppose for the
> sake of argument that Lee Corbin's philosophy is correct; what would be the
> result? You're dead meat.
Not necessarily! Yes, probably, but then anything that bothers to remember
you will be at a comparative disadvantage, and so "probably" you'll be dead
meat pretty soon too. But there is a way to have your cake and eat it too.
It's simply this: as you self-improve, adopt the maxim that you will *always*
run earlier versions of yourself in the background.
So I will try to keep pace with the rest of you if the lucky outcome occurs,
and an AI takes over that is willing to let us live and even, say, willing to
let us approach its own capabilities to within 1% or something.
> You can't upgrade so soon you'll be surrounded by beings enormously
> more powerful than yourself, and you don't stand a snowball's
> chance in hell of surviving the Singularity meat grinder.
As soon as a nation starts to become wealthy, the disparity between
the rich and the poor grows apace. The same will be true with your
"upgrades". Compared to some, you yourself will always be pitifully
behind. You're already far behind some people in IQ, when IQ isn't
even yet seriously affecting survival.
So I say that if you do survive, don't take the chance that I'm wrong
about this: run earlier versions of yourself from time to time, or on
some small fraction of your resources. That way I'll always have a
JohnClark 2007 to argue with!