[extropy-chat] Avoid Too Much Change.
John K Clark
jonkc at att.net
Thu Apr 12 15:15:39 UTC 2007
>>If I'm more powerful than you, that means I can do things you can't,
> But not an evolutionary advantage.
I can kill you. You can't kill me. That is an evolutionary advantage.
> What's your evolutionary fitness?
> So if you acquire the wherewithal, are you planning to go around
> "hammering" people who hold the belief in question? What are you planning
> to do, beat them up? Machine gun them to death? Gas them?
It could be, but I don't really know. As Lee Corbin pointed out, "this being
Isador with his IQ 12,000 and his unbelievably vast erudition has concerns
that you today cannot relate to in the slightest". And even if I don't feel
like eliminating those pesky meat bags you can bet somebody else will, and
it only takes one.
> My point is merely that there are times when it's important to remind
> ourselves of the difference between stories and reality.
Reality is unobtainable; all you can do is make up stories about it.
> AI doesn't presently exist, it isn't going to exist unless people build
> it, and nobody rational enough to be capable of contributing to the field
> is going to build an AI that can't be controlled and whose motives are
If I don't build an AI then people in country X will, and then I'd be in deep
shit, so best to make one first and hope for the best. And besides, there is
something irresistible in working on such a Godlike project.
And I might add that there are people on this very list who think they can
outsmart and control an intelligence a billion times greater than their own;
by the time they learn they are wrong, it will be too late.
John K Clark