[extropy-chat] Avoid Too Much Change.

Russell Wallace russell.wallace at gmail.com
Wed Apr 11 18:20:04 UTC 2007

On 4/11/07, John K Clark <jonkc at att.net> wrote:
> Russell Wallace
> > We're programmed to believe personal power confers selective advantage,
> > because it was true in the conditions in which we evolved - but even
> > though we still believe it because we're programmed to, it's no longer
> > true.
> No longer true?! If I'm more powerful than you that means I can do things
> you can't, and that gives me an advantage over you.

But not an evolutionary advantage. What does it do for your evolutionary fitness?

> People who have a superstition against radical upgrades are going to get
> hammered by those who don't have that prejudice.

So if you acquire the wherewithal, are you planning to go around "hammering"
people who hold the belief in question? What are you planning to do, beat
them up? Machine gun them to death? Gas them? Zap them with unobtainium?

> Yes it is a story, but what is your point? Stories are a good thing, stories
> are how we understand the way the world works.

Sure. My point is merely that there are times when it's important to remind
ourselves of the difference between stories and reality.

> So we're back to that "friendly AI" nonsense. The AI is going to do what it
> wants to do and it will not care if you "tolerate" it or not. You won't be
> able to command it and you won't be able to trick it because you can't
> outthink something smarter than you are.

Except that AI doesn't presently exist, it won't exist unless people build
it, and nobody rational enough to be capable of contributing to the field is
going to build an AI that can't be controlled and whose motives are
destructive.
