[ExI] MAX MORE in Second Life yesterday

hkhenson hkhenson at rogers.com
Sat Jun 14 00:25:53 UTC 2008

At 02:51 PM 6/13/2008, Michael wrote:

>On Thu, Jun 12, 2008 at 11:01 PM, hkhenson 
><hkhenson at rogers.com> wrote:
>I am surprised you would even consider the singularity in terms of
>geography.  If a smarter-than-human AI existed anywhere within light
>hours of the net it would have huge effects unless it was blocked
>from communicating.  If it was blocked, it could be co-located with
>MAE West and have no impact.
>Keith, a smarter-than-human intelligence could be an enhanced human, 
>network of interfaced humans, or some other non-AI 
>superintelligence.  These possibilities were introduced in Vinge's 
>original essay.

Read it -- many years ago, and several times since.  If you have since 
read "A Deepness in the Sky," go back and read it again.

>Having been locked up in solitary confinement with very
>limited communications recently I can state that you *really* don't
>want to do that to something smarter than humans.  It's bad enough to
>lock up an engineer.
>You wouldn't want to do it because it would prevent the AI from 
>helping us, but to attribute feelings of resentment to an AI because 
>you lock it up is attributing human psychology to a non-human entity.

Two of the three you mentioned above *are* humans, and I don't know 
about the third.  I suspect (read Marvin Minsky's latest book) that 
the quality we refer to as intelligence will not emerge until the 
entity has emotions and personality.  It would greatly surprise me if 
a useful personality could be constructed that didn't incorporate 
resentment as an emotion in appropriate circumstances.

If you want an AI to help humans, then you need to give it the 
emotions a helpful human has.

Incidentally, the combination of an engineer and something as simple 
as a spreadsheet has superhuman intelligence.


More information about the extropy-chat mailing list