[ExI] Digital Consciousness .

spike spike at rainier66.com
Thu Apr 25 13:46:28 UTC 2013


 

 

From: extropy-chat-bounces at lists.extropy.org
[mailto:extropy-chat-bounces at lists.extropy.org] On Behalf Of Gordon
Sent: Wednesday, April 24, 2013 9:51 PM
To: ExI chat list
Subject: Re: [ExI] Digital Consciousness .

 

spike,

 

>>. A chess program is not conscious, but it creates some gooooorgeous
combinations that feel like an iron fist reached out of the computer and
grabbed you by the ass.  

 

>.I understand exactly what you mean, spike. Chess programs seem like they
know exactly what they are doing. They are masters of the game. 

 

>.But in reality they are like my watch. My watch tells me the time
accurately, but I'm pretty sure it has no idea what the time is.  -Gordon

 

 

Gordon, this has been a valuable and insightful thread, and I am glad you
came back.  What it has really made me think about is that bifurcation I
mentioned yesterday: a perfectly acceptable AI could arise without
consciousness.  We tend to think intelligence requires self-awareness, but I
now think it does not, or at least the kind of intelligence I am interested
in does not.  The chess program does what we want it to do, actually better
than we want it to do it, but it doesn't exactly know how to play chess.  It
plays better than we do, yet it still doesn't know how to play in the human
sense.  It is just shuffling a bunch of bits.

 

Now that goes down another interesting road.  Consider the common internet
trope which you have likely received at some time in the last few years.  It
goes something like this.  A woman is at the funeral for her mother.  She
meets a man there she has never seen, speaks to him briefly, falls in love
instantly, but he is gone before she gets his name or number.  A week later
she kills her sister.  Why did she do that?

 

This is one of those that stumped me.  I never did get it until I read the
punch line provided, which actually ties into this discussion.  OK, so did
anyone here figure out why funeral girl slew her sister?  Do you want more
time?

 

So we should think about what kinds of intelligences could arise without
humanlike emotions and without consciousness.  After reading the thread, I
think you may be right: a computer could be made to be highly intelligent
and to simulate human-like emotions and still not have them.  But it might
be perfectly acceptable that it doesn't, as illustrated by funeral-girl.

 

Answer: the reason she slew her sister is that she hoped mystery-man would
show up at the funeral.

 

If you figured it out easily, or if that answer seems perfectly logical to
you, please post me so I can put your ass on moderation forthwith; you
scare me.

 

Can we imagine an AGI which would propose this solution to funeral-girl? 

 

spike


