[ExI] Strong AI Hypothesis: logically flawed?

Ben bbenzai at yahoo.com
Thu Oct 2 17:15:58 UTC 2014


William Flynn Wallace <foozler83 at gmail.com> wrote:

 >>Suppose you are informed that
 >>you have a disease that causes you to die whenever you fall asleep at
 >>night so that the person who wakes up in the morning is a completely
 >>different Dan Ust who shares your memories. This has been happening
 >>every day of your life, but you have only just found out about it.
 >>Would this information worry you or make any difference to how you
 >>live your life?
 >
 >Absolutely YES.   It means that someone with my body and memories will
 >awake tomorrow but it will not be me.  It means that I will die today.


I think it means that we need to look very carefully at what we mean by 
'death'.  Let's say that instead of once a day, we all 'die' ten million 
trillion trillion times every second, and are replaced by "someone else 
with...", etc.  How can that possibly make any difference to you?  What 
does that actually mean for the concept of 'you'?

The point of the argument is to show that /it doesn't matter/.  In fact, 
I think there's a bit of a trick in the statement "a completely 
different Dan Ust who shares your memories".  That's a bit like saying 
"a completely different tune that shares all the same notes, in the same 
order, with the same timing".  What we're talking about with uploading 
or copying or whatever is equivalent to the same tune being played on a 
different instrument.  Calling it a different tune is clearly incorrect.

Ben Zaiboc


