[ExI] Meaningless Symbols.

BillK pharos at gmail.com
Mon Jan 11 14:29:32 UTC 2010


On 1/11/10, Stathis Papaioannou wrote:
> This is true, and it's the main reason I've persevered with this
>  thread. One day it may not be an abstract philosophical problem but a
>  serious practical problem: you would want to be very sure before
>  agreeing to upload that you're not killing yourself. For the reasons
>  I've described, I'm satisfied that the philosophical problem is solved
>  in favour of uploading and strong AI. Of course, there remains the far
>  more difficult technical problem, and the possibility, however
>  unlikely, that the brain is not computable.


I don't see this as a problem unless you insist that the human
body/brain *must* be destroyed during the upload/copy process.

I would be very interested in having a copy of my massive intellect
running on one of these new netbooks (circa 2020).  I would be
reorganising, tuning, rebuilding routines, patching, etc. like mad.
(And you thought patching Windows was bad!)  I would prefer that it
didn't have any 'consciousness' features, as I don't appreciate my
computer whining and bitching about the work I'm doing on it.


BillK
