[ExI] Destructive uploading.

Kelly Anderson kellycoinguy at gmail.com
Wed Sep 7 17:28:59 UTC 2011


2011/9/5 Stefano Vaj <stefano.vaj at gmail.com>:
> On 4 September 2011 21:43, Kelly Anderson <kellycoinguy at gmail.com> wrote:
>>
>> I see the difference between replacing the atoms in your brain quickly
>> vs. slowly to be an issue of continuity of consciousness. While you
>> may perceive a me uploaded into a robot or VR to be the same as me,
>> I'm more interested in the internal perception. Do I, internally to my
>> thought processes feel as though I've had a nap, or just lived my life
>> day by day, and the pattern is continuous. If there is a big
>> discontinuity, then it will feel as though I've died. That would
>> inflict a certain amount of psychological damage on me (or my copy)
>
> Actually,... no. No matter how you perform the upload, perfect continuity is
> perceived by the upload, by definition (one cannot be anybody but oneself at
> any stage of the process, and at any stage either you are conscious -
> including "sleep" consciousness - or you are not).

OK, so here's a scenario... suppose that I have my brain frozen (and
that this is not considered murder or suicide), sliced into thin
segments, scanned, and uploaded. Suppose further that this takes six
months for whatever reason. I have not then experienced perfect
continuity, but rather something more like having been on a six-month
vacation. So what I'm saying is that continuity includes continuous
interaction with my friends, relatives, etc.

> With regard to the public, which has nothing to do with internal perception,
> it strictly depends on what you decide to show it, not on the mechanics of
> the process.

Other than perhaps the time the process takes to complete...

> Similarly, if one approximates B from A, either B' is different enough from
> B that it can be described as a different result - and graduality is
> irrelevant for this purpose, or is "similar enough", and in such event
> nothing changes.
>
>> If we continue conscious awareness when doubled, or tripled
>> if the process is non-destructive, which one are we aware of, or are we
>> aware in multiple places at the same time?

Under my scheme, we will not be aware of multiple places at the same
time, but what we will be aware of is that we were in multiple places
yesterday. Present time is a single line of execution for each
emulant, but when the emulants are merged back into the main brain,
the main brain acquires a new yesterday... Of course, other schemes
that differ from what I'm proposing may prove workable, but my view of
heaven on earth includes the ability to do more than one thing at the
same time.
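Purely as a toy analogy (none of this is a claim about how brains actually store memories; the names and the list-of-memories representation are my own illustration), the scheme reads like forking a process, letting each copy log its own day, and then appending the novel entries back to the main line:

```python
from dataclasses import dataclass, field

@dataclass
class Emulant:
    """Toy stand-in for one thread of execution with its own memory log."""
    memories: list = field(default_factory=list)

    def fork(self):
        # Each copy starts from the same shared history at fork time.
        return Emulant(memories=list(self.memories))

    def live_day(self, event):
        self.memories.append(event)

me = Emulant(memories=["life up to today"])
fork_point = len(me.memories)          # remember where the copies diverged
a, b = me.fork(), me.fork()
a.live_day("spent the day in VR")
b.live_day("went hiking in the body")

# Remerge: the main line acquires each copy's "yesterday".
for copy in (a, b):
    me.memories.extend(copy.memories[fork_point:])

print(me.memories)
# ['life up to today', 'spent the day in VR', 'went hiking in the body']
```

The point of the sketch is just that after the merge there are two yesterdays in one log, in whatever order they were merged.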

>> I like to think of it as multiple threads of execution, perhaps even
>> distributed to different physical computers. When the threads are
>> merged later, you just have new memories of having done two different
>> things yesterday. It would be weird at first, but I think we could get
>> used to remembering two yesterdays, or twenty. It would probably be
>> perceived initially as yesterday and the day before yesterday....
>
> I suspect that the merger of two "threads" would be no different from any
> practical purpose than the merger of any two individuals. You may probably
> create an AGI as a patchwork of different experiences lived by several
> persons, as in Blade Runner's Rachel, but *this* would certainly qualify as
> a new person.

I completely disagree here. Merging a thread from an emulant back into
a wetware brain would require that the emulant be based upon that
brain. Merging new neuronal connections from my emulant into your
brain would be VERY confusing, because you store your concepts in a
different hologram within your brain than I do. So this scheme of mine
is based on the two brains being only a day or two divergent (at
first, anyway), and the remerger of the new connections (and other
changes) back into the main wetware brain is simply (haha) a matter of
stimulating the growth of the same dendrites, etc. that were grown in
the emulant. If the brains differed by too much, say my brain and
yours, then they would have to be converted to some kind of common
brain language, and then converted back. Hell, we can't even get Macs
and PCs to communicate! So I think this is quite a ways further out
than what I'm talking about.

So my assertion is that only an emulation of MY brain can be remerged
into my brain, at least easily. Do you understand my point? And this
may be something that only happens for a brief period of time before
we fully understand the brain enough to translate changes in my brain
to changes in yours... but that seems many orders of magnitude more
complex.
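A rough analogy from version control, offered only as a sketch (the set-of-connections representation is hypothetical): a three-way merge works because both sides diverged recently from a known common base, so each side's changes can be expressed as a diff against that base and replayed. Two unrelated brains have no common base, so there is no meaningful diff to replay.

```python
def three_way_merge(base, mine, theirs):
    """Toy three-way merge over sets of 'connections': apply both sides'
    additions and removals relative to the shared base state."""
    added = (mine - base) | (theirs - base)
    removed = (base - mine) | (base - theirs)
    return (base - removed) | added

base = {"A-B", "B-C"}            # shared state at fork time
emulant = {"A-B", "B-C", "C-D"}  # emulant grew one new connection
wetware = {"A-B"}                # wetware brain pruned one meanwhile

merged = three_way_merge(base, wetware, emulant)
print(merged)  # {'A-B', 'C-D'}
```

Without a shared `base`, the diffs are against nothing, which is the "common brain language" problem in miniature.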

-Kelly
