[extropy-chat] "Dead Time" of the Brain.

Heartland velvet977 at hotmail.com
Fri May 5 23:07:46 UTC 2006


Heartland:
>>>> Of course not. The point is that if you have two
>>>> identical, but separate brains, this must add up
>>>> to two separate *instances* of one *type* of mind.
>>>> If you have any experience in OOP, and I can't
>>>> imagine you don't, then you should know exactly
>>>> what I mean.

Christopher Healey:
>>> Is forking an instance equivalent to type?  I think not.

Heartland:
>>Are you disagreeing with what seems to be your point?

Christopher Healey:
> My point is that this seems like saying identical twins are really just
> two separate instances of type HumanBeing.  Well, yeah!
>
> But it fails to capture the important distinction, and perhaps even
> subtly diverts attention from it:  A particular instance possesses a
> higher amount of information content than a type, because in further
> constraining the realm of possibility, additional specification is
> always required.  When forking a particular instance, all *specific*
> state information, as well as the type structure is preserved.  To
> reduce the situation to a type comparison misses this deeper equivalence
> between the source and target instances.

Christopher, I suppose you joined this discussion late, so let me reiterate my 
point. A type is an abstract concept and is fundamentally different from the concept 
of an instance of that type. I may be wrong, but it seems to me that you agree with 
this.

But if that's the case, then even though what you say above is true, I think the 
conclusion holds for a different and more important reason, namely, that activity 
is the only sufficient representation of itself, meaning that no amount of 
information can ever be equivalent to activity. Type is information. Instance is an 
activity. No amount of information can preserve an instance of a mind process. Type 
does not preserve instance. Cryonics preserves only type.
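
Since we are leaning on the OOP analogy anyway, here is a minimal Python sketch of 
what I mean (the Mind class and its fields are invented purely for illustration):

    import copy

    class Mind:
        """Toy 'type': an abstract description, not any particular mind."""
        def __init__(self, memories):
            self.memories = memories      # the specific state of this instance

    original = Mind(memories=["first day of school", "learning OOP"])
    duplicate = copy.deepcopy(original)   # fork: type and state fully preserved

    print(type(original) is type(duplicate))        # True  - same type
    print(original.memories == duplicate.memories)  # True  - same information
    print(original is duplicate)                    # False - two separate instances

The copy shares the type and all the information, yet that last line still prints 
False. To me that is the crux: equality of information never makes two instances 
one, and what a copy can never share is the original's ongoing activity.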


Christopher Healey:
> Jumping off this specific point, I don't think that this whole problem
> can be solved while simultaneously maintaining our current notions of
> identity.

Yes. IMO, identity should be defined as the unique space-time trajectory of an 
instance of subjective experience. Last time I checked, this definition is still 
pretty far from the mainstream. :)


Christopher Healey:
> If we want to make useful progress on it, we need to put aside many of
> our deeply embedded notions regarding our everyday experience of life.
> We can't start off saying, "That cannot be the answer, for that would
> lead to the death of the mind!"  We should instead simply say, "How does
> this thing we perceive as mind actually operate?"

Yes.

Christopher Healey:
> In troubleshooting complex systems, what appears to be the problem is
> often really just the symptom of a deeper cause.  In a similar way, we
> should be careful that what appears to be an important structure in our
> model of the mind is not just a surface indication of a deeper process
> at work, a process that may work very differently than its surface
> indications suggest.

I'm constantly aware of that, Christopher. Thanks.

S. 


