[ExI] teachers

Jason Resch jasonresch at gmail.com
Sun Aug 27 20:01:15 UTC 2023


On Thu, Aug 24, 2023, 12:24 PM Darin Sunley via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> An important component of what a lot of people want out of immortality is
> not so much continuity as it is not-experiencing-discontinuity [And no,
> they're not the same thing].
>
> If I'm dying of cancer, and you do a brain scan, the resulting upload will
> remember being me, but /I'm/ still gonna experience a painful death. And
> no, killing me painlessly, or even instantaneously, during or in the
> immediate aftermath of the brain scan doesn't solve the problem either.
>
> If "me" is ever on two substrates simultaneously, you may have copied me,
> but you haven't moved me, and a copy, by definition, isn't the me I want to
> be immortal.
>


Consider the following observation, for which there are 3 different
possible explanations:

You walk into a closet. Five minutes later you walk out of the closet,
apparently the same as you were when you entered. The question is, did you
survive entering and exiting the closet?

Could the answer depend on which of 3 scenarios happened during those five
minutes, when in all 3 scenarios, the you who emerges is atom-for-atom
identical?

Scenario A) you walk into the closet, stand around for five minutes, then
step out.
Scenario B) you walk into the closet, are scanned, destroyed, then
reassembled, and the reassembled form walks out.
Scenario C) you walk into the closet, are scanned, and duplicated. Your
original is then destroyed, and the copy walks out.

In all 3 scenarios, the version of you who emerges is stipulated to be
atom-for-atom the same; all versions have the same memories, and all have
the same behaviors, thought patterns, and lives after leaving the closet.
The question is: does it matter what happened in that closet during those
five minutes? Does the history of the atoms have some bearing on whether it
is really you? Or are particles indistinguishable, as QM suggests (
https://en.m.wikipedia.org/wiki/Identical_particles ), such that an
electron is an electron is an electron, regardless of its history? If so,
then on what basis can we say it's not still you, when everything about the
body that emerges is, in principle, indistinguishable? What is attached to
the you in scenario A that's not there in scenario B or scenario C?

Jason




> On Thu, Aug 24, 2023 at 9:31 AM Gregory Jones via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> ...>...height of wishful thinking...cart, horse inversion, gun jumping,
>> etc...
>>
>> Billw, you will need to offer a bit of evidence to back up that strong
>> statement.  You offered us an opinion only.  Granted it is one shared by
>> most of humanity.
>>
>> Regarding your question of asking ChatGPT to write in the style of a 12
>> yr old, it can.  It does a better job of writing in 12 yr old than a smart
>> 12 yr old.  Not as good as a dumb one, but they are working on that.
>>
>> But do explain why you are so confident that uploading to immortality is
>> not rational please.  Note that I am not necessarily disagreeing.  But I
>> want to hear your reasoning.
>>
>> spike
>>
>> On Thu, Aug 24, 2023 at 8:20 AM William Flynn Wallace via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> It seems that many of us want AIs to be people:  conscious, with
>>> emotions and so forth.  I suggest that this stems from wanting uploading to
>>> work so we can be immortal and have all the same lives we have now.
>>>
>>> I suggest that this is the height of wishful thinking.  And putting the
>>> cart WAY before the horse.    Jumping the gun, etc. Not rational. bill w
>>>
>>> On Thu, Aug 24, 2023 at 10:08 AM efc--- via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>> I'm currently a materialist and find many good points in scientism, so
>>>> if I have a box or a robot that convinces me in every aspect that it
>>>> is conscious by acting as if it were conscious, that's conscious for me.
>>>>
>>>> I do not subscribe to unique qualia or "redness" experiences;
>>>> therefore, I cannot see a problem with the good old Turing test.
>>>>
>>>> Best regards,
>>>> Daniel
>>>>
>>>>
>>>> On Thu, 24 Aug 2023, Gregory Jones via extropy-chat wrote:
>>>>
>>>> > BillW's question regarding the instructor's task of distinguishing
>>>> > between a student and AI puts a final nail in the coffin of
>>>> > Turing's test.  Artificial intelligence is able to create an illusion
>>>> > of consciousness so convincing, we are still debating if it really
>>>> > is the real thing, all while failing to adequately define precisely
>>>> > what we mean by "real."
>>>> > spike
>>>> >
>>>> >_______________________________________________
>>>> extropy-chat mailing list
>>>> extropy-chat at lists.extropy.org
>>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>>

