[ExI] teachers

Darin Sunley dsunley at gmail.com
Thu Aug 24 16:23:01 UTC 2023


An important component of what a lot of people want out of immortality is
not so much continuity as it is not-experiencing-discontinuity [and no,
those are not the same thing].

If I'm dying of cancer and you do a brain scan, the resulting upload will
remember being me, but /I'm/ still going to experience a painful death. And
no, killing me painlessly, or even instantaneously, during or in the
immediate aftermath of the brain scan doesn't solve the problem either.

If "me" is ever on two substrates simultaneously, you may have copied me,
but you haven't moved me, and a copy, by definition, isn't the me I want to
be immortal.

On Thu, Aug 24, 2023 at 9:31 AM Gregory Jones via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> ...>...height of wishful thinking...cart, horse inversion, gun jumping,
> etc...
>
> Billw, you will need to offer a bit of evidence to back up that strong
> statement.  You offered us an opinion only.  Granted, it is one shared by
> most of humanity.
>
> Regarding your question of asking ChatGPT to write in the style of a 12 yr
> old: it can.  It does a better job of writing in 12-yr-old than a smart 12
> yr old does.  Not as good as a dumb one, but they are working on that.
>
> But do explain why you are so confident that uploading to immortality is
> not rational, please.  Note that I am not necessarily disagreeing.  But I
> want to hear your reasoning.
>
> spike
>
> On Thu, Aug 24, 2023 at 8:20 AM William Flynn Wallace via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> It seems that many of us want AIs to be people:  conscious, with emotions
>> and so forth.  I suggest that this stems from wanting uploading to work so
>> we can be immortal and have all the same lives we have now.
>>
>> I suggest that this is the height of wishful thinking.  And putting the
>> cart WAY before the horse.  Jumping the gun, etc.  Not rational. bill w
>>
>> On Thu, Aug 24, 2023 at 10:08 AM efc--- via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> I'm currently a materialist and find many good points in scientism, so if
>>> I have a box or a robot that convinces me in every respect that it is
>>> conscious by acting as if it were conscious, then that's conscious for me.
>>>
>>> I do not subscribe to unique qualia or "redness" experiences; therefore, I
>>> cannot see a problem with the good old Turing test.
>>>
>>> Best regards,
>>> Daniel
>>>
>>>
>>> On Thu, 24 Aug 2023, Gregory Jones via extropy-chat wrote:
>>>
>>> > BillW's question regarding the instructor's task of distinguishing
>>> > between a student and an AI puts a final nail in the coffin of
>>> > Turing's test.  Artificial intelligence is able to create an illusion of
>>> > consciousness so convincing that we are still debating whether it really
>>> > is the real thing, all while failing to adequately define precisely
>>> > what we mean by "real."
>>> > spike
>>> >
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>