[ExI] People often think their chatbot is alive

Adrian Tymes atymes at gmail.com
Sat Jul 16 18:43:34 UTC 2022


On Sat, Jul 16, 2022 at 11:23 AM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> I agree with most everything you are saying, but you seem to be missing
> what I'm trying to focus on.
>
> On Sat, Jul 16, 2022 at 11:28 AM Adrian Tymes via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Sat, Jul 16, 2022 at 9:39 AM Brent Allsop via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> You said: "Things like redness might be part of it, " which says to me
>>> you would agree that, though the one on the right can "function" the same
>>> (as in can tell you the strawberry is red) you would not consider it to
>>> have the same kind of conscious knowledge as you, who knows the physical
>>> definition of redness, and would answer different than the one on the
>>> right, when asked: "What is redness like for you?"
>>>
>>
>> I would consider it to have functionally the same kind of conscious
>> knowledge until presented with evidence to the contrary.
>>
> [image: 3_functionally_equal_machines_tiny.png]
>
> Surely you must admit that these 3 are all fundamentally different.
>

You miss my point.  Are they different in a meaningful or interesting way,
one that there is reason to care about?  The answer seems to be "no".


> IF redness could emerge from some set of 1s and 0s, or from whatever
> "function" or "pattern" results in a redness experience, then you could do
> whatever that is and engineer the one on the right to use whatever set of
> 1s and 0s has the redness quality.  And you could invert that to get the
> one in the middle.  Then that one would answer the question "What is
> redness like for you?" differently than the one on the right.
>

Actually, they might not.  The answer to "What is redness like for you?" has
to be translated into language, at which point the exact same words might get
used even if they carry different meanings, much like how the word "redness"
itself has different meanings (per that illustration).
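
A minimal sketch of what I mean (purely illustrative; the agent names and the
internal code mappings are made up, not anything from the diagram): two
programs with inverted internal colour codes can still produce word-for-word
identical answers, because the answer has to pass through a shared public
vocabulary rather than the private representation itself.

# Purely illustrative: two agents whose private colour codes are inverted,
# but whose public answers come out the same.
INTERNAL_CODE = {"agent_A": 0, "agent_B": 1}  # private state for one stimulus

# Each agent maps its private code to the same public word, because both
# learned the word "red" from the same external usage.
PUBLIC_WORD = {
    "agent_A": {0: "red", 1: "green"},
    "agent_B": {1: "red", 0: "green"},
}

def answer_redness_question(agent: str) -> str:
    """Translate the agent's private representation into public language."""
    code = INTERNAL_CODE[agent]
    word = PUBLIC_WORD[agent][code]
    return "Redness for me is " + word + ", like a ripe strawberry."

if __name__ == "__main__":
    # Both agents give the same verbal report despite inverted internals.
    print(answer_redness_question("agent_A"))
    print(answer_redness_question("agent_B"))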


> All we are missing is the dictionary that tells us which X, in the brain,
> has the redness quality you represent your knowledge of red things with.
>

Eh...that depends on how deep into the brain you mean.  At the retina, we
absolutely know which cells react to red-frequency light.  Anything further
in than that gets into how knowledge is represented in general, where
"redness" does not necessarily exist as a discrete physical thing that is
separable from everything else.
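
To illustrate the contrast (again just a toy of my own, with made-up
thresholds rather than real cone tuning curves): at the retina there is a
dedicated detector you can point to, but one layer of mixing later the same
signal is spread across many numbers, none of which is "the redness" by
itself.

import numpy as np

rng = np.random.default_rng(0)

# Retina level: a dedicated, locatable detector per band.  The thresholds are
# toy numbers for illustration, not real cone sensitivities.
def retina_response(wavelength_nm: float) -> dict:
    return {
        "long_cone": float(620 <= wavelength_nm <= 750),
        "medium_cone": float(495 <= wavelength_nm <= 570),
        "short_cone": float(450 <= wavelength_nm <= 495),
    }

# Deeper level: the same signal is mixed into a dense vector.  "Red" becomes a
# pattern over many numbers, none of which is a separable "redness" component.
projection = rng.normal(size=(3, 64))  # random stand-in for learned weights

def deeper_representation(wavelength_nm: float) -> np.ndarray:
    cones = np.array(list(retina_response(wavelength_nm).values()))
    return np.tanh(cones @ projection)

if __name__ == "__main__":
    print(retina_response(650))              # one cell clearly "about" red
    print(deeper_representation(650)[:8])    # redness smeared across 64 units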


> If we could have 10,000 people sign a petition
>

It would likely be ignored (beyond maybe prompting a few response statements,
but instigating no serious effort), like almost all petitions are.