[ExI] People often think their chatbot is alive

Brent Allsop brent.allsop at gmail.com
Sat Jul 16 20:32:22 UTC 2022


OK, let me ask you this.  Are you interested in finding out the colorness
qualities of anything in physics?  Not just the colorness qualities things
seem to have?

In my opinion, that is one of the most important completely unknown
questions in physics today.  Nobody knows the intrinsic colorness quality
of anything.
And of course, again, once we discover which of all our descriptions of
physics in the brain is a description of redness, it will falsify all the
crap-in-the-gap theories (like substance dualism and functionalism) and
result in a clear scientific consensus not only about what consciousness
is, but about what consciousness is like.
Along with that will be a near unanimous consensus that abstract systems,
like the one on the right in the image, would not be considered
conscious by anyone with any reasonable intelligence.





On Sat, Jul 16, 2022 at 12:44 PM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Sat, Jul 16, 2022 at 11:23 AM Brent Allsop via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> I agree with most everything you are saying, but you seem to be missing
>> what I'm trying to focus on.
>>
>> On Sat, Jul 16, 2022 at 11:28 AM Adrian Tymes via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> On Sat, Jul 16, 2022 at 9:39 AM Brent Allsop via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>> You said: "Things like redness might be part of it," which says to me
>>>> you would agree that, though the one on the right can "function" the same
>>>> (as in, it can tell you the strawberry is red), you would not consider it
>>>> to have the same kind of conscious knowledge as you, who know the physical
>>>> definition of redness, and would answer differently than the one on the
>>>> right when asked: "What is redness like for you?"
>>>>
>>>
>>> I would consider it to have functionally the same kind of conscious
>>> knowledge until presented with evidence to the contrary.
>>>
>> [image: 3_functionally_equal_machines_tiny.png]
>>
>> Surely you must admit that these 3 are all fundamentally different.
>>
>
> You miss my point.  Are they different in a meaningful or interesting way
> that there is reason to care about?  The answer seems to be "no".
>
>
>> IF redness could emerge from some set of 1s and 0s, or whatever
>> "function" or "pattern" results in a redness experience, then you could do
>> whatever that is and engineer the one on the right to use whatever set of
>> 1s and 0s has the redness quality.  And you could invert that to get the
>> one in the middle.  Then that one would answer the question "what is
>> redness like for you?" differently than the one on the right.
>>
>
> Actually they might not.  "What is redness like for you?" gets translated
> into language, at which point the exact same words might get used to define
> it even if they have different meanings, much like how the word "redness"
> itself has different meanings (per that illustration).
>
>
>> All we are missing is the dictionary which tells us what X, in the brain,
>> has the redness quality with which you represent knowledge of red things.
>>
>
> Eh...that depends on how deep into the brain you mean.  At the retina, we
> absolutely know which cells react to red-frequency light.  Anything further
> in than that gets into how knowledge is represented in general, where
> "redness" does not necessarily exist as a discrete physical thing that is
> separable from everything else.
>
>
>> If we could have 10,000 people sign a petition
>>
>
> It would likely be ignored (beyond maybe invoking a few response
> statements, but instigating no serious effort), like almost all petitions
> are.
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: 3_functionally_equal_machines_tiny.png
Type: image/png
Size: 26214 bytes
Desc: not available
URL: <http://lists.extropy.org/pipermail/extropy-chat/attachments/20220716/d087f461/attachment.png>

