[ExI] Fwd: Chalmers

Brent Allsop brent.allsop at gmail.com
Thu Dec 19 17:33:58 UTC 2019


Hi William,

Yes, I probably didn’t quite say that correctly.  I’m just trying to point
out that self-awareness, intelligence, and all forms of functionality can
be achieved in multiple ways, some of which aren’t conscious.  See this “3
Robots that are functionally equivalent but qualitatively different
<https://docs.google.com/document/d/1YnTMoU2LKER78bjVJsGkxMsSwvhpPBJZvp9e2oJX9GA/edit?usp=sharing>”
paper.



If you define consciousness as computationally bound elemental physical
qualities like redness and greenness, then a substrate-independent system
that uses an abstract word like “red” to represent knowledge of red things
isn’t conscious.  The abstract word “red” has no physical quality, while
our redness physical quality is the conscious definition of the word “red”.
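The point about abstraction can be sketched in code (a toy illustration of my own, not anything from the paper): an abstract token carries no quality of its own, so any value at all can stand for “red”, but only relative to a dictionary mechanism that interprets it.

```python
# Toy illustration: three physically different tokens all representing
# the same knowledge ("the strawberry is red").  None of the tokens
# means anything by itself.
system_a = {"strawberry": "red"}     # an ASCII string
system_b = {"strawberry": 0xFF0000}  # an RGB integer
system_c = {"strawberry": 1}         # a bare bit

# Each token only yields "red" through an explicitly supplied
# dictionary interpretation mechanism.
interpret = {
    "red": "red",
    0xFF0000: "red",
    1: "red",
}

for system in (system_a, system_b, system_c):
    token = system["strawberry"]
    print(interpret[token])  # each token decodes to "red"
```

The systems are functionally equivalent only because of the interpretation layer; strip the dictionary away and the tokens are just arbitrary, qualitatively different physical states.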



On Thu, Dec 19, 2019 at 10:19 AM William Flynn Wallace via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> Brent Allsop via extropy-chat wrote:
>
> Consciousness isn’t about functionality or intelligence,
>
>
> I don't know where else you would use your intelligence unless it was in
> your conscious mind (the unconscious too, of course).  Maybe you mean
> something different by the term 'functionality', but we cannot perform
> any functions of a voluntary nature without consciousness.  Of course I
> haven't read all the articles and books, and I don't know how twisted the
> definitions of 'functionality' and the rest get there.  And everything is
> a physical quality and quantity to me as a materialist.
>
>
> bill w
>
>
>
>
> On Thu, Dec 19, 2019 at 11:00 AM Brent Allsop via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Consciousness isn’t about functionality or intelligence, it’s about
>> physical qualities.  What is it like?  Is the physics I use to represent
>> red the same as yours, or is it more like your greenness?  Intelligent
>> computer systems are abstracted away from physical qualities.  Any
>> physical property can represent a 1, but only if you have a dictionary
>> interpretation mechanism to get the 1 from that particular physical
>> property.  We, on the other hand, represent information directly on
>> physical qualities, like redness and greenness.  This is more efficient,
>> since you don’t need the abstraction layer to make it substrate
>> independent.
>>
>>
>>
>> Stathis, from what I hear from you, you are saying that redness is not a
>> physical quality and that greenness is not something physically
>> different.  Is that really the case?
>>
>> On Thu, Dec 19, 2019 at 7:38 AM John Clark via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> On Wed, Dec 18, 2019 at 7:33 PM William Flynn Wallace via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>> > What would you like to see done in intelligence research?
>>>>
>>>
>>> I'm not complaining; thanks to the Free Market, there is already plenty
>>> of intelligence research being done in Silicon Valley and elsewhere,
>>> because doing so has a tendency to make people ridiculously rich, so it
>>> needs no encouragement from me.  I'm just saying that those who like to
>>> develop intricate consciousness theories would do better to figure out
>>> ways to make something intelligent instead, because doing so just might
>>> make them billionaires, and if they're successful at producing
>>> intelligence they'll get consciousness automatically as a free bonus.
>>>
>>>  John K Clark
>>> _______________________________________________
>>> extropy-chat mailing list
>>> extropy-chat at lists.extropy.org
>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>