[ExI] Fwd: Chalmers

William Flynn Wallace foozler83 at gmail.com
Thu Dec 19 17:18:14 UTC 2019


On Thu, Dec 19, 2019 at 11:00 AM Brent Allsop via extropy-chat wrote:

> Consciousness isn’t about functionality or intelligence,


I don't know where else you would use your intelligence unless it was in
your conscious mind (unconscious too, of course).  Maybe you mean something
different by the term 'functionality', but we cannot perform any function
of a voluntary nature without consciousness.  Of course, I haven't read
all the articles and books, so I don't know how twisted the definitions of
'functionality' and the rest get there.  And as a materialist, everything
is a physical quality and quantity to me.


bill w




On Thu, Dec 19, 2019 at 11:00 AM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Consciousness isn’t about functionality or intelligence, it’s about
> physical qualities.  What is it like?  Is the physics I represent red with
> the same as yours, or is it more like your greenness?  Intelligent computer
> systems are abstracted away from physical qualities.  Any physical property
> can represent a 1, but only if you have a dictionary interpretation
> mechanism to get the 1 from that particular physical property.  We, on
> the other hand, represent information directly on physical qualities, like
> redness and greenness.  This is more efficient, since you don’t need the
> abstraction layer to make it substrate independent.
>
>
>
> Stathis, from what I hear from you, you are saying that redness is not a
> physical quality and that greenness is not something physically different.
> Is that really the case?
>
> On Thu, Dec 19, 2019 at 7:38 AM John Clark via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Wed, Dec 18, 2019 at 7:33 PM William Flynn Wallace via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>> > What would you like to see done in intelligence research?
>>>
>>
>> I'm not complaining: thanks to the Free Market, there is already plenty of
>> intelligence research being done in Silicon Valley and elsewhere, because
>> doing so has a tendency to make people ridiculously rich, so it needs no
>> encouragement from me.  I'm just saying that those who like to develop
>> intricate consciousness theories would do better to figure out ways to
>> make something intelligent instead, because doing so just might make them
>> a billionaire, and if they succeed at producing intelligence they'll get
>> consciousness automatically as a free bonus.
>>
>>  John K Clark
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>

