[ExI] Fwd: Chalmers

Brent Allsop brent.allsop at gmail.com
Thu Dec 19 17:45:21 UTC 2019


William,

You said: “we cannot perform any functions of a voluntary nature without
consciousness”.



This depends on your definition of “voluntary”.  You can certainly achieve
the same functionality as voluntary choice with an abstract system.  But if
you define “voluntary” the way you define consciousness: as a decision made
by a system implemented directly on physical qualities, then you are
right.  A substrate-independent computer cannot perform “voluntary”
actions, per this definition.
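
The abstraction being argued over here (any physical property can stand
for a 1, but only relative to an interpretation map) can be sketched in a
few lines of Python.  This is my own illustrative toy, not anything from
the thread; the substrates and dictionaries are invented for the example:

```python
# Two hypothetical substrates, each with different physical states.
voltage_states = ["+5V", "0V", "+5V"]
magnet_states = ["north", "south", "north"]

# The "dictionary interpretation mechanism": a map that abstracts the
# bit away from the particular physical property representing it.
voltage_dict = {"+5V": 1, "0V": 0}
magnet_dict = {"north": 1, "south": 0}

def interpret(states, dictionary):
    """Map concrete physical states to abstract bits via the dictionary."""
    return [dictionary[s] for s in states]

# The same abstract bit string arises from either substrate, which is
# what makes the computation substrate independent.
assert interpret(voltage_states, voltage_dict) == [1, 0, 1]
assert interpret(magnet_states, magnet_dict) == [1, 0, 1]
```

The point of contention is whether anything is lost in that `interpret`
step: the abstract bits are identical, but the physical qualities doing
the representing are not.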

On Thu, Dec 19, 2019 at 10:19 AM William Flynn Wallace via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> Brent Allsop via extropy-chat
>
> Consciousness isn’t about functionality or intelligence,
>
>
> I don't know where else you would use your intelligence unless it was in
> your conscious mind (unconscious too, of course).  Maybe you mean something
> different by the term 'functionality', but we cannot perform any
> functions of a voluntary nature without consciousness.  Of course I haven't
> read all the articles and books, so I don't know how twisted the
> definitions of functionality and the rest get there.  And everything is a
> physical quality and quantity to me as a materialist.
>
>
> bill w
>
>
>
>
> On Thu, Dec 19, 2019 at 11:00 AM Brent Allsop via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Consciousness isn’t about functionality or intelligence; it’s about
>> physical qualities.  What is it like?  Is the physics I represent red
>> with the same as yours, or is it more like your greenness?  Intelligent
>> computer systems are abstracted away from physical qualities.  Any
>> physical property can represent a 1, but only if you have a dictionary
>> interpretation mechanism to get the 1 from that particular physical
>> property.  We, on the other hand, represent information directly on
>> physical qualities, like redness and greenness.  This is more efficient,
>> since you don’t need the abstraction layer to make it substrate
>> independent.
>>
>>
>>
>> Stathis, from what I hear from you, you are saying that redness is not a
>> physical quality and that greenness is not something physically different.
>> Is that really the case?
>>
>> On Thu, Dec 19, 2019 at 7:38 AM John Clark via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> On Wed, Dec 18, 2019 at 7:33 PM William Flynn Wallace via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>> > What would you like to see done in intelligence research?
>>>>
>>>
>>> I'm not complaining: thanks to the Free Market there is already plenty
>>> of intelligence research being done in Silicon Valley and elsewhere,
>>> because doing so has a tendency to make people ridiculously rich, so it
>>> needs no encouragement from me.  I'm just saying that those who like to
>>> develop intricate consciousness theories would do better to figure out
>>> better ways to make something intelligent instead, because doing so just
>>> might make them billionaires, and if they succeed at producing
>>> intelligence they'll get consciousness automatically as a free bonus.
>>>
>>>  John K Clark
>>> _______________________________________________
>>> extropy-chat mailing list
>>> extropy-chat at lists.extropy.org
>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>
