[ExI] Is the GPT-3 statistical language model conscious?

William Flynn Wallace foozler83 at gmail.com
Tue Oct 13 17:01:10 UTC 2020


>> Sure you can ask, "Is it conscious?" But who are we to decide what
>> consciousness is and isn't?
>
> Right. It's a waste of time.
>
> -Dave

*The only people for whom it is not a waste of time are philosophers who
write books, get tenure, etc. - bill w*

On Tue, Oct 13, 2020 at 11:46 AM Dave Sill via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Tue, Oct 13, 2020 at 12:27 PM Jalil Farid via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>> I think one question to ask is "what is consciousness?"
>>
>
> That's a rabbit hole we've been down many times to no avail. In the case
> of GPT-3, though, we don't really need to ask it because it's clear from
> its design that it's incapable of consciousness by almost any definition.
>
>> After hearing the remarks, it appears a program is probably on track
>> within the next 10 years to at least statistically answer some basic
>> questions and pass a Turing Test. We probably will see some commercial
>> applications for weak AIs, but within my lifetime it's very likely that
>> a GPT-10 will be nearly impossible to differentiate from a real human.
>>
>
> Weak AIs are already being used for chatbots, customer service call
> centers, and many other things. GPT-N alone will never pass a Turing
> test--it's a language generator with no understanding of anything.
>
>> Sure you can ask, "Is it conscious?" But who are we to decide what
>> consciousness is and isn't?
>>
>
> Right. It's a waste of time.
>
> -Dave
>
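
For readers unfamiliar with what "statistical language model" means in this
thread: a model like GPT-3 is trained to estimate the probability of the next
token given the preceding tokens, and it generates text by repeatedly sampling
from that distribution. The toy sketch below is only an illustration of that
generation loop, not anything from GPT-3 itself (which uses a large transformer
over subword tokens rather than bigram counts): predict a distribution over the
next token, sample one, append it, repeat.

import random
from collections import defaultdict, Counter

def train_bigram(tokens):
    """Count how often each token follows each other token."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def sample_next(counts, prev):
    """Sample the next token in proportion to how often it has followed `prev`."""
    followers = counts.get(prev)
    if not followers:
        return None
    choices, weights = zip(*followers.items())
    return random.choices(choices, weights=weights, k=1)[0]

def generate(counts, start, max_tokens=20):
    """Generate text by repeatedly sampling the next token."""
    out = [start]
    for _ in range(max_tokens):
        nxt = sample_next(counts, out[-1])
        if nxt is None:
            break
        out.append(nxt)
    return " ".join(out)

# Tiny illustrative "corpus"; a real model is trained on hundreds of billions of tokens.
corpus = ("the model predicts the next word from counts and the model has "
          "no idea what any of the words mean").split()
model = train_bigram(corpus)
print(generate(model, "the"))

Scaling up the context window, the training corpus, and the model capacity is
what makes GPT-3's output far more fluent than this toy, but the mechanism
remains conditional next-token prediction, which is the point Dave is making
about a "language generator."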