[ExI] GPT-4 on its inability to solve the symbol grounding problem

Stuart LaForge avant at sollegro.com
Sun Apr 9 17:05:05 UTC 2023

Quoting Gordon Swobe via extropy-chat <extropy-chat at lists.extropy.org>:

> That is the only kind of consciousness with which we have any familiarity.
> I think it is reasonable to infer something similar in other people and in
> other higher mammals, as their anatomies and nervous systems and lives and
> behaviors are so similar to ours, but then things start to get sketchy as
> we go down the food chain.

So for you consciousness is an all-or-nothing affair? Either you have
it or you don't? You and Bender call LLMs "stochastic parrots". Since
African gray parrots are approximately as intelligent as 3.5-year-old
human children, that would imply that ChatGPT is likewise at least as
conscious as a 3.5-year-old human child, if not more so. That is,
unless you can specify the difference between intelligence and
consciousness in such a way that humans have consciousness and birds
do not. Incidentally, while smarter than the average bird, parrots are
thought to be less intelligent than corvids like ravens and crows:
parrots test at about 3.5 years equivalent human age (EHA), while the
smartest corvids test at about 7 EHA.


>    In the effort to justify the belief that even
> software can be conscious, people find themselves saying all sorts of silly
> things, for example that doorbells and cars are conscious.
> Their arguments
> lose by reductio ad absurdum except on ExI, where anything goes.

When it comes to the survival of the human race, silliness is
preferable to factual inaccuracy. Thus far, I have caught your
supposed thought leader Bender in two cringeworthy factual
inaccuracies: the first, treating parrots as models of unconscious
stupidity; the second, claiming that octopuses don't understand the
uses of coconuts, which is clearly refuted by this video.


I don't think that your hero Bender understands parrots, octopuses,
bears, or tropical islands as well as she thinks she does.

Stuart LaForge
