[ExI] all we are is just llms was: RE: e: GPT-4 on its inability to solve the symbol grounding problem
jasonresch at gmail.com
Fri Apr 21 11:18:32 UTC 2023
On Fri, Apr 21, 2023, 1:25 AM spike jones via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> Regarding measuring GPT’s intelligence, this must have already been done
> and is being done. Reasoning: I hear GPT is passing medical board exams
> and bar exams and such, so we should be able to give it IQ tests, then
> compare its performance with humans on that test. I suspect GPT will beat
> everybody at least on some tests.
The last I remember seeing was that it scored a 119 on a general IQ test,
and 147 on a verbal IQ test. I don't remember whether this was for ChatGPT
or GPT-4.
> So… now we need to assume for the sake of investigation that consciousness
> and intelligence are (somehow) separable mental processes. I don’t know of
> any test analogous to an IQ test for consciousness. Does anyone here know
> of such a thing?
Just as there are many forms and ways of being intelligent, there will be
many ways of being conscious. Intelligence is measured by evaluating the
optimality of actions toward achieving some objective, while consciousness
must be measured by determining the presence of the knowledge states
necessary to perform some behavior.
E.g. if the robot arm reliably catches a thrown ball, something in the
system must be conscious of the ball's trajectory and position relative to
the arm.
If consciousness were not necessary for intelligent behavior, why would
nature bother to evolve (and retain) it?
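To make the robot-arm example concrete, here is a minimal sketch of what I
mean by a "knowledge state" (my own illustration, not anyone's actual robot:
the two-sample estimator, gravity-only physics, and all the numbers are
simplifying assumptions). The catcher must maintain an internal estimate of
the ball's position and velocity, from which it predicts where to place the
arm:

```python
# Sketch: a catcher's internal model of a thrown ball (hypothetical,
# gravity-only 2D physics). The estimated (x, y, vx, vy) tuple is the
# "knowledge state" the system must possess to catch reliably.

G = 9.81  # gravitational acceleration, m/s^2

def estimate_state(samples):
    """Estimate (x, y, vx, vy) at the later sample time from two
    (t, x, y) observations of the ball."""
    (t0, x0, y0), (t1, x1, y1) = samples
    dt = t1 - t0
    vx = (x1 - x0) / dt
    # average vertical velocity over [t0, t1], corrected to the
    # instantaneous velocity at t1 (gravity decelerates the ball):
    vy = (y1 - y0) / dt - 0.5 * G * dt
    return x1, y1, vx, vy

def predict_landing(x, y, vx, vy):
    """Solve y + vy*t - 0.5*G*t^2 = 0 for the time until impact,
    then return the predicted horizontal landing position."""
    t = (vy + (vy ** 2 + 2 * G * y) ** 0.5) / G  # positive root
    return x + vx * t
```

Whether we call that internal trajectory estimate "consciousness of the
ball" or merely "state" is, of course, exactly the definitional question
under debate.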
> After all my posts, I think I have just argued myself back to where I
> started: without some objective way to measure consciousness, we are doing
> little more here than debating the definition of an ill-defined term.
I come at it from the other direction: we need the definition first. When
we can agree on that we can agree on how it can be measured.