[ExI] all we are is just llms
Ben Zaiboc
ben at zaiboc.net
Fri Apr 21 16:38:15 UTC 2023
On 21/04/2023 15:56, spike wrote:
> Ja, Ben where I was really going with that idea is exploring whether
> it is possible to separate consciousness from intelligence.
Personally, although I do think that consciousness necessarily goes
along with intelligence, for a number of reasons (evolution retaining
it, as Jason mentioned, being a big one), I regard this as a bit like
discussions about qualia: it doesn't really matter.
If something looks like a duck and quacks like a duck, it might as well
be a duck for all practical purposes. Especially if it also tastes like
a duck.
I think that self-awareness is the thing to look for, rather than
consciousness. Maybe they're the same thing, maybe not, but
self-awareness is something that can be detected, and is obviously
important and useful. Whether or not all self-aware entities are
conscious is something we can leave to the philosophers to argue
amongst themselves. I suspect, though, that self-awareness without
consciousness may be an oxymoron.
Asking someone if they are a duck, though, is silly. People (who can
answer the question) are not ducks. Ducks (who can't answer the
question) are ducks. Talking ducks? OK, they could answer either way.
These questions are not answered by asking the system in question. They
are answered by testing it. Granted, the tests can include asking, but
asking alone is useless. Especially when the people or ducks might have
been instructed beforehand to give a particular answer.
The thing that nobody seems to be on the lookout for with these AI
systems is spontaneous behaviour. When one starts asking its own
unprompted and unscripted questions, /that/ will be interesting.
Ben