<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
On 21/04/2023 15:56, spike wrote:<br>
<blockquote type="cite"
cite="mid:mailman.501.1682088971.847.extropy-chat@lists.extropy.org">Ja,
Ben where I was really going with that idea is exploring whether
it is possible to separate consciousness from intelligence.</blockquote>
<br>
Personally, although I do think that consciousness necessarily goes
along with intelligence, for a number of reasons (and evolution
retaining it, as Jason mentioned, is a big one), I regard it as a
bit like discussions about qualia. Doesn't really matter.<br>
<br>
If something looks like a duck and quacks like a duck, it might as
well be a duck for all practical purposes. Especially if it also
tastes like a duck.<br>
<br>
I think that self-awareness is the thing to look for, rather than
consciousness. Maybe they're the same thing, maybe not, but
self-awareness is something that can be detected, and is obviously
important and useful. Whether or not all self-aware entities are
conscious, we can leave to the philosophers to argue about amongst
themselves. I suspect, though, that self-awareness without
consciousness may be an oxymoron.<br>
<br>
Asking someone if they are a duck, though, is silly. People (who can
answer the question) are not ducks. Ducks (who can't answer the
question) are ducks. Talking ducks? OK, they could answer either way.
These questions are not answered by asking the system in question.
They are answered by testing it. Granted, the tests can include
asking, but asking alone is useless. Especially when the people or
ducks might have been instructed beforehand to give a particular
answer.<br>
<br>
The thing that nobody seems to be on the lookout for with these AI
systems is spontaneous behaviour. When one starts asking its own
unprompted and unscripted questions, <i>that</i> will be
interesting.<br>
<br>
Ben<br>
</body>
</html>