Three silly questions: how would you know if you had created a conscious mind? Why do you want to do that? What makes that necessary?

bill w

On Thu, Feb 23, 2023 at 10:05 AM Ben Zaiboc via extropy-chat <extropy-chat@lists.extropy.org> wrote:

Giovanni, you've just made it onto my (regrettably small) list of people
who seem to be capable of thinking in a joined-up fashion, who seemingly
haven't fallen prey to dualistic and crypto-dualistic notions. I'll take
more notice of your posts here in future.

It seems we on this list suffer from just as many misunderstandings as
any other group of humans, and your statement "the brain is a
simulation" is a good example of how this happens. I'm not criticising
you, far from it, but it does illustrate how easy it is for people to
get the wrong end of the stick and run with that, ignoring later
clarifications of what the poster actually meant. I understand what you
(almost certainly) meant by that comment, even if I wouldn't have put it
that way myself. Some others will not.

Literally speaking, what you said doesn't make any sense. The brain is a
physical object, in what we regard as the 'real world', so it can't be a
simulation. But of course (my assumption is) you didn't really mean
that, and it should be pretty easy to figure that out. Our internal
representation of the world and other people, our entire experience, is
what I'm assuming you mean, and of course that is a simulation. It
couldn't be anything else (silly notions of specific molecules actually
being certain experiences of certain colours notwithstanding).

My understanding is that what our brain does is simulate the world and
the agents that appear in it, and even the agent that is experiencing
the simulations.

The way I'd put it is that everything I experience (including myself)
is a simulation created by a ('my') brain.

Just to be clear, is that what you meant? I'm open to the possibility
that I've got this totally wrong! (in which case, I may need to withdraw
what I said in the first paragraph, above :D )

I also suspect you're right in saying that consciousness is going to be
much easier to produce than we currently think, once we figure it out.
We will probably be astonished at how simple it is, and how easy it will
be to create fully-conscious artificial minds.

I think it's a bit like our understanding of tying a knot. At some point
in our prehistory, humans wouldn't have known what knots were*, and
probably struggled to do things like keeping animal skins on their
bodies when they needed them to stay warm. Once some genius invented the
knot (which probably didn't take long), it would have been a real 'Aha!'
moment, and, once shown, suddenly everyone could securely tie a skin on
themselves to keep warm, and we've hardly given it a second thought ever
since (apart from a certain group of geeky mathematicians!).

I reckon the trick of creating fully-conscious minds will be similar.
There's probably a small set of necessary features that a system needs
in order to be conscious and self-aware; we just don't know what they
are yet. But I think we're getting close (just for the record, I very
much doubt that any chatbot has these features, quite possibly by a long
chalk. Spike's remarks about having a persistent memory are a good start,
but probably far from all that's needed).

Ben

* If this strikes you as ridiculously unlikely, substitute some other
obvious-in-hindsight thing that would totally elude someone not aware of
it, like maybe using stones to make sharp sticks or digging a hole then
making a noise to kill an animal, etc.