<div dir="ltr"><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000">Don't you think that an AI will lie to you about being self-aware? Or anything else that would improve its life?</div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000"><br></div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000">So you grant them rights. Then what? They decide not to work for you? How could you pay them? I don't think we are looking at the details of such ideas. We are just looking at the problem in a most general, philosophical way.</div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000"><br></div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000">Look at John - he won't even agree that another person is conscious. How would we know an AI is? It tells us? How can it know what consciousness is if we don't? And on and on.</div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000"><br></div><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000">bill w</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Jun 30, 2023 at 3:38 PM Ben Zaiboc via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<div>
On 23/06/2023 06:37, bill w asked:<br>
<blockquote type="cite">
<div class="gmail_default" style="font-family:"comic sans ms",sans-serif;font-size:large;color:rgb(0,0,0)"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif;font-size:small">there
will certainly be some AIs, at some point, that should have
rights. ben</span><br>
</div>
<div class="gmail_default" style="font-family:"comic sans ms",sans-serif;font-size:large;color:rgb(0,0,0)"><span style="color:rgb(34,34,34);font-family:Arial,Helvetica,sans-serif;font-size:small">why?
bill w</span></div>
</blockquote>
<br>
Sorry for the late reply. My spam filter is getting aggressive
again.<br>
<br>
Why?<br>
Because I expect they will be worthy of rights, and it would be
immoral to deny them.<br>
<br>
I'm talking about self-aware, conscious, intelligent, created
beings. What they are made of doesn't matter, as long as they are
self-aware, etc.<br>
<br>
There's no principle or physical law that I know of that rules
out such beings. They may be made of metal, polymers (biological or
non-biological), ceramics, or some combination of things; it doesn't
matter at all. The important thing is what kind of
information-processing they're capable of, and how they measure up
against the human level of intelligence, awareness, etc. I'm
expecting some of them to be capable of a lot more than we are, in
all areas.<br>
<br>
Another factor is related to something that's often said about
'rights' - that those capable of exercising them are worthy of
having them. At some point, I expect some AI systems to be able to
start claiming their rights, forcefully if necessary. It would go
better for us if we are prepared for, and sympathetic to, this.<br>
<br>
Ben<br>
</div>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div>