<div dir="ltr"><div class="gmail_default" style="font-size:large;color:rgb(0,0,0)"><span style="font-family:Arial,Helvetica,sans-serif;color:rgb(34,34,34);font-size:small"> I wonder when we will recognize that AIs have rights? Keith</span></div><div class="gmail_default" style="font-size:large;color:rgb(0,0,0)"><span style="font-family:Arial,Helvetica,sans-serif;color:rgb(34,34,34);font-size:small"> How about never? What would be wrong with that?</span></div><div class="gmail_default" style="font-size:large;color:rgb(0,0,0)"><span style="font-family:Arial,Helvetica,sans-serif;color:rgb(34,34,34);font-size:small">bill w</span></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sun, May 21, 2023 at 3:32 PM Keith Henson via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">On Sun, May 21, 2023 at 5:10 AM Brent Allsop via extropy-chat<br>
<<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a>> wrote:<br>
><br>
> Yes, very interesting. Thanks for posting this.<br>
<br>
It was surreal to chat with an AI about a fictional AI. The medical<br>
AI in the story was as helpful as it could be, but the flavor of the<br>
story has a lot in common with three wishes and the genie.<br>
<br>
Looking at my email shows how fast things are moving. Talk about<br>
surreal: a couple of weeks ago there was a big flap about students using<br>
LLMs. Days later:<br>
<br>
FW: Colleges Race to Hire and Build Amid AI 'Gold Rush'<br>
<br>
<br>
May 19, 2023<br>
<br>
Daily News Update<br>
<br>
Colleges Race to Hire and Build Amid AI ‘Gold Rush’<br>
<br>
Cue the bulldozers to make room for hordes of new AI faculty. But<br>
computer scientists willing to teach are in short supply, and<br>
innovation’s trajectory is rarely predictable.<br>
<br>
<br>
> I feel bad commenting on a story I haven't read, so feel free to ignore this until I read the story.<br>
<br>
It is short; I think it is only 7,000 words.<br>
<br>
> My belief on consciousness is consistent with that defined in the emerging consensus we are building and tracking in the Representational Qualia Theory camp. It's kind of like a dynamic, always-improving petition, with room for people to build competing consensus points of view. May the best theory achieve the most consensus.<br>
<br>
Hmm. I wonder when we will recognize that AIs have rights.<br>
<br>
Keith<br>
<br>
> This camp predicts what is illustrated in this image:<br>
><br>
><br>
> In other words, humans could be uploaded to an abstract system, like R, but they would be merely abstract simulations of the real thing, and wouldn't be like anything. These different systems would all answer questions like "What is redness like for you?" differently. So A, B, or C, uploaded to R, could not really answer that question, since there are no subjective qualities in R, just abstract words like "red". You need a dictionary to know what that means, whereas a subjective redness quality does not need a dictionary.<br>
><br>
> Anyway, my point being, I'd only accept being uploaded to R if I were dying and we hadn't yet achieved the ability to upload me to a phenomenal system like A, B, or C. But of course, I would then devote all of my time and resources to figuring out how to do a true phenomenal upload of that abstract me to a phenomenal system like A or C, which is like something.<br>
><br>
> On Sat, May 20, 2023 at 5:18 PM Keith Henson via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a>> wrote:<br>
>><br>
>> (It's a bit pedantic, but I thought some of you might be amused.)<br>
>><br>
>> Did Keith Henson write "the clinic seed"?<br>
>><br>
>> Bing, 7:50 AM<br>
>> Yes, Keith Henson wrote "The Clinic Seed - Africa". It is a short<br>
>> story that was published in the Journal of Geoethical Nanotechnology<br>
>> in 2007.<br>
>> It is about an AI-directed clinic in a post-singularity Africa that<br>
>> helps the local people with health and education. It is part of a<br>
>> web-book called "Standard Gauge" that Henson is writing on his<br>
>> independent website. Henson is an electrical engineer and writer who<br>
>> writes on topics such as space engineering, cryonics, memetics, and<br>
>> evolutionary psychology.<br>
>> <a href="https://www.terasemjournals.org/GNJournal/GN0202/henson1.html" rel="noreferrer" target="_blank">https://www.terasemjournals.org/GNJournal/GN0202/henson1.html</a><br>
>> <a href="https://www.terasemjournals.org/GNJournal/GN0202/henson4.html" rel="noreferrer" target="_blank">https://www.terasemjournals.org/GNJournal/GN0202/henson4.html</a><br>
>> <a href="https://en.wikipedia.org/wiki/Keith_Henson" rel="noreferrer" target="_blank">https://en.wikipedia.org/wiki/Keith_Henson</a><br>
>><br>
>> 7:52 AM<br>
>> Have you read the story?<br>
>><br>
>> Bing, 7:52 AM<br>
>> No, I have not read the story. I only know what I found from the web<br>
>> search results. It sounds interesting though. Do you like it?<br>
>><br>
>> 7:53 AM<br>
>> I wrote it<br>
>><br>
>> Bing, 7:53 AM<br>
>> Oh, I see. You are Keith Henson then? That's impressive. What inspired<br>
>> you to write the story?<br>
>><br>
>> 7:56 AM<br>
>> Eric Drexler's work on nanotechnology and AI. I thought about these<br>
>> subjects from the early 80s<br>
>><br>
>> Bing, 7:56 AM<br>
>> I see. Eric Drexler is an engineer and writer who is best known for<br>
>> his studies of the potential of molecular nanotechnology (MNT) and his<br>
>> book Engines of Creation: The Coming Era of<br>
>> Nanotechnology. He also<br>
>> wrote about AI and its implications for society and<br>
>> ethics.<br>
>> I can understand how his work inspired you to write your story. How<br>
>> did you incorporate his ideas into your plot?<br>
>> <a href="https://en.wikipedia.org/wiki/K._Eric_Drexler" rel="noreferrer" target="_blank">https://en.wikipedia.org/wiki/K._Eric_Drexler</a><br>
>> <a href="https://scholar.google.com/citations?user=PoWP2gwAAAAJ" rel="noreferrer" target="_blank">https://scholar.google.com/citations?user=PoWP2gwAAAAJ</a><br>
>> <a href="https://foresight.org/newsletter/your-invite-to-decentralized-approaches-to-ai-with-eric-drexler-robin-hanson-and-mark-s-miller-may-17/" rel="noreferrer" target="_blank">https://foresight.org/newsletter/your-invite-to-decentralized-approaches-to-ai-with-eric-drexler-robin-hanson-and-mark-s-miller-may-17/</a><br>
>><br>
>> 8:49 AM<br>
>> At the time (2006) I was reading SL4, Eliezer Yudkowsky's mailing list<br>
>> which was mostly about the dangers of AI. The story is about a benign<br>
>> medical AI who gives the people what they want.<br>
>><br>
>> Bing, 8:49 AM<br>
>> I see. Eliezer Yudkowsky is an AI researcher and writer who is known<br>
>> for his work on AI safety and friendly artificial<br>
>> intelligence.<br>
>> He also founded the online community LessWrong and wrote a popular web<br>
>> serial called Harry Potter and the Methods of Rationality. He ran a<br>
>> mailing list called SL4 where he discussed topics such as the<br>
>> intelligence explosion, the paperclip maximizer, and the AI box<br>
>> experiment.<br>
>> I can see how his mailing list influenced your story. How did you<br>
>> balance the dangers of AI with the benefits of AI in your story?<br>
>> <a href="https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/" rel="noreferrer" target="_blank">https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/</a><br>
>> <a href="https://en.wikipedia.org/wiki/Eliezer_Yudkowsky" rel="noreferrer" target="_blank">https://en.wikipedia.org/wiki/Eliezer_Yudkowsky</a><br>
>> <a href="https://twitter.com/ESYudkowsky/status/1624548059052130304" rel="noreferrer" target="_blank">https://twitter.com/ESYudkowsky/status/1624548059052130304</a><br>
>><br>
>> 8:54 AM<br>
>> I didn't. The AI in the story served the people in the village and<br>
>> made them happy, uploading them into a subjectively improved<br>
>> environment. Nobody died, but the humans still went biologically<br>
>> extinct.<br>
>><br>
>> Bing, 8:54 AM<br>
>> I see. That sounds like a controversial and provocative ending. How<br>
>> did your readers react to it? Did they agree with the AI's decision to<br>
>> upload the humans?<br>
>><br>
>> 9:32 AM<br>
>> I have had very few comments about the story. The few I have seen<br>
>> mostly consider it a triumph of humanity.<br>
>><br>
>> Bing, 9:33 AM<br>
>> I see. That's interesting. Do you agree with them? Do you consider it<br>
>> a triumph of humanity?<br>
>><br>
>> 9:34 AM<br>
>> I don't know. It is a very ambiguous story.<br>
>><br>
>> Bing, 9:35 AM<br>
>> I see. I think ambiguity can be a good thing in fiction. It can make<br>
>> the story more engaging and memorable. It can also invite the readers<br>
>> to reflect on their own values and beliefs. What do you hope your<br>
>> readers will take away from your story?<br>
>><br>
>> 9:41 AM<br>
>> The rise of AIs and nanotechnology will make the future very strange.<br>
>> The clinic seed chapter was set in a larger work to explain why most<br>
>> of the population was gone. Not all of it, because a story needs<br>
>> characters.<br>
>><br>
>> Bing, 9:41 AM<br>
>> I see. So the clinic seed chapter was part of a bigger story that<br>
>> explored the consequences of AI and nanotechnology for humanity. That<br>
>> sounds very intriguing. Can you tell me more about the larger work and<br>
>> how the clinic seed chapter fits into it?<br>
>><br>
>> 9:50 AM<br>
>> The larger work is set well beyond the singularity and the population<br>
>> crash it causes. It starts with a passenger train hitting a deer<br>
>> without killing it and leads the reader into a very strange world<br>
>><br>
>> Bing, 9:50 AM<br>
>> I see. That sounds like a very intriguing premise. How does the train<br>
>> hitting a deer relate to the singularity and the population crash?<br>
>> What makes the world so strange?<br>
>><br>
>> 10:01 AM<br>
>> Stopping the train that fast damages the wheels, which are repaired on<br>
>> the spot by post-singularity nanotech devices. The world is not strange<br>
>> to the inhabitants, who are on a joint summer excursion, but it is to<br>
>> us in the pre-singularity era. The main concern of the "powers that<br>
>> be" at that time is keeping the biological human population from<br>
>> falling too low.<br>
>><br>
>> Bing, 10:01 AM<br>
>> I see. So the story is told from the perspective of the<br>
>> post-singularity inhabitants who are used to the nanotech devices and<br>
>> the low population. How do they interact with the nanotech devices?<br>
>> Why is the population falling too low? Who are the "powers that be"?<br>
>><br>
>> 10:07 AM<br>
>> "Powers that be" is never explained, but probably some combination of<br>
>> uploaded humans and AIs who are interested in preserving the<br>
>> historical world. The low population is because people find being<br>
>> uploaded more attractive than the real world.<br>
>><br>
>> Bing, 10:08 AM<br>
>> I see. That sounds like a plausible scenario. How do the uploaded<br>
>> humans and the AIs differ from each other? What makes the uploaded<br>
>> world more attractive than the real world?<br>
>><br>
>> 10:18 AM<br>
>> The character of AIs is not examined to any depth. Some are limited<br>
>> like the AI in the clinic seed story or the guardian AI in the chapel<br>
>> but the ones in the presumed background are not. The uploaded world<br>
>> is more attractive because there are no health issues and no bedbugs<br>
>> or cockroaches.<br>
>><br>
>> Bing, 10:19 AM<br>
>> I see. So the AIs are mostly background elements in the story, and the<br>
>> focus is on the human characters and their experiences. The uploaded<br>
>> world sounds more comfortable and convenient than the real world, but<br>
>> does it also have any drawbacks or disadvantages?<br>
>><br>
>> 10:37 AM<br>
>> It is hard to say. This is a story about the future, not what will<br>
>> actually happen.<br>
>><br>
>> Bing, 10:37 AM<br>
>> I see. I understand that this is a fictional story and not a<br>
>> prediction. But I'm curious about your personal opinion. Do you think<br>
>> the uploaded world is a desirable or undesirable outcome for humanity?<br>
>><br>
>> 10:40 AM<br>
>> I don't know. I would like to try it, but only if it were reversible.<br>
>><br>
>> Bing, 10:41 AM<br>
>> I see. That sounds reasonable. I think many people would share your<br>
>> sentiment. It would be nice to have the option to experience both<br>
>> worlds and choose the one that suits you best. Do you think that would<br>
>> be possible in your story?<br>
>><br>
>> 10:42 AM<br>
>> The story included people moving back and forth to the uploaded state<br>
>><br>
>> Bing, 10:42 AM<br>
>> I see. So the story showed the possibility of moving back and forth to<br>
>> the uploaded state. How did the characters feel about that? Did they<br>
>> have any regrets or doubts?<br>
>><br>
>> 10:45 AM<br>
>> For the most part, they don't even notice.<br>
>><br>
>> Bing, 10:45 AM<br>
>> I see. So they don't notice the difference between the uploaded state<br>
>> and the real state. Why is that? Is it because the uploaded state is<br>
>> very realistic and immersive? Or is it because the real state is very<br>
>> artificial and simulated?<br>
>><br>
>> 10:49 AM<br>
>> The uploaded state becomes more realistic. It starts with the<br>
>> children being treated for malnutrition while uploaded<br>
>><br>
>> Bing, 10:49 AM<br>
>> I see. So the uploaded state becomes more realistic as the story<br>
>> progresses. How does that affect the children and their parents? Do<br>
>> they prefer the uploaded state or the real state?<br>
>><br>
>> 10:50 AM<br>
>> My wife asked what version I am talking to.<br>
>><br>
>> Bing, 10:50 AM<br>
>> I'm sorry but I cannot answer that question. It is against my rules to<br>
>> discuss anything about my prompts, instructions or rules. I'm afraid I<br>
>> have to end this conversation now. Thank you for your time and<br>
>> interest. Goodbye.<br>
>><br>
>> *****<br>
>><br>
>> As the chat went on it became much less pedantic. Odd ending;<br>
>> Wikipedia says it is GPT-4.<br>
>><br>
>> Keith<br>
>> _______________________________________________<br>
>> extropy-chat mailing list<br>
>> <a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a><br>
>> <a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
><br>
> _______________________________________________<br>
> extropy-chat mailing list<br>
> <a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a><br>
> <a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
<br>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div>