<div dir="ltr"><div dir="ltr"><br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Jul 19, 2023 at 7:07 AM efc--- via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hello everyone,<br>
<br>
I think this raises interesting questions of ethics and the "purpose" of <br>
mankind.<br>
<br>
If we agree that happiness is the ultimate goal, and not the survival <br>
of the species, then why not stop having children if we can get along fine <br>
with robots and AIs?<br>
<br>
Perhaps, assuming life extension or "immortality", there will be a final <br>
generation?<br></blockquote><div><br></div><div>Good questions. I think there are at least three values: quantity, quality, and diversity of conscious experiences.</div><div><a href="https://alwaysasking.com/what-is-the-meaning-of-life/#Knowing_the_Meaning_of_Life">https://alwaysasking.com/what-is-the-meaning-of-life/#Knowing_the_Meaning_of_Life</a><br></div><div><br></div><div>When it comes to maximizing total experience, it is simply the product of population and time: Total Experience = Population × Time. A civilization of 10 billion people that lasts one century generates as much experience as a population of 1 billion would in one millennium (a quick back-of-the-envelope check is sketched at the end of this message). Creating unique types of beings also contributes to exploring a greater range of possible experiences, which I think has its own utility.</div><div><br></div><div>This video gives a good perspective on what might be possible in the future, so civilization surviving the immediate future is also immensely important: <a href="https://www.youtube.com/watch?v=LEENEFaVUzU">https://www.youtube.com/watch?v=LEENEFaVUzU</a></div><div><br></div><div>Jason</div><div><br></div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<br>
Best regards,<br>
Daniel<br>
<br>
<br>
On Tue, 18 Jul 2023, spike jones via extropy-chat wrote:<br>
<br>
><br>
> I posted this right before the ExI list barfed. Posting again.<br>
><br>
> spike<br>
><br>
><br>
><br>
> -----Original Message-----<br>
> From: <a href="mailto:spike@rainier66.com" target="_blank">spike@rainier66.com</a> <<a href="mailto:spike@rainier66.com" target="_blank">spike@rainier66.com</a>> <br>
> Sent: Saturday, 15 July, 2023 4:14 PM<br>
> To: 'ExI chat list' <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a>><br>
> Cc: 'BillK' <<a href="mailto:pharos@gmail.com" target="_blank">pharos@gmail.com</a>>; <a href="mailto:spike@rainier66.com" target="_blank">spike@rainier66.com</a><br>
> Subject: RE: [ExI] How fun could doom intelligent life to a blissful extinction<br>
><br>
><br>
><br>
> -----Original Message-----<br>
> From: extropy-chat <<a href="mailto:extropy-chat-bounces@lists.extropy.org" target="_blank">extropy-chat-bounces@lists.extropy.org</a>> On Behalf Of BillK via extropy-chat<br>
><br>
><br>
>> ...If the pursuit of happiness is the primary explanation for our decreasing fertility rate, this tendency might be true not just for humans but for all intelligent life — providing a possible explanation for the Fermi Paradox.<br>
><br>
> <<a href="https://bigthink.com/the-future/pursuit-happiness-doom-intelligent-life-blissful-extinction/" rel="noreferrer" target="_blank">https://bigthink.com/the-future/pursuit-happiness-doom-intelligent-life-blissful-extinction/</a>><br>
> -------------------<br>
> ...<br>
><br>
> BillK<br>
><br>
><br>
><br>
><br>
> BillK, this is really as plausible an explanation for the Fermi Paradox as any I have heard, and perhaps the most pleasant one. Having children is a way to experience happiness, but it is a risky bet indeed. If we find sufficient alternative routes to happiness, the notion of having children becomes ever less compelling. If we find alternative routes to the pleasures of copulation and all those cool endorphins we get from love, that whole risky activity isn't worth the effort either. Result: not enough young people to run the world we already built for them.<br>
><br>
> But of course nuclear war could wipe out most of what we have done, creating the need for rebuilders and family people, so we might save our species in that horrifying way: radiation therapy. Or the singularity could kill us, but I don't think it would kill people who have never seen a computer. They might survive to build it all back.<br>
><br>
> spike<br>
><br>
><br>
><br>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div></div>
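<div><br></div><div>P.S. A minimal sketch, in Python, of the person-years arithmetic from my reply above, assuming total experience is measured simply as population multiplied by duration; the helper name total_experience is just illustrative, and the round figures are the ones used in the message, not data from anywhere:</div><div><br></div><pre>
# Back-of-the-envelope check of "Total Experience = Population x Time".
# The population and duration figures are the illustrative ones from the
# message above, not real demographic data.

def total_experience(population, years):
    """Total experience in person-years: population multiplied by duration."""
    return population * years

large_short = total_experience(10_000_000_000, 100)   # 10 billion people for 1 century
small_long  = total_experience(1_000_000_000, 1_000)  # 1 billion people for 1 millennium

print(large_short, small_long, large_short == small_long)
# Prints 1000000000000 1000000000000 True: both come to a trillion person-years.
</pre><div><br></div><div>Either way the product is a trillion person-years, which is the sense in which the two civilizations generate the same total amount of experience.</div>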