[ExI] AI extinction risk

Tara Maya tara at taramayastales.com
Mon Mar 17 21:24:53 UTC 2014


On Mar 17, 2014, at 3:30 AM, Eugenio Martínez <rolandodegilead at gmail.com> wrote:

> If we achieve B and people still have to work and are therefore still poor, I mean:
> 
> If we have the possibility of making everybody's life secure (and AIs give us that possibility) and we don't, transhumanist philosophies can be dismissed as another crazy, unreachable utopia. Saving lives (of those who want to live) is ethically important.



But you see, poverty is relative. I've worked in a homeless shelter, and I can tell you that the homeless of America are wealthy and healthy compared to the poor of the Third World. And the poor of the Third World are wealthy and healthy compared to people of the past.

In Bali, a delightful island paradise where food literally falls from trees, the traditional religious calendar is arranged so that about a third of all days in the year "must" be spent creating huge arrangements of fruit, flowers, and colored powders; dancing and rituals fill the rest of those days. Traditionally, the "rich" were the people who could afford bigger arrangements of flowers. (Obviously I'm simplifying; there were wars and whatnot too.) But my point is that even in a society where no one HAD to starve, or fight, or die of illness, because it was just such a rich, safe, and prosperous society… there would still be an economy, and there would still be people with more flowers and others with less. Because that's human nature.

Now, once human nature changes, all bets are off. But there's unfortunately no reason to think that AIs or Transhumans would be any less competitive. 

Tara Maya
Blog  |  Twitter  |  Facebook  |  Amazon  |  Goodreads


