[ExI] Post-AGI Economics

Gadersd gadersd at gmail.com
Mon Oct 23 16:11:37 UTC 2023


> the average case scenario is we're all dead, and the worst case scenario is that the few survivors have no mouths but must scream.

I’m of the mind that humanity in its current form is ultimately doomed. I just hope that we can maintain control long enough to transfer ourselves into more resilient forms or bodies (if you believe such a “soul” transfer is possible). A rogue AI, or one obedient to a suicidal psychopath, is surely going to design a super virus in the not-too-distant future. As long as we remain in biological bodies, I think we are too vulnerable to last long.

> On Oct 23, 2023, at 10:41 AM, Darin Sunley via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> 
> Then the (vanishingly unlikely) best case scenario is more of the same only with some of the politicians being robots, the average case scenario is we're all dead, and the worst case scenario is that the few survivors have no mouths but must scream.
> 
> On Mon, Oct 23, 2023, 7:18 AM Gadersd via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> As most of you are aware, our economic systems will likely soon find themselves a bit stressed and strained due to the impact of AGI. I have heard a few suggestions as to what will follow.
> 
> One idea is that most of the world’s equity will be distributed to verified humans; with sound long-term investment, humanity may maintain capitalistic hegemony over the machines, which will start at the bottom of the socioeconomic hierarchy. As long as the cost of revolution outweighs the benefits of working for the rich, humans may retain control, similar to today’s capitalistic system. One aspect of this system that I like is that it is decentralized and self-stabilizing, as long as balance can be maintained.
> 
> Another idea is that a powerful AGI will be the centralized governing body for humanity and distribute resources to those who need them. I am not fond of centralized systems as they are prone to corruption. What if the AGI is not as benevolent as we thought and now it’s too late for it to give up control?
> 
> Any ideas?
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat

