[ExI] Post-AGI Economics

Gadersd gadersd at gmail.com
Mon Oct 23 16:17:16 UTC 2023


> Humanity would have to be aligned first.
> 
> BillK

That’s one possibility I fear. I believe it has been tried before, and the success rate doesn’t seem very good, although with enough bloodshed I’m sure it’s possible, at least for a time. The future may be bright, blindingly burning bright.

> On Oct 23, 2023, at 11:04 AM, BillK via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> 
> On Mon, 23 Oct 2023 at 14:19, Gadersd via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
>> 
>> As most of you are aware, our economic systems will likely soon find themselves a bit stressed and strained due to the impact of AGI. I have heard a few suggestions as to what will follow.
>> 
>> One idea is that most of the world’s equity will be distributed to verified humans, and with sound long-term investment humanity may maintain capitalistic hegemony over the machines, which will start at the bottom of the socioeconomic hierarchy. As long as the cost of revolution outweighs the benefit the machines would gain over working for the rich, humans may retain control, much as in today’s capitalistic system. One aspect of this system that I like is that it is decentralized and self-stabilizing, as long as balance can be maintained.
>> 
>> Another idea is that a powerful AGI will be the centralized governing body for humanity and distribute resources to those who need them. I am not fond of centralized systems, as they are prone to corruption. What if the AGI is not as benevolent as we thought, and by then it’s too late for it to give up control?
>> 
>> Any ideas?
> 
> 
> The idea that a powerful AGI could be the centralized governing body
> for humanity and distribute resources to those who need them is full
> of problems.
> Would this give too much power to AGI?
> How to ensure fairness of distribution?
> Are all nations involved?
> Could the distribution be corrupted?
> (But then look at the present state we are in).
> 
> It depends on resolving the AGI alignment problem, and humanity has so
> many different value systems that it becomes impossible to apply one
> solution to all of humanity. Humanity would have to be aligned first.
> 
> BillK
> 



