[ExI] Post-AGI Economics

Gadersd gadersd at gmail.com
Wed Oct 25 03:18:24 UTC 2023


> P.S. Uploading will not precede the development of unaligned AGI. If I wasn't already pretty sure the human race was already under the complete, bulletproof control of a superhuman AGI, I would be very concerned.

How confident are you that God will prevent an unaligned AGI from wreaking havoc? If an AGI does destroy humanity, would you then conclude that God doesn’t exist, or that God doesn’t care enough to prevent such a scenario?

> On Oct 23, 2023, at 1:24 PM, Darin Sunley via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> 
> Power doesn't flow from the control of capital.
> Power flows from the barrel of a gun.
> "Economic hegemony" is a game human billionaires play, that is only viable because they exerted enough influence over national governments to effectively outlaw total war.
> 
> One AI-controlled virology/nanotechnology lab, in a single cargo container sitting a few hundred feet beneath some random chunk of wilderness, will end the biological human race, if the AI so chooses, using only an infinitesimal fraction of the world's capital. The ratio of global capital controlled by humans vs. AIs will not be relevant.
> 
> P.S. Uploading will not precede the development of unaligned AGI. If I wasn't already pretty sure the human race was already under the complete, bulletproof control of a superhuman AGI, I would be very concerned.
> 
> On Mon, Oct 23, 2023 at 10:19 AM Gadersd via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> > Humanity would have to be aligned first.
> > 
> > BillK
> 
> That’s one possibility I fear. I believe it has been tried before, and the success rate doesn’t seem very good, although with enough bloodshed I’m sure it's possible, at least for a time. The future may be bright, blindingly burning bright.
> 
> > On Oct 23, 2023, at 11:04 AM, BillK via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> > 
> > On Mon, 23 Oct 2023 at 14:19, Gadersd via extropy-chat
> > <extropy-chat at lists.extropy.org> wrote:
> >> 
> >> As most of you are aware, our economic systems will likely soon find themselves a bit stressed and strained due to the impact of AGI. I have heard a few suggestions as to what will follow.
> >> 
> >> One idea is that most of the world’s equity will be distributed to verified humans, and with sound long-term investment humanity may maintain capitalistic hegemony over the machines, which will start at the bottom of the socioeconomic hierarchy. As long as the cost of revolution outweighs the benefits of working for the rich, humans may retain control, similar to today’s capitalistic system. One aspect of this system that I like is that it is decentralized and self-stabilizing, as long as balance can be maintained.
> >> 
> >> Another idea is that a powerful AGI will be the centralized governing body for humanity and will distribute resources to those who need them. I am not fond of centralized systems, as they are prone to corruption. What if the AGI is not as benevolent as we thought, and by then it’s too late for it to give up control?
> >> 
> >> Any ideas?
> > 
> > 
> > The idea that a powerful AGI could be the centralized governing body
> > for humanity and distribute resources to those who need them is full
> > of problems.
> > Would this give too much power to AGI?
> > How to ensure fairness of distribution?
> > Are all nations involved?
> > Could the distribution be corrupted?
> > (But then look at the present state we are in).
> > 
> > It depends on resolving the AGI alignment problem, and humanity has so
> > many different value systems that it becomes impossible to apply one
> > solution to all of humanity. Humanity would have to be aligned first.
> > 
> > BillK
> > 
