[ExI] The Status Game

Kelly Anderson postmowoods at gmail.com
Sat Dec 16 22:36:47 UTC 2023

So I have been reading Will Storr's book entitled "The Status Game"
and have come to realize from it that we MAY not have quite as much
to worry about from AI as I had previously thought. As group-oriented
primates, we crave status, and that craving has led, according to
Storr's cogent argument, to many if not most of the "bad things" as
well as the "good things" in human history. Whether bad or good things
ultimately emerge depends a great deal on whether your Status Game is
played within the right kind of system. It's a fascinating read and I
recommend it to you all on its own merits.

That being said, if we do not program AIs' optimization functions to
pursue status, might we not avoid SOME of the more important pitfalls
to which we primates are prone?
