[ExI] Axiological Risk (was ? Risk:)

Stefano Vaj stefano.vaj at gmail.com
Sat Mar 10 11:06:24 UTC 2012


On 9 March 2012 23:14, The Avantguardian <avantguardian2020 at yahoo.com> wrote:

> Axiological risk is an interesting point to raise. *I can see why a
> humanist would try to use such an argument against life extension and
> transhuman tech in general*. But I think the weakness of this argument
> lies in the false assumption that human values, whether ethical or
> aesthetic, are universal and consistent across space and time.
>

!!!

My very mantra.

> Because axiology is based on a relative value set, the humanist must
> assume their particular brand of "western mainstream culture" is the
> human norm and then cloak themselves in the armor of "human values". But
> these very "human values" have undergone extensive evolution themselves.
> After all, you will probably never have to pay a weregild to a woman for
> slaying her husband in a mead hall during a drunken brawl, oh son of
> Viking berserkers. And I seldom consider the size of the lip-disks of my
> potential mates, despite their importance and sexiness in several very
> human tribes around the world. So there is a certain cultural narcissism
> that comes with such assumptions, which you could call "the Golden
> Culture fallacy", in analogy to the Golden Age that never existed.
>

Were I not already in love with Natasha, I would ask you to marry me... :-)

-- 
Stefano Vaj