[ExI] The Unsong

Anders anders at aleph.se
Fri Oct 21 09:37:06 UTC 2016


On 2016-10-21 05:05, Rafal Smigrodzki wrote:
>
> ### And Now For Something Completely Different - I just noticed you 
> are reading and commenting on the Unsong!
>
> A most uproariously complex and multi-level intellectually 
> entertaining prose, shading into poetry, isn't it? And morally 
> uplifting to boot!

The URL is http://unsongbook.com/

Basically, this is kabbalist science fiction/satire/alternate history, 
with people pirating Names of God from the big theonomics corporations, 
accidental crashes of the virtual machine the universe runs on, puns 
being serious business (since they represent hidden correspondences in 
the universe), a very *different* War on Drugs, and a very different end 
to the Nixon administration. Very fun. It inspired me to do some multidimensional
stochastic geometry: http://aleph.se/andart2/math/uriels-stacking-problem/

Unsong also leads to an interesting philosophical question for 
transhumanism: suppose we find that our current ontology of the world is 
pretty wrong - maybe magic actually exists. It seems the basic 
transhumanist approach still makes sense: figure out the techniques that 
can enhance our abilities and improve our long-term prospects, and use 
them even if they turn out to be magic spells, holy names or exploit code 
for reality. So is 
transhumanism independent of the nature of the world? Would 
transhumanist ideas make sense in any universe, despite the very 
different tools, risks and opportunities?

I think this may actually be the case if we say transhumanism is about 
maximizing value of a certain kind. Obviously there are possible worlds 
with no observers or no way of increasing value, but leaving those aside 
we may say that transhumanism is about value-bearing observers/actors 
changing themselves to achieve greater value.

But there are also possible worlds where the structure of *value* is 
different from what we think it is in our world. For example, in a 
theocentric world where one entity is the sole arbiter and supplier of 
value, transhumanism would be all about changing or acting in such a way 
that the entity assigns more value to the being doing the changing. One 
can also construct sneaky possible worlds where change itself is against 
value (essentially worlds where bioconservative notions are morally true). 
Now, many philosophers would cheerfully argue that many of these worlds 
are not possible worlds at all, because they are inconsistent in various 
ways and hence cannot exist. Some philosophers may even argue that there 
is only one structure of value that is consistent, so if we were to find 
it we would have "solved" ethics. Maybe.

But one interesting result of this ramble is that if we buy this version 
of transhumanism as observers/actors changing themselves to increase 
their value (to themselves or in some global sense) and look at a world 
where transhumanism is actually possible, then different states of that 
world must have different value. That means that there have to be 
relatively low-value states. Hence transhumanism *requires* the existence 
of evil, or at least of not-quite-as-good states.
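(A minimal sketch of that step, in my own notation and assuming we can 
represent things with a value function V over the set of world states S: 
if transhumanism is possible, some state can be improved on, so some 
state must fall short of the best attainable value:

  \exists\, s, s' \in S : V(s') > V(s)
  \;\Longrightarrow\;
  \exists\, s \in S : V(s) < \sup_{t \in S} V(t)

Nothing deep, just the observation that a non-constant value function 
forces the existence of sub-optimal states.)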

Not quite an answer to the theodicy problem (unless you are a religious 
transhumanist or a process theologian), but definitely worth pondering. 
Omega Point-like cosmologies that produce a godlike entity of high value 
through the deliberate actions of agents also seem to require bad states.




-- 
Dr Anders Sandberg
Future of Humanity Institute
Oxford Martin School
Oxford University
