[ExI] Fermi Paradox and Transcension

Joshua Job nanite1018 at gmail.com
Thu Sep 6 01:21:57 UTC 2012


There would be selection pressures and economic factors to consider. For
example, humans and their uploads will likely be resource-poor, since they
will be inefficient in the environment of a computer system (we'll be
carrying around bulky physics simulations just to deal with information,
each other, and our own brains). We may not be able to afford our own
computational substrate, and could be forced out of existence by poverty
and competition for finite, essentially non-increasing resources in our
local MBrain.

The AIs will likely have markedly dissimilar value structures, and there
will be strong selection pressures to use resources as efficiently as
possible. Since AIs (at least the most economically successful ones in the
long run) probably won't have a strong aesthetic sense, at least not for
the kinds of things humans appreciate, there will be few if any beings who
would do such things for art's sake.

And then we have the relative cost. The energy required to send a unit of
computronium out of the solar system will likely equal years of its
operational energy (a rod-logic nanocomputer à la Drexler would cover that
cost with only ~15 minutes of running energy, and rod logic is clearly not
yet at computronium-level efficiency). That is extremely costly, especially
in what will quickly become an extremely competitive environment as we
reach the limits of irreversible computation near our star.
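A back-of-envelope sketch of that comparison: the kinetic energy per kilogram
needed to escape the Sun from Earth's orbit, divided by an assumed power
density, gives the equivalent operating time. The power densities below are
illustrative assumptions chosen to reproduce the "~15 minutes" figure, not
numbers taken from Drexler's Nanosystems.

```python
import math

# Physical constants (SI units).
G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30          # solar mass, kg
R_EARTH_ORBIT = 1.496e11  # 1 AU, m

# Kinetic energy per kg to escape the Sun starting from Earth's orbit,
# ignoring launch inefficiencies and Earth's own gravity well.
v_escape = math.sqrt(2 * G * M_SUN / R_EARTH_ORBIT)  # ~42 km/s
escape_energy = 0.5 * v_escape**2                    # ~8.9e8 J per kg

# Assumed (hypothetical) power densities of the computing substrate.
ROD_LOGIC_W_PER_KG = 1e6      # inefficient rod-logic nanocomputer
COMPUTRONIUM_W_PER_KG = 10.0  # far more efficient substrate

rod_logic_minutes = escape_energy / ROD_LOGIC_W_PER_KG / 60
computronium_years = escape_energy / COMPUTRONIUM_W_PER_KG / (3600 * 24 * 365)

print(f"escape energy:  {escape_energy:.2e} J/kg")
print(f"rod logic:      ~{rod_logic_minutes:.0f} minutes of operation")
print(f"computronium:   ~{computronium_years:.1f} years of operation")
```

Under these assumed numbers, launching a kilogram of substrate costs the
rod-logic machine about a quarter hour of runtime, but a substrate a hundred
thousand times more frugal would forfeit years of computation — which is the
asymmetry the argument turns on.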

And then there's relative value. If I can simulate the thing for less than
it would cost to create it, there would be a strong impetus to simulate.

Not saying it's a perfect explanation, but it seems that very scarce
computational resources and a highly competitive environment could make the
energy barrier a high hurdle to clear. I don't think it would be strong
enough to stop everyone, ever, but combined with a few filters in our past
(life, eukaryotic cells, intelligence) it might be enough.
-Josh.
On Sep 5, 2012 4:03 PM, "Anders Sandberg" <anders at aleph.se> wrote:

> On 05/09/2012 23:01, BillK wrote:
>
>> Once your mind is uploaded and operating at computronium speed, then
>> spamming the universe would just seem silly.
>>
>
> For *every* mind with *every* possible motivation?!
>
> Organ²/ASLSP is a music piece by John Cage which is being played at St.
> Burchardi church in Halberstadt. It is scheduled to have a duration of
> 639 years, ending in 2640. If current human artists do things like that,
> don't you think future posthuman artists might spam the universe for art?
>
> The speed argument is not enough.
>
> --
> Anders Sandberg,
> Future of Humanity Institute
> Oxford Martin School
> Faculty of Philosophy
> Oxford University
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>

