<p dir="ltr">There would be selection pressures and economic factors to consider. For example, humans or their uploads likely will be resource poor, as they will be inefficient in the environment of a computer system (we'll be carrying around bulky physics simulations to let us deal with information, each other, and our own brains). We may not be able to afford our own computational substrate, and be forced out of existence due to poverty and competition for finite and essentially non-increasing resources in our local MBrain.</p>
<p dir="ltr">The AIs will likely have markedly dissimilar value structures, and there will be strong selection pressures to use resources as efficiently as possible. Since AIs (at least the most economically successful ones in the long run) likely won't have a strong aesthetic sense, at least not for things like humans are aware of, there will be few if any beings who would do such things for art's sake.</p>
<p dir="ltr">And then we have the relative cost. For the same cost in energy as sending a unit of computronium out of the solar system will likely be years of operational energy (rod logic nano computers ala Drexler would take ~15 minutes of running energy, and they're clearly not at computronium level efficiency yet). That is extremely costly, especially in what will quickly become an extremely competitive environment, as we reach the limits of irreversible computation near our star.</p>
<p dir="ltr">And then there's relative value. If I can simulate the thing for less than it would take to create it, then there would be a strong impetuous to do so.</p>
<p dir="ltr">Not saying it's perfect, but it seems very scarce computational resources and a highly competitive environment might make the energy barrier a high hurdle to clear. I don't think it would be strong enough to prevent everyone, ever, but combined with a few filters in our past (life, eukaryotic cells, intelligence) it might be enough.<br>
-Josh.</p>
<div class="gmail_quote">On Sep 5, 2012 4:03 PM, "Anders Sandberg" <<a href="mailto:anders@aleph.se">anders@aleph.se</a>> wrote:<br type="attribution"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
On 05/09/2012 23:01, BillK wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Once your mind is uploaded and operating at computronium speed, then spamming the universe would just seem silly.<br>
</blockquote>
<br>
For *every* mind with *every* possible motivation?!<br>
<br>
Organ²/ASLSP is a music piece by John Cage which is being played at St. Burchardi church in Halberstadt. It is scheduled to have a duration of 639 years, ending in 2640. If current human artists do things like that, don't you think future posthuman artists might spam the universe for art?<br>
<br>
The speed argument is not enough.<br>
<br>
-- <br>
Anders Sandberg,<br>
Future of Humanity Institute<br>
Oxford Martin School<br>
Faculty of Philosophy<br>
Oxford University<br>
<br>
</blockquote></div>