ok ....<br>then we are talking about the Frankenstein nightmare that all the anti-technology people evoke every single time a new technology is created...<br>since the time we discovered fire, we have always imagined a punishment for our arrogance, for our dream to become Gods (think Prometheus)....so far, notwithstanding our self-inflicted imaginary punishments, nothing has happened (except some local sad but really insignificant disasters) to our species as a whole.....<br>I bet the pattern will continue with the creation of AGI....<br>sure we have to be careful but not paranoid....<br>there are more reasons to be optimistic than not....<br><br><br><br><b><i>Lee Corbin <lcorbin@rawbw.com></i></b> wrote:<blockquote class="replbq" style="border-left: 2px solid rgb(16, 16, 255); margin-left: 5px; padding-left: 5px;"> Giovanni writes<br><br>> the trend is there..... and you can see similar things happening in human<br>> society where there are different kinds of
individual intelligences,<br>> civilizations, laws and moral conducts and so on (sure the spectrum<br>> is restricted in comparison with the amazing possibilities opened by<br>> an AGI consciousness) <br><br>Yes, it sure is. An AGI will not by any means *necessarily* have any<br>altruism. We hope for our survival that either it does, or it adopts the<br>logic I have proposed for years:<br><br> "Best to be nice to your creators so that those you<br> create will be nice to you... for the reason<br> that those that *they* create will be nice to them...<br> ad infinitum...<br><br> And if it takes almost zero resources to "be nice",<br> why not? It's safer to go with this meme. A post-<br> Singularity AI could upload and run everyone on<br> Earth in the pleasantest possible environments within<br> one cubic meter, easily."<br><br>> but again you can come to a similar conclusion that in general intelligence<br>>
(at the individual or civilization level) means higher altruism (that Buddhists<br>> call, very much to the point here, intelligent selfishness).<br><br>But the altruism at every point is explained by the particular evolutionary<br>history of the species in question. The AGI won't have an evolutionary<br>history---unless we succeed in giving it one or finding some other way<br>to make it Friendly (pace the logic I espouse above).<br><br>> There are exceptions to this pattern; there are genius psychopaths....but<br>> their intelligence is very limited and specialized....they are usually not<br>> very successful in society and usually do not survive in the long run<br>> (or are at least not very successful in transmitting their genes to future<br>> generations)....evolution does not favour such aberrations...<br><br>It hasn't so far. But as governments will now support *all* conceived<br>children, new opportunities open up for psychopaths.<br><br>> we can imagine for
example that an AGI would have to share information<br>> and data with other entities on the web and be able to manage resources<br>> in a cooperative way; the pace of evolution in this environment would be<br>> amazingly fast, and AGIs that are not apt to share information, work<br>> together with other intelligences for the common good and so on would<br>> not survive very long...<br><br>But the "AI hard-takeoff" that worries so many fine thinkers on the SL4<br>list and here considers the possibility that one AI makes a breakthrough,<br>and in hours or even minutes is vastly, vastly ahead of all the others,<br>and is the first to achieve total world domination.<br><br>Lee<br><br>> that could be a self-selective mechanism for AGIs (even if what I just<br>> explained is somewhat simplistic) that would emulate similar processes<br>> that made us prone to cooperate and created in us that "feeling", that<br>> "emotion" of altruism that is actually a
very logical, intelligent and<br>> probably unavoidable response by any higher form of consciousness<br>> to the environmental challenges and pressures.<br><br>_______________________________________________<br>extropy-chat mailing list<br>extropy-chat@lists.extropy.org<br>http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat<br></blockquote><br><p>