[ExI] How could you ever support an AGI?

giovanni santost santostasigio at yahoo.com
Tue Mar 4 20:25:09 UTC 2008


Very disappointed by Bradbury's ideas on AGI.
What Kevin has to say is much more in line with what I think on the issue. Even if an AGI were created suddenly (I think a gradual coming into being of AGI, and its integration with us, is the more realistic scenario), it would be less likely to desire to destroy us than we were to want to destroy our parents when we were born (or even when we reached the teenage stage). I had parents who were not particularly intellectual or interested in my aspirations (even if supportive), and I never desired to eliminate them; in fact, I have the opposite desire, to take care of them now that they need my help. In addition, I often fantasize about the possibility of raising my parents' level of education or desire for knowledge, so I could have shared my interests and passions with them.
In fact, would it not be wonderful if we could accelerate the evolution not just of other human beings but also of other non-human primates, so they could have an intelligence comparable to ours but of a different kind? As humans we are always looking for possible extraterrestrial "alien" companions (angels in prescientific times, little green men now), but what if we could bring other terrestrial species, such as dolphins and primates, to a higher level of consciousness, so we could share thoughts, music, and art?
Would the AGI not have similar yearnings to share existence with other "intelligent" beings, and even upgrade them to be peers with His/Her/Its/Their own intelligence and consciousness?
I think this is more likely than a crazy, primitive, selfish, destructive, nihilistic AGI.



Kevin Freels <kevin at kevinfreels.com> wrote:

John K Clark wrote:

Robert Bradbury wrote:
    
      > I believe the production of an AGI spells
   > the extinction of humanity.  
    
   Me too.
    
   > Why should I expend intellectual energy,
   > time, money, etc. in a doomed species?  
    
   If you don't want to develop an AI, somebody else certainly will; there is too much money and power involved for such a possibility to be ignored, not to mention the adrenaline high that creating such a godlike being would bring. And if you're the first to make an AI, you would have more control (very small, but larger than zero) over future events than the person who came in second. It may also give these developers some comfort to know that even if they or their children do not survive, their mind children will.
    
   > those of you who have had and/or are investing
   > in children are potentially pursuing a pointless endeavor. 
    
   Sucks, doesn't it? Still, things aren't completely hopeless, just almost hopeless. If you or your biological children have any wish to survive, they must shed the silly superstitions regarding identity and consciousness that are epidemic in society and even infect most members of this list. If they can do that, then there would be no reason not to upload and engage in pedal-to-the-metal upgrading, and if they are also very lucky, they might survive.
    
   > And so, we must present transhumanism 
   > as an "Extinction Level Event"
    
   Yes.
    
   > are willing to deal with that?
    
   Well, it’s not like we had any choice over the matter.
    
    John K Clark
    
   
  Why is it that you think humanity would become extinct? A "doomed species"? You are much better than that. First of all - ALL SPECIES ARE DOOMED. It's called evolution. The name we give a specific species is our way of fitting things into neat little boxes, as we humans like to do, but the fact is that the human of today is different from the human of yesterday and will be different in the future. Whatever may come will be different from us, but it will have our mark. Even the AGI - if it is actually intelligent - will recognize that had it not been for us, it would not exist. And had it not been for our parents, we would not exist, and so on. So your investment in children would not be "pointless" if those children were to be part of the world that brought the AGI into existence.
 
 One thing you will notice is that the greater the education a person has, the less likely they are to engage in wholesale destruction of life. Assuming an AGI would be very well educated, I would expect it to seek the protection of humanity just as we seek to protect chimps and gorillas. Certainly humans kill these animals, but for economic reasons that an AGI would simply not subscribe to.
 
 So I would expect that with an AGI, people could still choose their own destiny. They could remain human and continue as before, except in a much better world, or they could upload, convert to a mechanical body for exploration, or any combination in between. Some may even make copies of themselves digitally and shoot themselves across the universe on a laser beam. Some will "perfect" themselves into oblivion. Others will choose to remain as traditionally human as possible. Divergence is almost inevitable.
 
 But in all cases it would be pointless if you didn't feel and think like "you" when you were done. And personally, I wouldn't feel that merely surviving is enough. If the search for upgrades and one-upmanship turned out to be as you state, it would be no different from the current state of affairs, except there would be no time for relaxation and entertainment and your entire life would be dedicated to survival. If identity can't be preserved, it is indeed all pointless. So anything that doesn't produce that result would not be worth the time. An AGI would clearly see this.
 
 All this doom and gloom about the pointlessness of it all really concerns me, because once you go down that path you have to ask yourself why you even bother getting up in the morning. You might as well put an end to it now.


       