[ExI] How could you ever support an AGI?

giovanni santost santostasigio at yahoo.com
Tue Mar 4 23:54:15 UTC 2008


Well,
about the anthropomorphizing of AGI: you say at the end that some believe an AGI's motivations will be the ones we program into it. Exactly, that is my point; it is difficult for us to create an utterly alien intelligence when the only example of intelligence we have is ourselves.
But maybe there are general and universal principles associated with intelligence.
Intelligence means finding patterns and connections, understanding that affecting this part here means affecting that other part over there; intelligence means having a higher sense of physical and moral "ecology".
If you see connections between all beings, then you feel compassion and understanding (and yes, these are human feelings, but they are also fundamental components of our intelligence, and a lot of new research shows that without feelings we would not have conscious intelligence at all).
Yes, we exterminate bugs, but usually in limited situations (in our house or on a crop, say). It would be unacceptable for mankind to pursue a global plan to completely exterminate all the roaches on Earth, even if it could be done.
And while it is difficult to have feelings for bugs, such a plan would not make sense ecologically; it would not be the intelligent thing to do, and by definition an AGI is supposed to be intelligent.



John Grigg <possiblepaths2050 at gmail.com> wrote: giovanni santost wrote:
 Even if there were a sudden creation of AGI (I think the gradual coming into being of AGI and its integration with us is the more realistic scenario), it would be even less likely to desire to destroy us than we were to want to destroy our parents when we were born (or even when we reached the teenage stage). 
  >>>
  
 You are badly anthropomorphizing the AGI.  It will most likely not have the same biological drives/wiring that you and I have.  Where is Eliezer Yudkowsky when we need him? lol  I think the "whole gradual coming into being of AGI combined with our integration into it" is actually the very unlikely scenario.  Pure AGI development will definitely progress faster than the machine/biological interfaces that you imagine.
   
 you continue:
 I had parents who were not particularly intellectual or interested in my aspirations (even if supportive), and I never desired to eliminate them; in fact, I have the opposite desire to take care of them now that they need my help. In addition, I often fantasize about the possibility of raising my parents' education or desire for knowledge, so I could have shared my interests and passions with them.
  >>>
  
 You sound like a good person. : )
  
 you continue:
In fact, would it not be wonderful if we could accelerate the evolution not just of other human beings but also of other non-human primates, so they could have an intelligence comparable to ours but of a different kind? As humans we are always looking for possible extraterrestrial "alien" companions (angels in prescientific times, little green men now), but what if we could bring other terrestrial species, such as dolphins and primates, to a higher level of consciousness, so we could share thoughts, music, and art?
  >>>
  
 Upgrading animals would be a very cool thing, indeed.  Just thinking about this brought back fond memories of reading the "Uplift Saga" by David Brin.  My landlord has a chicken that I would like to see uplifted.  I say this mainly because she constantly follows me around like a faithful hound.  I'd like to take this as loyalty and natural affection on her part, but I realize she is just very patiently waiting for a handout.
   
 you continue:
Would the AGI not have similar yearnings to share existence with other "intelligent" beings, and even to upgrade them to be peers with His/Her/Its/Their own intelligence and consciousness?
  >>>
  
 I would say this is a very big "if."  But some say AGIs would only have the motivations which we program into them.
  
 you continue:
I think this is more likely than a crazy, primitive, selfish, destructive, nihilist AGI.
>>>

 Perhaps we have all seen the Terminator films (and the new TV series) just too many times!  And then again, maybe James Cameron was on to something.  
  
 John : )



