[ExI] How could you ever support an AGI?

John Grigg possiblepaths2050 at gmail.com
Tue Mar 4 21:31:21 UTC 2008


giovanni santost wrote:
Even if there were a sudden creation of AGI (I think a gradual coming into
being of AGI and its integration with us is the more realistic scenario), it
would be less likely to desire to destroy us than we were to want to destroy
our parents when we were born (or even when we reached the teenage stage).
>>>

You are badly anthropomorphizing the AGI.  It will most likely not have the
same biological drives/wiring that you and I have.  Where is Eliezer
Yudkowsky when we need him? lol  I think the "whole gradual coming into
being of AGI combined with the integration of us into it" is actually the
very unlikely scenario.  Pure AGI development will most likely progress
faster than the machine/biological interfaces that you imagine.

you continue:
I had parents who were not particularly intellectual or interested in my
aspirations (even if supportive), and I never desired to eliminate them; in
fact I have the opposite desire, to take care of them now that they need my
help. In addition, I often fantasize about the possibility of raising my
parents' education or desire for knowledge, so that I could have shared my
interests and passions with them.
>>>

You sound like a good person. : )

you continue:
In fact, would it not be wonderful if we could accelerate the evolution of
not just other human beings but also other non-human primates, so they could
have an intelligence comparable to ours but of a different kind? As humans we
are always looking for possible extraterrestrial "alien" companions (angels
in prescientific times, little green men now), but what if we could bring
other terrestrial species, such as dolphins and primates, to a higher level
of consciousness, so we could share thoughts, music and art?
>>>

Upgrading animals would be a very cool thing, indeed.  Just thinking about
this brought back fond memories of reading the "Uplift Saga" by David Brin.
My landlord has a chicken that I would like to see uplifted.  I say this
mainly because she constantly follows me around like a faithful hound.  I'd
like to take this as loyalty and natural affection on her part, but I
realize that she is just very patiently waiting for a handout.

you continue:
Would not the AGI have similar yearnings to share existence with other
"intelligent" beings, and even to upgrade them to be peers with
His/Her/Its/Their own intelligence and consciousness?
>>>

I would say this is a very big "if."  But some say an AGI would only have
the motivations we program into it.

you continue:
I think this is more likely than a crazy, primitive, selfish, destructive,
nihilist AGI.
>>>

Perhaps we have all seen the Terminator films (and the new TV series) just
too many times!  And then again, maybe James Cameron was on to something.

John : )