[ExI] Hard Takeoff

Samantha Atkins sjatkins at mac.com
Thu Nov 18 06:02:27 UTC 2010


On Nov 17, 2010, at 11:00 AM, spike wrote:

>> ... On Behalf Of BillK
> 
>> ...How something can be designed to be 'Friendly' without emotions or
> caring is a mystery to me...BillK

Who says that it will be without emotions?  Well, first we would have to reach agreement on what emotions are and are not.  One part of human emotion seems baked into the hardware: lightning-fast situational evaluations and reactions.  Another part seems amenable to training, therapy and so on.  It acts sort of like a semi-automated fast evaluation or general feeling tone, one that is somewhat programmed by patterns of thought and feeling repeatedly associated with something in the environment or with one's self-image.

So an AGI would not have the first.  It would, however, have the ability to program itself with fast evaluation functions, since everything it is composed of is capable of that.  Unlike us, it will likely have much better conscious awareness of doing so, and more ability to debug such functions.
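
By way of illustration only, here is a toy sketch in Python of what a self-programmed, debuggable fast evaluation function might look like.  The names and the update rule are my own assumptions, not anyone's actual AGI design; the point is just that the "feeling tone" becomes an explicit, inspectable data structure rather than opaque wetware.

from collections import defaultdict

class FastAppraiser:
    """Learns a feeling tone for stimuli from repeated association."""

    def __init__(self, learning_rate=0.1):
        self.learning_rate = learning_rate
        self.valence = defaultdict(float)   # stimulus -> tone in [-1.0, 1.0]
        self.history = defaultdict(list)    # audit trail for debugging

    def associate(self, stimulus, outcome):
        """Nudge the cached appraisal toward an observed outcome in [-1, 1].
        Repetition is what entrenches the evaluation."""
        old = self.valence[stimulus]
        self.valence[stimulus] = old + self.learning_rate * (outcome - old)
        self.history[stimulus].append((outcome, self.valence[stimulus]))

    def react(self, stimulus):
        """Fast path: return the cached appraisal with no deliberation."""
        return self.valence[stimulus]

    def debug(self, stimulus):
        """The step humans mostly lack: inspect why a reaction exists,
        so an entrenched but mistaken association can be corrected."""
        return {"current_valence": self.valence[stimulus],
                "trained_by": list(self.history[stimulus])}

appraiser = FastAppraiser()
for _ in range(20):
    appraiser.associate("loud noise", -0.8)   # repeated negative pairing
print(appraiser.react("loud noise"))          # instant negative tone
print(appraiser.debug("loud noise"))          # full audit of its origin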

If I am right, I consider this aspect of an AGI's nature an advantage over humans as far as acting consistently and rationally in accordance with its values.  If it is of value to the AGI to be helpful, or at least not harmful, to humans, then it will do a much more reliable job of it.

I think what the statement really implies is that it is not rational for a much-smarter-than-human AGI to be 'friendly' to humans, and that we therefore appeal to irrational aspects to get 'friendliness'.  If this is indeed the case, then there is nothing that can be done about it that is consistent with the facts of reality.  I don't believe you can pull the wool over an AGI's perception, or coerce it, for very long.

I also doubt very much that you would want anything like normal human drives and emotions in your AGI.  How many humans have ever lived who would be great, or even safe, to have around if they thought six or more orders of magnitude faster than any other human, and at much greater depth?  What exactly would a non-human with human emotions and drives be able to do with them?

> 
> BillK, this is only one of many mysteries inherent in the notion of AI.  We
> know how our emotional systems work, sort of.  But we do not know how a
> machine based emotional system might work.  Actually even this is a comical
> overstatement.  We don't really know how our emotional systems work.
> 

Part of human design is that we automatically distrust and fear any human, much less anything truly alien, that we cannot predict because we cannot model its nature.  We have no theory of mind that covers it, no mirroring expectation of how it might perceive things or act or react.  Combine that with its being very powerful, and perhaps superseding us economically and creatively, and you have the recipe for deep fear.


>> ...Did you know that more than one million blokes have been dumped by their
> girlfriends - because of their obsession with computer games?
> <http://swns.com/one-million-men-dumped-because-of-computer-game-obsessions-
> 151620.html>
> 

Most of my girlfriends have been much worse game addicts than I am.

> OK, suppose we get computer based intelligence.  Then our computer game will
> dump our asses because it thinks we have an obsession with our girlfriends.

You think an AGI is going to enjoy playing down to your level in a computer game? :)

> Then without a girl or a computer, we have absolutely nothing to do.  We need
> to develop an AI that is not only friendly, but is tolerant of our
> mistresses.  That daunting software task makes friendly AI look simple.

Is an AGI embedded in a computer game going to think of a mere human romantically, or be jealous when you don't play with it?

- s



