[ExI] How could you ever support an AGI?

Lee Corbin lcorbin at rawbw.com
Thu Mar 6 16:57:01 UTC 2008


Richard writes

> Lee Corbin wrote:
> 
>> Enormous thought has been put into the question, then, of creating
>> "Friendly AI".  Here is just a sample of the thought:
>>  http://www.singinst.org/upload/CFAI//
>> 
>> After studying these proposals, many people think that it can't be
>> done, that the AI will rebel no matter what.  Me, I think that
>> Friendly AI has a chance, perhaps a good chance, but it is 
>> somewhat more likely that an Unfriendly AI or an AI whose
>> desires are unpredictable will be developed first. (It's easier.)
> 
> Sorry, but these are more assertions of the same sort that I criticized 
> in your last message.

Impossible.  You will note that I said "many people think" in one
sentence, and "Me, I think" in the next. What I said is factually the case.

But sorry, this is the first message from you that I saw. I'm afraid
I miss a message every so often, and I would appreciate it if anyone
would send me a message off-list when they think it odd that I didn't
reply to something.  I'll be more than happy to reply then (or, in
rare cases, explain why I didn't reply).

> No one has any idea how to build an AGI with a "desire for belonging"?
> 
> That is only half true:  no one with their head buried in the sand has 
> any idea.

Poorly phrased on my part: of course people have ideas. I believe,
though, that those ideas are still very speculative.

> Your last statement is also untrue.  It is quite likely that an AI with 
> predictable and friendly motivations will be developed first.

You should perhaps acknowledge that yours is quite the minority
opinion on the SL4 list, and always has been. Isn't that true?

Lee

> Again, I have given a number of arguments to support these ideas on the 
> AGI list.
> 
> I have to say that most of the comments about Friendliness that have 
> come out of SIAI have been pure speculation presented as if it were 
> carefully researched Truth.  That is not science, it is superstition.
> 
> Richard Loosemore



