[ExI] singularity summit on foxnews

Richard Loosemore rpwl at lightlink.com
Thu Sep 13 17:40:47 UTC 2007


Samantha Atkins wrote:
> Brent,
> 
> Calling those that point out the dangers of AGI "fear mongers" 
> immediately denigrates their position.  That is not a reasonable 
> beginning to eliciting thoughtful discussion.   Please start again with 
> a less biased attitude.
> 
> - samantha

No, what he said was not "biased," because the people he described as 
fear mongers are not neutrally and with due diligence "pointing out the 
dangers of AI," as you choose (with some bias, maybe?) to phrase it; 
rather, they are exaggerating the dangers in an (apparent) attempt to 
attract attention to themselves.

As you know, I pointed out to the SIAI-related community that there were 
other possible approaches to the whole enterprise of Artificial 
Intelligence research, and that some of those other approaches hold out 
the possibility of building extremely stable systems that do not suffer 
from the many problems they so loudly complain about.

What was their response to this suggestion?  Were they eager to 
investigate this possibility?  Did they devote some time to having a 
public debate about the issues?  Did they welcome the possibility that 
things might not be as dire as they thought?

No.  What they did was to mount a childish series of venomous, ad 
hominem attacks against both the ideas and the person who suggested 
them.  These attacks were led by Eliezer Yudkowsky himself.  They did 
not have the slightest interest in the issue itself.

They are self-serving fear mongers.



Richard Loosemore


