[ExI] Best case, was Hard Takeoff

Brent Allsop brent.allsop at canonizer.com
Mon Nov 29 01:27:12 UTC 2010


On 11/28/2010 4:45 PM, Keith Henson wrote:
> On Fri, Nov 26, 2010 at 5:00 AM,  Michael Anissimov
> <michaelanissimov at gmail.com>  wrote:
>>> On Fri, Nov 19, 2010 at 11:18 AM, Keith Henson<hkeithhenson at gmail.com> wrote:
>>
>>> Re these threads, I have not seen any ideas here that have not been
>>> considered for a *long* time on the sl4 list.
>>>
>>> Sorry.
>> So who won the argument?
> I was not aware that it was an argument.  In any case, "win the
> argument" in the sense of convincing others that your position is
> correct almost never happens on the net.
I think this should be rephrased as "almost never YET happens", because 
it does happen.  Sure, it's not going to happen over the time period of 
this discussion, but over 20 years?  And when it does happen, it's 
nice to know why, when, and who, and to have a definitive way to track 
all of it.  In other words, I believe it would be great to rigorously 
measure just how much consensus there is, how fast it is changing, and 
in what direction.

And of course, reality will eventually convert everyone, or falsify 
the wrong camps.  If an AI launches tomorrow and wipes out half of 
humanity before we overcome it, obviously those in the 'wrong' camp 
would be converted.  And obviously, those who have worried about 
unfriendly AI and spent time and effort on it during the last 10 years 
have completely wasted that time, at least for the foreseeable future.  
(i.e. more or less, for every dollar we waste instead of spending it on 
achieving immortal life, another person who would otherwise have made 
it into the immortal heavenly future will fail to make it, and could 
rot in the grave for the rest of eternity.)

>
>> If there's no consensus, then there's always plenty more to discuss.
>>
>> Contrary to consensus, we have people in the transhumanist community calling
>> us cultists and as deluded as fundamentalist Christians.
> That's funny since most of the world thinks the transhumanists are
> deluded cultists.
>

This is where it is critical to distinguish between the experts and the 
general population.  The experts will always be in the minority, and 
will almost always have a very different POV than the general 
population.  To the degree that you track this, and definitively show 
how much worse the non-experts do compared to the experts, people will 
obviously learn to trust the experts sooner.  Also, it helps if 
experts collaborate to speak with a unified voice, for at least as many 
of them as there are, on the moral issues they agree on - instead of 
always sounding no different than the rest of the loner crazy people.

Brent Allsop




