[extropy-chat] Singularitarian versus singularity

Jef Allbright jef at jefallbright.net
Thu Dec 22 23:38:34 UTC 2005


Brett -

You are very close to, if not already crossing, the line into
personal attacks.

It is obvious that you don't understand Eliezer's writings; he
himself has repeatedly said they become outdated as his understanding
evolves. You seem to be so biased that you are blatantly
misinterpreting his (polite and accurate) statements to you in this
thread.

You've been considerably on the attack lately.  Maybe it's time to
take a careful look in the mirror and see what might be going on.

- Jef


On 12/22/05, Brett Paatsch <bpaatsch at bigpond.net.au> wrote:
> Eliezer S. Yudkowsky wrote:
>
> >>> I agree that political yammering is a failure mode, which is why
> >>> the SL4 list bans political discussion.
> >>
> >> To ban something is an intensely political act.
> >
> > Well, duh.  I'm not saying it's bad to commit acts that someone might
> > view as intensely political.
>
> What you *did* say is on record and is at the top of this post. That
> *you* agree and that *that* is why.
>
> You have form on this issue. You have tried to have political issues
> banned on this list before.
>
> >> To ban political yammering there has to be someone who decides
> >> what is and is not political yammering.
> >
> > The SL4 List Snipers are myself, Mitch Howe, and J. Andrew Rogers.
> >
> >> How do you conclude that *you* are especially qualified to decide
> >> what is political yammering?
> >
> > Anyone with common sense can do the job.  We don't try to
> > discriminate between good political posts and bad political posts, we
> > just ban it all.  That's not what the SL4 list is for.
>
> And how are we to suppose a work in progress such as yourself
> decides who has common sense, I wonder?  Pre-judice, maybe?
>
> >> It seems like a "friendly" AI with
> >> *your* values could only be a benevolent dictator at best.  And
> >> benevolent not as those that are ruled by it decide but as it decides
> >> using the values built in by you.
> >
> > Yeah, the same way an AI built by pre-Copernican scientists must
> > forever believe that the Sun orbits the Earth.  Unless the scientists
> > understand Bayes better than they understand Newtonian mechanics.
> >  AIs ain't tape recorders.
>
> This paragraph of yours is completely irrelevant, and utterly absurd.
>
> > http://singinst.org/friendly/collective-volition.html
>
> This is a link to a work in progress, Collective Volition - one author -
> Eliezer Yudkowsky.
>
> How is this link anything other than an attempt to divert attention
> from your faux pas?
>
> I have some very serious doubts about the aims of the Singularity
> Institute as I've understood them, but in all other areas of discussion
> you exhibit such good sense that I have set them aside.
>
> I cannot see how an AI built with your values could be friendly,
> Eliezer. Nor do I see that you have enough common sense to know
> what you do not know: "all political yammering is a failure mode".
> You just make an assumption and bang ahead on the basis of
> reckless self-belief.
>
