[extropy-chat] On difficult choices (was: Books: Harris; Religion and Reason)
m_j_geddes at yahoo.com.au
Thu Jan 12 05:19:10 UTC 2006
> The theory behind the Singularity Institute is that it's possible to
> *save the entire damn world* without killing people, pointing guns at
> people, telling people what to do, or any of the tribal-chief solutions
> that instantly pop into people's heads when they consider political
> problems. That's not idealism.
> History teaches us that the "difficult" choices, the obvious wrong ways
> to solve the problem, DON'T FRICKIN' WORK. Stalin broke plenty of eggs,
> but where are the omelets?
Hysterical nonsense. And I don't think Eli's 'Elrond and the ring'
analogy is valid either.
We *don't know* that there's any real danger from recursively
self-improving AI (and sorry, but 'Eli says so' doesn't count. Only
results published in an accredited academic journal do).
In fact, the more I've learned about AI, the more confident I am that
there's no danger. I would never have posted the things I did to SL4,
wta-talk and the Extropy list if I wasn't very very very very very very
very VERY confident that Eli is wrong. That's why I've been taking the
mickey out of him.
My claim is that AIs which aren't Friendly can't recursively
self-improve. All the unfriendly ones are limited, I think - that's my
theory, anyway. (Of course, even the limited unfriendly ones could
still do a fair bit of damage, I must concede - they just wouldn't be
world-threatening.)
I shall attempt to write a paper proving this at some point - though it
might take me several years to get it up to a standard that might
actually be accepted by an academic journal. If I ever can.
Of course, all my ideas *may* be total bullshit, in which case I'll be
the first to concede it.
"Till shade is gone, till water is gone, into the Shadow with teeth
bared, screaming defiance with the last breath, to spit in
Sightblinder's eye on the last day."