[extropy-chat] Tyranny in place
sjatkins at mac.com
Thu Oct 5 08:33:41 UTC 2006
On Oct 4, 2006, at 11:01 AM, Eliezer S. Yudkowsky wrote:
> Rafal Smigrodzki wrote:
>> How is the likelihood of the rise of an unfriendly AI impacted by
>> increased secrecy, and out-of-control spending on nefarious projects?
>> We know that the opportunity to exercise unbridled control over
>> others attracts the most vicious humans - will our guardians (now
>> less guarded by judges and citizens) therefore become more vicious
>> on average? What will the AI be like, if its makers are psychopaths
>> with budgets in a black project?
> Paperclips are paperclips, whether the AI is built by terrorists
> to create a Sharia enforcer, or eager idealistic researchers who don't
> understand the concept of "back off until you know what you're doing".
Below you say that research into AI matters and not much else does.
Sharia enforcers are not very likely to do AI research. An ultra-
secretive, full-bore paranoid US military is very likely to do
advanced AI research. Both would likely be deadly.
> No one gives a damn about AI research, and until that changes, changes
> in other government policies aren't going to affect anything one way
> or the other.
False. Changes in government policies could make it impossible to
even have the choice to do such research. Changes in government
policy could so impoverish the nation that funds dry up.
> Save your strength for things that (a) matter and (b) you can make a
> personal difference on.
Most of us can make a personal difference on anything we consider
important enough, provided we have the brains and determination to
tackle it and the persuasiveness to enroll others. So the above boils
down to determining what is really important.
> Some things are just not relevant to the Singularity, people. Get
> over it.
If certain types of mistakes are made widely enough there will not be
a Singularity. Get over that.