[extropy-chat] Singularity Blues
mlorrey at yahoo.com
Wed Apr 6 17:39:51 UTC 2005
--- Edmund Schaefer <edmund.schaefer at gmail.com> wrote:
> > The thing you are forgetting is that it isn't going to be an
> > immediate event. We will be easing our way over the edge for many
> > years before it happens (it is already starting), so by the time it
> > gets really strange from our current point of view, we will be
> > quite inured to radical change.
> Once superhuman intelligences exist, the power differential will be
> too vast for the future-shock endurance, flexibility, or any other
> psychological trait of the human population to have any bearing.
> Whether we can "ride the wave" of advanced technology matters only
> if we're imagining that lots of people are going to gain access to
> it at once, which we can't safely assume. All the power goes to the
> superintelligences that invent the stuff, and that's what we should
> prepare for.
I think that, given the history of environmental stewardship improving
over time, particularly at the hands of wealthy individuals and
purpose-driven organizations, the trend makes it clear that AI will,
provided we do not fear and attack them in a species-versus-species
conflict, treat humanity as a species needing protection. I believe you
are applying to AI a stereotype of the worst sort of inhuman
callousness, which there is no basis to expect, but which is typical of
human emotional disdain for those who are not emotionally motivated.
Vice-Chair, 2nd District, Libertarian Party of NH
"Necessity is the plea for every infringement of human freedom.
It is the argument of tyrants; it is the creed of slaves."
-William Pitt (1759-1806)