[extropy-chat] Timescale to Singularity

The Avantguardian avantguardian2020 at yahoo.com
Sun Jun 19 06:15:22 UTC 2005



--- Samantha Atkins <sjatkins at mac.com> wrote:

> In other words you not only don't really believe you
> would actually  
> be smarter, you actually believe you are doomed to
> or would choose a  
> fight to the death with something you believe will
> be smarter than  
> you.

No. You are reading too much into this. All I have
said is that I value my freedom over my life. And I
will not accept an AI in any more than an advisory
role in my life no matter how smart it is. There are
aspects of the human consciousness that I don't think
a machine can ever surpass no matter how "smart" it
is. When an AI paints the Sistine Chapel or "The
Scream", I will be impressed. I don't believe I am
doomed at all. Neither in a fight to the death, nor in
a game of go. I mean no more and no less. If some
entity whether it be a germ, beast, man, or machine
seeks to rob me of life or self-determination, then
let it come. I am ready. Let it match its will against
mine and let the universe decide. John Henry did not
beat the jackhammer but the whole point of the story
was that he ALMOST did and therefore COULD have. And
let no one forget that Kasparov did win a few games.  


> But you will pretend otherwise suspecting that
> will give you  
> more of a chance.

There is no pretense. Only emptiness and tranquility.
The race is not always to the swift, nor the battle to
the strong. I have the chance that I have. How that
probability function will collapse is determined as
equally by my mind and will as by my opponent's. That
my opponent might be a machine with a 2000 IQ is
irrelevant.

> Something being (a lot) more
> intelligent than you  
> does not necessarily mean you are obsolescent or at
> least not in any  
> way that will necessarily endanger anything other
> than your pride.

It is not about pride, it is about self-determination
and free will. I am happy to co-exist with an A.I.
that does not try to kill me or to control me, but it
would have to earn my trust. Should it prove itself to
me as a friend, I would even be inclined to do it
favors. But it goes no further than that.

> 
> I have no idea what you are talking about.

What part don't you understand? There are those
Singularitarians who would build an A.I. and try to
put it in charge of everything. I disagree strongly
with this. An AI does not have the right to be in
charge of anything more or less than itself, like any
other sentient being. We have not abolished gods and
kings only to be ruled by a Unix box on steroids.


The Avantguardian 
is 
Stuart LaForge
alt email: stuart"AT"ucla.edu

"The surest sign of intelligent life in the universe is that they haven't attempted to contact us." 
-Bill Watterson

