[extropy-chat] Timescale to Singularity

Samantha Atkins sjatkins at mac.com
Mon Jun 20 23:04:00 UTC 2005


On Jun 19, 2005, at 10:44 PM, The Avantguardian wrote:

>
>
> --- Samantha Atkins <sjatkins at mac.com> wrote:
>
>
>> So you don't consider yourself to be a biological
>> machine.  How so?
>> In what way are you not a machine?   How are the
>> ways you are not a
>> machine fundamentally unavailable to machines even
>> if they become
>> many orders of magnitude more intelligent than you
>> and I?
>>
>
> Oh, I am certain that the monkey that carries my
> wallet is a machine but I am far more.

If so, then why can this purported "far more" not have "its wallet  
carried" by a far more intelligent "monkey"?

>
>> In any contest where intelligence is the determining
>> factor that is
>> surely NOT irrelevant.  My point is that while the
>> advent of such AIs
>> may be a huge blow to our egos and self-image it
>> need not be some war
>> against what is.
>>
>
> The advent of AI will not be a blow to my ego as I
> don't see a need to have one. When one finds out who
> one really is, ego is pointless.

Then "you" cannot be threatened at all, and I see no point in your  
having started this thread of discussion as you did.

>
>
>
>> How would it necessarily be any threat to those?
>>
>
> An AI would not be a threat to my free will or self
> determination unless people try to make it one.

From what you say above you cannot be threatened or touched.  Only  
your "monkey" can.  So what is the big deal?

>
>> Please submit your proof that you and billions of
>> mostly even more
>> limited humans are capable of running this world
>> sanely much less the
>> increasingly complex world we are ever more quickly
>> moving into.
>>
>
>       The world strikes me as still pretty sane. We've
> got problems yes, but the ones that tell you that it
> is spiralling out of control are the ones that are
> trying to scare you into giving them yet more power
> over you. About the only problem with the world is a
> lack of vision.

Given current relatively fixed levels of individual and collective  
intelligence, it is quite clear that there is a limit to what that  
level of intelligence can effectively deal with.  It is also clear  
that the world is becoming more complex and moving at an accelerating  
rate of change.  It is also clear that only a major deadly disaster  
would decrease that rate of change.  Thus it seems clear that we are,  
or are headed toward, spiraling out of control.

>
>       Not that there are no visionaries in the world
> but that the voices of the visionaries are drowned out
> by the chanted mantras of the brainwashed masses. All
> to the tune set by those mediocre souls that champion
> the status quo. So power structures do not change
> while the very essence of the world changes beneath
> those structures.

Which restates the fact of our limitations in the face of what the  
world is, what it is becoming, and the speed at which it is doing so.

>       As you have obviously recognized this change has
> been manifested as increased complexity. But this
> complication of our lives is a choice. You speak as if
> we have no choice as to whether or not to complicate
> things further.

Short of a major calamity knocking us severely backward  
technologically, we have no choice.

> The truth is we could make our lives
> as simple as we choose to.

I think you jumped out of context here.


> And as far as proof that
> humanity can manage the affairs of humanity better
> than an AI, I have a deal for you. You show me a
> subroutine for love, compassion, and courage and I
> will show you proof that leadership takes more than
> intelligence no matter how super-human.
>

I am sorry to disturb your denial.  I have no idea how far human  
intelligence and effective group intelligence and rationality can be  
raised.  But I am certain that the current levels are proving grossly  
inadequate.  I would hate to see the same be the best we have to deal  
with, say, MNT.  Subroutine?  Check out the human implementation of  
such things.  These sloppy evolved biological implementations are not  
so terribly dependable.

>
>>> We have not abolished gods and kings
>>> only to be ruled by a unix box on steroids.
>>>
>>
>> Some day you will understand just how utterly
>> inappropriate that
>> rhetorical flourish is.
>>
>
> Let us hope I am wrong. My vision of transhumanity is
> that of us surpassing ourselves and guiding our own
> evolution in an enlightened fashion. Not building our
> replacements and handing them the keys to the kingdom.
>

Then we become less and less evolved chimps.  We become our own  
replacements.  Whether or not that Other has the imprint of you or me  
or a once-human, the days of humanity as we know it are numbered.

- samantha
