[ExI] MAX MORE in Second Life yesterday

hkhenson hkhenson at rogers.com
Fri Jun 13 06:01:31 UTC 2008


At 06:35 PM 6/12/2008, Michael Anissimov wrote:

>On Mon, Jun 9, 2008 at 3:21 PM, hkhenson 
><hkhenson at rogers.com> wrote:
>
>Sorry Natasha, this "distinction" isn't one at all.  It makes no
>difference if the events making up what is called the singularity
>take place over hours, days or years.  The end result is still the
>same, humans as they are known today are no longer significant in
>shaping the world.
>
>I have to agree with Natasha and disagree with you on this... does 
>it make no difference if it takes two years or ten years to graduate 
>college?  Get promoted at a job?  Get to the end of the line at the 
>DMV?  Launch a superintelligence?

No matter how long these take, there is a watershed event at the end: 
a degree, a new job title, a driver's license.  Unless you die in 
line, of course.

Same with AI.  Some of the events that the future will see as marking 
the singularity are happening now.  As I have stated before, it's 
been decades since an unaided human could design a VLSI chip.  And 
consider CGI.  Are these tools?  Well, yes.  Are the tools making 
decisions?  Again yes, millions of decisions.  Are these tools on the 
march to sentience?  If you have a good argument why they are not, 
please state it.  Think of the economic value of being able to 
discuss a chip or a scene with a design computer.

>Also, some people seem to be under the mistaken impression that the 
>Singularity is necessarily a worldwide event that touches 
>everyone.  As initially defined, it meant smarter-than-human 
>intelligence.  So you could have a smarter-than-human intelligence 
>in Antarctica that just sits around and has no impact on the world 
>whatsoever.  Under the initial, Vingean, most useful definition of 
>the Singularity, that would constitute one, but under the new, 
>messy, overbroad, Kurzweilian definition, it wouldn't.

I am surprised you would even consider the singularity in terms of 
geography.  If a smarter-than-human AI existed anywhere within light 
hours of the net it would have huge effects unless it was blocked 
from communicating.  If it was blocked, it could be co-located with 
MAE West and have no impact.

Blocking an AI from communication would be pointless 
economically.  Do you doubt the value of a response email from an AI 
that had access to all human knowledge and the ability to sort out 
what was needed to solve some problem?  Besides, it would be suicidal 
to block an AI if the AI had human-type emotional 
motivations.  Having been locked up in solitary confinement with very 
limited communications recently, I can state that you *really* don't 
want to do that to something smarter than humans.  It's bad enough to 
lock up an engineer.

http://www.kuro5hin.org/story/2007/10/30/18253/301

Keith

PS.  Being locked up and recovering from it (recovery not likely to 
ever be complete--I detest California now) added a year to finding a 
way to make solar power satellites a possibly viable solution to the 
energy crisis.  Models of failing to solve energy problems show world 
population falling by 100 million a 
year.  http://www.drmillslmu.com/peakoil.htm

PPS.  I really do appreciate that Extropians and related fellow 
travelers tried: donations, petitions, and thousands of phone 
calls.  Shame it didn't work.



