[ExI] MAX MORE in Second Life yesterday
natasha at natasha.cc
Fri Jun 13 13:24:37 UTC 2008
> At 06:35 PM 6/12/2008, Michael Anissimov wrote:
> >On Mon, Jun 9, 2008 at 3:21 PM, hkhenson <hkhenson at rogers.com> wrote:
> >Sorry Natasha, this "distinction" isn't one at all. It makes no
> >difference if the events making up what is called the singularity
> >take place over hours, days or years. The end result is still the
> >same, humans as they are known today are no longer significant in
> >shaping the world.
> >I have to agree with Natasha and disagree with you on this... does
> >it make no difference if it takes two years or ten years to graduate
> >college? Get promoted at a job? Get to the end of the line at the
> >DMV? Launch a superintelligence?
> No matter how long these take, there is a watershed event at the end,
> a degree, a new job title, a car license. Unless you die in the line
> of course.
This thread has become apples and oranges. First, Keith misunderstood my
post by assuming that I do not think a watershed of events will occur.
Since the term watershed is a popular means of describing events that occur
as the cumulative result of any number of trends that become forces, it is
a given that watersheds occur. They have to occur in complex adaptive
systems. Further, the claim that "humans as they are known today are no
longer significant in shaping the world" is assumptive, based on
single-track thinking that SI will occur outside of humans and that humans
will not be a part of the SI advance.
My particular theory is that the SI will be a combined human and connective
intelligence - thus the posthuman, or whatever anyone wants to call it. I
use posthuman because the term is familiar in academic literature. I do not
think that SIs will sprout in a major Singularity wherein humans run to the
caves and villages as they are being burned down around them. This tale
stems from a Buddhist story which Carl Sagan notes in his book The
Demon-Haunted World: Science as a Candle in the Dark. It is a lovely
example of the shortsighted preoccupation with oneself and the near at
hand, which results in a daunting demise. Be that as it may, it is not how
I personally see the future.
My earnest approach to the future is the combined effort of the human brain
and technological innovation in enhancing humans to merge with SI through
stages of development, not one big event that occurs overnight.
Certainly, one could look back at this from the end point and say a sudden
event occurred. That is not my particular vantage because I am interested
in the process.
> >Also, some people seem to be under the mistaken impression that the
> >Singularity is necessarily a worldwide event that touches
> >everyone. As initially defined, it meant smarter-than-human
> >intelligence. So you could have a smarter-than-human intelligence
> >in Antarctica that just sits around and has no impact on the world
> >whatsoever. Under the initial, Vingean, most useful definition of
> >the Singularity, that would constitute one, but under the new,
> >messy, overbroad, Kurzweilian definition, it wouldn't.
You make a good point, Michael.
> I am surprised you would even consider the singularity in terms of
> geography. If a smarter-than-human AI existed anywhere within light
> hours of the net it would have huge effects unless it was blocked
> from communicating. If it was blocked, it could be co-located with
> MAE West and have no impact.
Again, apples and oranges. Michael is looking in from the perspective of
process and extending the seconds of time; Keith is looking back at an
event and contracting time. Both are meaningful perspectives.
BFA, MS, MPhil/PhD Candidate, Planetary Collegium
Faculty of Technology, School of Computers, Communication and Electronics
University of Plymouth, UK
Arts and Design - NBIC+ Convergence