[ExI] Yes, the Singularity is the greatest threat to humanity

Eugen Leitl eugen at leitl.org
Mon Jan 17 12:34:48 UTC 2011


On Mon, Jan 17, 2011 at 12:43:35PM +0100, Stefano Vaj wrote:

> I am still persuaded that the crux of the matter remains a less superficial
> consideration of concepts such as "intelligence" or "friendliness".  I

To be able to build friendly, you must first be able to define friendly.
Notice that it's a relative metric, both with regard to the entity
and to the state at time t.

What is friendly today is not friendly tomorrow. What is friendly to me,
a god, is not friendly to you, a mere human.

> suspect that at any level of computing power, "motivation" would only emerge
> if a deliberate effort is made to emulate human (or at least biological)
> evolutionary artifacts such as sense of identity, survival instinct, etc.,

When you bootstrap de novo by co-evolution in a virtual environment and
aim for a very high fitness target, the result is extremely unlikely to
be a good team player to us meat puppets.

> which would be certainly interesting, albeit probably much less crucial to
> their performance and flexibility than one may think.
> 
> This in turn means that AGIs in that sense will be for all practical
> purposes *uploaded humans*, be they modelled on actual individuals or on a

Uploaded humans are only initially friendly, of course. Which is why
it is a stop-gap measure, one that can be extended, but not indefinitely.
The point is to ensure that as few as possible fall off the bus, which
will be departing shortly.

> patchwork thereof, neither more nor less "friendly" than their models would
> be or evolve to be.
> 
> Now, both stupid and "intelligent" computers can obviously be dangerous. If
> we postulate that intelligent ones would be more so because of their ability
> to exhibit "motivations", we should however keep in mind that such feature
> may easily be indistinguishably supplied, fyborg-style, by a silicon system
> of equivalent power plus a carbon-based human being with a keyboard.

This is not possible.
 
> Now, are we really in the business of transhumanism to advocate for the
> enforcement of a global, public control of tech progress in the field of
> information technology aimed at slowing down its already glacial pace? I

It's about a consensus in a pantheon of demigods to temporarily postpone
their ascension to Mount Olympus. At that point the progress is not really
glacial. (Unless you're one of said demigods, of course.)

> think there are already more than enough people who are only too happy to
> preach for the adoption of such measures...

We're an endangered species. We will need protection, or we will go
completely extinct. We're the mountain gorillas of the future.

-- 
Eugen* Leitl leitl http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE


