[ExI] Survival (was: elections again)

Samantha Atkins sjatkins at mac.com
Tue Jan 1 09:39:29 UTC 2008


On Dec 31, 2007, at 12:13 PM, Harvey Newstrom wrote:

> On Monday 31 December 2007 13:40, Eugen Leitl wrote:
>> On Mon, Dec 31, 2007 at 01:11:17PM -0500, Harvey Newstrom wrote:
>>> Then I want to be the cutest pet ever!  Or else the stealthiest
>>> scavenger rat.
>>
>> Don't we all! But it's not obvious artificial beings will conserve
>> specific traits we equipped them with initially (assuming we could
>> do that at all, which is not obvious) across many generations (not
>> necessarily long in terms of wallclock time). Or that they will keep
>> environments nicely temperate, full of breathable gases, and allow
>> us to grow or synthesize food.
>
> I assume that such super machines would outgrow the need to adapt
> their environment.  They would be functional in virtually any
> environment.  So they might not have any need to rework the existing
> environments.

Not all environments would be equally conducive to such a machine's
highest desired functioning.  Great capacity doesn't mean it is totally
self-contained and self-sufficient.

>>> Even given your scenarios, we have a lot of choices on how our
>>> subjugation is going to occur.
>>
>> How much are cockroaches worth on the Wall Street job market?  Do
>> they make good quants?
>
> Cockroaches have no influence on Wall Street.  But they have almost
> total control over their own nests and societies.  Sure, we wipe
> them out where they are in the way.  But where they do exist, humans
> have virtually no influence on them.  I doubt most cockroaches even
> know that humans exist.

This assumes that the needs and desires of AGIs, crossed with their
capabilities, will leave ample room for humans to exist.  We aren't
nearly as difficult to eradicate, even accidentally, as cockroaches
are.  We are much more fragile, with more needs.


>> Not any time soon. But, eventually. We might not see it (heck, what
>> is another 40-50 years), but our children could very well, and their
>> children's children almost certainly (unless they're too busy
>> fighting in the Thunderdome, of course).
>
> I think it is possible, but unlikely, that our children will see
> this.  It all assumes that a self-evolving AI will suddenly evolve
> quickly.  Evolution is a slow, random process that uses brute force
> to solve problems.

But we aren't talking about brute force once a self-improving AGI
exists.  There need be nothing akin to normal evolution about it.

> Growing smarter is not a simple brute-force search.

It is unlikely to employ much in the way of brute-force search.

> Even a super-smart AI won't instantly have god-like powers.

Well, what qualities do we consider god-like, and how do we know
exactly how much smarts it takes to obtain some of those powers?  Not
instantly, no, but I doubt it will take a self-improving AGI as much
as a human generation to be able to do things that are, to us,
decidedly "god-like".

> They will have to perform slow physical experiments in the real
> world of physics to discover or build faster communications,
> transportation, and resource utilization.

Much of the most important work of self-improving intelligence is
internal and does not require many physical-world steps.  Once the AGI
has optimized itself on its existing substrate, it can look into
upgrading its physical components.  I doubt it will need to figure out
new transportation methods or invent faster communications until it is
already extremely advanced.

> They also will have to build factories to produce future hardware
> upgrades.  These macro-scale physical processes are slow and easily
> disrupted.  It is not clear to me that even a super-intelligent AI
> can quickly or easily accomplish anything that we really want to
> stop.

One of the first external research priorities will likely be MNT
(molecular nanotechnology).  With MNT it will not need conventional
factories.  The benefits of MNT will be too great for humans to want
to stop it.  For that matter, it will very early on be such a boon
that it will easily find patrons and protectors.

- samantha


