[extropy-chat] RE: Re: Intelligent Design and Irreducible Complexity

Spike spike66 at comcast.net
Mon Oct 4 06:16:12 UTC 2004


> Eliezer Yudkowsky 
> 
> Spike wrote:
> > 
> > If that is the case, then we have an example of
> > natural selection working at the group level...
> 
> Group selection may be the wrong word for this, since it's
> usually taken to imply a conflict between individual-level selection
> and group-level selection with the group selection pressure winning.
> What you're talking about is an isolated subpopulation undergoing
> genetic drift...
> -- 
> Eliezer S. Yudkowsky...

What we will eventually need to really understand this 
is a good computer simulation of evolution.  Without 
the simulation, we are merely armchair philosophers,
Greeks arguing over the number of teeth in the horse.

I had a professor in college who worked on the problem
of Prandtl-Meyer flow, which is used to explain why
the exhaust plumes of jet engines display the characteristic
diamond patterns when the pilot gets hard on the gas:

http://www.visi.com/~jweeks/aircraft/mig100.gif

In those days, the NASTRAN models didn't predict the
diamond patterns.  He got his PhD by tweaking up the
computer flow models until they correctly predicted the
diamonds, angles, conditions under which they would
appear, etc. 
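
For the curious, the relation that flow is named for is the
Prandtl-Meyer function, which gives the angle a supersonic stream
turns through in an isentropic expansion, one ingredient in
predicting those shock-cell diamonds.  Here is a minimal sketch of
it in Python, my own toy version rather than anything from his
thesis:

  import math

  def prandtl_meyer(mach, gamma=1.4):
      # Prandtl-Meyer turning angle in radians, defined for Mach > 1.
      # gamma is the ratio of specific heats (about 1.4 for air).
      if mach < 1.0:
          raise ValueError("defined for supersonic flow only")
      a = (gamma + 1.0) / (gamma - 1.0)
      return (math.sqrt(a) * math.atan(math.sqrt((mach**2 - 1.0) / a))
              - math.atan(math.sqrt(mach**2 - 1.0)))

  # Example: a Mach 2 airflow can expand through roughly 26 degrees.
  print(math.degrees(prandtl_meyer(2.0)))

A real plume model of course has to couple that to oblique-shock
relations and the nozzle pressure ratio; the function above is just
the textbook piece.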

I have a notion that we will understand group selection
vs. individual survival selection only when we can develop
the software to simulate evolution, and get it at least
as good as the diamond-pattern-predicting compressible-flow
computer models.  The biologists have done their 
thing.  Now for evolutionary theory to move forward, 
the computer guys need to step up to the plate.
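
To make that concrete, here is a toy sketch of the sort of thing I
mean, nothing like the full simulator, and every parameter below is
made up purely for illustration: a set of small, isolated demes
evolving under a simple Wright-Fisher-style model, where drift in a
small subpopulation can occasionally fix an allele that
individual-level selection is pushing against:

  import random

  def wright_fisher(pop_size, generations, p0, s, rng):
      # Track the frequency of allele A in one isolated deme.
      # pop_size: haploid individuals; p0: starting frequency of A;
      # s > 0 means A is disfavoured at the individual level.
      p = p0
      for _ in range(generations):
          # Selection: allele A carries a small fitness cost.
          w = p * (1.0 - s)
          p_sel = w / (w + (1.0 - p))
          # Drift: resample the next generation at random.
          count = sum(1 for _ in range(pop_size) if rng.random() < p_sel)
          p = count / pop_size
          if p in (0.0, 1.0):      # fixed or lost
              break
      return p

  rng = random.Random()
  # Twenty isolated demes of 25 individuals; allele A starts at 20%
  # frequency and carries a 1% individual-level cost.
  finals = [wright_fisher(25, 5000, 0.20, 0.01, rng) for _ in range(20)]
  fixed = sum(1 for p in finals if p == 1.0)
  lost = sum(1 for p in finals if p == 0.0)
  print(f"{fixed} demes fixed the costly allele, {lost} lost it,"
        f" {len(finals) - fixed - lost} still segregating")

Run it a few times: small demes will now and then fix the costly
allele purely by drift, which is the isolated-subpopulation effect
Eliezer is pointing at, and exactly the kind of thing a serious
simulator would need to get quantitatively right.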

As an aside, a sufficiently sophisticated simulation
of evolution, running on a sufficiently powerful 
computer or cluster of computers, should be able 
to predict a singularity.  Perhaps it will answer
some singularity questions I have been puzzling over
for years: is the Yudkowsky hard-takeoff model
the only possible singularity?  What would happen if
humanity somehow discovered the software needed to
create AI, in an alternate universe where there were
only 100 computers in the world?  What if there were
a billion slow computers, such as 286-vintage machines?
Or an M-brain: a quadrillion Pentium-class processors
separated by an average spacing of about a meter?
Are there other scenarios that make sense, such as
a saturated-feedback-loop response-damped singularity?  
How about an unknown mechanism kicking in, somehow causing 
an anti-singularity?  Could we have an oscillating AI
software battle for control going on inside the machines, 
of which computer users would be completely unaware?  
Would it matter if we somehow discovered uploading 
before the singularity?  Would a sufficiently sophisticated
simulation of evolution actually cause a singularity?
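
Just to put very rough numbers on those hardware scenarios (the
per-machine figures below are nothing but my own order-of-magnitude
guesses), the aggregate raw compute spans many orders of magnitude:

  # Crude back-of-envelope tally; every figure here is an assumption.
  scenarios = {
      "100 ordinary 2004-era PCs":       (100,  1e9),   # ~1e9 ops/sec each, guessed
      "a billion 286-vintage machines":  (1e9,  1e6),   # ~1e6 ops/sec each, guessed
      "M-brain of Pentium-class chips":  (1e15, 1e8),   # ~1e8 ops/sec each, guessed
  }
  for name, (count, ops_each) in scenarios.items():
      print(f"{name}: ~{count * ops_each:.0e} ops/sec aggregate")

And raw throughput ignores communication: light alone needs about 3
nanoseconds to cross each meter of spacing in the M-brain, so that
pile is far more loosely coupled than the single big number suggests.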

Perhaps the work of Eliezer and the SIAI can be 
viewed as a kind of evolution simulator that starts 
in the present and moves forward in time.

spike 



