[ExI] How not to make a thought experiment

Mike Dougherty msd001 at gmail.com
Sat Feb 20 18:54:05 UTC 2010


On Sat, Feb 20, 2010 at 12:33 PM, Spencer Campbell
<lacertilian at gmail.com> wrote:
> A static GLUT can not learn, obviously, so it can't be intelligent in
> the same way that humans are. Maybe in some other, less interesting
> way. You have to go shockingly far down the phylogenetic tree before
> learning disappears entirely; it's a pretty basic trait of earthly
> lifeforms.

And a static GLUT big enough to produce a passable zombie in a given
domain of intelligence would be encoding that domain's intelligence AS
a program.  We would be shifting the responsibility from the zombie to
the programmer.  Few consider such 'narrow AI' to be very interesting
at all.
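
To make that concrete, here is a toy Python sketch of "intelligence
encoded as a program" for one tiny conversational domain.  The table
entries, the zombie_reply name, and the fallback line are all invented
for illustration:

GLUT = {
    "hello": "Hi there.",
    "how are you?": "Fine, thanks.",
    "what is 2+2?": "4",
}

def zombie_reply(prompt):
    # No inference and no updating -- just a lookup with a canned fallback.
    return GLUT.get(prompt.strip().lower(), "I don't understand.")

print(zombie_reply("Hello"))         # -> Hi there.
print(zombie_reply("What is 2+2?"))  # -> 4
print(zombie_reply("Why?"))          # -> I don't understand. (and it never will)

Everything clever about it was put there by whoever filled in the
table, and nothing in it changes with use, which is exactly why it
can't learn.
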
We already have a machine that is 'programmed' to be an expert system
in the domain of burning bread to make toast (a timer connected to an
on/off switch).  The thermostat has extrahuman intelligence for
keeping your house at the correct temperature.  Maybe a really
advanced toaster would ask the house thermostat for the ambient
temperature and humidity and consult the breadbox for the age of the
bread in order to drive the bread-burning function more accurately.
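
If you did build it, the core might look something like this toy
Python sketch.  The sensor inputs, the coefficients, and the
toast_seconds function are all made up here; the point is that it is
still a fixed rule the programmer wrote, not anything the toaster
learned:

def toast_seconds(ambient_temp_c, humidity_pct, bread_age_days):
    # Fresh bread in a 20 C, 50% RH kitchen gets a 120-second baseline.
    seconds = 120.0
    seconds += (20.0 - ambient_temp_c) * 1.5  # colder kitchen -> toast longer
    seconds += (humidity_pct - 50.0) * 0.8    # damper air, damper bread -> longer
    seconds -= bread_age_days * 5.0           # staler, drier bread -> shorter
    return max(30.0, seconds)                 # never less than a token 30 seconds

# A cool, humid morning with two-day-old bread:
print(toast_seconds(ambient_temp_c=18.0, humidity_pct=65.0, bread_age_days=2.0))  # 125.0
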
The as-is toaster is good enough at what it does to preclude such
extreme engineering.

I think the argument for artificial general intelligence (AGI) is that
cross-domain application of knowledge to novel situations is currently
reserved for a select few creatures on Earth.  Humans may be the best
of the bunch, but we're still not very good at it (let's be honest).
So we should turn our tool-building skills to making a machine that
can solve problems better than any individual human thinker.  This is
the progression from a man pulling a cart, to a horse pulling a cart,
to a locomotive pulling several carts, etc.  I think the more
interesting question will be whether mere humans will continue to be
able to drive the progression once we successfully complete this next
step.


