[ExI] Did Hugo de Garis leave the field?

Eugen Leitl eugen at leitl.org
Sun Apr 24 11:53:45 UTC 2011


On Wed, Apr 20, 2011 at 11:02:17PM -0600, Kelly Anderson wrote:

> There seems to be a philosophical position on the part of some that
> you can't design intelligence that is more intelligent than yourself.

Artificial intelligence is hard. We know that Darwinian design
can produce it, and we have plenty of educated guesses to prime
the evolutionary pump: animals.
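
For illustration, here is a minimal sketch of what priming the pump
could look like in code: an evolutionary loop seeded with an existing
design rather than random noise. The genome encoding, toy fitness
function, and parameters are all illustrative assumptions, not a claim
about how a real system would be built.

import random

GENOME_LEN = 32
TARGET = [1] * GENOME_LEN          # stand-in for "a design that works"

def fitness(genome):
    # Toy fitness: agreement with the target design.
    return sum(1 for g, t in zip(genome, TARGET) if g == t)

def mutate(genome, rate=0.05):
    # Flip each gene with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(seed, generations=200, pop_size=50):
    # Start the population near the seed, then iterate variation + selection.
    population = [mutate(seed) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 5]       # truncation selection
        population = [mutate(random.choice(parents)) for _ in range(pop_size)]
    return max(population, key=fitness)

# Unprimed: start from noise.  Primed: start near an existing design,
# the analogue of borrowing from animal nervous systems.
random_seed = [random.randint(0, 1) for _ in range(GENOME_LEN)]
primed_seed = [1] * (GENOME_LEN // 2) + [0] * (GENOME_LEN // 2)

print("from random seed:", fitness(evolve(random_seed)))
print("from primed seed:", fitness(evolve(primed_seed)))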

Evolutionary search is just the easiest approach, particularly given
the chronic hubris in the AI camp. I never understood the
arrogance of the early AI people; it was just not obvious that
it would be easy. But after getting so many bloody noses they
still seem to think it's easy. Weird.

> I think that is just a ridiculous position. Just having an
> intelligence with the same structure as ours, but on a better
> substrate, or larger than a physical skull would result in higher

It does seem that we're metabolically constrained, so a larger
cortex should result in more processing capacity. But the most
interesting thing is that we've got some pretty extreme outliers
who manage extremely impressive feats within the same metabolic
and more or less the same genetic envelope, so it's worthwhile to look
at how they differ from us bog-standard humans. It might be the synaptic
density and the fibre connectivity as well as molecular-scale
features, but right now nobody really knows. Both postmortem
and in vivo instrumentation are difficult.

> intelligence rather trivially. Add a computer type memory without the
> fuzziness of human memory and you would get better processing. I have

Ah, this is not obvious. There are some extreme cases where people
are cursed with a truly photographic memory, to the point that their
processing ability is completely overwhelmed. Have you ever
had the problem of seeing too many options simultaneously and
being unable to pick the right path?

> never understood the argument that the brain is insufficiently bright
> to understand itself, since we work in groups... it just confuses me

I presume you're familiar with "The Mythical Man-Month" and its more
recent ilk.

> that anyone would take this position. It seems like saying one person
> could never design a mission to land men on the moon. That may be
> true, but it is also entirely irrelevant to whether we can accomplish
> it as a species.

Artificial intelligence is a lot harder than putting men on the moon.
And it will have a *slightly* higher impact than that.

-- 
Eugen* Leitl leitl http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE


