[extropy-chat] Maths ability
Eliezer S. Yudkowsky
sentience at pobox.com
Sun Mar 5 02:56:11 UTC 2006
Robert Bradbury wrote:
>
> On 3/4/06, *Eliezer S. Yudkowsky* <sentience at pobox.com> wrote:
>
> ... It is theoretically possible that, as they will helpfully tell
> you, you've
> just been doing it wrong. But in all probability, you're right about
> the brain rewiring. A fast, powerful nonhuman intelligence is going to
> have to do some gradual, subtle neural tinkering before your mind wakes
> up to complex numbers. (I would not advise that you try doing it to
> yourself.)
>
> Of course, there are some, maybe including Ben, but almost certainly
> including myself, who would find it relatively worthless, if not
> completely repulsive, for a nonhuman "intelligence" to do such tinkering.
>
> Although you and others may find it unimaginable, having "the world, the
> universe and everything" handed to me on a "silver platter" is
> relatively worthless. I have personally experienced the middle road of
> the silver platter. I have more than the average person's awareness
> of the benefits and costs involved in the gold ones. Having worked
> fairly hard to get my hands on a silver one for a while was immensely
> more satisfying than having a dozen, or even a million of them handed to
> me.
>
> The beauty of chopping wood is that I can look at the pile when I am
> done and say "I did that". The beauty of watching someone chop wood
> better than I is in admiring the skill that they demonstrate in doing
> so. There is little satisfaction involved in picking up the phone and
> requesting that a cord of finely chopped wood be dropped off in the
> driveway in the morning. Presumably in a world where intelligence can
> be engineered into matter by the F(?)AI the wood would have known enough
> to assemble itself as a finely chopped cord in the driveway before I
> even picked up the phone.
I didn't say Ben needed an FAI to slowly, subtly rewire his brain in
such fashion as to contain predigested declarative knowledge of complex
numbers. I meant that Ben needed the math talent, the *ability to
learn*, and this would require rewiring his brain; or possibly just
changing Ben's temporal dynamics, the way his brain changes over time,
rather than surgically operating on his immediate data.
People who are naturally good at math may not be able to conceive that
someone could try hard, practice hard, and *never acquire the basic
talent*. But I am afraid that it is so. This is the problem I want to
fix. If people try hard to learn linear algebra, they deserve to make
progress on it - not to stare blankly at the problem with tears in their
eyes. And because Mother Nature is a vicious slut, people don't always
get what they deserve.
The human brain was never designed to be end-user-modifiable. Properly
adjusting the dynamics will probably require a huge number of tiny
tweaks that would be incredibly boring to a human surgeon, that require
considerable domain-specific neurobiological knowledge and
computational intelligence to make correctly, and that could have
negative consequences if gotten wrong. Therefore the operation should
be carried out by a fast, powerful intelligence incapable of boredom.
Hence the "AI midwife" strategy of human intelligence enhancement.
Humans, who are not end-user-modifiable, build an AI which is cleanly
designed for safe stable self-improvement, which AI then takes on the
far more complex task of upgrading biologically tangled humans.
> Nor will it be of much interest if we reach the point in the
> development of humanity when silver platters will be available for one
> and all for "free" but getting them is made artificially hard in an
> attempt to fool people into believing that they actually accomplished
> something in getting their hands on one.
Do you consider it "artificially hard" to do something for yourself that
in theory a superintelligence could have done for you, but which it
refuses to do directly for you or anyone, even though it did tweak your
brain dynamics in such fashion as to make it newly possible for you to
learn through sufficient effort to do it for yourself?
If that is your philosophy, I fear you may quickly run out of Fun.
Why is it that your ability to learn math in the first place, handed you
on a silver platter by natural selection - an optimization process which
you never chose, giving you gifts you could not create for yourself -
doesn't detract from the fun of actually learning?
What's the advantage of having your "natural" talents awarded in lottery
by the vicious slut Dame Nature; versus operating under the general and
fair Law that if you practice you *will* get better?
If I stay at the piano *long enough*, I ought to surpass Bach or Mozart.
But that's not the Law we live under right now. Now it's a question
of bloody talent.
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence