[ExI] Automated black-box-based system design of unsupervised hyperintelligent learning systems

Mike Dougherty msd001 at gmail.com
Mon Sep 19 23:13:46 UTC 2011


On Mon, Sep 19, 2011 at 6:36 PM, Kelly Anderson <kellycoinguy at gmail.com> wrote:
> On Mon, Sep 19, 2011 at 7:12 AM, Mike Dougherty <msd001 at gmail.com> wrote:
>> You can't build something smarter than yourself.
>
> I call bullshit on this one. Suppose that I can build something half
> as smart as I am, then run it at four times the speed I run. It would
> be twice as smart, even though I didn't know how to build something
> that was initially smarter than me...
>
> This is just a silly thing to say. How would you even define smarter?
> Hell, a four-function calculator is smarter than me at multiplication.
>
> Something smarter than me isn't the light-speed limit, for Thor's
> sake!!! It's not even the sound barrier.

I concede that your calculator is a higher-performance rules processor
for simple math.  Outside the trivial domain of input, transform,
output, I don't expect your calculator to spontaneously know how to
pour a bowl of cereal.  I don't have one of my own to test the theory,
but I expect that a 2-year-old could figure out the bowl of cereal
with only a few examples and hunger for motivation.

I don't define intelligence as the ability to process a set of rules.
That would lead to clocks being "intelligent" about the passage of
time, or thermostats being "intelligent" about the temperature of the
house.

I'll admit that my original blanket statement lacked a definition - I'm
glad so many pounced.  :)



More information about the extropy-chat mailing list