[ExI] alpha zero
John Clark
johnkclark at gmail.com
Thu Dec 7 18:29:46 UTC 2017
On Thu, Dec 7, 2017 at 12:07 PM, Dylan Distasio <interzone at gmail.com> wrote:
> >
> This type of program still needs to be trained on a very specific problem,
>
It trained itself: it started with nothing but the basic rules and in one day
was able to beat the best in the world, human or machine. And it didn't do it
with just one problem, it did it with 3 different ones, Chess being the least
complex.
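For the curious, here is what "training itself with nothing but the basic
rules" looks like when you shrink it down to tic-tac-toe. This is only a toy
sketch of tabula-rasa self-play; the variable names, learning rule, and the
numbers 0.1, 0.2 and 20000 are my own illustrative assumptions, nothing like
DeepMind's actual system:

# A minimal sketch of tabula-rasa self-play learning: the agent is given
# nothing but the rules of tic-tac-toe and improves purely by playing itself.
import random
from collections import defaultdict

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),      # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),      # columns
         (0, 4, 8), (2, 4, 6)]                 # diagonals

def winner(board):
    for a, b, c in LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def key_after(board, move, player):
    b = list(board)
    b[move] = player
    return "".join(b)

values = defaultdict(float)   # learned value of a position, from X's point of view
ALPHA, EPSILON = 0.1, 0.2     # learning rate and exploration rate (assumed values)

def choose(board, player):
    moves = [i for i, s in enumerate(board) if s == " "]
    if random.random() < EPSILON:
        return random.choice(moves)            # explore a random legal move
    sign = 1 if player == "X" else -1          # O wants to minimize X's value
    return max(moves, key=lambda m: sign * values[key_after(board, m, player)])

def self_play_game():
    board, player, history = [" "] * 9, "X", []
    while True:
        move = choose(board, player)
        board[move] = player
        history.append("".join(board))
        w = winner(board)
        if w or " " not in board:
            return history, (1.0 if w == "X" else -1.0 if w == "O" else 0.0)
        player = "O" if player == "X" else "X"

# Train from scratch: every position seen is nudged toward the game's final outcome.
for _ in range(20000):
    history, outcome = self_play_game()
    for state in history:
        values[state] += ALPHA * (outcome - values[state])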
>
> there is no thought process going on behind it.
>
That is a strange statement.
If you can teach yourself to be the best in the world at some complex task
without "thought", then what's the point of "thought"? Who needs it?
>
> It's quite startling at first glance to think that an end goal of
> minimizing a loss function can generate so much razzle dazzle, but the math
> behind these systems is actually not that complex.
>
But we know for a fact that the recipe for a mind can't be very big; we must
have that master learning algorithm in us, so we can put an upper limit on its
size. The human genome contains only 3 billion base pairs, and since there are
4 bases each base can represent 2 bits; at 8 bits per byte that comes out to
750 meg. And all of that 750 meg certainly can not be used just for the master
learning algorithm, you've got to leave room for instructions on how to build
a human body as well as the brain hardware. So much of the information must be
wiring directions such as "wire up a neuron this way and then repeat that
procedure exactly the same way 42 billion times". And the 750 meg isn't even
efficiently coded, there is a ridiculous amount of redundancy in the human
genome. I would guess the master learning algorithm is less than a meg in
size, possibly a lot less.
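For anyone who wants to check the back-of-envelope arithmetic above, here it
is spelled out, under the assumption that each base pair carries at most 2
bits and nothing is compressed:

# Back-of-envelope check of the genome size estimate
base_pairs = 3_000_000_000          # roughly the size of the human genome
bits = base_pairs * 2               # 4 possible bases -> 2 bits per base pair
megabytes = bits / 8 / 1_000_000    # 8 bits per byte, 10^6 bytes per meg
print(megabytes)                    # 750.0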
John K Clark