[ExI] Superhuman Poker

Dylan Distasio interzone at gmail.com
Thu Jul 18 16:57:32 UTC 2019


On Thu, Jul 18, 2019 at 12:39 PM John Clark <johnkclark at gmail.com> wrote:

> On Thu, Jul 18, 2019 at 12:17 PM Dylan Distasio <interzone at gmail.com>
> wrote:
>
> If the behavior of the code changed between iterations (and it has) then
> *something* must have changed, if it wasn't the code then what on earth was
> it? I insist that the code HAS changed between iterations, and it changed
> in a way that the human programmer did not predict and does not understand.
>
>
I don't know if we're arguing semantics over the definition of code, but I
can assure you it does NOT change between iterations.  If you write a deep
learning model in Python (or any other language), the Python code IS
unchanged between iterations.  The model itself is stored as data: numbers
in multidimensional arrays.  On the forward pass, linear algebra propagates
the input through the network using the current weight at each node
(starting from default weights).  Backpropagation then uses calculus (the
chain rule) to feed the error of that iteration back through the network,
computing how the loss depends on each weight, and gradient descent adjusts
the weights to move toward a minimum of the loss, ideally the global
minimum, although that can be a challenge depending on the landscape, as it
may get stuck in local minima.  Rinse and repeat.  Throughout this process,
the code itself does not change; only the data in the weight arrays does.
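
To make that concrete, here is a minimal sketch of such a training loop,
assuming a tiny one-hidden-layer NumPy network on made-up XOR-style data
(none of this is from the thread, just an illustration).  Note that every
line of code is identical on every pass through the loop; the only thing
that changes is the contents of the weight arrays W1 and W2:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dataset: XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# The "model" is just data: two weight matrices with default (random) values.
W1 = rng.normal(size=(2, 8))
W2 = rng.normal(size=(8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # Forward pass: linear algebra pushes the data through the network.
    h = sigmoid(X @ W1)      # hidden activations
    out = sigmoid(h @ W2)    # network output

    # Backward pass: calculus (the chain rule) computes the gradient of the
    # squared error with respect to each weight.
    err = out - y
    grad_out = err * out * (1 - out)
    grad_W2 = h.T @ grad_out
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    grad_W1 = X.T @ grad_h

    # Gradient descent: nudge the weights downhill.  The *data* in W1 and W2
    # changes; the source code itself is identical on every iteration.
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

print(out.round(3))  # predictions drift toward [0, 1, 1, 0] as weights change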

Some AI researchers have argued that these learned weights should be thought
of as replacing traditional code; see Andrej Karpathy's "Software 2.0":
https://medium.com/@karpathy/software-2-0-a64152b37c35