[ExI] alpha zero

Dylan Distasio interzone at gmail.com
Fri Dec 8 14:42:12 UTC 2017


Thanks for clarifying the TPU counts; I should have read the paper more
carefully.  That changes what I said earlier a great deal in terms of the
computing horsepower used for training and game simulation.  They actually
used a tremendous amount of computing power that is not easily
available to the general populace.  This doesn't take away from the
accomplishment, but it does explain how training was accomplished in
less than half a day.

On Fri, Dec 8, 2017 at 3:35 AM, Alejandro Dubrovsky <alito at organicrobot.com>
wrote:

> The 4 TPUs were what was used at playing time. During training, they say
> it used 64 second-generation TPUs for training the neural network itself,
> i.e. the gradient computation, backprop, etc. It also used another
> 5,000 first-generation TPUs to generate the self-play games. Both of those
> figures are from page 4 of their paper. I've got no idea of the computing
> power of those TPUs.
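The split described above, a big pool of chips generating self-play games and a much smaller pool consuming them for gradient updates, is essentially a producer/consumer loop. A minimal toy sketch of that shape (the game, the scalar "network", and all names below are invented stand-ins, not anything from the paper):

```python
import random

def self_play_game(weights, moves=10):
    """Actor role: play one toy game against itself, recording examples.

    A real actor would run MCTS guided by the network; here we just
    pick random moves so the pipeline shape stays visible."""
    examples = []
    state = 0
    for _ in range(moves):
        move = random.choice([-1, 1])
        examples.append((state, move))
        state += move
    outcome = 1 if state > 0 else -1  # toy win/loss label for the whole game
    return [(s, m, outcome) for s, m in examples]

def train_step(weights, batch, lr=0.01):
    """Learner role: nudge a single scalar weight toward recent outcomes."""
    grad = sum(outcome for _, _, outcome in batch) / len(batch)
    return weights + lr * grad

# Per the paper, the actor side ran on ~5,000 TPUs and the learner side
# on 64; here both roles run in one sequential loop for illustration.
replay_buffer = []
weights = 0.0
for _ in range(100):
    replay_buffer.extend(self_play_game(weights))
    batch = random.sample(replay_buffer, min(32, len(replay_buffer)))
    weights = train_step(weights, batch)
```

The reason the two pools are so differently sized is that game generation is embarrassingly parallel, while the gradient updates on a single set of weights are not.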
>
> BTW, while this is, to me, the biggest news in the history of chess, it
> follows naturally from their AlphaGo Zero paper published in Nature a whole
> month and a half ago: https://www.nature.com/articles/nature24270
>
> I don't think there's anything sinister in the whole affair, except for
> the usual DeepMind showboating (and the usual question of why they denied
> Stockfish its opening book and gave it only 1 GB of hash table memory). I
> expect people to independently replicate this, albeit at a slower pace due
> to limited resources, within the next couple of years.
>
>
> On 08/12/17 07:10, Dylan Distasio wrote:
>
>> The paper describes in detail how they did it and how long it took.
>> Again, I see no reason to question any of it.  It's an evolutionary advance
>> from their work on Go, and it uses established deep learning and
>> reinforcement learning techniques on powerful hardware (although not
>> insanely powerful; 4 TPUs is not unobtainium:
>> https://www.blog.google/topics/google-cloud/google-cloud-offer-tpus-machine-learning/).
>>
>>
>> On Thu, Dec 7, 2017 at 3:01 PM, Tomaz Kristan <protokol2020 at gmail.com> wrote:
>>
>>     Like spike, I also think something hasn't been told, or at least
>>     something hasn't been emphasized as it should be.
>>
>>     But if it's not today, it will be in a year or two. Doesn't really
>>     matter. (What does matter is that the Singularity is closer than
>>     ever.)
>>
>>     On Thu, Dec 7, 2017 at 8:34 PM, Dylan Distasio <interzone at gmail.com> wrote:
>>
>>         I think you would be surprised at how little code is actually
>>         there if we could see the source code.  TensorFlow does a lot of
>>         heavy lifting in terms of abstracting neural nets, and the
>>         secret sauce in the recipe is the trained weights of the net.
>>         There is very little conventional code beyond setting up the
>>         structure of the neural net and the reinforcement objectives.
>>         Training on 77 million simulated games is the key.
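The "little conventional code" point is borne out by the training objective in the AlphaGo Zero / AlphaZero papers: a single network with a policy head p and a value head v is trained against the MCTS visit counts pi and the game outcome z with the one loss l = (z - v)^2 - pi . log(p) + c * ||theta||^2. A minimal sketch of that loss in plain Python (the numbers below are made-up placeholders, not values from the paper):

```python
import math

def alphazero_loss(p, v, pi, z, theta, c=1e-4):
    """AlphaZero-style combined loss for one training example."""
    value_loss = (z - v) ** 2                                  # MSE vs. game outcome
    policy_loss = -sum(t * math.log(q) for t, q in zip(pi, p)) # cross-entropy vs. MCTS policy
    l2 = c * sum(w * w for w in theta)                         # weight regularization
    return value_loss + policy_loss + l2

p = [0.7, 0.2, 0.1]    # network's move probabilities (placeholder)
pi = [0.6, 0.3, 0.1]   # MCTS visit-count distribution (placeholder)
loss = alphazero_loss(p, v=0.4, pi=pi, z=1.0, theta=[0.5, -0.5])
```

Everything game-specific the engine "knows" ends up in the weights theta; the hand-written code is little more than this objective plus the network architecture and the self-play loop.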
>>
>>         On Thu, Dec 7, 2017 at 2:15 PM, spike <spike66 at att.net> wrote:
>>
>>             *From:* spike [mailto:spike66 at att.net]
>>             *Sent:* Thursday, December 07, 2017 10:53 AM
>>             *To:* 'ExI chat list' <extropy-chat at lists.extropy.org>
>>             *Subject:* RE: [ExI] alpha zero
>>
>>
>>             *From:* extropy-chat
>>             [mailto:extropy-chat-bounces at lists.extropy.org] *On Behalf
>>             Of *spike
>>             *Subject:* Re: [ExI] alpha zero
>>
>>
>>             *From:* extropy-chat
>>             [mailto:extropy-chat-bounces at lists.extropy.org] *On Behalf
>>             Of *John Clark
>>
>>                 >>>…I still haven’t convinced myself it is true.  spike
>>
>>             >>…If this is a hoax, it's a very elaborate one, the likes of
>>             which we haven't seen since the cold fusion fiasco. …John K Clark
>>
>>              >…Ja, I could have clarified my doubt a bit.  I don’t
>>             suspect an intentional hoax, rather something they neglected
>>             to tell us…spike
>>
>>             There is something else.  If someone writes a chess-playing
>>             program today that defeats the best efforts of the best
>>             humans over 500 years, that has been done.  According to the
>>             article, what AlphaZero has done is defeat the collective
>>             programming efforts of armies of professional programmers
>>             working with huge monetary rewards for the past 75 years,
>>             which is far more impressive than overpowering humans at
>>             chess.
>>
>>             Something is still not being told here, methinks.
>>
>>             spike
>>
>>
>>             _______________________________________________
>>             extropy-chat mailing list
>>             extropy-chat at lists.extropy.org
>>             http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>
>>
>>
>>
>>
>>
>>
>>     --     https://protokol2020.wordpress.com/
>>
>>
>>
>>
>>
>>
>>
>

