[ExI] alpha zero

Dylan Distasio interzone at gmail.com
Thu Dec 14 12:53:02 UTC 2017


More on DeepMind and its peculiar gameplay from a human POV:

https://www.technologyreview.com/s/609736/alpha-zeros-alien-chess-shows-the-power-and-the-peculiarity-of-ai/

On Dec 11, 2017 8:02 AM, "Alejandro Dubrovsky" <alito at organicrobot.com>
wrote:

> On 11/12/17 11:40, Alejandro Dubrovsky wrote:
>
>> On 11/12/17 02:42, John Clark wrote:
>>
>>>
>>> On Fri, Dec 8, 2017 at 3:35 AM, Alejandro Dubrovsky <
>>> alito at organicrobot.com> wrote:
>>>
>>>     > The 4 TPUs were what was used during playing time.
>>>
>>> And Stockfish used 64 CPU cores.
>>>
>>>
>> Yes. It's hard to tell how the two compare. My guess is that the 4 TPUs
>> are doing far more FLOPS than the 64 cores. Just one of Nvidia's latest
>> TITAN Vs claims to do 110 TFLOPS. There's no Intel chip going much past
>> 1 TFLOPS, and each of those has 24 cores, so the 64 CPU cores max out at
>> around 3 TFLOPS. I think the assumption that a TPU designed to run
>> neural-network code can at least keep up with a GPU on neural-network
>> specific code is reasonable. All of that is a bit misleading, though,
>> since it's much easier to make close to full use of a CPU than of a GPU.
>>
> Last thing before I go back to lurking:
> http://learningsys.org/nips17/assets/slides/dean-nips17.pdf says each TPU v2
> chip is 45 TFLOPS and each unit consists of 4 chips, giving 180 TFLOPS.
> Either way, quite a beast.
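The back-of-envelope arithmetic quoted above can be sketched out explicitly. All figures are the thread's rough peak estimates (not benchmarks), and the variable names are illustrative:

```python
# Peak-FLOPS figures as quoted in the thread (rough estimates, not measurements).
TITAN_V_TFLOPS = 110          # Nvidia's claimed peak for one TITAN V GPU
INTEL_CHIP_TFLOPS = 1         # assumed peak for one high-end Intel chip
CORES_PER_INTEL_CHIP = 24     # cores per chip in the estimate above
TPU_V2_CHIP_TFLOPS = 45       # per the Dean NIPS'17 slides
CHIPS_PER_TPU_UNIT = 4        # one TPU v2 unit = 4 chips

# Stockfish's 64 cores span 64/24 ≈ 2.7 chips, so ≈ 2.7 TFLOPS peak
# (the thread rounds this to "around 3 TF").
stockfish_tflops = 64 / CORES_PER_INTEL_CHIP * INTEL_CHIP_TFLOPS

# One TPU v2 unit: 4 chips x 45 TFLOPS = 180 TFLOPS.
tpu_unit_tflops = CHIPS_PER_TPU_UNIT * TPU_V2_CHIP_TFLOPS

print(f"Stockfish side: ~{stockfish_tflops:.1f} TFLOPS peak")
print(f"TPU v2 unit:    {tpu_unit_tflops} TFLOPS peak")
print(f"Ratio: ~{tpu_unit_tflops / stockfish_tflops:.0f}x")
```

As the thread notes, this ratio overstates the practical gap, since a CPU is far easier to drive near its peak than a GPU or TPU.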
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>

