[ExI] alpha zero
Alejandro Dubrovsky
alito at organicrobot.com
Mon Dec 11 12:43:11 UTC 2017
On 11/12/17 11:40, Alejandro Dubrovsky wrote:
> On 11/12/17 02:42, John Clark wrote:
>>
>> On Fri, Dec 8, 2017 at 3:35 AM, Alejandro Dubrovsky
>> <alito at organicrobot.com <mailto:alito at organicrobot.com>> wrote:
>>
>> >
>> The 4TPUs were what was used during playing time.
>>
>> And Stockfish used 64 CPU cores.
>>
>
> Yes. It's hard to tell how the two compare. My guess is that the 4
> TPUs are doing way more FLOPS than the 64 cores. Just one of the
> latest TITAN Vs from Nvidia claims to do 110 TFLOPS. There's no Intel
> beast going much past 1 TFLOPS, and each one of those has 24 cores, so
> max around 3 TFLOPS across the 64 CPU cores. I think it's reasonable
> to assume that a TPU designed to run neural-network code can at least
> keep up with a GPU on neural-network-specific code. All of that is a
> bit misleading, though, since it's much easier to make close to full
> use of a CPU than of a GPU.
>
Last thing before I go back to lurking:
http://learningsys.org/nips17/assets/slides/dean-nips17.pdf says each
TPU v2 chip does 45 TFLOPS, and each unit consists of 4 chips, giving
180 TFLOPS per unit. Either way, quite a beast.
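For what it's worth, the back-of-envelope arithmetic above can be laid
out explicitly. This is only a sketch of peak vendor figures, not
measured throughput; the cores-per-package and per-package TFLOPS
numbers are the rough assumptions from the discussion, and whether
"4 TPUs" means 4 chips or 4 four-chip units is exactly the ambiguity
in question:

```python
# Peak-throughput comparison using the figures quoted in this thread.
# All values are vendor peak claims (assumptions), not benchmarks.

TPU_V2_CHIP_TFLOPS = 45    # per Dean's NIPS 2017 slides
CHIPS_PER_TPU_UNIT = 4     # one TPU v2 unit = 4 chips -> 180 TFLOPS/unit
ALPHAZERO_TPUS = 4         # TPUs used at play time

CPU_PEAK_TFLOPS = 1        # rough ceiling for one Intel package (assumed)
CORES_PER_PACKAGE = 24     # cores per package (assumed)
STOCKFISH_CORES = 64

# TPU side, under the two readings of "4 TPUs":
tpu_as_chips = ALPHAZERO_TPUS * TPU_V2_CHIP_TFLOPS                       # 180
tpu_as_units = ALPHAZERO_TPUS * CHIPS_PER_TPU_UNIT * TPU_V2_CHIP_TFLOPS  # 720

# CPU side: 64 cores spread over ~2.7 packages at ~1 TFLOPS each.
cpu_peak = (STOCKFISH_CORES / CORES_PER_PACKAGE) * CPU_PEAK_TFLOPS

print(f"TPUs: {tpu_as_chips} TFLOPS (as chips) or {tpu_as_units} (as units)")
print(f"CPUs: ~{cpu_peak:.1f} TFLOPS peak")
```

Either reading puts the TPU side around two orders of magnitude above
the CPU side on paper, which is the point being made, with the caveat
already noted that CPUs are far easier to keep busy.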