[ExI] Google’s Go Victory Is Just a Glimpse of How Powerful AI Will Be
Anders Sandberg
anders at aleph.se
Mon Feb 1 01:49:39 UTC 2016
Good point. There seems to be a fair bit of interest in DL hardware, and
I think we will see ASICs as applications become standardized:
https://www.quora.com/Is-there-any-specialized-hardware-for-deep-learning-algorithms
It would be a one-time performance increment, but, as you say, perhaps
three orders of magnitude. The most obvious uses will likely be on the
application (inference) side rather than on the more expensive training
side, but I think dedicated ASICs could boost both.
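As a quick sanity check on why the application side is the obvious
first target, here is a back-of-envelope sketch in Python. Every number
in it (training run size, energy per query, query volume) is an
illustrative assumption, not a measurement:

    # Back-of-envelope on the training vs. serving energy split.
    # Every number here is an illustrative assumption.
    J_PER_KWH = 3.6e6

    # Assumed one-off training run: 100 GPUs x 250 W for two weeks.
    train_kwh = 100 * 250 * 14 * 24 * 3600 / J_PER_KWH

    # Assumed recurring serving cost: 0.25 J per query, 1e9 queries/day.
    serve_kwh_per_day = 0.25 * 1e9 / J_PER_KWH

    asic_gain = 1e3  # the hypothesized one-time ~3 orders of magnitude

    print("training (one-off): %.0f kWh" % train_kwh)          # ~8400
    print("serving (per day):  %.1f kWh" % serve_kwh_per_day)  # ~69.4
    print("days until serving exceeds training: %.0f"
          % (train_kwh / serve_kwh_per_day))                   # ~121
    print("serving on ASICs:   %.3f kWh/day"
          % (serve_kwh_per_day / asic_gain))                   # ~0.069

Under these made-up numbers, the recurring serving cost overtakes the
one-off training cost within about four months, so a fixed per-query
efficiency gain compounds with volume in a way a training-side gain
does not.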
On 2016-01-31 21:52, Brian Atkins wrote:
> On 1/31/2016 1:04 PM, Anders Sandberg wrote:
>>
>> More practically, I think the Wired article gets things right: this
>> is a big deal commercially.
>
> Right. I see an interesting possible parallel with Bitcoin mining
> hardware, which obviously has a large economic incentive behind it.
> Like Deep Learning, it started out as a CPU algorithm and eventually
> moved to GPUs. Then, within a short period of three to four years, it
> went through a brief FPGA phase and on to sustained generations of
> custom ASIC designs that have now pretty much caught up with the
> state of the art of Moore's Law (16 nm designs being deployed).
>
> So I would expect that something like three orders of magnitude of
> power-efficiency improvement could occur before 2020 as Deep Learning
> ASICs start to deploy and are rapidly iterated on, with further
> orders of magnitude of total Deep Learning computing capability on
> top of that from simply deploying more hardware overall. Bitcoin
> mining just recently crossed the exahash/s level... some of that is
> due to ASIC improvements, but a lot of it is due to simply more chips
> being deployed. The Deep Learning takeoff could run on an even
> quicker timeline than Bitcoin's, since the companies involved in Deep
> Learning are much larger and better funded.
>
> Large disclaimer: I have no technical knowledge of whether there is
> some major difference between Bitcoin mining's hashing algorithm and
> Deep Learning's typical algorithms that would make the latter
> significantly harder to translate into efficient ASICs.
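One rough way to frame that difference is arithmetic intensity: how
many operations a workload performs per byte of data it has to move.
A minimal sketch in Python, where every size and operation count is an
illustrative assumption on my part:

    # Rough arithmetic intensity (ops per byte moved) of a dense
    # neural-net layer vs. Bitcoin's SHA-256d. Counts are illustrative.

    # Dense layer y = W @ x with a 4096 x 4096 float32 weight matrix at
    # batch size 1: each weight is fetched once and used once per query.
    M, N = 4096, 4096
    matmul_ops = 2 * M * N       # one multiply and one add per weight
    weight_bytes = 4 * M * N     # weights stream in from memory
    print("matmul  ops/byte: %.1f" % (matmul_ops / weight_bytes))  # 0.5

    # SHA-256d on an 80-byte block header: 2 hashes x 64 rounds at
    # roughly 100 ALU ops per round; the state stays in registers.
    sha_ops = 2 * 64 * 100
    header_bytes = 80
    print("sha256d ops/byte: %.0f" % (sha_ops / header_bytes))     # 160

If those rough numbers are anywhere near right, mining is almost
purely compute-bound, while batch-1 inference is memory-bandwidth
bound; a DL ASIC would have to win through on-chip weight storage and
batching rather than raw ALU count, which is a different (though
hardly impossible) design problem.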
--
Anders Sandberg
Future of Humanity Institute
Oxford Martin School
Oxford University