[ExI] Google’s Go Victory Is Just a Glimpse of How Powerful AI Will Be
Brian Atkins
brian at posthuman.com
Sun Jan 31 21:52:37 UTC 2016
On 1/31/2016 1:04 PM, Anders Sandberg wrote:
>
> More practically I think the Wired article gets things right: this is a big deal
> commercially.
Right. I see an interesting possible parallel with Bitcoin mining hardware,
which obviously has a large economic incentive behind it. Like Deep Learning
tech, it started out as a CPU algorithm and eventually moved to GPUs. But then,
within a short period of 3 to 4 years, it went through a brief FPGA phase and
on to sustained generations of custom ASIC designs that have now pretty much
caught up with the leading edge of Moore's Law (16nm designs being deployed).
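As a rough illustration of the size of that jump, here is a back-of-envelope
Python sketch comparing hashes-per-joule across the Bitcoin hardware
generations. The figures are order-of-magnitude ballpark numbers I've plugged
in for illustration, not measurements:

    # Rough efficiency comparison across Bitcoin mining hardware
    # generations. All numbers are ballpark, order-of-magnitude
    # assumptions for illustration only, not measured data.
    generations = [
        ("CPU  (~2009)", 10e6 / 100),    # ~10 MH/s at ~100 W
        ("GPU  (~2010)", 400e6 / 200),   # ~400 MH/s at ~200 W
        ("FPGA (~2011)", 800e6 / 40),    # ~800 MH/s at ~40 W
        ("ASIC (~2015)", 4.7e12 / 1300), # ~4.7 TH/s at ~1.3 kW
    ]

    baseline = generations[0][1]  # CPU as the reference point
    for name, h_per_joule in generations:
        print(f"{name}: {h_per_joule:.2g} H/J "
              f"({h_per_joule / baseline:.0f}x over CPU)")

Even with these crude numbers, that works out to roughly four to five orders
of magnitude in power efficiency from CPU to ASIC.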
So I would expect that something like 3 orders of magnitude of power-efficiency
improvement could occur before 2020 as Deep Learning ASICs start to deploy and
get quickly iterated, with further orders of magnitude of total Deep Learning
computing capacity on top of that from simply deploying more hardware overall.
Bitcoin mining just recently crossed the exahash/s level; some of that came
from ASIC improvements, but a lot came from simply deploying more chips. The
Deep Learning takeoff could run on an even quicker timeline than Bitcoin's,
since the companies involved in Deep Learning are much larger and better
funded.
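To put numbers on that guess, a minimal compound-growth sketch (the 1000x
target and the 4-year window are just my assumptions from the paragraph
above):

    # How fast would efficiency have to improve to gain 3 orders of
    # magnitude by 2020? Pure arithmetic on the assumptions above.
    target = 1000.0   # 3 orders of magnitude
    years = 4         # roughly 2016 -> 2020
    per_year = target ** (1.0 / years)
    print(f"Required sustained gain: ~{per_year:.1f}x per year")  # ~5.6x

    # For scale, the exahash level the network just crossed:
    print(f"1 EH/s = {1e18:.0e} hashes per second")

A sustained ~5.6x per year is aggressive, but of the same general flavor as
the Bitcoin ASIC ramp described above.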
Large disclaimer: I have no technical knowledge of whether there is some major
difference between Bitcoin mining's hashing algorithm and Deep Learning's
typical algorithms that would make DL workloads significantly harder to
translate into efficient ASICs.