[ExI] How AGI superintelligence will happen

BillK pharos at gmail.com
Mon Oct 30 22:12:04 UTC 2023


I've just read a comment from an interview -
<https://thebulletin.org/2023/10/ai-godfather-yoshua-bengio-we-need-a-humanity-defense-organization/>

I hadn't really realised it before, but the comment is pretty obvious
once you read it.
Quote:
Geoff Hinton argued that digital computing technologies have
fundamental advantages over brains. In other words, even if we only
figure out the principles that are sufficient to explain most of our
intelligence and put that in machines, the machines would
automatically be smarter than us because of technical things like the
ability to read huge quantities of text and integrate that much faster
than the human could—like tens of thousands or millions of times
faster.
----------------

Former Google AI researcher Geoffrey Hinton received the Turing
Award—known as the “Nobel” of computing—for “conceptual and
engineering breakthroughs that have made deep neural networks a
critical component of computing.”

What Hinton is saying is that even if we only ever create basic human-level
machine intelligence, the inbuilt advantages that computers have
will make those machines much smarter than humans.

AGI might arrive rather unexpectedly...

BillK
