[ExI] Eliezer at SXSW March 2025
BillK
pharos at gmail.com
Wed May 28 08:44:54 UTC 2025
The prophet of Silicon Valley doom: 'We must stop, we are not ready for AI'
Eliezer Yudkowsky, a pioneer in AI research, has called for halting AI
development before it is too late. According to him, humanity lacks the
ability to control technology that will eventually surpass human
intelligence: 'It can evolve beyond our control.'
Amir Bogen <https://www.ynetnews.com/topics/Amir_Bogen> 03.17.25
<https://www.ynetnews.com/business/article/b1sorkin1x>
Quotes:
Yudkowsky also criticized public misconceptions about AI, particularly the
belief that AI will remain confined to an advisory role. "People think no
one would be foolish enough to create an AI that acts independently," he
said. "They also assume there’s no need for AGI when specialized AI can
handle tasks like language translation or medical diagnostics."
According to Yudkowsky, these assumptions have already been debunked. "Not
only do we already have AGI systems delivering impressive results," he
said, "but some are also capable of making decisions and acting
independently. We must stop everything. We are not ready. We do not have
the technological capability to design a super intelligent AI that is
polite, obedient and aligned with human intentions — and we are nowhere
close to achieving that."
Drawing on his knowledge of AI and his readings in history, Yudkowsky
believes AI poses a greater threat than any political conflict. "The human
brain is flawed. We are plagued by self-doubt and the fear of failure," he
explained. "Machines, by contrast, pursue their objectives relentlessly.
When the Stockfish chess engine plays against a human, it does not
hesitate. AI systems will operate similarly — without second-guessing or
considering the consequences."
---------------------
"Danger, Will Robinson!" indeed.
BillK