[ExI] Leading AI Scientists Warn AI Could Escape Control at Any Moment

Adrian Tymes atymes at gmail.com
Sat Sep 21 15:55:35 UTC 2024


AI systems are already being used maliciously.  There is no plausible way
to prevent this entirely, but there are means of countering it.  It would
help if groups like this would get out of the way and let those
countermeasures be developed.

On Sat, Sep 21, 2024 at 11:51 AM BillK via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Leading AI Scientists Warn AI Could Escape Control at Any Moment
> "Loss of human control or malicious use of these AI systems could lead
> to catastrophic outcomes."
> Sep 21, 2024, by Noor Al-Sibai.
>
> <https://futurism.com/the-byte/ai-safety-expert-warning-loss-control>
> Quotes:
> "Rapid advances in artificial intelligence systems’ capabilities are
> pushing humanity closer to a world where AI meets and surpasses human
> intelligence," begins the statement from International Dialogues on AI
> Safety (IDAIS), a cross-cultural consortium of scientists intent on
> mitigating AI risks.
> "Experts agree these AI systems are likely to be developed in the
> coming decades, with many of them believing they will arrive
> imminently," the IDAIS statement continues. "Loss of human control or
> malicious use of these AI systems could lead to catastrophic outcomes
> for all of humanity."
> ------------------
>
> Well, I suppose keeping control of AGI is a good idea in theory, but
> it faces a lot of opposition.
> Do they really think they can control something that greatly surpasses
> human intelligence?
> Other AI experts argue that too much regulation will slow down
> development and let other countries leap ahead, so they are opposing
> restrictive regulations.
> Then there are the military uses of AI to consider. Weaponising AI is
> seen as an absolute necessity. Every nation wants to be the first to
> develop far superior weapons.
>
> To me, it looks like AGI is so powerful that every nation will decide
> to say, "Damn the torpedoes, full speed ahead!".
>
> BillK
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>

