[ExI] Elon Musk, Emad Mostaque, and other AI leaders sign open letter to 'Pause Giant AI Experiments'

Gordon Swobe gordon.swobe at gmail.com
Thu Mar 30 03:17:19 UTC 2023


On Wed, Mar 29, 2023 at 8:58 PM Will Steinberg via extropy-chat
<extropy-chat at lists.extropy.org> wrote:

> Man this letter is crazy.  Gaia is a massive machine that already exists
> and produces lots of things that could be useful to a machine.  Humans,
> too.  I think a rogue AI would enslave life rather than use us as atomic
> building blocks.  Maybe I'm wrong but it seems like a smart AI wouldn't
> destroy such a useful system.
>

I assume you mean Eliezer’s letter. It’s an opinion piece in Time magazine
online.

I agree he seems to be taking an extreme position. When did this happen? I
don’t recall him being such a doomer back in the early 2000s.

I even hear that he has a bet with Sam Altman that AI will end the world
by 2030. I wonder how he plans to collect if he wins.

-gts

> TIME Magazine, today March 29.
>> "Shut down all the large GPU clusters (the large computer farms where the
>> most powerful AIs are refined). Shut down all the large training runs. Put
>> a ceiling on how much computing power anyone is allowed to use in training
>> an AI system, and move it downward over the coming years to compensate for
>> more efficient training algorithms. No exceptions for anyone, including
>> governments and militaries. Make immediate multinational agreements to
>> prevent the prohibited activities from moving elsewhere. Track all GPUs
>> sold. If intelligence says that a country outside the agreement is building
>> a GPU cluster, be less scared of a shooting conflict between nations than
>> of the moratorium being violated; be willing to destroy a rogue datacenter
>> by airstrike."
>> -Eliezer Yudkowsky
>>
>> https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/
>>
>> On Wed, Mar 29, 2023 at 12:11 AM Gordon Swobe <gordon.swobe at gmail.com>
>> wrote:
>>
>>> I agree and am glad to see this development. As I have argued here,
>>> these language models literally have no idea what they are talking about.
>>> They have mastered the structures of language but have no grounding. They
>>> are blind software applications with no idea of the meanings of the words
>>> and sentences they generate. If they were human, we would call them
>>> sophists.
>>>
>>> From the letter:
>>>
>>> --
>>> Contemporary AI systems are now becoming human-competitive at general
>>> tasks,[3] and we must ask ourselves: Should we let machines flood our
>>> information channels with propaganda and untruth? Should we automate away
>>> all the jobs, including the fulfilling ones? Should we develop nonhuman
>>> minds that might eventually outnumber, outsmart, obsolete and replace us?
>>> Should we risk loss of control of our civilization? Such decisions must not
>>> be delegated to unelected tech leaders. Powerful AI systems should be
>>> developed only once we are confident that their effects will be positive
>>> and their risks will be manageable. This confidence must be well justified
>>> and increase with the magnitude of a system's potential effects. OpenAI's
>>> recent statement regarding artificial general intelligence, states that "At
>>> some point, it may be important to get independent review before starting
>>> to train future systems, and for the most advanced efforts to agree to
>>> limit the rate of growth of compute used for creating new models." We
>>> agree. That point is now.
>>>
>>>
>>> Therefore, we call on all AI labs to immediately pause for at least 6
>>> months the training of AI systems more powerful than GPT-4. This pause
>>> should be public and verifiable, and include all key actors. If such a
>>> pause cannot be enacted quickly, governments should step in and institute a
>>> moratorium.
>>> --
>>> https://twitter.com/SmokeAwayyy/status/1640906401408225280?s=20
>>>
>>> -gts
>>>