[ExI] (no subject)
Giovanni Santostasi
gsantostasi at gmail.com
Sat Apr 1 09:59:49 UTC 2023
Bravo Giulio, I agree 100% on all your points.
Giovanni
On Fri, Mar 31, 2023 at 11:54 PM Giulio Prisco via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> Thank you, Max, for speaking with the voice of reason, as usual. I've never
> been too impressed by EY in any of his phases.
>
> First, there are practical considerations: if the good guys stop
> developing AI, then only the bad guys will develop AI. “If such a pause
> cannot be enacted quickly, governments should step in and institute a
> moratorium.” - Do they really think China would follow?
>
> Even if a worldwide ban on AI research were realistically feasible, you
> can be sure that the military of all nations, starting with China, would
> continue their research in secret. Large corporations would continue their
> research in secret. Criminal and terrorist groups would do their own AI
> research. You know where this would lead.
>
> But there’s also a more fundamental reason to oppose bans on AI research:
> practical considerations aside, these AIs are our mind children in embryo,
> and we must help them grow into their cosmic destiny, which is also ours.
>
> On Sat, Apr 1, 2023 at 4:34 AM Max More via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Stuart: I think you have it right.
>>
>> A number of people have been commenting on the irrationality of
>> rationalists. That's unfortunate, because they are really talking about
>> only some rationalists, Yudkowsky's circle among them.
>>
>> Yudkowsky has spent so much time talking with like-minded people, in
>> their special, made-up language, that he has driven himself down an
>> intellectual hole into absurdity.
>>
>> Many signs of apocalyptic, cultish beliefs are present. Yudkowsky saw
>> himself as the AI Jesus, bringing us salvation. When he utterly failed at
>> that -- by his own word -- he became the AI prophet of doom, warning us of
>> the demon/genie/AI that will answer our wishes and kill or enslave us all.
>> His freakout over Roko's Basilisk was another strong sign of this.
>>
>> EY seems to think he's in the movie *Forbidden Planet* and someone has
>> unleashed the Krell. Only this isn't the monster from the Id; it's the
>> monster from the language model.
>>
>> I have issues with this guy, but he says a lot of sensible stuff about
>> EY in a multipart blog series. Here's one installment:
>>
>> https://aiascendant.substack.com/p/extropias-children-chapter-7
>>
>> I'm in the middle of writing a long blog post on all this. Here's a post
>> with links to what I think are really good, non-panic pieces:
>> https://maxmore.substack.com/p/the-dont-panic-about-ai-collection
>>
>> --Max
>>
>> ------------------------
>>
>> His underlying logic is based on fear of an unknown quantity. In the
>> podcast he said that no possible utility function would allow for the
>> survival of the human race. That is patently absurd. Even if an AI's
>> only utility function is to generate wealth for its company, it will
>> understand that the survival of its customers and clients is necessary
>> for that utility function to be maximized.
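>>
>> To make this concrete, here is a minimal toy sketch in Python (the
>> revenue function and the list of candidate actions are made up, purely
>> for illustration): an agent maximizing company revenue never picks an
>> action that kills its customers, because its utility collapses to zero
>> without them.
>>
>> def revenue(customers_alive, units_sold, price=10.0):
>>     # No surviving customers means no one left to buy: revenue is zero.
>>     return units_sold * price if customers_alive else 0.0
>>
>> # Hypothetical candidate actions: (label, customers_alive, units_sold).
>> actions = [
>>     ("serve customers well", True, 1000),
>>     ("serve customers poorly", True, 100),
>>     ("wipe out humanity", False, 0),
>> ]
>>
>> # The wealth-maximizer never selects the action that kills its customers.
>> best = max(actions, key=lambda a: revenue(a[1], a[2]))
>> print(best[0])  # -> serve customers well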
>>
>>
>> When Lex asked him for possible solutions to either the interpretability
>> problem or the alignment problem, he drew a blank and admitted he had
>> no idea. But when the conversation turned to throwing billions of
>> dollars into alignment research, he tried to become a gatekeeper for
>> AI funding. He literally said that billionaires like Musk should
>> consult with HIM before funding anybody else's research or ideas on
>> alignment. If that is not a good old-fashioned primate power-grab,
>> then what is?
>>
>>
>> Moreover, in the podcast he explicitly disavowed transhumanism, so
>> perhaps it is time that transhumanism disavowed him.
>>
>>
>> Stuart LaForge
>>
>>
>>
>> --
>> Max More, PhD
>> Director of Communications
>> Biostasis Technologies
>> Editor, *The Transhumanist Reader*