[ExI] Yudkowsky in Time on AI Open Letter.

Giovanni Santostasi gsantostasi at gmail.com
Fri Mar 31 09:18:11 UTC 2023


Right,
And if we make an AI that is misaligned, then maybe we do deserve to be
taken out.
Kidding, but I'm also serious. I trust intelligence == good.
Giovanni

On Thu, Mar 30, 2023 at 1:54 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
>
> On Thu, Mar 30, 2023, 2:48 PM Darin Sunley via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/
>>
>> We live in a timeline where Eliezer Yudkowsky just got published in Time
>> magazine responding to a proposal to halt or at least drastically curtail
>> AI research due to existential risk fears.
>>
>> Without commenting on the arguments on either side or their qualities,
>> can I just say how f*cking BONKERS that is?!
>>
>> This is the sort of thing that damages my already very put upon and
>> rapidly deteriorating suspension of disbelief.
>>
>> If you had sent 25-years-ago-me the single sentence "In 2023, Eliezer
>> Yudkowsky will get published in Time magazine responding to a proposal to
>> halt or at least drastically curtail AI research due to existential risk
>> fears," I would probably have concluded I was already in a simulation.
>>
>> And I'm not certain I would have been wrong.
>>
>
> It is a sign of the times that these conversations are now reaching these
> outlets.
>
> I think "alignment" is generally insoluble, because each next higher level
> of AI faces its own "alignment problem" for the next smarter AI. How can
> we, at level 0, ensure that our solution for level 1 continues on through
> levels 2-99?
>
> Moreover, presuming alignment can be solved presumes that our existing
> values are correct and that no greater intelligence will ever disagree
> with them or find a higher truth. So either our values are correct and we
> don't need to worry about alignment, or they are incorrect, and a later,
> greater intelligence will correct them.
>
> Jason