[ExI] alt man out

Adrian Tymes atymes at gmail.com
Mon Nov 20 20:26:41 UTC 2023


I do not believe the proposed name change makes sense, for two or three reasons.

1) AI is being, and will continue to be, commercialized.  There does not
seem to be any plausible way to stop this, so there appears to be
negative utility in debating it (as it takes attention and energy away from
the more useful question of how best to guide, encourage, and shape the
commercialization of AI).  Where methods to stop it have been proposed,
such as Eliezer's, the details are highly relevant to whether people would
want to support them - to take Eliezer's proposal again, quite a few who
might generally support non-commercialization of AI would back away from
that if, as Eliezer (possibly correctly) states, such drastic action is the
only way to achieve that goal.  It is like debating whether the Moon should
orbit the Earth.

2) Commercialization of AI and AI being an existential threat are not
necessarily opposing beliefs.  It is entirely possible that the best, or
only realistic, way to deal with whatever existential threats AI brings is
to commercialize it: letting it get out into the hands of the masses so the
general public figures out how to deal with it before it could end
humanity, thereby preventing it from ending humanity.  As I understand it,
your system relies on the inherent assumption that camps are mutually
opposed.

3) I don't know if this is the case, but if you do the name change, will
all the existing statements and supporters automatically be switched to the
new names, even if those statements and supporters might be irrelevant or
indifferent to - or even in opposition to - their newly assigned camp, with
the supporter or the person who made the statement never bothering to come
back onto the platform to update and correct it?  If so, this would be a
third reason - though presumably one easily dealt with just by closing the
old debate and opening a new one.

On Mon, Nov 20, 2023 at 10:49 AM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> I believe it is critically important that we find a way for our morals to
> keep up with our technology, especially on existential issues like General
> AI.  If we continue on this polarizing, hierarchical, win/lose,
> survival-of-the-fittest, fight-to-the-death path, it could have grave
> consequences for us all.  All the wars and destructive polarization of
> society are ever clearer proof of this importance.
>
> We simply need to convert from a win/lose, survival-of-the-fittest war to
> the death between hierarchies, where moral truth is determined via edict
> (if you aren't with our hierarchy, you are against us), to a bottom-up,
> win/win building and tracking of moral consensus, with a focus on what
> everyone agrees on.  We need to base our morals on building and tracking
> scientific moral consensus: moral truth derived from bottom-up, grass-roots
> experimental demonstration and rational argument rather than from
> hierarchical edict.
>
> There is already an existing topic, "The Importance of Friendly AI
> <https://canonizer.com/topic/16-Friendly-AI-Importance/1-Agreement>".
> It has a unanimous super camp where everyone agrees that AI will "Surpass
> Current Humans
> <https://canonizer.com/topic/16-Friendly-AI-Importance/8-Will-Surpass-current-humans>",
> and the "Such Concern Is Mistaken
> <https://canonizer.com/topic/16-Friendly-AI-Importance/3-Such-Concern-Is-Mistaken>"
> camp, with 12 supporters, continues to extend its lead over the "Friendly
> AI is Sensible
> <https://canonizer.com/topic/16-Friendly-AI-Importance/9-FriendlyAIisSensible>"
> camp, which currently has half as many supporters.
>
> To me, this topic is too vague, not centered on any specific actions.  So
> I propose the following name changes, to pivot the topic toward the
> specific actions that need to be taken.
>
> Old:                            New:
>
> Friendly AI Importance          Should AI be Commercialized?      <- topic name
> Such Concern Is Mistaken        AI should be commercialized.
> Friendly AI is Sensible         AI Poses an Existential Threat.
>
> I would love to hear anyone's thoughts, especially if you are a supporter
> of one of the camps being renamed and object to this proposal.  And of
> course, the more people who communicate their current moral beliefs
> (whether experienced and educated or not), the better.  That which you
> measure improves.
>
> On Mon, Nov 20, 2023 at 11:05 AM Keith Henson via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Mon, Nov 20, 2023 at 2:13 AM efc--- via extropy-chat
>> <extropy-chat at lists.extropy.org> wrote:
>> >
>> > Based on the gossip I've seen and read, I think it is due to Sam
>> > wanting to accelerate and earn money, and the board wanting to
>> > decelerate and take a more cautious approach.
>> >
>> > But who knows? ;)
>>
>> By historical and Supreme Court standards, this would be malfeasance
>> by the board, opening them to stockholder lawsuits.
>>
>> I don't think it makes much difference. The advances in AI are way out
>> of control.
>>
>> Keith
>>
>> > Best regards,
>> > Daniel
>> >
>> >
>> > On Sat, 18 Nov 2023, spike jones via extropy-chat wrote:
>> >
>> > >
>> > >
>> > >
>> > >
>> > >
>> > >
>> https://www.theverge.com/2023/11/17/23965982/openai-ceo-sam-altman-fired
>> > >
>> > > WOWsers.
>> > >
>> > > I am told Altman is a talented guy, as is Brockman.  We don’t know
>> > > what went on there, but watch for both to team up with Musk and
>> > > Thiel and start a competitor company that will blow OpenAI’s
>> > > artificial socks off.
>> > >
>> > > As I wrote that sentence, it occurred to me that what happened today
>> > > is Eliezer Yudkowsky’s nightmare scenario.
>> > >
>> > > spike