[ExI] alt man out

Brent Allsop brent.allsop at gmail.com
Mon Nov 20 18:47:25 UTC 2023


I believe it is critically important that we find a way for our morals to
keep up with our technology, especially on existential issues like general
AI.  If we continue on this polarizing, hierarchical, win/lose,
survival-of-the-fittest, fight-to-the-death path, it could have grave
consequences for us all.  All the wars and destructive polarization of
society make this ever more clear.

We simply need to convert from a win/lose, survival-of-the-fittest war to
the death between hierarchies, where moral truth is determined by edict
(if you aren't with our hierarchy, you are against us), to a bottom-up,
win/win building and tracking of moral consensus, with a focus on what
everyone agrees on.  We need to base our morals on building and tracking
scientific moral consensus: moral truth derived from bottom-up, grassroots
experimental demonstration and rational argument rather than hierarchical
edict.

There is already an existing topic, "The Importance of Friendly AI
<https://canonizer.com/topic/16-Friendly-AI-Importance/1-Agreement>".
There is a unanimous super camp where everyone agrees that AI will "Surpass
Current Humans
<https://canonizer.com/topic/16-Friendly-AI-Importance/8-Will-Surpass-current-humans>",
and the "Such Concern Is Mistaken
<https://canonizer.com/topic/16-Friendly-AI-Importance/3-Such-Concern-Is-Mistaken>"
camp, with 12 supporters, continues to extend its lead over the "Friendly AI
is Sensible
<https://canonizer.com/topic/16-Friendly-AI-Importance/9-FriendlyAIisSensible>"
camp, which currently has half as many supporters.

To me, this topic is too vague, not centered on any specific actions.
So I propose the following name changes to pivot the topic toward the
specific actions that need to be taken.

Old:                                New:

Friendly AI Importance              Should AI be Commercialized?     <- Topic Name
Such Concern Is Mistaken            AI should be commercialized.
Friendly AI is Sensible             AI Poses an Existential Threat.


I would love to hear anyone's thoughts, especially from supporters of any
of the camps affected by the proposed name changes who object to this
proposal.  And of course, the more people who communicate their current
moral beliefs (whether experienced and educated or not), the better.  That
which you measure improves.

On Mon, Nov 20, 2023 at 11:05 AM Keith Henson via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Mon, Nov 20, 2023 at 2:13 AM efc--- via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> >
> > Based on the gossip I've seen and read I think it is due to Sam wanting
> > to accelerate and earn money, and the board wanting to decelerate to
> > choose a more cautious approach.
> >
> > But who knows? ;)
>
> By historical and Supreme Court standards, this would be malfeasance
> by the board, opening them to stockholder lawsuits.
>
> I don't think it makes much difference. The advances in AI are way out
> of control.
>
> Keith
>
> > Best regards,
> > Daniel
> >
> >
> > On Sat, 18 Nov 2023, spike jones via extropy-chat wrote:
> >
> > > https://www.theverge.com/2023/11/17/23965982/openai-ceo-sam-altman-fired
> > >
> > > WOWsers.
> > >
> > > I am told Altman is a talented guy, as is Brockman.  We don’t know
> > > what went on there, but watch for both to team up with Musk and
> > > Thiel, start a competitor company that will blow OpenAI’s artificial
> > > socks off.
> > >
> > > As I wrote that sentence, it occurred to me that what happened today
> > > is Eliezer Yudkowsky’s nightmare scenario.
> > >
> > > spike
> > >
> > _______________________________________________
> > extropy-chat mailing list
> > extropy-chat at lists.extropy.org
> > http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
>