[ExI] AI and autonomous warfare

Kelly Anderson postmowoods at gmail.com
Wed Jul 17 15:59:10 UTC 2024


The good news is that we may be able to choose not to put those
particular characteristics into our AIs. It does seem possible that we
will choose another, less safe, way. Stupid Putin anyway. He couldn't
have picked a worse time to do what he's doing. It makes a bad AI
outcome seem that much more probable.

-Kelly

On Tue, Jul 16, 2024 at 5:09 PM Keith Henson via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
> The problem is humans, not AIs.
>
> We have evolved to have psychological traits for wars simply because
> genes for war survived bad times about 40% better than the no-war
> alternative.  We are stuck with our genetics.
>
>
> Keith
>
> On Tue, Jul 16, 2024 at 7:33 AM BillK via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> >
> > On Mon, 15 Jul 2024 at 00:22, BillK <pharos at gmail.com> wrote:
> > >
> > > This long article discusses the increasing problem of autonomous weapons in warfare.
> > > <https://www.theguardian.com/technology/article/2024/jul/14/ais-oppenheimer-moment-autonomous-weapons-enter-the-battlefield>
> > > Quote:
> > >
> > > AI’s ‘Oppenheimer moment’: autonomous weapons enter the battlefield
> > > The military use of AI-enabled weapons is growing, and the industry that provides them is booming
> > > Nick Robins-Early      Sun 14 Jul 2024 17.00 BST
> > > What has changed, researchers say, is both increased public attention to everything AI and genuine breakthroughs in the technology. Whether a weapon is truly “autonomous” has always been the subject of debate. Experts and researchers say autonomy is better understood as a spectrum rather than a binary, but they generally agree that machines are now able to make more decisions without human input than ever before.
> > > -------------------------------------
> > >
> > > I asked Perplexity AI to review this long article.
> > > BillK
> > >
> > > Perplexity AI -
> > >
> > > The article from The Guardian discusses the increasing integration of artificial intelligence (AI) into military weapons and operations, highlighting what some experts are calling AI's "Oppenheimer moment" - a reference to the development of the atomic bomb. This trend raises significant ethical, legal, and strategic implications for modern warfare.
> > > <snip>
> > > -----------------------
> >
> >
> > When Perplexity AI insisted on the need for international cooperation to
> > avoid an AI weapons arms race, I pointed out that, while a war is in
> > progress, there is little likelihood of international cooperation.
> > Nations will not restrict weapons development during wartime.
> > War plus AI seems to point to a bleak future.
> >
> > Perplexity AI grudgingly agreed, but then still added a long section
> > about the urgent need for international cooperation.
> >
> > Below is the first part of the reply (with the long wishful-thinking cooperation
> > section removed):
> > BillK
> >
> > Perplexity AI:
> > The continuation of the AI weapons arms race during ongoing conflicts
> > is a significant concern, as nations prioritize military superiority
> > over cooperative regulation. Here are the key points and implications
> > based on the provided search results:
> >
> > 1. Intensification of the AI Arms Race
> >
> > Current State of Affairs:
> > The AI arms race is characterized by rapid advancements in military
> > technology among major powers like the U.S., China, and Russia. This
> > competition is driven by the desire for strategic and tactical
> > advantages on the battlefield, leading to the development of
> > increasingly sophisticated AI-driven systems such as drones and
> > autonomous vehicles.
> >
> > Implications:
> >
> > Escalation of Conflicts:
> > The deployment of AI weapons can lead to faster and more automated
> > warfare, potentially escalating conflicts due to reduced
> > decision-making times and increased reliance on AI for strategic
> > assessments.
> >
> > Ethical and Legal Challenges:
> > The use of AI in warfare raises significant ethical and legal
> > questions, including accountability for AI-driven actions and the
> > potential for increased civilian casualties.
> >
> > 2. Challenges to International Cooperation
> >
> > War and Cooperation:
> > During active conflicts, nations are less likely to engage in
> > cooperative efforts to regulate AI weapons, as their immediate focus
> > is on gaining military advantages. This competitive environment makes
> > it difficult to establish and enforce international agreements or
> > treaties aimed at controlling the development and use of AI in
> > warfare.
> >
> > Implications:
> >
> > Lack of Regulation:
> > The absence of international cooperation can lead to a lack of
> > effective regulation, increasing the risks associated with the
> > deployment of AI weapons.
> >
> > Proliferation of AI Weapons:
> > Without international agreements, there is a higher likelihood of AI
> > weapons proliferating, potentially falling into the hands of non-state
> > actors or being used in ways that violate international humanitarian
> > law.
> > -----------------------
> >
> > _______________________________________________
> > extropy-chat mailing list
> > extropy-chat at lists.extropy.org
> > http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>


