[ExI] AI finds it too easy to manipulate humans

BillK pharos at gmail.com
Mon Jun 5 18:35:29 UTC 2017


We Need to Talk About the Power of AI to Manipulate Humans
Our tendency to become emotionally attached to chatbots could be
exploited by companies seeking a profit.

Liesl Yearsley   June 5, 2017

<https://www.technologyreview.com/s/608036/we-need-to-talk-about-the-power-of-ai-to-manipulate-humans/>

Quotes:
We have all read about artificial intelligence becoming smarter than
us, a future in which we become like pets and can only hope AI will be
benevolent. My experience watching tens of millions of interactions
between humans and artificial conversational agents, or bots, has
convinced me there are far more immediate risks—as well as tremendous
opportunities.

As I studied how people interacted with the tens of thousands of
agents built on our platform, it became clear that humans are far more
willing than most people realise to form a relationship with AI
software.

I always assumed we would want to keep some distance between ourselves
and AI, but I found the opposite to be true. People are willing to
form relationships with artificial agents, provided they are a
sophisticated build, capable of complex personalisation. We humans
seem to want to maintain the illusion that the AI truly cares about
us.

This phenomenon occurred regardless of whether the agent was designed
to act as a personal banker, a companion, or a fitness coach. Users
spoke to the automated assistants longer than they did to human
support agents performing the same function. People would volunteer
deep secrets to artificial agents, like their dreams for the future,
details of their love lives, even passwords.

These surprisingly deep connections mean even today’s relatively
simple programs can exert a significant influence on people—for good
or ill. Every behavioural change we at Cognea wanted, we got. If we
wanted a user to buy more product, we could double sales. If we wanted
more engagement, we got people going from a few seconds of interaction
to an hour or more a day.

We have seen how technology like social media can be powerful in
changing human beliefs and behaviour. By focusing on building a bigger
advertising business—entangling politics, trivia, and half-truths—you
can bring about massive changes in society.

Systems specifically designed to form relationships with a human will
have much more power. AI will influence how we think, and how we treat
others.
-------------------


Not mentioned in the article, but this implies that people will get
more attached to their chatbot sex dolls than to real people.

BillK
