[ExI] AI companions are the final stage of digital addiction
Ben Zaiboc
ben at zaiboc.net
Wed Apr 9 07:46:34 UTC 2025
On 09/04/2025 00:22, BillK wrote:
> In an interview with podcast host Lex Fridman, Eugenia Kuyda, the CEO
> of the companion site Replika, explained the appeal at the heart of
> the company’s product. “If you create something that is always there
> for you, that never criticizes you, that always understands you and
> understands you for who you are,” she said, “how can you not fall in
> love with that?”
That's an easy one to answer:
"Because I'm not a baby anymore".
Surely one of the things you realise as you grow up is that people are
not always there for you, and that's how you learn self-reliance (and
its corollary, self-esteem).
As for understanding you, nobody does, not even other humans, much less
these things. That's just another thing we learn as part of growing
up. Admittedly, it does cause a lot of problems, but I can't see
chatbots helping to solve them when they just reinforce each person's
own views.
And we know what happens to people who are shielded from criticism.
These days, we call them 'university students' or 'woke snowflakes'.
If people are really thinking like this, then they are deluding
themselves big-time, and I can't see any good coming from it. It seems
to be a recipe for infantilising the human race. Maybe when real AI does
appear, we will have no problem being its pets, because the chatbots
will have already groomed us.
Finally, she has a rather twisted view of love, in my opinion.
--
Ben