[ExI] AI finds it too easy to manipulate humans

William Flynn Wallace foozler83 at gmail.com
Mon Jun 5 20:57:51 UTC 2017


...Not mentioned in the article, but this implies that people will get more
attached to their chatbot sex dolls than to real people.

BillK

People will attach powerful feelings to dogs, cats, snakes, rats - why not
a sex doll?  Some people will talk to anyone, or maybe anything, that will
listen to them, because they have worn out their friends and family with
prattle.

Wait until the prices come down and see the robot animals selling like real
pets.  How many pet owners would eagerly pay for a pet that could talk?  My
wife for one.  No litter box needed.  Imagine, after extensive experience,
of course, an AI able to respond to a person exactly the way they want to
be responded to.  Better than people.  I suggest that the AI does not have
to be human-shaped, either.  And I see no reason why it has to pass the
Turing test.  Depending on how you define 'smart', there are millions of
people dumber than the AIs we have now who would not notice some nonhuman
behavior.

This reminds me of a 1950s TV show - I've Got a Secret.  They brought out
this ancient crone and asked her questions - how many grandchildren she had,
and so on - before they gave up.

So they brought out her family on stage.  The kicker was that the woman's
own mother was still alive - well over 100 - and motoring along better than
the daughter who had answered the questions.  I believe there were about
seven generations on stage.  Remarkable (well, of course - I just remarked
on it, though I have now forgotten the reason!)

Yes, money by the billions just awaiting smarter AIs.  Lonely stay-at-homes
would have someone to talk to, get quick answers to just about anything,
order grocery delivery, and so on.  It won't take long, I suspect.  Would
you believe
robot graveyards?  Of course, they get reincarnated in a newer model, and
would have backups so nothing changes except the hardware.

A family AI for generations to come.  We already see that in some sci-fi books.

bill w

On Mon, Jun 5, 2017 at 2:29 PM, spike <spike66 at att.net> wrote:

>
> >...Not mentioned in the article, but this implies that people will get
> more attached to their chatbot sex dolls than to real people.
>
> BillK
>
>
>
>
>
> BillK, you make that sound like a bad thing.  It would be safe, cheap,
> help reduce overpopulation, might be more emotionally satisfying, low risk
> you will come home and find her in the sack with someone else, much easier
> to get a divorce when a new model comes out, etc.
>
> >>... We Need to Talk About the Power of AI to Manipulate Humans Our
> tendency to become emotionally attached to chatbots could be exploited by
> companies seeking a profit.
>
> Ja, those evil companies, seeking a profit.
>
> What about individuals seeking a profit?  Don't we count?  Wouldn't it be
> cool to work alone, figure out how to write software that is titillating
> and satisfying, then make a ton of money?  When you think about it, this is
> a modern version of writing pulpy romance stories.  There you have
> individuals writing software which is (in a sense) a readily-available
> substitute for actual human contact, with the writers seeking to make a
> profit and the more skilled ones succeeding.
>
> Think of it this way please.   Artists hire models to come to the studio,
> get nekkid, artist paints (the canvas, not the model.)  OK that has sooo
> been done.  Imagine artists of the future, software geeks who need a model
> to know what a really cool sexy encounter is like (we software geeks often
> never had one of those (too busy in college writing software.))  I can
> imagine it would suddenly be a lot more fun to be a software geek.
>
> Hell we could have competitions and championships: who can spawn the most
> sexy scenarios and behaviors.  The mind boggles.  But it's a good boggle.
> We can imagine a new industry: people writing sexy software for chatbot sex
> dolls.  Oh my, can you imagine?  Just thinking of the money to be made
> here, wide, tall piles of money, oh it just makes one's butt hurt.  It's a
> good hurt.
>
> spike
>