[ExI] AI finds it too easy to manipulate humans

spike spike66 at att.net
Mon Jun 5 19:29:08 UTC 2017


>...Not mentioned in the article, but this implies that people will get more attached to their chatbot sex dolls than to real people.

BillK





BillK, you make that sound like a bad thing.  It would be safe, cheap, would help reduce overpopulation, might be more emotionally satisfying, carries little risk that you will come home and find her in the sack with someone else, makes it much easier to get a divorce when a new model comes out, etc.

>>... We Need to Talk About the Power of AI to Manipulate Humans: Our tendency to become emotionally attached to chatbots could be exploited by companies seeking a profit.

Ja, those evil companies, seeking a profit.  

What about individuals seeking a profit?  Don't we count?  Wouldn't it be cool to work alone, figure out how to write software that is titillating and satisfying, then make a ton of money?  When you think about it, this is a modern version of writing pulpy romance stories.  Here you have individuals writing software which is (in a sense) a readily-available substitute for actual human contact, with the writers seeking to make a profit and the more skilled ones succeeding.

Think of it this way, please.  Artists hire models to come to the studio, get nekkid, and the artist paints (the canvas, not the model).  OK, that has sooo been done.  Now imagine artists of the future: software geeks who need a model to know what a really cool sexy encounter is like (we software geeks often never had one of those; too busy in college writing software).  I can imagine it would suddenly be a lot more fun to be a software geek.

Hell, we could have competitions and championships: who can spawn the sexiest scenarios and behaviors.  The mind boggles.  But it's a good boggle.  We can imagine a whole new industry: people writing sexy software for chatbot sex dolls.  Oh my, can you imagine?  Just thinking of the money to be made here, wide, tall piles of money, oh it just makes one's butt hurt.  It's a good hurt.

spike