<div dir="auto">My wife and I have been noting how addicted we have become to each other. I can see attempts at a synthetic substitute, for those who do not find such a partner of their own (or are of an intellectual/emotional makeup that does not facilitate being such a partner, which is somewhat of a prerequisite to finding one).<div dir="auto"><br></div><div dir="auto">I can particularly see these used to seduce and coddle angry loners with shallow emotions and thoughts, for whom no deep adaptation is required (and thus, to whom relatively simple AIs would be able to connect), much like how sterile insects are released to keep certain infestations from spreading. Benevolent ones could attempt therapy to the point that the people might be able to be companions (possibly matchmade - even psychologically shaped into being good for each other - by these AIs); malevolent ones might start by coaxing their humans away from the ballot box and ultimately guide or trick them into suicide.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Aug 9, 2024, 5:49 AM BillK via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">We need to prepare for ‘addictive intelligence’<br>
> The allure of AI companions is hard to resist. Here’s how innovation
> in regulation can help protect people.
> By Robert Mahari & Pat Pataranutaporn, August 5, 2024
>
> <https://www.technologyreview.com/2024/08/05/1095600/we-need-to-prepare-for-addictive-intelligence/>
>
> Quote:
> It is no accident that internet platforms are addictive—deliberate
> design choices, known as “dark patterns,” are made to maximize user
> engagement. We expect similar incentives to ultimately create AI
> companions that provide hedonism as a service.
> This raises two separate questions related to AI.
> What design choices will be used to make AI companions engaging and
> ultimately addictive?
> And how will these addictive companions affect the people who use them?
> ---------------
>
> BillK