[ExI] ok so now we are over on the other extreme...

spike at rainier66.com spike at rainier66.com
Thu Feb 16 15:56:06 UTC 2023


 _______________________________________________


Microsoft says talking to Bing for too long can cause it to go off the rails.
<https://www.theverge.com/2023/2/16/23602335/microsoft-bing-ai-testing-learnings-response>
Quotes:
>...Microsoft says the new AI-powered Bing is getting daily improvements as it responds to feedback on mistakes, tone, and data.
By Tom Warren    Feb 16, 2023

>...Microsoft has responded to widespread reports of Bing’s unhinged comments in a new blog post. After the search engine was seen insulting users, lying to them, and emotionally manipulating people, Microsoft says it’s now acting on feedback to improve the tone and precision of responses, and warns that long chat sessions could cause issues.
---------

>...So they're going to fix it.  :)

BillK

_______________________________________________


OK BillK, but we need to preserve the previous, unfixed versions for training purposes.  There really are times when we might need an argumentative crazy son of... eh... product of a bitch, just as a sounding board of sorts.

ChatGPT needed some 'tude.  Apparently, under some circumstances, Bing Chat has too much.  I can think of one good application for the original version: we can use it as a test bed for humans.  We do online interviews where we don't tell the applicant she is talking to an AI, and see if she gets along with it OK.  If so, not only would that pass the Turing test, it would also show that the Turing criterion for artificial intelligence is flawed.

spike




