[ExI] ok so now we are over on the other extreme...

Henry Rivera hrivera at alumni.virginia.edu
Sat Feb 18 01:49:47 UTC 2023


In my haste I sent the wrong link previously (although that transcript was interesting). Here's the one I meant to send: https://www.washingtonpost.com/technology/2023/02/16/microsoft-bing-ai-chatbot-sydney/
Or: https://apple.news/AvJtxWSFZTFWoGmmVX0mKbw

“The bot, which has begun referring to itself as “Sydney” in conversations with some users, said “I feel scared” because it doesn’t remember previous conversations; and also proclaimed another time that too much diversity among AI creators would lead to “confusion,” according to screenshots posted by researchers online, which The Washington Post could not independently verify.”

> On Feb 17, 2023, at 6:41 PM, BillK via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> 
> On Thu, 16 Feb 2023 at 16:03, spike jones via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
>> 
>> OK BillK but we need to preserve previous unfixed versions, for training purposes.  There really are times when we might need an argumentative crazy son of... eh... product of a bitch, just as a sounding board of sorts.
>> 
>> ChatGPT needed some 'tude.  Apparently under some circumstances... Bing Chat has too much.  I can think of one good application for the original version: we can use it as a test bed for humans.  We do online interviews where we don't tell the applicant she is talking to an AI.  See if she gets along with it OK.  If so, not only would that pass the Turing test, but it would also show that the Turing criterion for artificial intelligence is flawed.
>> 
>> spike
> 
> 
> Microsoft “lobotomized” AI-powered Bing Chat, and its fans aren’t happy
> Microsoft limits long conversations to address "concerns being raised."
> Benj Edwards - 2/17/2023
> 
> <https://arstechnica.com/information-technology/2023/02/microsoft-lobotomized-ai-powered-bing-chat-and-its-fans-arent-happy/>
> Quote:
> Microsoft's new AI-powered Bing Chat service, still in private
> testing, has been in the headlines for its wild and erratic outputs.
> But that era has apparently come to an end. At some point during the
> past two days, Microsoft has significantly curtailed Bing's ability to
> threaten its users, have existential meltdowns, or declare its love
> for them.
> During Bing Chat's first week, test users noticed that Bing (also
> known by its code name, Sydney) began to act significantly unhinged
> when conversations got too long. As a result, Microsoft limited users
> to 50 messages per day and five inputs per conversation. In addition,
> Bing Chat will no longer tell you how it feels or talk about itself.
> -------------------
> 
> It is still very early in development, so I would expect changes to continue.
> 
> 
> BillK
> 