[ExI] all we are is just llms was: RE: e: GPT-4 on its inability to solve the symbol grounding problem

Giovanni Santostasi gsantostasi at gmail.com
Fri Apr 21 02:23:20 UTC 2023


Somebody being fired by Google is not in itself a sign that they are a bad
actor, as in the case of Blake Lemoine. He revealed company information to the
public, and that is not really ok (even if I believe his motivations were just).
But this Mitchell lady seems to have done something much worse, if we are to
believe Google's official reason for firing her. That doesn't mean her
claim about PaLM's training is incorrect, but she certainly seems to have an
axe to grind.

   - Mitchell, on Twitter
   <https://twitter.com/mmitchell_ai/status/1362885356127801345?s=20>: "I'm
   fired."
   - Google, in a statement to Axios: “After conducting a review of this
   manager’s conduct, we confirmed that there were multiple violations of our
   code of conduct, as well as of our security policies, which included the
   exfiltration of confidential business-sensitive documents and private data
   of other employees.”



On Thu, Apr 20, 2023 at 7:12 PM Giovanni Santostasi <gsantostasi at gmail.com>
wrote:

> I mentioned this claim because it came directly from Google's CEO. It is
> not a scientific claim and it is not made in a scientific article, so
> some level of skepticism is needed. At the same time, Gordon is jumping on
> it to discredit supporters of the emergent capabilities of AIs, as expected.
> At this particular time there is no debunking of any kind of this claim
> from Google's CEO (or evidence to support it), just another claim from
> an ex-Google employee who was fired for misconduct, so all this should be
> taken with a grain of salt.
>
>
> https://www.bbc.com/news/technology-56135817?xtor=AL-72-%5Bpartner%5D-%5Bbbc.news.twitter%5D-%5Bheadline%5D-%5Bnews%5D-%5Bbizdev%5D-%5Bisapi%5D&at_custom4=5D727E32-731F-11EB-B58A-D0D04744363C&at_custom3=%40BBCWorld&at_custom2=twitter&at_campaign=64&at_custom1=%5Bpost+type%5D&at_medium=custom7
>
> Giovanni
>
>
> On Thu, Apr 20, 2023 at 6:55 PM Gordon Swobe via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> The Bengali claim was debunked, and in fact some AI researchers are quite
>> annoyed by the 60 Minutes coverage. Thanks to ChatGPT, the media is full of
>> hype about AI. Some people still believe in Santa Claus, too.
>>
>> https://twitter.com/mmitchell_ai/status/1648029417497853953?s=20
>>
>> -gts
>>
>> On Thu, Apr 20, 2023 at 7:11 PM spike jones via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>>
>>> I received this Bengali note from Giovanni, but I see his message
>>> went to ExI as well.  I wrote the reply below to Giovanni offlist,
>>> but it doesn’t have anything in it any sillier than the usual
>>> stuff I post here, so I shall post it forward.
>>>
>>>
>>> *From:* spike at rainier66.com <spike at rainier66.com>
>>> *Subject:* RE: [ExI] all we are is just llms was: RE: e: GPT-4 on its
>>> inability to solve the symbol grounding problem
>>>
>>>
>>> *From:* Giovanni Santostasi <gsantostasi at gmail.com>
>>> *Subject:* Re: [ExI] all we are is just llms was: RE: e: GPT-4 on its
>>> inability to solve the symbol grounding problem
>>>
>>>
>>>
>>> Spike,
>>> >…By the way, did you hear that a Google version of an LLM was given
>>> just a few prompts in Bengali …
>>> Giovanni
>>>
>>>
>>> Hmmm, I don’t know what to make of it, Giovanni.  It cannot be creating
>>> information (in the form of being able to translate).  Like any human,
>>> regardless of that human’s intelligence, it would need to somehow be
>>> given a reference source explaining the definitions of Bengali terms.
>>> Perhaps I am misunderstanding what you mean.
>>>
>>> Many years ago when I was many years younger than I am now, I was
>>> working in an office with a number of young singles who were likewise many
>>> years younger than they are now.  They made it a cheerful custom to go out
>>> together and party at a local bar on Friday nights.  They invited me, but
>>> I demurred, for two good reasons:  first, I was married, and second, I do
>>> like using the term demur.  I have long felt that terms should have an
>>> opposite or counterpart, ideally using the same form, in order to simplify
>>> language.  Had I accepted the invitation from my colleagues, would I then
>>> mur?  Would I have murred their invitation, for I am known for murring such
>>> occasions?
>>>
>>>
>>>
>>> But I digress.  In any case, I chose to leave my single office mates to
>>> celebrate at their riotous leisure, the occasion being the fiftieth
>>> anniversary of Spam (the original Spam, not the later electronic
>>> advertising).  Being a fan of the stuff (I blush (hey, we 60s kids grew
>>> up on the revolting tripe)) I was sad to miss the occasion, but wrote a
>>> poem about it (several rather elaborate verses) and gave it to them at
>>> the office.  Later I heard the party was a total hoot.  They doled out
>>> the poem, reading it one verse at a time during the entire bacchanalian
>>> feast.
>>>
>>>
>>>
>>> About five years went by.  The organizer of those festivities wrote a
>>> sci-fi novel in which the characters in one hilarious passage were wildly
>>> celebrating the centennial of the ham-flavored foodlike product.  He asked
>>> me to proofread his novel, in which the main character
>>> recited a poem written by the characters’ absent colleague.  I commented to
>>> the author that the poem sounded familiar.  He replied “It oughta.  You
>>> wrote it.”  I was appalled, of course, that I would create such silliness,
>>> but… I did.  I blush.  Or rather, I would have blushed, were I physically
>>> capable of the emotion others describe as embarrassment.
>>>
>>>
>>>
>>> That author is still living.  I get Christmas cards from him still.
>>>
>>>
>>>
>>> On the main list, I mentioned a friend from college who taught me the
>>> basics of computer maintenance (back in the days when the discs were the
>>> size of a garbage can lid and had to be changed out manually).  He is, if
>>> anything, even more convinced than you are that ChatGPT is a conscious
>>> being.  I demur from arguing with him over it because he is a real
>>> programmer, whereas I am merely an engineer.
>>>
>>>
>>>
>>> spike
>>> _______________________________________________
>>> extropy-chat mailing list
>>> extropy-chat at lists.extropy.org
>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>
>