[ExI] Symbol Grounding
jasonresch at gmail.com
Sun Apr 23 20:13:18 UTC 2023
On Sun, Apr 23, 2023 at 1:05 PM spike jones via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> -----Original Message-----
> From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf Of
> Stuart LaForge ..
> >...I sense some big companies are starting to play a lot closer to their
> vests in that regard. I am not entirely sure I trust Sam Altman's claim
> that he hasn't started developing GPT-5...
> At no point will those with the means to do so unanimously decide: OK
> GPT-4 is getting scary, let's all stop further development forthwith.
> A few may do this, most will suggest the others stop development while
> simultaneously scouring the globe for those who understand the unfortunate
> term "transformer" as it relates to AI and language models.
> (Who the heck thought it was a good idea to further overwork the
> already-defined term "transformer" please? We coulda made up a new term for
> that, for I think it refers to a novel concept.)
I agree it is always unfortunate when general words get overloaded, but
there was a legitimate historical reason for this term (at least
initially). The transformer model was designed as a pair of an "encoder"
and a "decoder", so that one kind of input could be transformed into
another: for example, translating English into French, or subtitles into
spoken words, or vice versa.
However, GPT and its many chat-bot incarnations are not transformers in
this original sense, but only one part of one: generally the decoder. The
decoder, running on its own without the encoder, is sufficient to generate
streams of English text given a prompt. But with only the decoder, it's no
longer really a "transformer", so we are stuck with this inaccurate,
overloaded, confusing term.
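To make the decoder-only idea concrete, here is a minimal toy sketch of autoregressive generation: the learned next-token predictor is replaced by a hard-coded lookup (an assumption purely for illustration, not a real model), but the generation loop itself is the same shape GPT-style decoders use: predict the next token from the sequence so far, append it, and repeat.

```python
def toy_decoder_step(tokens):
    """Stand-in for a decoder: predicts the next token from the
    sequence so far. A real decoder is a learned neural network;
    this hard-coded rule is only a placeholder."""
    continuations = {"the": "cat", "cat": "sat", "sat": "down"}
    return continuations.get(tokens[-1], "<end>")

def generate(prompt_tokens, max_new=5):
    """Decoder-only generation: repeatedly append the predicted
    next token to the running sequence. No encoder is involved --
    the prompt and the output live in one stream."""
    tokens = list(prompt_tokens)
    for _ in range(max_new):
        nxt = toy_decoder_step(tokens)
        if nxt == "<end>":
            break
        tokens.append(nxt)
    return tokens

print(generate(["the"]))  # ['the', 'cat', 'sat', 'down']
```

Note there is no separate input sequence being transformed into a distinct output sequence, which is exactly why calling this half of the architecture a "transformer" is a stretch.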