[ExI] ChatGPT 'Not Interesting' for creative works

Gadersd gadersd at gmail.com
Mon Mar 6 16:31:39 UTC 2023


Correction: I meant to say that the human brain has roughly 100 trillion parameters, not 1 trillion.
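
For scale, a rough back-of-the-envelope in Python (the 100 trillion figure is the synapse-count estimate above; 175 billion is GPT-3's published parameter count, and 6 billion is the GPT-JT model tried below):

    import math

    brain = 100e12  # ~100 trillion synaptic connections (rough estimate)
    models = {"GPT-3": 175e9, "GPT-JT": 6e9}

    for name, params in models.items():
        gap = math.log10(brain / params)
        print(f"{name}: ~{gap:.1f} orders of magnitude below the brain")

    # GPT-3:  ~2.8 orders of magnitude below the brain
    # GPT-JT: ~4.2 orders of magnitude below the brain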

> On Mar 6, 2023, at 11:24 AM, Gadersd <gadersd at gmail.com> wrote:
> 
> The human brain has roughly 1 trillion parameters, so large language models are still a few orders of magnitude short of the human brain. It should be noted, however, that not all human brain connections perform language tasks, so achieving a fully human level of natural language understanding should require fewer than 1 trillion parameters.
> 
> Toy models can be, and have been, trained in parallel across consumer computers, but I think you would be disappointed by their intelligence compared to ChatGPT.
> 
> For example, I tried GPT-JT, a 6-billion-parameter model, accessible at https://huggingface.co/spaces/togethercomputer/GPT-JT.
> Prompt: "solve 2x+3=-1 step by step. 2x="
> Answer: "1, so x=1/2.
> 
> A:
> 
> The answer is $1"
> 
> This model was trained in parallel as you have suggested. Not very useful, is it?
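> 
> If anyone wants to reproduce this locally, here is a minimal sketch using the Hugging Face transformers library (assuming the model id togethercomputer/GPT-JT-6B-v1 from the link above, a CUDA GPU, and enough memory for ~12 GB of fp16 weights):
> 
>     from transformers import AutoModelForCausalLM, AutoTokenizer
>     import torch
> 
>     model_id = "togethercomputer/GPT-JT-6B-v1"  # assumed id; check the model card
>     tokenizer = AutoTokenizer.from_pretrained(model_id)
>     model = AutoModelForCausalLM.from_pretrained(
>         model_id, torch_dtype=torch.float16
>     ).to("cuda")
> 
>     prompt = "solve 2x+3=-1 step by step. 2x="
>     inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
>     output = model.generate(**inputs, max_new_tokens=32, do_sample=False)
>     print(tokenizer.decode(output[0], skip_special_tokens=True))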
> 
>> On Mar 5, 2023, at 10:17 PM, spike jones via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>> 
>>> ... Gadersd via extropy-chat
>> ...
>> Subject: Re: [ExI] ChatGPT 'Not Interesting' for creative works
>> 
>>> ...Computing technology is not advanced enough for consumer computers to run the powerful models. Consumer computers have neither the bandwidth nor the GPU FLOPS to run the good models. It isn’t a matter of speed; consumer computers simply cannot run the big models. The best you could do is run a toy model with maybe a billion parameters. Such toy models are completely dumb compared to ChatGPT and can barely string coherent sentences together...
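>> 
>> To put rough numbers on that claim (a sketch with assumed figures: a GPT-3-class model, fp16 weights at 2 bytes per parameter, ~24 GB of VRAM on a high-end consumer card):
>> 
>>     params = 175e9       # a GPT-3-class model (published figure)
>>     bytes_per_param = 2  # fp16 weights
>>     vram = 24e9          # ~24 GB on a top consumer GPU
>>     print(params * bytes_per_param / vram)
>>     # ~14.6: the weights alone overflow consumer VRAM by more than
>>     # an order of magnitude, before activations or the attention cache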
>> 
>> 
>> 
>> OK so what if... we get a number of us running in parallel.  A toy version with a billion parameters, well OK then, a billion is about three orders of magnitude more parameters than my beleaguered meat brain has (as far as I know (hell I don't even know what my own parameters are)) and yet it seems to somehow write fun stuff on occasion.
>> 
>> spike
>> 
>> 
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
> 
