[ExI] ChatGPT 'Not Interesting' for creative works
Gadersd
gadersd at gmail.com
Tue Mar 7 04:39:23 UTC 2023
At least with weather stations one can average the measurements of many stations to get an arbitrarily good estimate, since the error of an average of n independent readings shrinks roughly as 1/sqrt(n). With language models, averaging many small models still yields junk output. ChatGPT’s abilities are not reducible to the sum of many smaller models; it is like the saying “Consciousness is more than the sum of its parts.” More precisely, a large model is required to integrate all of the available information at once. A small model can only integrate a small subset of the information a larger model can, and the sum of many partial integrations does not equal the full information integrated as a whole.
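
To make that concrete, here is a minimal sketch of what “averaging small models” could mean in practice, assuming the Hugging Face transformers library and two publicly available small models (distilgpt2 and gpt2, which share a vocabulary, so their output distributions are directly comparable). The model names and prompt are illustrative, not anyone’s actual setup:

    # Minimal sketch: ensemble two small causal LMs by averaging their
    # next-token probability distributions.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    models = [AutoModelForCausalLM.from_pretrained(name).eval()
              for name in ("distilgpt2", "gpt2")]

    def averaged_next_token(prompt: str) -> str:
        ids = tokenizer(prompt, return_tensors="pt").input_ids
        with torch.no_grad():
            # Softmax each model's final-position logits, then average.
            probs = torch.stack([torch.softmax(m(ids).logits[0, -1], dim=-1)
                                 for m in models]).mean(dim=0)
        # Greedy pick from the averaged distribution.
        return tokenizer.decode(probs.argmax().item())

    print(averaged_next_token("The capital of France is"))

The averaged distribution is a convex combination of the small models’ distributions, so it can never assign a continuation more probability than the best small model already does: averaging cancels noise, but it cannot create capability that none of the components has.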
> On Mar 6, 2023, at 9:15 PM, Mike Dougherty via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>
> On Mon, Mar 6, 2023, 8:07 PM Gadersd via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>
> No, the small models generate output faster than the big models. The small models are not slower versions of the big models; they have completely different capabilities. You will never be able to get ChatGPT-level output out of a much smaller model. It would be like trying to run modern engineering software on an Atari console: it wouldn’t be slower, it just wouldn’t run at all.
>
> Or weather prediction using only one weather station? Or a single environmental reading (such as temperature or barometric pressure)?