Stuart,
I didn't read it as saying exactly that. There are diminishing returns to scaling, and we can improve these models in other ways that don't require scaling. (Rough numbers on what I mean by diminishing returns are below the quote.)
Giovanni

On Fri, Apr 21, 2023 at 6:15 AM Stuart LaForge via extropy-chat <extropy-chat@lists.extropy.org> wrote:
> On the way out the door to work, so I can't write a digest or
> editorialize, but OpenAI founder Sam Altman says GPT-4 is about as
> good as LLMs can get.
>
> https://www.wired.com/story/openai-ceo-sam-altman-the-age-of-giant-ai-models-is-already-over/
>
> Stuart LaForge
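
To put rough numbers on "diminishing returns": here is a minimal sketch, assuming a Chinchilla-style power-law loss fit L(N) = E + A / N^alpha. The constants loosely follow the Hoffmann et al. (2022) fit but should be treated as assumptions for illustration, not measurements. Each doubling of parameter count buys a smaller drop in predicted loss than the last:

# Illustrative sketch only: Chinchilla-style power law L(N) = E + A / N**alpha.
# Constants are assumptions loosely based on Hoffmann et al. (2022).
E, A, alpha = 1.69, 406.4, 0.34

def loss(n_params):
    # Predicted pretraining loss for a model with n_params parameters.
    return E + A / n_params ** alpha

n = 1e9  # hypothetical starting point: a 1B-parameter model
for _ in range(6):
    drop = loss(n) - loss(2 * n)
    print(f"{n / 1e9:6.0f}B -> {2 * n / 1e9:6.0f}B params: loss falls by {drop:.4f}")
    n *= 2

Under these assumed constants, the loss improvement per doubling shrinks by a factor of about 2^-0.34 (roughly 0.79) at every step, which is the diminishing return I mean: scale keeps helping, but each doubling buys less than the one before.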