<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class=""><blockquote type="cite" class=""><div class="WordSection1" style="page: WordSection1;"><div class=""><div class="" style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;">Since I think you know more about the computing requirements to do the training than I do, I ask you: given a pool of a million volunteers with ordinary consumer-level CPUs but willing to allow the processes to run in the background, could not anyone with a pile of training material use that background computing resource and create a custom GPT?</div></div></div></blockquote><div class=""><br class=""></div>The main challenge for this type of parallel computing is the way these models' architecture is designed. Some computations, such as cryptocurrency mining, are easy to break into many small independent subtasks. The transformer architecture underlying language models does not have this property: a significant amount of communication must occur between the processes computing the transformer output. If the processes are tightly coupled, as on a GPU, this poses little issue. If, however, they are spread far apart, the communication overhead swamps the useful computation. This is the chief bottleneck for distributed background computing.<div class=""><br class=""></div><div class="">Most of the current transformer architectures are dense, meaning fully connected. 
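To put rough numbers on that bottleneck, here is a back-of-envelope sketch comparing the time to exchange one full set of gradients over a fast GPU interconnect versus a typical home upload link. Every figure in it (model size, gradient precision, link speeds) is an illustrative assumption, not a measurement:

```python
# Back-of-envelope: why distributed volunteer training is communication-bound.
# All numbers below are illustrative assumptions, not measurements.

params = 7e9                  # assume a 7B-parameter model
bytes_per_grad = 2            # fp16 gradients
grad_bytes = params * bytes_per_grad      # ~14 GB exchanged per synchronization

gpu_link = 100e9              # bytes/s, assumed NVLink-class interconnect
home_link = 1.25e6            # bytes/s, assumed ~10 Mbit/s home upload

sync_gpu = grad_bytes / gpu_link          # seconds per gradient exchange on a cluster
sync_home = grad_bytes / home_link        # seconds per exchange over home broadband

print(f"GPU cluster sync:  {sync_gpu:.2f} s")
print(f"Volunteer sync:    {sync_home / 3600:.1f} hours")
```

Under these assumptions the volunteer machines would spend hours shipping gradients for every synchronization step the cluster finishes in a fraction of a second, which is why the loose coupling dominates.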
It may be possible to reduce the communication bottleneck by making the architecture sparser, as in the Switch Transformer that Google has experimented with.</div><div class=""><br class=""></div><div class=""><blockquote type="cite" class=""><div class="WordSection1" style="page: WordSection1;"><div class=""><div class="" style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;">Next question for Gad or anyone else: is it not clear there is a huge market available? I gave one example: a chat-bot trained on the books in the fish bowl, all of it in the 200 class in the old Dewey decimal system. The result isn’t any good for any question outside what is found in the 200 class, but plenty of market exists, and will persist, for what the 200 class answers.</div></div></div></blockquote><div class=""><br class=""></div>I think there is a market for personalized chatbots. The current proprietary chatbots are limited in the dialogs they are permitted to engage in. The only way for someone to experience the full extent of their fantasies is to interact with an unfiltered model. 
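Returning to the sparse-architecture point above for a moment: the routing idea in Switch-Transformer-style models can be sketched as top-1 routing, where each token is sent to exactly one expert. All the sizes, data, and routing weights below are made up for illustration; the point is only that each token touches a single expert's weights, which is the property that might let different machines host different experts:

```python
import numpy as np

# Toy sketch of Switch-Transformer-style top-1 routing. Each token is routed
# to exactly one expert, so only that expert's weights are touched.
# Shapes and expert count are illustrative assumptions.

rng = np.random.default_rng(0)
n_tokens, d_model, n_experts = 8, 16, 4

tokens = rng.standard_normal((n_tokens, d_model))
router_w = rng.standard_normal((d_model, n_experts))
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

# The router picks one expert per token (argmax over router logits).
choice = (tokens @ router_w).argmax(axis=1)

out = np.empty_like(tokens)
for e in range(n_experts):
    mask = choice == e
    # Only the tokens routed to expert e need to communicate with it.
    out[mask] = tokens[mask] @ experts[e]

print("tokens per expert:", np.bincount(choice, minlength=n_experts))
```

In a distributed setting each expert could in principle live on a different volunteer machine, so per-step traffic is token activations rather than the full dense layer, though the router itself still has to see every token.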
<br class=""><div class=""><div><br class=""><blockquote type="cite" class=""><div class="">On Jun 8, 2023, at 9:51 PM, spike jones via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" class="">extropy-chat@lists.extropy.org</a>> wrote:</div><br class="Apple-interchange-newline"><div class=""><meta charset="UTF-8" class=""><div class="WordSection1" style="page: WordSection1; caret-color: rgb(0, 0, 0); font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: 400; letter-spacing: normal; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px; -webkit-text-stroke-width: 0px; text-decoration: none;"><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div class=""><div style="border-style: solid none none; border-top-width: 1pt; border-top-color: rgb(225, 225, 225); padding: 3pt 0in 0in;" class=""><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><b class="">From:</b><span class="Apple-converted-space"> </span>extropy-chat <<a href="mailto:extropy-chat-bounces@lists.extropy.org" class="">extropy-chat-bounces@lists.extropy.org</a>><span class="Apple-converted-space"> </span><b class="">On Behalf Of<span class="Apple-converted-space"> </span></b>Gadersd via extropy-chat<br class=""><b class="">Sent:</b><span class="Apple-converted-space"> </span>Thursday, 8 June, 2023 6:29 PM<br class=""><b class="">To:</b><span class="Apple-converted-space"> </span>ExI chat list <<a href="mailto:extropy-chat@lists.extropy.org" class="">extropy-chat@lists.extropy.org</a>><br class=""><b class="">Cc:</b><span class="Apple-converted-space"> </span>Gadersd <<a href="mailto:gadersd@gmail.com" class="">gadersd@gmail.com</a>><br class=""><b class="">Subject:</b><span 
class="Apple-converted-space"> </span>Re: [ExI] chatbots and marketing: was RE: India and the periodic table<o:p class=""></o:p></div></div></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><blockquote style="margin-top: 5pt; margin-bottom: 5pt;" class="" type="cite"><div class=""><div class=""><div class=""><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">In any case… it is easy enough to see a GPT-toolkit coming, so that everyone can try training one’s own chatbot by choosing the training material carefully.<o:p class=""></o:p></div></div></div></div></blockquote><div class=""><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">There is a huge difference between inference and training. You are correct that the masses can run some of the smaller models on CPUs and consumer GPUs. But that is just inference. Training these models requires much more compute. However, there have been quite a few quantization hacks that may enable training on low end hardware. 
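As a sketch of the general idea behind those quantization hacks (symmetric int8 round-trip quantization, not any particular library's method), the following shows the memory saving and the rounding error it costs:

```python
import numpy as np

# Minimal sketch of symmetric per-tensor int8 weight quantization -- the
# general idea, not any specific library's implementation.

def quantize_int8(w):
    """Map float weights to int8 plus a per-tensor scale."""
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32)
q, scale = quantize_int8(w)

# int8 storage is 4x smaller than fp32; the cost is bounded rounding error.
err = np.abs(dequantize(q, scale) - w).max()
print(f"max abs error: {err:.4f}, memory ratio: {w.nbytes / q.nbytes:.0f}x")
```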
I’m not sure how significant the tradeoff will be, but the community has surprised me with the tricks it has come up with.<o:p class=""></o:p></div><div class=""><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div></div><div class=""><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">The wisdom used to be that one had to have a $12,000 GPU to do anything interesting, but ever since the llama.cpp guy got a model running on a MacBook I think any declaration of hard limitations should be thrown out.<o:p class=""></o:p></div></div><div class=""><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div></div><div class=""><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">We may very well see an explosion of user-trained models that blow up the internet…<o:p class=""></o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">Hi Gadersd,<span class="Apple-converted-space"> </span><o:p class=""></o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">Most of us have background processor power that goes unused. For many years I ran a math program called Prime95, where we were collectively searching for the next Mersenne prime. 
There was SETI@home, which was analyzing signals from deep space looking for ET, but in all that, for about the past 30 years, I theorized that eventually the killer app would show up. It would be something that takes mind-boggling amounts of CPU cycles to calculate, but something that is well-suited for doing in the background on many parallel processors. Custom chatbots are an example (I think) of that magic process I have been anticipating for nearly three decades.<o:p class=""></o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">Since I think you know more about the computing requirements to do the training than I do, I ask you: given a pool of a million volunteers with ordinary consumer-level CPUs but willing to allow the processes to run in the background, could not anyone with a pile of training material use that background computing resource and create a custom GPT? <span class="Apple-converted-space"> </span><o:p class=""></o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">Next question for Gad or anyone else: is it not clear there is a huge market available? I gave one example: a chat-bot trained on the books in the fish bowl, all of it in the 200 class in the old Dewey decimal system. 
The result isn’t any good for any question outside what is found in the 200 class, but plenty of market exists, and will persist, for what the 200 class answers.<o:p class=""></o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">Gad or anyone, could we use volunteer background computing resources the way Prime95 has been doing all these years?<o:p class=""></o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class="">spike<o:p class=""></o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div><div style="margin: 0in; font-size: 11pt; font-family: Calibri, sans-serif;" class=""><o:p class=""> </o:p></div></div></div><span style="caret-color: rgb(0, 0, 0); font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: 400; letter-spacing: normal; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px; -webkit-text-stroke-width: 0px; text-decoration: none; float: none; display: inline !important;" class="">_______________________________________________</span><br style="caret-color: rgb(0, 0, 0); font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: 400; letter-spacing: normal; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px; -webkit-text-stroke-width: 0px; text-decoration: none;" class=""><span style="caret-color: rgb(0, 0, 0); font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: 400; letter-spacing: normal; text-align: start; text-indent: 0px; text-transform: none; 
white-space: normal; word-spacing: 0px; -webkit-text-stroke-width: 0px; text-decoration: none; float: none; display: inline !important;" class="">extropy-chat mailing list</span><br style="caret-color: rgb(0, 0, 0); font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: 400; letter-spacing: normal; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px; -webkit-text-stroke-width: 0px; text-decoration: none;" class=""><a href="mailto:extropy-chat@lists.extropy.org" style="font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: 400; letter-spacing: normal; orphans: auto; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px;" class="">extropy-chat@lists.extropy.org</a><br style="caret-color: rgb(0, 0, 0); font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: 400; letter-spacing: normal; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px; -webkit-text-stroke-width: 0px; text-decoration: none;" class=""><a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" style="font-family: Helvetica; font-size: 12px; font-style: normal; font-variant-caps: normal; font-weight: 400; letter-spacing: normal; orphans: auto; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px;" class="">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a></div></blockquote></div><br class=""></div></div></body></html>