[ExI] chatbots and marketing: was RE: India and the periodic table

Gadersd gadersd at gmail.com
Fri Jun 9 01:29:23 UTC 2023


> In any case… it is easy enough to see a GPT-toolkit coming, so that everyone can try training one’s own chatbot by choosing the training material carefully.

There is a huge difference between inference and training. You are correct that the masses can run some of the smaller models on CPUs and consumer GPUs, but that is just inference. Training these models requires much more compute. However, there have been quite a few quantization hacks that may enable training on low-end hardware. I’m not sure how significant the tradeoff will be, but the community has surprised me with the tricks it has come up with.
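To put the inference/training gap in rough numbers, here is a back-of-envelope sketch using common rule-of-thumb byte counts (2 bytes per parameter for fp16 inference, ~0.5 for 4-bit quantized weights, ~16 for full fp32 training with Adam, i.e. weights + gradients + two optimizer states). It ignores activation memory and the KV cache, so treat it as an order-of-magnitude estimate only, not a spec for any particular model.

```python
# Back-of-envelope memory estimate for a 7B-parameter model.
# Byte counts per parameter are rule-of-thumb figures:
#   fp16 inference:  2 bytes (weights only)
#   4-bit inference: 0.5 bytes (quantized weights only)
#   full training:   ~16 bytes (fp32 weights + gradients + 2 Adam states)
# Activations and the KV cache are deliberately ignored.

def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Memory in GiB needed for n_params at the given bytes per parameter."""
    return n_params * bytes_per_param / 2**30

N = 7e9  # 7 billion parameters

print(f"fp16 inference:     ~{model_memory_gb(N, 2):.0f} GiB")
print(f"4-bit inference:    ~{model_memory_gb(N, 0.5):.0f} GiB")
print(f"full Adam training: ~{model_memory_gb(N, 16):.0f} GiB")
```

The weights-only numbers land in consumer-hardware territory (a quantized 7B model fits in a few GiB), while the full-training figure is roughly an order of magnitude larger — which is exactly why quantized training tricks matter.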

The conventional wisdom used to be that one needed a $12,000 GPU to do anything interesting, but ever since the llama.cpp guy got a model running on a MacBook, I think any declaration of hard limitations should be thrown out.

We may very well see an explosion of user trained models that blow up the internet.

> On Jun 8, 2023, at 8:57 PM, spike jones via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> 
>  
>  
> From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf Of Mike Dougherty via extropy-chat
>>  
> >…tbh though, I've been considering what chatGPT/et al. are going to do with/to marketing... maximize profit at ideal customer acquisition cost…  Mike
>  
>  
>  
> I have been kinda quiet lately because I have been studying up on how GPT systems work, and now I understand why we can’t run these things on our phones or typical home computers. But if one is fortunate enough to be able to afford a big number of high-end processors and GPUs, and one is patient, it can be done.  If one has an application which does not require high-speed answers (simulating conversation, for instance), the computing requirement is greatly reduced.  If one has an application where 5 words per minute is acceptable, then such things are easily within the range of the ordinary proletariat.
>  
> If so, then we can imagine a GPT toolkit where the user doesn’t need to be a coding expert.  Rather, the consumer would supply the chat-bot with training material, selected in accordance with that user’s personal biases and preconceived notion of truth, and train a personal chat-bot on that material only.
>  
> I gave the example of the material in the fish bowl where I went to college, with the approximately 200 billion words of more or less similar world view.  But one wouldn’t really need all that, nor would the product of the fish bowl trained GPT be likely to produce an interesting result.  
>  
> In any case… it is easy enough to see a GPT-toolkit coming, so that everyone can try training one’s own chatbot by choosing the training material carefully.
>  
> spike
>  
>  
>  
>  
>  
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat