[ExI] ChatGPT 'Not Interesting' for creative works

Darin Sunley dsunley at gmail.com
Mon Mar 6 02:45:32 UTC 2023


Correction: that's GPT-3. ChatGPT is significantly smaller.

On Sun, Mar 5, 2023 at 7:44 PM Darin Sunley <dsunley at gmail.com> wrote:

> ChatGPT3 has ~175 billion parameters. Training it requires a
> datacenter's worth of computing power, but the trained model itself
> would fit into a relatively small number of desktop PCs, even without
> compression. I'm pretty sure the model can be compressed to the point
> where paths through it would fit in the memory of a beefy desktop.
>
> On Sun, Mar 5, 2023 at 7:29 PM spike jones via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>> >... On Behalf Of BillK via extropy-chat
>>
>>
>> Ecclesiastes 12:12
>> And further, my son, be admonished by these. Of making many books there is
>> no end, and much study is wearisome to the flesh.
>> ----------
>>
>> >...And now we have the Internet, self-publishing, Kindle and ChatGPT.
>> We have so much to read it is indeed wearisome to the flesh.
>> I don't think Stephenson is talking just about his personal preferences.
>> If computers can now produce ream after ream of plausible words strung
>> together, what is the point of spending human time reading this endless
>> stream? If there is no human personality behind it, then let another
>> machine read it.
>>
>>
>> BillK
>>
>>
>> Ja!  This thread has long been heading in this direction BillK: we need
>> versions of ChatGPT that can be personally owned and operated.  I am told
>> it requires tons of bandwidth and computing speed, but I don't understand
>> why one couldn't have a micro-ChatGPT that operates on my one processor
>> and uses my modest home bandwidth, going out researching in its background
>> computing cycles and searching around mostly as I sleep.  I don't
>> understand why it wouldn't gradually get smarter and become a better
>> companion, if it can be trained by me.  It hasta be able to learn and
>> remember what I told it.
>>
>> I still want to try that experiment where you train a micro-ChatGPT, I
>> train one, then we have the two debate away in the night.  Then we see
>> what they said.  That should be a hoot.
>>
>> If anyone here knows exactly why ChatGPT can't be scaled down by six
>> orders of magnitude and sold to consumers, do educate me please.  Seems
>> to me like whatever magic a bank of a thousand computers can do could be
>> done at a thousandth that pace with one.  Ja?  Why not?  I want to try it.
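Spike's six-orders-of-magnitude question can be framed numerically. The sketch below uses approximate published parameter counts for earlier GPT models (cited from memory, so treat them as assumptions); the point is that shrinking the parameter *count* a millionfold lands far below any model that has shown GPT-style fluency, so the reduction removes the capability, not just the cost.

```python
# What does "scaled down by six orders of magnitude" mean in parameters?
# Reference sizes below are approximate public figures, quoted from memory.

full_params = 175e9               # GPT-3 class, per the thread above
scaled_down = full_params / 1e6   # six orders of magnitude smaller

print(f"Scaled-down model: {scaled_down:,.0f} parameters")  # 175,000

# Approximate sizes of earlier models, for comparison:
gpt1_params = 117e6        # GPT-1, ~117 million
gpt2_small_params = 124e6  # smallest GPT-2, ~124 million

# A 175K-parameter network is roughly a thousandth the size of even
# GPT-1, so the quality loss would be qualitative, not a mere slowdown.
print(f"Ratio vs. GPT-1: ~{gpt1_params / scaled_down:,.0f}x smaller")
```

Running slower on one machine (spike's "thousandth that pace" idea) is a different, more plausible trade: keep all the parameters and accept low throughput, limited mainly by how much memory one machine has.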
>>
>> Thanks for the cool Ecclesiastes quote, me lad!
>>
>> spike
>>
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>
>