[ExI] ChatGPT 'Not Interesting' for creative works

spike at rainier66.com spike at rainier66.com
Mon Mar 6 03:10:02 UTC 2023



…> On Behalf Of Darin Sunley via extropy-chat
Subject: Re: [ExI] ChatGPT 'Not Interesting' for creative works

>… ChatGPT3 has ~175 billion parameters. Training it requires datacenters of computing power. But the model itself will fit into a relatively small number of desktop PCs, even without compression. I'm pretty sure the model itself can be compressed to where paths through it will fit in the memory of a beefy desktop…
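
A quick back-of-the-envelope on that claim, using the 175 billion figure quoted above and standard bytes-per-parameter sizes (the 4-bit line assumes quantization that aggressive still leaves a usable model):

# Weight-storage footprint of a 175B-parameter model at common precisions.
# Pure arithmetic; says nothing about activation memory or quality loss.
params = 175e9
for label, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1), ("4-bit", 0.5)]:
    gib = params * bytes_per_param / 2**30
    print(f"{label}: ~{gib:,.0f} GiB for the weights alone")

So in half precision the weights alone come to a few hundred gigabytes, which is indeed a small number of well-equipped desktops' worth of memory, and aggressive quantization brings it closer to one.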

Cool, that matches my intuition as someone who watched in real time as Deep Blue, the chess program that ran on a supercomputer, was taken out of service almost immediately after it defeated the carbon unit Kasparov.  We couldn’t figure out why until my computer jockey friend told me IBM didn’t want its big iron to be defeated by a desktop computer.  I wasn’t sure I believed it until I went through Deep Blue’s games against Garry, then compared them with the stuff the desktops were playing less than five years later.  I realized it was the same level of play.

 

But even before those five years were up, whatever magic Deep Blue was calculating could have been done by a few desktops running in parallel and given more time.

 

Darin’s theory gives me an idea: we could get an ExI team together and let our computers collectively train a micro-ChatGPT using the pooled computing resources of a dozen of us.  Then we could take on a similar micro-GPT trained by Mensa or the Prime95 group in a game of Jeopardy or something.
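
For anyone wondering what pooled training would look like mechanically, here is a minimal sketch of synchronous data-parallel training: each participant's machine computes a gradient on its own shard of data, the gradients get averaged, and everyone applies the same update. The worker count, model, and data below are toy stand-ins (a plain least-squares problem in NumPy), not an actual micro-GPT.

# Toy data-parallel training loop: n_workers "desktops" each hold a data
# shard, compute a local gradient, and the averaged gradient updates the
# shared weights. Illustrative only; a real micro-GPT would swap in a
# transformer and real gradient exchange over the network.
import numpy as np

rng = np.random.default_rng(0)
n_workers, dim, steps, lr = 12, 64, 200, 0.1

w_true = rng.normal(size=dim)                 # the "answer" the workers learn
shards = []
for _ in range(n_workers):                    # one private shard per desktop
    X = rng.normal(size=(256, dim))
    y = X @ w_true + 0.01 * rng.normal(size=256)
    shards.append((X, y))

w = np.zeros(dim)                             # shared model, identical everywhere
for _ in range(steps):
    grads = [X.T @ (X @ w - y) / len(y) for X, y in shards]   # local gradients
    w -= lr * np.mean(grads, axis=0)          # average, apply one shared step

print("distance from true weights:", np.linalg.norm(w - w_true))

The expensive part, computing gradients over lots of data, parallelizes cleanly across machines; the catch is that the gradients themselves are model-sized and have to cross the network every step, and that bandwidth is the main practical obstacle to doing this over a dozen home connections.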

 

spike
