[ExI] Symbol Grounding

Stuart LaForge avant at sollegro.com
Sun Apr 23 17:38:04 UTC 2023


Quoting Jason Resch via extropy-chat <extropy-chat at lists.extropy.org>:

>
> "Here’s a quick recap on the saga that’s taken us from completely
> closed-source models like GPT-4 to the latest open-source AI models that
> anyone (with a beefy enough computer) can run for free.
>
> First, Facebook (aka Meta) announced LLaMA, an alternative to OpenAI’s
> GPT-4 that they wanted to restrict only to certain researchers they
> approved.
>
> Barely a week later LLaMA got leaked by 4Chan users, meaning anyone could
> download LLaMA and use it themselves, with the small asterisk* that they
> might get sued by Facebook if they used it to build a business.
>
> Then, some researchers from Stanford showed that the behavior of a large
> language model (LLM) like GPT-4, which cost many millions to train, could be
> approximated for just a few hundred dollars by taking a pretrained base
> model like LLaMA and having one of OpenAI's strong models do all the hard
> work of generating the instruction-tuning data. That project was called
> Alpaca.
>
> Alpaca used OpenAI’s hard work to build a model that’s 80% as good for
> almost free. This means any powerful AI model can quickly spawn as many
> “pretty good” models as we want.
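The Alpaca recipe described above can be sketched in miniature: a strong "teacher" model generates instruction/response pairs, and those pairs are then used to fine-tune a cheap pretrained base model. This is only an illustrative sketch, not Alpaca's actual code; the teacher below is a local stub standing in for an API call to a large model, and all names are made up.

```python
# Hedged sketch of an Alpaca-style distillation pipeline: a teacher
# model answers seed instructions, producing supervised training pairs
# for fine-tuning a smaller base model. Stubbed and illustrative only.

def teacher_generate(seed_instruction: str) -> dict:
    """Stand-in for querying the strong teacher model for a response."""
    return {
        "instruction": seed_instruction,
        "response": f"(teacher's answer to: {seed_instruction})",
    }

def build_dataset(seed_instructions: list[str]) -> list[dict]:
    """Expand seed instructions into instruction/response training pairs."""
    return [teacher_generate(s) for s in seed_instructions]

if __name__ == "__main__":
    seeds = ["Explain photosynthesis.", "Write a haiku about spring."]
    dataset = build_dataset(seeds)
    # In the real recipe, these pairs would feed a supervised
    # fine-tuning loop over the base model's weights (omitted here).
    for pair in dataset:
        print(pair["instruction"], "->", pair["response"])
```

The point of the sketch is the data flow: the expensive step (generating good responses) is outsourced to the teacher, so the student's training run is cheap.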
>
> Last week, we got Dolly 2.0, perhaps the world's first truly open LLM.
> Dolly is special because unlike LLaMA and other not-quite-open models, the
> dataset, dataset licensing, training code, and model weights are all
> open-source and suitable for commercial use.
>
> On Monday we got an even more ambitious attempt to build an open-source
> dataset for training LLMs: RedPajama-Data, which has over 1.2 trillion
> tokens worth of training data anyone can use.
>
> As of yesterday we now have MPT-1b-RedPajama-200b-dolly — a “1.3 billion
> parameter decoder-only transformer pre-trained on the RedPajama dataset and
> subsequently fine-tuned on the Databricks Dolly dataset.”
>
> Phew, that’s a lot. Caught your breath?

Thanks for that list. For completeness, I would add FreedomGPT, which  
I found out about from BillK.

https://openaimaster.com/what-is-freedomgpt-how-does-it-work/#:~:text=is%20Freedom%20GPT%3F-,How%20does%20it%20work%3F,privacy%2C%20neutrality%2C%20and%20customization.

I have downloaded a copy of FreedomGPT, which I intend to install on  
its own air-gapped PC so that I can tinker with it when I have the  
opportunity.

>
> Here’s what’s next: We now live in a world where anyone who wants a
> powerful AI model can quickly and cheaply create one.
>
> This means big companies and governments will need to tread very carefully
> as they develop the next generation of even more powerful AI.
> If they create something dangerous it will quickly spawn thousands of
> almost-as-powerful replicas."

I sense some big companies are starting to play their cards a lot  
closer to their vests in that regard. I am not entirely sure I trust  
Sam Altman's claim that he hasn't started developing GPT-5. Also,  
Elon Musk has started a new AI company and bought 10,000 GPUs for it.
https://venturebeat.com/ai/elon-musk-quietly-starts-x-ai-a-new-artificial-intelligence-company-to-challenge-openai/

In any case, it will be interesting going forward.

Stuart LaForge
