[ExI] India and the periodic table
Keith Henson
hkeithhenson at gmail.com
Sat Jun 3 15:59:49 UTC 2023
I bet the Chinese fail to make such an AI, or at best it will take
them a couple of years.
In AI terms, that's a couple of eons.
Best wishes,
Keith
On Sat, Jun 3, 2023 at 7:59 AM Gadersd via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
> > It wouldn't be so much censorship as choosing the training material. So if the commies only let the chatbot read commie input material, then no censorship would be required, ja? As in Orwell's Nineteen Eighty-Four, it was impossible to criticize the government in Newspeak because there was no available vocabulary from which to do so, given that subset of English.
>
> There probably isn't enough communist-friendly training material available to train a powerful model. These models need billions or even trillions of tokens/words. The best way is to train models on all internet data and then beat the model with a programmatic stick until it stops supporting capitalism and recognizes the "true way", similar to how some liberal universities do it but in a more digitized fashion.
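That "programmatic stick" can be caricatured in a few lines: sample several candidate answers and keep whichever one a hand-written reward function scores highest. This is best-of-n selection, a much cruder relative of the RLHF actually used on real models, and every name and string below is made up purely for illustration:

```python
import random

# Toy "model": emits canned candidate answers (a stand-in for an LLM sampler).
CANDIDATES = [
    "Markets allocate goods efficiently.",
    "The collective owns the means of production.",
    "Central planning coordinates the economy.",
]

def reward(text: str) -> int:
    """Hand-written preference: penalize words the censor dislikes."""
    banned = {"markets", "efficiently"}
    return -sum(w.strip(".").lower() in banned for w in text.split())

def best_of_n(n: int = 10, seed: int = 0) -> str:
    """Sample n candidate answers and keep the highest-reward one."""
    rng = random.Random(seed)
    samples = [rng.choice(CANDIDATES) for _ in range(n)]
    return max(samples, key=reward)
```

Real RLHF adjusts the model's weights rather than filtering its outputs, but the shape of the idea is the same: a scalar reward steers generation without touching the training corpus.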
>
> > If a chatbot is trained only on one very limited text, then only the words found in that text will be in the generated answer?
>
> Correct: the models can only output tokens they have seen in their training set. If you converted the entire internet to kindergarten-level English, you could train a model that could speak about anything, always in childlike English.
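That closed-vocabulary point can be shown with a toy model, a bigram Markov chain rather than a real transformer: whatever it generates, every word necessarily comes from the training text. The snippet of Seuss-flavored training text below is paraphrased for illustration, not quoted:

```python
import random
from collections import defaultdict

TRAINING_TEXT = (
    "i do not like green eggs and ham "
    "i do not like them sam i am "
    "i will not eat them in the rain "
    "i will not eat them on a train"
)

def train_bigrams(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    follows = defaultdict(list)
    for a, b in zip(words, words[1:]):
        follows[a].append(b)
    return follows

def generate(follows, start="i", length=12, seed=42):
    """Walk the chain from a start word; output is confined to the training vocabulary."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = follows.get(out[-1])
        if not nxt:  # dead end: this word was never followed by anything
            break
        out.append(rng.choice(nxt))
    return " ".join(out)
```

A transformer's sampling step is far more sophisticated, but the constraint is identical: the output distribution is defined over the vocabulary it was trained with and nothing else.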
>
> > If I made my own GE&H-bot and asked it a question about plasma physics, its own dictionary would define plasma as something like train of rain? Then give an obviously silly answer a 3 yr old child could understand as well as we do?
>
> If you had a large enough corpus of Dr. Seuss material, generated by GPT4 perhaps, something similar to that would be the likely result.
>
> > Reason I asked: if someone were to create a Bible-bot with input consisting only of a Bible and a few dozen commentaries on that collection of books, would that generate actual text sounding like the Bible? If so, could we generate something like the Book of Mormon? If so, could we not make a cubic buttload of money off of that?
>
> That could be done but would require an astronomical amount of Bible-themed training text. GPT4 could be leveraged to generate the material. However, it would be much easier to just fine-tune a general model such as GPT4 on religious text and ask it to write a novel religious text. Fine-tuning may not even be necessary, as GPT4 may have already memorized most of the religious texts of the world.
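In the same toy bigram spirit (not the actual GPT4 fine-tuning machinery), fine-tuning amounts to continuing training on a new themed corpus so that its word transitions come to dominate the base model's. Both corpora below are made up for illustration:

```python
from collections import Counter, defaultdict

def count_bigrams(text, counts=None):
    """Accumulate bigram counts; pass in existing counts to 'fine-tune' on more text."""
    counts = counts if counts is not None else defaultdict(Counter)
    words = text.lower().split()
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1
    return counts

BASE = "the cat sat on the mat and the dog sat on the rug"
THEMED = ("in the beginning the word was with the light "
          "and the light was in the beginning")

model = count_bigrams(BASE)           # base training pass
model = count_bigrams(THEMED, model)  # "fine-tuning" pass on the themed corpus

# After the second pass, themed successors of "the" outnumber the base ones.
most_common_after_the = model["the"].most_common(1)[0][0]
```

The base counts are still there, which is why a fine-tuned model keeps its general fluency while its style drifts toward the new material, and why light fine-tuning is so much cheaper than training from scratch.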
>
> > On Jun 2, 2023, at 9:20 PM, spike jones via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> >
> >
> >
> > ...> On Behalf Of Gadersd via extropy-chat
> > Subject: Re: [ExI] India and the periodic table
> >
> >>> ... Prediction: both China and India will reverse course in the foreseeable future.
> >
> >> ...The Chinese government loves power too much to let itself be left in the dust. Be prepared for Communist Censorship Chatbots to take the stage...
> >
> >
> > Perhaps some GPT hipsters can educate me please.
> >
> > It wouldn't be so much censorship as choosing the training material. So if the commies only let the chatbot read commie input material, then no censorship would be required, ja? As in Orwell's Nineteen Eighty-Four, it was impossible to criticize the government in Newspeak because there was no available vocabulary from which to do so, given that subset of English.
> >
> > This is the part I don't know and invite wise counsel. If a chatbot is trained only on one very limited text, then only the words found in that text will be in the generated answer? If I train a chatbot using only the 300 words in the Dr. Seuss classic Green Eggs and Ham, then I can only get answers with those 300 words? And the GE&H-bot will have a complete dictionary with every word defined in terms of the 300 GE&H words? If so, then any answers would not make a lot of sense. Everything would be in a boat with a goat, or on a train in the rain, but nothing outside of that?
> >
> > If I made my own GE&H-bot and asked it a question about plasma physics, its own dictionary would define plasma as something like train of rain? Then give an obviously silly answer a 3 yr old child could understand as well as we do?
> >
> > Reason I asked: if someone were to create a Bible-bot with input consisting only of a Bible and a few dozen commentaries on that collection of books, would that generate actual text sounding like the Bible? If so, could we generate something like the Book of Mormon? If so, could we not make a cubic buttload of money off of that?
> >
> > spike
> >
> >
> > _______________________________________________
> > extropy-chat mailing list
> > extropy-chat at lists.extropy.org
> > http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
>