[ExI] Training GPT-3 to play nice - it's difficult!

BillK pharos at gmail.com
Mon Nov 2 17:27:18 UTC 2020


On Mon, 2 Nov 2020 at 17:12, spike jones via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
<snip>
>
> Another fun experiment: find old posts from Mike Lorrey, Lee Corbin, Mike Butler and Daniel whats-his-name, train it on those four, see if ExiMod boots it.  {8^D  That one would be cool because ExI would warn her (it?) but of course she doesn't learn from the warning, so you know she would be cruisin' fer a bruisin'.  Alternative: first warning from ExiMod, tip off ExiMod that it is a joke (sorta) and the Turing Test has just been passed (sorta.)
>
> BillK, do you know what the best platform is that we can get to?  Some kind of public domain something where we give it the feedstock text?
>
> spike


Training GPT-3 from scratch is not easy. You need hundreds of billions
of words of text: something like the whole of Reddit plus a filtered
crawl of the web.

A smaller body of text quickly leads to overfitting: the model starts
repeating the same few phrases, like the early chatbots did.
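
The practical route is fine-tuning rather than training from scratch.
GPT-3's weights aren't downloadable, but its smaller public cousin
GPT-2 is, and the Hugging Face transformers library will fine-tune it
on a plain text file. A minimal sketch, assuming the list archives
have been dumped into a hypothetical "exi_posts.txt":

# Fine-tune GPT-2 on a small text corpus with Hugging Face
# transformers (pip install transformers torch).
# "exi_posts.txt" is a hypothetical plain-text dump of list archives.

from transformers import (
    DataCollatorForLanguageModeling,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships with no pad token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Chop the raw archive into 128-token training examples.
dataset = TextDataset(tokenizer=tokenizer,
                      file_path="exi_posts.txt",  # hypothetical corpus
                      block_size=128)

# mlm=False means plain next-token prediction, which is what GPT models do.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-exi",
                           overwrite_output_dir=True,
                           num_train_epochs=3,
                           per_device_train_batch_size=4),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()

# Sample from the fine-tuned model.
prompt = tokenizer("The trouble with uploading is", return_tensors="pt")
out = model.generate(**prompt, max_length=60, do_sample=True, top_k=50,
                     pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0]))

Run that on a few megabytes of posts and you'll see the problem: after
a couple of epochs the samples start echoing whole phrases from the
training file, which is the repetition I mean.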

BillK


