[ExI] Talk to GPT-3 via Philosopher AI website

BillK pharos at gmail.com
Sun May 2 21:30:46 UTC 2021


On Sun, 2 May 2021 at 01:36, Dan TheBookMan via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
> Yeah, almost. It’s confusing another form of relationalism though. Leibniz was taking the view that the objects exist, but that space and time don’t really or only have a secondary existence. I kind of agree with Lawrence Sklar (in his 1974 book _Space, Time and Spacetime_) here: Newton’s scholium kind of showed in classical physics that there’s something there. Maybe, though, not exactly what Newton thought. But the debate rages on… maybe rages isn’t an accurate way to put it. ;)
>
> I merely wanted to test out the site with what I thought would be an easy if obscure subject.
>
> Regards, Dan
> _______________________________________________
>


I've tried a few more times.  I think these tests make the point that GPT-3
is not intelligent: it does not understand what it writes.
It is searching through its (very large!) training data and selecting
and rewriting material into sensible-sounding sentences, so replies are
generally random associations of sentences that it links to each query,
as shown by the wide variety of responses to substantivalism.
It is a tremendous computing achievement, but I think it would be
risky to treat it as an oracle of wisdom.
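The kind of statistical "chaining of likely continuations" described above can be illustrated with a toy Markov-chain text generator. This is a deliberately crude sketch of the idea, not GPT-3's actual mechanism (GPT-3 is a neural network that predicts tokens from learned parameters rather than looking up files at query time); the corpus below is an invented example:

```python
import random

# Hypothetical toy "training data" echoing the substantivalism debate.
corpus = (
    "space and time exist only as relations between objects . "
    "objects exist in space and time independently of relations . "
    "space and time are real entities in their own right ."
).split()

# Bigram table: map each word to the words observed to follow it.
follows = {}
for a, b in zip(corpus, corpus[1:]):
    follows.setdefault(a, []).append(b)

def generate(seed, length=12, rng=random.Random(0)):
    """Chain statistically plausible continuations from the seed word."""
    out = [seed]
    for _ in range(length):
        nxt = follows.get(out[-1])
        if not nxt:  # dead end: no observed continuation
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

print(generate("space"))
```

Run with different seed words (or different random seeds) and the output varies widely while still sounding locally sensible, which is roughly the behaviour the tests above exhibit, writ very small.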


BillK
