[ExI] Talk to GPT-3 via Philosopher AI website

Stuart LaForge avant at sollegro.com
Wed May 5 23:42:43 UTC 2021


Quoting Bill W and Bill K:


> Message: 1
> Date: Sat, 1 May 2021 17:24:25 -0500
> From: William Flynn Wallace <foozler83 at gmail.com>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Subject: Re: [ExI] Talk to GPT-3 via Philosopher AI website
> Message-ID:
> 	<CAO+xQEbkqrnM5wxkPHz0_uLX9W3sJZOx3p0VAjmJT8KXuFAHLA at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> Ignorant and/or stupid question:
>
> Modern physics supports Relationalism, with
> Quantum Mechanics arguing that particles have a wave-particle duality
> and strings having a shape and direction but only when observed.
>
> How do they know what it is doing when not observed?  bill w

We don't. All we know is the wave function, which encapsulates  
everything it CAN do whether we watch it or not.
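To put a formula behind that (my gloss, not anything from the quoted thread): the wave function \(\psi\) assigns an amplitude to every possible outcome, and the Born rule converts amplitudes into probabilities only when a measurement is made:

\[
P(x) = |\psi(x)|^2, \qquad \int |\psi(x)|^2 \, dx = 1
\]

Between measurements, \(\psi\) simply evolves deterministically under the Schroedinger equation; what the particle "is doing" unobserved is not defined beyond that.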

>
> On Sat, May 1, 2021 at 5:07 PM BillK via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Thu, 29 Apr 2021 at 13:43, Dan TheBookMan via extropy-chat
>> <extropy-chat at lists.extropy.org> wrote:
>>>
>>> Didn't work that well for me:
>>> https://philosopherai.xyz/2c1b0c9c-d448-411e-a05c-532ed501f613
>>>
>>
>> It seems to be a personal website, sometimes too busy, please try later.
>> :)
>>
>>> What is substantivalism?
>>>
>>> Substantivalism is a position in the philosophy of mathematics.
>>> It states that numbers and mathematical objects have real existence.
>>> _______________________________________________
>>
>>
>> I now get different results for substantivalism.  Wonder why it changes???

I think part of the problem is confusing the raw algorithm with the  
fully trained model, or even with a model that updates itself on a  
regular basis. As someone who has dabbled with neural networks and  
their training, I can tell you that the raw algorithm is a tabula rasa  
in every way: training starts with random values for every neuron's  
weights. I have done experiments showing that you can take instances  
of the very same neural network, train them identically on the very  
same data, and still end up with distinct networks whose weights  
differ slightly. If properly trained, each network will still get the  
right answer, but the last few decimal digits of each neuronal weight  
will differ, and different "trainings" will produce AIs that can give  
different answers in the gray areas where the answer is not absolute.
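Here is a minimal sketch of the seed effect I am describing (all names and parameter choices here are mine, for illustration): two copies of the same tiny network, trained on XOR with an identical procedure and identical data but different random initializations, both learn the task yet finish with measurably different weights.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(seed, hidden=4, epochs=5000, lr=0.5):
    """Train a 2-hidden-1 MLP on XOR from a seed-dependent random init."""
    rng = np.random.default_rng(seed)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])
    # Random initial weights -- the ONLY thing the seed changes
    W1 = rng.normal(0.0, 1.0, (2, hidden)); b1 = np.zeros((1, hidden))
    W2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros((1, 1))
    losses = []
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)            # forward pass
        out = sigmoid(h @ W2 + b2)
        losses.append(float(((out - y) ** 2).mean()))
        d_out = (out - y) * out * (1.0 - out)       # backprop, squared error
        d_h = (d_out @ W2.T) * h * (1.0 - h)
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0, keepdims=True)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(0, keepdims=True)
    return W1, losses[0], losses[-1]

# Identical data, identical training procedure, different random seeds:
W1_a, first_a, final_a = train_xor(seed=0)
W1_b, first_b, final_b = train_xor(seed=1)
print(np.allclose(W1_a, W1_b))   # False: the learned weights differ
```

Both runs drive the loss down from its initial value, but the final weight matrices are not the same; scaled up to billions of weights, that is why two "identical" trainings can disagree in the gray areas.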

Also, in the specific case of GPT-3 there are four different models  
with different parameter counts and levels of training: Ada, Babbage,  
Curie, and Davinci. OpenAI charges the most for API calls to Davinci,  
which is the largest and smartest model, and the least for API calls  
to Ada, which is the smallest. The website you link to sounds like it  
is running off of Ada.

Stuart LaForge


>> ----------------
>> What is substantivalism?
>>
>> The term substantivalism describes a broad category of metaphysics
>> where an object's properties and nature determine its identity. In
>> this school of thought, objects are distinct and separate from the
>> outside world.
>>
>> It is contrasted by Relationalism which argues that objects have
>> relational identities. Modern physics supports Relationalism, with
>> Quantum Mechanics arguing that particles have a wave-particle duality
>> and strings having a shape and direction but only when observed.
>> ________________________
>>
>> This response seems to relate to <
>> https://en.wikipedia.org/wiki/Relationalism>
>> "According to Newton's substantivalism, space and time are entities in
>> their own right, existing independently of things. Leibniz's
>> relationism, on the other hand, describes space and time as systems of
>> relations that exist between objects".
>>
>>
>> BillK
