[ExI]  Google’s Willow Quantum Chip: Proof of the Multiverse?
    Jason Resch 
    jasonresch at gmail.com
       
    Mon Oct 13 14:52:32 UTC 2025
    
    
  
On Mon, Oct 13, 2025, 9:17 AM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On Mon, Oct 13, 2025, 1:26 AM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Superdeterminism is a hidden variables theory. One where the hidden
>> variables are assigned their values at the time the particles are created.
>> Do you agree with this much of the standard definition of superdeterminism?
>>
>
> Yes, except for "are assigned" and possibly the time aspect.
>
Okay.
> They have values when created, but "assigned" may imply that someone does
> the assigning.  "One where the hidden variables have their values at the
> time..." would be more accurate.
>
I think we agree. I don't mean to imply anything anthropomorphic is going
on, merely that if a variable has a particular value, then that value has
"been assigned". I say this in the general "programming sense" of the
word assignment, as in: "int x = 5;"
> As to the time aspect: at least some particles appear to have at least
> some values depending on properties from before they existed.  For example,
> when an electron changes energy state around an atom and emits a photon,
> that photon's energy will depend on the difference between the states that
> electron is transitioning from and to.
>
Yes. Conservation laws enable us to greatly constrain the range of possible
values for many things. In fact, according to many-worlds, the entire time
evolution of the universal wave function is deterministic, so all values
and outcomes would then depend on properties that existed previously.
> Likewise, when a proton and an electron smash into each other resulting in
> a neutron, that neutron's initial position and velocity are highly
> dependent on that proton's and electron's.  Similar to the lack of data
> about interstellar matter giving rise to descriptions of "dark matter", we
> have not yet mapped subatomic forces well enough to know for sure that it
> is all such properties, but that would be consistent with the evidence thus
> far.
>
We know there is conservation of energy, momentum, and angular momentum.
Moreover, we know by Noether's theorem that these conservation laws are
consequences of time-translation, space-translation, and rotational
symmetry, respectively. So if the laws of physics are the same regardless
of when, where, or the angle at which you are oriented, then we can
mathematically prove these conservation laws apply to all physical systems.
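As a minimal illustration of the time-symmetry case (a one-dimensional
sketch of my own, not the theorem in full generality): if a Lagrangian
$L(q, \dot q)$ has no explicit time dependence, then the energy
$E = \dot q \, \partial L/\partial \dot q - L$ is conserved, because along
any solution of the Euler-Lagrange equation

$$\frac{dE}{dt} = \dot q \left( \frac{d}{dt}\frac{\partial L}{\partial \dot q} - \frac{\partial L}{\partial q} \right) = 0.$$

The space-translation and rotation cases work the same way, yielding
conservation of momentum and angular momentum.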
> (Unlike with dark matter, it is not yet known if this mapping is
> theoretically possible.  The debate between superdetermination, MWI, et al
> may turn out to be unprovable.)
>
> Thus: at least some of the properties depend on things from before
> particle creation - sometimes, ultimately, very far before.
>
>> If so, then I would pose this challenge: by what mechanism are they
>> assigned?
>>
>
> In at least some cases, possibly all: pre-existing, all the way back to
> the Big Bang.  We do not know what happened before the Big Bang, and that
> would include how this state of things came to be.
>
> I say "at least some" because some at-the-time generation mechanisms have
> yet to be ruled out.  But even in these cases, once the particle exists it
> has its values.
>
Thank you; I appreciate your description of your view.
>> How come they are assigned in a way that will anticipate the manner in
>> which they will later be measured, when that decision may be made in a way
>> seemingly causally disconnected from the assignment of the hidden variables?
>>
>
> They are not.  Again: in every specific instance I have examined where
> someone claimed that an a priori distribution was modified by subsequent
> actions (as opposed to e.g. filtering out some particles so as to change
> the nature of the distribution after it was generated), it turned out to
> not be the case - if one accepts that the a priori distribution fully
> existed, even if hidden, before the alleged modification.
>
> It is, however, fully within the capability of observers to see this
> distribution and convince themselves there is a causal link.
>
Are you familiar with Mermin's Bell inequality experiment, with its two
detectors, each of which can be set to one of three possible settings?
If not (or if you want a refresher), here is a great (and short) account
of it:
https://youtu.be/0RiAxvb_qI4
Understanding this experiment and its implications will be necessary to
understand my comments that follow.
> (One might wonder, if possibly all values go back to at least the Big
> Bang, if the decision to measure a thing in a certain way thus effectively
> predates the particle's creation.  Whether or not that's possible, it does
> not seem necessary to explain this.)
>
> For example, with a distribution (evenly distributed on average) of a
> certain property of 1 1 0 1 0 0 - if you take every third particle and
> measure just the first two of those particles that one time, you'll get
> 0 0.  This does not mean that you set the original distribution to
> 0 0 0 0 0 0, nor that it somehow anticipated that you'd do that and set
> up so you'd get 0 0, no matter how stridently you insist it did.  Try
> again with another distribution and you'll have equal odds of 0 0, 1 0,
> 0 1, or 1 1.
>
It is not the randomness of the values that is strange or hard to
explain, nor that they show a certain correlation. Indeed, we can easily
imagine that particles are like a pair of matching gloves: when we find
one is a right-handed glove, we know the other will be left-handed.
Nothing overly strange or mysterious is needed to explain such a thing.
Where things get strange, and very hard to explain, is when we look at
the not-perfectly-correlated measurements. It is then that we find (and
can mathematically prove) that no pre-existing fixed set of information
the particles took with them, nor any function computed on that data,
can account for the observed facts (see the sketch below) that:
A) when both devices are set to the same position, the results are 100%
correlated
B) when the devices are set to different positions, the results are only
25% correlated
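To make the counting argument concrete, here is a minimal sketch in
Python (my own illustration; the names are made up, and it is not taken
from the video above). Because fact A (perfect same-setting correlation)
forces both particles to carry identical pre-set answers, it suffices to
enumerate the 8 possible "instruction sets" mapping the 3 settings to an
outcome, and check the agreement rate when the settings differ:

    from itertools import product

    # The 8 deterministic instruction sets: one pre-set answer
    # ("R" or "G") for each of the three detector settings 0, 1, 2.
    for instructions in product("RG", repeat=3):
        # Agreement rate over the 6 ordered pairs of *different* settings.
        agree = sum(instructions[a] == instructions[b]
                    for a in range(3) for b in range(3) if a != b)
        print(instructions, agree / 6)

Running it shows every instruction set agrees on at least 1/3 of the
different-setting pairs: the non-constant sets like R R G give exactly
2/6 = 1/3, and the constant sets R R R and G G G give 6/6. A statistical
mixture of instruction sets can only average these rates, so any model
of this kind predicts at least 33% agreement, while the observed value
is 25%. That is the seeming mathematical impossibility referred to below.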
There are a few "outs" for this seeming mathematical impossibility.
Copenhagen takes the out that there is a faster-than-light influence such
that when one particle is measured the other is instantly updated.
Many-worlds takes the out that there is not a single outcome of a
measurement (Bell's inequality relies on the assumption that experiments
have single unique outcomes).
The "out" which superdeterminism takes is to say that the information the
particle has (and took with it) contained information about what position
the measurement switches would be in at each location at the time each
particle is measured.
But how did this information get there? If we set the positions by
rolling a die, how can the particle's properties be tied to the outcome
of this die roll, and why are its statistics such as to show us a 25%
correlation, when it would be so much simpler to show a 33% correlation?
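For concreteness, here is where the two numbers come from (a standard
textbook calculation, not tied to any specific experiment mentioned
above). For entangled photons in the state
$(\lvert HH \rangle + \lvert VV \rangle)/\sqrt{2}$, polarizers at
relative angle $\theta$ give the same result with probability
$\cos^2 \theta$, so with three settings spaced $60^\circ$ apart:

$$P(\text{agree} \mid \text{different settings}) = \cos^2 60^\circ = \left(\tfrac{1}{2}\right)^2 = \tfrac{1}{4} = 25\%,$$

whereas, as the counting sketch above shows, any pre-set instruction
scheme yields at least $1/3 \approx 33\%$.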
If you say, "it was all pre-set at the time of the big bang," that's all
well and good, but for every way things could be set up at the time of
the big bang to produce these otherwise impossible-to-explain correlation
statistics, there are trillions upon trillions of ways things could be
set up that would not produce them.
Why should we prefer this out, which requires so much specific tuning of
initial conditions, over the other outs of FTL influences or of
measurements having more than one outcome?
Jason