[ExI] Newcomb's Paradox

Adrian Tymes atymes at gmail.com
Fri Feb 24 18:54:13 UTC 2023


On Thu, Feb 23, 2023 at 8:56 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Wed, Feb 22, 2023 at 8:00 PM Giovanni Santostasi via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Jason,
>> The Newcomb paradox is mildly interesting. But the perceived depthness of
>> it is all in the word game that AGAIN philosophers are so good at. I'm so
>> glad I'm a physicist and not a philosopher (we are better philosophers than
>> philosophers but we stopped calling ourselves that given the bad name
>> philosophers gave to philosophy). The false depth of this so-called paradox
>> comes from a sophistry that is the special case of the predictor being
>> infallible. In that case all kinds of paradoxes come up and "deep"
>> conversations about free will, time machines and so on ensue.
>>
>
> I agree there is no real paradox here, but what is interesting is that
> it shows a conflict between two commonly used decision theories: one
> based on empiricism and the other based on expected value. Note that
> perfect prediction is not required for this conflict to arise; it
> happens even with imperfect predictors, say a psychologist who is 75%
> accurate:
>
> Empiricist thinker: Those who take only one box walk away 75% of the
> time with a million dollars. Those who take both boxes walk away 75%
> of the time with $1,000 and 25% of the time with $1,001,000. So I am
> better off trying my luck with one box, as so many others before me
> did that and made out well.
>
> Expected-value thinker: The guess about my behavior has already been
> made by the psychologist, so box A is already either empty or full.
> Taking box B as well adds $1,000 to my total whatever box A holds, so
> I am better off taking both boxes.
>
> On analysis it seems the empiricist tends to do better. So is
> expected-value thinking wrong? If so, where is its error?
>
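
For concreteness, here is the arithmetic behind both views as a small
Python sketch (the dollar amounts are the ones quoted above; treating
the predictor as right with probability p regardless of your choice is
an assumption of the sketch, not part of the problem statement):

    p = 0.75                      # the psychologist's accuracy, from above
    small, big = 1_000, 1_000_000

    # Empiricist view: condition on what players who chose like you got.
    ev_one_box = p * big                              # 750,000
    ev_two_box = p * small + (1 - p) * (small + big)  # 750 + 250,250 = 251,000

    # Expected-value view: the boxes are already filled, so whatever the
    # opaque box holds (call it c), two-boxing yields c + 1,000 - a
    # dominance argument that ignores how c came to be chosen.
    print(ev_one_box, ev_two_box)  # 750000.0 251000.0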

What both analyses miss is the information that led the predictor to
make its prediction in the first place: does the predictor think you are
the kind of thinker who'll take both boxes or the kind who'll only take
one?

Reputation and demonstrated behavior presumably feed into the
predictor's decision.  Whether you get that million is determined by the
predictor's action, which was in turn decided in part by your actions
before you even reached this decision point.

I have been in quite a few analogous real-world situations, where I was
offered a chance at small, short-term gains but the true reward depended
on whether the offerer thought I would grab them - specifically, on the
offerer trusting that I would not.  As with the Prisoner's Dilemma, the
true answer (at least in real-world situations) only emerges if one
considers that the game repeats indefinitely (until one dies, which is
usually not predictable enough to base strategies on), and thus how
one's choices this time impact future plays.
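
As a toy illustration of that repeated-game point (the predictor model
here - it simply extrapolates your most frequent past choice - is an
assumption of the sketch, not part of the original problem):

    def play(strategy, rounds=1000):
        """Repeated Newcomb plays where the predictor watches your record."""
        history, total = [], 0
        for _ in range(rounds):
            # Assumed predictor: guess whatever you have done most often.
            predicts_one_box = history.count("one") >= history.count("two")
            opaque = 1_000_000 if predicts_one_box else 0
            choice = strategy(history)
            total += opaque if choice == "one" else opaque + 1_000
            history.append(choice)
        return total

    print(play(lambda h: "one"))  # earns ~1,000,000 per round
    print(play(lambda h: "two"))  # earns ~1,000 per round after round one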

On the other hand, if it is guaranteed that your choice now will not
impact any other events you care about - including that you will never
play this game again and that your actions now won't affect any future
predictions you care about - then the expected-value thinker is correct,
and the empiricist is wrong to think that other plays of the game are
relevant (those preconditions make them irrelevant, at least to you).
That said, it is all too easy to believe that those conditions apply
when they do not.