[extropy-chat] singularity conference at stanford
Hal Finney
hal at finney.org
Mon May 15 21:49:49 UTC 2006
Metavalent Stigmergy writes:
> I understood his final comment to mean that we greatly *under*
> estimate the chances of particular outcomes; that his study subjects
> essentially said, "i'm 98% certain that such-and-such will *not*
> happen" and yet 42.6% of the time, the event *did* happen.
> Implicitly, in that case, they had given the event a 2% chance of
> happening and they had greatly underestimated those chances.
That sounds right to me. I was emphasizing that people overestimate
things they think will happen, and you're saying that people underestimate
things they think will not happen. It amounts to the same thing.
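The arithmetic behind that equivalence is worth making explicit. A toy sketch, using the figures from the quoted message (which are illustrative, not mine):

```python
# Subjects said "98% certain it will NOT happen", implicitly assigning
# the event a 2% chance; yet it happened 42.6% of the time.
stated_p = 0.02    # implied probability the subjects gave the event
actual_p = 0.426   # observed frequency of the event

# The events were roughly 21x more likely than the subjects believed.
factor = actual_p / stated_p
print(f"underestimated by a factor of {factor:.1f}")
```

Whether you call that overestimating the likely outcome or underestimating the unlikely one, the miscalibration is the same.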
I have to confess, when I hear about irrationalities like this, I always
have the same reaction. It's not, "how could I help people overcome
these limitations?" Rather, it's "how could I get rich from this?" :-)
In principle, if people are being irrational in their beliefs, you
should be able to set up some kind of arrangement in which they will
systematically give money away. Not only do you get rich, you also
provide negative feedback to false beliefs and help people to gradually
improve their rationality. At least, that's the rationalization.
In practice, either I can't find a way to do it, or else it turns out
there is already a thriving industry built around the practice, such as
insurance. And in many cases it seems like exploiting these weaknesses
is unsavory and destructive, like loan sharking.
Interestingly, insurance exploits the opposite fallacy to the one we are
talking about here. Insurance basically relies on people overestimating
the chances of rare events. People are willing to pay to avoid risky
outcomes, out of proportion to the true level of danger. (Partly this
is because people are risk averse, but the effect is the same as if they
were risk neutral and overestimated the risk.)
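A toy numeric sketch of that equivalence (all numbers hypothetical, chosen for illustration):

```python
# A risk-averse buyer pays more than the actuarially fair premium.
p_loss = 0.001            # hypothetical true annual chance of the loss
loss = 100_000            # hypothetical size of the loss in dollars
fair_premium = p_loss * loss        # $100: the expected loss
premium_paid = 250                  # what the buyer actually pays

# A risk-neutral buyer would pay that much only if they believed the
# probability of loss were premium_paid / loss:
implied_p = premium_paid / loss     # 0.0025, i.e. 2.5x the true risk
print(fair_premium, implied_p)
```

From the insurer's side, the gap between the premium paid and the fair premium is exactly the profit extracted from the overestimate (or the risk aversion that mimics it).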
Here we have the opposite irrationality, people underestimating the
chance of rare events. One place you might hope to make money would
be in the commodities markets, where you can use options to bet that
the price will change by some large amount. For example, right now,
oil is about $70/barrel. Options prices imply that the market consensus
of the odds that oil will fall below $45/barrel by the end of the year
is about 1%. You might reason that people are overconfident and have
confidence intervals that are too narrow, so the true odds are probably
much greater than that, and take a "long-shot" position in such options.
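One way to see where a figure like that 1% comes from: the price of a binary (digital) option that pays $1 if the event occurs approximates the market's implied probability of the event, ignoring discounting. A toy sketch with hypothetical numbers, not actual 2006 market data:

```python
# Hypothetical binary option: pays $1 if oil finishes below $45/barrel.
payout = 1.0
price = 0.01                      # hypothetical quoted price in dollars
implied_prob = price / payout     # ~1% implied chance of the event

# If you believed the true chance were 5%, the long-shot bet would have
# positive expected value:
believed_prob = 0.05
expected_profit = believed_prob * payout - price   # ~$0.04 per contract
print(implied_prob, expected_profit)
```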
Unfortunately, that doesn't work. I'm not sure why, but when money is on
the line like this, people are not idiots. Statistically, option prices
do not show systematic biases in terms of underestimating unlikely events.
If they did, people would have discovered it a long time ago and made
all kinds of money, until the very actions of these traders put the
prices back to where they should be.
There is a sense though in which markets do provide an opportunity
to profit from overconfidence, which is the profit acquired by the
market-maker himself. Most markets charge commissions on trading,
and trading relies on differences of opinion, which are themselves
strengthened (and perhaps actually caused) by overconfidence.
So ultimately the market maker is profiting from the failure of
rationality we are talking about here.
The same thing happens with bookmakers who take bets on sports events.
The bookie acts as an intermediary and balances the bets on both sides
of the outcome, profiting from overconfident differences in opinion.
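The bookmaker's cut can be made concrete. A toy sketch with hypothetical decimal odds, showing the "overround":

```python
# Hypothetical decimal odds quoted on both sides of an even matchup.
odds_a = 1.91   # payout per $1 staked on side A
odds_b = 1.91   # payout per $1 staked on side B

# The implied probabilities sum to more than 1; the excess is the
# bookie's margin, collected as long as the bets are balanced.
implied_total = 1 / odds_a + 1 / odds_b   # ~1.047
margin = implied_total - 1                # ~4.7% overround
print(f"overround: {margin:.1%}")
```

Overconfident bettors on both sides each think the quoted odds favor them, which is what keeps both sides of the book filled.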
Anyway, in this case as in others, I can't see an available niche for
exploiting this particular form of irrationality that has not already
been filled. An interesting dichotomy arises: some of these
institutions, like sports gambling, are arguably harmful to people and
exploit their irrationality to their detriment. Others, like markets,
are arguably helpful, provide socially useful information, and to at
least some extent encourage greater rationality among participants.
Hal