[extropy-chat] Re: Why Occam's Razor?
ebaum at fastmail.fm
Tue Feb 3 16:40:47 UTC 2004
If you are interested in Occam's Razor, I suggest you will find
*What is Thought?* more illuminating.
Over the last 20 to 40 years, computer scientists have formalized
Occam's razor from a number of perspectives:
Vapnik-Chervonenkis dimension, Minimum Description Length, and
Bayesian statistics among them. Chapter 4 of What is Thought?
provides a pedagogical survey of this literature, including the
ideas behind the main theorems in each of these formalizations.
I believe it is clearly written-- a number of people from various
walks of life have complimented me on chapter 4's
clarity in particular.
Very roughly speaking, CS has formalized Occam's razor to say:
if you have a lot of data and you find a sufficiently compact
description of it, that can only happen because there is a
simple underlying structure to the process producing the data,
and the compact description has captured this structure.
So, for example, (roughly speaking) the fact that Newton's
laws explain a vast array of data implies that the world
is describable by simple physics.
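Very roughly, the compression argument can be illustrated with a toy
sketch (my example, not from the book): using zlib's compressed output
size as a crude stand-in for description length, data produced by a
simple underlying rule admits a far more compact description than
structureless noise of the same size.

```python
import random
import zlib

random.seed(0)

# Data generated by a simple rule -- a "simple underlying structure":
structured = bytes(i % 7 for i in range(10_000))

# Data with no underlying structure:
noise = bytes(random.randrange(256) for _ in range(10_000))

# Compressed size as a rough proxy for description length.
len_structured = len(zlib.compress(structured, 9))
len_noise = len(zlib.compress(noise, 9))

# The rule-generated data compresses to a tiny fraction of its raw
# size; the random data barely compresses at all.
print(len_structured, len_noise)
```

Finding such a compact description is only possible because a simple
process generated the data, which is the intuition the formal theorems
make precise.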
What is Thought? is organized around a generalization of this
formalized Occam's razor, and explains
in some concrete detail why and how thought results from evolution's
exploitation of it, why thought then has the qualities and capabilities
it does, how consciousness arises and what it is, and various
related questions.
What is Thought?
Eric B. Baum
MIT press, Jan 2004.
On Jan 28, 2004 Robert J. Bradbury wrote:
> I can't even figure out how I ran across this, but given that Hal
> and Wei Dai are cited in the acknowledgments section it seems
> worthy of note...
> "Why Occam's Razor" by Russell K. Standish
> I'll observe that from the bibliography one needs to be versed in
> quantum mechanics, multiverse theory, and Turing machine theory to be able
> to get through this.