[extropy-chat] What is Thought? Book announcement
Eric Baum
ebaum at fastmail.fm
Wed Jan 21 16:02:56 UTC 2004
New Book:
What is Thought?
Eric B. Baum
MIT Press, 495 pp.
Best price right now is $32 at BarnesandNoble.com, with free shipping.
To buy this book:
Barnes and Noble.com: http://search.barnesandnoble.com/booksearch/isbnInquiry.asp?userid=2WI405VPJU&isbn=0262025485&itm=17
Amazon: http://www.amazon.com/exec/obidos/tg/detail/-/0262025485/qid=1074532277/sr=1-3/ref=sr_1_3/002-6265544-0286451?v=glance&s=books
MIT Press: http://mitpress.mit.edu/catalog/item/default.asp?sid=AF8A6531-E5E9-4710-A781-CA47C6B64621&ttype=2&tid=9978
*What is Thought?* proposes a computational model of mind that
addresses understanding, language, reasoning, learning, and
consciousness; that is consistent with extensive data from a variety
of fields; and that makes empirical predictions.
Meaning is the computational exploitation of the compact underlying
structure of the world, and mind is the execution of an evolved
program that is all about meaning. Twenty years of computer science
research on Occam's Razor are extrapolated to argue that meaning
results from finding a compact enough program that behaves
effectively in the world; such a program can be compact only by
virtue of code reuse, factoring into interacting modules that capture
real concepts and are reused metaphorically.
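To make the Occam's Razor point concrete, here is a toy illustration
of my own (it is not taken from the book): among models that fit the
same small training set, the more compact one tends to predict new
data better. The quadratic rule, sample sizes, noise level, and
polynomial degrees below are arbitrary choices for the demonstration.

# Toy Occam's Razor demo (illustrative only, not from the book):
# a compact model (degree-2 polynomial) and a much larger one
# (degree-12) both fit a small noisy training set, but the compact
# model generalizes better to held-out data.
import numpy as np

rng = np.random.default_rng(0)

def sample(n):
    x = rng.uniform(-1, 1, n)
    y = 3 * x**2 - x + rng.normal(0, 0.3, n)   # true rule plus noise
    return x, y

x_train, y_train = sample(15)    # small training set
x_test, y_test = sample(200)     # held-out data

for degree in (2, 12):
    coeffs = np.polyfit(x_train, y_train, degree)   # least-squares fit
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: held-out mean squared error = {test_mse:.3f}")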
For a variety of reasons, including arguments based on complexity
theory, developmental biology, evolutionary programming, ethology,
and simple inspection, this compact Occam program is most naturally
seen as residing in the DNA rather than in the brain.
Learning and reasoning are then fast and almost automatic
because they are constrained by the DNA programming
to deal only with meaningful quantities. Evolution itself is argued
to exploit meaning in related ways. Words are labels for meaningful
computational modules. Using the ability to pass programs along
through speech, humans have made cumulative progress in constructing
useful computational modules built on top of the ones supplied
by evolution. The difference between human and chimp intelligence
is largely in this additional programming, and thus can be regarded
as due to better nurturing.
The many aspects of consciousness
are also naturally and consistently understood in this
context. For example, although the brain is a distributed
system and the mind is a complex program composed of many
modules, the unitary self emerges naturally
as a reification (manifestation) of the interest of the genes.
Qualia (the felt quality of sensations such as pain or redness) have
exactly the nature and meaning that evolution coded into the DNA so
that the compact program behaves effectively.
This book is highly relevant to the artificial general intelligence
agenda in many ways, surveying much of the progress of AI with an
eye toward why it has fallen short of general intelligence, proposing
a theory of how mind achieves general intelligence, and discussing
what steps would be useful to achieve general intelligence
computationally. Because evolution, rather than our brains, is
argued to have done most of the computational work in producing mind,
building an AGI is seen as significantly more challenging than most
authors suppose. However, *What is Thought?* also
presents results of evolutionary programming experiments
in the Hayek model, where we have succeeded in evolving, from
random code, programs that solve classes of difficult planning
problems in ways that seem to give intuition into how a program
can achieve understanding. These results were possible
because an analysis of evolutionary dynamics suggested mechanisms
that greatly speed such evolution.
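For readers who want a feel for what evolving programs from random
code means, here is a deliberately minimal evolutionary-programming
sketch of my own. It is far simpler than the Hayek machine described
in the book (no economy of agents, no bidding); the toy
grid-navigation task, fixed program length, population size, and
mutation scheme are all arbitrary choices for the illustration.

# Minimal evolutionary-programming sketch (illustrative only, NOT the
# Hayek machine): programs are random instruction strings, and
# selection plus mutation evolves one that solves a toy planning
# task -- walking from (0, 0) to a goal cell on a grid.
import random

MOVES = {"U": (0, 1), "D": (0, -1), "L": (-1, 0), "R": (1, 0)}
GOAL = (6, 4)
LENGTH = 14     # fixed program length
POP = 60        # population size

def run(program):
    """Execute a program (a string of moves); return the final cell."""
    x = y = 0
    for op in program:
        dx, dy = MOVES[op]
        x, y = x + dx, y + dy
    return x, y

def fitness(program):
    x, y = run(program)
    return -(abs(x - GOAL[0]) + abs(y - GOAL[1]))   # closer is better

def mutate(program):
    i = random.randrange(LENGTH)
    return program[:i] + random.choice("UDLR") + program[i + 1:]

random.seed(1)
population = ["".join(random.choice("UDLR") for _ in range(LENGTH))
              for _ in range(POP)]

for gen in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == 0:
        print(f"generation {gen}: solved by program {population[0]}")
        break
    survivors = population[: POP // 4]          # truncation selection
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(POP - len(survivors))]
else:
    print("no perfect program found; best was", population[0])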
No previous familiarity with computer science (or other fields)
is assumed -- *What is Thought?* presents a pedagogical
survey of the relevant background for its arguments.
---------------------------------------
From the back cover:
"This book is the deepest, and at the same time the most commonsensical,
approach to the problem of mind and thought that I have read. The approach
is from the point of view of computer science, yet Baum has no illusions
about the progress which has been made within that field. He presents the
many technical advances which have been made -- the book will be enormously
useful for this aspect alone -- but refuses to play down their glaring
inadequacies. He also presents a road map for getting further and makes the
case that many of the apparently 'deep' philosophical problems such as free
will may simply evaporate when one gets closer to real understanding."
--Philip W. Anderson, Joseph Henry Professor of Physics, Princeton
University, 1977 Nobel Laureate in Physics
"Eric Baum's book is a remarkable achievement. He presents a novel thesis
-- that the mind is a program whose components are semantically meaningful
modules -- and explores it with a rich array of evidence drawn from a
variety of fields. Baum's argument depends on much of the intellectual core
of computer science, and as a result the book can also serve as a short
course in computer science for non-specialists. To top it off, *What is
Thought?* is beautifully written and will be at least as clear and
accessible to the intelligent lay public as *Scientific American*."
--David Waltz, Director, Center for Computational Learning Systems,
Columbia University
"What's great about this book is the detailed way in which Baum shows the
explanatory power of a few ideas, such as compression of information, the
mind and DNA as computer programs, and various concepts in computer science
and learning theory such as simplicity, recursion, and position evaluation.
*What is Thought?* is a terrific book, and I hope it gets the wide
readership it deserves."
--Gilbert Harman, Department of Philosophy, Princeton University
"There is no problem more important, or more daunting, than discovering the
structure and processes behind human thought. *What is Thought?* is an
important step towards finding the answer. A concise summary of the
progress and pitfalls to date gives the reader the context necessary to
appreciate Baum's important insights into the nature of cognition."
--Nathan Myhrvold, Managing Director, Intellectual Ventures, and former
Chief Technology Officer, Microsoft
-----------------------------------------------
Eric B. Baum has held positions at the University of California at
Berkeley, Caltech, MIT, Princeton, and the NEC Research Institute.
He holds a BA and MA from Harvard and a PhD in physics from
Princeton. He has published extensively in theoretical physics,
machine learning, machine reasoning, cognitive science, and DNA
computing. He is currently developing algorithms based on
machine learning and Bayesian reasoning with which to found a hedge fund.