[ExI] The meaning of life revisited
pharos at gmail.com
Sat Jun 14 22:17:43 UTC 2014
On Sat, Jun 14, 2014 at 10:39 PM, Anders Sandberg wrote:
> But there is another aspect of housekeeping: how much of life is actual
> thought, and how much is just chatbot-like stimulus-response patterns? After
> spending a week talking about Turing tests I suspect most of our time is in
> chatbot mode. This makes sense: thinking is expensive and slow, while
> well-learned responses can be cheap and fast. And most thinking moments are
> about housekeeping (how should I handle my family? what job would pay
> better?) rather than wonder or meaning-searching.
> So when we get richer and freer in the future, the amount of actual thinking
> about meaning in the average person might not go up much. Certainly people
> will do more high status "searching" showing that they are deep, creative
> and individual, but most don't care that much about figuring out what it is
> all about. You need to have a pretty high need for cognition to care about
> The Meaning rather than a meaning. But it is fun to be in that state!
There's a best-selling book about that: Thinking, Fast and Slow by
Daniel Kahneman. There was also a BBC TV documentary on the subject.
System 2 is your conscious, thinking mind. We conceive of this active
consciousness as the principal actor, the "decider" in our lives.
System 2 thinks slowly; it considers, evaluates, reasons. Its work
requires mental effort--multiplying 24 by 17 or turning left at a busy
intersection. We attribute most of our opinions and decisions to this
thinking, reasonable fellow.
For Kahneman, however, the main protagonist is System 1. This is the
agent of our automatic and effortless mental responses. System 1 can
add single-digit numbers and fill in the phrase "bread and --." It is
equipped with a nuanced picture of the world, the product of retained
memory and learned patterns of association ("Florida/old people") that
enable it to spew out a stream of reactions, judgments, opinions.
The flaw in this remarkable machine is that System 1 works with
whatever information it happens to have, however incomplete. If it
can't answer the question "Is Ford (F) stock a good investment?" it
substitutes an answer based on related but not really relevant data,
such as whether you like Ford's cars.
The book also covers many of the cognitive biases that arise from
System 1's shortcuts.