[extropy-chat] Broad vs. Narrow Rationality

Lee Corbin lcorbin at rawbw.com
Sat Oct 14 17:11:40 UTC 2006


For those unaware of the crucial role of emotions in good decision making:

http://www.nature.com/neuro/journal/v5/n11/full/nn1102-1103.html

For another article---memorably describing "The Marines vs. the Wall Street
brokers" and related subjects---see the essay below (no longer online so far
as I can tell).

Both describe the Damasio card experiment. First comes an excerpt, then the
whole essay by Thomas A. Stewart.

Lee

    Some tantalizing evidence in this regard comes from experiments by Antonio Damasio,
    head of neurology at the University of Iowa Carver College of Medicine. In one
    experiment, Damasio gave subjects four decks of cards. They were asked to flip the
    cards, picking from any deck. Two decks were rigged to produce an overall loss (in
    play money), and two to produce a gain. At intervals, the participants were asked
    what they thought was going on in the game. And they were hooked up to sensors to
    measure skin conductance responses, or SCRs (which are also measured by lie-detector
    machines). 

    By the time they'd turned about 10 cards, subjects began showing SCRs when they
    reached for a losing deck -- that is, they showed a physical reaction. But not until
    they had turned, on average, 50 cards could they verbalize their "hunch" that two
    decks were riskier. It took 30 more cards before they could explain why their hunch
    was right. Three players were never able to put their hunches into words -- yet
    they, too, showed elevated SCRs and they, too, picked the right decks. Even if they
    couldn't explain it, their bodies knew what was going on. 

    Damasio was already aware of the astounding fact that people who suffer damage to
    parts of their brains where emotions are processed have difficulty making decisions.
    When such patients participated in Damasio's card experiment, they never expressed
    hunches. Remarkably, even if they figured out the game intellectually, they
    continued to pick from losing decks. In other words, they knew their behavior was a
    mistake but they couldn't make the decision to change it. Emotions, Damasio
    theorizes, get decision-making started, presenting the conscious, logical mind with
    a short list of possibilities. Without at least a little intuition, then, the
    decision process never leaves the gate. 

    ========================================================
    The Entire Essay:

    How to Think With Your Gut

    How the geniuses behind the Osbournes, the Mini, Federal Express, and
    Starbucks followed their instincts and reached success.
    By Thomas A. Stewart, November 2002

    Getting in Touch With Your Gut. It's simple, really: Just get out of your own way.

    Psychologists have a term to describe people who are in unusually close contact with
    their gut feelings -- "high intuitives." While you can't teach such skills the way
    you teach multiplication tables, everyone can hone their instincts to some degree.
    Here are a few guidelines:

    Practice, practice. This is the most important thing. "Gut instinct is basically a
    form of pattern recognition," says Howard Gardner, a Harvard professor and
    psychologist. The more you practice, the more patterns you intuitively recognize.
    List decisions you've made that turned out right -- and mistakes, too. Then
    reconstruct the thinking. Where did intuition come in? Was it right or wrong? Are
    there patterns? Highly intuitive people often let themselves be talked out of good
    ideas. "Generally you're better with either people or things," says Manhattan
    psychologist and executive coach Dee Soder. If you're intuitively gifted about
    people, write down your first impressions of new colleagues, customers, and so on --
    you want to hold on to those gut reactions.

    Learn to listen. People come up with all sorts of reasons for ignoring what their
    gut is trying to tell them. Flavia Cymbalista has developed a decision-making
    approach adapted from a psychological technique known as "focusing." She calls it
    MarketFocusing, and she uses it to teach businesspeople to find the "felt sense"
    that tells them they know something they can't articulate. "You have to express your
    willingness to listen to what the felt sense has to say, without an agenda of your
    own," she says.

    Tell stories. Fictionalize a problem as a business school case or as happening to
    someone else. That can free up your imagination. Dave Snowden, director of IBM's
    (IBM) Cynefin Centre for Organisational Complexity in Wales, has been working with
    antiterrorism experts and finds that they think more creatively if he poses problems
    set in a different time -- the Civil War, for example. Another kind of storytelling
    is what cognitive psychologist Gary Klein calls a "pre-mortem": Imagine that your
    project has failed and gather the team to assess what went wrong.

    Breed gut thinkers. Dismantle the obstacles that prevent people from using their
    guts. High turnover rates, for example, are inimical to developing the deep
    expertise that hones intuition. Since gut feelings are inherently hard to express,
    don't let people jump on a dissenter who hesitantly says, "I'm not sure ... "
    Instead, say "Tell us more." Some leaders go around the table twice at meetings to
    give people a chance to put hunches into words. To sharpen your intuitive thinking,
    you have to get out of your own way; to foster it among those around you, you have
    to get out of their way too.

    The practical implications of all this are profound. People who make decisions for a
    living are coming to realize that in complex or chaotic situations -- a battlefield,
    a trading floor, or today's brutally competitive business environment -- intuition
    usually beats rational analysis. And as science looks closer, it is coming to see
    that intuition is not a gift but a skill. And, like any skill, it's something you
    can learn. 

    To make sense of this, you first have to get over the fact that it contradicts
    everything you've been taught about making decisions. B-school encourages students
    to frame problems, formulate alternatives, collect data, and then evaluate the
    options. Almost every organization that trains decision-makers has followed the same
    approach. Paul Van Riper, a retired Marine Corps lieutenant general, was taught that
    way, and he drilled this method into his students when he ran the Marines'
    leadership and combat development program in the '90s. 

    But Van Riper noticed that in the swirl and confusion of war simulations -- let
    alone actual combat -- rational decisions always seemed to come up short. "We used
    the classical checklist system," he says. "But it never seemed to work. Then we'd
    criticize ourselves for not using the system well enough. But it still never seemed
    to work, because it's the wrong system." Frustrated, Van Riper sought out cognitive
    psychologist Gary Klein. At the time, Klein was studying firefighters, who operate
    under conditions quite like war. To his consternation, Klein learned that
    firefighters don't weigh alternatives: They simply grab the first idea that seems
    good enough, then the next, and the next after that. To them it doesn't even feel
    like "deciding." 

    Inspired by Klein, Van Riper brought a group of Marines to the New York Mercantile
    Exchange in 1995, because the jostling, confusing pits reminded him of war rooms
    during combat. First the Marines tried their hand at trading on simulators, and to
    no one's surprise, the professionals on the floor wiped them out. A month or so
    later, the traders went to the Corps's base in Quantico, Va., where they played war
    games against the Marines on a mock battlefield. The traders trounced them again --
    and this time everyone was surprised. 

    When the Marines analyzed the humbling results, they concluded that the traders were
    simply better gut thinkers. Thoroughly practiced at quickly evaluating risks, they
    were far more willing to act decisively on the kind of imperfect and contradictory
    information that is all you ever get in war. The lesson wasn't lost on the Marines,
    who concluded that the old rational analysis model was useless in some situations.
    Today the Corps's official doctrine reads, "The intuitive approach is more
    appropriate for the vast majority of ... decisions made in the fluid, rapidly
    changing conditions of war when time and uncertainty are critical factors, and
    creativity is a desirable trait." Conditions, in other words, not unlike those in
    which many business decisions are made today. 

    The notion that people always act rationally and in their own interest is a pillar
    of economic theory. So it's interesting that a group of economists, led by the
    University of Chicago's Richard Thaler, should contribute some of the most damning
    evidence of people's proclivity for irrational decisions. Building on work by
    Princeton psychologists Daniel Kahneman and Amos Tversky, these so-called behavioral
    economists have shown not only that many of our economic decisions are irrational,
    but also that our waywardness is predictable. We get more satisfaction from avoiding
    a $100 loss than from making a $100 gain, for example, and we compulsively find
    patterns where none exist. (This stock has gone up for three days; therefore it will
    continue to go up.) Go ahead, point it out to us. It doesn't matter; we'll make the
    same mistakes over and over again. 
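
    The $100 loss-versus-gain asymmetry described above is usually modeled with
    Kahneman and Tversky's prospect-theory value function, in which losses loom
    larger than equal-sized gains. A minimal sketch -- the exponent and
    loss-aversion coefficient below are the median estimates from their 1992
    paper, used here only as illustrative defaults:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex and
    steeper for losses, so a loss of $100 "hurts" more than a gain of
    $100 "helps".  alpha and lam are illustrative defaults."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A $100 gain is worth about +57.5 units of subjective value,
# while a $100 loss is worth about -129.5 -- over twice as painful.
gain = prospect_value(100)
loss = prospect_value(-100)
```

    With these defaults, the felt magnitude of the loss is exactly 2.25 times
    the felt magnitude of the matching gain, which is the "we get more
    satisfaction from avoiding a $100 loss" pattern in numerical form.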

    Thaler and others speculate that these logical lacunae are the product of a brain
    wired for survival on the savanna, not for hyperrational calculation. Machines do
    deductive and inductive calculations well. People excel at "abduction," which is
    less like reason than inspired guesswork. (Deduction: All taxis are yellow; this is
    a taxi; therefore it is yellow. Induction: These are all taxis; these are all
    yellow; therefore, all taxis are probably yellow. Abduction: All taxis are yellow;
    this vehicle is yellow; therefore this is probably a taxi.) Abduction leaps to
    conclusions by connecting a known pattern (taxis are yellow) to a specific situation
    (this yellow vehicle must be a taxi). Compared with computers, people are lousy
    number crunchers but superb pattern makers -- even without being aware of it.
    Indeed, much of what we call instinct, psychologists say, is simply pattern
    recognition taking place at a subconscious level. 
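
    The abductive leap in the taxi example (yellow vehicle, therefore probably
    a taxi) has a probabilistic skeleton: Bayes' rule. A minimal sketch -- the
    base rates below are made-up numbers for illustration, not data:

```python
def posterior(prior, likelihood, evidence_prob):
    """Bayes' rule: P(hypothesis | evidence) =
    P(evidence | hypothesis) * P(hypothesis) / P(evidence)."""
    return likelihood * prior / evidence_prob

# Illustrative assumptions: 10% of vehicles are taxis, 90% of taxis
# are yellow, and 12% of vehicles overall are yellow.  Seeing a yellow
# vehicle, abduction jumps from the known pattern to the likely cause:
p_taxi_given_yellow = posterior(prior=0.10, likelihood=0.9,
                                evidence_prob=0.12)
# -> 0.75: "this is probably a taxi"
```

    The leap is fallible by design -- the yellow vehicle might be a school
    bus -- which is why abduction reads less like deduction and more like
    inspired guesswork.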

    Some tantalizing evidence in this regard comes from experiments by Antonio Damasio,
    head of neurology at the University of Iowa Carver College of Medicine. In one
    experiment, Damasio gave subjects four decks of cards. They were asked to flip the
    cards, picking from any deck. Two decks were rigged to produce an overall loss (in
    play money), and two to produce a gain. At intervals, the participants were asked
    what they thought was going on in the game. And they were hooked up to sensors to
    measure skin conductance responses, or SCRs (which are also measured by lie-detector
    machines). 

    By the time they'd turned about 10 cards, subjects began showing SCRs when they
    reached for a losing deck -- that is, they showed a physical reaction. But not until
    they had turned, on average, 50 cards could they verbalize their "hunch" that two
    decks were riskier. It took 30 more cards before they could explain why their hunch
    was right. Three players were never able to put their hunches into words -- yet
    they, too, showed elevated SCRs and they, too, picked the right decks. Even if they
    couldn't explain it, their bodies knew what was going on. 
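
    The rigged four-deck setup lends itself to a small simulation. A minimal
    sketch -- the article gives no exact payoff figures, so the numbers below
    are assumptions chosen only so that two decks lose and two win on average:

```python
import random

# Hypothetical payoffs, not Damasio's actual values: "bad" decks pay
# more per card but carry occasional penalties that produce a net loss;
# "good" decks pay less but come out ahead overall.
DECKS = {
    "A": {"reward": 100, "penalty": -1250, "penalty_prob": 0.1},  # bad
    "B": {"reward": 100, "penalty": -1250, "penalty_prob": 0.1},  # bad
    "C": {"reward": 50,  "penalty": -250,  "penalty_prob": 0.1},  # good
    "D": {"reward": 50,  "penalty": -250,  "penalty_prob": 0.1},  # good
}

def draw(deck_name, rng):
    """Flip one card: always collect the reward, sometimes a penalty too."""
    deck = DECKS[deck_name]
    payoff = deck["reward"]
    if rng.random() < deck["penalty_prob"]:
        payoff += deck["penalty"]
    return payoff

def expected_value(deck_name):
    """Long-run payoff per card: negative for A/B, positive for C/D."""
    deck = DECKS[deck_name]
    return deck["reward"] + deck["penalty_prob"] * deck["penalty"]
```

    With these assumed figures, decks A and B lose 25 play-dollars per card on
    average while C and D gain 25 -- the statistical signal the subjects'
    skin conductance picked up long before their conscious minds did.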

    Damasio was already aware of the astounding fact that people who suffer damage to
    parts of their brains where emotions are processed have difficulty making decisions.
    When such patients participated in Damasio's card experiment, they never expressed
    hunches. Remarkably, even if they figured out the game intellectually, they
    continued to pick from losing decks. In other words, they knew their behavior was a
    mistake but they couldn't make the decision to change it. Emotions, Damasio
    theorizes, get decision-making started, presenting the conscious, logical mind with
    a short list of possibilities. Without at least a little intuition, then, the
    decision process never leaves the gate. 

    None of us have the advantage of a handy SCR detector to know when we're getting a
    hunch. But gut knowledge has other ways of making its presence felt, and it's often
    physical. Howard Schultz shook when he had his caffe epiphany. George Soros, the
    international financier who made billions in currency speculation, feels opportunity
    in his back, according to his son Robert. "The reason he changes his position on the
    market or whatever is because his back starts killing him," Robert said in a book
    about his father. "It has nothing to do with reason. He literally goes into a spasm,
    and it's his early warning sign." 

    What exactly is Soros's back reacting to? That question bedeviled Flavia Cymbalista,
    an economist who specializes in uncertainty in financial markets. Soros invests only
    when he has a hypothesis -- a story that explains a trend in the market. But as
    Soros himself has theorized, markets don't yield to analysis, because they are
    continuously changing -- and this is one reason Soros has learned to trust his back.
    "There are things you can know, but only experientially and bodily," Cymbalista
    says. 

    What does this mean for making decisions in real life? Research suggests that
    neither nose-in-the-spreadsheet rationality nor pure gut inspiration is right all
    the time. The best approach lies somewhere between the extremes, the exact point
    depending on the situation. Naresh Khatri and H. Alvin Ng, of Nanyang Technological
    University in Singapore and Massey University in Wellington, New Zealand, surveyed
    nearly 300 executives in the computer, banking, and utilities industries -- chosen
    to represent three different degrees of business stability -- and compared the
    decision-making styles the executives reported. Intuition was clearly the favored
    strategy among computer-industry execs; planful approaches were the norm in the
    relatively staid, rules-driven utilities industry. 

    In a similar vein, Dave Snowden, director of IBM's new Cynefin Centre for
    Organisational Complexity in Wales, suggests basing your approach on the nature of
    the problem confronting you. Snowden breaks problems down into four types: 

    The problem is covered by rules. This is the domain of legal structures, standard
    operating procedures, practices that are proven to work. Never draw to an inside
    straight. Never lend to a client whose monthly payments exceed 35 percent of gross
    income. Never end the meeting without asking for the sale. Here, decision-making
    lies squarely in the realm of reason: Find the proper rule and apply it. 
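
    The lending rule quoted above is exactly the kind of check that can be
    mechanized: find the rule, apply it. A minimal sketch -- the function
    name and example figures are illustrative, not from any real
    underwriting system:

```python
def passes_debt_rule(monthly_payments, gross_monthly_income, cap=0.35):
    """Rule from the text: never lend to a client whose monthly
    payments exceed 35 percent of gross income."""
    return monthly_payments <= cap * gross_monthly_income

passes_debt_rule(1200, 4000)   # within the 35% cap -> lend
passes_debt_rule(1600, 4000)   # over the cap -> decline
```

    In the rules domain, this is the whole decision; the harder cases in the
    complicated and complex domains below are precisely the ones such a check
    cannot settle.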

    The situation is complicated. Here it is possible to work rationally toward a
    decision, but doing so requires refined judgment and expertise. Building an
    automobile, for example, is a complicated problem. You can diagram it; you can
    assemble and disassemble it; if you remove a piece, you know the consequences. This
    is the province of engineers, surgeons, intelligence analysts, lawyers, and other
    experts. Artificial intelligence copes well here: Deep Blue plays chess as if it
    were a complicated problem, looking at every possible sequence of moves. 

    The situation is complex. This sort of problem can't be resolved by rational
    analysis. Too much is unknowable. Complex systems -- battlefields, markets,
    ecosystems, corporate cultures -- are impervious to a reductionist,
    take-it-apart-and-see-how-it-works approach because your very actions change the
    situation in unpredictable ways. "Complexity is coherent only in retrospect,"
    Snowden says. With hindsight, for example, the malevolent lines leading to 9/11 are
    clear, but it would have taken pure luck to see them beforehand. 

    The strategy is to look for patterns at every level, Snowden says. Or rather, the
    idea is to allow patterns to surface and trust your gut to recognize them. That's
    how masters play Go, a game that artificial intelligence can't seem to understand.
    They don't so much analyze a game as contemplate it. When a pattern or behavior
    emerges, they then reinforce it (if they like it) or disrupt it (if not). 

    In the realm of complexity, decisions come from the informed gut. Karl Wiig, a
    consultant who runs the Knowledge Research Institute, and Sue Stafford, who heads
    the philosophy department at Simmons College, saw this in action while designing
    systems for insurance companies. "Insurance underwriting software is good only for
    simple cases," Stafford says. Plug in the data -- married white male, age 30,
    driving this and living here -- and get a quote. Hard cases -- the diabetic actuary
    who skydives and teaches Sunday school -- need human underwriters, and the best all
    do the same thing: Dump the file and spread out the contents. "They need to see it
    all at once," Stafford says. They don't calculate a decision; they arrive at one. 

    The situation is chaotic. Here, too, instinct is better than analysis. The only
    thing you can do is act. "You impose order," Snowden says. "That's where charismatic
    leaders come in." After Enron imploded, a team of crisis executives from Kroll Zolfo
    Cooper parachuted in to save what's savable and dismantle the rest in an orderly
    way. One of them, Michael France, has the job of putting together a business plan
    for OpCo, a possibly viable energy business. 

    When he landed, the entire operation was in chaos. "People were afraid," France
    recalls. "They were either misdirected or undirected. Decision-making was paralyzed.
    You don't have much time. You've got to be quick and decisive -- make little steps
    you know will succeed, so you can begin to tell a story that makes sense." This
    quick-twitch sort of decision-making is akin to the firefighter whose gut makes him
    turn left or the trader who instinctively sells when the news about the stock seems
    too good to be true. 

    Behind many of the errors in decision-making lies a yearning for the "right" answer:
    If only we get enough data, if only we examine all the alternatives, we'll know what
    to do. "People tend to spend all their time looking for rules," Snowden says.
    "They're kidding themselves." Situations in which rules supply all the answers are
    becoming an endangered species, in business and everywhere else. Command-and-control
    management went out with tail fins. Risks are both greater and less predictable. As
    companies outsource, globalize, and form alliances, they become more interdependent
    -- simultaneously competitor and customer, drastically increasing the complexity of
    their relationships. More and more, all you can do is admit that you simply don't
    know and go with your gut. 

    This may well feel uncomfortable. No one likes uncertainty, and it's going to be
    hard to explain to your boss a hunch you can't really articulate, even to yourself.
    To make things easier, you can teach yourself to tune in more attentively to
    intuition and to raise your gut IQ. (See "Getting in Touch With Your Gut.") On the
    other hand, making decisions this way may come more easily than you think. Chances
    are that the classic linear model you thought you were following -- data comes in,
    you analyze, draw inferences, make a decision -- was partly an illusion anyway. "The
    data doesn't just 'come in,'" Klein points out. "You have to figure out where you're
    going to look -- and that is an intuitive process." In other words, you already are
    more of an intuitive decision-maker than you may have thought. So relax and listen.
    Your gut has something to say to you, and it might be important.  

    Thomas A. Stewart is the editorial director of Business 2.0. 




