[Paleopsych] Scientific American: Natural-Born Liars

Premise Checker checker at panix.com
Sat Jul 9 15:48:48 UTC 2005


Natural-Born Liars
http://www.sciam.com/print_version.cfm?articleID=0007B7A0-49D6-128A-89D683414B7F0000
May 18, 2005

Thanks to Alice Andrews for this. What I want to know is why mechanisms
for detecting *self*-deception haven't evolved as well. Maybe they have,
to a certain extent. But the deception can go many layers deep. I might 
grow up in a culture that believes that Mahomet is Allah's only prophet. 
If I belong to the vast majority that never really question this 
statement, then I'm not really all that self-deceiving. No doubt I should 
be a skeptic, but there are too many things to be skeptical about. And no 
brain mechanism has direct access to the truth. What might be detected is 
that I am a little bit *too* sincere in my protestation of faith.

In sum, mechanisms for detecting self-deception in others are too costly 
to develop in most cases. Also, it takes time for such mechanisms to 
evolve successfully, for there's a race between ever more subtle means of 
(both other- and self-) deception and the detection of that deception.

--------------------

    Why do we lie, and why are we so good at it? Because it works

    By David Livingstone Smith

    Deception runs like a red thread throughout all of human history. It
    sustains literature, from Homer's wily Odysseus to the biggest pop
    novels of today. Go to a movie, and odds are that the plot will
    revolve around deceit in some shape or form. Perhaps we find such
    stories so enthralling because lying pervades human life. Lying is a
    skill that wells up from deep within us, and we use it with abandon.
    As the great American observer Mark Twain wrote more than a century
    ago: "Everybody lies ... every day, every hour, awake, asleep, in his
    dreams, in his joy, in his mourning. If he keeps his tongue still his
    hands, his feet, his eyes, his attitude will convey deception." Deceit
    is fundamental to the human condition.

    Research supports Twain's conviction. One good example was a study
    conducted in 2002 by psychologist Robert S. Feldman of the University
    of Massachusetts Amherst. Feldman secretly videotaped students who
    were asked to talk with a stranger. He later had the students analyze
    their tapes and tally the number of lies they had told. A whopping 60
    percent admitted to lying at least once during 10 minutes of
    conversation, and the group averaged 2.9 untruths in that time period.
    The transgressions ranged from intentional exaggeration to flat-out
    fibs. Interestingly, men and women lied with equal frequency; however,
    Feldman found that women were more likely to lie to make the stranger
    feel good, whereas men lied most often to make themselves look better.

    In another study a decade earlier by David Knox and Caroline Schacht,
    both now at East Carolina University, 92 percent of college students
    confessed that they had lied to a current or previous sexual partner,
    which left the husband-and-wife research team wondering whether the
    remaining 8 percent were lying. And whereas it has long been known
    that men are prone to lie about the number of their sexual conquests,
    recent research shows that women tend to underrepresent their degree
    of sexual experience. When asked to fill out questionnaires on
    personal sexual behavior and attitudes, women wired to a dummy
    polygraph machine reported having had twice as many lovers as those
    who were not wired, suggesting that the unwired women were less
    honest. It's all too ironic that the investigators had to deceive
    subjects to get them to tell the truth about their lies.

    These references are just a few of the many examples of lying that
    pepper the scientific record. And yet research on deception is almost
    always focused on lying in the narrowest sense-literally saying things
    that aren't true. But our fetish extends far beyond verbal
    falsification. We lie by omission and through the subtleties of spin.
    We engage in myriad forms of nonverbal deception, too: we use makeup,
    hairpieces, cosmetic surgery, clothing and other forms of adornment to
    disguise our true appearance, and we apply artificial fragrances to
    misrepresent our body odors. We cry crocodile tears, fake orgasms and
    flash phony "have a nice day" smiles. Out-and-out verbal lies are just
    a small part of the vast tapestry of human deceit.

    The obvious question raised by all of this accounting is: Why do we
    lie so readily? The answer: because it works. The Homo sapiens who are
    best able to lie have an edge over their counterparts in a relentless
    struggle for the reproductive success that drives the engine of
    evolution. As humans, we must fit into a close-knit social system to
    succeed, yet our primary aim is still to look out for ourselves above
    all others. Lying helps. And lying to ourselves--a talent built into
    our brains--helps us accept our fraudulent behavior.

    Passport to Success

    If this bald truth makes any one of us feel uncomfortable, we can take
    some solace in knowing we are not the only species to exploit the lie.
    Plants and animals communicate with one another by sounds, ritualistic
    displays, colors, airborne chemicals and other methods, and biologists
    once naively assumed that the sole function of these communication
    systems was to transmit accurate information. But the more we have
    learned, the more obvious it has become that nonhuman species put a
    lot of effort into sending inaccurate messages.

    The mirror orchid, for example, displays beautiful blue blossoms that
    are dead ringers for female wasps. The flower also manufactures a
    chemical cocktail that simulates the pheromones released by females to
    attract mates. These visual and olfactory cues keep hapless male wasps
    on the flower long enough to ensure that a hefty load of pollen is
    clinging to their bodies by the time they fly off to try their luck
    with another orchid in disguise. Of course, the orchid does not
    "intend" to deceive the wasp. Its fakery is built into its physical
    design, because over the course of history plants that had this
    capability were more readily able to pass on their genes than those
    that did not. Other creatures deploy equally deceptive strategies.
    When approached by a would-be predator, the harmless hog-nosed snake
    flattens its head, spreads out a cobralike hood and, hissing
    menacingly, pretends to strike with maniacal aggression, all the while
    keeping its mouth discreetly closed.

    These cases and others show that nature favors deception because it
    provides survival advantages. The tricks become increasingly
    sophisticated the closer we get to Homo sapiens on the evolutionary
    chain. Consider an incident between Mel and Paul:

    Mel dug furiously with her bare hands to extract the large succulent
    corm from the rock-hard Ethiopian ground. It was the dry season and
    food was scarce. Corms are edible bulbs somewhat like onions and are a
    staple during these long, hard months. Little Paul sat nearby and
    surreptitiously observed Mel's labors. Paul's mother was out of sight;
    she had left him to play in the grass, but he knew she would remain
    within earshot in case he needed her. Just as Mel managed, with a
    final pull, to yank her prize out of the earth, Paul let out an
    ear-splitting cry that shattered the peace of the savannah. His mother
    rushed to him. Heart pounding and adrenaline pumping, she burst upon
    the scene and quickly sized up the situation: Mel had obviously
    harassed her darling child. Shrieking, she stormed after the
    bewildered Mel, who dropped the corm and fled. Paul's scheme was
    complete. After a furtive glance to make sure nobody was looking, he
    scurried over to the corm, picked up his prize and began to eat. The
    trick worked so well that he used it several more times before anyone
    wised up.

    The actors in this real-life drama were not people. They were Chacma
    baboons, described in a 1987 article by primatologists Richard W.
    Byrne and Andrew Whiten of the University of St. Andrews in Scotland
    for New Scientist magazine and later recounted in Byrne's 1995 book The Thinking
    Ape (Oxford University Press). In 1983 Byrne and Whiten began noticing
    deceptive tactics among the mountain baboons in Drakensberg, South
    Africa. Catarrhine primates, the group that includes the Old World
    monkeys, apes and ourselves, are all able to tactically dupe members
    of their own species. The deceptiveness is not built into their
    appearance, as with the mirror orchid, nor is it encapsulated in rigid
    behavioral routines like those of the hog-nosed snake. The primates'
    repertoires are calculated, flexible and exquisitely sensitive to
    shifting social contexts.

    Byrne and Whiten catalogued many such observations, and these became
    the basis for their celebrated Machiavellian intelligence hypothesis,
    which states that the extraordinary explosion of intelligence in
    primate evolution was prompted by the need to master ever more
    sophisticated forms of social trickery and manipulation. Primates had
    to get smart to keep up with the snowballing development of social
    gamesmanship.

    The Machiavellian intelligence hypothesis suggests that social
    complexity propelled our ancestors to become progressively more
    intelligent and increasingly adept at wheeling, dealing, bluffing and
    conniving. That means human beings are natural-born liars. And in line
    with other evolutionary trends, our talent for dissembling dwarfs that
    of our nearest relatives by several orders of magnitude.

    The complex choreography of social gamesmanship remains central to our
    lives today. The best deceivers continue to reap advantages denied to
    their more honest or less competent peers. Lying helps us facilitate
    social interactions, manipulate others and make friends.

    There is even a correlation between social popularity and deceptive
    skill. We falsify our résumés to get jobs, plagiarize essays to boost
    grade-point averages and pull the wool over the eyes of potential
    sexual partners to lure them into bed. Research shows that liars are
    often better able to get jobs and attract members of the opposite sex
    into relationships. Feldman has also demonstrated that the
    adolescents who are most popular in their schools are also better at
    fooling their peers. Lying continues to work. Although it would be
    self-defeating to lie all the time (remember the fate of the boy who
    cried, "Wolf!"), lying often and well remains a passport to social,
    professional and economic success.

    Fooling Ourselves

    Ironically, the primary reason we are so good at lying to others is
    that we are good at lying to ourselves. There is a strange asymmetry
    in how we apportion dishonesty. Although we are often ready to accuse
    others of deceiving us, we are astonishingly oblivious to our own
    duplicity. Experiences of being a victim of deception are burned
    indelibly into our memories, but our own prevarications slip off our
    tongues so easily that we often do not notice them for what they are.

    The strange phenomenon of self-deception has perplexed philosophers
    and psychologists for more than 2,000 years. On the face of it, the
    idea that one can con oneself seems as nonsensical as cheating at
    solitaire or embezzling money from one's own bank account. But the
    paradoxical character of self-deception flows from the idea,
    formalized by French polymath René Descartes in the 17th century, that
    human minds are transparent to their owners and that introspection
    yields an accurate understanding of our own mental life. As natural as
    this perspective is to most of us, it turns out to be deeply
    misguided.

    If we hope to understand self-deception, we need to draw on a more
    scientifically sound conception of how the mind works. The brain
    comprises a number of functional systems. The system responsible for
    cognition--the thinking part of the brain--is somewhat distinct from
    the system that produces conscious experiences. The relation between
    the two systems can be thought of as similar to the relation between
    the processor and monitor of a personal computer. The work takes place
    in the processor; the monitor does nothing but display information the
    processor transfers to it. By the same token, the brain's cognitive
    systems do the thinking, whereas consciousness displays the
    information that it has received. Consciousness plays a less important
    role in cognition than previously expected.

    This general picture is supported by a great deal of experimental
    evidence. Some of the most remarkable and widely discussed studies
    were conducted several decades ago by neuroscientist Benjamin Libet,
    now professor emeritus at the University of California at San Francisco.
    In one experiment, Libet placed subjects in front of a button and a
    rapidly moving clock and asked them to press the button whenever they
    wished and to note the time, as displayed on the clock, the moment
    they felt an impulse to press the button. Libet also attached
    electrodes over the motor cortex, which controls movement, in each of
    his subjects to monitor the electrical activity that builds up as the
    brain prepares to initiate an action. He found that our brains begin
    to prepare for action just over a third of a second before we
    consciously decide to act. In other words, despite appearances, it is
    not the conscious mind that decides to perform an action: the decision
    is made unconsciously. Although our consciousness likes to take the
    credit (so to speak), it is merely informed of unconscious decisions
    after the fact. This study and others like it suggest that we are
    systematically deluded about the role consciousness plays in our
    lives. Strange as it may seem, consciousness may not do anything
    except display the results of unconscious cognition.

    This general model of the mind, supported by various experiments
    beyond Libet's, gives us exactly what we need to resolve the paradox
    of self-deception--at least in theory. We are able to deceive
    ourselves by invoking the equivalent of a cognitive filter between
    unconscious cognition and conscious awareness. The filter preempts
    information before it reaches consciousness, preventing selected
    thoughts from proliferating along the neural pathways to awareness.

    Solving the Pinocchio Problem

    But why would we filter information? Considered from a biological
    perspective, this notion presents a problem. The idea that we have an
    evolved tendency to deprive ourselves of information sounds wildly
    implausible, self-defeating and biologically disadvantageous. But once
    again we can find a clue from Mark Twain, who bequeathed to us an
    amazingly insightful explanation. "When a person cannot deceive
    himself," he wrote, "the chances are against his being able to deceive
    other people." Self-deception is advantageous because it helps us lie
    to others more convincingly. Concealing the truth from ourselves
    conceals it from others.

    In the early 1970s biologist Robert L. Trivers, now at Rutgers
    University, put scientific flesh on Twain's insight. Trivers made the
    case that our flair for self-deception might be a solution to an
    adaptive problem that repeatedly faced ancestral humans when they
    attempted to deceive one another. Deception can be a risky business.
    In the tribal, hunter-gatherer bands that were presumably the standard
    social environment in which our hominid ancestors lived, being caught
    red-handed in an act of deception could result in social ostracism or
    banishment from the community, to become hyena bait. Because our
    ancestors were socially savvy, highly intelligent primates, there came
    a point when they became aware of these dangers and learned to be
    self-conscious liars.

    This awareness created a brand-new problem. Uncomfortable, jittery
    liars are bad liars. Like Pinocchio, they give themselves away by
    involuntary, nonverbal behaviors. A good deal of experimental evidence
    indicates that humans are remarkably adept at making inferences about
    one another's mental states on the basis of even minimal exposure to
    nonverbal information. As Freud once commented, "No mortal can keep a
    secret. If his lips are silent, he chatters with his fingertips;
    betrayal oozes out of him at every pore." In an effort to quell our
    rising anxiety, we may automatically raise the pitch of our voice,
    blush, break out into the proverbial cold sweat, scratch our nose or
    make small movements with our feet as though barely squelching an
    impulse to flee.

    Alternatively, we may attempt to rigidly control the tone of our voice
    and, in an effort to suppress telltale stray movements, raise
    suspicion by our stiff, wooden bearing. In any case, we sabotage our
    own efforts to deceive. Nowadays a used-car salesman can hide his
    shifty eyes behind dark sunglasses, but this cover was not available
    during the Pleistocene epoch. Some other solution was required.

    Natural selection appears to have cracked the Pinocchio problem by
    endowing us with the ability to lie to ourselves. Fooling ourselves
    allows us to selfishly manipulate others around us while remaining
    conveniently innocent of our own shady agendas.

    If this is right, self-deception took root in the human mind as a tool
    for social manipulation. As Trivers noted, biologists propose that the
    overriding function of self-deception is the more fluid deception of
    others. Self-deception helps us ensnare other people more effectively.
    It enables us to lie sincerely, to lie without knowing that we are
    lying. There is no longer any need to put on an act, to pretend that
    we are telling the truth. Indeed, a self-deceived person is actually
    telling the truth to the best of his or her knowledge, and believing
    one's own story makes it all the more persuasive.

    Although Trivers's thesis is difficult to test, it has gained wide
    currency as the only biologically realistic explanation of
    self-deception as an adaptive feature of the human mind. The view also
    fits well with a good deal of empirically supported work on the
    evolutionary roots of social behavior.

    Of course, self-deception is not always so absolute. We are sometimes
    aware that we are willing dupes in our own con game, stubbornly
    refusing to explicitly articulate to ourselves just what we are up to.
    We know that the stories we tell ourselves do not jibe with our
    behavior, or they fail to mesh with physical signs such as a thumping
    heart or sweaty palms that betray our emotional states. For example,
    the students described earlier, who admitted their lies when watching
    themselves on videotape, knew they were lying at times, and most
    likely they did not stop themselves because they were not disturbed by
    this behavior.

    At other times, however, we are happily unaware that we are pulling
    the wool over our own eyes. A biological perspective helps us
    understand why the cognitive gears of self-deception engage so
    smoothly and silently. They cleverly and imperceptibly embroil us in
    performances that are so skillfully crafted that the act gives every
    indication of complete sincerity, even to the actors themselves.

