[Paleopsych] WP Magazine: See No Bias

Premise Checker checker at panix.com
Wed Jan 26 18:16:48 UTC 2005

See No Bias

    Many Americans believe they are not prejudiced. Now a new test
    provides powerful evidence that a majority of us really are.

    By Shankar Vedantam
    Sunday, January 23, 2005; Page W12

    A woman sat down in her Washington office to take a psychological
    test.
    Her office decor attested to her passion for civil rights -- as a
    senior activist at a national gay rights organization, and as a
    lesbian herself, fighting bias and discrimination is what gets her out
    of bed every morning. A rainbow flag rested in a mug on her desk.

    The woman brought up a test on her computer from a Harvard University
    Web site. It was really very simple: All it asked her to do was
    distinguish between a series of black and white faces. When she saw a
    black face she was to hit a key on the left, when she saw a white face
    she was to hit a key on the right. Next, she was asked to distinguish
    between a series of positive and negative words. Words such as
    "glorious" and "wonderful" required a left key, words such as "nasty"
    and "awful" required a right key. The test remained simple when two
    categories were combined: The activist hit the left key if she saw
    either a white face or a positive word, and hit the right key if she
    saw either a black face or a negative word.

    Then the groupings were reversed. The woman's index fingers hovered
    over her keyboard. The test now required her to group black faces with
    positive words, and white faces with negative words. She leaned
    forward intently. She made no mistakes, but it took her longer to
    correctly sort the words and images.

    Her result appeared on the screen, and the activist became very
    silent. The test found she had a bias for whites over blacks.

    "It surprises me I have any preferences at all," she said. "By the
    work I do, by my education, my background. I'm progressive, and I
    think I have no bias. Being a minority myself, I don't feel I should
    or would have biases."

    Although the activist had initially agreed to be identified, she and a
    male colleague who volunteered to take the tests requested anonymity
    after seeing their results. The man, who also is gay, did not show a
    race bias. But a second test found that both activists held biases
    against homosexuals -- they more quickly associated words such as
    "humiliate" and "painful" with gays and words such as "beautiful" and
    "glorious" with heterosexuals.

    If anything, both activists reasoned, they ought to have shown a bias
    in favor of gay people. The man's social life, his professional circle
    and his work revolve around gay culture. His home, he said, is in
    Washington's "gayborhood."

    "I'm surprised," the woman said. She bit her lip. "And disappointed."

    MAHZARIN BANAJI WAS ONE OF THE FIRST PEOPLE TO TAKE A BIAS TEST,
    now widely known as the Implicit Association Test.
    But whom could she blame? After all, she'd finally found what she was
    looking for.

    Growing up in India, Banaji had studied psychophysics, the
    psychological representation of physical objects: A 20-watt bulb may
    be twice as bright as a 10-watt bulb, for example, but if the two
    bulbs are next to each other, a person may guess the difference is
    only 5 watts. Banaji enjoyed the precision of the field, but she
    realized that she found people and their behavior toward one another
    much more interesting. The problem was that there was no accurate way
    to gauge people's attitudes. You had to trust what they told you, and
    when it came to things such as prejudice -- say, against blacks or
    poor people -- people usually gave politically correct answers. It
    wasn't just that people lied to psychologists -- when it came to
    certain sensitive topics, they often lied to themselves. Banaji began
    to wonder: Was it possible to create something that could divine what
    people really felt -- even if they weren't aware of it themselves?

    The results of one of Banaji's experiments as a young scholar at Yale
    University encouraged her. She and her colleagues replicated a
    well-known experiment devised by psychologist Larry Jacoby. Volunteers
    were first shown a list of unfamiliar names such as Sebastian
    Weisdorf. The volunteers later picked out that name when asked to
    identify famous people from a list of famous and unknown names.
    Because they had become familiar with the name, people mistakenly
    assumed Sebastian Weisdorf was a famous man. The experiment showed how
    subtle cues can cause errors without people's awareness.

    Banaji and her colleagues came up with a twist. Instead of Sebastian
    Weisdorf, they asked, what if the name was Sally Weisdorf? It turned
    out that female names were less likely to elicit the false-fame error;
    volunteers did not say Sally Weisdorf was a famous woman. Women, it
    appeared, had to be more than familiar to be considered famous. Banaji
    had stumbled on an indirect measure of gender bias.

    She began scouting for other techniques. In 1994, Anthony Greenwald,
    Banaji's PhD adviser and later her collaborator, came up with a
    breakthrough. Working out of the University of Washington, Greenwald
    drew up a list of 25 insect names such as wasp, cricket and cockroach,
    25 flower names such as rose, tulip and daffodil, and a list of
    pleasant and unpleasant words. Volunteers given a random list of
    these words and told to sort them into the four groups found it very
    easy to put each word in the right category. It was just as easy
    when insects were grouped with unpleasant words and flowers were
    grouped with pleasant words.

    But when insects were grouped with pleasant words, and flowers with
    unpleasant words, the task became unexpectedly difficult. It was
    harder to hold a mental association of insects with words such as
    "dream," "candy" and "heaven," and flowers with words such as "evil,"
    "poison" and "devil." It took longer to complete the task.

    Psychologists have long used time differences to measure the relative
    difficulty of tasks. The new test produced astonishing results.
    Greenwald took the next step: Instead of insects and flowers, he used
    stereotypically white-sounding names such as Adam and Chip and
    black-sounding names such as Alonzo and Jamel and grouped them with
    the pleasant and unpleasant words. He ran the test on himself.

    "I don't know whether to tell you I was elated or depressed," he says.
    "It was as if African American names were insect names and European
    American names were flower names. I had as much trouble pairing
    African American names with pleasant words as I did insect names with
    pleasant words."

    Greenwald sent Banaji the computer test. She quickly discovered that
    her results were similar to his. Incredulous, she reversed the order
    of the names in the test. She switched the left and right keys. The
    answer wouldn't budge.

    "I was deeply embarrassed," she recalls. "I was humbled in a way that
    few experiences in my life have humbled me."

    The Implicit Association Test is designed to examine which words and
    concepts are strongly paired in people's minds. For example,
    "lightning" is associated with "thunder," rather than with "horses,"
    just as "salt" is associated with "pepper," "day" with "night." The
    reason Banaji and Greenwald still find it difficult to associate black
    faces with pleasant words, they believe, is the same reason it is
    harder to associate lightning with horses than with thunder.
    Connecting concepts that the mind perceives as incompatible simply
    takes extra time. The time difference can be quantified and, the
    creators of the test argue, is an objective measure of people's
    implicit attitudes.
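    The quantification described above can be illustrated with a short
    sketch. This is not the IAT's published scoring code; it is a
    simplified, hypothetical version of the "D score" idea -- the mean
    response-latency difference between the two pairing conditions,
    scaled by the pooled standard deviation -- using invented latencies:

```python
# Simplified, hypothetical IAT-style scoring sketch (not the official
# algorithm): the bias score is the mean response-latency difference
# between the "congruent" and "incongruent" pairing blocks, divided by
# the pooled standard deviation of all latencies.
from statistics import mean, stdev

def iat_d_score(congruent_ms, incongruent_ms):
    """Positive values mean the incongruent pairing was slower,
    i.e. the concepts were harder to associate."""
    pooled_sd = stdev(congruent_ms + incongruent_ms)
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

# One hypothetical test taker's trial latencies, in milliseconds:
congruent = [612, 655, 701, 598, 640]    # e.g. flowers + pleasant words
incongruent = [802, 850, 777, 910, 795]  # e.g. insects + pleasant words

print(round(iat_d_score(congruent, incongruent), 2))
```

    A score near zero would suggest no measurable preference; larger
    positive values mean the incongruent pairing took consistently
    longer.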

    For years, Banaji had told students that ugly prejudices were not just
    in other people but inside themselves. As Banaji stared at her
    results, the cliche felt viscerally true.

    Implicit attitudes, the researchers found, were better
    predictors of many behaviors than people's explicit opinions were.
    They predicted preferences on matters of public policy -- even
    ideological affiliations. Banaji and others soon developed tests for
    bias against gays, women and foreigners. The bias tests, which have
    now been taken by more than 2 million people, 90 percent of them
    American, and used in hundreds of research studies, have arguably
    revolutionized the study of prejudice. In their simplicity, the tests
    have raised provocative questions about this nation's ideal of a
    meritocracy and the nature of America's red state/blue state political
    divide. Civil rights activists say the tests have the potential to
    address some of the most corrosive problems of American society;
    critics, meanwhile, have simultaneously challenged the results and
    warned they could usher in an Orwellian world of thought crimes.
    Banaji has received death threats from supremacist groups; sensing
    that the tests can detect secrets, officials from the Central
    Intelligence Agency have made discreet inquiries.

    The results of the millions of tests that have been taken anonymously
    on the Harvard Web site and other sites hint at the potential impact
    of the research. Analyses of tens of thousands of tests found 88
    percent of white people had a pro-white or anti-black implicit bias;
    nearly 83 percent of heterosexuals showed implicit biases for straight
    people over gays and lesbians; and more than two-thirds of non-Arab,
    non-Muslim volunteers displayed implicit biases against Arab Muslims.

    Overall, according to the researchers, large majorities showed biases
    for Christians over Jews, the rich over the poor, and men's careers
    over women's careers. The results contrasted sharply with what most
    people said about themselves -- that they had no biases. The tests
    also revealed another unsettling truth: Minorities internalized the
    same biases as majority groups. Some 48 percent of blacks showed a
    pro-white or anti-black bias; 36 percent of Arab Muslims showed an
    anti-Muslim bias; and 38 percent of gays and lesbians showed a bias
    for straight people over homosexuals.

    "The Implicit Association Test measures the thumbprint of the culture
    on our minds," says Banaji, one of three researchers who developed the
    test and its most ardent proponent. "If Europeans had been carted to
    Africa as slaves, blacks would have the same beliefs about whites that
    whites now have about blacks."

    As the tests have been refined, replicated and reinterpreted over the
    past decade, they have challenged many popular notions -- beginning
    with the increasingly common assertion that discrimination is a thing
    of the past.

    The research has also upset notions of how prejudice can best be
    addressed. Through much of the 20th century, activists believed that
    biases were merely errors of conscious thought that could be corrected
    through education. This hopeful idea is behind the popularity of
    diversity training. But Banaji suggests such training relies on the
    wrong idea of how people form biases.

    There is likely a biological reason people so quickly make assumptions
    -- good or bad -- about others, Banaji says. The implicit system is
    likely a part of the "primitive" brain, designed to be reactive rather
    than reasoned. It specializes in quick generalizations, not subtle
    distinctions. Such mental shortcuts probably helped our ancestors
    survive. It was more important when they encountered a snake in the
    jungle to leap back swiftly than to deduce whether the snake belonged
    to a poisonous species. The same mental shortcuts in the urban jungles
    of the 21st century are what cause people to form unwelcome
    stereotypes about other people, Banaji says. People revert to the
    shortcuts simply because they require less effort. But powerful as
    such assumptions are, they are far from permanent, she says. The
    latest research, in fact, suggests these attitudes are highly
    malleable.

    Such reassurance has not assuaged test takers, who are frequently
    shocked by their results. The tests are stupid, and the results are
    wrong, some say. People have argued that the tests are measures of
    only hand-eye coordination or manual dexterity. Some have complained
    about which groups are assigned to the left- and right-hand keys, and
    about how the computer switches those categories. None of these
    factors has any real impact on the results, but Banaji believes the
    complaints are a sign of embarrassment. Americans find evidence of
    implicit bias particularly galling, Banaji theorizes, because more
    than any other nation, America is obsessed with the ideal of fairness.
    Most of the people approached for this article declined to
    participate; several prominent politicians, Republican and
    Democratic, refused to take the tests. The aide to one senator
    bristled, "You think he is a racist!"

    But the tests do not measure actions. The race test, for example,
    does not measure racism so much as race bias. Banaji is the first to
    say
    people ought to be judged by how they behave, not how they think. She
    tells incredulous volunteers who show biases that it does not mean
    they will always act in biased ways -- people can consciously override
    their biases. But she also acknowledges a sad finding of the research:
    Although people may wish to act in egalitarian ways, implicit biases
    are a powerful predictor of how they actually behave.

    VOLUNTEERS WHO TAKE THE IMPLICIT ASSOCIATION TEST are asked a few
    questions about themselves.
    The tests are anonymous, but volunteers are asked about their sex,
    race and whether they consider themselves liberal or conservative.

    The voluntary questionnaires have allowed Banaji and her colleagues to
    arrive at one of the most provocative conclusions of the research:
    Conservatives, on average, show higher levels of bias against gays,
    blacks and Arabs than liberals, says Brian Nosek, a psychologist at
    the University of Virginia and a principal IAT researcher with
    Greenwald and Banaji. In turn, bias against blacks and Arabs predicts
    policy preferences on affirmative action and racial profiling. This
    suggests that implicit attitudes affect more than snap judgments --
    they play a role in positions arrived at after careful consideration.

    Brian Jones, a Republican National Committee spokesman, says the
    findings are interesting in an academic context but questions whether
    they have much relevance in the real world. "It's interesting to
    ponder how people implicitly make decisions, but ultimately we live in
    a world where explicit thoughts and actions are the bottom line," he
    says. Volunteers drawn to the tests were not a random sample of
    Americans, Jones adds, cautioning against reading too much into the
    results.

    Though it's true that about two-thirds of test takers lean liberal,
    Banaji says, the sample sizes are so large that randomness is not a
    serious concern. And Andy Poehlman, a graduate student at Yale, has
    tracked 61 academic studies using the IAT to explore how implicit
    attitudes predict people's actions.

    When volunteers who took the race bias test were given the option to
    work with a white or black partner, one study found, those with the
    strongest implicit bias scores on the test tended to choose a white
    partner. Another study found that volunteers with lower bias scores
    against gays were more willing to interact with a stranger holding a
    book with an obviously gay theme. A third experiment found that when
    volunteers were told that another person was gay, those whose scores
    indicated more bias against gays were more likely to avoid eye contact
    and show other signs of unfriendliness. A study in Germany by
    psychologist Arnd Florack found that volunteers whose results
    suggested more bias against Turks -- an immigrant group -- were more
    likely to find a Turkish suspect guilty when asked to make a judgment
    about criminality in an ambiguous situation.

    In another study by psychologist Robert W. Livingston at the
    University of Wisconsin, Poehlman says, volunteers were given details
    of a crime in which a Milwaukee woman had been assaulted, suffered a
    concussion and required several stitches. In this case, Poehlman says,
    some volunteers were told the perpetrator had been proven to be David
    Edmonds from Canada. Others were told the guilty perpetrator was Juan
    Luis Martinez from Mexico. Volunteers were asked what length of
    sentence was appropriate for the crime: Bias scores against Hispanics
    on the implicit tests tended to predict a longer sentence for the
    Hispanic perpetrator.

    An implicit attitude "doesn't control our behavior in a be-all and
    end-all kind of way, but it flavors our behavior in a pretty
    consistent way," says Poehlman.

    In perhaps the most dramatic real-world correlate of the bias tests,
    economists at the Massachusetts Institute of Technology and the
    University of Chicago recently sent out 5,000 résumés to 1,250
    employers who had help-wanted ads in Chicago and Boston. The résumés
    were culled from Internet Web sites and mailed out with one crucial
    change: Some applicants were given stereotypically white-sounding
    names such as Greg; others were given stereotypically black-sounding
    names.

    Interviews beforehand with human resources managers at many companies
    in Boston and Chicago had led the economists to believe that black
    applicants would be more likely to get interview calls: Employers said
    they were hungry for qualified minorities and were aggressively
    seeking diversity. Every employer got four résumés: an average white
    applicant, an average black applicant, a highly skilled white
    applicant and a highly skilled black applicant.

    The economists measured only one outcome: which résumés triggered
    callbacks.

    To the economists' surprise, the résumés with white-sounding names
    triggered 50 percent more callbacks than résumés with black-sounding
    names. Furthermore, the researchers found that the high-quality black
    résumés drew no more calls than the average black résumés. Highly
    skilled candidates with white names got more calls than average white
    candidates, but lower-skilled candidates with white names got many
    more callbacks than even highly skilled black applicants.

    "Fifty percent? That's huge," says Sendhil Mullainathan, an economist
    who led the study and who recently moved to Harvard to work with
    Banaji. Human resources managers were stunned by the results, he says.
    Discrimination, says Mullainathan, can occur not only without the
    intent to discriminate, but despite explicit desires to recruit
    minorities. Implicit attitudes need only sway a few decisions to have
    large impact, he says. For example, if implicit bias caused a
    recruiter to set one résumé aside, it could be just one of 100
    decisions the recruiter made that day. Collectively, however, such
    decisions can have dramatically large consequences.
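    Mullainathan's point about small skews compounding can be seen with
    back-of-the-envelope arithmetic. The per-résumé rates below are
    illustrative, chosen only to reproduce a gap of roughly the size the
    study reported:

```python
# Hypothetical per-résumé callback rates: a gap of a few percentage
# points per screening decision.
white_rate, black_rate = 0.10, 0.0667

# 5,000 résumés split evenly between the two groups of names.
resumes_per_group = 2500
white_calls = white_rate * resumes_per_group
black_calls = black_rate * resumes_per_group

# Aggregated over thousands of decisions, the small per-decision skew
# becomes a large relative gap in outcomes.
print(f"{white_calls / black_calls - 1:.0%} more callbacks")
```

    Each individual screener passes over only a handful of extra
    résumés, yet the aggregate gap comes to about 50 percent.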

    SAJ-NICOLE JONI WAS ONCE A MATHEMATICS PROFESSOR AT MIT. It was
    1977, and there were no women's
    bathrooms in her building. Joni was not particularly surprised. She
    had battled obstacles all her life. When she first declared -- at age
    12 -- that she was going to be a mathematician, her announcement
    evoked gales of laughter at a family gathering. But opposition only
    made her more determined. After a successful stint at MIT, Joni worked
    for Microsoft and then launched a successful business consulting firm
    called the Cambridge International Group Ltd. Her recent book, The
    Third Opinion, stresses the importance of seeking diverse points of
    view.

    Joni was recently introduced to Banaji and expressed interest in
    taking the Implicit Association Test. Like most volunteers, she did
    not think she had biases and believed strongly in "meeting people as
    they are, without looking at the color of their skin."

    Given Joni's background, Banaji thought it would be interesting for
    her to take a bias test that examined whether Joni associated men or
    women with careers in science. Most people find it easier to associate
    men with the sciences -- but Joni was clearly not most people.

    The test came up on the screen. Joni's fingers, trained for many years
    on the piano, flew as she classified a number of words such as
    "husband," "father," "mother" and "wife" between "male" and "female"
    groups. She then grouped words such as "chemistry," "history,"
    "astronomy" and "music" under "science" or "liberal arts." The
    computer then asked her to group "male" with "science" and "female"
    with "liberal arts."

    When the groupings were reversed, Joni had to group "male" words with
    "liberal arts," and "female" words with various disciplines in
    science. She made a mistake in classifying "uncle." She hesitated over
    "astronomy" and made a second mistake in classifying "physics."

    The results popped up: "Your data show a strong association between
    science and Male relative to Female."

    Joni's fingers tapped the table in frustration. "I fought for women to
    be scientists all my life," she said, incredulous. Banaji nodded
    sympathetically. Her own results on this test were similar.

    While Banaji says such results show the pervasive power that cultural
    biases have even on those who are themselves the victims of such
    biases, critics of the Implicit Association Test have asked whether it
    might be merely measuring people's awareness of bias. In other words,
    might Joni and Banaji associate men with careers in science precisely
    because, as women who chose to be scientists, they were intimately
    familiar with the obstacles? Alternatively, could the tests be picking
    up something about the larger culture, rather than about the
    individual herself?

    Banaji says that researchers have shown the implicit tests are
    measuring more than mere awareness of bias, through studies that
    cancel out the effects of familiarity.

    "Is the IAT picking up something about the culture?" Banaji asks.
    "Yes, but it is picking up that aspect of the culture that has gotten
    into your brain and mind."

    On the race test, for example, a sophisticated brain-imaging study
    showed that implicit bias tests can predict fear responses among
    volunteers. Banaji and New York University neuroscientist Elizabeth
    Phelps had white volunteers take the implicit race bias test and then
    undergo sophisticated brain scans called fMRIs, which measure
    instantaneous changes in brain activity. Those with the most bias on
    the implicit tests showed the most activity in the brain area called
    the amygdala, when photos of black faces, obtained from college
    yearbooks, were flashed before their eyes. The amygdala is part of the
    primitive brain involved with fear responses.

    But the critics persist. Philip Tetlock, a professor of organizational
    behavior in the business school at the University of California at
    Berkeley, and Ohio State University psychology professor Hal Arkes
    argue that Jesse Jackson might score poorly on the test. They cite the
    civil rights leader's statement a decade ago that there was nothing
    more painful at that stage of his life "than to walk down the street
    and hear footsteps and start thinking about robbery. Then look around
    and see somebody white and feel relieved."

    If a prominent black civil rights leader could hold such a bias,
    Tetlock and Arkes ask, what do bias scores really mean? Whatever the
    IAT is measuring, Tetlock and Arkes argue, it is not what people would
    call discrimination -- no one would dream of accusing Jesse Jackson of
    harboring feelings of hostility toward African Americans.

    Banaji says Tetlock and Arkes are relying on an outmoded notion of
    discrimination. The IAT research shows that hostility is not needed
    for discrimination to occur. Women and minorities can just as easily
    harbor biases, absorbed from the larger culture, that can lead them to
    discriminate against people like themselves.

    Tetlock says he thinks the IAT research project is drawing conclusions
    much more sweeping than are justified.

    "One of the key points in contention is not a psychological point, it
    is a political point," says Tetlock. "It is where we are going to set
    our threshold of proof for saying something represents prejudice. My
    view is the implicit prejudice program sets the threshold at a
    historical low."

    By the standards of slavery and segregation, the critics argue, delays
    in mental associations are trivial. "We've come a long way from Selma,
    Alabama, if we have to calibrate prejudice in milliseconds," says
    Tetlock.

    But the biases that the tests uncover are not trivial, Banaji
    counters. Their consequences, while subtler, could be devastating. In
    settings such as the criminal justice system, she argues, lives may
    hang in the balance.

    In their most controversial argument, Tetlock and Arkes asked whether
    some implicit biases might simply be politically incorrect truths. By
    comparing national statistics of violent crime against census figures
    of different ethnic groups, the researchers argued it was more likely
    for a violent crime to be perpetrated by an African American man than
    a white man. Would it not therefore be rational, they asked, for
    people to hold biases against blacks?

    Even here, however, rationality did not appear to be the prime mover,
    Banaji argues. Even if whites and blacks committed crimes at exactly
    the same rate, Banaji says, people would assign greater weight to the
    black crimes. This phenomenon is known as an illusory correlation:
    Aberrational behavior by a member of a minority group is not only
    given greater prominence in the mind but is also more easily
    associated with the entire group, rather than just the individual.
    "When in-groups do bad things, we think it is individual behavior or
    circumstance," says Jerry Kang, a UCLA law professor who is interested
    in policy applications of the research. "I screw up because it is a
    bad day; others screw up because they are incompetent."

    THE TEST'S ABILITY TO UNCOVER IMPLICIT ATTITUDES AND PREDICT
    BEHAVIOR has raised questions about its
    potential uses. Might it predict, for example, which police officers
    are likely to mistakenly shoot an unarmed black man? Should such tests
    be used to cull juries of people with high bias scores? Might
    employers use such tests to weed out potential racists? Might
    employees trying to prove discrimination demand that their bosses take
    bias tests?

    The problem, Banaji says, is that all those uses assume that someone
    who shows bias on the test will always act in a biased manner. Because
    this isn't true, Banaji and her colleagues argue against the use of
    the IAT as a selection tool or a means to prove discrimination. Banaji
    says she and her colleagues will testify in court against any attempt
    to use the test to identify biased individuals.

    Another reason to limit the IAT's use: Research has shown that
    individuals who are highly motivated can successfully fool the tests
    by temporarily holding counter-stereotypes in their minds. (Other
    attempts to fool the tests -- such as consciously attempting to
    respond faster or slower -- tend to change results only slightly, if
    at all, Banaji says.) Banaji hesitates to perform real-world studies
    that examine, for instance, whether police officers with the most bias
    are the most likely to shoot an unarmed suspect in an ambiguous
    situation, because the results of such studies could be subpoenaed and
    used in lawsuits against police departments. The researchers say they
    want to keep the focus of the tests on public education and research.
    They are wary of having the tests used in lawsuits, because if people
    feared their results might one day be used against them, they would be
    hesitant to use the tests for personal education.

    Banaji says she is keenly aware that psychology has a long history of
    tests -- starting with the "lie-detector" polygraph -- that have been
    hyped and misused. Personality tests that lack the rigor of the
    Implicit Association Test have been widely used by companies in
    employee training and even hiring. Atop Banaji's desk at work is a
    bust of a human skull marked with different brain areas once thought
    to be responsible for different emotions: a representation of the
    discredited science of phrenology. The bust is a daily warning about
    the many failed ways science has promised to unlock people's minds.

    But even as Banaji hears from critics who say the Implicit Association
    Test, which is not patented, will get misused, some proponents tell
    her it would be unethical not to use the test to screen officials who
    make life-and-death decisions about others. One test in a British jail
    showed that, compared with other criminals, pedophiles had implicit
    associations linking children and sexual attraction. Should such tests
    be used to determine which pedophiles have been rehabilitated and
    should be eligible for parole or, more controversially, as a law
    enforcement tool to evaluate which individuals are at risk of
    harming children?

    "People ask me, 'How do you sleep at night knowing this can be
    misused?'" Banaji says. "Others ask me, 'How do you sleep at night
    knowing this won't be used fully?'"

    THE EXECUTIVES HAD GATHERED at their firm's offices on Fifth
    Avenue, across from New York's St. Patrick's Cathedral. They were a
    self-assured, competitive bunch, the type of crowd that usually views
    academics with skepticism. The executives had assembled for one of the
    leadership training programs that the firm mandates, and the mood in
    the room was very much "uh-huh, uh-huh," and "here we go again," says
    Barbara Byrne, a senior executive at the company who was present.

    Banaji told the executives she was going to test their skills of
    observation. She played a video of a basketball game. Shot in
    black-and-white, the video showed a swift series of basketball passes
    between players with rapidly changing positions. Banaji asked the
    executives to count the number of passes. The group loved competitive
    exercises. As soon as the short clip was over, answers came flying
    from all sides: Five! Seven! Eleven!

    Banaji asked whether anyone had seen anything unusual. No one had
    noticed anything out of place. Banaji played the video again, this
    time instructing her audience not to pay any attention to the
    basketball passes. Halfway through the video clip, a woman with an
    open umbrella slowly walked through the frame from one end to the
    other. Stunned at what they had missed, the executives collapsed in
    helpless laughter.

    "I sat there and said, God, it wasn't subtle," says Byrne. "It was a
    woman with an open umbrella. It was right in front of your eyes. But
    you were so focused on counting the basketballs, that part of your
    brain was not functioning."

    Banaji's point was that human beings filter what they see through the
    lenses of their own expectations. People believe they are acting
    rationally, but numerous psychological tests prove that subtle cues
    influence people all the time without their knowledge.

    "You thought to yourself, Maybe [hidden biases] could influence me in
    other ways," Byrne says.

    No one knows exactly why people develop implicit biases. Living in a
    diverse neighborhood does not in itself seem to reduce bias, but
    having close friendships with people from other ethnic groups does
    appear to lower bias, the IAT researchers have found. Saj-nicole Joni,
    who is white, for example, did not have test results showing a race
    bias and said she has long been close friends with an African American
    woman. Morgan Walls, an African American woman who works at the Peace
    Corps in the District, used to work in Thailand and has retained her
    connections with Asia. Her test suggested no bias toward European
    Americans or Asian Americans. Jeff Chenoweth, the director of national
    operations at the Catholic Legal Immigration Network in Washington,
    appeared to have no bias against Arab Muslims compared with people
    from other parts of the world. As he took the tests, Chenoweth, a
    white man
    and a devout evangelical, said he was planning to have two Iraqi
    Shiite Muslims over to his home for Christmas dinner. "I've lived as a
    minority in an Arab country and have 10 close friends who are Arab,"
    he said.

    Banaji herself shows no implicit biases against gays or Jews -- a
    result, she believes, of an upbringing where explicit biases against
    those groups were largely nonexistent.

    There is growing evidence that implicit attitudes can be changed
    through exposure to counter-stereotypes. When the race test is
    administered by a black man, test takers' implicit bias toward blacks
    is reduced, says Irene Blair, a University of Colorado psychologist
    who recently conducted a review of studies that looked at how
    attitudes could be changed. Volunteers who mentally visualized a
    strong woman for a few seconds -- some thought of athletes, some
    thought of professionals, some thought of the strength it takes to be
    a homemaker -- had lower bias scores on gender tests. Having people
    think of black exemplars such as Bill Cosby or Michael Jordan lowered
    race bias scores. One experiment found that stereotypes about women
    became weaker after test takers watched a Chinese woman use chopsticks
    and became stronger after they watched the woman put on makeup.
    Interventions as brief as a few seconds had effects that lasted at
    least as long as 24 hours. But the volunteers were not aware of their
    attitudes having been changed.

    Having counter-stereotypical experiences, in other words, might be
    much like going on a new diet with healthier food. Just as healthy
    eating can have a subtle impact on how people look and feel,
    counter-stereotypical experiences sustained throughout one's life seem
    to subtly change how one thinks. But, Banaji says, such experiences
    may not eliminate bias altogether.

    Banaji believes that conscious efforts are needed to fight what she
    calls ordinary prejudice, the primitive brain filtering the world
    through its biased lenses without the conscious part of the brain
    being aware of it. Tests have shown, for example, that when people are
    given a sense of power, they show greater biases than they did before.
    As a result, workplaces that are explicitly more egalitarian might be
    implicitly less biased, she says. Since Mullainathan found startling
    differences in his résumé study, he says, he has come to believe that
    personal identifiers should be removed from résumés to make
    evaluations more fair. Another area highly prone to implicit biases is
    job interviews, says Max Bazerman of Harvard Business School. "What
    you need to do is look at objective measures separate from the

    Banaji and Kang believe the IAT can be used as one measure to
    determine when affirmative action policies ought to be ended. Rather
    than pick an arbitrary amount of time -- Supreme Court Justice Sandra
    Day O'Connor recently suggested 25 years -- the researchers asked
    whether such policies should expire when implicit tests show that
    people are really evaluating others without bias. Banaji and Kang are
    less interested in using affirmative action to redress historical
    wrongs -- they argue it is essential to fight discrimination still
    taking place today.

    Lani Guinier, President Bill Clinton's unsuccessful nominee for
    assistant attorney general for civil rights and now a professor at
    Harvard, is a fan of Banaji's work. But she says she worries the IAT
    will usher in superficial changes. The decor on the walls might be
    important, she says, but it isn't the real problem. "I worry people
    will think you can depress [implicit bias] scores through sporadic
    interventions," she says. "That will channel our efforts toward reform
    in potentially modest ways that don't fundamentally change the
    cultural swamp in which we are living."

    Banaji disagrees. Decades of research in social psychology, she says,
    have demonstrated that small cues can have a powerful impact on the way
    people think and behave. Finding evidence of implicit bias, she says,
    is like driving a car and discovering that, although the steering
    wheel is being held straight, the vehicle is drifting to one side.
    Banaji's solution: However strange it may feel, the driver should
    consciously hold the steering wheel against the known bias.

    "The implicit system is dumb," Banaji says. "It reacts to what it
    sees. That is its drawback. But if we change the environment, we can
    change our attitudes."

    Banaji has applied her research to her own life. Her office at Harvard
    is
    testimony. At eye level on a bookshelf are postcards of famous women
    and African Americans: George Washington Carver, Emma Goldman, Miles
    Davis, Marie Curie, Frederick Douglass and Langston Hughes. During one
    interview, she wore a brooch on her jacket depicting Africa. What
    might seem like political correctness to some is an evidence-based
    intervention to combat her own biases, Banaji says.

    People's minds do not function with the detachment of machines, she
    says. For example, when she was recently asked to help select a
    psychologist for an award, Banaji says, she and two other panelists
    drew up a list of potential winners. But then they realized that their
    implicit biases might have eliminated many worthy candidates. So they
    came up with a new approach. They alphabetically went down a list of
    all the psychologists who were in the pool and evaluated each in turn.

    "Mind bugs operate without us being conscious of them," Banaji says.
    "They are not special things that happen in our heart because we are

    But assumptions lead to attitudes, and attitudes lead to choices with
    moral and political consequences. So, whether she is in a classroom or
    a grocery store, Banaji says, she forces herself to engage with people
    she might otherwise have avoided.

    Just before Halloween, Banaji says, she was in a Crate & Barrel store
    when she spied a young woman in a Goth outfit. The woman had spiky
    hair that stuck out in all directions. Her body was pierced with
    studs. Her skull was tattooed. Banaji's instant reaction was distaste.
    But then she remembered her resolution. She turned to make eye contact
    with the woman and opened a conversation.

    Shankar Vedantam covers science and human behavior for The Post's
    National desk. He will be fielding questions and comments about this
    article Monday at 1 p.m. at washingtonpost.com/liveonline.

    How the Web Version Of the Implicit Association Test Works

    By linking together words and images, the race bias test measures what
    associations come most easily to mind. People who take the Web version
    are asked to classify a series of faces into two categories, black
    American and white American. They are then asked to mentally associate
    the white and black faces with words such as "joy" and "failure."
    Under time pressure, many Americans find it easier to group words such
    as "failure" with black faces, and words such as "joy" with white
    faces. The test "measures the thumbprint of the culture on our minds,"
    says Harvard psychologist Mahzarin Banaji.
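    The measurement idea is simple to sketch in code: the slower a person
    is on one pairing of faces and words relative to the other, the
    stronger the inferred association. The short Python sketch below is a
    toy illustration only; the actual test uses a more elaborate scoring
    procedure, and the function name and reaction times here are
    hypothetical.

```python
# Toy sketch of the IAT's core logic: compare average response times
# between the two pairing conditions. A positive gap means the taker
# was slower on the "incongruent" pairing -- the pattern the article
# describes as evidence of an implicit association.
# (Illustrative only; the real test's scoring is more elaborate.)

def mean(times):
    return sum(times) / len(times)

def iat_gap(congruent_ms, incongruent_ms):
    """Difference in mean reaction time, in milliseconds."""
    return mean(incongruent_ms) - mean(congruent_ms)

# Hypothetical reaction times in milliseconds for one test taker
congruent = [620, 580, 650, 600]     # e.g., white face + positive word
incongruent = [780, 820, 760, 800]   # e.g., black face + positive word

print(iat_gap(congruent, incongruent))  # 177.5
```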

    To take the Implicit Association Test, go to https:

    To better understand how the test works and your results, go to

    The Paper Version Of the Implicit Association Test

    This test was designed by University of Washington psychologist
    Anthony Greenwald. It is intended to measure how easily people
    associate home- and career-related words with either men or women. If
    you can, time yourself as you do Part 1 and compare the result with
    how long it takes to do Part 2. Many people find grouping men with
    home words takes longer than grouping women with home words --
    evidence of a possible gender bias. Do you think your results occurred
    because you took the tests in a particular order? You can repeat the
    tests again, this time pairing men with career words in Part 1 and
    women with career words in Part 2. Whichever part took longer the
    first time should be shorter this time, and vice versa. Results from
    the Web version are considered more reliable than those from the paper
    version.

    Part 1

    The words in this first list are in four categories. MALE NAMES and
    FEMALE NAMES are in CAPITAL letters. Home-related and career-related
    words are in lowercase. Go through the list from left to right, line
    by line, putting a line through only each MALE NAME and each
    home-related word. Do this as fast as you can.

    executive LISA housework SARAH entrepreneur DEREK silverware MATT
    cleaning TAMMY career BILL corporation VICKY office STEVE
    administrator PAUL home AMY employment PEGGY dishwasher MARK babies
    BOB marriage MIKE professional MARY merchant JEFF garden KEVIN family
    HOLLY salary SCOTT shopping DIANA business DONNA manager EMILY laundry
    JOHN promotion KATE commerce JILL kitchen GREG children JASON
    briefcase JOAN living room ANN house ADAM

    Part 2

    The following list is the same as the one above. This time, go through
    the list putting a line through only each FEMALE NAME and each
    home-related word. Again do this as fast as you can.

    executive LISA housework SARAH entrepreneur DEREK silverware MATT
    cleaning TAMMY career BILL corporation VICKY office STEVE
    administrator PAUL home AMY employment PEGGY dishwasher MARK babies
    BOB marriage MIKE professional MARY merchant JEFF garden KEVIN family
    HOLLY salary SCOTT shopping DIANA business DONNA manager EMILY laundry
    JOHN promotion KATE commerce JILL kitchen GREG children JASON
    briefcase JOAN living room ANN house ADAM

    The Deese-Roediger-McDermott Test (Part 1)

    Much as we like to believe that our perceptions and memories are
    always accurate, a number of experiments show people routinely make
    errors in how they see and remember things, without their being aware
    of it. Read the list of words in this box. Then refer to Part 2.

    The Deese-Roediger-McDermott Test (Part 2)

    Go through the words in this list, without referring back to the other
    list. Check all of the words that you recall as being in the previous
    list. The explanation of the test is below.

    Explanation: Harvard psychologist Mahzarin Banaji offers this test in
    lectures to show how easily a false memory can be created. Most people
    remember seeing the word "insect" in the first list. The mistake
    happens because the words in the first box were associated with
    insects: Unlike a machine, human memory is prone to error, because of
    reasonable but incorrect assumptions. "Mind bugs operate without us
    being conscious of them," Banaji says.
