[Paleopsych] Science: Forbidden Knowledge

Premise Checker checker at panix.com
Tue Mar 29 20:49:35 UTC 2005

Science: Forbidden Knowledge

[Send me your own lists of the top five forbidden areas, with or without 
the right to distribute them using your name.]

[First, the summary from the Chronicle of Higher Education. Then the article 
from Science itself, followed by supplementary materials.]

News bulletin from the Chronicle of Higher Education, 5.2.11

   Scientists Censor What They Study to Avoid Controversy and 'Lunatic-Proof'
   Their Lives, Researchers Find


    Unwritten social and political rules affect what scientists in many
    fields study and publish, according to a paper published today in
    Science, and those constraints are even more prevalent than formal
    constraints, such as government or university regulations.

    The paper is based on interviews with 41 researchers at top academic
    departments in fields such as neuroscience, drug and alcohol abuse,
    and molecular and cellular biology. The interviews were conducted by
    Joanna Kempner, Clifford S. Perlis, and Jon F. Merz, of the University
    of Michigan at Ann Arbor, Brown University, and the University of
Pennsylvania, respectively. They asked the researchers if they or any
of their colleagues had ever refrained from doing or publishing
research because of concerns about controversy.

    Almost half of those interviewed said they felt constrained by formal
    controls, but the respondents said they felt even more affected by
    informal ones. Many of the scientists interviewed said they had found
    out their research was "forbidden knowledge" only after papers
    reporting their results had been published.

    One respondent told the interviewers that a colleague's graduate
    student had a job offer rescinded when the would-be employer found out
    the student had worked on a study of race and intelligence. Another
researcher stood accused of "murderous behavior" after conducting an
anonymous survey in which he was unable to intervene when
    respondents said they were infected with HIV and were having sex
    without a condom.

    Many other researchers said they simply chose not to do studies, or
    not to publish completed ones, because of concern about controversy.
    Several said they did not study dogs or other higher mammals because
    of fears of animal-rights activism. "I would like to lunatic-proof my
    life as much as possible," one told the interviewers.

    Mr. Merz, an assistant professor in Penn's department of medical
    ethics, said the study was not designed to determine the abundance of
    constraints on science. But, he said, just from the small group the
    researchers interviewed, it is clear that people feel constrained
    "fairly frequently."

    "It's a source of bias, another source of nonobjectivity in science,"
    he continued. "It's hard to measure. We don't know really what's not
    being done."

Forbidden Knowledge
Science, Vol. 307, Issue 5711, p. 854, 11 February 2005
Joanna Kempner,1 Clifford S. Perlis,2 Jon F. Merz3*

There is growing concern about the politicization and social control of 
science, constraining the conduct, funding, publication, and public use of 
scientific research (1). For example, human cloning and embryonic stem cell 
creation have been regulated or banned (2), activists have been lobbying 
Congress to remove funding from certain government-sponsored research (3-5), 
and science journal editors have been compelled to develop policies for 
publication of sensitive manuscripts (6, 7).

Forbidden knowledge embodies the idea that there are things that we should not 
know (8-15). Knowledge may be forbidden because it can only be obtained 
through unacceptable means, such as human experiments conducted by the Nazis 
(9, 11); knowledge may be considered too dangerous, as with weapons of mass 
destruction or research on sexual practices that undermine social norms (8, 9, 
12); and knowledge may be prohibited by religious, moral, or secular authority, 
exemplified by human cloning (10, 12).

Beyond anecdotal cases, little is known about what, and in what ways, science 
is constrained. To begin to fill this gap, we performed an interview study to 
examine how constraints affect what scientists do. In 2002-03, we conducted 10 
pilot and 41 in-depth semistructured interviews with a sample of researchers 
drawn from prestigious U.S. academic departments of neuroscience, sociology, 
molecular and cellular biology, genetics, industrial psychology, drug and 
alcohol abuse, and computer science. We chose diverse disciplines to gauge the 
range, rather than prevalence, of experiences.

We asked subjects to consider their practices and rationales for limiting 
scientific inquiry or dissemination and to tell us about cases in which 
research in their own discipline had been constrained. Respondents reported a 
wide range of sensitive topics, including studies relating to human cloning, 
embryonic stem cells, weapons, race, intelligence, sexual behaviors, and 
addiction, as well as concerns about using humans and animals in research.

Nearly half the researchers felt constrained by explicit, formal controls, such 
as governmental regulations and guidelines codified by universities, 
professional societies, or journals. Respondents generally agreed that formal 
controls offered important protections. Less consensus surrounded the 
necessity, efficiency, or good sense of specific policies. Stem cell research 
was repeatedly identified as an example of an overly restricted area. Many 
respondents expressed a preference that scientists--not 
policy-makers--determine which research is too dangerous.

We were surprised, however, that respondents felt most affected by what we 
characterize as "informal constraints." Researchers sometimes only know that 
they have encountered forbidden knowledge when their research breaches an 
unspoken rule and is identified as problematic by legislators, news agencies, 
activists, editors, or peers. Studies by Kinsey et al. (16, 17), Milgram (18), 
Humphreys (19), Herrnstein and Murray (20), and Rind et al. (21) were attacked 
only after publication. Many researchers (42%) described how their own work had 
been targeted for censure. One researcher was accused by activists of 
"murderous behavior" because he was incapable of reporting HIV+ subjects who 
admitted to unsafe sex practices in an anonymous survey. A sociologist 
published an article that undermined the central claim of a particular group, 
who allegedly then accused him of funding improprieties.

In other cases, the mere threat of social sanction deterred particular types of 
inquiry. Several researchers said that their choices to study yeast or mice 
instead of dogs were guided by fears of retribution from animal rights groups. 
As one respondent commented, "I would like to lunatic-proof my life as much as 
possible." Drug and alcohol researchers reported similar fears, stating that 
they had not pursued studies that might provoke moral outrage.

Finally, there may be unspoken rules shared by the community. As one respondent 
stated, "every microbiologist knows not to make a more virulent pathogen."

We failed to detect a coherent ethos regarding production of
forbidden knowledge. Respondents at once decried external regulation and
recognized the right of society to place limits on what and how science is
done. They stated that scientists are "moral" and "responsible," but
acknowledged cases in which scientists were sanctioned for acting outside
the mainstream of their disciplines. They also said that, although
information and "truth" had inherent utility, full and open publication
was not always possible. Whereas most respondents worked hard to avoid
controversy, others relished it.

In summary, formal and informal constraints have a palpable effect on what 
science is studied, how studies are performed, how data are interpreted, and 
how results are disseminated. Our results suggest that informal limitations are 
more prevalent and pervasive than formal constraints. Although formal 
constraints will bias science--by affecting what is studied and how it is 
studied--these biases are relatively transparent and amenable to political 
change. Informal constraints, in contrast, may be culturally ingrained and 
resistant to change, leaving few markers by which to assess their effects. We 
believe it is important to observe these constraints, assess their effects, and 
openly debate their desirability for science and society.

References and Notes

1. R. A. Charo, J. Law Med. Ethics 32, 307 (2004).
2. G. Q. Daley, New Engl. J. Med. 349, 211 (2003).
3. J. Kaiser, Science 300, 403 (2003).
4. J. Kaiser, Science 302, 758 (2003).
5. J. Kaiser, Science 302, 966 (2003).
6. J. Couzin, Science 297, 749 (2002).
7. Journal Editors and Authors Group, Science 299, 1149 (2003).
8. C. Cohen, New Engl. J. Med. 296, 1203 (1977).
9. D. Smith, Hastings Center Rep. 8 (6), 30 (1978).
10. G. Holton, R. S. Morison, Eds., Limits of Scientific Inquiry (Norton, New York, 1979).
11. D. Nelkin, in Ethical Issues in Social Science Research, T. L. Beauchamp, R. R. Faden, R. J. Wallace, L. Walters, Eds. (Johns Hopkins Univ. Press, Baltimore, MD, 1982), pp. 163-174.
12. R. Shattuck, Forbidden Knowledge: From Prometheus to Pornography (Harcourt Brace, New York, 1996).
13. D. B. Johnson, Monist 79, 197 (1996).
14. B. Allen, Monist 79, 294 (1996).
15. D. B. Johnson, Sci. Eng. Ethics 5, 445 (1999).
16. A. C. Kinsey et al., Sexual Behavior in the Human Male (Saunders, Philadelphia, 1948).
17. A. C. Kinsey et al., Sexual Behavior in the Human Female (Saunders, Philadelphia, 1953).
18. S. Milgram, Obedience to Authority: An Experimental View (Harper & Row, New York, 1974).
19. L. Humphreys, Tearoom Trade: Impersonal Sex in Public Places (Aldine, Chicago, 1970).
20. R. Herrnstein, C. Murray, The Bell Curve: Intelligence and Class Structure in American Life (Simon & Schuster, New York, 1996).
21. B. Rind et al., Psychol. Bull. 124, 22 (1998).
22. This study was approved by the University of Pennsylvania
Institutional Review Board. We thank all respondents for their
participation; B. Sitko for assistance; and C. Bosk, A. Caplan, J.
Drury, C. Lee, and B. Sampat for comments. Supported by the Greenwall
Foundation (J.K., C.S.P., J.F.M.) and the Robert Wood Johnson Foundation.
1 School of Public Health, The University of Michigan, Ann Arbor, MI 48109-2029, USA.
2 Department of Dermatology, Brown University Medical School, Providence, RI 02903, USA.
3 Department of Medical Ethics, University of Pennsylvania School of Medicine, Philadelphia, PA 19104-3308, USA.
*Author for correspondence. E-mail: merz at mail.med.upenn.edu


Science Supporting Online Material
Forbidden Knowledge
Joanna Kempner, Clifford S. Perlis, Jon F. Merz

Materials and Methods

There are no empirical data on forbidden knowledge in science. To begin
to fill this gap, we performed this interview study to examine why and
in what ways scientists constrain and censor their work. This supplement
describes our methods and sample.

Pilot Study

We began the study by generating an interview guide and holding 10 pilot
interviews with researchers from a diverse range of disciplines,
including psychiatry, psychology, epidemiology, genetics, economics,
criminology, and physics. Our intent was to develop a survey instrument
to systematically examine the frequency with which scientists reported
problems of forbidden knowledge. Our pilot interviews, however, showed
us that survey methods would be inadequate. Few researchers could easily
recall instances in which they decided not to proceed with a particular
line of research. Simply put, researchers generally had little insight into
what they do not do, and why.

We thus redesigned the study to be a qualitative interview study using
semistructured interviews. This approach allowed us to explain questions
and probe respondents for more information. This style of interviewing
is particularly useful when the respondent has not previously elaborated
their perspective on a particular topic (S1). In addition, the interview
format allowed the interviewer to develop rapport with and trust of each
respondent, which was helpful in eliciting potentially sensitive
information.

Interview Guide

Drawing on the results of the pilot study, we further refined the
interview guide. The final guide consists of four sections. To put the
respondents at ease and to acquaint them with the concept of forbidden
knowledge, the interview began by asking researchers to identify a
relevant, well-publicized controversy in their field. Respondents were
then asked to comment on their own experiences as well as the
experiences of their colleagues. We asked, for example, whether their
work had ever been the target of controversy, and whether they or one of
their colleagues had ever shied away from a topic in order to avoid
controversy. In the third section, we asked a series of closed-ended,
specific questions about practices and experiences. Finally, each
interview ended with 4 attitudinal questions about scientific freedom
and social and professional constraints (S2). The interview guide is
available from the authors.

The revised guide was tested with 2 local researchers, and modified
slightly thereafter. These 2 interviews are included in our study sample.

Interview Sample

We developed a multistage cluster sample of academic researchers drawn
from six subject areas (microbiology, neuroscience, sociology, computer
science, industrial/organizational psychology, and researchers from
various disciplines who conduct drug and alcohol studies). We chose
these research areas because they often address controversial topics. We
identified the 10 top-ranked universities in each discipline using 2002 U.S. News & World Report
rankings. Drug and alcohol researchers were identified from key word
searches of the NIH CRISP database (http://crisp.cit.nih.gov/) for
investigators funded in 2001-02 for research on addiction and related
issues. From lists of faculty, we randomly chose names to solicit for
participation, with replacement of those who did not respond or refused.

Including the two test subjects, we solicited a total of 95 individuals
and successfully contacted 76. We completed 41 interviews (43% of the
total sample). In total, we interviewed 10 sociologists, 9
microbiologists, 9 drug and alcohol researchers, 6
industrial/organizational psychologists, 6 neuroscientists, and 1
computer scientist. There was no difference in response rates across 
disciplines (χ² = 4.4 with 5 df, P = 0.49). Our total
sample included 10 women and 31 men, ranging in age from 28 to 76, with
a median age of 46. Respondents ranged in academic rank, with 21 full
professors, 6 associate professors, 12 assistant professors, and 2
adjunct lecturers. We did not have the data to stratify by or to examine
response rates by gender, age, and rank.
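The figures above can be checked directly: 41 completed interviews out of 95 solicited gives the reported 43% response rate, and the tail probability of a chi-square statistic of 4.4 on 5 degrees of freedom gives the reported P = 0.49. The following is a minimal sketch using only the Python standard library; it assumes an ordinary Pearson chi-square test (the supplement does not say which test was used), and the closed-form survival function below applies only to odd degrees of freedom.

```python
import math

def chi2_sf(x, df):
    """Survival function P(X > x) of a chi-square distribution with odd df,
    computed from the closed-form series (erfc plus half-integer terms)."""
    assert df % 2 == 1 and df > 0, "this closed form needs odd df"
    k = (df - 1) // 2
    total = math.erfc(math.sqrt(x / 2))           # the df = 1 contribution
    term = math.sqrt(2 * x / math.pi) * math.exp(-x / 2)
    for j in range(1, k + 1):
        total += term                             # add the j-th series term
        term *= x / (2 * j + 1)                   # recurrence to the next term
    return total

# Figures reported in the supplement: 41 completed interviews of 95 solicited,
# and chi-square = 4.4 with 5 df for response rates across disciplines.
print(round(41 / 95 * 100))       # response rate, percent -> 43
print(round(chi2_sf(4.4, 5), 2))  # tail probability -> 0.49
```

Both values match the supplement, so the reported response rate and P-value are internally consistent.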

Interview Method

All interviews were performed by one of us (J.K.) and audiotaped.
Thirty-eight interviews were conducted by telephone and three were
conducted in person. Each interview lasted between 30 and 45 minutes.
All respondents completed the interview.

Coding and Analysis

Each interview was transcribed and analyzed using QSR's NVivo2, a
qualitative data analysis program (S3). We developed our coding
categories using a grounded theory approach (S4). These categories coded
respondents' stories about forbidden knowledge by person (who is the
subject of the story), topic (the subject matter of the research project
in question), research process (at what stage in the research did the
event occur), and nature of constraints (what were the sources of
concern or limits reported by respondents). Each of the three authors
coded a third of the transcripts. Each of the transcript codes was then
checked by another coder. Disagreements were settled by consensus. This
coding system allowed us to uncover emerging themes, relations, and
perspectives of the respondents.


This study focused on a relatively small sample of academic researchers
chosen from topic areas suspected of raising controversial political and
moral issues. The types of forbidden knowledge and frequencies observed
in our sample are not generalizable to the scientific community at large.


S1. J. A. Holstein, J. F. Gubrium, The Active Interview (Sage Publications, Thousand Oaks, CA, 1995).
S2. J. Kempner, C. S. Perlis, J. F. Merz, unpublished observations.
S3. NVivo2 Qualitative Data Analysis Program Version 2.0.161 (QSR International Pty Ltd., Melbourne, Australia, 1998-2002).
S4. A. Strauss, J. Corbin, Basics of Qualitative Research: Grounded Theory Procedures and Techniques (Sage Publications, Thousand Oaks, CA, 1990).
