Can E.T. Phone Home? The Brave New World of University Surveillance
Once confined largely to the sciences, institutional review boards have begun to broaden their purview. The consequences for other disciplines could be serious.
By Cary Nelson
"Hey, Frank, you forgot to put the gonad shield on this guy." Frank was a medical technician at the National Institutes of Health (NIH) in Bethesda, Maryland. The time was winter 1964. The "guy" who'd just had an X ray without proper shielding was me. I was a first-year college student employed in a dual role that winter, as a normal control in a cystic fibrosis study and as a laboratory assistant. I was also a student at the most progressive institution of higher education in the country, Antioch College, in Yellow Springs, Ohio, which had a work-study requirement. Antioch typically placed about two dozen students a year in NIH jobs.
My lab job was fairly straightforward. It began by my processing tissue samples from children who had died of cystic fibrosis. But these were the days when neither family nor employee sensitivities were elaborately protected. I would receive a beaker with, say, a little human heart inside it. The label was never "Subject C-33" but rather "Tim" or "Sally." Tim or Sally was typically a child I had known on the ward the week before. I pulverized the samples and put them through a centrifuge. No better preparation could be imagined for a lifetime as a literary critic.
But the NIH as a whole taught me many lessons about institutions and about academic researchers. It was a thirteen-story research hospital, full not only of patients but also of M.D.'s, Ph.D.'s, and federal bureaucrats. It was an airless world unto itself and in some fundamental ways a madhouse. One of the Ph.D.'s in my lab sometimes began his day by killing experimental mice so they could be analyzed. Dissatisfied with mass execution, he acquired a little guillotine that enabled him to behead his research subjects one by one. It was his collaborators, alas, who had charge of me.
The main consent form that I signed to participate in the experimental program was my weekly salary check, $35, minus taxes, plus room and board. In other words, I was an employee and employees were expected to do what they were told. Officially, I was being paid only for the lab job, but the main reason I was brought there was to participate in the cystic fibrosis study.
One of my regular duties was to have an intestinal biopsy. And it was at one of those biopsies that Frank, as I am calling him, forgot the lead apron. The biopsy was performed by inserting a long metal tube down my throat, through my stomach, and into my intestine. The X ray machine was kept on to guide the tube along its way. When the sample site was reached, a wire inside the tube was pushed forward. This opened a little metal claw at the end. A yank on the wire then made the claw tear off a piece of intestine, which was then pulled up all the way through my gut while I retched on the table. Still retching, I was wheeled back to my room.
Things became still more interesting when the researchers learned from my medical history form that I had no sense of smell; so far as I knew, I had been born without one. There was no question of a cure, but if they could find out why I had no sense of smell, they might have another publication. This angle of research was not part of their cystic fibrosis study, just an unexpected target of opportunity. They asked me if I would agree to a brain X ray preceded by removal of a small quantity of spinal fluid and injection of an equivalent quantity of air. Because of the very small chance of introducing impurities with the air, and the possibilities of resultant paralysis, they asked me to sign a form absolving both them and the federal government of any liability. I refused. Then the pressure began.
It was not until two years later, in 1966, that the NIH—under orders from the U.S. Surgeon General—introduced a committee system to evaluate the ethics of human subject research. In 1964, the hospital had a preliminary system to review general plans for research with normal controls, but no full-scale ethical oversight. Suffice it to say that it would have been nice to have had more formal regulations in place in 1964.
But that would have to wait for later. Despite the indictment of Nazi physicians at the Nuremberg Tribunals, Americans showed little concern about medical experimentation in their own country; indeed, Nuremberg may have helped assure Americans that such outrages occur only in exceptional elsewheres. Government regulation in the United States is almost always scandal driven, so it took incidents like the 1972 revelation of the forty-year-long Tuskegee experiments—in which researchers withheld treatment from a group of African American men infected with syphilis—to generate congressional pressure to turn the surgeon general's policy into a formal regulation.
Even then, the standards applied only to federally funded projects. Not until 1981 did a national commission recommend that the review policy be extended to all human-participant research in the United States. In 1991, some seventeen federal agencies adopted a uniform policy, widely known as the "Common Rule," which established the principles that research institutions receiving federal funds have to follow. The key element of the policy is the institutional review board (IRB), which has authority to approve, require modification of, or disapprove research subject to the Common Rule. As of 2000, roughly 4,000 IRBs were operating in the United States, primarily at universities, hospitals, and private research facilities.
But, after all, in 1964, I was an Antioch student. The college would protect me. It was while I indulged in such reassuring reflections that the responsible Antioch work-study supervisor, who held faculty status, called me to urge my cooperation with the study. It was clear he had consulted with the head of the department. Antioch needed these jobs; if I wanted a positive evaluation, I had better sit still for the syringe. It would have been nice to have had an IRB at Antioch, but they did not exist. I had many other NIH adventures, but this particular story came to an end because the NIH realized at last that I was under eighteen. My mother would have to cosign the waiver. She was a registered nurse and told them to go to hell.
Among the things I learned at the NIH was the attitude that biomedical scientists sometimes harbor toward their research subjects. Research is a heady mix of intellectual curiosity, self-interested careerism, and an ideology constructed of high-minded ideals: the pursuit of truth, the advancement of knowledge, the good of the many, notions that are not simply catch phrases but entire transcendentalizing discourses. Their usefulness in self-deception and rationalization is both notorious and easily forgotten. A system of independent research review and curtailment is clearly essential when real harm is a possibility.
I do not consider personal experience a decisive or necessary category for intellectual or cultural understanding, but my experience did make a difference. And thus I will say that it is not easy to forget being the object of a certain kind of investigative gaze, to look into the eyes of a researcher and realize you are expendable. Yet IRBs, which operate without their own system of checks and balances, often without secure mechanisms of appeal, are equally subject to individual and group self-deception, even more so now that campus IRBs are moving to review social science and humanities research more widely than ever before. Their staffs and members can also be corrupted by the ideals of justice and advocacy that energize them.
The Common Rule describes research as "a systematic investigation, including research development, testing, and evaluation, designed to develop or contribute to generalizable knowledge," a definition that embraces much systematic social science research while leaving most humanities projects outside its orbit. Increasingly, however, the mission to protect people who are the objects of study has led some IRBs to begin reviewing even oral histories focused on a single person. Studies involving little real risk to their subjects are designed to be exempted from full review, but the definitions of risk can be interpreted differently by different boards.
Among the texts that have had a strong influence on campus IRBs is the 1979 Belmont Report issued by the National Commission for the Protection of Human Subjects in Biomedical and Behavioral Research. It established three principles for ethical research: respect for persons, beneficence, and justice. The first of these is most consistently cited by IRB staffs and committees in characterizing their mission. As the 2003 IRB chair on my own campus put it, "The Belmont principles transcend academic disciplines."
Of course, "respect for persons" can hardly entail respect for every human action, but IRBs are ill equipped to negotiate the difference. Instead, they often give unquestioned allegiance to a concept that might be given more nuanced application to, say, Ku Klux Klan or Nazi Party members, who might merit humanity qualified with disapproval and who might on occasion appropriately be challenged aggressively in an interview. A historian might well wish to investigate the self-understanding of a Ku Klux Klan member and might choose to present a neutral account of the organization, but academic freedom means that the decision to do so needs to be the historian's, not that of an IRB. One consequence of an unreflective commitment to "respect for persons" is that IRBs have great difficulty accepting research destined to be critical of its "human subjects" and to cause them pain, even though interviewers may treat them with cordiality during the research phase.
At the "Human Subjects Policy Conference," held in April 2003 at the University of Illinois, the director of the Division of Behavioral and Cognitive Sciences at the National Science Foundation, David Rubin, recognized that disciplines are different and thus suggested that "we need flexible solutions that impose minimal regulatory burden," but he also stated that "the protection of human research participants should be uniform." A historian conducting an oral history interview might well be willing to grant the interviewee final power over the disposition of the interview after it is concluded—whether it is to be preserved, whether it can be quoted. A journalist extracting an admission of guilt from an interviewee is hardly likely to make the same offer. Such disclosures can certainly cause pain and may do harm to individuals, yet both the common good and the historical record may call for a degree of ruthlessness.
When "respect for persons" is inflected with a heightened sensitivity to the risks inherent in biomedical research, the concept may be adopted with particular fervor. Then an IRB can effectively become a virtual police force, enforcing across campus a philosophy of liberal humanism and its "respect for persons." As IRBs review more and more sorts of research—contributing to what C. K. Gunsalus, special counsel at the University of Illinois, aptly described last November in the Chronicle of Higher Education as "mission creep"—physical risk is conceptually leveraged to restrain a much wider range of scholarly inquiry. In some cases, one encounters a kangaroo court ironically enforcing "respect for persons."
Yet IRBs typically find it impossible to apply their standards to all disciplines. The most consistent exception, as the argument above would suggest, is for journalists, who frequently write exposés of scoundrels. IRB members often solemnly announce that essays written by journalists for newspapers do not count as research, even though the same essay written for a scholarly journal does. The "research" project requires IRB scrutiny, while the newspaper article does not. Asking IRB members to adhere to fundamental ethical values and then apply them inconsistently and differentially does real harm to those who fulfill this service, and it is a source of the inner corruption I mentioned above.
In truth, IRBs exempt newspaper journalism in part because they do not dare take on the press, not only because of caution about the First Amendment but also because of justified fear of the press's power. IRBs thus excuse university journalists from evaluation of harm done to human subjects for political, not principled, reasons. The strain has been apparent in my interviews of IRB staff and members around the country, some of whom aggressively recite the catechism—"if it's published in a scholarly venue, it's research; if it's published in a newspaper, it's not"—while others blurt out that "journalism is unethical," and yet others express anger that they are barred from reviewing projects designed for newspapers.
What's more, still less Platonic forces operate on IRBs. University lawyers will be warning them soon that social science research protocols could be the subject of objections raised by interplanetary travelers reporting violations of intergalactic rules and regulations. Think about it. Michael Rennie's spaceship sets down atop the Assembly Hall, and he warns us of worldwide consequences if we don't obtain consent forms from every undergraduate who fills out a survey. And there just might be the odd IRB head who thinks the potential loss of his or her job through failure to protect the university from legal action or government sanction would be equally consequential.
My science fiction hyperbole is meant to be heuristic. One of the duties of university legal counsel is to generate accounts of hypothetical risk and advise how to avoid it. Yet we have little basis on which to judge the probability of lawsuits based on social science or humanities interviews and surveys. So there is a tendency to manage all research proposals on the basis of a worst-case scenario, even when actual risk may not be much greater than the likelihood of a real-world version of events in The Day the Earth Stood Still.
The admirable moral imperatives propelling an IRB forward become entangled with legal constructs that compromise fairness and sanity, not to say academic freedom. Moreover, the federal rules governing IRBs suggest, at least implicitly, that knowledge of disciplinary practices be part of the context of all decisions, requiring that boards must be composed of members with sufficiently "varying backgrounds to promote complete and adequate review of research activities commonly conducted by the institution." Yet the growing literature on campus IRBs shows again and again that boards assembled to supervise biomedical research often haven't a clue about the culture of history or anthropology or literature departments. In 2002, the University of Illinois IRB included not one humanities faculty member; a year later, an anthropologist was added.
I do not mean to say that there are no ethical or legal risks in research. The financial and institutional implications of biomedical research can be considerable; indeed, I believe they should be. In the relatively uncharted legal flows of social science research, matters are less certain. Yet IRB surveillance wields several double-edged swords. It serves to protect the rights of individual research subjects and to safeguard the institution against suits and regulatory reprisals. It also entails institutional responsibility and liability. Once an IRB has reviewed a project in detail and approved it, it makes itself a party to any further action. A signed consent form is at once a source of legal protection and a potential proof of responsibility. IRB surveillance thus simultaneously mitigates and enhances legal risk. That suggests the necessity for further surveillance, further intervention, further caution.
At some point, we enter the airless world of the NIH in 1964. The single most strict rule of the hospital back then was that no one open a window. The penalty for opening a window was immediate dismissal. Many of the patients had rare and untreatable illnesses. Some were in long-term pain. Others had their social lives curtailed for years by their conditions. Thus it was feared that as soon as the word was out that a window was open, patient after patient would rush to the room to leap to his or her death. The academic equivalent is less dramatic: you give up doing what you are doing because IRB oversight has made it too burdensome.
Here at Illinois, we have sometimes lacked sufficient oxygen in recent months. In spring 2002, David Wright, at the time an assistant professor of English and African American studies, was subjected to a largely insane review of an essay accepted for publication in the distinguished literary journal The Kenyon Review. Although the essay itself was an exercise in creative nonfiction—concerning a class in creative nonfiction taught at another school, with both the school and the students fictionalized—the Illinois IRB took everything literally and demanded permissions and reports of events described in the piece. The IRB threatened to block publication, until the local and the national AAUP intervened, together with half a dozen senior campus administrators.
My own belief is that the board's first response should have been that the whole matter was none of its business. IRB administrators at other campuses whom I interviewed suggested they would have done exactly that. In point of fact, the board took jurisdiction only because its members happened to find out about Wright's essay. It does not represent a category of work that the IRB routinely or comprehensively supervises. Most IRB-vetted research now comes to the board's attention when faculty members apply for internal or external support. But Wright was merely doing what thousands of faculty members across the country regularly do—using anecdotal classroom evidence to ground a professional narrative. It's the sort of occasional pedagogical essay that wouldn't warrant a research grant.
Some pedagogical practices, however, now regularly draw IRB attention at Illinois and elsewhere. Anthropologists doing research in foreign countries and teaching fieldwork courses in the United States are increasingly coming under surveillance. Given the rather narrow horizons of many IRB staffs, it has taken no small effort to get them to understand, as anthropologists have widely reported, that a preliterate indigenous population halfway across the world is ill prepared to read and sign a consent form. I regularly interview nervous former Communist Party members now in their seventies and eighties. If I or my students showed them a consent form, they'd show us the door. As Joan Sieber and her colleagues pointed out in an essay published in spring 2000 in Professional Ethics Report, consent forms can create the anxiety they are designed to ameliorate.
Meanwhile, our IRB at Illinois now expects undergraduates assigned to do practice interviews with their families to get signed consent forms in advance. In the now-familiar pattern of institutionalizing every imagined anxiety, covering every imagined risk, the IRB wants to be sure a student's relatives do not feel coerced into doing an interview. Rather, they must feel free to refuse without fearing they are jeopardizing the student's grade. And a record of their signed consent must be on file.
As recently as 2003, our IRB insisted that a student needed approval before interviewing his or her mother. But not to worry. Provided the course instructor has filed the twelve-page IRB form, and as long as the mother is mentally competent to make her decision (there's a place to confirm that), university approval for the family conversation should be forthcoming in a matter of weeks. Better safe than sorry. Back in 1964, I was daring enough to call home without a bureaucrat's okay.
In 2002, it was worse still for University of Illinois anthropologists. Undergraduate students assigned to write papers about body language at the university gym were asked to get consent forms from everyone they watched. Although all the students were members of the facility, the IRB contended that the Intramural Physical Education Building was a private club and could not be treated as a public space. Students interviewing their friends and roommates about their reactions to magazine ads were also required to get signed forms. At first, the IRB also demanded the full twelve-page application from each undergrad doing a unique project. Now it seems willing to take class-wide proposals as long as they are supplemented by individual consent forms.
The fact that anthropologists have been teaching fieldwork courses for decades without difficulty does not matter. They are accustomed to supervising student projects. No matter. Big brother knows best. The extra time and effort involved in getting IRB approval for family conversations will most likely have a decisively chilling effect only in large courses. So we hope. But the controlling principle, unannounced, unexamined, ill considered (and, by the way, insane) is this: every single research interview conducted by a faculty member or student should be vetted by a bureaucrat.
I have myself—in the course of doing twenty-three books and over a hundred articles over thirty years—conducted more than 3,000 one-on-one interviews. Of course, there are difficult moral issues involved. I struggle with them all the time. A useful essay by J. Michael Oakes, published in October 2002 in Evaluation Review, charts damage done to research subjects on a spectrum running from "annoyance" to "death." Journalism is lodged at one end of this spectrum, medical research at the other.
My own interviews often produce a good deal more distress than annoyance, but they are not fatal. I may ask difficult questions. I am seeking individual historical truths. Often enough, I end up with information I do not publish until my interviewees have died. I have withheld publication in some cases to save people distress, and I have caused real distress by publishing in others. For better or worse, I make my own decisions after consulting friends and colleagues.
In the end, a journal or a publisher makes a final decision about publication, after employing its own procedures for peer review. I've been denounced but not sued. And I persevere. So it goes. I need advice all the time from people doing research in similar areas. I do not need bureaucrats or faculty members from distant fields telling me what to do, especially when they set themselves up as the ultimate arbiters of ethics and professional conduct. There are no such arbiters. There are only the ongoing struggles with complex competing responsibilities.
In the David Wright case, the IRB staff dismissed out of hand the suggestion that it could rely on The Kenyon Review and the academic peer-review system, which has worked for decades, to decide whether an essay merited publication. If that does not offer sufficient warning for the future, I would add this: in all my conversations with IRB staff and members across the country, I have never encountered anyone who believed a publisher's ethical standards equaled his or her own.
Like everyone else I interviewed—save Wright, to whom I talked for this essay—the IRB members all requested anonymity. They are afraid of the federal regulators, just as faculty members are afraid of their IRB. At every level of the system, the people in power declare, "We are your friends. We want to work with you." And in every instance, the person hearing the message wholly or partly discounts it. It does not matter whether this anxiety is well founded; it is a predictable product of the power differential inherent in the system.
What would it take, we might ask, to extend the David Wright and the anthropology cases to the whole campus? Wright's case was about prior restraint of publication, not about a research proposal. If the IRB chose to institutionalize prepublication reviews, it would need an immense staff and would produce a monstrous, intrusive surveillance culture that would substantively imperil academic freedom. Whether we face such a prospect remains to be seen.
The whole IRB enterprise may either grow or contract, but it is unlikely simply to continue on its present scale. Yet it is difficult to see how IRBs can guarantee that faculty research has done no harm to individuals without prior vetting of publications. Nor can they minimize the risk of suits without seeing what scholars are actually saying about people in their work. IRBs presently review research proposals, but most people have no idea what they will ultimately put into print until the research and writing are done.
The impulse to focus on research plans is another awkward transfer from the medical model, where it is easy to assume that the important human interventions will occur during the research. But with humanities and qualitative social science projects, the analysis and reporting of results are as likely as the research itself to produce significant interventions in people's lives.
Meanwhile, institutions are increasingly likely to commit themselves to campus-wide surveillance; it's the simplest way to assure federal regulatory agencies that their IRB is doing its job. It is easy to imagine how lawyerly anxiety and a rigid liberal humanism would combine to warrant curtailing faculty freedom of expression. The steadily increasing scope of the campus IRB mission is often justified by warning of the government's willingness to close down all funded research on a campus that endangers human subjects. But it is a long way from a university biomedical scientist's killing a research subject to a student's potentially annoying a parent with an interview request. As C. K. Gunsalus asked in her opening remarks at the April 2003 University of Illinois conference on human-subject policy, "Is an ounce of prevention really worth a pound of cure?" And what if it is pretty doubtful that we are getting that ounce? It is time instead for IRBs to just say no. As others have urged, I believe it is safer in the long run for IRBs to declare most pedagogy and most humanities and social science research none of their business. Should faculty and students meanwhile be educated much more thoroughly about their ethical responsibilities in doing research? Absolutely. Are there cases in which departments should supervise and alter teaching practices? Certainly. But E.T. should not need permission to call home.
Further Reading
American Association of University Professors, "Protecting Human Beings: Institutional Review Boards and Social Science Research," Academe (May-June 2001): 55-67.
Anderson, Paul V., "Ethics, Institutional Review Boards, and the Involvement of Human Participants in Composition Research," in Peter Mortensen and Gesa E. Kirsch, eds., Ethics and Representation in Qualitative Studies of Literacy (Urbana, Ill.: National Council of Teachers of English, 1996), 260-85.
Bloom, Lynn Z., "Living to Tell the Tale: The Complicated Ethics of Creative Nonfiction," College English 65 (January 2003): 276-89.
Church, Jonathan T., Linda Shopes, and Margaret A. Blanchard, "Should All Disciplines Be Subject to the Common Rule? Human Subjects of Social Science Research," Academe (May-June 2002): 62-69.
Morgan, Dan, "Ethical Issues Raised By Students' Personal Writing," College English 60 (March 1998): 318-25.
Wright, David, "Writing the Real World," Kenyon Review 24, nos. 3-4 (2002): 38-48.
Cary Nelson is Jubilee Professor of Liberal Arts and Sciences at the University of Illinois at Urbana-Champaign and second vice president of the AAUP.