September-October 2003

Balancing Security and Openness in Research and Education

The success of academic science today entails novel risks. Working together, academics, institutions, and the government can preserve scientific openness.


The scale and nature of the ongoing revolution in science and technology, and what it implies for the quality of human capital in the twenty-first century, pose critical national security challenges for the United States. Second only to a weapon of mass destruction detonating in an American city, we can think of nothing more dangerous than a failure to manage properly science, technology, and education for the common good over the next quarter-century.

—U.S. Commission on National Security in the Twenty-first Century
March 15, 2001

The ability of our nation to remain secure in the face of both traditional military threats and international terrorism while maintaining the excellence and pace of American science and technology requires a delicate balance. It depends first and foremost on effective dialogue and joint problem solving by those responsible for maintaining our security and those who lead our scientific, engineering, and higher education communities.

Our immediate impulse when threatened is to wall ourselves off and to regulate the release of information of potential use to our enemies. This is understandable, and frequently justified, but in today's complicated world, the security issues raised regarding research and education do not lend themselves to simple responses—especially when long-term consequences are considered. Why?

The future health and economic strength of America, and indeed the world, depend on the continued rapid advance of science and technology, and on the education of scientists and engineers at the most advanced levels. The rapid progress of science and technology, and the advanced education of scientists and engineers, in turn, depend critically on openness of process, openness of publication, and openness of participation within our institutions and across national boundaries.

Historically, our nation and world have faced many challenges to peace and security. Now we face a constant threat of determined terrorists. Their immediate objectives are to kill large numbers of people, or to cause terror, panic, or disruptions of our lives and economy.

As we respond to the reality of terrorism, we must not unintentionally disable the quality and rapid evolution of American science and technology, or of advanced education, by closing their various boundaries. The irony is that, over time, doing so would achieve in substantial measure the objectives of those who disdain our society and would do us harm by disrupting our economy and quality of life.

Americans are learning that striking a balance between protection of our lives and of our liberties is difficult but essential. I believe that it is equally imperative that we strike the right balance between security and the openness of our scientific research and education. But I conclude that we must rely heavily on maintaining that openness.

In the two years since the murderous attacks in New York, Washington, and Pennsylvania, the experience of the Massachusetts Institute of Technology and other leading research-intensive universities has been primarily one of calm and reasoned interaction and consultation with the federal government on such matters as the admission of international students and scholars, the openness of scientific research, and the control of dangerous chemical and biological agents.

However, the discussion of these issues and the establishment of a regulatory environment associated with homeland security are far from over. It therefore seems timely to address some of the fundamental issues and long-term consequences of our decisions.

Before doing so, let me make clear that MIT and our sister institutions take seriously our responsibility to serve our nation by applying our talents and capabilities to the protection of human life and infrastructure in our homeland and throughout the world. (See, for example, the Web site on MIT research and education on homeland and global security <http://web.mit.edu/homeland/index.html>.)

Materials and Information

Terrorism to date has been decidedly low-tech, although its worst instances have been sophisticated organizationally. Truck bombs, commandeering of commercial aircraft, and credit card fraud appear to have been the primary tools used by those who have done us great harm. The materials they used have been things such as fertilizer, diesel fuel, and off-the-shelf chemicals. They have not relied on scientific or technical information that is advanced or difficult to obtain. This is an important observation, although no guarantee of the future course of events. Indeed, the as-yet-undetermined origin of anthrax attacks in the United States gives rise to important concerns.

The nebulous, diffuse nature of terrorism makes a simple prescription for the responsibilities of academic institutions impossible. Nonetheless, let me suggest a basic framework for thinking about the issues by surveying the most commonly discussed mechanisms for terrorist attacks of a technological nature. This framework reflects the nature of the information and materials required.

Nuclear Weapons

The use of nuclear weapons and missiles is a singular matter. The information required to construct a nuclear weapon is acquired over many years. It is generally not the stuff of classroom learning; rather, it involves sophisticated know-how developed by experience, testing, and advanced computational simulation. Most nations can acquire the critical components and materials required for construction of a nuclear weapon only by illegal means.

Cyberterrorism

Cyberterrorism is the use of computer and communication technology to disrupt, corrupt, or disable our military or commercial information technology systems. Potentially, it could directly weaken our national security, or it could wreak havoc on our economy. The information required by a cyberterrorist can be presumed to be of varying degrees of sophistication, but is generally available. It is largely the stuff of hacking. The materials are computers and access to the Internet.

Cybersecurity is an urgent issue in all domains of industry, education, and government. It imposes additional administrative burdens and regulatory costs on all organizations, and it calls for more computer scientists and mathematicians who are U.S. citizens, trained to protect our information infrastructure.

Bioterrorism

Bioterrorism could involve the propagation of disease and the defeat or disruption of therapies to counter it. The information required is likely to be available in published literature. Some experientially gained know-how might be involved, but it could generally be obtained by experiences in laboratories, medical establishments, or pharmaceutical companies. Some specialized equipment or facilities might be required, but they would likely have widespread applicability to legitimate activities. This situation is distinctly unlike the case of nuclear weapons and poses some of the most vexing issues. The needed biological materials may or may not be readily available.

Chemicals

Chemical or explosive attacks are somewhat less commonly discussed, but are, in my view, among the things we should be most worried about. The information required for many forms of attack is readily available, even to the layperson. Some dangerous agents are difficult to obtain, but others can be purchased off the shelf. The terrible destruction of lives by an angry American at the Alfred P. Murrah Building in Oklahoma City and the use of sarin gas in Tokyo are prime examples.

Academic Access

Having reviewed these categories, I would reiterate that nuclear weaponry seems to be an almost singular case. Critical knowledge and know-how should be, and are, highly restricted by the normal security classification processes of the Department of Defense and the Department of Energy. These are not things that students should be required to access in the conduct of university research; they cannot be taught in a normal classroom. It is an area that, in my view, is appropriate for reasoned decision making by the Interagency Panel on Advanced Science and Security. (This panel is charged with evaluating and making recommendations regarding visa applications by certain students and visiting scholars who wish to take part in advanced science programs at U.S. universities. It considers such factors as nationality, previous education, area of scientific interest, and the nature of the programs at the university in question.) But we should depend primarily on our well-established classification and security mechanisms.

I do not believe that cyberterrorism, bioterrorism, or the use of chemical explosives pose threats that could in a meaningful way be countered or avoided by restrictions on what is taught in our university classrooms, or on the country of origin of our students. What is taught in classrooms is basic knowledge, and, as in most instances in life, basic knowledge can be used for good or ill. The knowledge of what makes a virus virulent is also the key to medical therapies and disease prevention. This may be an uncomfortable reality, but it is a reality.

The material (as distinct from the information) needed to cause terror by chemical or biological means is a different matter. It is a clear responsibility of universities not to be a source of such materials for use by those who would do harm. Access to pathogens and dangerous chemicals must be carefully restricted and monitored in the normal course of doing science. Inventories should be minimized. Location, quantities, and security should be maintained effectively and accurately. We are working hard to establish best practice in this regard at MIT.

It is the further responsibility of universities to educate all of their research and laboratory students about security issues regarding their materials and equipment. This should be integrated with education and training regarding the health, safety, and environmental responsibilities of laboratory practice. Things as basic as not working alone in chemical and biological laboratories must be reinforced.

Select Agents

The term "select agent" came into the scientific vernacular when, on June 12, 2002, President Bush signed into law the Public Health Security and Bioterrorism Preparedness and Response Act of 2002.

As a first step in this law, all researchers in the life sciences were required to report to their institution and to the government (the U.S. Department of Health and Human Services) by September 10, 2002, their inventory of forty "select agents" that might be used as bioweapons. Other provisions of the law include similar reporting requirements for potentially lethal agricultural materials and security measures for laboratories that keep such agents. In addition, only those researchers determined to have a legitimate need will be allowed access to these materials, which will not be available to students or scholars from countries that are considered to be sponsors of terrorism or to people with histories of mental illness or felony or drug convictions.

By and large, the academic community has treated this approach as reasonable and, of course, will comply with the law. But even this seemingly straightforward approach is not without a huge potential price to be paid in the advancement of science and, therefore, in our health and welfare. The MIT Ad Hoc Committee on Access to and Disclosure of Scientific Information voiced deep concerns about the path down which we may be starting, noting that the U.S. secretary of health and human services has the statutory power to expand the list of select agents. The ad hoc committee expressed the view that we could soon arrive at a level at which access to materials by our students, faculty, or staff would be based on, for example, their citizenship—something that would be incompatible with our principles of openness and would cause us to withdraw from the corresponding research topics on our campus.

Publication of Scientific Information

As we balance prudent measures to maintain our security with the openness that is so essential to America's basic principles, to the excellence of our universities, and to the conduct of science, the most difficult challenge is associated with publishing information in the life sciences. Why is this so complicated?

Science is a collective endeavor. Science increasingly is an international endeavor. The weight of these two statements is compounding at lightning speed as the complexity of science increases and because, like all of society, scientists are tied together through the Internet. Science progresses not just by singular discoveries, but also by the independent verification and interactive discussion of discoveries. Knowledge is honed through ongoing dialogue that takes unexpected twists and turns. It thrives in openness and suffers in isolation.

Thus, in fields such as microbiology, the very nature of science, when combined with the dual nature of information—that is, its use for good or for ill—presents a challenge in an environment filled with well-justified concern about terrorism.

I worry that the broad advance of biological science is open to compromise. Restrictions that have been or may be imposed by our government as it struggles to carry out its most fundamental mission of protecting its citizens are not the only issue. The politics of subjects such as in vitro fertilization and stem cell research have removed them from the sphere of federally funded university and government laboratory research, where the mission is to achieve basic scientific understanding. Former National Institutes of Health director Harold E. Varmus, among others, has raised deep concern about distortions in the conduct of these and certain other areas of basic biological research that, as a result of such federal policies, can go forward only in industrial labs, where, Varmus said in a New York Times piece in December 2001, "commercial realities must be considered along with scientific progress, where full disclosure is not the norm, and where oversight is limited."

Three Suggestions

The resolution of matters of open publication when our security is challenged is not easy. A panel of the National Academy of Sciences has been established to provide guidance on this matter. It is chaired by MIT professor Gerald R. Fink, former director of the Whitehead Institute for Biomedical Research. While looking forward to the panel's wisdom, let me offer three suggestions for the resolution of the issues of sensitive areas of study, select agents, and publication of scientific information.

First, consultation by the federal government with the academic and scientific communities is essential. This consultation must be continuous and directly effective at both the policy and the operational levels. As pointed out with great clarity by former U.S. deputy secretary of defense John J. Hamre in the summer 2002 edition of Issues in Science and Technology, all too often security professionals do not understand or trust scientists, and scientists may be quite unaware of some of the real risks associated with their work. This has been a major problem within the nuclear weapons arena since its beginning. It will be even more complex as we worry about basic research in universities in the diffuse, little-understood context of terrorist threats. But there is no viable alternative to substantive consultation and mutual effort.

Second, distinct boundaries must be drawn where it actually is possible and appropriate. It is the ambiguity and uncertainty of what is inappropriate to publish, or in the use by the government of ill-defined terms like "sensitive but unclassified," that creates danger for the scientific enterprise and invites bad decisions. Well before September 2001, difficult issues were arising regarding the application of export controls on the uses of computers and satellites for basic research, and even regulation of certain unclassified but export-controlled library documents. Productive collaborations with scientists in other countries and the work of noncitizen graduate students and scholars have been prohibited by increasingly broad interpretation of the International Traffic in Arms Regulations (ITAR).

Similar problems with export control arose in the 1980s. The problem was settled effectively when President Reagan issued National Security Decision Directive 189 (NSDD 189). Basically, NSDD 189 stated that scientific information is either classified or unclassified. It generally exempted fundamental research from security regulations. This distinct boundary was clear and effective for many years. Then, over time, its interpretation by the bureaucracy became increasingly broad and its effectiveness was diminished by application of other statutes—an opportunity afforded by the compromise insertion of one open-ended clause when it was drafted. NSDD 189 should be reaffirmed, and its spirit should be applied in other domains. The default in fuzzy areas should be to keep basic research open and unencumbered.

Third, we should not underestimate the power of voluntary agreements within the scientific community. The decisions about publication of detailed results faced by many scientists, especially biologists and biomedical researchers, simply do not lend themselves to decisions by security personnel. In the end, most decisions will be made by the scientists who perform the work being reported because, given the dynamic evolution of scientific knowledge, such decisions do not lend themselves to simple regulatory rules. We also must be keenly aware that regulations in the United States are limited in their effectiveness in an age when important frontier science is done in many nations around the world. (Indeed, the incident that first brought this issue to the public's attention occurred when an Australian group reportedly learned how to make a virus related to smallpox 100 percent virulent.) It may be that the most effective thing to do is to create a framework or forums from which scientists can gain guidance and advice from their peers as they wrestle with such daunting decisions.

Here, too, there is precedent of sorts. In the war years preceding the development of the atomic bomb, allied scientists stopped publishing research associated with uranium physics, although they continued to discuss the topic privately among themselves. And when recombinant DNA first became possible, leading scientists, led by biologist David Baltimore, established a moratorium on their work, pending open discussion among themselves and a range of laypeople, to establish standards. Work and open publication proceeded smoothly thereafter. Neither of these examples provides direct guidance for the less focused situation we face today, but the point is that the scientists themselves, in consultation with others as appropriate, found an effective path forward.

The debate about security and openness is not new. In 1958 mathematician Norbert Wiener was quoted in Brighter Than a Thousand Suns as opining, "To disseminate information about a weapon . . . is to make practically certain that it will be used." As if in rejoinder, physicist Edward Teller wrote in 1987 in Better a Shield Than a Sword: Perspectives on Defense and Technology that "secrecy is not compatible with science, but it is even less compatible with democratic procedure." These statements by two brilliant scientists with experience in defense work reflect the fact that virtually all science and engineering knowledge, or most other knowledge for that matter, can be used for good or ill.

This certainly does not mean that we can wash our hands of the responsibility to address hard questions about the safety and security of our fellow citizens. But in an age when the "weapon" may be a truckload of explosives, a computer virus, a commandeered aircraft, or finely milled bacterial spores, "dissemination of information" is a nebulous matter. And in an age when the rapid advance of science and technology is essential to sustaining our health, economy, and quality of life, Teller's observation is of crucial importance.

Traditional American values of openness in education and research must prevail. But that will be possible only if we in research universities contribute our talents to maintaining the security of our homeland, and if the federal government and academia maintain a respectful, substantive, and effective dialogue between those who do science and those who are charged with protecting the nation.

Charles Vest is president of the Massachusetts Institute of Technology. This article originally appeared as part of the MIT Report of the President for the Academic Year 2001-02. The original version, which is available at <http://web.mit.edu/president/communications/rpt01-02.html>, also includes a discussion of restrictions on international students.