EDITORIAL

Two Cultures

Science  21 Feb 2003:
Vol. 299, Issue 5610, pp. 1148
DOI: 10.1126/science.299.5610.1148

C. P. Snow, in a series of splendid novels, explored the cultural gulf between the sciences and the humanities. The “Two Cultures” problem of which Snow wrote resurfaces from time to time, as in the Science Wars debate about whether scientific truth is culturally “situated.” That cultural divide has largely disappeared, but it has been replaced by another “Two Cultures” problem that is even more troublesome. This new problem is the separation between the cultures of science and security.

The two are being put in much closer contact as the new war against terror proceeds. The community concerned with security (let's call it S1 for convenience) has been given a plethora of new things to worry about, and many of these have given it new reasons for closer interactions with the scientific community (hereafter, S2). It is apparent that certain kinds of scientific information—particularly in microbiology or biomedicine—might, if published, help terrorists or unfriendly states design bioweapons of mass destruction or ways of delivering them. S1 people would worry were such information published. Most S2 people would agree that it shouldn't be, but agreement is harder to find on how frequently this kind of information is generated in papers submitted for publication.

Under the sponsorship of the National Academies and the Center for Strategic and International Studies (CSIS), the two groups recently spent a day exploring this ground to see how much of it was common. A real effort was made to create dialogue, and it usually worked. But beneath the surface one could sense some tension: scientists perhaps questioning how much the security people knew about the science, and some security people wondering how much the scientists really understood about the danger and about the cleverness of those who are now being called “bad guys.”

This same kind of tension surfaces in another sector, through the well-publicized current troubles at the national weapons laboratories (Los Alamos and Lawrence Livermore) that are managed for the government by the University of California. Many security-minded critics have attributed administrative failures and security lapses at the laboratories to an academic mindset that overemphasizes scientific welfare and underemphasizes security. Critics on the other side are quick to blame episodes like the Wen Ho Lee fiasco on the overzealousness of the security community.

But nothing is really new under the Sun, and the same conflict emerged in a very problematic way in the early 1980s, when regulations designed to prevent the transfer of weapons specifications were suddenly applied to basic research. That problem almost created a showdown between the two communities; after a debate involving the universities and the Department of Defense, it was eventually resolved with the intercession of a report from the Academies. That report, named for the committee's chair, Dale Corson, found it particularly difficult to deal with “dual-use technologies”: those with highly beneficial civilian uses, yet with some potential for military application.

In a way, that's exactly where we are now, as we contemplate the tasks of authors and editors with respect to publication. The papers that might present some risk of instructing a terrorist or rogue-state plotter are likely also to contain information of value—either to those developing counterterror strategies or, more generally, to those protecting the public health. In that sense, the technologies are dual-use, like those that engaged the Corson Committee. Thus, the issue of publication has to be approached with some delicacy, in a way that respects the prospective value of the information as well as its potential risks.

The statement that appears on the opposite page is the outcome of a daylong meeting after the workshop sponsored by the Academies and CSIS. The participants had a promising background from which to work: The workshop had made good progress in creating understanding, and the worst caricatures (“naïve geeks” versus “ignorant spooks”) never surfaced. The editors, authors, and others reached a general sense (although “consensus” might put it too strongly) that there was indeed scientific information that should not be published; yet they reinforced the view that benefits and costs need to be considered together in reaching such decisions. The statement is perhaps unremarkable in that it poses no radical policy departure. But it makes good sense, and that's a place to start in getting the two cultures together.