ESSAYS ON SCIENCE AND SOCIETY

The Nature of Evidence

Science 07 Jul 2000:
Vol. 289, Issue 5476, pp. 61
DOI: 10.1126/science.289.5476.61

Boyce Rensberger began his career in science journalism at the Detroit Free Press, followed by the New York Times and the Washington Post. In 1998, he became director of the Knight Science Journalism Fellowships program at the Massachusetts Institute of Technology. He has written four popular science books, most recently, Life Itself: Exploring the Realm of the Living Cell.


“You just want to sell newspapers,” a scientist hissed at me at a meeting not long ago. “That's your bottom line.” The event was one of many efforts around the United States to bridge the gap that supposedly exists between scientists and science journalists.

“Well, yes,” I replied. We do like to sell newspapers or attract viewers, just as much as the average scientist likes to have a big turnout for his talk at the annual meeting. But that is not the main motivation for me and my science-writing colleagues as we sift among the many scientific developments of the moment and single out a select few for our scarce column inches or minutes of airtime.

We write about science because we love science and want to communicate our fascination with the natural world. And we write about science and technology because we believe that the more people know and understand, the better informed public opinion will be. Of course, we must also cover the harms or the risks that some technologies pose. We do this not because we question the overall value of science or technology but because the watchdog role is an integral part of journalism.

There are those in science who believe that journalists have become careless and irresponsible, that we devote our words and pictures to half-baked research, even antiscientific claims. Critics point to the popularity of parapsychology, UFOs, and other forms of pseudoscience and insist that if interest in them rises, it does so at the expense of interest in real science. Some see the rise of public opposition to genetic engineering—prominently covered in the news media—as another worrisome symptom. Often mixed into this criticism is the allegation that Americans are woefully uneducated about science, especially compared with people in other countries.

My experience as a science journalist has led me to some rather different views. I find that the situation is more hopeful, and that the weakness in the public's understanding of science lies in an area not often addressed in interactions between scientists and journalists—the nature of evidence.

But first, take the claim that Americans are more ignorant of science than are people in other countries. According to a National Science Foundation (NSF) report,* American adults understand basic scientific facts at least as well as those in most other developed countries. A set of nine science-based questions was asked of adults in 11 European countries, Canada, Japan, and the United States. Denmark scored at the top, followed a point or two behind by the Netherlands, the United States, and, ever so slightly lower, Great Britain.

The same NSF study found that 70% of American adults say they are “interested” in science but that only 48% consider themselves “informed” about scientific matters. When I speak to editors, encouraging them to improve their science coverage, I use these figures to suggest that the public is not being well served. People say they are interested in science but realize they don't know much. Therefore, I surmise, they want to know more.

When NSF surveyors asked specific questions—such as what the term “molecule” means, or whether light or sound travels faster—they found a composite score of only 57% correct answers. The score was even worse when people were asked about the nature of scientific inquiry—for example, to define an experiment or a hypothesis. Only 27% of the sample gave passable answers. In sum, Americans are overwhelmingly interested in science but don't understand it and know even less about how it is done.

In these data, I believe, lies the reason for the popularity of pseudoscience. Without a grasp of scientific ways of thinking, the average person cannot tell the difference between science based on real data and something that resembles science—at least in their eyes—but is based on uncontrolled experiments, anecdotal evidence, and passionate assertions. They like it all.

The claim, for example, that brains can transmit information telepathically strikes them as no less believable than the claim that whole stars can collapse into infinitesimal points. Many among the public have not yet learned that what makes science special is that evidence has to meet certain standards.

My own encounters with believers in pseudoscience—based on anecdotal evidence, to be sure—are consistent with the view that many adults are fascinated by claims that the world is filled with wonders and that some of them remain inexplicable. No problem there. But instead of dismissing such people as hopelessly beyond the pale, both scientists and journalists need to find ways of teaching them how to think more rigorously.

First, I suggest, journalists need to learn more about scientific methods and thinking. Most full-time science writers are already up on this, but nonspecialist journalists seldom are, and it is increasingly common that they cover stories with science content. It is important that we educate these nonspecialist journalists.

Second, when scientists talk to journalists, they ought to move beyond the highlights of their findings and wade into the methods, taking the initiative to ensure that the reporter understands why the results may be believed. Journalists, then, must make it a point to explain in their stories, somehow, that the new finding is founded on a plausible base of evidence.

*Science and Engineering Indicators 1998 (National Science Foundation, Washington, DC, 1998).
