News this Week

Science  18 Feb 2000:
Vol. 287, Issue 5456, pp. 55

    Loss of ASTRO-E X-ray Satellite Hurts Japan's Space Science …

    Dennis Normile*

    *With reporting by Govert Schilling in the Netherlands.

    TOKYO—Astronomers and astrophysicists are mourning the loss last week of a Japanese-American x-ray satellite that promised to give them their closest look yet at the region surrounding black holes and other objects that emit high-energy x-rays. A rocket failed to lift the $105 million ASTRO-E spacecraft into a sustainable orbit. The loss is a particularly hard blow to Japan's space science efforts, which rely heavily on a small number of carefully targeted satellites.

    “It leaves a big gap in our research program,” says Tsuneyoshi Kamae, an astrophysicist at the University of Tokyo who helped develop one of ASTRO-E's three major instruments. Scientists say it could take at least 3 or 4 years to build and fly a replacement mission, if a suitable rocket can be found.

    Launched on 10 February from the Kagoshima Space Center of Japan's Institute for Space and Astronautical Science (ISAS), ASTRO-E was to be the third major space telescope launched in a span of less than 7 months to study cosmic x-rays. It was meant to complement NASA's Chandra X-ray Observatory and the European XMM-Newton Observatory (see story below). Both those satellites, which are already in orbit, are focusing on low- and medium-energy x-rays; ASTRO-E would have concentrated on the high-energy end of the spectrum. “One corner of the triangle is now missing,” says x-ray astronomer Joachim Trümper of the Max Planck Institute for Extraterrestrial Physics in Garching, Germany. “It's a very big loss for x-ray astronomy.”

    It is especially bad news for Japan's astronomers, who have applied their modest budgets to carefully targeted niches. “Even though Japan's [space program] is small, we've been able to join the world's front ranks in x-ray astrophysics,” says Kazuo Makishima, an astrophysicist at the University of Tokyo.

    ASTRO-E would have been the fifth Japanese x-ray mission. Its predecessor, ASCA, launched in 1993, has made several ground-breaking discoveries, including the detection of iron in the x-ray emissions from accretion disks, the swirls of gas and dust around black holes. Distortions of the normal fingerprint of iron bore telltale evidence of the enormous gravitational pull of the black hole, something expected but never before observed.

    ASTRO-E's prime instrument, the x-ray spectrometer (XRS), developed by ISAS and NASA, is extremely sensitive, designed to measure the energy of individual x-ray photons hitting the detector. This would have allowed astrophysicists “to determine properties of accretion disks very close to black holes,” says Richard Kelley, principal investigator for the XRS at NASA's Goddard Space Flight Center in Greenbelt, Maryland, who says that his team doesn't have any plans to relaunch a similar payload.

    A second instrument, the hard x-ray detector (HXD), developed by ISAS and a group of astrophysicists at the University of Tokyo, was expected to set the direction for Japan's space x-ray efforts for the next decade. By focusing on very high energy x-rays, it would have ventured into relatively unexplored territory. These higher energies “are the least explored regions in astrophysics, and yet they contain some of the most violent astrophysical phenomena,” says Tokyo's Kamae.

    A group at Nagoya University is already at work on a next-generation very high energy x-ray instrument to be launched about 2007. But with ASCA nearing the end of its life, the loss of HXD leaves a gap in Japan's observational program. Makishima says that a dozen faculty members and 30 graduate students must now salvage what they can from 8 years spent planning and building HXD. “The most important thing will be to find some way to put this instrument in orbit,” he says. In the meantime, they will continue using ASCA and seek observing time on the U.S. and European x-ray observatories.

    The failure of the M5 rocket, in its third launch after two successes, casts a cloud over future ISAS missions. The M5 is intended to be ISAS's primary launch vehicle for the next decade. Its next scheduled mission is not until the summer of 2002, giving some breathing room to fix whatever went wrong.

    ISAS officials are focusing on the first stage of the rocket, in particular the possibility of damage to the graphite rocket nozzle. Onboard cameras transmitted images of sparks coming from the rocket nozzle just before attitude control problems developed about 40 seconds after lift-off. The rocket did not achieve the desired trajectory by the time the first stage separated, and the second and third stages failed to lift the satellite into orbit, causing it to burn up in the atmosphere.

    … While European Observatory Sends Back First X-ray Images

    Govert Schilling*

    *Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    The first pictures from Europe's new orbiting x-ray observatory were released last week. Launched on 10 December, the X-ray Multi-Mirror Mission (XMM)-Newton—named after the father of spectroscopy—joins NASA's Chandra X-ray Observatory, which was sent into space last summer. Japan's ill-fated ASTRO-E was to have been the third in a planned trio of x-ray observatories (see previous story).

    XMM-Newton will be able to detect much fainter emissions than Chandra can, although with less detail. The European observatory is “absolutely superior in doing certain types of observations, like taking spectra of faint, isolated neutron stars,” says Martin Weisskopf of NASA's Marshall Space Flight Center in Huntsville, Alabama, a Chandra project scientist. “Chandra and Newton complement each other very well.” In spite of the loss of ASTRO-E, “we are up for a prosperous new era in x-ray astronomy,” predicts project scientist Fred Jansen of the European Space Research and Technology Centre in Noordwijk, the Netherlands.


    Mouse Sequencers Take Up the Shotgun

    Elizabeth Pennisi

    MARCO ISLAND, FLORIDA—In a dramatic departure, the genome community will be using a controversial sequencing strategy—one that many scientists have publicly trounced—to tackle its next big target: the mouse. Discussions at a meeting* held here last week made clear that a good part of the mouse will be sequenced using the whole-genome shotgun method pioneered by J. Craig Venter of Celera Genomics in Rockville, Maryland. But it will be used in combination with the more incremental strategy used by the Human Genome Project, the publicly funded consortium sequencing the human genome and the genomes of other organisms. “Both approaches have something to offer for getting large, complex genomes sorted out,” says Robert Waterston, who heads Washington University's sequencing center in St. Louis.

    Less than 2 years ago, the sequencing community sharply criticized the shotgun approach when Venter announced that he planned to tackle the human genome this way (Science, 22 May 1998, p. 1185). In the shotgun approach, the entire genome is cut into tiny pieces and then sequenced and reassembled all at once. Many genome scientists thought Venter would never be able to put his millions of pieces of DNA back together in the right order—a process akin to assembling a jigsaw puzzle consisting almost entirely of blue sky. “Over 100,000 serious gaps” would remain, predicted Maynard Olson, a sequencing authority at the University of Washington, Seattle, in 1998.


    But a collaboration between Celera and academic partners to sequence the 160-million-base genome of the fruit fly Drosophila melanogaster (Science, 5 February 1999, p. 767) is showing that a hybrid strategy—combining whole-genome shotgun data with sequence generated the more traditional way, one bit at a time—can work. Because the chromosomal locations of those bits, which are represented in bacterial artificial chromosomes (BACs), are known, they can help sequencers assemble data from the whole-genome shotgun approach more accurately. “Drosophila taught us a compelling lesson. The hybrid approach has a lot of validity,” says Eric Green, a geneticist at the National Human Genome Research Institute (NHGRI) in Bethesda, Maryland. Venter seems to agree: He announced last month that Celera is relying on public data to speed its human genome effort.

    Even so, the decision to sequence the 3-billion-base mouse genome this way is not being made lightly, and many details “are still being thrashed out,” says Green. For Green and others, the mouse genome is pivotal because it can help them decipher the human genome. The mouse will be only the second mammal sequenced in its entirety, after the human, making possible all sorts of comparative analyses. Humans and mice share many of the same genes; indeed, many mouse aficionados assert that the best way to figure out how human genes work is to study them in the mouse.

    Understandably, the consortium of 10 labs tackling the mouse wants to be sure to do it right. To sequence a genome to the desired standard of accuracy—99.99%—each small stretch must be represented perhaps five to 10 times in the various pieces of sequenced DNA, whatever method is used to produce them. As recently as last October, when the 10 labs received $21 million from NHGRI to begin work on the mouse, no one had seriously considered tackling the mouse genome with anything but the tried-and-true approach (Science, 8 October 1999, p. 210). But that changed at the first meeting of the mouse network, when Richard Gibbs of Baylor College of Medicine in Houston proposed doing some “shotgunning” of the mouse. Not only might the shotgun approach be faster, suggested Gibbs, but it would ensure that the laborious front-end work needed to characterize the BACs wouldn't delay the project.
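    The five- to 10-fold coverage requirement can be sanity-checked with the standard Lander-Waterman (Poisson) model of shotgun coverage. This is a textbook approximation, not a calculation from the article:

```python
import math

GENOME_SIZE = 3_000_000_000  # mouse genome, roughly 3 billion bases

# Under the Poisson approximation, a given base is missed by every read
# with probability e^(-c) at c-fold average coverage.
for coverage in (5, 10):
    p_missed = math.exp(-coverage)
    print(f"{coverage}x: ~{GENOME_SIZE * p_missed:,.0f} bases left uncovered")
```

    At fivefold coverage roughly 0.7% of bases (about 20 million genome-wide) are never read; tenfold shrinks that to a few hundred thousand, which is why sequencers aim for several redundant passes.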

    Most of the group was receptive, recalls Bruce Roe, who will sequence some of the mouse genome at the University of Oklahoma, Norman. But some were worried that a combined approach might diminish a key near-term benefit of the mouse sequence: Geneticists want to use the mouse genome to find and characterize new genes in the human sequence, but that would require identifying genes in the mouse at very early stages in the project and also discerning areas where the mouse and human genomes are nearly the same. Roe had broader concerns as well: “I was afraid that there would be momentum to do something that was not well thought out.”

    So instead of settling on a sequencing strategy right away, the group decided to test the validity of the hybrid approach, and especially whether shotgun data could aid gene discovery. “The answer came back a resounding ‘Yes,’” says Green. Working with three other network members, Genome Therapeutics Corp. in Waltham, Massachusetts, simulated shotgun data by stripping down known sequences to a bare minimum. The group then evaluated how useful these new data would be for finding conserved sequence in both human and mouse genomes—sequence that would likely represent undiscovered genes. At last week's meeting, Lynn Doucette-Stamm of Genome Therapeutics reported that gene-finding programs could still pick up almost all the coding regions in shotgun data. What's more, those conserved regions helped the researchers pinpoint regulatory regions in both genomes.

    Although the network has agreed to tackle “most” of the mouse genome with the shotgun approach, the researchers are still debating exactly what “most” means. Last week, Washington University's John McPherson suggested at least three-quarters. But another network member, W. Richard McCombie of Cold Spring Harbor Laboratory in New York, is not comfortable with such a high proportion. Even Gibbs has reservations. “The real scientific issues [about the optimum ratio] remain unsettled,” he concedes. But the network expects to settle them soon, adds Francis Collins of NHGRI, who heads the network. Already, some mouse DNA is in the sequencing pipeline, with much more expected in the coming months. Boasts McPherson: “You should start seeing some mouse sequence hitting GenBank” soon.

    • *“Advances in Genome Biology and Technology I” was held 5 to 8 February.


    New Clue to Age Control in Yeast

    Evelyn Strauss

    In fairy tales, you drink a magic potion to live longer. In real life, just eating less might do the trick. For years, researchers who work on aging have known that they could extend the lives of species from yeast to rodents by restricting their food intake. The mechanism, however, has remained mysterious. Now, a team led by molecular biologist Leonard Guarente of the Massachusetts Institute of Technology may have turned up part of the answer. The group has identified what may be a biochemical link between calorie restriction and increased life-span, at least in yeast.

    The link appeared fortuitously during studies of a phenomenon called silencing, which turns genes off at particular chromosomal regions and helps maintain the structural integrity of the DNA. In previous work, Guarente's group showed that Sir2, a protein needed for silencing in yeast and possibly other organisms, controls yeast life-span. This was likely due to its silencing activities, but no one knew exactly how the protein performed this silencing feat—much less how it might be related to aging.

    As the researchers report in the 17 February issue of Nature, Sir2 might work by removing acetyl chemical groups from the histone proteins that bind DNA in the chromatin, a chemical change that ultimately ties up the DNA so that the proteins needed for gene activity can't gain access. Geneticists had suspected for many years that such an activity was behind Sir2's silencing action, but had never been able to catch the protein in the act of removing the acetyl groups. The Guarente team succeeded while actually studying a different type of reaction that is also catalyzed by Sir2. In the course of that work, they added a chemical called NAD to the reaction mixture. They found that when—and only when—NAD is present, Sir2 removes acetyl groups from a synthetic portion of a histone.

    That discovery also provides a link to calorie restriction, because NAD normally helps the cell capture energy from food. When food is restricted, concentrations of available NAD could rise, Guarente proposes. This rise, in turn, could boost Sir2's silencing activities to help cells live longer. “If you lose silencing over time, you could get inappropriate gene expression, and these changes could be responsible for some of what we see in aging.”

    George Roth, who studies caloric restriction in primates at the National Institute on Aging in Baltimore, says Guarente's idea is “consistent with what one might expect.” Still, he cautions, “it's a big stretch to go from that observation in yeast to mammals.”

    When the Guarente team began its work, the researchers were investigating another theory about how Sir2 might silence DNA. Results from bacterial and mammalian proteins that resemble yeast Sir2 suggested that it might transfer part of NAD, containing the sugar ribose attached to the nucleotide adenosine diphosphate (ADP), to other proteins. They wanted to see if Sir2 would add the ADP-ribose to a synthetic histone segment, and it did. But when they further analyzed the reaction mixture, they got a surprise. Addition of ADP-ribose should increase the histone peptide's weight by 541 daltons. Instead, the researchers found a major product that weighed 42 daltons less than the original peptide. “That was an astounding thing,” says Guarente. “It was getting smaller, not bigger.”

    Fortunately, Guarente knew his numbers: 42 is exactly the size of an acetyl group. “So we thought, ‘Oh my goodness, it's deacetylating the peptide,’” he recalls. Other researchers hadn't been able to detect that reaction with Sir2 before, he postulated, because they hadn't added NAD. When he and his colleagues then repeated the experiments with and without NAD, they found that indeed, it is required for deacetylation by Sir2. They also showed that a related protein in the mouse performs a similar function, suggesting that Sir2 carries out the same reaction in mammals.

    Although the work provides the first direct evidence that Sir2 can deacetylate, it does not prove that deacetylation is responsible for the protein's silencing activity. The problem is Sir2's ADP-ribosyl transferase activity. Last year, Danesh Moazed, a molecular biologist at Harvard University, and his colleagues showed that a particular mutation could obliterate both it and Sir2-mediated silencing. This raises the possibility that the transferase, rather than the deacetylase, is what's necessary for silencing. Guarente has shown, however, that the same mutation also curbs Sir2's ability to deacetylate histones, so it's not clear which activity is more important for silencing.

    He and his colleagues tried to find out by creating a modified version of Sir2 that lacks most of its ADP-ribose transferase activity, but retains deacetylase activity. This protein performed many of the usual silencing feats, but the experiment wasn't conclusive because the protein was still a weak transferase. “I'm very excited about the new result,” Moazed says. “But until we get mutations that cleanly separate the activities, the issue won't be settled.” Either way, NAD seems to be involved and could thus serve as a biochemical link between Sir2-mediated silencing, caloric restriction, and aging. If so, Guarente's hypothesis may help us take a step toward living, well, happily ever after.


    RNA Works Out Knight Moves

    Charles Seife

    Silicon upstarts aside, the best chess computers are biological—the brain of grand master Garry Kasparov, for example. Now a team of scientists at Princeton University has used a different sort of biological computer—beakers full of organic glop—to solve a chess problem. The feat, the most difficult problem ever solved by molecular computing, marks the first time RNA has been used as a molecule for computation and may point the way to powerful techniques for solving other mathematical puzzles.

    Everyday computers manipulate information in the form of bits: 1s and 0s. The bits may be stored as high and low voltages (as in a computer's processor) or as north- or south-pointing magnetic fields (as in a hard drive). But they may also take more exotic forms. For example, the chemical bases that make up molecules of DNA and its cousin RNA are ideal for storing digital information. In 1994, computer scientist Leonard Adleman of the University of Southern California in Los Angeles showed that jugs of DNA could be turned into computers (Science, 11 November 1994, p. 1021). Since then, scientists have been using DNA to solve small mathematical problems such as adding two numbers together. Now, in the 15 February Proceedings of the National Academy of Sciences, evolutionary biologist Laura Landweber and her colleagues at Princeton University describe how they used RNA to solve the “knights problem” on a 3 × 3 chessboard: finding all the ways to place a collection of knight pieces (which move in an L-shaped pattern) so that no knight can attack another.

    To solve this problem with a regular computer, you could start by assigning one bit to each of the nine squares on the board. Each bit represents whether a knight is sitting in that position (1) or if that position is empty (0). Then you could simply crank through all the possible combinations of 1s and 0s for the nine positions and eliminate the ones where knights are able to attack each other.
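    That brute-force search fits in a few lines. In the sketch below, the square numbering and the list of attacking pairs are worked out from the 3 × 3 board geometry; they are not given in the article:

```python
# Squares numbered 0-8, row-major; a 9-bit integer encodes one board,
# bit i set meaning a knight occupies square i. On a 3x3 board these are
# the only eight attacking pairs (the center square attacks nothing).
ATTACKS = [(0, 5), (0, 7), (1, 6), (1, 8), (2, 3), (2, 7), (3, 8), (5, 6)]

def peaceful(board):
    """True if no knight on `board` can capture another."""
    return not any(board >> a & 1 and board >> b & 1 for a, b in ATTACKS)

solutions = [b for b in range(2 ** 9) if peaceful(b)]
print(len(solutions))  # counts every non-attacking placement, empty board included
```

    Cranking through all 512 boards this way leaves 94 valid placements, counting the empty board.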

    Landweber took a similar “brute force” approach. First, she synthesized 18 different stretches of DNA, each consisting of 15 base pairs. Each stretch represented a bit for a particular space—a “knight” or a “blank” for each of the nine positions on the board. (For instance, CTCTTACTCAATTCT meant that the upper left-hand corner is blank, while ACCTTACTTTCCATA meant there's a knight in the center square.) She then created a “library” of millions of DNA strands representing all possible configurations of the board—that is, every possible permutation of knights and blanks.

    Landweber then methodically eliminated the permutations in which one knight could capture another. Using standard techniques, she copied the DNA into RNA, which can be readily cleaved by an enzyme called ribonuclease H. That set the stage for a molecular slice-and-dice fest that minced all the non-solution-bearing RNA strands out of the library.

    The enzymatic algorithm was possible because the knights problem can be reduced to a set of logical statements. One statement might be: “Either the upper-left corner is blank, or the two squares that a knight threatens from that position must be blank.” To satisfy that statement, Landweber split the library into two. Into one jug, she poured an enzyme that targeted the sequence that meant “there is a knight in the upper-left corner.” To the other jug she added two enzymes that targeted the sequence that signaled the presence of a knight in the two threatened positions. After the broken fragments were all weeded out, neither jug contained an RNA strand that included sequences that had both a knight in the upper-left corner and a knight in one of the two squares threatened from that position.
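    The split-cleave-remix cycle amounts to simple set operations. The simulation below is an illustration in code, not the lab protocol: each integer stands in for one strand, and the square numbering is invented for the sketch:

```python
# Bit i set = knight on square i (squares 0-8, row-major). MOVES lists the
# squares a knight on each square threatens; the center square (4) threatens
# nothing on a 3x3 board, so it needs no logical statement.
MOVES = {0: (5, 7), 1: (6, 8), 2: (3, 7), 3: (2, 8),
         5: (0, 6), 6: (1, 5), 7: (0, 2), 8: (1, 3)}

library = set(range(2 ** 9))  # every permutation of knights and blanks

for square, threatened in MOVES.items():
    # Jug 1: cleave strands with a knight on `square`, keeping it blank ...
    jug1 = {s for s in library if not s >> square & 1}
    # Jug 2: ... or cleave strands with a knight on either threatened square.
    jug2 = {s for s in library if not any(s >> t & 1 for t in threatened)}
    library = jug1 | jug2  # remix the jugs and move to the next statement

print(len(library))  # strands surviving every statement = valid solutions
```

    What survives is exactly the set of boards on which no knight threatens another, since every attacking pair is ruled out by the statement for at least one of its two squares.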

    Landweber then mixed the jugs together, converted the RNA back to DNA, amplified the DNA, and started all over again with another logical statement. After repeating the process for each logical statement that describes the knights problem, she was left with a flask full of strands corresponding to every valid solution to the knights problem—plus a few rogues that escaped the cleaving enzymes by a fortuitous mutation. “We pulled out 42 correct solutions and one incorrect solution out of 43 clones that we tested,” says Landweber.

    “It is the world champion so far,” says Adleman, who, along with other biochemists and computer scientists, is trying to make a molecular computer do a problem that a human can't do in a reasonable amount of time. “[Landweber] has got the inside track for trying to reach that milestone first.”

    To develop practical nucleic-acid computers, however, scientists will have to clear some major hurdles, such as figuring out how to correct errors and how to produce and handle large volumes of nucleic acid. Kasparov has no reason yet to feel threatened by beakers of glop, but Adleman is hopeful that nucleic-acid computers will be more than mere curiosities. “Here's nature's toolbox, a bunch of little tools that are dirt cheap; you can buy a DNA strand for 100 femtocents,” he says. “Here's a great set of tools, we know they can do lots—let's build cool things!”


    Fired Researcher Is Rehired and Refired

    Eliot Marshall

    A bitter and long-running dispute at the University of Arizona (UA) over the firing of a senior biomedical researcher for scientific misconduct has taken a strange new turn. On 4 February, UA president Peter Likins reinstated the researcher he had fired 19 months ago—former Regents Professor Marguerite Kay, an expert on the immune system and Alzheimer's disease. But, on the same day, Likins notified Kay that she was being dismissed again and he barred her from the campus, citing a policy that permits him to exile a faculty member whose presence is deemed “likely to constitute a substantial interference with the orderly functioning of the university. …” Likins gave the same reasons as before: A faculty panel ruled in 1998 that Kay had engaged in scientific misconduct and neglected her duties as a professor (Science, 5 November 1999, p. 1076).

    This bizarre twist is the result of a judge's rulings last year that the university had acted in an “arbitrary and capricious” manner in firing Kay without a regular personnel hearing, and that she was wrongly denied full legal representation in a misconduct hearing. Likins informed Kay she had a right to appeal the redismissal, which would presumably trigger a personnel hearing.

    Kay's supporters on the faculty were outraged by these moves. Two lawyers on the faculty senate immediately objected that Likins had violated Kay's rights and, thereby, the rights of all tenured faculty members. Attorneys Roy Spece Jr. and Andrew Silverman read a protest note during a senate meeting on 7 February in which they urged Likins to redo the entire investigation against Kay. The findings of misconduct against her, they argued, were rendered “null and void” by the court rulings. Judge Stephen Villarreal of the state court for Pima County found that the faculty-run hearing that investigated and condemned Kay's research in 1998 was deficient because Kay's attorney was not permitted to speak during the proceedings (Science, 26 November 1999, p. 1657). As a result, “the only proper way to proceed is to return to the very beginning and to do it right this time,” said Spece and Silverman.

    Likins clearly isn't interested in doing that. In a memo to department heads on 4 February, he noted that the court “did not make any determination regarding the substantive basis for the decision to dismiss Dr. Kay.” And he said that the work of several faculty committees that investigated the case “will be respected.” Likins declined to comment, according to university spokesperson Sharon Kha, because university rules forbid public discussion of personnel matters. Kha said she was limited to stating that Kay is once again on the faculty—nothing more.

    Kay also could not be reached for comment. But her attorney in Tucson, Don Awerkamp, predicted that the decision not to redo the investigation from the top but to rely on the disputed misconduct investigation of 1998 will waste time and “cost hundreds of thousands of dollars more in litigation expenses.”

    On Kay's behalf, Awerkamp filed suit against the university in December, demanding $3 million for breach of contract. The suit also seeks additional damages for violation of Kay's rights to due process in job termination, and for pain and suffering and other harms. Included in the list of defendants are the university's board of regents, Likins, the chief counsel, the former research administrator, the oncologist who chaired the panel that investigated Kay, and two other faculty members who stepped forward to support the allegations of misconduct.

    Kay is willing to settle out of court, Awerkamp says, but so far the university has made “no formal response” to the breach of contract claim, although it has offered her back pay to the day of her first firing. “We will be back in court,” he predicts.


    Double Magic for New Nickel Nucleus

    Adrian Cho

    In what you might call a smashing success, physicists have chiseled out an atomic nucleus laden with a record eight more protons than neutrons. The new nucleus, nickel-48, self-destructs in a fraction of a second, but it may help researchers study a form of radioactive decay that they have long sought but not yet observed.

    As with fraternity brothers in a phone booth, you can cram only so many protons into a nucleus. There, protons and neutrons stick to one another with a peculiar pull called the strong force. But the like-charged protons also push against one another with an electric force. If the protons outnumber the uncharged neutrons by too wide a margin, the electric force wins out and the nucleus falls apart. That's why almost every atomic nucleus in nature has more neutrons than protons.

    But proton-rich nuclei can stabilize themselves by playing a shell game. Like the electrons buzzing around in their orbital shells, the protons and neutrons in the nucleus pile into distinct nuclear shells. And just as elements with full electron shells are inert, nuclei with full nuclear shells tend to be more stable. So nuclei that are “doubly magic”—that possess a full shell of neutrons and a full shell of protons—may be stable, even if the protons outnumber the neutrons. Knowing this, nuclear physicist Bertram Blank and his team at the Grand Accélérateur National d'Ions Lourds in Caen, France, spent the last 10 years hunting for nickel-48, a doubly magic nucleus with 28 protons but only 20 neutrons.
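    The “doubly magic” bookkeeping is easy to check against the closed-shell counts from nuclear physics textbooks (2, 8, 20, 28, 50, 82, 126; the list is standard, not from the article):

```python
MAGIC = {2, 8, 20, 28, 50, 82, 126}  # proton/neutron counts that fill a nuclear shell

def doubly_magic(protons, neutrons):
    """True when both the proton shell and the neutron shell are full."""
    return protons in MAGIC and neutrons in MAGIC

print(doubly_magic(28, 20))  # nickel-48: 28 protons, 20 neutrons -> True
print(doubly_magic(28, 30))  # naturally occurring nickel-58 -> False
```

    By this accounting, nickel-48 gets the extra stability of two closed shells even though its protons outnumber its neutrons by eight.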

    To make the novel nucleus, the researchers smashed a high-energy beam of naturally occurring nickel-58 nuclei (28 protons and 30 neutrons) into a nickel-58 target. The collision knocked bits and pieces off the passing particles, producing a shower of odd nuclei. The team measured the charge and mass of every one with a sophisticated spectrometer. After sifting through millions of nuclei, they found four nickel-48 nuclei, as the researchers report in the 7 February Physical Review Letters.

    The barely stable nucleus is a prime candidate for two-proton decay, a form of nuclear decay that until now has been strictly theoretical, says P. Gregers Hansen, a nuclear physicist at Michigan State University in East Lansing. Because of peculiarities of the strong force, nickel-48 cannot eject one proton, but it might just spit out two at once, thereby emitting a kind of radiation that has never been detected. “It's wonderful to know [nickel-48] exists,” Hansen says. “Someday we may do some experiments with it.”


    U.K. Plans Major Medical DNA Database

    Michael Hagmann

    Following the examples of Iceland, Sweden, and Estonia, the United Kingdom is drawing up plans to create a national database linking the DNA of 500,000 of its citizens to their medical records and lifestyle details. Its main goal is to tease apart the genetic and environmental components of conditions such as cardiovascular disease and cancer and, eventually, to come up with new drugs to treat—or even prevent—these conditions. An expert panel is currently hammering out a strategy for setting up the database and is due to report its recommendations next month.

    Dubbed the U.K. Population Biomedical Collection, the database is a joint project of Britain's two principal supporters of biomedical research, the government-funded Medical Research Council (MRC) and the Wellcome Trust, a charitable foundation, in collaboration with the National Health Service. “It's basically a very large-scale epidemiological study,” says Tom Meade, director of the MRC Epidemiology and Medical Care Unit, who also heads the MRC/Wellcome working group convened last May to look into the feasibility of the project. Although the working group has found few major technical stumbling blocks, it is taking a cautious approach to such issues as personal privacy and consent, following the outcry that greeted the setting up of the Icelandic database.

    In Iceland, the database project was initiated by a private company, deCODE Genetics. Once the project won parliamentary approval in 1998, deCODE could gain access to the medical records of Icelandic citizens who did not opt out of the scheme (Science, 1 January 1999, p. 13). Researchers and civil liberties campaigners protested deCODE's monopoly of the data and expressed concern about informed consent procedures and patient confidentiality.

    Meade stresses that the U.K. collection will have several fundamental differences from the Icelandic scheme. “Instead of being run by a commercial company, the MRC and the Wellcome Trust are independent research organizations,” he says, adding that the British database would be based “entirely on an opt-in approach; participation would be completely voluntary.” After getting patients' consent, physicians would gather information about their social status and lifestyle—such as dietary habits, smoking, and exercise—and take blood to obtain DNA samples. The patients would then have regular checkups, and their medical records in the database would be updated.

    Many working details, such as who should have access to the data or have rights to the intellectual property contained in it or drugs derived from it, are yet to be worked out, Meade says. Meanwhile, the MRC and the Wellcome Trust are consulting widely “to gather information about how the public feels” about such a medical database, says Ify Uwechue, a Wellcome spokesperson. Sue Mayer, director of the pressure group GeneWatch U.K., says such a study is essential. “Before steaming ahead, we need to have rules and guidelines in place and a proper public debate about such urgent issues as privacy, consent, and access,” she says. Mayer predicts that a medical data bank “is going to be a very sensitive issue in the United Kingdom.”

    Nevertheless, Meade is optimistic that “if we can make it very clear why we do it and what the tremendous potential benefits will be, most people will find [the database] acceptable.” If they succeed in getting their message across, Meade adds, the database could be up and running as early as next year.


    New Genetic Tricks to Rejuvenate Ailing Livers

    1. Michael Hagmann

    Some 20 million people in the United States alone suffer from liver diseases, and more than 40,000 of them die each year. Liver transplants could save many of those lives, but there are only enough donor organs to treat about 4000 American patients each year. Now, researchers have developed two new treatments that have proved successful in rodents with severe liver damage. The hope is that one day they may help prolong the lives of patients awaiting a donor organ—or perhaps even do away with the need for a transplant altogether.

    On page 1253, Ronald DePinho of the Dana-Farber Cancer Institute and Harvard Medical School in Boston and his colleagues report that erosion of the telomeres, caplike structures that protect the ends of the chromosomes, predisposes mice to liver cirrhosis, a degenerative liver condition that in humans is the main killer among liver diseases. What's more, the researchers restored liver function in the animals by using gene therapy that kept the telomeres from withering away.

    And on page 1258, Philippe Leboulch of the Massachusetts Institute of Technology and Harvard Medical School, Ira Fox of the University of Nebraska Medical Center in Omaha, Naoya Kobayashi of Okayama University in Japan, and their colleagues report that they were able to grow enough liver cells (hepatocytes) in lab cultures to get rats through an acute liver failure. Ordinarily, cultured liver cells grow poorly, but the researchers overcame this handicap by first introducing a cancer gene that revved up their cell division machinery and then clipping the gene out again to curb the cancer risk.

    Experts are hailing the studies. “This is very fascinating; the telomere system hasn't been implicated in liver damage and regeneration. And what's holding back the use of hepatocytes [in cell transplantation] is the inability to greatly expand them in culture,” says Roger Williams, a hepatologist at the University College London Medical School. Still, he and others caution that there's a huge gap between bench and bedside. “There's a lot of ifs in here, and this is not immediately going to lead to a change in clinical practice,” says physiologist Irwin Arias of Tufts University School of Medicine in Boston.

    Cirrhosis results when the liver undergoes chronic damage, which in humans is often caused by excessive alcohol consumption or hepatitis virus infection. The organ tries to repair the damage by producing new cells, but the liver cells eventually stop dividing, and that in turn results in liver failure and death. Scientists have long known that telomeres in most normal human cells get a little shorter with each division until, finally, the aged cells no longer divide. So when Japanese scientists discovered in the mid-1990s that cells from cirrhotic livers have shorter telomeres than those from healthy ones, “this suggested to us that telomere attrition may be a crucial trigger for end-stage cirrhosis,” says DePinho.

    To test his hypothesis in mice, DePinho and his colleagues had to employ an experimental trick. In normal human cells, telomerase—the enzyme that makes telomeres—stops working when cells differentiate to form the various tissues, but in most mouse cells, it's active throughout life. As a result, mice have much longer telomeres. So the researchers used mice in which the gene for one telomerase component had been “knocked out.” Through the generations, telomeres in these animals shrink, “making them more humanlike,” as DePinho puts it.

    When his team then subjected the animals to various liver injuries, their rate of liver cell death shot up, the cells' capacity to divide and regenerate was reduced, and they showed many of the symptoms of severe cirrhosis. The DePinho team also found that they could ward off the disease by injecting the mice with an adenovirus, which usually causes respiratory infections, that had been modified to carry the gene for the missing telomerase component.

    One potentially serious issue would have to be resolved before researchers can even think about trying such a therapy in humans, however. As DePinho points out, telomerase often gets turned on in cancers—including, Arias says, about 75% of all liver cancers. Researchers will need to make sure that adding the enzyme to the livers of human cirrhosis patients won't facilitate liver cancer development.

    To devise their liver cell therapy, Leboulch and Fox also took a cue from relentlessly dividing cancer cells. Hepatocytes, like all freshly isolated normal cells, are tricky to grow in culture. In 1996, Leboulch devised a technique to introduce an oncogene called large T antigen into primary cells, thus “immortalizing” them so that they would grow continuously in culture. As a safety device against the cells spiraling toward cancer, he designed the oncogene so that it could be chopped out of the cells' DNA by a “genetic scissors” that, upon introduction by an adenovirus, recognizes a pair of sequences bracketing the gene and deletes the portion in between.

    When Fox, a liver transplant surgeon, heard about Leboulch's system, he immediately called him and proposed a collaboration to give it a try in hepatocytes. Leboulch agreed, and the team set out to see whether they could grow enough liver cells to save the lives of rats that had had about 90% of their livers surgically removed. This treatment is always fatal, but when the researchers injected the lab-grown hepatocytes into the animals' spleens, up to 60% of the animals survived. “This system may do away with the shortage of hepatocytes. You could keep [the immortalized cells] in the freezer and take them whenever you needed them, which is not possible with primary hepatocytes,” says Fox.

    Hepatologist Roy Chowdhury of the Albert Einstein College of Medicine in New York City calls the Leboulch-Fox team's results “very encouraging.” But he adds, “a 90% hepatectomy does not accurately reflect liver failure in humans where some insult like a virus or a toxin that caused the damage in the first place persists.” These could also damage the transplanted cells. Achilles Demetriou, a liver expert at the Cedars-Sinai Medical Center in Los Angeles, agrees. Demetriou instead favors coaxing versatile fetal stem cells to differentiate into liver cells that could then be used for transplantation. “They're more promising and safer,” he says.

    No one can predict when—or if—the new strategies will provide relief for the thousands of patients awaiting liver transplants. Says Demetriou: “For about 20 years now hepatocyte transplantation has always been ‘promising.’ But we still haven't delivered.” And right now gene therapy doesn't have such a great reputation either. So, the new strategies may not be chopped liver, but they also don't appear to be magic bullets—at least not yet.


    Researchers Brace for Political Backlash

    1. Robert Koenig

    ZURICH—From its position at the crossroads of Europe, the former imperial capital of Vienna has attracted scientists from across the continent, east and west, for centuries. But earlier this month, after the Austrian president approved a new government incorporating members of a far-right political party, the nation's researchers suddenly found themselves living in a state shunned by much of the international community. “My colleagues are deeply worried,” says physicist Arnold Schmidt, president of the country's basic research granting agency, the Austrian Science Fund. “We need international cooperation, and we don't feel that we should be held responsible for a government that many did not support.” In an open letter sent last week, Schmidt appealed to foreign researchers “to maintain or increase contacts and cooperation with scientists in Austria.”

    Other bodies have also been quick to reaffirm international ties and—in many cases—to distance themselves from the new government. Austria's university rectors issued a statement warning of possible “international isolation of Austria, which would be detrimental to its universities,” and called on national leaders to show “openness and internationality.” The University of Vienna's medical school pledged to intensify its efforts “to ensure that racism and prejudice are not tolerated.” And the scientific directors of the Erwin Schrödinger Institute—a mathematical physics center in Vienna that draws visiting fellows from around the world—say on the institute's Web site that they oppose “the nationalistic and xenophobic sentiments expressed by some politicians” and reaffirm their commitment to “international scientific interaction and exchange.”

    So far, the actions taken by Western governments to protest the new Austrian coalition—in which six out of 12 ministers come from Jörg Haider's far-right Freedom Party—have been mainly symbolic. The European Union (E.U.) and many of its member nations have issued statements and snubbed some representatives of the new government, but a spokesperson for the E.U.'s research commissioner, Philippe Busquin, told Science that sanctions are unlikely to impact research programs and that Austrian scientists will not face discrimination in their applications for E.U. grants.

    But there are some signs of unease in the international community: Manfred Horvath, an engineer who heads Austria's Office of International Research and Technology Cooperation, told Science that the Washington-based Interagency Environmental Technology Office asked last week that a conference planned for Vienna in October be moved elsewhere. “The situation is very unpleasant for Austrian science and research, which has been increasingly integrated into the E.U.'s research activities,” he says.

    So far, most Austrian scientists seem more worried about the reaction from outside than about changes in national science policy or funding. The new coalition purports to back science—reaffirming the previous government's commitment to increase research spending in coming years—but it raised some concern last week when it announced that it will split management of basic and applied research. The education ministry will now oversee basic research, while applied research will be combined with infrastructure in a ministry to be headed by Freedom Party member Michael Schmid. “I'm uneasy about splitting basic and applied research, and worried that the new government may focus less on research and more on development,” says the Science Fund's Schmidt.

    In the meantime, Austrian researchers are trying to reassure their colleagues around the world. Quantum teleportation pioneer Anton Zeilinger of the University of Vienna is urging international colleagues to continue their normal exchange with Austrian counterparts. And Erwin Heberle-Bors of Vienna's Institute of Microbiology and Genetics wants fellow Austrian researchers to “leave their ivory towers and discuss and explain the situation” to scientists abroad, and to continue their “full engagement with the international research community.”


    Controversy Claims CDC Lab Chief

    1. Martin Enserink

    A senior manager at the Centers for Disease Control and Prevention (CDC) has been reassigned as agency officials scramble to quell a widening controversy about the reallocation of funds that Congress had earmarked for specific diseases. Testifying last week before Congress, CDC chief Jeffrey Koplan announced that virologist Brian Mahy has been replaced temporarily by James LeDuc as head of the division of viral and rickettsial diseases.

    Mahy came under fire last year when an investigation by the Inspector General of the Department of Health and Human Services found that his division had spent between $8.8 million and $12.9 million that Congress had approved for research into chronic fatigue syndrome (CFS) on a variety of other diseases. The report angered lawmakers and CFS patient groups. In response, Koplan offered his apologies and promised to restore the CFS funds, but he didn't take action against Mahy (Science, 7 January, p. 22).

    Earlier this month, however, Mahy's position was weakened further when a Washington Post report alleged that his division had also reallocated money earmarked for hantavirus programs. Testifying before the House appropriations subcommittee on Labor, Health and Human Services, and Education, Koplan said a preliminary inquiry has confirmed the allegation. CDC has hired private accountants to review the hantavirus program, Koplan said, followed by an investigation of all programs within the National Center for Infectious Diseases. In addition, Koplan has asked CDC managers to report within 90 days on any other cases where the agency provided inaccurate information to Congress.

    Koplan stressed that the diverted money had not been wasted but used “to combat other life-threatening infectious diseases,” such as Ebola and Nipah virus. He blamed the diversions on CDC's culture, “which emphasizes getting things done and taking care of the administrative niceties afterward.” A CDC spokesperson said Mahy will stay at CDC but couldn't say in which position.

    “It's a welcome piece of news,” says Kimberly Kenney, executive director of the Chronic Fatigue and Immune Dysfunction Syndrome Association of America, a group that helped expose the diversion of CFS money. But scientists express sympathy for Mahy, who they believe was trying to deal as best he could with emerging epidemics. “It's kind of sad,” says Charles Calisher, a virologist at Colorado State University in Fort Collins. “A guy does what he thinks is the right thing, and he gets lambasted.”


    Ecologists on a Mission to Save the World

    1. Jocelyn Kaiser

    While some leading ecologists are urging their colleagues to inject their findings—and themselves—into policy debates, others warn that activism will erode the discipline's credibility

    Anger wells up whenever Les Watling recalls the cruise that helped transform the reclusive scientist into a vocal champion of biodiversity. It was August 1993, and Watling was revisiting a spot in the Gulf of Maine where, a few years earlier, he had stumbled across one of the richest assemblages of life he'd ever seen in those frigid waters. Carpeting the boulder-strewn sea floor were unusual sponges nearly 30 centimeters tall—veritable cacti—including several species he didn't know. “This place was just amazing,” he says.

    Hoping to learn more about this underwater Eden, Watling, who studies crustaceans, took a Danish sponge expert to the spot, Jeffrey's Bank. Shining a searchlight on the ocean bottom, researchers spent two increasingly desperate hours in a submersible looking at mostly barren rock and silt. The sponges were gone. Although Watling was no expert on the fishing industry, the University of Maine, Orono, professor had a hunch that trawlers scraping the ocean floor with their heavy nets had mowed down the sponge garden. His suspicions grew when he later learned that a white hake fishery had begun operating in the waters around Jeffrey's Bank, using new “rock hopper” gear—a set of large balls or rollers that ride the ocean floor, dragging nets to scoop up hake and other bottom-dwellers. “I was just appalled,” he says.

    Soon after that experience, Watling made a fateful decision, one he realized could diminish his standing among his peers and even jeopardize his career: He picked up a slingshot and headed after Goliath. He began publishing papers on trawling's harmful effects and called for a ban on the practice, a message he thumped in media tours sponsored by the American Oceans Campaign and SeaWeb, nonprofits that raise awareness of marine conservation issues. Up to that point in his 20-year career, Watling says, “I was a classic scientist. I'd sit in my office, work on my grant proposals, write my papers, take my professional accolades, try not to stick my head out the door too far.” Those halcyon days were over: Watling had become an advocate.

    He's also at the vanguard of a movement that's causing some soul-searching among ecologists. Across the country, ecologists are stepping up efforts to speak out about the policy implications of their findings. Spurred by leading figures in their field, they are writing commentaries, signing letters, speaking to Congress, even sharing the bully pulpit with environmental groups. The Ecological Society of America (ESA) has launched a fellowship program to train experienced environmental scientists in the delicate art of conveying a bottom line to the media and to policy-makers. Meanwhile, conservation biologists, whose subdiscipline was conceived explicitly to generate the science for protecting habitats and species, are embarking on a major new push to reach out to managers and make their voices heard (see p. 1192). To many ecologists, locking themselves away in the ivory tower is now unconscionable. Columbia University ecologist Stuart Pimm, who has given his opinion on such matters as the multibillion-dollar plan to restore Florida's Everglades (a flawed effort, in his view), sums up what many of his colleagues feel: “I have a moral responsibility as a citizen to make people aware of what the science means.”

    But the drive to make advocacy an accepted practice in ecology has provoked a backlash. Some ecologists worry that to the public, environmental scientists are becoming indistinguishable from environmental activists. “If we promote our opinions as though they are the truth, people won't listen to the science as carefully because they'll think we have an agenda,” says Ingrid Burke of Colorado State University in Fort Collins. She and others fret that ecologists will handicap their ability to do empirical research if they go beyond current science by making value judgments—for example, by saying that nonnative plant species or global warming are categorically bad, or that economic growth should be curtailed to save species from extinction. Such values “can really affect the way you design a study,” says John Wiens of Colorado State, who warns against “creeping advocacy syndrome.”

    Ecologists are discussing these issues among themselves: The ESA devoted a whole session to scientific objectivity, values, and advocacy at its annual meeting last August. With debate heating up, Science polled more than two dozen ecologists to see just how far they think they should go in getting a message out to the public. Many ecologists expressed deep reservations about crossing the blurry line that separates scientific meaning from social values. As Stanford University ecologist Pamela Matson explains it, “a lot of ecologists walk a really fine line between advocacy for science and advocacy for a cause.” Others argue that ecologists often deal with issues, such as climate change, that require policies to be adopted before the science is certain—and if they don't weigh in heavily in political debates to counteract the arguments of those with a vested interest in delaying action, it could be too late. There are no easy answers, but Science offers examples of how advocacy can color a scientific issue, and how three individuals—Stephen Schneider (p. 1189), Gene Likens (p. 1190), and Gretchen Daily (p. 1191)—have become comfortable with the level of advocacy they've pursued.

    Birth of a movement

    Watling is not the first scientist to parachute down from the ivory tower in the hope of making a difference to society. During World War II, many U.S. physicists donned a political mantle to argue for development of the atomic bomb. And in the decades after the war, many of the same physicists played leading roles in the policy debates over arms control and the Strategic Defense Initiative. But activism has more recently been thrust upon or embraced by ecologists, whose studies of the natural world have revealed worrisome, if sometimes uncertain, trends that point toward a need for political action.

    Tensions over advocacy came to a boil in 1951, when ESA members who felt that urgent measures were needed to protect habitats branched off from their more circumspect colleagues to form The Nature Conservancy. Several years later the modern environmental movement was born, when biologist Rachel Carson sounded the alarm on how DDT and other pesticides were harming wildlife. Her 1962 book Silent Spring spurred scientists to form the activist group the Environmental Defense Fund (now called Environmental Defense). The ranks of scientist-activists swelled in the 1980s, when environmental interests often took a back seat to business interests. Back then, says Environmental Defense senior scientist David Wilcove, “I felt a social responsibility to get out there and try to make a difference.” As an environmentalist with a doctorate, Wilcove says, he feels he's had “a much greater impact” than he would have without that credential on issues such as protecting old-growth forest in the Northwest and revamping the Endangered Species Act.

    He and many others have paid a price for getting involved. Some scientists question the objectivity of papers published by scientist-advocates like Wilcove. After Silent Spring came out, Rachel Carson spent the last months of her life fending off a vicious backlash from pesticide manufacturers, who labeled her a “fanatic.” More recently, ecologist Jerry Franklin of the University of Washington, Seattle (UW), received anonymous death threats during the early 1990s for affirming the spotted owl's dependence on old-growth forest habitat.

    Still, many ecologists say their colleagues aren't aggressive enough in injecting their findings into policy debates. To remedy this perceived shortcoming, in 1983 the ESA, after much soul-searching, established a beachhead in Washington, D.C., opening an office to convey its expertise to policy-makers. Then 9 years later it launched the Sustainable Biosphere Initiative, which aims to educate the public and federal agencies on issues ranging from political hot potatoes like endangered species to unsexy topics like the surfeit of nitrogen from burning fossil fuels. Continuing this trend, the ESA in 1998 helped launch the Aldo Leopold Leadership Program to teach midcareer ecologists how to get their message across in the media.

    The new spirit of activism was summed up in a letter to Science signed by 20 prominent ecologists, including Paul Ehrlich and Matson of Stanford and Jane Lubchenco of Oregon State University in Corvallis (Science, 30 October 1998, p. 879). It isn't enough for a scientist to merely report findings, they wrote. Ecologists should contribute to “stemming the tide of environmental degradation and the associated losses of biodiversity and its ecological services.” “[M]uch of what we study,” they continued, “is fast disappearing. … Ecologists have a responsibility to humanity, one that we are not yet discharging adequately.”

    If the letter was meant to rouse the community, it worked. Although some ecologists applauded the statement, others cried foul. “I thought that was just a travesty,” says one. “The public won't know when to trust us.” As for the letter's tone, wrote UW marine biologist Warren Wooster in a letter to Science, “When an ecologist makes an apocalyptic statement about the death of one or another ecosystem, he trades his credibility for his passion as an advocate.” Wooster and others say they don't disagree with the need to publicize results. The problems come when scientists advocate, be it calmly or shrilly, a course of action.

    The danger they perceive is that outspoken advocacy may make it hard to retreat from, or to qualify, positions once new findings come in. For instance, some scientists argue that fires and other human activities may be key to the vitality of certain swaths of land in the Amazon basin (Science, 4 February, p. 786)—a conclusion that, if true, would be hard to stomach for those who view humans as ecological transgressors. “It's just assumed biodiversity is good,” and such things as grazing and invasive species are bad, says Ed Rykiel of Washington State University, Richland, who organized the symposium on advocacy at the ESA's annual meeting. However, he says, “all ecological systems are dynamic. Is that good or bad?”

    Albatross or badge of honor?

    Forecasting environmental disasters often requires taking a value-laden leap of faith beyond the present state of knowledge. “Sometimes we extrapolate from our data and we don't know if that [scenario] will be true under new conditions,” says Jim MacMahon of Utah State University in Logan. And when the data don't track, the predictions can go belly up. Many point to dire warnings in the 1970s by Ehrlich and others of runaway population growth—a scenario that didn't play out as predicted. It happened again in the late 1980s, when drought in the U.S. Midwest was linked to global warming. “Every instance of advocacy [that] espouses something beyond what's known and is presented as science destroys the credibility of real science,” says David Tilman of the University of Minnesota, Twin Cities, who argues that environmental and industry groups are more often to blame than academics.

    One argument in favor of saving species is a recent flash point. A common theme in ecology, and one picked up by environmentalists, is that a swath of land bursting with a wide array of species is healthier and more productive than an ecosystem with just a few species. Some studies of grasslands have shown just that, but others have not. “There needs to be a lot more careful research done … about what biodiversity does in systems,” says Steward Pickett of the Institute of Ecosystem Studies in Millbrook, New York.

    Some ecologists also argue that when scientists become wedded to a position, they may—perhaps unconsciously—ignore findings that don't square with their values. For instance, Bill Parton of Colorado State argues that environmentalists and even some of his colleagues give short shrift to findings suggesting that carbon dioxide pumped into the atmosphere by human activity will boost crop yields in some areas, particularly arid regions, perhaps outweighing the negative effects of hotter temperatures for those regions. If scientists put forward what he argues should be a far more mixed message, they might appear more honest—and less like advocates—than they do now. Such a change in tone might make Republicans in Congress less skeptical of ratifying the Kyoto climate treaty. “If you present a balanced approach, people might not confuse you with environmentalists,” Parton says.

    Wiens of Colorado State argues that some scientists let their values erode their objectivity in assessing the ecological damage wrought when the Exxon Valdez spilled 11 million gallons of oil into Alaska's Prince William Sound in 1989. “Everybody's preconception was that this was bad and it was going to be an environmental disaster,” says Wiens, who received funding from Exxon to study seabird recovery. That, he says, was exactly what the early research tended to find—until better designed studies found that, while hit hard initially, many birds and other species bounced back fairly quickly (Science, 9 April 1999, p. 247).

    “When scientists become activists without hard evidence to back [their positions] up, they run the risk of being decloaked. People find out the emperor has no clothes,” says Wiens. That undermines “the credibility of scientists as a whole.” The result, claims Fred Wagner of Utah State, is that “ecologists have an image problem.” The public, he says, has come to view him and his colleagues as “environmental advocates with college degrees.”

    It can be hard for environmental scientists to hold back until uncertainties are resolved, however, when their results point to a catastrophe in the offing. Ozone depletion is a classic case. After publishing a paper in 1974 hypothesizing that chlorofluorocarbons from aerosol cans and refrigerators were destroying Earth's protective ozone layer, atmospheric chemists F. Sherwood Rowland and Mario Molina argued vehemently that releases should be slashed. This was years before the Antarctic ozone hole appeared. But the fact that ozone depletion was a “global effect” and that any action would take many years to kick in made it more urgent, says Rowland, of the University of California, Irvine: “I thought that the possible consequences were severe enough that one should not sit back and watch this for a while to see what happens.”

    Many other ecologists agree that advocacy stems directly from their science. “The idea that we can draw a line down the center of ourselves and say, ‘This is purely our science and this side is purely our values’ is ridiculous,” says Alison Power of Cornell University, who's spoken out on the ecological risks of genetically modified plants. She and others point to scientist-activists who have maintained solid reputations as researchers, such as Stanford's Ehrlich, a National Academy of Sciences member. Pimm says there's no harm in speaking out—even being wrong—because science has a safety net, peer review, that corrects exaggerations. “The reality is, there is an enormous number of checks and balances,” he says: “the very rigorous, brutal selection of ideas.”

    How to take a stand

    Although many ecologists are willing to wade into this moral and political quagmire, they say they would feel more comfortable if their savvier colleagues laid down some ground rules. For instance, a conservative approach might be to limit oneself to presenting data and discussing uncertainties, without venturing an opinion on policy actions. “It's advocacy for science, in a way,” says Stanford's Matson.

    ESA leaders insist that they firmly toe this line. For example, an ESA panel last year completed a joint report with the Union of Concerned Scientists, an advocacy group, on the potential effects of climate change in California, but let the union take the results to Capitol Hill. Others feel more comfortable singing in a choir—on National Academy of Sciences panels, for instance.

    Those willing to go a step further and offer their scientific take on policy could present a range of alternatives—for example, the odds that a salmon population will be wiped out if a dam is or is not built. “Scientists should be providing information rather than advocating any particular solution,” says UW's Franklin.

    The most aggressive scientist-advocates claim they can lead a successful double life. The key, they say, is to make it clear that when they are taking a stand, they are doing so not as a scientist but as a citizen, and that their views are based on values. “I try to make that distinction clear to people,” says Environmental Defense's Wilcove. Many advocates who spoke with Science said they wear “two hats,” as a scientist and as an activist. Watling, for example, says he has defended his reputation by continuing to publish research, even while writing commentaries—including ones comparing trawling to clear-cutting a forest—“termed rants by my colleagues,” he says (Science, 18 December 1998, p. 2168). But he and others admit that reporters, particularly those coming in cold to report on an issue, often don't see the difference.

    The debate isn't going away anytime soon. Pickett says he is preparing a white paper for the ESA aimed at lawmakers and others that will “clear up some misconceptions” about the differences between an ecologist and an environmentalist. (Rykiel of Washington State thinks the ESA should produce guidelines for its members on where to draw the line on advocacy, although society officials say they have no immediate plans for that.) Meanwhile, the Society for Conservation Biology has commissioned a panel of members to hammer out an issue paper on the topic—though they're still struggling to “define advocacy,” says Gary Meffe of the University of Florida, Gainesville, editor of the society's journal.

    Whether their colleagues are right or wrong, many ecologists staunchly defend the right to speak out, even when the science is unclear. “If some people didn't feel deeply about some of these issues, scientists never would have pursued them and we would not know the vast majority of what we know in science,” says Tilman. “I don't think there's anything wrong with conveying these hunches when they're relevant to society.”


    Citizen-Scientist Guru

    1. Jocelyn Kaiser

    Stephen H. Schneider is the quintessential media-savvy scientist-advocate. Since the early 1970s, this climatologist and science popularizer has been a fixture on TV news shows, on Capitol Hill, and on White House panels, where he weighs in on both the politics and science of climate change. In Schneider's opinion, scientists sometimes need to dramatize their data and discard the subtleties to sell a message.

    A mechanical engineer by training who is now a biology professor at Stanford University, Schneider, 55, says his advocacy began with the heightened environmental awareness that bloomed around the time of the first Earth Day in 1970. Then a plasma physics and engineering grad student at Columbia University, he remembers a talk by biologist and erstwhile presidential candidate Barry Commoner—himself an ardent advocate—claiming that pollution was poised to send Earth's climate off kilter: Either airborne particles would bring on a mini-ice age, or carbon dioxide would trigger global warming. Intrigued, Schneider took a summer job as a computer programmer for planetary scientist S. I. Rasool, who asked him to model both grim scenarios. Their 1971 paper in Science landed global climate change—specifically, a major cooldown—in the pages of leading newspapers, which eagerly quoted the articulate young postdoc.

    By 1975, Schneider's more refined models pointed toward warming. Not missing a beat, he began warning of the havoc rapid global warming might bring. Unlike many of his colleagues, Schneider feels at ease weighing in on policy issues such as the 1997 Kyoto protocol to reduce carbon dioxide emissions, which he thinks doesn't go far enough over the long term. He says he gained this expertise early in his career by hobnobbing with social scientists at places like the Aspen Institute: “I taught them climate, they taught me economics.”

    Schneider recommends “three rules” of advocacy: explicitly stating when one's views reflect values rather than science; using colorful, easy-to-grasp metaphors; and producing a “hierarchy of products,” ranging from sound bites to op-eds to scholarly papers to lengthy books “where you can put in all the caveats.” At the same time, he says scientists shouldn't shy away from painting “scary scenarios”—such as deadly heat waves in New York City and a dried-up Mississippi River as possible results of global warming—to get a message across.

    Schneider says he gets “frustrated” by “all the false spin on my motives or advice” from the likes of conservative columnists George Will and Charles Krauthammer, who have trumpeted his 30-year-old paper on global cooling to question his credibility on global warming. But controversy hasn't made him gun shy. Lately, Schneider has been urging his colleagues working on the next Intergovernmental Panel on Climate Change report to overcome their natural reluctance to describe the most extreme possible outcomes, caveats and all. “Policy people are notoriously bad at translating science,” he says. And if scientists don't speak up, “who's going to talk about it? Somebody less qualified or with an agenda?”


    A Reluctant Warrior

    1. Jocelyn Kaiser

    Gene Likens never intended to let himself get drawn into the maelstrom of environmental politics. But that was before his low-key style of activism earned him a sterling reputation both as a researcher and as an advocate for bringing attention to the problem of acid rain.

    When the 67-year-old ecologist began his research career in 1962 at the Hubbard Brook Experimental Forest in New Hampshire's White Mountains, he wanted to know how nutrients cycle through a watershed. But his team's meticulous measurements revealed a more insidious threat: increasingly acidic rain and snow were steadily lowering the pH of lakes and streams and killing fish. By the time acid rain made it onto the environmental agenda in the early 1970s, Likens was the leading scientific voice on the issue—and a target of industries blamed for releasing too much sulfur dioxide, nitrogen oxides, and other acrid pollutants. While defending his science, Likens briefed President Reagan in 1983, testified before Congress, and advised a massive government study documenting the tie between acidic waters in the Northeast and coal-burning power plants in the Midwest. These efforts culminated in the 1990 Clean Air Act amendments to reduce sulfur dioxide emissions. Recently, Likens has taken up the cause again, arguing that pollution regulations don't clamp down hard enough on nitrogen oxides, and that forests still haven't rebounded from decades of nutrients leached from soil by acid rain (Science, 12 April 1996, p. 244).

    Colleagues describe Likens as an advocate with his views firmly rooted in basic science. “He's cautious but not to the point of being paralyzed,” says Duke University ecologist Norman Christensen. Likens, who now heads the nonprofit Institute of Ecosystem Studies in upstate New York, has blended his commitment to research with a more subtle brand of activism: As president of the Ecological Society of America in 1981, Likens lobbied hard for what became the National Science Foundation's Long Term Ecological Research sites, which he felt were essential for amassing the kind of data necessary to convince policy-makers that certain environmental problems were real and were not going away.

    A longtime board member of Environmental Defense, Likens says he often mulls the fine line between environmentalism and ecology. “There's tremendous public confusion, because we often work on the very same things,” he says. One way he counters this is by “trying very hard not to let my emotions and my personal views color my science.” And when he's asked his views on a policy question, “I will say, I'm going to take off my science hat and give my opinion as a person.”

    Despite Likens's high scientific standing—he's a National Academy of Sciences member and a 1993 co-winner of the Tyler prize, considered the Nobel of ecology—you won't often find his name in letter-writing campaigns or on commentaries. Likens says he avoids getting caught up in what he calls “the Nobel syndrome”: weighing in on issues he hasn't studied directly. He urges younger scientists to concentrate on building a strong research record before becoming too active in environmental issues. Says Likens, “You shouldn't speak out unless you have something to say.”


    Role Model for Ecology's Generation X

    1. Jocelyn Kaiser

    She made Newsweek's 1997 “Century Club” of 100 people to watch in the new millennium. She has to her credit five papers that have appeared in Science and Nature. At 35, Gretchen Daily has already made a name for herself as a leading voice in ecology—and not just because she toils hard in the field studying ecosystems. Daily has carved a niche in ecological economics, an emerging discipline that argues for saving habitats and species not only for their intrinsic ethical value, but for what they're worth in cold hard cash.

    Daily is emblematic of a new generation of ecologists who are motivated by strong environmental values and generally feel comfortable surfing the breakers where science washes onto the shores of policy. As an undergraduate at Stanford, she worked as a researcher for the Worldwatch Institute's Sandra Postel on issues such as global water shortages; then as a grad student she broke new ground in the biology department by completing a doctorate that blended science and policy. She studied which plants and animals were likely to survive land development in a Rocky Mountain ecosystem and also explored which species society would want to save. She stayed on at Stanford as a researcher, often collaborating with population biologist Paul Ehrlich.

    Daily soon joined a few other pioneering ecologists in blazing a trail in ecosystem services, a new area that attempts to put a price tag on natural habitats. Ecological economists might argue, for example, that preserving a watershed is a cheaper way for a city to clean its water supply than is building a purification plant. Daily sees this as a “promising” new approach to environmental protection, because it appeals to businesspeople. She admits it's a gamble: It might be hard, say, to make a case on economic grounds for preserving a wetland rather than building a new shopping mall. But it's a risk that must be taken, she says: “The ethical arguments for saving biodiversity and the environment are not winning the war.”

    Daily tries to avoid being viewed—and possibly dismissed—as a one-sided environmental activist. She makes explicit her assumptions, for example when she suggests that preserving native habitat next to farmers' fields can help boost crop yields by contributing pollinating insects. And she lays out options without “making a judgment as to which is better or worse.” Daily says scientist-advocates are more apt to be taken seriously if they present a consensus, such as by running with a pack of authors when airing commentaries.

    For Daily, there's no question that her work is motivated by caring about the environment. “If I were not in this area of science, I would definitely be an environmentalist. But I try to just think about all these issues as problems to be tackled somewhat dispassionately.”


    A New Breed of Scientist-Advocate Emerges

    1. Kathryn S. Brown*
    1. Kathryn S. Brown is a writer in Columbia, Missouri.

    Conservation biologists clearly want to influence policy. After 15 years of frustration, practitioners are beginning to learn the fine art of making a difference

    For months, David Wilcove peppered the U.S. Fish and Wildlife Service (FWS) with letters protesting the agency's plans to save the threatened Utah prairie dog. Wilcove, a conservation biologist, and his colleagues at Environmental Defense in Washington, D.C., argued that FWS was putting too much emphasis on protecting prairie dogs on federal lands, when most of the animals now live on private land and cannot be relocated easily.

    In the midst of this typical conservation battle—scientist-advocates on one side, resource managers on the other—Wilcove made an atypical move. Conceding that his organization and the FWS were both shooting from the hip, making cases based on skimpy data, he flew a team from Princeton University to Utah last November to meet with agency managers and Environmental Defense officials. The Princeton group, led by biologist Andrew Dobson, began working up what the cash-strapped FWS could not afford to do on its own: a model on how various factors, from climate to disease epidemics, would affect Utah prairie dogs. “When the study is done this spring, we'll all have a better blueprint for determining the relative importance of public and private lands,” Wilcove says.

    That kind of cooperation is a novel way to get more science into resource management decisions. Week in and week out, managers dictate which sections of forest to sell to logging companies, which wetlands to pave over for houses, and which prairies to till into pastures. Such decisions often are justified by price tag or politics, but it's rare that more than lip service is paid to science. Part of the problem is that many scientists are hesitant, or unable, to participate in the process. “Academics don't know how to affect policy, and they don't communicate with managers very well,” says Michael Soulé, a professor emeritus at the University of California, Santa Cruz.

    The disconnect between science and management is disconcerting to researchers who launched the Society for Conservation Biology (SCB) in 1985. “Our mission was to provide the scientific tools and ideas to protect nature,” says Soulé, who served as SCB's first president. Fifteen years later, however, he and others say that scientists are still struggling to influence policy decisions. “The nuts and bolts of conservation biology just aren't working,” says Barry Noon, a biologist at Colorado State University in Fort Collins.

    Alarmed by their own irrelevance, conservation biologists are now taking steps to make their voices heard. While many ecologists agonize over whether to weigh in on policy issues (see previous story), conservation biologists are taking the offensive. SCB plans to unveil a magazine designed for resource managers that's packed with case studies and the latest biology. Meanwhile, a new program sponsored by The Nature Conservancy in Arlington, Virginia, plans to put some 50 biology postdocs into the field for 2 years at a time to learn from resource managers. And many seasoned conservation biologists are teaming up with resource managers to rethink endangered species recovery plans, evaluate marine parks, or simply get a dialogue going.

    Different worlds. Like the Grand Canyon, the gulf between scientists and managers is deep and has been around a while. That's partly due to the mandate of many agencies to help the private sector access and exploit natural resources. The Forest Service, for example, clears the way for timber sales, the Department of Agriculture pushes the conversion of wild lands into agricultural fields, and the National Park Service establishes campsites and other services for tourists at scenic destinations. Ecologists, starting with Aldo Leopold, entered the scene in the 1930s and grew increasingly vocal in the 1960s. “That's when people started pushing for ecosystem management,” says Jack Oelfke, a resource manager at Isle Royale National Park in Houghton, Michigan.

    But this message is not sinking in, observers say. As a case in point, Noon, graduate student Jennifer Blakesley, and their colleagues have spent the last decade capturing and marking spotted owls in California's Lassen National Forest. By their count, about 8% of the territorial birds are disappearing every year. Their reports have repeatedly called on the Forest Service to save larger patches of old-growth trees, where the birds prefer to nest. “Our data are convincing, and the Forest Service has a legal responsibility to monitor this species,” Noon says. But service managers have not responded to his team's findings, he adds.
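An 8% annual loss compounds quickly. A short calculation (an illustration of the arithmetic only, not the team's actual population model) shows why a decade of such decline alarms biologists:

```python
import math

# An ~8% annual loss means each year's population is 0.92x the previous
# year's: N(n) = N0 * (1 - rate)**n.
rate = 0.08
remaining_after_decade = (1 - rate) ** 10            # fraction left after 10 years
years_to_halve = math.log(0.5) / math.log(1 - rate)  # time to lose half the birds

print(round(remaining_after_decade, 2))  # 0.43 -- under half remain
print(round(years_to_halve, 1))          # 8.3 years per halving
```

At that rate, fewer than half the territorial owls would remain after a decade of inaction.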

    Scientists also have a blind spot: They often ignore the politics and economics of resource management decisions, says former Forest Service director Jack Ward Thomas, who is now with the University of Montana, Missoula. “Researchers present results as if they were handed down from the heavens on inscribed tablets—the best scientific alternative is the only one,” says Thomas. In reality, he notes, science is just one factor in conservation decisions: “It's up to the politicians and decision-makers to weigh the costs and benefits.”

    Two years ago, scientists handed down a major indictment of weak science behind management decisions. A study led by Peter Kareiva, an ecologist now with the National Marine Fisheries Service, found that weak data lay behind 233 habitat conservation plans (HCPs). These are federal agreements that allow private landowners and others to wipe out some members of an endangered species in return for substantial efforts to protect the population's habitat (Science, 19 December 1997, p. 2052). “FWS is much too comfortable with using expert opinion, assumptions, and guesses in place of hard empirical evidence,” says Kareiva.

    FWS managers respond that HCPs are a legal mandate, and they don't have the luxury of waiting for science to give them more definitive answers. “The bottom line is, we have to make a decision, and we have to use the best science we've got,” says FWS biologist Deborah Crouse. “If we wait 5 years for better answers, the species might just be gone.”

    Getting the word out. Many scientists blame themselves for not getting political mileage out of their findings. “If you assume your findings will be translated into management or policy, you're wrong,” says Gary Meffe of the University of Florida, Gainesville. Adds ecologist Peter Stine of the U.S. Geological Survey in Sacramento, California: “There's so much managers could gain from what researchers have learned, if only we could synthesize the information for them.”

    That's precisely what some scientists are aiming to do. For starters, the SCB this spring will launch a new magazine—Conservation Biology in Practice—that editor Kathryn Kohm of the University of Washington, Seattle, hopes will read like a Harvard Business Review for the conservation crowd. “Academics complain that managers don't read, but that's only because we haven't given them something worth reading,” Kohm says. The bimonthly is intended to help managers ground conservation strategies in the latest science. “Dumping information from the ivory tower down clearly isn't working,” she notes. “This is just one way to fill that gap.”

    Another strategy is a quarterly forum hosted by the Sustainable Ecosystems Institute (SEI), a nonprofit in Portland, Oregon, that analyzes ecological issues. SEI forums bring together scientists, managers, politicians, and industry officials to debate issues—and get past their mutual mistrust. At a forum 2 years ago on science in resource management, for instance, SEI president Deborah Brosnan broke the ice with a skit in which she played “the scientist from hell”—nose in the air, demanding endless cash and time for experiments, and offering ambiguous results in return. Across stage, arms folded, was “the manager from hell.” She had zero money—and less patience—for slow-moving science. Shakespearean comedy it was not, says Brosnan, but “it opened the floodgates.” Scientists came up with a proposal to rate their degree of confidence in conservation recommendations in order to help managers weigh the options. At SEI, this practice has become routine.

    Such efforts toward rapprochement are a good start, say conservation biologists, who are counting on stronger advocates from the next generation to narrow the divide even more. The Nature Conservancy just launched its $9.5 million David H. Smith Conservation Science Fellowship Program, sending the first seven of 50 postdocs to work with managers in the field. The idea, says program director Guy McPherson, is “to grab the best and brightest headed into academia and expose them to the culture of on-the-ground conservation.”

    One Smith fellow, Jake Vander Zanden, a postdoc at the University of California, Davis, is plucking nonnative fish and amphibians from streams in the Sierra Nevada to help native populations rebound. “This is get-your-hands-dirty work,” says Vander Zanden, who earned his Ph.D. studying food webs in lakes.

    Progress may be halting, but scientists are beginning to find their voice, says Meffe. “Conservation biology is growing up.”


    At Home on the Range

    1. Kathryn S. Brown

    If there is a Shangri-La where conservation biologists and resource managers work in harmony, it just might be Illinois. While the two worlds often are out of sync elsewhere (see main text), a close-knit collaboration has nurtured the prairies and forests in this midwestern state. “We don't just publish a paper and send it out to resource managers,” says Scott Robinson, a biologist who divides his time between the University of Illinois and the Illinois Natural History Survey in Champaign. “We talk to them, we give presentations, and it makes a difference. We also get a lot of good ideas from them.”

    One recent victory for the team is a safer home for the Henslow's sparrow. In decline nationwide, these birds tend to avoid nesting in recently burned grasslands. That's a problem for Illinois managers, who burn substantial areas of prairie every year to keep the land healthy. In 1991, biologist James Herkert of Illinois's Endangered Species Protection Board wondered if there was a way to reconcile the two conflicting conservation goals. For 5 years, he and William Glass, a resource manager with the Illinois Department of Natural Resources, tested the effects of various fire regimes on Henslow's sparrows at Goose Lake Prairie, a 650-hectare haven for the birds. They discovered that the birds favor certain patches of the prairie, establishing more nests in preferred areas not burned in back-to-back years. “It was good science, and we changed our management scheme to reflect it,” says Fran Harty, a resource administrator with the department. The two sides have brought their partnership to bear on other conservation issues as well, such as deciding which parcels of forest to buy up as habitat for native warblers, herons, and hawks in the Cache River watershed.

    Both camps say they have learned a lot along the way. Managers should not be shy about telephoning scientists, Harty says: “If you just start talking, you can get a wealth of information.” Conservation biologists, meanwhile, should learn to live with imperfect information, Robinson says: “By the standards of experimental science, a lot of management recommendations are based on pretty flimsy evidence. Data are often correlative and inconclusive. But we have to be willing to enter the decision-making arena with the best advice we have.” Sometimes the key is simply finding the right ear to bend. “It's important to single out the open-minded managers,” says Robinson. “They're out there.”


    Knotted Jets and Odd Quasars Reveal Secrets by Radio

    1. Dennis Normile

    SAGAMIHARA, JAPAN—More than 100 radio astronomers from around the world gathered here last month* to review early results of the first radio observation program to make use of a space-based antenna, along with related topics. The Highly Advanced Laboratory for Communications and Astronomy (HALCA), launched by Japan's Institute for Space and Astronautical Science in 1997, was designed to make observations at three frequencies—1.6, 5, and 22 gigahertz. Unfortunately, the highest frequency band never worked properly, and problems with orienting the spacecraft and with its power supply have left it capable of making observations only at certain points in its orbit. Even so, HALCA, in combination with ground-based telescopes, is helping answer some long-standing puzzles.

    Telltale Jets

    Active galactic nuclei (AGNs) are among the most fascinating and puzzling structures in space. An AGN packs the energy output of an entire galaxy of stars into a region smaller than the solar system. At the center of each AGN, scientists believe, lurks a black hole that sucks in gas and dust from a surrounding accretion disk of rotating matter. Perpendicular to the disks, enormous jets of gases hundreds of light-years long spurt from the AGN's core at velocities near the speed of light.

    How AGNs form and what drives them are puzzles. But scientists are getting a closer look at the jets and how they change over time from the Very Long Baseline Interferometry (VLBI) Space Observatory Programme (VSOP), which uses HALCA in combination with ground-based radio telescopes to generate images with a resolution that earthbound equipment alone cannot approach.

    Ground observations had shown that the jets are made up of bloblike knots of material that seem to emanate from the core and move lengthwise along the jet. Now, VSOP has added a wealth of extra detail. “We can see it's not just a blob [of material in the jet] but that it's extended and moving around,” says Bernard Burke, a radio astronomer at the Center for Space Research at the Massachusetts Institute of Technology.

    One of the clearest examples of the increased resolution of VSOP images was presented by Jens Klare, a doctoral student in astronomy at the Max Planck Institute for Radio Astronomy, in Bonn. He and colleagues are studying the quasar 3C345. In a jet from the quasar, VSOP revealed that what looked like a single large blob (a “component,” in astronomer-speak) in images from ground arrays actually consists of three separate components. What's more, images captured a year apart seem to indicate that the knots are moving away from the core and that one is getting brighter while another dims. Klare suspects that the knots are following a helical path along the axis of the jet, dimming and brightening as they veer away from and toward Earth.

    But other observations suggest knots do not move away from an AGN's core. William Junor, an astrophysicist at the University of New Mexico in Albuquerque, and colleagues have been using both ground-based arrays and VSOP to study M87, the central galaxy in the Virgo cluster and one of the first radio galaxies in which a jet was observed. “We didn't see much movement” in the components of M87's jet, Junor says. This raises the possibility that jets from different sources behave differently, suggesting that the jets change over time or that different forces are at work depending on the scale of the AGN.

    A hint of what might be happening in the jets comes from recent three-dimensional computer simulations created by Kazunari Shibata, a theorist at Kyoto University's Kwasan Observatory in Kyoto. Based on magnetohydrodynamic equations, his simulations show gas and dust ejected from the core swirling into knots and rings. The knots and rings are more or less stationary, wobbling slightly toward and away from the core while the material that forms them continues on its way.

    So far, even the most sophisticated simulations can only model systems a tiny fraction of the size of the AGNs radio astronomers are observing. The complexity of the forces involved stretches the limits of supercomputers and their programs, Shibata explains. Still, astronomers say the simulations are coming of age. “We're seeing a convergence of the magnetohydrodynamic modeling and observations,” says Junor, and this will make simulations an increasingly useful tool for testing theories.

    Twinkle, Twinkle, Little Quasar

    One of radio astronomy's longest-running debates centers on distant objects whose radio emissions vary over time. Do the emissions really wax and wane, or is the variation caused by some sort of scintillation in the interstellar medium, the gas and dust between the stars? Astronomers generally accept that variability on the order of months or years comes straight from the source. But change over less than a day, so-called intraday variability, is harder to explain. Astrophysicists believe that the shorter the period of the variability, the more compact the source has to be. Yet these highly variable quasars appear to emit thousands of times more radiation than theory allows for such compact sources.
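The compactness argument rests on light-travel time: a source cannot vary coherently on timescales shorter than light takes to cross it. A back-of-the-envelope sketch (illustrative numbers, not figures from any particular study) for variability on the order of an hour:

```python
C = 2.998e8    # speed of light, m/s
AU = 1.496e11  # astronomical unit, m

# A coherent brightness change over ~1 hour bounds the emitting
# region's size at roughly c * dt.
dt_seconds = 3600.0
max_size_au = C * dt_seconds / AU

print(round(max_size_au, 1))  # ~7.2 AU: smaller than the solar system
```

Squeezing the observed radio power into a region only a few astronomical units across is what strains the theory.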

    David Jauncey of the Australia Telescope National Facility and colleagues have now pinned the intraday variability of at least one radio source on the interstellar medium. The source, known as PKS 0405-385, is extremely variable: Its emissions nearly double in intensity and then fade within an hour.

    Jauncey and his colleagues theorized that they might be able to tell where the variability was coming from by precisely timing when changes in the radio signals arrived at radio telescope arrays on opposite sides of Earth. Because PKS 0405-385 is halfway across the universe, signals due to changes in the source itself would reach the arrays simultaneously, give or take a few milliseconds. But if the variability arises in the relatively nearby interstellar medium of our own galaxy, the signals reaching two arrays might form different patterns or arrive at different times.

    Using the Very Large Array, a set of 27 radio telescopes near Socorro, New Mexico, and the Australian Telescope Compact Array, a set of six telescopes in Narrabri, New South Wales, the astronomers found that both arrays detected very similar patterns of variability. But the signals arrived in New Mexico about 2 minutes before reaching Australia—much too long to be explained by one array being closer to the source or by experimental error.
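The logic of the two-array comparison can be sketched as a cross-correlation lag estimate: slide one light curve against the other and find the delay that maximizes their correlation. This toy version (simulated data and made-up numbers, not the team's actual pipeline) recovers a known delay:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a slowly varying "scintillation" pattern sampled once per second,
# then let the second array see the same pattern 120 samples later.
n, true_lag = 4096, 120
base = np.convolve(rng.normal(size=n + true_lag), np.ones(50) / 50, mode="same")
at_array_a = base[true_lag:]  # this array sees changes first
at_array_b = base[:n]         # same pattern, delayed by true_lag samples

# Cross-correlate the mean-subtracted light curves; the peak's offset from
# the center of the full correlation is the estimated delay.
corr = np.correlate(at_array_b - at_array_b.mean(),
                    at_array_a - at_array_a.mean(), mode="full")
estimated_lag = int(np.argmax(corr) - (n - 1))

print(estimated_lag)  # recovers the imposed ~120-sample delay
```

A delay consistent with zero would point to the source itself; a significant nonzero delay, as Jauncey's team found, points to scintillation in the intervening medium.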

    “The conclusion is inescapable,” Jauncey says. “It is interstellar scintillation that is at least a major cause of this intraday variability.” William Junor agrees: “It looks pretty conclusive and comprehensive.”

    If other groups get similar results with other sources, the technique could provide a means of probing the interstellar medium. “I think this result is going to motivate a lot of work on other [intraday variable] sources,” says MIT's Burke.

    • *The VSOP Symposium, “High Energy Astrophysical Phenomena Revealed by Space-VLBI,” 19-21 January


    Patent Office May Raise the Bar on Gene Claims

    1. Martin Enserink

    But NIH officials worry that the bar might not be high enough to keep out unwarranted claims, which they say threaten to stymie research

    The U.S. Patent and Trademark Office (PTO) is inching toward resolution of an issue that has dogged it, and the biomedical research community, since the early 1990s: What sort of genetic information is patentable? Over the past decade the PTO has been deluged with applications for patents on millions of gene fragments. Yet most have been stalled because of enduring questions over exactly what can be patented.

    Now, in a major shift, the PTO has proposed a policy that will raise the bar for patent applications on DNA—a change that could lead to the rejection of many of those idling claims. Although the proposed change is welcomed by many in the research community, some, including top scientists at the National Institutes of Health (NIH), argue that it still does not go far enough. Unless the PTO tightens its rules further, they warn, research and innovation could be stifled by a quagmire of overlapping rights and claims. “This could be a big disincentive for biomedical research,” says Maria Freire, director of the Office of Technology Transfer at NIH.

    By all accounts, the stakes are enormous. To reap the harvest of the genome era, companies have invested hundreds of millions of dollars in uncovering genes on which new drugs and diagnostic tests can be based. Analysts say their business strategies are at least partly based on the assumption that they will own the rights to exploit that genetic knowledge. Few in the biomedical community argue against that basic position. Without some form of intellectual property protection, pharmaceutical companies would not bet large sums on developing gene-based drugs, and those drugs would never reach the market.

    The question, then, is how much someone needs to know about the usefulness of a piece of DNA—its “utility,” in patent law terms—to merit a patent. NIH officials and many other publicly funded scientists argue that no DNA patent should be granted unless researchers know a gene's full sequence and have figured out what protein it produces and what that protein does in the cell. The first hard-won gene patents, issued in the 1970s and 1980s, met those criteria, because researchers often started with a known protein and worked their way back to the encoding gene—a difficult and laborious process.

    Since then, new and less cumbersome ways to find genes have emerged. In the early 1990s, scientists discovered a way to identify short scraps of DNA—called expressed sequence tags (ESTs)—about which they knew little more than that they belonged to some gene that was switched on somewhere in the body. That didn't deter researchers from applying for patents, however. Often, ESTs were ascribed some unspecific type of utility—for instance, that the sequence could be useful in forensic science or could help find genes on a chromosome. Nobody knows exactly how many EST applications have been filed, but millions are believed to be in the queue at the patent office. Until recently, the agency had indicated it would award such general claims—for instance, John Doll, PTO's director of biotechnology, outlined such a policy in a commentary published in Science (1 May 1998, p. 689).

    But, in what Doll concedes is a “significant change” in policy, the agency has decided that patent applicants must demonstrate a more “substantial, real-world utility; not some throwaway utility.” As a result, many EST applications “will have a difficult time” meeting the utility requirement, says Doll. This proposed change is spelled out in a set of new guidelines for the agency's patent examiners—the people who judge the validity of claims. Published in the Federal Register on 21 December, the guidelines are open for public comment until 22 March.

    Experts say the proposed change could hurt companies that have applied for patents on large numbers of ESTs, such as Incyte of Palo Alto, California, and Human Genome Sciences, based in Rockville, Maryland. But Lee Bendekgey, general counsel at Incyte, says his company is confident its 1.2 million ESTs are patentable. “We never filed applications where we didn't know what the EST did,” says Bendekgey. A spokesperson at Human Genome Sciences said the company is studying the proposal and declined to comment.

    For its part, NIH is “very pleased” by the proposed change in policy, says Freire. NIH officials worry that under the current regime, an EST patent might give the patent holder rights over not just that snippet but also the full-length gene, if it is later characterized by somebody else. Because many ESTs can originate from the same gene, several patent holders could all have a share in a single gene—a recipe for disaster. “That was indeed a frightening prospect,” says Iain Cockburn, a finance and economics professor at Boston University School of Management.

    But NIH is decidedly unhappy about the PTO's position on another class of gene patents, also contained in the proposed guidelines. Since the advent of sophisticated gene-hunting software, researchers have been able to take a gene, or even just a piece of a gene, plug it into a computer, and instantly turn up vast amounts of intriguing but theoretical information about it. For instance, a gene might resemble one that produces a protein involved in intracellular transport in the fruit fly. Or it might produce a protein that, judging by its hydrophobic nature, probably floats in a cell membrane. Such searches have become the mainstay of companies like Rockville-based Celera Genomics and Incyte. Already, thousands of patent applications have been filed on genes that have been characterized only through computer searches—without doing a single experiment or “getting your pipette wet,” as one critic puts it.

    Much to the dismay of NIH, patent officials say that under the new guidelines, such applications will likely pass muster. Searching sequence databases for similar genes has become common practice in genomics, explains Doll: “Scientifically, it's very well established and very well accepted in the academic community.” Indeed, the patent office has already awarded one such patent, to Incyte in 1998, on a set of ESTs believed to encode a family of 44 enzymes called kinases.

    In a polite but spirited letter-writing campaign in December, then-NIH director Harold Varmus and Francis Collins, director of the National Human Genome Research Institute, wrote to PTO Commissioner Q. Todd Dickinson to voice their opposition. Varmus and Collins wrote that they were “very concerned with the PTO's apparent willingness” to grant claims based on such “theoretical” functions. They argue that although databases may give researchers tantalizing hints about what a gene could do, they don't prove anything, let alone give researchers new ideas for drugs. Therefore, such searches should not be sufficient to enable scientists to lay claim to a gene. NIH officials worry that, if approved, such claims could impede research by other investigators. “In 3 or 5 years, people can come and say: ‘Hey, you can't be working on that gene. That's mine,’” says Freire. “That's a very scary proposition.”

    Not surprisingly, the genomics industry is pleased with this part of the proposal, arguing, as does Doll, that homology searches are an accepted way to ascribe function to a gene. “Everybody uses these techniques,” says Incyte's Bendekgey, “and they are virtually 100% correct.”

    In the end, determining what is and isn't patentable will likely be decided in court, as the PTO's decisions must stand up to legal challenge. Doll says that if the PTO rejects a patent application based on a database search, that decision is likely to be overturned by the court. Patent experts tend to agree. The courts have never enforced the utility requirement very strictly, says Rebecca Eisenberg, a patent law scholar at the University of Michigan, Ann Arbor. As far as existing law is concerned, she says, the new policy “is probably on pretty safe ground.” The only way to be sure is to take it to court. That's why the PTO is now preparing a test case: It will issue a patent to an applicant who has volunteered to have it challenged by a third party. The agency declines to reveal details about the case.

    No one is willing to bet on the outcome. Nor is anyone certain what would happen if genomics companies actually obtained patents on the thousands of genes they claim to have found but have not fully characterized. “Are we heading for a situation where nobody can do business without negotiating 400 agreements? That's a possibility,” says Cockburn. On the other hand, NIH officials may be overly pessimistic, and such complexities may be sorted out in the marketplace. Whatever the outcome, Cockburn says, it's likely to lead to several high-profile patent infringement lawsuits.

    That's nothing new. Virtually every major therapeutic product to emerge from the biotech field has been the subject of intense and often bitter litigation, says Cockburn: “It's one of the sad features of the biomedical industry.”