News this Week

Science  11 Dec 1998:
Vol. 282, Issue 5396, pp. 1962


    Use of Stem Cells Still Legally Murky, But Hearing Offers Hope

    1. Eliot Marshall

    Biologists who were called to Capitol Hill last week to testify about the ethics of their research on human stem cells were expecting “a thunderstorm,” as one of the witnesses said. Their controversial work, which extracts cells from human embryos and aborted fetuses and coaxes them to grow in cell lines capable of developing into any tissue type, made headlines in November (Science, 6 November, p. 1014). But the atmosphere in the 2 December congressional hearing was calm. Members of the panel—the Senate appropriations subcommittee for health and human services, chaired by Arlen Specter (R-PA)—seemed more interested in biology than bombast.

    If researchers were hoping that the hearing would clarify whether they can use these versatile cell lines in federally funded research, however, they were disappointed. For the third year in a row, Congress has passed an appropriation bill that forbids funding research in which an embryo is “destroyed, discarded, or knowingly subjected to the risk of injury or death.” The key question is whether this ban applies to research using the new stem cells because they were derived from embryos. National Institutes of Health (NIH) director Harold Varmus testified that the Administration is still studying the legal issues. And although Specter and the ranking Democrat, Tom Harkin of Iowa, said that they want to encourage this research, Specter said Congress is likely to move slowly in reviewing whether the law needs to be changed.

    In an interview after the hearing, Specter said that new stem cell research “has tremendous practical applications,” adding that it is “obviously on the cutting edge, and that's why Tom [Harkin] and I decided we ought to move ahead” with a public inquiry before the 106th Congress begins in January. But he said, “There's going to be a lot of controversy,” and “I don't think there can be a rush to judgment.” Specter noted that Harkin had already concluded that NIH would not violate current federal law if it funded experiments using the new stem cell lines because—as researchers testified at the hearing—the cells cannot develop into embryos without radical experimentation (which no one is attempting). But Specter said, “I would not want to make that legal judgment based on this state of the record and my knowledge. I think that, for that conclusion to carry public support, you have to do it in a little more systematic, thoughtful, recordmaking way.” Varmus, at least, came away encouraged. When he met with his advisory council the next day, he described the hearing as “an upbeat conversation.”

    In suspense.

    Harold Varmus (left), James Thomson (center), and John Gearhart (right) await a decision on funding of stem cell research.


    The impetus for reexamining the law comes from announcements by two academic biologists. James Thomson of the University of Wisconsin, Madison, and John Gearhart of The Johns Hopkins University in Baltimore revealed in November that they have established long-lived cultures of human stem cells. They hope that these cells can be used to create transplant tissue for people who cannot find suitable donors. Gearhart piqued everyone's interest at the Senate hearing by displaying photos of human neurons derived from his cells. He and others predicted that within 10 to 20 years it will be possible to grow healthy neurons to replace damaged brain cells in people with Parkinson's disease.

    “I've been hearing from many scientists” who want to work with the new cell lines, Varmus says. But so far, NIH hasn't allowed any NIH-funded researchers to do so, because the methods of deriving these cells may cross into forbidden territory. (Gearhart and Thomson both relied on private money to develop the cell lines.) Cells obtained by Gearhart's method are less controversial because they come from aborted fetuses, and federal guidelines since the 1970s permit some research on fetal tissue if the abortion clinic and the research lab are separate. But Thomson's cells are in a different category. They were extracted from embryos donated to research by couples who had undergone in vitro fertilization procedures. The experiments Thomson performed to establish the cell lines, all witnesses agreed, could not be supported with federal money under current law. But Harkin and others suggested that Thomson's stem cells—because they are not embryos—could be used by NIH-funded scientists.

    The only strong dissent from Harkin's interpretation came from Richard Doerflinger, a spokesperson for the Committee for Pro-Life Activities of the National Conference of Catholic Bishops. He noted that it would be a crime to do Thomson's experiments under the unusually restrictive laws of Pennsylvania—Specter's state. As for Thomson's cell lines, Doerflinger said that “ethical principles reflected in current law … argue against funding the research.” Doerflinger acknowledged, however, that there was no apparent barrier to federal researchers using Gearhart's stem cells—although Doerflinger made it clear that he disapproved.

    Several expert groups—in addition to members of Congress—are now deliberating on issues surrounding both types of cell lines. At the president's request, the National Bioethics Advisory Commission (NBAC) is conducting a comprehensive ethical review, due sometime next summer. Varmus says he may not need to wait for NBAC's conclusions, however, because he received good ethical advice from his own advisory panel on embryo research in 1994. But before he can act, he needs a response from the general counsel of the Department of Health and Human Services and the Office of Management and Budget at the White House on whether the current law bars federal support for work with the new cell lines. If they determine that it does not, researchers may be able to use the cells even if Congress does not change the law to make the permission explicit. “I hope we will have an answer to these questions soon,” Varmus said, but “I can't say how long it will take.”


    Scientific Panel Clears Breast Implants

    1. Jocelyn Kaiser

    Kicking off a momentous 2 weeks for science in the courtroom, a scientific panel on 30 November issued a long-awaited report finding no evidence that silicone breast implants cause systemic diseases in women. The report may lay to rest one of the biggest scientific-legal controversies of the decade, involving thousands of lawsuits seeking billions of dollars in damages. “It is absolutely as strong a report against the plaintiffs' position as one could imagine,” says Michael Green, a law professor at the University of Iowa, Iowa City.

    Legal scholars are paying close attention, because the panel is part of a sea change in courtrooms since a 1993 U.S. Supreme Court ruling called on trial judges to scrutinize the validity of scientific evidence themselves before it is presented to a jury. “Before, there probably never would have been a scientific panel in such really important litigation,” says Daniel Capra, a professor at Fordham Law School in New York City. Scientists may not be the only experts affected: Earlier this week the Supreme Court heard arguments in a case in which it could offer guidance as to when other kinds of expert testimony—including that from engineers and physicians—should meet scientific standards.

    The backdrop for all this is the 1993 Supreme Court decision in Daubert v. Merrell Dow Pharmaceuticals, in which the court called on federal trial judges to act as “gatekeepers” and screen out so-called junk science. The court suggested four tests, including whether an expert's views had been peer reviewed. Before then, the standard was “general acceptance” of the views. Although the decision has in some cases allowed into the record more novel kinds of testimony, such as DNA evidence, experts say Daubert has led overall to less scientific testimony being aired to juries.

    The Daubert ruling also triggered wider use of Federal Rule 706, a 23-year-old law that says federal courts can assemble their own advisers. That's what Judge Sam J. Pointer Jr. of the U.S. District Court in Birmingham, Alabama, did in October 1996, when he convened an independent panel to review evidence in several thousand lawsuits claiming that breast implants caused debilitating symptoms ranging from fatigue to sore joints. Pointer asked the four-person panel* to consider whether existing research “provide[s] a reliable and reasonable scientific basis” for concluding that silicone breast implants “cause or exacerbate” lupus or other connective tissue diseases, or “atypical” immune diseases, according to the report.

    Lawyers for each side winnowed more than 2000 studies and other documents to about 40 they deemed most important for review in each expert's area. The panelists also heard scientific witnesses. Their nearly 300-page report** finds that implants are not entirely benign: It says, for example, that animal studies show silicone breast implants can cause inflammation, and that silicone droplets may wind up in tissues far from the breasts. But the “preponderance of data” does not link these effects to autoimmune disease in people, the report says. The panel's epidemiologist, who conducted several analyses of data pooled from both published and unpublished studies, found “no association” between implants and connective tissue or immune system disease.

    Needles in a haystack.

    In a worst-case scenario, silicone breast implants would cause a handful of cases of these diseases, according to a scientific panel's analysis of pooled population studies.


    The clean bill of health thrills implant makers. “This is going to help bring an end to this controversy,” says Doug Schoettinger, managing trial counsel for Dow Corning. Ironically, Dow Corning, which is in bankruptcy, proposed to settle its suits for $3.2 billion just a few weeks before the scientific panel released its findings. The report, however, is expected to influence whether Dow Corning's adversaries settle or go to trial. In addition, videotaped depositions will be used in the cases overseen by Pointer.

    But the report's shades of gray—including its frequent criticisms of how studies were done—have led some experts to conclude that the jury is still out. “They're saying the science is inconclusive and in many ways contradictory,” says Robert Garry, an immunologist at Tulane University in New Orleans who studies women with implants. Indeed, adds Diana Zuckerman of the Institute for Women's Policy Research in Washington, D.C., the studies may not have identified problems that might develop several years after women get implants. Zuckerman says her group will reserve judgment until next year, when results are expected from a National Cancer Institute study of 17,500 women.

    For now the broader legacy of the Pointer panel is unclear. “It will be interesting to see if it has an impact on future toxic tort litigation given the expense and time that it took”—$800,000 from the Federal Judicial Center and 2 years—says Margaret Berger of Brooklyn Law School. One occasion for using such a panel, says Green, might be a class-action suit in which “the evidence is emerging” and thus hasn't been weighed by scientists; he points to mounting litigation involving fen-phen, the diet drug combination implicated in heart valve disease.

    Whether Daubert should apply to testimony from other experts, such as engineers and doctors, was the question before the Supreme Court earlier this week. The case, Kumho Tire Co. v. Carmichael, involves a minivan that crashed after a tire blew, killing one person; Carmichael, the victim's family, presented an engineer who claimed the tire was defective. Kumho's lawyers won in a trial court, which found that the testimony failed to meet Daubert tests. An appeals court reversed the decision, however, finding that technical testimony based on experience should not have to meet scientific standards (Science, 11 September, p. 1578).

    In the hearing, justices expressed a range of views. Several agreed it would be impossible to scientifically test, say, an art expert's assertion that a color in a painting was deep magenta. On the other hand, Justice Antonin Scalia echoed Kumho's argument that the tire expert's testimony should have met scientific standards because it was based on a methodology: process of elimination. The engineer had asserted that because the tire did not appear to have several indications of abuse, its failure must have been due to a defect. The court's ruling, if it issues one, is expected next summer.

    Clarifying how courts should evaluate expert opinion of all stripes will not be easy, says Berger, who co-authored an amicus brief for the Carmichael side. “I'm not sure you can come up with a magic formula.”

    • * Immunologist Betty Diamond of the Albert Einstein College of Medicine in New York City, epidemiologist Barbara Hulka of the University of North Carolina, Chapel Hill, toxicologist Nancy Kerkvliet of Oregon State University in Corvallis, and rheumatologist Peter Tugwell of the University of Ottawa.

    • ** See


    Argentina, and Perhaps Its Life, Took a Hit

    1. Richard A. Kerr

    The 10-kilometer-wide asteroid that wiped out the dinosaurs and many other species 65 million years ago was just one of a steady stream of debris of all sizes that has splattered the planet. Some impacts were small, leaving no more trace than a shooting star, while other, larger ones presumably could have triggered near-global crises. On page 2061 researchers suggest that a lesser impact showered coastal Argentina with blobs of molten glass 3.3 million years ago, perhaps cooling climate and driving some of the region's mammals to extinction. But other researchers say that although the impact looks real, its connection to climate change or extinctions is doubtful.

    Cratering specialist Peter Schultz of Brown University in Providence, Rhode Island, got his first clue to the impact 5 years ago on a visit to Argentina, when an interpreter mentioned odd green glass she had picked up as a child. Schultz eventually explored sea cliffs of windblown dust deposits called loess near the coastal town of Miramar, working with geologist Marcelo Zarate of the Regional Center of Scientific and Technical Investigations in Mendoza, Argentina. The cliffs expose a layer of glassy, bubble-filled slabs 0.5 to 2 meters across; called escorias locally and first reported in 1865, these rocks had been attributed to everything from lightning strikes to ancient human-tended fires.

    Sign of a killer?

    The impact that forged this 2-millimeter blob of molten glass in Argentina 3 million years ago may also have caused mammal extinctions.


    But after close study, Schultz, Zarate, and their colleagues conclude that an impact had fused loess into glassy slabs and flung them across at least 50 kilometers of the central coast of Argentina. The glass has streaky flow patterns typical of rapidly cooled impact glass, mineral breakdown products that require temperatures even hotter than those of lightning and volcanoes, and a chemical composition resembling that of the local loess. “It's fascinating stuff,” says meteoriticist and cratering specialist Theodore Bunch of NASA's Ames Research Center in Mountain View, California. “I think [the impact] interpretation is probably correct.” Schultz presumes that a body perhaps a kilometer in diameter hit just offshore, producing a now-buried crater perhaps 20 kilometers in diameter.

    Radiometric dating of the glass showed that the object struck 3.3 million years ago. The date of the glass layer will give paleontologists studying the region's abundant mammal fossils a long-sought benchmark in time. But Schultz and his colleagues suggest a more provocative role for the impact. Based on the glass's radiometric age and its position in the record of Earth's flip-flopping magnetic field, they establish that the impact happened within about 100,000 years of an abrupt, temporary 2°C cooling of ocean bottom waters recorded in Atlantic and Pacific sediments. What's more, they say, a major, sudden extinction at about this time wiped out 36 genera of mammals, mostly kinds known only from that region. They suggest that the impact either blasted the local fauna into extinction or induced global climate change that then triggered extinctions in southern South America. “Right now we only regard this as a coincidence,” says Schultz, but an intriguing one.

    Other researchers are less intrigued, noting the large uncertainties in the relative timing of the impact and its possible effects and saying that the apparent correlation could be meaningless. “Any particular moment would show quite a few extinctions” simply because the extinction rate in that geologic interval is high, says paleontologist David Webb of the University of Florida, Gainesville. “To claim [the extinctions] are sudden or unique is naïve.” The coincidence with an ocean cooling is likewise “a nonstarter,” says paleoceanographer Nicholas Shackleton of Cambridge University. Something big hit Argentina 3 million years ago, researchers agree, but to find out if it had any lasting effects on animals may take another decade of work.


    Insulator's Baby Steps to Superconductivity

    1. Robert F. Service

    Even high-temperature superconductors—the wunderkinds of the material world—can't escape modern family dynamics. The superconductors' parent materials are seemingly conservative types, layered ceramic insulators that are unable to conduct electricity. Yet their offspring, which are spiked with small amounts of other elements, are as racy and rebellious as teenagers. They conduct electricity without any resistance whatsoever, and at temperatures far above the maximum predicted by the traditional theory of superconductivity. Like psychologists trying to sort out family traits, physicists have been struggling to understand how such staid parents give rise to such unruly children. Now an experiment reported on page 2067 reveals a distinctly postmodern result: The parents aren't as conventional as once thought. In fact, they harbor an electrical signature reminiscent of a metal, a property never seen before in any insulator.

    “This is a very interesting result,” says Princeton University physicist Nai-Phuan Ong. “Normally you would not expect an insulator to have any hints of [metallic behavior].” Tantalizingly, the ceramic insulator that the researchers studied also reveals hints that it may be influenced by some of the same factors that give rise to superconductivity in the offspring. “Is this just a coincidence” that related electrical signals are produced in both? asks Ong. For now, nobody knows, but the new results give physicists a deepening family mystery. “It challenges our ability to understand solids,” says Z.-X. Shen, a physicist at Stanford University who led the new work.

    Postmodern parent.

    This insulating ceramic has metal-like properties, linking it to its superconducting offspring.

    The latest twist to the high-temperature superconductivity mystery centers on the way electrons behave in different materials. In all materials, electrons act a bit like marbles: Just as no two marbles can occupy the same space, two electrons in a material cannot have the same energy state. So the multitude of electrons in a material pile up in a range of different energy states, like marbles filling a jar. These states are organized in bands, with the valence band containing lower energy electrons with restricted movement and the conduction band containing higher energy, mobile electrons.

    In metals these two bands overlap. This allows valence band electrons to hop easily into the conduction band, where they can whiz around and conduct electricity through the material. In semiconductors, a gap of forbidden energy levels separates the two bands, so a slight energy kick is needed to boost valence electrons into the mobile conduction band. In insulators, this gap is much larger, for the most part preventing any conduction.

    Researchers also have ways to understand conducting and insulating materials based on the momenta—a function of the direction and speed—of electrons within them, relying on an equation from quantum mechanics called a wave function. When the range of different momenta of electrons in a metal is mapped out, the wave function always shows a specific shape—for example, a sphere, implying that electrons have an equal chance of traveling in any direction at the same speed. Conventional insulators, by contrast, show no pattern whatsoever.

    Many research groups have studied the way electron energies pile up in the ceramic parents of high-temperature superconductors and shown them to be insulators. Yet, when Shen's group at Stanford, together with colleagues at Iowa State University in Ames and Varian Associates of Palo Alto, California, used a different technique to test the momentum behavior of one of these insulators, they got a very different and perplexing response. In what is called angle-resolved photoemission spectroscopy, the researchers blasted the surface of the material, a flat crystal composed of calcium, copper, oxygen, and chlorine (Ca2CuO2Cl2), with x-rays at precisely controlled energies. When the high-energy photons slammed into the sample, they evicted some of its electrons, launching them out of the material. Detectors then counted these homeless electrons and measured their energy and direction of travel. Much to their surprise, Shen's team found patterns reminiscent of a metal. “This is a new kind of insulator that has a sign it could be a metal,” says Shen.

    And that's not all. Another plot—this one showing how the energy of the electrons varies depending on the direction they are traveling—bore a striking resemblance to a pattern found in their superconducting offspring. Superconducting electrons, which always travel in pairs, can only move within planes of copper and oxygen atoms and only along the two axes of the crystal, not along the 45-degree diagonals. This gives the wave function a cloverleaf pattern, known as d-wave symmetry. To their surprise, Shen and his colleagues saw the same d-wave pattern in the energy of electrons in different directions, a fact that raises both eyebrows and questions.
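    The cloverleaf d-wave pattern described above has a standard textbook form, Δ(k) ∝ cos kx − cos ky, which the article does not spell out; the sketch below, including the arbitrary overall scale delta0, is an illustrative assumption rather than anything from the researchers' analysis. It shows how the function peaks along the two crystal axes with alternating sign and vanishes along the 45-degree diagonals:

    ```python
    import math

    def d_wave_gap(kx, ky, delta0=1.0):
        """Textbook d_{x^2 - y^2} form: Delta(k) = delta0 * (cos kx - cos ky).

        delta0 is an arbitrary overall scale chosen for illustration,
        not a value reported in the experiment.
        """
        return delta0 * (math.cos(kx) - math.cos(ky))

    # Along the crystal axes the magnitude is maximal, with opposite signs
    # on the two axes (two positive and two negative lobes).
    along_x = d_wave_gap(math.pi, 0.0)   # cos(pi) - cos(0) = -2
    along_y = d_wave_gap(0.0, math.pi)   # cos(0) - cos(pi) = +2

    # Along the 45-degree diagonal (kx = ky) the two cosines cancel,
    # giving the nodes between the four lobes of the "cloverleaf."
    diagonal = d_wave_gap(math.pi / 2, math.pi / 2)

    print(along_x, along_y, diagonal)
    ```

    The sign change between lobes, not just the four-fold shape, is what distinguishes d-wave symmetry from an ordinary anisotropic pattern.
    
    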

    “We believe there must be a connection” between the d-wave pattern in the insulating and superconducting ceramics, says Shen. This common shape “doesn't come out of the blue,” adds Juan Campuzano, a physicist at the University of Illinois, Chicago, and Argonne National Laboratory. Just what the connection is, the new experiment does not say. “But this raises speculation as to whether electrons are taking the first baby steps toward superconductivity,” perhaps briefly pairing up and then separating again, says Ong.

    As with any attempt at explaining the physics of these materials, this explanation is “very contentious,” says Campuzano, a sentiment with which Ong agrees. For now, at least, the petulant high-temperature superconductors and their quirky parents will remain one of the most enigmatic families in physics.


    India Backs Off on Central Control

    1. Pallava Bagla*
    1. Pallava Bagla is a correspondent in New Delhi.

    NEW DELHI–Indian scientists are hailing a government decision to scale back a proposal for a centrally run system to regulate research involving animals. The final rules, adopted late last month, would instead place primary responsibility in the hands of animal ethics committees at individual universities and institutes, avoiding a bottleneck that scientists feared could stifle research. “I am satisfied that science will not suffer” once the rules are implemented, says Pradeep Kumar Dave, an orthopedic surgeon and director of the All India Institute of Medical Sciences here.

    Keeping count.

    New animal care rules place responsibility in the hands of individual facilities like the National Institute of Immunology, above.


    The initial proposal, from a committee chaired by social justice and empowerment minister Maneka Gandhi, would have prohibited all animal experimentation without the explicit written approval of the committee (Science, 18 September, p. 1777). Gandhi, an outspoken animal rights activist, said at the time that the government needed to step in after an attempt at self-regulation, based on 1992 guidelines from the Indian National Science Academy, had failed. But her proposal kicked up a ruckus among the scientific community. Passions ran high: Immunologist Nirmal Kumar Ganguly, director-general of the Indian Council of Medical Research here, warned of “chaos and confusion leading to anarchy” if the rules were implemented without amendments.

    The final rules give institutional panels the authority to approve animal experiments for entire programs and projects rather than the experiment-by-experiment basis envisioned in the initial proposal. All biomedical institutions using animals still must register with the social justice ministry within 60 days, but institutions need not wait for a response before carrying out the necessary oversight duties.

    The institutional panels will be composed of biomedical scientists both from within the institution and outside, as well as a veterinarian, a nonscientist, and a government representative. The first order of business for many institutions will be to create such a panel: A recent survey revealed that only 50% of all laboratories had any form of animal ethics committee. The committees will be responsible for day-to-day monitoring of experiments, but they must report periodically to the ministry, which can suspend or revoke the license of any laboratory found wanting.

    The final rules also remove a proposed ban on contract and collaborative animal research with overseas educational institutions, although they still prohibit contract research—such as using monkeys to test drugs for multinational drug companies—carried out purely for monetary considerations. It will also be more difficult for Indian institutions to import animals from overseas labs: The rules allow transfers only between labs already registered with the Indian government, in effect limiting the pool to domestic facilities.

    The rules are expected to become law by the end of the month, putting an end to what Gandhi calls “rogue firms” that have ignored proper procedures for animal safety. “It's time for them to put up or shut up,” she says.


    Extremists Steal Minister's Spotlight

    1. Robert Koenig*
    1. Robert Koenig is a writer in Bern, Switzerland.

    It had the makings of a banner week for German science, with the new education and research minister, Edelgard Bulmahn, announcing plans to increase federal funding for research and higher education, dismantle some outmoded nuclear-power research facilities, and strengthen programs to help women and young scientists. The premiere basic-research organization, the Max Planck Society, also pitched in with a positive spin on its plans for the year ahead. But the week also saw a sharp reminder of deep divisions in public attitudes toward science: The boldest headlines went to an incident in which a prominent German researcher was placed under police protection following threats from animal rights activists.

    In a speech in Bonn, Bulmahn announced that the government plans major investments and reforms in Germany's troubled university system, with the goal of making universities more dynamic, flexible, and international. The ministry wants a $500 million increase in next year's budget, to $9 billion, and plans to double expenditures over the next 5 years on investments in higher education and research—such as renovating university laboratories and other facilities.

    In an announcement that dovetails with the new German government's plans to phase out the nuclear power industry, Bulmahn also said some experimental and pilot-project facilities will be shut down or dismantled. A ministry spokesperson says the outmoded reactors include the THTR high-temperature reactor in Hamm, the FR-II research reactor in Karlsruhe, the HDR reactor in Kahl, and a reactor near the Bavarian town of Niederaichbach. A nuclear-energy expert associated with the German Physical Society told Science that he was unaware of any significant research now being done at those four reactors. A full list was unavailable.

    Bulmahn also signaled that she would like to shift the focus of space science but lacks the flexibility to do so. She criticized her predecessor's decision to commit most of Germany's space resources to crewed missions, yet insisted that Germany would stand by its commitments to spend nearly $1.5 billion on the international space station—about 40% of Europe's total contribution. Germany plans to keep working within the framework of the European Space Agency but will press to continue reforms to streamline the agency's administration.

    At the Max Planck Society's annual news conference in Bonn last week, the society's president, biologist Hubert Markl, praised Bulmahn's ministry for agreeing to 5% annual budget increases for both Max Planck and the DFG granting agency and for giving Max Planck more leeway in how it spends its federal and state funds. “We need this autonomy to make us more flexible and innovative at a time of increasingly competitive international research,” said Markl. He added that, with last month's opening of the Ethnological Research Institute in Halle, Max Planck had completed its 8-year expansion program into former East Germany, establishing 20 research centers there.

    But the edge was taken off the good news by the furor created when members of a militant animal rights group made physical threats against Wolf Singer, a director of the Max Planck Institute for Brain Research. As a result, Singer, whose lab uses primates to research brain function, was guarded by police when he received an award in Frankfurt on 29 November. At least one other German neurobiologist, at the University of Bremen, also had received police protection this year as a result of similar threats.

    Markl says he was outraged that Singer—an internationally respected researcher—has been threatened by “fanatical opponents of animal experimentation.” He defended Max Planck's policies on the use of laboratory animals in research and says he regarded the threats against Singer as “an attack on the freedom of research in Germany.”


    Forest Pact Bypasses Computer Model

    1. Elizabeth Finkel*
    1. Elizabeth Finkel writes from Melbourne.

    MELBOURNE, AUSTRALIA–Conservation scientists are reeling from the outcome of a fight over one of Australia's richest regions of biodiversity. An innovative and internationally praised scheme for reconciling conflicts over natural resources that tapped a 3-year, $23 million biota survey proved no match this fall for old-fashioned political muscle. The result was a bill passed late last month by state legislators permitting extensive logging in diverse forest ecosystems in the northeast corner of the state.

    “This is a massive waste,” says Andrew Beattie, director of the Key Center for Biodiversity and Bioresources at Macquarie University in Sydney. “The NSW [New South Wales] government paid for a world-class system for mediating forest conflicts through scientific knowledge but in the end chose to ignore a major part of its findings.” Dailan Pugh, a negotiator for the Nature Conservation Council, an umbrella group for private-sector conservation efforts in the state, says the plan represents the worst “regional forest agreement” in the country.

    Timber officials are hailing the new legislation as a shot in the arm for the industry. Col Dorber, executive director of the Forest Products Association, says that forest industry companies have already invested $16 million since the agreement was struck, buying up land for plantations and to obtain carbon credits that allow polluting industries to stay within government emissions standards. Government and industry officials also argue that the plan for the 380,000-hectare reserve, which creates 85 national parks, balances the interests of all sides. “It would be difficult for anyone to argue that the government has not met conservation targets given that the process has been recognized as world-leading,” says Craig Knowles, the state's minister for planning.

    That process was intended to showcase one of the most comprehensive ecological data sets anywhere in the world and state-of-the-art conservation planning software developed by scientists at the state National Parks and Wildlife Service (NPWS). It involved a program, called C-Plan, that allowed stakeholders to negotiate an arrangement that could meet both conservation targets and timber quotas (Science, 18 September, p. 1789).
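The article does not detail C-Plan's algorithms, but reserve-selection tools of this kind are generally built on the idea of complementarity: each candidate site is valued by the still-unprotected species it would add. A toy sketch of that idea follows — the site names, species, and greedy rule are all invented for illustration and are a crude stand-in for C-Plan's far richer scoring.

```python
def greedy_reserve(sites, targets):
    """Pick sites until every target species is protected.

    sites: dict of site name -> set of species present there.
    targets: set of species that must be covered.
    Greedy complementarity: repeatedly add the site that covers
    the most still-unprotected species.
    """
    unmet, chosen = set(targets), []
    while unmet:
        best = max(sites, key=lambda s: len(sites[s] & unmet))
        gain = sites[best] & unmet
        if not gain:
            raise ValueError("targets unreachable with these sites")
        chosen.append(best)
        unmet -= gain
    return chosen

# Hypothetical forest blocks and the species each would protect.
blocks = {
    "escarpment": {"glider", "owl"},
    "foothills": {"mouse", "glider", "frog"},
    "tablelands": {"owl", "mouse", "quoll"},
}
print(greedy_reserve(blocks, {"glider", "owl", "mouse", "frog", "quoll"}))
# -> ['foothills', 'tablelands']
```

Note how the toy rule mirrors the conservationists' complaint: a reserve built only from the escarpment block would leave most targets unmet.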

    C-Plan was used successfully in preliminary negotiations in 1996 that led to nine new nature reserves in the eastern portion of the state and logging moratoria in areas likely to be tapped as national parks. However, the assessments required a second round of negotiations based on the detailed data sets. And this fall the “world-leading” process broke down during negotiations over 10 million hectares in the northeast region, say Pugh and Beattie, leaving conservationists out in the cold. Instead, state officials worked behind closed doors to produce a plan that covered an area less than half the size conservationists say is necessary for biodiversity and that doubled, from 10 to 20 years, the length of time industry could continue logging at its current quota.

    Conservationists are also upset by the type of land to be included in the reserves. It is mostly unloggable escarpment forest that is already well represented, while diverse forest ecosystems in the foothills to the east and the tablelands to the west were left out. An analysis by the C-Plan support team at the state NPWS shows that the plan meets only 30% of the conservation targets achievable on public land for the highest priority species. One species likely to face extinction as a result of the plan is the Hastings River mouse. Its last refuges are scattered habitats stretching from the northeast NSW forests to the Queensland border, but the plan includes only 7% of its recommended conservation target.

    Conservationists and scientists are not the only ones unhappy with the turn of events. Federal officials say the state ignored national rules requiring the negotiations to be open and transparent. “These were back-room deals in smoke-filled rooms” is how Wilson Tuckey, federal Minister for Conservation and Forestry, describes the process, which he says lays the groundwork for the “rape and pillage” of the northeast forests.

    With the failure of science, conservationists have returned to pre-1995 tactics, complete with blockades and protesters chained to bulldozers on land where timber activity had been occurring despite the moratorium. It's the sort of confrontation, says Pugh, that C-Plan was supposed to make obsolete.

    2000 CENSUS

    Supreme Court Gets Lesson in Enumeration

    1. David Kestenbaum

    The bitter battle over how to conduct the 2000 census has finally landed at the Supreme Court. Last week the justices heard arguments over whether the Census Bureau may use statistical sampling techniques to estimate the U.S. population. The court may choose not to issue an opinion, a decision that would toss the question back to a deadlocked Congress. But if the court does try to undo the tough scientific, legal, and political knot, its ruling will likely decide how the count will be conducted, experts say.

    The stage for the high-stakes hearing was set last summer, when the Clinton Administration appealed two U.S. District Court decisions that the bureau's plan for the once-a-decade count violates the Census Act. That plan would use sampling to estimate some 10% of the nation's population, an approach the bureau says will catch millions of people missed in a head count and save $675 million. The scientific community has, for the most part, rallied to sampling's defense (Science, 6 February, p. 798). Statisticians “overwhelmingly support” sampling, says Paul Voss, a statistician at the University of Wisconsin, Madison.
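The statistical logic behind the bureau's plan resembles classic capture-recapture estimation: compare an initial head count with an independent follow-up survey, and the overlap between the two reveals roughly how many people both missed. The sketch below uses the textbook Lincoln-Petersen estimator with invented numbers — it is an illustration of the principle, not the bureau's actual dual-system methodology.

```python
def lincoln_petersen(first_count, second_count, seen_both):
    """Estimate a total population from two independent counts.

    first_count: people found by the initial head count
    second_count: people found by an independent follow-up survey
    seen_both: people who appear in both counts
    """
    return first_count * second_count / seen_both

# Illustrative numbers only: a head count finds 900 residents of an
# area, a follow-up sample survey finds 800, and 750 people appear
# in both lists. The estimate implies the head count missed about
# 6% of the true population.
estimate = lincoln_petersen(900, 800, 750)
print(round(estimate))  # -> 960
```

The estimator assumes the two counts are independent and the population is closed, which is exactly where critics see room for error and tampering.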

    But House Republicans and others are demanding a traditional person-by-person tally, arguing that the sampling procedure is subjective and would be prone to error and partisan tampering. It is also, they contend, unconstitutional. The political stakes are high: The numbers are used to divide House seats among the states, parcel out at least $180 billion in federal funds, and carve up states into congressional districts.

    The case before the court focuses on whether sampling numbers may be used to divvy up House seats among the states, a procedure called apportionment. The Census Bureau estimates that sampling corrections applied to the 1990 census would have shifted one seat to California. Critics and defenders agree, however, that if sampling data had been used to redraw congressional district lines, it might have changed the electoral outcome in several seats.

    Both sides found reason for optimism after the 90-minute court session. Sampling foes were buoyed by an exchange regarding the Constitution's call for an “actual enumeration” to divide up House seats. “Most people would think actual enumeration would mean a count—how do you get around that?” asked Justice Sandra Day O'Connor. Solicitor General Seth Waxman, arguing on behalf of the Census Bureau, replied that the founding fathers had meant a “good faith empirical effort.” So, Justice Antonin Scalia asked, “What is excluded? Rolling the dice?”

    Mystery job.

    The bureau is hiring, but census plan is uncertain.


    Sampling supporters had reason to smile when Justice John Paul Stevens asked what the bureau should do if it knew an apartment was occupied but no one answered the door. Would the Constitution require census takers to put down zero? “Your honor, they can't guess,” said attorney Maureen Mahoney, arguing for the House of Representatives. “Even if the lights go on and off in the evening?” asked Justice Stephen Breyer. At the end of the day, says Tom Hofeller, staff director for the Republican-led census subcommittee, “I don't think anybody could have walked out thinking it went either way.”

    Even if the court finds the bureau's plan unconstitutional for apportioning House seats, it might still be legal to use sampling to derive a second count for redistricting or divvying up federal funds. One Republican staffer says that redrawing congressional district lines according to sampling numbers in 1990 could have cost the party as many as 10 House seats. Census panel chair Dan Miller (R-FL) asserts that Democrats are keen on sampling only for the sake of such political gains. “That's the reason they want to change things,” he says.

    Proponents are ready to push the bureau to use sampling to provide more accurate numbers for redistricting and distributing federal dollars. “If we lose in court, we would definitely be moving toward a two-number census,” Representative Carolyn Maloney (D-NY) told Science. But bureau director Kenneth Prewitt has reservations: “Once you have two numbers, why not three, four, five, six?” Compiling inaccurate tallies, he says, “is like giving up on the scientific underpinnings” of the endeavor. A ruling that sampling is constitutional, on the other hand, would throw enormous weight behind the bureau's plan. “If that happens, the party's over,” concedes one Republican staffer. “But I don't think it will.”

    Alternatively, the Supreme Court may rule that the lower courts should have thrown the case out. Waxman argued that neither the House nor the other plaintiffs had been harmed by the bureau's plan and had no right to sue. If the Supreme Court accepts that argument, House members will have to slug it out for themselves. The court is expected to rule by the end of June, but Congress may have to take up the matter sooner. Because of squabbling over the census plan last summer, the Census Bureau's authority to spend 1999 funds is set to expire on 15 June.


    Sky Survey Racks Up Record-Setting Quasars

    1. James Glanz

    CHICAGO–Just a few days into its 5 years of scanning the heavens, the Sloan Digital Sky Survey has already begun setting records. Within a narrow strip of sky along the celestial equator, the Sloan's 2.5-meter telescope has bagged three of the four most distant quasars ever seen, including a new record-holder. It also found another nine of these distant beacons, thought to be the cores of young galaxies set ablaze by mysterious central engines, at distances nearly as great. At that rate, the entire Sloan survey is likely to pick out 1000 quasars at distances close to the current record-holder, says Michael Turner, an astrophysicist at the University of Chicago and the Sloan's spokesperson.

    “Anybody who had any doubt that the Sloan was going to completely revolutionize quasar studies probably has a lot fewer doubts,” says an elated Turner. Patrick Osmer, a quasar hunter at Ohio State University in Columbus, says that the Sloan “is going to be as powerful as we all hoped in this area,” giving astronomers batteries of distant searchlights for probing the layout and makeup of the early universe.

    Quasars are not the only quarry. The Sloan survey aims to survey about one-quarter of the entire northern sky and selected slices in the south, using an automated telescope at Apache Point Observatory in New Mexico (Science, 29 May, p. 1337). From hundreds of millions of celestial objects, special software will cull particularly interesting ones for a follow-up look with the same telescope, which will break their light into spectra, rich in information about the objects' nature and distance. For example, spectra of the million brightest galaxies in the census will determine their “redshift”–a measure of distance. That information will go toward creating a giant three-dimensional map of the sky.

    Most of the bright galaxies will be in our neighborhood, cosmically speaking—within a billion light-years or so. But quasars, which appear as bright points of reddish light, remain visible at greater distances, to the very outskirts of the visible universe, and the Sloan organizers hope to find 100,000 of them as well. The telescope's first sweep of the sky, mostly in September, covered just 1% of the area of the final survey. But Sloan collaborators, including Michael Strauss and Xiaohui Fan of Princeton University, have picked out 19 quasar candidates so far by analyzing the five-color images, and follow-up spectra confirmed 12 of them as actual quasars—a 70% success rate.

    That far exceeds the 10% success rate typical of quasar hunts, probably because the Sloan's images have more colors than most surveys produce. The farthest of the quasars, at a redshift of 5.00—corresponding roughly to 13 billion light-years away—just edged out the redshift 4.897 quasar reported in 1991 by James Gunn of Princeton University, Donald Schneider of Pennsylvania State University, and Maarten Schmidt of the California Institute of Technology (Caltech). Gunn, however, can take some credit for the new record-holder as well, because he led the team that built the Sloan's sophisticated electronic camera.

    Researchers hope to use the thousands of quasars expected from the Sloan as markers of cosmic structure in the early universe and probes of the gases wafting through space over billions of light-years. But they won't start amassing more of them until at least January. Before then, collaborators need to work the kinks out of optics and software and rig the telescope so it can move freely—right now they rely on Earth's rotation to slew the telescope along the equator. And so far, the astronomers have been taking their follow-up spectra one by one with a nearby 3.5-meter telescope rather than with the Sloan telescope itself, which will gobble them up 640 at a time, through holes drilled in pizza-sized aluminum disks.

    But the quasar finds—the Sloan's first harvest for science—have other astronomers giving it the thumbs-up. “This is one of the things they wanted to do better than anyone else,” says Charles Steidel of Caltech. “It looks very promising.”


    Gates Launches $100 Million Initiative

    1. Dan Ferber*
    1. Dan Ferber is a writer in Urbana, Illinois.

    The planet's richest individual is donating a portion of his fortune in hopes of buying some of the world's poorest children a priceless gift—good health. Last week Bill Gates, the chair of Microsoft Corp., gave $100 million to create the Bill and Melinda Gates Children's Vaccine Program. The program will enlist existing international health organizations in a battle against four diseases through its support of vaccine trials, public education, and new funding mechanisms. “Our goal is to make the vaccines you and I take for granted available to children no matter where they live,” Gates said at a press briefing in New York City.

    The donation comes from the William H. Gates Foundation and will be administered by a Seattle-based organization called the Program for Appropriate Technology and Health (PATH). The money, to be given over 10 years, will fund efforts to improve delivery of existing vaccines rather than to develop new ones, says Gordon Perkin, president of PATH. In particular, it is aimed at disseminating vaccines proven effective against:

    * Haemophilus influenzae type b (Hib), which causes pneumonia and meningitis;

    * Rotavirus, which causes severe diarrhea and dehydration;

    * Hepatitis B, which causes cirrhosis and liver cancer; and

    * Streptococcus pneumoniae, which causes ear infections and pneumonia.

    About three-fourths of the money is expected to go to the World Health Organization (WHO), the United Nations Children's Fund (UNICEF), and the International Vaccine Institute (IVI). The fledgling IVI, based in Seoul, Korea, has already received $250,000 to supplement drug company funding of a study of the distribution of Hib throughout China, Korea, and Vietnam. “The Korean government got us off the ground, the companies helped us start our first project, and now the PATH grant assures that we are expanding,” says immunologist Barry Bloom, chair of IVI's board of trustees and incoming dean of the Harvard School of Public Health in Boston.

    PATH has assembled an international advisory panel of seven eminent scientists that will meet in March to recommend ground rules for the new program. But Perkin says that two funding priorities are clear: to coordinate cost-effectiveness studies and trials to improve the vaccines' performance in the developing world, and to explore new ways of financing large-scale childhood immunization efforts, such as interest-free loans from the World Bank. “This $100 million is going to be a catalyst to do the advocacy work, the vaccine trials, and [to improve] the financing mechanisms,” says epidemiologist Mark Kane, a WHO veteran who will head the new program.

    The Gates program will not pay for the tens of millions of doses that will be needed throughout the world, says Perkin. Even so, says Carol Bellamy, executive director of UNICEF, the donation is certainly welcome. “The bottom line is that this money will keep more children alive.”


    Worming Secrets From the C. elegans Genome

    1. Elizabeth Pennisi

    The near completion of the sequence of the C. elegans genome should provide researchers with a gold mine of information on topics ranging from evolution to gene control

    In the early 1800s, American explorers Meriwether Lewis and William Clark traveled 8000 miles—from St. Louis, Missouri, to the Pacific Ocean and back—mapping newly acquired territory that expanded the size of the United States by more than a million square miles. The end of their 2-year journey marked the beginning of a century of expansion and growth for this new nation. Biology is now at a similar juncture, marking both an ending and a new beginning.

    This issue of Science signals the end of an 8-year effort to sequence the first animal genome, with the publication of the virtually complete sequence of the 97 million bases in the genome of a tiny nematode worm, Caenorhabditis elegans. But this milestone is also the beginning of a new era in biology.

    For one, the worm-sequencing effort has helped pave the way for sequencing the 3 billion bases that make up the human genome, a project that will extend into the next century. The early successes of worm sequencing were instrumental in convincing researchers and funding agencies of the value and feasibility of large-scale sequencing projects (Science, 10 February 1995, p. 783; 2 June 1995, p. 1270). And now, the two groups who sequenced the worm genome, located at the Washington University Genome Sequencing Center in St. Louis and the Sanger Centre in Cambridge, U.K., expect to use the skills they've acquired to generate about half the human genome. “They cut their teeth and learned how to do high-level sequencing by practicing on the worm,” says David Botstein, a geneticist at Stanford University.

    But beyond that, as the first sequence of a multicellular organism, the C. elegans genome should provide a cornucopia of biological information—and not just about the worm. As Gary Ruvkun, a developmental geneticist at Harvard Medical School in Boston, notes, “It's the first time we can see all the genes needed for an animal to function.” As a result, says Francis Collins, director of the National Human Genome Research Institute in Bethesda, Maryland, countless other life scientists in addition to the 1200 or so who call themselves worm biologists will be tapping the nematode sequence for their studies.

    He notes that studies of functions as diverse as muscle contraction, fear responses, digestion, and reproduction often lead researchers to some gene whose precise function is unknown. But because of the conservation between genomes, a matching gene can often be found in some form in the worm—even if the original organism of interest is only very distantly related, say a mammal such as the mouse or even a human being. And thanks to years of intensive study, the function of many worm genes is already known—or may soon be determined. “We'll be doing a lot of jumping back and forth between species,” Collins predicts. The completion of the sequence, he adds, is “a significant milestone.”

    It should shed light not just on how existing multicellular organisms function but also how they came to be. Comparisons of the C. elegans genome with those of yeast and the other microbes that have been sequenced have revealed both similarities and differences that are sparking new thinking in disciplines from evolutionary biology to protein chemistry. For example, they can help answer the question of how genomes have expanded and changed to support multicellular life. The nematode sequence promises to be “a basic organizing principle” for all biologists, says Robert Waterston, who headed up the sequencing effort at Washington University. “It allows cross-communication [between] very different fields.”

    Lights on.

    The glow of green fluorescent protein reveals which of the nematode's nerve cells have a particular active developmental control gene.


    A humble beginning

    The first person to sense that the worm might take on such a prominent role in biology was molecular biologist Sydney Brenner of the Medical Research Council (MRC) Laboratory of Molecular Biology in Cambridge. During the mid-'60s, he wanted to understand how the various parts of the nervous system get wired up correctly during development. In complicated species, like mice, humans, or even fruit flies, this problem seemed intractable. But C. elegans is both small—its 959 cells include only about 300 neurons—and transparent so that all the cells can be seen and followed during development. It “turned [this question] into a finite problem,” Brenner says.

    Although at the time many could not see the value of such a detailed study of a simple worm, Brenner did manage to recruit several young scientists to worm studies at his new lab. One was Waterston, an immunologist interested in muscle development who arrived in 1972; another was John Sulston, an organic chemist turned biologist, who by 1983 had traced the fate of each cell as the nematode transformed from a single cell to a full-sized worm.

    The cell lineage map, described at the time as a “monumental achievement” by a Sulston group alumnus, Bob Horvitz of the Massachusetts Institute of Technology, laid the groundwork for determining just what influenced the development of the various cells. For example, researchers could destroy one cell to see what effect, if any, its absence had on the development of its neighbors. But to get to the underlying biochemical mechanisms that determine cell fates, researchers needed to track down the genes involved. So Sulston and Alan Coulson at the MRC lab decided to make a physical map of the worm genome, consisting of a set of landmarks, separated by known numbers of bases, along each chromosome. Such a map would enable them to home in on a gene's approximate location faster than they could before.

    With help from a growing nematode research community, the two began the project by making a “library” of pieces of the entire worm genome, grown in bacteria. They then used a technique called fingerprinting to establish landmarks on each piece; by comparing the landmarks on the DNA pieces, they could determine which pieces overlapped and thus how to arrange all of the pieces and their landmarks into a map of the whole genome. As they struggled to link the pieces to cover entire chromosomes, Waterston realized that a technique developed by Maynard Olson, a colleague of Waterston's at Washington University, might be useful for filling the gaps. Olson was making yeast artificial chromosomes (YACs) that contained pieces of human DNA; as David Burke in Olson's group later found, YACs turned out to be capable of carrying the missing bits of nematode DNA as well.
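The fingerprint-overlap idea can be caricatured in a few lines of code: if two clones share several landmark fragment sizes, they probably overlap, and chaining such overlaps orders the clones along the chromosome. Everything below is a toy — the clone names and fragment sizes are invented, and the real mapping effort used statistical match scoring rather than this naive greedy chain.

```python
def overlap_score(fp_a, fp_b):
    """Count landmark fragment sizes shared by two clone fingerprints."""
    return len(set(fp_a) & set(fp_b))

def order_clones(fingerprints, min_shared=2):
    """Greedily chain clones whose fingerprints share landmarks.

    fingerprints: dict mapping clone name -> list of fragment sizes.
    Returns one possible left-to-right ordering (toy algorithm).
    """
    names = list(fingerprints)
    path = [names.pop(0)]
    while names:
        last = fingerprints[path[-1]]
        best = max(names, key=lambda n: overlap_score(last, fingerprints[n]))
        if overlap_score(last, fingerprints[best]) < min_shared:
            break  # a gap: no remaining clone overlaps the contig's end
        path.append(best)
        names.remove(best)
    return path

# Toy fingerprints: shared fragment sizes imply overlapping clones.
clones = {
    "cA": [110, 230, 340, 450],
    "cB": [340, 450, 520, 610],
    "cC": [520, 610, 700, 810],
}
print(order_clones(clones))  # -> ['cA', 'cB', 'cC']
```

The `break` on a low score is where the map fragments into separate contigs — the gaps that YACs were later recruited to fill.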

    In 1989, the nearly complete map took up an entire wall when displayed at the worm biology meeting at Cold Spring Harbor Laboratory in New York. Such a map was a prerequisite for sequencing a genome, as it divided each chromosome into smaller chunks whose makeup of bases could be determined and then fit into the proper place in the chromosome. Thus, to promoters of the Human Genome Project looking for a smaller genome to try first, the worm looked quite promising.

    At that meeting, Waterston, Sulston, Coulson, and Horvitz mapped out a pilot project whose goal was to sequence 3 million bases—about 3% of the worm genome—by 1993. “The idea was that this was a dry run for the human genome,” says Sulston, who is now at the Sanger Centre.

    Getting support was not easy, however. “Close colleagues told me I was nuts,” Waterston recalls. At the time, sequencing successes were measured in thousands of bases, and the nematode genome had millions. Also, many did not think it would be useful to spend millions of dollars “on something which didn't solve biological problems right off,” says Sulston.

    The worm researchers finally got initial grants from the MRC and the U.S. National Institutes of Health in 1990. But although the project was well on the way to meeting its first goal of sequencing 3 million bases in 1992, getting the rest of the money needed “was touch and go,” Sulston recalls. Not until venture capitalists threatened to lure Waterston and Sulston to a private sequencing effort did the public support fall in place.

    To go beyond the 3 million bases to the full genome, Sulston and Waterston first needed to make sequencing cheaper and more efficient. By 1992 they were doing a million bases per year, which meant that it would take nearly a century to finish the entire genome and would cost some $200 million. The researchers doubled, then quadrupled, the amount of DNA their automated sequencers could process at one time by adding more separation lanes to these machines and running the machines day and night. Both centers worked on streamlining the preparation of DNA and improving the data quality. Their labs mushroomed, and instead of managing a dozen researchers, Sulston and Waterston each eventually had 100 or more workers.

    Unexpected biological hurdles slowed their progress, though. “There was a complexity [in the genome] that we weren't fully aware of,” says Waterston. The researchers originally concentrated their sequencing efforts on the middle of the chromosomes, thinking that that's where most of the genes are. But there proved to be almost as many genes farther out on the arms, with the result that the number of genes turned out to be much higher—19,000—than the 15,000 they originally expected. That meant they couldn't relax their accuracy standards as they once thought they might be able to do in gene-poor regions. They also had to deal with hard-to-sequence repetitive regions throughout the genome.

    Finally, because the researchers had done most of their sequencing on worm DNA pieces grown in bacteria, they had optimized the sequencing operations for the bacterial material. But the last 20% of the DNA was not contained in bacteria but in YACs grown in yeast, and that required figuring out how to revamp their procedures to deal with contaminating yeast sequences. (Even now, a few hard-to-sequence gaps remain to be finished.)

    But by 1995, the tide had turned in favor of the sequencing effort. The two centers were producing several million bases of sequence per year. Many biologists had started to realize how useful the promised sequence would be for speeding up their own studies. “People were writing grant proposals around the idea that the genome was coming,” says Sulston. “It was becoming clear that if you were serious about a research question, you couldn't address it without genomics.” And that required having the full worm sequence in hand.

    A new biology

    The special section on the C. elegans genome, which begins on page 2011, hints at the richness of this genome for neurobiologists, biochemists, developmental biologists, and researchers seeking to understand how genomes change through time. As these papers show, computer analyses comparing the worm's DNA with that of other organisms are already confirming the existence of a wealth of genes that have been conserved through evolution. For example, the genes needed for individual cells to function, such as those that code for proteins involved in DNA replication or protein synthesis, appear virtually the same in yeast and worm, and even in higher species. In addition, genes involved in development are often shared by the worm and other multicellular organisms. These similarities mean that “C. elegans becomes this radio beacon that you're triangulating all of biology from,” says Ruvkun.

    Stanford's Botstein agrees. He describes the similarities in the developmental genes and others as “an unbelievable boon to understanding what all these genes [do].” If computer gene-matching programs show that a target gene in another organism also exists in the worm, then knowledge about the worm gene—such as the identity of the cell regulatory pathway in which that gene's protein operates—can add to the understanding of its counterpart. “Suddenly you have not just your gene, but [the] context revealed,” says Waterston. “You're looking at the forest, not just the tree.”

    Worm zoo.

    Stocks of mutant nematode worms, such as the ones shown at left, help researchers quickly home in on gene function.


    And even if researchers don't know the function of a worm gene of interest—and the functions of 12,000 of the 19,000 C. elegans genes are still a mystery—it can be easy to find out. It is much simpler to evaluate gene function in the worm than in, say, a human.

    For example, one 2-year-old technology known as RNAi (Science, 16 October, p. 430) has made it easier to knock out a worm gene and see how its absence affects the animal. RNAi involves simply injecting worm oocytes with a piece of double-stranded RNA that matches the gene and somehow blocks its expression. Experimenters then look for changes in the worms that develop from the injected eggs. There's even some hint that eventually researchers may be able to mix the RNA with a worm's meal and then study the effects of having the gene inactivated in the animal's offspring.

    Moreover, because the worm is see-through, the gene under study can be linked with the gene for green fluorescent protein (GFP). Then, researchers can find out when and where in the worm the hybrid gene is expressed by monitoring GFP's glow.

    To further speed the study of C. elegans gene function, Coulson is working with Robert Barstead at the Oklahoma Medical Research Foundation in Oklahoma City and Don Moerman at the University of British Columbia in Vancouver on a large-scale effort to generate large numbers of mutant worms. They have chemically induced mutations in about a half-million worms and are now screening them for interesting defects. Already, Coulson says, “we've had quite a lot of interest” from researchers studying human diseases who want to see if comparable defects crop up in the mutant worms.

    Using the mutants, RNAi technology, and GFP proteins, these scientists can begin to learn more about how the gene at fault functions. “The worm is a highly tractable genetic organism,” says Horvitz. “It's going to make a huge impact on mammalian genetics and human disease.”

    Researchers who want to understand how genes are regulated are also turning to the C. elegans genome. Take developmental biologist Stuart Kim at Stanford. He has teamed up with Yuji Kohara from the National Institute of Genetics in Mishima, Japan, to make a microarray, a glass plate with bits of worm DNA attached, which will make it possible to study when and where the worm's genes are activated by regulatory signals. By January, Kim says, the array will have some 12,500 genes represented on it, each one designed to produce a spot of fluorescence when exposed to a sample containing RNA messages from the corresponding gene. And in a few months more, he expects to have all 19,000 genes on the array. “Then we'll be ready to rock,” he says.

    Already some two dozen labs are gathering RNAs for testing on the array. And Kim is busy writing computer software that will let him discern global regulatory networks—clusters of genes whose activities are interconnected—from the microarray results. Thus he hopes to look at, for example, all the genes turned on by a particular regulatory protein, such as the cancer-promoting Ras, and then see whether some of those same genes are also activated by other proteins. In this way he hopes to discern the connections between various DNA regulatory pathways. This question “is hard to study one gene at a time,” he points out.

    And still other researchers are using the genome to address evolutionary issues. On page 2018, Neil Clarke and Jeremy Berg, biochemists from the Johns Hopkins University School of Medicine in Baltimore, compare genes for DNA regulatory proteins that interact with zinc in the worm, yeast, bacteria, and a separate group of microorganisms, called Archaea, best known for living in extreme environments. Only the yeast and nematode genomes code for large numbers of zinc-binding proteins.

    But although many of the zinc-binding proteins in C. elegans—233 of them, in fact—serve as receptors for steroid hormones, relaying messages from the hormones into the cell, those receptors are completely missing in yeast. Most likely, the evolution and expansion of steroid hormone receptors “coincided with going from a single-cell to a multicellular organism,” says Berg. Presumably, steroid hormones and their receptors were needed to help coordinate the activities of different cells throughout the body.

    Other evolutionary studies have posed new puzzles. On page 2033, Ruvkun and Harvard colleague Oliver Hobert report a first pass through the worm genome confirming that animals share not just genes but entire regulatory pathways governing their development. So what, Ruvkun asks, makes people look so different from worms? Is it that the genes and pathways shared by this menagerie are simply turned on at different rates or at different times during development? Or are there also sets of genes that are not shared, which account for why worms look one way and flies another? “The analysis of genes that work in specific organisms should answer this question,” he predicts.

    Just the possibility of asking such questions is what reveals the real promise of the completed genome, says the MRC's Brenner. “The sequence is not the end of the day,” he emphasizes. “It's the beginning of the day.”


    Bid for Better Beef Gives Japan a Leg Up on Cattle

    1. Dennis Normile

    Japanese researchers achieve high success rates in cloning cattle with techniques honed on efforts to help the livestock industry

    NARA, JAPAN–When British researchers announced in early 1997 that they had successfully cloned an adult sheep, the startling news touched off a global gabfest about the ethics, dangers, and possibilities of cloning adult mammals. For Yukio Tsunoda, however, Dolly's arrival was a call to action. Tsunoda, a professor of animal reproduction at Kinki University here, had spent more than a decade refining a cloning technique at the heart of the process that produced Dolly. And Tsunoda wasn't alone in his quest to emulate the work done at Scotland's Roslin Institute with adult sheep cells.

    In the months following Dolly's arrival, at least seven Japanese groups set out to replicate the experiment in cattle. Their success has created a herd of clones stampeding out of livestock research centers. Five groups have reported the births of 19 calves cloned from adult cows, and so many more surrogate mothers are carrying cloned cow embryos that the Ministry of Agriculture, Forestry, and Fisheries has given up trying to track them all.

    The first scientific paper resulting from these efforts appears on page 2095. In it Tsunoda's team reports cloning eight calves from cells taken from the oviducts and cumulus, the tissue that surrounds the oocytes, or egg cells, of a single adult cow. Closely following the nuclear transfer technique used to produce Dolly, the team starved the cells into quiescence, transferred their DNA-carrying nuclei into enucleated egg cells, and then reactivated them with an electric shock. Particularly significant is the group's success rate: Of 10 implanted embryos, eight were carried to term, although four died soon after birth. That rate is far higher than for any other group attempting to clone large mammals.

    Although the self-effacing Tsunoda chalks up their success to “beginner's luck,” most analysts point instead to Japan's extensive support for livestock research, a component of its international trade policy. Japan's longtime efforts to improve livestock management and breeding techniques gained a big boost in the early 1990s when the government bowed to heavy pressure from the United States and other countries to gradually open its beef market to imports. In an attempt to gird local industry against the looming competition, the Agriculture ministry organized programs to train researchers at prefectural livestock research centers in the latest biotechnological techniques. That early government investment gave the domestic livestock industry a leg up on the global competition and scientists valuable experience in cloning.

    Before Dolly appeared, Japan's livestock researchers and their counterparts around the world were trying to clone animals by transferring the nuclei from very early embryo cells—before they have begun to differentiate into specialized cell types—into other embryos whose nuclei have been removed. The idea is to create many copies of an animal likely to have valuable traits. But there were major problems with the technique. Most of the fetuses grew to enormous sizes, resulting in difficult births. More importantly, the technique “never proved to be economical,” says James Robl, a professor of veterinary and animal sciences at the University of Massachusetts, Amherst.

    Although those obstacles drove researchers in the United States and Europe out of the field, Japanese scientists persisted. The result, says Akira Iritani, a professor of genetic engineering at Kinki University who is working on the cloning of rabbits, is close to 400 head of cattle born using this embryonic nuclear transfer technique. Many of those prefectural researchers trained under Tsunoda, who had pioneered nuclear transfer work in Japan in the mid-1980s.

    Until the Dolly news broke, most researchers believed that DNA from adult mammalian cells could not develop into a complete embryo. Indeed, the Agriculture ministry even had a policy that discouraged the use of adult somatic cells. But once Dolly appeared, officials quickly revised their guidelines and in August 1997 approved attempts to clone adult animals. The early work with embryonic nuclear transfer put Japanese researchers in a good position. “Everyone [in the cattle-breeding industry] is very familiar with nuclear transfer techniques,” says Chikara Kubota, who heads a cloning research group at Kagoshima Prefecture's Cattle Breeding Development Institute that followed Tsunoda's group in successfully bringing a fetus cloned from an adult cow to term.

    Tsunoda was not surprised by the Dolly announcement. He had already spent years looking at various types of cell nuclei for totipotency—the ability of a cell to differentiate into all the types needed by an organism—and had discovered that a mouse could develop from cells taken from the trophectoderm, the first differentiated cells in an embryo, which eventually form the placenta. “So we thought somatic cells should have totipotency,” he says.

    When setting out to replicate the Dolly experiment with cattle, the group grew several oocytes in culture to the blastocyst stage and examined them for abnormalities in the number of chromosomes. When the first batch of blastocysts proved normal, the researchers prepared a second batch of oocytes with transferred nuclei and cultured the developing embryos on mouse fibroblast cells for 8 or 9 days. They then selected 10 of the blastocysts and implanted two in each of five hosts. All five became pregnant and were cared for at the Ishikawa Prefectural Livestock Research Center, on the Japan Sea coast 300 kilometers west of Tokyo.

    Two other groups started cloning efforts at about the same time, and more joined the chase once news of Tsunoda's pregnant cows spread. But Tsunoda's group won the race when one of the surrogate mothers delivered healthy twin calves after 243 days, only 37 days earlier than a normal gestation. The average weight at birth of the eight calves, all born naturally, was about 32 kilograms; the average for natural pregnancies is about 27 kilograms.

    Despite Tsunoda's success, other groups have had problems with large calves, including one cloned from fetal cells that weighed in at a whopping 52 kilograms after a cesarean section. And although it won't be clear until their reports are published, few of the other groups seem to be matching Tsunoda's success rate. Kubota says only about 10% of his team's implanted embryos have been carried to term, and the Roslin Institute group produced only one live sheep from 13 transferred blastocysts.
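    The success rates quoted above can be put side by side with a little arithmetic. Note one assumption: Kubota's exact counts are not given in the article, only "about 10%," so the 1-in-10 figure below is purely illustrative.

```python
# Cloning success rates quoted in the article: live births carried to
# term per implanted embryo or transferred blastocyst.
attempts = {
    "Tsunoda (cattle)": (8, 10),   # 8 of 10 implanted embryos carried to term
    "Kubota (cattle)":  (1, 10),   # illustrative stand-in for "about 10%"
    "Roslin (sheep)":   (1, 13),   # 1 live sheep from 13 transferred blastocysts
}
rates = {group: born / total for group, (born, total) in attempts.items()}
for group, rate in rates.items():
    print(f"{group}: {rate:.0%}")
```

    Even allowing for the small numbers involved, an 80% term rate is several times higher than the other groups' figures, which is why Tsunoda's result stood out.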

    Tsunoda believes that the most important factor in his high success rate may be his use of cumulus and oviductal cells, because of their role in reproduction. Other groups have used a variety of cells, including muscle and skin cells. Tsunoda is hoping to begin to clarify such issues through a systematic screening of cells. A second batch of cloned calves still in utero has been produced using cells from 20 different tissues, including the liver, kidney, and heart. Tsunoda anticipates results by spring.

    Most research efforts outside Japan are likely to focus on cloning cattle from fetal rather than adult cells, says Will Eyestone, a reproductive physiologist working on transgenic animals at PPL Therapeutics Inc. in Blacksburg, Virginia. Embryonic cells containing fetal or embryonic DNA are believed to grow faster, and the resulting animals appear to live longer than if they had been cloned from adult cells, he notes. “It's a better way to go,” adds Robl, if the goal is to modify the genetic makeup of the animals so that their milk contains drugs for use in humans. Using adult cells might offer an advantage in getting exact copies of cows or bulls that are particularly valuable for breeding purposes or for meat, however. But producing calves for agricultural purposes through cloning is not likely to prove economical in the United States. “We don't have a market for very high premium beef,” Robl says.

    But Japan does. “The cost of agricultural products in Japan is high, but they still sell,” says Tokyo University's Tomohiro Kono. Superpremium Matsuzaka beef roasts, for example, cost $100 a pound, and those prices would support the expense of cloning prize beef cattle. But just how commercially important cloning might be is an open question. A genetically ideal calf is just the starting point for Matsuzaka beef. The animals are also fed beer and given daily massages as part of a regimen that results in fine flecks of fat uniformly scattered throughout the meat. “We have a long way to go to make [premium] beef inexpensive,” says Hiroto Takahashi, an official in the Agriculture ministry's animal production division. “In other countries, there would be no meaning in producing [cattle] this way,” he adds.

    Given those limitations, some scientists feel that the research efforts should be focused on understanding the cloning mechanism itself. Kono, who uses rabbits to study that mechanism, says that while it was important to confirm the Roslin results in cattle, there is no need to have the efforts duplicated by so many groups. “It is a Japanese trait, [in which] everyone heads in the same direction,” he says. “There isn't much originality in the research.”

    Tsunoda agrees that a lot of the work is redundant. But using different cells may help researchers clarify the mechanism through which cells are reprogrammed to start the development process anew. “Right now, what happens in cell reprogramming is a black box,” he says. “We are at the starting point to study the reprogramming [of cells].”


    From Army of Hackers, an Upstart Operating System

    1. Joseph Alper*
    1. Joseph Alper is a writer in Louisville, Colorado.

    The open-source software movement has developed a free computer operating system that is poised to compete with Microsoft's Windows

    In the titanic struggle between Microsoft and the Justice Department, one of the software giant's chief defenses against the charge of monopoly-building is to argue that its lead in the operating-system market is vulnerable. New competitors, says the company, could challenge it at any time. That argument may seem laughable on the surface, given that neither Apple nor IBM was able to best Microsoft in the operating system wars. But there is a competitor on the horizon—and if internal Microsoft documents are to be believed, the software Goliath may be showing some nervousness.

    David in this case is called Linux, a simpler-to-use variant of the old standby UNIX. Already, Linux is the operating system of choice for Internet servers, the computers that route Internet traffic and host sites on the World Wide Web, and its use is growing rapidly for small servers on local area networks. It also serves as the operating system on a cut-rate supercomputer at Los Alamos National Laboratory, called Avalon. And now, thanks to a massive effort by programmers around the world, all but a handful working for free, Linux is poised to make significant inroads in the workstation and desktop personal computer world, which is largely the domain of Microsoft and, to a lesser degree, of UNIX.

    Linux, by all accounts, is stable, powerful, and fast—and it's free. Yes, free: Linux is the fruit of a kind of online commune, an intellectual descendant of the counterculture of the 1960s. And although free software is nothing new—UNIX itself was initially distributed free by its developer, Bell Labs—Linux represents one of the most visible successes of what is called the open-source software (OSS) movement, an approach to software development first championed in 1983 by computer scientist Richard Stallman at the Massachusetts Institute of Technology. Proponents of this model, who now include the Web giant Netscape, believe that all software should not only be free, but should also be accompanied by its source code, the human-readable instructions written by programmers. “By making the code open and providing it free with the binary software, it means that you are now allowing users to start tinkering with a program,” says Miguel de Icaza, a programmer and system administrator for Universidad Nacional Autónoma de Mexico in Mexico City. “Essentially, you harness the power of millions of users to find problems, whether they be bugs or just deficiencies, and thousands of programmers to fix them quickly.”

    “The end result,” he says, “is that you get software that's smaller, less buggy, and more stable”–which many computer scientists say is the case for Linux. The Avalon supercomputer has been running for many months now without crashing, reliability that is almost unheard of in the supercomputer world. Some common applications for personal computers and workstations also run faster under Linux. According to an internal Microsoft memo, leaked to the public via the Internet by an internal source and confirmed as authentic by Microsoft, Netscape's Navigator Web browser rendered graphics and text “at least 30–40% faster” when it ran in Linux than it did in Microsoft's own operating system, Windows NT. Finally, Linux's small size and speed mean that it runs just fine on less expensive computers, including those with Intel's older 80486 processor and its clones. “For low-budget operations, you can take used 486s, install Linux, and you have a really cheap but powerful Internet server,” explains Todd Lewis, a Linux volunteer whose paying job is at MindSpring, an Internet service provider.

    Linux's source code has been open since it was first created in 1991 as a simpler-to-use version of UNIX by Linus Torvalds, then a 21-year-old undergraduate at the University of Helsinki. He posted the approximately 10,000 lines of code on the Internet, and in short order other programmers began sending him fixes and improvements, which Torvalds incorporated into the system. As news of Linux spread through the programming community, more programmers joined the effort, and in March 1994, with 100,000 users, version 1.0 was released along with supporting software. This was followed in June 1996 by version 2.0.

    By then, the Linux kernel—the core operating system—had grown to 400,000 lines of code, all of it written by volunteers and incorporated into the kernel by Torvalds. Under the terms of the “CopyLeft” license developed by Stallman's Free Software Foundation, a virtual organization that is promoting the concept of open-source software, anyone can use the code and modify it, but they must then send the change to the community for review. “Then, the ultimate decision on whether or not to incorporate a change rests with one of the ‘benevolent dictators’ that ‘rule’ a particular part of the Linux project,” explains Lewis. Several thousand volunteers now contribute, but the buck still stops with Torvalds, who works for a small Silicon Valley chip designer called Transmeta Corp. in Santa Clara, California.

    The development effort has spread to the commercial world. Today, at least four companies, including RedHat Software in Research Triangle Park, North Carolina, and Caldera Systems in Orem, Utah, develop and sell Linux-compatible ancillary software such as graphical user interfaces and suites of common business applications, which the companies package with Linux. They, too, send proposed Linux modifications to the appropriate individual and adhere to the same CopyLeft license as everyone else. According to industry newsletters, the operating system now runs on an estimated 10 million computers, mostly workstations and Internet servers, displacing other UNIX systems and Microsoft's Windows NT.

    Linux is also making inroads in the research community. Recently, Michael Warren's group at Los Alamos used Linux as the operating system for Avalon, a collection of 140 Digital Equipment Corp. Alpha desktop computers wired in parallel. The result is a genuine supercomputer, capable of 47.7 billion calculations per second, at a total cost of $300,000—parts, labor, and software included. Several research groups at Los Alamos have used Avalon to solve problems in areas as diverse as astrophysics and molecular dynamics.

    Linux will soon make its most direct challenge to commercial software in the form of a set of programs called GNOME (the g is pronounced), a so-called front end that would turn Linux into the equivalent of the Windows 98 operating system for PCs, complete with supporting functions such as a file management system, text editor, mail protocol interpreter, and disk formatter. GNOME can't run software configured for Windows 98, but with filters it can work with files prepared in various Microsoft applications, such as Word and Excel. Recently, de Icaza, who heads GNOME development, “froze” further additions to the software in anticipation of releasing the first complete version in March 1999. “Now, the GNOME community—several hundred strong—will complete testing and debugging,” says de Icaza.

    That enormous pool of volunteer talent gives the Linux development effort, and open-source software in general, a key advantage, explains Ransome Love, president of Caldera Systems. “We and everyone who uses Linux, as well as the other companies that distribute it, benefit from the 24-hour-a-day efforts of thousands of people around the world who just pounce on problems and get them fixed.” It's an advantage that Microsoft has noted. In a confidential memo posted to various bulletin boards by someone inside the company, Microsoft product manager Vinod Valloppillil wrote, “The ability of the OSS process to collect and harness the collective IQ of thousands of individuals across the Internet is simply amazing.”

    In Linux, Valloppillil said in a second leaked memo, the process has resulted in “a best-of-breed UNIX that is trusted in mission critical applications and due to its open source code has long-term credibility which exceeds many other competitive [operating systems]. Long term, [our] simple experiments indicate that Linux has a chance at the desktop market.” A Microsoft spokesperson has confirmed publicly that these memos are legitimate, saying they describe “business models that would be valuable in order to stimulate additional internal dialog within Microsoft.”

    Linux is not the only success of the open software movement. Many add-on programs, so-called utilities such as the file compression program Zip, and file readers such as Ghostview were born of this movement. Apache, an Internet server program that runs under Linux and was developed by about 20 programmers from around the world, is now found on more than half the computers that host Web sites, according to the November 1998 Netcraft Web Server Survey. BIND, developed by hackers at the University of California, Berkeley, in the 1980s, is used by virtually every Internet name server to convert Web aliases into true numeric addresses, and the open-source program Sendmail routes about 80% of all e-mail sent today.
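    The name-to-number translation that BIND performs on servers can be seen from the client side with Python's standard socket module; this is a modern illustration of the lookup, not part of BIND itself. "localhost" is used here because it resolves from the local hosts file, without any network access.

```python
# Ask the system resolver (which may ultimately query a BIND name
# server) to turn a host name into the numeric IPv4 address that
# routers actually use.
import socket

address = socket.gethostbyname("localhost")
print(address)  # the IPv4 loopback address, conventionally 127.0.0.1
```

    Swapping in a real Web alias such as "www.example.com" would exercise the full chain the article describes: the resolver forwards the query to a name server, which returns the numeric address.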

    Netscape embraced the open-source software credo last March when it released the source code for the latest version of its Communicator software, which can be downloaded free from Netscape's Web site. Within hours after the release, a group of Australian hackers wrote a small piece of cryptographic code that greatly increased the security of Communicator. Netscape got the fix for free, and every user of Communicator benefited. The Australians, who call themselves the Mozilla Crypto group, won kudos from the programming community for their great hack, and their consulting group, Cryptosoft in Brisbane, probably picked up some more business.

    Is the open software movement likely to have further successes? Microsoft evidently thinks so. These programs have proven to be “at least as robust—if not more—than commercial alternatives,” wrote Valloppillil. And for participants in the movement, the rewards remain strong. “It's a very positive ego thing, knowing that you've contributed to a great piece of software,” says Lewis.


    New Minister Sets Lofty Goals

    1. Richard Stone

    As Russia unravels, Science Minister Mikhail Kirpichnikov struggles to find a balance between supporting basic research and forcing science to pay for itself

    MOSCOW–If you find running a lab and stumping up grant support an oppressive burden, spare a thought for Mikhail Kirpichnikov, Russia's new science minister. Kirpichnikov, a protein chemist who still manages to keep one foot in the lab, is trying to patch together enough resources to keep the entire country's once-proud research enterprise from crumbling away. It's a daunting challenge. Russian researchers, who have gone months without pay, are staging strikes. Unable to pay utility bills, institutes are plunged into a cold twilight during working hours. The few labs able to sustain world-class research rely on foreign colleagues for access to instruments or reagents. And in the latest blow to Russia's scientific community, the ruble has lost two-thirds of its value against the dollar in 3 months, vaporizing razor-thin budgets for Western research supplies.

    Kirpichnikov, who moved into the research hot seat on 25 September, says his chief priority is to “do everything we can to protect basic research. It is our country's destiny.” But just how to fend off further decay is a matter of vigorous debate among rank-and-file scientists—and is likely to dominate discussions at a meeting next week in Moscow, co-sponsored by the ministry and the Paris-based Organization for Economic Cooperation and Development, on the future of Russian science. Kirpichnikov “understands all the problems of science, I think,” says Alexander Litvak, deputy director of the Institute of Applied Physics in Nizhny Novgorod. “But what depends on him? It all depends on money.” Still, top officials place high hopes in him. “I am convinced that Kirpichnikov … will do his best to preserve the best of Russian science,” says Michael Alfimov, chair of the Russian Foundation for Basic Research, Russia's version of the U.S. National Science Foundation.

    Kirpichnikov, who agreed to an interview with Science despite government restrictions on Cabinet members speaking with the press, says he is ready to try some new maneuvers to steer Russian science through its latest crises. These measures may include channeling money to some disciplines at the expense of others and “aggressively” claiming intellectual property rights for scientists. To Kirpichnikov, saving Russian science means exploiting it: stepping up efforts to patent inventions, license products, and contract out research. “Science is one of our strategic resources, a reusable resource,” he says. “For years, science has been underutilized.”

    Kirpichnikov acknowledges, however, that his ministry's options are limited by Russia's dwindling finances: This year, the government will spend less than $2 billion on science, the lowest sum in decades—and that's an official figure that, optimistically, includes payment of overdue salaries by month's end (see graph). August's financial crash made a bad situation far worse. The ruble lost half its value in a week, cutting pay for top scientists to $100 a month, says Vladimir Strakhov, director of the All-Union Institute of Earth Physics in Moscow. The result was double jeopardy for research: “On the one side, the ruble fell, and on the other, we get less rubles,” says Strakhov, who went on a hunger strike in 1996 to protest shrinking research budgets. “Today's situation is the worst it's ever been for Russian science. And the most difficult times are in the future.” Sources say that next year's budget, which is expected to go to the Duma, the lower house of parliament, on 12 December, includes a paltry 8 billion rubles ($444 million, at the present exchange rate) for science.

    Kirpichnikov, 53, has experienced the rise and fall of Russia's scientific community firsthand. He graduated in 1969 from the prestigious Moscow Institute of Physics and Technology, specializing in molecular biophysics. From there he landed a research position at the Engelhardt Institute of Molecular Biology, where he pioneered techniques for making artificial proteins, racking up more than 200 publications. His research has been “highly appreciated and recognized by the scientific community,” says Alfimov. Today, Kirpichnikov heads the institute's protein engineering group, where he still puts in regular appearances. “When I come to my laboratory, it's a kind of relaxation for me,” he says.

    Conducting research provides a respite from Russia's government offices, where Kirpichnikov has labored for the last 9 years. He spent 4 years as a division chief in the science ministry before becoming director in 1993 of the government's Department of Science, High Technologies, Education, and Culture—a position similar to that held by Neal Lane, President Clinton's science adviser. Known as an intelligent and soft-spoken administrator, Kirpichnikov forged a strong ally in former Science Minister Vladimir Bulgak, now a deputy to Prime Minister Evgeny Primakov. For 2 years Bulgak talked up plans for commercializing Russian research and closing some of the Russian Academy of Sciences' roughly 350 institutes, where much of the best research is done, but he failed to deliver on the promised reforms (Science, 14 November 1997, p. 1220).

    Like Bulgak, Kirpichnikov says he hopes to “target funding for research priorities” and “increase the competitiveness of Russian scientists on the world market.” He rattles off a list of areas that he says merit special attention: molecular biology, genetic engineering, physics, new materials, telecommunications, and information technology. “Not a single country in the world can carry out research in all disciplines,” he says. Targeting research in this way, he acknowledges, would require restructuring the academy.

    The question is whether Kirpichnikov can do any better than Bulgak in shaking up a research system deeply rooted in the Soviet era, when there was little competition for funding. Institutes still receive budgets determined primarily by the size of their staffs, and the science minister has no authority over how the academy spends its money. Kirpichnikov, for now, declines to reveal how his ministry might steer more money to labs in strategic areas. And when he does show his hand, he is sure to provoke a backlash from scientists clinging to tenuous careers. “The very mention of reforms irritates impoverished scientists,” says Leo Borkin, founder of the St. Petersburg Association of Scientists and Scholars.

    Kirpichnikov also faces a tough challenge in trying to stem the loss of Russian innovations overseas. The brain drain of the early 1990s may be over, but for “any scientist who remains here in Moscow, his intellectual property may drain to the West,” Kirpichnikov says, referring to dozens of contracts inked between Russian researchers and firms such as Microsoft and Motorola, as well as inventions for which institutes lack funds to seek patents. “We don't have much experience with protection of intellectual property. This worries me a lot.” The ministry is exploring ways to safeguard Russian inventions without harming the ability of scientists or institutes to cut deals with foreign firms, and it will seek ideas at next week's meeting in Moscow.

    One recent thrust Kirpichnikov says he's planning to continue is a ministry program called Integration, which spent $32 million this year priming collaboration between researchers at the academy institutes, where the best science often takes place, and professors and students at universities. Loosely coordinated with a similar initiative run by the Education Ministry and Western foundations (Science, 29 May, p. 1336), Integration is expected to remain a priority next year, with an undetermined amount of new funds set aside for equipment for future joint academic-university labs, called Centers of Excellence. “This is a very acute issue,” Kirpichnikov says. “Most equipment is out of date.”

    Reform-minded scientists welcome such moves, but they argue that the prospects for Russian science are now so dire that radical surgery is needed. “It's terrible,” says Strakhov. “Instead of reading scientific literature and discussing problems, scientists must spend their time selling cigarettes or tending gardens. They're losing their professional level.” The time has come, he says, to fire mediocre scientists and close lame institutes. “The government is avoiding confronting this inevitable question. They are afraid of the responsibility.” With the gauntlet thrown down, Russian scientists are waiting to see if Kirpichnikov, unlike his predecessors, will pick it up. Over the next few months, those precious few hours Kirpichnikov spends in his lab are likely to seem more and more appealing.


    New DOE Research Program to Boost Sagging Industry

    1. David Malakoff

    A $19 million competitive grants program aims at developing new technologies and reinvigorating the nuclear science community

    When nuclear engineers from academia and industry gathered last month in Washington, D.C., for an annual conclave, they heard an old refrain: The prospects for building a nuclear power plant in the United States anytime in the foreseeable future are bleak. Panelists noted that no new U.S. plants have been ordered since 1978, and many others have been shuttered because of cost and safety concerns. And there was consensus that an increasingly competitive power market may soon snuff out more of the nation's 109 aging plants. But, amid the gloomy predictions, researchers heard one note of optimism: Officials at the U.S. Department of Energy (DOE) had breathed fresh life into the government's moribund nuclear research program with the creation of a Nuclear Energy Research Initiative (NERI).

    Turning on the power?

    DOE's Nuclear Energy Research Initiative restores some money to a field that has seen its fortunes plummet in the past 2 decades.


    Supporters say the $19 million initiative is a desperately needed first step toward funding studies that may not pay off for decades. DOE undersecretary Ernest Moniz hopes the program, which has six focus areas (see box), will eventually lead to technologies that prevent nuclear weapons proliferation and form the basis for cheaper, safer reactors that generate less waste. NERI proponents also argue that the funding is essential to preserve the nation's nuclear science community, which has seen its numbers and funding dwindle in recent years. “There's no question that NERI is a big shot in the arm—we've been virtually without funding for years,” says Barclay Jones, a nuclear engineer at the University of Illinois, Urbana-Champaign. Indeed, cash-starved researchers have responded enthusiastically: This week, DOE officials began sorting through more than 500 preliminary proposals seeking slices of the NERI pie.

    Critics, however, charge that NERI represents a handout to a mature industry that can afford its own research. “These corporate welfare programs waste taxpayer dollars on a crumbling industry,” says Auke Piersma of Public Citizen's Critical Mass Energy Project, a Washington, D.C.-based advocacy group.

    The impetus for NERI was an imbalance in DOE's research portfolio that followed the 1985 cancellation of the $7 billion Clinch River Breeder Reactor in Tennessee and other nuclear projects (see graph). That trend culminated last year in the zeroing out of DOE's primary nuclear research budget, while the department's solar energy research program received $79 million and $362 million was allocated for oil, gas, and coal studies. In a November 1997 report, the President's Committee of Advisors on Science and Technology (PCAST) warned that those spending decisions threatened to rule out a U.S. atomic resurgence at a time when concerns about global warming could revive interest in nuclear energy. Because nuclear plants produce virtually no carbon dioxide, they offer an attractive alternative to some policy-makers searching for ways to limit carbon emissions over the next century.


    Such concerns led PCAST to conclude that “fission belongs in the R&D portfolio.” The panel recommended that Congress revive DOE's nuclear program, starting at $50 million in 1999 and reaching an annual steady state of $100 million by 2003. The Clinton Administration, however, requested just $24 million, and Congress coughed up $19 million. The appropriation, say congressional aides, reflected the ambivalence of some economically conservative lawmakers who saw the program as an unnecessary handout. But support from Senator Pete Domenici (R-NM) and Representative Joseph Knollenberg (R-MI) eventually helped secure enough funds to get the program rolling.

    The new initiative comes with one string attached, however. Lawmakers and the White House insisted that DOE conduct a peer-reviewed competition open to both DOE scientists and their colleagues in academia and industry. In the past, DOE nuclear program officers had doled out money to selected researchers, often at the department's own extensive network of laboratories, with minimal external review. “We need strong, competitive proposals to revitalize the nuclear option,” Moniz told more than 100 researchers who gathered in Washington in April to help design the initiative. NERI, he said, would focus on “finding the best ideas, irrespective of where they originate.”

    The new approach appears to be paying off, says NERI manager John Herczeg, noting that his office received 524 preproposals earlier this month. They presented “a whole raft of ideas that we might never have dreamt of here in DOE,” he says, from self-repairing ceramics for coating nuclear fuel pellets to innovative control systems. “There's a tremendous pent-up demand and untapped creativity out there,” says nuclear engineer William Kastenberg of the University of California, Berkeley.

    That popularity may complicate the peer-review process, however, as many potential reviewers are also applicants. To solve the problem, DOE is planning to pile the work onto fewer reviewers. And in a novel twist, reviewers who are not government employees will get a small monetary reward for their labor: $1200 to evaluate nine proposals.

    Herczeg expects to receive some 300 proposals by the 29 January deadline, with requests ranging from $100,000 to $1 million a year for 3 years. The DOE announcement encourages collaborations among sectors, and would-be applicants say they recognize the advantages of working together. “What seems to be emerging is that labs are scrambling to find university partners,” says nuclear engineer Gilbert Emmert of the University of Wisconsin, Madison, who reports that colleagues in his department were approached by several DOE labs. Similarly, Kastenberg says he's been courted “by at least four different labs. … They know partnering enhances their chance of getting money.” He eventually hooked up with all four on two proposals, including one that involved three DOE labs, two universities, and a company.

    For their part, many universities are hoping that NERI funds will help slow—and eventually reverse—the decline of nuclear science departments. Nationally, undergraduate enrollment in nuclear engineering and related programs has declined by an “alarming” 10% per year in the 1990s, according to the PCAST report, while the number of graduate programs in the field has fallen by 30%, to 35, since 1975. “Formerly strong university groups are becoming subcritical in size,” the PCAST report concludes.

    How far NERI can go in rebuilding academic programs will be up to Congress, which will set annual funding levels for the program. Moniz said recently that DOE plans to ask for a boost in its 2000 budget request, to be submitted to Congress in January. And researchers hope that the outpouring of proposals this year will convince lawmakers to pump up the program.

    NERI's advocates are also braced for another round of attacks by opponents, both antinuclear campaigners and economic conservatives. During this year's budget battles, for instance, Public Citizen suggested that NERI stood for “Nuclear Expenditures to Replace the Insolvency.” The group accused the program of focusing on “increasing industry profits by reducing the cost of fuel, bandaging aging reactors, and planning future reactor designs.”

    But Herczeg says that criticism is off the mark. It's unlikely industry will see any short-term benefits from NERI spending, he says. “This has to be long-term R&D that may not pay off for a minimum of 10 years and preferably 20 years,” he says. “We don't want any technology that is off the shelf. If [a project] doesn't produce new knowledge, we're not interested.”
