News this Week

Science  09 May 1997:
Vol. 276, Issue 5314, pp. 888
  1. Biomedical Research

    NIH Plans Peer-Review Overhaul

    1. Eliot Marshall

    After a long debate, NIH Director Harold Varmus has ruled that grant proposals should be judged on “innovation.” Next, NIH plans to revamp the structure of the study sections.


    If you have been arguing that the peer-review system for biomedical research is in trouble and needs to be fixed, be forewarned: You may get your wish. This month, the National Institutes of Health is embarking on an overhaul of the entire network of peer-review panels that selects winners of the more than $5 billion worth of grants NIH doles out each year. The effort—if carried out as intended over the next 18 months—would go beyond the relatively minor tinkering with procedures that has taken place over the past 3 years. And, like all previous attempts to modify the system that determines who gets funded and who doesn't, this effort is already proving controversial.

    NIH took the first step in its attempt to bring substantive change to its peer-review procedures on 5 May, when NIH director Harold Varmus announced that he has ended a long-running debate over proposed new criteria for judging the quality of grant proposals. Varmus overruled some of NIH's top leaders by deciding that, beginning next year, the study sections will rate proposals on a new criterion: innovation. It joins four other newly defined criteria reviewers will use to come up with a single score that determines whether a proposal will be funded or not. The next steps will come over the next few months, as agency staffers begin exploring changes in the structure and membership of the study sections—the peer-review panels that will implement these new criteria. They have already begun experiments that could lead to the assignment of some research fields to new panels (see sidebar).

    DOES YOUR GRANT MEASURE UP?

    Stamp of approval. Varmus released these criteria on 5 May.

    SIGNIFICANCE

    Does this study address an important problem? If the aims of the application are achieved, how will scientific knowledge be advanced? What will be the effect of these studies on the concepts or methods that drive this field?

    APPROACH

    Are the conceptual framework, design, methods, and analyses adequately developed, well integrated, and appropriate to the aims of the project? Does the applicant acknowledge potential problem areas and consider alternative tactics?

    INNOVATION

    Does the project employ novel concepts, approaches, or methods? Are the aims original and innovative? Does the project challenge existing paradigms or develop new methodologies or technologies?

    INVESTIGATOR

    Is the investigator appropriately trained and well suited to carry out this work? Is the work proposed appropriate to the experience level of the principal investigator and other researchers (if any)?

    ENVIRONMENT

    Does the scientific environment in which the work will be done contribute to the probability of success? Do the proposed experiments take advantage of unique features of the scientific environment or employ useful collaborative arrangements? Is there evidence of institutional support?

    Revamping the criteria should have been the easy part. But, for almost a year, leading NIH program officers and academic advisers have been mired in a stalemate over the question of creativity. On the one side are some outspoken academics—led by Keith Yamamoto, a molecular biologist and former colleague of Varmus's at the University of California (UC), San Francisco—who believe that the degree of originality in a proposal should be explicitly weighed by study groups. On the other are research leaders, mostly at NIH, who are skeptical about singling out this quality for special attention, arguing that it could lead to discrimination against proposals from areas such as clinical research.

    Yamamoto, who chairs the advisory panel for the NIH's Division of Research Grants (DRG), has been lobbying for creativity as a criterion since 1994, shortly after his colleague Varmus took the helm at NIH (Science, 4 March 1994, p. 1212). Like many others in the biomedical community, Yamamoto feels that the funding crunch is having a detrimental impact on science by making reviewers too cautious. Reviewers often endorse proposals by scientists who have already established a successful track record or who propose a modest extension of work they have already published, he says. “When money is very tight,” agrees DRG advisory-committee member Elizabeth Theil, a biochemistry and physics professor at North Carolina State University in Raleigh, people favor “the sure bet” and neglect “the innovative possibilities.”

    NIH institute leaders themselves concede that reviewers have become risk-averse in the 1990s, favoring grants with the best pedigrees rather than those taking on the biggest challenges. At a meeting of NIH's Peer Review Oversight Group (PROG) this week, for example, Howard Schachman, a professor of molecular and cellular biology at UC Berkeley and a special adviser to Varmus, said that reviewers do “too much nit-picking.” They harp on technical weaknesses, Schachman claimed, rather than focus on substance.

    To counter that trend, Yamamoto suggested last year that NIH instruct reviewers to rate grants explicitly on “creativity” or “innovation” as well as on several other traditional measures. His suggestion was one of several new grant criteria that were debated at NIH over the winter and have been included in Varmus's list. A few outsiders weighed in on Yamamoto's side, including Stanford University biologist Paul Berg, who, as an officer of the American Society for Cell Biology, “wholeheartedly” urged Varmus to adopt the creativity standard to show that “innovative ideas are valued” at NIH.

    The notion did not get wholehearted endorsement among senior NIH staffers, however. Although they gave their support to most elements of the revised set of grant-review criteria, some NIH research chiefs balked at the idea of judging grants on originality.

    Claude Lenfant, director of the National Heart, Lung, and Blood Institute, led a working group that surveyed staffers at NIH on this question last winter and concluded that they “would not find it helpful” to have grants rated on their innovativeness. Lenfant conceded in a report that new ideas in science are often “controversial” and that “there is a perception … that such work is not well received in review groups.” But his panel was against using creativity as “an explicit review criterion” because doing so might “create the awkward situation” in which a proposal's failure to address the criterion, even when it was not applicable, “could carry a negative connotation.” In addition, Lenfant said clinical researchers might feel it put them at a disadvantage. The Lenfant panel's skepticism was shared by other NIH program officers, including Wendy Baldwin, deputy NIH director for extramural research. When discussions reached an impasse in February, Baldwin bumped the matter up to Varmus.

    Now, after a 2-month review, Varmus has given an unequivocal endorsement of the creativity principle. In a statement Varmus read to the PROG meeting this week, he said that, starting in fiscal 1998, proposals submitted to NIH should be reviewed on five criteria—made explicit for the first time—including the degree of “innovation” they exhibit. To judge innovation, Varmus explained, he would ask: “Does the project employ novel concepts, approaches, or methods? Are the aims original and innovative? Does the project challenge existing paradigms or develop new methodologies or technologies?” (See table above for other criteria.)

    Varmus handed out copies of the new criteria at the PROG meeting and brushed aside the arguments against evaluating proposals on innovation. “I take umbrage at the suggestion” that clinical research would score poorly on innovation, he said. “There's plenty of imaginative clinical research and plenty of me-too-ism in laboratory research.” He complained that the “whole issue” of grant criteria “got overly wound up” and pleaded for “a more dispassionate treatment” of it. The reason for adopting new standards, he explained, is to shift reviewers' attention away from “technical details” and toward substantive research concepts.

    Varmus made it clear that he doesn't want to debate the criteria any longer. He said he would entertain proposed “refinements” of wording over the next “couple of days.” But apart from that, he said, “we should get on with the business of trying” the new standards. With the first skirmish over, NIH now begins the bigger task of revamping the structure of the peer-review network.

  2. Biomedical Research

    Review Panels Under Review

    1. Eliot Marshall

    “My e-mail box has never been as full,” says National Institutes of Health (NIH) director Harold Varmus, discussing the intense reaction from biomedical researchers to relatively modest changes in the criteria used to judge grant proposals (see main text). Varmus's e-mail may reach another high-water mark this spring, for NIH is now considering changes that would revamp the structure of the study groups that will apply these new criteria.

    NIH is doing this, Varmus says, because he would like to “loosen the rules” a bit, in an attempt to make the review system more flexible and more focused on scientific concepts. He also wants to ensure that the same review standards are applied across all fields of research. And he says he hopes to attract more senior scientists to serve on peer-review panels, perhaps by letting busy researchers serve on a study section just once rather than three times a year as rules now require.

    Spearheading this reform effort is Elvera Ehrenfeld, a molecular biologist and former dean of biological sciences at the University of California (UC), Irvine. She joined NIH officially in January, picked by Varmus to head NIH's Division of Research Grants (DRG). Ehrenfeld oversees a network of 1800 extramural scientists who sit on 105 study sections, each focused on an arbitrarily defined area of scientific turf. At a meeting of DRG's advisory council on 28 to 29 April, Ehrenfeld sketched out a plan to achieve some of Varmus's goals, which she said could lead to a “much bigger and broader overhaul” of these peer panels than has been attempted before. She has retained one consultant—pediatrician Michael Simmons of the University of North Carolina—to serve as liaison to the clinical-research community and is looking for another to link up with behavioral researchers. DRG's advisory group, chaired by molecular biologist Keith Yamamoto of UC San Francisco, endorsed the broad proposal, and Ehrenfeld and her staff are now working out a concrete agenda.

    Judging the judges. DRG chief Elvera Ehrenfeld.

    Ehrenfeld told Yamamoto's committee that in addition to trying to lure more senior scientists into the system, she is concerned about reports that competition for funding is much more intense in some study sections than in others. Ehrenfeld said some grantees complain that “too much of the hottest, most active” science may be crowding into “too small a subset of study sections.” At the same time, she said, other panels may be responsible for proposals from “not-so-productive” fields. As Donald Cleveland, a member of DRG's advisory council and professor of medicine at UC Los Angeles, notes, panels in both the hot and the not-so-active fields can give fundable scores to the same percentage of grants. Ehrenfeld also mentioned a third problem: “the orphans”—the emerging areas of science that have no proper home and “may not be getting the best review.” They should be better provided for, she says, perhaps by creating entirely new study sections.

    Ehrenfeld told the advisory group that her office will look into the flow of grant applications, the principles that guide the assignment of proposals to specific study sections, and how “hot” areas of science fare in review. DRG staffers have already started a set of “pilot projects” to judge how well the judges perform. They are looking over the shoulders of seven study sections in cell development and function, and 22 recently reorganized study sections in the field of neuroscience, to see how efficiently they handle grant applications and how well they have done at picking winners. As the DRG gathers the information, it will begin drafting recommendations.

    Ehrenfeld and Yamamoto say they hope the DRG will be able to come up with a concept for reorganizing NIH's study sections within 18 months. It is an ambitious goal. But, as Ehrenfeld said last week, many committees have studied peer review at NIH, and “we suffer … from talking a lot.” This time around, she said, “we will see a bunch of experiments and hopefully some solutions” to these long-debated problems.

  3. Department of Energy

    Changes at Brookhaven Shock National Lab System

    1. Andrew Lawler

    Energy Secretary Federico Peña delivered a personal message to Brookhaven National Laboratory managers last week—but it was not, as some might have expected, a homily to mark the lab's 50th birthday. Instead, Peña announced that he intends to end the government's contract with the New York lab's operator, Associated Universities Inc. (AUI). And he excoriated Brookhaven management for “unacceptable and inexcusable” behavior in dealing with a series of toxic spills and leaks. “Doing excellent science does not excuse lapses in environment, safety, and health management,” Peña said.

    The surprise decision by the new secretary is sending shock waves through the Department of Energy (DOE) and its network of laboratories around the country, most of which are run by private contractors. It's being seen as evidence of a new, get-tough attitude by federal officials that lab directors would ignore at their peril. “[Peña] sent a very clear message to all of us: Scientific excellence is important, but he expects operational excellence as well,” says William Madia, director of Pacific Northwest National Laboratory.

    It's also a watershed event for Brookhaven, a $400-million-a-year lab with 3000 employees. The lab is home to the High-Flux Beam Reactor (HFBR)—one of the nation's premier neutron-scattering facilities—and other machines used for a host of materials, biological, and medical research experiments.

    The storm that has engulfed Brookhaven has been brewing since the 1980s, when community activists on Long Island succeeded in shutting down the nearby Shoreham nuclear plant. A series of chemical and radionuclide leaks at the lab over the past decade, combined with a 1994 fire at the HFBR, heightened safety concerns and soured relations between the lab and its neighbors. The anger of local residents reached fever pitch this winter, when Brookhaven revealed a tritium leak at the HFBR. The reactor, shut down last fall for maintenance, will remain closed while technicians pin down the source of the leak, which has spread in a plume underneath the lab grounds, and monitor its impact. DOE officials now believe there has been a continuous leak for more than a decade, although they say it poses no threat to the region's drinking water.

    Such problems, however, have already had a far-reaching effect on the lab itself. An extensive evaluation of Brookhaven's safety-management program released last week blames both DOE and AUI—a Washington-based consortium of nine northeastern universities—for failing to deal with a host of environmental, health, and safety problems. The study found confusion at DOE headquarters, field offices, and among DOE officials at Brookhaven over who was responsible for enforcing regulations. Adding to the problem is Brookhaven's status as a multiprogram lab, which means that several DOE offices oversee it. The study calls for better department coordination of environmental and health issues “and more effective and efficient allocation of funding and resources” in these areas. It also says the relationship between DOE and Brookhaven officials is too cozy.

    Lab brass. Secretary Peña (left) introduces Brookhaven to its new DOE boss, John Wagoner.

    For now, the heaviest blow falls on AUI, which Peña says failed to abide by DOE regulations while playing down local concerns. DOE will hold a new competition, and a new contractor will be selected within the next 6 months. The contractor will appoint a new lab chief. AUI, which has run Brookhaven since its inception in 1947, is free to bid, says Martha Krebs, DOE energy research chief. The secretary has given Krebs, whose office is the largest overseer of Brookhaven, a month to come up with a plan to correct the problems laid out in the report and to address DOE's failure to respond adequately to the Long Island community. In the meantime, John Wagoner, manager of the DOE office in Richland, Washington, will oversee the lab, while the Environmental Protection Agency conducts an independent inspection of the facility's environmental problems.

    Longtime Brookhaven director Nicholas Samios, who stepped down on 30 April as part of a long-planned transition of management, acknowledges that “upper management should have done more” to address the concerns. “We could have moved more aggressively,” says Samios, now a researcher at the lab. And he admits that Brookhaven failed to bridge the gap between local residents and the lab: “Our outreach program probably wasn't as large as it should have been.” But he also blames what he calls an “antiscience” attitude among the public. “People are frightened of radiation,” he says. “It's a problem of ignorance.”

    Local activists, who say their complaints were swept under the rug, bristle at such statements. And Brookhaven's insensitivity to local fears draws a harsh reaction from some DOE officials as well. “This is an example of the arrogance of the scientific community,” says one DOE manager, who notes that the lab is “in the backyard of Shoreham and still they didn't get the message.”

    Peña appeared particularly angry last week about what he called the lab's loss of public trust. AUI Chair Paul Martin, dean of engineering and applied sciences at Harvard University, says “there's no doubt there has been quite a bit of hubris” among some Brookhaven scientists, although he says those at Harvard and other universities can be equally arrogant. He adds that Samios didn't make many friends in Washington during his 15-year tenure—and angered Senator Alfonse D'Amato (R-NY)—by his response to the environmental questions being raised. “Maybe we should have beat on [Samios] more,” he says about AUI's duty to oversee the lab. But he insists that Peña moved precipitously to terminate the contract, given AUI's recent efforts to find a new director and to impose a stricter system of monitoring problems such as the tritium leak.

    Privately, some AUI officials say they are victims of Peña's desire to appear decisive after being criticized for his handling of the 1996 ValuJet crash in Florida while he was transportation secretary. They also decry D'Amato's efforts to win Long Island votes by vilifying Brookhaven. “We're the whipping boy,” says one AUI manager.

    Local critics have applauded Peña's actions, although they say their work is not done. “I'm cautiously optimistic,” says William Smith, director of Fish Unlimited, a national organization located near Brookhaven. But he said critics still plan to file suit this summer against the lab for failing to abide by federal environmental regulations.

    AUI's fate serves as a clear warning to other lab managers. “I don't think we have any problems—but as of this morning, you can be sure we're double-checking,” says Madia.

  4. Genomics

    Whitehead, Three Firms Splice a Deal

    1. Wade Roush

    CAMBRIDGE, MASSACHUSETTS—“Genomics” is the biotech industry's next unexplored continent: a world where information about people's genes and gene activities will create new ways to diagnose, prevent, and combat disease. Or so researchers and investors hope (Science, 7 February, pp. 767-782). Biotech and pharmaceuticals firms are now betting that a good way to stake out commercial territory in this new world is to hire the mapmakers. That's why an unusual new consortium of companies announced last week that it has joined Eric Lander, a gene mapper at the Whitehead Institute for Biomedical Research and the Massachusetts Institute of Technology (MIT), in a 5-year, $40 million effort to develop new “functional genomics” techniques.

    “Rather than wait and see what happens, we want to pick and choose our path into genomics,” says biologist Richard Gregg, a vice president and head of an internal genomics task force at consortium member Bristol-Myers Squibb. “Eric is one of the leaders in thought and technology.” A former mathematician and MacArthur Fellow who was elected last month to the National Academy of Sciences, Lander directs the Whitehead/MIT Center for Genome Research, one hub of the massive government-funded effort to locate and characterize the estimated 60,000 to 100,000 genes in the human genome.

    Mapping out a deal. Lander lands $40 million for functional genomics.

    Under the deal, announced on 29 April, the New Jersey-based pharmaceuticals giant and two smaller biotech firms—“DNA chip”-maker Affymetrix Inc. of Santa Clara, California, and Millennium Pharmaceuticals of Cambridge, Massachusetts—will give the center equal amounts of cash and equipment for research into faster, more efficient ways to gather and compare genetic data. In return, the companies will receive commercial rights to technologies developed under the program. Most coveted by the firms are automated systems for analyzing, simultaneously and over time, the activities of tens of thousands of genes and proteins in normal and diseased cells. Detailed legal provisions, and the nature of the inventions themselves, will govern which of the firms will get joint or exclusive rights.

    The agreement will significantly boost the center's current $14 million annual research budget. That money, most of it from federal grants through the Human Genome Project, has paid for the first rough guides to the 3 billion nucleotides in human DNA: maps studded with thousands of landmarks called “sequence tagged sites” (Science, 25 October 1996, p. 540). Lander says he is now eager to see that information put to work in biomedicine. “We've put 7 years so far into building maps and sequences, telling ourselves that this structural genomic information would help change the world. It's time to take that out for a test drive,” he says.

    Responding to recent concerns that corporate funding could quash the free exchange of scientific data, Lander and the consortium members went out of their way last week to emphasize their “airtight” agreement limiting publication delays to 60 days to allow time for patent filings. Lander, moreover, will divest his stock holdings in both Millennium, which he co-founded in 1993, and Affymetrix, in accordance with the conflict-of-interest guidelines of MIT and the Whitehead Institute.

    Industry-university collaborations are common in biotech, and so is “partnering” between pharmaceuticals companies and smaller, idea-driven firms—just last week, for example, Schering-Plough Corp. signed a potential $60 million deal with Myriad Genetics Inc. of Salt Lake City focusing on cancer genetics. But Whitehead Institute director Gerald Fink, a yeast geneticist, says, “It is very unusual to see three companies working together in this way.”

    Affymetrix President Stephen Fodor predicts, however, that this consortium may well set the tone for future collaborations in biotechnology. “I suspect you will see many of these types of interactions that allow technology to be integrated in new ways by people who … are not biased by the internal culture of a particular company,” says Fodor. “It should be a very powerful way to multiply our resources.”

  5. Neutron Research

    Europeans Plan Their Next Big Source

    1. Alexander Hellemans
    1. Alexander Hellemans is a science writer in Paris.

    European neutron-scattering researchers this week announced the next step in their ambitious plan to build the world's most powerful pulsed-neutron source by 2010. On 5 May, a group of five leading research institutions released a feasibility study for the proposed European Spallation Source (ESS), a $1.1 billion neutron facility powered by a 5-megawatt particle accelerator. The 3-year technical study, supported by the European Science Foundation, detailed the technical specifications for the new machine, and the five partner institutions agreed this week to seek funding for a 3-year research and development phase to prove the concept. “In Europe, I think we can do it,” says Andrew Taylor of Britain's Rutherford Appleton Laboratory (RAL), secretary of the ESS Council.

    The hard part will be to convince European governments to pay for the ESS, but its proponents believe they have a strong selling point in the growing demand for access to neutron beams, from users ranging from individual university researchers to industrial conglomerates. In Europe alone, there are estimated to be roughly 4000 researchers who conduct neutron-scattering experiments in fields including physics, chemistry, materials science, and biology. “What we do is underpin condensed-matter science. … There is even an applied dimension to it: Understanding how alloys behave under extreme stress at a microscopic level is not a million miles away from designing turbine jet engines,” says Taylor.

    Europe is currently home to the best of each of the two types of facilities for producing neutrons for research: The Institut Laue-Langevin in Grenoble, France, has the most powerful reactor source, while RAL houses ISIS, the most intense accelerator, or “spallation,” source. The ESS would produce neutron pulses that are 30 times brighter than those obtained at ISIS. In 1995, the United States abandoned plans for a more powerful reactor facility, the Advanced Neutron Source (Science, 17 February 1995, p. 952), but a new proposal for a National Spallation Neutron Source at Oak Ridge National Laboratory in Tennessee, powered by a 1- to 5-megawatt accelerator, is currently being developed. Japan is also working on two schemes for spallation sources in the 1- to 5-megawatt range.

    Like other spallation sources, ESS would use an accelerator to speed protons, bunched together in short pulses, and slam them into a target. The collision generates neutrons by knocking fragments off the target nuclei. ESS's 700-meter linear accelerator would be the most expensive and physically the largest component. Together with the storage rings, it would cost $390 million.

    At a power of 5 megawatts, the proton beam would destroy the solid metal targets used in today's spallation sources. As a result, the target for ESS would have an entirely new design, says ESS Study Group member Tim Broome of RAL. It would consist of liquid mercury, continuously pumped through a specially designed container at room temperature. “We are completely confident that it will work, but we still have to do some work to establish the real lifetime of the target,” says Broome. For example, the team has yet to investigate whether pressure waves created in the mercury by the extremely short, high-power proton pulses would damage the container.

    Although ESS still has a long way to go before it produces its first neutrons, researchers are already looking forward to the science that its intense beams would make possible. “We can use it to look at processes in shorter times, we can use it to look at much smaller samples and dilute samples, and we can look at much more subtle effects,” says ESS science coordinator John Finney of University College London. For example, researchers would be able to study the folding and denaturation of proteins in solutions and how the interactions between nonpolar groups are modulated by adding particular ions. “This is the kind of thing we cannot even dream of getting a proper answer to now,” Finney says.

    But before that dream comes true, the participating laboratories—in France, Germany, Switzerland, Denmark, and the United Kingdom—must still secure funding for the next R&D phase from their own governments and the European Union. If they clear that hurdle, the next step would be a 2-year engineering phase, followed by a 6-year construction phase starting in 2002, and a 2-year commissioning phase from 2008. “We have to improve confidence and reduce costs,” says Taylor. “We have [technical] solutions, but we have to seek more cost-effective solutions in the long run.”

    U.S. researchers are looking on with interest. “I think it is an extremely important project for the world, not just for Europe,” says Bill Appleton at Oak Ridge National Laboratory. “The total availability of neutron sources is going down, and most sources don't have enough intensity to do the kind of new science that is possible. This source addresses both of these things.”

  6. Genome Research

    Watson Urges 'Put Hitler Behind Us'

    1. Robert Koenig
    1. Robert Koenig is a writer in Berlin.

    BERLIN—In a keynote speech to a molecular medicine congress here last weekend, one of the world's foremost geneticists—Nobel Prize-winner James D. Watson, co-discoverer of the structure of DNA and a founder of the Human Genome Project—stepped carefully into the ethical minefield of German genetic research and the legacy of Nazi eugenics policies. The time has come, he said, to “put Hitler behind us.” He urged Germany to focus on the great benefits that applying genome research can offer humankind, and to put more resources into genetic research. At the same time, Watson warned that geneticists should try to keep decisions about genetic testing and related matters “in the hands of the people” and away from state control. “Genetics as a discipline must strive to be the servant of the people, as opposed to our governments,” he told the 1000 delegates. “Never again must geneticists be seen as the servants of political and social masters” who use pseudoscience for despicable ends.

    “Your budget is still totally inadequate for Germany to have a real impact.”—James Watson

    In the name of the pseudoscience of eugenics, Adolf Hitler's Nazi regime exterminated millions of Jews, Gypsies, mental patients, and disabled people between 1933 and 1945, and carried out experiments on concentration-camp prisoners. The guilt and horror at that grotesque misuse of science was a major factor in transforming Germany—once a leader in genetics—into one of the most hostile environments for such research and the scientists who do it. In recent years, Germany has begun to emerge from its withdrawal—loosening some strict regulations and slowly rebuilding its genetic research.

    Nevertheless, in his speech and at a related news conference, Watson—president of Cold Spring Harbor Laboratory in New York—told German scientists that their nation's genetic research is not moving fast enough. Watson said he was “very happy that Germany has now finally chosen to join” the genome project, but he added: “Your budget is still totally inadequate for Germany to have a real impact. You are putting money in to use the genome, not to get it.” Watson also attacked Germany's restrictive laws governing genetic research. “Your regulation of biotechnology has been counterproductive,” Watson said, asserting that overregulation had threatened to make Germany “a second-rate nation as far as biotechnology is concerned.”

    Watson, who is famous for his outspokenness, stepped even further into delicate territory: One reason German genetics has taken so long to recover, he said, is that “Germany never purged itself” completely of the scientists whose work was misused by the Nazis. Watson—who conceded that some Americans, including scientists at Cold Spring Harbor, carried out eugenics research before the Nazi era—said that while some German researchers were punished after World War II, a number of discredited geneticists retained influential university posts. “You never came out and said the bad guys” were bad geneticists, he said, and that failure has hurt later researchers.

    Watson's remarks were loudly applauded by many German scientists at the meeting. Detlev Ganten, head of Berlin's Max Delbrück Center for Molecular Medicine, which co-organized the conference, told Science he agreed that Germany “never fully purged itself” of the sins of some Nazi-era geneticists and is still “psychologically not well prepared” for such research. He added: “We are grateful that Jim Watson came here with this message. It is very hard for Germans to say it.”

    But Watson's jabs at Germany's fledgling genome project and the nation's research regulations ruffled some feathers at the Ministry of Education, Science, Research and Technology. Elke Wülfing, the ministry's representative at the conference, told Science that the genome project, started 2 years ago (Science, 16 June 1995, p. 1556), “is getting an adequate share of the German research budget.” Germany now spends about $24 million a year on its genome project, a small fraction of the estimated $3 billion total cost of the entire Human Genome Project. Wülfing said the delay in establishing Germany's project related to “extensive discussions about ethics,” and she defended Germany's stance on regulating genetic engineering. “Despite all the known opportunities inherent in biotechnology and genetic engineering, we must not lose sight of the concerns of the general population in the face of the inherent risks,” she said.

    Watson said he hoped German science will make significant contributions to the genome project: “The gene is still regarded by much of the German population as a bad thing. The time has come to end this.” He then added: “Genetics per se can never be evil. It is only when we use or misuse it that morality comes in.”

  7. Biodiversity

    Unique, All-Taxa Survey in Costa Rica “Self-Destructs”

    1. Jocelyn Kaiser

    President Bill Clinton's visit this past week to Costa Rica has thrown a spotlight on this small, species-rich Central American country, which has been hailed as a model for how poorer nations can develop sustainably. Under President José María Figueres and his predecessor, Oscar Arias, Costa Rica has set aside almost one-fourth of its land for protected areas, and, 6 years ago, it signed a much-ballyhooed deal with the pharmaceutical giant Merck to search the country's rain forests for plants and insects containing disease-fighting chemicals.

    But off the front pages, many scientists are lamenting the demise of a unique, 7-year project, envisioned by University of Pennsylvania tropical ecologist Daniel Janzen and run by Costa Rica's National Institute for Biodiversity (INBio), that would have cataloged every single species in a Costa Rican conservation area. This spring, a team of taxonomists was scheduled to descend on the country's Guanacaste Conservation Area (ACG) on the northwest Pacific coast to begin the first stage of the inventory. Instead, project scientists are spending the season getting over the shock of learning that the world's first All-Taxa Biological Inventory, or ATBI, has, as one researcher puts it, “self-destructed.”

    According to INBio officials, the $90 million ATBI was canceled in November because it seemed to benefit science more than the Costa Rican people. “The ATBI was a beautiful scientific project,” says INBio's director, Rodrigo Gamez, “but there are social and economic considerations that are more relevant than scientific ones.” Other scientists close to the project say that additional factors also may have helped kill the inventory, including a bid by some INBio officials to channel the first $22 million raised for the ATBI into INBio's overall budget.

    “It was a very exciting project scientifically … it's very disappointing,” says Amy Rossman, director of the U.S. Department of Agriculture's U.S. National Fungus Collections in Beltsville, Maryland, who chaired an ATBI taxonomic working group. The project's cancellation is also a great blow to Janzen. He's “not complaining,” says a taxonomist involved with ATBI, but “he must be pretty embittered that he put all that time and effort into [the ATBI only to have it] collapse around him.”

    Janzen, who has conducted fieldwork in the ACG since 1963, has played a key role in shaping Costa Rica's green policies. In 1989, he and plant virologist Gamez set up INBio with the mission of inventorying the country's vast wealth of species—some 5% of the world's total—and preserving it in sustainable ways. INBio has promoted eco-tourism, employed small legions of nonscientists as “parataxonomists,” and acted as the Costa Rican partner in the deal with Merck, which Janzen helped broker.

    Beetlemania. A few of the some 300,000 species the now-defunct survey was to catalog.

    Janzen first began talking about an ATBI in the early 1980s. He argued that identifying every type of organism in a landscape, whether viruses, orchids, or monkeys, would shed light on a host of scientific questions, such as how species interact in an ecosystem. It also would further efforts to “bioprospect” for new medicines in natural areas. After holding an ATBI workshop in the spring of 1993 (Science, 30 April 1993, p. 620), Janzen and his wife and collaborator, University of Pennsylvania ecologist Winnie Hallwachs, suggested that the first inventory be carried out in the area they had been studying—the ACG.

    Some biologists questioned whether the flood of taxonomic data would yield as much scientific knowledge as Janzen claimed, but most endorsed the project. In 1994, the Costa Rican government signed on, too. With start-up funds from the National Science Foundation and Norway, a group of taxonomists from around the world began developing detailed protocols for the survey. Last September, at a workshop in Beltsville, the project seemed set to go: ATBI organizers announced that INBio was close to securing a total of $22 million for the project's first phase from the World Bank's Global Environmental Fund, the Netherlands, and Norway.

    But on 8 November, Janzen, Gamez, and Guanacaste officials sent a curt e-mail message to ATBI working groups stating that they had decided to “permanently discontinue” the project. In the end, INBio and the ATBI didn't have “compatible agendas,” says Janzen, who declines to discuss what happened in detail. Gamez says the project was focused too narrowly on amassing scientific information. Instead, he says, the country needed a program that would directly contribute to economic development by turning out “products” such as field guides, bioprospecting, and interpretive nature trails. These would bring in income for INBio and the conservation areas and provide jobs for Costa Ricans, Gamez says.

    Janzen maintains that the ATBI would have yielded solid economic benefits, including jobs for curators and parataxonomists. “I will defend [that] to the death. It was planned and designed to be highly sustainable,” he says. But while ACG officials strongly supported the project, he says, “enthusiasm for it in [other] Costa Rican circles was highly varied,” and this mixed message impeded negotiations with donor organizations for grants.

    Other scientists involved in the project suggest that some INBio officials may have let the project founder in the hope of salvaging the funds for INBio's overall budget. “Any project with a lot of money like that, people start scrabbling,” says one U.S. scientist.

    INBio officials emphasize that they haven't abandoned the idea of a species inventory. In fact, INBio is close to completing agreements to spend the $22 million raised for ATBI on another biodiversity program. Instead of counting all species in one area, it would inventory species in five different parks, focusing on taxa that are most relevant to practical goals such as bioprospecting, educational programs, nature tours, and biological pest control, Gamez says. Janzen, however, is not an organizer of the revised project, although he continues to contribute to INBio's national inventory from the ACG.

    Nor has Janzen given up on the idea of a full-fledged ATBI. Indeed, in recent months his colleague John Pickering, an ecologist at the University of Georgia, Athens, and the U.S. National Park Service have been exploring the possibility of conducting an ATBI closer to home—in the Great Smoky Mountains National Park. And Rossman notes that the planning process of the past 2 years has yielded many tools for carrying out an ATBI, including an entire book on isolating and identifying fungi. “I hope we can do an ATBI somewhere,” Rossman says. “[Maybe] right outside my window.”

  8. Archaeobiology

    Squash Seeds Yield New View of Early American Farming

    1. Wade Roush

    Just as a wisp of hair or a speck of blood can sway the verdict in a criminal trial, the tiniest morsels of evidence can swiftly undercut an ascending scientific theory. According to a view that has attracted many supporters over the last few years, human beings in the Americas gave up hunting and gathering for farming 5000 to 3500 years ago—much more recently than previously thought, and several millennia behind Near Eastern and Asian civilizations. But a reanalysis of a handful of oversized squash seeds and other table scraps dug up in a Mexican cave has upset this theory, leading to a brand-new picture of the transition to agriculture in the New World.

    Curiously enough, the new evidence comes from a researcher who, until recently, insisted that early Americans were latecomers to agriculture. Archaeologist Bruce Smith, director of the archaeobiology program at the National Museum of Natural History in Washington, D.C., based his case on an improved carbon-14 dating technique called accelerator mass spectrometry (AMS); it had revealed the oldest fragments of domesticated squash, corn, and beans from caves in Mexico to be no more than 5000 years old—several thousand years younger than had been thought. Then last year, Smith traveled to the Instituto Nacional de Antropología e Historia in Mexico City to gather an additional set of squash fragments for the same kind of dating. And, as he reports on page 932, these fragments imply that agriculture got an early start after all: They turn out to be 8000 to 10,000 years old.

    Because the corn and bean fragments still seem to be much younger, Smith's new findings suggest that farming took hold more gradually in the Americas than in other parts of the world. In the Near East and China, for instance, vigorous agricultural economies had sprung up by 9000 and 7500 years ago, respectively, within a thousand years of the domestication of the first crops. The gap between the squash dates and those for other crops implies that Mesoamericans spent thousands of years planting gourds such as squash—even selecting strains for certain characteristics—without “making any other substantial transition to growing their own food,” says Gayle Fritz, an archaeobotanist at Washington University in St. Louis. “They were still practicing their old hunting, gathering, and fishing ways.”

    The debate about the emergence of New World agriculture turns on a few bits of squash stashed in a drawer at the Mexico City institute. Found in the 1960s by the University of Michigan's Kent Flannery at a dry cave in Oaxaca, Mexico, they are the only evidence of agriculture's earliest days in the Americas—along with a few other crop fragments discovered by Flannery's mentor, Richard “Scotty” MacNeish, in four rock shelters and caves in Tamaulipas and the Tehuacán Valley. Most archaeologists who go to Mexico “want to dig up Mayan tombs, fancy temples, and caches of jade,” explains Flannery. “They're not interested in these preceramic hunter-and-gatherer cultures.”

    When Flannery found his squash fragments in 1966, he quickly identified them as domesticated varieties of the squash Cucurbita pepo, the same species to which modern pumpkins and summer squash belong. The seeds and stems were larger, the rinds thicker, and the outer skin more colorful than the corresponding parts of the wild Cucurbita gourds still growing in some regions of the Americas. But he couldn't date them directly. The radiocarbon-dating technique then available required pulverizing large chunks of material, which would have destroyed the fragments.

    So Flannery resorted to an indirect technique. In each era of human occupation of the cave, called Guilá Naquitz, residents added a new layer of detritus to the cave floor. Flannery gauged the age of the oldest squash parts by dating charcoal from the same layers as the squash. He concluded that the fragments were nearly 10,000 years old—implying that the transition from foraging to farming began more or less simultaneously in the Fertile Crescent and the Americas. MacNeish used the same strategy to estimate slightly younger ages for the crop fragments from his caves.

    There matters stood until the 1980s, when many archaeologists began using AMS. In both conventional and AMS radiocarbon dating of organic samples, the ratio of the slowly decaying isotope carbon-14 to normal carbon yields an estimate of the time elapsed since the organism's metabolism ceased. But whereas conventional radiocarbon dating measures carbon-14 by counting decay events, AMS counts the carbon-14 atoms directly, meaning that much less sample material is needed.
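    Both techniques rest on the same exponential-decay arithmetic: once an organism dies, its carbon-14 dwindles at a known rate, so the surviving fraction fixes the age. A minimal sketch in Python (the Libby mean-life is the standard convention for reporting radiocarbon ages; the sample fraction below is illustrative, not a measured value from these studies):

    ```python
    import math

    # Radiocarbon ages are conventionally reported using the Libby
    # mean-life: half-life of 5568 years / ln(2) = 8033 years.
    LIBBY_MEAN_LIFE = 8033.0  # years

    def radiocarbon_age(fraction_modern: float) -> float:
        """Years since death, from the measured carbon-14/carbon-12
        ratio expressed as a fraction of the ratio in living tissue."""
        return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

    # Illustrative: a sample retaining ~29% of its original carbon-14
    # dates to roughly 10,000 radiocarbon years, the age range of the
    # oldest Guila Naquitz squash material.
    print(round(radiocarbon_age(0.29)))  # -> 9944
    ```

    AMS changes only how the carbon-14 is counted, not this relation; counting atoms directly instead of waiting for decays is what lets a single seed, rather than a pulverized chunk, yield a date.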

    In 1989, Austin Long, a geochemist at the University of Arizona, Tucson, and Bruce Benz, a paleobotanist now at the University of Guadalajara in Mexico, used AMS to date the oldest domesticated corncobs directly from MacNeish's digs in Tehuacán. Astonishingly, they found the cobs to be only 4700 to 1600 years old, some 800 to 2300 years younger than MacNeish and his collaborators had estimated.

    That finding sparked an impassioned debate among archaeologists about the accuracy of the dates for all the crop fragments from the Mexican caves. In a 1994 Current Anthropology article, entitled “Are the First American Farmers Getting Younger?,” Fritz wrote that “One begins to wonder how many, if any, of the [domesticated species] from the Tehuacán excavations actually precede 3000 B.C.” Fritz argued that burrowing rodents or people digging fire pits might have disturbed the cave floor's layers, casting doubt on all of MacNeish's dates from charcoal in the layers.

    This revisionism also led Smith to wonder about the reliability of Flannery's dates for the Guilá Naquitz squash material. If the other crop fragments were as young as they now seemed, Flannery's dates would mean that several thousand years had passed between the domestication of the first crop, squash, and the widespread domestication of other crops, such as beans and corn. To Smith, that seemed implausibly long, in light of the abrupt appearance of farming-based civilizations in the Near East and Asia. “All these different lines of evidence seemed to suggest that it was worth taking a step out onto thin ice and predicting that [when dated using AMS] the Guilá Naquitz squash would turn out to be younger than people had initially proposed,” says Smith.

    In retrospect, that step wasn't warranted. In Mexico City last year, Smith retrieved the 13 seeds and other Cucurbita material Flannery had extracted from zones B through E of the floor of Guilá Naquitz, layers spanning the period from 10,800 to 8700 years before the present. Smith's AMS dating of the seeds bore out Flannery's original estimates: One zone C seed proved to be 9900 years old, while five zone B seeds ranged from 8400 to 10,000 years old.

    This effectively squelched the squash squabble. “What's exciting about Smith's data,” says Flannery, is that the first American farmers “aren't getting younger—in fact, they may be a little older than we thought.” Question marks still hang over the plant artifacts from MacNeish's Tehuacán sites; although most researchers agree that the new AMS dates there are trustworthy, MacNeish isn't ready to concede that his dating of the cave-floor layers was flawed.

    Still, by all accounts, AMS dating is forcing ethnobotanists and archaeobiologists to rethink their definition of “farmer.” “There's been a dichotomy of views: People can either be hunter-gatherers or agriculturalists,” explains Smith. “In reality, there's a transition stage between these two, but we don't know much about it. And in Mesoamerica, it's clear that this ‘transitional phase’ was 6500 years long”—longer, Smith notes, than the hunting-gathering and farming phases put together.

    The only way to fill in this picture, say Smith and other researchers, is to search out new troves of agricultural remains. Until then, says MacNeish, there will be “a lot of room for speculation, and a lot of very opinionated people.”

  9. Materials Science

    Will UV Lasers Beat the Blues?

    1. Robert F. Service

    SAN FRANCISCO—The competition to make the next generation of compact-disc readers continues to burn white hot. At a meeting of the Materials Research Society held here last month, a team of scientists reported creating a new type of chip-based laser that generates the shortest wavelength light yet—ultraviolet (UV). If the experimental laser could be turned into a practical device, its short wavelengths would allow it to read CDs and CD-ROMs packed with far more information than today's versions can hold.

    That could put the new device out front in the competition to succeed current CD lasers, which play a beam of near-infrared light over the tiny indentations that store data on a CD. Over the last few years, researchers worldwide have been racing to commercialize other chip-based lasers that emit shorter wavelength blue light, which—by allowing the use of even tinier indentations—could increase fourfold the amount of data stored on discs. But the new laser, based on the semiconductor zinc oxide (ZnO), promises even greater gains.

    The device was developed by ceramics researcher Masashi Kawasaki and his colleagues at the Tokyo Institute of Technology in Yokohama, the Hong Kong University of Science and Technology in Kowloon, and the Institute of Physical and Chemical Research (RIKEN) in Sendai, Japan. “It's great work,” says Chang-Beom Eom, an associate professor of materials science at Duke University. The device now has to be pumped with light from another laser, but if the researchers can adapt it to run on electric current, says Eom, it “means the density of data storage can go even higher.”

    At the heart of all chip-based lasers are semiconductor materials that allow electrons to exist at only specific energy levels, or “bands.” Incoming energy excites the electrons, boosting them from a low-energy to a high-energy band, and leaving behind positively charged vacancies in the lattice structure of the semiconductor crystal. In a laser, the energized electrons give up their excess energy as photons of light when they fall back down into the vacancies. Mirrors flanking the semiconductor then toss the photons back and forth through the material, stimulating the generation of additional photons at the same wavelength, which make up the laser's beam.

    In semiconductors used in conventional lasers, the energy gap between the high- and low-energy bands is too narrow to generate UV photons. In ZnO, this gap is wide enough, but past efforts to build lasers using the material have been largely unsuccessful. Although the crystals have been coaxed to emit photons of UV light, the emission is weak, in part because defects in the crystals trap the photons. When researchers were unable to grow bulk ZnO crystals without the defects, most gave up on the material for making lasers.
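    The connection between gap width and color is direct: the emitted photon carries the gap energy, so its wavelength is λ = hc/E. A quick sketch, using approximate textbook band-gap values rather than figures from the Kawasaki team's report:

    ```python
    # Wavelength of light emitted when an electron falls across a
    # semiconductor band gap: lambda = h*c / E_gap.
    HC_EV_NM = 1239.84  # Planck constant x speed of light, in eV*nm

    def emission_wavelength_nm(band_gap_ev: float) -> float:
        return HC_EV_NM / band_gap_ev

    # Approximate, illustrative band gaps:
    print(emission_wavelength_nm(1.6))  # ~775 nm: near-infrared, like today's CD lasers
    print(emission_wavelength_nm(3.3))  # ~376 nm: ultraviolet, a ZnO-class gap
    ```

    Shorter wavelengths focus to smaller spots, which is why a wider gap translates into tinier readable indentations and denser discs.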

    Kawasaki and his colleagues took a different approach. Using a sort of high-tech, atomic spray-painting machine, they were able to create thin, nearly defect-free, ZnO films made of tiny crystalline grains laid down in a honeycomblike pattern. When team members bombarded the film with a light beam, they found that the film absorbed the energy and reemitted it as a surprisingly intense UV laser beam. Kawasaki believes that the sharp boundaries between illuminated and dark portions of the film act as tiny mirrors to reflect UV photons into a coherent beam. The film's honeycomb structure also may help excited electrons to combine with positively charged vacancies to produce photons, he says, although the exact mechanism there is not yet clear.

    For now, the new films only produce UV laser light when their electrons are energized with blasts of high-energy photons from big, laboratory lasers, which are much too cumbersome to be adapted for use in consumer electronics products. The next step is developing compact, ZnO chip-based lasers that will generate a beam when jolted with electrical current from electrodes above and below the semiconductor film. But that won't be easy, because vacancies don't move as easily from the anode through ZnO as they do in other materials, says Jeff Cheung, a materials scientist at the Rockwell Science Center in Thousand Oaks, California.

    Cheung and others recently succeeded in enhancing the positive conductivity of ZnO films by spiking them with trace amounts of nitrogen. If such doping techniques allow ZnO lasers to make it to market, they probably will have another advantage over the competition. ZnO films can be grown at about 500 degrees Celsius, hundreds of degrees lower than gallium nitride, the most popular material for blue lasers. Call it a cool way to beat the blues.

  10. Paleoanthropology

    Bone Sizes Trace the Decline of Man (and Woman)

    1. Ann Gibbons

    When paleontologist Stephen Jay Gould teaches his history of life course to Harvard undergraduates, he flashes slides from popular media that depict human ancestors marching through time. The earliest members of our genus, Homo, appear short and stooped, but as time passes, our forebears gradually grow taller and bigger brained. The march of progress culminates in the implicit masterpiece of evolution—big-brained, upright, modern humans.

    But this popular view of human evolution is wrong, says Gould, who is the most visible critic of the long-standing notion that our lineage evolved gradually and inexorably toward a bigger, brainier human. Compelling proof that he's right has now come from the fossil record. In the 8 May issue of Nature, a new study of the bones of 163 early members of Homo who lived 2 million to 10,000 years ago suggests that our bodies—and brains—have gotten smaller lately, not bigger. Anthropologists have long thought that some members of the Homo lineage, the Neandertals, were brawnier than we are, but the new study, based on skull volume and two skeletal indicators of body mass, shows that the same was true of our direct ancestors.

    What's more, the study shows that the recent downsizing trend is just the latest twist in a complex history of brain and body size. Evolution apparently favored brawn early in human history: At least one early human already stood 1.85 meters (6 feet 1 inch) tall 1.8 million years ago. But brains stayed relatively small until 600,000 years ago, when they underwent a tremendous growth spurt that lasted until 50,000 years ago. It has been downhill ever since, with our brains and bodies shrinking by about 10% on average—perhaps, the authors speculate, because changes in technology and lifestyle have rendered muscular bodies unnecessary. “The bottom line is that body size did vary through human evolution,” says paleoanthropologist John Kappelman of the University of Texas, Austin. That variation “challenges the traditional view that living humans are the epitome of large body and brain size,” says Christopher Ruff, a biological anthropologist at Johns Hopkins University and the study's lead author.

    Brawn and brain.

    Two skeletal measures yielded a history of body mass, shown below with brain size. After expanding for half a million years, both have declined in recent millennia.

    Putting that view to the test has been difficult because anthropologists can't measure early humans from head to toe, or “even from pelvis to toe,” says Ruff, as the fossils from any one skeleton are too fragmentary. As a result, researchers have tried indirect means of estimating body size, such as skull thickness or tooth and eye-socket size. These methods, however, have proven to be unreliable, probably because factors such as activity, diet, or climate can also influence them, says Ruff. For the past decade, he has been trying to find better measures.

    One feature that seems to fit the bill is the head of the femur, or thighbone. In studies of living humans, Ruff and others have found that the breadth of this femoral head is proportional to the mass of a person's body—the bigger the body, the bigger the femoral head supporting its weight—and they have developed equations that express the relation. A team including Ruff and paleoanthropologists Erik Trinkaus of the University of New Mexico and Trenton Holliday of the College of William and Mary in Williamsburg, Virginia, has now applied these equations to fossil femora from 93 individuals.
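
    The logic of that calibration can be sketched in a few lines of code. This is only an illustration: the slope and intercept below are invented, not the equations Ruff and his colleagues actually published.

```python
# Hypothetical sketch of the femoral-head method: a linear calibration
# between femoral-head breadth and body mass, fit on living humans and
# then applied to fossil femora. Coefficients are invented for
# illustration; the published equations differ.

def body_mass_kg(femoral_head_mm: float,
                 slope: float = 2.4,        # invented calibration slope
                 intercept: float = -40.0   # invented intercept
                 ) -> float:
    """Estimate body mass (kg) from femoral-head breadth (mm)."""
    return slope * femoral_head_mm + intercept

# Illustrative measurement, not a real specimen value:
print(f"{body_mass_kg(48.0):.0f} kg")  # -> 75 kg for a 48-mm femoral head
```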

    To check their results, Ruff and colleagues used a second method for estimating body mass based on stature, which they gauged from the length of limb bones, and on the breadth of the pelvis, as measured (or estimated) between the two widest points of the flaring iliac bones. They applied this measure to 96 fossils (including 26 for which they also had femoral heads) and got results that closely matched those based on the femoral head. The agreement “helped increase my confidence that we were getting fairly unbiased estimates of body weight,” says Ruff. Finally, they compared those estimates with their own and others' published measurements or estimates of cranial capacity to get brain size relative to body size.

    In an earlier, less comprehensive study with Pennsylvania State University paleoanthropologist Alan Walker, Ruff had found that one H. erectus fossil, the 1.5-million-year-old Nariokotome boy who lived near Lake Turkana in Kenya, would have stood 1.85 meters (6 feet 1 inch) tall and weighed 71 kilograms (156 pounds) if he had reached adulthood. The new study confirmed that six-footers were already striding around east Africa at that time, but their brains were about two-thirds the size of ours—and stayed that size for a million years.

    The stasis ended when “there was a truly extraordinary increase in brain size from about 600,000 to 30,000 years ago,” says Trinkaus. This coincides, he notes, with the expansion of early humans to colder climates, which could have reinforced selection for larger brains to plan the use of seasonal resources. The trend in brain size continued over the past 100,000 years through the Neandertals to early modern humans. Brain size peaked at about 10% larger than ours in early modern humans, such as the people who lived in caves at Skhul and Qafzeh in Israel 90,000 years ago and the cave painters at the Cro-Magnon rock shelter in France 30,000 years ago.

    Meanwhile, body size continued a slower increase, peaking within the last 100,000 years. Indeed, the new analysis shows that Neandertals were the champions of brawn, outweighing contemporary humans by 30%. Their brains were also larger than ours in absolute terms, but their ratio of brain size to body mass was about 10% lower. This finding solves an important mystery—why Neandertals' activities don't look particularly intelligent in the fossil record, despite their big brains: “People always go on about Neandertals having larger brains than ours, but this disproves that if you take into account body size,” says Leslie Aiello of University College London. In a commentary in Nature, Kappelman suggests that the result will require “critical re-thinking” about the behavior of Neandertals, implying that it “was probably decidedly non-modern—and more dependent on brawn than brains.”
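
    A quick consistency check on those figures: if Neandertal bodies were about 30% heavier and their brain-to-body ratio about 10% lower than ours, then in absolute terms their brains work out to roughly

    $$\frac{\text{brain}_{\mathrm{Neandertal}}}{\text{brain}_{\mathrm{modern}}} \approx 0.90 \times 1.30 \approx 1.17,$$

    that is, about 17% larger, which squares with the statement that their brains exceeded ours in absolute size.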

    The decline in both brain and body size since the days of the Neandertals and Cro-Magnons may be due to tools or social skills that reduced our ancestors' reliance on sheer brawn, says Ruff. And as the body shrank, so did the brain. Trinkaus points out other factors that may have contributed to the trend in recent millennia: for example, the poorer, less varied diet that took hold as agriculture replaced the fare of hunter-gatherers. Other researchers have found that stature was smallest in the Neolithic and Middle Ages, although Ruff suggests that better nutrition has since allowed some populations, including Americans and northern Europeans, to bounce back to their Pleistocene heights.

    Kappelman and Richard Smith of Washington University in St. Louis believe that the trends in brain and body size that the Ruff study has traced are real. They are less convinced by Ruff's absolute values for body mass, however, because he calibrated his equations on living humans. The modern, sedentary lifestyle may have thrown off the relation between body mass and skeletal features. Kappelman suggests that athletes might be a better basis for the equations.

    But those concerns, he adds, won't affect the most important conclusions. Body and brain size reflect the different ways our ancestors adapted to their environments—suggesting that “they were behaving differently than us,” says Kappelman. And, as far as the human physique goes, the march of progress is definitely a myth.

  11. Astronomy

    Antimatter Hints at Galactic Turmoil

    1. Erik Stokstad

    WILLIAMSBURG, VIRGINIA—Compared with some of the universe's more turbulent neighborhoods, the Milky Way is a tranquil suburb. But last week's announcement here that the orbiting Compton Gamma-Ray Observatory (CGRO) had spotted a wayward cloud of positrons—the antimatter equivalent of electrons—near the galactic center hinted that, like many suburbs, the galaxy is not as placid as it seems.

    Some astronomers are speculating that the cloud may be a legacy of thousands of stellar explosions that rocked the galactic center about 10 million years ago, creating positrons and driving them outward. It's not the only possible explanation, and it received mixed reviews at the Fourth Compton Symposium on Gamma-Ray Astronomy and Astrophysics, where the cloud was announced. “It's a neat discovery,” says Neil Gehrels, an astrophysicist at the NASA Goddard Space Flight Center in Greenbelt, Maryland, but the supernova scenario “is a bit of a stretch” because it requires the fragile antimatter particles to survive a long trip through space. But whatever the cloud's true story turns out to be, it is likely to leave the galaxy looking more tumultuous than before.

    The positrons prompting this new view of the galactic center can be seen only when they meet with electrons in a violent encounter that annihilates both particles, producing gamma rays concentrated at an energy of 511 kiloelectron volts. Since the 1970s, detectors lofted by balloons and satellites above Earth's gamma-ray-absorbing atmosphere have picked up this death cry coming from the center of the galaxy. Astronomers speculated that the massive black holes thought to lurk there were responsible: They theorized that matter is superheated as it falls into the black holes, generating gamma rays that collide and spawn positron-electron pairs.
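
    The 511-kiloelectron-volt figure is no accident: annihilation converts each particle's rest mass directly into photon energy,

    $$e^{+} + e^{-} \rightarrow 2\gamma, \qquad E_{\gamma} = m_{e}c^{2} \approx 511\ \mathrm{keV},$$

    which is why gamma rays at this energy serve as an unmistakable signature of positrons.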

    The balloon- and satellite-borne instruments, however, yielded only rough indications of the amount and location of the positrons. After NASA launched the CGRO in 1991, astrophysicists set out to use the satellite's Oriented Scintillation Spectrometer Experiment (OSSE), which has a finer spatial resolution than its predecessors, to pin down the precise locations of the antimatter. But when OSSE searched near the center of the galaxy, it found only about half of the positrons tallied earlier.

    So the researchers, led by astrophysicists William Purcell of Northwestern University and James Kurfess of the Naval Research Laboratory (NRL) in Washington, D.C., broadened their search. They have now found the missing positrons in an unlikely spot—about 3000 light-years above the galactic center. “We were very surprised to see this,” Purcell says, because the region appears to lack any sign of a black hole or other positron source.

    Some researchers argue that a black-hole source may yet be discovered. But Charles Dermer and Jeff Skibo of the NRL are skeptical. For one thing, they say, black holes hiccup out positrons as clumps of matter fall in, but months of OSSE observations haven't detected any variation in the amount of antimatter.

    Dermer and his colleagues envision a different source: supernovae at the center of the galaxy. Exploding stars make radioactive isotopes that emit positrons as they decay. And a volley of supernovae sometime in the last 10 million years could have turned the galactic center into “a cauldron of violence,” says Dermer, propelling “a fountain of hot gas” that would have swept the positrons out of the galactic plane.

    The picture has some observational support. Astronomers have seen chimneys of hot gas escaping from the galactic disk, presumably powered by supernovae. And glimpses of the dust-shrouded galactic center have revealed hints of turmoil there. Radio emissions suggest a flow of gas streaming in the general direction of the cloud. X-ray observations also suggest that the ionized gas there has been heated to 10 to 100 million degrees or more by past violence. The annihilation fountain, says Dermer, “knits together those observations into a coherent picture.”

    The model faces some difficulties, however, including the question of how the positrons get so far from the galactic center without encountering matter and annihilating. Astronomers also wonder whether the Milky Way was capable of forming massive stars—the kind that explode—fast enough to explain the burst of supernovae. But Dieter Hartmann, an astrophysicist at Clemson University in South Carolina, says that while “there's no rigorous, solid evidence” of such a starburst, “the assumptions are reasonable.”

    Perhaps the biggest question is whether the antimatter cloud really does hover over the galactic center, because the gamma rays do not reveal the distance of the positrons. If its apparent link to the center turns out to be just a chance alignment, the hunt will be on for other pockets of violence in our quiet cosmic suburb.

  12. AIDS

    Stubborn HIV Reservoirs Vulnerable to New Treatments

    1. Jon Cohen

    In chess, it's called the end game. In football, it's called the longest yard. In the world of AIDS, researchers don't know quite what to call it. But suddenly scientists are tossing around these sorts of metaphors as new treatments have allowed them to beat down HIV RNA in the blood, raising hopes of attacking the virus's other bastions, which so far have remained beyond the reach of anti-HIV drugs. This week, three new reports, one of which appears in this issue of Science on page 960, offer insight into how these redoubts finally might be conquered.

    Several recent studies have shown that potent combinations of drugs that attack two critical viral enzymes, reverse transcriptase (RT) and protease, can reduce the amount of HIV RNA in a person's blood to levels too low for the most sensitive assays to detect. But AIDS researchers have cautioned that the virus also hides out in the lymph nodes, which can harbor more virus than can the blood. And HIV is even more impervious to attack when it transforms its genetic material from RNA into DNA, weaves itself into the host's genes, and does not make new viruses—a strategy that allows it to stay under the radar of drugs and the immune system.

    The first bastion looks more vulnerable in light of this issue's report by the University of Minnesota's Ashley Haase, Winston Cavert, and their co-workers, who show that a triple-drug combination therapy can dramatically reduce the amount of HIV RNA in lymph nodes. And although no one is reporting any progress in eliminating HIV in its hidden, DNA form, a paper in the 8 May issue of Nature sizes up the challenges. Still another report in that same issue explores the rate at which the most persistent HIV-infected cells are eliminated from blood and tissue.

    Virologist Steven Wolinsky of Northwestern University says the papers are “critical” to translating scientists' understanding of how HIV causes disease into treatment strategies: “The bottom line is that as long as there's one infected cell left, the job's not done.” John Coffin, a retrovirologist at Tufts University in Medford, Massachusetts, adds that these studies “really flesh out some important gaps in previous papers.”

    One such gap was how triple-drug treatments affect HIV in the lymph nodes. To find out, the Haase group repeatedly biopsied the tonsils (a type of lymph node) of 10 patients who were receiving the RT inhibitors AZT and 3TC, plus the protease inhibitor ritonavir. Over a period of 6 months, the researchers found, the drugs rapidly decreased the levels of HIV RNA inside mononuclear cells, which include two types of white blood cells: scavenger cells known as macrophages, and CD4 receptor-bearing T lymphocytes, whose decline is a hallmark of AIDS. The drop was not surprising, because mononuclear cells traffic between the lymph nodes and the blood, where the virus already has been shown to be vulnerable to drugs.

    More startling was the finding that HIV RNA trapped on the surface of immune cells that form the scaffolding of lymph nodes—follicular dendritic cells (FDCs)—dropped off at roughly the same rate as HIV in mononuclear cells. Based on earlier test-tube studies, says Haase, “we were thinking it would take months, if not years, to clear this virus.” Coffin says the result is “quite a novel finding.”

    Haase cautions that even though the drugs have rid patients' lymph nodes of HIV RNA within a few months, nobody's been cured: “There's clearly a residue of virus still stored on FDCs, and if you stop therapy, there's every indication that you can restart the infection.” The Haase group did find one exceptional patient who had no detectable virus in lymph-node mononuclear cells or FDCs after 6 months of treatment. But further study showed that this patient, too, still is infected—with HIV DNA.

    The virus's life cycle explains why HIV DNA lingers in the body even after drugs appear to have cleared cells of HIV RNA. When HIV enters a cell, its reverse-transcriptase enzyme changes its RNA genome into DNA—so the virus can integrate its genes into the host cell's DNA. When the cell divides, that viral DNA produces new HIV particles, which once again contain RNA. But often, the HIV-infected host cell remains in a “resting” phase in which it does not divide. The viral DNA, or provirus, in these resting cells is invisible to the immune system and invulnerable to current anti-HIV drugs.

    Virologist Winston Cavert, lead author of the Science paper, says “the big question” now is how long cells harboring HIV DNA survive, how many such cells exist in an infected person, and how much new virus they can produce. Robert Siliciano of Johns Hopkins University and colleagues addressed these issues by studying HIV DNA levels in resting, CD4-bearing T cells from the blood and lymph nodes of 14 patients, half of whom were receiving treatment. As they report in Nature, fewer than 0.05% of the resting CD4s harbored proviral DNA, but many of them, when stimulated to divide, were capable of producing virus. “These cells can survive for months or years,” says Siliciano. “They represent the potential barrier for curing the infection.”

    Because of these and other cells that can coexist with HIV for long periods, the plummeting virus levels seen with the latest therapies may give way to a slower “second-phase decay,” say the authors of the second Nature paper. David Ho of the Aaron Diamond AIDS Research Center in New York City, Alan Perelson of New Mexico's Los Alamos National Laboratory, and their co-workers reached that conclusion with a computer model, which they used earlier to describe first-phase decay (Science, 15 March 1996, p. 1582).

    Their model also challenges the dogma that an HIV infection can never be completely cleared from the body. The model captures the dynamics of the dance between HIV, cells, and drugs. Untreated, HIV constantly kills the cells it infects, but the virus persists because the body provides it with a steady stream of newly minted cells. Anti-HIV drug treatment prevents the infection of these new cells, and the HIV-infected cells slowly die off. Ho and Perelson's model suggests that the main holdouts are infected macrophages, which are not easily killed by the HIV they produce. From the half-life of these cells, combined with that of infected resting T cells, they estimate that 2.3 to 3.1 years of treatment with a 100% inhibitory anti-HIV regimen might eliminate all remaining virus. Even so, they too conclude with a loud caveat: “Although significant progress has been made in the past year in the treatment of HIV-1 infection, it would be wrong to believe that we are close to a cure for AIDS.”
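
    The arithmetic behind such an eradication estimate can be sketched simply, though with invented numbers; the actual Ho-Perelson model tracks several cell compartments at once. Under a fully suppressive regimen, a long-lived infected-cell pool decays exponentially, and therapy must run long enough for the pool to fall below a single cell:

```python
import math

# Back-of-the-envelope sketch of a "second-phase decay" eradication
# estimate. All numbers are hypothetical; the published model combines
# several compartments (e.g., macrophages and resting T cells).

def years_to_clear(n0: float, half_life_days: float) -> float:
    """Years until N(t) = n0 * 2**(-t/half_life) falls below one cell."""
    return half_life_days * math.log2(n0) / 365.0

# Illustrative only: 10 million long-lived infected cells, 4-week half-life.
print(f"~{years_to_clear(1e7, 28):.1f} years of fully suppressive therapy")
# -> ~1.8 years, the same order as the paper's 2.3-to-3.1-year estimate
```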

    Haase and other researchers contend that mathematical studies of viral kinetics have provided as much information as they ever will about whether it is possible to eradicate the virus. “The only way to answer the question now is to discontinue therapy and measure the virus,” says Haase. Ho's group and other labs now have studies under way that should do just that during the next year or so. But Ho stresses that he is not at all certain that the virus will ever be vanquished. “The end game can be as hard as the beginning game or the middle game—and you can lose,” says Ho, who formerly was a serious chess player. “I think for HIV, this last bit is the hardest.”

  13. Astronomy

    Primordial Gas: Fog Not Clouds

    1. Charles Seife
    1. Charles Seife is a free-lance writer in Riverdale, New York.

    Astronomers lost in an intergalactic forest finally may have found a way out. Two astrophysicists appear to have cleared up the puzzling distribution of tenuous gas in the distant universe, which produces a “forest” of dark lines in the spectra of the bright, remote objects called quasars. By assuming that there is a ubiquitous, undulating fog of hydrogen and helium in the vast reaches of space between galaxies, Arthur Davidsen and HongGuang Bi of Johns Hopkins University not only provide a new explanation for these dark lines, they also account for a key part of the universe's “missing matter.”

    The new picture, published in the current issue of Astrophysical Journal, replaces a model that attributed the forest of lines to discrete gas clouds along the path from Earth to each quasar. Recent computer simulations of how structures took shape in the early universe had cast doubt on that cloud model, however, and so had its inability to account for all the hydrogen and other normal matter that cosmologists believe is left over from the big bang. Bi and Davidsen's theory agrees with the simulations, provides a home for the missing matter, and explains the forest of lines to a high degree of accuracy. “[The fact] that observation and theory agree so well means that this is a success—a big success,” says David Weinberg, a computational astrophysicist at Ohio State University.

    In the early 1970s, astronomers discovered these lines by the hundreds in the ultraviolet region of quasar spectra and dubbed them the Lyman-α forest. Theorists guessed that each line was the shadow of a cloud of hydrogen gas between the quasar and Earth, absorbing light at a specific wavelength. The clouds would be moving away from Earth at different speeds because of the expansion of the universe, and so the Doppler effect would shift each absorption band by a different amount. Hundreds of discrete, spherical hydrogen clouds, each with a different Doppler shift, could explain the whole forest of lines.

    But computer models of the universe, which build in assumptions about the amount and types of matter created in the big bang, then trace how the matter coalesces under gravity, couldn't reproduce the clouds. So Davidsen and Bi tried a new approach. They looked at the consequences of an extremely simple mathematical assumption about the intergalactic medium: that the logarithm of its density has a normal, or bell-curve, distribution, meaning that the gas has denser and more tenuous regions but isn't broken into discrete clouds.

    A decade ago, the idea would have faced a struggle, explains Davidsen, because the discrete lines in the Lyman-α forest seem to say that the space between the absorbing regions is empty. But astronomers now have reason to think that intervening matter could simply be invisible. The intergalactic medium contains other elements besides hydrogen, among them helium and carbon, and their shadows in the quasar spectra indicate that the medium is highly ionized: Radiation from quasars has stripped away electrons from most of the atoms.

    As astrophysicist Donald York of the University of Chicago explains, “Ionized hydrogen has no signature; only neutral hydrogen does.” Without an electron, a bare hydrogen nucleus can't absorb a photon, so a region of tenuous, ionized hydrogen should leave no trace in the Lyman-α forest. Only in the denser regions would enough neutral atoms remain to create a line.
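
    A minimal numerical sketch shows how a continuous, lognormal fog can still produce discrete-looking lines. This is not the authors' calculation; every parameter is invented, and the density-squared optical depth assumes photoionization equilibrium, in which denser gas recombines faster and so retains more neutral hydrogen:

```python
import numpy as np

rng = np.random.default_rng(0)

# Smooth Gaussian random field along a quasar sightline (2000 pixels).
n_pix = 2000
g = np.convolve(rng.normal(size=n_pix), np.ones(25) / 25, mode="same")

# The core assumption: log-density is Gaussian, so the overdensity
# itself is lognormal (with mean near 1).
delta = np.exp(g - g.var() / 2)

# In photoionization equilibrium the neutral fraction grows roughly as
# density squared, so optical depth tau ~ delta**2 (0.3 is an invented
# normalization).
tau = 0.3 * delta**2
flux = np.exp(-tau)  # transmitted quasar flux

# Dense ridges of the fog carve dark "forest" lines; the highly ionized
# troughs between them stay nearly transparent.
print(f"mean transmitted flux: {flux.mean():.2f}")
print(f"pixels absorbed by more than 50%: {(flux < 0.5).mean():.1%}")
```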

    When Davidsen and Bi used a computer to calculate the precise forest of lines that would be seen in quasar light passing through their continuous medium, they found a close match to the Lyman-α forest. Regions of high density absorb the light at many different wavelengths, much as the discrete clouds did, but they are larger, less dense, and less contrived than the clouds. “There hasn't been a theory which can explain [the lines],” says Mike Norman of the University of Illinois, Urbana-Champaign. “Now that theoretical model is in hand.”

    Norman, like Weinberg, runs supercomputer simulations of cosmic-structure growth. He too is impressed by the agreement between the Davidsen-Bi picture and the simulation results. The latest computer models generate nothing like the discrete clouds of the earlier picture; instead, they naturally evolve a mass distribution that mimics the Lyman-α forest. Says Norman, “By assuming a distribution for the density, Davidsen and Bi bypassed 500 hours of supercomputer time.”

    The new picture offers an answer to one of the long-standing problems of astronomy. As David Schramm, an astrophysicist at the University of Chicago, puts it: “Where are the bulk of the baryons?”

    According to the big bang theory, the universe should contain several times more baryons—particles of ordinary matter, such as protons and neutrons—than we can see in the stars and galaxies. If gas in the intergalactic medium is in the form of discrete, light-absorbing clouds, there can't be enough of it to account for the baryon deficit. But with highly ionized, and therefore invisible, hydrogen distributed between the dense patches, the diffuse intergalactic medium could account for nearly all the baryonic matter in the early universe—the full amount that had been missing.

    Not all astronomers accept the picture, neat as it is. “I don't think that the [baryon-density] value is as high as their model's value,” says Lennox Cowie, an astrophysicist at the University of Hawaii. Cowie thinks the new theory and the computer simulations overestimate the amount of ionizing radiation in the early universe and thus the number of invisible baryons. “It's a nice piece of work,” he says. “Still, the question's not settled yet.”

    Others, however, are ready to lay the matter to rest. “We finally feel we understand the Lyman-α forest,” says Neal Katz, an astrophysicist at the University of Massachusetts. “Now we can use it as a tool” to study the elusive matter that produces it.
