News this Week

Science  20 Jul 2001:
Vol. 293, Issue 5529, pp. 404



    Pentagon Proposes to Cut AIDS Research From Defense Budget

    Jon Cohen

    The U.S. Department of Defense (DOD) has fired a salvo at its own AIDS research effort, branding it a “nontraditional defense program” that should be cut from the defense budget. Not only has the designation wounded military AIDS researchers, it is causing outrage in the broader community, because the U.S. Military HIV Research Program plays a unique role in AIDS vaccine development.

    DOD officials decline to discuss the program's future on the record because it is still being debated within the Pentagon. But Science has obtained official memoranda that describe a proposal to transfer the program and $11 million—less than a third of its current $35 million budget—to the National Institutes of Health (NIH).

    Several outsiders familiar with the program recently got wind of the proposal. They are not pleased. “I think it would be a terrible idea,” says Donald Burke, the former head of the military's AIDS research program, who now directs the Center for Immunization Research at Johns Hopkins University in Baltimore, Maryland. “The DOD brings a different, niched approach to international AIDS vaccine development that neither the NIH nor the [Centers for Disease Control and Prevention] has picked up.”

    Richard Hecklinger, the U.S. ambassador to Thailand—where the U.S. military is testing HIV vaccines—is even more disparaging. “For the U.S. to pull out now or put the project on hold while we transfer responsibility to a whole new set of players will be a major blow not only to our cooperation with Thailand, but to our high-priority effort to find an HIV vaccine for the developing world,” Hecklinger wrote the deputy secretary of defense last week.

    The proposal comes from DOD's bean counters, the Office of the Under Secretary of Defense (Comptroller), as part of a sweeping review of defense spending launched by Defense Secretary Donald Rumsfeld. The comptroller's office identified 10 “nontraditional” defense programs and in June solicited input throughout DOD on whether they should be cut. Particularly galling to military AIDS researchers is that they were lumped together with activities such as supporting Olympic athletes and “at-risk” youth.

    Well-placed sources tell Science that several high-level DOD officials—including the Joint Chiefs of Staff and the Assistant Secretary of Defense for Health Affairs—“nonconcurred” with the suggestion to cut the HIV research program. But, in what some insiders have perceived as a “slap in the face,” the new Secretary of the Army, Thomas E. White, whose department runs the AIDS research program, supported the idea. “We believe the National Institute (sic) of Health … can provide a better, integrated approach to HIV research,” White wrote in a 29 June memo to the Under Secretary of Defense (see below). White, a brigadier general who retired from the Army in 1990, worked for the past decade as an executive at Enron Corp. in Houston, Texas, before assuming his current post.

    The military has a long tradition of medical research targeting specific diseases such as yellow fever, encephalitis, and malaria, and it has made vaccines against several of them. The AIDS program, which grew out of work begun in the early 1980s at the Walter Reed Army Institute of Research in Washington, D.C., mainly focuses on HIV vaccine research and development.

    The planned vaccine trial in Thailand is one of DOD's largest AIDS projects. U.S. military researchers hope to test the efficacy of a combination of two AIDS vaccines on some 15,000 people. Transferring the program to NIH, Hecklinger argues, would “pull the rug out from under a carefully planned cooperation with our Thai counterparts—10 years in the making—in which both sides have expended considerable resources.”

    NIH officials who would inherit the program are confused about the proposal. “The program has been very productive and accomplished many things, and it has in place the mechanisms to evaluate vaccines in underdeveloped countries,” says Edmond Tramont, director of the Division of AIDS at the National Institute of Allergy and Infectious Diseases (NIAID). “The world cannot afford to let that collapse. The question is how to save it.” (Tramont, who started the military's AIDS research program in 1985, took the NIH job last week in a move unrelated to the proposed shift of the program.)

    NIAID director Anthony Fauci met with the current head of the program, Colonel Deborah Birx, to discuss the implications of the proposal. “It was unclear even to her what they were considering,” says Fauci. “My experience with the Army's efforts in HIV/AIDS is that they have been and still are an important player in this whole scene. I think this is an important part of their mission.”

    Burke of Johns Hopkins, who regularly had to defend the military's AIDS research program from superiors who questioned its worth, says President George W. Bush's new Administration needs to be educated about the program's value. Indeed, Army Secretary White endorsed the proposal to eliminate it less than a month after he started his job on 31 May.

    The education process began in earnest this week. On 17 July, as Science went to press, DOD was planning to hold a high-level meeting to air criticisms and support of the proposal. A final decision is expected by the end of the month.


    Nearby Galaxy Breaks the Black Hole Chain

    Charles Seife

    An invisible star has not been seen, and astronomers are taking notice.

    According to a report published online this week by Science, the center of a nearby spiral galaxy, M33, seems to have no black hole—unlike all of its larger, more bulging brothers. The finding may help scientists puzzle out the so-far-murky sequence of events by which galaxies assemble themselves.

    Elsewhere in our celestial neighborhood, black holes rule. In the center of our own Milky Way galaxy, for example, squats a hungry black hole as massive as several million suns. Although the supermassive black hole itself is invisible to telescopes, astronomers have figured out its position and mass by measuring the velocity of stars wheeling wildly around the monster.

    And our galaxy is not the only one that has a black heart. Nearby Andromeda has a 50-million-solar-mass black hole at its center; indeed, astronomers have begun to believe that a supermassive black hole lies at the center of every bulging galaxy. “So far, in every [bulging] galaxy where people have looked, they find a supermassive black hole,” says Laura Ferrarese, an astronomer at Rutgers University in New Brunswick, New Jersey, and a co-author of the report.


    Nearby galaxy M33 has no supermassive black hole at its center.


    In general, the bigger the galaxy's bulge, the bigger the black hole at its center. However, nobody really knows how these black holes formed, or how they are connected with the bulges. Do the black holes help form the galactic bulges, or vice versa? Or do they form at the same time? And what about flat galaxies that lack bulges—do they harbor black holes as well?

    To help clear up at least the last of those questions, Ferrarese and two colleagues pointed the Hubble Space Telescope at nearby M33, a small spiral galaxy about 2.5 million light-years away that doesn't have a bulge. With Hubble's Space Telescope Imaging Spectrograph, the team measured the light spectra from stars near the galaxy's center. By looking at how much the spectral lines are redshifted or blueshifted, the astronomers figured out how quickly the stars at the core of the galaxy were moving.
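    The velocity measurement rests on the ordinary Doppler relation: a spectral line's fractional wavelength shift, times the speed of light, gives the star's line-of-sight speed. A minimal sketch of that arithmetic, with illustrative numbers rather than the team's data:

    ```python
    # Radial velocity from the Doppler shift of a spectral line (non-relativistic).
    # Wavelength values below are illustrative, not measurements from the M33 study.
    C_KM_S = 299_792.458  # speed of light, km/s

    def radial_velocity(rest_wavelength_nm, observed_wavelength_nm):
        """Positive = redshifted (receding); negative = blueshifted (approaching)."""
        shift = observed_wavelength_nm - rest_wavelength_nm
        return C_KM_S * shift / rest_wavelength_nm

    # A line with rest wavelength 850.0 nm observed at 850.1 nm:
    v = radial_velocity(850.0, 850.1)
    print(f"{v:.1f} km/s")  # ~35.3 km/s
    ```

    Repeating this for many stars at different distances from the galaxy's center yields the velocity profile the astronomers examined.
    
    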

    Sure enough, the stars very close to the center of the galaxy weren't moving much faster than those farther out, as one would expect if a supermassive black hole squatted in the center of the galaxy. If there is a black hole at all, it must be puny compared to its supermassive cousins. “How big a black hole can you hide in there?” asks Ferrarese. “About 3000 solar masses.”
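    The logic behind that 3000-solar-mass ceiling is a standard enclosed-mass argument: the mass inside an orbit scales as the orbital speed squared times the radius, M(<r) ≈ v²r/G. A back-of-the-envelope version (not the team's actual modeling, and the input numbers are illustrative):

    ```python
    # Order-of-magnitude enclosed-mass limit from stellar orbital speeds:
    # M(<r) ~ v^2 * r / G. Illustrative inputs, not the paper's fitted values.
    G_PC_KMS2_MSUN = 4.30091e-3  # G in pc * (km/s)^2 per solar mass

    def enclosed_mass_msun(v_kms, r_pc):
        """Mass (in solar masses) needed to hold a circular orbit of speed v at radius r."""
        return v_kms**2 * r_pc / G_PC_KMS2_MSUN

    # If stars ~1 pc from the center orbit no faster than ~3.6 km/s,
    # the central mass can't exceed a few thousand suns:
    print(f"{enclosed_mass_msun(3.6, 1.0):,.0f} solar masses")  # ≈ 3,013
    ```

    Because the observed speeds near M33's core stay low, any central black hole must be far below the million-solar-mass scale seen in bulgy galaxies.
    
    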

    “In part, it answers the question of whether black holes are absent in bulgeless galaxies,” says Luis Ho, an astronomer at the Observatories of the Carnegie Institution of Washington in Pasadena, California. “It's nice. Now there's one datum, though we'd like to get a lot more.” According to Ho, the lack of an apparent black hole at the center of M33 implies that the supermassive black holes in the centers of galaxies form “during or after bulge formation.”

    Astronomer Karl Gebhardt of the University of Texas, Austin, whose team has analyzed the same Hubble data and come to the same conclusion about M33, agrees. “The process that forms the bulge might form the black hole,” he says. If so, careful analysis of galaxies with different-sized bulges will give astronomers snapshots of supermassive black holes at different points in their evolution. That information may enable scientists to figure out how supermassive black holes form, and why there's no invisible star at the center of M33.


    Procedures Faulted in Fatal Asthma Trial

    Eliot Marshall

    BALTIMORE, MARYLAND—Eight weeks ago, a young lab technician at Johns Hopkins University in Baltimore died after participating in a clinical study of asthma. The university delayed initial disclosure of the death of the volunteer—Ellen Roche, a 24-year-old employee of the Hopkins Asthma and Allergy Center, which ran the study (Science, 22 June, p. 2226)—but this week it made public a 32-page internal report on the case and answered a barrage of questions from the press.

    The seven-member inquiry, chaired by cardiologist Lewis Becker, criticized some aspects of the study but found no major flaws. “We will never know the exact cause of her death,” said Edward Miller, the CEO of Johns Hopkins Medicine and dean of the medical faculty. But, he added: “We accept full institutional responsibility” for the tragedy.

    The inquiry concluded that the fatal reaction was probably triggered by a chemical used in the trial, hexamethonium bromide. Roche was the third volunteer to inhale this chemical; the first developed a short-lived dry cough, and the second reported no problems. Roche responded differently. The alveolar sacs of her lungs, which transfer oxygen into the blood, were irreversibly damaged. She slowly asphyxiated between 4 May, when she inhaled hexamethonium, and 2 June, when she suffered multiple organ failure.

    Full responsibility.

    Johns Hopkins medical dean Edward Miller announced institutional changes at a press conference.


    Hexamethonium blocks certain autonomic system nerves, including those controlling the airways. It was used as part of a simulated asthma episode in volunteers who were given a drug that induces asthma-like effects. More than 3 decades ago, doctors prescribed a pill form of hexamethonium to treat hypertension; that approved use ended when manufacturers withdrew the drug in the 1970s. (Its main side effect was to decrease blood pressure too much.) Clinical researchers a decade ago also gave an inhalable form of hexamethonium—similar to the one used at Hopkins—to 20 volunteers in two independent studies. They reported no ill effects. But, the Hopkins review found, the Food and Drug Administration (FDA) has never approved hexamethonium for any use by inhalation.

    The report also notes that, to shorten the procedure, hexamethonium was delivered to Roche's lungs by a more powerful spray mechanism than was used for the first two volunteers. This might have resulted in a higher concentration, the report says, although “the pharmacokinetics of inhaled hexamethonium are not known, and any possible increase in lung tissue concentration in [Roche] cannot actually be verified.”

    Given the lack of experience with the drug, the panel examined whether the lead researcher, Alkis Togias, and the university's human safety group, the Institutional Review Board (IRB), had researched the hazards adequately. The panel found that Togias's literature review was “standard,” although it failed to turn up reports of lung toxicity from 1953 to 1970 among seriously ill patients who had taken hexamethonium intravenously. But the panel faulted the IRB for lack of rigor, finding that the board lacked adequate evidence to conclude “that inhaled hexamethonium was safe for use in research subjects.”

    Becker's group also noted problems in the consent form. It didn't state that inhaled hexamethonium had never been approved by the FDA or that volunteers could risk death. The panel found no evidence that Roche or other volunteers had been coerced into participating, but it disclosed that eight of the nine volunteers for the trial were employed by the Hopkins Asthma Center. When asked if employees were expected to volunteer as part of their work, Becker responded firmly that they were not. The aim of the research, Becker's panel said, was “important,” and the scientific rationale was “solid.”

    Miller announced that Hopkins intends to add a third IRB to the two it already maintains—this one to conduct random checks of clinical trials. Plans are also under way for a stem-to-stern review of clinical operations. And all trials directed by Togias, as well as 16 others employing chemicals not approved by FDA for clinical use, have been suspended pending review. “We will have to raise the bar [for clinical research] even higher,” Miller said. The next step, he added, will be to ask a panel of experts headed by Samuel Hellman, dean emeritus of the University of Chicago School of Medicine, to take an independent look. That report will go to the university's trustees “by late summer.”


    Wet Stellar System Like Ours Found

    Richard A. Kerr

    A solar system is dying, and in its last gasps astronomers 500 light-years away can see signs that a billion comets are blazing into oblivion at once. The discovery of huge amounts of water streaming away from an aging, swollen red giant star in the constellation Leo shows that our own planetary system is not alone in harboring a key ingredient of life as we know it, researchers reported in last week's issue of Nature.

    Scientists operating the Submillimeter Wave Astronomy Satellite (SWAS) in low-Earth orbit had no intention of getting into astrobiology. SWAS was designed to measure water, oxygen, and carbon in gas clouds around the galaxy, but a gap in the observing schedule seemed best filled by a star called CW Leonis, one that should have had practically no detectable water anywhere near it. Instead, SWAS detected 10,000 times more water than the star could have been giving off. The only way to make that much water is by vaporizing it from a billion icy comets at once, SWAS researchers concluded. “Nothing but comets comes close to the amount of water SWAS is seeing,” says SWAS team member David Neufeld of Johns Hopkins University in Baltimore. “We believe we are witnessing the apocalypse that will engulf our solar system in 6 billion years.”

    Nearing the end.

    The star CW Leonis has swollen to engulf any nearby planets and is vaporizing an entire belt of comets.


    CW Leonis, it appears, is consuming a belt of small, icy bodies orbiting it just as Kuiper Belt objects (KBOs) orbit our sun beyond Pluto and Neptune. A KBO becomes an active comet only when it swoops near the sun, but CW Leonis, running low on nuclear fuel, has ballooned out to the distance of Jupiter from our sun and blazed to 5000 times its normal luminosity. That would vaporize the ice of bodies orbiting 10 to 100 times the Earth-sun distance from CW Leonis, SWAS researchers say, turning each into an active comet with a fuzzy, glowing head and streaming tail.

    “If their interpretation is correct, instead of just finding huge planets around other stars, we're finding comets,” says astronomer Tobias Owen of the University of Hawaii, Honolulu. “A lot of us believe these icy bodies are fundamental building blocks of planets. It's nice to know they're out there. It helps the prospects of finding planets, planets with the [gases] that make atmospheres and oceans”—and that could sustain life.


    Congress Orders Halt to Planned NASA Cuts

    Andrew Lawler

    Researchers upset about cuts to space station research have found some allies in Congress. A powerful House panel that sets NASA's budget last week ordered the agency to halt its plans to gut nearly 40% of the orbiting facility's science program. It also added money to rescue one set of experiments and asked President George W. Bush for a “clear and unambiguous statement” on the role of research aboard the orbiting lab. The move sets the stage for another confrontation between Congress and the new Administration over who should pay for the station's skyrocketing price tag.

    This spring, the White House refused to request additional funding to meet an overrun of more than $4 billion on the $60 billion facility, ordering NASA to find the money within the program's own strained budget. That prompted NASA to scale back the number of crew members planned for the station as well as its budget for research equipment. Researchers quickly claimed that those moves would cripple science aboard the orbiting lab (Science, 23 March, p. 2291).

    NASA Chief Scientist Kathie Olsen says the agency intends to spend, through 2006, 36% less on research facilities than the $4.4 billion in its original plans. Some areas, such as fundamental biology, would take up to an 80% cut. In 2002 alone, the $452 million planned for facilities would sink to $284 million, according to Olsen. She adds that the changes amount to a shortfall of only $70 million to $75 million in research-related efforts in 2002, in part because a new round of delays in launching station hardware means there's no rush to build some of the experimental facilities. But Olsen insists the cuts don't mean a reduced commitment to science: “Research remains central on the station—I am adamant on this.”

    The House panel moved last week to ease the problem by adding $35 million for fluids and combustion research, which accounts for a small portion of planned 2002 station research funding. Much of that program is run by the Glenn Research Center in Cleveland. The panel also added $275 million for work on a crew return vehicle to carry six persons—the number needed to support the station's ambitious scientific agenda—provided the Administration includes funding for the vehicle in its 2003 budget planning.

    The committee has asked for a comprehensive plan on the station by 1 March 2002, and it told NASA to suspend its plans to cut research until Congress decides the number of crew members. The agency is setting up an independent panel to review NASA's scaled-back station plan, with a report due by the end of the year. NASA is also searching for cheaper alternatives to keep research on track, including use of the space shuttle for extended periods.

    In the meantime, NASA's international partners are having their own problems. A Japanese centrifuge to conduct a host of biological experiments has been delayed repeatedly because of technical problems and won't be available to researchers until late 2008. Such distant dates frustrate would-be station researchers. “It's just so discouraging,” says Patricia Russell, executive director of the American Society for Gravitational and Space Biology.


    New Panel Will Redirect Science

    Elizabeth Pennisi

    The Smithsonian Institution took a long-awaited step this week, selecting 18 researchers to help guide this conglomerate of 16 museums, a zoo, and a half-dozen research centers through a reorganization of its scientific research. The institution has been in turmoil since early April, when institution scientists and the public got wind of Smithsonian Secretary Lawrence Small's plans to cut some research efforts and revamp others. The Smithsonian's Governing Board of Regents decided in early May to convene this panel to help quell the outcry and chart a less controversial course for future scientific endeavors (Science, 11 May, p. 1034; 13 July, p. 194).

    The panel includes biologists, astronomers, geologists, anthropologists, and paleontologists—a half-dozen of whom are Smithsonian staff members. Chaired by Jeremy Sabloff, director of the University of Pennsylvania Museum of Archaeology and Anthropology in Philadelphia, the panel will first meet on 6 September. Afterward, “we will take as long as it is necessary to come up with the appropriate recommendations,” says Sabloff. Over the next several months, he expects the group to evaluate reorganization plans proposed by Small, Smithsonian scientists, and perhaps even commission members.

    Smithsonian paleontologist Brian Huber, a spokesperson for the Senate of Scientists at the beleaguered National Museum of Natural History, is not happy about the wait: “It's going to be a slow process, and we're going to be in limbo for some time.” Even so, he says, the delay will be worth it if the panel “will move us in a direction we want to go.”


    Animals Line Up to Be Sequenced

    Josh Gewolb

    CHEVY CHASE, MARYLAND—The mouse was a shoo-in. After all, what other organism could better illuminate the human genome? The rat has undisputed standing as a lab staple, and the zebrafish brings a clear vision of development. But, with work on the human genome winding up in 2003, deciphering these other three will only keep the 2000-base-per-second worldwide sequencing capacity busy for so long. So it's high time to add other creatures to the pipeline, sequencers agree. With 1.7 million known species to choose from, however—and almost as many specialists lobbying for their favorites—selecting the next few will not be easy.


    The primate community is arguing over which to sequence first: the macaque (top) or the chimp.


    The stakes are enormous, researchers agreed at a workshop* here last week designed to set criteria for choosing the next candidates for the sequencing machines. If an organism is picked, its research community is guaranteed to be vibrant and well-funded long into the future. As mammalian geneticist Steven O'Brien of the National Cancer Institute put it, “Species that don't get selected will go away, and species that do get selected will prevail.”

    At the invitation-only workshop, sponsored by the National Human Genome Research Institute (NHGRI), four dozen researchers touted the research value of sequences from organisms as varied as swine, cats, and sea urchins. Partisans of microorganisms known as protists commandeered the slide projector to woo the crowd with a diagram showing that microorganisms constitute all but a tiny fraction of living things. Primates clearly dominated the discussion—and in this venue, at least, macaques had a home court advantage, as no chimpanzee experts were present.

    Meanwhile, meeting organizers urged the group to put horse-trading aside and instead settle on criteria for deciding what to sequence next. With co-chairs David Botstein and Robert Horvitz cracking the whip, the group agreed on two sets. One they called general considerations, such as the ease of obtaining sequence and the factors that will make the sequence useful. These included small genome size, existing technical knowledge, suitability for experiments, and an active and eager research community.

    The other was scientific merit—essentially, what questions a particular organism would enable researchers to explore. This was a bit more tricky, because efforts to understand human diseases or probe evolutionary relationships would demand different organisms, the group agreed. To study the evolution of traits, for example, scientists would want to sample groups in each of the major branches of life—say, a mollusk, an earthworm, and a starfish. But to develop new model systems for human neurobiology, a species closely related to humans would be more useful. Still other organisms would help researchers interpret sequence data from humans and model organisms. Although the assembled scientists outlined about 10 questions, they wisely did not attempt to rank them.

    The next step in this already-contentious process, said NHGRI chief Francis Collins, is to turn these preliminary ideas into formal recommendations. The co-chairs are expected to issue a report in the next few months. Collins recommended that over the next few years, NHGRI solicit white papers, hold additional conferences, and support pilot projects to explore the scientific and political questions underlying the choice. Ultimately, said Collins, the institute will issue a call for investigator-initiated proposals, and the decision will be left to the peer-review process. From the looks of things last week, there will be no shortage of applications.

    • *NHGRI Workshop on Developing Guidelines for Choosing New Genomic Sequencing Targets, 9–10 July.


    S. pneumoniae Genome Falls to Sequencers

    Dan Ferber

    At the height of his power, Genghis Khan and his armies swept from the steppes of Mongolia, capturing two-thirds of the known world. But for sheer ferocity, Khan's forces pale next to those of a tiny microbe called Streptococcus pneumoniae. These pathogenic bacteria readily move from the throat to invade the lungs, blood, or brain—every year killing millions of children and elderly people worldwide with pneumonia, bloodstream infections, or meningitis. Now, researchers have uncovered clues to just what makes the organism so savage.

    On page 498, a team led by Claire Fraser and Hervé Tettelin of The Institute for Genomic Research (TIGR) in Rockville, Maryland, reports the exact order of the 2.16 million bases that make up the genetic code of a virulent S. pneumoniae strain. By examining the sequence and comparing it with those of other strains, the team found that the microbe is particularly well equipped to invade the body's tissues. It also seems adept at shuffling its genes—an ability that may help it evade the immune system. This “thrilling view” of the S. pneumoniae genome offers “a [new] glimpse into the lifestyle of the organism,” says microbiologist Alexander Tomasz of Rockefeller University in New York City.

    It should also provide new targets for better drugs to treat S. pneumoniae infections and vaccines to prevent them, both of which are badly needed. The microbe's resistance to penicillin and related antibiotics has skyrocketed worldwide in the last decade, and none of the vaccines on the market can ward off all the dangerous strains.

    Fearsome invader.

    Squads of sugar-digesting enzymes may help Streptococcus pneumoniae bacteria, such as these, eat their way through human tissue.


    The TIGR team began the sequencing project in 1996, but progress was slowed by a series of technical problems and a split with the institute's original sponsor, Human Genome Sciences of Rockville, Maryland, that caused a temporary funding drought. Although TIGR made most of the raw sequence data available in 1997, the team took several more years to finish the sequencing and characterize the genes, a process called annotation.

    Meanwhile, several companies, including Glaxo Wellcome and SmithKline Beecham (now merged to form GlaxoSmithKline) and Eli Lilly & Co., sequenced most of the genomes of three other S. pneumoniae strains. GlaxoSmithKline's Damien McDevitt predicts that “there are at least 10 genomes out there from different companies.” So far, however, only two have been made public.

    A team led by Jose García-Bustos of GlaxoSmithKline's molecular microbiology division in Tres Cantos, Spain, published the annotated but incomplete genome of a virulent, antibiotic-resistant S. pneumoniae strain in the June issue of Microbial Drug Resistance. After publishing an incomplete draft of the genome of a widely studied, nonvirulent laboratory strain in 1998, an Eli Lilly team says a more complete version is in press at the Journal of Bacteriology.

    Comparing multiple genomes is key for stopping S. pneumoniae, because 92 different strains infect humans—and researchers would like to protect against all of them. The TIGR team has taken a step in that direction by comparing its strain with both a nonvirulent laboratory strain and a second virulent one. This revealed that about 10% of the genes in the virulent TIGR strain are missing in the other two, pointing to several genes that may be important for infection. (In the June issue of FEMS Microbiology Letters, microbiologists Gianni Pozzi and Marco Oggionni of the University of Siena Medical School in Italy, who also used TIGR data, reported similar results.)
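    The comparison the TIGR team made amounts to a set difference: genes present in their virulent strain but absent from both comparison strains become candidate virulence genes. A minimal sketch of the idea, using made-up gene names purely for illustration:

    ```python
    # Comparative-genomics sketch: genes unique to the virulent strain are
    # candidate virulence factors. All gene names here are hypothetical.
    virulent = {"capA", "nanB", "hysA", "pspC", "iga"}
    lab_strain = {"capA", "iga"}
    other_virulent = {"capA", "nanB", "iga"}

    # Genes in the virulent strain that neither comparison strain carries:
    candidates = virulent - (lab_strain | other_virulent)
    print(sorted(candidates))  # ['hysA', 'pspC']
    ```

    In the real analysis the gene sets run to roughly 2000 entries per strain, and about 10% of the virulent strain's genes survive this filter.
    
    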

    Among the potentially important genes are a group that encodes a variety of unusual cell-surface enzymes that the microbe uses to break down the carbohydrates that help hold biological membranes together, thus weakening barriers to invasion and freeing sugars for the bug to eat. Indeed, S. pneumoniae appears to have sworn off many amino acids in favor of sugars. If similar carbohydrate-degrading enzymes are found in all the pathogenic strains, they might be good drug targets or vaccine candidates.

    But another finding suggests that S. pneumoniae may be more capable than most bacteria of undergoing the genetic changes needed to elude such protective agents. Several of the microbe's genes appear to be foreign—possibly acquired from other bacteria. In addition, 5% of the S. pneumoniae genome is made up of small genetic elements that can hop to new locations in the chromosome—compared to 3% or less in more typical bacteria. “That's an incredible number,” Tomasz says.

    In another departure from the norm, almost all of these so-called insertion sequences had hopped between genes rather than into them. Together, the results suggest an “extraordinary capability to take DNA, move it around, shuffle it, and do it without making any mistakes,” says infectious-disease specialist Elaine Tuomanen of St. Jude Children's Research Hospital in Memphis, Tennessee.

    Despite the bug's craftiness, experts are optimistic that the genome data will help uncover how the microbe activates key genes as it invades the deep tissues of the body. “It will be like having a frozen picture that all of a sudden starts moving,” Tomasz says.

  JAPAN

    Lab Chiefs Decry Push for Strategic Research

    Dennis Normile

    TOKYO—Shattering a tradition of restraint in criticizing government policy, 14 current and four former heads of major Japanese laboratories have sent an open letter to Prime Minister Junichiro Koizumi, pleading for greater recognition of the value of basic research and a bigger role for active researchers in shaping the nation's research policies.

    The lab chiefs' main target is a report issued on 11 July by the Council for Science and Technology Policy, the nation's highest science advisory body. The 12-page report recommends that the government realign its research priorities “to strengthen industrial competitiveness, invigorate the economy, [and] promote a high quality of life and a vigorous society.” In particular, the council said that the budget for the fiscal year beginning next April should lean toward the life sciences, information technology, environmental studies, and nanotechnology.


    Japanese astrophysicist Norio Kaifu and other lab chiefs have attacked a new report on research priorities.


    The lab chiefs responded the same day, complaining to Koizumi, who chairs the council, that the report is a vote for “the short-term goal of strengthening industrial competitiveness.” Such a policy would be shortsighted, they argue, because “advanced science and technology must be supported by the cultivation of basic research in a wide range of fields.” The missive was drafted by Yoshiki Hotta, director-general of the National Institute of Genetics, based in Mishima; Norio Kaifu, director-general of the National Astronomical Observatory of Japan in Mitaka; and Motoya Katsuki, head of the National Institute for Basic Biology in Okazaki.

    “The [report] really says very little about basic research,” says Kaifu. The lab heads felt they had to speak up now to ensure that their concerns are considered as the various ministries start work on next year's budget.

    Koji Omi, a career politician who was recently named to the new position of minister for science and technology policy, says the government agrees with the lab chiefs and doesn't see a gap between their concerns and the council's report. “We recognize the importance of basic science,” he said at a press conference, “and those laboratory heads don't have to worry” about budget allocations.

    But the lab directors aren't taking anything for granted. Hotta says the group is weighing how to be more active in influencing future policy debates. “This is the first time we have presented a request [to a prime minister], but we may do so regularly from now on,” he says.


    Appeals Court Clears Way for Academic Suits

    1. Eliot Marshall

    Postdocs and junior faculty members generally have a tough time convincing a court to hear their grievances when they feel they've been denied a share in the financial rewards from discoveries they participated in. Now, the trip to the courthouse may have become a little easier.

    In a unanimous ruling on 3 July, a three-judge panel of the U.S. Appeals Court for the Federal Circuit in Washington, D.C., removed a major roadblock facing one such claim. Patent attorneys say the ruling sets a strong precedent for similar cases. The judges also admonished universities and senior faculty members to keep their junior colleagues fully informed of intellectual property claims they file.

    The ruling came in a case brought by herpesvirus researcher Joany Chou, a former postdoc at the University of Chicago who is now self-employed. Chou claims that in 1990 she discovered a variant herpesvirus gene that may be useful in vaccine manufacturing. She co-authored a paper on the finding with her lab chief and mentor, Bernard Roizman, a well-known virologist at the university. Roizman and the university filed for a patent, issued in 1994, on uses of the gene. When Chou learned about the patent—in 1997, she says—she demanded to be named as an inventor. She eventually sued the University of Chicago, Roizman, and two spin-off corporations, seeking due credit and a share of profits.

    Roizman and the other defendants deny Chou's allegations and argue that Roizman was the inventor. They sought to have Chou's suit thrown out on grounds that she has no legal “standing” because she can never claim to be the owner of the discovery. (Discoveries made in university labs belong to the university, no matter who the inventor is.) The Chicago federal judge who heard the case, James Zagel, agreed. He dismissed Chou's claims last year without examining the contentious details of who discovered what (Science, 31 March 2000, p. 2399).

    But the appeals court read the law differently. In a sweeping reversal, it ruled that the law “imposes no requirement of potential ownership” on people who want to go to court to prove they are the inventor of something that has already been patented. “Chou should have a right to assert her interest, both for her own benefit and in the public interest,” the court ruled. It went on to suggest that an inventor seeking legal redress of this kind doesn't even need to prove a direct financial stake. “After all,” the judges said, “being considered an inventor … is a mark of success in one's field, comparable to being an author of an important scientific paper.” And that kind of interest in a patent is probably enough to get a claimant into court.

    “I think this is an important decision,” says Sam Pasternack, a patent attorney who handles many academic intellectual property cases at the firm of Choate, Hall & Stewart in Boston. He is impressed not only by the fact that the ruling lowers the barriers to bringing such cases to court, but also by strong language in the ruling on the “fiduciary duty” of professors and university officials. The tone, he adds, implies that the lower court in Chicago “absolutely blew it.”

    The university will not appeal this decision, says Larry Arbeiter, a spokesperson; it will prepare for a trial in Chicago this fall. The university has examined Chou's claims carefully, he says, and “found them wanting.” Roizman's attorney, Timothy Vezeau of the Chicago firm of Katten Muchin Zavis, says Roizman “looks forward to presenting the facts in court. … He believes that Chou's assertions against him personally are absolutely meritless.”

    Chou's attorney, Paul Vickery of the Chicago firm of Niro, Scavone, Haller & Niro, says “it's hard to put a number” on the exact amount Chou is seeking, although it's likely to be in the tens of millions of dollars. Vickery is handling this case through a contingency agreement that will give him a share of any winnings. Right now, he wants to obtain more information on the profits made by Roizman and the university. The process of legal discovery will begin in a few weeks.


    NIH Review Outlines 'Enormous Promise'

    1. Gretchen Vogel

    In a comprehensive review of stem cell research, the National Institutes of Health (NIH) this week laid out its perspective on the promise and unanswered questions of the nascent field. As Science went to press, the report, requested by Secretary of Health and Human Services Tommy Thompson, was scheduled to be released at an 18 July Senate hearing.

    A preliminary copy of the report obtained by Science describes a field that is full of potential but still fairly short on concrete results. It carefully outlines the differences among results in stem cells derived from adults, fetal tissue, or embryos, reviewing both published and unpublished work. But the report does not take a position on whether the federal government should fund work with embryonic stem cells—a question President George W. Bush is still trying to resolve.

    Together, all types of stem cells “hold enormous promise for new approaches to tissue and organ repair,” says the report, compiled by the NIH office of public policy under the direction of Lana Skirboll. Offering a dramatic example, the NIH report describes a study, still under review at a scientific journal, suggesting that pluripotent stem cells can restore mobility to the hind limbs of rats paralyzed by a virus. In this work, John Gearhart of Johns Hopkins University in Baltimore, with colleagues Douglas Kerr, Jeffrey Rothstein, and others, used a line of cells that Gearhart originally derived from the gonadal tissue of an aborted fetus. The team injected the cells into the fluid surrounding the spinal cord of rats that had been infected with the so-called Sindbis virus. The virus destroys motor neurons in the rear half of rats' bodies, damage similar to that caused by amyotrophic lateral sclerosis (ALS). Three months after receiving the injections, many of the 18 treated rats were able to walk, albeit clumsily, the report notes.

    This is one of the first examples in which human pluripotent stem cells have partially corrected an animal model of disease. “The data are pretty dramatic,” says Rothstein, who nonetheless cautions that they are still preliminary. Experiments are now under way in a mouse model of ALS that is much closer to the human form of the disease, he says, but they have not yet produced results.

    Evan Snyder of Harvard Medical School in Boston, who has been collaborating with the Johns Hopkins team on related experiments, says the results are encouraging. He suspects that neuroprotective factors produced by the stem cells may be the reason for the recovery rather than new neurons.

    Despite the experiment's promise, NIH is cautious about the potential of stem cell therapy for spinal cord injury, one of the most frequently cited applications. “Complete restoration after severe spinal cord injury … is probably far in the future, if it can ever be done at all,” the report says. Partial restoration of some functions is “a more achievable goal.”


    Science Goes Begging in Recovery Package

    1. Richard Stone

    CAMBRIDGE, U.K.—The shipment last month of former Yugoslavian President Slobodan Milosevic to The Hague to stand trial as a war criminal has unleashed a flood of Western aid for the shattered country. But high hopes that some of the $1.28 billion pledged at a 29 June conference in Belgium would nourish good science have, for now, been dashed. Several Yugoslav science initiatives failed to win a slice of the pie, leaving their future uncertain.

    Research has struggled along with the rest of the fragile federation of Serbia and Montenegro since Milosevic was toppled last year (Science, 27 October 2000, p. 690). Roughly half of the country's top scientists are thought to have left the country, and the level of outside support has been disappointing. But there have also been hopeful signs: In February, Science Minister Dragan Domazet won a doubling of his budget, to $25 million, and he was looking to extend those gains with a slice of the new money.

    Instead, more than half the funds from the donors' conference, run by the European Commission (EC) and the World Bank, were earmarked for such reforms as overhauling the banking industry and tightening the social safety net. The Serbian government was allowed to dole out much of the rest (Montenegro received roughly 10% of the pot) according to its own priorities. And only one of a dozen projects—upgrading Internet connections—received the go-ahead. “We're pretty disappointed,” says Domazet. “We're not getting any financial help for our scientists or labs.”

    One of the biggest blows was a failure to secure additional funding for a state-of-the-art cyclotron facility under construction at the Vinča Institute of Nuclear Sciences near Belgrade. Serbia has already spent $18 million building the TESLA Scientific Center, which would do everything from probing atomic structure to treating cancer patients, and the science ministry had requested $8 million to finish the job. “We are continuing the fight,” says Vinča's Nebojsa Neskovic. Officials also hope to find donors to replace obsolete equipment in labs around the country and to create technology parks in Belgrade and Nis.

    Although money is scarce, contacts are expanding. Serbia has inked a deal to allow a handful of scientists to work at CERN, the European particle physics laboratory near Geneva. And the EC is poised to make Yugoslav scientists eligible to compete for funds in its flagship Framework research program.

    Domazet is also credited with making available funds go further. In a break with the tradition of spreading a thin budget evenly, a call for research proposals issued last month is designed to funnel money to the best labs. The idea is to force mediocre scientists in the 9000-strong workforce to upgrade their skills, change careers, or retire.


    Down to the Wire on Bioweapons Talks

    1. Richard Stone

    A decade ago, the world learned that the Soviets had built an extensive germ-warfare reserve; today, nations are trying to prevent anyone from trying it again

    CAMBRIDGE, U.K.—In the summer of 1994, U.S. experts on biological weapons received a startling invitation to visit a set of secret Russian labs. The offer came as they were delving into a clandestine program revealed earlier by Soviet defectors—one that had employed thousands of scientists and technicians to design and produce weapons loaded with deadly microbes, such as anthrax.

    Former Russian President Boris Yeltsin admitted in 1992 that the Soviets had run this vast enterprise, called Biopreparat. He also pledged to dismantle it and adhere to the ban on offensive weapons in the Biological and Toxin Weapons Convention (BWC), which the Soviets had ratified 2 decades earlier but clearly ignored.

    As a gesture of good faith, Yeltsin allowed Western experts to visit the Biopreparat buildings. But another clutch of facilities remained off limits—a shadowy network run by the Ministry of Defense (MOD) for the ostensible purpose of developing vaccines against bioweapons. U.S. and British experts kept asking questions, however, and in June 1994, they got a surprise: Russian officials invited them to visit any lab, including the MOD facilities. “The offer came out of the blue,” says a U.S. official.

    Over the next few weeks, however, separate talks on Russian visits to U.S. disease-research labs overseas foundered, and by fall, Russia had withdrawn its invitation. The MOD labs remain a closely guarded secret. “We had a chance to learn what they were up to,” says the U.S. official, “and we let it slip away.”

    Today, bioweapons experts say, the entire BWC could also become a lost opportunity. A quarter-century after entering into force, the treaty remains the weakest of the international arms-control agreements. The problem: It has no mechanism for checking on whether states parties are obeying the ban on developing biological weapons. Other agreements on nuclear and chemical weapons have established technical systems for monitoring compliance. But the BWC remains little more than an agreement based on trust.

    “We need to have a radar screen to identify spots and follow them up,” Ambassador Tibor Tóth, chair of the Ad Hoc Group of the States Parties to the BWC, told Science. “Right now our radar screen is blank.”

    Tóth is hoping to change that. Starting on 23 July, the Hungarian diplomat's Ad Hoc Group will meet in Geneva for a 4-week session to hammer out rules that BWC states parties must abide by. Formally known as the Protocol to the BWC, the measures would include mandatory investigations of facilities suspected of contravening the treaty as well as visits to declared facilities that are not under suspicion, plus export controls on organisms and technologies that might be used to develop biological weapons.

    The challenge will be to win over U.S. policy-makers. They are not alone in having concerns about some of the compromises in Tóth's “composite text,” but they have spoken against it strongly and consistently—most recently, before a U.S. House of Representatives national security subcommittee on 10 July. In prepared remarks, Ambassador Donald Mahley, special negotiator for chemical and biological arms control at the U.S. State Department, said that continuing negotiations have become “sterile.”

    Mahley also described “serious and substantive concerns” with Tóth's compromise language, noting that the United States cannot go along with requirements for “transparency” that could disclose secret data from defense labs or proprietary information from U.S. biotech companies. Restrictions on U.S. industrial exports, he said, would be at odds with trade policy. But he still held out hope, saying the Bush Administration is “grappling with its final decision” on the BWC protocol.

    Diplomatic efforts will shift into high gear at next week's meeting in Geneva, where a number of delegations plan an all-out effort to complete the protocol and persuade the U.S. team to go along with it. Rejecting the protocol “would send the message unequivocally that the United States does not care about establishing a stronger regime to prevent biological weapons and their proliferation,” maintains Graham Pearson, former director-general and chief executive of the U.K. Ministry of Defence's Chemical and Biological Defence Establishment in Porton Down.

    Adds James Leonard, former U.S. ambassador to the United Nations Conference on Disarmament: “A BWC with zero verification provisions stands as a sort of open invitation to do BW, while a BWC with a protocol, even if we think it is rather weak, is a substantial deterrent.”

    Concealed threat

    Biological weapons enjoyed mixed success when they emerged as a strategic threat in World War II. When Japan unleashed fleas infected with bubonic plague on Chinese forces in Manchuria in 1942, for instance, many Japanese soldiers themselves became ill. For the next quarter-century the United States and other countries aggressively developed their biological arsenals, prizing them as potential weapons of mass destruction. In 1969, however, U.S. President Richard Nixon, in a bid to stop other countries from refining their bioweapon capabilities, announced that the United States would dismantle its offensive program. A treaty banning these weapons altogether had great humanitarian appeal; country after country signed the BWC in 1972. In the era of détente, nuclear missiles held the world in thrall, while bioweapons appeared to fade from the military agenda.

    That comforting view changed little over the next 2 decades, despite occasional hints that countries were cheating on the BWC. In 1979, for example, dozens of people died of anthrax near the city of Sverdlovsk, now Yekaterinburg. Soviet officials invited suspicious Western experts to Moscow and managed to convince them that the anthrax outbreak came from contaminated beef. (In 1992, Yeltsin acknowledged that anthrax spores had escaped from an MOD lab in Yekaterinburg.)

    The mind-boggling scale of Soviet deceit came to light in the 1990s as Western intelligence agencies learned from Soviet defectors Vladimir Pasechnik in 1989 and Kanatjan Alibekov in 1992 about the extent of the Biopreparat operation. At its reported peak in the 1970s, the Soviet bioweapons program, which had begun around World War II, employed an estimated 25,000 or more workers, technicians, and scientists at more than a dozen major facilities.

    Russia claims to have made a clean break with the past, but Western experts have lingering concerns about the MOD labs. Russian officials say they cannot shed any light on the MOD lab activities. “They are off limits for national security reasons,” says an official with the Agency of Munitions in Moscow. “If I were to ask questions about what goes on in these labs, I could get into trouble.” The draft protocol, notes Leonard, would provide “a framework and a trigger” for helping to dispel such questions.

    The difficulties of monitoring BWC compliance came into sharper relief after the Gulf War, when the United Nations Special Commission (UNSCOM) went to work in Iraq to ensure the elimination of Saddam Hussein's weapons of mass destruction. With Iraqi officials grudgingly cooperating from 1991 to 1995, the UNSCOM team found loads of circumstantial evidence—facilities with a high capacity for fermentation inconsistent with peaceful purposes as well as irreconcilable records—all pointing to a broad, clandestine program aimed at “weaponizing” bacteria, viruses, and toxins. Despite UNSCOM's sweeping mandate to investigate suspicious activity anywhere and anytime, it wasn't until a son-in-law of Hussein defected in 1995 with damning inside information that the commission was able to further pressure Iraq into acknowledging the extent of its offensive biological weapons program.

    Toxic brews.

    The United States and other countries are now drafting science briefs describing the state of the black art of bioweapons—including a few of the potential threats highlighted in this table—for the fifth BWC review conference in November. At the conference, signatories will discuss a declaration affirming the treaty that would account for threats that have emerged since the last review conference took place 5 years ago.


    The UNSCOM experience crystallizes one of the biggest points of dispute among Ad Hoc Group negotiators at next week's meeting: how to protect national rights while empowering inspectors to determine whether a country is complying with the BWC. The draft protocol—the 16th version of the now-210-page document—would, among other things, allow a future protocol body to mount random transparency visits at declared facilities in precisely defined categories, including maximum containment (biosafety level 4) labs, vaccine facilities, biodefense shops, and plant pathogen containment laboratories. If a facility were suspected of contravening the treaty, the protocol would permit challenge investigations, in which teams of up to 30 investigators would be allowed to remain on site for 84 hours for a lab visit, or 30 days to investigate an alleged field release of a bioweapon.

    These provisions, Tóth told the House subcommittee last week, would “create a climate of openness and candor around significant dual-use activities. We are about creating light where there is darkness.” U.S. officials disagree. “Illicit biological warfare work could easily be concealed or cleaned up, rendering it highly improbable that international inspectors would detect evidence of noncompliance,” asserts the State Department's Edward J. Lacey.

    There's the rub, notes a U.S. delegate to the Ad Hoc Group. Weak challenge investigations might provide “a false sense of security,” the participant warns. “Under the more restrictive protocol, would you ever really find a smoking gun? … Is that false sense of security better than nothing?” But negotiators from other countries—and some U.S. experts as well—argue that protocol provisions can deter violators even if they don't necessarily provide access to ironclad evidence. “After all,” asks Pearson, “what treaty ever provides such evidence?”

    Political endgame

    The U.S. is also sparring with its allies over the procedures for facility visits and mandatory declarations of potential dual-use organisms and technologies. Driven by the concerns of the biotech and pharmaceutical industries, the Bush Administration is worried about the inadvertent leakage of trade secrets—vaccines in development, for example. The Administration also fears that visits to government labs could compromise national security. A facility might take pains to ensure that nothing it declares or shows investigators would harm specific security interests. However, claims the U.S. Ad Hoc delegate, by looking at “aggregates of information,” an enemy might add things up and find vulnerabilities in biodefenses of the U.S. or its allies.

    Negotiators from other countries discount the U.S. objections as primarily politically motivated. When the Ad Hoc Group first began its deliberations in 1995, the British government pressed for more expansive measures that would give the protocol more teeth than the composite text. That put the United Kingdom at odds with the United States on several key issues, including the scope of declarations and site visits. In negotiations since, the British delegates have sought to find compromises that would be acceptable to the United States and other countries without undermining the protocol's effectiveness. The United Kingdom is now solidly in the protocol-enthusiastic camp. “The protocol, in my view, is a very fair compromise,” says a senior U.K. official close to the negotiations.

    But the protocol has some difficult hurdles to clear. If it wins a vote of confidence in the next few weeks—which will require at least tacit U.S. backing—it would go to a Special Conference in the autumn. If treaty signatories approve it there, the protocol would then go to each country for ratification.

    That could be problematic in the United States. Arms-control agreements have not been popular among Republicans in the U.S. Senate, where a two-thirds majority is required to ratify international agreements. It would take a remarkable change of attitude in the Bush Administration to line up enough votes to get the protocol approved.

    If the Ad Hoc Group reaches an impasse next month, one U.S. participant doesn't think the years of talks will have been all for naught. The discussions have brought weapons experts from so-called rogue nations that are BWC states parties into the fold and allowed experts from former adversaries to forge more open relationships. “It's much more difficult, in the end, to make a weapon against someone you know,” the delegate says, “unless your government compels you to or unless you're truly evil.” Of course, without a way to ensure treaty compliance, it will be hard to know who really is evil—until it's too late.


    Malaria's Beginnings: On the Heels of Hoes?

    1. Elizabeth Pennisi

    By analyzing DNA of the parasite that causes malaria, researchers are trying to determine the role agriculture played in promoting this deadly disease

    How long has the deadly malaria parasite Plasmodium falciparum plagued humanity, causing wrenching disease and epidemic death? Since hominids and chimps first went their separate ways? Or only after agriculture created the right conditions for malaria's spread? Scholars have debated this question for decades, but only recently have data become available that might turn theory into fact. By comparing DNA among the various Plasmodium species and strains, researchers are attempting to reconstruct when modern Plasmodium falciparum first emerged. But, depending on where they look, the DNA is providing different answers—and resolution is nowhere in sight.

    Both camps agree that P. falciparum has been around in one form or another since the human branch of the primate tree split off from chimps about 8 million years ago. One new analysis, reported on page 482, supports the notion that epidemic malaria traces back to a small population of P. falciparum that suddenly expanded exponentially about 20,000 years ago. But another, in press at the Proceedings of the Royal Society, suggests the parasite has been common for hundreds of thousands of years, and that malaria took much the same toll on our ancestors on African savannahs as it does today across the globe. “This work is all so new that opinions are unsettled,” says Daniel Hartl, a population geneticist at Harvard University.

    Persistent parasite.

    After attaching to red blood cells, Plasmodium falciparum (yellow) enters the cell, where it differentiates into a gametocyte that is ready to be taken up by the mosquito.


    Although the prevailing view has long favored ancient origins, a few iconoclasts proposed a link between malaria and agriculture as early as 1958. More recently, a DNA-based study of the Plasmodium family tree, done in 1992, concluded that P. falciparum evolved from bird parasites after agriculture was established.

    Curious about that result, Francisco Ayala, a population and evolutionary geneticist at the University of California, Irvine, decided to build his own family tree. To do so, he and his colleagues compared the genes for the small subunit ribosomal RNA of 11 Plasmodium species. As they reported in 1994, P. falciparum, the most virulent of the three Plasmodium species that infect humans, appeared to be about 8 million years old; in addition, it was more closely related to P. reichenowi, which infects chimps, than to either of the other human pathogens or the bird parasites.

    But Ayala revised his thinking in 1998, after he and Stephen Rich from Tufts University in Medford, Massachusetts, did another analysis, this time assessing the amount of variation among DNA from different P. falciparum strains. (Typically, the older a species is, the greater the variation among strains.)

    Using published data, they compared the DNA sequence of 10 genes in 30 P. falciparum strains. To their surprise, they found almost no variation—just a few differences that altered the protein-coding DNA. These data indicated that the 30 strains all came from a common ancestral population no more than 57,500 years ago. This conflicted, however, with earlier work by others that found considerable variation in the genes for antigenic proteins that are targets of the human immune system. Ayala and his colleagues concluded that the variation among those genes must have cropped up as they were rapidly evolving to evade the human immune system.

    Skepticism was rampant. “My opinion was that [Ayala's result] could not possibly be true,” Hartl recalls. Later that year, Austin Hughes, a molecular evolutionist at the University of South Carolina, Columbia, published a critique of the work. Among other points, he noted that the variation in one of the antigenic genes, CSP, indicated that the parasites had been around a very long time. He suggested that perhaps Rich and Ayala didn't analyze the right set of genes.

    Seeking resolution, Dyann Wirth, a molecular parasitologist at the Harvard School of Public Health in Boston, and Karen Day, an epidemiologist at the University of Oxford, asked Hartl for help in deciphering the parasite's history. By that time, 2000, two of the parasite's 14 chromosomes had been sequenced and published, enabling the group “to take a much more systematic approach” than had Ayala and Rich, says Wirth.

    They first identified eight genes that should have no evolutionary forces pressuring them to change at an accelerated or slowed rate. They decided to focus on the introns, the noncoding regions of the genes, because these regions mutate readily and so should be the first to accumulate differences. Thomas McCutchan, a molecular parasitologist at the National Institute of Allergy and Infectious Diseases (NIAID), calls that strategy “really smart.” In all, they sequenced 25 introns from eight P. falciparum strains collected from all over the world, sequencing each multiple times for accuracy.

    As they report on page 482, the Harvard-Oxford team found just three differences in the sequences from the different isolates, two of which were in an intron that might actually play a more important role in the gene than the researchers originally thought. Using those changes, they calculated the age of the genome as between 9500 and 23,000 years—about the time when early hunters settled down in communities and began to farm. “This may have been the time when human conditions, culture, and the [mosquito] vector came together to develop self-sustaining epidemics,” explains Hartl.
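    The kind of molecular-clock arithmetic behind that estimate can be sketched in a few lines of Python. The sketch below is purely illustrative, not the team's actual pipeline: it uses a toy alignment and an assumed per-site mutation rate (neither drawn from the study), counts the positions at which aligned intron sequences differ, and converts that count into a rough time to common ancestry.

    ```python
    # Illustrative molecular-clock sketch; alignment and mutation rate
    # are invented for demonstration, not taken from the paper.

    def count_segregating_sites(seqs):
        """Count aligned positions where the strains differ."""
        return sum(len(set(column)) > 1 for column in zip(*seqs))

    def clock_age_years(diffs, total_sites, rate_per_site_per_year):
        """Roughly invert diffs ~ rate * sites * time for time."""
        return diffs / (total_sites * rate_per_site_per_year)

    # Toy data: four "strains", 20 aligned intron sites, one variable site.
    strains = [
        "ATCGATCGATCGATCGATCG",
        "ATCGATCGATCGATCGATCG",
        "ATCGATCGATTGATCGATCG",  # single difference at position 10
        "ATCGATCGATCGATCGATCG",
    ]

    sites = count_segregating_sites(strains)
    age = clock_age_years(sites, total_sites=20,
                          rate_per_site_per_year=1e-5)  # assumed rate
    print(sites, round(age))  # prints: 1 5000
    ```

    With the real inputs (three differences across 25 introns) and a calibrated mutation rate, the same logic is what yields an estimate on the order of the 9500-to-23,000-year range the team reports.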

    Rich and Ayala are impressed. “[The study] is more complete than ours, as we were relying on stuff in public databases,” says Rich. The new results also jell nicely with work by David Conway, a molecular parasitologist at the London School of Hygiene and Tropical Medicine. He, too, found little variation when he analyzed mitochondrial, as opposed to nuclear, genomes of various strains. “The implication, which I find convincing, is that virtually all of the existing sequence variation in the nuclear or mitochondrial genomes is of recent origin,” Conway says.

    Supporting evidence comes from other quarters as well. A new study, reported on page 455, suggests that the genetic mutation that confers malaria resistance in humans also arose recently (also see p. 442). To NIAID's McCutchan, the Harvard-Oxford group's new work “is another brick put in place,” demonstrating the intertwined destinies of humans and P. falciparum: Agriculture enabled not only human populations but also P. falciparum's to expand greatly.

    That edifice is far from solid, however. Richard Carter, a geneticist from the University of Edinburgh in Scotland, for one, questions the underlying assumptions that the Harvard-Oxford team used to calculate the ages of the strains. Hughes, who advocates an ancient origin for P. falciparum, calls the conclusions “premature” because the team looked at only two of the genome's 14 chromosomes. As evidence, he cites his work with Federica Verra of the University of Rome, reported in their upcoming Proceedings of the Royal Society paper. The two recently compared versions of 23 P. falciparum genes whose sequences were in public databases, looking for any base changes in them. They found plenty, enough to conclude that P. falciparum has existed in substantial numbers for at least 300,000 years.

    Hartl calls the Hughes paper “extremely important and provocative.” It might be right, he concedes—or not. One possibility is that the differences Hughes detected may simply reflect sequencing errors in the public database, although Hughes thinks that errors would account for only a small proportion of that variation. Alternatively, perhaps some parts of the parasite's genome are more uniform across strains than others—and Hartl, Wirth, and Day happened to study one of those regions. “That alternative has to be taken seriously,” Hartl says.

    Researchers in both camps agree that resolving whether modern malaria stems from a large, already diverse ancestral population of P. falciparum or a small, rather homogeneous one could reveal new ways to fight this scourge. If very little variation exists among all the parasite strains, says McCutchan, then “when you see a mutation, it probably means something and tells us that's where we should focus our studies.” Those variations, in fact, could prove to be vaccine or drug targets. And that makes the ongoing debate more than academic.


    'Inconceivable' Bugs Eat Methane on the Ocean Floor

    1. Carl Zimmer*
    1. Carl Zimmer is the author of Parasite Rex and At the Water's Edge.

    By devouring 300 million tons of methane each year, newly found archaea may help keep this greenhouse gas in check

    Buried in the ocean floor are more than 10 trillion tons of methane—twice the amount of all known coal, oil, and other fossil fuels. Methane (CH₄) is also, molecule for molecule, 25 times more potent a greenhouse gas than carbon dioxide. That means the ocean's hidden methane reservoirs could play havoc with the world's climate if they were to escape to the atmosphere. Yet most of the methane that does rise toward the surface of the sea floor vanishes before it even reaches the water. On page 484 of this issue, a team of researchers provides the clinching evidence for where all that methane goes: It is devoured by vast hordes of mud-dwelling microbes that microbiologists once said couldn't exist.

    William Reeburgh of the University of California, Irvine, and other geochemists first stumbled on this enigma in the 1970s as they studied methane-rich regions of the ocean floor. Methane-producing microbes continually generate the gas deep below the ocean floor. But when the researchers checked the mud near the ocean floor's surface, they found the methane had all but disappeared. Perhaps, they speculated, another group of microbes dwelling in the shallow mud was eating the methane and converting it to carbon dioxide.

    Other clues also hinted at the presence of methane feeders. Methane has a distinctively low ratio of carbon-13 to carbon-12 isotopes. The researchers discovered that in some places where the methane was seeping to the surface, there were carbonates—formed from carbon dioxide—that shared this remarkably low ratio. If methane feeders were consuming the gas and then releasing carbon dioxide low in carbon-13, that could account for the strange carbonates.

    There was only one problem: The laws of biology seemed to forbid such a creature. Methane-eating bacteria are well known—but they thrive in fresh water and soils where oxygen is abundant, suggesting that oxygen is an essential agent for breaking down methane. The sediments on the ocean floor, however, are entirely oxygen free. “A lot of microbiologists were saying these organisms can't exist because they couldn't imagine that this reaction could yield sufficient energy without oxygen to support life,” explains Kai-Uwe Hinrichs, a biogeochemist at the Woods Hole Oceanographic Institution in Massachusetts and co-author of the current paper.

    The mystery only deepened when geochemists measured levels of sulfate (SO₄²⁻) in the ocean floor. The sulfate, normally present in sea water, penetrated the mud but then abruptly disappeared at the sediment horizon, right where the methane vanishes. So, the geochemists reasoned, the same creatures that were destroying methane must also be destroying sulfate. But such an organism was even further beyond the imagination of microbiologists.

    Looking for these improbable creatures would not be easy, because the exotic conditions in which they live cannot be replicated in the lab. Instead, microbiologists teamed up with biogeochemists to look for indirect signs of the microbes in the sediments. Hinrichs and his co-workers at Woods Hole and the Monterey Bay Aquarium Research Institute in California discovered that the mud in the Eel River Basin, off the coast of California, was packed with organic molecules—specifically, lipids from the cell walls of dead microbes. The lipids, they found, had a distinctive isotopic ratio suggesting they had been formed from methane. The researchers also found that the lipids had a structure that had been seen before only in archaea. (Archaea look superficially like bacteria, but they represent a separate domain of life.) Analyzing fragments of DNA collected from the sediments, Hinrichs and his co-workers confirmed in 1999 that the microbes were previously unknown archaea that make their homes in the mud.

    Mystery revealed.

    Microbiologists found hints of these gas-gobbling archaea (shown in red, surrounded by bacteria, in green) by sampling the mud in the Eel River Basin, off the coast of California.


    The following year a team of German researchers led by Antje Boetius of the Max Planck Institute for Marine Microbiology in Bremen got the first look at the actual microbes. To do so, they fashioned probes that could latch onto the DNA sequences discovered at Eel River. The probes were designed to glow when they reached their target, revealing the archaea.

    Looking at the glowing archaea under a microscope, Boetius discovered that they were not solitary creatures. They lived in tightly packed clusters of about 100 individuals, surrounded by a shell of bacteria. Boetius and her co-workers gathered DNA from the bacteria and found that they belonged to a group of species that consume sulfates. “I was very surprised, because it seemed to be too logical to be true,” jokes Boetius.

    In her report last year, Boetius proposed that the archaea and the bacteria live in some kind of biochemical symbiosis. The bacteria may use the waste products made by the archaea—such as molecular hydrogen and carbon compounds—to help them get energy from sulfate. At the same time, the bacteria might somehow allow the archaea to feed on methane without oxygen.

    But as of 2000, neither team had direct evidence that these particular archaea were actually feeding on methane. The only way to confirm those suspicions was to borrow a tool from a very different field—an ion microprobe that had already revolutionized geology and paleontology.

    Ion microprobes fire precise beams of ions at targets, blasting microscopic pits in their surface. The liberated atoms can then be measured to determine their isotopic composition. These instruments have enabled researchers to date the oldest minerals on Earth (Science, 22 December 2000, p. 2239) and to recognize the isotopic traces of the oldest signs of life (Science, 3 January 1997, p. 38). Christopher House of Pennsylvania State University, University Park, and his co-workers had recently adapted ion microprobes to measure the carbon isotopes found in individual microbes. “We went to a system where we dried the microbes on a piece of glass, and we found that it worked quite well,” he explains. House teamed up with the researchers from Woods Hole and the Monterey Bay Aquarium to put the newly discovered archaea-bacteria aggregates in the sights of a microprobe.

    For the first time, the researchers succeeded in identifying the microbes and then directly measuring their carbon isotopes. And, as they report in this issue, those isotopes clearly show that these specific archaea feed on methane and that the bacteria in turn get most or all of their carbon from the archaea.

    Previous research on these microbes “was compelling, but this one is convincing,” says Reeburgh. All that remains now is to determine exactly what sort of chemistry goes on between the two microbes. “We still don't know what chemicals are being processed. But I keep telling people, we're on the right street, we're approaching the house, and we're about to knock on the door.”

    These methane-eating microbes—once thought to be impossible—now look to be profoundly important to the planet's carbon cycle. Hinrichs and Boetius estimate that they devour 300 million tons of methane every year, about as much as humans now inject into the atmosphere with agriculture, landfills, and fossil fuel burning. But on early Earth, these microbes might have been even more significant. Atmospheric scientists have suggested that methane levels in the atmosphere may have been 1000 times higher than they are today, created initially by volcanoes and later by methane-producing microbes (Science, 25 June 1999, p. 2111). At first, this methane may have been beneficial, creating a greenhouse effect that kept the planet from freezing. But if the rise in methane had gone unchecked, Earth might have become too hot for life, as Venus is today. We may have the evolution of methane-eating archaea to thank for saving us from that grim fate. “If they hadn't been established at some point in Earth's history,” says Hinrichs, “we probably wouldn't be here.”


    A Man and His Archive Seek Greener Pastures

    1. Mark Sincell*
    1. Mark Sincell writes from Houston, Texas.

    Paul Ginsparg started the wildly successful Los Alamos Electronic Preprint Archive. Now, he and his creation are headed to Cornell

    HOUSTON—For 10 years Paul Ginsparg thrived as an anomaly at Los Alamos National Laboratory (LANL). A high-energy physicist who founded a revolutionary electronic system to store and disseminate research findings, he was also a vociferous advocate for the open exchange of scientific information from inside a top-secret nuclear weapons laboratory. But that special status isolated him from the rest of the lab, generating a burden that eventually proved too heavy to bear.

    So last month, Ginsparg decided to resolve those contradictions by moving 3000 kilometers east. His decision to accept a tenured position at Cornell University in Ithaca, New York, bolsters the university's efforts to establish a world-class center for research in information science. The preprint archive will move with him, and Ginsparg says he hopes to expand it to include papers from disciplines other than physics, its current focus. He and others promise that users of the archive, which next month marks its 10th anniversary, won't notice the move.

    Ginsparg, 45, is the father of the Los Alamos Electronic Preprint Archive, recently renamed arXiv.org. Spawned by an offhanded complaint from a colleague at a summer workshop, the archive has grown to become part of every physicist's daily routine. This year the archive expects to receive 35,000 papers across various fields in physics, math, and computer science (see graph). “It has completely revolutionized physics communication, and all for the better,” says Gary Horowitz, a physicist at the University of California, Santa Barbara, and the author of the first paper submitted to the archive on 14 August 1991 (“Exact Black String Solutions in Three Dimensions”). “The first place I look [for papers] is the archives, not the library.”

    Long shackled by the slow pace of journals, physicists used to spend hundreds of dollars a year mailing around copies of not-yet-published papers, called preprints. E-mail, which scientists embraced years before the public had heard of AOL, eased the postal bills somewhat, but its value was limited by the puny storage capacity of most desktop computers.

    In June 1991, at the Aspen Center for Physics in Colorado, Ginsparg overheard physicist Spenta Wadia of the Tata Institute in India fret about the e-mailed preprints that flooded his disk while he was away. Realizing that it would be much more efficient to circulate only the abstracts and archive the full papers, Ginsparg spent that afternoon at the Aspen gym working out an automated preprint archiving and distribution system. He wrote the code later that month and opened the server in August.

    An instant and enduring hit with the physics community, the archive has been a mixed blessing for LANL. “The lab is justly proud of supporting and developing the server,” says LANL senior physicist Geoffrey West, another Ginsparg collaborator. But the server, which costs roughly $300,000 a year to run, didn't fit into any of the lab's core missions. And its relentless growth posed a problem. “The archive has wanted to expand to a much bigger natural size,” says Ginsparg, “and too much time was spent trying to keep it from growing.” Although the archive is supported by the National Science Foundation (NSF), the Department of Energy, and LANL, West says that “getting a stable source of funding has been a very rocky road.”

    In 1995, the archive received NSF funding aimed at completing its development. With a mature piece of software in hand, Ginsparg hoped to end his all-encompassing involvement in developing the archive and hand over responsibility for its daily management. But reality intruded. “With the growth of the Web, the popularity of the Internet, and improving end-user hardware and software, it turned out instead to be a period of extraordinary transition,” he says. Another obstacle was the lack of a trained successor.

    Faced with the disheartening prospect of running harder to stay in place, Ginsparg started dropping hints that he might want to return to academia, where he had spent most of the decade preceding his arrival at LANL in 1990. There were other signs that the time was ripe to return. Cornell had just founded the Faculty of Computing and Information (FCI), an interdisciplinary program in scientific communication, and was looking for top-notch scientists like Ginsparg. “Paul is one of the most far-thinking people in the field,” says Robert Constable, Cornell's dean of computing and information science. “He will be a nucleus we can build around.”

    Cornell's offer proved irresistible: a tenured position in computer science and physics, with the archive as a “special collection” in the FCI's digital library. The library staff will use university funds to take over the day-to-day administration of the archive, freeing up Ginsparg to apply the rest of an existing NSF grant to explore new directions for the archive. “A long-term commitment from the library was crucial,” says Cornell computer scientist Bill Arms. “Paul wanted to make sure the future of the archive would not depend on him personally.”

    Growing knowledge bank.

    The appeal of an electronic preprint archive caught on quickly among high-energy physicists. More recently, it has spread to condensed-matter physics and astrophysics.


    Indeed, Ginsparg sees his move east as a welcome break from the past. “Frankly I would have been more than happy to leave it all and start something entirely new,” he says. “But the unanimous recommendation from various interested parties was that the main site should move with me.” The move to Ithaca also brings him and his wife and young daughter close to his sister and her family.

    His Los Alamos collaborators are saddened by his pending departure. “I feel a sense of tremendous loss for the institution and personally,” says Rick Luce, the leader of LANL's “library without walls” initiative. But his former bosses aren't too surprised that he chose Cornell. “The opportunity that Cornell has created for Paul is uniquely suited for advancing his vision,” says LANL Deputy Director William Press, adding that he expects the lab to remain involved. “Paul and Rick Luce are discussing continuing collaborations between Cornell and Los Alamos,” says Press. “I have already heard some very interesting concepts emerging, and I commit Los Alamos to helping to support them.”

    Once he arrives on campus this fall, Ginsparg will have plenty to do. One of the most pressing questions he will help Cornell think about is how to expand the electronic preprint server model into other scientific fields, particularly biology. So far, its adoption there has been stymied by the lack of a preprint culture in biology and by concerns about the open dissemination of research that could lead to a blockbuster drug or a change in patient care. “Our hope is that the library and the information program at Cornell will view this as a precursor to a more visionary digital scholarly solution that puts the institution and its scholars in better control of their intellectual property,” Ginsparg says. The researchers know that this path is studded with landmines, as scientific publishers try to satisfy their customers without jeopardizing revenue sources.

    And what about physics? Ginsparg says that he hopes to find time to return to his work on computational string theory and quantum gravity. “But if he doesn't, we will still be happy,” says Peter Lepage, department chair.

    Despite all the changes going on around them, regular users of the preprint archive should be able to continue with business as usual. “This is an Internet-based service with a worldwide network of access points,” says Duke University mathematician David Morrison. “The physical location of the primary server should have very little impact on the operation.” Even so, Ginsparg and Cornell are hoping that a change in scenery will lead to big improvements in the larger world of scientific communication.