News this Week

Science  31 May 2002:
Vol. 296, Issue 5573, pp. 522


    Physicists Question Safeguards, Ponder Their Next Moves

    1. Robert F. Service

    Recent revelations about questionable data in a handful of papers by Bell Laboratories physicist Jan Hendrik Schön and colleagues continued to reverberate throughout the condensed matter physics community last week. Researchers both inside and outside Bell Labs—the research arm of Lucent Technologies in Murray Hill, New Jersey—are asking whether formal peer review at journals and an informal review system at Bell Labs should have raised concerns earlier. And several teams have already begun to scale back efforts to extend the pioneering work that is now in doubt.

    Chemists and physicists were talking about little else last week. “I'm trying to find an e-mail in my inbox that is not related to Hendrik Schön,” says Charles Marcus, a physicist at Harvard University.

    Schön, 31, has pioneered two separate fields over the last few years: using transistors to inject a high density of electric charges into organic and inorganic crystals to study new physics, and creating molecular-scale transistors. Both aspects of the work came under scrutiny 3 weeks ago when outside researchers presented Bell Labs officials with evidence of possible manipulation of data in five separate papers published over 2 years (Science, 24 May, p. 1376). Schön was the only author common to all five papers, and he was first author on each. Since then, researchers combing the literature have turned up nine more figures from eight other papers that appear to share unusual similarities.* Last week Schön, who stands by his results and says it's not surprising that similar measurements produce similar graphs, announced that he would hold off on publishing papers currently in press.

    On 10 May, Bell Labs set up a five-member panel of independent researchers to investigate the concerns. That panel is expected to take months to reach its conclusions, which Bell Labs officials say will be made public. In the meantime, researchers are asking whether the troubling data should have been caught earlier. The questions are particularly acute within Bell Labs. According to Bell Labs physicist Robert Willett, researchers there typically send papers to a selection of peers before sending them to journals. Although not intended as a formal peer-review system, the practice ensures that other researchers can keep abreast of the latest work by their colleagues and can raise scientific concerns before the papers are published. But for at least two of Schön's most controversial papers last year—which appeared in the 18 October issue of Nature and the 7 December issue of Science—that standard procedure was not followed. “That was reason for concern,” Willett says. Similar figures in those two papers triggered a broader look at Schön's work by outside researchers, which led to the current inquiry. Now, in the wake of that inquiry, Willett says he has been asked to serve on an internal committee to determine whether a more formal review process is needed.

    New tack?

    Ramirez (top) plans to keep trying to replicate Schön's work, but Natelson (bottom) says the controversy might shift his efforts.


    Whether peer reviewers for Science, Nature, and other journals should have spotted similarities in the figures is also a topic of heated debate. Arthur Hebard, a physicist at the University of Florida in Gainesville, says that the papers now under investigation were so important that they should have been given more thorough scrutiny during their review process. “This is such revolutionary physics, reviewers probably should have picked this up,” Hebard says.

    But Leo Kouwenhoven, a physicist at Delft University of Technology in the Netherlands, suggests a couple of reasons why problems of that kind would be difficult to catch. First, Kouwenhoven notes, reviewers typically look at papers one at a time, so any data duplicated from earlier papers wouldn't have been obvious. Moreover, Schön and colleagues turned out so many papers—about 90 in recent years—that their track record produced a halo effect. “I thought if I criticized too much, I would look like I am jealous. That has made everybody cautious to speak out too loud,” Kouwenhoven says—a sentiment other researchers echo.

    Although most researchers doubt that this case will prompt drastic changes in the peer-review system, it is already having an impact on some of an estimated 100 groups worldwide working on projects related to the Bell Labs results. In most cases, those results have not been reproduced. “The question is, what do we do now?” says Douglas Natelson, a physicist at Rice University in Houston, Texas, who has had a Ph.D. student working for the last year trying to replicate some of Schön's work using high-field transistors. “It's really tricky. I'm reluctant to spend any more money on the high-field work until I know more.”

    Allen Goldman, a physicist at the University of Minnesota, Twin Cities, who has had a postdoc working about half time to replicate some of Schön's results for the past year, agrees. “We've sort of changed directions to hedge our bets a little bit. In the next months, we won't pursue that quite as intensively as we had because of these questions,” Goldman says. “As a mentor, I have an obligation to make sure people [in my group] are productive.”

    Others, meanwhile, say they're staying the course. Art Ramirez, a physicist at Los Alamos National Laboratory in New Mexico, for example, says his group isn't planning any drastic shifts. Ramirez has about five people working directly on extending Schön's results using high electric fields produced by transistors to explore new physics in organic materials. Ramirez created a buzz at the American Physical Society meeting in March when he reported that his team had used the transistor setup to trigger a normally insulating crystal of C60 to behave like a metal. Schön and colleagues had previously reported using the same setup to coax C60 to go one step further and superconduct. Ramirez's team hasn't duplicated that result. But he believes he's close and therefore is reluctant to alter his focus. “Things will just change overnight if we can duplicate this,” Ramirez says. Perhaps no one is pulling for him to succeed more than Hendrik Schön.

    • *Applied Physics Letters 73, 3574 (1998), Fig. 3 (top).

      Physical Review B 63, 245201 (2001), Fig. 2.

      Physical Review B 58, 12952 (1998), Fig. 2 (top).

      Synthetic Metals 122, 195 (2001), Fig. 2 and Fig. 3.

      Applied Physics Letters 79, 4043 (2001), Fig. 2.

      Thin Solid Films 385, 271 (2001), Fig. 5.

      Science 288, 2338 (2000), Fig. 2.

      Physica Status Solidi B 226, 257 (2001), Fig. 4.


    Europe Does More With Less

    1. Govert Schilling*
    • *Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    PARIS—Last November, Europe's space scientists faced a grim future. Ministers meeting in Edinburgh had capped the European Space Agency's (ESA's) science budget over 3 years, forcing about $460 million in savings in the next decade. It seemed certain that one large mission would have to be scrapped, most likely a galaxy-charting satellite called Gaia. It was, according to David Southwood, ESA's science director, “a rather dismal picture.”

    At a press conference here on 27 May, Southwood and his team emerged from a 6-month huddle to unveil an ambitiously revised slate of missions. By reshuffling schedules, squeezing money from existing programs, and weaving together the development of missions as tightly as possible, they have transformed a program of 12 launches in 11 years into one of 16 launches in 10 years. They even managed to save Gaia and introduce a new mission into the $3.4 billion mix. Despite the axing of one planetary mission, “the final result is the best of the possible solutions,” says Bo Andersen of the Norwegian Space Centre in Oslo, chair of ESA's Science Programme Committee.

    Over the next decade, Southwood's “cosmic vision” program calls for, among other goals, landing spacecraft on Mars, Mercury, Saturn's moon Titan, and a comet; observing the birth, evolution, and death of stars and galaxies at gamma ray and infrared wavelengths; studying the afterglow of the big bang; and mapping the positions and motions of nearly every star in the Milky Way. ESA will also join NASA in building Hubble's successor, the Next Generation Space Telescope, and LISA, a gravitational wave observatory in space.

    The program's transformation squeezes many missions to the limit. For example, Gaia is now $140 million cheaper thanks to a less costly spacecraft that will fit on a smaller launch vehicle. For the BepiColombo mission to Mercury, ESA is hoping to cut a deal with Russia on a less expensive lander and launcher. Also to cut costs, BepiColombo will be delayed a few years and developed in tandem with the Solar Orbiter, a mission to study the sun. All this leaves little slack in the program. “You can do this only once,” says Southwood. “[ESA ministers shouldn't] ask me to repeat the trick. I'm not a magician.”

    The savings have allowed Southwood to pull one extra mission out of the hat. Previously just a backup mission, Eddington will study the composition and structure of stars by measuring seismic vibrations at their surfaces, a technique known as asteroseismology (see p. 1595). It will also look for small extrasolar planets moving across the disks of parent stars. Eddington is a step toward a proposed mission called Darwin, pegged for 2015, that would study the atmospheres of extrasolar planets and search for life. “I can't imagine a human being not interested in this,” Southwood says.

    The drastic pruning of the program budget did nip one bud, however. ESA's planned mission to Venus, called Venus Express, was dropped last week because “not everybody could commit to the necessary schedule,” says Southwood, who warns that future missions that don't stick to tight schedules might suffer the same fate.

    Double save.

    Europe's new plan preserves Gaia (top) and includes Eddington.


    Some scientists rue the loss of Venus Express. “It's tragic that we now have a scientifically very interesting mission without an option of really flying it,” says Michael Grewing of the Institute for Millimeter Radio Astronomy in Grenoble, France, chair of ESA's Space Science Advisory Committee. But Southwood's sword of Damocles hanging over future missions might end up saving Venus Express. Grewing says that Venus Express could get a second chance if another mission is dropped from the roster. According to Joop Hovenier of the Free University in Amsterdam, the decision to cancel Venus Express came like a bolt from the blue. “It's a pity,” he says. “It was a cheap mission, because it would use the same platform as Mars Express. You would expect projects like that to be applauded.”


    Congress Adopts Tough Rules for Labs

    1. David Malakoff

    Biomedical and agricultural researchers working with potential bioweapons face an array of new regulations under a U.S. law aimed at combating bioterrorism. Science lobbyists say that the rules, passed overwhelmingly last week by Congress, are more reasonable than earlier drafts developed last fall immediately after a wave of anthrax-tainted letters killed five people. But they will remain cautious until the Bush Administration spells out how it plans to implement the law.

    The Public Health Security and Bioterrorism Preparedness and Response Act (H.R. 3448) will require tighter laboratory security, government registration, and background checks for scientists and others handling more than three dozen potential bioterror agents identified by the government. The $4.6 billion measure, which addresses everything from food and water supply safety to prescription drug testing, also calls for more money for research, upgrading labs, and developing better systems for tracking and detecting threats to human health and agriculture. “Congress did a good job of providing clarity to researchers about their responsibilities,” says George Leventhal, a lobbyist with the Association of American Universities in Washington, D.C. But “the development of sound regulations will be extremely important,” adds Janet Shoemaker of the American Society for Microbiology, also in Washington.

    The anthrax letters triggered an immediate reaction from Congress, including calls to bar all non-U.S. citizens from working in labs that handle dangerous agents. The research community mobilized quickly against such extreme measures and gained time to make its case after the bill became bogged down over disagreements on food-safety and drug-testing provisions.

    United front.

    HHS Secretary Tommy Thompson, far right, joins legislators from both parties at a briefing on the bioterrorism bill.


    The final version reflects some of the researchers' input. It avoids a blanket ban on foreign scientists, as well as an earlier one-size-fits-all approach to regulating the 42 viruses, organisms, and toxins on the list of dangerous “select agents” compiled by the federal Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia (Science, 16 November 2001, p. 1438). Instead, it gives regulators with the Department of Health and Human Services (HHS, CDC's parent agency) “flexibility to impose different levels of security requirements on different select agents,” according to a report accompanying the bill. The U.S. Department of Agriculture (USDA) must develop its own list of potential agroterror agents that will be subject to regulation; it will have similar leeway. Violators face fines and jail terms of up to 5 years.

    The new law also calls on HHS and USDA to “minimize disruption” of existing research by setting realistic regulatory timetables. It requires “prompt” government screening of researchers and technicians working with select agents and offers an appeals process for workers who say they were improperly placed into a barred category, such as drug user or felon. But a definition of “prompt” must still be written, notes Leventhal.

    The new law sets a 6-month timeline for implementing the rules, which will take effect just as the National Institutes of Health (NIH) makes a new set of awards to combat bioterrorism. With that in mind, the legislative report urges NIH officials to remind grant seekers “to begin the registration and screening process … so as not to delay this important research.”

    The bill also approves $600 million over the next 2 years for CDC to address “extreme disrepair” at its laboratories and $415 million for USDA to improve agricultural security. CDC is expected to get at least $20 million to boost lab inspections and to set up a national database to help track stolen or misplaced agents back to their source. An effort to study antibiotic resistance receives $50 million. Congress has already agreed in separate bills to provide money for most of these projects.


    Redrawing the Brain's Map of the Body

    1. Laura Helmuth

    One of the more distinctive images taught in introductory biology or psychology courses is the motor homunculus: a deformed map of the body drawn on the primary motor cortex, a part of the brain that guides movements. Lots of neurons in this region help control the hands and face, so these features of the homunculus are exaggerated, the lesson goes. In contrast, less nimble body parts, such as the torso, look relatively scrawny.

    As is often the case, experts knew that reality was somewhat more complicated; for instance, the areas representing different body parts weren't as well defined as the picture implies. But new findings go even further, suggesting that the role of the primary motor cortex might be fundamentally different from what was originally thought. Rather than simply controlling different parts of the body, it might direct a host of body parts to assume complex postures. What's more, the map appears to be organized not just according to muscle groups but by the positions in space where the animal's movements conclude.

    The research, reported by Michael Graziano and colleagues at Princeton University in New Jersey in the 30 May issue of Neuron, might mean that the “map of motor cortex will have to be redrawn,” says neuroscientist Larry Abbott of Brandeis University in Waltham, Massachusetts. “It will be a much more deep map than just this picture of a body.”

    Goodbye, homunculus?

    This textbook depiction of the primary motor cortex's responsibilities might need to be refined.


    The Princeton team uncovered the new map by electrically stimulating neurons in the primary motor cortex of monkeys, as other researchers had done, but in these experiments they used longer durations. A 50-millisecond stimulation—the same duration used in the 1950s and earlier to trace the motor homunculus—causes brief muscle twitches. But Graziano and his team then tested how the neurons respond to pulses lasting a half-second or so, about as long as neurons in this part of the brain are active during natural movements.

    The results were “pretty spectacular,” says Abbott, who's seen the team's videos of monkeys responding to the stimulation. Rather than just twitching a muscle, the monkeys carried out coordinated, well-timed, determined movements. For instance, stimulation to one spot in the primary motor cortex caused a monkey to clench its fingers, move its hand near its face, and open its mouth. It didn't matter where the monkey's hand started out—every stimulation of the same cortical spot caused the monkey to assume the same final position. Of 324 stimulation sites in two monkeys, some 86% evoked a distinct posture.

    The team found some evidence for a rough body-centered map of the motor cortex that matches the traditional motor homunculus. Certain regions often directed the hand and arm, whereas others largely caused movements of the legs or face. But the researchers found a more systematic pattern superimposed on the brain region, this one of locations in space. Zapping certain spots made the monkey reach its hand down and to one side, say, whereas nearby neurons led the hand to a slightly higher position or slightly closer to the middle of the body.

    And in yet another blow to textbook descriptions of the motor system, the researchers found that their newfound neural map of space extended from the primary motor cortex to the nearby premotor area—a region that in other respects works quite independently.

    Neuroscientists are both intrigued and mystified by these findings. “The challenge is to figure out just what this tells us about the organization of the cortex,” says cognitive neuroscientist Larry Snyder of Washington University in St. Louis, Missouri.

    One possibility, suggests neuroscientist Peter Strick of the University of Pittsburgh, is that two systems coexist in the primary motor cortex. Brief stimulation might directly activate spinal cord neurons in a way that matches the traditional view of the motor homunculus, whereas longer pulses might pull in more complicated neural circuits. But as difficult as it might turn out to be for neuroscientists to agree on what the primary motor cortex is doing, the textbook illustrators are going to have the even tougher job of depicting it.


    Best Big Bang Pictures Show New Wrinkles

    1. Charles Seife

    ARLINGTON, VIRGINIA—The data were so hot that bootlegged versions had been circulating for months. Even so, their official release here on 23 May has set cosmologists abuzz. The measurements, taken by the Cosmic Background Imager (CBI), a telescope in the Atacama Desert of Chile, give scientists their sharpest view yet of the infant universe.

    “The instrument is extremely exciting,” says Max Tegmark, a cosmologist at the University of Pennsylvania in Philadelphia. CBI is sensitive to very small features in the cosmic microwave background that other microwave background observatories such as DASI, BOOMERANG, and even the new Microwave Anisotropy Probe (MAP) satellite can't see. “This is the one experiment that is not going to get eclipsed by MAP,” Tegmark says. Indeed, Tegmark was so eager to get his hands on the CBI researchers' data that he used a digital camera to snap a photo of one of their overheads at a lecture—a photo that quickly made the rounds of the cosmological community.

    The new CBI results show details of a surface that existed about 300,000 years after the big bang, when the universe was a roiling ball of plasma. Theory says that plasma was ringing with sound waves, which caused ripples in the cosmic microwave background. This “acoustic” model predicts that these still-visible ripples should be of characteristic sizes, just as a musical instrument produces sound waves of characteristic frequencies.

    Ripple effect.

    Extra “peaks” in the distribution of cosmic background radiation, seen by CBI (bottom), might shed light on the universe's history and composition.


    Last April, BOOMERANG and DASI released observations revealing the first and second “peaks” in the microwave background spectrum—akin to hearing the fundamental and first overtone of a musical instrument—and gave hints of a third peak (Science, 19 January 2001, p. 414). CBI, with its extra sensitivity to higher overtones, appears to have spotted the third and fourth peaks and might even have glimpsed the fifth and sixth peaks as well. Comparing the data with theoretical models gives scientists an independent measure of the budget of matter and energy in the universe. “The CBI data allows us to have a completely new test” of cosmological models—and the models are passing the test, says Alan Guth, a cosmologist at the Massachusetts Institute of Technology. For example, the “volume” of the overtones is diminishing, as the acoustic model predicts. “You see it starting to damp out as it should,” says John Carlstrom, a cosmologist at the University of Chicago who is associated with the CBI team.

    Even though the peaks are getting smaller, the CBI team has noticed that there is more “volume” at small scales than the acoustic model can account for. “In these deep observations, we do see excess power over what the theory predicts,” says Anthony Readhead, leader of the CBI team. “If it's real, then we think it's a very exciting result.” Readhead says that the unexpected excess might be due to the so-called Sunyaev-Zel'dovich effect, in which photons from the early universe scatter off electrons in hot gas in clusters of galaxies closer to Earth, distorting the cosmic background radiation.

    If the observations hold up, a more detailed analysis of the excess “volume” at small scales might enable cosmologists to map the formation of galaxies and galaxy clusters in the early universe. “These are really signposts of the structural evolution of the universe,” says Carlstrom.

    CBI's fine-grained photos should also complement data taken from galaxy surveys. “Traditionally, microwave background has been on superlarge scales, while surveys of galaxy clusters have dealt with very small scales,” says Tegmark. But with big galaxy surveys, such as the Sloan Digital Sky Survey, and small angular-scale measurements of the microwave background, the measurements are beginning to overlap, allowing scientists to compare them directly. “This will be particularly fun,” Tegmark says.


    Organic Farms Reap Many Benefits

    1. Erik Stokstad

    The bountiful crop yields of the green revolution have fed millions, yet they pose an environmental tradeoff: rich harvests in exchange for polluting pesticides and fertilizers. Organic farmers have long touted their methods as a more benign way to nourish the world. But few rigorous studies have looked at the long-term yields and environmental effects of organic farming. Outside of Europe, organic farming remains a niche operation relying on premium prices to survive.

    Now a report on page 1694 brings encouraging news for organic fans. A team led by agronomists Paul Mäder of the Research Institute of Organic Agriculture in Frick, Switzerland, and David Dubois of the Swiss Federal Research Station for Agroecology and Agriculture in Zürich reports the results of the longest and most comprehensive study to date comparing organic and conventional farming, measuring many aspects of crops and soil over 21 years. The bottom line: Organic farms can be nearly as productive as regular farms for some crops, and they leave soils healthier. The study also shows that for most crops, organic plots are more energy efficient per unit of yield.

    “This study is as complete a picture as we have from anywhere,” says Phil Robertson, an agricultural ecologist at Michigan State University, East Lansing. Agrees soil scientist John Reganold of Washington State University, Pullman: “This gives more credibility to organic systems.”

    Green thumb.

    Organic farms, which used mechanical weeding rather than herbicides, hosted more kinds of beneficial insects.


    The 1.5-hectare trial, started in 1978 near Basel, Switzerland, compares four farming systems. One group of plots mimics conventional farming, treated with chemical pesticides and herbicides and soluble nitrogen for fertilizer. Another models an “integrated” approach that includes manure with conventional techniques. The organic plots use only manure and mechanical weeding, along with plant extracts to control pests. The fourth system is a much less common practice called biodynamic farming, which adds special treatments, such as herbal preparations mixed into composted manure.

    Over 2 decades, the average crop yield was about 20% lower in both kinds of organic fields, a finding on par with previous studies. The best-performing organic crop was winter wheat, which stacked up at about 90% of the conventional harvest. Potatoes fared the worst with about 38% lower yields, mainly due to potato blight and potassium deficiency. The yields are all the more impressive given that the organic plants received less than half the nutrients given to conventional plots. “To add that much less fertilizer and still get 80% of the conventional yields is outstanding,” says Reganold.

    Because no synthetic fertilizer had to be produced or applied, growing organic crops also required less energy than conventional crops—up to 56% less energy per unit yield. The team also found evidence that nutrient-cycling microbes are more plentiful and efficient in organic soil, making more nutrients available to plants. According to a microbial diversity assay (which measures the range of bacterial metabolites as a proxy), biodynamic soil ranked higher than organic, which in turn outranked soils in conventional fields.

    Abundant microbes are known to improve soil structure, and Mäder's team found another benefit: The organic plots with the most microbes also had the highest yields. But Robertson questions whether the greater microbial diversity is simply a product of more diverse organic materials in the soil, for example, from the added manure. And microbiologist Kate Scow of the University of California, Davis, notes that the diversity assay looks at an “incredibly narrow” range of ecological niches and that other studies have been contradictory.

    Soils did appear to be healthier in organic plots, with 40% more roots colonized by fungi that assist with plant nutrition. Earthworms were up to three times more abundant, and there were twice as many spiders and other pest-eating arthropods. Mäder thinks that these environmental benefits and higher energy efficiency help justify the existing government subsidies for organic farmers in Europe: “I think our research could stimulate governments to encourage this by showing long-term benefits.”

    But the study doesn't address other concerns about organic farming, Robertson adds, for example, whether organic farms can be economically viable on a large scale or in other economic conditions, such as in the United States, where such farms are not subsidized. Also uncertain are questions about groundwater pollution and emissions of nitrogen compounds into the atmosphere. But if such concerns can be addressed, as indicated by a few other large trials, then perhaps the next revolution might be a bit greener.


    Chimps and Fungi Make Genome "Top Six"

    1. Elizabeth Pennisi

    Although the relevance of honey bees, chickens, and sea urchins to biomedicine might seem a stretch, the National Human Genome Research Institute (NHGRI) announced last week that deciphering the genomes of these species is a top priority. These organisms, in addition to the chimp, a protozoan called Tetrahymena thermophila, and several fungi, will be next in line at the big sequencing centers that are now scrambling to decipher the genetic code of humans, mice, and rats, says NHGRI director Francis Collins.


    The list is intended to bring order out of chaos. In the past couple of years, various researchers have lobbied the big sequencing labs—including those at Washington University, Baylor College of Medicine, and the Whitehead Institute for Biomedical Research, all of which now have excess capacity—to sequence their pet organisms. Some succeeded: The Whitehead Institute recently deciphered the genomes of a fungus and a tunicate.

    To make the process fairer, NHGRI last summer invited researchers to justify why their organism should jump to the front of the queue. Each proposal was peer reviewed and rated according to the organism's importance to medical research, basic biology, and evolutionary studies. The selection committee also considered how many researchers would benefit from the sequence and how amenable it was to study.


    In setting sequencing priorities, NHGRI was also indirectly determining which model organisms biologists will be studying in the coming years. “The viability of your model system is really dependent in this day and age on having a genome sequence,” says James Coffman, a developmental biologist at the Stowers Institute for Medical Research in Kansas City, Missouri.


    The sea urchin made the cut after 75 biologists wrote effusive letters about its usefulness in developmental biology, cell biology, biochemistry, and studies of gene regulation.


    Honey bee proponents, including entomologist Gene Robinson of the University of Illinois, Urbana-Champaign, swayed the committee by describing the insights the bee can offer into the genetics underlying complex behaviors.


    Fungi made it for practical reasons—small genome size, big impact on crops and health—and for their potential contribution to evolutionary biology. “With them, we can learn about a whole kingdom in one fell swoop,” says Ralph Dean, a fungus expert at North Carolina State University in Raleigh (Science, 22 June 2001, p. 2273).


    Lobbyists for the rhesus macaque got an assurance that it will be first in line once these six are done. But advocates of the cow and Xenopus will have to wait.


    New Law Could Turn Scientists Into Outlaws

    1. Charles C. Mann

    Imprison your leading scientists for doing … research? That could be the effect of a sweeping new law that effectively bans most biotechnology in Mexico. So broad are its restrictions, in fact, that they could block researchers from working with any transgenic organisms, even in the lab. Although Parliament passed the law in February, many of the nation's molecular biologists are just now learning of it, and they are up in arms.

    The law, perhaps the world's most sweeping biotech regulation, is part of a larger initiative to reform biosafety rules in Mexico. Most of the law deals with relatively uncontroversial matters: regulating the disposal of hazardous wastes, controlling toxic chemicals in urban areas, and blocking the introduction of exotic species. But the little-noted Article 420 of the new law imposes up to a 9-year prison sentence on anyone who, “in violation of the established applicable norms, imports, exports, traffics, transports, stores or releases into the environment any genetically modified organism that changes or can change negatively the components, structure, or function of natural ecosystems.” According to Article 420, “genetically modified organism” means “any organism with a new combination of genetic material that has been created by the techniques of biotechnology, including those deriving from the techniques of genetic engineering.”

    Illegal action?

    A researcher obtains DNA at a maize research lab, CIMMYT, outside Mexico City.


    The draconian ban might have been a legislative response to the controversial report that maize in southern Mexico, the center of diversity for that crop, contained genes apparently acquired from illegally planted transgenic stock (Science, 1 March, p. 1617; 12 April, p. 236). If so, scientists say, it is a ludicrous overreaction that seems at cross-purposes with government efforts to encourage home-grown biotech. Indeed, the government body intended to supervise these efforts, the Consultative Committee on Biotechnology, complained in a 26 April letter to Parliament that its members “could be threatened with prison by a simple claim that the transgenic organisms that we developed, stored or transported could have negative effects on the environment.”

    Although Article 420 is in effect, it is not yet being enforced, because the relevant “established applicable norms” do not exist. SEMARNAT, Mexico's environment ministry, has told researchers that it is developing the “norms,” which will be published in draft form in the Diario Oficial, probably next month, for a 60-day comment period. Scientists inside and outside SEMARNAT are demanding that the norms be used to rein in the law.

    But researchers in Mexico are far from complacent. “Nobody is sure how the law will affect them, nor how it will be enforced,” says one geneticist in Mexico, who asked for anonymity because of the “delicate” situation. “It is very difficult to envision that the Mexican government is going to send some of its best scientists to jail for following what were the laws before this latest act was passed.” But that, this researcher said, might end up being the case.


    How Devastating Would a Smallpox Attack Really Be?

    1. Martin Enserink

    A smallpox attack is widely seen as the ultimate nightmare scenario. But many scientists—including some who know the disease firsthand—say it wouldn't be as horrific as everybody thinks

    It was 3 days before Christmas, but the National Security Council was not in a festive mood. Some 16,000 people across 25 U.S. states had come down with smallpox; 1000 had already died. Frightened citizens were fighting over the dwindling vaccine supply. Canada and Mexico had closed their borders, food shortages were looming, and people were fleeing the cities. When desperate council members asked experts what the future would hold, the answer was unfathomable: By early February, as many as 3 million U.S. residents might be struck by smallpox—and a million might be dead.

    That stark picture was painted by “Dark Winter,” an exercise designed by top U.S. bioterror experts and held last summer. Still the most widely publicized scenario, the exercise enlisted senior U.S. policy-makers and confronted them with an imaginary smallpox attack. An equally stark image—an apocalyptic event that could bring the nation to its knees—jumps out from many press stories and from Smallpox 2002: Silent Weapon, a chilling BBC docudrama in which a “suicide patient” strolling around New York City triggers a worldwide pandemic that leaves 60 million dead. (President George W. Bush has reportedly ordered a copy of the tape, which aired in Britain in February.)

    Yet some experts—including several veterans of the eradication effort—are pushing a much less alarmist message: A smallpox outbreak needn't be dire. They don't dispute that even a single case would be an emergency, or that it would be horrific for the patients. But they argue that the disease would spread much more slowly than Dark Winter and other scenarios suggest and could, in many cases, be contained quite easily. “Smallpox is a barely contagious and very slow-spreading infection,” says James Koopman of the University of Michigan, Ann Arbor, who helped fight the disease in India in the early 1970s. Indeed, the way it spread in Dark Winter was “silly,” says Michael Lane, a former director of the smallpox eradication unit at the Centers for Disease Control and Prevention (CDC) in Atlanta. “There's no way that's going to happen.”

    Koopman, Lane, and others say it is especially important to tone down the hype now, as the government is reviewing its policy on smallpox vaccination and holding public hearings on the issue. Because the vaccine can have serious and sometimes deadly side effects, CDC's current plan is to use it only during an outbreak and give it just to those who have been close to a patient—a strategy called ring vaccination. But others question that approach. In a recent paper in The New England Journal of Medicine, William Bicknell, former commissioner of the Massachusetts Department of Public Health and now at the Boston University School of Public Health, instead called for the government to make the vaccine available now to those who want it. So did a policy analysis published last month by the Cato Institute, a conservative think tank.

    Scary scenario.

    Stills from Smallpox 2002: Silent Weapon, a BBC docudrama in which a “suicide patient” in New York City sparks a smallpox pandemic.


    Modeling the unfathomable

    So what would a smallpox outbreak look like? That question has spawned a new growth industry since 11 September. At least six U.S. research groups are building computer models of an epidemic, enabling them not only to project its possible course under different scenarios but also to test the effect of vaccination and quarantine strategies. Many of these models are still in development, and their results are sometimes contradictory. But the outcomes—none of which are as gloomy as Dark Winter—are eagerly awaited by public health officials hoping for guidance. Already, in briefings for state and local officials, CDC's senior adviser for smallpox preparedness and response, Harold Margolis, is trying to “demystify” smallpox. “We know this disease,” says Margolis. “We have eradicated it once, and we can do it again.”

    One reason Dark Winter, organized by the Johns Hopkins University Center for Civilian Biodefense Strategies and the Analytic Services Institute for Homeland Security, made such a big impression is that it was one of the first exercises of its kind. The underlying mathematical model was simple, says Tara O'Toole, director of the Hopkins center and one of the three authors of the scenario. The primary goal was not to accurately predict the course of an outbreak but to show gaps in the country's biodefense system and jolt policy-makers into action.

    That it did. In the exercise, conducted on 22 and 23 June 2001, former Senator Sam Nunn (D-GA) played the U.S. president. Republican Governor Frank Keating of Oklahoma—no stranger to dealing with terrorism—played himself, as did a handful of TV and print reporters covering the outbreak. The scenario assumed that three large aerosol clouds infected 1000 people each in shopping malls in Oklahoma City, Atlanta, and Philadelphia. With the fictional policy-makers woefully unprepared, the situation got “worse and worse and worse and worse and worse,” as “President” Nunn put it.

    “We have to take some of the responsibility” for giving smallpox an extremely scary reputation, concedes O'Toole. Indeed, she worries that because of the current fixation on smallpox, four dozen or so other potential biowarfare agents are being ignored: “That wasn't the intended message.”

    But O'Toole also blames the media. At the end of the exercise, she points out, there were still fewer dead than at Pearl Harbor; the 3 million number was a worst-case prediction. And O'Toole stands by every assumption used in Dark Winter. One reason it was so bleak, she points out, is that the scenario provided only 15.4 million doses of vaccine—the actual U.S. stockpile at that time. The stockpile has since grown, and there will soon be enough to cover the entire U.S. population (Science, 5 April, p. 25).

    Others question the assumption that terrorists could actually spread enough smallpox particles to infect 3000 people. In a model he has developed, for instance, Ira Longini, a biostatistician at Emory University in Atlanta, sets an Al Qaeda-style group of up to five infected terrorists loose and, not surprisingly, the resulting outbreak is much smaller than in the Dark Winter simulation. The most likely introduction scenario is still hotly debated, however, and most researchers say it's a question that the CIA, not epidemiologists, will have to answer.

    The most contested assumption in Dark Winter, however, is the transmission speed for smallpox. A crucial factor in most disease models is what epidemiologists call the basic reproductive number, or R0, which is defined as the number of secondary cases caused by each primary case in an otherwise susceptible population, absent any control measures. When R0 is high—say, 10 to 13, as for measles—a disease will spread exponentially; when it's just above 1, it will keep going, but slowly; and below 1, an outbreak will peter out. For most modern-day diseases, careful epidemiologic studies have determined R0 in various populations.
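This threshold behavior can be made concrete with a toy branching-process simulation—a sketch for intuition only, not any of the models discussed in this article. Each case is assumed to infect a Poisson-distributed number of new cases, with mean R0 and no interventions or depletion of susceptibles:

```python
import math
import random

def outbreak_size(r0, max_cases=100_000, seed=0):
    """Toy branching process: each case infects Poisson(r0) new cases,
    with no interventions and no depletion of susceptibles."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's method; adequate for small lam
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    active, total = 1, 1
    while active and total < max_cases:
        new_cases = sum(poisson(r0) for _ in range(active))
        active, total = new_cases, total + new_cases
    return total

# Below 1, outbreaks fizzle after a handful of cases; well above 1,
# they grow until something (here, an arbitrary cap) stops them.
for r0 in (0.9, 2.0, 10.0):
    print(r0, max(outbreak_size(r0, seed=s) for s in range(20)))
```

With R0 = 0.9, the expected total outbreak size is just 1/(1 − 0.9) = 10 cases; with R0 = 10, nearly every simulated outbreak explodes—which is why the choice of R0 dominates a model's gloominess.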

    But for smallpox, researchers can't agree on the right value. Past outbreaks yielded varying results, and the number can differ from population to population. Most people in the United States today lack immunity, which could make matters much worse than in the past. On the other hand, they don't live in crowded and squalid conditions, like many of the disease's erstwhile victims, which could reduce transmission rates.

    Too gloomy?

    Sam Nunn, playing the president, scrambled to deal with a smallpox attack in Dark Winter. But some say the scenario exaggerated the spread of the disease.


    O'Toole and her co-authors selected six smallpox importations into Europe after World War II that they thought might be typical for a 21st century attack. For instance, these outbreaks took place in winter, the season terrorists would choose because it's peak transmission time for smallpox; infected people had lots of interaction with others; and doctors were slow to recognize the disease, as they would likely be today. They settled on an R0 of 10—although they think that may be on the low side. In one famous and “particularly instructive” case, they wrote in a paper, a patient who returned to Yugoslavia from a trip to Iraq in 1972 infected 11 others, who in turn caused 140 “second generation” cases. The same number—10—had also been suggested in several papers (including one in Science, 26 February 1999, p. 1279) by the former head of the smallpox eradication effort, Donald A. Henderson, who served as a consultant to Dark Winter.

    But a team led by CDC's Martin Meltzer, which published a smallpox outbreak model in Emerging Infectious Diseases last fall, concluded after a similar analysis of many more past outbreaks that the average rate of transmission was lower than 2. The CDC group recognized that today's citizens might be more vulnerable but not all that much more, so they ran scenarios in which R0 was 2, 3, or 5, resulting in outbreaks that were easier to contain than the one in Dark Winter. Raymond Gani and Steve Leach of the Centre for Applied Microbiology and Research in Porton Down, U.K., reached a conclusion somewhere in the middle after analyzing historic outbreaks. R0 was usually somewhere between 3.5 and 6, they wrote in Nature last December. In reality, says Koopman, the transmission rate may be much lower than past publications suggest. The published literature contains a skewed record, he says, tending to register significant outbreaks, whereas small ones were never written up. Koopman puts smallpox's R0 at “barely above 1.” If true, a small attack may well fizzle after a handful of additional cases.

    How fast would it spread?

    Other researchers are using more sophisticated models that avoid the problem of choosing the right R0. Instead of relying on a single transmission rate, these modelers estimate the chance that any given encounter—say, a colleague with smallpox sitting in the cubicle next to you for 4 hours—will result in an infection. They then plug these numbers into a model that simulates the behavior of all the individuals in a given community.
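The per-encounter approach can be sketched as follows; the hazard numbers below are illustrative placeholders, not the values the modelers actually elicited from experts. If infection risk accrues at a constant hazard per hour of contact, the probability that one encounter transmits is 1 − e^(−rate × hours), and an infectious person's expected transmissions are summed over all of that day's encounters:

```python
import math

# Illustrative per-hour transmission hazards by contact type;
# the real models elicit such numbers from smallpox experts.
HAZARD_PER_HOUR = {"household": 0.05, "workplace": 0.01, "casual": 0.001}

def infection_prob(contact_type, hours):
    """P(transmission) for one encounter, assuming a constant hazard."""
    return 1.0 - math.exp(-HAZARD_PER_HOUR[contact_type] * hours)

# One infectious person's day: (type, hours per contact, number of contacts)
encounters = [("household", 10, 3), ("workplace", 4, 5), ("casual", 0.25, 20)]
expected_new_cases = sum(n * infection_prob(kind, h) for kind, h, n in encounters)
print(round(expected_new_cases, 2))  # 1.38 with these made-up numbers
```

Note how the arithmetic mirrors the experts' experience: with these assumed hazards, 20 brief casual encounters contribute almost nothing, while a few hours of close household contact dominate the expected transmissions.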

    In his computer at Los Alamos National Laboratory in New Mexico, for instance, Stephen Eubank has data on each of the 1.6 million people in Portland, Oregon. Not the real citizens, he hastens to add, but a “synthetic population” based on census, transportation, and other data about real Portlanders. Eubank's people go to work, take their children to soccer practice, and go to movies.

    To estimate the risk of infection from different types of encounters, Eubank and Longini, who has developed a similar model with his Emory colleagues Elizabeth Halloran and Azhar Nizam, have quizzed smallpox experts. Again, past experience suggests that the risks are much smaller than most people would think. Most smallpox infections were the result of several hours spent in close contact—usually 2 meters or less—with a patient, says Lane. Indeed, Koopman says some patients did not infect anybody at all. “In India, we got very worried sometimes, because a patient had gone into a big crowd, or boarded a bus—and yet there was no secondary transmission,” he says. To Meltzer, the notion that an infected terrorist could condemn people to death by patting children on the head or bumping into Manhattan office workers, as happened in the BBC docudrama, is “absolutely preposterous.”

    Eubank has incorporated risk estimates from these experiences into his model. Now, when he releases a smallpox cloud in, say, a shopping mall, he sits back to watch the drama unfold; the model calculates the transmission rate. Depending on the model “run,” the value can vary, but it usually turns out to be 3 or 4, he says. Longini and his colleagues found the mean to be about 2, although in some runs there were no secondary cases.

    No simple answers

    Earlier this month, a group of smallpox modelers met behind closed doors with officials from the Department of Health and Human Services at the John E. Fogarty International Center, part of the National Institutes of Health in Bethesda, Maryland, to compare and discuss their work. Some of those who attended say that there were great differences among the models. Not surprisingly, how gloomy a model's assumptions were had a major effect on its predicted chances of success for the ring vaccination strategy. Longini, for instance, says his model shows that ring vaccination, even when it's started only after the 25th case of smallpox, can contain an epidemic almost as well as mass vaccination, provided that at least 80% of those exposed can be found and vaccinated. But a model by Edward Kaplan and colleagues at Yale University comes out more in favor of mass vaccination, others say.
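A back-of-the-envelope calculation shows why tracing coverage matters so much; this is a toy deterministic sketch, not Longini's stochastic model, and the 95% efficacy figure is our assumption. Contacts who are found and successfully vaccinated in time no longer transmit, which scales the reproduction number down:

```python
def effective_r(r0, traced_fraction, vaccine_efficacy):
    """Reproduction number once ring vaccination is in place: each
    traced-and-protected contact removes one potential transmission."""
    return r0 * (1.0 - traced_fraction * vaccine_efficacy)

# R0 = 2 (roughly the CDC group's estimate), 80% of contacts traced
# (the threshold Longini cites), assumed 95% protection in the vaccinated:
print(effective_r(2.0, 0.80, 0.95))  # 0.48 -- below 1, so the outbreak dies out
```

Under the same assumptions, a Dark Winter-style R0 of 10 yields an effective reproduction number of 2.4 even with 80% tracing—illustrating why the choice of R0, not just the response strategy, drives each model's verdict.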

    “There are serious disagreements, and they're well founded,” says Fogarty's Ellis McKenzie, who organized the meeting. But such differences can help clarify the debate, he asserts. “One of the important things modeling forces you to do is put all your assumptions on the table, which is extremely helpful.” Until now, he adds, U.S. epidemiologists and public health officials have not been very interested in how mathematical models could aid disease control—in contrast to their counterparts in the United Kingdom, where the field has been thriving. Now, that attitude has changed, as has almost everything since 11 September.


    In Search of a Kinder, Gentler Vaccine

    1. Martin Enserink

    James Koopman saw the last 16 cases of smallpox in the Indian district of Azhagar in the early 1970s, but by now they blur together. Crystal clear, however, is the memory of a child, about 1 year old, who suffered from an uncontrollable infection called progressive vaccinia after receiving a smallpox vaccination in 1973. “It completely destroyed her arm, right down to the bone,” says Koopman, now a researcher at the University of Michigan, Ann Arbor.

    To Koopman, that girl is a grim reminder that the traditional smallpox vaccine, a virus called vaccinia that's harvested from the pustules on the skin of infected calves, is very effective—and also quite dangerous. So current talk of vaccinating thousands or millions makes him rather nervous. But it's not yet clear what the alternatives are.

    Vaccinia—known in the United States as a Wyeth product called Dryvax—works by producing a local infection on the arm, a so-called take, which normally heals in 2 to 3 weeks. But in progressive vaccinia, it grows out of control. Other serious side effects include eczema vaccinatum, a localized or systemic infection in people with a history of eczema, and encephalitis, a brain inflammation. During the smallpox eradication era, about 1250 in every million vaccinees—many of them children under 2 years of age—suffered one of these side effects, and about one in a million died. Researchers expect that those numbers would be significantly higher today, as millions of people have compromised immune systems as a result of HIV or immunosuppressive drugs. Eczema rates have also shot up, for unknown reasons.

    A huge new batch of vaccine scheduled for delivery to the government before the end of the year is not expected to be much safer. Produced by a company called Acambis and its subcontractor Baxter, the vaccine is a single, clonal strain of vaccinia, rather than the mélange in Dryvax, and it's produced by cleaner techniques. But it was chosen to resemble the old vaccine as closely as possible, says Thomas Monath, chief scientific officer at Acambis, because that's known to work. Animal tests suggest that the new version has a slightly lower risk of causing encephalitis, but other side effects will probably be the same, Monath says. Clinical trials are under way.

    The National Institutes of Health is pushing both academic and commercial researchers to develop a safer alternative. Such a vaccine could be used in the more than 20% of Americans who either belong to one of various risk groups or are in close contact with those who do—and perhaps, in the long run, in the general population. Prime candidates are vaccinia strains that are much more weakened, so that they're powerless to set up an infection but still elicit an immune response.

    Such highly attenuated vaccinia viruses are already used as backbones for several other vaccines. Aventis Pasteur, for instance, is developing an HIV vaccine based on a highly weakened vaccinia strain called NYVAC; now, it plans to test whether NYVAC by itself might make a smallpox vaccine.

    Waiting for a shot.

    In February 1947, New Yorkers lined up for smallpox vaccines after an outbreak hit the city. The vaccine can cause serious and fatal side effects.


    But the candidate with the best prospects, experts say—simply because it has the longest track record—is a vaccine called modified vaccinia Ankara (MVA). Produced by passaging vaccinia 574 times in chicken embryo fibroblasts, MVA was given to more than 150,000 Germans in the 1970s, most of them at high risk of side effects. The vaccine was used as a primer to establish a baseline immunity, thus preparing the body for the traditional smallpox shot given several months later. MVA was shown to be safe, and it helped people tolerate the real vaccine. One of the two companies developing it, Copenhagen-based Bavarian Nordic, has recently given its version of MVA to small groups of healthy and immunocompromised volunteers; they, too, didn't suffer serious side effects, the company recently reported.

    But does the combination of MVA and a traditional vaccine work? Unlike Dryvax and other old-style vaccines, it has not proven its mettle in endemic areas, and there was no smallpox around in Germany when it was used. It doesn't produce the telltale take—and the subsequent scar—that researchers have always relied on to indicate protection. Because it's unethical to do tests in which human vaccinees are exposed to smallpox, the Food and Drug Administration will have to rely on animal tests, as well as measurements of the immune response it generates in humans. Researchers at the Centers for Disease Control and Prevention and the U.S. Army Medical Research Institute of Infectious Diseases have developed a monkey model of smallpox, which they hope to perfect this year (Science, 15 March, p. 2001); MVA is one of the first products they plan to test.

    Because MVA protects against several other members of the orthopoxvirus family, Bavarian Nordic president Peter Wulff is confident that it will pass those remaining tests. He predicts that the combination will eventually replace Dryvax and other traditional vaccines—not just for high-risk groups but for the general population as well. The objection that it doesn't produce a take, Wulff adds, “is a little beside the point from a scientific viewpoint. What counts is the immune response.”

    But others are not so sure. Donald A. Henderson, former head of the World Health Organization's eradication effort and now a top bioterrorism adviser to the Department of Health and Human Services, for instance, says he'd be leery of relying on anything less than the tried and true to protect the population. “I don't know how you could ever be completely sure of [MVA's] efficacy,” Henderson says.

    The dilemma seems certain to crop up more frequently as scientists shore up the world's defenses against bioterrorist threats. Most of the diseases that would appeal to terrorists are extremely rare in nature, so doing efficacy tests of new drugs and vaccines in humans will usually be impossible—unless, of course, the worst scenario materializes.


    For Whom the Stars Toll

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    In the new field of asteroseismology, researchers are hoping to peer into stars by studying tiny oscillations of their gaseous surfaces

    It is said that a blind musician can recognize a Stradivarius from its sound and even sense the occasional crack in its top plate. Similarly, astronomers hope that they will soon be able to deduce the provenance and health of stars by listening to their sound.

    For decades, solar physicists have extracted huge amounts of data about the interior of the sun by studying vibrations of its surface, a field that has become known as helioseismology. Over the past few years, astronomers have detected, across vast distances, similar vibrations in a few other stars. Earlier this month, a team reported for the first time the existence of vibrations in a giant star, 10 times the size of our sun. The new field of asteroseismology is taking off, and researchers hope it will provide a wealth of data on the size, density, temperature, age, structure, and chemical composition of stars. “Asteroseismology will become an extremely important tool in astrophysics,” says Douglas Gough of Cambridge University.

    The complicated oscillations of the sun's surface that physicists first spotted in the 1960s have periods of about 5 minutes. They are now known to be caused by turbulence and convection of hot gas in the outer mantle of the sun. The surface responds to these motions by ringing like a bell, at frequencies determined by the sun's physical properties and internal structure. These movements of the sun's surface are now tracked daily by satellites such as the U.S.-European Solar and Heliospheric Observatory (SOHO).

    Astrophysicists are eager to know if their model of how the sun works also applies to other stars, but detecting similar oscillations at such distances has proved extremely difficult. It's little wonder: The quivering undulations of the solar surface have an amplitude of less than 25 meters and move at velocities well below 1 meter per second. And for distant stars, which appear as mere points of light from Earth, motions of different parts of the star tend to cancel each other out, giving an overall impression of a motionless surface. As a result, 10 years ago asteroseismology seemed like wishful thinking.

    But in spring 1994, a team led by Hans Kjeldsen of the University of Aarhus in Denmark succeeded in detecting oscillations in the star Eta Bootis, 38 light-years from Earth. This star is larger and more massive than the sun and so should oscillate more slowly. By measuring subtle changes in the relative intensities of spectral lines, Kjeldsen and his colleagues deduced temperature changes of a few hundredths of a degree with periods of about 20 minutes. Other astronomers looked for corresponding Doppler shifts in the light from the star's heaving surface but were unable to confirm the findings, mainly because the star rotates quickly, making Doppler measurements more difficult. “Not everyone believed our results,” says Kjeldsen.


    Oscillation patterns on Xi Hydrae, such as that above, were spotted using the Leonhard Euler telescope in Chile.


    The breakthrough came in 1999 and 2000, when astronomers finally measured minute variable Doppler shifts in the light from nearby stars Procyon and Beta Hydri. Detecting a star's surface moving at just 50 centimeters per second had become possible thanks to extremely sensitive spectrographs that astronomers had also used to look for extrasolar planets. In fact, for the observations of Beta Hydri, the team headed by Tim Bedding of the University of Sydney, Australia, also included the renowned planet hunters Geoff Marcy of the University of California, Berkeley, and Paul Butler, now at the Carnegie Institution of Washington in Washington, D.C.

    Soon afterward, astronomers at the University of Geneva, Switzerland, detected vibrations on Alpha Centauri A, our sun's nearest stellar neighbor. They used the CORALIE spectrograph on the Swiss 1.2-meter Leonhard Euler telescope at La Silla, Chile, which is also used for planet hunting. “It's a fantastic instrument,” says Kjeldsen. “The results are just amazing.” The surface of Alpha Centauri A was clocked at just 35 centimeters per second.

    But what about stars very different from our sun? For giant, slowly oscillating stars, astronomers need to keep their spectrographs extremely stable and well calibrated over long periods and might need more than one instrument to keep watching during several oscillations. This month, a large collaboration of European astronomers announced that it had used the Euler telescope and a similar instrument at La Palma in the Canary Islands to observe the old giant star Xi Hydrae twice per hour for a month. The star, they found, oscillates once every 3 hours. “Eventually, we hope to deduce the internal structure of stars in each possible evolutionary stage,” says team member Conny Aerts of the Catholic University of Leuven, Belgium.
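The slower ringing of a giant follows from a simple scaling, shown here as a back-of-the-envelope check rather than the team's actual analysis: oscillation periods scale roughly with a star's dynamical time, √(R³/GM), so a star 10 times the sun's size—assuming, as we do here, a roughly solar mass—should oscillate about 10^1.5 ≈ 32 times more slowly:

```python
import math

# Periods scale roughly with the dynamical time, sqrt(R^3 / (G * M)).
# Assumption (not from the article): Xi Hydrae's mass is ~1 solar mass.
radius_ratio = 10.0   # Xi Hydrae radius / solar radius (from the article)
mass_ratio = 1.0      # assumed

scale = math.sqrt(radius_ratio**3 / mass_ratio)   # ~31.6
sun_period_minutes = 5.0                          # the sun's 5-minute oscillations
predicted_hours = sun_period_minutes * scale / 60.0

print(f"{predicted_hours:.1f} h")  # 2.6 h, close to the observed ~3 h
```

Under these assumptions the estimate lands within about 15% of the reported 3-hour period, which is as much agreement as a one-line scaling argument can promise.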

    Gough says it's not yet possible to deduce detailed stellar properties from asteroseismology data. “But right now, we have so few measurements of this kind that any new observation is of interest, especially for an evolved star” such as Xi Hydrae. “The observed frequencies fit the theoretical expectations, and that's comforting,” he says. Already, other groups are focusing on stars with special properties, such as compact dwarfs and stars with low abundances of heavy elements, and later this year the Euler Telescope will be equipped with a much more sensitive spectrograph.

    The big breakthroughs, however, will have to await the launch of dedicated asteroseismology satellites. From space, sensors will be able to measure tiny brightness fluctuations rather than Doppler shifts, enabling them to scan many more stars for vibrations and study fainter stars. In December, Canada will launch a small asteroseismology satellite called MOST, and 2 or 3 years later, Denmark and France hope to launch their own missions, dubbed Rømer/MONS and COROT.

    But these space pioneers will likely be eclipsed by an orbiting observatory called Eddington that the European Space Agency plans to loft in 2007 (see p. 1585). “Its final design could benefit from the preliminary results of the other missions,” says Gough. “Without any doubt, space observations are going to transform this field, just like they did with helioseismology.”


    Cash-Strapped Fund Struggles to Make Science a Priority

    1. Adam Bostanci

    Earlier this month, donor nations failed to agree on a budget for the Global Environment Facility. The crisis could undermine nascent efforts to strengthen its science base

    CAMBRIDGE, U.K.—When a pair of ecologists unveiled an ambitious proposal 5 years ago to triple the size of South Africa's Addo Elephant National Park, they had no trouble convincing peers that the idea was sound. As a recipe for sustainable development, their plan offered all the right ingredients: thousands of new ecotourism jobs for the depressed Eastern Cape region; a refuge for more than 400 bird species, including the world's largest gannet colony; ample breeding grounds for elephants; and hundreds of kilometers of new fencing to protect six of the country's seven biomes, from the spectacular coastal dunes in Alexandria to the fragile Sundays River estuary. All that ecologists Graham Kerley and André Boshoff of the University of Port Elizabeth needed to drum up were political support and the $40 million it would take to make the Greater Addo National Park a reality.

    Their quest for a Greater Addo came to a satisfying conclusion earlier this month, when the governing body of the Global Environment Facility (GEF) approved a $5.5 million grant. Even before the decision was made, the megafund's promise of support had provided the leverage that Kerley and Boshoff needed to convince South African National Parks and the government to embrace their proposal and to raise the rest of the money from private donors. To enthusiasts, this is exactly what GEF is all about: getting environmentally sound initiatives off the ground.

    But many other projects in GEF's pipeline were not so lucky. At a parallel meeting earlier this month, the fund's donor nations failed to agree on GEF's next budget. The United States has resisted calls to increase GEF's 4-year funding from $2.2 billion to $3.2 billion to cope with a mandate recently broadened to cover desertification and persistent organic pollutants. A Treasury Department official attributes the reluctance to pony up funds to what the United States views as GEF's inadequate monitoring of whether grant money is spent wisely, as well as an intention to catch up on arrears: The United States currently owes the fund $220 million. The uncertainty leaves dozens of projects in limbo, including 16 that have already been delayed by funding shortfalls. In addition, GEF's Scientific and Technical Advisory Panel (STAP) had to cancel some of its initiatives, including a workshop on how to clean up organic pollutants.

    Taking off.

    GEF funding was crucial to starting work on Greater Addo National Park and its famous gannetry.


    The belt-tightening has brought to a boil what many STAP members view as the fund's most pressing issue: weak scientific underpinnings of many projects. “Given that resources are scarce, it is absolutely critical that scientific input is sought as a basis for GEF policymaking,” says new STAP member Leonard Nurse, a coastal ecologist at the University of the West Indies. Adds Al Hammond, senior scientist of the World Resources Institute, who recently took part in a major outside review of GEF's operations: “If we are serious about getting scientific advice, then we have to change the system.”

    Since being set up a decade ago to fund the United Nations Conventions on Biological Diversity and Climate Change, GEF has spent $4.2 billion on more than 1200 projects and catalyzed an additional $12.7 billion in matching funds from governments and private investors. It funds proposals from the U.N. Development Programme, the U.N. Environment Programme, and the World Bank, which will oversee the Greater Addo project.

    In the run-up to its meeting held earlier this month in Washington, D.C., GEF's governing council commissioned a major review of the fund's first decade of operation. The panel, led by sustainable development specialist Leif Christoffersen of the World Conservation Union, concluded in January that the fund has achieved “significant results” in addressing global problems. The review drew attention to GEF-funded projects that have slashed emissions of ozone-depleting compounds and supported energy-efficient technologies. For example, a project in Brazil on converting wood chips and sugar-cane waste into fuel extracted five times more energy from a given amount of biomass than previous methods; the technology is now being replicated in the United Kingdom.

    GEF has moved more slowly on species conservation, but Peter Raven, director of the Missouri Botanical Garden in St. Louis, says the fund has done “invaluable work” in this area, such as funding the training of young taxonomists in SABONET, a network of botanical gardens in southern Africa. It is also contributing one-third of the $20 million budget of the Millennium Ecosystem Assessment. Under the 4-year effort, some 1500 scientists are gathering data on the world's ecosystems, assessing nature's ability to provide essential “services” such as food and clean water and projecting trends such as deforestation and species loss (Science, 8 September 2000, p. 1676). The analysis of these data is intended to help governments better steward their natural resources.

    Like night and day.

    Fencing protects Addo land from overgrazing.


    But GEF received poor marks for its efforts to reach out to the scientific community. Presenting the review's findings last month in The Hague, James Seyani, a Malawian botanist at the Commonwealth Institute in London, argued that “projects should be submitted to critical scientific review, which has not been the case in the past.” Christoffersen's panel characterized the current process, in which one or more scientists drawn from a 400-strong roster reviews each proposal, as an “obligatory but sometimes meaningless checkoff.” One reviewer told Science that implementing agencies are sometimes looking for a “rubber stamp” before a project goes to the council for approval. And experts in developing nations are rarely consulted, because the U.N. agencies prefer faster replies from Western experts with better communication links.

    At the council meeting in Washington, outgoing STAP chair Madhav Gadgil, an ecologist at the Indian Institute of Science in Bangalore, recommended that the fund greatly expand its scientific links in the developing world and require agencies to tap local scientific networks during a project's design. His call has set in motion a long-term reassessment of scientific input into GEF projects. But given the fund's cash crunch, the council only endorsed steps that would cost a negligible amount to implement—such as consulting local experts more often.

    Frantic negotiations are now under way to arrive at GEF's new budget, with some Western European nations and Japan pressing for a big boost. If the funds come through, new STAP member Peter Schei of Norway's Directorate for Nature Management says he hopes to see broader scientific involvement in GEF projects, along the lines of the Intergovernmental Panel on Climate Change.

    “GEF was started as an experiment, but the experiment has grown up,” says Hammond. Now, he says, is the time to take GEF to another level—by strengthening its independence and making input from scientific communities around the globe a priority.


    Biologist Gets Under the Skin of Plants—And Peers

    1. Richard Stone

    By doing things his own way, Gary Strobel has made himself a bioprospector par excellence, finding useful chemicals in the most unusual places

    Gary Strobel was returning from a bush walk in Australia's Northern Territory one afternoon when he encountered a wizened man on the outskirts of a village. They struck up a conversation, during which Strobel explained that he was an American scientist who had come to collect snakevine, a plant that indigenous Australians mash up and apply to wounds. Strobel, an expert on endophytes—organisms that make their homes inside other organisms—described how he would hunt for endophytic microbes in the snakevine that might help explain its medicinal properties. Reggie Munumbi Miller, an Aborigine, listened patiently, then asked, “Can you tell me where I came from?”

    That sort of metaphysical rejoinder might put off most conservative middle-aged white guys with a crewcut. But it didn't faze Strobel, who got to know the tribal elder well over the next several days. Then he returned to his lab at Montana State University (MSU) in Bozeman, in September 2000, with snakevine samples and a didjeridu, a deeply resonant Aboriginal wind instrument that he enjoys playing.

    And just as Strobel has often done before in remote corners of the globe, he discovered a new life-form: a kind of streptomycete, a genus of bacteria that produces more than 50 licensed antibiotics. Whereas most streptomycetes live freely in soil, the new species appears to dwell inside the snakevine, in the interstices between cells. That November, Strobel sent some photos of the bacteria to Munumbi Miller. After waiting a few months for a reply, he dispatched a second set of photos, in case the first had been lost. Then in early 2001 he and postdoc Uvi Castillo found that the new streptomycete makes a clutch of potent antibiotics. Strobel's joy would be tempered, however, by the reply he would eventually receive from Australia.

    Strobel, now 63, has spent the past decade of a diverse career uncovering biochemical relations between plants and endophytes that could prove beneficial to society. His most famous—and controversial—find came in 1993, when his team discovered an endophytic fungus inside yew trees in Montana that, like its host, produces the billion-dollar cancer drug Taxol. In the decade since, Strobel has embarked on an endophyte road show, having ferreted out fascinating species from Brazil to China. Back at the lab, he screens metabolites of these organisms in assays for potential drugs, pesticides, or other commercial properties.

    Rare bird.

    Gary Strobel sees himself as a nonconformist.


    Strobel's globetrotting makes him a rare bird: a prospector who is still working with indigenous peoples to identify promising sources of drugs. Most major pharmaceutical companies have sidelined that approach in favor of combinatorial chemistry and other high-throughput screens. “We need people like Gary to keep natural stuff in the pipeline,” says industrial microbiologist Arnold Demain of Drew University in Madison, New Jersey. By playing that role, Strobel has become a rainmaker for MSU, which, together with him and his collaborators, holds some 16 patents on his lab's top finds, with another dozen pending. Strobel is an ardent advocate of returning a portion of the profits to the local communities in which he collects.

    Success as a bioprospector has provided a measure of redemption for Strobel, after an incident 15 years ago that nearly ended his career. In 1987, he deliberately infected a group of elm trees on the MSU campus with deadly Dutch elm disease, after injecting them with a mutated bacterium to see whether he could cure them. The unapproved study brought him head to head with the Environmental Protection Agency (EPA). Thumbing his nose at the feds transformed him into a cause célèbre of conservative groups and a darling of The Wall Street Journal's editorial page. But many scientists denounced the experiment as irresponsible. “Strobel is a bright guy,” says Anne Vidaver, chief scientist of the U.S. Department of Agriculture National Research Initiative. But “his view of ethical and appropriate scientific behavior is at odds with most scientists.”

    While defending his scientific record, Strobel acknowledges that he's a maverick. He eschews most scientific societies. “I dislike ‘me too’ science,” he says. Membership in such organizations “does not foster thinking beyond or outside the box.” That iconoclastic view generally does not endear him to his peers; some endophyte specialists who wished to remain anonymous grouse about Strobel's reluctance to come to their meetings and defend his findings.


    Drawing a line under a controversial experiment, Strobel takes a chainsaw to his infected elms in 1987.


    At the same time, his folksy manner and ability to connect with people make him a consummate politician, a role he has used to champion Montana's homegrown scientists. Even friends and admirers call him an enigma—a characterization that Strobel clearly relishes. “You feel an excitement in your soul when somebody wants you to conform, and you choose not to,” he says.

    Civil disobedience

    Strobel has been swimming far from the school ever since he was a youngster in Massillon, Ohio, forgoing football in favor of a passion for nature. Chasing the Boy Scouts' William T. Hornaday Award for conservation during high school, Strobel spent Saturdays on a variety of projects, including planting trees along the Tuscarawas River to help reclaim land denuded by strip-mining. He ended up sowing 30,000 seeds—and won the rare honor.

    Shortly after graduating from Colorado State University with a degree in agricultural science, Strobel began sowing the seeds of his future peripatetic lifestyle. In 1960 he joined one of the first U.S. tour groups allowed into the Soviet Union. He spent the summer there and got to see much of western Russia, then returned to the United States to start his Ph.D. studies at the University of California, Davis. After graduating in 1963 he ended up in Montana, where he is now into his fifth decade at MSU.

    Strobel toiled in relative obscurity until becoming interested in Dutch elm disease. The disease, caused by a fungus, slipped into the United States in the 1930s, when it first showed up in Ohio. The fungus started killing elms by the thousands and spread into other states, carried by bark beetles that fly from tree to tree.

    One day in the late 1970s, Strobel struck on the idea of waging biological war on the Dutch elm fungus, Ophiostoma ulmi. “I decided that I would take bacteria that I know make antifungal compounds and put them into trees,” he says. He found a strain of the bacterium Pseudomonas syringae that produces a toxin that kills O. ulmi in culture, then bred a mutant that warded off Dutch elm disease in trees grown in a greenhouse. In early 1987, he told an MSU colleague that he was eyeing a stand of 27 scrawny elms near the university's football field for a field trial of his mutant bacterium. “He told me, ‘Strobie, you've got to call the feds before you do that experiment.’” Strobel duly called EPA headquarters in Washington, D.C., described his experiment, and was told that he couldn't proceed without the agency's approval.

    Strobel did fire off an application to EPA, but he didn't wait for the reply. On 17 June 1987, he injected half the MSU trees with his mutated bacterium; a month later, he infected all the trees with the Dutch elm fungus. “Montana independence and resistance boiled over in my soul” is how he puts it now.

    Strobel called EPA and told officials what he'd done. An EPA investigative team dispatched to MSU in late July determined that Strobel had violated the agency's recently introduced rules on field trials involving genetically modified organisms and issued a letter of reprimand. In the meantime, MSU's biosafety committee was on the case, and on 12 August held an inquiry. “It was what you call a kangaroo court,” Strobel contends—a characterization rejected by scientists who served on the panel. When asked at the hearing why he undertook the experiment, he blurted out “civil disobedience.”

    Surely you jest?

    The fungus Pestalotiopsis jesteri, so named because it resembles a jester's cap, makes an antifungal compound called ambuic acid.


    The comment made the front page of the local newspaper, triggering an avalanche of coverage in the national press the next day. An article in The New York Times recited the possible consequences of Strobel's action: fines, imprisonment, even closure of the university. Nevertheless, Strobel says he felt a strange equanimity: “For the first time, I realized there was a cause bigger than myself.”

    The situation became even more surreal when a Wall Street Journal editorial in late August derided the EPA regulations and likened Strobel's plight to Galileo Galilei's standoff with the Catholic Church. He ended up escaping with a slap on the wrist.

    Strobel was out of the woods, but he still had his trees to worry about. The treatment appeared to have worked: The control trees sickened, and the treated trees thrived. Although no one compelled him to do it, he took out a chainsaw and felled the seemingly healthy trees. Strobel admits that he worried about the possibility of the disease spreading. But he also wanted to close this chapter in his life. “I just wanted to obliterate it,” he says.

    The fallout did not end there, however. During the weeks he was in hot water, grant money for his lab ran out and members of his staff departed one by one. Things were rough on the home front as well: His marriage was breaking up. “I saw everything just sort of disappear,” he says. “All I had left was two hands and a brain.”

    New territory

    On a blustery spring day last November in Patagonia, with the awe-inspiring southernmost part of the Andes chain as a backdrop, Strobel snips a branch from a gnarled nirre, also known as the Antarctic southern beech. “This is ancient forest, the same kind of trees that were here millions of years ago,” he says. In this dry climate, he expects far fewer endophytic fungi than he'd normally find on a rainforest jaunt, but he nonetheless predicts “interesting chemistry” from the ones he does find, because the plant-microbe interactions have developed for so long in relative isolation.

    Strobel's second wife, Suzan, jots their Global Positioning System location and other information in a notebook. She has helped him bounce back from his Dutch elm ordeal. A few years after his divorce, Strobel was invited to Brigham Young University (BYU) in Provo, Utah, by his longtime friend and collaborator, Bill Hess, a microscopist. Like Strobel, Hess is a Mormon, and he wanted to introduce Strobel to a few nice Mormon ladies. “We had a plan of going through the BYU directory,” Hess jokes. Suzan's daughter worked in Hess's lab and brought in a photo and résumé of her divorced mother. When Strobel saw Suzan's photo, he stopped and looked at it for 5 minutes, then said, “I've got to meet her.”

    Strobel hasn't looked back since. A parade of intriguing findings followed his team's 1993 discovery of a Taxol-producing fungus, including three more species of endophytic fungi around the world that make Taxol. Still, the minuscule quantities leave many researchers wondering if Strobel and others aren't seeing an artifact. “A large percentage of the natural products and mycological community is still unsure whether fungal Taxol is real or not,” says one mycologist.

    Other finds are less controversial. The bounty of the past few collecting trips includes a fungus from inside the bark of the cinnamon tree that wages gas warfare on other microbes; the ecomycins, antifungal peptides made by the bacterium Pseudomonas viridiflava; and ambuic acid, an antifungal derived from a fungus in Papua New Guinea. Ambuic acid is complex, says chemist James Harper of the University of Utah, Salt Lake City: a cyclohexanone with a seeming shortage of hydrogens that doesn't form crystals. “They would never be able to come up with something like this using combinatorial chemistry,” Harper says. “Gary finds some really weird molecules.”

    Prospecting in Patagonia.

    Some 10% to 15% of the endophytic fungi (bottom) that Strobel collected last fall have antibiotic activity.


    Montana's finest

    As much as Strobel wants to pioneer new territory on his own, he also yearns to be appreciated as a fatherly figure who has nurtured two generations of young scientists in Montana. For years he served as state project director for the Experimental Program to Stimulate Competitive Research (EPSCoR), an effort at the U.S. National Science Foundation (NSF) and other agencies to strengthen science in states such as Montana, Nebraska, and South Carolina that do not win their share of federal research funds. Strobel showed a knack for the right political moves: He arranged visits to Montana by powerbrokers such as former NSF director Neal Lane, current director Rita Colwell, and Bruce Alberts, president of the U.S. National Academy of Sciences.

    But cultivating political goodwill isn't Strobel's only strong suit. “One of his greatest accomplishments was recognizing talent,” says Richard Anderson, NSF's senior adviser for EPSCoR. Perhaps the best example is that of a young museum curator who came to Strobel's office one day in the early 1980s to ask for help in winning a grant. He didn't have a bachelor's degree, but he was obsessed with dinosaurs and claimed to have made an unprecedented find: a nest of dinosaur eggs. Strobel sent Jack Horner's proposal out for review. Two reviewers pronounced Horner's nests, if they were for real, the paleontological find of the century. A few years later, Strobel says, a couple of project officers from different NSF divisions were out at Horner's field site in Montana arguing over who was going to have the honor of funding him.

    Last week, Strobel headed back to Australia's Northern Territory for more prospecting, but it was a bittersweet return. A year ago, he got a letter from a doctor in Munumbi Miller's village. She had seen Strobel's letter, along with the images of the bacterium, on a nightstand at the tribal elder's bedside. He had passed away on Christmas Eve 2000. The bacterium produces several novel, broad-spectrum antibiotics, including one that obliterates the malaria parasite in culture. In honor of his friend, Strobel has named these compounds the munumbicins. He intends to ensure that if any of the munumbicins do reach the market, some of the profits get funneled back to Munumbi Miller's community.

    Now that he's approaching the sunset of his roller-coaster career, Strobel has started thinking of his own mortality. He doesn't expect to embark on any new radical change in direction. “I'll probably finish my days on this Earth scrounging around in the ground to see what's there,” he says. “I wish I had 10 more lifetimes—everything we touch is novel.”


    Genome Centers Push for Polished Draft

    1. Elizabeth Pennisi

    The genome centers are working flat out, trying to overcome perhaps the hardest challenges yet in finishing the sequence of the human genome

    COLD SPRING HARBOR, NEW YORK—Faced with stiff competition from a private company, the publicly funded Human Genome Project 3 years ago pulled out all the stops to finish a rough draft of the sequence. Since then, even though there's been no competition, the consortium's pace hasn't flagged. It will produce a polished version of all 3 billion bases of the human genome by spring 2003—right on schedule, and right on time for the 50th anniversary of the discovery of the structure of DNA, Jane Rogers of the Wellcome Trust Sanger Institute in Hinxton, U.K., reported at the Genome Sequencing and Biology meeting held here earlier this month.

    Biologists have been waiting for this prize for more than a decade. Although the draft sequence gave researchers a leg up on finding genes and a preview of what was to come, the finished product promises much more. It's “going to provide a kickoff for a whole lot of interesting science for a very broad set of scientists,” says Sanger's Richard Durbin. Not only will the flood of data make it easier for geneticists to track down genes, but it will also help cell biologists understand chromosome structure and developmental biologists unravel how organisms assume their final form.

    But the last 5% is always the toughest, as Francis Collins, director of the National Human Genome Research Institute (NHGRI), and others warned when the project began a decade or so ago. Parts of the chromosomes still stubbornly resist efforts to break their DNA code. Other stretches of the sequence are so alike that they are confounding efforts to piece the genome together correctly. Yet the sequencers are undaunted. “When [the spring 2003 goal] was first mentioned 4 years ago, I was skeptical,” admits Rogers. But now she pronounces it “doable.”

    Four chromosomes are virtually finished, Rogers points out: 20, 21, 22, and Y. Chromosomes 6, 7, 13, and 14 “are in the final stages,” she notes. Chromosomes 9, 10, and some others are about 85% done. And “we've got a year” to do the rest, Rogers adds confidently.

    Some attribute this progress to the huge sums of money NHGRI and the Wellcome Trust poured into the big sequencing centers as the race heated up. But Collins also credits his nontraditional management style. Typically, researchers get their grants and proceed with little input or interference from the granting agency. But not the U.S. genome centers. “For a while I was riding herd pretty rough on the centers,” Collins admits.

    Each of the 16 participants in the international consortium had to agree in writing to complete its share of finished sequence, and Collins has been tracking their progress by monitoring the data deposited in GenBank. Any dawdlers must get another center to take up the slack, but Collins doesn't expect that to happen, “because no one wants to be the weak link.” If the centers keep churning out bases at the current rate, he predicts that all the sequencing could be finished by the end of the year. Then all they have to add are the “final touches.”

    Race to finish.

    The percentages shown reflect how complete the sequence of each chromosome is.


    Unfortunately, these “final touches”—which involve assembling all the DNA in the right order without gaps and without mistakes—make sequencing the entire genome look easy. Finishing, as this phase is known, cannot be automated to the same extent as sequencing. What's more, it leaves to humans all the problems that the computers couldn't solve when they tried to assemble the genome.

    DNA hiccups

    One of the bigger challenges that has confounded the sequencers is repetitive DNA: sections along the chromosomes where the same base or base sequence is repeated once, a dozen times, or more. These reside mostly at the telomeres, or the ends of the chromosomes, and the centromeres, positioned near the middle of each chromosome. For reasons not well understood, it is sometimes difficult or even impossible to copy, or clone, pieces of the DNA from these two regions in bacterial artificial chromosomes, a key first step to sequencing. Those pieces that are sequenced are often misassembled, notes Evan Eichler of Case Western Reserve University in Cleveland, Ohio.

    The teams that published the four finished human chromosomes opted for expedience: They simply skipped these regions. But for the finished genome, that won't do. “We've already pushed hard into the unclonable regions,” says Collins. “We are really stretching the technology.” And a few brave souls have refocused their efforts on new strategies to make the job feasible.

    Harold Riethman, a molecular biologist at the Wistar Institute in Philadelphia, Pennsylvania, is tackling the telomeres. Researchers worked out the repeated sequence of the telomere DNA several years ago but have not been able to decipher the DNA nearby: the so-called subtelomeric regions. So Riethman has turned to a special type of yeast artificial chromosome called a half-YAC, which contains just a small amount of yeast DNA relative to the inserted human DNA. Into these half-YACs, Riethman has inserted pieces of human DNA that butt up against the telomere's signature repetitive sequence.

    The half-YACs duplicate these chunks of DNA, providing fodder for sequencers ready to take on these regions. The DNA bits are still difficult to decipher, and the sequence hard to reassemble. The biggest problem is that these subtelomeric regions often contain double, triple, or even more copies of segments from that and various other human chromosomes: Deciding exactly where one fits is tough when several seem to match. Nonetheless, 37 of these subtelomeric regions have been partially sequenced and joined to the appropriate chromosome by collaborators in the sequencing centers, Riethman reported at the meeting. “He has done an outstanding job,” says David Haussler, a bioinformaticist at the University of California, Santa Cruz.

    The centromeres are proving even more vexing, says the geneticist who has taken them on: Case Western's Eichler. “In general, the pericentric regions are the hardest thing left to do,” Haussler says. The centromeres, too, are plagued by blocks of DNA up to 10 million bases long that are a mishmash of duplicated segments from elsewhere in the genome. Despite the challenge, these duplicated regions need to be sequenced, says Eichler: “They can be hotspots for rapidly evolving genes,” and they are implicated in some two dozen diseases, including Prader-Willi syndrome and DiGeorge syndrome.

    Beyond the centromere

    Eichler and his colleagues have identified even more duplicated regions outside the centromeres. These segments might also cause big headaches to sequencers trying to assemble the complete human genome, Eichler's postdoc Vicky Choi reported at the meeting. These duplicated regions, which can be more than 200,000 bases long, make up at least 5% of the genome, and any two duplications can be as much as 99% similar.

    To get the finished sequence right, each chunk must be in its correct chromosomal location, but the computer can't easily place them; in fact, it often doesn't know they exist. A computer might, for example, superimpose a chunk from one chromosome over a sequence that looks like it on another, or discard it altogether because it looks like something that's already been incorporated into the genome. And if the duplicated segments are located close together on the same chromosome, the computer is likely to “collapse” them: treat them as one and ignore the sequences in between. “This is a very, very important issue in finishing the genome,” says Haussler.

    Help is on the way from Choi and Eichler. After writing a computer program to ferret out these duplicated regions, Choi compared human sequence data from the Human Genome Project and its rival, Celera Genomics in Rockville, Maryland. She found some 24,000 fragments in which the assembly might have been confused by duplicated DNA and 89 places where two copies have been merged and intervening sequence lost.
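The article does not describe how Choi's program works internally, but the flavor of such a duplication scan can be illustrated with a toy k-mer index: any seed sequence that occurs at more than one position flags a region an assembler might mistakenly merge. This is only a minimal sketch of the general idea, not the actual software; the function and variable names are mine:

```python
from collections import defaultdict

def duplicated_seeds(seq, k=8):
    """Map each k-mer occurring more than once to all its start positions.

    Repeated seeds mark regions an assembler could mistakenly
    "collapse" into a single copy, losing the sequence in between.
    """
    positions = defaultdict(list)
    for i in range(len(seq) - k + 1):
        positions[seq[i:i + k]].append(i)
    # Keep only the seeds that appear at two or more positions.
    return {kmer: pos for kmer, pos in positions.items() if len(pos) > 1}

# Toy "chromosome" carrying two copies of the same 12-base segment.
segment = "ACGTACGGTTCA"
seq = "TTT" + segment + "GGGGGG" + segment + "AAA"
print(duplicated_seeds(seq, k=12))  # → {'ACGTACGGTTCA': [3, 21]}
```

Real duplication detection must also tolerate the ~1% divergence between copies, which is why tools align candidate regions rather than demanding exact k-mer matches.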

    Impressed by Choi's and Eichler's talks, Collins wasted no time asking for their help at the meeting. Knowing where the duplications are is the first step to dealing with them correctly, says Collins. “For a long time, assemblers weren't paying attention [to these regions],” says Pui-Yan Kwok of the University of California, San Francisco. “Now there is a simple test to help them detect [duplicated regions].”

    These and other efforts should help ensure that the May 2003 version of the human genome meets the agreed-upon criteria for “finished”: all the bases in the right order with the sequence running from telomere to centromere to telomere for each chromosome. The only gaps allowed are those that could not be filled after an exhaustive effort was made with “the set of techniques that are currently available,” says Rogers. Collins predicts that each chromosome might have a dozen gaps. Although nitpickers might argue that the genome is not finished, says Eddy Rubin, a geneticist at Lawrence Berkeley National Laboratory in Berkeley, California, in reality, biologists will at last have all they need.


    Charting a Genome's Hills and Valleys

    1. Elizabeth Pennisi

    Comparisons of the sequences of the mouse and human genomes have turned up unexpected features

    COLD SPRING HARBOR, NEW YORK—Time was when geneticists thought the human genome was quite uniform—consisting simply of genes strung together one after another. Then in the late 1970s, they realized that long stretches of seemingly useless DNA are sandwiched between—and even within—genes. Researchers thought that this intervening DNA constituted a second type of DNA, one that is less essential to an organism's survival and thus likely to accumulate more mutations over time than coding DNA. This accelerated evolution, they thought, would occur at a constant rate across all the noncoding regions. Now, it turns out that they were wrong on that front as well.

    As researchers begin comparing newly sequenced genomes, numerous surprises are emerging, as described at a genome meeting here held from 7 to 11 May. For one, some of that “useless” noncoding DNA turns out to be highly conserved among humans and mice (see also Research Article on p. 1661 and Perspective on p. 1617). In addition to this conservation, another unexpected find is that the rate at which different DNA sequences change through time varies significantly. Some noncoding DNA regions change a lot; many others remain nearly constant.

    With each new genome, “we're seeing there's more to the story” than we realized, says David Haussler, a bioinformaticist at the University of California (UC), Santa Cruz. Adds Pui-Yan Kwok, a geneticist at UC San Francisco, “Genomes are evolving in a completely nonuniform way.”

    As a result, biologists are rethinking their views of how genomes operate—and shedding some of their “gene-centric” views in the process. In particular, the high degree of conservation of some noncoding DNA is helping convince them that these sequences are somehow useful to the genome after all, says Edward Rubin, a geneticist at Lawrence Berkeley National Laboratory in California. And because different parts of genomes change at different rates, evolutionary biologists will have to be much more careful in selecting the DNA they use to evaluate the phylogenetic relationships among organisms.

    Deserts and jungles

    Some of the new findings emerged once researchers were able to survey the entire landscapes of the mouse and human genomes. Rubin and his colleagues have shown that gene density varies dramatically in both species. Some regions—“gene jungles”—have a high density of genes, whereas “gene deserts” have very few, if any, over long stretches of DNA. At the meeting, Rubin's Lawrence Berkeley colleague Inna Dubchak reported that the human genome contains 234 gene-poor sections ranging in length from 620,000 to 4 million bases. Together, these deserts make up about 9% of the human genome, Dubchak said.
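At its core, a desert census like the one Dubchak described reduces to an interval-gap scan over a chromosome's gene annotations: any gene-free stretch at least 620,000 bases long counts as a desert. A minimal sketch under that definition (the function name and toy coordinates are mine, not from the actual analysis):

```python
def gene_deserts(genes, chrom_length, min_len=620_000):
    """Return (start, end) gene-free stretches at least min_len bases long.

    genes: list of (start, end) gene coordinates, in any order.
    """
    deserts = []
    prev_end = 0
    for start, end in sorted(genes):
        if start - prev_end >= min_len:
            deserts.append((prev_end, start))
        prev_end = max(prev_end, end)  # tolerate overlapping genes
    if chrom_length - prev_end >= min_len:
        deserts.append((prev_end, chrom_length))
    return deserts

# Toy 2-Mb chromosome: two genes separated by a gene-free megabase.
genes = [(10_000, 50_000), (1_100_000, 1_150_000)]
print(gene_deserts(genes, 2_000_000))
# → [(50000, 1100000), (1150000, 2000000)]
```

Sorting the intervals first and carrying forward the furthest gene end seen so far keeps the scan correct even when annotations overlap.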

    The researchers expected little similarity between the gene deserts of the mouse and human, but when they matched the human sequence up with the draft sequence of the mouse, they found that the two species had 178 deserts in common. Jane Rogers, a sequencer at the Sanger Institute in Hinxton, U.K., describes this observation as “the most surprising thing” that's so far come out of comparisons of the two genomes.

    Of mice and (wo)men.

    The genomes of the two species are proving to be more similar than predicted, particularly in noncoding regions.


    The results have driven home how narrow-minded genome explorers have been. “We've had an extremely gene-centric view,” says Rubin. What's more, as Francis Collins, director of the National Human Genome Research Institute in Bethesda, Maryland, points out, the work “implies some sort of chromosome organization that has not really been appreciated.” Biologists are also scrambling to identify a function for the noncoding regions. Otherwise, Rubin asks, “what has preserved them over millions of years of evolution?”

    He suggests that they might assist in the pairing of chromosomes that takes place prior to cell division. Or they might help keep chromosomes organized. Either job would be sufficiently critical to help keep those sequences relatively constant through the millennia, says Rubin. To find out, he, Dubchak, and their colleagues are in the midst of breeding mice that lack particular gene deserts to study the effects of these deletions.

    However those experiments come out, the close similarities between the two species' genomes, in deserts as well as genes, threaten to make genome analyses more difficult. Because the typical gene has characteristic DNA at its beginning and end, spotting coding regions is relatively straightforward; other functional sequences lack such telltale landmarks.

    Researchers had hoped that, with two genomes in hand, they would be able to go beyond just picking out genes to finding the DNA involved in regulating gene expression—DNA thought to be critical in the evolution of new species and in human disease. But such regulatory sequences have been harder to find than expected, Collins explains, because for them “we do not have good signatures.”

    One problem is that these DNA sequences can lie far away from the genes they regulate. Genomicists thought they would be able to discern some of these regions by comparing mouse and human sequences, as many key regulatory regions should be the same or similar in both. But the presence of so much conserved sequence that's not regulatory DNA, particularly in the deserts, complicates that search.

    Evolutionary tempos

    Given the comparisons of mouse and human sequences, researchers must also rethink notions about how genomes evolve. At one time, researchers thought the rate of change tended to bottom out in the genes, making them, from an evolutionary point of view, canyons in the genomic landscape. The DNA in between forms the plateaus, with elevated but supposedly consistent rates of change through time.

    To the contrary, the “plateaus” are pockmarked with accelerations and decelerations in sequence mutations, Robert Waterston, director of the Genome Sequencing Center at Washington University in St. Louis, Missouri, reported at the meeting. The same is true of a few bases within genes—those that don't help define a particular amino acid.

    Some preliminary evidence existed for an irregular rate of evolution along noncoding DNA. But those analyses involved small regions of DNA. Only now that researchers can examine whole genomes have they been able to confirm this idea, says Haussler, whose group did much of the analysis that Waterston described.

    Shared topography.

    In the human genome, a subset of gene deserts (red lines) also exist in mice (blue lines).


    For this work, Haussler and his colleagues focused on two types of DNA, neither of which is thought to be subject to selective pressure; both instead undergo what is called neutral evolution, changing randomly over time. The first type of DNA consisted of repeats derived from ancient mobile elements, called transposons, that took up permanent residence in the genome of the common ancestor of the human and mouse many millions of years ago. This provided “an enormous data set, more than 50 million bases,” says Haussler. The second type consisted of the last base in the three-base codons for each of eight amino acids. For all of these codons, Haussler says, “the third base is completely free to change” without affecting the identity of the amino acid specified. That means any changes would reflect neutral evolution, even though they occur within genes.
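    That claim about eight amino acids can be checked directly against the standard genetic code. The sketch below is an illustration, not the group's analysis code: it tabulates the codon families in which all four third-base variants encode the same amino acid.

```python
# Which codon families leave the third base "completely free"?
# Standard genetic code (NCBI translation table 1), with codons ordered
# by bases T, C, A, G in each position; "*" marks a stop codon.
BASES = "TCAG"
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"

codon_table = {
    b1 + b2 + b3: AA[16 * i + 4 * j + k]
    for i, b1 in enumerate(BASES)
    for j, b2 in enumerate(BASES)
    for k, b3 in enumerate(BASES)
}

# A two-base prefix is fourfold degenerate if every third base yields
# the same amino acid and none of the four codons is a stop.
fourfold = {
    b1 + b2: codon_table[b1 + b2 + "T"]
    for b1 in BASES
    for b2 in BASES
    if len({codon_table[b1 + b2 + b3] for b3 in BASES}) == 1
    and codon_table[b1 + b2 + "T"] != "*"
}

print(sorted(fourfold))                 # the codon families
print(sorted(set(fourfold.values())))   # the amino acids they encode
assert len(fourfold) == 8               # exactly eight, as Haussler says
```

    The eight are alanine, arginine, glycine, leucine, proline, serine, threonine, and valine, whose codon families tolerate any third base.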

    Haussler and his colleagues sought out the two types of DNA along 5-million-base windows of each human chromosome. They then calculated the rate of evolution in each window based on differences in the two species. The researchers found that the rate went up and down along the chromosomes and that these fluctuations were largely consistent whether they were looking at ancient transposons or third bases. “We're seeing strong and distinctive variation,” says Haussler. Thus far, no one has found any characteristics in the genome that could explain why this is happening. “It's an absolutely intriguing puzzle.”
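    The windowed analysis can be sketched in miniature. The toy code below illustrates the general approach described, not the group's actual pipeline: it slides a fixed-size window along a pair of aligned sequences (the toy data here stand in for the neutral sites) and reports the mismatch fraction in each window.

```python
def window_rates(seq_a, seq_b, window):
    """Fraction of mismatched bases in each non-overlapping window
    of two aligned, equal-length sequences."""
    assert len(seq_a) == len(seq_b)
    rates = []
    for start in range(0, len(seq_a) - window + 1, window):
        a = seq_a[start:start + window]
        b = seq_b[start:start + window]
        mismatches = sum(x != y for x, y in zip(a, b))
        rates.append(mismatches / window)
    return rates

# Invented toy alignment; the real analysis used 5-million-base windows
# over ancient transposon relics and fourfold-degenerate third bases.
human = "ACGTACGTACGTACGTACGT"
mouse = "ACGAACGTACCTACGTTCGT"
print(window_rates(human, mouse, 5))  # → [0.2, 0.0, 0.2, 0.2]
```

    On genuine data, window-to-window swings in this rate are exactly the "strong and distinctive variation" Haussler describes.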

    It is also, Waterston noted, “a complication we didn't anticipate.” For one, it will complicate the work of evolutionary biologists, who often attempt to date when new species emerged using so-called molecular clocks. These clocks depend on the number of mutations that have accumulated between species and are based on the premise that they “tick” at a constant rate throughout the history of the DNA. A number of skeptics have questioned the reliability of these clocks (Science, 5 March 1999, p. 1435), and the new findings could provide them with new ammunition. For instance, if the genome changes at different rates in different parts of the chromosomes—rates that might also slow down or speed up through time—then molecular clocks might provide erroneous results unless researchers choose the regions they are comparing very carefully.
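    The clock problem can be made concrete with a hypothetical two-region example. Under the usual clock relation t = d / (2r), where d is the pairwise divergence and r the per-lineage substitution rate, applying one genome-wide rate to regions that actually evolve at different speeds yields contradictory dates for the same split. All numbers below are invented for illustration.

```python
def divergence_time(d, r):
    """Estimated split time (years) from pairwise divergence d
    (substitutions per site) and per-lineage rate r (subs/site/year)."""
    return d / (2 * r)

ASSUMED_RATE = 2.2e-9  # a single genome-wide "clock" rate (hypothetical)

# Two regions from the same species pair: the true split time is the
# same, but the regions happen to evolve at different local rates.
fast_region_divergence = 0.50
slow_region_divergence = 0.30

t_fast = divergence_time(fast_region_divergence, ASSUMED_RATE)
t_slow = divergence_time(slow_region_divergence, ASSUMED_RATE)
print(f"fast region dates the split at {t_fast / 1e6:.0f} Myr")
print(f"slow region dates the split at {t_slow / 1e6:.0f} Myr")
```

    The two regions disagree by tens of millions of years even though they record the same event, which is why region choice matters so much.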

    Researchers have just embarked on their exploration of the full mouse and human genomes, and already long-held notions are being overturned. Just think, says Rubin, of what lies ahead.