News this Week

Science  05 May 2000:
Vol. 288, Issue 5467, pp. 782


    DOE Set to Double Price of Troubled Laser Project

    1. David Malakoff

    The world's largest laser project will end up costing almost twice as much as originally estimated. Last week, Department of Energy (DOE) officials publicly outlined several options for rescuing the troubled National Ignition Facility (NIF) at Lawrence Livermore National Laboratory in California that will add from $750 million to $1 billion to NIF's current $1.2 billion price tag. Members of Congress—which must approve a new spending plan for the project—are grumbling about the huge overruns, and Science has learned that Livermore and DOE officials are at odds over which option should be chosen. Energy Secretary Bill Richardson may decide how to proceed as early as this week.

    DOE has already spent nearly $800 million to build NIF (see graphic), which is designed to focus 192 powerful laser beams on a pellet-sized target in a bid to test the feasibility of fusion energy and simulate how nuclear weapons behave. It is a pillar of the department's $4.5-billion-a-year Stockpile Stewardship Program, which aims to preserve the reliability of the country's nuclear arsenal without actual testing. The project ran into serious problems last year, however, after Livermore officials revealed that technical glitches and management missteps would delay the project beyond its scheduled 2003 completion date and drive up costs (Science, 31 March, p. 2389). The revelations led to a string of reports critical of NIF's management and the departure or punishment of several top project officials. Through it all, DOE officials could not say for certain how much it would cost to achieve first firing.

    NIF planners are now close to calculating that number. In a 9 March letter, DOE weapons program head Thomas Gioconda asked Livermore director Bruce Tarter to flesh out the impacts of six budget options, ranging from substantially scaling back the project to spending an extra $1 billion or more over the next 10 years. A few weeks later, according to notes obtained by Science, Gioconda briefed Richardson on the causes of NIF's budget woes and the two leading options. DOE faces a 1 June deadline for giving Congress the new numbers and explaining how it will pay for the overruns.

    Surprise rise. NIF construction costs were due to dip sharply next year before overruns.


    One problem that plagued NIF was a contingency fund—about $150 million—that planners now say was too small to absorb the inevitable surprises that come with such a technically challenging project. But even a much bigger fund might not have solved the expensive problems that cropped up in building the laser's beam path, the tortuous system of pipes, lenses, and mirrors that guides the powerful light to its destination. The beam path proved “much more complex than originally understood,” Gioconda's presentation noted. In particular, Livermore lacked the engineering experience to integrate the system's many parts in a timely fashion. The complexity compounded what he called another “poor” management decision: completing the building to house NIF before the laser's full requirements were known. The result, planners say, is a space that is too small for easy installation. It will cost about $300 million to leap that and other hurdles, such as maintaining the extremely clean environment needed to keep dust out of the beamline.

    Livermore officials strongly favor completing the project relatively quickly, by 2007. That so-called Option 1 would spend an extra $390 million over the next two fiscal years (see graph) and raise the total cost to $1.95 billion by 2010 (not counting operating costs). DOE officials, in contrast, prefer Option 3, which would stretch out the work—and the costs—until late 2009. That would raise the total cost to $2.15 billion but add only $100 million per year to NIF's budget.

    At a 26 April meeting of a NIF advisory board, Livermore officials argued that Option 3 would kill the project's momentum. “It would put [NIF] on the back burner,” lab spokesperson Susan Houghton later explained, making it difficult for Livermore to fulfill existing contracts or retain skilled staff. DOE, for instance, estimates that nearly 150 of NIF's 1000-person workforce might face layoffs under Option 3. Houghton, however, says the lab will “live with whatever it gets.”

    That decision now rests with Richardson and with Congress, which this week began work on DOE's first spending guidelines for 2001. A strong economy may give NIF supporters plenty of leeway to provide the “relatively modest increases that NIF will need to survive,” says a House aide. But “jaws are dropping all over Capitol Hill,” he adds. “There are going to be lots of questions about how this happened and how DOE is going to pay for it.”

    Some of those queries may be answered in an extensive General Accounting Office study due to be delivered later this month, or in hearings that the House Science Committee is considering for June. DOE officials also may discuss which weapons or civilian science programs to trim to pay for NIF at another advisory board meeting scheduled for 17 May at Livermore. In the meantime, the black humor surrounding NIF is getting a bit darker. Noting that the laser is often described as a “stadium-sized” project, some DOE employees have proposed that a more fitting benchmark might be the Titanic.


    Scientists Gain Access to Sharper GPS Signal

    1. Andrew Lawler

    Rarely can the government make thousands of scientists happy by simply flipping a switch. This week, however, U.S. President Bill Clinton did so by ordering his military commanders to turn off a scrambling device on Air Force positioning satellites. The change, effective at midnight on 2 May, provides researchers and commercial users with a 10-fold improvement in their ability to pinpoint the location of receivers on Earth.

    Until now, only the U.S. military has had regular access to the high level of precision possible with the constellation of 24 Global Positioning System (GPS) satellites. Civilian researchers have had to make do with the scrambled version of the signals, which are based on atomic clocks onboard each spacecraft. “This really opens up the field for scientists,” says Charles Challstrom, director of the National Geodetic Survey.

    Combining the more exact GPS data with existing maps, for example, will allow scientists in remote territory to plot their locations immediately to within 10 meters, says James Baker, National Oceanic and Atmospheric Administration chief, eliminating time-consuming calculations. Glenda Humiston, Agriculture Department deputy undersecretary for natural resources and environment, says that the greater accuracy will allow researchers to probe environmental effects at a much more fine-tuned level; for example, to understand the effects of roads on watersheds.

    National security agencies have long resisted any move to improve the precision for civilian and commercial users, arguing that terrorists could make use of GPS signals. And the few civilian researchers temporarily given the codes to receive the more exact signals had to face a process that one Commerce Department official says was “costly and cumbersome.” Time-consuming data processing by civilian scientists helped improve the accuracy of scrambled signals, but the quality remained less than that enjoyed by the military.

    Eager for benefits ranging from improved car and boat navigation to tracking freight, however, commercial concerns kept up pressure to unscramble the signals. The Administration in 1996 agreed to stop the scrambling within a decade and moved up that timetable after the Defense Department found a way to rescramble the signals in a particular region in case of war, military officials said at a 1 May White House briefing. Now anyone with a GPS receiver anywhere will have access to signals nearly as accurate as those used by the military services.

    The Administration plans to make GPS even more accurate by adding a second frequency in 2003 that will compensate for disturbances in the ionosphere that interfere with the GPS signals. A third frequency for even greater accuracy is slated for 2006. The government also hopes to improve its ability to estimate the position of each satellite, giving scientists an even better sense of where they are.


    Hints of Frequent Pre-Columbian Contacts

    1. Heather Pringle*
    1. Heather Pringle is the author of In Search of Ancient North America.

    Last week's opening of the Smithsonian Institution's exhibition, “Vikings: The North Atlantic Saga,” was a glittering black-tie affair, with Scandinavian royals rubbing shoulders with an international scientific crowd. The pomp and ceremony served to popularize the scientific evidence that the first contact between Europeans and Americans was not Columbus's voyage but Viking landfall in Newfoundland, thought to have occurred about A.D. 1000. And at a 2-day symposium, Canadian archaeologist Patricia Sutherland took the Viking story a big step further, presenting stunning new traces of the Norse on northern Baffin Island in the Canadian Arctic, at least 200 years before Columbus. Although not all her colleagues are convinced, Sutherland argues that the evidence shows that in the Arctic, unlike in Newfoundland, the Norse had frequent and prolonged contact with aboriginal peoples—the first sustained close encounter of the Old World with the New. “There was more than just in-and-out trading and ‘Goodbye, we won't be back,’” says Sutherland, who suspects that the Norse actually established shore stations on Baffin Island.

    The Norse artifacts, including spun and plied yarn, characteristic woodworking, and even carvings with European-like faces, come from sites of arctic hunter-gatherers known as the Dorset. Although archaeologists had suspected that the Norse had traveled the eastern arctic coasts, until recently no one had searched Dorset sites for clues. Researchers had thought that the Dorset had vanished from most of the Arctic by the time the Vikings crossed the Atlantic, although a fragment of a Norse pot did turn up in a Dorset site in Greenland a few years ago. “But that was just a single find,” says Bjarne Grønnow, an archaeologist and director of The Greenland Research Centre in Copenhagen, who heard Sutherland present her work at a recent conference in Denmark. “When Pat presented her finds, we all said, ‘Wow, it's not just a coincidence. It's really something.’”

    Sutherland, who works at the Canadian Museum of Civilization in Hull, first came across the Norse objects while examining the museum's artifacts from Nunguvik, a Dorset site on northern Baffin Island excavated in the 1970s and 1980s. Among trays of distinctive Dorset harpoons and carvings, she pulled out two strands of soft yarn, one 3 meters in length. Neither the Dorset nor other native northern groups were known to have spun fibers for cloth. Moreover, the specimens closely resembled yarns Sutherland had seen while helping to excavate a well-known Norse site, Gården Under Sandet (GUS), in Greenland.

    Intrigued, Sutherland sent samples to Penelope Walton Rogers of Textile Research in Archaeology, an independent consulting firm in York, U.K.; Walton Rogers has analyzed many of the Greenlandic Norse textiles, including those from GUS, some of which were made of arctic hare fur and goat hair. She concluded that both Nunguvik samples had also been spun from arctic hare fur, and one had probable goat hairs attached, closely resembling GUS's 13th- and 14th-century textiles.

    As Sutherland continued to pore through the Nunguvik collection, she found other Norse items. Several pieces of wood revealed European carpentry techniques, such as iron-stained holes apparently made by square nails and special kinds of fitted joints called mortise-tenon and scarfing. She also found small, Dorset-style carvings of European-looking faces.

    Three other Dorset sites on Baffin Island yielded even more Norse traces. All three sites had produced specimens that the original excavators had described as musk-oxen cordage, but Walton Rogers pronounced two samples to be yarn made from arctic hare fur. One piece was infused with oil and pigment, suggesting a form of oilcloth. All three sites also yielded unusually worked wood similar to that from Nunguvik. One piece with a cut Norse-style design has been identified by conservator Greg Young of the Canadian Conservation Institute in Ottawa as fir, a southern tree rarely if ever found in arctic driftwood.

    The number of Norse objects suggests that the Norse were frequent visitors to Canada's far north, argues Sutherland. “The contact was possibly sufficient to have influenced local technology. Some of the stuff might clearly be identified as Norse. Some might be the Norse stuff modified by the Dorset, and some might be a combination of the two technologies,” she says.

    Other researchers, although they agree there was some kind of contact, aren't yet sure how extensive it was. “Is this just the tip of the iceberg?” wonders William Fitzhugh, an archaeologist at the Smithsonian's Arctic Studies Center and the curator of the Viking show. “Or are we finding just a few things from a few chance encounters?”

    Conclusively dating the artifacts, moreover, is proving difficult. At the moment, Sutherland believes the evidence points to the late 13th and early 14th centuries, as indicated by radiocarbon dates she obtained on one of the nail-pierced wood pieces from Nunguvik. But other radiocarbon dates, including on the Nunguvik yarn, are as early as the 7th and 8th centuries, and those dates fit the original excavators' estimates of the age of the Dorset sites.

    But such early dates—more than a century before the great sea voyages were thought to have begun—would rewrite Viking history. Thus most archaeologists favor the medieval dates. “My qualified guess is that the yarn tells us about early Norse contacts with the people of the Late Dorset culture, between A.D. 1000 and 1300,” says Hans Christian Gullov, an archaeologist at the Danish National Museum. Fitzhugh agrees, noting that bad times in the 13th century may have led the Norse to try their hand at trading walrus ivory, which was much coveted in Europe. That might have spurred a journey to the high Arctic and contact with the walrus-hunting Dorset.

    All the same, Sutherland is not ready to rule out the early dates entirely. Crossing the Atlantic may not have been beyond the capabilities of Europeans at that time. “So we shouldn't be unwilling to consider something earlier.”


    Does Biodiversity Help Fend Off Invaders?

    1. Jocelyn Kaiser

    As nonnative species such as cheatgrass in the western United States and rosy wolf snails in Hawaii have swept over native ecosystems, ecologists have wondered why some ecosystems are more vulnerable to invasions than others. Theory suggests that an ecosystem rich in biological diversity should be better able to resist invasions, but contradictory studies have led to a heated debate over whether biodiversity matters at all. A study on page 852 may help bring the two camps together by showing that both arguments can hold true—depending upon whether one looks at microcosms or across an entire ecological community. “It's a nice paper,” says David Tilman, an ecologist at the University of Minnesota, Twin Cities. “It resolves what's been a brewing controversy the last few years.”

    As far back as Darwin, biologists have suggested that exotic species should have a harder time taking root in a diverse ecosystem, because a whole web of species should more efficiently tie up such critical resources as nutrients, light, and water than would a single species, leaving fewer resources for the invaders. That notion has been supported by theoretical models as well as experiments with plots of grasses and a marine microcosm, among others (Science, 19 November 1999, p. 1577).

    But in large-scale settings such as parks, many land managers and scientists have found that the reverse holds true: Invaders tend to be more successful in diverse ecosystems. Diversity might matter within small experimental plots, biologists have suggested, but in nature other factors that operate across a broad scale—such as good soil and lots of water and sun—seem to swamp the effects of biodiversity and enable both native and exotic plants to flourish.

    To explore these two ideas, Jonathan Levine, an ecology graduate student at the University of California, Berkeley, studied a stretch of the South Fork Eel River in northern California that's dotted with tussocks of a sedge called Carex nudata. Anchored on rocks where they trap soils, these tussocks are miniecosystems containing other grasses, forbs, and mosses. These tiny islands, about two-thirds the size of a sheet of letter paper, are being invaded by three European plants: Canada thistle, common plantain, and creeping bent grass.

    Levine started with a broad look across the ecological community, counting how many exotic plants had invaded the tussocks along a 7-kilometer stretch of river. The more diverse a tussock was, the more invaders it had, he found, as practical experience had suggested. Although that implied a limited role for diversity at the community level, Levine still suspected that diversity could be a major factor at the local scale. To decouple biodiversity from other factors, he manipulated the number of species on individual tussocks. He weeded everything except the sedge from 65 tussocks along a few dozen meters of river, then transplanted onto them anywhere from one to nine native species, creating tussocks with similar plant cover but differing levels of diversity. The next spring, he added seeds of the three invading plants to each tussock. This time, the opposite was true: The more native species there were, the fewer weeds took hold. In short, diversity does matter in fending off invasives, says Levine, but its effects are wiped out by other factors at larger scales.

    Levine wanted to know what those factors might be. One clue was that in the natural ecosystem, the tussocks that had the greatest natural diversity and the highest number of invasive species tended to lie farther downstream. Levine wondered if, compared to upriver plants, downstream tussocks were simply deluged with more seeds of both native and exotic plants washing downriver. To find out, Levine added vast quantities of seeds—enough to wipe out any differences in seeds coming from upstream—from the three invaders to 190 tussocks that varied naturally in diversity along 7 kilometers of river. This time, the invaders were equally successful in colonizing diverse and less diverse communities. The upshot, says Levine, is that in this particular large-scale system, the most important factor influencing invasion abundance was the number of seeds—as opposed to either diversity or resource conditions. Levine says his findings support growing suspicions that the most effective way to stem invasions is not just to try to maintain diversity but to stop nonnative seeds or organisms from getting into an ecosystem in the first place. “A lot of people are coming down to propagule pressure,” or seed number, as the critical factor across large ecosystems, he says.

    Not everyone buys that conclusion, however. Philip Grime of the University of Sheffield in the United Kingdom, for example, believes that if one looks across all studies on invading species, resource supply is just as important as seed number in giving invasives a leg up, and biodiversity doesn't play a role at any scale. So the argument over the role of diversity seems likely to continue.


    Caution Raised About Possible New Drug

    1. Trisha Gura*
    1. Trisha Gura is a science writer in Cleveland, Ohio.

    For the past 4 years, the protein known as TRAIL has dazzled cancer researchers with its discriminating ability. The molecule appears to kill off many types of cancer cells while leaving normal cells unscathed. And it does the job without any measurable toxicity in animals. Not surprisingly, drug developers have been eager to move the protein into human trials. But now TRAIL's march to the clinic may have hit a roadblock.

    In work reported in this month's issue of Nature Medicine, a team led by cell biologist Stephen Strom at the University of Pittsburgh in Pennsylvania found that, previous studies notwithstanding, TRAIL kills normal human liver cells, both in culture and in tissue slices from normal individuals and patients with hepatitis or other liver diseases. “The data are clear and convincing,” says molecular biologist Shigekazu Nagata of Osaka University Medical School in Japan, who wrote an accompanying commentary on the study. “TRAIL can kill human [liver cells], but it does not kill monkey or mouse liver cells.”

    If the results hold up, they could dash plans by the biotech firms Genentech in South San Francisco and Immunex in Seattle to jointly develop the potential anticancer drug, dubbed TRAIL/Apo2L. The study also points up a possible pitfall in the standard drug-discovery strategy. Researchers test potential drugs in a variety of cells, but they rarely use healthy human liver cells, says Strom, who is a liver expert, because the cells are very hard to obtain. Now, he says, “we have to be cognizant that we might miss some things. There is not really much in these preclinical studies [of TRAIL] that could have been done differently.”

    Strom did not set out to study TRAIL's role as a cancer drug. With graduate student Minji Jo and their colleagues, he was trying to find the molecular players that bring about liver cell death in diseases such as hepatitis and alcoholic cirrhosis. Previous work had shown that TRAIL kills cancer cells by triggering a form of cell suicide called apoptosis, and the team wanted to find out whether TRAIL-induced apoptosis might also be involved in the premature death of diseased liver cells.

    Much to their surprise, the researchers found that TRAIL readily induces apoptosis in cultured liver cells from both sick individuals and healthy liver donors. Because the protein had never before been shown to trigger death in animal liver cells, “we thought [the result] was a fluke,” perhaps an artifact of the team's culture conditions, recalls Strom.

    Further work ruled out that possibility. For example, the team found that TRAIL also induces apoptosis in cells anchored in liver slices. And, in accord with previous findings, the researchers found no effect of TRAIL on cultured human epithelial cells derived from liver tissue or on rat, mouse, or monkey liver cells. “The animal cells are resistant in the culture dish, in whole livers, and in the live animal,” Strom sums up. “This was not a culture artifact.”

    So why do human liver cells remain susceptible to TRAIL when other normal cells appear to fend the molecule off? Work a few years ago suggested that tumor cells succumb because they, unlike normal cells, fail to make a so-called decoy receptor that sops up TRAIL and prevents it from triggering the receptor that, in turn, programs cell suicide (Science, 8 August 1997, pp. 768, 815, and 818). But Strom and his colleagues found no differences between human liver cells and others in the production of either the decoy or regular TRAIL receptors. “This is very difficult to explain,” says Nagata, whose own work focuses on apoptosis. “Perhaps, some factor is produced by mouse or monkey hepatocytes but not humans' that can inhibit TRAIL.”

    Researchers at Immunex and Genentech think there might be a simpler and, from their view, more favorable explanation for the Pittsburgh team's findings. Often, especially in the standard bench-top purification method used by Strom's group, TRAIL proteins clump together, forming “multimers” that are more toxic to cells than more natural preparations, says Douglas Williams, executive vice president and chief technology officer at Immunex. “The studies being reported are interesting,” he adds, “but they have not been performed with the material we have been producing here at Immunex and Genentech.”

    To see whether that accounts for the discrepancy between the current observations and those made previously, Strom and the Immunex-Genentech workers will exchange their TRAIL preparations, and each group will redo the tests with the other's material. Until the results are in, TRAIL's fate will hang in the balance. “There is no magic drug for cancer at the moment,” Nagata says.


    Grants Evoke Squeals of Delight and Anger

    1. Jeffrey Mervis

    It's easy to understand why the “Three Little Pigs” is David Ayares's favorite children's story. On 14 March Ayares's company, PPL Therapeutics Inc. of Midlothian, Scotland, surprised the scientific community by announcing that it had successfully cloned a pig from an adult cell, taking a small but vital step toward its goal of mass-producing organs to be transplanted into humans.

    Last week Ayares told a version of the story to an audience at the U.S. National Academy of Sciences, recounting how a $2 million grant last fall from a beleaguered federal research program had rescued his U.S.-based project. The grant came just in the nick of time, he said, as the company was ready to pull the plug because success seemed too far away. But unbeknownst to Ayares, at the very moment he was relating his story, a Big Bad Wolf on Capitol Hill was trying to blow his house down.

    What Ayares was praising, and what Representative James Sensenbrenner (R-WI) was criticizing, was the Advanced Technology Program (ATP) run by the National Institute of Standards and Technology (NIST). Created by a 1988 law aimed at making U.S. companies more competitive in global markets by funding innovative research with potentially high payoffs, ATP for the last decade has been a $1.5 billion political litmus test for whether the government should subsidize corporate research. A hard-earned truce in recent years has left ATP with an annual budget of about $200 million, far below what the Clinton Administration has requested in most years but much more than many Republicans say it deserves.

    Last week Sensenbrenner, chair of the House Science Committee, tried to shatter that truce by releasing a report (GAO/RCED-00-114) questioning ATP's ability to plow new ground. ATP is violating its mission by backing projects that “addressed similar research goals to those already being funded by the private sector,” the General Accounting Office (GAO) concluded after examining three ATP awards. Although the auditors concede that company scientists pursued “unique technological approaches” in carrying out the funded work, Sensenbrenner complained in a press release that ATP was “duplicating private research and shortchanging taxpayers.”

    ATP supporters say that the report's methodology and conclusions are badly flawed. NIST Director Ray Kammer says its definition of “similar research” is so broad that “by that doubtful criterion we would shut down federal research on cures for a host of diseases” and many nonmedical projects. One participant in last week's academy meeting saw the report as old wine in new bottles: “It's part of a continuing battle by those who dislike ATP.”

    For Ayares, PPL's vice president for research, the idea that his ATP grant is duplicative ignores both the formidable technological challenges his pig xenograft team faced and the fact that the company was ready to jettison the project. “We were going down,” Ayares told his audience. The company laid off a third of its research staff in Blacksburg, Virginia, and shut down complementary projects there involving cows and rabbits. Ayares could read the writing on the wall: “My house was on the market,” he confesses.

    Instead, 5 months after receiving the grant, the company announced the Caesarian birth of five piglets cloned from adult cells. Big Pharma companies and venture capitalists that had once shunned the work as too speculative are now calling him on the phone and begging him to do a deal, Ayares says. The company's board of directors “now agrees to support us for as long as it takes” to find support for clinical trials and commercialization of the technology, he crows. “Without ATP,” he says, “there would be no cloned pigs, only 4 years of intellectual property.”


    Coaxing Shy Particles Into an Atomic Jar

    1. Alexander Hellemans*
    1. Alexander Hellemans writes from Naples, Italy.

    One major frustration the universe inflicts on physicists is a serious shortage of antimatter. It should be fascinating stuff—looking-glass atoms with negatively charged nuclei surrounded by clouds of positive charge, and with who knows what bizarre properties—but no one has ever made enough of it to tell.

    That could change, thanks to a new method a Dutch-American team has demonstrated for recombining free electrons with ions to form atoms. In the 24 April Physical Review Letters, researchers led by Bart Noordam of the FOM Institute for Atomic and Molecular Physics (AMOLF) in Amsterdam describe how they enabled rubidium ions to trap electrons by applying a pulsed electric field in a series of steps similar to the way a child traps an insect in a jar. The team claims the technique can be used to produce atoms of antihydrogen, the simplest form of antimatter, in greater numbers than ever before.

    “So far there is no proven way to make these atoms, so any additional suggestion is certainly welcome,” says Theodor Hänsch of the Max Planck Institute for Quantum Optics in Garching, Germany. Thomas Gallagher of the University of Virginia in Charlottesville agrees wholeheartedly. “A few years ago, I would have thought this is completely nuts,” he says. “I think it is a nifty trick.”

    To create antihydrogen, scientists must nudge a positron—the antiparticle of an electron—into a quantum-mechanical dance around a negatively charged antiproton. Between 1995 and 1997, physicists at CERN, the European particle physics laboratory near Geneva, and at the Fermi National Accelerator Laboratory near Chicago forged 108 atoms of antihydrogen in particle accelerators, but all perished in collisions with accelerator walls within billionths of a second. Because of the short life-span, Noordam says, “this was not a method that ever could lead to antimatter that we actually could study.”

    Researchers tried to extend the longevity of antihydrogen by catching it in Penning traps, devices that slow down and confine atomic particles through an interplay of electric and magnetic fields. In 1996, physicists at CERN tried bringing together antiprotons and positrons in such a trap, but the energetic positrons just flew past the antiprotons without forming atoms. “You can compare this to the flyby of a spacecraft around a planet,” Noordam says. If the spacecraft is moving too fast, it will swerve past the planet and out into space. To put it into orbit, you must first reduce its energy.

    Scientists are investigating several braking maneuvers for a positron approaching an antiproton. Hänsch and colleagues plan to use laser light to stimulate the positron to emit a photon, jettisoning enough energy for the antiproton to capture it. Another team, led by Gerald Gabrielse at Harvard University, is investigating a method called three-body recombination, in which the positron transfers some of its energy to another, onlooking positron. Researchers plan to test those methods and others when CERN's new Antiproton Decelerator becomes available for experiments later this year, but the yield of antiatoms is still expected to be very small.

    Noordam and Kees Wesdorp at AMOLF think their new technique can do much better. As stand-ins for positrons and antiprotons, they shoot electrons into tiny clouds of rubidium ions. The speeding electrons swerve toward the oppositely charged rubidium ions. Before the electrons can swing past the ions and escape, an electric field decelerates them and turns them back. Then, suddenly, the field is switched off, leaving some of the stalled electrons easy prey for the rubidium ions to capture into wide orbits. Researchers compare the method to the way a child coaxes a wasp into a jar, then clamps on the lid just as the wasp turns around to escape.

    The AMOLF team obtained three atoms for every 1000 free ions, more than 100 times better than other methods, Noordam says. “This is very efficient, and this is impressive,” says Hänsch's collaborator, Joachim Walz.

    Next, the Dutch experimenters plan to replace the rubidium ions with hydrogen nuclei—protons. Theoretical calculations show that the capture step should work just as well for hydrogen as it does for rubidium, says another team member, theoretical physicist Francis Robicheaux of Auburn University in Auburn, Alabama. If all goes well, next year the team hopes to apply the technique at CERN to create antimatter.

    Abundant antiatoms would certainly give physicists much to think about. For example, general relativity holds that an atom and its antimatter counterpart should have identical masses, but some versions of string theory disagree. Serious discrepancies between predictions and observations could revolutionize theories of matter. “If you want to discover something interesting and unusual,” Gabrielse says, “you have to look at unusual places.”


    Stretching the Reign of Early Animals

    1. Richard A. Kerr

    To paleontologists, fossils are crucial, but so is time. Fossils tell the who of evolution, but for the how and why, paleontologists need to put together a story in which events occur in the right order and unfold at a realistic pace. Nowhere is that task more demanding than in the murky realm of the Ediacara, frond- or disc-shaped blobs of living matter that preceded the more familiar (if still bizarre) creatures of the Cambrian explosion 543 million years ago. Cryptic Ediacaran fossils are hard enough to connect to later animals, but ordering them in time has been if anything harder. Groping for firm dates, paleontologists had been forced to rely on wiggles in the changing isotopic composition of the rock encasing Ediacara.

    No more. On page 841 of this issue of Science, geochronologist Mark Martin of the Massachusetts Institute of Technology and his colleagues report that they have determined the age of the most diverse Ediacaran fauna through highly precise uranium-lead radiometric dating of a layer of volcanic ash. The technique—which overthrows isotopic wiggle counting as the best means of keeping Ediacaran time—has produced a surprisingly early date. It not only doubles the length of the late Ediacaran reign but also revises a strange and mysterious chapter in the early history of life.

    The oldest, simplest Ediacara are found in the Mackenzie Mountains of northwest Canada in rocks thought to be about 600 million years old, but the most diverse and complex Ediacara seemed to come 50 million years later. That's right in a narrow window of time—the last 6 million years before the Neoproterozoic era gave way to the evolutionary commotion of the Cambrian—when complex, mobile organisms, including perhaps the last common ancestor of modern multicellular life, emerged.

    For all their tantalizing significance, though, the Ediacaran layers offer few distinctive fossils to serve as bookmarks in the geologic record. Instead, for more than 5 years researchers have pegged their geologic calendars to the ratio of two isotopes of carbon. The ratio changes with shifting environmental conditions, such as ice ages and varying biological productivity.

    On either side of the late Neoproterozoic flowering, the carbon-isotope ratio rises and falls in roller coaster-like spikes and troughs. In some rocks from the latest part of the Neoproterozoic, however, it appears to level off around values of +1 to +2 per mil. Taking the coincidence at face value, paleontologists tried assuming that all of the most diverse deposits of Ediacara—even those lacking carbon ratios, such as the original find in the Flinders Ranges of South Australia—were the same age as those of the “+2 plateau.” It seemed to work.

    Now a volcanic ash bed has blown that tidy assumption sky-high. Martin and his colleagues found the bed in the Ediacara-bearing cliffs along the coast of the White Sea, 100 kilometers northwest of Arkhangelsk and 1100 kilometers north of Moscow. Once an ash bed is found—and more and more are turning up as geochronologists join paleontologists in the hunt—the key is separating out the mineral zircon. By measuring the amount of lead produced in zircon grains through the steady radioactive decay of uranium, geochronologists can determine the age of zircon formed in the Neoproterozoic with a precision of better than 1 million years. Extracted and analyzed, the White Sea zircons yielded an age of 555.3 ± 0.3 million years.
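    The arithmetic behind such a date is the exponential decay law: the age follows from t = ln(1 + ²⁰⁶Pb/²³⁸U)/λ. A minimal sketch in Python, assuming the standard ²³⁸U decay constant, a closed system, and no initial lead (real zircon geochronology also cross-checks against the independent ²³⁵U/²⁰⁷Pb pair):

```python
import math

LAMBDA_U238 = 1.55125e-10  # standard decay constant of uranium-238, per year

def u_pb_age(pb206_u238_ratio):
    """Age in years from a measured 206Pb/238U atomic ratio,
    assuming a closed system with no initial lead."""
    return math.log(1.0 + pb206_u238_ratio) / LAMBDA_U238

def ratio_at_age(t_years):
    """Inverse relation: the 206Pb/238U ratio accumulated over t_years."""
    return math.exp(LAMBDA_U238 * t_years) - 1.0

# A zircon that crystallized 555.3 million years ago carries a
# 206Pb/238U ratio of roughly 0.09; inverting it recovers the age.
r = ratio_at_age(555.3e6)
print(round(r, 4), round(u_pb_age(r) / 1e6, 1))  # ≈ 0.09, ≈ 555.3
```

    Because the ratio grows measurably on million-year timescales, small analytical errors in Pb/U translate into the sub-million-year age precision the article cites.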

    “The date is exciting,” says paleontologist Guy Narbonne of Queen's University in Kingston, Ontario. “It's nearly 10 million years older than we might have expected for the Flinders, Australia, Ediacara, yet [the two Ediacaran faunas] are indistinguishable” in diversity and complexity. Apparently, the assumption paleontologists had drawn from the carbon isotopes—that the most diverse Ediacaran fossils have similar ages—does not always hold, says Narbonne.

    This first absolute date among the most diverse Ediacara changes the history of life by pushing back the emergence of large, complex organisms. One of those creatures may have been the long-sought animal whose descendants split into the two great lineages of modern life: the protostomes—mollusks, annelids, and arthropods—and the deuterostomes—the echinoderms and chordates. Obviously, “high-precision uranium-lead dating will be extremely critical in sorting out Neoproterozoic evolution,” says Narbonne. Keep an eye peeled, paleontologists.


    Backward Heat Flow Bends the Law a Bit

    1. Mark Sincell*
    1. Mark Sincell is a science writer in Houston.

    When the teachers go on strike during an episode of The Simpsons, the 8-year-old overachiever Lisa burns up her excess energy by creating a perpetual motion machine. Her father, Homer, is furious. “In this house,” he barks, “we obey the laws of thermodynamics!” Most houses do. But now a space-borne experiment has for the first time accomplished what the second law of thermodynamics seems to forbid: transfer of heat from a cold surface to a hot liquid.

    The groundbreaking experiment was carried out onboard the Mir space station last year as part of the French-Russian Perseus mission. By warming a copper-and-sapphire-walled cell filled with a drop of liquid sulfur hexafluoride and one tiny bubble of gaseous sulfur hexafluoride in near-zero gravity, scientists triggered a slight compression of the bubble. That gentle squeeze raised the temperature of the gas above that of the cell walls. For this to happen, heat must have been transferred from the cooler walls to the hotter gas, scientists report in the 1 May Physical Review Letters.

    It's a weird result—but then, the sulfur hexafluoride on Mir was a weird fluid. Common sense tells us that liquids are different from gases. Liquids are dense, gases are light. Molecules in a liquid are weakly bound together; gaseous molecules are free. But at a certain critical temperature and density—45.54°C and 0.737 grams per cubic centimeter for sulfur hexafluoride—those distinctions disappear. “It can't decide if it is a gas or a liquid,” says John Hegseth, a physicist at the University of New Orleans in Louisiana who took part in the experiment. “It is both and neither.” And once a fluid reaches the critical point, you can forget all the intuition you developed by watching water boil.

    In a normal liquid, heat is transferred in one of three ways. First, it can diffuse from hot regions to cold regions. This is what happens when you heat a pan of water on the stove. Second, hot bubbles of gas or liquid can rise up against the pull of gravity and deposit heat in the cooler, overlying liquid. This process is called convection, and it creates the familiar bubbles in boiling water. There is also a third, little-known way to transfer heat: the piston effect. When the liquid surrounding a bubble is heated, it expands, which compresses the bubble and warms the gas. Diffusion and convection both transfer heat so rapidly that the piston effect is usually negligible.
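    As a rough analogue of that compression heating, the ideal-gas adiabatic relation T₂ = T₁(V₁/V₂)^(γ−1) shows how a quick squeeze warms a gas before heat can leak away. This sketch is qualitative only: γ = 1.4 is a generic diatomic value, not that of sulfur hexafluoride, and a fluid near its critical point is far from ideal.

```python
def adiabatic_temperature(t_initial_k, compression_ratio, gamma=1.4):
    """Temperature of an ideal gas after rapid (adiabatic) compression
    from volume V to V / compression_ratio, i.e. T2 = T1 * r**(gamma - 1).
    Illustrative analogue only; near-critical SF6 behaves very differently."""
    return t_initial_k * compression_ratio ** (gamma - 1.0)

# Squeezing the gas by 10% warms it by roughly 4% before any heat
# can diffuse back out; that transient warming is the piston effect.
print(adiabatic_temperature(300.0, 1.1))  # ≈ 311.7 K
```

    In an ordinary fluid this warming is quickly erased by diffusion and convection; near the critical point, where the fluid is extremely compressible, it can briefly dominate.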

    Things are different near the critical point. As the sulfur hexafluoride in the cell approaches it, bubbles form in the liquid. The two-phase fluid also becomes acutely compressible—in other words, squishy. Heat from the cell walls raises the liquid pressure, instantaneously compressing and heating the gas in the bubbles. Because the fluid is so squishy, the piston effect heats the gas much faster than diffusion can carry energy back to the liquid, and the bubble temperature overshoots that of both the wall and the liquid. In the microgravity of Mir, there is no convection to compensate for the temperature imbalance.

    Within seconds of raising the cell wall temperature, Hegseth's team found that the bubble temperature had risen 23% above that of the wall. Although ground-based experiments and computer simulations had suggested that the piston effect might cause such a rise, scientists were surprised by the magnitude of the change. “They called me right away and asked me to check the result,” says Fong Zhou, a physicist at the Jet Propulsion Laboratory in Pasadena, California. Zhou quickly modified the computer code he had used to simulate the piston effect on the ground. After removing the effect of gravity, his simulation reproduced the Mir experiment results almost perfectly, Zhou says.

    Despite appearances to the contrary, the experiment didn't really break the second law of thermodynamics, Hegseth says. Strictly speaking, the law applies not to changes in temperature but to changes in entropy—a related but different property. What's more, it applies only to systems in thermodynamic equilibrium. Inside the cell on board Mir, conditions were changing so fast that they left equilibrium behind. Only temporarily, however: After about 2 minutes, the second law reasserted itself, and the bubble cooled back down to the same temperature as the wall and the fluid. The counterintuitive heat flow was a transient temperature overshoot, Hegseth says—no lapse, no miracle, just “a weird way to get back to equilibrium.”


    New Minister Hopes to Create an Italian NIH

    1. Michael Balter

    One of Italy's top cancer researchers has been appointed health minister in the new government of Prime Minister Giuliano Amato, raising hopes that the nation's sagging biomedical research effort might receive a badly needed boost. Umberto Veronesi, founder and scientific director of the European Institute of Oncology (EIO) in Milan, was tapped last week to replace outgoing minister Rosy Bindi. His appointment is a break from the tradition of naming politicians rather than scientists to the post.

    One of Veronesi's major tasks, he says, will be to reorganize the system for funding Italian biomedical science, which badly trails that of many other Western countries (Science, 3 July 1998, p. 49). And he must work quickly: Amato's caretaker government—Italy's 58th since the end of World War II—will face a general election in March or April 2001. “The prime minister said I have just 1 year,” Veronesi told Science. “That is why it is better to have an expert than a politician as health minister during this period.”

    Not wasting any time, Veronesi has begun crafting a 10-year plan to beef up Italy's biomedical research institutions. A key pillar is the creation of a public agency—similar to the U.S. National Institutes of Health (NIH)—that would provide extramural grants for fundamental biomedical research. Such an agency does not currently exist in Italy, where most of the roughly $250 million the government spends each year on biomedical research goes to institutes run by the health and research ministries. Although it has continued to support a few outside programs, such as AIDS research, the Italian government has mostly weaned biomedical scientists from direct government grants. Even the National Research Council, once an important source of such grants, has recently restricted its funding mostly to its own institutes (Science, 30 October 1998, p. 855).

    As a result, biomedical research has become increasingly dependent on funding from charities and industry. Veronesi's own EIO is no exception: It gets most of its support from Italian cancer charities and private donations. In addition to reinstating some government grants, researchers are hoping the new minister will encourage more private funding. “Veronesi has campaigned for tax deductions for private donations to cancer research,” says gene therapy researcher Claudio Bordignon, scientific director of the private San Raffaele Institute in Milan. “This would be a major change for Italian science and should be extended to all biomedical research.”

    Whether or not Veronesi gets a chance to design his reform package and sell it to Amato and the Italian Parliament depends on the current government lasting until next spring's legislative elections. The perpetual crises of Italian politics could bring the government down at any time, leaving the new minister's reforms stillborn. But if Veronesi does hang on, many researchers have high expectations for what he might accomplish. “In a year, one can do many things,” says Silvio Garattini, scientific director of the Mario Negri Institute for Pharmacological Research in Milan.


    Superbugs on the Hoof?

    1. Dan Ferber*
    1. Dan Ferber is a writer in Urbana, Illinois.

    Disturbing new findings have provided a key link in the chain of evidence connecting antibiotics used on livestock to outbreaks of disease caused by antibiotic-resistant human pathogens

    When the severe diarrhea didn't stop after nine awful days, the 62-year-old Danish woman dragged herself to the emergency room at Bispebjerg Hospital in Copenhagen. The diagnosis was a cinch: food poisoning from Salmonella. Doctors rolled out their big gun, an antibiotic called ciprofloxacin that can vanquish the nastiest Salmonella strains in a few days. But as the hours passed, the infection worsened—becoming so bad that the Salmonella punched a hole in her colon, allowing it and other bacteria to invade the rest of her body. As the situation grew desperate, doctors blasted her with heavy doses of two more antibiotics and stitched up her damaged colon. The drugs knocked off the Salmonella, but other escapees from the gut sent her into septic shock; one by one, her organs failed. Four days after doctors realized the Salmonella was impervious to ciprofloxacin, she was dead.

    The Danish woman was not the first person to succumb to a superbug resistant to antibiotics. But she and another Salmonella victim in the summer of 1998 put a human face on an alarming trend: pathogens rapidly acquiring resistance to drugs that are similar to antibiotics used for years to treat livestock. In a nice piece of detective work, a team led by microbiologist Henrik Wegener of the Danish Veterinary Laboratory in Copenhagen traced the drug-resistant strain of Salmonella to infected swine. To fight Salmonella outbreaks, some farmers had been dosing herds with enrofloxacin. It turns out that this drug and ciprofloxacin belong to a class of compounds called quinolones that gum up bacterial machinery for replicating DNA. The researchers traced the deadly strain to contaminated pork products from a single Danish herd. The findings, reported last November in The New England Journal of Medicine (NEJM), are the strongest indictment yet implicating livestock antibiotics in human deaths. Says microbiologist Abigail Salyers of the University of Illinois, Urbana-Champaign: “It's the closest that anybody has come to a smoking gun.” And just last week, researchers reported evidence linking a case involving a resistant Salmonella strain in the United States to the use of animal antibiotics.

    For decades farmers have mostly had free rein in dosing livestock with antibiotics to treat illnesses, prevent infections, and fatten animals on less feed. With evidence mounting that this unfettered practice can spawn new superbugs, agencies worldwide are beginning to clamp down on antibiotic use in agriculture. The European Union has issued new rules limiting the use of several livestock antibiotics, while the U.S. Food and Drug Administration (FDA) has proposed similar regulations.

    The moves have riled industry officials, who argue that antibiotics are essential to keeping animals healthy and the food supply safe. They contend that regulators and public health activists are blowing the problem out of proportion. The most serious threat, they point out, comes from indiscriminate use of antibiotics in people, not livestock. “We're not saying there isn't any concern,” says Richard Carnevale of the Animal Health Institute, which represents U.S. animal-drug producers. “But in the whole scheme of things, we believe that it's relatively minor.”

    A growing number of scientists, however, are taking the threat quite seriously, as is the British Royal Society of Medicine, which brought experts together in Washington, D.C., this week to brainstorm on the issue and to educate the public. Although drug use on the farm may have little to do with drug-resistant tuberculosis or other pathogens transmitted from person to person, it “has everything to do with bacteria acquired through the food supply,” says medical epidemiologist David Bell of the U.S. Centers for Disease Control and Prevention (CDC).

    Gut reaction

    The case against antibiotic use in livestock rests largely on drug resistance observed in food-borne pathogens such as Salmonella or Campylobacter, which often infect animals without causing symptoms. First, microbial sleuths must link a livestock antibiotic to a drug-resistant strain. Next, they must show that the strain can survive the slaughterhouse. Finally, to cement the connection to human illness, they must prove that eating tainted meat leads to an infection that defies antibiotic treatment. The last link is the hardest to verify. “That's where the chain of evidence starts to get frayed,” Salyers says.

    Luckily for the Danish team, the deadly bug did not take them by surprise. It's a variant of Salmonella typhimurium DT104, a strain that resists five common antibiotics and had flared up in many European countries—but rarely in Denmark. Hoping to keep it at bay, Danish officials set up in 1997 what Wegener calls the world's most aggressive surveillance system for resistant Salmonella. They test for drug resistance in every Danish patient who sees a doctor for a Salmonella infection, about 3200 people a year; in roughly 1 million samples of meat shipped each year to grocery stores; and in nearly every flock of chickens and herd of pigs—the usual sources of Salmonella that infect people—raised for the market.

    When word came on 18 June 1998 that a quinolone-resistant strain had shown up in a hog slaughterhouse on the island of Zealand, Kåre Mølbak of Copenhagen's Statens Serum Institute leaped to action. By coincidence, earlier that day his team had identified samples of a vicious DT104 strain in five Danish patients. This strain, and the one in the slaughterhouse, beat back the same seven drugs—two more than other DT104 strains. Mølbak's team started phoning patients, slaughterhouse workers, and meat wholesalers. By nightfall they learned that all the patients had bought pork from shops supplied by the Zealand slaughterhouse.

    “That made the hypothesis simple,” says Wegener, who was consulting with Mølbak throughout the day: The pork was the source of the super DT104. Records later showed that just one of the 37 herds slaughtered for that shipment was infected by the resistant strain. Although the herd had not been treated with quinolones that year, others on nearby farms had, and Salmonella can easily jump from herd to herd, Wegener says. What's more, DNA fingerprinting showed that the drug-resistance genes in the patients were identical to those in the pigs.

    Now, another case, reported in last week's NEJM, suggests that a second new drug against Salmonella has been compromised by a livestock antibiotic. Bacteriologist Paul Fey of the University of Nebraska Medical Center in Omaha and his colleagues described how a 12-year-old Nebraska boy became infected by the same Salmonella strain as had cows on his father's ranch. Apparently, says Fey, the cows had contracted the resistant Salmonella from a ceftiofur-treated herd on another ranch. When the boy became ill, doctors treated him with ceftriaxone, an antibiotic similar to ceftiofur. The boy recovered—but not thanks to the ceftriaxone, which hardly dented the Salmonella. The strain was resistant. That worries Fey, as doctors already have their hands tied when treating childhood Salmonella: Front-line quinolones can't be used because they impede bone development. That's why Fey predicts “dire consequences” for children's health if the ceftriaxone-resistant strains spread.

    Deadly pas de deux. Usually, bacteria called enterococci dwell peaceably in our gut. But new findings suggest that drug-resistant bacteria in contaminated food or water can slip enterococci the genes conferring resistance. For people with weakened immune systems who are susceptible to enterococcal infection, drug-resistant enterococci pose a grave threat.


    Even in the absence of a direct link between farm and table, circumstantial evidence can be damning. Spurning the CDC's advice, the FDA in 1994 approved the use of quinolones for preventing infections of an intestinal bacterium, Campylobacter jejuni, in poultry. Since then, the percentage of quinolone-resistant Campylobacter cultures in routine samples from people has skyrocketed from 1% to 17% in just 7 years, Kirk Smith of the Minnesota Department of Public Health and his colleagues revealed last year in NEJM. Campylobacter very rarely kills people, Smith says, “but it's a severe enough illness that you're just hating life.” And it shows that Salmonella isn't the only bug to worry about.

    Fomenting resistance

    A more insidious threat comes from pathogens passing their genetic know-how to bacteria in our gut. Resistance develops like this: Under an antibiotic blitz, a tiny fraction of any population of otherwise-susceptible bacteria can survive, because they possess mutations—acquired randomly or from other bacteria that slip them rings of DNA called plasmids—that counter an antibiotic's effects. Because different strains swap genes routinely, and even different species exchange genes from time to time, one bug can gird itself with another's resistance genes.

    In a particularly troubling example, gene transfer appears to have turned gut-dwelling bacteria called enterococci into a public health threat. Most of the time, people and their intestinal bacteria get along fine, but enterococci can infect hospitalized patients with weakened immune systems, replicate like mad, and trigger the immune system to go haywire, which poisons organs. Because enterococci have developed resistance to most antibiotics, systemic infections often rage unchecked, killing thousands of people each year. A powerful drug called vancomycin can stop susceptible enterococci in their tracks. But vancomycin-resistant strains are spreading in Europe—even though the antibiotic is new to the European scene.

    Vancomycin resistance already is a huge problem in the United States. The drug has been used for years in hospitals, allowing bugs to develop resistance. But that doesn't explain why resistance is cropping up in healthy people in Europe, says microbiologist Wolfgang Witte of the Robert Koch Institute in Wernigerode, Germany. The evidence suggests another reason: In 1974, European regulators approved the use of avoparcin, an antibiotic that, by an unknown mechanism, makes livestock grow fatter on less feed. Avoparcin and vancomycin kill bacteria by blocking an enzyme essential for building the cell wall. Not surprisingly, enterococci in livestock that resist avoparcin also can withstand vancomycin. Despite strict procedures, enterococci from the gut on occasion infect meat during slaughter. If a person eats undercooked meat tainted with resistant enterococci, the livestock strain can transfer to the human strain the genes conferring resistance to vancomycin. Alarmed by the potential link between avoparcin and vancomycin resistance, the European Union banned avoparcin in 1997.

    A team led by molecular microbiologist Rob Willems of the Dutch National Institute of Public Health and the Environment in Bilthoven has evidence that such gene transfers are occurring. The researchers found identical sequences of transposons—DNA snippets that can jump from one bacterium to another—with identical resistance genes in enterococci from people and from pigs. These transposons are different from ones found in resistant enterococci from cows, chickens, and turkeys, the researchers report in the March 1999 issue of Antimicrobial Agents and Chemotherapy. That suggests the resistance moved from pigs to humans, Willems says.

    One obvious question is whether cutting down on antibiotics on the farm can throttle resistant pathogens. New research suggests that's the case. In a report in the March issue of the Journal of Antimicrobial Chemotherapy, a group led by veterinary microbiologist Anthony van den Bogaard of the University of Maastricht in the Netherlands showed that by 1999, 2 years after avoparcin was taken off the market in Europe, the prevalence of vancomycin-resistant strains in pigs, chickens, and people in the Netherlands had dropped to half the 1997 levels.

    Culling the antibiotic herd

    To guard against the nightmare of animal enterococci or other bugs planting the seeds of resistance in more dangerous pathogens, governments worldwide are cracking down on the use of drugs in livestock. The first moves came after a British panel in 1969 recommended banning growth-promoting antibiotics that spur resistance to drugs used in human medicine. The panel's advice was partly heeded in Europe, where key antibiotics like penicillin and tetracycline were taken out of agricultural use in the 1970s.

    But it has been mostly ignored in the United States, where industry officials insist that antibiotics keep animals healthy and thus safeguard the food supply. “While there's a theoretical link [between resistant strains in livestock and people], we think that there's so many things that need to happen that the risk is diminishingly small,” says the Animal Health Institute's Carnevale. Even if antibiotic use on the farm does pose a threat, it pales in comparison with the scourge of resistance from human medical practices, says Robin Bywater, a Reigate, U.K., consultant for Pfizer, which produces animal antibiotics. Besides, Bywater says, the drug-resistant pathogens most dangerous to people—such as Staphylococcus aureus or the tuberculosis bacterium—do not infect livestock.

    The main skirmishes now are over the practice of using low doses of antibiotics to make livestock fatter on less feed. Three years ago, the World Health Organization argued for phasing out use of antibiotics for this purpose, if the animal drugs are used in people or breed resistance to human drugs that work by a similar mechanism. The European Union agreed and banned the use of avoparcin and four other drugs as growth promoters.

    The FDA, meanwhile, has given mixed signals regarding so-called subtherapeutic uses—growth promotion and illness prevention—of animal antibiotics. In a series of actions since 1994, the agency has approved the use of quinolones to treat and prevent infections in poultry and beef cattle. But in 1998 it floated draft regulations that would raise the bar for all uses of new animal antibiotics. The regulations would require companies to carry out resistance studies before and after a drug's approval, and to pull any drug from the market if the target bacteria develop resistance to human antibiotics. “We're most concerned about those pathogens for which the disease is serious in humans and for which the drug we're considering may be the drug of last resort,” such as quinolone-resistant Salmonella and vancomycin-resistant enterococci, says Stephen Sundlof, director of the FDA's Center for Veterinary Medicine. “The only scientific way we have to do it is to look at it on a case-by-case basis.”

    Congress, however, may prod the FDA into a more aggressive stance. A bill introduced last year by Representative Sherrod Brown (D-OH) would order companies to discontinue using seven antibiotics for any reason other than to treat illness in animals—unless the industry proves that the drugs won't harm human health. Brown hopes the bill, opposed by the agriculture industry, will pass in 2 or 3 years. “The burden should be on the drug industry to prove that they are safe, not on the FDA to prove 100% that they are unsafe,” he told Science.

    Researchers are also trying to provide industry with alternatives to antibiotics that can keep livestock healthy. These include probiotics, in which healthy gut bacteria are infused into animals before they are weaned to crowd out pathogens; vaccines; and animal husbandry practices that prevent infections from spreading from farm to farm. Although these possibilities hold promise, “there is no one magic bullet” in the pipeline, says microbiologist Paula Fedorka-Cray of the U.S. Department of Agriculture's Russell Research Center in Athens, Georgia.

    Wegener and others believe that U.S. regulators must follow the lead of their European counterparts and act quickly to get livestock antibiotics off the market for uses other than treating sick animals. Otherwise more outbreaks like the one in Denmark could occur, he says, adding, “I have difficulty understanding why we should take that risk.”


    Indian Schools Cash In on Silicon Valley Wealth

    1. Pallava Bagla

    A $1 billion campaign to shore up the elite Indian Institutes of Technology is part of a tidal wave of philanthropy that hopes to raise up Indian higher education

    New Delhi and Hyderabad—The 13 CEOs who gathered in December for a meeting with Indian Prime Minister A. B. Vajpayee had all made their fortunes in Silicon Valley, California. Born and raised in India, and educated at one of the six elite Indian Institutes of Technology (IIT), they had decided it was time to give something back to their native country. In particular, they had come to seek Vajpayee's blessing for a bold plan to raise $500 million from wealthy IIT alumni like themselves for new buildings, equipment, and programs at their alma maters.

    The prime minister was delighted, but urged them to think even bigger: “He said, why not pick up the entire tab?” recalls Kanwal S. Rekhi, the retired technology chief at Novell Inc., who now runs a small California company, IndUS Entrepreneurs, that fosters collaborations between the two democracies. The startled executives ran the numbers and calculated the cost to modernize the institutes at $1 billion. Unfazed by the sum, the self-made multimillionaires said OK, but under two conditions: The money, to be raised over 6 years, would go directly to the IITs, and an independent board of trustees, made up of the country's business and academic elite, would replace the government as overseer. Vajpayee said he'd look into it.

    These high-tech tycoons are part of a startling new trend in India. Even as they begin passing the hat for the IITs, another group of Indian-born, U.S.-based software engineers led by Purnendu Chatterjee, managing director of the New York-based Chatterjee Group, wants to raise the same amount—$1 billion—to set up a series of world-class centers of higher learning in India (Science, 31 March, p. 2389). And individual expatriate Indians are donating to other high-tech causes back home. Although a few observers are grumbling privately that such largesse is a thinly veiled attempt by wealthy individuals to seize control of state assets, most view the charity drives as part of the long-awaited payback on India's massive brain drain of the last few decades. “We welcome the initiative,” says Science and Education Minister Murli Manohar Joshi, who says it recognizes the fact that India is fertile ground for growing future high-tech entrepreneurs. That receptivity is a change from the past, when rules prevented such individuals from making direct donations to universities and there was a feeling that it was not right for public institutions to accept private support.

    The high-tech executives are already making an impact on their alma maters. In 1998 Rekhi gave $5 million to IIT-Mumbai to establish a school of information technology that has been named after him. Another successful computer scientist, Desh Deshpande, founder of Sycamore Networks of Chelmsford, Massachusetts, has pledged $100 million over the next 20 years to his alma mater, IIT-Madras.

    All this philanthropy could not have come at a better time for the IITs. They were formed shortly after Indian independence as “institutions of national importance,” but have struggled to keep up with the fast-changing and burgeoning fields for which they provide human capital. With enrollments up 35% to 40% in the past 4 years to an average of 2500, students endure packed lecture halls and overcrowded youth hostels. V. S. Raju, director of IIT-Delhi, estimates that the six IITs will need approximately $220 million in the next 3 to 4 years just to maintain existing facilities—an amount unlikely to come from the government, which provides roughly 80% of each institute's operational expenses.

    What makes this new wave of private donations especially remarkable is that there is no history in India of academic philanthropy from expatriates. As recently as 1994-95, the total alumni contribution to all the IITs was less than $250,000. But that was before the New Economy began turning entrepreneurs into megamillionaires. Last August, N. R. Narayanamurthy, head of the fund-raising committee for IIT-Kanpur and chair of Infosys Technologies Limited of Bangalore, raised $1 million during a single lunch meeting in San Francisco. Narayanamurthy himself gave another $2 million to the school for a new computer lab. The $3 million represents roughly 30% of IIT-Kanpur's annual operating budget. Efforts to help IIT-Mumbai have been even more successful, with alumni in Chicago pledging $22 million in the span of a few days in December. That's more than the institute's annual operating budget of $20 million.

    Many IIT alumni are content to give without asking anything in return, but others would like a bigger say in the way IITs are run. Although the prime minister's office had initially agreed with the idea of an independent board of trustees, Rekhi says, newspaper stories questioning whether the plan constituted a “takeover” that would threaten the institutes' independence have pushed the idea onto the back burner. The industry leaders say they never intended to seize control—“running the IITs is not such a good business,” says Rekhi—but only wanted to ensure that the money was used for the desired ends. “I do not think it will make IITs beholden to anybody,” says Nandan M. Nilekani, managing director of Infosys Technologies Limited in Bangalore, who so far has donated $1.4 million to IIT-Mumbai.

    While some wealthy benefactors are trying to shore up their alma maters, others are trying to set up de novo private institutions. The Global Institute of Science and Technology (GIST) would consist of six research-centered institutes offering undergraduate, graduate, and postgraduate courses. Each institute would enroll up to 2000 students, with any surplus funds plowed back into the facility. Some $300 million has already been pledged for the new institutes, whose proposal is currently before the influential Scientific Advisory Committee to the Indian Cabinet. Raghunath A. Mashelkar, director-general of the Council of Scientific and Industrial Research in New Delhi, says GIST addresses “a crying need” for another world-class research facility as well as additional training capacity.

    Although the IITs and GIST are receiving most of the attention, other institutions are also getting into the act. K. B. Chandrashekar, co-founder and chair of Exodus Communications of Santa Clara, California, has funded a $600,000 center for excellence in Internet and telecommunications at his alma mater, Anna University in Chennai, as well as a remote learning center for students to take courses at any of the school's four campuses. “I got all my education in India, and the first 7 years of my work experience was in India,” says Chandrashekar, who plans additional gifts.

    Indian government officials see the new wave of philanthropy as the next step in boosting the nation's economy. Vajpayee has talked about creating “Silicon Valley-like conditions in India so that promising young Indians can create world-class ventures while living and working in India.” Chandrashekar has an even more expansive vision: “By providing a better infrastructure and making it affordable for all people,” he says, “we can make India a superpower through its intellectual capital and not through war.”


    Money and Machines Fuel China's Push in Sequencing

    1. Li Hui*
    1. Li Hui writes for China Features. Additional reporting by Elizabeth Pennisi.

    China hopes that a heavy investment in genomics will help it to fight disease, foster economic growth, and tap its vast biological diversity

    Beijing: Yang Huanming, who directs the human genome center at the Chinese Academy of Sciences' (CAS's) Institute of Genetics, is still unhappy with a deal that two Chinese laboratories struck last spring with a foreign-owned company based in China. The labs agreed to pay Shanghai GeneCore Biotechnologies $225,000 to sequence a prawn virus that threatens China's lucrative shrimp industry. They also ceded one-third of the intellectual property rights on the 300-kilobase sequence. The labs had to pay this steep price, Yang says, because they lacked the capacity to do the work in-house.

    Negotiating from a position of weakness so rankled Yang that he and his colleagues lobbied the government for the capacity to do such sequencing jobs themselves. They prevailed: By the end of this month, Yang's center will have enough sequencing machines to process 15 million bases (Mb) of raw sequence a day. “That will give us the capability to do [such organisms] by ourselves,” he says proudly.

    A latecomer to the revolution in genomics, China is scrambling to catch up and become a major global player. Last year it joined the international consortium that is sequencing the human genome, agreeing to decipher 1% of the 3 billion base pairs. With all its new machines, sequencing those 30 million bases 10 times over will be relatively easy; the real work comes later, in piecing raw sequence into long stretches of DNA and filling in the gaps. The country is also stepping up investment in the scientific capacity to generate, analyze, and profit from genomic data. And to protect China's rights in any collaborative project with the public or private sector, the government is drafting new regulations on foreign access to the country's considerable genetic resources.
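    The workload behind that 1% commitment is easy to check with back-of-the-envelope arithmetic. The sketch below uses only figures quoted in this article (a 3-billion-base genome, roughly 10-fold coverage, and daily throughputs of 15 Mb now and a projected 50 Mb); real projects lose capacity to finishing, gap-closing, and downtime, so these are best-case numbers.

```python
# Back-of-the-envelope check of China's 1% human genome commitment,
# using only figures quoted in the article.

GENOME_BP = 3_000_000_000     # human genome, base pairs
SHARE = 0.01                  # China's 1% share
COVERAGE = 10                 # each base read ~10 times over

target_bp = GENOME_BP * SHARE          # 30 million bases to finish
raw_bp = target_bp * COVERAGE          # 300 million bases of raw sequence

for mb_per_day in (15, 50):            # current and projected throughput
    days = raw_bp / (mb_per_day * 1_000_000)
    print(f"At {mb_per_day} Mb/day: {days:.0f} days of raw sequencing")
```

    At 15 Mb a day the raw reads take about three weeks; the hard part, as the article notes, is the assembly and gap-filling that follow.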

    Chinese researchers welcome their country's entry into the field, but they are sharply divided over what path it should take. One group is pushing for the rapid sequencing of as many genomes—human, plant, and animal—as possible, while another is hoping to concentrate on finding genes and sequencing only those species with the greatest impact on human health and the environment. “We caught the last bus, but everybody is pleased,” says Yang, a prime mover in China's entry into the global human genome project, who is pushing an ambitious plan to sequence as much of China's vast biological diversity as possible. But others argue for a more targeted approach that uses sequencing selectively to explore specific disease genes, produce knockout animal models, and spur development of related fields such as bioinformatics. “Sequencing is not everything,” says Chen Zhu, an academician and hematology researcher at Rui Jin Hospital, Shanghai Second Medical University, who also heads the National South Genome Center in Shanghai.

    China began its human genome sequencing project in 1994, with 20 research groups supported by the country's National Science Foundation (NSF). The initial emphasis was on genomic diversity and the identification of genes implicated in the country's major diseases, including hepatocellular carcinoma, nasopharyngeal cancer, essential hypertension, and neurological disorders. In early 1998 the Ministry of Science and Technology (MOST) set up two human genome centers, one in the north, in Beijing, and one in the south, in Shanghai. The country's work on the rice genome began even earlier, when the CAS National Center for Gene Research was created in 1992. Researchers there finished a physical map at the end of 1996. The sequencing was originally scheduled to be completed in 2002, but Hong Guofan, head of the center, says it may be finished ahead of schedule.

    Yang's center, affiliated with CAS, was created in August 1998, and he and his supporters immediately began lobbying to join the global consortium. Their success, however, meant that China needed to ramp up its sequencing operations, including the purchase of foreign-made sequencing machines.

    That it has done. The CAS center, which is dedicated to China's 1% share of the human genome project, had been churning out 2 Mb daily by running its machines—three of the new, state-of-the-art MegaBACE1000 capillary machines made by Amersham-Pharmacia Biotech and 11 of the older ABI377 sequencers made by PE Biosystems—24 hours a day, 7 days a week. Last month, it installed another 30 MegaBACE1000 machines. Yang hopes to receive up to 100 more machines by the end of the year, raising his center's capacity for generating raw sequence to 50 Mb a day. The government's northern human genome center has five ABI377 sequencers and one of the latest ABI3700 machines from PE Biosystems, while the southern center has 10 ABI377s, two MegaBACEs, and one ABI3700. More sequencing machines will be added this year. While devoting a substantial portion of their capacity to the international sequencing project, the two centers are also sequencing other organisms, including the bacterium that causes the tropical disease leptospirosis.

    Although China's capacity is impressive, it is not yet a match for the United States and Europe. For example, Celera Genomics Corp. of Rockville, Maryland, which has the highest concentration of sequencing firepower anywhere, has 300 of the new PE Biosystems 3700 machines, which make use of the new capillary technology for faster, more automated sequencing. But once the new machines arrive at CAS, China's overall capacity will be comparable to what's available at major Western centers like Washington University in St. Louis and the Sanger Centre in the U.K.

    China's participation in the global human genome project has also forced it to grapple with the sensitive issue of access to data. Its human genome centers have followed the consortium's rules about depositing data immediately into a public database, but Chen's university lab also does proprietary work under contract with private companies. Chen also has a collaboration with the pharmaceutical giant SmithKline Beecham that will soon make public a large collection of expressed sequence tags, small pieces of expressed genes, from stem cells that give rise to blood cells and neuroendocrine tissue. “I think that the public sector and the private one can work together to accelerate the genome project as long as we strike a balance between protecting intellectual property and sharing data with the public,” says Chen.

    The sequencing work has been a boost to the entire field of biology, say officials. “The 1% project has stimulated the development of other disciplines and genome-related industries,” says Gu Jun, CEO of the northern center. The project has given Chinese microbiology a big boost, Chen adds, noting that it greatly hastened work on the leptospira genome and force-fed the field of bioinformatics.

    Yang wants the government to capitalize on that expertise by adopting an ambitious plan to generate working drafts of the genomes of most Chinese species—animals, plants, and microorganisms—that are economically or scientifically significant. He admits that the idea, temporarily called the National Bio-resource Genome Project, is an enormous undertaking that could take decades. But he estimates that an initial investment in equipment of roughly $20 million would be enough to get started. Yang has already received a $2.5 million down payment from CAS, and the science ministry and NSF are weighing their contributions. He estimates that, with those resources, his center could produce a working draft of one mammalian genome a year.

    To clear the way for such bold projects, and make possible collaborations with foreign companies and organizations with sufficient sequencing power, Chinese authorities want to extend existing rules on foreign access to the country's human genetic materials to cover all organisms. Xu Xinlai, head of the MOST bioengineering center, which supervises the ministry's high-tech enterprises and drafts related regulations, says the new rules will uphold the idea that “resources have monetary values” and that China is entitled to its fair share. Under the draft, which has not been made public, Xu says that “if a foreign commercial company wants to make use of China's genetic resources, it has to do it through collaboration with a Chinese partner. The resources to be used should be considered as a kind of investment.”

    MOST began drafting the new regulations earlier this year in response to concerns raised by Chinese scientists when GeneCore was acquired by PE Corp., Celera's parent company. Yang says Chinese scientists were “alarmed” when Celera's president, J. Craig Venter, announced in January that the deal will “provide access to” many new sources of genetic information. “The intention expressed in Celera's wording is very obvious—to monopolize our country's genetic resources,” says Yang. “We need foreign collaborations and even fair competition. But such collaboration and competition should abide by Chinese law.”

    Celera spokesperson Paul Gilman says Celera has no intention of monopolizing resources: “The only thing that we would negotiate is something that is satisfactory to all parties.”

    In addition to the effort to sequence part of the human genome and the planned project to sequence China's key organisms, China has already begun deciphering the genome of its staple crop, rice. While an international consortium is working on the Japonica variety (Science, 1 October 1999, p. 24), China has pursued a different strain, Indica, grown throughout China and in most of the rest of rice-eating Asia. CAS's Hong says that parallel sequencing of the two rice varieties “provides a unique way to understand their functions.” The decision to work on Indica was made early in 1992, at the beginning of the rice genome project, and both the mapping and the sequencing have stayed with this strain ever since.

    With the government committed to raising its investment in genomics, the debate over how to distribute limited funds is likely to remain fierce. But China's research community appears increasingly confident that it will be able to keep up with the rest of the world. “Some scientists see Celera as a hungry wolf,” says Chen. “But China is not as weak as it was 5 years ago. We should have confidence in our current capacity.”


    From Field to Lab, New Insights on Being Human

    1. Ann Gibbons

    San Antonio: More than 1100 researchers gathered for the 69th Annual Meeting of the American Association of Physical Anthropologists here, presenting a record 665 papers. Analyses ranged from field reports to lab experiments, including a genetic study of the evolution of a deadly gene, a description of a rare hominid fossil, and brain scans comparing humans and other apes.

    Tracing the Genealogy of a Deadly Gene

    If there were a Most Wanted list of common deadly genes, the apolipoprotein E (APOE) gene surely would be near the top. In its most dangerous form, this gene increases the risk of cardiovascular disease and Alzheimer's disease. Ever since geneticists discovered that there are three types of APOE proteins—the worst is known as E4—scientists have wondered where and when the killer variant arose. Now a powerful piece of evolutionary sleuthing presented at the meeting reveals the ancestry of the gene's three forms. An international team found that the dangerous E4 type was inherited from our apelike ancestors and has given rise in the past 300,000 years to the two less harmful forms of the gene, E3 and E2. E3 is now moving through the world's populations and replacing the deadlier form. “We've actually caught a gene in the process of being changed to a favorable type,” says S. Malia Fullerton, a population geneticist at Pennsylvania State University, University Park, and a member of the team headed by genetic epidemiologist Charles Sing at the University of Michigan Medical School in Ann Arbor.

    The group also found that each of the three types of APOE has many variations, which may help explain why people who have inherited the E4 type have different degrees of risk for developing heart disease or Alzheimer's. “This shows that it's important when we do these assessments of risk that we do not just use these simplistic three types,” says pathologist George Martin of the University of Washington (UW), Seattle, who hopes that better diagnostic tests can now be developed.

    The gene has been a target since 1985, when researchers found that people with the APOE4 protein have a higher risk of heart disease than those who inherit the other two types. In 1993, researchers linked E4 to Alzheimer's (Science, 13 August 1993, p. 921), and soon after found that chimps have only E4. That sparked much debate about which type of APOE was ancestral, because APOE3 is most common today.

    To answer that question, the team sampled 96 people from four populations and began massive sequencing, done by UW geneticist Debbie Nickerson, of a 5491-base pair region of the gene on chromosome 19. When the team organized the sequences into a tree where the most similar sequences clustered together, they found 31 distinct subtypes of DNA that sorted into the three major types of APOE. Each of the variants within the three types preserves the genetic code necessary to produce one of the three APOE proteins, but the sequence varies in other regions. Those changes may subtly affect the way the APOE proteins are regulated and expressed, and may help explain why in some populations, such as African Americans, those with E4 develop Alzheimer's less often than expected.

    “It's ground-breaking that they did large-scale sequencing of a long stretch of DNA on so many people,” says human population geneticist Jeffrey Long of the National Institutes of Health. Although geneticist Allen Roses's group at Glaxo Wellcome in Research Triangle Park, North Carolina, also has found variation at the DNA level within the three seemingly simple APOE types (Science, 15 May 1998, p. 1001), this new report also traces, step by step, the DNA changes that led from E4 in the chimpanzee to E3 and E2, and proves that E4 was the ancestral form of the gene.

    The limited variation within the E3 and E2 haplotypes shows that they emerged fairly recently, sometime in the past 300,000 years, according to a molecular clock based on the apparent rate of divergence between human and chimp. The good news, says Fullerton, is that the new and less dangerous E3 type seems to be under selection pressure, as it is spreading most rapidly. Martin speculates that E4 may have persisted for millions of years in human ancestors, offering some advantage, perhaps against parasites.

    One intriguing idea is that E4 slows the rate at which lipids such as cholesterol are delivered to pathogens. That could have been an advantage against trypanosomes that cause African sleeping sickness, as the parasite must suck up cholesterol in a few hours to survive. Now that the gene's evolutionary history is revealed, Martin and colleagues plan to test such ideas, using mice bred to express the different types of APOE.

    “Little Foot” Hominid Gets a Hand

    In December 1998, researchers announced a spectacular discovery: the nearly complete skeleton of a 3.3-million-year-old australopithecine from a South African cave (Science, 1 January 1999, p. 9). The skeleton was known as Little Foot because its foot was found first—in a box in a storeroom—but it is now the most complete ancient hominid known, even more complete than its Ethiopian contemporary, the famed Lucy fossil. Anthropologists have been eagerly awaiting news of the rest of its anatomy, and to their delight excavator Ron Clarke of J. W. Goethe University in Frankfurt, Germany, showed his hand, so to speak, in San Antonio.

    Clarke presented slides of what is the only complete hand and arm of an early hominid—still partly encased in rock—and noted that its short palm and fingers look quite modern. Although his analysis is preliminary, Clarke concludes that Little Foot did not knuckle walk like living apes, but walked upright while also spending some time feeding and sleeping in the trees. “The old concept of having us come out of a knuckle walker is not borne out by this skeleton,” Clarke told the audience.

    But not all of his listeners were convinced, because even if Little Foot didn't knuckle walk, its ancestors—not to mention human ancestors—might have (Science, 24 March 2000, p. 2131). “It's not a knuckle walker, but I don't see how you can tell what kind of creature it evolved from,” says paleoanthropologist David Begun of the University of Toronto in Canada.

    Little Foot entered the modern world foot first, as Clarke discovered its left foot and ankle in 1994 while picking through a box of animal bones from the Sterkfontein grotto in a University of the Witwatersrand storeroom in Johannesburg, where he was then director of field operations. Chimplike characters, such as a slightly divergent big toe, imply that this hominid spent some time in the trees as well as on the ground, according to Clarke and Phillip Tobias of the University of the Witwatersrand (Science, 28 July 1995, p. 521). Two years later Clarke found more bones of the skeleton in another box, including a partial shinbone. Incredibly, 2 days later his team returned to the dark cave with the shinbone and found a lower leg and foot that fit perfectly; soon after they found a skull and almost all the skeleton. The team is now chipping away the breccia that encases the skeleton, which has been dated to 3.3 million years ago using reversals in the Earth's magnetic field and associated animal fossils.

    Last July, Clarke partially prepared the hand and arm. Although he could not measure the fossils still in rock, he says that all five metacarpals and some of the phalanges of the hand are short, like those of humans, and don't resemble the elongated finger and palm bones other apes use to move through trees or stabilize themselves while knuckle walking. The ancient hominid's long, powerful thumb also looks modern, and may suggest more manual dexterity. The forearm also appears shorter than the long arms needed in knuckle walking.

    But evidence that Little Foot also spent some time in the trees comes from its grasping, primitive foot, and from the phalanges in the hand, which are curved in a way that differs from humans and resembles the hand bones attributed to Lucy's species, Australopithecus afarensis. Clarke suggests that both hominids gripped tree branches. He argues that this shortened hand is a primitive trait in australopithecines (and was therefore present in human ancestors), and that other apes' long hands are specialized for knuckle walking.

    Those who saw Clarke's presentation say the skeleton, which may be a new species, is an interesting mix of modern and primitive traits—although there is debate about how primitive the foot is. “Both the hand and foot show intermediacy between an arboreal ape and bipedal hominid,” says Elwyn Simons of Duke University. “It is evidence of how we moved from one stage to another.”

    Human Frontal Lobes Sized Right

    Ever since the Greeks portrayed gods, artists, and poets with large foreheads to signify intelligence, the frontal lobe—the portion of the brain situated just behind the forehead—has been seen as the mark of a highly evolved person. Indeed, brain-mapping studies have shown that it is the seat of creativity, decision-making, working memory, and planning—in short, of the characteristics that make us human. So it was no surprise when anatomists studied human and chimp brains early this century and concluded that the human's frontal lobe occupied a larger percentage of its brain than did the chimp's. “It has almost been received wisdom that the frontal lobe is enlarged in humans,” says paleoanthropologist Dean Falk of the University at Albany in New York.

    Sizing up.

    Most parts of the human brain, including the frontal lobe, scale up as expected when compared to the brains of other apes.

    Now, however, a new comparative study finds nothing disproportionate about the frontal lobe in humans. Although humans do have larger brains for their bodies than other primates, no particular section of the brain is swollen. Rather, all large parts of the brain scale up proportionately—except for the cerebellum, which is actually smaller than expected, according to a presentation by anthropologist Katerina Semendeferi of the University of California, San Diego, and neuroscientist Hanna Damasio of the University of Iowa in Iowa City. The authors, who used magnetic resonance imaging (MRI) to compare the brains of living apes and humans, also suggest that the size of the frontal lobe relative to that of the overall brain has not changed significantly during human evolution, contrary to popular evolutionary models. “The myth has been that we have these big, big frontal lobes,” says anthropologist Ralph Holloway of Columbia University in New York City, who predicted in 1964 that the human frontal lobe was not enlarged. “Now, we'll have to look at something else” to explain what is special about the human brain.

    Semendeferi began her study as a graduate student a decade ago, when she was surprised to find that only a few isolated studies supported the notion that the frontal cortex was enlarged in humans. “There were almost no comparative data on the great apes,” she recalls. So she and Damasio, her postdoctoral mentor, scanned the brains of a few living humans and dead zoo apes with MRI. When they found that the human frontal lobe was not enlarged, they met with so much skepticism, including criticism about comparing living and dead tissue, that they expanded their study. They scanned 10 living humans and enlisted the help of Tom Insel at Yerkes Regional Primate Research Center in Atlanta to scan 19 living apes—chimpanzees, bonobos, gorillas, orangutans, and gibbons.

    They compared the size and volume of the overall brain as well as both hemispheres (halves of the forebrain), frontal lobes, temporal lobes, cerebellums, and other regions. The result: The frontal lobes make up 36.8% of the hemispheres in humans, compared with 34.9% in bonobos, 34.8% in chimpanzees, 35.1% in gorillas, and 36.3% in the orangutan. The gibbon was the only standout—its frontal lobes account for only 28.4% of its hemispheres. And the cerebellum is smallest in humans—only 11.2% of the brain—and largest in the gorilla—16.1% of the brain. That was surprising, too, because arboreal species like orangutans were expected to have the largest cerebellum, a structure involved in balance. The findings imply that our ape ancestors devoted the same proportion of their brains as we do to the frontal lobe, notes Semendeferi.

    The results also dovetail with studies suggesting that other brain structures, such as the neocortex, scale up as expected in large-brained animals such as apes, dolphins, and elephants. “The bottom line is that humans' brains are perfectly predictable,” says Barbara Finlay, a developmental neuroscientist at Cornell University in Ithaca, New York. “They're large, but they scale up like they are supposed to for their body size.”

    GENOMES 2000

    Intimate Portraits of Bacterial Nemeses

    1. Michael Hagmann

    Paris: Attendance was limited to 600, but that didn't stop geneticists who gathered here 11 to 15 April from creating a big buzz over a dozen bacterial genomes unveiled at a meeting sponsored by the Institut Pasteur and the American Society for Microbiology. Among the highlights were a peek at leprosy's blighted genome and a heartening tale behind the sequencing of a nasty plant pathogen.

    Leprosy's Dying Genome

    Even though it has been disfiguring and stigmatizing people since the dawn of civilization, leprosy remains a scientific puzzle. Nobody is sure how the slowest growing bacterial pathogen spreads, or how it triggers the gruesome degeneration of the hands and feet that for centuries doomed many victims to leper colonies. And although Mycobacterium leprae was the first bacterium linked to a human disease, more than a century later it's still impossible to grow in the test tube. Now, this mysterious microbe's genetic code has been laid bare to reveal yet another conundrum: The bacterium has one of the most blighted genomes ever seen, with vast stretches of “junk” DNA and hundreds of genes that no longer function. Scientists are hoping that this genetic wasteland will provide some new leads on how to combat leprosy. “We've waited for this for a long time,” says microbiologist Anura Rambukkana of The Rockefeller University in New York City.

    A chronic infectious disease that affects the peripheral nerves, the skin, and the upper respiratory tract, leprosy, also called Hansen's disease, still flourishes in developing countries such as India, Brazil, Indonesia, and Myanmar, where more than 750,000 people contract the disease each year. Scientists think the rod-shaped bacterium spreads through the air or through direct contact, entering the body via the mucosal linings of the nose or through open wounds. From there it somehow infiltrates Schwann cells, which insulate nerves. The bacterium can hide there for years without causing symptoms. Eventually, the body's immune system attacks the infected Schwann cells, destroying the nerves in the process. Accompanying the telltale sensory loss are secondary infections that whip the immune system into a frenzy. Under siege, soft tissues and even bones become degraded, particularly in the limbs and digits.

    Like other leprosy researchers, a team at the Institut Pasteur in Paris, in collaboration with the Sanger Centre in Cambridge, U.K., resorted to growing M. leprae in a rather unusual “test tube”: the nine-banded armadillo, a critter whose cool body temperature appears ideal for the bacterium. (M. leprae can also be grown poorly in the footpads of mice.) This provided the raw material for a 5-year sequencing effort.

    The researchers found a depauperate genome, half of which appears devoid of genes, says the Institut Pasteur's Stewart Cole. On closer inspection, he and his colleagues found that most of the so-called junk DNA consists of more than 1000 degraded genes whose functions were lost in the course of evolution. Although noncoding DNA makes up less than 15% of the genomes of most bacteria studied so far, it accounts for half of M. leprae's genome. Compared with the closely related M. tuberculosis, with which it shares a common ancestor, M. leprae appears to have lost more than its degraded genes. The genome of M. tuberculosis is a third bigger than that of M. leprae, suggesting to researchers that the leprosy bug has lost an additional 1000 genes, including ones coding for enzymes used in energy production and for DNA replication. Cole thinks this massive gene loss has crippled the leprosy bacillus, which would explain its slow growth rate—its population doubles in mice only every 2 weeks, compared to hours for most other bugs. M. leprae, it would seem, has few working genes to spare. “I think it is on its way out,” says Cole.

    What genes are left, however, may lead to new tools for fighting the bug. Cole's group has spotted some 100 genes that have no counterpart in M. tuberculosis. If any code for a unique protein on M. leprae's surface, identifying that protein could enable researchers to develop a simple skin-prick test for detecting an infection early. Being able to treat patients with a course of three drugs before symptoms begin, says Cole, “may prevent nerve damage, which would make leprosy a rather harmless skin disease, something like acne.” And learning how leprosy triggers nerve loss may offer clues to multiple sclerosis and other common neurodegenerative diseases. “My guess is that the initial events are very similar,” says Rambukkana.

    A Genome Cinderella Story

    In 1987 orange growers in the Brazilian state of São Paulo first noticed the telltale signs of a new disease: conspicuous yellow patches on individual leaves. The fruits on these spotted trees turned out to be small, hard, and nearly juiceless, rendering them commercially useless. Today, citrus variegated chlorosis (CVC)—as the disease is known—threatens the entire citrus industry in São Paulo state, the world's largest exporter of concentrated orange juice. The disease affects more than 30% of all trees and causes losses estimated at $100 million each year.

    Now scientists have a new tool to attack this devastating microbe. On 12 April a team reported at the meeting that they had deciphered the 2.7-million-base-pair genome of Xylella fastidiosa, the causative agent. X. fastidiosa is the first bacterial plant pathogen ever to be fully sequenced. What's more, the feat was pulled off not by one of the sequencing superstars in the United States or Europe but by a consortium of some 30 labs in São Paulo state—groups with little or no previous genomic expertise.

    This coup earned the Brazilian scientists ample praise from their international peers. Raves biochemist André Goffeau of the École Normale Supérieure in Paris: “The quality [of the sequence] is superb. It's incredible how fast they've done it, given that 2 years ago they didn't even have the [sequencing] machines.” The X. fastidiosa genome “is quite a big deal,” says Edwin Civerolo, a plant pathologist with the U.S. Department of Agriculture (USDA) who works at the University of California, Davis. Indeed, the work is so impressive that the USDA and the state of California have just enlisted—to the tune of $250,000—the Brazilian team to sequence a related strain of X. fastidiosa that causes Pierce's disease and is threatening vineyards across California.

    The X. fastidiosa genome project was conceived in 1997 when Fernando Perez, scientific director of the State of São Paulo Research Foundation (FAPESP), a state-run public funding agency, became concerned about the lack of genomics research in Brazil. After consulting some of Brazil's top life scientists, Perez and his scientific advisers decided that Brazil should embark on its own genome project. But what to sequence? Given its economic impact, X. fastidiosa was a logical choice, but it was hard to work with. However, Joseph Bové, a microbiologist at the French National Institute for Agricultural Research in Bordeaux, whose team was among the first to identify the CVC pathogen in plant material from Brazil, convinced them to try.

    Rather than follow the U.S. model and set up a few supercenters for sequencing, Perez suggested a diffuse approach: a “virtual genome center” of more than 30 labs. He admits that this may not have been the most cost-effective approach—the project cost more than $13 million—“but it wasn't designed to be cheap or quick. The main goal was to create a broad competence in the field of molecular genetics in Brazil,” says Perez, and the foundation apparently achieved it. More than 30 sequencing machines were up and running by May 1998, and by January 2000 the job was done; the work has now been submitted for publication.

    At the meeting, geneticist Andrew Simpson of the Ludwig Institute for Cancer Research in São Paulo, one of the project's coordinators, provided a first peek into the inner workings of this agricultural pest. On its single circular chromosome, X. fastidiosa harbors some 2800 potential genes, half of which have been assigned a putative function. From these and others yet to be identified, the scientists hope to discover how X. fastidiosa wreaks its havoc.

    A common theory is that the microbe clogs up the xylem tubes in orange trees and thus starves the affected branches and leaves of water and other nutrients. Some of the new evidence seems to support that view. Explains Simpson: “The biosynthetic machinery [for producing essential cell components] is fairly complete, and there are a lot of nutrient-sequestering and energy-providing genes. That's what you'd expect for a bacterium that lives in the rather nutrient-poor xylem.” The team has also found genes that produce an extracellular “gum” that may help stick the bacteria together and, eventually, clog the xylem tubes or help attach the bacteria to the foregut of sharpshooter leafhoppers that transmit the bacterium by feeding on xylem sap. These, speculates Goffeau, “may be promising targets for new antimicrobials.”

    One unexpected find was a set of genes for surface molecules called hemagglutinins and adhesins, which are usually present in animal pathogens, where they help anchor the microbe to the target cell. “This suggests that this hardware is very well conserved within pathogens across a broad host range, and that only the software, or the details, change,” says Simpson.

    Buoyed by this success, FAPESP is expanding its genome efforts beyond X. fastidiosa and the collaboration with USDA. The agency is now supporting efforts to sequence snippets of genes that are active in various human cancers, as well as the genomes of sugarcane and several other plant pathogens. Says Simpson, “Agricultural genomics is crucial for an agricultural country like Brazil—and it's a research area where almost by omission we are now world leaders.”