News this Week

Science  23 Mar 2001:
Vol. 291, Issue 5512, pp. 2288


    Russia, NIH Float Big Plan for Former Soviet Bioweapons Lab

    Richard Stone

    CAMBRIDGE, U.K.—A former bioweapons lab in the heart of Siberia may soon open its doors to scientists from around the world. The head of Russia's State Research Center of Virology and Biotechnology (VECTOR) will unveil a multimillion-dollar proposal next week at a forum in Atlanta to transform the lab—which features the only biosafety level 4 lab in Asia—into an international center for emerging diseases. The U.S. National Institutes of Health (NIH) is helping develop the proposal, which could become one of the most expensive projects ever to beat Russian scientific swords into plowshares. Many experts are supporting it, but some argue that a reincarnated VECTOR would not be able to sustain itself without a Western lifeline of noncompetitive grants.

    VECTOR's ambitious plan would require about $25 million up front to modernize its labs to create the center—which would open in 4 to 5 years—and up to $12 million a year to operate it, says VECTOR general director Lev Sandakhchiev. He is hoping to cobble together the money from a variety of sources, including CNN founder Ted Turner and the World Health Organization, which is developing a plan to establish a network of up to a dozen such disease research centers in critical regions. Creating the facility—which would be called the International Center for the Study of Emerging and Reemerging Diseases (ICERID)—will be a challenge, but worth it, argues emerging disease expert Susan Fisher-Hoch of the University of Texas School of Public Health in Brownsville. “In the long term, we would all benefit” by tapping Russian talent, she says.

    Brewing collaboration.

    Center could adopt VECTOR's fermenters.


    The effort to transform VECTOR is gaining momentum despite a gloomy outlook for U.S.-funded nonproliferation activities in Russia. The Bush Administration's 2002 budget proposal would cut by nearly 10% the $870 million clutch of Russian nonproliferation programs, which had been slated to increase to $1.2 billion under the Clinton Administration's budget proposal. Backers of ICERID—including the U.S. State Department—hope that the diversity of potential funders will insulate the venture from U.S. budget cuts.

    Fisher-Hoch and others note that VECTOR has expertise in some of the world's nastiest viruses. It is one of two centers in the world sanctioned to maintain and study samples of smallpox virus. (The Centers for Disease Control and Prevention in Atlanta is the other.) It also has extensive experience with arboviruses, such as tick-borne encephalitis, and hemorrhagic fever viruses that are endemic in Siberia. And because it is located in southern Siberia near the borders of Kazakhstan and Mongolia, ICERID would provide “access to an important part of the world,” says NIH medical epidemiologist David Morens. He is the U.S. co-principal investigator with VECTOR on a 2-year planning proposal being submitted to the U.S. Department of Health and Human Services' Biotechnology Engagement Program, which supports work at former Russian bioweapons labs.

    Initially, ICERID's scientific staff would be drawn from VECTOR's 340 researchers. Although Sandakhchiev says he hopes to employ “almost all” these scientists, Morens feels that for ICERID to work, VECTOR “will have to become leaner and meaner.” In a move that would break new ground for a Russian former weapons lab—and virtually assure a clean break with the past—Sandakhchiev says ICERID would eventually hire foreigners. This could best be accomplished, he says, if ICERID were modeled after labs established through intergovernmental agreements, such as CERN, the European particle physics laboratory near Geneva, or the Joint Institute for Nuclear Research in Dubna, Russia. ICERID member states would kick in contributions scaled according to their gross domestic products.
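    The CERN-style funding formula described above, with member contributions scaled to gross domestic product, amounts to a simple proportional allocation. A minimal sketch (the member names and GDP figures below are invented for illustration; only the ~$12 million annual operating cost comes from the article):

```python
def gdp_scaled_contributions(budget, gdps):
    """Split an annual budget among member states in proportion to GDP.

    budget: total amount to raise per year
    gdps:   mapping of member name -> GDP (any consistent unit)
    """
    total_gdp = sum(gdps.values())
    return {name: budget * gdp / total_gdp for name, gdp in gdps.items()}

# Hypothetical members and GDPs, purely illustrative
shares = gdp_scaled_contributions(
    12_000_000,  # the ~$12 million/year operating cost cited in the article
    {"A": 10_000, "B": 2_000, "C": 500},
)
# Each member's share is budget * GDP_i / sum(GDP), so shares
# always sum back to the full budget.
```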

    Some experts have qualms about the proposal. Although VECTOR has a talented staff, they have been too narrowly trained on bioweapons threats like smallpox and Ebola virus, says one U.S. virologist, adding that VECTOR researchers have been slow to shift into hot areas such as research on foot-and-mouth disease or West Nile virus. “The staff do not think outside the box,” he contends. Other critics are amazed by the amount of money VECTOR is hoping to raise. “It's a great way to float an entire institute,” says one U.S. defense scientist, but “I don't like block grants, and I see ICERID as a block grant.” Another concern is that ICERID scientists would lack the grantsmanship needed to compete for Western grants. Morens acknowledges this challenge. “They haven't thought through all the roadblocks yet,” he says, adding that “the basic concept is a good one.”

    Sandakhchiev hopes that the latter message comes through when he presents the center concept at the Sam Nunn Policy Forum to be held next week at Georgia Tech. Nunn chairs the Nuclear Threat Initiative, an organization the former senator launched in January with Turner, who has promised the initiative $250 million over 5 years. Besides helping fund ICERID, Sandakhchiev hopes Turner will try to persuade President George W. Bush and Russian President Vladimir Putin to establish ICERID's legal status as an intergovernmental center and provide some baseline funding. After all, says Sandakhchiev, honing his sales pitch, ICERID “would be far less expensive than the partnership on the international space station.”

    Uncertainty on Bioweapons Treaty

    Eliot Marshall

    In the depths of the Cold War, the United States made a remarkable decision: It renounced biological weapons, stopped its R&D program, and urged other countries to do the same. About 140 countries followed this lead, supporting a general ban, the Biological and Toxin Weapons Convention (BTWC) of 1972. But the treaty has a flaw: It lacks an enforcement system, relying instead on public pressure to keep countries honest. Diplomats and technical experts have been struggling for years to come up with a better way of enforcing the BTWC. But their self-imposed deadline for reaching an agreement is looming, and observers fear that negotiations may end this summer with no consensus. That could cast a pall over the BTWC, which is due for a full international review in November.

    At a small meeting sponsored by the Carnegie Corp. earlier this month in Washington, D.C., Barbara Hatch Rosenberg, a microbiologist who leads a BTWC verification working group for the Federation of American Scientists in Washington, D.C., gave a bleak report. “The negotiations [on a protocol for verifying BTWC compliance] are certain to go on the back burner for the next 4 years or more,” she predicted, unless the parties reach agreement in the 7 weeks set aside for these talks in April-May and July-August. She criticized the Clinton Administration for its “passivity” on BTWC and expressed concern that the Bush Administration—which is reviewing its policy this spring—has a “well-known antipathy to multilateral arms treaties.”

    “Everyone is waiting” to find out what the Bush policy will be, says Amy Smithson, a bioweapons specialist at the Henry L. Stimson Center in Washington, D.C., adding that, “It's like waiting at a wake.” Smithson, who has examined the infrastructure left behind by the Soviet Union's cheating on the BTWC in the 1980s, claims that the U.S. and other governments “haven't done their homework” on technical issues in BTWC enforcement. The resulting lack of data, she argues, has made it more difficult to agree on a protocol.

    Gillian Woollett, a BTWC expert at the Pharmaceutical Research and Manufacturers of America, says it is risky to view the November BTWC review as a “make or break” deadline, because compromises made under pressure may lead to a flawed protocol. “We do not think that the only choice is to accept a bad protocol or no protocol.” U.S., European, and Japanese industry leaders have agreed on a model approach that would not use surprise onsite visits or routine inspection of industry labs, Woollett says, adding: “We would like to see a good protocol adopted,” even if that can't be done by November.

    Because the schedule is so tight, most observers doubt that a strong enforcement regime will be in hand by November. But Donald Mahley, the U.S. representative to the negotiations and chair of the Bush policy review on BTWC, argues it's too early to declare the protocol dead, saying, “It's not over until it's over.”


    Fossil Tangles Roots of Human Family Tree

    Michael Balter

    The discovery of a 3.5-million-year-old hominid skull and other fossil remains in northern Kenya is shaking the human family tree at its very roots. The new find, reported in this week's issue of Nature, shows that this bushy tree started sprouting branches even earlier than researchers had realized. And the discovery is the second piece of new fossil evidence described in the past month that challenges the exalted status of Australopithecus afarensis, a hominid best known from the partial skeleton “Lucy” and long the leading candidate as the common ancestor of later australopithecines and our own genus, Homo.

    The latest hominid bones—an almost complete cranium, a bone from the temple, parts of two upper jaws, and assorted teeth—were discovered in 1998 and 1999 by a team led by paleontologist Meave Leakey of the National Museums of Kenya in Nairobi. They were found in well-dated volcanic deposits west of Lake Turkana, whose shores have yielded a trove of hominid fossils.

    Experts say that the new hominid's characteristics—a small braincase and small molars set in a large, flat face—have never before been found together in one skull. “When I first saw it, I couldn't believe it,” says Daniel Lieberman of George Washington University in Washington, D.C. “Nobody would have dreamt of this combination of features.” In contrast, A. afarensis, the only other hominid known from this period, has large molars and a much smaller, projecting face. Bernard Wood, also at George Washington, says the new fossils “show that human evolution before 3 million years ago is likely to be every bit as complex as in its later stage,” when many hominid species walked the Earth at the same time.

    The team has named its find Kenyanthropus platyops, “the flat-faced man of Kenya.” Although Lieberman is not convinced that the specimen justifies a new genus, others think the team is on firm ground. “The authors make a very good case,” says Laura MacLatchy of Boston University. And Ian Tattersall of the American Museum of Natural History in New York City, who has long argued that some colleagues are too timid when it comes to creating taxa, praises the team for having “the courage to recognize diversity in the hominid fossil record.”

    Experts are unanimous in the opinion that Kenyanthropus will complicate efforts to trace the convoluted course of human evolution. This task is especially mind-bending because, beginning about 3 million years ago, hominid species began sprouting like wildflowers across Africa. But for the period between 3 million and 4 million years ago, things had seemed relatively simple. After decades of searching, most researchers had concluded that A. afarensis was the only clearly identified hominid in Africa at that time. Winding the clock back farther, a 4-million-year-old australopithecine, A. anamensis, seemed a likely ancestor to Lucy. But the new discovery, dated smack in the middle of this critical million years, could put a kink into any straight-lined phylogeny, because it doesn't share key features with either A. afarensis or A. anamensis. “It certainly puts a big question mark over the status of A. afarensis as the sole ancestor” of all later hominids, says Chris Stringer of London's Natural History Museum.

    Indeed, just last month, a Paris-based team led by Martin Pickford and Brigitte Senut described a 6-million-year-old candidate hominid from Kenya's Tugen Hills, named Orrorin tugenensis, which had small, humanlike molars (Science, 23 February, p. 1460). Pickford and Senut argued that Lucy and her large molars could not have given rise to humans. In the Nature paper, Leakey's team says that Kenyanthropus's small molars might also sideline A. afarensis as a human ancestor. “If the hominid status of Orrorin is confirmed, it would support the suggestion that small molar size is the primitive condition,” says Leakey's co-author Fred Spoor of University College London. And Lucy co-discoverer Donald Johanson, director of the Institute of Human Origins in Tempe, Arizona, says he isn't surprised that Lucy has a rival in Kenyanthropus. “The presence of a single [species] between 3 and 4 million years ago,” he says, “just didn't make any sense.”

    On the other hand, few researchers, Leakey's group included, are suggesting that Kenyanthropus necessarily lay on the path to Homo. Rather, experts say, the importance of the new discovery lies in its demonstration that the roots of the human evolutionary tree are pretty tangled. “Those of us who have been suggesting that human evolution is more like a bush than a ladder,” says Wood, “may not have been far off the mark.”


    New Cuts in Station Could Spark Walkout

    Andrew Lawler
    With reporting by Jeffrey Mervis.

    U.S. researchers eager to use the international space station are threatening mutiny if NASA carries out plans to trim facilities and crew in the wake of exploding costs. A biological sciences advisory group has called the proposed cuts a “betrayal of the public trust” that undermines the scientific rationale for the station. Although critics have long questioned the station's likely scientific payoff, what's new about the latest attack is that it's coming from the station's staunchest scientific supporters.

    Destiny diminished.

    Cutting back on crew size could hamper research aboard the space station's laboratory, Destiny.


    Last month, NASA announced that the station, now under construction, faces a $4 billion overrun. In response, agency managers plan to cancel a habitat module and a rescue vehicle, and reduce the size of the crew and the amount of power available. Further cuts, such as delaying or canceling a centrifuge module critical for nonhuman biological research, are pending. Even so, administration officials insist that the $60 billion station will meet NASA's promise to the science community and its international partners to operate a world-class research facility with a sustained human presence. “We can continue to maximize research,” says Joe Rothenberg, NASA's space flight chief.

    Not true, say members of NASA's space station biological research project science working group, made up of a half-dozen outside advisers. “We were [already] at the extreme edge of maintaining a credible science endeavor,” writes Martin Fettman, a veterinarian at Colorado State University in Fort Collins and chair of the working group, in a 9 March letter to Rothenberg. If NASA goes ahead with the proposed cuts, the panel adds, “we might as well completely discontinue” science funding for the space station.

    Fettman, who also serves as chair of the NASA-funded National Space Biomedical Research Institute's external advisory panel, says the working group is prepared to quit in protest. The letter also warns that “the entire life sciences community would turn its support away” from a scaled-back station “and in fact become active campaigners against the station if the program continues to divert resources from science to solve construction problems.” Members of the National Academy of Sciences' Space Studies Board raised similar concerns last week during a briefing by Administration and congressional aides on NASA's proposed budget for 2002. “It's the old fear of putting up a tin can that isn't capable of doing good science,” says John McElroy, chair of the board and a former engineering dean at the University of Texas, Arlington.

    Rothenberg says the list of cuts won't be finalized until June, and only after consultation with researchers. He adds that other partners, such as Europe, might contribute elements NASA can no longer afford. “We honestly believe the science community is our customer,” says Rothenberg. That position is seconded by Steve Isakowitz of the Office of Management and Budget, who cautioned researchers “to wait for the agency to complete its review before taking any premature action.” Speaking to the academy panel, he said that “whatever happens, we'll still have a station that's better [for research] than anything we've had before.”

    But Fettman and others are skeptical. Part of the problem, they say, is that scientists lack clout at Houston's Johnson Space Center, which until last month was in charge of the effort, and at headquarters, which now oversees the program. “Decisions are made by people who don't understand the science” or the equipment that it requires, Fettman complained in an interview. “The engineers don't seem to connect with what we want.” For example, he says that it took much effort to convince industry contractors and NASA to abandon plans for an expensive lead-lined locker to hold film, at a time when “everyone is going digital.” And NASA has just discovered a 60% cost overrun on the racks that will house nonhuman biological experiments, he adds: “This is no way to run a business.” Fettman urges NASA to assign a scientist to a “critical management slot” before making any further cuts in the research program.

    Rothenberg was still mulling a response to the challenge last week, but he makes a plea for patience. “Give us a chance,” he says. “The community should be concerned, but it shouldn't panic.”


    New Fossil May Change Idea of First Mollusk

    Erik Stokstad

    Some 425 million years ago, a heavily armored, wormlike mollusk died on the deep sea floor and was buried in volcanic ash. Now, using a novel technique, a team of paleontologists has created a virtual reconstruction of its perfectly preserved shape. The fossil has a strange mix of traits that, although not conclusive, supports a controversial idea about the identity of the earliest mollusks. “This is really a major discovery,” says paleontologist Bruce Runnegar of the University of California, Los Angeles.

    Mollusks include snails and bivalve clams and also several groups of more puzzling and obscure organisms. Tidepoolers known as the chitons, for example, have segmented shells and superficially resemble arthropods. Deep on the ocean bottom, feeding on foraminifera, live the aplacophorans—shell-less mollusks that look like odd worms. Because they lack some key traits, not just a shell but also sometimes the muscular foot, many malacologists think the aplacophorans resemble the first mollusks. Yet none had been found in the fossil record of mollusks, which stretches back more than 500 million years to the early Cambrian—until now.

    The new fossil, described in this week's issue of Nature, comes from an ash bed of Silurian age in Herefordshire, United Kingdom, a deposit noted for preserving an extraordinary record of soft tissue in three dimensions. After creatures were entombed in ash, their bodies rotted away, and the cavities filled with calcite. This process also made the fossils frustratingly tough to study. The only way to reveal them was to laboriously pick away the surrounding rock. So the collection sat relatively unstudied until Mark Sutton, a postdoc at the University of Oxford, tried a better way—but one that is still a bit of a grind.

    Sutton chose specimens that were numerous enough that a few could be destroyed. Then he ground down the rock 30 micrometers at a time. At each step, he polished the end of the rock and took a digital photograph. A computer outlined the fossil, which was darker than the rock matrix, on each picture. “We didn't have a clue what it was,” Sutton says. But after the computer had stacked up several hundred slices into a 3D replica, “everything fell into place.”
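    The reconstruction loop Sutton describes, grind, photograph, outline the darker fossil on each slice, then stack, amounts to thresholding a series of images and assembling them into a 3D volume. A minimal sketch with NumPy (the synthetic slices, threshold, and image sizes below are invented stand-ins for the actual photographs):

```python
import numpy as np

def reconstruct(slices, threshold):
    """Stack 2D grayscale slices into a 3D boolean volume.

    A pixel is marked as fossil when it is darker than `threshold`,
    mimicking the computer outlining the dark fossil against the
    lighter rock matrix on each photograph.
    """
    return np.stack([img < threshold for img in slices], axis=0)

# Synthetic stand-ins for the slice photographs: light matrix (value 200)
# with a dark square "fossil" (value 50) in the middle of each slice.
slices = []
for _ in range(10):                 # e.g. 10 grinding steps of 30 micrometers
    img = np.full((64, 64), 200, dtype=np.uint8)
    img[20:40, 20:40] = 50          # the dark fossil region
    slices.append(img)

volume = reconstruct(slices, threshold=128)
# volume.shape == (10, 64, 64); each grinding step contributes one layer
```

In the real workflow several hundred such slices were stacked, which is why the fossil's shape only "fell into place" once the full 3D replica existed.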

    The fossil, named Acaenoplax hayae, has several aplacophoran traits, such as a posterior cavity with features that may have been gills and lack of a typical molluscan foot. With its rows of spiny ridges, however, Acaenoplax is more strongly serialized, or repetitively structured, than any known mollusk, Sutton says. That is consistent with the widely held idea that the common ancestor to all mollusks had a serial structure, even though most modern mollusks (chitons excepted) have at most faint serial patterns.

    A more significant feature, in terms of molluscan evolution, is that Acaenoplax had dorsal shell plates. Chitons have always had this armor, but modern-day aplacophorans don't. This has led most malacologists to believe that the shell-less aplacophorans were the first to split from the early mollusk lineage, followed by chitons. Acaenoplax may support a more contentious view: that the ancestor of chitons and aplacophorans formed the first branch of the molluscan family tree. Sutton and his colleagues—Derek Briggs of the University of Bristol, David Siveter of the University of Leicester, and Derek Siveter of the University of Oxford—say that Acaenoplax suggests that shell plates were originally a common feature of both aplacophorans and chitons, and that the two form a natural group. It's not an exact match, because chitons have eight dorsal plates, whereas Acaenoplax has only seven. Nevertheless, “our creature seems to be the missing link” between chitons and aplacophorans, Sutton says.

    Mollusk fans are happier than a clam. “This is a most amazing beast,” says aplacophoran expert Amélie Scheltema of the Woods Hole Oceanographic Institution in Massachusetts. “It titillates the imagination.” But she's not sure whether to call Acaenoplax an aplacophoran, because it is so different from modern ones. To Runnegar, that's part of the attraction. “It gives us a great deal of information about early molluscan evolution that cannot be retrieved from the living biota,” he says.


    Astronomers Glimpse Galaxy's Heavy Halo

    Mark Sincell*
    * Mark Sincell is a science writer in Houston.

    Astronomers deal in light, so dark matter drives them a little crazy. For decades they have watched as the gravitational pull of an invisible hand twirls stars and gas around the fringes of galaxies like a ball on a string. And for decades they have failed to identify the source of the excess galactic gravity. With little to guide them, astronomers have fashioned dark matter candidates out of everything from underweight failed stars to massive subatomic particles that currently exist only in a theorist's imagination.

    Now, they have something more to work with. In a paper published online today by Science, an international team of astronomers claims to have directly spotted the source of at least 3% of all the dark matter in the galaxy: They have identified large numbers of fast-moving white dwarf stars that formed when the Milky Way first flickered to life several billion years ago. And that is just a cautious lower limit. “They could account for up to one-third of it,” says team member Ben R. Oppenheimer, an astronomer at the University of California, Berkeley.

    Dim prospects.

    Invisible dark matter may have started out as hot white dwarfs like this one in the middle of planetary nebula NGC 2440.


    “The authors are to be congratulated,” says astrophysicist Harvey Richer of the University of British Columbia in Vancouver. “This is a very nice piece of work.”

    Most of the mass in a galaxy is invisible. The Milky Way's familiar sparkly pinwheel of relatively young stars sits amid an extended spherical halo of older stars and gas. Their combined gravity holds the galaxy together and keeps the stellar pinwheel spinning. Far enough from the center, the pull should eventually weaken and the stars slow down. But they don't. Stars and gas clouds as far as the telescope-aided eye can see continue to orbit at the same speed. The best explanation is that almost 90% of the galaxy's total mass is an invisible substance, spread throughout the halo and called, for want of a better name, dark matter.
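    The inference in this paragraph can be made quantitative. A circular orbit satisfies v(r) = sqrt(G M(&lt;r) / r), so if all the mass were concentrated at the center, orbital speed would fall off as 1/sqrt(r); an observed flat rotation curve instead requires the enclosed mass to grow linearly with radius. A sketch in toy units (not fitted to the actual Milky Way):

```python
import math

G = 1.0  # toy units

def keplerian_speed(m_central, r):
    """Orbital speed if all mass sits at the center: v = sqrt(G*M/r)."""
    return math.sqrt(G * m_central / r)

def enclosed_mass_for_flat_curve(v_flat, r):
    """Mass within radius r needed to hold v constant: M(<r) = v^2 * r / G."""
    return v_flat**2 * r / G

# With only a central mass, speed drops with radius...
v_inner = keplerian_speed(100.0, 1.0)   # 10.0
v_outer = keplerian_speed(100.0, 4.0)   # 5.0, half of v_inner

# ...so a flat curve at constant speed implies mass growing linearly with r:
m1 = enclosed_mass_for_flat_curve(10.0, 1.0)   # 100.0
m4 = enclosed_mass_for_flat_curve(10.0, 4.0)   # 400.0, four times as much
```

The "missing" mass implied by the second calculation, minus the visible stars and gas, is what gets attributed to the dark halo.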

    Tiny clumps of dark matter occasionally pass between Earth and distant stars. The gravitational field of the clump bends light from the star, causing a sudden brightening called microlensing. After spending the last 6 years counting these rare microlensing events, the Massive Compact Halo Object survey has concluded that between 8% and 50% of galactic dark matter is in clumps weighing about half the mass of the sun. But they never actually caught the culprit in the act.

    White dwarfs in the halo have long been the leading suspects, says Richer. They are of the right mass, they move fast enough, and they should be quite common. Any star born weighing between one and eight times the mass of our sun sheds most of its mass as it evolves, eventually ending up as a dimly glowing, cold, half-solar-mass white dwarf. But are they common enough? The answer seemed to be no. When the number of nearby white dwarfs in the galactic disk—they can be seen directly because they are closer—was extrapolated to the halo, the total density came up short. Although isolated halo dwarfs had been spotted before, attempts to directly identify a large enough population to explain the microlensing had failed. Part of the reason, it turns out, is that astronomers were looking for the wrong color star.

    Contrary to the popular images of red-hot peppers and cool blue ice, hot stars are blue and cold stars are red. So searches for white dwarfs in the halo targeted faint red stars. It was a mistake. The light radiated by a hot white dwarf starts off blue and turns red as the dwarf cools, as expected. But when the dwarf's temperature drops below 4500 kelvin, recent theoretical models by astrophysicist Brad Hansen of Princeton University and others show that molecular hydrogen in the dense, cold atmosphere absorbs the red light and reemits it at higher, bluer frequencies.
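    The hot-blue/cold-red rule here is Wien's displacement law, lambda_peak = b / T with b approximately 2.898e-3 m·K. A quick check shows why a 4500 K blackbody would naively be expected to peak in the red, which is what made the blueness of the colder dwarfs so counterintuitive (the molecular hydrogen absorption itself is not modeled in this sketch):

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def peak_wavelength_nm(temperature_k):
    """Blackbody peak emission wavelength in nanometres."""
    return WIEN_B / temperature_k * 1e9

# A 4500 K blackbody peaks near 644 nm, i.e. red light,
# so the early searches targeted faint red stars.
red_peak = peak_wavelength_nm(4500)

# A hotter 10000 K dwarf peaks near 290 nm, on the blue/UV side.
blue_peak = peak_wavelength_nm(10000)
```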

    The realization led the team to change strategies. First, team member Nigel Hambly of the University of Edinburgh scanned almost 200 digital images covering 12% of the sky for fast-moving faint objects, turning up 126 candidate halo stars. Then the team checked for the blue color spectrum using four nights of follow-up observations from the 4-meter Blanco telescope at the Cerro Tololo Inter-American Observatory in Chile. In the end, the team plucked out 38 new cool white dwarfs orbiting in the galactic halo. By multiplying the density of the newfound cool dwarfs by the volume of the galactic halo, Oppenheimer's team estimated that white dwarfs make up at least 3% of the total galactic dark matter density.
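    The extrapolation in this paragraph is a straightforward density-times-volume estimate: take the number density of dwarfs in the surveyed volume, assume it holds throughout the halo, and convert the implied total count into a mass fraction. A sketch with placeholder numbers (the team's actual survey volume and halo mass are not given in this article; only the 38 dwarfs, the ~half-solar-mass figure, and the ~3% result come from the text):

```python
def halo_dwarf_mass_fraction(n_dwarfs_found, survey_volume, halo_volume,
                             dwarf_mass, dark_matter_mass):
    """Fraction of dark-matter mass accounted for by white dwarfs.

    Assumes the dwarfs found in the surveyed volume have the same
    number density everywhere in the halo: the key extrapolation step.
    """
    density = n_dwarfs_found / survey_volume   # dwarfs per unit volume
    total_dwarfs = density * halo_volume       # extrapolate to the full halo
    return total_dwarfs * dwarf_mass / dark_matter_mass

# Placeholder numbers chosen only so the arithmetic is visible:
frac = halo_dwarf_mass_fraction(
    n_dwarfs_found=38,       # the 38 cool dwarfs from the article
    survey_volume=1.0,       # hypothetical unit volume
    halo_volume=1000.0,      # hypothetical: halo is 1000x the surveyed volume
    dwarf_mass=0.5,          # ~half a solar mass each, per the article
    dark_matter_mass=6.3e5,  # hypothetical halo dark-matter mass, same units
)
# frac comes out near 0.03, i.e. the ~3% lower limit quoted in the article
```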

    There is only one catch, astronomers say: Some of the newly discovered dwarfs might not actually be from the halo. Physically, they are all inside the galactic disk. Because they are moving faster than typical disk stars, the white dwarfs are almost certainly just passing through as they circle the halo. “The gravity of the galaxy is not strong enough to confine such objects to a disklike geometry,” Hansen explains. But even a small contamination of disk white dwarfs could reduce the estimated halo dwarf density, so Richer won't draw his final conclusions until several ongoing white dwarf surveys that probe farther into the halo start producing results in a year or two. “We are in the very early stages,” he says. “Things haven't all shaken out yet.”


    U.N. Report Suggests Slowed Forest Losses

    Erik Stokstad

    A comprehensive survey of the world's forests, released last week by the United Nations (U.N.), suggests that global rates of forest loss decreased in the 1990s. But the ink was barely dry on the report before the World Resources Institute (WRI), a think tank in Washington, D.C., disputed that conclusion. “We need good news about the world's forests,” says Dirk Bryant, who directs WRI's forest program. “But this is definitely not it.”


    Forests (yellow) are colored red where deforestation was especially high during the 1990s. Green shows new plantations and regrowth.


    Previous reports have been an important source of information for policy-makers, climate change scientists, and others. So WRI scrutinized the data as they were released on the Web over the last 6 weeks. It claims that the numbers are “out-of-date, patchy, and inaccurate.” Moreover, the WRI says, changed baselines and methods invalidate the comparison of deforestation rates. The U.N. admits that the quality of data varies but says its methods and conclusions are sound.

    Every 5 to 10 years, the U.N.'s Food and Agriculture Organization (FAO) reports on the status of forests. For the current Global Forest Resources Assessment 2000 (FRA2000), it tabulated the latest data, such as forest areas and composition for 217 countries—a challenging task because few countries undertake regular inventories of their forests. To get estimates for 2000, FAO analysts sometimes had to use economic statistics and other information to project trends from older forest data. As a supplement, they examined satellite images that covered 10% of the world's tropical forests.

    The report suggests that the average annual net loss of forests during the 1990s was 9 million hectares—0.2% of the global total—or an area roughly the size of the state of Maine. That rate is at least 10% lower than the one FAO calculated for the first half of the decade in a 1997 report, says Peter Holmgren of FAO's Forest Resources Assessment Program in Rome, Italy. The slowdown is mainly due to new tree plantations, particularly in India and China, and forest growth on disused farms. The rate of loss apparently remained the same in the tropics, however. Satellite images combined with data supplied by the countries themselves suggest a gross annual loss of natural forest of 14.5 million hectares, which Holmgren says differs little from figures from the 1980s.
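    The 0.2% figure is easy to sanity-check against FAO's reported global forest area of roughly 3.9 billion hectares (FRA2000's headline total, used here as an outside reference number rather than one stated in this article):

```python
net_loss_ha_per_year = 9_000_000       # 9 million hectares, from the article
global_forest_ha = 3_900_000_000       # ~3.9 billion ha, FRA2000 global total

annual_loss_pct = net_loss_ha_per_year / global_forest_ha * 100
# ~0.23% per year, consistent with the article's "0.2% of the global total"
```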

    The WRI disagrees with both of FAO's assessments. “I don't believe we can say deforestation is slowing down based on this report,” says WRI's Emily Matthews, an environmental analyst. For the FRA2000 report, Matthews says, FAO revised many of the 1990 forest areas to be much larger than in the 1997 report. Although done partly to standardize worldwide forest definitions, this reduced, perhaps erroneously, the percentage of forest lost calculated in FRA2000 for the decade. Holmgren admits that the comparison is not straightforward, but he says that estimates using other FAO reports also indicate a slowdown.

    Matthews also worries that the report will send a “damaging” message that natural forests are in less danger than before, although as Holmgren notes, the report makes clear that deforestation of tropical, natural forests has not declined. And still another criticism concerns FAO's sampling of satellite images.

    The problem, remote-sensing experts say, is that deforestation in tropical forests is highly concentrated along roads and rivers. As a result, says Compton Tucker of NASA's Goddard Space Flight Center in Greenbelt, Maryland, a small and random sample—such as the 10% used by FAO—“will give you grossly inaccurate numbers.” Holmgren responds that funding constraints prevented wider coverage, but that each sampling area—3.4 million hectares—did cover some roads and rivers. Even more important for an accurate assessment of forests, he says, is field sampling—something that most countries don't do systematically.
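    Tucker's objection is statistical: when deforestation is concentrated in a few places, a small random sample of the landscape has enormous variance even though it is unbiased on average. A toy simulation (invented numbers, not FAO's actual sampling design) makes the point:

```python
import random
import statistics

def sample_estimate(cells, sample_frac, rng):
    """Estimate total deforestation by scaling up a random sample of cells."""
    k = int(len(cells) * sample_frac)
    return sum(rng.sample(cells, k)) / sample_frac

# A landscape of 1000 cells where all deforestation (100 units) is packed
# into just 10 cells along a hypothetical road: highly clustered.
clustered = [10.0] * 10 + [0.0] * 990
true_total = sum(clustered)  # 100.0

rng = random.Random(0)
estimates = [sample_estimate(clustered, 0.10, rng) for _ in range(2000)]

# Individual 10% samples swing wildly around the true total: many miss
# the "road" entirely (estimate 0), others catch it twice and overshoot.
spread = statistics.pstdev(estimates)
```

Averaged over many samples the estimator is centered on the truth, but any single 10% sample, which is all a one-off survey gets, can be grossly wrong, which is the substance of the "grossly inaccurate numbers" criticism.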

    Indeed, everyone agrees on the need for better and more consistent forest data from almost all countries. To minimize such problems in the future, Holmgren says the FAO has proposed an initiative, called the Global Forest Survey, that would establish a global standard for data collection and support individual countries in monitoring their forest resources.


    Physicists Scramble to Recapture the Magic

    1. Robert F. Service

    SEATTLE, WASHINGTON—Superconductivity researchers call it the “Woodstock of physics.” Fourteen years ago, at the March meeting of the American Physical Society in New York City, thousands of physicists jammed a hotel ballroom until 3 in the morning to listen to 51 talks about the newly discovered high-temperature superconductors. Last Monday, the scene was repeated as over 1000 researchers gathered here for a session that lasted until 1 a.m. to compare notes on magnesium diboride (MgB2). The newfound metallic compound superconducts at 39 kelvin, a temperature well below that of high-temperature superconductors but nearly double that of its metallic kin.

    Although “Woodstock west” couldn't quite recapture the past magic, interest has been intense. “It's probably the most important discovery in superconductivity since the high-temperature superconductors,” says Gerry Perkins, a physicist at Imperial College in London. Superconductivity in the material was first reported at a meeting only in January. Within a week, the news spread around the world. Dozens of groups dropped what they were doing and raced to understand why the material superconducts well above the temperature of other metal compounds, and to see if they could coax it even higher. And unlike 14 years ago, “we knew exactly what to do this time,” says Jim Jorgensen, a superconductivity researcher at Argonne National Laboratory in Illinois.

    A flurry of results started appearing last month on the Los Alamos National Laboratory physics preprint server (Science, 23 February, p. 1476), and the American Physical Society hastily organized a session to discuss the latest findings. In Seattle, just 2 months after the first report, 79 researchers presented papers in rapid-fire succession, with just 2 minutes each to present their results and 1 minute for questions. “Progress in the field has been very, very rapid,” says Robert Cava, a superconductivity researcher at Princeton University in New Jersey. “The world has learned an awful lot about this material.”

    The lesson so far is that MgB2 “seems to be a standard, old-fashioned superconductor,” says Paul Canfield, a physicist at Iowa State University in Ames. Metal compounds conduct electricity without the usual electrical losses thanks to a mechanism first described in 1957 by John Bardeen, Leon Cooper, and Robert Schrieffer. Their BCS theory says that vibrations of the material's crystalline lattice cause electrons to overcome their usual repulsion of one another and surf through the material in pairs. Most theorists suspected that above about 20 K, additional vibrations in the lattice would cause these pairs to fly apart. The fact that MgB2 works at nearly 40 K led some to suspect that a different type of glue must be holding electron pairs together. At the meeting, however, numerous teams described experimental evidence on the way MgB2 conducts heat, transports electrical current, and behaves under pressure that makes it look like a familiar BCS superconductor.

    “Generally, the consensus is we know what is going on,” says Marvin Cohen, a theoretician at the University of California, Berkeley. Because of that, Cohen says, you could sense interest in the fundamental physics of MgB2 deflating as the late-night meeting wore on. Adds Paul Grant, a superconductivity expert at the Electric Power Research Institute in Palo Alto, California: “I'll give it a half-life of about a year on the front lines of fundamental physics.”

    Even so, Grant and others say that interest in the more practical use of the material is just heating up. And it was in this area that some of the most exciting results at the meeting were revealed. Physicist David Larbalestier of the University of Wisconsin, Madison, reported that UW researchers led by Chang-Beom Eom had already made thin films of the material that conduct up to 10,000 amps per square centimeter. Although high-temperature superconductors can do more than 100-fold better, MgB2 films need few of the tricks required to make high-temperature superconductors carry high currents. The reason, says Larbalestier, is that high-temperature superconductors are layered materials in which electrical currents zip through in flat planes. For the current to hop from one crystallite in the material to another, the crystalline lattices of the two grains must be nearly perfectly aligned. But MgB2, like other metallic superconductors, doesn't require this same alignment, because superconducting currents can flow in any direction. That should make wires much easier to fabricate, Larbalestier says. In fact, his group has already made a nearly 10-meter-long wire of MgB2 encased in copper, although for now the amount of supercurrent it can carry remains low. Still, Grant says, “it's a good start.”

    Perkins revealed an equally welcome result: a way to prevent the material from losing its superconductivity in high magnetic fields. For superconductors, magnetic fields are like Superman's kryptonite: They sap a superconductor's power in short order. Magnetic fields initially penetrate a superconductor at particular spots called vortices, which look like tiny whirlpools of magnetic flux. If these vortices move around in the superconductor, they can dissipate electrical energy. Early results didn't bode well: The effect is particularly strong in MgB2, potentially ruling out applications such as electrical motors that generate such fields.

    To lessen this problem in other superconductors, researchers typically fire protons at a material or use other methods to create an array of defects in the material's crystalline lattice. These defects locally disrupt the material's ability to superconduct, and because superconductivity is already suppressed at those sites, it costs a vortex less energy to sit at a defect than anywhere else. As a result, the defects pin vortices in place, allowing the supercurrent to weave around them and continue through the material without shedding energy. At the late-night session, Perkins reported that his team had introduced defects into MgB2 that sustain the material's ability to carry a high electrical current in a magnetic field of 3 tesla, a field well above that found in many applications.

    Even better would be to find new relatives of MgB2 that carry supercurrents at still higher temperatures. Jun Akimitsu, a physicist at Aoyama Gakuin University in Tokyo, reported that adding beryllium to MgB2 creates a new superconductor, but the temperature drops to 35 K. Still, most researchers are hopeful that better compounds may yet be found. Cohen says BCS theory suggests that the superconducting temperature of the material should go up if researchers can somehow expand the spacing of atoms in MgB2's crystalline lattice. One way to do that, he says, is to substitute larger calcium atoms for magnesium. It hasn't worked yet. But Jorgensen, for one, believes there is good reason to hope. “What we have here is a remarkable science surprise. There surely must be more surprises ahead.”
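    Cohen's reasoning can be sketched with the weak-coupling BCS formula, Tc ≈ 1.13 Θ_D exp(−1/λ): because the electron-phonon coupling λ sits inside an exponential, a modest change in the lattice shifts Tc a long way. The Debye temperature and coupling below are illustrative guesses tuned to land near MgB2's 39 K, not measured values:

```python
import math

# Weak-coupling BCS sketch: Tc ~ 1.13 * Theta_D * exp(-1/lam).
# Both parameters are hypothetical, chosen only to land near 39 K.
THETA_D = 900.0    # Debye temperature, K (boron is light, so high)
LAM = 0.31         # effective electron-phonon coupling (assumed)

def tc(theta_d, lam):
    """BCS transition temperature in the weak-coupling limit."""
    return 1.13 * theta_d * math.exp(-1.0 / lam)

base = tc(THETA_D, LAM)
boosted = tc(THETA_D, LAM * 1.1)   # e.g., a softer, expanded lattice
print(f"Tc = {base:.0f} K; with 10% stronger coupling: {boosted:.0f} K")
```

    In this toy picture, a 10% strengthening of the coupling raises Tc by more than 10 K, which is why small structural changes, such as expanding the lattice, are worth chasing.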


    Shake-Up in Nuclear Weapons Program

    1. Pallava Bagla

    NEW DELHI, INDIA—Pakistan has removed its two top nuclear scientists as part of a major reshuffling of the country's nuclear weapons program. Observers see the move as a signal that the military government is trying to conform to global norms on the management of nuclear material as well as distance itself from a controversial figure regarded as the “father of the Islamic bomb.”

    On 10 March the government announced that Abdul Qadeer Khan, head of the country's major nuclear weapons research lab, and Ishfaq Ahmad, head of the Pakistan Atomic Energy Commission (PAEC), were being made special science and technology advisers to the chief executive, General Pervez Musharraf. The government is also combining separate nuclear weapons research programs at the lab and commission under a new entity, the National Defense Complex (NDC), leaving the commission to manage civilian nuclear activities.

    “It's a positive step forward in many respects,” says David Albright, president of the Institute for Science and International Security, a Washington, D.C.-based arms control organization. “It puts a fence around the military program, and it makes possible an open and safeguarded civilian program. In addition, anything that takes Qadeer Khan out of the picture makes me happy.”

    The announcement came as United Nations Secretary-General Kofi Annan was visiting Islamabad to try to ease Indo-Pakistani tensions. Although the government labeled it as a promotion, Khan has said that he will not accept the appointment. “Khan was surprised at the news and is very unhappy,” says Pervez Hoodbhoy, a professor of high-energy physics at Quaid-i-Azam University in Islamabad. Khan told Pakistani journalists that he will instead pursue “social welfare,” often a code for seeking public office.

    Khan has headed the Khan Research Laboratories (KRL) in Kahuta since returning in the mid-1970s from the Netherlands, where he trained in metallurgy and worked for an international nuclear power consortium. He is credited with setting up Pakistan's uranium enrichment facilities, which enabled the country to build the nuclear bombs it exploded in May 1998. Albright says that Khan was given extraordinary freedom to carry out his work, which some have alleged skirted or ignored international law, and that the new structure, by putting the lab under the NDC, is likely to provide more institutional oversight. “[Khan] has not been accountable to anyone in the government with the technical expertise to know what was going on,” says Albright.

    Khan will be succeeded at KRL by Javed Mirza, an experimental nuclear physicist who is now deputy chair of the lab. Pervez Butt, a nuclear engineer, takes over for Ahmad as head of the slimmed-down PAEC. “Both men are experienced and competent professionals from inside the establishments that they are now going to lead,” says Abdul Nayyar, a physicist at Quaid-i-Azam University.


    Chimp Sequencing Crawls Forward

    1. Dennis Normile

    TOKYO—An international team of researchers is cobbling together an effort to sequence the chimpanzee genome. So far, however, the big global genome players aren't joining the party. Scientists from Japan, Germany, China, Korea, and Taiwan used a meeting here last week to promote the project, while interested U.S. researchers were dubious that their government would provide a significant portion of the estimated $100 million needed to complete it.

    Yoshiyuki Sakaki, director of the Human Genome Research Group at RIKEN's Genomic Sciences Center in Yokohama, said that sequencing the chimp genome should help to answer basic questions about evolution. Only a chimp-human comparison “will show what makes humans human,” he says. Ajit Varki, a biochemist at the University of California, San Diego, believes that the knowledge could also prove invaluable in treating a number of human diseases, including AIDS and Alzheimer's, that are very difficult to model in animals. Varki, who is studying the evolutionary implications of a gene that chimps possess but humans don't (see p. 2340), co-authored a public letter last summer urging the U.S. government to climb aboard (Science, 25 August 2000, p. 1295). But no government funding has been forthcoming from the United States, the United Kingdom, or France.

    Rumors were rife at the meeting that Celera Genomics of Rockville, Maryland, one of two teams that sequenced the human genome, might turn its large bank of sequencing machines over to the chimp genome. But Celera's president, J. Craig Venter, told Science last week that his company has no plans to do so. “[The chimp sequence] is too close to [humans] to be really useful at this stage,” Venter says.

    That lack of interest has left a niche for groups with smaller sequencing operations. Rather than join U.S. efforts to sequence the mouse, Sakaki says, “we wanted to do something where we could play a bigger role.” That sentiment was echoed by representatives of other smaller sequencing centers. Park Hong-Seog, a molecular biologist at the Korea Research Institute of Bioscience and Biotechnology in Taejon, says his colleagues feel they largely missed out on the human genome sequencing effort. “Participating in the ape genome sequencing effort would be an attractive way for us to contribute,” he says.

    Japan is taking the lead. Next month RIKEN's Genomic Sciences Center will get $2 million to construct a map; scientists hope for additional funding in fiscal 2002 to begin sequencing. The Max Planck Institute for Molecular Genetics in Berlin has also gotten approval for a chimp sequencing project; Hans Lehrach, a molecular biologist at the institute, says they, too, are negotiating funding levels. Other centers are just starting to line up support.

    Sakaki expects to divide up the work following a formal cooperative agreement “within a few months.” For example, RIKEN and the German group collaborated on sequencing human chromosome 21 in part to extend work on diseases, including Down syndrome, associated with that chromosome. The two groups hope to repeat the collaboration on the equivalent portion of the chimp genome, chimp chromosome 22.


    Barricading U.S. Borders Against a Devastating Disease

    1. Martin Enserink

    U.S. officials are stepping up efforts to keep out foot-and-mouth disease. But in this age of global travel, can they succeed?

    DULLES AIRPORT, VIRGINIA—The question on the customs declaration form, to be checked “Yes” or “No” by every traveler coming into the United States, has a somewhat anachronistic ring to it: “I am (We are) bringing fruits, plants, meats, food, soil, birds, snails, other live animals, wildlife products, farm products; or, have been on a farm or ranch outside the U.S.” It seems reminiscent of the days when you had to check whether you were a communist or a homosexual. But in recent weeks, it has become acutely relevant. Question 11 is the first line of defense against foot-and-mouth disease (FMD), which has crippled British agriculture, disrupted the country's way of life, and struck fear across the rest of Europe.

    Last week, the U.S. Department of Agriculture (USDA) used the international arrivals hall of this airport near Washington, D.C., to demonstrate that it is increasing vigilance to keep FMD out. As living proof, the agency brought out Quincy, a 7-year-old beagle trained to sniff out meat, cheese, and other contraband after passengers pick up their bags from the luggage carousel. Direct flights from FMD-affected countries will be subject to increased scrutiny by Quincy and his peers, the agency said. For TV crews, USDA also staged the mock arrival of a traveler from Britain who, after confessing she had visited a farm there, was asked by friendly-but-firm agents to surrender her muddy tennis shoes. The footwear was taken to a back room, where an agent scraped off the dirt and disinfected the soles with a stiff brush dipped in a bleach solution. Then she was free to go. “The shoes are clean,” sneered one French reporter after witnessing the performance. “The country is safe.”

    Funeral pyre.

    The grim site of burning carcasses here in Longtown, U.K., reflects the fact that it is cheaper to stamp out foot-and-mouth disease than to prevent it.


    But some experts doubt that it really is. When Science went to press this week, no signs of FMD had been seen in the United States, while the epidemic raged on in the U.K. But FMD is one of the most contagious diseases on Earth and is constantly on the rampage: Since early 2000, it has struck Russia, China, South Korea, Taiwan, Japan, Mongolia, and at least seven African and five South American countries. It infects all cloven-hoofed animals, such as cattle, pigs, sheep, and goats. Given that impressive march, there's little doubt that the virus will sooner or later reach the United States, says veterinary epidemiologist Peter Cowen of North Carolina State University, Raleigh. “I just don't see why we would stay immune,” says Cowen. “I think the question is when we'll get it, not if.”

    Without doubt, the economic toll in this agriculture-dependent country would be devastating, say Cowen and others. So why is the nation dependent on bleach and beagles—especially when good vaccines to ward off the disease have been available for decades? Even some scientists are at a loss to fully explain why—but the answer involves the global economics of modern farming, in which the direct and indirect costs of vaccines often make it cheaper to stamp out disease outbreaks than to prevent them.

    Two oceans

    Although local foot-and-mouth outbreaks have occurred almost yearly in Europe for most of the 20th century, the United States, where it's often called hoof-and-mouth disease, had its last epidemic in 1929. (Mexico had outbreaks until the 1940s, and Canada had one in 1952.) Similarly, classical swine fever breaks out in Europe regularly; in 1997 it forced Dutch authorities to massacre almost 10 million pigs. The same disease, often called hog cholera here, was eradicated from the United States in 1978. And no sign of mad cow disease, or bovine spongiform encephalopathy (BSE), has been detected in the United States, although it has spread over the past decade from the British Isles to at least a dozen other countries.

    Part of the credit goes to the strict importation rules enforced by the USDA's Animal and Plant Health Inspection Service (APHIS). Beef from the United Kingdom has been banned since 1989 because of the BSE crisis. And after FMD gained a foothold in France last week, the U.S. government temporarily banned the import of all European animals and many animal products. At the borders, Quincy and his human co-workers keep the pressure up on individual travelers who may be tempted to smuggle some bratwurst or salami in their suitcase. But the health of U.S. livestock is also a result of the country's geographic isolation, says Martin Hugh-Jones of Louisiana State University, Baton Rouge. “We have to be very grateful for two things,” says Hugh-Jones: “The Atlantic and the Pacific.” (How FMD entered the U.K. is still not known, but there are solid clues that an animal transport across the English Channel recently brought the disease into France.)

    Nonetheless, with a quarter-million people entering the United States daily at 90 entry points, it's impossible to check everybody, and the virus can survive on people's clothes or even in their throats as a passenger. That's why APHIS also has set up a second line of defense, should an outbreak occur. The strategy would be similar to the take-no-prisoners approach that Britain and France are using: Cordon off infected farms and destroy every animal potentially exposed to the virus. And it would work, asserted APHIS administrator Craig Reed at last week's press briefing. “We will contain the virus,” Reed promised. “It's our job.”


    But even a limited outbreak could have a staggering economic impact. In 1998, Javier Ekboir, then at the School of Veterinary Medicine at the University of California, Davis, built a computer model to simulate an FMD outbreak in Tulare County, California, an area with lots of animals in close proximity and a mild climate that would allow airborne spread of the virus year-round. His simulation showed that the direct cost of destroying animals and disinfecting farms and other facilities would run between $500 million and $1.5 billion; that number would be much higher if the disease spread to other counties. The cost of lost trade in the 2 years after the outbreak would run about $1.9 billion if other countries shunned California animals and meat. If they extended the ban to all U.S. meat, the cost of one small outbreak could run as high as $8.9 billion.

    More than anything else, the outcomes of the model depended on how much time elapsed before the disease was detected. With a fast-moving disease like FMD, even a few days can be extremely costly. If, for instance, health authorities started the eradication job in the second week after the first infections, 81% of the county's herds would escape culling; if they started in the third week, only 13% would. But rapid detection may be the Achilles' heel of FMD control in the United States, Ekboir concluded, because farmers don't expect the disease. Hugh-Jones, who was in the trenches during Britain's last battle with the disease, in 1967–68, says a local outbreak often went undetected until at least four or five farms had been hit.
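    Why a week's delay matters so much can be seen in a toy model in which the number of exposed herds grows exponentially until eradication begins. The growth rate and herd count below are hypothetical placeholders, not parameters from Ekboir's simulation:

```python
import math

# Toy exponential-spread sketch; all parameters are hypothetical.
TOTAL_HERDS = 5000   # assumed herds at risk in the county
GROWTH_RATE = 0.4    # assumed per-day growth rate of exposed herds

def exposed_herds(delay_days, seed=1.0):
    """Herds exposed (and thus culled) after undetected spread."""
    return min(seed * math.exp(GROWTH_RATE * delay_days), TOTAL_HERDS)

for day in (7, 14, 21):
    escaped = 1 - exposed_herds(day) / TOTAL_HERDS
    print(f"eradication from day {day:2d}: {escaped:.1%} of herds escape culling")
```

    Even with made-up numbers, the pattern matches the model's lesson: the fraction of herds spared collapses from nearly all to nearly none over the span of a single week of delayed detection.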

    In Texas, too, animal health authorities say an outbreak could be devastating. To be better prepared, the Texas Animal Health Commission played out a mock epidemic last fall, together with Canadian and Mexican officials. The exercise drove home the point that time is of the essence, says state epidemiologist Terry Conger. In this scenario, infected animals were sold at an auction, and then a trucker took them all the way to Canada in a matter of days. Officials also learned that they didn't have enough epidemiologists to deal with a big outbreak, and that red tape would prevent the quick release of indemnity funds needed to make farmers give up their herds.

    A costly gamble

    The gruesome sight of thousands of carcasses burning atop huge pyres or being dumped in mass graves raises the question of why, even though FMD has been around for centuries, farmers still can't protect their animals. The answer has less to do with science than with the fact that in livestock diseases, unlike human ones, economics is all-important. And often, it's cheaper to slaughter hundreds of thousands of animals once in a while than to vaccinate all of them all the time.

    FMD can be caused by any of seven viruses, or serotypes, called A, O, C, Asia1, SAT-1, SAT-2, and SAT-3, all members of the picornavirus family; each serotype also contains many subtypes. The outbreaks now seen in Asia, Africa, and Europe all seem to be caused by one subtype of the O serotype, says Fred Brown, a veteran of FMD research at the Plum Island Animal Disease Center, just off the coast of New York state. (The current strain has been dubbed, somewhat confusingly, PanAsian.)

    Although highly contagious, FMD does not kill most animals; calves often die, but adults usually recover within a few weeks. They often remain underweight and exhausted, however, suffer permanent foot damage, give less milk, or fail to reproduce. Whereas farmers simply had to live with that in centuries past, today the sickened animals are an economic drain and might as well be killed, the argument goes. Given how fast the disease spreads, it is imperative that the culling happen quickly. Because infected animals start churning out new virus particles before they show any symptoms, even animals that have potentially been exposed are put down.

    Economics also dictate that vaccination should be as limited as possible, despite the fact that current vaccines—which are all made of killed, not attenuated, virus—offer very good protection. Although the vaccines themselves are cheap, delivering them is expensive, especially because periodic booster shots are needed. And there's no such thing as one general FMD vaccine: Like flu vaccine, it always has to be chosen to match the strain, and even the substrain, that's a threat at a given time.

    The current vaccines, made by companies such as London-based Merial and Intervet in the Netherlands, have been consistently used in countries where the disease is still endemic. South American and European countries, for instance, have used them for decades to successfully eradicate the disease. Once an area is FMD-free, however, the scales tip, and the vaccines become less attractive. In the past, vaccines containing incompletely inactivated virus sometimes caused outbreaks; although modern manufacturing methods have greatly reduced this risk, it cannot be totally eliminated. In addition, vaccinated animals can still harbor low levels of the virus that put susceptible animals at risk; that's why disease-free, isolated countries such as the United States, the United Kingdom, and Japan have opted not to vaccinate. And because vaccinated animals produce the same antibodies as infected animals, making it difficult to distinguish between the two, those countries also refuse to import vaccinated animals or their meat. In an effort to expand the market for its meat, the European Union (E.U.) gave in to that demand by banning vaccines in 1992, after FMD had not shown its face for several years. Now, some farmers and politicians, such as the Dutch agriculture minister Laurens Jan Brinkhorst, argue that it's irresponsible not to start vaccinating anew. But so far, the E.U. has not budged.

    A test to distinguish vaccinated from infected animals could end the current impasse. United Biomedical Inc., a company in Hauppauge, New York, has developed one that works well, says company scientist Alan Walfield. As his team described in a 1999 paper in Vaccine, the test specifically detects antibodies against proteins involved in viral replication, which the virus produces but the vaccine doesn't. The test, which still needs to be approved and registered, will likely make vaccines more acceptable, Walfield predicts.

    Another approach would be to develop a vaccine whose immunologic footprint is distinct from that of an FMD virus; ideally, such a vaccine would also protect against as many strains as possible, instead of just one. But because the different types vary greatly, this is difficult to do. Brown says he's currently testing vaccines that simply consist of a short protein sequence expressed by the virus, as is United Biomedical; another group at Plum Island, led by Marvin Grubman, is trying to develop a vaccine by sticking FMD virus genes into an adenovirus that has been crippled by removing some of its own genes.

    Although Walfield says big vaccine makers have recently shown a little more interest in the disease, most FMD researchers expect progress to be slow. After the memories of the devastating 1967–68 outbreak started to fade, Britain decreased its funding for FMD research, says Brown, and currently, the disease is studied by just a handful of labs around the world. “There's a certain degree of complacency,” says Brown. “We haven't had foot and mouth for so long, so why should we worry about it?”


    Can a King of Catalysis Spur U.K. Science to New Heights?

    1. Kirstie Urquhart,
    2. Andrew Watson*
    1. Andrew Watson writes from Norwich, U.K.

    David King, one of the world's leading physical chemists, is now the U.K.'s chief scientific adviser. He describes the challenges of his new job

    LONDON—If politics is supposed to mean catalyzing change for the greater good, David King ought to be a master of the art. After all, few people have a finer grasp on the intricacies of catalysis than the Prime Minister's new chief scientific adviser, a leading expert on the interactions of atoms at surfaces.

    Positioned to exert more political influence than any other scientist in the United Kingdom, King has spent the early days of his 5-year term as chief scientific adviser—he was appointed last October—getting a feel for his new milieu. Earlier this month, King carved out time from a calendar jammed with meetings with key science players in the Parliament, in the government, and among advocacy groups to share his thoughts with Science on a range of issues, from low salaries for British scientists to coping with a series of crises culminating in the ongoing outbreak of foot-and-mouth disease.

    Although King thus far has refrained from pushing for shifts in science policy, that should change after national elections, which the current Labour government is widely expected to win. (The election could be called as early as 3 May.) “Immediately after the election, he's going to have to swing into action and make sure that science is high on the political agenda,” says biologist Ian Gibson, a member of Parliament who serves on the Science and Technology select committee.

    Observers predict that King will be an effective advocate for science. The chief scientific adviser should be “somebody who is an excellent scientist” with “a strong and independent voice”—and King fits the bill, says his predecessor, Sir Robert May, who is now president of the Royal Society. But perhaps the biggest asset King brings to the job is his power of persuasion, says Gibson: “He can seduce people into doing things.”

    Skimming the surface

    Born in Durban, South Africa, in 1939, King earned a doctorate in chemistry from the University of Witwatersrand in Johannesburg in 1963, followed by a postdoc at Imperial College London. He spent more than 2 decades at the University of East Anglia in Norwich, U.K., and the University of Liverpool before landing a professorship at his current home, the University of Cambridge, in 1988.

    One of King's early passions was scrutinizing the encounters of gaseous hydrogen and metallic tungsten. He found that as the hydrogen molecules cleave, the individual atoms glom onto the surface, elbowing the tungsten atoms into new configurations. His studies helped show that “metal surfaces are not rigid checkerboards; they are flexible,” says physical chemist Richard Lambert, a Cambridge colleague. King also helped improve low-energy electron diffraction, a technique used to build up atomic-scale images of surfaces.

    But the innovation that has won King the most plaudits from his peers so far is the single-crystal microcalorimeter. This device measures the heat shed by molecules as they break apart at a surface. Pulses of molecules are fired at a crystal wafer barely two-tenths of a micrometer thick. A thermal camera monitoring this molecular barrage picks up infrared radiation as the wafer heats up by anywhere from 0.1 to 1.0 degrees Celsius per pulse. The device offered a new way to measure the energy liberated when atomic bonds are broken. “It's really a quantum leap above what anybody else has done,” says physical chemist John Yates of the University of Pittsburgh.
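    The scale of such a measurement can be sketched with a simple heat balance, q = m·c·ΔT. Every number below, from the platinum wafer to the molecular dose per pulse, is a hypothetical stand-in rather than a figure from King's experiments:

```python
# Back-of-envelope heat balance for one microcalorimetry pulse.
# All quantities are assumed illustrative values, not published data.
RHO_PT = 21_450              # platinum density, kg/m^3
C_PT = 133                   # platinum specific heat, J/(kg*K)
AREA = 1e-4                  # assumed 1 cm^2 wafer, m^2
THICKNESS = 2e-7             # ~0.2 micrometer, as in the article, m
DELTA_T = 0.5                # mid-range temperature rise per pulse, K
MOLECULES_PER_PULSE = 1e14   # assumed dose per pulse

mass = RHO_PT * AREA * THICKNESS                     # wafer mass, kg
heat_per_pulse = mass * C_PT * DELTA_T               # joules per pulse
ev_per_molecule = heat_per_pulse / MOLECULES_PER_PULSE / 1.602e-19

print(f"heat per pulse: {heat_per_pulse:.2e} J")
print(f"~{ev_per_molecule:.1f} eV released per adsorbing molecule")
```

    The point of the exercise: a sub-micrometer wafer is so thin, and its heat capacity so small, that even the tiny energies of bond-breaking at a surface produce a temperature rise a thermal camera can see.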

    Such insights are highly prized by industry. “If you understand in detail the mechanisms going on on a surface, then perhaps you can develop better catalysts,” says buckyball Nobelist Harry Kroto of the University of Sussex. Indeed, King and his team have recently unraveled how ammonia reacts with oxygen in the presence of platinum—a discovery that, by suggesting a more efficient way to drive the reaction, could save the fertilizer industry millions of dollars on platinum catalysts. In the hot field of surface chemical physics, Kroto says, “there's no doubt that Dave King is a leading scientist.”

    Culture change

    King hasn't forsaken his thriving research career. He works 4 days a week here in a stark room (bare walls, empty bookshelves) at the Department of Trade and Industry's Office of Science and Technology before decamping to Cambridge to spend Fridays with his research team. “When I took this job,” he says, “I made it a condition that I could keep doing research.”

    King has inherited a hot seat in the science adviser post. In the last several months, uncertainties over health risks posed by bovine spongiform encephalopathy (BSE), genetically modified crops, the measles-mumps-rubella vaccine, and depleted uranium armaments have dominated the newspapers—issues all overtaken by the grim vigil on the foot-and-mouth disease outbreak. In this unsettling context, King says, “science has hardly been out of the news.” The string of crises has posed a huge challenge, he says: “The single biggest problem is public confidence in science.”

    Part of the solution is ensuring greater transparency in the way policy-makers take scientific advice, King argues. He describes the recent Phillips Commission report on the government's fumbling of the BSE crisis as “an invaluable audit” and gives May credit for having anticipated many of the report's recommendations: “He could clearly see what needed to be done in response” to the fiasco. In the past, King says, his predecessors and each ministry's chief scientist were discouraged from being forthcoming about risks, no matter how negligible, posed by new technologies. The new policy, King says, is that “you have to come straight out with it and say what you know about the risks … and what your decisions are and why you took those decisions.”

    King holds up the 1-year-old Food Standards Agency as the “flagship” effort to change the culture within the government. By holding meetings in public and posting minutes to the Web, he says, the agency “is spearheading this whole notion of engaging with the public in the decision-making process.” But although everyone across government has bought into this idea, he claims, “it takes quite a while before people actually behave, on each occasion, according to the new culture.” An important part of his job, he says, will be to ensure that this culture change takes hold: “I'm keeping a very watchful eye on it.”

    Among his core constituents, a preeminent concern is salaries. Not surprisingly, King, a former president of the Association of University Teachers, believes these must be raised. “When we advertise a top position in a top university in the U.K., we ought to judge whether salaries are being paid properly by looking at the people who apply,” he says. King concludes that many departments are failing to attract the best applicants and says he will fight to double top salaries. (The salary scale for university lecturers—comparable to assistant and associate professors in the United States—currently tops out at just under 40,000 pounds a year, or $57,000). But all levels of scientists are underpaid, he insists. As a former department head, King says, “one of the most upsetting things for me was year after year watching the brightest crop of graduating students not going into careers in science” but opting instead for high-salary careers in finance, for example.

    King promises that his initiation into the world of British science policy “won't last much longer.” One of his main priorities after the election, he says, will be to shore up energy research, from biomass to fusion. “We need … to be working very, very hard on future energy scenarios,” he says.

    Observers are watching to see how King's tenure will differ from May's. Although both grew up outside the United Kingdom, they bring entirely different people skills to the table. “Bob May is the outback Australian,” says Gibson, a former colleague of King's at the University of East Anglia, “whereas Dave is more the smooth, sherry-drinking type, although he tells me he doesn't drink sherry now.” That King has managed to sustain this illusion is another sign that, as a student of surfaces, he should easily grasp the contours of the political world as well.


    Doing the Bose Nova With Your Main Squeeze

    1. David Voss

    In Bose-Einstein condensates, dancing atoms merge into a chorus line. Now physicists are teaching them some new steps

    Six years ago, researchers in Boulder, Colorado, hit the physics jackpot when they created a new state of matter. By trapping a wisp of rubidium atoms and cooling them to a few hundred billionths of a degree above absolute zero, the physicists managed to get all the atoms to lock together in one quantum mechanical state—as uniform and coherent as a single particle. The frigid rubidium vapor, the first Bose-Einstein condensate (BEC) made out of an atomic vapor, threw the physics community into overdrive as labs raced to build magnetic traps and create colder and colder atoms. Since then, the condensate gold rush has slowed as easy veins of new physics ore were mined out and fewer physicists came in to stake their claims. More recently, many researchers have concentrated on fine-tuning their gadgetry, and the torrent of preprints and papers has slowed.

    Yet within the past year or so, a remarkably diverse set of results has continued to amaze the atom wranglers. Much to the satisfaction of condensed-matter physicists, the tenuous vapors that make up atomic BECs turn out to resemble much denser substances known as quantum fluids—including the classic superfluid, liquid helium. In other labs, researchers are putting the materials through some weird contortions: engineering them with quantum properties that might lead to ultraprecise measurements of distance or time; imploding atomic vapors at will to create a kind of miniature supernova or “Bose nova”; and pumping their atoms so full of internal energy that less uniform substances would be instantly destroyed.

    “I am truly amazed that even in the sixth year of BEC research, there is so much excitement and so many new things happening, both conceptually and experimentally,” enthuses Wolfgang Ketterle of the Massachusetts Institute of Technology (MIT), leader of one of the early groups to achieve BEC.

    Quantum fluid tricks

    The history of BECs began in 1937, when Pyotr Kapitsa, working in Moscow, cooled liquid helium below its 4.2-kelvin boiling point and discovered an astonishing thing: At 2.17 kelvin, the boiling stopped and the liquid became calm again, but it had no viscosity. All trace of resistance to fluid flow had vanished; Kapitsa had discovered the first superfluid. With the jiggling from thermal energy removed, theorists explained, the wave functions of all the atoms had locked together. The helium fluid had undergone Bose-Einstein condensation down to a common quantum state.
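The threshold Kapitsa crossed can be made quantitative. For an ideal gas of bosons of mass $m$ and number density $n$, the standard textbook result (not given in the article) for the condensation temperature is:

```latex
% Textbook condensation temperature for an ideal Bose gas
% (a standard result, not derived in the article):
T_c \;=\; \frac{2\pi\hbar^2}{m k_B}\left(\frac{n}{\zeta(3/2)}\right)^{2/3}
```

Because $T_c$ scales as $n^{2/3}$, dense liquid helium condenses at a few kelvin, while the far more dilute trapped vapors described above must be cooled to nanokelvin temperatures.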

    Since then, liquid helium has been the lab rat for many experiments in condensed-matter physics. One of the key findings was that superfluids break up into tiny whirlpools when they are rotated. The vortices are strikingly long-lived. In theory, without viscosity to stop them, they could persist forever. For many physicists, this vortex phenomenon is the acid test for superfluidity. “Superfluidity has many manifestations,” says Ketterle, “but for many people vortices are the most direct evidence.”

    When the much more rarefied atomic BECs came along, physicists wondered whether they would pass that test. Over the past year or so, most have concluded that they do. Late in 1999, groups at JILA in Boulder and the Ecole Normale Supérieure in Paris directly observed superfluid whirlpools in BECs. They created the vortices by using a laser beam, in effect as an optical spoon to stir up the condensate. As the stirring speed increases above a critical velocity, they found, a hole forms in the condensate as the quantum fluid spins around it—just as it does in liquid helium. By dragging a laser beam through a condensate, Ketterle and his group at MIT confirmed that below a critical speed the condensate flowed around the beam without resistance, again just as in a superfluid.

    Recently, BEC researchers have nailed down more elaborate superfluid phenomena. In the laboratory of Jean Dalibard at the Ecole Normale Supérieure, researchers have created impressive arrays of vortices. As they stir more vigorously, more and more vortices appear, packing themselves into lattices that look strikingly like the arrays of magnetic vortices observed in superconductors—another example of outlandish quantum behavior at extremely low temperatures. Ketterle's group has taken the creation of vortex lattices the farthest to date. In a paper published online today by Science, they report that they have spawned highly ordered arrays of more than 100 vortex lines, lasting as long as 40 seconds. “If you want to see superfluid motion, you are looking for something persistent,” Ketterle explains. “We can watch the vortices make 50,000 rotations. In a classical gas, these would be completely damped out.” Meanwhile, Eric Cornell and colleagues at JILA have used a laser to change spinning vortices into doughnut-shaped rings something like smoke rings.

    Small whirl.

    Vortices in BEC stirred by laser show it's a superfluid.


    Such experiments have left most physicists convinced that cold atomic vapors behave like superfluids. “Everybody believed that superfluidity was going on in these things,” says Bill Phillips of the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland, “but seeing it unequivocally was something people were looking for.”

    A helping of fresh squeezed

    While some physicists stir BECs, others prefer to squeeze. In many labs, the squeezing is metaphorical—a way of working around the limits to knowledge imposed by Werner Heisenberg's famous uncertainty principle. The principle limits how accurately anyone can hope to pin down the values of two complementary physical quantities. The better you can measure the position of a particle, for example, the less you know about its momentum; the better you measure its energy, the less precisely you know the time at which you measured it. For decades, scientists have resigned themselves to the bounds the principle sets on the ultimate knowability of the universe.

    But applied physicists have sought to squeeze lemonade out of Heisenberg's lemons. The limit, they note, applies only to the product of two uncertainties—not to each one separately. Thus, in principle, the uncertainty of one variable can be very, very low while the uncertainty of the other is immense. In other words, if you're only interested in one quantity and not the other, you could “squeeze” all the uncertainty into the less pertinent variable and capture the interesting quantity with sublime accuracy. It would be like zooming down the Long Island Expressway in a microscopic sports car: The highway patrol could precisely clock your excessive speed (or momentum, to be more accurate), but they couldn't give you a ticket because they wouldn't know your position.
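In symbols, the trade-offs described above are the standard uncertainty relations (not written out in the article):

```latex
% Position-momentum uncertainty relation:
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
% Approximate number-phase relation relevant to the BEC experiments:
\Delta N \,\Delta\phi \;\gtrsim\; \tfrac{1}{2}
```

Squeezing shrinks one uncertainty far below its balanced value while the other grows, keeping each product at or above its bound.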

    Physicists who work with photons were among the first to tackle this concept, and much effort has gone into trying to make “squeezed states” with light. By making fluctuations of some desired quantity (such as wavelength) extremely small, they hope to make measurements with great precision—perhaps great enough to detect gravitational waves. “The fly in the ointment is you've got to reliably make these squeezed states,” says Mark Kasevich of Yale University, “and it turns out to be very difficult to do that in optical systems.”

    As Kasevich and his colleagues report on page 2386, atomic BECs may offer an easier way to the promised land. They've created squeezed states of atoms by loading bits of a BEC into tiny traps formed from walls of light. “We can retroreflect a laser beam and create a standing wave,” Kasevich explains, “and that overlaps with the magnetic atom trap to make this corrugated optical potential. Then we can load little condensates into each of these corrugations.”

    In this experiment, Kasevich says, Heisenberg's trade-off pits the number of quantum particles against the phase of the wave function that describes them. If the number of atoms is well defined, the phase—the point at which the amplitude of the wave crosses zero—should be all over the map. To bring about such a situation, the researchers set the laser at low intensity, allowing the atoms to “tunnel” from one trap to the next. When about 1000 atoms have fallen into the corrugations, they jack up the laser intensity, preventing any further tunneling. Now the numbers of atoms in each well ought to be nailed down—the fluctuations in number negligible.

    Are they? To check, Kasevich's group looks at the complementary variable, phase. If it were well defined, Kasevich says, the atomic bunches in each optical well ought to create interference fringes like those that arise from combining laser beams. But “if we make squeezed states, the phase fluctuations should wash out the interference fringes,” he notes. When they turn the corrugated trap laser off, the atoms are free to interact and interfere. And indeed, the atomic interference fringes are smudged out, a key sign that the phase is nearly undefined—the hallmark of a squeezed state.

    Kasevich says such states may offer an alternative to optical squeezed states for ultraprecise measurement. Now that scientists know how to create them, he says, the next crucial step will be to reverse the squeeze, blurring the number of atoms in order to home in on information about the waves. “The physics community is building all of these instruments based on atom wave interference for doing measurements,” says Kasevich. “Right now they run at the classical noise limit. If we have good ways of making squeezed states with atoms, we can realize a tantalizing enhancement in sensitivity.”

    Doing the Bose nova

    Meanwhile, at JILA, Carl Wieman and Cornell have been forcing condensates to run squeeze plays for real. By tweaking a BEC cloud with a magnetic field, they triggered a spectacular collapse.

    The key was to subvert the BEC's atoms. Most atoms used to make condensates, such as the isotope rubidium-87, slightly repel one another. In the early days of atomic BECs, many researchers thought that only repulsive atoms could form BECs. Theorists believed that a gas of atoms with slightly attractive interactions would instead turn to a liquid or solid when cooled. In 1995, Randy Hulet's group at Rice University in Houston disproved that notion by forming small BECs out of lithium-7, an isotope with slightly attractive atoms. Still, if the number of atoms in the BEC cloud becomes too large, the Texans found, the attractive force wins out, and the cloud implodes.

    Wieman and Cornell have discovered that another isotope of rubidium, Rb-85, is even more fickle. In the 28 August 2000 issue of Physical Review Letters, the JILA researchers described how they changed its atoms' interactions from repulsive to attractive by using a magnetic field to exploit a phenomenon called a Feshbach resonance. When Cornell and Wieman did that last June, they saw the atomic cloud collapse on itself as the atoms suddenly became attractive. Then came something even more astonishing: When the BEC vapor collapsed, it spit out a tiny blast of hot atoms. Cornell likens the miniblast to the neutrino burst that comes out of a collapsing star during a supernova. In a dying star, fusion reactions create energy that staves off gravitational collapse. When the fusion furnace goes out, however, the star implodes and the infalling mass creates a powerful shock wave that rebounds at the center and blows everything apart—the supernova. The shock wave also drives a huge burst of neutrino emission.
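The field dependence that makes this tuning possible is usually written in a standard textbook parametrization (the article does not give it):

```latex
% s-wave scattering length near a magnetic Feshbach resonance:
% a_bg is the background scattering length, B_0 the resonance field,
% and Delta the resonance width.
a(B) \;=\; a_{\mathrm{bg}}\left(1 - \frac{\Delta}{B - B_0}\right)
```

Tuning the field $B$ across $B_0$ flips the sign of $a$, switching the effective interactions between repulsive ($a > 0$) and attractive ($a < 0$).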

    The JILA team thinks something similar is going on with their collapsing BEC. This “Bose nova,” as Wieman calls it, even leaves behind a tiny BEC core like a supernova remnant. “We think there is some kind of shock wave phenomenon, but nobody knows the exact mechanism for heating the atoms that are emitted,” Cornell says. The Bose nova will now provide a playground for exploring the rich physics of condensate collapse.

    Big squeeze.

    A laser beam bounced off a mirror creates a standing wave that traps atoms from a BEC (top). By hiking the intensity, physicists keep atoms from tunneling out of their pockets (middle). Lack of interference fringes when the laser is turned off (bottom) shows that the BEC's wave function was “squeezed.”

    More BEC surprises are likely to be on the horizon, as physicists attempt to make condensates of other atoms and molecules. Last month, a group led by Alain Aspect at the University of Paris in Orsay expanded the condensate zoo by returning to the element that began it all: helium. In another paper published online today by Science, they report that, unlike the atoms in liquid helium, the atoms in the new condensate have been boosted above their lowest energy state. In each of them, one of the two electrons has been kicked upstairs to an excited energy level. Starting with a beam of such “metastable” helium, the physicists slow it down and cork it inside a magnetic trap. Each of the resulting excited electrons carries a whopping 20 electron volts of energy—enough to rip the electrons out of just about anything it touches. So a gas of metastable helium is like a cloud of little flying sticks of dynamite, just waiting to bump into something and detonate.

    “It's like a bomb waiting to explode,” Aspect says. “When isolated, it will not decay. But if the atom collides with anything except another helium atom with the same spin, it releases its energy.” What prevents the Orsay group's metastable helium vapor from self-destructing is that during the Bose-Einstein condensation process all the atoms are put into the same spin state. Every atom has a total spin that is the sum of the spins of the electrons and the nucleus, and they can be lined up just like a bunch of little toy tops or bar magnets. If the spins were all randomly oriented, the gas wouldn't last more than a millisecond. But when the spins are all pointing the same direction, as in a BEC, quantum theory says the internal energy of the metastable helium atoms cannot be released in a collision. Thus the condensate is stable. “This is just absolutely astounding and wonderful,” says Phillips of NIST. “People have talked about this for a long time and thought it would never happen. The data look really nice.” Researchers at the Ecole Normale Supérieure achieved the same result just a few days later.

    What use is a BEC with an insanely high internal potential energy? For one, its atoms can be counted and detected with unheard-of sensitivity. “Metastable helium going into our detector creates a huge signal, and we can even detect a single atom,” Aspect says. This means better BEC experiments with fewer atoms. Perhaps more exciting is the potential to use metastable helium BEC as a new kind of laser. According to Aspect, a slight kick in the right way could release the 20-eV energy stored in the helium as light, producing laser emission in the deep ultraviolet.

    All in all, researchers say, the future of condensate work appears as vigorous as a vortex in a superfluid. New BEC machines and new condensates such as metastable helium continue to drive the field. “With all that hype in 1995 and 1996, I was concerned that the BEC field [would] not live up to the high expectations,” Ketterle says. “It is very exciting to see that BEC is exceeding expectations and is still producing surprises.”


    Linnaeus's Last Stand?

    1. Elizabeth Pennisi

    A fight has erupted over the best way to name and classify organisms in light of current understanding of evolution and biodiversity

    These days the once-serene hallways of the world's natural history museums are anything but tranquil. A small but powerful contingent of systematists is challenging more than 2 centuries of taxonomic tradition by proposing a new system for naming and classifying life, one they say is more in line with the current understanding of evolution. Their brash proposal, which will be debated at a symposium in Washington, D.C., on 30 and 31 March, has raised the ire of the more conservative leaders in the field. The resulting controversy over the new naming system, known as “PhyloCode,” has pitted colleague against colleague, office mate against office mate. “You've got people willing to throw down their lives on both sides,” says Michael Donoghue, a phylogenetic systematist at Yale University in New Haven, Connecticut.

    Although few biologists pay rapt attention to systematics, the new proposal, if it prevails, could broadly affect how people think about the biological world. For more than 200 years, a Latin “first” and “last” name—genus and species—has been de rigueur for each organism on Earth. No matter what a person's native tongue or the common name of a species, “Quercus alba” identifies the same exact tree species—white oak—the world over. Yet under PhyloCode, which seeks to reflect phylogenetic relationships, genus names might be lost and species names might be shortened, hyphenated with their former genus designation, or given a numeric designation. The critics are not happy.

    The traditional system groups organisms in part according to their resemblance to a representative “type” specimen and places them in a hierarchy of ever more inclusive categories called ranks that have helped people organize and communicate their thinking about flora and fauna. The new naming system would be based more explicitly on evolutionary relationships. Instead of being grouped into ranks, such as genus, family, and order, organisms would be assembled into “clades,” defined as any set of organisms with a common ancestor.

    Under PhyloCode, each clade's name would refer to a node in the tree of life and should thus provide nomenclature more appropriate for modern biological thinking, says the Smithsonian's Kevin de Queiroz, one of PhyloCode's developers. As such, it should simplify the current push to catalog millions of undescribed (and unnamed) species. “The inappropriateness and ineffectiveness of the current system in naming clades are obvious,” asserts Philip Cantino, a plant systematist at Ohio University in Athens. “New clades are being discovered every day, but few are being named.”

    Defenders of the Linnaean system disagree, maintaining that its shortcomings—and the advantages of PhyloCode—are exaggerated. “PhyloCode is an impractical and poorly founded system,” says Jerrold Davis, a systematist at Cornell University in Ithaca, New York. But Davis is worried nonetheless. “There's just one group of people standing on the street corner making a lot of noise,” he says. Yet, “it's starting to consume resources and starting to appear in the popular press as if these folks have won.”

    Taxonomic tradition

    The Swedish botanist Carolus Linnaeus could not have anticipated the uproar that has erupted concerning the classification and nomenclature system he described in a 1758 book called Systema Naturae. At the time, names tended to be strings of descriptors that varied in length and meaning depending not just on the characteristics of the plant or animal but also on the scientist who named it. To enable botanists to equate plants from Europe, say, with plants from Turkey, Linnaeus devised a standardized system that has grown into the modern genus-species designation.

    He also came up with basic principles for organizing newly named species into groups and then for assigning groups to specific taxonomic categories. His followers shaped this classification system into the “ranks” that have since been taught in every basic biology class. Thus humans are Homo sapiens, part of the genus Homo, the family Hominidae, the order Primates, the class Mammalia, the superclass Tetrapoda, the subphylum Vertebrata, the phylum Chordata, and the kingdom Animalia.

    But in Linnaeus's mind, a species never changed—Darwin's observations about variation and evolution were still a century away. Thus, the Swede's system made no provision for naming and classifying organisms with evolutionary relationships in mind. “The Linnaean system was set up under a creationist world view to reflect a hierarchy of ideas in the eyes of the creator,” explains Brent Mishler, herbarium director and systematist at the University of California, Berkeley. Furthermore, as far as Linnaeus could tell, life consisted of about 10,000 species. “The world was much more circumscribed than [the one] we have today,” points out systematist Peter Stevens of the Missouri Botanical Garden in St. Louis.

    Darwin's 19th century contemporaries maintained the Linnaean system. But as they learned more about the number and evolution of organisms, they found they had to devise ever more extensive nomenclature rules, or “codes”—one each for plants, animals, and microbes—that would guide researchers as they fit new species into the traditional ranked hierarchies and enable them to keep names and classifications straight.

    Under the traditional system, a taxonomist begins by assessing the physical characteristics—say, the petal or leaf arrangements a set of species has in common—then selects the most representative species to be the “type” for each genus, then the most representative genus to be the type of the family, and so forth. Individual specimens are then deposited in a museum to serve as the reference point for that species and genus. Thereafter, as new specimens with similar characteristics are found, they are deemed part of a known species, a new species, or even a new genus based on how closely they resemble the type specimen. In this way the original “type” becomes an anchor point for the ranked groups to which it belongs. Thus, the flowering plant Aster amellus is the type species for the genus Aster, which in turn is the type for the family Asteraceae and the order Asterales.

    Because of this dependence on type species, if a systematist reassesses a group of organisms and concludes that certain members don't belong, this removal can sometimes mean that the group's name must change, or a new group name must be created. Thus, when a group of herbs called Ajuga was added to the subfamily called Teucrioideae, that subfamily had to be renamed for the herb's original subfamily, Ajugoideae, because it was the older name. And because Lamium, the genus of a common weed called henbit (Lamium amplexicaule), is the type genus for the mint family, any subfamily with Lamium in it must go by the name Lamioideae. But over the past 35 years, Lamium has been reclassified three times—with each reclassification putting it into a new group. Thus three different subfamilies have borne the name Lamioideae at one time or another. This ambiguity and these name changes are a hassle, say PhyloCode's advocates.

    Another problem is that researchers unfamiliar with the intricacies of ranks often misinterpret them. For example, sometimes biodiversity is assessed in terms of numbers of families, but that ranking says little about the number of species contained therein. The family Hominidae has only one living species—Homo sapiens—while other families have tens, hundreds, or in the case of some plant families, even thousands of members. Because ranks are not always equivalent, a simple family count may give a false picture of an area's biodiversity. Even Peter Forey, a vertebrate paleontologist at the London Natural History Museum who supports the current naming system, agrees that “the Linnaean ranks … don't mean a lot in the modern-day world.”

    Time for a change?

    These drawbacks became apparent to de Queiroz in the 1980s while he was a graduate student at the University of California, Berkeley. At the time, a new way of classifying organisms called cladistics, based on assessing the evolutionary histories of features shared by organisms, had begun to make its mark on the field. This was causing great rifts among systematists about how they should do their work, as existing Linnaean categories were not based on phylogeny. Like increasing numbers of his contemporaries, de Queiroz wanted to reclassify the organisms he worked on following the principles of cladistics; yet he wasn't sure how to apply the existing nomenclature codes to the groups, or clades, he came up with.

    As a result, de Queiroz says, applying the existing nomenclature codes could be cumbersome and confusing. As he and his colleague Jacques Gauthier, now a vertebrate paleontologist at Yale, were writing up their work, he recalls, “we stumbled on the idea of developing a naming system depicting phylogenetic relationships. At the time we didn't realize the full significance of it.”

    As de Queiroz and Gauthier worked out the conceptual underpinnings of such an approach over the next 8 years, they began to wonder whether Linnaean taxonomy had outlived its usefulness. Thus was born the PhyloCode, and from the start, it didn't quite jibe with the Linnaean approach. For example, one way a PhyloCoder might define a clade would be to choose the two most distantly related organisms in that group as the “specifiers” and say that the group consisted of all those with the same last common ancestor as the specifiers. Such groupings didn't always coincide with previous membership in ranks.
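The specifier-based clade definition sketched above is easy to state algorithmically. The following is a minimal, hypothetical Python sketch (the toy phylogeny and all node names are invented for illustration) that treats a named clade as everything descended from the last common ancestor of two specifiers.

```python
# Toy phylogeny, stored as child -> parent links of a single rooted tree.
# The names here are invented placeholders, not a real classification.
parent = {
    "Lamium": "subfamily_node", "Galeopsis": "subfamily_node",
    "subfamily_node": "family_node",
    "Teucrium": "family_node",
    "family_node": "root",
}

def ancestors(taxon):
    """Path from a taxon up to the root, inclusive."""
    path = [taxon]
    while path[-1] in parent:
        path.append(parent[path[-1]])
    return path

def last_common_ancestor(a, b):
    """Deepest node shared by the two root-ward paths."""
    seen = set(ancestors(a))
    for node in ancestors(b):
        if node in seen:
            return node
    return None

def clade(specifier_a, specifier_b):
    """All nodes descended from the specifiers' last common ancestor."""
    lca = last_common_ancestor(specifier_a, specifier_b)
    return {t for t in list(parent) + ["root"] if lca in ancestors(t)}

# Naming a clade by two close specifiers picks out a small group;
# choosing more distant specifiers widens it to their shared ancestor.
print(sorted(clade("Lamium", "Galeopsis")))
print(sorted(clade("Lamium", "Teucrium")))
```

The key property the text describes falls out directly: the clade's definition depends only on ancestry, so adding or removing other taxa elsewhere in the tree does not force a rename, though redrawing the tree itself can change which organisms the name covers.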

    The researchers described these ideas in several publications during the early 1990s, and then introduced them to the broader biological community at a symposium held during the 1995 meeting of the American Institute of Biological Sciences (AIBS). Interest was strong enough that de Queiroz and his converts organized a workshop at Harvard in August 1998.

    Among the 30 attendees was Ohio's Cantino, who 4 years earlier had “written off phylogenetic nomenclature as impractical,” he recalls. But for the AIBS symposium, he had been asked to evaluate how the old and new approaches would work with the mint plants that he studies. As a result, Cantino says, “I realized that phylogenetic nomenclature has great advantages.”

    Cantino has since become one of the PhyloCode's strongest advocates. He helped de Queiroz polish rules for the new system that were developed at the 1998 workshop. In May 2000 he posted them on the World Wide Web for comment. As comments trickle in, momentum is building to establish a society to guide PhyloCode's continued development, says Yale's Donoghue.

    Tough sell

    Whereas almost all 21st century systematists now take a phylogenetic approach toward classifying organisms, PhyloCode presents them with an alternative to the Linnaean approach for naming what they classify. One key difference is that because organisms would be grouped in clades under the new system, names would include no references to families, orders, classes, even genera in the traditional sense. And the definition of each name, be it for a species or some more inclusive clade, would be based on the shared ancestry of its members.

    PhyloCode advocates haven't settled what will happen to species names, but they insist that most Linnaean family, class, or order names will survive the transition and will usually cover the same array of organisms. Thus, there could be a clade called Asterales that included another, smaller clade called Asteraceae, and the traditional relationship of these two groups would be retained. “Critics have said you'd lose all the hierarchical information, but you wouldn't,” says Berkeley's Mishler.

    In addition, PhyloCoders say that once a name has been redefined in PhyloCode terms, it should be more stable than it has been under Linnaean rules. For example, in the PhyloCode system, the addition of the herb Ajuga to Teucrioideae would not have forced that name to be changed. Unlike in the Linnaean system, they say, the new definitions will allow for organisms to move in and out of clades without disturbing the clade's name or the names of the other organisms. In some ways, “PhyloCode is a more flexible naming system,” Missouri's Stevens asserts.


    Because of Linnaean rules, the names of groups containing Ajuga reptans (left), Teucrium fruticans, and Lamium amplexicaule (right) have changed through time.


    Both sides agree that the names of living organisms should be stable. “You don't want a system of nomenclature that is too mushy, where the names have no meaning,” says the Smithsonian's Frank Ferrari, a PhyloCode critic. But both sides vehemently disagree about which system provides the strongest guarantee that a name and its meaning will remain unchanged through the decades. In the Linnaean world, instability arises because names for the groups change as the group's members change. Yet in the PhyloCode world, say its critics, names may stabilize, but what they signify will change as new evolutionary studies cause members to shift from clade to clade—as is bound to happen.

    Evolutionary biologists across the globe are busy rearranging many branches of the tree of life, often by comparing genetic material from a wide range of species. Sometimes analyses of one gene will lead to a different branching pattern than analyses of a different gene. Organisms, perhaps even those specifying a clade, may shift in and out of the clade accordingly. Thus, PhyloCode “is not stable to changes in the phylogeny,” Cornell plant systematist Kevin Nixon contends. And he argues that Linnaean categories may be just fine, as they are being revised to better reflect phylogeny.

    Nixon and others also see value in retaining the Linnaean ranks, even if they lack biological meaning. “They are extremely important to our ability to communicate information about the biodiversity that we see and study,” he argues. When he teaches, he likes to be able to refer to families, so that on a field trip he can talk about whole groups of trees at once. Clunky as the current system may be, it works, he insists, because of what family, class, genus, and other names have come to represent.

    Despite being convinced that the Linnaean way is superior, Nixon is concerned about the headway PhyloCode is making. “[PhyloCode] is not going to die out, because the spinmeisters behind this have the ear of the large funding agencies,” he complains. Even the upcoming workshop on Linnaean taxonomy at the Smithsonian National Museum of Natural History “will very much play into the PhyloCode [camp's] hands,” Nixon predicts. At any rate, there's likely to be vociferous debate about the two systems at the meeting.

    But what rankles Nixon and his loyal Linnaean colleagues the most, they say, is that PhyloCoders appear to have seceded from the taxonomic community. Several governing bodies exist to help enforce and clarify Linnaean codes of nomenclature, but PhyloCoders seem to have bypassed both the codes and their congresses. “They are going to erect a shadow government and [set up] a coup,” Nixon complains. “This is arrogance.”

    In their defense, PhyloCode supporters say they have no choice but to go outside the existing system. “The differences between phylogenetic and rank-based nomenclature are just too fundamental for them to be combined,” Cantino argues. Furthermore, they say they need an organization that can help iron out the details of PhyloCode. One contentious issue: how to name species.

    Many systematists, such as Stevens, want the names to remain the same. “The only reason to junk [a name] would be because it causes widespread confusion,” he suggests. “You can add lots of higher order stuff by PhyloCode around the rudiments of the Linnaean system.”

    But in one popular proposal, just the “species” epithet would become the name. So Homo sapiens would get shortened to sapiens. Its drawback: Many organisms would need further qualification, as there are quite a few genera, for example, with a species named vulgaris, and searching archived literature for the “vulgaris” organism could yield many false citations. Another proposal calls for adding a number that might signify the place of a particular vulgaris on the tree of life, while a third calls for simply adding a hyphen to the existing genus-species designation, thereby linking them for all time. Although this yields a stable, unambiguous name, it could be misleading should phylogenetic studies later prove that one species didn't really share a common ancestor with another with the same genus name.

    The lack of agreement about what to call species gives many systematists pause, even those who are open to PhyloCode. PhyloCode is “not ready for prime time,” says Paul Berry, herbarium director at the University of Wisconsin, Madison. But the only way PhyloCode will make it to prime time will be if systematists take it seriously enough to test its potential. “One can never know for sure that one system is better than another until both have been tried for a while,” says Cantino, who is nonetheless pleased about the volume of activity at the PhyloCode Web site. “Most of the negative reactions have come from people who have not visited the Web site,” he notes.

    He anticipates that continued feedback will lead to refinements, and that over the next several years, researchers will start naming organisms using both approaches. In this way, the relative shortcomings and merits of each will become apparent.

  15. Searching for Medicine's Sweet Spot

    1. Joseph Alper*
    1. Joseph Alper, a science writer in Louisville, Colorado, is managing editor of the online biotechnology magazine dailytwist.

    Long avoided by chemists and biologists, sugar-based drugs are suddenly on medicine's menu and garnering impressive early reviews

    Just over a decade ago, a bug called Haemophilus influenzae type b (Hib) wrecked the lives of some 25,000 children a year in the United States alone. Rather than causing the flu, as its name suggests, Hib produced bacterial meningitis in 60% of infected children, 10% of whom died. Those who survived often suffered permanent damage, ranging from mild hearing loss to mental retardation. But thanks to a sugar-based vaccine, Hib has been virtually eliminated from the United States, most European countries, and a growing number of developing nations. The World Health Organization is now pushing the vaccine's use to prevent the estimated 400,000 annual Hib-related deaths worldwide.

    The Hib vaccine isn't the only sweet success for sugar-based therapies. The anticoagulant heparin—a complex sugar, or carbohydrate—has long been the number-one-selling drug in the world. Top-selling protein drugs, such as the red blood cell booster erythropoietin (EPO), bristle with sugars that ensure that these molecules stay in circulation long enough to carry out their task. And two new flu drugs approved for use in 1999 work by attacking a sugar-busting enzyme that influenza viruses use to help them exit infected cells.

    But these triumphs have been hard won. Six decades elapsed between the discovery of the sugar groups on Hib that induce an immune response and the development of an effective vaccine. Amgen, the maker of EPO, regularly throws out as much as 80% of its protein because the attached sugars are incorrect. Meanwhile, the 1990s saw a series of biotech companies go belly up after their carbohydrate drugs failed in clinical trials (see p. 2340). “The field has struggled, there's no denying it,” says David Zopf, vice president of Neose Technologies in Horsham, Pennsylvania.

    This struggle has long confined research on carbohydrates to a small niche in biology. The field didn't even have a proper name until 1988, when Oxford University biochemist Raymond Dwek coined the word glycobiology during an appearance on a British morning television show. “In a world focused on nucleic acids and proteins, there really wasn't that much interest in carbohydrates except from people studying energy metabolism and the few of us who were toiling away under the radar,” says Dwek, who is now director of Oxford University's Glycobiology Institute in the U.K. John Magnani, president of GlycoTech, a carbohydrate-focused biotech firm in Rockville, Maryland, says that even today, carbohydrate research is “the Rodney Dangerfield of pharmaceutical research. It gets no respect.”

    But recent advances in the study of carbohydrate chemistry and biology are beginning to turn the tide. Chemists have begun to crack a long-standing problem: producing carbohydrates in quantities large enough to study in biological systems. Like proteins and nucleic acids, carbohydrates are polymers of relatively simple molecules linked together. But unlike the amino acids in proteins and the nucleotides in nucleic acids that link up like boxcars in a train, sugar chains can branch and twist, giving them a multitude of three-dimensional shapes. That structural complexity makes carbohydrates difficult to analyze and extremely hard to make. But new automated methods of synthesizing large quantities of carbohydrates promise a dramatic change (Science, 2 February, p. 805); similar developments spurred the study of proteins and nucleic acids in the 1970s and 1980s.

    Two decades of genetic and biochemical studies by researchers who toiled largely in obscurity have also spelled out many critical roles played by carbohydrates and identified potential drug targets. Sugars are ubiquitous in cells. They dangle from nearly all the protein and many of the fat molecules in the body. These combination molecules, called glycoproteins and glycolipids, dot the outer surfaces of all cells and serve as cellular identification tags to the surrounding world. The body uses them to say something as general as “I'm a human tissue, I belong here,” or as specific as “I'm injured, send help from the immune system over here.” Cancer cells use sugar groups on their surfaces to slip past the immune cells looking to do them in as they migrate through the body. Pathogens rely on the glycoproteins and glycolipids on cell surfaces to home in on their tissue of choice in their favorite host species and to spread themselves from cell to cell. Harmful inflammatory reactions are often triggered by carbohydrates, as is blood clotting.

    “Carbohydrates are central to many processes that are at the core of important diseases, and now that we understand some of those roles, it's not surprising that this has become a hot topic at drug companies,” says Christian Raetz, a glycobiologist who helped Merck get into the field before he returned to academia as chair of the biochemistry department at Duke University Medical Center in Durham, North Carolina.

    Now, dozens of new carbo-related compounds are in clinical trials, aimed at treating conditions ranging from inflammation and tissue rejection to hepatitis and cancer. “The result is that glycobiology's future now looks pretty bright,” says Zopf. “Glycobiology has finally become part of the mainstream,” adds Hudson Freeze, a longtime glycobiology researcher and director of the glycobiology program at the Burnham Institute in La Jolla, California. “We're no longer a boutique science.”

    Interrupting inflammation

    One of the first critical roles played by carbohydrates began to come into focus in the late 1980s. Several groups independently cloned the genes for three human carbohydrate-binding proteins that play a role in attracting leukocytes, or white blood cells, to injured sites in the body. Called selectins, these proteins appear on the surface of the endothelial cells lining blood vessels after injured tissues nearby release powerful signaling compounds known as cytokines. The availability of the cloned genes led to a flood of publications in the early 1990s showing that selectins bind to a distinct carbohydrate structure, called sialyl Lewis x (sLex), on the surface of circulating leukocytes. This binding acts as a kind of brake that slows down leukocytes circulating in the bloodstream, causing them to roll along the injured blood vessel wall and allowing them to slip into the injured tissue.

    Hold it.

    Selectin proteins and their sugar-based targets slow the passage of white blood cells in circulation, allowing them to enter damaged tissue. Interrupting this binding may prevent inflammation.


    Although those leukocytes launch the healing process, they can also lead to inflammation, which itself can damage tissues. That prompted several groups to try to prevent inflammation by blocking the ability of selectins to bind to their sLex targets. Early experiments by Ajit Varki and his colleagues at the University of California, San Diego (UCSD), offered initial hope. Their work in mice showed that blocking a selectin subgroup called L selectin from binding to sLex reduced inflammatory responses in damaged blood vessels.

    Since then, the track record for getting selectin inhibitors to work in humans has been spotty at best. In 1995, researchers at Boulder, Colorado-based NeXstar Pharmaceuticals, now a part of Gilead Sciences in Foster City, California, developed potent selectin inhibitors in collaboration with Varki's group. But when the company's inhibitors failed to work in animal disease models, the project died on the vine. Cytel, formerly of San Diego, made it farther. Its compound, intended to prevent damage to tissues after blood starts flowing again after a heart attack, stroke, or tissue transplantation—a condition known as ischemia reperfusion injury—made it through human safety studies. But final-stage clinical data showed no benefit from the drug, and the company closed shop. “Historically, this has been a tough area to be in,” says Gray Shaw, who heads drug development for a selectin inhibitor at Wyeth, the pharmaceutical arm of American Home Products based in St. Davids, Pennsylvania.

    Wyeth is betting it can do better with a compound called PSGL-1 that binds to another selectin subtype called P selectin. P selectin is expressed not only on endothelial cells but also on platelets, causing these blood components to stick to leukocytes and create blood clots. “By inhibiting P selectin, we hope to not only prevent ischemia reperfusion injury but also the subsequent clotting events that can reocclude a vessel that's just been opened,” says Shaw. Wyeth is currently conducting phase II trials with a soluble recombinant form of PSGL-1, and Varki, for one, is optimistic. “I think that the Wyeth drug is the first really good selectin inhibitor that has been given a chance to prove itself,” he says.

    Inhibiting P selectin may also prove to be useful for stopping the spread of tumors. Metastasizing tumor cells grab onto P selectin on platelets and use them as a protective shield against immune system cells. In the 13 March issue of The Proceedings of the National Academy of Sciences, Varki and his UCSD colleagues reported that heparin treatment, which has been used with mixed success as part of chemotherapy, dramatically slows tumor metastasis by binding to P selectin on platelets before cancer cells can do the same. “This appears to markedly reduce the long-term organ colonization by tumor cells,” says Varki. The response is inconsistent, however, perhaps because the chemical makeup of heparin is variable, and the clotting problems that can result from heparin therapy make it unlikely that heparin will be widely used as a chemotherapy agent. Other P selectin inhibitors may fare better, however.

    Cancer vaccines

    Stopping tumor cells from binding selectins isn't the only way researchers hope to use carbohydrates to block cancer. Transformed tumor cells hide from normal immune surveillance by displaying glycoproteins and glycolipids on their surfaces. Some researchers are now trying to turn the tables by using the carbohydrate chains from these glycomolecules as vaccines. “The idea is to manipulate the antigens in such a way that they become visible to the immune system,” says Alan Houghton, chief of clinical immunology at Memorial Sloan-Kettering Cancer Center in New York City. “We and others have been able to do that, and the immune system will then make a concerted, and apparently successful, attack on tumors.”

    Several companies and universities are conducting clinical trials of carbohydrate-based anticancer vaccines. Houghton and his colleague Philip O. Livingston, for example, have been heading a team that is using synthetic carbohydrate antigens prepared by chemist Samuel Danishefsky and members of his laboratory at Sloan-Kettering and Columbia University.

    While Danishefsky's team worked out how to synthesize carbohydrate cancer antigens with names such as Globo-H, which is associated with breast cancer, and Fucosyl GM1, which is isolated from small cell lung cancer, Livingston was figuring out how to boost their otherwise lousy ability to trigger an immune response. The solution was to link these carbohydrates to keyhole limpet hemocyanin, a strongly immunogenic protein isolated from a marine mollusk, and then deliver the twosome with another immune booster. The Sloan-Kettering group is now conducting phase II and phase III trials in late-stage patients for whom more conventional therapies have failed. Patients receive immunizations once weekly for a month and then every 3 months for the duration of the trial. Results are expected within the next year.

    In the meantime, Danishefsky's group is pressing ahead on a next-generation vaccine: a chemically linked combination of several individual carbohydrates known as “polyvalent antigens.” “Work in the mouse suggests that these polyvalent antigens will do a better job yet,” says Danishefsky, who adds that making the molecules is the most complex synthesis project he has ever undertaken.

    Biomira, a biotech firm in Edmonton, Alberta, is also nearing the end of its own cancer vaccine trial. The company's Theratope vaccine uses an antigen known as STn, part of a larger antigen known as mucin-1 found on breast cancer cells. This month, the company completed enrollment in a double-blinded phase III breast cancer trial that will test the vaccine on more than 950 women with metastatic breast cancer. “We'll get our first look at the data in about 6 months and take another look at these patients about 18 months from now,” explains Mairead Kehoe, the company's director of clinical trials. While the waiting game continues, the mood among cancer researchers remains upbeat. “Our expectation is that carbohydrate-based vaccines will stop the metastatic spread of cancer and allow the body to control this disease,” predicts Danishefsky.

    Viral deconstruction

    Researchers also have big hopes for carbohydrate drugs to stop another kind of invader: viruses. It turns out that even relatively minor interference with the sugars on proteins that make up the viral coats of two major scourges, the hepatitis B and C viruses, can produce big results.

    Later this year, United Therapeutics of Silver Spring, Maryland, is expected to begin clinical trials on two antihepatitis drugs discovered as part of a collaboration between Oxford's Dwek and Timothy Block, a viral hepatitis specialist and director of Thomas Jefferson Medical College's Jefferson Center for Biomedical Research in Doylestown, Pennsylvania. Both drugs are variants of natural sugars, and they work by gumming up two glycoprotein-processing enzymes in the endoplasmic reticulum (ER), the site where cells add carbohydrates to newly synthesized proteins.

    When the hepatitis B virus invades liver cells, it depends on the ER's machinery to reproduce. But Dwek and Block found that when they added the compound N-nonyl deoxynojirimycin (NN-DNJ) to human liver cells, glycoprocessing was disrupted to a small but crucial extent. The result: The virus couldn't construct its M envelope protein, a critical coat component. These test tube studies show that “inhibition of as little as 6% of cellular glycoprocessing results in a greater than 99% reduction in the secretion of hepatitis B virus,” says Dwek. “There seems to be no effect on the host [cells], but it's a lethal change for the virus.”

    Dwek and Block believe these drugs cause viral replication to go awry by preventing the envelope proteins from folding into the correct three-dimensional shape. “We know that there are proteins called chaperonins in the ER that grab onto a new protein's sugars and help the protein fold correctly,” says Dwek. The two scientists suggest that only a small number of misfolded M proteins disrupt the symmetry characteristic of the hepatitis B virus coat and prevent it from escaping the endoplasmic reticulum.

    Studies in woodchucks, the preferred animal model for testing potential hepatitis B drugs, confirmed that NN-DNJ stops viral replication cold, with no detrimental effects on the animal's health. “Better yet, we've been unable to find any mutant viruses that can escape this effect,” says Block. “That's one big advantage of targeting a host enzyme and doing so at such a low level.”

    That could come as welcome news to hepatitis B patients, who commonly take a drug called lamivudine. The drug has serious side effects, and in 20% of patients the virus develops at least partial resistance within a year, a figure that rises to 53% after 3 years.

    Using the same partial interference approach, Dwek and Block recently developed a second sugar compound, N-nonyl deoxygalactojirimycin, that has the same effect on hepatitis C virus in animal tests.

    Although this approach has yet to prove itself in humans, a third sugar compound from Dwek's group is well on its way to demonstrating that a subtle adjustment in glycolipid synthesis can benefit at least some patients with Gaucher's disease, one of a class of inherited disorders known as glycolipid storage diseases. Gaucher's disease results from one of many genetic mutations that can either slow or prevent the breakdown of certain glycolipids, which accumulate in storage vesicles and eventually kill cells. Since 1994, Genzyme Therapeutics, headquartered in Cambridge, Massachusetts, has been selling a recombinant form of the enzyme at fault, glucocerebrosidase, under the trade name Cerezyme. The therapy is highly effective, but it requires a 2-hour intravenous infusion as often as three times a week and costs approximately $200,000 a year.

    No escape.

    Newly approved carbohydrate drugs block the ability of flu viruses to exit infected cells by binding to neuraminidase, a viral protein required for the job.


    Dwek, along with Oxford colleagues Terry Butters and Frances Platt, reasoned that they might be able to restore the body's glycolipid balance—at least in patients whose mutations don't completely destroy the affected enzyme—by decreasing the synthesis of the glycolipids, which also occurs in the ER. This time, they chose a sugar variant called NB-DNJ as their weapon of choice. The results have been promising. “We don't shut down glycolipid synthesis completely, just enough to restore the proper balance between synthesis and degradation,” says Dwek.

    Oxford Glycosciences, an Oxford, U.K.-based biotech firm specializing in carbohydrates, has been conducting clinical trials with NB-DNJ, known more prosaically as Vevesca, both alone and in combination with Cerezyme. Last April, the company published initial clinical data in The Lancet demonstrating the drug's safety and hinting at its effectiveness. Last month, the company announced its preliminary analysis of a 6-month phase III study indicating that the compound—which is taken orally—was as effective as Cerezyme at maintaining healthy glycolipid levels. With these promising results in hand, the company is now pressing forward with clinical trials on related compounds for other lipid storage disorders, including Fabry's disease.

    This new surge of interest in carbohydrate-based therapies is removing some of the sour taste of the earlier disappointments in the field. Now, glycobiologists are more confident that a spoonful of sugar will not only make the medicine go down, but replace it with something that works better.

  16. Saving Lives With Sugar

    1. Joseph Alper

    The study of slime molds and the development of a lifesaving treatment for a rare genetic disorder may seem worlds apart. But a chance observation recently brought them together in dramatic fashion. Six years ago, Hudson Freeze noticed that cells from one of his molds (more properly known by the name of the organism, Dictyostelium) showed biochemical similarities to cells taken from a child with a disorder in the way cells manipulate sugars. The condition, known as congenital disorders of glycosylation type Ib (CDG1b), causes chronic gastrointestinal problems, including vomiting, diarrhea, bleeding, and blood clot formation. And though not always fatal, it's related to other inherited sugar-processing defects that can cause severe neurological problems and even death.

    Children with CDG1b lack an enzyme called phosphomannose isomerase that converts the sugar fructose-6-phosphate to mannose-6-phosphate. The mannose compound is a critical intermediate needed to synthesize N-linked glycosylated proteins, which are involved in myriad biochemical functions. Freeze, who heads the glycobiology program at the Burnham Institute in La Jolla, California, was studying a strain of Dictyostelium engineered to produce no phosphomannose isomerase. He discovered that adding mannose to the mutants' culture medium corrected this deficit by allowing Dictyostelium to use an alternative route for making mannose-6-phosphate. On a hunch, Freeze added mannose to the CDG1b cells and got the same results. Hoping to use mannose to treat CDG1b, Freeze asked the U.S. Food and Drug Administration (FDA) for permission to test the safety of mannose therapy on healthy volunteers. In 1995, the FDA agreed.

    Shortly after Freeze and his Burnham Institute colleagues began their initial safety tests with mannose, he got a call from a physician in Germany who had read a paper on the cell work. One of the doctor's patients, a young boy, was about to die from CDG1b—the child was bleeding to death and had already received 20 liters of blood. “I told him how much mannose to give the child and how often,” says Freeze. “Six months later, the physician called back and told me that the boy was completely fine.”

    Freeze and chief collaborator Thorsten Marquardt of the Pediatric Clinic in Münster, Germany, published their findings in the April 1998 issue of the Journal of Clinical Investigation. Since then, Freeze has gotten two or three calls a week from physicians wanting to know more about mannose treatment for CDG. “Unfortunately, mannose only works for CDG1b,” he explains with obvious regret, but then he adds that he and Marquardt have found that the sugar fucose works as a treatment for another glycosylation disorder that interferes with the body's ability to fight infections by altering the capacity of immune cells called leukocytes to stick to their targets. Both disorders are easily detected by a simple blood test.

    “The real importance of what Hud and Thorsten have done is not just that we can treat these syndromes, but that we have a new perspective on multisystem diseases that we had no way of understanding before,” says William Balistreri, editor of the Journal of Pediatrics and head of pediatric gastroenterology, hepatology, and nutrition at the Children's Hospital Medical Center in Cincinnati. “The fact that a simple biochemical defect causes such a wide range of symptoms involving multiple organ systems has been a revelation.”

    Today, Freeze continues to study the biochemistry of CDG and other aspects of carbohydrate assembly. But his biggest passion is getting the word out about potential treatments for these disorders. “We're not talking about a lot of kids, but for the few hundred born every year, these therapies can change their lives,” he says—and even save them.

  17. Sugar Separates Humans From Apes

    1. Joseph Alper

    Humans and chimps differ at the genomic level by 1% to 2%. Yet so far, the only identified gene that differs between humans and chimpanzees codes for an enzyme that makes a particular form of a sugar called sialic acid: Chimps, and all other mammals for that matter, have the gene, while humans do not.

    To Ajit Varki and his wife and colleague Nissi Varki—both professors at the University of California, San Diego (UCSD)—this fact may provide a clue to how evolutionary pressure and molecular biology interact to produce changes that have multiple consequences. “Since many pathogens bind to sialic acids on cell surfaces, changing those sialic acids is one way for an organism to evade a particular kind of pathogen,” says Ajit Varki. “Such a change could give a big selective advantage to an individual with such a mutation.” But the Varkis, together with UCSD colleague Elaine Muchmore, are now exploring an even more intriguing possibility: Perhaps the change may also make the brain work better.

    “Ajit and Nissi are asking a very important question, which is what are the consequences of a single gene change on the physiology, and therefore the evolution, of humans,” says Bernard Wood, the Henry R. Luce Professor of Human Origins at George Washington University in Washington, D.C. “In terms of trying to understand how humans fit into the rest of the world, this work has a very important place. It's one of those bits of biology that will become a citation classic.”

    The gene in question codes for the enzyme CMP-sialic acid hydroxylase, which adds an oxygen atom to a sialic acid variant known as N-acetylneuraminic acid, creating N-glycolylneuraminic acid (Neu5Gc). All mammals except humans have this enzyme, so all mammals except humans have both forms of sialic acid in their cells. It turns out, however, that the distribution of this enzyme is always skewed: plentiful throughout the body, but present in only small amounts in the brain and central nervous system. “For unknown reasons, the expression of this sugar is selectively down-regulated in the brains of all mammals, while being widely expressed everywhere else in the body,” says Ajit Varki. “We wonder if its total elimination in the human brain might then have prompted a further improvement in the brain.”

    Using knockout and transgenic models, the Varkis hope to look at the effect of eliminating or overexpressing this enzyme in the brains of other species. While this probably won't produce animal Einsteins, it may provide answers to the question of how one mutation presumably driven by environmental pressures, such as infection, might lead to other consequences that drive evolution. For example, these studies have already shown that humans' unique sialic acid profile has marked effects on the ability of immune system cells known as macrophages to home in on their targets. The effect can be positive or negative, depending on the target.

    As it turns out, however, humans do have trace amounts of Neu5Gc in their tissues. This observation baffled the Varkis, because not only is the Neu5Gc-producing enzyme totally missing from humans, but there is no obvious alternative pathway for making the compound. Their hypothesis: “This is most likely coming from the diet, from humans eating meat, since plants, lower invertebrates, and bacteria don't make this sialic acid,” says Ajit Varki. The Varkis are now studying whether these trace amounts have any biological significance.

  18. After the Fall

    1. Robert F. Service

    In the early 1990s, biotechnology companies experienced a brief sugar high. Two start-ups—Cytel and Glycomed—pushed new sugar-based drugs into late-stage clinical trials. Cytel's compound was intended to block inflammation by inhibiting the binding of proteins called selectins to white blood cells, thereby disrupting the ability of white blood cells to find damaged tissue. Glycomed tried to modify a natural complex sugar compound called heparin to improve its performance in slowing the proliferation of smooth muscle cells in patients with damaged blood vessels. If the drugs worked, they promised to usher in a new era of compounds targeting sugars and sugar-binding proteins, which together play key roles in infections, cancer metastasis, and immune system diseases. That era hasn't yet arrived. Both drugs faltered, as did the two companies, whose assets and technologies were eventually sold off to competitors.

    “Those two companies looked pretty sexy, and both failed,” says David Cox, CEO of Synsorb Biotech, a carbohydrate-based drug discovery company in Calgary, Alberta. That soured many investors and major pharmaceutical companies on the idea that sugars would make good drugs, and the industry is still struggling to emerge from the shadow. “Sugars are now a show-me story, rather than a promise-me story,” says Cox.

    But carbohydrate specialists say biotech companies and a few venturesome pharmaceutical giants are regaining a bit of a sweet tooth. According to a review of the field last year by Ole Hindsgaul, a carbohydrate chemist at the University of Alberta in Edmonton, and Minoru Fukuda of the Burnham Institute in La Jolla, California, more than 30 carbohydrate-based drugs have recently been approved by the U.S. Food and Drug Administration or are in clinical trials (see table). That pales in comparison to the number of potential new drugs from traditional sources: small organic molecules and proteins. Still, “the field is really gaining momentum,” says James Paulson, a carbohydrate expert at the Scripps Research Institute in La Jolla.

    Until recently, that momentum has been held in check by a handful of challenges. For one, sugar-based drugs tend to take a weak hold on their targets. This means they must often be given in high doses to produce an effect. The molecules also tend to be broken down or cleared quickly from the body. Finally, their branching structures are difficult to make, raising the cost of manufacture.

    To get around these problems, companies are adopting several strategies. The first is to overcome sugars' weak binding with sheer numbers. Synsorb Biotech, for example, is in the final stage of clinical trials with a sugar-based compound against a diarrhea-causing bacterial toxin. Because the compound binds to the toxin in the gut, it acts before the body has a chance to clear it. The drug, called Synsorb Cd, resembles a child's Koosh ball, with a solid core and thousands of sugar strands dangling off it. The toxin molecules it seeks each contain some 15 sites that normally bind to sugars on cells lining the gut. So when they encounter a sugar-coated Koosh ball, all 15 quickly get tied up. “The toxin can't get away” from the Koosh ball and is flushed from the body, says Cox.
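    The multivalent capture Cox describes can be illustrated with a back-of-the-envelope model (our illustration, not the company's data): if each of a toxin's roughly 15 sugar-binding sites is, hypothetically, occupied independently with some per-site probability, the chance that the toxin escapes with every site free collapses rapidly.

```python
# Toy independent-sites model of multivalent capture (illustrative only;
# the 15-site count comes from the article, the independence assumption is ours).
def prob_fully_free(p_site: float, n_sites: int = 15) -> float:
    """Probability that all n_sites sugar-binding sites stay unbound,
    if each site is occupied independently with probability p_site."""
    return (1.0 - p_site) ** n_sites

# Even a modest 50% per-site occupancy leaves a toxin fully free
# only about 3 times in 100,000 encounters.
print(prob_fully_free(0.5))
```

    This is the usual rationale for multivalent “Koosh ball” designs: many individually weak contacts combine into effectively irreversible capture.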

    Other companies are trying to develop compounds with the attributes but not the downsides of sugar molecules. Researchers at Rockville, Maryland-based GlycoTech, for example, are working to imitate the structure of carbohydrates in a small organic molecule of the type most often used for making drugs. The hope, says GlycoTech president John Magnani, is that through a combination of careful design and trial and error, researchers will hit upon a compound that binds to a specific carbohydrate docking site with a tenacious grip.

    Taking a more fundamental approach, companies such as GLYCODesign of Toronto, Ontario, and Abaron Sciences of La Jolla are attacking the enzymes that build the carbohydrates in the first place. “Rather than try and physically interfere with carbohydrate-binding proteins with carbohydrate drugs, we try to block the synthesis of the structures they recognize,” says Paulson, who is also an Abaron co-founder. Abaron, Paulson adds, is developing inhibitors against six glycosyl transferase enzymes that make carbohydrates recognized by white blood cells. The hope is that by temporarily shutting down the production of these carbohydrates, the drugs will disrupt the cycle of inflammation in chronic conditions such as rheumatoid arthritis and Crohn's disease.

    New ideas are advancing by another route as well: Some traditional carbohydrate companies are offering their expertise to pharmaceutical firms. For example, Neose Technologies of Horsham, Pennsylvania—which makes sugars for bulking agents and food additives—uses naturally occurring sugar-linking enzymes to join the proper sugar groups to recombinant protein-based therapeutics. Those sugars typically play one of two key roles, either stabilizing the protein's shape so that it can bind to its target, or modifying the target binding site, says Neose's president Sherrill Neff. Drugs such as Amgen's top-selling protein drug erythropoietin would be ineffective without the proper sugars.

    There is no guarantee that any of these strategies will pay dividends. Even the carbo supplier Neose, a company not dependent on the clinical success of its own drugs, is currently struggling to become profitable. But Cox argues that just one or two successes could inject new hope into the field. Says Cox: “There's nothing like success to silence your skeptics.”

    [Table: carbohydrate-based drugs approved or in clinical trials, from Hindsgaul and Fukuda's review.]

  19. The Best of Both Worlds?

    1. Robert F. Service

    When drug companies search for novel pharmaceuticals, they pay close attention to how well candidate molecules bind to their targets. The tighter the binding, the smaller the amount needed to do the job and, consequently, the cheaper the drug is likely to be. Unfortunately, sugar-based compounds often don't follow this tidy rule of thumb: They typically bind to their targets with numerous weak handholds. They are also poor at crossing cell membranes and are expensive to make. As a result, many drug companies don't bother with them and look instead for more traditional small organic molecules to do the same job. But Ole Hindsgaul thinks sugars have something worth hanging on to.

    Hindsgaul, a carbohydrate chemist at the University of Alberta in Edmonton, has recently begun making what he calls “carbohybrids”: molecules that link a sugar portion to other organic groups. The hybrids are simple to make and typically bind more tightly to protein drug targets than sugars alone. A wide variety of proteins in the body bind oligosaccharides, compounds made up of multiple sugars. And although many bonds are usually involved, one key sugar group typically embeds itself deeper in the protein than the rest, holding the oligosaccharide in place. “The proteins we're trying to inhibit evolved to bind sugars,” says Hindsgaul. “If we give them one of the sugars that fits in the active site, it anchors the compound. As a result, you're more likely to hit only carbohydrate-binding sites.”

    The strategy is showing early promise. Two years ago, Hindsgaul and his Alberta colleagues devised a high-speed “combinatorial” technique for making hundreds of related carbohybrids. The hybrids were designed to bind to plant proteins that themselves bind oligosaccharides containing the sugar unit galactose. Each hybrid contained a galactose group linked to a different combination of other organic groups, giving it a slightly different shape. Hindsgaul's group tested their compounds against four separate galactose-binding proteins. “Out of the blue, one of these [hybrids] lit up,” he says, showing 16-fold better binding ability than a conventional multisugar compound. What's more, the new compound contained groups that make it more lipid friendly, enabling it to cross cell membranes more easily.

    The compound's binding ability “is still considered lousy by pharmaceutical standards,” says Hindsgaul. “But it's good for sugars.” Still, the approach “makes sense,” says David Cox, president of Synsorb Biotech, a carbohydrate biotech company in Calgary, Alberta. “By manipulating the organic [groups] to the sugar, you can refine the binding that a sugar has to a protein,” he says. If Hindsgaul's group can improve the hybrids' binding still further, says Cox, the molecules could find applications ranging from preventing cancer metastases and tissue rejection to combating viral infections.

    Hindsgaul's group is testing further alterations of the hybrids. At a meeting last year, members reported finding a compound that inhibits an enzyme associated with cancer metastasis at a 0.3 micromolar concentration. That's still a lot, about 10-fold higher than the sweet spot for most pharmaceuticals. But the continued improvements mean that carbohybrids are starting to look enticing.
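    The arithmetic behind that comparison (ours, for illustration) is simple: convert the reported 0.3 micromolar potency to nanomolar and set it against the roughly 30 nanomolar level implied by the article's 10-fold figure.

```python
# Potency comparison sketch; the ~30 nM "sweet spot" is inferred from the
# article's statement that 0.3 micromolar is about 10-fold higher.
carbohybrid_potency_nM = 0.3 * 1000   # 0.3 micromolar -> 300 nM
typical_drug_potency_nM = 30.0        # assumed pharmaceutical benchmark

print(carbohybrid_potency_nM / typical_drug_potency_nM)  # 10.0
```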

  20. Bent Out of Shape

    1. Michael Balter

    LONDON—When it was first proposed in the early 1980s, the notion that aberrant proteins called prions can replicate without DNA or RNA and cause infectious diseases was biological heresy of the first order. While still controversial, this hypothesis eventually won enough adherents that it earned its leading advocate, University of California, San Francisco, neurologist Stanley Prusiner, the 1997 Nobel Prize in physiology or medicine. Now, John Collinge, a neurologist at St. Mary's Hospital in London, is pushing the controversy a step further. Collinge contends that the differences between prion types, or “strains” as they are often called, are in large part determined by how many carbohydrate molecules bind to the protein. This claim has helped make Collinge one of the prion world's highest-profile scientists.

    Collinge was introduced to prion diseases in the late 1980s when, soon after medical school, he began working in psychiatrist Tim Crow's lab at the Clinical Research Centre in Harrow, U.K. Shortly before, Crow's group, together with Anita Harding's team in London, had identified a gene mutation linked to an inherited form of Creutzfeldt-Jakob disease (CJD), an invariably fatal neurodegenerative disorder. Other workers had shown that this gene codes for a protein called PrP, of which prions are misfolded versions, and Collinge set to work characterizing this and other PrP mutations responsible for CJD. According to Collinge, the fact that the prions created by these mutations could then infect experimental animals was a “very powerful argument” that proteins alone could cause a transmissible disease.

    In 1990, Collinge moved to St. Mary's, where he continued working on prion protein genetics and structure. Other researchers—including Richard Marsh of the University of Wisconsin, Madison, who worked on prion diseases in minks—had found a correlation between prion strains and protein structures. Collinge's group, working with the prions that cause human disease, found that strains also correlated with the pattern of sugar groups bound to the protein. For example, prions isolated from the brains of patients with “classical” forms of CJD were found to have a very different sugar pattern from those isolated from patients with a variant form of CJD that struck some patients much earlier in life. But the sugar pattern of variant CJD prions was identical to that of the prions that cause bovine spongiform encephalopathy—a finding key to establishing that variant CJD was in fact the human form of “mad cow disease.”

    Collinge believes that sugars known as glycosyl groups mark prion strains in part because they stabilize the proteins' aberrant shapes. “Certain [protein] conformations will associate with different glycosyl patterns,” he says. And in early 1999, Collinge and his colleagues reported developing a diagnostic test that detects variant CJD in tonsil biopsies on the basis of their glycosylation patterns (Science, 22 January 1999, p. 469).

    Given the continuing controversy over whether proteins alone can form infectious particles, many of Collinge's colleagues are intrigued but cautious about the role of sugars in prion diseases. Collinge's glycosylation work “is a very significant breakthrough,” says neuropathologist Adriano Aguzzi of the University of Zurich. “But we do not know if the glycotype distribution represents the essence of a strain or a surrogate for it.” Like the prion theory itself, Collinge's glycosylation theory is heretical enough to be treated with caution but intriguing enough to stimulate a lot of interest.