News this Week

Science  09 Jul 1999:
Vol. 285, Issue 5425, pp. 174
  1. WORLD CONFERENCE ON SCIENCE

    Science Blueprint Is High on Ideals, Light on Details

    1. Robert Koenig*
    1. With additional reporting by Daniel Clery.

    BUDAPEST—After 6 days of sound and fury, the 1800 scientists, activists, and policy-makers who took part in last week's World Conference on Science departed with two documents that offer a sketchy roadmap for science in the 21st century. Despite the sometimes emotional debates—during which there were calls for scientists to take a form of Hippocratic Oath, for more research aid for developing countries, for women to play a greater role in science, and for universal access to the Internet for scientists—the conference's final documents managed to endorse the general concepts of ethics, equity, and access without offering up many specifics on how science should be conducted, funded, disseminated, or policed.

    The architects of the meeting, the first such gathering in 20 years, defended the final products—a nonbinding “Declaration on Science” and “Framework for Action”—as constructive blueprints that offer each nation a chance to build up its science. Swiss Nobel laureate Werner Arber, president of the International Council for Science (ICSU), a cosponsor of the conference, said the declaration defines a “new social contract” between science and society. Spanish biochemist Federico Mayor, director-general of the United Nations Educational, Scientific, and Cultural Organization (UNESCO), the other sponsor, told Science: “Scientists must have more contact with society, and more connections to power.” Mayor's take-home message for delegates was: “Now that you have this in your hands, tell your governments that this is an opportunity for you.”

    The Budapest blueprints call for governments across the globe to support science adequately because of its importance for sustainable development. The documents also demand more support for science education at universities, and for science to be made more open to women and other groups such as the disabled, ethnic minorities, and indigenous peoples. International collaboration, the blueprints say, should be supported at all levels. Governments and professional bodies should promote ethical conduct or draw up codes of ethics, while scientists should themselves adhere to the highest ethical standards. And, in an initiative that emerged during the conference, the main declaration suggests that developing countries could take advantage of the debt relief offered recently by the G8 industrial nations by increasing their spending on science and education.

    Reaching even this modest consensus was far from easy in Budapest. The heated debates began straight after the initial session, when some women delegates pointed out that all speakers on the podium were male, and the arguments spilled over into a stormy session on “gender mainstreaming” in science. “Not one female voice came from the podium that first day,” complained Canadian bioethics specialist Margaret Somerville of McGill University in Montreal. Shirley Malcom of the American Association for the Advancement of Science (the publisher of Science) added: “That galvanized women here, because it made it look like UNESCO is talking the talk, but not walking the walk.” UNESCO's Mayor told Science that he was “personally concerned” by the lack of representation on the first panel, but that the conference as a whole had done much to bring gender issues to the fore.

    Many of the two dozen thematic sessions, which took place in venues across Budapest, were similarly heated. Conference rooms rang with a cacophony of voices, from Australia to Zambia, representing more than 150 countries and scientific interests as diverse as particle physics and small-farm agriculture. U.S. Nobelist Leon Lederman of the Fermilab accelerator center in Illinois expressed the frustration of numerous scientists in having to distill the debate and recommendations of 3-hour sessions into a few paragraphs for the committee revising the final documents. “It's like trying to illustrate War and Peace in a single-frame cartoon.”

    As science officials from country after country expounded their national positions on the conference's main stage, there were complaints from Iraq and Cuba about how U.S. sanctions had hurt their science. And, behind the scenes, some delegates from industrialized nations grumbled that industry—which accounted for 61% of R&D expenditures in the North in 1996—was barely represented in Budapest. “The next time we do this, we should make sure that leading research industries have a voice,” said Japanese parliamentarian Wakako Hironaka, a former environment minister. Mayor responded that UNESCO had invited many industry officials who did not attend, and that it would involve industry in follow-up meetings.

    Meanwhile, some Third World scientists complained that—at a time of U.S. budget surpluses—the world's strongest economy was unwilling to commit itself to greater aid to science in the developing world. Some even proposed that wealthy countries earmark 0.5% or more of their national income for research aid to poorer nations. But officials from the United States—which is not a UNESCO member—steered clear of specific commitments and emphasized cooperation rather than aid. E. Michael Southwick, a deputy assistant Secretary of State, told delegates that the U.S. government has been involved in “the education and training in the United States of thousands of international scientists, joint laboratory and field projects with many countries to share knowledge and technology, and cooperative research to tackle problems of global dimensions.”

    In the absence of firm aid commitments, a group of African delegations came up with the self-help plan to devote a portion of the Third World debt relief from the G8 to science and education. “We hope this will allow African governments to devote more resources to research and education,” said Ahmadou Lamine Ndiaye, rector of Senegal's Gaston Berger University.

    Ethics was another hotly debated issue, in large part due to the proposal by Joseph Rotblat—a physicist who won the 1995 Nobel Peace Prize for his work as founder of the Pugwash Conference, which pressed for nuclear disarmament—that all young scientists be required to take an oath, similar to the Hippocratic Oath taken by physicians. He suggested it include the line: “I will not use my education for any purpose intended to harm human beings or the environment.” While UNESCO's Mayor told Science that he favored the concept, many scientists at the conference—including U.S. Nobelist Paul Berg and German biologist Hubert Markl, president of the Max Planck Society—said such an oath would have only symbolic value. “The Nazi doctors who committed atrocities at the concentration camps had taken the Hippocratic Oath,” Markl told Science. In the end, the conference documents did not specifically back Rotblat's proposal, but urged young scientists to “adhere to the basic ethical principles and responsibilities of science.”

    One issue that all delegates seemed to agree on was the need for a rapprochement between science and society. Indian plant geneticist M. S. Swaminathan told the conference that, despite its great advances during this century, science has failed to address many human needs in the developing world. “The formidable power of science and technology can benefit mankind only if we know how to temper it with humanism.” He called for “information empowerment” by connecting the Third World to the Internet and giving the poor greater access to scientific advances.

    As the bleary-eyed delegates headed home from Budapest, some recalled that the noble goals of the last such conference—held in Vienna in 1979—had not been fulfilled (Science, 11 June 1999, p. 1760). And they vowed this time around to make sure the Budapest framework goes somewhere. “The follow-up is the most important aspect of the conference,” said Mayor. He said UNESCO would establish a network to meet regularly to evaluate the conference follow-up and recommend ways to implement the resolutions. “We want the Budapest conference to be more effective in the long run” than was the Vienna meeting, said Indian scientist M. G. K. Menon, a former ICSU president. Added Mohamed Hassan, a Sudanese mathematician who directs the Third World Academy of Sciences and is president of the African Academy of Sciences: “We want to send a message to the world's countries: support science, because it is in your interest.”

  2. NEUROBIOLOGY

    An Immunization Against Alzheimer's?

    1. Marcia Barinaga

    Immunization, once largely limited to fighting infectious diseases, is finding surprising new targets. Researchers have recently learned that some cancers can be eliminated by cranking up the immune system with vaccines, and now, new findings raise a startling possibility: someday immunizing people to prevent or even reverse the mental devastation of Alzheimer's disease.

    One hallmark of Alzheimer's is amyloid plaque, a protein deposit that builds up in the brains of those with the disease. A team at Elan Pharmaceuticals in South San Francisco reports this week in Nature that, in mice genetically engineered to develop an Alzheimer's-like condition, immunization with β-amyloid (Aβ), the protein fragment that forms the plaque, reversed or prevented plaque formation and neural damage.

    The finding “raises the possibility that immunization with Aβ may eventually be used as a treatment, or prophylactically, for Alzheimer's disease,” says Alzheimer's researcher Peter St. George-Hyslop of the University of Toronto. “If so, this would be an absolutely tremendous result.” Alzheimer's researcher Sam Sisodia of the University of Chicago agrees, but adds: “One has to exert caution [in thinking] about using this strategy for therapeutics. Things could work differently in humans.” One big question mark, notes St. George-Hyslop, is that even if immunization prevented plaque formation in humans, no one is certain yet that plaque actually causes Alzheimer's symptoms.

    Even so, plaque made up of Aβ42, an abnormal-length fragment of a normal cellular protein, has been a central focus of Alzheimer's research. It is an early and consistent feature of the disease, and while it hasn't been proven to cause the symptoms, many researchers think multiple lines of evidence strongly suggest that it does.

    Several labs have bred transgenic mice that produce Aβ and develop plaques and neuron damage in their brains. Although they don't develop the widespread neuron death and severe dementia seen in the human disease, they are used as models for its study. Dale Schenk, vice president of neurobiology at Elan, wondered whether immunization with Aβ might produce antibodies that would prevent plaque formation in the mice. The antibodies would have to cross the blood-brain barrier, but “we knew [from prior work] that everything from the blood gets into the brain at some level,” says Schenk, so he thought it was worth a try.

    Schenk's team injected the mutant mice with Aβ at a young age, before plaque formation had begun, and found that those mice never developed plaque or neuron damage. When they immunized older mice that already had plaque in their brains, the plaque—and the signs of disease—largely went away. In the brains of these mice the team found evidence of an immune response: bits of remaining amyloid that were dotted with antibodies, and microglia, the scavenger immune cells of the brain, chock-full of amyloid protein they had cleared away.

    The presence of antibodies on the remaining plaque means that the antibodies successfully crossed the blood-brain barrier, says neurologist Lawrence Steinman, who studies immune-brain interactions and amyloid at Stanford University Medical Center. Once there, he says, it's easy to see how they could block amyloid molecules from sticking together in plaques. “If the amyloid protein is bound to an antibody, there is no way it can form these aggregations,” he says. What's more, Sisodia notes that recent studies in mice showed that when amyloid deposition is halted by killing neurons that secrete Aβ, existing deposits diminish over time. “The idea that you can … get rid of [amyloid] is not inconceivable,” he says. Researchers agree they'd like to see the immunization results repeated. They may not have long to wait, as at least one other group is rumored to have similar results.

    But will the approach work in humans? Mice aren't a perfect mirror of human physiology, Steinman notes. In particular, he worries whether in humans “there is enough of a breach of the blood-brain barrier to allow this to happen.” And St. George-Hyslop cautions that the protein precursor to Aβ is found in many cell types, so immunization might induce a harmful autoimmune response in nonbrain tissues.

    Allaying concerns about autoimmune reactions may require further animal testing. But by the end of the year, Elan hopes to start clinical trials of the therapy on Alzheimer's patients. Those trials could yield a verdict not only on this therapeutic approach but also on the importance of plaque in Alzheimer's disease. “The bottom line of this all,” says St. George-Hyslop, is that “we will know quite clearly what the true role of extracellular Aβ is in Alzheimer's disease. We will either get a brilliant treatment, or we will get some powerful insights that modify how we think about the disease.”

  3. SCIENCE POLICY

    NRC Pulled Into Radiation Risk Brawl

    1. Jocelyn Kaiser

    A festering feud over possible health risks of low radiation levels has blistered into public view. But instead of assailing each other, two bitter foes are unloading on the National Research Council (NRC) for assembling what they claim is a biased panel to weigh radiation risks. In response, the NRC last month canceled the panel's first meeting and agreed to review its composition. “We're just taking a breather,” says radiation biologist Evan Douple, director of the NRC Board on Radiation Effects Research.

    The nasty decades-long dispute centers on the risk posed by ionizing radiation from sources such as medical isotopes and spent nuclear fuel. A range of federal agencies have set exposure standards for the general public and for workers—standards based on accepted risk levels that the government asks the NRC to review every few years. Billions of dollars are at stake: Stricter standards could increase the amount that agencies and industries must spend to clean up radioactive waste and protect workers.

    Arriving at safe levels of radiation exposure is hard because little data exist on how low doses—less than 10 Roentgen equivalent man (rem) a year—affect health. (Annual U.S. exposure from all sources is 360 millirem). For years researchers have derived estimates mainly from cancer rates among 50,000 Japanese atom bomb survivors who received acute doses of more than 500 millirem. Current exposure regulations are based on the Linear No-Threshold (LNT) model, which uses a straight line to extrapolate the Japanese data to zero: It assumes no safe cutoff, and that doubling the dose doubles the risk.
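
    In symbols, the LNT model is a simple proportionality. As a minimal gloss on the description above (the slope parameter α is our notation, not the article's):

        \[ \text{ExcessRisk}(D) \;=\; \alpha D, \qquad \alpha > 0, \]

    so every dose D > 0 carries some predicted excess risk, and halving or doubling the dose halves or doubles that risk; a threshold model would instead set the excess risk to zero below some cutoff dose.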

    The bone of contention is whether the LNT reflects reality. Some experts believe that population studies in regions with high background exposure—from radon or uranium deposits—suggest that radiation is harmless below a certain dose. Others point to data—including cellular studies—hinting that low doses may pose an even greater cancer risk, proportionally, than higher doses. At the request of several agencies, the NRC organized the latest panel on the Biological Effects of Ionizing Radiation to look at what model best fits the data.

    But the 16-person committee that the NRC unveiled on 10 June, chaired by Harvard epidemiologist Richard Monson, drew an angry response. The panel “is completely skewed” toward people who favor relaxed standards, claims Dan Hirsch of the Committee to Bridge the Gap, a nuclear watchdog group in Santa Cruz, California. His organization and 73 other groups and individuals claim in a 22 June letter that most panelists have published studies or opinions suggesting the need for looser standards.

    Other groups say the panel contains the opposite bias and ignores researchers who believe the LNT model is too restrictive. A nonprofit called Radiation, Science, and Health Inc., which insists low doses are harmless, claims that panelist Geoffrey Howe, a Columbia University epidemiologist, has “obfuscat[ed] data so as to support the LNT.” Bridge the Gap, meanwhile, finds fault for a different reason, claiming Howe advocates “the premise that low doses of radiation are substantially less harmful than officially presumed.” Howe told Science he considers the LNT model “a reasonable assumption not proven.”

    The NRC hopes to announce any revisions to the panel within a few weeks, Douple says. But that may not quell the fire: If the NRC makes “minor cosmetic changes that do not alter the imbalance of the panel,” Hirsch says, his group may file a lawsuit under the Federal Advisory Committee Act. Revisions to the act in 1997 opened panel memberships to public debate in the first place.

  4. MATHEMATICS

    Fermat's Last Theorem Extended

    1. Dana Mackenzie*
    1. Dana Mackenzie is a writer in Santa Cruz, CA.

    Five years ago, the proof of Fermat's Last Theorem by Andrew Wiles of Princeton University hit the mathematical world like an earthquake, rearranging the landscape and leaving previously unassailable peaks on the verge of collapse. This month, an aftershock has finally leveled the most prominent of these, a 40-year-old unsolved problem called the Taniyama-Shimura conjecture. While it lacks the colorful history of Fermat's 350-year-old unsolved puzzle, this conjecture applies to a vastly broader class of problems.

    “Before Wiles came along, nobody even knew how to begin proving the conjecture. Afterwards, there was a widespread belief that it was just a matter of time,” says Brian Conrad of Harvard University, who collaborated on completing the solution with Christophe Breuil of the Université de Paris-Sud, Fred Diamond of Rutgers University, and Richard Taylor of Harvard. “The Taniyama-Shimura conjecture is a wonderful, major conjecture,” comments number theorist Kenneth Ribet of the University of California, Berkeley.

    The conjecture, which Wiles partially proved en route to Fermat, states that all elliptic curves are modular. A couple of definitions make the statement a trifle less gnomic. An elliptic curve is not an ellipse: It is the set of solutions to a cubic polynomial in two variables, usually written in the form y² = x³ + Ax² + Bx + C. If x ranges over all real numbers, such equations indeed define curves—mildly wiggly ones that come in one or two pieces. However, number theorists are generally interested only in rational solutions—values of x and y that can be written as fractions. And an elliptic curve is modular if every rational solution can be found with the help of “modular functions,” a very high-tech version of the periodic functions familiar from trigonometry, like sine and cosine.

    In 1955, a young Japanese mathematician named Yutaka Taniyama first suggested using such modular functions to describe all rational points on an elliptic curve. Taniyama, who committed suicide at age 31, never got a chance to work seriously on his problem. However, his contemporary Goro Shimura, now at Princeton University, took this geometric approach to the problem further, strengthening the conjecture into its present form in the early 1960s.

    To explain how geometry can be used to solve algebraic problems, Conrad cites the oldest problem in number theory: finding Pythagorean triples. These are sets of three integers such that the square of one is the sum of the squares of the other two: for example, 3² + 4² = 5². This equation can be rewritten as (3/5)² + (4/5)² = 1. In this way, Pythagorean triples correspond to rational points, such as (3/5, 4/5), on the circle whose equation is x² + y² = 1. And Conrad notes that there's a simple geometric technique for finding all the solutions. First pick one solution—say (1, 0)—and draw any line through that point whose slope is a rational number. That line intersects the circle in a second point, the coordinates of which will be another rational solution.
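
    A minimal sketch of that construction in code, using exact rational arithmetic (the function name and the example slope are ours, purely for illustration):

        from fractions import Fraction

        def second_intersection(t):
            """Other intersection of the unit circle x^2 + y^2 = 1 with the
            line of slope t through the known rational point (1, 0)."""
            x = (t * t - 1) / (t * t + 1)
            y = -2 * t / (t * t + 1)
            assert x * x + y * y == 1   # Fractions are exact, so the check is exact
            return x, y

        # The slope t = -3 recovers the 3-4-5 triple:
        x, y = second_intersection(Fraction(-3))
        print(x, y)   # 4/5 3/5, i.e., 3² + 4² = 5² after clearing denominators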

    A similar idea works for elliptic curves. Given one rational solution, called a “generator,” you can get another by drawing a tangent to the curve at that point and looking for its other intersection with the curve. By repeating this procedure (and a variation of it) over and over, you can get lots of solutions—but only if you have one to start with. Sometimes, no such “generator” exists. In other cases, no single generator can produce all the rational solutions. The current record-holder is a curve that requires at least 23 of them. At present, modular functions offer the only hope for predicting the number of generators.
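
    Here is a hedged sketch of one step of that tangent procedure (the curve, the starting point, and the function name are invented for illustration; the textbook group law would also flip the sign of y, but the "other intersection" the article describes is computed the same way):

        from fractions import Fraction

        def tangent_third_point(A, B, C, P):
            """On y^2 = x^3 + A*x^2 + B*x + C, return the third intersection
            of the tangent line at the rational point P = (x0, y0), y0 != 0.
            Substituting the tangent line into the cubic leaves a polynomial
            whose roots are x0 (counted twice) and x1, and the three roots
            sum to m^2 - A."""
            x0, y0 = P
            m = (3 * x0 * x0 + 2 * A * x0 + B) / (2 * y0)  # tangent slope, by implicit differentiation
            x1 = m * m - A - 2 * x0
            y1 = m * (x1 - x0) + y0
            return x1, y1

        # Example: y² = x³ - 2x + 5, starting from the rational point (1, 2).
        A, B, C = Fraction(0), Fraction(-2), Fraction(5)
        x1, y1 = tangent_third_point(A, B, C, (Fraction(1), Fraction(2)))
        assert y1 * y1 == x1**3 + A * x1**2 + B * x1 + C   # the new point is rational and on the curve
        print(x1, y1)   # -31/16 81/64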

    Indeed, number theorists have proven quite a few results about modular elliptic curves, including how to tell if they have only one generator. But until now, they didn't know which elliptic curves would turn out to be modular. Wiles, in effect, proved that modular functions exist for a large class of elliptic curves. Now, Breuil, Conrad, Diamond, and Taylor have proved that they exist for all the rest.

    “It is very aesthetically pleasing that now the full conjecture has been proved, rather than just ‘most’ of it,” says Berkeley mathematician Hendrik Lenstra. “It is just as with stamp collecting … having a complete collection is infinitely more pleasing than having all but one.” Lenstra and other mathematicians note, however, that they have not yet been able to judge the correctness of the proof, which so far has been presented only in public lectures. “I hope a complete draft will be ready by the end of the summer,” Conrad says.

  5. GENETIC TESTING

    Beryllium Screening Raises Ethical Issues

    1. Eliot Marshall

    Analytical chemist Reed Durham finds himself at the cutting edge of an ethical debate over research on genetic risks in the workplace—but not as an investigator. Instead, Durham has become a significant data point in an effort to understand why a small percentage of people exposed to the metal beryllium—element number four on the periodic table—develop an incurable and sometimes fatal lung disease. And he's not happy about being removed from his job after testing positive for a sensitivity to the metal that is believed to be caused by a genetic variation. “I have been excluded from anything that has to do with beryllium,” says Durham, who does not have the disease. “All the expertise that I've gained over the past 30 years working with these materials, I can't use any more.”

    Durham spoke about his plight at a 24 June meeting outside Washington, D.C., on the ethical problems of conducting workplace health studies. His case illustrates the “troublesome aspects” of using a test without clear benefits to those taking it, one that not only produces lots of “wrong” answers but that also monitors a condition that cannot be treated until symptoms appear, says Donna Cragle, director of epidemiological studies at the Oak Ridge (Tennessee) Institute for Science and Education. Cragle, who is helping the Department of Energy (DOE) build a registry of workers exposed to beryllium, says the value of a robust database must be weighed against the psychological impact of a positive test, the threat to an individual's health insurance, and the disruption at work.

    Durham has spent his career in an area of the Y-12 plant in Oak Ridge where beryllium-containing nuclear weapons components are handled. Recently, he received word that a blood test for an immune reaction was positive, marking him as being at greater risk of developing chronic beryllium disease (CBD), also known as berylliosis. As a consequence, Durham has been banned from certain areas of the factory where he might be exposed to beryllium, although the value of removing such workers is not clear.

    He is just one of nearly 10,000 beryllium workers who have been screened by DOE, which sponsored the meeting. The practice is also in use at the Cleveland, Ohio-based Brush Wellman Co., which mines, refines, and sells beryllium for uses that range from electronics to golf clubs. Both organizations are using blood test results, offered on a voluntary basis, to warn employees about their sensitization status and to keep “sensitives” away from high-exposure areas. And both are developing sophisticated new molecular assays—including potential genetic tests—to more reliably identify workers who may become ill. DOE researchers say this may be the first major attempt to screen for an employment risk based on genetic susceptibility.

    The current screening program is based on a relatively straightforward blood test validated 7 years ago by researchers at the National Jewish Medical and Research Center in Denver, Colorado, who demonstrated that lymphocytes from certain workers exposed to beryllium proliferated rapidly in the presence of beryllium salts. That indicated their immune systems had become “hypersensitized” to the metal and were attacking the lungs. Patients with full-blown CBD test positive on this assay, and recent studies have shown it is predictive as well: About 45% of the people with positive results also develop CBD.

    These findings, plus health surveillance results based on the blood test, point to the disheartening conclusion that the rate of CBD among exposed workers, despite a decades-long effort to reduce ambient beryllium dust in the workplace, is “about the same as it was in the 1930s and 1940s,” says Paul Wambach, an occupational medicine officer at DOE's office in Germantown, Maryland. The new research also turned up an alarming incidence of CBD among clerical staff and workers outside the beryllium work zones—raising the possibility that very small amounts of dust might trigger the disease process in “sensitives.” As a result, DOE and Brush Wellman have expanded their screening programs and introduced new measures to reduce dust, including protective gear for those subject to greatest exposure.

    Now researchers are seeking a genetic marker more predictive than the lymphocyte proliferation test. Babbetta Marrone and colleagues at the Los Alamos National Laboratory in New Mexico have built upon work on the major histocompatibility complex gene to identify a rare allele and a pattern of homozygous inheritance of a common allele that appear to be strong predictors of disease risk. About 85% of CBD patients in a small study were positive for one of these markers, Marrone says, compared with only 16% of beryllium-exposed workers without CBD.
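
    As a rough back-of-the-envelope gloss (our arithmetic, not a figure from the study), those two percentages correspond to an odds ratio of about

        \[ \mathrm{OR} \;=\; \frac{0.85/0.15}{0.16/0.84} \;\approx\; \frac{5.7}{0.19} \;\approx\; 30, \]

    that is, in that small sample, carrying one of the markers was associated with roughly thirty-fold higher odds of CBD.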

    Genetic information like this may soon make possible more accurate predictions of which prospective employees would be harmed by exposure to beryllium. Officials at both DOE and Brush Wellman say that genetic tests are not being used in the work place today because they are not yet considered good enough predictors of health and because they have been denounced as discriminatory. Of course, the whole point of diagnostic testing is to discriminate. Reed Durham has learned that lesson firsthand, and his reaction suggests that the issue is likely to remain controversial. “I cannot recommend that anyone ever take that test,” he says. “If they'd ask me again, I'd say no.”

  6. ECOLOGICAL RESTORATION

    Plan to Quench the Everglades' Thirst

    1. Martin Enserink

    Vice President Al Gore last week presented Congress with a Herculean challenge: to restore a more natural water flow to the Everglades, the vast wetland in southern Florida, while safeguarding the booming region's water supply. The $7.8 billion project, drawn up by the Army Corps of Engineers, would be the most expensive restoration effort ever undertaken. It calls for a 20-year overhaul of southern Florida's water management system that would, ironically, undo large portions of an equally ambitious plumbing system built by the Corps decades ago.

    Often called the “River of Grass,” the Everglades once was a 100-kilometer-wide, shallow sheet of water flowing south, at a rate imperceptible to the eye, from Lake Okeechobee to Florida Bay. It sustained a unique ecosystem including marsh grasses, cypress trees, herons and other birds, alligators, and panthers. Agriculture and mushrooming urban centers have eaten away half the Everglades. But a particularly devastating blow was dealt after a disastrous flood in 1948 prompted the Corps to devise a system to curb flooding and create a dependable water supply: over 1600 kilometers of levees and canals that channel water from the area north of the Everglades to cities and plantations.

    The waterworks also dump 6.4 billion liters of freshwater into the Atlantic and the Gulf of Mexico every day. As a result, the Everglades has become drier, and its denizens are suffering. Since the turn of the century, wading bird populations have declined over 90%, and dozens of animal and plant species are near extinction. Meanwhile, the salt balance in the estuaries on both coasts is disturbed by too much freshwater, harming seagrasses and animals. “The irony is that we've made freshwater a pollutant,” says David Guggenheim, co-chair of the Everglades Coalition, a group of national and Florida conservation organizations.

    Now the Corps hopes to undo its grand mistake. Its plan (www.restudy.org), scheduled to begin next year, calls for stopping the diversion of water and letting it flow naturally. To do this, engineers would remove some 400 kilometers of canals and levees. To prevent flooding during heavy rainfall, the Corps would turn two limestone quarries near Miami into reservoirs and create 16 more reservoirs elsewhere. And in an unprecedented engineering feat, over 300 wells would be drilled around Lake Okeechobee to pump up to 6 billion liters of freshwater per day several hundred meters underground into the Floridan Aquifer. To give the River of Grass unfettered access to Florida Bay, state and federal agencies would buy 24,000 hectares of farmland—and allow it to flood—and build bridges to elevate 30 kilometers of U.S. 41, also known as Tamiami Trail, which connects the Florida coasts.

    Most environmental groups applaud the plan. Its size and sophistication make it “a cutting edge project,” says Stuart Strahl, vice president of the National Audubon Society. “We're really setting precedents here” for future restorations, he says. Others voice doubts: The Sierra Club, for instance, questions the wisdom of filling the aquifer. Drilling is expensive, and pumping unprecedented amounts of water into the ground could crack the aquifer, says the Sierra Club's Frank Jackalone, who would rather see more water kept in reservoirs. He worries the water may become contaminated if it reaches nearby underground waste pits.

    But Jackalone and other skeptics say their minds are eased by an Interior Department decision to ask the National Academy of Sciences (NAS) to help organize a review of the project, after a trio of prominent ecologists last February warned of what they viewed as the plan's “deep, systematic” scientific failings. NAS plans to appoint a committee later this month. “We are now confident that the panel will make the necessary fixes,” says Jackalone.

    The plan also has the backing of Florida politicians, including Governor Jeb Bush and the entire congressional delegation. But it may face hurdles in Congress. In April the General Accounting Office, Congress's independent financial watchdog, concluded that the plan's cost—to be split between the state and the federal government—could rise to $11 billion. It remains to be seen whether legislators from around the country will be willing to channel billions into one state.

    However, administration officials and conservationists are confident the project will stay on track. “We're restoring a whole ecosystem instead of a single species,” says Audubon president John Flicker. “The whole world is watching.”

  7. ANIMAL WELFARE

    New Indian Rules Disrupt Research

    1. Pallava Bagla

    NEW DELHI—Indian scientists have been unable to import animals for research this year because of inaction by a new government committee set up to review such requests. In the meantime, the relationship between biomedical researchers and government regulators has taken a turn for the worse as the Animal Welfare Board last week threatened to close the country's main center for supplying laboratory animals after it failed to follow new registration procedures.

    The tensions grow out of a law that went into effect on 15 December 1998 to safeguard an estimated 5 million animals, from mice to primates, used at 5000 labs throughout the country. It gives authority for reviewing import requests to a Committee for the Purpose of Control and Supervision of Experiments on Animals (CPCSEA), which has met several times but taken no action on scores of requests. Previously, Indian scientists could obtain animals directly from overseas sources once they obtained approval from their own institutions.

    The delay has led some to take extreme steps. Neuroscientist Sumantra Chattarji of the National Center for Biological Sciences in Bangalore headed to Cambridge, Massachusetts, to use genetically engineered mice developed by MIT biologist Susumu Tonegawa for a project on learning and memory in the hippocampus. Immunologist Satyajit Rath of the National Institute of Immunology in New Delhi, awaiting approval to buy knockout mice from the Jackson Laboratory in Bar Harbor, Maine, warns that continued delays “are likely to prove disastrous to Indian science.” CPCSEA officials say that they are working through the backlog “as quickly as possible.”

    Indian researchers can still use animals bred in the country, but the main lab animal supply facility, the National Center for Laboratory Animal Sciences in Hyderabad, fell afoul of another new requirement for labs working with animals that involves submitting registration papers and passing an on-site inspection. Officials at the national center, which each year ships some 40,000 animals to nearly 200 research facilities around the country, believed that paperwork filed last summer was sufficient to meet the new rules. But the Animal Welfare Board said that the center missed a 15 February deadline for compliance, and last week the board notified the center that it would have to close unless officials could demonstrate why it should remain open.

    Negotiations have begun to resolve the situation, and scientists are hopeful that the problem will be cleared up. “We will not let this happen,” says Nirmal Kumar Ganguly, director-general of the Indian Council of Medical Research. “A national facility just cannot be shut down.”

    For their part, animal welfare officials say they want to make sure that researchers are taking the new law seriously. They note that some 152 labs, including most major public and private facilities, have complied with the new registration requirements. “The rules are applicable equally to everybody,” says one CPCSEA official who requested anonymity. “And if some can comply, why can't the others?”

  8. AGRICULTURAL RESEARCH

    Report Tells USDA to Narrow Its Focus

    1. Jocelyn Kaiser

    The research program of the U.S. Department of Agriculture (USDA) needs a major overhaul, building stronger ties to the outside research community and focusing more sharply on fewer research priorities, according to a federally appointed task force that laid out a blueprint for such reforms. In a draft report obtained by Science, the panel urges that USDA build fewer new labs, shut down many existing research stations, and increase partnerships with academic and corporate labs. “We need to spend the money on science and not facilities,” says the panel's chair, Bruce Andrews, until recently the director of agriculture for the state of Oregon. USDA's in-house labs, the report says, should concentrate on work that can best—or only—be done by the federal government.

    This sort of advice has been offered before, but it may pack an additional wallop this time, because the report was done at the request of Congress, which approved much of the overbuilt research enterprise that the report decries. The 14 members of the task force include cattle and soybean producers, farmers, and a state agricultural official as well as university scientists. “I think they're on target,” says Lou Sherman, chair of biology at Purdue University in West Lafayette, Indiana, who has seen the report's conclusions. But task force members haven't yet signed off on the final report, which is due out in a few weeks.

    The task force was created by the 1996 Farm Bill, which calls for a 10-year strategic plan for federally funded agricultural research facilities “to ensure that a comprehensive research capacity is maintained.” Congress and other observers want to make the most of a $1.6 billion USDA research budget that hasn't risen in recent years. The task force notes that this money must be stretched across some 370 labs—the bulk of them within the Agricultural Research Service (ARS) but also the Forest Service and land grant university labs—leaving many badly in need of repair and short of money for doing science. A big problem is the steady flow of new facilities, such as a cranberry research facility in New Jersey and a swine center in Iowa, “dictated by politics” and stuffed into USDA's budget as a favor to individual legislators.

    Andrews, who is now head of marketing for the Port of Portland, says the panel considered options from the status quo to a system looking outward—“the NIH [National Institutes of Health] model.” It chose “a middle road,” he says. Under its plan, USDA would classify programs as “uniquely federal,” “appropriately federal,” or neither. Uniquely federal projects, such as storing genetic materials, studying highly infectious foreign animal diseases, and work relating to national security issues (such as bioterrorism), represent, perhaps, one-fourth of what USDA does, Andrews says. Only this work should continue in federal facilities. “Appropriately federal” work, such as climate change and biodiversity research, should be done by universities or the private sector whenever possible. Andrews anticipates that USDA will do “less and less [of this research]” over the next decade.

    While the task force asked USDA to finish this classification by July 2000, it went ahead and identified about 23 labs that should be closed or consolidated. It even suggests that USDA's flagship facility in Beltsville, Maryland, should consider relocating, because its once-rural setting is now valuable suburban real estate.

    Andrews says he knows the report will ruffle feathers. “Any system that's been in lock step for the last 10 or 20 years I'm sure will view this as an attack.” Already, he adds, the task force's ideas have “created some hostility” from the ARS and Congress. ARS associate administrator Ed Knipling declined to comment until the report is officially released, but noted that “the department does recognize it as somewhat controversial.” Sherman agrees. “It's going to take a lot of political will on the part of the executive branch and Congress” to turn its recommendations into reality, he says.

  9. SCIENTIFIC COMMUNITY

    Tragedy Devastates Radio Astronomers

    1. Michael Balter

    PARIS—Scientists and staff at the Institute of Millimetric Radioastronomy (IRAM), one of the world's leading radioastronomy research centers, were in shock last week after a cable-car accident on 1 July killed 20 people, all workers employed by IRAM and its subcontractors. The cable car, which was ferrying the workers to IRAM's facility atop the 2552-meter Bure plateau in the French Alps, somehow came loose from its cable and plummeted 80 meters. The accident is likely to delay completion of a new radiotelescope.

    IRAM is run by the French basic research agency CNRS, Germany's Max Planck Society, and Spain's National Geographical Institute. IRAM operates two major facilities, a 30-meter telescope at Pico Veleta in southern Spain and an array of five 15-meter telescopes at Bure. The telescopes detect emissions at millimeter wavelengths, between radio and infrared, which reveal molecules in cool interstellar regions and dust shrouding comets and young stars. In recent years, IRAM's telescopes have racked up an impressive list of accomplishments. For example, they enabled astronomers to detect flat disks of gas and dust around several young stars, resembling a disk thought to have encircled our own sun before the birth of the solar system. Combined with other studies of disks around young stars, the finding could provide valuable insights into how planets form.

    The cable car provided the sole access to the mountaintop facility, so until the tragedy has been fully investigated and the cable system repaired, the center's activities will be sharply curtailed, says Philippe Chauvin, a CNRS spokesperson in Paris. The accident will also delay completion of a sixth radiotelescope, which was supposed to have come online late this year or early next, Chauvin says. At press time, IRAM officials were scheduled to meet this week to discuss the consequences of the accident. But late last week, Chauvin said, no one wanted to think about what to do next: “The people there are completely traumatized.”

  10. LIFE SCIENCES

    Japan Readies Huge Increase in Biotech

    1. Dennis Normile

    TOKYO—Japan's political leaders are piecing together a plan that would nearly double the country's current investment in life sciences research over the next 5 years. The multi-agency initiative, which would start next spring, is aimed at bolstering the country's biotechnology industry through support for basic science.

    A 3 July report in Nihon Keizai Shimbun, Japan's leading economic daily, says that the government and the ruling Liberal Democratic Party have set a target of adding 2 trillion yen ($16.7 billion) over 5 years to the current annual $4.2 billion spent on biotechnology-related R&D. Officials at several ministries cautioned that 2 trillion yen is likely to be more than they will get from the Ministry of Finance, which is trying to rein in the country's recession-swollen national debt. But they confirmed that five ministries and agencies are seeking major increases to support everything from accelerating the sequencing of the rice genome to boosting efforts in human genomics to providing support for new biotechnology companies. “The life sciences will be one of the most important areas [of science and technology] for the next few decades, so the Japanese government is trying to increase its support,” says Nobuhiro Muroya, deputy director of planning for the Science and Technology Agency (STA).

    Details on the new spending are sketchy, even though proposed budgets for the next fiscal year must be worked out by the end of August. One major initiative will be to hunt for the subtle genetic variations known as SNPs—sites where the spelling of the genome varies by a single letter from one individual to another. The hope is that SNPs, or single-nucleotide polymorphisms, will provide a powerful tool for tracing disease genes and developing individualized drugs.

    That idea has already led the U.S. National Institutes of Health to set up a SNPs program, and brought together 10 major U.S. and European pharmaceutical firms to set up The SNP Consortium (TSC) (Science, 16 April, p. 406). Researchers in Japan hope their own SNPs project will move them into the mainstream of genetic research. “We are behind the American and European efforts [in genomics],” says Yusuke Nakamura, director of the University of Tokyo's Human Genome Center and a member of a working group that mapped out a SNPs project strategy. Nakamura hopes the initiative also will provide an opportunity to revamp Japan's human genome project, which he believes has been hampered by too many research groups and the lack of an overall strategy.

    The first step in the SNPs project involves spending $50 million over 2 years to map between 100,000 and 150,000 SNPs, concentrating on areas associated with gene expression and function and using samples from 50 Japanese individuals. Researchers hope the approach will increase the likelihood of spotting SNPs associated with disease susceptibility and drug response. The work will be concentrated at one center, says Kanji Fujiki, director of the Life Sciences Division of STA, which is funding the effort. Fujiki says a final decision hasn't been made, but that the center is likely to be Nakamura's.

    The second step will be to look for associations between the identified SNPs and disease susceptibility and drug response. This research will target diseases prevalent in Japan, including cancer, diabetes, rheumatoid arthritis, and cardiovascular diseases. The effort will involve STA and possibly the Ministry of Health and Welfare and the private sector. Fujiki says the scope of the project will vary by disease and the total effort is still being worked out.

    Drug companies hope that studies of SNPs will reveal why medication that proves effective in one person is useless or produces side effects in another—information that could lead to “tailor-made medicine,” says Teruhisa Noguchi, a former vice president of Yamanouchi Pharmaceutical Co. “This is going to be an extremely important field for industry,” says Noguchi, who heads an industry group promoting the nascent field of pharmacogenomics, or the use of genomic data and lab techniques in developing drugs and diagnostics.

    So far, it appears that Japan's effort will be independent of SNP searches overseas, such as TSC. That consortium plans to spend $45 million over the next 2 years to create a SNP map of the whole genome and to make the data public without any preferential access for consortium members (Science, 16 April, p. 406). Speaking at a pharmacogenomics forum in Tsukuba last week, Arthur Holden, consortium chairman, emphasized the group's interest in having Japanese participation. “Expanding our activities would benefit everyone,” he says.

    But Japanese officials say their priorities lie elsewhere. “The Japanese have their own [genetic] characteristics and we feel we should do our own work on SNPs,” says STA's Fujiki. At the same time, he says that the databases for Japan's project will be open to all and that the idea of merging the data is worth pursuing.

    Other initiatives that might be funded by the new money include three new STA research institutes focused on genome informatics, stem cell research, and plant genetics. Hiroshi Arakawa, an official in the Ministry of International Trade and Industry's (MITI's) Biochemical Industry Division, says MITI is studying a range of programs to promote Japan's biotechnology industry, including possible tax incentives to promote the commercialization of technologies resulting from genomic research. The ministry is also considering a plan to support the creation of up to 1000 new biotechnology businesses.

  11. CHEMISTRY

    Race for Molecular Summits

    1. Robert F. Service

    In a branch of chemistry called total synthesis, glory goes to the first team to reproduce a complex molecule from simple ingredients. But some wonder whether the competition is healthy

    When researchers at the Merck pharmaceutical firm discovered in 1995 that a compound newly isolated from a soil microbe had a novel anticancer activity, the finding touched off a race—not only among biologists to understand and test the new molecule, but also among synthetic chemists to craft it from scratch. Dubbed epothilone A, the compound thwarts cells' ability to divide, rather than killing them directly. That promising trait, along with epothilone's complex structure (37 atoms arrayed in a fishlike shape, with a contorted body and sideswept tail), made it an inviting target for synthetic chemists who specialize in reproducing the most complex naturally occurring molecules they can find. Epothilone stood out as a Himalayan summit in a field that prides itself on taking on every 8000-meter peak in view.

    [Figure: Molecular Everest. The structure of palytoxin, a compound from a soft coral, synthesized in 1994. Source: Nicolaou et al., Journal of Chemical Education 75, no. 10 (1998).]

    The race to plant a flag on this molecular mountain was a sprint, lasting only about a year, far shorter than the 10 years or more that are regularly spent on such endeavors. Still, some labs made an all-out assault on the structure, with half a dozen or more chemists trying a variety of synthetic strategies simultaneously in hopes that one would pan out. In the end, a group led by Samuel Danishefsky of the Memorial Sloan-Kettering Cancer Center and Columbia University in New York City crested the peak in December 1996, just a month ahead of another team led by K. C. Nicolaou at The Scripps Research Institute in La Jolla, California. The virtual tie was hailed in the pages of Science and elsewhere in the science press as a great achievement for organic chemistry.

    But was it? Everyone agrees that the synthesis opened the door for medicinal chemists to tweak the molecule's structure in search of more potent analogs with fewer side effects. But the broader goal of this kind of work—long a prestigious subfield of chemistry that attracts the best and brightest—goes beyond the practical applications of understanding any particular molecule. Chemists want to learn the fundamental rules of how molecules react, to find new reactions and uncover surprising new ways to make and break bonds. In other words, the surprises encountered on the journey are supposed to be as important as the destination. And by that standard, some synthetic chemists say, the epothilone race brought little glory to the winners. “The early syntheses of the epothilones were not up to modern standards,” says Steven Burke, a total synthesis researcher at the University of Wisconsin, Madison. “They wouldn't have been publishable if the molecule hadn't been such a high-profile target.”

    Nicolaou and Danishefsky acknowledge that the original epothilone syntheses didn't rewrite chemistry textbooks, although they say new reactions were discovered in the effort, and that it led to epothilone analogs that may improve on the original compounds. Nevertheless, Burke and a broad cross section of other total synthesis chemists say that these are trying times for the field. They argue that the chemical synthesis race is intensifying while the number of targets is shrinking, because of a slowdown in searches for promising new natural products (see sidebar, p. 186). As a result, more and more synthesis groups are racing after fewer inviting targets. To stay ahead, some researchers rely on safe, well-established reactions, and so make fewer fundamental discoveries along the way. At the same time, the rise of biological chemistry and materials science is siphoning off the talented students once attracted to the field.

    Even chemists who are major forces in total synthesis acknowledge that the pressure to race for the finish line is limiting fundamental explorations. However, they also say that total synthesis remains a fundamentally healthy wellspring of chemical discovery and the ideal training ground for chemists in hot demand in the pharmaceutical industry. Complaints, they say, come largely from scientists without firsthand experience of work on chemistry's cutting edge. “To say this work isn't turning up anything only comes from someone who says surgery is surgery, so removing a corn on a toe is the same as an organ transplant,” says Danishefsky.

    It's a measure of how touchy the issue has become that many of those interviewed by Science agreed to talk on the condition that they not be identified. Yet the soul-searching runs deep, says George Whitesides, a synthetic chemist at Harvard University who is widely regarded as having a broad overview of the field. Total synthesis, it seems, is akin to a superpower in the post-Cold War world. “It's going from a dominant field to one thinking about what's next,” says Whitesides. Just what is next depends on whom you talk to, but most in the field agree that biology will be a big part of the equation, as total synthesis retools to focus on the practical goal of creating novel medicines in a simple fashion.

    Imitating nature

    In the early days of the field—just after World War II—synthesis was performed not so much to uncover new reactions as to nail down the structure of particular molecules. By using a series of well-known reactions with predictable results, researchers could be sure of the three-dimensional shape of the molecule they created. But from the late 1940s through the 1970s, with the advent of other techniques to probe molecular structure—x-ray crystallography and nuclear magnetic resonance spectroscopy—synthetic chemists lost their chief raison d'être.

    They soon found others. Because synthetic chemists were no longer trying to infer structure from the reactions they used, they no longer had to limit themselves to familiar reactions. They were unleashed to explore the world of synthesis, to try to invent new reactions to make impossibly complex molecules from simple starting ingredients, such as amino acids. Natural products, with their wide variety of complex shapes, became ideal targets. “Early on, the issue was, can you make a complex natural product?” says Harvard chemist Eric Jacobsen, who searches for new “asymmetric” reactions that preferentially produce one of several mirror-image forms of some molecules. More recently, synthetic chemists have offered an additional rationale: finding ways to make and modify natural drug candidates that are either rare or hard to isolate in abundance.

    But, for many chemists, the basic appeal of total synthesis was visceral, akin to the reason climbing pioneer George Mallory gave when he was asked why he wanted to climb Mount Everest: “Because it is there.” The title “Total Synthesis of …” began proliferating in the pages of the major chemistry journals.

    Fortunately, the mountain-climbing approach proved fruitful for chemical discovery as well. By taking on daunting molecular targets such as vitamin B₁₂, for example, Harvard University's R. B. Woodward—together with Albert Eschenmoser at the Federal Technical University in Zurich, Switzerland—laid the empirical foundation for what later became the Woodward-Hoffmann rules spelling out how the electronic structures of molecules reorganize during reactions. “These helped make sense of a large body of chemical reactions and provided a way to rationalize why certain reactions take the course they do,” says Erik Sorensen, a total synthesis chemist at Scripps. Other foundations of the discipline soon fell into place, and the advances earned a string of Nobel prizes. The names of the total synthesis pioneers—Woodward, Hoffmann, Eschenmoser, and E. J. Corey—are as revered among chemists as the names Edmund Hillary and Tenzing Norgay are among mountain climbers.

    Like Hillary's Everest expedition, total synthesis “is often done in a kind of land war way,” says Sorensen, with a heavy investment of personnel and time. The dozen or so large total synthesis labs around the world are each home to between 20 and 40 chemists, organized into teams pursuing separate projects. A few months ago in the Nicolaou lab, for example, one team was toiling on the synthesis of potential anticancer compounds called CP molecules, while another was doing follow-up work on a recently completed project to synthesize vancomycin, and others were following early ideas on compounds such as maitotoxin, a molecule from a marine dinoflagellate.

    “You eat, drink, and sleep your molecule,” says Phil Baran, a Ph.D. student in Nicolaou's lab, who was part of the CP team. Creativity is prized for finding ways to knit desired bonds together. But most good ideas fail, says Baran. So it's typically those who work harder and try more reactions in the lab who come out ahead. “There's always a question of who will finish first,” he says.

    Embarrassment of riches

    As molecular mountain after mountain has fallen under the onslaught, the field's very success has begun to change it. In 1994, Yoshito Kishi and his colleagues at Harvard pulled off the complete synthesis of palytoxin, a neurotoxin from a soft coral. Even today, the molecule is recognized as one of the most complex ever attempted. Palytoxin, Jacobsen explains, harbors more than 100 “stereocenters,” where the molecule has mirror-image forms. Synthesizing it demanded not only forging the right bonds, but controlling the orientation of each stereocenter. “A lot of people saw that molecule as a defining moment in the field,” says Jacobsen. “It put to rest the question of will it be possible to make any molecule that nature makes. The Mount Everest issue has been answered without exception,” he adds. Given ample time, money, and skilled practitioners, any mountain would fall.

    Not everyone agrees. Nicolaou, for example, points out that other molecular behemoths such as maitotoxin remain unclimbed, and others will undoubtedly be discovered. Still, as the summits become more attainable, the nuggets of discovery—the other key justification of the work—are becoming rarer as well. As one prominent California-based researcher puts it, organic chemists can work on discovering fundamental principles of organic chemistry, or they can make something, such as a natural product. “It used to be that the way to learn the principles was to engage in the second exercise,” says the researcher. “That is less and less true. There's the rub.”

    “That much I'm ready to concede,” says Danishefsky. The chemistry that is developed on the climb up the mountain “doesn't impact as many other projects” as the discoveries made in previous decades, he says. “I'm sure that's true,” agrees Scripps total synthesis chemist Dale Boger. “At some point, [the chemistry] becomes so well developed that it becomes harder to justify chemistry for chemistry's sake.” Ask synthetic chemists about recent first total syntheses, and for molecule after molecule they'll call the efforts “heroic,” “impressive,” and more. Privately, though, many wonder whether the chemistry developed is truly groundbreaking.

    Some researchers blame competition for shrinking the yield of new science. “The quickest way to make a molecule is not to discover new reactions, but to use known reactions,” says Jacobsen. “Rarely do you see a lot of new chemistry come out of that effort.” As the California critic puts it, “You're taking known reactions and putting them in a new order.”

    And competition is on the rise. “I think it has increased over the years as the number of groups engaged in natural product synthesis has increased and the advances in synthetic methods have made it so that a large number of groups are able to tackle complex natural product targets,” says Stuart Schreiber, a synthetic chemist at Harvard. Others blame the science media—including news articles in Science—for celebrating the races and lavishing recognition on teams that are the first, perhaps by only a few weeks or months, to complete a molecule, instead of mentioning subtler accomplishments, such as developing a more elegant and concise way to make a medically important compound. “If someone comes up with a truly superior synthesis, it would probably not be given as much credit as it should because the summit had already been achieved,” says Burke. Adds Nicolaou, “You don't get much credit for rediscovering the wheel.”

    “I believe these races are minimizing our opportunity to make fundamental discoveries,” says Schreiber. “I feel it's not a healthy development in our field.” It may also be jeopardizing the field's future, some chemists say. Because of the competition, the handful of big groups that can put the most grad students and postdocs on a project tend to dominate the field. “If you're a young organic chemist, you can't compete with these teams,” says one researcher. As a result, promising students may be looking elsewhere. “I think that the best people aren't going into this area,” says one synthetic chemist from the Midwest who is not involved with total synthesis. Instead, other areas where synthesis is used as a tool to make materials or drugs, rather than as an end in itself, are siphoning off much of the talent. “Straight synthesis has a lot of competition it didn't use to have,” adds the chemist.

    But even if total synthesis is facing a period of soul-searching, it will continue to be valued as a way to test cutting-edge chemical techniques and as a robust training ground for the pharmaceutical industry. Companies gobble up graduates from total synthesis labs as fast as they can be minted, putting them to work on crafting potential drug molecules. The targeted, goal-oriented, problem-solving training that students get in total synthesis labs “is very important for what we do,” says Paul Anderson, vice president for chemical and physical sciences at the DuPont Pharmaceutical Co. in Wilmington, Delaware.

    Synthesis experts also argue that by no means does every total synthesis wind up in a race. And they say they are far from exhausting the veins of new science to be mined. “As long as you are facing new structural types, you will learn new chemistry” in order to make them, says Nicolaou. Harvard synthesis pioneer E. J. Corey adds that the total synthesis field is still assimilating the recent discovery of novel asymmetric catalytic reactions, which have had an enormous impact on how complex molecules are made. “The way people do syntheses now is totally different than [it was] 15 years ago,” he says.

    But leaders of big and small groups alike say it's time for the field to move on to new goals, such as developing techniques to make exotic compounds with just a few steps, so that the synthesis is commercially practical, or making natural products and their kin in large quantities so biologists can study their effects. Whether the field embraces these aims or continues to be gripped by the lure of racing for unclimbed summits could determine how it fares in its period of greatest uncertainty.

  12. CHEMISTRY

    Drug Industry Looks to the Lab Instead of Rainforest and Reef

    1. Robert F. Service

    LA JOLLA, CALIFORNIA—Chemists who synthesize complex organic molecules from scratch take their lessons from nature. As a test of chemical artistry, they try to mimic the complex, pharmacologically active molecules made by obscure living things such as marine sponges, rainforest plants, and soil microbes. The field is at a turning point, with some chemists arguing that it isn't producing the fundamental insights of the past (see main text). And, to add to its woes, natural products discovery—the effort to find candidate drugs in natural sources, which provides the chemists with their targets—has run into troubles of its own.

    In February, Shaman Pharmaceuticals, a biotech start-up devoted to finding drugs in the rainforest, laid off the bulk of its staff after it had a hard time getting its top drug candidate past the U.S. Food and Drug Administration. In 1995, the large pharmaceutical company Abbott Laboratories put an end to its own natural products research, and pharma giants such as SmithKline Beecham have cut back their internal efforts considerably. Instead of looking for promising compounds in nature, they are turning to faster, cheaper techniques for generating and testing synthetic compounds by the thousands.

    A decade ago, enthusiasm for stalking new drugs in rainforests and oceans ran high—and with good reason. Historically, natural products have been the main source of drugs, giving rise to everything from penicillin to salicin, the precursor to aspirin. A 1997 ranking showed that natural product drugs still make up 34% of the 25 best-selling drugs, according to Gordon Cragg, who heads natural products drug discovery at the National Cancer Institute in Bethesda, Maryland. But both business strategy and technology have taken the bloom from natural products discovery, Cragg and others said at a conference here recently.*

    On the business side is industry's appetite for new products. Major pharmaceutical companies have grown at an impressive 12% a year over the last 5 years. To keep this up, they must come up with as many as six to eight new high-selling compounds a year, up from three to four a few years ago, says Sunil Kadam, a natural products drug discovery researcher at Eli Lilly in Indianapolis. And with natural products, the road to discovery is long.

    First, finding novel plants and other organisms for testing requires extensive and painstaking fieldwork. Researchers must then create extracts that can be screened for a desired activity, such as inhibiting cell growth for a candidate cancer drug. Next, the active compound has to be isolated, purified, and tested again. Finally, if the compound passes all these hurdles, researchers must hope that it hasn't already been identified and patented by a previous screen. “All of this takes resources,” says Pfizer's Jim Valentine. “There's only so much money to go around.”

    Over the last decade, much of this money has been channeled into a more industrial approach to drug discovery, combinatorial chemistry. Instead of extracting candidate compounds one by one, combinatorial chemistry synthesizes tens of thousands to millions of them at once. Robotic high-throughput techniques screen them en masse for drug leads (Science, 31 May 1996, p. 1266).
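
    The “combinatorial” is meant literally. A schematic sketch in Python (purely illustrative; it mirrors no particular company's chemistry): fill a few variable positions on a common scaffold from pools of interchangeable building blocks, and the library size is the product of the pool sizes.

    ```python
    from itertools import product

    # Schematic combinatorial library (illustration only): a scaffold with
    # three variable positions, each filled from a pool of 100 building
    # blocks, yields 100**3 = 1,000,000 candidate compounds.
    pool_a = [f"A{i}" for i in range(100)]
    pool_b = [f"B{i}" for i in range(100)]
    pool_c = [f"C{i}" for i in range(100)]

    library = product(pool_a, pool_b, pool_c)   # lazy iterator over all combos

    print(next(library))                              # ('A0', 'B0', 'C0')
    print(len(pool_a) * len(pool_b) * len(pool_c))    # 1000000
    ```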

    “If someone says I'm making 5 million compounds and another says I'll come up with 50,000 extracts, the former gets the money,” says Kai Bindseil, executive director of AnalytiCon, a natural products discovery company in Potsdam, Germany. The combinatorial paradigm remains largely unproven: The first few drugs found this way are still working their way through clinical trials. But “the chemistry is improving rapidly,” says Bindseil. “There will be further pressures on natural products.”

    Still, Bindseil and others believe that it's too early to count natural products out. For one, they argue that natural products have a key advantage: diversity. Evolution has done an extraordinary job of making compounds with a wide variety of three-dimensional structures, which is essential for turning up novel drug leads, says Cragg. Combinatorial chemistry, on the other hand, excels at making huge libraries of compounds all slightly different from one another. That can help optimize a drug lead, but it's not as useful in finding the lead in the first place.

    To speed their work, natural products researchers are also turning to technology. In recent years, devices for purifying and analyzing compounds in complex mixtures have improved dramatically, says Kadam. Separations and analysis that used to take months can now be done in days. “This will really speed up natural products discovery,” says Bindseil. His company is betting on it. Last December, AnalytiCon embarked on an ambitious program to create a library of some 50,000 natural product extracts by 2002. The company will then license the extracts to pharmaceutical companies for screening.

    Meanwhile, researchers at Diversa, a San Diego-based biotech firm, are banking on a new approach to isolating natural products from microorganisms. Microbes—by far the most numerous and varied organisms on Earth—have been prolific sources of breakthrough compounds, particularly antibiotics. And their sheer diversity suggests that a rich panoply of compounds remains to be discovered. But researchers have traditionally been able to sample compounds only from organisms they could grow in culture, perhaps less than 1% of the species thought to exist. “There's a tremendous amount of chemical information that has not been tapped yet,” says Cragg.

    Diversa gets around the culturing problem by isolating DNA from new microbes and cloning random fragments—containing genes or families of genes—into the common lab microbe, Escherichia coli. The E. coli produce the novel proteins, which sometimes work as biosynthesis machines, producing novel compounds found in the original, unculturable organisms. Finally, the researchers isolate both the new proteins and their handiwork and scan them for druglike activity.

    It's still too early to know if these or other approaches to high-speed natural products discovery will be adopted throughout the industry. But in the end, says Kadam, “this is a ‘show me’ science.” If natural products researchers show that they can turn out novel compounds at a relatively low cost, their discipline may again thrive.

    • *“Natural Products Discovery: At the Crossroads of New Technology and Genomics,” La Jolla, California, 13–14 May 1999.

  13. GEOPHYSICS

    The Great African Plume Emerges as a Tectonic Player

    1. Richard A. Kerr

    A massive upwelling of hot rock beneath southern Africa may be shaping the continent as it cools Earth's core, in the flip side of plate tectonics

    Plate tectonics gets all the glory. We humans ride the plates across the planet at their stately (and now measurable) pace and marvel at the natural wonders they produce—the soaring Himalayas, the deep-sea trenches, the earthquakes, the volcanoes. All this geologic hubbub happens because, through plate tectonics, Earth's mantle is cooling itself. Hot new ocean crust forms at midocean ridges, cools, and sinks back into the mantle, shedding heat and driving the plates. But geophysicists have long suspected that Earth might have another, less obvious way of chilling out. Almost 3000 kilometers down at the bottom of the mantle, they figured, heat from the molten iron core may churn up towering plumes of hot rock that slowly rise to the surface to spew volcanic outpourings. A narrow plume has recently been spied beneath Iceland (Science, 14 May, p. 1095), and another may fuel Hawaii's volcanoes—small potatoes in Earth's cooling system. But geophysicists are now accumulating evidence of two huge “superplumes” cooling the core.

    [Figure] Blowing hot and cold. Superplumes loft heat from Earth's core, while cold slabs sink inward. (Source: A. Forte/U. Western Ontario; illustration: L. Carroll)

    Deep beneath southern Africa, the “Great African Plume” is shaping up as the clearest example of a superplume. At the spring meeting of the American Geophysical Union (AGU) in Boston and in recent publications, geophysicists report signs that a blob of hot rock several thousand kilometers wide at its base, long known to lurk beneath southern Africa, extends toward the surface, spanning the mantle from the core to the volcanic hotspot of northeastern Africa. Its ascent could be pushing up much of southern Africa, and it could be feeding a dozen or more volcanic hotspots across the continent.

    Another likely superplume seethes beneath the southwest Pacific. Together, the plumes are a major force in the 80% of the planet that is the mantle, says geophysicist Alessandro Forte of the University of Western Ontario in London, who sees them forming half of “the dominant large-scale structure of the deep mantle.” They might even shape climate.

    Earth scientists have only just convinced themselves and most of their colleagues that narrower structures span the mantle from top to bottom. By using earthquake waves crisscrossing the mantle as a global version of the x-ray CT scans in medicine, seismologists have seen slabs of cold, dense ocean plate sinking below a depth of 670 kilometers into the lower mantle—in places, apparently, all the way to the bottom (Science, 31 January 1997, p. 613). A curtain of slabs descends around the Pacific Ring of Fire, while others plunge under the Mediterranean Sea and India.

    The slabs show up on seismic images because colder rock speeds up seismic waves. Hot, seismically slow features are tougher to pin down. Two great blobs of seismically slow mantle, one beneath the southern tip of Africa and the other beneath French Polynesia in the southwest Pacific, stood out in even the first fuzzy seismic images of the mantle made in the 1980s. Later images hinted that the African blob in particular might extend toward the surface, but the fuzziness was never quite dispelled.

    With more earthquakes, more and better seismographs recording quakes, and more comprehensive compilations of seismic data, seismologists are sharpening their view of the African plume. At the AGU meeting, seismologists Jeroen Ritsema and Hendrik van Heijst of the California Institute of Technology presented a new mantle image based on three types of seismic observations—surface waves, which travel only through the upper 670 kilometers of the mantle; waves that rumble throughout the mantle, although less frequently through its upper reaches; and quake-triggered oscillations of the whole planet, which are particularly sensitive to velocity variations in the mid and lowermost mantle.

    The resulting image portrays the African plume reaching continuously from the core-mantle boundary to the surface. “The dataset we use to assess this feature is pretty diverse,” says Ritsema. “I believe [the plume] is continuous.” The image shows a great blob of seismically slow rock appearing in the lower mantle beneath the southern tip of Africa, narrowing in the upper mantle, and bending northeastward to rise beneath the Afar Triangle, where the Red Sea, the Gulf of Aden, and the East African Rift mark volcanic rupturings of the plate.

    “It's a very intriguing structure,” says seismologist Andrew Nyblade of Pennsylvania State University in University Park, who agrees that the previously suspected continuity seems to be real. Seismic data from the Pacific are still sparse, but the chemistry of lavas erupted onto the sea floor suggests that a plume has risen there, too.

    And both plumes seem to be on the move. Some recent seismic studies have suggested that the hot rock in the lowermost portion of the African plume might have a different chemical composition than its surroundings. If it were something heavy like iron that made the difference, the plume—which would otherwise be rising through the surrounding, cooler rock like a hot-air balloon—might be stagnant or even sinking. But at the AGU meeting, Forte presented calculations made with Jerry X. Mitrovica of the University of Toronto suggesting that the two superplumes are indeed rising.

    In a sense, they used Earth's core to weigh the mantle's two superplumes. Positioned on opposite sides of the core in the plane of the equator, the superplumes would tend to squash the core if they were heavier and sinking. But if they were lighter and rising, they would pull the core into a flattened egg shape lying in the plane of the equator. Forte and Mitrovica calculate that without two buoyant plumes rising on opposite sides of the core, it would still be flattened, by about 150 meters, simply due to the squeeze of slabs sinking to the north and south. If the plumes are rising as fast as their seismically inferred temperatures suggest they should be, the calculated flattening is 500 meters. And that is just the amount geophysicists have inferred from subtle wobblings in Earth's rotation.
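
    The buoyancy question can be framed with textbook numbers (ours, chosen for illustration; this is not Forte and Mitrovica's actual model): rock hotter than its surroundings by a temperature difference ΔT is lighter by roughly ρ α ΔT, a deficit small enough that a modest compositional excess, say from extra iron, could cancel it.

    ```python
    # Rough thermal buoyancy of a mantle plume (illustrative values only,
    # not the published model): density deficit = rho * alpha * delta_T.
    rho = 5000.0     # lower-mantle rock density, kg/m^3 (approximate)
    alpha = 1.5e-5   # thermal expansivity, 1/K (approximate)
    delta_T = 300.0  # plume excess temperature, K (assumed)

    thermal_deficit = rho * alpha * delta_T   # kg/m^3 lighter than surroundings
    print(f"thermal density deficit: {thermal_deficit:.1f} kg/m^3")  # ~22.5

    # That is under 0.5% of the rock's density, so a chemical density excess
    # of comparable size (e.g., from extra iron) would leave the plume
    # neutrally buoyant -- which is why "hot" alone does not mean "rising".
    ```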

    “I couldn't come anywhere close to the observed flattening if I assumed these megaplumes were stagnant,” says Forte. Seismologist Thorne Lay of the University of California, Santa Cruz, tends to agree: “The idea I prefer at the moment is that [the African plume] is a large upwelling or a set of upwellings.”

    A massive plume rising from near the core could help explain how Africa looks at the surface: uplifted and pocked with volcanoes. Last fall, tectonophysicists Cynthia Ebinger of the University of London, Royal Holloway, in Egham and Stanford University's Norman Sleep explained a host of surface features by proposing that a single plume hit the African plate beneath Ethiopia 45 million years ago, then spread across at least 5000 kilometers of the underside of the plate, channeled by preexisting “inverted ruts” in the plate.

    In addition to spewing voluminous lavas across Ethiopia 30 million years ago, the still-buoyant spreading plume would have raised East Africa as it spread and fired up volcanic hotspots from Cameroon on the west coast to the Comoro Islands off Madagascar in the east. “I thought it was nuts at first,” says geophysicist Bradford Hager of the Massachusetts Institute of Technology, but after doing some modeling of his own, he thinks the one-plume-serves-all idea “is probably a neat idea, though I wouldn't bet the farm on it yet.”

    To the south, the deeper part of the African plume may be making itself felt as well. Most of southern Africa and the surrounding sea floor is half a kilometer or more higher than it ought to be, as Nyblade and Scott Robinson of Penn State have pointed out. Tectonophysicist Carolina Lithgow-Bertelloni of the University of Michigan in Ann Arbor and seismologist Paul Silver of the Carnegie Institution of Washington's Department of Terrestrial Magnetism recently suggested that this African “superswell” gets its boost not from thicker crust, as others had suggested, but from the plume. They calculate that the plume's buoyancy, as inferred by seismic imaging, is just enough to produce a bulge in the overlying surface that matches the superswell in size and height.

    On a global scale, the African plume seems to be a sizable part of the grand heat engine that shapes the surface while slowly draining the life from the planet. The two superplumes rise like opposing pistons on opposite sides of the world, while descending slabs form a ragged north-south curtain between them. Slabs have been sinking in much the same places for the past couple of hundred million years, notes Hager, so their failure to cool the mantle beneath the Pacific and Africa probably led to overheating and superplume formation.

    If so, then plume tectonics could join plate tectonics as a prime mover in a range of terrestrial phenomena. Marine geophysicist Roger Larson of the University of Rhode Island in Kingston has proposed that the Pacific plume and its superswell may be just the remnants of a superplume that burst to the surface about 120 million years ago, gushing lava onto the sea floor and jerking Pacific plate tectonics into high gear (Science, 15 February 1991, p. 746). Through a chain of speculative links, Larson invokes volcanic gases that stoked a greenhouse warming, oceans that spilled onto the continents to form inland seas, and a plume-chilled core that shut down the flip-flopping of its magnetic field. Now that, if true, would garner some glory for plume tectonics.

  14. NEUROSCIENCE

    The Mapmaking Mind

    1. Marcia Barinaga

    Studies in monkeys are revealing how the brain manipulates maps of sensory information to guide our movements through the world around us

    You are eating a meal. Never shifting your gaze from your dinner companion, you spot your wine glass out of the corner of your eye, reach for it, and take a drink. Casting a glance to your plate, you spear a piece of meat with your fork and bring it to your mouth. Making these movements seems simple—just a matter of using your eyes to direct your hands. But that impression belies the complex manipulations of visual information that your brain must perform before it can guide your activity.

    “If you could only process information that came out of the optic nerve, you would never be able to localize objects in space,” says neuroscientist David Sparks of Baylor College of Medicine in Houston. That's because the images that fall on your retina are of limited use for finding the objects themselves, unless the brain knows the orientation of your eyes in your head, and of your head on your shoulders. The same is true for information received through other senses, such as hearing: Each one delivers a map of the world that must be transformed before it can guide motor activity.

    In a series of recent experiments, including one described on page 257 of this issue, neuroscientists eavesdropping on neurons in monkeys are learning how the brain manages this feat. They are finding that as the brain passes information along the pathway from sensation to movement, it modifies the maps, adding information to them and replotting them in coordinates that are useful for the movements it needs to direct. The experiments have pinpointed areas where the transformation is largely complete, and areas where it appears to be in process. And they suggest that the default frame of reference in the primate brain is the visual world, perhaps reflecting the overriding importance of vision among the senses.

    “It is a fascinating area” of research, says neuroscientist Richard Andersen of the California Institute of Technology, the senior author on the Science paper. “It has to do with the whole issue of how we perceive the world as stable, and how we can adjust our movements” to interact with it, even though our eyes and bodies are never still.

    Researchers have known for decades that many of the brain areas that first receive sensory information are organized spatially in a way that resembles a map of the sensory world. Sensory neurons have so-called “receptive fields”—locations that they survey and to which they respond. Visual neurons, for example, respond to particular patches of the retina, and all their receptive fields together describe a map of the retina. In brain areas that respond to touch, the neurons' receptive fields map out the surface of the skin; auditory neurons map the three-dimensional space around the head. But, just as a street map won't help you find the highest spot in town unless someone adds elevation data, those sensory maps don't have all the information the brain needs to direct the body's interactions with the world.

    Indeed, even keeping track of an object in the visual world requires constant remapping, as eye movements shift the position of the object on the retina. Early evidence of this remapping came in 1980, when Sparks, then at the University of Alabama, and postdoc Lawrence Mays were studying neurons in the superior colliculus, a part of the brain stem that directs eye movements. The neurons there have a memory for visual targets, which the brain uses to direct the eyes to objects. For example, when monkeys are trained to shift their gaze toward a target that flashed somewhere in their peripheral vision a moment earlier, collicular neurons begin to fire as soon as they detect the target in their receptive fields, and they keep firing even when the target disappears, holding the memory of its location in retinal map coordinates. The brain uses those remembered coordinates to calculate the direction and distance it needs to shift the eyes to look at the spot.

    Sparks wondered what would happen to that memory of retinal position if the monkey's eyes took a detour on their way to the target location, rather than going there directly. Such a move would shift the position of the remembered target relative to the retina, so that it would fall on a different patch of retina, surveyed by different collicular neurons. The brain would need to incorporate that shift into its memory of the target, in order to move the eyes correctly to the spot where the target had been.

    To see how the brain would compensate for the shift, Sparks trained monkeys in a task that inserted an intermediate eye movement. While they sat with eyes fixed ahead, two spots appeared in their peripheral vision and then disappeared. After a brief wait, the monkeys looked first to the location of one spot, then to the location of the other.

    While the monkeys did this, Mays recorded the activity of a particular set of neurons, those whose receptive fields covered the spot on the retina where the second remembered target would fall once the monkeys' eyes had made the first move. Those neurons never actually sensed the spot of light, because by the time the eyes moved, it was gone from the screen. But they seemed to get the message that the location of the remembered target had shifted into their receptive fields, because as soon as the eyes moved, they began to fire, marking the new retinal position of the target. That result, Sparks says, showed that collicular neurons receive information about the position of the eyes and use it to adjust their map of the world, providing “an updated signal of the movement required to look to the spot.”
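
    In vector terms, the updating Sparks describes is simple bookkeeping, as the minimal sketch below shows (the variable names are ours; the brain's implementation is of course distributed across populations of neurons): subtract each eye movement from the remembered retinal location of the target.

    ```python
    import numpy as np

    # Minimal sketch of eye-centered remapping (illustration, not a neural
    # model): a target is remembered in retinal coordinates, and every eye
    # movement shifts the stored location in the opposite direction.

    def update_after_saccade(target_retinal, eye_movement):
        """Return the target's retinal coordinates after the eyes move."""
        return target_retinal - eye_movement

    remembered = np.array([10.0, 5.0])   # degrees, relative to current gaze

    # The "detour": the eyes first jump 8 degrees rightward, so the
    # remembered target now falls on a different patch of retina...
    remembered = update_after_saccade(remembered, np.array([8.0, 0.0]))

    # ...and the updated retinal vector is exactly the movement still
    # needed to look at the spot where the target had flashed.
    print(remembered)   # [2. 5.]
    ```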

    Such remapping within an eye-based reference frame may be all the brain needs to shift the eyes to a visual target. But how does an animal direct its eyes to an object it has heard, not seen? In 1984, Martha Jay, a student with Sparks, found that the superior colliculus transforms other types of sensory information into an eye-based reference frame, apparently in preparation for moving the eyes. She trained monkeys to look toward the source of a sound in a dark room while she recorded from sound-responsive collicular neurons. If those neurons were behaving like pure auditory neurons, and simply localizing the sound source with respect to the head, it shouldn't matter where the animals' eyes were looking, Sparks says. “But that's not what we found.”

    When Jay got the animals to shift their gaze, neurons that had responded to the sound in previous trials gave no response, and other neurons instead became active. That finding suggested that the neurons used data on eye position to generate an eye-based map of the sound source, containing the information necessary for the monkey to look to the sound.

    More recently, researchers have studied how the brain remaps sensory information into frames of reference tied to a limb—which is what has to happen before you can, say, reach for a glass of wine. In 1994, Michael Graziano, working with Charles Gross at Princeton, mapped the receptive fields of monkey neurons in the premotor area, which helps control limb movements. Others had shown that neurons in this area respond both to visual stimuli and to touch, and Graziano found that the responses were closely linked. If a neuron responded to touches on a particular area of the body, it also fired when visual objects came within 20 centimeters of that area. “The neuron seems to be interested in a chunk of space near the body, near the tactile representation,” says Graziano. “Anything entering that chunk of space will activate the neuron.”

    When Graziano then had the monkey shift its gaze to a different place, the same neurons continued to respond to objects near the same locations on the animal's arm, even though the image of those objects now fell on a different part of the retina. “[Normally] a visual receptive field would move when the eye moves,” says Graziano. “But these don't. The visual receptive field is anchored to the tactile receptive field.” Such maps aren't limited to the hands and arms; Graziano and Gross checked the premotor area that controls head and facial movements, and found similar maps there. Soccer players provide a good example of how important this information can be, he says. In the midst of play, he notes, “they have to encode the position of the ball relative to each of their body parts—head, knee, foot,” and premotor maps such as those he found can help them to do just that.

    Discoveries like these show that, in many cases, by the time sensory information reaches brain areas near the end of the sensation-to-movement path, it has been translated into map coordinates that can guide the movements those areas control. But earlier in the processing path, in a brain area called the parietal cortex, researchers have found a fascinating mixed bag of transformations.

    In some cases, major map transformations have been made by the time the information reaches the parietal cortex. Last year, for example, Jean-René Duhamel and Carol Colby, then postdocs with Michael Goldberg at the National Eye Institute, found a radically transformed visual map in a parietal area called the ventral intraparietal area (VIP). This region seems to be involved in locating objects with respect to the head and face. Some neurons in VIP respond to touch sensations on the head and face, and also have visual receptive fields that—like those Graziano and Gross found in the premotor area—“remain tied to a portion of the skin surface” regardless of which way the eyes are looking, says Colby, who is now at the University of Pittsburgh.

    In other cases, the parietal maps seem only partially transformed, like a work in progress. Brigitte Stricanne and Pietro Mazzoni, students in Andersen's lab when he was at the Massachusetts Institute of Technology, reported in 1996 that sound-sensitive neurons in the lateral intraparietal area (LIP), which keeps track of objects that are possible targets for eye movements, encode the location of a sound source in eye-based coordinates. Although that finding is reminiscent of Jay's and Sparks's results in the superior colliculus, this map seemed to be incomplete. “It was predominantly eye-centered, and that was consistent with what was found in the colliculus,” says Andersen, “but there were also cells that weren't in eye-centered coordinates … and there were cells that were intermediate. That suggested to us that the transformation could actually be taking place there. But we don't know that for sure.”

    Andersen and his team expected they would also find at least a partial remapping of visual information in the parietal reach region (PRR). Identified in 1997 by Larry Snyder, then a postdoc in Andersen's lab, this region seems to specialize in tracking objects that the animal can reach out and grasp. Some PRR neurons track those objects visually, and Andersen and Snyder expected that these neurons' receptive fields would be “limb-centered”—linked to a spot on the animals' arms or hands. After all, says Andersen, “to move your hand to a target, you need the location of the target with respect to the limb.” But as the team reports in this issue, that turned out not to be the case.

    Graduate student Aaron Batista and postdocs Snyder and Chris Buneo trained monkeys to hold their eyes still while one of several buttons in front of the monkeys lit up briefly, and then—after a pause—to reach out and push that button. The researchers identified PRR neurons that fired during the waiting period, marking the location of the target. Then they had the monkeys fix their eyes in a new direction and repeated the task two ways. In one case, they had the monkey reach to the same button. That used the same arm movement as before, but because the animal's eyes had moved, it put the target at a different retinal location. In the other case, the monkey reached to a different button, chosen because its image fell in the same place on the retina as the first button had before the eyes moved.

    The results, reported on page 257, show that after the eye movement, the same neurons responded only when the reach target occupied the same retinal location, even though the direction of the reach had changed. That shows the PRR neurons' receptive fields are tied to retinal, not limb, coordinates. The findings may reflect an economy on the part of the brain, says Snyder, who is now at Washington University in St. Louis. The PRR pays attention to objects that might be targets for reaching, he says, but it “makes perfect sense” that “if there are 20 things out there that you can reach for, you aren't going to want to transform every one of those into reach coordinates” before you know which one you're actually going to reach to. The brain might wait until later in the processing stream to make the transformation, when there's been a decision to reach for a particular object.

    Other things being equal, in fact, the primate brain seems to favor keeping its maps in visual coordinates. An unpublished study from Andersen's lab underscores the dominance of vision in mental mapmaking: Former postdoc Yale Cohen, now at Dartmouth, trained a monkey to reach toward a sound rather than a visual target. Even though the information was coming in via the auditory system and going out as a reach, it was encoded in the PRR in visual coordinates. The neurons that responded to the sound shifted when the animal moved its eyes, even though neither the source of the sound nor the direction of the reach had changed.

    This preference for eye-based maps may reflect the fact that “vision is the dominant sense in primates, and it seems to provide a common framework for coding all kinds of spatial information,” says Colby. What's more, she notes, we generally first move our eyes to something that has drawn our attention, regardless of what we are going to do next. “It may be that in the parietal reach region, things are in eye-centered coordinates because you normally look at the thing you are reaching for,” Colby says.

    A big question that remains is just how such sense-based maps ultimately get redrawn. The brain can get information about the position of the eyes, head, and limbs from neurons in the motor areas that control them, says neuroscientist Jennifer Groh of Dartmouth College. She and others have developed computer models that suggest how the brain could incorporate that information into new maps that reflect the body as well as its environment. Those models will help researchers devise experiments to tackle the question, and take the next step toward understanding how the mind maps the world.
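
    One family of such models uses “gain fields,” in which eye position scales, rather than shifts, a neuron's visual response; a cartoon version (all numbers below are invented for illustration) conveys the idea.

    ```python
    import numpy as np

    # Cartoon of a "gain field" model (one family of models in this
    # literature; details invented for illustration): a unit's retinal
    # tuning is multiplied by a gain that depends on eye position, so its
    # firing carries both pieces of information a downstream reader needs
    # to recover the target's head-centered location (head = retina + eye).

    def unit_response(target_retinal, eye_pos, preferred_retinal, eye_weight):
        retinal_tuning = np.exp(-0.5 * (target_retinal - preferred_retinal) ** 2)
        gain = max(0.0, 1.0 + eye_weight * eye_pos)  # eye position scales response
        return retinal_tuning * gain

    # Identical retinal stimulus, different eye positions, different firing:
    for eye in (-10.0, 0.0, 10.0):
        print(eye, unit_response(5.0, eye, preferred_retinal=5.0, eye_weight=0.05))
    # -10.0 -> 0.5,  0.0 -> 1.0,  10.0 -> 1.5
    ```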

  15. EVOLUTION '99

    DNA and Field Data Help Plumb Evolution's Secrets

    1. Elizabeth Pennisi

    MADISON, WISCONSIN—By the shores of Lake Mendota from 22 to 26 June, biologists at Evolution '99 revealed some surprising twists in nature's evolutionary course, from the ills of inbreeding to a complex three-way relationship among plants and moths.

    Evolutionary Ménage à Trois

    For insects, as for people, the way to get the most out of life is often to form partnerships. In the case of several nondescript white moths, the partner is a desert plant, the yucca. At first glance, the relationship seems quite amiable: the moth pollinates yucca flowers and then lays its eggs there, where nascent seeds nourish the developing larvae.

    But moths, like people, sometimes cheat on their partners when a third party enters the relationship, evolutionary biologist Olle Pellmyr from Vanderbilt University in Nashville, Tennessee, reported at the meeting. When two kinds of moths depend on the same yucca, “it's a different game,” he says, opening the way for one moth to become a parasite of both the yucca and the other moth. “The bottom line is one [moth] can turn into a cheater if it can get another moth to carry the [pollination] burden,” says Pellmyr.

    [Figure] Slacker. Some moths that depend on the yucca plant have shifted from pollinators (left) to parasites (right). (Credit: Pellmyr)

    The work “is an exquisite demonstration,” showing “a deep understanding of evolutionary interactions,” says Douglas Futuyma, an evolutionary biologist at the State University of New York, Stony Brook. “Even in mutualistic interactions, there are conflicts” over each partner's share of seeds, for example, “and this can lead to the evolution of parasitism.”

    Since 1992, Pellmyr and his colleagues have been documenting relations between 35 yucca species and the 13 moths that tend them. The system seems elegantly balanced: The moths have evolved large, specialized mouth parts to gather yucca pollen, and the plant guards against the moth larvae eating too many seeds by monitoring the egg load and dropping flowers with too many eggs before the seeds and fruit mature (Science, 25 August 1995, p. 1046). Some moths specialize on one yucca; others visit several species. But two of the moth species are different. They are “cheaters”: They no longer have the pollen-gathering mouth parts, and they lay their eggs late, directly into the fruit or seed after the seed is set—thus avoiding the yucca's defense.

    Pellmyr and his colleagues have now analyzed the cheaters' DNA, which shows that they evolved separately, and studied their ecology. The fieldwork shows that the cheaters flourish only where there is another moth also depositing its eggs on the same yucca species. Pellmyr concludes that a three-way relationship drove the moths to cheat. With two moths going after the same flowers, egg overload could be rampant. But if one moth species showed up late, it could bypass the plant's ability to drop flowers containing too many moth eggs and could rely on the other, hard-working moth for pollinating duty.

    For example, in Florida, where two moth species, Tegeticula yuccasella and T. cassandra, head for the same yucca, a new cheater species has split off from T. cassandra. Both moths inject their eggs just under the surface of yucca fruit, but T. cassandra deposits its eggs on day 1, and the cheater waits until 5 days later, Pellmyr reported. Furthermore, this cheater has not only stopped spending its energy growing special mouth parts and pollinating, but it has also lost its dependence on a single yucca species, and has spread westward, taking advantage of other moth-yucca partnerships all the way to New Mexico, Pellmyr says.

    The other cheater also arose from a situation in which two moths depended on the same yucca. This second cheater visits a variety of yuccas in its southwestern U.S. habitat and is “a little more evolved,” Pellmyr notes. It arrives very late, encountering 3-week-old fruit, and has a special knifelike appendage that it uses to place eggs deep into the hard fruit, directly onto the seeds.

    Though the two cheaters originated in very different parts of the United States, Pellmyr's team has now found plants in New Mexico with both cheaters as well as a faithful pollinator. “We're itching to go study this more,” he says. He also expects that this yucca has some additional means of keeping the parasite in check, because otherwise the cheaters could overwhelm the plant, and the yucca and its faithful pollinator would likely go extinct. No matter what turn the moth-yucca soap opera takes, says Futuyma, “the work is rapidly becoming a classic in evolutionary ecology.”

    The Perils of Genetic Purging

    Charles Darwin was the first to document the evils of inbreeding, when he discovered that morning glories that fertilized their own flowers produced fewer seeds and stunted seedlings. But Darwin also found that inbreeding depression, as this decrease in fecundity is now called, could be reversible: After several generations of inbreeding, the morning glories recovered, producing a line so healthy and fertile that he named it Hero. Darwin couldn't explain the phenomenon, but a century later, in the 1980s, several researchers concluded that inbreeding purges deleterious genes from a population, and some suggested that it might help small populations of endangered species recover their vitality.

    At the meeting, however, two evolutionary biologists warned that this “genetic purging” strategy works inefficiently if at all. Inbreeding small endangered populations any more than is necessary is “a bad idea,” as evolutionary biologist Diane Byers of Illinois State University in Normal puts it. She reached that conclusion after analyzing 52 published plant experiments with evolutionary biologist Donald Waller of the University of Wisconsin, Madison.

    Inbreeding depression occurs in part because individuals in a small, inbred population are more likely to inherit two copies of rare, deleterious versions of a gene. In bigger populations, the chance that one individual will inherit two copies of such genes is quite small, and those who carry only one copy don't express the negative traits. But the idea behind genetic purging is that, in small populations, the bad genes should gradually die out over time, because individuals carrying two copies of such genes are less likely to produce young.
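
    The arithmetic behind both halves of this argument is standard population genetics (the worked numbers below are ours, not from the talks): a deleterious allele at frequency q is exposed in homozygotes at frequency q² under random mating, and inbreeding with coefficient F raises that to q² + Fq(1 − q).

    ```python
    # Standard population-genetics arithmetic (illustrative numbers, not
    # from the studies discussed): how often a rare deleterious allele is
    # exposed in homozygotes, with and without inbreeding.
    q = 0.01  # frequency of the deleterious allele (assumed)

    def homozygote_freq(q, F):
        """Hardy-Weinberg homozygote frequency adjusted for inbreeding F."""
        return q**2 + F * q * (1 - q)

    random_mating = homozygote_freq(q, F=0.0)   # 0.0001, i.e., 1 in 10,000
    full_sib = homozygote_freq(q, F=0.25)       # offspring of sibling matings

    print(random_mating, full_sib, full_sib / random_mating)
    # 0.0001  0.002575  ~26x: inbreeding exposes rare recessives far more
    # often, which is why inbreeding depression hits small populations hard.
    ```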

    Both plant and animal breeders often use purging to remove potentially lethal genes, mating individuals with close relatives until the trait disappears. And evolutionary geneticist Alan Templeton of Washington University in St. Louis found that further inbreeding decreased inbreeding depression in an endangered gazelle. So conservation biologists began to think that purging might help save other rare species.

    To find out how well purging might work in nonagricultural plants, Byers and Waller gathered all the relevant studies, focusing on those that compared the amount of inbreeding depression in two sets of plants with different histories of inbreeding. If purging worked, then plants with a longer history of inbreeding—for example, those that had been self-fertilized for many generations—should show less inbreeding depression.

    Overall, only 38% of the studies found definite evidence of genetic purging. And even when purging occurred, it removed only about 10% of the deleterious genes. “[Inbreeding depression] is not going to be eliminated, it's not even going to be cut in half,” says Waller, though he admits that researchers don't know exactly why purging is so inefficient.

    These results signal “a really important paper,” says Lukas Keller, an evolutionary biologist at Princeton University in New Jersey. The analysis shows that the threat from deleterious genes “is not going to be reduced to zero” by purging. That means that further inbreeding of an endangered population carries a high risk of inbreeding depression, says Keller. “With conservation of endangered species, you can't afford that,” he adds. “If you lose [individuals], that's a catastrophe.”

    Tracking a Sparrow's Fall

    In the winter of 1989, harsh weather killed off all but 11 of some 200 song sparrows living on Mandarte Island, off the coast of British Columbia, Canada. For Princeton University's Lukas Keller, this catastrophe was one of a series that offered a rare opportunity to study the genetic costs of inbreeding. Inbreeding depression, in which recessive, deleterious genes are expressed in small populations, is well known in domestic animals and plants. But documenting it in the wild has proved surprisingly difficult, because researchers rarely know the family history of all the individuals in a wild population.

    However, on Mandarte Island, researchers led by ecologists James N. M. Smith and Peter Arcese of the University of British Columbia have tracked every sparrow birth, death, immigration, and mating since 1959. Those detailed records allowed Keller to count the offspring of each bird that survived two population crashes, in 1979–80 and 1989, and match those numbers with the degree of relatedness between each bird's parents.

    “In [inbred] females, there was a significant decrease in reproductive success,” Keller reported at the meeting. The daughters of two siblings or of a parent mating with an offspring produced 48% fewer young than daughters of unrelated parents, while sons whose parents were closely related suffered only a slight drop in fecundity. The eggs of inbred daughters were less likely to hatch than eggs fertilized by inbred males.

    Such documentation of reduced fitness is important, researchers say, because most evidence of inbreeding in wild populations is indirect, relying on signs such as abnormal sperm, as in cheetahs. “[Keller] has demonstrated that inbreeding has effects in natural populations,” says Donald Waller, an evolutionary biologist at the University of Wisconsin, Madison. “That is usually very hard to measure.”

    Keller also studied how the sparrows' genetic diversity recovered after the population crashes. Working with molecular biologists Michael Bruford and Kathryn Jeffery of the Institute of Zoology in London, he analyzed microsatellites, small pieces of repetitive DNA that are highly varied in most populations. Microsatellite diversity dropped immediately after the population crash of 1989, “but it only took 3 years for the [diversity] to go back to normal levels,” Keller reported.

    In this case, Keller knew why the diversity rebounded so quickly: new blood. On average, one new sparrow arrives each year from the mainland to make Mandarte Island its home. Most evolutionary theory on the recovery of genetic diversity after a bottleneck considers only the restorative effects of mutations, which accumulate very slowly. But this result, Keller says, “should make people reexamine the importance of immigration.”
