News this Week

Science  13 Jul 2001:
Vol. 293, Issue 5528, pp. 186



    Rumors and Trial Balloons Precede Bush's Funding Decision

    1. Gretchen Vogel

    Opponents and supporters of embryonic stem (ES) cell research reached a rare moment of agreement last week. When aides to President George W. Bush floated options for a possible compromise on government funding for such research, both sides said the proposals were unacceptable.

    Rumors about a compromise came in the midst of a barrage of intense lobbying. Last week the White House spokesperson was dogged by questions, while newspapers across the country ran front-page stories and passionate editorials. The once-arcane debate over the ethics of working with these cells, which are derived from week-old embryos, even dominated Sunday morning political talk shows. As Science went to press, Bush's decision was said to be imminent.

    The president is caught between two politically powerful ideas. ES cells, which in theory can become any type of cell in the body, are touted as possible cures for diseases including diabetes and Parkinson's. The sticking point is their origin: embryos left over from fertility treatments, slated to be discarded. Opponents of the research believe such embryos deserve protection as human life, while advocates say these embryos have no chance at life and cells derived from them promise widespread benefits. Under guidelines crafted under the Clinton Administration, now on hold, the National Institutes of Health (NIH) can support research on ES cells, as long as they have been derived by privately funded researchers in accordance with ethical standards. But NIH-funded researchers cannot derive the cell lines themselves. Many researchers say that arrangement is compromise enough.

    But Bush was considering proposals last week that would restrict NIH even more. One idea the White House floated was to allow NIH to fund this research—but only on already-existing cell lines. In other words, when new cell lines become available, NIH researchers would not be able to work on them. Others hinted that Bush might support giving government grants to private groups who could then divert their own funds to ES cell research.

    The first proposal is unworkable for several reasons, say advocates for ES cell research. For one, all known ES cell lines were derived by privately funded researchers, and many have commercial strings attached that could limit research. Scientists also say they need a variety of cell lines to determine which might be most useful for treatment—dozens or even hundreds, says developmental biologist Douglas Melton of Harvard University.

    The anti-ES cell coalition is no happier. Although limiting work to existing cell lines would not directly cause the destruction of additional embryos, it would still rely on that immoral act, says Richard Doerflinger, spokesperson for the National Conference of Catholic Bishops. “I don't think such a compromise works on any plane,” he says. “It does not resolve the ethical problem, and it doesn't resolve the scientific problem”—that is, the need for new cell lines.

    Pro-research advocate Tony Mazzaschi of the Association of American Medical Colleges agrees. “The talk of a compromise is really foolish. You have ethical paradigms that can't be bridged. To try to bridge them is to respect neither one. You just have to choose.”

    The other idea does not fare much better. Although the Juvenile Diabetes Research Foundation, for example, would not immediately refuse an offer of federal money, it would be cautious about accepting money with complicated oversight requirements, says spokesperson Julie Kimbrough.

    If Bush does decide to limit research, supporters may have enough allies on Capitol Hill to overrule him. Since late June, dozens of legislators, including many abortion opponents, have stated their support for the research (see timeline).

    Some congressional staffers favor removing the so-called “Dickey amendment” from next year's Health and Human Services (HHS) funding bill. That amendment forbids HHS (which oversees NIH) to spend money on research that harms or destroys human embryos. Senator Arlen Specter (R-PA) has introduced a bill in the Senate that would allow NIH to fund derivation of new ES cell lines, and a companion bill has been introduced in the House of Representatives. On 8 July Specter said on CBS's Face the Nation that he had counted 70 votes in favor of his bill—enough to pass it even over a presidential veto.

    “The president is very aware that there is a balance on this issue where there is so much potential for health and for breakthroughs,” White House spokesperson Ari Fleischer said on 9 July. “The president … is listening to all sides of the debate.” Many believe he will announce his decision before a 23 July meeting with Pope John Paul II.

    The Stem Cell Saga


    13 June Senator Orrin Hatch (R-UT) writes to the Bush Administration supporting embryonic stem (ES) cell research.

    20 June NIH sends scientific review of stem cell science to Health and Human Services Secretary Tommy Thompson.

    27 June U.S. Representative Jennifer Dunn (R-WA), with 37 Republican colleagues, sends letter urging President Bush to support ES cell research.

    1 July Coalition of Americans for Medical Research urges more than 70,000 patient activists to lobby their representatives over the 4 July holiday.

    2 July House Republican leaders Richard Armey (TX), Tom DeLay (TX), and J. C. Watts (OK) urge Bush not to allow funding for ES cell research.

    2 July Main Street Republicans, a group of 57 moderate Republicans, issue statement supporting ES cell research.

    3 July Press reports possible “compromise” from White House aides.

    3 July Bush tells journalists he'll decide the issue “in a while.”

    5 July Conservative columnist William Safire urges Bush to support ES cell research.


    Spain Cuts Off Aid to Foreign Ph.D. Students

    1. Xavier Bosch*
    1. Xavier Bosch is a science writer in Barcelona.

    BARCELONA—Eduardo Agatângelo came to the Autonomous University of Barcelona (UAB) from his native Angola last year under a new program to help promising students from the developing world earn Ph.D. degrees. But last month the Spanish government pulled the plug on its 3-year commitment to Agatângelo and hundreds of other students from around the world, shifting the money to target links with Latin America. The move has left many students angry at their host country and anxious about their chances of becoming scientists. “This worries me greatly,” says Agatângelo, who is seeking a degree in food science. “I have no prospects to continue my Ph.D. training in Angola.”

    In 1998 Spain's Agency of International Collaboration (AECI) expanded a program, begun at the end of World War II, that awards competitive 3-year training grants to deserving graduate students from the developing world. The program now supports more than 1200 students from 40 countries. But last month the AECI announced that it would transfer $3.6 million from the grants program into a new entity, the Carolina Foundation, to support cultural and education programs in Latin America, including science courses for biomedical postdocs. An AECI official says the foreign grants program was too expensive and that the students, instead of returning to their home countries, were using the training as a kind of work placement program to land jobs in Spain—a characterization that the students deny.

    The AECI's decision means that some 900 foreign students may soon be homeward bound. Last month the agency informed first- and second-year students like Agatângelo that their training grants would be extended by 1 year. The roughly 350 students who were completing a third year without earning a Ph.D. were told that their support would end on 30 June, the last day of the academic year. The AECI said it would no longer grant extensions to allow such students to finish their degrees.

    Taking action.

    Barcelona students protest the Spanish government's decision to curtail grants program.


    The news left students up in arms, leading to demonstrations in Barcelona and Madrid. “This situation places hundreds of researchers in a situation of economic precariousness,” says Silvina van der Molen, an Argentinean who has just completed her third year of studies in ichthyology at UAB and was hoping to finish her Ph.D. next year.

    Faculty members have also condemned the AECI's hard line. The Spanish Council of University Rectors, representing 64 universities, criticized the disruption to the students' lives and work. Echoing that theme, officials at 11 universities in Catalonia say the decision “is harming not only the Ph.D. students but also the research institutions where they are developing their training grants and their corresponding countries.”

    The backlash has sent AECI officials backpedaling. Jesús Silva, the Foreign Office's director of cultural and science relations, says that the agency will now give “a few extra months” of support to third-year students on the verge of completing degrees and may give selected foreign students a few months of grant support for study in Spain. But university officials remain upset. The AECI, fumes UAB research vice chancellor Joan Antón Carbonell, is “not taking this matter seriously.”

    A spokesperson for the students says AECI bylaws mandate that these grants run for 3 years. But the students may have little legal recourse: Class-action lawsuits are prohibited in Spain, and individual lawsuits would be costly. Unless the AECI changes its stance, scores of embittered students will be packing their bags for home next year.


    Another Emissary From the Dawn of Humanity

    1. Michael Balter,
    2. Ann Gibbons

    Fossils unearthed in Ethiopia offer a glimpse of the time when humans and chimps first went their separate evolutionary ways—and may represent the earliest known human ancestor. The remains—a jawbone with teeth as well as arm, hand, and foot bones—have been dated at between 5.2 million and 5.8 million years old. From the shape of one nearly complete foot bone, the discoverers conclude that their specimen walked upright, a hallmark of all hominids.

    The find comes hot on the heels of the report of 6-million-year-old bones found in Kenya's Tugen Hills, also hailed by their discoverers as belonging to the earliest known hominid (Science, 23 February, p. 1460). The two creatures share a crucial feature: Both appear to have lived in relatively wet woodlands. If either is indeed a hominid, that could overturn a long-held theory that bipedalism evolved when forest-dwelling apes moved out into open savannas, possibly as a result of climate change. “The recent findings … challenge some long-cherished ideas about the mode and timing of hominid evolution,” says Brigitte Senut of the National Museum of Natural History in Paris, co-leader of the Tugen Hills team, which has dubbed its specimen Orrorin tugenensis.

    The Ethiopian fossils were found between 1997 and early this year by an Ethiopian-American team led by Yohannes Haile-Selassie and Tim White of the University of California, Berkeley. The dates—determined by argon-argon dating and confirmed by paleomagnetic measurements and analysis of animal bones found in the same sediments—place the teeth and bones around the time when most geneticists believe that humans and chimpanzees split from a common ancestor, between 6 million and 9 million years ago. “But the closer you get to the branching point, the harder it is to say what the characters are that define hominids,” says Rick Potts of the Smithsonian Institution in Washington, D.C.


    Tooth of oldest known hominid?


    Describing their find in two papers in this week's issue of Nature, the team has named its specimen Ardipithecus ramidus kadabba, as it may represent a subspecies of the 4.4-million-year-old Ardipithecus ramidus fossils first reported from Ethiopia in 1994. The researchers argue for hominid status for Ardipithecus ramidus kadabba partly because of its lower canine teeth, which in cross section are diamond-shaped like those of later hominids rather than V-shaped like those of apes. They also note that the foot bone has features—such as a joint's orientation—similar to those of later hominids, including the famous 3.5-million-year-old Lucy. This orientation “suggests bipedality,” says Juan Luis Arsuaga of the University of Madrid, although “the evidence is still weak” because so far it is based primarily on a single bone. Fred Spoor of University College, London, agrees that the jury is still out on hominid status: “Neither this nor the Orrorin paper make a really watertight case,” he says.

    The Orrorin and Ardipithecus teams each assert that the other's fossils could represent an ancestor of chimps or other apes, rather than one of our early human ancestors or cousins. Figuring out who's right is hard: Although numerous hominid species have been unearthed over the years, no fossils representing the chimp evolutionary line have ever been discovered. “Our obsession over the earliest hominids is a bad habit,” says Daniel Lieberman of Harvard University. “Finding the earliest known chimpanzee would be just as exciting.” To help resolve the debate, Haile-Selassie says the team will go back into the field in November to search for more fossils.

    Whatever they dig up could offer an important piece to the evolutionary puzzle. “It is a mistake to feel that one has to squeeze this [find] into the category of human or chimp ancestor,” says Bernard Wood of George Washington University in Washington, D.C. “Just to have a fleeting glimpse of these creatures is exciting.”


    Journals Offered Free to Poorest Nations

    1. David Malakoff

    Researchers and doctors in poorer nations will get free or low-priced electronic access to nearly 1000 biomedical journals. The six largest commercial journal publishers agreed this week to open their Internet vaults to universities, laboratories, and health agencies in nearly 100 nations under an initiative led by the World Health Organization (WHO).

    Scientists and health workers in the developing world have long struggled to obtain timely, affordable access to information on new findings and therapies. Many journals are too expensive or arrive months after publication. The 3-year pilot project, set to begin early next year, “is perhaps the biggest step ever taken towards reducing the health information gap between rich and poor countries,” WHO director Gro Harlem Brundtland said at a 9 July press conference in London announcing the deal.

    The six publishers—which publish 80% of the world's top 1240 biomedical journals—have agreed to let WHO set up an Internet portal through which approved institutions can retrieve papers. Initially, says Barbara Aronson, a librarian at WHO's Geneva headquarters, the portal will be free to more than 600 institutions in 63 of the world's poorest nations, mostly in Africa, with per capita incomes of less than $755 annually. Later, WHO hopes to arrange deeply discounted subscriptions for institutions in about 40 nations, including some in Eastern Europe, with per capita annual incomes of up to $3000.

    Free deal.

    WHO's Brundtland, right, and Blackwell's Jon Conibear, left, unveil the plan.


    Health InterNetwork, a United Nations program led by WHO, will help institutions get the necessary hardware, Internet connections, and training. Participants declined to put a price tag on the project, estimated by publishing industry analysts as being worth millions of dollars at normal market rates.

    The list of available journals does not—so far—include those produced by smaller publishers, including such prominent publications as The New England Journal of Medicine, Science, and Nature. “We decided to go for the largest publishers first [rather than] the most prestigious journals,” Aronson says. Many big-name publications already have their own low- or no-cost distribution programs, but Aronson said that their participation in WHO's project would be welcome. Donald Kennedy, editor-in-chief of Science, says the journal would consider all invitations and is already weighing whether to join one effort to make scientific information more easily available in the developing world.

    The six cooperating publishers are Reed-Elsevier, Wolters Kluwer, Blackwell, Harcourt General (which is merging with Reed-Elsevier), Springer-Verlag, and John Wiley & Sons. The Open Society Institute, a charitable group founded by finance billionaire George Soros that already runs its own journal distribution program, will help identify eligible institutions. The British Medical Journal, published by the British Medical Association, also played a role in developing the initiative.


    Fresh Molecule Whets Appetite

    1. Evelyn Strauss

    As appetite researchers feast on a banquet of molecules that control eating behavior, yet another joins the spread. This one comes from a field that has produced samplings of other proteins that transmit signals—but none before that governs hunger.

    “It's a different kind of molecule for influencing these pathways,” says Jeffrey Flier, an endocrinologist at Beth Israel Deaconess Medical Center in Boston. The 20 to 25 players identified so far are either neuropeptides that transmit messages or cell surface receptors that register incoming information. By contrast, this new molecule, called syndecan-3, apparently helps one of these neuropeptides activate a receptor. Fiddling with the newfound regulator, researchers suggest, could open therapeutic avenues for controlling appetite.

    Although syndecan-3 is an unexpected player in appetite control, it works with a molecule well known to regulate body weight, report Ofer Reizes, a postdoc in Merton Bernfield's lab at Harvard Medical School in Boston, and their colleagues in the 13 July issue of Cell. Syndecan-3 belongs to a family of proteins that grab signaling molecules and attach them to receptors. Long thought to act as molecular glue, these proteins—and in particular, their side chain appendages—have recently been shown to help send signals, some of which are crucial for embryonic development. And that's what the Bernfield team was exploring when it discovered syndecan-3's appetite-control powers.

    The researchers were curious about the developmental role of a cousin of syndecan-3, called syndecan-1. To test its function, they engineered a mouse to overproduce the molecule. “We expected to see some sort of developmental anomaly—perhaps an extra digit,” says Reizes. But the mice grew normally, at least for the first 6 weeks. Then they began to gain excess amounts of weight. By adulthood, they were obese.

    The animals' behavioral and biochemical abnormalities resembled those found in obese mice with defects in an established appetite-control network called the melanocortin system. A key protein in the system, called the melanocortin-4 receptor (MC-4R), receives competing signals that tell an animal whether to eat. The receptor can bind a “satiety peptide” that activates the receptor and produces a feeling of fullness. Or it can bind an “antisatiety peptide” that obstructs the satiety peptide and induces hunger.

    The researchers suspected that syndecan-1 was interfering with MC-4R, so they tested how cultured cells operate with and without the syndecan. They already knew that the antisatiety peptide by itself inhibits activation of the receptor. But when the researchers added syndecan-1, the receptor's activity took a deeper dive.

    Shedding molecules, shedding pounds?

    Syndecan-3 collaborates with an antisatiety peptide.


    While intriguing, the results had a major limitation: Syndecan-1 normally resides in skin and related tissues, not in the brain. The transgenic mice were an exception, the team found—they produced syndecan-1 in the brain's hypothalamus, the part that regulates feeding behavior. “My initial response was, it's probably not interesting,” says Flier, who advised the team. He suspected that the transgene was damaging the hypothalamus somehow rather than playing an active role in appetite control.

    To get at the “real physiology,” says Reizes, the group went after a relative of syndecan-1 that normally dwells in the brain: syndecan-3. They tested whether syndecan-3 concentrations rise when mice are hungry, as would be expected if the molecule works in concert with an antisatiety peptide. “Sure enough, that's exactly what we saw,” says Reizes. And without syndecan-3, engineered mice don't eat even after fasting all night. “It's as if they don't perceive that they're hungry,” says Reizes. This result “suggests that syndecan-3 may actually be a normal factor that modulates body weight,” says Flier.

    The researchers propose that when normal mice eat, they shed syndecan-3 from the surface of cells in the hypothalamus; the syndecan-3, in turn, takes the antisatiety peptide along for the ride, removing it from the MC-4R and liberating the receptor so it can bind the satiety peptide (see figure). To test this theory, they engineered two strains of mice. In one, syndecan-1—whose structure they understood well enough to tweak in the appropriate ways—was stuck to the cell membrane; in the other, it wasn't. Only membrane-bound syndecan-1 caused animals to gain weight. That result “strongly suggests that only the membrane-anchored form can potentiate the activity of the antisatiety peptide,” says Carl Blobel, a cell biologist at the Memorial Sloan-Kettering Cancer Center in New York City. Perhaps, he suggests, a future therapy might clip syndecan-3 from the cell surface.

    The new finding adds another molecule to the smorgasbord of biochemical factors that might predispose a person to obesity or leanness. “Studies of mechanisms that regulate feeding are at an exciting juncture,” says Jeffrey Friedman, a molecular biologist at Rockefeller University in New York City. “A rudimentary wiring diagram can be drawn now that includes a number of molecules known to regulate feeding behavior. Syndecan-3 is a new and important element in this system.”


    New Finding Heats Up the Hot Zone

    1. Jon Cohen

    Ebola and its equally gruesome cousin, Marburg, fascinate the public. They've even played starring roles in best-selling books and Hollywood movies. But these terrifying viruses, which typically cause death by hemorrhage within weeks of infection, have attracted relatively few research groups (and scant funding) since they were discovered some 25 years ago. As a result, the modus operandi of these viruses is largely unknown. Now a team led by Mark Goldsmith at the Gladstone Institute of Virology and Immunology in San Francisco, California, has uncovered an intriguing lead: a molecule that helps both viruses infect cells.

    As Goldsmith, Stephen Chan, and co-workers describe in the 13 July issue of Cell, these viruses have an intimate relationship with folate receptor-α (FR-α), which is found on the surface of many cell types. “It's an important finding,” says Paul Bates of the University of Pennsylvania in Philadelphia. “We're now going to get into the nitty-gritty of taking [the discovery] apart and trying to figure out the nature of the interaction,” says Bates, whose own lab also studies how Marburg and Ebola establish infections. If FR-α does turn out to be an important accomplice for viral entry, this insight might point the way to novel treatment and vaccine strategies.

    Many viruses cause infections by first docking onto receptors on cell surfaces. But the Goldsmith team carefully avoids calling FR-α a “receptor” for Ebola and Marburg. Rather, they describe it as a “cofactor for cellular entry”—which is another way of saying that they don't yet know the precise role that FR-α plays in opening the cellular door to these invaders. “At this point we're happy to have any clues about how [these viruses] enter cells,” says Gary Nabel, who heads the Vaccine Research Center at the National Institutes of Health and is part of the small club of investigators who study Ebola.

    Breaking and entering.

    Researchers have found a molecule that helps Ebola (above) and Marburg viruses enter cells.


    The Gladstone researchers found FR-α by exploiting an earlier finding—that Ebola and Marburg do not infect T cells. These cells thus provide an ideal system for testing, one by one, possible factors that control viral entry. Specifically, the researchers made a library of DNA from cells that the viruses can infect, reasoning that these genes produce the missing factor (or factors). They next engineered new T cells to contain one or more of those genes, then “challenged” the modified cells with a chimeric virus that combined a surface protein from Marburg with HIV. (Researchers routinely use such “pseudotype” viruses of both Marburg and Ebola because few labs have the biosafety capabilities to work with the real pathogens.) They soon found that the pseudotype Marburg virus could infect T cells that contained FR-α. The same held true for a pseudotype Ebola virus.

    “We did many, many, many experiments to try and convince ourselves that the results were real,” says Goldsmith. For example, they found that blocking FR-α inhibits viral entry. And, working with Alan Schmaljohn at the U.S. Army Medical Research Institute of Infectious Diseases in Fort Detrick, Maryland—which has one of two labs in the country equipped to handle the hottest viruses—the researchers demonstrated that wild-type Marburg virus could infect T cells only if they were modified to contain FR-α.

    Several caveats remain. Nabel cautions that rather than serving as a receptor for Ebola and Marburg, FR-α might send signals that tell the cell to let the viruses enter. Pennsylvania's Bates adds that he'd like to see more studies done with other cell types. “The three most important targets [for these viruses] aren't addressed in this paper: macrophages, hepatocytes, and endothelial cells,” says Bates. (All have FR-α, to varying degrees.) And both Goldsmith's group and Bates have unpublished data that Ebola and Marburg can infect cells that do not have FR-α—evidence that the viruses may use different pathways to enter different types of cells.

    Goldsmith, too, is cautious about the FR-α findings. “We don't know how things will play out,” he says. But for the small cadre of Ebola and Marburg researchers, the new work offers a tantalizing clue about two diseases that, famous though they are, largely remain a mystery.


    Mission to Saturn Rises From Ashes

    1. Helen Gavaghan*
    1. Helen Gavaghan is a writer in Hebden Bridge, U.K.

    Last year, space scientists got some very bad news: A flaw in a communications device imperiled the high-profile Cassini-Huygens mission. If the problem wasn't fixed, nearly all the data from the Huygens probe, which the Cassini spacecraft is due to release toward Saturn's moon Titan in 2005, would be lost. The reason: Cassini would be unable to interpret the probe's transmissions. As the months passed, researchers dampened their expectations about Huygens's scientific payoff.

    Now they are smiling again. Engineers last month worked out an elegant solution that should save the mission's science. “We were in hell, now we're in paradise,” says Marcello Fulchignoni, a Huygens scientist and astronomer at the Paris Observatory.

    After its anticipated arrival at Saturn in July 2004, the $3 billion Cassini-Huygens mission, a joint operation of NASA and the European Space Agency (ESA), is slated to image the planet and its moons and study its magnetosphere and ring system. Most thrilling to some researchers, however, is Huygens's parachute through Titan's atmosphere, rich in prebiotic molecules such as nitrogen and hydrocarbons. “Although [Titan is] colder than the early Earth and lacking water, we hope to gain some insight into the processes that precede the formation of life,” says Huygens scientist Guy Israel of the French research agency CNRS in Verrieres-le-Buisson.

    On track.

    Huygens glitch is fixed.


    The communication glitch dealt those expectations a huge blow. Last year it became increasingly clear that most of the data Huygens would gather and send back to Earth would be indecipherable. An inquiry board established by NASA and ESA found that engineers had miscalculated the Doppler shift in the probe's transmission, leading to the installation of an incorrect component in the Huygens receiver on Cassini. “We have a technical term for what went wrong here,” says Huygens scientist John Zarnecki of the Open University in Milton Keynes, U.K. “It's called a cock-up.”

    Last fall, the agencies established a working group to look for a solution. After working out the performance characteristics of the receiver, given the anomalies they were observing, NASA engineers in February beamed to Cassini the range of frequencies and signal strengths expected from Huygens. The test confirmed the nature of the glitch. Armed with this information, experts at NASA's Jet Propulsion Laboratory in Pasadena, California, recalculated Cassini's trajectory to achieve an orbital path that allows the receiver to pick up the critical portion of the probe's signal. The new trajectory reduces the Doppler shift, bringing the probe's transmission frequencies within detection range. The fix will not delay the start to Cassini's planned 4-year mission. “It's a fantastic solution,” says Zarnecki.

    The agencies' inquiry didn't assign blame for the flawed receiver, concluding that the error was systemic rather than the fault of any individual or organization. It's not the first such breakdown: In 1999, the use of English units, rather than the called-for metric units, sent NASA's Mars Climate Orbiter on a fatally low entry into the martian atmosphere (Science, 19 November 1999, p. 1457). To avoid more such embarrassments, ESA has ordered two upcoming missions—Integral, a gamma ray observatory, and Rosetta, which will explore comet Wirtanen—to undergo the kind of in-depth testing that might have spotted the Huygens problem on the ground.


    Liberal Arts Schools Pass Science Checkup

    1. Jeffrey Mervis

    BATAVIA, ILLINOIS—U.S. liberal arts colleges and small universities that focus on undergraduate education have been hiring science faculty at record rates the past few years. Educators say this trend is one of many signs of research vitality at these schools, which train a disproportionate share of the nation's scientific workforce compared with the big research universities.

    The data come from a new study of the research environment at 136 predominantly undergraduate schools that covers everything from the average number of publications per faculty member to the number of students doing summer research projects.* The $300,000 survey was designed and paid for by five foundations—W. M. Keck, M. J. Murdock, Camille and Henry Dreyfus, Robert A. Welch, and the Research Corp.—that support undergraduate research in the natural sciences. The survey, presented at a 2-day meeting here late last month for nearly 200 college presidents and other research administrators, is perhaps the richest compendium of data ever compiled on this sector of American education.

    The foundations were concerned that applications for research grants from faculty members at these colleges hadn't increased in the past decade despite a 21% increase in new faculty. The reasons remain murky, however. “We've learned a heck of a lot [from the survey], but it's an open question where we go from here,” says Robert Lichter, executive director of the New York City-based Dreyfus Foundation.

    Welcome aboard.

    U.S. undergraduate colleges have added scores of new science faculty in recent years.


    The survey reaches no conclusions and does not identify specific institutions. But school administrators said the data are invaluable for self-analysis as well as for comparisons with other schools with which they compete for students and faculty. (Institutions contributed vast amounts of hard-to-get data with the promise of confidentiality; schools can identify themselves through a key.)

    Among the mass of data:

    • Full professors generate 0.6 research publications a year and receive research grants that average $26,700; the figures are 0.48 and $17,500 for associate professors and 0.45 and $27,100 for assistant professors.

    • The average number of faculty members per institution in the natural sciences has risen from 27.5 to 33.5 over the decade.

    • More students are majoring in the natural sciences—led by the biosciences—with their numbers rising by nearly 50% from 1990 to 1997 before tapering off slightly.

    • Faculty members spend more time on research and less time on teaching than a decade ago. But even those at the most research-intensive of these institutions—public schools that award some advanced degrees—still spend 3 hours teaching for every hour in the lab.

    • The gender gap is narrowing among new hires, standing at 55:45 in favor of men for those hired in the past 5 years compared with 82:18 for the period 1975–80.


    Turmoil Behind the Exhibits

    1. Elizabeth Pennisi*
    1. With reporting by Andrew Lawler, David Malakoff, and Erik Stokstad.

    The Smithsonian Institution is home to some of the world's finest collections, but it's beset by problems, and Smithsonian scientists are up in arms over Secretary Lawrence Small's efforts to fix them

    On 4 April, Lucy Spelman, director of the Smithsonian Institution's National Zoo, drove an hour west of Washington, D.C., to deliver some grim news. She told 65 employees of the zoo's Conservation and Research Center (CRC) in Front Royal, Virginia, that the Smithsonian planned to close the sprawling facility, one of the world's oldest centers for research on endangered species. That same day, scientists at another Smithsonian research facility, the Center for Materials Research and Education (SCMRE) in Suitland, Maryland, learned that they, too, could be out of a job by the end of the year: Their center had also been tagged for the ax (Science, 13 April, p. 183). The reaction was swift, noisy, and effective.

    The announcements caused a public eruption of discontent that had been festering behind the Smithsonian's famous exhibits for more than a year. The planned closures were the first casualties in a battle over the future of science at the venerable institution. Over the next few weeks, a flurry of newspaper stories told of trouble in the nation's attic. Smithsonian scientists accused the institution's top brass of killing off science to fund splashy exhibits and undermining the Smithsonian's credibility as a result. The director of the National Museum of Natural History (NMNH), Robert Fri, resigned, saying he could no longer support the policies of his boss. And Virginia and Maryland politicians rushed to the defense of the two threatened facilities. In the end, the public fuss gained the CRC a reprieve. But the tussle over Smithsonian science is far from over.

    Smithsonian fireworks.

    The institution's historic castle and its chief, Lawrence M. Small (below).


    At the eye of the storm is Lawrence M. Small, the Smithsonian's secretary. A former chair of the executive committee of the board of directors of Citibank and president and chief operating officer of Fannie Mae, the large home mortgage institution, Small brought a corporate style to the Smithsonian when he was installed in January 2000 as its 11th secretary. He is the first nonacademic to head the institution since it was founded in 1846 with a gift from British scientist James Smithson “for the increase and diffusion of knowledge.”

    Small came in as a reformer with a pledge to put the Smithsonian on a firmer financial footing, define a new “mission” for the institution, and subject its activities to more rigorous assessments. He has announced his intention to reorganize science throughout the Smithsonian into “centers of excellence,” administratively separate from the public exhibits, and raise its visibility. Good science would get more support, while lower priority areas would be phased out. “It's necessary for us to set our priorities in a thoughtful, paninstitutional way” and focus on fewer themes so “we can make a more compelling case” to potential donors, he said in a recent interview with Science.

    Those plans have generated deep anxiety, in part because they have yet to be spelled out in detail. Researchers, who have felt left out of the process, don't yet know to whom their groups will be reporting—or even whether their work will be phased out. The discontent is deepest at the NMNH, the largest Smithsonian unit and, arguably, the one with the most to lose in a shake-up.

    The uncertainties could persist for months. In May, following the furor over the planned closure of the CRC, the Smithsonian's Board of Regents approved the creation of a blue-ribbon commission of prominent scientists from within and outside the Smithsonian to work with Small on the scientific reorganization. The panel, which had not been appointed when this issue went to press, is expected to make preliminary recommendations by the end of the year. In the meantime, concerned members of Congress have told Small not to act without the commission's approval.

    Small has justified his actions in part on the grounds that, although some areas of science are outstanding, others are not up to par and leadership has been lacking. To examine the basis for those charges, Science obtained the reports of outside reviews of the Smithsonian's major science programs, asked the Institute for Scientific Information (ISI) in Philadelphia to analyze the scientific output of Smithsonian scientists, and talked to dozens of scientists inside and outside the institution. The picture that emerges is of a scientific enterprise that is, as one review put it, a “national treasure,” but that is beset by budget problems and in need of reforms. “There's so much potential there,” says Hans-Dieter Sues, vice president for collections and research at the Royal Ontario Museum in Toronto. “Somehow, the creative spark needs to be reignited.”

    The big squeeze

    Nobody doubts that Small took on a big challenge when he became secretary. Under his charge are the National Zoo and 16 museums covering a wide range of subjects including art, architecture, American history, and natural history. Like the zoo, most of the museums are based in Washington, D.C., many of them laid out in parallel down a grassy mall that extends about a kilometer to the steps of the U.S. Capitol. Free of charge to all comers, the museums and their exhibits are the veneer, the public view of some 155 years of research, scholarship, and collections by Smithsonian curators. Together, they draw more visitors than any other institution in the world.

    Behind that veneer is a sprawling research enterprise spread among the museums and a half-dozen separate research centers, including the Smithsonian Tropical Research Institute (STRI) at sites throughout Panama, the Harvard-Smithsonian Center for Astrophysics (CfA) in Cambridge, Massachusetts, and the Smithsonian Environmental Research Center (SERC) in Edgewater, Maryland. Both the veneer and the base on which it rests are in financial trouble.

    Support from the federal government, which covers about 70% of the Smithsonian budget, hasn't even kept up with the payroll over the past decade. Yet the institution has embarked on a string of major capital projects, including construction of a new building on the mall for the National Museum of the American Indian, that were started long before Small arrived. Cost overruns on the new museum and sharp increases in the costs of renovating aging buildings have caused the postponement of some repairs and other construction projects.

    The Bush Administration's budget, submitted to Congress in April, won't help: Although Small says he requested a “huge” increase, the Office of Management and Budget approved a mere $40 million more for next year, for a total of $494 million. Some $30 million of that increase will go to building the new museum and restoring old ones. What's left won't cover the $15.6 million increase needed for salaries, utilities, and postage; nor will it fund “institution priorities”: setting up a satellite air and space museum ($1.7 million), buying a new financial system and updating computers and security systems ($8 million), and launching a new outreach program ($2 million). Thus the secretary expects to find $13.45 million from other programs.

    That's bad news for science, which would get about the same as last year: $113 million—60% of which will pay for salaries. Smithsonian science has been chronically short of funds for years. Over the past decade, research dollars have decreased by almost a quarter Smithsonian-wide, and while the number of Smithsonian employees has jumped 41%, reaching 5700 in 1999, the number of curators in natural history alone has slipped from 118 to 99 during that same decade. “We're overstressed and overbooked on everything we try to do here,” says Richard Benson, who chairs the paleontology department.

    Diverse science.

    Smithsonian scientists run CT scans on mummies (Bruno Frohlich, above), identify birds from feathers caught in plane engines (Roxie Laybourne, below), explore the undersea world, and much, much more.


    A mixed bag

    Those budget pressures are forcing a hard look at the quality of science at the Smithsonian. Just how good is it? The question is difficult to answer because the sweeping range of the science at the institution defies easy comparison, and in many cases—especially in the NMNH—scientists play an important role that may not show up in the traditional measures of scientific excellence: maintaining and making use of valuable collections. But many consider the institution the ultimate information source. “The Smithsonian research effort is so large and broad based that no other museum in the world can hold a candle to it,” says Michael Mares, director of the Sam Noble Oklahoma Museum of Natural History in Norman.

    The ISI figures seem to bear out that perception. The natural history museum, STRI, SERC, and the astrophysical observatory rank in the top 1% of institutions in terms of their scientific impact, as measured by the number of publications they produce in their appropriate fields and the average number of citations each paper receives (see table, p. 197). The astrophysical observatory, which is linked to Harvard's astronomy department, is considered among the best astronomy centers in the world. And STRI's work on tropical forests and coral reefs has been a cornerstone of tropical and marine ecology.

    STRI, SERC, and the astrophysical observatory have managed to weather the fiscal drought in recent years and remain strong. They are focusing on “hot” research fields and have been able to attract considerable outside support (see chart, p. 198). But the NMNH hasn't done as well. “We've had several years of very marginal funding in this museum, and we've never been able to break out of it,” says Smithsonian anthropologist Donald Ortner. And museum scientists have had a tough time generating outside support, in part because the National Science Foundation traditionally funds research involving other government-supported scientists only if they team up as co-investigators with university researchers.

    The museum has also suffered chronic leadership problems. While STRI and the astrophysical observatory have had long-term, stable leadership, NMNH has had eight directors in the past 20 years alone. Larry Abele, a systematist at Florida State University in Tallahassee, realized the seriousness of the situation when it took two tries for the museum to find a new director in 1995. It seemed clear then that “the national museum was really losing its position as a leader in the world of systematic and evolutionary biology,” he recalls.

    Trouble at Natural History

    The same perception troubled the man who eventually took the job, Fri, a soft-spoken energy expert. In 1998, Fri instituted a series of reviews of the museum's science. Panels of outside experts evaluated the life, human, and earth sciences programs. Then a committee headed by Jack Gibbons, President Clinton's first science adviser, and entomologist May Berenbaum of the University of Illinois, Urbana-Champaign, worked with the chairs of the three expert panels—Abele, Jane Buikstra of the University of New Mexico, Albuquerque, and Alfred Fischer of the University of Southern California in Los Angeles—to integrate their findings into a report recommending change in the museum as a whole. The panels found a mixed bag and problems that go beyond the budget squeeze.

    The Berenbaum-Gibbons panel pointed to many strengths across the museum. The museum's life scientists “have significantly contributed to our fundamental knowledge of numbers and kinds of macroorganisms on earth,” it said. The geological and paleontological collections are “second to none in the world.” The gems and minerals collection and the scholarship associated with it “are particularly notable.” Work on volcanism “sets a standard for excellence.” The human science program has developed “an incomparable resource documenting the history of humankind in North America” and “advanced our knowledge of significant issues ranging from the origins of agriculture to human origins.”

    However, the panel continued, “the NMNH is not known institutionally for having developed great principles and theory, particularly in life sciences, nor has it as yet established a noteworthy position in the ecological and evolutionary sides of systematics and natural history.” It complained about “departmental insularity and fragmented vision, in which curators seemed to be more concerned with their turf than in crossing departmental boundaries to pursue such matters as global change, biocomplexity, and conservation.”

    The panel urged the museum to turn more toward synthesizing data, developing broad concepts, and addressing the impact of human history on the natural world. It also called for more stringent evaluations of research productivity and expressed concern about a lack of staff turnover, noting that the share of curators over 65 at the museum—now about 25%—had tripled since 1987.

    “We drafted a very substantive report, and this was bought into by the scientists,” recalls David Dilcher, a paleobotanist at the University of Florida, Gainesville, who served on the life sciences panel. “We have [developed] a wide array of wonderful science without an understanding of what we're strategically positioned to do,” explains Melinda Zeder, a Smithsonian anthropologist. “With the current funding climate, there is a need for finding [a] vision” and setting priorities. “Everyone understood where they were going and were ready to implement this work,” says Emilio Moran, an anthropologist at Indiana University, Bloomington, who is an adviser to the museum.

    The Small revolution

    The NMNH's outside panel delivered its report on 24 January 2000. That same day, Small was installed as secretary.

    Three weeks after taking office, Small announced a new structure for managing science across the Smithsonian. He promoted the Smithsonian's provost, biologist Dennis O'Connor, to a new position of under secretary for science, with oversight over all the institution's science programs. In a memorandum to the staff announcing the reorganization, Small said he planned “to focus the Institution's resources into centers of excellence, which should receive most of the funds devoted to science” and “develop plans to phase out … the scientific activities that are determined to be outside our chosen areas of specialization.” O'Connor—who was later joined by former deputy STRI director Anthony Coates in the new position of director for scientific research programs—began reviewing all the Smithsonian programs and the external reviews that had piled up over the years, and they looked at the c.v. of every Smithsonian scientist.

    When Small first arrived, many scientists welcomed the prospect of reform. “I think he understands the problem: We need money,” says Ortner. “I think he's right on target there.” But Small soon began rubbing scientists the wrong way with remarks that seemed critical of the work the museum scientists had done. He called the collections disorganized and a new museum exhibit too dense with information, and he constantly lauded the astrophysicists while criticizing natural history researchers. They felt they were being blamed for problems beyond their control. “What we need is support, not detraction and insults from the secretary,” says Vic Springer, a fish systematist at the NMNH. To make matters worse, in the summer of 2000, Small froze endowed funds that scientists relied on for special projects. The freeze proved temporary but fueled suspicions that he was robbing science's till to support pet projects.


    But Small and his scientists were trying to work together. Small set up regular breakfast meetings, held “town meetings,” and called individual scientists to ask for their thoughts. In response to Small's call for a small number of “themes” to describe Smithsonian science, representatives from each department at the NMNH formed a science council that came up with nine crosscutting themes, or research questions, and proposed that all of the NMNH be realigned into three areas: earth and planetary systems; evolution, diversity, and dynamics of life; and human dimensions of diversity and change.

    Despite these efforts, new strains emerged. Small ceased to meet regularly with museum directors, and more and more museum administrators felt left out of strategic planning sessions. Tensions came to a head in March 2001, when Ross Simons, head of science at NMNH, and Coates briefed the science council on the outlines of a reorganization plan. The Smithsonian's science operations and personnel would report to four “institutes” spanning the entire institution: biodiversity, astrophysics, human sciences, and earth and planetary sciences. In addition, science and public programs—which include the exhibits—would be managed separately.

    In interviews with Science, Small said the changes were needed for “better coordination and better sharing of resources.” Once the administrative units are congruous with a few scientific themes the Smithsonian can excel in, Small said, he will be able to raise money for those themes. Small also argued that better science would result if scientists didn't have to divide their time between public programs and research, and they could develop more fruitful collaborations with Smithsonian colleagues interested in the same themes. “If we want to stop withering on the vine,” adds Coates, “we have to do something about it by pruning ourselves, reorganizing ourselves, and getting out into the private sector.”

    Federal trough.

    NASA funding makes the astrophysical observatory the big breadwinner.


    But the council scientists left the meeting shaking their heads. Many say they support the idea of discipline-based centers of excellence, but some question the wisdom of restructuring the entire science program to create them. “I think [revitalizing science] requires a more nuanced approach than reorganizing the structure,” says Zeder. STRI evolutionary biologist Mary Jane West-Eberhard agrees, suggesting that stable funding and “strong leadership that can distinguish between delayed payoffs and deadwood” are the solutions.

    Many found the move to divide science from public programs especially troubling if it separates scientists from the building of exhibits. “There needs to be the sense of obligation to the public; they need to work with the public to engender public enthusiasm [for science],” says Dilcher. Adds Conrad Labandeira, a Smithsonian paleobiologist, “It would be an unmitigated disaster.”

    In the face of these criticisms, Small, O'Connor, and Coates decided not to release details of the impending reorganization until the Smithsonian's Board of Regents had a chance to review it at its next meeting in early May. Rumors filled the void left by their silence, fanning concerns about the future and about Small's true intentions. It didn't help that Small suspended the procedure by which scientists get promotions and raises, expecting to implement a new one, based on the reorganization, in May.

    The announcement on 4 April of the planned closure of the CRC and the SCMRE provided the spark that ignited this powder keg. Small and zoo chief Spelman justified the cuts, which were included in the Bush Administration's budget submission to Congress, as being necessary to free up funds for higher priority efforts, including refurbishing and updating exhibits and programs at the zoo. Spelman noted that some of the work of the CRC would be transferred to the zoo's main site in Washington, D.C.

    Over the next several weeks, however, congressional representatives, scientific organizations, and individual researchers stood up to defend these centers. Many scientists felt that the cuts were further evidence that Small regarded science as a low priority. “Small has articulated support for science, but at the same time, we don't see support,” says NMNH paleontologist Doug Erwin.

    The heated rhetoric continued until, on the eve of the regents' meeting, Small withdrew the proposal to close the CRC, saying the reasons had been misinterpreted. Small provided the regents (and eventually the press) with a white paper called “Science for the 21st Century,” again laying out the rationale for centers of excellence and focusing on areas of science the Smithsonian does best. But the document provided no details of how the science would be structured or what would be cut.

    The regents approved the overall concept. Given the funding trends, “we can't be all things to all people,” says one regent, microbiologist Manuel Ibáñez of Corpus Christi, Texas. The details will now be worked out in conjunction with the blue-ribbon committee. Coates expects the group to consider not just a reorganization strategy proposed by Small but also a plan drawn up by the scientists themselves. And although everyone is anxious to have Smithsonian life settle down, he expects the committee's work to take at least to the end of the year.

    Natural History may have to face the changes without a leader. In June, Fri submitted his resignation. He declines to discuss his reasons, but in a terse statement he issued at the time, he said “I do not feel that I can make [the] commitment enthusiastically” to the impending changes.

    All along, Small has argued that the Smithsonian's science lacks visibility. That's now changed: Key members of Congress are now acutely aware of the Smithsonian's science programs, and last week a Senate committee blocked the closure of the SCMRE. If Congress eventually produces a more generous budget for science, that may be one of the lasting benefits of the past year's noisy recriminations.


    Astrophysical Observatory

    1. Andrew Lawler

    CAMBRIDGE, MASSACHUSETTS—With its formidable staff of 240 scientists, a bevy of cutting-edge instruments, and a long-standing partnership with one of the world's great universities, the Harvard-Smithsonian Center for Astrophysics is one of the Smithsonian's—and one of the world's—premier research facilities. But the 111-year-old center is embroiled in a quiet institutional crisis.

    Cost overruns are straining a tightening budget, while rising salaries and a severe lack of office space limit new hires and new programs. At the same time, researchers are up in arms about director Irwin Shapiro's plan to increase Harvard's influence at the center. “This is a pivotal time for us,” says Andrea Dupree, a longtime center researcher and past president of the American Astronomical Society. “We need to make decisions about our future, and there's a sense we are going to be left out.”

    The center is the result of a 1973 merger of Harvard's and the Smithsonian's observatory programs. Since 1985, shortly after Shapiro became chief, it has doubled its staff and more than tripled its budget. Facilities range from the Multiple Mirror Telescope (MMT) on Mount Hopkins in Arizona—currently being upgraded—to the Submillimeter Array (SMA) now under construction on Mauna Kea in Hawaii. A committee of outside scholars who reviewed the center in 2000 was “uniformly impressed” by the state of its operations and science, according to a copy obtained by Science.

    High impact.

    Crab Nebula as seen by NASA's Chandra spacecraft, operated by the Harvard-Smithsonian center.


    But there are signs of strain. In 1984, the SMA was to cost $25 million and take 6 years to build; the most recent estimate is $70 million in current dollars, and a series of technical troubles has delayed the effort by several years. The MMT conversion to a more powerful set of telescopes also has encountered technical troubles and is short-staffed. Other projects, such as NASA's Chandra X-ray Observatory, which is operated by the center, are running smoothly. But a stagnant budget in recent years has left Shapiro with little flexibility. Travel, support, and computer system funds have suffered. “Such a pattern cannot continue indefinitely, as it will eventually strangle the institution,” warns the committee, which urged Shapiro to come up with a long-term plan—and hinted that it is time for new leadership. Shapiro, who says the plan will be ready by December, insists that he has no plans to retire.

    The tight budget is not Shapiro's only difficulty. He wants to increase the number of center scientists who have joint appointments at Harvard and give the university a greater say in appointing them. He contends that reviving the tradition of appointing joint researchers—the last of whom was hired 13 years ago—is essential to maintain Smithsonian representation on the Harvard faculty. Many center astronomers say it will make second-class citizens of those paid purely by the Smithsonian.

    Smithsonian Secretary Lawrence M. Small likely will have to wade into the center's issues in the coming months. Although he has other fish to fry as he revamps the institution, he will have plenty of work to do to keep one of the Smithsonian's jewels well polished.


    Environment Center

    1. Elizabeth Pennisi

    Barely an hour away from the turmoil in Washington, D.C., researchers at the Smithsonian Environmental Research Center (SERC) in Edgewater, Maryland, do long-term ecological research, monitoring the Chesapeake Bay estuary and the interconnected ecosystems at the 1000-hectare site. Elsewhere around the world, SERC scientists study the effects of global change and landscape ecology. During the summer, the research staff swells to 130, although only 14 researchers are full-time Smithsonian senior scientists. As with the astrophysical observatory, this center is not as reliant on money from Congress as is the National Museum of Natural History; some 60% of its support comes from outside grants and contracts. “It's an entrepreneurial staff, working in topical areas that agencies are interested in funding,” says Ross Simons, its director.

    Living lab.

    Chambers at SERC test the effect of high carbon dioxide concentrations on plant growth.


    Tropical Research Center

    1. David Malakoff

    Since the 1920s, the Smithsonian Tropical Research Institute (STRI) in Panama has been a favorite study site for tropical researchers. STRI currently operates nine field stations in Panama that focus on everything from cloud forests to coral reefs. More than 600 visiting scientists flock to the centrally located Barro Colorado Island facility in a typical year, often lured by the chance to work shoulder-to-shoulder with STRI's 33 principal investigators. “There is no place like it; the quality of their investigators and their publication rates are remarkable,” says Chris Peterson, a coral reef researcher at the College of the Atlantic in Bar Harbor, Maine, who did postdoctoral work at STRI in the late 1980s.

    Coral reef life.

    STRI made its mark studying both marine and terrestrial tropical biodiversity.


    STRI's budget has been essentially flat at about $15 million over the last 3 years, with about a third of the total coming from private donations and contracts. Despite the financial stagnation, however, outsiders perceive STRI to be thriving. A five-member visiting committee that graded STRI late last year concluded, “Among hundreds of tropical research institutions around the world, STRI is undeniably the best,” singling out research in ecology, evolution, animal behavior, and anthropology for special praise.

    That may be due to something that many observers say the National Museum of Natural History lacks: energetic and consistent leadership. Tropical biologist Ira Rubinoff, widely regarded as a model of the politically savvy scientist, has led STRI for 28 years.


    Major Challenges for Bush's Climate Initiative

    1. Richard A. Kerr

    The president's drive to reduce the uncertainties about global warming faces daunting obstacles both scientific and institutional

    When President George W. Bush yanked support for the Kyoto Protocol last month and called for more research to reduce the “uncertainties” about global warming, many policy-makers and scientists worldwide let out a collective groan. Bush's stance, they said, was just another excuse for inaction. Many climate scientists believe that, despite the admittedly large uncertainties, their current knowledge merits action (Science, 13 April, p. 192). But political disagreements aside, scientists do see a silver lining in Bush's call for a “science-based approach” to global climate change: the opportunity to focus climate science on key research and rein in the country's sprawling research enterprise.

    Neither task will be easy, says David Evans, an assistant administrator of the National Oceanic and Atmospheric Administration (NOAA), who suspects that tightening the coordination of federal agency activities may prove even harder than setting scientific priorities. Indeed, Bush's father already tried. In 1990, he set up the U.S. Global Change Research Program (USGCRP) to coordinate research among 10 federal agencies. Although the country has since spent $18 billion on the program—“three times as much as any other country,” the president pointed out—researchers and managers agree that it needs an overhaul.

    Red-hot scenario.

    With more on their plates than greenhouse warming simulations like this one from NCAR's model, U.S. researchers have not contributed as much to assessments as Europeans have.


    “We have lots of talent and capabilities [in the United States], but they aren't as coordinated as they need to be,” says atmospheric science program director Jay Fein of the National Science Foundation, one of the leading USGCRP agencies. One problem is that “the agencies' priorities have taken precedence over the coordinated needs of the program,” agrees climate modeler Maurice Blackmon of the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. “In tight budgetary times, the agencies have a tendency to protect their own interests.”

    Another problem, in the view of some, is a loose interpretation of “global change.” For example, NASA claims the lion's share—69%—of USGCRP spending, which it uses to support costly satellites examining everything from stratospheric ozone to tropical forest clearing to the algae coloring the ocean. Some of the satellite data are crucial to monitoring long-term trends and verifying climate models, says meteorologist J. Michael Wallace of the University of Washington, Seattle, a member of the recent National Research Council (NRC) committee that reviewed climate change science (Science, 15 June, p. 1978). But “how much of that is really dedicated to climate?” he asks. “If NASA wants to, they say it's part of climate.”

    To address such problems, President Bush said in June, he will “establish the U.S. Climate Change Research Initiative to study areas of uncertainty and identify priority areas where investments can make a difference.” He mentioned no dollar amounts, only “additional investments in climate change research.” The president gave the secretary of commerce (home to NOAA) lead responsibility for setting priorities and improving coordination among the various agencies. “We have really just begun,” says Evans, who is organizing the effort for the secretary. He expects that, under this new umbrella, climate research will be more responsive to “a wider range of policy needs.”

    Fast, but not the fastest.

    U.S. climate-modeling supercomputers like NCAR's are not the best for the job.


    The secretary of commerce may have his hands full deciding where to give climate change research a policy-oriented tweak. In the realm of climate change processes, “we want everything, we need it all,” says NRC study committee member Thomas Karl of NOAA's National Climatic Data Center in Asheville, North Carolina, only slightly tongue-in-cheek. But when pressed hard to go beyond the lengthy laundry lists of research needs that study committees have produced of late, some other climate scientists identify the role of clouds and pollutant hazes as the biggest and perhaps most recalcitrant uncertainties in projecting future climate.

    In a warming climate, clouds will morph to reflect more or less solar energy back into space—but no one is sure which it will be or how much warming or cooling that process will add. Similarly, depending on their composition, pollutant hazes made up of aerosol particles will either reflect solar energy and cool their surroundings or absorb solar energy and warm the region. And aerosols can modify clouds themselves so that they cool more, but the magnitude of this indirect effect is so uncertain that it might be large enough to counteract most of the greenhouse warming to date.

    “We have made some progress since 1990 understanding what we don't understand” about clouds and related processes such as aerosols, says climate modeler John Mitchell of the Hadley Center for Climate Prediction and Research in Bracknell, U.K. “But we've probably not progressed at all as far as decreasing uncertainty.” Indeed, the estimated range of climate's sensitivity to increasing greenhouse gases—a measure of just how warm it could get—hasn't shrunk since it was first officially estimated in 1979.

    To narrow these uncertainties, researchers call for many more targeted field investigations like INDOEX, the 1998–99 study of pollutant aerosols over the Indian Ocean. They also urge dramatic improvements in the most computationally demanding sort of climate modeling and in the collection of global climate data used to verify the models. “A sustained network of global observations is required if we're going to believe what's coming out of the models,” says climate modeler Jeffrey Kiehl of NCAR. “We don't have that, and what we do have is deteriorating.”

    After a decade of planning and construction, NASA's space-based Earth Observing System satellites, such as Terra, are beginning to return the data on changing seasons and year-to-year climate variations that are needed to validate models of atmospheric behavior, Kiehl says. But for the long-term records that are also needed, climate researchers have had to depend on data collected for daily weather forecasts, a poor substitute. “A climate observing system really has always been lacking,” says atmospheric chemist Ralph Cicerone of the University of California, Irvine, who chaired the NRC review of climate change science. “We're limping by with observations from platforms that were never designed for climate studies.”

    Nor is all of U.S. climate modeling up to world standards, researchers say. “I find it extraordinary that England does more focused and more extensive climate modeling than the United States,” says oceanographer Edward Sarachik of the University of Washington, Seattle, who headed a recent NRC study on improving the effectiveness of U.S. climate modeling. “In the United States, our top two centers together don't amount to one-fifth of the European effort.” When the U.S. global change program assessed the prospects for greenhouse warming in various regions of the country, it had to rely on two foreign models, one Canadian and one British (Science, 23 June 2000, p. 2113). No U.S. center with a top-of-the-line climate model had the spare computer time to run the needed simulations. And when the U.N.-sponsored Intergovernmental Panel on Climate Change compiled its regular 5-year report on the state of climate science, U.S. researchers were embarrassed because the U.K.'s Hadley Center and the Max Planck Institute for Meteorology in Hamburg, Germany, supplied the bulk of the needed simulations.

    The crunch in U.S. climate computing “is not a matter of talent,” says Cicerone, “it's focus, emphasis, and computer resources.” U.S. researchers have long felt at a disadvantage because a trade dispute with Japan has prevented them from buying “vector” supercomputers from Japanese companies like Fujitsu and Hitachi. Instead, U.S. firms have concentrated on massively parallel supercomputers that, despite much ballyhoo, have failed to deliver the promised performance boosts, at least when computing climate change, says Kiehl.

    Perhaps more crucial, say U.S. researchers, is the lack of focus on climate modeling. NCAR, NOAA's Geophysical Fluid Dynamics Laboratory, and several other institutions have developed sophisticated global climate models. But at each center, climate change modeling must vie for available computer time with other research on atmospheric science. “It's a matter of dedicating computer hardware to climate modeling,” says Kiehl. “We don't do that here. There's also a cultural issue. The competition [among many centers] is viewed as a healthy way to stimulate research. I agree, but the climate modeling field has reached such a level of complexity that we have to change the way we've been working.” Both the NRC study and a December 2000 USGCRP report recommend that the country create a center for U.S. climate modeling. The USGCRP report even suggests a “Climate Service,” modeled on the Weather Service, that could perform climate modeling and run a climate observing system.

    Given the push from the White House, climate scientists are optimistic that the field will get a needed boost. A new 10-year plan for USGCRP that would tighten up management is working its way to Congress, but all eyes are now focused on the Commerce Department. “The secretary is very engaged,” says Evans, “and everyone's taking it very seriously, so something is likely to happen.”


    Scientists Shower Climate Change Delegates With Paper

    1. John Pickrell

    LONDON, U.K.—A group of prominent European academics this week released a trio of reports aimed at reviving the stalled international talks on climate change. Their advice comes on the heels of a report by the U.K.'s Royal Society that questions the value of carbon sinks in absorbing greenhouse gas emissions, one of the major sticking points to those talks. The torrent of words reflects the vigorous debate over the fate of the Kyoto Protocol, which delegates will take up next week during a follow-up meeting in Bonn, Germany.

    The first set of reports comes from Climate Strategies, a new pan-European network of senior climate researchers and social scientists formed to shape the post-Kyoto debate by keeping policy-makers abreast of relevant climate data. “We've come up with great ideas in the past, but it's been too late,” says Benito Müller, an expert on the Kyoto Protocol at the Oxford Institute of Energy Studies and a founding member of the group, which announced its formation at a press briefing here on 9 July. “The idea is to coordinate research so that it comes out on a timely basis.”

    Funded by a seed grant from the Shell Foundation, Climate Strategies argues in two reports that U.S. opposition to Kyoto should not torpedo the 1997 agreement. “Kyoto is a huge investment in intellectual and research effort, and renegotiating a new protocol would not necessarily give a better product—and would waste another 10 years,” says Michael Grubb, a professor of climate change and energy policy at Imperial College and leader of the group. A third report looks at ways to tap carbon sinks for energy.

    Carbon sinks were also the subject of a report last week by the Royal Society, the U.K.'s most prestigious scientific body. It warned against overreliance on carbon sinks as an alternative to slashing carbon dioxide emissions. “Carbon sinks may help to reduce greenhouse gas levels during the short term, but the amounts of carbon dioxide that can be stored are small compared to emissions from the burning of fossil fuels,” says David Read, an ecologist at the University of Sheffield, U.K., and leader of the panel that prepared the report. Countries such as Japan, Australia, and the United States have been arguing for a larger role for sinks in meeting emissions targets. The report, however, says the role of sinks is small and finite.

    Up a tree.

    A Royal Society report warns against overstating the value of carbon sinks in mitigating global warming.


    One big uncertainty, the report says, is their estimated life-span. “Land sinks are not stable for long periods of time. Carbon locked up in trees and soils can also be released,” says John Shepherd, a climate modeler at the Southampton Oceanography Centre. Another major issue involves the techniques required to monitor, quantify, and verify sinks. “Current techniques are not good enough operationally for something as important as this,” says Shepherd.

    Climate change experts not involved in any of the reports are divided on their value in shaping the negotiations to be held in Bonn from 16 to 27 July. “[The reports] will have some visibility, but I don't think the primary driver will be scientists or policy analysts,” says Mike Hulme, director of the Tyndall Centre for Climate Change Research in Norwich, U.K. But Darren Goetze, a senior policy adviser on carbon sinks for the Canadian Environment Ministry in Ottawa, says the Royal Society's report “in large part confirms what the IPCC [Intergovernmental Panel on Climate Change] has been saying” and, thus, may steer debate away from carbon sinks and toward more lasting solutions to anthropogenic global warming.

    As with greenhouse gases, there is no shortage of greenhouse analyses. Climate Strategies hopes to feed government officials throughout Europe a steady stream of reports that will help them take definitive action, and Frank Biermann, a political scientist not affiliated with the group, thinks that it can play a valuable role. “European [research] institutes are too small, and it makes sense that they join forces,” says Biermann, who is with the Potsdam Institute for Climate Impact Research in Germany. “Europe needs a united position on climate change policy, which is sometimes lacking.”

    Even so, Grubb and others readily admit that a problem as complex as climate change won't be solved overnight. “Countries need to focus on completely restructuring their generation and use of energy,” says Read. “These measures may be socially and politically more painful to implement. … But they provide the ultimate solution.”


    High-Powered GRAPEs Take On the Cosmos

    1. Dennis Normile

    An astrophysicist's dream machine, the GRAPE-6 supercomputer can put virtual galaxies through their gravitational paces

    TOKYO—The world's fastest supercomputer is an unimpressive-looking machine with a name like a fruit drink and a price tag to warm a lab director's heart. Meet GRAPE-6, perhaps astrophysicists' most anticipated tool.

    Unveiled earlier this week at a symposium* here, GRAPE-6 is the latest in a line of machines that has been quietly revolutionizing astrophysical simulation. Developed on a shoestring budget by a small team of researchers at the University of Tokyo, GRAPEs have become the machines of choice for simulating the formation of planets, the evolution of star clusters, and the collisions of galaxies. They have earned this distinction by being exquisitely tailored to do just one thing: computing the gravitational attraction between two bodies. GRAPEs run through this calculation so quickly that researchers have been able to begin simulating these astrophysical phenomena using realistic numbers of celestial bodies, a task that chokes conventional supercomputers. And GRAPEs are so affordable that even modestly funded groups can buy their own. Many groups have several.

    “This is the democratization of supercomputing,” says Mordecai-Mark MacLow, an astrophysicist at the American Museum of Natural History in New York City. Simon White, a theorist at the Max Planck Institute for Astrophysics in Garching, Germany, adds that work on stellar dynamics “has been much more lively than it would have been without [GRAPEs].” At least 32 research groups around the world now use them, and the tangle of collaborative work is so thick that Piet Hut, an astrophysicist at the Institute for Advanced Study in Princeton, New Jersey, has to think long and hard to name theorists without some connection to GRAPE simulations. The impact of GRAPE “has been revolutionary,” says Hut, who himself collaborates with the GRAPE developers. Despite their limitations—GRAPEs alone cannot calculate the effects of temperature, radiation, or magnetism—over the past decade their knack for gravitational number-crunching has helped resolve long-standing questions about planet formation, the behavior of globular clusters, and the collision of galaxies.

    Hot item.

    Astrophysicist Jun Makino shows off a circuit board from the just-unveiled GRAPE-6.


    And the best may be yet to come. GRAPE-6 debuts as the fastest computer in the world, with a theoretical peak speed of 30 trillion floating-point operations per second—teraflops or Tflops, for short. IBM still claims to hold the record for general-purpose computers, with its ASCI White supercomputer, which operates at 12.3 Tflops.

    The GRAPE project grew out of dissatisfaction with available computers. In the mid-1980s, Daiichiro Sugimoto, an astrophysicist at the University of Tokyo, and Jun Makino, one of his grad students, were using so-called N-body simulations to study the evolution of star clusters. Makino recalls that simulations involving a few thousand stars took hundreds of hours of supercomputing time. And they wanted to scale up to hundreds of thousands of stars. “We really needed a computer much faster than what was available,” Makino says.

    The calculation is elementary: the force between two bodies is the product of their masses and the gravitational constant, divided by the square of the distance between them. The problem is that increasing the number of bodies for greater realism increases the number of computations quadratically, because every body interacts with every other body.
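As a back-of-the-envelope illustration (ordinary Python, not GRAPE's hardwired logic, and with names chosen here for clarity), the direct-sum loop below shows both the elementary force formula and why the work grows quadratically with the number of bodies:

```python
# Illustrative direct-summation N-body force calculation (not GRAPE code).
# The inner double loop performs N*(N-1) pair evaluations, so doubling the
# number of bodies quadruples the work -- the bottleneck GRAPE hardwires.

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def pairwise_forces(masses, positions):
    """Return the net gravitational force vector on each body."""
    n = len(masses)
    forces = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # Vector from body i to body j
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx)
            r = r2 ** 0.5
            # F = G * m_i * m_j / r^2, directed along dx
            f = G * masses[i] * masses[j] / r2
            for k in range(3):
                forces[i][k] += f * dx[k] / r
    return forces
```

For a thousand bodies the pair evaluation already runs roughly a million times per time step; scaling up to the hundreds of thousands of stars Makino wanted pushes a general-purpose machine past its limits.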

    Sugimoto's group was unaware that a problem similar to theirs had already been solved just across town. In 1983, Yoshihiro Chikada, an astronomer at Japan's National Astronomical Observatory, completed a special-purpose computer that processed the raw signals gathered by the battery of antennas at the Nobeyama Radio Observatory into usable data. The key idea was to hardwire the routine calculations as electronic circuits instead of directing the computer to perform them by software routines. Instead of one line of code executed with each computer clock cycle, the calculation would be completed in the time it took electrons to speed through the circuit. Despite the cost of having these customized chips specially fabricated, they ended up with a machine that was “hundreds of times better in terms of cost-performance” than its alternatives, Chikada says.

    It occurred to Chikada that a similar approach might be useful for theoretical simulations. He was too busy to follow up on his own idea, but he circulated an outline of how he thought such a computer would work. And sometime in 1988, this idea reached Sugimoto. Soon afterward, Sugimoto's group started work on its first GRAPE.

    Sugimoto and his colleagues relied on off-the-shelf chips and components, which they wired by hand into circuits that would perform the gravitational computation. This kind of hardwired circuit is called a pipeline, which led to GRAvity PipE and to GRAPE. (Makino explains that Sugimoto, who retired in 1997, had wanted a catchy name for their device, and Apple Computer had legitimized the idea of computers with fruity names.) The researchers also decided not to make GRAPE a stand-alone computer. Instead, GRAPEs rely on a front-end computer, a workstation or desktop PC, that keeps track of progress and feeds the data on the mass and location of the bodies to the GRAPE boards for the heavy computing. The forces GRAPE comes up with go back to the front-end computer, which adjusts the positions of the bodies, takes a step forward in time, and starts another computing cycle.
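The division of labor in that cycle can be sketched as a host-side loop; the data layout and function names below are illustrative assumptions, not GRAPE's actual software interface:

```python
# Illustrative sketch of the GRAPE front-end/board split (hypothetical names,
# not the real GRAPE interface). The front-end handles the cheap O(N)
# bookkeeping; compute_forces stands in for the board's O(N^2) force pipeline.

def advance_step(bodies, dt, compute_forces):
    """Kick-then-drift update: accelerate each body by its force, then move it."""
    forces = compute_forces(bodies)      # heavy lifting -> GRAPE board
    for body, f in zip(bodies, forces):  # light bookkeeping -> front-end host
        for k in range(3):
            body["vel"][k] += dt * f[k] / body["mass"]
            body["pos"][k] += dt * body["vel"][k]

def run(bodies, dt, steps, compute_forces):
    """The front-end loop: feed positions to the board, integrate, repeat."""
    for _ in range(steps):
        advance_step(bodies, dt, compute_forces)
```

Because only `compute_forces` changes, the same front-end code could drive a software fallback or the dedicated pipeline, which is what lets a cheap workstation harness the boards.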

    Sugimoto's group completed the first GRAPE in 1989. Running at 120 million flops, GRAPE-1 was slower than the 1-billion-flops (1-gigaflops) capability of supercomputers then available. But they put it together with just $3000 in parts and a year's worth of effort. Best of all, Makino says, “we had it for 24 hours a day.” And they used it round-the-clock, producing three or four papers on galactic dynamics over the following year.

    They had actually started work on two versions of GRAPE simultaneously, and this set a pattern for future development. Even-numbered generations have higher numerical accuracy and are intended for studying phenomena involving collisions, which require very accurate force calculations. Odd-numbered generations are less accurate, less powerful, less expensive, and more suited for studying long-term collisionless processes, such as galactic evolution.

    GRAPE power.

    Circuits hardwired for gravity thrive on complex simulations such as evolving a virtual moon.


    GRAPE-2 was completed about a year after GRAPE-1 and soon yielded its own crop of papers. “We demonstrated we could do interesting science and not just hardware,” Makino says. And suddenly, researchers around the world started asking how they could get their own GRAPEs. Sugimoto's group was happy to share its designs but didn't want to spend all its time building computers. They now have an agreement with Hamamatsu Metrix, a small electronics firm in Hamamatsu, which builds and sells GRAPEs, returning some nominal level of financial support for research to the University of Tokyo team.

    GRAPE-3, the first version widely available to other groups, marked a new level of sophistication. Instead of off-the-shelf chips, Makino and his colleagues started designing their own custom integrated circuits, putting one entire pipeline on a chip and arranging 48 chips in parallel. They took an even bigger step with GRAPE-4, wiring up 1692 chips. Completed in mid-1995, it was the first computer in the world to break the Tflops barrier, with a peak speed of 1.08 Tflops. At the time, the fastest general-purpose supercomputer was a Fujitsu model with a peak speed of 280 gigaflops. GRAPE-5 was optimized for lower accuracy and power. But now GRAPE-6, built on a budget of about $4.2 million, once again wins the world's-fastest crown.

    Just as important for astrophysicists is the affordability. The most basic GRAPE-6 configuration, for about $13,000, has just four GRAPE-6 chips, each with six pipelines, and runs at 150 gigaflops. A single GRAPE-6 board sporting 32 chips and a total of 192 pipelines capable of 1 Tflops costs about $42,000. The full-sized GRAPE unveiled this week has 32 of those boards. That means that just about any group with even modest funding can put what Hut calls “a virtual astrophysical laboratory” on its desktop. As a result, “a bright new idea can be tested right away, as opposed to having to argue it in an application for supercomputer time in order to be able to try it out 6 to 12 months later,” says Lia Athanassoula, a theorist at the Marseille Observatory in France, which has several different GRAPE configurations.

    The enthusiasm of GRAPE users for their machines resembles that of early computer hackers. There are GRAPE users' conferences and even an occasional newsletter. “The community of GRAPE users is informal, but rather tight-knit,” says Albert Bosma, who is also based at Marseille. “People help each other out with technical problems.” The can-do spirit has paid off. By an informal and certainly incomplete tally, more than 40 journal papers published last year alone were based at least in part on GRAPE simulations.

    GRAPE's tailor-made take on scientific computing might soon spread to other fields as well. Toshikazu Ebisuzaki, a GRAPE project veteran who now directs the Advanced Computing Center at RIKEN, the Institute for Physical and Chemical Research, near Tokyo, is modifying the GRAPE approach to tackle molecular dynamics. He says that just as computing gravitational attraction was a bottleneck for simulations of stellar dynamics, figuring Coulomb and van der Waals forces, which govern molecular attraction, is the bottleneck in simulating molecular dynamics. Ebisuzaki has already demonstrated the feasibility of modifying GRAPE chips to handle these calculations and is gearing up to start simulations of protein folding that may prove more practical and affordable than current approaches using supercomputers.

    For Makino and his collaborators, claiming the prize as the world's fastest computer is nice, of course. But their primary interest is still astrophysics. High on their list of problems to tackle is just how black holes inevitably end up at the centers of galaxies, even in cases where two galaxies collide. There are various theories. But attempts to simulate them on previous GRAPEs proved maddeningly inconclusive. Makino thinks the ability of GRAPE-6 to handle more stars in greater detail may resolve the question. But it may be one of the problems that just has to wait for GRAPE-7 or -8.

    • *Astrophysical Computing Using Particle Simulations, 10–13 July.

    Driving a Stake Into Resurgent TB

    1. Martin Enserink

    Eighty years after the first tuberculosis vaccine, health leaders say they need a new one. Scientists have several promising candidates—but who will pay for them?

    Tuberculosis is an anachronism to many Westerners. It conjures up images of hollow-eyed writers and artists wracked by “consumption,” or Swiss sanatoriums where patients are wheeled outside to be cured by fresh mountain air. From a scientific viewpoint, tuberculosis (TB) also seems out of place in the 21st century. Most cases are now curable with antibiotics, and a cheap vaccine is administered to some 100 million newborns each year, making it the most widely used vaccine ever made. TB shouldn't be a threat any longer.

    But TB still ranks among the world's most deadly infectious diseases, killing 2 million to 3 million people a year. An astonishing 2 billion people—a third of the world's population—may carry a latent TB infection, and roughly 10% of them will develop a life-threatening form of disease. The vaccine, it turns out, is no match for Mycobacterium tuberculosis, the bacterium that causes the disease. Treatment is problematic as well; indeed, the bacterium has evolved deadly new strains, resistant to the most powerful drugs. “TB is arguably the most successful pathogen on the planet,” says William Jacobs, a researcher at the Albert Einstein College of Medicine in New York City.

    The only effective way to control TB, researchers have concluded in the last decade, is to develop new vaccines. The challenges will be daunting, because much about the disease is still unknown. But optimism, a rare commodity in TB research, is rising: Molecular biology is providing new tools for the battle and, for the first time in decades, money is pouring in to support new research.

    The big scientific leap came in 1998, when researchers sequenced the genome of M. tuberculosis. Obtaining the DNA for all the TB genes at once was like walking into “a candy store,” says Stanford University geneticist Peter Small. And the mounting death toll, as TB teamed up with AIDS, helped stimulate global awareness of the disease. In 1993, for instance, the World Health Organization (WHO) declared TB a “global health emergency.” In response, the European Union launched a $4.3 million TB Vaccine Cluster research program in 1999. In the United States, a $25 million grant from the Bill and Melinda Gates Foundation has energized a new round of vaccine development projects, coordinated by the Sequella Global Tuberculosis Foundation in Rockville, Maryland. Created in 1997, Sequella's goal is to get vaccine candidates far enough along in clinical trials to make them interesting for drug companies. The foundation hopes to take three candidate vaccines into the first small-scale safety trials in humans over the next 12 to 18 months—a step few would have thought possible 3 years ago. A British group, meanwhile, plans to start human trials as early as September.

    Old foe.

    Tuberculosis has been controlled in wealthy countries through a combination of screening, antibiotics, and vaccination.


    Even WHO, which has traditionally focused on how to deliver drugs to people who need them—and ensure that they take them—is rethinking its strategy. It is examining the logistics of future large-scale clinical trials, says Michael Brennan, a TB vaccine researcher at the U.S. Food and Drug Administration, who is now advising the international agency.

    A deadly partnership

    Veterans in the field trace the beginnings of the turnaround to the late 1980s, when TB suddenly surged, and drug-resistant strains emerged in the United States, riding on the heels of the AIDS epidemic. The outbreak drove home the point that even in industrialized countries, TB remained a large if hidden threat, says Marcus Horwitz of the University of California, Los Angeles.

    TB is tough to control because it can lurk in the body for years, out of reach of the immune system. After the initial infection, M. tuberculosis goes into a latent phase, hiding inside a class of blood cells called macrophages. An intense course of multiple antibiotics, administered for 6 to 8 months, is needed to rid the body of the bug, even though the symptoms usually disappear within weeks. Patients tend to drop their antibiotic treatment too early, which fuels the emergence of drug-resistant strains.

    U.S. public health officials mounted a strong offensive against resistant TB, imposing new regimes to ensure that patients finish their therapy. After peaking in 1992, the U.S. outbreak declined by the end of the decade. Last month, the Centers for Disease Control and Prevention in Atlanta announced that the number of new cases in 2000 had dropped by 6.6% from 1999, reaching an all-time low of 16,377.

    But in Eastern Europe, where funds are lacking and public health services are inadequate, multidrug resistance has become rampant. And in Africa, TB and HIV have combined in a deadly spiral. In AIDS patients, latent TB has a much higher chance of becoming active, and when it does, the symptoms are much more severe. As a result, most African AIDS patients die of TB. TB also appears to speed HIV's replication rate. Although several new drugs are in the pipeline, none is expected to make a big dent in the epidemic. “I think everybody now agrees that we won't substantially reduce the burden of TB without new vaccines,” says Brennan.

    The existing vaccine was developed by Albert Calmette and Camille Guérin of the Pasteur Institute in Paris some 80 years ago from a bovine cousin of M. tuberculosis. Initially, it seemed to work well. But over the decades, the efficacy of Bacille Calmette-Guérin (BCG), as the vaccine is known, has come into question. Large-scale trials have produced wildly conflicting results, varying from 80% protection to none at all. “There's no other vaccine for which the results are so inconsistent,” says Paul Fine, a BCG expert at the London School of Hygiene and Tropical Medicine in the United Kingdom.

    In most parts of the world, BCG's protection doesn't seem to extend beyond childhood, during which it generally protects against TB's most severe forms. Young adults, however, remain at risk of the disease. Except for the United States and the Netherlands, which have never used BCG, almost the entire world has followed WHO's recommendation to vaccinate every child with BCG at birth. “It's not a perfect vaccine,” explains Fine, “but it's doing some good and it's dirt cheap—so we give it.”

    The new generation

    Coming up with alternatives, however, has proved tough. For one, the world may need several vaccines: a preventive type to immunize children at birth, as well as “therapeutic vaccines” to protect the 2 billion latently infected from actually developing symptoms and cure those who already have active TB. Researchers still understand little about the complex ways in which M. tuberculosis thwarts the human immune system. It's unclear how the bug manages to persist for decades, for example, or why some people eventually get sick, yet the majority stay healthy. “At the end of the day, we need to know the answers if we want to make a very good, very effective vaccine,” says Jacobs.

    Despite the obstacles, TB experts in recent years have identified several strong candidate vaccines that either perform significantly better than BCG in animals, appear not just to prevent but also to cure TB, or trigger a strong response in immune cells taken from infected people. The first of these to reach human testing is a vaccine developed by Adrian Hill and his colleagues at the University of Oxford, slated to enter clinical trials in September. With money from the Wellcome Trust, a large British charity, Hill is betting on a combination of BCG and a booster made of a vaccinia virus that produces antigen 85, a protein produced by M. tuberculosis.

    The big problem with TB vaccine candidates, however, is that because most patients are poor and cannot pay for therapy, market-conscious pharmaceutical companies have little interest in them. That's why the Sequella Global Tuberculosis Foundation wants to fund researchers to help make their vaccine attractive to industry.

    Sequella's goal is to get candidates ready for industrial development by putting them through small-scale tests, including phase I trials, which test for safety in healthy volunteers, and phase II trials, which probe for efficacy. After the candidates clear those hurdles, Sequella hopes to interest big pharmaceutical companies in funding large-scale phase III trials and, eventually, producing and distributing the vaccines.

    Global problem.

    TB occurs everywhere, but most cases are in Africa and Asia.


    Several observers credit Sequella's co-founder and director, Carol Nacy, with changing the culture and pushing researchers to apply their discoveries. “Frankly, we researchers are most interested in basic knowledge,” says Jacobs. “We want that next paper in Nature or Science. But that's a far cry from getting a vaccine.” One of Sequella's candidates, developed by Horwitz, builds on the original BCG vaccine strain. But Horwitz tries to beef up the body's immune response by making the BCG organism overexpress antigen 85, the same protein used by Hill and his colleagues. In guinea pig trials, the vaccine yielded 10- to 100-fold better protection than BCG itself, says Horwitz. A “turbo-BCG,” as this type of vaccine has been dubbed, would have several advantages, he says. It's as safe as classic BCG, and its introduction would be relatively easy, because the new vaccine could simply take BCG's place in immunization programs.

    Sequella's second candidate is a DNA vaccine, produced by Douglas Lowrie and his colleagues at the Medical Research Council in London. In 1999, Lowrie discovered that injecting into muscle a piece of DNA encoding a so-called heat shock protein from M. leprae, a bug related to M. tuberculosis, not only protected mice from getting tuberculosis but also cured mice that had been infected. Based on that approach, Lowrie has developed a therapeutic vaccine that would be given along with drugs to patients with pulmonary TB.

    An Austrian biotech company, Intercell, has developed the third candidate, a subunit vaccine. It consists of eight epitopes—key parts of M. tuberculosis proteins that trigger an immune response—and an “adjuvant” that fires up the body's immune response. “Everybody is getting nervous now, because they want their vaccine in the queue too,” says Nacy. “You can't imagine how exciting that is.”

    Meanwhile, a Seattle-based biotech called Corixa plans to begin testing its own candidate in humans next year. With a $2.3 million challenge grant from the National Institutes of Health, the company has produced a “fusion protein” vaccine that combines two antigens from M. tuberculosis. The protein has shown “exquisitely good protection” in mice, guinea pigs, and monkeys, says Corixa's chief scientific officer, Steven Reed.

    These new studies are small and preliminary; even if they produce a green light, a marketable vaccine remains years away. Sequella, for instance, has just begun long-range planning for phase II and III trials in South Africa—an ideal test case, says Nacy, “because it has a First World medical community and a Third World TB incidence.” But preparing the site—collecting baseline epidemiological data and setting up the required infrastructure—will take 3 to 5 years. Then there's the biggest hurdle of all: getting a company interested. Even if researchers deliver a vaccine to a manufacturer on a silver platter, as Sequella intends to do, it's not clear that a pharmaceutical company will invest the hundreds of millions of dollars required to get the product into clinics.

    For the moment, however, researchers are delighted just to see the field moving forward. “From now on, the research will be driven by the results from clinical trials,” says Brennan. “That's a big change. We've really turned a corner.”

  17. 'Breeding' Antigens for New Vaccines

    1. Jon Cohen

    Backed by DARPA funding, a biotech company in California is hunting for better reagents with a technique known as directed molecular evolution

    REDWOOD CITY, CALIFORNIA—Russell Howard is exploiting the wonders of sex to develop vaccines. For 10,000 years, explains Howard, who heads a biotechnology company called Maxygen, humans have used breeding techniques to make crops and animals with specific traits. Now Maxygen is “mating” viral genes to achieve similar ends, fashioning vaccines designed to combat AIDS, dengue, hepatitis B, and other diseases. And, Howard likes to brag, this strategy differs markedly from the standard biotech approach: Rather than carefully manipulating genes to develop a product with specific characteristics, Maxygen hunts for chance offspring that have the desired features.

    Maxygen is one of a handful of biotechs focusing on “directed molecular evolution.” This approach mimics natural selection, but on a minuscule scale and with a focused purpose. These companies use a variety of techniques to modify genes, which they assemble into large libraries. They then screen the libraries for genes that produce proteins with a particular biological feature, select the best genes, modify them, and run through the process repeatedly until they get a result they like. “It's an amazingly productive way to go,” says Lawrence Loeb, a researcher at the University of Washington, Seattle, who does directed evolution and consults with Maxygen.
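The modify–screen–select–repeat loop described above is, at heart, a search algorithm, and its logic can be sketched in a few lines of code. The toy example below "evolves" a string toward a target trait: the target string, the scoring function, and the mutation rate are invented stand-ins for the real biochemical libraries and screens, not anything Maxygen actually uses.

```python
import random

random.seed(0)

TARGET = "MAXYGEN"  # hypothetical stand-in for a desired protein property
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(seq):
    """Score a candidate: how many positions match the target trait."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.2):
    """Introduce random point mutations, mimicking error-prone copying."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

def directed_evolution(pop_size=50, generations=200):
    # Start from a library of random candidate sequences.
    pop = ["".join(random.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for _ in range(generations):
        # Screen the library and keep the best performers...
        pop.sort(key=fitness, reverse=True)
        best = pop[: pop_size // 5]
        if fitness(best[0]) == len(TARGET):
            break
        # ...then diversify the winners and run the cycle again.
        pop = best + [mutate(random.choice(best))
                      for _ in range(pop_size - len(best))]
    return max(pop, key=fitness)

print(directed_evolution())
```

Within a couple hundred generations the population converges on the target, illustrating why iterated screening and diversification finds solutions no single round of random mutation would.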

    Gene shuffler.

    Maxygen CEO Russell Howard exploits mating strategy.


    Loeb emphasizes that nature would never select for many of the products that directed evolution allows you to create. “Nature is really limited,” says Loeb, who has used directed evolution to make novel enzymes with potential medical and industrial applications. “A ‘better’ enzyme wouldn't help the cell.”

    Enzymes are a key focus of every directed evolution company, as are therapeutic proteins, antibodies, and small molecules for drugs. But only Maxygen—a company in the San Francisco Bay area that has rapidly grown in the last year (the number of employees doubled to 270)—so far has used the strategy to hunt for vaccines. Howard's rationale echoes Loeb's. The immune system evolved to respond to foreigners—which immunologists call antigens—but, he notes, “antigens were never evolved to elicit an immune response.”

    Maxygen's work is still in the early stages of test tube and animal testing. But if it succeeds, it could add a novel tool to the vaccinemaker's workbench. “It's a powerful technology, there's no doubt,” says Gerald Joyce, a pioneering investigator in the directed molecular evolution field who works at the Scripps Research Institute in La Jolla, California. “They've really run with it in a very pragmatic way.”

    DNA shuffling

    Maxygen evolved from the work of Willem “Pim” Stemmer, formerly a staff scientist at the Affymax Research Institute, who in 1993 discovered a new way to direct molecular evolution. Joyce, Loeb, and researchers in other labs were creating novel DNA or RNA with a version of this technique, introducing random mutations into genes by chemically damaging them or copying them in systems that introduce errors. Stemmer took a different tack, dubbed “DNA shuffling,” which resembles the way genetic material recombines through sex.

    As Stemmer first reported in Nature in 1994, the initial step in DNA shuffling is to isolate several slightly different genes that code for the same product. Enzymes chop the genes into random fragments, and the polymerase chain reaction (PCR) recombines fragments from various genes. Finally, recombined fragments are reassembled into unique, full-length genes (see sidebar). “Recombination is a very gentle way of increasing diversity,” says Howard.

    In 1997, Stemmer, Howard (then Affymax's scientific director), and Affymax founder Alejandro Zaffaroni spun off Maxygen as an independent company. Zaffaroni, a well-connected chemist and founder of innovative biotechs, quickly put together a star-studded scientific advisory board; it includes Nobel laureates Baruch Blumberg, Arthur Kornberg, and Joshua Lederberg. Maxygen split into different divisions to work on agriculture, chemicals, pharmaceuticals, and vaccines. The vaccine division, Maxyvax, has not yet presented much data publicly, but vaccine experts are watching it closely and it is getting support from an unusual sponsor of vaccine research: the Defense Advanced Research Projects Agency (DARPA), which has kicked in $20 million from its unconventional pathogen countermeasures program.

    Traditional vaccines stimulate the immune system by exposing it to whole disease-causing organisms that have been weakened or killed to render them harmless. But these concoctions have sometimes caused the diseases they aim to prevent. Modern vaccines, with the help of genetic engineering, attempt to avoid this problem by delivering only parts of the bug in question—the coat proteins, for example. Maxygen is trying to improve the potency of some of these modern vaccines.

    For more than a decade, vaccine researchers have mimicked the live, weakened approach by using safe bacterial or viral “vectors” to shuttle genes from the disease-causing organisms into the body. One of the most popular vectors is naked DNA, a circular piece of bacterial DNA that slips inside target cells. Yet it has its own limitation: In the body, it often produces low levels of the antigen that's required to stimulate an immune response. Maxygen says its shuffling technology has cranked up antigen yields using naked DNA. But results are wrapped in secrecy.

    High on DARPA's wish list is a vaccine that could protect against dengue, a virus transmitted by mosquitoes. It causes severe fevers and, in extreme cases, hemorrhaging and death. Four related but antigenically distinct strains of dengue cause the disease, and antibodies against one often do not protect against another. Maxygen has shuffled various dengue strains to create single antigens that, Howard says, produce antibodies that work against all four strains. The U.S. Navy now is evaluating Maxygen's data.

    A single-antigen dengue vaccine could have important advantages. Donald Burke, who heads the Center for Immunization Research at Johns Hopkins University and has studied dengue in Thailand, says, in theory, it could be better than one that contains antigens from each of the four strains because dengue has a peculiar feature. Low levels of antibodies can actually enhance the ability of the virus to cause hemorrhagic fever. With a vaccine composed of four independent antigens, explains Burke, “the concern is you'd get waning of immunity to one antigen, and that would sensitize you to hemorrhagic fever.” A single-antigen vaccine may produce a uniform response.

    Maxygen has been more open about another vaccine project, one that involves hepatitis B. The existing hepatitis B vaccine contains a genetically engineered version of that virus's surface protein. Seeking a more powerful effect, Maxygen scientists took surface antigen genes from several hepatitis B viruses, shuffled them, and put the recombined genes into naked DNA vectors. They then injected mice with these vaccines and natural (or “wild-type”) hepatitis B surface proteins. Animals injected with the shuffled genes produced five-fold higher levels of antibodies than did the mice with the best wild-type antigen. Then they went a step further, reshuffling genes that produced the best antigens and reinjecting mice. They found vaccines that led to as much as a 12-fold greater antibody response than the most potent wild-type vaccine.

    Maxygen's Robert Whalen, who works on the hepatitis B project, says the company recognizes that the existing vaccine can prevent infection. But Whalen notes that it can do little for the more than 250 million people who are infected with hepatitis B and are at risk for fatal liver disease. He's hoping that a more potent vaccine might work as a treatment by knocking down viral levels in patients with a chronic infection.

    The unknowns

    Improving DNA vectors, making a multivalent dengue vaccine, and boosting the potency of the existing hepatitis B vaccine represent three novel ways that DNA shuffling might help vaccine developers. But researchers often face a more fundamental, perplexing dilemma: They remain in the dark about the immune responses that a vaccine needs to trigger. “There are very few diseases where you really know what you want,” says Howard.

    HIV is a case in point. AIDS researchers still debate the relative importance of antibodies, which prevent cells from becoming infected, and cellular immunity, the arm of the immune system that clears infected cells. Maxygen thinks directed evolution might shine some light here, too.

    Taking a page from the hepatitis B story, many AIDS vaccine researchers in the 1980s banked on the idea that an antibody to HIV's surface, or envelope, protein could be protective. But by the early 1990s, it became clear that antibodies triggered by these envelope proteins failed to stop infections in test tube experiments. Many groups abandoned antibodies altogether, focusing on cellular immunity instead.

    Dennis Burton of the Scripps Research Institute steadfastly stuck with the antibody idea. After screening thousands of envelope antibodies from infected people, Burton found one that powerfully stops HIV in vitro. But Burton got only halfway: He did not find a version of the envelope protein that triggers production of the antibody.

    Maxygen has begun to hunt for this elusive protein. Its scientists plan to shuffle envelope genes from several different isolates of HIV and test whether any of the proteins expressed by these reshuffled genes bind to Burton's antibody. They will then inject the most promising genes into mice to see whether they produce a strong antibody response. This strategy, Burton notes, differs markedly from rational gene design, which aims to fashion new HIV envelopes by deleting parts of the known structure. “One strategy says you work everything out and design a change in the protein,” says Burton. “Here, you just randomize and select. And that's what nature does. It doesn't design.”

    In February, Maxygen's AIDS vaccine project received a boost when the nonprofit International AIDS Vaccine Initiative, in collaboration with the Rockefeller Foundation, both in New York City, agreed to fund the work in exchange for a royalty-free license to distribute any resulting vaccine to poor countries. David Ho of the Aaron Diamond AIDS Research Center in New York City became a part of this collaboration, too, providing envelope genes from Asian isolates of HIV. “The concept is obviously novel,” says Ho. “It's a fishing expedition: The more you toss the line out, the more chances you have.”

    Novel as the HIV vaccine project is, it also has a serious constraint: Maxygen's screening assays currently ignore cellular immunity. And that's true for all of its vaccines under development. “They've got a ways to go before they maximally exploit the technology,” says Burke of Johns Hopkins. Maxygen's Howard agrees. “If we have an inferior assay, we're going to get inferior products,” he says. And he says the company eventually plans to develop assays for cellular immunity.

    Whether Maxygen's experiments lead to new and improved vaccines remains to be seen. But already it's clear that they are introducing vaccine researchers to a sexy new technology.

  18. How DNA Shuffling Works

    1. Jon Cohen

    The core technology Maxygen uses in vaccine research is the polymerase chain reaction (PCR), the “molecular photocopier” that's used to amplify small amounts of DNA. But Maxygen scientists have given it a special tweak to produce what they call “DNA shuffling.”


    In the PCR process, DNA is heated to a temperature that separates the double helix into two strands, or templates. Researchers then add short, synthetic stretches of DNA, called primers, which complement portions of the templates, attach to them, and jump-start the copying process. This builds new double-helical DNA structures, as one base (A, T, G, or C) after another attaches to its complement on the templates.

    DNA shuffling, in contrast, is PCR without synthetic primers. In this process, a family of related genes—say, the ones that code for the surface protein of three different HIV isolates—is first chopped up with enzymes. The gene fragments are then heated to separate them into single-stranded templates. Some of these fragments will bind to other fragments that share complementary DNA regions, which in some cases will be from other family members. Regions of DNA that are non-complementary hang over the ends of the templates (see illustration). PCR then treats the complementary regions as primers and builds the new double-helical DNA. But PCR also adds bases to the overhanging piece of the primer, forming a double helix there, too. This ultimately creates a mixed structure, or chimera. In the final step, PCR reassembles these chimeras into full-length, shuffled genes.
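As a rough illustration of the fragment-and-reassemble idea, the toy simulation below recombines three short, made-up "genes" of equal length into full-length chimeras by drawing each homologous segment from a randomly chosen parent. It skips the annealing chemistry entirely; the sequences and fragment counts are invented for illustration.

```python
import random

random.seed(1)

# Three toy "genes" from related isolates: same length, a few differing bases.
# (Hypothetical sequences; real genes run to hundreds or thousands of bases.)
parents = [
    "ATGGCTACGATTGCAACGTGA",
    "ATGGCAACGATAGCAACGTGA",
    "ATGGCTACGATTGCTACGTGA",
]

def shuffle_genes(parents, n_fragments=4):
    """Build one chimeric gene by recombining homologous fragments.

    Mimics DNA shuffling's chop-and-reassemble step: the gene is cut at
    positions shared by all parents, and each fragment is drawn from a
    randomly chosen parent, as if complementary overhangs had annealed.
    """
    length = len(parents[0])
    cuts = sorted(random.sample(range(1, length), n_fragments - 1))
    bounds = [0] + cuts + [length]
    return "".join(random.choice(parents)[a:b]
                   for a, b in zip(bounds, bounds[1:]))

# A small shuffled library: each entry is full length, but mixes parents.
library = [shuffle_genes(parents) for _ in range(5)]
for gene in library:
    print(gene)
```

Every gene in the resulting library is full length, and every base comes from one of the parents at the homologous position, which is what makes recombination a "gentle" way of generating diversity compared with random mutation.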

  19. Closing of Basel Institute Scatters Immunologists

    1. Giselle Weiss*
    1. Giselle Weiss is a writer in Allschwil, Switzerland.

    Hoffmann-La Roche's celebrated center—an experiment originally run by basic researchers—experiences a traumatic change

    BASEL, SWITZERLAND—Hoffmann-La Roche shocked Europe's immunology community last year when it pulled the plug on its renowned Basel Institute of Immunology (BII). For more than 30 years, the drug manufacturer had supported the institute's self-directed program of basic immunology research. In return, the institute gave Roche access to first-rate science and talented biologists. It was an idyllic relationship, but it came to an end on 5 June 2000 when Roche announced that it was converting the institute into a center for medical genomics. The BII's board of directors voted to dissolve itself, and Roche took direct control.

    The institute has been slowly dispersing since then. Fritz Melchers, the BII's director for 20 years, retired this past April. Twenty-seven of the 48 members of the scientific staff have left, reportedly with generous severance deals. More are expected to leave by the end of July, when additional contracts expire, and a few who have longer contracts will depart over the next 18 months.


    Fritz Melchers, the institute's former chief.


    Roche's publicly stated rationale for the transition was that it needed to position itself at the “cutting edge of the biosciences.” But employees paint a more complicated picture. Some BII alumni worry that the institute's demise indicates that Europe is losing interest in basic biology that doesn't involve genomics. Others say it has long been in a precarious position. It seemed as though the BII “was always closing,” says Christopher Paige, who spent 8 years there in the 1980s and now directs research at the Ontario Cancer Institute. “Even when I left,” says Brigitte Askonas, an immunologist at Imperial College, London, who was at the institute for a short period in the early 1970s, “there was uncertainty about the longer term future of the BII.” The company's ambivalent support diminished, some say, following the death in 1999 of Paul Sacher, a Swiss conductor and avid patron of the arts and sciences who married into the Roche family. Observers say the BII was Sacher's pet project and that his departure tipped the balance.

    The transition has been bruising for some, particularly researchers past middle age like Louis Du Pasquier, whose early retirement means an end to 31 years of studying amphibian immunology. Yet by and large the institute is “successful even in its death,” says Melchers, pointing to the number of employees who have found good jobs. The big loss, say the people who knew the institute well, is the closing of an extraordinary opportunity for young scientists. Du Pasquier says it was “a paradise for creativity at the single-person level”—one that's not likely to be replaced.

    The Valium windfall

    Roche conceived the idea of an independent academic institute in the late 1960s, flush with money from sales of benzodiazepines. The company hired Niels Jerne, then almost 60, to direct the new center and gave him $22 million a year and a free hand in planning it. Although Roche got a first look at research coming out of the institute, by all accounts, commercial payback was never the focus. Askonas remembers how Jerne called several meetings of the new institute's first recruits, and “we had a wonderful time discussing how an institute should be run to encourage original research.”

    The structure that emerged was simple but unique. Administration was nominal. Groups were kept small to discourage empire building. Although the BII hired a handful of researchers on a permanent basis, most scientists came on 2-year, rolling contracts. “It was not a place to stay,” says Du Pasquier, adding that the turnover kept it young. According to Polly Matzinger, an immunologist at the National Institutes of Health who spent 6 years at the BII, the advantage was the freedom and support for researchers to do what they wanted without any questions and without having to write grants. “The only limitation was your own brain,” says Michael Julius, a one-time member of the institute and now vice president for research at the Sunnybrook and Women's Health Sciences Center in Toronto.

    Jerne tried to encourage interaction by connecting the labs with a famous network of spiral staircases and putting the cafeteria in a central location. “We were forced to communicate,” says Matzinger. The sum of these small groups collaborating was greater than its parts, says Fred Alt, an immunologist at Harvard Medical School in Boston, who served as an adviser to the BII in its final years.

    The institute became home to 50 scientists and 25,000 model organisms, mostly mice, but also worms, chickens, frogs, and trout. Researchers were backed by a corps of superbly trained technicians. Hundreds of scientific visitors contributed to a knowledge base that made the institute a mecca for immunology. “Everybody stopped in Basel,” says Alt. Large parts of the canon of immunology were hammered out at the BII. Just as important, it served as a breeding ground for immunologists. A map fills a wall in the conference room on the second floor, pasted over with photos of many of the 500 scientists who worked there. “There are metastases from the institute all over the world,” says Klaus Karjalainen, a member for 17 years. Julius points out that the department of immunology at the University of Toronto has “filled up with ex-BII members.”


    Niels Jerne, the first director, had a free hand to design a center for basic research in immunology.


    “It was a great place to establish the seeds of new research,” says Susumu Tonegawa, director of the Center for Learning and Memory at the Massachusetts Institute of Technology in Cambridge. Tonegawa's work at the BII on antibody diversity garnered him one of the three Nobel Prizes awarded to members within the institute's first 11 years of operation. (Georges Köhler and Jerne, together with César Milstein, snagged the other two.) But he and others eventually began to chafe at the limits on group size. “Some research does go on better in factories,” admits Paige. Pressure to perform and the peculiar social situation of the institute—50 foreign scientists isolated in a small Swiss city—also exacted a toll. Melchers likens it to going into a monastery for a few years. “You hurt by the time you left,” observes Matzinger.

    Jerne retired in 1980 and was succeeded by Melchers. The staff gives Melchers credit for maintaining Jerne's singular vision, but by the 1990s, something had changed. Melchers himself volunteers freely that after 20 years, it was time for him to go. But there was no scientific reason to close the institute, he says.

    Icon of immunology.

    Jean Tinguely's iron sculpture of a helix towered over the entrance to the Basel Institute for 30 years.


    The BII's shutdown, some observers say, coincides with a decline in support for basic research in Europe. People feel that European leaders are emphasizing science as a means of stimulating the economy “as opposed to discovering novel principles in biology,” says Harald von Boehmer, a permanent member who left in 1996 and now heads a group at the Dana-Farber Cancer Institute in Boston. That's one reason, he says, why several leading immunologists have moved to the United States, “where funding for basic research … is by comparison enormous.”

    Nearby universities have snapped up several BII scientists. Other members of the BII staff are heading for the Swiss Federal Institute of Technology in Zurich, Washington University in St. Louis, and the European Institute of Oncology in Milan, to name just a few. Karjalainen will join the Institute of Research in Biomedicine recently started in southern Switzerland by Antonio Lanzavecchia, who left the BII the year before it closed. In accepting a chair endowed by Roche for $6.7 million at the University of Basel, Ton Rolink, who had been with the BII since 1983, will become the university's first professor of immunology.

    The new Roche Center for Medical Genomics, meanwhile, has yet to take shape. Signs for the center and the BII face off outside the building, but inside, there's little sign of new life. The institute's library was dismantled months ago to make way for a department of bioinformatics that Roche's media office confirms has been staffed with a department head and several other people. Detailed drawings for renovating the cafeteria have gone up and come down from the announcement board, fueling speculation that Roche may be rethinking its commitment to the center. But the company maintains that its plans are “right on track.”

    For many immunologists who grew up at the BII, there can be no successor. “There is no place that I know of on this planet where a bright young person can go and just do what they want to do,” says Matzinger, who fears that without the institute, immunology will become more mainstream. Indeed, muses Askonas, “the BII provided the opportunity for young investigators to develop into independent scientists.”

    No one disputes Roche's right to disband the institute. What people do regret is the abrupt way it was done. Still, many express wonder that the BII continued for as long as it did. “It was a remarkable experiment,” says Julius.