News this Week

Science  04 Jan 2002:
Vol. 295, Issue 5552, pp. 24


  1. U.S. BUDGET

    Biomedicine Gets Record Raise as Congress Sets 2002 Spending

    1. David Malakoff*
    1. With reporting by Jeffrey Mervis.

    All's well that ends well. This year's bumpy budget-making journey ended in a relatively smooth landing late last month with a record raise for U.S. biomedical science and increases for many other basic research budgets.

    In April, President George W. Bush alarmed some science lobbyists with a spending plan for 2002 that called for trimming many nonbiomedical research budgets. But lawmakers rejected most of those cuts, instead increasing spending on everything from geological research to space science (see table). They added even more funds for military science and bioterrorism-related research in the wake of the 11 September terrorist assaults and subsequent anthrax attacks.


    While the final numbers are still being tallied, analysts expect overall government R&D spending to rise by more than 6%, to some $100 billion, in the 2002 fiscal year that began 1 October. “It was like holiday shopping—they went on a late spending spree,” says one congressional budget aide. Some of the money has strings attached, however, as lawmakers ordered up an unprecedented number of earmarks. The practice, in which Congress directs funds to a specific institution or research project, is opposed by many scientists and the Bush Administration.

    Although Congress had already passed some spending bills affecting science agencies (Science, 16 November 2001, p. 1430), it wasn't until the week before Christmas that it approved budgets for the National Institutes of Health (NIH) and the Departments of Education and Defense. Here are highlights from those budgets and from other agencies that support research:

    Biomedical science. For the fourth year in a row, NIH was the biggest winner. Its record $3 billion, 15% raise, to $23.2 billion, was contained in a $396 billion measure funding health, education, and welfare programs. The 2002 number is $200 million more than the White House requested and keeps the world's largest biomedical research funder on track for a budget doubling in 5 years, to $27 billion next year.

    NIH's two dozen major institutes—including those focused on cancer, heart disease, and diabetes research—will each grow by 12% to 15%, producing nearly 900 new grants across the agency. A new, congressionally mandated institute for biomedical imaging will start life with a $112 million budget. Construction grants for extramural facilities will grow by 10%, to $110 million. But that's far shy of what some university administrators had sought.

    In a significant victory for science society lobbyists, Congress rejected a Senate proposal forcing NIH to spend $143 million on Parkinson's disease research. The 60,000-member Federation of American Societies for Experimental Biology (FASEB) and other groups had opposed the directive, saying it would undermine NIH's history of giving funds to the best peer-reviewed science and not to specific diseases. The groups also feared that the earmark would trigger further directives in the NIH spending bill, which Congress has traditionally kept earmark-free. Lawmakers also turned aside a House move to reduce salaries for grantees, which FASEB and other groups said would doom efforts to recruit more physicians for high-priority clinical research studies.

    Even so, some of NIH's increase won't remain fully in the agency's control. Budgeteers tagged nearly $300 million to be funneled to the Department of Health and Human Services, NIH's parent agency, for management oversight and other activities. Another $100 million is dedicated to a new Administration initiative to fight AIDS, malaria, and tuberculosis, part of which will involve NIH.

    Military research. The $318 billion defense bill boosts the military's own basic research spending by 5%, to $1.4 billion—more than $100 million above the president's request. Basic science funded by the Defense Advanced Research Projects Agency will grow by nearly 4%, to $519 million. The raises are good news for universities, which depend on the Pentagon for up to half of their math, engineering, and computer science research funding.

    The defense bill also restores a cut of nearly $150 million that the Administration had proposed for Department of Energy programs aimed at securing Russian nuclear materials and keeping ex-Soviet scientists from taking their talents to enemy nations. And it includes some goodies for NIH as well: The National Institute of Allergy and Infectious Diseases will get $85 million for bioterrorism-related research and $70 million to build an ultrasecure laboratory for working with dangerous pathogens.

    Bioterrorism security. At the last minute, lawmakers removed language from the defense spending bill that would have tightened security requirements for researchers working with potential bioweapons. The American Society for Microbiology and other groups had scrambled to help Congress craft workable regulations on worker screening and registering of pathogens, and some of those measures were attached to the Senate's version of the spending bill. After House leaders objected to using an appropriations bill to pass the new rules, however, the two bodies agreed to finalize separate bioterrorism security legislation early next year.

    Fighting terrorism.

    NIH gets a new lab to work with dangerous pathogens and the government beefs up security.


    Science education. The science education community got its heart broken by congressional appropriators. On 18 December, science education lobbyists celebrated completion of a highly publicized reform of federal funding for elementary and secondary education. Among many other provisions, the new law authorizes the Department of Education to spend up to $450 million a year on partnerships between universities and local school districts to improve math and science. Later that day, however, the committee that actually hands out the money approved a paltry $12.5 million for the program. “We're still recovering from the shock,” says a disheartened Jodi Peterson of the National Science Teachers Association.

    “It's kind of interesting that in the midst of all this talk about improving education, math and science are left out of the picture,” says Representative Vernon Ehlers (R-MI), a long-time advocate for better science education. He also scolded scientists for not getting more involved sooner. “It was a question of too little, too late,” he said. But Congress did provide at least a little solace: Earlier this year it gave the National Science Foundation $160 million to start a similar partnerships program.

    NIST earmarks. The construction budget of the National Institute of Standards and Technology (NIST), approved in November, is larded with an unprecedented level of earmarks. Senator Judd Gregg (R-NH), the top Republican on the committee that oversees NIST's budget, managed to steer a staggering $18 million to his home state of New Hampshire, part of $41 million in earmarks for a $62 million account. NIST's overall budget rose 13%, to $674 million, due to the earmarks and a 27% increase, to $185 million, for the Advanced Technology Program. But NIST's $321 million core research budget edged up by only 3%. “I am amazed to see that we are more concerned about 'pork' than supporting world-class research facilities,” says Senator John McCain (R-AZ), who regularly rails against the practice.

    Researchers are already wondering about the effects of earmarks on next year's budget. The Bush Administration, which will release its 2003 budget proposal on 4 February, has said that rising security needs and a declining economy will leave little room for new research initiatives. But, while Bush has threatened to veto any bill that smells too strongly of bacon, Congress often holds the upper hand in budget battles.


    Cloned Pigs May Help Overcome Rejection

    1. Jocelyn Kaiser

    The cloning of Dolly the sheep nearly 5 years ago raised the hopes of transplant scientists looking for an endless supply of lifesaving organs. It was a key step toward creating a line of identical animals genetically engineered so their organs could be used in people. Now, a team led by researchers at the University of Missouri, Columbia, has made another major advance: the creation of four cloned piglets that lack one copy of a gene that causes pig organs to be rejected by the human immune system.

    “This is something that's been eagerly awaited,” says immunologist Jeffrey Platt of the Mayo Clinic in Rochester, Minnesota, an expert in xenotransplantation, or animal-to-human transplants. The work, published online this week by Science, brings researchers halfway to their goal of producing live pigs lacking both copies of the gene. It puts the Missouri group ahead of a pack of companies pursuing the same goal, one of which has just welcomed the birth of knockout pigs.

    Pigs are the most promising species for organ transplants because they are physiologically similar to humans and, unlike nonhuman primates, are in plentiful supply. But progress in the field has been slow for two reasons: the fear of new viruses being transmitted from pigs to humans and the almost certain rejection of the transplanted organ.

    Handling rejection.

    This piglet lacks one copy of a sugar-producing gene that makes humans reject pig organs.


    Pigs produce a sugar, a link between two galactoses, on the surface of their endothelial cells that humans and Old World monkeys do not make. Primates' immune systems recognize this sugar as a foreign antigen and attack the pig cells, leading to “hyperacute rejection” and organ failure.

    Researchers have addressed the problem by endowing transgenic pigs with protective proteins to counter the immune response, which has allowed the organs to function in primates for months rather than days. But the only complete solution is thought to be a pig lacking the gene for the enzyme galactosyltransferase that makes the sugar. Cloning technology raises the possibility of disrupting, or knocking out, this gene in cultured cells, then inserting the nucleus of the modified cells into an empty pig egg to create embryos.

    The first cloned pigs were created in 2000 (Science, 18 August 2000, pp. 1118 and 1188). Now, animal scientist Randall Prather and his team at Missouri, along with collaborators at Immerge BioTherapeutics Inc. in Charlestown, Massachusetts, have knocked out the galtransferase gene in fetal cells used to make cloned piglets.

    To disrupt the gene, the researchers used a “gene trap” vector, a piece of DNA containing snippets complementary to the target gene along with sequences for antibiotic resistance. They moved this vector across the cell membrane and into the nucleus by jiggling the cells with electricity. They then treated the cells with antibiotics to kill all but the cells that contained the inserted DNA, then screened for those that had it in the right location. The replacement gene causes the cell to make a truncated version of galtransferase. Because the odds of a successful insert were only 1 in 5 million, the team didn't expect to get any cells with both alleles knocked out.

    The researchers then zapped the modified fetal cells with electricity to fuse them with oocytes from which the chromosomes had been removed; the jolt also kick-started the process of cell division. They implanted the resulting embryos into sows that had just come into heat.

    Because fetal cells stop dividing after a few weeks in culture, the team had to move quickly. “We did a bunch of things in the lab differently” to speed up the modification and testing steps, Prather says. All the same, the team had to implant more than 3000 embryos in 28 surrogate sows to get seven live piglets born in September and October, a 0.2% success rate. “It's a rather heroic piece of work,” says cattle cloning researcher George Seidel of Colorado State University, Fort Collins. And the work isn't over: The four surviving piglets, all females, still make the galactose link with their good copy of galtransferase.
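    The efficiency quoted above is easy to check with a back-of-envelope calculation. The sketch below assumes exactly 3000 embryos, though the article says only “more than 3000”:

    ```python
    # Back-of-envelope check of the cloning efficiency quoted above.
    # Assumption: exactly 3000 embryos (the article says "more than 3000").
    embryos = 3000
    sows = 28
    live_piglets = 7

    success_rate = live_piglets / embryos   # fraction of embryos yielding live births
    embryos_per_sow = embryos / sows        # average implantation load per surrogate

    print(f"{success_rate:.1%}")            # rounds to the 0.2% figure in the text
    print(f"{embryos_per_sow:.0f} embryos per sow")
    ```

    With these assumptions, each surrogate sow carried on the order of a hundred implanted embryos, which underscores why Seidel calls the work “heroic.”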

    At least two other companies are hot on the Missouri team's heels. Advanced Cell Technology of Worcester, Massachusetts, says it is close to announcing the birth of pigs lacking the galtransferase gene. And David Ayares of Scotland-based PPL Therapeutics' lab in Blacksburg, Virginia, told Science at press time that five pigs appearing to have the knockout allele were born on Christmas Day. Prather says the next step, which his group hopes to achieve within 18 months, is to produce double knockout pigs using conventional breeding methods.


    Livestock Feed Ban Preserves Drugs' Power

    1. Dan Ferber

    CHICAGO—It's no secret that livestock fed antibiotics breed drug-resistant bacteria that can cause dangerous infections in people. But a new study suggests that the process is reversible. Banning a drug called avoparcin from animal feed dramatically reduced the chances that potentially dangerous gut microbes in hospital patients would be resistant to an important, related drug, Belgian researchers reported last month at a meeting* sponsored by the American Society for Microbiology.

    The results are the first to show that cutting antibiotic use on the farm leads to reduced resistance in hospital patients—those who need antibiotics the most, says microbiologist Stuart Levy of Tufts University School of Medicine in Boston. “This says there's a strong connection between what's done in animals and what you see in people,” he says.

    Farmers mix low doses of antibiotics into animal feed to keep infections from spreading through a flock or herd. The antibiotics also help fatten animals on less feed, although researchers aren't sure why. But antibiotics almost inevitably spur some bacteria to develop resistance to the drugs, and researchers have long warned that the bugs, or the resistance genes they harbor, can make their way through the food chain to the human gut. That, in turn, could make it harder to treat dangerous infections with antibiotics akin to the drugs used on the farm (Science, 5 May 2000, p. 792).

    This concern prompted the European Union to ban avoparcin from livestock feed in 1997 after more than 2 decades of use. Avoparcin and a human antibiotic called vancomycin kill bacteria by preventing them from building cell walls. Earlier studies showed that avoparcin-resistant gut microbes in chickens and hogs also resist vancomycin. That could be bad news for hospital patients, who receive vancomycin to fight enterococci that cause life-threatening infections when they escape from the gut during surgery. Researchers suspected that the avoparcin ban would help prevent the spread of vancomycin-resistant enterococci (VRE) in humans.

    Just say no.

    Cutting antibiotics from chicken feed reduces microbes' drug resistance in people.


    In monitoring the ban's effects, researchers have found a dramatic drop in VRE among pigs, chickens, and supermarket chicken meat. Fewer VRE were also found in the human population. But it wasn't clear whether this trend extended to patients in hospitals, where most opportunistic enterococci infections occur.

    To find out, microbiologist Greet Ieven of the University of Antwerp and her colleagues cultured enterococci from stool samples of 353 patients in May and June 2001 and tested how many microbes survived high levels of vancomycin. Just three of the enterococci cultures, or 0.6%, stood up to vancomycin—a big drop from the 5.7% resistance rate in 1996, when avoparcin was still widely fed to livestock. Molecular genetic analysis confirmed that the prevalence of a key vancomycin resistance gene plummeted from 5.7% to 0.8%. Because the use of vancomycin in Belgian hospitals hasn't changed in recent years, Ieven says, the results “confirm the hypothesis that VRE in Europe originates [on farms].”

    The Belgian results “confirm what people thought might happen in the clinic,” says pharmacologist Michael Dudley of Microcide Pharmaceuticals in Mountain View, California. The results mean that antibiotic resistance flows like water from the farm to the clinic, he says, and by stopping the use of avoparcin, “you stop the tap.”

    Two other classes of farm antibiotics have come under scrutiny lately because they resemble human drugs, and the U.S. Food and Drug Administration has proposed banning them from livestock feed. They include virginiamycin, which breeds enterococci that resist a recently developed drug called Synercid that kills tough enterococci infections; and enrofloxacin, a member of the widely used class of human antibiotics called fluoroquinolones that includes ciprofloxacin (Cipro). Abbott Laboratories, the maker of enrofloxacin, is fighting the ban. A final FDA decision is not expected for at least another year.

    • *The Interscience Conference on Antimicrobial Agents and Chemotherapy, Chicago, 16-19 December 2001.


    Cancer-Stalling System Accelerates Aging

    1. Evelyn Strauss

    Fending off cancer: a good idea? Maybe not. A mechanism that thwarts deadly tumors might come with a major drawback. Overactive p53—a protein that foils potentially cancerous cells—causes symptoms of old age and hastens death in mice, report Lawrence Donehower, a molecular biologist at Baylor College of Medicine in Houston, and colleagues in the 3 January issue of Nature.

    The work suggests a trade-off between cancer prevention and sustained vigor. “A robust surveillance mechanism against cancer is important for getting the animal to reproductive age,” says Leonard Guarente, a molecular geneticist at the Massachusetts Institute of Technology in Cambridge. But if the system's gauge is set high enough to protect against cancer, it might thwart normal cells as well. “This might be good early on but bad later in life,” says Guarente.

    Most research on p53 has focused on how a deficiency of the protein causes cancer. The protein prevents genetically marred cells from reproducing by sending them to their death or by stalling reproduction until the damage is fixed. Without sufficient p53, corrupt cells run amok and some grow into tumors.

    While trying to mimic a common human p53 mutation in mice, Donehower's team serendipitously created a different one. The mutation lopped off about half of one copy of the p53 gene, causing mice that carried it to build some normal and some stunted p53 proteins. Counterintuitively, the mice behaved as if they had acquired extra dollops of the protein. None of the 35 animals that harbored the altered p53 developed life-threatening tumors, although some acquired small tumors that were discovered on autopsy. But in the 56 mice with two intact copies of the gene, 45% developed deadly tumors. A series of tests led the researchers to conclude that they had inadvertently constructed a mutant p53 that grants the normal protein superpowers.

    If it's not one thing, it's another.

    Cancer-resistant mouse (top) appears to grow old faster than normal mouse (bottom).


    The mice didn't have a chance to fully enjoy their cancer-free lives, however. By 96 weeks of age, half the mutants had died. Half of normal animals, in contrast, lived 118 weeks or longer. And the oldest mutant barely squeaked by the 136-week mark, while its normal counterpart celebrated 164 weeks.

    As pups, the mutant mice seem similar to their normal littermates, but “they start to look decrepit and sluggish in middle age,” says Donehower. The mice developed many age-related conditions. They shrank, for example, their skin thinned, and wounds healed slowly. Many of their organs shriveled as well, apparently because the constituent cells withered. The results suggest that p53 repels cancer at a price: The ability to obstruct rampant cell division might also hamper an animal's ability to replenish essential cells. This idea fits with other work that links p53 with aging. For example, a gene known to extend longevity in worms and yeast—Sir2—also promotes survival in mammalian cells by turning down p53 activity.

    Guarente cautions that the study doesn't prove that the p53 mutants grow old before their time: “Is this premature aging, or is this just a sick mouse?” But he and others are impressed by the large constellation of old-age traits in the mice. To convince people that the mice exhibit accelerated aging, “you'd have to knock out p53 and get a longer life-span,” Guarente says. But “we already know what happens then: You get cancer.”

    Researchers can't yet circumvent this problem, but Donehower has managed to collect some preliminary data. Mice that carry one normal and one inactive copy of p53 suffer from a high incidence of cancer. But two of the 217 such animals he studied happened not to get tumors—and they “lived much longer than any of the wild-type mice,” says Donehower. “It's only two mice,” he cautions, but he would like to follow up this tantalizing observation to see if small amounts of p53 make for a longer life-span, provided the animal remains cancer-free.

    Teasing apart p53's age-promoting and cancer-preventing capabilities might eventually lead to therapeutic interventions, suggests Ronald DePinho, a cancer geneticist at the Dana-Farber Cancer Institute in Boston. Perhaps such an approach would “help the organism age gracefully,” he says, without compromising its ability to guard against cancer.


    Reducing Uncertainties of Global Warming

    1. Richard A. Kerr

    About all that climate researchers can say with any confidence concerning global warming is that the world has warmed during the past century and much of that warming is probably due to humans pouring greenhouse gases into the atmosphere. How bad could things get as the world continues to warm? Scientists' bottom-up approach—trying to understand the role of every part in the dizzyingly complex climate machine—has left that question unanswered. But in this issue of Science (p. 113), a group of researchers take a top-down approach: They plugged different combinations of values for fundamental properties of the climate system—such as its sensitivity to the nudge that humans are giving it—into a computer model and looked to see how well the model's output matched long-term observations. The results are mixed.

    Climate dynamicist Chris E. Forest of the Massachusetts Institute of Technology and his colleagues used this new combination of computer simulation and observations to calculate climate properties that had usually been estimated from climate models alone or from polling researchers for their opinions. Using an intermediate-complexity model simple enough to make hundreds of long runs, Forest and his colleagues simulated the climate of 1860 to 1995 under accumulating greenhouse gases. They compared their results to three observational records of temperature that gauge global warming: the changing temperatures of the surface, the upper atmosphere, and the deep ocean.

    In the model, they included three adjustable “knobs”: the sensitivity of climate to a given amount of added greenhouse gases, the rate at which the ocean can take up heat, and the ability of aerosols—microscopic particles found in pollutant hazes—to change solar heating of the atmosphere. Forest and his colleagues twiddled the knobs over a range of values, ran the model under a large number of setting combinations, and then compared the simulated climate trends with the three observed temperature records. If a three-setting combination produced a reasonable match for all three records, then each of the combination's settings became a possible value of the actual climate property.
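    The knob-twiddling procedure amounts to a grid search over parameter combinations, accepting any combination whose simulated trends fall near the observations. The sketch below is purely schematic: the toy “model,” the observed trends, the knob ranges, and the tolerance are all invented for illustration and stand in for Forest's intermediate-complexity model and real temperature records.

    ```python
    # Schematic sketch of the top-down parameter sweep described above.
    # Everything numeric here is invented for illustration only.
    import itertools

    def toy_model(sensitivity, ocean_uptake, aerosol_forcing):
        """Return (surface, upper-air, deep-ocean) warming trends from a
        crude energy-balance caricature -- not a real climate model."""
        forcing = 3.7 - aerosol_forcing        # CO2-doubling forcing minus aerosol cooling (W/m^2)
        surface = sensitivity * forcing / 3.7  # surface response scales with sensitivity
        upper_air = 0.8 * surface              # upper atmosphere warms somewhat less
        deep_ocean = ocean_uptake * surface    # heat taken up by the deep ocean
        return (surface, upper_air, deep_ocean)

    # Hypothetical "observed" trends for the three records, plus a match tolerance
    observed = (0.6, 0.45, 0.12)
    tolerance = 0.15

    # Twiddle the three knobs over ranges of values
    sensitivities = [1.0, 1.5, 2.0, 3.0, 4.5, 6.0]   # K per CO2 doubling
    uptakes = [0.1, 0.2, 0.3]
    aerosols = [0.0, 0.5, 1.0]                       # W/m^2 of net cooling

    # Keep every setting combination that matches all three records
    accepted = []
    for s, u, a in itertools.product(sensitivities, uptakes, aerosols):
        sim = toy_model(s, u, a)
        if all(abs(m - o) <= tolerance for m, o in zip(sim, observed)):
            accepted.append((s, u, a))

    # Each setting in an accepted combination is a possible value
    # of the corresponding real climate property
    possible_sensitivities = sorted({s for s, _, _ in accepted})
    print(possible_sensitivities)
    ```

    The real analysis draws confidence intervals from how well each combination fits, rather than a hard accept/reject cutoff, but the logic is the same: parameters are constrained only insofar as the observations rule settings out.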

    By their own admission, Forest and colleagues had varied success pinning down key parameters of the climate system. The rate at which the ocean takes up heat—and counteracts greenhouse warming—couldn't be usefully constrained. “Our result suggests that more research is needed” on ocean heat uptake, they write.

    Their lower limit (90% confidence level) on the all-important climate sensitivity—1.4 kelvin for a doubling of atmospheric carbon dioxide—matches the long-cited, subjective 1.5 K lower limit recently repeated by the Intergovernmental Panel on Climate Change (IPCC) (Science, 26 January 2001, p. 566). At that level, “future changes in climate are of considerable concern,” notes climatologist Tom Wigley of the National Center for Atmospheric Research in Boulder, Colorado. But Forest and colleagues came up with an upper limit even higher than the IPCC's: 7.7 K, compared with the IPCC's 4.5 K. The only way they could bring their scorching upper limit down was to use an expert opinion as the starting point for their statistical analysis of their modeling, an option that climate modeler Michael Schlesinger of the University of Illinois, Urbana-Champaign (UIUC), calls “extremely unsatisfactory.” Schlesinger has conducted a similar analysis with Natalia Andronova of UIUC that yields an upper limit as high as 9.3 K, depending on the uncertain role of variations in the brightness of the sun.

    Calculation of the likely range of the aerosol effect seemed most successful. Forest and his colleagues found that aerosols have most likely cooled the planet, but not as much as IPCC allowed. The net effect of aerosols could have been to reflect 0.30 to 0.95 watt per square meter of solar energy back into space, according to their result, compared with IPCC's admittedly uncertain range of zero net effect to more than 4 watts per square meter of net cooling. If correct, Forest's modest cooling would mean that most of the early greenhouse warming is not being masked by aerosols. But aerosol modeler Joyce Penner of the University of Michigan, Ann Arbor, cautions that Forest probably shouldn't be lumping all types of aerosols together. She notes that some, such as those from field and forest burnings, do not concentrate in northern mid-latitudes, as Forest had to assume. “I wouldn't want to rewrite IPCC,” adds climatologist Gabriele Hegerl of Duke University in Durham, North Carolina, but Forest's “range [for aerosols] is more likely than the very high ones of IPCC.” To be more confident, researchers must further refine the top-down approach and, like it or not, gain more bottom-up understanding.


    Missing Biologist Found Dead in River

    1. Josh Gewolb

    The body of Harvard biochemist Don Wiley was found last month, 5 weeks after he disappeared while attending a professional meeting in Memphis, Tennessee. The corpse was discovered floating in a Mississippi River tributary in Louisiana, more than 300 miles south of the bridge where Wiley's abandoned car was discovered at 4 a.m. on 16 November. Dental records confirmed the identity of the Lasker Prize winner.

    “It's a long ways, but the river has a mind of its own,” said Lt. Joe Scott of the Memphis Police Department.

    As Science went to press, Memphis police were awaiting results of an autopsy to determine the cause of Wiley's death. Colleagues have said that they doubt Wiley would have jumped off the bridge, but so far police have found no evidence of foul play (Science, 14 December 2001, p. 2265). A wallet containing identification was found with the body.

    Harvard president Lawrence Summers issued a statement saying that the loss left a “tremendous void” in the campus community, which was shut down for winter vacation when the news arrived. Wiley's principal research sponsor, the Howard Hughes Medical Institute, has yet to set a timetable for determining the status of the lab, according to institute spokesperson Bob Potter (see Letters, p. 43). “It's just too early for that,” says Potter. “Everybody has to come to terms with the fact that Don is not coming back.”


    U.S. Breeder Reactor Runs Out of Lives

    1. Robert F. Service

    This time the decision looks final. The Bush Administration has abandoned its search for a new mission for the Fast Flux Test Facility (FFTF) in Hanford, Washington, and is planning to decommission the reactor.

    The 19 December announcement marks the end of a decade-long saga for the Department of Energy's (DOE's) experimental breeder reactor. Concerns that its fuel might be a tempting target for nuclear terrorists, the high cost of restart, and the lack of a clear mission sealed its fate. But many FFTF opponents are already gearing up for the next fight, over getting the government to spend the $300 million needed to permanently shut down the facility. “The department's final determination is based on sound science, an extensive analysis of the costs and benefits of disposition options, and an in-depth consideration of the feasibility of commercial use options,” said Energy Secretary Spencer Abraham in a written statement.

    Proponents saw the reactor as an important source of radioisotopes used in cancer therapy and other treatments. But most biomedical researchers say that the isotopes are available from other sources and that restarting the reactor would drain scarce resources from other DOE research programs. “The data were compelling a long time ago that the cost of restarting FFTF relative to the need was not favorable,” says Ken Khrone, a radiologist at the University of Washington, Seattle.


    DOE officials flagged uncertain large costs as the main reason for shutting down the reactor.


    FFTF went online in 1980 but was shut down in 1992 because of high operating costs. DOE has since spent about $35 million a year to maintain the reactor in standby mode while exploring possible uses ranging from producing tritium for nuclear weapons to plutonium-powered batteries on deep space probes (Science, 4 April 1997, p. 28). Last January, the outgoing Clinton Administration ordered the facility permanently closed, but Abraham stayed the order in April, soon after taking office.

    Abraham ultimately rejected two ideas, one from a consortium of companies that wanted to produce and sell medical isotopes, the other to use the facility to research advanced reactor designs. “Both were found to have major drawbacks and present potential DOE liabilities that collectively could exceed $2 billion,” concluded DOE's review leader, Robert Card, in a 14 December memo to Abraham. Those costs were too much for a shrinking DOE budget, says Tom Carpenter, who heads the West Coast office of the Government Accountability Project, a public interest law firm that opposed a restart: “The proposal to restart FFTF may have gotten somewhere 10 years ago, but not in this budget climate.”

    This same fiscal climate already has chilled other Hanford cleanup operations. Last spring, the Bush Administration requested $430 million less for cleanup activities at Hanford than required under a binding agreement with Washington state and the Environmental Protection Agency, although Congress later restored the funds. With even more money needed next year to begin shutting down the reactor, another round of budget battles seems like a good bet.


    Planetary Science's Defining Moment

    1. Andrew Lawler

    Internecine battles have tarnished the reputation of planetary scientists in Washington, D.C. Researchers hope they can win back respect with a consensus long-range plan

    IRVINE, CALIFORNIA—The stark message from the black box on the conference table left many of the dozen or so scientists visibly shaken. “It would be very easy for this Administration to walk away from the planetary program,” said the voice from Washington, D.C. The speaker was Steve Isakowitz, who oversees space and science programs at the powerful White House Office of Management and Budget (OMB). The stunned audience members were part of a National Research Council (NRC) team working on the first long-term science plan for solar system exploration. “The planetary community is fractured, and we don't have a clear vision,” chimed in fellow budgeteer Brant Sponberg during the 15 November teleconference. “And that makes you guys very, very vulnerable.”

    The crisis in NASA's solar system effort is forcing a painful reckoning for researchers who study planets, moons, asteroids, and comets. “I shiver at the idea of this nation abandoning exploration of the outer planets,” says Wesley Huntress, an NRC panel member and geophysicist at the Carnegie Institution of Washington. Ironically, the trouble comes at a time when the field is booming. Public support is strong, and flotillas of U.S., European, and Japanese spacecraft are planned, on their way, or already gathering data in the far corners of the solar system. NASA's $800 million planetary science budget is slated to top $1.3 billion by 2005. Meanwhile, many U.S. companies, private and public labs, and individual scientists are scrambling to get a piece of an area that was once the domain of a few privileged institutions and researchers.

    But this expansion also comes in the midst of bitter rivalries among subdisciplines, friction between labs and their political backers, and rancor between Washington players like Isakowitz and bench scientists with their own agendas. “We're in a mess right now,” sighs Andrew Nagy, an NRC panel member and space physicist at the University of Michigan, Ann Arbor. Things came to a head last fall during a successful campaign by researchers to win congressional approval to revive a Pluto mission that NASA had shelved. The $30 million appropriation, a mere down payment on what could be a $500 million effort, angered NASA officials and their political paymasters, who don't think the nation can afford such a voyage. Now both the Pluto mission and a separate trip to study Jupiter's moon Europa are on the chopping block. Unless planetary scientists make some prudent choices, say OMB officials, the future of U.S. exploration beyond Mars is highly uncertain.

    Pluto and Pasadena

    Academic researchers, NASA officials, and policy-makers alike are counting on the NRC panel to rescue the field from that uncertainty by providing a set of scientific objectives and associated missions for the next decade on which everyone can agree. The NRC previously drew up long-term plans for astronomy and astrophysics, providing clear priorities that those communities have turned into an effective lobbying tool.

    On ice?

    The status of a Europa orbiter that would provide better closeups of the Jovian moon is unclear.


    In the past, planetary scientists have rather relished the rough-and-tumble sparring over NASA's budget. Just a few years ago, advocates of a Mars lander fought with those who preferred a mission to a nearby asteroid. The infighting seems inevitable, because one mission can never please everyone. “Solar system missions are the opposite of astronomy missions; they are narrowly tailored and specifically targeted,” notes Mark Sykes, an astronomer at the University of Arizona in Tucson. The Hubble Space Telescope may study the birth of stars, black holes, and extrasolar planets all in the space of a week, but a mission to Mars offers little of interest to a researcher studying gas giant planets. Even within the Mars community, a mission that focuses on geology would not attract those who study atmospheric chemistry or the magnetosphere.

    The stakes were raised in 2000, however, when NASA's then-new space science chief Ed Weiler learned about dramatic cost overruns on two missions—one to Pluto and one to Europa—planned by the Jet Propulsion Laboratory (JPL) in Pasadena, California. With the total cost jumping from $654 million to $1.49 billion, Weiler was forced to choose between the two. “I had no clear priority, so I used the best information I had,” he recalls, and halted work on the Pluto mission. There was also heavyweight support behind the Europa mission: The Galileo spacecraft had found evidence for an ocean beneath the icy skin of the Jovian moon, and the Clinton White House was intrigued by the possibility of life there, fueled by 1996 claims—still controversial—that a martian meteorite contained evidence of fossils.

    Weiler's decision infuriated backers of the Pluto mission. Public groups such as Pasadena's Planetary Society joined scientific advocates in arguing that Pluto was the more urgent target, because its orbit was taking it farther from the sun and its already thin atmosphere could freeze by 2020. A NASA advisory panel subsequently agreed with that assessment—if the costs of a Pluto mission could be held at $500 million. The White House refused to request Pluto money in its 2002 budget, although NASA continued preparations at the request of the Senate. Following an intensive lobbying campaign by Pluto backers, Congress added $30 million to keep the mission alive through 2002. Last month Weiler awarded a contract to planetary astronomer Alan Stern of the Southwest Research Institute in Boulder, Colorado, to run the mission, with Johns Hopkins University's Applied Physics Laboratory in Laurel, Maryland, to develop the spacecraft and instruments.

    Upward bound.

    NASA spending on planetary missions is slated to rise 50% between 2002 and 2006.


    The congressional move angered OMB deputy director Sean O'Keefe, who last month took over as NASA administrator, as well as Isakowitz and Sponberg. And NASA officials say they've learned a valuable lesson. “We should not eat our own,” says Colleen Hartman, NASA's solar system exploration chief. “That's what's happened with Pluto and Europa.”

    Weiler, meanwhile, says he has no intention of being placed in such an uncomfortable position again. He asked the NRC to come up with a decade-long plan that would force the science community to adopt a ranked list of projects and objectives.

    Paneling together

    Led by retired astronomer Michael Belton, the 16-member NRC steering group and its half-dozen panels went to work last summer and are due to complete the survey in May or June. The timing will coincide with a legislative debate on the fate of missions including Europa and Pluto. The panel's most pressing task will be to win consensus in a traditionally diverse field that lacks the hierarchical leadership of more established areas such as astronomy. “There is no planetary pope,” quips Sykes. He has helped involve some 350 researchers in the NRC study, ensuring that the community has a say—and a stake—in the final report.

    Nothing short of consensus will do. At the 15 November teleconference, the first one in which members talked with OMB officials, Isakowitz and Sponberg told the NRC panel that there is not enough money to go to both Europa and Pluto and that the community is undermining itself by lobbying Congress and opposing the president. “This is a free-for-all,” complained Sponberg. The Pluto funding approved by Congress, he added, “may have irreparably damaged planetary funding for the next several years.” Delaying Europa in favor of Pluto is “pretty unlikely,” he added, and the idea of letting Congress make the cuts necessary to fund Pluto is not very appealing, either. “Probably the most likely option,” he said, is to cancel funding for both Europa and Pluto.

    Panel members were appalled by that grim scenario. Belton tried to defend the push to fund Pluto, arguing that it was a natural response by a community that wants both missions. “What were we supposed to do,” he asked the budget examiners, “roll over?” Pluto's advocates also deny that their lobbying is selfish. “This is a groundbreaking mission that NASA advisory panels have put as their top priority,” says Stern, who will be principal investigator. Supporters, he notes, have simply been making use of “the democratic process.”

    Asteroid advantage

    Duly chastised by the White House, Belton's panel now must juggle a complicated array of scientific, technical, and political issues while remaining united. The sheer breadth of solar system science alone will make for very difficult choices, and every subdiscipline has pressing questions. Radar images of Mercury show evidence of polar frost on that fiery planet. A uniform resurfacing of Venus about 500 million years ago intrigues geologists. How much water ran free on Mars, and for how long, remains a subject of intense debate. In the outer solar system, Jupiter's 300-year-old storms baffle researchers, Saturn's moon Titan shows exciting evidence of organic molecules, and Neptune's high winds are a puzzle on a planet so far from the sun. The nature of Pluto's companion, Charon, remains a mystery. The density of comets is still not known, and the composition of asteroids is more varied than imagined.


    Jealousies are also aroused by the uneven distribution of past missions throughout the solar system. Some planets such as Mars and Venus have been visited frequently, prompting plans for landers and, eventually, sample returns with accompanying large costs. Other bodies such as Neptune and Uranus have only been briefly glimpsed by Voyager, and Pluto and Kuiper Belt objects—potential comets orbiting beyond Neptune—have only been observed through telescopes. Cheaper orbiters and flybys make more sense for that early stage of exploration. Meanwhile, discoveries from ongoing missions such as Cassini, due to arrive at Saturn in 2004, could open up new areas for research.

    White papers from the 20-odd community groups organized by Sykes have thrown up a huge list of possible missions and objectives, while JPL—the longtime leader in overseeing planetary projects—has proposed ambitious efforts ranging from a $500 million lunar sample return to a $1 billion mission to float a balloon over the possible methane ocean of Saturn's moon Titan. Survey members aim to come up with a list of scientific objectives and associated missions. The panel also wants to create a “Discovery-plus” program for missions costing on the order of $500 million that will be chosen by open competition. Belton says this proposal is still in draft form, but “I feel it will carry the day.”

    Comet and asteroid studies are likely to fare particularly well in the new report, thanks to nature as much as to their scientific value. Such bodies are easier and cheaper to reach than planets and don't require complex and expensive maneuvering to get spacecraft into orbit. “It looks like cost will drive things toward missions to small bodies,” says Michael A'Hearn, an NRC panel member and a comet astronomer at the University of Maryland, College Park.

    The report is not expected to contain calls for huge missions, including planetary sample returns. “Those will be in our vision of the future” beyond 2013, says panel member Nagy. “For the report to have an impact, it has to be realistic.” NASA's Hartman agrees: “We do not need, in the next decade, a $1 billion mission,” she warned the panel in November. “I don't believe I will be able to sell it.”

    The astrobiology gap

    To be successful, the survey team also must bridge a fundamental philosophical divide in the planetary community exposed by the Pluto and Europa dispute. Since 1996, with the backing of the White House, NASA has plowed ever larger amounts of funding—about $30 million for this year—into the nascent study of astrobiology. That field strives to combine research into life in extreme environments on Earth with study of potentially life-friendly places such as Mars and Europa.

    The White House's interest in the search for life also has led directly to a stable and politically valued Mars exploration program, with a price tag in excess of $500 million annually, as well as a green light for the Europa mission. “Biocentric arguments have tended to do well, and that has pulled the rest of activities along,” says Isakowitz.

    But astrobiology gets little respect from many traditional planetary scientists, who see it more as a creation of Washington politicians than as a legitimate research area. That was evident at the November meeting here during an astrobiology briefing by Bruce Jakosky, an atmospheric physicist at the University of Colorado, Boulder. “This is like teaching freshman geology,” he complained, as panel members leafed through newspapers or chatted quietly with other participants.

    The complaint against astrobiology is that the field is heavy on hype and light on results. “Are we selling packaging or content?” asks Sykes. Briefings to lawmakers about the Europa mission, he says, “leave them with the impression that [the spacecraft] will capture caribou walking across the ice.” He warns that overselling astrobiology could be disastrous.

    Such skepticism seems to hold the upper hand within the NRC panel. John Baross, an oceanographer at the University of Washington, Seattle, who co-chairs the survey's astrobiology panel, says that the topic will be integrated into the whole report rather than be a stand-alone chapter. “It's a shame,” Baross adds, because he believes that planetary science, now almost wholly dependent on NASA, could be enriched by funding from the National Science Foundation and the National Institutes of Health.

    Administration officials make no bones about their frustration with astrobiology skeptics in academia. “Decision-makers are excited by the possibility that we could revolutionize whole areas of science and our view of the universe” through astrobiology, said Sponberg. “That's really exciting.” He argues that the interest in astrobiology will benefit all aspects of planetary science. Weiler agrees. “It's really scary when OMB may have more vision than scientists,” he says. “The most important scientific discovery that could be made in this century is the discovery of life [elsewhere] in the universe.”

    Some researchers hope to find a middle ground that recognizes the political value of searching for extraterrestrial life without endangering the credibility of a scientific plan. “Both sides are right,” says Jonathan Lunine, a physicist at the University of Arizona and co-chair of the astrobiology panel. “There is a political aspect associated with astrobiology. But we are on the threshold of bringing different disciplines together, and this is an important new endeavor.” Hartman thinks that “the debate is couched incorrectly” and that astrobiology should be considered as one driver of the overall program.

    Planetary pope?

    Michael Belton hopes the NRC panel can reach consensus.


    Policing the future

    With the panel's survey now in full swing and Sykes collecting input from hundreds of researchers, participants are optimistic about their chances of coming up with a definitive decade-long plan. “The community is rising to the challenge,” says Belton. “And we've been able to communicate with a large fraction” of its researchers. The ultimate audience, however, won't be researchers: “The prime customers are NASA, OMB, and Congress,” he adds. It's an audience that scientists can't afford to ignore, Hartman warned the panel: “We're in a fight for scarce resources, a fight we are currently poised to lose.”

    Sykes says that the ultimate value of the survey would be to provide “long-term cover” for Washington officials like Weiler and Isakowitz, who must make tough decisions on planetary program spending. The community is not likely to respond favorably to threats or scapegoating, he notes, adding that attempts to kill healthy programs—such as Pluto—simply invite scientists to lobby influential backers. A good survey, he says, will do away with much of this tension by carving out a clear path.

    Sponberg agrees that the survey will be a critical element in solidifying support for planetary science. But he warned the panel that the report is only a first step—and that maintaining consensus will be a full-time job requiring strong leadership. Sykes is confident that the field is mature enough to take responsibility for its own future. “It has taken 40 years,” he says. “But now the community is big enough to do this.”


    Lab Rivalry Spices Up Solar System Exploration

    1. Andrew Lawler

    PASADENA, CALIFORNIA—The battle over who will build the next round of U.S. missions to explore the solar system is a classic match-up between the grizzled veteran and the young and hungry challenger. But the real winner, if NASA officials and scientists can be believed, will be science and the public.

    To many, the Jet Propulsion Laboratory (JPL) here is planetary science. Its star-studded cast of nearly two dozen missions includes Mariner 2, which flew past Venus in 1962 in our first encounter with another planet; the Viking orbiters, which mapped Mars in the mid-1970s; and the Voyager 1 and 2 missions now leaving the solar system.

    So when Maryland's Applied Physics Laboratory (APL) in Laurel, part of Johns Hopkins University and traditionally a Navy contractor, offered to build an asteroid-rendezvous mission in the early 1990s for less than $150 million, it was seen as something of an upstart. “Everybody laughed,” recalls Tom Krimigis, APL's space chief. APL got the NASA contract, for $120 million, after JPL engineers estimated it would cost them three times as much. Although controllers had to abort the first attempt at rendezvous, the Near Earth Asteroid Rendezvous (NEAR) spacecraft began orbiting Eros 2 years ago and last year landed on its surface.

    APL's bold proposal led NASA to create a competitive planetary program called Discovery. Last year alone, APL won two contracts to explore the far corners of the solar system, from sun-hugging Mercury to distant Pluto. This summer APL hopes to launch a payload that will fly by at least two comets. That's an impressive showing for a lab where only about one-fifth of the 3200 staff members are involved in space projects.

    On the other side of the continent, JPL is still struggling to cope with the new world of competition. The disastrous loss of three Mars missions during the 1990s tarnished the reputation of the lab, which is affiliated with the California Institute of Technology, and last year longtime director Ed Stone retired. NASA headquarters decided to open up portions of the Mars exploration effort to competition, and spiraling costs on the proposed Pluto and Europa missions sparked a political furor in Washington.

    Making history.

    JPL's Mariner 2 to Venus (bottom) provided the first closeups of another planet; upstart APL wants to get personal with Mercury.


    The crises have contributed to sagging morale and a sense of being under siege, say JPL employees. But Stone's successor, Charles Elachi, says the new competition should be seen as a sign of the lab's success, not failure. “We opened the frontiers of planetary exploration,” he says. “And like anybody who opens new frontiers, other people are going to follow.” Those include not just APL but also private companies such as Lockheed Martin and Ball Aerospace.

    JPL can draw on its 4 decades of experience, a $1.4 billion annual budget, and some 5300 people at its sprawling facilities in the Pasadena hills. The lab has two spacecraft orbiting Mars, another on its way to Saturn, and a fourth en route to a comet with hopes of bringing back material from its nucleus. It operates NASA's Deep Space Network, the critical link in every planetary mission, and loans out its crack team of navigators—even to help APL on the tricky NEAR mission. “Clearly, JPL continues to be the flagship lab for NASA's planetary exploration program,” acknowledges Krimigis. “We have no plans to duplicate JPL.”

    NASA space science chief Ed Weiler, who has criticized the Pasadena lab for underestimating mission costs, says, “Whether or not some people want to admit it, this country needs JPL.” That means building spacecraft as well as helping with navigation and communications. “I have to find ways to keep JPL doing real engineering science,” he says.

    Although APL and JPL are now cooperating on several missions, the underlying rivalry seems unlikely to lose its edge. That's in part because APL has strong congressional backing, thanks to Senator Barbara Mikulski (D-MD), who chairs the panel that funds NASA. Although APL director Richard Roca says he doesn't plan to ramp up the NASA-funded portion of his lab, the competition does serve as a useful tool for NASA managers to stimulate new ideas, shore up political support, and save money.

    That's good news for scientists, who are eager to fly more instruments. “It's great,” says one researcher who has worked with both labs. “The competition keeps people honest—and costs under control.”


    Researchers Fear Merger Could Muffle Their Voice

    1. Dennis Normile*
    1. With reporting by Andrew Lawler.

    As Japan plans to combine its two space agencies, researchers wonder how they will be heard

    TOKYO—Being small has its advantages. For nearly 4 decades Japanese space scientists have been allowed to call the shots on planetary exploration—setting the agenda and running their own missions. And the results have been impressive, including a string of successful probes studying the sun, Halley's Comet, and Earth's magnetosphere.

    But now the Institute of Space and Astronautical Science (ISAS), whose modest budget has funded the bulk of university-based research in the field, is being merged with Japan's giant National Space Development Agency (NASDA) and the National Aerospace Laboratory (NAL) as part of a sweeping streamlining of the nation's bureaucracy. Although there will undoubtedly be benefits to being part of a larger, more powerful agency, scientists are worried that the loss of independence will put science in the shadow of the more commercial aspects of space.

    “We're concerned that there will be a lack of visibility for space science once these organizations are merged,” says Takeo Kosugi, who heads ISAS's solar physics program and is also chair of the Space Research Committee of the Science Council of Japan, the nation's largest association of scientists. “We worry that if bureaucrats control the decisions, budget cuts will fall especially hard on space science.”

    NASDA is a very different beast from ISAS. It develops heavy-lifting rockets for launching weather and communications satellites and manages Japan's contribution to the international space station. It also dwarfs ISAS in size, with a current budget of $1.7 billion and 1090 employees compared with $223 million and 325 staffers for ISAS. Including NAL, whose 410 researchers use its $166 million budget to study fluid dynamics and other more technological problems, the merger will further tilt the new agency toward applied fields.

    But more troubling to researchers than NASDA's size is its culture. ISAS's missions are proposed by research groups and reviewed by committees of scientists and engineers. NASDA is run by bureaucrats charged with developing Japan's aerospace industry. NASDA has broadened its vision in recent years, using remote-sensing satellites to study long-term weather patterns and watch for signs of global warming. It is also collaborating with ISAS on the 2005 Selene mission to the moon, which will probe, among other things, its mineral composition, topography, and gravity field. But researchers still view NASDA as an organization whose priorities and missions are set at the top and are aimed at fostering commercial aerospace development.

    The merger will certainly provide some new opportunities. ISAS missions will be able to take advantage of NASDA's H-IIA rocket, with four times the lifting capacity of the institute's M-V rocket. Previously, cooperation between the two agencies was extremely difficult because they were affiliated with different ministries, which rigidly protected their turf.

    Slow mo.

    Launched in 1998, Nozomi overcame flight troubles and is set for a 2004 Mars rendezvous.


    Kosugi also believes that the merger might be an opportunity to revamp space science efforts. He thinks ISAS has outgrown its committee-based decision-making process, which he says worked well when the agency had just two major research groups, one for x-ray astronomy and one studying magnetospheres. But that constituency has grown in the past 2 decades to encompass radio astronomy, infrared astronomy, the moon, and other planets, each with its own slate of missions. “ISAS has grown to include too many groups and too many missions,” Kosugi says. He believes the community needs to agree on priorities in order to make the best use of limited resources. But as yet, it is completely unclear how decisions will be made in the new organization.

    Yoshihisa Nemoto of the Space Policy Division of the Ministry of Education, Culture, Sports, Science, and Technology, which oversees both ISAS and NASDA, says ministry officials are aware of the need for balance within the new agency. “Japan's space science, in certain areas, is world-class, and it would be a terrible shame if those efforts were not properly supported,” he says. “Discussions are going on over how to preserve the bottom-up process for space scientific research, but there has been no conclusion.”

    A merger preparation committee was due to release an interim report by the end of December. But no one is expecting it to resolve the fundamental issues. A proposed structure for the new agency is due out in March, with the merger to take place in fall 2003 or later.


    Tight Budget Makes for an Uncertain Future

    1. Michael Balter

    Europe's planetary scientists had grand plans for a series of missions, but politics is getting in the way

    PARIS—While NASA struggles to set its priorities for solar system exploration (see p. 32), European space scientists are grappling with a more fundamental question: Will they continue to be major players in the game? Recent budget cuts threaten both ongoing programs and new missions, forcing space agency officials to scale back their grand plans for the future.

    The European Space Agency's (ESA's) long-term science program includes robotic missions to Mars, Mercury, and Venus. But last November in Edinburgh, at a meeting to decide the agency's budget for the next 5 years, ESA's 15 member governments dealt the agency a double disappointment. The agency's council of ministers declined to boost the science budget—which has lost 15% in purchasing power since 1996—above inflationary levels. They also gave only grudging support to a new program called Aurora, which will map out a series of missions to search for traces of life in the solar system and to develop technology for human expeditions (Science, 23 November 2001, p. 1631).

    In response, last month the agency's Science Program Committee imposed a 6-month hold on a number of projects still in the pipeline so it can assess how many missions can be done with the allocated funds. “We are going into a period of reflection and analysis,” says physicist David Southwood, ESA's science chief, predicated on the assumption that “our budget is now about as high as it is going to be.” Among the projects now in jeopardy are the BepiColombo mission to Mercury and the Gaia astrometry mission, which will record the brightness and position of 1 billion stars in our galaxy. At a minimum, Southwood says, the agency will have to reduce the scientific scope of one or both missions.

    Hot trip.

    One European spacecraft will map Mercury, while a smaller companion will examine the planet's magnetosphere.


    The budget disappointment comes just as Europe's space scientists were hoping to strike a more independent pose vis-à-vis their international partners, especially the United States. “We cannot sit and wait for others to decide our role in what they decide to do,” says Franco Ongaro, program coordinator of Aurora, an umbrella program that researchers hope will take ESA's solar system exploration to new heights over the next 30 years. One goal, to put a European on Mars by 2025, would require many new technologies, says Ongaro: “We don't today have a credible scenario to put a man or woman on Mars and bring the person back alive.”

    ESA members aren't required to contribute to Aurora, which is classified as an optional program. In Edinburgh, for example, nine countries dug up only $12 million for a batch of preparatory studies, some $24 million short of the total ESA requested. Italy withdrew a $16 million pledge made by its previous government, and Germany opted out as part of an across-the-board retrenchment.

    The funding shortfall angers many planetary scientists. “It shows a lack of commitment,” says atmospheric physicist Alan Aylward of University College London. “Europe has the economy to be an equal partner in space with the U.S., but intergovernmental wrangling and national shortsightedness have always held back space development.” André Brack, an origins-of-life researcher at the University of Orleans in France, says scaling back Aurora flies in the face of “a huge interest in the search for extraterrestrial life and life-forms” by scientists and the general public.

    Scientists hope that Italy will reconsider its decision. Giovanni Bignami, director of space science at the Italian Space Agency (ASI), says that ASI's president, Sergio Vetrella, had just assumed the post before the Edinburgh meeting and did not have time to make his own assessment of Aurora's importance. “For the moment,” Bignami says, “Italy's role is very much reduced. It can only improve.”

    In the meantime, Ongaro says that $12 million is almost enough to fund the first 2 years of the 3-year preparatory period, which will lay down the program's overall strategy and its specific missions. That work, he says, should help to persuade other ESA partners to fund the third and most expensive year, which will focus on development of new technology and specific missions.

    Time is running short for ESA to make some definite decisions about current plans. If ambitious missions like BepiColombo or Gaia have to be sacrificed, Southwood says, ESA will “move off the gold standard” of leading space exploration programs. Ongaro agrees that the next step is critical: “We are walking the thin line between having a vision and living an illusion.”


    Technology Is Essential, But It's a Tough Sell

    1. Andrew Lawler

    Developing new technology is not as exciting as sending a probe into deep space, so NASA keeps relying on age-old hardware

    WASHINGTON, D.C.—The problem is twofold. First, investment in technology lacks the appeal of dramatic missions to a comet, asteroid, or planet. The Deep Space 1 mission, built for the bargain price of $160 million and turned off last month after a 3-year voyage, successfully tested an innovative ion engine as a more efficient alternative to chemical rockets—yet its visit to a comet is what captured public and scientific attention. More typical is Congress's decision in November to take nearly half of the $20 million NASA wanted to spend in 2002 on developing in-space propulsion systems and reassign it to construction projects. That shift will force NASA to scale back planned work on aerocapture—using the atmosphere as a natural brake for spacecraft—as well as on promising nuclear and ion engine research.

    A second problem lies in public and political nervousness over the use of nuclear fuel—whether to supply electricity for operating instruments or for propulsion. Missions far from the sun require more power and reliability than current solar arrays can provide and so use plutonium-powered electrical systems, euphemistically called radioisotope thermoelectric generators (RTGs). The plutonium fuel simply gives off heat, which the RTG easily converts into electricity. Equipped with RTGs, the Viking landers of the mid-1970s kept working into the early 1980s.

    In contrast, the solar-powered rover used in 1997 on Mars Pathfinder operated for only a month before martian dust obscured its solar panels. A similar fate awaits the rovers on the next Mars landing, scheduled for early 2004. That limitation worries NASA planners. “Without RTGs, we're not going anywhere,” says Colleen Hartman, NASA solar system chief. “It's number one on our tech list; nothing else comes close.”

    It's number one for a very good reason. Only two RTGs are left, and demand exceeds supply: One is needed for a Pluto mission and two for a Europa flight, although the future of both missions is in question. To start up a new line, NASA must negotiate with the Department of Energy, which is responsible for overseeing construction of the generators and finding the necessary plutonium-238 fuel. One source is Russia, which has an agreement to sell plutonium to the United States at $2 million per kilogram.

    Plutonium power?

    The Cassini mission to Saturn uses radioisotopes, but protesters worry about an accident that could spread deadly radiation.


    Until recently, there seemed to be no solution to the RTG shortage. The Clinton Administration frowned on the use of nuclear fuel, and activists have waged a bitter, although ultimately unsuccessful, battle against spacecraft such as Saturn-bound Cassini that carry RTGs. They worry that an accident during launch or during an Earth flyby could expose the planet to deadly plutonium.

    And nuclear propulsion would almost certainly face similar opposition. “There's no doubt it would allow us incredibly quick trip times, but we have to wrestle with severe political issues,” says Wesley Huntress, a geophysicist at the Carnegie Institution of Washington. A former NASA space science chief, Huntress is leading the technology panel for the National Research Council's (NRC's) solar system survey due out in the spring (see p. 32).

    The arrival of a Republican Administration could herald a new day for nuclear electric power and, perhaps, even propulsion. “We are not afraid to use the 'N' word anymore in Washington,” says NASA space science chief Edward Weiler. Although White House officials declined to comment on the topic, Weiler adds that it is no longer inconceivable that the president could support a 10-year NASA plan to spend $1 billion developing nuclear-based power systems. The first sign of such support could be in the 2003 budget request to be released next month.

    Charles Elachi, director of the Jet Propulsion Laboratory in Pasadena, California, recently argued that advanced propulsion would give NASA more time to develop a Pluto craft and still reach the planet by 2020, after which time its atmosphere is likely to be frozen for decades. With new systems such as the one demonstrated by Deep Space 1, he says, “you can go to Pluto anytime.” But many researchers are skeptical. They are unwilling to let go of a mission in hand, now tentatively set to launch in 2006, for a vague promise of high technology in the future.

    Academic and NASA officials agree that part of the problem is cultural. Engineers and scientists simply don't talk to each other enough. NASA high-tech funding typically flows to aerospace companies with few ties to academic institutions, and universities spend too little time communicating their scientific needs to industry. However, all sides agree on one thing: The NRC survey must make a strong case for the importance of new technology, even at the risk of jeopardizing some near-term missions, if scientists are to have any chance of powering future missions with something better than what is already on NASA's shelf.

  13. The Runts of the Cosmic Litter

    1. Robert Irion

    Brown dwarfs and other substellar bodies behave like giant planets, but most of them may form like stars

    It takes a lot of cold gas to make a hot star. A cloud of gas and dust must start out cold for gravity to overcome the cloud's thermal unrest. Once it collapses, the gas must get hot and dense enough to ignite hydrogen fusion at its core. The newborn star then throws off its blanket with a gusty wind, having gathered enough gas to ensure a warm and stable life.

    But plenty of objects never reach that critical phase. Lighting the fires of fusion requires a ball of gas at least 75 times the mass of Jupiter, or about 7% of our sun's heft. Anything below that threshold simply cools down for billions of years, like an ember on the hearth, until it vanishes from sight.

    These cosmic castoffs, called brown dwarfs, eluded detection for years. But now, searches have turned up so many—more than 200 by the latest count—that astronomers expect brown dwarfs to fill gaps in their theories about the origins of the smallest stars. “Brown dwarfs straddle the realm between stars and giant planets,” says astrophysicist Adam Burrows of the University of Arizona in Tucson. “It's the last great chapter of stellar astronomy.”

    Here on the fringes of planetary science, astronomers see signs that brown dwarfs arise by means of surprisingly puny versions of the processes that create stars like our sun. Surveys reveal a smooth distribution of the numbers of brown dwarfs from 75 Jupiter masses down to about 10 Jupiter masses, suggesting that kernels of gas collapse into a range of substellar sizes. Some young dwarfs seem shrouded by dusty disks, just as their stellar cousins are, pointing to similar origins. In other cases, protostellar clouds appear to split into smaller chunks, making multiple embryos that compete for infalling gas like chicks in a nest. Gravitational skirmishes among these siblings may eject the smallest ones before they collect enough gas to fledge as bona fide stars.

    On the other hand, a few of the scores of known planets outside our solar system look more like brown dwarfs than planets. For example, two titans circle the star HD 168443, tipping the scales at a minimum of 7 and 17 Jupiter masses. Some free-floating “planetary-mass objects” as light as 5 Jupiter masses have surfaced in surveys of young star clusters. At the moment, the birthplaces of these nomadic gas giants are impossible to trace.

    “It's a confusing picture,” says astronomer J. Davy Kirkpatrick of the California Institute of Technology in Pasadena. “It could be that things we think of as brown dwarfs can form by either process, at the high-end tail of planet formation or at the low-end tail of star formation.” Those distinctions are crucial to theorists grappling with the extremes of both scenarios. As astronomer Bo Reipurth of the University of Hawaii, Manoa, puts it: “Brown dwarfs do not represent a few pathological objects that form under rare circumstances. They must be integrated within a general understanding of star birth.”

    Dwarf toss.

    Simulations of prestellar clumps within a collapsing cloud of gas suggest that the runtiest ones get flung into space, dooming them to lives as brown dwarfs.


    Additions to the alphabet

    That integration won't come quickly, for it wasn't many years ago that brown dwarfs existed only in theory. Searches in the 1980s and early 1990s unveiled candidates, but on closer examination they all morphed into stars or observational glitches. “It was frustrating,” recalls astronomer Sandy Leggett of the United Kingdom Infrared Telescope on Mauna Kea, Hawaii. “People began to say that star formation must know about the hydrogen-burning mass limit, because we weren't finding anything smaller.”

    But in 1995, astronomers confirmed the first brown dwarfs by spying the telltale imprints of fragile substances in their faint spectra of light. The gases, either lithium or methane, are destroyed by the heat of genuine stars but can persist in the cooler confines of brown-dwarf atmospheres. At the same time, other teams found the first giant planets circling sunlike stars. “It was suddenly very clear that nature has no problem manufacturing substellar objects,” says astronomer Gibor Basri of the University of California, Berkeley.

    Subsequent hunts within young stellar clusters have flushed out droves of brown dwarfs. Prime targets are the Pleiades cluster and star-forming regions in Orion and Taurus, where substellar bodies rival stars in number. Other teams have pored over data from broad surveys of the sky to find lone dwarfs among the stars. The most fruitful projects are the Sloan Digital Sky Survey in optical light and two ambitious projects in infrared wavelengths: the Deep Near Infrared Survey in southern skies and the 2-Micron All-Sky Survey in both hemispheres.

    Spectra of these objects show that as brown dwarfs age, they look like hot planets rather than cool stars. For a few million years, bodies with at least 13 times the mass of Jupiter can fuse deuterium, an isotope of hydrogen, at their cores. (This nuclear capability led an international panel* to declare last year, after prickly debate, that objects heftier than 13 Jupiter masses are “brown dwarfs” no matter where they are, whereas smaller objects are “planets” if they circle stars and “sub-brown dwarfs” if they drift through space.) The deuterium glow is temporary, however. When it expires, the dwarf bleeds heat as it contracts. Ultimately, it slowly cools like a desert rock after sunset.
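    The panel's definition amounts to a simple decision rule. A minimal sketch, with the 75-Jupiter-mass stellar cutoff from earlier in the story added for completeness (the function name and exact boundary handling are ours):

    ```python
    def classify(mass_in_jupiters: float, orbits_a_star: bool) -> str:
        """Apply the panel's definition described above: objects heftier
        than 13 Jupiter masses (the deuterium-fusion threshold) are brown
        dwarfs wherever they are; lighter bodies are planets if they
        circle stars, sub-brown dwarfs if they drift through space."""
        if mass_in_jupiters >= 75:   # hydrogen fusion ignites: a true star
            return "star"
        if mass_in_jupiters > 13:    # deuterium fusion only
            return "brown dwarf"
        return "planet" if orbits_a_star else "sub-brown dwarf"

    print(classify(17, orbits_a_star=True))   # → brown dwarf
    print(classify(5, orbits_a_star=False))   # → sub-brown dwarf
    ```

    By this rule, the heavier of HD 168443's two companions would count as a brown dwarf despite orbiting a star, while the free floaters of 5 Jupiter masses would be sub-brown dwarfs.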

    The slow chill alters brown dwarfs' atmospheres, strange molecular stews in which compounds form and “rain” downward as the dwarf evolves. Clouds of condensed silicate minerals dominate at 2000 kelvin, about 4000 degrees cooler than the sun. By 1400 K, those solid grains settle deep in the atmosphere, leaving clearer layers ripe with methane, water vapor, and alkali metals such as sodium and potassium. The sodium absorbs most of a dwarf's yellow light. Because yellow is a key part of the color we perceive as “brown,” Burrows observes that the objects are woefully misnamed. “They're actually purple or magenta,” he says. “You can't get them to be brown.”

    More exotica await below 800 K, the temperature of the coolest dwarf yet seen. Water clouds condense at about 500 K, and ammonia droplets appear at 300 K. This should sound familiar, because planetary scientists probe the coolest of those layers within the closest dwarf wannabe: Jupiter. “In brown dwarfs, we're seeing the early phases of Jupiter's evolution and its interior today,” Burrows says.
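    The cooling sequence described in the last two paragraphs (silicate clouds near 2000 K, clearing by 1400 K, water clouds near 500 K, ammonia near 300 K) can be summarized as a simple lookup. The temperatures are the figures quoted in the text, treated here as sharp thresholds purely for illustration:

    ```python
    # Dominant atmospheric condensates as a brown dwarf cools,
    # per the temperatures quoted above (boundaries are illustrative).
    SEQUENCE = [
        (1400, "silicate mineral clouds"),
        (500, "clear layers of methane, water vapor, and alkali metals"),
        (300, "water clouds"),
        (0, "ammonia droplets"),
    ]

    def dominant_condensate(temp_kelvin: float) -> str:
        for floor, phase in SEQUENCE:
            if temp_kelvin >= floor:
                return phase
        return SEQUENCE[-1][1]

    print(dominant_condensate(2000))  # → silicate mineral clouds
    print(dominant_condensate(300))   # → water clouds
    ```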

    This changing cast of compounds produces spectra utterly different from those of stars, forcing astronomers to amend the “O B A F G K M” spectral classifications devised by Annie Jump Cannon at the Harvard College Observatory in 1901. So far, the new entries are “L” for warmer dwarfs and “T” for cooler ones—spoiling the famed mnemonic, “O Be A Fine Girl, Kiss Me.” “Here I am 100 years after my heroines classified stars, and I'm getting a chance to do what they did,” Leggett says. “It's such a thrill.”


    The Trapezium, a star-forming region in Orion, is studded with probable brown dwarfs (circled). Hubble Space Telescope images show that some have dusty disks (bottom three).


    As astronomers continue to scan the heavens, their censuses are bound to turn up many more substellar objects in our galaxy. “There could be several hundred unknown brown dwarfs within 25 parsecs [80 light-years] of the sun,” says astronomer Eduardo Martín of the University of Hawaii, Manoa. Contrary to earlier hopes, however, those hidden neighbors probably don't carry enough weight to account for much of the galaxy's missing dark matter. In clusters and star-forming regions, brown dwarfs probably make up just a few percent of the total mass, Martín notes, although untold numbers of them may orbit through the Milky Way's extended halo of stars.

    Clues to the womb

    Instead of focusing on the galaxy's “missing mass,” astronomers now view brown dwarfs as missing links in low-mass star formation. Recent studies offer intriguing hints about their role.

    One clue is the prevalence of dusty disks around brown dwarfs. Such remnants of protostellar nurseries often linger around infant stars. A team led by graduate student August Muench of the University of Florida, Gainesville, studied dozens of candidate brown dwarfs in the Trapezium, a crowded star-forming swarm in Orion. Although the team members didn't see disks directly, excesses of infrared emission suggest that dust clouds around most of the dwarfs absorb energy from the newborn objects and reradiate it as heat. The ubiquity of disks indicates that small stars and substellar pygmies are born in the same way, Muench and his colleagues maintain.

    Basri of Berkeley agrees that the births of brown dwarfs probably cut handily across the mass bins into which astronomers have placed them. His team's studies of newborn dwarfs in Orion and Taurus bear that out. Emissions from substellar objects have the same spectral patterns as those from low-mass stars, just at a smaller scale. “They look very much like wimpier versions of stars forming,” Basri says. “It's nothing weird at all.”

    Muench draws the same conclusion from the statistics of the ongoing tally of brown dwarfs and extrasolar planets. Astronomers presume that giant planets arise in disks of gas around their parent stars, a process quite distinct from star formation. The planets may accumulate mass by adding gas onto seeds of ice and rock, or they may collapse gravitationally within their own dense vortices of gas. However, if many substellar objects formed in those planetlike ways, one might expect a notable hump in the numbers of low-mass brown dwarfs, Muench says. Instead, the parsing of masses seems evenly spread, down to the range of 10 to 20 Jupiter masses. “There's no evidence for multiple mechanisms,” he says. “It's a consistent picture: Brown dwarfs form from protosubstellar cores.”

    Surveys aren't as clear about bodies weighing in at a few Jupiters, whose masses are hard to determine. If deeper images expose many more, it would bolster the views of some theorists that chaotic interactions within young planetary systems fling such bodies into interstellar space. Simulations do suggest that gravitational whiplash can expel planets, but the most common victims are smaller objects such as Saturn and Earth. For that reason, theorist Alan Boss of the Carnegie Institution of Washington, D.C., thinks planetary ejection goes only so far. “The masses don't quite agree,” Boss says. “If you're trying to explain free floaters as big as 5 to 10 Jupiters, you'd need stars with companions of 20 to 50 Jupiter masses to kick them out. We just don't see those.”

    Rather, Boss favors another scenario to cast most lone dwarfs into space: expulsions from nests of substellar embryos. This provocative idea belongs to Hawaii's Reipurth, who believes that most brown dwarfs are losers in battles for food among many siblings. If such events occur, they would inject randomness into a process that most astronomers have viewed as smoothly evolving.

    Reipurth bases his hypothesis on observations of 14 “giant Herbig-Haro flows,” great outpourings of gas from certain newborn stars. The Hubble Space Telescope pinpointed the apexes of the long, energetic jets. With help from ground-based radio images, Reipurth and others determined that 12 of them had two or more stars embedded inside. “These flows are a fossil record of small, unstable systems, which eject low-mass objects,” Reipurth says. “If an embryo is ejected before it has accumulated enough mass to burn hydrogen, it will forever remain a brown dwarf.”

    One snag is that models of star-forming regions have strained to produce anything resembling a tightly clumped nest of small gaseous cores. It seemed that a single object—or at most, a binary—would condense from the infalling deluge. However, research conducted last year by Boss may provide an answer. Magnetic fields lacing through a protostellar cloud may fracture a disk of gas into at least four pieces, Boss says. The initial core of each fragment might be as small as 1 Jupiter mass—about 1/10 the size that previous nonmagnetic simulations had predicted. “For now, it's a hand-waving idea,” Boss admits.

    Observations of stellar cradles with the Space Infrared Telescope Facility, due for launch in July 2002, may discriminate among these and other notions. In the dim, magenta-tinged realm of brown dwarfs, astronomers will need a bright red flag to rule anything out.

  14. The Quest for Population III

    1. Robert Irion

    Astronomers may soon see traces of the first stars in the universe—and they're likely to be whoppers

    Most creation stories begin with the first rays of light. As astronomers tell the tale, the brilliance of the big bang faded to a black murk for at least 100 million years. Gravity pulled gas into clumps, but nothing shone. Then, somewhere, the nuclear fires of the first star cast light into the void. That event marked the end of what Cambridge University astronomer Martin Rees calls the cosmic “dark ages,” and it started a cycle of star birth and death that transformed a simple broth of gas into the complex stew of elements we see today.

    The primordial ancestors of today's stars have long since vanished from our sector of space-time. However, recent research promises a glimpse of the first stars. Simulations of collapsing clouds in a starless universe predict that gigantic stars formed, each containing about 100 times more gas than our sun. Most of those titans lived fast and died so explosively that new telescopes should be able to see them as supernovae or gamma ray bursts at the margins of the visible universe. By spewing a unique blend of chemical ingredients into space, the first stars probably spawned a second generation of lighter objects—some of which may linger today.

    Big star, big boom.

    Simulations of the first star point to a seed dozens of times more massive than our sun (top) and an explosive death that disrupts its entire halo of gas (bottom).


    Moreover, astronomers are devising ways to explore the neighborhoods in which the first stars blasted through life. Searchlight beacons from the most distant quasars have lit the edges of a key region within which ultraviolet (UV) light from the earliest stars ionized the universe. Other astronomers are using the intense gravity of galaxy clusters as lenses to magnify tiny parts of the universe, exposing shreds of light from the first infant galaxies in the throes of formation.

    Within a decade, the Next Generation Space Telescope (NGST) should see those dim objects clearly. But for now, astronomers are thrilled to open the curtain on the cosmic stage for a mere glimpse of the first act. “Mankind has always been interested in how the first light was produced,” says astronomer Abraham Loeb of the Harvard-Smithsonian Center for Astrophysics (CfA) in Cambridge, Massachusetts. “Now we have the opportunity to address that question with scientific tools for the first time.”

    Doomed giants

    Loeb and his colleagues are pursuing “Population III” stars, which contain only the pristine mixture of hydrogen, helium, and a dash of lithium created in the big bang. The name extends the stellar categories invented 50 years ago by German astronomer Walter Baade: Population I, stars such as our sun with ample elements heavier than hydrogen and helium, and Population II, rarer stars with few heavy elements. These “metal-poor” stars, as astronomers call them, are ancient, but they still contain a smattering of elements such as carbon, oxygen, silicon, and iron from primordial stars that preceded them.

    Astronomers have never seen a pure Population III star, despite years of combing our Milky Way galaxy. The best they have managed is to find a few “extreme Population II” stars, in which iron is only 1/10,000 as abundant as in our sun. To make the leap to Population III, some researchers are using silicon—not in stars, but in the chips of supercomputers.

    The setting in which the first structures formed is surprisingly tractable to model, says astrophysicist Michael Norman of the University of California (UC), San Diego. “It struck me as curious that the whole field of structure formation was focused on clusters and superclusters today, which are horribly complex,” Norman says. “From a physics standpoint, things are simpler as you go back to the early universe.”

    Specifically, hydrogen and helium are easier to understand than the dozens of elements that drive star formation now. No other stars roiled the pot, and magnetic fields were negligible. Using a modeling method called “adaptive mesh refinement,” Norman's team tracks a cloud of gas in a simulated cube of space 400,000 light-years across as it collapses into a ball just 100 times as large as the sun. “If you think of the universe as the size of Earth, our calculation can resolve a single red blood cell,” says astrophysicist Tom Abel of Pennsylvania State University, University Park.

    On page 93 of this issue, Abel, Norman, and astrophysicist Greg Bryan of Oxford University report their latest results. Contrary to some expectations, the collapsing gas cloud does not split into myriad small stars. Rather, a single giant star forms, sucking in gas at a startling rate. The likely final mass of the star is between 50 and 300 times the mass of our sun, Abel says.

    Another set of simulations by astrophysicist Volker Bromm of CfA and by Paolo Coppi and Richard Larson of Yale University in New Haven, Connecticut, yields a similarly hefty first generation. The reason, Bromm says, is that in order to contract into stars, gas clouds need to cool dramatically, but molecular hydrogen in the early universe can't cool the gas below a relatively balmy 150 kelvin. Dust and heavier molecules, such as carbon monoxide, radiate heat so efficiently that modern star-forming clouds plunge to 10 K or so. That leads to tighter knots of gas and smaller stars.

    The giants of the early cosmos were raging bonfires of fusion, consuming their fuel in a few million years. Their deaths weren't ordinary, either. Calculations by astrophysicists Alexander Heger and Stan Woosley of UC Santa Cruz and their colleagues predict that primordial stars between 140 and 260 times the mass of our sun exploded as extraordinarily brilliant supernovae. Their flares of light, stretched into infrared wavelengths by the expansion of the universe, should be spotted by NGST—and perhaps by the Space Infrared Telescope Facility, due for launch in July 2002.

    A subset of those first detonations may have triggered gamma ray bursts, the most energetic events in the cosmos. “We should be able to see gamma ray bursts no matter how distant they are,” says astronomer James Rhoads of the Space Telescope Science Institute in Baltimore, Maryland. One reason is that bursts and their fireball “afterglows” are brighter than supernovae across most of the spectrum, Rhoads says, and gamma rays zing through gas and dust. A second reason stems from the time-dilating tricks of relativity. Primordial stars recede from Earth so quickly that when their bursts reach us, they seem to last much longer than if the stars were nearby. That makes it easier to catch them in the act.
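    The time-dilation argument boils down to the cosmological stretch factor (1 + z), where z is an object's redshift. A one-line sketch; the redshift and duration chosen are illustrative, not taken from the article:

    ```python
    # Cosmological time dilation: an event's apparent duration is
    # stretched by (1 + z), where z is the source's redshift.
    def observed_duration(rest_duration_s: float, z: float) -> float:
        return rest_duration_s * (1 + z)

    # A burst lasting 10 seconds in its own frame, seen at an
    # illustrative redshift of z = 6:
    print(observed_duration(10, 6))  # → 70 seconds as measured on Earth
    ```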

    Ultradistant bursts may be detected soon. A planned gamma ray satellite called Swift, to fly by the end of 2003, should detect about one gamma ray burst per day. Swift will alert telescopes on the ground to look for afterglows. Analysis will reveal how far away the bursts are, providing clues about when and how often the earliest stars blew up.

    Although the prospect of spotting remote explosions is tantalizing, other Population III enthusiasts are looking closer to home. Extreme Population II stars in the extended halo of our galaxy may be just one step removed from the firstborn stars. “We believe we see the chemical history of the first generation written into low-mass second-generation stars that are still burning,” says astronomer Timothy Beers of Michigan State University in East Lansing. The few Milky Way stars that have 1/10,000 of the sun's allotment of heavy elements may record “single pollution events”: a spray of metal-laden gas from the death of just one giant star.

    Heger and Woosley's team predicts that such a chemical fingerprint should be rich in silicon and barren of metals heavier than zinc. Unusual neutron physics within the first stars also should expel a pronounced “odd-even pattern” of elements, producing 10 to 100 times more mass of the even-numbered elements in the periodic table than of their odd-numbered neighbors. Spectral studies of ancient stars in the Milky Way haven't turned up anything so distinctive, Beers notes, but the search continues.

    Distant windows

    As the first stars forged heavy elements, they also scorched their surroundings with fierce UV light. This radiation stripped electrons from the hydrogen atoms that had formed 300,000 years after the big bang from the universe's initial hot bath of protons and electrons. By “reionizing” the universe, the earliest stars forever changed how light travels through intergalactic space, because neutral hydrogen atoms absorb the UV light from young stars.

    Arc lights.

    The powerful gravity in galaxy cluster Abell 2218 magnifies and distorts remote objects—including a faint star-forming blob 13 billion light-years away (inset).


    Reionization didn't happen all at once, says astronomer S. George Djorgovski of the California Institute of Technology (Caltech) in Pasadena. Rather, patchy islands of ionized hydrogen formed around the first stars within a sea of neutral hydrogen. “It's a phase transition, like the bubbles in boiling water,” Djorgovski says. “They grow and eventually overlap, and the whole thing turns into steam. That's the response of the gas filling the universe to the light from the first sources.”

    Four months ago, a team claimed that the most distant quasar yet known lies just beyond the final fringes of reionization. The quasar, detected by the Sloan Digital Sky Survey, arose about 1 billion years after the big bang. Neutral hydrogen between us and the quasar completely absorbed some of the quasar's light, according to astronomer Robert Becker of UC Davis and Lawrence Livermore National Laboratory in California and his colleagues. The amount of light that made it through, however, implies that most of the neutral hydrogen must lie still farther away, in a region within which the universe's first reionized bubbles are hidden. “None of us believe we are seeing all the way back to a fully neutral universe,” Becker acknowledges.

    Total reionization required a lot of UV radiation, most likely from small assemblages of stars rather than single giant stars. However, those first stellar groupings may have been so small—like the globular clusters that swarm around the Milky Way today—that even deep exposures by the Hubble Space Telescope can't detect them. Fortunately, nature provides another means: gravitational lensing.

    First predicted by Albert Einstein as part of his theory of general relativity, gravitational lensing occurs when something massive bends the light of a more distant object into a smear or a multiple image. The biggest clusters of galaxies in the cosmos are adept at this optical wizardry. One of them, called Abell 2218, is laced with ghostly arcs from the warped images of more distant galaxies. A survey of the most highly magnified slices of sky behind Abell 2218 recently exposed two tiny red jewels.

    A team led by Caltech astronomer Richard Ellis, director of the Palomar Observatory, found the red patches with the 10-meter Keck Telescope in Hawaii. Spectra confirmed that they were images of the same small system of stars, split and magnified about 30 times by gravitational lensing. The type of light emitted by the stars suggests that the system is hot and nearly newborn. “We believe we're seeing stars that were less than 2 million years old” when they emitted their light, Ellis says, at a time less than a billion years after the big bang.

    Ellis doesn't claim that the stars in his vigorous but minuscule system are the long-sought Population III. “We can't yet rule out that it arose from a second-generation event,” he says. “We're dying to know whether there's any evidence of an underlying older stellar population that we can't see.” The team has received more time with Hubble to stare at the patches for signs of such parent stars.

    Even if Population III stars elude these various techniques, astronomers feel confident that NGST will succeed. “NGST is the optimum instrument to view the first objects directly,” says CfA's Loeb. “It's perfectly tuned to the right wavelengths to observe the early universe.” Those infrared signals—faint wafts of heat straggling across 14 billion light-years of space—may illuminate the dark ages once and for all.