News this Week

Science  27 Sep 2002:
Vol. 297, Issue 5590, pp. 2184

    Subtle Signals in Ancient Light Promise New View of Cosmos

    1. Charles Seife

    CHICAGO, ILLINOIS—A new chapter in the study of the early universe opened last week. At a meeting here,* physicists from the University of Chicago announced the first detection of polarization in the cosmic microwave background (CMB), the light left over from the very early ages of the cosmos. The much-anticipated result is squarely in line with theoretical predictions, a sign that physicists' theories about the makeup and evolution of the universe are on track. “It is a dramatic confirmation of the ideas in the cosmological standard model,” says Wayne Hu, a cosmologist at the University of Chicago who did not take part in the work. “If they didn't see it, we couldn't say we know how atoms form. It's as fundamental as that.” The achievement also heralds the start of a new approach for deciphering the CMB.

    The results came from the Degree Angular Scale Interferometer (DASI), a microwave telescope based near the South Pole. DASI peers into the heavens for light that scattered off the first atoms, which formed when the universe was 400,000 years old. Detected in 1965 by Arno Penzias and Robert Wilson of Bell Labs, the CMB, stretched and attenuated by 14 billion years of travel, looks much the same in all directions except for faint hot and cold spots caused by the sloshing of matter in the early universe. In 2000 and 2001, sensitive microwave telescopes, including DASI, made a splash by producing exquisitely detailed maps of those temperature variations (Science, 4 May 2001, p. 823). From those maps, scientists divined the amount of matter and energy in the universe and showed, beyond a reasonable doubt, that the “shape” of spacetime was flat (Science, 28 April 2000, p. 595).

    Polarization, the “orientation” of the incoming light, adds another dimension to the map of the CMB. “It's going to triple the amount of information that we get from the [CMB],” DASI team member John Kovac, a cosmologist at the University of Chicago, said in a statement released at the meeting. “It's like going from the picture on a black-and-white TV to color.”


    DASI telescope (bottom) detected polarization of cosmic radiation (black lines, top).


    Theorists had long suspected that light from the CMB is polarized. Because of the Doppler effect, atoms hurtling through the early universe would have scattered light differently depending on whether the rays were heading toward or away from the moving atoms. That differential leads to a preferred polarization in the scattered light. The faster a clump of matter is moving, the more pronounced that preference. Because polarization, unlike temperature fluctuation, is not distorted by gravitational kneading as it travels through space, the pattern of polarization in the microwave sky could, in theory, give cosmologists an even sharper picture of the early universe than temperature maps can reveal. But that possibility was just a dream until physicists could detect that polarization, which is only a tenth as strong as even the tiny temperature fluctuations.

    The dream is now becoming a reality. “We've detected polarization at a high level of confidence,” says John Carlstrom, the leader of the DASI team. The team claims a “five sigma” detection, a confidence level that physicists consider extremely convincing. “They didn't just kind of maybe see it; they really saw it,” says Max Tegmark, a cosmologist at the University of Pennsylvania in Philadelphia. “I was very, very psyched.” Physicists had feared that the faint signal might be swamped by polarized light from other sources, Tegmark says. “The Achilles' heel of the whole polarization challenge is polarized space junk that could completely hose everything,” he says, adding that the lack of noise “bodes really well for the future of the field.”
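For readers unfamiliar with sigma levels, a back-of-the-envelope sketch of why physicists find “five sigma” so convincing. This assumes a Gaussian noise model, which the article does not spell out; the arithmetic, not the DASI analysis itself, is what's illustrated:

```python
import math

def sigma_to_p(sigma):
    """Two-sided Gaussian tail probability: the chance that noise alone
    would produce a fluctuation at least `sigma` standard deviations out."""
    return math.erfc(sigma / math.sqrt(2))

# A "five sigma" detection: the false-alarm probability is well under one in a million.
p = sigma_to_p(5.0)
print(f"5-sigma false-alarm probability: {p:.1e}")  # about 6e-7
```

By contrast, a two-sigma result (false-alarm probability around 5%) would barely raise an eyebrow, which is why the team's claim counts as a genuine detection rather than a hint.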

    That future promises to be bright indeed. Ever-more-sensitive observations by DASI and its rivals BOOMERANG, CAPMAP, MAXIPOL, and a handful of other telescopes in operation or scheduled to begin observing within the next few years should give physicists an incredibly accurate picture of how the matter in the early universe was moving under the influences of radiation and gravity. That information will likely reveal precisely what the cosmos is made of by revealing the balance of matter and energy in the primordial plasma and also help clarify the laws of physics that set the matter in motion. And another 10-fold improvement in sensitivity, which will be within reach of the Planck satellite, scheduled to be launched in 2007, might even reveal ancient gravitational waves, which would shed light on the very earliest moments of the universe. “We'll be probing the universe at 10^−30 of a second,” says Carlstrom. Although the DASI results are but a first step, cosmologists hope that the polarization of the CMB will contain the universe's baby pictures.

    • *Cosmo-02, 18–21 September.


    California Flashes a Green Light

    1. Constance Holden

    On 22 September, California Governor Gray Davis signed the nation's first state law explicitly allowing scientists to derive human embryonic stem cell lines as well as to clone embryos to study and treat diseases in what is known as nuclear transfer research. The measure, sponsored by state Senator Deborah Ortiz (D-Sacramento), “is going to make a huge difference” in directing state resources toward such research and luring scientists to California, says stem cell researcher Irving Weissman of Stanford University, who helped draft the legislation. National regulations currently prohibit the use of federal funds, but not state or private money, for such research.

    The measure permits “research involving the derivation and use of human embryonic stem cells, human embryonic germ cells, and human adult stem cells from any source, including somatic cell nuclear transplantation,” with regulation by institutional review boards. The bill might also boost the supply of embryos available for research by calling on fertility doctors to inform patients of this option. Davis also signed a second bill making permanent a temporary ban on reproductive cloning.

    Golden opportunity.

    California Governor Gray Davis signs stem cell research bill.


    The nuclear transfer bill does not provide new funds for stem cell research, but an aide to Ortiz says it is expected to “open up” existing state research funding programs by signaling the importance of stem cell research. The national debate has had an “enormously chilling effect” on the research, says Susanne Huttner, associate vice provost for research in the University of California (UC) system and head of the Industry-University Cooperative Research Program. She plans a mailing to alert scientists to the new measure.

    The bill's greatest value, says Weissman, is its support for nuclear transfer. “There's a whole area of research that's been sitting there, and people have been afraid to do it,” he says. Scientists want to be able to both create a cell line using a nucleus from a cell of a person with a genetically transmitted disease and distribute the cells to other researchers. “Under this law,” he notes, “it'll be done in California.”

    Fittingly enough, UC San Francisco last week sent out its first batch of long-awaited stem cells to nine U.S. investigators. A U.K. shipment is next, says spokesperson Jennifer O'Brien.


    Drug Find Could Give Ravers the Jitters

    1. Constance Holden

    “Ecstasy” is a cruel misnomer for the party drug (±)3,4-methylenedioxymethamphetamine (MDMA). Long known to disrupt neurons that communicate via the neurotransmitter serotonin, the controversial drug now appears to have even more potential for roughing up the dopamine system.

    A quintessential social drug, ecstasy heightens sensations, gives a euphoric rush, and creates feelings of warmth and empathy. It ostensibly achieves this effect by causing neurons to spurt huge quantities of serotonin. In the immediate aftermath, the partier is temporarily drained of serotonin, often depressed and unable to concentrate. Some researchers see strong evidence from both brain and behavioral research that permanent brain damage also can result from MDMA use, but others are skeptical and suggest that the drug might be useful for certain types of psychotherapy.

    Now George Ricaurte and colleagues at Johns Hopkins University School of Medicine in Baltimore report on page 2260 that ecstasy can cause “profound dopaminergic toxicity,” possibly explaining some of the drug's reported negative short- and long-term effects. Researchers administered the equivalent of a heavy party night's worth of MDMA to monkeys. The damage they observed suggests that one all-night “rave” might be enough to induce permanent brain damage and make a person more vulnerable to Parkinson's disease, which is caused by a loss of dopamine-producing neurons.


    Emergency room personnel call the hyperthermic effect of ecstasy (bottom) “Saturday night fever.”


    The researchers injected three doses of MDMA into five monkeys over a period of 9 hours. One monkey died of hyperthermia within hours: Overheating is one of the main side effects of ecstasy. Another grew shaky after the second dose and was not given the third. After 2 weeks, the three other animals were killed and found to have lasting reductions in systems that process serotonin and, even more markedly, dopamine. These neurotransmitters were depleted, and neurons that use them showed damage to their axons, projections that send signals to other cells. The two-dose monkey, killed after 6 weeks, showed similar symptoms. The researchers repeated the regimen with five baboons and, even 8 weeks after exposure, recorded a “profound loss” of markers seen in healthy dopaminergic axons.

    Ricaurte says that other studies might have missed such dopamine system damage because they spaced the doses farther apart; protracted MDMA exposure, in contrast, might make dopamine neurons more vulnerable to the drug's toxic effects. The researchers go on to speculate that damage to the dopamine system might be responsible for cognitive deficits, such as memory loss, seen in some MDMA users. They assert that one reason no one has associated MDMA with parkinsonism might be because the disease does not manifest itself until 70% to 80% of brain dopamine has been depleted.

    Andy Parrott, head of the Recreational Drugs Research Group at the University of East London, calls the findings “very worrying.” He points out that some ecstasy takers—even “novice” users—have motor symptoms such as tremors and twitches, “which may be best explained in these dopaminergic terms,” because dopamine-dependent neurons are one of the major lines of communication in the motor system.

    Other researchers are continuing to withhold judgment about the perils of MDMA, pointing to methodological difficulties in this kind of research and evidence that the damage might be only temporary. Cognitive neuroscientist Jon Cole of the University of Liverpool, for example, is skeptical about the Parkinson's risk. So far, he says, “there is only a single case report of parkinsonism related to the use of ecstasy. The sheer number of ecstasy users indicates that there would be millions of these patients presenting for treatment.” Nonetheless, he says the study might call for a major revision of the existing view of MDMA: “The entire human literature … relies on the notion that MDMA is a selective serotonergic neurotoxin.”

    The Johns Hopkins finding is thus unlikely to put an end to the ongoing debate over MDMA. Criminalized in the United States in 1985, MDMA is still a subject of intense controversy, because some psychologists believe it can be a useful adjunct to psychotherapy—helping people open up emotionally, especially those suffering from posttraumatic stress disorder. Indeed, trials are ongoing in Israel and Spain, and the U.S. Food and Drug Administration approved a new one last November to be conducted in North Carolina.


    NSF Fights Changes in Oversight Bill

    1. Jeffrey Mervis

    Two Senate committees have approved a bill supporting a 5-year doubling of the National Science Foundation's (NSF's) budget, one of the highest priorities of NSF Director Rita Colwell and lobbyists for the scientific community. But the legislation also contains some bitter pills—involving science and math education, major research equipment, and NSF's relation with its oversight board—that Colwell hopes to avoid swallowing.

    Lobbyists see this month's votes, which set NSF policies but don't provide any money, as a sign of the growing strength of their doubling campaign. “It's symbolic, but at least it puts both panels on record in favor of doubling,” says Samuel Rankin, head of the Coalition for National Science Funding. The reauthorization bill, S. 2817, is a variation on one (H.R. 4664) passed in June by the full House, and the two versions must now be reconciled. The NSF spending bill goes through a different set of committees, which will miss a 1 October deadline to complete their work.

    At the same time, those lobbyists are quite unhappy with provisions that would merge a Department of Education program that gives states money to improve science and math education with a new NSF program that awards grants through a national competition to achieve the same end (Science, 11 January, p. 265). The hybrid, proposed by senators who felt that NSF was more likely to run a high-quality program involving university scientists, would allow NSF to continue its national competition for 3 years before converting to a block grant program in 2006.

    The compromise leaves both sides unhappy. The lobbyists fear that local jurisdictions could be left out in the cold if NSF makes grants on a competitive basis. “Moving the program to NSF effectively reduces vital resources and programs at a time when local education agencies need them the most,” a coalition of professional societies wrote to Senator Ron Wyden (D-OR), chair of the science subcommittee of the Senate Commerce, Science, and Transportation Committee, shortly before a vote last week by the full committee. And NSF doesn't like being tied to a predetermined formula. The use of block grants, Colwell wrote Wyden the day before the vote, “is inconsistent with the Foundation's exemplary merit review process and conflicts with competitive processes that promote excellence.” House members are also unhappy with the Senate language, which had been drafted and approved a week earlier by another Senate panel with jurisdiction over NSF's education programs, and they hope to remove it before the bill moves forward in the Senate.


    On another contentious issue, the Senate panels and the House have adopted identical language requiring NSF to rank the importance of projects approved for its major research equipment account. The growing number of big projects approved by the National Science Board but not yet funded has become an irritant for Congress, which has warned NSF that it will be hard to fend off efforts to fund specific projects that are stuck in the queue unless NSF signals which are most important (Science, 12 July, p. 183). Colwell has told legislators that such rankings would limit her ability to adjust priorities according to the timeliness of the science, the size of her budget, and the current lineup of projects being supported.

    The most objectionable portion of the Senate bill, according to Colwell's letter to Wyden, is a provision giving the science board authority to hire its own professional staff. Senate appropriators have proposed similar steps to strengthen the board's independence, and legislators in both bodies see it as an innocuous way to improve government oversight at a time when corporate boards are also being encouraged to be more accountable. But Colwell says the language “would fundamentally change” the relation between the board and the foundation, as well as create a needless bureaucracy to deal with personnel issues now handled by NSF.

    Board chair Warren Washington says he can see both sides of the issue. “I don't think it's a huge impediment to our doing our business,” he says. “But some members don't see a need to fix something that isn't broken.”


    Rings of Light Could Reveal Black Holes

    1. Robert Irion

    The fertile mind of astrophysicist John Archibald Wheeler has conjured another nifty notion: a direct way to detect a black hole that might wander near our solar system. Sunlight would dance rings around the hole and return to Earth, briefly creating a flare in otherwise empty space. Searching for such an exceedingly rare vision might not be practical for decades, but the elegant analysis has delighted other physicists. “It's an ingenious and charming idea,” says Wheeler's Princeton University colleague, astrophysicist Bohdan Paczynski.

    Albert Einstein's general theory of relativity predicts that a black hole's intense gravity bends passing rays of light. Near the event horizon—the threshold beyond which nothing can escape the hole's gravitational well—this deflection becomes extreme. Some photons orbit at least once before darting back into space on new paths. Black holes thus spray incoming light in all directions, like water drops from a whirling sprinkler.

    Wheeler and his former undergraduate student, astrophysicist Daniel Holz, realized that light from our sun redirected in this way could expose a black hole. If Earth sat between the sun and the hole, some sunlight would reflect back as concentric rings. Photons returning after one-half orbit (π radians, or 180 angular degrees) would create a sharp outer ring. Fewer photons would stream back after a tighter encounter of 1.5 orbits (3π radians) around the hole, forming a fainter and smaller ring, and so on. Wheeler dubbed the shadowy mirror a “retro-MACHO,” after the “massive compact halo objects” in our galaxy that deflect light from more-distant stars. (Nor could he resist adding “Pi in the sky?” to the paper's title.)

    Wheels of light.

    Sunlight reflected from a nearby black hole would return to Earth as sharp, concentric rings, caused by light rays that orbit the hole either 0.5 or 1.5 times.


    Holz worked on the details for the last few years with Wheeler, who pioneered the study of black holes in the 1960s and is still active in retirement at age 91. Their results, to appear in the 20 October issue of Astrophysical Journal, suggest that a black hole 50 times farther away than Pluto's orbit with 10 times the mass of our sun would flare for about a day as Earth orbited between the hole and the sun. The rings would be too tiny to resolve, but giant telescopes just might catch the faint solar reflection. “It's a clean calculation,” says Holz, now a postdoctoral researcher at the University of California, Santa Barbara. “If there are black holes out there, this effect would happen.”
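The claim that “the rings would be too tiny to resolve” can be sanity-checked with rough geometry. A sketch using the article's figures (a 10-solar-mass hole, 50 times Pluto's distance); the physical constants and Pluto's ~39.5 AU orbital distance are standard values supplied here, not taken from the article:

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
AU = 1.496e11        # astronomical unit, m

M = 10 * M_sun                     # 10-solar-mass black hole (from the article)
d = 50 * 39.5 * AU                 # 50 times Pluto's orbital distance (from the article)

r_s = 2 * G * M / c**2             # Schwarzschild radius: roughly 30 km
ring_rad = r_s / d                 # angular scale of the rings, in radians
ring_arcsec = ring_rad * 206265    # converted to arcseconds

print(f"Schwarzschild radius: {r_s / 1e3:.1f} km")
print(f"Ring angular scale: ~{ring_arcsec:.1e} arcsec")
```

The answer comes out at tens of microarcseconds, thousands of times finer than any optical telescope of the day could resolve, so only the unresolved flare of extra light would be detectable.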

    Scanning the sky for the flare is another matter. Paczynski notes that it's not feasible—at least not yet. Current telescopes could barely perceive light reflected from a hole at the outskirts of our solar system, even if they knew where to look. Further, astronomers suspect that isolated black holes typically are dozens of light-years apart, making it unlikely that one would drift nearby. “There's maybe a 50-50 chance it has happened in the history of our galaxy,” says astronomer David Bennett of the University of Notre Dame in Indiana.

    Holz and Wheeler acknowledge the caveats. “It's incredibly speculative, but we don't know what our telescope sensitivities will be in 50 years,” Holz says. In the far future, arrays of telescopes could detect a more distant flare. Our descendants might even probe for rogue black holes with lasers that search for their own reflections—a Wheeler-esque vision indeed.


    Mystery Anti-HIV Factor Unmasked?

    1. Jon Cohen

    Sixteen years ago, in the relatively early days of the AIDS epidemic, virologist Jay Levy of the University of California (UC), San Francisco, proposed an answer to one puzzle about the AIDS virus and created another. He had tantalizing evidence that immune system cells secrete a chemical that can stop HIV, and he suggested that HIV-infected people who have high levels of this factor could live for decades without damage to their immune systems. This would explain in part why HIV kills at such disparate rates. But just what is the mystery factor? Legions of AIDS researchers have searched for this treasure, unearthing half a dozen different candidates, none of which has been completely convincing. Now a team led by Linqi Zhang and David Ho of the Aaron Diamond AIDS Research Center in New York City claims to have the elusive factor in hand. But other researchers who have been digging for answers and have seen the new data say they're not throwing away their shovels.

    In a report published online by Science this week,* Zhang, Ho, and co-workers finger three tiny molecules known as α-defensins. The researchers discovered that white blood cells with CD8 receptors secrete these molecules, but infection by HIV appears to shut down production. However, the CD8 cells of people who remained unharmed by the virus a decade or more after being infected continued to produce them. “It largely solves a mystery that's been out there for 16 years,” says Ho. Robert Siliciano, an AIDS virologist at Johns Hopkins University in Baltimore, Maryland, says the work is “certainly a big advance.” Robert Lehrer, who first described human defensins in 1985, agrees. “The data are very convincing,” says Lehrer, a researcher at UC Los Angeles.

    But several leading AIDS researchers aren't convinced. Levy himself applauds “the great effort” to find the defensins in CD8s, but he says that it's not the factor he postulated. Levy says his lab has tested defensins and found that they did not meet his criteria for the factor, which he called CAF (for CD8 antiviral factor). Robert Gallo, head of the Institute of Human Virology in Baltimore, says that although he finds the work “technically sound,” the paper relies on too few patients, offers no mechanism for how defensins stop HIV, and dismisses other factors that he believes are likely to be more important. Bruce Walker, whose lab at Harvard Medical School in Boston also recently described a candidate CAF, says the new data show that the defensins have “a very modest effect” against HIV.

    In the study, the Aaron Diamond researchers teamed up with scientists from Ciphergen Biosystems Inc. in Fremont, California, to compare secretions of CD8 cells from three HIV-infected “long-term nonprogressors,” four HIV-infected “progressors,” and 15 uninfected people. Ciphergen makes tiny arrays of proteins that, with the help of mass spectrometry, allowed them to analyze the components in each sample. Company scientists found that only the long-term nonprogressors and uninfected people produced three small, related proteins that a database search revealed as defensins.

    As first reported by Lehrer, human defensins are secreted primarily by neutrophils and break down bacterial walls, acting like natural antibiotics. A group in Japan 9 years ago showed that defensins from guinea pigs, rabbits, and rats could inhibit HIV, but the work received little notice. Some AIDS researchers also have incorporated defensins into their vaccines because the molecules can act like an adjuvant, boosting the immune response to the HIV components of the preparation.

    When Ho and his colleagues depleted the defensins in the cell secretions from the long-term nonprogressors, they found that the secretions had markedly less anti-HIV activity. And when they depleted both defensins and immune messengers known as β-chemokines—which Gallo's lab in 1995 showed powerfully prevent HIV entry into cells—the secretions had almost no antiviral activity. In what's sure to be the paper's most controversial assertion, the researchers state that the α-defensins “collectively account for the anti-HIV-1 activity of CAF that is not attributable to β-chemokines.” As for the mechanism, Ho and Zhang say the shortage of clean defensin material makes it difficult to conduct experiments that might tease out how it combats HIV. But they have now begun those experiments.

    Gallo takes exception to the entire concept that a single mysterious, undiscovered CAF exists. “This is ludicrous,” he says. He argues that CD8 cells secrete many substances that inhibit HIV, including one his lab has yet to describe that he says appears to be much more powerful than defensins. “We don't use the word ‘CAF,’” says Gallo. “Throw it out.”

    Zhang agrees that CD8s might well secrete other, undiscovered molecules that inhibit HIV. “CAF is a black box,” he says. “Different molecules could play different roles in different circumstances. We have no idea in vivo.” Still, the α-defensins' apparent anti-HIV powers are likely to provide a new focus for research and, if they pan out, open new avenues for treatment.


    Catalyst Boosts Hopes for Hydrogen Bonanza

    1. Robert F. Service

    Solar cells are the best known way to turn sunlight directly into usable power. But green-energy aficionados have long dreamed of using the sun's rays to make a chemical fuel as well, by splitting water molecules to release hydrogen gas, which produces only water when it burns. For decades researchers have tinkered with light-triggered catalysts that encourage this water splitting. But although a handful of efficient catalysts have been found, none are both cheap and stable enough to be practical. Now researchers at Duquesne University in Pittsburgh, Pennsylvania, have come up with a novel catalyst that might bring the long-sought goal within reach.

    On page 2243, chemist Shahed Khan and his graduate students Mofareh Al-Shahry and William Ingler Jr. report that adding carbon to the well-known water-splitting catalyst titanium dioxide increased the material's ability to absorb visible light. The change boosted the catalyst's ability to convert the energy in sunlight more than eightfold, to 8.5%. “That's an excellent result,” says T. Nejat Veziroglu, a hydrogen energy specialist at the University of Miami, Florida. The efficiency still falls just below the U.S. Department of Energy's 10% benchmark for a commercially viable catalyst, notes Eric Miller, an electrical engineer at the University of Hawaii, Manoa. But he says Khan's team has a real chance to clear the hurdle: “It's a good lead in a good direction.”

    Researchers started experimenting with TiO2 as a water-splitting catalyst in the early 1970s. Like other semiconductors, TiO2 absorbs photons, which excite electrical charges in the material. These charges can then break apart water molecules to produce hydrogen gas (see diagram). TiO2's big advantage is that it is stable under prolonged sunlight, and the material, which is added to everything from paint to sunscreen, is cheap. But TiO2 also has a big drawback: It absorbs only ultraviolet light, a small fraction of the spectrum of sunlight that reaches Earth. That finickiness makes TiO2 an inefficient hydrogen-gas generator, converting less than 1% of the energy in sunlight to chemical energy in hydrogen.
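The 400-nanometer cutoff reflects TiO2's bandgap: only photons carrying at least that much energy can excite charges in the material. A quick check of the numbers (Planck's constant and the conversion factors are standard physics values, not figures from the article):

```python
h = 6.626e-34    # Planck's constant, J*s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electron volt

def photon_energy_eV(wavelength_nm):
    """Energy of a photon at the given wavelength, in electron volts."""
    return h * c / (wavelength_nm * 1e-9) / eV

# A 400 nm (violet/UV boundary) photon carries about 3.1 eV, matching TiO2's
# bandgap; longer visible wavelengths fall short, so plain TiO2 wastes most sunlight.
print(f"400 nm photon: {photon_energy_eV(400):.2f} eV")
print(f"550 nm (green) photon: {photon_energy_eV(550):.2f} eV")
```

Any modification that narrows the effective gap, such as the carbon doping described below, lets the catalyst harvest those lower-energy visible photons, which is where most of the sun's output lies.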


    In an electrolyte, water molecules (center) split into H+ and OH− ions, which sunlight and catalysts turn into oxygen and water (left) and hydrogen gas (right).


    Researchers have developed much more efficient catalysts, including other inorganic semiconductors such as gallium arsenide and TiO2 laced with dyes that absorb visible light. But crystalline semiconductors such as gallium arsenide are expensive, and the TiO2 dyes are unstable in the charge-carrying electrolytes that must be added to working water-splitting systems.

    Khan suspected that part of the problem was that the high-temperature process of turning titanium metal to TiO2 created other types of compounds in the mix that do a poor job of absorbing light. He also knew that water vapor helps oxidize titanium metal to TiO2. So Khan's group designed a precisely controlled furnace and placed a sheet of titanium metal in a flame of natural gas. Methane, the most abundant component of natural gas, breaks down into CO2 and water vapor when it burns. Khan's team found that burning the titanium metal at 850°C did a good job of oxidizing the titanium. But it did something else as well: It added some carbon to the mix.

    When Khan and colleagues tested their new material as a water-splitting catalyst, they got a pleasant surprise. Unmodified TiO2 absorbs UV light with a wavelength below 400 nanometers. The carbon-containing TiO2 catalyst, however, also absorbed longer wavelength photons in the violet, blue, and green regions of the spectrum, yielding the eightfold efficiency boost.

    Still, Khan thinks his team can do better. He believes that the amount of carbon incorporated into the TiO2 varies as the titanium sheet is burned and that the higher carbon regions do a better job of absorbing longer wavelength visible-light photons. Khan says his team plans to look for ways to pack more carbon into the TiO2 throughout the firing process. If that works, he says, “it would definitely increase the efficiency to above 10%.”


    Million-Dollar Plums for Teaching Biology

    1. Erik Stokstad

    Research grants have always been the main source of prestige and money for academic scientists. Now one of the biggest funding sources for biologists, the Howard Hughes Medical Institute (HHMI) in Chevy Chase, Maryland, is hoping to add luster to researchers who are devoted to teaching. Last week, the philanthropic giant announced fellowships that will give each of 20 top U.S. biologists $1 million over 4 years to enhance undergraduate education.

    The new awards are meant to improve science curricula at major research universities, where courses are often outdated, boring, and impersonal, says Peter Bruns, HHMI's vice president for grants and special programs. To rev up interest in the classroom, which frequently plays second fiddle to research, HHMI asked 84 research universities to nominate faculty members who are committed to working with students.

    Many of the 20 winners* will use the money to give more undergraduates research experiences. Jo Handelsman, a microbiologist at the University of Wisconsin, Madison, will bring 15 undergrads into her lab. Although the undergraduates will require extra attention, Handelsman expects them to nourish her research. “I have always found undergraduates to be productive and creative scientists,” she says. “They ask good questions and aren't as bound by dogma as the rest of us.”

    Providing those opportunities isn't cheap, though. Neurobiologist Ronald Hoy of Cornell University in Ithaca, New York, estimates that it will cost upward of $65,000 for a single setup of software and high-speed video cameras to enable undergrads to study behavior in mutant flies. He's also planning multimedia lab materials, akin to an earlier project involving crayfish (see figure). The award, Hoy says, “lets you make a very ambitious plan from the get-go.”


    HHMI hopes that new lab materials, such as this multimedia crayfish experiment, will make undergraduate biology more exciting


    Other winners will create programs to mentor prospective scientists, especially minorities. Hilary Godwin, a chemist at Northwestern University in Evanston, Illinois, is planning a summer workshop for incoming minority first-year students. They will learn chemistry skills by mapping lead levels in soil and correlating them with lead-poisoning rates. Afterward, they'll be eligible for further research stipends and training as student mentors.

    The sterling research reputations of the new fellows should help leverage the program, Bruns says. “We wanted to pick people who could influence their colleagues” and promote more interest in improving education, he says. Geneticist Elizabeth Jones of Carnegie Mellon University in Pittsburgh, Pennsylvania, plans to reserve advertising space in Genetics, the journal she edits, to plug her tutoring software. “I wouldn't have that bully pulpit if I had just taught,” she says. Jones and the other fellows will have a chance to convert other high-powered researchers at meetings with Hughes investigators. The institute will assess the program before deciding whether to repeat it in 2006.


    Report Takes Stock of Knowns and Unknowns

    1. Ben Shouse*
    1. Ben Shouse is a writer in Santa Cruz, California.

    The United States spends more than $120 billion a year on protecting ecosystems, but the information used to evaluate such efforts is often inadequate or of questionable relevance, say ecologists and policy experts. According to one environmentalist, it's like monitoring a sick patient by measuring fingernail length. A new report—The State of the Nation's Ecosystems, published 24 September by the nonpartisan Heinz Center in Washington, D.C.—tries to provide the missing data and point out where better measures are needed.

    The Heinz report aspires to be the Dow Jones Industrial Average of the environment. But don't expect to see it published next to the latest stock prices; half of its “ecosystem indicators” can't be measured yet. And for those that can be measured, the center is not saying whether the results represent good or bad news, because it wants to steer away from opinion.

    Some environmentalists say such rigid neutrality masks the dire straits of certain ecosystems. But the report's authors insist that their approach is an essential starting point for protecting the environment. The authors also worked to build consensus among people with a range of viewpoints. As with economic indicators, they say, the goal is to start with widely accepted data, which can lead to a debate on policies to change the status quo.

    The $3.7 million report was conceived in 1995 by the White House Office of Science and Technology Policy, which asked the Heinz Center to complete it. Its funders run the ideological gamut from International Paper to Defenders of Wildlife, and its 150 authors come from universities, environmental groups, industry, and government. About 100 reviewers of a prototype report in 1999 (Science, 10 December 1999, p. 2071) helped the team's seven committees compile government data into 10 national indicators plus 93 other indicators tailored to fit six broad ecosystem types.

    Sea to shining sea.

    A new report identifies environmental health indicators for six broad ecosystem types.


    Despite all the number crunching, the 270-page report is most striking for what it lacks. “Half the report is empty,” admits William Clark, chair of the report's design committee and a professor of international science policy at Harvard University's John F. Kennedy School of Government in Cambridge, Massachusetts. Missing data are marked by bleak gray boxes that say, “Data Not Adequate,” meant to prod monitoring programs to fill the gaps. For example, participants agreed that the degree of human alteration is an important ecosystem indicator but say there is no widely accepted way to measure it.

    The available indicators organize and add precision to a welter of existing data. For example, the report points out that the four major U.S. rivers carry three times as much nitrate per year as in 1955. A fifth of native animal species are faced with serious decline. Three-fifths of estuaries are contaminated. On the other hand, 85% of streams meet human health standards. Moreover, agricultural production has doubled since the 1950s, and land threatened by erosion has declined by a third since 1985.

    “Anyone could do a list that would make everything look good or everything look bad,” Clark says. The report's authors sought to avoid the criticism, often heaped on past assessments, that their measures are biased. The Heinz report's indicators are accepted by all participants.

    But some environmentalists say the focus on consensus is the report's greatest weakness. “Because the emphasis was on producing a report that was consensus-driven, they had to focus on the margins of what most scientists would have looked at,” says Dominick DellaSala of the World Wildlife Fund. He quit one of the report committees, claiming that its indicator of forest fragmentation downplays the extent to which forests are broken up by roads, power lines, and development.

    Observers' initial reactions were mixed. The report “is the first to employ a comprehensive set of indicators integrating biophysical and sociocultural measures,” says ecologist Bruce Wilcox of the University of Hawaii, Honolulu, who edits the journal Ecosystem Health. David Rapport of the University of Guelph in Ontario, Canada, however, is disappointed. “No attempt is made … to relate human activities to the changes in American ecosystems, and no attempt is made to evaluate the health of U.S. ecosystems,” he says. But Chet Boruff of Farmers National Marketing Group in Moline, Illinois, defends the report's neutrality: “That's the best way to build understanding … and to come up with a report that is unbiased.”


    Coulston Chimps Head to Retirement

    1. Gretchen Vogel

    The beleaguered Coulston Foundation, formerly the largest chimpanzee research facility in the United States, is no longer in the primate research business. On 16 September the Florida-based Center for Captive Chimpanzee Care took over Coulston's Alamogordo, New Mexico, facility. The new caretaker for the 266 chimpanzees and 61 monkeys plans to retire the animals from research, eventually relocating many to a sanctuary in Florida.

    The Coulston Foundation had long been dogged by complaints about and government investigations into its animal care practices. Last summer the National Institutes of Health (NIH) let lapse the foundation's Animal Welfare Assurance, which is required for government-supported animal research (Science, 24 August 2001, p. 1415). Faced with mounting debts and few customers, the foundation's president, Frederick Coulston, agreed to sell its property and equipment to the Center for Captive Chimpanzee Care (CCCC), for $3.7 million, and also donate the remaining animals. About two dozen Coulston-owned chimpanzees are housed at other research facilities, but Carol Noon, president of CCCC, expects them to be retired as well.

    Looking forward.

    The Coulston Foundation's chimpanzees will be retired from research.


    Toxicologist and millionaire Coulston founded the nonprofit in 1993. Despite complaints of negligent and unsafe practices from animal-rights groups, by 1998 the foundation was the nation's largest chimpanzee research facility, housing more than 600 chimps. The foundation's troubles mounted as a series of inquiries by the U.S. Department of Agriculture (USDA) and the Food and Drug Administration found Coulston in violation of the Animal Welfare Act and Good Laboratory Practices regulations. In a settlement with USDA in 1999, the foundation, without admitting guilt, agreed to give up half of its chimps.

    As part of that agreement, NIH in May 2000 took custody of 288 of Coulston's animals, many of which had been infected with HIV or hepatitis C. In May 2001, NIH awarded a contract to Charles River Laboratories to care for the animals at a primate facility at Holloman Air Force Base outside Alamogordo, a property Coulston had been managing. A month later, NIH allowed Coulston's Animal Welfare Assurance to lapse. That left Coulston with more than 250 chimpanzees at its main Alamogordo facility but no possibility of government funding and few private customers.

    In December, a local bank filed suit to recover $1.2 million in unpaid loans (Science, 18 January, p. 421). Facing mounting debts, Coulston began negotiations with CCCC last spring. The center announced its agreement with Coulston last week. For now, the chimpanzees and monkeys will remain at the Alamogordo facility, Noon says. But the Arcus Foundation in Kalamazoo, Michigan, which donated the $3.7 million for the Coulston purchase, has said it will help fund additional construction at CCCC's sanctuary in Florida.

    John Strandberg, head of the National Center for Research Resources at NIH, says the animals' retirement should not affect researchers. “There are enough chimpanzees in the program now to meet needs,” he says. “I definitely think it's a positive development. These animals needed a long-term-care solution, and the Center for Captive Chimpanzee Care is able to provide that.”

    Coulston spokesperson Don McKinney says that the Coulston Foundation will continue to exist but that it “will be taking a different research direction, closer to pure science.” He declined to elaborate.


    Crucial Cipher Flawed, Cryptographers Claim

    1. Charles Seife

    It was supposed to be as secure as a bank vault: a cryptographic algorithm that would make documents unintelligible to prying eyes for the foreseeable future. But two cryptographers say the vault, the Advanced Encryption Standard (AES), has a hole in it. Although some of their colleagues doubt the validity of their analysis, the cryptographic community is on edge, wondering whether the new cipher can withstand a future assault.

    “It's nerve-wracking for me that this stuff is going on,” says William Burr, the manager of the Security Technology group at the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland. “It's very worrisome if [the analysis] holds up, but it may not hold up.”

    Two years ago, NIST selected an algorithm to replace the aging Data Encryption Standard. DES, the national standard for a quarter-century, was arguably the most widely used encryption algorithm in the world. But when it began to show its age, NIST held a competition to determine the next standard (Science, 19 May 2000, p. 1161). Rijndael, an elegant algorithm created by two Belgians, Vincent Rijmen of the Katholieke Universiteit Leuven and Joan Daemen of Proton World International, a company that makes smart cards, won the contest and became the AES (Science, 6 October 2000, p. 25).

    Now, attacks aimed at the heart of Rijndael and other algorithms point to a possible weakness. Cryptographers Nicolas Courtois, who works for technology corporation SchlumbergerSema in Louveciennes, France, and Josef Pieprzyk of Macquarie University in Sydney, Australia, believe they have undermined the algorithms by rewriting their “S-boxes.”

    S-boxes are a crucial element in many ciphers. An S-box adds unpredictability to an algorithm: it takes a string of ones and zeros and returns a different set, turning small changes in input into large changes in output—a boost that makes the algorithm much more difficult to crack. Probing for weaknesses, Courtois and Pieprzyk rewrote Rijndael's S-boxes as a system of equations that a cracker must solve to break the cipher. “[Each S-box] can be described by a small system of equations,” says Courtois. “Nobody thought it would matter.”
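    The lookup-table idea behind an S-box can be sketched in a few lines of Python. This is a hypothetical miniature 4-bit example for illustration only (the table values are borrowed from one row of a DES S-box); Rijndael's actual S-box is a 256-entry table derived from inversion in the finite field GF(2^8).

```python
# Miniature 4-bit S-box sketch (illustrative values only, not Rijndael's).
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

def substitute(nibble: int) -> int:
    """Map a 4-bit input to a 4-bit output by table lookup."""
    return SBOX[nibble & 0xF]

# A one-bit change in the input scrambles the output:
x, y = substitute(0b0000), substitute(0b0001)
print(f"{x:04b} vs {y:04b}")  # prints "1110 vs 0100"
```

    Courtois and Pieprzyk's observation is that a small table like this can equally well be written as a short system of algebraic equations relating input bits to output bits, which is the reformulation their attack exploits.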

    But it does matter. Earlier this year, cryptographers Sean Murphy and Matt Robshaw of Royal Holloway, University of London, showed that the S-boxes can be reformulated in a way that Courtois and Pieprzyk exploited to make their attack a force to be reckoned with. All told, Courtois and Pieprzyk believe that they have an attack of order 2^100: That is, it takes roughly 2^100 operations to crack the cipher, significantly less than the 2^128 to 2^256 operations needed to try every combination.

    Even if Courtois and Pieprzyk are correct, AES won't crumble overnight. The fastest computers can mount attacks of perhaps order 2^70, Burr says: “With 2^100, we might not be able to verify the attack for the next 70 years, maybe more.” Still, he says, a theoretically sound attack would be a “very, very disturbing proposition,” because attacks get refined over time and computers are speeding up exponentially.
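    The gap between those exponents is easy to quantify. A minimal back-of-the-envelope sketch, using only the operation counts quoted above:

```python
# Rough arithmetic for the attack estimates quoted above.
brute_force = 2**128  # trying every key for the smallest AES key size
attack = 2**100       # Courtois and Pieprzyk's claimed attack cost
feasible = 2**70      # Burr's rough ceiling for practical attacks today

# The attack beats brute force by a factor of 2^28 (~268 million)...
print(f"cheaper than brute force by 2^{128 - 100} = {brute_force // attack:,}")
# ...but remains a factor of 2^30 (~1 billion) beyond current reach.
print(f"beyond current reach by 2^{100 - 70} = {attack // feasible:,}")
```

    In other words, the attack is a dramatic theoretical shortcut, yet still far too expensive to run, which is why Burr can neither dismiss it nor verify it.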

    Some analysts think there's nothing to fret about. Don Coppersmith, a cryptographer at IBM in Yorktown, New York, and one of the designers of DES, claims to have found a flaw in the analysis: Courtois and Pieprzyk miscounted the number of equations, he believes. But Courtois says the criticism does not apply to the latest version of the paper, which will be presented in December at the Asiacrypt 2002 conference.

    Barring an obvious mathematical error, though, it might take cryptographers years to determine whether the attack is worrisome. The only way to prove that the algorithm works, Courtois says, is to use it to crack AES—and computers aren't up to the job yet. Meanwhile, says Bruce Schneier, a Minnesota-based cryptographer at security company Counterpane Systems, computer-security experts have no way of knowing which attacks pose true threats and which are phantoms. “How could you do particle physics if you couldn't do experiments?” he asks. “We've entered an era of cryptanalysis where you can't verify attacks.”


    Arizona Ecologist Puts Stamp on Forest Restoration Debate

    1. David Malakoff

    Wally Covington burns with an intensity that matches the debate over President Bush's forest initiative, which draws on 30 years' work by the forest ecologist

    FLAGSTAFF, ARIZONA—It's hard to resist comparing forest ecologist Wally Covington to the old-growth ponderosa pines that tower above this bustling mountain town. Both are tall and burly, with rugged skins molded by age and experience. And like the pines, Covington is a lightning rod. But he doesn't burn easily. “I am incredibly confident in what I do—and I have a very thick skin,” he says.

    Over the last few years, the 55-year-old academic has emerged as perhaps the nation's most visible—and controversial—forest scientist. The folksy, 2-meter-tall woodlands expert has become a scientific Paul Bunyan on the subject of how to prevent forest fires. He is consulted by presidents and lawmakers, courted by the media, and hailed and assailed by environmentalists, loggers, and scientists alike. His research underpins parts of the White House's controversial new “healthy forests” initiative, and President George W. Bush himself might make a pilgrimage to the researcher's plots this week—all this because Covington believes in cutting down some trees in the name of saving forests.

    This year's conflagration in the western United States—which scorched 2.5 million hectares of forest—has only heightened interest in Covington's work. He and his colleagues at Northern Arizona University (NAU) here have spent 3 decades studying southwestern ponderosa pine ecosystems. For years, they've warned that vast swaths of woodland have become dangerous tinderboxes: A century of overgrazing, logging, and misguided fire-suppression policies has left many stands unnaturally choked with thickets of young trees and deadwood. To prevent catastrophic fires, Covington argues that massive cutting is needed to restore these forests to their less dense, parklike pasts. The idea is being promoted by the president in his plan, now before Congress.

    Some researchers and environmentalists, however, are uncomfortable with Covington's approach to forest restoration. They complain that he has used his political prowess to monopolize resources and worry that government officials might misapply his ponderosa-based ideas to dramatically different types of woodlands. And they charge that loggers have cynically embraced his work to justify a return to damaging practices. “Unfortunately, his science gives political cover to forces that want to increase logging,” says Todd Schulke of the Center for Biological Diversity, an environmental advocacy group in Tucson, Arizona.

    Still, even critics applaud Covington for bringing the plight of the southwestern forests to national attention. They also confess to liking him. “We disagree, but I really enjoy his company,” says Schulke.

    Pointing the way.

    Wally Covington routinely gives policy-makers, such as Interior Secretary Gale Norton (far right) and Senator Jon Kyl (left of Norton), a firsthand look at his Arizona forest restoration experiments.


    “Wally's got strong ideas, and they are not entirely congenial to the timber industry, environmentalists, or policy-makers,” says Bruce Babbitt, longtime friend, former Arizona governor, and head of the Department of the Interior under President Bill Clinton. “But people listen because he's passionate, articulate, and credible.”

    In Leopold's footsteps

    Covington was clearly in his element recently as he loped over downed trees and sized up standing timber. But it was the trees' ecological—not commercial—potential that he was eyeing. Covington has spent his career trying to figure out how to reclaim the glory of declining ponderosa forests. His goal is radical: “I want to see half of all public forests in the Southwest restored to presettlement conditions,” he says. Turning the clock back, he argues, is the best way to combat fires—and do everything from protect endangered species to enable the forests to ride out climate change.

    His quest for ecological healing goes back to his youth in Oklahoma and west Texas. Covington's dad—a jack-of-all-trades who once worked as a barnstorming pilot—was a devotee of Aldo Leopold, a restoration pioneer who died in 1948. When Covington was 7 his father died, but not before drilling home Leopold's lessons during weekends spent living off the land. “We'd alternate [spending] Sundays in church and outside,” he recalls.

    Under the guidance of his mother, a schoolteacher, Covington excelled in the classroom and headed off to college to pursue a career in medicine. But a postgraduation summer working among children with cancer exhausted him emotionally, and in 1970 he found himself teaching school in Gallup, New Mexico—and organizing an Earth Day celebration that ultimately got him fired. “Basically, I was a hippie,” he says.

    A few years later, inspired by undergraduate ecology courses and graduate work with Jim Gosz at the University of New Mexico in Albuquerque, Covington won a spot in the graduate forestry program at Yale University, where Leopold had also studied. Working under F. Herbert Borman and Daniel Botkin, Covington completed “one of the most influential studies in the recent history of forest ecology,” says Ruth Yanai, an ecologist at the State University of New York College of Environmental Science and Forestry in Syracuse.

    Using plots in the Hubbard Brook Experimental Forest in New Hampshire, Covington measured how organic matter in forest soils changed after logging. The result—known as Covington's curve—suggested that logged soils lost 50% of their organic matter, and hence their ecologically important loads of carbon, within 20 years. Even before it was published, the curve was being used by scientists trying to estimate how much carbon logged soils contributed to global atmospheric loads. And although subsequent studies have raised major doubts about the curve's applicability, “it continues to have influence,” says Yanai, who is about to publish a paper on the topic.

    In 1975, Covington joined NAU and began to examine the nearby ponderosa forests. The U.S. Forest Service was beginning to experiment with controlled burns to reduce wildfire dangers, and Covington saw a chance to test some of his own ideas. But first—following Leopold's example—he had to set a restoration goal, and that “meant figuring out what the forests looked like before European settlers arrived,” he says.

    Ever since, he and an array of colleagues—including his wife, NAU forest ecologist Margaret Moore—have pored over historical photographs and records and have mapped old stumps, downed logs, and fire scars on study plots. In a series of 1994 papers, they made the case that many presettlement ponderosa stands were swept regularly by “cool” fires, making them far less dense than today's forests. “There were often several dozen relatively large trees [per hectare],” says Covington, compared with the hundreds of stunted stems often found today.

    The data convinced Covington and other scientists that a conflagration was imminent and that immediate thinning was the only way to save the forests. But controlled burns alone wouldn't do the trick, he warned: Early experiments near Flagstaff showed that debris-clogged forests often burned so hot that they scorched the soil and eventually killed the older, larger trees that ecologists wanted to save. Covington's team argued that loggers first needed to thin the forests and rake accumulated needles away from old-growth trees.

    At the same time, the team was fine-tuning its “presettlement model” of restoration (Science, 28 January 2000, p. 573). The approach begins with mapping the locations of presettlement trees on a target site. Then, loggers clear away younger trees—at times 80% or more—in a bid to recreate the lost landscape.

    Into the limelight

    Such ideas, however, were little known outside the West until several lethal fires struck in 1994. The resulting outcry prompted Babbitt to seek Covington's help. Soon, Babbitt was touting Covington's work, which he dubbed the Flagstaff plan, as one solution. “I called up Wally and said: ‘Ever hear of the Flagstaff plan? Well, now you have and you'd better get ready to answer questions about it.’”


    Arizona wildfires have sparked debate over how to prevent them.


    The state's lawmakers, especially Senator Jon Kyl (R-AZ), then went to work in Washington, D.C., earmarking funds to help Covington and working with Babbitt to line up study sites. Soon, Covington had access to a swath of isolated federal land on the slopes of Mount Trumbull in northern Arizona, where he could scale up experiments that involved removing different proportions of trees and implementing various burning timetables. Preliminary tests had suggested that such approaches could lead to less damaging fires and improved biodiversity. And although the outcome of the bigger experiments won't be known for years, the uncertainty hasn't kept Covington's support from growing.

    Several years of devastating fires in the West have increased pressure on legislators to act (Science, 1 September 2000, p. 1448). In response, Kyl—who serves on the Senate Energy and Natural Resources Committee—has helped funnel more than $10 million to Covington's Ecological Restoration Institute, which promotes science-based forest restoration. Now, a group of Western lawmakers wants to establish duplicate centers in other states.

    Covington isn't embarrassed by the earmarked cash. He says that he's won plenty of peer-reviewed grants, too, but that “unconventional research requires unconventional sources of funding.” Although some researchers in the region grumble privately that Covington hasn't shared the wealth, others dismiss that argument as sour grapes. “He's had a tremendous catalyzing effect and brought in new resources,” says Melissa Savage, a retired forestry professor who now runs the Four Corners Institute in Santa Fe, New Mexico.

    Covington's windfall was the product of decades of outreach. Even as a young faculty member, he would regularly invite lawmakers and their local staffers for field trips. “I'd call in the summer, when it was really hot in Phoenix, and invite them up to nice, cool Flagstaff for a tour,” he says. “Then I'd campaign the research.” He soon was advising governors, then testifying before Congress. This year, the Bush Administration tapped him for advice, and Interior Secretary Gale Norton visited.

    But Covington's growing reputation also brought increased scrutiny. Several environmental groups that were initially supportive of his presettlement model, for instance, grew restive after they saw some heavily logged and rutted sites on Mount Trumbull. “It was horrid; there was a real gap between theory and practice,” says Sharon Galbreath of the Southwest Forest Alliance in Flagstaff, which subsequently challenged and negotiated alterations to other thinning projects based on Covington's concepts.

    Environmentalists worry that Covington's projects are “too aggressive,” says Galbreath: “He's trying to restore presettlement forests in one fell swoop.” Some environmentalists now derisively refer to extensive thinning projects as “Covington cuts.” In their quest to promote what they see as more incremental approaches, they have developed a multipronged critique that challenges nearly everything Covington's team does.

    For example, they argue that Covington's method undercounts the number of small presettlement trees on site. And they criticize Covington's zeal for restoration of distant wilderness areas, which they say is expensive and does little to reduce fire threats near human population centers. They are particularly troubled by his unwillingness to preserve all larger trees by prohibiting thinning above a certain size—say, a half-meter in diameter.

    The release of the Bush plan has only increased their fears that Covington's ideas will be misapplied. The fire-related portion of the plan calls for increasing thinning in all threatened forests—including stands far from inhabited areas—and paying for it by allowing loggers to cut bigger, more valuable trees. It would also ease environmental regulations to avoid legal delays. Critics say that would promote too much cutting in the wrong places, and they argue that Covington hasn't done enough to challenge the plan. “He hasn't made it clear that his approach is experimental and focused on ponderosa forests; we're not ready to run with it across the Southwest,” says Bryan Bird of the Forest Conservation Council in Santa Fe.

    Such criticism, Covington says, is the inevitable consequence of trying to spur action and running the largest ponderosa restoration experiments in the nation. It's true, he says, that he declined to sign a recent letter from dozens of prominent academics opposing the plan, because he disagreed with parts of their argument. But he bridles at suggestions that he's bitten his tongue in order to promote a “one size fits all” approach. In testimony and interviews, for instance, he's criticized the plan for failing to differentiate between forest types that withstand cool fires and those that need to burn fiercely to reproduce. His own experiments, he notes, test more than two dozen tailored restoration recipes. And he argues that the plan gives too much attention to “salvage logging”—cutting already burned trees—which has questionable value for restoration.

    Overall, the plan's sometimes vague and confusing language “is pretty much what you'd expect from people who aren't well trained in logic,” he says. He also notes that the White House ignored chunks of his advice, including having the National Academy of Sciences develop guidelines on thinning and restoration. (He now hopes to have his institute assemble such a study.)

    But like many other forestry scientists, he is weary of the debate over tree-size caps, saying the science doesn't support any particular size limit. Similarly, he says that although it makes economic sense to start thinning near inhabited areas, many of the natural resources most at risk—from endangered species to water supplies—often lie deep within a forest. And he expresses some sympathy for the Administration's controversial proposal to streamline environmental regulations. Some of his own experiments have been held hostage by “environmental obstructionists,” he says. In testimony earlier this month, he encouraged Congress to consider rules that would allow the public to help craft multistage experiments but then leave land managers free to practice “adaptive management”—feeding lessons learned into the next stage of the process. “We shouldn't have to revisit every detail,” he says.

    At the same time, Covington joins forces with some of his environmental foes in questioning the government's commitment to restoration. He rejects the notion that the agencies are still dominated by “timber beasts” dedicated to logging: “These are post-Earth Day foresters,” he says. But he concedes that they “still aren't very good at learning from the past.” One solution, he suggests, is to earmark 5% of the potential $1 billion or so that might be spent on thinning over the next few years for studies designed to test—and then replicate—a wider range of approaches.

    Before designing those studies, Covington suggests that researchers think about how Aldo Leopold would tackle the challenge. “If you were a time traveler from the future, what would you ask us to do right now so that future forests would be healthy?” he asks. Immediate action is essential, he emphasizes. Otherwise, scientists might find themselves fiddling around for answers while Rome burns.


    Out of the Vault, Into the Forest

    1. David Malakoff

    FLAGSTAFF, ARIZONA—Seven years ago, a local historian tipped off forest ecologist Margaret Moore of Northern Arizona University to a cache of dusty maps. They rested in a neglected vault here at the U.S. Forest Service's Fort Valley Experimental Station, set up in 1909 as the government's first research forest. Moore took a peek and marveled at the ecological time capsule that lay before her. In spare black-and-white were drawn the locations of hundreds of saplings, trees, and downed logs in a 160-square-meter patch of forest almost a century ago. The vault contained dozens of such labor-intensive portraits, from 1- to 7-hectare plots spread across the southwestern United States.

    The maps were a potential cornucopia of data for Moore and her husband, Wally Covington, who have spent much of the last decade trying to understand the recent evolution of the southwestern ponderosa pine forests (see main text). “That kind of detailed information is incredibly rare,” she says. The drawings cried out for a follow-up study to see how the plots had changed.

    Time series.

    Rediscovered maps (far left) have allowed ponderosa pine researchers to track plant biodiversity decreases (right) on forest floor.


    The collection wasn't perfect, to be sure. Data ledgers were missing, and the cross-hatched rectangles that appeared randomly on some maps were a mystery. Moore didn't even know if the plots, some staked out as long as 94 years ago, could still be found.

    Undaunted, Moore and the historian—Susan Olberding—searched for one plot that appeared to be just steps from the vault. “I bet it didn't take us 10 minutes to locate it,” recalls Moore. The treasure trove grew as Moore and her colleagues hunted down original corner stakes and even metal tree-marking tags, rusted but still readable. Then the ledgers turned up, providing meaning to the mysterious rectangles. They were 4.5-square-meter microplots that had also been carefully surveyed—down to grass stem and twig locations. Beginning in 1909, Forest Service scientists G. A. Pearson and T. S. Woolsey had established the more than 100 plots of various sizes as part of long-term monitoring studies. Some had been revisited periodically up to the 1950s.

    Discovering the microplots was enough for Moore to get started. She and her colleagues have already looked at eight of the microplots near Flagstaff, and they have found that the thickening ponderosa forests appear to have reduced the number and kinds of plants on the forest floor. A $310,000, 4-year grant from the U.S. Department of Agriculture will now allow them to remap about 40 of the bigger plots in New Mexico and Arizona. Moore expects to document measurable changes and build computer simulations to depict the past, present, and future of the forests. “It would be great,” she says, “if researchers 100 years from now could revisit these plots, too.”


    Piloting High-Flying NIH to a Soft Landing

    1. Jocelyn Kaiser

    As the budget-doubling campaign ends, the new NIH chief discusses biodefense needs, large-scale biology, and stem cell research

    The Bush Administration took a year before naming Elias Zerhouni to lead the National Institutes of Health (NIH). But after he was nominated on 26 March, the Johns Hopkins radiologist moved through the Senate at light speed. Zerhouni appeared 30 April before a confirmation committee, where a predicted brouhaha over NIH funding of human embryonic stem cell research never materialized. Instead, Zerhouni won the Senate's approval on 2 May and was sworn in 18 days later.

    Congress might have moved swiftly in part because it felt that NIH needs strong leadership to manage its ballooning budget and 27 sprawling institutes and centers. NIH is set to receive $27 billion in 2003, a doubling of its budget since 1998. Zerhouni is expected to face much tighter budgets in future years, at a time when NIH will be taking on new responsibilities in bioterrorism research. This will present a management challenge—one for which the Algeria native appears to be well qualified. A former executive vice dean at Johns Hopkins University School of Medicine in Baltimore, Zerhouni boasts a versatile background that includes work as a clinician, inventor, entrepreneur, and biomedical engineer.

    In an interview last week with Science in his office on the NIH campus in Bethesda, Maryland, Zerhouni, 51, who favors dark business suits, outlined his goals 4 months into the job. He said he aims to make sure that biomedical research results are translated into public health benefits. But he took pains to explain that that does not mean less attention to basic research: “I am absolutely committed to making sure that NIH has a very balanced portfolio.”

    This summer Zerhouni brought together 100 scientists—ranging from intramural bench scientists to industry researchers—to meet in focus groups. The “ground rules,” he said, were to identify needed resources, “roadblocks,” and “gaps in knowledge” that no single NIH institute could tackle in areas such as bioinformatics, molecular libraries, systems biology, and clinical research. Zerhouni intends to develop action plans for the next 3 to 5 years.

    Earlier this month, Zerhouni appointed directors for two institutes: mental health and alcoholism. That leaves three slots to fill (drug abuse, neurological disorders, and general medical sciences). He has initiated a review of the intramural research program and NIH's management. Zerhouni also noted that Deputy Director Ruth Kirschstein, whom he described as “a treasure for the institution,” will become his adviser. “She's done so much for this place; I think a senior advisory position to the director would be terrific,” Zerhouni says.

    A transcript of questions and answers, edited for brevity and clarity, follows.


    Q: What are your first priorities?

    A: I thought [there were several] important things a new director could do after two and a half years of transition. Funny things happen in large organizations when you have a long transition. Decisions are not made, clear strategic directions are not spelled out, recruitments are sort of delayed.

    One was to reenergize the place, recreate momentum. … I had an immediate need to recruit excellent directors. I really challenged both the internal community and the outside community over the summer to try to define [key issues] that only the NIH director could activate. …

    I had 100 scientists over the summer from outside, from inside, and organized five what I call “road map” meetings. … These scientists were from basic science, translational research, clinical research, social, behavioral. … And once we did this, it was very obvious that there were issues that just hadn't been attended to.

    There's a major change in the way science is conducted. I don't have to tell you about the explosion in data, the scaling up of research technologies. So that essentially became a topic: How do we generate facilitating and enabling resource support, whether it be bioinformatics, molecular libraries, structural biology resources, and so on?

    [Another] recurring theme across institutes is this issue of making sure that our science is relevant to the public. … I tend to look at it as a systems problem. … We're suffering because you see a much longer timeline between [the discovery and the use of new ideas in medicine]. Clearly that is something you hear from the scientists, who feel that they do not have access to the resources to validate their approach or get the proof of concept. You [also] hear it from the pharmaceutical industry: They quote statistics showing that a few years back, 40% of drugs that went to phase III trials would eventually fail. Now 80% of drugs that go to phase III trials fail. … It tells you that we have a problem.

    Q: What will you do with these action plans?

    A: We'll follow up. We'll have meetings with the directors, and each one of the items that is identified will then be analyzed in depth in terms of really having data and having a good quantitative understanding.

    Q: Is the model of the traditional NIH grant going to change? Will there be more networks like the Alliance for Cellular Signaling [which received a large “glue grant”]?

    A: It won't change. It will evolve into different shapes, because multidisciplinary science requires you to have collaborations. … But at the end of the day, you also need [principal investigators] who themselves have an inherent understanding of [multiple] fields so they can ask the right questions. Before I even came to NIH, I heard about the glue-grant idea, and I thought that was a terrific idea. [It goes] way beyond the capabilities of any one laboratory. And we're going to face more and more problems of this nature. I understand [the glue-grant group is] doing well; I'm going to watch it carefully. Is that a model that we need to implement? I don't know, but it's really exciting, I think.

    Q: You have said NIH needs to explain how it has spent its funds. How are you going to do that?

    A: Congress, the public, patient advocacy groups [are] saying, “We've doubled our investment; what does that mean, where's it going to, what are we getting for it?” There's a huge outcry out there about the public health impact of the budget. And there is a sense that perhaps we're not as good at communicating what those impacts have been. So I really think that we need to as an institution do better and be more proactive in explaining in detail what it is we're doing.

    Q: How does NIH come down with a “soft landing” from budget increases of up to 15% for the past 5 years to projected increases starting in 2004 of 2%?

    A: I think it's a challenge. … When the doubling was going on, you had a very vibrant economy, you had a federal surplus, you had no crisis like biodefense. This is a totally different environment. We can't defy gravity and expect 15% numbers every year.

    [But] I'm working as hard as I can with the Administration. I can tell you, I'm advocating for NIH as strongly as I can, because I think we have the justification. … If you look at our nation and the growth rate of health care costs considering our current health care system, … we're entering a race between the growth of that health care cost and the need for discovering new approaches that will drastically change the cost structure in major, major diseases. So my view is that it is a strategic imperative to invest aggressively in research.


    Q: Is there an imbalance in federal support for biomedical science versus physical and other sciences? There is a proposal to double the budget of the National Science Foundation.

    A: There's no doubt that we need to have more investment in the quantitative sciences in this country. … But I wouldn't say that means stop investing in biomedical sciences. … I find that these approaches—double, triple, let's do it, me too—don't appeal to me intellectually. What is it that you want to do strategically? That is my question. It's a good slogan, though.

    Q: Can NIH spend $1.7 billion on bioterrorism research wisely in 1 year, and is it smart to keep spending at that level if there are no new bioterrorism incidents?

    A: I'm absolutely convinced [that the funds can be spent wisely]. First of all, you have to give credit to NIH … for coming up with a very cogent research agenda within 90 days of request. Now, as we're starting a new field, we absolutely need to have a huge investment in infrastructure. … If you look at the magnitude of the problem, $1.5 billion, $1.7 billion is probably going to be the base investment in biodefense.

    Q: How will you promote translational and clinical research?

    A: I think we have a systemic issue. We really need to have common data standards. We need to have a better computer infrastructure, information infrastructure for conducting clinical investigations in the country. We need to have a strategy that looks at new models, new validated research methods in clinical investigations. … For example, there are some scientists who feel that Bayesian statistics should be used in clinical trials early on. … You know, the approach that we use—the double-blind randomized study—is the mainstay of clinical investigations, and we've used that successfully. But in a day and age when you can really combine data on large-scale populations, … should we look at that and reengineer our clinical research and clinical investigation system in this country? That's a question that I'm going to look into.

    Q: What's happening with the office of the director? Do you anticipate some changes?

    A: Yes. I took my time to evaluate the office. I think we definitely need to have more capabilities to present what our results are, analyze our portfolio, evaluate our portfolio, be able to communicate that portfolio. So I'm very interested in building an advanced analysis capability so that when NIH provides results or provides plans, they're well supported by coherent analysis of the data.

    Q: What do you think of the proposal being explored by a National Academy of Sciences (NAS) panel to consolidate NIH institutes?

    A: I don't see how you could do it politically. I have to tell you, in 4 months I've learned enough. I opened the meeting of the national advisory council of the Eye Institute. Somebody started raising the issue, saying, “Well, we want to talk to you about the NAS study.” I said, “Yes, actually, I wanted your input. Which institute would you want to be merged with?” There was a silence in the room.

    Q: What is the mission of the stem cell SWAT team headed by [deafness institute director] James Battey? Will it consider creating an NIH stem cell repository?

    A: I wanted … the action of NIH to become more strategic: Review the state of the science, review exactly what the pathway to progress should be. What are the stumbling blocks? Do we have enough scientists? Do we have enough access to cells?

    There is … informal talk [about creating a repository], because one of the issues that you absolutely need to tackle is this issue of full characterization, standardization of the cell lines so that you can compare results across laboratories. … [But] remember, we don't own the cell lines.

    Q: Do you intend to stay at NIH a certain length of time?

    A: No, I don't have a particular agenda. I really don't. I had a great job where I was. I have to tell you, if you had told me a year ago that I would be here, I would have said, “You're dreaming.” I would turn down offers to consider jobs. … I did mention one time that the only job worth doing is this one. So it's probably what got me the nomination.


    Ashes to Ashes: The Inner Lives of Neutron Stars

    1. Robert Irion

    “Superbursts” that rage for hours within certain neutron stars might drive exotic thermonuclear reactions unique in the universe

    CHICAGO, ILLINOIS—Steamboat Geyser, in the heart of Yellowstone National Park, usually shoots fountains of water 5 to 10 meters high. But at irregular intervals of years to decades, the geyser unleashes a scalding 100-meter column, followed by a deafening roar of steam for a day or more. A mysterious trigger far underground expels the deepest, hottest water from the geyser's hydrothermal system in a crowd-pleasing burst.

    Similar outbursts happen in space, astrophysicists have learned. Powerful and unpredictable flares of energy, given the geyserlike name of “superbursts,” strike beneath the surfaces of a few special neutron stars—the dense, spinning corpses of stars that died in supernova explosions. Orbiting telescopes have spotted seven superbursts so far, spouting intense x-rays for hours. Even more compelling than the fireworks is the root cause: a thermonuclear flash of heavy elements, burning in ways that might occur nowhere else.

    Superbursts happen only in tight binary pairs, where a neutron star pulls gas from the outer atmosphere of a small companion. Under certain conditions, a layer of the gas—usually hydrogen and helium—can build up on the neutron star's surface. Every few hours or days, this raw nuclear fuel spontaneously combusts in about 10 seconds. Astrophysicists first spotted these “type I x-ray bursts” in the mid-1970s and explained them as thermonuclear fusion run amok.

    A quarter-century later, it turns out the story doesn't end there. A modest type I burst creates “ashes” in the form of heavy atomic nuclei, which settle into a dense ocean covering the entire star. When the pressure and temperature at the bottom of this bizarre ocean get high enough, the ashes themselves ignite. The thermonuclear flame explodes into a superburst that spews 1000 times as much energy into space as a type I burst. According to models, the inferno burns the entire 100-meter-thick ocean of heavy ashes.

    When x-ray satellites first caught these jaw-dropping events a few years ago, some astrophysicists thought pure carbon fueled the flame (Science, 17 November 2000, p. 1279). However, new calculations suggest that although carbon is indeed the spark, most energy in a superburst comes from far heavier elements that literally disintegrate in a 7-billion-kelvin bath of gamma rays.

    “We sometimes get lulled into believing that we've seen it all in astrophysics, then nature comes up with something new and amazing,” says physicist Robert Rosner, director of the Department of Energy's Center for Astrophysical Thermonuclear Flashes, or Flash Center, here at the University of Chicago. “Superbursts are extraordinary. To many of us, they seem like science fiction.”

    Small potatoes

    In the panoply of cosmic things that go boom, superbursts are impressive but not ascendant. “In comparison with gamma ray bursts and supernovas, these are small potatoes,” says astrophysicist Edward Brown, a postdoctoral researcher at the Flash Center. Type Ia supernovas—the kind used to gauge the accelerating expansion of the universe—release a billion times as much energy and shine across billions of light-years. Satellites have spotted superbursts only within the cozier confines of our Milky Way galaxy.

    Superbursts excite astrophysicists not for their absolute power but for their potential to reveal how matter is transformed by pressures and densities just a whisper shy of a black hole. Neutron stars pack about 1.5 times the mass of the sun into a sphere just 20 kilometers wide, only three times the size of a black hole of the same mass. “On a neutron star, gravity completely has the upper hand,” says astrophysicist Don Lamb, also a member of the Flash Center. Indeed, gravity's clamps are so strong that the energy created by thermonuclear burning barely budges the matter above it. “Once burning starts, the pressure rises and there's no way out; it can't find a stable equilibrium,” Lamb says. “It runs away and burns everything that has been accreted.”
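As a rough sanity check on those figures (an illustrative back-of-the-envelope calculation, not part of the article or its sources), the size of a black hole of the same mass follows from the Schwarzschild radius, R_s = 2GM/c²:

```python
# Back-of-the-envelope check: Schwarzschild diameter of a 1.5-solar-mass
# black hole, for comparison with a ~20-km-wide neutron star.
# Constants are standard round values; illustrative only.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def schwarzschild_radius_m(mass_kg: float) -> float:
    """Radius of the event horizon: R_s = 2*G*M / c^2."""
    return 2.0 * G * mass_kg / C**2

r_s = schwarzschild_radius_m(1.5 * M_SUN)
print(f"Schwarzschild radius:   {r_s / 1000:.1f} km")      # ~4.4 km
print(f"Schwarzschild diameter: {2 * r_s / 1000:.1f} km")  # ~8.9 km
# A 20-km-wide neutron star is thus only a few times wider than the
# event horizon of a black hole of the same mass.
```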

    Hydrogen and helium cascading onto a neutron star from a companion can burn in two ways. In stable burning, the gas combusts just as quickly as it hits the star, like the steady blue flame on a natural gas stove. In unstable burning, not enough gas arrives to keep the flame going. The gas builds up and then ignites with a poof, enveloping the whole star in less than a second with a type I x-ray burst.

    Shattered disk.

    X-rays from a thermonuclear “superburst” might blow apart the disk of gas that girdles a neutron star in a tight binary system.


    For years, researchers thought these processes would forge a lot of carbon—especially helium burning, which cooks carbon efficiently. Stripped of their electrons in the extreme conditions, the nuclei of carbon atoms would settle into an ever-deepening ocean beneath the ongoing hydrogen-helium bursts. Papers predicted that thermonuclear flashes could happen in that ocean. However, calculations showed that it might take hundreds or thousands of years to build up enough carbon for the pressure at the base of the ocean to cross carbon's high ignition threshold. No one expected to get lucky enough to see such an event.

    Then, starting in 1996, superbursts began popping up. Astronomers at the Space Research Organization Netherlands in Utrecht found the first one—originally dubbed the “Oh my God!” burst—with the Dutch-Italian satellite BeppoSAX, which flew for 6 years. Teams led by astronomers Erik Kuulkers, John Heise, and Remon Cornelisse have since found a few more by scouring data from a wide-field camera on BeppoSAX that gazed toward the center of the galaxy for months at a time.

    Astronomer Tod Strohmayer of NASA's Goddard Space Flight Center in Greenbelt, Maryland, also struck pay dirt twice with fortuitous observations by NASA's Rossi X-ray Timing Explorer, still active after 7 years. One of those discoveries revealed that the same binary system had spawned a pair of superbursts 4.5 years apart. Clearly, it didn't take 1000 years to replenish the superburst fuel on that neutron star.

    Once observers showed that superbursts are far more common than expected, theorists were forced to reconsider the mixture of ashes left over from type I x-ray bursts. First, a team led by nuclear physicist Hendrik Schatz of Michigan State University, East Lansing, did a thorough calculation of the reaction products of hydrogen undergoing flash fusion on a neutron star.

    The results, in the 16 April 2001 Physical Review Letters, showed a slurry of massive nuclei up to tin, antimony, and tellurium (atomic numbers 50 to 52). At that point, a closed cycle of element creation and destruction prevents heavier nuclei from forming. Even so, exotic elements such as krypton and ruthenium should churn into the dense ocean with the carbon from helium burning.

    Second, postdoctoral researcher Andrew Cumming and physicist Lars Bildsten of the University of California (UC), Santa Barbara, realized that those ashes would change the fluid's key properties. Heat flows easily through an ocean of pure carbon nuclei, making it hard to build up hot spots and ignite a spark. With heavy elements, “the ocean becomes more opaque, so heat can't escape as quickly,” says Cumming, now at UC Santa Cruz. “You can ignite a flash much sooner with less material.” The study, in the 1 October 2001 Astrophysical Journal Letters, predicted that an ocean containing just 5% to 10% carbon would suffice to trip the thermonuclear trigger every few years.

    Nuclear weirdness

    As for the prodigious energy spewed by the events, new research by Schatz and his colleagues points to another surprise. After the flash fusion of carbon nuclei gets a superburst started, it might play only a supporting role.

    Nuclear storm.

    Flash ignition of hydrogen and helium on a spinning neutron star might spread in a hurricane-like pattern. Carbon-powered superbursts ignite a far deeper layer.


    The team's work, still in progress, suggests that temperatures soar to 7 billion kelvin in the first microseconds of the runaway burst. In those searing conditions, an intense bath of gamma ray photons bombards nuclei like shotgun pellets blasting a boulder. The onslaught chips neutrons and protons off the heavy nuclei and reduces their mass, a process called photodisintegration. “We all overlooked the fact that it gets hot enough to start this other process,” Schatz says.

    The effects of photodisintegration are dramatic. Iron (atomic number 26) is the most stable nucleus. Smashing bits off massive nuclei and transforming them into iron yields energy as each nucleus becomes more tightly bound. Depending on the mixture of heavy ashes in their modeled neutron-star oceans, Schatz and his co-workers are finding that this violent sputtering produces far more energy than the fusion of carbon and other lighter nuclei.

    “If you look at all the sites in the universe where nuclear processes happen, this would be the only case where photodisintegration is the dominant energy source,” Schatz says. “It's a weird process that generates much more energy and happens much faster than we expected.”

    Much of the superburst's energy dives into the neutron star's colder interior. The rest takes hours to diffuse through the thick ocean and blast into space. Above the neutron star's surface, the intense x-rays might destroy part of the disk of gas from the companion star. Strohmayer hopes to scrutinize superburst records for clues about how the disk reassembles.

    Verifying other aspects of superbursts—especially the details of how they transform matter—will be more challenging. Unlike a supernova, which propels its freshly made elements into space for telescopes to detect, neutron stars hoard it all. Virtually nothing escapes. “It's like a black hole in its shyness,” says Lamb. It's barely possible that some heavy ashes might convect to the neutron star's surface long enough for future x-ray or gamma ray telescopes to probe.

    What's the trigger?

    With just a handful of superbursts identified, other mysteries are keeping physicists busy. The frontier, says Bildsten, is that no one knows what sparks the flash. In some cases, a type I x-ray burst precedes the superburst, leading some scientists to propose that the surface flash of hydrogen and helium fusion triggers the deeper event. Bildsten considers that unlikely. “It's like sitting at the bottom of a 10-kilometer-deep ocean and asking, ‘Do I care what the weather is at the surface?’” he says. “The scales of pressure are that different.”

    Researchers also debate whether ignition occurs nearly everywhere on the star at once, as Lamb believes, or whether it starts at one point and then spreads. Type I bursts might yield a clue. A model by astrophysicist Anatoly Spitkovsky of UC Berkeley and his colleagues suggests that the breakneck spin of a neutron star in a close binary—typically hundreds of rotations each second—distorts the initial flame of a type I burst into a lopsided hurricane, thanks to the strong Coriolis force. That storm then expands quickly around the star's equator.

    Observations of both type I bursts and superbursts show rapid oscillations that Strohmayer and Bildsten interpret as a hotter spot rotating in and out of view. However, the oscillations last far longer than it should take for the flame to travel evenly around the star. Nor do researchers know how quickly the ignition might spread for the deeper superbursts, where conditions are far harder to model.

    Despite all the unknowns, theorists agree on what not to expect: We shouldn't hold our breath for a super-superburst. “Superbursts are the last step for explosive ignition,” says Cumming, because their ashes consist largely of stable iron. Deeper inside a neutron star, he notes, further reactions will squeeze nuclei together into a true ball of neutrons, gradually releasing energy. The end result, just a few kilometers down but forever inaccessible, consists of matter crushed as close to oblivion as one can get without vanishing into a black vortex.


    Mild Winters Mostly Hot Air, Not Gulf Stream

    1. Richard A. Kerr

    The Gulf Stream does little to moderate European winters, it turns out, and the atmosphere plays a bigger role in climate change than once thought

    The idea that the Gulf Stream warms Europe goes back at least to pioneering U.S. oceanographer Matthew Maury, who waxed eloquent in 1855 about how it “makes Erin the ‘Emerald Isle of the Sea,’ and clothes the shores of Albion in evergreen robes; while in the same latitude, on this side, the coasts of Labrador are fast bound in fetters of ice.” Not so, says a group of climate researchers in next month's Quarterly Journal of the Royal Meteorological Society.

    Maury's contention that the Gulf Stream's warmth alone moderates Europe's winters might linger in the popular imagination and even in some scientific circles, but the reality is quite different, the group says. The heat carried into the North Atlantic by the great current is not what plays the dominant role in moderating Europe's winters, the researchers say. Instead, it's atmospheric circulation, tweaked by the Rocky Mountains, that counters intuitive expectations of high-latitude cold. The summer's warmth lingering in the North Atlantic also plays a role.

    The new study doesn't propose anything climatically new under the sun. Everyone has long agreed that three factors make Europe's winters milder than those of eastern North America at the same latitude. Winds blowing toward Europe from the west pick up heat from the waters of the North Atlantic, which retain far more summer heat than the interior of the North American continent. Those balmy winds give Europe a milder, maritime climate relative to North America's more extreme, continental climate. The Gulf Stream—the tail end of a great “conveyor belt” of currents carrying warm waters from the Southern Hemisphere—also contributes heat to the westerly winds and thus to Europe. And those westerly winds tend not to blow straight out of the west. They arrive over eastern North America from more out of the frigid north, intensifying the continentality of eastern North America's climate. In contrast, Atlantic winds tend to blow more from the warmer climes of the south before reaching Europe.

    Together, these moderating mechanisms work to make British winters 15°C warmer than Labrador's, but “no one had bothered to quantify the relative importance of the three effects,” says climate researcher Richard Seager of Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York, who with David Battisti of the University of Washington, Seattle, headed up the Quarterly Journal study. The lack of quantification, says Seager, left the impression in many quarters that the Gulf Stream's “ocean heat transport is the dominant mechanism making Western European winters mild.”

    Seager and his colleagues began by using meteorological observations made over the past half-century to calculate how the world's winds—which carry five times more heat out of the tropics than do ocean currents—distributed heat over the globe. Indeed, frigid northwest winds did chill central and eastern North America, whereas more southwesterly winds across the Atlantic warmed Europe. And analysis of marine observations showed that 80% of the heat that cross-Atlantic winds picked up was summer heat briefly stored in the ocean rather than heat carried in by the Gulf Stream.

    Seager and colleagues confirmed a minor role for the Gulf Stream by running two climate models with fairly complete climate systems, then removing all ocean heat transport from both models and comparing the results to the complete versions. The ocean heat transport of the Gulf Stream was crucial to warming Scandinavia and keeping the far northern North Atlantic free of ice. It also warmed latitudes south of Scandinavia by 3°C on both sides of the Atlantic. But the wintertime temperature contrast between Europe south of Scandinavia and eastern North America was still about 15°C.

    About half of that contrast is due to the huge wiggles induced in the prevailing westerlies as the winds pass over the Rocky Mountains and the rest of the high ground of the North American cordillera, the researchers found. Instead of blowing straight from west to east, the westerlies alternately snake more from the north and then turn more from the south and back to more northerly, like the wiggles in a rope shaken at one end. The researchers removed a model's cordillera, reducing the wiggles in the westerlies streaming toward Europe. That warmed eastern North America by as much as 6°C and cooled Europe by 3°C.
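The rough bookkeeping can be tallied in a few lines. This is a sketch using only the figures quoted in this article; the model experiments are not strictly additive, so the split is approximate rather than the paper's formal attribution:

```python
# Approximate bookkeeping of the ~15 C winter temperature contrast between
# Western Europe and eastern North America, using the figures quoted above.

total_contrast_c = 15.0

# In the model experiment, removing the North American cordillera warmed
# eastern North America by up to 6 C and cooled Europe by 3 C, so the
# mountain-forced wiggles in the westerlies account for roughly 9 C.
cordillera_c = 6.0 + 3.0

share = cordillera_c / total_contrast_c
print(f"Mountain (stationary-wave) share: ~{share:.0%}")
# ~60% -- in the same ballpark as the article's "about half." Most of the
# rest reflects summer heat stored seasonally in the ocean; the Gulf
# Stream's own heat transport warms BOTH sides of the Atlantic by ~3 C
# and so adds little to the east-west difference itself.
```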

    Seager and his colleagues seem to be making headway in their argument that the Gulf Stream plays less of a role than the winds in creating the wintertime east-west contrast. “It's an excellent paper,” says meteorologist Rowan Sutton of the University of Reading, U.K. “The importance of the Gulf Stream in European climate is often overstated.”

    The analysis will also no doubt stoke the debate over the relative roles of the Gulf Stream and the tropics in climate change (Science, 27 April 2001, p. 660). Geochemist Wallace Broecker of Lamont has long invoked a sudden shutdown of the Gulf Stream and its larger conveyor belt to explain abrupt climate shifts during the last ice age. In his scheme, the loss of Gulf Stream heat simply cooled everything down, including large parts of the Southern Hemisphere. “Broecker's simple [conveyor] diagram captured people's imaginations, but it's a bit simplistic,” says Sutton.

    Broecker now agrees. “I apologize for my previous sins,” he has said. Still, he remains convinced that the trigger of abrupt climate change, although not necessarily the sole driving force, lies in the North Atlantic.

    Whether or not he's right, the case of mild European winters only encourages Sutton and others to explore the more climatically sensitive atmosphere as well as the ocean. “There's been too much emphasis on mid- and high latitudes,” Sutton says. The tropics and its atmosphere deserve their due.


    An 11th-Hour Rescue for Great Apes?

    1. John Bohannon

    A globetrotting conservation biologist is spearheading a last-ditch effort to save these embattled primates from extinction—but the clock is ticking

    LONDON—Ian Redmond was grief stricken when he heard last May of Muraha's murder. He first met her 25 years ago, just 2 days after her birth in Rwanda. They grew so familiar that Muraha had sidled up to him on a visit last year, enraging a jealous suitor. She was one tough cookie: Despite losing a hand and a foot as a teenager, Muraha later gave birth and started raising a baby. And she was doing a good job: The 13-month-old infant was still healthy when biologists found it clinging to her corpse in the Virunga forest. Nearby lay another dead mother whose infant is presumed to have been kidnapped by the unknown assailants. Their motive? A stolen baby could fetch nearly half a million dollars on the black market: Gorillas like Muraha have never been bred successfully in captivity.

    Redmond looks dazed. He's in London for just a day before setting off to Malaysia to help the BBC on a story about a criminal ring of gorilla traders. This morning he met with the prime minister of the Central African Republic to offer advice on conserving that country's endangered apes. Just the day before, Redmond, an independent conservation biologist, had been at the World Summit on Sustainable Development in Johannesburg, South Africa, acting as technical adviser to the United Nations Great Apes Survival Project (GRASP). This was one of the few science-based projects to get a real boost at the summit, where delegates designated ape conservation a global priority.

    Muraha was a mountain gorilla (Gorilla beringei beringei), a species discovered exactly 100 years ago in the mountains east of the Congo River Basin and made famous by the 1988 film Gorillas in the Mist. Poaching and human-transmitted pathogens have since taken a heavy toll, leaving fewer than 650 mountain gorillas in the wild. Down in the valleys, eastern lowland gorillas (Gorilla beringei graueri) are not faring much better. These so-called Grauer's gorillas are being devastated by a mining spree reminiscent of the Klondike gold rush. The soil in this region is rich with coltan, an ore that's refined into the rare metal tantalum, which is used to make tiny capacitors within cell phones and laptop computers. In 2000, the price of tantalum shot up to $80 per kilogram, a small fortune by Congolese standards. Thousands of prospectors have trekked deep into gorilla country, digging up tons of soil in search of coltan-rich mud. Redmond has documented their destruction. To sustain themselves in the rainforest, miners hunt for “bush meat,” a catchall term for any large mammals, including gorillas. A 1996 survey by the Wildlife Conservation Society reported 17,000 Grauer's gorillas. This year there are roughly 2500.

    Field biologists today are becoming scientist-activists by necessity, says Redmond. Ape conservation, he says, is in “a state of emergency.” It amounts to triage—“a situation I know well and like very little,” says Redmond, referring to his days as a medic in the Royal Army Medical Corps in the early 1970s.

    Gentle giant.

    Ian Redmond comes to grips with Pablo, a mountain gorilla, while checking the silverback's body for parasites.


    Redmond began his biology career by moving to Africa in 1976 to study the parasites of mountain gorillas, identifying two new nematode species. The turning point for both Redmond and his mentor, the late Dian Fossey, came 2 years later when he discovered the headless, handless body of a long-studied gorilla. Since then he has devoted an ever-increasing share of his time to the politics of conservation. At times he has put his life on the line, once getting speared while confronting a poacher.

    Mountain gorillas are not the only apes at risk of extinction. Although they are legally protected in every country they inhabit, all great ape species—gorillas, orangutans, chimpanzees, and bonobos—are listed as endangered or critically endangered by the Convention on International Trade in Endangered Species (CITES). Hunting and habitat loss are the main threats, exacerbated by political instability (Science, 31 March 2000, p. 2386). At its present rate of decline, the bonobo (Pan paniscus) is predicted to go extinct within a decade. The Sumatran orangutan (Pongo abelii) is thought to have 5 years left. Only 150 individuals remain of the Cross River gorilla (Gorilla gorilla diehli).

    Apes have been in the limelight since the late 1960s, but their populations have dwindled in spite of numerous conservation efforts. Until now, says Redmond, these efforts have lacked funding, been poorly enforced, or ignored the interests of people who share forest resources with apes. At a meeting of CITES member states in 2000, Redmond suggested unifying existing ape conservation initiatives under a single banner. Klaus Töpfer, director of the U.N. Environment Programme, agreed, and a year later, 22 organizations came together to form GRASP.

    The hope is that GRASP will succeed in protecting apes where others have failed because of its scope and credibility as an international partnership. “GRASP is the last and best possibility for saving apes from extinction,” says William Travers, chief executive of Born Free, an ape conservation organization, “and Redmond is the dynamo at the heart of it.” There are 23 countries with viable populations of great apes. Most of these countries are among the poorest in the world, making sustainable solutions a challenge. The top priorities are to halt the sale of ape meat by finding alternative sources of income for hunters and to shut down illegal logging. Within a year's time, GRASP will convene a meeting between representatives of countries with apes and those willing to fund conservation. Redmond hopes to have an action plan for all ape populations ready for this meeting.

    In the midst of this explanation, Redmond pauses to receive an urgent message. Illegal logging has just penetrated the orangutan study area in Indonesia. “This is very disappointing,” says Redmond. “They're always one step ahead of us.” In spite of such setbacks, Redmond is confident that GRASP will be able to pull the great apes back from the brink. When asked how he remains optimistic, he replies: “I have to be.”
