News this Week

Science  28 Sep 2007:
Vol. 317, Issue 5846, pp. 1842
  1. INFECTIOUS DISEASE

    Vaccine-Related Polio Outbreak in Nigeria Raises Concerns

    Leslie Roberts

    Northern Nigeria has been hit by one of the largest known outbreaks of poliomyelitis caused by the live polio vaccine itself. The ongoing outbreak could be a serious setback for the global polio eradication campaign: It is occurring in a region where rumors about vaccine safety derailed vaccination efforts several years ago.

    Experts with the Global Polio Eradication Initiative emphasize that the widely used trivalent oral polio vaccine (OPV) is safe. But the low immunization rates in northern Nigeria have created the conditions for the attenuated vaccine virus to regain its virulence and trigger an outbreak.

    Detected in September 2006, the outbreak of vaccine-derived poliovirus (VDPV) type 2 was immediately reported to the World Health Organization and Nigerian health officials. But the information is just now being released publicly—in the 28 September Morbidity and Mortality Weekly Report and WHO's Weekly Epidemiological Record—a delay that has caused some consternation in the polio community. Officials say they were worried that the news, if misconstrued, could again disrupt polio vaccination efforts in Nigeria.

    “There were legitimate concerns that anti-polio vaccination rumors would be rekindled by an incomplete explanation of the cause of the VDPV outbreak,” says Olen Kew, who has led efforts to analyze the outbreak from the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia.

    Double whammy. Northern Nigeria is battling wild and vaccine-derived poliovirus.

    CREDIT: JEAN-MARC GIBOUX

    Several polio experts told Science that although they understand how sensitive the situation is, they disagree with the decision to keep quiet. “I am troubled that the information hasn't come out, absolutely,” says Donald A. Henderson of the University of Pittsburgh Center for Biosecurity in Baltimore, Maryland. Henderson says details of each outbreak are essential if scientists are to understand just how risky these vaccine-derived strains are.

    So far, there are 69 confirmed cases of paralysis, and more suspected, caused by VDPV in nine northern Nigeria states, says Kew. The case count seems certain to rise. About half the cases have occurred around Kano, a largely Muslim state where anti-Western sentiment and rumors that the vaccine caused sterility or AIDS led several states to halt polio vaccination in 2003. After repeated demonstrations of the vaccine's safety and considerable behind-the-scenes diplomacy, vaccinations resumed about a year later, but the damage had already been done.

    By the end of 2004, the number of polio cases in Nigeria had doubled to about 800, and in 2006 it soared to more than 1100. Wild virus from Nigeria reinfected some 20 other countries, leading to a spike in global cases. It was a huge setback to the Global Polio Eradication Initiative, which estimates that the world spent an additional $500 million to contain the damage. Only recently have global cases dropped back to near preboycott levels.

    Although Nigeria has since made considerable progress, wild poliovirus, both type 1 and type 3, is still circulating in the north, and vaccine coverage there remains low. In 2006, between 6% and 30% of children in the north had never received a single dose of OPV.

    Those are exactly the conditions that render an area susceptible to outbreaks of vaccine-derived virus. Since the 1960s, scientists have known that attenuated viruses can in rare instances mutate and regain virulence, but it was only in 2000, with an outbreak in Hispaniola, that they realized VDPVs could spread disease from person to person.

    The current outbreak came to light when a technician at the CDC polio lab noticed a preponderance of type 2 virus in the isolates sent in from northern Nigeria. That instantly raised suspicion, Kew says, because wild type 2 poliovirus has been eradicated globally. That meant the only possible source was the trivalent vaccine, which had been used in Nigeria in preboycott campaigns. Since Nigeria resumed vaccinations in 2004, says Kew, it had “quite properly” been using the more effective monovalent vaccines against wild types 1 and 3 in its campaigns. Genetic analysis quickly confirmed the source; it also suggests that VDPVs emerged independently multiple times in 2005 and 2006.

    In earlier outbreaks, circulating VDPVs have been relatively easy to stamp out, but this one has persisted despite four campaigns with trivalent OPV in the past year. “We suspect it is simply because the coverage was not adequate; we don't believe there is anything exceptional about this virus,” says Kew. As evidence, he notes that two VDPV strains jumped from Nigeria to Niger, where routine vaccination coverage is almost 90%. Both “barely made it 5 kilometers before they dead-ended,” he says.

    Polio expert Oyewale Tomori, vice chancellor of Redeemer's University near Lagos and chair of Nigeria's expert advisory committee for polio eradication, says he has been urging officials to go public. He worries that secrecy might fuel suspicions about vaccine safety instead of reinforcing the need to intensify immunizations in Nigeria.

  2. ENVIRONMENTAL POLICY

    Tougher Ozone Accord Also Addresses Global Warming

    Eli Kintisch

    Come for the ozone layer, stay for the climate. That come-on might have been the marketing spiel for negotiators meeting last week under the aegis of the United Nations Environment Programme to strengthen the Montreal Protocol, the 20-year-old accord on chlorofluorocarbons (CFCs) and other chemicals that deplete the ozone layer. And it worked. The delegates who returned to the city from which the 1987 treaty got its name also made significant progress in combating global warming by recognizing that most chemicals affected by the treaty are also potent greenhouse gases and that restricting them pays double dividends.

    The thin shell in the stratosphere that protects Earth from the sun's ultraviolet rays has a variety of enemies, and the Montreal Protocol has been tightened four times as restrictions have been placed on newly recognized threats. As a result, the rate of harmful emissions has slowed (see graph), and more than 90% of the production and use of ozone-depleting chemicals has been phased out. The biggest threats may have passed, say experts, but this year's weeklong meeting set itself two main goals: to clamp down on ozone-harming refrigerants that have become prevalent in the developing world, and to do it in a way that could provide tangible side benefits for climate.

    Production. A Chinese HCFC factory.

    CREDIT: (PHOTO) EZRA CLARK/ENVIRONMENTAL INVESTIGATION AGENCY

    Protection. The 1987 ozone treaty has reduced global greenhouse gas emissions that otherwise would have risen sharply.

    CREDIT: ADAPTED FROM G. J. M. VELDERS ET AL., PNAS (2007)

    By the end of the meeting, they could claim to have met both goals. Most impressive was an agreement by delegates to push forward by a decade a legally binding schedule to phase out in developing nations a family of chemicals called hydrochlorofluorocarbons (HCFCs). The 191 participating nations also pledged to continue financing a transition fund, currently at roughly $150 million per year, to support the conversion to alternatives. And with the urging of U.S. officials, the delegates also pledged to make sure that the HCFC replacements would have the lowest possible harmful impact on global warming.

    “The delegates deserve lots of credit for both recognizing and seizing a historic opportunity to protect both the ozone layer and the climate,” says Alexander von Bismarck of the Environmental Investigation Agency, a London-based nonprofit that has monitored the treaty. Activists hope the action gives momentum to international meetings on climate change occurring in New York and Washington, D.C., as Science went to press.

    The Montreal Protocol arose out of two scientific developments. In 1974, chemists F. Sherwood Rowland and Mario Molina calculated that CFCs, a common ingredient in spray cans, could destroy stratospheric ozone. Although that discovery led to some voluntary curbs on their use, the impetus for mandatory action came in 1985, when British scientists measured an ozone “hole” over the Antarctic. The new agreement is consistent with the findings of a scientific assessment in August that said an accelerated HCFC phaseout in developing countries would produce savings equivalent to 18 billion tons of carbon dioxide emissions by 2050. HCFCs were meant to be a transition chemical as countries phased out CFCs, but they have been widely used as coolants in the booming economies of China and India.

    The agreement would freeze HCFC production by developing nations in 2013—2 years earlier than planned—followed by successive cuts until production was ended in 2030, a decade sooner than previously agreed. Developed nations agreed to advance their deadline for a phaseout from 2030 to 2020, as well as promising to provide “stable and sufficient” funds for replacements while “taking into account global warming potential” in deciding which chemicals to accept as substitutes.

    “You couldn't have imagined this 5 years ago,” says Durwood Zaelke of the nonprofit Institute for Governance and Sustainable Development (IGSD) in Washington, D.C. The current availability in China of products containing many of the HCFC alternatives bodes well for the new agreement, says DuPont chemist Mack McFarland. An important driver during talks was a statement from the G8 summit in June, pushed by the U.S., pledging the industrial powers to climate-friendly action on ozone. “It's being held up [as an inspiration] by all the parties” during negotiations, said IGSD's Scott Stone, who attended last week's conference. “This [showed] great leadership by the White House.”

    The reviews weren't entirely positive. Delegates agreed to U.S. demands to continue an exemption for ozone-unfriendly methyl bromide, a fumigant used by U.S. farmers, allowing annual emissions of 4600 tons. David Doniger of the Washington, D.C.-based Natural Resources Defense Council, which wants a full phaseout, called that “a black mark” on an otherwise strong U.S. performance. White House environment aide James Connaughton says U.S. negotiators hope to build on the success in Montreal during a 2-day meeting hosted by the Bush Administration this week attended by representatives from 15 industrial nations and major emitters.

  3. BIOWEAPONS

    Panel Wants U.S. Program to Retain Its Russian Roots

    Yudhijit Bhattacharjee

    Four years ago, scientists in Obolensk, Russia, wanted to ship a strain of anthrax to a lab in Fort Detrick, Maryland, under a U.S. program intended to prevent former bioweapons scientists from selling their expertise to terrorists or “rogue” nations. But the Russian government wouldn't give the Obolensk lab an export license.

    Such setbacks are one reason the Department of Defense (DOD) has decided to phase out collaborative research projects in Russia under its Biological Threat Reduction Program. But a new report by the U.S. National Academies' National Research Council (NRC) says the agency is making a mistake. It calls on DOD to increase support for the program and to give Russian administrators and scientists a greater voice in determining its direction.

    Whose threat? U.S. entities have received most of the money for research on reducing the threat of Russian bioweapons.

    SOURCE: DEFENSE THREAT REDUCTION AGENCY

    Since 1998, the agency has spent more than $430 million on dismantling biological weapons production centers, improving security at research facilities, setting up disease surveillance networks, and supporting scientists throughout the region. The activities are part of the U.S. government's biological nonproliferation efforts in the former Soviet Union. DOD officials plan to continue, and even increase, support for scientists in other countries in the region. But the difficulties in gaining access to Russian labs, combined with an improved outlook for funding science that stems from the country's robust economy, have made Russia the odd man out. “We think the Russians are now perfectly capable of transitioning their former weapons labs to make them part of their public health system,” a senior DOD official told Science this summer.

    The academies report, requested by DOD's Defense Threat Reduction Agency (DTRA) at the urging of Congress, takes a different view. “Although the economic situation in Russia is stabilizing, the future of a large number of biological institutions is in flux,” it says. “And many former weapons scientists remain trapped in uncertain circumstances that could raise serious proliferation concerns.” The bottom line, according to NRC's Glenn Schweitzer, is that “Russia is too important a country to not engage in.”

    The report cites the program's success in improving security at a number of Russian labs. And it suggests that heeding the wishes of its Russian hosts would strengthen the program. “DOD continues to be interested only in studying pathogens it considers dangerous for bioterrorism, while the Russians want to tackle health problems like cholera and tuberculosis,” explains Sonia Ben Ouagrham-Gormley of the Monterey Institute of International Studies in Washington, D.C. The fact that most of the money has gone to U.S. contractors and visiting scientists has also soured Russians on the program, the report notes.

    The panel's recommendations are right on the mark, says Carleton Phillips II, a biologist at Texas Tech University in Lubbock and the principal investigator on a DOD-funded project in Kyrgyzstan aimed at mapping the distribution of mammals that are reservoirs of infectious diseases. He says the best way to reduce biological threats in the former Soviet Union is to “engage local scientists in true collaborations.”

    Now that the experts have spoken, will DOD listen? A DTRA official says the report “will definitely have an impact on the future of the program,” adding that the agency plans to respond before the end of the year. A staffer on the Senate Foreign Relations Committee also likes the report's recommendations. “There is basic support on the Hill for doing more collaborative research in Russia,” he says.

  4. NEUROSCIENCE

    Uncovering the Magic in Magnetic Brain Stimulation

    Greg Miller

    In recent years, neuroscientists and psychiatrists alike have touted the potential uses of a noninvasive brain stimulation technique called transcranial magnetic stimulation (TMS). The method has been used to disrupt neural activity experimentally in studies of human cognition, and it has shown promise in clinical trials for treating psychiatric disorders such as depression (Science, 18 May 2001, p. 1284). Although widely considered safe—thousands of people have received TMS—relatively little is known about how it actually works. Now, a detailed look at its effects shows that TMS can boost or dampen the firing of neurons depending on ongoing brain activity.

    Neuroscientists at the University of California, Berkeley, applied TMS to the cerebral cortex of cats while monitoring neural activity and metabolism. Their findings, reported on page 1918—and future investigations of this type—will have important implications for how TMS is used in people, other researchers say.

    One interesting possibility, according to Mark George, a psychiatrist at the Medical University of South Carolina in Charleston, is that it may matter what subjects think about while they're being stimulated, a factor that hasn't received much consideration to date. George, who pioneered TMS therapy for depression, says a better understanding of how TMS works will enable researchers and clinicians to apply it more effectively: “This is precisely where the field needs to go.”

    In a typical TMS procedure, technicians place a ring-shaped paddle near the scalp. Electric currents swirling inside the paddle produce a magnetic field that in turn generates currents in the underlying brain tissue. These currents alter the electrical activity of neurons, but exactly how they alter it is poorly understood.
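    In textbook terms (our notation, not the article's), this coupling is Faraday induction: the rapidly changing current in the coil produces a time-varying magnetic field B, which induces an electric field E in the conductive brain tissue beneath,

        \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}

    Because the induced field scales with how fast B changes, TMS devices deliver brief, steep current pulses rather than a steady field.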

    Stimulating results. New research hints at the mechanisms of magnetic brain stimulation.

    CREDIT: COURTESY OF HANNA MÄKI/LABORATORY OF BIOMEDICAL ENGINEERING, HELSINKI UNIVERSITY OF TECHNOLOGY AND BIOMAG LABORATORY, HELSINKI UNIVERSITY HOSPITAL

    Led by Ralph Freeman and graduate students Elena Allen and Brian Pasley, the Berkeley scientists applied TMS to the visual cortex of anesthetized cats and tracked the aftermath using probes developed in Freeman's lab that can simultaneously record the electrical activity of neurons and measure fluctuations in oxygen concentration, an indicator of energy consumption. Using optical imaging methods, the researchers also tracked hemoglobin levels, another metabolic marker. A train of TMS pulses lasting a few seconds caused an immediate increase in neural firing that lasted for about a minute, followed by a decrease in firing for several minutes. Oxygen and hemoglobin mirrored this pattern, indicating that neurons' firing and energy demands go hand in hand.

    TMS had a dramatically different effect, however, on neural activity evoked by black and white bars flashed on a computer screen. (Such responses persist even in anesthetized animals.) In this case, neural firing dipped sharply after TMS and remained suppressed for several minutes.

    The findings have implications for designing TMS therapies, says George. For depression therapy, for example, “we may need people to become sad in the chair while stimulating [them],” George says. “Alternatively, we might have them engage in formal cognitive therapy, thinking positive thoughts.” Such considerations are important, he adds, as the Food and Drug Administration is considering approval for daily TMS of the prefrontal cortex to treat depression.

    The new findings also suggest why the effects of TMS often vary, says Alvaro Pascual-Leone, a neurologist at Harvard Medical School in Boston. Pascual-Leone suggests that TMS results could be made more consistent by monitoring the physiological state of the brain using electroencephalography or functional magnetic resonance imaging.

  5. ENGINEERING

    Pollution Slows China's Canal Project

    Xin Hao

    The first phase of a massive project to replumb some of China's mightiest waterways has fallen far behind schedule because local authorities don't want to pay for the privilege of drinking polluted water.

    The South-to-North Water Diversion Project is a three-stage effort to alleviate chronic water shortages in the country's more populous but parched northern plains (Science, 25 August 2006, p. 1034). The eastern route makes use of an existing network of canals, rivers, and lakes to pump and move water from the lower Yangtze River to Jiangsu and Shandong provinces. But this month, the official Xinhua news agency announced that the first phase of the route, scheduled to begin operating this year, has been delayed at least 3 years.

    Nearly half of the $4 billion cost of the first phase is earmarked for improving the quality of the water. However, the central government is footing only about 10% of the bill, with the rest expected to come from localities that will benefit from the project. But because nobody wants to clean up somebody else's dirty water, few treatment facilities have been built along the route, and water quality continues to deteriorate. So far this year, according to Xinhua, the water is drinkable at only one of the 21 monitored cross sections in Shandong.

    Some engineering experts say the entire project needs to be rethought, with a greater emphasis placed on improving the ecology of the Yellow and Huai river basins. Dredging the Grand Canal north of the Yellow River to make the ancient waterway navigable, they say, would provide a greater benefit to the region and, thus, attract more investment.

    Qian Ye, a climate researcher at the U.S. National Center for Atmospheric Research in Boulder, Colorado, thinks the Chinese government should do a more comprehensive feasibility study of the project that considers the impact of climate change. Global warming, he says, could make China's north wetter and allow authorities to scale back the controversial project.

  6. CLIMATE CHANGE

    A Far-South Start for Ice Age's End

    Richard A. Kerr

    Long story. A lengthy sediment core tells of an early thaw.

    CREDITS: (PHOTO) L. D. STOTT/UNIVERSITY OF SOUTHERN CALIFORNIA

    Where was the thermostat switch that, once thrown, began to thaw the world out of the last ice age? Paleoceanographers long assumed that it lay in the North Atlantic Ocean somewhere; then the tropical ocean gained popularity in some quarters. But now, strong new evidence from the tropics places the start yet farther south, in the waters around Antarctica. The result “is all very solid, very hard to question,” says paleoceanographer William Ruddiman, professor emeritus at the University of Virginia, Charlottesville. “But it also tells us things are complicated. There are just layers of complexity to this.”

    Finding where it all started “comes down to timing,” says paleoceanographer Lowell Stott of the University of Southern California in Los Angeles. But determining the timing of climate events can be tough when, say, warming in the tropics is recorded in marine sediment, whereas warming in Antarctica is recorded in glacial ice. Those are dated by entirely different methods, which injects an uncomfortable amount of uncertainty.

    Stott and colleagues Axel Timmermann, a modeler at the University of Hawaii, Manoa, and paleoceanographer Robert Thunell of the University of South Carolina, Columbia, eliminated that uncertainty, at least, by gauging changing temperature in western Pacific surface waters and in Antarctic waters in a single sediment core recovered just west of the Philippine island of Mindanao. At any point in the core, microfossils that had fallen from western Pacific surface waters recorded temperature there in their oxygen isotopic composition, whereas microfossils that always lived on the sea floor recorded the temperature of bottom water that had sunk from the surface of the Southern Ocean near Antarctica. Then the group radiocarbon-dated the sediment.
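    The thermometer here is the standard oxygen-isotope proxy, a convention the article leaves implicit: the ratio of heavy to light oxygen in a microfossil's calcite shell is reported as

        \delta^{18}\mathrm{O} = \left( \frac{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\mathrm{sample}}}{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\mathrm{standard}}} - 1 \right) \times 1000 \text{ per mil}

    All else being equal, shell δ18O rises as the water in which the organism grew gets colder (very roughly 0.2 per mil per degree Celsius), which is how surface-dwelling and bottom-dwelling microfossils in the same core can record two different water temperatures.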

    A southerly start. Water from near Antarctica (bottom) warmed before the tropics (top).

    CREDIT: (GRAPHIC) ADAPTED FROM L. D. STOTT ET AL., SCIENCE

    The results, reported online at www.sciencemag.org/cgi/content/abstract/1143791, were startling. In an earlier Science paper, Thunell and Stott had concluded that the tropical Pacific had warmed first, presumably causing glacial ice to begin melting. But their new analysis shows that more than 18,000 years ago, Antarctic waters warmed 1000 to 1300 years before tropical waters.

    Starting from that timing and drawing on other dated records, Stott and colleagues spin a tale of how the ice started melting. First, predictable variations in Earth's orbit and tilt increased the amount of sunlight hitting high southern latitudes during austral spring. That warmed things up locally and shrank the sea ice back toward Antarctica, uncapping the Southern Ocean and freeing much of its carbon dioxide to begin warming the whole world.

    Nice story, other researchers say, and the starting point at least seems fairly solid. “I think they make a convincing case that something is happening at high southern latitudes before tropical temperatures change,” says paleoceanographer Jean Lynch-Stieglitz of Georgia Institute of Technology in Atlanta. But, as she and Ruddiman both note, putting together the deglaciation story is “a tricky business.” And there are dissenting voices. Paleoceanographer David Lea of the University of California, Santa Barbara, says it isn't so clear polar warming preceded tropical warming, given the difficulty of picking out exactly when the tropical warming began. All agree that finishing up the story in the Northern Hemisphere—where most ice melting eventually occurred—will take much more work.

  7. COSMOLOGY

    A Singular Conundrum: How Odd Is Our Universe?

    Adrian Cho

    Subtleties in the big bang afterglow could hint that the universe is arranged around an “axis of evil.” Or they may be the products of random chance. With only one universe to study, researchers may be hard-pressed to say one way or the other.

    Weirdness ahead? Future maps of all the galaxies will be scrutinized for unexplained patterns.

    CREDIT: MAX TEGMARK/SDSS

    The universe: there's nothing else like it. So if the cosmos is strange in some way, how would you tell? That may sound like the beginning of an annoying argument among philosophy majors. But cosmologists have been debating just this point as they try to figure out whether—just maybe—our universe is even weirder than they thought.

    The controversy has been simmering for some time. In 2003, NASA's Wilkinson Microwave Anisotropy Probe (WMAP) satellite measured the light lingering from the big bang (Science, 14 February 2003, p. 991). Researchers charted the slight variations in the temperature of the radiation, known as the cosmic microwave background (CMB), to produce a sky map resembling a dimply lime. Analyzed statistically, that iconic map bolstered a bizarre scenario called inflation, in which, in a billionth of a trillionth of a trillionth of a second, the newborn universe doubled and redoubled in size 100 times over, stretching each atom-sized volume to the size of a galaxy.
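    The arithmetic behind that last claim is easy to check (a back-of-the-envelope calculation of ours, not the article's): doubling 100 times stretches every length by

        2^{100} \approx 1.3 \times 10^{30}

    so a patch about 10⁻¹⁰ meters across, the size of an atom, grows to roughly 10²⁰ meters, on the order of 10,000 light-years, which is indeed galactic scale.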

    But the map led to some mysteries, too. Within 6 months, one team had found a curious alignment of certain undulations in the CMB. Others soon found more correlations that suggested that the cosmos might be skewered like a meatball on a toothpick by an “axis of evil.” That axis might show that the universe has a strange shape or is rotating. It could trash cosmologists' cherished assumption that the universe has no center and no special directions, the so-called cosmological principle that traces its origins to Copernicus. Or it could be a meaningless fluke. “Everyone agrees it's there,” says Kate Land, a cosmologist at the University of Oxford in the U.K. “But is it significant?”

    There's the rub: With only one universe to measure, it may be impossible to tell. Modern cosmological theory mixes the elegant predictability of Einstein's theory of gravity with the inherent randomness of quantum mechanics, which held sway when the universe was just an infinitesimal speck. So theorists can predict the statistical or average properties of the universe, but not the individual quirks and coincidences caused by random quantum fluctuations all those billions of years ago.

    Moreover, although the universe in toto may be infinite, we see only a finite part of it. Because the universe is expanding and the speed of light is finite, anything beyond 50 billion light-years away is zooming off so fast we'll never glimpse it. So the axis could hint at some fundamental feature of the entire universe. Or it could be a meaningless peculiarity of the bit we can see, our so-called Hubble volume or observable universe. Deciding will be difficult, as it's impossible to peer into neighboring Hubble volumes or to repeat the “experiment” that produced our homey patch of the infinite.

    “The main problem with cosmology is our sample size—that of just one universe,” Land says. “If our universe is unusual, what does it mean?” The question underscores just how much cosmologists' observations have improved in the past 2 decades. “In the past, our frontier has always been set by technology; you could always build a bigger telescope and look deeper,” says Max Tegmark of the Massachusetts Institute of Technology (MIT). “For the first time, we've hit the final frontier.”

    So far, only measurements of the CMB have run up against that barrier. But as cosmologists launch studies that take all they can see as one measurement, other efforts could hit it, too. A few researchers think that the matter undermines cosmology's status as a science. All agree that having only one observable universe means that some of its quirks may remain forever mysterious.

    Fanfare for the common universe

    Like a Jimi Hendrix power chord, the CMB reverberates through time. The harmonies in the electromagnetic echo reveal the state of the universe when the chord was struck, an instant after the big bang.

    According to current theory, the universe popped into existence infinitely dense and hot and crammed with light and subatomic particles. Within 10⁻³³ seconds, inflation stretched it immensely, before its expansion slowed to a more leisurely pace. Inflation evened out the temperature of the universe, stretched space as flat as a taut bed sheet, and diluted to nothing the numbers of certain pesky particles that theorists say should exist but that have never been seen.

    The big blowup also sowed the seeds for the ripples in the CMB. The stretching magnified tiny quantum fluctuations in the soup of fundamental particles, creating slight variations in its density. Matter began to coalesce into the denser spots, setting off a sloshing of light and matter and leading to tiny temperature variations. These are the same variations conveyed by the CMB, which began shining through the cooling universe 400,000 years after the big bang, when light-trapping protons and electrons combined to form transparent hydrogen atoms.

    To decipher the mottled CMB map, WMAP researchers broke it down much as a musical chord can be broken into individual pitches. Any spherical map can be viewed as the sum of coarser and finer undulations called harmonics or multipoles. For the CMB, the coarsest, the dipole, simply divides the sky into hotter and colder halves. The next, the quadrupole, divides the sky roughly in four, into the two hottest and two coldest regions, and so on. Researchers measured the strengths of hundreds of harmonics and plotted them in a so-called power spectrum, the cosmological equivalent of musical notation specifying which notes to play louder or softer.
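    In the field's standard notation, which the article avoids, the decomposition expands the temperature map in spherical harmonics, and the power spectrum records the average strength of each multipole number ℓ:

        \Delta T(\hat{n}) = \sum_{\ell \ge 1} \sum_{m=-\ell}^{\ell} a_{\ell m} Y_{\ell m}(\hat{n}), \qquad C_{\ell} = \frac{1}{2\ell + 1} \sum_{m=-\ell}^{\ell} |a_{\ell m}|^{2}

    Here ℓ = 1 is the dipole, ℓ = 2 the quadrupole, and ℓ = 3 the octopole; larger ℓ corresponds to finer ripples on the sky.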

    The WMAP team then tried to match this spectrum to the predictions of cosmological theory. They found they could do just that if the universe is precisely 13.7 billion years old and flatter than an Illinois cornfield. It also has to contain 4% ordinary matter; 23% dark matter, which has revealed itself only through its gravity; and 73% space-stretching dark energy, which is currently accelerating the expansion of the universe (Science, 19 December 2003, p. 2038).

    The power spectrum also rises and dips in several places, revealing how sound waves rippled through the toddler universe. Inflationary theory predicted the nature of the bumps and wiggles, and the WMAP data fit the predictions precisely, says Charles Bennett, a cosmologist at Johns Hopkins University in Baltimore, Maryland, and leader of the WMAP team: “It certainly was a major victory for inflationary cosmology.”

    The axis emerges

    Amid the cacophony, however, scientists detected some distinctive harmonies. In 2004, MIT's Tegmark and Angélica de Oliveira-Costa, both then at the University of Pennsylvania, reported in Physical Review D that the hot and cold spots of the octopole pattern are arrayed around a single axis a bit like the panels on a basketball. Moreover, this axis appears to line up with a similar axis for the quadrupole. They estimated the chances that the alignment is a fluke at 1 in 66.

    A few months later, another team found more curious alignments. The axis defined by the quadrupole and the octopole lies in the plane of the solar system, known as the ecliptic, and points toward the equinoxes, the two points on Earth's orbit at which night and day are equal length all over the planet, reported Glenn Starkman of Case Western Reserve University in Cleveland, Ohio, Dominik Schwarz of the University of Bielefeld in Germany, and colleagues in Physical Review Letters.

    Poles apart. The axis of evil (poles marked red) lies almost perpendicular to the solar system's axis (poles marked blue) and far from the galactic axis.

    CREDIT: NASA/WMAP SCIENCE TEAM

    Then, in 2005, Oxford's Land and João Magueijo of Imperial College London reported that the next two harmonics also appear to be aligned with the quadrupole and octopole. They interpreted that as possible evidence that the universe is in fact arrayed around a special axis. If that were true, the cosmological principle would go out the window. The alignments are all the more suggestive because they involve the broadest undulations on the microwave sky, Land says: “If there were something odd cosmologically, this is where you'd expect it to kick in.”

    Many are skeptical. “If you look at a data set in 1000 different ways, you expect to find something that's unusual at the 1-in-1000 level,” WMAP leader Bennett says. Some suspect that the axis may be an illusion produced by an unaccounted bias in how the satellite works. And even those who have studied the alignments note that exactly how unlikely they appear depends on which mathematical tools researchers use to analyze them. Still, many are taking it seriously. “I would say that with a bit more than 99% confidence you can say there's something strange,” Schwarz says.

    Axis-stential musings

    Assuming the axis is more than a measurement error, what new physics might it point to? The most conservative explanation is that the alignments reflect contamination from some nearby stuff that either emits or absorbs microwaves. Researchers already have to filter out the “foreground” microwave glare from the disk of the galaxy. A signal originating within the solar system might explain the connection to the ecliptic—although that would be far more likely if the axis ran parallel to the axis of the solar system, not perpendicular to it.

    Such a foreground would be sexier than it sounds at first, Starkman says. At the least, it would reveal new astrophysics, perhaps some really bizarre form of dust. Moreover, the presence of a foreground from the solar system would most likely only bolster the case for the cosmic axis, he says. Strange though it may sound, compensating for such a foreground would probably accentuate the oddities. “The chances are that when you subtract it, the data will agree even less well with the theory than they do now,” Starkman says.

    By far the most exciting possibility is that the axis indicates that the universe is stranger than cosmologists have assumed. For example, the universe could whip up such an axis if space had an odd shape, such as a torus that wraps around and reconnects to itself. “If you want to explain the axis of evil, the easiest way would be to say that our universe is a lot like [the video game] Asteroids, where if you go off the screen on one side, you come back on the other,” Tegmark says. Others have proposed weirder shapes or suggested that the whole universe could be spinning around the axis.

    None of these tantalizing ideas has bowled researchers over, however. Tegmark and de Oliveira-Costa's doughnut universe clashes with other observational constraints. If the universe wraps around in such a way, then researchers should see faint matching circles in the CMB on opposite sides of the sky (Science, 22 June 2001, p. 2237). But none have been found. Other models suffer similar problems. “It's tough because on one hand, I'm on the side of saying that this may be telling us something,” Starkman says. “On the other hand, so far I'm unconvinced by the ideas that have been put forward for what this might be telling us.”

    Of course, the axis could just be a fluke, a coincidence produced by primordial quantum fluctuations in our particular Hubble volume. Although Albert Einstein insisted that “God does not play dice”—so often that others tired of hearing it—theorists now think that in making the cosmos, the metaphorical creator rolled the bones once and walked away. Perhaps, like troubled gamblers, some cosmologists read too much into the fact that he tossed a 2 and a 3.

    One thing is certain: Cosmologists will never figure out what the axis of evil is by remeasuring the CMB. Researchers have measured the temperature variations in the CMB so precisely that the biggest uncertainty now stems from the fact that we see the microwave sky for only one Hubble volume, an uncertainty called cosmic variance. “We've done the measurement,” Bennett says. “It's not going to get any better.”
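    Cosmic variance has a simple conventional form, not quoted in the article: each multipole ℓ offers only 2ℓ + 1 independent modes on a single sky, so even a noiseless full-sky measurement of C_ℓ carries a fractional uncertainty of

        \frac{\Delta C_{\ell}}{C_{\ell}} = \sqrt{\frac{2}{2\ell + 1}}

    For the quadrupole (ℓ = 2) that works out to about 63%, which is why no better instrument can sharpen the statistics of the sky's broadest undulations.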

    Going my way? The CMB quadrupole (top left), octopole (top right), and the next two multipoles. The red dots mark their symmetry axes, which appear to line up.

    CREDIT: NASA/WMAP SCIENCE TEAM

    The geology of the cosmos

    To be sure, many things remain to be measured. But cosmic variance could ultimately pinch other sorts of studies, such as galaxy surveys. Researchers with the Sloan Digital Sky Survey are using a 2.5-meter telescope at Apache Point, New Mexico, to map everything they can see in a quarter of the sky and have spotted 80 million galaxies so far. The proposed 8.4-meter Large Synoptic Survey Telescope (LSST) aims to tally 3 billion. By the middle of the century, researchers likely will have surveyed all 100 billion bright galaxies in our Hubble volume, says Michael Turner, a cosmologist at the University of Chicago in Illinois.

    Such surveys aim to study the distribution of the galaxies en masse, or how the galaxies' images are distorted by huge threads of dark matter spanning the universe. In doing so, they trace the evolution of the density fluctuations that rippled the CMB. That's because those fluctuations also spawned the dark matter filaments, which in turn seeded the galaxies. Even with LSST, some of those studies will butt against the limits of cosmic variance, says Lloyd Knox of the University of California, Davis.

    That barrier to knowledge, some argue, is cosmology's Achilles' heel. “Cosmology may look like a science, but it isn't a science,” says James Gunn of Princeton University, co-founder of the Sloan survey. “A basic tenet of science is that you can do repeatable experiments, and you can't do that in cosmology.” Others don't see the problem. “So far, the fact that we can only see one Hubble volume has not been an impediment to understanding the origin and evolution of the universe,” Turner says. Some note that cosmic variance will limit measurement of only the very largest features of the universe and that studying the myriad smaller bumps and wiggles may be more revealing anyway.

    Even so, all agree that cosmic variance highlights a definite limit to what cosmology can tell us. “The goal of physics is to understand the basic dynamics of the universe,” Turner says. “Cosmology is a little different. The goal is to reconstruct the history of the universe.” Cosmology is more akin to evolutionary biology or geology, he says, in which researchers must simply accept some facts as given. For example, the theory of plate tectonics does not explain why Earth has precisely seven continents.

    That distinction may disappoint the many researchers who have come to cosmology by way of physics, a field that prides itself on its rigor and unparalleled testability. Many hope to connect particle physics directly to the birth and evolution of the universe to arrive at an all-encompassing theory that, at least statistically, would allow them to mathematically derive the universe. “I don't like unexplained coincidences,” Tegmark says. “Generally, I want an explanation.”

    To get one for the axis, many say, researchers must press on concocting new models. Others suggest looking for evidence of the axis in galaxy surveys and other types of data. A few even say that the most promising route may be to develop theories that extend to the “multiverse” of Hubble volumes beyond our own (Science, 23 July 2004, p. 464) and ask why the axis might help make our existence in this one more likely.

    All agree that, in the end, they may never know if the axis signifies anything. “Let's face it: Nature might leave us wondering forever about certain questions,” Tegmark says. We have only one universe, and in some ways perhaps it just is as it is.

  8. EDUCATION RESEARCH

    U.S. Says No to Next Global Test of Advanced Math, Science Students

    Jeffrey Mervis

    After U.S. high school students did poorly on TIMSS in 1995, the government has decided not to participate in another version to be given next year

    In 1995, the United States lagged behind most of the world on a test of advanced mathematics and physics taken by graduating high school students from 16 countries. That won't happen again, if the Bush Administration has its way: It has decided not to participate in the next version of the test.

    The National Center for Education Statistics (NCES), part of the U.S. Department of Education's Institute of Education Sciences (IES), says it is bowing out of 2008 TIMSS-A, an advanced version of the Trends in International Mathematics and Science Study given quadrennially to younger students, because it can't fit the $5 million to $10 million price tag into its flat budget. Officials also question whether the target cohort—students finishing secondary school who have taken advanced mathematics and physics courses—is comparable around the world.

    But many leaders in the mathematics community believe that the Administration opted out because it feared another poor U.S. performance would reflect badly on its signature education program, the 2002 No Child Left Behind Act. While advocates of the test look for other sources of funding, Science has learned that the National Board for Education Sciences, which advises IES, will ask for a review of the decision next month.

    International tests have proliferated in recent years as countries seek ways to measure how well they are preparing students for jobs in a global economy. And although fourth- and eighth-grade U.S. students have performed adequately on the TIMSS tests, high school seniors have not. In 1995, the last time that cohort was measured, U.S. students topped only Austria in advanced math and ranked dead last in physics.

    Planning for 2008 TIMSS-A began in 2006 at the urging of Norway and Sweden. Although 16 countries participated in the first test, only nine—the two proponents plus Russia, Italy, the Netherlands, Slovenia, Iran, Lebanon, and Armenia—have ponied up for the new test, which covers geometry, algebra, and calculus as well as mechanics, electricity and magnetism, heat and temperature, and atomic and nuclear physics. Sometime last year, NCES quietly decided not to get involved, and since then Australia, Germany, and Finland have also dropped out.

    Leaders from the U.S. mathematical community, including the National Council of Teachers of Mathematics and the American Mathematical Society, are up in arms at the department's decision, first reported last month by the newspaper Education Week. They argue that this elite group of students needs to be monitored because they are most likely to major in STEM (science, technology, engineering, and mathematics) fields in college and become the next generation of scientists and engineers. “It's inconceivable to me that the government wouldn't fund our participation,” says Stanford mathematician R. James Milgram, a member of the IES advisory board that expects to take up the issue at its 30 to 31 October meeting. “The 1995 test was extremely important in showing that a problem exists,” he notes. “And the only way to know if we're beginning to turn things around is by looking at new data to see if we've made any progress.”

    CREDIT: PETER HOEY

    In defending their decision, NCES officials note that they are already supporting international assessments such as the regular 2007 TIMSS for fourth and eighth graders, a fourth-grade reading exam, a math and science assessment of 15-year-olds, and a planned survey of adult literacy. They say that U.S. students may be at a disadvantage because some TIMSS-A test-takers from other countries are older and may have specialized in math and science during the latter part of their secondary school years. In addition, says NCES associate commissioner Valerie Plisko, whose office manages the various international assessments, “we typically do not benchmark against these countries.”

    But those explanations don't pass muster with critics. TIMSS-A “is not just a horse race,” responds Patsy Wang-Iverson, coordinator of the group advocating U.S. participation and also vice president of the Gabriella and Paul Rosenbaum Foundation in Bryn Mawr, Pennsylvania, which supports mathematics education. She says there is much that U.S. educators can learn from looking more closely at this population. “A lot has changed since 1995,” she says. “Students are taking more math and science and more AP [advanced placement] courses, and TIMSS-A provides us with a wonderful opportunity to evaluate their performance. If we don't do it now, we'll lose track of an entire generation of reform efforts.”

    After NCES bowed out, officials at the National Science Foundation (NSF) asked the Educational Testing Service (ETS) in Princeton, New Jersey, to propose how it would administer TIMSS-A. ETS's approach also would have laid the foundation for a longitudinal study of these advanced math and science students. But this summer, NSF officials declined to fund the proposal after reviewers raised questions about the target population and ETS's ability to improve on the disappointingly low levels of U.S. participation in the 1995 test. “We'd have to do more work to resolve those issues,” admits ETS's Michael Nettles.

    Michael Martin, co-director of the Boston College-based center that manages the international TIMSS-A assessment, says the group is on schedule to administer TIMSS-A next spring in participating countries and report the results by the end of 2009. Any change of heart by U.S. officials, he adds, won't alter that time frame. “We are sad that the United States won't be participating,” Martin says. “But at some point the ship must sail.”

  9. BIOSAFETY BREACHES

    Accidents Spur a Closer Look at Risks at Biodefense Labs

    Jocelyn Kaiser

    Failure to report a Brucella infection and other problems at a Texas university have microbiologists searching for ways to ensure safety and public trust

    Redundancy. A positive-pressure “space suit” is one of several precautions used to protect workers from the deadliest pathogens in a biosafety level 4 lab.

    CREDIT: CDC

    An unreported infection with a dangerous pathogen and other biosafety breaches at a Texas university are fueling an already heated debate about safety at U.S. biodefense labs. The problems at Texas A&M University in College Station, which led federal officials to shut down the university's biodefense research this summer, follow a spate of accidents at other U.S. labs in the past few years. They also coincide with the accidental release of foot-and-mouth virus from a research facility in the United Kingdom that has shown the potential economic devastation that can result if a pathogen escapes. These events are bringing new urgency to a question raised soon after the United States began pouring money into biodefense research after the 2001 anthrax attacks: Are the nation's biodefense labs safe enough?

    “Proponents insist there is a clean safety record. That is simply wrong. With some agents, it could have catastrophic consequences,” says microbiologist Richard Ebright of Rutgers University in Piscataway, New Jersey, a critic of the biodefense expansion.

    Although other scientists and biosafety experts say the extensive breakdown in procedures at Texas A&M is probably exceptional, they too worry that many incidents are going unreported. Next week, a congressional committee will examine the recent accidents and the biodefense buildup.

    The scrutiny is sending tremors through university administrators and the microbiology community, which is struggling with how to both ensure safety and gain the public's trust. One idea under discussion is an anonymous national accident reporting system that would enable institutions to learn from one another's mistakes.

    Winning public confidence could determine whether several proposed labs, such as one being built in Boston, will be allowed to operate at biosafety level 4 (BSL-4), the highest level used to study the most dangerous pathogens. Community support will also likely play a role in which of five competing sites wins a planned $450 million BSL-4 national agro-biodefense lab funded by the Department of Homeland Security.

    CREDIT: CDC

    Some infectious disease experts worry that public hysteria fueled by watchdog groups over even relatively minor lab incidents will paradoxically make it harder to establish the atmosphere of trust that is essential to running a safe lab. “To ring all the bells and bring out the fire trucks is counter-productive,” says virologist Clarence J. Peters of the University of Texas Medical Branch (UTMB) in Galveston. But there is room for improvement, he adds: “One of the biggest problems is transparency. I think we're all going to have to get past that.”

    Into the hot zone

    To be sure, biosafety has come a long way in the past few decades. Before then, “there weren't a whole lot of rules, just a lot of common sense” about how to run an infectious disease lab, says virologist Charles Calisher of Colorado State University in Fort Collins, who says the biosafety officer's main message was: “Put that cigarette out; no more mouth pipetting.” Peters notes that there were thousands of lab-acquired infections before the 1970s, when labs began installing hoods, shields around centrifuges, and other safeguards. In 1984, the U.S. National Institutes of Health (NIH) in Bethesda, Maryland, and Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia, produced the first edition of a guidebook, called Biosafety in Microbiological and Biomedical Laboratories (BMBL), that pooled researchers' experiences and is now considered the Bible of safety.

    Oversight became stricter after 2001 when federal agencies beefed up a regulation, called the select-agent rule, for the handling of pathogens such as anthrax and the Ebola virus that are potential bioweapons. The rule requires that lab workers get a security clearance for working on the roughly 80 select agents and toxins; that select-agent labs be inspected and workers undergo training; and that lab exposures and losses of select agents be reported to CDC. About 14,000 people at 400 labs now have select-agent authorization.

    To date, the most serious biosafety breaches have occurred outside the United States, such as several SARS infections in Asia in 2003 and 2004, which killed one researcher and infected several people outside the lab, and the death of a Russian lab worker from Ebola in 2004. And some potential exposures—such as animal bites, needle sticks, and glove tears—are inevitable, U.S. biosafety experts say. One of the worst recent accidents occurred at the U.S. Army Medical Research Institute of Infectious Diseases in Fort Detrick, Maryland, where a worker was exposed to the Ebola virus but didn't become infected. Others (see table) involved shipments of pathogens labeled nonpathogenic that turned out to be virulent. That happened with tularemia at Boston University in 2004, where three workers were infected. The incident was reported to local authorities and made public only after delays, adding to criticism of the proposed Boston BSL-4 lab (Science, 28 January 2005, p. 501).

    The problems at Texas A&M, however, may be the most egregious to date. They first emerged in April when the school belatedly reported to CDC that in February 2006, a worker was infected with Brucella bacteria, a pathogen common in livestock that causes fever and fatigue in humans but is rarely fatal. This incident, like many others, was brought to light through public records requests by Edward Hammond of the Sunshine Project, a watchdog group in Austin, Texas. In June, after the Sunshine Project reported that three workers had tested positive for antibodies to the Q fever pathogen, CDC shut down all of Texas A&M's select-agent work. In an August investigation, CDC inspectors found a dozen serious violations, including unapproved experiments, lost samples, improper safety training, and lab workers without select-agent authorization (Science, 14 September, p. 1487).

    Some observers suggest the Q fever antibody tests were not a major issue; none of the workers became ill, and two were apparently exposed before they joined the lab. But the Brucella case, which happened when a worker leaned into an aerosol chamber to clean it, is a clear violation of safe practices: The chamber should have been decontaminated with gas first, says Jonathan Richmond, a consultant in Southport, North Carolina, who oversaw biosafety at CDC in the 1990s. It has added to speculation that more incidents aren't being reported. Hammond has used open-records requests to dig up examples of exposures, equipment failures, and other near-misses at various labs that weren't publicly disclosed. He says they suggest other significant mishaps are hidden.

    Researchers and biosecurity experts say serious infections would be difficult to hide from CDC. But some agree there is probably underreporting of mild infections and potential exposures. Workers who make a mistake are often embarrassed and may fear angering their supervisor, and institutions worry about the damage to their reputation, says Richmond. “It's been a problem for a long time,” he says. Supporting that suspicion, CDC, which has recorded about 20 accident reports a year since 2004, has received 32 reports since April 2007, possibly because of the publicity about Texas A&M, says a CDC spokesperson.

    Proliferation. Critics are worried about the potential for infections and escapes at biosafety level 4 (BSL-4) labs (five existing, at least six planned) and 84 existing and new BSL-3 biodefense labs, as compiled here by the Sunshine Project.

    SOURCE: THE SUNSHINE PROJECT, http://www.sunshine-project.org/ AS OF FEBRUARY 2006

    Although the multiple protocol violations at Texas A&M may be the exception, less extensive violations are not. A 2006 Department of Health and Human Services (HHS) Inspector General audit of security procedures found that 11 of 15 institutions had “serious weaknesses” such as unlocked doors and freezers and lax inventory records. Janet Shoemaker, public affairs director for the American Society for Microbiology in Washington, D.C., points out that schools have a strong incentive to adhere to the rules; since 2003, the HHS Inspector General has levied fines ranging from $12,000 to $150,000 on nine research institutions and companies for breaches such as unapproved select-agent shipments. Texas A&M is facing fines as high as $500,000 for each violation.

    No public menace

    One point of agreement among most scientists is that however scary these incidents sound—the mention of Ebola virus conjures the 1995 movie Outbreak, for example—the risk to the public is very low for most pathogens, for two reasons. First, there have been no known environmental escapes from BSL-4 labs since the early 1980s and only two workers are known to have become infected in BSL-4 labs, both outside the United States. Workers have many layers of protection, including positive-pressure “space suits,” and realize the hazards of working with pathogens studied in BSL-4 labs, for which, by definition, there are no treatments.

    Second, even if an agent studied in a BSL-4 lab did escape, most, with the exception of smallpox (which can only be studied at CDC), are not very transmissible. Anthrax doesn't spread person to person, for example. Ebola and other hemorrhagic fevers that have killed hundreds in Africa would likely never cause an outbreak in Western countries because hygiene and medical treatments are so much better, says Peters. (He also notes that many select agents, such as anthrax and Q fever, occur commonly in nature, so people can get infected without coming anywhere near a biodefense lab.)

    Some scientists and biosafety experts are more worried about risks at BSL-3 labs, because the standards at these labs are not as stringent. But even most of these pathogens—with the exception of SARS, avian influenza, and 1918 flu—are not very communicable, and in any case vaccines and other treatments are available. At most, says infectious disease modeler Ira Longini of the University of Washington, Seattle, “the result could be a handful of cases and maybe deaths.” Another exception is foot-and-mouth disease, which doesn't infect humans but is extremely contagious among animals; the escape in the United Kingdom, which has been tied to an outdated effluent treatment system, would be unlikely to occur at more modern facilities in the United States, Richmond says.

    Peters worries that the “hysteria and witch hunting” by people like Hammond of the Sunshine Project is compromising safety by making lab workers worry that reporting potential exposures will get them fired. “People can't be terrified to report,” agrees Jean Patterson of the Southwest Foundation for Biomedical Research in San Antonio, Texas, which runs a BSL-4 lab.

    Safety check

So how can biosafety be improved? One proposal is an anonymous, mandatory reporting system for all laboratory accidents. Such a system would enable labs to learn from one another's mistakes, much as aviation does through the accident data compiled by the National Transportation Safety Board, says Gigi Kwik Gronvall of the Center for Biosecurity of the University of Pittsburgh Medical Center in Baltimore, Maryland, who co-authored a paper describing this proposal earlier this year in Biosecurity and Bioterrorism. “Other industries have gone through this,” says Gronvall. The system would also capture lab exposures to pathogens not on the select-agent list, such as HIV and tuberculosis. Reporting these to NIH or CDC is not mandatory, Rutgers's Ebright notes.

    But some microbiologists caution that reportable incidents should be well-defined, lest the system become glutted with minor mishaps. (Peters cites UTMB's recent decision to release, at a community group's request, a list of its 17 near-misses in the past 5 years.) Also important, says biosafety consultant W. Emmett Barkley of Bethesda, Maryland, reports should include not just bare facts but analysis, as CDC now provides for selected lab accidents in its Morbidity and Mortality Weekly Report.

A more radical idea is to require that BSL-3 and BSL-4 labs be licensed by the federal government. All these labs, not just those working on select agents, would then be inspected and required to follow the same operating procedures. One supporter of this proposal, biosecurity expert Anthony Della-Porta of Geelong, Australia, says the problem now is that BMBL offers only general guidance. Others, such as Barkley, say institutions need flexibility, especially the many BSL-3 labs that don't do biodefense work.

    There's one fact that nobody disputes: The risk of accidents in biosafety labs goes up with the number of workers. For that reason, watchdog groups and even some biodefense researchers lament the lack of analysis on whether all of the six planned BSL-4 and two dozen new BSL-3 biodefense labs are actually necessary to protect the nation from bioterrorism (see map). Says Gronvall: “Is there too much [biodefense research]? Without seeing the plan of action, it's hard to say.”

  10. ECOLOGY

    Setting the Forest Alight

    1. Paul Webster*
    1. Paul Webster writes from Toronto, Canada.

    To validate satellite data for carbon-emissions modeling, researchers this summer torched a jack-pine forest in Canada and tried to ignite a stand of larch in Siberia

KODINSK, RUSSIA— In July, as temperatures soared during a heat wave in eastern Siberia, scores of large fires flared through the region's dense pine forests. For 500 kilometers along the Angara River northwest of Lake Baikal, thick smoke blanketed the wilderness. Officials with Russia's famous airborne forest firefighting service, Avialesookhrana, were tracking the wildfires at an airbase here in Kodinsk, a small city on the Angara. They were tense. To them it seemed bizarre that a team of international scientists had received permission to burn a patch of nearby forest. Even with every local helicopter and plane conscripted to serve their firefighting crews, millions of dollars' worth of timber was going up in smoke in wildfires. “It's not as though we don't have enough to worry about already,” mused Oleg Mityagin, the overtaxed local Avialesookhrana boss. “We're in no position to help them if they lose control.”

    Safe distance. Douglas McRae checks out a gap in a pine forest during an experimental burn in Ontario, Canada.

    CREDIT: PAUL WEBSTER

    Sixty kilometers to the west at the experimental site, a group of Russian, American, and Canadian researchers hoped to set a test fire that would thoroughly burn a hectare-sized patch of larch forest, Siberia's dominant conifer. Their aim was to quantify carbon emissions from fires in larch forests across Siberia, now inadequately documented, according to Douglas McRae, a forest-fire researcher with the Canadian Forest Service. McRae has been conducting experimental burns in Canada and Russia since 1999 as part of project FIRE BEAR (Fire Effects in the Boreal Eurasia Region), a research program aimed at studying forest-fire behavior, ecological effects, emissions, carbon cycling, and remote sensing.

    Conceived in 1997, FIRE BEAR brings researchers from the U.S. Department of Agriculture (USDA) Forest Service and the Canadian Forest Service together with colleagues at the Siberian branch of the Russian Academy of Sciences' (RAS's) V. N. Sukachev Institute in Krasnoyarsk. As the group's previous studies have shown, extreme forest fires are growing more frequent in Siberia. And some models predict that climate change will bring dramatic warming—and more forest destruction—in eastern Siberia and other northern regions. The experimental burn, the FIRE BEAR team hoped, would yield direct observations to buttress satellite data and fill gaps in the models.

    Flaming wilderness

The searing summer heat in Kodinsk presented a dilemma for the scientific team. “We want the larch to burn well in order to obtain good data,” McRae explained, “but we risk losing control if it burns a little too well.” In the days leading up to the experimental burn, bulldozers hacked firebreak lanes around the test patch, and researchers wired the forest floor with probes to gauge heat release, carbon emissions, and effects on vegetation and microbes. McRae had good reason to be anxious. In May, in similar weather, he and his FIRE BEAR colleagues conducted an experimental burn near Sault Ste. Marie, Canada, in which a fire set in a hectare-sized patch of bone-dry jack-pine forest fanned out of control. That experiment was meant to show how infrared technology can be used to estimate fuel consumption and carbon emissions during fires. McRae and his colleagues hoped it would help them gauge how Russian wildfires contribute to greenhouse gas emissions. (Russian security laws prevent infrared filming from the air.)

Only minutes before the scientists ignited the fire in Ontario, wind gusts unexpectedly blew through the treetops. After ignition, the entire test plot flared in an explosive burst that melted computerized monitoring equipment. The technicians escaped unharmed and salvaged much of the damaged but still-functioning gear, which belonged to Martin Wooster, a geographer at King's College London.

    Wooster believes that the amount of carbon emitted from wildfires every year is possibly half that released by fossil-fuel consumption. He has been traveling the world collecting data to confirm his theory. In the Canadian test, he had an opportunity to gather data at ground level and at 300 meters above the fire in a helicopter. Researchers will use the observations to test the accuracy of satellite data.

    While making an infrared film, Wooster watched the test fire jump across the firebreaks around the experimental site. Within a few hours, more than 1400 hectares of magnificent pine forests were ablaze. Water bombers, surveillance planes, and Wooster's rented helicopter scrambled to get the situation under control. Wooster came away with an impressive data haul that will help to validate the usefulness of infrared measurement, he said later. But Ontario forest officials were not pleased. “I strongly doubt they'll be quick to give permission for more such experimental fires in future,” Wooster said.

Foresters aren't the only ones to express doubts; Russian security officials have been wary, too. Thanks to an infusion of funding from the International Science and Technology Center in Moscow, which supports nonmilitary collaboration between Western scientists and those within the Russian weapons complex, FIRE BEAR has attracted former Soviet military experts in remote sensing. Other scientists have joined, including members of the Siberian RAS's Institute of Chemical Kinetics and Combustion in Novosibirsk, as well as U.S. researchers funded by NASA.

    Hot results. A sudden gust of wind sent flames temporarily out of control in a Canadian test area, but the fire produced terrific data.

    CREDIT: PAUL WEBSTER

    Some Russians have complained of being arrested and undergoing harrowingly long interviews, says Anatoly Sukhinin, a remote-sensing expert who joined FIRE BEAR after a career in the Soviet military. “I still spend a fair amount of my time explaining our work to the police,” complained Sukhinin, sitting in his laboratory in Krasnoyarsk, which NASA helped equip to receive and interpret Siberian fire data beamed from American and Russian satellites. “It doesn't help that we're doing these experiments in a region which was until recently secret and still remains heavily militarized.”

Despite the hassles, the partnership seems to be paying off. In recent years, says Amber Soja, a research scientist with the U.S. National Institute of Aerospace, currently resident in the Climate Dynamics branch of NASA's Langley Research Center in Hampton, Virginia, FIRE BEAR papers have widened knowledge of Siberian forest fires and their global atmospheric effects. In 1998, Brian Stocks of the Canadian Forest Service reported a positive correlation between climate-change impacts and an increase in the severity of Siberian fires. A 2004 paper by Soja, along with McRae, Sukhinin, and Susan Conard of the USDA Forest Service, concluded that disparities in the amount of carbon stored in different forest types, and in the severity of fires within them, can affect total direct carbon emissions by as much as 50%; that is why the team needs specific data on larch fires, which emit less carbon than pine. In extreme fire years, they found, total direct carbon emissions from wildfires can be 37% to 41% greater than in normal ones, because more severe fires consume more organic matter in the forest floor.

Last year, Soja, Stocks, and Sukhinin published a review of predictions of climate-induced boreal forest change. Four of seven models predict that warming in Siberia will be 40% greater than the global mean. Soja spent several weeks at the FIRE BEAR camp near Kodinsk last summer, living in a tent and subsisting largely on tinned fish and buckwheat cereal while comparing notes with her Canadian and Russian co-investigators in the run-up to the test burn. The predictions she co-reviewed, she says, are already coming true in Alaska, Canada, and Russia. In Siberia, 7 of the past 9 years have brought extreme fire seasons, she explains. Speaking from the camp, she said, “If you are looking for climate-change impacts on forests, this is the place to be.”

    On the day of the big test burn this summer in Kodinsk, however, all predictions went up in smoke. Minutes after local fire crews ignited the perimeter of the experimental larch site with benzene, dark clouds suddenly appeared and rain doused the flames. “You'd be surprised how often this sort of thing happens,” McRae said with a shrug. “That's what you get for playing with fire.” The researchers, who still need the larch data, are already planning to torch a forest in Siberia next summer.
