News this Week

Science  16 Mar 2007:
Vol. 315, Issue 5818, pp. 1476



    NASA Declares No Room for Antimatter Experiment

    Andrew Lawler

    The Alpha Magnetic Spectrometer (AMS) is a model of international cooperation, led by a dynamic Nobel Prize winner, and promises to do impressive science in space. But it may never get a chance to do its thing.

    The problem is that NASA has no room on its space shuttle to launch the $1.5 billion AMS mission, which is designed to search for antimatter from its perch on the international space station. “Every shuttle flight that I have has got to be used to finish the station,” NASA Administrator Michael Griffin told a Senate panel on 28 February.

    Not stationary.

    AMS needs another way to get to the international space station after NASA said that the shuttles are booked.


    Griffin's categorical statement could spell doom for the innovative experiment, which received a glowing review in December from an independent scientific review panel appointed by the mission's sponsor, the U.S. Department of Energy (DOE). The decision is sure to send ripples around the world, considering that 16 countries have contributed large sums of money to the effort. And it is one of the few significant scientific facilities planned for the space station.

    AMS is the brainchild of Samuel Ting, a physicist at the Massachusetts Institute of Technology in Cambridge and Nobel laureate. One of its major goals is to understand the uneven distribution of matter and antimatter in the universe by searching for antimatter. The experiment, nearing completion in Geneva, Switzerland, could also help search for dark matter and a new form of quark matter called strangelets.

    NASA and Ting announced the experiment with much fanfare in 1995, and the shuttle flew a small prototype in 1998. Although the loss of the Columbia orbiter put launch of the AMS on indefinite hold, Ting has continued work on the spacecraft, which should be ready to be shipped to Kennedy Space Center in Florida by 2008 after testing at Geneva's CERN and the European Space Agency's facility in Noordwijk, the Netherlands.

    NASA has spent $55 million to build the skeleton that will hold the device in the shuttle's cargo bay—the 6800-kg AMS would take up nearly half the bay—and later be attached to the long truss on the space station. Although DOE has contributed about $30 million, the vast bulk of AMS funding has come from international partners such as Italy and France, as well as the unlikely combination of Taiwan and China. “The AMS project is sure to be viewed as a model for international collaboration in science,” noted one reviewer in the DOE study chaired by Barry Barish, a physicist at the California Institute of Technology in Pasadena. That study “had only praise and some wonder” at Ting's ability to create such a far-reaching coalition.

    Barish last week called the NASA news “disappointing” and said it would be “a big blow for international collaborators.” He added that Ting has already been looking for other routes into space. One alternative is to launch the AMS on an expendable rocket with a robot that could guide it to the space station. The only realistic candidate, NASA officials say, is the Japanese H-2 transfer vehicle now under development. To alter both that vehicle and the AMS for such a mission, however, would cost between $254 million and $564 million, says Mark Sistilli, NASA AMS program manager.

    Another alternative would be to launch the AMS on an expendable rocket and leave it in orbit until a shuttle could pick it up. That option could cost $380 million to $400 million and would entail a complex docking maneuver. A final option, according to Sistilli, would be to turn the AMS into a free-flying spacecraft with its own radiators and solar panels. Such a conversion, however, could top $1 billion.

    DOE officials declined comment, and Ting was traveling in Asia and could not be reached. But Sistilli, who agrees that “the science is terrific and the international commitment is huge,” says that NASA will continue to fund its portion of the project and hope for a positive outcome. “We didn't want to outright kill it,” he says. “We don't really know how to handle the situation.”


    Budget Pressure Puts High-Profile Study in Doubt

    Eliot Marshall

    A budget crunch has delayed and could scuttle a major U.S. cancer-prevention trial set to begin in April. The $100 million experiment aims to compare a new drug, letrozole, to older pills for preventing breast cancer in women after menopause. Just 1 day after the trial won a high-level approval, John Niederhuber, director of the National Cancer Institute (NCI), flagged it for an intense review, to take place on 23 March. This private session will also look broadly at improving prevention trials.

    Tough choices.

    NCI Director John Niederhuber is grappling with a shrinking real budget.


    The reversal has upset the center that designed the trial, the National Surgical Adjuvant Breast and Bowel Project (NSABP) in Pittsburgh, Pennsylvania. NSABP may be best known for pioneering tamoxifen therapy and new methods of breast surgery. Oncologist D. Lawrence Wickerham, NSABP's associate chair, says, “We were ecstatic on 22 January” when NCI's executive committee endorsed the project, known as the STELLAR trial, after 18 months of reviews. Then on 23 January, “we were informed that Niederhuber had apparently unilaterally” placed it on hold, rejecting an 8-2 approval by his executive committee. Wickerham says it suggests that cancer prevention is being pushed into “second class.”

    Niederhuber told The Cancer Letter, which first reported this decision, that NCI programs were under “a great deal of stress” and that some NCI grantees had “strong feelings” that the STELLAR proposal “was not good science” and not a good use of funds. Niederhuber declined Science's request for an interview on grounds that it might affect the 23 March review. In a statement, NCI said the fresh look at STELLAR was “part of ongoing deliberations about difficult decisions regarding the best use of scarce resources that have resulted from 5 years of below-inflation appropriations.” NCI notes that the trial “would cost approximately $100 million, would involve about 13,000 women, and require at least 10 years before results would be available.”

    STELLAR asks a specific question: Does letrozole, a drug in the new aromatase inhibitor (AI) class, work better as a preventative for postmenopausal women at high risk for breast cancer than an older drug, raloxifene? (Other data already indicate that raloxifene is better than an earlier preventative, tamoxifen.) Based on cancer treatment results, many think that letrozole will have milder side effects and provide better protection. All these drugs are designed to blunt the effects of estrogen; tamoxifen and raloxifene block estrogen from stimulating tumor growth, whereas AI drugs stop the synthesis of estrogen.

    Paul Goss, director of breast cancer research at Harvard's Massachusetts General Hospital in Boston, says that AI drugs have had great success in treating cancer, which has raised hopes for prevention. Data consistently show that AI drugs reduce estrogen in postmenopausal women to a very low level, he says, and women who took AI drugs after cancer in one breast were far less likely to develop a new tumor in the other breast. (Rates were reduced by about 60% to 75%, compared to 40% for tamoxifen.)

    At the same time, Goss says, people are raising questions about the trial's value. He notes, for example, that two other big trials of AI drugs—one led by the National Cancer Institute of Canada's clinical trial group, which he chairs, and another funded by the charity Cancer Research UK—are already under way. Each uses an AI from a different company: Letrozole is made by Novartis, the Canadian trial is testing a Pfizer drug, and the U.K. trial is testing an AstraZeneca drug. The Canadian and U.K. trials compare women given the test drug to those in control groups given a dummy pill. These placebo-controlled trials can get by with relatively small enrollments (4000 to 6000). In contrast, STELLAR will need to enroll 13,000 to find subtle differences between two active drugs. This means STELLAR will cost more and deliver results long after the others. Goss says that STELLAR's head-to-head comparison would give a more definitive reading on each drug, but he questions whether the results will come soon enough to affect clinical practice.

    Niederhuber and others have mentioned another concern: Women may not be interested in STELLAR's results. For example, only a small fraction of those at risk for breast cancer have embraced tamoxifen as a preventative, despite its proven value. (The drug has serious side effects, including a risk of endometrial cancer and blood clots.) Whether AI drugs would be more popular is a guess. Cynthia Pearson, director of a cancer activist group, the National Women's Health Network in Washington, D.C., says she “would not be so sorry” if the STELLAR trial were set aside, because “I don't agree with the whole line … that breast cancer treatment drugs should be used in healthy women.”

    For these and other reasons, Niederhuber has put STELLAR on a list of projects that need to be reconsidered in light of NCI's tight 2007 budget. Many could be trimmed, he told NCI's Board of Scientific Advisors on 5 March. NCI is looking at reductions in tobacco-control research and seven intramural research projects, as well as potential 10% cuts in NCI's flagship comprehensive cancer centers, clinical trials, and the NCI director's budget.


    German Law Stirs Concern Illegal Artifacts Will Be Easier to Sell

    Andrew Curry*
    * Andrew Curry is a freelance writer in Berlin.
    Controversial law.

    New German legislation could encourage the trade of artifacts from illegal excavations that have potholed large areas of Iraq.


    Last week, the German Senate ratified the 1970 UNESCO Convention on Cultural Property, but archaeologists around the world fear that the long-delayed approval will do more harm than good. Many worry that Germany's interpretation of the convention will make the country a haven for illegally excavated antiquities from Iraq and elsewhere.

    The UNESCO convention has been a defining document in the global battle to protect artistic and especially archaeological heritage from theft, looting, and destruction. Yet governments can make their own decisions on how to implement it. Whereas the United States and many of the other 112 signatories to the convention restrict or prohibit trade in broad categories of artifacts, the German law passed last Friday requires countries to publish lists of specific items they consider valuable to their cultural heritage. Only those items will be protected under German law, which means trade in undocumented artifacts, such as those looted from archaeological sites, will be difficult to restrict. “This is a bad signal,” says Michael Mueller-Karpe, an archaeologist at the Roman-German Central Museum in Mainz. “It tells the world that whatever isn't published isn't worth protecting.”

    The idea of restricting specifically listed objects may make sense for museum collections but not for looted artifacts, say archaeologists. By the time they reach the market, such artifacts—Egyptian sculpture, Akkadian cuneiform tablets from Iraq, and Cambodian stone carvings, for instance—are typically stripped of the painstaking archaeological documentation and context that makes them scientifically valuable.

    Still, Germany's implementation of the convention is well within the treaty's original requirements. “According to UNESCO, stolen objects have to be from documented collections,” says Neil Brodie, research director of Cambridge University's Illicit Antiquities Research Centre. “There's no legal obligation for countries to treat illegally excavated objects as stolen.” Mueller-Karpe calls the convention the “Grave-robbing Law” because he feels it encourages such theft.

    Many countries have gone further than Germany in restricting the trade in illegally excavated artifacts. In the United States, for instance, dealers trading in certain categories of items are required to have export licenses from the country of origin or prove that the object has been out of the country of origin since before the agreement went into effect. “The important part is the difference between designated categories and a list of specific objects,” says Patty Gerstenblith, a professor at DePaul University College of Law in Chicago, Illinois. “A list simply doesn't work, because artifacts that are taken out of the ground are unknown.”

    Indeed, as more countries crack down on the trade of artifacts—the United Kingdom and Switzerland, long notorious as transit countries for illegal antiquities, ratified the UNESCO treaty in 2002 and 2003, respectively—German archaeologists fear that the country's loopholes could make it a destination where dealers turn stolen property into legal merchandise that can then be traded worldwide. Until now, objects with no proof of origin have been assumed stolen. But under the new law, if they're not listed, they can be presumed legal and potentially sold with Germany as their country of origin—making it easier to move them to the United States or elsewhere. “It's like an antiquities laundry,” says Mueller-Karpe.

    Eckhard Laufer, a police official and part of the German Task Force on Illegal Excavation, says the new law is a missed opportunity. “We'll have to wait and see, but I'm afraid it's totally inadequate,” Laufer says. “The new law won't make any improvement, and the situation can't get much worse than it is right now.”

    German coin collectors and art and antiquities dealers counter that the new law is too strong. “Germany always does things 150%. The more strict the laws are, the more objects are going to go to a gray market,” says Christoph von Mosch, a Munich art dealer with a degree in classical archaeology. Countries can now make claims on artifacts worth more than €1000 for up to a year after they are posted for sale, creating complications and paperwork that some dealers say puts them at a competitive disadvantage.

    That it has taken Germany 36 years to ratify the original UNESCO convention doesn't bode well for prompt action on the 1995 UNIDROIT Convention, a much more stringent agreement that characterizes illegal excavation as theft and requires the return of stolen objects and cultural property. So far, only a few dozen countries have signed. Along with Germany, Brodie says, “none of the major market or transit countries”—including the United States, the U.K., Switzerland, France, and Belgium—“have ratified it.”


    Is a Thinning Haze Unveiling the Real Global Warming?

    Richard A. Kerr

    The sunlight-reflecting haze that cools much of the planet seems to have thinned over the past decade or so, remote-sensing specialists report on page 1543. If real, the thinning would not explain away a century of global warming, experts say, but it might explain the unexpectedly strong global warming of late, the accelerating loss of glacial ice, and much of rising sea levels. However, many other researchers are highly suspicious of the data and frustrated by the lack of any quantitative measures of their reliability.

    Down for real?

    Satellite data show a thinning of global hazes (declining green line), but calibration questions cloud the issue.


    The observations come from Advanced Very High Resolution Radiometer (AVHRR) instruments flown aboard weather satellites. Designed to measure cloud cover for weather forecasters, they can also measure the much weaker sunlight reflected from the aerosol particles of haze. And unlike newer, more precise instruments, they have been measuring aerosols since 1981.

    Michael Mishchenko and his colleagues at NASA Goddard Institute for Space Studies (GISS) in New York City took advantage of AVHRR longevity to search for long-term trends in aerosols. Since the early 1990s, they say, the global aerosol layer has been thinning rather dramatically. “We can't claim it's 100% real,” says Mishchenko. AVHRR is “not a very good instrument. It's just a weather satellite.” But the data check out when compared with some ground-based observations and are broadly consistent with certain other satellite data, the authors write.

    If aerosols are really thinning that much, substantially more sunlight has been escaping reflection back into space and warming the planet. That extra energy, rather than an unrecognized quirk in the climate system, would explain the greater-than-forecasted warming of the 1990s and early 2000s that another team noted last month. The extra warming might in turn explain the accelerated loss of sea ice from the Arctic Ocean and from the great ice sheets, which feed rising seas.

    First things first, however, say critics. “What [Mishchenko and colleagues] are trying to do is admirable,” says Sarah Doherty of the University of Washington, Seattle, who has studied the calibration of AVHRR instruments. But “there's just too much uncertainty.”

    The problem, Doherty says, lies in part in stringing together records from five different instruments flown on five different satellites over the years. At times, the next instrument was not launched before its predecessor failed, preventing a precise calibration. Mishchenko and colleagues “need to say how well they know the uncertainties,” she says. Without quantitative estimates of uncertainty, she can't tell whether the trend is real.


    Report Backs More Projects to Sequester CO2 From Coal

    Eli Kintisch

    A new academic study of capturing and storing carbon emissions from coal burning—the 800-pound gorilla in the climate policy debate—says that billions of dollars in demonstration projects are needed to help put the ape in a cage.

    Worldwide, the 5.4 billion tons of coal burned each year generate roughly a third of the world's carbon dioxide emissions. But coal's low cost compared to other energy sources makes it irresistible to nations with plentiful deposits. China, for example, puts two new coal-fired generating plants online every week. This week, scientists at the Massachusetts Institute of Technology (MIT) in Cambridge, led by physicist Ernest Moniz and chemist John Deutch, propose policy and research to help governments achieve big cuts by capturing and burying the CO2. The study describes a number of daunting technical hurdles and warns against a “rushed attempt” to deploy the two leading technological fixes before the science is mature.

    That cautionary note has sparked criticism from more bullish experts. “We know enough technologically to do it today,” says mechanical engineer George Peridas of the Natural Resources Defense Council in Washington, D.C., which wants new plants to be forced to include technology to capture carbon emissions. “From a climate perspective, the risk [of waiting] is huge.”

    Using a computer model, MIT researchers examined how charging utilities a global price for emitted carbon dioxide (either $7 or $25 per ton) might affect coal consumption by 2050. The scenarios suggest that the policies “will limit” the expected growth in the use of coal but not bring it below current levels. Improvements in the process of capturing emitted carbon and sequestering it underground will therefore be “critical,” the report's authors say. The study calls for the U.S. Department of Energy (DOE) to continue funding capture technologies for the two main ways of burning coal: pulverized coal (PC) and integrated gasification combined cycle (IGCC). PC plants grab CO2 just before emissions travel to the smokestack; IGCC plants remove the gas after the coal is gasified but before it is burned.

    Drill squad.

    A new report recommends scaling up work on carbon sequestration, such as this study of saline formations by DOE geochemists.


    The report calls on the U.S. government to spend up to half a billion dollars a year to subsidize demonstration projects run by partnerships with the private sector. It says that FutureGen, the current DOE effort to demonstrate a carbon-capturing IGCC facility by 2012, lacks “clarity of purpose.” It also faults DOE's assessment of U.S. geologic sequestration sites as “not uniform”—an atlas due out in May omits detail on coal-rich regions in Wyoming, for example—preferring a national map prepared by the U.S. Geological Survey.

    Experts praise the report's support for demonstration projects but criticize its technology-neutral stance on the competing technologies. Joseph Chaisson of the Clean Air Task Force in Boston says the report uses “out-of-date” data that blunt the comparative advantage for IGCC, adding that utilities are actively exploring new industrial gasifiers. Geologist Susan Hovorka of the University of Texas, Austin, questions the report's emphasis on giant injection sites as test beds, saying that ongoing “small tests” can give important clues in tracking CO2 behavior.

    Moniz says the group accounted for recent industry progress, but that there are too many unknowns to favor one technology. A plant built now with one capture technology would be hard to retrofit for a different one. “It's not as simple as just dropping in” a sequestration module, he says. Meanwhile, some companies are moving forward. Last week, a group of investors and Dallas-based TXU announced plans to build two IGCC demonstration plants.


    Asking for the Moon

    Andrew Lawler*
    * With reporting by Pallava Bagla

    Thanks to several upcoming robotic missions, lunar science is poised for its biggest boost in a generation. But NASA managers have made it clear that research will be the tail on the exploration dog.

    TEMPE, ARIZONA—Fashion isn't restricted to Paris runways. A decade ago, space scientists became enamored with the possibility of past life on Mars. More recently, moons such as Europa, Titan, and Enceladus captured the imagination of researchers. Soon, Earth's only satellite will get a chance to strut her stuff again, after being out of style for more than 3 decades. Four countries—Japan, India, China, and the United States—are preparing to launch robotic lunar probes in the next 18 months. China is planning a human mission, and NASA is pushing ahead with plans for a human outpost by the end of the next decade based on a 2004 vision laid down by President George W. Bush.

    With the moon back in the footlights, the question for U.S. scientists is whether lunar science can sustain funding for a long-term research program. Science has always played second fiddle to engineering human flight at NASA, and the new exploration program is no exception. As NASA Administrator Michael Griffin bluntly told the 250 scientists who gathered here last week at the request of NASA's Advisory Council, a return to the moon “is not all about you.” If scientists want a dedicated human research sortie, he added, they'll need to find the $2 billion or so it would cost.

    That message, along with NASA's recent decision to shelve a series of lunar robotic missions, stunned some participants. “The rather pessimistic view of lunar science outlined by Mike Griffin,” says Brown University geologist Carle Pieters, left her “depressed and discouraged.” Yet she and other scientists say they want to be involved in lunar planning. A weeklong session generated a long list of intriguing projects to pursue, along with advance word from a National Research Council (NRC) panel now studying lunar science that its report would urge NASA to ramp up funding for such research. “We don't want to preclude what could be a fascinating scientific opportunity,” says Neil Tyson, an astronomer at the American Museum of Natural History in New York City. “The ship is leaving the dock, and the question is whether we'll be on it.”

    Back to the future

    The gathering in Tempe hearkened back to a 1965 meeting on Massachusetts's Cape Cod that gave researchers an opportunity to inject scientific research into the Apollo program. Harrison Schmitt, a geologist who went on to become the first and only scientist to visit the moon and now chairs the agency's advisory council, was so impressed by the meeting that he asked NASA to repeat it. Schmitt says he overcame NASA's initial resistance by arguing that it needed a clear set of scientific priorities.

    Lunar astronomy.

    Scientists have proposed a radio array on the moon's “quiet” far side.

    Old digs.

    Geologist Harrison Schmitt on the lunar surface during the last Apollo mission in December 1972.


    The early days of lunar science benefited greatly from the Cold War race to the moon. The United States and the Soviet Union sent more than 60 robotic missions—crash landers, soft landers, orbiters, sample returns—between 1958 and 1976. And that 18-year tally doesn't count the nine piloted Apollo flights that circled or landed on the lunar surface. By contrast, only four missions have visited the moon in the last 31 years.

    Scientists still know remarkably little about Earth's satellite. Pressing scientific questions include why the moon's magnetic field appears to have shut off, how dust and plasma interact near the surface, and the nature of hydrogen deposits at the poles. The Apollo soil samples are insufficiently diverse to answer fundamental geological questions because they were drawn largely from the maria in the mid-latitudes of the moon's near side. The solar system's largest hole—the Aitken Basin near the south pole—has yet to be explored, and Mars has been mapped more accurately than the moon's pockmarked surface, which contains clues to the extent and timing of the heavy bombardment that shaped the early solar system. Like Greenland's ice cap, the moon's undisturbed layers preserve a long history—for example, a concise record of the sun's radiance over billions of years.

    Scientists soon will have a shot at answering these and other questions. This year, Japan will launch a 3-ton, 14-sensor probe called Selene. China is completing work on Chang'e 1, which will examine the lunar crust and temperature and the space environment between Earth and the moon. Next year, India plans to send Chandrayaan-1, with a NASA-funded instrument on board, around the moon, followed by a more ambitious sample-return mission in 2010. “Chandrayaan-2 will have a lander that will touch down on the lunar surface and pick up samples,” says G. Madhavan Nair, chair of the Indian Space Research Organization. And German officials recently said they are considering building a lunar probe outside the umbrella of the European Space Agency, which launched a 2003 moon orbiter but has no plans for further flights.

    Meanwhile, NASA is readying the Lunar Reconnaissance Orbiter (LRO) for a late 2008 launch. It's designed to provide detailed maps of the moon to assist in planning for human missions. That mission will last a year, after which NASA scientists will take over its operation.

    NASA plans to boost its lunar funding from $27 million in 2008 to $97 million in 2011. That pot will help cover the cost of operating LRO as well as paying for small instruments to go aboard foreign probes such as Chandrayaan-2. In addition, NASA officials promise to make more money available for scientists to analyze data from both U.S. and foreign spacecraft. “There is a real richness of data” headed to Earth, says Pieters, who is co-chair of the NRC panel. She says the panel's report will urge greater international cooperation on lunar robotic efforts.

    Visioning science

    LRO's central mission, however, is not science. The spacecraft is the first step in NASA's march to send humans back to the moon by 2020. The agency's exploration agenda begins with finishing the space station and retiring the space shuttle in 2010, followed by the launch of a large new rocket in 2015. Aside from LRO, there's no room for research during the first decade of the exploration effort; NASA just put on hold a series of orbiters and landers after LRO that could provide exploration—and science—data prior to the arrival of humans.


    Scott Horowitz, NASA's exploration chief, says that those robotic missions would be nice to do—if the agency had the money. All he really needs, he told the scientists, is “a damn good map,” which LRO will provide. He made it clear his interest is not in blue-sky research. “We don't have to get rocks back.”

    And the role of science even once humans arrive remains tenuous. In December, NASA decided to build an outpost rather than send a series of missions to several locations. That disappointed scientists hoping to collect a variety of lunar samples—an important goal highlighted in the NRC interim report—and build a distributed seismic network. Griffin, however, says the base gives potential foreign partners the chance to contribute in a manner similar to their involvement with the space station. It also creates opportunities for space tourism and the possibility of exploiting potential resources such as water ice and minerals. “We're not going back to the moon and on to Mars solely for science,” Griffin reminded the Tempe audience.

    The base, tentatively planned for the rim of the south pole's Shackleton Crater, would initially be home to a crew of four staying for 1 to 2 weeks, says Laurie Leshin, science and exploration chief at NASA's Goddard Space Flight Center in Greenbelt, Maryland. Astronauts could travel a few kilometers around the landing site and collect up to 100 kilograms of lunar samples. Griffin envisions the base as analogous to the U.S. National Science Foundation's McMurdo Station in Antarctica, which serves as a logistics and transportation hub for research across the continent.

    But Leshin acknowledged that “science might be deferred” in the early stages of exploration. And in Tempe, Griffin stopped short of saying that there will be a permanent human presence at the outpost. “Our return to the moon is as a training ground, a step along the path to Mars,” he said.

    Although Griffin and Horowitz downplayed the role of research, scientists used the meeting to generate a host of ideas for projects that could be conducted from the lunar outpost, including the study of nearby regolith, volatiles, impact craters, the solar wind, and a low-frequency radio observatory on the far side, in the quiet zone protected from noisy Earth. “This is the most exciting experiment which could be done from the surface of the moon,” says Mario Livio, an astrophysicist at the Space Telescope Science Institute in Baltimore, Maryland.

    But the researchers also concluded that the highest-priority lunar science missions could be done better, faster, and more cheaply using robots. And they agreed that, with the exception of the radio observatory, the moon is a poor place to conduct astronomy or Earth sciences.

    Even so, the results of NASA's multibillion-dollar vision could trickle down to science. The new heavy-lift launcher could orbit space telescopes with mirrors 10 meters across when it isn't ferrying humans to the moon. And new spacecraft systems could lay the groundwork for a new generation of sophisticated planetary probes. “Science is not a priority in the vision, but unless scientists voice what their needs are, nothing is going to happen,” says Livio. “We'd like to ensure they include the capabilities which could be used for space science.”

    Dual mission.

    The 2008 Lunar Reconnaissance Orbiter will be turned over to scientists after scoping out possible landing sites.


    And there is still the possibility that robotic missions could prove critical for exploration after LRO. Some NASA officials predict that the agency will need to study the effects of moon dust and radiation on equipment and astronauts, as well as to follow up LRO observations on potential water at the lunar poles. And NASA's Ames Research Center in Mountain View, California, is proposing a set of $100 million missions that could deliver 50 kg of payload to orbit or 10 kg to the surface. Paying for them is another matter, however, as the budgets of both the exploration and science offices are being squeezed.

    NASA's science office is already considering some lunar projects as part of its regular mission competitions. Geophysicist Maria Zuber of the Massachusetts Institute of Technology in Cambridge has proposed examining the lunar gravitational field and the moon's interior structure using an orbiter. And planetary scientists agreed in a 2003 NRC decadal survey of their discipline that a robotic mission to the Aitken Basin is a high priority; NASA may consider such a mission in the near future.

    Some researchers worry that lunar science is a passing fad that may not last into the next administration. “A lot of scientists I know are staying away from this” because they expect the vision to collapse once Bush leaves office in 2009, says Lucy Fortson, an astronomer at the University of Chicago in Illinois.

    However, those who have been waiting patiently for more than 3 decades welcome the resurgence of interest in lunar research and don't mind that science isn't in the driver's seat. “The moon is now front and center, and the lunar community is quite comfortable with the fact that science is not the preeminent activity,” says astronomer Wendell Mendell of NASA's Johnson Space Center in Houston. “We've been waiting for so long, it's good to have anything.”


    Taking a Stern Look at NASA Science

    1. Andrew Lawler

    Finding room for lunar research in NASA's $5.4 billion science budget is one of many challenges facing Alan Stern, who next month takes over the troubled program. He's a planetary scientist from the Southwest Research Institute in Boulder, Colorado, and a one-time astronaut candidate. Last week, during a meeting with the National Academies' Space Studies Board, Stern pledged to wring more science out of a flat budget and find ways to ease controversial cuts to university grants.

    Stern's portfolio includes nearly 100 projects in space or being readied for launch. But several are mired in cost overruns. Two years ago, the price tag for the biggest item, the James Webb Space Telescope, shot up more than $1 billion to $4.5 billion, although its costs now seem under control. More recently, NASA was forced to budget 10% more for the $250 million Orbiting Carbon Observatory, slated for launch next year to collect precise measurements of carbon dioxide in Earth's atmosphere, and the $1.69 billion Mars Science Laboratory rover, which will leave Earth in 2009.

    Not surprisingly, those larger mortgages are squeezing everything else. Some disciplines, such as astrobiology, face dramatic cuts (Science, 19 January, p. 318), and NASA has no plans for major earth science missions in the next decade. To make matters worse, last year, NASA chief Michael Griffin froze the science budget after ordering $3.1 billion in cuts to the 5-year plan for science to cover shortfalls in the space shuttle and space station programs.

    Stern hopes to ease the crisis through “innovative thinking” rather than any additional funding. “We're living in a zero-sum game,” he told the academy panel. He plans to save some money by cutting back on what he deems the agency's excessive oversight of small satellite projects. “We've shot ourselves in the foot,” he says about well-meaning attempts to avoid failure that have substantially added to costs.

    Sober forecast.

    Alan Stern warns scientists they're playing a zero-sum game.


    A recent academy report calling for a new flotilla of earth science platforms is out of step with the available resources, he says. He's even willing to consider killing a space mission if it's the only way to preserve a robust budget for research and analysis, which is mostly conducted by academic scientists.

    Stern warns that long-running missions may need to be turned off to make room for new projects. “There are going to be things that I do that cause pain,” he says. He's also decided to recuse himself from a mission competition to examine Mars's atmosphere, in which he had hoped to play a lead role. At the same time, he plans to remain principal investigator of the Pluto mission that just passed Jupiter.


    U.S. Math Tests Don't Line Up

    1. Jeffrey Mervis

    The latest national assessment of high school achievement can't be compared with previous ones, the government says. Does no trend mean no progress?

    For more than 3 decades, the National Assessment of Educational Progress (NAEP) has monitored how much U.S. students know in a variety of subjects. The recent trends for 12th-grade mathematics are disturbing: Scores haven't improved, and U.S. students rank near the bottom on international comparisons. So math experts around the country eagerly awaited the latest trend data, based on a test taken by 9000 high school seniors in 2005. The government's answer last month turned out to be a surprising “we don't know.”

    Race to the bottom.

    The last three NAEP tests show that a shrinking share of high school seniors have even basic math skills—and the racial gap persists.


    Officials at the Department of Education's National Center for Education Statistics (NCES), which runs the program, say that the 2005 scores cannot be compared to the 1996 and 2000 assessments. The 2005 test contained more algebra and less numeracy, it used a different format, and students were allowed to bring their own calculators rather than use ones provided at the test site. “We wanted to offer some sort of comparison,” says Peggy Carr, NCES's associate director. “After all, NAEP is about educational progress, and for that you need trends. But we decided in the end that there were too many changes.”

    The new test was intended to be more rigorous than previous versions, explains Mary Crovo of the National Assessment Governing Board (NAGB), which sets policies for NAEP. But after the board approved the changes in content, she says, testing experts advised that the program had also lost the ability to draw any comparisons with the 2000 test. Psychometricians say that the gold standard would have been a bridging study: having one set of students take the 2000 test and a matched sample take the 2005 test, both under the 2000 rules. Any scoring difference could then reliably be attributed to a student's knowledge of mathematics. No bridging study was done, although Carr and Crovo disagree on the reasons. “It was a funding decision by NCES,” says Crovo. Carr says, however, “we initially thought that we should do one, but NAGB said it wouldn't be appropriate because [the 2005 test] used a new framework.”

    An outside study funded by the department did, however, find some basis for comparison. After analyzing answers to the 60% to 65% of the questions on the two tests that were identical, researchers at the Human Resources Research Organization (HumRRO) in Alexandria, Virginia, found evidence that there were “probable gains in 12th-grade mathematics between 2000 and 2005.” Although the report, which has been posted online, is studded with caveats, it finds that the calculator policy had “minimal effect, if any,” and that the new format may actually disguise a larger real gain.

    Whether the 2005 NAEP scores can be compared with those of earlier tests is more than a simple disagreement among psychometricians. Although the NAEP is not part of the state-by-state assessment of student achievement mandated under the federal No Child Left Behind Act, the test matters. Proponents of national standards see NAEP as a promising way to achieve their goal in the face of the famously decentralized U.S. educational system (Science, 2 February, p. 595). Even the Bush Administration, which cherishes the principle of local control, has dubbed NAEP “the nation's report card” to emphasize its importance.

    The comparability of the two tests also has bearing on efforts to erase the sizable achievement gap between white and Asian students, on the one hand, and their African-American and Hispanic peers on the other. Using a three-point scale—basic, proficient, and advanced—to measure achievement, the 2005 NAEP test found that a staggering 39% of U.S. high school seniors lack even a basic understanding of high school mathematics. That's up from 35% in the 2000 test and 31% in 1996. Using that same scale, the large achievement gap by race and ethnicity has persisted. The 2005 test reports that some 70% of blacks and 60% of Hispanics fell below that minimal cutoff, compared with 30% for whites and 27% for Asian-Americans. In a depressing spiral to the bottom, the percentage of students from each racial and ethnic group falling below “basic” has increased from 1996 to 2005.

    For many math educators, what's most depressing is that changes in NAEP results, if any, are minimal. The HumRRO analysis estimates an increase of three to five points on a scale of 300, a bump-up consistent with the recent pattern in math scores for elementary and middle school students. “A three-point gain seems about right to me,” agrees Tom Loveless, director of the Brown Center for Education Policy at the Brookings Institution in Washington, D.C., and co-author of a 2006 study comparing student achievement on NAEP and state assessments.

    “Even if you could compare the two tests, there are clearly no major improvements” in the mathematical abilities of U.S. high schoolers, says William Schmidt, U.S. coordinator for the 1995 Third International Mathematics and Science Study, which revealed a growing achievement gap between U.S. students and the rest of the world as they move through the educational system. “I'm not surprised,” says Schmidt, a math educator at Michigan State University in East Lansing. “The country hasn't made a commitment to the sort of rigorous and demanding curriculum that is needed to raise achievement.” Until that happens, he says, nothing will really change. And once it does, the results should be obvious to everyone.


    Ocean Study Yields a Tidal Wave of Microbial DNA

    1. John Bohannon

    Data glut or unprecedented science? A global hunt for marine microbial diversity turns up a vast, underexplored world of genes, proteins, and “species”

    Floating laboratory.

    Sorcerer II, a private yacht outfitted to collect and freeze microbial samples, netted a huge bounty of DNA sequence.


    After relishing the role of David to the Human Genome Project's Goliath, J. Craig Venter is now positioning himself as a Charles Darwin of the 21st century. Darwin's voyage aboard the H.M.S. Beagle 170 years ago to the Galápagos Islands netted a plethora of observations—the bedrock for his theory of evolution. Four years ago, Venter set sail for the same islands and returned 9 months later with his own cache of data—billions of bases of DNA sequence from the ocean's microbial communities. But whether that trip will prove anything more than a fishing expedition remains to be seen.

    Microbial explorers.

    J. Craig Venter (left) and Anthony Knap of the Bermuda Biological Station for Research aboard Venter's yacht.


    On 13 March, Venter, head of the J. Craig Venter Institute in Rockville, Maryland, and a bevy of co-authors rolled out 7.7 million snippets of sequence, dubbed the Global Ocean Sampling, in a trio of online papers in PLoS Biology. As a first stab at mining these data, which have just become publicly available to other scientists, Venter's team has found evidence of so many new microbial species that the researchers want to redraw the tree of microbial life. They have also translated the sequences into hypothetical proteins and made some educated guesses about their possible functions.

    Some scientists are wowed by the effort. Others worry that researchers will not be able to make sense of all this information. The diversity of microbes uncovered is “overwhelming, … tantamount to trying to understand the plot of a full-length motion picture after looking at a single frame of the movie,” says Mitch Sogin, a molecular evolutionary biologist at the Marine Biological Laboratory in Woods Hole, Massachusetts. And Venter doesn't necessarily disagree. In 2004, as the data were first rolling in, Venter confidently predicted that his salty DNA survey would “provide a different view of evolution.” To make that happen, however, he now says, “we need even more data.”

    The big trawl

    This is the second time that the American millionaire genome sequencer has returned to port laden with DNA. Venter's 2004 study of microbes living in the Sargasso Sea was easily the largest DNA sequencing of environmental samples ever accomplished (Science, 2 April 2004, p. 66). This time around, he sailed from Halifax, Canada, through the Panama Canal and finished up 6500 kilometers southwest of the Galápagos. The funding for the $10 million project came from the Gordon and Betty Moore Foundation, the U.S. Department of Energy, and Venter's nonprofit foundation. The research vessel, the Sorcerer II, is Venter's private yacht tricked out as a floating laboratory.

    The researchers sampled at 41 locations, isolating and subsequently freezing bacterium-sized cells. They also recorded the temperature, salinity, pH, oxygen concentration, and depth.

    Back at Venter's institute, technicians extracted and sequenced the DNA. Using a whole-genome shotgun approach, they shattered all the DNA in a sample into fragments of specific sizes, sequenced each one, and then assembled these sequences together by matching the ends of the DNA with a powerful overlap-hunting computer program. In principle, this approach allows the reconstruction of entire genomes of the different organisms in a sample.
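    The overlap-matching step at the heart of shotgun assembly can be illustrated with a toy greedy assembler. This is a simplified sketch for intuition only, not the institute's actual pipeline; the function names and the minimum-overlap threshold are invented for the example, and real assemblers use far more sophisticated overlap detection and error handling.

    ```python
    def overlap(a, b, min_len=3):
        """Length of the longest suffix of a that matches a prefix of b (at least min_len)."""
        start = 0
        while True:
            start = a.find(b[:min_len], start)  # next place a suffix could begin
            if start == -1:
                return 0
            if b.startswith(a[start:]):         # suffix of a == prefix of b
                return len(a) - start
            start += 1

    def greedy_assemble(fragments, min_len=3):
        """Repeatedly merge the pair of fragments with the largest suffix-prefix overlap."""
        frags = list(fragments)
        while len(frags) > 1:
            best = (0, None, None)
            for i, a in enumerate(frags):
                for j, b in enumerate(frags):
                    if i != j:
                        olen = overlap(a, b, min_len)
                        if olen > best[0]:
                            best = (olen, i, j)
            olen, i, j = best
            if olen == 0:
                # No pair overlaps: the fragments cannot be linked further,
                # mirroring what happened with half of the ocean-survey reads.
                break
            merged = frags[i] + frags[j][olen:]
            frags = [f for k, f in enumerate(frags) if k not in (i, j)] + [merged]
        return frags

    # Three reads from one short sequence reassemble into the original...
    print(greedy_assemble(["ATGGCG", "GCGTGC", "TGCA"]))  # ['ATGGCGTGCA']
    # ...but reads from unrelated genomes stay unlinked.
    print(greedy_assemble(["AAAA", "CCCC"]))
    ```

    The second call shows the article's central difficulty in miniature: when a sample mixes too many distinct genomes, most fragment pairs share no overlap, and the assembly stalls as a pile of disconnected pieces rather than converging on whole genomes.
    
    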

    Three years and 6.3 billion bases of DNA sequence later, at least one thing is clear: The DNA in a typical community of marine microbes is so diverse that nothing close to a whole genome can be assembled, even with all the sequencing that Venter has mustered. Half of his 7.7 million DNA sequence fragments are so different that they could not be linked at all.

    Nonetheless, the researchers could estimate the number of species in the samples based on slowly evolving marker genes. Judging by these glimpses of genomes, Venter's team identified more than 400 microbial species new to science, and more than 100 of those are sufficiently different to define new taxonomic families, they report. “This is a great milestone event” for environmental microbiology, says Dawn Field, a molecular evolutionary biologist at the Centre for Ecology and Hydrology in Oxford, U.K., who predicts that “these papers will become among the most highly cited of all time in biology.”

    Diversity deep end

    The fact that Venter's brute-force sequencing approach fell short of capturing whole genomes shows that scientists are far from a full accounting of the species packed in a drop of seawater, says David Scanlan, a marine microbiologist at the University of Warwick, U.K. And this “astounding” genetic diversity points to what Scanlan and others call the “paradox of the plankton.”

    Traditional ecological theory predicts that when multiple species compete for the same resources—in the case of ocean microbes, light and dissolved nutrients—then one, or a few, species should eventually outcompete the rest. If that were the case, then many of the sequences plucked from the waters by Venter's crew should map down onto a few dominant genomes.

    But rather than a sharp portrait of a few different microbes, the data create a pointillist painting of a countless mob. The vast majority of the microbes that found themselves snared in Venter's filters were genetically unique, says Scanlan: “It's a clear message that there's a tremendous gene pool in the ocean.”

    The diversity itself could be the solution to the paradox, according to Douglas Rusch, a computational biologist at the Venter Institute, and his colleagues. The staggering variety of genes may endow each species with sufficiently different metabolic tool kits to take advantage of slightly different combinations of resources, including the waste products of others, such that they can all coexist.

    The newly detailed diversity also suggests that microbial taxonomy needs a major overhaul, says Ian Joint, a marine microbiologist at the Plymouth Marine Laboratory in the U.K. The current taxonomy carves up microbes into different “ribotypes” by comparing the sequence of the highly conserved genes of the protein-synthesizing ribosome. Because so much diversity remains within the DNA even after microbes are divided into ribotypes, Venter's team proposes to throw out ribotyping altogether. Instead, they are defining groups of microbes based on the environment in which they were collected and how well their DNA matches a reference set of fully sequenced marine microbial genomes. Doing so has allowed Venter's team to group sequence fragments into different “subtypes.” Venter's team says that each of these subtypes represents a “distinct, closely related population” of microbes that fill a particular niche in their local environment. However, many more marine microbial genomes must be sequenced to make this scheme work, says Joint.

    Marine data-mining

    The samples brought to port by Sorcerer II do more than shake up microbial taxonomy. Based on their best guess as to the beginning and end of each gene teased out from the DNA sequences, Venter Institute computational biologist Shibu Yooseph and his colleagues have concluded that the DNA encodes 6.12 million hypothetical proteins. That finding almost doubles the number of known proteins in a single stroke. It also shows that the end of protein diversity is not in sight, says David O'Connor, a molecular biologist at the University of Southampton, U.K. Most of the predicted proteins are of unknown function, and a quarter of them have no similarity to any known proteins. Venter expects that some of these can be exploited to develop new synthetic materials, clean up pollution, or bioengineer fuel production.

    Taking stock.

    Sorcerer II collected bacteria at dozens of sites in the Atlantic and Pacific, particularly around the Galápagos Islands (inset).


    But the hypothetical proteins are already offering a new view of basic microbial biology. A team led by Venter and Gerard Manning, a computational biologist at the Salk Institute for Biological Studies in San Diego, California, says that the current picture of the proteins responsible for coordinating marine microbes' gene expression and metabolism is off the mark. By comparing predicted amino acid sequences with those of known proteins, they found a surprising abundance of signaling proteins thought to be used only by multicellular organisms. Among the hypothetical proteins from their marine samples, the researchers found 28,000 of the so-called eukaryotic protein kinases, as well as another 19,000 of a group that are highly similar to these kinases—triple the number previously known.

    These analyses of Venter's metagenomic data hint at the work that lies ahead for protein researchers. “Claims by some biologists that complete catalogs of the protein universe would be attainable within a decade now look naïve,” O'Connor points out.

    Thus to some, the data produced by Venter's voyage are an exciting starting point for protein, gene, and microbe discovery. It's something “people will be working on for quite some time,” says Howard Ochman, a molecular evolutionary biologist at the University of Arizona in Tucson. But for others, the value of this tidal wave of data is uncertain. James Prosser, a molecular biologist at the University of Aberdeen, U.K., worries that adding all of this sequence to the existing gene and protein databases could “swamp” the system, cluttering the results of searches for well-characterized genes.

    To help researchers deal with not just Venter's 100 gigabytes of sequence data but also other relevant information about a microbe's environment and location, Venter's team and Larry Smarr, a computer scientist at the California Institute for Telecommunications and Information Technology in San Diego, have built a metagenomics version of GenBank, the online genetic database curated by the National Center for Biotechnology Information in Bethesda, Maryland. In addition to doing the typical gene searches and genome comparisons, the new system, known as the Community Cyberinfrastructure for Advanced Marine Microbial Ecology Research and Analysis (CAMERA), can hunt for correlations between DNA sequence and environment for clues about co-occurring microbes. So far, however, CAMERA has only a few active users.

    A more serious drawback of Venter's study, says Prosser, is that the samplings do not appear to have been carried out with any specific scientific hypotheses or aims in mind. The cynical view is that these are little more than “fishing trips,” he says. “There would be greater potential for scientific advances if more focused, better designed studies were carried out.”

    Will the voyage of the Sorcerer II live up to Venter's hopes? It took Darwin 25 years after returning from his expedition to publish his theory of evolution. With the three papers online this week, Venter, at least, has hopped on the fast track. But in terms of synthesizing the big picture of marine microbiology, he and his colleagues are still out to sea.


    Biofuel Researchers Prepare to Reap a New Harvest

    1. Robert F. Service

    After decades in the background, technology for converting agricultural wastes into liquid fuels is now poised to enter the market

    Tall order.

    Starches and sugars in corn kernels readily ferment into alcohol; corn stalks are more challenging.


    When U.S. President George W. Bush announced an initiative in January to reduce U.S. gasoline use by 20% in 10 years, critics could be forgiven for thinking it sounded familiar. Presidents since Jimmy Carter have called for reducing U.S. dependence on foreign oil. But so far there's been little to show for it. Shale oil, electric cars, and hydrogen fuel cells have all at one time or another had their 15 minutes of fame. But all have failed to make a dent in U.S. gasoline use.

    Today, biofuels are the alternatives du jour, with ethanol chief among them. And in the United States, that currently means corn ethanol. But the big hope for the field is a technology called “cellulosic ethanol,” which aims to turn all kinds of plant material—from corn stalks and wheat straw to forest trimmings—into fuel. According to a 2005 study by the U.S. departments of Energy and Agriculture, the U.S. could convert 1.3 billion dry tons a year of biomass to 227 billion liters (60 billion gallons) a year of ethanol with little impact on food or timber harvests and in the process displace 30% of the nation's transportation fuel. Not bad for what amounts to a lot of unwanted yard waste.

    No commercial cellulosic-ethanol plants exist today. But despite the failures of previous alternative fuels, decades of research in biotechnology, chemistry, and chemical engineering are merging to bring cellulosic-ethanol technology to the verge of a payoff. A host of small and large chemical companies have jumped into the area, propelled by recent high gas prices and nearly $2 billion in private and venture-capital funding for biofuels last year alone, according to London-based research firm New Energy Finance. A handful of cellulosic-ethanol demonstration plants have popped up as a result. And last month, the U.S. Department of Energy (DOE) announced awards of $385 million for six commercial-scale cellulosic-ethanol refineries (see table) that are expected to produce more than 130 million gallons of ethanol per year.

    That's still just a small fraction of the roughly 5 billion gallons of corn-based ethanol produced in the U.S. annually. But confidence in the new technology is riding high. Experts believe that scientific successes are now coming in a steady stream, which should progressively improve the technology and chip away at ethanol prices. “I think we will be there with cellulosic ethanol much more quickly than anybody realizes,” says Bruce Dale, a chemical engineer at Michigan State University (MSU) in East Lansing who has worked on ethanol conversion technology for 30 years.

    Fuel versus food?

    Ethanol hasn't always been an alternative fuel. Henry Ford originally planned to use it to power his Model T's. But it was quickly supplanted by cheap and plentiful gasoline, which packs 30% more energy per gallon than ethanol does.

    Ethanol began making its comeback after the oil shocks of the 1970s. Brazil launched a national effort to convert sugar cane into ethanol in 1975 in hopes of reducing its vulnerability to high oil prices. As part of that effort, the country's federal government required gas stations to blend 25% ethanol into gasoline and encouraged carmakers to sell engines capable of running on pure ethanol. As a result, ethanol production in Brazil has climbed steadily, from 0.9 billion gallons in 1980 to 4.2 billion gallons last year. And the price of the fuel has dropped steadily to $0.81 a gallon, according to a recent article by José Goldemberg, the State of São Paulo's Secretary for the Environment (Science, 9 February, p. 808).

    U.S. ethanol producers have seen a similar surge in output. In 2005, they turned out roughly 4 billion gallons of ethanol, or about 3% of the 140 billion gallons of gasoline used in the U.S. each year. Today, most of that ethanol is blended with gasoline at a 10:90 ethanol-to-gasoline ratio to boost the fuel's octane rating, which allows it to burn more cleanly, reducing urban smog. Two years ago, Congress mandated a production increase to 7.5 billion gallons a year by 2012. And the president's recent initiative aims to produce as much as 35 billion gallons of alternative fuels by 2017. The European Commission too has called for 10% of its transportation fuel to come from biofuels such as ethanol and biodiesel by 2020.

    But crops such as corn and sugar cane won't be enough to produce all this fuel. According to one recent DOE study, U.S. corn grain ethanol production is likely to top out somewhere around 12 billion gallons a year. “Even if you took all the starch and converted it to fuel, it only gets you to about 10% of our gasoline,” says Jim McMillan, a biochemical engineer with the National Renewable Energy Laboratory (NREL) in Golden, Colorado. Long before that point, diverting too much of the corn crop would cause dramatic rises in the cost of food. And even at today's modest levels of ethanol production, price pressure is already being felt. Corn prices in the United States hit a 10-year high of $4.47 a bushel ($176 per metric ton) last month, nearly double the price a year ago, fueled in part by the increased demand for ethanol.

    The winners.

    The U.S. Department of Energy recently backed six cellulosic-ethanol refineries.



    To get past the food-versus-fuel debate, “you've got to get into cellulose,” says McMillan. Doing so would both increase the volume of ethanol that can be made and lower emissions of greenhouse gases. That's where cellulosic ethanol really shines, says Alexander Farrell, an energy resource expert at the University of California, Berkeley. In a paper published last year in Science (27 January 2006, p. 506) and in follow-on work, Farrell and colleagues found that because of its high energy inputs, using corn-based ethanol instead of gasoline reduces greenhouse gas emissions only about 18%. With its modest energy inputs, cellulosic ethanol fares much better, reducing greenhouse gas emissions by 88%.

    Sweet science

    But converting cellulose to fuel is far more difficult than starting with simple sugar, as in Brazil, or corn starch, as in the United States. Starch is a straightforward polymer of glucose that is easily broken down by enzymes. Agricultural and forest wastes, by contrast, are far more complex. This biomass is made up of three ingredients: cellulose, a polymer of the six-carbon sugar glucose that's the main component of plant cell walls; hemicellulose, a branched polymer composed of xylose and other five-carbon sugars; and lignin, which crosslinks the other polymers into a robust structure.

    To convert any source of sugars to ethanol, those sugars must first be made accessible. That's simple in the case of sugar cane, where the sugar is harvested and made into a syrup. It's a bit harder with corn grain. But there, engineers simply add enzymes called amylases to clip apart the starch polymer into separate glucose molecules. But with other agricultural products, such as leaves, stalks, grasses, and trees, the material must be broken down so that crystalline fibers made up of hemicellulose and cellulose can be digested into simple sugars before being turned over to microbes that convert them to ethanol, a process known as fermentation.

    So far, it's on this fermentation stage that most of the attention in the cellulosic-ethanol field has focused. That's because although yeast naturally converts glucose to ethanol, there are no naturally occurring organisms that convert xylose and other five-carbon sugars to ethanol. Escherichia coli and other organisms do metabolize five-carbon sugars. But instead of making ethanol, they naturally produce a variety of acetic and lactic acids as fermentation products. To take advantage of the sugars that make up some 25% of plants, researchers needed to reengineer the workings of microbes.

    The first to do so, in 1985, was microbiologist Lonnie Ingram of the University of Florida, Gainesville, who reported that he and his colleagues had inserted a pair of key sugar-fermenting genes into the bacterium E. coli. The genes redirected E. coli's metabolism to convert 90% to 95% of the sugars in biomass to ethanol. Ingram's early E. coli strains weren't perfect. They could tolerate only about 4% ethanol in the final fermenting solution. Because the fuel must be distilled out of the surrounding water, a highly energy-intensive process, ethanol makers strive to minimize the amount of distillation by using organisms that can tolerate the most ethanol possible. Since their early work, Ingram says he and his colleagues have managed to increase E. coli's tolerance to about 6.4% ethanol. Ingram's strains have since been licensed to Celunol, which is building a 1.4-million-gallons-per-year cellulosic-ethanol plant in Jennings, Louisiana.

    Other groups, meanwhile, have pushed to impart new talents to other organisms. In 1995, for example, researchers at NREL engineered a bacterium called Zymomonas mobilis to ferment xylose and other five-carbon sugars in addition to the six-carbon sugars it favors naturally. The work has since been taken up by researchers at DuPont in Wilmington, Delaware. And last year, DuPont's biofuels technology manager William Provine reported at the annual American Institute of Chemical Engineers meeting in San Francisco, California, that his group has recently come up with a Zymomonas strain capable of tolerating up to 10% ethanol. That process too is on the road to commercialization. Last month, officials at DuPont, Broin (a major corn-ethanol producer), and Novozymes announced that, as part of the DOE award, they will expand an existing corn-grain ethanol plant in Emmetsburg, Iowa, to produce approximately 30 million gallons of ethanol a year from corncobs and other cellulosic feedstock.

    Yeast researchers have also gotten in on the act. Yeast is today's ethanol heavyweight, given its natural proclivity for turning glucose into ethanol. But because the microbe doesn't naturally process five-carbon sugars, researchers have expanded its abilities to make it better suited to more complex biomass feedstock. In 1993, researchers led by Nancy Ho, a microbiologist at Purdue University in West Lafayette, Indiana, spliced a trio of xylose-fermenting genes into yeast, making it the first yeast strain capable of fermenting xylose to ethanol. Since then, Ho's group has honed yeast's ability to convert a mixture of sugars to ethanol through improvements that include enabling it to use five-carbon sugars other than xylose and boosting the speed at which the organism produces ethanol.

    Growth industry.

    A new agricultural-waste-to-ethanol plant in Jennings, Louisiana, is among the first of a new crop of cellulosic-ethanol facilities.


    Tougher, softer, faster

    Despite their successes in coaxing organisms to convert sugars to ethanol, most researchers recognize that much work remains to be done. “We are still climbing the mountain,” McMillan says, and are “relatively low” on the slope. For example, yeast can convert a bath of glucose to ethanol in just a few hours, but microbes working on a complex mix of sugars can take 1 to 2 days to do the same thing. In a commercial plant, that means lower fuel output. So researchers around the globe are focusing heavily on increasing the expression of fermenting enzymes to step up the speed.

    Another focal point for researchers, Ho and others say, has been toughening up the microbes. “All of these strains, while they are good at making ethanol, their robustness is nowhere near baker's yeast [working] on glucose,” says McMillan. In addition to the intolerance many organisms have for ethanol, a wide variety of other compounds from broken-down biomass inhibit enzymes in fermentation.

    Researchers are also looking for improvements in other parts of the process. One that has come under scrutiny is the chemical processing used to prepare plants for fermentation. Traditionally, researchers break apart the plant fibers by exposing biomass to dilute acids and steam. The result is a soup that can then be exposed to cellulase and hemicellulase enzymes, which further break fibers down into simple sugars for fermentation. But acid-steam processing has several drawbacks. For one, the acid reacts with sugars, reducing by about 10% the amount of total sugars that can later be fermented, MSU's Dale says. The acid byproducts, he adds, also inhibit cellulases and other key enzymes. Finally, the acids typically cannot be recovered and used again, which adds to the costs.

    So Dale and other researchers are now commercializing a process that, instead of acids, uses basic compounds such as ammonia to accomplish the job. In recent years, Dale's group has developed a low-temperature process that readily breaks down leaves, grasses, and straws. It also allows facility operators to recover and reuse the ammonia and creates fewer enzyme inhibitors than do acid treatments. According to a recent analysis by Tim Eggeman, a chemical engineer with Neoterics International in Lakewood, Colorado, the technique could drop the cost of cellulosic ethanol by 40 cents per gallon. At least for now, however, the technique doesn't work well with lignin-rich woody feedstock such as trees. So the hunt is still on for improvements in that arena.

    A final target for many researchers lies inside plants themselves. Some companies and academic groups are working to reengineer plants such as corn, poplar trees, and switchgrass to boost their yields and make them easier to turn into fuel. In 1999, for example, researchers led by Vincent Chiang, a molecular biologist and organic chemist at North Carolina State University in Raleigh, reported that they had engineered poplar trees with 50% less lignin than conventional varieties and more cellulose instead. Originally, that work aimed at increasing the cellulose content for paper production. But Chiang says the result is equally valuable for improving the carbohydrates in trees for conversion to ethanol. “The idea is to generate as much polysaccharides as possible,” Chiang says.

    Since their early success, Chiang says, his group has been unable to reduce the lignin content below the initial 50%. More recently, he and his colleagues have turned to tinkering with genes that control the cellulose fibers within trees, aiming to reduce the crystallinity. Although the work is still unpublished, “we have altered several cellulose synthase genes and have pretty much figured out which are the important ones,” Chiang says. The hope, he says, is to make it easier for cellulase enzymes to break down the polymer into glucose units during processing. That, in turn, would reduce the amount of enzymes that need to be added prior to fermentation and chip away at the overall cost. Related efforts are also under way to improve other potential energy crops—for example, reducing the lignin content and increasing the yield of grasses such as switchgrass and Miscanthus.

    These and other advances lead alternative-fuel experts to predict that the cost of cellulosic ethanol will continue to decline, just as the cost of corn- and sugar-cane-based ethanol has. “Each step has a newness to it that allows for optimization. Each one of them helps bring the cost down,” says John Pierce, who oversees DuPont's bio-based technologies in Wilmington, Delaware. Although there are no commercial cellulosic-ethanol plants today, most estimates put the current cost of producing a gallon of cellulosic ethanol at between $3 and $4. By the time full-scale production plants come on line beginning in 2009, that cost is expected to be about $2 a gallon. DOE's current goal is to drop the price to $1.07 a gallon, at which point it will be competitive with making ethanol from corn.
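The size of the cost reductions implied by those figures is easy to make concrete. The dollar amounts below are taken from the article; using the midpoint of the $3-$4 range as "current cost" is my own simplifying assumption.

```python
# Rough percentage reductions implied by the article's per-gallon cost
# figures for cellulosic ethanol.

current = 3.50    # midpoint of today's $3-$4 estimate (assumption)
near_term = 2.00  # expected once full-scale plants come on line (~2009)
doe_goal = 1.07   # DOE target, roughly at parity with corn ethanol

def pct_drop(start: float, end: float) -> float:
    """Percentage reduction from start to end."""
    return 100.0 * (start - end) / start

print(f"current -> 2009 plants: {pct_drop(current, near_term):.0f}% reduction")
print(f"current -> DOE goal:    {pct_drop(current, doe_goal):.0f}% reduction")
```

In other words, the near-term plants are expected to shave off roughly two-fifths of today's cost, while hitting the DOE target would require cutting it by about two-thirds.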

    Yet even if cellulosic ethanol is destined to compete head-to-head with corn-based ethanol, it is benefiting right now by being in the second rank. “Corn ethanol has certainly paved the way for a lot of alternative fuels,” says Ingram. In addition to pioneering the commercialization of enzymes used to digest starch and reducing their price dramatically, corn ethanol producers have created an infrastructure for handling large volumes of biomass and spurred gasoline suppliers to incorporate ethanol into their supply chains. Numerous U.S. automakers have also begun producing E85 vehicles, able to burn a mixture of 85% ethanol and 15% gasoline. Cellulosic-ethanol makers will inherit this established infrastructure, easing their way into the market—and perhaps even helping them create the first real alternative to gasoline.

  11. IPY Means Doing What It Takes to Get to the Ends of the Earth

    1. Jeffrey Mervis*
    1. With reporting by Ahn Mi-Young in Seoul and Richard Stone in Bangkok.

    The International Polar Year poses unprecedented logistical challenges—and scientific opportunities—for governments around the world

    Powered by four massive engines, the Polarstern can crunch through 1.5 meters of sea ice at 5 knots as it takes scientists where they want to go in the polar regions. But there is one obstacle that even this 25-year workhorse of Germany's polar research program can't overcome: high Russian tariffs.

    For the past several years, prohibitively expensive fees to pass through Russia's Exclusive Economic Zone (EEZ), which extends 320 km from its coast, have forced Arctic researchers from many nations to rearrange their plans. In 2005, for example, scientists at the Alfred Wegener Institute for Polar and Marine Research (AWI) in Bremerhaven, Germany, rerouted a research cruise after learning that taking the shortest route home from the East Siberian Sea would cost them nearly half a million dollars. “Our program has some money, but we were not willing to pay such a high amount,” says AWI senior scientist Ursula Schauer.

    This year, Schauer, a physical oceanographer, is once again hoping to take the ship through the Russian EEZ. And although she hasn't won the lottery, she's counting on something even more valuable to make it happen: the International Polar Year (IPY). Last October, Russia decided to slash its tariff by half for foreign vessels taking part in IPY, a 2-year research extravaganza that kicked off this month and continues until March 2009. Cheered on by Artur Chilingarov, a vice speaker of the Duma, Russia's legislature, and head of its IPY committee, the Russian government has agreed to give a green light to any IPY-approved project. “Artur is the pole on which we hoist our IPY flag,” says Sergey Priamikov, a physical oceanographer at the Arctic and Antarctic Research Institute in St. Petersburg who also heads the Eurasian branch of the IPY program office.

    IPY is the fourth in a series of polar lollapaloozas going back to 1882. The previous effort, the 1957–58 International Geophysical Year (IGY), laid the groundwork for the international regimen that governs Antarctic research. Although it has no money to give out, IPY serves as both administrative umbrella and cheerleader for a mélange of research collaborations that individual countries and organizations have pledged to support. Its volunteer leadership has vetted more than 1200 proposals and created a honeycomb chart of some 200 approved projects that incorporate IPY's six research themes—most prominently, to understand the changing polar environment and the impact of those changes. “We hope IPY will create a greater community of cooperation,” says Australia's Ian Allison, co-chair of IPY's joint committee.

    Such global teamwork can be just as important as money, says glaciologist Olav Orheim, head of IPY activities at the Norwegian Research Council. The council is pouring all of a 50% increase in its annual research budget—some $50 million—into 26 IPY projects extending to 2010, including a joint Antarctic traverse with U.S. scientists that will mark the Scandinavian country's first overland visit to the South Pole since legendary explorer Roald Amundsen was the first to arrive in 1911. “We hope that IPY will create a legacy of cooperation that will remove the unpredictability and logistical challenges that have plagued polar research,” says Orheim.

    On the move.

    Scientists from several nations will carry out extensive ice coring and other research activities throughout eastern Antarctica during IPY.


    The right stuff

    The logistical challenges are indeed formidable. A surge in activity at the poles during IPY will strain the world's existing capacity—from ships and planes to labs and communications equipment—to do science in these unforgiving conditions.

    Canada's Arctic program, for instance, is scrambling to secure enough boats. “Last year, we realized we were facing a potential shortage because of the demand from oil and mineral companies and from ecotourism,” says David Hik, an Alpine ecologist at the University of Alberta, Edmonton. Hik heads Canada's IPY office, which oversees the government's new $128 million investment in polar research, 80% of which will be spent during IPY. “We also expect a crunch on helicopters and fixed-wing craft because of the limited number of pilots with experience landing on ice or in remote camps,” he says.

    Much of the stampede is to address a major IPY theme, understanding climate change in the Arctic. An alphabet soup of projects includes SEARCH (Study of Environmental Arctic Change), DAMOCLES (Developing Arctic Modeling and Observing Capabilities for Long-Term Environmental Studies), and IASOA (International Arctic Systems for Observing the Atmosphere). Orheim also has high hopes for an ongoing meteorological project linked up with IPY, called THORPEX, that aims to quantify the impact of polar warming on the stability of the Gulf Stream and global climate by tracking water inflow into the Arctic in unprecedented detail. “There is a lot of speculation about what is happening but very few facts,” he says. Adds Schauer, a member of the scientific steering committee for DAMOCLES: “This is the first time so many international partners have focused on the physical aspects of climate change. We know the ice shrinks every year, but we don't really know why.”

    At sea

    Ship time for polar research is a precious commodity. Funding agencies around the world must match scientists' needs with a suitable vessel in the right place at the right time. Schauer, whose request for the Polarstern to pass through the Russian EEZ this fall is wending its way through a thicket of government ministries, has even created a Web site to make the horse-trading more transparent and to foster collaboration. But that's not enough to level the playing field.

    Take the Oden, Sweden's heavy-duty research icebreaker, which fills the same niche as Germany's Polarstern and the U.S. Coast Guard's Healy. Whereas German and U.S. officials have decided to dedicate the next 2 years of ship time to IPY-related cruises, the Oden is available to any group with the money—$25,000 a day, plus fuel costs—to charter it. That leaves Swedish researchers, whose government has not allocated any new funding for IPY, at a disadvantage. “Our polar research secretariat has scheduled one Arctic expedition every other year on the Oden,” says Michael Tjernstrom, a meteorologist at Stockholm University who has put together the IPY-endorsed Arctic Summer Cloud-Ocean Study that hopes to win the coveted cruise slot in the summer of 2008. But with funding tight, he says, “it's really very difficult for us to find the money.”

    Sweden's loss is a gain for other countries. This summer, a U.S.-led team will charter the Oden to explore the seldom-visited Gakkel Ridge in the central Arctic basin. The AGAVE project will deploy a new generation of autonomous vehicles deep under the Arctic ice pack to hunt for undiscovered life forms nurtured by hydrothermal vents that dot the slow-spreading ridge.

    AGAVE didn't even start out as an IPY project. It was initially funded by the U.S. National Science Foundation (NSF) and NASA before IPY took shape, says principal investigator Robert Reves-Sohn, a marine geophysicist at Woods Hole Oceanographic Institution in Massachusetts. After delays in finding a ship pushed the project into the IPY time frame, Reves-Sohn broadened the mission's scientific agenda and invited non-U.S. scientists. Those changes—including adding Japanese sensors to the underwater robots and German seismometers to ice floes at the surface—earned AGAVE an IPY stamp of approval. “We expanded the scientific scope at no cost to the U.S.,” says Reves-Sohn.

    Finding common ground

    At the other end of the world, IPY is accelerating a revival of a hallowed Antarctic tradition—the research traverse. These forbidding land treks are akin to research cruises because of the wealth of data collected on the move. Popular during IGY and into the 1970s as a means of gathering geophysical information about the largely unexplored interior of the frozen continent, traverses gradually fell out of fashion as scientists opted to mine a confined area, often season after season.

    Several traverses are in the works for IPY (see map). Although their emphases vary from glaciology to geophysics to climate science, each team has agreed to pool data whenever possible to help fill in what remains a very sketchy picture of the continent. Many will build on the International Trans-Antarctic Scientific Expedition (ITASE), an ongoing series of expeditions to gather continent-wide environmental parameters. They feature drilling ice cores for climate information going back as far as 1000 years, operating surface and subsurface radar that help to provide ground truth for satellites, and gathering hard-to-get Antarctic weather data.

    When ITASE began in the 1990s, “we thought that Antarctic climate might be a simple picture,” says one of the project's initiators, glaciologist Paul Mayewski of the University of Maine, Orono. “But it's a vast place. And after 15 years, we know enough now to realize where the sensitive spots are.” In January, his team completed the first, 500-km leg of a planned 6000-km traverse over three seasons that will take them from McMurdo Bay past the South Pole to Dome A, the highest point on the continent, and then back to the pole.

    The latest ITASE expedition also complements routes taken by other traverses flying the IPY banner. “Once we can get to Dome A,” Mayewski says, “we can match our data with what is being collected by the Chinese arriving from the opposite side and also with TASTE-IDEA,” a 3-year, European-led traverse beginning in the fall of 2007 that will cross the less-familiar East Antarctic ice divide. More information will come from a U.S.-Norwegian traverse beginning this fall that will travel to the South Pole from Norway's Troll Station near the African coast of Antarctica and then return to Troll in the 2008–09 season.

    “We'll be going through no man's land,” says Jan Gunnar Winther, director of the Norwegian Polar Institute and leader of the joint project, which will cost an estimated $15 million. In addition to drilling ice cores, the 11-member team will launch drones for airborne photography and operate a scanning electron microscope to analyze the transitions from fresh snow to compacted old snow to ice. “It will fill a huge hole in the data,” says Winther. “It's also by far the biggest traverse that we've ever done.”

    IPY's integrative nature provides a rationale for traverses that might be hard to justify on their own, says Ian Goodwin, ITASE co-chair and a glaciologist at the University of Newcastle, Australia. For example, he says, “IPY is likely the only way” to get the necessary support to carry out exploratory geophysics.

    IPY also represents a golden opportunity for nations relatively new to polar research to build up their scientific infrastructure. In addition to China's big spurt (see p. 1516), South Korea, for example, last year committed $183 million to two polar projects: a $107 million research vessel with icebreaking capabilities, to be ready in 2010, and a $76 million research station, its first on the Antarctic mainland, to be completed in 2011. That money comes on top of the $38 million a year that it spends through the Korean Polar Research Institute (KOPRI), which this winter funded a six-member team that searched for meteorites near Patriot Hills in western Antarctica, the first-ever Korean expedition on the continent.

    The new ship will service existing Korean stations in the Arctic and on King George Island off the Antarctic Peninsula. It will allow Korean scientists to do research at both poles now performed on leased Russian and European icebreakers. KOPRI is eyeing two sites for the station, one in West Antarctica, off the Amundsen Sea, and the other farther east, near Queen Maud Land off the Weddell Sea.


    The French polar yacht Vagabond is used as a base camp for DAMOCLES environmental research in the Arctic.


    Power to the people

    Arguably the biggest logistical challenge facing polar science isn't an IPY-inspired project. It's the simultaneous construction of the 10-meter South Pole Telescope (SPT) and installation of the 1-cubic-kilometer Ice Cube neutrino array (see p. 1523) at the U.S. Amundsen-Scott station, now in the final stages of a major upgrade. The way NSF has managed those three projects offers a blueprint for supporting science in inhospitable surroundings.

    “The scope of Antarctic science has changed over the past 20 years,” says Erick Chiang, head of NSF's Antarctic logistical support, a $66-million-a-year program to enable NSF's $322 million a year in polar research. “The projects are more complex, and they address larger problems. But probably the biggest change for researchers has been a move away from designing a proposal around what's there and toward an approach that says, ‘Here's what we want to do. Let's see what NSF can support.’”

    That can-do spirit has taken the South Pole station close to its operational limits. “Last March, it became clear that there was a looming power crunch,” recalls John Carlstrom, director of the Kavli Institute for Cosmological Physics at the University of Chicago in Illinois and SPT project leader. “The station was already running at capacity without any telescope activity and with nothing new for Ice Cube. Our first reaction was to wonder if there had been a mistake or poor planning,” he says. “But NSF put together a tiger team—the first time we'd ever done that for power issues. Now we have a plan that should get us through this year and give us more time to think.”

    The interim solution was improved fuel efficiency and energy conservation, says Chiang. But the larger lesson, he believes, is the inherent difficulty of anticipating where science might be headed. The final design of the new station was completed in 1997: “Nobody then could have anticipated SPT and Ice Cube,” Chiang notes.

    Nor can scientists anticipate the new lines of inquiry that might come out of IPY. But it's clear that the scientific onslaught taking place at the ends of the Earth will bolster polar research for decades to come.

  12. Long (and Perilous) March Heralds China's Rise as Polar Research Power

    1. Richard Stone
    Seeing them off.

    Zhongshan Station in East Antarctica is the staging ground for China's traverses inland to Dome A.


    SHANGHAI—Bo Sun remembers the first time he and his fellow Chinese adventurers struck out from Zhongshan Station on the East Antarctic coast, trekking inland across uncharted terrain in 1996. “It was terrible,” says Bo. “We did not know how to handle it.” Their lumbering vehicles frequently got stuck in drifts. Death traps lurked, unseen, beneath wispy ice bridges. “Sometimes when we looked back, a big crevasse appeared. Our hearts jumped,” says Bo, a glaciologist at the Polar Research Institute of China (PRIC) in Shanghai.

    But they lived, and they learned—how to avoid getting stranded by forging ahead during a storm, for example, and what to eat on the energy-sapping traverses. Along the way, they've mapped major crevasse fields. “Now we are experts at recognizing where the dangers are,” says Bo. After several short trips to build up their capacity, Bo and his colleagues in January 2005 reached their objective—the highest point on Antarctica's ice sheets, Dome Argus (Dome A)—and returned home, completing the 2500-kilometer roundtrip in 10 weeks.

    The conquest marked a coming of age of China's polar program. Radar echo soundings during the traverse suggested that the ice at the bottom of Dome A could be the oldest on the continent, going back as much as 1.2 million years, says PRIC Director Zhang Zhanhai.

    The rising power got a late start in Antarctica. Its researchers first visited the continent in 1980, and Zhongshan was opened in 1989. (China's Great Wall Station debuted on King George Island in the South Shetlands in 1985.) During the International Polar Year (IPY) and beyond, China is set to really take off. In 2006, the government approved $70 million for major polar projects, including $4 million for IPY research in 2007; $19 million for new PRIC headquarters in Shanghai; $22 million to overhaul the Zhongshan and Great Wall stations; and $25 million to renovate the Snow Dragon (Xuelong), a Ukrainian-built research vessel that will cap IPY with a globe-girdling expedition to plumb the effects of rapid Arctic change on the mid-latitudes.

    But it's Dome A that could well turn Chinese scientists into the new darlings of Antarctic research. Their 2005 radar soundings from Argus, 4093 meters above sea level, revealed that the ice there astride the Gamburtsev Mountain Range is 3070 meters thick—twice what modeling had suggested, says Bo. In the next few years, China hopes to start drilling at Dome A to retrieve what could be an unparalleled window on past Antarctic climate. Finding ice that “captures our planet's climate in a different phase … is the Holy Grail of the ice-core community,” says Robin Bell, an Antarctic expert at Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York, who concurs with PRIC's estimated age of the Dome A ice.

    In the meantime, the centerpiece of China's 3-year IPY program is PANDA, a geophysics initiative spanning the transect from Prydz Bay, near Zhongshan Station, to Dome A. In collaboration with researchers from five countries, China 3 years ago drilled a 310-meter ice core on the Amery Ice Shelf in Prydz Bay. Drilling will continue in 2007–08, and during a traverse to Dome A in late 2007, scientists will install a series of magnetometers to probe how the solar wind energizes electrons in the magnetosphere to form auroras.

    The traverse will cast its gaze downward, too. Following up on the 2005 traverse findings, China is planning airborne radar mapping of the enigmatic Gamburtsevs and the overlying ice sheet with help from Australian, German, U.K., and U.S. researchers. “We know little about the boundary of ice and rock,” says Bo.

    Bus to Argus.

    The 2005 traverse.


    Also during IPY, China will lay the groundwork for a major-league astronomy observatory at Dome A. Like the South Pole and Dome C (p. 1523), Argus provides the thin, clear skies ideal for serious stargazing. The Purple Mountain Observatory in Nanjing is teaming up with Australian and U.S. astronomers on a Dome A survey in 2007–08 to determine the best spot for a major telescope facility, and an array of four 15-centimeter telescopes will be installed to hunt for planets orbiting nearby stars. Later in 2008, PRIC will ship three 50-centimeter telescopes to Dome A that will provide the first picture of a large portion of the sky in polarized light, says Wang Lifan, director of the Chinese Center for Antarctic Astronomy.

    In the short term, Zhang says, “nobody will winter over at Dome A.” Construction of a year-round station is set to begin in 2011, but for the next decade, the annual work window at Dome A will be narrow as the monthlong traverse leaves roughly 2 weeks on site. “There's only a short time that the weather is suitable for people,” Bo says. “But the place is so fantastic”—fantastic enough to endure the hardships Bo and his team faced to get there.

  13. Opening Doors to Native Knowledge

    1. Jennifer Couzin

    Scientific and local cultures seek common ground for tackling climate-change questions in the Arctic

    Growing trend.

    Like other groups native to the Arctic, Sami reindeer herders in Northern Europe are collaborating more and more with polar researchers.


    “I say that there are three sure signs of spring,” says Caleb Pungowiyi, a 65-year-old Siberian Yu'pik who lives in Kotzebue, Alaska. “The ducks and the geese coming back, tourists coming back, and scientists who come back to check their instruments.” Some Inuit in Alaska call these researchers Siksik, the Inuit word for the ground squirrel, which pokes its head up only in the summer. For their part, these “squirrels” have traditionally gone about their business, rarely tapping the natives for their expertise.

    But as climate change sweeps through the Arctic, decades-old divisions between the indigenous people who live there and the scientists who parachute in and out are slowly dissolving. Projects that draw on traditional knowledge of animal migrations, ice patterns, shrubbery, and weather are popping up from Baffin Island, Canada, to Rovaniemi, Finland. The goal is to use local information—say, about the health of individual caribou or about whales killed by hunters—to supplement and enrich scientific data, such as sea-ice or vegetation changes. “We're living in a period of extreme uncertainty, and these perspectives add insight,” says Gary Kofinas, who studies resource management at the University of Alaska, Fairbanks. “I think it's fair to say that we need all the help we can get.”

    Collaborations between scientists and indigenous people are also driven by external pressures. The influential 2005 Arctic Climate Impact Assessment ranked indigenous knowledge high on the list of topics that scientists should pay attention to. And the International Polar Year, which began this month, lists the sustainability and perspectives of societies living in the Arctic as one of six themes shaping its research agenda.

    But although many researchers are enthusiastic about these nascent collaborations, and some projects have already yielded interesting new directions or confirmed scientific findings, drawing on traditional knowledge to supplement hard science is still largely uncharted territory. There is no way yet to quantify uncertainty in the information supplied by indigenous people. And even as native collaborators begin to take on ever-larger roles in scientific projects, there is no systematic strategy for reconciling conflicts between scientific data and traditional knowledge.

    “The partnership is in its infancy,” says Igor Krupnik, a cultural anthropologist at the Smithsonian National Museum of Natural History in Washington, D.C., who has worked extensively with Arctic indigenous people. Their insight “is great stuff, but you have to be careful.”

    Breaking the ice

    Many scientists point to the Arctic Climate Impact Assessment, an international project coordinated by members of eight Arctic countries, as a turning point in efforts to draw on indigenous knowledge. Initially, “indigenous people were listed as chapter number 10, 11, or 12, back at the very end” of the report, recalls Krupnik, one of the authors. But gradually, as the document was assembled, the chapter on indigenous knowledge climbed the ladder, ending up as number three. It's a subtle distinction, but Krupnik considers it highly symbolic. The report “was very instrumental,” he says, in awakening people to the value of traditional knowledge as “very solid science.”

    Those who work with indigenous communities agree that locals possess information that scientists have difficulty accessing independently. “I'm looking at computer screens or satellite images, but I don't have the time to wander around the landscape like the Sami do,” says Terry Callaghan, who runs the Abisko Scientific Research Station in Sweden, referring to the native population that herds reindeer in three Nordic countries and Arctic Russia. Furthermore, although some scientists such as Callaghan spend time in the Arctic year-round, their presence through the winter is rare. That makes locals such as the Sami unique observers of how changing winter weather patterns are altering the landscape.

    Scientists and native people say traditional knowledge can be especially helpful in providing an in-depth, up-close view of the Arctic. Whereas scientists might be well-versed in, say, sea-ice extent, the Inuit know the ice much more intimately—hole by hole, crack by crack—says Shari Gearheard, a geographer at the University of Colorado, Boulder, who lives in Clyde River on Baffin Island. Over the next 3 years, she and local partners will be traveling across sea-ice hunting grounds in Baffin Island, Alaska, and Greenland, recording what they see. Already she's learned more about sea-ice dynamics than she would have had she struck out alone. “There are certain cracks that are in the same place every year, but some are moving now, and there are new ones that are not expected,” Gearheard says.

    Such attention to detail has impressed Krupnik and walrus biologist G. Carleton Ray of the University of Virginia, Charlottesville. Yu'pik Eskimos on Alaska's St. Lawrence Island, in the Northern Bering Sea, not only examine hunted walruses for everything from gut parasites to the texture of their blubber, they also have a far more descriptive language than biologists. Ayviquma, for example, means mother, yearling, and young calf in one group. Amiinaqut nunavaget defines a group of walruses isolated on an ice floe. Such precision, Krupnik says, makes the historical record passed through generations especially valuable.

    So far, some of these recollections match up well with scientific data. As part of a project with the Sami reindeer herders around Abisko and Sami academics from northern Norway, Callaghan has found that Sami observations of how snow depth has changed over 50 years generally jibe with long-term data collected by scientists. The Sami also informed him that during the middle 1980s, the wind had switched directions. Combing through climate records, Callaghan found an abrupt climate change at that time. The mean annual temperature had jumped by about 1.5°C.

    Culture clash

    But for every observation like this one, which underscores the remarkable inter-generational memory of many indigenous people, there are other recollections that might be difficult to interpret or even trust. “One Sami said, ‘The sky isn't as blue as it used to be.’ He was talking about changing atmospheric conditions, but it was an observation I could not accept,” says Callaghan. “You can't remember color; you can't pass it down through generations.”

    Furthermore, how to use indigenous knowledge is something that dogs Arctic researchers. “Human knowledge is not scientific knowledge,” says Tero Mustonen, a subsistence fisher in the Finnish Arctic who also studies human ecology. “It's not universal, it's not systematic, it's not free of biases.” And layering that knowledge onto scientific data can be especially troublesome when they conflict, Krupnik adds.

    Some scientists are trying to gain a better sense of how indigenous people gather information about their natural environment by literally working side by side with them, as Gearheard does in her travels onto sea ice. The hope is that such collaborations might ease the translation of indigenous knowledge into scientific data, where that's appropriate, and make science more useful to the locals. For example, at the University of Lapland in Rovaniemi, Finland, biogeographer and vegetation scientist Bruce Forbes is 3 years into a 4-year project examining how terrestrial ecosystems in parts of Siberia and the Eastern European Arctic are changing. He and his colleagues migrate with the Nenets reindeer herders, living and working alongside them in teepeelike structures.

    In it together.

    Baffin Island locals Teema Qillaq, Lasalie Joanasie, and Andy Murray help install a monitoring station to assess changes in sea ice.


    During one such migration last November, a major “icing event” occurred in which the air temperature warmed enough for rain, then plunged again. The water froze atop the snow, making it difficult for the reindeer to access food. By documenting how the Nenets reacted to this situation, which until recently was very rare, the scientists are developing a keener sense of how resilient and adaptable their hosts may be in the face of climate change.

    Thanks to these efforts, an indigenous slant on Arctic science is on the rise, bringing to the fore issues about the control, acquisition, and dissemination of information. There is no good standard for deciding when indigenous knowledge is intellectual property, for example. To strike a balance, Kofinas, Krupnik, and others are increasingly including indigenous collaborators as co-authors on papers. Yet, for a local to say, “‘You can't publish it without my approval’ … would never go over well in my university,” says Kofinas.

    For their part, indigenous people and their governments are becoming ever more proactive. In Canada, they actively screen projects that fall into their geographic region. For a caribou-monitoring network called the Arctic Borderlands Ecological Knowledge Co-op, Inuit in Canada and Alaska conduct interviews with other residents about caribou populations. “It's local folks running the show,” Kofinas says.

    Pungowiyi, the Alaskan native, also thinks it's time to get more involved with the “ground squirrels.” He recently submitted his first proposal to the U.S. National Science Foundation with Henry Huntington, a social scientist in Eagle River, Alaska, to examine climate change effects that indigenous people are observing on land and in the ocean. Understanding the Arctic requires more than numbers and satellite photos, he insists. There's a need to “put a human face to the effects,” he points out. “That's what we're trying to get to.”

  14. Sailing the Southern Sea

    1. John Bohannon

    An international project sorts out the dynamics of climate and nutrient fluxes in this polar ocean community

    On ice.

    The icebreaker RV Nathaniel B. Palmer crunched through winter pack ice to reach open water in Antarctica's frozen Ross Sea for algae studies.


    The spirit of John Martin still seems to haunt oceanography 14 years after his death from cancer. When he was a researcher at Moss Landing Marine Laboratories in California, Martin proposed that massive blooms of photosynthetic plankton in the frigid Southern Ocean around Antarctica and other nutrient-rich but iron-starved waters could be an antidote to global warming. By pulling carbon dioxide out of the atmosphere to build their tiny bodies and then sequestering that carbon as they die and drift to the bottom of the ocean, these microscopic algae could reduce greenhouse gases and cool Earth. The only thing holding them back, Martin argued, was a dearth of iron, a necessary part of their photosynthetic machinery. To drive the point home, he once stood up at a conference in Woods Hole, Massachusetts, and said half-jokingly, “Give me a half tanker of iron, and I will give you an ice age.”

    It was an idea that launched 1000 ships, “or certainly hundreds,” says Giacomo (Jack) DiTullio, an oceanographer at Hollings Marine Laboratory in Charleston, South Carolina. Although Martin did not live to see it, his colleagues at Moss Landing and others confirmed that lack of iron does limit growth: In a 1995 experiment, spreading dissolved iron over a 64-km2 patch of ocean near the Galápagos Islands caused a temporary, 30-fold boost in phytoplankton biomass.

    DiTullio has also followed in Martin's wake. As head of the Controls on Ross Sea Algal Community Structure (CORSACS) project, he and his colleagues are sorting out what—in addition to iron—makes plankton communities tick in the Ross Sea, the southernmost part of the Southern Ocean. It is one of the biggest potential hot spots for the phytoplankton blooms Martin envisioned.

    The CORSACS researchers have looked at half a dozen environmental parameters—iron, light, carbon dioxide, temperature, and several trace nutrients—not only individually but also in combination. Doing so is “both novel and necessary” for understanding how this community can contribute to climate change, says Philip Boyd, an oceanographer at the University of Otago in Dunedin, New Zealand.

    The data reveal a tangled web of interactions between the plankton community and its environment. “Iron addition works” to spark plankton blooms, says Jorge Sarmiento, a climate modeler at Princeton University, but it's clearly not the only factor controlling this key part of the global carbon cycle. In addition, the study indicates that changes in the Southern Ocean environment could shift the ecological balance between competing species of phytoplankton and, consequently, alter the contribution of phytoplankton to the carbon cycle. Such shifts could in turn exert “a very large impact on the air-sea balance of carbon dioxide,” says Sarmiento. CORSACS's challenge is to nail down these complexities so they can be built into climate models.

    Probing a complex soup

    The Ross Sea is a tough place to work. Its thick blanket of ice covers all but a France-sized pool, or polynya, where sunlight allows phytoplankton to flourish. For two cruises (December 2005 to January 2006 and then again November to December 2006), the researchers worked there around the clock, often with wet and freezing hands, while their shipboard laboratory pitched wildly beneath their feet. Although satellite data, easily collected from the comfort of one's office chair, can provide information about plankton densities and environmental conditions, “there's just no other way to answer the kind of questions we're after,” says DiTullio. To predict how the phytoplankton community will react to a changing environment, the researchers had to survey conditions in real time and test samples onboard to avoid confounding factors introduced by shipping them to labs on land.

    The team dangled sampling devices that tracked iron and other nutrient concentrations, as well as light, temperature, and pH at various depths throughout the cruises. At the same time, they studied the species composition, biomass, and photosynthetic activity. As Martin predicted, the survey data “consistently demonstrated iron limitation of growth,” says DiTullio. Wherever iron was available—blown in as dust or transported in water upwelling from the deep—the plankton multiplied, but only as long as the iron lasted. Several experiments onboard the ship showed the same phenomenon: Adding iron boosts growth until the metal is gone.

    But there is more to life than iron. Mak Saito, an oceanographer at the Woods Hole Oceanographic Institution in Massachusetts, wondered how vitamin B-12, another necessary nutrient, might affect algal growth. Only prokaryotic organisms such as bacteria are capable of making the molecule from scratch. In most ecosystems, there are more than enough bacteria to go around. “But the polar environments are unique,” says Saito, because these bacteria can be so rare there that B-12 might limit community growth. To test that idea, he took samples of plankton from three locations—ranging from low to high bacterial concentrations—and incubated them for a week in bottles with or without a supplement of the vitamin.

    Sure enough, adding B-12 boosted the algae's growth in bottles with sparse bacteria, whereas it had no effect when plenty of bacteria were present. An input of iron may be necessary for the phytoplankton to grow, says Saito, but so is access to vitamin B-12.

    A broad sampling.

    The CORSACS team crisscrossed the Ross Sea (above) to understand what gives some plankton species, like this diatom (below), an edge over others.


    CORSACS researcher Philippe Tortell, an oceanographer at the University of British Columbia in Vancouver, Canada, wanted to know what increased carbon dioxide concentrations might do to the productivity of the Ross Sea. Tortell incubated samples of algae in bottles with a range of carbon dioxide concentrations, from preindustrial levels up to more than twice current levels. As one would expect for photosynthesizers, higher carbon dioxide led to higher metabolic rates and faster reproduction. But a surprise was that increased carbon dioxide changed the species mix, causing one group of diatoms, known as Chaetoceros, to dominate the rest. This result sounds like good news for climate-change buffering, as these particular diatoms form long chains of cells that sink efficiently, making them adept carbon sequesterers.

    But the story is not that simple, says Tortell.

    Algal politics

    Tortell's carbon dioxide experiments highlight the dynamic nature of the Ross Sea ecosystem. The Southern Ocean is home to a diversity of species that compete for resources using different life strategies. In the Ross Sea, the diatoms and Phaeocystis, a green-brown, single-celled photosynthesizer that forms mucus-covered colonies, compete to be the top-dog alga. And the dominant species can vastly change the community's impact on the carbon cycle. Thus, understanding what gives one group of plankton an edge over another is key to predicting how the Ross Sea community will evolve in response to climate change, says Tortell.

    Saito has made a start. His group has discovered dense clusters of bacteria thriving within the mucus of Phaeocystis colonies. The bacteria may provide B-12 in exchange for a free ride in the mucus's carbohydrate-rich environment. And the mucus may benefit both species by acting as a sponge for iron. Because of this symbiosis, “Phaeocystis may out-compete diatoms,” says Saito. This interplay between bacteria and algae and between the algae themselves is occurring against a backdrop of the many environmental changes associated with global warming. For instance, the atmosphere is likely to become dustier due to increased droughts and soil degradation—potentially delivering more iron to the Southern Ocean. Warmer surface waters could also shift the ocean's ecological balance. And increased precipitation and ice melt will decrease the salinity of surface waters, which in turn will slow down mixing with saltier, lower levels. Less mixing means more time spent near the surface, says DiTullio, which is effectively “an increase in the average light level experienced by phytoplankton.”

    To test how these changes may act in concert on algal growth, a team led by David Hutchins, an oceanographer at the University of Southern California in Los Angeles, grew cultures of plankton during both cruises while tweaking temperature, light, and carbon dioxide and iron concentration simultaneously. Models of the Ross Sea based on field observations have “typically assumed that high light and high iron would favor diatom communities,” says Hutchins. But under high light and high iron concentrations in his experiments, Phaeocystis thrived whereas diatoms lagged.

    These dynamics make the Southern Sea a devilishly complex environment for climate researchers to model. Whatever ecological shifts wait over the horizon, says Hutchins, they will alter “the efficiency of the biological pump” for pulling carbon dioxide out of the atmosphere. And what sign will be attached to that change—positive or negative—is still anyone's guess.

    At this point, the CORSACS cruises have generated more questions than answers. In April, the researchers will gather in South Carolina to analyze the data. But one thing is already clear, says Boyd: “They indicate that climate change will simultaneously impact a wide range of ocean properties in Antarctica.” The big question is what those changes will mean for global climate in the coming decades—an unplanned experiment on a far grander scale than anything Martin ever imagined.

  15. Boom and Bust in a Polar Hot Zone

    1. Erik Stokstad

    Changes in penguin and other populations on the Antarctic peninsula reflect the turmoil caused by climate change

    When William Fraser first arrived in Antarctica in 1976 to study Adélie penguins, their nesting sites were surrounded by sturdy sea ice. “We could ski for miles,” recalls Fraser, an ecologist with Polar Oceans Research Group, a small nonprofit organization in Sheridan, Montana.

    No longer. In the past 30 years, Fraser and his colleagues have witnessed a stunning change in the climate, one that is altering the mix of species in the West Antarctic Peninsula, an icy mountain range that reaches toward South America. Air and ocean temperatures have risen, causing less ice to form and more snow to fall. “This ecosystem is on fire,” says Hugh Ducklow of the College of William and Mary's Virginia Institute of Marine Science in Gloucester Point.

    The change has taken a dramatic toll on some species, especially Adélie penguins, as their habitat and a key prey are disappearing. Also, new experiments suggest that increased snowfall is adversely affecting nesting. “The story for Adélies is absolutely dismal,” Fraser says.

    Slippery slope.

    Some Adélie penguins in Antarctica suffer from a loss of sea ice and prey, such as the silverfish (inset).


    But the tale is not one of universal suffering. On the peninsula, the populations of other species of marine birds and mammals are booming. As sea ice diminishes, living conditions seem to be improving for them. Across Antarctica, even Adélies are benefiting in some places. They are expanding into areas where sea ice was once too extensive.

    No one knows how long such favorable conditions will last. But as the hottest of polar hot spots, the West Antarctic Peninsula provides a sneak preview for the rest of the continent. “This is one of the best examples we have of an ecosystem where we can see the responses of rapid climate change,” says Ducklow. As such, “it may be a harbinger of what will happen elsewhere as the system breaks down,” adds Gerald Kooyman, a penguin biologist at the Scripps Institution of Oceanography in San Diego, California.

    In hot water

    The West Antarctic Peninsula has proven especially vulnerable to climate change. The peninsula's mean winter temperature has risen 6°C since 1950—the fastest rate on the planet. The ocean has warmed by nearly 0.7°C.

    Those trends have made Palmer Station, located at the northern end of the peninsula, a focal point for research on the impact of climate change on polar ecosystems. For the past 17 years, Fraser and about two dozen researchers have fanned out from this base camp to sample a swath of 120,000 km2. They're studying the ecosystems' physical and biological components, including penguins, seals, fish, and krill, which anchor the food web.

    The clearest trend is in the penguins. Adélies on the peninsula have suffered a 70% decline since the 1970s, yet nearby populations of gentoo and chinstrap penguins have gone through the roof. One difference is that Adélies prefer fairly wide expanses of ice, whereas the chinstraps, for example, like open water. Sea ice is forming later and retreating sooner each winter, resulting in 85 fewer days of ice cover than 25 years ago, Ducklow and his colleagues reported in January in the Philosophical Transactions of the Royal Society of London.

    Adélie penguins may also be affected by a decline in silverfish, which were once half this penguin's diet. Now the penguins eat mostly krill, which are just as nutritious but not as reliable as a food source because their populations periodically bottom out. When krill decline, the penguins have a harder time feeding their chicks, says Fraser.

    There's trouble for Adélies on land as well. Adélies nest on rocky ground, near the shore. Fraser and his colleagues have studied seven major colonies in the 50 square kilometers around Palmer Station. About a dozen years ago, they noticed that some of the colonies were shrinking much more severely than others, even though the birds were feeding in the same area.

    He wondered whether snowfall was a factor. By comparing snow accumulation at various colonies, Fraser discovered that chicks tended to weigh less in colonies with deeper snow. The colony with the most snow, on Litchfield Island, has fared the worst, declining from roughly 1000 breeding pairs in 1975 to zero last year.

    To confirm his suspicion, Fraser and his colleagues erected a fence near a northeast-facing colony where winds normally blow snow away, causing snow to accumulate on half the nests. According to unpublished results, chicks behind the snow fence weighed up to 15% less than chicks from nests a few meters away with no snow. The explanation, Fraser says, is that Adélies only lay eggs on dry ground. Penguins in areas with more snow wait longer to breed, and their later-hatching chicks miss the 2 weeks when krill are most abundant. The chicks gain less weight and are much less likely to survive the next winter.

    He points out that although Adélie populations have fluctuated over millennia, the current decline is unprecedented. Within a decade, there may be no more Adélies within 200 kilometers of Palmer Station.

    This doomsday prediction doesn't tell the whole story, however. As Adélie penguins lose ground, other species are thriving. Species that prefer open ocean used to be limited to the north and east parts of the peninsula, where the ocean didn't freeze during the winter. Now, with ever more open water, these species are expanding their ranges.

    In the past decade, Palmer Station has seen a huge proliferation of southern fur seals and southern elephant seals—species that were present only as small colonies in the 1990s. In one case, a population of six seals now numbers 5000. The presence of these species suggests that a sub-Antarctic ecosystem is replacing the polar ecosystem of Adélies and silverfish.

    But as this ecosystem moves southward, so too does the “polar” world, and that's good news for the overall survival of Adélie penguins. Some 400 kilometers south of Palmer Station, the populations of Adélies in Marguerite Bay have tripled since the 1950s. Just as Adélies don't like a lack of ice, they also dislike a surfeit—the greater expanse makes it strenuous to reach open water for foraging. The warming climate and reduced sea ice are apparently making Marguerite Bay a nicer place for Adélies to live. That's true farther south too, says David Ainley of H. T. Harvey & Associates in San Jose, California, who studies Adélies in the southern Ross Sea. “As ice shelf breaks up, there should be more habitat, and we should be seeing more penguins.”

    Still, Ainley and others caution that it's dicey to predict exactly what will happen as ecosystems continue to respond to climate change. But looking back on the fate of the Adélies he has watched for 3 decades, Fraser offers a warning: “If Antarctica is a model for how ecosystems might change in other parts of the world, the changes will be severe.”

  16. For Extreme Astronomy, Head Due South

    1. Daniel Clery

    Over the past decade, small telescopes in Antarctica have revealed key features of the early universe. Now astronomers are rolling out the big guns

    Some astronomers choose to build telescopes in idyllic locations—atop Mauna Kea in Hawaii, for instance, or in the Canary Islands. Not John Carlstrom. During the just-finished austral summer, his team shipped 270 metric tons of equipment to the South Pole and raced to assemble a new radio telescope with a 10-meter dish before winter shuts down flights and maroons Amundsen-Scott South Pole Station for 9 months. “Everything has to be ready to go,” says Carlstrom, of the University of Chicago in Illinois. “There's only so much you can do in 3 months.”

    To extreme astronomers, Antarctica's unrivaled view of the stars makes the prodigious and risky work to build a scope there worthwhile. Water vapor, the enemy of radio astronomers who tune in to microwave signals, is virtually absent at the bone-dry pole. And microwaves are not the only game in town. Alongside Carlstrom's South Pole Telescope (SPT), a giant neutrino observatory, IceCube, is taking shape.

    They are the vanguard. Surveys on the Antarctic plateau have pinpointed perches with little atmospheric turbulence, ideal for astronomy at infrared wavelengths. “These are the best sites in the world by a big factor,” argues astronomer Edward Kibblewhite of the University of Chicago. As a result, optical astronomers are hatching plans for front-rank observatories across the frozen continent.

    Some astronomers question whether the risk is worth taking. “Almost every common system that we use at our very large telescopes now would fail in the extreme conditions in Antarctica,” says Daniel Fabricant of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts. Today's pioneers have yet to prove that big optical or infrared scopes in Antarctica are feasible, he says. Kibblewhite, for one, is taking up the gauntlet: “The pain of building there is offset by the fantastic capabilities.”

    Cold, high, and dry

    Since the early 1990s, astrophysicists have eagerly pitched camp in Antarctica to study the cosmic microwave background (CMB) radiation, a relic of the early universe when the plasma of electrons and protons coalesced into atoms and the universe became transparent. Over the eons, photons from that primordial fog have cooled to microwave wavelengths. Since the CMB's discovery in 1965, researchers have interrogated it for clues to what the universe was like in those early days and to test various models of the big bang.

    Because water vapor absorbs microwaves, CMB astrophysicists must put their telescopes in space or some other ultradry place. Some flocked to mountaintops or deserts. Others chose Antarctica, where moisture freezes out of the air. Early scopes on the ice looked for wrinkles in the CMB—variations in the radiation's temperature across the sky. The field really took off after the Cosmic Background Explorer satellite charted wrinkles over the whole sky in the early 1990s. The satellite's map revealed an early universe of uneven density, in which denser regions led to galaxy clusters seen today.

    Continuing the work, the Degree Angular Scale Interferometer, based at the South Pole, discovered in 2002 that CMB radiation is slightly polarized, giving a picture of how regions of different densities were moving early on. And BOOMERANG, a CMB telescope flown over Antarctica by balloon in 1998 and 2003, allowed cosmologists to estimate the universe's overall density, leading to the conclusion that spacetime has no overall curvature: The universe is flat.

    The latest CMB scope at the pole is QUaD. It is scrutinizing polarization in an attempt to put limits on certain properties of the early universe, such as how much normal matter there was. But in QUaD's first season in 2005, the experiment almost ground to a halt when it nearly exhausted South Pole station's liquid nitrogen and helium, used to chill the scope's detectors to 0.3 kelvin. “The biggest challenge is having no access [during the winter],” making maintenance difficult, says QUaD astronomer Walter Gear of the University of Cardiff, United Kingdom.

    Crystal-clear vision.

    The new South Pole Telescope, with its heated 10-meter dish, will study the properties of enigmatic dark energy.


    Another recent arrival at the pole is the Background Imaging of Cosmic Extragalactic Polarization (BICEP) telescope. BICEP aims to answer one of the burning questions of cosmology. Big bang theory predicts that the infant universe underwent a rapid expansion known as inflation. It's impossible to peer into the opaque young universe to verify that inflation occurred. But if it did, it would have created a cosmic perfusion of gravitational waves—something absent from other theories. The technology does not exist to detect a gravitational-wave background, but the waves should have left a faint fingerprint in the form of a slight swirl in the CMB's polarization. BICEP is the first scope designed to detect this, says project leader Andrew Lange of the California Institute of Technology in Pasadena.

    To perceive gravity's subtle signature in the early universe, a telescope must peer at one patch of sky for days on end. “At the South Pole, you can stare relentlessly at the target,” Lange says, because the same stars circle the pole at the same elevation. After BICEP's first season last year, the team is honing its detectors and expects to reach the required sensitivity in the next year or two. “We could see something soon,” says Lange.

    Big science arrives

    With the installation of the 10-meter SPT, a behemoth has taken its place beside the much smaller scopes. Although the CMB is also the target of this newcomer, its principal goal is not to study the early universe but rather to probe the nature of dark energy, a mysterious, unseen force that is speeding up the universe's expansion. SPT will study the evolution of galaxy clusters over the universe's history. Because dark energy seems to push everything apart, it inhibits the growth of galaxy clusters, so studying how clusters have developed should reveal something about dark energy. Rather than observe clusters directly, SPT will look for the imprint they have left on the CMB as it has wafted through space.

    Building the delicate instrument at the pole was no mean feat. All moving parts must be sheltered for warmth, while the exposed dish's aluminum panels are kept ice-free with electric heaters. It took three field seasons to assemble the scope; after finishing its construction in January, most SPT crew members flew home and will control the scope from the relative comfort of Chicago. Three colleagues will winter at the pole to ensure SPT runs smoothly. The scope saw first light on 16 February, and Carlstrom expects initial science results this austral winter.

    SPT's completion was a sprint compared to IceCube, one of two major rivals in observing neutrinos from deep space. These particles are born in the hearts of stars and cosmic calamities such as supernovae and gamma ray bursts. They are chargeless, nearly massless, and race by at close to light speed. Space is teeming with them. But they rarely interact with normal matter. Billions pass right through your body every second without ever interacting.

    Researchers detect the ghostly particles by putting a large volume of water under surveillance. After a neutrino strikes a nucleus, the streaking subatomic shards produce a flash of light that spreads in a cone shape. The cone's orientation reveals the direction the neutrino came from, making it possible to retrace the neutrino's path—perhaps all the way back to the cosmic event that spawned it.

    Researchers estimate that roughly a cubic kilometer of water is necessary to get a fix on neutrino sources. A European team plans to use the Mediterranean as an instrument by floating strings of detectors anchored to the sea floor. Their U.S. counterparts are using ice rather than water, boring into the Antarctic plateau with a hot water drill and then lowering strings of detectors into the holes, which will fill with water and freeze. It's a mammoth undertaking. Each borehole is 2.5 kilometers deep, takes 48 hours to drill, and creates 750,000 liters of water. IceCube's crew members have been honing their techniques: Two years ago, they dug a single hole; in 2005–06, they managed eight; and this season, the team sank 13. The target is now 14 holes per season; with at least 70 planned, installation has a few years to go.

    Other astronomers, inspired by the groundbreaking polar work, are hoping to get in on the action. Three years ago, Michael Burton and his colleagues at the University of New South Wales in Sydney, Australia, used an automated test scope to survey Dome C, a bulge on the plateau that's home to the French-Italian Concordia station. There they found “uniquely stable conditions,” Burton says, “two times better than any temperate latitude site.” The key is scant turbulence above about 30 meters that can be corrected using adaptive optics, Kibblewhite says.

    As part of the International Polar Year, the Australian team in 2007–08 will set up another test scope at Dome A, where China has ambitious plans for astronomy (see p. 1516). The Australians' long-term goal is to erect a 2-meter telescope called PILOT at Concordia. “Big enough to do interesting science, but not too expensive,” Burton says. PILOT's images, he predicts, will be similar in quality to those of the Hubble Space Telescope. Kibblewhite and U.S. collaborators have grander plans: a 15-meter telescope at one of the domes, with a mirror made of many small segments to reduce weight and cost. Planning is in the early stages.

    Kibblewhite believes that with astronomy's history of international collaboration, it won't be long until nations work together to build large observatories in Antarctica. If so, they will owe a debt to Carlstrom and other polar astronomy pioneers.

  17. Race to Plumb the Frigid Depths

    1. Kevin Krajick*
    1. Kevin Krajick is a writer in New York City.

    In the Arctic Ocean, research fueled by national claims could reveal past climates, unknown life forms—and vast natural resources

    By all appearances, Trine Dahl-Jensen and Ruth Jackson have transcended national boundaries in the name of science. Working for the geological surveys of Denmark and Canada, respectively, the geophysicists are mapping the structure of undersea rocks hundreds of kilometers north of Greenland and Canada's Ellesmere Island. In this forbidding region, habitual convergences of winds and currents force ice floes into solid jumbles 100 meters thick. Polar bears, whales, and even icebreakers are frozen out. Most knowledge of the depths comes from a few sounding tracks made by Cold War subs. To gather data, Dahl-Jensen and Jackson land by helicopter, set off explosives, collect echoes from the bottom, and then scramble back to the Canadian military base of Alert, humanity's northernmost toehold on land.

    Actually, national interests are their reason for being here. The five nations bordering the Arctic Ocean are in an underwater land rush to legally divvy up much of the sea bottom, based on its geology. In this particular region, Canada and Denmark have teamed up to cut costs. Other scientists are hunting for oil and minerals or seeking sea-floor records that might suggest how fast global warming could peel back Arctic ice. Denmark and Canada have budgeted $80 million for mapping over several years. Jackson compares it to the Americans' 1867 purchase of Alaska from Russia. “It might not have seemed too useful at the time,” she says, “but give it another 30 or 100 years.”

    Similar nationally funded projects have already sparked some nasty disputes, but the money is sloshing over into the International Polar Year (IPY), where it might foster cooperation. Researchers say that once the politics are sorted out, the same data can clarify murky geological, ecological, and climate questions, to the benefit of all. Territorial claims “are giving us an opportunity to work in unknown areas of the world that we would never get to otherwise,” says Dahl-Jensen. “There are all kinds of scientific spinoffs.”

    Undersea land grab

    Studying the Arctic sea floor, never mind laying claim to it, once seemed impossible. The heavy ice cover on the 14-million-square-kilometer northern ocean has long kept even basic bathymetry vague. In the 1990s, the U.S. Navy declassified a smattering of sub soundings and invited civilian scientists on cruises. The Russians have prowled Arctic waters secretly for decades and still keep much of their data off-limits. Only during the last decade have cruises by Canadian, German, and Swedish breakers opened a window to sea-floor geology, and it was not until 2001 that the U.S. Coast Guard commissioned its first Arctic science icebreaker, the Healy. Its inaugural cruise to the Gakkel Ridge, in the deep central Arctic, revealed rumbling volcanoes and apparent hydrothermal vents where conventional tectonic theory predicted there would be none (see sidebar, p. 1527).

    Boom with a view.

    A drilling rig in the Beaufort Sea. Warming temperatures should make it easier to extract oil and gas from the High Arctic.


    Such discoveries herald resources that might soon become accessible. Undersea vents commonly precipitate metals, including copper, gold, and silver. With undersea mining in its infancy, these deposits might not be tapped soon, but other resources could become available, says James Hein, a senior researcher at the U.S. Geological Survey (USGS), who tracks Arctic minerals. These might include placer deposits of diamonds or gold washed by rivers onto shallow continental shelves from Canada or Siberia. Dramatic summer melting in the last decade has opened shipping channels, and companies have begun prospecting. “No one is going to know what they've found until they're ready to move,” says Hein.

    The big prize, almost certainly, is hydrocarbons. In 2000, USGS estimated that perhaps a quarter of the world's undiscovered hydrocarbons—an estimated 3.1 trillion to 11 trillion barrels of oil—lies in the Arctic. USGS research geologist Donald Gautier is co-heading a new circumpolar hydrocarbon survey. When it comes out in 2008, “it is logical to think” the estimate will go up, he says. Along with known oil and gas reserves already mapped in near-shore zones, such as Alaska's Prudhoe Bay, deposits may be locked up in gas hydrates farther out. One speculative map from the Geological Survey of Canada forecasts hydrate fields extending to the pole. Although all wells currently hug the shore, many companies are developing hardware to expand deep into ice-covered waters, says Graham Thomas, chief of cold-regions technology for the oil company BP.

    Geologic information is central to the territorial claims themselves. Under the U.N. Convention on the Law of the Sea, the United States, Russia, Norway, Canada, and Denmark (which administers Greenland) may claim underwater rights beyond their 200-nautical-mile economic zones via any submerged “natural prolongation” of their landmasses. The U.N. rules are based on formulas that take into account the contour line where water depth reaches 2500 meters, along with details of seabed geology, including measurements of sediments that may have eroded off continental shelves. Most of the Arctic Ocean is ringed with expansive shallow continental shelves and possibly related topographic features, so nearly the whole ocean may someday be claimed, except for a couple of small doughnut holes way out in the middle. The United States, which has yet to ratify the convention, has been mapping the seabed since 2003; it could gain some 600,000 square kilometers of Alaskan shelf, worth $650 billion if present oil estimates hold up.

    The really dicey part is mapping the deep, central Arctic, which is crisscrossed by a half-dozen huge underwater mountain ranges that may or may not be connected to certain landmasses, depending on whose scientists you listen to. One major feature is the 1800-kilometer-long Lomonosov Ridge, which runs from above the central Siberian continental shelf through the North Pole, to above Greenland and Ellesmere. Most scientists agree that it peeled off from the shelf of what is now Russia and Scandinavia some 60 million years ago. One end stuck near Siberia while the rest pivoted outward, like a splinter being pried off a log. Based on this history and proximity, the Russians say it is theirs, up to the pole. On the other side of the pole, Jackson and Dahl-Jensen are building the case for Canada and Greenland. If they succeed, Denmark stands to gain up to 180,000 square kilometers—four times the size of Denmark itself.

    By measuring seismic waves passing through the sea floor, the scientists hope to establish that the rocks under the Lomonosov are deep-seated and of low density, and thus probably of continental origin. They are also mapping sediments and bathymetry in an effort to show that their lands and the Lomonosov may be linked. Never mind where the ridge came from, says Jackson. “The question is what it's attached to now.” Under a 2005 agreement, their findings are secret, but the scientists make no bones about their governments' aim: With the ridge neatly straddling a midline above Greenland and Ellesmere, although not actually connecting to either, the nations aim to divide it up.

    Some experts don't buy this argument. Lawrence Lawver, a marine geophysicist at the University of Texas, Austin, says such claims “are based more on desire than geology.” It's hard to argue that the Lomonosov is part of either Greenland or Canada, he says: “It's just moving past as they wave hello.”

    Lawver and other Western researchers reserve harsher words for their Russian colleagues. Along with half the Lomonosov, Russia claims much of the vast, deep Alpha-Mendeleyev ridge system and parts of the adjoining Amerasian basin, which together stretch from eastern Siberia to central North America. Added to everything else they want, this would give Russia half the ocean bed. Their arguments include Siberia's proximity to one end of the Mendeleyev, dredged-up continental rocks eroded from the ridges, and magnetic signatures and other geophysical measurements that they claim show that Eurasian-type continental crust underlies both the ridge and parts of the adjoining basin. Viktor Poselov, deputy director of the Gramberg Research Institute for Geology and Mineral Resources of the World Ocean in St. Petersburg, invokes a theory of “vertical tectonics.” In this scenario, much of what is now the Arctic sea floor is continental crust that sank as a block and became “oceanized,” Russian documents assert.

    Cold rush.

    The five nations bordering the Arctic Ocean are using geology to lay claim to large chunks of sea bottom—and the virgin mineral and hydrocarbon assets buried beneath.


    It is hard to find a non-Russian who agrees. In 2001, Russia submitted its proposed claim to the U.N.'s scientific Commission on the Limits of the Continental Shelf—still the only formal proposal by any country—and it was quickly sent back for more work. North American and European scientists assert that the geophysics is at best ambiguous. Most say that any continental rocks from the bottom are migrants swept out by ice from present-day continents—probably North America. They variously propose the Alpha-Mendeleyev system to be a product of hot-spot activity, midocean spreading, or subduction—all purely oceanic processes not tied to Eurasia.

    Arthur Grantz, a retired USGS polar expert, calls the idea that buoyant crust could somehow sink into the abyss “kind of nutty.” He and others contend it is a notion peculiar to Russia—and old-fashioned even there—and long discredited by plate tectonics. “I understand where they're coming from, though,” Grantz says. “They're under great pressure. Their government gave them a lot of money, and it expects them to come up with a certain result.” Lawver, Dahl-Jensen, and others agree. Geologist Ron MacNab, a member of the Canadian Polar Commission, labels Russia's conclusions “embarrassing” and “Stalinist.”

    Russian scientists have shot back. An official summary of a 2003 conference organized by their Ministry of Natural Resources calls the criticisms “presumptuous” and “strange.” Because there is so little credible data, the Arctic sea floor's history is largely a mystery, admits MacNab—and scientists are unlikely to figure it out if they don't stop bickering. “We're developing a train wreck in the Arctic unless we get together on this,” he says.

    Curious cores

    If the sea ice keeps melting at its current rate, the confrontation over Arctic territory will intensify; yet most scientists are loath to predict what the climate will do, because it is difficult to disentangle the effects of human-made greenhouse gases from natural cycles of warmth and cold. For this, bottom cores recording past sea-ice cover, water temperatures, and movements of glacial sediments may provide the most telling evidence.

    The first deep-sea Arctic sediment cores of great age—still the only such samples—were brought up by the Integrated Ocean Drilling Program (IODP) in 2004, 320 kilometers from the North Pole. Detailed analyses published in Nature and Geophysical Research Letters last year portray the Arctic of 45 million to 55 million years ago as a landlocked, scummy pond. Sea temperatures were warm enough to support thick layers of ferns and algae. Younger layers show long-term cooling, with more cold-water plankton and superthick glaciations or sea ice scouring the bottom. Subsequent warm periods occurred, but the timing, amplitudes, causes, and possible interactions of warm-cold cycles are perplexing. Most other cores gathered so far go back only a few hundred thousand years. Even IODP's cores are missing a giant chunk of time—sediments from 43 million to 18 million years before present—because they were apparently uplifted and eroded away in some as-yet-unidentified event.

    Efforts are under way to beef up the records. Last year, the Healy and the Swedish icebreaker Oden teamed up in the central Arctic Ocean to pull sea-floor cores that researchers hope will cover the last million years. Leonid Polyak, a marine geologist at Ohio State University in Columbus who participated, says some cores exhibit up to 80 cycles of apparent glacial melting, indicated by alternating bands of different-colored grains. Polyak says it appears that the bands come at intervals of 20,000 years, suggesting that they represent fluctuations in Earth's orbit; however, he is unsure, because the cores have not yet been well dated. His colleague Dennis Darby, a paleoclimatologist at Old Dominion University in Norfolk, Virginia, says one core from 1300 kilometers north of Alaska shows a separate 240-year freeze-thaw cycle, written in sediments scraped from continental shelves by sea ice. Darby believes this periodicity must be connected to some ocean-circulatory pattern that presumably still exists but has not yet been noted in modern times. He, Polyak, and others presented preliminary findings at the American Geophysical Union (AGU) meeting in San Francisco, California, last December.

    Next August, during IPY, a European consortium hopes to recover cores from the Fram Strait, which runs between Greenland and Norway, connecting the Arctic Ocean to the North Atlantic. Here they hope to find records tracing sea ice and currents decade by decade during historical time. This would fill an important gap. Although researchers have good decadal climate records from glacial ice cores, they lack comparable data from the sea because sediment accumulates too slowly in most of the Arctic.

    Researchers identified the Fram site last October as a place where currents concentrate sediments faster. They hope to cover the Medieval Warm Period, when melting may have rivaled today's. “The extent of short-term ocean variability during the past, especially in warm periods, is practically unknown,” says paleoceanographer Robert Spielhagen of the Leibniz Institute of Marine Sciences in Kiel, Germany, who is involved in the coring. “If scientists or policymakers want to understand our present situation, we have to have those records.”

    Another open question is the extent to which long-term tectonic rearrangement of the landmasses around the heavily landlocked Arctic Ocean has influenced climate. Before the drift of landmasses opened the Fram Strait, the region was more closed in. Slower circulation, or no circulation, could be one reason it was once so warm, says geologist Martin Jakobsson of Stockholm University. Estimates of when the strait formed vary; recent research says 15 million years ago. In any case, the circulation may have helped vent heat to the Atlantic, pushing the Arctic into the deep freeze we see today, Jakobsson says. Understanding the Fram's history would help us predict what will happen if the circulation begins to change again, he says.

    Other events, including sea-level fluctuations and periodic blockages or openings of the shallow Bering Strait between the Arctic and the Pacific, may also have influenced climate. And entirely unexpected events may have been critical. At the December AGU meeting, a group of top Arctic researchers including Bernard Coakley, a geophysicist at the University of Alaska, Fairbanks, proposed that a previously undetected 200-by-600-kilometer meteorite crater lies at the bottom of the central Arctic Ocean. Such a massive event, which they say may have taken place more than 800,000 years ago, surely would have disrupted climate signals. But so far the group has only a hypothesis, based on overturned bottom sediments found in cores and unusual nickel spherules found on some of Canada's northernmost islands. “Climate change is a very alarming issue, but anyone who says we really understand it—what are they basing this on?” says Jakobsson.

    Transcending borders

    It's hoped that IPY will soften rhetoric and get nations working together. At least a half-dozen Arctic cruises are planned for the Polar Year, all of which will include scientists from several nations. Russia has submitted three proposals to resurvey major ridges and basins using seismic reflection, bottom sampling, gravity measurements, and other methods. In an abstract, Poselov acknowledges that existing data on areas Russia is claiming are “subject to controversial interpretations” and proposes that Russia, Sweden, Germany, and the United States pool their icebreakers, possibly in 2008, to study the Alpha-Mendeleyev ridges and other remote areas. “It's quite possible something will work out,” says Coakley, who is helping pull together various efforts.

    Germany already plans to send its icebreaker, the Polarstern, to the Alpha in 2008. Studies would include identifying sites for future deep drilling—not attempted in the Arctic since the initial IODP foray in 2004. Scientists at the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven are already talking about the drilling itself—again, via a multiship expedition—although this would come after IPY.

    Large-scale cooperation is needed both for science and safety. Pressure from constantly shifting pack ice makes it nearly impossible for an icebreaker to remain stationary long enough to drill; the 2004 deep cores came up only with triple teamwork: One vessel drilled while two others broke up approaching floes. And an icebreaker has never penetrated the region north of Greenland and Ellesmere, where Jackson and Dahl-Jensen are working. This summer, as part of IPY, a European group chaired by Jakobsson hopes to get there in the Oden to do geophysical, geological, and paleoclimate work; the Swedish government is negotiating to charter a Russian nuclear-powered icebreaker to make sure the diesel-powered Oden does not get trapped. Even so, says Jakobsson, there is no guarantee they will make it in.

    The treachery of the ice became clear last August, when the Healy stopped 800 kilometers northwest of Barrow and two U.S. Coast Guard officers descended through a crack in the ice on a practice dive. After they failed to surface, comrades hauled them up on a line from an estimated depth of 60 meters. They were dead. The Coast Guard is investigating what went wrong—still a mystery. Pending reviews of safety protocols, the Healy's remaining 2006 cruises were canceled.

    Some say simpler is better. Since 2003, groups from Columbia University, the University of Hawaii, and other institutions have been developing seismic-sounding buoys to be set adrift in the ice, where they can take advantage of vigorous, predictable currents to sweep large areas without a need for icebreakers. Plans to deploy a large number for IPY have slipped, but a few should be put in next year, and perhaps 100 by 2009, says Yngve Kristoffersen, a marine geoscientist at the University of Bergen in Norway. For setting them out, he is pushing for a new mode of transport: hovercraft.

    The cause has been taken up by John K. Hall, an American marine geophysicist who recently retired from the Geological Survey of Israel. Hall became a convert after a 2005 trip through the central Arctic on the Healy, during which, he says, ice-topography measurements showed that 85% of the voyage could have been made by hovercraft, at a tiny fraction of the cost. He has committed money inherited from his grandparents—the makers of Chiclets gum—to buy a small, well-heated vessel that will float up to 73 centimeters off the ice. Hall plans to test the vehicle around the Norwegian island of Svalbard next year. He does not wish to fly the flag of any nation. He jokes: “Well, maybe just a black one, with skull and crossbones.”

  18. Thriving Arctic Bottom Dwellers Could Get Strangled by Warming

    1. Kevin Krajick*
    1. Kevin Krajick is a writer in New York City.
    Deep impact.

    Experts debate how well Arctic benthic communities will weather warming.


    Ten years ago, biologists skirting Canada's mainland Arctic coast on an icebreaker lowered a video camera to the bottom and got a surprise. Instead of the desolation they expected below ice-covered waters, there was a crowd. Slender brittle stars elbowed each other; fish glided by; anemones writhed under the camera's bright light. This wonderland could be jeopardized by climate change. “We don't know until it happens, but if you have no ice, you probably have no typical Arctic fauna,” says Julian Gutt, a marine ecologist at the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven, Germany.

    The Arctic bottom fauna, or benthos, is surprisingly rich in species, abundance, and ecological significance. Of the northern ocean's 5000 known marine invertebrates, 90% live on the bottom. In shallow waters, they form the basic diet of many topside creatures including seabirds, walruses, bearded seals, and bowhead whales. Although many of the tiny creatures are migrants from North Atlantic waters, up to 20% are Arctic endemics.


    The bounty exists because of the cold, not in spite of it. During the brief summer warmth, ice algae and cold-water plankton explode into life. In warmer waters, such simple organisms are devoured by zooplankton, which are devoured by predators, and so on up the food chain; thus nutrients stay in the water column. But in icy Arctic water, zooplankton do not grow fast enough to consume the sudden rushes of plant life. As a result, much of it sinks to the bottom, where the creatures living there consume it. For this reason, the benthos “can have production that is actually greater than in the tropics,” says Bodil Bluhm, a benthic ecologist at the University of Alaska, Fairbanks (UAF).

    Many biologists hypothesize that climate change could hurt the Arctic benthos and the large creatures that live off it by wiping out ice (and hence ice algae), lengthening growing seasons for zooplankton, and giving warm-water species a foothold. “The way the system works now is very much in favor of the benthos,” says UAF polar ecologist Rolf Gradinger. “If the system changes, things could go downhill fast.”

    A preview might come from the Bering Sea, between Russia and Alaska. There, higher water temperatures and pullbacks in seasonal ice have progressed fast in recent decades. Oxygen uptake in sediments (an indicator of carbon supply to living things) has dropped by two-thirds, and populations of benthic creatures such as mussels have declined by half. Diving ducks, walruses, and gray whales are moving away, while pollock and other southern pelagic fish are streaming in (Science, 10 March 2006, p. 1461).

    Preliminary evidence suggests that higher temperatures may be starting to have similar effects in the more northerly Barents and Laptev seas, off Scandinavia and Siberia, says Dieter Piepenburg, a marine biologist at the University of Kiel in Germany. Piepenburg, who wrote a 2005 review on Arctic benthos in Polar Biology, says it remains to be seen whether this would spell the end. He says that Arctic benthic organisms have probably already weathered not only warm cycles but also cold ones so extreme that deep ice sheets repeatedly scoured bottoms clean of life far out to sea. Piepenburg thinks the organisms may have migrated to deep waters and then recolonized when the coast was clear.

    Those deep waters may also contain more life than previously believed. In 2001, U.S. researchers over the remote Gakkel spreading ridge detected chemical plumes indicating hydrothermal vents—which feed biological hot spots in other parts of the world—but were unable to locate a source. Indeed, no vents have yet been found anywhere in the Arctic, but as part of the International Polar Year (IPY), U.S. researchers in July will return to the Gakkel and deploy new under-ice autonomous vehicles to hunt down and sample the chemical plumes. If they find vents and vent creatures, the organisms may well be unique, because the narrow straits connecting the Arctic to other oceans are too shallow to allow movement of deep-sea creatures and thus mingling of genes.

    Researchers are bound to discover many polar organisms, especially in deep places like this, says Gradinger, who is leading the Arctic Ocean inventory for the worldwide Census of Marine Life. The deep basins are mostly unexplored, he says, and many small creatures that live buried in sediments even in shallow areas have yet to be glimpsed. IPY may help change this; within its framework, Gradinger counts 20 biological collecting projects slated so far.