News this Week

Science  18 Apr 2008:
Vol. 320, Issue 5874, pp. 300
  1. CLIMATE CHANGE

    IPCC Tunes Up for Its Next Report Aiming for Better, Timely Results

    1. Eli Kintisch

    The international team of climate change scientists that produced an influential series of reports last year—and won the 2007 Nobel Peace Prize—will be doing things a little differently in the future. Government delegates to the Intergovernmental Panel on Climate Change (IPCC), meeting last week in Budapest, Hungary, approved a plan for the 20-year-old, 100-nation enterprise that would generate more precise and relevant information on climate change—without taking any longer than the current 6-year gap between reports. To do so, the delegates endorsed procedural changes that scientists had proposed to streamline the process.

    Persuaded by successive IPCC reports that human-induced climate change is real, governments now want more information on what its impacts will be and how the world might begin to curb emissions of greenhouse gases. But scientists say the process used to generate the previous four reports can't deliver the additional detail and greater certainty that policymakers crave in the same 6-year time frame. Rejecting a proposal for an interim report in 4 years and a final report in 8 years, the delegates instead agreed to modify the process itself to achieve the desired results.

    The first change would ditch the practice of prescribing the scenarios that researchers should incorporate into their modeling, the first step in the process; those scenarios lay out the trajectories of economic and technological progress that drive future greenhouse gas emissions. Delegates also backed the idea of having the communities that correspond to the panel's three working groups on the science of climate change, its impacts, and mitigation strategies develop their studies in parallel rather than sequentially. Scientists say these changes will reduce the level of uncertainty in their findings, deliver more regional details, and provide policymakers with better clues on how to curb climate change—without lengthening the time from start to finish. The new regime “will expedite and improve the process,” says Richard Moss of the World Wildlife Fund, who helped coordinate the effort for IPCC.

    “We're going along with the community,” says Harlan Watson, head of the U.S. delegation, explaining that the new procedures will “let the science drive the process.” Scientists who feared more drastic measures seem pleased with the tweaks to the system. “The overarching sense was if it ain't broke don't fix it,” says ecosystem modeler Kathy Hibbard of the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, who attended as an observer. The new structure lays out “a rigorous and deliberate approach, taking the time required,” says NOAA atmospheric chemist and U.S. delegate Susan Solomon.

    Homing in.

    Scientists hope to improve the geographic resolution of global climate models, shown here in old (left) and new models from the National Center for Atmospheric Research.

    CREDIT: UCAR

    The 40 emissions scenarios used in modeling studies cited by last year's IPCC reports employed outdated assumptions about population growth, energy use, and emerging emissions-reducing strategies, experts say. “For example, nobody thought seriously about storing carbon under the ground,” says Tom Kram of the Netherlands Environmental Assessment Agency. To correct that problem, IPCC sponsored a series of workshops over the past 2 years in which the community chose four scenarios from published papers that span a range of possible economic and technological futures. The most carbon-intensive scenario foresees population and economic growth driving atmospheric carbon dioxide levels to 1370 parts per million (ppm) by 2100. That's more than three times the current level of 380. The rosiest scenario has them cresting at 490 ppm before declining.

    Starting with only four pathways should actually generate better data, according to researchers. With fewer total emissions scenarios to model, scientists will be able to do more runs on the same scenarios. That allows so-called ensembles in which models are repeatedly run with nearly identical starting conditions, yielding trends and reducing uncertainty. Delegates in Budapest also endorsed running climate models out to 2035 in addition to continuing the century-long projections. The computing power saved by modeling shorter time periods will allow scientists to get results on smaller geographic grids, giving more local details. “The focus will be on regional climate change and extremes,” says NCAR's Gerald Meehl.

    The new process is expected to foster greater collaboration, says Hibbard, describing the previous regime as one of “toss your data over the fence and see ya later.” For example, climate modelers would like to know how reforestation might affect Earth's albedo this century. But the success of future reforestation efforts depends on technological and policy changes forecast by economic modelers. Similarly, climate modeling could forecast damage by drought to agriculture, which scenario builders could later use to revise the scenarios that underpin the whole enterprise.

    The panel will meet again in September in Geneva to choose its leaders—current IPCC chair Rajendra Pachauri wants to stay on and enjoys widespread support—and the heads of the three working groups. In the meantime, scientists seem to be happy with their new marching orders. “This whole thing is turning on new steam,” says Hibbard, who has already e-mailed details of the IPCC's decisions to a modeling team at NCAR.

  2. GLACIOLOGY

    Greenland Ice Slipping Away but Not All That Quickly

    1. Richard A. Kerr

    Almost 6 years ago, a paper in Science warned of an unheralded environmental peril. Melted snow and ice seemed to be reaching the base of the great Greenland ice sheet, lubricating it and accelerating the sheet's slide toward oblivion in the sea, where it was raising sea level worldwide (12 July 2002, p. 218).

    Down the hatch.

    Meltwater pouring into moulins like this one can lubricate the ice sheet's base, but the ice's resulting acceleration into the sea is modest.

    CREDIT: I. JOUGHIN, PSC/APL UNIVERSITY OF WASHINGTON

    Now a two-pronged study—both broader and more focused than the one that sounded the alarm—has confirmed that meltwater reaches the ice sheet's base and does indeed speed the ice's seaward flow. The good news is that the process is more leisurely than many climate scientists had feared. “Is it, ‘Run for the hills, the ice sheet is falling in the ocean'?” asks glaciologist Richard Alley of Pennsylvania State University in State College. “No. It matters, but it's not huge.” The finding should ease concerns that Greenland ice could raise sea level a disastrous meter or more by the end of the century. Experts remain concerned, however, because meltwater doesn't explain why Greenland's rivers of ice have recently surged forward (Science, 24 March 2006, p. 1698).

    The original paper described work by glaciologist Jay Zwally of NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland, and colleagues. They noted that whenever summer warmth swelled, producing more meltwater on the ice sheet, the ice beneath Swiss Camp—70 kilometers inland of the west coast—sped up as much as 28% on its 115-meter-per-year creep toward the Davis Strait. Presumably, the meltwater disappearing down tubular chasms called moulins somehow reached the ice sheet's base a kilometer down and slicked it up, letting the ice take off, if only for a couple of months.

    But no one was sure how meltwater managed the feat or whether the ice acceleration at Swiss Camp continued all the way to the sea. So glaciologists Ian Joughin of the University of Washington's Applied Physics Laboratory in Seattle, Sarah Das of Woods Hole Oceanographic Institution in Massachusetts, and their colleagues took a dual approach to meltwater lubrication, as they report in two papers published online this week in Science (www.sciencemag.org/cgi/rapidpdf/1153288.pdf). They took a close look at how lubrication works by instrumenting a growing puddle of meltwater south of Swiss Camp. For the broad view, they went to images made from satellite-borne radar that tracked ice sheet movement every 24 days across a 425-kilometer-by-100-kilometer swath of the west coast, including Swiss Camp.

    The meltwater monitoring caught a 4-kilometer-long, 8-meter-deep lake disappearing into the ice in an hour and a half. As theorists had supposed, once the lake water was deep enough, its weight began to wedge open existing cracks, which only increased the weight of overlying water on the crack tip and accelerated cracking downward. Once the main crack reached the bottom of the ice, heat from churning water flow melted out parts of the fracture, and drainage took off. The lake disappeared in about 1.4 hours at an average rate of 8700 cubic meters per second, exceeding the average flow over Niagara Falls. That's almost four Olympic pools a second.
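
    Those figures are easy to sanity-check with rough arithmetic. A minimal sketch, assuming a nominal 2,500-cubic-meter Olympic pool (50 by 25 by 2 meters) and an average flow over Niagara Falls of very roughly 2,400 cubic meters per second; both are illustrative assumptions, not numbers from the papers:

        # Back-of-the-envelope check of the drainage figures quoted above.
        DRAIN_RATE = 8700.0        # m^3/s, average drainage rate reported
        POOL = 50 * 25 * 2.0       # m^3, nominal Olympic pool (~2500 m^3)
        NIAGARA = 2400.0           # m^3/s, rough average flow over the falls

        print(DRAIN_RATE / POOL)          # ~3.5 pools per second ("almost four")
        print(DRAIN_RATE > NIAGARA)       # True: drainage exceeded Niagara's flow
        print(DRAIN_RATE * 1.4 * 3600)    # ~4.4e7 m^3 drained in about 1.4 hours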

    For all the lake's water dumped under the ice that day and all the water drained into new moulins in the following weeks, the ice sheet moved only an extra half-meter near the drained lake. Das and her colleagues conclude that an unknown number of lake drainages and new moulins account for the acceleration reported by Zwally and now confirmed more broadly by the radar observations. Joughin and his colleagues report that in August 2006 the ice sheet sped up over a broad area by a hefty 48% above its 76-meter-per-year mean speed as melt lakes grew—and disappeared—under summer warmth.

    The good news came toward the coast, where the ice speeds up as the flow narrows into a few outlet glaciers that deliver the ice to the sea. Those glaciers moved only 9% faster than normal in August of 2006. “Meltwater does indeed cause substantial speedup” inland on the ice sheet, says Joughin, “but it has a small effect on outlet glaciers.” That may be because the beds on which outlet glaciers slide are already smooth and well lubricated year-round, the group speculates. All in all, meltwater lubrication “likely will have a substantive but not catastrophic effect on the Greenland Ice Sheet's future evolution,” the group writes.

    Alley agrees. “Could things go two times faster [due to meltwater] than we thought 10 years ago?” he asks. “Yes. They can go faster but not ridiculously faster.” The danger now, warns glaciologist Robert Bindschadler of GSFC, is “falling into the same ‘We now know how ice sheets work' trap that my generation was guilty of 5 years ago.” After all, some of Greenland's outlet glaciers began galloping to the sea in recent years. If not meltwater, what set them off?

  3. ARCHAEOLOGY

    Team Unveils Mideast Archaeology Peace Plan

    1. John Bohannon

    Last week in Jerusalem, a small team of Americans, Israelis, and Palestinians presented a peace plan for the Holy Land's archaeological riches. After secretly meeting in different countries over the past 5 years, the eight archaeologists offered their view on the fate of thousands of artifacts and sacred sites. Their aim is to remove the divisive issue from negotiations over Palestine's future as an independent state.

    But whether their hard work will pay off is anyone's guess. “I am doubtful that an unofficial document drawn up by some well-meaning archaeologists will make any difference,” says Patrick Daly, an archaeologist at the Asia Research Institute in Singapore who has been visiting faculty at An-Najah National University in Nablus in the West Bank.

    At the heart of the controversy is the question of what should be done with material removed from the Palestinian West Bank territory since the 1967 Arab-Israeli War. During decades of settlement building, the Israel Antiquities Authority (IAA) has uncovered and removed artifacts including coins from the Crusades and stone tools from the Paleolithic. When—and if—a Palestinian state is created, the question will become whether some or all of those objects, most now stored in Israeli museums and warehouses, will be repatriated. Another contentious issue is who will maintain and control access to religious sites on either side of the border, particularly in and around Jerusalem, a city claimed by both sides. These issues are a “major hurdle for peace,” says Ran Boytner, an Israeli-born archaeologist at the University of California, Los Angeles (UCLA).

    Waiting for peace.

    The Temple Mount will be at the epicenter of a protective “Heritage Zone” in Jerusalem.

    CREDIT: JOHN BOHANNON/SCIENCE

    So starting 5 years ago, Boytner and Lynn Swartz Dodd, an archaeologist at the University of Southern California (USC) in Los Angeles, quietly assembled a team of Israeli and Palestinian colleagues. “People who participated did so at great risk,” says Boytner. Israeli academics collaborating with Palestinians, and vice versa, are often viewed as traitors, he says, and losing one's job—or life—is a real possibility. Most of the team made their names public last week. But one of the three Israelis and one of the three Palestinians remain anonymous.

    The meetings were initially held in Vienna—“neutral” ground, says Dodd—then in Southampton, U.K., and finally in Jerusalem. Expenses were covered by a $150,000 fund created by USC, UCLA, the Washington, D.C.-based United States Institute of Peace, and other donors. Despite often tense negotiations—professional facilitators were brought in several times to keep the process going—the team grew intimate. “Our meetings usually included at least one recent-pictures-of-kids swap,” Dodd says. “It's probably no accident that all participants are parents who are thinking toward their children's futures.”

    The first challenge was to account for “tens of thousands of artifacts” and nearly 6000 sites, says David Ilan, one of the Israeli participants who directs the Nelson Glueck School of Biblical Archaeology in Jerusalem. “Given construction activities in the West Bank, including the building of the separation fence, new [archaeological] sites are discovered there almost daily,” says team member Raphael Greenberg of Tel Aviv University in Israel.

    Records from West Bank excavations were hard to come by. After being rebuffed by IAA, Greenberg sued and won a court injunction to obtain the data. “This filled in many gaps,” says Ilan. Boytner says all the data “will soon be made available to the public.” (IAA declined to comment.)

    The team's plan calls for a protective “Heritage Zone” around the oldest part of Jerusalem, extending to the city's 10th century boundaries. Archaeological sites in the zone would be accessible to anyone, and any research would have to be done with full transparency. The plan also recommends the repatriation of all artifacts found since 1967 to the state in which they were unearthed—essentially a one-way transfer from Israel to Palestine. To house all the material returned to the Palestinian side, new museums and conservation laboratories would be created. Exactly who would construct the facilities is not spelled out, but Katharina Galor, an archaeologist at Brown University who is “not very optimistic” for the plan's future, estimates the cost at “millions if not billions of dollars.”

    About 50 Israeli archaeologists, including IAA officials, showed up on 8 April in Jerusalem to hear the U.S. and Israeli part of the team make their case, says Boytner. (No Palestinians attended.) Ilan was prepared for the worst but says “surprisingly, the overwhelming response was positive and congratulatory. Not a single person spoke against the document.” The consensus was that “this process should continue,” says audience member Hanan Eshel of Bar-Ilan University in Ramat-Gan, Israel. The buzz at the meeting was that the team's anonymous Israeli member is an IAA archaeologist. “We will not comment,” says Boytner.

    A follow-up meeting is being planned for the Israeli side. Among Palestinians, there is broad support but also those who “do not want to involve Israel whatsoever in a future Palestinian state,” says team member Ghattas Sayej, an archaeologist with the Palestinian Association for Cultural Exchange in Ramallah, the West Bank. The effort to convince Palestinian archaeologists to formally ratify the plan is being led by team member Nazmi el-Jubeh, co-director of RIWAQ, an architectural conservation organization in Ramallah.

    And after that? “It's up to our politicians,” says Sayej. “The plan is there.”

  4. INTERNATIONAL AID

    As Food Prices Rise, U.S. Support for Agricultural Centers Wilts

    1. Dennis Normile

    Swamped.

    Potential cuts in U.S. support for research may slow the introduction of new rice varieties such as those tolerant of submergence.

    CREDIT: GENE HETTEL/IRRI

    A recent spike in wholesale and market prices for rice, wheat, and maize has touched off food riots and prompted countries with surpluses to impose restrictions on grain exports. Food importers are in a panic, and relief organizations are warning of a pending calamity. In response, U.S. President George W. Bush earlier this week ordered up $200 million in emergency food aid. Behind the scenes, however, researchers charge that the U.S. government is moving to slash funding for international agricultural research.

    “You couldn't ask for worse timing,” says Robert Zeigler, director general of the International Rice Research Institute (IRRI) in Los Baños, Philippines. “Part of the reason we're having this deterioration of the global agricultural situation is that there has been a steady erosion of support for research.” IRRI has put a freeze on hiring and is holding back on planned research investments until the budget is confirmed.

    Last week, several concerned scientists circulated an online petition seeking to reverse cuts to research funds they say are being planned by the U.S. Agency for International Development (USAID), calling them “unacceptable mistakes that will damage worldwide food production for many years to come.” The group argues that international agricultural research should be expanding. “Restoring [support] isn't really enough; this should be an area of major growth,” says Jeffrey Bennetzen, a plant geneticist at the University of Georgia, Athens, and a petition organizer.

    In 2006, USAID provided about $56 million to a network of 15 centers around the world called the Consultative Group on International Agricultural Research, or about 12% of CGIAR's budget. According to CGIAR spokesperson Fionna Douglas, USAID officials warned in February of a strong likelihood of “a huge cut” in funding this year. CGIAR later learned it could be as much as 75%; the final figure is still being determined. Douglas says they've been told that the 2008 USAID budget includes extensive earmarks requiring funding to be directed primarily to health issues, leaving little for agriculture. USAID officials did not respond to requests for details on the CGIAR funding changes.

    In the meantime, a perfect storm is brewing. Across the developing world, farmland and water for irrigation have been lost to urban development and industrialization. Grains are being diverted to feed livestock to meet rising demand for meat and to make biofuels. Droughts in Asia and Australia have severely curtailed grain production. And productivity has stagnated, says Zeigler, due to cuts in agricultural research in the 1990s.

    The result is a steady rise in grain prices. On 20 March, the U.N.'s World Food Programme issued an appeal for help in covering a $500 million shortfall in its $2.9 billion budget this year to feed 73 million people in 78 countries. In the 3 weeks since, food prices have shot up another 20%. “You could see the train wreck coming for years,” says Zeigler.

    Bennetzen and his colleagues plan to send their petition (www.ipetitions.com/petition/cgair_support), which has garnered more than 600 signatures, to key members of the U.S. Congress and USAID administrators. With recent headlines drawing attention to the food crisis, says Bennetzen, “I don't think it should be difficult to communicate that it's a desperate situation.”

  5. SCIENTIFIC PUBLISHING

    Croatian Editors Fight With Medical School Over Journal's Fate

    1. Gretchen Vogel

    At an international meeting on research integrity last fall, Ana Marušić spoke on the problems facing small journals. Since then, the anatomy professor has many more stories to tell. She and her husband, Matko Marušić, co-editors-in-chief of the Croatian Medical Journal (CMJ), are at the center of a controversy that threatens their jobs and, observers say, the journal's independence and its example of quality scientific publishing in countries outside the scientific mainstream.

    The editors—both professors at the University of Zagreb Medical School, which, along with three other Croatian medical schools, owns the journal—have faced charges of plagiarism and defaming the university. The Marušićs say they are being targeted for their insistence on research ethics and for bringing to light corruption and plagiarism in the Croatian medical community. Their critics, primarily academics at the same school, charge that it's the editors who are behaving unethically and destroying the trust essential for the journal's operation.

    Founded in 1991, CMJ was conceived as a forum for doctors ensnared in the country's civil war to communicate to the outside world. Non-Croatian authors were also welcome. From the beginning, says Ana Marušić, the goal was to educate scientists from developing countries on how to communicate their work better—primarily in English. Available for free online, the journal is listed in major citation indexes. Ana Marušić is currently president of the Council of Science Editors, an international organization of journal editors, and also headed the World Association of Medical Editors. “Considering the size of the country and the resources available, they really hit the ground running,” says Mary Scheetz, an expert in research ethics and scientific publishing at the University of Virginia, Charlottesville. “Their journal became respected.”

    Under fire.

    Croatian Medical Journal editors-in-chief Matko and Ana Marušić worry about their journal's independence and their careers.

    CREDITS: COURTESY OF ANA MARUSIC

    That respect has perhaps contributed to the editors' problems. In Croatia, professors must publish at least five papers in journals indexed in Current Contents to receive promotions. Because CMJ is the only Croatian journal listed in Current Contents, a rejection can thwart careers. In part because of that, says Davor Solter, director emeritus at the Max Planck Institute of Immunobiology in Freiburg, Germany, and a member of CMJ's advisory board, the Marušićs “have a lot of friends outside Croatia and a lot of enemies inside Croatia.”

    The Marušićs say their troubles started in 2001 when the journal rejected a paper by a Zagreb colleague, based on unfavorable reviews. They claim their problems worsened after a September 2006 commentary in the British Medical Journal (BMJ) detailed two examples of plagiarism by Asim Kurjak, a prominent gynecologist at the University of Zagreb. Ana Marušić knew the commentary's author, Iain Chalmers, editor of the James Lind Library in Oxford, U.K., through publishing circles, and she says that Kurjak and others accused the couple of prompting the BMJ article. Chalmers says a chance conversation with a Norwegian researcher prompted him to write it. He says he checked some of the facts with another Croatian scientist but deliberately avoided mentioning it to the Marušićs.

    After pressure from the Croatian press, Nada Čikeš, dean of the Zagreb medical school, referred the Kurjak matter to the school's Court of Honor. In October, it found that Kurjak had behaved unethically, but because he had retired a month earlier, the court did not punish him. About the same time, Čikeš also asked the body to look into plagiarism charges involving Ana Marušić.

    Impact factor.

    Started in war, the CMJ has admirers around the world.

    In May 2006, an anonymous letter to Croatian science ministry and university officials said that significant portions of a 2002 anatomy textbook co-authored by Ana Marušić were identical to passages in an American textbook. This month, the Court of Honor issued a “public warning” to Ana Marušić. It found her “responsible for the fact that the textbook… without providing the source and authors, copied and translated English text.”

    Ana Marušić does not dispute the lack of acknowledgement but says she relied on her publisher to seek the necessary permission; the publisher said it tried several times before deciding the American publishers did not object. “It was definitely a mistake” not to credit the source textbook, she says.

    Matko Marušić has also run afoul of university authorities. Even his supporters say the editor is an intense person whose energy and stubbornness can rankle. Last week, a disciplinary hearing was held to decide whether comments he made to a newspaper about corruption in the Croatian scientific community defamed the university. As part of that process, the committee asked three university psychiatrists for their opinion of Matko Marušić's public comments and his correspondence. One of them told Science that he declined to cooperate, but Matko wrote to the American Psychiatric Association protesting the school's request.

    CMJ's owners, the deans of Croatia's four medical schools, are now considering a proposal by Čikeš to put the journal under their direct control instead of the current eight-member management board. Čikeš has also proposed that CMJ's editor-in-chief be rehired and that anyone who had been reprimanded by his or her university should be disqualified. Čikeš says the changes would bring the journal in line with governance standards recommended by the World Association of Medical Editors. The Marušićs have never been formally evaluated or elected to their positions, she says. She does acknowledge their accomplishments. “I am happy and proud that we have such a good journal,” she says. However, she says, the ongoing disputes have gotten out of hand. “The whole thing is immobilizing parts of the institution.”

    Solter is dismayed by the fight. Given the relative success of the journal, the disputes seem like a waste of time and energy, Solter says: “When all is said and done, they made the journal what it is.… To get that done, maybe you have to be a bit obnoxious. The journal and its editors should be left alone to do their work.”

  6. REGULATORY SCIENCE

    Changes to EPA Toxicology—Speed or Delay?

    1. Erik Stokstad

    The U.S. Environmental Protection Agency (EPA) has substantially modified the way it updates a database on chemical hazards that influences how chemicals are regulated. The agency says the changes should make the process more transparent, more rigorous, and speedier. But critics argue that the new procedure is more secretive and gives too much clout to federal agencies that pollute or face massive cleanup costs. One result, they say, will be further delays in regulation.

    Begun in 1985, the Integrated Risk Information System (IRIS) contains EPA scientists' appraisals of the chronic health effects of more than 540 chemicals. EPA regulators use this information to revise drinking water standards, for example, or set cleanup levels at Superfund sites. Many states and agencies around the world also use the data. “IRIS is the gold standard,” says Jennifer Sass of the Natural Resources Defense Council (NRDC) in Washington, D.C.

    IRIS has sometimes been a battleground. For example, the Department of Defense (DOD) criticized the science behind EPA's draft IRIS document for a solvent called trichloroethylene, which contaminates many military bases. This 2001 draft identified stronger evidence of carcinogenicity and could ultimately force DOD to spend billions more cleaning up polluted aquifers. Since then, the White House's Office of Management and Budget (OMB) has been reviewing all IRIS drafts. Many came back with edits that made health effects seem more uncertain, says an EPA official who asked to remain anonymous, whereas other drafts remain in limbo at OMB. On average, documents now take 5.5 years to complete, and the most controversial can stretch to more than a decade.

    Three years ago, EPA Administrator Stephen Johnson asked his Office of Research and Development (ORD), which runs IRIS, to make the process more predictable and transparent. Under changes announced on 10 April, the public and interested federal agencies now have a chance to comment on IRIS's early “qualitative” drafts. In an added step, federal agencies—but not the public—will get a confidential “sneak preview” of the final draft before it is sent to peer review.

    DOD says it's pleased with the changes. The early reviews will enable DOD to resolve questions of scientific uncertainty more quickly and get a head start on managing risks, such as by finding substitute chemicals, predicts Shannon Cunniff, who directs DOD's program on emerging contaminants.

    NRDC's Sass worries that the added review will let federal agencies delay the process. She's also concerned that interagency comments on drafts that were previously part of the public record will now be secret. Cunniff says, however, that DOD plans to make public the scientific feedback it sends to EPA. And George Gray, who heads ORD, emphasizes that the final decisions on the content of IRIS documents will remain in EPA's hands. “Anything we do has to be scientifically justified,” he says.

    Skeptics remain. In a statement, Senator Barbara Boxer (D-CA) called the changes “devastating” and announced that the Environment and Public Works Committee, which she chairs, plans to conduct an oversight hearing on EPA's toxics program. In addition, the Government Accountability Office will shortly release a study she requested on political influence on IRIS.

  7. ENERGY

    The Greening of Synfuels

    1. Eli Kintisch

    An old, dirty technology to make transportation fuels from coal could fight global warming, say proponents. The trick is using more biomass and burying the carbon dioxide that's generated.

    CREDIT: BRIAN HUBBLE

    A multibillion-dollar U.S. effort to turn coal into gasoline was a colossal flop in the 1980s, plagued by mismanagement, political wrangling, and falling oil prices. Environmentalists concerned about the impact of additional coal mining cheered the end of the synthetic fuels program, which was aimed at cutting U.S. dependence on oil from the Middle East.

    A generation later, the geopolitical reasons for reducing U.S. oil imports are more compelling than ever. And with oil prices above $100 a barrel, the economic equation has changed. So it's no surprise that a few U.S. energy companies have drawn up plans for synfuels plants that would produce millions of barrels of the alternative fuel annually.

    But this time around, the technology is also gaining support from a seemingly unlikely source. A group of climate scientists believes that, barrel for barrel, synfuels can emit less carbon dioxide (CO2) than oil and, at some point, even reduce the amount of carbon in the atmosphere. “When you make synfuels, you have an incredible opportunity” to tackle climate change, says Princeton University physicist Robert Williams, an advocate of the technology.

    Living up to that promise won't be easy, however. The two keys to making synfuels green are using large amounts of plant biomass along with coal and storing in the ground the CO2 emitted during the production of synfuels. And neither has been implemented on a commercial scale. Most environmental groups are still horrified by the thought of more synfuels plants and are loath to see coal mining expanded. They also point out that the process produces CO2 at twice the rate of making gasoline from crude oil; without CO2 storage and the use of biomass, the result would be disastrous. At least eight synfuels plants are expected to open soon in China, with 17 more planned; they will spew forth millions of tons of CO2. A coal-fed facility in Secunda, South Africa, built to cope with an apartheid-era fuel embargo, is the planet's single biggest point source of carbon, emitting 20 million tons of CO2 a year.

    “[Synfuels] may be worth looking into, and I have no doubt someone's going to make money with the process,” says energy professor Daniel Kammen of the University of California, Berkeley. But he thinks those who see a climate benefit are underestimating the costs of large-scale carbon storage while overestimating the availability of biomass that can be harvested without having deleterious effects. As a climate solution, he says, “I'm a lot less sanguine that it's going to work out.”

    What a gas

    The chemistry involved in making synfuels is not complicated. The process begins by turning coal into gas, which creates carbon monoxide and hydrogen (see diagram). The resulting syngas, as it's called, is scrubbed of pollutants and then converted with catalysts into products such as diesel fuel, jet fuel, or chemical feedstocks.
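
    For readers who want the chemistry spelled out, an idealized textbook sketch (simplified reactions, not the exact process chemistry of any plant named here) runs from steam gasification, through the water-gas shift that tunes the hydrogen-to-carbon-monoxide ratio and yields the concentrated CO2 stream discussed below, to Fischer-Tropsch synthesis of liquid hydrocarbons:

        \mathrm{C + H_2O \rightarrow CO + H_2}
        \mathrm{CO + H_2O \rightarrow CO_2 + H_2}
        \mathrm{(2n+1)\,H_2 + n\,CO \rightarrow C_nH_{2n+2} + n\,H_2O}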

    Germany operated the first large-scale commercial synfuels plants in the 1940s to provide fuel for a Nazi war machine starved by an Allied oil embargo. Then the 1970s gasoline crunch led U.S. President Jimmy Carter and Congress to create the $20 billion Synthetic Fuels Corp. in 1980. The goal was to use coal to produce 700 million barrels of oil per year by 1992. The corporation spent $2 billion on demonstration projects in California, Louisiana, and North Dakota. But management scandals, battles between the corporation and the White House during the Reagan Administration, and, ultimately, the falling price of oil—it hit $21 a barrel in 1986—caused Congress to pull the plug that year. Experts said the only thing that would revive synfuels was $100-a-barrel oil.

    And here we are. In the United States, two companies lead the Synfuels 2.0 effort. Baard Energy, based in Vancouver, Washington, hopes next year to begin building a $5 billion plant in Wellsville, Ohio, that would produce 50,000 barrels a day of diesel, jet fuel, and other chemicals. Rentech Inc., based in Los Angeles, California, hopes to open a plant in Natchez, Mississippi, in 2011 that would eventually make 30,000 barrels of fuel a day. Although the hefty price tag of a synfuels plant makes it less likely that enough will be built to have a major impact on global transportation needs, Baard's owner and founder, John Baardson, says the plant will make money as long as the cost of a barrel of oil remains above $50.

    The companies plan to use 30% and 10% biomass by weight, respectively, and store the CO2 they make underground. That mix, they say, will produce fuels with a life cycle carbon footprint much smaller than the one left by those derived from Middle Eastern oil. Future projects using greater proportions of biomass, advanced gasifiers, and carbon storage could result in a carbon-negative process, say proponents, storing indefinitely the CO2 that plants had taken up from the atmosphere. Baard says that getting enough biomass for its Ohio plant won't be a problem. Rentech hopes to be able to use garbage, which is also plentiful. (A third company, owned by the power utility DKRW, is planning a coal-only gasification project in Wyoming that will inject CO2 as well.)

    A Dutch utility called Nuon has been pioneering this method, gasifying an 80–20 mix of coal and wood chips since 2006. (Its plant in Buggenum, Netherlands, generates power instead of fuel, but the gasification step is identical.) “They've solved a number of technical problems,” says Baardson, including selecting the best feedstocks and preparing them for conversion.

    Unlike coal, which is easily ground into tiny spheres, the fibrous wood gets stuck as it is fed into the gasifier, creating an uneven flow. Dutch engineers have developed a way of mixing the two feedstocks to make them flow better. A new process of drying and charring the wood beforehand, developed by the Energy Research Centre of the Netherlands, has also helped keep the mixture flowing evenly into the gasifier. The process requires extra energy, but by reducing the weight of the material it lowers transportation costs.

    Engineers in industry believe that preparing the biomass is the main technical hurdle to gasifying it and cite Nuon's success as proof. But with Nuon keeping its methods secret, government researchers want to explore the new feedstock more. “We don't exactly know how biomass is going to affect the gasifier, gas cleanup, or catalysts systems,” says Daniel Cicero of the National Energy Technology Laboratory in Morgantown, West Virginia. The lab announced a $7 million research program last month to identify the minerals found in biomass feedstocks such as poplar or switchgrass and to examine how they might affect the system.

    Nuon's gasifier, built by Shell, operates above 1200°C. That temperature melts the inorganic ash that the process creates. But gasifiers that run at temperatures hundreds of degrees cooler could save in construction and operating costs, says Richard Bain of the National Renewable Energy Laboratory in Golden, Colorado.

    It's a gas.

    Traditional synfuels plants take coal and turn it into syngas. The gas is then catalyzed into various liquid fuels. Proposed plants would also store underground the CO2 that is created. Greater reliance on biomass would make the process more carbon friendly.

    CREDIT: P. HUEY/SCIENCE

    Cooler gasifiers have their own problems, however. Lower temperatures mean that less of the feedstock—be it coal or biomass—is converted into syngas. The toxic, carbonaceous muck that remains is costly to dispose of. Researchers hope that better computer modeling and new chemical techniques will help them more fully process the gunk.

    New chemistry could cut costs even more dramatically. Two years ago, chemical engineer Lanny Schmidt of the University of Minnesota, Minneapolis, demonstrated how to gasify biomass by releasing tiny bits onto a catalyst made of rhodium and cerium, where it is converted instantly to syngas in an oxygen-rich vessel (Science, 3 November 2006, p. 801). Its industrial advantages include shortening the duration of the process—to roughly a tenth of the time of existing gasifier designs—and leaving behind almost no carbon. Because the reaction continually releases its own heat—700°C—the technique could eliminate costly external heating. But Schmidt acknowledges he still needs to solve a copious “ash problem” before synfuels plants can be shrunk to the size of ethanol facilities, which are small enough to sit adjacent to local farms.

    Going under

    To store the CO2 that synfuels plants create, researchers hope to take advantage of the fact that the process creates a concentrated CO2 stream that can simply be injected into deep underground formations. In contrast, CO2 from a standard generating plant must be separated from other flue gases (Science, 13 July 2007, p. 184). “In a way, the thing that makes [synfuels] so bad for the climate could make [them] so good for the climate,” says Daniel Schrag, a Harvard University geochemist who works as a part-time consultant for Rentech. Capturing and storing a ton of carbon from a standard coal plant would cost $40, according to a survey last year by a team of researchers at the Massachusetts Institute of Technology in Cambridge. Rentech says its Mississippi plant, strategically located near pipelines that currently bring CO2 to oil fields, will do it for a net of $6 a ton.

    A burning question.

    South Africa's Secunda facility is the world's biggest point source for carbon emissions, but synfuels can be made cleaner.

    CREDIT: JONATHAN BLAIR/CORBIS; SOURCE: ARGONNE NATIONAL LABORATORY

    But the amount of CO2 that a new generation of synfuels plants would need to store dwarfs current experimental efforts. The three largest projects worldwide—in Algeria, off the coast of Norway, and in Saskatchewan, Canada—are each storing roughly 1 million tons per year. Baard expects its plant alone will produce more than four times that amount. Baard and Rentech plan to sell the CO2 from their plants to oil companies to help them squeeze the last drops out of existing wells, a process that geologists say effectively stores the CO2 once the wells are sealed. But such opportunities are relatively rare. Fortunately, there's plenty of available space elsewhere: A government survey last year found that the United States has room underground and near power plants for at least 91 billion metric tons of CO2, enough to absorb many decades of emissions.

    The process involves injecting a stream of CO2, liquefied by high pressure, into a series of wells drilled thousands of meters into porous rock such as sandstone. The formations are capped with impermeable layers of rock. Inside the space, the liquid CO2 displaces briny liquids as it fills pores. Results of early tests on a small scale have been positive, but scientists say they still have a lot to learn as they scale up injections. “I wouldn't say there are any major technical [barriers],” says engineer Sean McCoy of Carnegie Mellon University in Pittsburgh, Pennsylvania. “We want to make sure there aren't any surprises.”

    To reassure the public that underground carbon sequestration is reliable and safe, hydrologist Diana Bacon of Pacific Northwest National Laboratory in Richland, Washington, says researchers need better computer models of how stored CO2 behaves. Adding complex geochemistry to the models is a first step. Liquid CO2 under pressure, for example, can cause the formation of solid salts such as sodium chloride that can block pores and alter the flow of the injected CO2. Varying mixtures of calcite, dolomite, and sandstone found in deep sedimentary rocks could affect CO2 behavior differently, says Bacon. Injecting CO2 into the basalts found beneath much of the United States and India, among other places, can have similarly hard-to-model effects.

    Megascale synfuels projects would give engineers the experience they now lack in long-term sequestration of CO2. “We need to just get moving,” says Bacon. But that's hard to do in the United States, where pure CO2 streams are relatively rare despite the heavy use of fossil fuels. China's projected synfuels plants give that country the chance to become “a world leader” in CO2 storage, says Princeton's Williams. But despite nascent partnerships with the U.S., European Union, and U.K. governments, the only large-scale test announced so far in China is a $1 billion power plant, dubbed GreenGen, in Tianjin. Several government-owned companies expect to begin construction next year.

    Companies say that synfuels could become an important energy source sooner if the U.S. government lends a hand. One potentially huge customer for synfuels is the U.S. Air Force, whose planes now consume 11.4 billion liters of fuel a year. Synfuels makers want Congress to grant the Pentagon the authority to sign long-term fuel-purchasing contracts for synfuels. The lawmakers who oversee the Pentagon have been mum on the matter. A compromise requiring restrictions on carbon emissions for federally supported synfuels seems possible, although a similar deal involving tax breaks and production credits failed last year.

    Environmental groups oppose such a deal. David Hawkins of the Natural Resources Defense Council in Washington, D.C., fears that any legislation will open the door to a surge in synfuels made purely from coal. Even if the CO2 generated could be stored, he says, the effects of expanding coal mining could be extremely harmful to the environment.

    Notwithstanding the technical hurdles, the fate of synfuels may hang on whether companies are forced to pay a price for the carbon they emit into the atmosphere. Opponents point out that Rentech and Baard, despite their pledges, are free to use only coal in their synfuels plants and emit millions of tons of CO2 per year. “That's only a problem if you don't have a price on carbon,” Schrag counters. Since they can use nonfood crops or plant waste, synfuels “can be better than ethanol,” he adds, citing the negative impact of corn ethanol on food prices and its projected deleterious effect on climate (Science, 29 February, p. 1235). Economic policies that reward good behavior will not only serve as a huge incentive to the synfuels industry, he notes, but also have more global effects: “If there isn't a carbon price, we're not going to solve the climate problem anyway.”

  8. OCEANOGRAPHY

    Watery Echoes Give Clues to the Past and Future of the Seas

    1. Lucas Laursen*
    * Lucas Laursen is a freelance writer in Cambridge, U.K.

    A handful of oceanographers and geophysicists are recording seismic whispers of the ocean's structure.

    Last spring, Katy Sheen listened to the sounds of the ocean from a ship off the coast of Spain. A relaxing vacation? Hardly. Sheen, a graduate student at the University of Cambridge in the U.K., is one of a handful of scientists adapting a technique called seismic profiling to oceanography. By observing the changing speeds of sound waves propagating through water, geophysicists and oceanographers hope to extract information about the ocean's temperature, salinity, and velocity.

    Geologists and oil companies have long used ship-based seismic profiling to probe density changes in the solid earth beneath the sea, but the technique of mapping the ocean's internal structure this way is less than a decade old. If efforts like Sheen's succeed as expected, scientists will gain a powerful new tool that could unlock the volume of the ocean to rapid and remote study, much as satellites did for the ocean's surface. “When satellite observations came along, the oceanographic community… said, ‘Well, it's not going to tell us anything new,’ but it did and it was important,” says Nicky White of the University of Cambridge, Sheen's supervisor.

    Particularly exciting, White and others say, is the prospect of tapping a jackpot of knowledge from decades of “legacy data” that energy companies have gathered while sounding sea floors in search of oil. The results could prove invaluable for measuring how the ocean's waters interact and assessing the impact of such mixing on past and future climate change.

    First, however, ocean scientists must quantify the subtle ways sound waves veer and bounce as they pass through currents of differing temperature and salinity. Researchers plan to discuss results of their calibration cruises at this week's European Geosciences Union (EGU) meeting in Vienna. W. Steven Holbrook of the University of Wyoming in Laramie, whose team introduced seismic profiling as an oceanography tool in 2003, says he hopes EGU will provide “a real ‘meeting of the minds’ between seismologists and oceanographers.”

    To map ocean structure, oceanographers traditionally slow their ship to a crawl, lower instruments every 10 kilometers or so, and interpolate the data points. They have also tracked the spread of dyes, measured surface temperature by satellite, and anchored buoys for long-term observations of ocean currents. In 2000, Holbrook and his team began profiling the ocean with sound. By timing faint echoes from an array of seismic air guns towed behind a ship, they created sub-10-meter-resolution pictures of different water layers across large swaths of sea (Science, 8 August 2003, p. 821).
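
    The geometry behind those pictures is simple echo timing. A minimal sketch, assuming a nominal sound speed of about 1,500 meters per second in seawater and the common quarter-wavelength rule of thumb for vertical resolution (illustrative values, not the team's actual processing):

        SOUND_SPEED = 1500.0  # m/s, nominal speed of sound in seawater

        def echo_depth(two_way_time_s, v=SOUND_SPEED):
            # Depth of a reflector from the two-way travel time of its echo.
            return v * two_way_time_s / 2.0

        def vertical_resolution(dominant_freq_hz, v=SOUND_SPEED):
            # Quarter-wavelength rule of thumb for the finest resolvable layering.
            return (v / dominant_freq_hz) / 4.0

        print(echo_depth(1.0))              # ~750 m down for a 1-second echo
        print(vertical_resolution(50.0))    # ~7.5 m, i.e., sub-10-meter imaging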

    Oceanographers admired the pictures but challenged the geophysicists to put numbers on them. Since 2003, a handful of research cruises, including Holbrook's current U.S. National Science Foundation-funded efforts off Costa Rica, have sought to do that by combining seismic profiling with traditional oceanography. Sheen's 2007 voyage, for example, was part of the European Union-funded Geophysical Oceanography (GO) project led by Richard Hobbs of Durham University in the U.K. GO researchers tested new instruments and techniques aimed at collecting “a definite calibration data set,” says Hobbs. One ship dropped instruments as often as every 2 kilometers behind an air gun-towing ship to obtain more detailed oceanographic data than have historically accompanied seismic profiles. Riffing on the geologists' “ground truth,” John Huthnance of the Proudman Oceanographic Laboratory in Liverpool, U.K., says the GO project has given “sea truth to the seismic data.”

    Sea hear.

    Seismic reflections from warm, salty water swirl above the stronger reflections from the sea floor.

    CREDIT: (DATA) TGS NOPEC; GO PROJECT, R. HOBBS, E. VSEMIRNOVA AND E. QUENTEL (UBO), DURHAM UNIVERSITY

    Once seismic profiling has been fully calibrated, researchers say, terabytes of seismic records from past oil exploration will become ripe for reanalysis. Oil companies stored seismic data from the ocean, which was so much noise to them, in order to subtract it from the much stronger reflection profiles they made of the solid sea floor. “We can happily plug away at legacy data,” says White. Mining old data would also bypass the enormous costs of new voyages, which Stephen Jones of Trinity College Dublin in Ireland says cost upward of $25,000 a day.

    Legacy data sets aren't perfect. Until the late 1980s, hydrocarbon exploration was largely limited to shallow continental shelves, whereas oceanographers and climate researchers are most interested in ocean-mixing hot spots in deeper waters and at bottlenecks such as the Drake Passage and the Strait of Gibraltar. The biggest limitation on the legacy data is that the petroleum geologists who collected them didn't take enough oceanographic measurements. “There are lots of seismic reflection profiling sections available, but few of them have even a single temperature profile to tell us about the water column,” says Raymond Schmitt of Woods Hole Oceanographic Institution in Massachusetts.

    Even without completing all their calibrations, oceanographers have published quantitative studies of ocean mixing and in Vienna will discuss imaging eddies with sound. Aided by such progress, they are slowly persuading funding bodies to support seismic profiling despite initially “having some difficulty,” says White; the U.K.'s Natural Environment Research Council rejected his and Hobbs's proposals three times. “It's natural to be hesitant to spend money on something which is a little bit unknown,” says U.S. Navy oceanographer Warren Wood, a collaborator of Holbrook's, who has obtained internal funding from the Navy.

    But Wood says he is “quite impressed by the first results of the GO project.” White expects even more later this year, pending analyses by graduate students such as Sheen. For oceanographers, he concludes, seismic profiling “hasn't laid its golden egg, yet.”

  9. VIROLOGY

    Mapmaker for the World of Influenza

    1. Martin Enserink

    Flu researchers are captivated by computer scientist Derek Smith's maps of viral evolution. Today, he helps them make their toughest decisions.

    Number cruncher.

    At his computers in Cambridge, Smith stores and analyzes flu data from around the world.

    CREDIT: SEBASTIAN MEYER/GETTY IMAGES FOR SCIENCE

    Derek Smith didn't want to do rocket science—literally. That's how he ended up becoming an internationally recognized expert in influenza virus evolution.

    In 1992, at age 33, Smith was working at a research lab of Texas Instruments (TI) in Dallas, a company he had joined a decade earlier, fresh out of a British university. He specialized in the mathematics of speech recognition. One day, a colleague noted that the integrated circuits Smith was developing might play a key role in the control systems for an antiradar missile called HARM that TI was producing for the Pentagon. The missile needed to discern real radar stations from decoys, a problem not unlike detecting subtle differences in spoken words.

    “I'm not a pacifist,” Smith says, “but I didn't want anything to do with work directly related to the military.” Instead, he started looking for a job in which his expertise might benefit public health. He found it in a Ph.D. project to model the immune system's recognition of influenza viruses at the Santa Fe Institute in New Mexico.

    He never regretted the choice. Now at the University of Cambridge, U.K., Smith has become the unofficial cartographer of the influenza world. He has developed a technique to produce colorful maps visualizing the never-ending changes in the influenza virus, and over the past 4 years, his lab has become a global nerve center that analyzes influenza data from around the world. His work offers scientists a way to track the virus' evolution almost in real time, says Ian Barr of the World Health Organization's (WHO's) Collaborating Centre for Reference and Research on Influenza in Melbourne, Australia.

    Indeed, influenza researchers find Smith's “antigenic cartography” so enlightening that, shortly after he and others published the first results in 2004, he was asked to join the select group that huddles at WHO's headquarters in Geneva, Switzerland, twice a year to decide which strains to put in the annual influenza vaccine that protects 300 million people. “It's a huge responsibility,” he says.

    From tables to maps

    Influenza viruses elude the immune system by changing the shapes of the glycoproteins on their coat—in particular, hemagglutinin (HA), the one that latches onto human cells and to which our immune systems produce antibodies. That's why a flu shot or a natural infection one winter may not protect the year after.

    To tell how much a new strain differs from previous ones, researchers test how well its HA is inhibited by antibodies to known strains harvested from infected ferrets. If the antibodies bind well, the new virus is “antigenically close” to those earlier ones; if they don't, the new strain is more distant. These results are used to create complex tables with thousands of numbers, each describing the outcome of one binding assay; they are impenetrable to all but the most experienced researchers.

    Smith wanted to turn the tables into clear, accessible maps. Just as mathematicians can reconstruct a decent map of a country from the distance table in the back of a road atlas, it should be possible to map influenza strains based solely on each strain's antigenic distance from the others, he says. So in 1999, Smith teamed up with Alan Lapedes, a mathematician at Los Alamos National Laboratory in New Mexico, who, with Robert Farber, had laid part of the theoretical groundwork for such maps.
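
    The underlying idea, recovering relative positions from nothing but pairwise distances, is multidimensional scaling. A minimal sketch of the classical (Torgerson) version in Python with a hypothetical toy distance matrix follows; Smith's antigenic cartography uses a more elaborate optimization tailored to the quirks of binding-assay data, so this shows only the textbook form of the idea:

        import numpy as np

        def classical_mds(distances, dims=2):
            # Recover coordinates (up to rotation and reflection) from a
            # symmetric matrix of pairwise distances.
            d = np.asarray(distances, dtype=float)
            n = d.shape[0]
            j = np.eye(n) - np.ones((n, n)) / n   # centering matrix
            b = -0.5 * j @ (d ** 2) @ j           # double-centered squared distances
            vals, vecs = np.linalg.eigh(b)        # eigenvalues in ascending order
            top = np.argsort(vals)[::-1][:dims]   # keep the largest ones
            return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))

        # Three hypothetical "strains" whose pairwise antigenic distances are known.
        toy = np.array([[0.0, 1.0, 4.0],
                        [1.0, 0.0, 3.0],
                        [4.0, 3.0, 0.0]])
        print(classical_mds(toy))   # 2-D coordinates consistent with those distances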

    He also struck up a collaboration with Ron Fouchier, a virologist at the Erasmus Medical Center in Rotterdam, the Netherlands. When Fouchier switched from HIV research to influenza in 1998, he, too, had been struck by the opacity of the binding assay tables. “I thought there had to be a better way,” says Fouchier. The Rotterdam lab also had 3 decades' worth of data and samples—precisely what was needed to produce a map. Of the three influenza types now circulating in humans, the trio picked H3N2, which changes the fastest and affects the most people.

    The project was a gamble, says Smith; several groups tried before but failed to get the mathematics right. Even after they produced the first maps, the researchers spent several more years checking the results before publishing.

    In the 16 July 2004 issue of Science (p. 371), they finally published a map of 273 virus strains that had been isolated since H3N2 emerged in 1968. The map is like that of an archipelago, and the strains come in clusters: Very often, the virus changes little from one year to the next, but occasionally, it makes a major antigenic jump, starting a new cluster, for which existing vaccines offer no protection. The jumps can't always be predicted from the viruses' genetic sequence, because a small change in the HA gene can sometimes cause a major shape change in the glycoprotein that makes antibodies lose their grip.

    Smith had already presented his results to some members of the WHO panel; the invitation to join the group followed just 2 weeks after the Science paper was published. “All of us could see this was an emerging technology which had immediate application for the work we were doing,” says Barr, a member of the group.

    Smith's maps increase the group members' confidence that they're making the right choice, says WHO influenza expert Keiji Fukuda; they're also helpful for those less familiar with the tables, he says, such as vaccine producers, regulatory officials, and scientists who don't specialize in influenza. “They can look at these maps and go: ‘Oh, now I understand it,'” Fukuda says. Barr concedes that the math and computer wizardry Smith uses to produce his colorful maps go over most influenza scientists’ heads. “It will take a while before we throw away our tables,” he says.

    Sister labs

    Meanwhile, Smith's career has taken off. The University of Cambridge made him a research associate in 2003 and a full professor in 2007. In 2005, he landed a $2.5 million U.S. National Institutes of Health Director's Pioneer Award that enabled him to expand his research group to 10 members. He still spends a day a week at Fouchier's lab in Rotterdam, which he says helps keep him grounded in real biology. Some of their best ideas bubble up while the two indulge their shared love of Tom Waits and good whiskey, says Fouchier. Grad students and postdocs, too, are encouraged to cross the North Sea frequently. “We're really like sister labs,” Smith says.

    Influenza archipelago.

    On an antigenic map, each virus strain isolated between 1968 and 2003 appears as a small blob. They occur in clusters (each given a different color), starting with Hong Kong '68. The scale bar represents one antigenic unit, a measure of how similar strains are.

    CREDIT: SMITH ET AL., SCIENCE 305, 371 (2004)

    “Derek has a wonderful personality for bringing together people and data,” says Nancy Cox of the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia, one of the four WHO Collaborating Centers. Smith's close association with the Collaborating Centers has given him access to an unparalleled wealth of antigenic and genetic data from around the world, which enables him to study questions on a global scale, as a paper in this week's issue of Science (see sidebar) illustrates.

    Aware of his privileged position, Smith is careful not to hog the glory, stressing the collaborative nature of the process and crediting the people who provide the data. “As a theoretical biologist, you have to be aware of your place in the food chain,” he says. “I don't even know how to do a binding assay.”

    Now that his lab has come into its own, Smith hopes to tackle new problems. He would like to predict farther in advance which strains will be dominant in a given year. Currently, vaccine producers have just 8 months between the panel's decision and the start of the vaccination season, which means a yearly scramble. At the same time, Fouchier and Smith are trying to predict a strain's antigenic profile directly from its gene sequence; that might eliminate the need for those pesky tables altogether.
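
    As a purely hypothetical illustration of the sequence-to-antigenicity idea, one could start with something as simple as regressing measured map distances on counts of amino acid differences between HA genes; all numbers below are invented, and the real problem is far harder because single substitutions can have outsized antigenic effects.

```python
import numpy as np

# Invented data: for each pair of strains, the number of HA amino acid
# differences and the measured antigenic map distance between them.
aa_differences = np.array([[1], [2], [3], [5], [7]], dtype=float)
antigenic_dist = np.array([0.8, 1.5, 2.6, 4.1, 5.9])

# Ordinary least squares with an intercept: distance ~ a + b * substitutions.
X = np.hstack([np.ones((len(aa_differences), 1)), aa_differences])
beta, *_ = np.linalg.lstsq(X, antigenic_dist, rcond=None)
print(beta)      # [intercept, effect per substitution]
print(X @ beta)  # fitted antigenic distances
```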

    Smith also wants to expand the scope of his cartography. Maps for H1N1—the other influenza A virus circulating among humans—and for influenza B are under way. He has also started collaborations to work on agents such as rabies, malaria, and dengue and has plans to branch out into HIV. “There's no reason you can't do the same thing with many other pathogens,” he says. One thing seems sure: The mapmaker has put himself firmly on the map.

  10. VIROLOGY

    Coming Out of Asia--Year In, Year Out

    1. Martin Enserink

    On page 340 of this week's issue of Science, researchers report that a small number of countries in East and Southeast Asia “seed” the yearly flu epidemics washing over the planet.

    Where does the flu virus hide when there's no flu? That question has puzzled epidemiologists for decades. Every place on Earth has an influenza season, usually the winter, when conditions are best for its spread. But what happens after that? Does the virus lurk in a few people until next year? Or does it disappear and come back, and if so, where from?

    Using data about some 13,000 seasonal flu samples from around the world, Derek Smith of Cambridge University in the U.K. and colleagues provide an answer in this issue of Science (p. 340): A small number of countries in East and Southeast Asia “seed” the yearly epidemics washing over the planet. “It's really a fantastic paper,” says Keiji Fukuda of the World Health Organization (WHO) in Geneva, Switzerland. It shows that strengthening surveillance in Asia is crucial, Fukuda says.

    There were plenty of theories on what happens during influenza's absence. Some believed the virus remained in every country, hiding in infected but symptom-free people or spreading at rates too low to detect, only to roar back when winter came around. Others believed it vanished, moving back and forth between the northern and southern hemispheres, for instance, or receding temporarily into tropical Asia, Africa, and South America.

    For the new study, Smith and his colleague Colin Russell first analyzed an antigenic map (see main text) of some 13,000 samples of H3N2, the most important flu type currently circulating. They discovered that changes in the virus always occur first in countries in East and Southeast Asia. That doesn't necessarily mean that the area acts as a source; the virus might also be evolving in parallel around the globe, with Asia being ahead of the curve by a couple of months.
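
    One way to make the "who leads, who lags" question concrete, sketched below with invented dates, is to record when each antigenic cluster is first detected in each region and measure every region's lag behind East and Southeast Asia; this is only an illustration of the bookkeeping, not the authors' actual analysis.

```python
from collections import defaultdict
from datetime import date

# Invented (cluster, region, detection date) records, standing in for surveillance data.
records = [
    ("cluster_A", "E/SE Asia",     date(2002,  2,  1)),
    ("cluster_A", "North America", date(2002, 11, 15)),
    ("cluster_A", "Europe",        date(2002, 12,  3)),
    ("cluster_B", "E/SE Asia",     date(2004,  1, 20)),
    ("cluster_B", "Europe",        date(2004, 10,  5)),
]

# Earliest detection of each cluster in each region.
first_seen = defaultdict(dict)
for cluster, region, when in records:
    prev = first_seen[cluster].get(region)
    if prev is None or when < prev:
        first_seen[cluster][region] = when

# Lag (in days) of each region behind East/Southeast Asia, per cluster.
for cluster, by_region in first_seen.items():
    ref = by_region["E/SE Asia"]
    lags = {region: (seen - ref).days for region, seen in by_region.items()
            if region != "E/SE Asia"}
    print(cluster, lags)
```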

    But an analysis of the strains' hemagglutinin genes showed that flu epidemics in Europe, North America, and Australia are actually seeded by countries such as Japan, Thailand, South Korea, and Singapore. Europe and North America then act as conduits to South America, which has less direct contact with Asia.

    A study by Edward Holmes of Pennsylvania State University in State College and colleagues, published online by Nature this week, also shows that yearly waves in the temperate regions originate in the tropics. But that paper—based on a whole-genome analysis of 1302 strains from New York and New Zealand—does not pinpoint the source.

    So what makes East and Southeast Asia special? A variety of climate zones in a small area creates a network of countries with overlapping flu seasons, Smith says. Frequent human travel gives the virus a chance to jump from one country to another. When winter arrives in Europe and the United States, strains from the Asian network spread to those continents aboard jumbo jets. But further, fine-grained studies will be needed to clarify exactly how the Asian network works and whether a similar network exists in India, as Smith and Russell hope to find out together with Indian scientists.

  11. CONDENSED-MATTER PHYSICS

    The Mad Dash to Make Light Crystals

    1. Adrian Cho

    Simulations fashioned from laser light and wisps of ultracold atoms might crack the hardest problems in the physics of solids. DARPA wants them in just over a year.

    Crystal clear.

    In an optical lattice, spots of laser light simulate the ions in a crystal. Atoms (red) hopping between the spots simulate the electrons in the solid.

    CREDITS: J. NEWFIELD/SCIENCE

    For more than a century, physicists have developed ever more sophisticated theories of rock-hard solids and the electrons whizzing within them. They've deciphered metals and insulators, concocted the semiconductors that make computers hum, and explained mind-boggling phenomena such as conventional superconductivity, in which some alloys conduct electricity with no resistance at temperatures near absolute zero.

    Yet many problems continue to stump theorists. For example, 22 years after high-temperature superconductors were discovered, physicists still don't know how the exotic compounds carry current without resistance at temperatures up to 138 kelvin. Generally, whenever the shoving among electrons grows too strong, physicists find themselves stymied—even if they resort to high-powered numerical simulations. “We know that we can't do these things on a classical computer,” says David Ceperley, a theorist at the University of Illinois, Urbana-Champaign (UIUC).

    Help may be on the way, and from an unlikely quarter. Atomic physicists have spent 2 decades fiddling with ultracold gases a millionth the density of air. Now they're striving to model weighty crystalline solids with laser light and cold atoms. Interfering laser beams create an array of bright spots called an “optical lattice” that emulates the ions in a crystal; atoms hopping between the spots emulate the electrons. Physicists can tune the lattice's geometry, the rate of hopping, and the push and pull between atoms. So they hope to map the various behaviors of a model solid—superconducting, insulating, and so on—in a portrait called a “phase diagram.”

    The U.S. Defense Advanced Research Projects Agency (DARPA) has launched a multimillion-dollar program to develop such optical-lattice emulators. They might crack some of the toughest problems in condensed-matter physics or even enable researchers to design materials from scratch. Plans call for the first ones to be running in 15 months.

    “The DARPA program is excellent,” says Wolfgang Ketterle, an experimenter at the Massachusetts Institute of Technology (MIT) in Cambridge and leader of one of three multi-institution teams receiving funding. “It puts money and resources into an effort that is scientifically superb.” But some warn that making the emulators work may be harder than expected. And researchers' goals don't necessarily jibe with DARPA's. Physicists want the phase diagrams. Seeking a tool to design exotic new materials, DARPA wants an automated system that works in just 10 hours.

    Abstractions made material

    The emulators could bridge the gap between the abstraction of theory and the idiosyncrasies of experiments with real solids. Theorists make an educated guess at the physics behind a material's behavior. This “model” is captured in a mathematical expression known as a Hamiltonian, which describes the system's energy. Unfortunately, it's often impossible to “solve” the Hamiltonian to prove it produces the observed behavior. And there is no guarantee that the model doesn't leave out some key detail.

    Take, for example, a high-temperature superconductor. It contains planes of copper and oxygen ions arranged in a square pattern along which the electrons pair and glide. At low enough temperatures, the electrons repel one another so strongly they get stuck one-to-a-copper-ion in a traffic jam known as a Mott insulator state. The electrons also act like little magnets, and neighboring electrons point alternately up and down to form an “antiferromagnet.” Now, take out a few electrons by tweaking the material's composition. The traffic jam breaks and, perhaps through waves of magnetism, the electrons pair and flow without resistance. Or so many theorists assume.

    This scenario is known as the two-dimensional (2D) Fermi-Hubbard model, and nobody can prove it produces superconductivity. Nobody is sure that it captures the essential physics of the messy crystals, either. “The materials are so complicated that you can't look at just the electron-electron correlations,” UIUC's Ceperley says. “There are all these other things going on.”
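
    For readers who want the model in symbols, the Hubbard Hamiltonian has a famously compact standard form: a hopping term that lets particles tunnel between neighboring lattice sites with amplitude t, and an interaction term that charges an energy U whenever two particles of opposite spin occupy the same site.

```latex
% Standard Fermi-Hubbard Hamiltonian: hopping amplitude t, on-site repulsion U.
% c_{i\sigma} annihilates a fermion of spin \sigma on site i; n = c^\dagger c.
H \;=\; -t \sum_{\langle i,j\rangle,\sigma}
        \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \text{h.c.} \right)
      \;+\; U \sum_{i} n_{i\uparrow}\, n_{i\downarrow}
```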

    But physicists might be able to make a Fermi-Hubbard model by loading cold atoms into a 2D optical lattice that would simulate just these copper-and-oxygen planes. Atoms spinning in opposite directions would hop from bright spot to bright spot. By tweaking the laser beams and applying a magnetic field, physicists would vary the rate of hopping, the repulsion between atoms, and other factors to determine under what conditions, if any, the model produces superconductivity, says Tin-Lun “Jason” Ho, a theorist at Ohio State University in Columbus. “The goal is to reproduce the model faithfully in an optical lattice and let nature tell you what the solution is,” he says.
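
    How those knobs map onto the laser parameters is captured, at least approximately, by standard deep-lattice estimates from the cold-atom literature for a sinusoidal lattice of depth V_0, with recoil energy E_R, laser wave number k, and s-wave scattering length a_s set by the magnetic field: brightening the lattice suppresses hopping exponentially while the on-site repulsion grows only slowly, which is how experimenters dial in the strongly interacting regime. The expressions below are approximations, not exact results.

```latex
% Approximate deep-lattice (tight-binding) estimates for a 3D sinusoidal lattice:
% hopping t dies off exponentially with lattice depth V_0, while the on-site
% interaction U grows as a slow power law; their ratio sets the physics.
t \;\approx\; \frac{4}{\sqrt{\pi}}\, E_R
      \left(\frac{V_0}{E_R}\right)^{3/4}
      \exp\!\left(-2\sqrt{\tfrac{V_0}{E_R}}\right),
\qquad
U \;\approx\; \sqrt{\frac{8}{\pi}}\; k\, a_s\, E_R
      \left(\frac{V_0}{E_R}\right)^{3/4}
```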

    The next coolest thing

    The push marks the next chapter in the short, glorious history of ultracold atoms. Atoms can be sorted into two types—bosons and fermions—depending on how much they spin. Thanks to quantum mechanics, the two types behave very differently. Bosons are inherently gregarious. In 1995, two teams independently chilled bosons to below a millionth of a kelvin to coax them into a single quantum wave and produce a state of matter called a Bose-Einstein condensate (BEC) that flows without resistance (Science, 14 July 1995, p. 152). That accomplishment netted a Nobel Prize in 2001.

    Fermions are loners, as no two identical fermions can occupy the same quantum wave or state. Nevertheless, at very low temperatures fermions can get it together to flow freely. First they have to pair, and then the pairs condense into a quantum wave. This is what happens in superconductors, and in 2004, physicists made fermionic atoms pair and condense in much the same way (Science, 6 February 2004, p. 741).

    Given those accomplishments, creating an optical-lattice emulator might seem easy. Electrons are also fermions, so it might appear that researchers need only impose an optical lattice on fermionic atoms already trapped by magnetic fields and laser beams. But researchers have several steps to go before they can emulate the Fermi-Hubbard model and other intractable systems. They must achieve the Mott insulator state, which would pin one atom to each lattice site, and then the antiferromagnetic state, in which neighboring atoms spin in different ways.

    Physicists have made progress. In 2002, Immanuel Bloch, now at the Johannes Gutenberg University of Mainz in Germany, and colleagues reached the Mott insulator state for bosons by loading a BEC of rubidium-87 into an optical lattice and cranking up the brightness of the laser spots to effectively increase the repulsion between atoms. “That was a landmark,” says Randall Hulet, an experimenter at Rice University in Houston, Texas. “That showed that we could do something that was relevant from a condensed-matter perspective.”

    Last month at an American Physical Society meeting in New Orleans, Louisiana, Niels Strohmaier and Tilman Esslinger of the Swiss Federal Institute of Technology Zurich reported reaching the more elusive Mott state for fermions. “We have a few puzzle pieces, and now we want to put everything together,” says Ketterle, who shared the Nobel for BECs.

    Emulators, pronto!

    The DARPA program aims to do just that. Last July, the agency gave three large teams—led by Ketterle, Hulet, and Christopher Monroe at the University of Maryland, College Park—a few million dollars each (DARPA won't say exactly how much) and 2 years to develop a working emulator. In that first phase, researchers will tackle simpler models for which the Hamiltonian can be solved. For example, Hulet's group will study fermions in 1D tubes of light, and Ketterle will aim for the antiferromagnetic state of fermions in a 3D lattice.

    If a team's starter emulation works by July 2009, it will be eligible for a 3-year second phase, in which researchers will tackle an incalculable Hamiltonian. Hulet and Ketterle both hope to emulate superconductivity in the 2D Fermi-Hubbard model. Monroe's team is focusing on bosons in both phases, which do not mimic electrons but should still be useful for simulating exotic magnetic materials.

    To reach the second phase, however, the first-phase emulators must work at lightning speed. The machinery must step through a complete phase diagram in 10 hours, not including the setup time. That's roughly how long it takes the best computer simulations to run, says DARPA program manager Air Force Lt. Col. John Lowell. “You're trying to establish a comparison with other computational techniques, and time is the metric,” he says.

    It sounds like the sort of results-on-demand program that would drive university researchers crazy. However, all voice great enthusiasm for the project. “DARPA really wants you to stay focused on the task at hand, and I find that very productive,” Hulet says. “I've got a schedule on my white board of when things have to get done, and it definitely creates some tension in the lab.”

    Heavy hitters.

    MIT's Wolfgang Ketterle (top) and Rice's Randall Hulet praise DARPA's vision.

    CREDITS (TOP TO BOTTOM): DONNA COVENEY/MIT; JEFF FITLOW/RICE UNIVERSITY

    The challenges ahead

    Making the emulators work won't be easy, physicists say. The biggest hurdle may be getting the atoms cold enough. Researchers may have to chill gases to picokelvin temperatures to emulate the Fermi-Hubbard model. Oddly, they may catch a break getting part of the way there. Theory suggests that if they turn on an optical lattice gently—so as not to add entropy—a gas of fermions should spontaneously cool enough to reach the antiferromagnetic state. But researchers may be underestimating the difficulty of getting even colder to reach the superconducting state, Ohio State's Ho says. “It requires a breakthrough,” he says. “Just doing things the way they are doing them now is as good as praying.”

    Experimenters will also have to devise ways to prove that their emulator is doing what they think it is. A high-temperature superconductor may be messy, but it produces an unambiguous signal that it's working: zero electrical resistance. Atoms in a lattice won't signal so clearly that they have gone superconducting, so proving they have will require subtle new probes.

    Then there is the 10-hour time limit—an odd requirement given that physicists would be happy to have the phase diagram for the Fermi-Hubbard model even if it took years to get it. Some predict DARPA officials won't enforce the limit strictly. “Everybody knows it can be tweaked in the end if need be,” Monroe says. Don't be so sure, Lowell warns. “I wouldn't have laid this out as a milestone if I didn't think it was doable,” he says. “And you wouldn't have signed on to it if you didn't think it was achievable.”

    Where will it all lead? Even leading physicists doubt that they'll produce a black box capable of deciphering any solid. “I see it as unlikely that in the end there will be this universal machine that solves any problem you would like to solve,” says Mainz's Bloch. Some say optical lattices may serve primarily to validate techniques for computer simulation, which will remain the biggest wrench in the theorist's toolbox.

    Nevertheless, all agree that making light crystals could be a revolutionary advance. “It's bloody difficult, but it doesn't seem impossible,” Ketterle says. “Let's stop talking and start doing.” The clock is already running.
