News this Week

Science  11 May 2007:
Vol. 316, Issue 5826, pp. 812
  1. CLIMATE CHANGE

    IPCC Report Lays Out Options for Taming Greenhouse Gases

    1. John Bohannon*
    *With reporting by Eli Kintisch.
    All smiles.

    Demonstrators outside the IPCC meeting reflected the mellow mood of negotiations inside.

    CREDIT: PORNCHAI KITTIWONGSAKUL/AFP/GETTY IMAGES

    BANGKOK—Reining in climate change won't bankrupt the world economy and won't require technological miracles. But we'll have to start soon. That is the mostly upbeat conclusion from Working Group III of the Intergovernmental Panel on Climate Change (IPCC), which met behind closed doors for 3 days last week here in the Thai capital.

    The fruit of the working group's labor is a 35-page document that lays out options—and their price tags—for reducing greenhouse gas emissions to head off catastrophic climate change. The most ambitious plan, which would stabilize greenhouse gas levels in the atmosphere (measured in equivalents of CO2) below 535 parts per million (ppm), would come with an estimated 3% decrease in global gross domestic product (GDP) by 2030 compared to business as usual. Less ambitious targets come cheaper. The easiest option—aiming for under 710 ppm, roughly 50% higher than the current 460-ppm atmospheric concentration of long-lived greenhouse gases—could yield a small net gain for the global economy.

    The report—the executive summary, written by 33 of the several hundred contributing authors, of a review of major economic modeling studies due to be released in September—concludes that getting from today's greenhouse gas-intensive economy to any of these targets is achievable with currently available tools such as shifting to alternative energy sources, boosting energy efficiency, and reducing deforestation, coupled with a suitable mix of caps, taxes, and economic incentives. But other scientists warn that reality will present harder choices than the models suggest. “The only reason for economists to make forecasts is to make astrologers look good,” says Martin Hoffert, a physicist at New York University who has criticized earlier IPCC studies.

    Last-ditch editing

    Reaching consensus on these take-home messages was easier than expected. Media reports had predicted bitter disputes between IPCC member countries. For example, China was expected to insist on softening statements suggesting that its fast-growing, fossil-fueled economy might need to be slowed, whereas the United States was expected to push hard for nuclear power. But in fact, says Dennis Tirpak, one of the summary's authors, who heads the climate change unit at the Organisation for Economic Co-operation and Development in Paris, “the atmosphere was quite civilized.”

    China did put its foot down—over the adjective used to characterize the scientific evidence behind estimates of the cost of achieving emissions targets. China urged that the quality be downgraded from “high” to “medium.” The motivation was “only to protect the scientific integrity of the IPCC,” says co-author Dadi Zhou, a climatologist and deputy director of the Energy Research Institute in Beijing. Others who spoke with Science agree. “China had a valid point, and we adopted it,” says co-author Jayant Sathaye, an energy policy analyst at Lawrence Berkeley National Laboratory in California.

    In the end, only two short passages in the report fell short of unanimous approval. One was four lines stating that with a price of $50 for a ton of emitted CO2, nuclear energy would be cost-effective in providing nearly a fifth of global electricity—with the caveat that “safety, weapons proliferation and waste remain as constraints.” Even that cautious endorsement sparked what Sathaye calls an “adrenaline-fueled” discussion ending with firmly antinuclear Austria insisting on a footnote saying that it “could not agree with this statement.” The other sticking point was a passage on forestry, which drew fire on technical grounds from a delegate from Tuvalu.

    The final result is a document that strikes a far more optimistic tone than did the previous three mitigation reports. At least, that was the mood of the IPCC's buoyant press release, which the media have echoed since its release.

    Climate crystal ball

    But hidden within the text of the report are abundant references to uncertainties and caveats that have gone largely unmentioned.

    For one, many scientists are muttering, the report is only as good as its models. To explore mitigation options, the IPCC uses two distinct strategies. Bottom-up models break the economy down into sectors and predict how different mixes of technologies will cut carbon emissions in each. Top-down models simulate whole economies to compare how different global strategies, such as carbon taxes or fixed greenhouse-gas stabilization targets, will play out through market forces. Each approach has its drawbacks. Bottom-up models tend to ignore economics, whereas top-down models smooth over the differences between regions and sectors. In 2001, the two approaches were often at odds. The good news, says Sathaye, is that “for the first time, the range of results from bottom-up and top-down models are starting to converge.” However, enormous wiggle room remains.
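
    In schematic terms, the contrast can be illustrated with a toy calculation; every number, sector name, and price response below is invented purely for illustration, and the real models cited by the IPCC are far more detailed:

    ```python
    # Toy contrast between the two modeling styles; all values invented.

    # Bottom-up: sum abatement sector by sector from technology cost
    # curves (cumulative Gt CO2 available at or below a carbon price).
    SECTOR_COST_CURVES = {
        "power":     [(20, 1.5), (50, 3.0)],   # ($/t CO2, cumulative Gt)
        "transport": [(20, 0.5), (50, 1.2)],
        "buildings": [(0, 0.8), (50, 1.5)],    # efficiency gains at ~zero cost
    }

    def bottom_up_abatement(carbon_price):
        """Gt CO2 of abatement available at or below carbon_price ($/t)."""
        total = 0.0
        for curve in SECTOR_COST_CURVES.values():
            feasible = [gt for price, gt in curve if price <= carbon_price]
            total += max(feasible, default=0.0)
        return total

    # Top-down: one aggregate price response for the whole economy.
    BASELINE_GT = 50.0   # invented baseline emissions, Gt CO2/yr
    RESPONSE = 0.004     # invented: fraction of baseline abated per $/t

    def top_down_abatement(carbon_price):
        """Economy-wide abatement implied by a single price response."""
        return BASELINE_GT * min(1.0, RESPONSE * carbon_price)

    for price in (20, 50, 100):
        print(price, bottom_up_abatement(price), top_down_abatement(price))
    ```

    Even in this toy version, the two estimates diverge at every carbon price, which is the kind of wiggle room that remains even as the real models converge.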


    One problem is that bottom-up models don't cope well with lifestyle: the preferences that drive people to choose one mix of technologies over another. For example, the report suggests that a broad portfolio of alternative energy sources, such as solar and biofuels, could cut projected annual CO2 emissions in the year 2030 by 5 to 7 gigatons at no cost at all, thanks to savings in energy efficiency. But that conclusion is misleading, says author Richard Richels, an economic modeler at the Electric Power Research Institute in Palo Alto, California, because it ignores the implicit cost of making people choose something they don't want. “If it's advantageous, why aren't people doing it?” Richels asks.

    Since 2001, researchers have worked to make the models more realistic by incorporating such “market feedback,” says Billy Pizer, an economist with Resources for the Future in Washington, D.C., who co-authored a related chapter in the full mitigation report. But it's one thing to account for people's illogical behavior and quite another to persuade them to change it. “It's stuff that pays for itself that people don't do,” he says.

    Steady progress has been made with top-down models, says Jae Edmonds of the College Park, Maryland, office of the Pacific Northwest National Laboratory. The modelers are now accounting for more regional details, such as the availability of land area for biofuels and the potential for storing coal-plant carbon emissions underground. They have also expanded the models to include emissions of greenhouse gases other than CO2, such as methane. Doing so has lowered the top-down estimates of mitigation costs. “The reason is that you have other opportunities to reduce emissions,” says Sathaye. For example, a landfill emitting methane can be cheaper to deal with than a coal plant, but such advantages were lost in previous simulations.

    But top-down models can still run aground on the shoals of international politics. One rosy prediction is that an imposed cost of $100 per ton of CO2—equivalent to an extra $1 per gallon at the pump—could yield a cut of 17 to 26 gigatons of CO2 by 2030, as much as 38% of estimated emissions under a fairly carbon-intensive forecast. But this assumes that the whole world participates in carbon trading and that markets are free and transparent. Given current Indian and Chinese wariness toward carbon caps, says Pizer, “that's not politically likely.”
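
    The pump-price equivalence is easy to sanity-check. A back-of-the-envelope sketch, assuming the commonly cited figure of roughly 8.9 kg of CO2 emitted per gallon of gasoline burned (an assumption; the report is not quoted on this number):

    ```python
    # Sanity check: is $100 per ton of CO2 about $1 per gallon of gasoline?
    CO2_KG_PER_GALLON = 8.9   # assumed: ~8.9 kg CO2 per gallon burned
    PRICE_PER_TON = 100.0     # $ per metric ton of CO2

    surcharge = PRICE_PER_TON * CO2_KG_PER_GALLON / 1000.0
    print(f"${surcharge:.2f} per gallon")  # about $0.89, i.e., roughly $1
    ```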

    Spin control

    Now that the debate over the content of the 1000-page Fourth Assessment Report is done, the battle is shifting to its interpretation. Many IPCC scientists say they are uneasy with the optimistic spin put on the report. “I think something that is being underplayed … is the scale of the mitigation challenge,” says Brian O'Neill, a climate policy modeler at the International Institute for Applied Systems Analysis in Vienna, Austria, who contributed to a chapter on mitigation scenarios. “To limit warming to something near the European Union's stated goal of 2°C, global emissions have to peak within the next decade or two and be cut by 50% to 80% by midcentury.” That's a tall order, O'Neill says—and it could get a lot taller if global temperatures turn out to be more sensitive to increases in greenhouse gases than the IPCC has been assuming. “My point is not that there should be more gloom and doom,” says O'Neill, but “a message that says that we have to stay below 2°C, but don't worry, it will be easy and cheap, just doesn't add up.”

    Diet plan.

    The IPCC report drew on models that calculated global portfolios of emissions reductions needed to reach various target levels of greenhouse gases in the atmosphere.

    CREDIT: ADAPTED FROM THE IPCC FOURTH ASSESSMENT REPORT

    Other researchers say the report's insistence that current mitigation strategies can suffice gives short shrift to future research. That's a mistake, says Hoffert: “It is ludicrous to think a greenhouse-gas emissions price, cap, or tax alone will get you to stable concentrations of [greenhouse gases].” New technologies will be critical, he says, and unless policymakers pave the way with measures such as a gradually increasing carbon tax, those technologies will not be competitive. And Richels fears that if the takeaway message is that mitigation is cheap, societies “may not be as motivated to invest in the future” of such research.

    Overall, the question of whether mitigation is “affordable”—be it 0.3% or 3% of global GDP—is “a difficult one to answer,” says Sathaye. But some say that when stakes are overwhelmingly high, purely economic reasoning misses the boat. “What did World War II cost us economically?” asks Hoffert. “Does the question even make sense?”

  2. CLIMATE CHANGE

    Meanwhile, Back in Washington …

    1. Eli Kintisch
    Price club.

    MIT modeling studies suggest that policies placing different limits on greenhouse gas emissions will have varying impacts on the average U.S. citizen's wealth. Figures are cumulative amounts emitted between 2015 and 2050.

    CREDIT: MIT

    After playing a minor role for years in the U.S. Senate's Energy and Natural Resources committee, a molecule had a coming-out last week: carbon dioxide. The committee was drafting a bill meant to broaden energy independence, including measures on ethanol production, energy efficiency, and carbon sequestration.

    But when a Republican senator from coal-rich Wyoming proposed a measure to boost the production of fuel made from gasified coal, panel Chair Jeff Bingaman (D-NM) balked. Concerned that the technology was unproven and could release too much CO2 into the atmosphere, he asked Democratic members—even those from other coal-rich states, such as newly elected Jon Tester of Montana—to hold the line against the measure. The amendment failed on a party-line vote. Tester said he could support the technique later but that storing carbon emitted from coal-to-liquid facilities was a priority. “The carbon issue is that important,” he said.

    The skirmish “shows how global climate change has arrived as an issue in the debate on energy” in Washington, D.C., says Jim Presswood, a lobbyist for the Natural Resources Defense Council. Last year, when the Republican party controlled Congress, the amendment probably would have passed, Presswood says. But when Democrats took over in January, they made climate change a top priority, and the new speaker of the House of Representatives, Nancy Pelosi (D-CA), set 4 July as a target deadline to pass a House bill that would cap U.S. emissions of greenhouse gases.

    Since then, several factors have fallen into place: One longtime opponent of carbon limits, Democratic Representative John Dingell of Michigan, is listening, with a series of hearings on the idea. And the Edison Electric Institute, which represents American utilities, recently signaled its openness to emission limits—provided they cover all industries and include price controls. President George W. Bush's emphasis on research and voluntary measures no longer holds sway.

    But 4 months into their rule, Democrats are beginning to realize that the new mood in Congress won't translate into new laws overnight. Pelosi has pushed back her timeline as efforts to pass a carbon bill have collided with international implications and state interests—most importantly, coal. Some observers are already saying that major new policies will have to wait until after next year's presidential election.

    For sure, science is getting a different reception on Capitol Hill. Hearings by at least 15 panels since January have touched on everything from the environmental impacts of expanding biofuel production to the effects a cap would have on Detroit's automakers. Climate scientist Stephen Schneider of Stanford University in Palo Alto, California, says the “cordial” and inquisitive atmosphere of the three hearings at which he has testified this year is a welcome contrast to the previous “20 years of combat on the Hill” he's endured, much of it over the very existence of the problem. Longtime foes of carbon restrictions are laying down arms. “My view is changing, as is the view of much of the energy industry,” Representative Rick Boucher (D-VA) said in February, crediting the “deeply solidified” scientific consensus.

    After years of relatively sporadic hearings about confronting climate change, aggressive lobbying by industry, nonprofit activists, and scientists has fueled more than 100 legislative proposals on the topic—about a dozen with mandatory emissions limits. But the deluge of new input “doesn't necessarily make it simpler to get things done,” says David Hunter, an aide to Senator Susan Collins (R-ME).

    Right now the most aggressive emissions limit proposal in Congress belongs to Representative Henry Waxman (D-CA), who wants to cut U.S. emissions 83% from current levels by 2050. A recent analysis by researchers at the Massachusetts Institute of Technology (MIT) suggests that the measure would cut the average citizen's available income by about 2% by 2050. It would hold CO2 in the atmosphere to roughly 460 parts per million (ppm) if China and India begin cutting their emissions by 2025 and stabilize them by 2050. That level, roughly 20% higher than today's, would still mean “additional warming of twice to three times [what] we have seen over the last century,” the MIT study concluded.

    But few believe that bill can fly now: a less aggressive approach, pushed by Senators Joe Lieberman (D-CT) and John McCain (R-AZ), failed in 2005, attracting only 38 votes. So others, including Bingaman, have sought consensus by setting the emission bars lower. Bingaman's carbon-trading proposal includes a so-called safety valve that limits the price that industry and, subsequently, consumers must pay for emitting CO2. The MIT analysis predicts that Bingaman's approach would cost citizens only 0.5% of available income by 2050 while holding CO2 in the atmosphere to about 490 ppm.

    Some lawmakers say it's crucial to pass some bill—even a flawed one—soon. Early U.S. action, they argue, could spur the crucial participation by India and China in an emissions-control regime. “If we take 10 years to get started, the problem will be harder to deal with then,” says Representative Tom Udall (D-NM). But others, including editors at the left-leaning New Republic magazine, have urged the Democrats not to accept compromises for the sake of expedience. “There won't be many chances to get this right, and Democrats will need to wait until they can go for broke,” a March editorial declared.

    Privately, lobbyists on each side of the issue say that only a committed president can muster the political force to broker a deal. Presidential contenders such as John Edwards and Senators McCain and Barack Obama (D-IL) have championed forceful proposals to contain greenhouse gas emissions. Meanwhile, the timeline is the one thing that's becoming clear: “It'll take a ways to pass comprehensive greenhouse legislation,” says Hunter.

  3. PHYSICS

    All Paired Up but Unable to Flow, Atoms Strain Key Conceptual Link

    1. Adrian Cho

    Day leads to night, life leads to death, winter leads to spring; some things necessarily imply others. So it has seemed in physics: At very low temperatures, certain particles pair, and when they do, the pairs inevitably gang up to form a “superfluid” that flows without resistance. That explains how electrons glide through superconductors, how atoms of helium-3 form a liquid with no viscosity, and perhaps, how neutrons circulate through neutron stars. But an experiment reported on page 867 breaks the pairing-to-superfluidity connection. Atoms in an ultracold gas can pair but do not flow without resistance, even at temperatures approaching absolute zero, physicists report.

    “If they have found a [zero temperature] state that has pairing but no superfluidity, that would be revolutionary,” says Mohit Randeria, a theorist at Ohio State University in Columbus. But he cautions that it's too early to rewrite the physics texts.

    How atoms and other quantum particles behave depends on how they spin. Particles can have only certain fixed amounts of spin, and those whose spin is an integer multiple of a basic unit (Planck's constant divided by 2π) are known as bosons. They are sociable particles that at low temperature can crowd into a single jumbo quantum wave, which is the key to superfluidity. In contrast, particles with an extra half-unit of spin are known as fermions and are loners. No two identical fermions can occupy the same quantum wave or state.

    Fermions can get together, however, if they form loose overlapping pairs that act like bosons. In a superconductor, an electron spinning in one direction pairs with another spinning the opposite way, and atoms in ultracold gases can pair similarly. But what happens when the particles spinning one way outnumber those spinning the other way?

    To find out, Christian Schunck, Wolfgang Ketterle, and colleagues at the Massachusetts Institute of Technology in Cambridge studied puffs of lithium-6 atoms. In previous work, they tested for superfluidity by rotating the clouds and looking for whirlpools called vortices, which are sure signs of a flowing quantum wave (Science, 23 December 2005, p. 1892). They fiddled with the ratio of up-spinning and down-spinning atoms and found that superfluidity persisted until the ratio reached about 85:15, with the pairs forcing the leftover up atoms to the cloud's edge. Larger mismatches quashed the superfluidity.

    But in the new experiment, the team has found that even when the ratio is skewed enough to prevent superfluidity, the atoms still pair. The researchers used radio waves to pop the down-spinning atoms into an entirely different quantum state. As they lowered the temperature, they had to increase the energy of the waves by a particular amount. That's exactly what should happen if the atoms pair and extra energy is needed to break the pairs apart, Ketterle says.

    Disconnect.

    When the up-spinning atoms greatly outnumber the down-spinning ones, the atoms still pair, but they do not form a superfluid.

    CREDIT: PRESTON HUEY/SCIENCE

    The finding appears to clash with a theorem stating that fermions that do not form a superfluid cannot pair either. “What we really need now is a rethinking of pairing,” says Rudolf Grimm, an experimenter at the University of Innsbruck in Austria. But theorist Kathryn Levin of the University of Chicago in Illinois says the theorem “just doesn't apply” because it relies on assumptions that aren't valid for the strongly interacting atoms.

    Even so, the experiment marks a “triumph,” Randeria says. He notes that at smaller mismatches, Ketterle and colleagues see the atoms pair above the temperature at which superfluidity is known to set in. Some physicists have argued that the electrons in high-temperature superconductors form such “preformed pairs,” but this experiment provides far clearer evidence, Randeria says. In that much, at least, the coupling between pairing and superfluidity is unraveling.

  4. AIDS DRUGS

    Brazil, Thailand Override Big Pharma Patents

    1. Jon Cohen

    Executing a much-repeated threat, Brazil on 4 May broke sharply with big pharma and for the first time signed a “compulsory license” that allows the country to make or import a generic version of a patented anti-HIV drug. Brazilian President Luiz Inácio Lula da Silva, who signed the decree in a televised ceremony, took this step shortly after Thailand decided on similar action with the same drug—efavirenz—and two others. “Many other countries will likely follow suit,” predicts economist James Love, who runs Knowledge Ecology International, a think tank in Washington, D.C. Love has urged developing countries to issue compulsory licenses, which are permitted by World Trade Organization rules for noncommercial uses of patented drugs, especially if they involve public health.

    Efavirenz is used by nearly 65,000 of the 170,000 people in Brazil now receiving free treatment from the government. Merck offered earlier in the week to cut the price from $580 per patient per year to $400, but Brazil noted that a generic version would reduce costs to about $165—saving the country an estimated $30 million this year alone. In a statement, Merck said it was “profoundly disappointed” by the decision and warned that the “expropriation of intellectual property sends a chilling signal to research-based companies,” contending that they “cannot sustain a situation in which the developed countries alone are expected to bear the cost for essential drugs.” But Pedro Chequer, the former head of Brazil's AIDS program who now works for the Joint United Nations Programme on HIV/AIDS, says, “I am really proud of this wonderful political decision.”
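
    The quoted savings are roughly consistent with the patient numbers. A back-of-the-envelope check, assuming all ~65,000 efavirenz patients switch to the generic (the article does not spell out the calculation):

    ```python
    # Rough consistency check of Brazil's estimated efavirenz savings.
    patients = 65_000
    brand_price = 580.0     # $ per patient per year, original Merck price
    generic_price = 165.0   # $ per patient per year, generic estimate

    savings = patients * (brand_price - generic_price) / 1e6
    print(f"~${savings:.0f} million per year")  # ~$27M, near the $30M estimate
    ```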

    Lifeline.

    As more patients fail to respond to front-rank anti-HIV drugs, the demand grows for inexpensive, next-generation therapies.

    CREDIT: ADREES LATIF/REUTERS

    Thailand faced similar praise and criticism when it issued compulsory licenses for efavirenz in November and then again in January for the anti-HIV drug lopinavir/ritonavir (made by Abbott Laboratories of Abbott Park, Illinois) and the blood thinner clopidogrel (made by Sanofi-Aventis of Paris, France). “Thailand's move has stirred up a hornet's nest,” says Jon Ungphakorn, a former Thai senator who strongly backs his government's actions.

    To the astonishment of Ungphakorn and many others in Thailand, Abbott announced on 14 March that it was pulling applications it had pending to register seven new medicines for sale in Thailand. Then on 30 April, the Office of the U.S. Trade Representative cited Thailand's issuing of compulsory licenses as one reason for elevating the country to the dreaded Priority Watch List, a U.S. government warning to countries that it judges do not adequately protect intellectual property; the designation can drive away foreign investment and affect export tariffs. “It's surprising that the reactions have been so harsh to a move that is perfectly legal,” says Ungphakorn. “What the United States and Abbott have done to Thailand is to send a message to the whole developing world: 'Don't you dare carry out compulsory licenses, or there will be retaliation.'”

    Merck and Abbott say they do not understand why Thailand has yet to accept their latest offers. Merck says it will sell efavirenz to the country for $237.25 per patient per year—a “no profit” price that Brazil said it would have agreed to—while Abbott reduced the price of lopinavir/ritonavir from $2200 to $1000 per patient per year. (Sanofi-Aventis, which sells clopidogrel in Thailand for about $800 per patient per year, did not reply to an interview request.)

    Lawyer Sean Flynn, an intellectual-property expert at American University in Washington, D.C., who supports Thailand's and Brazil's actions, says the countries ideally would like to create competition among generic manufacturers to drive prices as low as possible. And Flynn flatly dismisses the “tired” argument that R&D would be harmed by these compulsory licenses, stressing that the drugs were not initially made for developing countries. “They were created for the European and U.S. markets, and that's where the incentive comes from to invest in developing them,” contends Flynn, adding that patent holders also receive some royalties from drugs sold under compulsory licenses.

    Abbott has taken the brunt of the criticism. AIDS advocates in particular have protested its plans to withdraw the registration of its new drugs, including a heat-stable form of lopinavir/ritonavir that's badly needed in Thailand. “Patients are being penalized,” charges Paul Cawthorne, head of the Thai mission for Médecins Sans Frontières. “It's disgusting and completely unethical.” Such criticism is misguided, counters Abbott spokesperson Dirk van Eeden: “The Thai government said it will not buy it, so why is there a need for us to register it?” he asks.

    Although a handful of countries have issued compulsory licenses for AIDS drugs without kicking up much of a fuss, all involved older, first-generation drugs. Now the second-line treatments are at stake. Economist Love adds that big pharma feels threatened that this movement could go beyond AIDS to heart disease and other ailments. “There's a big push in Thailand to do it for everything,” says Love.

    Merck notes that it “remains flexible and committed to exploring a mutually acceptable agreement” with Brazil, and on 14 May Thailand plans to meet with Merck, Abbott, and Sanofi-Aventis to try again to negotiate lower prices for their products.

  5. GENDER EQUITY

    Women Are Scarce in New NAS Class

    1. Yudhijit Bhattacharjee
    Wrong direction.

    The number of women elected to NAS this year has taken a tumble compared with recent classes.

    SOURCE: NAS

    The number of women elected this year to the U.S. National Academy of Sciences (NAS) is the smallest since 2001 and fewer than half the number chosen in 2005. Only 12% of the new class of 72 announced last week (http://www.nas.edu/) are women, compared to levels approaching 25% earlier in the decade. The dismal showing has prompted criticism from some quarters that NAS is backing away from efforts to promote gender equality. But NAS officials say the meager crop simply reflects the persistent dearth of women at the highest levels of science.

    “I am amazed that the number is so low,” says Jong-on Hahm, who until 2005 served as director of NAS's Committee on Women in Science and Engineering and is now a research professor with the Women's Leadership Program at George Washington University in Washington, D.C. “They seem to have stopped paying attention to the issue.”

    Not so, counters Ralph Cicerone, who became NAS president in 2005. He says this year's total of nine women “is an unpleasant surprise” because activity aimed at increasing women's representation within the academy “is probably at an all-time high.” The academy has been encouraging its members to identify eminent female scientists in their fields and generate “fuller lists” of candidates, says Cicerone. He says he cannot point to a specific reason why the number dipped, however, and the academy has no plans to dissect this year's process.

    But Cicerone says the general underrepresentation of women in the academy is no mystery. “Even though the number of women entering science has been increasing over the years, we are seeing a lagging effect in the composition of the membership, … since it usually takes 25 years or more of research past Ph.D. to achieve the accomplishment required to be elected to the academy,” he explains.

    Critics aren't persuaded by that argument. “It's the nomination process and sometimes the selection process that fails women,” says Nora Berrah, a physicist at Western Michigan University in Kalamazoo and co-chair of the American Physical Society's Committee on the Status of Women in Physics. “Women do not lobby to be nominated, and perhaps we should do it. Also, often the selection process does not have enough women in it.” Berrah is disappointed that only one of the nine new members is from the physical sciences and mathematics. NAS officials would not disclose the composition of the committee that chose nominees in that category, but its share of women was unlikely to have exceeded the academy's overall tally of 10%.

    Although they receive 43% of U.S. Ph.D. degrees awarded in the natural sciences, women face several barriers that prevent “a normal career progression,” says Donna Nelson, a chemist at the University of Oklahoma, Norman. For example, she says, graduate students are sometimes discouraged from selecting a female professor as an adviser, and female professors are sometimes denied access to specialized lab equipment. Similar barriers were documented in a 1999 Massachusetts Institute of Technology study (Science, 12 November 1999, p. 1272). Nelson says such a climate hinders a woman's ability to assemble the necessary credentials.

    Cicerone says academy members “are keen to do more” to expand the pipeline as well as identify more women candidates. One suggestion is to find rising stars early in their careers and mentor them so as to increase their chances of being elected down the road. “We hope this year's number is just a temporary lull,” he says.

  6. BUDGET POLICY

    U.S. Science Adviser Tells Researchers to Look Elsewhere

    1. Jeffrey Mervis

    Hardheaded realist or apologist for the Bush Administration? That's what some U.S. researchers were asking themselves last week after presidential science adviser John Marburger said they needed to rely more on nonfederal funding—in particular, philanthropy and industry—to expand the scientific enterprise, because Congress and the White House cannot sustain the budgetary growth needed to capitalize on scientific opportunities. Several university lobbyists discounted the advice, however, saying those other sectors can't fill the gap that would be left if federal support lags.

    Marburger argued his case last week at the annual Science and Technology Forum, the largest gathering of the year for policy analysts, sponsored by AAAS (which publishes Science). He said that competing societal priorities have held science to a constant slice of the federal pie for the past 40 years (see graph), and that it is unrealistic to expect legislators to grant larger, sustained increases. “I haven't seen any evidence of an increased top line for science,” he told Science after his 3 May speech. “I think that's wishful thinking.”

    The same slice.

    The share of U.S. discretionary spending going to research hasn't changed much since the days of the Apollo program.

    SOURCE: AAAS, FROM 2008 U.S. BUDGET REQUEST

    Marburger spoke glowingly of philanthropies willing to support basic research, citing the Kavli Foundation's network of institutes in the physical sciences (Science, 21 January 2005, p. 340), the myriad medical charities and patient advocacy groups, and university partnerships with industry. But many in the audience later referred to that support as “drops in the bucket” and felt Marburger was simply defending the Administration's policies.

    “Yes, when you cut taxes and create a deficit and spend hundreds of billions of dollars on an unpopular war, it leaves you with precious little to spend on anything else,” fumed Michael Lubell of the American Physical Society. “I don't expect to see any real changes until after the 2008 election.”

    Part of Marburger's comments were aimed at pending legislation that would authorize large increases at several science agencies (Science, 4 May, p. 672). “People probably wonder why Marburger is not more enthusiastic about these authorizations,” the science adviser said in an interview. “I appreciate the desire of Congress to do this, and I feel uncomfortable criticizing them. But it's unrealistic to expect it to happen.”

    His dark analysis also applies to the flattening of the National Institutes of Health budget after its 5-year doubling ended in 2003, which he says created an increased research capacity that the federal government cannot support. Referring to the communities' expectations of continued robust increases, he said in his speech that “I cannot see how such an expansion can be sustained by the same business model that led to its creation. The new researchers will either find new ways to fund their work, or they will leave the field.”

    Michael Rodemeyer, a former longtime Democratic congressional science aide, acknowledges that “it's politically hard” to shift spending toward science but disagrees with Marburger that there is any “iron law” fixing its share of domestic spending. But Dan Sarewitz, another former aide now at Arizona State University in Tempe, thinks that Marburger's underlying message is valid. “It's certainly reasonable to complain that the current Administration's priorities have recklessly wasted the budgetary surplus and made it impossible to make important discretionary investments,” says Sarewitz. “But if this is true for science, then it's true for other areas. … So which ones would science like to go up against?”

  7. BIODIVERSITY

    The Ultimate Life List

    1. Mitch Leslie

    Hands up if you've heard this before: An ambitious new project promises to create an online compendium of all 1.8 million or so described species. It can already claim participation by premier institutions, a wad of start-up cash, and huzzahs from biodiversity guru Edward O. Wilson. Although some confess to a wary sense of déjà vu, taxonomists hope that the Encyclopedia of Life (EOL) can provide the long-awaited comprehensive species catalog. Even enthusiasts agree that it faces some tall hurdles, however, such as signing up curators and getting permission to use copyrighted material.

    Announced this week, EOL involves big names in biodiversity research, including Harvard University and the Smithsonian Institution, and has garnered $12.5 million from the John D. and Catherine T. MacArthur Foundation and the Alfred P. Sloan Foundation. Its plan envisions posting Web pages for each known species. EOL will also provide access to original species descriptions by teaming with the Biodiversity Heritage Library, which is digitizing the pre-1923 taxonomic literature on which the copyright has expired.

    Pages on 50,000 species should be ready by the end of 2008, with 700,000 to 1 million species online by 2011, says EOL's newly appointed executive director, James Edwards. He estimates that the work will take 10 years and cost $70 million to $100 million. A separate group is developing a European equivalent, known as SpeciesBase, and the two projects will swap information.

    Electronic ark.

    E. O. Wilson's idea for a Web-based encyclopedia containing all the species on Earth is now ready for launch.

    CREDIT: JIM HARRISON/HARVARD UNIVERSITY

    If EOL sounds familiar, that's because its brief overlaps with those of several efforts, notably the All Species Foundation, whose chair promised to deliver a Web site for every species (Science, 26 October 2001, p. 769). That project is defunct, but others have managed to cover slices of biodiversity. At one end of the spectrum is the Catalogue of Life, which houses bare-bones taxonomic data—the equivalent of name, rank, and serial number—for more than 1 million species. At the opposite end are lush sites such as FishBase and AlgaeBase, which home in on specific groups and offer illustrated pages on individual species.

    EOL will follow both approaches but differs from these projects in automating information collection. Software will pluck data from FishBase, Catalogue of Life, and other Web sources—a “mashup” in Internet parlance. But EOL will be a curated mashup, with experts crafting a home page for each species that records its classification, alternative names, distribution, habitat, diet, and so on. Users will have the opportunity to build additional wiki-style pages, determining what content to include and who gets to contribute, Edwards says. Birdwatchers could flock together to post sighting records, for example, while molecular biologists might add gene expression data.
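
    As a purely hypothetical sketch of what such a curated mashup might look like in code (the source names are the projects mentioned above, but the record formats and merge rules here are invented):

    ```python
    # Hypothetical sketch of a curated mashup: folding records from
    # several source databases into one draft species page. Record
    # formats and merge rules are invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class SpeciesPage:
        name: str
        classification: str = ""
        distribution: set = field(default_factory=set)
        sources: set = field(default_factory=set)

    def merge(page, source_name, record):
        """Fold one source's record into the draft page; in EOL's plan,
        a human curator would review the result before publication."""
        page.classification = page.classification or record.get("classification", "")
        page.distribution |= set(record.get("distribution", []))
        page.sources.add(source_name)
        return page

    page = SpeciesPage(name="Gadus morhua")
    merge(page, "FishBase", {"classification": "Actinopterygii",
                             "distribution": ["North Atlantic"]})
    merge(page, "Catalogue of Life", {"distribution": ["Barents Sea"]})
    print(page)
    ```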

    Researchers praise the EOL's vision but fret about the execution. “The exercise is only worthwhile if it's more accurate and better coordinated than what's already available on the Internet,” says Frank Bisby, a taxonomist at the University of Reading in the U.K. and co-director of the Catalogue of Life. Even getting the names right for the poorly studied groups that contain much of biodiversity is a challenge, says Joel Cracraft, curator of ornithology at the American Museum of Natural History in New York City.

    Obtaining permission to use post-1923 literature is also an issue, says Donat Agosti, an American Museum of Natural History entomologist who works in Bern, Switzerland. Edwards says that EOL is negotiating with scientific societies and publishers. Although some deals are in the offing, none has yet been announced, he says.

  8. FRENCH SCIENCE

    Researchers Await Changes—and Clashes—After Sarkozy's Victory

    1. Martin Enserink
    Facing off.

    Both Nicolas Sarkozy and his opponent Ségolène Royal stressed the importance of science to France, but Sarkozy proposed more radical reforms.

    CREDIT: CLAUDE PARIS/AP

    PARIS—“We're in mourning,” laments Cécile Wandersman, head of a research unit at the Pasteur Institute. “For me, this is a great hope,” says Jean-Robert Pitte, president of the Université Paris-Sorbonne. Both were talking about this week's election of right-wing politician Nicolas Sarkozy as France's next president.

    As the contrasting comments indicate, Sarkozy's victory and his conservative agenda have divided the scientific community, just as it has French society as a whole. Known for tough talk on law and order, immigration, and morality, the former interior minister is mistrusted and reviled by the left—including many in the academic world. Wandersman, for instance, scoffs at Sarkozy's promise to raise research spending to 3% of gross domestic product by 2012. But to Pitte and many others, his agenda for change—including a shakeup of the higher education system as early as this summer—is just what France's sclerotic research scene needs.

    Research played a larger-than-usual role in this election, with both Sarkozy, who chairs the Union for a Popular Movement (UMP), and his rival, Socialist Party candidate Ségolène Royal, promising to increase science and higher education budgets. That was a victory in itself, says Jules Hoffmann, president of the French Academy of Sciences: “Research has never been this high on the agenda before.”

    But the candidates' opinions diverged on how to address the malaise in French research and the long-running problems at the country's universities. Science and higher education don't mix well in France, because most research takes place at mammoth government institutions such as the National Centre for Scientific Research (CNRS) rather than at the universities. A highly centralized administration system means universities are relatively powerless to set their own agendas; they also suffer from the fact that the smartest young minds typically attend the so-called grandes écoles, which train France's professional and political elite but carry out little research.

    Royal's answer to these woes centered on 10% annual budget increases and revoking the most controversial elements of a research reform bill that President Jacques Chirac's government had introduced last year (Science, 10 March 2006, p. 1371). In contrast, Sarkozy offered more radical reforms that would move the country's education system closer to the Anglo-Saxon model. He has said he will introduce a law within 6 months that would offer universities much more autonomy—for instance, to manage their own budgets and set recruitment and research policies.

    Clear winner.

    Nicolas Sarkozy received 53% of the votes during the second round of the election.

    CREDIT: BENOIT TESSIER/REUTERS

    Sarkozy has also suggested turning the big research bodies such as CNRS into U.S.-style granting agencies that would reward proposals rather than employ scientists—a controversial shift in a country where science usually means a government job for life. To carry out those promises, Sarkozy's UMP will have to retain its majority in the National Assembly during elections next month; polls suggest it will.

    Sarkozy's plans have alarmed Sauvons la Recherche (SLR), a left-leaning movement that brought thousands of researchers to the streets in 2004 to protest cuts to science budgets by the Chirac government. Nine days before the runoff, SLR called on its members to vote for Royal. Sarkozy seems intent on rushing his higher education plan through Parliament without properly consulting the scientific community, says SLR President Bertrand Monthubert. Turning France's research organizations into funding agencies would create more uncertainty for investigators and make science careers even less attractive, he says: “What works in Britain or the U.S. doesn't necessarily work in France.”

    But Pitte argues that more autonomy for universities is “absolutely needed”—and he hopes Sarkozy will go further. Universities should have the right to raise tuition fees and to select the best students rather than admitting everyone who qualifies, says Pitte. Those reforms go against France's egalitarian streak and are bound to trigger protests, he admits; his own Sorbonne was paralyzed for over a month last year by student revolts that eventually brought down a labor law already adopted by Parliament. This time, says Pitte, “I hope the government will be courageous and hard.”

    Bernard Bobe, an economist at the Ecole Nationale Supérieure de Chimie in Paris, notes that Sarkozy, like Royal, has failed to address the old split between universities and grandes écoles. What's more, he is not convinced that the research and education system will be a high priority for Sarkozy, who has announced ambitious plans on a raft of other issues. France's science system has proven extremely resistant to reform, Bobe notes; “I think Sarkozy has the courage, but I'm not sure he has the ambition” to succeed where others have failed.

  9. GENOME-WIDE ASSOCIATION

    Closing the Net on Common Disease Genes

    1. Jennifer Couzin,
    2. Jocelyn Kaiser

    Huge data sets and lower-cost analytical methods are speeding up the search for DNA variations that confer an increased risk for diabetes, heart disease, cancer, and other common ailments

    CREDIT: GETTY IMAGES

    After years of chasing false leads, gene hunters feel that they have finally cornered their prey. They are experiencing a rush this spring as they find, time after time, that a new strategy is enabling them to identify genetic variations that likely lie behind common diseases. By scanning the genomes of thousands of people and comparing the sick with the healthy, biologists are uncovering markers for DNA sequences they believe clearly increase the risk of type 2 diabetes, cancer, heart disease, inflammatory bowel disease, and other debilitating ailments.

    Their new tool, known as the genome-wide association (GWA) study, derives its power from the Human Genome Project and the more recent Haplotype Map that catalogs human genetic variation. The hunt has been sped along as well by the plummeting cost of gene scanning and by efficient gene-chip technologies available only in the past 2 years.

    What sets these studies apart from earlier gene discoveries claimed for the same diseases is that the new associations are statistically far more powerful and highly unlikely to be due to chance. Researchers are also confident about a flurry of new results because they've been recorded again and again in populations studied by independent teams. Fueling the excitement is a sense of surprise: “Most of these genes were not on anybody's candidate gene list,” says David Cox, chief scientific officer of Perlegen Sciences in Mountain View, California, which uses whole-genome scanning to identify drug targets. Cox recently co-authored a paper identifying a new genetic variant that raises heart disease risk and has another in the pipeline on breast cancer. He and many others expect the discoveries to point toward novel biology worth exploring.

    At the same time, this wave of GWA studies is studded with caveats. Although many agree that the findings are real, few scientists believe that they should be quickly put to clinical use—for example, to evaluate a person's risk of having a heart attack. Scientists haven't sorted out how these genes might interact with the environment, or how lifestyle changes might modulate the risk they confer. “There's going to be some scrambling to catch up on the clinical side,” says Nancy Cox, a human geneticist at the University of Chicago in Illinois.

    Furthermore, these first studies may have identified only the strongest associations, with many more genes still to be dug up. Finding them will likely require an unusual degree of cooperation in this intensely competitive field.

    Uncommon beginnings

    The new discoveries mark a major break with the past in part because their sweep is so broad. Traditionally, geneticists focused on single genes with potent effects, typically looking at large families riddled with rare diseases, such as cystic fibrosis, Huntington's disease, or inherited forms of cancer. By tracking a small number of genetic markers linked to disease in such families, researchers successfully homed in on the culpable gene.

    These family “linkage” studies lacked the power to pick up genetic variants that have a modest effect or that may interact with environmental exposures, however. And yet it is these variants, which may raise risk by 50% or less, that could play a key role in common, complex diseases. (The exception is work by deCODE Genetics in Reykjavik, which has used linkage methods and a proprietary database containing information on much of Iceland's adult population to find some common disease genes.)

    As an alternative to traditional linkage studies, researchers have tried searching for “candidate genes” known to play a role in some biologic process, such as insulin production. They looked for associations between mutations in these candidates and common diseases. Hundreds of studies have reported such associations. But few have been reproduced more than once or twice.

    The new strategy that's blossomed this spring has fundamentally altered the gene-hunting landscape. Rather than work with a few thousand genomic markers, scientists are now using gene chips that can scan an individual's DNA sample for anywhere from 100,000 to 500,000 or more single-base changes. Known as single-nucleotide polymorphisms (SNPs), these changes are selected to reflect patterns of common genetic diversity. Such high SNP density makes it much easier to detect culprit DNA changes. In addition, because the cost of using such chips has dropped sharply, scientists can test DNA from thousands of people, adding power to their studies.

    By compiling the SNPs from every DNA sample in a database, and comparing the SNPs in, say, heart attack patients to those in healthy people, it's possible to discern even subtle genetic signals that contribute to blocked arteries. The signals in GWA studies, emanating from a SNP or set of SNPs, don't necessarily mean that the SNPs themselves are influencing disease but rather that they're located in or near the problem DNA.
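
    At its core, each marker amounts to a case-control frequency comparison. A minimal sketch of one such single-SNP test, using hypothetical allele counts and an off-the-shelf chi-square test from SciPy (real GWA pipelines add quality control, correction for population structure, and stringent multiple-testing thresholds):

    ```python
    # Minimal single-SNP association test on hypothetical allele counts.
    from scipy.stats import chi2_contingency

    #                 risk allele, other allele
    cases    = [1200, 800]    # 1,000 patients -> 2,000 alleles
    controls = [1000, 1000]   # 1,000 healthy controls -> 2,000 alleles

    chi2, p, dof, _ = chi2_contingency([cases, controls])
    print(f"chi2={chi2:.1f}, p={p:.2e}")

    # With ~500,000 SNPs tested, a Bonferroni-style cutoff of
    # p < 0.05 / 500_000 = 1e-7 is a common bar for significance.
    ```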

    Still, “one size never fits all,” and other strategies will still be necessary to identify the DNA behind disease, says Kathleen Merikangas, a genetic epidemiologist at the National Institute of Mental Health in Bethesda, Maryland. For example, GWA studies cannot discern rare variants or extra copies of genes, which can have a strong effect on physiology.

    Gene bonanza

    The list of diseases and traits examined with GWA began as a trickle 2 years ago with a highly touted paper on macular degeneration; it hit on a new gene in severely affected patients. Last October came a study that identified a gene involved in memory, and in December yet another gene surfaced that, depending on the version inherited, increased the risk of Crohn's disease—an inflammatory bowel disorder—by as much as 56%, or decreased it by 74%, in one group tested. All three studies have since been replicated.

    This year, the results began to come fast and furious. In the first week of April, a new gene variant involved in prostate cancer and confirmation of a second were described in Nature Genetics. This new variant appears to raise the risk of prostate cancer by 58% and, in combination with other variants, including five in the same region discovered with a different technique, could explain a large proportion of cases in African Americans, according to the researchers who found them.

    Late April brought a trio of online papers in Science that presented three novel diabetes gene variants; the studies also confirmed a handful of other findings, including a gene involved in controlling body weight that was published weeks earlier. Last week brought two independent reports, also published online in Science, of a genetic variant associated with heart disease; in the one-quarter of Caucasians tested who carry two copies, it increases the chance of a heart attack by more than 50%.

    “There is a great deal of excitement, because everyone realizes the field is changing so fast,” says Judy Cho, who led the group that found the Crohn's gene and directs the inflammatory bowel disease center at Yale University.

    The pace, most believe, will only quicken. The Wellcome Trust Case Control Consortium, a collaboration of 24 human geneticists in the United Kingdom, will soon report findings on seven diseases studied with GWA, including bipolar disorder, rheumatoid arthritis, and hypertension. Papers on type 1 diabetes and breast cancer are nearing publication; the latter will report findings similar to those for prostate cancer, says co-author Bruce Ponder of Cancer Research UK Cambridge Research Institute—a handful of new genes that slightly raise risk.

    Clinical questions

    What do all these discoveries mean for medicine? With the notable exception of the macular degeneration gene variant, which raises disease risk about two to three times in those with one copy, most of the genes found so far boost risk only incrementally. Although a 50% increase may sound substantial, in absolute terms it's modest. For example, says Ponder, a 60-year-old woman's breast cancer risk is 3%. If she carries two copies of the most potent breast cancer variant found by his group, which makes her 1.6 times more likely to develop breast cancer, her overall risk increases to just 4%.

    Rising to the top.

    In a genome-wide association study for type 2 diabetes, 386,731 genetic markers were tested, plotted here by chromosome. Those above the higher line appeared to be significantly associated with disease.

    CREDIT: THE DIABETES GENETICS INITIATIVE/BROAD INSTITUTE OF MIT/HARVARD/LUND UNIVERSITY/NOVARTIS

    It's also unclear how lifestyle modifications affect the total risk these variants confer—in other words, whether someone can use the information to lower disease risk, say, by exercising or dieting. On the other hand, if the gene is common, it may have a major impact on disease prevalence across a population, notes Merikangas. Although the risk may not be worrisome for an individual, a gene with modest effects may account for many cases of disease.

    These concepts are difficult to convey to patients. “There is a complete disconnect” between what the general public expects from susceptibility genes for diseases such as these, says Cox of Perlegen, and how they should be applied clinically. “The public … really loves the concept of personalized medicine,” says Cox. “They don't understand, and I don't think we've explained, that having something that's statistically meaningful does not mean having something that's clinically relevant to them.”

    Still, many predict that companies will jump at offering “risk predictors” directly to consumers. DeCODE has already released a test for TCF7L2, the highest-risk diabetes gene, which increases the chance of disease by about 40% among those tested so far who carry one copy, and plans to offer a test for the heart disease marker. But most academic researchers say they need to know more about how these variants cause disease before diagnostic tests like this one can be useful. “I'm not at all enthusiastic about rushing out to test people in the clinic” for these genes, says David Altshuler of the Broad Institute in Cambridge, Massachusetts, who helped lead one of the teams studying diabetes.

    For many biologists, interest focuses not only on risk assessment but also on a longer-term clinical application: better understanding diseases and designing improved treatments to combat them. “The exciting part to us is, it's opening up completely new hypotheses,” says Lon Cardon of the University of Oxford, a collaborator in the Wellcome Trust consortium.


    Many of the new results point to genes in unsuspected stretches of the genome, or to regulatory regions between genes, ignored by studies focusing on candidate genes. For example, at the start of their project, members of one of the teams that described new variants for type 2 diabetes compiled a list of 1000 candidate genes they thought they might find. None of the nine variants the group detected were on that original list, says Altshuler.

    The findings could also point researchers in new directions, to pathways not previously contemplated as drug targets. For example, the newly discovered heart disease variant falls in the same region on chromosome 9 as one of the new diabetes variants. “This is a stunner,” says Francis Collins, head of the National Human Genome Research Institute in Bethesda, Maryland, who helped lead one of the diabetes teams, because it could help explain why some people are vulnerable to both diabetes and heart disease.

    The next wave

    To unravel the deeper biology, scientists need to find still more susceptibility genes and understand how they interact. “We need to now understand what happens if you have three risk alleles, or five, or seven,” says Thomas Hughes, who collaborated with Altshuler and is global head of diabetes and metabolism research at the drug company Novartis. Some could enhance or, alternatively, neutralize the effects of others, notes Stephen Chanock of the National Cancer Institute in Bethesda, Maryland. But finding this second wave of disease variants will be much tougher, because their effects will likely be smaller—and will require massive data sets in order to be detected, researchers say.
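
    One way to see why combinations matter: under the simplest assumption that per-variant odds ratios multiply (a log-additive model, used here purely for illustration with invented values), several modest variants compound quickly:

    ```python
    # Illustrative only: compounding of modest risk variants under an
    # assumed multiplicative (log-additive) model. Values are invented.
    per_variant_odds = [1.2, 1.3, 1.15, 1.25, 1.2]

    combined = 1.0
    for odds_ratio in per_variant_odds:
        combined *= odds_ratio

    print(f"combined odds ratio: {combined:.2f}")  # ~2.69 from five variants
    ```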

    Many scientists say that the best way to speed progress is by sharing data fresh from the genotyping labs, even before it's been analyzed for gene-disease associations. But pooling data is more complicated than it sounds. One problem is that researchers may have to track down people who provided their DNA to a study to get their consent to have it distributed widely. DeCODE CEO Kári Stefánsson, who says the company's informed consent agreement could preclude data sharing, has another concern: He argues that because investigators have often collected data on clinical measures using different methods, pooling the data could lead to spurious associations. “It may be extraordinarily misleading,” he says.

    But the reality is that many common disease gene variants will likely go undiscovered without large collaborative efforts. The three diabetes teams, for example, were able to firmly pin down markers only because they shared their data, says Hughes. “Things that were deeply buried in our [gene] list … start to shine when you pool them with larger and larger populations,” he notes. Cho, the Crohn's researcher, admits that she and her colleagues rushed their first, potent gene into print, but now she realizes that “we have to combine forces with other people” to find the rest.

    The culture is already changing. For instance, the investigators who lead the National Institutes of Health (NIH)-sponsored Framingham Heart Study will make available later this year SNP data and clinical information, stripped of identifying details, on 9000 residents of Framingham, Massachusetts, who have participated in the decades-long health study. And the Wellcome Trust consortium, which draws on existing cohorts in the U.K., already provides SNP data on controls to other researchers and later this summer will add data on subjects with diseases. To his surprise, says Cardon, “we didn't find any [investigators] who did not want to [contribute].”

    Hoping to encourage such sharing, NIH expects to issue a new data-release policy in a few months that will request that any U.S. investigator receiving funding for GWA studies deposit their data sets in a central repository. (Only results of genetic analyses will be posted publicly; the SNP data on individuals, which could in theory be used to identify a person, will be available to qualified investigators who agree not to distribute it publicly.)

    The proposed NIH policy, which will also restrict patenting, will ask that data be submitted as quickly as possible but provide a window—perhaps 9 months—to allow the original investigators to publish on it first.

    Scientists expect that this first round of GWA genetic discoveries will taper off in a year or two as they begin the slow, hard work of untangling what these genetic variants do. But among many biomedical researchers, there is a sense that the field of genomic medicine has entered a new phase, one that will finally test the promise of the Human Genome Project. “Are we on the threshold of something?” says Neil Risch, a human geneticist at the University of California, San Francisco. “I think so.”

  10. ECOLOGY

    Back to the No-Analog Future?

    1. Douglas Fox*
    1. Douglas Fox is a freelance science writer based in northern California.

    Fossil pollen and climate models suggest a messy world in 2100, as surviving species reshuffle into entirely new combinations, creating “no-analog” ecosystems

    Brownout.

    With warmer winters, mountain pine beetles have chewed through millions of hectares of forests in British Columbia.

    CREDIT: NATURAL RESOURCES CANADA, CANADIAN FOREST SERVICE

    Fly over northern Indiana, and you'll see a quilted landscape of corn and soybeans, punctuated by glacial lakes. The gelatinous mud in those lakes has preserved plenty of fossil pollen, from which paleoecologists have reconstructed a record of the region's past. Now, that same fossil pollen is providing a glimpse into Earth's ecological future—and it's not a pretty picture.

    It suggests that, if the climate changes over the next 100 years as current models predict, surviving species throughout much of Earth's land area will not simply migrate north and south en masse as unchanging communities, as Charles Darwin once believed. Instead, they are likely to be reshuffled into novel ecosystems unknown today. If that view is even partly correct, then the task of preparing for, or even predicting, the ecological effects of climate change just got a whole lot harder.

    Analyses over the past several decades have shown that during the last North American ice age, as the Laurentide Ice Sheet retreated into Canada 17,000 to 12,000 years ago, the region from Minnesota to Ohio to Tennessee supported a forest of spruce, sedge, oak, ash, and hophornbeam—an ecosystem that simply doesn't exist today, despite the fact that all of those species still survive. These odd communities—called “no analog” ecosystems because no modern counterparts for them exist—likely arose from odd combinations of climate variables such as precipitation, temperature, and seasonal variations that also don't exist today, say John Williams of the University of Wisconsin, Madison, and Stephen Jackson of the University of Wyoming in Laramie.

    Williams helped demonstrate the connection between no-analog communities and climate during his Ph.D. work at Brown University in the late 1990s, when he compared pollen and climate records for dozens of field sites across the eastern United States. That result, published in 2001, piqued Williams's interest in whether climate change over the next century might lead to a similar type of ecosystem reshuffling—and whether these changes could be predicted. “It was a logical next step,” says Williams, “to think about the future.”

    To find out, Williams and Jackson teamed up with John Kutzbach, a climate modeler at the University of Wisconsin, Madison. They have analyzed the outputs of standard climate models to try to map geographic areas that are likely to experience novel climates, which in turn could result in no-analog communities. In a paper published online in the Proceedings of the National Academy of Sciences (PNAS) on 27 March, they project that by 2100, depending on which climate scenario and model they use, 4% to 39% of the world's land area will experience combinations of climate variables that do not currently exist anywhere on the globe. Areas with these novel climates are likely to develop no-analog ecosystems.

    Jackson, Kutzbach, and Williams fed two standard greenhouse scenarios into their models: the pessimistic A2 scenario, in which CO2 concentrations reach 850 parts per million (ppm) by 2100, and the more optimistic B1 scenario, in which CO2 climbs to 550 ppm. They divided the world's landmasses into grid cells measuring 2.8° latitude by 2.8° longitude and, for each cell, looked at what the models predict for four climate variables: mean summer temperature, mean winter temperature, mean summer precipitation, and mean winter precipitation.

    For each spot on the map, they compared the forecast climate in 2100 with baseline climate from 1980 to 1999. To test whether these forecast climate changes would be sufficient to reshuffle ecosystems, they compared them with variations in climate that underlie different ecosystems in the same geographic area today (for example, deciduous forest and pine forest).
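
    As a rough sketch of this kind of grid-cell comparison, the toy code below labels a cell's 2100 climate “novel” when its distance to every present-day cell exceeds a cutoff. The standardized Euclidean metric, the scaling, the threshold, and the random stand-in data are all assumptions made for illustration, not the team's published method.

    ```python
    import numpy as np

    def min_climate_distance(future_cell, baseline_cells, scale):
        """Smallest standardized Euclidean distance from one future grid
        cell's climate to any present-day grid cell's climate."""
        diffs = (baseline_cells - future_cell) / scale  # per-variable scaling
        return np.sqrt((diffs ** 2).sum(axis=1)).min()

    rng = np.random.default_rng(0)
    baseline = rng.normal(size=(1000, 4))                      # toy present-day climates
    future = baseline + rng.normal(0.5, 1.0, size=(1000, 4))   # toy 2100 climates
    scale = baseline.std(axis=0)                               # variability per variable

    # A 2100 cell is "novel" if no present-day cell lies within the cutoff;
    # swapping the roles of baseline and future flags disappearing climates.
    novel = np.array([min_climate_distance(f, baseline, scale) > 2.0
                      for f in future])
    print(f"{novel.mean():.0%} of toy grid cells have novel climates")
    ```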

    Not only will novel climates appear, according to the analysis, but existing climates will disappear for 4% to 48% of the world's land area. In other words, the conditions that now exist in these areas will not be found anywhere in the world by 2100. These globally disappearing climates signal the likelihood of significant ecological disruption, if not necessarily no-analog ecosystems. “This is a conservative analysis,” says Jackson. “If we added more climate variables, we'd probably end up with more disappearing and novel climates.”

    New climates are expected to cause ecosystem reshuffling as individual species, constrained by different environmental factors, respond differently. One tree may be limited by summer rains that hold back seedling recruitment, for instance, whereas another species may be limited by winter freezes that control insect pests. Some species may migrate up-latitude or up-elevation, while others stay put. An ecosystem might see many species vanish—but also new arrivals.

    Williams and colleagues project that the tropics, including Amazonia, will see the most pronounced no-analog climates, with rising temperatures pushing these already-warm areas outside of any climate known today. Soaring temperatures combined with drought could selectively kill taller, canopy-forming trees—rapidly transforming ecosystems by increasing sunlight and drying at ground level.

    Within North America, the team predicts that the southeastern United States will see no-analog climates, driven by a selective rise in summer temperatures. The result could be increased wildfires in forests that are poorly adapted to fire, leading to rapid opening of the canopy unless those forests are managed aggressively.

    “I applaud their work,” says William Hargrove, a landscape ecologist at Oak Ridge National Laboratory in Tennessee. “We've seen an explosion of climate-change models turning out results, but I don't see as much work on the prognostic impact of these results on ecosystems.” In a similar exercise, published in 2005, Hargrove and colleagues mapped changes in seven climate variables across the United States. They predicted a higher rate of vanishing climates by 2100 than did Williams, Jackson, and Kutzbach. Hargrove attributes some of this difference to the fact that his team considered not only climate but also other ecologically relevant variables, such as soil type and landscape topography. “If you consider only climate,” says Hargrove, “you may underestimate the magnitude of ecological change.”

    One of the biggest issues raised by novel and disappearing climates is whether species whose preferred climates disappear locally can migrate to other areas where suitable climates still persist. As described in PNAS, Williams, Jackson, and Kutzbach performed a second analysis examining this very question. For each point on the map where the climate changes by 2100, they examined the surrounding areas to determine whether the current climate would persist elsewhere within a 500-km radius. Imposing this constraint increased the proportion of disappearing climates to 14% to 85% of the world's land area, depending on the climate scenario and model.
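
    The same toy framework extends naturally to this constrained search: restrict candidate analogs to grid cells within 500 km before asking whether any still matches. The 500-km radius comes from the article; the great-circle distance formula is a standard choice, but the function names, cutoff, and data layout below are hypothetical, not taken from the paper.

    ```python
    import numpy as np

    EARTH_RADIUS_KM = 6371.0

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between points, in kilometers."""
        lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
        a = (np.sin((lat2 - lat1) / 2) ** 2
             + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
        return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

    def has_nearby_analog(cell_latlon, cell_climate, future_latlons,
                          future_climates, scale, radius_km=500.0, cutoff=2.0):
        """True if some 2100 grid cell within radius_km still resembles
        this cell's present-day climate."""
        dists = haversine_km(cell_latlon[0], cell_latlon[1],
                             future_latlons[:, 0], future_latlons[:, 1])
        nearby = dists <= radius_km
        if not nearby.any():
            return False
        diffs = (future_climates[nearby] - cell_climate) / scale
        return bool((np.sqrt((diffs ** 2).sum(axis=1)) <= cutoff).any())
    ```

    Shrinking radius_km tightens the set of acceptable analogs, which is why imposing the constraint raises the disappearing-climate fraction so sharply.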

    Brave new world.

    Pessimistic (A2) and optimistic (B1) greenhouse scenarios predict that novel climates will appear across the tropics by 2100, while current climate types disappear in the tropics and the higher latitudes. Color scale represents degree of difference from current climate, with yellow-orange-red indicating significantly different climate by 2100 and substantial risk for developing no-analog ecosystems.

    CREDIT: WILLIAMS ET AL., PNAS 104, 14 (2007)

    Migration corridors are often proposed as a strategy to facilitate the movement of species in response to climate change, but this analysis suggests that there may often be no suitable refugia nearby. “Even if these species can migrate quickly enough,” says Jackson, “they may effectively have nowhere to go.”

    Rapid shifts in the pollen record lead some people to predict that future changes could be sudden as well. Some argue that it's already happening, with droughts across the Southwestern United States between 2000 and 2005 killing over 80% of adult piñon trees in some stands, drastically altering the piñon-juniper woodlands. If drought frequency increases in the Southwest, as some climate models predict, then other species could seize the niche vacated by those dead piñons, permanently altering the landscape's canopy structure, hydrology, and fire regime. “It's going to be interesting to see what happens there in the next 20 years,” says Jackson. “This ecosystem may or may not come back.”

    Others remain circumspect. Craig Allen, a landscape ecologist who observed the die-offs firsthand from the U.S. Geological Survey's (USGS's) Jemez Mountains Field Station in New Mexico, points out that the proximate cause of most piñon mortality was beetle infestation. That means many juvenile trees survived, and these could restore piñon populations within decades if further droughts don't intervene.

    The prospect of novel climates has people rethinking traditional goals such as maintaining native ecosystems. “That's probably going to be impossible,” says Nathan Stephenson, a research ecologist at the USGS Western Ecological Research Center in Three Rivers, California. “But what you can still do, even if you can't maintain native communities, is potentially maintain regional biodiversity and ecosystem functions.”

    Nowhere is this point of view more evident than in the management of forests—where human intervention has always been heavy. In British Columbia, climate-related surges in mountain pine beetles and fungus have browned millions of hectares of trees in recent years. Ecologists there are thinking about how to maintain a forest that will provide reliable watersheds, wildlife habitat, and lumber supplies into the future. The solution could involve planting different mixtures of tree species or replanting forests using seed stock from warmer areas, says Del Meidinger, an ecologist with the British Columbia Ministry of Forests and Range in Victoria. “You have to plant something that will survive now but will still grow well into the future” in a changed climate, says Meidinger.

    Land managers would love to predict how ecosystems will reorganize, what sorts of no-analog communities might emerge, and which species will dominate. Ecologists have produced niche models that predict species' future geographic distributions based on climates in their current locations. But that approach may break down when it comes to future no-analog climates, says Williams. “You're limited by what you can observe today,” he says. “It's a real problem for making ecological forecasts for climates that are outside the current range of observation.”

    One promising approach to making better forecasts is to base them on experimentally determined physiology. Ronald Neilson, a bioclimatologist with the Forest Service's Pacific Northwest Research Station in Corvallis, Oregon, is developing such a model. It predicts drought-induced mortality and fire by calculating how much leaf area can be supported by local moisture levels, based on measured rates of leaf water loss. “What we're simulating at the moment is plant functional types,” says Neilson. “We want to get to the species level.”
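
    As a toy version of that water-balance logic (not Neilson's actual model), leaf area can be capped by the moisture available to replace what leaves transpire; the function and numbers below are invented for illustration.

    ```python
    def max_leaf_area_index(seasonal_moisture_mm, water_loss_mm_per_lai):
        """Largest leaf area index whose seasonal transpiration demand
        can be met by available moisture (toy water-balance cap)."""
        return seasonal_moisture_mm / water_loss_mm_per_lai

    # E.g., 300 mm of plant-available water against 75 mm of seasonal
    # water loss per unit of leaf area supports an LAI of about 4:
    print(max_leaf_area_index(300.0, 75.0))  # 4.0
    ```

    In such a scheme, a drought that cuts available moisture below what the standing leaf area demands registers as mortality or fire risk.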

    This year, Neilson and Michael Loik of the University of California, Santa Cruz, will take the first step in that direction. They'll begin a seedling transplantation study to measure the physiological tolerances of a single species: Jeffrey pine, in the eastern Sierra Nevada of California. It's an effort-intensive approach, to be sure, but understanding the implications of a no-analog future might require no less.

  11. NEUROLOGICAL DISORDERS

    The Mystery of the Missing Smile

    1. Greg Miller

    Genetic studies and an ineffective abortion drug have provided some of the few clues researchers have about a rare disorder that hampers facial expressions

    Stern countenance.

    Tim McCaughan says his condition leads people to take him more seriously than he intends.

    CREDIT: MIKE AHLERS/CNN

    BETHESDA, MARYLAND—People can't always tell when Tim McCaughan is joking. Several years ago, a recently hired colleague congratulated him on a promotion, saying she was sure it was well-deserved. “Well, how would you know?” McCaughan quipped, playfully suggesting that maybe he didn't deserve the promotion after all. But his face didn't convey that he was joking, and the woman thought he was being a jerk. McCaughan has a rare neurological condition called Möbius syndrome that limits his ability to smile and make other facial expressions.

    “People often take me much more seriously than I really am,” says McCaughan, a senior producer at CNN who oversees the news network's coverage of the White House. He eventually managed to smooth things over with his colleague, who became his wife a few years later. “I'm still living that one down,” he says.

    McCaughan's case of Möbius syndrome is on the milder side of a wide spectrum. Many people have more extensive facial paralysis that impairs their speech and causes difficulties with eating. Limb deformities such as clubfoot often accompany Möbius syndrome. And for some unknown reason, autism appears to be far more common in people with the syndrome than in the general population.

    An international group of neurologists, geneticists, and other specialists gathered here recently for the first scientific conference* on this mysterious congenital disorder. They explored possible genetic and environmental triggers, discussed potential treatments, brainstormed research strategies, and hashed out a consensus set of diagnostic criteria. “This is really at such an early stage,” says John Porter, a program director at the National Institute of Neurological Disorders and Stroke, one of the meeting's sponsors. The meeting was a good start, but cracking the biology of Möbius syndrome isn't going to be easy, Porter says. “I think the mechanistic insights are going to take a while.”

    Frozen faces

    The syndrome is named for Paul Julius Möbius, a German neurologist who published an early description of it in 1888. (He was also the grandson of August Ferdinand Möbius, the mathematician of Möbius strip fame.) According to a statement developed at the conference, the syndrome's defining characteristics are facial weakness and impaired ability to move the eyes to the side—symptoms that are present at birth and don't worsen with age. Researchers estimate that Möbius syndrome occurs in 1 of every 50,000 live births, affecting boys and girls equally often.

    The core symptoms of Möbius syndrome point to defects in two cranial nerves: the abducens nerve, which innervates the lateral rectus muscles that rotate the eyes toward the side of the head; and the facial nerve, which innervates the muscles of the face. Yet, there doesn't seem to be a single neuropathological signature of the disorder.

    At the conference, George Padberg, a neurologist at the University Medical Center in Nijmegen, the Netherlands, described magnetic resonance imaging studies he and colleagues have done to visualize the nervous system in people with Möbius syndrome, as well as findings from electrophysiological tests of nerve function. This work has revealed a variety of defects. In some patients, the cranial nerves appear to be damaged or even missing. Others have abnormalities in the brainstem that include—and often extend beyond—the region where the abducens and facial nerves originate. Based on these and other findings, Padberg suspects that Möbius syndrome results from genetic miscues that derail the embryonic development of the brainstem.

    But the search for the relevant genes has yielded little fruit so far. The rarity of the disorder, coupled with the fact that only about 2% of cases are inherited, makes it difficult to find a sufficient number of subjects for genetic linkage studies, says Ethylin Wang Jabs, a geneticist at Johns Hopkins University in Baltimore, Maryland. The complexity of the disorder and lack of precise diagnostic criteria have also complicated matters, Jabs says. Padberg's group, for example, has published studies identifying regions of chromosome 3 and chromosome 10 as likely loci of genes related to inherited Möbius syndrome in two Dutch families, but other researchers point out that individuals in these families lack the eye-movement irregularities necessary to qualify as true cases of Möbius syndrome. (Padberg now agrees.)

    Now that there's a more precise definition of the disorder, the next step for finding Möbius genes, Jabs and others say, will be to create a central database in which researchers can share clinical and genetic data on Möbius patients. Jabs has started a database that now includes clinical data and/or DNA samples from 89 people with Möbius syndrome and more than 100 relatives, and other research teams have similar data.

    Researchers are also looking to related disorders and mouse models of brain development for clues. At the conference, Elizabeth Engle, a pediatric neurologist at Children's Hospital Boston, described her team's research on several inherited neurological conditions that share symptoms with Möbius syndrome. Athabascan brainstem dysgenesis syndrome (ABDS), named for the Native American population in which it was first described in 2003, causes impaired lateral eye movements and sometimes facial weakness as well. Similar symptoms had been reported in mice lacking a gene called Hoxa1, one of a family of genes that guide embryonic development. People with ABDS inherit a truncated copy of the human version of the gene, HOXA1, Engle and colleagues reported in 2005 in Nature Genetics. It's possible that spontaneous mutations in HOXA1 could be involved in Möbius syndrome, Engle says, but so far no one has looked. Jabs has been screening her Möbius patients for mutations in two other Hox genes, HOXB1 and HOXB2, based on findings of facial nerve abnormalities in mice lacking these genes. So far, however, nothing has turned up.

    A troubling drug problem

    Garbled genes aren't the only way to get Möbius syndrome. Since the mid-1990s, dozens of cases of Möbius syndrome have been linked to misoprostol, a drug commonly used by women in Brazil to induce abortion. Elective abortion is illegal in Brazil, but misoprostol is cheap and widely available, says pediatric ophthalmologist Liana Ventura of Fundação Altino Ventura, a medical charity in Recife, Brazil. Although misoprostol is used in three-quarters of abortion attempts in Brazil, it is not particularly effective: Up to 80% of pregnancies continue to term, and about 20% of those result in an infant with Möbius syndrome, Ventura says. Misoprostol is typically used in the first trimester of pregnancy, and in the sample of Möbius children Ventura has worked with, misoprostol exposure occurred on average about 40 days after conception.

    Something to smile about.

    Möbius syndrome robbed Chelsey Thomas of a smile (left); plastic surgeons gave her a new one in two stages.

    CREDIT: COURTESY OF LORI THOMAS

    Some researchers have proposed that Möbius syndrome can result from a transient interruption in fetal blood circulation, and Ventura and others think the misoprostol findings fit with that idea. One possibility is that uterine contractions evoked by the drug disrupt fetal blood supply during a crucial stage of development, causing neural circuits in the brainstem to be permanently miswired.

    Other researchers are exploring the apparent link between Möbius syndrome and autism. Research teams from Sweden, Canada, and Brazil reported at the conference that roughly a third of their Möbius patients have autism spectrum disorders; teams from the United States and the Netherlands reported autism rates of 5% or less, however. One possibility is that the miswiring of the brainstem that occurs in Möbius syndrome somehow predisposes people to autism. Another, more speculative hypothesis is that the limited facial expressions in infants with Möbius syndrome hinder social interactions early in life, thereby stunting the development of the brain's social circuitry and leading to social impairments characteristic of autism.

    “We have evolved to use our faces as a primary means of communication, both through speech and facial expressions,” says Karen Schmidt, a biological anthropologist at the University of Pittsburgh in Pennsylvania who studies facial behavior. An infant with Möbius syndrome is less able to smile and interact with others, and many children with Möbius syndrome are shunned by their peers. The long-term effects of such relative social deprivation could be substantial, Schmidt says.

    Unfortunately, there's little help for the neurological symptoms of Möbius syndrome. One dramatic exception is “smile surgery” developed by plastic surgeon Ronald Zuker at the Hospital for Sick Children in Toronto, Canada. At the conference, Zuker described the 8-hour procedure, which he has performed in hundreds of children since the late 1980s. Zuker's team transplants a small piece of muscle from the patient's thigh to the face and positions it so that it will raise the upper lip when it contracts. To innervate the transplanted muscle, the surgeons usually reroute a nerve that innervates the masseter, the muscle that raises the lower jaw during chewing. Initially, the patients need to think about clamping their jaws to fire the nerve and elicit a smile, Zuker says, but with time the smile becomes more automatic.

    Zuker showed several before-and-after videos that revealed striking differences. One boy, when asked to smile prior to surgery, could only muster an expression that looked closer to a grimace or frown, the corners of his mouth moving slightly sideways and downward. After surgeries on both sides, his smile was unmistakable, and he even seemed to modulate it according to whether he actually felt like smiling or was merely indulging the cameraman for the umpteenth time.

    Still, smiles aren't for everyone—at least not all the time. McCaughan, whose work at CNN has given him the opportunity to travel with and interview several U.S. presidents over the years, says his condition sometimes works in his favor. “I'd say I've got the best deadpan in the business when asking a question.”

    • *Moebius Syndrome Foundation research conference, 24-25 April 2007.
