News this Week

Science  06 Aug 2010:
Vol. 329, Issue 5992, pp. 614
  1. Cancer Research

    As Questions Grow, Duke Halts Trials, Launches Investigation

    1. Jennifer Couzin-Frankel

    When biostatisticians Keith Baggerly and Kevin Coombes began poking around a cancer study back in 2006, they never imagined the furor that would eventually follow. Their questions about a prominent genetics group at Duke University in Durham, North Carolina, not only raised doubts about a popular method of tumor analysis but also led to accusations that a cancer researcher, Anil Potti, had padded his resumé. Clinical trials at Duke based on these methods have been suspended, Duke is investigating the researcher in question, and there are calls for a general review of the field.

    The controversy involves the work of Potti and cancer geneticist Joseph Nevins, both of Duke. Their study of tumor genetics was first questioned by Baggerly and Coombes of the M. D. Anderson Cancer Center in Houston, Texas, in early 2007. The contretemps has gone back and forth since then. It reached a turning point this summer when The Cancer Letter reported that Potti's resumé identified him as a Rhodes scholar even though the Rhodes Trust does not list him as one. The Duke clinical trials were suspended, and Duke said in a statement last week that it has placed Potti on administrative leave.

    Potti did not respond to an e-mail seeking comment for this story. In an e-mail message, Nevins said he would like to speak out about the research, as “the stories that have been told to date, and how they have been told, are very skewed and one-sided.” But he added that because “the timing is not good” it would be best not to comment now.

    The Duke project began on a hopeful note: Like many researchers interested in personalizing cancer treatment, Potti and Nevins were examining patterns of genetic behavior in tumor cells, called gene expression signatures. It has been a popular field in recent years, as physicians look for ways to give patients only the drugs that will help them the most. But the gene signatures used to define tumor types—and there are many candidates out there—have been difficult to replicate. Only a handful have seemed reliable enough to use in the clinic.

    In October 2006, Potti and Nevins described in Nature Medicine how examining the sensitivity of cell lines to particular drugs could predict patients' responses to cancer therapies for a range of cancers. Based on this work, Duke launched three clinical trials that used the signatures to help determine which therapy patients should get.

    Other clinics were just as keen to try this idea, including M. D. Anderson. But first, its physicians wanted to be sure the data were reliable, and they asked Baggerly and Coombes to give the Nature Medicine paper a careful look.

    “We had difficulties pretty early on,” Baggerly says. He and Coombes say they found errors in the 2006 paper, including genes that didn't seem to belong on the list and tumor samples that were incorrectly labeled. They were in communication with the Duke authors, Baggerly says, but “we did not receive an answer that satisfied us.” In the spring of 2007, Baggerly and Coombes sent a communication to Nature Medicine detailing their concerns; the journal published it in November 2007. “We'd identified a problem, we'd talked with the authors, we'd written a letter to the journal, the journal was going to print our letter,” Baggerly says. “We figured, okay, this is how the process of checking and calibrating should work.”

    Under scrutiny.

    Anil Potti's work using gene signatures to guide cancer treatment has been challenged.

    CREDIT: DUKE PHOTOGRAPHY

    But Potti and Nevins continued to publish papers using the same method. This troubled Baggerly. He became obsessed with determining why the Duke team could make their prediction models work when he and Coombes could not. In subsequent papers by the Duke group, Baggerly says he found new errors and contacted both Lancet Oncology and the Journal of Clinical Oncology (JCO), where they had appeared. Both declined to publish Baggerly's letters but later printed corrections covering some points.

    Spokesperson Laura Livingston said that JCO was now in the process of tracking down this correspondence for a review of the case. Lancet Oncology declined to comment on the particulars.

    Steven Goodman, an epidemiologist and biostatistician at Johns Hopkins University in Baltimore, Maryland, first heard Baggerly give a talk on the subject about 2 years ago and was taken aback. “This was really, really serious stuff,” he says, and he told Baggerly that this needed to appear in the major journals.

    Baggerly then learned that Duke was running three clinical trials using the Potti-Nevins approach to assign patients to treatment. He took a new tack: publishing a paper of his own. He and Coombes shared their critique of several papers published by Potti and Nevins with a “prominent” biological journal, he says, whose editors suggested that the paper was too negative. They then submitted it to a high-level statistical journal, the Annals of Applied Statistics, where it was published online 2 weeks later, last September. Within weeks, Duke suspended its trials—only to restart them in January after a review by the university gave approval. It wasn't until Potti's resumé came into question last month that the trials were stopped again.

    The case threads together two tricky issues: the difficulty of correcting the scientific record and the difficulty of interpreting gene expression data, in which “our intuition is actually pretty poor” about what's right and what's not, says Baggerly.

    Dogged.

    Keith Baggerly couldn't replicate Potti's work.

    CREDIT: COURTESY KEITH BAGGERLY

    “How can we do better in the future so we don't end up in this situation?” asks Sharyl Nass, who directs the Institute of Medicine's (IOM's) National Cancer Policy Forum. The Cancer Letter reported last week that the forum has asked IOM to give it the green light to examine clinical use of these signatures. Cancer biologists are also concerned that the Duke case will cast a shadow over gene signature research. The Duke research has “done a disservice to the community,” says David Beer, who studies the molecular genetics of lung cancer at the University of Michigan, Ann Arbor. He hopes it won't tar the field.

    After the allegation that Potti may have padded his resumé, Duke announced that it was launching a “full, external review of the science” behind the clinical trials. “That painted everything in a new light,” says Michael Cuffe, vice dean for medical affairs at Duke's School of Medicine. Duke also announced last week that it has launched “a formal institutional investigation related to Dr. Potti's biographical claims.” Adds Cuffe: “There's a substantial difference” between the resumé allegations and “a debate about data integrity.” JCO is investigating too; Lancet Oncology published an Expression of Concern about the research. The New England Journal of Medicine, where Potti also published, and Nature Medicine say they're awaiting the results of the Duke inquiry.

  2. Biosecurity

    Defining Select Agents by DNA Sequence

    1. Yudhijit Bhattacharjee

    To work with dangerous pathogens such as anthrax, U.S. researchers must follow strict rules governing so-called select agents, or potential bioweapons. But what about a DNA sequence ordered from a company that contains some of the genes that make anthrax deadly? Currently, such an entity—or an artificial organism designed with such DNA—would not be subject to the same regulations despite its potential as a bioweapon, simply because of the way select agents are defined.

    A report by the National Academies this week recommends plugging this loophole with a new system of defining select agents based on DNA sequences. “That would provide a very sharp, bright line” to help gene-synthesis companies and their clients decide if a genomic sequence “meets the definition of a select agent or not,” says Sean Eddy, a biologist at the Howard Hughes Medical Institute's Janelia Farm Research Campus in Ashburn, Virginia, and one of the report's authors. He says the proposed classification system could also help gene-synthesis companies and government officials spot potential bioterrorism plots involving novel organisms cobbled together from different pieces of custom-ordered DNA.

    Concerns about artificially designed bioagents have risen with advances in synthetic biology. Many gene-synthesis companies already screen orders against known pathogenic genome sequences. Last November, the U.S. government published draft guidelines for how such screening ought to be done but did not make them mandatory.

    The classification system the academies' panel recommends would clarify requirements and make them legally binding. To be called a select agent, a stretch of DNA would have to meet at least three criteria: It would have to contain a minimum number of sequences or genelike units with pathogenic functionality; these units would have to closely match known pathogenic sequences; and the stretch of DNA would have to contain other generic parts required to create a fully functional synthetic organism.
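
    To make the proposed three-part test concrete, here is a minimal sketch, in Python, of how a gene-synthesis screening tool might apply the criteria. Everything here—the reference sequences, the thresholds, the toy matching metric—is a hypothetical illustration; the report recommends the criteria themselves, not any particular implementation.

    ```python
    # Hypothetical sketch of the three-part select-agent test described above.
    # Reference data, thresholds, and the matching metric are illustrative only.

    PATHOGENIC_REFERENCE = {
        "toxin_gene_A": "ATGGCGGGCAATCTTGCA",   # placeholder sequences
        "toxin_gene_B": "ATGAAAAAACGAAAAGGT",
    }
    GENERIC_PARTS = {"origin_of_replication", "promoter", "ribosome_binding_site"}

    MIN_PATHOGENIC_UNITS = 2   # criterion 1: minimum number of pathogenic units
    MIN_IDENTITY = 0.90        # criterion 2: close match to known sequences


    def identity(a: str, b: str) -> float:
        """Fraction of matching positions over the shorter sequence (toy metric)."""
        n = min(len(a), len(b))
        return sum(x == y for x, y in zip(a, b)) / n if n else 0.0


    def is_select_agent(units: dict, annotated_parts: set) -> bool:
        """Apply all three criteria to an ordered DNA construct."""
        # Criteria 1 and 2: count units closely matching known pathogenic sequences.
        hits = sum(
            any(identity(seq, ref) >= MIN_IDENTITY
                for ref in PATHOGENIC_REFERENCE.values())
            for seq in units.values()
        )
        # Criterion 3: the construct must also carry the generic parts required
        # for a fully functional synthetic organism.
        return hits >= MIN_PATHOGENIC_UNITS and GENERIC_PARTS <= annotated_parts


    order = {"u1": "ATGGCGGGCAATCTTGCA", "u2": "ATGAAAAAACGAAAAGGT"}
    print(is_select_agent(order, GENERIC_PARTS))  # True: all three criteria met
    ```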

    Anthrax would still be anthrax, but a synthetic construct containing pathogenic sequences from anthrax and meeting other criteria would now be defined as a select agent. A do-it-yourself bioterrorist could still piece together an agent by ordering different pathogenic sequences and generic sequences from a variety of companies; Eddy suggests addressing this concern by getting companies to share information with each other and identify a potentially troubling client.

    CREDIT: ISTOCKPHOTO

    Developing the proposed select-agent system would require help from the whole community, Eddy says, and resulting classifications would have to be reviewed. For example, Eddy says, if a researcher were to point out, “You dummies, you just classified the vaccine strain of [Bacillus] anthracis as a select agent,” it might require redefining a particular rule of classification.

    Jonathan Tucker, a biosecurity expert at the Monterey Institute of International Studies in Washington, D.C., says implementing such a classification system could be challenging. One problem is that “a lot of these select-agent sequences have not been adequately annotated,” he says. Because the consequences for researchers could be serious, he says, “the government would have to invest in a standard sequence database and a screening algorithm and make that available to researchers to prevent them from inadvertently ordering what could be a select agent.”

  3. Energy

    Fresh Start for Fusion Project as New Leader Shakes Up Management

    1. John Travis

    Last week, ITER, the €16 billion international effort aiming to prove the viability of fusion as an energy source, shook off nagging worries that soaring costs and management problems, combined with Europe's economic woes, could lead to the downsizing or even killing of the experimental reactor. ITER's governing council finally approved the project's so-called Baseline, an extensive document outlining its cost, schedule, and design, and officially named Osamu Motojima, former director-general of Japan's National Institute for Fusion Science, as ITER's new leader.

    The major steps forward brought almost audible sighs of relief from fusion scientists, who can now look forward to 2019, when ITER is supposed to produce its first plasma. “I was beginning to wonder if we were ever going to nail down cost and schedule. Europe had been dragging its feet,” says Steven Cowley, director of the Culham Centre for Fusion Energy in Abingdon, U.K. “We're over this nasty hump. Yes, ITER is going to cost a lot of money, but we're going to do it.”

    That hump included a fierce debate over how the European Union, which is responsible for 45% of ITER's budget, would cover a €1.4 billion shortfall in short-term funding for the increasingly expensive project. In June, E.U. member states declined to inject new money into ITER, recommending instead the use of funds already allocated to other research efforts. This outraged nonfusion scientists, and in July the European Commission rejected the advice; current plans are to take just €400 million from existing research budgets and draw the rest from money allocated to agricultural subsidies and other uses.

    In an attempt to forestall future budget battles, the ITER Council last week placed a cap on the overall budget of the reactor, to be built in Cadarache, France. Building ITER on time and on budget will be a “tough job,” Motojima admits.

    “Fusion is not a dream but a real target.”

    —OSAMU MOTOJIMA, ITER

    CREDIT: © ITER ORGANIZATION

    He isn't moving slowly, however. The day after his appointment, Motojima announced plans to overhaul ITER's operations. A management review requested by the ITER Council had earlier this year criticized the project's governance, and several senior managers have recently departed. Motojima intends more changes. “Simplify everything; that is the only possible way to respond to the capping of the project,” he says. One casualty of this streamlining will be Norbert Holtkamp, ITER's principal deputy director-general and leader of the project's construction since 2006. An ITER spokesperson confirmed that Holtkamp would soon step down and that the position would be eliminated. “We need to simplify the decision-making process,” Motojima says.

    Even though the approval of ITER's Baseline is supposed to signify an end to major changes, Motojima will also request that the project's scientists and engineers seek new ways to simplify the fusion reactor's design and the integration of its many components, which are being built by the project's seven international partners—China, the European Union, India, Japan, the Republic of Korea, the Russian Federation, and the United States. Motojima says the ITER Council wants him to present cost-saving plans at a meeting in November. Any such changes won't mean that ITER will produce significantly less science, Motojima emphasizes: “I'm keeping the [original] scope of ITER.”

    If anyone can pull that off, it may be Motojima, who is widely praised for his oversight of the construction of another fusion experiment, Japan's Large Helical Device. “He's a real machine-builder” and “also has a real directorial presence,” says Cowley. Indeed, fusion scientists say that Motojima's appointment and the departure of former ITER Director-General Kaname Ikeda, a career diplomat with an engineering background, represent an acknowledgement that the project has moved on from securing funding to a phase dominated by construction. ITER's key components are now being built, Cowley notes. Even more promising, he says, industrial bids for other components are in line with cost predictions.

    Although some European politicians called for killing ITER in favor of promoting more immediate renewable-energy projects, Motojima argues that fusion science has rapidly matured over the past half-century. “Some say fusion is always a dream. This is not true. … Fusion is not a dream but a real target,” Motojima says.

  4. Physics

    Diamond Feats Give Quantum Computing a Solid Boost

    1. Robert F. Service

    Quantum computing may finally be ready to grow up. Over the past 3 decades, physicists have learned to use the quantum behavior of atoms to store and process a handful of bits of information. But they've never managed to scale up quantum computers the way the computer industry has integrated millions of transistors on chips.

    Now, a pair of new results brings that goal a step closer. Researchers in California report creating a way to vastly scale up the production of quantum bits (or qubits) in a diamond wafer, a leading contender for making a solid-state quantum computer. Meanwhile, researchers in Massachusetts have linked the quantum state of one such qubit in diamond to the quantum state of a photon, the basic particle behind light. The result may open the door to linking large amounts of quantum information in the solid state to photons of light that can carry the information over long distances.

    “Both [papers] taken in isolation are extremely important contributions to being able to use diamond as a platform for doing quantum information processing in the solid state,” says Ray Beausoleil, a physicist and HP fellow at HP Laboratories in Palo Alto, California. Taken together, Beausoleil adds, the papers could make it possible both to scale up quantum computers and to pass their data over a distributed quantum network.

    For decades, researchers have longed to build quantum computers because of the unique way they store and process information. In conventional computers, each bit of data exists in one of two states, either a “1” or a “0.” Quantum computers take bits much further. Each qubit can exist as either a 0 or a 1, or as a “superposition” of all its possible states. For example, it might be 19% 0 and 81% 1, or 65% 0 and 35% 1, or countless other in-between combinations. In carrying out its operations, a quantum computer weighs all such values simultaneously. As a result, stringing just 30 qubits together would give a quantum computer the computational power of a conventional computer running at 30 trillion operations per second.
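
    The scale-up payoff is a counting argument: n qubits span 2^n basis states, and a superposition assigns an amplitude to every one of them at once. The short NumPy sketch below illustrates the bookkeeping; it is a generic illustration, not tied to any system described in this story.

    ```python
    # Illustrative qubit bookkeeping; a generic sketch, not any lab's setup.
    import numpy as np

    # One qubit is a normalized amplitude vector over the basis states |0> and |1>.
    # Measurement probabilities are squared magnitudes: here 19% "0" and 81% "1".
    qubit = np.array([np.sqrt(0.19), np.sqrt(0.81)])
    assert np.isclose(np.sum(np.abs(qubit) ** 2), 1.0)

    # n qubits together span 2**n basis states, all weighted simultaneously.
    n = 30
    print(f"{n} qubits span {2**n:,} basis states")  # 1,073,741,824
    ```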

    Qubits aren't just theoretical playthings. Researchers have made them out of many different ingredients, including trapped ions and superconductors. They've also entangled multiple quantum states together so that manipulating one bit causes a predictable change in its neighbor. But they've struggled with the scaling step that has defined the success of conventional computers.

    That's where diamond may have an edge. In 2006, researchers found that when they inject nitrogen atoms into a wafer of crystalline diamond, the atoms not only insert themselves into the carbon lattice but also can kick out carbon atoms. If a nitrogen atom winds up next to a vacancy, one of its electrons can form a stable qubit with a property called spin, which researchers can manipulate with radiofrequency (RF) signals, microwaves, or laser light. Nestled inside the diamond lattice, isolated from outside influences, such nitrogen-vacancy (NV) centers can maintain their quantum states much longer than qubits in rival solid-state quantum-computing setups can.

    Upscale.

    Array of NV centers in diamond.

    CREDIT: D. M. TOYLI ET AL., NANO LETTERS (PUBLISHED ONLINE 23 JULY 2010) © 2010 AMERICAN CHEMICAL SOCIETY

    Creating NV centers in diamond has been slow going, as researchers typically make them by firing nitrogen atoms through a tiny aperture one at a time. In a paper published online in Nano Letters on 23 July, however, researchers led by David Awschalom, a physicist at the University of California (UC), Santa Barbara, report that they created a 60 × 60 array of NV centers in diamond by shining a beam of nitrogen atoms through a thin mask containing 3600 apertures. The mask, made with a conventional high-resolution technique known as electron-beam lithography, can be scaled to any size. The UC researchers also showed that they could use RF electronics to control the quantum state of individual and neighboring NV centers, a development that sets the stage for using those neighbors to carry out complex quantum computations. “This brings the ability to do nearest-neighbor information processing much closer to realization,” Beausoleil says.

    This week in Nature, meanwhile, researchers led by Harvard University physicist Mikhail Lukin report that they can reliably link quantum information in an NV center to the polarization state of photons. The feat marks the first time solid-state qubits have been linked to light, Lukin says, and it opens the way for using diamond-based qubits for long-distance quantum communication, cryptography, and distributed computing. Beausoleil cautions that it's still far too early to know whether diamond-based quantum computers will beat out their rivals. Even so, it looks as if the field is finally set to stop crawling and to start walking.

  5. ScienceInsider

    From the Science Policy Blog

    The U.S. Environmental Protection Agency (EPA) has rejected petitions asking it to halt the planned regulation of greenhouse gases. Some cited e-mails made public as part of the “Climategate” affair to question scientific aspects of the issue. But EPA says there's “no evidence” to suggest climate data were suspect.

    Mirroring its House of Representatives counterpart and the president's request, a Senate spending panel has proposed a $1 billion boost for the budget of the National Institutes of Health. The bill contains $50 million for the Cures Acceleration Network, a drug-development program that some scientific groups worry could come at the expense of funding proposals from scientists.

    A House committee hearing explored open access in scientific publishing. Advocates of making papers freely available say informing the public is a noble goal, while publishers worry that expanding a 2-year-old NIH policy to more agencies will hurt the scientific enterprise.

    A Spanish National Research Council panel has recommended the retraction of a paper published last year in Science that described an enzyme-monitoring chip called the reactome array (Science, 9 October 2009, p. 252). But some scientists, including a Nobel laureate who conducted a blind test of the technology, maintain that the array works.

    Senate Majority Leader Harry Reid (D–NV) has dropped a legislative mandate on the use of alternative energy from an energy package he introduced. The announcement came a week after Reid said that limiting carbon emissions via Senate legislation was impossible because he could not muster 60 votes.

    Dispersed oil droplets could have unknown and dangerous effects in the Gulf of Mexico, say scientists. Issues include their small size and the makeup of the dispersant molecules used by BP.

    For more science policy news, visit news.sciencemag.org/scienceinsider.

  6. Infectious Disease

    New Map Illustrates Risk From the 'Other' Malaria

    1. Gretchen Vogel

    It may be the malaria you've never heard of, but nearly 3 billion people are at risk of infection with the malaria parasite Plasmodium vivax, according to a new analysis published this week in PLoS Neglected Tropical Diseases. The P. vivax parasite has long been considered the milder, less-dangerous cousin of P. falciparum. But recent studies have made clear that P. vivax can also cause deadly complications in infected people. A new map of areas where the parasite has been reported, and where conditions are right for mosquitoes to transmit it, should serve as a wake-up call for those who hope to eliminate malaria (Science, 14 May, p. 849), says epidemiologist Carlos Guerra of the University of Oxford in the United Kingdom, who developed the map with his colleagues. Although P. vivax does often cause milder symptoms, it can lie dormant for months or even years in patients, making it difficult to cure people and to identify carriers—thereby complicating efforts to eliminate the disease from a region.

    “P. falciparum has traditionally received most of the attention,” Guerra says, mainly because it kills so many people in sub-Saharan Africa. Globally, however, “P. vivax deserves equal attention, and yet there are still many fundamental gaps in our understanding of this parasite,” he says. Most of the 2.85 billion people at risk—91%, according to the new analysis—live in Central and Southeast Asia. About 5.5% of at-risk people—160 million—live in the Americas, and 100 million live in Africa, Yemen, and Saudi Arabia.

    Overlooked threat.

    A new analysis estimates the risk of vivax malaria across the globe, taking into account climate, public health, and genetic data.

    CREDIT: C. A. GUERRA ET AL., PLOS NEGLECTED TROPICAL DISEASES 4 (AUGUST 2010)

    The map divides regions of the world into areas of high transmission, low transmission, and areas in Africa where most of the residents carry a genetic trait that makes them less likely to be infected by P. vivax. The researchers started by identifying all countries in which the vivax parasite is endemic. They then looked at the number of cases per year in subregions of the countries, categorizing them as zones of stable or unstable (rare) transmission. They further refined the map by excluding areas in which temperatures are too low or conditions too dry for mosquitoes to transmit the parasite. The scientists also drew on knowledge from on-the-ground public health workers to identify additional places that could be deemed malaria-free. Finally, they determined the areas in which more than 90% of residents carry the so-called Duffy negative genetic trait, leaving them without an antigen that the parasite uses to infect cells.
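
    Read as an algorithm, the team's layering is a per-location decision cascade. The sketch below restates that logic in Python in the order described above; the field names and function are hypothetical, not the authors' actual pipeline.

    ```python
    # Hypothetical restatement of the map's layered logic; not the authors' code.
    def classify_location(loc: dict) -> str:
        if not loc["country_endemic"]:
            return "malaria_free"
        if loc["too_cold"] or loc["too_dry"]:        # mosquitoes cannot transmit
            return "malaria_free"
        if loc["declared_free_locally"]:             # on-the-ground knowledge
            return "malaria_free"
        if loc["duffy_negative_fraction"] > 0.90:    # residents largely resistant
            return "duffy_negative_zone"
        if loc["cases_rare"]:                        # unstable transmission
            return "low_transmission"
        return "high_transmission"


    example = {
        "country_endemic": True, "too_cold": False, "too_dry": False,
        "declared_free_locally": False,
        "duffy_negative_fraction": 0.05, "cases_rare": True,
    }
    print(classify_location(example))  # low_transmission
    ```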

    Not everyone agrees with the methods. Richard Cibulskis of WHO's Global Malaria Programme says the spatial resolution of the map is not very high, which leads to an overestimation of the number of people at risk. “The median population size of a district in India is 1.5 million, and the whole district might be erroneously classified as at risk, whereas malaria could be occurring in only part of the district,” he says.

    But Peter Zimmerman of Case Western Reserve University in Cleveland, Ohio, who studies vivax malaria, says underestimation of the disease has long been a serious problem. He says that because P. vivax is more difficult to diagnose than P. falciparum, cases often go underreported, especially when people are infected with both parasites at once. “The renewed global effort [to eliminate malaria] needs to wake up to the persistent public health burden of vivax malaria,” he says. “I hope that [the map] causes serious discussion among malaria researchers.”

  7. ScienceNOW.org

    From Science's Online Daily News Site

    Traffic Jam in Orbit Vying for geosynchronous orbit, more than 400 telecommunications satellites occupy a narrow band of space some 35,000 kilometers above Earth's equator. Now, researchers say that attaching solar sails to satellites could reduce the congestion. Computer models developed by aerospace engineer Colin McInnes and graduate student Shahid Baig of the University of Strathclyde in the United Kingdom reveal that the photons of sunlight streaming across the solar system carry enough momentum to push a satellite arrayed with a solar sail into a stable, geostationary-like orbit displaced from the crowded ring above Earth's equator. The satellite could also maintain its new position without the need for heavy, liquid-fueled thrusters, the pair reported in the Journal of Guidance, Control, and Dynamics.

    CREDIT: SUZANNE LACROIX/MICHIGAN STATE UNIVERSITY

    Tough Food, Better Bite Throw a coyote a bone, and you may just change the shape of its skull. Ethologist Suzanne LaCroix of Michigan State University in East Lansing and colleagues randomly split related coyotes into two groups: one gnawed on sheep and cow femurs, and one dined exclusively on a soft diet similar to canned dog food. At 18 months old, the bone-chewing coyotes consumed rawhide treats more than three times as fast and ate nearly 1.5 times as much of a portion of beef shank as did coyotes without access to bones as pups, LaCroix reported at the 47th Annual Meeting of the Animal Behavior Society in Williamsburg, Virginia.

    As adults, the bone-gnawers also had significantly shorter and wider mouth bones, bigger chewing muscles, and a more prominent sagittal crest, the ridge of bone at the top of the skull to which these muscles attach. The researchers say this is the first time food has been shown to have such a dramatic impact on the anatomy of any animal.

    Marijuana Time Warp People who smoke pot can feel lost in time—for some, it's part of the draw. Now researchers may have figured out one reason why.

    The brain's suprachiasmatic nucleus (SCN) controls the 24-hour physiological cycle known as the circadian rhythm, using light to reset the clock. But SCN neurons also possess receptors for cannabinoids, the psychoactive compounds in marijuana. To find out what role these receptors play, a team led by Yale University circadian biologist Anthony van den Pol first housed 42 mice in total darkness for 2 weeks until they synchronized their daily internal clocks, spending 12 hours dormant and 12 hours active. Then the researchers shined a light into some of the cages shortly after the mice became active. Because mice are nocturnal, they became active about 2 hours later in the day than did mice not exposed to light. But mice given brain injections of cannabinoids before light exposure became active only 1 hour later than did the controls.

    When the researchers added cannabinoids to mouse SCN cells in a petri dish, the cells fired about 50% more frequently. This increased activity likely mucks up the circadian rhythm in a live mouse, the researchers reported in The Journal of Neuroscience; cannabinoids may have a similar effect in humans.

    CREDIT: CRISTINA SANTIESTEVAN/BBPP

    Clues to Origins of HIV's Ancestor Monkeys on an island that separated from West Africa thousands of years ago could help unravel the puzzling origins of AIDS, according to a study presented at the 18th International AIDS Conference in Vienna.

    HIV-1, the main virus driving the AIDS epidemic, likely entered humans from chimpanzees in the early 1930s. Chimps are infected with a related virus called SIVcpz, a blend of SIVs from two different monkey species. But scientists don't know when these SIVs made the leap from monkeys to chimps.

    To find out, virologist Preston Marx of the Tulane National Primate Research Center in Covington, Louisiana, isolated SIVs from four different monkey species on the island of Bioko. One species, the Bioko drill (Mandrillus leucophaeus poensis), has a mainland counterpart that also harbors SIV. By looking at changes in the viruses' RNA, and calibrating this “molecular clock” based on the island's separation 12,000 years ago, Marx's team calculated that a virus related to the Bioko drill's SIV infected chimpanzees at least 22,000 years ago, much earlier than previous estimates. At a minimum, the SIVs are 76,000 years old, which may explain why they cause no harm in infected African monkeys, Marx noted: The hosts have had more time to evolve protective immune responses.
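
    The molecular-clock step is a rate calibration: the divergence accumulated between island and mainland strains over a known 12,000 years fixes a substitution rate, which then converts other divergences into dates. The sketch below reproduces only the logic; the divergence values are placeholders, not Marx's data.

    ```python
    # Toy molecular-clock calibration. Only the 12,000-year island split and
    # the ~22,000-year result come from the article; divergences are invented.
    CALIBRATION_YEARS = 12_000           # Bioko's separation from the mainland

    def rate_per_year(divergence: float, years: float) -> float:
        """Substitutions per site per year, fixed by a dated split."""
        return divergence / years

    def age_of_split(divergence: float, rate: float) -> float:
        return divergence / rate

    island_vs_mainland = 0.006           # placeholder divergence sets the clock
    clock = rate_per_year(island_vs_mainland, CALIBRATION_YEARS)

    chimp_virus_vs_drill = 0.011         # placeholder divergence to be dated
    print(f"{age_of_split(chimp_virus_vs_drill, clock):,.0f} years")  # 22,000
    ```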

    Read the full postings, comments, and more at news.sciencemag.org/sciencenow.

  8. Climate Change

    'Arctic Armageddon' Needs More Science, Less Hype

    1. Richard A. Kerr
    Methane on ice.

    In winter, Arctic lake ice traps methane, but only temporarily.

    CREDIT: KATEY WALTER ANTHONY/UAF; NSF/NASA FUNDING

    “Massive methane release sparks global warming fears,” blared the online news headline. “Arctic seabed methane stores destabilizing,” warned a University of Alaska, Fairbanks (UAF), press release. Even the U.S. National Science Foundation, in another press release, found Arctic methane to be leaking off Siberian shores at “an alarming rate.”

    Alarm might well be warranted. Methane, chemical formula CH4, is a powerful greenhouse gas 25 times more potent than carbon dioxide, and the ongoing global warming driven by carbon dioxide will inevitably force it out of its frozen reservoirs and into the atmosphere to amplify the warming. Such an amplifying feedback may have operated in the past, with devastating effects. If the modern version is anything like past episodes, two scientists warned in a Perspective in Science (24 April 2009, p. 477), it could mean that “far from the Arctic, crops could fail and nations crumble.”

    Going, going, …

    In a model, a methane hydrate deposit (orange) recedes as ocean warmth penetrates, much as happened off Svalbard (opposite page).

    CREDIT: M. REAGAN AND G. MORIDIS, GEOPHYS. RES. LETT. 36 © 2009 AGU

    Yet, with bubbles of methane streaming from the warming Arctic sea floor and deteriorating permafrost, many scientists are trying to send a more balanced message. The threat of global warming amplifying itself by triggering massive methane releases is real and may already be under way, providing plenty of fodder for scary headlines. But what researchers understand about the threat points to a less malevolent, more protracted process. “It will aggravate the global change problem,” says geochemist Martin Heimann of the Max Planck Institute for Biogeochemistry in Jena, Germany, “but it's not a catastrophe.”

    Sure looks scary

    There's certainly plenty of methane out there. Beneath the sea floor, methane produced by the microbes in the sediment can become trapped in the crystalline cages of water ice to form methane hydrate, “ice that burns.” No one is sure how much submarine hydrate exists worldwide, but it is on the order of several thousand petagrams (Pg) of carbon. (A petagram is 10^15 grams, or a billion metric tons.) That's easily 1000 times the amount of methane presently in the atmosphere.

    If hydrates are warmed, especially those under relatively low pressure beneath the shallow sea floor, they will melt, releasing their methane. “If you gave the planet a shake,” says geochemist David Archer of the University of Chicago in Illinois, the gas “would all come out, and it would be a global catastrophe.”

    The other precarious source of methane is permafrost. Permanently frozen soil and sediment contain organic matter that microbes can convert to methane if the permafrost thaws and remains free of oxygen. That happens, for example, in the bottoms of the numerous arctic lakes that form in thawing permafrost. The top 3 meters of arctic permafrost are thought to hold about 1000 Pg of carbon as organic matter; converted to methane, that would equal 300 times the methane in the atmosphere.
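
    The reservoir comparisons in the last two paragraphs are simple ratios, and they imply an atmospheric methane burden of roughly 3 Pg of carbon—a figure the article never states outright. A quick consistency check using only the numbers quoted above:

    ```python
    # Consistency check of the reservoir figures quoted above (1 Pg = 1e15 g).
    hydrate_pg = 3_000        # "several thousand" Pg of carbon in hydrates
    hydrate_ratio = 1_000     # "easily 1000 times" atmospheric methane

    permafrost_pg = 1_000     # organic carbon in the top 3 m of permafrost
    permafrost_ratio = 300    # "300 times the methane in the atmosphere"

    print(hydrate_pg / hydrate_ratio)        # ~3 Pg implied in the atmosphere
    print(permafrost_pg / permafrost_ratio)  # ~3.3 Pg -- the figures agree
    ```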

    Here it comes?

    This past March, oceanographer Natalia Shakhova of UAF and colleagues reported what sounded to some like the first methane gusher of many to come. The group took exhaustive samples over the East Siberian Arctic Shelf and found pervasive methane-rich waters, as they reported in the 5 March issue of Science (p. 1246). The methane was coming from the sea floor. The group calculated that as much methane was escaping from the water into the atmosphere as had been estimated to be escaping from the entire world ocean.

    Alarm gripped at least some quarters of the media. But reporters failed to note—and scientists did not emphasize—one important detail: The methane was coming from permafrost thawing under the relatively warm waters that had inundated the Siberian shelf as sea level rose after the last ice age. With only 5 years of sampling, no one could tell whether the leak had started under global warming or had been going on for millennia. Many scientists lean toward millennia in this case.

    A far stronger case for incipient hydrate destabilization appeared last year with less fanfare. In a paper in the 6 August 2009 issue of Geophysical Research Letters, marine geophysicist Graham Westbrook of the University of Birmingham in the United Kingdom and colleagues described how they used sonar to probe the shallow waters just west of Norway's Svalbard archipelago—halfway between mainland Norway and the North Pole. There, bottom waters had in fact warmed by a considerable 1°C during the previous 30 years, possibly because of global warming.

    Where warmed currents brushed the Svalbard sea floor, the researchers found plumes of methane bubbles rising from the bottom. And in modeling reported in Geophysical Research Letters (GRL) on 15 December 2009 by hydrogeologists Matthew Reagan and George Moridis of Lawrence Berkeley National Laboratory in California, bottom-water warming melted the model's hydrates and released methane along the edge of the deteriorating hydrate deposit, much as seen off Svalbard. “That seems like the strongest argument for hydrates releasing methane” as they are warmed, says Archer.

    Up, up, and away.

    Sonar probing along 2.5 kilometers of 250-meter-deep water west of Svalbard revealed plumes of methane bubbles (multicolored) rising from the edge of a methane hydrate deposit beneath the sea floor (red-orange).

    CREDIT: G. WESTBROOK ET AL., GEOPHYSICAL RESEARCH LETTERS 36 © 2009 AMERICAN GEOPHYSICAL UNION

    And more-widespread warming could be big trouble. Last year in the 8 December issue of the Proceedings of the National Academy of Sciences, Archer and colleagues reported on their own modeling of methane hydrate behavior, this time on a global scale. In both of their models, a 3°C warming of the ocean melts fully half of the existing hydrates. And aquatic ecologists Katey Walter Anthony of UAF and Sergey Zimov of the Northeast Science Station in Cherskii, Russia, reported at last December's meeting of the American Geophysical Union (AGU) that, according to “a very coarse estimate,” the dominant type of northern permafrost would yield 50 billion tons of methane if it should thaw—10 times the current methane content of the atmosphere.

    Not so fast

    So at least in one high-latitude location, hydrates seemed to be newly giving up their methane, while reports of thawing permafrost and bubbly arctic lakes streamed in as well. But what did all the bubbling—seen and presumably unseen—really amount to? Atmospheric chemist Edward Dlugokencky of the National Oceanic and Atmospheric Administration's Earth System Research Laboratory in Boulder, Colorado, and colleagues analyzed NOAA measurements of atmospheric methane made on samples collected weekly from 1983 to 2008 at 46 sites around the world. Atmospheric methane had been increasing until the late 1990s, when it leveled off. Then in 2007, its abundance bumped up.

    Taking into consideration numerous factors—including latitudinal patterns of change in atmospheric methane and shifting regional climates—Dlugokencky and his colleagues concluded that the recent methane jump was not driven by melting hydrates and permafrost. Instead, it seemed to be due to some combination of the high northern-latitude warmth in 2007 that is accelerating emissions from wetlands there; biomass burning contributing methane in the tropics; and heavy rains in Indonesia and the eastern Amazon encouraging tropical wetlands emissions. But because methane stopped increasing in the polar Northern Hemisphere in 2008, “the Arctic has not yet reached a point of sustained increased CH4 emissions from melting permafrost and CH4 hydrates,” the group wrote in GRL. Dlugokencky summed up their conclusions at the AGU meeting: “Despite all the media hype, I don't think we're yet at an arctic tipping point.”

    Tipping points for both methane hydrates and permafrost will come, Archer predicts—but they will probably happen slowly. It takes time, he notes, to get from an atmospheric warming driven by carbon dioxide to an amplifying warming driven by atmospheric methane. It takes time for the ocean to warm. It takes time for that warmth to penetrate into hydrates. And it takes quite a bit of that penetrating heat to melt hydrates.

    Once freed, the methane has to reach the atmosphere through the obvious obstacle of the overlying sediment. The ocean presents an impediment of its own. Bubbles may never reach the surface. Methane leaks out of bubbles, reacts with air dissolved in sea water, and becomes oxidized to form carbon dioxide. Even methane that reaches the atmosphere intact gets oxidized within about 10 years.
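
    That roughly 10-year figure means even a large pulse of methane fades quickly on climate timescales, leaving behind long-lived carbon dioxide. A minimal decay sketch, assuming the 10 years behaves as an e-folding lifetime (an assumption; the article says only that methane “gets oxidized within about 10 years”):

    ```python
    # Minimal pulse-decay sketch; treating ~10 years as an e-folding lifetime
    # is an assumption layered on the article's statement.
    import math

    LIFETIME_YEARS = 10.0

    def remaining_fraction(years: float) -> float:
        return math.exp(-years / LIFETIME_YEARS)

    for t in (10, 30, 50):
        print(f"after {t} yr: {remaining_fraction(t):.1%} of the pulse remains")
    # after 50 years, nearly all of the methane has become long-lived CO2
    ```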

    Folding all of those processes into an admittedly still-crude model, Archer and his colleagues get a warming of about 0.5°C, whether the ultimate carbon dioxide warming is a very modest 2°C or an extreme 7°C. The catch is that once the methane is converted into long-lived carbon dioxide, it prolongs that added warming for thousands of years.

    So to scientists, the methane threat looks less like a catastrophe than an aggravation of a problem that already scares them. But “media people are all the time trying to have a doomsday story” about methane, says Walter Anthony. Not that scientists are blameless. “Quite a few scientists have maybe exaggerated a bit,” Heimann acknowledges.

    “Is now the time to get frightened?” Archer asked rhetorically on the blog Real Climate (www.realclimate.org) in March. His answer: “No. CO2 is plenty to be frightened of, while methane is frosting on the cake. … Methane sells newspapers, but it's not the big story.”

  9. Marine Biology

    Seeing Deeply Into the Sea's Biodiversity

    1. Elizabeth Pennisi

    For the past 10 years, scientists from 80 nations have been creating the Census of Marine Life (Science, 2 June 2000, p. 1575; 31 October 2003, p. 773). Derek Tittensor of Dalhousie University in Halifax, Canada, and colleagues have now analyzed more than 6.5 million entries from the census databases, as well as other data for 11,500 marine species to create a map (top right) of diversity hot spots. Corals and coastal fishes are most diverse in Southeast Asia, as indicated by the outlined squares in the map, the team reported online 28 July in Nature.

    Another analysis, drawn from field surveys and literature reviews by 360 scientists, appears in a series in the 2 August PLoS ONE. It looks at species diversity in 25 regions of the world and comes up with a global average of what types of species populate the oceans (see pie chart). The proportions of species that inhabit particular waters change according to location.

  10. Profile: Douglas and Pamela Soltis

    The Power of Two

    1. Elizabeth Pennisi
    Married, with plants.

    Douglas and Pamela Soltis work together in all aspects of their careers.

    CREDIT: E. PENNISI/SCIENCE

    PULLMAN, WASHINGTON—When Pamela Soltis first joined her husband, Douglas, on the faculty at Washington State University, Pullman, they wrote separate grants and ran separate research programs. But they worked side by side in the field and in the greenhouse and read and critiqued each other's grant proposals and papers. More often than not, they also worked together in the lab. “We knew we were interested in a lot of the same things,” Pam recalls. Eventually, they gave up trying to work independently.

    Today, more than 25 years later, they are known collectively as the “Solti.” “We're generally viewed as one person,” Pam says. True, they have separate appointments at the University of Florida, Gainesville, she at the natural history museum and he in the biology department. But students, grants, courses, publications, talks, even accolades are shared. They studied in London on the same Fulbright scholarship and were co-awardees on an international prize. “Everything they do, they do together,” says Michael Donoghue, an evolutionary biologist at Yale University.

    Success story.

    An 80-year-old new species, Tragopogon mirus (right), outcompetes its parental species, which produces smaller seed heads (far right).

    CREDIT: E. PENNISI/SCIENCE; ANDREW DOUST

    “They are the most powerful, productive couple that may have ever been in botany, certainly in my generation,” says John Kress, an evolutionary biologist at the Smithsonian National Museum of Natural History in Washington, D.C. The Soltises helped bring plant systematics into the molecular age, according to peers. And their innovations have led to firsts in “approaches to questions and ultimately first answers to questions,” says Vaughan Symonds, a former postdoc now at Massey University in Palmerston North, New Zealand.

    Early adopters of new techniques—including molecular DNA tools—as students in the 1980s, the Soltises have shown how rapid progress can be when two minds focus on a single research program. Says Jeffrey Doyle, a systematist at Cornell University, “They are so energetic and active that seeing Doug and Pam moving into your areas is a little frightening.”

    Doubling up is a main theme in their research as well. In their work on the evolution of flowering plants, the Soltises have shown that two genomes can be better than one. Throughout their joint career, they have studied a genus called Tragopogon, weedy plants with composite flowers that turn into puffballs. Hybridization of closely related Tragopogon species in Washington state has brought together two plant genomes in a single organism—a condition called polyploidy—yielding new species that are crowding out the parent stock. “They are weeds run amok,” Doug says.

    In addition, using molecular techniques to build a family tree of flowering plants, or angiosperms, the Soltises and their collaborators have determined which plants in the angiosperms are the most ancient. Early genome duplications, they learned, created genetic fodder for the great burst of diversification that followed the first appearance of flowering plants. “Using Tragopogon has led them to look at how important polyploidy is to angiosperms as a whole,” says Jennifer Tate, a former postdoc and now a plant systematist at Massey University.

    Backyard discovery

    In their early days in Pullman when they were first looking into angiosperm evolution, the Soltises decided to follow up on work done by Washington State University botanist Marion Ownbey. He had made a surprising discovery walking downtown from campus one day in 1949: He noticed some large, oddly colored versions of Tragopogon, whose sunflower-like flower heads are typically either yellow or purple. These had a yellow “eye” rimmed by purple. Ownbey found that they belonged to a new species that had arisen when two Tragopogon species had hybridized and produced descendants with double the usual number of chromosomes. Herbarium records showed that the two parent species were European natives that had reached Washington no earlier than 1928, which meant the new species was less than 50 years old.

    Close quarters.

    Once they were transplanted from Europe, the purple Tragopogon porrifolius and yellow T. dubius came in close enough contact to hybridize and form a new species.

    CREDIT: E. PENNISI/SCIENCE

    Ownbey also found a second new polyploid species that involved a third Tragopogon parent. In 1950, he named the one with the yellow and purple flower T. mirus and the other, T. miscellus. It was the first time anyone had pinned down such a recent origin of a new organism; a few more examples have since been documented. “So often in evolutionary biology, it's like archaeology; you are looking for signs of what might have happened in the past,” says Paul Wolf, a plant evolutionary geneticist at Utah State University in Logan. Now it was possible to work with a “real experimental system.”

    Ownbey spent much of the rest of his career tracking down local populations of the parents, hybrids, and polyploids and crossing them. He found them most often in abandoned lots, along roadsides, and in other neglected places. Ownbey said these fragile populations bore watching to see how they fared over time.

    The Soltises had been reading Ownbey's papers when “all of the sudden we realized we live here” right where Ownbey worked, Pam recalls. “We knew this would be a novel research opportunity.” They were joined by a postdoc at Washington State, Stephen Novak, who tracked down Ownbey's notebooks. Novak, now a plant population geneticist at Boise State University in Idaho, and the Soltises spent a year going to 90 small towns—places where Ownbey had seen the new species and other nearby sites—estimating the numbers of each Tragopogon they found. On sunny mornings in Pullman, Doug could be found pushing a stroller—they had just had the first of their two daughters—up hillsides in search of Tragopogon flowers.

    Doug, who had the morning child-care shift in those days, seems to be the more motivated by data-gathering of the two. He's also the big-ideas person with a drive to explore. Pam is drawn to data analysis and is more of a planner, colleagues say. They are competitive inside and outside the lab, according to friends. Before Doug's knees gave way, they played a lot of basketball and tennis together. Now she runs marathons and he does triathlons and fishes.

    Speciation mania

    The Soltises “were able to … bring new tools to the problem” that their predecessor grappled with, Donoghue says. Researchers used to think that a polyploid organism arose once and then spread to new places. Ownbey suspected that in Tragopogon's case, the new species had formed multiple times. By comparing chloroplast DNA and other markers of the new species from different locations, the Soltises confirmed the multiple origins in spades.

    A paper in the July issue of Evolution by Symonds and the Soltises documents the extent of these origins. “Each one of these towns is a little evolutionary experiment,” Doug explains, as the new species arose independently many times. In one place, Oakesdale, there seem to be two origins of T. mirus: The flowers have different colors, and varied molecular markers support the suggestion that this species formed twice, with a different set of parents each time. The Soltises have argued that “multiple origins of polyploids are the rule rather than [the] exception,” says Loren Rieseberg, one of their former students and an evolutionary biologist at the University of British Columbia, Vancouver, in Canada: “I think their views have now become conventional wisdom.”

    The Soltises and Tate, then a postdoc at the University of Florida where the Soltises went in 2000, began to create polyploid Tragopogon plants in the lab. People had assumed that polyploids would retain the extra genes they had acquired but turn off duplicates to keep genetic activity in balance. Instead, “we found out the Tragopogon polyploids were losing a lot of their genes,” Tate says.

    They began to think that evolution might repeat itself, with the same genes being lost as each new polyploid population formed. Indeed, that tends to be the case for T. miscellus, they reported last year in BMC Plant Biology. “We also found that there were some genes that were always maintained in duplicate,” Tate says.

    In addition, “weird things are happening to the chromosomes,” Pam says. Even though standard biology says the chromosomes from one parent in a polyploid organism stay away from those of the other, in these plants they appear to exchange pieces. Instead of having two copies of a gene from each parent, a plant may have four copies from one and none from the other. There's an element of chance, “but there also may be a little bit of a pattern,” Doug says. Certain genes are frequently lost—often those from T. dubius—and certain genes are always retained.

    On a sunny day last month, the Soltises were back in Washington to collect seeds for an experiment they intend to run in Florida: growing the parents and the new polyploid species in the same conditions and observing DNA at work. “We want to know the effect of polyploidy on gene expression,” Doug says.

    They try to do most of their collecting before noon, when the flowers open for the morning sun. They work quickly and efficiently, Doug picking seed heads and putting them in envelopes, and Pam labeling them and putting them in a larger bag. They have seen that, over time, one of the polyploids has really taken off, while some of the parents seem to be struggling. In Pullman, for example, the Soltises can no longer find one of the parents, T. pratensis, although herbarium records show it used to be there. In contrast, T. miscellus has become quite common at a number of sites—its seed heads are big and each plant has quite a few of them. “It's a poster child for why the polyploids are so successful,” Doug says.

    Thus in less than 80 years—counting from the earliest date at which these new species could have formed—the polyploids have shown they are here to stay. One unresolved question is whether each separate origin should be considered its own species. “It's one of the cool questions,” Pam says.

    Sequencing frenzy

    “One of the things that make the Solti so successful is that they aren't afraid to dive into new territories” and tackle interesting questions, Tate says. “If they don't have the expertise themselves in a particular area, they build a team of collaborators who do.”

    Ancient secrets.

    DNA from basal angiosperms such as the water lily and a plant called Amborella (inset) revealed early genome duplication in flowering plants.

    CREDITS (TOP TO BOTTOM): MI-JEONG YOO

    Take DNA sequencing. When the Soltises went to California in 1988 to learn the technique from Michael Clegg, it took them 6 months to pull out eight sequences of 1700 bases apiece from the rbcL gene found in chloroplasts, chosen because it occurs in all photosynthetic plants. “People thought it was a speed record,” Pam says. With these data, they were able to show that a supposed flowering plant family did not belong in a single group. They were hooked on the technology.

    At the time, the map of flowering plants was a mess. With some 400,000 species, 15,000 genera, and more than 400 families, plant systematists had not been able to come up with a coherent, agreed-upon evolutionary tree. Several groups had started to collect DNA data to sort it out. At lunch one day at a conference in 1991, the Soltises and Mark Chase of the Royal Botanic Gardens, Kew, decided to pool their data; they then recruited other collaborators. “It was a novel approach for this area to get a lot of people to contribute to the same [project],” Donoghue says. “They had a vision of ‘Let's try to work together to solve a large problem.’ They became the ringleaders.”

    Two years later, Chase, the Soltises, and 40 collaborators published a tree based on 500 rbcL gene sequences. The tree's branching pattern generally agreed with results obtained earlier by nonmolecular methods. That agreement “gave credence to the surprises,” such as putting lotuses and sycamores in the same small clade, Doug says.

    Next, in 1997, they came up with a tree for 220 species based on a nuclear gene, 18S. It generally agreed with the rbcL tree. In the new analysis, they included an obscure plant from New Caledonia called Amborella trichopoda that happened to be in the freezers of the Smithsonian Institution, where the Soltises had done some work with Elizabeth Zimmer. Its lineage was surprisingly old: It fell out at the base of the angiosperm tree. These analyses “changed [angiosperm phylogeny] from a speculative field where personality meant a lot into a scientific field,” says Gregory Plunkett, a molecular systematist at the New York Botanical Garden in the Bronx, New York.

    On a roll, the Soltises embarked on an evolutionary tree based on three genes, the third also coming from the chloroplast genome. The sequencing was labor-intensive, and the Soltises did most of it themselves. “After we put the kids to bed at night, I would go and set up the sequencing reactions,” Pam explains. “Doug would go into the lab in the morning to load the gels and then come home to see the kids before they went off to school.”

    In the end, their tree encompassed 560 species and included unrelated organisms—seven gymnosperms, the category that includes pines and other conifers—for comparison. It clarified branches that were not well supported in the 1993 tree and “helped us figure out where more work was needed,” Pam says. It was also adopted by the newly formed Angiosperm Phylogeny Group, which published its collaborative classification in 1998 and updated it in 2003 and 2009. With funding from the National Science Foundation, eight participating institutions are using 17 genes to understand “the dirty dozen” unresolved parts of the tree, in particular to sort out the flowering plants that diverged early, with a paper in the works. Today, “no other group of organisms has as good a phylogeny as flowering plants,” says Mark Mort of the University of Kansas, Lawrence. Adds Kress: “It's so agreed on now that people are changing the organization of the whole herbarium.”

    Now the Soltises would like to expand their angiosperm tree, which has about 1000 species, to include representatives from all 15,000 genera. China has shown an interest. When the Soltises were in Shenzhen to meet with their Chinese collaborators in July 2010, they also met with the mayor to discuss the city's support of a Chinese angiosperm tree-of-life project. “If the Chinese will help fund it, it will be a big step,” Doug says. They are also helping to promote a 100,000-species tree for all green plants.

    In the meantime, the angiosperm phylogeny work has helped clarify which modern flowering plants represent the earliest evolving branches. To get a sense of what early angiosperm genomes were like, the Soltis group joined with Claude dePamphilis of Pennsylvania State University, University Park; Victor Albert, now at the University at Buffalo in New York; and more than a dozen other researchers to develop the Floral Genome Project.

    From 2001 to 2006, this project compared the expression patterns of genes known to be important in flowering in an index plant, Arabidopsis, with patterns in plants at the base of the evolutionary tree they had constructed. The study found that most of the basal angiosperms used the same genes as Arabidopsis, but in different places at different times during development. “Basal angiosperms don't do things the way Arabidopsis does them,” Doug says.

    What's more, almost all of the genes involved with floral organization were duplicated. The Soltises and others suggested that polyploidy might be important to angiosperm evolution, enabling duplicated genes to evolve new functions and new roles. Polyploidy “might actually be instrumental to their origin,” Doug says. Next, they plan to sequence Amborella, which should show more clearly whether genome duplication occurred 130 million years ago in the common ancestor to all living angiosperms.

    The Soltises' decision to follow up on a botanist's curiosity about oddball flowers spotted on a walk into downtown Pullman is leading to a new understanding of how flowering plants evolved.
