News this Week

Science  29 Mar 2013:
Vol. 339, Issue 6127, pp. 1504


  1. Around the World

    1 - Edmonton, Canada
    German Scientists Pull Out of Oil Sands Project
    2 - Minami Torishima, Japan
    Japan Reports Rare Find of Rare Earths
    3 - Heidelberg, Germany
    HeLa Genome Withdrawn
    4 - Rome
    Stem Cell Researchers Protest Controversial Treatment
    5 - Austin
    Cancer Researcher Recruitment Grants Unfrozen
    6 - Paris
    Science Law Falls Short

    Edmonton, Canada

    German Scientists Pull Out of Oil Sands Project

    A Syncrude plant in Alberta's oil sands.


    German scientists have pulled out of a research project with Canada that sought to minimize the environmental damage caused by exploiting Alberta's oil sands. The scientists, part of the Helmholtz-Alberta Initiative (HAI), will no longer be involved in developing technologies that improve Alberta's crude oil or treat the toxic effluent from the oil sands projects.

    HAI, founded in 2011, is a partnership between the Helmholtz Association and the University of Alberta. The supervisory board for one of the four Helmholtz institutes involved in the partnership voted in December to impose a moratorium on its involvement in the project, in part due to an ongoing debate over a possible amendment to the European Union's fuel quality directive that would restrict the use of "high-polluting" oil within Europe.

    Germany (with the United Kingdom) has so far blocked the amendment, but public opposition to importing Albertan oil remains high. The Canadian government has lobbied German politicians to continue blocking the ban; that, along with Canada's withdrawal from the Kyoto Protocol, prompted several German politicians to ask the Helmholtz Association pointed questions about the Alberta project.

    An independent assessment into HAI's environmental credentials will report its findings in June.

    Minami Torishima, Japan

    Japan Reports Rare Find of Rare Earths

    Japanese researchers last week reported finding high concentrations of rare-earth elements in deep-sea sediment near the island of Minami Torishima, about 1800 kilometers southeast of Tokyo.

    Rare earths—such as the elements neodymium, terbium, and yttrium—are essential to a host of advanced electronics, including displays, cell phones, solar panels, and high-performance electric motors. China controls 95% or more of global production, a realization that led Japan, the United States, and other countries to hunt for new sources and develop substitutes (Science, 17 December 2010, p. 1598). Scientists from the University of Tokyo and the Japan Agency for Marine-Earth Science and Technology reported finding significant rare earths in seafloor sediments at 78 sites scattered around the Pacific Ocean in Nature Geoscience in July 2011.

    The latest find is potentially the most commercially significant, with a concentration of 6500 parts per million of rare-earth elements—far higher than that in deposits found elsewhere in the Pacific or on land in China. The challenge will be recovering the ores economically; they lie in 5800-meter-deep water.

    Heidelberg, Germany

    HeLa Genome Withdrawn


    Complex questions of genetic privacy have led the European Molecular Biology Laboratory (EMBL) to cut off public access to a popular cell line's genome sequence, which it had announced in a press release just 2 weeks earlier. On 11 March, EMBL researchers published a genome for the widely studied cancer cell line known as HeLa. But the HeLa line is named for Henrietta Lacks, an African-American woman whose cells were originally taken without consent, and the research use of those cells has gained publicity thanks to a popular book by science writer Rebecca Skloot. Skloot and some scientists questioned the ethics of releasing the HeLa genome, suggesting that it could reveal personal information and violate the privacy of Lacks's descendants. "We have taken the data offline until the question has been resolved of whether the family consents to the public availability of genomic information on the cell line. … We did not consider this an issue exactly because of the notoriety of the cells and the existence of so much molecular biological and genetic data on these long before our study," an EMBL spokesperson writes in an e-mail.


    Rome

    Stem Cell Researchers Protest Controversial Treatment

    The Stamina Foundation can continue a controversial stem cell treatment in patients with neurodegenerative diseases such as spinal muscular atrophy and Parkinson's, according to a new bill signed by Italian Health Minister Renato Balduzzi on 21 March (and approved by the Council of Ministers). Over the last 3 years, a hospital in Trieste and another in Brescia have used the treatment in agreement with Stamina, although it has never been authorized by the Italian drug administration, known as AIFA.

    Little is known of the treatment, which Stamina says is based on mesenchymal stem cells, and the few public documents describing its methodology include two patents issued to Stamina by the United States. After lab inspections in Brescia in 2012, AIFA called for interrupting the treatment, but patients' families appealed to courts to continue its use. Two weeks ago, Italian stem cell researchers sent an open letter to Balduzzi, asking him to bring what they called "the administration of snake oil" to a definitive end. But the new bill defines the Stamina method as a medical treatment without serious side effects and states that it should not be stopped in patients already under Stamina's treatment.


    Austin

    Cancer Researcher Recruitment Grants Unfrozen

    Texas officials will allow the state's $3 billion cancer research agency, the Cancer Prevention and Research Institute of Texas (CPRIT), to make 25 recruitment awards that had been on hold since December. The decision is the first good news in a while for the troubled agency. Last May, chief scientific officer and Nobel Prize winner Alfred Gilman quit in protest over CPRIT's scientific review processes; two more top officials left last fall. In December, CPRIT agreed to a request from state leaders to impose a moratorium on new grants while investigations continued.

    Texas research institutions, in the meantime, worried that the delay in awarding $71.8 million in approved grants to 25 researchers from outside the state would derail the recruitments. Last week, CPRIT interim Executive Director Wayne Roberts announced that state leaders will allow CPRIT to move ahead. "We have worked hard to regain trust," and "[w]e take this action as evidence that some progress has been made," Roberts said.


    Paris

    Science Law Falls Short

    After months of consultation with the French scientific community, the Council of Ministers last week adopted a draft bill for a higher education and research law. The bill consists of 20 measures that, France's science minister says, will better support students at universities and will give research a new impetus, hopefully boosting economic recovery and competitiveness. Among other changes, the bill proposes reorganizing the nation's higher education institutions into 30 or so regional groups that will collaborate with research laboratories and local authorities. The law also aims to help set a clearer national strategic research agenda.

    But some researchers say that the bill doesn't address many of the community's concerns, including a guarantee of an adequate budget for universities and job stability for the nation's growing numbers of nonpermanent research staff members. On 21 March, as part of a call from French trade unions and research associations, students and nonpermanent scientific staff members held a protest in Paris demanding the creation of new permanent positions. The draft bill will be debated in Parliament starting in May.

  2. Random Sample

    I See You: New Mouse Lemurs Spotted


    In a century in which new mammal species discoveries are rare, two new mouse lemurs made their formal debut this week. Rodin Rasoloarison from the University of Antananarivo in Madagascar came across these softball-sized primates in eastern Madagascan rain forests in 2003 and 2007. He took a small skin sample from each for DNA analysis. From afar, the gray-brown primates are virtually indistinguishable, but their genes revealed that they are two species, Rasoloarison and his colleagues reported online on 26 March in the International Journal of Primatology. That brings the number of recognized mouse lemur species to 20 (pictured: Microcebus marohita), and "I suspect there are even more mouse lemur species out there to be found," says Anne Yoder, a co-author and director of the Duke Lemur Center in Durham, North Carolina. Reduced forest cover from human activities has made mouse lemurs the most endangered mammals in the world, notes the International Union for Conservation of Nature.

    Atomic Science Keeps Silver Shining Bright


    Armed with Q-tips, chemical coatings, and lots of elbow grease, art conservators wage unceasing war against tarnish, a thin layer of sulfide that forms on silver when it's exposed to air. Constant polishing can wear down artifacts, however. And the protective coatings now in use are notoriously uneven and last only about 20 years—a short time in the museum world.

    Now, materials scientists from the University of Maryland (UMD) think they've hit upon a solution. Using an easily reversible technique called atomic layer deposition (ALD), they coated pieces of silver with layers of aluminum oxide only one atom thick. By gradually building up the number of layers, the researchers could precisely control the thickness of the film in the silver's every nook and cranny. Ten nanometers of the ALD film could protect a silver artifact from tarnish for more than 80 years, the team reported at the March meeting of the American Physical Society in Baltimore, Maryland.


    If ALD coatings could save artifacts from the damaging effects of constant rubdowns, "it's really worth the effort," says Martin Fischer, a physicist at Duke University in Durham, North Carolina, who is not involved with the project. "Once you polish off all the silver, that's it." He looks forward to future work testing the technique's effectiveness on other materials as well.

    In the meantime, the UMD team is working with Baltimore's Walters Art Museum to apply the ALD film to a late 15th century Spanish cross (pictured)—the technique's first test on a true work of art.


    Join us on Thursday, 4 April, at 3 p.m. EDT for a live chat on what we've learned about dinosaurs since Jurassic Park.

    By the Numbers

    3 Types of neutrinos indicated by the study of the cosmic microwave background by Europe's Planck spacecraft, dashing some physicists' hopes that earlier hints of a fourth neutrino would be bolstered.

    £4.4 million Amount donated by an anonymous foundation to save the Royal Institution of Great Britain from eviction this month. The stay of execution gives the 200-year-old organization more time to pay off debts.

  3. Newsmakers

    Hopkins Biochemist to Head NIH's Basic Science Institute



    Jon Lorsch, a biochemist at Johns Hopkins University in Baltimore, Maryland, will become director of the $2.4 billion National Institute of General Medical Sciences (NIGMS) in August. Lorsch, 44, uses yeast to study how cells initiate the translation of RNA into proteins. He will join NIGMS, the National Institutes of Health's fourth largest institute, as NIH is coping with a 5% budget cut from sequestration. Lorsch says his priorities include making the case for basic research, using taxpayer funds "efficiently," and helping NIH carry out plans to improve graduate education and to attract minorities to research. Diversity in "scientific subjects, settings, and researchers" will be a theme, he says: "If we can ensure we have the most effective ways to distribute funding, it will be for the good of the nation and scientists." Lorsch replaces Jeremy Berg, who left NIGMS in July 2011.

    Deligne Wins Abel Prize



    Pierre Deligne has won this year's Abel Prize for outstanding work in mathematics. In its 20 March announcement, the prize committee praised his "seminal contributions to algebraic geometry and … their transformative impact on number theory, representation theory, and related fields."

    A professor emeritus at the Institute for Advanced Study in Princeton, New Jersey, Deligne, 68, was born in Brussels. After receiving his Ph.D. from the University of Brussels in 1968, he joined the Institut des Hautes Études Scientifiques at Bures-sur-Yvette, France. He moved to the United States in 1984.

    Deligne works in algebraic geometry, which uses multidimensional curves and surfaces to provide solutions to systems of equations. A versatile and creative mathematician, he has notched a string of influential proofs and forged standard mathematical tools used in several subdisciplines.

    Presented by the Norwegian Academy of Science and Letters, the Abel Prize comes with a monetary award of 6 million Norwegian kroner (about $1 million). The prize honors the 19th century Norwegian mathematical prodigy Niels Henrik Abel. Deligne has previously won the Fields Medal and the Crafoord Prize, among many other honors.

  4. Decade of the Monster

    Ron Cowen*

    An infalling gas cloud and other new probes herald a revealing period for the Milky Way's supermassive black hole.

    Hungry heart.

    The neighborhood of a galaxy's central black hole is a maelstrom of inward-spiraling matter, twisted spacetime, and radiation.


    It could be the most photographed snack in the history of the galaxy. Either late this year or early next, the giant black hole at the center of the Milky Way will devour a blob of gas hurtling toward it at more than 2000 kilometers per second.

    A vast array of telescopes in space and on the ground is poised to record the feast, which could rouse the Milky Way's gravitational monster from an extended period of dormancy—and reveal why the massive body appears to have been on a near-starvation diet for centuries. The new details may also provide insight about the puzzling dining habits of similar supermassive black holes believed to lie at the core of nearly every heavyweight galaxy.

    But before astronomers can attempt to answer these questions, they must nail down several basic properties of the parcel of gas known as G2: its origin, mass, and orbit. Scientists are not even sure when the gas will pass closest to the Milky Way's 4-million-solar-mass central black hole, known as Sagittarius A* (pronounced "A-star").

    G2's encounter with the black hole is just the beginning of what is shaping up to be a decadelong effort to unlock the secrets held by Sagittarius A*. Because its enormous gravitational pull traps light, astronomers can never see the heart of the galaxy directly. But they can glean information about its spin, mass, and size by studying the faint flickers of radiation emanating from surrounding gas and dust as they heat up and spiral into the invisible beast.

    Now, researchers are assembling a sensitive array of radio telescopes to perform a new trick: recording the black hole's shadow. And a star set to make its closest approach to Sagittarius A* in 2018 promises to probe in unprecedented detail the predicted curvature of spacetime just outside the body, testing Einstein's general theory of relativity in a gravitational regime more extreme than has ever been possible.

    Researchers initially predicted that G2 would make its fatal rendezvous in June, coming closer to Sagittarius A* than a distance seven times that between the planet Neptune and our sun. Additional data shifted that prediction to September. Now, a new analysis by Andrea Ghez of the University of California, Los Angeles (UCLA), and her colleagues—one of two teams that have monitored the motion of stars at the galactic center for some 20 years—suggests that closest approach might not happen until March 2014. Ghez presented the findings on 14 March at a seminar celebrating the 20th anniversary of the Keck Observatory in Hawaii.

    "The real question is, what is G2 and how much mass it is really going to dump onto the black hole?" Ghez says.

    That uncertainty may seem surprising given that G2 has been closely monitored with two of the largest telescopes on Earth since its discovery in 2011 by Stefan Gillessen of the Max Planck Institute for Extraterrestrial Physics in Garching, Germany, and his colleagues. G2, however, is a faint, infrared-emitting source located in the most congested region of the galaxy, the packed metropolis of stars and clumps of gas orbiting Sagittarius A*. As a result, Ghez says, "these are very, very difficult observations to make."

    At first, Gillessen and his co-discoverers thought that G2 was a lone gas cloud with the mass of three Earths. In the 11 September 2012 issue of Nature Communications, Abraham Loeb and Ruth Murray-Clay of Harvard University suggested that G2 could be a disk of gas orbiting a star, similar to the circumstellar gas disks that give birth to planets. Material boiled off the disk by ultraviolet radiation from other stars and then elongated by the black hole's tidal gravitational forces could produce a gas cloud, including a tail like the one astronomers have observed.

    Whether lone cloud or stellar shell, G2 ought to brighten as it heads closer to the galactic center, where intense ultraviolet radiation from closely packed stars should ionize its hydrogen gas and set it aglow in the near infrared. However, archival images recorded between 2008 and 2012 showed that G2 maintained an essentially uniform brightness, Gillessen and his colleagues reported on 1 February in The Astrophysical Journal.

    The discrepancy prompted Nick Scoville of the California Institute of Technology in Pasadena and Andreas Burkert of the Max Planck Institute for Extraterrestrial Physics and the University Observatory Munich to propose a new model. In a paper posted on the arXiv preprint server on 26 February, they suggest the gas is a steady wind blown out by a young star and that radiation from the star itself, not from its neighbors, ionizes the gas. Those properties explain G2's unchanging brightness as it heads ever closer to Sagittarius A*, the team says.

    Inner circle.

    Stars tightly orbiting Sagittarius A* can give astronomers clues to the invisible black hole's shape, mass, and spin.


    If G2 is held together by the gravity of a star, less material will detach at closest approach and fall onto Sagittarius A*. That could mean fewer fireworks during this go-round, although the star's gravity might keep the cloud intact for several more passages around the black hole, giving it additional opportunities to feed and revive the quiescent beast.

    Exactly what astronomers will see also depends on just when the cloud arrives at Sagittarius A*. In some ways, later is better. Telescopes on Earth can't see the Milky Way's center between October and February, when our planet arcs through the part of its orbit that places Sagittarius A* on the far side of the sun. Orbiting telescopes, including NASA's Chandra X-ray Observatory, have a narrower blackout window: between November and January. G2 watchers will be disappointed if anything exciting happens during those times.


    UCLA researchers are modeling fluctuations in the infrared brightness of Sagittarius A* (left) using methods developed for studying the Standard & Poor's 500 stock market index (right).


    Tracking the fireworks

    Regardless of when G2 makes its closest approach, the light show is likely to unfold in three stages, Loeb says.

    The first fireworks could erupt just as the gas makes its closest approach to the black hole, when the tidal gravitational forces from Sagittarius A* shred the gas into spaghetti-like filaments or droplets. X-ray emission will flare up if G2 continues to move at supersonic speed, producing a bow shock wave as it plows into the denser gas near the black hole, and should peak at closest approach, Loeb says.

    The next phase could start a month or two later, when the shredded gas dives inward and strikes a proposed structure called the accretion disk—the swirling doughnut of matter believed to surround and feed the black hole. If the cool gas from G2 slams onto the warm disk, it could generate both x-rays and radio waves. Just how much radiation will be produced depends on the density, temperature, and extent of the disk—all unknown quantities at the moment, Loeb notes. "If we detect this brightening, we could constrain the unknown properties of the [accretion] disk for the first time," he says.

    Timing how long the gas takes to travel to the disk may also shed light on conditions near the galactic center, says Avery Broderick of the Perimeter Institute for Theoretical Physics in Waterloo, Canada. By studying how G2 gives up its angular momentum to other gas parcels there, Broderick says, astrophysicists may be able to infer for the first time the viscosity of a gas or plasma far off in space—critical information for understanding the mechanism by which gas falls into and fuels a black hole.

    Once the gas gets dumped onto the accretion disk, it could take several months to several decades for the remains of G2 to complete its death march, spiraling downward through the disk and disappearing forever inside the black hole's event horizon. A silent scream of radiation at all wavelengths, including x-rays, infrared light, and radio waves, may accompany its demise, although Loeb says that it may be hard to distinguish from emissions due to other processes that feed the black hole.

    Several astronomers are worried that even the short-term effects of G2 may be tricky to tease out from the normal variability of Sagittarius A*. Though unusually dim, the black hole's accretion disk can sometimes generate a 10-fold increase in infrared light on a timescale of minutes. That flash of light is similar to the predictions of what G2's infall might produce. How can astronomers disentangle the impact of the gas cloud from the black hole's intrinsic fluctuations?

    Leo Meyer, a member of Ghez's team, began pondering that question last spring while working with UCLA finance professor Francis Longstaff on methods to predict the behavior of Sagittarius A* using the same kind of time-series analyses used to forecast stock market volatility (see figure). "The big question now," Meyer says, "is whether the passage of G2 will lead to something like a stock market crash": a major change in the behavior of Sagittarius A* from an unusually dim bulb to a much more voracious, glowing black hole.

    Until recently, astronomers were afraid that the black hole's natural mood swings might make such a transition hard to spot. Research had indicated that Sagittarius A* flipped between two distinct states: a quiescent low state and a high state marked by sharp outbursts of fluctuation. Such bipolar behavior could make it difficult to discern any brightening due to the impact of G2. But after a closer look at the data, Meyer and Longstaff concluded that the "low state" readings were instrumental noise from the Keck telescope and that Sagittarius A* was simpler—although more variable—than astronomers had believed. Now, Meyer says he's optimistic that after only a few nights of timely observations at Keck, he and Longstaff will know whether and by how much G2 has boosted the output of Sagittarius A* and shifted the black hole into a new, more active regime.

    Dancing around the monster.

    Observations of the central stars S0-2 (orbit shown in yellow) and S0-102 (red) will test general relativity.


    New horizons

    Even after G2 passes by and the hoopla about the gas cloud dies down, aficionados of the galactic center will have a lot to look forward to during the rest of the decade.

    Two years from now, if all goes according to plan, a network of four radio telescopes known as the Event Horizon Telescope, which has already revealed new insights about the properties of Sagittarius A* and the giant black hole at the center of the nearby galaxy M87, will greatly expand its own horizons (Science, 27 January 2012, p. 391). Twenty or so radio dishes from the Atacama Large Millimeter/submillimeter Array (ALMA), the radio array now nearing completion in Chile's Atacama Desert, are scheduled to join the network in 2015 along with the 10-meter South Pole Telescope. Working in tandem, ALMA and the other radio dishes will double the resolution of the Event Horizon Telescope, creating a virtual Earth-sized radio telescope powerful enough to make landmark observations of Sagittarius A* and its accretion disk, says the telescope's coordinator, Sheperd Doeleman of the Massachusetts Institute of Technology Haystack Observatory in Westford, Massachusetts, and the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts.

    Shadow play.

    Simulation (left) shows the arc of light and the shadow formed by hot material near Sagittarius A*. At right, measurements by the Event Horizon Telescope (white spots) are overlaid on simulated data.


    The boost in resolution will enable the telescope to make actual images of the region surrounding Sagittarius A* and hunt for a predicted feature known as the black hole's shadow. Although nothing, not even light, can escape a black hole's grasp, matter that gets pulled into the monster gets crushed by the extreme gravity and heats up to billions of degrees, illuminating the region around the black hole. Most of the radiation falls into Sagittarius A*; the light that just misses getting trapped is bent by the monster's gravity into a thin ring or halo that frames the black hole's shadow (see figure). Deviations from the halo geometry could indicate that Einstein's theory doesn't accurately describe the way gravity curves or distorts spacetime near a black hole and that the theory might need to be revised.
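
    The expected angular size of that shadow can be roughed out from figures quoted elsewhere in this article: a 4-million-solar-mass black hole at a distance of 26,000 light-years (about 2.5×10^20 meters). General relativity puts the shadow's diameter at roughly 2√27 ≈ 10.4 gravitational radii, GM/c². This is only a back-of-envelope sketch, not the Event Horizon Telescope team's own calculation:

```latex
\theta \;\approx\; \frac{10.4\,GM/c^{2}}{D}
  \;=\; \frac{10.4 \times \left(5.9\times10^{9}\,\mathrm{m}\right)}{2.5\times10^{20}\,\mathrm{m}}
  \;\approx\; 2.5\times10^{-10}\,\mathrm{rad} \;\approx\; 50\ \mu\mathrm{as}
```

    A feature of order 50 microarcseconds is what pushes the observation to an Earth-sized baseline: at millimeter wavelengths, only an interferometer thousands of kilometers across approaches that resolution.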

    Objects swooping past Sagittarius A* may provide other important clues. In 2018, a bright star known as S0-2, discovered nearly 2 decades ago, will put Einstein's theory through its paces when it comes within four times the Neptune-sun distance of Sagittarius A*—about half as far as G2's closest approach (see image). Two fingerprints of the star—the light it emits and the motion of the star through space—will test relativity in different ways, Ghez notes.

    Spectra of the star will reveal the gravitational redshift of the starlight—the amount by which the mass of Sagittarius A* has curved spacetime at the galactic center. The observed redshift can be directly compared with the amount that general relativity predicts.
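
    The magnitude of that redshift can be estimated from numbers already given in this article: a 4-million-solar-mass black hole (about 8×10^36 kilograms) and a closest approach of roughly four times the Neptune-sun distance, or about 1.8×10^13 meters. This is a back-of-envelope estimate, not the team's own prediction:

```latex
z_{\mathrm{grav}} \;\approx\; \frac{GM}{rc^{2}}
  \;=\; \frac{\left(6.67\times10^{-11}\right)\left(8\times10^{36}\,\mathrm{kg}\right)}
             {\left(1.8\times10^{13}\,\mathrm{m}\right)\left(3.0\times10^{8}\,\mathrm{m/s}\right)^{2}}
  \;\approx\; 3\times10^{-4}
```

    A fractional shift of a few parts in 10^4 corresponds to an apparent velocity offset of roughly 100 kilometers per second, a signal that spectrographs on large telescopes can in principle detect.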

    By precisely monitoring the 3D motion of the star, which circles the black hole about every 16 years, Ghez's team hopes to determine whether the star's closest approach to Sagittarius A* occurs at the same place in its orbit or whether it slowly moves or precesses about the supermassive black hole under the sway of the invisible body's extreme gravity.

    The precession is a much smaller fingerprint than gravitational redshift and is much more difficult to measure because of all the other objects—countless stars and clumps of gas—tugging on S0-2 at the Milky Way's crowded center. "If we had a pure system of a black hole and one star, we wouldn't be worried, but the center of the galaxy is simply a mess," Ghez says.

    Fortunately, her team recently discovered another central star with a shorter orbit of 11.5 years, as Ghez and her colleagues reported in the 5 October 2012 issue of Science. Although the star is only one-sixteenth as bright as S0-2, its presence will help distinguish gravitational perturbations on the brighter star's orbit due to Sagittarius A* from those exerted by other stars and gas at the galactic core.

    In 1919, astronomers measuring how stars appeared to change position as the sun's gravity bent the path of their light made Einstein a celebrity and transported his theory to the forefront of public imagination. Nearly a century later, a star 26,000 light-years from Earth promises to test Einstein on a far grander scale and determine if the weirdest object in the universe—a supermassive black hole—can alter the fabric of spacetime.

    * Ron Cowen is a freelance writer in Silver Spring, Maryland.

  5. Marine Conservation

    As Threats to Corals Grow, Hints of Resilience Emerge

    Charles Schmidt*

    Some reefs are showing a surprising ability to resist or bounce back from damage. Could such resilience help corals survive in a rapidly changing ocean?


    This coral-mangrove ecosystem in the U.S. Virgin Islands is thriving despite ocean conditions that killed nearby reefs.


    Eight years ago, a blistering heat wave sent local sea temperatures soaring in the eastern Caribbean, killing more than half of the region's coral reefs. Many have yet to recover. But in Hurricane Hole, a sheltered bay off St. John in the U.S. Virgin Islands, one vibrant coral ecosystem survived unscathed. "We've identified more than 30 coral species" that avoided the catastrophe, says Caroline Rogers, a marine biologist with the U.S. Geological Survey (USGS) in St. John. "The diversity is astounding."

    Rogers has been trying to understand what made the corals in Hurricane Hole so resilient, and she has plenty of company. Around the globe, a growing corps of scientists is searching for resilient reefs and then trying to identify what enables them to resist or bounce back from severe environmental stress. They've found tantalizing hints that heat-resistance genes, the proximity of other reefs, and even the presence of plant-eating sea life that scrubs corals free of weedy algae can all play a role. But they're also discovering that the factors that promote resilience can vary greatly from reef to reef.

    Still, coral researchers are trying to extract some general, widely applicable lessons from their studies of resilience, partly in hopes of developing smarter conservation strategies, such as better designed marine reserves. It's a task that is taking on increased urgency, as climate change threatens to wipe out corals and remake ocean ecosystems. "We in the environmental community look for points of hope," says ecologist Stephanie Wear of the Nature Conservancy in Arlington, Virginia. And in identifying possible ways to boost reef resilience, she says, "We see opportunity."

    Stressful developments

    Coral reefs weren't always considered fragile. When Rogers was working on her doctorate in the late 1970s, many researchers believed that reefs were intrinsically stable ecosystems, threatened mainly by local storm damage. But by the early 1980s, it was obvious that corals were in trouble. In the Caribbean, reefs took a noticeable hit after disease led to a massive die-off of spiny sea urchins, which had helped to keep reef-smothering algae in check after another group of herbivores, parrotfish, had been overexploited. In the Pacific and Indian oceans, unusually high seawater temperatures linked to the large-scale weather pattern known as El Niño caused massive "bleaching" events in 1982 and 1998 that turned reefs a ghostly, skeletal white. Bleaching happens when heat-stressed corals expel the symbiotic algae—known as zooxanthellae—which live in their tissues and supply nutrients in exchange for protection. And it is often fatal: the 1998 event killed 16% of the world's reefs overall and up to 95% in some locations, says Robert Steneck, a biologist at the University of Maine's Darling Marine Center in Walpole.

    But here and there, reefs have shown the ability to resist or rebound from such shocks. In an oft-cited example, the extensive reef system off Palau, in the western Pacific, charged back within a decade after suffering extensive bleaching losses in 1998. Similarly, reefs off the Cocos Islands in the eastern Pacific, which were virtually destroyed by bleaching during the 1980s, experienced up to fivefold increases in coral cover within 20 years. These recoveries sparked widespread interest in understanding the underlying "drivers" of resilience, with an eye toward developing better reef protection plans.

    Scraping by

    Different fates.

    Corals in Palau (top) have recovered from severe bleaching, while many reefs in the Caribbean (bottom) have not.


    One focus has been on understanding the role of herbivorous fish and plant-eating invertebrates such as sea urchins. Ecologists have long known that these grazers play an important role in reef health by mowing down weedy algae and clearing attractive settling spots for young corals. Now, many believe that those tasks are essential to enabling some damaged reefs to recover from ecological stress.

    For example, when Steneck visited Palau's bleached reefs in 2000, he was heartened by the abundance and diversity of herbivorous fish that were still patrolling the coral skeletons. "The reefs were extremely well-grazed," he recalls. "I thought the conditions for coral recovery were great because young corals would have a chance to settle." The subsequent recovery of those reefs reinforced Steneck's belief that protecting herbivorous fish is one of the most effective means of boosting reef resilience. A number of studies appear to back his view, including experiments published in Current Biology in 2007 conducted on Australia's Great Barrier Reef in which researchers deliberately removed grazing fish; the corals were soon overwhelmed by algae.

    At the same time, in other parts of the world, the importance of grazer abundance is less obvious. Researchers have found no clear relationship, for instance, between the number of reef herbivores and how much macroalgae covers reefs off the coast of New Caledonia, in the Southwest Pacific, a team led by Laure Carassou, a postdoctoral research fellow at Rhodes University in South Africa, reports in a forthcoming PLOS ONE paper. Even on reefs where herbivorous fish numbers had been severely depressed by overfishing—such as in the Coral Triangle, the vast marine system that stretches from the Southwest Pacific to the western tip of Indonesia—algae cover tends to be low, notes co-author Michel Kulbicki, a reef ecologist with the Research Institute for Development in Banyuls, France. Such results suggest something is still eating the algae, and that conservation plans may need to target those species—rather than all herbivores—to boost resilience.

    Making connections

    Researchers are finding similar complexities as they explore the role of "connectivity," or how ocean currents and oceanographic corridors enable organisms, especially juvenile fish and corals, to move from one reef to another. In general, connectivity has been seen as crucial to maintaining reef resilience, because it can enable a damaged reef to receive a steady supply of fresh recruits from even distant reefs that are still healthy. That's why conservation planners often seek to design marine reserves so that they protect both the nursery sources and the final homes of reef organisms. The problem with that approach, studies suggest, is that some damaged reefs may not be able to count on immigrants to make up for coral losses. Many coral larvae appear to travel no more than 300 meters before settling down, notes Maine's Steneck, and there is growing evidence that many reefs "self-seed," or produce their own recruits. That reproduction strategy presents a challenge to coral conservationists, because self-seeding reefs are thought to be less resilient than more connected reefs. For example, isolated reefs in the eastern Pacific—which rely on self-seeding by geographic necessity—recover more slowly from disturbances than better connected reefs in the western Pacific, according to biologist Nicholas Graham of James Cook University in Townsville, Australia.

    Hot topic

    To get at the biochemical foundations of resilience, researchers have also been delving into coral physiology and genetics, particularly to understand how some reefs have avoided bleaching. In a sheltered reef pool off the coast of an island in Samoa, for instance, researchers in 2008 discovered corals that thrive in unusually warm water that regularly reaches 37°C, some 7° above the warmest summer temperature of nearby seas. That's remarkable because corals typically bleach at just one degree above the local summer maximum, notes marine biologist Stephen Palumbi of Stanford University in Palo Alto, California.

    As part of their study of these atypical corals, Palumbi's team examined which genes switched on when confronted with heat stress. In all, the tests identified 61 stress-resistance genes, including those that code for heat-shock proteins, antioxidant enzymes, and immune regulators, the team reported in January in the Proceedings of the National Academy of Sciences. The next step, researchers say, is to see how many other corals might carry similar genes that could confer resilience.

    Some good candidates might come from places where seawater temperatures already vary greatly, providing a kind of evolutionary "tough love" that could prepare corals to deal with warming seas, says zoologist Tim McClanahan of the New York City–based Wildlife Conservation Society. Off the west coast of Madagascar, for instance, reefs are sheltered from a major cooling current that flows from the east, allowing them to thrive in waters that are often as warm as those encountered during El Niño events.

    The hardy corals in St. John's Hurricane Hole may have also benefited from heat-tolerance genes, since the water there is often warmer than it is around nearby offshore reefs, says USGS's Rogers. But they also may have been helped by shading from nearby mangrove trees, she says, given that exposure to ultraviolet light is known to exacerbate the effects of warmer waters on coral bleaching. (Studies by ecologist Peter Mumby of the University of Queensland in Australia suggest that coral reefs surrounding the Society Islands, in the western Pacific, escaped the 1998 bleaching event because they were protected by cloud cover.)

    Lessons learned?

    Gone for good?

    Researchers are trying to understand what allows bleached corals, such as these off St. Croix in 2005, to recover.


    In a bid to draw some broad, practical lessons from these growing but scattered examples of resilience, McClanahan recently asked 50 colleagues to help him develop a list of the top factors that might predict which reefs can resist and recover from bleaching and other threats. He asked each scientist to rank 31 resilience drivers and give the top scores to those they felt had the most scientific evidence. The result, published online last August in PLOS ONE, was a list of 11 highly ranked drivers. The researchers said a reef could have better recovery prospects, for instance, if it had a high level of coral recruitment and a low level of macroalgae. And reefs might be able to resist bleaching if they were already home to varieties of zooxanthellae or corals known to tolerate warmer waters; corals from the genus Porites, for instance, don't bleach as readily as corals from other genera.

    McClanahan's study is prompting some government agencies that are responsible for managing corals—such as the U.S. National Oceanic and Atmospheric Administration (NOAA)—to try to characterize reefs by their resilience potential. The idea is to ultimately come up with site-specific protection plans that take into account a reef's strengths and seek to shore up weaknesses. For example, research suggests that sediment and nutrient pollution can diminish a coral's tolerance to heat, elevating bleaching risks. By addressing those land-based threats, managers could shore up a reef's resilience to warming seas, explains NOAA marine scientist Britt Parker in Silver Spring, Maryland.

    There are numerous hurdles, however, to putting resilience-based plans into practice. One is that managers often lack key pieces of knowledge, such as exactly where young corals and fish come from on a given reef. So if they are trying to design a new preserve, that means they may not know which other reefs to include to ensure adequate connectivity. To get around such problems, conservation planners sometimes use proxies—such as data on currents or the knowledge of local people about fish populations—to help identify the right areas to protect. One risk in that approach, however, is that it can lead to proposed reserves that are so big that they stoke opposition from anglers, divers, and others who might lose access. Such conflicts might be reduced, Wear says, if researchers were able to develop inexpensive, noninvasive ways to establish reef connectivity (such as easy genetic tests) and fine-tune reserve designs.

    A better understanding of resilience could help planners develop more targeted conservation strategies that don't threaten local livelihoods, Steneck agrees. For example, if studies show that the presence of herbivorous fish is essential to a reef's health, but carnivorous fish play a lesser role, officials could consider customized fishing limits. The Caribbean island of Bonaire, for instance, outlaws spearfishing and trap fishing, which can target herbivorous reef fish, but allows bait-and-hook fishing for carnivores. And in Palau, officials have banned the export of herbivorous parrotfish, but allow killing of the fish for local consumption. "The reef off Palau is huge, but the local [human] population is small, so there are plenty of parrotfish to graze the reef and sustain local needs," Steneck explains. "But no reef population could withstand fishing pressure for export markets."

    Long-term challenges


    Even if managers are successful in injecting resilience-based measures into conservation plans, however, it is not clear that they can protect reefs from the twin long-term challenges posed by rapid climate change: rising water temperatures and ocean acidification, a pH change spurred by the sea's absorption of atmospheric carbon dioxide. The discovery of heat-tolerant corals is giving some researchers hope that some species will adapt to warmer seas. In addition to the heat-resistant Palau corals, researchers have identified reef-building corals at the northern end of Australia's Great Barrier Reef that tolerate warmer water temperatures than those found 1500 miles to the south. But to survive, reefs may also have to migrate to cooler waters—and that's a doubtful proposition, says Ove Hoegh-Guldberg, director of the Global Change Institute at the University of Queensland. Reefs would have to move by 15 kilometers per year, with their complicated ecosystems intact, to stay within the temperatures to which they're acclimated, he reported in an August 2012 study published in Nature Climate Change.

    Another sobering unknown is whether corals can keep pace with acidifying seas, which can dissolve the calcium skeletons of corals and interfere with the development of eggs and larvae. Coral calcification will no longer keep pace with physical reef erosion once atmospheric carbon dioxide levels top 450 parts per million, according to Hoegh-Guldberg. With levels now at 395 ppm, and rising at a rate of 2 ppm a year, that tipping point could be less than 30 years away, he notes.
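    Hoegh-Guldberg's timeline is back-of-the-envelope arithmetic. A quick sketch (assuming, as a simplification, that the 2 ppm annual rise stays constant) bears out the "less than 30 years" figure:

```python
# Rough check of the tipping-point estimate attributed to Hoegh-Guldberg.
# Assumes a constant rise of 2 ppm per year, a simplification; the actual
# growth rate of atmospheric CO2 varies and has been accelerating.

current_ppm = 395    # atmospheric CO2 at the time of writing (2013)
tipping_ppm = 450    # level at which reef erosion outpaces calcification
rise_per_year = 2    # ppm per year

years_to_tipping = (tipping_ppm - current_ppm) / rise_per_year
print(years_to_tipping)  # 27.5, i.e., less than 30 years
```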

    Researchers have found hints that some corals and reef organisms have genetic resilience to acidification. Both Palumbi and Hoegh-Guldberg say that they have collected data showing that corals respond to acidification by changing gene expression, as they do when exposed to warmer seas. "But identifying a genetic response is a long way from showing that a coral can adapt—and that coral reefs can survive—the extremely rapid pace at which we are changing the environment," Hoegh-Guldberg says.

    With conditions changing so quickly, Hoegh-Guldberg and other scientists have suggested that the world's reefs might be saved only with more radical measures to enhance resilience: by breeding heat-resistant corals and using them to build new reefs, for instance; dumping minerals into the sea to neutralize acidity; or even shading reefs with vast sheets of buoyant cloth. But such measures are probably impractical, given the vast extent of reefs.

    In the meantime, resilience studies are reinforcing the need for multifaceted conservation strategies that look for both short- and long-term gains. "In some ways, what we are learning about coral resilience is simply bringing us back to where we started," USGS's Rogers says. "Managing human activities at the local level—while still hoping that global efforts to control greenhouse gas emissions will become more effective."

    * Charles Schmidt is a writer living in Portland, Maine.

  6. Steering Cancer Genomics Into the Fast Lane

    1. Elizabeth Pennisi

    With her passion for DNA sequencing technology, Elaine Mardis now hopes to help cancer patients.

    When Elaine Mardis competed for her first tae kwon do black belt, Richard Wilson, her instructor, was a little worried. The test included a side kick through a free-standing, 5-centimeter-thick paver stone. Though a standard part of the belt test for years, the bricks were now made differently, with epoxy, and were harder to break. A dozen students failed in their first attempts that day. But not Mardis. She "hammered" it, Wilson recalls.

    Melding cancer and genomics.

    Timothy Ley, Elaine Mardis, and Richard Wilson (left to right) brought whole genome sequencing to bear on leukemia.


    The concentration and drive that destroyed that brick have stood Elaine Mardis in good stead throughout her almost 30-year career in genome sequencing. Now co-director of the Genome Institute at Washington University (WU) School of Medicine in St. Louis and one of the few female leaders in the field, she jumped into DNA sequencing during its earliest days, was instrumental in helping decode the human genome faster than expected, and has since pushed the limits of the technology to learn about cancer. Today, oncologists can envision the day when they routinely sequence their patients' tumors, but when Mardis and her colleagues broke new ground by fully deciphering the first cancer genome, most researchers considered it an impossibly difficult and expensive task. "It was a foundational piece of work," says Paul Spellman, a cancer systems biologist at Oregon Health & Science University in Portland. "But it did not suddenly mean that everyone else could do it easily."

    Mardis and her colleagues were not content to sequence the genome of just one patient or even one cancer. They got in on the ground floor of the Cancer Genome Atlas, a massive National Institutes of Health (NIH) effort to characterize the molecular basis of 20 cancers (Science, 15 September 2006, p. 1553). Across many tumors, the WU group is using its technology to find genes and pathways that may work as drug targets and to see how tumors develop resistance. Her team has even helped save the life of one of their labmates after he relapsed with leukemia. By providing him with what Mardis calls the Maserati approach, a series of genomic analyses that she hopes may one day become standard for clinical care, they identified an existing drug that stopped the relapse in its tracks. Her goal now, Mardis says, is "to have an influence on patient care."

    Learning to sequence

    Mardis stands out from the crowd for more than her science. She loves pink, for example, and lets people know it: Hot pink lipstick is her trademark. Today, she's also sporting a pink winter coat and rose-colored snakeskin boots. And such playful boldness continues into Mardis's office, where one can't help but notice the rows of red toy cars in all shapes and sizes. Dozens of Hot Wheels sports cars are arrayed on her desk. In front of her is a small display case with miniature Ferraris (1962 and 2002 models), reminders of "one of the most exhilarating experiences I've ever had," she says, referring to her driving a friend's Scuderia Spider in California. "I really like cars, red cars in particular." Her current vehicle is a red Audi S5, but her dream machine is a red Porsche 911 Carrera S.

    Off the road, Mardis's machine of choice is the DNA sequencer, and the faster the better. Inspired to study science by her chemistry professor father, the Nebraska native took up biology across state lines at the University of Oklahoma (OU). For graduate school, OU biochemist Bruce Roe lured her from fruit fly genetics, where DNA was more of an "abstract entity," to molecular biology, where "I could really work with it," she recalls. Roe soon managed to buy one of the first commercial DNA sequencing machines, and he set Mardis loose on making that machine, as well as a primitive robot for pipetting, work for research. "We started to think, how do we scale up these methods to produce a data production pipeline?" Mardis says.

    With the beginnings of a pipeline, she deciphered 5000 hard-to-sequence DNA bases from a bacterium for her Ph.D. thesis, a remarkable feat back in 1989. By that point, however, technology development had become her passion. After graduating, she went to work for Bio-Rad, a California company that was developing a new polymerase, an enzyme important for DNA sequencing. There she learned all about management styles—good and bad—and how to set up production lines for biological products, knowledge that she put to good use when Wilson, who had also worked in Roe's lab, recruited her for WU's new genome center in 1993. Her job: developing instruments and techniques for high-throughput DNA sequencing.

    Within months, Mardis had come up with a new strategy for isolating the DNA targeted for sequencing. At the time, a virus called M13 was used in bacteria to make many copies of the DNA to be sequenced, but then that DNA had to be separated from the bacteria's own genetic material. Mardis's approach was more efficient, less caustic, less smelly, and didn't yield as much hazardous waste as the traditional phenol-based extraction method. That was just the first of many improvements she has made in the past 20 years, during which she and her colleagues sequenced the first animal genome in 1998, Caenorhabditis elegans, and then made a frenzied push to complete the human genome ahead of a private company (Science, 16 February 2001, p. 1177). Her technology development group swelled to 2 dozen people, including electrical and mechanical engineers, as they worked to come up with ways to automate and standardize the preparation and sequencing of DNA samples. "We set aggressive goals that at the time we set them weren't possible. But failure wasn't an option, so you just rolled up your sleeves and figured out how to do it," Mardis says.

    Her peers were impressed. "In a way, she served as a model for me," says Chad Nusbaum, who was involved with scaling up human DNA sequencing at the Whitehead Institute in Cambridge, Massachusetts. "We were all tackling a big challenge. She did it with more verve than most."

    By 2001, a rough draft of the human genome was done. (The public and private efforts arguably tied.) As they finished some of the last details on that genome, WU's sequencers began knocking out additional ones, tackling the mouse, chicken, platypus, corn, and zebra finch genomes. Gradually, the Genome Center, which was renamed the Genome Institute 2 years ago, shifted into sequencing the DNA of more and more people. Since 2010, the center has really ramped up its efforts in cancer, such that today, 60% of the DNA studied comes from human cancers.

    Part of that shift was made possible by the cheaper and faster sequencing technology that began to appear in the mid-2000s. "Really seeing for the first time what would be possible" with these new machines was a "eureka moment," Mardis recalls.

    Automation for sequencing.

    Mardis developed this robot, which picks out the DNA to be sequenced, to save time and labor in the Human Genome Project.


    But while sequencing technology brokers still see her as the go-to person, and she's a founder and organizer of the annual DNA sequencing conference that's become the place to announce new technologies and hobnob with industry representatives, a new passion, battling cancer, has begun to push her. "My love of technology hasn't changed, but my job description has," Mardis says. "I feel a sense of urgency with cancer genomics."

    Tumor whole genomes

    Amidst the toy cars on her desk is a sign of that urgency: The Biology of Cancer, the classic textbook by geneticist Robert Weinberg. A crash education through reading that tome, attending cancer meetings, and talking to physicians has helped her learn the language of the many oncologists now seeking her aid in harnessing genome sequencing. (Ironically, Weinberg has publicly questioned whether cancer genomics will help patients more than traditional methods of studying individual genes.)

    Eight months ago, for example, after giving a talk in Italy, she was approached by a Harvard University physician who for years has collected tissue samples from lung cancer patients before and after their tumors had become resistant to drug treatments. He is now sharing those samples with Mardis, hoping that DNA sequencing will clarify the dynamics of the tumors. Her technology expertise had given her the confidence to be the first to try whole genome sequencing of tumors, and now she's involved with genome studies of seven cancers—of the breast, brain, liver, prostate, pancreas, lung, and blood.

    This run began in 2007 when she, Wilson, and Timothy Ley, a WU oncologist located a few blocks away, were in the midst of trying to use genomics to find mutations underlying acute myeloid leukemia (AML), which affects 13,000 people in the United States each year and kills 8800. Until then, cancer genomics consisted primarily of looking for large-scale aberrations in chromosomes, such as translocations, and studying genes that were suspect because they had already been implicated in cancer. Instead of just going after individual genes, the trio wondered, why not use next-generation sequencing technology to try to read a whole genome from tumor cells, and, for comparison, from healthy cells in the same patient? Others had sequenced the exomes, or protein-coding regions, of the DNA in tumor cells, which represent just a small percentage of the whole genome, to try to find mutations. The whole genome should provide a more complete picture, pinpointing deleted genes and changes in regulatory regions as well as in genes themselves.

    They proposed the project, estimated to cost $1 million, to NIH as part of their grant renewal, but the study section reviewing it "hated it," Mardis recalls. Unfazed, they pitched their case to a local philanthropist, Alvin J. Siteman, after whom WU's cancer center is named. The next day, he transferred stocks worth $1 million to WU for the work, spurring a Human Genome Project–like frenzy among the researchers. They spent 16 months learning how to work with and then analyze the short pieces of DNA sequence produced by the next-generation sequencers. "It turned out to be very difficult," says Gad Getz, who is involved in cancer genomics at the Broad Institute in Cambridge. "In terms of analysis, it was a nightmare."

    But the resulting paper, published in Nature in 2008, was a real eye-opener. The team pinpointed 10 relevant mutations, two of which were already linked to AML and eight that were new to oncologists. "The effect was that people saw that it was actually possible to sequence a whole tumor genome," Getz says. "Very many people were excited." Suddenly, sequencing looked like it might eventually have a place in cancer care. They "started the idea of doing personal genomics with whole genomes for cancer patients," Spellman says.

    That first AML sample came from a woman in her mid-50s who had responded to treatment at first, then relapsed a year later and died a year after that. The group described a second AML patient's genome in 2009 in The New England Journal of Medicine—the tumor cells had 12 mutations in genes and about 50 base changes in putative or known regulatory regions. Their further studies of other people with AML demonstrated that mutations in two genes, IDH1 and DNMT3A, help predict a patient's outcome. Patients have different portfolios of mutations, Mardis explains, but the hope is that by sequencing enough AML cancers, patterns will emerge that will reveal a common set of mutations, or at least pathways affected by the diversity of mutations, that can be specifically targeted for therapy.

    For eight AML patients, the WU group sequenced cancer cells when each person was first diagnosed and when the patient relapsed after chemotherapy. The relapse cancer came from the same pool of bone marrow cells that had led to cancer in the first place, but those cells had undergone additional mutations, they reported in the 11 January 2012 issue of Nature. "We were the first to show that, genome-wide, chemotherapy had a 'signature' of DNA damage," Mardis says. "It confirms what was suspected for some time. By using chemotherapy, we may be setting up the patient to relapse because of acquired new mutations."

    Toward the clinic

    Cancer genomics became very personal for Mardis 2 years ago. In July 2011, Lukas Wartman, an oncologist and postdoctoral fellow in Ley's lab, who had been successfully treated for acute lymphoblastic leukemia (ALL) and had relapsed once before, relapsed again. Ley and Mardis had just begun a genome sequencing project for ALL, so they deciphered all the DNA from a sample of Wartman's cancer cells and also analyzed the cancer's transcriptome, that is, all the cells' RNA, to see which genes were active. "It seemed more of an academic exercise and not something that would change the course of my treatment," Wartman recalls. But as aggressive chemotherapy failed, and he became very sick, the project became a race against time to find another solution.

    It took about 4 weeks to get the DNA results; they were disappointing. Of the many mutations found in Wartman's cancer cells, none were known drug targets. A few days later, however, the transcriptome study revealed excessive amounts of RNA for a gene called FLT3, which helps cells grow and divide.

    Fortuitously, as part of an effort to streamline the analyses of cancer genomes and transcriptomes, Obi and Malachi Griffith, twin bioinformaticists working for Mardis, had developed a database that links gene mutations to small molecule drugs that inhibit those genes' resulting proteins. The motivation is that using such targeted drugs would have fewer side effects than conventional chemotherapy and not cause further mutation in the tumors. In their database, the research team found a drug that could counter FLT3's growth-stimulating activity; the company that owned its rights had already received Food and Drug Administration approval to treat kidney cancer with it.
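    At its core, a database like the Griffiths' is a lookup from aberrant genes to small-molecule drugs that inhibit the corresponding proteins. A minimal sketch of the idea follows; the gene-to-drug entries and names here are illustrative placeholders, not the actual contents of their database:

```python
# Sketch of a gene-to-drug lookup of the kind described in the article:
# genes flagged by a tumor's genome or transcriptome analysis are matched
# against known inhibitors of their protein products. Entries below are
# hypothetical placeholders, not real database records.

druggable_genes = {
    "FLT3": ["inhibitor_A"],   # placeholder; the real database links each
    "KIT":  ["inhibitor_B"],   # gene to approved or investigational drugs
}

def candidate_drugs(active_genes):
    """Return drugs targeting any gene flagged by the tumor analysis."""
    hits = {}
    for gene in active_genes:
        if gene in druggable_genes:
            hits[gene] = druggable_genes[gene]
    return hits

# A tumor analysis flagging FLT3 (as in Wartman's case) and TP53 would
# surface only the FLT3-targeting drug here.
print(candidate_drugs(["FLT3", "TP53"]))  # {'FLT3': ['inhibitor_A']}
```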

    A way of life.

    The discipline and drive that Mardis brings to tae kwon do extends to her work as well.


    The treatment worked: Once again, Wartman is in remission. The satisfaction of finding something useful for a cancer patient is "a feeling I want to have more often," Mardis says.

    Toward that end, she and her colleagues have become more focused on how to use sequencing data in the clinic. Mardis argues that for now, whole genome data need to be supplemented with transcriptome and even exome information to get the most comprehensive picture possible—the so-called Maserati approach. Matthew Ellis, a WU oncologist who has closely worked with Mardis on breast cancer genomics, would like to see the proteome—details about the proteins present in a tumor cell—included as well in a patient's workup. "They are pioneers, and now the whole field is moving in that direction," Getz says.

    The push toward the clinic has required more technology development. Mardis's group has been coming up with ways to deal with patient samples that are too small for easy DNA sequencing or that are difficult to process because they have been preserved in formalin and embedded in paraffin. At the same time, the team is working to shorten the time between getting a sample and generating usable results. Right now, it takes about a month; they want it to take a few days at most. They are trying out a new machine by Illumina that promises to cut the sequencing time from 10 days to 1 day.

    Mardis is also exploring the uses of genomics data in other contexts. One new breast cancer project intends to use genomics to figure out if there are proteins uniquely produced by a person's tumor that can become targets for personalized immunotherapy or vaccinations against recurrences. A childhood brain cancer effort uses genomics to get transcriptome readouts from microglia cells, supportive cells in the brain that may provide cancer-stimulating signals to brain cancer cells.

    To fit all these projects into her schedule, Mardis gets up at 4:30 a.m. and is at her desk by 7, traveling 48 kilometers from a French country-style house she and her husband built in 2011. They are empty-nesters, except for two Yorkshire terriers and a German shepherd; their daughter is a college senior. Mardis has been away from home roughly 150 days a year for the past several years, giving so many talks and workshops that recently Wilson put a "No!" sign on her computer monitor to try to keep her from accepting every invitation and project that comes along. But Mardis's new sense of urgency rarely lets her follow that advice—and cancer patients of the future may be glad for that. Roe says, "She continues to come up with new ideas, new ways of doing things and new ways of asking questions."

  7. The Downside of Diversity

    1. Jocelyn Kaiser

    Increasing genetic evidence that tumors contain a heterogeneous mix of cells may explain why cancer treatments often fail.

    As an oncologist, Charles Swanton too often has to tell his patients with advanced lung and breast cancer that their options are running out. That despite several different treatments, each somewhat successful at first, their tumors have grown back yet again, faster than ever. "It's as though tumors have this ability to guess what you are about to do next and preempt it," he laments.

    Swanton now thinks that he has evidence explaining how they might do that. When the research team he leads at Cancer Research UK's London Research Institute recently sequenced DNA taken from different parts of a patient's kidney tumor, the results did not agree. While several genetic changes were shared throughout the original tumor mass and other tumors, or metastases, that sprang from it, most were present in only some parts, suggesting that tumors harbor diverse populations of cells. Some of these cells may resist a treatment, which then frees them to grow, Swanton concluded: "To me, this begins to explain why drugs and therapies stop working."

    Swanton's results have not only confirmed a hunch that he says many clinicians share, but they have also brought him a measure of unexpected scientific fame. His team's report last March in The New England Journal of Medicine (NEJM) on the genetic diversity within individual kidney tumors attracted 143 citations by year's end, more than any other original research paper in biomedicine in 2012, according to the company Thomson Reuters, which tracks the impact of scientific papers. And Swanton's e-mail inbox is overflowing with speaking invitations. "I've never been so busy in all my life," says the 41-year-old who started his own lab just 5 years ago.

    Mixed bag.

    Breast cancer tissue tagged with colored fluorescent markers for specific molecular changes shows that not all cells in a tumor are the same.


    Genomics studies like Swanton's are stirring new interest in an old idea—tumors are a mosaic of different cells—by confirming with modern techniques that tumors are not always dominated by genetically identical cells, as once thought. Instead, tumors often contain many subsets of cells that are related but genetically distinct. As a tumor evolves, identical cells apparently split off and develop new mutations or other errors, like a tree growing many branches. This means that two parts of the same tumor, as well as the metastases that form when cancer spreads, can look very different genetically.

    The extent of this intratumor heterogeneity has shaken cancer biologists and clinicians. They're rethinking major projects to tally up cancer mutations, for example, because the data so far have been based on a single sample for each tumor type. And physicians worry that the new studies suggest that the growing practice of analyzing the genetic and molecular characteristics of a single tumor biopsy to guide patient treatment may sometimes mislead. A cancer's heterogeneity could also account for why much-heralded new drugs that target specific gene mutations may keep tumors in check for a time but almost inevitably stop working; they may merely lift a lid on a small population of drug-resistant cells and give them a chance to grow.

    "To me, this begins to explain why drugs and therapies stop working."

    — Charles Swanton, Cancer Research UK


    "What it's leading us toward is a kind of understanding that cancer is a very slippery beast. There's so much competition and so much evolution within a given cancer that it will be able to evade many of the treatments that we throw at it," says cancer geneticist Peter Campbell of the Wellcome Trust Sanger Institute in Hinxton, U.K.

    Some researchers say that not all is lost, however. A better handle on intratumor heterogeneity could lead to more effective treatments. "It's daunting, but it's exciting because we now have the tools" to understand what drives diversity in a tumor, says Kornelia Polyak of the Dana-Farber Cancer Institute in Boston, who studies intratumor heterogeneity in breast cancer.

    Clone wars

    Pathologists have known since they first put tumor tissue under a microscope in the mid- to late 1800s that tumors aren't made up of uniform-looking cells, hinting at underlying biological heterogeneity. In the 1970s and early 1980s, cancer biologists showed that cells taken from different parts of a tumor varied in their ability to form tumors and spread in mice and in how the cells responded to drugs.

    The notion that tumors contained a diverse range of cells fit with the clonal evolution theory of cancer, put forth in Science in 1976 by cancer biologist Peter Nowell. The theory holds that cancer develops when a single cell randomly acquires a series of mutations that allow it to multiply and outcompete other, differently mutated cells, or clones, in a kind of Darwinian competition. However, Nowell assumed the less fit clones died out, which meant that one clone should make up most of the tumor mass and any other cells should be less mutated versions of that clone. That view later gained credibility as Johns Hopkins University researcher Bert Vogelstein and a co-worker applied it to colon cancer, describing a slow, linear accumulation of specific genetic changes that transform a normal epithelial cell into a polyp and then a growing tumor dominated by a single clone.

    Yet in the past decade, Polyak's lab and others testing tumors for cancer-related abnormalities, such as specific mutated genes, extra copies of genes, or abnormal levels of proteins, have found variation within tumors, suggesting that many distinct subpopulations of cells, known as subclones, exist. Only when DNA sequencing costs dropped and researchers could probe tumors at the depth needed to find rare mutations, however, did they begin to glimpse the full picture of genetic diversity within a single cancer.

    In the last 4 years, for example, several research teams worldwide, including one led by Elaine Mardis of Washington University School of Medicine in St. Louis (see p. 1540), sequenced the genomes of breast tumors and found that a woman's metastases differed genetically from her primary tumor and appeared to have evolved from subclones within that original tumor. Vogelstein's team sequenced different parts of several pancreatic tumors and found that a series of genetically diverse subclones had evolved over time, some of which had split off from the primary tumor to form the patient's metastases outside the pancreas. Several genetics studies of leukemias, including genome sequencing by Mardis's group, have shown that these cancers of the blood sometimes harbor many rare subclones.

    These experiments, which suggested a branching evolution model for many cancers rather than a linear process (see diagram), usually tested blood or ground-up biopsy samples, so the results reflect a mixture of cells. In another approach to studying heterogeneity, Nicholas Navin and others in the lab of geneticist Michael Wigler at Cold Spring Harbor Laboratory in New York have sequenced the DNA of single cells taken from breast cancer biopsies. They found distinct populations of cells, including some that had acquired new mutations in a rapid burst of "punctuated" evolution, rather than developed them in a gradual manner as had been believed, says Navin, who is now at the University of Texas MD Anderson Cancer Center in Houston.

    "There will be a lot of investment in this area and maybe trials, but I think it will take some time to figure out what really helps [overcome tumor heterogeneity]."

    —Kornelia Polyak, Dana-Farber Cancer Institute


    Perhaps the most dramatic study yet of genetic heterogeneity within a solid tumor is the one led by Swanton's lab. In addition to conducting other molecular tests, the researchers methodically sequenced the protein-coding DNA of samples snipped from various locations on a large kidney tumor and from the patient's metastases. They reported that of the 128 mutations identified in one patient, only about one-third were shared by all the samples. The level of heterogeneity was similar for three more patients' tumors. Some parts of a tumor had different mutations in the same cancer genes, showing that subgroups of cells had evolved a defect in the same gene independently.

    The good news, Swanton says, is that some growth-spurring "driver" genetic changes present early in the evolution of the patient's tumor persisted as it grew and spread, which means that a drug targeting those abnormal genes should work on most of the cancerous cells. However, his and other studies also suggest that even if a drug works at first on metastatic cancer, it probably won't work over the long term because tumors harbor rare clones with additional genetic changes that may enable them to resist the drug, Swanton says.

    The resistance may also come from heterogeneity caused by sources other than mutated genes, some researchers say. They are finding evidence that chemical modifications to the DNA or other molecular fluctuations within the cell that turn genes off or on may allow some tumor cells to overcome cancer drugs or chemotherapy. "It certainly makes it more complicated, that's for sure," says John Dick of the University of Toronto in Canada. (He led a study with Darryl Shibata of the University of Southern California, published in the 1 February issue of Science, suggesting that genetically similar cells in colon tumors can differ in their resistance to drugs [p. 543].)

    Tumor heterogeneity, whether from DNA mutations or other changes in the cells, is a persuasive explanation of why treatments fail, Polyak says. She goes so far as to suggest that it explains resistance better than the "cancer stem cell" theory, which posits that resistance occurs when a rare, drug-resistant cell expands to dominate the tumor mass.

    Provocative questions

    To some who study the evolution of cancer, the accumulating evidence of genetic tumor heterogeneity is welcome but unsurprising—and they worry that it will do little for treatment. "It's all descriptive," says Darryl Shibata of the University of Southern California in Los Angeles. "The critical question is, is heterogeneity of any scientific and practical value? Its actual meaning is unclear."

    That hasn't stopped leaders of the Cancer Genome Atlas, a large project partly funded by the U.S. National Cancer Institute (NCI) in Bethesda, Maryland, from taking steps to address tumor heterogeneity. One follow-up project that they are considering is to go back and sequence leftover samples from primary tumors and, when available, metastases, to explore how the cancers evolved, says NCI's Stephen Chanock. The Sanger Institute's Campbell expects that other cancer genome projects in the United Kingdom and elsewhere will begin sequencing samples of both primary tumors and their metastases. Some researchers even argue that these projects may have missed the most important, deadliest mutations by focusing on only single biopsies. "It's very embarrassing because they should have done this at the very beginning," Shibata says.

    Two views.

    Tumors were long thought to evolve in a linear fashion as a single cell acquired growth-spurring mutations and dominated the final mass. But new studies indicate that in many tumors, cells branch off and form a diverse tumor with cells that may evade treatments.


    Campbell sees at least one clinical change coming out of the work on tumor heterogeneity: Physicians will need to sequence primary and metastatic tumors, when feasible, if they're tailoring therapies to an individual. They may also need to biopsy tumors throughout the course of a patient's disease to follow the cancer's evolution, he and others say.

    A major unresolved issue swirling around tumor heterogeneity is whether patients with more genetically diverse tumors have a poorer prognosis, Swanton says. They do, according to some early evidence from a team at the Dana-Farber Cancer Institute and the Broad Institute. The researchers reported in Cell in February that among people with chronic lymphocytic leukemia who received chemotherapy, those whose original leukemia harbored subclones with one or more cancer-driver genes needed retreatment more often or died sooner than people without this diversity. Campbell says his group, too, is finding that in patients with a preleukemia condition called myelodysplasia, those with an aggressive subclone in their tumor before treatment develop leukemia and die sooner.

    Some ambitious researchers would like to be more precise about heterogeneity in a person's cancer, possibly ranking it on a graded scale, Navin says, to improve treatments. For instance, Carlo Maley, director of the Center for Evolution and Cancer at the University of California, San Francisco, has modeled the progression of Barrett's esophagus, a precancerous condition in which cells in the lower esophagus acquire an abnormal appearance. Several years ago, he showed that patients whose premalignant cells score higher on a numerical scale of genetic diversity are more likely to develop full-blown cancer.

    Some remain optimistic that the diversity of tumors doesn't mean that targeted drugs are doomed to fail. Vogelstein suggests that because most patients' primary tumors are removed with surgery, only the heterogeneity in the metastatic tumors matters (see p. 1546). Some of those patients have metastases with mutations in two different but fundamental growth pathways that can each be targeted with a drug; Vogelstein argues that combining drugs in such cases will drastically reduce the odds that the cancer will develop resistance.

    Swanton wants to defeat heterogeneity by preventing it. His group recently reported that in colon cancer, cells are often missing some genes that make sure DNA is copied correctly during cell division, which can result in chromosomal abnormalities. Gene-based drugs that restore proper copying, or other processes that keep a cell's DNA error-free, might one day limit a tumor's diversity, making it less likely to develop resistance, he suggests.

    A more radical approach is to forget about wiping out all the cells in a tumor and instead just keep it from growing. Robert Gatenby, a radiologist at the Moffitt Cancer Center in Tampa, Florida, argues that the drug-resistant cells in a tumor are kept in check by the cells that dominate the tumor mass, and it's best to let both types of cells coexist. Gatenby has reported that mice implanted with aggressive ovarian cancer tumors live at least twice as long, or 4 months, if they are given doses of chemotherapy so low that the drugs don't wipe out tumors but instead keep them stable in size. "We need to be respectful of evolution and its effects on therapy and try to use it instead of letting it defeat us," says Gatenby, who calls his strategy "adaptive therapy."

    When he submitted a research proposal to expand this work in 2009, he received "the worst reviews I ever got," Gatenby says. But last fall, he won a $2.1 million grant from NCI's Provocative Questions program, which encourages researchers to test out-of-the-box ideas. Gatenby hopes soon to launch a small clinical trial in which prostate cancer patients will receive smaller-than-normal doses of antiandrogen therapy, which the researchers will regularly adjust by monitoring tumors with imaging and using mathematical models to predict their growth.

    Polyak thinks such strategies for overcoming intratumor heterogeneity are worth pursuing. "There will be a lot of investment in this area and maybe trials, but I think it will take some time to figure out what really helps," Polyak says.