News this Week

Science  09 Oct 2009:
Vol. 326, Issue 5950, pp. 212
  1. Physiology Nobel

    U.S. Researchers Recognized for Work on Telomeres

    1. Gretchen Vogel and
    2. Elizabeth Pennisi

    An enzyme that keeps cells from premature death has won a bit of immortality for the scientists who discovered it.

    This year's Nobel Prize in physiology or medicine recognizes the discovery of a key mechanism that cells use to protect their genetic information. Elizabeth Blackburn of the University of California, San Francisco; Carol Greider of Johns Hopkins University School of Medicine in Baltimore, Maryland; and Jack Szostak of Harvard Medical School in Boston will each receive one-third of the $1.4 million prize for their work describing telomeres, the repetitive DNA that caps the ends of chromosomes, and telomerase, the enzyme that makes the caps. The three have long been considered top contenders for the prize. “I've been hoping for this for about 10 years. I'm thrilled,” says Titia de Lange of Rockefeller University in New York City, who studies telomeres.

    Interest in telomeres dates back to the 1930s and 1940s, when Nobel Laureates Barbara McClintock and Hermann J. Muller gathered indirect evidence that telomeres prevented chromosomes from attaching to each other. But for decades researchers failed to follow up because the specialized DNA was hard to isolate and study.

    Milestone.

    Blackburn (left), Greider, and Szostak share the prize—the first honoring two women—for work on telomeres (stained yellow in inset).

    CREDITS (LEFT TO RIGHT): UCLA; GERBIL/WIKIPEDIA; GERBIL/WIKIPEDIA; AFP PHOTO/NETHERLANDS' ROYAL SCIENCE ACADEMY; © THE NOBEL FOUNDATION

    In the mid-1970s, Blackburn and Joseph Gall, now at the Carnegie Institution for Science in Baltimore, came up with a solution. Blackburn had just finished her Ph.D. under the direction of Fred Sanger at the University of Cambridge in the United Kingdom; she wanted to apply new sequencing approaches to telomeres. With Gall, she decided to study an unusual single-cell organism called Tetrahymena, which during its life cycle shatters its chromosomes into tens of thousands of pieces, each with telomeres. With abundant material, the duo discovered that the caps consisted of six-base segments that repeated 20 to 70 times.

    Blackburn then teamed up with Szostak, who wanted to see if telomeres would work in yeast. Typically, when extra DNA is added to yeast cells, they break it down or reconfigure it. But when Tetrahymena telomeres capped the ends of the added DNA, it stayed intact. “It was really a shot in the dark,” Blackburn recalls, because one organism's DNA typically didn't work in another organism. Yet in 1982, Blackburn and Szostak showed that yeast retained and copied extra genetic material carrying Tetrahymena telomeres, suggesting this specialized DNA had an ancient function dating to before the ancestors of the two organisms went their separate ways.

    “It led to a better understanding of telomeres,” says Thomas Cech of the University of Colorado, Boulder. Other experiments showed that yeast could extend the ends of inserted Tetrahymena telomeres by adding new DNA to the tips. Blackburn and Szostak suspected that an unknown enzyme was responsible.

    In 1984, Greider, a Ph.D. student in Blackburn's lab, decided to see if Blackburn and Szostak's hunch was right. “It seemed like an exciting project that would potentially tell us something new about how telomeres would be maintained,” Greider recalls. Because the enzyme that copies DNA doesn't travel all the way to the end of chromosomes during replication, researchers had wondered for decades why chromosomes didn't shrink with each cell division. A telomere-lengthening enzyme might explain the puzzle. In 1985, after a series of difficult experiments, Greider and Blackburn demonstrated that six-base segments of DNA were being added to existing telomeres—evidence that such an enzyme did exist. It took almost another year to purify it.

    Telomerase proved quite complex. In 1987, Greider and Blackburn showed that it contained an RNA as well as a protein component, with the RNA serving as the template for adding the necessary bases to the end of the telomere. They pinned down the RNA by 1989, but the protein part took several more years to identify.
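
    That template logic is simple enough to sketch in code. Below is a minimal, purely illustrative Python model of repeat addition: the six-base DNA repeat matches the Tetrahymena sequence described above, but the function names and the rest of the setup are hypothetical simplifications, not a model of the real enzyme's biochemistry.

    ```python
    # Toy model of telomerase repeat addition; illustrative only.
    # TTGGGG is the six-base Tetrahymena telomere repeat; the RNA template
    # carries its complement, as Greider and Blackburn showed in 1987.
    PAIR = {"A": "T", "C": "G", "G": "C", "T": "A", "U": "A"}
    RNA_TEMPLATE = "CCCCAA"  # simplified; the real template RNA is longer

    def repeat_from_template(template: str) -> str:
        """Read the DNA repeat off the RNA template (reverse complement)."""
        return "".join(PAIR[base] for base in reversed(template))

    def extend_telomere(chromosome_end: str, rounds: int) -> str:
        """Add one six-base repeat per round of synthesis."""
        return chromosome_end + repeat_from_template(RNA_TEMPLATE) * rounds

    print(extend_telomere("...TTGGGG", rounds=3))
    # -> ...TTGGGGTTGGGGTTGGGGTTGGGG
    ```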

    During the 1980s, telomere research “was a very obscure area,” says Cech. These advances “were pretty exciting to the community, but it was a pretty small field.” There were hints of a link to aging or cancer. For example, Szostak, working with mutant yeast that lacked the ability to maintain telomeres, discovered that telomeres shrank with each cell division until eventually the cell stopped dividing. “But it was not possible to rigorously test” these ideas, Cech says, until the catalytic protein was in hand in the mid-1990s.

    At about that time, Szostak switched his focus to understanding the origin of life (Science, 9 January, p. 198). “Once it was pretty clear there would be biomedical applications and the questions would be taken care of, [I decided] it would be okay to go on to other things,” he said at a press conference in Boston on Monday.

    Telomerase is now thought of as a Dr. Jekyll–Mr. Hyde molecule: good in one context, bad in another. Rapidly growing cells need active telomerase, but once cells have become specialized and cease dividing, telomerase gets turned off—or is supposed to be. Many types of cancer cells have overactive telomerase that allows them to continue dividing when they shouldn't. “It's an almost universal oncology target,” says Jerry Shay of the University of Texas Southwestern Medical Center in Dallas. Clinical trials are now under way to test whether a vaccine against telomerase could help fight certain kinds of cancer.

    Not having enough telomerase is bad news, however. Deficient telomerase has been linked to a lung disease called pulmonary fibrosis and anemia caused by sporadic bone marrow failure. And a rare premature aging disease called dyskeratosis congenita is caused by a faulty telomere-maintenance system. No treatment is yet available, says Thomas Vulliamy of Queen Mary, University of London, who studies the disorder. “Sometimes we're a bit impatient. … I am sure that the understanding of telomeres will help us treat patients someday.”

    Other disease connections are being investigated. In 2004, Blackburn found a correlation between shorter telomeres and chronic stress. Other data show that shorter telomeres are associated with an increased risk or incidence of several common age-related diseases, such as heart disease. Telomeres “really do seem to reflect our status of health and our risk of disease,” says Blackburn.

    Shay is convinced that the prize will draw more people into telomere research and bring the clinical payoffs closer. But he says the field already has something to celebrate: Blackburn and Greider are the ninth and tenth women to be awarded the prize in physiology or medicine—and this is the first time two women share the award.

  2. Physics Nobel

    Digital Imaging, Communication Advances Honored

    1. Adrian Cho

    Within seconds after the announcement of this year's Nobel Prize in physics, photos and videos of the winners were zinging across the globe via the Internet. Fittingly, the winners' discoveries made that instantaneous dissemination of information and images possible. Half the award goes to Charles Kao, an electrical engineer whose theoretical work lit the way to practical optical fibers for high-speed telecommunications. The other half honors physicists Willard Boyle and George Smith for inventing the first electronic chip that could capture an image.

    Simply put, Kao figured out how to get light to travel far enough down a glass fiber to pass signals over great distances. Light tends to course down a microns-thick fiber instead of leaking out the side because it glances off the inside surface of the fiber and reflects back into the glass. The glass itself can absorb light, however, and until Kao's work in 1966, such absorption would soak up 99% of all the light passing through a 20-meter optical fiber.
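
    Put in the decibel units fiber engineers use, the problem is stark. The short Python sketch below works only from the 99%-per-20-meters figure just quoted; the 20 dB/km threshold for practical telecommunication and the modern-fiber figure in the comments are commonly cited numbers included here for comparison.

    ```python
    import math

    # Loss quoted above: 99% of the light absorbed over 20 meters of 1960s glass.
    transmitted_fraction = 0.01      # only 1% of the light survives
    length_m = 20.0

    loss_db = -10 * math.log10(transmitted_fraction)        # 20 dB over 20 m
    attenuation_db_per_km = loss_db / (length_m / 1000.0)   # 1000 dB/km
    print(f"{loss_db:.0f} dB over {length_m:.0f} m = {attenuation_db_per_km:.0f} dB/km")
    # Kao argued that losses below roughly 20 dB/km would make fiber
    # telecommunication practical; modern fiber reaches about 0.2 dB/km.
    ```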

    Light brigade.

    Kao (left) won laurels for fiber optics; Boyle and Smith, for charge-coupled devices.

    CREDITS (LEFT TO RIGHT): REUTERS/LANDOV; NATIONAL ACADEMY OF ENGINEERING; JEFF ZELEVANSKY/REUTERS; © THE NOBEL FOUNDATION

    Kao, who was born in Shanghai, China, and earned his Ph.D. in the United Kingdom, fingered iron impurities as the essential cause of the loss. Light can also bounce down a fiber in several different patterns or modes, and Kao, then at the Standard Telecommunication Laboratories in Harlow, U.K., correctly predicted that a very thin fiber that allowed the light to propagate in only one mode would be best for producing practical communications networks.

    Kao, 75, “was the right choice because he was the one who brought to the attention of the community that it was possible to reduce the losses,” says optical physicist Govind Agrawal of the University of Rochester in New York state. Still, Agrawal adds, “when I heard the news, my first thought was about the other people who could also have gotten it.” In particular, he says, Donald Keck and colleagues at Corning Inc. in Corning, New York, were the first to produce the fibers Kao described theoretically.

    Boyle, 85, and Smith, 79, invented the charge-coupled device (CCD), the first electronic chip capable of snapping a picture. It remains a key technology for medical imaging, astronomical telescopes, and digital cameras. Boyle and Smith “have enabled pretty much all of the modern imaging systems,” says George T. C. Chiu, an engineer specializing in imaging at Purdue University in West Lafayette, Indiana. “The CCD has changed everything about how we gather information and images.”

    The CCD contains a silicon chip that is divided into cells or “pixels.” When light hits a pixel, it excites an electric charge in the silicon, which then induces a charge in a tiny electrode on the chip's surface. The charge then quickly passes from electrode to electrode down a whole row of pixels—that's the “charge coupling”—and is read out at the edge of the chip.
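
    A few lines of code make that bucket-brigade readout concrete. This is a hedged sketch, not how any real CCD controller is written: each clock cycle, the packet at the edge is measured and every remaining packet shifts one pixel toward the edge.

    ```python
    # Minimal simulation of CCD readout along one row of pixels (illustrative).
    def read_out_row(charges):
        """Measure the edge pixel, then shift every packet one pixel over."""
        row = list(charges)
        readout = []
        for _ in range(len(row)):
            readout.append(row[0])      # charge at the chip's edge is read out
            row = row[1:] + [0]         # charge coupling: packets march one step
        return readout

    pixels = [5, 0, 12, 3]              # photo-generated charge (arbitrary units)
    print(read_out_row(pixels))         # [5, 0, 12, 3] -- image order preserved
    ```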

    Boyle, who was born in Canada and lives in Halifax, says he and Smith thought the device up in a couple of hours in 1969. “It was an October morning, and he came up to my office for some brainstorming. And pretty soon we had some sketches on the blackboard, and there it was.” Smith says, “We complemented each other. We just seemed to click.” The two worked at the famed Bell Labs in Murray Hill, New Jersey. Bell Labs can now claim a total of seven Nobel Prize-winning discoveries—a fact that Boyle says he finds more amazing than his own invention.

  3. Newsmaker Interview

    Francis Collins: Looking Beyond the Funding Deluge

    1. Jocelyn Kaiser

    Tutorial.

    President Obama hears from NIH Director Francis Collins (left) and cancer researcher Marston Linehan during a visit to NIH.

    CREDIT: © BROOKS KRAFT/CORBIS

    U.S. National Institutes of Health (NIH) Director Francis Collins wound up his sixth week on the job with a splash last week, hosting a visit from President Barack Obama. On 30 September, the final day of the 2009 fiscal year, the president chose NIH as the place to highlight the first year of spending from his 2-year, $787 billion economic stimulus package. The $5 billion disbursed by NIH to more than 12,000 grants is “the single largest boost to biomedical research in history,” Obama said.

    Because $4 billion more has been committed for the second year of the new grants, NIH has effectively spent almost 90% of its $10.4 billion in stimulus funds, Collins told Science later that day. That includes nearly $100 million for autism, $750 million for heart disease research, and $175 million for The Cancer Genome Atlas. The process hasn't been easy, however. NIH staff and outside scientists have scrambled to review huge numbers of grant applications, including the Challenge Grants, a special competition that generated over 20,000 proposals requiring 15,000 reviewers.

    Besides moving money out the door in Collins's first weeks, NIH submitted its fiscal 2011 budget request to the White House. Collins has also been busy helping to find candidates to direct the National Cancer Institute, a presidentially appointed position; overseeing a review of the National Children's Study, which is facing cost overruns; and assessing the Roadmap, the now nearly $500 million pot of money for cross-institute projects created by previous director Elias Zerhouni.

    The following are excerpts from a conversation with Science:

    Q: Can you say anything about how NIH's 2011 budget is looking so far?

    F.C.: There's going to be a lot of people making the case for why their particular part of the government is a uniquely wonderful investment. At the same time, we have an economy that continues to struggle, and we have a deficit that has now grown to something like $9 trillion. The scientific community should not in any way imagine that this is going to be easy.

    Q: So chances are there will be a plunge in success rates in 2011?

    F.C.: It's likely to be a pretty tough year. It's not only because one has to worry about what that NIH base can be, but a large number of Challenge Grants that didn't get funded are going to come back as R01s [NIH's basic research grants]. So the number of applications is expected to be quite high.

    Q: The Challenge Grants created a lot of work, and only 3% to 4% were funded. If you could go back and do it over again, would you rethink the Challenge Grant competition?

    F.C.: I wasn't here then, but I don't hear that from others. I would say this is a great problem to have. What it showed was this pent-up demand that had been building over 5 years of flat budgets.

    Q: Has Senator Charles Grassley (R–IA) uncovered a real problem with corruption in medicine?

    F.C.: I think the vast, vast majority of scientists are honorable people. At the same time, if there are examples of people who have intentionally hidden financial benefits, that's a black eye for all of biomedical research and we need to get our house in order. So certainly NIH in December will be putting forward for the community to look at some ideas about how to tighten up the system of reporting. And at least for myself, I would think the Grassley proposal [for a public database of payments reported by companies] would be a good step forward.

    Q: A recent report found that NIH is now awarding 19% of R01 grants to proposals that missed the “payline,” or quality cutoff, largely to help new investigators. Is 19% too high?

    F.C.: Isn't this an interesting discussion because certainly in the past the concern has been that NIH is not doing enough for new investigators. Now people are raising the issue: “Are we doing too much of this?” I don't think so. We're basically trying to give new investigators who already have a superb priority score but just missed the cut a chance to get started.

    Q: What do you think of President Obama's proposal to double cancer research over 8 years?

    F.C.: Obviously, the president has a particular interest in cancer because of experiences in his own family. Certainly, we would all agree that increasing the funding for cancer would be highly justified. At the same time, many people worry about a circumstance where one disease is picked out of the pile and made to sound like it is more important. And certainly from NIH's perspective, I think we would like to see the rising tide lift all the boats.

    Q: The National Children's Study could cost double or more than the original estimate of $3 billion. Should the full study go forward?

    F.C.: I'm certainly very much in favor of the science. We're going to take a very serious look at the study design, at what we've learned from the pilot projects. I think we have to figure out what we can afford to do.

  4. Virology

    Chronic Fatigue and Prostate Cancer: A Retroviral Connection?

    1. Sam Kean

    As if chronic fatigue syndrome (CFS) hasn't caused enough brawls, a new study published online by Science (www.sciencemag.org/cgi/content/abstract/1179052) links the disease to a possibly contagious rodent retrovirus, XMRV, which has also been implicated in an aggressive form of prostate cancer. Related work by the authors also suggests that CFS might best be treated with AIDS drugs. Even the lead author, Judy Mikovits of the Whittemore Peterson Institute for Neuro-Immune Disease in Reno, Nevada, says she understands why linking CFS to a retrovirus and to prostate cancer has already drawn skepticism.

    In 2006, an unrelated paper found an association between XMRV, which originated in mice, and a deadly prostate cancer exacerbated by a deficient enzyme. Mikovits and colleagues had seen the same deficiency in CFS cases. When they investigated further, the team discovered XMRV in the white blood cells of two-thirds of CFS patients but only 4% of control subjects. Intriguingly, Mikovits says XMRV does major damage in natural killer (NK) blood cells, which attack tumor cells and cells infected by viruses, and other studies suggest people with CFS suffer from high rates of cancer. Unpublished work, Mikovits adds, has found blood serum antibodies for XMRV in 95% of CFS patients.

    Controversial link.

    A study of chronic fatigue syndrome points to a retrovirus found in cancerous prostate cells (magnified in inset).

    CREDIT: ROBERT SCHLABERG AND HARSH THAKER

    All previous attempts to nail down a cause for CFS—including many links to viral infections—have foundered or been retracted, and many doctors remain doubtful that it's a coherent disease. Mikovits says her work “proves beyond a shadow of a doubt that CFS is a real disease.” But some of her peers find the report of a viral link premature.

    Joseph DeRisi, a molecular biologist at the University of California, San Francisco, who co-discovered XMRV, was not satisfied with details in the paper: He wanted to know more about the viral load in CFS patients and how the demographics of the control group matched those of CFS patients. And the Mikovits team didn't do enough to rule out contamination, he says. “One has to be very careful about making claims about such a sensitive and emotionally charged issue as CFS, where many claims have been made in the past.” At the least, a double-blind study in which a third-party lab searches for XMRV in CFS patients and in controls is vital, he says.

    Other CFS specialists, including Jonathan Kerr at St. George's University of London, are convinced that the Mikovits team discovered something important. “The fact that the virus was actually grown from the blood cells of CFS patients strongly suggests some sort of role in the pathogenesis of the disease.” But exactly what they discovered remains unclear, given that the group is not claiming to have identified a cause.

    John Coffin, a molecular biologist at Tufts University in Boston, analyzed the Mikovits paper in a separate “Perspective” also published online by Science (www.sciencemag.org/cgi/content/abstract/1181349). Coffin was highly skeptical of the paper at first, but the team found enough independent lines of evidence for XMRV to convert him. “They will be celebrating in the clinics where these people [with CFS] are being treated,” he now says.

    Even if the finding of a link to XMRV holds up, treatment suggestions are bound to attract controversy. No one knows how easily XMRV spreads, although Mikovits says transmission can occur via bodily fluids, including saliva. Mikovits also says unpublished preclinical data hint that scientists can treat XMRV with AIDS drugs such as AZT, although AZT itself might prove too toxic. Kerr remains cautious about this: “With present public knowledge—what is described in this paper—further work would be necessary before antiretroviral drugs could be recommended.”

  5. ScienceNOW.org

    From Science's Online Daily News Site

    2009 Ig Nobels Announced

    Although the Nobel Prizes are grabbing the spotlight this week, last week belonged to the Ig Nobels, which honor the more humorous side of science. This year's winners include the inventor of a bra that can serve as a pair of emergency gas masks and a team that showed that compostable garbage can be reduced to less than 10% of its mass by fermenting it with bacteria derived from the feces of panda bears.

    CREDIT: GEORGE DOYLE/STOCKBYTE

    Helping Crops Shed Pesticides

    Every year, 3 million tons of pesticides are sprayed on the world's food crops. The chemicals protect plants from voracious insects and pathogens, and they save billions of dollars in damages. Yet high doses of pesticides that accumulate in the body can cause cancer and other serious illnesses in humans. Now scientists may have found a way to help crops shed these toxicants long before they end up on dinner tables around the globe.

    Buried Treasure Solves Ancient Roman Puzzle

    As civil wars erupted throughout the Roman Republic in the 1st century B.C.E., country dwellers may have fled to cities. Before they left, some people buried their valuables to hide them from looting armies. Now social scientists have studied these ancient stashes, called coin hoards, to answer a long-standing Roman mystery.

    The Fungus That Ate the World

    Scientists claim that they have identified an ancient fungus that flourished about 250 million years ago, feeding on dead trees as it spread across the planet. Those remains could provide a crucial clue to the identity of what killed off much of Earth's plant and animal life at the time, although some researchers remain skeptical.

    Read the full postings, comments, and more on sciencenow.sciencemag.org.

  6. U.S. Research Policy

    Agricultural Science Gets More Money, New Faces

    1. Erik Stokstad

    After decades of flat funding, agricultural research seems to have caught the attention of U.S. policymakers. Last week, Congress gave a 30% boost to the main competitive grants program of the U.S. Department of Agriculture (USDA), raising it to $262 million for 2010. Two new research chiefs at the department also hope to parlay an administrative reorganization into greater visibility for the field. Research advocates are cautiously upbeat that their labors are finally paying off. “There's fresh energy and optimism,” says Thomas Van Arsdall of the National Coalition for Food and Agricultural Research in Fredericksburg, Virginia.

    Many are expecting a lot from Rajiv Shah, the young and energetic deputy undersecretary for research who joined USDA in June from the Gates Foundation. Speaking with Science last week, Shah described his plans to shake up the massive department, which employs 2300 scientists and has a research budget of $2.8 billion. Lobbyists are also thrilled with this week's arrival of plant scientist Roger Beachy as head of the National Institute of Food and Agriculture (NIFA), the new home for USDA's extramural funding.

    Seeding change.

    Rajiv Shah (left) and Roger Beachy hope to make USDA a bigger player in federal research.

    CREDITS: ALICE WELSH/USDA

    “Shah's a really smart guy. He's surrounding himself with smart people; he's got a big agenda and wants to do really big things,” says Ferd Hoefner of the National Sustainable Agriculture Coalition in Washington, D.C. “He's got the personality and credibility to try to put that all together.”

    Advocates have been trying for years to raise the profile—and funding—of agricultural research. They applauded last year's passage of an agriculture bill that will provide $426 million over 4 years in new competitive research funding for bioenergy, organic farming, and vegetables and other so-called specialty crops (Science, 23 May 2008, p. 998). The bill also gives Shah the title of chief scientist as part of a broader move to improve how USDA manages research.

    Shah, 36, was an unusual pick for the position. Not only is he far younger than previous undersecretaries, he's not a scientist. Trained as a physician and also holding a business degree, Shah worked on child immunization at the Gates Foundation before switching to agricultural development. Now he's applying those skills to a department whose research budget has remained essentially flat for decades. USDA is also seen by many as a bit player among federal science agencies, a status that was reinforced earlier this year when USDA received no research funds from the $787 billion American Recovery and Reinvestment Act while the National Institutes of Health (NIH), the National Science Foundation (NSF), and the Department of Energy's science programs each received billions. “I think that was the ultimate wake-up call,” Shah says.

    Shah plans to raise the department's visibility by focusing research on five broad areas that align with Administration priorities: climate change, bioenergy, food safety, obesity, and overseas hunger. He wants to focus on core problems—such as the development of drought-tolerant crops and perennial grasses for biofuels—and leverage USDA's investments by partnering with other agencies. “Frankly, we've done too many discrete projects that are too small in scope.” Similarly, Shah hopes to give out fewer but larger grants for work that fosters multidisciplinary collaborations. He plans to hold program managers accountable by asking them to set goals for two, five, and 10 years.

    Extramural research is also being reorganized. In the farm bill, Congress directed USDA to convert its Cooperative State Research, Education, and Extension Service—which distributes extramural grants to individual scientists and so-called formula funding to land-grant universities—into NIFA and to appoint a distinguished scientist to head it.

    Beachy, 65, qualifies by any measure. A member of the National Academy of Sciences, Beachy did important work on engineering virus resistance in plants and in 1998 became the founding president of the Donald Danforth Plant Science Center in St. Louis, Missouri. “My major goal is to improve the perception of the agency and gain the same level of respect as NSF and NIH,” Beachy says. Karl Glasener, who directs science policy for the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America in Madison, Wisconsin, says “Beachy's status as a star in the science community should help with image building.”

    One crucial measure of success, of course, will be the size of NIFA's budget. William Lesher, the director of Global Harvest Initiative in Washington, D.C., an agribusiness campaign to increase research on crop productivity, is optimistic that congressional appropriators will be receptive to requests from the Obama Administration to spend more. “If they propose larger budgets, it will really have a significant positive impact,” Lesher says.

    A coalition of research advocates, including Glasener and Hoefner, has been lobbying for a $300 million budget for competitive grants at USDA in fiscal year 2011. (The program is authorized at $700 million but received only $201 million in the 2009 fiscal year that ended last week.) Shah won't comment specifically on what the agency will request, a figure that is vetted by the White House before it's released in February as part of the president's overall budget submission to Congress. But he emphasizes that the Administration is serious about doing what it takes. “In order to get the breakthroughs we want, we have to invest at a certain level of scale and partner with others to do it well,” he says. “That's what is coming.”

  7. Climate Change

    Both of the World's Ice Sheets May Be Shrinking Faster and Faster

    1. Richard A. Kerr

    The two great ice sheets—Greenland's and Antarctica's—have had plenty of press lately, what with galloping glaciers and whole lakes of meltwater plunging into ice holes in minutes (Science, 18 April 2008, p. 301). Surveys of ice-sheet volume made from planes and satellites have quantified these losses, but those assessments have been spotty in time, space, or both. Shrinkage accelerated from the 1990s into the 2000s, but researchers couldn't be sure what would come next.

    Now the latest analysis of the most comprehensive, essentially continuous monitoring of the ice sheets shows that the losses have not eased in the past few years. More ominously, losses from both Greenland and Antarctica appear to have accelerated during the past 7 years. If the acceleration out of the 1990s “was a hiccup, it was a big one, and it's getting bigger,” says glaciologist Richard Alley of Pennsylvania State University, University Park, who was not involved in the work.

    The results, in press at Geophysical Research Letters, are based on measurements by the Gravity Recovery and Climate Experiment (GRACE) satellite mission. Rather than measuring the volume of ice sheets every few years as most earlier surveys did, GRACE “weighs” them from month to month with a pair of spacecraft launched in March 2002 as a joint NASA and German Aerospace Center mission (Science, 14 August, p. 798). Flying in tandem 220 kilometers apart, the satellites can measure subtle variations in the pull of gravity as they pass over a large mass on the surface. By beaming microwaves from one to the other, they precisely gauge the changing distance between them as the added mass tugs first on the leading satellite and then on the trailing one. Changes in gravity from pass to pass reflect changes in the icy mass below.

    The mass changes of Greenland and Antarctica during the past 7 years have all been negative, geophysicist Isabella Velicogna of NASA's Jet Propulsion Laboratory in Pasadena, California, concludes in the study. On Greenland, she calculates, the rate of ice mass loss doubled over the 7-year period, an acceleration in the loss rate of −30 cubic kilometers of water per year, each year. On Antarctica, the loss rate more than doubled to produce a similar acceleration. Together, that would make for a 5% acceleration each year in the rise of sea level. Compounded year after year, “that is a big thing,” says Velicogna. “We should be more concerned.”
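
    The rate and acceleration in such an analysis come from fitting a trend plus a quadratic term to the monthly mass series. The Python sketch below does that on synthetic data; the −30 km³/yr² acceleration is the Greenland figure quoted above, while the starting rate and noise level are assumptions chosen only for illustration.

    ```python
    import numpy as np

    # Synthetic 7-year monthly mass series for a Greenland-like ice sheet.
    rng = np.random.default_rng(0)
    t = np.arange(0, 7, 1 / 12)              # time in years, monthly solutions
    rate0, accel = -140.0, -30.0             # km^3/yr (assumed), km^3/yr^2 (quoted)
    mass = rate0 * t + 0.5 * accel * t**2 + rng.normal(0, 30, t.size)

    # A quadratic fit recovers the loss rate and its acceleration from the noise.
    c2, c1, c0 = np.polyfit(t, mass, 2)
    print(f"fitted rate = {c1:.0f} km^3/yr, acceleration = {2 * c2:.0f} km^3/yr^2")
    ```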

    Bending down.

    The trend line of Greenland ice mass (green) curves downward with time, suggesting that losses have been accelerating.

    CREDIT: ADAPTED FROM ISABELLA VELICOGNA, GEOPHYSICAL RESEARCH LETTERS

    The GRACE observations counter encouraging news from southeastern Greenland that the surging glaciers there had slowed (Science, 23 January, p. 458). The slowing was probably real, says Alley, but apparently the increasing losses from melting and accelerating glaciers elsewhere around Greenland at least made up for the slowing in the southeast. Whatever is driving ice loss—warmer oceans, warmer air, or both—is persisting, he says.

    Glaciologist Waleed Abdalati of the University of Colorado, Boulder, says Velicogna's analysis “suggests—that's the key word—that there's been an acceleration in the period examined. We have to be careful to not overinterpret and speculate about the future.” The record is too short to be extrapolated into the future, Abdalati says. And at least in Greenland, it was affected by the extreme warmth and resulting melting in 2007, a loss surge that might not be repeated in the next 7 years.

    If bursts of ice loss do occur soon, GRACE may not be around to record them. Its two satellites will fall from orbit around 2013, dragged down after a decade of orbiting through Earth's outermost atmosphere. No replacement gravity mission is yet planned.

  8. ScienceInsider

    From the Science Policy Blog

    Scientists complain that the U.S. Army's claims of success with an AIDS vaccine tested in Thailand are undermined by an unrevealed second analysis. That result found a drop in vaccine efficacy and no statistical significance when it compared vaccinated and control groups that rigorously followed the protocol.

    A number of scientists are outraged over a new U.K. Border Agency program to use DNA and tissue samples to determine the nationality of asylum applicants. After ScienceInsider revealed scientific condemnation of its plans to conduct DNA tests and isotope analyses to determine nationality (Science, 2 October, p. 30), the agency this week changed its plans, saying such evidence would not yet be used in asylum decisions.

    Energy Secretary Steven Chu tried to persuade Congress to fund eight Energy Innovation Hubs next year, but in a spending bill passed last week, Congress supported only three.

    A controversy at the Proceedings of the National Academy of Sciences over the journal's standards for peer review took a new turn when the journal decided to postpone the publication of a controversial paper on butterflies that had already been accepted and published online.

    A new study conducted by the National Football League suggested that playing professional American football increases the risk of dementia, adding to the pressure on the sport to study the problem.

    Congress has given the U.S. Department of Energy's Office of Science a 2.7% boost in its 2010 budget, to $4.9 billion. The $131 million increase comes as the overall budget for the $27 billion agency was held relatively flat.

    For more science policy news, visit blogs.sciencemag.org/scienceinsider.

  9. Cancer Research

    Looking for a Target On Every Tumor

    1. Jocelyn Kaiser

    Major cancer centers say they're getting ready to genotype every patient's tumor, hoping to match them with drugs specifically tailored to halt tumor growth.

    Lethal threat.

    Even some advanced lung tumors have been slowed by focused drugs.

    CREDIT: ANNE WESTON/WELLCOME IMAGES

    In August 2007, Kevin Brumett, an athletic 29-year-old Massachusetts man, was stunned to find out why he was having severe back and stomach pain: He had advanced lung cancer. Brumett was young and had never smoked. Doctors at Beth Israel Deaconess Medical Center in Boston gave him standard chemotherapy, but the drugs became toxic over time, so they tried something new. They tested biopsies from his lung tumor for several mutations known to drive cell growth. “There was a small hope that if we found [the right sequence], there would be an approved drug or a clinical trial he would qualify for,” says Daniel Costa, his oncologist at Beth Israel.

    Brumett was lucky. His tumor cells had a fusion of two genes, EML4 and ALK, recently found in a small fraction of patients with his type of cancer, non-small cell lung cancer, the most common form. A trial to test a drug targeted at EML4-ALK was just beginning at Beth Israel. A week after he began taking two pills daily to block the fusion gene's protein product, Brumett felt better. His nausea and fatigue were minimal, and the knifelike pain in his chest stopped.

    Last summer on a patient message board, Brumett shared the good news that his tumor was shrinking. His advice: “I want every single patient who has been diagnosed with non-small cell lung cancer to tell your doctor that you want to have your tumors biopsied and tested for every type of genetic mutation they know of.” That fall, Brumett told his story on local television and at events held by patient advocates.

    Tumor genotyping is not part of standard treatment. But that is changing. A handful of major U.S. cancer centers are laying plans to analyze the tumors of every lung cancer patient who comes in the door and check for an array of mutations. The aim is to match patients with a drug that goes after the tumor's genetic weak spot. Two centers, Harvard's Massachusetts General Hospital (MGH) in Boston and Memorial Sloan-Kettering Cancer Center (MSKCC) in New York City, have already begun; they will add several other cancer types in the coming months. Proponents hope that comprehensive tumor genetic testing—aided by a major U.S. research effort to catalog mutations in tumors, called The Cancer Genome Atlas—will soon become a standard part of a patient's medical record. “The data are most useful if they're not squirreled away in a research lab,” says medical oncologist Leif Ellisen of MGH.

    But there are challenges. Even big cancer centers are still building the infrastructure needed to comprehensively genotype tumors. And testing is likely to help only a small fraction of cancer patients. Many tumors don't have any of the well-studied mutations. Only about 15% to 20% of all lung cancer patients have mutations that can be matched to targeted anticancer drugs, for example.

    The grimmest reality, however, is that even when a match is found, the new targeted drugs have mainly slowed tumors, not stopped them in advanced lung cancer. After Brumett spent 8 months on the experimental drug, during which time he felt almost normal, his cancer developed resistance and began to spread. He died last May, 4 days after his wedding.

    It's still early days for this technology, says Bert Vogelstein, a cancer biologist at Johns Hopkins University in Baltimore, Maryland. “I don't think there's any argument that, from a research perspective, these kinds of initiatives are great.” But, he asks: “Is this something that's going to reduce suffering and death and prolong lives? I think it will, but it's still new.”

    Sharpshooters

    Many agree with Vogelstein that tumor genotyping makes perfect sense as a tool for research. It's creating biobanks that make it easier to identify cancers that might be vulnerable to a new drug and will help attract companies to pay for trials aimed at the subset of patients most likely to benefit. That should speed treatments to the clinic, proponents say. “We simply cannot do another [large clinical] trial without having this kind of information,” says John Niederhuber, director of the National Cancer Institute (NCI) in Bethesda, Maryland, a booster of the approach. Personalized treatment is exactly what NCI hopes will come out of The Cancer Genome Atlas, a huge sequencing initiative to find mutations in tumors. It is ramping up from a pilot phase and expects to tackle more than 20 cancer types in the next 5 years, at a cost of $275 million for the first 2 years.

    The idea of giving a patient a drug tailored to the genetic makeup of his or her tumor goes back at least a decade. The most famous example is the leukemia drug Gleevec, which targets a genetic mistake, known as the Philadelphia chromosome, that leads to uncontrolled cell growth. Over 95% of chronic myeloid leukemia patients carry this glitch, and most early-stage patients respond to the drug. Another example is the breast cancer drug Herceptin, a monoclonal antibody that blocks HER2, a protein on a cancer cell's surface that receives signals triggering the cell to grow. The drug is given only to the 25% of breast cancer patients whose tumors overexpress HER2; overexpression is determined by counting copies of the HER2 gene or measuring protein levels.

    The developers of Herceptin suspected from the start that only certain patients would benefit. But for other targeted drugs, this became clear only after the drug was tested in a general population. Take Erbitux, a drug used since 2004 to treat colorectal cancer. It homes in on EGFR, a cell receptor in the same family as HER2. Retrospective studies showed that the tumor cells of patients with a mutated oncogene called KRAS can thwart EGFR-targeted drugs. The drug is now recommended only for the roughly 60% of patients whose tumors have a normal version of the KRAS oncogene.

    For lung cancer, a turning point came in 2004. Researchers at Harvard's Dana-Farber Cancer Institute and MGH, and at MSKCC, were exploring why Iressa, another drug that targets EGFR, didn't seem to help most patients but dramatically shrank lung tumors in about 10% of patients. They realized that these “responders,” more often Asian and nonsmokers, had certain mutations in EGFR that made the tumors more vulnerable (Science, 30 April 2004, p. 658).

    The initial poor showing led the U.S. Food and Drug Administration to sideline Iressa in the United States. But a new trial in Asia published this year clearly showed that for patients with the EGFR mutations, the rate of disease progression was slower in those who were on Iressa than it was in those on chemotherapy. The drug has now been approved in Europe for targeted use in patients with the EGFR mutations that make the cells vulnerable, and the manufacturer, AstraZeneca, is seeking approval for similar use in the United States. The Asia trial convinced even the skeptics that EGFR mutations should guide treatment decisions, says cancer biologist William Pao, who left MSKCC for the Vanderbilt-Ingram Cancer Center in Nashville this year.

    Good match.

    A new drug aimed at a mutation in his lung tumor made life almost normal for several months for Kevin Brumett (shown last December with fiancée Stephanie Fellingham).

    CREDIT: STEPHANIE BRUMETT

    The list of mutations that affect lung cancer patients' response to drugs, meanwhile, has continued to grow. Some EGFR mutations that crop up after treatment with EGFR-targeted drugs confer resistance to these agents. Patients with mutated KRAS don't respond, the same pattern seen in colon cancer. In addition, EML4-ALK and rarer mutations in genes such as BRAF, MEK-1, and ERBB2 make some patients' tumors susceptible to other kinds of targeted drugs (see graph, p. 220).

    More targets

    As the picture of gene-drug interactions fills in, researchers at some cancer centers have decided to make genetic profiling of lung tumors part of routine care. In January, MSKCC began to test all patients with lung adenocarcinomas, the main type of non-small cell lung cancer, for the EML4-ALK fusion gene and for 40 mutations in seven other genes. MSKCC is also testing colorectal cancers and will add melanoma and thyroid cancers next year. The data will go into a research database so that even if no drug is available for that patient now, oncologists can find patients with specific mutations for clinical trials months from now. This is the only way to find enough patients to test drugs targeting rare mutations, says molecular pathologist Marc Ladanyi of MSKCC. “Otherwise, it would take years,” he says.

    MGH began testing nearly all lung adenocarcinoma patients this spring for 113 mutations in 13 genes known to be involved in cancer. It aims to test most cancer types, including gastrointestinal and breast cancers, by the end of next year. Other cancer centers are jumping on board, including Dana-Farber, Duke Comprehensive, Vanderbilt-Ingram, MD Anderson, and Fox Chase in Philadelphia. In the United Kingdom, the Royal Marsden Hospital has begun routinely testing some cancer patients to find subjects for early (phase I) clinical trials, says Marsden's Johann de Bono.

    Genotyping is a good way to categorize tumors for clinical decisions—more robust than using tumor gene expression data to fingerprint a tumor, Vogelstein says. He and others argue that gene-expression analysis hasn't been as strongly validated. “Because DNA mutations are easy to assess and have a proven track record of response prediction, it makes good sense to start there,” agrees Charles Sawyers of MSKCC, who helped develop Gleevec.

    Researchers acknowledge that a broad screening effort to genotype tumors won't help most patients just yet. That's because relatively few targeted drugs have been developed, and many of the mutations isolated so far tend to rule out drugs, not rule them in. Take adenocarcinomas: A sizable portion of these tumors—at least 33%—don't have any of the known driver mutations that might be susceptible to attack, according to a literature review by Ladanyi and others. That observation is also emerging from large-scale tumor-sequencing projects such as The Cancer Genome Atlas. Still, it's worth developing new tools if they work against these tumors, Pao says. For a major threat like lung cancer, which kills about 160,000 Americans a year, even if a drug helps only 1% or 2% of patients, that's a lot of people.

    Gearing up

    Cancer centers that hope to expand tumor genotyping still have a lot of kinks to work out. Logistics are the first challenge. “It sounds easy, but it requires top-down” organization, Pao says. Many pathology departments don't have a well-developed molecular diagnostics lab—so cancer centers are building them. Tissue banks are essential; centers that invested earlier in well-organized biobanks are at an advantage.

    Personal touch.

    Memorial Sloan-Kettering's Marc Ladanyi and others want to use tumor genetic profiling to match lung cancer patients with the best drug.

    CREDIT: ROBERT A. LISAK

    Because the genetic changes of interest include DNA rearrangements and deletions as well as mutations, these labs can't use just a single technology to detect them. And they must be able to produce results quickly after a sample is taken, within 2 weeks, so that oncologists can use them in time to make treatment decisions. Some cancer centers use stand-alone clinical assays to test for a few genes; others are buying genotyping systems to detect larger numbers of mutations, then adding tests to find other genetic changes. MGH's assay is “really labor intensive,” says Ellisen: Technicians extract DNA from the tumor, amplify it, test for 60 loci, then perform further tests. On the plus side, once the technology is set up, it will be easy to incorporate new mutations being turned up by the cancer genome projects.

    It's not yet clear how much a typical tumor test will cost or how much will be covered by insurance. Testing is often done today as part of a research project. MGH puts the actual cost of a typical tumor genetic analysis at about $2000. (That may sound like a lot, but it's no more than a magnetic resonance imaging scan, Ellisen points out.) Ladanyi says MSKCC's assay will likely cost $500 to $1000. Insurers might be willing to cover that if it's done by a certified lab and costs no more than what they now pay—usually under $1000—for standard clinical tests for KRAS and EGFR, he notes.

    Looming over the field is the question of how owners of key intellectual property—such as Genzyme, Harvard, and MSKCC, which hold or have licensed patents for EGFR testing for lung cancer—will handle the patent issues. Academic cancer centers could run into legal trouble if they test for these mutations on their own instead of using commercial labs that offer the tests. And the drugs are expensive. A year's supply of Herceptin—the standard treatment—costs about $60,000.

    Points of attack.

    Several mutations found in lung adenocarcinomas, a common type of lung cancer, make them vulnerable to molecularly targeted drugs.

    CREDIT: (SOURCE): MARC LADANYI AND WILLIAM PAO

    Pao and others organized a “nuts and bolts” meeting of five cancer centers last November to discuss the logistics, technologies, and data-management issues of genotyping lung tumors. One objective is to pool patients for joint trials of drugs and test them in combinations, the most promising strategy for avoiding drug resistance. Finding enough patients to get statistically significant results for a rare cancer subset “will require a type of collaboration never seen before,” says Roy Herbst of MD Anderson in Houston, Texas, who heads a prototype for this kind of lung cancer trial. In a boost for such efforts, the National Institutes of Health last month awarded $5 million in Recovery Act money to a consortium of 13 institutions that will test out tumor genotyping techniques on 1000 lung cancer patients.

    Even with a system up and running smoothly, another hurdle is getting busy doctors to take an interest in the tests, says Joseph Nevins of Duke University in Durham, North Carolina, who is working on using gene-expression signatures to guide chemotherapy. They will have to go to the trouble of collecting tissue samples and getting patients' written agreement to have their genetic data used for research. “Maybe the samples will sit in somebody's freezer for 5 years and nothing will be done with them.”

    Indeed, one tough issue raised by tumor genotyping is how to weigh the merits of a procedure that is expensive, likely to become more complex, and rarely provides a cure—but will be intensely sought after by patients and their families. Even when the new gene-targeted drugs work, they may offer just what they gave Kevin Brumett: a few good months. In January, after a period of relatively good health—“We took vacations, we went to weddings,” says his widow, Stephanie—Brumett went to the emergency room with a headache and learned that his cancer had metastasized to his brain. Although his doctors didn't seem that surprised, “for us, it came out of the blue,” says Stephanie.

    She expects to carry on her husband's efforts with advocacy groups to encourage patients to be tested for genetic mutations and volunteer for research. “He felt so strongly about it. It was a miracle drug, and he helped a lot of other patients get on the trial.” And with more work, researchers hope to find ways to get around drug resistance. If that happens, knowing the genetic fingerprint of a patient's lung tumor will not only make life with cancer more bearable for a while but also turn this killer into a manageable disease.

  10. National Labs

    For a Famous Physics Laboratory, A Quick and Painful Rebirth

    1. Adrian Cho

    Renowned for particle physics experiments, SLAC National Accelerator Laboratory has reinvented itself as a multipurpose lab featuring the world's first x-ray laser.

    CREDIT: SLAC

    MENLO PARK, CALIFORNIA—When the pipe exploded, it packed the wallop of two sticks of dynamite. The 13 September 2007 mishap could have maimed or killed technicians here at SLAC National Accelerator Laboratory. It certainly jolted Persis Drell, the particle physicist who had become acting director of the famed lab just 4 days earlier.

    The blast resulted from the kind of mistake anyone might make: Welders cutting into a steel water pipe ignited fumes from epoxy used to attach a plastic extension the day before. But Drell was already in a tricky position. Renowned as a center for particle physics, SLAC had begun a wrenching shift to a broader mission emphasizing x-ray studies for fields such as materials science and structural biology. The lab was also under fire from its owner, the U.S. Department of Energy (DOE), for management deficiencies and safety lapses, including a nearly fatal electrical accident in 2004. The blast was another boulder in a rockslide falling on the lab. Yet, Drell says, “I joke, the explosion was the best thing that ever happened to us.”

    The blast proved that things had to change, Drell says. She feared that if the lab had another accident like the one in 2004, DOE might shut it down for good—a fear that Paul Golan, manager of DOE's site office at the lab, calls “not unreasonable.” So Drell and Golan began meeting every morning to plan how to head off incidents like the explosion. Although only the acting director, Drell also began rearranging management at the lab, which is operated by Stanford University in neighboring Palo Alto, California. In December 2007 she became director, tasked with guiding SLAC through its scientific and managerial metamorphosis. “It's hard for me to imagine that you could make a transition more quickly,” she says. “It's been a bit brutal.”

    Like a butterfly cracking its chrysalis, SLAC has shed its former self. On 7 April 2008, physicists turned off the lab's last particle smasher. On 10 April 2009, they turned on its new flagship facility, the world's first x-ray laser. Dubbed the Linac Coherent Light Source (LCLS), the laser shines a billion times brighter than any previous x-ray source. First experiments with it began last week, and with the shift in science come equally dramatic changes in the management and culture of the storied lab.

    Forward!

    Director Persis Drell has pushed SLAC through its speedy transformation.

    CREDIT: BRADLEY R. PLUMMER/SLAC

    SLAC officials are rearranging everything, including the furniture—particle physicists' offices in the lab's main building are being converted into labs to support the LCLS. Even some of the lab's most eminent particle physicists say the changes are necessary. “Labs like SLAC are institutions for the long term, and they have to evolve,” says Burton Richter, a former SLAC director and one of the lab's three Nobel laureates.

    But other particle physicists fault Drell for not fighting to keep their last collider, called PEP-II, running or pushing for another project to replace it. They say that whereas past directors fought for the lab's autonomy, Drell has embraced DOE's micromanagement. “It seems to me that we're largely being run out of Washington,” says SLAC veteran Martin Breidenbach. “There's little evidence of pushing back.”

    One sharp corner

    DOE's nine other science labs have faced changes in their missions as well, particularly as particle physics facilities have shut down. In 1979, DOE closed down a proton accelerator known as the Zero Gradient Synchrotron at Argonne National Laboratory in Illinois after a higher energy machine turned on 40 kilometers away at Fermi National Accelerator Laboratory (Fermilab) in Batavia. Not long after, Argonne disbanded its 400-member accelerator physics division, recalls Ronald Martin, who was then division director.

    The lab went without a major accelerator until 1988, when it won DOE approval for an x-ray source based on a circular accelerator called a synchrotron, in which the particles radiate as they circulate. The $812 million Advanced Photon Source revved up in 1995 and is Argonne's largest facility. “Without that, it would have been a terrible disaster” for the lab, Martin says.

    Similarly, in 1983, DOE canceled the half-built ISABELLE proton collider at Brookhaven National Laboratory in Upton, New York, to shift resources to an even bigger atom smasher, the Superconducting Super Collider (which, ironically, was canceled during construction 10 years later). “We had 300 or 400 people working on ISABELLE, and we had to let most of them go,” recalls Nicholas Samios, a particle physicist who was Brookhaven's director at the time. Brookhaven waited until 1990 to get the go-ahead for a different collider to smash heavy atomic nuclei for studies in nuclear physics, the $616 million Relativistic Heavy Ion Collider, which has been running since 2000.

    In contrast, SLAC has not had to languish without a major facility. LCLS construction started in 2006, 2 years before SLAC's last particle collider shut down. However, SLAC is also facing a more dramatic transformation. Both Argonne and Brookhaven were already multipurpose labs at which particle physics accounted for only a fraction of the research. In contrast, since its founding 47 years ago, SLAC has focused almost exclusively on particle physics, making its shift far more drastic. “If you bet all your money on one thing and all of a sudden that one thing doesn't have a future, then you have a problem,” Samios says.

    Fortunately for SLAC physicists, the key to their future in x-ray physics already lay before them: the 3-kilometer-long linear particle accelerator, or linac, that has been the backbone of the particle physics program since the lab's inception.

    CREDIT: DIANA ROGERS/SLAC

    The light at the end of the tunnel

    Lining one side of a broad tunnel and gleaming beneath the overhead lights, the LCLS looks powerful. On their waist-high mounts, 33 magnets called undulators lie connected end to end, each a cylinder 3.4 meters long and 30 centimeters in diameter. A beam of high-energy electrons from the linac enters one end of the chain; a laser beam of x-rays emerges from the other. The LCLS is an x-ray source unlike any before, scientists say. It's also a nifty trick done without mirrors.

    A conventional laser consists of two mirrors, one of which lets some light through. Between them sits a material that emits photons, such as a crystal that fluoresces when zapped with electricity. The photons bounce between the mirrors, and as they pass through the material, they stimulate the atoms in it to emit more light. That feedback creates a torrent of photons moving in synchrony that emerges through the imperfect mirror as the laser beam.

    Mirrors for x-rays don't exist, however, so the LCLS works in a different way. Each undulator consists of two rows of 224 small magnets, one above and one below the pipe that carries the electrons. Neighboring magnets point in opposite directions, and their fields cause electrons passing down the pipe to wiggle from side to side and radiate x-rays. The x-rays also whiz down the beam pipe, and they push the electrons into tiny bunches. “Now each bunch radiates as if it were a single electron with an enormous electric charge,” which greatly amplifies the x-rays, says John Galayda, the SLAC accelerator physicist who directed LCLS construction.
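    The article's own numbers are enough to check the physics. The on-axis wavelength of undulator radiation follows the standard resonance formula, lambda = (lambda_u / 2 gamma^2)(1 + K^2/2), where lambda_u is the undulator period, gamma the electron's relativistic factor, and K the deflection parameter. The minimal Python sketch below plugs in the period implied by the magnet counts above; the 13.6 GeV beam energy and K = 3.5 are assumed nominal values, not figures from the article.

    ```python
    # Sketch of the undulator resonance condition that sets the LCLS
    # x-ray wavelength. Beam energy (13.6 GeV) and deflection parameter
    # (K = 3.5) are assumed nominal values for illustration.

    ELECTRON_REST_ENERGY_GEV = 0.000511  # electron rest-mass energy

    def undulator_wavelength(period_m, beam_energy_gev, K):
        """On-axis wavelength: lambda = (period / 2 gamma^2) * (1 + K^2 / 2)."""
        gamma = beam_energy_gev / ELECTRON_REST_ENERGY_GEV  # relativistic factor
        return period_m / (2 * gamma**2) * (1 + K**2 / 2)

    # 224 magnets per row, alternating in polarity, form 112 magnetic
    # periods along each 3.4-meter undulator, a period of about 3 cm.
    period = 3.4 / 112

    wavelength = undulator_wavelength(period, beam_energy_gev=13.6, K=3.5)
    print(f"{wavelength * 1e9:.2f} nm")  # ~0.15 nm
    ```

    With those assumptions, the formula reproduces the 0.15-nanometer wavelength quoted below; the huge gamma factor of the multi-GeV beam is what squeezes a centimeter-scale magnetic wiggle down to an atomic-scale light wave.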

    The 90-micrometer-wide electron beam must overlap with the nearly-as-narrow x-ray beam through the entire 130-meter row of undulators, a challenge that led some researchers to question whether the LCLS would ever work. On 10 April, Galayda's team put all doubts to rest, producing the world's first x-ray laser beam after less than 2 hours of trying. “It was a really, really nice experience,” Galayda says with a wide smile.

    SLAC isn't a newcomer to x-ray science. Since 1973, the lab has run a relatively small synchrotron x-ray source, one of the first. But the LCLS may put the lab on top of the heap. It could enable researchers to make “movies” of chemical reactions in progress, determine the structure of proteins from a sample of one molecule, or probe matter under conditions found at the centers of planets.

    The LCLS cranks out x-rays with a wavelength of 0.15 nanometers—short enough to probe the atomic-scale structure of materials—in pulses lasting only 2 millionths of a nanosecond, or 2 femtoseconds. Such short pulses should allow experimenters to snap the x-ray equivalent of stop-action photos of chemical reactions as they occur, says Joachim Stöhr, an x-ray physicist and associate director for the LCLS at SLAC. “That's one big aspect of the LCLS,” he says. “It couples the atomic scale with the ultrafast.”

    The LCLS has already attracted far more proposals for experiments than it can accommodate and is drawing top researchers. “I came here for the LCLS,” says Philip Bucksbaum, an atomic physicist at SLAC and expert in ultrafast lasers who arrived in 2006 from the University of Michigan, Ann Arbor. “It's a chance in about 10 lifetimes.”

    The outsider from within

    The LCLS wasn't always so attractive to scientists. The idea was dreamed up in 1992 by Claudio Pellegrini of the University of California, Los Angeles. He and SLAC's Herman Winick proposed a $100 million demonstration project that would work at slightly longer wavelengths. “All through the 1990s, it was just a curiosity,” Winick says. “DOE didn't really want to hear about it because they were busy with the [synchrotron] sources.”

    Even as plans evolved for shorter wavelengths, x-ray scientists remained wary. “Many of us weren't quite sure what we would do with such a machine,” Stöhr says. Various independent reviews convinced DOE that the LCLS was worth pursuing—although it insisted on a larger, potentially expandable facility, raising the cost to $420 million.

    Drell's predecessor, Jonathan Dorfan, targeted the LCLS as the lab's next major facility, but some SLAC scientists were reluctant to get involved in the project. From 1999 until 2008, most lab researchers focused on running an electron-positron collider called PEP-II and the accompanying BaBar particle detector, which measured particles called B mesons to probe the asymmetry between matter and antimatter. Some people had to be ordered to work on the new project, says Tor Raubenheimer, director of accelerator research at the lab.

    At the same time, DOE wanted SLAC to improve its outmoded management structure and practices. Drell made it her mission to push the hesitant lab down the path ahead of it. SLAC had traditionally promoted from within, choosing managers from among its scientists. Drell hired outside experts in management for positions such as chief operating officer and director of environment, safety, and health. SLAC needed the perspective of newcomers with more professional management experience, she says.

    In a flash.

    SLAC's John Galayda directed construction of the world's first x-ray laser. The machine came on in less than 2 hours and immediately met or exceeded its design goals.

    CREDIT: BRADLEY R. PLUMMER/SLAC

    Drell herself hardly fits that description, however. The daughter of Sidney Drell, a prominent particle theorist at SLAC, she had been a professor at Cornell University when she joined SLAC in 2002 as director of research. It was her first foray into management at a national lab. “Granted, she didn't have all the experience, but she had all the right instincts,” says William Madia, Stanford's vice president for SLAC, who has been director of two national labs himself.

    Drell reorganized the lab to try to break down barriers between the particle physics and x-ray science sides of the lab, for example, by pulling all accelerator work into its own division. She instituted protocols that in 2 years reduced SLAC's accident rate, which had been twice the industrial average, by 67%. (She also formed a committee to rename the lab when DOE and Stanford got into a spat over rights to the old name, Stanford Linear Accelerator Center.)

    Drell says she's committed to doing what's best for the scientific community as a whole, even if it means taking a hit at home. In December 2007, last-minute cuts to the federal budget forced the lab to lay off 100 of its then-1600 staff. Drell simply made the cuts. In contrast, officials at Fermilab delayed similar layoffs until the U.S. Congress came to the rescue with extra money. The budget crisis also forced DOE to scuttle the PEP-II collider 6 months early to keep Fermilab's larger Tevatron Collider running, another decision Drell did not fight. “DOE's Office of High Energy Physics had to make some tough decisions, and I supported those decisions,” Drell says.

    That rankled many physicists, who thought Drell should have fought to push back PEP-II's scheduled shutdown by a couple of years. “I thought Persis should say to the DOE, ‘Straighten out our particle physics program, or I'm going to resign,’” says Martin Perl, another of SLAC's Nobelists.

    Now that SLAC has started to come through its transition, however, some critics have begun to warm to Drell's vision and style. Perl says he now realizes that lab directors no longer have the kind of power they used to and that what Drell did was best for SLAC. “A few months ago I apologized to her,” he says. “I was wrong.”

    Dreams deferred

    SLAC has not given up on particle physics, although it has expanded its conception of the field to include astrophysics. For example, SLAC researchers helped build the main instrument for the Fermi Gamma-ray Space Telescope, launched in June 2008, and run the operations center that processes its data. Some SLAC physicists are joining in experiments at Europe's Large Hadron Collider (LHC) near Geneva, Switzerland, which should start smashing protons this winter. Some are still analyzing data from BaBar.

    That's a far cry from physicists' ambitions of just a few years ago. Throughout the 1990s, SLAC researchers developed plans for a linear collider 30 kilometers long and based on the same basic technology as its linac. Dorfan says they hoped that, by about 2010, SLAC would lead a global effort to build the machine, which would complement the LHC. In 2004, however, the particle-physics community opted for a different technology being developed primarily in Germany. In 2007, DOE lost enthusiasm for the whole project when physicists estimated the cost at well over $7 billion.

    Some physicists say that SLAC could do more to maintain an accelerator-based particle physics program. The PEP-II collider could be upgraded to boost its collision rate, an option SLAC shelved to push for the linear collider, says lab veteran David Leith. In fact, researchers in Italy would like to rebuild the machine at the University of Rome Tor Vergata. “I'm working very hard to have this as part of the lab's program,” Leith says, “but it doesn't seem to fit into the new direction.” Without such a project for researchers to work on, SLAC's expertise in electron accelerators—its raison d'être—will likely wither, Leith says.

    Others say that in the transition to a multipurpose lab, SLAC has lost the cohesive culture that made it such a stimulating place to work. Michael Huffer, who worked on the BaBar particle detector, says SLAC's ethos reflected the values of its founding director, Wolfgang Panofsky: Everyone from the scientists to the custodians should have a say, there should be little hierarchy, and the lab should maintain its independence.

    Now the lab is more bureaucratic, and DOE takes a bigger role in its inner workings, Huffer says. Officials from DOE's site office now roam the lab to talk directly to employees, he says, something he doubts Panofsky or Richter would have tolerated, on the grounds that Stanford alone ran the lab. “Management has to worry that the things that made SLAC attractive to people are the things that aren't here anymore,” Huffer says. “If you kill that culture, you kill the golden goose.”

    Still, most scientists are adjusting. Christopher O'Grady also worked on BaBar and says he would have been happy if SLAC's particle physics experiments had continued. But even as it became clear that they wouldn't, O'Grady says, he was becoming intrigued by something else entirely: the basic science needed to address the world's energy problem.

    So O'Grady, 45, signed on to work on LCLS experiments, which are likely to be more relevant to the issue. “Just this morning I was in a meeting with the catalyst guys, and it was pretty challenging because I don't know what's going on,” he says. O'Grady says he has no regrets, though. “I decided in the eighth grade that I wanted to do high-energy physics, and that's what I did. This is the first time in my life that I've ever wanted to do something else.” At SLAC, now is the time to make a change.

  11. Paleoanthropology

    New Work May Complicate History of Neandertals and H. sapiens

    1. Michael Balter

    At a meeting on human origins, held this month in Gibraltar, bones in a Spanish cave and stone tools in Asia sparked controversial new ideas about human evolution and migration.

    Who were they?

    Scientists aren't sure how to classify early humans fossilized at Sima de los Huesos.

    CREDIT: MAURICIO ANTON

    GIBRALTAR—In 1848, quarry workers on this peninsula at the southernmost tip of Spain unearthed the first known Neandertal skull. Sixteen years later, two friends of Charles Darwin's brought the skull to him at his sister-in-law's home in London. History does not record Darwin's thoughts as he stared at the massive skull's thick brow ridges and jutting face. Indeed, Darwin made no mention at all of human evolution in his 1859 book On the Origin of Species, and he acknowledged the Neandertals only fleetingly in 1871's The Descent of Man. Yet the nearly 100 scientists who gathered here last month* to ponder the latest research on Neandertals and other ancient humans were happy to embrace him as their intellectual godfather.

    “Charles Darwin was the man who did more than anyone else to crystallize the idea that life had changed over time and that, by implication, humankind had extinct relatives who could be found in the fossil record,” Ian Tattersall, a paleoanthropologist at the American Museum of Natural History in New York City, told attendees of the meeting in one of many talks that began by paying homage to the great man. And although Darwin may have understood the big picture, unraveling the many twists and turns that human evolution has taken over millions of years remains very much a work in progress. Among the critical questions under debate at the meeting: What were the evolutionary origins of the Neandertals, and how early did their close cousins, Homo sapiens, begin their highly successful colonization of the globe?

    What's in a name?

    Neandertals, identifiable by their unique skull shapes and stocky bodies, show up in the fossil record of Europe and western Asia at least 130,000 years ago. Recent sequencing of ancient Neandertal DNA suggests that their common ancestor with modern humans lived a bit less than 500,000 years ago, quite likely in Africa (Science, 13 February, p. 870). Some researchers call this common ancestor H. heidelbergensis, although they disagree about which fossils to group in that species. In his talk at Gibraltar, Tattersall argued that the real evolutionary picture might be much more complicated.

    Family branches.

    Two or more hominin lineages may have lived in Europe at the same time.

    CREDIT: IRA BLOCK/GETTY IMAGES; D. FINNIN/AMNH

    Many anthropologists detect “incipient” Neandertal features, including an increased forward projection of the face and the beginnings of a characteristic depression at the back of the skull, in European hominins dated at half a million years ago or less. In 1988, anthropologist Jean-Jacques Hublin, now at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, proposed that those unusual features had accumulated slowly over time. Later, Hublin and colleagues argued that Neandertals had evolved in four stages, beginning with fossils usually assigned to H. heidelbergensis (although Hublin prefers to call them H. rhodesiensis, after a skull found in Zambia).

    Tattersall agreed that some fossils—including the 225,000-year-old Steinheim skull found near Stuttgart, Germany, and a 400,000-year-old skull from Swanscombe, England—might fit Hublin's “accretion model.” But others, he said, emphatically do not. The big stumbling block is one of the most spectacular fossil finds in the history of paleoanthropology: the discovery since the mid-1990s of thousands of bones from some 28 hominin individuals at the cave site of Sima de los Huesos in northern Spain (Science, 2 March 2001, p. 1722). The published finds include four hominin skulls with both Neandertal-like and non-Neandertal features. And the team working at the site, co-led by anthropologist Juan Luis Arsuaga of the Complutense University of Madrid, has assigned its fossils to H. heidelbergensis.

    The Sima fossils were first dated to about 350,000 years ago. But more recent uranium-series dating, led by geochronologist James Bischoff of the U.S. Geological Survey in Menlo Park, California, suggests that they are at least 530,000 years old. That would make them as old as or older than “classic” H. heidelbergensis fossils from southern France, Greece, and other places—fossils that the Sima skulls don't much resemble, Tattersall insisted. Tattersall concludes that two or more hominin lineages must have existed side by side in Europe for several hundred thousand years before H. sapiens arrived from Africa. One line led to the Neandertals and may have included the Sima fossils; another, rightly called H. heidelbergensis, went extinct while the Neandertals lived on until at least 30,000 years ago.

    Tattersall then looked at Arsuaga, who was sitting in the audience waiting to speak next: “My central plea is to the colleagues who assigned the Sima de los Huesos fossils to H. heidelbergensis. They are clearly not Neandertals, but not being a Neandertal does not make them H. heidelbergensis. They need another name.”

    A hush fell over the room as Tattersall sat down and Arsuaga got up to speak. To nearly everyone's surprise, Arsuaga agreed that the Sima de los Huesos skulls looked nothing like other H. heidelbergensis specimens. Nor, he said, do 13 other skulls his team had recently excavated there. “We have always said that we put the Sima hominins under the H. heidelbergensis umbrella for convenience, for practical reasons,” Arsuaga said, adding that his team agrees with Tattersall that the accretion scenario is not likely. But he resisted Tattersall's call to rename the Sima fossils, at least until the remaining 13 skulls are published in coming months.

    Chris Stringer, a paleoanthropologist at the Natural History Museum in London whose early research led to the recognition of H. heidelbergensis as a formal species, says a lot is riding on the new 530,000-year minimum date for the Sima fossils. If the dating is right, Stringer says, “it would be evident that an early form of Neandertal was [in Europe] alongside of H. heidelbergensis.” But he argues that the dating is at the limit of the uranium-series technique and also contradicts other molecular and fossil evidence suggesting that the Neandertal line split off somewhat after 500,000 years ago. Bischoff defends his methodology, however, saying that the date is a “conservative” estimate and that the Sima hominins could be even older than 530,000 years but not younger.

    Hublin, who was not at the meeting, stands by accretion and sees no need to find a new name for the Sima fossils. He told Science that all of the hominins under discussion “have Neandertal features to some degree” that become more pronounced over time. The best solution, Hublin says, would be to scrap the species name H. heidelbergensis and lump all of these fossils, including those from Sima, together as H. neanderthalensis.

    But Tattersall insists that names do matter, even if more of them are required to classify the fossil record. “Species have an independent existence in nature,” he says. “They are the basic actors in the evolutionary play, and if you don't know who the cast is, you will never understand the plot.”

    Out of Africa and into Asia

    While the Neandertals were sorting out their family relations in Eurasia, our own species, H. sapiens, was emerging in Africa, probably from H. heidelbergensis or H. rhodesiensis ancestors. The first recognizably modern human fossils show up in Ethiopia about 200,000 years ago. Yet many experts have concluded, from genetic and other evidence, that modern humans did not leave Africa and begin to colonize the globe until about 50,000 years ago. The exception is the Levant, where modern humans apparently lived alongside Neandertals between about 130,000 and 75,000 years ago, as part of what some scientists have called a “failed dispersal.”

    In recent years, however, some researchers have seen evidence for earlier dispersals, especially into southern Asia. In 2007, for example, a team led by archaeologist Michael Petraglia, now at the University of Oxford in the United Kingdom, published evidence that hominins making sophisticated stone tools were established in south India by 74,000 years ago (Science, 6 July 2007, p. 114). The team found the tools at several archaeological sites in India's Kurnool District, above and below ash layers from a well-dated eruption of Mount Toba in Indonesia, one of Earth's most cataclysmic volcanic events. Although no human fossils were found, the team concluded that the tools bore striking similarities to Middle Stone Age tools made by modern humans in Africa and suggested that H. sapiens might have made the Indian artifacts as well.

    SOURCE: MICHAEL PETRAGLIA

    In a talk at Gibraltar, Petraglia argued that new work by his team and others in both India and the Arabian Peninsula strongly supports an earlier modern human dispersal out of Africa. At the site of Shi'bat Dihya 1 in Yemen, a French-led team recently reported finding stone tools dated by optically stimulated luminescence to between 70,000 and 80,000 years ago. Slightly older dates have emerged from work by a German-led team at Jebel Faya 1 in the United Arab Emirates. Meanwhile, at new sites in the Kurnool District and India's Middle Son valley, Petraglia and colleagues have uncovered thousands more stone tools below and above the 74,000-year-old Toba ash layer.

    Some of the Indian and Arabian tools resemble those of the African Middle Stone Age; others are typical of so-called Middle Paleolithic tools that Neandertals made in Europe and that both Neandertals and modern humans made in the Levant. Because Neandertals are not known in south Asia, and other “archaic” hominins such as H. heidelbergensis and H. erectus are assumed to have been extinct by 80,000 years ago, Petraglia argued that modern humans are the most likely toolmakers.

    If so, Petraglia said, the “failed dispersal” in the Levant might not have been such a bust after all. Modern humans might have continued on to south Asia and ultimately Australia, where they show up at least 40,000 years ago. Other possibilities include additional dispersals from Africa, possibly via the narrow Bab al Mandab strait between Arabia and the Horn of Africa.

    To some researchers at the meeting, such scenarios make perfect sense. “This is the most parsimonious explanation,” says paleoanthropologist David Begun of the University of Toronto in Canada. “Who else could it be if not Homo sapiens?” And Robin Dennell, an archaeologist at the University of Sheffield in the United Kingdom, notes that there were no climatic or geographical barriers to keep modern humans out of south Asia. “Once they were in the Levant and Arabia, there was nothing stopping them.”

    But not everyone was ready to jump on the early-dispersal bandwagon. “Mike is giving himself the benefit of the doubt,” says John Shea, an archaeologist at Stony Brook University in New York state. “The morphology of the tools doesn't tell you who made them, unless you have some other sort of proof.” Archaeologist Curtis Marean of Arizona State University, Tempe, agrees: “I am skeptical of the idea that modern humans made Middle Paleolithic tools any further east than the Levant, because we have not found modern human fossils with [those tools] further east than that.” Petraglia has provided a “plausible hypothesis,” Marean says. But to prove it, he and others say, Petraglia's team needs to find modern human fossils.

    And that, Petraglia told attendees at the meeting, is just what he and his co-workers are now looking for.

    • * Human Evolution 150 Years After Darwin, Gibraltar, 16–20 September 2009.
