News this Week

Science  05 Mar 2010:
Vol. 327, Issue 5970, pp. 1184
  1. Seismology

    Two Years Later, New Rumblings Over Origins of Sichuan Quake

    1. Richard A. Kerr and
    2. Richard Stone

    BEIJING—When experts suggested that the disastrous 2008 Wenchuan earthquake might have been triggered by the reservoir behind the Zipingpu Dam, establishment scientists in China remained largely silent (Science, 16 January 2009, p. 322). Now they've weighed in, ruling out reservoir triggering. But many earth scientists don't buy their arguments.

    No large quake had ruptured the Beichuan-Yinxiu fault in southwestern China's Sichuan Province in at least a millennium or two. Then engineers built Zipingpu Dam on the Min River just 500 meters from the fault and in late September 2005 began filling it with upward of 900 million tons of water. Two-and-a-half years later, the magnitude-7.9 Wenchuan earthquake got started 5 kilometers from the reservoir.

    In the January issue of International Water Power and Dam Construction, three dam engineers at the China Institute of Water Resources and Hydropower Research here argue that the Zipingpu-Wenchuan situation was so unlike that of the four largest known reservoir-triggered earthquakes—all in the magnitude-6 range—that there could not have been a connection between reservoir and quake. The authors, led by structural engineer Chen Houqun, who has co-authored China's design code for building earthquake resistance into dams, contend that the timing was mere coincidence.

    Trickle-down theory.

    Did filling the Zipingpu Dam trigger the magnitude-7.9 Wenchuan earthquake? Chen Houqun (inset) says no.

    CREDITS: WINGCHINA; (INSET) R. STONE/SCIENCE

    Unlike the four big triggered quakes, the authors point out, the Wenchuan quake occurred on a thrust fault: One block of crust is pushed up the inclined fault over the other block. According to the authors, the reservoir's weight lies on a “relatively stable region” of the footwall, or underlying block.

    In addition, the 300-kilometer Wenchuan rupture began 18.8 kilometers beneath the surface, according to new data from a team led by geophysicist Liu Qiyuan of the China Earthquake Administration's Institute of Geology in Beijing. The reservoir's water pressure could not have driven water that far down through cracks and pores, Chen argues. Such water infiltration over months or years is thought to weaken a fault by pushing its sides apart.
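
    The mechanics underlying both sides of this argument can be stated compactly. In the standard Coulomb picture—a textbook relation, not one quoted by either camp—the shear stress a fault can withstand before slipping falls as pore-fluid pressure rises:

```latex
% Coulomb failure criterion with pore pressure (standard form):
%   \tau_f   shear stress at failure      C         cohesion
%   \mu      friction coefficient         \sigma_n  normal (clamping) stress
%   P        pore-fluid pressure
\tau_f = C + \mu\,(\sigma_n - P)
```

    Raising P unclamps the fault, which is why water filtering down from a reservoir can nudge a fault toward failure—and why Chen argues that a pressure perturbation unable to reach 18.8 kilometers deep could not have mattered.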

    Finally, Chen and his colleagues note, in the four cases, filling a dam's reservoir led to an increasing number of temblors until a large earthquake struck. But a seismic-monitoring network around Zipingpu reservoir established 13 months before impoundment began showed only “normal variation” of seismic activity between impoundment and the quake, the group writes. “All of these factors rule out triggering,” says Chen.

    Martin Wieland, chair of the International Commission on Large Dams' committee on seismic aspects of dam design, says the paper makes a persuasive case. And Liu adds that the Wenchuan quake's focal depth by itself discounts a link to Zipingpu. But neither categorically rejects a role for Zipingpu. There are too many uncertainties, says Wieland, an earthquake engineer at Poyry Energy in Zurich, Switzerland.

    Many seismologists say the Water Power authors overstate their case. “I don't think [triggering] has been put to rest yet,” says seismologist Arthur McGarr of the U.S. Geological Survey (USGS) in Menlo Park, California, who co-authored a review of reservoir-triggered seismicity (RTS) in 2002. The problem, they say, lies in comparing Wenchuan to the four big RTS earthquakes. “Wenchuan doesn't fit the pattern? What pattern?” asks seismologist Ross Stein of USGS Menlo Park. “The [four] examples are all over the map as to how seismicity has responded to dam impoundment. It shows just how little we know about this process.” Although RTS does tend to be shallow, McGarr says, the Aswan High Dam caused a magnitude-5.3 event in 1981 at a depth of 18 kilometers, similar to Wenchuan's depth. And a magnitude-4.5 RTS quake in Tajikistan struck in a region of thrusting—as did Wenchuan.

    Last October, hydrogeologist Shemin Ge of the University of Colorado, Boulder, and colleagues presented modeling evidence in Geophysical Research Letters that water infiltration from Zipingpu “potentially hastened the occurrence of the Wenchuan earthquake by tens to hundreds of years.” Other researchers see Zipingpu's fingerprints all over the seismic data. One of the most vocal advocates for a dam-quake link, geologist Fan Xiao of the Sichuan Bureau of Geology and Mineral Resources in Chengdu (Science, 8 May 2009, p. 714), says swarms of small quakes struck the region near Zipingpu 3 months before the Wenchuan earthquake. “The fact is, there were obvious foreshocks,” Fan says. Chen says he welcomes “scientific debate”—but he is sticking to his guns.

    What researchers still want almost 2 years after the earthquake is wide dissemination of the raw data from the Zipingpu monitoring. Until such data sets become commonplace, says geophysicist Evelyn Roeloffs of USGS in Vancouver, Washington, “it's always going to be this kind of story.”

  2. Pharmacology

    Growth Hormone Test Finally Nabs First Doper

    1. John Travis

    On 22 February, Terry Newton became perhaps the world's best-known rugby player—but not for any accomplishment on the field. UK Anti-Doping last week announced that Newton is the first athlete caught by a blood test designed to detect doping with human growth hormone (HGH) to boost muscle mass. The positive test, for which Newton has accepted a 2-year ban from rugby, represents a warning to athletes who may have thought HGH use was undetectable, and it also erases lingering doubts about the test among scientists. “It's a great success,” says Mario Thevis of the Center for Preventive Doping Research at the German Sport University in Cologne. “We were always confident in principle that the test would work.”

    At the same time, this first successful detection of HGH doping underscores the challenge scientists face as they race to keep up with cheating athletes (Science, 30 July 2004, p. 632). The current HGH blood test made a limited debut at the 2004 Athens Olympics, but because of a variety of difficulties, the assay is only now being used widely. And a second HGH test that should stand a better chance of nabbing doped athletes remains on the sidelines as researchers struggle to overcome problems that have stalled its validation.

    The test that caught Newton vindicates an idea that endocrinologist Christian Strasburger of Charité-University Medicine Berlin and colleagues proposed in 1999 in an article in The Lancet. At the time, illicit use of the hormone was considered unstoppable, as the synthetic form of HGH is identical to the 22-kilodalton (kD) protein naturally made by the pituitary gland. But the German group noted that the pituitary makes other forms, notably a 20-kD variant, and that analyzing the HGH isoform ratios in a blood sample could reveal whether someone had dosed themselves with the synthetic hormone. After several years of study, the World Anti-Doping Agency (WADA) approved the isoform test, initially screening a small number of athletes at the 2004 and 2006 Olympic Games.
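
    In code, the principle of the isoform test reduces to a ratio and a cutoff. The sketch below is purely illustrative—the concentrations and the decision limit are invented, and the real assay rests on validated antibody pairs and population-derived reference ranges:

```python
# Minimal sketch of the isoform-ratio principle behind the HGH test.
# All numbers here are hypothetical placeholders, not WADA's cutoffs.

def isoform_ratio(conc_22kd: float, conc_20kd: float) -> float:
    """Ratio of 22-kD HGH (the synthetic form) to pituitary 20-kD HGH."""
    return conc_22kd / conc_20kd

def flag_sample(conc_22kd: float, conc_20kd: float,
                decision_limit: float = 2.0) -> bool:
    """Flag a sample whose 22-kD share far exceeds the natural blend.

    Injected synthetic HGH is pure 22-kD protein and also suppresses the
    pituitary's own mixed output, so recent doping inflates the ratio.
    """
    return isoform_ratio(conc_22kd, conc_20kd) > decision_limit

print(flag_sample(conc_22kd=9.0, conc_20kd=1.0))  # True: skewed sample
print(flag_sample(conc_22kd=2.0, conc_20kd=1.0))  # False: plausible mix
```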

    Yet until last week, not a single HGH-doping infraction had been declared, puzzling doping researchers. “There has been concern that there haven't been any positive tests,” says endocrinologist Richard Holt of Southampton General Hospital in the United Kingdom.

    Banned.

    Professional rugby player Terry Newton was caught by an HGH blood test.

    CREDIT: RICHARD SELLERS/SPORTSPHOTO LTD./ALLSTAR/NEWSCOM

    The likely reason: The test only spots abnormal isoform ratios within a day or so after HGH use, so athletes probably stopped taking the hormone a few days before an event. For that reason, the isoform assay was always planned as an unannounced, out-of-competition test—the way Newton was caught—but supplying enough test kits and validating regional testing labs proved difficult. The tests used in 2004 and 2006 were made by Strasburger's lab, but plans for a commercial manufacturer to take over fell through. Strasburger says the firm's parent company ultimately decided that the market for a doping test was too small, and it had legal concerns. Finally, a new supplier signed on, making the test more available. And because this wasn't a simple urine test, WADA had to develop rigorous procedures for collecting and storing blood samples. It “has been the most complex test ever put into place,” says Olivier Rabin, WADA's science director.

    Based on data he's seen, Strasburger says WADA could have accused other athletes of HGH doping, but out of caution, the agency refrained. After years of studying HGH isoform ratios in elite athletes—more than 1500 tests have now been conducted—WADA “has gotten a better idea of what's normal. … Now all the legal concerns are out of the way,” Strasburger says. Rabin confirms there have been other “suspicious” tests but says, “we really want to make sure we catch the cheater and not someone who had a peak of growth hormone.”

    This first success of the HGH isoform test has prompted the U.S. National Football League and Major League Baseball to declare interest in its use, although the players' unions are wary of blood, rather than urine, tests. Despite years of research on it, an HGH urine test remains far away, say doping scientists.

    For now, WADA is pushing to ready a second blood test that could catch athletes up to a month after they've doped themselves with HGH. It's based on the idea that injections of synthetic HGH—or of HGH from cadavers, which the isoform test would miss—dependably change blood concentrations of other substances. A major effort to identify such HGH biomarkers began in 1996, and Holt and other researchers settled on insulin-like growth factor 1 (IGF-1) and type III procollagen. But validating the biomarkers hasn't been easy.

    The biggest stumbling block has been the use of commercial immunoassays, antibody-based detection kits, for IGF-1. After scientists gathered years of data on IGF-1 with certain immunoassays, those kits were withdrawn from the market. “We're at the mercy of the manufacturers of immunoassays,” Rabin says.

    Rabin says WADA has lined up seemingly stable suppliers of the new immunoassays for the HGH biomarker test, and Holt and his colleagues are comparing IGF-1 measurements from the old and new assays. They're also evaluating the new IGF-1 assay in 500 elite U.K. athletes, hoping to amass enough data to persuade WADA. Meanwhile, a group at King's College London is exploring measuring IGF-1 levels in blood with a mass spectrometer.

    WADA will soon review the latest data and decide whether to launch the HGH biomarker test for the 2012 Olympics in London. “We're working toward that date,” says Holt.

  3. Paleoclimatology

    Snowball Earth Has Melted Back To a Profound Wintry Mix

    1. Richard A. Kerr

    In 1998, a handful of geoscientists at Harvard University breathed new life into a daring idea: that Earth froze over from pole to pole more than a half-billion years ago, threatening life with extinction but perhaps prodding it to greater evolutionary heights (Science, 28 August 1998, p. 1342). On page 1241 of this issue, geoscientists report evidence that the tropics also hosted glaciers more than 100 million years before that supposed global freeze. Such low-latitude glaciation is a hallmark of so-called hard snowball Earth scenarios, in which a kilometer of ice sealed off the world ocean.

    But despite the new work, the much-studied hypothesis has fallen on hard times. “In many people's minds, the hard snowball is dead,” says geochemist Michael Arthur of Pennsylvania State University (PSU), University Park, who was not involved in the new work. Earth was profoundly cold in those geologically weird days, many agree—a “slushball” of a planet, perhaps. But sealed in ice? Unlikely.

    The new contribution to the snowball debate comes from Harvard geologist Francis Macdonald and colleagues at Harvard and elsewhere. They dated volcanic ash layered within deposits of the so-called Sturtian glacial era to an age of 716.5 million years. That's the same age as rocks whose paleomagnetic record places them and the Sturtian glaciers in the tropics.

    Definitely chilly.

    Clear signs of glaciation, such as rocks dropped to the sea floor from icebergs (inset), show up in tropical deposits (dark peak).

    CREDITS: FRANCIS A. MACDONALD

    Researchers speculated about possible ancient tropical glaciers for several decades before geobiologist Joseph Kirschvink of the California Institute of Technology in Pasadena coined the term “snowball Earth” in 1992. But the hard-snowball concept gained ground only after geologist Paul Hoffman—then at Harvard University and now retired—and three colleagues boosted it in the 1998 Science paper. Drawing on simple climate modeling, the authors concluded that any ice that reached tropical latitudes during the Marinoan glaciation, about 650 million years ago, would not have stopped there. Instead, once the highly reflective ice covered enough area, a climatic feedback would inevitably drive the ice to the equator and create global glaciation: a hard snowball.
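
    The feedback at the heart of that argument is easy to reproduce in a toy model. The zero-dimensional energy-balance sketch below is a caricature with illustrative parameter values—it is not the 1998 paper's modeling—but it captures the claimed behavior: past a tipping point, ice reflects enough sunlight to make more ice, and the climate falls into a frozen equilibrium.

```python
# Toy zero-dimensional energy-balance model of the ice-albedo runaway.
# All parameter values are illustrative, chosen only to produce two
# stable climate states; none come from the article or the 1998 paper.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1360.0      # solar constant, W m^-2
G = 0.62         # crude greenhouse factor reducing outgoing radiation

def albedo(temp_k: float) -> float:
    """Planetary albedo rises as ice sheets spread toward the equator."""
    if temp_k >= 285.0:
        return 0.3                              # essentially ice-free
    if temp_k <= 250.0:
        return 0.7                              # globally ice-covered
    return 0.7 - 0.4 * (temp_k - 250.0) / 35.0  # partial ice cover

def equilibrate(temp_k: float, steps: int = 50_000) -> float:
    """March the global mean temperature to radiative equilibrium."""
    dt_over_c = 1.0e-3  # time step over heat capacity, K per (W m^-2)
    for _ in range(steps):
        net = S0 / 4.0 * (1.0 - albedo(temp_k)) - G * SIGMA * temp_k**4
        temp_k += dt_over_c * net
    return temp_k

print(equilibrate(280.0))  # ~287 K: recovers to the temperate branch
print(equilibrate(274.0))  # ~232 K: runs away to a hard-snowball state
```

    Two starting temperatures only 6 degrees apart end up on opposite branches: once enough area is ice-covered, the feedback does the rest, which is the essence of the 1998 argument.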

    Some more-recent paleoclimate modeling, however, suggests that the leap from low-latitude glaciation to a hard snowball may be difficult or even impossible. “We can get ice on land,” says climate modeler Mark Chandler of the Goddard Institute for Space Studies in New York City. “It's the ocean we can't freeze over.” Model oceans can hold lots of heat and move it around in currents, frustrating a complete freeze-over, Chandler says. A few years ago, “the pattern was that the more sophisticated the model, the less likely you'd get a hard snowball result,” he says. Discouraged, Chandler and others moved on to other projects.

    Atmospheric physicist James Kasting of PSU now favors a slightly more modest “thin ice” snowball. He and climate modeler David Pollard of PSU have considered how a continent poleward of an inland sea might hold off thick ice intruding from higher latitudes and preserve small areas of thin ocean ice, thin enough to let sunlight through for marine plants. “We think the thin-ice solution satisfies all the constraints better than the other models,” Kasting says.

    But almost all geologists now reject any worldwide freeze, says geologist Philip Allen of Imperial College London. “When the snowball came up, the [geological] community was very open to it,” he says. Now, “it's my impression that 90% of the geological community is quite hostile to the idea.”

    Allen and other geologists went to the field to study glacial deposits from about the time of the proposed Marinoan hard snowball. Instead of stagnation, the sediments recorded signs of water and ice in motion: ice drifting, ocean currents flowing, and waves rolling on an open sea. “We do not have a hard snowball Earth,” says Allen. Hoffman hasn't disputed such interpretations, but he has argued that they could reflect conditions either just before or after a hard snowball.

    Most geochemists aren't sold on a hard snowball either. Key to the Harvard group's argument was the contention that a bizarre chemical deposit found on the top of glacial deposits—the cap carbonate formation—could have formed after a glacial period only if the world ocean had been sealed off from the atmosphere for millions of years. Only rare cracks in the ice or open water maintained by volcanic hot spots kept the biota going, the group maintained. Geologist Alan Jay Kaufman of the University of Maryland, College Park, a co-author of the 1998 Science paper, has shifted his stance. After studying the isotopic records of carbon, strontium, and sulfur, he now supports the slushball view. The sulfur isotopes in particular, he says, suggest “that there was more than cracks in the ice.”

    Hoffman is unperturbed. Resistance to the hard snowball “is really typical of scientific controversy,” he says. “The problem is the experts reach a quick judgment and dig themselves into a position.” The idea of a recent ice age, he notes, took 40 years and a new generation of scientists to win acceptance in the 19th century. In his view, “the evidence [for a hard snowball] is getting stronger and stronger.” He cites oxygen isotope findings published last year supporting the existence of extremely high atmospheric carbon dioxide concentrations predicted by the hard snowball. Still, Hoffman says, “I don't expect to live to see the conclusion on Snowball Earth, though I think I know how it will turn out.”

  4. Archaeology

    Of Two Minds About Toba's Impact

    1. Michael Balter
    After the volcano.

    Toba's caldera is today a calm lake.

    CREDIT: NICK PEARCE

    OXFORD, U.K.—About 74,000 years ago, Mount Toba on the Indonesian island of Sumatra erupted in the most cataclysmic volcanic event of the past 2 million years. An estimated 2800 cubic kilometers of magma spewed forth, at least 2000 times that ejected during the 1980 eruption of Mount St. Helens. Wind-blown volcanic ash spread thousands of kilometers across Asia, blocking sunlight and triggering what some researchers have called a global “volcanic winter.”

    How did this gigantic eruption affect modern humans? Did Toba decimate previously thriving Homo sapiens populations in Africa and create a genetic “bottleneck” from which our species nearly failed to recover, as advocated by archaeologist Stanley Ambrose of the University of Illinois, Urbana-Champaign, and others? Or was the eruption, albeit dramatic, no match for the survival skills of our ancestors? Debate has raged for more than a decade. And some think the answer lies not in Africa but in Asia, right in the path of the volcano's wrath. If modern humans survived there relatively unscathed, African populations would likely have fared even better. But the resolution hinges on an even more hotly debated question: Was Homo sapiens even in Asia at the time?

    Key evidence that modern humans might have made it to Asia before the eruption, and that they came through it, comes from excavations since 2003 at Indian sites that show evidence of sophisticated stone tools both above and below the 74,000-year-old Toba ash layers (Science, 9 October 2009, p. 224). Many members of the dig team, led by archaeologists Michael Petraglia of the University of Oxford and Ravi Korisettar of Karnatak University in Dharwad, India, think modern humans made both sets of tools.

    But these findings conflict with recent mitochondrial DNA (mtDNA) studies suggesting that our ancestors probably left Africa after the eruption—perhaps between 55,000 and 70,000 years ago, according to a paper in the American Journal of Human Genetics last June by geneticist Martin Richards of the University of Leeds in the United Kingdom and his co-workers. The mtDNA data make a pre-Toba exodus from Africa “very unlikely,” says Richards. If modern humans hadn't reached Asia yet, a different hominin must have made those pre-Toba tools.

    At a meeting* here last month on the impact of the Toba eruption, geneticists Stephen Oppenheimer of Oxford and Robin Allaby of the University of Warwick in the United Kingdom provided some maneuvering room for both camps. The timing of the molecular clock is too uncertain to rule out a pre-Toba African exodus, they argued. Genetic data “neither includes nor excludes a pre-Toba exit,” Oppenheimer said.

    The archaeological evidence remains equivocal too, although Petraglia's team presented several papers at the meeting that may bolster their case. For example, the so-called Middle Paleolithic tools below and above the Toba ash at several sites in the Jurreru River Valley of southern India are similar, said archaeologist Chris Clarkson of the University of Queensland in Australia; the tools include sophisticated artifacts such as flakes, blades, points, and scrapers.

    Clarkson compared the detailed features of the stone cores from which the Indian tools were made with more than 800 cores made by both modern humans and earlier hominins at other sites. His study, an expanded version of a similar one reported in Science in 2007 (Science, 6 July 2007, p. 114), found that both the pre- and post-Toba tools clustered with those apparently made by modern humans in Australia, southern Africa, and Southeast Asia, whereas tools made by Neandertals and other nonmodern humans were statistically distinguishable. Clarkson concluded that his study made a “good case” that modern humans were present in India before 74,000 years ago and also supported continuity of modern human occupation before and after Toba's supposed volcanic winter.
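
    The sketch below illustrates the general kind of multivariate comparison such a study involves: describe each core by a vector of measured attributes, standardize them, and ask which reference group's centroid the core sits nearest. The attribute names and all values are invented for illustration; Clarkson's published analysis used its own measurements and statistical tests.

```python
# Nearest-centroid classification of a stone core by its attributes.
# Attributes and numbers are hypothetical; this only illustrates the
# style of analysis, not Clarkson's actual data or method.
import numpy as np

# Columns: core length (mm), flake-scar count, platform angle (degrees)
modern_refs = np.array([[62.0, 14.0, 75.0],
                        [58.0, 12.0, 78.0],
                        [65.0, 15.0, 73.0]])
neandertal_refs = np.array([[80.0, 8.0, 88.0],
                            [76.0, 7.0, 90.0],
                            [83.0, 9.0, 86.0]])

def classify(core: np.ndarray) -> str:
    """Assign a core to the reference group with the nearest centroid."""
    refs = np.vstack([modern_refs, neandertal_refs])
    mu, sd = refs.mean(axis=0), refs.std(axis=0)  # standardize attributes
    z = lambda x: (x - mu) / sd
    d_modern = np.linalg.norm(z(core) - z(modern_refs).mean(axis=0))
    d_neand = np.linalg.norm(z(core) - z(neandertal_refs).mean(axis=0))
    return "modern-like" if d_modern < d_neand else "Neandertal-like"

# A core with these (made-up) measurements clusters with the modern group.
print(classify(np.array([60.0, 13.0, 76.0])))  # -> modern-like
```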

    Archaeologist Robin Dennell of the University of Sheffield in the United Kingdom calls the study “pretty convincing.” Clarkson, Dennell adds, “seems to have found a way of distinguishing tools made by Homo sapiens and Neandertals,” both of whom made Middle Paleolithic artifacts.

    But other researchers think tools alone can't pinpoint the identity of their makers. “Artifacts are not diagnostic of [hominin] species,” says Ambrose, who was not at the meeting. Archaeologist Paul Mellars of the University of Cambridge in the United Kingdom agrees; he suspects that Neandertals made the pre-Toba tools. If so, it would be the first sighting of Neandertals in south Asia.

    As for the dates of the tools, a bombshell hit Petraglia's group at the meeting, delivered by a member of its own team: geochronologist Richard “Bert” Roberts of the University of Wollongong in Australia. The 2007 Science paper cited optically stimulated luminescence (OSL) dates from a site called Jwalapuram locality 3, of 77,000 years just below the Toba ash and 74,000 years just above it, with error margins of plus or minus 6000 to 7000 years. But Roberts presented new OSL dates from nine Indian sites and found that the pre-Toba dates came out at 74,000 years or earlier, as expected, but nearly all of the post-Toba dates were about 55,000 years or younger. These results, Roberts told Science, put “a question mark over” the 74,000-year, post-Toba date reported in the Science paper.

    Roberts obtained the pre-Toba dates by OSL dating of single quartz grains. But the 74,000-year post-Toba date was determined by an Oxford lab using multiple quartz grains, which Roberts considers less accurate. Roberts and the team now plan to redate the sample using the single-grain technique.

    Meanwhile, Petraglia's team continues to explore Toba's effects in India. At the meeting, Oxford archaeologist Michael Haslam reviewed the evidence, including the team's studies of the ancient Indian landscape, and concluded that “we can't say that Toba caused anything other than a lot of ash.” Humans were able to survive in many refugia—particularly the river valleys where archaeological sites have been found and where ash was more quickly flushed out by water flows, he said. The team's recent studies of carbon and oxygen isotopes in sediments above and below the ash suggest that the pre-Toba mix of woodland and grassland quickly recovered, Petraglia says.

    Ambrose agrees that there could have been refugia in India—but not necessarily for modern humans, who he is not convinced had made it to the region that early. Mellars is also skeptical but says the ongoing Indian digs will help find the answer. “This is the most important fieldwork in Paleolithic archaeology of the last 10 years,” he said at the meeting.

    • * Toba Super-Eruption: A Critical Moment in Human Evolution?, 20–21 February 2010, Oxford, U.K.

  5. ScienceInsider

    From the Science Policy Blog

    The Intergovernmental Panel on Climate Change and its parent organization, the United Nations Environment Programme, will request an independent review of IPCC in the wake of unprecedented criticisms of the panel. The terms of reference will be set by the organization that conducts the review. Working Group II co-chair Chris Field told Science that a respected scientific organization would be tasked with the job, which he hopes will help end “a crisis of confidence” plaguing IPCC.

    Meanwhile, embattled climate scientist Phil Jones of the University of East Anglia gave testimony before the U.K. House of Commons Science and Technology Committee for the first time since roughly 1000 e-mails between climate scientists were made public in late 2009. No new revelations were made under tough questioning from members of Parliament, and lawmakers seemed unmoved by arguments by some that the e-mails undermine the scientific consensus on climate change.

    A new report suggests that Britain produces a disproportionately large number of top-ranked scientific publications, but U.K. researchers are not adept at translating that research into new inventions.

    Some bloggers are up in arms over National Institutes of Health (NIH) Director Francis Collins's new book, Belief: Readings on the Reason for Faith, which addresses the question “Is there a God?” Collins, an evangelical Christian, has been involved in the past with science-religion outreach but said he would curtail such work when he took his job at NIH.

    The European Southern Observatory's telescopes in northern Chile escaped damage from the magnitude-8.8 earthquake that devastated Concepción and nearby towns and villages in the south.

    For the full postings and more, go to blogs.sciencemag.org/scienceinsider.

  6. ScienceNOW.org

    From Science's Online Daily News Site

    CREDIT: U.S. FISH AND WILDLIFE SERVICE

    Early Polar Bear Discovered in Arctic Tundra

    Digging in the frozen tundra of Norway's Svalbard archipelago, scientists have uncovered the remains of the most ancient polar bear ever found. DNA analyses reveal that the bear—a mature male—lived about 120,000 years ago, at a time when woolly mammoths were also roaming the land. The work also shows that this bear represents something very rare in the fossil record: an evolutionary snapshot of one species turning into another.

    An Alternative to Insulin?

    In 1922, a Toronto teenager with diabetes became the first person to be saved by insulin treatment, and since then injections have sustained millions of diabetics who don't make the hormone themselves. But are there alternatives to a lifetime of insulin therapy? A new study suggests that an appetite-suppressing hormone called leptin is just as effective as insulin at controlling diabetes in mice.

    Engraved Eggs Suggest Early Symbolism

    What does Homo sapiens have that our hominid ancestors did not? Many researchers think that the capacity for symbolic behaviors—such as art and language—is the hallmark of our species. A team working in South Africa has now discovered what it thinks is some of the best early evidence for such symbolism: a cache of ostrich eggshells dated to about 60,000 years ago and etched with intricate geometric patterns.

    CREDIT: U.S. FISH AND WILDLIFE SERVICE

    Global Warming Didn't Kill Golden Toad

    The golden toad was last seen in 1989 in the Costa Rican cloud forest of Monteverde—and 5 years later, its disappearance was the first extinction to be blamed on human-induced global warming. New evidence, however, suggests that humans may not have been at fault after all.

    Read the full postings, comments, and more on sciencenow.sciencemag.org.

  7. Nutrition Science

    European Food Watchdog Slashes Dubious Health Claims

    1. Martin Enserink

    BRUSSELS—Do antioxidants prevent premature aging? Do dried plums help maintain normal bowel function? Does lutein help your vision; does chewing sugarless gum prevent dental plaque; and does fermented whey improve gut health?

    The answers: No, no, no, no, and no, according to Europe's food safety watchdog, which on 25 February issued a scientific mass verdict on more than 400 so-called health claims, the promises that food producers make on their labels and in advertisements. The opinions come from the European Food Safety Authority (EFSA), based in Parma, Italy, which also rejected purported health benefits of certain peptides, honey, black and green teas, and a raft of other substances.

    The decisions are the latest installment in a gargantuan and controversial effort by EFSA to validate more than 4000 health claims used by the food industry across the continent. More than a year behind schedule, the agency has more than 3000 claims to go—but so far, it has rejected more than 80% of those it has looked at. The food industry may eventually have to stop using those claims.

    Some hail the process, required by European Union (E.U.) legislation passed in 2006, as a timely new way to protect European consumers. “Finally, there's a bright scientific light on the somewhat shady world of food and supplements,” says nutrition scientist Martijn Katan of the VU University in Amsterdam, the Netherlands. But many in the food industry and some academic researchers say that EFSA has put the scientific bar so high that it may stifle food research in the long run.

    In the past decade, European companies have invested millions in so-called functional foods that they say offer health benefits; if EFSA rejects most of their claims, research on such products may lose its appeal. “I'm getting really worried about all the rejections,” says Glenn Gibson, a food microbiologist at the University of Reading in the United Kingdom who studies probiotics, or “healthy bacteria”—an area in which EFSA has so far rejected every claim it has reviewed. “My ultimate concern is that much of the food research in Europe will be lost.” Some company representatives vented their frustration at EFSA at a meeting here last week organized by Cantox Health Sciences International, a company that helps food and supplement producers prepare their dossiers for the regulatory mill.

    The high rejection rate is partly an artifact of the way the regulation was set up—a typical, tortured E.U. compromise. Member countries wanted to ensure an easy way through the process for food components already known to contribute to some function of the human body—such as vitamin C and iron. Under Article 13.1 of the regulation, such “function” claims can be filed with just a list of scientific references as evidence.

    But rather than “regulation-lite,” Article 13.1 has become a graveyard of health claims. Except for cases that are literally in the textbooks—mostly vitamins and minerals—EFSA found that there was little consensus. Judging the claims was often “very difficult” because they were poorly stated and the literature cited was often incomplete or irrelevant, says nutrition scientist Albert Flynn of University College Cork in Ireland, the chair of the 21-member Panel on Dietetic Products, Nutrition and Allergies, which issues the rulings. Nigel Baldwin, a regulation specialist at Cantox, says the 13.1 procedure, forced upon EFSA by the E.U. bureaucracy, was flawed from the beginning.

    Does it work?

    Regulators have yet to decide whether Danone's Actimel “helps strengthen your body's natural defenses,” as the company claimed.

    CREDIT: LUCAS SCHIFRES/LANDOV

    There is another way through the maze. Applicants can make their arguments for a product in a full scientific dossier—including a narrative about why they think it works—and some products rejected in the first round may yet win approval by this path, says Flynn. But industry observers say that the EFSA panel is putting the bar very high here, too. The panel puts little stock in animal studies, for instance, and as for human data, it heavily favors randomized, controlled, blind clinical trials, which are far less common in nutrition than pharmaceutical research.

    A big company recently stumbled, for example, even though it came armed with clinical data. On 4 February, EFSA rejected Danone's claim that Immunofortis, a mixture of oligosaccharides added to infant formula, strengthens babies' immune systems. “There's 10 years of science behind this product,” says Danone spokesperson Agnès Berthet-d'Anthonay; indeed, the company had 30 studies, including 25 with human data, to prove it. But the panel was unimpressed. Only one trial, in 259 infants, addressed the benefit directly, the panel concluded, and that study had “considerable weaknesses”: it wasn't clear how infections had been diagnosed, for instance. Danone is still awaiting a verdict on several other claims, including its probiotic drinks Actimel and Activia.

    Katan disputes that the EFSA panel is particularly stringent. “This is not draconian. It's just standard, mainstream science,” he says. The industry's problem is that convincingly proving a benefit is extremely difficult. Because many ingredients can't be patented, the sector can't afford the kinds of large, rigorous studies that the pharmaceutical industry funds.

    EFSA Senior Scientific Officer Juliane Kleiner also insists that the agency's scientific assessments aren't more demanding than, say, those of the U.S. Food and Drug Administration (FDA). But analysts point out that several features in the U.S. system make life easier for companies. As a result of a freedom of speech lawsuit, for instance, FDA also admits “qualified claims” for food products when there's some evidence but no scientific consensus. (A label could say, for instance, that there is “limited evidence” that a product may reduce the risk of disease.) The E.U. system doesn't allow that way out.

    It may be years before health claims actually start disappearing from labels and TV commercials, however. EFSA can't ban their use itself; that's up to the European Commission and the European Parliament, which have yet to decide what to do with most of the rejections. E.U. politicians are subject to intense lobbying by the industry to soften the blow, Katan says—“but I can't imagine that they'll throw all of these opinions in the garbage bin.”

  8. Genomics

    Semiconductors Inspire New Sequencing Technologies

    1. Elizabeth Pennisi
    Connecting the dots.

    A quantum dot (yellow) lights up the base (blue) that polymerase is adding to match the DNA being sequenced.

    CREDIT: COURTESY OF LIFE TECHNOLOGIES CORP. AND DIGIZYME INC.

    MARCO ISLAND, FLORIDA—Fifteen years ago, gels and fluorescent dyes lay at the heart of every machine that sequenced DNA. Bases cost a dollar or more to sequence, and deciphering a human genome would take years. The expense of decoding DNA has now plummeted, and new human genomes appear in quick succession thanks to advances in DNA-sequencing technologies and a growing roster of sequencer manufacturers. Whereas one company dominated the industry 6 years ago, about a half-dozen companies now produce DNA-sequencing machines—and novel technologies come onto the scene almost annually. Attesting to this revolution was a packed house at the final Saturday session of a genome meeting here, where silicon wafers and quantum dots were the DNA-sequencing technologies du jour.

    Although neither approach is yet able to fulfill the dream of the $1000 human genome, these newcomers have the potential to change the face of DNA sequencing and expand the ability of researchers, and eventually clinicians, to incorporate DNA into their work. “It's pretty exciting that the field has opened up as much as it has,” says Elaine Mardis of the Genome Center at Washington University in St. Louis, Missouri. At the same time, labs pursuing a sequencing project face ever more choices and tradeoffs between speed, accuracy, and cost. With so many technologies to choose from, “it's confusing,” says Eric Green, director of the National Human Genome Research Institute (NHGRI) in Bethesda, Maryland.

    This year's upstart at the annual Advances in Genome Biology and Technology meeting here was Ion Torrent Systems Inc., based in Guilford, Connecticut. Started 2 years ago by Jonathan Rothberg, who founded and sold the sequencing company 454 to Roche in 2007, Ion Torrent offers a DNA-sequencing strategy that's “a radically different proposal,” says Edward Rubin, director of the Department of Energy Joint Genome Institute (JGI) in Walnut Creek, California.

    In current “next generation” DNA sequencers, such as those made by 454, the machine decodes a strand of DNA by using it as a template to synthesize a matching strand, where base additions are signaled by the emissions of photons of light. A camera records each base, revealing the corresponding one on the template DNA. In most cases, the original DNA must be cut into small pieces that are copied many times over before being sequenced. Those pieces are anchored on beads or slides, and sequencing is done in a massively parallel fashion, achieving rates unthinkable with the gel-based technology used to decode the first human genome.

    Silicon sequencer.

    This Ion Torrent chip decodes DNA using voltage detection.

    CREDIT: ION TORRENT SYSTEMS INC.

    In contrast, Ion Torrent's sequencer is a silicon chip, built the same way as a semiconductor and etched with an array of nanoscopic wells—400 of them would fit across the width of a human hair. The wells sit on top of an ion-sensitive layer—a pH meter of sorts—below which is the layer that transmits electrical current. The DNA to be deciphered goes into the wells. Polymerase, the enzyme that builds up a matching strand, is added, along with each of the four different bases, one type of base at a time. When the polymerase finds the right base in the well and attaches it to the new DNA, the reaction releases a hydrogen ion that the chip detects as a voltage change. If the base in the well is wrong and isn't added, no voltage change happens. Then unattached bases are washed out and another type of base is added. Ultimately, the series of electrical pulses recorded by the chip translates into a readout of the DNA being sequenced.
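
    That flow-and-detect cycle can be captured in a short simulation. The toy decoder below makes idealized assumptions—no noise, and a homopolymer run simply registers a proportionally larger pulse—so it illustrates the readout logic, not Ion Torrent's actual signal processing:

```python
# Toy model of flow-based sequencing: flow one base type at a time,
# record a pulse only when the flowed base pairs with the template,
# wash, and repeat. The pulse train itself spells out the new strand.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}
FLOW_ORDER = "TACG"  # repeating order in which base types are flowed

def sequence_by_flows(template: str, n_flows: int = 24):
    """Return the (base, pulse-height) series and the synthesized strand."""
    pos, pulses, synthesized = 0, [], []
    for i in range(n_flows):
        flowed = FLOW_ORDER[i % 4]
        count = 0
        # the polymerase extends as long as the flowed base pairs up;
        # each incorporation frees a hydrogen ion (here, count += 1)
        while pos < len(template) and COMPLEMENT[template[pos]] == flowed:
            count += 1
            pos += 1
            synthesized.append(flowed)
        pulses.append((flowed, count))  # count 0 = no voltage change
    return pulses, "".join(synthesized)

pulses, strand = sequence_by_flows("TTGACG")
print(pulses[:6])  # [('T', 0), ('A', 2), ('C', 1), ('G', 0), ('T', 1), ('A', 0)]
print(strand)      # AACTGC, the complement read of the template
```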

    To date, Ion Torrent has sequenced the genomes of only a virus and a bacterium. But according to Rothberg, each took about an hour, 100 times faster than some of the next-generation machines now on the market. At this point, the chip-based system cannot do high-volume sequencing, says NHGRI's Jeffery Schloss, “but it has the potential of filling a very important niche” for small-scale, quick-turnaround jobs.

    Rothberg expects to greatly increase the density of the wells and thus the efficiency of sequencing. According to Ion Torrent, its machines will be for sale by the end of the year, priced to put them in reach of many smaller biology labs. “It's one of the best marketed and coolest concepts of where sequencing could potentially go,” says Len Pennacchio of JGI.

    But it wasn't the only concept to create a buzz at Marco Island. Life Technologies in Carlsbad, California, has in the wings a new strategy aimed at sequencing a single DNA molecule—most other technologies must analyze multiple copies of the DNA to be deciphered. Life Technologies' approach involves tethering a nanocrystal semiconductor called a quantum dot to a DNA polymerase. A laser excites the dot, which then transfers energy to fluorescent dye-tagged bases, but only when the polymerase adds a base to the DNA chain being built from the template being sequenced. A camera detects each added base by the color emitted from its dye, immediately translating that data into the sequence of the template.

    At this point, says Joseph Beechem of Life Technologies, the company has tried the technology only on human-made test DNA strands. And the approach is not perfect, as the quantum-dot nanocrystals blink on and off and might miss a base addition, says Mardis. Still, she and others agree that the technology has the potential to decipher long stretches of DNA in a single sweep. Life Technologies' machine might eventually challenge a single-molecule sequencer produced by Pacific Biosciences, based in Menlo Park, California. But the latter has a head start: At the meeting, it unveiled its first commercial machine with much fanfare, including fireworks on the beach.

  9. Stem Cells

    Reprogrammed Cells Come Up Short, for Now

    1. Gretchen Vogel

    Stem cell research offers an ever-shifting battlefield, with vested interests and biologists squabbling over the political, ethical, and scientific merits of different types of cells. Some of the fiercest skirmishes once took place between advocates and opponents of fetal cells. Then along came human embryonic stem cells, opening several fronts, including not-so-civil wars among hES researchers and fans of various adult stem cells. Now two recent papers have dragged the new kid on the block, induced pluripotent stem (iPS) cells, into the fray.

    Those papers offer some of the first side-by-side comparisons of human iPS and hES cells as they differentiate into various kinds of cells. In both papers, researchers report that iPS cells can form desired cell types, but they do so with less efficiency than hES cells. Robert Lanza of Advanced Cell Technology, a biotech company based in Worcester, Massachusetts, who co-authored one of the studies, doesn't mince words about iPS cells: “These cells are pretty screwed up,” he says.

    Not so fast, say other researchers, who contend that not all iPS cells are equal. “The differences are real, but one shouldn't overinterpret them,” says James Thomson of the University of Wisconsin, Madison, who is a co-author of the second paper. “When you go back and tweak the conditions, [iPS cells] seem to have the same potential” as ES cells, he says. The differences, Thomson and others explain, are probably due to imperfections in the reprogramming process that occur when scientists activate several genes to convert a differentiated adult cell into an iPS cell. “There's going to be a lot of noise” in the data as scientists work to diagnose and overcome reprogramming's weak spots, Thomson says.

    The latest stem cell skirmish started on 12 February with an online paper in Stem Cells in which researchers including Lanza and Shi-Jiang Lu of Stem Cell and Regenerative Medicine International, another biotech company based in Worcester, compared the ability of eight human iPS cell lines and 25 hES cell lines to differentiate into several kinds of blood and endothelial cell types. In one test, the hES cells made more than 1000 times more of the desired cells than the iPS cell lines. They also found that, in contrast to cells derived from hES cells, various cell types produced by iPS cells started to undergo cellular aging and programmed death after a short time in culture. Such observations are especially worrisome, Lanza says, as scientists hope to use stem cells in industrial quantities—either for drug testing or for eventual cell therapies. (Most of Advanced Cell Technology's intellectual property portfolio focuses on ES cells and nuclear transfer techniques.)

    In the second study, Su-Chun Zhang, Thomson, and their colleagues at the University of Wisconsin, Madison, compared the differentiation of hES and iPS cells into neuronal cells. The two stem cell types behaved very similarly as they became neurons and glia, expressing the same genes at the same time, the researchers reported online 16 February in the Proceedings of the National Academy of Sciences. And both hES- and iPS-derived cells acted like normal brain cells in lab tests. But more than 90% of the hES cells responded to the chemical recipe for making neural cells, whereas the iPS cells' response was more variable: In one line, only 15% of cells turned into neuronal cells; in another, 79% did.

    Work in progress.

    iPS cells can differentiate into functional neurons (above), but analysis of PAX6 gene expression shows they are less responsive than human ES cells to neuron-making cues (chart).

    CREDITS: B.-Y. HU ET AL., PNAS (ADVANCED ONLINE EDITION) © 2010 NATIONAL ACADEMY OF SCIENCES, U.S.A.

    In contrast to the Wisconsin group's results, Hans Schöler, a stem cell biologist at the Max Planck Institute for Molecular Biomedicine in Münster, Germany, says he and his colleagues have noticed no differences between ES and iPS cells as they differentiated into neural stem cells. But, he adds, a member of his lab did try unsuccessfully for nearly 3 years to prompt murine iPS cells to form a healthy live-born mouse—the ultimate test of pluripotency that mouse ES cells achieve without a problem. Other groups succeeded, but the efficiency was still low, he notes. In addition, several groups have already reported differences in global gene expression between hES and human iPS cells.

    Such observations highlight that cellular reprogramming is still an inexact science. Like Thomson and other stem cell scientists, Schöler thinks that incomplete reprogramming still mars many iPS cells. The first iPS techniques involved using viruses to insert extra copies of reprogramming genes into target cells, but the inserted genes may affect the cells' behavior after reprogramming. Indeed, Lanza says that more-recent studies by his colleagues suggest that cells reprogrammed with newer virus-free techniques are better at differentiating.

    Shinya Yamanaka of Kyoto University in Japan, who was the first to successfully reprogram mature mouse cells into iPS cells, says that he has also observed that the differentiation performances of iPS and hES cells vary from line to line. But his lab has not seen systematic differences between the cell types. He and his colleagues are searching for a way to accurately identify more fully reprogrammed iPS cell lines. He predicts that adding additional factors to the reprogramming mix should produce more dependable iPS cells.

    Clearly, these findings do not settle the debate, says Miodrag Stojkovic of the Prince Felipe Research Centre in Valencia, Spain. “We're all very excited to work with iPS cells,” he says. “But first the science has to determine how they are similar and what is different.”

  10. Psychiatry

    Anything But Child's Play

    1. Greg Miller

    An alternative to juvenile bipolar disorder and a reorganization of autism-related disorders are among the controversial changes proposed for the fifth edition of psychiatrists' bible, the Diagnostic and Statistical Manual of Mental Disorders.

    Just a tantrum?

    Revisions to DSM try to define psychiatric conditions without pathologizing children's normal mood swings.

    CREDIT: LAURENCE MONNERET

    Diagnoses of mental disorders in children and adolescents rose dramatically during the past 2 decades. Juvenile cases of bipolar disorder, once thought to strike only in adulthood, jumped 40-fold between 1993 and 2004 in the United States, according to one widely cited study. Autism estimates leapt from 1 in 1500 to as high as 1 in 90 over a similar time period. Such figures have fueled an intense debate about whether the surge is real or reflects a trend toward overzealous diagnoses and a tendency to pathologize normal youthful behavior.

    Against this backdrop, the clinicians and researchers working on revisions to the psychiatrists' bible, the Diagnostic and Statistical Manual of Mental Disorders (DSM), have been wrestling with how to improve the diagnosis of mental disorders in these age groups. It's not clear how their suggestions, released last month (Science, 12 February, p. 770), would affect the prevalence of mental disorders if adopted, but they are already altering the discussion.

    The most substantial proposals include a reclassification of autism spectrum disorders, a new diagnosis of post-traumatic stress disorder (PTSD) tailored to preschool children, and a brand-new diagnosis called temper dysregulation disorder with dysphoria (TDD) that members of the DSM work group hope will stem what they see as a false epidemic of juvenile bipolar disorder.

    Reaction to the proposed changes for the fifth edition of DSM, slated for publication in 2013, is decidedly mixed. In a recent editorial in Psychiatric Times, Allen Frances, a professor emeritus of psychiatry at Duke University in Durham, North Carolina, branded TDD “a new monster.” Frances led the previous DSM revision and has been a dogged critic of the current revision process. Researchers also disagree about proposals to eliminate several conditions, including Asperger syndrome, and merge them into a single diagnosis of autism spectrum disorder and to stop using IQ scores to grade the severity of intellectual disabilities. “The trouble is that some of these things actually made good sense,” says Fred Volkmar, who directs the Yale Child Study Center.

    Acting out

    For years, researchers have been arguing about the causes and consequences of the explosion in diagnoses of juvenile bipolar disorder (Science, 11 July 2008, p. 193). In the mid-1990s, Harvard University psychiatrists Joseph Biederman and Janet Wozniak proposed that many children diagnosed with conduct disorder or attention-deficit hyperactivity disorder (ADHD) may instead have a juvenile form of bipolar disorder. The idea “took the clinical community by storm,” says Gabrielle Carlson, director of child and adolescent psychiatry at Stony Brook University School of Medicine in New York state, who is not involved in the current revisions. “What you've got is a lot of kids with very severe problems with explosive outbursts, and the categories that we had to explain their behavior were not very satisfying,” Carlson says.

    But as the popularity of the bipolar diagnosis grew, Carlson and others became concerned that too many children were being prescribed antipsychotic and mood-stabilizing drugs with serious side effects and unknown long-term effects on the developing brain. They began to question whether bipolar disorder was really the right diagnosis after all. In adults, bipolar disorder is defined by alternating episodes of depression and elevated mood, often typified by grandiose and reckless behavior. That's not what's seen in most children diagnosed with bipolar disorder, says David Shaffer, a child psychiatrist at Columbia University and a member of the DSM work group on child psychiatry. “What those cases mainly consist of is kids who may or may not be chronically depressed who have intermittent exacerbations when they become extremely irritable and lose their temper,” Shaffer says. “We feel quite confident that these kids are a distinct group … and [that] it is a disservice to them and leads perhaps to inappropriate treatment to call them juvenile bipolar.”

    Instead, Shaffer's work group has proposed TDD, which is characterized by “severe recurrent temper outbursts in response to common stressors” accompanied by dysphoria, or persistently negative mood, such as irritability, anger, or sadness.

    “It makes more sense than calling them bipolar, absolutely,” says Jon McClellan, a psychiatrist at the University of Washington, Seattle, who is not involved with the DSM revisions. For one, McClellan says, it lacks the perceived permanence of a bipolar diagnosis, which in adults carries the expectation of a lifetime of medication and a heightened risk of suicide.

    A shift away from bipolar disorder could lead to improved treatment for some patients, says Carlson. For example, she says she sees many children previously diagnosed with bipolar disorder who have untreated ADHD. That's because many psychiatrists are reluctant to prescribe the stimulants used to treat ADHD for fear of provoking a manic episode, Carlson says. That might change if these children's disorders were reclassified.

    At the same time, Carlson thinks the new diagnosis of TDD would apply only to about one-third of children who currently receive a bipolar diagnosis. In her view, the requirement of dysphoria makes the diagnosis too restrictive. “Most of the kids aren't like that,” Carlson says. “Their mood is fine until they won't do something you ask them to do or won't stop doing something you don't want them to do.” Mani Pavuluri, who runs the pediatric mood disorders clinic at the University of Illinois, Chicago, thinks TDD may capture up to two-thirds of children now diagnosed with bipolar disorder. But she sees the potential for trouble if the diagnosis is applied too broadly. McClellan has similar concerns: “It's part of normal childhood to have some temper tantrums,” he notes.

    Some parents have qualms as well, says Susan Resko, executive director of the Child & Adolescent Bipolar Foundation in Evanston, Illinois. The word “temper” in the name of the disorder reminds some parents of the days when doctors blamed childhood behavioral problems on bad parenting, Resko says: “It conjures images of inept mothers who cannot control their bratty kids.” Her group is lobbying for “mood” or “affect” to be used instead. Whatever it comes to be called, however, Resko says she's hopeful that the new diagnosis will foster more research on these kinds of problems.

    Fewer distinctions

    Also contentious is the proposal for DSM-V to combine autism spectrum disorders. DSM-IV contained separate diagnoses for autism, Asperger syndrome (considered by many to be a milder form of autism), childhood disintegrative disorder, and the catchall diagnosis of pervasive developmental disorder not otherwise specified. These distinctions attempt to “cleave meatloaf at the joints,” the neurodevelopmental work group writes in explaining its rationale on the DSM-V Web site (www.dsm5.org). “Behaviorally, these groups cannot be distinguished … except in terms of severity,” says group member Catherine Lord of the University of Michigan, Ann Arbor.

    The elimination of Asperger syndrome has offended some advocates, who take pride in a diagnosis associated—in popular culture if not clinical fact—with socially awkward but brilliant figures such as Albert Einstein and Andy Warhol. People can continue to identify themselves as having Asperger, says Lord: “We don't want to take that away, but what we're saying is that scientifically it isn't a reliable diagnosis.”

    Lord's group also proposes doing away with using IQ to grade the severity of mental retardation, which would now officially be renamed intellectual disability, terminology long preferred by advocates. As with autism spectrum disorders, the group's rationale was that the existing demarcations artificially carve up what is really a continuum.

    Under the umbrella.

    This boy with autism and others with less severe disorders would fall under one diagnosis in DSM-V.

    PHOTO CREDIT: © ABRAHAM MENASHE INC./PHOTOTAKE/NEWSCOM

    But those distinctions—both in autism and intellectual disability—have practical value for clinicians and researchers, says Volkmar. “You can argue that these [IQ] tests aren't very precise, but the other side of it is that a person with an IQ of 20 has substantially different needs than someone with an IQ of 65,” Volkmar says. He also worries that the elimination of distinctions could complicate research, making it difficult to compare studies that use the new criteria with older studies. Volkmar resigned in frustration from the DSM-V work group on neurodevelopmental disorders last year. “There's an apparent lack of peer-reviewed data to justify these changes,” he says.

    Childhood trauma

    The new PTSD diagnosis for preschool children (age 6 or younger) seems to have sparked less debate. It's based largely on studies led by Michael Scheeringa of Tulane University in New Orleans, Louisiana. “Very young children who are severely traumatized develop lots of post-traumatic symptoms, but very few meet the criteria for PTSD” outlined in DSM-IV, says Charles Zeanah, a child psychiatrist at Tulane who collaborates with Scheeringa and served on the DSM work group for childhood disorders. Adults often describe reliving the traumatic event in flashbacks, for example, but children may not be able to verbalize this, Zeanah says. “What's clear is that they will repeatedly reenact their trauma during play.”

    Similarly, an adult suffering from PTSD following a car accident is likely to exhibit an aversion to driving or riding in a car, but a child may have no choice but to get in the car when told to by his or her parents. The new diagnosis explicitly lists play reenactment as a possible symptom and makes avoidance behavior a possible symptom rather than a requirement, as it is for adults.

    Helen Egger, a child psychiatrist at Duke, supports the new diagnosis. She says epidemiological studies by her group and others suggest that about one-third of very young children with PTSD don't meet the DSM-IV requirements for a PTSD diagnosis but would meet the new criteria. “Of all the changes that are proposed, this is one that has some good evidence backing it up.”

  11. Astronomy

    Unwinding the Milky Way

    1. Yudhijit Bhattacharjee

    For a generation, researchers have sought clues to our galaxy's origins in the rare stars whose compositions most closely approach the purity of the primeval universe.

    CREDIT: NASA/JPL-CALTECH

    As Beatriz Barbuy explains it, she took up astronomy because she couldn't handle philosophy. With a philosopher father and a sociologist mother, Barbuy grew up in São Paulo, Brazil, amid a background hum of intellectual debate. But when she tried to read Hegel as a teenager, the German thinker's ruminations on mind, spirit, and logic made Barbuy's head spin. Instead, Barbuy found inspiration in the Russian cosmologist George Gamow's classic popular-science book One, Two, Three … Infinity. She asked for a telescope for her 16th birthday and took to rising at 4 a.m. to gaze at the stars.

    Growing up in Göttingen, Germany, 2 decades later, Anna Frebel, too, was fascinated by “glowing gas balls up in the sky.” And the same fascination led Daniela Carollo to spend hours perched on the balcony of her house in Turin, Italy, neck craned skyward, at the age of 9. “I simply couldn't stop thinking about stars and galaxies,” says Carollo, who is completing her Ph.D. at Mount Stromlo Observatory in Australia.

    Burning slowly.

    A group led by Anna Frebel (above) has spotted a metal-poor star in the Sculptor satellite galaxy that is about 13 billion years old and resembles stars on our galaxy's fringe.

    CREDIT: COURTESY OF ANNA FREBEL

    Today, all three women are professional astronomers seeking answers to a centuries-old question that has occupied philosophers and physicists alike: How did the Milky Way originate and evolve into what it is today? For decades, researchers have become increasingly convinced that our galaxy did not develop in isolation but rather grew at least in part by pulling in stars and even whole galaxies that formed outside its borders. But which parts of the galaxy are endemic and which exotic? Where did such alien interlopers come from, and when did they arrive?

    For answers, Barbuy, a professor at the University of São Paulo; Frebel, a postdoctoral researcher at Harvard University; and Carollo have devoted themselves to finding and studying stars in the Milky Way that contain vanishingly low quantities of elements heavier than hydrogen and helium. Most of the universe's hydrogen and helium came into being within a few hundred thousand years after the big bang. But heavier elements—sweepingly referred to as “metals”—form by nuclear reactions inside stars and have been steadily building up in the cosmos for billions of years. As a result, astronomers infer that low-metallicity, or “metal-poor,” stars formed in the distant past and thus can serve as a fossil record of the events that shaped our galaxy billions of years ago.
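
    For readers unfamiliar with the convention, astronomers quantify metallicity on a logarithmic scale relative to the sun. The standard definition below is an editorial aside, not part of the original reporting:

        % [Fe/H] compares a star's iron-to-hydrogen abundance ratio with the sun's.
        \[
          [\mathrm{Fe}/\mathrm{H}] \;=\; \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\star}
          \;-\; \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\odot}
        \]
        % On this scale, [Fe/H] = -2 means one-hundredth of the solar iron
        % abundance; the most extreme metal-poor stars known lie below [Fe/H] = -4.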

    In the late 1980s, for example, Barbuy found chemical evidence that metal-poor stars in the Milky Way's halo—the huge, diffuse ball of stars that surrounds and dwarfs the galactic disk—formed in the wake of explosions of massive progenitor stars known as type II supernovae. The work reaffirmed the usefulness of metal-poor stars as a tool for stellar archaeology and prompted a push to find stars of ever-lower metallicities to reach further back in time.

    In recent years, Frebel and her colleagues have done exactly that by discovering metal-poor stars in the Milky Way dating back to within a few hundred million years of the big bang. This week in Nature, Frebel and two co-authors report finding such a star, a slow-burning red giant nearly 13 billion years old, in the Sculptor Dwarf Galaxy—a satellite galaxy to the Milky Way. And in a recent paper published in Astronomy and Astrophysics, Barbuy and colleagues have used metallicity analysis to identify what could be the most ancient globular clusters in the central bulge of the Milky Way.

    From these and other findings, researchers are starting to piece together how our galaxy grew from the assembly of smaller galaxies. “The picture is growing more richly detailed as we go along,” says Jason Tumlinson, a theorist at the Space Telescope Science Institute in Baltimore, Maryland.

    Alien invaders

    The reason astronomers favor the Milky Way galaxy, simply enough, is that they have ringside seats. Although advanced telescopes can see out to the edge of the universe, researchers can get “relatively little information” from those images, notes Timothy Beers, an astronomer at Michigan State University (MSU) in East Lansing. “They'll have colors of stars, maybe some kind of spectral signature,” he says. “In the Milky Way, we can get enough information that we can tell the whole story.”

    That story has been evolving. In the 1960s, when Barbuy was still a schoolgirl stargazing from the top of a plum tree in her parents' garden, astronomers theorized that the Milky Way had originated from the rapid collapse of a gigantic cloud of gas and dust. In this “monolithic collapse model,” the galaxy formed over a few hundred million years more or less in one piece. In 1978, Leonard Searle of the Carnegie Observatories and Robert Zinn of Yale University proposed a different idea: that the Milky Way didn't come together all at once but had gradually “accreted” by the merger of smaller structures—a process that is continuing today. This accretion scenario has become the dominant model of our galaxy's evolution.

    CREDIT: DAVID AGUILAR, CFA

    Barbuy, who was in graduate school when Searle and Zinn's paper appeared, has focused mainly on the history of star formation well inside the Milky Way. For her doctoral research at the University of Paris, she developed a more sensitive method of measuring trace elements in star spectra and used it to determine that most metal-poor stars in the halo have a high ratio of oxygen to iron. In 1987, 5 years after finishing her doctorate, Barbuy confirmed that finding with observations at the European Southern Observatory in Chile. “I was lucky to get to observe for 7 nights in a row,” she says.

    From the high oxygen-to-iron ratio, Barbuy inferred that the metal-poor halo stars had formed from material blasted into space by type II supernovae—the most efficient process for producing oxygen. Type II supernovae are the death throes of stellar objects 10 to 500 times as massive as the sun and made up of almost pure hydrogen and helium. Because the high pressures inside such stars give them extremely short life spans of only a few million years, those precursor stars had to have exploded early in the galaxy's history. Barbuy's result confirmed that metal-poor stars were reliable markers of the past. The work helped nail down a key piece of the Milky Way's evolutionary puzzle: the conditions under which the earliest generations of more lasting, stable stars had formed.

    In the 1990s, Barbuy and colleagues turned their attention to the bulge at the center of the Milky Way. By measuring the oxygen-iron ratio of hundreds of stars, they identified collections of stars known as globular clusters that were more than 10 billion years old. At the time, they were the oldest globular clusters to be discovered in the bulge; Barbuy has since found even older ones. The age of the clusters suggested that the bulge of the Milky Way—or at least parts of it—had formed early on in galactic history.

    New twists in the plot

    Meanwhile, stellar metallicity was helping other researchers probe broader questions of the Milky Way's evolution. A key recent insight has come from the work of Carollo. Carollo came to low-metal stars by chance. After finishing the Italian equivalent of a master's degree in physics, she took a staff position at the Turin Observatory. But she says she felt stifled there and began writing to astronomers at institutes around the world in search of other research projects. When Beers offered her a stint as a visiting researcher in his group at MSU, Carollo jumped at the chance. When she arrived, she says, Beers gave her a data set of some 20,000 stars from the Sloan Digital Sky Survey (SDSS) and said “Work on it”—that is, check how metallicity varied in this population.

    Measuring the velocity and the metallicity of individual stars, Carollo saw a pattern. In the inner parts of the Milky Way's halo—about as far out as the edge of the 100,000-light-year-wide galactic disk—stars of relatively high metallicity traveled around the galactic center clockwise, the same direction as the disk itself. Farther out, however, the most metal-poor stars in the halo were moving counterclockwise. In a 2007 paper in Nature, Carollo, Beers, and their colleagues reported this finding as evidence that the halo has two counter-rotating components—an inner halo and an outer halo—that formed at different times.
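
    As an illustration of the kind of analysis involved, the sketch below is a minimal, hypothetical Python example of splitting a star catalog by metallicity and distance and checking the mean direction of rotation in each group. The input file and column names are invented for this sketch; it is not the team's actual SDSS pipeline.

        # Hypothetical sketch: look for counter-rotation in a halo-star catalog.
        # The file "halo_stars.csv" and its columns ("feh" for [Fe/H],
        # "r_kpc" for galactocentric distance, "v_phi_kms" for azimuthal
        # velocity) are invented for illustration.
        import csv

        def mean_rotation(rows):
            """Mean azimuthal velocity in km/s; the sign gives the rotation direction."""
            v = [float(r["v_phi_kms"]) for r in rows]
            return sum(v) / len(v) if v else 0.0

        with open("halo_stars.csv") as f:  # assumed input file
            stars = list(csv.DictReader(f))

        inner = [s for s in stars if float(s["r_kpc"]) <= 15]  # roughly the disk's extent
        outer = [s for s in stars if float(s["r_kpc"]) > 15]
        metal_poor_outer = [s for s in outer if float(s["feh"]) < -2.0]

        print("inner halo mean v_phi:", mean_rotation(inner))
        print("outer, metal-poor mean v_phi:", mean_rotation(metal_poor_outer))
        # Opposite signs for the two groups would echo the counter-rotating
        # inner/outer halo pattern reported in the 2007 Nature paper.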

    The discovery, which Tumlinson calls a “gem,” gave a fillip to the idea of galactic mergers, which had been gaining ground since Searle and Zinn proposed it 3 decades earlier. “The outer halo is most likely associated with the debris of what are probably ancient dwarf galaxies that have been torn up as they merged with the core of the Milky Way,” Beers says. The researchers say an analysis of a larger sample of stars, in press at the Astrophysical Journal, confirms their interpretation.

    Elemental clues.

    Using metallicity analysis, Beatriz Barbuy has identified ancient globular clusters in the bulge of the Milky Way.

    CREDIT: MICHELINE PELLETIER/FONDATION L'ORÉAL/IAU

    A recent discovery, reported this week in Nature, supports that view of the galaxy's outskirts. Lead author Frebel, a young astronomer with a knack for finding ancient stars, reports that she and colleagues have identified an extremely metal-poor red giant star about 13 billion years old in the Sculptor Dwarf Galaxy. The star, christened S1020549, has a chemical profile very similar to that of extremely metal-poor stars in the outer halo of the Milky Way. “If the outer halo was assembled from accreted dwarf galaxy stars, you would exactly expect such a chemical similarity,” Frebel says.

    Despite such recent advances, researchers still have far to go in piecing together the galaxy's history, notes Roger Cayrel, an astronomer at the Paris Observatory who has had an ultra–metal-poor star named in his honor. “In an accretion assembly scenario, we would like to be able to have a complete list of the events with the epoch of occurrence, the mass and former chemical evolution of each accreted small galaxy, and so on,” he says.

    The answers may lie in the multitude of low-metal candidates identified in the SDSS. “The high-resolution follow-up of SDSS stars is just starting,” Beers says. “My expectation is that the numbers of low-metallicity stars will be greatly expanded.” That should help develop an account of Milky Way formation that's sufficiently rich and precise to make astronomers proud and philosophers envious.

  12. 17TH CONFERENCE ON RETROVIRUSES AND OPPORTUNISTIC INFECTIONS, 16-19 FEBRUARY, SAN FRANCISCO, CA

    The Ins and Outs of HIV

    1. Jon Cohen

    Presentations at the 17th Conference on Retroviruses and Opportunistic Infections challenged the most basic notions of how HIV enters and exits cells. New work indicates that endocytosis may be a more important entry route than direct fusion.

    Researchers first isolated HIV 27 years ago and arguably know more about how it behaves than they do about any other virus. But presentations here challenged the most basic notions of how HIV enters and exits cells.

    As landmark studies first revealed in 1996, HIV initiates an infection by binding to two receptors on the cell surface (Science, 10 May 1996, p. 809). But how that bound HIV then penetrates the cell membrane has remained murky. Many researchers believed that HIV sticks a protein into the cell membrane and then directly fuses with it. But HIV—and other viruses that have an outer coat made from a mishmash of their own proteins and cellular membrane—can enter via endocytosis.

    In endocytosis, the membrane invaginates and pinches off to form an endosome—a bubble of membrane with the virus inside. The virus fuses with the endosome only after the bubble is floating in the cell. Biophysicist Gregory Melikian of the Institute of Human Virology at the University of Maryland School of Medicine in Baltimore presented provocative evidence that HIV primarily relies on the endocytic pathway.

    In test tube studies with HIV and human cells, Melikian introduced a peptide that blocks direct fusion with the cell membrane. If endocytosis were a primary mechanism, viruses already inside endosomes could dodge the peptide. As a control, he used low temperature to block both direct fusion and fusion inside endosomes. Low temperature blocked fusion more potently than the peptide did, indicating that many viruses were inside endosomes and could proceed with the infection process; the peptide's block on direct fusion had no impact on them.

    Self-coated.

    HIV selects specific lipids from membranes as it leaves the human cell.

    CREDIT: MICHAEL F. SUMMERS/HHMI/UMBC

    This doesn't rule out the possibility that some virus enters by direct fusion, but in another experiment, Melikian made movies in which he labeled the virus with a fluorescent protein and recorded the infection process. HIV readily fused with the endosomes and initiated an infection, while the process of direct fusion aborted before the virus could unload its genetic cargo into the cell. In all, these data suggest that “the overwhelming majority of viruses are entering through endocytic pathways,” Melikian concluded. “And with endocytosis, you hide the virus much sooner, narrowing the opportunities for antibodies.”

    Virologist Hans-Georg Kräusslich of the University of Heidelberg in Germany, whose lab first showed the importance of HIV endocytosis in 2004, said Melikian's work “certainly takes it further” but cautioned that it remains controversial whether that mechanism is an exclusive entry route.

    Kräusslich presented data that tackled the other, often ignored end of the process: how HIV exits cells. “People thought the virus simply blebs out,” said Kräusslich. “It doesn't. The virus controls the process.” Over the past 6 years, Kräusslich's lab, along with several other groups—including Eric Freed of the U.S. National Cancer Institute in Frederick, Maryland; Michael Summers of the University of Maryland, Baltimore County; and Wes Sundquist of the University of Utah in Salt Lake City—has shown in exquisite detail the structure and function of HIV's Gag polyprotein, which orchestrates the exit process with help from the cell.

    Cell membranes consist of two sheets of many different types of lipids, including cholesterol. Kräusslich showed that new virions do not—as cartoons used by many in the field often suggest—bleb through the membrane and then randomly dress themselves in a coat made of these cellular lipids. Rather, HIV's Gag selects specific lipids to form the viral coat.

    The lipid bilayer has two microdomains with different degrees of rigidity. The more rigid microdomain is “ordered” into “rafts” of regularly packed lipids that are straight and lined up like soccer players forming a wall to block a penalty kick. The other is “disordered,” with lipids that have kinks in them (see illustration). Gag selects lipids in the rafts—in particular, favoring one called PIP2 for short—or gathers other lipids to form its own rafts.

    Using fluorescence microscopy, Kräusslich's lab showed the process of budding and release, detailing how the virus co-opts cellular proteins to pinch itself off from the cell. “What's remarkable about these new studies is what appeared to be such a simple process is so complex, and it shows how this virus interacts intimately with the machinery of the cell,” said molecular virologist Nathaniel Landau of New York University School of Medicine in New York City. “It's one of the major advances in understanding HIV molecular biology.”

  13. 17TH CONFERENCE ON RETROVIRUSES AND OPPORTUNISTIC INFECTIONS, 16-19 FEBRUARY, SAN FRANCISCO, CA

    Treatment as Prevention

    1. Jon Cohen

    At the 17th Conference on Retroviruses and Opportunistic Infections, two groups presented some of the firmest data yet to support the concept of testing everyone for HIV and immediately starting all infected people on treatment.

    An ambitious idea to slow the HIV/AIDS epidemic is gaining traction: Test everyone for the virus and immediately start all HIV-infected people on treatment. But the test-and-treat scheme has epidemic modelers battling it out, with some insisting it's feasible, both financially and practically, and others denouncing it as a pipe dream and warning that it could increase drug resistance. At the meeting, two groups presented some of the firmest data yet to support the concept.

    Although it stands to reason that people whose viral burden (the “viral load”) is reduced by drugs become less likely to transmit HIV, population data supporting the idea have until now remained sparse. Deborah Donnell of the Fred Hutchinson Cancer Research Center in Seattle, Washington, described a study of 3381 “discordant” heterosexual couples in seven sub-Saharan African countries in which only one partner was infected with HIV at the outset. Over 2 years, Donnell and co-workers analyzed 103 new infections in which they could prove through genetic sequencing that the infecting virus came from the person's long-term partner. Of these, only one person who became infected had a partner who was receiving anti-HIV drugs; overall, treatment reduced the risk of transmission by 92%. “I think it's the single most important presentation here,” said virologist Mark Wainberg of McGill University in Montreal, Canada.
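
    To make the arithmetic behind the idea concrete, the toy Python calculation below shows how raising treatment coverage would cut new infections if treatment reduces transmission risk by the 92% reported above. It is an invented illustration, not any group's published epidemic model, and every other number in it is made up:

        # Toy one-step mixing model: expected new infections as treatment
        # coverage rises. All parameters except the 92% efficacy figure
        # quoted in the article are invented for illustration.
        def new_infections(susceptible, infected, coverage,
                           base_rate=0.10, treatment_efficacy=0.92):
            """Expected new infections in one time step of a toy model."""
            untreated = infected * (1 - coverage)
            treated = infected * coverage
            effective = untreated + treated * (1 - treatment_efficacy)
            return base_rate * susceptible * effective / (susceptible + infected)

        for cov in (0.0, 0.5, 0.9):
            print(f"coverage {cov:.0%}: "
                  f"{new_infections(10_000, 1_000, cov):.0f} expected new infections")
        # Output falls from about 91 to about 16 as coverage rises to 90%,
        # the qualitative pattern that test-and-treat proponents predict.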

    Others cautioned that treatment might not have as powerful an effect with other transmission routes. But Julio Montaner, a key proponent of test-and-treat who directs the B.C. Centre for Excellence in HIV/AIDS at the University of British Columbia, Vancouver, in Canada, presented compelling data from one of the most susceptible groups, injecting drug users (IDUs). Test-and-treat is easier to implement and to track in Canada, he added, because the country's national medical system combines central funding with local control.

    All together now.

    As more people received highly active antiretroviral treatment (HAART) in British Columbia, new infections plummeted, even in IDUs.

    CREDIT: JULIO MONTANER/BC CENTER FOR EXCELLENCE IN HIV/AIDS, MODIFIED FROM MONTANER ET AL., CROI 2010

    Earlier, Montaner's group showed that as 2500 people started antiretroviral drugs (ARVs) between 1996 and 1999, new infections in their steadily expanding testing program dropped by 50%. The analysis presented here examined data from 2004 to 2009, when the number of treated people doubled to about 5000; many were IDUs. New infections fell, and although the drop wasn't as dramatic as before, there was a concomitant decline in viral load in the treated people. A subset analysis of IDUs also showed a particularly sharp reduction in new infections.

    Several studies around the world will soon begin to evaluate the test-and-treat strategy. “It's a very challenging concept,” said Anthony Fauci, head of the U.S. National Institute of Allergy and Infectious Diseases (NIAID) in Bethesda, Maryland, who described obstacles, including the feasibility of conducting widespread, voluntary testing. But NIAID is funding a study of the concept at two U.S. sites that will start in June. “It's a bold, high-risk but high-return project that we are going to push the envelope on,” said Fauci.

  14. 17TH CONFERENCE ON RETROVIRUSES AND OPPORTUNISTIC INFECTIONS, 16-19 FEBRUARY, SAN FRANCISCO, CA

    Limits of Success

    1. Jon Cohen

    Worldwide, an estimated 480,000 babies became infected with HIV in 2008. Work presented at the 17th Conference on Retroviruses and Opportunistic Infections showed that a mere 21% of pregnant women received an HIV test, that only 45% of those who tested positive received drugs to prevent infection, and that such treatment was often suboptimal.

    Upgrade time.

    Single-dose nevirapine, now deemed suboptimal, was celebrated when this makeshift clinic in Kampala, Uganda, proved it could protect many babies.

    CREDIT: MALCOLM LINTON

    Huge disparities in access to proven methods to thwart HIV still exist between rich and poor countries. Prevention of mother-to-child transmission (PMTCT) efforts are a case in point, explained pediatrician Elaine Abrams of Columbia University. In wealthy countries, where HIV-infected pregnant women receive cocktails of antiretroviral drugs (ARVs) and do not breastfeed, fewer than 2% transmit the virus to their babies. That's down from rates as high as 40% among women who receive no treatment and breastfeed. “New pediatric infections have virtually been eliminated,” said Abrams. “In contrast, the pediatric epidemic rages overseas.” According to the best estimates, 480,000 babies worldwide became infected in 2008, with a mere 21% of pregnant women receiving an HIV test and only 45% of those who tested positive receiving drugs to prevent infection—and that treatment was often suboptimal.

    The most commonly used intervention for PMTCT in developing countries is a single dose of the drug nevirapine given to the mother in labor and the baby at birth. This strategy, which first proved its worth in a Ugandan study that ended in 1999, is cheap and simple and cuts transmission rates in half—and, before the arrival of cheap ARVs, it was the only option for many poor women. But, said Abrams, abundant data now show the dangers of the “overreliance on single-dose nevirapine.” Not only are cocktails of ARVs more effective at preventing transmission, but as many as 50% of pregnant, infected women have suffered severe immune destruction and need combination treatment for their own health. What's more, the single dose of nevirapine fuels the emergence of resistant strains, compromising the ability of mothers and their babies, if they do become infected, to benefit from that entire class of drugs—a key component of cocktails in developing countries.

    Some countries now add another ARV or two for a short time before labor and during breastfeeding to reduce the risk of resistance emerging, and even when resistance does develop, it wanes over about a year's time. Still, the different standards of care for poor and rich are unacceptable, many researchers said at the meeting. “Our responsibility is to come up with consistent, across-the-board ARV recommendations,” said pediatrician Arthur Ammann, who heads Global Strategies for HIV Prevention, a nonprofit based in San Rafael, California, that does PMTCT in the Democratic Republic of the Congo and Liberia.

    Catherine Wilfert, a pediatrician emeritus at Duke University in Durham, North Carolina, stressed that the costs of ARVs are not the main roadblock. “The real funding challenge is the training,” said Wilfert.

    But if single-dose nevirapine is the only option available, it remains much better than doing nothing, said infectious disease specialist Nicholas Hellman, who heads medical and scientific affairs at the Elizabeth Glaser Pediatric AIDS Foundation: “We have to be careful not to throw out the baby with the bath water.”
