News this Week

Science  15 Sep 2006:
Vol. 313, Issue 5793, pp. 1550



    Endgame for the U.S.-Russian Nuclear Cities Program

    Richard Stone and Eli Kintisch

    The United States and Russia seem ready to pull the plug on an 8-year-old effort to help steer Russian nuclear weapons scientists into civilian work. The joint Nuclear Cities Initiative (NCI) has been on life support for 3 years. It's likely to die next week, NCI proponents say—undercutting efforts to help Russia shrink its massive nuclear complex and bottle up its expertise.

    Other U.S.-funded programs employing Russian weapons scientists will continue, but they “are grossly insufficient” to help Russia deal with the problems ahead, says Matthew Bunn, a nonproliferation expert at Harvard University's Belfer Center for Science and International Affairs. And while NCI appears to be out of steam, a much-discussed European NCI has never even left the depot. “The real issue is the slow abandonment of [weapons] scientist programs by the U.S. and Europe,” says Kenneth Luongo, executive director of the Russian-American Nuclear Security Advisory Council, a think tank in Washington, D.C., and Moscow. NCI “bought the United States a seat at the table for discussions of these cities' futures,” adds Bunn. But getting dramatic results would require a “significantly bigger effort.”

    Russia's design labs and factories for fabricating nuclear fuel and warheads are dispersed in 10 closed cities that employ some 75,000 people on weapons-related work (see map). After the collapse of the Soviet Union in 1991, several U.S. agencies began assisting Russia with money for specific projects, including safeguarding uranium and plutonium stockpiles from nuclear traffickers and providing grants to reduce the temptation for scientists to work in countries such as Iran or North Korea. The U.S. Department of Energy (DOE) augmented these efforts with NCI after the ruble's collapse in 1998, when scientists were in dire straits. It was the first U.S. program specifically aimed at helping Russia downsize its nuclear complex.

    Scratching the surface.

    NCI created hundreds of jobs for scientists in three of Russia's 10 nuclear cities, but thousands may soon be out of work.


    NCI was controversial from the start, however. Critics in Congress and in the Bush Administration argued that it bankrolled middling scientists, freeing Russia to focus resources on the best weapons designers. Proponents responded that because Russia's reservoir of nuclear talent runs so deep, it's worth engaging even second- and third-tier scientists. Russia has 2000 to 3000 scientists with nuclear bomb-making skills and as many as 15,000 more who could aid a hostile weapons program, the nonprofit Nuclear Threat Initiative estimated in a report, Securing the Bomb 2005: The New Global Imperatives.

    By helping Russia “ease its nuclear cities on the path to sustainable civilian work,” says Bunn, NCI has “reduced the danger that experts from some of these cities would … sell their weapons-related knowledge.” All told, DOE claims that NCI's roughly $110 million war chest over 8 years created 1600 civilian jobs in three cities—Sarov and Snezhinsk, which specialize in nuclear weapons design, and Zheleznogorsk, a plutonium production town in southern Siberia—and drew in $63 million from outside sources.

    Even if NCI disappears, some of its projects will continue under allied programs, such as DOE's Initiatives for Proliferation Prevention and the multilateral International Science and Technology Center. Bryan Wilkes, a spokesperson for DOE's National Nuclear Security Administration, which oversees NCI, said work with Russian scientists will continue; another NNSA official notes, however, that total assistance will decrease by about $10 million per year. But a number of projects have already ended. One casualty is a trove of data on everything in the nuclear cities from joint ventures to crime rates. The information, published in a quarterly bulletin by Sarov's Analytical Center for Nonproliferation, is “invaluable,” says Luongo. “In the Cold War, we would have paid billions for it.” NCI ended support for the bulletin last February.

    Analysts blame both governments for NCI's slow death. The Russia-U.S. NCI pact lapsed in 2003 after the two countries failed to agree on liability and tax issues. A provision gave NCI a 3-year grace period to wind down projects while negotiations on resurrecting it went on. “It was a good program. We wished it to continue, but the bureaucrats killed it,” says one scientist at Sarov who was not authorized to speak to the press and asked to remain anonymous.

    NCI's threatened termination comes at a critical time for people who live and work in the once-top-secret nuclear enclaves. Earlier this year, Russia's federal government ended subsidies to the closed cities, leaving gaping budgetary holes. According to Securing the Bomb, the mayor of Zheleznogorsk recently warned: “We have no idea at all how the budget will be filled. … A starving operator of a nuclear power unit is more dangerous than any terrorist.” Layoffs of “many thousands of people” are expected in the coming decade with the closure of plutonium production and reprocessing facilities in Zheleznogorsk and Seversk, Bunn says. In its final months, NCI had stepped up activities in those two cities to help cushion the blow for unemployed scientists. Now it too finds itself out of a job.


    Claim of Oldest New World Writing Excites Archaeologists

    Andrew Lawler

    A stone block uncovered in a Mexican quarry provides dramatic evidence that the ancient Olmec people developed a writing system as early as 900 B.C.E., according to seven Mesoamerican scholars writing in this week's issue of Science (p. 1610). That makes the block's 62-sign inscription by far the oldest writing discovered in the New World and hints at surprising complexity in a culture that may have laid the foundation for the Mayan and Aztec empires encountered by the Spanish a millennium and a half later. “It's a jaw-dropping find,” says Brown University anthropologist and co-author Stephen Houston. “It takes this civilization to a different level.”

    Heady find.

    The Cascajal block shows signs of early script among the Olmec, who also left behind large stone heads and monumental buildings.


    Other specialists agree. “This is an exciting discovery of great significance,” says anthropologist Mary Pohl of Florida State University in Tallahassee. Even skeptics say they are convinced that the signs represent true script. But controversy remains over the block's dating and implications. And the inscription—which can't yet be read and seems unrelated to later Mesoamerican scripts—is unlikely to resolve the heated debate over whether the Olmec were the dominant culture of their time or one of many societies that shaped Mesoamerica.

    The Olmec civilization appeared on the coast of the Gulf of Mexico around 1200 B.C.E. and quickly flourished thanks to rich soils and high rainfall that allowed intensive maize production. The first center, San Lorenzo Tenochtitlán, was abandoned about 900 B.C.E. just as another one at nearby La Venta arose. By 400 B.C.E., the Olmec culture had largely vanished. During that half-millennium, Olmec fashions spread around Mesoamerica, although the extent of their influence remains contentious. Along with creating a sophisticated calendar, the Olmec carved glyphs as early as the San Lorenzo phase. Later glyphs found during the La Venta period provide more extensive evidence of iconography, but scholars are divided over whether those could be classified as writing (Science, 6 December 2002, p. 1872).

    Road builders quarrying fill from an ancient mound at Cascajal, outside San Lorenzo, found the new block with pottery fragments and figurines. The local authority on cultural materials stored the objects in his home and alerted the paper's first two authors, anthropologists Maria del Carmen Rodriguez Martinez and Ponciano Ortiz Ceballos of the Centro del Instituto Nacional de Antropología e Historia. The block was then examined by the entire team this spring. Chemical analysis shows an ancient patina in the stone's incisions, which were made with a blunted blade to make outlines and a sharper one to make cuts within the signs.

    The authors argue that the block is roughly the same age as the artifacts found with it, which they say date to the latter part of the San Lorenzo phase; they also note that the site is close to San Lorenzo itself. “There is quite a good deal of evidence on the probable context,” says Pohl, who accepts the conclusion. But those claims don't wash for some other researchers, who note that all of the artifacts were found out of context. “Once I owned a home near to Lincoln's log cabin, but that proximity didn't date my house to the same period,” says David Grove, an emeritus anthropologist at the University of Florida, Gainesville. “Likewise, the literally mixed bag of shards kept by village authorities doesn't help at all to date the piece.”

    Adds John Clark, an anthropologist at Brigham Young University in Provo, Utah: “Is the block associated with San Lorenzo or La Venta? We can't answer that definitively.” Like Grove, he favors a later date, when Olmec glyphs became more common.

    Whatever the date, he and Grove agree that the inscription qualifies as writing and so is a dramatic find. A few of the signs are repeated, and there is a pattern of variable as well as short and repeated sequences. “The Cascajal block conforms to all expectations of writing,” the authors say. They argue that such sophistication reveals “a new complexity to this civilization.” Houston goes a step further, saying, “We're looking, possibly, at the glimmerings of an early empire.”

    The script's influence on later systems is unclear, however. The text runs horizontally rather than vertically as in later Mesoamerican scripts. Nor can the writing be linked with a later writing system, Isthmian, which emerged around 500 B.C.E. and has radically different signs. Nevertheless, the authors conclude that “the clear linkage of the script to the widely diffused signs of Olmec iconography” argues in favor of a widespread system that died out before others appeared in succeeding centuries—perhaps as happened to one of the world's first writing systems, the Indus script, which vanished shortly after 2000 B.C.E.

    Like Indus script, the newly discovered Olmec writing remains undeciphered. “We would need a Rosetta stone,” says Houston. Clark hopes that the Cascajal block will encourage researchers to go back to the site. “Now we need to dig some control pits and do some real archaeology,” he says.


    Space Mission to Shine a Light on Solar Flares

    Dennis Normile

    TOKYO—Solar flares and coronal mass ejections, the most powerful explosions in our solar system, periodically blitz Earth with charged particles that can disrupt radio signals, fry satellite electronics, and threaten the health of astronauts who find themselves outside our planet's sheltering ionosphere. Yet these phenomena are little understood. Scientists don't know, for example, what generates the magnetic energy thought to power solar flares, what triggers the energy's release, or even whether solar flares pop up all over the sun's surface or just in certain regions.

    Solar-B, a spacecraft set for launch from Japan's Uchinoura Space Center on 23 September, “is designed to answer these questions,” says John Davis, a Solar-B project scientist at NASA's Marshall Space Flight Center in Huntsville, Alabama. A better understanding of solar processes, he says, “could have a broad impact on physics.”

    Sun spotter.

    Solar-B promises the finest look yet at magnetic processes that produce flares and coronal mass ejections.


    Solar-B is an encore to Yohkoh, the first spacecraft to observe a solar flare's highly energetic x-rays, which are obscured from land-based telescopes by Earth's atmosphere. Spiro Antiochos, an astrophysicist at the U.S. Naval Research Laboratory in Washington, D.C., says that Yohkoh, launched in 1991, provided “the first definitive observations” connecting solar flares to magnetic reconnection, in which magnetic fields generated deep within the sun suddenly break apart and reform, releasing massive amounts of energy. This energy heats the corona and accelerates electrons, protons, and heavier ions into space, forming solar flares. Yohkoh, however, was unable to link specific magnetic field structures to solar flares.

    Solar-B should fill in the gaps. “What we expect from Solar-B is to clearly identify a specific magnetic field motion and a specific type of magnetic field appearing on the sun's surface and the coronal response,” says Takeo Kosugi, project manager for the Institute of Space and Astronautical Science in Sagamihara, which is reprising its Yohkoh partnership with NASA and the U.K.'s Particle Physics and Astronomy Research Council on the mission.

    Three telescopes onboard the $210 million spacecraft will help achieve this precision. Its optical scope, with a 0.5-meter mirror—the largest of its kind put in space to observe the sun—will be able to resolve solar features as small as 150 kilometers across and will have a vector magnetograph that determines the polarization of magnetic fields. An improved x-ray telescope will provide higher resolution images of flares and other phenomena than Yohkoh could manage and will measure temperatures exceeding 10 million kelvin—a first. And an extreme ultraviolet imaging spectrometer will observe solar plasma, helping relate the movement of hot gases in the corona to the underlying magnetic fields. Solar-B will provide round-the-clock observations for 8 months a year over a planned mission lifetime of 3 years.
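    The 150-kilometer figure is roughly what a diffraction-limited 0.5-meter mirror can resolve on the sun. As a back-of-the-envelope check (the ~400-nanometer observing wavelength is an assumption for illustration, not a detail from the article):

```python
# Rough diffraction-limit check for Solar-B's 0.5-meter optical telescope.
# Rayleigh criterion for a circular aperture: theta ~ 1.22 * wavelength / D.
wavelength = 400e-9            # meters; assumed blue/near-UV observing band
mirror_diameter = 0.5          # meters (from the article)
earth_sun_distance = 1.496e11  # meters (1 astronomical unit)

theta = 1.22 * wavelength / mirror_diameter  # angular resolution, radians
feature_size = theta * earth_sun_distance    # smallest resolvable solar feature, meters

print(f"angular resolution: {theta:.2e} rad")
print(f"smallest resolvable feature: {feature_size / 1000:.0f} km")
```

    The result comes out near 150 kilometers, consistent with the resolution quoted for the mission.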

    Although Solar-B's primary objective is to unravel basic solar processes, there could be practical payoffs as well. Davis says that NASA hopes to develop an ability to predict flares and coronal mass ejections before they occur—and even better, when they won't occur. The latter knowledge would allow the agency to designate “safe periods” of hours or days for astronauts to venture out on spacewalks. And that could take the sting out of a very nasty solar punch.


    Extensively Drug-Resistant TB Gets Foothold in South Africa

    Jon Cohen

    An outbreak of what's called “extensively drug-resistant tuberculosis,” or XDR TB, in KwaZulu-Natal Province, South Africa, appears to be nearly twice as large as originally reported. At a meeting of international public health officials held in Johannesburg last week to discuss what were then 53 cases of the highly lethal tuberculosis at one health-care center, South African researchers reported that they now had identified a total of 102 cases at 28 hospitals. “It is extremely worrying, and WHO [the World Health Organization] is responding very proactively,” says Paul Nunn, who coordinates the TB/HIV and Drug Resistance Unit for the Stop TB Department at WHO, one of the meeting's co-organizers. “XDR threatens the significant gains made in the last 15 years in TB control globally.”


    New analyses have discovered XDR TB cases at 28 hospitals in KwaZulu-Natal.

    The South African outbreak of XDR TB is the largest ever reported. WHO and the U.S. Centers for Disease Control and Prevention, another meeting co-organizer, first described XDR TB in the 24 March Morbidity and Mortality Weekly Report (MMWR). The researchers defined XDR TB as being resistant to the two widely used first-line drugs and three of the six main classes of second-line treatment. (Multidrug-resistant TB, by contrast, does not respond to first-line drugs.) At the time, the researchers had identified 347 XDR TB cases worldwide, and only one was in Africa. In the 53 cases in KwaZulu-Natal, first reported publicly in August at the international AIDS conference (Science, 25 August, p. 1030), the average time to death was 16 days after a sputum sample was taken. “This is about as fatal as you can get,” says Nunn.

    In every one of the 102 cases checked so far, the patient has also been infected with HIV. TB is a leading killer of people with AIDS, and some evidence suggests that TB transmits more easily in HIV-infected people. TB also complicates treatment of HIV.

    Resistance to TB drugs typically develops when people do not finish their full course of medication or receive drugs that have limited potency. Preliminary data suggest that many of the XDR TB patients are infected with the KZN strain that first surfaced in KwaZulu-Natal in 1995 as a major source of multidrug resistance. Says WHO epidemiologist Abigail Wright: “It's very worrying when you see dominant families [of Mycobacterium tuberculosis] becoming extensively resistant.”

    Wright, who co-authored the MMWR report, stresses that no one yet has a good handle on the global prevalence of XDR TB. Nunn notes that South Africa is more developed than many of its neighbors and has a better detection system. “The same kind of outbreaks in more isolated conditions might pertain,” says Nunn. “What's going on in Zambia and Zimbabwe? We really need to know.”


    Campaign Heats Up for WHO Director-General

    Gretchen Vogel*

    *With reporting by Jon Cohen, Martin Enserink, and Eliot Marshall.

    In what promises to be an unusually hard-fought and public race, 13 candidates are competing to be the next director-general of the World Health Organization (WHO). The surfeit of contenders means that the process will be even less predictable than usual, say observers, but early signs are that it also may be more open than previous campaigns.

    In November, the WHO executive board will choose a successor to Director-General Jong Wook Lee, who died suddenly of a stroke in May, only 3 years into his 5-year term. After a flurry of last-minute nominations before the 5 September deadline, the list includes four candidates from Europe, three from the Middle East, three from Asia, one from Africa, and two from Latin America. Early front-runners include Mexican health minister Julio Frenk and two insiders: bird flu czar Margaret Chan of Hong Kong, WHO's assistant director-general for communicable diseases, and Shigeru Omi, the Japanese head of WHO's Western Pacific Division. Pascoal Mocumbi, former prime minister of Mozambique, and Bernard Kouchner of France, co-founder of Doctors Without Borders, may also gather strong support.

    The campaign is already intense. Frenk has already launched a Web site outlining his goals and priorities. He told Science that his experience reforming Mexico's health system makes him the strongest candidate. “Being minister of health of a large developing country is probably the best hands-on training you can have,” he said. Chan told journalists last week that she was confident she would win the nomination. Some WHO watchers speculate that Mocumbi will be a strong candidate among countries advocating for WHO's first African leader.

    Such posturing is a healthy sign, says Christopher Murray of Harvard School of Public Health in Boston. WHO's selection process is frequently criticized for being too influenced by behind-the-scenes diplomatic deals. The Web sites and statements are aimed at the broader public health community instead of the politicians and diplomats, Murray says. “If it does influence the race, that's a very good thing,” he says.

    One of the key questions facing the next director is where the organization fits among the other new influences in global health, says international health expert Gerald Keusch of Boston University. “Can WHO play in the same sandbox with the Gates Foundation? It's not going to have a $60 billion endowment to work with, so it's got to have something on the intellectual, political, and ethical scene to contribute—and be willing to be a partner.”


    Ground the Planes During a Flu Pandemic? Studies Disagree

    Martin Enserink

    By scouring mortality data from 121 cities across the United States, Harvard researchers have found footprints of 9/11 that they say should guide policy during an influenza pandemic. The decline in air travel in the months after the terrorist attacks delayed the annual flu season in the United States by almost 2 weeks, they conclude—a finding that suggests that a flu pandemic, too, could be slowed down, perhaps by months. But researchers who have studied the same question using computer models—and found closing down airports to be less useful—are skeptical.


    The decline in air travel after 9/11 delayed the U.S. flu season by almost 2 weeks, a new study says.


    The 2003 outbreak of SARS drove home the widely held belief that global mobility helps spread infections; indeed, it's almost a cliché among researchers to say that the most important disease vector today is the Boeing 747. But air-travel restriction won't help slow a flu pandemic much, three model studies concluded earlier this year—especially when compared to the judicious use of vaccines, antiviral drugs, isolation, and quarantine.

    In a paper published in July in Nature, for instance, Neil Ferguson of Imperial College London and his colleagues tested how the United States and the United Kingdom might best mitigate a pandemic's ravages. They found that unless they are 99% effective, border controls and internal travel restrictions won't slow viral spread by more than 2 or 3 weeks. Ben Cooper and his colleagues at the U.K. Health Protection Agency, who modeled air travel around the world in a June paper in PLoS Medicine, also found limits “of surprisingly little value.” The reason, says Ferguson, is that flu spreads extraordinarily rapidly.

    But in the real world, the 27% reduction in international air-travel volume after 9/11 appears to have caused a 13-day delay in the 2001–02 influenza season—considerably more than the models would predict, say John Brownstein and Kenneth Mandl of Children's Hospital Boston and Harvard Medical School in a paper released on 11 September by PLoS Medicine. Analyzing data from 1996 to 2005, they also found a correlation between higher air-travel volumes in the fall and a slightly earlier flu season. Extrapolations suggest that a full-blown travel ban, as opposed to the post-9/11 slump, might delay a flu pandemic by as much as 2 months, says Brownstein—precious time to activate countermeasures and work on a vaccine.
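    Brownstein's extrapolation can be sketched as a simple linear scaling of the observed 9/11 effect (a naive assumption made here only for illustration; the published analysis uses more elaborate statistics):

```python
# Naive linear scaling of the post-9/11 travel slump to a full travel ban.
observed_reduction = 0.27  # 27% drop in international air-travel volume
observed_delay_days = 13   # observed delay of the 2001-02 flu season

# Assume the delay scales linearly with the fraction of travel removed.
delay_per_unit_reduction = observed_delay_days / observed_reduction
full_ban_delay_days = delay_per_unit_reduction * 1.0  # 100% reduction

print(f"extrapolated delay for a full ban: {full_ban_delay_days:.0f} days "
      f"(~{full_ban_delay_days / 30:.1f} months)")
```

    Even this crude scaling yields a delay of roughly a month and a half, the same order as the "as much as 2 months" figure Brownstein cites.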

    The modelers aren't convinced, however. Ferguson says there is no proof that the relation between travel and timing of the flu season is causal, and he questions the team's use of a complex statistical measure to determine the timing of the peak. Although the study is “very nice,” the 9/11 effect “is an n of 1; it's intriguing, but you can't draw any conclusions,” says Ira Longini of the University of Washington, Seattle, who co-authored a paper in the Proceedings of the National Academy of Sciences in April that also concluded that travel bans had little value.

    Brownstein suspects that some of the criticism may stem from the contradiction between his data and the models. “They are making assumptions about the relationship between air travel and the spread of influenza,” he says. “But this is empirical evidence.”

    Although some countries' pandemic preparedness plans list travel bans as an option, Ferguson says most governments that have studied the idea seriously have rejected it. The World Health Organization's (WHO's) Global Influenza Preparedness Plan does not recommend travel bans because enforcement “is considered impractical,” but a footnote adds that they “could be considered as an emergency measure to avert or delay a pandemic.” WHO spokesperson Gregory Hartl says the new study is “very interesting” and “opens up the debate again.”


    Poplar Tree Sequence Yields Genome Double Take

    Erik Stokstad

    Black cottonwoods are the lab rats of the tree world. It's relatively easy to add or knock out genes, and like other members of the poplar genus, they grow quickly enough that researchers can check the outcome of some experiments in less than a year. Foresters love poplars too: Their fast growth rate makes them a good source of fiber for paper, lumber, plywood—and a possible source of biofuels. All these reasons motivated more than 100 researchers to sequence the tree's genome.


    The genome of black cottonwood should provide insights into how to improve commercial varieties of it and other poplars.


    On page 1596, the team, led by Gerald Tuskan of Oak Ridge National Laboratory (ORNL) in Tennessee and Daniel Rokhsar of the Joint Genome Institute (JGI) in Walnut Creek, California, describes its first analysis of the more than 45,000 likely genes in black cottonwoods (Populus trichocarpa). The group has begun to sketch out the evolutionary history of Populus, finding, for example, that a doubling of the genome about 65 million years ago freed up many genes to acquire functions important for trees, such as wood formation.

    Cottonwood is the first tree and the third plant genome to be sequenced, coming after the herbaceous annual Arabidopsis and rice. The bulk of the sequencing was done at JGI and ORNL, with researchers around the world contributing genetic markers—such as 324,000 expressed sequence tags—which aided in the search for genes. Four groups then independently trained computer algorithms to search for coding sequences, and they all agreed on 45,555 likely nuclear genes.

    By comparing the new sequence to that of Arabidopsis and sections from other plants, the team determined that the ancestral genome of poplars had been duplicated at least three times: first, at the base of all angiosperms, then about 100 million to 120 million years ago, and most recently 60 million to 65 million years ago. “The genome sequence shows this incredibly complicated evolution, full of diversity,” says Gail Taylor of the University of Southampton, U.K., who is not an author. “It's like an Aladdin's cave.” Similar doublings also occurred in rice and Arabidopsis, so they appear to be widespread among plants, Tuskan says.

    Genome duplications offer new grist for natural selection because a second copy of a gene can evolve a new function. Although the Populus genome has lost some of its extra copies, it retained others that might be particularly useful for fending off pathogens, synthesizing lignin and cellulose, transporting metabolites, and bringing about programmed cell death (which may be important for seasonal growth and autumnal senescence).

    The next step is to figure out what more of the genes do—half have no known function—by creating mutants with genes that are under- or overexpressed. “There will be thousands of new functions that were not known or fully appreciated in other species,” predicts Steven Strauss of Oregon State University in Corvallis. This will help lead to the development of new varieties of poplars that might have longer growing seasons or pack on more biomass. It could also have payoffs for ecologists, clarifying the keystone role of poplars in riparian and other ecosystems. “There's a whole new area of science opening up,” Taylor says.


    Pulsars' Gyrations Confirm Einstein's Theory

    Adrian Cho

    Comparing a pair of massive stellar clocks known as pulsars, an international team of astronomers has put Einstein's theory of gravity to its toughest test yet. Published online by Science this week, the results show that the theory of general relativity (GR) is accurate to within 0.05%, even in the ultrastrong gravity of a pulsar, a spinning neutron star measuring roughly 20 kilometers wide but weighing more than the sun. Further observations could enable researchers to peek into the structure of neutron stars, the hearts of which may contain a bizarre form of nuclear matter that flows without resistance.


    Taking the pulses of two pulsars as they whiz around each other, astronomers have determined their orbit and tested general relativity.


    Most physicists agree that GR cannot be the last word on gravity because it clashes with quantum mechanics. The new observation limits the possibilities for tinkering with GR, says Joseph Taylor, a physicist at Princeton University. “They're tightening the constraints on any alternative to Einstein's theory,” he says.

    According to GR, matter and energy warp space and time, making free-falling objects travel along curved paths and producing the effects we call gravity. Einstein specified a particular mathematical connection between the density of matter and energy and the curving of spacetime. To test the theory, a team led by astronomer Michael Kramer of the Jodrell Bank Observatory in Macclesfield, U.K., studied a unique astronomical object: a pair of pulsars 2000 light-years away that orbit each other at a distance of just a million kilometers (Science, 9 January 2004, p. 153).

    Spinning like a lighthouse beacon, a pulsar beams radio waves into space, creating a pulsing signal that's nearly as steady as an atomic clock. If a pulsar orbits another object, the rate of pulsing rises and falls repeatedly as the pulsar speeds alternately toward and then away from Earth. By tracking the variations in the rates of both pulsars from April 2003 to January 2006, the researchers deduced the details of their orbit, such as the length of its elliptical shape, the rate at which the ellipse rotates, and how the orbit is tilted relative to the line from the pulsar to Earth.
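    The pulse-rate variation the team tracks is essentially a Doppler shift: as the pulsar's line-of-sight velocity swings between approach and recession, the apparent pulse frequency rises and falls. A minimal sketch for an idealized circular, edge-on orbit (the 44-pulses-per-second spin rate, 300 km/s orbital speed, and ~2.4-hour period are rough illustrative values, not figures from the article):

```python
import math

C = 3.0e5  # speed of light, km/s

def observed_pulse_freq(t, f0=44.0, v_orb=300.0, period_s=8800.0):
    """Apparent pulse frequency (Hz) of a pulsar on an idealized
    circular, edge-on orbit: classical Doppler shift of the spin rate."""
    v_radial = v_orb * math.sin(2 * math.pi * t / period_s)  # km/s, away from Earth
    return f0 / (1 + v_radial / C)

# Sample one full orbit: the pulse rate swings between a minimum
# (pulsar receding) and a maximum (pulsar approaching).
freqs = [observed_pulse_freq(t) for t in range(0, 8800, 10)]
print(f"pulse frequency range: {min(freqs):.3f} - {max(freqs):.3f} Hz")
```

    Fitting the shape and timing of this modulation (and its subtle relativistic corrections) for both pulsars is what lets the team reconstruct the orbit and extract the post-Keplerian parameters.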

    They quantified the details in several so-called post-Keplerian parameters and found that all the parameters were consistent with one another and with GR to within the uncertainties. “General relativity does a perfect job of describing what we know of the system so far,” says Ingrid Stairs, an astronomer and team member from the University of British Columbia in Vancouver, Canada.

    Taylor and others had tested GR by studying single pulsars orbiting other objects. But with just one pulsar, researchers cannot directly determine certain details, such as the relative masses of the orbiting objects, says Taylor, who won the Nobel Prize in physics in 1993. Moreover, the pulsars in the double pulsar are moving faster than those in the other systems, he says, which accentuates relativistic effects.

    As well as testing GR, further observations might reveal a subtle interplay between the rate at which the pulsars orbit and the rate at which each spins on its axis. That would give scientists a direct measurement of the distribution of mass within a neutron star and a first real glimpse into its mysterious insides, says Thibault Damour, a theoretical physicist at the Institut des Hautes Études Scientifiques in Bures-sur-Yvette, France. “This is not for today,” he says, “but it shows that high-accuracy measurements might open a new window on nuclear physics.” It might take more than a decade to see the effect, but all say it will be worth the wait.


    Mild Climate, Lack of Moderns Let Last Neandertals Linger in Gibraltar

    Michael Balter

    One of the few things researchers agree on regarding the Neandertals is that the story of these European hominids ends in extinction. But just when the last Neandertal died, and whether modern humans or a changing climate sealed their fate, are matters of lively debate (Science, 14 September 2001, p. 1980). Now a team working at Gibraltar, at the southern tip of Spain, reports radiocarbon dates suggesting that some Neandertals survived thousands of years longer than previously thought, taking refuge in southern Europe where the climate and environment were favorable, and where moderns were still fairly thin on the ground. “While pioneer modern humans were staking tenuous footholds” in the region, says team leader Clive Finlayson, a biologist at the Gibraltar Museum, the last Neandertals “were hanging on.”

    The good life.

    Did Neandertals find refuge from a harsh climate and modern humans along Gibraltar's lush coast?


    Anthropologist Eric Delson of the City University of New York says that “the dates appear fully supported,” and that the notion of Neandertal refugia is “quite reasonable.” But some archaeologists believe contamination from younger material might have skewed the dates. “I have considerable reservations,” says archaeologist Paul Mellars of the University of Cambridge in the United Kingdom.

    The new dates come from Gorham's Cave in Gibraltar, where Neandertals left their characteristic Mousterian stone tools, although no fossils have been found. The international team obtained 22 radiocarbon dates from small pieces of charcoal in Mousterian layers dug between 1999 and 2005. The dates, reported online this week in Nature, range from 23,000 to 33,000 raw “radiocarbon years,” with a cluster at about 28,000; these must be calibrated to provide true calendar years. Although the calendar age is probably at least several thousand years older than the radiocarbon age, the calibration is uncertain (see p. 1560), and the team has stuck to uncalibrated dates. Reconstructions suggest that Gibraltar was surrounded by coastal wetlands and woodlands and blessed with mild temperatures at this time, Finlayson says, and the Neandertals enjoyed a rich cornucopia of resources including shrubs, birds, reptiles, and mollusks.

    The Gibraltar dates appear to be the youngest accepted for a Neandertal site, although sites in Spain and Portugal have been dated as late as 32,000 radiocarbon years ago. But the Gibraltar Neandertals were not entirely alone: Although there are very few modern human sites in the region older than 30,000 years, one site about 100 kilometers east at Bajondillo, Spain, has been dated to about 32,000 uncalibrated years ago. The team concludes that Neandertals did not rapidly disappear as moderns advanced but rather co-existed with them in a “mosaic” of separate, low-density populations over thousands of years.

    Mellars counters that many of the new dates actually cluster around 30,000 to 31,500 years ago, and the later ones could be contaminated. And archaeologist João Zilhão of the University of Bristol in the U.K. dismisses the idea that Neandertals and moderns lived near each other but had only limited contact. “This really stretches the bounds of credulity,” Zilhão says.

    But the Gibraltar Neandertals used only Mousterian technology rather than copying some modern techniques as late Neandertals did elsewhere in Europe, notes Katerina Harvati of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. In the end, Harvati says, the Neandertal groups who stuck to their own traditions might have had the better strategy, and survived longer.


    Microarray Data Reproduced, But Some Concerns Remain

    1. Jennifer Couzin

    Over the past several years, debate has raged in the microarray community over how reliably these microchip-sized tools measure gene expression. Now, in a clutch of six papers published online last week by Nature Biotechnology, a consortium of 137 researchers from 51 different organizations concludes that the technology works better than expected and that results can usually be reproduced across labs. But some caution that the findings, although a technical feat, do not mark the end of microarray worries, nor do they necessarily speed the entry of the devices into patient care.


    Certain gene-expression patterns, such as those shown here, can be reproduced across labs more reliably than thought.


    The MicroArray Quality Control (MAQC) project was initiated in 2005 by the U.S. Food and Drug Administration (FDA), which has a keen interest in seeing the technology aid drug development. If reliable, microarrays could be used for everything from testing a drug's effects to identifying patients most likely to benefit from a particular treatment. The MAQC venture came about after a controversial 2003 paper by Margaret Cam of the National Institute of Diabetes and Digestive and Kidney Diseases in Bethesda, Maryland. Cam worked with pancreatic cells and reported that three different microarray platforms revealed similar expression levels for only four of 185 genes tested (Science, 22 October 2004, p. 630). The inconsistent results likely stemmed, at least in part, from some microarray probes responding to the RNA of untargeted genes.

    The MAQC project was far more expansive. It put 20 microarray products and three alternative technologies through more than 1300 tests at different labs. Project investigators repeatedly tested expression levels of more than 12,000 genes using messenger RNA (mRNA) samples drawn from a composite of human tumor cell lines and from the human brain. The expression results overlapped—meaning that a gene's expression was recorded the same way by each microarray—from 70% to more than 90% of the time.
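    A cross-platform agreement rate like the one MAQC reports can be reduced to a simple statistic: of the genes both platforms detect, what fraction receive the same expression call? A toy sketch of that idea (gene names and calls below are invented for illustration, not MAQC data):

```python
def agreement(calls_a, calls_b):
    """Fraction of shared genes whose expression call ('up', 'down',
    'flat') matches between two platforms."""
    shared = calls_a.keys() & calls_b.keys()
    same = sum(1 for g in shared if calls_a[g] == calls_b[g])
    return same / len(shared)

# Hypothetical calls from two platforms on the same mRNA sample:
platform1 = {"GENE1": "up", "GENE2": "down", "GENE3": "flat", "GENE4": "up"}
platform2 = {"GENE1": "up", "GENE2": "down", "GENE3": "up", "GENE4": "up"}
print(agreement(platform1, platform2))  # 3 of 4 shared genes agree: 0.75
```

    The real consortium analysis is far richer, comparing ranked gene lists and fold changes, but the headline percentages summarize agreement in essentially this spirit.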

    “This can bring a complete change in mindset in analyzing microarray data,” says Leming Shi, a computational chemist with FDA, who led the consortium. He attributes many of the problems in previous studies to faulty statistics or unfamiliarity with the technology.

    Others say the concern about microarrays isn't over yet. “I would counsel caution” in interpreting these results, says Marc Salit, a chemist at the National Institute of Standards and Technology, who consulted with the authors on the project. He notes that the RNA samples MAQC compared are far more distinct from one another than are biological samples commonly compared in microarray studies—such as normal and cancerous cells from the same organ. Salit says this level of agreement would be unlikely in such cases, “because you'd be looking at more subtle differences.” Still, he notes that scientists can now test their own microarray technologies against the MAQC samples and gauge their reliability.

    “Here's the industrial standard,” agrees Hanlee Ji, an oncologist and clinical geneticist at Stanford University in Palo Alto, California, who was part of the MAQC consortium. “Is your performance comparable to theirs, or is it so askew that that makes you concerned about your data?” FDA, says Ji, could hold microarray data supplied in drug applications to this new standard.

    But microarray technology still has a way to go before it can be broadly used to predict cancer prognosis or diagnose disease. MAQC project members will gather in Arkansas on 21 September to determine what criteria should be used when applying microarray data to disease, and which illnesses are worth focusing on first.


    Foreign Enrollment Rebounds After 3-Year Slump

    1. Yudhijit Bhattacharjee

    Foreign-born students have rediscovered U.S. graduate schools. After a 3-year decline, the number of foreign students starting graduate studies in the United States rose 4% last year. And, continuing the upswing, the number of international graduate applications to U.S. universities shot up 12% this year, suggesting that the country may be regaining its positive image among foreign students.

    Observers credit the rebound, documented in a 643-institution survey released this week by the Council of Graduate Schools (CGS), to speedier processing of visa applications and increased international outreach. “The U.S. government has clearly made efforts” to modify some of the restrictive visa policies that were put in place after the 2001 terrorist attacks, says CGS President Debra Stewart. “The visa situation is still not perfect, but it is much better than it was in 2004.” A majority of foreign-born graduate students pursue degrees in the sciences and engineering.

    Iowa State University (ISU) in Ames last year began paying the $100 fee charged every foreign student to register in the U.S. government's Student and Exchange Visitor Information System. “We're seeing some rewards,” says admissions officer Patricia Parker, citing an increase of 400 international graduate applications this year. The U.S. State Department has helped too, Parker says, by sending embassy officials to local universities to explain visa application procedures.

    Stewart says the rapidly expanding pool of college graduates in countries such as India and China should be a boon to U.S. graduate programs. But she warns her colleagues against complacency: “Given the increased global competition for talent, the U.S. government must turn its visa policy from an impediment against foreign students to an instrument of recruitment.”


    Radiocarbon Dating's Final Frontier

    1. Michael Balter

    In a heroic and sometimes contentious effort, researchers push to extend accurate radiocarbon dating back to 50,000 years ago

    By land and by sea.

    Corals and foraminifers from the oceans, and trees from the land, are used to craft the radiocarbon calibration curve.


    The 1994 discovery of France's Grotte Chauvet revolutionized ideas about symbolic expression in early modern humans. The breathtaking drawings of horses, lions, and bears that adorned the cave walls were executed with perspective and shading and rivaled the virtuosity of all other known cave art. But when were those drawings made? Early radiocarbon dates suggested 32,000 years ago, right after a major cold spell hit Europe. This implied that modern humans blossomed under frigid conditions while their Neandertal cousins were going extinct. But improved radiocarbon dating now suggests that the oldest paintings at Chauvet could be at least 36,000 years old. That's smack in the middle of a period of relative warmth and challenges speculation about modern humans' adaptability to a cold climate.

    Push ‘em back.

    These fighting rhinos in France's Grotte Chauvet could be more than 4000 years older than originally thought.


    Getting the dating right is “crucial,” says archaeologist Clive Gamble of the University of London's Royal Holloway campus. “It is not just a case of winning a trophy by being the oldest. The model up to now has been that modern humans could go anywhere and do anything, and … it didn't matter what the climate was.” Thanks to more accurate dating, says Gamble, “that model is now showing signs of cracking.”

    Indeed, as radiocarbon experts revise their estimates, all researchers working in the eventful period from about 50,000 to 25,000 years ago are facing an across-the-board realignment of dates. That's when both Neandertals and modern humans lived in Europe and when wildly fluctuating temperatures culminated in the spread of glaciers across much of the Northern Hemisphere.

    There's no question about the basic principles of the radiocarbon method: Plants and animals absorb trace amounts of radioactive carbon-14 (14C) from CO2 in the atmosphere while alive but cease to do so when they die. So the steady decay of 14C in their tissues ticks away over the years. But the amount of 14C produced in the atmosphere varies with solar activity and fluctuations in Earth's magnetic field. This means that the radiocarbon clock can race ahead or seemingly stop for up to 5 centuries. As a result, raw radiocarbon dates sometimes diverge from real calendar years by hundreds or even thousands of years. Thus researchers must calibrate the clock to account for these fluctuations, and that can be a challenge. For example, the start of the Holocene, the epoch that began when the last ice age ended, is usually dated to 10,000 uncalibrated radiocarbon years ago. But the radiocarbon clock stopped for several hundred years right at that point, so that the start of the Holocene—when agriculture began—can't be pinned down any more precisely than somewhere between 11,200 and 11,800 years ago (see graph). Because the best estimate of the calibration keeps changing, many scientists avoid reporting calendar years and simply cite “radiocarbon years” as a universal measuring stick when announcing new finds (see News of the Week story by Balter).
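    The arithmetic behind a raw radiocarbon date is straightforward; the calibration described above is the hard part. As a hedged sketch using the conventional Libby mean life of 8033 years (the standard convention for reporting raw dates, not a figure from this article):

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; the convention used for raw radiocarbon dates

def radiocarbon_age(fraction_modern):
    """Raw (uncalibrated) radiocarbon age from the measured 14C content,
    expressed as a fraction of the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining half its 14C dates to one Libby half-life (~5568 years);
# converting such raw ages to calendar years still requires a calibration curve.
print(round(radiocarbon_age(0.5)))
```

    The formula is exact; the difficulty the article describes lies entirely in mapping these raw ages onto calendar years, because atmospheric 14C production has not been constant.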

    Ribbons of time.

    The thickness of the curves shows that calibration of very old dates, based on deep-sea corals (bottom graph), is less precise than younger dates based on tree rings (top).


    Yet recent progress in radiocarbon dating may finally give researchers the accuracy they seek. In 2004, after 25 years of painstaking labor, an international group of radiocarbon experts extended the calibration curve back to 26,000 years by using data from tree rings, corals, lake sediments, ice cores, and other sources to create a detailed record of 14C variations over the millennia. The final frontier, which the group hopes to reach by the end of this decade, will be to push calibration to the 50,000-year mark; beyond that, there is too little residual 14C to measure precisely.

    Refinement of existing data, plus some promising new data sources, including ancient trees from the swamps of New Zealand, may help close the final gap. “These are very exciting times,” says nuclear physicist Johannes van der Plicht, director of the radiocarbon laboratory at the University of Groningen in the Netherlands. He adds that a final calibration curve “will answer so many questions in archaeology,” in large part because the 50,000-year limit coincides with a major migration of modern humans from Africa to Europe and Asia.

    Earth scientists, many of whom use radiocarbon dating to study the movement of glaciers and ocean currents, are equally enthusiastic, in part because of the unprecedented climate variability that occurred between 30,000 and 50,000 years ago. Those who study sea-level fluctuations during and after the last ice age—data used to model patterns of global warming—rely “almost entirely” on radiocarbon dating, adds geophysicist Richard Peltier of the University of Toronto in Canada.

    Yet the eagerly awaited calibration is complicated by dissent in the ranks. One U.S. scientist has bypassed the international working group and published his own calibration curve, to the annoyance of many colleagues, while a British archaeologist is using provisional calibration data—prematurely, in the view of some radiocarbon experts—as evidence that Homo sapiens spread across Europe more rapidly than previously thought. Both researchers argue that science can't wait for an internationally agreed-upon calibration curve. The question at issue, says archaeologist Sturt Manning of Cornell University, is “who actually owns time”: the experts working to calibrate radiocarbon, or the research community at large.

    Science from the sewer

    The radiocarbon revolution that gave such a huge boost to archaeology and other fields had somewhat inauspicious origins: the sewers of Baltimore, Maryland. In 1947, chemist Willard Libby and his colleagues demonstrated that methane gas produced by Baltimore's Patapsco Sewage Plant contained trace amounts of radioactive 14C, thus proving that living organisms harbored the isotope (Science, 30 May 1947, p. 576). On the other hand, methane from much older sources, such as petroleum deposits millions of years old, did not contain 14C. From that point on, as Libby put it in his 1960 Nobel Lecture, “we [were] in the radiocarbon-dating business.”

    The revolution's early days were heady times. Libby wowed archaeologists when he accurately dated a number of samples whose ages were already known, including the 2750 B.C.E. coffin of the Egyptian pharaoh Zoser (Science, 23 December 1949, p. 678). Most archaeologists, who had previously relied on relative dating methods based on pottery styles, inscriptions, and guesswork, were thrilled to finally have a method for absolute dating, although a few attacked the method when it contradicted their pet theories.

    Some other dating methods, including thermoluminescence and uranium-series dating, overlap with the period covered by radiocarbon. But these techniques cannot be applied to bones, seeds, and other organic materials found in abundance on most archaeological sites. Yet as early as 1960, when Libby was awarded the Nobel Prize for his work, dating experts realized that past fluctuations in 14C levels were leading to erroneous and inconsistent results. Thus, although Libby had good luck with Zoser's coffin, radiocarbon dating of some earlier Egyptian artifacts contradicted dates from reliable historical sources. As the number of such troublesome discrepancies rose, it became clear that a calibration curve to correct for 14C variations, based on an independent data source, would be needed.

    Fortunately, just such a source was at hand: the sequences of annual tree rings, which dendrochronologists had been accumulating for decades. Long-lived trees such as the California bristlecone pine and European oaks and pines, which are often preserved in peat bogs, provide sections of ring width patterns that dendrochronologists use as bar codes to line up sequences of increasingly greater ages. By radiocarbon dating the rings, researchers began to construct calibration curves that could convert raw radiocarbon dates into real calendar years going back thousands of years.
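    The “bar code” alignment described above can be sketched as a sliding correlation: a floating ring-width sequence is tried at every offset along a dated master chronology, and the best-matching position dates it. A minimal illustration, with invented ring widths (real dendrochronology uses much longer series plus significance tests):

```python
from statistics import mean

def best_offset(master, sample):
    """Slide `sample` along `master` and return the offset where the
    Pearson correlation of ring widths is highest."""
    n = len(sample)
    best_off, best_r = None, -2.0
    for off in range(len(master) - n + 1):
        window = master[off:off + n]
        mw, ms = mean(window), mean(sample)
        cov = sum((w - mw) * (s - ms) for w, s in zip(window, sample))
        var = (sum((w - mw) ** 2 for w in window) *
               sum((s - ms) ** 2 for s in sample))
        r = cov / var ** 0.5
        if r > best_r:
            best_off, best_r = off, r
    return best_off, best_r

master = [1.2, 0.8, 1.5, 0.6, 1.1, 0.9, 1.4, 0.7, 1.0, 1.3]  # dated chronology
sample = [0.62, 1.08, 0.88, 1.42]  # noisy copy of master[3:7]
off, r = best_offset(master, sample)
print(off)  # the sample aligns at offset 3
```

    Once a floating sequence is locked into the master chronology this way, radiocarbon-dating its rings ties raw 14C ages to known calendar years.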

    Since then, the story has been one of continuously improving accuracy, as researchers have worked to pin down the curve. Starting in the late 1970s, radiocarbon labs began using accelerator mass spectrometry to directly count 14C atoms rather than estimating them indirectly; this allowed tiny samples such as small seeds and grains to be dated with much greater precision. And the early 1980s saw “a movement to have a consistent calibration,” says Timothy Jull, head of the radiocarbon lab at the University of Arizona in Tucson and editor of the flagship journal Radiocarbon. An international group has since met regularly on the issue, and new curves have been published approximately every 6 years.

    The most recent calibrations, unveiled in Radiocarbon in 2004, consist of three different curves: one to date marine samples and one each for terrestrial samples in the northern and southern hemispheres. The effort involved in each is tremendous. For example, IntCal04 is based in part on the overlapping alignment of many thousands of tree-ring segments from the Northern Hemisphere dating back to 12,400 years ago. “This is a phenomenal achievement,” says Richard Fairbanks, an isotope chemist at Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York, and a former member of the IntCal04 group.

    Going it alone.

    Richard Fairbanks's curve, based on corals (inset), extends the radiocarbon calibration to critical periods in human evolution.


    Beyond the limits of the dendrochronology record, however, radiocarbon experts have had to rely on other, considerably less precise sources of data. Between 12,400 and 26,000 years ago, the IntCal04 curve is based on two types of marine deposits: foraminifers (single-celled organisms that secrete calcium carbonate) from the Cariaco Basin of northern Venezuela up to 14,700 years ago (Science, 9 January 2004, p. 202), and several fossil coral records, including samples collected by Fairbanks and colleagues from the Atlantic and Pacific oceans, that cover this entire period.

    The new curve also introduces statistical methods to reduce uncertainties. Researchers applied a complex probabilistic approach called Bayesian statistics to make educated estimates of what the calibration curve should look like. When each data point was weighted according to how certain researchers were about it, “a more robust estimate of the curve resulted,” says Caitlin Buck, an archaeological statistician at the University of Sheffield in the U.K. and member of the IntCal group. Statistics can also improve the dates at specific sites, as in the case of the volcanic eruption of the Greek island of Thera, which destroyed a Minoan town and was recently dated to about 1600 B.C.E.—at least 100 years earlier than other estimates (Science, 28 April, p. 508).
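    The Bayesian machinery behind IntCal04 is elaborate, but the core idea of down-weighting uncertain data points can be illustrated with a simple inverse-variance weighted mean. This toy is a stand-in for the real method, not a description of it, and the numbers below are invented:

```python
def weighted_estimate(points):
    """Combine (value, sigma) estimates: each point is weighted by
    1/sigma^2, so less certain points pull the result less."""
    weights = [1.0 / s ** 2 for _, s in points]
    mean = sum(v * w for (v, _), w in zip(points, weights)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return mean, sigma

# Three hypothetical calendar-age estimates (years, 1-sigma) for one sample:
estimates = [(27800, 200), (28100, 400), (27900, 150)]
mean_age, sigma_age = weighted_estimate(estimates)
# The combined estimate sits nearest the most precise point, with a
# smaller uncertainty than any single measurement.
```

    A full Bayesian treatment also models the smoothness of the curve itself, which is what lets neighboring data points reinforce one another.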

    The field's progress can be viewed graphically, points out Cornell's Manning. The calibration curve is actually a ribbon rather than a line (see graphs, p. 1561), in which the width of the ribbon represents the remaining uncertainties in translating radiocarbon dates to calendar years. “If you plot all the calibration curves over the last 20 years, you will find that the ribbon is getting much narrower,” Manning says.

    All the way back?

    Encouraged by their recent successes, radiocarbon researchers now have their eyes on the bigger prize of the 50,000-year limit. Indeed, when the IntCal group began work on the 2004 curve, it had high hopes of extending it back to this final barrier. Yet it was not to be. Although the marine data sets were reasonably consistent with each other up to 26,000 years ago, after that they began to scatter and diverge, in some cases by up to several millennia. Geochronologist Paula Reimer of Queen's University in Belfast, Northern Ireland, who coordinates the working group, says that the differences—among the raw data as well as among the researchers—were just too great: “We had four or five people, all of whom thought their records were right.” So the group settled for publishing in Radiocarbon a comparison of the data sets earlier than 26,000 years, which they ironically called “NotCal”—meaning, Reimer and other members say, that it was not intended to be used as a calibration curve.

    But archaeologist Paul Mellars of the University of Cambridge in the U.K. used the published data to essentially do just that. Mellars was eager to get the most accurate dates for possibly contemporaneous Neandertal and modern human sites in Europe. So he used the midpoint of the differing “NotCal” curves to approximately calibrate the radiocarbon ages of 19 hominid sites ranging from Israel in the east to Spain in the west. Using this best-guess method, Mellars found that modern humans had not only spread across Europe faster than previously thought, but that they had overlapped with Neandertals during a shorter interval: only about 6000 years rather than 10,000 years in Europe as a whole, and as little as 1000 years in some parts of the continent. Mellars concluded in the 23 February 2006 issue of Nature that Neandertals must have “succumbed much more rapidly to competition” from modern humans than many had assumed.

    But Reimer and others say Mellars should not have used the NotCal data as he did. “It is dangerous to draw too fine conclusions using these data sets,” says Reimer, because they have not been finalized and the divergences between them have yet to be reconciled. Other researchers have started asking van der Plicht whether they can use the “Mellars curve” for calibration. “This is a bad thing,” says van der Plicht.

    Mellars insists that archaeologists can't wait for a final calibration curve. “Are we all really expected to keep studies of modern human origins on hold for the next 5 years, until they decide they've finally got the calibration act together?” he asks. The working group, he argues, “has hijacked the term ‘calibration’ to mean an absolutely agreed, rubber stamped, legalistic, signed, sealed, and delivered curve.” And even when the experts agree on a curve, Mellars says, it will not be “final and absolute” but “simply the best estimate from the data at the time.”

    Fairbanks is equally impatient. Last year, he and his co-workers decided to strike out on their own rather than wait for the consensus curve. In the September 2005 issue of Quaternary Science Reviews (QSR), the team published its own version of a calibration curve spanning the entire period from 50,000 years ago to today, based on its dating of fossil corals from the Atlantic and Pacific oceans. The team dated the corals using both radiocarbon and uranium-thorium dating. And the authors made it clear that they intended their curve to be used as a “stand-alone” radiocarbon calibration, arguing that their screening criteria for coral data were more rigorous than those of other coral data sets as well as the Cariaco Basin foraminifers.

    The more than 20 members of the IntCal04 working group, however, did not take this affront lying down. In the April 2006 issue of QSR, they contended in a letter that the Fairbanks paper was “extremely misleading” about the efforts of other groups and argued that stand-alone curves would lead to “confusion” among archaeologists and other researchers who had to use them. “The question is whether we maintain a common calibration curve or have different calibrations, as we did in the past,” says Jull. And Reimer maintains that the Fairbanks curve does not sufficiently take into account uncertainties from using a marine data set to estimate a terrestrial curve, because the oceans contain less 14C than the atmosphere and researchers must try to correct for the difference.

    Fairbanks, however, defends his decision to go it alone. “There is a critical need to have at least the skeleton of a precise and accurate radiocarbon calibration curve spanning the useful limits of radiocarbon dating now,” he says. “No international commission will stop scientific progress under the guise of consensus science.” Fairbanks adds that the IntCal04 group relied heavily on his team's coral data to extend its curve to 26,000 years. And he notes that he apparently has a growing number of customers. When his calibration Web site* debuted in August 2005, it received 900 visitors per month; by July 2006, it was getting about 1900. “Rick's curve is simply the most objective, because it involves the fewest assumptions,” says Christopher Charles of the Scripps Institution of Oceanography in San Diego, California, who used Fairbanks's curve to date deep-sea sediments and counts himself among the satisfied customers. Archaeologist John Hoffecker of the University of Colorado, Boulder, whose team recently used the Fairbanks curve to calibrate dates earlier than 40,000 years ago at the site of Kostenki in Russia, says that despite the controversy he was “reassured by Fairbanks's reputation.”

    Despite this acrimonious debate, however, there are signs that the community—and at least some of the data—might now be pulling together. At last April's 19th International 14C Conference in Oxford, U.K., earth scientist Konrad Hughen of Woods Hole Oceanographic Institution in Massachusetts, leader of the Cariaco Basin team, presented revised data that seemed to close much of the gap with Fairbanks's coral dates. “They are now getting very close,” says Manning, although Fairbanks points out that “Hughen's Cariaco Basin data set shifted closer to our coral data … and not the other way around.”

    And whereas the European tree-ring record goes back only 12,400 years, a paper presented at Oxford by Chris Turney of the University of Wollongong in Australia suggests that such records may be pushed back even to the 50,000-year limit. Turney and colleagues have been radiocarbon-dating fossil kauri trees—which can live up to 1000 years—from swamps in New Zealand. The dates stretch back 55,000 years and hold out the promise of a new terrestrial calibration source that could help reconcile some of the uncertainties in the marine records. “This would resolve a lot of issues,” Reimer says, “although it will take a lot of work.” Nevertheless, radiocarbon experts are optimistic that the 50,000-year barrier will soon be reached. “I foresee that in 10 years it will all be solved,” says van der Plicht.

    Tall order.

    Researchers hope ancient kauri trees buried in New Zealand swamps may extend tree-ring records back to at least 50,000 years ago.


    If so, the revolutionary promise that Libby and his colleagues first glimpsed in the sewers of Baltimore may soon become reality. And we may end up with a much better idea of when and why the creators of the Grotte Chauvet's glorious artworks came to France—and what the weather was really like when they ventured outside.


    A Stressful Situation

    1. Jean Marx

    When unfolded proteins amass, they stress the cell's endoplasmic reticulum. This ER stress is now being linked to diabetes, cancer, neurodegeneration, and other ills

    Even the most quality-conscious automaker occasionally rolls a defective car off the assembly line. Cells have a similar problem when it comes to manufacturing proteins. It's a complicated task and things can go wrong. Protein synthesis involves a lot more than stringing amino acids together in the right order. To function correctly, each linear strand of amino acids has to fold into just the right three-dimensional shape and may also have to be modified by addition of sugars or other accessory molecules.

    Like any good automaker, cells synthesizing proteins have mechanisms to maintain quality. Recently, cell biologists have learned a great deal about how the cell manages this quality control for the protein assembly line located within a convoluted network of membranous tubes known as the endoplasmic reticulum (ER). Roughly one-third of the cell's proteins, mainly those that end up in cellular membranes or are secreted to the outside, are made in the ER.

    Researchers have shown that the ER membrane contains three separate sensor molecules that respond when excessive amounts of unfolded proteins build up inside. This can happen with mutant proteins, such as the ones that cause hereditary forms of Alzheimer's and Parkinson's diseases. But it can also occur under more normal conditions if for some reason proteins are synthesized faster than they can fold and be modified. To alleviate this “ER stress,” the sensors trigger a series of signaling pathways that shut down the synthesis of most proteins while turning up the production of those needed for protein folding and degradation.

    This so-called unfolded protein response (UPR) is intended to protect the cell, but it's not foolproof. Sometimes, for example, the UPR can't eliminate the abnormal protein buildup in the ER. In that event, prolonged activity of the UPR may trigger cell death and may thus contribute to the neuronal loss of Alzheimer's, Parkinson's, and other neurodegenerative diseases. And the UPR may even backfire by protecting the cells of cancerous tumors from the lack of oxygen and nutrients they experience as tumors grow.

    Double-edged sword.

    Buildup of unfolded proteins in the ER, shown here in green with blue ribosomes, triggers a series of responses that are intended to protect cells but which can misfire and cause disease.


    In addition to cancer and neurodegeneration, ER stress and the UPR have been linked to several other common human ills, including diabetes and heart disease. In fact, the research has already suggested a new way to guide breast cancer therapy and hinted at a novel drug treatment for diabetes. “The field is expanding so much in terms of mechanism and relevance to disease. … It's just popping up everywhere,” says cell biologist Randal Kaufman of the University of Michigan Medical Center in Ann Arbor.

    Identifying the players

    Researchers are starting to find so many disease links at least partly because they now have a solid understanding of the molecular underpinnings of the ER stress system. “Most of the major components of the signaling apparatus have been ironed out,” says Linda Hendershot of St. Jude Children's Research Hospital in Memphis, Tennessee.

    Cell biologists got their first glimpse of ER stress and the UPR in the late 1980s when a group led by Kaufman and another led by Joseph Sambrook of Cold Spring Harbor Laboratory on New York's Long Island independently noticed that accumulation of unfolded proteins in the ER led to increased activity of a group of genes already known to turn on when glucose concentrations are high. The products of some of these so-called glucose regulated protein (GRP) genes are involved in protein folding, so it seemed as though the ER was trying to resolve its backup by turning up their synthesis.

    The researchers then wanted to know how unfolded proteins in the ER signal to the genes in the nucleus. In the early to mid-1990s, three proteins located in the ER membrane were found to be the key sensors. The first to be identified was a protein called IRE1 that was linked to the UPR in yeast by Peter Walter of the University of California, San Francisco (UCSF), and by Kazutoshi Mori, who was then working in Sambrook's lab. This protein is an enzyme that can cut RNAs—an activity that turned out to be crucial to its gene-regulating function.

    The researchers found that when unfolded proteins clutter the ER, IRE1 cuts a segment out of the messenger RNA (mRNA) that directs the synthesis of a yeast protein called HAC1. Another enzyme then rejoins the two end pieces, producing a shorter mRNA but a longer HAC1 protein due to a shift in the way the mRNA is translated. HAC1 is a transcription factor that activates GRP genes and others involved in protein folding, and the longer version does this much more effectively than the original one.

    Altering the splicing of a transcription factor's mRNA was a novel way of regulating gene expression. “We had an absolutely wonderful time,” Walter recalls. “Every leaf we turned over revealed something new.” The mechanism wasn't unique to yeast. A few years later, Kaufman's team and also that of David Ron at New York University School of Medicine in New York City showed that mammalian cells have their own versions of IRE1 that work on the mRNA for a transcription factor called XBP1.

    At about the same time, Ron's group and other researchers identified a protein called PERK as a second ER stress sensor. PERK is one of the cell's many kinase enzymes. Once activated, it adds a phosphate group to a protein called eIF2α, which normally helps initiate the translation of mRNAs into proteins. PERK's phosphorylation blocks eIF2α's function, thus helping alleviate ER stress by shutting down the production of most proteins.

    Finally, Mori's group, now at Kyoto University in Japan, identified ATF6 as the third ER stress sensor. When unfolded proteins amass, protease enzymes split off a segment of ATF6, and this fragment travels to the nucleus, where it helps activate protein-folding genes.

    PERK, IRE1, and ATF6 may be able to sense unfolded proteins both indirectly and directly. In the absence of ER stress, binding of a regulator protein called BiP/GRP78 keeps them in check. Unfolded proteins can essentially pull off the BiP/GRP78, freeing the three sensors to trigger the UPR. In addition, recent structural studies of IRE1 by Walter and Robert Stroud, also at UCSF, suggest that this protein has a deep groove that may enable it to bind unfolded proteins directly.

    The three sensors.

    When unfolded proteins in the ER pull the regulator BiP/GRP78 off the sensor proteins PERK, IRE1, and ATF6, they send signals into the nucleus that trigger the unfolded protein response. As a result, synthesis of most proteins gets turned off, while the cell makes more of those proteins needed for protein folding and destruction.


    Diabetes culprit?

    Once researchers identified the major genes for the ER stress sensors and other components of the UPR, they could manipulate the genes to see how they affect development and health in animals. A possible diabetes connection quickly became apparent. The first indication came in 2000 when Cécile Julier's team at the Pasteur Institute in Paris found that Wolcott-Rallison syndrome, a rare human hereditary disease whose symptoms include infancy-onset diabetes, is caused by mutations that inactivate PERK. When Ron and his colleagues subsequently knocked out the gene in mice, the animals also became diabetic shortly after birth because their insulin-producing beta cells malfunctioned and died.

    Further evidence that the PERK arm of the UPR is needed for normal beta-cell function came from Kaufman and his colleagues. They produced mice with a mutation in one copy of the gene for PERK's target. The defect prevents eIF2α from being phosphorylated by the kinase.

    The Michigan team showed that on a high-fat diet, the mutant mice became much more obese than did normal animals. The mutants also developed diabetes. Examination of the animals' beta cells revealed a swollen ER, apparently full of unfolded proteins including insulin, which the cells no longer secrete normally. “Compromise of the protein-folding environment is especially detrimental to the survival and function of beta cells,” Ron concludes.

    Whether mutations in PERK pathway constituents are a common cause of diabetes in humans is currently unclear. But the beta-cell failure in Wolcott-Rallison syndrome, like the diabetes of mice with PERK pathway defects, is reminiscent of type 1 diabetes, which usually arises in childhood when the beta cells die and insulin production is consequently lost.

    Recent evidence from Gökhan Hotamisligil, Umut Özcan, and their colleagues at Harvard School of Public Health in Boston has also linked ER stress and the UPR to type 2 diabetes, which develops because the patients' cells become unable to respond to insulin. Obesity is a major risk factor for type 2 diabetes, although how it leads to the disease has been a mystery. But Hotamisligil says he realized a few years ago that a variety of conditions linked to obesity, including lipid accumulation, high demand for protein synthesis, and glucose deprivation, can trigger ER stress.

    To find out whether such stress might be involved in type 2 diabetes, the Harvard workers turned to the ob/ob strain of mice. The animals, which are genetically obese because they lack an appetite-regulating hormone called leptin, develop an insulin-resistant form of the disease. In work published about 2 years ago, Hotamisligil and his colleagues found that fat and liver cells from the animals produce increased amounts of several proteins involved in ER stress responses (Science, 15 October 2004, p. 457). “All branches of the UPR are activated,” Hotamisligil says.

    Ron and his colleagues had previously shown that one result of IRE1 activation is increased activity of a kinase called JNK. And JNK, the Harvard team found, phosphorylates insulin receptor substrate 1 (IRS-1), a key relay in insulin signaling, blocking the cell's response to the hormone, a change that could account for the development of insulin insensitivity in ob/ob mice. Consistent with that, the researchers found that they could prevent development of insulin resistance both in ob/ob cells and in living mice by blocking IRE1 or JNK action.

    More recently, Hotamisligil and his colleagues showed that treatments that relieve ER stress can even reverse already-established diabetic changes in the mutant mice (Science, 25 August, p. 1137). For these experiments, they gave the animals either of two small-molecule drugs that were developed for treating other conditions but which can promote protein folding in the ER. Within 4 days, insulin sensitivity in the animals returned, and their previously high blood glucose concentrations dropped to normal. “The results were quite spectacular—more than we expected,” Hotamisligil says.

    Although the drugs are not widely used, the U.S. Food and Drug Administration has already approved both for certain liver diseases and hereditary disorders of the urea cycle, the body's system for disposing of nitrogen wastes. Thus, they are candidates for clinical testing as diabetes therapies. Christopher Newgard, a diabetes expert at Duke University School of Medicine in Durham, North Carolina, describes the Hotamisligil team's results as “compelling and exciting.” It's too soon to tell, however, whether the drugs will have unwanted side effects if given to the large population of diabetes patients.

    Cancers' friend

    Type 2 diabetes is not the only disease that may be fostered by the cell's efforts to protect itself against ER stress. Recent results also put cancer in that category.

    Cancer cells are subject to ER stress because a tumor can grow faster than its blood supply, thus leaving its cells short of oxygen and nutrients. Exactly how that leads to ER stress is unclear, but the nutrient deficiency may rob tumor cells of the energy they need for protein folding and the glucose needed to add the sugars to their proteins. But whatever the cause, Hendershot says, “the UPR seems to kick in. It's protective to the [cancer] cell—but detrimental to the host.”

    Some of the results linking cancer progression to the UPR come from Constantinos Koumenis and his colleagues at Wake Forest University School of Medicine in Winston-Salem, North Carolina. Working with Bradley Wouters of the University of Maastricht in the Netherlands, they found that activation of the PERK branch of the UPR fosters tumor growth. For example, tumor cells in which the PERK gene had been knocked out grew very poorly when transplanted into mice, compared with cells that still contained the gene.

    Path to diabetes.

    A mouse (bottom) with a mutation affecting the PERK pathway becomes much fatter when fed a high-fat diet than a normal mouse (top) does and also develops diabetes. The greatly enlarged ER of the mutant animal's beta cells (bottom micrograph), compared to that of normal beta cells (top), suggests that this is related to ER stress.


    Similarly, Albert Koong and Lorenzo Romero-Ramirez of Stanford University in Palo Alto, California, and their colleagues showed that activity of the IRE1-XBP1 branch of the UPR is required for tumor growth. “If you block the UPR in any of these tumor systems, the tumors grow slower and smaller,” says Koumenis, who recently moved to the University of Pennsylvania. And the effects are apparently not limited to experimental tumors in animals: Human tumors show increased expression of UPR system components, particularly in oxygen-starved areas.

    At least one of these components may help assess how breast cancers will respond to chemotherapy. About 10 years ago, Amy Lee's team at the University of Southern California Keck School of Medicine in Los Angeles found that the UPR regulator BiP/GRP78 is needed for the growth of fibrosarcoma tumors implanted in mice. Further work by her group and others has shown that BiP/GRP78 levels increase as tumors, including human breast and lung cancers, grow.

    The protein may help cancer cells survive, Lee says, by inhibiting programmed cell death or apoptosis. Although a prolonged UPR can promote apoptosis, several groups, including Lee's, have found that BiP itself is anti-apoptotic. They've shown that it interferes with caspase enzymes and other components of cell death pathways.

    This mode of operation may also help explain another cancer-friendly aspect of UPR activation. Cancers often develop resistance to the drugs and radiation used to treat them, and researchers, including Lee and Hendershot, have found that the UPR contributes to this resistance for some drugs, particularly those that work by triggering apoptosis.

    In the 15 August issue of Cancer Research, Lee and her colleagues report that it may be possible to use the BiP/GRP78 status of a patient's breast cancer cells to guide her therapy. As expected from their previous results, the researchers found that about two-thirds of tumor samples obtained before the initiation of chemotherapy exhibited overexpression of the protein.

    Patients who had BiP/GRP78-positive tumors and were treated only with the drug adriamycin, an apoptosis inducer, had a higher risk of relapsing than patients with negative tumors. “We found that we can use this marker as a predictor of chemotherapy resistance in breast cancer,” says Lee.

    The situation is complicated, however, because in other cases the UPR may actually increase sensitivity to chemotherapeutic drugs. For example, Lee and her colleagues found that patients who took both taxane and adriamycin and had BiP/GRP78-positive tumors fared better than those whose tumors did not overexpress the protein. “We need some hard research to understand when UPR activation is good and when it is bad,” Hendershot says.

    A stressed brain

    Neurobiologists are facing a similar issue. Many of the most common and devastating neurodegenerative diseases, including Alzheimer's, Parkinson's, and Huntington's, are characterized by the accumulation and aggregation of misfolded proteins. Not surprisingly then, numerous researchers have found evidence that the UPR is turned on in these conditions.

    Studies of both human brain samples taken at autopsy and the brains of mice that have been genetically engineered to mimic the diseases have revealed increased expression of various components of the UPR pathways. “Everyone pretty much agrees that [the UPR] is activated” in Alzheimer's and other neurodegenerative diseases, says neurobiologist Dale Bredesen of the Buck Institute for Age Research in Novato, California, whose group is among those doing the work.

    But that activation may ultimately do more harm than good. If the UPR fails to resolve the ER stress that the patients' brain neurons are experiencing, its sustained attempts may contribute to the progression of the diseases by triggering apoptosis in the cells. “The initial response is protective, but the late response is destructive,” Bredesen says.

    Exactly how the UPR sets the cell on the road to apoptosis remains one of the big mysteries of the field, however. Neurobiologists have come up with some leads. For example, Bredesen's team and others have fingered caspase 12 as a possible link. Treatments that induce ER stress lead to increased activity of this enzyme, which is among those that bring about the final destruction of cells undergoing apoptosis. The work was done with mouse models, however, and it's unclear whether something similar occurs in the human diseases.

    Researchers would like to get a better handle on how the UPR triggers apoptosis, as that might provide new targets for drug therapies. The fact that the UPR can be helpful in some situations and harmful in others could complicate such a drug development effort, however. Maintaining quality control on the cell's protein assembly line looks to be a great deal harder than anything car manufacturers face.


    The Galapagos Islands Kiss Their Goat Problem Goodbye

    Jerry Guo*

    * Jerry Guo is a freelance writer in New Haven, Connecticut.

    The world's largest eradication campaign has virtually rid an ecological wonderland of feral goats, a devastating invader. Next in the crosshairs: cats and rats

    Buzz cut.

    As feral goats turned Isabela's brush and cloud forests into patchy grassland, tortoises suffered.


    SANTA CRUZ, GALÁPAGOS ISLANDS—Rachel Atkinson hops like a Darwin finch from one volcanic outcropping to the next, then plunges into ankle-deep mud. Squishing as she walks, the botanist with the Charles Darwin Research Station homes in on the ailing invaders: blackberry, passion fruit, and quinine bushes clustered near Santa Cruz Island's last shrubby stands of Scalesia trees. Atkinson smiles in approval. One more blast of herbicide ought to prevent the aliens from regrowing and give the Scalesia a shot at survival after all.

    Atkinson's search-and-destroy mission is part of an ambitious 6-year, $18 million Global Environment Facility (GEF) effort by the station and Galápagos National Park to turn the tide against invasive species in the Galápagos Islands, the fragile crucible of life that inspired Charles Darwin to formulate his theory of evolution more than 150 years ago. The GEF grant runs until next year, but the results so far are stunning. A survey here last month confirmed that enemy number one—the feral goat—has been virtually wiped off Isabela, Santiago, and Pinta islands. All told, some 140,000 feral goats were slain in 5 years of the GEF-funded Project Isabela, the largest eradication project ever undertaken. “A great battle has been won here,” says Victor Carrion, subdirector of the park.

    Although one bane has been eliminated, others are at large. In northern Isabela, rats have ravaged the last two nesting sites of the mangrove finch, whose population is estimated at fewer than 100 birds. And both rats and feral cats have decimated a subspecies of marine iguana (Amblyrhynchus cristatus albemarlensis) endemic to Isabela, prompting the World Conservation Union to add it to its vulnerable list in 2004. Rangers have set out traps and poison for Isabela's rats and are plotting eradication campaigns on Floreana and Santiago islands. An effort to poison feral cats will commence next year.

    The Galápagos have been under siege ever since pirates and whalers began visiting the archipelago in the 1700s and leaving behind goats, pigs, and other animals as a living larder for future visits. But it wasn't until the late 1980s that the goat population suddenly started booming, possibly due to El Niño-driven changes in vegetation patterns. Godfrey Merlen, a Galápagos native and director of WildAid, says he saw “two or three” goats on the upper flanks of Isabela's Alcedo volcano in 1992. When he returned 3 years later, he saw hundreds. “It was total chaos,” Merlen says. The goats had denuded the once-lush terrain, transforming brush and cloud forests into patchy grassland.

    Ecological shock waves rippled across Isabela. The highlands had served as a safe haven for species such as the giant tortoise. “We saw many more tortoises falling into the volcanic craters,” trying to reach feeding grounds or because of erosion, says Carrion. “Being a baby tortoise is hard enough,” adds Thomas Fritts, past president of the Charles Darwin Foundation. “Competing with voracious herbivores is an extra challenge.”

    Park rangers quickly cottoned on and started slaying the goats in 1995. They had eradicated a much smaller population from Española Island in the 1970s. But with tens of thousands of goats on northern Isabela alone, officials knew they needed a novel approach. In 2000, GEF agreed to bankroll an antigoat operation as long as it was part of an effort to tackle invasive species across the board (Science, 27 July 2001, p. 590).

    Goats were still top priority. The park imported hunting dogs from New Zealand and trained them to track and kill goats. Helicopters were pressed into service for sharpshooters to reach rugged highlands. To flush out the last feral holdouts, the park released “Judas” goats, including sterilized females plied with hormones to keep them in heat and attract males. The last feral goat in northern Isabela was shot in March. Hunters have also purged pigs from Santiago and donkeys from both Santiago and Isabela.

    Local scientists say native plants are already bouncing back. Seedlings of Scalesia and soldierbush are sprouting on Alcedo. And on Santiago, cat's claw and Galápagos guava are thriving, providing nesting grounds for the secretive Galápagos rail.

    One looming threat is microbial invaders. “What can cause far greater and permanent damage are the small introduced species [such as] West Nile virus, now in Colombia, a stone's throw away from Galápagos,” says Merlen. In a paper in the August issue of Conservation Biology, Marm Kilpatrick of the Consortium for Conservation Medicine in New York City and colleagues concluded that West Nile virus-ridden mosquitoes could easily hitch a ride on a commercial jet from mainland Ecuador. “The Galápagos has been very lucky so far, but it's just a matter of time,” says Simon Goodman of the University of Leeds in the U.K., an author of the paper. He says that West Nile virus could inflict the sort of damage in the Galápagos that avian malaria did in Hawaii in 2004, when it drove a honeycreeper (Melamprosops phaeosoma) to extinction. Galápagos officials pledge to remain vigilant and point to the establishment in 2003 of a molecular pathology lab on Santa Cruz funded by the U.K.'s Darwin Initiative.

    To avoid ceding hard-won breathing room for native species, the park and research station plan to set up a $15 million fund for ongoing eradication efforts. In the meantime, they are stepping up efforts against invasive plants and gearing up for the cat-and-rat blitzkrieg. Unless these and other unwelcome visitors go the way of the goats, warns Carrion, “the worst may be yet to come.”