News this Week

Science  26 Sep 2008:
Vol. 321, Issue 5897, pp. 1752
  1. SCIENCE EDUCATION

    Misjudged Talk Opens Creationist Rift at Royal Society

    1. Daniel Clery

    A talk titled “Should Creationism Be a Part of the Science Curriculum?” was bound to attract attention at the annual meeting of the British Association for the Advancement of Science (BA) earlier this month. But last week, it cost the speaker, Michael Reiss, his job as director of education at the Royal Society, Britain's academy of science.

    Within hours of his 11 September talk, news items appeared on the Internet claiming that Reiss had urged science educators to teach creationism, although many attending the speech said that he had clearly not made such a call. His comments—or perhaps more accurately the spin placed on them by headline writers, newspaper columnists, and editorialists—ignited a firestorm. Several prominent scientists, including a trio of Nobel laureates, called for his resignation. The Royal Society hastily put out a statement defending Reiss but 4 days later issued another statement announcing his resignation and leaving the clear impression he had been forced out.

    Although some critics of Reiss applauded his sudden exit, others saved their harshest words for the organization he left. “This has damaged the Royal Society, the way they handled it,” says Derek Bell, head of the Association for Science Education in the United Kingdom.

    On paper, Reiss, who remains a professor at the University of London's Institute of Education, seemed the perfect speaker at the science festival in Liverpool. In addition to having a doctorate in science education, he's an ordained minister in the Church of England and coedited the book Teaching About Scientific Origins: Taking Account of Creationism. In his 11 September talk, Reiss noted that teachers are bound to encounter pupils with creationist views. If these are brought up in class, he argued, simply dismissing them as not appropriate to a science lesson will only alienate those pupils. Instead, Reiss advocates taking the opportunity to explain the difference between the creationist viewpoint, which, he emphasizes, has no evidence to support it, and evolution, which, he says, has a lot. A teacher's answers to such questioning from creationist pupils “can be used to illustrate a number of aspects of how science works,” Reiss says in the online text of his talk (www1.the-ba.net/bafos/press/showtalk2.asp?TalkID=301).

    Efforts by creationists or believers in intelligent design to tamper with school science curricula are viewed in Europe as largely an American phenomenon, one reflected in the public acceptance of the theory of evolution. Only about 40% of adults in the United States accept the idea of Darwinism, compared with about 70% in the United Kingdom and many other European nations (Science, 11 August 2006, p. 765). Yet throughout Europe, groups promoting creationist views are emerging from Protestant, Catholic, and Islamic communities in different countries.

    Wise words?

    Reiss's comments on dealing with creationist pupils were jumped on by U.K. newspapers.

    CREDIT: INSTITUTE OF EDUCATION

    In the United Kingdom, government policy throughout this decade has been to create more so-called faith schools, high schools funded predominantly by government but managed by religious organizations. This has increased concern among science educators about the spread of creationism into curricula. There was also widespread condemnation in 2006 of a pressure group called Truth in Science that sent creationist teaching materials to every U.K. high school. In part because of this, in 2007 the government published guidance for schools on how to address creationism and intelligent design, drafted by Reiss and others.

    The day after his speech, Reiss was greeted by headlines such as “Call for creationism in the classroom,” in the Financial Times, and “Children should be taught about creationism in school,” in the Daily Mail. Highlighting Reiss's description of creationism as a “worldview” that should be respected, many of the stories suggested he equated it with the theory of evolution. That same day, the Royal Society issued a statement reaffirming its position that creationism should not be taught as science, saying that Reiss's views had been “misrepresented” and offering a clarification from him.

    That move failed to quell the storm. Several fellows of the Royal Society stated publicly that it wasn't appropriate for Reiss to hold such an influential position given his religious affiliation. “I do not see how he could continue,” says Nobelist Harry Kroto of Florida State University in Tallahassee. On 13 September, another society fellow, Nobelist Richard Roberts of New England Biolabs in Ipswich, Massachusetts, sent a letter to the Royal Society, cosigned by Kroto and John Sulston of the University of Manchester in the U.K., calling for Reiss to step down.

    Reiss has declined to comment since his talk, but on 16 September the society issued a new statement, saying that Reiss's comments “were open to misinterpretation. While it was not his intention, this has led to damage to the Society's reputation.” The society and Reiss, the statement continued, had agreed that “in the best interests of the Society, he will step down immediately.”

    Scientists and experts in science education have leapt to Reiss's defense. “There's an awful lot of support for Michael Reiss, as a person and his views,” says Bell. Fertility expert and society fellow Robert Winston of Imperial College London issued a statement critical of the Royal Society: “This is not a good day for the reputation of science or scientists. This individual was arguing that we should engage with and address public misconceptions about science—something that the Royal Society should applaud.” Even evolutionary biologist and noted creationism critic Richard Dawkins defended Reiss on his Web site, calling efforts to remove him “a little too close to a witch-hunt.”

    Others, however, welcomed Reiss's departure as the best way for the Royal Society to make clear its position on creationism. “The only reason to mention creationism in schools is to enable teachers to demonstrate why the idea is scientific nonsense,” says Christopher Higgins, vice-chancellor of Durham University in the U.K.

    Royal Society President Martin Rees said in an e-mail to Science that “the Royal Society should be secular, but not anti-religious.” But Phil Willis, a member of the U.K. Parliament who chairs a committee that oversees British science, acknowledges that there is a “very stark division” between those in the society who reject religion completely and those who urge coexistence or are themselves religious. Although Willis has great respect for Reiss, he was “caught making injudicious comments,” Willis says. “It's a real tragedy, [but] it's inevitable that they parted company.”

  2. PARTICLE PHYSICS

    After Spectacular Start, the LHC Injures Itself

    1. Adrian Cho

    When physicists first sent particles racing through the world's biggest atom smasher on 10 September, the Large Hadron Collider (LHC) at the European particle physics laboratory, CERN, near Geneva, Switzerland, the gargantuan machine purred like a kitten. But only 9 days later, the LHC proved it can also be a temperamental tiger, damaging itself so severely that it will be out of action until next spring.

    It all seemed so easy earlier this month when, after just a few hours, researchers had beams of protons whizzing through both countercirculating rings of the 27-kilometer-long, $5.5 billion machine. That smooth start raised hopes that the LHC would start colliding particles as early as this week. But last Friday, some of the 1232 main superconducting dipole magnets, which keep the beams on their circular trajectories, abruptly overheated in an event known as a “quench.” The incident ruptured the plumbing that carries liquid helium through the magnets to chill them to 2 kelvin—2 degrees above absolute zero.

    The quench highlights the LHC's ability to injure itself, says Reinhard Bacher of the DESY particle physics lab in Hamburg, Germany. Compared with earlier colliders, “the LHC is operating in all aspects much more at the critical edge,” he says. The breakdown raises the question of whether a key protection system worked properly.

    A superconducting magnet is essentially a coil of superconducting wire that generates a magnetic field when current flows through it. If kept extremely cold, the wire carries huge currents without resistance; a quench occurs when part of it overheats and acts like an ordinary wire. The hot bit serves as an electric heater that can trigger a runaway reaction, toasting the rest of the magnet and converting the energy in its field to heat.

    Such an event can start if, for example, protons stray out of the beam pipe and into the magnet material. The 19 September quench occurred a different way, however. Within the LHC, the 15-meter-long magnets are connected so that current from one magnet flows into the next. With no beam, researchers were ramping up the current in a chain of 154 dipole magnets when the superconducting connection between two magnets apparently overheated and melted, says CERN spokesperson James Gillies. Such a breach can also cause an uncontrolled quench: The loss of current makes a magnet's field quickly ebb, and the magnet responds with a voltage surge to counteract the waning field. That reaction can then heat the magnet as a whole.

    Hot spot.

    A worker hooks up LHC magnets. The machine broke down when an electrical link between two magnets melted.

    CREDIT: CERN

    The field of an LHC dipole contains a whopping 8.6 megajoules of energy, enough to melt 42 kilograms of copper or to cook the magnet in a fraction of a second. Researchers designed a quench-protection system to quickly shunt current out of an afflicted magnet and into large steel blocks. Ironically, it also heats the entire magnet to spread out the toll from the quench. During last Friday's mishap, the quench-protection circuits fired as expected, Gillies says.
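The article's 8.6-megajoule figure follows from the standard relation for energy stored in an inductor, E = ½LI². As a rough sketch (the nominal operating current of roughly 11,850 A is an assumed figure here, not stated in the article), one can work backward to estimate the effective inductance of a single dipole coil:

```python
# Stored magnetic energy in an inductor: E = 0.5 * L * I**2.
# Working backward from the ~8.6 MJ quoted for one LHC dipole and an
# assumed nominal operating current of ~11,850 A, estimate the coil's
# effective inductance.
E_stored = 8.6e6      # joules per dipole (figure from the article)
I_nominal = 11850.0   # amperes, assumed nominal dipole current

L_dipole = 2 * E_stored / I_nominal**2  # henries
print(f"Effective inductance: {L_dipole:.3f} H")
```

The result, on the order of a tenth of a henry, is why dumping that current safely requires the dedicated quench-protection circuits described above.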

    Nevertheless, the broken helium line suggests that at least one $900,000 magnet was ruined and will need to be replaced. “I cannot tell if any of the safety systems failed,” says CERN's Rüdiger Schmidt, who leads the machine-protection team. “It is absolutely too early to tell.”

    To make repairs, researchers will have to warm up an entire octant of the LHC and then cool it back down. Bacher says that DESY researchers experienced similar delays when they commissioned their HERA collider, which smashed electrons into protons from 1992 to 2007. “We had three or four of these warm-up-and-cool-down cycles,” he says. One such cycle will take at least 2 months for the LHC. That will run into a planned shutdown for the winter, when the cost of power climbs, so experimenters will have to wait until next spring to collect data.

  3. SPACE SCIENCE

    Rising Costs Could Delay NASA's Next Mission to Mars and Future Launches

    1. Andrew Lawler*
    1. With reporting by Richard A. Kerr.
    In the red?

    A laser on the planned Mars Science Laboratory has been redesigned, but other problems are pushing up costs.

    CREDIT: LOS ALAMOS NATIONAL LABORATORY

    Faced with a dramatically higher price tag, NASA managers will decide next month whether to postpone the launch of a sophisticated Mars rover for 2 years. Such a delay in the Mars Science Laboratory (MSL) would mark a significant setback to the Mars research program, which has sent a new spacecraft to the planet every other year for a decade. Planetary scientists also worry that pushing back the mission could have a ripple effect, delaying and even canceling future missions.

    The science laboratory, currently slated for launch in the fall of 2009, is four times heavier than the current rovers trundling across the planet's surface. It features a plethora of advanced tools and instruments designed to analyze rocks, soil, and atmosphere. But that complexity has led to technical troubles and higher costs. When proposed in 2004, the lab was expected to cost $1.2 billion. By this summer, that price tag had climbed to $1.9 billion, and last week NASA space science chief Edward Weiler warned that “there is another overrun coming.” Another NASA official put the latest increase at approximately $300 million.

    Engineers at NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California, which is responsible for MSL, are now working overtime to prepare the spacecraft for environmental testing and launch. Weiler and NASA Administrator Michael Griffin will pore over those results and the latest cost estimates when they meet in mid-October to determine whether to delay the mission. In addition to worrying about the unbudgeted overtime, Weiler is concerned that engineers may be rushing their inspection of the rover's complicated systems. “The alternative could be that we get a crater on Mars,” the science chief adds ominously, evoking previously failed missions to the Red Planet.

    “Postponing MSL is a real possibility, and an unfortunate one,” says Brown University planetary scientist Jack Mustard, who also chairs NASA's Mars science advisory panel. “A 2-year delay could increase the cost of the mission significantly—and that would come out of the Mars budget.” He also fears that this could slow momentum for Mars exploration and jeopardize plans for a 2016 rover and a sample-return mission.

    MSL's cost and technical woes began in earnest last year, when NASA considered jettisoning two instruments, ChemCam and the Mars Descent Imager. ChemCam, an instrument designed by French and American researchers that would use a laser to vaporize martian rock and dust for spectrographic analysis and take detailed photographs, proved more expensive than anticipated. In a compromise reached last fall, NASA provided extra money to complete a simplified version of the instrument. The Mars Descent Imager, a camera designed to provide a view from just above the surface of the rover's landing site, was canceled but revived when engineers found ways to control costs by making relatively minor changes.

    But those changes didn't stanch the financial bleeding. The latest technical problems affecting the MSL budget include the tardy delivery of hardware used in the sample acquisition and handling portion of the laboratory. NASA Planetary Science Division Director Jim Green said in June that the total overrun for MSL in 2008 and 2009 was $190 million. Most of that money—some $115 million—will come from other Mars-related projects. JPL spokesperson Guy Webster referred MSL questions to NASA headquarters.

    A new $300 million overrun, says a NASA official familiar with MSL, could force the agency to cancel the $485 million 2013 Scout mission announced just last week to probe the planet's atmosphere (Science, 19 September, p. 1621) or the 2016 Mars mission. “Rest assured the Mars program has to pay for this,” the official added.

    Mustard, who last week chaired a Mars advisory panel session in California, says there is growing anxiety in the community about the implications of an MSL delay on an exploration program that began in the late 1990s. “If you delay it until 2011, then you might lose the 2016 mission,” he says. The 2016 Mars effort now under consideration likely would be a smaller rover that could include some sample-gathering technology designed to test systems for an eventual sample-return mission from Mars to Earth. The projected $1.4 billion cost of such a rover would fall between that of MSL and that of the Spirit and Opportunity rovers now on the surface.

    NASA's former space science chief, S. Alan Stern, who resigned in protest this spring after a disagreement with Griffin over how to deal with cost overruns (Science, 4 April, p. 31), last year proposed a Mars sample-return mission to arrive at the end of the next decade. But static budgets, spacecraft overruns, and the need to conduct other missions make that increasingly unrealistic, say agency managers and academic researchers. Weiler notes that a sample-return mission would cost many billions of dollars and that NASA is planning first to launch a mission to either Jupiter or Saturn late in the next decade. And although scientists are intrigued by the idea of a sample-return mission, they see it slipping into the more distant future.

    “Plans for a sample return were smoke and mirrors,” says Mustard. “It's a good idea—but where's the money?”

  4. GEOCHEMISTRY

    Geologists Find Vestige of Early Earth—Maybe World's Oldest Rock

    1. Richard A. Kerr

    Really old stuff is rare on Earth. The planet's brand of violent geology has just been too dynamic to preserve much from its earliest days. Formed 4.567 billion years ago, Earth has yielded 4.3-billion-year-old mineral grains and 4.0-billion-year-old rocks that hint at how a ball of primordial debris evolved into a crusted-over, largely ocean-covered abode of life.

    So geologists keep searching the oldest, most brutally battered terrains for more traces of earliest Earth. On page 1828, a group reports the discovery of rock in northern Quebec on Hudson Bay that records the existence of the earliest crust. The Canadian rock may also be the oldest known rock, beating the previous record by 300 million years.

    Given how beaten up the oldest rocks are, geologists often fall back on atomic-scale records preserved in the isotopic and elemental composition of the rocks. Geologist Jonathan O'Neil of McGill University in Montreal, geochemist Richard Carlson of the Carnegie Institution of Washington's Department of Terrestrial Magnetism in Washington, D.C., and colleagues analyzed isotopes of the elements samarium and neodymium from the Nuvvuagittuq greenstone belt of northern Quebec. These isotopes can be used to trace geologic processes because some are stable and don't change no matter how many eons pass, whereas others steadily decay radioactively into more stable isotopes. Different elements behave differently when rock partially melts; some tend to concentrate in the melt, while others remain behind.

    Delving into the samarium and neodymium of volcanic and altered sedimentary Nuvvuagittuq rock, O'Neil and his colleagues found isotopic signs that the rock could represent the oldest section of crust on Earth. Geochemists had already found rock in Greenland that, according to its isotopes, had been derived from the earliest mantle rock. But by 4.3 billion years ago, that mantle rock had partially melted to yield crustal rock, so researchers had fingered “protomantle” by analyzing Greenland rock derived from it. But where was the “protocrust” that must have been formed as the protomantle formed?

    O'Neil and colleagues think they now have such protocrust in Quebec. The Nuvvuagittuq rock has the opposite neodymium isotope signature of the Greenland rock's protomantle. Either this rock is a 2-kilometer-long sliver of protocrust resembling today's iron-rich ocean crust, or it was derived from such protocrust. “That's a first,” says geochemist Albrecht Hofmann of the Max Planck Institute for Chemistry in Mainz, Germany. “It's an heroic effort” to measure the subtle isotopic variations involved.

    Older than dirt.

    Rocks by Hudson Bay may date back to when Earth first separated its primordial stuff into mantle and crust.

    CREDIT: JONATHAN O'NEIL/MCGILL UNIVERSITY

    The group goes further, drawing on the clocklike radioactive decay of samarium-146 to calculate an age of formation of the Nuvvuagittuq rock of about 4.3 billion years. If accurate, that age would mean they have the protocrust itself, not just something derived from it. That rock would be the oldest rock known, approaching the age of individual zircon mineral grains from western Australia that tell of a wet and weathered world soon after Earth's origin. The new age “is exciting,” says geochemist Mukul Sharma of Dartmouth College, but uncertainties remain about details of the rocks' formation that bear on their isotopic age. “There's a lot more work that needs to be done,” he says, before a new world's most ancient rock can be crowned.
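What makes samarium-146 such a sensitive clock for earliest Earth is that it is now effectively extinct. Its half-life is short on geologic timescales (roughly 103 million years is the value long used in the literature; this sketch takes that as an assumption), so the surviving fraction follows the usual decay law N/N₀ = 0.5^(t/t½):

```python
# Samarium-146 decays toward neodymium-142. With a half-life of ~103 Myr
# (assumed value), virtually none survives today, so any anomaly in Nd-142
# must have been created while Sm-146 still existed -- i.e., within the
# first few hundred million years of the solar system.
T_HALF_MYR = 103.0  # approximate Sm-146 half-life, millions of years

def fraction_remaining(t_myr):
    """Fraction of the original Sm-146 left after t_myr million years."""
    return 0.5 ** (t_myr / T_HALF_MYR)

frac_early = fraction_remaining(500.0)    # 500 Myr after formation: a few percent left
frac_today = fraction_remaining(4300.0)   # after ~4.3 Gyr: effectively zero
print(frac_early, frac_today)
```

Because essentially no samarium-146 remains, the neodymium-142 signature frozen into the Nuvvuagittuq rock can only record events from Earth's first few hundred million years, which is why the dating argument is so powerful if the measurements hold up.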

  5. JOHN BEDDINGTON INTERVIEW

    U.K. Science Adviser Makes His U.S. Debut

    1. Daniel Clery

    WASHINGTON, D.C.—Last week, during his first visit to the United States as the U.K. government's chief scientific adviser, John Beddington sat down with Science's news editors to discuss topics as varied as food, fuel, and physics. Nine months into his job, Beddington has adopted a lower profile than his headline-grabbing predecessor, David King. A population biologist at Imperial College London, Beddington has specialized in applying biological and economic tools to questions of natural resource management, particularly fisheries (Science, 22 June 2007, p. 1713). He's no stranger to politics, having advised the British government, the European Commission, the United Nations Environment Programme, and its Food and Agriculture Organization. Now Beddington must answer questions from the prime minister and Cabinet, as well as coordinate science advice across all government departments and chair a number of committees. The following excerpts from his interview were edited for brevity and clarity.

    Also Online

    Read an extended version of this interview on our daily news site, ScienceNOW.

    Q: If you could put one file in the new U.S. president's in-tray, what would it be?

    J.B.: The message I would probably want to give is the intimate connection between the issues of climate change, food security, energy security, and water security. These issues need mixed approaches; they need a mix of both science and engineering. These issues are tremendously important because they are going to come quite quickly. The sort of demand increases that are to be expected from urbanization, movement out of poverty, and population growth are quite dramatic, on a time scale of only a couple of decades.

    Q: One of David King's goals was to increase the use of science advice across all government departments. Is that job done?

    J.B.: I've done a number of things that are slightly different from David. Every 6 weeks, all of the Chief Scientific Advisors of the major departments dealing with science meet with me and with each other. We form subgroups: One is dealing with climate change and food security issues and another is going to be dealing with infectious diseases. That's a good bit of networking. In addition, this group is now meeting with the chief executives of the research councils every 3 months. You now have a network of essentially everybody who's funding government science meeting on at least a 3-monthly interval. A real community is now starting.

    Q: David King took a very public stance, putting advice into the public domain even when he disagreed with the government. What approach do you favor?

    J.B.: The key thing is that if there's an issue, it needs to be raised. The one that I raised very early on in my tenure was the issue of food security, which I felt had been quite seriously neglected, and the related issue of biofuels. In my first speech [as chief scientific adviser], I raised these issues. Very substantial increases in food prices shortly followed and [there was] a very quick reaction by the prime minister, who raised the issue of food security at the G8 Summit the following summer. Some issues are better raised involving the media and the public at large; others are better talking behind the scenes.

    Q: On biofuels, your concern was the competition for arable lands?

    J.B.: When I first raised [the issue], I made the point that some biofuels were being produced by cutting down rainforests or using permanent grassland, which has a negative effect on greenhouse gas emissions. So you don't want to be doing that. I think that the [U.K. government's] Gallagher Report indicates that there's some need for caution on the development of biofuels within the U.K. and Europe. It's a complicated issue. The information that is available to make a comprehensive assessment of the implications of biofuels is quite inadequate.

    Q: You have said that the world needs to dramatically increase food production, using less water than is used today. Will the world need to embrace GM technology?

    J.B.: Population growth and the increase in wealth implies something like a 50% increase in food demand by 2030. At the same time, the proportion of the population that lives in an urban environment will go up from about 47% to 60%. That means there's going to be some real problems for agriculture. Essentially, about 70% of available freshwater is used by agriculture. There's going to be competition [for water] between urban communities and agriculture relatively close to urban communities. I'm worried about that.

    GM is not going to be the only answer. The knowledge of the plant genome is going to be absolutely critical to improving agricultural production. GM is only one of the techniques that can be used; marker-assisted breeding could be used equally well.

    British advisory.

    John Beddington warns that increasing energy, food, and water demands are vital security issues.

    CREDIT: GREG SCHALER

    Q: The U.K. government is falling behind its own targets for reducing greenhouse gas emissions. How should it catch up?

    J.B.: There's some interesting work that's being done by the government's new Climate Change Committee, which is going to be reporting in December, that is going to answer those questions very specifically. The Energy Technology Institute, funded jointly by industry and government, is looking at operational scale inputs to a whole series of green engineering technologies to address these problems. The big [initiative], which everybody really needs to be addressing, is CCS [carbon capture and storage]. And that really needs very serious investment.

    Q: At U.K. universities, many physics and chemistry departments have closed because of declining student numbers [Science, 12 September, p. 1428]. Should the government intervene to support strategic science subjects?

    J.B.: I think it's absolutely critical that we make certain the STEM agenda works—science, technology, engineering, and mathematics are the subjects that we desperately need students to take A-levels [high school finals] in and go on to do degrees. There has been a downward trend [in undergraduate STEM enrollment], but I think it is actually starting to reverse. One area that has been very successful in reversing this [overall] downward trend has been the Ambassador Scheme, in which we've got something of the order of 20,000 scientists and engineers going into schools, talking to students about their lives and the problems they're actually facing. Now our commitment is to expand that.

  6. GEOPHYSICS

    Solid Rock Imposes Its Will on a Core's Magnetic Dynamo

    1. Richard A. Kerr
    Skewed.

    Uneven heating of a core producing a normal magnetic field (left) concentrates the field (right).

    CREDIT: S. STANLEY ET AL., SCIENCE

    Mariners have been navigating by Earth's magnetic field for centuries. Seismologists detected the fluid-iron core that generates the magnetic field a century ago. But geodynamicists still struggle to understand exactly how the churning of the core's fluid iron generates the field inside Earth. One secret, according to two papers in this issue of Science, may lie in the far slower roiling of the solid rock overlying a planet's core. The authors draw on magnetic fields long frozen into the rocks of Earth and Mars to understand how motions in the solid rock can shape a planet's magnetic field.

    Here on Earth, the frozen fields link the deep-seated magnetic field to plate tectonics at Earth's surface. On page 1800, paleomagnetist Kenneth Hoffman of California Polytechnic State University in San Luis Obispo and geochronologist Brad Singer of the University of Wisconsin, Madison, draw on magnetic fields locked into lavas that solidified in Germany and on Tahiti over the past 780,000 years. Five times during a 200,000-year interval, Earth's magnetic field weakened for thousands of years as if it were about to switch its north and south poles, only to return to full strength without reversing. During each such excursion, magnetic field lines that had been pointing in the usual direction—roughly toward the geographic poles—swung around as if one pole were someplace in Eurasia and the other around western Australia.

    That pole pattern during ancient excursions has a familiar look, Hoffman and Singer note. Mathematically remove today's powerful, axially aligned dipole field—the sort produced by a bar magnet—from Earth's normal field, and the remaining complex but weak field would skew the pole positions in just that way. Hoffman and Singer infer that this field, called the nonaxial dipole (NAD) field, was there three-quarters of a million years ago. Ever since then, the pair argues, something must have kept the molten iron of the core swirling in the same pattern to generate the NAD field.
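"Mathematically removing" the axial dipole is a simple subtraction in the standard Gauss-coefficient description of the field, where the axial dipole term g₁⁰ contributes 2g₁⁰cosθ to the downward radial component and g₁⁰sinθ to the southward component at colatitude θ. The sketch below illustrates the operation; the g₁⁰ value is a round approximation of modern reference-field models, and the "measured" field vector is invented purely for illustration:

```python
import math

# Axial-dipole surface field from the first Gauss coefficient g10:
#   B_r     = 2 * g10 * cos(theta)   (radial component)
#   B_theta =     g10 * sin(theta)   (southward component)
g10 = -29600.0  # nT, rough modern axial-dipole coefficient (assumed value)

def axial_dipole(theta_deg):
    """Axial-dipole (B_r, B_theta) in nT at the given colatitude."""
    th = math.radians(theta_deg)
    return (2 * g10 * math.cos(th), g10 * math.sin(th))

# Hypothetical total field observed at colatitude 40 degrees (invented numbers):
B_total = (-44000.0, -13000.0)  # (B_r, B_theta) in nT

# The nonaxial-dipole (NAD) field is what remains after subtracting the
# axial dipole from the total field:
B_dip = axial_dipole(40.0)
B_nad = (B_total[0] - B_dip[0], B_total[1] - B_dip[1])
print(B_nad)
```

The leftover NAD field is weak compared with the dipole, which is why it normally goes unnoticed; Hoffman and Singer's argument is that during excursions, when the main dipole collapses, this residual pattern is what steers the apparent pole positions.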

    The ultimate stable driving force appears to be plate tectonics. Lots of cold oceanic plates have sunk through the mantle to the top of the core beneath Western Australia. That relatively cold material would cool the underlying core fluids, which would sink, superimposing a weaker but persistent circulation on the one generating the main dipole field. Hoffman and Singer suggest that the field-generating circulations are layered, with the main dipole field generated deep within the outer core and the NAD generated near its top.

    Dynamo specialists say this paleomagnetic argument indicates that mantle rock influences the magnetic field, as modern observations had hinted. “It's very likely the mantle does have a role in the core flow,” says geophysicist Peter Olson of Johns Hopkins University in Baltimore, Maryland, “but it's not that easy to say one [field] is shallow and the other is coming from deep.”

    On Mars, the patches of magnetic field detected from orbit froze into the crust more than 4 billion years ago, not long before the dynamo in the martian core died. Oddly, the patches of field lingering in the northern hemisphere are far weaker than those in the southern hemisphere. The planet's crust also differs between hemispheres. It's thin and low-standing in the north but high and thick in the south. Could the two asymmetries be related? On page 1822, dynamo specialist Sabine Stanley of the University of Toronto, Canada, and colleagues consider the possibility.

    In a dynamo computer model, Stanley and her colleagues made the bottom of the mantle colder in the southern hemisphere than in the north. That would be the temperature pattern imposed on the core by a mantle circulating so as to create the crustal asymmetry: hotter mantle rock slowly rising throughout the northern hemisphere in one great plume—thinning the crust by eroding it—and cooler mantle sinking throughout the southern hemisphere. Researchers have suggested several ways such a mantle circulation might have been created, including a super-giant impact (Science, 11 April, p. 165). Once that temperature pattern was imposed at the base of the model mantle, it induced a circulation in the molten core that generated a magnetic field, but almost entirely in the south and only weakly in the north.

    Creating a lopsided magnetic field is “a significant accomplishment,” says planetary physicist David Stevenson of the California Institute of Technology in Pasadena. But proving that early Mars worked that way will require a better record of early magnetic field behavior, he cautions. Understanding eons-old interactions between the mantle and the magnetic field will take a lot more work.

  7. PROTEOMICS

    Proteomics Ponders Prime Time

    1. Robert F. Service

    Improved technologies for tracking thousands of proteins at once have spawned talk of a full-scale project to reveal all the proteins in each tissue, but the price tag would be daunting.

    Revealing.

    Fluorescent antibodies flag the locations of different proteins in cells, offering clues to those whose functions are unknown.

    CREDITS: UHLEN AND LUNDBERG/THE HUMAN PROTEIN ATLAS

    AMSTERDAM, THE NETHERLANDS—He's too polite to come right out and say it, but Amos Bairoch thinks that much of the data generated by proteomics groups over the past decade is junk. Following the completion of the Human Genome Project, proteomics labs set out to survey all the proteins expressed in different cells and tissues, in essence, putting meat on the bones of the genome. Mass spectrometers and other tools turned out gigabytes of data that purported to identify large numbers of proteins and fed them to Bairoch, who heads Swiss-Prot, a massive database that houses the latest findings on proteins of all stripes. Today, most of those data are ignored, Bairoch says, because the readings were too imprecise to make positive identifications. Over the years, many casual observers of the field dismissed proteomics as a waste of time and money. “People thought [the technology] was ready 10 years ago. But they didn't see good results and got disenchanted,” Bairoch says.

    Today, however, Bairoch's databases and others like them are filling up with terabytes of information that he calls “much better.” The upshot: Proteomics is finally coming of age. With the help of better instrumentation and refined techniques, the top proteomics labs can identify and quantify more than 6000 distinct proteins from individual cells and tissues at a time. Now that these labs can cast such a wide net, many proteomics researchers say the time is ripe to undertake a full-scale human proteome project (HPP) to survey the landscape of proteins present in dozens of different human tissues. If successful, such a project would reveal which proteins are actually expressed in different types of cells and tissues, and at what levels, and the network of proteins they communicate with. That knowledge could offer researchers innumerable insights into how organisms convert their genetic blueprint into life and perhaps lead to breakthroughs in biology and medicine. “We are at the point where we can talk about doing this in 8 to 10 years,” says Mathias Uhlen, a microbiologist and proteomics expert at the Royal Institute of Technology in Stockholm, Sweden.

    It's not just talk. Uhlen and other proteomics leaders gathered here last month to weigh plans for an HPP and to sound out representatives of science funding agencies that would need to pony up the hundreds of millions—if not billions—of dollars needed to pull it off. Most of the responses suggested that tight science budgets make a new mega-sized international science project unlikely anytime soon. Nevertheless, even without a coordinated international HPP, the field is moving so fast that “it's happening already,” says Matthias Mann, a proteomics expert at the Max Planck Institute of Biochemistry in Martinsried, Germany.

    Spotted history

    Many researchers probably assume an international proteome effort started years ago. The availability of the human genome sequence in 2001 told researchers how many proteins are likely to be out there and the exact sequence of amino acids they should look for. The race was on, amid plenty of hype. “Everyone was interested in proteomes,” says Mann.

    But there were problems, lots of them. For starters, proteins are chemically far more heterogeneous and complex than DNA and RNA. It was relatively easy for researchers to create a single, robust, and standardized sequencing technology to decode the genetic blueprint of humanity. But no single machine could tell researchers everything they wanted to know about proteins. Worse, although each cell contains the same complement of genes, the abundance of different proteins varies widely. One milliliter of blood, for example, contains about 1 picogram of cell-signaling molecules called interleukins and about 10 billion times that amount of a protein called serum albumin. Such plentiful proteins can mask the signals of their rare brethren.
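    The gap between those two concentrations is easy to underestimate; the quick arithmetic below checks it, using the article's round numbers rather than measured values:

```python
import math

# Back-of-envelope check of the dynamic range described above;
# the figures are the article's round numbers, not measurements.
interleukin_g_per_ml = 1e-12                      # ~1 picogram of interleukins per mL of blood
albumin_g_per_ml = interleukin_g_per_ml * 1e10    # "10 billion times that amount"

# Serum albumin works out to ~10 milligrams per mL...
print(round(albumin_g_per_ml * 1e3, 6))   # -> 10.0

# ...which is 10 orders of magnitude above the interleukins.
print(round(math.log10(albumin_g_per_ml / interleukin_g_per_ml), 6))   # -> 10.0
```

    That 10-order-of-magnitude span is the same dynamic range cited later for blood as a whole, and it is why abundant proteins can drown out the rare ones.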

    Still, the lure of proteins was undeniable. Whereas genes are life's blueprint, proteins are the bricks and mortar from which it is built. Identify a critical protein in a disease process, and it could serve as a target for a multibillion-dollar drug to fight diabetes or heart disease. Fluctuations in the amounts of some proteins could serve as “biomarkers” to alert doctors to the onset of cancer or Alzheimer's disease. In the early part of this decade, companies flocked to the field, raising and spending hundreds of millions of dollars. But it quickly became clear that the technology was immature. After several years of trudging down blind alleys, most of the companies that were formed to hunt for biomarkers and drug targets either folded or were merged out of existence (see sidebar, p. 1760).

    The news wasn't much better in academia. Take an early example from the Human Proteome Organisation (HUPO), which was launched in 2001 to coordinate international proteomics research and bring order to the unruly field. In 2004, HUPO launched its Plasma Proteome Project (PPP) to survey blood proteins and propel the search for candidate biomarkers. HUPO sent identical blood samples to research groups around the globe, each of which conducted its own analysis with its own homegrown version of the technology. “It was a big disaster,” says John Yates, a chemist and mass spectrometry (MS) expert at the Scripps Research Institute in San Diego, California. “There was no quality control. Then the data came back, and it was just a mess,” he says.

    Unfortunately, PPP and other early efforts raised expectations that they would produce a shortcut for finding novel biomarkers for a wide variety of diseases. “The plasma proteome [project] made the search for biomarkers look like a slam dunk,” says Jan Schnitzer, who directs the vascular biology and angiogenesis program at the Sidney Kimmel Cancer Center in San Diego. “But it hasn't delivered.” That failure and the failure of proteomics as a whole to deliver on its promise, Uhlen adds, “is a history which is still haunting us.”

    HUPO has since promoted uniform standards for everything from how to collect and process blood and tissue samples to the proper methodologies for screening them and analyzing the data. And PPP is now taking a more targeted approach to discovering proteins.

    The standards have helped, but they haven't solved all the problems. A study last year compared the ability of 87 different labs to use MS to identify correctly 12 different proteins spiked into an Escherichia coli sample. No lab got them all, and only one correctly identified 10 of the 12, says Thomas Nilsson, a proteomics researcher who splits his time between Göteborg University in Sweden and McGill University in Montreal, Canada. In a follow-on study completed this year, only six of 24 labs correctly identified 20 spiked proteins. “That again is quite depressing,” Nilsson says. “So what are the chances [for success] of high-throughput proteomics as a distributed effort?” Nilsson asked attendees in Amsterdam.

    Pacesetter.

    Thanks to better mass spectrometers and software, researchers such as Matthias Mann (inset) can now identify thousands of proteins in a single experiment.

    CREDITS: MAX PLANCK INSTITUTE FOR BIOCHEMISTRY

    Perhaps surprisingly, Nilsson says he thinks they are decent. This year's study, he explains, shows that most errors in MS-based analyses arise not because the technology can't spot the proteins researchers are looking for but because software programs often misidentify them.

    A big part of the problem, says John Bergeron, a proteomics expert at McGill University, rests with simple statistics. To identify proteins using MS, researchers first chop a sample of proteins into smaller fragments called peptides. Those peptides are fed into a mass spectrometer, which ionizes them and shoots them through a chamber. The time it takes for the ions to “fly” through the chamber reveals the mass of the peptides, which in turn reveals their identities. Computer programs then compare them with a full list of the organism's genes, which code for those peptides and their proteins. If a peptide matches the coding sequence of only one gene, it is a hit: a unique identifier of that gene's protein.

    The problem is that not all peptides are successfully ionized in each experiment, so some don't enter the chamber. Even if the same lab runs a sample of proteins through the machine twice, Bergeron says, 33% of the proteins identified will appear to be different between the two runs. To minimize such sampling error, MS labs now typically run samples through their machines as many as 10 times. Today, MS groups also look for more than one unique peptide to confirm the identity of a protein. Those changes, together with other emerging standards, show that “these are problems that can be addressed,” Schnitzer says.
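    The matching logic just described can be sketched as a toy lookup; the gene names, peptide strings, and the `min_unique` threshold below are invented for illustration, not drawn from any real database:

```python
# Toy sketch of MS protein identification: a peptide counts as a "hit"
# only if it maps to a single gene, and a protein is called only when
# more than one unique peptide supports it.
from collections import defaultdict

# Hypothetical database: each gene's protein, pre-digested into peptides.
gene_peptides = {
    "GENE_A": {"LVNELTEFAK", "YLYEIAR", "AEFVEVTK"},
    "GENE_B": {"YLYEIAR", "DLGEEHFK"},   # shares one peptide with GENE_A
}

def identify(observed_peptides, db, min_unique=2):
    """Return genes supported by at least `min_unique` unique peptides."""
    # Map each peptide to every gene it could have come from.
    peptide_to_genes = defaultdict(set)
    for gene, peps in db.items():
        for pep in peps:
            peptide_to_genes[pep].add(gene)

    hits = defaultdict(int)
    for pep in observed_peptides:
        genes = peptide_to_genes.get(pep, set())
        if len(genes) == 1:               # unique to one gene: a hit
            hits[next(iter(genes))] += 1
    return {gene for gene, n in hits.items() if n >= min_unique}

# The shared peptide "YLYEIAR" cannot identify either gene on its own,
# but GENE_A still has two unique peptides and is confidently called.
print(identify({"LVNELTEFAK", "YLYEIAR", "AEFVEVTK"}, gene_peptides))
# -> {'GENE_A'}
```

    Requiring several unique peptides per protein, as the labs now do, is what suppresses the spurious single-peptide matches that plagued the early surveys.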

    A new approach

    Such successes are also convincing proteomics leaders that the technology is mature enough to go after a full-scale HPP. Although details remain in flux, the generally agreed-upon plan is to identify one protein for each of the estimated 20,400 human genes. Bairoch reported at the meeting last month that Swiss-Prot has logged what is currently known about each gene, such as the primary proteins a particular gene produces and their function. Proteins for about half of the genes have never been seen, Bairoch stated.

    There are far more proteins than genes, because a single gene's transcripts can be spliced in multiple ways, and once synthesized, proteins can be cut down in size or modified with other chemical groups. Trying to find all those variants in all tissues is a task that will likely take decades, Uhlen says. Sticking to one protein for each gene provides a defined endpoint to the project and would create a “backbone” of all human proteins that can be continually fleshed out.

    Another possible goal is to create one antibody for every protein in the HPP. Because antibodies typically bind to one target and nothing else, researchers can use them to fish out proteins of interest and track their locations in cells and tissues. That would offer clues to the functions of the thousands of proteins for which little is known. Uhlen and colleagues in Sweden launched just such a global antibody project in 2005. And in Amsterdam, Uhlen reported that the catalog now contains more than 6000 antibodies against distinct human proteins, more than one-quarter of the complete set. At the current rate of new antibody production, Uhlen says his team will finish the task in 2014. More money, he says, would undoubtedly speed the effort.

    A third project would track which proteins “talk” to one another. To find a protein's partners, researchers create thousands of identical cell lines and insert into each one a chemical tag linked to a different protein. They can use the tag to pull that protein out of the cell at a specific point in its life cycle, along with any other proteins, bits of RNA or DNA, or a metabolite that it is bound to. Bioinformatics experts can then weave together the partners for each protein to construct a complete communication network of the proteins in the cell.

    Such protein-interaction networks have been worked out in exquisite detail in yeast and other organisms. But it has been hard to insert the chemical tags reliably into human cell lines. Over the past decade, however, researchers around the globe have shown that different lentiviruses readily insert tagged proteins into a wide variety of human cells. At the meeting, Jack Greenblatt of the University of Toronto in Canada said he has proposed a project to insert one tagged protein for each of the 20,400 genes, the first step to a complete human proteome interaction map. The project is now under review by Genome Canada, the country's national genome sciences funding agency. Greenblatt adds that working with human cell lines isn't perfect, because these lines are typically made up of immortalized, non-normal cells. His group is therefore also performing related studies in mice, which can be grown into adult animals, so that their interaction networks can be compared with those found in the human cell lines. Other projects could be added to HPP as funding permits. They could include a catalog of all the modified proteins, such as splice variants and phosphorylated proteins, Bergeron says.

    Finding the money

    How much will it take to complete the wish list? Opinions vary, but somewhere in the neighborhood of $1 billion is a common guess. Michael Snyder, a yeast biologist at Yale University, thinks that's too little. “This is going to require a bigger budget than that,” he says.

    Whatever the projection, it was enough to make those with the money blanch. Funding agencies around the world are already collectively spending hundreds of millions of dollars on proteomics technologies and centers. They're also already committed to several international big biology projects such as the International HapMap, the International Cancer Genome Consortium, and the Knockout Mouse Consortium, which are putting the squeeze on tight budgets. “From a funding viewpoint from the U.S. context, now is not the right time,” says Sudhir Srivastava, who directs proteomics initiatives at the U.S. National Cancer Institute in Rockville, Maryland. “If this was 5 years ago when the NIH [National Institutes of Health] budget was doubling. …” Srivastava trails off.

    Still, Uhlen and others say they are hopeful that funding agencies will keep the field moving quickly. “We don't have to have $1 billion from the start,” Uhlen says. “With the Human Genome Project, it took 5 years for the funding agencies to put serious money into it. I don't think we should expect funding agencies to jump on board until we have proven the technology.”

    CREDIT: JENS LASTHEIN

    To do that and make the cost more palatable, HUPO leaders are mulling a pilot project to catalog all the proteins produced by chromosome 21, the smallest human chromosome, which has 195 genes. Although the cost of such a project isn't known, “I think there almost certainly would be interest,” says Roderick McInnes, director of the Institute of Genetics at the Canadian Institutes of Health Research in Ottawa.

    At the meeting, proteomics expert Young-Ki Paik of Yonsei University in Seoul, South Korea, said the Korean government is considering funding a similar proposal for a Korean-based pilot project on chromosome 13, the second-smallest human chromosome, with 319 genes. Paik says he and his colleagues have proposed a 10-year, $500 million initiative that is currently being considered by the Korean Parliament. A decision is expected in October. If it is funded, Bergeron says it will be a major boost to the field and could help catapult Korea into the forefront of proteomics.

    Some researchers are skeptical of going chromosome by chromosome, however. “In gene sequencing, that approach worked,” Bairoch says. “You could separate out the work by chromosome. But it doesn't make sense for proteins. There is no [body] fluid or [tissue] sample organized by chromosome.” Ruedi Aebersold, an MS expert with a joint appointment at ETH Zurich and the Institute for Systems Biology in Seattle, Washington, agrees. “I'm not a big fan of going chromosome by chromosome,” he says. MS machines, he notes, identify whatever proteins show up regardless of the chromosomes they came from.

    Whatever path they take to an HPP, proteomics leaders will need to find true believers beyond those already in the flock. “The biology community at large has to show they really need this,” Bairoch says. “If they can't, why should they fund this?” Uhlen, Bergeron, and other HUPO leaders agree. And they argue that current demonstrations of the technology are starting to build the case.

    At the Amsterdam meeting, for example, Mann reported that recent advances in instrumentation and software have enabled his group to identify the complete yeast proteome in one shot—in just a few days. That feat took months of painstaking effort when it was first accomplished by traditional methods 5 years ago. Mann also described the use of a technique his team first reported last year to monitor changes in the yeast proteome, including levels of individual proteins, between two different states. In one example, Mann's team compared yeast cells with a diploid (double) set of chromosomes to cells with the haploid (single) set undergoing sexual reproduction. The study quantified for the first time the suite of proteins that orchestrate sexual reproduction in yeast. Mann says the technique opens the door to studying proteomewide differences between healthy and diseased cells, developing and mature cells, and stem cells and differentiated cells. “There is no end to what you can compare,” Mann says. “Every lab can ask these questions.”

    In another study, Uhlen reported using his antibodies to track global protein expression in human cells. He and his colleagues have shown that fewer than 1% of all proteins are expressed in only one tissue. That implies, he says, that tissues are differentiated “by precise regulation of protein levels in space and time, not by turning expression on and off.” Aebersold also reported that his lab has devised a scheme for detecting proteins expressed at the level of just a single copy per cell.

    “These are unbelievable advances, and they show we can take on the full human proteome project immediately,” Bergeron says. Not everyone has turned that corner, but Bergeron and others say that they are confident that time is coming soon. As Pierre LeGrain, director of life sciences at the French Commissariat à l'Energie Atomique in Gif-sur-Yvette, sums it up: “Most of us feel the human proteome project is going to happen, though we don't know how.”

  8. PROTEOMICS

    Will Biomarkers Take Off at Last?

    1. Robert F. Service

    After years of disappointments, proteomics researchers say they're cautiously optimistic that they will be able to detect proteins that are markers for specific diseases.

    One much-heralded application of proteomics—detecting proteins that are markers for specific diseases—has long been a dream deferred. “It has been extremely difficult to find those proteins that are biomarkers,” says Ruedi Aebersold, a proteomics expert at the Swiss Federal Institute of Technology in Zurich, Switzerland, and the Institute for Systems Biology in Seattle, Washington. But after years of disappointments, proteomics researchers say they're cautiously optimistic.

    When proteomics caught fire earlier this decade, scientists hoped that mass spectrometry (MS) and other technologies would help them sift through the thousands of proteins in blood and other body fluids to identify a rare protein that indicated the presence of a disease. Researchers could then use these biomarkers to spot diseases in their formative, treatable stages. But the demise of several companies, such as GeneProt and Large Scale Biology, that jumped into the field revealed that nailing down biomarkers is harder than it sounds.

    One problem is that blood—the most common hunting ground—is difficult to work with. Levels of different proteins in blood vary by 10 orders of magnitude, and the abundant proteins often mask the presence of rare ones. Unfortunately, mass spectrometry, the best tool for casting a wide net to search for proteins, hasn't been sensitive enough to spot the rare ones. “Most clinically used biomarkers are at nanogram [per milliliter] levels or below,” Aebersold says. At the Amsterdam meeting, Aebersold reported a new strategy for targeting protein fragments called N-linked glycopeptides, which commonly make up cell-surface receptors and thus are more likely to be shed into the blood. This targeting allowed Aebersold's team to spot proteins down to nanogram-per-milliliter levels and thereby track them to look for possible links to diseases. Aebersold says he's hopeful that similar, more focused, studies will improve prospects for the biomarker hunters.

    Needles in a haystack.

    The abundance of different proteins in blood varies by more than 10 orders of magnitude. Most commercially used biomarkers (yellow dots) are present in only minute quantities in blood, below the level at which most proteins are detected (red dots).

    CREDIT: RALPH SCHIESS/INSTITUTE OF MOLECULAR SYSTEMS BIOLOGY, ETH ZURICH

    Better instrumentation won't solve all the problems. Techniques such as MS that survey thousands of different compounds inevitably turn up false positives: proteins that change their abundance in lockstep with a disease just by chance. That means candidate biomarkers must be validated through clinical trials, which can cost tens of millions of dollars—and most of them fail. “To be accepted by [regulatory] agencies, it's almost as costly as developing a new drug,” says Denis Hochstrasser, the director of laboratory medicine at Geneva University Hospital in Switzerland. Because diagnostics companies, unlike drugmakers, typically can't charge lofty premiums for their new tests, they have less incentive to develop biomarker tests. Michael Snyder, a proteomics expert and cell biologist at Yale University, says that despite these challenges, he's hopeful that improving proteomics technologies will generate novel biomarkers—“just not on the same time frame as people thought.”

  9. ELECTION 2008

    Scientists Strive for a Seat at the Table of Each Campaign

    1. Jeffrey Mervis

    When it comes to soliciting scientific advice, Barack Obama welcomes a cast of thousands, whereas John McCain plays it close to the vest.

    Harold Varmus has met Senator Barack Obama only once. But he's convinced that the Democratic presidential nominee “understands the important role that science must play in tackling the problems we face as a society.” To prove it, the president of Memorial Sloan-Kettering Cancer Center in New York City points to the candidate's promise of “sustained and predictable increases in research funding” at the major federal science agencies.

    It's no surprise that the politically active Varmus, the 1989 medicine Nobelist and former director of the U.S. National Institutes of Health (NIH), is familiar with Obama's statements on funding basic research: He helped write many of them as chair of a 40-plus-member committee of prominent researchers and educators who are advising the freshman senator from Illinois on science. The panel prepared the candidate's 6000-word response last month to 14 questions posed by a coalition of scientific organizations called Science Debate 2008 (ScienceDebate2008.org). Varmus won't say how much the answers were altered by campaign officials but allows that “we're very pleased with it. His commitment to science is absolutely apparent.”

    Last week, Obama's Republican opponent, Senator John McCain (AZ), provided equally lengthy answers to the same set of questions. Douglas Holtz-Eakin, who serves as the candidate's point man on many domestic policy issues, including science, health, energy, and the environment, says McCain has contacted experts on issues such as climate, space, and “science in general” but has “no formal structure” for soliciting advice. An economist and former head of the Congressional Budget Office under President George W. Bush, Holtz-Eakin says McCain relies instead on the knowledge acquired during his 26 years in Congress, including 6 years as chair of the Senate Commerce, Science, and Transportation Committee.

    The way the answers were prepared reflects the different management styles of the two campaigns. “Obama has thousands of advisers, and McCain has two guys and a dog,” cracks one academic lobbyist who requested anonymity because his organization tries to maintain ties with both camps.

    The answers themselves—on research funding, science education, climate change, energy, space exploration, and other issues—reflect different political philosophies. Obama tends to assign government a larger role in tackling those problems—a $150 billion plan for energy independence, for example, and an $18 billion plan to improve education. McCain, in contrast, combines his $30 billion clean-coal program with talk about the need to curb spending and rely on the private sector.

    For many U.S. academic researchers, presidential politics comes down to two big issues: getting more money for science and having a seat at the table. The first requires agreement between the president and Congress, however, and any promise to increase research spending could easily be derailed by the Iraq war, an ailing economy, and rising health care and energy costs. That puts a premium on the second issue, namely, the appointment of people who will make the key decisions in the next Administration.

    Many voices.

    Harold Varmus (far left) and other leaders joined Barack Obama at a June economic roundtable at Carnegie Mellon University.

    CREDIT: CARNEGIE MELLON UNIVERSITY

    Indeed, three independent panels stuffed with science mavens have recently issued reports* emphasizing the importance of choosing an assistant to the president for science and technology soon after the election. They say that person, who would also head the Office of Science and Technology Policy (OSTP), should be part of the president's inner circle and play a major role in vetting appointments to dozens of other key science positions throughout the government.

    “We had drafted white papers on several issues, but the presidents [of the three academies] worried that nobody would pay attention to them,” says E. William Colglazier, executive officer of the U.S. National Academy of Sciences, which joined with the engineering academy and the Institute of Medicine in issuing a report last week on the appointment process. “They felt it was more important that the next president get very good people into key positions.” Working backward, scientists reason that the more interaction with a candidate before election day, the greater the chance that he will act quickly and fill those posts with highly qualified people.

    Since declaring his candidacy in February 2007, Obama has welcomed those interactions. He has solicited the views of troves of experts and created a vast network of advisers. “They didn't ask us to take a blood oath,” says Varmus, who endorsed Obama with the Democratic nomination still hanging in the balance. But he says “it's a reasonable assumption” that most of the advisers also support his candidacy.

    Varmus's panel, which includes medicine Nobelist Peter Agre and physics Nobelist Leon Lederman, is one of 20 or so advisory bodies. (The Obama campaign declined to provide a number.) Paul Kaminski, a top Pentagon official during the Clinton Administration, is heading up an eight-person group on defense science that is examining work-force, training, and acquisition issues. He'll also be representing Obama next month at a National Academy of Engineering forum on grand challenges, opposite Carly Fiorina, the former CEO of Hewlett-Packard who was once on McCain's list of possible running mates. Membership among the groups overlaps: Kaminski recently joined the science panel, for example, and Lederman also serves on a small group examining science education.

    With regard to scientific input in a McCain Administration, Holtz-Eakin promises that McCain will be vigilant in ending what critics have called the Bush Administration's war on science. “He'll restore credibility and transparency” to the process, says Holtz-Eakin, in part by filling all six statutory positions at OSTP. Still, Holtz-Eakin knows that he's addressing a skeptical audience. “You can't convince people that you'll make sure they have access. You have to demonstrate it,” he told Science.

    Convened this summer, Obama's science group has held weekly teleconferences to field questions from the campaign staff and inject into the campaign issues that it feels are important. In preparing answers to Science Debate's 14 questions, the panel's most visible product, members sifted through Obama's past statements, added their own perspectives, and delivered answers to Jason Furman, Obama's director of economic policy, via his deputy, Larry Strickling.

    The panel's fingerprints are evident in the nuanced responses that Obama offers. To a question about how basic research would fare in a competition for scarce funds, for example, Obama discusses the declining success rate among applicants for NIH grants, the resulting pressure on young scientists, and the erosion of the agency's buying power after a succession of flat budgets that followed a 6-year doubling from 1998 to 2003. In such an environment, he adds, scientists are less inclined to take risks. “This situation is unacceptable,” he declares, offering as the solution a 10-year, across-the-board budget doubling in the physical and life sciences, mathematics, and engineering.

    McCain is less sanguine than Obama about the likelihood of large increases. “I have supported increased funding at DOE [Department of Energy], NSF [National Science Foundation], and NIH for years,” he notes in his Science Debate reply, “and will continue to do so.” But he warns that “with spending constraints, it will be more important than ever to ensure that we are maximizing our investments in basic research.” And his answer omits mention of any numerical goal. In an interview last month on National Public Radio, Holtz-Eakin said any call for doubling science agency budgets is “a nice, fun number … that doesn't reflect a balancing of political priorities.”

    Tight team.

    Douglas Holtz-Eakin (inset) has been the chief spokesperson for John McCain on most domestic issues.

    CREDITS (LEFT TO RIGHT): CRAIG LITTEN/AP; DENNIS COOK/AP

    In fact, it's hard to pin down either candidate on how quickly he would like to increase federal funding for basic research. Making a video appearance this month during a cancer research telethon, Obama promised to double the budget of NIH, including the National Cancer Institute, in 5 years. That's twice the rate described in his answers to Science Debate, which came out in August and have become the mantra for campaign surrogates. He also supports the 2007 America COMPETES Act (ACA), which is silent on NIH but which would put NSF and DOE's Office of Science on a 7-year doubling track.

    As it happens, those figures are in line with historical trends. Between 1962 and 2003, for example, the NIH budget doubled roughly every 8 years, in current dollars, and NSF's budget doubled roughly every decade between 1970 and 2000.

    A statement on McCain's Web site also promises to “fully fund” the provisions of the COMPETES Act, which authorizes spending levels that have not been met in subsequent appropriations bills. Holtz-Eakin told Science that McCain “is on the record as supporting ACA” and that, if elected, his 2010 budget would reflect those targets in the physical sciences. Taking a jab at Obama's expansive promises for increased spending in research and other domestic areas, Representative Vern Ehlers (R-MI) predicts the U.S. research enterprise will be better off under a McCain Administration, despite its more modest promises, because “he's more likely to find the money.” But Ehlers, one of three physics Ph.D.s in Congress and a staunch supporter of science, admits that McCain hasn't sought his advice on the topic. (His colleague, Representative Rush Holt (D-NJ), has spoken for Obama, although during a recent interview with Science he deferred several questions to the campaign staff.)

    Obama's aides and outside advisers play down the discrepancies in Obama's statements on NIH doubling while at the same time perpetuating them. Domestic policy director Neera Tanden, who joined the campaign this summer after many years advising Senator Hillary Clinton (D-NY) on health-care issues, says a 5-year doubling of the NIH budget “is the right thing to do” and that it is needed to keep pace with the rapid advances in the field. Tanden also says the disruptions caused by a stagnant NIH budget after the previous doubling aren't inevitable. “There's no reason to assume you would have another crash landing,” she says.

    Gilbert Omenn, a professor of medicine and public health at the University of Michigan, Ann Arbor, and a former president of AAAS (which publishes Science) who serves on Obama's science advisory panel, acknowledges that the different timetables “are very awkward” and that the candidate's promises “add up to a lot of commitments.” But he's confident that Obama “will be able to figure out the best combination of variables to allow for a sustained investment.”

    In the end, of course, promises are only that. “Remember, it's a campaign, not governance,” notes Lederman when asked if his group expects to have an impact on Obama's education policies if he takes office in January. A seat at the table may be a better bet, says Kaminski. “I would expect some of [his defense advisers] to take key positions in his Administration.” That is, if they turn out to have bet on the winning candidate.

    * The reports were done by the Woodrow Wilson International Center for Scholars (“OSTP 2.0 Critical Upgrade,” at wilsoncenter.org); the Center for the Study of the Presidency (“Presidential Personnel and Advisory Requirements for Science and Technology,” at thepresidency.org); and the three national academies (“Science and Technology for America's Progress: Ensuring the Best Presidential Appointments,” at nationalacademies.org).

  10. AGING

    Searching for the Secrets of the Super Old

    1. Mitch Leslie

    More and more people are living past 110. Can they show us all how to age gracefully?


    They were born when the years still started with “18.” They survived global traumas such as World War I, World War II, and the Great Depression. They didn't succumb to pandemic flu, polio, AIDS, Alzheimer's disease, or clogged arteries. Supercentenarians, or people who've survived to at least age 110, are longevity champions.

    Living to 100 is unlikely enough. According to one estimate, about seven in 1000 people reach the century milestone. And at that age, the odds of surviving even one more year are only 50-50, says James Vaupel, director of the Max Planck Institute for Demographic Research in Rostock, Germany. Making it from 100 to 110 “is like tossing heads 10 times in a row.”
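    Vaupel's coin-toss analogy can be chained with the article's other round numbers: about seven in 1,000 people reach 100, and ten consecutive 50-50 years get a centenarian to 110. The sketch below multiplies those figures out; it assumes, as the analogy does, a flat 50% annual survival rate past 100, which real mortality data only approximates.

    ```python
    # Chain the article's round numbers: ~7 in 1,000 reach 100, and each
    # further year of survival is roughly a 50-50 coin toss (Vaupel's
    # analogy). Illustrative arithmetic, not actuarial rates.

    p_reach_100 = 7 / 1000        # "about seven in 1000"
    p_100_to_110 = 0.5 ** 10      # ten heads in a row

    p_reach_110 = p_reach_100 * p_100_to_110
    print(f"P(100 to 110) = {p_100_to_110:.6f} (~1 in {1 / p_100_to_110:.0f})")
    print(f"P(birth to 110) = {p_reach_110:.2e} (~1 in {1 / p_reach_110:,.0f})")
    ```

    Ten coin tosses work out to odds of 1 in 1,024, so only about one centenarian in a thousand makes it to 110.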

    Researchers are keen to investigate these 19th century holdovers. “If we want to better understand the determinants of longevity, we have to look at the oldest old,” says biodemographer Jean-Marie Robine of INSERM's demography institute in Montpellier, France. With Vaupel, he has recently compiled a demographic database of verified supercentenarians from the industrialized countries.

    Two other projects, led by researchers on the opposite coasts of the United States, hope to pin down the traits of these survivors by surveying their genomes for longevity-promoting DNA sequences and by autopsying them when they finally die. Ultimately, work on supercentenarians could uncover “a unique [genetic] variation that explains their longevity that can be the subject of drug development,” says molecular geneticist Nir Barzilai of Albert Einstein College of Medicine in New York City. Such a discovery might not stretch human life span, but it could make our final years less grueling, suggests Barzilai.

    Yet studying supercentenarians is no easy task. Finding these one-in-a-million people is hard enough, and validating their ages can require that researchers become detectives themselves or hire one.

    Come on, how old are you really?

    Figures on the number of supercentenarians are shaky. The 2000 U.S. census claimed a total of 1400 living in the country. That number is much too high, says geriatrician Thomas Perls of Boston University School of Medicine, head of the New England Centenarian Study and its new National Institutes of Health-funded spinoff, the New England Supercentenarian Study. Researchers suspect that some of the oldsters included in the tally had already died and that others—or their relatives—were lying about their ages. Drawing on Medicare enrollment figures, two U.S. government actuaries put the number of supercentenarians in the year 2000 at a mere 105. And in 2002, 139 people claiming to be at least 110 were receiving Social Security payments.

    Turn of the century.

    The world's oldest living person is 115-year-old Edna Parker (right). Daniel Guzman (left) reached 111 before dying earlier this year.

    CREDITS: (TOP LEFT TO RIGHT) COURTESY OF GUZMAN FAMILY; R. YOUNG/NEW ENGLAND SUPERCENTENARIAN STUDY

    “Claiming” is a key word. A crucial part of studying supercentenarians is proving that they were or are their stated age. No biochemical test or medical exam can peg how old somebody is. So researchers often turn to Robert Young of Atlanta, Georgia, a self-taught documents guru who confirms the ages of the world's oldest people for Guinness World Records. Young comes across like a veteran insurance adjuster who's seen all the scams. To weed out pretenders, he requires three types of verification: proof of birth, preferably a birth certificate; proof of death, if the person is no longer alive; and “continuity” documentation, such as a driver's license or marriage certificate, that shows that the putative supercentenarian is the person listed in the birth record. If candidates or their families can't provide corroboration, Young sleuths through census rolls, school and military records, genealogies, and other types of paperwork.

    Cutting for clues.

    L. Stephen Coles leads a group that has performed most of the autopsies on supercentenarians.

    CREDIT: COURTESY OF L. S. COLES

    Using these methods, an organization called the Gerontology Research Group verifies the ages of living supercentenarians and posts a list online (http://www.grg.org/). Young is senior claims examiner for the group, which is headed by L. Stephen Coles of the University of California, Los Angeles, an ob-gyn and computer scientist by training. As of last week, the roster included 10 men and 68 women from 12 countries, ranging up to 115 years old. For reasons that remain murky, most supercentenarians are women. Moreover, of the oldest people ever documented, the majority have been women, including the record-holder Jeanne Louise Calment of France, who died in 1997 at the age of 122.

    To obtain a more complete count of supercentenarians for demographic analyses, Vaupel, Robine, and colleagues have dug into national archives, including the records of the U.S. Social Security Administration, to compile lists of candidates in 15 industrialized countries. A team of age checkers then vetted each case. In all, the new International Database on Longevity caches information on nearly 1000 supercentenarians from the past 50 years, although not every country's records span this entire range. The researchers plan to publish a monograph on the database later this year.

    But that still won't be the final word. A lack of good records in developing nations means that researchers still know little about the numbers of supercentenarians worldwide, says demographer Bertrand Desjardins of the University of Montreal in Canada: “It's anyone's guess how many supercentenarians are living in China.”

    How to grow old in style

    Scientists have gotten a few hints about what keeps centenarians alive for so long—genes associated with a beneficial lipid profile, for example (Science, 17 October 2003, p. 373)—but they're just beginning their search for the sources of supercentenarian longevity. Two years ago, Perls and colleagues published the first health survey on these so-called supers, reporting on 32 people between the ages of 110 and 119. “I think it's incredible how well off they are,” says Perls. Although almost half of the supers had osteoporosis and almost 90% had cataracts, 41% of them either lived on their own or required only minimal help with tasks such as preparing food, dressing, and bathing. Cardiovascular disease, the leading killer in developed countries, was rare among supercentenarians—only 6% had suffered heart attacks and 13% reported strokes. Diabetes and Parkinson's disease were also uncommon in the group, striking only 3% of the subjects each. Like centenarians, supercentenarians seem to be good at putting off the day when they become disabled, says Perls.

    The superseniors deviate from the norm not just in how long they live but in how they die, says Coles, who arranges autopsies of the oldest old as part of his work with the recently established Supercentenarian Research Foundation. Only nine supercentenarians have undergone postmortems—Calment, for example, never agreed to one—and Coles and colleagues have performed six of these procedures, including one earlier this year in Cali, Colombia, on a man who died at age 111.

    Coles argues, based on these autopsies, that supers aren't perishing from the typical scourges of old age, such as cancer, heart disease, stroke, and Alzheimer's disease. What kills most of them, he says, is a condition, extremely rare among younger people, called senile cardiac TTR amyloidosis. TTR is a protein that cradles the thyroid hormone thyroxine and whisks it around the body. In TTR amyloidosis, the protein amasses in and clogs blood vessels, forcing the heart to work harder and eventually fail. “The same thing that happens in the pipes of an old house happens in your blood vessels,” says Coles.

    Perls and colleagues have also shown that extreme survival runs in supercentenarians' families. Repeating an analysis they did earlier for centenarians, the researchers last year analyzed life spans of the siblings and parents of supercentenarians from the United States. The team compared the relatives' longevity with that of people born in the same year. Brothers of supers gained about 12 to 14 years over their contemporaries, whereas sisters outlasted their counterparts by about 8 to 10 years. A family connection doesn't mean that only genes are responsible for supercentenarians' great age, Perls cautions. Everything from diet to exercise habits can also run in families—the analysis can't distinguish between genetic and environmental factors.

    But Perls's current work might. He and his colleagues have collected blood samples from 130 authenticated supercentenarians and have sequenced DNA from 100 of them. As early as this fall, the team could be ready to submit a paper on gene variants that might be stretching supercentenarians' lives, he says. Moreover, because the research team also has data on the supers' past health and lifestyles, it might be able to statistically tease apart environmental and genetic influences on the oldsters' life spans.

    The Supercentenarian Research Foundation has similar ambitions. In addition to the autopsies this nonprofit group of doctors and researchers has conducted, Coles and his colleagues have obtained a few blood samples and plan to start collecting more next year. However, the effort, which is operating on donations, won't have enough money to sequence DNA from the samples. They will go into the freezer, but Coles says the shrinking costs of sequencing technology should soon make reading the DNA affordable.

    Saved by a SNP?

    Tom Perls (right) and colleagues are scanning DNA from people like 110-year-old Mary Marques for longevity clues.

    CREDIT: R. YOUNG/NEW ENGLAND SUPERCENTENARIAN STUDY

    The two projects will be sharing Young's age-checking services but nothing else. Perls says he declined to collaborate with Coles's group in part because some of its members are involved in so-called antiaging medicine, whose practitioners claim to be able to alleviate time's ravages with treatments such as injections of human growth hormone (HGH) (Science, 8 February 2002, p. 1032). The rationale is that the hormone's blood levels normally dwindle as we age. But Perls has blasted this off-label use of the hormone—it's only approved for children with stunted growth and adults with pituitary tumors or other rare conditions—as not only unproven and potentially unsafe but also illegal in the United States.

    Working quietly outside that fray, Vaupel and colleagues plan to use their new database to answer a question that's been nagging demographers and actuaries: Do the odds of dying in a given year, which rise relentlessly for most of adult life, taper off in the most senior seniors? Demographers want to determine whether the death rate stabilizes so they can test their models of mortality, whereas actuaries need the answer to help governments refine budgets for health care and pensions. If mortality does peak or even begin to decline in very old age, it could mean that people who live past 110 really are super, stronger than the rest of us, Vaupel says.

    If centenarians are any guide, researchers will find that supercentenarians have varying backgrounds, lifestyles, and genetic profiles. But as Robine notes, they share one factor: luck. Calment provides a prime example. She outlived her husband, daughter, and grandson. They died from non-aging-related causes—the husband from food poisoning, the daughter from pneumonia, and the grandson in a car accident. So if you hope to reach the big 110, keep a rabbit's foot handy.
