News this Week

Science  12 May 2006:
Vol. 312, Issue 5775, pp. 824
  1. NASA BUDGET

    Crisis Deepens as Scientists Fail to Rejigger Space Research

    Andrew Lawler

    COLLEGE PARK, MARYLAND—With too many missions and not enough money, NASA's $5.5 billion science program is in a terrible fix. A 5-year plan that would cancel projects nearing completion, decimate disciplines, and slash funds to analyze data so upset space science researchers when NASA released it in February that officials gave the community an unprecedented shot at coming up with something better. But the scientists who met here last week as members of a newly expanded NASA advisory committee couldn't agree on an alternative approach that wouldn't bust NASA's proposed budget for 2007. That failure could leave the fate of the program to the whims of Congress.

    Unwise choices?

    NASA's budget woes could mean the end of several space science projects, including the Wide-Field Infrared Space Explorer (left) and a planned Mars Scout mission (right).

    CREDITS: JPL/NASA

    The precarious state of U.S. space and earth sciences has become clear in the past several months, as several costly birds have come home to roost. The problems—the need for more money to get the space shuttle flying again, the White House push for a new launcher to send humans to the moon, and rising costs in science projects such as the James Webb Space Telescope—are not chicken feed. And NASA Administrator Michael Griffin accepts a portion of the blame. “I made a mistake,” Griffin told NASA's new science advisory panel. “I made commitments in advance that I wasn't able to keep,” he said, referring to his 2005 promise not to shift money from science to human space flight. NASA's current budget request would trim more than $3 billion from space science through 2011.

    A separate effort to confront the crisis came in a 4 May report from a National Academies' National Research Council (NRC) panel. The group, chaired by Lennard Fisk, an atmospheric scientist at the University of Michigan, Ann Arbor, concludes that the program is “fundamentally unstable [and] seriously unbalanced” and that it will fall far short of the research goals laid out in earlier academy surveys. Both the committee and the NRC report say the space agency should reverse proposed cuts to research grants, restore small missions, and move quickly to control spiraling costs. But neither tells NASA which programs or missions to cut. Both groups also criticized the agency for failing to consult regularly with researchers.

    The gathering of the advisory panel at the University of Maryland last week was intended to remedy that situation and come up with concrete solutions to NASA's fiscal crisis. Dividing themselves into four groups—earth sciences, astrophysics, heliophysics, and planetary science—the 70 members set out to devise an alternative budget. But they were stymied by financial and legal hurdles. When it came time to discuss the fate of the 2011 Scout balloon mission to Mars, for example, a half-dozen members recused themselves because they had proposals pending. “We can't very well make a decision to cancel the Scout mission after all the qualified people have left the room,” said a frustrated Sean Solomon, a planetary geologist at the Carnegie Institution of Washington and the subcommittee chair. “We're going to punt; our hands are tied by legal restrictions.”

    Despite much grumbling about NASA's planned cuts, the panels could not reach agreement on a different set of priorities. William Smith, president of the Washington, D.C.-based Association of Universities for Research in Astronomy, warned that canceling or deferring flagship missions would hurt the health of the research community, noting that three of NASA's large observatories in turn award $70 million a year in small grants. Physicist Glenn Mason of the University of Maryland, College Park, argued on behalf of small missions, saying they can provide focused data in a relatively short period. And NASA's acting earth science chief Bryant Cramer cast a vote for midsize spacecraft, which he says provide a great deal of affordable science.

    The panel adjourned without reaching a consensus but agreed to meet again in July for additional discussions. Simultaneously, it will help NASA come up with a long-term science strategy, which Congress wants delivered by December.

    The NRC report—an independent study also requested by Congress—hammered at NASA's management of science missions, which “are being executed at costs well in excess of the costs estimated at the time when the missions were recommended.” Whereas the report urged NASA to undertake detailed cost evaluations of all its missions, the advisory panel complained that some of those overruns are due to new safety requirements imposed by NASA. In fact, the only suggestion from either the advisory committee or the NRC panel about how to save money involved reducing overhead by removing some of the hurdles proposed missions must clear before launch. “Right now, we are simply too risk-averse,” says Cramer, a longtime project manager. Griffin agrees that the agency must reduce red tape, and late last month, in a speech to industry, he urged companies and his staff to come up with less costly ways of doing business.

    NASA officials, however, remain up against an immediate budget wall. They say they are considering canceling the Wide-Field Infrared Space Explorer, a $300 million mission well along in the planning. Also hanging by a thread is the Stratospheric Observatory for Infrared Astronomy, a joint project with Germany set for a first flight sometime next year.

    Scientists are hoping that Congress will step in to save the day by providing more money than the agency requested for the fiscal year that begins on 1 October. But given competing interests, lawmakers' concerns about the growing federal deficit, and the departure next month of NASA's key ally Representative Tom DeLay (R-TX), that hope may prove illusory. And without clear direction from the science community, the missions that survive may be the ones with the strongest political allies.

    In the meantime, Griffin pledges to listen more closely to scientists. He spent several hours at the advisory committee meeting answering questions and chatting informally with committee members. “I'm not the world's best communicator,” he told them. But “we don't get out of bed, drive to headquarters, and try to screw the program up. … We're not out to do a Lone Ranger act.”

  2. GLOBAL CHANGE

    No Doubt About It, the World Is Warming

    Richard A. Kerr

    Global warming contrarians can cross out one of their last talking points. A report released last week* settles the debate over how the atmosphere has been warming the past 35 years. The report, the first of 21 the Bush Administration has commissioned to study lingering problems of global climate change, finds that satellite-borne instruments and thermometers at the surface now agree: The world is warming throughout the lower atmosphere, not just at the surface, about the way greenhouse climate models predict.

    “The evidence continues to support a substantial human impact on global temperature increases,” added the report's chief editor Thomas Karl, director of the National Climatic Data Center in Asheville, North Carolina. The additional support for global warming will not change White House policy, however. Michele St. Martin, spokesperson for the White House Council on Environmental Quality, says President George W. Bush believes that greenhouse gas emissions can be brought down through better use of energy while the understanding of climate science continues to improve.

    Critics who blasted research under the White House's Climate Change Science Program (CCSP) (Science, 27 February 2004, p. 1269) as mere obfuscation might not have expected such a forthright conclusion from the report. Karl attributes the clarity to the CCSP approach. “For the first time, we had people [who initially disagreed] sitting down across the table. That's a tremendous advantage,” he says. “The process is great for improving understanding. It led to not just synthesis but to advancing the science.” The CCSP synthesis and assessment process prompted new, independent analyses that helped eliminate some long-standing differences, Karl says.

    The 21 authors of the report included researchers who for years had been battling in the literature over the proper way to analyze the satellite data. Meteorologists John Christy and Roy Spencer of the University of Alabama, Huntsville, were the first to construct a long record of lower-atmosphere temperature from temperature-dependent emissions observed by Microwave Sounding Units (MSUs) flown on satellites. By the early 1990s, Christy and Spencer could see little or no significant warming of the middle of the troposphere—the lowermost layer of the atmosphere—since the beginning of the satellite record in 1979, although surface temperature had risen.

    A decent match.

    Warming of the lower atmosphere as measured from satellites (yellows and oranges, top) now resembles surface warming (bottom) measured by thermometers.

    SOURCE: NOAA

    In recent years, report authors Frank Wentz of Remote Sensing Systems in Santa Rosa, California, and Konstantin Vinnikov of the University of Maryland, College Park, led separate groups analyzing the MSU data. They and others found atmospheric warming more on a par with the observed surface warming (Science, 7 May 2004, p. 805). Hashing out those differences over the same table “was a pretty draining experience,” says Christy.

    In the end, the time and effort paid off, says Karl. The report authors eventually identified several errors in earlier analyses, such as not properly allowing for a satellite's orbital drift. They had additional years of data that lengthened a relatively short record. And they could compare observations with simulations from 20 different climate models, which researchers had prepared for an upcoming international climate change assessment. The report authors found that over the 25-year satellite record, the surface and the midtroposphere each warmed roughly 0.15°C per decade averaged over the globe, give or take 0.05°C or so per decade. The tropics proved to be an exception: There, the models called for more warming aloft than at the surface, whereas most observations showed the reverse. Reconciling that discrepancy will have to wait for the next round of synthesis and assessment.

  3. INTELLECTUAL PROPERTY

    Decision on NF-κB Patent Could Have Broad Implications for Biotech

    Ken Garber*
    *Ken Garber is a science writer in Ann Arbor, Michigan.

    In what one patent expert called a potentially “huge, huge case,” a federal jury last week unanimously upheld a biotechnology patent that critics describe as exceptionally broad. If the verdict survives appeal, it could set a new precedent for the enforcement of patents on biological discoveries upstream of actual drugs.

    Contrary to some predictions (Science, 31 March, p. 1855), on 4 May, a Boston jury ruled that Eli Lilly's osteoporosis drug Evista and sepsis drug Xigris infringed a patent held by the Massachusetts Institute of Technology (MIT), Harvard University, and the Whitehead Institute and licensed exclusively to Ariad Pharmaceuticals, a Cambridge, Massachusetts, biotech company. The jury awarded at least $65.2 million in back royalties to Ariad, which could continue collecting 2.3% of sales of the two drugs until the patent expires in 2019.

    The patent covers methods for inhibiting NF-κB, a protein discovered 20 years ago at MIT by David Baltimore, now president of the California Institute of Technology in Pasadena, with help from fellow Nobel Prize winner Phillip Sharp and Harvard biologist Thomas Maniatis. (Sharp and Maniatis both testified for Ariad at the trial.) Because NF-κB, a prolific “transcription factor” that turns more than 175 other genes on and off, is so important in biology and disease—it has also been implicated in arthritis, cancer, diabetes, and stroke—the Lilly case could be the first of many involving the protein. Hundreds of compounds, including many drugs already on the market, are known to inhibit NF-κB.

    High-profile witness.

    Nobel Prize winner Phillip Sharp, who helped discover NF-κB 2 decades ago, testified for MIT and Ariad Pharmaceuticals in the patent-infringement trial.

    CREDIT: DONNA COVENEY/MIT

    It is that broad reach that has prompted debate. Ariad CEO Harvey Berger calls the patent claims “very specific” and typical for both industry and academia. “We had a very strong, crystal-clear case,” he says. Law professor Arti Rai of Duke University in Durham, North Carolina, on the other hand, calls Ariad's NF-κB patent “a very broad patent.” She says that an ultimate Ariad victory would herald a major change in the patent landscape, because previous decisions by the federal appeals court have led to the assumption that biotech patents must be narrow. If the Ariad patent survives appeal, “conventional wisdom gets thrown out the window,” Rai says. Lilly spokesperson Philip Belt is more outspoken, calling the verdict “shockingly inconsistent with current patent law.”

    The patent still faces several legal hurdles. The case in Boston does not end with the jury verdict; a separate trial will be held by federal Judge Rya Zobel to decide certain legal challenges to the patent's validity and enforceability. Lilly vows to appeal last week's verdict if the judge rejects these arguments. And in late April, Amgen, a biotechnology company in Thousand Oaks, California, filed suit against Ariad to invalidate the patent and certify that its blockbuster arthritis drug Enbrel, and a second arthritis treatment, Kineret, don't infringe. Amgen spokesperson David Polk called the lawsuit “a preemptive move,” because the company expected Ariad to eventually sue over Enbrel and Kineret. Berger won't comment on the Amgen claims except to say they're without merit and that licenses are available to commercial entities. (Academic scientists do not need a license, he stressed.)

    Berger considers the jury verdict “good for academic research, good for universities, and in the end, good for … discovering new drugs, because it speaks to important technology.” But Rai sees it differently. Asked whether the verdict could hinder innovation in the drug industry, she replied: “If, as a precedent, it then led to lots of upstream players deciding that they would try to follow the lead of Ariad and try to cash in on their upstream patents, [then] yes, I think it could.”

  4. SCHOLARLY PUBLISHING

    Bill Would Require Free Public Access to Research Papers

    Jocelyn Kaiser

    A proposal to require federally funded scientists to make their accepted papers freely available online within 6 months of publication has reignited a bruising battle over scientific publishing. The bill, introduced last week by senators John Cornyn (R-TX) and Joseph Lieberman (D-CT), would make mandatory a voluntary National Institutes of Health (NIH) policy and extend it to every major federal research agency, from the National Science Foundation (NSF) to the Department of Defense.

    Spreading the word.

    Proposal would extend NIH's free archive model to other agencies.

    Supporters argue that so-called public access should extend beyond biomedical research. “The ramifications for the acceleration of science are the same,” says Heather Joseph, executive director of the Scholarly Publishing and Academic Resources Coalition, which represents libraries. Many publishers disagree, saying that there is no evidence of an unmet public demand for nonbiomedical papers. They warn that extending NIH's policy to other disciplines could seriously harm societies that rely on journal subscription and advertising revenues to run their organizations.

    The Federal Research Public Access Act of 2006 (S.2695) follows on a 1-year-old NIH policy that asks researchers to submit accepted papers to NIH for posting in PubMed Central, NIH's full-text archive, within 12 months of publication in a journal. House and Senate appropriations committees had asked NIH to develop such a policy after patient groups argued they should have free access to biomedical studies.

    The request has been ignored by most NIH grantees: A January report by NIH noted that fewer than 4% are complying. An NIH advisory committee has recommended that the policy be mandatory and that the 12-month limit be reduced to 6 months for most journals. The Cornyn-Lieberman bill would require NIH to make those changes.

    But the bill also would mandate a similar plan at any U.S. agency funding at least $100 million a year in extramural research. That includes NSF, NASA, the Department of Energy, and even the Department of Transportation. The manuscripts could be posted in existing archives, such as a university server or arXiv, the physics preprint server. However, agencies would have to maintain a bibliography of all the papers they funded with links to full texts. This will give “students, researchers, and every American” access to research results, says Cornyn, which “will help accelerate science, innovation, and discovery.”

    Some publishers argue that there's no evidence the public is as interested in, say, high-energy physics papers as in health research. “You're just expanding this willy-nilly on the assumption that there's the same clamor,” says Allan Adler, vice president for legal and governmental affairs for the Association of American Publishers. Martin Frank, executive director of the American Physiological Society, argues that if the bill became law, it could be especially damaging to “small niche area” journals in disciplines such as ecology that have not yet experimented much with open-access journals that recoup publication costs from authors rather than subscribers.

    Observers don't expect the bill to be passed this year, but they anticipate a push to make the NIH policy mandatory. The 6-month deadline is also controversial: NIH Director Elias Zerhouni recently testified that he is sympathetic to publishers' desire for a 12-month delay.

    In the meantime, NSF plans to add citation data to the Web-based descriptions of each award in response to a February report by its inspector general that said “other science agencies have done much more than NSF” to tell the public what it gets for its money. The report said NASA and the Defense Department already make available the full texts of some journal articles.

  5. CONDENSED MATTER PHYSICS

    Solid Hydrogen Not So Super After All

    Adrian Cho

    Strike hydrogen from the list of possible “supersolids.” Its conceptual cousin solid helium may flow bizarrely like a liquid with no viscosity, but solid hydrogen does not, say physicists who had reported that it might. “Nature has its way of having fun with us,” says Moses Chan of Pennsylvania State University in State College, who alerted dozens of colleagues to the negative result this week.

    In 2004, Chan reported signs that a crystal of the isotope helium-4 could flow freely through itself, possibly confirming a long-hypothesized phenomenon known as supersolidity. Last year at a meeting, Chan and graduate student Anthony Clark presented data that suggested solidified molecular hydrogen flowed the same way (Science, 8 April 2005, p. 190). The notion was plausible because atoms of helium-4 and molecules of hydrogen are both “bosons”: particles with a quantum-mechanical proclivity to bunch up.

    Forthright.

    Moses Chan e-mailed colleagues news of the negative result for hydrogen.

    CREDIT: A. CHO/SCIENCE

    To check the unpublished result for hydrogen, Chan and Clark ran a series of control experiments. In one, the researchers set a can full of frigid solid hydrogen twisting back and forth on the end of a thin shaft. Below a certain temperature, some of the hydrogen seemed to let go of the can and flow effortlessly through the rest of the solid, causing the frequency of twisting to increase. But when Chan and Clark blocked the path of the hypothetical flow, the frequency jump persisted. That observation suggests some other effect, such as a rearrangement of the molecules within the solid, causes the jump.
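
    Why does decoupling show up as a frequency jump? A torsional oscillator is a simple harmonic system, so its frequency depends on how much mass actually twists with the can. A minimal sketch of the relation, writing κ for the torsion stiffness of the shaft and I for the moment of inertia of the oscillating load (notation chosen here for illustration, not taken from the researchers' report):

    \[ f = \frac{1}{2\pi} \sqrt{\frac{\kappa}{I}} \]

    If a fraction of the solid stops moving with its container, I drops and f rises. Genuine superflow should have vanished, and the frequency dropped back, once the flow path was blocked; the persistence of the jump is what points to another cause.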

    Chan deserves credit for his scientific integrity in quickly announcing the negative result, says Humphrey Maris of Brown University. “He's been completely open from the beginning,” Maris says. “He certainly hasn't overstated his claims at any point.”

    Solid helium-4 passed both that control experiment and several others. So supersolid helium remains a tantalizing—and controversial—possibility.

  6. U.S. SCIENCE POLICY

    Senate Panel Chair Asks Why NSF Funds Social Sciences

    Jeffrey Mervis

    Why is the National Science Foundation (NSF) funding a study of a women's cooperative in Bangladesh? Why are U.S. taxpayers footing the bill for efforts to understand Hungary's emerging democracy? And why are social scientists even bothering to compile an archive of state legislatures in a long-gone era when those legislators chose U.S. senators?

    Senator Kay Bailey Hutchison (R-TX), chair of a panel that oversees NSF and a member of the powerful Senate Appropriations Committee, put those and other sharply worded questions to NSF Director Arden Bement last week during an unusually combative hearing on the agency's 2007 budget request. Hutchison signaled that she will be taking a hard look at NSF's $200-million-a-year social and behavioral sciences portfolio, which funds some 52% of all social science research done by U.S. academics and some 90% of the work by political scientists. Hutchison made it clear during the 2 May hearing that she doesn't think the social sciences should benefit from President George W. Bush's proposal for a 10-year doubling of NSF's budget as part of his American Competitiveness Initiative (Science, 17 February, p. 929). And she suggested afterward to Science that she's open to more drastic measures.

    Warning shot.

    Senator Kay Bailey Hutchison (R-TX) questions the value of some NSF-funded research.

    CREDIT: COURTESY OF THE ALBERT & MARY LASKER FOUNDATION

    “I'm trying to decide whether it would be better to put political science and some other fields into another [government] department,” she said. “I want NSF to be our premier agency for basic research in the sciences, mathematics, and engineering. And when we are looking at scarce resources, I think NSF should stay focused on the hard sciences.”

    Last week's hearing was not the first time Hutchison has taken a shot at NSF's support of the social sciences. In a 30 September 2005 speech honoring the winners of the annual Lasker medical research awards, she backed a doubling of NSF's budget but added that social science research “is not where we should be directing [NSF] resources at this time.” Hutchison tipped her hand a few months before the hearing by asking NSF officials for abstracts of grants funded by the Directorate for Social, Behavioral, and Economic Sciences (SBE) going back several years. But the harshness of last week's attack caught the community by surprise, leaving social scientists and their supporters scratching their heads about how best to respond.

    “In some ways, it's SBE that tackles the most challenging scientific questions, because its research investigates people's behavior and touches on the most sensitive issues in our society,” noted Neal Lane, a physicist and former NSF director now at Rice University in Houston, Texas. “So I'm not surprised that it's been hard to articulate how it connects to innovation and improving the nation's competitiveness.” Aletha Huston, a developmental psychologist at the University of Texas, Austin, who wrote a letter to Hutchison before the hearing defending NSF-funded work by herself and colleagues at UT's Population Research Center, points out that “if you want to understand how to remain competitive, you need to look at more than technology, … at the organizational and human issues that play a role.”

    Hutchison says she hasn't decided how to translate her concerns into legislation. One option would be to limit spending for the social sciences in the upcoming 2007 appropriations bill for NSF. Another approach would be to curtail the scope of NSF's portfolio in legislation enacting the president's competitiveness initiative or reauthorizing NSF's programs.

    In the meantime, says sociologist Mark Hayward, who heads the UT population center, it would be a mistake for social scientists to ignore her concerns. “We have to be persistent and consistent in our message,” says Hayward, who along with Huston hasn't heard back from Hutchison. “We can't just say, ‘My goodness, she's not paying attention.’”

  7. CANADA

    Research Budgets Are Tight Pending Science Policy Review

    Wayne Kondro*
    *Wayne Kondro writes from Ottawa, Canada.

    OTTAWA—It's an axiom of Canadian politics that new governments denounce the absence of a national science and technology (S&T) strategy, call for such a strategy to be developed, spend years creating the plan—and then get booted out of office. So why should Prime Minister Stephen Harper's new minority Conservative government be any different?

    Tough times.

    Prime Minister Stephen Harper (standing) keeps a lid on Canadian research in his new budget.

    CREDIT: CHRIS WATTIE/REUTERS

    Unveiling last week its first budget since being elected in January, the Harper government put S&T relatively low on its list of fiscal priorities but said it planned to develop a new research policy based on demonstrating “value for money.” In the meantime, the 2.4% increase proposed for the nation's three granting councils pales next to a 5% rise in overall government spending. The new budget, for the fiscal year that began 1 April, leaves the research councils with the unpleasant prospect of coping with a rising number of applications by chopping the number or size of awards or both, scaling back targeted programs, and at the same time, expending time and money to argue their case in the next review.

    The Harper government sees it differently, of course. Returned to power after a 13-year absence, the Conservatives lamented the dire lack of a sound plan for investing in science and said a new national science policy should be based on determining “value for money” in the councils' grants. Officials say they have no preconceived notion of how to determine whether a research grant yields an adequate return. But Canadian Association of University Teachers Executive Director James Turk is worried that the exercise hides a “malevolent” attempt to gut basic research in favor of industrially relevant science.

    The government's $210 billion budget cuts taxes while bolstering Canada's military and domestic security forces. As promised, the Conservatives have gutted climate change programs once designed to meet Canada's Kyoto Protocol commitment to reduce greenhouse gas emissions. The government says it will develop its own “made in Canada” solutions this fall.

    The Canadian Institutes of Health Research (CIHR) gets a trickle-down $3.6 million a year from a 5-year, $900 million bump for “pandemic preparedness” against the avian influenza virus. The boost will supplement a tiny $15 million increase in the agency's $630 million operating budget, the same percentage increase awarded the $607 million Natural Sciences and Engineering Research Council and the $213 million Social Sciences and Humanities Research Council. CIHR President Alan Bernstein says the small rise fails to take advantage of academic investments by the previous Liberal government in more staff and the global recruitment of top scientists: “It all lands on our doorstep.” Those programs will continue even if resources to fund research by those scientists are inadequate.

    Still, Bernstein welcomes the S&T policy exercise. “We're not entitled to that money because of some preordained law,” he says. “I think we have an obligation to demonstrate value for money.”

  8. SCIENTIFIC PUBLISHING

    A Call to Improve South Africa's Journals

    Robert Koenig*
    *Robert Koenig is a contributing correspondent in South Africa.

    PRETORIA, SOUTH AFRICA—In the highly competitive field of research publishing, South Africa is a giant on its continent but a dwarf in the world. A new report by the national science academy concludes that about half of the country's 255 accredited research journals have virtually no impact abroad and less than a tenth of them are even indexed on international citation lists.

    The report by the Academy of Science of South Africa—a landmark as the first academy report done at the government's request—recommends that agencies tighten their accreditation of journals and take steps to make the strongest ones more influential and more accessible via the Internet. “In a developing country like South Africa which is marginalized by the ‘journal power’ in the United States and Europe, focusing support on journals that could be world players would make a big difference in how research is conducted and published,” says the academy's executive director, biochemist Wieland Gevers, who chaired the panel that compiled the report. He expects it to trigger debate about how to make South African research more influential.

    Quality control.

    Wieland Gevers says only the best journals should get support.

    CREDIT: ACADEMY OF SCIENCE OF SOUTH AFRICA

    Critics say the current system, in which the education department rewards universities with subsidies based on the number of publications their researchers produce, has led to an overabundance of weak journals. To help snare these subsidies, some universities support journals that publish mainly work by their own professors that has little or no impact abroad.

    Microbiologist Molapo Qhobela, chief director for higher education policy at South Africa's education department, says, “This is an important topic, and we will take the recommendations very seriously.” That may include reassessing the education department's current criteria for accrediting journals, which now require that they be peer-reviewed and include contributors and editorial board members from “beyond a single institution.”

    Gevers says the report will be discussed at a meeting in Pretoria this week and at a series of seminars this year. “A good case can be made for robust and competitive local science publishing,” Gevers says, “but we think journals should seek international indexing or develop niches that lead to recognition outside of South Africa.”

  9. GLOBAL HEALTH

    Polio Eradication: Is It Time to Give Up?

    Leslie Roberts

    A handful of experts have reluctantly concluded that polio may never be wiped out. They are arguing that control may be a better goal than eradication

    Stepping up.

    The Nigerian government is recommitted to eradicating polio, but the virus is still circulating out of control in the north.

    CREDIT: CHRIS HONDROS/GETTY IMAGES

    Isao Arita was a believer. In the 1960s and 1970s, he was a crusader in the campaign against smallpox, the only disease ever eradicated. In 1990, he took on polio, directing the campaign that eliminated that scourge from the Western Pacific in 1997. Much of his long and distinguished career—at the World Health Organization (WHO) in Geneva, Switzerland, and the Agency for International Health in Kumamoto, Japan—has been predicated on his faith in medicine's ability to triumph over viruses.

    So it is with great seriousness that he says that he no longer believes it is feasible to wipe out polio—not in 2006, and probably not ever.

    And he is not alone. Like a handful of other longtime supporters of eradication, Arita has begun to go public with his doubts. On page 852, he and his colleagues write that the 18-year, $4 billion campaign has brought enormous public good, reducing polio cases from 350,000 in 1988 to just shy of 2000 in 2005. But the old adage about the last few percent being the hardest is coming true in spades.

    Since polio exploded out of Nigeria in 2003, the virus has reinfected some 18 previously polio-free countries, many of them unforgiving, conflict-torn places such as Sudan and Somalia, where it is simply too dangerous to send in health workers. And despite “heroic” efforts to achieve the highest vaccination rates ever, the virus is hanging on in the slums of India and has seeded outbreaks in four countries, most recently Bangladesh. Nor does the virus show signs of budging from the shared reservoir between Pakistan and Afghanistan.

    “However diligent they are, however much the staff does its best, there are very serious obstacles that militate against eradicating polio,” agrees Donald A. Henderson, the outspoken director of the earlier smallpox program and one of the few to question the feasibility of polio eradication from the start.

    The skeptics, who include not only Henderson and Arita but also polio experts such as Konstantin Chumakov of the U.S. Food and Drug Administration and Vadim Agol of the Russian Academy of Medical Sciences' Chumakov Institute for Poliomyelitis (named after Konstantin's father), worry that the campaign is deluding itself and the world with its “ever-receding” deadline—originally 2000 and now reset at 2006. Says Henderson, who is now at the University of Pittsburgh's Center for Biosecurity in Baltimore, Maryland: “It is always 12 or 18 months away from where we are.” And they contend that the program leaders are not paying sufficient attention to policies needed to control, rather than eradicate, polio over the long term—which would be a major accomplishment in its own right.

    True, the case count looks bad, concedes David Heymann, another smallpox veteran who in 2002 was brought in to head the multiagency polio effort, headquartered at WHO. Global cases were higher in 2005 than in any year since 1999, and 2006 is shaping up to be even worse in Nigeria and India. But surveillance is also more sensitive, which could explain some of the increases, he says. He insists that overall, the campaign is racking up solid victories. Outbreaks have been dramatically curtailed in all but nine of the 22 countries reinfected with polio since 2003. And the number of endemic countries—where transmission has never stopped—is down to four, an all-time low, he contends. Heymann, who runs the program with Bruce Aylward of WHO, extols the benefits of an improved, more targeted version of the oral polio vaccine. He and Aylward cite the enthusiasm and commitment of donors such as Rotary International, the G8, and the Gates Foundation—and of the polio-affected countries themselves. And they maintain that it is feasible to stop transmission of wild poliovirus in 2006 everywhere except Nigeria, which may take another year and a half, and perhaps one corner of India.

    But optimism is no substitute for a contingency plan, counter the skeptics. And so the debate continues—respectful, increasingly public, and with no sign of resolution.

    A reasonable target

    Even now, most agree that the 1988 decision to eradicate polio made scientific sense. After all, the world had eradicated smallpox, and there seemed to be no overwhelming scientific obstacles to wiping out polio as well. The virus is spread from human to human, which means there's no chance of it lurking in an animal reservoir. As with smallpox, there was an effective vaccine—two, in fact: the live oral Sabin polio vaccine (OPV) and the inactivated Salk polio vaccine (IPV). The World Health Assembly endorsed the concept in 1988, setting the world on a course to wipe out polio by 2000 and then, once the threat of the virus's return was deemed negligible, to stop all control measures, as had occurred with smallpox.

    It soon became clear, however, that polio would be even tougher to eradicate than smallpox, which Henderson has said was eradicated “just barely,” with a lot of luck. With smallpox, there was no question who was infected, as everyone developed a telltale rash. Polio, by contrast, circulates “invisibly,” causing paralysis in just one in every 100 to 200 people infected. Polio is caused by an enterovirus that replicates in the gut before sometimes invading the nervous system; it is excreted in the stool and predominantly spread by fecal-oral contamination.

    And although the Sabin OPV adopted for the mass campaign proved very effective—it contains a live, attenuated virus that is also excreted in stool and thus confers immunity on people not directly vaccinated—it has decided drawbacks. The smallpox vaccine usually worked with just one shot, recalls Henderson: “The take rate was 95% to 98%, consistently, with one dose.” But with OPV, “you need five, six, seven doses to be protected.”
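
    The contrast in dose counts follows from simple arithmetic. As a rough illustration, assume each dose “takes” independently with some probability p (an assumption adopted here for convenience, not taken from the article); the chance of protection after n doses is then

    \[ P \approx 1 - (1 - p)^n \]

    A smallpox-like take rate of p ≈ 0.95 protects after a single dose, whereas a hypothetical per-dose rate of p ≈ 0.4 needs six doses to push protection past 95%, consistent with the five to seven doses Henderson describes for OPV.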

    Other serious downsides have come to light. A cluster of polio cases in Hispaniola in 2000-'01 confirmed that the virus used in the Sabin vaccine can, in rare instances, regain its ability to circulate and trigger an outbreak. Scientists also discovered by chance that some immune-compromised people can shed virus for years—without showing any symptoms—and that the virus can be extremely virulent. “One man in England excreted virus for 20 years,” says Henderson. Given those dangers, the global campaign advocates that OPV use be discontinued when and if transmission of the wild virus is halted.

    Polio warriors.

    David Heymann (left) and Bruce Aylward, who run the global campaign, are unwavering in their belief that polio can be eradicated.

    CREDIT: P. VIROT/WHO

    Still, it proved relatively easy to stamp out the disease in the United States and other countries with good hygiene and good health care systems. Developing nations were tougher, as the virus thrives in crowded, unsanitary environments (Science, 26 March 2004, p. 1960). In Latin America in the 1980s, the Pan American Health Organization fine-tuned the mass vaccination strategy known as National Immunization Days, during which volunteer vaccinators fan out across the country to deliver polio drops to every child under age 5. By repeating campaigns several times a year and aggressively mopping up after any outbreak, the reasoning went, countries could boost immunity enough to knock out the virus. The last indigenous case in the Americas occurred in 1991. Next was the Western Pacific region, where Arita led the effort, which interrupted transmission in 1997, followed by Europe in 1999.

    In the process, almost inadvertently, the campaign knocked out one of the three serotypes of wild poliovirus—type 2—which has not been seen since 1999. “That is a big achievement,” says Eckard Wimmer, a virologist at Stony Brook University in New York. Since then, circulation of type 3 has also been considerably curtailed. It is now confined to small areas of four countries, albeit tough ones, given their crowding and poverty: India, Pakistan, Afghanistan, and Nigeria. The polio-eradication team now thinks a sequential strategy, using new “monovalent” vaccines targeted against specific serotypes (Science, 28 October 2005, p. 625), might do the trick. “We want to get rid of type 1 first, then type 3,” explains Heymann.

    Meltdown in Nigeria

    Social and political problems are, however, overwhelming the campaign's scientific strategy, the skeptics point out. Take the case of Nigeria, the most populous country in Africa, and one with an abysmal health care system. (Only about 13% of Nigerian children are routinely vaccinated against childhood diseases.)

    In mid-2003, amid allegations that the polio vaccine was contaminated with the AIDS virus or tainted with hormones designed to sterilize Muslim girls, several states in the northern part of the country halted polio vaccination. The virus, which was already circulating in the region, found fertile ground in the growing number of unimmunized children. By the end of 2004, the number of known cases had doubled to about 800, and the virus quickly spread across Nigeria's porous borders, taking root wherever it encountered a susceptible population (see map).

    One step forward.

    The number of polio cases dropped from 350,000 in 1988 to a low of about 500 in 2001. But a 2002 outbreak in India, followed by a disastrous setback in Nigeria in 2003-'04, has sent cases climbing.

    Although Nigeria resumed vaccination about a year later, after intense lobbying and repeated tests to confirm the vaccine's safety, the virus still rages out of control. Nigeria poses a “grave threat” to the world, says Heymann. Recent analyses suggest that in five northern states, the immunization campaigns are missing more than 40% of children, and incidence is four times higher than at the same time last year. “With such high levels of transmission, … an additional 12 to 18 months of intensive activities may be required to interrupt polio,” a 1 May update from the eradication campaign warned.

    Outside Nigeria, the major problem is not so much opposition, although vaccinators still encounter it, as access. “In the Congo, between one-third and one-half [of the country] is just not accessible. You have roaming soldiers, lots of fighting in the eastern third, and it's a huge area,” says Henderson. “Similarly, for Côte d'Ivoire, Angola, Afghanistan near Kandahar, … it is not possible to work there.”

    “Security is a big issue,” concedes Heymann. Although the number of reported cases there is low, poliovirus remains entrenched in a corridor between Pakistan and Afghanistan. “The virus keeps going back and forth” between the two countries, notes Heymann, not far from where U.S. forces continue to hunt for Osama bin Laden. “Our external monitors can't get in.”

    Another “great risk” is Somalia, where the virus resurfaced around Mogadishu in July 2005. According to genetic sleuths at the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia, one of the partner agencies in the campaign along with UNICEF and Rotary International, the virus came to Somalia from Nigeria, by way of Yemen.

    Meanwhile, experience in India is suggesting that in some circumstances the virus can survive even saturation campaigns. Vaccination coverage in India has never been higher, says Heymann, who notes that the country is “pounding it,” conducting nine huge campaigns last year and three already this year, to the tune of $120 million. And for the past year, vaccinators have been supplementing the standard trivalent OPV with monovalent vaccine against type 1 and, more recently, type 3.

    But still, cases are being reported in Uttar Pradesh and Bihar, areas of wrenching poverty. Monitoring has confirmed that vaccinators are reaching most children; many are getting six or seven doses of vaccine a year. Epidemiologists suspect that one reason the vaccine isn't working is that the children are infected with other enteroviruses that compete with the vaccine in the gut. And because many children have chronic diarrhea, the vaccine simply doesn't stay in the body long enough to provide sufficient immunity.

    “Some pockets [of transmission] are damn near impossible,” says Ellie Ehrenfeld, a polio expert at the U.S. National Institutes of Health in Bethesda, Maryland, who also advises WHO on its program. “We really don't understand why mop-ups don't knock out the virus in these areas.”

    House calls.

    Going door-to-door to deliver polio drops, as this vaccination team is doing, isn't feasible in a corridor between Afghanistan and Pakistan, where the virus is entrenched.

    CREDIT: SHAH MARAI/AFP/GETTY IMAGES

    Meeting in Delhi in early May, India's expert advisory group decided to pound the virus even harder with monthly vaccination campaigns in the worst-afflicted parts of Uttar Pradesh and Bihar. Health workers in Uttar Pradesh will also test the feasibility of delivering a dose of OPV to all newborns in the most resilient areas of transmission within 72 hours of birth—before they become infected with competing viruses—to see whether that boosts seroconversion rates.

    Redefining success

    In light of these setbacks, as well as disconcerting evidence that the virus can circulate undetected even longer than people feared, prospects for stopping transmission seem grim indeed, say the skeptics. Henderson notes that last year in Sudan, surveillance turned up a strain that had been circulating silently for 5 years—while the country was labeled “polio-free.” No one is ready to say emphatically that eradication is impossible, but Arita and his colleagues write that the goal is “unlikely to be achieved.”

    “I have no way to predict what will happen in the next 5 years, but I don't think polio will disappear,” says Wimmer.

    Uphill climb.

    Costs have skyrocketed as the polio eradication initiative has had to fight near-simultaneous outbreaks in multiple countries.

    SOURCE: POLIOERADICATION.ORG

    Ehrenfeld has become increasingly worried in the past few years but says she is not ready to abandon all hope. “At what point do you say we are going to give up on polio? I don't know,” she asks. “Maybe these problems can be solved,” she adds, noting that there was a “fair amount of progress this past year. … But it would take a very long time, much longer than anyone now expects. … And the world is tired.”

    And since the 2000 deadline has passed, costs have skyrocketed. It's “mind-boggling” what these massive mop-ups are costing, Ehrenfeld says. The global initiative spent almost $700 million in 2005, nearly double what it spent in 2000, and up from $600 million in 2004. One reason the world bought into the huge eradication program in the first place was the promise of money to be saved by stopping vaccination, Ehrenfeld notes—a prospect that looks increasingly unlikely. Several of the skeptics suggest that some of the vast amounts of money and energy going toward wiping out every last case of polio might be better spent increasing routine immunization against all vaccine-preventable diseases.

    Unfortunately, says Chumakov, there seems to be no inclination among the program leadership to reassess whether an eradication campaign still makes sense. They “press on as if nothing had happened, as if it were 1988,” says Chumakov, who calls them “captives of their own advertising. … Every year is the final one. This can't continue forever.” He adds that the program should be proud of what it has achieved, and the world should “declare victory now.”

    In his Policy Forum in this issue, Arita urges that the reassessment begin. “The time has come for the global strategy for polio to be shifted from eradication to effective control,” he writes. Henderson agrees. “Let's create a program to keep it [polio] under moderate control and say that is the best we can do.”

    The experts differ, however, on what, exactly, such a control strategy would consist of and even which vaccine—OPV, the more expensive inactivated vaccine used in wealthy countries, or a still-to-be-invented one—should be used. But any scenario, they agree, involves incorporating polio vaccine into routine immunization—which would need to be strengthened considerably and augmented with one or several special immunization weeks a year to keep up immunity. And vaccination would need to continue indefinitely, they agree. Arita and colleagues recommend continuing emergency campaigns with OPV until global cases drop below 500 and the number of nations with polio drops below 10 and then switching to a control strategy. Which vaccine to use would be reassessed in 2015.

    Even if transmission of wild poliovirus could be stopped, vaccination will still be needed, adds Chumakov. One problem, as Henderson points out, is the difficulty of ever knowing for sure that the virus is gone. What's more, if immunization ceased, the world's population would soon become profoundly vulnerable to a reintroduced poliovirus, whatever its origins—whether a vaccine-derived strain, or one that escaped from a vaccine manufacturing plant, or a synthetic version released by a terrorist.

    The risks are well understood and are manageable, responds Heymann. He adds that policies on whether to vaccinate posteradication are still wide open to debate, which he welcomes, noting that both Henderson and Arita were his bosses in the earlier smallpox campaign. “Nothing is cast in stone,” Heymann says.

    As for stopping transmission of wild poliovirus, there is no question. “We have to finish,” he insists. “It would be injurious to the world's population and to its $4 billion investment to throw up our hands and say we are going back to routine immunization. … As long as the partners and countries are willing to make the effort, it is not for Isao [Arita] or me to say that eradication is not feasible.”

    And although it would be wonderful if polio could be controlled through routine immunization, as Arita and others propose, Heymann argues that it's simply not feasible. To keep polio in check, routine coverage would have to be maintained at consistently high levels—90% if IPV were used—and many parts of the world are not even close to achieving that. “If we had 90% or greater coverage, polio would probably have disappeared on its own,” says Heymann.

    Meanwhile, Heymann and his colleagues say they have an eradication program to run, and things are looking up. Not only are most countries committed and making progress, but “there are a whole series of things we are doing to improve” as well. For instance, the program is supporting development of a rapid diagnostic test that would enable countries to respond to outbreaks much more quickly. The state of Uttar Pradesh, India, will be testing a birth dose to see whether it boosts immunity. On the political front, Heymann just came back from Kabul, where the Afghan president reiterated his support, and the United Nations' Kofi Annan is committed to helping with security.

    “As long as there are things we haven't tried, the polio team remains optimistic.”

  10. SCIENTIFIC PUBLISHING

    A Cure for the Common Trial

    David Grimm

    A new journal aims to alleviate bias in clinical trials reporting, but some question whether it's the remedy the field needs

    CREDIT: CHRIS HONDROS/GETTY IMAGES

    On the excitement spectrum, results from the LOTIS trial rank right alongside “New soil fungus identified.” In the study, a Dutch team takes 402 85-year-olds and gives half access to an occupational therapist, who teaches them how to use walkers and apply for household help. The point is to see whether such interventions slow the onset of age-related disabilities. They do not.

    Ordinarily, a study with negative results like this wouldn't see the light of day in a medical journal—at least not a top-tier one. But the Public Library of Science (PLoS) aims to be different. It's using the LOTIS study to launch its new journal, PLoS Clinical Trials, which begins publishing on 19 May.

    The journal's credo is simple: Disappointing results can still be good news. Its editors have explicitly stated that all clinical trials submitted—regardless of outcome or significance—will be published, as long as they are methodologically sound. The policy takes aim at a pervasive problem in the clinical trials literature: a heavy skew toward studies with positive outcomes. Some say there's a “black hole” where studies with negative or ambiguous outcomes should be.

    This bias can cost lives. In a particularly lethal example, a pharmaceutical company shelved a 1980 clinical trial that showed that a prophylactic heart attack drug did more harm than good. Thirteen years later, the researchers involved in the trial finally published the study to illustrate the warning it could have provided: Estimates suggest that—in the intervening years—hundreds of thousands of people may have died prematurely from effects associated with this class of drugs, known as antiarrhythmics. More recently, industry-sponsored trials of Paxil and Vioxx have also highlighted the dangers of not reporting negative results (Science, 14 January 2005, p. 196).

    “Science has been letting the public down very badly by not getting to grips with this problem,” says Iain Chalmers, a clinical trials expert and editor of the James Lind Library in Oxford, U.K. “PLoS Clinical Trials is sending a message that it won't contribute to this bias.” Still, Chalmers and others wonder how effective such “catch-all” journals can be—especially given that much of the bias seems to be coming from the authors. And some worry that flooding the literature with negative or ambiguous studies could itself do more harm than good.

    Leveling the field

    The PLoS Clinical Trials philosophy is hardly unique. Several medical journals, including The New England Journal of Medicine (NEJM) and The Journal of the American Medical Association (JAMA), claim to place a high priority on methodology.

    But even the big guys admit to factoring in issues beyond study design. “Our editors are looking for research that is important” and “defines new treatments or resolves major controversies,” says NEJM spokesperson Karen Pederson. And in meetings at which JAMA editors debated the merits of manuscripts, editors have frequently mentioned “journalistic goals” such as “readership needs and timeliness,” according to an on-site analysis by Kay Dickersin, director of the Center for Clinical Trials at Johns Hopkins University in Baltimore, Maryland.

    Such standards may give pause to authors of trials with negative or ambiguous results. Reluctance to submit such papers is a huge problem, says Kirby Lee, a clinical trials expert at the University of California, San Francisco (UCSF); it's one of the biggest drivers of publication bias. In a preliminary report presented last September at the Fifth International Congress on Peer Review and Biomedical Publication in Chicago, Illinois, Lee and UCSF colleague Lisa Bero showed that only 13% of manuscripts submitted to major biomedical journals contained ambiguous outcomes. Although these trials may not seem important on their own, they help scientists design better future trials and can be vital when combined with similar trials in so-called meta-analyses, which help determine a drug's safety or efficacy.

    Gun-shy.

    A survey of manuscripts submitted to major biomedical journals between January 2003 and May 2004 indicates that authors are reluctant to submit trials with ambiguous results.

    Proof positive.

    Top journals tend to publish more positive trials, as evinced by this survey of manuscripts accepted for publication in JAMA between February 1996 and August 1999.

    Getting ambiguous or negative trials into the literature can also prevent needless and potentially harmful duplicate studies. In the early 1980s, researchers at the National Cancer Institute in Bethesda, Maryland, showed that retinoic acid could turn acute myeloid leukemia (AML) cells into normal cells. Soon after, many doctors apparently began testing the acne drug Accutane—then the only clinically available form of retinoic acid—on their AML patients. The treatment didn't work, but no one reported that. Toward the end of the decade, a Chinese clinical trial showed that only a particular isomer of retinoic acid had the effect. In the interim, patients were exposed to unnecessary side effects, and alternative treatment routes were not pursued as vigorously as they might have been.

    Industry suppression of unfavorable results likely plays some role in author bias, says Lee, but a lot of it comes down to human nature. “Authors don't think their studies are important, or they think editors won't be interested,” he says, so they don't take the time to write them up. As a result, adds Dickersin, only about half of the studies that should be published actually are.

    PLoS Clinical Trials could change that. Other journals say they are interested in methodology, but “it's a defining part of what PLoS Clinical Trials is,” says Dickersin, who also sits on the journal's advisory board. “The editors don't care if something's hot or not.” The approach “removes uncertainty on the author's end,” says PLoS Clinical Trials publication manager Emma Veitch. And PLoS's open-access policy, which makes all of the papers freely available online at the time of publication (authors pay a negotiable $2500 fee upon acceptance), assures investigators that their research will reach a much wider audience than it would at a specialty journal, she says. A number of manuscripts are coming in: “We're getting a good mix of all types of trials,” says Veitch.

    No panacea

    But will getting more of these negative and ambiguous trials into the literature really address the bias problem? “Journals can help encourage the right atmosphere,” says the James Lind Library's Chalmers, “but the fundamental problem is with the scientists themselves.” If authors don't want to be associated with a negative trial, he says, they're still not going to submit their work. And, most say, the strategy is unlikely to stop drug companies from sitting on negative results.

    The real change, says Chalmers, has to come from within the scientific community. It is “scientifically and ethically unacceptable to invite people to participate in these studies and then not publish the results,” he says. The fact that medical societies have not stated this, he thinks, is “disgraceful.”

    Other experts worry that inundating the literature with negative and ambiguous studies could compromise patient care. “Physicians and the public rely on top-tier journals to filter out studies that are not easily interpretable or that may be misleading,” says Celia Fisher, director of the Center for Ethics Education at Fordham University in New York City. “Having access to these studies could cause patients to go off medications that could be helpful” or vice versa, she says.

    Publishing such trials could also hurt a journal—by marginalizing it—says Marcia Angell, a senior lecturer in social medicine at Harvard Medical School in Boston and former editor-in-chief of NEJM. “It sounds like a recipe for a lot of ‘so-what’ studies,” she says, “and who wants to read a study that says the world is not flat?” UCSF's Lee agrees that readership needs to be a concern for PLoS Clinical Trials and any other journal that publishes such a wide range of results. “A journal that doesn't appeal to its readers won't survive,” he says. That may explain the demise of a similar online journal, Current Clinical Trials, which began publishing in 1992 but eventually folded. Nevertheless, Lee is optimistic about PLoS Clinical Trials. “It's a great idea,” he says, “and it could change the way clinical trials are published.”

    Robert Califf, director of the Duke Clinical Research Institute in Durham, North Carolina, believes that the new journal will encourage more authors to submit their trials, “although, personally, I'd probably try a few specialty journals before I went to PLoS Clinical Trials,” says Califf, because the work would be more likely to reach those in his field. “Putting everything online is a good idea,” he says, “but not everyone knows how to use Google.” Still, he says, “if the new journal catches on, it's the right way to do things.”

  11. COGNITIVE NEUROSCIENCE SOCIETY MEETING

    Probing the Social Brain

    1. Greg Miller

    By scanning activity within the skull, researchers are trying to understand how our brains manage interactions with other people

    Play it cool.

    Controlling emotion is crucial in many social situations—so how does the brain do it?

    CREDIT: JOSEPH E. LEVINE/PARAMOUNT/THE KOBAL COLLECTION

    SAN FRANCISCO, CALIFORNIA—Whether or not we admit it, we are all armchair psychologists. Every day we analyze the behavior of friends and colleagues, trying to infer their motivations, intentions, and emotions. We even analyze ourselves. Neuroscientists are no different, except that they have access to expensive brain-scanning machines, which give them an advantage in figuring out what's going on inside someone's head.

    For the first time, this year's meeting of the Cognitive Neuroscience Society (held here 9 to 12 April) included two symposia devoted to social and affective neuroscience, related fields that investigate how the human brain handles everyday situations such as figuring out what's on someone else's mind or controlling one's own emotions in dealings with other people. The presentations highlighted the excitement as well as some of the growing pains of this active area of research.

    Thinking of you

    Christian Keysers of the University of Groningen in the Netherlands reported new work on the neural basis of empathy. In recent years, Keysers and others have described how the brain engages in “mirror” activity that reflects the actions and experiences we see in those around us. For example, if someone sees another person grab a piece of fruit lying on a table, the region of her brain that would prepare her own arm to reach for the fruit becomes active. This happens not only when we observe actions; brain imaging studies have revealed that the brain engages in analogous mirror activity when we observe sensations and emotions as well, leading to the hypothesis that mirror activity is part of the neural mechanism that creates empathy (Science, 13 May 2005, p. 945).

    In their new study, Keysers and his graduate student Valeria Gazzola investigated whether hearing rather than seeing a human action elicits mirror activity. Inside a functional magnetic resonance imaging (fMRI) scanner, 16 volunteers listened to various sounds—some associated with mouth movements, such as gargling or spitting; some associated with hand movements, such as pouring a fizzy soda into a glass; and others not caused by human activity, such as a dripping faucet. Sounds made by the mouth or hand activated brain regions involved in planning movements, including the premotor cortex, whereas environmental sounds such as the dripping faucet did not, Keysers reported. Moreover, mouth sounds activated a different region of the premotor cortex than did hand sounds. The response is very specific, Keysers says: Hearing an action activates the same brain areas that would be involved in planning that action.

    The auditory-evoked mirror activity was most pronounced on the left side of the brain, where language circuitry is concentrated. This fits nicely with theories linking mirror activity to the evolution of human language, Keysers says. Other researchers have proposed that mirror activity enabled early humans to communicate by imitating each other's gestures (Science, 27 February 2004, p. 1316). In this view, gestures preceded vocal communication. But the new work suggests to Keysers that speech could have evolved directly from vocal imitation, aided by mirror activity in the left side of the brain.

    Keysers and Gazzola also found that people who scored higher on a questionnaire that assessed their empathetic tendencies had more mirror activity. “The people who reported in everyday life that they consider the perspective of other people were the ones whose mirror systems were most active when hearing sounds made by other people,” Keysers says.

    “The mirror data are fantastic,” says Jason Mitchell, a cognitive neuroscientist at Harvard University. But Mitchell doesn't think this is the only mechanism at the brain's disposal for reading other people's minds. “It's almost certainly part of it, but there are things we can do that the mirror system can't handle.” For instance, Mitchell says, “we're really good at inferring the mental state of a character in a novel, but it's a real stretch to imagine how the mirror system could do that because it doesn't have any [firsthand] information about the person.”

    Other studies have shown that this sort of inference—in which no human activity has been observed—involves the medial prefrontal cortex (mPFC), Mitchell notes. In a talk at the conference, he presented new findings from his lab that suggest an intriguing refinement: People seem to use one region of mPFC to consider the mental state of someone they perceive as similar to themselves and another region of mPFC to consider someone perceived as dissimilar.

    Mitchell and colleagues introduced 15 volunteers—all undergraduate or graduate students from the Boston area—to two hypothetical students by showing them photographs and short descriptions supposedly representing the students' profiles on an Internet dating site. One hypothetical student described his politics as “left of center” and said he “still can't believe Bush got reelected.” The other described himself as a fundamentalist Christian and “strong supporter of the Republican Party.”

    The researchers then used a reaction-time test to assess which of the two hypothetical students the volunteers deemed more similar to themselves. The reaction-time test, which required students to match the faces of the two students with pronouns—such as “I” or “they”—by pressing computer keys, is more reliable than simply asking people about their preferences because it's harder to fake, Mitchell says. (In liberal Cambridge, for example, a conservative might find it socially expedient to apply a leftward adjustment to his answers.)
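
    The article doesn't spell out the scoring rule, but implicit measures of this kind are typically scored by comparing average response latencies across pairings. A toy sketch of that idea, with all timings invented:

    ```python
    from statistics import mean

    # Hypothetical response times (seconds) for one volunteer.
    # Faster pairing of self pronouns ("I", "we") with a face is read
    # as greater perceived similarity to that student.
    rt_self_with_liberal = [0.52, 0.48, 0.55, 0.50]
    rt_self_with_conservative = [0.71, 0.66, 0.69, 0.74]

    gap = mean(rt_self_with_conservative) - mean(rt_self_with_liberal)
    verdict = "liberal" if gap > 0 else "conservative"
    print(f"latency gap = {gap:.2f} s -> deemed more similar to the {verdict} student")
    ```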

    Their true allegiances revealed, the volunteers then slid into an fMRI scanner and answered yes-or-no questions about themselves and the hypothetical students. The questions required the volunteers to consider the mental state of the person in question, asking, for example, “Would Student #1 worry about getting a summer job?” or “Would Student #2 get upset waiting in traffic?” When volunteers thought about their own mental states in these situations, a region of ventral mPFC became active. The same area revved up when volunteers put themselves in the shoes of the student they viewed as similar to themselves. However, when volunteers considered the mental state of the dissimilar student, a nearby region, dorsal mPFC, lit up in the scans. The study will appear in the 18 May issue of Neuron.

    Keysers says Mitchell's findings may provide a bridge between two traditionally opposed hypotheses about how we infer the mental states of others: simulation theory and theory of mind. Simulation theory holds that we use our own experience to infer the experience of others. Mirror activity in the brain is often held up as an example of simulation theory in action. In contrast, theory of mind holds that we use abstract rules about how people behave to infer the mental states of others. The activity in dorsal mPFC seems to represent this second kind of cognition, Keysers says, but the activity in ventral mPFC seems to represent aspects of both simulation theory and theory of mind: It's abstract thought because the person under consideration isn't actually present, yet it taps into the same brain circuitry used for self-reflection. Rather than being mutually exclusive, simulation theory and theory of mind may turn out to be “two processes we can mix together,” Keysers says.

    Division of labor.

    Different regions of prefrontal cortex fire up when people ponder the mental states of others perceived as similar (blue) or dissimilar (red) to themselves.

    CREDIT: JASON MITCHELL/HARVARD UNIVERSITY

    Getting a grip on emotions

    Reading the minds of invisible strangers isn't the only talent neuroscientists have attributed to the prefrontal cortex, a sizable swath of tissue just behind the forehead that has expanded greatly in the course of mammalian evolution. In humans, the area also appears to have much to do with personality, planning for the future, and keeping a lid on inappropriate thoughts, behaviors, and emotions—another common focus of research presented at the meeting.

    Several recent studies have investigated the role of the prefrontal cortex in keeping emotions in check in social situations. In 2003, for example, Naomi Eisenberger and Matthew Lieberman of the University of California (UC), Los Angeles, and Kipling Williams of Macquarie University in Sydney, Australia, described findings suggesting that the right ventrolateral prefrontal cortex (RVLPFC) dampens the feeling of social rejection people experience when their character is shunned by other characters in a video game (Science, 10 October 2003, p. 290). Subjects who reported feeling less rejection showed more RVLPFC activity and less activity in the amygdala, a part of the brain whose activity reflects emotional arousal.

    At the conference, Lieberman presented new work that suggests the RVLPFC region helps keep emotions in check during another type of social interaction. Lieberman and colleagues adapted a version of the “ultimatum game” used in behavioral economics. In each round, two players are told that they will split a sum of money; Player 1 decides what share to give Player 2, whose only options are to take it or leave it. The researchers scanned the brains of volunteers playing the part of Player 2 as they played one round each against what they were told were 70 different Player 1s (all of whom were actually a computer programmed to share between 5% and 50% of the total stake).
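
    The trial structure is easy to state in code. Here is a minimal sketch of the Player 2 side of the task, using the 5%-to-50% offer range and the 70 rounds from the article; the stake size, function names, and the two toy strategies are invented for illustration:

    ```python
    import random

    STAKE = 10.0  # hypothetical pot size per round, in dollars

    def computer_offer():
        """Each simulated Player 1 offers 5%-50% of the stake, per the article."""
        return random.uniform(0.05, 0.50) * STAKE

    def play_round(accepts_offer):
        """One take-it-or-leave-it round; Player 2 gets the offer or nothing."""
        offer = computer_offer()
        payoff = offer if accepts_offer(offer) else 0.0
        return offer, payoff

    # A purely "rational" Player 2 accepts any positive offer;
    # real subjects often reject offers they deem insultingly low.
    rational = lambda offer: offer > 0
    proud = lambda offer: offer >= 0.25 * STAKE

    for strategy, name in [(rational, "rational"), (proud, "proud")]:
        total = sum(play_round(strategy)[1] for _ in range(70))  # 70 rounds, as in the study
        print(f"{name} Player 2 earned ${total:.2f} over 70 rounds")
    ```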

    The most interesting situations, Lieberman says, are those in which the volunteer is offered a decent sum of cash that's a low percentage of the total stake. The rational thing to do is to take the money, but many people will reject an offer they deem insultingly low. In the brain scans, the best predictor of whether a volunteer would take an unfair offer was the amount of RVLPFC activation. “The more they activate this area, the more likely they are to say ‘Forget the insult, I'll take the money,’” Lieberman says.

    Lieberman has found that the amount of RVLPFC activity in people playing the game is inversely proportional to activity in the insula, a brain region that has been linked to the perception of disgust. Although he concedes that such correlations don't prove that RVLPFC directly suppresses activity in the insula or amygdala, he says that's his working hypothesis: “What RVLPFC does well is it disengages us from our immediate responses … [and] allows higher cognitive abilities to guide thought and behavior without interference from emotional processes.” Lieberman is now investigating whether RVLPFC activity is abnormal in people with anxiety disorders and whether activity in this brain region changes in people undergoing therapy.

    The work is an interesting attempt to examine how people respond in a situation similar to what they might face in everyday life, says Jennifer Beer of UC Davis. Yet Beer says she's not convinced the RVLPFC activity in Lieberman's experiment represents emotional regulation per se. “In the ultimatum task, there's probably a lot of things going on when you're dealing with fair and unfair offers, not just [regulating] emotion,” she says. RVLPFC activity could represent some more general cognitive process related to decision-making or response selection, Beer says.

    Determining whether particular brain regions and patterns of activity are uniquely dedicated to social or emotional cognition is a major challenge for the field, says Cameron Carter of UC Davis. “This new area doesn't quite have the theoretical or methodological rigor of more traditional cognitive neuroscience” research on memory and attention, which scientists have probed with fMRI since the early 1990s, Carter says. But the field will improve as it matures, he adds: “It's one of the more exciting areas, and it's important for understanding individual differences and psychopathology.”

  12. ASTRONOMY

    A Hawaiian Upstart Prepares to Monitor the Starry Heavens

    1. Robert Irion

    Astronomers anticipate the debut of Pan-STARRS, a telescope that promises sweeping views of the nearby and distant universe

    The night sky is wide and unfathomably deep, but some insatiable astronomers want to plumb everything in it. In their dream project, an all-seeing camera would film the sky with as much detail as the images from today's best telescopes. But big telescopes have tunnel vision, so they need years to assemble such a view of the heavens. And whereas smaller mirrors can take in broad chunks of the sky, faint objects elude their sight.

    Sharp focus.

    Astronomers are assembling the 1.8-meter Pan-STARRS 1 telescope (engineering drawing, left) on Maui, Hawaii. Its camera features 1.4 billion pixels spread across 64 arrays of detectors (shown front and back, right) that will electronically counteract atmospheric distortions.

    CREDITS (LEFT): JEFFREY MORGAN/PAN-STARRS; (RIGHT) UNIVERSITY OF HAWAII

    Now, atop a dormant Hawaiian volcano, a nimble machine that combines the best of both worlds is taking shape: the Panoramic Survey Telescope and Rapid Response System, or Pan-STARRS. The 1.8-meter telescope will use the biggest astronomical camera ever built, equipped with a novel electronic system for correcting the distortion caused by Earth's atmosphere, to take crisp images across a field of view 35 times larger than the full moon. And it will do so every 30 seconds, a pace that will capture the entire sky visible from Hawaii several times a month.

    Pan-STARRS will spot thousands of things that change or move, such as supernovae, flaring stars, and asteroids. In particular, the system will find sizable asteroids that orbit dangerously close to Earth 10 times more efficiently than all current search programs combined. It also will create the most detailed all-sky atlas yet compiled, with accurate positions and brightnesses for billions of stars and galaxies. Once the telescope begins full-time science operations in early 2007, its data will flow at the rate of several terabytes (millions of megabytes) per night—a cascade that particle physicists can handle but astronomers have yet to experience.

    These ambitions have set colleagues abuzz with a blend of skepticism, envy, and praise. “The sheer volume of data processing will be a challenge, but it's a big, creative group of people,” says Wendy Freedman, director of the Carnegie Observatories in Pasadena, California. Another West Coast astronomer, involved with labyrinthine plans for a giant telescope, says: “If I could start over with everything in astronomy and just join one project, it would be Pan-STARRS.”

    A giga-view of space

    Pan-STARRS is the first in a generation of ground-based telescopes that will digitize the sky in fine detail, repeatedly. “If we had talked about terabytes of disk storage and billions of stars a decade ago, it would have been feasible but insanely expensive,” says astronomer John Tonry of the University of Hawaii (UH), Manoa. “Today, buying the disk storage to do these things is trivial. Moore's Law has met the universe, and Moore's Law won.”

    Tonry and several UH colleagues, including Gerard Luppino and project director Nicholas Kaiser, conceived Pan-STARRS in 2000. Their inventive plans drew interest from the U.S. Air Force, which operates the Defense Department's largest telescope: a 3.67-meter satellite surveillance system on UH property near the summit of Haleakala, Maui's highest peak. The Air Force has fronted $45 million to date and is on track to provide another $45 million during the next 5 years, Kaiser says. That sum covers R&D costs, the construction of a single telescope at Haleakala, called Pan-STARRS 1, and its planned successor, an array of four identical telescopes (Pan-STARRS 4) slated for the summit of Mauna Kea on Hawaii's Big Island by 2010.

    Military researchers have two vested interests, says Paul Kervin, technical director of the Air Force research unit on Maui. The first is “planetary defense,” the ongoing effort to find the threatening mountains of rock in space known as near-Earth objects (NEOs). The Air Force already supports R&D and provides access to its own telescopes for two programs to detect NEOs, so Pan-STARRS is the logical next step. Second, the Air Force is eager to adapt the system's groundbreaking optics to sharpen its satellite surveillance network. But the Air Force will play no role in operating Pan-STARRS; rather, UH is seeking primarily academic partners to share costs and gain access to observing time.

    The star of Pan-STARRS is its camera with 1.4 billion pixels, nearly five times the pixel count of the biggest cameras now in use at other telescopes. The system is not just a passive recorder of light. Tonry and detector guru Barry Burke of the Massachusetts Institute of Technology's Lincoln Laboratory in Lexington devised special chips that can shift their electrical charges in all directions. By monitoring bright stars, the chips sense the jittering motions of starlight as it passes through Earth's atmosphere. The chips then shuffle their electrons to compensate about 10 times each second. The system thus keeps the images of objects in each frame confined within tight pinpricks, rather like a Steadicam. “This really is a major advance in imaging capability,” Burke says. “The astronomy community is watching very closely.”
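
    Conceptually, the chips run a fast feedback loop: measure how a guide star appears to jitter, then shift the accumulating charge to cancel the motion. A deliberately simplified software sketch of that loop follows; the real correction happens on the detector hardware, and every name below is illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def measure_jitter():
        """Apparent (dx, dy) drift of a bright guide star this cycle, in pixels;
        stands in for the real centroid measurement of the star."""
        return rng.normal(0, 1.5, size=2)

    image = np.zeros((64, 64))  # charge accumulating during one exposure

    # Roughly 10 corrections per second over a 30-second exposure.
    for _ in range(300):
        dx, dy = measure_jitter()
        # Shuffle the accumulated charge to counteract the measured motion,
        # keeping each star's light confined to a tight pinprick.
        image = np.roll(image, (-int(round(dy)), -int(round(dx))), axis=(0, 1))
        # ... light from the sky keeps integrating into `image` here ...
    ```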

    Because of these on-chip corrections, the telescope will yield a sharply focused image across 7 square degrees of the sky even without the usual deformable mirrors of adaptive optics. What's more, the chips will read out each data batch of 3 gigabytes in a few seconds, allowing rapid-fire exposures throughout the night.
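
    Those numbers are mutually consistent, as a quick back-of-envelope check shows; the 2 bytes per pixel and the 10-hour night are my assumptions, not figures from the article:

    ```python
    import math

    pixels = 1.4e9                   # camera pixel count
    frame_bytes = pixels * 2         # assume 16-bit samples -> ~2.8 GB, near the quoted 3 GB

    exposures = 10 * 3600 / 30       # one exposure every 30 s over an assumed 10-hour night
    nightly_tb = frame_bytes * exposures / 1e12

    moon_sq_deg = math.pi * 0.25**2  # full moon is ~0.5 degrees across
    field_sq_deg = 35 * moon_sq_deg  # "35 times the full moon"

    print(f"~{nightly_tb:.1f} TB per night; field ~= {field_sq_deg:.1f} sq deg")
    # -> several terabytes per night, and ~6.9 sq deg, matching the 7-square-degree field
    ```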

    A hunt for moving objects

    That cadence makes Pan-STARRS ideal to watch the sky for moving blips of light. And indeed, the telescope will spend one-third of its time doing just that, with an emphasis on hunting down NEOs. UH astronomer Robert Jedicke forecasts that Pan-STARRS will chart orbits for about 90% of all NEOs larger than 300 meters across and half of those larger than 140 meters, a threshold set by NASA as worrisome. Those faint objects are beyond the reach of all five current surveys. “I think this is exactly what needs to be done,” says Donald Yeomans, director of NASA's NEO office at the Jet Propulsion Laboratory in Pasadena. “This is going to completely change the NEO landscape.” Along the way, Jedicke adds, Pan-STARRS will catalog millions of new main-belt asteroids and thousands of dim, cometlike bodies in the Kuiper belt beyond Neptune.

    Taking the wide view.

    Pan-STARRS will command a sweeping vista 35 times the area of the full moon. That's seven times the scope of the largest current wide-field imager and 3500 times broader than the Hubble Space Telescope's narrow window.

    CREDIT: WEI-HAO WANG/UNIVERSITY OF HAWAII

    Another survey calls for Pan-STARRS to stare more deeply at a dozen patches of the sky every night to look for supernovae, gamma ray bursts, and unidentified cosmic flares. The team expects to find about 5000 of the type of supernova cosmologists use to gauge the growth history of the universe, more than any other system yet funded. And astronomers elsewhere are excited by the prospect of watching the same areas with sensitive instruments at other wavelengths of light, especially radio waves and gamma rays. Simultaneous detections by several observatories and satellites would illuminate the physical causes of fleeting bursts as never before, says Joshua Bloom of the University of California (UC), Berkeley.

    Finally, Pan-STARRS will scan the entire sky visible from Hawaii—about three-quarters of the full sweep of the heavens—several times every month. During 3 years of operation for Pan-STARRS 1, the team will add the images digitally to create an increasingly detailed map, both broader in scope and more precise than the results of the ongoing Sloan Digital Sky Survey. That celestial census will include the locations and motions of every star and brown dwarf within 200 light-years of the sun, an estimated half-million objects. Pan-STARRS will define stellar positions about 30 times more accurately than today's most popular astronomical catalog does, says UH astronomer Eugene Magnier, making it the de facto reference chart for professional observers.

    Some of the most tantalizing science must await Pan-STARRS 4. Those telescopes, all mounted on one structure, will have the light-gathering power of a single 3.6-meter mirror. Fainter galaxies will pop into view, along with tiny distortions in their images caused by dark matter along the line of sight from Earth. This phenomenon, called weak gravitational lensing, is a promising means of probing the hidden mass of the universe (Science, 20 June 2003, p. 1894). “This is by far our most challenging scientific issue, because it requires very accurate control of the shapes of images,” says Kaiser.

    The road ahead

    Pan-STARRS faces myriad near-term challenges as well. Most critical is enticing several scientific partners to share the operating costs, which Kaiser pegs at $2 million per year. Astronomy departments at Harvard University, Princeton University, the University of Pennsylvania, and UC Berkeley (in conjunction with Lawrence Berkeley National Laboratory) have expressed interest.

    Although Pan-STARRS 1 will start taking images this summer, the fate of Pan-STARRS 4 is not assured. The team covets a prime site atop Mauna Kea, with its superior atmospheric conditions. But new projects there face strict review by the state of Hawaii, and fervent opposition from native groups has torpedoed expansions at other facilities. The UH team hopes to prevent that by removing its existing 2.2-meter telescope at Mauna Kea and replacing it with Pan-STARRS 4. The new building's lower profile should please community leaders, says Rolf-Peter Kudritzki, director of the UH Institute for Astronomy.

    A less tangible issue is whether Pan-STARRS might undercut a proposed national facility: the Large Synoptic Survey Telescope (LSST) (Science, 27 August 2004, p. 1232). This 8.4-meter telescope, planned for Chile or Baja California, Mexico, would have six times the surveying power—the product of its light-gathering capacity and its sky coverage—of Pan-STARRS 4. LSST has momentum as a high-priority project, but federal funders will need to cough up at least $200 million to build it. Some wonder whether Pan-STARRS will steal much of its scientific glory before LSST opens its wide eye on the sky.
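
    The “surveying power” described here is what astronomers call etendue: collecting area times field of view. Plugging in commonly quoted figures lands in the neighborhood of the article's factor of six; LSST's roughly 9.6-square-degree field is my assumption, and mirror obscuration and optical losses are ignored:

    ```python
    import math

    def etendue(diameter_m, fov_sq_deg):
        """Surveying power: collecting area (m^2) times field of view (sq deg)."""
        return math.pi * (diameter_m / 2) ** 2 * fov_sq_deg

    lsst = etendue(8.4, 9.6)  # 8.4-m mirror, ~9.6 sq deg field (assumed)
    ps4 = etendue(3.6, 7.0)   # Pan-STARRS 4: light grasp of one 3.6-m mirror, 7 sq deg

    print(f"LSST / Pan-STARRS 4 ~= {lsst / ps4:.1f}x")
    # -> ~7.5x before obscuration and losses, in line with the quoted factor of six
    ```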

    The teams are collegial and share software, but there are undercurrents of unease. “LSST is in all senses bigger and more ambitious, yet its science case has been predicated on Pan-STARRS not existing or not getting there 5 to 10 years sooner,” says UH's Tonry. “That needs to be looked at: How should LSST operate given that Pan-STARRS will have done many of these things?”

    LSST director Anthony Tyson of UC Davis acknowledges that Pan-STARRS will do “marvelous work” in tracking hazardous asteroids and Kuiper belt objects. But in most other areas, he says, “Pan-STARRS 4 is not scientifically competitive.” Notably, LSST will be a more sensitive probe of weak gravitational lensing—as reflected in Tyson's original name for the project, the “Dark Matter Telescope.” Ideally, argues Kaiser, both should proceed. “There's great danger that if people see these as competing, it will undermine national funding [for LSST],” he says. “We will be a precursor, a stop along the way, if LSST gets funded.”

    Regardless of the outcome, the march of Moore's Law portends a future of digital data mining for astronomers, says Alexander Szalay of Johns Hopkins University in Baltimore, Maryland, an architect of the massive Pan-STARRS data archive. “Some people miss the lonely nights in the telescope dome,” he says. “But like it or not, this is how the new astronomy will be done.”

  13. Did DNA Come From Viruses?

    1. Carl Zimmer*
    1. Carl Zimmer, a freelance writer, is the author of Parasite Rex and Soul Made Flesh.

    Research that began with a study of replication enzymes used by bacteria has led to a controversial theory: Viruses may have helped shape all three major domains of life

    Exploring the viral world.

    Patrick Forterre searches for clues to the early evolution of life by investigating exotic viruses.

    CREDITS: COURTESY OF PATRICK FORTERRE

    Scientists who deal in the history of life have never been quite sure what to do with viruses. One measure of their uncertainty is the Tree of Life Web Project, a collective effort to record everything known about the relationships of living and extinct species. The first page of its Web site—entitled “Life on Earth”—shows the broadest view: From a single root come three branches representing the domains of life (http://www.tolweb.org/). One limb, Eubacteria, includes bacteria such as Escherichia coli. Another, Archaea, includes microbes of a different lineage that are less familiar but no less common. The third, Eukaryotes, includes protozoans as well as multicellular organisms such as ourselves. And just below the tree there's a fourth branch floating off on its own, joined only to a question mark. It is labeled “Viruses.”

    A growing number of scientists hope to get rid of that question mark. They recognize that a full account of the evolution of life must include viruses. Not only are they unimaginably abundant—viruses are by far the most numerous biological entities in the ocean—but they are also extraordinarily diverse genetically, in part because they can acquire genes from their hosts. They can later paste these genes into new hosts, potentially steering their hosts onto new evolutionary paths.

    Patrick Forterre, an evolutionary biologist at the University of Paris-Sud in Orsay, France, believes that viruses are at the very heart of evolution. Viruses, Forterre argues, bequeathed DNA to all living things. Trace the ancestry of your genes back far enough, in other words, and you bump into a virus.

    Other experts on the early evolution of life see Forterre's theory as bold and significant. But although they find much to agree with—particularly the importance of viruses to evolution—many also regard Forterre's ideas as controversial.

    “I really applaud the bravery and intellectual power to come up with this picture,” says Eugene Koonin of the National Center for Biotechnology Information (NCBI) in Bethesda, Maryland. “But it would be strange if we agreed on all the parts of the picture.”

    A new domain

    Forterre has been developing and elaborating his theory over many years. He began his scientific career in the early 1970s studying the replication of DNA. He investigated how E. coli, the common gut bacteria, use special enzymes to make new copies of their genes without letting the double helix of DNA become tangled.

    As Forterre was studying bacteria, another group was developing a complex new view of simple organisms. Carl Woese of the University of Illinois, Urbana-Champaign, demonstrated that some bacteria are not bacteria at all. They belong to a separate branch on the tree of life, which came to be known as Archaea. Archaeans turned out to have a distinct biology. Forterre and his colleagues discovered that they use peculiar enzymes for DNA replication that work differently from those in bacteria and eukaryotes. Meanwhile, other scientists had begun looking at the DNA-replication enzymes used by a virus called T4 bacteriophage, which copy DNA in yet another way.

    This discovery made a deep impression on Forterre. At the time, many scientists thought that viruses were merely escaped genetic fragments. After all, viruses are not truly alive: They cannot replicate on their own and have no metabolism. They simply hijack host cells to make new copies of themselves. The best explanation seemed to be that viruses evolved from genes in “true” organisms. Mutations allowed these renegade genes to leave their genomes and become encased in protective protein shells. If that were true, however, the enzymes of viruses should resemble those of their hosts. Yet the DNA-replicating enzymes of T4 bear no relation to the enzymes in the bacteria they infect.

    “I thought, maybe T4 is from a fourth domain,” Forterre recalls. But T4 is not unique. Scientists continued to find more DNA-copying enzymes in viruses that have no counterparts in the world of cells, and Forterre's suspicions deepened. “Maybe each of these viruses is a remnant of a domain that has disappeared,” he wondered.

    In 1985, Forterre offered his speculations at a scientific conference. “I suggested that viruses originated from an early cell, perhaps before the origin of the three domains,” he says.

    Forterre based his theories on the enzymes in viruses rather than the genes that encode them; genome information was scarce then. And for a long time, the question of viruses' origin lay dormant. But advances in DNA sequencing have brought a wealth of new information about virus genomes. In Forterre's view, it reinforces his argument that many viruses are ancient. The genes for the most common proteins in virus shells, for example, turn out to be present in viruses that infect hosts in all three domains, suggesting that these genes originated in a virus that infected an ancestor of all three domains of cellular life. “We find more and more evolutionary connections between viruses in different domains,” says Forterre. “All these findings have completely destroyed this old idea that viruses are escaped fragments of cells.”

    A puzzle that arose from work on microbial genes in the 1990s also gave a new dimension to Forterre's thinking. As scientists dissected the DNA that codes for replication enzymes, they found that the sequences in bacteria were radically different from those in archaea and eukaryotes. That discrepancy meant that this essential replication machinery had either come from different sources or diverged over an immense stretch of evolutionary time. Koonin even proposed that DNA replication had evolved twice, once in bacteria and once in the common ancestor of archaea and eukaryotes.

    Forterre had a different reaction. “I came back to this idea [of a very early origin for viruses] when genomics showed a clear difference between the DNA replication of bacteria on one side, and the system in archaea on the other,” says Forterre. “Maybe in fact one of these two systems came from a virus.”

    Forterre proposed that the genes for DNA replication in bacteria had been donated by their viruses. Soon afterward, Luis Villarreal of the University of California, Irvine, pointed out a possible evolutionary connection between DNA-replication genes in certain viruses and those in eukaryotes. He suggested that viruses had replaced the original genes in eukaryotes. “The next step,” Forterre says, “was to say, ‘Why not both systems?'”

    For the past several years, Forterre has been expanding his original ideas into a sort of grand unified theory of viruses and cellular life. Forterre proposes that viruses donated more than just their DNA-replication genes to cellular life. He argues that they donated DNA itself. In recent months, he has presented the scenario in a series of papers, the most recent of which appeared last month in the journal Virus Research.

    From RNA world to DNA world.

    Forterre proposes that all living organisms share a common ancestor that stored its genetic information in RNA. Some of its genes evolved into viruses. Later, some of those viruses evolved DNA as a way to defend their genes from attack, and DNA-based viruses became incorporated into hosts. Host genes were then transferred onto viral chromosomes and shared. In the process, the three major domains of DNA-based life emerged.

    CREDIT: C. CAIN (ADAPTED FROM P. FORTERRE)

    Like many scientists, Forterre favors the theory that DNA-based organisms are descended from simpler RNA-based organisms. Experiments on RNA suggest that it could have been versatile enough to support primitive life. Not only can it carry genetic information, but it also has the capacity to act like an enzyme, carrying out chemical reactions. RNA-based life may have been able to absorb nutrients, replicate, and evolve. According to this “RNA world” theory, these organisms later evolved proteins and DNA, which then took over many of RNA's former tasks.

    Forterre proposes that RNA organisms evolved into self-replicating cells that could produce their own proteins. At that point, the first viruses evolved. These RNA viruses parasitized RNA-based organisms, manipulating them to make new copies of themselves. These primordial RNA viruses may have produced lineages that are still with us today, in the form of modern RNA viruses such as influenza, HIV, and the common cold.

    Although a great deal of evidence supports the idea of an RNA world, the scenario raises a number of difficult questions. Not the least of these is how RNA-based life might have evolved into DNA-based life. Many scientists have pointed out that DNA is more stable than RNA and less prone to mutations. Once DNA was established, it allowed genes to become longer and more complex. But how the transition came about is difficult to explain, Forterre points out. The stability of DNA provides long-term advantages, not the short-term ones that natural selection can favor. “This is not a Darwinian way of thinking,” says Forterre. “You should explain why the first organisms in which RNA was modified had a selective advantage, not in its descendants.”

    Forterre offers a solution. He suggests that viruses were the intermediate agents of change. For viruses, DNA might have offered a very powerful, immediate benefit. It would have allowed them to ward off attacks from their hosts. Cells today use a number of weapons against RNA viruses. They can silence the viral RNA with special RNA molecules of their own. Or they can cut the genome of the virus into fragments.

    “RNA viruses have to find a way to avoid these defenses,” says Forterre. They do so by making it difficult for their hosts to grab their RNA. Modern RNA viruses chemically modify their genes to thwart their hosts. Forterre proposes that some early RNA viruses altered their genes in a particularly effective way: They combined pairs of single-stranded RNA into double-stranded DNA. The vulnerable nucleotides carrying the virus's genetic information were now nestled on the inside of the double helix, while a strong backbone faced outward.

    “The idea is that an older RNA virus could use this as a trick to modify the structure of its RNA. DNA is simply modified RNA,” says Forterre.

    If they were anything like viruses today, some of the viruses found a way to coexist inside the host's cells, surviving from one generation of host to the next. Forterre suggests that in the RNA world, some DNA viruses became domesticated and lost the genes they used for escaping their hosts and for making protein shells. They became nothing more than naked DNA, encoding genes for their own replication.

    Viral diversity.

    Scientists are uncovering a vast diversity among viruses, such as this giant mimivirus. Their genetics and biochemistry point to an ancient origin.

    CREDITS: CNRS PHOTOTHÈQUE/BERNARD LA SCOLA, SERGE NITSCHE

    Only at this point, Forterre argues, could RNA-based life make the transition to DNA. From time to time, genes from the RNA chromosome would be accidentally pasted into the virus's DNA chromosome. These genes could now enjoy all the benefits of DNA-based replication. They were more stable and less prone to devastating mutations. Natural selection favored organisms carrying important genes in DNA rather than RNA. Over time, the RNA chromosome dwindled while the DNA chromosome grew. Eventually, the organism became completely DNA-based. Forterre proposes that this viral takeover occurred three times, and each one gave rise to one of the three domains of life.

    These blendings of genomes could explain the similarities and differences among the three domains, he argues. Take, for example, the fact that archaean and eukaryote DNA-replication enzymes are more similar to each other than either is to its bacterial counterparts. Forterre suggests that it just so happened that the viruses that infected the RNA-based precursors of archaeans and eukaryotes shared a close common ancestor.

    Eukaryotes in particular pose a special challenge to any account of the early evolution of life. For instance, eukaryotes keep their DNA tucked away in a nucleus, a structure whose origins scientists have debated for years. Forterre suggests that viruses may have played a part in shaping the cells of early eukaryotes. He points out that certain viruses, such as poxviruses, can form nucleuslike shells inside their hosts. Waves of viral infections could have built some of the features of the eukaryote cell. “The eukaryote cell is very strange and very complex, so I don't have a very clear idea of how they originated,” Forterre admits.

    After viruses ushered in these three domains of DNA-based life, the new forms proved superior to those of their predecessors. “Once this occurred, the DNA cells outcompeted all the RNA cells,” says Forterre. “And once all the RNA cells were eliminated, you had a limited number of lineages.”

    Finding flaws

    Forterre's ideas have been warmly received, even by those who dispute some of them. “Patrick's ideas need to be taken seriously,” says David Penny of Massey University in Palmerston North, New Zealand. But Penny points out that in Forterre's scenario, RNA-based organisms are already relatively complex by the time viruses drive them into the DNA world.

    “I doubt in the absence of DNA any organism would be very complex at all,” Penny says. He points out that RNA replication suffers a high error rate. Under those conditions, genomes cannot become large without risking catastrophic damage.

    Penny and his colleagues dispute the idea that natural selection could not drive RNA organisms to DNA-based replication on their own. They've argued that the organisms could have made the transition through a series of evolutionary steps, each of which reduced the error rate during replication. Once life had shifted to DNA, Penny argues, viruses might have played a part in the origin of one or more of the three domains. “I definitely do not want to exclude the possibility of viral takeover,” says Penny.

    Koonin, on the other hand, agrees with Forterre that viruses originated in the RNA world. “That idea makes perfect sense, in my opinion,” he says. He also accepts the possibility that viruses might have invented DNA. But when Forterre argues that viruses gave rise to the three domains, Koonin parts company. “I don't believe it for a moment,” he declares. Archaeans share too many genes in common with eukaryotes for this to be plausible, Koonin argues. Instead, eukaryotes must have evolved from archaeans after they had become DNA-based.

    Koonin envisions a different sort of history of viruses and their hosts. He and William Martin of Heinrich Heine University in Düsseldorf, Germany, have proposed that life first evolved in honeycomblike cavities in rocks around hydrothermal vents. These compartments played the role that cell walls and membranes would later play. Initially, RNA molecules were selected simply for fast replication. Successful molecules could spread from one compartment to another. Over time, natural selection favored groups of RNA molecules that worked together to reproduce more successfully. It was then that viruslike things emerged—before cells yet existed.

    “There would be parasites that only care for their own replication—and here Patrick's ideas might have their place,” Koonin says. “DNA replication might originally have emerged in such parasitic entities.”

    Woese, who has influenced Forterre's work for more than 20 years, is both enthusiastic and agnostic about the virus theory. “In the most specific form, I don't know if he's right, but then it doesn't make any difference, because he's going in the right direction,” he says. “I think that's a significant advance, to be able to fold the viruses into the whole process.”

    Woese and others believe that the best way to assess Forterre's theory would be to find and analyze more viruses. Forterre agrees. “At the moment, we know very few of the viruses in the living world,” says Forterre. “We know some viruses that are human pathogens, and some that infect bacteria that are important to the food industry. But for many groups, we have no idea of their viruses.”

    Forterre himself is trying to fill that gap by studying viruses that live in heat-loving archaeans. Archaean viruses are proving to be particularly diverse and bizarre—such as lemon-shaped species that don't finish growing until they have left their host cell. Because viruses have such ancient roots, they preserve a remarkable range of biochemical tricks. “It's clear now that you have many more genes in the viral world, so there are many interesting new enzymes to be found,” says Forterre. “If we explore the viral world more, I don't know if we will be able to be sure of one theory or the other. But I am sure we will get many more interesting molecular mechanisms.”
