News this Week

Science  18 Apr 2003:
Vol. 300, Issue 5618, pp. 402



    Ten Millennia of Culture Pilfered Amid Baghdad Chaos

    1. Andrew Lawler

    Not since the Spanish conquistadors ravaged the Aztec and Inca cultures has so much been lost so quickly. Scholars are calling last week's looting of Baghdad's Iraq Museum, the chief repository for all archaeological research in the country since 1933, the most severe single blow to cultural heritage in modern history. “This is like destroying all the museums on the Washington Mall all at once,” says Eleanor Robson, an Assyriologist at the University of Oxford, U.K. “It's an unparalleled collection of the world's earliest and greatest civilizations.”

    The destruction came after U.S. forces swept into Baghdad last week, and it spread to the country's National Library and Central Bank—where some of the most precious artifacts were kept—as well as to museums in Mosul and Basra. The objects—clay tablets, statues, jewelry, manuscripts—represented 10,000 years of human culture. And there is growing suspicion among some U.S. government officials and archaeologists that much of the looting may have been orchestrated by individuals or organizations eager to sell the assets on the antiquities market.

    It was no ordinary mob scene, according to many accounts. “It resembles a professional bank robbery,” says one U.S. Army source in Iraq. The storerooms that held the bulk of the objects—most of which had been stored for safekeeping—were opened with keys and not explosives. And the destruction of the cards containing the catalog information makes it harder to trace objects. “It does mean that the objects were targeted before the fall of the city,” says this source.

    The failure of U.S. troops to halt the looting stemmed from a stunning breakdown in the U.S. Central Command's chain of command. The U.S. government was well aware of the importance of the Baghdad museum and other cultural heritage sites (Science, 31 January, p. 643). Curators and archaeologists had met in January with Joseph Collins, U.S. Department of Defense deputy assistant secretary for stability operations, to discuss the importance of protecting sites from bombing and from looting similar to that which devastated Iraq's other museums after the first Gulf War. Senior military officials called for the site to be secured, but that directive was apparently ignored.

    After viewing scenes of the chaos in Baghdad on 9 April, frantic U.S. archaeologists urged a senior Air Force official to intervene to ensure the museum was safe. That official told Science he contacted the coalition's Air Operations Command on 10 April, when the looting began, and the request to secure the museum was forwarded to U.S. Central Command and to the Army liaison at Air Operations Command. The Air Force official was told the request was “taken seriously.” But nothing was done. The looting continued until 12 April.

    Another source in Iraq says that U.S. soldiers allowed looters to pass checkpoints near the museum, located on the west bank of the Tigris not far from the central railway station. A Marine commander, one Army official says, observed the museum looting and did nothing. Also looted, although the extent of the theft is not clear, was the Central Bank, which contained one of Iraq's most valuable caches of ancient objects—the Nimrud gold grave goods (Science, 6 July 2001, p. 32). The National Library was reported last weekend to be in flames.


    The absence of U.S. troops left the Iraq Museum in Baghdad defenseless last week against looters.


    As Science went to press, Iraqi officials were pleading with the U.S. military to provide guards at the museum. “All I can say is that the situation up there is very confused,” says the Department of State's John Limbert, who is responsible for cultural heritage matters as part of the Future of Iraq team. Reached in Kuwait on 14 April, he said, “all I know is reports I'm getting from [U.S.] academics and the media.”

    Secretary of Defense Donald Rumsfeld, speaking 13 April, sharply dismissed critics who blame the U.S. military. “We didn't allow it. It happened,” he said, adding that looting takes place anywhere there is disorder. The next day, however, U.S. Secretary of State Colin Powell pledged “to not only secure the facility, but to recover that which has been taken, and also to participate in restoring that which has been broken.”

    It is a Herculean task. Between 150,000 and 200,000 objects filled the display cases and large storerooms in the Baghdad museum. As many as 25,000 of those were ancient clay tablets, many unpublished. The vast collection of note cards cataloging the collection—and a huge amount was not cataloged—also was reportedly destroyed in the looting. Much of what was not taken, such as large statues, was smashed. Of 4000 objects stolen during the chaos following the first Gulf War, only a handful have been recovered.

    Archaeologists want the U.S. government to control the borders, offer rewards for recovering objects, and ask the Iraqi State Board of Antiquities for help. State Department officials agree but add that offering cash rewards would require approval from Congress and take time. Arthur Houghton of the New York-based American Council for Cultural Policy, which represents curators and collectors, says that the antiquities market in this field “should be shut down” and the stolen objects “seen as toxic.” However, there are already rumors that some stolen objects have appeared for sale in Paris and Tehran.

    An effort to put together a database of what remains is under way. But such after-the-fact efforts provide little comfort to researchers whose data has now vanished. “Our discipline is dead—what do we do now?” demands Helga Trenkwalder, an archaeologist at Austria's University of Innsbruck who on 5 March left Baghdad—and the fearful and stressed antiquities staff. Her research materials from ancient Borsippa were in the museum to satisfy a 1970 law requiring that all archaeological material remain in the country. And huge amounts of museum materials “were not published or even studied,” adds Michael Müller-Karpe, an archaeologist at the Römisch-Germanisches Zentralmuseum in Mainz, Germany.

    UNESCO officials will meet in Paris this week to assess the tragedy, and scholars are demanding that U.S. officials be held accountable. But nothing can mitigate the incalculable damage that has been done. Says John Russell, an archaeologist at Boston's Massachusetts College of Art: “Long after Saddam Hussein is forgotten, long after the oil is gone, people will remember this destruction of the world's greatest archive of the human past.”


    Studies of Gay Men, Prostitutes Come Under Scrutiny

    1. Jocelyn Kaiser

    Last month, the Department of Health and Human Services (HHS) conducted a site visit of an investigator at the University of California, San Francisco (UCSF), whose studies of sex workers have been the target of a recent inquiry by Congress. Although there is no hard evidence that the inquiry and the site visit are linked, the events have concerned researchers at UCSF and some in government who worry that the Bush Administration and congressional Republicans are intensifying their scrutiny of research on sensitive topics.

    Program staff at the National Institutes of Health (NIH), for example, have warned grant applicants to cleanse certain terms, such as “transgender” and “prostitutes,” from their grant applications. The reason, according to an NIH staffer who asked not to be identified, is to reduce the projects' visibility. “What's frightening” is that NIH staff feel grantees need to disguise their work, says Alfred Sommer, dean of the Johns Hopkins Bloomberg School of Public Health in Baltimore.

    HHS spokesperson Bill Pierce denies that the department is targeting research on certain topics. “We do nothing like that,” he says. John Burklow, NIH spokesperson, says that the site visit was for “administrative issues,” not “scientific content,” and that there was nothing unusual about it.

    The controversy centers on research by AIDS researcher Tooru Nemoto, whose projects include preventing HIV infection in Asian sex workers and in “transgender” men who are planning or have had a sex change operation. HHS officials inquired about Nemoto's research in early January, according to Regis Kelly, UCSF vice chancellor for research. Kelly says that Nemoto also had support from another HHS agency, the Substance Abuse and Mental Health Services Administration (SAMHSA), and that HHS apparently wanted to be sure there was no “double dipping.” UCSF supplied information to clarify that there was no wrongdoing, Kelly says.

    A few weeks after HHS's call, NIH told the university that several agencies planned a site visit to discuss Nemoto's grants. That step was “very unusual,” according to UCSF grants and contracts manager Joan Kaiser, who says normally such questions are addressed by phone or in correspondence. In late March, four officials from NIH and SAMHSA spent 2 days at UCSF asking about procedures and going “all over San Francisco” to hear scientific talks by Nemoto's team, Kaiser says. She says that UCSF officials “haven't heard back” but assume the grants were in compliance.

    Hot zone.

    Health studies that involve prostitutes are getting critical reviews.


    UCSF officials thought no more of it until they learned last week about a memo from the House of Representatives to NIH. The 13 March e-mail memo, from staffer Roland Foster of the House Subcommittee on Criminal Justice, Drug Policy, and Human Resources chaired by Representative Mark Souder (R-IN), raised concerns about two NIH-funded studies of sex workers—Nemoto's and another led by a researcher in Miami. The memo, which HHS routinely forwarded to NIH director Elias Zerhouni, argues that by attempting to protect the health of sex workers, the studies “seek to legitimize the commercial sexual exploitation of women.” This runs counter to a February directive from President George W. Bush to reduce international sex trafficking, the letter claims.

    Foster's memo asks for detailed information about the two grants, including the names of study section members who approved them and the scores they gave. It also requests information on all NIH studies of prostitutes over the past decade. HHS is now asking the Centers for Disease Control and Prevention (CDC) to list studies it funds of sex workers, a CDC spokesperson says. Foster says he played “no role” in the UCSF site visit but is “interested in what may be found.”

    NIH program officials who handle grants in these areas are worried about the rumored surveillance. Four staffers contacted by Science declined to be interviewed. But one NIH scientist confirmed that some program staff have been telling grantees to reword grants to avoid terms such as: “needle exchange,” “abortion,” “condom effectiveness,” “commercial sex workers,” “transgender,” and “men who have sex with men.”

    Changing words in proposals may not shield researchers from scrutiny, however. On 11 April, Foster fired off another letter to NIH raising questions about a UCSF grant to prevent HIV in gay men and demanded a list of all HIV-prevention studies.


    E.U. Starts a Chemical Reaction

    1. Samuel Loewenberg*
    1. Samuel Loewenberg is a writer in Madrid, Spain.

    BRUSSELS—If the European Union (E.U.) gets its way, toxicology will soon be booming in Europe. And, to hear chemical manufacturers tell it, their industry will be in decline.

    Last week, officials from the European Commission, Europe's executive body, met with industry and environmental groups here to discuss proposed legislation that would require chemical manufacturers to run extensive safety tests over 11 years on the 30,000 most common chemicals on the market, many of which have been used for decades. The proposal would also severely restrict the use of an estimated 1500 chemicals considered the most hazardous to humans and the environment. “The new policy introduces a radical paradigm shift,” E.U. environment commissioner Margot Wallström told the conference. “It is high time to place the responsibility where it belongs, with industry.”

    The chemical industry, understandably, is up in arms. It estimates that testing will cost at least $7.5 billion, and the policy could “impose a regulatory stranglehold on our industry,” says Alain Perroy, director-general of the European Chemical Industry Council.

    At the root of this ideological fight is the so-called “precautionary principle.” This concept, codified in the E.U. charter, states that governments should base regulatory policy on the significant possibility of risk, taking action even before all the data are compiled. In contrast, U.S. regulations are not imposed until there is concrete evidence of harm. “The United States is usually reactive when it takes steps to protect citizens against toxic chemicals,” says Mary Graham, co-director of the John F. Kennedy School of Government's Transparency Policy Project in Cambridge, Massachusetts. The E.U. chemicals legislation, she adds, “is a remarkable effort because it is very expensive and it isn't in response to a public crisis.”

    Paradigm shift.

    The E.U.'s Margot Wallström wants the chemical industry to shoulder the burden of safety.


    Perroy maintains that industry already has safety data on its products and that if the E.U. legislation goes too far, it will be a “totally bureaucratic approach to build a knowledge base without use.” Industry also predicts that the increased costs of testing, and the possibility that hundreds of chemicals could be taken off the market, could result in major job losses. Lobbyists cite a study commissioned by the Federal Association of German Industry, which predicts that in Germany alone more than 2 million jobs would disappear, a figure that E.U. officials say is hugely overblown.

    The proposed legislation does, however, contain some good news for the research community: It would significantly loosen existing regulations on R&D. Under the current system, quantities of experimental substances of less than 1 metric ton are exempt from registration with the E.U. for a year. The proposed rules would lift the weight restriction and waive registration for 5 years, with the possibility of a 5-year extension. “This is good news,” says Patrick Peuch, a director of product stewardship at BP Chemicals-Europe. The existing deadlines are often so tight they constrain evaluation and testing, he says.

    Some in the industry do see a positive side to the legislation. Horst Mensel, a lobbyist at Bayer AG, believes that the new policy will encourage innovation by forcing companies to develop substitutes for chemicals deemed hazardous. Michael Warhurst of the environmental pressure group World Wide Fund for Nature argues that the proposed rules will create new markets for safer products and spur the creation of innovative safety testing and risk assessment tools.

    A final draft of the proposed legislation is not expected before the end of this year, so industry is still lobbying hard. But the European Parliament, which must approve the new directive, has a strong environmental leaning and will likely try to strengthen it. Final approval by a council of ministers from E.U. member countries is expected in 2005 at the earliest, and individual members must then incorporate it into their national laws.


    Report Deplores Growth in Academic Patenting

    1. Jennifer Couzin

    Alarmed by what it sees as a rush to patent, the United Kingdom's premier scientific academy is warning academic researchers and their governments that the free exchange of ideas could get trampled in the stampede. In its first report on the subject, the Royal Society this week calls for freer access to scientific databases and journals and asks universities to refrain from aggressively seeking so many patents.

    The Royal Society's report, by a nine-member commission, dovetails with what the commission calls a “grassroots movement” for greater openness and data sharing, especially in fields such as genomics and computer science. It is also consistent with a recent report commissioned by the British government urging caution in implementing rigorous intellectual property rules in developing countries (Science, 13 September 2002, p. 1791). The commission was chaired by Roger Needham, managing director of Microsoft Research Ltd. in Cambridge, U.K., who died of cancer last month.

    The report acknowledges that fees generated by licenses and patents help pay for scientific innovation, but it takes aim at a number of worrying trends. One is a 1996 European Union directive on database access, which allows database creators to charge for access to data—such as gene sequences—generated by public funds. The Royal Society hopes to influence a 2005 review of this directive, says society vice president John Enderby. U.S. rules in this area are more liberal: Several years ago, for example, the National Institutes of Health purchased a Swiss protein database to make it freely available.

    The commission also took issue with what it sees as an increasingly commercial university system. “What concerned us most was the drive towards research aimed at creating intellectual property,” says commission member Roger Elliott, a theoretical physicist at Oxford University. The panel also notes that British patent law may stifle the free flow of information because scientists must file for a patent by the time they publish (U.S. rules allow a 1-year grace period).

    It's not just unpublished data that are raising concern. Scientists on both sides of the Atlantic are also debating the accessibility of research results after they are published. The report throws its weight behind journals with “liberal access policies.” Enderby says that the Royal Society, which publishes five journals whose contents are free after 2 years, is considering enhancing access to its own. (Science's research papers become freely accessible 1 year after publication.)

    Patent law scholar Rebecca Eisenberg of the University of Michigan, Ann Arbor, believes that some researchers want to have it both ways: reap the protection from patents but avoid their restrictions. Charging for access to scientific information, she says, “appears to them as a degradation of norms, rather than [proof of] their own increasing commercial relevance.”


    New Science Chief Wants Ready-Made Technologies

    1. David Malakoff

    The U.S. government's newest research funding agency is getting up to speed. Last week the Department of Homeland Security (DHS) formally welcomed its new science chief, former defense industry executive Charles McQueary, who spent his first official day on the job telling Congress how he planned to spend a proposed $800 million budget. Among the priorities: evaluating off-the-shelf antiterror technologies, deploying the most promising ones quickly, and recruiting his team. He also plans to bring universities into the new field of homeland security, starting with an academic center for antiterrorism research.

    McQueary will be responsible for “building something that's never existed before,” DHS Secretary Tom Ridge said at a 9 April swearing-in ceremony for the one-time AT&T Bell Labs director. The event was held at the ornate Washington, D.C., headquarters of the National Academies, which last year urged Congress to create the directorate to coordinate the department's research and testing programs (Science, 22 November 2002, p. 1534).

    After taking his oath, McQueary said his first task as undersecretary of science and technology will be to sift through existing technologies. His goal is to get new sensors, software, and other equipment as quickly as possible into the hands of everyone from border guards to local police. “Initially, you won't see [DHS funding] much in the very forward-looking scientific areas,” he said.

    Tech talent.

    DHS Secretary Tom Ridge (right) swears in science chief Charles McQueary (left, with wife, Cheryl).


    McQueary fleshed out that message the next day to House and Senate appropriations panels reviewing the president's request for the budget year that begins 1 October. McQueary hopes to spend $365 million on countering bio- and agroterror threats, including establishing a nationwide monitoring system for bioagents. Some $137 million would go to radiological countermeasures, and $90 million would be spent on information technology. The rest of the $800 million would be funneled into everything from chemical sensors to a rapid prototyping program that would turn promising ideas into practical—and cost-effective—devices. “I [believe] a 90% solution that gets implemented is better than a 100% solution that doesn't,” he told lawmakers.

    The first impact felt by university researchers is most likely to be $10 million for a fellowship program and at least one academic center dedicated to homeland security research. Guidelines for the fellowships are still being worked out, and McQueary says that the center may actually be a collaboration among teams at several campuses working in particular fields, such as cybersecurity.

    About $350 million of McQueary's budget will be managed through the new Homeland Security Advanced Research Projects Agency (HSARPA), which is still looking for a director. He's also hoping to hire more than 100 people with technical savvy. Lawmakers seemed satisfied with McQueary's plans. But it will be months before they complete work on his budget.


    Ancient DNA Pulled From Soil

    1. Erik Stokstad

    Paleontologists working in the Siberian tundra can trudge for kilometers without finding a bone from the mammoths or bison that once roamed the permafrost. Now, two labs have shown that ancient DNA from these creatures and other ancient animal and plant life may be quite common underfoot, preserved in sediments bearing no signs of fossils. Much of the plant DNA probably derives from roots, which would have been well protected under ground; the animal DNA likely comes from cells excreted in urine and feces. “This technique truly will revolutionize our ability to reconstruct past flora and fauna,” says paleoecologist Glen MacDonald of the University of California, Los Angeles. It could shed light on questions such as what kinds of plants lived during the ice ages or when the first humans crossed to North America.

    Ancient DNA is typically extracted from fossilized bones, tissue, or dung. But a team led by Eske Willerslev and Anders Hansen, graduate students in molecular biology at the University of Copenhagen, has now pulled DNA from Siberian permafrost sediments and the soil of caves in New Zealand, the collaborators report online in Science this week. The Siberian sediments yielded what the authors say is the oldest reliable ancient DNA so far, from plants up to 400,000 years old.

    Deep freezer.

    Apparently barren permafrost can contain DNA from ancient plants and animals.


    Permafrost is a good haven for preserving DNA because it is constantly cold. The Danish team started out looking for bacterial DNA, but the researchers were surprised to find that they could recover fragments of chloroplast DNA in soil samples. They identified 19 taxa of angiosperms, gymnosperms, and mosses. Soon they found DNA from eight kinds of animals, including modern denizens such as lemmings and extinct ones such as mammoths and steppe bison, which matched fossil DNA analyzed by co-author Alan Cooper of Oxford University, U.K. The animal DNA was at most 30,000 years old, according to radiocarbon dating of the sediment.

    Perennially dry temperate caves in New Zealand, meanwhile, yielded ancient DNA from the extinct moa and a plant community much like that present before human colonization in A.D. 1000. “Importantly, this demonstrates that DNA can be preserved in soil for long periods, even in unfrozen conditions,” Cooper says.

    The permafrost plant DNA indicates that an area called Beringia, which spanned eastern Siberia and western Alaska, was once a vegetation-rich steppe—the kind that some researchers have argued would have been necessary to support mammoths and other megafauna—rather than a sparse polar tundra. But grasses declined from 36% to 3% of the DNA samples after about 11,000 years ago—which fits the idea that climate change played a big role in the mass extinction of mammoths, ground sloths, and other large North American mammals, the authors say.

    Figuring out what glacial vegetation was like in Beringia can help test computer models of climate change, says Stephen Jackson, a botanist at the University of Wyoming in Laramie. But he and others want to see more samples of ancient DNA and a more thorough demonstration of the technique on modern vegetation before being convinced that it can accurately reveal the patterns of past climates. “It's pretty exciting, but there's a lot of work that needs to be done,” adds Elizabeth Hadly, a specialist in ancient DNA at Stanford University.

    One drawback of the animal DNA is that it's difficult to know exactly what layer of soil it was deposited in. The samples “could be from much younger sediments due to leaching of urine,” notes Hendrik Poinar of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Caves, too, have their limitations. Animals can churn soil, and a fluctuating water table can transport DNA.

    But if these obstacles can be overcome, finding DNA in sediment “frees ancient DNA researchers from the shackles of needing fossils to be able to look into the past,” says Cooper.


    Reaching Their Goal Early, Sequencing Labs Celebrate

    1. Elizabeth Pennisi

    BETHESDA, MARYLAND—It may sound like science by press release, as no formal report has been published, but the news is spectacular all the same: The International Human Genome Sequencing Consortium announced on 14 April that it has completed its work. In a note of rare unanimity, the leaders of the United States, Britain, China, France, Germany, and Japan issued a joint proclamation honoring their scientists who worked on the project.

    Twice before—in 2000 and 2001—researchers celebrated draft sequences of the human genome. But the new product is much more complete and of higher quality: 99% of what can be done with current technology is now done, sequencers say. And virtually all the bases are now identified in their proper order, which was not true of the draft versions.

    “It's a great day for science and a great day for humankind,” Elias Zerhouni, director of the National Institutes of Health, said at a press conference held here as part of a celebration of the 50th anniversary of the discovery of DNA's structure (Science, 11 April, p. 277).

    Sequencing the human genome involved hundreds of researchers. Overall, says James Watson, co-discoverer of DNA's structure and the first director of the Human Genome Project, “we've seen the best of human beings in the way the consortium came up with the sequence.”

    Completing this labor-intensive phase will come as a big relief for many involved. They were spurred to a faster pace in 1998, when sequencer J. Craig Venter boasted that his new company, Celera Genomics in Rockville, Maryland, would sequence the human genome first. The consortium scrambled to prevent that from happening. Both Celera and the public groups completed drafts in June 2000 and published reports on them simultaneously in February 2001.

    Once that milestone had been reached, “everyone questioned whether [the public project] would suck it up and do the hard work of finishing [the genome],” says former Stanford University geneticist David Cox, now chief scientific officer of Perlegen Sciences in Mountain View, California. But the public consortium hunkered down and delivered on the promise 2 years ahead of schedule, says Francis Collins, director of the National Human Genome Research Institute (NHGRI). The U.S. share amounted to about 53% of the 2.9 billion bases, costing $2.7 billion over the project's 15-year duration.

    The consortium set out to meet a standard of one error in 10,000 bases; the just-finished version is 10 times better than that. Now only the most difficult regions remain to be done, including about 400 stretches of repetitive DNA and centromeres, which divide the chromosomes. “This is a reference sequence,” says Collins, one that will be used by biologists for many years to come.


    Even so, sequencers have not yet determined the number of human genes. Three years ago they were so confident that they would have the answer by now that many made bets on their predictions, planning to declare the winner this spring (Science, 19 May 2000, p. 1146). That's not going to happen, says Collins; there is no clear answer from computer programs that are trained to identify genes. Two years ago, the two drafts indicated there were 35,000 to 45,000 genes. Now the rough estimate “stands a little under 30,000,” says Collins.

    As the genome community inches toward an accurate gene count, it's also endorsing new, even grander research schemes. NHGRI, for example, unveiled a new plan in Nature online last week emphasizing a broad agenda. “This is not intended to be like the previous 5-year plans,” which stressed detailed targets, Collins explains. “This is the vision part.” Specifics will come later.

    If sequencing the genome was akin to landing a human on the moon, then this new vision calls for landing more humans on the moon, says Edward Rubin, director of the Department of Energy (DOE) Joint Genome Institute in Walnut Creek, California. DOE intends to focus on sequencing nonmammalian genomes, including microbes important as potential energy sources (Science, 11 April, p. 290). But NHGRI's strategy has many elements, such as extending the haplotype map, a five-nation effort to describe individual and group DNA variations. A project called ENCODE will determine the functions of the genes. And Collins hopes to launch a resource to enable researchers to screen new proteins for interactions with any of about a million small molecules, potentially to find drug candidates. But Thomas Pollard, a cell biologist at Yale University in New Haven, Connecticut, notes that NHGRI must be careful, as some aspects of its vision seem a little “off target.” NHGRI has not been funded for some large-scale biomedical projects on its agenda, he says, and these should be spearheaded by other institutes.

    Nevertheless, the completed genome is already making biologists' work easier, Pollard says—not just by identifying genes but also by revealing what regulates gene expression. He predicts that biology will continue to reap broad benefits: For that reason, it's the right time for Collins and his collaborators “to set off a few fireworks.”


    A Sea Change in Ocean Drilling

    1. Dennis Normile,
    2. Richard A. Kerr

    Japan's new ship moves it into a leadership position as it joins the United States and other countries in laying out an ambitious research agenda for the new millennium

    TOKYO—In the early 1960s, geologists took their first shot at drilling all the way through Earth's crust and into its mantle with the Mohole Project. It turned out to be a disaster. Named for the Mohorovicic discontinuity, the boundary between the crust and mantle, the ambitious attempt to penetrate 6 kilometers of crustal rock was sunk by cost overruns and management problems and scrapped after a few test holes. But out of that debacle came a highly successful international scientific endeavor. The decision to drill Mohole from a barge—to take advantage of the fact that the oceanic crust is much thinner than the continental crust—laid the foundation for modern-day scientific ocean drilling. And researchers have exploited the world it opened up to make seminal discoveries about the planet. Now, those efforts are about to enter a new era.

    Over the past 40 years, researchers have drilled some 1700 holes in the ocean floor, retrieved 160 kilometers of mud and rock core, and studied 35,000 samples. The legacy of ocean drilling includes validating the theory of plate tectonics and tracing Earth's changing climate back 100 million years, as well as inventing the field of paleoceanography. Since 1984, that work has been carried out under the 22-country Ocean Drilling Program (ODP), a unique effort that is set to end in September. But it will be replaced by something even more ambitious: Next week, Japan and the United States will ink an agreement formally creating the Integrated Ocean Drilling Program (IODP). It will eventually include 20 or so other countries (see sidebar), cost twice as much to operate as its forerunner, and use two ships rather than one.

    Initially, IODP will rely on an upgraded U.S. drill ship, either a revamped version of ODP's workhorse, the JOIDES Resolution, or a new vessel with similar capabilities. By 2008, it will be joined by a brand-new ocean drilling vessel, Japan's Chikyu, equipped with technology that will allow it to literally break new ground. Together, the two ships will enable earth scientists to bore more and much deeper holes than is currently possible and in locations that are now inaccessible. There could even be “mission-specific platforms” that would drill niche locations such as the icy Arctic Ocean and shallow coastal waters.

    The biggest change in operational capabilities will come when the 210-meter, 57,500-ton, $475 million Chikyu starts drilling. For all its achievements, the Resolution has serious limitations. It can't drill in shallow water or farther down than 2 kilometers. Nor can it tolerate the icy conditions of the Arctic Ocean. What's more, sedimentary basins have been largely off-limits because oil and gas deposits have posed safety and environmental hazards.

    The Chikyu will overcome some of these constraints. It will have a second pipe, called a riser, that will enclose the drill pipe and allow circulation of a heavy but fluid drilling mud that will flush debris from deep holes and shore up unstable sediments. The arrangement will also protect against blowouts when the bit penetrates pressurized oil or gas deposits. Attempts at drilling very deep holes using the Resolution were frustrated by the friction and debris piled up in the hole. “Because of the capabilities of the riser vessel, [all sorts of drilling] projects will be more viable,” says Hisatake Okada, a paleoceanographer at Hokkaido University in Sapporo.

    Muddy bonanza.

    The soft sediments overlying drillers' initial target, the rocky crust, have provided a bounty of environmental records.


    But all of this comes at a steep price. The annual budget of ODP runs about $80 million, with 60% of that sum put up by the U.S. National Science Foundation (NSF) and the rest split among the other member countries. Countries spend additional funds to support scientists analyzing drilling samples and data. In comparison, IODP's annual operating budget is expected to start at $160 million and rise depending on the amount and nature of drilling carried out. Japan and the United States will split at least two-thirds of the operating costs equally, with other countries providing the rest—and possibly also funding mission-specific platforms.

    Researchers are arguing that the scientific advances will be worth the price, from a better understanding of earthquake mechanisms and the history of global climate change to the discovery of new energy sources and unusual microbes for use in biotechnology. And governments so far seem convinced (see sidebar). “There's no new money without new science ideas,” NSF's assistant director for geoscience Margaret Leinen told scientists at last fall's Geological Society of America meeting. “The thing that really excites people when you're arguing for several tens of millions of dollars is the [new] science.”

    New world view

    Ocean drilling's first significant achievement came in geophysics. “Past successes have changed our understanding of how Earth works,” says oceanographer Larry A. Mayer of the University of New Hampshire, Durham. By dating rock recovered from numerous sea-floor locations, researchers in the early 1970s confirmed the basic cycle of plate tectonics: New ocean crust forms at mid-ocean ridges and spreads outward toward deep-sea trench subduction zones. Crustal drilling also showed how great upwellings of hot rock, called plumes, could create chains of islands and seamounts such as Hawaii.

    These discoveries have raised new questions about solid earth cycles and geodynamics, one of three broad themes in IODP's initial science plan. Earlier drilling showed that large parts of the crust were formed by anomalous volcanic events separate from plate tectonics. Oceanic plateaus, so-called large igneous provinces, mostly formed during the mid-Cretaceous period 100 million to 140 million years ago when massive amounts of material burst through tectonic plates, venting heat and magmatic gases from Earth's interior. These features have as yet been barely sampled by drilling. Researchers hope that data from a combination of numerous shallow holes drilled by a riserless ship and deep holes drilled later by Chikyu may relate these events to Earth's evolution and reveal whether or not they triggered climatic changes that led to mass extinctions.

    Another major geophysical target will be subduction zones, where the clash of sinking and overriding plates generates 90% of the world's earthquakes. Chikyu's first target, reached by consensus, will be the Nankai Trough subduction zone offshore of Honshu, Japan's main island. Chikyu's riser will allow boring through the deep sedimentary deposits atop overriding plates. Those deposits were off-limits to the Resolution because of the danger of a blowout caused by inadvertently tapping into oil and gas deposits and by the depth of the fault target.

    Gaku Kimura, a geologist at the University of Tokyo, says Chikyu will also be able to install a new generation of instruments in the bore hole to monitor fault zone temperatures, stresses, deformation, and fluid pressures. “This is a completely different scientific approach” to studying rock samples, says Kimura. “It's like the difference between studying a live human being and dissecting a corpse.” An improved understanding of earthquake mechanisms could help Japan and other onshore communities assess the risk of future earthquakes.

    White gold?

    Methane trapped in ice beneath the sea floor could provide bountiful clean fuel.


    IODP may even take another shot at penetrating the Mohorovicic discontinuity. With the lubricating drilling mud circulating through its riser, Chikyu should be able to bore through 6 kilometers of oceanic crust and into the upper mantle. Such a hole would help refine knowledge of the structure, composition, and physical properties of the oceanic crust.

    Probing the past

    Although geophysics was the prime motivation for the first ocean drilling cruises, scientists in other disciplines soon capitalized on the data obtained from the cores. “Paleoceanography is one of the strong successes of ocean drilling,” says Jerry McManus, a paleoceanographer at the Woods Hole Oceanographic Institution in Massachusetts. Paleoceanographers recognized that cores recovered from layered sediments provided clues to a variety of climatic phenomena, sometimes going back 120 million years. McManus credits ocean drilling with clinching the orbital theory of climate change over millions of years, when Earth's wobbling drove climate oscillations. It also documented extreme climates such as the thermal maximum of the late Paleocene (55 million years ago) and rapid climate change.

    But the use of drilling for paleoceanographic studies has been held back by the limitations of the Resolution. It cannot drill in water much shallower than 100 meters, ruling out the inner continental shelves and coral reefs that hold long records of climate and sea-level change. And it can't handle more than the passing bit of sea ice, which has kept it entirely out of the Arctic Ocean. “You could lay out all the existing Arctic cores in my office,” says oceanographer Theodore Moore of the University of Michigan, Ann Arbor. A successful mission to the deep Arctic, he says, would provide “an entire history from 50 million years ago to the present.”

    Under IODP's initial science vision, mission-specific platforms capable of drilling in niche locations would play a major role in studying environmental change, processes, and effects. European scientists, for example, are hoping to obtain sufficient funding to send a drill ship to retrieve long sediment cores from the Arctic Ocean in the summer of 2004.

    Living laboratory

    The third leg of the IODP scientific tripod, studying the deep biosphere and the sub-sea-floor ocean, is also the newest. The original ocean drillers never imagined there could be life within the extreme temperatures, pressures, and chemical environments of the ocean floor. But reports of microbial colonies at sea-floor vents and volcanic rifts demonstrated otherwise. Now some experts in extremophiles, as these microbes are called, believe that as much as two-thirds of Earth's microbial population may be buried in oceanic sediment and crust.

    One major challenge will be to define the range of temperatures, pressures, chemistry, and other conditions under which these sea-floor communities thrive and to map their geographical distribution. Researchers would also like to clarify whether these microbes get their nutrients from material that filters down from the surface or from updrafts of fluids flowing through the interface between sediments and hard rock. The findings “could revolutionize ideas about the origins of life,” says Asahiko Taira, director-general of the Center for Deep Earth Exploration at the Japan Marine Science and Technology Center. Researchers also hope to add to the handful of industrially useful microbes already isolated from deep-sea regions.

    Adding life sciences to the agenda was a major selling point. NSF's Leinen says that “doing science on deep life was a very, very exciting idea” to members of the National Science Board, as well as to NSF's director, microbiologist Rita Colwell.

    Another underresearched area of inquiry is gas hydrates, deposits of ice-encapsulated methane. Although a potential new source of clean energy, they also could release a significant amount of greenhouse gases into the atmosphere if thawed as a result of global warming. Scientists want to learn how microbes generate the methane, how hydrates form, and whether methane can be produced at prices competitive with those of other fuels.

    These intriguing questions weren't even on the radar screen of those working on the ill-fated Mohole Project. But if IODP comes up with some answers, its scientists will owe a debt of gratitude to those who conceived and carried out the first scientific attempt to probe what lies beneath Earth's oceanic crust.


    Under New Management

    1. Dennis Normile,
    2. Richard A. Kerr

    TOKYO—The new Integrated Ocean Drilling Program (IODP) will be much more than a repackaging of the 19-year-old Ocean Drilling Program (ODP). Larger and more ambitious, IODP will require U.S. scientists to get used to a co-captain, and Japanese scientists to take a direct hand in managing their science. Many other countries are also hoping for a seat at the table, although their status remains unclear.

    “It has been a long, long way to get to this point,” says Hajimu Kinoshita, executive director of the Japan Marine Science and Technology Center (JAMSTEC), reflecting on the 7 years of negotiations leading up to next week's signing of a memorandum of cooperation between the United States and Japan. And the work is far from over, he says. “It's still not quite foreseeable what's beyond the mogul in front of us.”

    But this much is clear: In September, the JOIDES Resolution will complete its final voyage under ODP, and IODP will take over on 1 October. The new program will be without a ship for at least a year, however, depending on how long it takes the United States to outfit and deploy an upgraded drilling vessel. That vessel will be joined in 2008 by Chikyu, Japan's riser-drill ship, which this month went to a shipyard to get its drilling apparatus installed.

    Hard work.

    Deep-sea drilling still takes a lot of sweat, as does the effort to fund the project.


    For the National Science Foundation (NSF), the U.S. partner in the venture, IODP is definitely a big-ticket item. NSF now spends $64 million a year on ocean drilling, a sum that covers 60% of ODP costs and includes $16 million to fund U.S. researchers studying ocean drilling samples. IODP will cost NSF twice as much, starting in 2006, because of increased operating costs and additional research support. And that doesn't include an estimated $100 million to acquire and ready a second drill ship. Despite its higher price tag, IODP has sailed through approvals from NSF's oversight body, the National Science Board, and won an unprecedented endorsement from White House budgeteers for a place in NSF's fiscal year 2005 and FY 2006 budgets.

    Although the management structure for IODP will parallel the one used by ODP, Japan's increased prominence will mean a less dominant role for U.S. scientists. Under ODP, scientific panels made the decisions on where to send the ship and what science to do. But the panels were staffed in proportion to each country's contribution, which effectively gave the U.S. community the dominant voice. “We're used to being the big guy on the block and calling the shots,” says oceanographer Theodore Moore of the University of Michigan, Ann Arbor. In the future, he says, “we'll be sharing [decision-making] with the Japanese, and perhaps with the Europeans.”

    If giving up the helm will seem unfamiliar for U.S. scientists, taking it will be unprecedented for the Japanese. The $475 million Chikyu was proposed by JAMSTEC and the Science and Technology Agency (STA), now part of the Ministry of Education. Both STA and JAMSTEC were often drawn to big projects as much to boost the Japanese economy as to contribute to scientific discovery. Scientists were always included on internal advisory panels, but government officials usually made the final decisions.

    Negotiations over IODP led officials to realize that “international ocean drilling is proposal driven,” says Hisatake Okada, a paleoceanographer at Hokkaido University in Sapporo. As a result, JAMSTEC is scrapping its internal advisory committee in favor of the Japan Earth Drilling Science Consortium, a group of 39 universities modeled on the U.S. Joint Oceanographic Institutions for Deep Earth Sampling (JOIDES), which provides scientific guidance to ODP. The consortium will try to work out a consensus of the community's priorities and present that agenda to funding agencies and IODP, which will make the final decisions.

    Although Japan and the United States are first among equals, they haven't forgotten the rest of the world. To make best use of the drilling platforms, “we need lots of proposals,” says Asahiko Taira, director-general of JAMSTEC's Center for Deep Earth Exploration. But money talks, too. ODP membership costs $3 million a year; the United Kingdom, Germany, and France pay the fee individually, and others pool their pennies to earn a single vote. And although all 22 countries currently involved in ODP have expressed interest in IODP, the new fee structure—starting at $1.5 million per year and rising to $5.6 million by the time both ships are deployed in 2008—could be too steep for some.

    Deep driller.

    Japan's Chikyu drill ship, shown here before being fitted with its drill rig, will bring new capabilities to ocean drilling.


    In January 2002, Germany, France, and the U.K. joined other national funding agencies to form the European Consortium for Ocean Research Drilling (ECORD). “Over 15 European countries have so far expressed an interest in taking part in IODP as members of ECORD,” says Chris J. Franklin of the U.K.'s Natural Environment Research Council in Swindon, who is chair of ECORD's interim council. But Franklin says that ECORD's original hope of becoming a co-equal with Japan and the United States is now “very uncertain.” ECORD is also waiting to hear back from the European Union on its request for funds to build mission-specific drilling platforms, although the initial response has been negative.

    Participation is also a stretch for Asian countries. China has been an associate member of ODP since 1998 and hopes to retain that status in IODP by upping its annual contribution from $500,000 to $1 million, says Shen Jianzhong, an official at China's Ministry of Science and Technology. Money is also an obstacle for other members of what has been a Pacific Rim consortium involving Canada, South Korea, Taiwan, and Australia. Ju-Chin Chen, an oceanographer at National Taiwan University in Taipei, says that the consortium is hoping to stay together for IODP but that “the membership fee might be a problem.”


    Calling All Coronavirologists

    1. Martin Enserink*
    1. With reporting by Gretchen Vogel in Berlin.

    The ongoing epidemic of acute respiratory disease is shining an intense spotlight on one of virology's backwaters

    In December, Kathryn Holmes gave a talk at a meeting on respiratory viruses titled “Coronaviruses: How Important?” Although the audience listened politely, Holmes recalls, most would probably have answered the question with “not very.” After all, coronaviruses caused serious disease in farm animals and pets but nothing more than the common cold in humans. “It wasn't really on their radar screen,” Holmes says.

    It is now.

    Three weeks ago, studies fingered an unknown new member of the coronavirus genus as the most likely culprit behind SARS, or severe acute respiratory syndrome. Shortly after, the U.S. Centers for Disease Control and Prevention (CDC) flew Holmes, a researcher at the University of Colorado Health Sciences Center in Denver, to Atlanta to give public health experts and scientists a crash course on her field. Like most of the 12 labs that teamed up to identify the cause of SARS, CDC didn't have much experience with coronaviruses, but “they're very quick studies,” Holmes says.

    So are others. Hordes of government, academic, and corporate scientists have rushed in to characterize the virus further and test possible drugs and vaccines. Last weekend, a group of Canadian researchers was the first to announce that it had deciphered the new virus's entire genome. All of a sudden, coronavirologists (and their research community) find themselves in the thick of a fast-moving global drama.

    It's little less than a culture shock. Accustomed to publishing in specialized virology journals, these virologists are watching The New England Journal of Medicine and The Lancet rush one paper after another about the epidemic onto their Web sites. The program for a conference about coronaviruses and the related arteriviruses, scheduled for next month in a small seaside town in the Netherlands,* has been hastily revised, say organizers Eric Snijder and Willy Spaan of Leiden University Medical Center. So many new faces are expected beyond the 120 or so regulars at the triennial meeting that hotel rooms are running out.

    Many coronavirologists have, like Holmes, helped researchers investigating the outbreak with advice and reagents. Now that the pathogen is coming into view, many want a piece of the action and have started to grow and study the virus themselves. Decades of dogged labor and thousands of scientific papers may now pay off when it comes to fighting the outbreak, Holmes says. At Utrecht University in the Netherlands, for instance, Peter Rottier's group has developed several candidate cat coronavirus vaccines; the researchers have also found peptides that block the entry into cat and mouse cells of the feline and murine coronaviruses, respectively. Some of that work may be applicable to the SARS virus, Rottier says.

    Unknown origins

    For the moment, researchers are still trying to prove that the new coronavirus really is the cause of the outbreak. The final piece of evidence could come from ongoing experiments at U.S. and Dutch labs, in which monkeys are infected with the virus to see if they develop SARS-like symptoms. Meanwhile, several of the 12 labs participating in the World Health Organization network (Science, 11 April, p. 224) are also trying to determine whether the virus has an accomplice, such as the human metapneumovirus that Canadian researchers first discovered in SARS patients.

    But already, the indictment of a coronavirus is a surprising twist in the field's 60-year history. The first coronavirus isolated, in 1937, was the avian infectious bronchitis virus, which can cause devastating disease in chicken flocks. Since then, researchers have found cousins that infect cattle, pigs, horses, turkeys, cats, dogs, rats, and mice. The first human family member was cultivated from nasal cavities in the 1960s, after researchers realized that another group, the rhinoviruses, were responsible for only about half of all common cold infections. Today, the two known human coronaviruses, OC43 and 229E, are thought to cause about 30% of cases, depending on the year.

    The viruses themselves are something of an oddity. With a genome of more than 30,000 nucleotides, coronaviruses are relative giants, and they have a complex two-step replication mechanism. Many RNA virus genomes contain a single, large gene that is translated by the host's cellular machinery to produce all viral proteins. Coronaviruses, instead, can have up to 10 separate genes. Most ribosomes translate the biggest one of these, called replicase, which by itself is twice the size of many other RNA viral genomes. The replicase gene produces a series of enzymes that use the rest of the genome as a template to produce a set of smaller, overlapping messenger RNA molecules, which are then translated into the so-called structural proteins—the building blocks of new viral particles.

    Most coronaviruses cause either a respiratory or an enteric disease, and some do both. But the differences among these types can be small. In 1999, for instance, a team led by Luís Enjuanes of the Autonomous University of Madrid, Spain, showed that just two point mutations can change a mostly enteric virus that can kill piglets into a nondeadly one that excels at the respiratory route but replicates poorly in the gut.

    Researchers have grouped coronaviruses into three categories based on cross-reactivity of antibodies backed up by genetic data (see figure); the two previously known human viruses fall into different groups. Investigators had hoped that the genome sequence of the new virus would help pinpoint its origins. But a first glance at the data has yielded few clues, they say. The sequence, posted online by the Michael Smith Genome Sciences Centre in Vancouver, Canada, on 12 April and 2 days later by CDC, confirmed what researchers had gleaned from a few small snippets of the genome a week earlier: The new coronavirus does not fit into any of the clusters but is in a new one by itself.

    New bug on the block.

    Based on the genome sequence, it appears that the new virus associated with SARS (bottom) is not closely related to any known coronavirus; it has been placed in a new, fourth cluster.


    That leaves wide open the question of where the virus came from. Experiments in animals co-infected with two coronaviruses have shown that as many as 50% of newly formed virus particles are the result of a recombination, and some researchers have suggested that the new virus, too, is a hybrid. But if that's true, neither of the two progenitor viruses is known, Enjuanes says—nor is it clear how such a recombined virus would end up in humans if neither of the parent viruses infects people.

    Another possibility is that the virus has been infecting one animal species for a long time—perhaps without causing noticeable disease—and accidentally jumped to humans, where it found a favorable environment. If so, the animal host may be difficult to find, says Snijder. Researchers know only about a dozen coronaviruses because they haven't looked much beyond domestic animals and humans. “We may well find a coronavirus in every mammalian or avian species we look at,” says Snijder.

    The heavy economic toll that animal coronaviruses have inflicted on agriculture has led to vaccines for several types, some of them based on killed virus, others on weakened, live viruses. That's “encouraging,” says Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases in Bethesda, Maryland, because it suggests that SARS, too, might be contained by a vaccine. But there are pitfalls as well, Rottier says. A live vaccine to prevent feline infectious peritonitis is controversial, he notes: Many researchers think it predisposes cats to more serious disease.

    Developing a vaccine may become crucial because it seems increasingly unlikely that the disease can be stamped out by rigorous isolation of patients. As Science went to press, Hong Kong was reporting ever-growing numbers of patients, along with the rest of China. Nor is there any sign that SARS is becoming less virulent as it spreads from one human to another, a phenomenon that is believed to have prevented uncontrolled spread of other zoonotic diseases. But how serious the pandemic could become is anyone's guess.

    Coronavirologists, who have sometimes found it hard to get funding, say they regret the human toll but welcome the attention for their field. And there's another bright side, says Rottier: When he tells people he's working on coronaviruses, he doesn't get that blank stare anymore.


    From Bioweapons Backwater to Main Attraction

    1. John Bohannon*
    1. John Bohannon is a former Science intern now based in Lyon, France.

    Anthrax researchers, like experts on coronaviruses (see above story), find themselves thrust into a new environment.

    NICE, FRANCE—As they strolled in from the mellow French Riviera sun to the conference desk at the chic Boscolo Plaza Hotel to collect registration packs and satchels—compliments of IGEN International Inc.—one thing was clear to the attendees of the 5th International Conference on Anthrax, which began here late last month: Anthrax research ain't what it used to be. The last time they all got together was in June 2001 at a small liberal arts college in Annapolis, Maryland, where they shared a picnic on the lawn and slept in dormitory rooms for $20 a night. This time around, everyone has enjoyed sumptuous three-course lunches and a banquet—compliments of BioPort Corp. in Lansing, Michigan—and many have stayed at the Boscolo for $167 a night.

    Less than 4 months after that 2001 meeting, the United States became the victim of the first intentional use of anthrax as a bioweapon since World War I. As a result, an enormous amount of attention and money has been focused on the causative agent, the bacterium Bacillus anthracis. “It's a completely different field now,” says Stephen Leppla, a molecular biologist who leads an anthrax research group at the National Institute of Allergy and Infectious Diseases (NIAID) in Bethesda, Maryland. NIAID's budget for anthrax research has ballooned from $3.2 million in 2001 to around $75 million this year.

    Patient assassin.

    Anthrax can lie in wait in the soil for decades until inhaled or eaten by cattle.


    The first fruits of increased funding were on display in Nice, with new progress on vaccines and therapies as well as the basic biology of the anthrax bacterium. In August 2001, says Paul Keim, a veteran anthrax researcher at Northern Arizona University in Flagstaff, “I was being told to prepare for a 20% budget cut.” Now he's more concerned about how to deploy the windfall of anthrax funding most efficiently.

    Sorting out the basics

    Opening the conference's first session, Tim Read of The Institute for Genomic Research (TIGR) in Rockville, Maryland, gave a sneak preview of a tool that many in his audience have eagerly awaited. TIGR finished sequencing the 5.23 million base pairs of DNA that make up the single circular chromosome of B. anthracis months ago and has made it available in a fragmented form online. Read provided an overview of the now fully assembled genome, which TIGR says will be published “soon.”

    With the genetic blueprint of B. anthracis known, many newcomers to this fast-growing field assume “that much of the basic work is complete,” says microbiologist Paul Jackson of Los Alamos National Laboratory in New Mexico, but “that is just not true.” What is known is that B. anthracis is a naturally occurring pathogen, mainly of herbivores such as cattle and sheep. It appears to spend its life as a tiny, robust spore waiting in the soil for years—even decades—before being ingested or inhaled or entering a mammal's body through a wound. Researchers also know that the basic weapons in the anthrax armory—the toxic secreted proteins and the protein capsule that helps the bacterium evade the immune system—are encoded on two plasmids, small loops of DNA separate from the chromosome.

    But there are still big gaps to be filled in the bug's molecular biology, says Theresa Koehler, a microbial geneticist at the University of Texas Medical School in Houston. Koehler presented work, done with Agathe Bourgogne in her lab using a DNA microarray developed by Scott Peterson of TIGR, revealing that the way genes regulate anthrax's attack on the body is far more complex than has been assumed. Each of the plasmids toted by B. anthracis carries regulatory genes that orchestrate the synthesis of the toxin and capsule proteins. But Koehler has demonstrated that one of these plasmid-based regulators, called atxA, controls an unexpectedly broad array of other genes on both plasmids and the chromosome.

    Aside from identifying several new genes potentially important in the progress of the disease—and thus drug development—the work may call for a reassessment of both accepted data and standard lab practices. Many researchers, who are unable to meet the strict safety guidelines set for strains containing both plasmids, use a strain containing only the plasmid with atxA and the toxin genes on it. This then exempts them from the toughest safety regulations. Such strains are used with the assumption that they are just like the double-plasmid strain but without the capsule. However, Koehler's results reveal this to be too simple, and “for certain investigations employing strains with only one of the two plasmids,” she says, “the physiological significance of the results could be questionable.”

    Another strategy for finding new genes important for pathogenesis is to identify those that differ between B. anthracis and its already sequenced relatives, the opportunistic pathogen B. cereus and B. thuringiensis, a harmless bacterium whose insect-killing Bt toxin has been engineered into many crop plants. This analysis is under way at TIGR, and Read reports that their genomes are remarkably similar, but B. anthracis appears to have 150 genes that the other two species lack. Drug researchers, intent on finding weaknesses to exploit, are now looking very closely at these genes.

    A new urgency

    In principle, anthrax infections are easy to treat because the bacteria can be wiped out with antibiotics—as long as they are not resistant. But if too many of the bacterial spores have germinated within the body, it is often too late: They have already begun releasing the toxic proteins that do most of the damage.

    One way to save someone with an anthrax infection would be to neutralize these toxin proteins. Rather than synthesize a chemical to do the job, several groups are mass-producing human antibodies that can bind to the toxin and prevent it from making mischief.

    Herman Groen, a cell biologist at IQ Therapeutics in Groningen, the Netherlands, has been working on this idea for several years. Human antibodies against anthrax are hard to come by, so Groen tacked up a sign at the 2001 anthrax meeting, promising a “free trip to Holland” for any anthrax researcher who had been vaccinated. Several took him up on the offer, and IQ harvested some of their antibody-producing cells and fused them with cancer cells. The resulting hybrids are living factories, pumping out antibodies that neutralize the toxic proteins. Groen and his colleagues have found that these antibodies protect mice against injected anthrax, and they are now testing whether the antibodies also protect rabbits against inhalational anthrax.

    Headline news.

    The 2001 anthrax attacks put bioterrorism on the public agenda; above, a Greek biohazard team in training.


    IQ isn't the only company on this track. Conspicuously absent from the conference was Human Genome Sciences (HGS), another Rockville biotech company. HGS announced on 18 March that it has not only produced human antibodies to neutralize anthrax toxin—a treatment they have named ABthrax—but has already proven their effectiveness in trials with rabbits and monkeys and plans to move soon to human trials to test for any side effects. “They seem to be ahead of everyone at the moment,” admits Groen, “but it's not a zero-sum game. Having multiple antianthrax drugs on the market will reduce the chances of people getting killed by the disease, which is always a good thing.”

    The need for new drugs would be less urgent if everyone could simply be vaccinated to prevent infection, but B. anthracis has proved a difficult bug to immunize against. BioPort produces the only licensed anthrax vaccine, called BioThrax, a mixture of harmless proteins from the anthrax bacterium. But it must be taken several weeks before exposure and requires multiple injections over an 18-month period and annual boosters. It is also unknown whether the vaccine can protect against inhalational anthrax, the route of infection that most worries bioweapons experts.

    The U.S. National Institutes of Health is funding tests of a new vaccine based on a single protein called protective antigen that is likely to be added to the nation's emergency stockpile (Science, 26 April 2002, p. 639). But some groups are pursuing more novel approaches. Darrell Galloway of the Naval Medical Research Center in Silver Spring, Maryland, told the meeting that rabbits can be immunized against inhalational anthrax with a so-called DNA vaccine, although such vaccines have had limited success in the past. Galloway made plasmids carrying DNA encoding two anthrax proteins, including a truncated form of the toxin. When this was injected into rabbits, says Galloway, their own cells took up the DNA, expressed the anthrax proteins, and presented them to the immune system. The method could prove far cheaper and more effective than the traditional method of injecting the proteins themselves.

    The flip side of trying to find ways to fight the disease is the danger of doing the opposite. During the last session of the conference, Lance Price, now a microbiologist at Johns Hopkins University in Baltimore, described his work in Keim's lab cultivating anthrax resistant to ciprofloxacin, the antibiotic that is a first line of defense against infection. His conclusion: “It was pretty easy, unfortunately.”

    The purpose of the work is to develop quick molecular tests for resistance in anthrax, says Jackson of Los Alamos. But making the results public was controversial at the time, because such work could, in principle, encourage terrorists to develop nastier strains of anthrax. There's no easy answer for how best to strike the balance between keeping anthrax research open (and useful to scientists) versus closed (and safe from terrorists), but “it is crucial to err on the side of an open research environment,” says Keim. “You can view this as a race between us and the terrorist in which you never win, you can just stay ahead.”


    Mathematics World Abuzz Over Possible Poincaré Proof

    1. Dana Mackenzie*
    1. Dana Mackenzie is a writer in Santa Cruz, California.

    One of math's hottest unsolved problems may be about to topple. Even if it doesn't, experts agree that its would-be conqueror has achieved something wonderful

    Prove the Poincaré Conjecture = Win Lasting Glory. For 99 years mathematicians have toiled in vain to insert themselves into that equation. In 2000, the Clay Mathematics Institute in Cambridge, Massachusetts, upped the ante by putting a $1 million bounty on a proof. Now, the most credible claimant in years is offering, circumspectly and piecemeal, what might, on further inspection, turn out to be the coup de grâce. Maybe.

    Last week, in an eagerly anticipated series of lectures at the Massachusetts Institute of Technology (MIT) in Cambridge, Grigori Perelman of the Steklov Mathematical Institute in Saint Petersburg, Russia, outlined a possible proof of the conjecture. Mindful of past failures, nobody—including Perelman himself (who declined to be interviewed for this story)—is declaring victory yet. But many experts say this attempt might be special. “We're desperately trying to understand what he has done here,” says Tomasz Mrowka, a topologist at MIT. “It's clear he has done something. You're not going to spend a lot of time [reading his papers] and come away empty-handed.” According to Bennett Chow, a differential geometer at the University of California (UC), San Diego, “it's clearly a major breakthrough, but it's way too early to say anything about whether he has proved the conjecture.”

    The guessing game—has he or hasn't he?—started 12 November, when Perelman posted a preprint on the Internet that stated enigmatically, “We give a sketch of an eclectic proof of this conjecture.” Many readers thought Perelman was simply outlining a possible attack on the problem, but Perelman clarified by e-mail that he intended to unveil an actual proof. On 10 March he posted a second paper containing more details of the work, and he says one more will finish the job. The MIT talks covered material from all three installments.

    In fact, Perelman is gunning for a problem even grander than the one the French mathematician Henri Poincaré unleashed. While laying the groundwork for the subject now known as topology, Poincaré wondered whether topology had enough tools to identify the simplest three-dimensional object, the surface of a four-dimensional egg. In other words, does a 3D space with no distinguishing characteristics—no holes, no Möbius-like twists, no handles, no edges—have to be a 3D sphere?

    Topo logic.

    In 2D surfaces, negatively curved geometry (red) inevitably creates holes.


    If Perelman's work checks out, it will also confirm a much broader hypothesis. The conjecture, by William Thurston of UC Davis, grew out of a landmark theorem by the 19th century German mathematician Bernhard Riemann. Riemann said that any two-dimensional space (i.e., any surface) can be massaged to have the same kind of curvature everywhere: either negative, positive, or flat. The more negative curvature such a “geometrized” surface has, the more holes it has (see figure). As a result, any hole-free surface must be positively curved and thus topologically equivalent to a sphere.
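The curvature-and-holes relationship described above is not spelled out in the article, but it is made quantitative by the standard Gauss–Bonnet theorem; as a reference point (an addition here, not part of the original report), for a closed orientable surface $S$ with constant curvature $K$ and genus $g$ (the number of holes), the total curvature is fixed entirely by the topology:

```latex
% Gauss–Bonnet: total curvature depends only on the topology of the surface
\int_S K \, dA \;=\; 2\pi\,\chi(S) \;=\; 2\pi\,(2 - 2g)
```

So a hole-free surface ($g = 0$) must have positive total curvature, matching the sphere, while any constant-curvature surface with two or more holes ($g \ge 2$) is forced to be negatively curved.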

    With his “geometrization conjecture,” Thurston sought to take Riemann into the third dimension. Because 3D spaces are much more complicated than 2D ones, mathematicians can't iron them out into constant curvature à la Riemann. In the late 1970s, however, Thurston (then at Princeton University) proposed that they could do something almost as good. By cutting in the right places, he conjectured, they could divide any 3D space into pieces that can be converted into one of eight highly uniform geometries, ranging from hyperbolic (negatively curved) to spherical (positively curved).

    If true, the geometrization conjecture would give mathematicians a sort of “periodic table” for classifying 3D spaces. It would also instantly solve the Poincaré conjecture. Because each of the seven nonspherical geometries leaves telltale topological fingerprints, a space with no identifying marks would have to be spherical. Q.E.D. But how to prove it?

    In the early 1980s, Richard Hamilton (now at Columbia University) proposed that mathematicians could induce a 3D space to geometrize itself by encouraging it to “flow” toward a uniform geometry, much as heat flows through an iron bar. In 1988, Hamilton used his “Ricci flows” to reprove Riemann's theorem for 2D surfaces.
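The article does not write out Hamilton's equation, but the Ricci flow it alludes to has a compact standard form (stated here for orientation, not taken from the original text): the metric $g_{ij}$ describing the space's geometry evolves in proportion to its Ricci curvature $R_{ij}$, smoothing out irregularities in curvature much as the heat equation smooths out temperature:

```latex
% Hamilton's Ricci flow: the metric evolves by (minus twice) its Ricci curvature
\frac{\partial g_{ij}}{\partial t} \;=\; -2\,R_{ij}
```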

    In three dimensions, though, Hamilton's scheme ran into a problem. Different parts of a space would flow to exaggerate their different geometries, causing the boundaries between them to stretch into increasingly thin “necks.” As Hamilton's colleague Shing-Tung Yau of Harvard University pointed out, these necks mark the spots where mathematicians should perform the “surgery” Thurston's conjecture requires. But if the necks snapped too soon or formed awkward shapes, including a particularly troublesome one called the “cigar,” the operation would fail. Furthermore, what was to keep the surgeries, like plastic surgeries on a Hollywood star, from going on endlessly?

    Both Yau and Hamilton have wrestled with these questions for years. Perelman used many techniques they developed but added a key idea. He identified a sort of “entropy” that increases as the space flows. By giving Ricci flows a sense of direction, Perelman's entropy helped keep a space moving forward toward geometrization. More subtly, it also gave Perelman control over the size and shape of collapsing regions.
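As a rough guide to what this “entropy” is (a standard gloss, not detail from the article): in his first preprint Perelman introduced monotone quantities, among them the functional usually written $\mathcal{F}$, which never decreases along a suitably coupled version of the Ricci flow and so rules out the flow returning to earlier configurations:

```latex
% Perelman's F-functional, non-decreasing along the coupled Ricci flow;
% R is scalar curvature, f an auxiliary function on the 3D space M
\mathcal{F}(g, f) \;=\; \int_M \left( R + |\nabla f|^2 \right) e^{-f} \, dV
```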

    Many experts, although not all, seem convinced that Perelman has stubbed out the cigar and tamed the narrow necks. But they are less confident that he can control the number of surgeries. That could prove a fatal flaw, Yau warns, noting that many other attempted proofs of the Poincaré conjecture have stumbled over similar missing steps.

    Even if Perelman falls short of his main goal (and the million-dollar prize), Chow says his work represents a huge advance in understanding Ricci flows. “It's like climbing a mountain, except in the real world we know how high the mountain is. What Hamilton did was climb incredibly high, far beyond what anyone expected. Perelman started where Hamilton left off and got even higher yet—but we still don't know how high the mountain is.”


    Center of a Circus

    1. Ben Shouse*
    1. Ben Shouse is a writer in Sioux Falls, South Dakota.

    A devastating oak disease has reshaped two scientists' careers

    SOQUEL DEMONSTRATION FOREST, CALIFORNIA—Matteo Garbelotto stands under the redwoods and strips. The plant pathologist from the University of California (UC), Berkeley, is in the Santa Cruz Mountains to experiment on a tree-killing microbe and, as usual, things aren't going as planned. A fellow researcher forgot the cheesecloth used to cover inoculation wounds in trees, so Garbelotto tears up his undershirt as a substitute. He re-dresses and smiles sheepishly, as if to say, “This is just another day at the office.”

    Unexpected acts like this one—part performance, part sacrifice—have become the norm for Garbelotto ever since he became one of two lead scientists investigating sudden oak death. The disease is strangling oak trees in the hills of coastal Northern California and threatening to spread to other forest species; in one scenario, it could leap across the continent to eastern forests, potentially costing billions of dollars. Garbelotto's dramatic bent—inherited from his mother, an Italian actress—comes in handy. Sudden oak death has thrust him and research partner David Rizzo of UC Davis into the middle of a three-ring circus where science, government, and the media intersect.

    The crisis has forced the two researchers to increase their already prodigious research output, soothe officials demanding information and solutions, and respond to concerned citizens and overeager reporters. Meanwhile, the disease remains a mystery: Where did it come from? How does it move around? And how far will it go?

    The two scientist-ringmasters have not answered those questions yet. But their efforts have won an ovation from their peers. “I don't know how they're doing it all. They've got so many balls in the air,” says Mark Stanley, a retired California Department of Forestry and Fire Protection official who chairs the state's Oak Mortality Task Force. “The progress is remarkable, what they have achieved in just a year and a bit, compared to 20 or 50 years [to understand] some diseases,” says plant pathologist Clive Brasier of the U.K. Forestry Commission.

    Budding disaster

    In the spring of 2000, before Rizzo and Garbelotto had ever worked together, they visited a sunny hillside north of San Francisco that bore a dark omen—a stain on the trunk of a coast live oak oozing black sap. Some researchers thought beetles were to blame, but by January, the two had identified the cause as a water mold called Phytophthora ramorum, a funguslike relative of brown algae previously known only in rhododendrons in Germany and the Netherlands.

    The oak death duo.

    Matteo Garbelotto (left) and David Rizzo with a specimen of the pathogen that has strangled thousands of California oaks.


    It has now become a plague, killing tens of thousands of coast live oaks, tanoaks, black oaks, and Shreve oaks from Big Sur to southern Oregon. The pathogen infiltrates food transport tubes and growing cells just inside the bark, feasting until it has girdled the tree. The process takes about 3 months in a susceptible tree, and then the tree is doomed. It soon develops cankers on the bark, and within a year the leaves fade to dead brown.

    P. ramorum has appeared in 26 other species, a startling number. Most of them suffer only twig or leaf infections, but they include some of the most important species in the western United States. The list includes rhododendron, a widely traded ornamental; Douglas fir, backbone of the northwestern timber industry; and coast redwood, a symbol of California.

    The story of a scourge that could devastate the Golden State's beloved forests and manicured backyards grabbed the media's attention. The humble field of plant pathology became headline news, and Rizzo and Garbelotto have been roped into almost 1000 media interviews since the outbreak began.

    Rizzo, 42, traces his start in the field to his college days, when he hiked frequently in the Blue Ridge Mountains of the Appalachians. There he saw the stumps of chestnuts, once one of the tallest trees in the eastern United States. Between 1900 and 1940, a blight eliminated 3.5 billion of them. From this grew Rizzo's fascination with the power of microscopic organisms to change whole forests.

    Garbelotto, 37, first became acquainted with fungi of the edible type: He grew up in Venice and the Italian Alps, where he and his siblings frequently gathered mushrooms for the dinner table. Later he studied pathogenic fungi.

    Media reports often typecast the two personalities. Garbelotto has been called “dashing” and “colorful,” while Rizzo is chronically referred to as “bookish” or even “balding.” “He's the cool guy from Berkeley, and I'm the boring guy from Davis with two kids,” Rizzo says.

    Whatever their quirks, the two seem doomed to being cast as heroes chasing after a villain. “You have this Godzilla fungus, and the scientists are often portrayed as the Lone Ranger and Tonto,” says plant pathologist Susan Frankel of the U.S. Forest Service in Vallejo, California.

    When the disease hit, neither was in an ideal position to ride in on a white horse—Rizzo was focusing on pear trees and conifers, and Garbelotto was newly hired and just setting up a lab. Yet due to recent retirements, they were the only forest pathologists in the UC system, so they took the lead.

    Now more than 1000 researchers and officials are actively involved in sudden oak death research, and some 300 attended a December meeting on the disease in Monterey. Yet Rizzo and Garbelotto remain central. Rizzo's lab is still the only one with enough expertise to certify infections in new areas. And during the epidemic's early days, Garbelotto's lab was the only one performing polymerase chain reaction tests to identify P. ramorum.

    Black omen.

    By the time a tree oozes sap (top), the first outward sign of infection, it is already doomed. Stands of dead trees mark the disease's path.


    Both labs field a steady flow of questions from local, state, and federal officials, as well as the media and the public. At first, the differing approaches of government officials and researchers caused confusion. For example, any announcement of a new species capable of hosting P. ramorum triggered an automatic quarantine preventing the trees from being exported, something researchers say put undue pressure on the release of new information. Now regulators have decided to wait for peer review in many cases, Garbelotto says.

    Add an aggressive media to the mix of science and government, and the circus really comes to town. In the most explosive example, a newspaper filed a Freedom of Information Act request for any material about sudden oak death. The California Department of Food and Agriculture, which keeps tabs on the disease, released internal e-mails that mentioned that Garbelotto had found a preliminary positive result for P. ramorum in the Sierra Nevada Mountains. The messages became the basis for an article in March 2002 headlined “Oak pest discovered in Sierra”—which would have been a major leap from the west coast of California almost to its eastern border. The paper partially retracted the story the next day, and it's still not clear whether the disease has reached the Sierra.

    That kind of exposure—when today's work can become tomorrow's quarantine or front-page news—can make Rizzo and Garbelotto feel more like tightrope walkers than scientists. But it would be wrong to retreat into the lab, Garbelotto says. Instead, he and Rizzo have decided to respond to all media requests and to answer direct questions openly instead of concealing preliminary data.

    The pair's approach may be distasteful to other scientists—many disdain public relations—but Garbelotto says it is appropriate for crises such as sudden oak death. “Having all this new stuff coming in fairly frequently feeds the public interest and keeps the legislators thinking that this is a hot issue.”

    For Garbelotto's less gregarious colleagues, his openness is a boon. “It's great because he attracts the media attention and leaves the rest of us an atmosphere that's quite a bit quieter than we might otherwise have,” says plant pathologist Everett Hansen of Oregon State University in Corvallis, who leads research in the state's attempt to eradicate P. ramorum. Rizzo often works behind the scenes, answering questions from people whom some scientists might dismiss as crazy Californians. For someone so busy, he writes surprisingly long replies to messages alleging that sudden oak death is caused by airplane contrails, radiation from radio towers, or genetically modified organisms.

    Rizzo and Garbelotto say they are open to unorthodox ideas despite the hassle. For example, a nonexpert prodded them to treat sudden oak death with chemicals containing phosphoric acid, a technique that has worked against P. cinnamomi in Australia. Experiments showed it can prevent P. ramorum infection, and several chemical companies are now working on getting phosphoric acid certified as a pesticide to save backyard trees. Because it is difficult to apply, it is probably not a practical treatment for large numbers of forest trees.

    Decision tree

    Three persistent riddles now occupy the minor-celebrity researchers. The first is figuring out where the pathogen came from. Knowing that would help scientists find pools of genetic resistance and narrow the list of possible chemical treatments. All signs point to a recent introduction, but not from Northern Europe, where a genetically distinct pathogen has taken hold. Finding P. ramorum's true origin would require extensive surveying.

    Tree hunger.

    Phytophthora ramorum could devastate eastern forests as well as western ones.


    The second question—how P. ramorum hops around—seems a little more tractable. Researchers know it travels short distances in rain splashes. But they can't explain why infections tend to appear in clumps ranging from a few meters across to a hectare. Rare events such as sudden updrafts during rainstorms are a possible explanation but are also very difficult to test. Rizzo's lab has hung buckets as high as 50 meters in forest canopies to track spore movements.

    In a related question, the researchers would like to know which trees most of the spores come from. They suspect that bay trees and other understory species are reservoirs for the disease. In experimental plots, they are monitoring how often each species becomes infected and whether neighboring trees fall prey at the same time. This basic epidemiological information is essential to predicting the plague's future.

    The third question is the most ominous: How far will the deadly pathogen spread? Preliminary research at containment facilities in Fort Detrick, Maryland, shows that eastern oak species are even more susceptible to P. ramorum than western species, raising concerns that its spread could be catastrophic.

    “This is the scariest thing to ever happen in my lifetime,” says Kurt Gottschalk of the Forest Service's Northeastern Research Station in Morgantown, West Virginia. He says that if P. ramorum jumps the Great Plains, its economic impact could exceed that of chestnut blight, because it could affect most of the hardwood timber in the eastern U.S.

    More immediately, researchers worry that the pathogen could jump California's central valley into the Sierra Nevada. In late 2001, Garbelotto's lab got an ambiguous positive result for P. ramorum from a maple tree in the foothills east of Sacramento. They have since found about 100 similar results, but all of the trees also test positive for a related pathogen. Some of these trees have died from symptoms resembling sudden oak death, but intense research has so far failed to find the exact culprit.

    Both scientists remain vigilant for signs of death outside the most susceptible tree species. In the Santa Cruz Mountains, Garbelotto has finished his experiment and now looks for discolored redwood branches and blackened sprouts from tree trunks. He searches hard, hoping to find nothing. He would literally give the shirt off his back to know how to stop this disease.