News this Week

Science  14 Jul 2000:
Vol. 289, Issue 5477, pp. 222
  1. AIDS MEETING

    South African Leader Declines to Join the Chorus on HIV and AIDS

    1. Jon Cohen

    DURBAN, SOUTH AFRICA—When South African President Thabo Mbeki rose to address the opening ceremony for the XIII International AIDS Conference here last Sunday, the thousands of researchers packed into Kingsmead Stadium hoped he would say three simple words: HIV causes AIDS. He didn't. “He waffled while Rome is burning,” said Glenda Gray, a pediatrician who co-runs a perinatal HIV clinic at Soweto's enormous Chris Hani Baragwanath Hospital.

    HIV has infected one in five adults in this country, which puts South Africa in the unenviable position of having more infected people than any other country in the world. Mbeki recently convened a panel to help his government develop policies to tackle the growing AIDS crisis, but he included so-called “dissidents,” who insist that HIV does not cause disease and question whether AIDS is a new disease or old diseases collectively given a new name. Scientists around the world have berated Mbeki for giving new lifeblood to the dissidents, whose arguments had been thoroughly dismissed years ago, and they were hoping that the president would finally distance himself from their views.

    Mbeki addressed this criticism head-on in his lengthy speech to the conference, which runs through 14 July. “Some in our common world consider the questions that I and the rest of our government have raised around the HIV/AIDS issue … as akin to grave criminal and genocidal misconduct,” said Mbeki. “What I hear being said repeatedly, stridently, and often angrily is ‘Do not ask any questions!’”

    When it came to stating his own position about the role of HIV, however, Mbeki was anything but direct. Much of his speech quoted from a 1995 World Health Organization report that fingered “extreme poverty” as “the greatest cause of ill health and suffering across the globe.” Mbeki did note, however, that his government would continue to intensify its anti-AIDS campaign by encouraging the use of condoms, supporting research on an AIDS vaccine and anti-HIV drugs, and responding humanely “to people living with AIDS and HIV.” But he made no mention of his decision not to supply relatively cheap courses of anti-HIV drugs to infected pregnant women, which studies have shown can cut transmission of the virus to their babies by 50%.

    Mbeki's failure to acknowledge directly that HIV causes AIDS has angered the country's AIDS researchers. “This was a good opportunity for him to put a closure on the whole thing, and he didn't,” complained virologist Lynn Morris, who works at the National Institute of Virology in Johannesburg and sat on the panel that Mbeki convened. Many visiting scientists also expressed their dismay. “He could have emerged as a spectacular leader of the whole African continent,” said Anthony Fauci, head of the U.S. National Institute of Allergy and Infectious Diseases. “He flubbed it.”

    Still, some top South African researchers saw the speech as a step forward. “Considering all we've gone through over the last few months, it's an excellent speech,” said Malegapuru William Makgoba, head of the country's Medical Research Council and another member of Mbeki's panel. “He had the option of just talking about AIDS. But he always talked about HIV/AIDS. So he links HIV to AIDS.” What some saw as wordsmithery, Makgoba concluded was a “clever way” for Mbeki to extricate himself from the debate. Those who criticized the speech, Makgoba said, were being “churlish.”

    The speech itself came after a day of rumors and controversy surrounding the sudden and unexplained cancellation of a press conference to publicize the so-called “Durban Declaration.” More than 5000 scientists had signed the document, published in the 6 July issue of Nature, which declares that HIV causes AIDS. The declaration apparently offended Mbeki, whose spokesperson earlier in the week said he would put it in the “dustbin” if it were sent to the president.

    Many speculated that the Mbeki administration had threatened South African scientists who signed the document with the loss of their government funding if they spoke at the press conference. But Hoosen Coovadia, chair of the meeting, insisted that “they didn't put any pressure on us to cancel this.” Chris Hani's Gray said she and others nixed the press conference simply to avoid offending Mbeki on the eve of his much-anticipated speech. They hoped their conciliatory gesture would encourage him to end a sad chapter in a sad saga about a country that seems to have swapped the anguish of apartheid for the anguish of HIV and AIDS.

  2. ECOLOGY

    California Algae May Be Feared European Species

    1. Jocelyn Kaiser

    A volleyball-court-sized patch of bright green algae in a San Diego lagoon has set off alarm bells among ecologists and officials. Scientists strongly suspect that the algae, Caulerpa taxifolia, is the same fast-growing, non-native clone that has swept over the northwestern Mediterranean sea floor in the past decade with devastating ecological consequences. A consortium of agencies and private groups has cordoned off the lagoon and is laying plans to poison the seaweed, marking the first major U.S. attempt to stop an incipient marine species invasion.

    Scientists say such actions are needed to preserve biodiversity in the face of voracious non-native plants and animals (Science, 17 September 1999, p. 1834). “It's a rare chance to stop an invasion once it's started,” says marine biologist Andrew Cohen of the San Francisco Estuary Institute. He and others also hope U.S. officials will avoid the mistakes made in Europe, where governments initially ignored warnings about C. taxifolia. “This is almost a test case of the new resolve to deal with this problem” of invasive species, says ecologist Daniel Simberloff of the University of Tennessee, Knoxville.

    C. taxifolia is native to various tropical seas. But in 1980 an aquarium in Stuttgart, Germany, began sharing with other aquaria a showy clone that grew fast in cold water. The organism's potential for triggering an ecological disaster didn't become apparent, however, until 1989, when French scientist Alexandre Meinesz noticed a flourishing C. taxifolia patch in the waters off the Monaco aquarium. Meinesz's 1999 book, Killer Algae (Science, 10 March, p. 1762), describes the bureaucratic fumbling and the seaweed's relentless spread as French officials dithered 2 years before reacting. The alga now carpets 4600-and-counting hectares of sea floor, wiping out native grasses from Spain to Croatia.

    The story was familiar to Rachel Woodfield, a marine biologist with the consulting firm of Merkel & Associates in San Diego, who in mid-June spotted some unfamiliar seaweed growing in a lagoon. The 10-meter-by-20-meter patch and smaller, scattered patches had apparently edged out the eelgrass within a few years. Woodfield consulted with algae experts, including Meinesz, who identified it as either the Mediterranean clone or one just as invasive. And that set off alarm bells. If the alga, now 30 kilometers north of San Diego, gets loose throughout California, says Bob Hoffman of the National Marine Fisheries Service Southwest Region, “the whole rocky reef plant and animal assemblage off our coast would be dramatically transformed.”

    As Science went to press, experts were awaiting results of genetic tests to confirm the invader's identity. But with evidence pointing toward the Mediterranean clone, 10 agencies and groups are now scrambling to wipe out the algae in an effort that will likely cost at least $500,000. As a first step, they've quarantined the lagoon, owned by a power plant and used for boating, to prevent tiny fragments of C. taxifolia from being spread by boat anchors. Within a week or two, they plan to cover the seaweed patches with tarps soaked with an herbicide, most likely chlorine or copper sulfate. The next step is long-term monitoring, including pamphlets to alert boaters and divers to look out for other colonies.

    In tackling C. taxifolia, the San Diego group is wading into uncharted waters. Although many weedy plants and non-native animals have been extirpated from lakes and land, only two marine invaders—a zebra mussel-like species in Australia and an abalone parasite in California—have reportedly been eradicated. U.S. experts have long feared that they might one day need to battle C. taxifolia. In 1998, Cohen spearheaded a lobbying effort that succeeded last year in adding the clone to the U.S. Noxious Weed list, which bans its sale and transport.

    A 1999 presidential order calling on federal agencies to thwart invasives may provide additional weapons for battling C. taxifolia and other troublemakers. Simberloff says he's “very impressed” with a draft interagency plan that has just been developed. But Cohen is reserving judgment, noting that federal rules to crack down on species spread by ship ballast water (see p. 241) still lack teeth. Even so, a victory over San Diego's patch of C. taxifolia will lift his spirits. “I do think there's a good chance of eradication,” Cohen says.

  3. SCIENTIFIC PUBLISHING

    Publish and Perish in the Internet World

    1. Eliot Marshall

    NEW YORK CITY—When 120 leaders in publishing and biomedicine met here last week to talk about the Internet's effect on scholarly journals, it didn't take long for disagreements to surface. Participants clashed over two very different visions of the future—one predicting that private firms will continue to produce the most reliable and readable journals, the other that scientists will soon abandon traditional journals and share results directly with other researchers on the Internet.

    The seeds of this debate were sown 16 months ago, when Harold Varmus, who was then director of the National Institutes of Health (NIH), and Stanford University geneticist Patrick Brown floated a radical plan for an NIH-backed preprint journal and biomedical archive (Science, 12 March 1999, p. 1610). Since then, the scope of NIH's electronic publishing venture—now called PubMed Central—has been scaled back, and the public archive has been slow getting started.

    David Lipman, director of NIH's National Center for Biotechnology Information, which is running PubMed Central, reported at the meeting that his staff is making steady progress putting articles online from the 20 journals that have so far agreed to provide published papers for the archive. But his team has “bumped into technical problems,” he said, particularly with one of the most prominent contributors, the Proceedings of the National Academy of Sciences (PNAS). So far, according to a PNAS staffer, two 1999 issues have been posted. Meanwhile, plans to publish original, nonreviewed research at PubMed Central are being put aside for now.

    Those difficulties have not discouraged Vitek Tracz, head of the London-based publishing company Current Science Group. In May, Tracz—whose company sponsored the New York meeting—started his own Internet publication called BioMed Central, which will be free of charge to authors and readers. (Its first papers are still in review.) “Our mantra is that we will never charge for primary research reports,” Tracz says.

    Although Tracz says he has no definite business plan for BioMed Central, he aims to use it to establish credibility with scientists and, through this process, to develop other publications and news services that will make a profit. Already, BioMed Central has recruited an impressive board, including Varmus, now president of the Memorial Sloan-Kettering Cancer Center in New York City; Steven Hyman, director of the National Institute of Mental Health in Bethesda, Maryland; Philippe Kourilsky, director of the Pasteur Institute in Paris; and Mitsuhiro Yanagida, a molecular biologist at Kyoto University in Japan.

    At the New York meeting, the contrarian role fell to Pieter Bolman, president of Academic Press of San Diego, California. In a brief talk, he dismissed the free publication schemes as utopian, joking that they looked like the work of “academics on the loose” or “a communist plot.” To put all biomedical research data into a single open archive is “asking for trouble,” Bolman said, because it asks “existing publishers to give up their files” and “commit economic suicide.” The journals won't do it, Bolman predicted, unless forced by the government.

    The PubMed Central experiment, Bolman argued, is plagued by “a mainframe mentality”—meaning centralized management. Bolman touted an alternative, a publisher-initiated venture called CrossRef, launched in June. Later this year, it will house an electronic index with links to 3 million articles in 4000 journals. But unlike users of PubMed Central, users of CrossRef will have to pay a fee in most cases to get the full text. PubMed Central “has served its purpose,” Bolman asserted: “I invite you to join CrossRef and get it all over with.”

    Infuriated, Brown rose to give the final talk and fired a broadside at “parasites” who get the work of scientists for free, take forever to publish it, and charge readers a high price for a product they often make worse by editing. Instead of joining CrossRef, Brown urged scientists to lend their support to free alternatives like PubMed Central. Brown's comment to the publishers: “We'll call you if we need you, but don't sit by the phone.” But right now, Brown himself is calling the publishers, because he wants them to donate their back issues to PubMed Central—“so that people can see the value” of having a free electronic archive.

    Although participants in the meeting diverged sharply on how the Internet will affect publishing, all seemed to agree with Varmus's comment that the experiment in electronic publishing has begun, and that “we are tacking to a distant port” with winds that sometimes favor and sometimes hinder progress.

  4. ARTHRITIS

    A Gene for Smooth-Running Joints

    1. Michael Hagmann

    At first glance, tartar control toothpaste and water softeners seem to have little in common with the crippling joint erosion that haunts tens of millions of arthritis sufferers worldwide. But a new study on page 265 of this issue suggests that a genetic defect in mice causes the joint's cartilage cells to pump insufficient amounts of pyrophosphate—a natural water softener—into the joint cleft, and this in turn leads to the formation of bony spurs that eventually stiffen the joints completely. Because humans have an almost identical gene, and disorders such as osteoarthritis also feature an abnormal outgrowth of bones, some arthritis researchers are hopeful that these new findings may point the way toward a new class of pyrophosphate-based drugs similar to the antiscaling chemicals in washing powders and toothpaste. But, as many of the researchers point out, the numerous roads that lead to human joint degradation make a single cure-all unlikely.

    Arthritis and other rheumatic afflictions dwarf cancer and heart disease in terms of the disability they cause. The World Health Organization estimates that arthritis-related diseases—of which there are more than 100 different forms—afflict half the world's population over 65. Although sports injuries, age, and obesity are among the most common risk factors, about half of all arthritis cases also have a strong hereditary component.

    To pinpoint genes that contribute to a specific disease, researchers often turn to animal models that mimic the ailment. Developmental geneticists David Kingsley, Andrew Ho, and Michelle Johnson of the Stanford University School of Medicine have been trying to unravel the genetic mutation at work in a strain of mice called ank, which has progressive ankylosis, or fusion of the bones. The disease starts by stiffening the digits and paws, then spreads to virtually every joint in the body, including the spine. By about 6 months of age the animals are completely immobilized and eventually die. Despite its unparalleled severity, the mouse disease and various forms of human arthritis share several hallmarks, including deposition of calcium phosphate crystals in the joints and degradation of cartilage, the smooth, gel-like cushions at the tips of the bones. “The ank mouse immediately attracted a lot of interest from arthritis specialists,” says Kingsley. But the genetic defect remained elusive.

    Other researchers had linked the ank mutation to mouse chromosome 15. To narrow the search further, Kingsley and his colleagues engaged in a brute-force breeding effort, crossing ank mice with another strain and then picking those with the ank mutation from more than 4000 offspring. They finally homed in on a 150,000-base pair stretch of DNA containing 11 candidate genes—“none of which had any obvious link to arthritis” at first, recalls Kingsley.

    When the team compared the sequences of the 11 genes between normal and mutant mice, letter by letter, they found a single typo in one of the genes that led to a protein about 10% shorter than the normal version. The gene is highly conserved among vertebrates—the human counterpart is about 98% identical—but strikingly absent in invertebrates, which lack skeletons and, hence, bones, joints, and arthritis. Further strengthening the case, in mouse embryos the ank gene is most active in developing cartilage.

    Kingsley's team had no idea what the normal gene does, but an intriguing clue came from Yusuke Nakamura and his colleagues at the University of Tokyo, who had recently identified the genetic defect behind a similar mouse disease—and determined that its protein product normally generates pyrophosphate on the outside of joint cells to keep the joints scale-free. When the Stanford team measured pyrophosphate levels in cultured cells derived from ank and normal mice, they found that the chemical accumulated in cells from the ank mice but decreased in the culture medium. Kingsley speculates that in its normal form, the ank protein may be “a pyrophosphate channel that allows pyrophosphate levels to remain high in cartilage throughout life” to prevent calcium phosphate crystal formation in the joint cleft. When that protein is defective, however, pyrophosphate is sequestered inside the cells and crystals can build up in the joint fluid, leading to inflammation and joint destruction.

    Rheumatologist Michael Doherty of the City Hospital in Nottingham, United Kingdom, notes that the ank mouse “most closely resembles familial chondrocalcinosis,” a genetic disease that leads to crystal deposition in numerous joints and shows a similarly imbalanced pyrophosphate distribution in the joints. In several afflicted families, moreover, the genetic defect has been mapped to the same chromosomal region that harbors the human ank gene. “It's a really hot candidate for [human] chondrocalcinosis,” says Matthew Brown, a skeletal geneticist at the University of Oxford.

    The role of pyrophosphate in osteoarthritis is unclear, however. Doherty points out that some osteoarthritis sufferers have too much instead of too little pyrophosphate in their knee fluid, suggesting a different disease mechanism. “I'd say the likelihood that this leads to some interventions [for osteoarthritis] in the near future is pretty low,” he says. Nonetheless, “David's study is fascinating, because it sheds light on the molecular mechanism of what's happening in the ank mouse.”

  5. U.K. FUNDING

    New Program Supports Facilities, Stipends

    1. Richard Stone

    CAMBRIDGE, U.K.—British scientists are celebrating a $1.7 billion windfall, announced last week by the U.K. government, to shore up deteriorating facilities and raise stipends for Ph.D. students. The 2-year spending boost is intended to keep the pool of British science well stocked, both by attracting more talented students into the field and stemming the flow of scientists out of the country. But the benefits are not spread evenly across the research spectrum.

    The extra money, which begins flowing in 2002, goes beyond the scientific community's hopes of simply extending the popular Joint Infrastructure Fund (JIF) past next year. Both funds are bankrolled by the government and The Wellcome Trust charity, but the new Science Research Investment Fund will spend at an annual rate almost double that of the 3-year, $1.2 billion JIF. The largesse amounts to a roughly 20% increase in the overall science budget, says Peter Cotgreave, director of the Save British Science Society. U.K. Chancellor Gordon Brown said that “the scale of this investment is unprecedented, ensuring world-class facilities for world-class science.”

    JIF has been doling out competitive grants of at least $1 million to universities for everything from purchasing pricey instruments to building world-class facilities. The latter include a tropical medicine research center at the University of Oxford and a human genetics institute at the University of Newcastle upon Tyne. Success rates have been running at about 20%, and the last call for grants is slated for October. Last week's announcement in essence extends JIF for 2 years (see table).

    Government officials said they hope the new fund will ensure that the country's most productive institutions don't lose their edge. A cost-sharing provision aimed at making the money go farther also favors well-endowed universities: it makes mandatory a practice, begun under JIF, of institutions contributing 25% of a project's overall cost. Major universities such as Oxford and Cambridge “will find it relatively easy to unlock the money,” Cotgreave predicts. On the other hand, he says, the country's dozens of former polytechnics are likely to flounder in the hunt for matching funds.

    To bolster the quantity and quality of future scientists, the government will also boost annual science and engineering Ph.D. stipends from $10,000 to $14,000 by 2004. That hike, which analysts estimate will cost $80 million, follows a January plea from the U.K. Life Sciences Committee, an umbrella organization for 15 British societies, to stem a perceived brain drain to better paying Ph.D. programs outside the United Kingdom. It should also reduce the average debt of a science major entering graduate school, now roughly $8000, says Peter Campbell, a biochemist at University College London.

    The move to beef up infrastructure and raise stipends “will go some way toward attracting and retaining good scientists in the U.K. science base,” says Sir Aaron Klug, president of the Royal Society. However, Klug and others admit that it won't address another source of brain drain—U.K. postdocs headed to the United States for positions not available at home.

  6. PARTICLE PHYSICS

    CERN Collider Glimpses Supersymmetry—Maybe

    1. Charles Seife

    It's a notion worthy of The X-Files: a shadowy world of doppelgangers, existing in eerie counterpoint to the one we know. Last week, particle physicists at the CERN laboratory in Switzerland announced that they may have caught the first glimpse of that world. By smashing together matter and antimatter in four experiments, they detected an unexpected effect in the sprays of particles that ensued. The anomaly is subtle, and physicists caution that it might still be a statistical fluke. If confirmed, however, it could mark the long-sought discovery of a whole zoo of new particles—and the end of a long-standing model of particle physics.

    Other scientists are intrigued by the findings. “Often with an anomalous result, after a few hours' work, you say, ‘This can't be right,’ but here this is not the case,” says Gordon Kane, a physicist at the University of Michigan, Ann Arbor. But they are also skeptical. “After having been bitten 15 times, I'm twice shy,” jokes CERN physicist John Ellis. “I think it's probably going to turn out to be some background fluctuation, unfortunately.”

    The finding threatens the slightly creaky Standard Model of particle physics, which provides a mathematical framework that binds together all of the fundamental particles (quarks, neutrinos, electrons, taus, muons, gluons, and so forth). And it supports a newer, fancier model known as supersymmetry. By linking the particles that make up matter (fermions) with those that carry forces (bosons), supersymmetry unifies all the quantum forces at very high energies. In the process, it also doubles the roster of particles. Each fermion, such as a quark, neutrino, electron, or tau, has a bosonic twin: an s-quark, s-neutrino, s-electron, or s-tau. Likewise, every boson has a fermionic twin: The photon has the photino, and each gluon has a gluino.

    The CERN scientists put the models to the test at the Large Electron-Positron Collider (LEP), a 27-kilometer magnetic ring near Geneva where physicists had long been smashing electrons and antielectrons together, creating showers of subatomic debris. They were particularly interested in showers containing pairs of tau particles. Like electrons, muons, and quarks, tau particles are thought to be fundamental particles—indivisible chunks of matter. The Standard Model allows several different chains of particle interactions, known as channels, by which a colliding electron and antielectron can produce a pair of tau particles. Supersymmetry allows not only all of those channels, but also others that involve the twin particles unknown in the Standard Model. Each theory also predicts how many tau particles ought to result from collisions at different energies—but the answers aren't always the same.

    Those differences were the test. At low energies, the number of tau particles LEP produced matched calculations based on the Standard Model. But in 1998, when engineers at CERN pushed the energies of the collisions above 189 billion electron volts, things began to change. “Over the last couple of years, there has been a slight excess,” says CERN physicist Gerardo Ganis. Instead of observing about 170 tau pairs of a certain type, as the Standard Model predicts, physicists have seen 228—a figure consistent with supersymmetry.

    Barring some unknown type of systematic error that affects each of the four experiments, each experiment has roughly a 5% probability of seeing the excess because of a chance statistical fluctuation, Ganis says. “But when put together, it's a fraction of a percent.” That's still too high for physicists to break open the champagne (to declare a bona fide detection, they would need to push the probability of error below 0.001%), but it is enough to raise eyebrows.
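
    That pooling arithmetic can be sketched in a few lines. The article does not say which statistical method the experiments use to combine their results; Fisher's method is one standard way to pool independent p-values, and under that assumption four results of roughly 5% each do indeed collapse to a fraction of a percent.

        # Illustrative sketch only, not LEP's actual statistical analysis:
        # pooling four independent p-values of ~5% with Fisher's method.
        import math

        def fisher_combined_p(p_values):
            """Fisher's method: X = -2 * sum(ln p_i) follows a chi-squared
            distribution with 2k degrees of freedom under the null hypothesis.
            For even degrees of freedom 2k, the survival function has the
            closed form exp(-x/2) * sum_{i=0}^{k-1} (x/2)**i / i!."""
            half = -sum(math.log(p) for p in p_values)   # this is x/2
            k = len(p_values)
            return math.exp(-half) * sum(half**i / math.factorial(i) for i in range(k))

        print(fisher_combined_p([0.05] * 4))  # ~0.0023, a fraction of a percent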

    If real, the tau-pair excess would signal the end of the Standard Model and the beginning of the supersymmetric era. However, the result may also be a fluke that will disappear with more data, as other supersymmetry sightings have done in the past. More data are due to be released on July 20, and the experiments will continue until September. That probably won't be enough time to resolve the issue, the physicists say.

    Ironically, if the death knell for the Standard Model comes, it probably won't toll at LEP: This fall, the device is slated to be dismantled to make way for the Large Hadron Collider experiment.

  7. MOUNT GRAHAM

    Report Finds Squirrels Survived 3 Telescopes

    1. Mark Muro*
    1. Mark Muro writes from Tucson.

    TUCSON, ARIZONA—For 15 years, Mount Graham has been a battleground for astronomers, who want to build a cluster of telescopes, and environmentalists, who say that such activity could wipe out an endangered subspecies of red squirrel. In 1988 Congress allowed construction of three telescopes on the mountain, a desert “sky island” northeast of here, prompting the U.S. Forest Service to order a long-range study to monitor the squirrel's population. The results are now in. But the findings—that the work to date has had “no significant effect” on the rare rodents—have done little to resolve a debate that is expected to heat up again next year when the University of Arizona (UA) seeks permission to build four more telescopes.

    “We tried hard to find something that would display a negative effect, but we couldn't,” says UA population ecologist Paul Young, who directed the 10-year, $2.5 million monitoring program. “What really determined what happened were variations in the [pine]cone crops the squirrels depend on in the fir, spruce, and mixed conifer forest, not these three telescopes.”

    The UA-funded study, which the forest service approved when it allowed construction on 3.4 hectares of squirrel habitat near the 3300-meter summit, began in 1989. Since then Young has led a five-member team that conducted monthly, and then quarterly, censuses of the squirrels' middens. They found that the population increased from a low of 33 squirrels in 1989 to 102 late last year, with a spike of 225 at mid-decade. The changes were based on food supply, and the pattern within the construction area paralleled that in a control site elsewhere on the mountain.

    Both findings are exactly what Chris Smith, an evolutionary ecologist at Kansas State University in Manhattan with no stake in the outcome, would have predicted. “They looked at the narrow question of the observatory's impact on the immediate population, did it thoroughly, and turned up no surprises,” he says. “Those squirrels get used to people easily.”

    Proponents of the telescopes say the results vindicate their position—both past and future. “This report confirms what we have been saying all along, that the telescopes would not affect the squirrels at all,” says Buddy Powell, associate director of the UA's Steward Observatory, a partner in the Mount Graham facilities. Powell also believes the study bolsters plans to build additional telescopes on an adjacent site. “It suggests the squirrels would not be harmed by four more,” Powell says.

    Environmentalists, however, disagree with Powell on both points. Robin Silver, conservation director of the Southwest Center for Biological Diversity in Phoenix and longtime opponent of the telescopes, says the data are tainted by Young's university financing and the lack of outside review. “They should have given this project to another university or an outside company,” Silver says.

    Peter Warshall, a San Francisco-based ecologist who produced the original environmental impact statement for the observatory project in 1986, disputes Powell's view that the monitoring justifies more telescopes. Instead, Warshall, who leads Scientists for the Preservation of Mount Graham, believes that the sharp fluctuations demonstrate the squirrel's long-term vulnerability to environmental assaults. “There are so many uncontrollable forces threatening these squirrels, from cone crop failures and forest fires to tree diseases and windfalls,” Warshall says. “The question is whether there's room to add the controllable impact of destroying more forest to build more telescopes—and this study doesn't answer that.” To underline their concern, on 30 June Warshall's and Silver's groups filed a federal lawsuit to block construction of a buried 37-kilometer power line to the telescopes.

    Young agrees that his results don't represent a green light for astronomers. “There hasn't been any significant impact from the first three telescopes, but that doesn't mean we should build more,” he says. “I definitely think the red squirrel is in a precarious place. … This is an island species, and in general island species go extinct.”

    The next major battle over the mountain will be a formal proposal to the forest service to build other instruments, including possibly a wide-field camera and a 6.5-meter telescope. But that won't be submitted for at least a year, until work is completed on the third and largest current project, the $83.5 million Large Binocular Telescope. In the meantime, Young—who is preparing to hand off responsibility for the monitoring project and take on another position—doesn't expect to have the last word. “Something tells me both sides will continue to have a field day with the data,” he says.

  8. EUROPEAN SCIENCE

    Pathogens Lab Chief Stripped of Duties

    1. Michael Balter

    PARIS—Europe's most advanced high-security pathogen lab has claimed its first human casualty—and it hasn't even opened for business. On 28 June, the Marcel Mérieux Foundation, which funded the construction of the $8 million facility in Lyons, banned lab director Susan Fisher-Hoch from the premises and launched legal proceedings to dismiss her. Fisher-Hoch's most egregious offense, it appears, was speaking with the press.

    The turmoil at the biosafety level 4 (BSL-4) facility comes in the wake of a takeover of the lab's scientific direction by the Pasteur Institute in Paris. The Mérieux Foundation teamed up with Pasteur after failing to convince funding bodies to finance the lab's estimated $1.4 million annual budget. In exchange for footing a still-undecided portion of the lab's bills, Pasteur insisted that one of its own scientists become director (Science, 30 June, p. 2298). That presented a problem, however, as Fisher-Hoch has a contract naming her director until February 2002. At the foundation's request, she says, she prepared a proposal for a new contract. Fisher-Hoch agreed to give up the directorship if she could run some of the lab's international relations and do research into a Lassa fever vaccine.

    Fisher-Hoch claims she received “no reply at all” to the proposal before being “presented simply with an ultimatum to get out of the lab.” Foundation Secretary-General Claude Lardy counters that she personally told Fisher-Hoch that the proposal was “completely unacceptable” because it gave her too much independence and authority. Complicating matters, Fisher-Hoch has been in hot water over an incident earlier this year in which she allegedly stored potentially virus-infected blood samples in the lab before it was certified to hold them. Fisher-Hoch denies the allegation, saying that the samples were drawn from healthy doctors and nurses during a workshop in Liberia. Nevertheless, the primary grounds for Fisher-Hoch's dismissal, cited in a 28 June letter to her from the foundation, are that she spoke with journalists about the foundation's decision to replace her as director.

    Fisher-Hoch has hired an attorney to help fight her dismissal. In the meantime, no one is about to suit up for the pathogen lab: Local officials have delayed the facility's opening, planned for this month, until whoever takes over presents bona fide credentials for running a BSL-4 facility.

  9. EVOLUTIONARY BIOLOGY

    Chewed Leaves Reveal Ancient Relationship

    1. Elizabeth Pennisi

    God, the great British geneticist J. B. S. Haldane once remarked, must have “an inordinate fondness for beetles.” And certain beetles have an inordinate and, it turns out, historic fondness for ginger plants. Paleontologists have discovered how ancient this culinary preference really is by studying fossils of damaged leaves. The data help push back the time when a group of beetles called leaf beetles evolved their great diversity and demonstrate just how faithful some species can be to their favorite foods. The results are also convincing paleobotanists that they can sometimes glean more about a plant's ancient past from a chewed-up leaf fossil than from a pristine one.

    On page 291, paleobotanist Peter Wilf of the University of Michigan, Ann Arbor, Conrad Labandeira, a paleobiologist at the Smithsonian Institution's National Museum of Natural History in Washington, D.C., and their colleagues describe a new beetle fossil based not on traces of the insect skeleton—in fact, the insect itself never even shows up in the fossil record—but on the distinctive gouges the beetles left when they munched on 11 ginger leaves many millions of years ago. The chew marks of the newly described Cephaloleichnites strongi prove that leaf beetles underwent rapid evolution and diversification more than 65 million years ago—far earlier than the oldest fossils of insect bodies suggest—possibly taking advantage of (and perhaps influencing) the rapid diversification among flowering plants occurring at the same time.

    What's more, C. strongi represents the earliest known rolled-leaf beetle species, hundreds of which today still are picky eaters, preferring just one of the ginger- and heliconia-like plants in the Zingiberales order. For decades, ecology students have learned about this impressive array of beetle-plant pairings, in which different rolled-leaf species adopt the same lifestyle but on their own distinct host plant. This new work adds “a historical dimension to this emblem of tropical biology,” says Brian D. Farrell, an insect evolutionist at Harvard University. Adds Phyllis Coley, a tropical ecologist at the University of Utah, Salt Lake City: “The beetles and the gingers are an extremely old and conservative pairing, which in turn suggests that each could have had profound selective effects on the other.”

    As a young ecologist in the 1970s, Donald Strong—the fossil's namesake—could not help but notice the vast variety of rolled-leaf beetles, whose larvae take up residence inside the young, curled leaves of gingers, heliconias, and their relatives, plants that thrive in the understories of tropical forests of the Western Hemisphere. In particular, he was enchanted by what the beetles did to the leaf itself. Their damage becomes quite apparent as the leaf unfurls and serves as a lasting reminder of a beetle long gone. “It was an issue of artistry, how beautiful the damage was,” recalls Strong, now at the University of California, Davis.

    Over the next few decades, Strong documented the specialized associations among different beetles and particular plant species. Eventually, he learned to identify a beetle species from the leaf's chew marks, which varied according to the size and shape of the particular beetle's jaws.

    Wilf came across Strong's research in 1998, when he and Labandeira were studying a different sort of insect damage—tiny fossil pellets, mysterious specks of fossilized material found on 53-million-year-old fossil leaves he had collected from Wyoming. Until that time, Wilf hadn't really noticed the chew marks. But when he and Labandeira took a second look at the leaves, “we realized the damage [seen by Strong in the modern leaves] matched beautifully with what we had,” Labandeira recalls. Moreover, the fossil leaves looked very much like some modern gingers. Even after millions of years, says Wilf, “[the beetles] are eating the same thing, and they are doing it the same way.”

    Soon Labandeira found even older leaves bearing the telltale signs of the rolled-leaf beetle. While working with Kirk Johnson at the Denver Museum of Natural History, Labandeira noticed that some of Johnson's fossils, whose identity he didn't yet know, also had chew marks resembling C. strongi's. And they, too, turned out to be fossil gingers. Because these fossils came from a North Dakota deposit dating back to the Late Cretaceous, “we now know this insect is 20 million years older than if we just looked at body fossils,” Wilf points out.

    These findings lend support to a theory proposed by Farrell in 1998. Farrell suggested that most plant-eating beetles likely evolved in parallel to flowering plants and therefore were quite diverse during the dinosaur's heyday (Science, 24 July 1998, p. 555). But until now, there has been little supporting fossil evidence, as only one relevant beetle fossil exists from that time. Now researchers may be able to get around this lack of fossils by looking at insect damage instead, says Leo Hickey, a paleobotanist at Yale University: “The work shows the potential of an overlooked resource in [studying] the evolution of insects.” Inspired by this new work, Hickey expects that he and his botanical colleagues will be giving their plant fossils a second look for signs of insect activity. Coley agrees, noting that “it seems that the use of fossil damage patterns to infer ecological and evolutionary relationships is quite powerful.”

  10. ECOLOGY

    When Fire Ants Move In, Others Leave

    1. Elizabeth Pennisi

    For Amy Arnett, getting a Ph.D. in biology has also meant learning to be a road warrior. Beginning in May 1997, she and Christy Royer, an undergraduate assistant, covered some 2000 hot, dusty kilometers from northern Florida to upstate New York, collecting ants at 33 sites along the way. They had set out to look at how the food resources for ant lions, insects that prey on ants, changed from north to south along the East Coast. But in the process, their research uncovered new evidence about the long-range, and potentially long-term, ecological damage being wrought by an invasive species of fire ant.

    The red imported fire ant Solenopsis invicta displaces other ant species and upsets the structures of native communities of ants—disruptions that appear to be permanent, Arnett and her adviser, community ecologist Nick Gotelli of the University of Vermont, Burlington, report in the July issue of Ecology Letters. Other studies have examined how these ants perturb single communities and how other invasive species affect the communities they move into, says David Holway, an ecologist at the University of California, San Diego. But this “is the only study that looks at the impact of an invasive species at such a broad scale.”

    These fire ants entered the southeastern United States about 70 years ago, likely hitching a ride with produce from Argentina or Brazil, and have spread as far north as winter freezes will let them. They are infamous for their sting, which “you never forget,” says Gotelli. They can also make pastures uninhabitable for livestock and chew up telephone wires. But “what people [haven't understood] is the ecological havoc they are wreaking,” says Kenneth G. Ross, an entomologist at the University of Georgia, Athens.

    As Arnett and Royer drove along the East Coast between 25 May and 3 July, they would scan the horizon for a sampling site—an open field next to a forest—roughly every 30 to 50 miles. In both the field and the forest they would make a 5-meter-by-5-meter grid in which they buried 25 small plastic tubes, each placed so its lip was flush with the ground. Soapy water at the bottom of the tube prevented the escape of any creature that crawled in over the next 48 hours. In this way the roving researchers could get a quick snapshot of the ants active in both habitats.

    When they retrieved the tubes, Arnett and Royer could tell right away whether fire ants were present. If none were there, the tube was relatively empty, likely containing only about 20 ants of assorted species. But if the red ants were there, as many as 500 individuals would be crammed into each tube—a readily visible mass. Almost all would be the red imported fire ants, which live in denser populations than do native ants.
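
    The presence call described above amounts to a simple threshold test on trap counts. A hypothetical sketch follows; the function name, the 100-ant cutoff, and the sample counts are illustrative assumptions, not the researchers' actual analysis.

        # Hypothetical sketch of the detection logic described above: a tube
        # holding ~20 ants of assorted species suggests natives only, while a
        # tube crammed with hundreds suggests Solenopsis invicta. The 100-ant
        # threshold and the sample data are illustrative assumptions.
        def fire_ants_likely(ant_count, threshold=100):
            """Flag a trap whose sheer density points to the invasive species."""
            return ant_count >= threshold

        tube_counts = [18, 512, 23, 471]  # ants recovered per tube after 48 hours
        flagged = [n for n in tube_counts if fire_ants_likely(n)]
        print(f"{len(flagged)} of {len(tube_counts)} tubes suggest fire ants")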

    Overall, the effort netted more than 14,000 ants; with the help of Harvard ant specialist Stefan Cover, the researchers identified 81 species, including S. invicta. What was surprising was the distribution of those species. The researchers had expected to find more native species in the southern part of the country than in the north, as species diversity tends to increase closer to the tropics.

    But where fire ants were present, that gradient was disrupted. Arnett and Gotelli found that, as expected, the number of species rose with decreasing latitude—from just a few in New York to 15 in southern Virginia. But the number of native species dropped off at sites farther south, slipping back down to four in Florida. “These changes correlate very strongly with the presence and absence of fire ants,” points out Lloyd Morrison, an entomologist at the U.S. Department of Agriculture's Agricultural Research Service Center in Gainesville, Florida. (North Carolina marks the northern limit of the red imported fire ant's range.)

    Gotelli and Arnett can't tell from their survey whether the missing species are locally extinct or just very rare. But the drop in biodiversity could represent a significant loss for these areas, notes Ross, because of the critical role ants play in recycling nutrients and other biological material. Although the red imported fire ants are voracious feeders, they may not redistribute nutrients in the same way that a variety of other ants—each with its own particular habits—would, Gotelli explains. What's more, the steadily declining number of ant species found below the northern limit of the red fire ant suggests that habitats don't recover their biodiversity with time.

    Not only does the red ant reduce the overall number of ant species at a given locale, but it also alters the community structure. When Gotelli and Arnett analyzed their data, they found that certain native species tend to coexist with certain others, likely dividing up the resources to make efficient use of what's available. Some might eat seeds; others might concentrate on leaves, for example. But where fire ants are present, those associations break down—a change that could affect the efficiency of the food webs at those sites, Gotelli suggests.

    All in all, says Ross, the study “shows very nicely the large-scale ecological effects [red ants are having] on other ants”—and it was a trip well worth taking.

  11. MOLECULAR BIOLOGY

    Creation's Seventh Day

    1. Robert F. Service

    What would life look like if DNA contained more than four nucleotide bases and proteins more than 20 amino acids? Peter Schultz aims to find out

    In the casino or in the lab, Peter Schultz loves to take risks. “If I gamble, I usually gamble at high-stakes, high-payoff games,” Schultz says. “Science is interesting when it's played at the same level, for the highest stakes with very high risk.” For Schultz, a chemist at the Scripps Research Institute and the director of the newly created Genomics Institute of the Novartis Research Foundation (GNF), both in La Jolla, California, that betting system has paid handsomely.

    While at his previous home at the University of California (UC), Berkeley, Schultz helped pioneer a fleet of high-speed chemistry techniques to generate molecules by the millions and select the ones that work best as possible catalysts, drug molecules, and even high-temperature superconductors. During the 1990s, he parlayed that experience into a string of start-up companies. Last year he took another big gamble by giving up the comfort of his Berkeley career and financial backing by the Howard Hughes Medical Institute to launch GNF, an outfit Schultz pitches as the “Bell Labs of Biology,” which aims to work out the function of the thousands of unknown genes being turned out by the world's genome projects (see sidebar).

    Still, his boldest undertaking—and the one that may ultimately have the highest impact—may lie in academic research: With colleagues at Scripps, Schultz is aiming to rewrite the basic chemistry of life. By reengineering DNA, RNA, and the proteins that interact with them, they hope to create synthetic organisms with a chemical makeup fundamentally different from all life that has existed on Earth for the last 3.8 billion years.

    If they succeed, their biochemical reengineering could have a profound effect on everything from basic molecular biology to industrial chemistry. The result—they hope—will be proteins that incorporate amino acids other than the 20 commonly used by life to construct proteins. By adding these amino acids with completely new types of chemical behaviors, Schultz and his colleagues hope to design bacteria to make proteins that work as novel catalysts and drugs, or that carry built-in tracers to help researchers decipher their structures. “The thrust of the work is to expand in a radical way genetic diversity,” says geneticist Steven Briggs, who runs the Novartis Agricultural Discovery Institute, a sister organization to GNF. “If Pete can get this to work—and I'm sure he will—it will give us a much bigger toolbox to create medicines and other things to benefit society.”

    It could also open a new window on evolution, allowing researchers to explore alternative paths life on Earth may have taken in its infancy and the shape it might take elsewhere in the galaxy. Synthetic life, says Steven Benner, another pioneer in the field from the University of Florida, Gainesville, allows researchers to explore for the first time whether an alternative chemistry of life is truly viable. As Schultz puts it, “If God had worked a seventh day, what would life look like today?”

    As scientists, government regulators, and environmentalists squabble over genetically engineering natural DNA and proteins, adding synthetic components to the mix may be asking for trouble. If synthetic organisms do come to pass, researchers will undoubtedly encounter fears that biotechnology's latest twist could lead to new types of superpathogens that will wreak havoc on other forms of life. “I used to joke with members of the [Schultz] group that we would know the project was complete when we saw people protesting outside the window,” says David Liu, a former Schultz grad student, who has since gone on to set up his own research group at Harvard.

    At present, such protests remain hypothetical. The biggest obstacle Schultz and his colleagues now face is that creating synthetic life, as Stanford University chemist Eric Kool says, “is a very, very hard problem.” Getting all the pieces to work “means reengineering 3.5 billion years of evolution,” notes Kevan Shokat, a chemist at the University of California, San Francisco, and a former group member in the Schultz lab at Berkeley. If the attempt to make synthetic life forms has one flaw, Benner adds, “it's that it's so ambitious.”

    That suits Schultz just fine. In fact, the sheer scale of the task may give him an edge over the sparse competition. Coaxing bacteria to work with new nucleotides and amino acids requires expertise in molecular biology and genetics, along with physical, synthetic, and combinatorial chemistry. Not many research labs bring together all those specialties. But with roughly 40 members spanning an ever-changing array of disciplines, Schultz's lab does. “I think if anyone can do it, Pete Schultz's lab is the place where it can get done,” says Christopher Switzer, a chemist at UC Riverside who has also worked to add new nucleotide bases to DNA.

    What makes Schultz's goal conceivable is the basic simplicity of life itself. For all their diversity of form, all living organisms make use of the same fundamental chemical machinery: DNA and RNA to store genetic information that encodes proteins, which carry out vital cellular chemical reactions. DNA and RNA are each made up of four nucleotide bases. All proteins draw on the same 20 amino acids (a 21st, selenocysteine, crops up in exceptional cases).

    Nature and synthetic chemists, however, are capable of making hundreds of amino acids that play no part in the makeup of living creatures. In the mid-1980s, Schultz began to wonder whether it was possible to incorporate nonnatural amino acids into the chemistry of life. Protein chemists had developed machines capable of synthesizing short proteins out of both natural and nonnatural amino acids. But coaxing cells to do the same thing would be vastly more difficult. Billions of years of evolution had honed their machinery to convert DNA into proteins by a strict series of steps. The machinery first turns DNA into messenger RNA (mRNA), which leaves the nucleus and travels to ribosomal protein factories in the cytoplasm. In the ribosomes, the mRNA forms a template onto which short molecules of transfer RNA (tRNA) can ferry in amino acids and link them into the sequence of a protein. At each step, protein-based machines transcribe one code to the next: RNA polymerase converts DNA to mRNA; aminoacyl-tRNA synthetases link amino acids onto tRNA molecules; and after those tRNAs link up with their mRNA counterparts, the ribosomes assemble amino acid cargo on the tRNAs into proteins.

    To create proteins containing nonnatural amino acids meant tweaking those protein-based machines to get them to work with amino acids that billions of years of evolution had trained them to avoid. Rather than reengineer the entire protein synthesis apparatus at once from DNA onward, Schultz's team opted to climb one mountain at a time. For starters, 11 years ago they came up with a test tube-based method to trick the protein assembly apparatus of the bacterium Escherichia coli into accepting nonnatural amino acids.

    To do so, they needed to hijack a normal DNA signal and persuade the bacterium to read it as a command to insert a nonnatural amino acid. The signals in DNA come in the form of triplets of nucleotide letters in genes. DNA's four nucleotide letters—A, C, G, and T—can occur in 64 different combinations of three: ATC, ATA, and so on. Because these 64 triplets need only code for the insertion of 20 amino acids, different combinations sometimes code for the same thing. Both TTA and TTG, for example, code for the amino acid leucine. Similarly, three different DNA trios, or codons, serve as the “stop” signs that signal the ribosome to stop adding amino acids to a protein. When the cellular machinery transcribes DNA into mRNA, the letters change, but the stop signals remain in place. Schultz's team modified a certain type of tRNA to recognize one of those mRNA stop signs and insert a nonnatural amino acid into a growing protein when it did.
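
    The codon arithmetic behind that trick is easy to make concrete. Here is a minimal sketch using a deliberately tiny, toy codon table; the “Xaa” label and the table entries are illustrative assumptions, not the Schultz group's actual constructs.

        # Toy model of stop-codon reassignment, in the spirit described above.
        # The codon table is deliberately tiny and illustrative.
        from itertools import product

        BASES = "ACGT"
        ALL_CODONS = ["".join(c) for c in product(BASES, repeat=3)]
        assert len(ALL_CODONS) == 64       # four letters, combinations of three

        STOP_CODONS = {"TAA", "TAG", "TGA"}          # the three natural stop signs
        CODON_TABLE = {"TTA": "Leu", "TTG": "Leu"}   # redundancy: two codons, one amino acid

        # The reengineered reading: TAG no longer halts translation but inserts
        # a hypothetical nonnatural amino acid, labeled "Xaa" here.
        SUPPRESSED = dict(CODON_TABLE, TAG="Xaa")

        def translate(dna, table):
            """Read a DNA string three letters at a time until a stop codon."""
            protein = []
            for i in range(0, len(dna) - 2, 3):
                codon = dna[i:i + 3]
                if codon in STOP_CODONS and codon not in table:
                    break                  # unmodified machinery stops here
                protein.append(table.get(codon, "?"))
            return protein

        print(translate("TTATAGTTG", CODON_TABLE))  # ['Leu']: stops at TAG
        print(translate("TTATAGTTG", SUPPRESSED))   # ['Leu', 'Xaa', 'Leu']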

    Using that system, the Schultz team has added more than 80 different nonnatural amino acids to proteins. But the method has big drawbacks. One is that it uses synthetic chemistry to attach the nonnatural amino acids to the tRNA molecules that recognize the stop codons—an expensive, time-consuming procedure. Once the complexes are synthesized, the researchers simply add them to a mix of cellular components in a test tube and hope that some of the cellular machinery can incorporate them into proteins. But this hit-or-miss approach is inefficient, and very little of the protein with nonnatural amino acids winds up being made. Says Liu: “Translating a protein in vitro is not a high yielding process.”

    It would be far more efficient if all that work were done inside a living cell. “What we really want to do is build an organism—a living organism—where you can add a 21st amino acid to the growth medium and it takes up that amino acid and puts it selectively into a protein,” says Schultz. Schultz and his collaborators are working on two separate tracks to the problem, at least one of which could hit the jackpot sometime in the next year, Schultz believes.

    The group's main effort in creating a synthetic organism builds on the earlier success with stop codons in E. coli. Instead of linking the amino acids to the tRNAs themselves, the researchers are trying to adapt the cells' natural machinery to do that job. That machinery in this case is a set of proteins known as aminoacyl tRNA synthetases (aaRSs). An aaRS is a two-part molecule. One end recognizes a particular triplet sequence in tRNA, and the other end binds to the appropriate amino acid. Aminoacyl tRNA synthetases serve as the go-betweens that connect the genetic information in DNA and RNA to the chemistry of proteins.

    To coax aaRSs into handling amino acids different from the ones they've evolved to work with, Schultz and colleagues systematically change the chemical structure of the enzymes and then test whether they will grab hold of nonnatural amino acids and insert them when they see the mRNA signal for their preselected stop codon. At a combinatorial chemistry meeting last April in Tucson, Arizona, Schultz reported that his team has achieved some success in this effort. The machinery is inefficient: So far it performs its task only about 1% of the time it is signaled to do so. Still, Schultz says, “we've got our foot in the door. I think it's no longer a question of will it work, but how long it will take.”

    Paul Schimmel, an expert on tRNA synthetases at Scripps who does not work with the Schultz group on their project, is more skeptical. Aminoacyl tRNA synthetases, Schimmel points out, also play an important role in editing out mistakes in the sequence of amino acids. So even if nonnatural amino acids initially get attached to a tRNA, they may get plucked off during the editing process. “It's hard to imagine they won't have problems” with this, he says.

    Meanwhile, a more radical approach to reengineering life's chemistry is progressing rapidly as well. For this work, Schultz has teamed up with the lab of chemist Floyd Romesberg of Scripps, who worked as a postdoc for Schultz at Berkeley before moving to Scripps to start his own lab in mid-1998. Instead of using DNA's stop signals as a message to insert nonnatural amino acids, Romesberg's team writes all-new messages by expanding the number of letters in DNA.

    The advantage of this strategy, Romesberg says, is that it gets around the limitations of the use of stop codons. A stop codon can code for only one nonnatural amino acid at a time, and because stop signs are scattered throughout the genome, nonnatural amino acids could wind up being inserted where they are not wanted. By adding new letters to DNA, researchers could write a whole new set of codons, coding for novel amino acids wherever they want them in the genome. (Schultz's group, meanwhile, is trying to achieve the same goals by creating codons for nonnatural amino acids that are four bases long instead of three.)
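    The appeal of both routes is plain arithmetic: each extra letter or extra codon position multiplies the number of available codons. A quick tally, where the alphabet sizes are the only inputs:

        # Standard genetic code: 4 letters read in groups of 3.
        standard = 4 ** 3        # 64 codons

        # One new DNA letter (a fifth base) expands the triplet table.
        five_letters = 5 ** 3    # 125 codons

        # Schultz's alternative: keep 4 letters but read codons 4 bases long.
        quadruplets = 4 ** 4     # 256 codons

        print(standard, five_letters, quadruplets)   # 64 125 256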

    The price of that flexibility is complexity. In addition to coming up with the novel DNA bases, the researchers must reengineer the proteins that copy DNA and transcribe it to RNA—enzymes known as DNA and RNA polymerases—to work with these new bases. Again, part of the task is well in hand. At the American Chemical Society meeting in San Francisco in April, Romesberg and Yiqin Wu, a postdoc supported by both the Romesberg and Schultz labs, reported that they had come up with a new DNA base that they can add to the DNA chain. This base, called 7-propynyl isocarbostyril, or “PICS,” pairs with itself, forming a new rung in the DNA ladder alongside pairs of A and T as well as G and C. The Romesberg-Schultz team is not the first to have inserted new letters into DNA. But previous nonnatural bases had a way of prompting the natural bases in DNA to pair incorrectly. The double PICS base pairs, Romesberg says, don't prompt such mispairings. Romesberg's lab also reported that researchers there have isolated a DNA polymerase that can copy a single strand of DNA containing the novel code, forming the standard double-stranded DNA—a key step toward creating cells that can pass down the novel changes in their genetic code.
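    What the reengineered polymerase must do can be pictured as one extra row in the base-pairing table: unlike A, T, G, and C, the synthetic base is its own partner. A toy model, with "P" standing in for PICS (the symbol and the error-free copying are simplifying assumptions):

        # Watson-Crick pairing, extended with a self-pairing synthetic base.
        COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G", "P": "P"}

        def copy_strand(template: str) -> str:
            """Return the complementary strand, ignoring strand orientation."""
            return "".join(COMPLEMENT[base] for base in template)

        print(copy_strand("GATPCA"))   # -> CTAPGT

    Because PICS pairs only with itself, the new rung cannot tempt the four natural bases into mismatches, which is what doomed earlier nonnatural letters.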

    Next the researchers will attempt to carry out the same sleight of hand with RNA polymerase, which converts the DNA strand into RNA. Because RNA polymerases are closely related to DNA polymerases, Romesberg is confident that this challenge will fall quickly. Finally, the researchers will have to find tRNAs and aaRSs that recognize the novel bases and insert the nonnatural amino acids. But here again, because Schultz's lab has already paved that road, the team members are confident that they can get all the pieces to work together.

    If synthetic life does indeed materialize one day soon in a petri dish in the hills north of San Diego, Schultz expects it will quickly attract interest from scientists conducting basic research into the behavior of proteins. Researchers could add fluorescent amino acid tags to proteins to signal their location in cells, thereby providing clues to their function. They could also insert amino acids bearing heavy atoms that can be used to help protein crystallographers work out the three-dimensional structure of proteins.

    Beyond these tools for understanding proteins, Schultz believes researchers will be eager to outfit proteins with new functional groups, such as a versatile one known as a ketone, that will serve as hooks for organic chemists to add new chemical functions to proteins. Such souped-up proteins could serve as better drugs and improved enzymes for industry. “This is a gold mine for chemists,” Schultz says. “There are so many things you can think about doing.”

    If nothing else, synthetic life should provide new clues about what life might look like beyond Earth. “If you see life on Mars, how are you going to recognize it” if it has a different chemical structure? Benner asks. By creating living organisms with synthetic DNA and proteins, scientists would know for the first time that life has no fundamental requirement for using A's, G's, C's, and T's. That realization might set the stage for understanding still deeper patterns common to life everywhere.

    Synthetic life could also provide novel insights into life's distant past on this planet, says Schimmel. For example, he says, today's organisms have evolved tRNAs that recognize each of DNA's 64 codons. Early in life's history, however, individual tRNAs probably worked with more than one codon each. By reengineering tRNAs to carry out new functions, Schimmel says, researchers may be able to explore how early organisms could have thrived with such an ambiguous system, and how organisms came to make full use of the suite of possible codons. “It's a simulated prebiotic experiment to learn what choices were made when life evolved on Earth,” says UC Riverside's Switzer. “I think it's very important because you can start answering some interesting questions about evolution.”

    Such experiments are likely to make many people a little queasy and raise prickly questions about safety and ethics. And it's in this arena that synthetic life could face its biggest threat. Most concerns will undoubtedly focus on whether such organisms might somehow escape to become nightmarish superbacteria. Schultz maintains that because any synthetic organisms would depend on nonnatural amino acids to survive, “there's no possibility these organisms could thrive outside the research lab,” an assessment that none of the other researchers interviewed for this story disputed. In that respect, he and others add, synthetic organisms would be far more tame than conventional genetically modified organisms designed to flourish outside the lab.

    Ethical concerns may prove tougher to grapple with. “Genetics scares people,” says Arthur Caplan, the director of the Center for Bioethics at the University of Pennsylvania. In a recent Policy Forum in Science (10 December 1999, p. 2087), Caplan and several colleagues considered a string of objections to experiments in creating novel life forms. They concluded that such research is not inherently unethical or antireligious. Though they were discussing organisms with standard DNA and amino acids, Caplan says he thinks their conclusions would apply to synthetic life as well. “At the end of the day, I don't see any fundamental amorality to making synthetic DNA to regulate a synthetic life-form.”

    All the same, Romesberg is bracing himself for controversy ahead. “There are going to be people who don't like this,” he says. “New ideas are often scary until you demonstrate something good that comes of it.” But if Schultz's high-stakes reengineering project works out, demonstrating a useful payoff will be the easy part.

  12. MOLECULAR BIOLOGY

    Tackling Biology With No Holds Barred, at 800 Miles Per Hour

    1. Robert F. Service

    Engineering synthetic bacteria that harbor novel types of DNA, RNA, and proteins isn't a straightforward scientific challenge. It's more the scientific version of a Russian doll, with challenges nested inside problems. While most researchers are comfortable isolating and breaking down one problem at a time, synthetic life is just the sort of puzzle that Peter Schultz thrives on. “I'm not really interested in doing experiments that if somebody picks up the journal and they read it, they forget about it,” he says. “I'd rather work on something very hard that may have a high chance of failure. But if it's successful, it has a high impact.”

    So far, that strategy has served Schultz well. He launched his academic career at the University of California, Berkeley, in the mid-1980s and immediately turned his group toward exploring what made living organisms such powerful synthetic chemists. After studying how the immune system manages to generate antibodies against a limitless variety of unknown targets, Schultz decided that the key to nature's success was its strategy of generating millions of possible chemical solutions to a problem and then screening for the ones that worked best. “We looked at this and said, ‘Can we take this strategy and apply it not to what nature did, which is molecular receptors, but to something else?’ And the first thing we thought of doing was to make catalysts,” Schultz says.

    Schultz and his colleagues set out to evolve antibodies in mice toward new targets, in this case to become catalysts for particular chemical reactions such as a widely used synthetic step called the Diels-Alder reaction. By following nature's lead, they were able to evolve antibodies with unique catalytic abilities and launch a new field of antibody catalysis. As catalysts, antibodies have never managed to outdo enzymes, their protein counterparts that have evolved for the task. Nevertheless, the idea of chemists imitating biology's use of diversity to find new functional molecules was a wake-up call to the chemistry community, and it has become a hallmark of Schultz's scientific approach.

    This build-first-test-later philosophy was “pretty heretical” among synthetic chemists, who prided themselves on making just the molecules they designed, says Doug Livingston, a chemist at the Genomics Institute of the Novartis Research Foundation (GNF). The fact that the catalytic antibody experiment worked at all was enough to convince Schultz and drug-delivery entrepreneur Alejandro Zaffaroni to try the same mass synthesize-and-screen approach to finding potential new drugs. In 1988, they launched Affymax, one of the first in what has become a sea of combinatorial drug discovery companies. The strategy proved so successful that it swept through the pharmaceutical industry during the late 1980s and early 1990s and prompted Glaxo Wellcome to buy Affymax for $533 million a mere 7 years after its inception.

    Schultz was just warming up. In 1994, working with researchers at UC Berkeley and the Lawrence Berkeley National Laboratory, Schultz showed how the same strategy of generating diversity and then selecting the best could work for materials science. The researchers created a novel library of 128 high-temperature superconductors. That same year, Zaffaroni and Schultz were at it again, this time creating a combinatorial materials company called Symyx Technologies, which has gone on to rack up over $90 million worth of research deals with materials powerhouses such as Dow, BASF, and Unilever to search for novel plastics, catalysts, and even materials for electronic devices en masse.

    That flair for combining function with speed was just what leaders at Novartis had in mind when they tapped Schultz last year to create what they hoped would be a new approach to biological research at GNF. That approach, says Schultz, is to go beyond biology's traditional focus on single genes, proteins, or biochemical pathways and use the latest in computer and robotics technology to scan thousands of biochemical targets at once to understand how they work in unison. To do this, he says, requires a delicate marriage between technology and small groups of researchers focused on cutting-edge problems. “The best model for that in recent history is the old Bell Labs,” which was a dominant force in physics and materials research for much of the 20th century, says Schultz. “My view is it's time to do this in biology.”

    Doing that means acquiring a wide array of technology. As a result, Schultz and GNF aim to bring together all the available high-speed tools, such as gene chips for determining the suite of genes active in normal and diseased tissues, high-speed mass spectrometers that are essential for identifying proteins, high-speed robotics for synthesizing compounds and crystallizing large numbers of proteins for rapid structural determination, and powerful computers to make sense of all the data. “A lot of these things can be done in any lab in the world,” says Michael Cooke, a cell biologist at GNF. “But in very few places is it done in one [location].” Adds GNF neuroscientist Allen Fienberg, “It's a combinatorial chemist's view of biology. That's kind of [Schultz's] view on the whole place. Nobody else really thinks that way.”

    Or matches his frenetic pace. Schultz is usually at the office by 5 a.m. and stays on into the evening, dashing between meetings with a Diet Coke in hand, offering quick thoughts on new experiments, ever excited about the results. “You have a crazy idea, you're above the clouds. He pushes you to extend it. He takes you to Mars,” says Nada Zein, a chemist at GNF. Adds Fienberg: “Here's a guy who runs at 800 miles per hour. You have a conversation and he's three thoughts ahead of you. You start to say something and he answers your question saying, ‘I know what you're going to say.’”

    What nobody yet knows is whether the GNF experiment will succeed. The nonprofit research institute was launched 2 years ago with a promise of $250 million over 10 years from the Novartis Research Foundation—a philanthropic arm of the pharmaceutical giant. That number looks sure to swell as GNF plans to spend up to $35 million this year alone and just broke ground on a new $83 million campus a stone's throw from its current temporary site. “We're still in the honeymoon period,” says Nicholas Gehakis, a GNF molecular biologist. “Everyone is struggling hard to live up to … expectations.” But the clock is ticking. Says GNF robotics engineer Bob Downs: “If we don't perform they'd shut us down, and we'd deserve it.”

  13. OIL OUTLOOK

    USGS Optimistic on World Oil Prospects

    1. Richard A. Kerr

    A new 5-year assessment of the global store of oil bodes well for the world as a whole, but ill for the United States

    All the bellyaching lately about the price of gasoline in the United States might be taken as a sign that the world is feeling the first pangs of a global oil shortage. Pessimists, after all, have warned that world production could peak sometime in this decade (Science, 21 August 1998, p. 1128). The U.S. Geological Survey (USGS) would beg to differ.

    By its recently released estimate,* the world has 20% more oil awaiting discovery in yet-to-be-found fields than the USGS estimated 6 years ago. And a newly analyzed category—oil lurking in and around known fields—offers almost as much additional oil as in those undiscovered reservoirs. Thomas Ahlbrandt, the Denver-based director of the USGS's 5-year assessment, shares the upbeat outlook of many analysts: “I don't see peaking in the next decade at all,” he says, or for that matter in the 25-year projection of the study.

    The prospect of billion-barrel gushers to come doesn't change the pessimists' view, however. Even if the additional oil is really there, they argue, it pushes back the global production peak—and the end of the era of cheap oil—by years, not decades. And the assessment itself holds grim prospects for the world's leading oil consumer. “We're not running short of oil in the world, but for the United States I have deep concerns,” says Ahlbrandt. “The U.S. was richly endowed, but we've used a lot of it”—almost half by the latest count. Getting enough in the next decades will mean ever-greater dependence on other countries, especially those of the Persian Gulf.

    The new USGS assessment is the fifth and most optimistic in a series dating back to 1981. The latest is the product of a larger effort using, for the first time, a closely documented, uniform approach to evaluating potential deposits, according to Ahlbrandt.

    That new approach and the latest information from around the world led to an increase in estimates of so-called undiscovered oil, oil that lies in fields that have not been found. To estimate what hasn't yet been seen, geologists evaluate the chances that the starting material for oil—organic matter such as ocean plankton—was laid down in a given spot, that it was encased in porous rock, that it was heated to the proper point, and that the resulting oil could have seeped up to fill a self-sealed reservoir of permeable rock. Adding up those chances around the world (plus estimates made previously for the United States alone), the USGS study finds that a mean of 732 billion barrels of oil remains to be discovered in fields yet to be drilled. That's up 20% from the previous USGS estimate of undiscovered oil made in 1993.

    In a first for world oil assessments, the USGS also estimated how much the apparent sizes of known fields are likely to grow as drilling hits previously unrecognized pockets of oil within and just beyond the edges of fields already producing oil. USGS analysts, believing that this “field growth” was not great, had not specifically assessed it. But by the mid-1990s, Ahlbrandt and his colleagues saw signs that field growth was substantial and unaccounted for. By looking at the history of field growth around the world, including intelligence information and proprietary data made available by oil companies, they projected that the apparent size of existing fields would grow from initial estimates by an average of about 25%. That makes for another 612 billion barrels of oil. “We see the phenomenon wherever we look,” says Ahlbrandt. “It makes me less worried” about the world starting to run short of oil in the next few decades.
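    The assessment's headline numbers hang together on simple arithmetic; the back-calculations below are inferences from the figures quoted in this story, not values taken from the report itself.

        undiscovered = 732e9       # barrels, new mean estimate for undiscovered oil
        field_growth = 612e9       # barrels expected from growth of known fields

        # A 20% rise implies the 1993 undiscovered estimate was roughly:
        prior_estimate = undiscovered / 1.20               # ~610 billion barrels

        # 612 billion barrels at ~25% average growth implies initial field
        # estimates totaling roughly:
        implied_fields = field_growth / 0.25               # ~2.4 trillion barrels

        total_addition = undiscovered + field_growth       # ~1.34 trillion barrels
        print(prior_estimate / 1e9, implied_fields / 1e9, total_addition / 1e9)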

    The new USGS figures bolster the contention by most industry experts and resource economists that ever-improving technology for finding and efficiently extracting oil will hold off the peak in world oil production and keep prices down, perhaps until midcentury.

    The pessimists still see the glass half empty. “Finding a billion barrels here and a billion barrels there doesn't change the nature of the argument,” says energy analyst James MacKenzie of the World Resources Institute in Washington, D.C. The important transition, explains economist Robert Kaufmann of Boston University, is not when the world runs out of oil, but “when we go from steadily growing production to when production stagnates or declines while demand continues to grow.”

    That happened in the U.S. lower 48 states in 1970, pessimists are fond of pointing out, and no amount of new technology has reversed the subsequent decline in production. Alaskan oil production has also peaked, and some analysts see the stagnation of North Sea production in the past several years as a peak as well. One of the world's hottest prospects, the Caspian Sea, has 42 billion barrels of undiscovered oil according to the assessment, but MacKenzie calculates that it takes 400 billion barrels of oil to move the world production peak back just 6 years. With global consumption running at 27 billion barrels a year and increasing 1.5% to 2% a year, even the USGS figures push the peak of world production back years, not decades, from the pessimists' consensus date of 2015, says MacKenzie.
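    MacKenzie's larger point is easy to reproduce. At the consumption and growth rates he cites, even a 400-billion-barrel windfall covers less than a decade and a half of total demand, and under a roughly symmetric production curve it shifts the peak by far less; the sketch below checks only the consumption side.

        def years_covered(extra_barrels, consumption=27e9, growth=0.02):
            """Years of steadily growing consumption an extra endowment covers."""
            years, used = 0, 0.0
            while used < extra_barrels:
                used += consumption
                consumption *= 1 + growth
                years += 1
            return years

        # 400 billion extra barrels buy only about 14 years of consumption,
        # and move the production peak back much less (MacKenzie's ~6 years).
        print(years_covered(400e9))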

    The timing of the world oil peak may remain contentious, but optimists and pessimists alike agree that the United States will become increasingly dependent on foreign oil. U.S. production shows no signs of recovering, while the new assessment assigns even more resources to the Middle East and the Organization of the Petroleum Exporting Countries (OPEC) that it dominates. Mexico and China suffered sizable decreases in undiscovered oil, while the Middle East enjoyed an increase of 48%. And field growth is expected to be particularly large in the former Soviet Union and the Middle East, says Ahlbrandt. At the same time, non-OPEC production has stagnated over the past 3 years, which could be the production peak predicted for 1999 by the Paris-based International Energy Agency of the Organization for Economic Cooperation and Development. The second coming of OPEC, which analysts expected in the next decade or so, could be sooner rather than later.

  14. ASTRONOMY

    The Virtual Observatory Moves Closer to Reality

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    Data from decades of observations by dozens of instruments may soon be accessible, changing the way that astronomy is done around the world

    If Alex Szalay has his way, the next astronomical observatory won't be built on a remote mountaintop. It won't even have a telescope. Instead, says Szalay of Johns Hopkins University in Baltimore, Maryland, the National Virtual Observatory (NVO) will be an electronic web that gives astronomers access to terabytes of celestial data with the click of a mouse. In addition to eliminating such current occupational hazards as jet lag, shortness of breath, and long, cold nights, the virtual observatory promises to make possible new analyses of the heavens by weaving together information from facilities around the world—and in space. “In 5 years' time, we will have a complete view of the sky in 15 wavelengths,” says Szalay. “It will be just so different.”

    Seven years ago, when Szalay and a few colleagues first started kicking around the idea of a virtual observatory, being on site was still the key to success in astronomy. The Internet was in its infancy, and the concept of an electronic warehouse that would fulfill customer requests hadn't even occurred to Amazon.com founder Jeff Bezos, much less to most of the scientific community. Since then, however, a number of sky surveys have accumulated masses of data, others are well under way, computer and data-handling technology have improved significantly, and pilot projects are laying the foundations for the real thing. As a result, the astronomical version of Szalay's idea has gone big time. This spring the latest report by the National Academy of Sciences (NAS) on priorities for U.S. astronomy over the next decade (Science, 26 May, p. 1310) ranked a virtual observatory as the most important small project in astronomy, and last month some 160 astronomers and computer scientists spent 4 days* talking about the nuts and bolts of how to make that idea a reality.

    The aim of the NVO will be to pull together data from dozens of telescopes and decades of collecting time. Its use of huge, standardized data sets, collected by survey programs at all wavelengths from radio waves to gamma rays, will make possible large-scale statistical studies. No longer will astronomers have to cope with disconnected hit-and-run observations, all carried out with different instruments and different procedures. “Finally, we will be able to compare apples with apples,” says Szalay. Mining large data sets is also expected to turn up the oddballs of the universe—objects so rare that you'd be unlikely to run into them by chance, but all the more interesting for that. In addition, virtual astronomy will allow scientists to study interesting subsets of celestial bodies, such as all objects that are bright at infrared wavelengths, appear fuzzy in visible light, and are not detected as x-ray sources. The combination of these qualities, says Robert Brunner of the California Institute of Technology (Caltech) in Pasadena, “will create a mind shift in astronomy.”

    Indeed, enthusiasts say virtual observatories promise to change forever the way astronomy is done. They will allow astronomers to select any part of the sky, run complicated queries, and download raw observational data in any particular wavelength for further analysis. This could all be done from an office or home computer, as well as on the road, with a laptop. “It will enable you to do first-rate science, even without access to a large telescope,” says Brunner. “This will lead to a true democratization of astronomy.”

    A driving force behind an NVO is the exponential growth in digital data being collected by numerous sky surveys. Those surveys, in turn, are fueled largely by continuing improvements in charge-coupled device (CCD) detectors, which are getting larger, more efficient, and cheaper. “The number of CCD pixels observing the universe is doubling every 20 months,” says Szalay.

    Five years ago, a terabyte of data (1 million megabytes, or enough information to fill 1500 CD-ROMs) was huge. Today, it's no big deal. For instance, the Sloan Digital Sky Survey, a U.S.-Japanese effort to map 100 million stars, galaxies, and quasars in five different wavelengths, will eventually contain some 40 terabytes of data. The future Large-Aperture Synoptic Survey Telescope, another priority project in the NAS decadal report, will produce 10 terabytes of data per day. (By comparison, the amount of information in the human genome is a mere 10 gigabytes, or 0.01 terabytes.)
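    The scale of the problem is easy to quantify; the CD-ROM capacity below is an assumed typical value, and the rest follows from the figures in this story.

        MB_PER_TB = 1e6      # megabytes per terabyte, as the article counts it
        CD_ROM_MB = 650      # assumed capacity of one CD-ROM, in megabytes

        print(MB_PER_TB / CD_ROM_MB)    # ~1500 CDs hold one terabyte
        print(40 / 0.01)                # Sloan archive: ~4000 human genomes' worth
        print(40 / 10)                  # LSST would match Sloan's 40 TB in 4 days
        print(2 ** (5 * 12 / 20))       # 20-month doubling: ~8x more CCD pixels in 5 years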

    Right now, however, no infrastructure exists to access and combine all those disconnected archives. That's the challenge facing a virtual observatory. It would consist mainly of dedicated software and analysis tools, designed to search through vast amounts of astronomical data, as well as a certain amount of computer hardware, such as high-speed networks. The various archives will be accessed remotely, rather than brought together on a central computer. “The data has to be processed where it is sitting,” says Szalay, “since moving 40 terabytes of data over a typical Internet connection takes a couple of years.” To devise more efficient schemes to navigate and access the enormous amount of data, Szalay has teamed up with computer scientist Jim Gray of Microsoft in Redmond, Washington, one of the designers of TerraServer, a huge database of satellite images of Earth (see sidebar).
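    Szalay's “couple of years” is a sanity check worth repeating; the 10-megabit-per-second link speed assumed below would have been a generous connection in 2000, and slower links stretch the time accordingly.

        data_bits = 40e12 * 8          # 40 terabytes expressed in bits
        link_bps = 10e6                # assumed 10 Mbit/s network connection

        seconds = data_bits / link_bps
        years = seconds / (3600 * 24 * 365)
        print(round(years, 1))         # ~1 year at 10 Mbit/s, before any overhead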

    Astronomers say it could cost as much as $100 million over 10 years to create this virtual astronomical warehouse. NASA and the National Science Foundation (NSF), which have historically supported ground- and space-based telescopes, are the most likely federal sources of funding. Both have already sponsored pilot projects. And although the NVO isn't expected to be fully operational until 2005, preliminary versions may be imminent, says Brunner. “We won't close the doors, come out 5 years later, and say: ‘Here's the Virtual Observatory, use it,’” he says. Adds Szalay: “I would be surprised if there weren't something up and running by the end of this year.”

    Although the academy report backs a national project, the initiative lends itself to a worldwide partnership. Already, astronomers at the headquarters of the European Southern Observatory (ESO) in Garching, Germany, are working on a similar project. “This has to be global,” says Piero Benvenuti, head of the Space Telescope European Coordinating Facility (ST-ECF) at ESO. “We have already started discussions with our American colleagues.” Many European scientists attended the Caltech conference, and next month in Munich ESO will host another international conference, entitled “Mining the Sky.”

    Benvenuti coordinates the joint ESO/ST-ECF ASTROVIRTEL project, a pilot project funded for 3 years by the European Commission. Through ASTROVIRTEL, European astronomers can access a huge data archive that contains observations from the Hubble Space Telescope, ESO's Very Large Telescope at Paranal, Chile, and the Wide Field Imager at ESO's La Silla Observatory, also in Chile. The first round of “virtual observing” proposals will be selected within the next couple of months.

    “Our approach is to start with the proposals,” says Benvenuti, “and build the software tools from that.” A review committee will choose proposals not just on the basis of scientific merit but also on whether a question is general enough to warrant the development of a specific query tool. “In the end,” says Benvenuti, “we want our users to think of using the database as if it were an astronomical facility. You can't operate it all by yourself, just as you can't run the Hubble Space Telescope or the Very Large Telescope without expert help.”

    At Caltech, Brunner and Tom Prince have developed their own program called the Digital Sky Project. Funded by NSF and Sun Microsystems, it acts as a small-scale technology demonstrator for the NVO. “We've learned a lot about connecting and searching large databases,” says Brunner. “The National Virtual Observatory will certainly be founded on this experience.” Another prototype of a virtual telescope is SkyView (skyview.gsfc.nasa.gov), which lets the user look at a particular part of the sky in different wavelengths.

    The next step in creating a virtual observatory is working out the structural details. Caltech's George Djorgovski, who together with Szalay, Prince, Brunner, and others wrote a white paper on NVO for the NAS panel, foresees a series of small, focused workshops over the next year or so to address topics such as establishing new data archives, writing standard protocols for storage and query of data and images, and developing a uniform Web-based interface. “We also need to build real partnerships with computer scientists,” he says. “In the future, we might end up using virtual reality software techniques that are being developed right now for the next generation of Sony's PlayStation.”

    Nobody is predicting that virtual reality will supplant real telescopes, however. After all, an archive has limited value in the study of transient phenomena such as supernovas, variable stars, and asteroids. “The big scopes will not be replaced,” says Djorgovski. “But I think a lot of routine footwork will be shifted from relatively small telescopes to virtual observatories.”

    • *Virtual Observatories of the Future, 13 to 16 June, Caltech.

  15. ASTRONOMY

    Watch This Space!

    1. Govert Schilling

    Right now, if you type http://www.skyserver.org/ into your Web browser, you get an “Under Construction” message. But later this year, the site hopes to put the sky at your fingertips. Initially aimed at a more general audience, SkyServer might become part of a National Virtual Observatory.

    The idea is to create a seamless mosaic of the sky at many wavelengths and to provide people with a variety of views of every part of the heavens. SkyServer is being developed by the same Microsoft team that built TerraServer, a Web site giving access to satellite images of Earth. In fact, SkyServer has been described as the TerraServer looking up rather than down. “Astronomy has a special public appeal,” says Alex Szalay of Johns Hopkins University in Baltimore, Maryland. “It is an incredibly interesting challenge to develop tools that might excite high school kids.”

    While TerraServer contains fewer than 2 terabytes of data, SkyServer would be much larger. “It would be easy to fill it up to tens of terabytes,” says Szalay. He expects a small prototype of SkyServer to be up and running by the end of the year.

    The European ASTROVIRTEL project will also have educational value, says Piero Benvenuti of the European Southern Observatory. “We could offer the digital sky to any school class,” he says. “Students could use it as a telescope and carry out their own projects.” Benvenuti plans to pitch the idea to several European funding agencies.
