News this Week

Science  05 Nov 1999:
Vol. 286, Issue 5442, pp. 1058

    EMBL Faces Huge Bill Following Adverse Pay Dispute Ruling

    Michael Balter

    Scientists at one of the world's leading research centers, the European Molecular Biology Laboratory (EMBL) in Heidelberg, Germany, are in shock after being told that an adverse judgment in a salary dispute could wipe out as much as 25% of the lab's core funding and threaten cancellation of its ambitious future plans. The bad news was delivered at a staff meeting last month by EMBL director-general Fotis Kafatos, who told researchers and other employees that a worst-case interpretation of the ruling—which was rendered by the Geneva-based International Labour Organization (ILO)—could lead to the institution shutting down. Although Kafatos stressed that this possibility was remote, even under the most optimistic interpretation the complex judgment will cost the lab millions of dollars in back salary payments and possibly curtail important new research initiatives.

    The decision comes at the worst possible time for the 25-year-old institution. EMBL is already facing the potential loss of European Union (EU) infrastructure funding for one of its outstations, the European Bioinformatics Institute (EBI), near Cambridge, U.K. The EU—which this year provided about 36% of the EBI's $8.3 million budget—decided earlier this year to cut all its infrastructure funding to EBI and several other European research facilities next year. (A letter from more than 60 European scientists protesting this decision was published in this week's Nature.) Later this month, Kafatos is scheduled to go before EMBL's governing council, which consists of delegates from the lab's 16 member countries, to present a draft 5-year scientific plan for 2001 to 2005 that is expected to include a significant boost in EMBL funding for EBI and its new mouse genetics facility at the Monterotondo Research Center near Rome. Kafatos and other EMBL researchers fear, however, that the council may take a dim view of increased support for EBI if it is forced to fork out huge additional sums of money to the lab's staff.

    The convoluted legal case has its roots in EMBL's status as an international organization, sponsored by 15 European nations plus Israel. Many such organizations—including NATO, the European Space Agency, the Organization for Economic Cooperation and Development, and others—belong to a group called the Coordinated Organizations (Co-Org), which sets salary levels and scales for its members. Although EMBL has never joined Co-Org, in 1982 the lab adopted the Co-Org system as a “guide” for salary levels, a decision that was written into its staff regulations. But in 1992 the EMBL council, concerned with the growing costs to the lab of following the Co-Org guidelines, began capping salary increases at lower levels. As a result, overall salaries at EMBL began to slip below those at Co-Org member organizations.

    In 1995, EMBL's staff association challenged this decision. When the council refused to relent, three EMBL scientists and two nonscientific staff members—later joined by a large number of their colleagues—filed complaints for back pay with the ILO, which arbitrates labor disputes involving international organizations. On 8 July of this year, the ILO's administrative tribunal ruled in their favor, arguing that EMBL could not deviate from its own staff regulations without providing “proper reasons” and adding that “financial considerations … do not constitute a valid reason.” The tribunal's order, which cannot be appealed, directs EMBL to implement the Co-Org salary increases for 1995, and in addition to pay employees 10% annual interest since that year on the sums past due.

    However, the decision leaves ambiguous whether the council is only required to grant the percentage increases mandated for 1995, or whether it must now restore overall salary levels to what they would have been if the guidelines had been followed since 1992, an extremely costly interpretation that some staff members are advocating—but one that could nearly bankrupt the lab. “The ILO-mandated salary adjustments are reasonable if implemented according to the council's [more conservative interpretation],” Kafatos told Science. “But the extreme interpretations advocated by some [present and former staff members] are inappropriate and could damage EMBL badly.” Moreover, two more cases before the ILO concerning salaries for 1996 and 1997 are still pending and are expected to be decided early next year, although the amounts of money involved are considerably less.

    At the staff meeting last month, Kafatos said that at worst the ruling would require the lab to pay an immediate lump sum of about $11 million, or about a quarter of the $43 million the member states provided EMBL in 1999 for its core operating costs. The tab for catching up with Co-Org salary scales would then run about $2 million per year in future years. And these large sums do not include the possible adverse judgments for 1996 and 1997. Even the best-case scenario, in which the lab would not have to fully adjust overall salaries to Co-Org levels, would still mean a $2.7 million bill for the 1995 judgment alone.

    “The council was wrong in not granting these salary increases,” says structural biologist Luis Serrano, chair of the EMBL staff association. “From a legal point of view they made a blunder.” But Kafatos says that the lab was not obligated to follow Co-Org rules. “It is clear that the EMBL council never ceded its decision-making powers to Co-Org [because] we are not a member.”

    “I am not worried that EMBL will shut down,” says computational biologist Peer Bork, “because that is an extreme scenario and I don't think it will happen. But there is a chance that we will lose our critical mass in some [research] areas and will not be competitive anymore.” One of these areas, Bork says, is the hot field of functional genomics, “which requires a lot of expensive equipment.” And Serrano adds that “it is clear this is going to be a major blow for EMBL.” Just how major, he says, depends on which interpretation of the ILO ruling the staff chooses to insist upon, a decision currently being debated in the lab's corridors.

    As Science went to press, the staff association was expected to meet this week to discuss its options. EMBL employees might agree to spread the back payments over a number of years to soften the blow or to take increased holiday time to partly compensate for the money they are owed. But Serrano says that many of the original complainants in the case are no longer at EMBL, and they may not be willing to compromise—in which case the matter could end up back before the ILO if the governing council does not agree with the staff interpretation. According to one staff member who prefers to remain anonymous, “there are people who don't care if the lab goes down the drain over this.”

    Kafatos will have to tread carefully to avoid such a scenario in the coming months. The best solution, he says, would be one that would “safeguard both the fair interest of all EMBL personnel and the continued well-being of EMBL as an institution.” And he will be looking for any encouraging signs that the staff will rally behind him. Says Serrano: “Nobody in this building is interested in destroying EMBL.”


    Booby-Trapped Letters Sent to 87 Researchers

    Jocelyn Kaiser

    Psychobiologist John Capitanio could see the razor blade through the back of the envelope mailed to his office at the University of California, Davis. He already knew what to look for: Capitanio had been told he was one of 87 scientists across the United States who use nonhuman primates in their research and were targeted last month by a shadowy animal rights group that originated in Britain. Booby-trapped to slice the fingers of an unsuspecting scientist trying to open them, the letters mark a new and disturbing turn toward violence by the militant wing of the animal rights movement.

    Although animal rights groups have vandalized many laboratories in the United States, in recent years most attacks on individuals have occurred in Europe (Science, 4 June, p. 1604). “This is the first time there's been a campaign of this ilk [in the United States] on this large a scale,” says Mary Brennan, executive vice president of the Foundation for Biomedical Research (FBR), a Washington, D.C., watchdog group, which warned the intended victims after spotting a list of them on an animal rights organization's Web site. While some researchers, like Capitanio, seemed to take the missives in stride, others saw them as much more serious: “Some of my colleagues are feeling very frightened,” Capitanio says.

    As Science went to press, more than 50 of the 87 letters had been received, all bearing a Las Vegas postmark dated 22 October. In addition to a razor blade taped inside the upper edge of the envelope, each letter contained a short, typed message that read, in part: “You have until autumn of the year 2000 to release all of your primate captives and get out of the vivisection industry.”

    A group called the Justice Department has claimed responsibility for the letters in a 24 October communiqué on a Web site, the Animal Liberation Frontline Information Service, that posts information supplied by “underground” groups such as the Animal Liberation Front (ALF). The FBR and another group, Americans for Medical Progress, spotted the posting the next day and alerted the researchers listed. No injuries have been reported.

    The Justice Department originated in Britain, where it has acknowledged sending letter bombs and other devices to pharmaceutical labs, animal breeders, and researchers since 1993. One of the group's members served 3 years in prison. Three years ago, the group began sending similar threatening letters, complete with razor blades, to Canadian hunting groups and fur retailers. A fact sheet on the site associated with the ALF, which expresses a commitment to “nonviolence” despite having taken credit for past attacks on animal labs, explains that the Justice Department “see[s] another path … [that] involves removing any barriers between legal and illegal, violent and nonviolent.”

    Some researchers already accustomed to regular protests by animal rights groups seemed unfazed by the letters. “There's not a whole lot we can do about this sort of thing other than just stay alert and not fool around with anything that looks suspicious,” says Peter Gerone, director of the Tulane Regional Primate Center in Covington, Louisiana. But others were less nonchalant, including a University of Washington, Seattle, AIDS researcher who declined to have his name published. “I have a family,” he says. “I don't want to say I'm afraid, but there are certain situations where you don't take chances.” Capitanio says he's okay now, but admits that “I might feel more nervous next autumn.”


    Reverse Engineering the Ceramic Art of Algae

    Ivan Amato*
    *Ivan Amato is the author of Stuff.

    The glasslike silica laceworks within the cell walls of diatoms are so beautiful they'd be on display in museum cases if only they were thousands of times bigger. No one knows how these tiny algae pull off their bioceramic art, but researchers are closing in on the secret. On page 1129, biochemist Nils Kröger and colleagues at the University of Regensburg in Germany report new clues—silica-forming proteins dubbed silaffins.

    Within seconds after they added their first silaffin samples to solutions of silicic acid, a soluble, silicon-containing compound, Kröger, Rainer Deutzmann, and Manfred Sumper knew they were onto something. Says Kröger: “You suddenly see the precipitate form. The solution gets cloudy”—something that takes hours to happen without silaffins. A scanning electron microscope showed that the precipitate had formed networks of minuscule silica spheres.

    Kröger and his colleagues went on to analyze the proteins and show how their structures and chemical features could help catalyze the reaction of silicon-containing molecules into solid silica particles. The researchers “have done a great job of characterizing their proteins,” says Galen Stucky of the University of California, Santa Barbara, who last year found what may be compounds with similar functions in silica-making sponges.

    Besides helping to explain how diatoms transform dissolved silicon-containing molecules into sturdy solid particles, the finding is also a tantalizing clue for materials scientists who envy biology's ability to build sophisticated materials at ambient pressures and temperatures. To make any ceramic, from a dinner plate to a toughened drill bit, engineers and artisans now have to mix powders, press them into molds, and fire them in furnaces. There are no furnaces in sight when a developing child infiltrates itself with bone or a diatom drapes itself in silica lace, and materials scientists would like to know how they do it.

    The Regensburg group suspected that diatoms make proteins that orchestrate the initial phase of biosilica formation—the growth of tiny silica spheres. For one thing, other researchers had already found organic molecules closely linked to diatom cell walls. After extracting the organic material from their diatom samples, the Regensburg researchers isolated three proteins that could instigate silica precipitation in a test tube—a pair of small, closely related silaffins (1A and 1B) and another larger one, silaffin 2. To begin unraveling how the proteins work, the group determined the amino acid sequence of silaffin-1B and ferreted out a gene from the DNA of the diatom Cylindrotheca fusiformis, which turned out to encode silaffin-1A as well. Kröger says the team also is now working to characterize silaffin 2.

    The structures of these proteins harbor clues to the diatoms' silica engineering. The glasslike veil of a newborn diatom takes shape in a “silica deposition vesicle,” where conditions are acidic. Both silaffins have an unusual amino acid motif, consisting of bonded pairs of lysines with a string of amine groups grafted on after the protein chain is formed. The researchers say that under acidic conditions, this motif should stimulate silicic acid molecules to form silicon-oxygen bonds, linking them together into silica particles. That might help explain how diatoms form solid silica from ingredients dissolved in their watery environs, but it doesn't explain how the algae coax the silica to form intricate patterns. Kröger conjectures that other features of the proteins could be at work.

    Silaffin-1A and -1B both consist mainly of two chemically distinct components, one bearing multiple positive charges and another multiple hydroxy groups. To Kröger, the proteins resemble synthetic block copolymers—polymers in which two distinct segments, each repeated many times, alternate along the molecule. When some copolymers solidify, like segments cluster together, segregating into two separate phases that pattern the material with regions of contrasting chemical properties—somewhat the way drops of oil poured onto a saucer of vinegar form segregated droplets. Kröger wonders whether silaffins might be doing something similar within a diatom's silica deposition vesicle, forming molecular frameworks that then guide the growth of the silica.

    However diatoms create their silica patterns, it's a trick materials scientists would like to emulate. “Ceramics are one of those unfulfilled materials we could use lots more of, if only we could get [them] easily,” says materials researcher Paul Calvert of the University of Arizona, Tucson. Adopting biology's kinder, gentler methods could help engineers combine ceramics with other materials that can't take furnace temperatures. Quips Calvert: “You could make something with chocolate feet and a silicon carbide head.” Unlikely material combinations, he says, could push forward such projects as “flexible electronics,” in which silicon-based electronics are patterned onto polymer sheets. Diatom-like methods for making intricately shaped ceramics might also yield photonic materials, whose internal arrangements of solid and space could select and confine specific wavelengths of light for communication or computing.

    The more scientists learn about diatoms' glassy laceworks, the more beautiful they seem.


    Has a Great River in the Sea Slowed Down?

    Richard A. Kerr

    For many millions of years, two “rivers” of seawater have been flushing the deep sea clean while shuttling chemicals and heat so as to reshape climate. Now, a new analysis of oceanographic data suggests that one of the two rivers has slowed dramatically within the past century, with implications for climate and the humans who are changing it.

    In a paper on page 1132 of this issue of Science, marine geochemist Wallace Broecker of Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York, and his colleagues argue that the renewal of deep waters by sinking surface waters near Antarctica has slowed to only one-third of its flow of a century or two ago, while deep water formation in the North Atlantic—the site of the other river—remains high. “The whole concept that deep water circulation could have changed that much is mind-boggling,” says Broecker, who adds that, far from being a onetime event, the slowdown may recur in a 1500-year cycle.

    This huge, climate-altering change in the oceans—if it's real—would greatly complicate attempts to understand how the ocean and climate are responding to another influence on climate, the buildup of greenhouse gases in the atmosphere. “It's a really interesting and provocative idea,” says ocean circulation modeler Jorge Sarmiento of Princeton University, “but I'm very uneasy about the calculations. I find the paper more of a stimulation to further work than what I could accept as proven fact.”

    Directly measuring the flow of surface waters into the deep sea is impractical. Instead, researchers examine easily measured, indestructible “tags,” some natural and some manmade, that join surface water before it becomes denser and sinks into the abyss. One tracer, the sum of the phosphate and oxygen in seawater, should remain constant as water sinks into the deep sea. In data collected by other researchers, Broecker found that this tracer, called PO4*, is high in newly formed deep water near Antarctica and relatively low in newly formed deep water in the northern Atlantic. Throughout the deep Indian and Pacific oceans it is at intermediate levels, suggesting to Broecker that these deep ocean basins have received about equal amounts of water from each source during the past 800 years or so.

    Another tag, radioactive carbon-14, also supports equal roles for the two deep-water source regions over the last millennium, according to Broecker's analysis. Surface water heading down absorbs carbon dioxide, including carbon-14 formed in the atmosphere by cosmic rays, and carries it along toward the bottom. Broecker analyzed carbon-14's distribution throughout the world ocean and concluded that about 15 million cubic meters of water per second (15 Sverdrups) has been sinking into the deep sea at each source during the past 800 years.

    But tracers that gauge deep water formation over decades rather than centuries seem to show that the southern source is now much smaller than the northern one. Physical oceanographers have long believed that the principal southern source of deep water, in the Weddell Sea, now supplies no more than 5 Sverdrups, judging by heat and salt content. And Broecker finds further evidence in a new study of water in the Southern Ocean, near Antarctica, by physical oceanographer Alejandro Orsi of Texas A&M University, College Station, and his colleagues. They looked at the distribution of the pollutant chlorofluorocarbon-11 (CFC-11), which first entered the environment a few decades ago. To Broecker, the study implies that the southern source has generated only 4 Sverdrups of new deep water during the past few decades. In the north, on the other hand, other CFC-11 studies support 15 Sverdrups, says Broecker.

    Why should the sinking of seawater into the deep sea have slowed recently in the south? Broecker doesn't know, but he sees a parallel between the apparent recent slowdown and more drastic variations in ocean circulation during a sharp cold snap 11,000 years ago. As recorded in Atlantic sediments, deep water formation in the north slowed or halted. Because warm water normally flows northward to replace the sinking surface water, the shutdown chilled much of the Northern Hemisphere; meanwhile water began sinking faster in the south, warming the region. Something similar might have happened in the 500-year Little Ice Age, which ended around 1880, says Broecker. Since then, deep water formation in the south would have slowed. And because some suspect that the Little Ice Age is only the latest swing in a 1500-year climate cycle (Science, 27 February 1998, p. 1304), further changes could be in store.

    Broecker's ideas are “always interesting,” says marine geochemist Richard Gammon of the University of Washington, Seattle, “and he's right often enough that people have to pay attention.” Physical oceanographer Arnold Gordon of Lamont is certainly paying attention. However, he and Orsi don't think Broecker has the evidence to back up his claim. By their reckoning, true deep water formation is currently equal north and south at about 5 Sverdrups. In his accounting of deep water formation, they say, Broecker includes waters that never get very deep or are picked up by new deep water as it sinks.

    Broecker isn't worried about the cautious reception. “I don't expect people to accept at face value what I say.” The important thing is that “the Little Ice Age is going to get more attention,” he adds. “If I'm right, it has enormous consequences.” Sorting natural climate oscillations from anthropogenic greenhouse warming would become more difficult. The ocean might lose some of its ability to draw off greenhouse gases and stash them away in the depths. And future change could become even harder to predict. As Sarmiento says: “It's a very interesting speculation; it's also disconcerting. If we want to understand the next 100 years or 200 years, we really need to understand what is going on with long-term cycles.”

    AIDS

    European Vaccine Effort Faces Chinese Puzzle

    Michael Balter

    Paris—During an international AIDS meeting here last week, a group of researchers quietly met to plan a 3-year, $9.2 million European Union-backed effort to develop an AIDS vaccine. The new initiative, called EuroVac, is expected to begin on 1 January and will mark the first time the EU has attempted to pull Europe's top AIDS researchers together into a unified vaccine drive. Yet even though the contracts between the EU and the researchers are still being negotiated, the initiative has already become tangled in international AIDS vaccine politics. Some members of the EuroVac scientific team were surprised to learn last week that one of the project's potential ambitions—to test vaccine candidates in China, where the AIDS virus is spreading rapidly—may duplicate similar efforts under way by virologist David Ho, director of the Aaron Diamond AIDS Research Center in New York City.

    The EuroVac project, which is co-chaired by virologists Jaap Goudsmit at the University of Amsterdam and Marc Girard at the Pasteur Institute in Paris, will begin with phase I trials to compare the ability of several different vaccine preparations to elicit immune responses against HIV. The European team will test how well two different types of genetically engineered vaccinia viruses—one called MVA and the other NYVAC—serve as noninfectious vectors to present four key HIV proteins to the immune system. In each case, this “prime” will be followed by a “boost” vaccine preparation consisting of HIV's envelope protein, the main component of its outer coat.

    In addition, the trials will mix and match proteins from two major clades, or subtypes, of HIV: clade B, which predominates in North America and Europe, and clade C, which now accounts for about 40% of new HIV infections in the world and is particularly rampant in China and India. The mix-and-match strategy should eventually allow researchers to determine whether a vaccine against clade B would also be effective against clade C and vice versa, once they are tested in full-fledged efficacy trials. “EuroVac is quite innovative,” says team member Giuseppe Pantaleo, an immunologist at the Vaudois Hospital Center in Lausanne, Switzerland. “For the first time we will be comparing MVA and NYVAC with four major viral proteins, and it's the first time we will be trying to get cross-clade immune responses.”

    But it is EuroVac's plans to test a clade C vaccine that have provoked the sparring over turf. The clade C HIV on which the vaccine is based was provided by Hans Wolf, a virologist at the University of Regensburg in Germany, who obtained the viral strain from colleagues in China. In the meantime, Ho, who has his own contacts in China, has prepared a clade C vaccine using a somewhat different strategy, which combines an MVA vaccine with a “naked DNA” preparation that delivers HIV genes directly to the body. Ho has been quietly talking with Chinese health authorities about testing his vaccine in China and has also been discussing funding with the New York-based International AIDS Vaccine Initiative (IAVI)—a private organization funded by numerous major foundations as well as the World Bank and the British government.

    Wolf and Ho learned of each other's efforts only fairly recently. Wolf argues that the efforts are duplicative and criticizes IAVI for funding what he sees as a competitive study. “I would have nothing against it if someone like David joined in our trial,” Wolf told Science. “But now IAVI is running around the world and putting money into a competing thing; this is irresponsible.” Ho counters that “it is not unusual that multiple groups pursue the same objective” and adds that the differences between the two vaccine strategies might argue for comparing their effectiveness in parallel clinical trials. An opportunity to resolve this conflict may be at hand, however: Later this month, a Beijing meeting co-sponsored by IAVI will bring together representatives of the Chinese ministry of health, NIH, and IAVI, as well as one or two members of the EuroVac team.

    Viral immunologist Wayne Koff, IAVI's vice president for research and development, says that the clade C vaccines developed by EuroVac and Ho are only two of a number of possible preparations that could go into preliminary clinical trials in China, where more than 400,000 people are estimated to be infected with HIV. IAVI's basic strategy—to accelerate development of the most effective vaccine candidates—could well mean that it would end up funding a “head-to-head” comparison of the two vaccines. But whether either vaccine will ultimately end up a finalist, he adds, “is too early to say.”


    Congress Shrinks Lab Chiefs' Flexible Funds

    David Malakoff

    If you don't like the way an institution is run, go after its budget. Congress has just applied that logic to a special fund controlled by the directors of the Department of Energy's (DOE's) national laboratories, slashing the amount available for hiring young scientists and funding high-risk research. Four national labs, including the three nuclear weapons centers, have been hit especially hard by the reductions, which lab officials hope to reverse next year. Lawmakers say that some reprogrammed money has been mismanaged in the past and that the cuts are needed to keep the labs focused on priorities determined by Congress.

    The accounting change restricts the flexibility that Congress gave lab directors in 1991, when it created an account called the Laboratory Directed Research and Development (LDRD) fund. The mechanism currently allows each lab to divert up to 6% of the funds it receives from the federal government and other sources to carry out relatively small research projects, usually costing from $25,000 to $500,000 per year. The money is awarded to lab scientists through a competitive process. Language in the 2000 DOE budget bill signed into law last month, however, reduces the LDRD tax (divertible funds) to 4% and exempts environmental cleanup programs—a major piece of the budgets of several labs—from any tax.

    Although smaller DOE labs often do not impose the maximum tax for a variety of reasons, some of the larger labs, including the Los Alamos and Sandia nuclear weapons laboratories in New Mexico and the Lawrence Livermore laboratory in California, have used it to amass annual funds of $50 million or more (see graph). Lab administrators say the money has been essential for attracting young researchers with fresh ideas and for backing risky research, such as forays into materials and computer science, that have evolved into lab mainstays. “LDRD provides us with cherished freedom and creativity in basic research,” says Dan Hartley, Sandia's vice president for laboratory development.

    But LDRD spending has also attracted scrutiny—and criticism—from some members of the House Appropriations Committee. Representative Ron Packard (R-CA), chair of the spending panel that oversees DOE's budget, and other lawmakers are unhappy that LDRD siphons funds from programs Congress has approved, such as environmental cleanup efforts, and that some labs have funded projects of little relevance to DOE's mission. “The concern is that when you give a lab director $70 million to spend, it will be used for their priorities, not the nation's,” says one House aide. Opponents of LDRD funding have pointed to internal DOE reviews over the last decade that have found instances of mismanagement of LDRD dollars and accounting practices that diverted more funds than were allowed under the rules. In past years the Senate has rebuffed House efforts to scale back or eliminate LDRD. But this year, after the House voted to cancel the program, Senate negotiators succeeded in restoring only part of the funds.

    At Los Alamos National Laboratory, the change has produced a “traumatic” 40% cut in the lab's $70 million LDRD budget, which wholly or partly funds hundreds of scientists, says Klaus Lackner, acting associate director for strategic and supporting research. To avoid layoffs, he says the lab is shifting some scientists to weapons projects with more stable funding and focusing the remaining LDRD money on supporting young researchers and funding projects—such as those in the life sciences—unlikely to find backing elsewhere. “We started from the premise that postdocs must be able to go on,” he says.

    At Sandia, where LDRD funds have dropped from $83 million to $52 million, officials worry that the funding uncertainty could cause “some of our brightest, youngest people” to leave, Hartley says. Similar fears are being voiced at Livermore, which lost $23 million of its $58 million LDRD budget. “We are focusing our resources on protecting our long-term strategic investments,” which supplement existing work in such areas as computing and the effects of aging on nuclear weapons, says Rokaya Al-Ayat, Livermore's deputy director for LDRD.

    Also hard hit by the change was the Idaho National Engineering and Environmental Laboratory, which relies heavily on environmental cleanup funds that can no longer be taxed for LDRD funds. The lab's new director, Billy Shipp, is confident that he and DOE headquarters staff can find a way to continue many existing activities despite a cut from $21 million to $6 million, in part by getting congressional permission to use money from other programs.

    Lab officials are also thinking about the best way to restore LDRD funding in next year's appropriations bill. Bill Appleton, Oak Ridge's deputy director for science, says scientists must convince House members that LDRD “is one of the few ways that the labs have of doing innovative research that has major payoffs down the line.” At stake, say he and other lab officials, is their ability to attract the best talent and stay at the forefront of science.


    Fossils Give Glimpse of Old Mother Lamprey

    1. Carl Zimmer*
    1. Carl Zimmer is the author of the book At the Water's Edge.

    Evolution went on a creative spree about 540 million years ago. Over the course of less than 20 million years during the Early Cambrian period, a huge diversity of animals appeared for the first time, including many of the major groups living today, such as arthropods, mollusks, and various sorts of worms. Notably missing from this party—known as the Cambrian explosion—was any member of our own lineage, the vertebrates. Until now, the oldest unambiguous vertebrate fossils dated back 475 million years. But this week our genealogy took a giant leap back in time. Chinese and British paleontologists reported in Nature that they have found fossils of 530-million-year-old vertebrates—fossils that have other paleontologists in awe. “I was absolutely amazed the first time I saw these fossils. They're just unbelievable,” says Philippe Janvier, a paleontologist at the Muséum d'Histoire Naturelle in Paris who is an expert on early vertebrates.

    You might expect that such ancient creatures would be primitive, transitional forms linking us to our pre-vertebrate past. Yet surprisingly, the fossils are actually full-fledged vertebrates—more advanced, in fact, than some vertebrates alive today. As a result, paleontologists think fossils of even older vertebrates must be waiting to be discovered, perhaps in rocks dating from well before the Cambrian explosion.

    The two fossils come from a site in southern China called Chengjiang, already famous for its Cambrian treasures, where the fine-grained rock retains impressions of muscles and other soft tissues. “Chengjiang really takes your breath away,” says Simon Conway Morris, a paleontologist at the University of Cambridge. After learning that two different teams of paleontologists, one led by Degan Shu of Northwest University in Xi'an, had unearthed the vertebrate fossils, Conway Morris traveled to China this April to analyze them with Shu and other Chinese colleagues.

    They found that the two fossils represented different species, and although the fossils measured only a couple of centimeters long, the researchers could recognize key vertebrate traits. They had rows of gills, and their muscles were arranged in W-shaped blocks along their flanks, a pattern unique to vertebrates. “They were presumably filter feeders, but they have these muscular bodies and things which we cautiously interpret as an eye,” says Conway Morris. “And so presumably they could go along at a fair pace if they had to, and they might have grabbed prey.”

    The researchers then tried to find a place for the fossils in vertebrate evolution. A number of researchers believe that vertebrates evolved from an ancestor something like Amphioxus, otherwise known as the lancelet. Amphioxus, which lacks eyes or fins and looks rather like a miniature anchovy fillet, has a notochord—the flexible rod that is the forerunner of the backbone. The first vertebrates added new traits to that body plan, such as a skull with a brain; later vertebrates acquired jaws and fins. The most primitive vertebrate alive today is the hagfish, a jawless fish, and the second-most primitive is the lamprey.

    Conway Morris and his colleagues concluded that the fossils fall into a surprisingly advanced position. One of the species, which the researchers named Haikouichthys, is most closely related to the lamprey. The other fossil—tortuously named Myllokunmingia—is more primitive (its gills are simpler), but Conway Morris says it is still a closer relative to us than to the hagfish.

    Features seen on both fossils may help answer the controversial question of how early vertebrates evolved the paired fins that later gave rise to arms and legs (Science, 23 April, p. 575). The new fossils show what look like two long folds of tissue running along their underside—exactly what some theories of fin evolution predicted. “We think there's a reasonable case for a double arrangement,” says Conway Morris.

    Janvier, who has argued that the paired fins came much later, has his doubts. “From what I could see of the fossils, it's not 100% certain.” He is also uncertain about the fossils' placement on the vertebrate family tree, because many details of the creatures' anatomy have been lost. He has no doubt that they are vertebrates, but says, “I wouldn't put my money on the exact positions.”

    If Conway Morris is right about the creatures' sophistication, however, millions of years of vertebrate evolution must have preceded them, reaching back before the Cambrian explosion. Some researchers already suspected as much, based on the clocklike divergence of genes in different animal lineages. According to a new study by Blair Hedges of Pennsylvania State University in University Park, for example, vertebrates got their start 750 million years ago. “Some of my colleagues who take molecular clocks seriously will be skipping for joy” over the new finds, Conway Morris acknowledges ruefully.

    He himself doesn't think vertebrates got their start so long ago. He suspects the first ones arose just before the Cambrian Period, about 565 million years ago. The traces of these ancestral creatures, he thinks, may be waiting, still unrecognized, among the fossils known as the Ediacaran fauna. “These stem groups are all lurking down there,” Conway Morris maintains, “but we're just too dim to see them.”


    Gravity's Gravity Vindicates Einstein

    1. Andrew Watson*
    1. Andrew Watson is a science writer in Norwich, U.K.

    Between them, general relativity and quantum theory explain all of nature's forces, and yet they refuse to be married. The strong and weak nuclear forces and electromagnetism are all described by quantum theories that mesh in a very satisfactory way. On the other hand, general relativity—Einstein's theory linking the force of gravity to the geometry of space and time—steadfastly refuses to be seduced into the quantum fold. “A goal in physics is to unify all the forces, that is, to combine gravity with the other three in one grand theory,” says Blayne Heckel of the University of Washington, Seattle.

    Like so many others, Heckel, Eric Adelberger, and their Seattle colleagues don't know how nature might entice the two parties to walk together down the aisle. So the Seattle group has instead looked for the possible progeny of such a match. One such child would be a difference in the way gravity acts on mass and on gravitational energy itself. But this hypothetical love child, expected in some scenarios of a deep connection between gravity and the quantum world, is nowhere to be found, the group determined.

    Einstein built his theory of general relativity on the premise that gravity acts equally on all forms of mass-energy. Experimenters have shown that nuclear binding energy and energies due to electromagnetic interactions do indeed obey this “equivalence principle.” For example, a proton and a neutron combine to make an object with less mass than the component parts; the binding energy holding the two parts together accounts for the missing mass. Yet experiments show that the combination and the individual parts free-fall at the same rate in a gravitational field.
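    That mass deficit can be made concrete with the deuteron, the simplest proton-neutron bound state (the numbers below are standard particle-data values, not figures from the experiments described here):

```latex
B = (m_p + m_n - m_d)\,c^2
  = (938.272 + 939.565 - 1875.613)\ \mathrm{MeV}
  = 2.224\ \mathrm{MeV}
```

    The binding energy accounts for roughly 0.1% of the deuteron's mass, yet free-fall experiments detect no difference between the bound system and its separated parts.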

    But no one has yet shown that gravitational energy responds to the pull of gravity in the same way as all other forms of mass-energy do. Some theories—including string theory, the current favorite in the attempts to synthesize a quantum theory of gravity—suggest it might not. “Many theorists expect that at some point we will find a difference,” says Heckel.

    Lab experiments can't study the impact of gravitational binding energy, since the energy tied up in the mutual pull of fragments of lab-sized objects is minuscule. The place to look, Kenneth Nordtvedt of Montana State University suggested more than a decade ago, is in the tug of the sun on the moon and Earth. Although Earth's gravitational binding energy is small as a fraction of its mass—a mere half a microgram per kilogram—because Earth is big, that binding energy has a mass equivalent of around 3 trillion tons. The moon's gravitational binding energy is around 2000 times smaller, but still big enough to displace the center of the moon's orbit relative to Earth if the sun's gravity treats mass and gravitational binding energy differently.
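    Those figures can be checked on the back of an envelope by treating Earth as a uniform sphere (an approximation; the true value is somewhat larger because Earth's mass is concentrated toward its core):

```latex
U \approx \frac{3}{5}\,\frac{G M_\oplus^2}{R_\oplus}
  \approx \frac{3}{5}\,
  \frac{(6.67\times 10^{-11})\,(5.97\times 10^{24})^2}{6.37\times 10^{6}}\ \mathrm{J}
  \approx 2.2\times 10^{32}\ \mathrm{J}
```

    Dividing the mass equivalent, U/c² ≈ 2.5 × 10¹⁵ kg (a few trillion tonnes), by Earth's mass gives about 4 × 10⁻¹⁰, or roughly half a microgram per kilogram—consistent with the figures quoted above.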

    Spotting these effects means monitoring the Earth-moon distance to high accuracy. By using lunar laser ranging, in which a laser beam bounces off reflectors dropped off on the moon by astronauts, Nordtvedt and others tracked this separation to centimeter accuracy and found, within the limits of the experiments, that the Earth and moon do indeed fall towards the sun at the same rate.

    Nordtvedt himself pointed out a loophole, however: Some quantum gravity theories suggest gravity might act differently on the Earth and moon because of compositional differences such as Earth's iron-dominated core, explains Heckel. “So one wants to know that the Earth and moon don't fall at different rates due to composition differences, and by an amount which could cancel a gravitational self-energy effect,” adds Nordtvedt. Such a cancellation is “quite unlikely,” but the Seattle group has sought to resolve this potential ambiguity.

    The Seattle experiment, reported in this week's Physical Review Letters, consists of a torsion balance: a fine wire supporting a tray that can rotate by twisting the wire. On the tray are four weights, alternating toy Earths and moons, all weighing exactly 10 grams. The two Earth-like weights are made of steel to simulate Earth's core material, while the two moon-like weights are made of quartz and magnesium-based materials that simulate both the Earth's and the moon's mantle material. The whole experiment is rotated so that the “planets” turn past the sun in 40-minute-long “days.” Any gravitational preference of the sun for the toy moon or toy Earth should yield a twist in the torsion balance. “It's a very clever idea, making these little models for the planets,” says Clifford Will at Washington University, St. Louis. The experimenters have produced “extraordinarily precise measurements.”

    The team found no twist. Their results, combined with the laser ranging, show that “gravitational binding energy falls at the same rate as all other forms of mass-energy to better than a part per thousand,” says Heckel. Heckel declares himself unsurprised at the result, and Einstein's theory triumphs yet again.

    Yet there's some comfort for the string theorists too, says Will, because differences in the rates of fall of different bodies could lie beyond the sensitivity of current experiments. “We really think there's a chance of finding a violation at some level,” Will says. So the dating game continues, but gravity remains as aloof and celibate as ever.


    Patterning Plastic With Plentiful Pillars

    1. Robert F. Service

    Richmond, Virginia—Rome wasn't built in a day, but a nanosized version of it may be in the near future. At the International Symposium on Cluster and Nanostructure Interfaces here last week, Stephen Chou, an electrical engineer from Princeton University in New Jersey, described a new microscopic patterning technique capable of creating arrays of plastic pillars, each less than a thousandth of a millimeter across, that resemble nothing so much as tiny versions of the great columns of Rome's Colosseum. Cheap, fast, and versatile, the patterning scheme could help create novel plastic displays and electronic devices. The pillars themselves could not only be used as wires in plastic electronics, but could also direct the growth of other materials, such as metals and semiconductors, into regular patterns.

    “It's very beautiful work,” says Peru Jena, a physicist at the Virginia Commonwealth University in Richmond. It's the beauty of simplicity, says Jena, because the technique requires nothing more than putting a mask above a heated thin polymer film and waiting a few minutes while the pillars assemble themselves. Chou says that, at around 500 nanometers in diameter, the pillars are now more than twice as large as the finest features that photolithography—the workhorse patterning technology of the chip industry—can lay down on silicon. Nevertheless, K. V. Rao, a physicist at the Royal Institute of Technology in Stockholm, Sweden, points out that photolithography has been refined over decades. As for the new technique, “this is just the beginning,” he says.

    It was an unexpected beginning, Chou says. He and his students were working on a related patterning technique in which they imprint a pattern of nano-sized ridges and grooves on a soft polymer with a tiny embossing stamp. In one experiment, however, tiny dust grains, each about 0.5 micrometers high, strayed onto the polymer before the stamp was applied. Like tiny boulders, the dust grains prevented the stamp from pushing into the polymer and making an impression. Yet when the Princeton researchers removed the stamp and looked at the surface of the film, they still saw a pattern of dots that matched features on the stamp. The stamp had somehow elicited a pattern without ever touching the surface.

    Surprised, they repeated the experiment to see if they could find out what had happened. They created another set of masks, this time incorporating tiny posts that held them about half a micrometer above the polymer surface, and again they saw the array of dots. Those dots turned out to be tiny polymer pillars that had grown up from the surface of the plastic layer to the mask.

    “We still can't be sure” what causes the pillars to form, Chou says. He and his colleagues have determined that a polymer film produces pillars only when it is heated enough to melt and the masking material above is electrically conductive, which leads Chou to speculate that the interplay of electrical charges in the mask and the polymer film creates the pillars. Localized concentrations of charge in the mask likely induce an opposite charge in the nearby film, he says, generating electrostatic forces that pull the pliable polymer upward.

    If correct, says Chou, this explanation suggests that the pillars should form first at the corners of the mask, since charges preferentially bunch there. And when the Princeton team set up a video camera to watch their pillars grow, they found that pillars formed first at the corners and edges of the film below the mask and slowly worked their way in toward the center.

    So far, Chou and his colleagues have made most of their pillars in a polymer called polymethylmethacrylate, more commonly known as plexiglass. But the technique also works with conducting polymers, which could serve as the basis of futuristic flat panel displays and disposable electronics. To test whether their pillars could make conductive wires for such devices, Chou and his colleagues laid down a film of conducting polymer on a metal strip and then grew pillars upward to touch another metal strip passing over the first at right angles. The group hasn't tested the electrical behavior of the pillars, but Chou says he fully expects that they will provide conductive pathways between the metal conductors. If so, plastic pillars may be in for a rising future.


    Humane Science Finds Sharper and Kinder Tools

    1. Erik Stokstad

    For decades, more and more researchers have been using fewer laboratory animals for compassion's sake. Thanks to new experimental techniques, many are getting cleaner results, too

    Bologna, Italy—A decade ago, veterinary surgeon Christian Schnell tested candidate drugs to lower blood pressure with a procedure that was highly stressful—for himself and his test animals. First, he would anesthetize marmosets and insert a catheter into an artery in their legs. The next day he restrained the conscious animals, orally administered the drug, and recorded blood pressure through the catheter for 4 to 5 hours. Not only did the harried animals' hearts race during the experiment, but each one could be used for only six trials before all its suitable arteries had been tapped.

    But by 1991, Schnell, a researcher at the drug company Ciba-Geigy in Basel, Switzerland, switched to a new and more sophisticated technology: a sensor that he implanted in the animals' abdominal cavities. The device continually measures blood pressure and transmits the data to receivers in the cages, allowing the marmosets to move freely and remain with their families—more relaxed and with normal heart rates. Without the confounding effect of stress, the results are cleaner. “We are now convinced we're measuring the truth,” says Schnell. And without the need for catheters, Schnell could do the same research with only 10% of the marmosets he had previously needed, saving the company up to $200,000 a year.

    Schnell's case illustrates an accelerating trend in which new technology is helping researchers reduce their reliance on animal experiments, while at the same time improving their results. Although animal rights extremists continue to use violent and intimidatory tactics against researchers in many countries (see page 1059 and Science, 4 June, p. 1604), more moderate campaigners for animal welfare have for years been working with researchers to encourage this trend toward better experimental design and more humane techniques. The motto of this movement is “Humane science is better science,” and its creed is the “three R's”—replacing laboratory animals, reducing their numbers, and refining techniques to minimize pain and suffering (Science, 11 October 1996, p. 168). The results have been striking: The use of lab animals has declined in many European countries—in some cases by as much as 50% over the past 2 decades.

    As a result, the mood among the more than 800 researchers who gathered here recently for the Third World Congress on Alternatives and Animal Use in the Life Sciences was cautiously upbeat. They exchanged information on a variety of technologies—including implantable sensors like those Schnell uses and new imaging techniques to replace invasive procedures—that are already reducing the number of animals and lessening distress. And researchers reported progress in several areas—such as DNA arrays and tests using stem cells—that could help drug companies rule out dangerous compounds before they're tested in animals. “The spin-offs of molecular biology and biotechnology will have a great impact on [lowering] the use of lab animals,” predicts geneticist Bert van Zutphen of Utrecht University in the Netherlands.

    But not all the trends are downward. Many animal welfare researchers are alarmed by the imminent prospect of a new round of toxicity tests in the United States on a host of so-called high production volume chemicals (see sidebar), as well as tests on potential endocrine disrupters, that may require millions of laboratory animals. And in some hot areas of research, such as transgenics, animal experimentation is rising fast. Since 1990, the number of procedures on transgenic animals in the United Kingdom, for example, has risen almost 10-fold to more than 447,000. That's “a huge rise and due to get much higher,” predicts Caren Broadhead of the Fund for the Replacement of Animals in Medical Experiments in Nottingham.

    Even when researchers come up with technologies that can lessen the use and suffering of test animals, they still face a formidable obstacle: the glacial pace of regulatory bodies in accepting replacement tests, such as cell cultures. “A validation study takes a long time,” says Herman Koëter, principal administrator of the Environmental Health and Safety Division of the Organization for Economic Cooperation and Development in Paris. “You need years and years to get a gold standard.” That frustrates researchers. “If people knew how onerous it can be to get a test validated, many fewer would begin developing new ones,” says Ian Kimber, research manager of AstraZeneca's Central Toxicology Laboratory in Alderley Park, U.K.

    Less is more

    Schnell's work with marmosets to test potential blood pressure drugs is Exhibit A in support of the humane science movement's claim that compassion can improve science. Ciba-Geigy had been puzzled by the fact that some candidate compounds that had looked promising in the earlier, more invasive, tests were duds in early human trials. But when Schnell tried those compounds again using implanted monitors in unrestrained marmosets, they proved to be 10 times less effective at lowering blood pressure than they had in the restrained animals. “It was a shock when we discovered this,” recalls Schnell.

    Since those early tests, telemetry sensors have shrunk in size and price and they are becoming more widespread. Blood pressure monitors weighing 3.5 grams are now small enough to be implanted into mice, and the device that Schnell uses costs about $3000. The new monitors are also far more versatile: Implantable devices can record temperature, blood pressure, heart rate, electrocardiograms, and intraocular pressure, and blood flow monitors will be available soon. “I'm convinced that telemetry will be the standard method in the near future,” says Schnell.

    Whereas implantable monitors can keep track of an animal's physiology, an imaging technique developed by Xenogen Corp. of Alameda, California, allows researchers to chart the course of an infection or the growth of tumors without any surgery at all. The technique essentially records a glow from inside the animal. The light bulb is the luciferase gene, which produces the firefly's bioluminescent protein. Researchers infect an animal with a microbe engineered to express luciferase, anesthetize it, and place it in a dark chamber. Some of the photons from the luciferase pass through the animal's flesh, and a charge-coupled device counts them for a few minutes, pinpointing the active microbes.

    The pharmaceutical industry is eyeing this technology as a potential replacement for a standard test called the mouse thigh model. To check out new antibiotics, for example, technicians give the test drug to 14 or more infected mice, then kill a pair of the animals every 2 hours, grind up their thigh muscles, and culture microbes from the tissue over 2 days. The better the antibiotic, the fewer microbes grow on the ground-up muscle. In contrast, researchers can scan a living mouse in just 5 minutes. And measuring the same animal throughout the study—rather than comparing individuals that might have had slightly varying initial infections or responses to the drug—also reduces variability.

    One group of researchers, led by Tom Parr of Lilly Research Laboratories in Indianapolis, recently compared the two techniques and presented their results at the 39th Interscience Conference on Antimicrobial Agents and Chemotherapy in San Francisco in September. The team ran a mouse thigh model using doses of a known antibiotic, but before extracting the muscle, they imaged the animals. The dose-response curves from the two assays were very similar, with correlations ranging from 0.94 to 0.98. Imaging “is more sensitive and more precise while requiring fewer animals,” says Parr. “We should be able to get more valuable information in less time.” The quick results also mean that test animals can be killed before they suffer the full effects of an infection. Xenogen president Pamela Reilly Contag says six pharmaceutical companies, including Eli Lilly, are evaluating the technology, and 10 others are in various stages of negotiation.
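    The kind of head-to-head assay comparison Parr's team ran can be sketched in a few lines of code. The numbers below are invented for illustration (they are not the Lilly data); the point is simply how a Pearson correlation quantifies agreement between the two dose-response readouts:

```python
# Comparing two assays of the same antibiotic (hypothetical data).
# Each list holds one response measurement at each of five increasing doses.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical readouts (log scale, arbitrary units) at five doses:
thigh_model = [8.1, 6.9, 5.2, 3.4, 1.8]   # colony counts from ground-up muscle
imaging     = [7.9, 6.7, 5.5, 3.1, 2.0]   # photon counts from the live animal

r = pearson(thigh_model, imaging)
print(f"correlation r = {r:.2f}")
```

    A correlation near 1, like the 0.94 to 0.98 the Lilly team reported, means the noninvasive readout tracks the traditional one closely across the dose range.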

    Going in vitro

    Drug companies are also showing interest in alternatives to animal tests to screen compounds for effects on fetal development. Researchers currently test for potential teratogenic effects by treating pregnant animals with a candidate drug and then checking embryos for abnormalities—a time-consuming and expensive proposition. “Most companies now want to have short tests that give a clear answer and that require small amounts of compound,” says Philippe Vanparys, director of genetic and in vitro toxicology at Janssen Research Foundation in Beerse, Belgium. Recent developments in establishing immortal lines of stem cells—general-purpose embryonic cells that can develop into any type of cell in the body—have raised hopes that such tests may be feasible.

    Because stem cells have a very reliable pattern of development into tissue, researchers can precisely measure any disruption to the number of cells, the quality of cells, and the timing of development. This provides a way of looking for subtle chemical effects that might lead to birth defects in particular organs. For example, Anna Wobus of the Institute of Plant Genetics and Crop Plant Research in Gatersleben, Germany, has developed an in vitro method to differentiate mouse embryonic stem cells into heart muscle cells, among others. Once these cells begin to beat after 9 days of normal development, researchers can check for defects in the nascent heart. In 1996, Horst Spielmann, director of the National Centre for Documentation and Evaluation of Alternative Methods to Animal Experiments in Berlin, submitted this test to the European Centre for the Validation of Alternative Methods (ECVAM) in Ispra, Italy, an organization run by the European Union that assesses the suitability of in vitro tests for replacing established animal tests. “So far it looks very promising,” says Juergen Hescheler, a molecular biologist at the University of Cologne, Germany.

    Now, Hescheler and his colleagues have added a feature to the test that could make it even faster, easier to use, and more versatile. At the Bologna meeting, he reported that his group has spliced a fluorescent reporter gene to the cardiac-specific promoter gene, so the cells express a green fluorescent protein on day 4 of development, cutting experimental time in half. “We can directly measure cell differentiation without any staining, so it's less time-consuming,” says Hescheler. The team now wants to link reporter genes to other types of stem cells, such as neuronal, epithelial, and cartilage precursor cells. If the reporter proteins could fluoresce in different colors, scientists might be able to examine the effects of potential toxicants on a suite of tissues at once. Interest in the cardiac reporter is already high. “In the last month, I had five to six pharmaceutical companies asking for this test,” says Susanne Bremer of ECVAM.

    Toxic chips

    Toxicologists are also turning to a hot new genetics technology to study cellular responses to test compounds: DNA microarrays, which are commonly used to track patterns of gene expression (Science, 15 October, p. 444). A single DNA “chip” carries an array of hundreds or thousands of short strands of DNA, each of which acts as a probe for a specific gene. To tell which genes were active in a sample, researchers convert messenger RNA to complementary DNA, tag it with a fluorescent marker, and wash the sample over the chip. Each tagged cDNA sticks to its matching probe on the chip, and its presence is revealed by a glowing patch when the chip is illuminated.
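    The hybridization step can be caricatured in code. Everything here is invented for illustration—the probe sequences and gene names are made up, and a real chip detects binding optically rather than by string matching—but the sketch shows the logic: each spot's probe binds only cDNA carrying its complementary sequence.

```python
# Toy sketch of a DNA chip readout (illustrative only).
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq):
    """Return the reverse complement of a DNA sequence."""
    return seq.translate(COMPLEMENT)[::-1]

# Each spot on the chip carries a short single-stranded probe for one gene
# (hypothetical sequences):
chip_probes = {
    "geneA": "ATGCCG",
    "geneB": "TTAGGC",
    "geneC": "CCGTAA",
}

def scan_chip(labeled_cdna):
    """Return the set of genes whose probes bind a labeled cDNA strand."""
    active = set()
    for gene, probe in chip_probes.items():
        target = reverse_complement(probe)  # the strand the probe hybridizes to
        if any(target in strand for strand in labeled_cdna):
            active.add(gene)
    return active

# cDNA strands made from the sample's messenger RNA and fluorescently tagged:
sample = ["GGTACGGCATAA", "AATTACGGTTTT"]
print(scan_chip(sample))
```

    Only the spots whose probes find a complementary strand light up, so the pattern of glowing patches is a snapshot of which genes the sample's cells had switched on.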

    Many toxicologists believe that such arrays could reveal which genes a cell turns on in response to toxic compounds—and because they directly probe the activity of human cells, the arrays may eventually be better than animal tests in predicting toxicity to humans. “DNA chips will be the source of the next reduction in animals used,” predicts Spielmann.

    AstraZeneca's Central Toxicology Laboratory (CTL) is one of the first off the blocks with a chip outfitted with DNA from 600 genes associated with everything from cell adhesion and ion channels to metabolism and immune response—all thought to be involved in cellular response to toxicity. “The most exciting thing about toxicogenomics is that we're going to start investigating genes we never would have thought of looking at,” says CTL's Kimber. “That's where the big surprises—and big benefits—are going to come from.”

    Not everyone is convinced by the promise of DNA chips, however. “There's much hype about gene chip technology,” says molecular biologist Johannes Doehmer of the Technical University of Munich in Germany. “They're very expensive, and it will take a few years before you can rely on them.” And although the microarrays generate a lot of information very quickly, the results can be hard to interpret. “The vast majority of our time is [spent] figuring out the gene response,” says CTL's William Pennie.

    Although many researchers say that animals will never be replaced for conducting general investigations or checking a whole-body response to a potential toxicant, scientists are enthusiastic about the potential of chip technology and in vitro tests for asking specific questions—with data from human cells, rather than animal models of disease. “We can now go into more depth,” says toxicologist Sandra Coecke of ECVAM. “With in vivo tests, you ended up with kind of a black box.” Indeed, Coecke and others feel that these kinds of new methods—once validated—could not only replace animal tests, they could be an improvement.


    Toxicity Testing: The Many Arts of Persuasion

    1. Erik Stokstad

    Last October, Vice President Al Gore announced what sounded like a great idea, one that won him plaudits from the environmental movement: a 6-year program to rapidly collect health and safety data on 2800 major industrial chemicals. But one lobby's triumph turned out to be another's catastrophe. Animal rights groups quickly denounced the High Production Volume (HPV) Challenge because they estimated it would require the destruction of more than a million animals. The groups charged that the killing of so many animals was needless because much of the information already existed, more could be derived from nonanimal tests, and some simply wasn't worth collecting.

    Over the past year, the lobbying to halt or modify the original plans of the HPV program has been intense. Gore has been followed around the country by a heckler in a rabbit suit, animal rights groups have taken out emotive newspaper ads, and animal welfare researchers have held workshops to offer Environmental Protection Agency (EPA) representatives technical advice on alternatives to animal testing. The combination may have worked. On 14 October, EPA sent a letter to 223 companies with new recommendations for testing HPV that should reduce animal use.

    The impetus for the HPV program was a 1997 study called “Toxic Ignorance” by the Environmental Defense Fund, an advocacy group based in New York City. The report suggested that basic toxicology data were not publicly available for most of the chemicals that are manufactured or imported into the United States in amounts greater than 450,000 kilograms (1 million pounds) per year. In a quick follow-up review, EPA could complete a checklist of specific health data for only 7% of these chemicals. The agency then invited chemical manufacturers and importers to volunteer the basic toxicity data and test plans—or face regulation.

    Activists countered that many tests were unnecessary, because some of these chemicals were either clearly safe, such as those already approved by the Food and Drug Administration for consumption, or obviously toxic, such as rat poison and turpentine. And for other compounds, they argued, EPA simply hadn't looked in the right databases. Many concerned scientists also weren't pleased that a fill-in-the-box suite of information was being required for all the chemicals. The HPV program “is bad news for those of us who seek a scientifically rational approach to hazard prediction and risk assessment,” was the opinion of Michael Balls, director of the European Centre for the Validation of Alternative Methods.

    In December, EPA held a stakeholder meeting, during which the agency and advocacy groups discussed the use of nonanimal tests in the HPV program. A month later, EPA representatives participated in a conference, called TestSmart, sponsored by the Center for Alternatives to Animal Testing (CAAT) at The Johns Hopkins University in Baltimore to brainstorm suitable alternative methods. Afterward, EPA tentatively proposed several approaches, such as combining reproductive and developmental toxicity tests or changing protocols (for example, replacing the infamous LD-50 test, which determines the dose at which half the treated mice die, with a test that requires fewer animals). Altogether, the agency said, this could reduce animal usage by up to 80%. People for the Ethical Treatment of Animals (PETA) disputed that estimate and kept up the pressure.

    In its letter of 3 weeks ago, EPA officially responded to the concerns of the animal rights groups and made several recommendations to chemical companies. “We are trying to minimize the number of animals and avoid needless testing,” explains Susan Wayland, the EPA deputy assistant administrator who signed the letter. “We just needed to write it down in a way that was clear.” In accordance with international animal welfare guidelines, the letter discourages or rules out several animal tests, such as those for the reproductive effect of chemicals unlikely to be released from factories. To better ensure that no tests are redundant, the agency will now consider previous results from additional databases, including a widely used international chemical safety database called IUCLID.

    EPA also postponed testing some chemicals for several years, in the hope that validated nonanimal tests for some may be available soon. “This is for us a compromise,” says Mary Beth Sweetland, spokesperson for PETA, “but it's so much better than the slaughter that was going to take place.” To help the search for alternatives, EPA announced that the National Institute of Environmental Health Sciences will invest at least $4.5 million during the next 2 years to develop and validate nonanimal protocols. EPA will chip in $250,000 and will seek to contribute about the same next year.

    Although some activists still aren't completely satisfied with the outcome or the process—PETA claims the agency had to be dragged “kicking and screaming” to consider alternatives—the recommendations leave CAAT director Alan Goldberg feeling optimistic: “This is a major regulatory agency that has been taking a hard look at how to incorporate the best technology that is more humane.”


    The Stories Behind the Bones

    1. Carl Zimmer*
    1. Carl Zimmer is the author of At the Water's Edge.

    Denver, Colorado. There's more to paleontology than fossils, as was shown here on 20 to 23 October at the 59th meeting of the Society of Vertebrate Paleontology (SVP). Genetics labs, for example, uncovered an Ice Age disease; a changing atmosphere was fingered as the force behind the evolution of mammals in North America; and dissecting modern animals has hinted at the reason dinosaurs had such big noses.

    Ancient Tuberculosis Identified?

    The world's deadliest infectious disease, tuberculosis plagues a third of all people on Earth, killing 3 million every year. Exactly how the scourge first got a toehold in our species has been a mystery, but at the meeting researchers made a controversial announcement that they had a clue—in the form of DNA from Mycobacterium tuberculosis dating back 17,000 years.

    For decades, the traditional story of TB had it arising in Old World pastures. Cows and their bovine relatives carry strains of the mycobacterium that are closely related to the human form; people could have become new hosts for TB when they began herding cattle and handling meat and hides. That idea seemed to find support in the devastating epidemics that swept through Native American society when European colonists arrived in the New World. Native Americans, never having domesticated cattle, had apparently been spared the disease until then and thus had immune systems that couldn't cope with TB.

    But in 1994, that notion collapsed with the discovery of M. tuberculosis in a 1000-year-old mummy in Peru—predating Columbus's arrival by 500 years (Science, 25 March 1994, p. 1686). Native Americans carried TB long before Europeans came on the scene, and the massive epidemics that followed the contact could have resulted from overcrowding, malnutrition, and bad sanitation.

    So where did the New World TB come from? In the late 1980s Larry Martin, a paleontologist at the University of Kansas, Lawrence, and Bruce Rothschild, an expert on ancient diseases at the Northeastern Ohio Universities College of Medicine in Rootstown, looked for evidence of TB on bones from the New World's grazing animals. The disease can leave scars on bones in places where the immune system has walled off infected cells. Martin and Rothschild examined bones of bison, musk ox, and bighorn sheep from the Natural Trap Cave in Wyoming, dating back 15,000 to 20,000 years. They found the scars in abundance but couldn't say whether the animals suffered from TB or other lesion-forming diseases, such as brucellosis. Because the lesions looked more like those of TB than of other diseases, “we knew it was a likely diagnosis,” says Martin. “But we knew we could be wrong.”

    The researchers suspected that they might be able to settle the issue by finding genetic material from the mycobacterium itself in the fossils. Natural Trap Cave gets its name from the way it is entered—by falling into the entrance and dropping 30 meters to the cave floor. Animals have been falling to their death in the cave for 100,000 years. During the Ice Age, these unlucky creatures would have crashed into a heap of snow on the cave floor and been freeze-dried. There was a chance, therefore, that some of their DNA could have survived until now.

    Rothschild and Martin extracted some of the bone tissue from a lesion on a 17,000-year-old bison. They sent samples to labs in Israel and England, each of which used the polymerase chain reaction to amplify any fragments of genes. As Martin explained in his talk, both teams identified genes belonging to Mycobacterium. Although the timing of human arrival in the Western Hemisphere is still under intense debate, Martin says, “my suspicion is that tuberculosis was waiting for humans when they came.”

    Based on the talk, however, other researchers are skeptical. “They didn't put all their ducks in a row,” contends Ross MacPhee of the American Museum of Natural History in New York City. Many species of mycobacteria live in the soil, he points out, and they might have gotten into the cave and contaminated the bison material. Contamination has proven to be a big headache for scientists who study ancient DNA, yet Martin and Rothschild didn't present any control tests that could have ruled it out—for example, testing the bones of animals that don't get TB for the presence of the mycobacterium. “The result is really interesting, so why didn't they go that extra step and knock out the ambiguity?” asks MacPhee. According to Martin, his team will soon present data that address this issue.

    If Martin and Rothschild are right, New World TB must have come from the Old World, when some infected mammals crossed the Bering Land Bridge and then infected the early Americans who hunted them. And if people in the New World picked up the disease from hunting, rather than farming, maybe the same goes for the Old World, too. The two researchers note that their scenario resembles current theories that trace AIDS in humans to hunting chimps and monkeys. “It shows that even for the most sophisticated side of medicine, it's useful to know what happened 17,000 years ago,” says Rothschild.

    Where Have All the Browsers Gone?

    Two artists' conceptions, common in Earth history texts, tell the story. One portrays North America 20 million years ago in the early Miocene, showing a mix of grassland and trees, with many sorts of hoofed mammals, or ungulates, craning their necks to browse on leaves. In the second, showing the landscape of the last few million years, most of the browsers are gone, and many of the animals—horses, bison, and other grazing ungulates—have their heads to the ground, chewing up the grass. Why did the mammals of North America go through such a drastic shift? The explanation, many researchers think, is blowing in the wind: a drop in atmospheric carbon dioxide since the Miocene, which could have affected the vegetation that the big herbivores depended on. At the meeting, Brown University paleontologist Christine Janis offered a new scenario for how this CO2-driven change in North America's ecosystems might have taken place: by gradually starving the plants the browsers ate.

    Many researchers think the shift was abrupt. Isotopes found in ancient soils and in the fossil teeth of horses show that between 8 million and 6 million years ago, plants that use a photosynthetic system known as C3 declined, while those that use a system called C4—tougher species such as crabgrass—suddenly became dominant in North America and elsewhere. C4 plants do better than C3 plants in low levels of CO2, suggesting that the declining levels of atmospheric CO2 triggered their spread. The grazing ungulates, equipped with high-crowned teeth, could grind down the hardy C4 plants. But many of the low-crowned browsers, adherents say, couldn't handle life on the new grasslands and went extinct (Science, 28 August 1998, p. 1274).

    Janis and her co-workers, John Damuth of the University of California, Santa Barbara, and Jessica Theodor at Brown, decided to test that idea by taking a close look at the fossil record. “It has to be better examined, instead of just [being] asserted,” says Janis. They tallied instances when hoofed mammals evolved into new species or went extinct over the past 20 million years, noting whether the animals were browsers, grazers, or those with a mixed diet. Nowhere in their database could they find a pulse of extinctions that coincided with the shift from C3 to C4 plants.

    The pattern they found instead was distinctly different. Before 18 million years ago, browsers made up the majority of ungulate species at just about every site where Janis and her colleagues looked—in some places reaching levels as high as 80%. The browsers then started a slow, steady decline that carried on for millions of years, through the C3-C4 transition and into more recent times.

    Janis agrees that the ultimate spur for this mammal overhaul must have been the fall in CO2 concentrations, but says the changes started many millions of years before the C3-C4 shift. Atmospheric levels of carbon dioxide have been declining for the past 50 million years. This long-term atmospheric change exerted its effects gradually, she proposes, as the leafy C3 plants the browsers depended on produced less food, and many of the browsers gradually went extinct. The C3-C4 transition certainly didn't help matters, but it came after CO2 had already been declining for at least 10 million years. “The transition was at the end of a long progression,” says Janis.

    The work “is a wonderful demonstration that the decline of browsers and the evolution of grazing morphology is probably not directly related to the spread of certain types of grass,” says Tony Barnosky, a paleobiologist at the University of California, Berkeley. But Barnosky thinks forces other than the CO2 decline could have played a role. The early Miocene saw a period of volcanism in the western United States, and the dusting of ash would have made plants hard to chew and digest. The rough stuff would have been a challenge for the low-crowned browsers, eventually starving them, but the high-crowned grazers could have coped. Whatever the mechanism, Barnosky says, “the decline of browsers seems to have an environmental trigger, but the gun was a much bigger—and earlier—one than the C3-C4 shift.”

    Dinosaur Air Conditioning

    Dinosaurs evolved some pretty bizarre anatomy, from the ridge of plates along a Stegosaurus's back to the fantastically long tail and neck of Apatosaurus. But for Larry Witmer, a paleontologist at the Ohio University College of Osteopathic Medicine in Athens, dinosaurs' coolest feature was hidden from view—specifically, up their noses. In most vertebrates, the nasal cavity is the smallest part of the skull, he explains. Yet in several groups of dinosaurs, the nasal cavity expanded to gigantic proportions. Indeed, a small child could climb inside the nasal cavity of the long-faced, beaked Triceratops. “There must be something pretty important going on to devote half the skull to that,” Witmer says. The answer, it appears, was to keep dinosaurs, and especially their puny brains, from frying.

    To figure out what dinosaurs used their noses for, Witmer has spent the past few years engaged in something he calls the DinoNose Project. Muscles, mucous membranes, and blood vessels can all leave marks on a fossil, in the form of grooves, shelves, canals, and other structural features. To interpret the structures of dinosaur skulls, Witmer and his co-workers have studied the anatomy of living animals that have big, unusual noses, looking at the way their soft tissues shape their bones. And because dinosaurs are closely related to today's alligators and birds, Witmer's group studied the heads of these animals as well, identifying the structures they could have inherited from a common ancestor. With this knowledge, Witmer is beginning to make some inferences about what dinosaur noses probably looked like and what they did—work that is winning admiration. “This is some of the most exciting and innovative research being done in vertebrate paleontology,” says Greg Erickson, a paleontologist at Brown University.

    The noses must have been up to something important, Witmer says, because they were suffused with a huge amount of blood. In his talk at the SVP meeting, for example, he reported that his group identified three separate blood supplies to the noses of ceratopsians, each of which formed big networks of capillaries. Based on the pattern of impressions in fossilized bone, he suggests that the blood vessels were embedded in mucous membranes lining the walls of the nasal cavity.

    What's more, the mucous membranes themselves were apparently extensive. In another talk, a DinoNose collaborator, Scott Sampson of the University of Utah, Salt Lake City, pointed out a number of ridges in the ceratopsian schnozz that probably supported curtains of cartilage; these in turn may have served as scaffolding for layers upon layers of mucous membranes. Yet the most obvious function of noses—smelling—probably wasn't responsible for their size. Smelling takes place at the rear of the nasal cavity, while all the extra space and blood supply are found at the front end of dinosaur noses.

    Witmer thinks dinosaur noses helped keep their brains cool. He notes that all the big-nosed dinosaurs had big bodies as well, and for them, heat must have been a problem, because in big animals the ratio of surface area to body mass is much lower than that for smaller animals. As a result, even if dinosaurs didn't have a fast-burning metabolism like that of mammals, the bigger ones must have been unable to shed heat fast enough from the skin to keep their body temperatures from rising to dangerous levels. The brain in particular could have been damaged by such high temperatures, as everyone knows from the occasional tragic stories of teenagers dying from heat-related “brain attacks” after playing sports in summertime.

    Witmer proposes that dinosaurs relied on their noses, with their vast networks of blood vessels, to get rid of excess heat. The vessels were probably in contact with the air in the nasal passages and could have wicked heat from the brain. This would be analogous to what happens in mammals, such as the gazelle, that live in hot climates. These animals have veins just under the skin on their head, which cool the blood as they release heat to the air. Rather than traveling straight back to the heart, this cooled blood takes a detour, flowing through a mesh of veins surrounding the brain. These veins run alongside the arteries bringing warm blood from the body's core. The cool veins absorb the heat from the arteries and carry it away from the brain. “Big animals get a big benefit from heat exchange,” Witmer says. “It would allow the core temperature to rise while keeping the brain cool.”

    All this does not rule out other roles for the big noses of dinosaurs. For example, they may have helped attract mates, although Witmer's group has yet to study that possible function. Says Sampson, “We have yet to come up with the final word.”


    NIH Eyes Sweeping Reform of Peer Review

    1. Bruce Agnew*
    1. Bruce Agnew is a writer in Bethesda, Maryland.

    Authors of a reform proposal say their goal is not to make radical changes but to create a system that can be “continually evaluated by outside experts”

    Like Lewis Carroll's White Queen, who could believe “as many as six impossible things before breakfast,” scientists who analyze the National Institutes of Health's (NIH's) peer-review system often find themselves torn between conclusions that are, at the very least, contradictory: The cornerstone of NIH's success has been its peer-review system, in which small committees of nongovernment scientists, known as “study sections,” judge the scientific merit of about 40,000 grant applications a year; or, NIH peer review too often amounts to error-prone, turf-conscious nitpicking by obsolete study sections that reject novel ideas out of fear, ignorance, and self-interest.

    NIH officials and many researchers today seem to believe both. As a result, NIH is now in the midst of a major drive to refurbish the system—updating it to fit today's biomedical science, setting standards of behavior to improve peer reviewers' manners and methods, and creating a mechanism to ensure that peer review will adapt as science evolves in the future.

    In the most dramatic reform proposal so far, a blue-ribbon panel headed by National Academy of Sciences president Bruce Alberts wants to completely restructure the array of study sections operated by NIH's Center for Scientific Review (CSR), which pass judgment on about three-quarters of NIH grant applications (Science, 30 July, p. 666).

    But the changes that will finally emerge, after they are refined and tested over the next 2 or 3 years, may be considerably less sweeping than the Alberts panel blueprint. “I don't think it's going to be as radically different as some people have said,” says NIH director Harold Varmus. “Peer review basically works pretty well now. We don't want to make abrupt changes that could be threats to the system.”

    The Alberts panel's proposals, if not radical, certainly look pretty startling. Currently, more than 100 CSR study sections are clustered into 19 “Integrated Review Groups” (IRGs), focused mostly around scientific disciplines such as “Biochemical Sciences” and “Cell Development and Function.” Instead, in what it calls the “first draft” of its report, the Alberts panel proposes reconstructing the system around 21 reorganized IRGs—16 centered on disease or organ systems and five focused on basic research areas whose application to specific disease areas cannot be predicted.

    Basic research that “more directly underlies clinical or applied studies” on specific diseases or organ systems should be peer reviewed “within the broader biological and medical context to which it will ultimately be applied,” the panel said. “Thus, we have attempted to place the review of as much fundamental research as possible within the IRG that is most relevant.”

    The panel, formally known as the Panel on Scientific Boundaries for Review, did not propose in detail the makeup of the study sections that would populate its revised IRGs. That, it said, is the task for the next phase of the reform effort. But it said there should be enough overlapping expertise so that any grant application could reasonably be reviewed by more than one study section.

    Alberts's group offered no suggestions about the study sections run by individual NIH institutes, which generally review applications under specific institute programs. These account for about 25% of NIH grant applications but were outside the range of the panel's study.

    The community responds

    In an outpouring of more than 700 e-mailed responses to NIH by mid-October, most scientists applauded the Alberts panel's general goal. Many also seized the occasion to vent their own frustrations with the system. But a substantial minority of the comments were skeptical, and many researchers said the panel had left out major scientific areas.

    AIDS researchers—who have picked up a lot of political savvy from their activist patients—mounted an organized campaign to retain an AIDS IRG rather than having AIDS research spread among several different IRGs, as the Alberts panel suggests. They enlisted support from such quarters as the Presidential Advisory Council on HIV/AIDS. AIDS was by no means the only research area that scientists complained would be slighted by being folded into a broader IRG. Others included: kidney and urologic research, toxicology, pharmacology, organic chemistry, developmental biology, aging, nutrition, epidemiology, environmental health sciences, and well over a dozen more.

    “Please don't destroy the current system without considering the problems that the proposed changes will create,” wrote Ronald Breslow, chemistry professor at Columbia University and past president of the American Chemical Society. Weaknesses in the current system can be fixed by less traumatic, targeted repairs, many other scientists said. “The bus is running just fine,” wrote biochemist Daniel Kosman of the State University of New York, Buffalo. “If it is missing a few stops, just change the route; don't buy a new model that may not run at all.” (Science obtained the responses—some signed, most unsigned—through a Freedom of Information Act request.)

    But Alberts insists, “We didn't change everything by any means.” He says “one of the big misunderstandings” is a belief that his panel began to rearrange study sections, but “that's going to be done by a whole bunch of subpanels of experts in each area.” Alberts's panel will meet next week to review the responses and adjust its proposed framework “to make it better,” he says.

    The Alberts panel's proposal is only the latest—albeit the most sweeping—of a series of peer-review changes that have been set in place or proposed over the past few years. CSR already has gathered neuroscience and behavioral research into four new IRGs, made up of 37 reconfigured study sections, to complete the merger of the National Institute of Mental Health, the National Institute on Drug Abuse, and the National Institute on Alcohol Abuse and Alcoholism into NIH. It created another new IRG, with eight study sections, to centralize review of AIDS research applications and added a special study section for vaccine research. CSR also has fashioned new study sections to handle applications from clinical researchers who feel they don't get a fair shake in panels dominated by laboratory researchers and to provide homes for research proposals that don't seem to fit anywhere else, such as bioengineering collaborations.

    The realignment of neuroscience and behavioral study sections—which was required by the 1992 law that merged most of the former Alcohol, Drug Abuse and Mental Health Administration into NIH—started in 1997 and pioneered the technique that will be used if some of the Alberts panel proposals are finally adopted: Advisory groups including extramural researchers worked out tentative organization plans, and then CSR officials performed “test sorts”—assigning batches of actual grant applications among the proposed study sections—to see how the system would work in real life.

    Judging merit

    Study sections' marching orders have changed, too. In 1997, Varmus ordered peer reviewers to consider “innovation” as one of their explicit criteria in weighing grant applications. He was trying to break study sections' habit of favoring “safe science”—incremental projects using tried-and-true methodology—over more imaginative but riskier proposals that might pay bigger dividends.

    CSR director Ellie Ehrenfeld and CSR Advisory Committee chair Keith Yamamoto, of the University of California, San Francisco, say that progress has been made but the job isn't quite done yet. “We're trying to make a shift in reviewers' mind-sets,” Ehrenfeld says. “We're trying to change people's behavior. None of these things will be solved by a single magic bullet.” The problem is an old one. Newly named Nobel Prize-winner Günter Blobel of The Rockefeller University in New York City recalls (with a laugh) that in 1986, an NIH study section trashed a proposal of his as impractical, “and I found the critiques not constructive but offensive.” But Blobel emphasizes that the NIH peer-review system “is a very good one,” and he says most of its decisions are right.

    NIH also has simplified grant applications—and reduced opportunities for reviewers' second-guessing—by ending the requirement for detailed budget plans in most “investigator-initiated” grant applications. Under the “modular grant” and “just-in-time” approaches, researchers in most cases simply ask for funding in increments of $25,000; detailed budget justifications and many other paperwork requirements don't come until after a grant is approved. Additional changes are in the works—although some have been a long time coming.

    “No matter how we organize study sections, what really matters is the people sitting around the table,” says Ehrenfeld. Thus CSR is trying to broaden study-section recruiting and has experimented in an informal way with several devices to make peer-review service less onerous. These include tours of duty that involve fewer than the conventional three meetings a year for 4 years and shared assignments that allow scientists to substitute for one another at some meetings. But none of these changes has been implemented in a systematic way.

    CSR officials, and Varmus, also are still puzzling over how to lure more senior scientists back onto study sections. This could bring more consistency and credibility to the process, they say, but senior scientists are generally unenthusiastic about the idea. “They've done it before,” says Varmus, “and they're on to other kinds of advisory activities, some of which are probably more fun and less work.”

    Varmus himself, of course, will be eligible for study section service next year, after he leaves NIH to become president of the Memorial Sloan-Kettering Cancer Center in New York City. Will he volunteer?

    “Volunteer?” he replies. “No. But if they call me, I'll think it over.”

    Since 1996, Yamamoto and others have been pushing another idea that is just now taking effect: oversight by “IRG Working Groups.” These will be teams of eight to 10 extramural researchers who will attend at least one round of peer-review meetings, monitor the activities of their IRG and its component study sections, and offer advice on whether the scientific boundaries between study sections are still appropriate—as well, no doubt, as on the conduct of reviews. In effect, they will peer review the peer reviewers. If they can exercise enough diplomatic skill to avoid friction with study section members and chairs, they may provide a mechanism for adapting the peer-review system as science evolves. Alberts is counting on the IRG Working Groups to keep the system up to date. He sees this as a “great once-in-a-lifetime opportunity to create a system that won't be just locked in place, but can continually be evaluated by outside experts—and in which modern science, which is changing so rapidly, can really be adequately supported and tracked.” The first three IRG Working Groups are already on the job. Five more are in the planning stage.

    For individual researchers, however, the biggest boon may come from more efficient communication through the Internet. NIH officials say they are only a year or two away from establishing a long-sought system of electronic submission and review of grant applications that could nearly halve the 10-month lag from submission to award. Doing away with time lost to printing, collating, distributing, and mailing grant applications also might enable researchers to submit revised proposals without missing a grant-award cycle.

    Whatever the outcome of the Alberts panel recommendations, peer review is changing. And perhaps it should be no surprise that the process is taking longer than anyone would like. “This really is like turning a big ship,” Yamamoto says. “Ellie is trying to do a lot of things at the same time, with a staff that's already overburdened.”

    Will Varmus's departure in January slow the momentum? Yamamoto hopes the loss will be limited. “He's put the ship in the right direction,” Yamamoto says. “Inertia can be a friend here.”


    What's Wrong With NIH Peer Review?

    Among the more than 700 responses to the proposed reorganization of NIH's peer-review system can be found virtually every complaint researchers have ever made about study-section reviews. Here's a sampling:

    “I have been on study sections and have seen members who clearly lacked expertise review proposals and grade proposals in a biased, or self-serving, or bad scientific manner.” –Louis Gerstenfeld, Boston University Medical Center

    “I have seen the results of ideas being stolen [by peer reviewers]. Who will be believed, the experienced peer or the new investigator?” –unsigned

    “Under the present ‘culture,’ which focuses on fault finding and amplification of minor errors and discouraging innovative research, nearly all NIH funding has gone into confirming, reconfirming, and reinventing what is already known, by individuals of very little insight or talent.” –unsigned

    “Every one of us has received reviews that clearly misstated facts, indicated that the reviewer failed to read the proposal thoroughly, or were filled with unsupported assertions of opinion. Such poorly performed reviews, which are, I believe, all too common, undermine confidence in the system.” –unsigned

    “When one rebuts a review today, the rebuttal is referred to the SRA [scientific review administrator] for the study section that produced the potentially unfair review. This SRA then decides whether the rebuttal is correct. Not surprisingly, she typically decides that it is ‘a mere scientific disagreement.’… Unscientific grant review rhetoric never receives objective scrutiny.” –Michael Swift, New York Medical College

    “I do not think any major change has taken place [in study sections' overreliance on preliminary data]. Preliminary data is still a major barrier. Risk-taking in general is much frowned upon. I remember participating in a study section which reviewed high-risk grants. If I remember correctly, most, if not all, were disapproved as being too unlikely to succeed.” –David Greenberg, Mt. Sinai Medical Center, New York

    “We do everyone an injustice by allowing half the participants in the process (the reviewers) to hide behind the veil of anonymity. Grants should be reviewed openly and there should be an opportunity to respond to the reviews in ‘real time.’” –Donald Dwyer, LSU Medical Center

    “We have had grants reviewed by a given committee and then upon resubmission, the critique was the exact opposite of the previous panel. … This is totally unfair and leads to incredible frustration.” –unsigned

    “Often I cannot recognize even one so-called expert in my area in the study section. The reviewers are pedantic and pay attention to one or two experiments which the reviewer does not understand and shoot down a 4-year project. The reviewers often do not understand the underlying principles or broad objectives of a proposal and resort to nit-picking. Basically all new ideas are rejected.” –unsigned

    “The AIDS and Related Research [3] Study Section was composed of individuals with widely different areas of expertise. … For the most part, we couldn't understand the reviews written by other members of the panel and were able to function only because we were forced to trust each other. Trust is a wonderful thing in friendship but not necessarily in peer review.” –Kathlyn Parker, Brown University

    “We all know how the system works. Do the work, describe part of the results as preliminary in the grant [application], then when you get the priority score, write the papers and start on what you really wanted to do in the first place.” –unsigned


    The Misconduct Case That Won't Go Away

    1. Eliot Marshall

    The University of Arizona fired Marguerite Kay last year, but supporters nationwide are rallying to her cause and a legal decision is pending

    A contentious scientific misconduct case that has divided faculty at the University of Arizona may be heading toward a new climax. This month, an Arizona state court is considering a request by the accused—a prominent researcher on aging, Marguerite Kay—to be reinstated as Regents Professor at the University of Arizona (UA), Tucson. University president Peter Likins dismissed Kay abruptly on 15 July 1998 after a series of faculty-led investigations concluded that Kay had manipulated data and seriously mismanaged her lab. Kay has appealed the dismissal to the state court, which issued a decision partly in her favor on a different legal basis in April. The current appeal could be decided in a few weeks.

    Kay, cited for her research on the aging of blood cells and the role of the immune system in Alzheimer's disease, has enjoyed the continuous support of a vocal contingent of the faculty. Her foremost advocate is her former department chair, John Marchalonis, head of microbiology and immunology. He insists that the scientific misconduct charges against Kay were played up by administrators who resented Kay's challenges to their decisions on lab resources and service fees.

    Former UA vice president for research Michael Cusanovich, who coordinated the initial Kay investigation, says these allegations are unfounded. The inquiry, he says, began when one of Kay's former technicians filed a written complaint with the university, and the investigation was conducted by independent panels selected by the faculty, in accordance with UA rules. Marchalonis and Carol Bernstein—a member of the same department and local president of the American Association of University Professors—have circulated many letters supporting Kay from outside the university, including from the national AAUP and well-known researchers. Among those who have questioned the UA proceedings are former National Institute of Mental Health director Frederick Goodwin, now a researcher at George Washington University in Washington, D.C.; former National Institute on Aging official Zaven Khachaturian, now director of the Ronald and Nancy Reagan Research Institute of the Alzheimer's Association; molecular biologist David Soll of the University of Iowa, Iowa City; Stanley Azen of the Doheny Eye Institute at the University of Southern California, Los Angeles; and neurology researcher Paul Coleman of the University of Rochester in Rochester, New York.

    Likins isn't commenting on the case because, an assistant says, “it would be inappropriate” to do so while it's in litigation. But he did make a dramatic and detailed presentation of his reasons for firing Kay at a faculty senate meeting last December. According to a videotape of that meeting, made available to Science by Bernstein, he told the faculty he had made this “agonized” decision after a careful review of the evidence collected by an investigative panel, which found Kay guilty of four counts of misconduct. The university began looking into the case in 1997 after one of Kay's technicians filed written allegations of misconduct, claiming that Kay had manipulated experimental results. An ethics panel then found cause for investigation; an investigatory panel collected evidence and brought an indictment; and a third panel reviewed the work of the earlier panels and held a public hearing at which Kay testified. It concluded that Kay had “falsified, manipulated, and otherwise misrepresented data and findings in publications,” and that she had egregiously “mismanaged her UA laboratory and employees.” It recommended that her employment be “terminated.”

    Likins told the senate meeting that he found some of the evidence equivocal, but was particularly swayed by one set of data from a table that Kay included in a review article published in the journal Gerontology in 1997. Likins presented a detailed analysis showing beyond a doubt, he said, that the author had selected raw data to make results appear significant when they were not. This misconduct gave credence, he argued, to other charges of data manipulation brought against Kay by lab technicians. Likins also revealed that, out of “compassion,” he had offered to retain Kay on the faculty if she would acknowledge her misconduct. But she refused, and he fired her immediately.

    Kay says that as soon as she learned from her staff that data in the Gerontology paper were erroneous, she wrote to the journal to have the table withdrawn. Her correction letter, published in Gerontology in June 1998, blames a technician for the mistakes. Today she says that she doesn't know how the errors crept in. She insists that she relied entirely on her staff for statistical computations. In any case, she says, the table was “irrelevant” to the points she made in the text of the review. Finally, Kay says that the collection of damning data presented to the faculty by Likins was a “cut and paste” assemblage—not raw data—which she had never seen in the form Likins produced. She believes that her integrity was challenged by disgruntled staff members and that a hostile administration used the criticism to dismiss her.

    On 15 August 1998, Kay filed suit against UA and its officials in state superior court for Pima County, arguing that the university had violated her due process rights by dismissing her without adequate notice or opportunity for review. Kay was shut out of her research lab before the public hearing that found her guilty of misconduct, for example. And she claims she was fired without severance pay or a chance for an appeal.

    Judge Stephen Villarreal's finding in April 1999 said that the university had acted in an “arbitrary and capricious” manner in firing Kay without a regular personnel hearing. The judge did not review the misconduct allegations but found that the 5 days of public hearings on these charges from 30 March to 4 April 1998 were not equivalent to a hearing on dismissal, which university rules require. Villarreal found that UA should give Kay such a hearing, but Kay and the university still have not agreed on how to proceed.

    Kay has asked Villarreal to review her case again, for another reason. She has claimed “whistleblower” status under state law because she had complained in the past about administrative actions taken by Cusanovich. In an independent case, the Arizona Supreme Court found on 4 October that a whistleblower employed by the state may be represented by an attorney during a dismissal proceeding. But UA did not allow Kay to be represented by an attorney during her misconduct trial. (Her attorney was at her side, however.) On 5 October, Kay filed a motion asking Villarreal to nullify her dismissal because due process was violated. The judge's decision is expected soon.

    Kay's supporters at the university have become more vocal this fall, protesting her firing as a violation of academic rights and a threat to tenure. A faculty leadership group known as the Committee of Eleven voted unanimously in August to ask Likins to reinstate Kay to her job. This panel includes Marchalonis and the chair of the faculty senate, English professor Jerrold Hogle, who previously supported Likins's decision. Hogle could not be reached for comment.

    In addition, a constitutional law specialist on the legal faculty, Roy Spece Jr., conducted an investigation on his own initiative and concluded, as he told the faculty senate in September, that the process was heavily biased against Kay from the outset because the UA general counsel privately interviewed a technician in Kay's lab, a key witness against her. Spece told the senate the proceedings were deeply flawed on legal grounds and at odds with UA rules, which require that faculty members be given full notice before being dismissed.

    It's unclear whether any of these protests—or the state court's decisions—will cause the university to ease its punishment of one of its most distinguished biomedical researchers. But one thing is certain: Likins is discovering—as others have before—that disputes over scientific conduct rarely die. They just get more expensive.