News this Week

Science  30 Apr 1999:
Vol. 284, Issue 5415, pp. 718


    Varmus Circulates Proposal for NIH-Backed Online Venture

    Eliot Marshall

    Low-cost biomedical publishing on the Internet could explode soon, if a plan drafted by the National Institutes of Health (NIH) takes off. Last week, NIH director Harold Varmus and colleagues circulated a proposal that could greatly expand the use of the Internet to distribute original biomedical papers. Although the community has made only “sparing use” of electronic media so far, Varmus and his colleagues write, they anticipate that NIH will launch an online publication service “in the near future.” But the community may not be ready for a radical change: The first reaction of a prestigious editorial group at the National Academy of Sciences—briefed on these ideas on 25 April—was less than enthusiastic.

    The proposal, dated 22 April and distributed by e-mail, is the first detailed presentation of ideas outlined by Varmus at a congressional hearing last month (Science, 12 March, p. 1610). The draft, written by Varmus “with active assistance” from David Lipman, director of the National Center for Biotechnology Information, and Pat Brown, a geneticist at Stanford University in Palo Alto, asks for “constructive comments from the scientific community.” Later, the authors will revise the proposal and publish it in a print journal.

    The authors call the proposed venture E-biomed. An independent board of governors would make and enforce its rules. Its members—representing “readers and authors, editors, computer specialists, and funding agencies”—would set policies, select reviewers, and ensure fair access to the site. The authors do not say much about the board's composition or authority, but they assume that the members would be “assembled” by NIH. They also offer a plum to prospective authors: E-biomed, unlike existing journals, would allow them to retain copyright claims.

    According to this scheme, scientists could approach E-biomed on several tracks. Those choosing the high-prestige route would submit papers to a network of peer reviewers—possibly the same reviewers now used by scientific societies and journals. This route would be “closely aligned with current practice,” Varmus writes—selective and ponderous. If rejected, an author might submit the paper to another group or seek publication through a less prestigious reviewed area of the Web site. But authors would also have a simple alternative: a route to publication requiring virtually no review and no editing.

    This track would require only that an author obtain prior “validation” of an article from two members of a large panel of scientists. This screening panel of “several thousands,” according to the Varmus memo, would be vetted by the governing board. The validation process, the article says, should exclude “extraneous or outrageous material” but remain flexible enough “to permit rapid posting of virtually any legitimate work.” Although scientists might hesitate to use this shortcut at first, Varmus observes, they would probably warm to it. It would offer “simplicity, flexibility, and speed,” he says, as well as access to a broad audience.

    Varmus and his colleagues say that E-biomed could “maximize the dissemination” of new data, delivering information to more readers more rapidly than print journals. They praise the convenience of electronic search engines, which enable readers to mine old papers while keeping up with new ones. In addition, they say E-biomed would handle more complex data displays. They're enthusiastic about the low cost of electronic delivery, the ease of researching hyperlinked footnotes, and the potential for quick feedback from readers.

    Despite such promises, however, E-biomed is already taking some criticism. For example, Martin Frank, the outspoken executive director of the American Physiological Society, sees it as superfluous: “Most nonprofit publishers are already working to implement Varmus's vision of a Web-based journal with online submission and review.” Frank asks: “Does the federal government really need to insert itself into the scientific publishing arena?” He doesn't think so. David Botstein, chair of genetics at Stanford, gives a mixed review: He likes the concept, but not all the ambitious details of the E-biomed proposal. “The direction is correctly futuristic,” Botstein says, “but if it were up to me, I would start with more modest measures.”

    Nicholas Cozzarelli, editor of the Proceedings of the National Academy of Sciences, also reported a mixed review after Brown briefed his editorial board last week. PNAS's leaders were in agreement, Cozzarelli says, that NIH should go ahead with the second part of the current proposal: an experimental preprint server to share unpublished data. This will be a “huge undertaking,” Cozzarelli says, and “very good for science.” But beyond that step, the PNAS group felt that the proposal became complex and that NIH should proceed with caution—or perhaps not at all.


    President Revokes Plan to Destroy Smallpox

    Eliot Marshall

    Since the mid-1990s, the U.S. government has supported an international plan to eradicate every last trace of variola virus, the cause of smallpox. Vaccination all but eliminated this ancient and deadly disease in the 1970s, and no new cases have been reported since 1978. Health officials hoped that it would be the first human pathogen purged from planet Earth: More than 70 nations had tentatively supported a plan to destroy all known stocks of the virus in June 1999. But the U.S. government changed its mind last week. Joining Russia, which has argued that live samples of the virus should be kept for research, President Clinton signed a memo calling for preservation of variola in high-security labs.

    According to a White House security official (no announcement was published), the Administration decided after an internal review that live virus should be preserved for use in developing new antiviral drugs and testing improved smallpox vaccines. The aim would be to guard against clandestine development of smallpox weapons by terrorists or hostile states. “We live in a time when bioterrorism is a real concern,” says a senior Administration official who spoke on background. And the current smallpox vaccine stockpile, he says, is “grossly inadequate,” because it relies on a live virus vaccine that cannot be given to immunocompromised persons.

    The policy change brings an end to a long-running debate in the U.S. government between advocates and opponents of total eradication of the virus. It represents a victory for defense agencies, which had argued that it would be rash to throw away this potentially valuable research tool, and a defeat for some health leaders who felt the world would be safer if all known variola stocks were destroyed. Only Russia and the United States are currently known to possess cultures of variola, although individual experts have been saying for some time that they suspect that not all variola stocks have been accounted for.

    The U.S. debate reflects a split within the World Health Organization (WHO) in Geneva. Advocates of total eradication such as public health researcher D. A. Henderson of The Johns Hopkins University in Baltimore have argued in WHO meetings since the early 1990s that smallpox is such a horrific disease that its source should be obliterated—totally and permanently. Henderson has suggested that the live virus is so infective and lethal—with a mortality rate of around 30%—that it shouldn't even be kept in secure labs. Henderson also has argued that there is little value in preserving the virus. He points out, for example, that it cannot be studied in animals, as it doesn't infect them. And it is so dangerous that few scientists would want to handle it, even in the safest environment.

    Such arguments persuaded WHO to do away with variola. WHO members agreed first to send all research stocks of the virus to two repositories, one in Russia and the other in the United States. Then an executive committee voted that these stocks would be destroyed in June 1999, if the WHO general assembly gave the final go-ahead in May. Although most members may still support the plan, the two that control the variola stocks do not.

    Resistance to the WHO plan has developed slowly. Russia opposed it from the outset. But the British and U.S. defense establishments disagreed more quietly. Recently, one U.S. official—Alan Zelicoff, a biodefense expert at the Sandia National Laboratory in Albuquerque, New Mexico—has gone public with strong objections to the WHO plan. Zelicoff, who debated Henderson on the smallpox decision last month on National Public Radio, contends that the policy of total eradication had White House support for several years because one National Security Council staffer advocated it. But recently, he says, other national security experts intervened and prompted a policy review.

    At the same time, according to Zelicoff, Joshua Lederberg, president emeritus of The Rockefeller University in New York City, who is concerned about bioterror risks, was “influential” in getting federal agencies to fund an external review by the Institute of Medicine (IOM). The IOM report, issued in March, didn't take sides in the debate, but concluded that scientists might use live variola productively to develop new antiviral drugs and vaccines (Science, 19 March, p. 1825). The IOM report was crucial, a Clinton Administration official says, to the change in U.S. policy.

    The plan now goes to the WHO general assembly for a vote. But because the two countries that hold the stocks now oppose destruction, the issue may be moot.


    Signs of Plate Tectonics on an Infant Mars

    Richard A. Kerr

    Almost 40 years ago, geophysicists made history by realizing that Earth's surface is shaped by plate tectonics—that new crust is born in midocean ridges and plates move around the globe. Pivotal to the discovery were rank upon rank of magnetic stripes that march across the sea floor, each marking the orientation of Earth's flip-flopping magnetic field at the moment the crust was born. Now scientific history may be repeating itself on a close planetary neighbor: On pages 790 and 794 of this issue, researchers report magnetic stripes on Mars.

    The data, gathered by the Mars Global Surveyor (MGS) spacecraft, suggest that in its early days, Earth's diminutive cousin resurfaced itself the way Earth does today, spreading freshly made crust away from long, narrow volcanic rifts. The martian magnetic stripes are “absolutely fascinating,” says Frederick Vine, a professor emeritus at the University of East Anglia in Norwich, United Kingdom, whose work on magnetic stripes was instrumental to the plate tectonics revolution of the 1960s. That the martian examples are also the work of plate tectonics is “an eloquent hypothesis,” he says, and indeed no one has a good idea what else could form such stripes. Yet the shape and pattern of the martian stripes are so different from Earth's that geophysicists are reserving judgment for the moment. The stripes point to some sort of interesting geodynamics on ancient Mars, says planetary geophysicist Maria Zuber of the Massachusetts Institute of Technology, but she's not sure what it was.

    The finding has its origin in a catastrophe—the loss of the Mars Observer spacecraft as it approached the planet in 1993—say Mario Acuña and John Connerney of NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland, members of the MGS magnetometer team. For the Mars Observer's successors, NASA switched into a “faster, better, cheaper” mode. As a result, MGS did not carry enough rocket fuel to send it directly to its intended high, circular orbit when it arrived at Mars. Instead, it first entered a more fuel-efficient elliptical orbit that periodically dipped it into the upper martian atmosphere, where it experienced atmospheric drag that bit by bit nudged the spacecraft closer to the desired circular orbit. These aerobraking passes carried the spacecraft as low as 100 kilometers above the surface, low enough to detect magnetization of the rocks below.

    MGS first detected patches of magnetic field embedded in the crust apparently at random, like so many bar magnets strewn on the surface (Science, 10 October 1997, p. 215). They had apparently formed when blobs of magma near the surface solidified and cooled earlier in martian history, locking in bits of the magnetic field that existed at the time. That meant that although the interior of Mars has cooled and produces no magnetic field today, it once had enough heat to churn the planet's molten iron core into a magnetic dynamo.

    A month later, another problem led to even better observations. An apparent weakening of a solar panel arm meant that the spacecraft's orbit had to be adjusted more slowly, so that MGS made about 1000 aerobraking passes rather than 100. With the additional coverage, some of the magnetic patches began to coalesce into a pattern. Across a huge swath of the southern hemisphere, wrapping a quarter of the way around the planet, irregular stripes about 100 kilometers wide and up to 2000 kilometers long appeared. The half-dozen or more stripes are roughly parallel and appear to alternate in polarity, one having its “north pole” pointing vertically up and the next with its south pole up. The stripes peter out near the boundary with the northern lowlands.

    The magnetometer team members, who are planetary scientists, tend to be more familiar with the magnetic field of Jupiter than with Earth's ocean crust, but even they saw the resemblance to terrestrial stripes. Back in the 1960s, researchers realized that when magma rises into the crest of a midocean ridge, cools, and solidifies, it records the current magnetic field. Magnetized crust continuously spreads away from the ridge in both directions like tapes in a tape recorder; when Earth's magnetic field reverses, a new pair of stripes appears, one on each side of the ridge.

    On Mars, the crustal tape machine turned on early but wound down quickly, according to Connerney and Acuña. The heavily cratered highlands that recorded the stripes date back to Mars's first half-billion years, when its interior might have been hot enough to support both an internal magnetic dynamo and the surface motions of plate tectonics. Meteorites that crashed into the stripes around 4 billion years ago left unmagnetized holes in the pattern, suggesting that the magnetic dynamo had shut down by then, says Acuña. But crustal spreading may have continued for a time; in the north, some process later produced the unmagnetized, thinner, and therefore lower crust of the lowlands.

    The stripes are “convincing evidence that there was a [magnetic] dynamo early on Mars and it reversed,” says paleomagnetist Ronald Merrill of the University of Washington, Seattle. And as an explanation for the overall pattern, crustal “spreading seems like the best guess at this time,” he says, especially because there's no persuasive alternative.

    But the pattern doesn't exactly match Earth's, so most researchers aren't yet ready to embrace martian plate tectonics. “You wouldn't say this is a dead ringer for the sea floor,” notes paleomagnetist Robert Coe of the University of California, Santa Cruz. The martian stripes are 10 times wider than Earth's, implying faster spreading, slower magnetic field reversals, or both; they are also far less regular in shape and spacing; and the symmetrical pattern of stripes on either side of a spreading center, the clincher in the plate tectonics debate, is not yet apparent. Indeed, there's no sign of a “spreading center” at all. Says Merrill: “If plate tectonics was operating on Mars, it worked differently or it was recorded differently by the rocks.”

    Another sign of that difference is the “staggering” intensity of the recorded magnetism, says Merrill. To achieve it, Connerney calculates that each martian plate would have to be 30 kilometers thick, assuming that they were uniformly magnetized as intensely as the uppermost few hundred meters of Earth's ocean crust. “It's hard for me to understand how that could happen on Mars,” says geophysicist Sean Solomon of the Carnegie Institution of Washington. On Earth, the upper ocean crust is intensely magnetized because seawater cooled it rapidly, forming tiny, easily magnetized crystals. But cooling a 30-kilometer slab would take tens of millions of years, leading to larger crystals that would be less easily magnetized. This prompts researchers to wonder if perhaps the stripes formed not through creation of new crust but through some other means, such as slow chemical alteration.
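The thickness argument is a simple moment balance: the field strength seen from orbit fixes the crust's magnetic moment per unit area, and dividing that by an assumed magnetization intensity gives the required layer thickness. A minimal sketch, with both input numbers chosen purely for illustration (they are not taken from the MGS papers) so that the result lands near the 30-kilometer figure quoted above:

```python
# Back-of-envelope version of the thickness calculation:
#   thickness t = (magnetic moment per unit area) / (magnetization intensity M)
# Both inputs below are illustrative assumptions, not published values.

magnetization = 20.0      # A/m, assumed intensity for Earth's upper ocean crust
moment_per_area = 6.0e5   # A (i.e. A*m^2 per m^2), assumed from the orbital signal

thickness_km = moment_per_area / magnetization / 1000.0
print(f"required thickness = {thickness_km:.0f} km")  # prints "required thickness = 30 km"
```

The point of the sketch is only the scaling: with magnetization fixed at ocean-crust levels, a signal this strong forces the magnetized layer to be tens of kilometers thick.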

    “There's so much we don't know,” says Solomon. “You have to remember how difficult it was to convince people about sea-floor spreading in the 1960s. It'll keep people scratching their heads for quite a while.”


    Quantum Computing Makes Solid Progress

    Robert F. Service

    It's no wonder that the prospect of quantum computers gets people excited. If harnessed, the essential fuzziness of the quantum world could allow researchers to breeze in minutes through computations that would take today's supercomputers billions of years to crunch. Although quantum computing researchers have managed to carry out a few simple logic operations in the quantum regime, these typically have relied on ions poised in beams of light or jiggling molecules in a solution. Just getting them to serve as a single quantum circuit element required roomfuls of lasers, magnets, or other control equipment—a far cry from the millions of circuits that can be crammed onto a conventional silicon chip. But now a team of Japanese researchers has turned a simple metal- and silicon-based device into the key component of a quantum computer.

    In this week's issue of Nature, physicist Yasunobu Nakamura of the NEC Fundamental Research Laboratory in Tsukuba, Japan, and his colleagues describe how they created the first electrically controlled bit of quantum data, or qubit, in an electronic device that in theory could be reproduced manyfold. “It's an extremely significant success,” says David Awschalom, a physicist at the University of California, Santa Barbara. “It opens the door to building a solid-state quantum computer that's scalable. In computing, that's the name of the game.” Still, Awschalom and others caution that making complex solid-state quantum computers is still many years away, and researchers must first learn how to keep their quantum data from decaying almost the instant they're made. Says Dmitri Averin, a quantum computing expert at the State University of New York, Stony Brook, “It will definitely be a long and difficult challenge.”

    Even creating a single qubit was no simple task. Like all quantum computing schemes, this one makes use of the superposition principle of quantum mechanics, which states that, until it's measured or observed, a quantum system—such as the magnetic orientation of an atomic nucleus, or the location of an electron—exists as a superposition of all its possible states at once. Unlike the classical bits of data in a computer, which are decidedly either a zero or a one, qubits hover in an indecisive fog somewhere between these two values. When this fuzzy two-state bit is plugged into a logical operation, the computer in essence computes both outcomes simultaneously. Couple together just 300 qubits and a computer would instantly compute all 2^300 possible outcomes—roughly the same number as there are elementary particles in the universe.
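The scale of that claim is easy to verify with integer arithmetic. A quick illustrative check (the ~10^80-particle figure is a common rough estimate for the observable universe, not a number from the article):

```python
# n coupled qubits span 2**n basis states simultaneously; with 300 qubits
# the state count already dwarfs rough estimates (~10**80) of the number
# of elementary particles in the observable universe.

n_qubits = 300
n_states = 2 ** n_qubits

print(len(str(n_states)))   # 2**300 is a 91-digit number
print(n_states > 10 ** 80)  # True: more states than particles
```

This exponential blowup is exactly why coupling qubits, rather than building any single one, is the payoff the article describes.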

    The difficult part, however, is coupling qubits together. Since the mid-1990s, researchers have managed to make qubits from the magnetic spin of atomic nuclei in atoms or molecules and from the polarization of light, but these schemes are hard to miniaturize into working computers. “It's difficult to imagine scaling up an atomic or molecular system” to create the dozens or hundreds of qubits needed for sophisticated quantum computing, says Awschalom. In conventional computers, solid-state components have made it possible to shrink circuit elements to the scale of a few hundred nanometers. But the solid state seemed too unruly for quantum computing, for electrons in solids can have innumerable quantum states that are impossible to tell apart. For a qubit, you need easily distinguishable on-off states.

    One class of solids, superconductors, offers a simpler quantum environment, as their electrons all share the same quantum state and travel together in pairs. So the NEC team designed a set of tiny electrical components, made from metals that superconduct at very low temperature, that enabled single electron pairs to jump between a tiny bar-shaped metal island and a nearby metal reservoir. Applying a brief voltage pulse to a control electrode connected to the island creates equivalent energy states for electron pairs on both the island and the reservoir. The result is that the superconducting electron pairs oscillate back and forth between the two locations, representing the one and zero of a digital system. Such a setup does not work in semiconductors, where electrons' oscillations are disrupted by heat, lattice vibrations, and other troublemakers.

    In a working quantum computer, researchers would have to find a way for the state of this first qubit to influence the behavior of a second. But for this experiment with just one qubit, the NEC researchers simply read out the electrons' location. To do so, they apply a continuous voltage to a so-called DC electrode. This boosts the energy level of electron pairs on the island, causing them to break their superconducting bond to one another and hop to a nearby probe, which then channels them to a detector. “The probe can take two electrons if the [island] contains one extra electron pair, but zero electrons if the [island] doesn't have the electron pair,” says Nakamura. “That's why we can distinguish the two electron states.”

    Still, Nakamura acknowledges that this simple demonstration remains far from a useful quantum computer. The main problem is that the paired electrons oscillate back and forth for only about 2 nanoseconds before they are torn apart and siphoned off by the probe electrode. “That's not enough to do any computation,” says Nakamura. To be useful, researchers would like their qubits to be stable indefinitely. Efforts around the globe are now likely to focus on that goal, as well as on stringing a number of electronic qubits together to construct the first electrically controlled quantum computer.


    New Model for Hereditary Breast Cancer

    Michael Hagmann

    Breast cancer strikes about one out of nine Western women in their lifetime and is second only to lung cancer as a cause of cancer deaths in women. For women who have mutations in BRCA1, one of two genes linked to the 5% or so of the cases that are hereditary, the disease is even more fearsome. They have a 70% chance of getting it. Now, researchers have an important new clue about how breast cancer develops, at least in these women.

    The clue, in the form of an animal model for the disease, comes from the joint effort of two teams led by Chu-Xia Deng and Lothar Hennighausen at the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK). In a paper in the May issue of Nature Genetics, the researchers report that they have inactivated, or knocked out, the BRCA1 gene in mice exclusively in the cells where breast cancer normally originates—the epithelial cells lining the milk ducts.

    Previous efforts in which genetic tinkerers knocked out one or both copies of the gene in all mouse tissues produced disappointing results. Women with BRCA1 mutations are born with one inactive copy, while the other becomes inactivated later. But the animals with one inactivated copy did not get tumors at all, and those with two inactivated copies from the beginning died before birth. In contrast, the NIDDK team found that their animals do develop breast cancers, starting when they are about 10 months old. “This is quite exciting. Such an animal model is invaluable for understanding the role of BRCA1 in familial breast cancer,” says Andrew Futreal of Duke University Medical Center in Durham, North Carolina, a BRCA1 co-discoverer.

    In keeping with previous work indicating that BRCA1 is involved in repairing defective genes, Deng, Hennighausen, and their colleagues have found that breast cells lacking an active BRCA1 gene are prone to accumulating additional defects—most prominently the loss of the p53 tumor suppressor gene—that might be crucial contributors to cancer development. What's more, Futreal says, the new animals could prove useful in evaluating new treatments or chemopreventive drugs that might delay or even block the onset of breast tumors.

    To knock out the gene specifically in breast tissue, Deng and Hennighausen engineered a so-called conditional mutant mouse strain. They first created mice with a genetic tag, called a loxP sequence, at two different spots within the BRCA1 gene. Then the team crossed the loxP mice with another transgenic strain carrying the gene for a molecular scissors, an enzyme called Cre recombinase. To make sure that the Cre gene is active only in the mammary epithelial cells, the researchers combined it with the regulatory DNA elements of a milk protein produced only in this tissue. The Cre recombinase recognizes the loxP sites and chops out the intervening part of the BRCA1 gene, inactivating it. Sure enough, some of the resulting mice developed breast cancer in at least one of their 10 mammary glands between 10 and 13 months of age.

    Deng concedes that there are slight differences between his mice and women with BRCA1 mutations. Mice at 10 to 13 months of age are analogous to women in their 50s, while BRCA1-related breast cancers usually occur before menopause. Also, only 22% of the animals get the cancer, although Deng expects that more will as they age. Still, cancer biologist Bert Vogelstein of The Johns Hopkins University School of Medicine says that the animals provide the first “experimental system to figure out the way the BRCA1 gene works.”

    The NIDDK researchers began getting their first hints of how BRCA1 loss might lead to breast cancer when they looked at the milk ducts in mutant animals that were pregnant or lactating. “The mammary glands were smaller [in the mutants], and there is very sparse and sometimes abnormal branching” of the ducts, Deng says. At the same time the team observed extensive programmed cell death, or apoptosis, in the mammary tissue of mutant mice. “At first glance that looked quite inconsistent” with a gene abnormality that supposedly predisposes to the excessive cell proliferation of cancer, Deng says.

    A peek at the chromosomes of tumor cells helped explain this apparent paradox, however. In cells lacking BRCA1, the entire genome seemed intrinsically unstable: There were extra copies of individual chromosomes, and some had large deletions or were fused to bits and pieces from other chromosomes. That makes sense, because previous studies had found a connection between BRCA1, as well as the other hereditary breast cancer gene, BRCA2, and the repair machinery for the chromosome breaks that lead to such instability.

    Deng speculates that in the absence of BRCA1, cells accumulate enough DNA damage to trigger safeguards that cause them to stop dividing or even undergo apoptosis. That would explain the high cell-death rates seen in the mutant animals. However, the genetic instability also increases the mutation rates of crucial tumor-suppressor genes or cancer-promoting oncogenes—which may eventually overcome the growth controls and spur the development of tumors.

    The Deng-Hennighausen team already has evidence connecting the BRCA1 defect to the loss of p53, the well-known tumor suppressor gene that is itself mutated in about 50% of all familial breast cancers. They found that the mouse p53 gene is either totally silent or severely scrambled in two-thirds of the tumors in their BRCA1 knockouts. The researchers also found that inactivating one copy of the p53 gene in the BRCA1 mutants accelerated tumor formation in the animals and drastically increased the cancer incidence to some 75%.

    Hennighausen says he plans a variety of follow-up experiments with the mutant mice. For example, he wants to know whether the tumors spread as frequently as they do in human breast cancer patients and whether their growth is stimulated by the female hormone estrogen, as also happens in some human patients. If so, the animals would be good models for testing therapies.

    Other researchers are also eager to get their hands on the long-awaited mice. Says Hennighausen, “We received several phone calls from people requesting the animals.”

  ITALY

    University Funding to Be Tied to Performance

    Chiara Palmerini*
    *Chiara Palmerini is a writer in Milan.

    MILAN—Italy's reformist minister for universities and research, Ortensio Zecchino, is taking on the country's inefficient university system. A new bill, now awaiting the attention of the relevant parliamentary committee, would force universities to conduct annual assessments of the quality of their teaching and research and tie their level of government funding to the outcome. It would also give professors monetary incentives to get their students to complete their degrees and pass their exams on time. “This bill is a further step on the way to developing a more effective verification of results in the Italian academic world,” says Zecchino, who has already pushed through an overhaul of the National Research Council masterminded by his predecessor, Luigi Berlinguer (Science, 30 October 1998, p. 855).

    Previous Italian governments have tried with little success to impose assessments on the country's universities, which are almost exclusively government-funded. In 1993, universities were required to set up internal evaluation panels to assess teaching and research, but the system never worked well. Seven out of the 54 panels across the country never met and about half the panels never presented a report, while many of those that were completed turned out to be of little use. In 1996, a “national observatory” for the assessment of the university system was created, but it also has had little impact.

    If Zecchino's bill passes, the observatory would be replaced by a new national committee. “It will not be just a change of name,” assures Zecchino; “the national committee will have a much more incisive power.” The committee will have seven members, some of whom will come from abroad, and it will set general criteria for the universities to carry out their evaluations. Every university will have to set up a new internal evaluation panel with no more than nine members, one-third of whom must come from outside the university. The methods the panels will use to evaluate research and teaching have not yet been spelled out, but in their evaluation of teaching the panels must take into account student assessments of their teachers' performance—a new departure in Italy. According to the bill, assessments will not affect the careers of individual professors; they are for funding purposes only.

    The panels will be required to submit their reports to the national committee each year. And from 2000 onward, a portion of the government's funding for universities will be distributed by the committee according to the strength of these evaluations. Those universities that fail to submit an evaluation will receive none of this funding.

    The bill also aims to tackle the chronic problem of students not completing their degree courses. Only 13% of Italian students take their exams on time, and only about 30% of those who enroll eventually graduate. Zecchino is proposing to put up $150 million over the next 3 years in incentives for professors to get students through their courses successfully. Universities would bid for this money by proposing projects to improve degree success rates. Zecchino's ministry will provide one-third of a project's funding at the outset and the remainder when it begins to show results.

    Luciano Modica, rector of the University of Pisa and president of the Italian Conference of Rectors, says “any law improving the system of evaluation is certainly welcome, but it is not true that in the Italian academic world assessment of quality is completely nonexistent.”

    Giuseppe Palumbo, deputy of the main opposition party and vice president of the parliamentary committee that will scrutinize the bill, approves of Zecchino's plans in principle. But he believes that they are very ambitious and probably too idealistic. For example, he notes that there are so many students in some Italian university courses that direct contact with the professor is limited and student assessments have little meaning. “In practice, it will be very difficult applying that system without a global reorganization of our academic system,” he says.


    Schools Urged to Boost Technology Transfer

    1. Wayne Kondro*
    1. Wayne Kondro writes from Ottawa.

    OTTAWA—Canada is losing valuable technology to other countries because of its policies on exploiting the fruits of university research, according to a new report by a high-level government panel. The answer, says the panel, is to give universities rather than individuals the right to commercialize publicly funded discoveries, as well as the money to do the job right. But some academics fear that such a policy, described in a draft report obtained by Science, would turn universities into toolboxes for industry and undermine basic science.

    Bigger slice. Canadian officials want greater commercial return on academic R&D investment.


    The report, “Public Investments in University Research: Reaping the Benefits,” is the first product of the prime minister's Advisory Council on Science and Technology, created in 1997. Written by a nine-member subpanel of industry and university officials, it notes that only half of Canada's universities retain ownership of intellectual property (IP) generated by public funds or share it with the researchers; the others turn over full rights to the researchers. The result, says the panel, is that academe has become a “technology supply house for other countries,” with faculty members “handsomely rewarded through consulting fees in return for assigning away IP rights” to companies from other countries, notably the United States.

    Canadian universities collected a paltry $10 million last year from the marketing of university-based inventions, compared to $700 million in the United States, even though the government spends about one-tenth as much on academic R&D as its southern neighbor. Advisory Council member and expert panel chair Pierre Fortier says the only remedy is to “get some assurances from universities” that commercialization is part of their mission. “We cannot carry on with the laissez-faire approach which has prevailed until now,” says Fortier, special adviser to Montreal-based Innovitech Inc. The panel's final report will be submitted 11 May to the full council, which will forward it to the Cabinet in early June. Observers predict it will receive a warm reception from a government eager to promote high-tech industry.

    The draft report says that researchers should be obligated to make full disclosure of all IP created from federally funded research. The university, with few exceptions, should own the rights to its commercialization, while the creator should get a “share” in the form of equity or license income. The report proposes legislation similar to the U.S. Bayh-Dole Act of 1980, which gave universities the right to obtain title to inventions developed with federal funds and to grant exclusive licenses to patents based on those discoveries. Such a law would serve to harmonize what is now a hodgepodge of policies and practices.

    As an alternative to legislation, the report also proposes that the granting councils adopt a new IP code and prohibit awards to universities that don't follow its guidelines for promoting commercial activity. Fortier says that consultations with 150 academic administrators have recently led the panel to conclude that's a preferable approach. “It's easier to administer,” he says. “Legislation could take 2 to 3 years.” The panel also recommends that Ottawa spend $30 million a year to hire and train commercialization staff in universities, noting that only 62% of the country's universities have any form of office to foster technology transfer.

    While agreeing that universities need to become more attuned to the market, some academics question whether new spending on commercialization is the best solution to the problem of reaping more from Canada's investment in academic research. “Before you can imagine getting a lot of money from industrial applications, you must first invest more in basic research,” argues Yves Gingras, professor of the history of science and sociology at the University of Quebec in Montreal.

    Others, like Canadian Association of University Teachers executive director Jim Turk, worry that the recommended measures will transform universities from institutions of “open scholarship” to ones in which “commercial benefit” serves as the primary rationale for research. Turk takes issue with virtually every aspect of the report and is particularly incensed by its casting of faculty who assign IP abroad as somehow “treasonous” at a time when the government is allowing Canadian high-tech firms to be bought up by foreign interests. He also faults the panel's emphasis on commercialization rather than on the need to create new knowledge that might have applications, a view he sees as a “bizarre, Orwellian redefinition of innovation.”

    JAPAN

    Mixed Grades for 5-Year Science Plan

    1. Dennis Normile

    TOKYO—A 1995 law that led to Japan's first-ever 5-year plan for science and technology has helped boost spending and the number of scientists being trained, but it has been less successful in ensuring that the increased funding is well spent. That's the preliminary verdict of a committee of the country's top science policy-makers, in an interim report released last week. “There has been a fairly big effect … on bringing up the overall level of research activity,” says Mitsugu Ishizuka, a former official of the Science and Technology Agency (STA) and a member of the committee that drafted the report. “But there are aspects [of the plan] that haven't progressed as hoped.”

    The review is likely to be influential, given its source: the Committee on Policy Matters of the Council for Science and Technology, which is chaired by the prime minister and serves as the nation's highest science advisory body. The panel examined such quantitative measures as the level of funding and the number of lab assistants and interviewed national laboratory heads, researchers, and business leaders.

    Some of the major numerical goals in the basic plan are being met, the committee concluded. Research spending has risen dramatically and is within striking distance of a projected 17 trillion yen ($142 billion) over 5 years (Science, 22 January, p. 478). “We may be close enough to say we've hit the target,” says Nobuhiro Muroya, deputy director of STA's Planning and Evaluation Division, which helped with the report. The government has already achieved the target of 10,000 postdoctoral positions. In addition, the number of students studying for advanced degrees has increased from 39,660 to 51,360. University professors have also been given greater freedom to work with companies.

    The council gave barely passing grades to other efforts, however. Despite several emergency spending measures to stimulate Japan's stagnant economy, which included money for academic renovations and new equipment, “a lot still needs to be done,” says Ishizuka, one of two full-time members of the Council for Science and Technology. “There is still a lot of old equipment [in use].” And attempts to shrink government payrolls have stifled any increase in the number of research assistants, which Muroya says remains quite low.

    A big disappointment to many scientists is the committee's finding that procedures established to evaluate programs and institutes have had little impact on research activity. The report notes that although most organizations have gone through the motions, their efforts “are not sufficiently reflected in the allocation of resources or management of facilities.” Tomoko Ohta, a population geneticist at the National Institute of Genetics in Mishima and a member of the committee, believes the problem lies in making the evaluations sufficiently rigorous. “[Japanese] are just not accustomed to making critical comments of others,” she says.

    In a move unrelated to the committee's review, the Ministry of Education, Science, Sports, and Culture (Monbusho) hopes to standardize evaluation procedures and apply them to universities and national labs under its authority. A subcommittee of the Council for Science and Technology is also working on recommendations for other ministries. But no one expects a quick fix. Evaluation efforts “won't work if we don't change our culture,” says Muroya, a process that must be carried out incrementally.

    The committee hopes to submit a final report in about a year. In the meantime, the council plans to address some issues raised in the review, including the need for a clearer statement of science and technology priorities. One of its first opportunities will come this summer in recommendations to the Ministry of Finance for the fiscal year 2000 budget.


    High-Level Groups Study Barriers Women Face

    1. Jeffrey Mervis

    Mildred Dresselhaus isn't proud of the fact that she took a total of 4 days' maternity leave from her faculty position at the Massachusetts Institute of Technology (MIT) in the course of giving birth to four children. But as a young electrical engineer in the late 1960s in a bastion of masculinity, she didn't think she had much choice. “In my youth, it was necessary to play the game that way. We didn't have any options,” recalls Dresselhaus, who went on to become an MIT institute professor and a member of the National Academy of Sciences (NAS).

    A generation later, Elaine Mendoza, an aerospace engineer, had a lot more options when she decided to start a family. As the president and CEO of Conceptual Mindworks Inc. of San Antonio, Texas, which she founded in 1990, Mendoza has simultaneously raised two young daughters and turned a software business into one of the fastest growing Hispanic-owned companies in the United States. Her husband, an electrical engineer, is even one of her 51 employees.

    The contrast between the two women's experiences reflects the strides made by women in science and engineering in the last 30 years. But much more needs to be done for the country to take full advantage of its pool of scientific talent, most observers agree. This month the topic moved into the first rank of science policy circles. On 14 April a congressionally mandated commission met for the first time, with Mendoza as chair, to examine the barriers facing women, minorities, and the disabled in science. Last weekend Dresselhaus participated in a first-ever NAS symposium on improving scientific career opportunities for women. The commission is the latest in a series of federal efforts to document the problem, while the symposium, organized by the academy's Committee on Women in Science and Engineering (CWSE), builds upon persistent concerns about the minute presence of women—currently 5.9%—in the overall NAS membership.

    A major issue for both groups is a decline in the participation of women in science as they make their way through school and into the academic work force. “As you move along the educational and labor continuum, the gender gap becomes more and more pronounced,” says Marye Anne Fox, chancellor of North Carolina State University, Raleigh, and moderator of the NAS symposium. “And these distortions have persisted despite 3 decades of good-faith efforts.” The trend is especially troubling in younger fields like computer science, says William Wulf, president of the National Academy of Engineering. Women make up half the enrollment in high school computer science classes, he noted, but receive only 28% of the bachelor's degrees in the field. Their share of Ph.D.s drops to 16%, he added, and they hold only 6% of full professorships.

    This gender gap reflects the continuing difficulties of women scientists in academia, say many observers. And even those who “make it” face systemic discrimination in such areas as salaries, lab space, and service on key committees, according to a new report on the status of tenured women faculty at MIT (Science, 26 March, p. 1992).

    It's an open question whether industry is more receptive. IBM's Lillian Wu, a member of the President's Committee of Advisors on Science and Technology and co-chair of CWSE, says she's seen a rapid improvement in the past 5 years and that today “there's a tremendous appreciation for what women can bring to technology.” But Kathryn Johnson, co-chair of the congressional panel and a geoscientist who runs her own consulting firm in South Dakota, says that she and many other women have become entrepreneurs in part to escape the “glass ceiling” and “chilly environment” at many big companies.

    The NAS symposium featured a spirited discussion about such gender-related issues as whether science, in the words of Harvard physicist and CWSE co-chair Howard Georgi, “unconsciously discriminates” against women by selecting for such traits as “assertiveness” and “single-mindedness” that favor men. The way women respond to inequalities was also debated. Asked why the MIT women waited as long as they did to seek redress, Dresselhaus replied, “Women aren't as aggressive in asking for equality in salaries and amenities. There were several instances where I was shortchanged and I didn't complain. So part of the problem is us—not them [men].”

    Both committees hope to compile and disseminate a list of current best practices and suggest concrete ways for organizations to increase opportunities for women. But entrenched attitudes are often hard to change. “It's more acceptable for a woman scientist to have a family today,” Dresselhaus admits. “But it's not any easier.”


    Are Pathogens Felling Frogs?

    1. Virginia Morell

    Massive frog die-offs have for years been linked to environmental conditions, but new data from Australia suggest that the real killer may be a deadly fungus

    TOWNSVILLE, QUEENSLAND, AUSTRALIA—On a Tuesday morning last August, Ken Aplin, curator of reptiles and amphibians at the Western Australia Museum in Perth, got a chilling phone call. The owner of a nearby organic nursery wanted to know why hundreds of frogs—common motorbike frogs, whose call sounds like a motorcycle changing gears—had suddenly died on his chemical- and pesticide-free grounds. Aplin didn't have a ready answer, but he feared the worst. A few weeks later, his suspicions were confirmed: A colleague tested a freshly dead motorbike frog from another nearby source and found that the animal was lethally infected with a parasitic chytrid fungus—a virulent amphibian pathogen that had caused sudden, massive die-offs of more than a dozen frog species here in Queensland; four species have apparently gone extinct. Now, it seemed, the fungus had leaped 6000 kilometers across the dry Nullarbor Desert, spreading to western Australia, a region with a rich endemic frog fauna that had never seen massive die-offs before.

    “It was like hearing about a first case of cholera,” says Aplin. “I feared it would spread.” In the next few months, the disease was found on other dead frogs in Perth and nearby towns to the south, often killing every frog in backyard ponds. “I suspect that we're on the edge of a major outbreak that will cause mass mortalities in the next few months, when frogs gather to breed,” he adds.

    Australia considers itself the front line in dealing with this frog pathogen, Batrachochytrium dendrobatidis, a new genus and species described as a lethal disease 9 months ago (Science, 3 July 1998, p. 23) and named only last month. But the chytrid is not just an Australian problem. It is suspected in the catastrophic disappearance of frogs in Panama and Costa Rica; and it is implicated in mass die-offs in the United States as well. Indeed, some researchers—mostly epidemiologists—say that this virulent new fungus may be the key factor in the sudden, mysterious decline of frogs around the globe, particularly those from wilderness areas of the Americas and Australia.

    Since the 1970s, populations and species of frogs have been vanishing worldwide, and deformities such as missing legs have been turning up with alarming frequency, sparking massive research and monitoring programs. In many cases, frog populations or even entire species in pristine, remote mountain areas suddenly vanished in a few months. Baffled about why frogs in protected areas would be so vulnerable, many researchers have looked to the global environment, arguing that frogs are like canaries in a coal mine, serving as indicators of global ecological health (see sidebar). They have studied a plethora of environmental suspects, ranging from increased ultraviolet (UV) light to global warming and wind-borne pollutants. Yet despite almost a decade of intense research and some loose correlations between die-offs and environmental factors, no one so far has been able to show that these factors are actually killing frogs.

    Thus, when the chytrid was first fingered, many scientists regarded it as just one of many “smoking guns” that would prove less convincing as time went by. But this time, researchers have bodies to prove the case. The Australian experience has galvanized researchers there, and now scientists elsewhere are taking seriously the idea that the chytrid plays a central role in the declines. A team of U.S. researchers has just proven in the lab that the chytrid alone can kill healthy frogs. And by studying preserved specimens, other researchers have now implicated the fungus in some of the very die-offs that first raised the amphibian alarm in the United States, including mass deaths of leopard frogs, Rana pipiens, in the Colorado Rockies back in 1974, and in more recent disappearances of Arizona's lowland leopard frogs. “It's increasingly clear that we need to treat the chytrid—and amphibian diseases in general —as a serious threat,” says Cynthia Carey, a physiological ecologist at the University of Colorado, Boulder. “Diseases are killing frogs and we need to know why.”

    Still, when it comes to worldwide frog declines, several leading U.S. herpetologists, such as David Wake of the University of California, Berkeley, resist the notion of a single cause; others note that some chytrid-infected frogs survive. So researchers are working to test two competing ideas: that the chytrid is an emerging pathogen sweeping through previously unexposed populations, or that an environmental cofactor such as increased UV light or climate change is magnifying the chytrid's effects. “I don't think you can rule out” such cofactors, says Donald Nichols, a pathologist at the National Zoo in Washington, D.C., who first identified the disease. He and others hope that a flurry of chytrid research will help them find out.

    Death down under

    The story down under starts here in northeastern Australia, where herpetologists began monitoring frog populations in 1989 after several species dwindled or disappeared from some of the least touched places on Earth, including World Heritage rainforest parks in the Atherton Tablelands. At first, environmental pollutants were thought to be the cause. But the streams where the frogs died aren't polluted, and the amount of UV light here hasn't risen during the past few decades. In this case, “you can rule out any of these environmental cofactors,” says Richard Speare, an infectious disease specialist at James Cook University here. And infected frogs don't show “multiple opportunistic infections, such as you'd expect if their immune systems were compromised,” notes parasitologist Peter Daszak of the University of Georgia, Athens, who identified the chytrid in Australian frogs.

    The pattern of death “has all the hallmarks of an emerging pathogen,” says Speare, particularly its ability to infect a broad range of animals; thus it can continue to spread even after wiping out one species entirely. In Australia, researchers have found the chytrid on almost every suffering frog population they have checked, and they have been able to chart the fungus's spread through the continent, starting at the northern end of the epidemic in Queensland in 1993. So far they've traced it 300 kilometers south and 4 years back to 1989. The very first die-offs struck just north of the port of Brisbane in 1979, so Speare speculates that the fungus was introduced to Australia in the late 1970s, perhaps on an exotic frog.

    Traveling at about 100 kilometers a year, the chytrid decimated many species, moving “like a wave” through frog populations in successive localities. Last year it apparently crossed the continent to Perth and western Australia, perhaps on the feet of an infected frog that stowed away in a box of fruit destined for Perth. “That happens all the time,” says Aplin. “A store clerk opens a box of bananas from Queensland and out hops a frog”—and with it the chytrid. Such a scenario would explain why the first cases in southwestern Australia appeared in urban areas.

    Not all Aussie frogs have suffered equally from the chytrid: Queensland's green-eyed treefrog, which died from the disease in great numbers in the late 1980s and early '90s, is slowly coming back, suggesting that this species may have some resistance. But four stream-dwelling species, including two unique gastric-brooding frogs, are presumed to have been wiped out by the fungus. It has also infected another 24 species and drastically shrunk the population of 11 of these.

    At the same time the Australians were identifying the pathogen, researchers in Central America were puzzling over their own frog deaths. After hearing about the Australian situation, researchers checked for the chytrid—and found it on 10 different dead frog species in western Panama. The chytrid is also linked to the disappearance of numerous species in the rainforests of Costa Rica, says herpetologist Karen Lips of Southern Illinois University in Carbondale.

    Preliminary genetic work suggests that the Central American frog fungus is the same one that plagued Aussie frogs. David Porter from the University of Georgia, Athens, chytrid specialist Joyce Longcore from the University of Maine, Orono, and graduate student Timothy James from Duke University in Durham, North Carolina, have found that the 18S ribosomal DNA regions of chytrids from the two continents are nearly identical, implying that a single fungal species is sweeping into new realms worldwide. James also notes that it is “quite an odd fungus,” different from the chytrids found commonly in soil and water. “There's no doubt we're seeing a new, emerging disease, one that is highly pathogenic and hits a wide range of amphibians,” concludes Daszak. Further molecular data should reveal how recently the chytrid has spread through frogs on various continents, and perhaps whether it has newly evolved to attack amphibians or if it is an old frog nemesis now invading naïve populations.

    Some researchers haven't been sure that the chytrid can kill healthy frogs. But a few months ago, in as-yet-unpublished work, a team led by the National Zoo's Nichols isolated the chytrid from poison dart frogs killed by the disease. They then inoculated healthy frogs with this chytrid. All infected frogs died, whereas frogs inoculated with a placebo did not. Next, they reisolated the chytrid from the second batch of dead frogs, a sequence of experiments that fulfills what are called Koch's postulates, the gold standard for proving that an organism causes a disease.

    The chytrid apparently uses the keratin in the frog's skin as a nutrient. Its motile, water-borne spores invade surface skin cells and grow and divide there asexually. No one is quite sure just how it kills frogs; Speare suspects that the fungus secretes a toxin, as dying frogs are unable to keep their balance and seem to have seizures, whereas others think the fungus blocks water uptake. In Australia, species hit hardest tend to spend most of their lives in water and live at higher and cooler altitudes, says Speare. That fits with both Australian and American lab studies showing that the chytrid is hard to grow at above 30°C and dies without water.

    Revisiting U.S. die-offs

    The chytrid's cold, wet-loving habits may help explain dramatic frog deaths in the United States as well, some researchers say. Take the decline of the once-common lowland leopard frog in Arizona. Over the last decade, researchers have pointed their fingers at the usual list of suspects, including loss of wetlands, heavy metals, and bacterial infections. But no one had ever seen wild frogs dying en masse from these killers. Scientists only knew that when they returned to the field each season, fewer and fewer frog populations were left.

    Then, in January of this year, Michael Sredl, a herpetologist at the Arizona Game and Fish Department in Phoenix, spotted a leopard frog population north of Phoenix in its death throes: The frogs were emaciated, trembling, and rigid. They had no obvious skin lesions, ulcers, or fungal growth, but histological sections of their skin revealed lethal numbers of the chytrid and its spores.

    This sighting also taught Sredl something else: Frogs were dying in winter. “We'd missed it every other year, because herpetologists usually don't go looking for their animals in the winter,” he says. The chytrid has now been “positively implicated” in die-offs of two leopard frog species in Arizona, as well as a species of treefrog, says Sredl. It's “under investigation in the declines of all Arizona ranid frogs.” And it's been found in specimens collected last year of Pacific treefrogs and endangered mountain yellow-legged frogs from California's High Sierras. Other scientists believe the chytrid may be responsible for a slew of other die-offs as well, including the extinction of Rana pipiens and Bufo boreas from the Colorado Rocky Mountains in the 1970s, a '70s crash of the Rana pipiens population around the Great Lakes, and sudden die-offs of ranids and toads from Wyoming to New Mexico in the 1980s.

    Pathogens have been suggested as culprits before. Back in 1993, Norman Scott, a zoologist at the U.S. Geological Survey in San Simeon, California, suggested that a novel pathogen was killing western frogs. He named the disease the “postmetamorphic death syndrome,” because newly metamorphosed frogs seemed to die overnight. That's a characteristic of chytrid infection, because tadpoles carry the disease only in their mouths and survive; after metamorphosis, the fungus spreads throughout the frog's skin and kills it. But Scott's idea received little attention at the time, perhaps, he says, because researchers were so determined to find an environmental cause.

    Now U.S. scientists are gearing up to have both historical and freshly collected specimens checked for the fungus, something the Australians began a year ago. “Only a week ago, I would never have thought to do this—collect frogs for testing,” says Michael Lannoo, a herpetologist at the Indiana University School of Medicine in Muncie, who recently realized that the pattern of declines he's studying in the Midwest's northern cricket frog matches that of earlier chytrid die-offs.

    However, even the most avid defenders of the chytrid hypothesis say it's not responsible for every decline. The fungus has not been found in Europe, for example. And herpetologists generally agree that the biggest problem for frogs worldwide is habitat loss; a species with only a few small, fragmented populations is likely to be much more vulnerable to the chytrid or another disease. “When you turn a diverse ecosystem into a hog lot,” notes Val Beasley, a veterinary toxicologist at the University of Illinois, Urbana-Champaign, who has found mild cases of chytrid fungus but no mass deaths in southern cricket frogs, “it's not a surprise that you get a disease.”

    What's more, many researchers, noting that there are correlations with environmental factors, still favor this kind of explanation. “It's way too early to rule out these other factors, such as pollutants and UV light,” says Andrew Blaustein, an ecologist at Oregon State University in Corvallis. Veterinary researchers such as Nichols add that most fungal infections in amphibians are opportunistic, moving in when animals are already stressed or injured from other sources. And in the United States, there's no clear pattern yet of deaths spreading out from one locality, as in Australia.

    Wake also questions whether one fungus could kill so many different kinds of amphibians—frogs and toads in 19 families have died from it, and it has infected salamanders too. “I don't know of any other pathogen that kills like that; for example, one that kills all mammals,” he says. In his view the genetics are too preliminary to be sure that the chytrid is the same species and strain in Australia and the Americas. Until that has been shown, he says, “any suggestion of an epidemic [is] irresponsible.” Even Nichols, who proved that the chytrid can fatally infect frogs, says to “count me among the skeptics who wonder what role the chytrids are playing” in the wild. He and others question whether something hasn't changed in the frogs' environment to weaken their resistance and promote the chytrid.

    While U.S. researchers argue about the chytrid's role, in Australia, Speare, Aplin, and others are waging a campaign to try to stop it. They're tracking the fungus's spread, identifying susceptible species, and planning captive breeding programs. And even skeptical U.S. researchers are urging field precautions, in case herpetologists themselves are spreading the fungus via wet boots or collecting gear. Chytrid specialist Longcore, who named the new genus, notes that she brought a non-disease-causing type of chytrid from Puerto Rico home to Maine in the wet mud on her boots.

    Despite all such efforts, Speare and others fear that in Australia, the disease “will be spread like the plague” through new populations. At least, Aplin says, this time scientists will be able to watch one of these sudden declines in action, rather than discovering it after it's all over: “We've caught it this time close to the beginning.” And that may provide answers to the many questions that still surround this strange frog killer.


    Frogs: Canaries in the Hot Zone?

    1. Virginia Morell

    Back in 1990, when 40-odd herpetologists and ecologists convened in Irvine, California, to discuss the mysterious problems afflicting frogs, they concluded that frog declines were symptomatic of larger environmental problems on the planet. Like the canary that miners carried to warn them of bad air, frogs could also warn us that “something is desperately wrong with our environment,” says ecologist Andrew Blaustein of Oregon State University in Corvallis.

    But now evidence is mounting that the frog die-offs—and the disturbing limb deformities seen in some populations—may be due in part to a different class of problem: pathogens. A lethal chytrid fungus is the primary suspect in many massive frog die-offs (see main text), and in this issue, researchers report that a snail-borne parasite can cause some of the extra and missing limbs seen in wild frogs (pp. 731, 800, and 802). If so, frogs' plight may not be indicative of the ills afflicting Earth as a whole, at least not in the way researchers first thought.

    The original idea was that frogs, because of their diverse biology—they live both in water and on land, are vegetarians as tadpoles and carnivores as adults, and have permeable, unprotected skin—are more sensitive to environmental changes than such species as birds, explains herpetologist George Rabb, director of Chicago's Brookfield Zoo. The “canary in a coal mine” idea took hold, and this status was a boon to frog research, as granting agencies funded studies on environmental ills such as wind- and water-borne pollutants, ultraviolet (UV) light, and global warming. Just 2 months ago, Interior Secretary Bruce Babbitt asked Congress for $8.1 million to set up a task force to look into frog declines and deformities.

    After a decade of work, researchers have shown that environmental factors such as global warming are sometimes correlated with die-offs, and that some of these factors can stress frogs in the lab. But no one has been able to show conclusively that an environmental trigger is responsible for the major, sudden frog declines.

    Still, some researchers argue that even if pathogens are to blame for some frog disorders, the amphibians can still be considered early warning signals for environmental problems. “Absolutely yes,” says Blaustein. He and other herpetologists argue that factors such as UV light and global warming do stress frogs, and that the chytrid may simply pick off amphibians weakened by these factors. They insist that there will always be many reasons for the die-offs. “Maybe in some places, a disease is going to kill the frogs; elsewhere it's going to be habitat loss, or herbicides, or UV light. It's not going to be a simple, one-size-fits-all story,” says Blaustein.

    Other researchers admit that having an amphibian pathogen as the villain takes some of the shine off the frog as poster child for the entire environment. However, that doesn't mean that the environment can be ignored, cautions Richard Speare, a wildlife infectious disease specialist in Australia. “The chytrid's presence strengthens the need to have optimal environments to tip the balance in favor of the frog,” he says.

    Speare and others add that if the brisk intercontinental trade in frogs does help spread the chytrid fungus, as some researchers suspect, then frogs may be sending another message about humans' influence on the environment. “Maybe emerging wildlife diseases warn us of a different global environmental threat: the introduction of [wildlife] diseases on a global scale,” says Speare's colleague, Peter Daszak, a parasitologist at the University of Georgia, Athens. “This could be just as significant a threat to the global environment as other forms of anthropogenic change—and one which is far more difficult to correct.”


    A Trematode Parasite Causes Some Frog Deformities

    Jocelyn Kaiser

    The cysts formed by the trematode lead to abnormal limb development in California frogs; whether trematodes cause the deformities elsewhere remains to be seen

    It was a disturbing sign that something might be going terribly wrong in the environment: Frogs with extra legs, missing limbs, and twisted jaws were popping up in ponds across the country. First spotted by schoolchildren in Minnesota in 1995, the famous malformed frogs, together with reports of declining frog populations worldwide, sparked concerns that the animals might be falling victim to some type of environmental degradation—a change that might even threaten human populations. The discoveries touched off a million-dollar-plus hunt to find the culprit, whether natural or humanmade. Two reports published in this issue now point to a natural cause for at least some of the frog abnormalities.

    On page 802, a team led by a recent Stanford graduate describes results indicating that infection by a trematode, a kind of parasitic flatworm, is at fault. The researchers based this conclusion on experiments in which they showed that they could exactly duplicate the kinds of limb abnormalities and other deformities seen in California by infecting tadpoles with the trematode, which goes by the genus name Ribeiroia. Some experts say that this work, together with a second study reported on page 800 that may exonerate certain chemicals suspected of causing the abnormalities, has now elevated parasites to the top of the list of possible causes for the frog deformities across the country. “This is the best experimental evidence showing a cause for the limb deformities in amphibians,” says Andrew Blaustein, an ecologist at Oregon State University in Corvallis, who has studied whether ultraviolet light could explain the deformities.

    Others caution, however, that Ribeiroia infections may not explain the different patterns of frog deformities seen outside of California, especially in the Midwest. “I do not believe that there's a single cause” for the deformities, says herpetologist Mike Lannoo of Indiana University School of Medicine in Muncie. Still, many experts are saying that after several years of frustration it's a relief to finally get some hard evidence for what might be happening to the frogs.

    Since the first malformed leopard frogs made headlines in Minnesota, deformities in at least 12 species of frogs and salamanders have been reported in Canada and in Vermont and 32 other states, often at rates of 8% or more, much higher than the rate of 1% or less expected in healthy populations. Investigators have pursued three main theories about what might be causing the problems: chemicals such as pesticides, increased ultraviolet light because of ozone destruction, or parasites (Science, 19 December 1997, p. 2051).

    Despite the flurry of activity, however, no lab had grown a batch of frogs under environmentally relevant conditions and produced the same deformities seen in wild specimens of the same species—until now. Pieter Johnson began this project 2 years ago for his undergraduate thesis at Stanford, with ecologist Paul Ehrlich as his adviser. Johnson investigated some ponds about 45 minutes south of Palo Alto where up to 40% of emerging Pacific treefrogs had deformities, mostly extra, partial, or missing hindlegs. The water tested free of chemical pollutants, but he noticed that the ponds with deformed frogs always had planorbid snails, a first host for Ribeiroia trematodes. “That was a pretty substantial clue” that trematodes might explain the deformities, Johnson says.

    That idea fit with a proposal developmental biologist Stanley Sessions of Hartwick College in Oneonta, New York, had made years earlier. In work published in 1990, Sessions had shown that he could induce extra legs in salamanders by implanting beads in their developing limbs, presumably because the beads move cells around. Noting that the cysts formed in infected hosts by trematodes could exert the same kind of mechanical forces as the beads, Sessions suggested that the worms could also cause limb deformities.

    By the time he graduated last June, Johnson had dissected hundreds of frogs and found that they did in fact have trematode cysts clustered around their extra limbs. But he hadn't done any experiments exposing tadpoles to the parasites. “I couldn't let go that close” to a solution, he says. So he teamed up with two friends, Kevin Lunde and Euan Ritchie. They all spent the summer “working pretty intensely,” Johnson recalls, often from 10 p.m. to dawn so they could catch the parasitic worms when they emerged from the snails and use them to infect the frogs.

    After several false starts, the team began infecting tadpoles with Ribeiroia and watching them develop into adults. The results were “almost painfully textbook,” Johnson says. Higher doses of the trematode produced more deformities, and the mix of multiple legs, partial and missing limbs, fused skin, and other oddities was very close to that seen in the frogs in the field. Johnson thinks the cysts may cause deformities by changing the positions of cells in a developing limb, as Sessions's beads apparently did, and may also produce some chemical that mimics a hormone.

    The Johnson team's findings don't mean that some chemical in the environment couldn't be at work too, but in the accompanying report, Sessions offers evidence that seems to rule out at least one type of chemical that has been linked to the frog deformities: retinoids. Sessions compared the abnormalities in 391 preserved, multilegged Pacific treefrogs from California and Oregon to those known to be induced in the lab by retinoids. More than 90% of the time, for example, the chemical produces a “proximal-distal duplication,” such as a new limb coming out of the elbow rather than the shoulder. The retinoids also cause only certain mirror-image limb duplications. Although Sessions found many specimens with other kinds of mirror-image duplications, none had proximal-distal duplications. “Retinoic acid gives you particular morphologies, and we just don't see that with the frogs,” says Sessions.

    Developmental biologist David Gardiner of the University of California, Irvine, who has been studying retinoids as a possible cause, disagrees, saying they are still in the running. “What the published literature says retinoids do and don't give you,” he says, isn't clear-cut. Other researchers say that differences in the abnormalities seen in midwestern and eastern frogs also point to other causes besides parasites. Few have the extra legs seen in California, for example. And although some of the animals have cysts, so far nobody has found Ribeiroia in the midwestern frogs. In addition, Carol Meteyer, a wildlife pathologist at the U.S. Geological Survey in Madison, Wisconsin, says she has dissected hundreds of metamorphosing tadpoles from the affected ponds, and the cysts she has found did not appear until after the frogs' limb buds had developed—too late to do the damage Johnson describes.

    But Lannoo and many others think parasites should be looked at more closely, even in those locales where chemicals are also suspected. “I don't for a minute think this is going to explain everything,” says David Wake, director of the University of California, Berkeley's Museum of Vertebrate Zoology. But he adds that it's “a warning not to put all of your eggs in one basket” when trying to pin down the cause of the frog deformities.


    Headhunters Stalk the Halls of Physics

    James Glanz

    Bidding wars are breaking out in academe as prestigious institutions vie for the top researchers in high-profile areas of physics and astronomy

    Nothing is permanent but change, said the Greek philosopher Heraclitus. Academic physics departments are discovering this truth all over again. Like star athletes and top business executives, high-profile physicists and astronomers have become the object of bidding wars, leading to a chaotic mobility from which some academics believe only a handful of the most prestigious and best funded institutions can benefit. One shell-shocked department chair, Paul Langacker, quips that he has a new motto for enticing prospective faculty members: “The University of Pennsylvania—where Princeton and Harvard come to recruit.”

    Langacker says that although researchers have every right to move, the accelerating pace threatens to destroy small groups that universities such as his own have carefully built up in emerging subfields—sometimes before the Princetons and Harvards saw the trend. Others think the aggressive recruiting actually has benefits, as it spreads ideas around and encourages collaboration. But there is one universal feeling when a star walks out the door, says Pekka Sinervo, chair of physics at the University of Toronto: “It hurts. There is no way that it can't hurt.”

    Sinervo and others at Toronto got a strong taste of that hurt in 1997 and 1998. First, Scott Tremaine—a former director of the Canadian Institute for Theoretical Astrophysics with joint appointments in physics and astronomy—left to become chair of the Astrophysical Sciences Department at Princeton. Long a force in areas ranging from celestial mechanics to cosmology, Tremaine, who turned down an initial offer from Princeton and then changed his mind 6 months later, says Toronto mounted a “very effective countercampaign” to keep him. It was not effective enough, however, and Sinervo was determined to do even better when the phone started ringing in late 1997 for one of his department's best young researchers: Thomas Mason, a materials scientist widely known for his studies of high-temperature superconductors.

    Now 34, Mason was named one of “100 Canadians to watch” by Maclean's magazine in its 1 July 1997 issue for his structural studies of novel superconductors using neutron scattering. But he let it be known that he would entertain offers in the United States because of what he saw as insufficient support for such research in his department and because Canada's premier neutron facility—the Chalk River reactor—is aging and poorly funded. Interest materialized posthaste from the University of California, San Diego, and Los Alamos National Laboratory in New Mexico; but it was an offer to become scientific director of the planned $1.36 billion Spallation Neutron Source (SNS) at Oak Ridge National Laboratory in Tennessee that really grabbed his attention.

    Toronto quickly offered to bump Mason's salary up to six figures, to spend $400,000 on a new helium-liquefaction plant—crucial for Mason's work—and to hire a new faculty member in the same area. “It's by no means been a tradition at Toronto that we can react as nimbly as we did,” says Sinervo. “In the end, the offer they made me actually addressed all of the concerns I had initially,” says Mason. But by then, the once-in-a-lifetime chance to influence scientific priorities at SNS, along with a substantially more generous financial package and other benefits, induced him to leave the university just 5 years after he arrived. “I got tenure the week I left,” he says.

    In spite of Toronto's losses, the University of Pennsylvania's Langacker might envy the national boundary that separates Toronto from prestigious, deep-pocketed American institutions. Located just a few miles down Interstate 95 from Princeton, his university has seen several of its biggest names alter their commutes recently. This year, in what Langacker calls a “devastating” loss, one of the world's most respected astrophysicists—Paul Steinhardt, who had been at Penn for 17 years—left the group of junior faculty he had been leading. When he moved to Princeton, his family changed houses, but Steinhardt's wife, a professor of Chinese art history, was able to keep her position at Penn.

    “Penn was always extremely generous and supportive throughout my career there,” says Steinhardt, “but they can't create an astrophysics department of the quality here out of the blue.” Respecting Steinhardt's reasons for moving, Penn made no counteroffer, but it did swing into action when Princeton began recruiting materials scientist David Weitz. After a move from Exxon 3 years ago, Weitz had seen his reputation skyrocket as his research focus—the physical properties of biological materials, colloids, gels, and foams—became more familiar in academe.

    The university put together a package worth more than $1 million, says Langacker, including a big salary increase, another faculty position in the field, and a center for “soft condensed matter” that Weitz would direct in exchange for a reduced teaching load. Ultimately, Penn lost out—when Harvard made “what I thought was a really outstanding offer,” says Weitz, including a large amount of start-up money, a relocation package, and the chance to take all 10 of his students with him. “It's a job-seeker's market,” says Weitz.

    Just up the road, Rutgers University in New Jersey has seen its prized research group working on the high-profile topic of string theory—a mathematical quest for a unifying theory of particles and forces—become a hot commodity. Starting a decade ago, Rutgers built one of the premier groups in the field, luring four top theorists: Nathan Seiberg, Steve Schenker, Dan Friedan, and Tom Banks. But when string theory caught fire, other universities began eyeing this reservoir of talent. “What happened was they were successful—in some sense too successful,” says Paul Leath, chair of the department of physics and astronomy at Rutgers. Seiberg has left for the Institute for Advanced Study in Princeton, and Schenker is at Stanford University. Banks is also considering leaving, says Leath. “We offered them everything you could imagine” to stay, says Leath, including sky-high salaries, new infrastructure, and fresh research funds.

    Such bidding wars take a toll on universities, says one Nobel laureate. “I recognize that superstars can create real intellectual excitement and be a magnet so that a major new strength can be created,” he says, but excessively lavish packages can “divert precious resources from others who could better use the money. I am not in favor of the ‘free-agency’ aspect of recruiting,” adds this laureate.

    Still, some physicists think the free agentry could be a sign of a broader stirring in the long-stagnant job market in physics, even though statistics compiled by the American Institute of Physics don't show any trend so far. “I was just in a committee meeting last week,” says Tremaine, “and we realized that nine out of 10 people at the table either had moved in the last couple of years or were contemplating a move.” Others note that the value of endowed academic chairs has risen with the stock market, increasing the odds that people can be attracted to fill openings that do exist.

    For the departments on the losing side, there's another silver lining: the chance to recruit a new star. Toronto's Sinervo sweetened the package that had been put together for Mason and offered it to Louis Taillefer, a star materials scientist at McGill University in Montreal. Taillefer had no interest in leaving Canada, but “as soon as it became clear that Louis was mobile, other institutions moved in,” says Sinervo, who then had the pleasure of outbidding McMaster University, Simon Fraser University, and the University of British Columbia. Taillefer now occupies Mason's former faculty slot in Toronto.


    New Ground-Based Arrays to Probe Cosmic Powerhouses

    Dennis Normile

    Built at a tiny fraction of the cost of satellites, these telescopes should help unlock the mysteries behind these high-energy sources of photons

    TOKYO—Gamma rays are a signature of the most powerful and puzzling phenomena in the universe—gamma ray bursts, supernovae, and the black hole-powered infernos called blazars. But scientists' view of these high-energy photons has been blurry at best. Blocked by the atmosphere, they have been studied mainly from satellites, such as NASA's Compton Gamma Ray Observatory. But the satellite-based detectors have poor angular resolution, and the highest energy gamma rays elude them. To get a better view of the gamma ray sky, astronomers are going back to where it can't be seen directly—back to the ground.

    In a flurry of construction in deserts and on mountain peaks, they are building arrays of reflectors and light detectors designed to pick up the faint glow produced when gamma ray photons slam into the upper atmosphere. The University of Tokyo's Institute for Cosmic Ray Research (ICRR) has just gotten approval to expand its present single 7-meter telescope in the Australian outback, called CANGAROO, to an array of four 10-meter telescopes. The Max Planck Institute for Nuclear Physics in Heidelberg, Germany, is developing the components for an array of four 10-meter telescopes to be built in central Namibia. A second Max Planck physics institute, in Munich, leads a group including Spanish and Italian universities and institutes building a single 17-meter dish, called MAGIC, at the Roque de los Muchachos observatory on La Palma in the Canary Islands. And the Whipple Observatory of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, is expecting a funding decision within the next few weeks on a proposal to build seven 10-meter telescopes on Mount Hopkins in southern Arizona.

    An array of arrays. New telescopes to indirectly detect incoming gamma rays are sprouting in Australia, Namibia, the Canary Islands, and Arizona.


    “This is a poor man's approach to gamma ray astronomy,” says Trevor Weekes, principal investigator for the Whipple project, called VERITAS. VERITAS is the most expensive of the projects, but at $16.6 million, it is a fraction of the cost of a gamma ray satellite. Even at that bargain price, Weekes and his fellow gamma ray astronomers are expecting a big scientific payoff. The arrays should allow astronomers to track down the sources of gamma emissions that are now mysterious, as well as to observe gamma rays at energies that are now invisible. And the global coverage should allow astronomers to keep a continuous watch for short-lived sources, such as gamma ray bursts.

    Gamma rays also provide astronomers with a new window to look upon a universe now known mainly from optical, radio, and x-ray observations. “The feeling is that for our understanding of the evolution of the universe, as well as what is going on in specific processes, this nonthermal component is as important as the thermal component, and we know a lot less about the nonthermal universe,” says Werner Hofmann, one of the leaders of the German project in Namibia, called HESS.

    Satellites were the first choice for gamma ray observations because gamma rays from space “never get closer than 20 kilometers of us,” Weekes says. But satellites have severe limitations. They are necessarily small, limiting the number of photons they can gather. And gamma rays with energies more than about 10 GeV (giga-electron volts, or 10 billion electron volts) elude the small satellite-based detectors, which work by absorbing gamma rays in a dense material or by tracking the electron-positron pairs that result from gamma ray interactions in a gas-filled chamber. Both limitations are a liability for researchers hunting down the rare TeV (trillion-electron-volt) gamma rays.

    Searching for a way around that problem, astrophysicists relied on the fact that gamma ray photons create a cascade of charged particles when they slam into the atmosphere. These charged particles create a faint glow of light, known as Cerenkov radiation. Atmospheric Cerenkov telescopes, as they are called, gather and focus this light on a camera or other light sensor. The pattern of the Cerenkov radiation leaves clues about the energy and direction of the original gamma ray photon. A Cerenkov telescope on the ground can be as large as funds allow, overcoming the sensitivity limits that plague satellite detectors, and the atmospheric Cerenkov technique has proven capable of detecting gamma rays from about 200 GeV to 50 TeV.

    Although the principle of the atmospheric Cerenkov telescopes has been known for decades, a major challenge was learning to discriminate between gamma rays, which are believed to come from point sources, and cosmic rays and other background noise. Weekes's group at the Whipple Observatory, which built a 10-meter gamma ray telescope in 1968, was the first to demonstrate convincingly that the ground-based technique could work. The technique was taken a step further by the German-Spanish-Armenian High-Energy Gamma-Ray Astronomy, or HEGRA, project in the Canary Islands. It used an array of six atmospheric Cerenkov telescopes and over 200 particle detectors, which caught the secondary charged particles directly. Completed in 1997, HEGRA demonstrated that two or more ground telescopes work better than a satellite in determining the origin of photons. “The Whipple first proved the concept of observing gamma rays from the ground,” says Hofmann. “HEGRA proved the value of having an array of detectors.”

    One of the most significant discoveries by Weekes's group and others using first-generation ground-based telescopes was the detection of TeV gamma rays from certain supernova remnants. It had been thought, but never proven, that these remnants, which have pulsars at their cores, are filled with electrons accelerated to velocities close to the speed of light. Such electrons would collide with photons in the supernova cloud and send them winging off through space as gamma rays. This inverse Compton scattering, as the phenomenon is called, creates gamma rays with a distinctive energy spectrum. Observing this telltale energy signature confirmed the presence of electrons at TeV energies. “There is no other way that [the gamma rays] can be produced but by these electrons. There is no ambiguity,” Weekes says.

    While upholding one theory concerning supernovae, the atmospheric Cerenkov telescopes have so far failed to confirm that the supernovae are the source of the cosmic rays that bombard Earth from random directions. It had been generally accepted that cosmic rays, primarily high-energy protons and atomic nuclei, originate in the expanding shock wave of supernovae. Theory had further predicted that protons within the shock wave would interact with the gases of the interstellar medium and produce gamma rays. But the telltale energy spectrum of gamma rays produced by these hadronic collisions has never been observed, with the possible exception of one observation by the Japanese-Australian CANGAROO group. The next generation of atmospheric Cerenkov telescopes will be more sensitive and, in combination with planned space-based gamma ray detectors, should be capable of probing the entire gamma ray energy range.

    Another surprise, from both ground-based and satellite telescopes, has been the detection of gamma rays from blazars, a special class of active galactic nuclei. AGNs, at the centers of some distant galaxies, are believed to consist of a black hole, a surrounding accretion disk of material falling into the hole, and relativistic jets of plasma ejected in two directions perpendicular to the disk. In blazars the jets are thought to point toward Earth, aiming potent gamma rays at us. But theorists have struggled to explain energies as high as those observed (Science, 14 November 1997, p. 1225). “We're at a loss to explain the mechanisms that produce these very great quantities of energy,” says Charles Dermer, an astrophysicist at the Naval Research Laboratory in Washington, D.C. He hopes that gamma ray observations of AGNs made with the new generation of ground-based telescopes can be coupled with x-ray and ultraviolet observations to yield clues on how black holes extract energy from accretion disks and eject it in jets.

    All in all, astronomers believe the new wave of ground-based instruments will throw wide open a new window on the universe. “The early 21st century,” says Tadashi Kifune, principal investigator for CANGAROO, “will be the era of gamma ray astronomy.”
