News this Week

Science  09 Feb 2001:
Vol. 291, Issue 5506, pp. 958



    Muon Experiment Challenges Reigning Model of Particles

    1. Charles Seife

    The Standard Model of particle physics, for decades the seemingly unshakable ruler of the subatomic world, may have suffered a body blow this week. As Science went to press, physicists at Brookhaven National Laboratory in Upton, New York, were preparing to present results of a long-awaited experiment—results that contradict what the model predicts. The timing of the announcement prevented Science from contacting physicists not associated with the experiment. But barring a statistical fluke or undetected systematic error—both real possibilities—the Brookhaven observations appear to mark the best evidence yet that the Standard Model is just a province of a larger, shadowy realm: supersymmetry.

    “It's a very high-precision measurement—the value is unequivocal. But the Standard Model itself is unequivocal,” says Thomas Kirk, an associate lab director at Brookhaven. The measured and theoretical values disagree. “This implies that there must be physics beyond the Standard Model.”

    The Brookhaven experiment, known as g-2, measured the so-called “magnetic moment” of the muon, a heavier cousin of the electron. An object's magnetic moment describes how strongly it twists in a magnetic field. For classical objects, such as bar magnets, calculating the magnetic moment is easy. In the quantum realm, however, things get a good deal more complicated. Quantum objects such as electrons and muons exist amid clouds of evanescent particles that wink in and out of existence in fractions of a second. A muon might be surrounded by a zoo of photons, Z particles, and W particles, and it would interact with each member in the menagerie. Those interactions mess up the classical calculations of the muon's effective magnetic moment, throwing the value off by a tiny amount.

    In the past, the Standard Model of particle physics has done a superb job of accounting for that anomaly. The model binds together all the fundamental particles—quarks, neutrinos, electrons, taus, muons, gluons, and so forth—with a mathematical framework that enables physicists to calculate statistically how particles will interact. Among many other triumphs, the model successfully predicts the anomaly in the electron's magnetic moment to within 1 part per billion—the limit of measurement error. “That's one of the major reasons why we believe the theory is really correct,” says Gerry Bunce, project director for the experiment at Brookhaven. “It's a real underlying pillar of the Standard Model.”

    The muon's anomaly is less well known, as the particle is rarer. Theory predicts the value with an uncertainty of less than 1 part in a million, but experiments have not been precise enough to test the theory at that level—until now. For the past 3 years, physicists have been smashing beams of protons from Brookhaven's Alternating Gradient Synchrotron into targets, creating particles that eventually decay into muons. The muons, whose spins are all in the same direction, are fed into the field of a 14-meter-wide superconducting magnet. The field forces the particles to race in a circle and causes them to twist.

    By analyzing the products of the muons' decay, the physicists can infer how much the muons twisted in the magnetic field, revealing their magnetic moment with an error of about 1.3 parts per million. The results show that the theoretical value and the experimental value disagree by about 4 parts per million. Statistically, the two values clash at the 2.6-sigma level of significance—a suggestive, but not definitive, result.
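The quoted 2.6-sigma figure is straightforward error propagation; a minimal sketch (the 0.8 ppm theory uncertainty is an assumed value, since the article states only that it is "less than 1 part in a million"):

```python
# Combine the independent theory and experiment uncertainties in quadrature,
# then divide the theory-vs-experiment gap by the result to get sigma.
discrepancy_ppm = 4.0   # gap between measured and predicted moment (article)
exp_err_ppm = 1.3       # experimental uncertainty (article)
th_err_ppm = 0.8        # theory uncertainty (assumption, under the 1 ppm bound)

combined_ppm = (exp_err_ppm**2 + th_err_ppm**2) ** 0.5
sigma = discrepancy_ppm / combined_ppm
print(f"combined error ~{combined_ppm:.2f} ppm, significance ~{sigma:.1f} sigma")
```

With these inputs the combined error is about 1.5 ppm, reproducing the reported 2.6-sigma tension.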

    If the measurement holds up, the discovery will reveal a major flaw in the Standard Model. Physicists might respond by adopting an expanded model that includes new particles whose interactions affect the muon's magnetic moment. The best candidate thus far is supersymmetry, a theory that links the particles that make up matter (fermions) with those that carry forces (bosons) by providing every known particle with a still-undiscovered twin.

    Most physicists would agree that a 2.6-sigma result is too weak to signal the beginning of the supersymmetric era. In high-energy physics, much firmer five- and six-sigma results fall through with alarming frequency (Science, 29 September 2000, p. 2260). But Lee Roberts, a Boston University physicist working on the experiment, says the Brookhaven result is less susceptible to statistical flukes than a standard particle hunt is because it comes from a more narrowly focused search. Furthermore, he notes, the Brookhaven team is still crunching numbers. Enough data remain on tape to reduce the error to half a part per million, Roberts says, and another year's run will bring the value down to one-third of a part per million. That would likely be enough to trumpet the failure of the Standard Model, if the discrepancy remains.
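If the 4 ppm central discrepancy survives, the error reductions Roberts projects would sharpen the significance considerably. A hypothetical projection, again assuming a 0.8 ppm theory uncertainty (not stated in the article) and quadrature combination of errors:

```python
# Project significance of a fixed 4 ppm discrepancy as the experimental
# error shrinks: current run, data still on tape, and one more year's run.
discrepancy_ppm = 4.0
th_err_ppm = 0.8  # assumed theory uncertainty

for exp_err_ppm in (1.3, 0.5, 1 / 3):
    combined = (exp_err_ppm**2 + th_err_ppm**2) ** 0.5
    print(f"exp error {exp_err_ppm:.2f} ppm -> {discrepancy_ppm / combined:.1f} sigma")
```

Under these assumptions the tension would grow past 4 sigma, which is why the team regards the remaining data as decisive either way.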

    But Roberts acknowledges that this week's result stops far short of such a clarion call. “I would not claim discovery,” he says. “But I would claim that it is very interesting and provocative.” And it is indicative, one can predict to a high value of sigma, that more riveting announcements from Brookhaven are still to come.


    Danger to Peer Review Is in Eye of Beholder

    1. Vladimir Pokrovsky*
    * Vladimir Pokrovsky is a writer in Moscow.

    MOSCOW—Is peer review in Russian science on the way out? It depends on whom you ask. The heads of two foundations that follow a Western-style peer-review system have very different views on the impact of management changes designed to reduce the foundations' freedom.

    “It would be a step back, like from a human being to an ape,” contends Yevgeni Semyonov, director-general of the Russian Humanitarian Sciences Foundation (RHSF), which funds social science. But Mikhail Alfimov, head of the Russian Foundation for Basic Research (RFBR), which funds natural sciences, argues that the new structure won't harm peer review.

    Last year, conditions improved slightly for Russian scientists, who for the first time in more than a decade received their regular paychecks each month. But the paltry salaries don't include money to operate their labs. In addition to tapping into Western funds to buy equipment and supplies, scientists also compete for grants from the two Russian foundations. Although the grants aren't for huge sums—about $3500—they provide young scientists with work and the opportunity to travel to conferences and field sites within Russia.

    The foundations are also sterling examples of the positive influence of the West on Russian science. Set up by the government in the early 1990s, the two organizations employ a peer-review system modeled after that of the U.S. National Science Foundation. Each Russian foundation has a scientific council that arranges for Russian experts to review and make decisions on grant proposals. Separately, each foundation has a directorate that manages its affairs, from administering the grants to interacting with the government.

    Last summer, however, the Putin Administration decided that the foundations enjoyed too much freedom. Although both distribute government funds, they operate independently from the state. By converting each foundation into a state establishment, the government can require its scientific councils to become subordinate to the directorates.

    Such an arcane change would wreak major damage on the peer-review system, asserts Semyonov. “The directorate [would] obtain the right to interfere in the decision on the grants distribution,” he says. “This would lead, if not to the complete undoing of the institution of independent evaluation, then at least to its substantial deterioration.” Semyonov says that he speaks for many scientists who serve on expert panels or who consult for the RHSF.

    Alfimov disagrees. In a recent position paper, he insisted that the change in status is merely a formality. “No one,” he wrote, “will encroach on the basic principles of the foundation.” Alfimov, who declined to be interviewed for this article, warned Semyonov “not to scare the scientific community.”

    Alfimov has support from some rank-and-file scientists. Alexei Ryskov, a biologist at the Institute of Gene Biology in Moscow who has served on RFBR review panels, doubts that changing the status of the foundation will corrupt the peer-review system—if only because the grants it hands out are so small. “If we [were speaking] about, say, a $50,000 grant, that might be different,” he says.

    Just who's right should be determined in the coming weeks. Alexandr Dondukov, minister of industry, science, and technologies, has granted a reprieve to the humanities foundation, which plans to spend about $600,000 on grants this year. The natural sciences' body, meanwhile, has dutifully filed the papers for the change in status. Now observers are watching to see whether the science council will be able to spend the RFBR's $3.5 million in grant money this year before it must report to a new boss.


    NRC Panel Pokes Holes In Everglades Scheme

    1. Jocelyn Kaiser

    An expert panel that has taken a first cut at reviewing the controversial $7.8 billion Everglades restoration plan is sounding a note of caution about one of its essential elements. In a report released last week,* the National Research Council (NRC) panel raises concerns about the planned use of wells drilled in southern Florida's Upper Floridan aquifer—a vast, porous layer of limestone—as storage tanks to regulate water levels in the region. “There are significant uncertainties associated with aquifer storage, and you have to answer the questions,” says aquifers subcommittee chair Jean Bahr, a hydrogeologist at the University of Wisconsin, Madison.

    The panel looked at a crucial piece of the Everglades restoration plan—storing water in aquifers. The plan, drawn up by the U.S. Army Corps of Engineers and other federal and state agencies, would attempt to help ailing wildlife by restoring the natural flows of fresh water that once stretched from Lake Okeechobee south to the Everglades. But rather than undoing all the canals, levees, and dikes that have diverted the water to agriculture and cities, the idea is to pump the water they carry into more than 330 wells drilled in the Florida aquifer and then release it during dry spells. Congress approved a down payment of $1.4 billion for the 20-year restoration last fall, and the Army engineers plan to begin two pilot projects in 2003 that will involve drilling test wells and collecting water-quality and geological data.

    But some ecologists and hydrologists have taken issue with the idea, arguing that it would be better to remove more barriers to natural water flows. The NRC's Committee on the Restoration of the Greater Everglades Ecosystem (CROGEE) was set up a year ago in response to such criticisms.

    In its first report, a review of two planned aquifer pilot projects, the CROGEE notes that the overall aquifer plan, which would require storing up to 6.3 million cubic meters of water per day, is “unprecedented” in scale. Showing that it will work “will require studies that go beyond the scope of the proposed … pilot projects,” the report says. It urges agencies to go forward with a proposed regional modeling study of how the new wells would affect the aquifer. The panel also recommends that more data be collected during the pilot projects, including studies of whether storing water in the aquifer would degrade the water quality, leading to harmful effects on ecosystems.

    Corps officials say they've already begun responding to these comments since they were aired at a CROGEE workshop last fall. “It's a good report. It gives us some good guidance,” says Stu Applebaum, ecosystem restoration chief for the corps in Jacksonville, who says the agency expects to carry out the regional study when the pilot projects begin.

    Those data, however, will be only a beginning. CROGEE's first report notes that more analysis, including an assessment of energy costs and evaporation rates, is needed to determine whether aquifer storage is preferable to surface storage. And the committee is now reviewing the environmental measures that will be used to assess whether restoration is working.


    Caltech Picks Insider to Lead JPL

    1. Andrew Lawler

    The Jet Propulsion Laboratory (JPL) in Pasadena, California, has had a hard time of it in the past 18 months. Two prominent Mars failures sullied its formidable reputation, its longtime director stepped down, and its dominant position in solar system exploration is being threatened by NASA's decision to open up a Pluto mission to competition.

    Propelling JPL.

    New Director Charles Elachi says he's “not afraid of competition.”


    But the center soon will have a new leader who promises to confront the lab's troubles. Charles Elachi, now chief of space and earth science programs at JPL, will take over as director on 1 May. “I'm not afraid of competition,” he said at a 31 January press conference. He said he plans to spend the interim developing a plan to handle a bevy of smaller missions within NASA's constrained budget.

    The decision by the California Institute of Technology (Caltech), which runs the lab under contract from NASA, surprised many in the space science community who assumed the job would go to an outsider. But Elachi's knowledge of the lab, scientific credentials, and vision vaulted him to the top of a list of 74 candidates. “We're not merely anointing a prince here,” said Caltech President David Baltimore. “He simply provided insurmountable competition.”

    Elachi has had a management role in the Mars program, which had to be revamped after the failures of the Mars Polar Lander and the Mars Climate Orbiter in late 1999. But several NASA officials and space scientists say that Elachi's responsibility for those failures was minimal. “He was the fall guy,” adds one senior researcher who knows the lab well. The departing director, Ed Stone, was blamed by NASA officials for failing to speak out more publicly about the lab's responsibility. Stone, age 65, will return to teaching at Caltech after 10 years as head of the lab.

    The 53-year-old Elachi studied physics, geology, and business administration and earned his electrical engineering doctorate from Caltech before joining the lab in 1971. He helped develop a series of radar instruments used on the space shuttle that revealed archaeological sites just under Earth's surface in Egypt, China, and Saudi Arabia. He also is team leader for a radar experiment on the Cassini mission to Saturn.

    At the press conference, NASA Administrator Dan Goldin told Elachi exactly what he must do: “Figure out how to double the number of missions with a similar workforce—and maybe double that again.” One former NASA official who has worked with Elachi says he's up to the task. “Elachi is smart, competent, and adept at giving Goldin what he wants,” says the official.

    Elachi says he's “very confident” he can do more with less. He's also hoping that the launch of a new Mars orbiter in April, and additional Mars missions in 2003, will boost morale at the battered lab.


    Nuclei Crash Through The Looking-Glass

    1. David Voss

    Gloves do it. Toupees do it. Even twists of DNA do it. And now, for the first time, physicists have discovered that atomic nuclei come in right- and left-handed models, too. In the 5 February issue of Physical Review Letters (PRL), a team of researchers from the State University of New York (SUNY), Yale, the University of Tennessee, and Notre Dame reports observations of rapidly spinning nuclei morphing into mirror-image forms. In the process, the physicists also uncovered solid evidence that a long-disputed feature of nuclear anatomy really does exist.

    “These results are causing quite a stir among nuclear structure physicists,” says Rod Clark of Lawrence Berkeley National Laboratory in California. Although more work is needed to nail down the conclusions, Clark says, “it is tremendously difficult to come up with an alternative interpretation” of the findings.

    The discovery springs from work by Stefan Frauendorf, a nuclear theorist at the University of Notre Dame in Indiana. In 1997, Frauendorf and colleagues were exploring the possible properties of atomic nuclei with a hypothetical feature called triaxial symmetry. In theory, nuclei can have varying degrees of symmetry, from spherical to ellipsoidal to triaxial, depending on how the neutrons and protons arrange themselves. An ellipsoidal nucleus resembles an American football; a triaxial nucleus is similar, but squashed. “It actually looks a bit like a kiwi fruit,” says Krzysztof Starosta, lead author of the PRL paper and a visiting professor at SUNY, Stony Brook.

    Frauendorf, also a co-author of the PRL paper, suggested that certain triaxial nuclei should come in left- and right-handed varieties. His calculations showed that the development of handedness, which physicists call chiral symmetry breaking, should occur in rapidly rotating “odd-odd” nuclei—those containing both an odd number of neutrons and an odd number of protons.

    Much as electrons in an atom pair up to form shells surrounding the nucleus, the protons and neutrons in the center of the atom pair up, like with like, to create their own structures inside the nucleus. In an odd-odd nucleus, however, one neutron and one proton are left over. In some cases, these “valence nucleons” orbit at right angles to each other outside the nuclear core made up of the other protons and neutrons, just as valence electrons whiz around the electronic shells of an atom. Meanwhile, the core is spinning, too (see figure).

    Doubles match.

    In some atomic nuclei with triaxial symmetry, a lone proton and neutron whizzing around a whirling nuclear core give different nuclei mirror-image values of momentum.


    According to Frauendorf, those three motions—of the two valence nucleons and the triaxial core—should create a chiral effect. Added together, they give the nucleus its overall momentum. But because the core can spin in either of two directions with respect to the orbiting particles, the overall momentum can take on two different values, too. Those values, Frauendorf said, establish the left-handed and right-handed states.

    The catch was that nobody knew whether triaxial nuclei really exist. Nuclei with three distinct axes of symmetry had been predicted in the 1960s and hotly debated ever since, but no one had definitively observed one. Some physicists suspected that the triaxial shape might be a fleeting oscillation of the nucleus, too unstable to have a measurable effect.

    To find out, Starosta and his collaborators looked at gamma rays, a kind of radiation that atomic nuclei emit after being excited to high-energy spin states. If the nuclei were triaxial and were undergoing chiral symmetry breaking, the gamma rays ought to cluster into pairs of closely related frequencies known as doublets—evidence that the energy levels of the nuclei had split into pairs of right- and left-handed states.

    The collaborators focused their efforts on odd-odd nuclei of cesium, lanthanum, praseodymium, and promethium. Using accelerators at SUNY and Yale, they shot beams of heavy ions—carbon, boron, and magnesium—into targets of tin and antimony. The smashups initiated fusion reactions that created excited nuclei of just the right type and pumped them up to the right spin states. As the nuclei settled down, they emitted a panoply of gamma rays with various energies. The telltale clustering was there; by sorting out doublets, the physicists confirmed the existence of chiral symmetry breaking.

    The next step, Starosta says, is to see whether odd-odd nuclei of other elements also form mirror images: “We started probing the nuclei around atomic mass 130 because the theory pointed us there, but we will be searching in other mass regions now.” Clark says understanding how these complex nuclear structures behave may spill over to other fields as well. “The ideas and methods for understanding nuclei, molecules, metallic clusters, and atomic condensates all feed off of each other,” he notes.


    Stratospheric 'Rocks' May Bode Ill for Ozone

    1. Richard A. Kerr

    It looks as if the infamous antarctic ozone hole—the springtime thinning of the protective stratospheric layer—has reached its natural limits, but atmospheric chemists worry about the ozone over the Arctic. It, too, thins in springtime as the rising sun spurs the chlorine and bromine from human-made chemicals to destroy ozone. Although no real “hole” has opened over the Arctic, that could change in the coming greenhouse era. And on page 1026 of this issue of Science, a team of researchers studying the Arctic stratosphere reports the discovery of bizarre particles in clouds there that could make arctic ozone more vulnerable to changing climate.

    The discovery came during a January 2000 research flight nearly to the North Pole, report atmospheric chemist David Fahey of the National Oceanic and Atmospheric Administration (NOAA) in Boulder, Colorado, and 26 colleagues from more than a dozen institutions in North America and Europe. An instrument on NASA's high-flying ER-2 coughed out what looked like disastrous noise as it measured nitrogen-containing gases.

    Pretty but dangerous.

    Polar stratospheric clouds with “rock” particles may destroy more arctic ozone.


    On closer inspection, however, the noise turned out to be humongous particles containing nitric acid. The polar stratospheric cloud (PSC) particles that form in the extreme cold of polar winter generally run a few tenths of a micrometer to a micrometer in diameter and are made of water and nitric and sulfuric acids. But these new, oversized particles, more than 3000 times the usual mass, ranged from 10 micrometers to 20 micrometers. “These things are rocks” compared with the usual PSC particles, says Fahey. They are most likely made of solid nitric acid hydrates. Atmospheric chemists aren't sure yet how they formed, says Fahey; “no one was expecting to see” such large particles.
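The "more than 3000 times the usual mass" figure is consistent with mass scaling as the cube of diameter; a quick check (assuming similar particle density and a baseline diameter of about 1 micrometer, the upper end of the usual range):

```python
baseline_um = 1.0  # assumed typical PSC particle diameter

# For similar density, mass scales with volume, i.e. with diameter cubed.
for rock_um in (10.0, 15.0, 20.0):
    ratio = (rock_um / baseline_um) ** 3
    print(f"{rock_um:.0f} um rock ~ {ratio:,.0f}x the usual particle mass")
```

A mid-range 15-micrometer rock comes out at roughly 3400 times the baseline mass, matching the article's figure.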

    The discovery of PSC rocks is catching researchers' attention because of their potential role in removing nitrogen from the stratosphere. All PSCs provide surfaces where chlorine and bromine can be liberated from their inactive forms to enter their ozone-destroying forms. But PSCs that contain nitric acid can also play an indirect role in ozone destruction by taking nitrogen out of circulation. Because nitrogen can tie up chlorine and bromine in inactive, harmless forms, when it becomes tied up in PSCs—a process called denitrification—more chlorine and bromine can remain active to destroy ozone. And because PSC rocks fall far faster than feathery-light PSC particles, the rocks can efficiently ferry nitrogen out of some layers of the stratosphere.

    Researchers had found denitrified air over the poles before, but there was no evidence to prove how it was happening. The discovery of PSC rocks is “the first time we've clearly seen reactive nitrogen being stripped from the polar arctic stratosphere,” says James Anderson of Harvard University. The rocks are so big that they sink 1.5 kilometers per day compared with 0.1 kilometer per day for ordinary PSCs, Fahey and his colleagues calculate. Although relatively rare, the particles are so massive that in only a few days, their sinking could have removed about half the nitric acid above an altitude of 20 kilometers. “The removal is more widespread than we expected,” says Anderson. Last year it seemed to cover an area the size of the United States.
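The "only a few days" removal estimate follows directly from the quoted fall speeds; a rough sketch (the 5-kilometer layer depth is an illustrative assumption, not a figure from the article):

```python
rock_fall_km_per_day = 1.5      # sinking rate of PSC "rocks" (article)
ordinary_fall_km_per_day = 0.1  # sinking rate of ordinary PSCs (article)
layer_km = 5.0                  # assumed depth of the layer above 20 km

# Time for a particle to sediment out of the layer at each rate.
print(f"rocks:    {layer_km / rock_fall_km_per_day:.1f} days")
print(f"ordinary: {layer_km / ordinary_fall_km_per_day:.0f} days")
```

At the rocks' rate the layer clears in a few days; ordinary PSC particles would take well over a month, too slow to denitrify the stratosphere before the clouds evaporate.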

    “Now the most important question is how this system will respond to greenhouse loading,” says Anderson. Fahey and his colleagues point out that the atmospheric models used to simulate denitrification are now obviously inaccurate, because they don't properly account for rocks. By allowing more chlorine and bromine to remain in their active forms, denitrification could help keep ozone destruction going over the Arctic even after PSCs disappear in the spring. And increased greenhouse gases would likely enhance that process: Although they keep heat in lower down, greenhouse gases cool the stratosphere by radiating heat to space, encouraging the cold that leads to denitrification. Also, levels of stratospheric water vapor are rising along with greenhouse gases, which would also encourage the formation of the water-rich PSC rocks.

    Denitrification “doesn't portend massive ozone losses” like the Antarctic's, notes Susan Solomon of NOAA, Boulder. But Fahey and his colleagues do note that it could delay recovery of ozone over the Arctic as chlorine and bromine emission controls take effect. “If you want to calculate the response of the Arctic to climate change,” says Fahey, “the existence of these large particles says you need a rather sophisticated model.” The next step in building such a model will be figuring out how rocks start to grow in the first place.


    Progress for the 'Mouse Gene Encyclopedia'

    1. Dennis Normile

    TOKYO—The imminent publication of the draft sequences of the human genome will be a major milestone in the history of biology. But many researchers regard having the sequence as the first, not the last, step in understanding how the genome works. It's also necessary to identify the genes, which constitute only a small fraction of the human genome sequence, and determine what they do. An international consortium led by Yoshihide Hayashizaki of the RIKEN Genomic Sciences Center in Yokohama, Japan, has now provided the first installment of what promises to be a key resource for filling this gap.

    Clone collector.

    Yoshihide Hayashizaki leads the effort to collect full-length cDNA clones of all the mouse genes.


    Beginning in 1995, the group set out to produce a complete set of complementary DNA (cDNA) copies of all the mouse genes transcribed into messenger RNAs—that is, of all the active genes of the mouse genome. In this week's issue of Nature, the researchers report that they have sequenced and analyzed more than 20,000 full-length mouse cDNAs—one of the largest such collections for any organism. Because some genes were represented by more than one cDNA, the team estimates that they now have in hand cDNAs for nearly 13,000 different mouse genes—about 20% of the total encoded by the mouse genome. The consortium expects to complete the project by spring 2002.
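The "about 20%" figure implies the working estimate of the total mouse gene count behind the consortium's arithmetic; a back-of-envelope check:

```python
distinct_genes_with_cdna = 13_000
fraction_of_total = 0.20  # "about 20% of the total" per the article

# Invert the fraction to recover the implied total gene count.
implied_total_genes = distinct_genes_with_cdna / fraction_of_total
print(f"implied total mouse genes ~ {implied_total_genes:,.0f}")
```

That works out to roughly 65,000 genes, in line with gene-count estimates in circulation at the time.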

    Hayashizaki says the consortium set out to produce this “mouse gene encyclopedia,” as he calls it, partly to aid in identifying the genes in the human and mouse genome sequences. Current methods of finding genes in genomic sequences, chiefly using computer prediction programs, are proving inadequate, missing genes and incorrectly identifying where gene coding regions start and stop. But having full-length cDNA copies of the genes makes it much easier to spot the genes in the genomic sequences.

    This information should be valuable for understanding the human genome sequence as well, because many of the mouse and human genes are likely to be similar. Indeed, the public group sequencing the human genome is interested in collaborating with Hayashizaki in using the mouse cDNA data to identify genes in the human genome. “It is a very important resource,” says Robert Strausberg, director of the Cancer Genomics Office of the U.S. National Cancer Institute in Bethesda, Maryland. “Hayashizaki is really to be congratulated; they are at the forefront of this work.”

    In addition to aiding in gene identification efforts, the cDNA clones can also be used to construct the microarrays now coming into widespread use for profiling gene expression patterns and for synthesizing the proteins themselves. “This is exactly what scientists would like to have in their labs: the ability to produce proteins and study their functional characteristics,” says Strausberg, who is overseeing a U.S. effort to produce cDNA clones.

    A number of U.S. university and public sector labs have already acquired mouse cDNA clones from RIKEN. Although RIKEN retains some rights over the material, a researcher at one institution says the usage agreement is similar to the ones at most U.S. universities.


    New Program Draws Praise, Complaints

    1. Min Ku*
    * Min Ku is a science writer based in Bern. With reporting by Robert Koenig.

    BERN—A major program launched last month to boost the fortunes of a select group of top Swiss labs has ignited a controversy in the scientific community. Whereas the winners say it will provide much-needed opportunities for networking, some observers complain that the program is elitist. And social scientists are unhappy about being excluded from the winner's circle.

    Things got off to an auspicious start when 230 research groups across the country submitted preliminary proposals in 1999 to become one of the government's new National Centres of Competence in Research (NCCR). The Swiss National Science Foundation (SNSF), which will administer the grants, chose 18 finalists and submitted the list to the Science and Research Group (SRG), part of the Federal Department of Home Affairs. The SRG chose 10 winners, each of which will receive between $1.7 million and $3.1 million a year for the next 3 years (see table). The $77 million is expected to be supplemented with funding for an additional 5 to 9 years as well as new money to fund a second round. The new centers, says SRG State Secretary Charles Kleiber, “are an instrument to reshape the Swiss research landscape.”


    The grantees were chosen according to several criteria, including scientific merit, their prospects for transferring the technology to industry, and their plans to train young scientists and advance women in the field. The awards are also meant to promote collaborations among Swiss scientists from various disciplines. The projects “provide a formal structure to put together complementary expertise from all over Switzerland,” says computer scientist Gábor Székely of the University of Zürich, leader of an NCCR grant for computer-aided medicine. According to University of Geneva mouse geneticist Denis Duboule, another NCCR grantee, countrywide collaborations are needed in his field. With the advent of genomics and proteomics, he says, “most of the work in the next 15 years cannot be done by small groups.”

    But although the grants are meant to pull researchers together, they're sowing plenty of division as well. Critics contend that the NCCR, a successor to a smaller program that funded a handful of research networks, is making the rich richer. Grantees, they say, are already well known and have little trouble obtaining research funds. By rewarding established groups, the NCCR is taking “a big step back in our efforts to give free rein to young researchers,” says Gottfried Schatz, president of the Swiss Science and Technology Council. The program also puts a premium on size, adds Ernst Hunziker, director of the Müller Institute at the University of Bern, whose proposal for an NCCR grant didn't make the final cut. “It discriminates strongly against groups that are small,” he says.

    Swiss social science and humanities researchers have their own beef with the program. Of the 18 projects recommended by the SNSF to the SRG, three were in the social sciences. None of these was funded, despite the government's previous declaration of social sciences and humanities as one of four priority areas. On 21 December, more than 300 university students, postdocs, and professors sent a letter of protest to Kleiber and his boss, Home Affairs Minister Ruth Dreifuss. The next day, the general secretary of the Swiss Academy of Humanities and Social Sciences, Beat Sitter-Liver, joined two other leaders of Swiss social science organizations in sending another protest letter to Dreifuss. “Social sciences has been relegated to second-class citizenship,” contends Keith Krause, a political scientist at the University of Geneva.

    Kleiber, also a political scientist by training, says he hopes that parliament will correct that situation this summer by funding four additional centers, one of which is in social sciences. A second (and probably final) call for proposals is expected next year, but some social scientists may pass unless “we are assured that the process will be fair,” says University of Geneva sociologist Christian Lalive d'Epinay.

    The next time around, Kleiber would like to see a more focused call for proposals and a shorter review process. SNSF Secretary General Hans Peter Hertig acknowledges that there is “certainly room for improvement” in the evaluation procedure, but he disagrees with Schatz's assertion that NCCR will weaken the foundation's support for individual researchers. Individual grants still account for 80% of SNSF's funding.

    For Duboule and others involved with the new centers, the program is not just about allocating resources. It's also part of an ongoing effort to improve Swiss science. “We just have to try it and see how it works,” Duboule says. “We have to do something; we can't just do nothing.”


    Loopy Solution Brings Infinite Relief

    1. Charles Seife

    Alexander the Great would sneer. Twenty-three centuries after he slashed through the Gordian knot, mathematicians have finally made their first stab at figuring out how long it takes to untangle a tangle. The unheroic answer, for one key class of knots, is “not forever”—and even that comes with a huge string attached. Still, knot researchers are delighted.

    Circle game.

    In time, three basic maneuvers will reduce a tangle like the one at left to a simple loop.


    “People have always wondered if [the unknotting process] was unbounded, and people now know that it is bounded,” says Joan Birman, a knot theorist at Barnard College in New York City. “It was a very big problem.”

    For years, knot theory itself has been in such a tangle that even the most fundamental problems in the field still loom large. “If I hand you a knot, you would hope for some method of identifying it,” says William Menasco, a knot theorist at the State University of New York, Buffalo. “The classical problem in knots is to have a complete, nonredundant list of knots, and if you give me a knot, I can standardize it and pick out where in the list it lives.” Unfortunately, mathematicians aren't certain that they can do that even for the simplest knot of all: a mere loop of string, also known as the unknot.

    This isn't to say that knot theorists have made no progress. For instance, as far back as the 1920s, they had figured out that only three types of motion—the so-called Reidemeister moves—are needed to untangle any knot into a standard, recognizable form. But even with that armamentarium, mathematicians didn't see an obvious way to do the untangling; sometimes the Reidemeister moves made things worse. “They were hoping that the set of moves would make the knot simpler at every step,” says Jeffrey Lagarias, a mathematician at AT&T Labs in Florham Park, New Jersey. “The Reidemeister moves do not.” As a result, there is no easy way to tell how many moves it would take to untangle a given loop of string—or even whether any finite number could do the job.

    Lagarias and Joel Hass of the University of California, Davis, sought to guarantee that unknotting the unknot, at least, does not take forever. Instead of looking at an unknot as a twisted-up loop of string, they treated it as the boundary of a crumpled and distorted disk. They then performed the disk equivalents of Reidemeister moves and translated the results back into the mathematical language of knots. Their conclusion: The number of Reidemeister moves required to untangle any given twisted-up unknot is finite.

    Finite numbers, however, can still be ridiculously large. All Lagarias and Hass guarantee is that if a knot crosses itself n times, you can untangle it in no more than 2^(100,000,000,000n) Reidemeister moves. In other words, if every atom in the universe were performing a googol googol googol Reidemeister moves a second from the beginning of the universe to the end of the universe, that wouldn't even approach the number you need to guarantee unknotting a single twist in a rubber band.
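    To get a feel for just how hopeless that bound is, here is a rough back-of-envelope check in Python. The universe-scale numbers (about 10^80 atoms, about 10^18 seconds) are conventional rough estimates, not figures from the article, and the comparison works in log10 because the bound itself is far too large to write out:

```python
import math

# Hass and Lagarias's bound: untangling an unknot that crosses itself
# n times takes at most 2**(100_000_000_000 * n) Reidemeister moves.
# The number is far too large to compute directly, so work in log10.

def log10_bound(n):
    """log10 of the Hass-Lagarias upper bound for n crossings."""
    return 100_000_000_000 * n * math.log10(2)

# A deliberately generous tally of moves the universe could ever make:
# ~10**80 atoms, each doing a googol googol googol (10**300) moves per
# second, over ~10**18 seconds (roughly the age of the universe).
log10_universe_moves = 80 + 300 + 18

print(log10_bound(1))        # ~3.0e10 -- the bound for ONE crossing
                             # has about 30 billion digits
print(log10_universe_moves)  # 398
```

    Even for a single crossing, the bound has tens of billions of digits, while the universe-scale tally has about 400.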

    “The [bound] is, of course, enormous and hopeless,” Lagarias acknowledges. Still, he says, just showing that a limit exists may inspire future researchers to whittle it down to a reasonable size. (Macedonian swordsmen need not apply.)


    Scientists Begin Taming Killer Lake

    1. John Pickrell

    CAMBRIDGE, U.K.—In an unprecedented and potentially risky experiment, scientists this week began venting carbon dioxide-laden water from the bottom of Lake Nyos in Cameroon. The apparatus is intended to prevent a recurrence of a 1986 eruption that claimed 1800 lives.

    The carbon dioxide is released from surrounding volcanic sediment and underground springs. Experts say that about 300 million cubic meters of the gas have accumulated in the waters near the lake's bottom, about 200 meters below the surface. In 1986, an estimated 80 million cubic meters of gas escaped from the lake and flowed down the hillside, smothering residents and livestock in what's known as a “limnic eruption.”

    Hoping to prevent future such disasters, a team headed by Michel Halbwachs of Savoie University in Chambéry, France, has floated a 3-meter-wide raft in the middle of the lake with a 200-meter-long polyethylene pipe running to the lake bottom. As water rises through the pipe, the drop in pressure lets the carbon dioxide come out of solution, and the expanding gas drives the flow upward. At the surface, 10 liters of carbon dioxide are released for every liter of water. The result is “a jet 40 meters in height, which can continue without any further input of energy,” says team member Gaston Kayser, speaking from Lake Nyos. “We are very happy with the results.”
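    A quick sanity check, using only the two figures quoted in the article, shows the scale of the task ahead:

```python
# Rough arithmetic from the figures in the article.
gas_total_m3 = 300e6       # CO2 accumulated in the lake's bottom waters
gas_per_liter_water = 10   # liters of CO2 released per liter of water lifted

# Volume of deep water that must be brought to the surface to vent it all
# (the ratio is per liter, so it applies to cubic meters unchanged):
water_needed_m3 = gas_total_m3 / gas_per_liter_water
print(water_needed_m3)     # 30 million cubic meters
```

    Venting all 300 million cubic meters of gas means lifting some 30 million cubic meters of deep water, which is why the team plans several more pipes.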

    Water safety.

    Carbon dioxide is gushing from Lake Nyos thanks to equipment first tested in 1995.


    While applauding the effort, some experts remain cautious. “Everybody is in favor of any attempt to degas the lake. … This is a great idea,” says Sam Freeth of the Geological Hazards Research Unit at the University of Wales in Swansea, U.K. But he says there are “major risks involved” if the experiments are scaled up. Freeth worries that the movement of large quantities of cold, dense water resulting from the removal of the carbon dioxide could generate currents that would trigger another limnic eruption. Freeth urges the French-led team to publish its data so others can review them.

    The team intends to install four or five more pipes over the next year to degas the lake to acceptable levels. The team will return to France next week and continue to monitor the lake via satellite. Local authorities can turn off the apparatus, which will otherwise run indefinitely, at the first sign of a pending eruption.


    Panel Seeks Truth in Lie Detector Debate

    1. Constance Holden

    An expanded polygraph screening program at U.S. nuclear weapons labs begun in the wake of suspected espionage has heated up the perennial debate over the validity of lie detectors. And if testimony at the first meeting last week of a new National Academy of Sciences panel examining the thorny issue is any guide, the truth will be hard to come by. Researchers are, however, exploring alternative technologies, including the use of brain and thermal imaging, to identify what happens in the brain when people lie.

    The academy study is funded by the Department of Energy and follows the flap over Wen Ho Lee, a computer scientist at Los Alamos National Laboratory in New Mexico who pled guilty to mishandling classified information after facing allegations of more sinister activities (Science, 15 September 2000, p. 1851). The $860,000 study is the first major government-sponsored polygraph study since a 1983 report by the Office of Technology Assessment (OTA) concluded that polygraphs are not an effective scientific method to check for security breaches. A 2-day meeting in Washington, D.C., made it clear why panelists expect the job to take no less than 21 months. “I heard a major disconnect between what different people were saying,” says study director Paul Stern.

    The panel, headed by statistician Stephen Fienberg of Carnegie Mellon University in Pittsburgh, was confronted immediately with seemingly irreconcilable testimony. Officials from the Energy and Defense departments touted the successes of their programs, while a Department of Energy physicist claimed that polygraph screening does more harm than good. What's more, the panelists heard testimony that experiments aimed at establishing the validity of the polygraph as a generalized screening instrument may be unreliable.

    This month, for example, the Department of Defense (DOD) is starting a screening validation study involving 120 subjects—recruited through newspaper ads—some of whom have been trained to pretend they have committed espionage. But panelist Paul Ekman, a psychologist at the University of California, San Francisco, said in a written statement that such research won't yield solid results until the primed subjects are playing for “high stakes”—such as loss of a job.

    Other participants questioned the reliability of polygraph use in personnel-screening (as opposed to criminal) cases, because the low base rate of miscreants results in an unacceptably high number of false positive readings. This situation, said Alan P. Zelicoff, a physicist at the Center for National Security and Arms Control at Sandia National Laboratories in Albuquerque, New Mexico, has led to “tremendous cynicism and doubt about the utility of the test in both management and technical staff.” He predicts that almost all those who fail the current round of polygraph tests being given at the three national labs will later be found to have been truthful—but with lasting damage to already-low morale.
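    Zelicoff's base-rate point is a standard Bayesian argument. A minimal sketch, with invented but plausible numbers (the true error rates of polygraph screening are exactly what is in dispute):

```python
def ppv(base_rate, sensitivity, false_positive_rate):
    """Chance that someone who fails the screen is actually deceptive."""
    true_pos = base_rate * sensitivity
    false_pos = (1 - base_rate) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Hypothetical figures: 1 miscreant per 1,000 employees, a test that
# catches 90% of liars but wrongly flags 10% of honest people.
print(ppv(0.001, 0.90, 0.10))  # ~0.009: over 99% of failures are innocent
```

    With a miscreant base rate of one in a thousand, even a quite accurate test flags roughly a hundred truthful employees for every genuine spy, which is the arithmetic behind Zelicoff's prediction.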

    Out of line.

    Output from a polygraph may fail to detect real emotions behind lying.


    Even practitioners acknowledge that the validity of polygraph tests relies heavily on factors not related to the instrument, such as the training of polygraphers and the nature of the screened population. There is thus growing interest in alternative types of technology. Panel member Richard Davidson of the University of Wisconsin, Madison, says that new approaches now have abundant brain research to draw from—knowledge that didn't exist at the time of the OTA report.

    First introduced in the 1920s, the polygraph machine measures four parameters—heart rate, blood pressure, respiration, and sweating. But that physiological quartet doesn't get at what Davidson says is presumably the emotion being measured, namely, “fear of detection.” For that, he says, researchers must go straight to the brain: “And if there's one emotion that we have really learned a lot about in the last decade, it's fear.”

    Animal studies have shown that fear is particularly associated with a brain region called the amygdala. That finding is also borne out by human brain imaging studies using functional magnetic resonance imaging (fMRI) on subjects exposed to facial expressions of emotion. Fear elicits the strongest activation of the amygdala, he says, and it looks quite different from a more generalized anxiety response.

    So far there have been no fMRI studies for lie detection. But the DOD is reviewing outside proposals received in response to a broad solicitation for new ways to study the subject. Andrew Ryan, chief of research at the DOD Polygraph Institute, says an fMRI study would ask subjects to lie about something so that researchers can examine patterns of brain activation. “Deception requires more cognitive effort than truth,” Ryan points out, so you would expect not just fear but increased cognitive activity.

    The DOD is also doing research on thermal imaging, in which the temperature changes caused by variations in facial blood flow during lying are detected with an infrared camera. Other technologies being explored include the use of lasers to pick up muscular, circulatory, and other bodily changes in a process called “laser Doppler vibrometry”; the use of a new voice stress analyzer known as the Vericator; and the monitoring of brain waves, similar to a “brain fingerprinting” system currently in use commercially. All are noninvasive technologies “not even available 10 years ago,” says Ryan.

    Ryan and others hope that this research will usher in an era of new technologies far more adept at ferreting out the truth than what has been available for the past 80 years. “The future machine,” he predicts, “will look very different.”


    Keeping the Stygian Waters at Bay

    1. Dan Ferber

    If approved by Congress, an ambitious plan would seek to banish a recurring nightmare from the Gulf of Mexico: a seasonal dead zone that turns the sea floor into a graveyard

    URBANA, ILLINOIS—In the summer of 1999, billions of creatures suffocated in the northern Gulf of Mexico. The killing started in the spring, when the waters were gradually depleted of life-giving oxygen. By the time the carnage ended that autumn, a swath of sea the size of New Jersey had been stripped of much of its oxygen. Fish could flee the expanding “dead zone,” but bottom-dwellers such as crabs and snails couldn't crawl away and were suffocated. A similar horror has visited the gulf every summer for at least 20 years—but 1999 was the worst ever.

    The gulf's woes can be attributed primarily to the 1.6 million metric tons of nitrogen, much of it from midwestern farm fields, that wash out of the Mississippi and Atchafalaya rivers each year (Science, 10 July 1998, p. 190). No one has offered data showing that the dead zone has harmed gulf fisheries, as trawlers and other ships have coped by moving their operations farther out to sea. And although one economically important species—brown shrimp—is declining, it's not clear whether oxygen-poor, or hypoxic, waters are to blame. Still, a bigger or longer lasting dead zone could spell disaster. The northern gulf is “a time bomb” that could explode any year, devastating the fishing industry, says marine biologist Robert Diaz of the Virginia Institute of Marine Science in Gloucester Point.

    Getting to the bottom of it.

    Sediments reveal plankton abundance in the gulf in years gone by—a proxy for measuring hypoxic conditions that can strangle bottom-dwellers, like this crab (right).


    But a tonic for the gulf's ills may at last be at hand. After years of wrangling, the Clinton Administration in one of its final acts released a plan last month to shrink the dead zone by sopping up its trigger: excess nitrogen in the Mississippi Basin, which drains 41% of the continental United States. Senator John Breaux (D-LA) is planning to draft legislation to introduce key portions of the plan in Congress later this year. Although the measure would rely heavily on voluntary changes in farming practices, it could face rough sledding in a conservative Congress. And even if the plan is implemented, experts warn that the gulf's summertime blues could take more than a decade to cure. But many also feel that the measures would come in the nick of time. The northern gulf “is not too far gone,” says Diaz, who thinks that the waters run the risk of reprising conditions in the Black Sea in the 1970s and 1980s: hypoxia year-round. By choking off nitrogen at its source, Diaz says, “the gulf will turn around.”

    Raising hackles

    The dead zone forms each spring when nitrogen—primarily in the form of nitrates —dumped into the gulf by the Mississippi River triggers an ecological chain reaction. The nutrient spurs tiny marine plants to bloom in the surface waters. Minuscule animals called zooplankton feast on the plants, and their numbers soar. The profusion of life is more than fish and other predators can consume, so billions of dead zooplankton and their fecal pellets rain onto the sea floor. There, bacteria digest the remains, using up oxygen in the bottom waters in the process.

    Surface waters, constantly replenished by the Mississippi, stay relatively oxygen-rich. But this layer floats atop the saltier and heavier ocean water, so there's little mixing with the hypoxic bottom waters. Animals that can't swim away soon die.

    This seasonal phenomenon was first detected in the early 1970s by marine scientists assessing the environmental effects of pumping oil and natural gas in the gulf. It wasn't until 20 years later that a team led by marine ecologist Nancy Rabalais of the Louisiana Universities Marine Consortium in Cocodrie mapped the hypoxic zone (see sidebar on p. 970). Every month for 5 years, the researchers measured oxygen from the surface to the sea floor at nine sites on the continental shelf. Once a year, they took similar readings at more than 60 sites scattered throughout the northern gulf. Until their findings were published in 1991, no one had realized how expansive the dead zone was or that it came back every summer. Most provocative, however, was their conclusion about the dead zone's cause: Rabalais and marine ecologist Gene Turner of Louisiana State University in Baton Rouge argued that fertilizers from farmlands in the Mississippi River Basin were poisoning the gulf.

    In 1993, the Mississippi River flooded, and not coincidentally, Rabalais and Turner argued, the dead zone doubled in size from the previous year, to an area covering 17,500 square kilometers—twice as big as the Chesapeake Bay. (At its peak, the 1999 dead zone reached 20,000 square kilometers.)

    Rabalais sounded the alarm, prompting environmental groups to pressure the U.S. Environmental Protection Agency (EPA) in 1995 to stop upstream states from polluting the Mississippi. In response, the EPA held a conference and in 1997 established a task force of state and federal officials to recommend a course of action.

    While the EPA's committee deliberated, the Clinton Administration took a baby step toward helping the gulf, proposing $117 million in new funds in 1998 to study runoff and harmful algal blooms, including those linked to the gulf's dead zone. Later that year, Clinton signed into law a measure requiring a White House advisory body, the Committee on the Environment and Natural Resources (CENR), to prepare a report on the dead zone's causes, consequences, and possible fixes. The law also called for the White House to develop an action plan to fix the dead zone by 31 March 2000—a task it entrusted to the EPA-led task force.

    In May 1999, the CENR released a set of scientific reports backing the conclusion that chemical fertilizers are the main culprit. They offered several lines of evidence. In the Mississippi River, levels of nitrate—which leaches out of fertilizer-saturated soils—are three times higher now than they were in the 1950s; over the same period, chemical fertilizer use in the Midwest also tripled.

    Forging a link between nitrogen levels and hypoxia are studies on sediment cores, which give clues to the gulf's past health. Rabalais and Turner measured concentrations of silicon in core samples. Silicon is a major constituent of the cell walls of diatoms, so it's a good measure of phytoplankton abundance. They found that silicon concentrations rose slowly between 1970 and 1989, tracking well with increasing nitrogen levels. Along with other groups, Rabalais and Turner have done other experiments to show that dissolved nitrates from the Mississippi supply nearly all the nutrients that fuel phytoplankton blooms in the dead zone.

    Experts say they have made a compelling case tracing the bulk of the nitrates to human activity. In a study in last July's Eos, Transactions, American Geophysical Union, hydrologist Donald Goolsby of the U.S. Geological Survey (USGS) in Denver used water-quality monitoring data from 42 watersheds in the Mississippi Basin to model the nitrogen cycle throughout the basin. Some 7 million metric tons of nitrogen, about 30% of the total entering the basin, came from fertilizer, and that total has risen sixfold over the last 50 years. An equal amount came from soil decomposition and the rest from sources such as animal manure, sewage treatment plants, airborne nitrous oxides, and industrial emissions. Goolsby also found that 56% of the nitrogen inputs into the Mississippi were from five heavily farmed midwestern states, with Iowa and Illinois the biggest sources. Other studies have come to similar conclusions.

    While acknowledging that the dead zone is real, farm advocacy groups have denounced the White House report and the supporting studies. They argue that the link between the use of nitrogen-based fertilizers and the dead zone remains unproven. “We need a cause-and-effect relationship before we can successfully pursue any remedial actions in this area, and those simply don't exist,” says Terry Francl of the American Farm Bureau Federation.

    As in the global warming debate, a handful of scientists disagree with prevailing scientific opinion and have provided the farm groups with ammunition. Jonathan Pennock of the University of Alabama's Dauphin Island Sea Lab and his colleagues issued a report in May 1999 that cast doubt on the role of fertilizer, instead blaming increases in river flow and organic matter swept downstream. The Fertilizer Institute, a Washington, D.C.-based industry trade group, funded the report.

    The most visible skeptic is Derek Winstanley, chief of the Illinois State Water Survey, who argues that nitrogen levels have always been high in rivers that feed the Mississippi, and that the dead zone is a natural phenomenon that may have plagued the gulf for centuries. Winstanley has roused political opposition to any dead zone fix that focuses on cutting nitrogen runoff in upstream states.

    CPR for the gulf

    Winstanley and other skeptics have converted few experts to their cause. The data linking the dead zone to fertilizer runoff “all line up in a pretty convincing way to most folks,” says biological oceanographer Donald Boesch of the University of Maryland Center for Environmental Science in Cambridge. He and others are convinced that the gulf will only be healed by cutting nitrogen runoff.

    That's exactly what last month's plan intends to do. If it is approved, officials say that starting in 2002, the government would need to spend about $1 billion a year for at least 5 years on measures to wean farmers off heavy use of chemical fertilizers. It's hoped that by 2015, the measures—intended to cut nitrogen discharges to the gulf by 30%—would shrink the dead zone to an average of 5000 square kilometers, less than half its average size during the 1990s. The plan would also provide ample funding for scientific projects, mostly within the USGS and the National Oceanic and Atmospheric Administration, to monitor nitrogen levels and ecosystem health in the Mississippi Basin and in the gulf, says NOAA's Donald Scavia.

    The biggest ticket item is reducing fertilizer use. Many researchers believe, for example, that because fertilizer is cheap, midwestern farmers routinely apply about 20% more of it than needed most years, gambling that better-than-average weather could bring higher yields that more than offset the fertilizer's cost. Money would go to programs that show farmers that this practice of “nitrogen insurance” doesn't pay. Other programs would aim to include nitrogen-fixing crops such as alfalfa in the rotation. If such crops were planted on just 10% of the cropland, the CENR reports stated, nitrogen discharges would decline significantly.
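    The "nitrogen insurance" logic amounts to a simple expected-value calculation. All of the dollar figures and probabilities below are invented purely for illustration; they are not from the article or the CENR reports:

```python
# Hypothetical "nitrogen insurance" arithmetic (all numbers invented).
extra_fertilizer_cost = 10.0   # $/hectare for ~20% extra nitrogen
bumper_year_prob = 0.2         # chance of better-than-average weather
extra_yield_value = 40.0       # $/hectare gained only in a bumper year

expected_gain = bumper_year_prob * extra_yield_value  # 8.0 $/hectare
net = expected_gain - extra_fertilizer_cost
print(net)  # -2.0: on average, the "insurance" loses money
```

    Whenever the bumper-year payoff times its probability falls short of the extra fertilizer's cost, the gamble loses on average, which is the case such programs would try to demonstrate to farmers with their real local numbers.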

    Some farmers say they have already cut fertilizer use and that further reductions would harm their bottom line. But analysts insist that the vast majority could alter their practices without suffering. Changes that cut nitrogen runoff by up to 25%, says agricultural economist Otto Doering of Purdue University, the lead author of the CENR report's economics section, “would not have a severe impact on farmers or on food costs.”

    The plan would also pay farmers to restore wetlands and plant buffer strips of trees and grasses between farm fields and streams. More vegetation would mean more denitrification, a microbial process in which nitrate is converted into nitrous oxide and nitrogen gases that escape into the atmosphere. This would right a historical wrong. Over the past 150 years, midwesterners have cleared and drained untold millions of hectares of nitrogen-absorbing bottomland forest and wetlands for agriculture, says wetland ecologist William Mitsch of Ohio State University in Columbus.

    If it becomes legislation, the action plan would await an uncertain fate in Congress, where it may face opposition from conservatives and must compete for money with scores of other proposals. “I don't think it will be easy,” says a Republican Senate aide familiar with the plan.

    Luckily for plan backers, farm groups are sending mixed signals. Taking the toughest stand is the American Farm Bureau Federation: “There's simply no way our organization could support” the plan in its current form, says Francl, who says his organization will lobby against it. But the Fertilizer Institute sees merit in aspects of the plan that pay farmers to use fertilizer wisely and plant buffer strips. “You'll see us and many in the [agriculture] community supporting … policies that will help reduce nutrient loss,” says spokesperson Ron Phillips.

    Also testifying to the plan's merits are chronicles of other dead zones around the world. Cutting fertilizer use and restoring wetlands in Sweden and Denmark have slashed nutrient inputs into the Kattegat strait, which links the North Sea and the Baltic Sea; oxygen levels are on the rise (see sidebar on p. 969). And a hypoxic zone in the Black Sea even bigger than the Gulf of Mexico's appears to be taking its last gasps. The Black Sea's problems began in the 1960s, after years of being bombarded with heavy fertilizer runoff. When the Soviet Union collapsed in 1990, so did most central support for agriculture, and chemical fertilizer use fell by more than half. By 1996, the Black Sea's dead zone had disappeared for the first time in 30 years.

    Aside from those case studies, experts are putting their faith in models that predict the gulf's comeback once less nitrate flows into it. Using reams of data on everything from oxygen levels and sunlight to temperature and phytoplankton levels, water-quality modeler Victor Bierman of Limno-Tech Inc. in Ann Arbor, Michigan, and his colleagues have predicted that a 20% to 30% reduction in nitrogen levels in the Mississippi would raise oxygen levels as much as 50%, providing enough of the molecule to prevent many hypoxia-related deaths. But it's likely to take more than a decade to see results, Bierman says.

    Johnny Glover doesn't know if he can wait that long. Back in the 1970s, when he started out in the fishing business, the waters near the seaside town of Cocodrie teemed with fish. Now Glover manages 10 charter boats, and his captains must travel kilometers farther than they once did to find king mackerel, black drum, and red snapper. “We're working around [the dead zone], but it's getting harder and harder to make a living,” he says. For Glover and other hard-pressed fishers, a cure for the dead zone is long overdue.


    Off Denmark, a Drawn-Out War Against Hypoxia

    1. John S. MacNeil*
    1. John S. MacNeil, a former Science intern, is a staff writer for GenomeWeb in New York City.

    The first signs of a problem in the Kattegat strait between Denmark and Sweden surfaced in the late 1970s, when Danish coastal officials chronicled a series of fish kills, plankton blooms, and low oxygen readings. The hypoxic conditions suggested that fertilizer runoff from farm fields might be the culprit, but the threat didn't register with the public until a lobster die-off in 1986 in the southern Kattegat.

    An intense lobbying campaign led by the Danish Society for the Conservation of Nature persuaded the government to draw up an Action Plan on the Aquatic Environment. After a series of false starts, the plan, enacted in 1987, finally appears to be paying dividends.

    The plan called for halving the release of nitrogen from all major sources—agriculture, industry, and sewage treatment plants—and cutting the release of phosphorus by 80% within 6 years. The $24 million a year plan doled out money for steps such as upgrading wastewater treatment plants, paying farmers to plant winter wheat to soak up excess nitrogen from the soil during the fall and winter, and limiting the amount of manure that farmers could dump on their fields.

    By 1991, it had become clear that the agricultural measures were having no effect—or were even exacerbating the problem. Planting winter wheat, for example, had led to an increase in nitrogen fertilizer use—farmers had begun applying extra fertilizer in the fall to maximize yields, according to Daniel Conley, a marine ecologist with Denmark's National Environmental Research Institute.

    Danish lawmakers tightened restrictions by requiring farmers to account for all fertilizer and manure they applied to their fields (they had not been required to report this before) and extended until 2000 the deadline for achieving nutrient reductions. In 1998, when monitoring again showed scant nitrogen reductions, Action Plan II was trotted out. The government began buying land from farmers to reestablish wetlands and forests, and paying farmers to use less than optimal amounts of fertilizer.

    Far greater success was achieved, meanwhile, through regulations to reduce phosphorus emissions, mainly from wastewater treatment plants and industry. Over the last 14 years, phosphorus levels have fallen 80%; today, plankton growth in some coastal areas is limited by the available phosphorus, says Peter Bondo Christensen, a biologist with the National Environmental Research Institute. Although Danish scientists have never compiled a map of the dead zone in the Kattegat strait, their measurements show that oxygen levels are on the rise in open water.

    Unlike the situation in the United States, the two constituencies most threatened by Denmark's dead zone are neighbors: “We're such a small country that the farmer lives next to the fisherman,” says Christensen. And that means they have to get along to solve the problem, or else be at each other's throats.


    The Dead Zone's Fiercest Crusaders

    1. Dan Ferber

    Nancy Rabalais has fought to make farming states along the Mississippi River pay for the harm she says they have inflicted on the northern Gulf of Mexico; Derek Winstanley has led the counterattack in an effort to clear the Mississippi Basin's name.

    Queen of the dead zone

    Nancy Rabalais gets so seasick that as a graduate student in marine biology, she studied fiddler crabs just to stay ashore. But one project lured her into open water. In 1985, she and Don Boesch, then director of the Louisiana Universities Marine Consortium in Cocodrie, wanted to know how widespread was a disturbing phenomenon in the northern Gulf of Mexico: patches where the water ran low on oxygen, a potentially lethal threat to the ecosystem. Boesch asked his young research associate to take water samples up to 40 kilometers offshore.

    Muck raking.

    Nancy Rabalais sifts gulf sediment for clues to dead zone's origin.


    Rabalais steeled herself and ventured out with a crew member in a 6-meter outboard. It was not the sort of boat for a landlubber, and she spent much of the day hugging offshore oil platforms in case she needed to radio for help. “The wind and seas could pick up, the engine could quit on you, you'd be too far offshore, and you could be in trouble,” she says. Still, Rabalais got half her data and returned the next day for the rest. It wasn't long before she had persuaded Boesch to buy a bigger, safer research vessel. “I told Don I wasn't going to kill myself for hypoxia,” she recalls.

    But Rabalais, 51, has watched herself get crucified for it. She's been demonized by farm groups for publicizing her theory that nitrogen swept down the Mississippi River is primarily responsible for the dead zone. Now, fixing the dead zone is the goal of an ambitious plan that could change farming practices throughout the Mississippi Basin (see main text). And the credit for that belongs primarily to Rabalais. “If it wasn't for her,” says biological oceanographer Robert Diaz of the Virginia Institute of Marine Science in Gloucester Point, “no one in Washington would have realized there was a dead zone.”

    Rabalais worked her way through college at Texas A&I University (now Texas A&M University, Kingsville), grading math papers for busy middle school teachers. She fell in love with the water there (as long as it didn't involve a boat), spending every spare moment snorkeling or scuba diving. Her coming of age as an environmentalist occurred during graduate school at the University of Texas Marine Science Institute in Port Aransas, when she and others battled to stop a U.S. Army Corps of Engineers dredging project in the Corpus Christi Ship Channel that threatened a shallow sea grass bed in a local mangrove forest. Interviewed on local television, Rabalais cut a charming figure with her lilting drawl and disarming smile. Public opposition derailed the project.

    The dead zone was a far greater challenge. Rabalais knew from the outset that it would take years of data to understand when, where, and how the waters become hypoxic. But reviewers kept rejecting grant proposals for long-term monitoring, arguing that the work was not innovative enough, says marine biologist Don Harper of Texas A&M University, Galveston, who joined Rabalais on many of the early cruises. “Nobody wanted to listen to this stuff,” he recalls. But they cobbled together enough money to keep the project going. Rabalais felt that she had a personal stake in the gulf's health. “It's my home territory,” she says.

    Rabalais developed another personal connection on the gulf with Gene Turner, a marine ecologist at Louisiana State University in Baton Rouge, who in 1974 had dabbled in hypoxia by volunteering to sort fish on fisheries vessels; he drew water samples wherever the boats stopped for fish. Rabalais and Turner married in 1988, and their daughter Emily was born a year later. “Gene calls her our best reprint,” Rabalais says.

    By 1991, the dynamic duo had completed 5 years of monthly sampling in the gulf, which revealed the magnitude of the dead zone and the fact that it came back each summer. Since then, says Boesch, “they've put the scientific puzzle together.” In 1993, when her annual summer cruise found that the dead zone had doubled, Rabalais issued her first press release. “The nation didn't know about it, and it was important,” she says. Soon she was taking journalists and TV crews out to the dead zone and testifying before Congress. Rabalais has been heckled at public hearings by angry farmers, some of whom have accused her of raising awareness of hypoxia simply to further her career. “That one smarts—it's personal,” Rabalais says. Yet she's continued to speak out. “That's taken an awful lot of courage,” Boesch says. “Most scientists would say, ‘Here's my paper, I don't want to get involved.' She's gone in and faced those folks directly.”

    Even some allies have raised questions about Rabalais's advocacy. While relaxing one evening a few years ago at a scientific meeting, Diaz told Rabalais and Turner that they should tone it down: Their rhetoric was getting ahead of the science. In 1999, however, after the government took steps to address hypoxia, Rabalais and Turner were named environmental heroes by the National Oceanic and Atmospheric Administration and received the $250,000 Blasker Award for Environmental Science and Engineering. Diaz then realized they had taken the right path. “When they got that award,” he says, “I took it all back.”

    Fighting for fertilizer

    Environmental scientist Derek Winstanley heads an agency that tracks water quality in a state where farming is big business. So when a federal panel concluded in 1999 that pollution from midwestern farms is the major cause of the Gulf of Mexico's dead zone, top Illinois officials turned to Winstanley, chief of the Illinois State Water Survey (ISWS), to draft the state's response. He came through with a 27-page analysis bearing a strong message: The federal panel's conclusions were biased and unsound. Governor George Ryan signed it and sent it to Washington, D.C., in July 1999, demanding that the Clinton Administration scrap the panel's report and start over.

    Ryan's plea was ignored, but Winstanley had only begun to fight. Speaking at the annual meeting of the American Farm Bureau Federation in January 2000, Winstanley blasted the federal panel as part of a runaway regulatory effort grounded in “environmental religion” and told farmers there that federal regulations that would clamp down on fertilizer use were “a big steam train coming down the track … right at you.”

    Winstanley, 55, has sought to discredit what he calls the “fertilizer hypothesis” advocated by Nancy Rabalais and others. Last June, Winstanley and ISWS colleague Edward Krug released a 172-page report arguing, based largely on historical descriptions of these lands, that Illinois rivers were loaded with nutrients before European settlers began farming the prairie in the mid-1800s, and that modern agriculture had “greatly cleansed” the Illinois River in the past 50 years.

    Later that month, he ripped into the federal hypoxia panel in testimony before the U.S. House Agriculture Committee, claiming it had ignored important data on natural sources of nitrogen runoff that undermine the link between fertilizer use and the dead zone. That argument netted results: Congress attached a provision to an emergency aid bill, signed into law by President Clinton last July, that blocked the Environmental Protection Agency from funding certain water-pollution programs meant to cut agricultural pollution until October 2001.

    Notwithstanding that success, a range of experts have dismissed Winstanley and Krug's conclusions. Biogeochemists Mark David and Gregory McIsaac of the University of Illinois, Urbana-Champaign, and four colleagues released a 29-page rebuttal of Winstanley's report last October. “It's politics in the guise of science,” says biological oceanographer Don Boesch of the University of Maryland Center for Environmental Science in Cambridge.

    Winstanley “strongly disagrees” with his critics and says he's “upset that it's got to slinging mud.” He and Krug have submitted their findings to peer-reviewed journals. But if Congress approves the ambitious plan to shrink the gulf's dead zone, the results of that program would be the ultimate test of whether Winstanley's arguments hold any water.


    Dirigibles to Grace Skies Over Germany Once Again

    1. Olaf Fritsche*
    1. Olaf Fritsche is a freelance writer in Sandhausen, Germany.

    Companies are betting that giant airships not seen since World War II can pay off in ferrying tourists and heavy objects—and perhaps even doing science

    For people who recall images of the zeppelin Hindenburg's fiery demise in 1937, the thought of the cigar-shaped ships—and flying in one—might stir an unsettling feeling. This spring, a company called Zeppelin Luftschifftechnik hopes to wipe away those disturbing thoughts with a new breed of airship to carry tourists in Germany, and eventually to other European countries.

    Hot on its heels is a company designing the world's largest airship, a dirigible for transporting heavy machinery that could make its debut in about 2 years. Fueling the dirigible's renaissance is what appears to be a healthy demand from sightseers and a niche market in the cargo world. Research trips could soon follow. “This is just the starting signal for the development of new airships,” says aviation engineer Ingolf Schäfer, a consultant based in Lahnau, Germany.

    Dirigibles hit the comeback trail in 1988, when the company Luftschiffbau Zeppelin, which had long since gotten out of producing dirigibles and into producing radar aerials and silos, asked staff engineers Klaus G. Hagenlocher and Florian Windischbauer to study whether airships had the potential to fly again. The duo reviewed the safety and flying records of all 119 zeppelins produced before the airships were shelved in 1940. They concluded that dirigibles could offer a safe alternative to hot-air balloons and other sightseeing vessels—but that there would be too few customers to make transatlantic crossings pay.

    In 1993, Zeppelin Luftschifftechnik was established to make dirigible tourism a reality. The firm was situated in a nostalgic location: Friedrichshafen, where Count Ferdinand von Zeppelin made his dirigible dream come true when he took his LZ 1 into the air on 2 July 1900. Exactly 100 years later, the count's granddaughter, Elisabeth Veil, baptized the modern prototype “Friedrichshafen.”

    The project Zeppelin NT (“New Technology”) has exploited a host of recent advances in materials science to make the modern zeppelin a better airship than its ancestors. The craft's skin is Tedlar foil and polyester textile, weatherproof fabrics that give the company the option of not having to keep the zeppelins in a hangar. Modern materials give the new breed a big advantage over last century's zeppelins, the cotton-based skin of which would suck up a lot of water, making the dirigibles heavier and sometimes rupturing during flight—forcing daring midflight repairs.

    The new ship's aluminum and carbon fiber-strengthened plastic frame has a triangular geometry, making the helium-filled dirigible more compact and lighter by volume than its progenitors. Dirigibles also have a big advantage over blimps, famous for hovering over U.S. football games. If a blimp were to lose gas, the skeletonless airship could crumple and become unsteerable. A dirigible's frame allows it to be steered even when deflated.

    The prototype Zeppelin NT has completed more than 800 hours of test flights in Germany. Initially, it and a twin to roll out this spring will be confined to German airspace; indeed, dirigible tours will not stray far from Lake Constance. But the company hopes to launch flights to other countries after getting regulatory approval.

    The dirigibles will be used for advertising, and Zeppelin Luftschifftechnik is exploring other markets, including measuring airborne pollutants. The airships could also provide a vibration-free and steerable platform for scientists. “A plane is too quick for some instruments,” says Markus Quante of the Institute for Atmospheric Physics in Geesthacht. He studies how greenhouse gases move through the atmosphere. “Our particle detectors work much better at low speed, and they are very complex instruments, so you need an operator at their side,” he says. “You can't do this in a balloon.”

    Taking a different tack is CargoLifter, a Berlin-based company that hopes to revive dirigibles as titanic airborne mules. “We're really creating a new industry,” says Charles H. W. Edwards, president of CargoLifter's U.S. holding company. Edwards predicts his firm will have more demand than it can handle for the 50 airships it hopes to build over the next 15 years. Major customers could include the heavy machine and construction industries, oil-exploration firms, and humanitarian missions.

    Gigantic machines, such as turbines or air liquefiers, are usually moved from factory to customer by truck for shipment out of a port or an airport. “A ship has to end [its journey] at the wharf,” says Edwards, and not every airport can handle a cargo jet. So a truck often must complete the journey. “We can go point to point,” he says. Like a flying crane, CargoLifter's CL 160, still on the drawing board, would be able to grab its freight while hovering, without disassembling it—up to 160 tons, the weight of 27 full-grown African elephants.

    But CargoLifter still must obtain regulatory clearances from air traffic authorities. The cumbersome airships might be excluded from airspace near airports, and regulators have not yet decided whether to treat the CL 160 as a plane or as a container ship, which would affect the number of hours that crews would be allowed to work and thus the duration of CL 160 flights. The uncertainty hasn't stopped CargoLifter from building a hangar at a former Russian military airport 60 kilometers south of Berlin for a prototype it hopes to fly in 2003. Potential competitors are popping up: The U.K.'s Advanced Technologies Group has recently tested a model of a jumbo dirigible designed to carry 1000 tons.

    Experts are thrilled that these anachronisms may find a place in the modern world. Within 5 years, predicts Schäfer, German skies will be filled with dirigibles.


    U.S.-Mexican Telescope Gains Firmer Footing

    1. Andrew Lawler

    After a shaky start, the Large Millimeter Telescope is taking shape in the mountains of central Mexico. It's Mexico's biggest splash in global research

    AMHERST, MASSACHUSETTS—A narrow sand-and-gravel road studded with hairpin turns is the only way to reach the future site of the Large Millimeter Telescope (LMT), atop a 5000-meter mountain in eastern central Mexico. The road is too treacherous to transport the 1000-ton, 35-meter-long pieces of steel, now being manufactured in the lowlands below, that will form the telescope's 50-meter dish and supporting structure. But work on a wider, safer, and long-overdue road has stopped because of lack of money. That problem is just one of several obstacles in the way of a 12-year quest by U.S. and Mexican scientists to construct Mexico's most expensive scientific facility and its largest cooperative R&D effort with its northern neighbor.

    Operating at wavelengths as short as 1 millimeter, the $80 million telescope is designed to generate important new data on the nature of early galaxy formation and many other objects in the universe. But despite the road and a host of other challenges—ranging from design headaches to an uphill battle for scientific respect—the project's future suddenly seems bright. The new Mexican government appears favorably disposed to the project, and the U.S. National Science Foundation (NSF) is poised to provide the first chunk of U.S. government support obtained through accepted scientific channels.

    A lot is riding on a successful outcome. The telescope would vault Mexico into the elite ranks of countries with world-class observatories. “This is very, very important for us,” says Alfonse Serrano, director of the National Institute of Astrophysics, Optics, and Electronics and the Mexican midwife of the effort. For its U.S. partner, the University of Massachusetts (U Mass), the new telescope means a chance to remain in the top ranks of U.S. astronomy departments. And for the community at large, LMT represents a new and valuable tool. “We're happy to have it,” says Martha Haynes, a Cornell astronomer who chaired last year's radio astronomy panel that fed into the National Research Council's (NRC's) influential decadal report on the future of U.S. astronomy.

    Work on its foundation began last month and is expected to be completed this year, with “first light” in 2004—2 years behind schedule. Once finished, it will be the largest millimeter-wavelength telescope in the world, soaring more than 15 stories above a desolate mountaintop. “Our killer app” will be data on early galactic development, says U Mass project scientist Peter Schloerb. Highly redshifted galaxies are particularly visible at the millimeter wavelength. With its wide bandwidth, the large dish will be able to collect massive amounts of data on these ancient structures for astrophysicists to plug into their evolutionary models. But researchers also expect to gather more accurate information, and at a faster rate, on distant galaxies, the velocities of galaxy clusters, and molecular gas clouds in our own galaxy. Peering closer to home, the telescope should be able to plot the shape of comets beneath their obscuring comae.

    Such results will be vindication for supporters, who have fought hard for scientific respectability. The original idea was conceived in the late 1980s by U Mass scientists who currently operate a 14-meter telescope near the campus, but it was rejected by NSF. The 1991 NRC decadal survey highlighted the need for a millimeter array, but that suggestion grew into the $400 million Atacama Large Millimeter Array (ALMA) being planned jointly by the United States and Europe for a mountain site in Chile's Atacama desert.

    Refusing to give up, U Mass officials struck a deal with their Mexican counterparts, who agreed to host the facility and pay half the cost of construction. And they convinced Congress to put $21.4 million for the LMT into the budget of the Defense Advanced Research Projects Agency (DARPA), circumventing peer review and NSF altogether. The move earned a “tongue-lashing” from one upset House Science Committee staffer, recalls one scientist involved in the effort. They also reaped $5 million from the state of Massachusetts as well as $4 million from the university (Science, 17 January 1997, p. 300).

    The ill feelings from that pork-barrel strategy have faded, however, in part because the funding earmarked for the LMT did not threaten other radio astronomy projects like ALMA. The LMT will form one part of a suite of new ground-based instruments now being designed or built to cover millimeter and submillimeter wavelengths, each with its own particular technical strengths. LMT, for example, will be capable of imaging celestial regions an order of magnitude more rapidly than the 40-dish ALMA can, Schloerb says. “When you put all these together, you have what you need,” Haynes adds.

    Mexico's support for the project has been unflagging, surviving the peso crisis of the mid-1990s and numerous strains on the country's research budget. Nevertheless, the project has encountered a host of difficulties. Disputes with the German company hired to design the LMT slowed progress, and a 1997 decision not to build a protective radome around the telescope led to a major redesign midstream. In addition, the site's features—including high winds, extreme temperatures, and crumbly soil—proved a greater challenge than expected for both design and initial construction, says Allen Langord, U Mass program manager.

    An August 2000 review by an outside advisory panel appointed by the project noted “substantial progress,” although it cited “serious concerns” about antenna design, contract delays, schedule uncertainties, and budget shortfalls. Cornell astronomer Paul Goldsmith, panel chair, says that he and his team were encouraged during a visit last month to the site, but says major challenges remain.

    A temporary road to the site has long been complete, for example, but work on the promised permanent route has halted. The state of Puebla had promised to build the road, but funding was halted last year after floods swept the region and a new governor was elected. The state has asked for federal funding to complete it, Serrano says, “but we don't know the result yet.” One alternative would be to improve the temporary road so that it could carry the heavy equipment necessary to complete the telescope.

    The balance of funding between the partners is another source of tension. Serrano says that Mexico has already spent its $40 million and that it needs an estimated $3 million more. Last week Serrano was named second-in-command at Mexico's research ministry, an appointment that seems likely to bolster government support. “This is good news,” says Langord. But Serrano worries about the U.S. contribution, about $30 million so far: “We're in a more difficult position if the U.S. can't raise the additional $10 million” to match Mexico's contribution.

    Langord takes a different view of what constitutes a fair share by each side. Not all the Mexican money has gone directly to telescope design and construction, he notes. In addition, he says that it may be possible to finish the U.S. part of the work for less than $40 million. Other project officials say that a 50-50 split was never formalized. Still, they say they are confident that Congress later this year will provide DARPA with the necessary funds, up to $10 million, to finish construction.

    In the meantime, Langord and Schloerb are waiting for final word from NSF on the university's request for support. U Mass currently receives $1.1 million from NSF to operate its current dish, which will be shut down when the LMT goes on line. The goal is to ramp up the funding to $2.5 million annually by 2004, an amount that would cover the U.S. share of operating costs of the LMT. Until then, money not needed to operate the existing facility would finance instrumentation.

    Outside researchers expect NSF to approve additional funding for at least the next 2 years and then review the overall project. “The fact that NSF is willing to support us is a strong indication that the scientific community considers the LMT to be a worthwhile project,” says Schloerb. And Goldsmith is confident of finding operating funds. “The science is so overwhelmingly exciting,” he says. “The money will be found, even if no one is quite sure how.”
