News this Week

Science  07 Sep 2001:
Vol. 293, Issue 5536, pp. 1742



    Nigerian Families Sue Pfizer, Testing the Reach of U.S. Law

    1. David Malakoff

    In a case that could create new liabilities for U.S. research conducted abroad, 30 Nigerian families have sued pharmaceutical giant Pfizer Inc., alleging that the company unethically tested an antibiotic on their children during a 1996 meningitis outbreak. The unusual suit—filed by foreign citizens in a U.S. court under a 210-year-old U.S. law originally designed to combat pirates—seeks to recover unspecified damages from the company for death and serious disabilities.

    The complaint, filed on 29 August in a federal district court in New York City, alleges that Pfizer researchers violated international law by failing to obtain informed consent from the families of about 200 Nigerian children, aged 1 to 13, enrolled in a trial that took place at a rudimentary hospital in the northern city of Kano. The study tested the effectiveness of the oral antibiotic Trovan against epidemic meningococcal meningitis, a deadly bacterial infection of the developing world. Eleven of the enrolled children—including both treated and control patients—died, and others became paralyzed or deaf, according to the complaint.

    The families allege that Pfizer increased the risk of death and injury by failing to provide a proven existing treatment to patients who did not improve after ingesting Trovan, and by giving control patients a weakened version of the standard therapy. “Pfizer took the opportunity presented by the chaos caused by the civil and medical crises in Kano to accomplish what the company could not do elsewhere—to quickly conduct on young children a test of [a] potentially dangerous antibiotic,” claims the suit.

    The New York City-based Pfizer rejects the charges, saying in a 30 August statement that it is “proud of the way the study was conducted” and that it was “well conceived, well executed, and saved lives.” The company says it obtained prior consent from both the Nigerian government and patients' families.

    The lawsuit is the latest development in a 5-year-old controversy. According to Pfizer, the idea for the trial arose when Scott Hopkins, a physician working at the company's research center in Groton, Connecticut, noticed an Internet news item about the outbreak of bacterial meningitis, a brain and spinal cord infection that eventually claimed an estimated 15,000 lives in Nigeria. Within 6 weeks, Pfizer had obtained permission from the Nigerian government and the U.S. Food and Drug Administration (FDA) to test the then-experimental oral form of Trovan (the trade name for trovafloxacin) in Kano, where the company already had an outpost.

    Trial on trial?

    Pfizer hoped that an oral form of Trovan would be effective against a deadly bacterial meningitis.


    Hopkins and Pfizer's medical team, composed of both U.S. and Nigerian doctors, gave Trovan in 1996 to about 100 children selected from long lines of patients awaiting help. They administered the proven drug ceftriaxone, manufactured by competitor Hoffmann-La Roche, to an equal number of control patients. Pfizer completed the trial in about 2 weeks and submitted results later to regulatory agencies, seeking approval for a wide variety of uses. (The drug was approved for many adult uses in 1997, but those approvals were sharply curtailed in 1999 due to liver-damaging side effects.)

    Even before the trial ended, doctors within and outside Pfizer raised concerns about the study. Juan Walterspiel, a former Pfizer infectious disease specialist, has claimed that he was fired in 1997 after repeatedly warning company officials that the trial was both legally and scientifically flawed. He has filed a wrongful dismissal lawsuit. The same year, Pfizer withdrew its U.S. application to use Trovan against epidemic meningitis in children, in part because an FDA audit had raised questions about the Kano study. Then, last December, The Washington Post highlighted the Kano controversy in a high-profile series examining the growing use of overseas drug trials. The press coverage—including allegations that Pfizer gave control patients one-third of the recommended dose of the standard drug, and that Nigerian doctors fabricated trial approval letters years after the study had ended—prompted a flurry of lawsuits against Pfizer in Nigerian courts. Pfizer has denied the allegations.

    It was a prominent New York law firm, however, that took the Nigerians' case to Pfizer's home turf, using a novel legal strategy. Lawyers at Milberg Weiss Bershad Hynes & Lerach argue that Pfizer is vulnerable under the 1789 Alien Tort Claims Act, which allows foreign nationals to use U.S. courts to go after individuals and companies that have broken international laws on foreign soil. In this case, the Nigerian families charge that Pfizer has violated ethical research rules, including the Nuremberg Code of 1947.

    Created in part to allow ship owners to sue pirates, the law has been used by human rights, labor, and environmental groups to win damages or out-of-court settlements from torturers and corporate miscreants. Now, foreign research subjects have a good chance of being added to the list of those allowed to sue, says Ralph Steinhardt, an alien tort claims expert at George Washington University Law School in Washington, D.C. But it will be at least a year, other lawyers predict, before researchers find out whether the court agrees that the ancient piracy law can be used to press such claims.


    Winners, Losers Abound As Reforms Kick In

    1. Dennis Normile*
    1. With reporting by Charles Whipple in Tokyo.

    TOKYO— Next spring, Japan hopes to finish building the Earth Simulator, a mammoth supercomputer capable of modeling global climate change in unprecedented detail. But researchers may not be able to run it full-time because of budget cuts to the three agencies funding it.

    That mixed message was repeated last week to scientists in many disciplines as Japanese ministries unveiled their budget requests for the next fiscal year. Although overall spending for science will rise about 5%, the increases are concentrated in a handful of areas deemed economically important and offset by cuts in other fields.

    The crunch is especially severe for research organizations that fall into a special class of public corporations, such as the trio funding the Earth Simulator, which the current Japanese administration has called wasteful and in need of major restructuring.

    “It would be awful, but Japanese science could die because of these reforms,” says Shun-ichi Kobayashi, president of the Institute of Physical and Chemical Research (RIKEN) near Tokyo, another of the affected bodies.

    The budget requests, for the year beginning 1 April 2002, reward projects deemed likely to strengthen industrial competitiveness, invigorate the economy, and promote a high quality of life. That includes a 62% jump in the life sciences within the Ministry of Education, Science, Technology, Sports, and Culture to study the structure and function of proteins, a 49% rise in spending on nanotechnology and advanced materials, and a 12% boost for information technologies.

    But the flipside is equally dramatic. “Where is the money coming from to fund those increases?” asks Norio Kaifu, director-general of the National Astronomical Observatory of Japan (NAOJ) in Mitaka. “From other fields,” he says, including an estimated 10% cut at NAOJ and cuts in space sciences and astronomy, ocean research, and atomic energy at the education ministry (see table).

    The proposed budget cuts may be just the opening shots for institutes such as RIKEN, the Japan Marine Science and Technology Center (JAMSTEC), and the Japan Atomic Energy Research Institute (JAERI). They have the misfortune of being part of a group of 163 special public corporations—which perform functions ranging from building toll roads to running Japan's public broadcasting system—that Prime Minister Junichiro Koizumi has declared must be either privatized or dismantled.

    On 10 August, a task force issued a string of recommendations for these agencies. Although their fates may not be settled for years, the administration has pledged to start by slicing $8 billion next year from the $44 billion these agencies now receive.

    Unhappily for climate modelers, the new supercomputer is part of the Earth Simulator Research and Development Center, which is jointly supported by three of these agencies: JAERI, JAMSTEC, and NASDA, Japan's space agency. “I'm very worried about the effect of this on our research program,” says Taroh Matsuno, director-general of the Frontier Research System for Global Change.

    Wrong direction.

    Space science takes a hit despite successful H-2A rocket launch (below) last month.


    The center is scrambling to find other potential users of the computer who might be able to contribute to its operating costs. JAMSTEC is also facing a 10% cut in the construction budget for its large ocean-drilling research vessel. “It means we'll have to push back completion of the ship by a year,” says Takeo Tanaka, head of JAMSTEC's ocean-drilling program office, delaying its first scientific cruise until 2007 at the earliest.

    Some ministries have already accepted the task force's recommendations. Last week, the education ministry announced that by 2003 it would merge NASDA, which is responsible for launching weather and communications satellites and for Japan's contribution to the international space station, with two other agencies: the Institute of Space and Aeronautical Science (ISAS), which focuses on space science, and the National Aerospace Laboratory, which conducts research into fluid dynamics and other more technological problems.

    Although ISAS researchers fear that the merger will shortchange space science, ISAS director-general Hiroki Matsuo says the move is unavoidable: “We just have to try to make this merger work as well as possible.”

    Other proposals, however, are expected to generate fierce opposition. One involves merging JAERI, which heads Japan's efforts on the International Thermonuclear Experimental Reactor, with the National Institute of Fusion Science (NIFS), which operates the Large Helical Device near Nagoya. NIFS's budget for next year was cut by 27%, and Osamu Motojima, who heads the helical device project, fears that a merger could wipe out his project's alternative approach to fusion.


    Integrin Crystal Structure Solved

    1. Jennifer Couzin*
    1. Jennifer Couzin is a writer in San Francisco.

    For crystallographers, some of the more challenging proteins are those found on cell membranes. Often large and insoluble, membrane proteins are difficult to induce to form crystals. Even when that can be done, researchers run the risk that their manipulations so distort the structure that it doesn't reflect the natural molecule.

    Now, a team of structural biologists at Harvard's Massachusetts General Hospital in Boston has overcome those obstacles to bag one of the most important membrane proteins yet. In work published online this week by Science, Mass General's M. Amin Arnaout and colleagues have determined, for the first time, the crystal structure of one of the many integrin proteins.

    These important denizens of the cell membrane control many cellular activities, including proliferation and migration. For example, the integrin studied by the Arnaout group, which is designated αVβ3, is believed to play a major role in tumor growth, bone maintenance, and inflammation. What's more, some infamous viruses, including the culprits in AIDS and foot-and-mouth disease, use the integrin as a port of entry into the cells they infect.

    Now that they have the structure, researchers should get a better picture of just how the integrin engages in those activities, and that insight may spur the design of new drugs to combat diseases. “It's definitely one of those spectacular results that will change a field,” says structural biologist Dan Leahy of Johns Hopkins University School of Medicine.

    Like all integrins, αVβ3 includes two distinct protein subunits encoded by two different genes. It's large, containing roughly 2000 amino acids, and flexible—a disadvantage when it comes to producing the highly ordered crystals needed for x-ray crystallography. Indeed, Arnaout had to cajole Jian-Ping Xiong, the paper's first author and a postdoctoral fellow at Mass General at the time, to participate in the project. Xiong feared squandering precious fellowship time on what he viewed as a hopeless task.

    Researchers at Merck KGaA in Darmstadt, Germany, provided purified protein for Arnaout's team to crystallize but, says Arnaout, Merck wanted no part in funding a project that seemed so unlikely to bear fruit.

    Despite the doubts and frustrations, scientists worldwide have hotly pursued αVβ3's structure for years, holding their cards tightly to their chests to avoid alerting the competition to what they were doing. As recently as last February, most scientists attending an integrin conference in Ventura, California, presumed that it would be years before an integrin crystal structure was complete. At the time, Arnaout gave no hint to the contrary.

    First look.

    This ribbon drawing shows the overall structure of the integrin and its various domains.


    To make the integrin soluble—a prerequisite to crystallization—scientists at Merck KGaA truncated the tiny segments that anchor it to the cell membrane and allow it to transmit signals into and out of the cell. Once the Merck group provided enough protein, the Arnaout team spent more than 3 years tinkering with conditions before they could produce crystals worthy of study. Their x-ray analysis, conducted at Argonne National Laboratory in Argonne, Illinois, revealed a structure that was partly expected, but it also contained a few bombshells.

    The integrin includes 12 distinct regions, or “domains,” four in the α protein subunit and eight in the β protein subunit. They are arranged in a molecule with an oval head and two tails.

    Previous work by Arnaout's group and others, including Harvard pathologist Timothy Springer, suggested that some of these domains might exist. Springer's prediction several years ago that part of the protein would be shaped like a propeller has held true. A couple of domains, though, are new.

    A hybrid domain combines portions of known structures, and a domain tacked onto the end of one tail is folded in a novel pattern. But perhaps the biggest surprise was the finding that the two tails, which appeared to extend stiffly from the head section in earlier electron micrograph images, are folded sharply in on themselves. The researchers don't yet know whether that bending occurs naturally; it could be an artifact of the preparation or of crystallization procedures.

    Most intriguing, though, the shape might help cell biologists understand how the integrin transmits its signals. The Mass General group argues that it has crystallized the protein in its “on” structure, but left unanswered is how that differs from the inactive form and how the protein might pass signals to the cell with which it's connected.

    Pharmaceutical companies, meanwhile, are expected to use the αVβ3 structure to guide the development of new drugs aimed at the protein. Compounds that block the protein might, for example, inhibit uptake into cells of the AIDS virus or prevent the cell migrations needed to build the new blood vessels required for tumor growth. “These are targets that [companies] have identified for a while, and they have been playing with them, but without any structural basis,” says Arnaout.

    Indeed, researchers say that the αVβ3 structure is sure to inject new energy into the integrin field. “In a lot of ways, it's really just a starting point,” says cell biologist Jeffrey W. Smith of the Burnham Institute in La Jolla, California. But, he adds, “it changes the level of research that we can do.”


    Organic Device Bids to Make Memory Cheaper

    1. Robert F. Service

    CHICAGO— Beware, silicon engineers: Organics are at the gate. Researchers have already turned electrically conducting organic compounds into light-emitting displays and flexible circuits (Science, 12 June 1998, p. 1691; 21 January 2000, p. 415). Now they're laying claim to one of silicon's remaining strongholds: memory. Last week, materials chemist Yang Yang of the University of California, Los Angeles (UCLA), told the American Chemical Society* that his group has devised an organic-based digital memory.

    The new devices work similarly to flash memory, a relatively expensive silicon-based technology that retains its data even when power to a device is turned off. An organic version has the potential to be much cheaper because it could be far simpler to produce, says Edwin Chandross, an organic-electronics expert who runs a consulting firm in Murray Hill, New Jersey. “It's impressive,” Chandross says of the new work. “A lot of people have been talking about trying to do this.”

    Talk hasn't gotten them very far. No one has yet found a way to make an organic device that has two stable states, which can encode the 0's and 1's of computer lingo.

    Yang's group was struggling with this problem as well. The researchers first tried to make organic memory devices by putting a layer of conducting organic molecules between two electrodes and applying a voltage. That caused the conductivity of the organic material to rise, a signal that potentially could encode bits of information. But it dropped again when the voltage was turned off, so the chance to store information was lost.

    New trick.

    Conducting organic materials similar to this light-emitting diode can store data.


    On a hunch, one of Yang's postdocs, Liping Ma, suggested following the lead of compact-flash devices, in which tiny strips of metal act like batteries to store electronic charges. Yang gave him the green light, and Ma sandwiched a thin metal layer between layers of conducting organic molecules. To their surprise, it worked splendidly. The device could not only write and read bits of data but erase and rewrite them as well.

    To write a bit of data, Yang's team simply applies a potential of 3 volts between a pair of electrodes bracketing the organic-metal-organic sandwich. The voltage makes the material between the electrodes more conductive, a quality it retains even when the voltage is turned off. That high-conductivity state acts as the 1. To read the bit, the UCLA team applies a second voltage of 1 volt. The highly conductive material produces a rush of electrons between the electrodes, signaling that the device is in the 1 state. Applying a third voltage of −0.5 volt returns the cell to its original low-conductivity state, a 0, which the team can read by applying another voltage of 1 volt.
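    The voltage protocol described above can be sketched as a toy state machine. The three thresholds (3 V write, 1 V read, −0.5 V erase) are the ones reported for the UCLA device; the `OrganicMemoryCell` class itself is a hypothetical abstraction for illustration, not the group's actual characterization code.

    ```python
    # Illustrative model of the bistable organic-metal-organic memory cell.
    # Voltage values come from the article; the device model is a toy abstraction.

    WRITE_V = 3.0    # volts: switches the cell to its high-conductivity state (logical 1)
    READ_V = 1.0     # volts: probes conductivity without changing the state
    ERASE_V = -0.5   # volts: returns the cell to its low-conductivity state (logical 0)

    class OrganicMemoryCell:
        """Toy two-state model: conductivity persists until rewritten or erased."""

        def __init__(self):
            self.high_conductivity = False  # cells start in the 0 state

        def apply(self, volts):
            if volts >= WRITE_V:
                self.high_conductivity = True       # write a 1
            elif volts <= ERASE_V:
                self.high_conductivity = False      # erase back to 0
            elif volts == READ_V:
                # A highly conductive cell passes a rush of electrons: read a 1.
                return 1 if self.high_conductivity else 0
            return None

    cell = OrganicMemoryCell()
    cell.apply(WRITE_V)                  # write
    assert cell.apply(READ_V) == 1       # read back a 1; the state is non-volatile
    cell.apply(ERASE_V)                  # erase
    assert cell.apply(READ_V) == 0       # read back a 0
    ```

    The key property the sketch captures is non-volatility: the read at 1 V reports whatever state the last write or erase pulse left behind, which is what distinguishes the device from the earlier organic attempts that lost their state when the voltage was removed.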

    Yang says his team has run through this write-read-erase cycle about 1 million times without seeing any signs of degradation. That track record leaves other researchers both impressed and scratching their heads, because just what causes the material in the device to change its conductivity when different voltages are applied remains a mystery. “It's intriguing,” says Alan Heeger, a physicist at UC Santa Barbara, who won the Nobel Prize in chemistry last year for his work on conducting polymers. “I'd like to know what is going on.”

    In earlier devices that claimed similar properties, it turned out that metal from the electrodes was migrating into the organic layers and giving rise to some of the effect. Although such devices seemed to work for a short while, they were difficult to reproduce and eventually stopped working when the small metal fragments broke down, Yang says.

    To test whether similar metal reactions were behind the properties of the new device, the UCLA team replaced its aluminum metal layer with layers of less reactive silver, copper, or gold. All worked similarly to the aluminum, Yang says. “It was the right experiment to run,” Chandross notes; gold, in particular, is fairly inert and shouldn't react as aluminum does.

    Whatever causes the change in conductivity, organic memory devices could move into applications quickly, Chandross and others say. Manufacturers can make the devices simply by evaporating different materials through a mask in a vacuum. UCLA has granted an exclusive license to a Boston-based start-up company interested in commercializing the technology. If the effort flies, it could result in ultracheap flash memory-based computers that turn on instantly, without the minutes-long boot-up that a standard computer needs to reload its working memory with data that get lost whenever the power goes off.

    No doubt silicon is safe for a while. But it may soon be under siege.

    • *222nd ACS Annual Meeting, 26–30 August.


    New Route to Big Brains

    1. Laura Helmuth

    Never mind the bipedal posture, relative lack of fur, or opposable thumbs. What really sets humans apart from other animals is our oversized brain. But building a bigger brain is an evolutionary challenge. In addition to all the extra neurons and other brain cells that have to develop, somehow all those cells have to be wired together correctly. Now, researchers report that in humans—but apparently not in other species—some neurons in the developing brain travel along an unexpected route. This detour allows them to link to and serve the most overgrown and recently evolved parts of the human brain—those involved in higher functions such as memory and problem solving.

    “The exciting point here,” comments developmental neurobiologist Gord Fishell of New York University, is that the study identifies a “fundamental difference in the way human brains develop.” Neuroscientists still don't understand, he says, “what it is about the human brain that allows it to become more complex than [in other] primates.” The newfound pathway, which is described in the September issue of Nature Neuroscience, might explain some of the differences, he says, as well as allow researchers to study which molecules are key to building a human brain.

    Developmental neurobiologist Pasko Rakic of Yale University, who conducted the research with grad student Kresimir Letinic, calls the neurons' route an “illegal immigration.” Normally, cells from the telencephalon, the part of the developing nervous system that grows into the most sophisticated parts of the brain—those that do the heavy lifting when it comes to problem solving, social interactions, and memory—don't mingle with cells from the diencephalon, which gives rise to less advanced structures such as the hypothalamus and optic nerves. Neuroanatomists had never before reported neurons breaching the theoretical wall between the two regions; when Letinic and Rakic looked at fetal monkey and mouse brains, they saw, as expected, no crossover from one region to the other. But when the team stained and studied slices from fetal human brains, obtained from aborted tissue that had been donated to a brain bank, they found a stream of cells migrating from the telencephalon to the diencephalon.

    Illegal immigration.

    Neurons in a forebrain from a 20-week-old human fetus travel from telencephalon structures (yellow) to the dorsal thalamus.


    These wayward neurons land in the thalamus, a relay station that distributes information to the cerebral cortex. Although the diencephalon still builds most of the thalamus, the second wave of telencephalon cells specifically boosts the parts of the thalamus that feed into the frontal lobes and other cortical areas that are responsible for higher-level cognition. This includes passing along both sensory information and internally generated information, such as emotional responses processed in the hypothalamus. “The frontal cortex doesn't operate without interaction with [this part of] the thalamus,” says neuroscientist Edward G. Jones of the University of California, Davis. The new report implies that “along with the expansion in cortex goes a new elaboration in the thalamus that helps promote [the cortex's] activity,” Jones says.

    To try to identify how the thalamus summons telencephalon cells, the researchers turned to fetal neurons in lab culture. They found that the human thalamus sends “come hither” signals that human telencephalon cells eagerly respond to, scooting toward the thalamus in a dish. Mouse telencephalon cells, in contrast, are repelled by the mouse thalamus. The Yale team hasn't yet identified the molecules that direct the neuronal migration. But if the researchers can decipher those molecular messages, the findings might help further explain how the human brain came to tip the interspecies scales.


    Group Raises Hackles as Well as Funds

    1. Andrew Lawler

    A Republican legislator has created the first political action committee (PAC) to support proresearch candidates for Congress. But some researchers and lobbyists are worried that the group's plan to back only Republicans could divide the community by forcing scientists to choose sides.

    Representative Vern Ehlers (R-MI), a nuclear physicist who chairs the Science Committee's environment, technology, and standards subcommittee, heads the new group, called SciPAC. In a 10 August letter to potential donors, he complains that the U.S. scientific community “has not taken the steps necessary to support elected officials who have supported science.” The stated goal of the committee is to “increase the influence of supporters of science, engineering, and technology in Washington.”

    But Ehlers makes clear that he's only talking about one side of the aisle. “As a Republican, I'm obviously not going to contribute to Democratic candidates,” he told Science. Indeed, his letter takes an indirect swipe at the opposition, noting that “since Republicans took over Congress a mere 6 years ago, the federal investment in research and development has increased nearly $20 billion.” Thomas Jones, SciPAC vice chair and public policy director for the American Association of Engineering Societies, puts it more bluntly: “Republicans are good for science.”

    Science advocates have long complained that their community lacks the political muscle of other interest groups. And political action committees, which disburse funds to candidates, are a traditional way to win recognition and support. But the fact that Democrats need not apply for SciPAC funding troubles some old hands.


    Representative Vernon Ehlers has created the first proresearch political action committee, for Republicans only.


    “The science community needs to be much more involved in the political process, and I have no problem with people raising money for candidates,” says physicist Neal Lane, a former science adviser to U.S. President Bill Clinton and former head of the National Science Foundation (NSF). “But the science budget has bipartisan support, and no party can claim [full] credit.” Lane warns that “making science into a partisan fight would not be good for science or for the community.”

    PACs created by politicians are typically designed to help just one party, says Jones, adding that Ehlers's role is to support the House Republican leadership. “And giving money to Democrats could jeopardize that leadership.” More important than the question of partisanship, Jones argues, is the need to shore up support for science as the federal surplus disappears.

    To succeed, however, Ehlers must convince scientists to write checks to a general fund that he controls. Direct contributions from researchers have helped elect some members, notably Representative Rush Holt (D-NJ), a physicist who has actively courted the scientific community. But a specific science-focused political action committee has never been tried.

    Touted as “America's Voice for Science in Washington” on its Web site, SciPAC's financial goals are modest. “We're not going to raise a million dollars,” Jones says. But he hopes that the committee will come up with enough funds to help at least a few struggling candidates “to buy a few more ads and give them an extra boost.”

    SciPAC was registered on 7 August and has a post office box in the building adjacent to NSF's headquarters in suburban northern Virginia, just outside Washington, D.C. Its first contribution, for $250, came from Vinton Cerf, who helped create the Internet.


    Body's Secret Weapon: Burning Water?

    1. Robert F. Service

    In the immune system's war against the world, antibodies have long been cast as semicombatants. They might spot invaders and even tie them up for a while. But when push comes to shove, they holler for big-gun immune cells such as macrophages to move in for the kill.

    Now, it looks as if these plucky night watchmen may also dabble in chemical warfare. On page 1806, a team of California researchers reports that antibodies create highly reactive chemicals that cells can use to cleanse themselves and poison invaders. More surprising still, they seem to make them by burning water.

    “These are important, interesting, and intriguing results,” says Chris Foote, a biochemist at the University of California, Los Angeles. Foote says the data clearly show that antibodies generate reactive compounds. He cautions that it's less certain that they are oxidizing—or burning—water to do it. But, he concedes, “I don't have a better explanation.”

    Chemistry, particularly lethal chemistry, hasn't traditionally been considered part of antibodies' repertoire. “Over the last 100 years, chemists have come to peace with a theory that antibodies don't do anything. The killing is left to others,” says Richard Lerner, a chemist who heads the Scripps Research Institute in La Jolla, California, and helped lead the team. “Now, it looks like the sheriff is the executioner as well, or at least contributes to the act.”

    Lerner, fellow Scripps chemist Paul Wentworth, and colleagues came upon the antibodies' expanded role largely by accident. While studying fine points of how antibodies function as catalysts, they found that the antibodies in their experiments were generating the reactive compound hydrogen peroxide (H2O2). “We thought something was peculiar with the experiment,” Lerner says. They tested 100 other antibodies, and the result was the same each time. The group published the perplexing finding last year in the Proceedings of the National Academy of Sciences, but it still had no real leads as to what was creating the H2O2.


    A water molecule catalyzes a reaction between singlet oxygen and another water molecule to form H2O3.


    The big puzzle was where the energy to make the H2O2 was coming from. Initially, Lerner says, the researchers thought the reaction was burning the protein itself, but they found that far too much H2O2 was produced for that to be the case. Then, they hit upon the notion that much of the energy could be coming from molecules of singlet oxygen, an energetic and highly reactive form of O2 produced when a source of energy, such as ultraviolet (UV) light, breaks water molecules apart and energizes the oxygen. Singlet oxygen can then react with more water to create H2O2. By tagging water molecules with a heavy isotope of oxygen, Lerner's team confirmed that the oxygen that wound up in H2O2 had indeed come from water—evidence that singlet oxygen could be fueling the reaction.

    For the reaction to occur, however, extra electrons need to be coming from somewhere else. So the Scripps team set off to find their source. After a battery of tests eliminated the obvious candidate, ions in the solution, the Scripps researchers teamed up with reaction modeling expert William Goddard III of the California Institute of Technology in Pasadena and his student Xin Xu. They suggested that a water molecule could combine with a singlet oxygen to produce a highly reactive compound, H2O3, which eventually reacts to produce H2O2 (see diagram).

    Because of energy barriers, the initial reaction of water and singlet oxygen probably would never take place on its own. But Xu and Goddard calculated that if the reaction started with at least two water molecules, one of them would act as a catalyst, driving the reaction forward.
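The two-step pathway described above can be sketched as follows (a hedged reconstruction from this account, not the authors' published scheme; the second, mass-balanced step is one plausible way H2O3 could go on to yield H2O2):

```latex
% Step 1: a second water molecule acts as catalyst (per Xu and Goddard)
\[
\mathrm{H_2O} \;+\; {}^{1}\mathrm{O_2}
\;\xrightarrow{\;\mathrm{H_2O}\ \text{(catalyst)}\;}\;
\mathrm{H_2O_3}
\]
% Step 2: one mass-balanced route from H2O3 to hydrogen peroxide (assumed)
\[
2\,\mathrm{H_2O_3} \;\longrightarrow\; 2\,\mathrm{H_2O_2} \;+\; \mathrm{O_2}
\]
```

Note that both sides of the assumed second step balance (H4O6 on each side), consistent with the article's statement that H2O3 "eventually reacts to produce H2O2."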

    Getting that precise arrangement of water molecules and singlet oxygen to come together isn't easy. But perhaps the antibody was holding the actors in place so the reaction could take place. To find out, Wentworth and Lerner turned to Ian Wilson, an x-ray crystallographer at Scripps.

    Using high-resolution x-ray techniques, Wilson made three-dimensional maps of the atomic structures of four different antibodies. Scrutinizing the regions that all antibodies share, Wilson's group identified three close-together sites that could bind oxygen molecules, as well as neighboring sites capable of holding water molecules in the right places. The antibodies also harbored a nearby tyrosine amino acid, a likely spot where photons of UV light could be absorbed to generate singlet oxygen.

    Although Foote agrees that such evidence suggests that antibodies promote reactions between water molecules and singlet oxygen, he's not yet convinced that's what is happening. One problem, he says, is that UV light—the presumed source of energy for generating singlet oxygen—is absorbed by the top layers of the skin and doesn't penetrate the blood vessels, where antibodies do most of their work. Still, he says, the idea is provocative and worth following up.

    In a final series of tests, Wentworth, Lerner, and their colleagues found that the toxic compounds generated by antibodies were capable of killing bacteria without the need for immune cells such as macrophages. Wentworth doubts these poisons play much of a role in immune defenses today. But in early organisms, he says, the ability to brew them may have offered a significant evolutionary advantage—a capability whose remnants their remote descendants have inherited.


    Testing Time for Missile Defense

    1. Eliot Marshall

    President Bush's plan for a national missile defense system will face a big test in the next few weeks as Congress debates its $8.3 billion budget; after that comes the technical testing

    Delta Junction, a remote crossroads in eastern Alaska, seems an unlikely place to tackle one of the nation's most ambitious—and costly—technological challenges. But last month, when the Pentagon signed a $9 million contract for road construction to improve access to five planned missile silos in nearby Fort Greely, this economically tenuous outpost moved to the center of the expanding debate over the National Missile Defense (NMD) program.

    The silos will house ground-based missiles that someday might shoot “kill vehicles” at incoming warheads a few minutes before they would strike North America. For now, military planners describe the base as an expansion of a “test-bed” for the land-based phase of the experimental program, which extends for thousands of kilometers from Alaska to the Marshall Islands in the western Pacific. It's only one of several new components that must be tested individually and then integrated into a comprehensive “battle management system.” By the Pentagon's own estimate, each major test will cost about $100 million—and 100 tests will be needed for the components, one former Pentagon expert calculates. Years from now, testing of the integrated “system of systems” can begin.

    The NMD debate, already loud and partisan, is likely to heat up this month as Congress returns from an August recess and examines the Administration's request for $8.3 billion in 2002, a 57% increase over the current year. Issues range from the accuracy of cost estimates to the system's affordability to its impact on the 1972 Anti-Ballistic Missile (ABM) treaty. But in their rush to score political points, politicians on both sides have paid scant attention so far to the credibility of the research program itself, including its numerous undefined components and the accelerated timetables that the military has set for developing them. The plan's increased emphasis on R&D is welcome, comments George Lewis, a physicist and weapons expert at the Massachusetts Institute of Technology (MIT). But he is skeptical of the Department of Defense's (DOD's) promise to be ready for an “emergency” deployment of the first elements of the system in 3 years. “Basically, they're saying, ‘We don't know what we're going to deploy,’” but it will be developed faster than before.

    The new R&D plan is beginning to draw criticism for another reason: It proposes a “layered” defense whose cost and technical capabilities are even less defined than the R&D plan (see diagram on p. 1751). Although less sweeping than former President Ronald Reagan's “Star Wars” program, it would nevertheless be a formidable undertaking. In addition to land-based missile interceptors in Alaska and a string of radar bases, this new vision includes an airborne laser carried aboard a 747-style jet, two types of interceptor rockets for deployment at sea, a space-based laser, and a concept for a space-based “kinetic kill” device that, like the ground-based interceptors, will be designed to smash warheads with brute force. Along with rockets and lasers, the Defense Department budget includes a backlog of related items, including essential tracking satellites and medium-range antimissile systems to protect troops and battle areas. Some elements are ambiguous. For example, according to a strategy outlined by President George W. Bush in May and further described by DOD officials in July, Fort Greely might switch from a research project to an operational base as early as 2004.

    Moving targets.

    The Pentagon seeks funding to beef up radars and expand the current Vandenberg-Kwajalein test axis to include missiles launched from Kodiak, Alaska.


    Critics like David Wright, an MIT physicist and member of the Union of Concerned Scientists in Cambridge, Massachusetts, say this ambiguity is unacceptable. Fort Greely, Wright points out, will not be used for testing, because launches from this inland site would go over populated areas. Missiles will be launched from an existing commercial site 800 kilometers away on the coast at Kodiak. This means, according to Wright, that Fort Greely would be a command center, not a test site. The Administration, he argues, is trying to start deployment now in “an end run around congressional oversight.” That's where many critics want to begin the debate this fall.

    Knocking the treaty

    The government hopes to clear land around Fort Greely before the winter, so that construction can begin in April. This modest activity is one of several planned for 2002 that could cause the testing plan to “bump up against” the ABM treaty's restrictions on new defensive missile systems, DOD officials acknowledge. Russian President Vladimir Putin has argued that changes in the treaty should be linked to nuclear stockpile issues. But President Bush told reporters in Texas last month that “we will withdraw from the ABM treaty on our timetable at a time convenient to America.” This troubles the new Democratic majority in the Senate. Senate appropriations committee chair Robert Byrd (D-WV) has said that he doesn't have “a modicum of confidence [in the] scientific effectiveness” of NMD and that “it is very unwise for Congress to lend its support.” The heads of the Senate Armed Services and Foreign Relations Committees are also wary of dumping the ABM treaty.

    Even if diplomats manage to paper over these contentious issues, the Pentagon is still left with the overwhelming challenge of shaping a basket of technology programs into a coherent testing program that will deliver an antimissile system. The Bush Administration has moved systems deemed to be “operational”—mainly short-range defensive weapons—out of the Ballistic Missile Defense Organization (BMDO), which is responsible for NMD and which reports directly to Defense Secretary Donald Rumsfeld. Instead, the office will focus on research and development, with reviews by the Defense Acquisition Board leading to annual stop-or-go decisions on each program. However, BMDO has not defined its tasks beyond 2002.

    Addressing the Senate Armed Services Committee on 12 July, DOD deputy secretary Paul Wolfowitz explained why BMDO's plan seems imprecise—or “murky,” as Aviation Week called it. “We have not yet chosen a systems architecture to deploy,” Wolfowitz said, “because so many promising technologies were not pursued in the past.” He blamed the ABM treaty for preventing research on “air-, sea-, and space-based capabilities with enormous potential.”

    Growth potential.

    The ultimate goal is a decade-long plan to build air- and space-based lasers as well as sea-based kill vehicles.


    That explanation didn't go over too well with Senate Democrats. In a 24 July hearing, Senate Foreign Relations Committee chair Joseph Biden (D-DE) wisecracked that DOD's promise to answer questions about the system was “like the song from West Side Story—‘Sometime, somewhere, somehow!’”

    DOD officials have tried to reassure skeptics by stating what the program will not do. “It does not commit to a procurement program for a full, layered defense,” says BMDO's director, Lt. Gen. Ronald Kadish. “It is not a rush to deploy untested systems; it is not a step back to an unfocused research program; and it is not a minor change to our previous program.” He explained that the Pentagon wants to “explore multiple development paths.” But Wolfowitz argued that despite the uncertainty, “missile defense is no longer a problem of invention, it is a challenge of engineering.”

    Hit or miss

    DOD's engineering prowess will be challenged immediately by the need for a new testing agenda. Philip Coyle, former director of defense operational testing and evaluation for the Clinton Administration, who is now at the Center for Defense Information in Washington, D.C., says that one pressing need is to make antimissile tests more “realistic” and unclog the schedule. He told the Senate Armed Services Committee in July that “important parts of the program have slipped a year and a half” since 1999, and a reorganization ordered by Rumsfeld has added another 6-month delay. Environmental groups may cause another problem: Last week, the Natural Resources Defense Council and others filed suit in Washington, D.C., seeking to stop work at Fort Greely until DOD prepares a new environmental impact statement.

    BMDO spokesperson Lt. Col. Richard Lehner says a revision of the master testing plan will be completed “in the fall.” Lehner has said that although DOD is preparing an environmental impact statement for the launch site at Kodiak, there's no need to rewrite an earlier one for missile silos at Fort Greely. He predicted that BMDO will be able to accelerate its schedule for the ground-based interceptor because “we're getting a better handle on the producibility of the different components.”

    The rockets to be housed at Greely will be part of BMDO's main testing program, which since 1998 has focused on stopping an incoming warhead toward the end of the “midcourse” segment of a 20-minute ballistic missile flight. The goal, Lehner says, is to be able to block a “fairly rudimentary threat” from a so-called rogue nation. That emphasis is reflected in the NMD's budget request, which seeks a $1.2 billion increase in 2002 that would bring spending on midcourse programs to $3.9 billion. The amount includes $3.3 billion for continued development of a ground-based interceptor missile and $656 million for conceptual studies of a new sea-based system. The budget proposal also includes roughly $500 million for sensors, $700 million for projects aimed at the early or “boost” phase of a hostile missile, and $1 billion for medium-range weapons aimed at the final seconds before impact or “terminal phase.”

    Most BMDO funding for a national system, $5.6 billion since 1998, has been spent on this ground-based, midcourse interceptor. There have been four experimental ground-based intercept tests in which simulated attacks were launched from Vandenberg Air Force Base in California and fired at by interceptors launched from Kwajalein in the Marshall Islands. Two produced hits, the most recent on 14 July. But the schedule has slipped, and even the successes have been questioned.

    A critic of midcourse interceptors, physicist Theodore Postol of MIT, claims that the July test revealed how difficult it will be to track enemy warheads in space. High-resolution X-band radar at Kwajalein that followed the simulated attack “froze” about 64 seconds before the interceptor successfully struck its target. DOD officials initially believed the problem was data overload but later described it as a software glitch. Postol fears, however, that data processing problems will continue to plague the system, because it must analyze huge quantities of information in seconds as it seeks the real target amid a spray of debris and decoys. Even crude enemy systems, he says, can present tremendous challenges in image discrimination—for example, if tumbling warheads create electronic images that look like debris or stars.

    Richard Garwin, an IBM physicist and former weapons designer, agrees that decoys and other “countermeasures” designed to trick defensive systems are “the big problem with midcourse intercept.” He and Postol have advocated instead that DOD beef up R&D for attacking targets in the slow, bright boost phase. DOD's new NMD budget plan does include several boost-phase projects. But Garwin sees most of this money going to unpromising “programs that already existed,” like the airborne laser, slated for $410 million. “I doubt that such a vulnerable aircraft can be maintained on station” long enough to overcome a potential threat, he says. He prefers a land-based or sea-based approach.

    Big burn.

    Major tests, like this launch of a ground-based interceptor from Kwajalein, cost $100 million.


    Because boost-phase defensive systems seem destined for more support, the American Physical Society is taking a closer look. It has impaneled a 13-member group to examine key physics issues, such as the “technical challenges involved in using airborne lasers,” and the speed and size of kinetic kill vehicles required to stop a missile on ascent. The panel, headed by physicists Daniel Kleppner of MIT and Frederick Lamb of the University of Illinois, Urbana-Champaign, expects to deliver a reality check next February.

    Stepping up the pace

    Although BMDO has not released a formal testing schedule, its agenda calls for a faster tempo. So far, the ground-based interceptor has had seven major tests (including four intercept tests) in 3 years, with two more (one a booster test) envisioned this year. According to Kadish's testimony and briefing papers, the pace will increase from one major launch every 6 months to one almost every other month. And that's just for the ground-based system. Under the new plan, testing must soon begin for new sea-based rockets and kill vehicles.

    The DOD budget proposal includes funds to begin this next phase of testing. And it proposes to extend the range of its Pacific tests by adding the new sites in Alaska. These would enable BMDO to track a mock attack over a longer distance and in a more realistic orientation. At present, targets launched from Vandenberg toward Kwajalein present a bright radar signal early in flight. Coyle, for one, says he “applauds” the proposal to reorient flights so that targets are moving toward rather than away from North American tracking systems.

    The expanded range, according to Lehner, would also aid in testing new systems, such as sea-based interceptors, beginning in 2008. These would focus on the boost phase. That approach would be a significant change in the program. Beyond that, BMDO officials have said they envision starting to test a space-based kinetic kill vehicle in 2005 or 2006 and a space-based laser that could target missiles on launch by 2010. Even Lehner concedes that the space-based systems will require a bit of invention to go along with the engineering.

    The most difficult challenge, Coyle and other observers say, will come when BMDO tries to coax these hugely complex but independent systems to work in concert. The first step will be computer simulation, and Lehner says that an integrated system experiment involving a live missile flight will not occur for “3 to 5 years, at the earliest.”

    “The scale of this effort is huge,” says Coyle—with more demanding requirements for accuracy and coordination than any the military has ever attempted. The cost is also huge. Over the next few weeks, the shrinking budget surplus is likely to dominate the political rhetoric as Congress finishes putting together the 2002 budget. Democratic critics of the NMD program, especially, will be focusing on next year's $8.3 billion installment payment, arguing that the country cannot afford both of President Bush's major campaign pledges: a big tax cut and a missile defense system. Look for a reprise of the Reagan-era debates over “Star Wars.”


    Why Is a Soggy Potato Chip Unappetizing?

    1. Giselle Weiss*
    1. Giselle Weiss is a science writer based in Basel, Switzerland.

    Researchers don't have the answer, but at an unusual workshop they tried to get a feel for how texture influences palatability

    ERICE, SICILY— Contemplate, for a moment, a few of the world's greatest mysteries: What came before the big bang? How did life originate? Why is a firm strawberry jam less scrumptious than a soft one? That last enigma, at least, is yielding to the practiced tongues of molecular gastronomists. The key is texture, an elusive attribute that practitioners of the science of taste are just beginning to get a feel for. Texture is complicated, says Alan Parker, a colloid scientist at Firmenich, a flavor and fragrance company in Geneva, because it is part cognition, part physics, and part chemistry. Not to mention a dollop of fun.

    Every other year since 1992, several dozen chefs and scientists have descended on this Mediterranean resort town—better known for its summer physics and cosmology schools hosted by the Ettore Majorana Centre for Scientific Culture—to take part in a weeklong International Workshop on Molecular and Physical Gastronomy.* Past sessions here have tackled problems of flavor and heat transfer. (Cooking is largely the business of adding heat.) This year, the topic was texture.

    The workshops themselves are the brainchild of the late Nicholas Kurti, a low-temperature physicist and cooking enthusiast from Oxford, U.K., and Hervé This, the former editor of Pour la Science (the French edition of Scientific American), who's now affiliated with the Collège de France in Paris. The duo coined “molecular gastronomy” in 1990 to describe the science of cookery.

    What does cooking have in common with cosmology, Ettore's usual fare? Not much. Local physicist Beatrice Palma-Vittorelli explains that when Kurti called her husband Ugo, a physicist, one day to ask for his help in finding a venue for a gastronomy workshop, Ugo mistakenly heard “astronomy” workshop. The rest is history.

    So is at least some of the distaste hard-core scientists have shown toward food science. “People like to do heroic things,” says Gordon Williams, a mechanical engineer at Imperial College, London, and the properties of food are not considered “serious science.” An oft-cited milestone in the subject's journey toward respectability is the Nobel Prize in physics won in 1991 by past Erice attendee Pierre-Gilles de Gennes, for demonstrating how large molecules flow. His discovery relates to what happens in stirring a drink, and in moving food around in the mouth. “De Gennes legitimized the study of the everyday,” says Parker.

    And that's what the gastronomy workshops are all about. In this year's powwow on texture, however, the participants nearly bit off more than they could chew. Texture may be “too difficult to solve,” says Len Fisher of the University of Bristol, U.K., but it's “too important to ignore.”

    The problem is that texture is tough to define. Scientists understand it as the orientation and size of crystals in a material, or as the internal morphology of multicomponent materials, says Peter Barham, a polymer physicist at the University of Bristol and the co-organizer of this year's meeting. But to the senses, texture is connected to how food breaks apart in the mouth, say, or to its stickiness—qualities described with adjectives like smooth or crunchy or slimy. Individual differences complicate the picture; so does the fact that the acceptability of textures is largely learned. “It's not like ceramics,” says Athene Donald, a food physicist at Cambridge University in the U.K., “where you can say, ‘This one's brittle.’”

    Erice is a workshop, not a conference, so presentations are kept to a minimum. The format, open discussion, is designed to mix disciplines: Chefs get ideas about how to cook better, traditional food scientists learn something about quality (not a high priority in the food industry, which is concerned with bulk product), and academic scientists come away with ideas to test.

    In the laboratory, the study of texture is still a nascent discipline. Williams is unusual in having spent the last 10 years examining the viscoelastic properties of cheeses by squashing them in little cylinders, “because it's interesting.” The food industry, too, which funds most work related to food, relies heavily on a “cook and look” approach but is moving toward more exact science because of a need to reproduce successes and to create particular textures. It is a big jump from characterizing the material properties of food to understanding texture, though. “We are getting numbers [from mechanical tests] like 42, and we get answers [from tasting panels] like ‘It's rubbery,’” says Williams.

    Indeed, foods are “macromolecular messes” in which you might find small bubbles, droplets, or particles, says Fisher, whose calculation of the optimal way to dunk a biscuit won him an Ig Nobel prize in physics in 1999: “All sorts of mad things can happen.” Moreover, owing to the viscoelastic properties of foods, how fast or how slow we chew them affects how they flow in the mouth. With foods, says Parker, “things are happening along many, many length scales and at the same time.”

    A particular challenge is to untangle texture from taste: Potato chips, for example, become unpalatable when soggy, although their chemical composition remains the same. At the University of Nottingham, U.K., Andy Taylor and Tracey Hollowood are working to understand why increasing the viscosity (or resistance to flow) of a food makes it seem less flavorful, even though experiments indicate that viscosity has no effect on the amount of aroma that gets into the nose. Edmund Rolls, a neuroscientist at Oxford University, and his colleagues showed in 1999 that neurons in the brains of macaque monkeys respond to the texture of fat in the mouth—particularly to liquid fat such as cream and liquefied chocolate-hazelnut spread—independently of neurons clued in to smell or taste. The tip-off was that neurons responded just as much when the monkeys were fed harmless amounts of silicone oil, which has a fatlike texture but no taste or smell. Because humans and macaques have evolved to savor fat, and because the monkeys were more readily sated by liquid fat than by solid, Rolls hypothesizes that taking fat as a liquid, or chewing foods well to liquefy the fat in them, may have application in weight control.

    The science of texture, like that of food generally, is a subtle and sophisticated one. But “to understand texture you have to be eclectic,” says Parker, “because you do not know in advance if the answers to the questions you ask are in the Journal of Neuroscience or The Physics of Sliding Friction.” The scientists hope that meeting challenges like this will help move the science of food off the back burner.

    • *5 to 10 May.


    Scientists Use Strandings to Bring Species to Life

    1. David Malakoff

    Some scientists worry that eye-catching efforts to rescue stranded marine mammals may detract from less glamorous—and smellier—research

    ASSATEAGUE ISLAND, VIRGINIA—The baby whale wasn't fast enough to escape the ship's giant propeller. With its backbone shattered and blood streaming from five deep gashes on its right flank, it was dead within minutes.

    Robert Bonde, a marine mammal researcher with the U.S. Geological Survey in Gainesville, Florida, didn't witness that marine hit-and-run earlier this year. But the anatomist had little trouble reconstructing the last moments in the life of the 8-meter-long northern right whale, now a rotting carcass here on an isolated beach. “It looks like the ship caught him from behind, just as he put his head down to dive,” Bonde concluded after examining clues provided by the size, depth, and location of the slashes.

    Bonde and his colleagues have spent decades trying to better understand the lives—and deaths—of seagoing mammals by examining the thousands of whales, dolphins, seals, sea lions, and manatees that wash up on shore each year (see table). Most don't survive: Live animals account for less than one-fourth of the 3000 reported in an average year in the United States. And most are dead for days before they're found, leaving researchers to dub their work “stinky science.”

    Still, the knowledge acquired from beached animals is fueling moves to beef up stranding science in the United States and elsewhere. Responding in part to the need for more information about endangered species such as the right whale, Congress last year approved a plan to give up to $15 million over the next 3 years to the 25-year-old U.S. stranding network, a sometimes uneasy alliance of about 400 private research institutions, independent wildlife rescuers, and government agencies. And beachcombing scientists around the world are discussing ways to improve international collaboration.

    Whale tale.

    Researchers Scott Kraus (left) and Robert Bonde (right) prepare to necropsy a stranded right whale killed in a collision with a ship.


    The growing interest in stranding work, however, is dogged by debate. Some researchers and animal-welfare advocates want to pull out all the stops to rehabilitate and release the animals back into the wild—at upward of $1 million for a large whale. But others say that it's often more humane—and more fiscally prudent—to euthanize some of the victims and invest more in the mundane job of analyzing the samples that are harvested. The tension underscores the “uniquely difficult role” played by groups that respond to strandings, says wildlife veterinarian Andrew Stamper of Disney's Living Seas pavilion in Orlando, Florida: “They must be both a humane society dealing with the welfare of stranded animals and an objective institution collecting scientific data.”

    Carrion hunt

    At least since Aristotle pondered why dolphins would wash up along ancient Mediterranean shores, humans have been interested in stranded marine mammals. But the systematic study of strandings didn't really begin until the early 1970s, when governments began outlawing whaling and giving new legal protections to all marine mammals. The changes sparked U.S. efforts to collect stranded animals and use the specimens to learn about wild populations.

    “We didn't know much about the distribution or life history of most marine mammals,” recalls James Mead of the Smithsonian Institution's National Museum of Natural History in Washington, D.C. In 1972, he and a colleague, Charles Potter, began journeying to beaches within a day's drive of the museum. They could cover a lot of ground: “We were young, so a day's drive really meant 24 hours,” says Mead, now 58.

    Up the coast, veterinarian and oceanographer Joseph Geraci was doing similar work for the New England Aquarium in Boston. Like Mead and Potter, Geraci encouraged local officials to alert him to beached animals. Such efforts led to the current stranding response network managed by the National Marine Fisheries Service (NMFS), which funnels reports to groups with the personnel and permits to handle the animals.

    Cutting-edge science.

    Tissue samples can help stranding researchers determine everything from a whale's cause of death to its family tree.


    In the early days, cutting up beached animals “was not regarded as science,” says Geraci, who recalls colleagues teasing him about his “relentless search for carrion.” But a 1973 mass stranding of white-sided dolphins on a Maine beach highlighted the scientific bounty awaiting those willing to truck the specimens to a freezer and then dissect them. Working with a small army of students, Geraci piled up discoveries about dolphin anatomy, diseases, and physiology.

    Over the years, researchers have used the locations, dates, and details of hundreds of strandings to piece together a surprisingly detailed portrait of white-sided dolphin life. Measuring fetuses from stranded females, for instance, provided scientists with insights into gestation length, breeding seasons, and growth rates—details that can be crucial to government biologists charged with protecting the species. “The life history of the white-sided dolphin is known almost exclusively from stranding data,” says Geraci, now at the National Aquarium in Baltimore, Maryland. Another 10 species of whales and dolphins—out of some 80—are known only from a handful of stranded specimens, notes Mead, who recalls spending the night on a beach in Argentina guarding the carcass of a particularly rare beaked whale so it wouldn't wash out with the tide. “I was so excited I couldn't sleep,” he says.

    In recent years, such specimens have helped researchers probe a host of biological mysteries, from why animals strand (see sidebar) to how mammals adapted to life in the sea. For instance, over the last decade, three researchers—Sentiel “Butch” Rommel of the Florida Marine Research Institute in St. Petersburg and Ann Pabst and William McLellan of the University of North Carolina, Wilmington—have led efforts to piece together one particularly interesting puzzle: how dolphins, seals, and manatees regulate their body heat, keeping it high enough to survive in the cold ocean but low enough to preserve fragile sperm and developing fetuses. The answer is a remarkable heat-exchange system that allows the animals to vary the temperatures of different organs.

    Animals hauled from beaches also help researchers track the toll from disease and human activities. The carcass of the 3-month-old right whale found in Assateague, for instance, documents the threat from ships, while other beachings have highlighted problems caused by fishing gear entanglement or plastic trash eaten by mistake. One study of 6200 California sea lions that stranded between 1986 and 1998 found that nearly 8% showed evidence of such “human interactions”—often gunshot wounds, presumably from fishers upset by the competition for fish. Other studies of tissue samples taken from beached animals helped document the spread of pesticides and industrial chemicals into the global food chain—including whales and seals that spend much of their lives in remote polar regions.

    In 1998, Frances Gulland of the Marine Mammal Center in Sausalito, California, helped solve the mysterious deaths of hundreds of California sea lions along the Pacific Coast. By analyzing stomach contents, studying toxicology reports, and sampling offshore algal blooms, Gulland and other researchers realized that the sea lions had dined on herring that had in turn fed on a well-known alga, Pseudo-nitzschia australis, which produces domoic acid, a potent neurotoxin. Because domoic acid also poses a serious threat to human seafood lovers, state officials quickly banned shellfish harvesting and fishing in affected areas. The sea lion deaths, notes Gulland, served as a “very effective early warning system in protecting human health.”

    Smelly science

    Most strandings aren't as well publicized as the sea lion die-off. And the work is hardly glamorous: The young right whale at Assateague had been killed 4 or 5 days before it was spotted on 17 March by park rangers, and it emitted the ripe aroma of spoiled meat by the time researchers gathered 2 days later. Nearly every stranding veteran has at least one tale of being banned from an eatery or motel due to the deathly stench clinging to their body and clothes. “A gas station attendant once told me just to leave the money on the pump,” recalls Rommel.

    Besides enduring the smell, stranding researchers must be adept at all types of equipment. Smaller seals and dolphins can be necropsied—the animal equivalent of an autopsy—with traditional surgical tools. But the larger whales, up to 25 meters long and weighing dozens of tons, require giant scythelike flensing blades to strip away blubber, as well as bulldozers and trucks to move body parts. “It's pretty ungainly science,” says Rommel, who once “nearly drowned” when he slipped through a hole in the skin into the mushy guts of a large fin whale.

    Sometimes the dead refuse to give up their stories. Reports that the baby right whale's body was in prime condition, for instance, had prompted anatomist Darlene Ketten of the Woods Hole Oceanographic Institution in Massachusetts to board a dawn flight in hopes of extracting the animal's ears for her studies of marine mammal hearing. But she discovered that time and the churning action of the waves had reduced the corpse's insides to a gray soup and dislodged the ears, reducing their scientific value.

    Even so, Ketten, Bonde, and researchers from the Virginia Marine Science Museum in Virginia Beach, the local stranding coordinator, filled enough vials and baggies with tissue samples to help researchers. Some are assembling a family tree of the 300 or so remaining northern right whales, using DNA extracted from the samples. Others hope to identify pollutants that are transferred from mother to calf by way of breast milk.

    Beach battles

    Because the right whale is a highly endangered species, NMFS paid part of the cost of responding to the Assateague Island stranding. But its stranding budget of less than $1 million doesn't go far. Most stranding groups stay afloat by relying on volunteers—from academics to laypeople.

    Florida has come up with a novel way to pay the freight: the sale of special license plates that promote conservation of the endangered manatee. The widely envied arrangement produces enough revenue to pay for detailed necropsies on more than 300 sea cows a year. Public outreach is another source of income. A handful of aquariums, private zoos, and wildlife groups that specialize in rehabilitating live stranded animals raise $200,000 or more a year by highlighting efforts to nurse beached animals, usually seals and sea lions, back to health. The work is painstaking, involving everything from hourly bottle feedings to exhausting daily workouts to rebuild injured muscles.

    Ready for study. Some samples will go into government tissue banks for genetics and pollution studies.

    Such rescue efforts, however, are a source of tension between animal-welfare advocates, who emphasize saving the animals, and researchers, who say that sometimes euthanasia—usually by lethal injection—is the better option. “These are emotionally charged settings,” says one stranding veteran. “I've seen shouting matches and fistfights, people tugging on both ends of a pilot whale, and kids throwing themselves in front of a hypodermic needle.”

    To cool the passions, NMFS officials have drafted guidelines on when to attempt a rescue and which rehabilitated animals should be allowed back into the wild. The agency must “confront a profound ethical and practical question,” says Kraus. “Are we defying natural selection by putting back animals that shouldn't be in the gene pool?” No one knows, for instance, how many refloated animals actually survive. Kraus says that, according to unpublished data collected by the New England Aquarium, about half of the seals it has rehabilitated over the last few years died within a few months of release.

    Even animals that make it to rehabilitation centers pose risks. Rescued whales and dolphins have spread fatal viruses to captive animals. And some researchers worry that the patients, if released, could spread new strains to wild populations.

    Some rescue advocates argue that such risks have to be weighed against what we can learn from rehabilitated animals—and their value as vehicles for public education. “Rehabilitated animals have taught us a lot about behavior and physiology … and helped make the case for marine mammal protection,” notes Sharon Young, a Cape Cod, Massachusetts-based consultant to the Humane Society of the United States. Rehabilitated dolphins, for instance, have played key roles in studies of cetacean intelligence, communication, and hearing. And satellite tags attached to released animals have provided new insights into the animals' travels, she says, revealing that they sometimes take trips of unusual length to unexpected destinations. Young says that stranding groups “should start with the notion that you are going to try to save [beached animals]” and make euthanasia a last option.

    Money talk

    The tension surrounding rescues has helped spark debate over how NMFS should allocate the new fund—a total of $4 million this year in grants of up to $100,000—that Congress wants to pump into the U.S. stranding network. NMFS officials and some marine mammal scientists have cautioned against spending too much on rescue and rehabilitation, which can cost $50,000 per seal or sea lion and double that per whale. In one well-publicized case, Sea World entertainment park in San Diego, California, spent more than $1 million to rear and release a single baby gray whale found on a beach in 1997. Its fate is unknown, in part because a tracking tag fell off the animal shortly after release.

    “Most of the network recognizes the valuable information you can get from dead animals and the severe financial drain created by [caring for] live animals,” says NMFS's Teri Rowles, who leads the program and is sifting through public comments on this summer's draft. But outsiders say the strong public interest in rehabilitation virtually assures that some of the money, to be handed out starting later this year, will go to rescues. In the meantime, Rowles and her staff are developing a Web-based tracking system to keep tabs on U.S. stranding events and working to beef up the several tissue banks the agency helps fund.

    Rowles is also involved in discussions with other nations that have stranding networks, notably New Zealand and the Netherlands, as well as those, such as Japan and Taiwan, starting to build them. The long-range goal is a data-sharing system that will allow researchers to spot large-scale patterns that might point to major ecosystem changes, such as increases in harmful algal blooms or climate shifts. For the moment, however, the focus is on solving existing puzzles. Gulland, for instance, is one of many researchers trying to understand the causes behind a recent spike in the number of gray whale strandings along the West Coast. Some believe it is a healthy sign that the threatened species has rebounded to fill its ecological niche and that the animals washing ashore had died naturally. Others worry that the beachings foreshadow an onslaught of pollution or disease. Whatever the cause, the episode points to the continuing attraction of marine mammals in death as in life.


    Why Do Marine Mammals Strand?

    1. David Malakoff

    Although many strandings end up as unsolved cases, scientists have no shortage of suspected causes. Post-mortems indicate that many beached animals are diseased or malnourished due to age, injury, or changing ocean conditions. Others have been poisoned by toxic algal blooms or injured by fishing nets, boats, or predators such as sharks or killer whales. Sometimes, big storms or unusually strong tides may disorient the animals, particularly in shallow coastal waters. And some studies suggest that parasites might disable an animal's ability to navigate by disrupting sensory organs such as ears.

    It's been harder, however, to pinpoint the causes of relatively rare mass strandings, which can involve dozens of seemingly healthy animals. One factor may be the strong social bonds forged by some species, such as bottlenose dolphins and pilot whales. Some researchers speculate that healthy animals may follow a sick leader or parent into danger, unable to break their allegiance. Others wonder whether the animals may be disabled by a growing load of chemical pollutants, such as pesticides, found in their tissues. Most scientists are skeptical, however, about another theory, which brands beached marine mammals as suicidal.


    From the Mouths (and Hands) of Babes

    1. Laura Helmuth

    Nicaraguan deaf students have created a new sign language, and it has fueled the debate among linguists over how languages are formed

    When Nicaragua established a school for the deaf in the late 1970s, teachers—all of them hearing—focused on teaching the children to read lips and to read and write Spanish. But outside the classroom, the children began to communicate by their own rules. Some teachers noticed the strange new hand gestures that the children exchanged and dismissed them as “mime,” and a few adults found it useful to learn some of the gestures to communicate better with the children. But none, apparently, realized they were watching the birth of a language.

    Linguists discovered Nicaraguan Sign Language (NSL) in the mid-1980s, immediately recognizing it as a developing language and not a crude game of charades. They knew they'd struck research gold. “This is a unique case,” says cognitive scientist Steven Pinker of the Massachusetts Institute of Technology (MIT). For the first time, he says, researchers have had the “ability to witness, in real time, how the structure of a language emerges as the language is being created.”

    No one taught the students to sign. Coming from hearing families and villages across the country, most possessed only a crude “home sign” system for expressing their needs to their families. But once a critical mass of children and teenagers came together, they created a still-growing vocabulary of signs and rules of grammar for stringing the signs together meaningfully. “This language was created on the school bus and in the play yard and in the street,” says psycholinguist Ann Senghas of Barnard College in New York City.

    The children, of course, had no idea that their rapidly coalescing system of gestures would feed into one of the fiercest debates in linguistics. Are children born with a so-called “language acquisition device,” an innate capacity for syntax, that prepares them to build a language anew from the merest scraps of linguistic input, an idea most famously championed by Noam Chomsky of MIT? Or, as many others believe, do children simply possess general strategies for solving problems—and learning to communicate is one of the most immediate and urgent problems they face?

    Not in the classroom. NSL arose from spontaneous conversation, not a lesson book.

    The debate can quickly turn ideological—“more religion than science,” says psycholinguist Dan Slobin of the University of California, Berkeley, who falls in the general social strategies camp. But fueling the debate are a few special cases, such as the birth of NSL, in which linguists can try to tease apart the effects of the environment and innate abilities. In most cases, the data are rich and varied enough to lend themselves to a variety of interpretations—and NSL is no exception. Indeed, both camps are finding support for their views from this unique group of children.

    Evolution of grammar

    One of the early linguists on the scene in Nicaragua was Senghas, who eventually joined Pinker's lab as a grad student at MIT. Starting in the early 1990s, she tried to excavate the “fossil record of language emergence”—albeit a very fresh fossil record. By that time, new students had been arriving at the school for more than a decade, and the language had been growing more sophisticated and complex every year. Senghas wanted to find out who was behind this developing complexity—was it the very young children, the preteens, or the teenagers? If the youngest children were the architects of the language, that would lend credence to the argument that children come pre-equipped to invent language.

    She focused on a form of grammar common to every known sign language but absent from spoken languages. Signers use locations in space to show how objects or ideas are related. For instance, making the sign for “cup” in a certain spot, followed by the sign for “tall” in that spot, makes it clear that the cup—and not necessarily the person drinking from it—is tall.

    Signers don't always use this spatial grammatical construct, and they are able to get by without it—as in any spoken language, a listener can usually figure out an ambiguous sentence by context. From the start, linguists noted that NSL signers occasionally used space to convey meaning. But as NSL developed over the years, the roughly 400 students made more systematic use of space to clarify their intentions.

    To quantify the use of spatial grammar and find out who uses it most, Senghas and Marie Coppola of the University of Rochester in New York showed a short cartoon to some NSL students and asked them to describe what they saw to other students. The researchers videotaped the interactions and simply counted how often and in what context the signers used spatial locations. They divided the students according to how old they were when they first came to the school—some arrived shortly after birth and others enrolled at age 19—and whether they entered the school in its earliest years or more recently. The researchers found that NSL signers who entered the community later (after 1983) were more likely to use space to link two concepts than were those who joined the school before 1983, the team reports in the July issue of Psychological Science. The prevalence of spatial grammar in people who entered the community recently—despite the fact that this group is younger, on average, and has been speaking the language for fewer years than the earlier cohort—indicates that the use of spatial grammar is a relatively recent phenomenon in the development of this language.

    But the team's other finding has intrigued researchers even more. When they looked within the group of students who arrived at the school after 1983, they found that children who entered the school (and thus were exposed to the language) at the youngest ages—10 years or less—were the most likely to make use of this newly emerging grammatical construct. This suggests, Senghas says, that young children “are the creators of the language”—they're the ones pushing NSL to become more systematic and grammatical. The finding supports the notion that children have a “specific endowment for language,” says psycholinguist Lila Gleitman of the University of Pennsylvania in Philadelphia.

    Reinventing the wheel

    Gleitman, for one, isn't surprised that young children seem to be creating NSL. In her assessment, all children, hearing or deaf, do much the same thing. “Every child in the course of so-called learning reinvents the notion of language,” she says. That is, children hear more or less fragmentary speech in their environment, but they manage to deduce complex rules of grammar without being told these rules explicitly. And if they aren't exposed to a fully formed language, she says, they create one.

    The birth of NSL, Gleitman claims, is a special case of a process that's been seen before: the transformation of a pidgin language into a creole. When people who speak a variety of native languages come together and have to communicate, they create a “contact language,” or pidgin, which has some basic vocabulary and crude structure. One of the best-studied examples arose in Papua New Guinea on banana plantations where workers were brought in from many linguistically isolated islands.

    The language of love is apparently translatable into pidgin: People in these communities mate and have children and speak pidgin in the home. But their children aren't satisfied with a stunted language. They elaborate on the pidgin, creating and systematizing rules of grammar and adding to the vocabulary until the language is sophisticated enough to be called a creole. The main linguistic powerhouses behind this creolization, claims Gleitman, are 5- to 8-year-olds, although she admits there's debate within the creolization field about the importance of children's contributions.

    In NSL, the children started with even less linguistic input than those who grow up hearing pidgin. But even without hearing a language or seeing a language signed, Gleitman says, children “know what language is about. Everybody knows this from scratch.”

    Rules of the game

    But Slobin is not convinced that children are born knowing how to create a language. And he questions whether the youngest children are the sole innovators behind the emerging NSL language: “There's no evidence that [NSL] is being produced by children.” Certainly, he says, the younger children are systematizing the language—making the rules more regular and establishing the community norms of how to communicate. But the basic grammatical constructs—including using space to show that two things are related—were already present even in the first group of students, he says, whatever their age when they arrived at the school. The children who arrived at younger ages are more fluent, just as people who learn a second language at a young age are more fluent than those who try to pick one up as adults, says Slobin. Those who learned NSL as young children are more likely to use space to designate meaning, he says, but they're refining and automating the language rather than inventing it from scratch.

    Children often impose structure on a system, says Slobin. “Children in groups can agree on all sorts of rules,” such as how to play a game. Similarly, in order to communicate efficiently, he says, children in the NSL community have to agree with each other about how to sign things. Slobin says this tendency to establish group norms is “more general than anything that happens in language.”

    In the middle

    So are the young Nicaraguan children creating language from scratch, possibly by activating an innate language program? Or are young children just perfecting rules of a language established by children of all ages who are solving the social problem of how to communicate? “We're not taking a stand on that,” says Coppola, who concedes that the data are not definitive. Even Gleitman admits that the evidence can be interpreted in a variety of ways. “If you know the literature,” she says, “you can maintain positions all along the spectrum and not make an ass of yourself.”

    In the meantime, NSL continues to grow. The important studies now, Slobin says, will be of how the grammar develops over successive generations. The use of video techniques has made it possible to get a good record of the development of a sign language, something that was unavailable when American Sign Language was being systematized. And the origins of spoken languages such as English or French are “lost in the mists of prehistory,” says MIT's Pinker. Even when more recent creoles developed, he says, “no one was there taking notes as it emerged.”