News this Week

Science  04 Aug 2000:
Vol. 289, Issue 5480, p. 706



    Flawed Cancer Study Leads to Shake-Up at University of Oklahoma

    1. David Malakoff

    Allegations of lax safety procedures and flawed management in a clinical cancer trial have cost four researchers and administrators their jobs at the University of Oklahoma. They have also led to a temporary shutdown of 75 clinical trials at the university's Health Science Center in Tulsa and a sweeping overhaul of the school's process for approving human experiments. University officials emphasize that none of the roughly 100 patients involved in the 3-year-old study of an experimental cancer vaccine was known to have been harmed, and that most trials will soon be restarted.

    Researchers and administrators around the country are closely watching the events at Oklahoma for clues to how a new federal office charged with overseeing human experimentation—the Office for Human Research Protections (OHRP)—is likely to operate (Science, 16 June, p. 1949). This is the first major case that the office, set up in May in the Department of Health and Human Services, has dealt with in public. The new administrative procedures, announced by university president David Boren on 21 July, could also provide a test-bed for other institutions, because they mirror tough rules for clinical trials that are being considered, but have not yet been approved, by Congress. Oklahoma has “chosen to be way ahead of the curve,” says Jeffrey Kahn, director of the Center for Bioethics at the University of Minnesota, Twin Cities.

    Oklahoma's troubles began last December, according to a chronology of the events outlined in a 17-page, 29 June letter from OHRP compliance chief Michael Carome to University of Oklahoma officials. A nurse coordinating the federally funded safety trial of a vaccine to treat melanoma, led by surgeon J. Michael McGee, told university officials and the Food and Drug Administration (FDA), which regulates vaccines, that she feared sloppy shipping practices might enable bacteria or viruses to contaminate the vaccine. The vaccine, which was made on the Tulsa campus, consisted of material extracted from specially grown cancer cells.

    Carome's letter notes that on 31 January, McGee wrote to FDA acknowledging that the trial was “out of compliance” with federal regulations due to staffing and administrative problems. He said he would not enroll new patients until the problems were solved, but would continue providing vaccine to 28 patients already in the trial. On 9 March, however, McGee and Tulsa officials opted to shut down the trial after receiving a scathing consultant's report that was requested by the Tulsa science center. The report by Hausmann and Wynne Associates of Marlborough, Massachusetts, excerpted in the 14 July issue of Tulsa World, found numerous shortcomings. They included “egregious lack of control” in vaccine manufacture and record-keeping, lack of training for lab staff, and a database that identified patients by name, in violation of confidentiality rules. As a result, the consultants recommended that all remaining vaccine “either be destroyed or clearly labeled ‘For research use only—Not for use in humans.’” According to OHRP, the report's conclusions were shared with McGee; his superior, Thomas Broughan, chair of Tulsa's surgery department; medical dean Harold Brooks; and Daniel Plunket, chair of the science center's eight-member Institutional Review Board (IRB), which oversees clinical trials.

    In a 10 April letter to patients and collaborating physicians at eight sites around the country, however, McGee did not mention potential safety problems as a reason for the trial's closure. Instead, he wrote that due to “a great deal of interest in the study,” he was “unable to provide material for further injections.” According to the OHRP chronology, IRB members—who are supposed to be informed about any change in a trial—were also given a misleading explanation for the trial's end.

    On 27 June, however, after OHRP officials informed the university that it would investigate and staffers prepared to interview IRB members by phone, the IRB panelists were finally given copies of the consultant's report. Two days later, OHRP's Carome sent Oklahoma officials his letter, which included the chronology and a summary of regulatory violations. Carome informed the university that OHRP had decided to suspend Tulsa's five remaining federally funded clinical trials until the university came up with “a satisfactory plan” for protecting human subjects.

    Among many violations Carome cited in the melanoma trial are that 11 of the first 18 patients enrolled did not meet the study's approved criteria and that consent forms contained wording that was overly optimistic about the vaccine's potential benefits. Carome also wrote that Plunket's failure to share the consultant's report with his committee “undermined the IRB's independence and authority in a manner that transcends this study.” “In OHRP's experience,” Carome wrote, correcting such problems “would necessarily include changes in leadership.”

    Boren, a former U.S. senator who has led the university since 1994, took the hint. He suspended another 70 university-sponsored trials, disbanded the Tulsa IRB, sent new letters to patients in the vaccine trial disclosing the real reason the study was halted, and appointed a six-member team to devise a scheme for approving and monitoring trials.

    At a 21 July press conference, Boren announced that the four men most closely associated with the trial would be leaving the university: He had moved to terminate McGee; dean Brooks and director of research Edward Wortham had resigned; and Plunket had retired. (None of the four responded to interview requests by Science.) Boren also outlined an eight-point plan for overseeing clinical research at the university (see box) that closely follows rules, included in a bill introduced in Congress by Representative Diana DeGette (D-CO), that are considered likely to pass. “We have no choice but to demonstrate we're making a fresh start,” Boren said. “The system failed because information about potential criticisms and irregularities did not make its way up the chain of command.”


    • Institutional Review Board (IRB) revamped

    • IRB will be accredited by independent agency

    • All projects approved by senior administrator and IRB

    • A new independent research compliance office

    • Unannounced spot checks of research

    • Mandatory researcher education on procedures

    • Researchers must sign conduct contracts

    • “No fault” hotline for reporting potential violations

    Some University of Oklahoma researchers worry that the new requirements are an “overreaction to an isolated incident,” as one biochemist who has worked at the Tulsa center puts it. Minnesota's Kahn predicts that “we're going to see more of these kinds of requirements,” although he questions whether “more levels of sign-off will protect human subjects.” What's needed, he says, is greater attention to how researchers recruit and inform subjects “in the real world.”

    Meanwhile, researchers and patients who have worked with McGee, a former missionary to Nigeria who came to the Tulsa campus in 1989, say they are stunned by the developments. “He's the last person I know who would ever want to hurt a patient,” says Joseph Price III, a biomedical researcher at Oklahoma State University in Tulsa who helped McGee design animal tests for the experimental vaccine. And three patients in McGee's trial have hired a lawyer—not to sue the embattled physician, but to help them petition FDA to release vaccine stocks that they believe will keep them alive.


    An Improvement in Vital Signs

    1. Andrew Lawler

    Life scientists hoping to conduct research in space finally have some good news. Last week a hefty Russian module with living and working quarters for astronauts docked with the pieces of the international space station already in orbit, a critical step in creating a full-time orbiting laboratory. Meanwhile, NASA bureaucrats put the finishing touches on a realignment of the agency's struggling biology effort that should bolster fundamental research and allow scientists to make better use of the facility, scheduled to be completed in 2005. The two events raise the hopes of U.S. academic space life scientists that their discipline is at last on the ascent at NASA (Science, 12 May, p. 938).

    On 25 July Russian controllers at a mission control center outside Moscow guided the 20-ton Zvezda module to a docking with the U.S.-funded and Russian-built Zarya module. The maneuver formed a single spacecraft the length of an 11-story building. Although researchers must wait for the U.S. laboratory to arrive in late January, Vice President Al Gore praised the docking as a sign of the station's pending payoff for scientists, and NASA officials savored the opportunity to move beyond short, sporadic experiments on the space shuttle to more substantive projects. “We finally see the carrot at the end of that stick,” says Julie Swain, deputy chief of NASA's life and microgravity sciences office, referring to the long and painful process of getting the station into orbit.

    While engineers are putting the station through its paces, NASA managers are overhauling Swain's office in a way that will raise the profile of biological research. The new organization, due to be announced this week, will divide the current life sciences division into biomedical activity and fundamental biological research. The former, which will be run by NASA's Johnson Space Center in Houston, will focus on such human health problems as excessive bone loss from long-duration space travel. The latter, led by NASA's Ames Research Center in Mountain View, California, will include more fundamental research in such areas as cell biology. The two pieces, plus work in microgravity and other fields, will make up an office of fundamental space research.

    The new arrangement reflects a shift in emphasis from a program centered on keeping astronauts healthy to one that will foster the exploration of fundamental biological processes. “This change is really necessary and long overdue,” says Esther Chang, a genetics researcher at Georgetown University and a member of NASA's life and microgravity sciences advisory panel. “It has been very difficult to keep these two areas together and give each the attention it deserves.” NASA now spends $57 million on biomedical research and countermeasures, not including health research and other related areas, and $39 million on fundamental biology.

    Whether the new arrangement will translate into a bigger budget won't be clear until next year. “All of the professional societies involved have endorsed significantly increased funding for biological programs,” says Norman Lewis, a biologist at Washington State University in Pullman and a former president of the American Society for Gravitational Space Biology. He would like to see a greater investment in Earth-based experiments to complement space-based missions, a view endorsed by agency officials.

    NASA officials are looking for a prominent researcher with significant management experience to head the new office. Arnauld Nicogossian, the longtime head of the life and microgravity office, was to remain head until his replacement was named, but in mid-July he was relieved of that duty. (Nicogossian is now the chief health and safety officer for NASA.) NASA Chief Scientist Kathie Olsen, a biologist who was instrumental in the reorganization, has been named acting chief while a search is begun for a permanent boss. But sources say she will not apply for the job. Swain, trained as a physician, is said to be a candidate.

    As for what kind of research will be done once the station is complete, Swain says that “we're not even sure what questions we will be answering in terrestrial laboratories. But I think we're going to have a dynamite research program to help find some fundamental answers.”


    Solar Storm Knocks Out Japanese Satellite

    1. Dennis Normile

    TOKYO—Japan's x-ray astronomy program was dealt a new blow last month when a solar geomagnetic storm left an orbiting x-ray telescope spinning out of control. Scientists are dubious about their chances of saving the 7-year-old Advanced Satellite for Cosmology and Astrophysics (ASCA), whose replacement—the ASTRO-E x-ray satellite—was lost shortly after launch in February. “We haven't given up,” says Hajime Inoue, head of space astrophysics research at Japan's Institute of Space and Astronautical Science (ISAS) in Sagamihara, outside Tokyo. “But we don't have a great amount of hope.”

    ISAS scientists believe a solar storm on 14 July expanded Earth's atmosphere to the point that it increased atmospheric drag on the satellite, which was orbiting at an altitude of about 440 kilometers. The drag disturbed the angular momentum of the satellite, which sent it spinning out of control. The next day it went into a safe mode, spinning in such a way that its solar panels are not facing the sun. Inoue says the best chance for regaining control of the satellite will come in a month or so, when ASCA moves into a better position for generating solar power.

    Developed jointly with NASA's Goddard Space Flight Center and several American and Japanese universities, ASCA had been a key component of ISAS's relatively small but carefully targeted space program. Its observations have generated more than 700 papers, Inoue notes proudly. One major finding was the detection of iron in the x-ray emissions from accretion disks, the swirls of gas and dust that orbit black holes. These iron emissions bore telltale evidence of the enormous gravitational pull of the black hole, something expected but never before observed. “ASCA had already been a very big success,” Inoue says.

    ASCA would have lasted only another year before falling into Earth's atmosphere. And a replacement for the lost ASTRO-E is still 4 to 5 years away. In the meantime, Japan's x-ray astronomers are trying to borrow time on other instruments. “There is a big gap in our [observational] program,” Inoue says.


    USDA to Commercialize 'Terminator' Technology

    1. Jocelyn Kaiser

    For the past year, the U.S. Department of Agriculture (USDA) has been juggling a political hot potato: whether to pursue commercialization of a controversial biotech discovery that can render seeds sterile. A diverse group of opponents, including some scientific groups and companies, has denounced this so-called “terminator” technology as an unconscionable threat to poor farmers. But last week USDA officials announced they will move ahead with the technology because of its scientific promise—albeit with conditions negotiated with its industry partner to guard against it being used in harmful ways. Antibiotech activists adamantly oppose the decision, which runs counter to the intentions even of biotech giant Monsanto.

    At issue is what is formally called a “technology protection system,” developed by USDA and Delta & Pine Land Co. (DPL) of Scott, Mississippi, which are co-inventors on related patents. The intended application is to protect a company's investment in developing genetically engineered plants by preventing farmers from using their seeds for the next year's planting. This is done by adding three genes to a plant. If the seeds from the modified plants are treated with an antibiotic, the plants that grow from those seeds will produce a toxin that renders their seeds sterile. So far, the technology has been tried only in an experimental tobacco plant at a USDA lab in Lubbock, Texas.

    When word got out about the first patent in 1998, the Rural Advancement Foundation International (RAFI) and others launched a highly visible campaign against the technology (Science, 30 October 1998, p. 850). Critics charged that it would prevent subsistence farmers from saving seeds and that pollen from the plants might sterilize neighboring fields as well. Soon after, the world's largest nonprofit agricultural research group, the Consultative Group on International Agricultural Research, pledged never to use the technology in its crops. Faced with heated opposition, Monsanto (now part of Pharmacia) also declared a moratorium on using the technology last October when it was considering buying DPL.

    Meanwhile, away from the fray, some scientists inside and outside USDA have been arguing that the technology is too promising for the department to abandon. “There's so much good science to come from it,” says James Cook, a plant pathologist at Washington State University in Pullman. The patent could be used to turn any gene on and off—“a goal of all plant breeding,” said USDA tech transfer official Richard M. Parry Jr. at a meeting last week of USDA's new biotech advisory panel. He adds that “there are many other beneficial applications,” including preventing the spread of genes from genetically modified crops to wild plants. These benefits persuaded USDA to pursue its patent and its agreement with DPL, despite vociferous opposition.

    The opponents were well represented at the panel meeting, where USDA sought advice on what conditions it should include in the licensing agreement with DPL—not, as some expected, on whether it should proceed with the agreement at all. The diverse panelists offered several, such as making DPL legally liable should the plants damage a neighbor's field; removing USDA from the controversy by transferring its patent rights to a trust; and not licensing it to companies that own more than 40% of the market for a seed. “I still think it's a bad idea. I'm signing on to something that would make it a tiny bit better,” said Margaret Mellon of the Union of Concerned Scientists.

    By the meeting's end, the panelists had reached consensus on just one recommendation: USDA should ban the technology's use on existing varieties and on all plants that aren't highly self-pollinating—which, critics note, is what DPL plans to do anyway.

    USDA's decision—it expects to finalize the agreement with DPL in the next few months—is unlikely to satisfy groups such as RAFI, which issued a press release calling the advisory board discussion “a giant charade.” But in the larger scheme, what USDA does will not determine the fate of “terminator” technology; several companies are pursuing patents on similar technologies—and they will probably not be inviting critics to the table.


    Atmosphere Drives Earth's Tipsiness

    1. Richard A. Kerr

    For more than a century, geophysicists who track Earth's rotation have sensed a rhythmic unsteadiness about the planet, an ever-so-slight wobbling whose source remained frustratingly mysterious. But researchers have been homing in on the roots of the so-called Chandler wobble, and now a report in the 1 August issue of Geophysical Research Letters fingers the shifting pressures of the deep sea and ultimately the fickle winds of the atmosphere.

    Although the 18th century Swiss mathematician Leonhard Euler predicted that Earth should wobble on its axis at a pace of around once a year, it wasn't until 1891 that American businessman and amateur scientist Seth Carlo Chandler Jr. detected this wobble through analysis of stellar observations. Once every 14 months, Chandler found, Earth's spin axis wanders near the geographic pole within a rough circle anywhere from 3 to 6 meters across. If the off-kilter motion resulted from a single nudge to the tilted spinning top that is Earth, calculations showed it would have faded away in a few decades. Something must keep pumping energy into the wobble, researchers knew—but what?
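    For readers curious about the numbers behind Euler's prediction and Chandler's observation, the standard textbook sketch (these figures are not from the article) is that a rigid Earth should precess freely with a period set by its polar and equatorial moments of inertia, $C$ and $A$:

    \[
    T_E \approx \frac{A}{C-A}\,T_{\mathrm{day}} \approx 305\ \text{days},
    \]

    whereas the observed Chandler period is

    \[
    T_C \approx 433\ \text{days} \approx 14\ \text{months},
    \]

    the lengthening being attributed to Earth's elasticity and its fluid oceans, which let the planet deform as it wobbles.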

    Candidates abounded, but most eventually fell short. The jolts of great earthquakes come too infrequently. Wind blowing on mountains proved too feeble. That seemed to leave something in the ocean as the most likely possibility. To pin it down, geophysicist Richard Gross of the Jet Propulsion Laboratory in Pasadena, California, compared how Earth actually wobbled between 1985 and 1996 with how strongly the ocean and atmosphere, as simulated in the latest computer models, could have driven the Chandler wobble. Winds and currents proved far too weak in themselves, but the varying pressure that water pushed around by the wind exerted on the sea floor accounted for two-thirds of the wobble. Shifts in atmospheric pressure explained the other third.

    Gross “has found the two biggest contributors” to the Chandler wobble, says geophysicist Clark Wilson of the University of Texas, Austin. And only one of these is in charge. “The oceans are mainly wind-driven, so you have the atmosphere driving the whole thing,” Wilson explains. Aside from satisfying geophysical curiosity, that insight could help fly spacecraft to the planets. Gauging a spacecraft's precise location is tricky from an unsteady platform like Earth, but it may be easier now that scientists know what's rocking the boat.


    Panel Backs EPA and 'Six Cities' Study

    1. Jocelyn Kaiser

    The Environmental Protection Agency (EPA) has won a major victory in the fierce battle over its tough new standard for particulate air pollution. Dealing a sharp blow to critics from industry, a nonpartisan research group has reevaluated key data that EPA relied upon to set that standard and has come out firmly behind the agency. Although all scientific debate isn't over, the reanalysis “puts to bed many of the concerns that were raised” 3 years ago, asserts John Vandenberg, an EPA environmental scientist.

    At issue was EPA's 1997 decision to extend its regulation from particles 10 micrometers or less in size to those a mere 2.5 micrometers or less across (PM2.5). EPA based its decision largely on two controversial studies that linked these tiny particles, released mainly by motor vehicles and power plants, to higher death rates.

    In the Six Cities study, Harvard researchers examined the relation between levels of PM and sulfates (a component of fine particles) and death rates among more than 8000 people in six U.S. cities, following them for 14 to 16 years. The American Cancer Society (ACS) study followed over 500,000 people in 154 cities for 8 years. Both found a slight rise in death rates from heart and lung disease in cities with higher levels of PM2.5, although the mechanism remained unclear. Based largely on the ACS death count, EPA calculated that the benefits of cutting PM2.5 to 65 μg/m³ over 24 hours would far outweigh the multibillion-dollar costs.

    After EPA proposed the standard in 1996, the American Petroleum Institute (API) and other industry groups blasted the two studies. Some scientists also argued in congressional hearings that the apparent link might result from other air pollutants, a less healthy lifestyle in dirtier cities, or other confounding factors. Industry groups sued to block the new regulations. A federal court decided that the science was sound but threw out the rules based on legal arguments, which will be heard by the Supreme Court this fall. At the same time, skeptical industry groups and some lawmakers demanded that the Harvard researchers turn over their raw data. The researchers refused, saying that subjects' confidentiality would be breached.

    To resolve the scientific and data-sharing issues, Harvard turned to the nonprofit Health Effects Institute (HEI) in Cambridge, Massachusetts. HEI assembled an expert panel to reanalyze both studies. In a report released last week, that panel concluded that the association between PM2.5 and excess mortality is real. The team, led by statistician Daniel Krewski of the University of Ottawa, replicated the studies from original data sets and got essentially the same results: slightly higher death rates in the dirtier cities. The team probed the data for more than 30 possible confounders, from altitude to health services, and tested the link “in nearly every possible manner” with various analytical techniques. The results still held.

    Bill Frick, an attorney with the API, agrees that the reanalysis has “eliminated some of the uncertainty.” Another major epidemiology study released by HEI that looked at daily PM levels and deaths in 90 cities has also cleared up earlier doubts (Science, 7 July, p. 22). But Frick argues that researchers still need to figure out which component of PM2.5 causes harm and hence what problem needs to be fixed—power plants or diesel trucks, for instance. A slew of new federally funded research is addressing those questions and will feed into EPA's assessment of PM2.5 science this fall. Until EPA decides whether to adjust the standard next year, it won't ask states to comply with the regulations.

    Meanwhile, the legal scuffle over access to research data continues. In the wake of the controversy, Congress in 1998 passed a law, sponsored by Senator Richard Shelby (R-AL), mandating that federally funded researchers release their raw data if requested under the Freedom of Information Act. To the relief of scientific groups, the White House interpreted the law narrowly, limiting it to grants awarded after fall 1999 and only to data used to support regulations. The U.S. Chamber of Commerce threatened to sue to broaden that interpretation and began the process by filing requests last December for the Harvard data. So far, EPA has refused to turn over the data because the study predates the law. Keith Holman, an attorney with the Chamber of Commerce, says the group hasn't yet decided whether to litigate the case.


    Ehlers Bill Wins Bipartisan Backing

    1. Jeffrey Mervis

    A House panel has unanimously endorsed a major bipartisan initiative to improve math and science education in U.S. elementary and secondary schools. The bill would authorize nearly $100 million a year for several new programs to be run by the National Science Foundation (NSF), a sizable addition to its current $275 million budget for precollege education.

    Last week's 36-0 vote by the House Science Committee also marks a milestone in a 2-year effort by Representative Vern Ehlers (R-MI) to translate recommendations from his 1998 report on the future of U.S. science into concrete programs (Science, 21 April, p. 419). Getting Congress to pay for the initiative, however, is still a long shot.

    “We've gotten to first base,” said an exultant Ehlers about the bill, H.R. 4271. A companion bill introduced by Ehlers that proposes changes in the Department of Education's science programs is moving more slowly, however, and a third component, granting tax credits to teachers, has been blocked by House leaders.

    The key provision in the NSF bill would create a $50-million-a-year program to pay the salaries of “master” science and math teachers in elementary and middle schools across the country. These experienced teachers would be freed of classroom duties to help improve curricula, coordinate labs and other hands-on activities, and conduct after-hours professional training. Having ready access to such a person, Ehlers says, can be a great help to an inexperienced teacher uncomfortable with science: “The idea of master teachers was the one element that everybody thought was essential as I talked to people about the bill.”

    Other provisions would fund scholarships for undergraduate science majors who promise to teach for at least 2 years, computer training for those already in the classroom, summer and after-school research grants, and competitions for public-private partnerships that would foster distance learning, strengthen community colleges, and promote the involvement of women and underrepresented minorities. Two provisions were first proposed as stand-alone bills by Democrats, whom Ehlers went to great lengths to accommodate in crafting the legislation.

    Given its bipartisan nature, the authorization bill stands a reasonable chance of winning House approval before Congress adjourns in early October, although its prospects in the Senate are uncertain. A bigger obstacle will be convincing a separate spending panel, led by Representative James Walsh (R-NY), to insert the necessary funds in NSF's budget. “Talk is cheap,” said committee chair James Sensenbrenner (R-WI). “What counts is getting the money.” Representative Sherwood Boehlert (R-NY), who stands to become science committee chair should the Republicans retain control of the House and Sensenbrenner ascend to head the more prestigious Judiciary committee, admitted that he's failed in the past to win funds for education bills. But he promised, half in jest, to “lead a march” on Walsh's house to persuade him to allocate the additional funds.


    U.K. Unveils 'Brain Gain' Initiative

    1. Kirstie Urquhart*
    1. Kirstie Urquhart is U.K. editor for Science's Next Wave. With reporting by Judy Redfearn in Bristol, U.K.

    CAMBRIDGE, U.K.— A handful of top scientists could soon be paid six-figure salaries to live and work in the United Kingdom. Last week, U.K. Trade and Industry minister Stephen Byers announced a $6-million-a-year program to lure as many as 50 research stars to the country or persuade others who might accept lucrative offers from abroad to remain in Great Britain. “What we hope to achieve is that some of the most outstanding young people will be motivated to stay here, and some who have left will want to come back,” says Sir Eric Ash, treasurer of the Royal Society, which will administer the program.

    The initiative is the latest in a series of moves in the past few weeks aimed at buoying up the British scientific community. Last month, the U.K. government announced a $1.7 billion plan to shore up deteriorating facilities and raise stipends for Ph.D. students (Science, 14 July, p. 226), and last week it unveiled a 3-year spending plan that would give science an annual 7% raise. Stemming the brain drain is a top priority: The education ministry has also announced plans to allot an extra $75 million in the budget next year for universities to recruit professors in all fields.

    The new program, part of the long-anticipated government white paper on how to improve U.K. science, hopes to bring in top minds to exploit these investments. The so-called “Brain Gain” fund is designed to build on the success of a current Royal Society program to top off the salaries of 15 elite professors. Half the money will come from the Department of Trade and Industry and half from a London-based charity, the Wolfson Foundation. Details of how the funds will be allocated and who, or which institutions, can apply are still being worked out. But the scientists chosen for the program will be able to use the money flexibly—to supplement their own salaries, hire an additional research assistant, or purchase equipment, for example. Each topflight scientist could receive about $150,000 a year, more than double the premium salaries professors now get paid. The funders anticipate footing the bill for the research stars for up to 5 years.

    Rank-and-file scientists welcome the initiative. Scarce resources for salaries “certainly is a problem when we've thought about trying to recruit people from the States,” says immunologist Doug Fearon of the University of Cambridge. He was lured from the United States 7 years ago by the Wellcome Trust charity, which pays generously by U.K. standards, but efforts to attract another top U.S. immunologist to the department have foundered on the salary issue, he says. Fearon hopes the fund won't be spent only on well-established scientists. “They should be targeting high flyers at any stage of their career development,” he says.

    It's unclear how great an impact the program will have. Fifty researchers is “less than half a person per university,” points out Peter Cotgreave, director of the Save British Science Society. “The real problem is that nobody's paid enough. If you want to keep the best, it's not just the top professors you want.” Then again, a few stellar researchers by their own gravitational force may draw in even more talent and prove these salary top-ups a wise investment indeed.


    A Wetter, Younger Mars Emerging

    1. Richard A. Kerr

    It's not just springlike seeps but young lava flows, salty meteorites, and unblemished water channels that are making the Red Planet look alive again

    Early in the 20th century, Mars seemed a living planet. Some astronomers thought they saw plants flush green over huge areas each spring as water coursed through planet-girdling canals. But images from the Mariner spacecraft killed off Mars in the 1960s when they revealed barren, lunarlike landscapes and no canals. Although huge water-carved channels eventually turned up, they dated from way back in the middle years of martian history, 1 or 2 billion years ago. Mars seemed geologically lifeless, its meager portion of water locked beneath a deeply frozen surface and its inner fires damped. But now planetary scientists are reviving Mars, restoring it if not to youthful vigor at least to modest geologic activity.

    Breathing new life into the Red Planet are continuing analyses of martian meteorites and stunning images from the Mars Global Surveyor (MGS), which has been in orbit since 1997. Last month's announcement that the camera aboard MGS had spied signs of geologically recent—possibly even ongoing—water seeps (Science, 30 June, pp. 2295, 2325, and 2330) has caught everyone's attention. Other, perhaps more persuasive, signs also suggest that water may even now flow on or beneath the frigid surface.

    “We keep seeing more and more evidence there's at least a little bit of water still running around,” says geochemist Timothy Swindle of the University of Arizona in Tucson, who has analyzed martian meteorites. And the heat needed to unlock that water from the planet's frozen stores also seems to still be available, at least at times. “There is considerable evidence Mars is still quite active” volcanically, says planetary scientist Stephen Clifford of the Lunar and Planetary Institute in Houston. “There is a [volcanic] heat engine that continues to the geologic present.” It all adds up to what planetary geologist William Hartmann of the Planetary Science Institute (PSI) in Tucson calls “Youthful Mars”: the emerging view that volcanic heat can still mobilize abundant, near-surface water. Whether truly youthful, or just experiencing the odd flicker of the fires of youth, a wetter Mars is good news for researchers looking for martian life and engineers looking to sustain off-planet visitors.

    Well into the '90s, most researchers believed that the planet, although perhaps geologically and hydrologically active in its early years, had long since fallen into a geologic coma. Gone were the days of water splashing on the surface, or even near it. Rains may have cut valley networks 4 billion years ago under a thicker, warmer atmosphere; an ocean may have filled the northern lowlands a few billion years ago; and water may have burst from the southern highlands to cut great channels debouching northward. But that was long ago. Water flows from a planet's high spots to low ones, where it stays unless it's recycled into the atmosphere and deposited as rain or snow in the highlands. Mars seemed to have had no such weather for eons. Even the occasional rising fingers of magma that had fueled volcanic eruptions and melted ground ice to cut the great outflow channels apparently ceased within the past billion years. So the planet's water, many planetary scientists thought, would have drained from beneath the southern highlands into or beneath the northern lowlands and frozen there.

    A dead Mars started to look suspiciously active, however, as geochronologists began dating the 14 meteorites known to have been blown off Mars by large impacts. ALH84001 (of putative microfossil fame) came in truly ancient at 4.5 billion years, having solidified from magma to form a bit of Mars's earliest crust. But the others—which came from at least two other randomly located impacts—solidified from molten lavas relatively recently. One group, including the so-called nakhlites, named for the Nakhla meteorite, is 1.3 billion years old. The other young group, including the shergottites, is as young as 165 million years, or less than 4% of the age of the planet. Geochemist Laurence Nyquist of NASA's Johnson Space Center in Houston and his colleagues will report at next month's meeting of the Meteoritical Society that the newest shergottite, called Los Angeles—it was found years ago by a rock hound in the desert near L.A. but only recently recognized—has the same 165-million-year age as the meteorite Shergotty, which gave the group its name.

    If impacts were flinging young rocks off Mars, researchers wondered, where were the young-looking terrains? Early estimates suggested that the last volcanic resurfacing of the planet with lava occurred up to a billion years ago. But dating the surface of Mars from orbit is a tricky business. The usual approach is to count the number of craters punched in the surface by the steady stream of asteroidal and cometary debris that rains onto every planet like the grains of an hourglass. Uncertainties abound, but Hartmann and his colleagues have been working to decrease them using crater counts from MGS imaging. Comparing the rain of impactors calculated for Mars with that on the moon, planetary scientists can now calibrate martian dating against Apollo isotopic dating of lunar surfaces. MGS images allow Hartmann and his colleagues to see and count much smaller martian craters, down to 11 meters in diameter. And Hartmann has focused on the youngest terrains, where the lingering error of a factor of 2 or 3 produces the smallest errors in absolute ages.
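The crater-counting clock described above can be caricatured in a few lines: assume a steady impactor flux calibrated against Apollo-dated lunar surfaces, so a measured crater density maps linearly to an age, with the factor-of-2-to-3 flux uncertainty carried through as an error band. The production rate below is an invented placeholder, not a real Mars calibration.

```python
def crater_age(density_per_km2, production_rate=1.0e-11,
               flux_uncertainty=3.0):
    """Toy crater-count age estimate.

    density_per_km2  -- counted craters (above some cutoff diameter) per km^2
    production_rate  -- assumed crater production in craters/km^2/yr;
                        ILLUSTRATIVE only, not an actual Mars calibration
    flux_uncertainty -- factor-of-N uncertainty in the impactor flux

    Assumes a steady impactor flux, so crater density accumulates
    linearly with surface age; returns (best, low, high) ages in years.
    """
    best = density_per_km2 / production_rate
    # A multiplicative flux error gives a proportional age error, which is
    # why young, sparsely cratered surfaces carry the smallest absolute errors.
    return best, best / flux_uncertainty, best * flux_uncertainty

best, low, high = crater_age(1.0e-3)   # a sparsely cratered young surface
```

With these made-up numbers the best estimate is 100 million years, bracketed between roughly 33 million and 300 million — the same kind of spread Hartmann quotes for the youngest terrains.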

    Crater counting in MGS high-resolution images has produced some eye-popping ages for the surface of Mars. “We've got a robust case,” says Hartmann, “for surfaces that are below 100 million years” in lava flows within the Arsia Mons volcanic caldera. Last year, Hartmann and his colleagues reported the region to be no older than 40 million to 100 million years. Planetary scientist Alfred McEwen of the University of Arizona pointed out young-looking lava flows in the Elysium Plains that Hartmann and Daniel Berman of PSI date at 10 million years or less, and earlier this month Hartmann found a similarly young age for parts of Amazonis Plains. McEwen estimates that perhaps 10% to 20% of the planet's surface is younger than a few hundred million years old—roughly the same proportion as that of the youngest martian meteorites. And there's no reason to suppose that Mars was volcanically active for 99% of its existence and then happened to shut down just before humans could take a look; in all likelihood, Hartmann says, Mars is still volcanically active, at least episodically.

    Because the martian internal fires seem to still be licking the surface, it may not be so surprising that geochemists working on the martian meteorites are finding signs that liquid water has been moving near the surface in geologically recent times. In the January issue of Meteoritics and Planetary Science, Swindle and his colleagues reported that a rustlike product of weathering deposited within veins in a 1.3-billion-year-old nakhlite is only about 650 million years old, according to the potassium-argon method of isotopic dating. Geochronologist C.-Y. Shih of Lockheed Martin Space Operations in Houston, Nyquist, and their colleagues have confirmed that date using the rubidium-strontium method. An age that old rules out contamination picked up after the meteorite fell to Earth, suggesting that water flowed through the martian crust within the last 15% of the planet's history.
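The potassium-argon clock invoked here rests on the decay of 40K to 40Ar: the more radiogenic argon trapped in a mineral relative to its remaining potassium, the older it is. A minimal sketch, using the standard decay constants and assuming a closed system with no inherited argon:

```python
import math

# Standard 40K decay constants (Steiger & Jaeger 1977 values)
LAMBDA_TOTAL = 5.543e-10  # total 40K decay rate, per year
LAMBDA_EC    = 0.581e-10  # branch decaying to 40Ar, per year

def k_ar_age(ar40_over_k40):
    """Age in years from the measured radiogenic 40Ar/40K ratio,
    assuming a closed system and no argon at formation."""
    return math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_over_k40) / LAMBDA_TOTAL

def ratio_at_age(t_years):
    """Inverse: the 40Ar/40K ratio a closed-system mineral accumulates."""
    return (LAMBDA_EC / LAMBDA_TOTAL) * (math.exp(LAMBDA_TOTAL * t_years) - 1.0)
```

A vein deposit about 650 million years old, like the nakhlite weathering product, would carry an 40Ar/40K ratio of roughly 0.045 — small, which is why young ages demand careful argon measurements.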

    The latest results from the martian meteorites argue for even younger water flows quite near the surface of Mars. Meteoriticists John Bridges and Monica Grady of the Natural History Museum in London reported in the 30 March issue of Earth and Planetary Science Letters that a variety of minerals in three nakhlite meteorites, including a fragment of the Nakhla meteorite collected within days of its fall, seem to have precipitated from a brine. Waters had leached minerals from the crust, then evaporated, concentrating the liquid. Water evaporates most readily near the surface, which is where the dynamics of impacts would place the salt formation, too. Only rocks within a few meters of the surface can make it off a planet, according to the prevailing mechanism for impact launching. And meteoriticist Susan Wentworth of Lockheed Martin and her colleagues told the Lunar and Planetary Science Conference in March that the 165-million-year-old Shergotty meteorite contains a group of evaporitic minerals “remarkably similar” to those of the older Nakhla, implying an evaporitic origin on Mars for the deposits.

    Although young lavas and weathering imply the requisite combination of near-surface heat and water, imaging hadn't caught a clear example of geologically recent, volcanically triggered water flows until MGS. McEwen and his colleagues are finding that floodwaters cut Marte Vallis, a smallish to medium-size outflow channel, within the past few hundred million years, making it the youngest known outflow channel. And it happens to lie just southwest of the Elysium Plains, whose lavas flow into it. “That's probably not a coincidence,” says McEwen. “The magmatism melted the ground ice to create the flooding,” and lavas flowed in later.

    The increasing evidence of a wetter, more active Mars “reminds me of what happened at the beginning of the Magellan mission” to Venus in 1990, says planetary geologist Raymond Arvidson of Washington University in St. Louis. “All our preexisting paradigms went out the window. With MGS, we're giving birth to one or more new paradigms, but we're still trying to figure out what Mars actually did.”

    At one extreme of the possibilities is the MEGAOUTFLO hypothesis (Mars Episodic Glacial Atmospheric Oceanic Upwelling by Thermotectonic Flood Outbursts) espoused by hydrogeologist Victor Baker of the University of Arizona and his colleagues (Science, 12 February 1993, p. 910). In MEGAOUTFLO, the great outflow channels are cut by water released by deep heating, as has been generally assumed. In this scenario, however, the pulse of heating is planet-wide thanks to some internal convulsion of the planet. It drives massive, simultaneous outbursts of water down the channels into the northern lowlands from the southern highlands and releases into the atmosphere carbon dioxide that had been locked up with the water in icelike clathrates. The water fills the northern lowlands to form a temporary ocean, and the carbon dioxide beefs up the atmosphere and its greenhouse to provide 100,000 years or so of relatively mild, wet climate before the ocean and atmosphere return to the subsurface. At least a few such cycles are evident in Viking images, say Baker and his colleagues, cycles that could presumably repeat again. Other scientists, however, find the available imagery unconvincing or doubt that all that water could get back into the southern highlands fast enough for the next episodic outburst.

    In an upcoming Icarus paper, Clifford and planetary geologist Timothy Parker of the Jet Propulsion Laboratory in Pasadena, California, reject such episodic rejuvenation. Instead, they propose, Mars has more or less steadily wound down geologically but not to a dead stop. They start with an “inevitable” ocean on early Mars, albeit an ice-covered one; the young planet's inner heat would have been too great to allow the water to be locked up beneath the planet's surface, they say. Water cycling slowly from lowlands to highlands by sublimation from the ice would feed into the subsurface highlands, from which it might occasionally burst to cut the outflow channels. But eventually, as internal heat waned, such a thick barrier of frozen ground would form that only the isolated and increasingly infrequent intrusion of magma could allow water to break out.

    Sorting out what Mars actually did “is an incredible challenge,” says McEwen. “It's probably a complicated story, and we've barely begun to figure it out.” Doing geology from orbit is never easy, he notes, but on Mars today it's proving particularly difficult. The Mars Orbiter Camera onboard MGS is providing unprecedented detail of the surface. Still, its high-resolution images come in strips just 3 kilometers wide that will cover perhaps only 1% of the planet. Given the fuzzy Viking views of terrain surrounding the strips and the alien nature of the landscape, at times “you don't know what you're looking at,” says McEwen. Diametrically opposed interpretations are common. Add in the uncertain dating, and “we're asking questions we can't answer without sending people and collecting the samples,” says planetary geologist Kenneth Edgett of Malin Space Science Systems in San Diego. A wetter Mars would certainly help sustain any such visitors.


    Can Science Rescue Salmon?

    1. Charles C. Mann,
    2. Mark L. Plummer*
    1. Mann and Plummer are the authors of Noah's Choice.

    As scientists wrangle over whether breaching dams will save endangered Snake River salmon, the Clinton Administration has decided to bypass the controversial decision

    PORTLAND, OREGON—For now, at least, the dams will stay, as the controversy swirling around them escalates. At a press conference on 27 July, the National Marine Fisheries Service (NMFS) released a long-awaited plan to save the Columbia River's endangered salmon by restoring fish habitat, overhauling hatcheries, limiting harvest, and improving river flow. What the plan did not do, however, was call for immediate breaching of four dams on the Snake River, the Columbia's major tributary—an option that has been the subject of a nationwide environmental crusade. The NMFS will hold that option in abeyance while it sees whether the less drastic measures will do the trick. Responses from both sides were immediate and outraged. “This plan keeps the fuse burning on the extinction time bomb,” charged Tim Stearns of the National Wildlife Federation, while presidential candidate George W. Bush and Senator Slade Gorton (R-WA) had already slammed NMFS for not ruling out breaching absolutely.

    Without question, the stakes are huge: Wild salmon are cultural symbols of the Pacific Northwest. Yet breaching the Snake River dams—bypassing them with newly constructed channels—would cost almost $1 billion and affect thousands of jobs. No one disputes that these dams, and the 14 other major and hundreds of minor dams in the Columbia Basin, have drastically reduced Northwest salmon populations, some of which are headed for extinction. The disagreement concerns whether breaching the dams is indeed the silver bullet or whether the salmon can be rescued by other means.

    In theory, at least, the warring parties all agree that salmon conservation should be driven by science. Indeed, Vice President Al Gore has promised to convene a post-election “salmon summit” to save the fish with an “objective, science-based process.” But science is unlikely to provide the answer to an intrinsically political debate—especially because the scientists themselves disagree, often vocally, providing ample ammunition for both camps. At the heart of the dispute is the maddeningly incomplete body of data on Columbia Basin salmon—and especially the role that multiple threats play in driving the fish to extinction.

    In the past 18 months, two scientific teams have issued their conclusions about the relative contribution of the chief threats to Snake River salmon: hydropower, habitat degradation, hatchery misuse, and overharvesting—collectively called the “four H's” (see sidebar, p. 718). One team, composed of state, academic, and tribal scientists, fingered dams as the major culprit and called for bypassing them. The other team, scientists from NMFS, countered that other factors were equally to blame and that fixing them would have more certain benefits. The new NMFS plan signaled a clear winner in the debate: The fisheries agency listened to its own scientists. But because the plan is expected to be challenged in court, the scientific fight will likely continue for years. “I can't imagine anything cooling down these debates,” admits Phil Levin, an ecologist at the NMFS Northwest Fisheries Science Center in Seattle.

    On the brink

    The Pacific Northwest is home to five species of salmon, each of which is known by several names (Chinook or king, chum or dog, coho or silver, pink or humpback, and sockeye or blueback), and a related species of sea-going rainbow trout called steelhead. Because salmon and steelhead that spawn in one part of a river rarely interbreed with those that spawn in other parts, the six species are divisible into hundreds of individual stocks, each with its own distinct genetic, behavioral, and morphological imprint.

    Under the Endangered Species Act, NMFS is charged with protecting many endangered species of fish and marine mammals. Legally, “species” can refer to full species, subspecies, or “distinct population segments” of species. For salmon, NMFS protects what it calls “evolutionarily significant units” (ESUs): groups of related stocks that form a distinct population. ESUs are “the building blocks of salmon species,” says Robin Waples, head of the conservation biology division of NMFS's science center. “The individuals have a lot more similarity among themselves than with other ESUs, and a largely independent evolutionary trajectory.”

    According to NMFS, the Snake River has four ESUs—sockeye, spring/summer and fall Chinook (the season indicates when these distinct ESUs spawn), and steelhead—all of which are endangered. In the rest of the Columbia Basin, eight of the other 14 ESUs are listed. NMFS estimates that most of these ESUs have a greater than 50% chance of extinction by the next century, and some much sooner.

    Salmon runs in the Northwest have been shrinking since the late 19th century, reduced by cannery operations, mining, and logging. But the most severe impact has come from dams, some of which entirely closed rivers, eliminating upstream habitat. (Grand Coulee Dam by itself blocked fish from more than 1500 kilometers of the Columbia.) Beginning in 1877, federal and state agencies tried to counter the fall in salmon and steelhead populations by setting up fish hatcheries. “Hatcheries provided a very popular answer to all these problems,” says Joseph Taylor III, an environmental historian at Iowa State University in Ames. “The promise of fish culture tells everyone, ‘You can continue what you're doing.’” But the hatcheries were not a cure-all, and the populations continued to dwindle.

    The decline accelerated in the late 1970s, particularly for Snake River fish. The blame quickly focused on the lower four dams on the river, built between 1961 and 1975. Initially, the lower Snake dams had few turbines, so most of the water—and the juvenile salmon—went over the spillways, a less harmful route than through the turbines. In the late 1970s, the Army Corps of Engineers, which operates most of the Columbia Basin dams, added more turbines to boost the dams' power-generating capacity. Fish mortality soared. Alarmed, activists pushed the agency to let more water spill over the dams.

    Because spilled water amounts to lost electricity, the corps tried other ways to reduce the losses. It built improved fish ladders and other structures for bypassing the dams, modified the turbines to reduce their effects, and transported young fish down the river in barges and trucks. These efforts cut juvenile mortality by as much as half, according to an NMFS study.

    But the damage was already done. Snake River ESUs were so depleted that listings under the Endangered Species Act were inevitable. The first, for sockeye salmon, occurred in 1991, followed quickly by the two Chinook populations. Snake River steelhead were listed in 1997. Because the act blocks federal actions that jeopardize listed species, agencies such as the corps and NMFS were vulnerable to litigation. A series of lawsuits forced them to consider a wider range of strategies for salmon recovery, including the most radical: “permanent natural river operation,” or breaching the dams.

    Multiple models

    To determine whether less drastic methods could rescue the fish, state, federal, and tribal agencies joined forces in 1993 to examine research on the salmon life cycle, particularly the effects of downstream migration through dams. At the time, two models dominated the field. One, developed by biologists from state governments and local Native American groups, focused on water flow as the major determinant of juvenile mortality; it tended to show high benefits from bypassing the dams. The other model, developed by fisheries scientists at the University of Washington (UW) and funded mostly by the Bonneville Power Administration (which markets power from the federal dams), blamed most of the mortality on factors other than water flow; consequently, it tended to show fewer benefits from breaching.

    To resolve differences between the models, the two teams formed the Plan for Analyzing and Testing Hypotheses (PATH) in 1994 (Science, 23 April 1999, p. 574). Funded by the Bonneville Power Administration, PATH had a core group of 25 scientists, including a half-dozen NMFS researchers. Through workshops and papers, PATH intended to create a unified body of salmon science. The corps and the Bureau of Reclamation (the other agency responsible for Columbia River dams) could use the results to reform the hydropower system; NMFS could use them to make its legally mandated judgment—a “biological opinion,” in the jargon—about the impact on Snake River salmon of the four Snake River dams, as well as other dams downstream.

    PATH looked first at the historical record. Although it was tempting to blame the population decline on the dams, especially because it accelerated after the last was built in 1975, it turned out that the productivity of the North Pacific had changed at about the same time, reducing all Columbia River salmon populations. To isolate the dams' effect, PATH compared the Snake River stocks with those farther down the Columbia, which was less heavily dammed. Lower Columbia salmon also declined after 1975, but not as much as the Snake River stocks. Because all the other factors affected both stocks equally, PATH scientists argued, the difference had to be due to the dams. “We painstakingly went through and looked at all the H's,” says Paul Wilson, who worked on PATH for the Columbia Basin Fish and Wildlife Foundation. “In the end, we concluded that hydro was still the most important” source of mortality.
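PATH's comparative logic — both stock groups felt the ocean shift, only the Snake stocks sat above the extra dams, so the gap between their declines measures the dams' toll — is essentially a difference-in-differences argument. A toy version, with invented abundances rather than real fish counts:

```python
import math

def excess_decline(snake_before, snake_after, lower_before, lower_after):
    """Difference-in-differences on log abundance: how much more the
    Snake River stocks declined than the shared factors (ocean
    productivity, etc.) acting on the lower Columbia stocks explain.
    Returns a log ratio; negative means extra Snake decline."""
    snake_change = math.log(snake_after / snake_before)
    lower_change = math.log(lower_after / lower_before)
    return snake_change - lower_change

# Invented numbers: lower Columbia stocks fell 50%, Snake stocks fell 90%.
gap = excess_decline(100.0, 10.0, 100.0, 50.0)   # == log(0.2)
```

With these placeholder figures the Snake stocks end up at only 20% of the level that shared factors predict — the residual that PATH attributed to the dams. The argument's weak point, as CRI later noted, is the assumption that everything else really did hit both groups equally.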

    But there was a puzzle. By this time, the corps and the Bureau of Reclamation were barging and trucking most juveniles around the dams. Yet the percentage of salmon that later returned to spawn was lower than the percentage that had returned before the dams were built. Some unknown factor seemed to be killing the fish after they passed the dams. State and tribal scientists within PATH argued that dams and barges were having delayed impacts on salmon survival. Their UW colleagues countered that the hydropower system alone could not produce such a large impact and concluded that the change in ocean productivity had to be at fault.

    Hoping to settle this and other internal disputes, PATH created a supermodel that predicted the future population levels for Snake River spring/summer Chinook; models for the other Snake ESUs, they hoped, would soon follow. The model contained every assumption everyone on the team thought important. Rather than try to agree on one set of the most likely assumptions, PATH kept them all, running the supermodel for each possible permutation, more than 5000 times in all. In a series of voluminous reports completed in April, the group said that in almost every scenario, breaching was the best route to recovery.
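The supermodel's exhaustive approach — keep every contested assumption and run all combinations — is straightforward to sketch. Here a toy recovery model is run over every permutation of a few assumption sets, tallying how often breaching beats the status quo; the assumption names, values, and payoff function are all invented placeholders, not PATH's.

```python
from itertools import product

# Each axis lists the contested values of one assumption (all invented).
assumptions = {
    "delayed_mortality": [0.0, 0.1, 0.3],   # extra post-dam mortality
    "ocean_survival":    [0.01, 0.02],      # smolt-to-adult ocean survival
    "transport_benefit": [0.5, 0.8, 1.0],   # effectiveness of barging
}

def recovery_score(breach, dm, ocean, transport):
    """Toy payoff: survival through the hydro system times ocean
    survival. Breaching removes dam-passage and delayed mortality."""
    dam_passage = 1.0 if breach else (1.0 - dm) * transport
    return dam_passage * ocean

runs = list(product(*assumptions.values()))
breach_wins = sum(
    recovery_score(True, dm, oc, tr) > recovery_score(False, dm, oc, tr)
    for dm, oc, tr in runs
)
```

Even this cartoon shows the pattern PATH reported: breaching wins in 16 of the 18 permutations, losing only when delayed mortality is zero and barging is perfect. It also shows the method's weakness — the tally depends entirely on which assumptions are allowed onto the list.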

    For some PATH members, the debate was over: The dams had to go. In March 1999, eight team members signed on to a highly publicized letter to President Clinton that claimed that “a building scientific consensus” showed that “bypassing four dams on the Lower Snake River” was “the surest way to restore” the endangered salmon. But other PATH scientists declined to sign the letter. One, James Anderson, the main developer of the UW model, argued that PATH gave too much credence to older, suspect data. Recent, more accurate data, he contended, were a good fit with his model, which pointed to the ocean as the major source of salmon problems. Soon after the letter was sent, Anderson told a congressional committee that “the best we can say at this time is that the work is not finished.” Designed to unify scientific opinion, PATH instead ended up splintering it further.

    A different approach

    Meanwhile, NMFS was becoming disenchanted with PATH. According to Michael Schiewe, head of fish ecology at the NMFS northwest science center, “PATH provided very hydrocentric work,” examining the dams “in isolation” from the other H's. What's more, PATH had analyzed just Snake River ESUs, but by spring 1999 the agency had added eight more Columbia River ESUs to the endangered list.

    That year NMFS asked its own scientists for a broader, less “hydrocentric” take on salmon science: the Cumulative Risk Initiative (CRI). Peter Kareiva, a fish ecologist, joined NMFS to direct the effort. Kareiva's team began with a general risk analysis of 11 of the 12 endangered Columbia Basin ESUs. Released in draft form this April, the analysis found that the Snake ESUs, though diminished, are not the most endangered in the Columbia Basin. That dubious honor belonged to upper Columbia spring/summer Chinook and three steelhead ESUs, each of which was decreasing at a rate of 10% or more per year.

    Supplementing this grim overall picture was a reanalysis of the two Snake River ESUs treated by PATH. CRI argued that attempts to isolate the dams' effects from other factors, as PATH had done in its comparison of Snake River and lower Columbia stocks, were inevitably confounded by the poor quality of the historical data and the changes in ocean conditions. Nor did the CRI team embrace the PATH supermodel approach, which they regarded as unmanageably complex and incomprehensible for policy-makers.

    Instead, CRI used a simple demographic model that divided the complex salmon life cycle into different stages. Plugging in mortality and fecundity estimates for each stage, CRI derived its own assessment of the population growth rates for the Snake ESUs. Using these growth rates as an index of health, CRI identified the stages in which conservation measures could do the most good.
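A minimal stage-structured model in the spirit of CRI's can be built as a Leslie-style projection matrix: fecundities along the top row, stage-to-stage survivals on the subdiagonal, with the matrix's dominant eigenvalue giving the annual population growth rate. The stage count and rates below are invented for illustration, not CRI's actual parameters.

```python
import numpy as np

def growth_rate(survival, fecundity):
    """Annual growth rate: dominant eigenvalue of a Leslie-style
    projection matrix (fecundities on the top row, stage-to-stage
    survivals on the subdiagonal)."""
    n = len(survival) + 1
    A = np.zeros((n, n))
    A[0, :] = fecundity
    for i, s in enumerate(survival):
        A[i + 1, i] = s
    return max(abs(np.linalg.eigvals(A)))

# Invented rates for a three-stage life cycle; only the last stage spawns.
survival = [0.5, 0.25]
fecundity = [0.0, 0.0, 8.0]
lam = growth_rate(survival, fecundity)   # 8 * 0.5 * 0.25 = 1: bare replacement

# Boost each stage's survival by 10% in turn and compare growth rates.
boosted = []
for i in range(len(survival)):
    s = list(survival)
    s[i] = min(1.0, s[i] * 1.1)
    boosted.append(growth_rate(s, fecundity))
```

One caveat the sketch makes plain: because the life cycle is multiplicative, a proportional survival boost raises the growth rate equally no matter which stage receives it. CRI's leverage argument therefore turned on how much improvement each stage could realistically absorb, not on the algebra alone.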

    For both ESUs, the dams had their greatest impact in a single stage: the second year for spring/summer Chinook, and the first year for fall Chinook. But, the CRI team argued, other stages offered the most potential for improvement. For the spring/summer Chinook, the most promising stages were the first year, before they migrated, and later, when the fish were in the Columbia estuary and near-shore ocean, well away from the dams, so breaching them would not help much. Breaching appeared to offer more help to the fall Chinook. Unlike the spring/summer fish, they spawn in the main branch of the Snake River and migrate through the dams in their first year; breaching the dams would reduce mortality at that stage and also improve habitat. But even for fall Chinook, protecting habitat in other ways and reducing harvest seemed equally likely to be effective.

    Scoffing at CRI's analysis as politically motivated, American Rivers, an environmental group, hired biological consultant Gretchen Oosterhout to critique it. Kareiva concedes that CRI might look “haplessly timid” about the politically charged dam decision. But he insists there are “solid scientific reasons for favoring more certain actions such as stopping the dewatering [removing water for irrigation and other purposes] of streams and of rivers before those four dams are breached.”

    Victory for CRI?

    NMFS's 27 July plan was part of the agency's long-awaited draft biological opinion on the impact of all Columbia River federal dams. To avoid jeopardizing listed salmon and steelhead, NMFS declared, improvements would have to be made in habitat protection, hatchery operations, and harvest limits. Dam operation would also have to improve—but the dams did not have to go, at least not yet. The plan uses the CRI analysis to set standards for gauging the recovery of salmon stocks. If after 8 years the fish have not sufficiently recovered, says the opinion, NMFS will recommend that the dams be breached.

    George T. Frampton Jr., acting chair of the White House Council on Environmental Quality, admitted that the decision was in part political. Breaching the Snake River dams would take a decade or more, given the fierce opposition. “There is not a single elected representative in Congress from the region that in any way supports breaching,” Frampton said. “The fish need more immediate action.”

    To conservationists, the 8-year wait is unacceptable, and plans are already afoot to sue NMFS over the final biological opinion. While the dispute continues, political groups will keep latching onto whichever science best fits their goals. American Rivers, for example, maintains a remove-the-dams Web page that makes numerous references to PATH-based scientific works—but has no links to CRI documents. Across the political divide, the Columbia River Alliance, a small coalition of dam supporters, gives a lukewarm nod to the CRI results on its Web page (∼cra/aa99/aa0416.htm)—and completely ignores PATH.

    The “objective, science-based process” touted by Gore will never be able to resolve political wars, says Taylor, the environmental historian. “Science can provide us with information about choices, but it is not going to deliver the Holy Grail.”


    The Other H's

    1. Charles C. Mann,
    2. Mark L. Plummer*
    1. Mann and Plummer are the authors of Noah's Choice.

    When the National Marine Fisheries Service (NMFS) unveiled its draft biological opinion of the Columbia River hydropower system (see main text) on 27 July, it was accompanied by a broader strategy written by nine federal agencies covering all the factors implicated in salmon decline. Fisheries scientists have long identified these factors as the “four H's”: hydropower dams, harvesting, habitat degradation, and hatchery misuse. According to the interagency plan, known as the “All-H Paper,” improvements in the other three H's will provide benefits that are more certain and widespread than those from dam breaching. To be successful, this new strategy must overcome environmentalist opposition and gain the cooperation of state governments, Native American tribes, and private landowners. And because addressing the other H's may be even more costly than breaching dams, NMFS will have to convince Congress to pump more money into salmon recovery.

    Harvest. In some ways, harvest is the easiest factor to understand and control, because its effects on mortality are direct and easily measured. Protecting the endangered runs while allowing harvest of others that migrate at the same time is problematic, however. Fishers have no way of knowing whether a Chinook salmon on the line is from a plentiful Washington-coast run or a critically endangered run on the Snake River.

    For most of the endangered fish in the Columbia Basin, harvest rates are already so low that further restrictions are politically difficult—and unlikely to contribute to recovery. The All-H strategy calls for continuing these low rates, while tagging most hatchery fish to enable fishers to tell them apart.

    Habitat degradation. Like overharvesting, habitat degradation has been a problem since the late 1800s. By extracting ore with high-pressure hoses, miners drew water away from streams and returned a flow of sediment, burying the gravel needed for spawning. Sometimes they mined the stream itself, extracting gravel, sand, and limestone as well as gold. And logging removed trees from forests adjoining streams, increasing stream temperatures and covering spawning beds in eroded dirt.

    In addition to wreaking damage directly, dams made it possible to irrigate the dry, eastern parts of the region. But irrigation takes water from streams, which harms spawning and rearing habitat. And the cattle that accompanied irrigation, if not fenced out of streams, can stir up sediments with similar effects.

    Habitat degradation is pronounced in the Columbia River estuary, where the young salmon make the transition from fresh water to saltwater. Dredging to improve navigation, filling in wetlands to expand urban areas, and flood control measures have made this habitat less salmon-friendly.

    The All-H strategy makes habitat protection its centerpiece. Major programs include improving stream flows by acquiring private water rights; protecting fish habitat in the lower Columbia estuary by purchasing wetlands and adjoining land; and accelerating habitat restoration on federal lands in areas identified as high priority. But these measures will require the cooperation of big private landowners, historically a problem for the Endangered Species Act.

    Hatcheries. Each natural stock adapts to the characteristics of its spawning ground, including temperature, depth, flow, and distance from the river mouth. If salmon from one environment mate with salmon from another or from a hatchery, the offspring are likely to lose sets of coadapted genes, decreasing their fitness for a particular environment. For this reason, the practice of breeding hatchery fish from whatever eggs were available, regardless of species, river, or season, is a thing of the past.

    But because hatchery fish today dominate many salmon runs, they still affect their wild cousins. To make up for losses in wild runs, for example, hatcheries allow many more young fish to survive to adulthood, relaxing the selective forces at that stage. If the hatchery fish interbreed with the wild ones, the genetic makeup of the population will likely be adversely affected.

    The All-H strategy takes an aggressive stance on the hatchery issue, arguing that all existing hatcheries should be reformed to minimize the harm to wild fish. Any federal agency operating a hatchery must develop a genetic management plan, including drawing from the gene pool appropriate for a particular location. But drastically changing or cutting back hatchery operations will be resisted by the tribes, whose treaty rights to salmon and steelhead have increasingly been satisfied by harvesting hatchery fish.

    Although the agencies declined to place a price tag on the "other H's" strategy, rough estimates put it at billions of dollars. The funding will have to come quickly, as NMFS intends to reevaluate salmon status in 2008 to ascertain whether dam breaching is necessary after all.


    Duke Study Faults Overuse of Stimulants for Children

    1. Eliot Marshall

    Ritalin is an effective therapy for children diagnosed as hyperactive, but new research shows it's being given to many who don't fit the diagnosis

    Along with computers and graphing calculators, the drug Ritalin (methylphenidate) became ubiquitous in U.S. classrooms in the 1990s. A stimulant of the central nervous system, Ritalin is widely used—doctors write 11 million prescriptions annually—to calm a type of fidgety behavior called “attention-deficit hyperactivity disorder” (ADHD). And as sales climb, some experts are growing concerned that Ritalin is overused. Most psychiatrists think, however, that stimulants are being underprescribed, because too many cases of ADHD are going untreated. Now, an authoritative study published in this month's Journal of the American Academy of Child and Adolescent Psychiatry is stimulating a debate on these issues.

    Several studies have shown that Ritalin use is increasing in U.S. grade schools. But a research team led by child psychiatrist Adrian Angold and epidemiologist Jane Costello at Duke University in Durham, North Carolina, went a step further: They looked to see whether ADHD was being accurately diagnosed and treated. They found a stunning mismatch between symptoms and drug treatment.

    They discovered that one-quarter of the children with confirmed ADHD were not getting drug therapy. But, remarkably, they also found that more than half of the children in the community who were receiving stimulants did not come close to meeting the diagnostic criteria for ADHD. The bottom line, says Angold, is straightforward: “There's a very real problem in this area. … A lot of children are receiving poor and inappropriate stimulant treatment.” In an accompanying commentary, Benedetto Vitiello, a leader in child and adolescent studies at the National Institute of Mental Health in Bethesda, Maryland, calls these findings “surprising” and “provocative.”

    Angold, Costello, and their colleagues came to their conclusions by studying schoolchildren in 11 counties in the Smoky Mountains of North Carolina between 1992 and 1996. They began with 17,000 children aged 9, 11, or 13 in the initial study year, then randomly sampled a subset of 4500 names. Using parent interviews to identify children at high or low risk for ADHD, they winnowed the sample down to 1422 children for intense study. They collected additional data and interviewed the children to determine directly whether they had ever been given stimulants or shown signs of ADHD.

    Overall, the Duke researchers found that 3.4% of the children in the sample had ADHD as defined by rigorous diagnostic criteria, called the DSM-III-R. This rate is within the accepted U.S. prevalence range of 3% to 5%. Also, as expected, they confirmed that the rate of ADHD was higher among boys (5.3%) than among girls (1.5%). About 6.2% of the interviewed children had some ADHD-like behavior, suggesting that under less rigorous standards, this proportion might be eligible for treatment with stimulants. But the fraction who actually received stimulants was even larger—7.3%.

    This fraction is in line with other findings of pervasive use of stimulants. For example, Gretchen LeFever and colleagues at the Eastern Virginia Medical School in Norfolk reported last September on in-school ADHD treatment of elementary school children in 1995–96 in two Virginia cities. LeFever found that 8% to 10% of children in grades two through five were receiving ADHD medication at school. She notes that this is a low estimate, because it didn't count Ritalin prescriptions given only at home.

    That leaves a big mystery: Why did the children in the Duke study who had no symptoms of ADHD get medication? The report doesn't say. But Vitiello and others suggest that parents and teachers may be lobbying doctors to write stimulant prescriptions in the hope that they will help children do better in school.

    Proponents of Ritalin therapy argue that the Duke study supports earlier research showing the opposite problem: that many parents and pediatricians are biased against the use of stimulants, even though they are an effective therapy for ADHD. Peter Jensen, director of the Center for the Advancement of Children's Health of the New York State Psychiatric Institute and Columbia University in New York City, has published the only study other than Angold's that looked at whether stimulants were being given to the ADHD children who would benefit from them. His sample was smaller, and the data were from 1992, but he found that only 12% of ADHD-diagnosed children were getting appropriate stimulant therapy. “While we do have to be worried about pockets of overprescribing,” Jensen says, “there is good reason to think that only about one-half the children with ADHD are getting treated.”

    Jensen thinks that the Duke team may have underestimated the number of children who have ADHD and thus overestimated the number who are getting Ritalin inappropriately. In a comment published alongside Angold's paper, he says it's “quite remarkable” that some of the children in Angold's study with teacher ratings for ADHD-like problems did not qualify as confirmed ADHD children when examined by the Duke team. These “startling findings,” he says, suggest that “critical information may have been missing from the authors' diagnostic procedures. The teachers may have been right. …”

    Jensen's great concern, he writes, is that “the message from Angold and colleagues' article not become a mantra for those who would deny children the right and access to effective treatments. …” But Angold says it's critical that evaluations not be swayed too much by teachers' views.

    Despite their differences over the Duke study's findings, Jensen and Angold agree on several points. First, stimulants do help children with ADHD and appear to have only minor side effects. Second, they agree that current practice in treating ADHD is poor. And third, as Angold says: This "chronic, highly disabling condition … deserves to receive much more in the way of assessment and treatment resources than it does."