New TB Drug Promises Shorter, Simpler Treatment
- Jon Cohen
A chief reason that tuberculosis persists as a global killer—and is on the rise in parts of the world—is that existing antibiotics require up to 9 months of daily use, making it difficult for people to complete the treatment. Those who miss doses, in turn, fuel the emergence of drug-resistant strains of the mycobacterium that causes the illness. Yet the only new TB drugs to become available during the past 4 decades have been variations of the existing ones. Now researchers at Johnson & Johnson (J&J) in Belgium have discovered a compound that may dramatically reduce the amount of time it takes to cure the disease and that also appears to work against multidrug-resistant strains of Mycobacterium tuberculosis. “It's extraordinarily promising,” says TB researcher and clinician Jacques Grosset of Johns Hopkins University in Baltimore.
As a team led by J&J's Koen Andries reports online 9 December in Science Express (www.sciencemag.org/cgi/content/abstract/1106753), extensive studies in the test tube and mice have shown that the compound, dubbed R207910, is more potent than existing drugs, stays in the body longer, and works by a novel mechanism that makes it broadly effective. Experiments in a small number of uninfected humans and toxicology studies in rats and dogs so far suggest that the compound is safe. “It's like a dream come true,” says Andries, a microbiologist. “If you would make a wish list of the assets that an ideal TB compound would have,” he says, this would be it.
Like the rest of the pharmaceutical industry, J&J has little financial incentive to develop treatments for TB, a disease that mainly afflicts the poor. But while screening for a new broad-spectrum antibiotic, J&J researchers stumbled upon the finding that a class of compounds called diarylquinolines worked against M. smegmatis, a cousin of TB. Chemical tinkering led them to the even more potent R207910. To date, the company has bankrolled development of the drug.
Andries's group has joined with outside research teams to conduct many of the experiments described in the current report. In particular, researchers in France provided a critical mouse model for TB, which led to the finding that the compound lasted unusually long in the rodent, suggesting that it might kill M. tuberculosis with fewer doses. The French researchers added the drug to the most popular triple combination now used—rifampin, isoniazid, and pyrazinamide—and found that it achieved the same bactericidal effects in half the time. Various combinations with two of the existing drugs also showed significant benefits.
As expected, resistance to R207910 developed when given to mice as a monotherapy, but the mouse data have convinced leading TB researchers that swapping the drug for one of the three in the current cocktail would delay development of resistant strains and would vastly shorten treatment. “This is quite frankly an astonishing set of results,” says Denis Mitchison of St. George's Hospital Medical School in London. “They've managed with some of the combinations to get complete sterilization of organs within 2 months rather than 4. That's never been done before.”
Mitchison (who consulted with J&J about the results) and several other researchers were particularly intrigued by the drug's novel mechanism of action. After sequencing the genomes of strains of M. tuberculosis and M. smegmatis that were resistant to R207910, Andries and his colleagues compared the results to the DNA from susceptible strains. The genetic mutations they discovered in the resistant strains all pointed to a gene that codes for an enzyme that makes ATP, which provides energy for cells. “Nobody before has identified that as a drug target for TB,” says William Jacobs of the Albert Einstein College of Medicine in New York City.
Mel Spigelman of the Global Alliance for TB Drug Development, a nonprofit organization based in New York City that partners with industry and academics to accelerate R&D of faster-acting compounds, says R207910 is one of several novel agents now entering or nearing human trials (see chart). “There is a revolution in the development of drugs for TB,” says Spigelman. Although R207910 has moved further than other novel drugs in the development pipeline, Spigelman predicts that several of the drugs will prove their worth in human tests. He imagines a day when combining the drugs now under development offers a therapy that cures the disease in as little as 1 week. He stresses, however, that the challenge is not simply developing new drugs but delivering them at an affordable price—a key mission of the alliance, which may work with J&J in the future.
J&J's Andries says the company understands that most of the 8 million people who suffer from TB each year cannot afford expensive new drugs. “What drives us most is the medical need for such compounds,” says Andries, who adds that the lower rate of financial return could be offset by “goodwill toward the company.” The drug will soon enter trials in people with active TB. Many promising drugs, of course, fail in human tests, notes Andries, but if all goes well, he says, the compound could be on the market in 5 years.
- DEPARTMENT OF ENERGY
Outlook for Cold Fusion Is Still Chilly
- Charles Seife
A Department of Energy (DOE) review of “cold fusion” has generated some heat but very little light on the controversial subject.
Since 1989, when Martin Fleischmann and Stanley Pons announced that a small hunk of palladium metal had apparently induced deuterium atoms to fuse at room temperature, a small cadre of cold-fusion enthusiasts has doggedly kept on the trail of endless energy. So when DOE decided in March to conduct a review of cold-fusion research, the move raised eyebrows among mainstream scientists who have long since abandoned the quest. “They asked me to serve on it, but I resolutely refused,” says William Happer, a plasma physicist at Princeton University and a harsh critic of cold-fusion research. That attitude didn't surprise those proponents of cold fusion who had pushed DOE to take another look. “I was told going into this that we would be facing an extremely skeptical and pretty hostile crowd of reviewers,” says Peter Hagelstein, a cold-fusion researcher at the Massachusetts Institute of Technology.
The outcome appears to reinforce the views of both sides, although it's hard to tell because the reviewers didn't meet to hammer out a consensus. Instead, DOE simply compiled a written summary of the reviewers' individual comments. All told, DOE asked 18 reviewers—nine by mail in July, and nine others who attended a 1-day meeting in August—to study a summary of the field prepared by Hagelstein and others as well as published results and to evaluate the evidence for nuclear reactions in matter at low energies to determine whether it's worthwhile to continue studying the phenomenon.
Several reviewers were indeed extremely critical of the research, saying that many of the experiments were poorly conducted, had results that were inconsistent with each other, and often weren't reproducible. One skeptical reviewer went further, opining that “[cold fusion] workers are true believers, and so there is no experiment that can make them quit.”
At the same time, about one-third of the reviewers were receptive to claims of cold fusion. “There is strong evidence of nuclear reactions in palladium,” one wrote. Said another: “Further work that would add to the understanding of [low-energy nuclear reactions] is warranted and should be funded by U.S. funding agencies.”
DOE's position on cold fusion hasn't changed as a result of the review, says James Decker, deputy director of the Office of Science. “We never closed the door to good proposals,” he says, adding that the real value of the study was to “bring people up to date” on the issue. Hagelstein says that his side has also accomplished its goals. “In the end, the reviewers said that a study should be funded if a proposal is strong. You can't ask for much more than that.”
- U.S. RESEARCH POLICY
NSF Blocked From Funding Smithsonian Scientists
- Jeffrey Mervis
Congress has squashed a move by the National Science Foundation (NSF) to allow all Smithsonian Institution (SI) scientists to compete for NSF funds. The decision represents a victory for Senator Kit Bond (R-MO), who chairs the spending panel that sets NSF's budget, over his counterparts in the House, who had pushed for the change.
NSF's current policy allows so-called Smithsonian trust scientists—those whose salaries come from a pot created by the institution's benefactor, James Smithson—to be treated like any other eligible NSF applicant. Most of the Smithsonian's 187 trust scientists work for its astrophysical observatory in Cambridge, Massachusetts, which relies on grants from NASA and other sources.
But the vast majority of museum curators are paid from the institution's annual federal appropriation and are therefore ineligible for NSF grants. Last spring the National Science Board (NSB), NSF's oversight body, embraced equal treatment for all 431 SI scientists, despite concern that it might open the door to researchers in other federal settings to plead for similar treatment (Science, 2 April, p. 26).
Bond, however, saw the proposed expansion as double dipping. So last month he inserted language into the massive 2005 spending bill (Science, 3 December, p. 1662) passed by both the House and Senate ordering NSF to maintain the status quo. “Senator Bond felt very strongly about this matter,” says a House aide, “and conference reports are about compromises.”
“The board shares Senator Bond's concerns for setting no precedent that would allow scientists at federal research agencies or federally funded research centers to become eligible to apply for NSF grants,” says NSB Chair Warren Washington about the congressional diktat. As a result, Washington says NSF has called off talks with the Smithsonian on any changes to its grants policy. The language, he notes, also reminds NSF program managers to be fair to trust employees submitting grant proposals.
- PERSISTENT TOXIC SUBSTANCES
Study Finds Heavy Contamination Across Vast Russian Arctic
- Paul Webster
The first comprehensive look at persistent toxic substances (PTS) across the Russian Arctic reinforces what studies in other Arctic nations have revealed: that indigenous peoples in this northern swath of the world are inordinately exposed to pesticides, industrial compounds, and heavy metals, with uncertain health effects. Due to northward flows in rivers, oceans, and atmospheric currents, persistent toxins released elsewhere, along with some local contaminants, have accumulated heavily in many areas of the Arctic, where frigid temperatures retard their dispersal and degradation.
Conducted by the Arctic Monitoring and Assessment Program (AMAP), the environmental research arm of the eight-nation Arctic Council, the 4-year, $2.8 million study sampled pollutant levels in the four major regions of the Russian Arctic. The researchers found that breast milk and maternal and umbilical cord blood samples contained moderate to extremely high levels of a variety of chemicals: hexachlorobenzene (HCB), hexachlorocyclohexane (HCH), dioxins, DDT, PCBs, oxychlordane, toxaphene, mirex, mercury, cadmium, lead, and brominated flame retardants. “The mean concentration for PTS across the Russian Arctic is comparable to what's been found in Canada and Greenland,” says the study's human health research coordinator Valery Chashchin of the Northwest Public Health Research Centre in St. Petersburg.
The highest human contamination levels were found in the Chukotka region on the eastern coast of the Russian Arctic, where indigenous people eat large quantities of marine mammals and fish, which can be heavily contaminated with both local and long-range pollutants. Researchers found that about 5% of the population, mostly males, have some of the highest PCB contamination levels—10,000 nanograms per gram of blood lipid—ever seen, says Éric Dewailly of the Centre for Inuit Health and Changing Environments at the National Institute of Public Health of Québec. The Chukotka region, Chashchin notes, “is a wasteland where millions of tons of chemicals were imported during the Soviet era and never cleaned up.”
The body burdens of some compounds—brominated flame retardants, dioxins, and furans—were actually lower in Chukotka than in the Canadian Arctic and Greenland, probably, says Chashchin, because the region is more isolated from sources of these substances in Europe and North America. But breast milk concentrations of the insecticide HCH and the fungicide HCB were 30 and 5 times higher, respectively, than in Arctic Canada, says Chashchin, who attributes these levels to historical use of these chemicals in indigenous people's homes.
Preliminary evidence, from comparisons of contamination data with information reported in health interviews, suggests that exposure to some persistent toxics (PCBs, HCH, DDT, lead, cadmium, and mercury) may be linked to reproductive effects such as stillbirths, birth defects, low birth weight, and spontaneous abortions. AMAP also noted an apparent association between reduced numbers of male births and increases in Arctic maternal blood concentrations of both lead and some types of PCBs.
A similar and more significant association was reported 20 years after a 1976 dioxin accident in Seveso, Italy, but this is the first reported link between Arctic contaminant levels and skewed sex ratios, and the association is weak. “We are surprised and a little worried,” says a member of the study's Steering Committee, Jon Øyvind Odland of the Institute of Community Medicine at the University of Tromsø in Norway. Chashchin, Odland, and others call for further investigation of the evidence on human health effects, a recommendation Inuit researchers support.
- UNDERGRADUATE EDUCATION
Tweaks to High-Tech Visas Revive NSF Scholarships
- Yudhijit Bhattacharjee
A popular federal scholarship program for low-income and disadvantaged undergraduates that was scheduled to end this year has won a reprieve, thanks to reforms in the process that allows foreign workers to hold high-tech U.S. jobs.
The National Science Foundation (NSF) began the Computer Science, Engineering, and Mathematics Scholarships (CSEMS) program in 1999 after Congress imposed an application fee for skilled-worker visas (H-1Bs), tripled the maximum number of visas, and channeled a portion of the revenue to NSF (Science, 7 April 2000, p. 40). The authority to collect that $1000 fee expired in 2003, however, leading NSF to make what would have been its last round of CSEMS awards earlier this year.
But now the program is ready for a comeback, thanks to a provision in the recently passed omnibus spending bill for 2005 that not only reinstates the H-1B fee but also raises it to $1500. The same legislation increases NSF's share of the fee from 22% to 30% and raises the overall cap from 65,000 to 85,000. Under the new rules, NSF could reap as much as $38.3 million a year.
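The ceiling follows from simple multiplication; a quick sketch, assuming the full 85,000-visa cap is reached and NSF's 30% share applies to every $1500 fee:

```python
# Back-of-the-envelope check of the revenue ceiling quoted above,
# assuming every one of the 85,000 capped visas pays the full $1500
# fee and NSF collects its 30% share of all of it.
H1B_CAP = 85_000       # new annual cap on H-1B visas
FEE = 1_500            # application fee, dollars
NSF_SHARE_PCT = 30     # NSF's share of the fee revenue, percent

# Integer arithmetic keeps the result exact.
nsf_revenue = H1B_CAP * FEE * NSF_SHARE_PCT // 100
print(nsf_revenue)  # 38250000, i.e., roughly the $38.3 million cited
```

The $38.25 million product rounds to the article's "as much as $38.3 million"; any shortfall in visa applications would lower it proportionally.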
That won't happen until 2006, however, because this year's applications generated no revenue. (The 65,000 quota for 2005 was filled on 1 October, the first day of the fiscal year.) NSF's Duncan McBride says the agency likely won't hold a competition until next fall and will make its first round of new awards in the summer of 2006.
The pool of eligible institutions—those that normally qualify for NSF grants—remains the same under the new program, with community colleges receiving about 40% of the awards. But there are a few new twists. The maximum amount of the 2-year scholarship will triple, to $10,000 a year, and the areas of study that can be supported will be expanded to include more fields in which job demand is high, McBride says. “Some universities have had trouble recruiting students because of that ceiling,” he says about increasing the size of the scholarship. He also welcomes the move to expand the program “into more high-tech disciplines such as biotechnology.”
The continuation of the program is “fantastic news,” says Scott Wolpert, associate dean in the College of Computer, Mathematical, and Physical Sciences at the University of Maryland, College Park, which has enrolled 60 scholarship students under a previous grant. The program helps students from low-income, minority backgrounds “break the downward spiral of high student debt leading to part-time employment, which leads to an increased risk of not graduating,” he says.
- U.S. SCIENCE POLICY
Tommy Thompson Leaves a Mixed Legacy
- Jocelyn Kaiser
Department of Health and Human Services (HHS) Secretary Tommy Thompson announced his resignation last week after a tenure marked by the post-9/11 anthrax scare, the completion of a doubling of the budget of the National Institutes of Health (NIH), a much-criticized policy on stem cell research, and a controversy over politics and science. His successor will face issues from drug safety to a flat NIH budget.
At a press conference, the former Wisconsin governor, 63, spoke with typical candor, saying that as he leaves HHS, his top worries are pandemic influenza and the safety of the food supply: “I, for the life of me, cannot understand why the terrorists have not” tampered with it yet, he said. His comments prompted President George W. Bush to declare the next day that the government is working to protect Americans from such terrorist threats. Thompson defended the Food and Drug Administration (FDA), which recently came under fire when safety problems arose with drugs already on the market, but he expressed support for an independent office to review drug safety data.
The secretary listed HHS's role in the president's $15 billion international HIV/AIDS program and his foreign travels as among his top accomplishments, along with promoting healthy lifestyles. He said his next job, likely in the private sector, would keep him involved in “medical diplomacy.”
In addition to the doubling of NIH funding, Thompson has also overseen a huge expansion of biodefense research and preparedness and implementation of the president's policy of restricting funding for stem cell research to a few approved lines. On the latter topic, Thompson insisted that the policy “is working,” and that the problem is not a lack of new cell lines but rather too few scientists involved and trained to use them. He reflected on a controversy over industry consulting by NIH scientists, praising NIH Director Elias Zerhouni for working toward a policy that is not too restrictive, because “we want the best researchers and scientists” at NIH.
Thompson's legacy includes actions that have upset scientists within and outside HHS. His office has questioned candidates for advisory committees about their political views, for example, and ordered the removal of information on condoms from the HHS Web site as part of a move to promote abstinence-only sex education. “Secretary Thompson has to bear responsibility for these developments,” says Representative Henry Waxman (D-CA), who claims to have documented political interference with science in the Bush Administration. Thompson's office has also clamped down on NIH management and limited travel to foreign meetings, irking NIH scientists accustomed to independence.
Rumored successors to Thompson include Medicare chief and former FDA commissioner Mark McClellan, a physician and economist. Thompson plans to stay until 4 February or until a successor is confirmed.
- MATH AND SCIENCE EDUCATION
Hong Kong, Finland Students Top High School Test of Applied Skills
- David Grimm
Fifteen-year-olds in Hong Kong, Finland, and Korea excel in applying the science and math concepts they've learned, whereas U.S. students trail their peers in much of the industrial world. That's one lesson from the latest results of a 41-nation test that goes beyond the usual assessment of what students know.
The Program for International Student Assessment (PISA) is part of an ongoing effort to compare the educational performance of students around the world. PISA, which covers science, math, and reading literacy, complements a set of tests called the Trends in International Mathematics and Science Study (TIMSS), which measures fourth- and eighth-graders' knowledge of specific concepts, such as geometrical formulas and chemical principles. PISA takes the premise a step further by measuring how students apply the sum of this education to new problems. “We're not asking whether students can read,” says Thomas Romberg, a math educator at the University of Wisconsin, Madison, who helped design a version of the PISA exam administered in 2000 that focused primarily on reading literacy but included science and math questions. “We're seeing whether they can understand a book they've never seen before.”
The 2003 test, the results of which were presented this week, emphasized math comprehension, whereas the 2006 test will emphasize science. The 3-year cycle will repeat in 2009. The test is coordinated by the Organisation for Economic Co-operation and Development (OECD) in Paris. Results from the latest TIMSS survey will appear next week.
The 2003 PISA test was taken by 270,000 students in 41 countries. Students had 2 hours to complete the exam, which consisted of twice as many open-ended or short answer questions as the TIMSS test. A sample math question asks students to figure out how much money they lose by exchanging their South African rands for Singapore dollars given fluctuating exchange rates. In the science section, students must decide whether scientific research can be used to determine the amount of chlorofluorocarbons in the atmosphere.
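The flavor of that currency question can be sketched in a few lines; the rates and starting amount below are hypothetical stand-ins, since the article does not give the actual values used on the exam:

```python
# An illustrative version of the PISA exchange-rate question.
# All numbers are made up: convert rands to Singapore dollars at one
# rate, convert back after the rate moves, and compute the loss.
rate_out = 4.2    # rand per Singapore dollar at the first exchange (hypothetical)
rate_back = 4.0   # rand per Singapore dollar at the return exchange (hypothetical)

rand_start = 3_000
sgd = rand_start / rate_out        # rands converted to Singapore dollars
rand_end = sgd * rate_back         # converted back after the rate falls
loss = rand_start - rand_end       # rands lost to the rate movement

print(round(loss, 2))  # 142.86
```

The point of such items is not the arithmetic itself but recognizing which rate applies at each step—the "applied skills" PISA is built to probe.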
Hong Kong students placed first in math in the 2003 test, and Finland held the top spot in science. The ranking of individual countries changed little between 2000 and 2003, although Poland, Germany, and the Czech Republic did significantly better the second time around. Wealthier countries tended to place higher on the PISA charts, although students in Korea, with a national income 30% below the OECD average, placed third in math and fourth in science. U.S. students stood 24th in math and 23rd in science, similar to their respective rankings in 2000.
“What these results say is that a student in Finland will have an easier time using his math and science knowledge to make sense of an unfamiliar situation than will a student from the U.S.,” says Romberg. Larry Suter, deputy director of the Division of Research, Evaluation, and Communication at the National Science Foundation, says he was surprised that Canadian students did so much better than their U.S. counterparts, given the similar socioeconomic profiles of the two countries. “This study is going to force us to think about what we teach in our schools,” he says.
As for PISA's impact on U.S. science and math education, Suter also believes that state assessments should be reevaluated to gauge the application of knowledge, not just retention, as a marker of student progress. U.S. high schools need to pay particular attention to practical knowledge, agrees Eugene Hickok, the U.S. deputy secretary of education: “In the international context, we have some mountains to climb.”
Brain-Computer Interface Adds a New Dimension
- Ingrid Wickelgren
This fall, surgeons implanted 100 electrodes into the brain of a 25-year-old quadriplegic man and connected them to a computer that enables him to check his e-mail and choose a television channel with his thoughts alone. And monkeys with similarly implanted electrodes have used brain signals to move cursors or robotic arms in two dimensions (Science, 24 January 2003, p. 496). Now, in a groundbreaking development, two neuroscientists from the Wadsworth Center, part of the New York State Department of Health in Albany, have shown that similar feats may be possible without the dangers of inserting electrodes into the brain. This week, in the online Proceedings of the National Academy of Sciences, Wadsworth's Jonathan Wolpaw and Dennis McFarland demonstrate a brain-computer interface (BCI) that can translate externally detected brain signals into both horizontal and vertical movement of a computer cursor.
“It's earthshattering that we may be able to reconnect the brain to a paralyzed limb or a robotic arm without surgery,” says computer scientist Melody Moore, who directs the Brain Lab at Georgia State University in Atlanta. “This disproves something people have been saying for a long time.”
Two-dimensional cursor control, Moore says, could be used to operate a wheelchair, a chess-playing robot, or a computer mouse, for example. Once you have the second dimension, she notes, “the third dimension is within reach.” And that could enable full movement of a limb.
Such a possibility seemed remote when Wolpaw, McFarland, and their colleagues described their first BCI in a journal in 1991. That system enabled a person to move a cursor on a screen up or down some indeterminate amount by raising or lowering the amplitude of electrical brain currents called mu or beta rhythms. By imagining actions such as running, floating, or moving one arm or the other, the subjects could influence these currents, which are generated by a brain area involved in sensation and movement. The researchers recorded the brain-wave changes using a detector called an electroencephalogram (EEG). It was a crude yes-no device, and skeptics doubted that this sort of BCI, which sums input from millions of neurons, would get much further.
In the following years, the Wadsworth group improved this one-dimensional BCI, enabling subjects to nudge a cursor a precise distance to land on one of four icons. Then, early last year, they translated that progress into two dimensions. One critical advance was a learning algorithm: The software program translating brain signals into cursor movement optimizes a user's performance by adjusting its parameters based on the trials a user has completed so far.
Putting the BCI to the test, Wolpaw and McFarland asked four volunteers—two of them with spinal cord injuries—to don caps speckled with 64 recording electrodes and to use whatever kind of imagery they could to push a cursor from the center of a computer screen to a target in any of eight possible locations on the periphery. As the volunteers did the task, a computer translated their brain's mu and beta rhythms into horizontal and vertical cursor movements.
After dozens of short practice sessions spread out over weeks, the two volunteers with spinal cord injuries could hit the target about 90% of the time within the 10-second time limit. (The others did so 70% to 80% of the time, perhaps because they were less motivated.) The best subject hit the target in an average of 2 seconds and with 92% accuracy—results comparable to the best achieved by monkeys operating implanted BCIs. Three of the volunteers went on to hit targets in eight additional places on the screen with similar speed and accuracy. “It did not throw them off to go to a new location,” Wolpaw says.
He and his colleagues are now working on adding a brain-wave switch that could enable a person to grasp or release an object using a robot arm or to click on icons on a computer screen after moving a cursor to them. But supporters of the implanted electrode strategy still question how flexible noninvasive BCIs can be. Brown University's John Donoghue, for example, says that complex movements requiring many dimensions of control may require devices like the 100-electrode array he and his colleagues at the firm Cyberkinetics in Foxborough, Massachusetts, are starting to implant in people. Such systems “engage the actual neural substrate intended for use in the lost voluntary movements,” as opposed to more diffuse EEG patterns, he says.
Yet the risks of neurosurgery, which include infection and brain damage, may make implanted sensing devices a hard sell for many patients. “There's a lot you can do with signals from the scalp,” says Wolpaw.
- NATIONAL INSTITUTES OF HEALTH
Report Seeks Stability for Behavioral Sciences
- Jocelyn Kaiser
Basic behavioral and social scientists want the National Institutes of Health (NIH) to pay more attention to their field. But a report calling for a “secure and stable home” for their research received a tepid reception last week from NIH Director Elias Zerhouni, and a tightening budget may limit what NIH could do even if it wanted to help.
The report comes from a 14-member panel led by University of Chicago sociologist Linda Waite, which was asked to assess NIH's current portfolio in these areas. It tallied $936 million in basic social and behavioral research and another $1.75 billion in clinical research in NIH's 2003 budget of $26.4 billion. This research is vulnerable, however, says the panel, because it is housed mostly at institutes focused on specific diseases or life stages. One major source, the National Institute of Mental Health, has recently stopped funding some of those grants to support more translational work (Science, 22 October, p. 602).
The panel proposed a solution: secure funding and a stable home at an existing institute. The top candidate is the National Institute of General Medical Sciences (NIGMS), NIH's basic research institute, followed by the aging or child health institutes. The report notes that Congress has repeatedly encouraged NIGMS to enlarge its current $13 million portfolio. The report does not suggest that grants be transferred to this home institute, however, a strategy NIH followed in creating the National Institute of Biomedical Imaging and Bioengineering in 2000. The panel also recommends a bigger role for NIH's director-level Office of Behavioral and Social Sciences Research, which now coordinates and promotes these research areas across institutes.
Members of the NIH director's advisory committee, which requested the study, agreed during a meeting last week that basic behavioral research is valuable. But there were questions about the panel's “structural” recommendations. Zerhouni, for instance, said he was not “clear” on whether the group was asking for a larger pot of money or a shift in existing resources now devoted to behavioral research. The former would require NIH “to scale something back” elsewhere, he noted.
NIGMS is “willing to support more” behavioral research such as genetics studies, says institute chief Jeremy Berg, but areas such as the social sciences would not be a natural fit. And finding new funding would be a tall order, Berg adds. Alan Kraut, executive director of the American Psychological Society, agrees: “This is going to come down to a budget issue.”
- U.S. AGRICULTURAL RESEARCH
Report, Lawmaker Promote an Independent Institute
- Erik Stokstad
Funding for agricultural and food research has traditionally been a dry patch compared to the well-watered scientific fields supported by the National Science Foundation (NSF) or the National Institutes of Health (NIH). Now its supporters are hoping that a recent report from a blue-ribbon panel will lead to a bumper crop of basic agricultural research. But first they have to figure out where to plant the seeds.
In 2002, on orders from Congress, the U.S. Department of Agriculture (USDA) asked a group of eminent scientists to ponder a national institute of food and agricultural science. This summer the panel, led by Chancellor Emeritus William Danforth of Washington University in St. Louis, Missouri, concluded that the greatest need was for an institute that would award extramural, peer-reviewed grants for basic research. “We felt a whole new culture has to be created that is more similar to NSF and NIH,” says Danforth.
Last month Senator Kit Bond (R-MO), who chairs the panel that sets NSF's budget, took Danforth at his word. He introduced a bill (S.3009) that would place the institute within NSF's biology directorate but give it an unusual degree of independence and its own advisory council. Although the bill expired with the end of the congressional session, Bond has said he acted quickly to stimulate discussion. And ag lobbyists are thrilled: “We've gotten to the starting line,” says R. Thomas Van Arsdall, executive director of the National Coalition for Food and Agricultural Research, an advocacy group based in Savoy, Illinois.
The task force found that basic research has been shortchanged. More than 90% of USDA's $2.4 billion research budget is not awarded by peer review. Instead, funds are distributed directly to land-grant universities and spent on intramural, mainly applied, activities through the Agricultural Research Service. Even the $180 million a year awarded competitively through the National Research Initiative (NRI) has its drawbacks: USDA grants are smaller and shorter than those of NSF or NIH and come from a much smaller pot (see chart). The task force recommended that the proposed new institute have an annual budget of $1 billion after 5 years. In addition, the number of grants should be doubled, to 1000, and their size boosted by 187%, to $225,000 per year.
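The task force's figures can be checked against one another; the following back-of-envelope arithmetic is mine, not the report's, and simply spells out what the quoted percentages imply:

```python
# Sanity-check the task force's grant recommendations as quoted in the
# article: 1,000 grants (doubled), each boosted by 187% to $225,000 a year.

def implied_current_size(target, pct_increase):
    """Average grant size implied by boosting it pct_increase% to reach target."""
    return target / (1 + pct_increase / 100)

target_size = 225_000                         # recommended annual grant size
current = implied_current_size(target_size, 187)
print(f"implied current average grant: ${current:,.0f}")   # roughly $78,400

grants = 1_000                                # doubled from ~500
total = grants * target_size
print(f"annual cost of the grants alone: ${total / 1e6:.0f} million")
```

At $225 million a year, the grants themselves would account for less than a quarter of the proposed $1 billion budget, which suggests the institute's mandate would extend well beyond the NRI-style competitive awards.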
Lobbyists say that they aren't worried about confusion over whether the new institute should be part of USDA or NSF. “Focus on the broader message: We need to boost federal support for basic research in the agricultural sciences,” says Howard Gobstein, vice president for governmental affairs at Michigan State University in East Lansing, who also works on behalf of the National Association of State Universities and Land-Grant Colleges, to which MSU belongs.
Advocates say that housing the new institute within NSF offers many advantages. “You could be sure that first-rate research would be done,” Danforth says. However, NSF officials worry that it could lead to similar demands from other interests, such as transportation or energy, traditionally outside NSF's purview. That would squeeze a budget that shrank by $107 million this year and may erode further in 2006. Supporters have a quick answer: An agricultural institute, they say, could be a rallying cry for the foundation to seek a bigger budget.
USDA prefers another approach. It says that boosting the NRI budget, which will grow by 10% in 2005, would be a logical way to strengthen basic research in food and agriculture. The agency has already increased average grant size by 80% since 2001.
Bond is expected to reintroduce his bill, with some changes, after the new Congress convenes next month. Lobbyists are hoping for a companion bill in the House of Representatives, and Representative Bob Goodlatte (R-VA), chair of the House Committee on Agriculture, tops their list of desired sponsors. A spokesperson for the committee says that members will meet with Danforth in the coming weeks but declined to speculate on any possible legislation. David Goldston, staff director for the House Science Committee, says the panel, which has jurisdiction over NSF but not USDA, would welcome a discussion of how best to achieve the aims of the USDA report.
Can the War on Locusts Be Won?
- Martin Enserink
“They shall cover the surface of the land, so that no one will be able to see the land. They shall devour the last remnant left you after the hail, and they shall devour every tree of yours that grows in the field.”
SEBT BOUNAAMANE, MOROCCO—So this is what Moses was talking about. On a beautiful November morning, it's clear even from afar that something's terribly wrong with the trees around this tiny village. They are covered with a pinkish-red gloss, as if their leaves were changing color—except these argan trees are evergreens. As you get closer, the hue becomes a wriggling mass: a giant cap of insects on every tree, devouring the tiny leaves. Get closer still, and you'll hear a soft drizzle: the steady stream of locust droppings falling to the ground.
But Morocco has locust-fighting weapons far beyond anything that ancient Egyptians could imagine. Later that morning, two yellow aircraft swoop down across the nearby Anti-Atlas mountain range, releasing a fine mist as they start skimming the land. Soon, the faintly soapy smell of pesticides fills the air. When entomologist Abdelghani Bouaichi jumps in his Land Rover to drive back to the National Centre for Locust Control in Ait Melloul, he's satisfied. Within 8 hours, most of these locusts will be dead.
Africa is once again fighting a battle against the desert locust, Schistocerca gregaria, and this winter, southern Morocco is ground zero. Vast waves of locusts are entering the country—as well as parts of neighboring Algeria—from Mauritania and Senegal. Dozens of planes make their deadly trips every morning; if they can manage to kill enough locusts, perhaps the emergency won't develop into a full-blown plague.
Perhaps. Even after 50 years of experience, fighting locusts is still more an art than a science. Nobody is quite sure how to prevent locust plagues or squash them once they're under way, nor is it clear how effective the thousands of liters of pesticides drizzling on the red earth are. Environmentally friendlier alternatives are in development, but questions linger about their efficacy as well. Compounding the problem, there aren't nearly enough locust researchers in this obscure field to tackle the questions—nor enough locusts. Plagues often occur many years apart, leaving researchers short of experimental material in the interim.
Progress is also hamstrung by long- running doubts about whether locusts really warrant all this trouble and expense. Locust-stricken countries claim huge economic costs, but some scientists argue that, overall, the toll is not that bad—certainly not compared to that of other pests and droughts. “They just have that reputation,” says Philip Symmons, a retired veteran of the locust wars who lives in France. “It's all because of Exodus.”
Ounce of prevention
The latest emergency is the most serious since a vast 3-year plague ended in 1989, after locust swarms had visited more than 30 countries from West Africa to India and donors had spent more than $300 million in emergency aid to kill them, in addition to a similar sum spent by the affected countries themselves. It's not nearly as bad this time—at least not yet. The U.N.'s Food and Agriculture Organization (FAO) in Rome, which coordinates the battle, classifies it as an “upsurge” rather than a plague, because it's affecting only one major breeding area, West and northwest Africa. A few swarms have ventured farther out—one staged a stunning photo op in front of Cairo's pyramids—but so far, these are exceptions of less concern.
Still, the situation is bad enough—especially because it wasn't supposed to happen. Since the last plague, FAO and many countries have prided themselves on their ability to prevent crises of this magnitude. Most of the time desert locusts are solitary insects; only after heavy rainfall and an increase in vegetation do they sometimes undergo a spectacular transformation that leads them to band together (see sidebar). Small swarms merge, and merge again, until they're gigantic. Nip 'em in the bud is the philosophy; then you won't have to pull out all the stops later.
To do the nipping, countries at risk have set up early-warning systems: local teams that search for early infestations in the desert—“outbreaks” in locust parlance—and kill them. They are helped by FAO's locust forecasts, which pinpoint potential trouble spots on the basis of past locust sightings, the weather, and satellite data about vegetation growth.
It's easier said than done, however. The area where outbreaks can originate is vast (see map below); most of it is extremely rugged, inaccessible, and virtually uninhabited. Some of it is war-torn. Survey teams have gotten lost, and some have perished. Complacency is always a danger—it's hard to stay focused on a threat you haven't encountered for years—and vehicles and other equipment are often in short supply. Corruption and political favoritism occasionally stand in the way as well. “Sometimes you meet a national head of locust control who doesn't know the first thing about locusts,” says Arnold van Huis, a locust expert at Wageningen Agricultural University in the Netherlands.
Several specific problems conspired to bring about the current upsurge, says Clive Elliott, head of FAO's locust program. There was quite a bit of rain throughout the summer of 2003 in the Sahel, triggering serious locust outbreaks in Mauritania, Mali, and Niger that overwhelmed those countries' control systems. Then a few days of extreme rainfall in October provided perfect breeding conditions for the next 6 months. FAO pleaded for $9 million in emergency funds in February, but rich countries were slow to react; the money didn't start flowing in earnest until searing pictures of ravaged crops made news this summer. But by that time, the locusts had already been through a winter and spring breeding season in North Africa and another one in the summer in the Sahel. (FAO now says it needs $100 million and maybe more.)
Some experts question whether the ounce-of-prevention strategy could have worked, even with plenty of resources. The first congregations of desert locusts are so small, and the area in which they can occur so vast, that—unless hundreds of planes and entire armies are dispatched—it's futile to try to find them all, Symmons says. Rather than clinging to the idea of prevention, he says, countries should focus their fight on the later stages, when big swarms make easy targets.
Elliott disagrees. Upsurge prevention can and does work if it's done well, he says. At the same time the swarms first appeared in West Africa last year, they also surfaced in Sudan and soon crossed the Red Sea to Saudi Arabia, the traditional springboard for India and Pakistan. But that outbreak was effectively dealt with. Elliott credits the Emergency Prevention System (EMPRES) for locusts and other pests, a multinational program set up by FAO that aims to build the capacity necessary for early intervention. The plan is to expand EMPRES to West Africa.
Gone with the wind
The heavy use of pesticides is another issue of continuing debate. Since October 2003, some 110,000 square kilometers of land have been sprayed, FAO says, which corresponds to roughly 11 million liters of pesticides, most of it organophosphates. The risks to humans can be mitigated: In Morocco, for instance, planes are ordered to avoid villages, and control workers regularly have their blood checked for increased levels of the compounds. But there's pressure to reduce their use, especially from the donor countries. Already, thousands of tons of leftover pesticides from previous campaigns have been abandoned across Africa, often with their packaging decaying; few Western countries are eager to add to that sinister stock.
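FAO's two figures imply an application rate that can be checked in a few lines; the calculation below is my own, and the comparison to ultra-low-volume spraying norms (on the order of a liter per hectare for locust control) is an assumption drawn from general practice, not from the article:

```python
# Rough application-rate check on the FAO figures cited in the article:
# 110,000 km2 of land sprayed with about 11 million liters of pesticide.

area_km2 = 110_000
litres = 11_000_000

hectares = area_km2 * 100          # 1 km2 = 100 ha
rate = litres / hectares           # litres per hectare
print(f"{hectares:,} ha sprayed at ~{rate:.1f} L/ha")
```

The result, about one liter per hectare, is consistent with the ultra-low-volume spraying typically used against locusts, so the two figures hang together.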
Research on alternatives is occurring “at a glacial pace,” says Allan Showler, a former EMPRES head who now works at a U.S. Department of Agriculture lab in Weslaco, Texas. Field testing is particularly difficult because outbreaks are so rare—and when they do happen, the first priority is squashing them. Still, FAO is encouraging new studies. This summer, for instance, two field trials were conducted on 400-hectare plots—one in Niger, the other in Mauritania—with a much-touted safer alternative, a toxin produced by the fungus Metarhizium anisopliae, which is marketed under the name Green Muscle. The trials had several logistical problems (in Mauritania, the product's formulation had a “yogurtlike consistency” that made spraying difficult), and the results were inconclusive, Elliott says: “It certainly didn't work like a dream.” FAO is hoping to do a bigger trial next year. Other promising candidates include a relatively new insecticide called fipronil and a class of hormones called insect growth regulators, but they, too, have yet to prove their mettle.
Whether the spraying operations can end an outbreak—or even alter its course significantly—is also still an open question. FAO locust forecaster Keith Cressman says there's a good chance they can; if Algeria and Morocco keep up the fight for the next 3 or 4 months—and there isn't too much rain in winter and spring—they may kill enough locusts to end the upsurge. He finds hope in the fact that, during their migration from the Sahel to North Africa, many swarms are becoming trapped by the cold just south of the Atlas mountains. That makes them sitting ducks.
But others doubt that human intervention alone can do the job. When the last plague was finally over in 1989, some credited the costly control campaigns, but others thanked strong winds in October and November 1988 that blew some locusts all the way to the Caribbean—and billions of others to their deaths in the Atlantic. (That wasn't the first time this happened: Once the pharaoh repented, Exodus 10 reports, “the Lord changed the wind into a very strong west wind, which lifted the locusts and drove them into the Red Sea.”)
Counting the cost
Beneath the questions on the best control strategy, there's another unresolved issue: Is it all worth it? Standing in a field in Morocco, surrounded by millions of insects, Bouaichi says he can hardly believe anybody doubts the urgency of the fight. Earlier that morning, he had reassured anxious villagers that the planes would arrive soon to save their olive and date trees. The Sous valley, which has citrus groves worth hundreds of millions in exports, is just 100 kilometers away—and it's at risk, too. “Not much damage? I don't understand how people can say that,” he says.
But other scientists argue that locusts are like hurricanes: The damage is devastating on a local scale but limited at the national level. In a 1990 report about the 1980s plague, for instance, the U.S. Congress's Office of Technology Assessment called the rationale for intervention “shaky.” When locust expert Stephan Krall and his colleagues at the German aid agency GTZ tried to assess the damage from the same plague, “we really didn't find all that much,” he says. Stories about the astronomical appetites of locust swarms—based on the well-known factoid that the insects can devour their body weight in vegetation every day—need to be taken with a grain of salt, Krall asserts. Besides, Van Huis notes, locusts are primarily desert creatures that often dine on the natural vegetation.
Many are skeptical about recent claims that half of Mauritania's crops were lost last summer. Countries are well aware that nothing opens donors' wallets faster than a big disaster, Cressman says. But there are more than economic costs to consider. Although the value of cash crops may be relatively easy to establish, how do you measure the loss of a harvest for a subsistence farmer? What about the social costs, such as the drift to cities that can follow a bad harvest?
Besides, the alternatives to control are either not feasible in poor countries or politically unpalatable. Food aid for stricken subsistence farmers is an unpopular idea, and some form of insurance—which is how developed nations would deal with the problem—simply isn't available in Africa. And no government can be seen as sitting on its hands when locusts strike. Says Bouaichi: “Imagine there was a locust plague in Britain or France and the government did nothing.” So the battle continues.
An Insect's Extreme Makeover
- Martin Enserink
Schistocerca gregaria, the desert locust, is a dull-looking, shy insect that tends to stay put, avoid other locusts, fly by night, and never cause trouble. And then there's the desert locust, Schistocerca gregaria, a conspicuous yellow-and-black—or bright pink when not fully mature—thrill seeker that bands together in swarms of billions that cross vast distances in broad daylight and devour tons of vegetation in their path.
So striking is the difference between the desert locust's “solitary” and “gregarious” phases that it wasn't until 1921 that Russian entomologist Boris Uvarov realized they were the same species. And only recently have scientists begun to piece together a detailed picture of how the insect switches from one phase to the other. University of Oxford entomologist Stephen Simpson, the uncontested leader in this small field, hopes that this understanding may eventually help prevent plagues. “The phase change is the defining feature of locust biology,” he says, “and also the main problem.”
The makeover is the locust's answer to harsh life in the desert, Simpson explains. Most of the time, the sparse vegetation can sustain only small numbers of desert locusts, and they do best by staying out of one another's way. After intense rain, however, plant life explodes and locust numbers skyrocket; when the inevitable drought sets in, the insects find themselves coalescing in high numbers around shrinking food supplies. This increased density is what triggers the shift from solitary to gregarious—presumably because, once they run out of food, the insects need to migrate and, like many species, they seek safety in numbers.
Researchers have long known that the locust's behavior is the first thing to change. A solitary locust becomes more attracted to its mates and more active after spending just 4 hours in a crowded cage, for instance. The spectacular morphological transformation, on the other hand, can take several generations to complete. (When densities drop—for instance, when enough members of a swarm die—the process reverses.)
Researchers have long wondered what tips off the locusts to the crowded environment: a visual, olfactory, or tactile cue. To find out, Simpson and his colleagues tested combinations of three stimuli: exposing solitary insects to air samples that had passed over a group of locusts, to the sight of 10 of them behind a glass wall, or to a tactile stimulus caused by being jostled by small paper balls. The tactile stimulus was by far the most potent trigger. Later, the group discovered that touching the insects' beefy thighs—which contain many so-called mechanoreceptors—in particular resulted in gregarization. The bottom line, according to Simpson: Locusts become social animals once their legs start bumping together.
Since then, Simpson's group, in collaboration with Malcolm Burrows and Thomas Matheson of the University of Cambridge, has delved into the physiology of the shift, discovering, for instance, that they could induce the change by electrical stimulation of a particular leg nerve. They have also shown that the central nervous systems of solitary and gregarious locusts have marked differences in the levels of 11 neurotransmitters. In this week's online early edition of the Proceedings of the National Academy of Sciences, Le Kang of the Beijing Genomics Institute in China and his colleagues take the search to the genetic level, although for another species, the migratory locust. Comparing solitary and gregarious larvae, they found differences in the expression levels of 532 genes.
Eventually, such studies could lead to the development of compounds that block or reverse gregarization. But entomologist Arnold van Huis of Wageningen Agricultural University in the Netherlands is skeptical that this would ever become a practical tool; you'd still have to find the right populations in the vast desert and spray them, he notes—precisely the problem with current, pesticide-based control.
But other findings could have a more immediate impact. Simpson and his collaborators have also discovered that it's not just the number of locusts and the amount of vegetation that determines whether a population flips from solitary to gregarious; it's also the vegetation's “patchiness.” A clump of 10 plants close together might trigger gregarization, but 10 plants far apart may not. Locust forecasting models use satellite data to gauge the amount of vegetation, Simpson notes—but they should also take into account how patchy it is. Locust forecaster Keith Cressman of the United Nations Food and Agriculture Organization says he's “very interested” in finding out if this can help refine his forecasts.
- CHILDREN'S HEALTH
NIH Launches Controversial Long-Term Study of 100,000 U.S. Kids
- Jocelyn Kaiser
Although funding is not guaranteed for the $2.7 billion National Children's Study, planners have settled on an innovative sampling strategy and are seeking proposals
Congress, advocacy groups, and researchers want to know more about how the environment—defined as everything from physical to social factors—influences a child's development and health. Could chemical pollutants, for example, be contributing to childhood diseases such as autism? To find out, federal scientists and other experts have wrangled over the design of a hugely ambitious $2.7 billion study that would follow the health of 100,000 U.S. children from before birth to age 21.
Now, after 4 years of planning, the National Institute of Child Health and Human Development (NICHD) has released a draft study blueprint* and is seeking proposals for contracts to run pilot centers. But questions loom about the methodology of the project, which would begin enrolling pregnant women and their newborns in 2007. Planners want to screen for subjects by contacting, in effect, a random sample of U.S. households in selected areas—standard procedure for the census but an untested approach for a long-term medical study.
Researchers who helped plan the National Children's Study (NCS) admit that this sampling strategy carries risks, from making it hard to get clinical samples to eroding support from researchers outside the selected areas. “It's extremely ambitious,” says epidemiologist David Savitz of the University of North Carolina, Chapel Hill, who chaired a sampling design panel. “Whether it's gone from extremely ambitious to impractical, only time will tell.” Another question is whether Congress will pony up the money for the study, which would cost about $70 million to $200 million a year starting in 2006.
Four years ago, Congress called for a longitudinal study of environmental influences on children's health, modeled on projects such as the famous heart study conducted in Framingham, Massachusetts (Science, 11 July 2003, p. 162). Hundreds of outside researchers and four agencies have narrowed scores of possible hypotheses to about 30. The current list includes whether pesticide exposures can alter cognitive development, whether violent TV shows and video games raise a child's risk of gun injury, and whether underweight newborns are more prone to obesity as teens. The study will collect environmental data in unprecedented detail, supporters say, including data on exposures to infections, stress, and pollutants even before some parents conceive.
One contentious issue has been how to recruit subjects—through academic medical centers, or by selecting a probability-based sample representing America's ethnic, social, and geographical diversity. Social scientists prefer the latter so the study's results will reflect the entire population. After planners agreed with that goal last summer, federal statisticians crunched demographic and birth data, and then last month NCS unveiled 96 study sites scattered across the country, from rural Minnesota to Queens, New York (see map). Eight were picked as possible sites for initial “vanguard” centers. They will likely screen for couples planning to have a child by calling or knocking on doors of randomly chosen households.
Although any institution can apply for a center in its quadrant of the country, organizers acknowledge it may be impractical for, say, a Boston team to lead rather than collaborate with a center in New York City. NIH does not usually solicit proposals for fixed locations. “This is very much top-down,” which may not please some researchers, says epidemiologist Grace Lemasters of the University of Cincinnati in Ohio, who is on the NCS advisory committee. Although she's disappointed that no sites fell closer to Cincinnati, Lemasters says she supports the study's sampling approach. “It almost has to be that way” so the results will reflect all of America, she says.
Also unusual is that subjects won't be chosen through their medical care provider. That makes it more likely that many will move or drop out: “Retention is going to be a huge issue,” says Savitz. Another challenge will be the logistics of collecting biological samples, such as placentas and cord blood, from the hospital in which the mother happens to deliver. And if a family has no regular doctor, “we'll have to figure out how to deal with that,” says NICHD epidemiologist Mark Klebanoff. To help fill gaps, the centers will also recruit some subjects through prenatal care providers.
Aware of these uncertainties, NICHD considers the three to eight “vanguard” centers to be pilots that will help refine the study plan released last month, says NCS director Peter Scheidt. (The study has $12 million for contracts in 2005, enough to launch these centers, which will recruit 250 newborns a year for 5 years.) The vanguard centers will later serve as models for other centers, Scheidt says. Eventually, NICHD hopes to fund up to 50 centers that cover all 96 locations.
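The article's figures give a rough sense of the study's scale per site and per center; the divisions below are a back-of-envelope sketch of mine, not numbers from NICHD:

```python
# Scale of the National Children's Study sampling plan, using the article's
# figures: 100,000 children, 96 study locations, up to 50 centers.

children, locations, centers = 100_000, 96, 50

print(f"~{children / locations:,.0f} children per study location")
print(f"~{children / centers:,.0f} children per center (if all 50 are funded)")
```

At roughly a thousand children per location, each site must sustain recruitment and retention on the scale of a sizable cohort study in its own right, which underscores why the vanguard pilots matter.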
Future funding is the big unknown. Although congressional appropriators recently expressed support for the study, they did not allocate an extra $15 million in 2005 that advocates hoped for. (A long list of advocacy groups supports the study, from the American Chemistry Council to the American Academy of Pediatrics.) Backers are hoping that the selection of vanguard centers will build support in Congress by putting the study on the radar screens of local representatives.
The Grand (Canyon) Experiment
- Elizabeth Pennisi
Last month, researchers learning from a previous failure once again flooded the Colorado River in an ambitious attempt to rebuild eroded shoreline in the Grand Canyon
LEES FERRY, ARIZONA—Some researchers chase tornadoes. Ted Melis rides waves. Big, river ones. But on the eve of the ride of a lifetime, the geomorphologist with the U.S. Geological Survey (USGS) in Flagstaff, Arizona, is miserable. Situated on the banks of the Colorado River, Melis is struggling to keep sensitive electronic equipment dry as a cold downpour spills out of the dark, gray sky. He's fashioned a blue tarp into a makeshift tent covering the front half of his 11-meter motorized raft, but it's sagging precariously from the buildup of water.
Melis's wet, chilled fingers work in slow motion, packing away instruments. Colleagues at a nearby second raft stow food and supplies, including spare outboard motors, insurance against breakdowns. For the next several days, Melis and his fellow rafters will collect samples and monitor the river's behavior as rushing waters push and pull sand and silt along its long and winding course. “You have to carry all your equipment and be self-sustaining,” explains Jeffrey Cross, director of the National Park Service's (NPS's) Grand Canyon Science Center in Arizona. “Once you launch, you have to go the whole 240 miles.”
Last month, floating by Native American ruins and spectacular scenery, Melis, Cross, and a dozen other researchers and journalists headed down the Colorado. Their journey marked the beginning of an audacious, 18-month experiment in which scientists and conservationists will test whether a giant wave of water let loose down the river can restore sandbars in the Grand Canyon, one of Earth's great wonders and a popular tourist destination for more than a century.
The stakes are high. For 40 years, the canyon's bars and beaches have been eroding, taking away critical habitat for riverside life and robbing human visitors of comfortable campsites. Yet playing with the Colorado's flow out of Glen Canyon Dam, about 25 kilometers upstream from where Melis and Cross set in, is no small matter: It's the source of hydropower for about 170 utility companies, reservations, and municipalities, and it contributes to the water supply of three states downstream. And if Melis and his colleagues see no improvement in the Colorado's shorelines, it will be the second time in a decade that this multimillion-dollar experiment has failed. That may leave land managers with no choice but to consider even more costly measures, such as shipping in sediment, for rebuilding the river's real estate.
A damming problem?
In theory, the Grand Canyon's problem and the solution to it are straightforward. Glen Canyon Dam, completed in 1963, restricted the Colorado's natural flow, disturbing the balance of sand deposition and erosion. A flood of extra water released from the dam should carry sediment to the degraded areas. The restored sand and gravel bars should in turn restore nursery grounds for an endangered fish, the humpback chub. As an added bonus, wind whipping up newly settled sand would blow over and rebury Native American ruins and other vulnerable archaeological sites exposed by erosion.
Historically, however, any flood was “bad.” Water was viewed as a resource that should be corralled and harnessed. In the mid-20th century, the U.S. government began constructing dams to tame the Colorado and other rivers feeding it. Among the more majestic was Glen Canyon Dam, at 216 meters tall. Behind it, Lake Powell holds about 34 trillion liters. At the bottom of the dam, eight turbines generate enough electricity to satisfy, for the moment, the West's need for power at peak consumption times.
Today, the flow from the once-mighty Colorado River is highly regulated. By law, in 2005, 10 billion cubic meters of water must pass through the dam to ensure that downriver states are adequately supplied with water. To maximize power output, the dam operators usually allow about 283 cubic meters per second (cms) of water to pour through the turbines during the day and reduce that flow to as little as 145 cms at night, creating artificial “tides” along the river's 386-kilometer run from Lake Powell to Lake Mead. Because those turbines pull water from the lake bottom, the released water is relatively cold and sediment-free compared to the Colorado's free-flowing days.
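The scale of these artificial “tides” is easy to put in volume terms; the split between day and night rates below is my own assumption (roughly 12 hours at each), since the article doesn't say how long each regime lasts:

```python
# Daily release volume under the article's day/night flow figures, assuming
# (my assumption, for illustration) roughly 12 hours at each rate.

day_cms, night_cms = 283, 145        # cubic meters per second
hours_day = hours_night = 12

volume = (day_cms * hours_day + night_cms * hours_night) * 3600  # m3 per day
print(f"~{volume / 1e6:.1f} million m3 released per day")
```

That swing, with discharge nearly halving overnight, is what raises and lowers the river's level along the 386-kilometer run below the dam.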
Faced with these unnatural conditions over the past 40 years, native fish disappeared, non-native fish thrived, and sandbars washed away. Few thought much about mitigating these detrimental effects until the Grand Canyon Protection Act of 1992 charged the caretakers of the dam and the canyon with doing something about these problems. Four years later, the Bureau of Reclamation, working with USGS and NPS, took action with the first deliberate flooding of the canyon. The bureau sent 1274 cms of water through Glen Canyon Dam's four bypass tubes for a week (Science, 19 April 1996, p. 344). As predicted, the newly surging river—its waters the color of cocoa—picked up sediment from the river bottom. Initially, the scientists were ecstatic as sandbars downstream expanded. But over the course of the weeklong experiment, the water turned clear—a sign that the flood had scoured all sand and silt—and it proceeded to slurp up the just-laid sediment from bars and beaches. “What we learned is that that sediment is moved out early,” says Charles Groat, director of USGS.
From a policy perspective, the outcome was disappointing, but from Melis's point of view, the $4.5 million experiment taught the scientists an important lesson: They had overestimated how much silt and sand had built up in the riverbed. So they hatched a new plan. Timing, they realized, was of the essence.
The key would be to release water from the dam after heavy rains had flushed lots of sediment from the Paria River—a large tributary 25 kilometers downstream from the Glen Canyon Dam—into the Grand Canyon. Also, the researchers planned to shorten the time they would release the highest flows, limiting them to 60 hours instead of the 90 hours done in 1996. And after the large releases, they would hold the flow for a few days at a relatively small 227 cms, to let the sand settle and to see the results of the flood.
It would be a delicate balance. They needed to wait for the sand pile from the Paria and, to a lesser extent, other tributaries to accumulate, but if they waited too long, it would wash away. Likewise, the flush from the dam needed to last just long enough to scoop up and redeposit the sediment but not so long that the water ran a deficit and carried it away again. Melis compares the sediment loading to a financial accounting scheme—and he wants to make sure the river stays in the black.
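Melis's accounting metaphor can be sketched as a simple ledger: tributary inputs are deposits, flood exports are withdrawals, and the experiment succeeds only if the balance never goes negative. The numbers below are invented purely for illustration (only the roughly one-million-tonne Paria input comes from the article); they are not measurements from the experiment:

```python
# A toy version of the sediment "accounting" view: the river stays in the
# black as long as cumulative inputs exceed cumulative exports.

def run_budget(events):
    """events: list of (label, tonnes) — positive for inflow, negative for export.
    Returns the first overdrawn step (or None) and the final balance."""
    balance = 0.0
    for label, tonnes in events:
        balance += tonnes
        if balance < 0:
            return label, balance      # the river has gone "into the red"
    return None, balance

events = [
    ("Paria flood input", +1_000_000),  # ~1 million tonnes, per the article
    ("60-hour high flow", -700_000),    # hypothetical export during the release
    ("settling period", -50_000),       # hypothetical losses as flows drop
]
overdrawn_at, balance = run_budget(events)
print(overdrawn_at, f"{balance:,.0f} tonnes left on the bars")
```

The 1996 lesson maps directly onto this picture: running the high flow too long turns the final withdrawals larger than the deposits, and the ledger ends in the red.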
In 2002, after much political debate, the management group overseeing scientific projects in Glen and Grand canyons gave the plan a tenuous nod (see sidebar). Yet it took 2 years to move ahead. A prolonged drought took hold of the region, and runoff was scarce. “There was the will, but we were waiting for significant sediment,” says Melis.
Then, from September to early November, tropical storms swept through, flushing a million tons of sediment down the Paria and into the Colorado River. On 21 November, at 7 a.m., dam operators opened two 240-centimeter-diameter discharge tubes, each carrying 107 cms. Water shot out and crashed into the river, sending spray tens of meters into the air. Three hours later, two other discharge tubes were opened as well. Including the water exiting the dam through turbines, the flow eventually topped 1161 cms, four times the usual daytime high.
Riding the waves
The surge reached Lees Ferry and Melis less than a day later. By that time, “the river [was] lousy with scientists,” says Groat. About 50 researchers, some of whom had spent the preceding weeks determining the baseline conditions needed for a postflood comparison, were busy with 20 projects. Airborne researchers had used remote sensing to get a precise accounting of the shape of the riverbed. Aerial photographs and light detection and ranging equipment had also documented the size and shape of 150 sandbars.
Back on the river, Melis set out early on 22 November to observe the fate of the sediment swirling around at the front of the wave. His arsenal was a combination of tried-and-true instruments and high-tech devices. An isokinetic point sampler built in 1961 with parts stripped from a B-29 bomber sampled the river at fixed depths, yielding hundreds of packets of water and sediment that would be analyzed on shore. Meanwhile, a sleek, $30,000 device provided details about grain size and concentration, sampling the water once per second and providing data in real time on particles as small as 3 micrometers.
Immediately after leaving Lees Ferry, “we didn't see any evidence of high sand concentrations” in the main river, says Melis. Instead, the researchers saw the preexisting sand in a large eddy being stirred up—a disturbing observation given that the goal was to put more sand into these quiet spots and not pull it out. But 1.5 kilometers later, “the whole river was brown with sand,” he notes, and on target for building bars. The researchers expect that this color transition also signaled a change in the size of the grains in the flow, a shift that may be crucial to the experiment's success, as it takes just the right mix of sediments to make stable sandbars and beaches. “It's like Nature's way of mixing concrete,” Melis explains. As in concrete, the mix of grain sizes determines the properties of a sandbar. “We hope to find a wider range of sand and silt grain sizes in these bars” than in 1996, says Melis.
By late afternoon on that first day on the river, Melis's boat passed colleagues who had set up a field lab behind a rock pile. Computer in hand, satellite dish mounted on a nearby rock, and laser-emitting and -receiving monitor by his side, USGS hydrologist Scott Wright measured the amount of sediment as well as the distribution of grain size in the water passing by. His was one of several stationary “labs” that complemented Melis's mobile one.
On the opposite bank, Mark Schmeeckle, a river mechanics expert from Arizona State University in Tempe, was tracking water speed using an acoustic Doppler device that bounced sound waves off sand in the water column. Changes in the frequency of the returning sound waves translated into water speed. “A surprise is how fast the bottom is moving,” notes Neil Ganju, a USGS hydrologist based in Sacramento, California, who was doing similar tests 50 kilometers away. The flood was apparently moving more sand, more quickly than expected.
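The frequency-to-speed conversion behind such instruments follows the standard two-way acoustic Doppler relation, v = c·Δf/(2·f₀). The sketch below illustrates that relation only; the transmit frequency, the frequency shift, and the sound-speed value are assumed for the example, not taken from the instruments on the river.

```python
# Minimal sketch of the Doppler relation an acoustic current profiler uses:
# sound scattered off moving sand returns frequency-shifted, and the shift
# maps to water speed along the beam. Numbers here are illustrative only.

SPEED_OF_SOUND_WATER = 1482.0  # m/s, fresh water near 20 °C (approximate)

def doppler_speed(f_transmit_hz, f_shift_hz, c=SPEED_OF_SOUND_WATER):
    """Along-beam water speed (m/s) from the two-way Doppler shift.

    v = c * delta_f / (2 * f0): the factor of 2 accounts for the sound
    traveling out to the scatterer and back.
    """
    return c * f_shift_hz / (2.0 * f_transmit_hz)

# Example: a hypothetical 600 kHz instrument observing an 810 Hz shift
# corresponds to roughly 1 m/s of water moving along the beam.
v = doppler_speed(600_000.0, 810.0)
```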
Schmeeckle shared his field site with biologists who viewed the dam release with trepidation because the rising waters threatened an endangered snail. The 1-cm Kanab ambersnail lives primarily in a natural spring called Vasey's Paradise that is about 18 kilometers downstream from Lees Ferry. The snail thrives on a native plant, monkey flower, which grows close to the water's edge. The flood therefore put as many as 7000 snails in jeopardy. In 1996, conservationists rescued many of the snails by taking them temporarily to higher ground, but that wasn't enough, says Clay Nelson, a biologist with the Arizona Game and Fish Department. “The habitat was inundated and scoured away,” he says. This time Nelson and his colleagues took even more radical action. In advance of the approaching flood, they dug up a 35-square-meter swath of monkey flowers and the surrounding soil and moved them on pallets 10 meters above water level. “It's a heroic effort,” says NPS's Cross.
When Melis came upon these snail savers, they were waiting out the flood in a makeshift kitchen and sitting area protected by two tarps, one held up by a river oar. They expected to be there another week, missing Thanksgiving at home. “Once the water recedes, we can put [the snails] back in place,” Nelson explains.
Another 50 kilometers downstream, Bill Parsons, a biologist with the Arizona Game and Fish Department, and his colleagues kept tabs on another endangered species, the humpback chub. It is one of the canyon's four remaining native fish species, although estimates suggest that fewer than 4000 are left here. Typically, the chub hatch in gravel bars in a tributary called the Little Colorado. Then young fish wash down into warm, shallow pools that form behind sandbars, eventually making their way into the river.
The Glen Canyon Dam has made life difficult for the chub. There are fewer warm pools and more dangers once the fish leave these protected areas. When they hit the Colorado, now colder because water is released from the bottom of the dam, growth slows, leaving them vulnerable to trout, which thrive at the lower temperatures. Moreover, the clear water—sediments settle in Lake Powell—helps the trout visually track prey. Parsons and his colleagues hope the flood-induced turbidity will benefit the chub and that new sandbars will mean more backwater refuges. One worry: The flood may push the chub downriver, away from their normal environs. Still, floods used to be a way of life for this species—unlike the trout, which are not native to the canyon.
Even if the river builds its shoreline and sandbars back up, and the chub and snails do well, the ecosystem will never be the same as in decades past. “It's not a natural ecosystem,” Cross explains. “It's a managed ecosystem.” The sediment provided by the Paria, for example, is less than a tenth of what the dam-free Colorado carried. And this bolus includes more fine sand than in earlier days. Nonetheless, “we have to manage with the tools we have left,” says Nick Melcher, a hydrologist at the USGS in Tucson. Indeed, the Grand Canyon's caretakers may have to perform controlled releases from Glen Canyon Dam every few years, just to make up for the erosion that occurs during the time in between. “If we build a whole lot of sediment on the banks, it will not stay there forever,” says Pam Hyde of the Grand Canyon Wildlands Council in Flagstaff. “[This flood] will not solve the problem once and for all.”
A Cowboy Lawyer Goes Down the River
- Elizabeth Pennisi
Bennett Raley looked a little out of place on a river raft as he rode down the Colorado 2 weeks ago to observe an experimental flood as it took place. Once a rodeo competitor, he protected his face from breaking waves with a cowboy hat instead of a raincoat hood. In lieu of rain pants, he wore oilcloth slacks, further reinforcing the cowboy look. But Raley certainly belonged on the raft: As a Republican political appointee serving as assistant secretary for water and science at the U.S. Department of the Interior (DOI), he was instrumental in preventing the Grand Canyon flood project from being scuttled by discord among the six federal and state agencies, seven states, two environmental groups, six Indian tribes, and two utility companies that had a stake in the effort.
Early in his time at DOI, Raley was skeptical of the project. He worried that it was aimed at altering Glen Canyon Dam's power production and represented “advocacy” science. He had a change of heart, however, when he took a raft trip on the Colorado with the researchers involved. “I think he saw the passion of the scientists, of the boatman, and of the community,” says Jeffrey Cross, director of the National Park Service's Grand Canyon Science Center. Raley agrees: “That trip was instrumental in persuading me there was a basis for trusting the scientists.” Soon after that trip, he recommended that the secretary of the interior give the flood project a green light.
In August, however, the Glen Canyon Dam Adaptive Management Group, which had approved the project in 2002, took a second look at the plan and voted it down. “No one expected that,” says Raley. He rode into the fray, and after a soul-searching conference call with various representatives from the group, everyone came back on board and the flood was on again.
A final hurdle appeared in November. For the dam release to do any good, there needed to be enough sediment-laden runoff from the Paria, a key tributary downstream from the dam. Despite rains in September and October, it was not clear whether the amount of sand and silt at the Paria's mouth matched what the approved plan called for. “We were still in the gray zone,” Raley recalls. Still, he opted to let the release proceed, and by the day of the flood, subsequent storms had brought in those missing tons, confirming that his decision was the right one. “We might have had a different outcome,” he says, “had there not been that trust.”
- ENVIRONMENTAL CHEMISTRY
Tracking the Dirty Byproducts of a World Trying to Stay Clean
- Rebecca Renner*
Stain protectors and other perfluorinated chemicals are part of our lives—and they are having a growing effect on the environment
They keep ketchup out of the carpet, sauce off your shirt, and fat inside the fast food wrapper. But although fluorinated stain protectors may be a boon in the home and on the run, the almost indestructible byproducts of these chemicals are fouling the planet. Amid growing concerns about the byproducts' ubiquitous presence and possible toxicity, scientists are trying to answer an even more fundamental question: How does a class of chemicals that isn't manufactured in large quantities and that can't travel far become so pervasive?
Fluorinated stain protectors consist of fluorinated surfactants chemically bound to polymers. The fluorinated surfactants work because their strong and rigid carbon-fluorine backbones act like tiny bristles to keep dirt, water, and grease off fabrics, carpets, and paper. Most surfactants don't travel in the environment. But their volatile precursors, fluorotelomer alcohols, travel and degrade into a class of chemicals, perfluorocarboxylates, that is extremely persistent. After a half-century of increasing use, the perfluorocarboxylates are showing up at growing levels in seals and polar bears roaming the Arctic as well as dolphins patrolling the mid-Atlantic.
Over the past 2 years, a team led by University of Toronto chemist Scott Mabury has published dozens of papers identifying these various chemicals in the air and in animals. They've also explained how the volatile precursors, which can be surfactants themselves, can travel thousands of miles in the atmosphere and then be transformed by reaction with oxygen into perfluorocarboxylates. Last month one of Mabury's students, chemist Craig Butt, reported that perfluorocarboxylate concentrations are doubling in Arctic animals every 4 to 10 years (see map).
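The stakes in Butt's doubling-time estimate become clear with a little exponential arithmetic: over two decades, a 4-year doubling time and a 10-year doubling time imply very different increases. The sketch below is illustrative only; the 20-year horizon is assumed for the example, not drawn from the study.

```python
# Back-of-envelope exponential growth implied by a doubling time, to put
# the reported 4-to-10-year doubling of perfluorocarboxylate levels in
# Arctic animals in perspective. The time horizon here is illustrative.

def fold_increase(years, doubling_time_years):
    """Factor by which a concentration grows over `years`
    if it doubles every `doubling_time_years`."""
    return 2.0 ** (years / doubling_time_years)

# Over 20 years, a 4-year doubling time means a 32-fold rise,
# while a 10-year doubling time means only a 4-fold rise.
fast = fold_increase(20, 4)
slow = fold_increase(20, 10)
```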
Drawing on Mabury's work, Canada this summer banned for at least 2 years the production and importation of three polyfluorinated stain protectors that degrade into the long-chain carboxylates Butt is finding in seals. The ban, a first by any government, was triggered by a request from chemical manufacturers to scale up production of the trio of chemicals. John Arseneau, director general of Environment Canada's risk-assessment directorate in Ottawa, concedes that the ban is a “preventative” step that could be lifted or altered. But despite the uncertainty, he says, the government decided “it was time to take action.”
Canada is not alone. In the United States, the Environmental Protection Agency is investigating one perfluorinated carboxylate breakdown product and manufacturing aid, perfluorooctanoic acid (PFOA). PFOA is pervasive in human blood, and there is laboratory evidence of developmental and maternal toxicities in mice at higher levels. In 2000, 3M Corp. voluntarily stopped making Scotchgard, its stain repellent, because a breakdown product, perfluorooctane sulfonate, was ubiquitous and accumulating in animals.
Mabury has developed a theory to explain both the diffusion and transport of volatile fluorotelomer alcohols: the chemicals used to make fluorosurfactants that sometimes serve as stain protectors themselves. The alcohols, he says, can be released into the air during surfactant manufacturing or the application of stain protectors. Domestic releases also occur. Once they escape, they get blown aloft and dispersed before breaking down to the indestructible perfluorocarboxylic acids found in arctic animals.
Mabury has identified two sources for telomer alcohols in the home. The industrial application process can leave a residue of telomer alcohols that is not bound to the polymer. This residue, says Mabury, is likely to move out of the product and into the air, although the timing and rate of volatilization is not clear. When telomer alcohols are the fluorosurfactant, they can be released if the bond between the surfactant and the polymer breaks through use or abrasion.
A growing number of scientists accept Mabury's theory. “Mabury's group has described a compelling pathway that potentially explains the presence of long-chain carboxylates in remote environments,” says chemist Jennifer Field of Oregon State University in Corvallis. Field recently detected perfluorinated breakdown products in domestic waste water, strengthening the argument for home products as a source.
DuPont chemist Robert Buck also thinks the theory offers a viable explanation for how the carboxylates are transported such long distances. But he says it doesn't preclude other sources. Perfluorocarboxylates have been used in a variety of industrial applications, he notes: “This puzzle still has a lot of missing pieces.”
Although Mabury agrees that more research is needed, he doubts that other sources are large enough to account for his group's Arctic observations. “Perfluorocarboxylates are not volatile, so they can't travel,” he says. “And it seems unlikely that they would be used in the remote regions of the Arctic.”
Mabury is no foe of stain protectors, and he opposes the blanket ban that some environmentalists are demanding. “Perfluorinated stain protectors are amazing materials. It would be a waste to abandon them,” he says.
Instead, he and others would like to see companies find ways to reduce their products' impact on the environment. 3M is now selling a reformulated Scotchgard with a shorter carbon-fluorine chain length that doesn't accumulate in animals, for example. But if companies don't act quickly, he warns, government regulators could demand substitutes whose impact on the environment is unknown—and potentially worse than the current crop of fluorinated stain protectors.