News this Week

Science  15 Apr 2005:
Vol. 308, Issue 5720, pp. 334

    Restiveness Grows at NIH Over Bush Research Restrictions

    Constance Holden, with reporting by Jocelyn Kaiser.

    Dissatisfaction within the National Institutes of Health (NIH) is growing over the Bush Administration's restrictions on funding for work with human embryonic stem (ES) cells. Meanwhile, measures to loosen restrictions may finally make it to the floor this year in Congress.

    At a hearing last week by a Senate appropriations subcommittee chaired by Arlen Specter (R-PA), NIH Director Elias Zerhouni seemed to defend the policy only reluctantly, citing “mounting evidence” that as the 22 approved cell lines age, an increasing number of problems are arising because of genetic instability. “Clearly, from a scientific standpoint, more might be helpful,” said Zerhouni, who pointed out that the Bush policy forbidding the use of cell lines derived after 9 August 2001 is based on moral and ethical concerns. Asked by Specter “where is the moral issue” for embryos that are slated for disposal anyway, Zerhouni responded, “I think you'll have to ask that from those who hold that view.”

    Specter also released letters from several institute directors chafing at restrictions and warning that NIH could be falling behind in the field. Specter got some unvarnished sentiment by telling the directors to answer a set of questions he posed “without editing, revision, or comment by the Department of Health and Human Services.” The following are some excerpts:

    • Elizabeth G. Nabel, director of the National Heart, Lung, and Blood Institute: “NIH has ceded leadership in this field to the new California agency. … Because U.S. researchers who depend on Federal funds lack access to newer hESC lines, they are at a technological disadvantage. … The restricted access will hamper NIH's ability to recruit … young scientists.”

    • James Battey, director of the National Institute on Deafness and Other Communication Disorders (and until last month chair of the NIH Stem Cell Task Force): “The science is evolving very rapidly, and limitations of the President's policy [have] become more apparent since I last testified. … It is likely that there will be a movement of some of the best stem cell biologists to California.”

    • Duane Alexander, director of the National Institute of Child Health and Human Development: “NICHD scientists report some problems in obtaining … cell lines, [including] inadequate quantity and quality, … high prices, ‘cumbersome’ procedures, and long waiting times.”

    Reluctant defender?

    NIH's Elias Zerhouni.


    Battey—who said last week that the new conflict-of-interest rules that forbid many NIH managers and their families from owning stock in biomedical companies are compelling him to leave NIH—agrees that frustration over stem cell research constraints has been growing steadily at the agency. “I think many of our finest scientists are troubled by the policy,” he told Science. He points out that newer cell lines are being grown free of contamination from animal products, and that one of scientists' goals—creating ES cell lines that can be used as models to study diseases—is being fulfilled at the Reproductive Genetics Institute in Chicago, Illinois. That fertility clinic claims it has created 50 cell lines representing six genetic diseases, including muscular dystrophy, from fertilized eggs that otherwise would have been discarded. But none of them can be touched by a U.S. government-funded researcher.

    Battey is most worried about the effect of the federal restrictions on young scientists. “Young people are now electing to stay away” from research with human ES cells, he says. Mahendra Rao, who does stem cell research at the National Institute on Aging, says he's experiencing that firsthand: “I have four postdoc positions vacant in my lab.” He says he knows of at least three colleagues—not counting Battey and Arlene Chiu (who just accepted a job in California; see p. 351)—who have interviewed for jobs in California.

    The White House continues to stand firm against any revision in the policy, but pressure continues to grow in Congress. Last month, the moderate Republican sponsor of a bill to expand stem cell availability, Michael Castle (DE), got House Speaker Dennis Hastert (R-IL) to agree to schedule a vote on it this year.


    IOM Panel Clears HIV Prevention Study

    Jennifer Couzin

    An Institute of Medicine (IOM) panel has found no major improprieties in the conduct of a key HIV trial carried out in Uganda in the late 1990s to prevent mother-to-child transmission, essentially validating the use of a cheap, effective, and simple anti-HIV drug: nevirapine. The report also helps clear the names of Johns Hopkins University pathologist Brooks Jackson and more than a dozen colleagues.

    In two papers published in The Lancet in 1999 and 2003, National Institutes of Health (NIH)-funded researchers reported that giving a pregnant woman a single dose of nevirapine, and her infant a single dose immediately after birth, dramatically cut mother-to-child transmission rates. Since then, nevirapine has become the cornerstone of HIV prevention efforts in infants across Africa and beyond. But last year the work came under fire from an NIH staffer, Jonathan Fishbein, who charged that the investigators failed to adhere to regulatory standards governing data collection and record keeping (Science, 24 December 2004, p. 2168). He argued in an interview that “you cannot use this trial as part of the knowledge about how that drug works.”

    The nine-member IOM committee agreed that the study wasn't foolproof. But “we feel firmly that the findings and the conclusion … are valid,” said committee member Mark Kline, a pediatric infectious disease specialist at Baylor College of Medicine in Houston, Texas. The committee had primary medical records sent from Uganda and focused on a sampling of 49 infants in the study. About 10% of adverse events went unreported in that sample, they noted.

    Fishbein immediately blasted the IOM report as “an apologist's statement” that supported NIH's point of view. At a tense press conference, he and his brother, Rand Fishbein, a defense and foreign policy consultant, asked how the IOM committee could be unbiased, given that six of its members receive NIH grants.

    IOM president Harvey Fineberg called that accusation “preposterous,” adding that “there is nothing financially at stake for the individuals on this committee.”

    Some in the AIDS prevention field, who have worried that African governments would abandon nevirapine, are hoping that the IOM report will end the controversy. The Ugandan trial “was a critical pilot study” of nevirapine that has been confirmed by at least a half-dozen others, says Arthur Ammann, a pediatric immunologist and president of Global Strategies for HIV Prevention in San Francisco.


    Academy Gets the Word Out After Tussle With Agency

    Eli Kintisch

    The National Academies (NAS) released a report last week that says dry storage of aging spent nuclear fuel offers “inherent security advantages” over submerging the rods in pools at reactor sites. The fact that a sponsor, the Nuclear Regulatory Commission (NRC), disagrees with that message is not unusual. What makes this report stand out is that the two sides spent 8 months negotiating a public version, and that the NRC preempted the academy by going public with a point-by-point rebuttal of the document while it was still under wraps. The episode is the latest illustration of ongoing problems between NAS and the government over handling of sensitive but unclassified data (Science, 22 November 2002, p. 1548).

    With no active repository for radioactive materials, some 54,000 tons of nuclear fuel have accumulated at U.S. reactors since the 1970s. Most of the fuel sits in pools, raising the concern that a terrorist attack could drain the water from the pools, causing the fuel to ignite and emit radioactive material over a large area. Congress called for the study after a 2003 paper said pools posed a safety threat “worse than … Chernobyl,” a conclusion the NRC said lacked a “sound technical basis.”

    Last July the academy panel sent Congress a classified version of its report that raised concerns about the pools and urged NRC to take a fresh look at the problem. Separate dry casks, it said, are more robust than pools and would allow plants to disperse the older fuel. It also suggested redistributing hot fuels and installing water-cooling systems to cope with leaks. Daniel Dorman, NRC deputy director for nuclear security, says that pools and dry storage “both provide adequate protection” and that new steps to protect spent fuel are under way. At the same time, he says NRC agrees with the report's call for more outside review of the issue and its assertion that any theft of rods to make dirty bombs is unlikely.

    Hot rods.

    Academy report points to security flaws in keeping spent nuclear fuel in pools long after it has cooled.


    The academy panel then turned to producing a public version. Getting the word out, however, proved arduous. In December, NRC rejected a draft version even though NAS had left out data on how fuel rods could overheat and catch fire, on potential radiation releases, and on specific attack scenarios. That material had been withheld as a precaution, according to panel members, but NRC told the academy that the draft was still “permeated with sensitive information” and requested an entirely new version. “That's not the way we operate,” says committee director Kevin Crowley, who asked NRC for specific security concerns.

    In March, before the parties could agree on a public version, NRC released a point-by-point response to much of the classified report. The academy, officials wrote, was asking for “more than what was needed.” Last week NRC officials admitted that the document overstated a finding of the academy report by claiming that the committee had called for “earlier movement of spent fuel from pools into dry storage” when it had not.

    After the dustup hit the papers, legislators demanded a public version. Last week it appeared, in a version that panelists and academy officials say is substantially unchanged from the November draft. This week, NRC said the public report “alleviated [its] concerns about sensitive information.”

    “The academy clearly doesn't want to provide information that could be damaging to the country,” says NAS Executive Officer E. William Colglazier. But without clearer rules governing what should be secret, he adds, “I wouldn't say we're not going to have this problem again.”


    Gasping for Air in Permian Hard Times

    Richard A. Kerr

    Life late in the Permian period certainly sounds unhealthy. More than 250 million years ago, the world was overheating under a growing greenhouse. The great Siberian Traps eruptions were spewing acid haze. Within the seas, noxious gases were building as oxygen dwindled. And something was about to trigger the worst mass extinction in the history of life. How could conditions have been any worse?

    On page 398, researchers count some of the ways, from the standpoint of evolutionary biology. On land, atmospheric oxygen was sliding from a heady 30% toward a lung-sapping 15% or below. Low atmospheric oxygen would have squeezed land animals into smaller, more fragmented areas at low altitudes, inducing extinctions while driving down diversity. The hypothesis “adds another dimension” to the role of oxygen in evolution, says biologist Robert Dudley of the University of California, Berkeley. It also complicates the question of how the end-Permian extinction took place.

    Geochemists can gauge past oxygen levels by taking account of organic matter and reduced sulfur compounds stored in sediments—in effect, the byproducts of oxygen generation. In 2002, geochemist Robert Berner of Yale University calculated that during the past 600 million years, atmospheric concentrations of oxygen were stable near present-day levels of 20% until about 400 million years ago, rose sharply to a peak above 30% about 300 million years ago, and then dove to 12% by 240 million years ago.

    Too high.

    If the low atmospheric oxygen levels of the late Permian period prevailed today, few vertebrate animals could live much above an altitude of 500 meters (red).


    Paleontologist Gregory Retallack of the University of Oregon, Eugene, and colleagues suggested in 2003 that such a precipitous decline could have determined winners and losers at the end of the Permian. One of the few large animals to survive the end-Permian extinction, a dog-sized burrowing creature called Lystrosaurus, appears to have been well-adapted already for breathing oxygen-poor air, says Retallack. Like humans adapted to living at high altitudes, Lystrosaurus had developed a barrel chest for deeper breathing, among other adaptations, apparently in order to “breathe its own breath” more easily in its underground lairs (Science, 29 August 2003, p. 1168).

    For less well prepared animals, losing more than half of the normal oxygen supply would have had far-reaching effects, say evolutionary physiologist Raymond Huey and paleontologist Peter Ward of the University of Washington, Seattle. Every animal has its own minimum oxygen requirement, they note; that is why each species has a particular altitude above which it cannot survive. For example, humans live and reproduce no higher than 5.1 kilometers, in the Peruvian Andes. So, “if oxygen is 12%, sea level would be like living at 5.3 kilometers,” says Huey.

    With oxygen at the mid-Permian's peak of 30%, animals probably could have breathed easily at any altitude on Earth, says Huey. But as oxygen levels dropped, animals capable of living at 6.0 kilometers in the mid-Permian would have been driven down to 300 meters. Perhaps half of the Permian land area would have been denied to animals. Species specialized to live in upland habitats would have perished, assuming they couldn't adapt their relatively unsophisticated breathing systems. Survivors would have been squeezed down into smaller, more isolated areas, where overcrowding and habitat fragmentation would have driven up extinctions and diminished the number of species the land could support. “We can explain some big part of land extinction with this,” says Ward.
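    The equivalent-altitude comparison Huey draws can be sketched with the simple isothermal barometric formula, which ties oxygen partial pressure to height. This is only an illustration: the scale height below is an assumed round value, and the 5.3-kilometer figure quoted above comes from a more detailed physiological calculation, so this crude estimate lands somewhat lower.

```python
import math

O2_TODAY = 0.209   # present-day atmospheric O2 fraction
H = 8400.0         # assumed isothermal scale height of the atmosphere, in meters

def equivalent_altitude(o2_fraction):
    """Altitude (m) at which today's atmosphere supplies the same O2
    partial pressure as sea-level air with the given O2 fraction."""
    # pO2 at height h is O2_TODAY * P0 * exp(-h / H).
    # Set that equal to o2_fraction * P0 at sea level and solve for h.
    return H * math.log(O2_TODAY / o2_fraction)

# Late-Permian air at 12% O2: sea level starts to feel like high altitude.
print(round(equivalent_altitude(0.12)))  # roughly 4.7 km in this crude model
```

    The same function shows the squeeze on upland species: as the sea-level O2 fraction falls, the habitable ceiling drops with it, which is the compression into lowlands that Huey and Ward describe.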

    Extinction by crowding into lowlands “is a very interesting idea,” says Dudley, but “it's pretty hypothetical. None of the assumptions is yet testable.” Further studies of breathing physiology and geographical patterns in the fossil record should help size up just how bad life might have been.


    A Second Entry in the Mars Sweepstakes

    Mason Inman

    LONDON—More than 100 European scientists met last week in Birmingham, U.K., to define Aurora, a solar system exploration venture that faces a critical decision this year. The workshop on 6 to 7 April began with one certainty: Europe wants its own Mars program. The scientists endorsed a one-way robotic trip to Mars in 2011 and hashed out the types of instruments they want onboard to search for signs of life and study geology. They also backed a follow-on sample-return mission. But big issues remain to be settled, such as whether governments will pay, and how they will coordinate the work with an ambitious U.S. Mars program.

    Aurora's head at the European Space Agency (ESA), Bruno Gardini, said at a press conference here that he was pleased with the workshop's outcome. “It has given us a very focused target,” he said. Doug McCuistion, director of NASA's Mars Exploration Program, an observer at the workshop, agrees: “It's very important that they were able to narrow their options so they can go forward.”

    Three proposals were on the table at the outset. The scientists recommended plucking out elements of each and rolling them into one mission, as yet unnamed. One piece of heavy equipment made it onto their consensus wish list: a drill to take samples at a depth of up to 2 meters below Mars's oxidized surface. NASA does not have a drill on its agenda, McCuistion says. The scientists also recommended including a rover with sensors to look at ratios of isotopes for traces of past or present life, modeled after those on Beagle 2, the United Kingdom's ill-fated robot that went missing in December 2003 during its descent to Mars. The scientists also want to include a seismograph to detect possible “marsquakes” that could show that the planet is geologically active.

    Long haul.

    Europe's vision for space exploration calls for sending first robots, then humans, to Mars.


    Before the plans get much more specific, ESA needs some of its member countries to pony up for the mission, which carries a price tag of €500 million ($650 million). ESA members make voluntary contributions to Aurora, described at its launch in 2001 as a search for signs of life beyond Earth and a start to crewed exploration of the solar system. By June, Aurora's staff will put together a more detailed plan for a complete funding review, in which countries will choose whether to pledge support to carry the 2011 project through to completion. The total budget is “a very challenging target,” Gardini said. “We are trying very hard to get support from NASA to reduce the cost and risk of the mission.” Canada, Japan, and Russia might also take part in the mission, he added.

    European researchers see the 2011 mission as preparation for a much more ambitious round trip to return samples of Mars rock, soil, and atmosphere. Space scientist John Zarnecki of The Open University in the United Kingdom, a participant in the workshop, said the group recommended working toward such a mission in 2016, which would fit with NASA's own timetable. “I think everyone hopes and expects that this is going to be a big international push with ESA, NASA, and possibly other agencies,” Zarnecki says.

    This work is designed to prepare for possible international crewed missions to Mars, which ESA hopes will begin around 2030. Gardini said the sample-return mission would be valuable practice in making the round trip. Aurora faces a big test in December, when ESA's governing council will vote on funding.

  JAPAN

    Space Vision Backs Peer-Reviewed Science

    Dennis Normile

    TOKYO—Space scientists here are reacting favorably to a new strategic plan from the Japan Aerospace Exploration Agency (JAXA) that endorses a bottom-up approach to scientific exploration (Science, 1 April, p. 33). Many had feared the worst when Japan's space science agency was merged with two commercially oriented government organizations in 2003 to form JAXA. But “a number of former NASDA [National Space Development Agency] people are now listening to what space scientists say and realizing that there is a different approach to [scientific] missions,” says Kazuo Makishima, an astrophysicist at the University of Tokyo.

    The Long Term Vision report, issued last week, looks ahead for 20 years. It calls for strengthening efforts in basic space science, with the missions to be determined using the same grassroots approach to proposals adopted by the Institute for Space and Astronomical Science (ISAS), now a component of JAXA. The report discusses the possibility of crewed missions and a lunar base, but only after an additional 10 years of research and study. The plan also cites the need for satellites that could monitor natural disasters, facilitate rescue efforts, and provide a closer look at ongoing environmental problems, as well as for better launch technologies, a private-sector space industry, and supersonic aircraft. “For space science, we have to work with the scientific community, including university-based scientists,” says Keiji Tachikawa, JAXA president.


    JAXA's Keiji Tachikawa wants scientists to help chart a course for space exploration.


    Tachikawa says that JAXA hopes to use the report to develop more detailed operational plans, to motivate employees, and to build public support for space exploration. The report recommends postponing any decision to pursue crewed flight until halfway through the 20-year cycle. “We believe that what can be accomplished with robotics is not sufficient to realize the potential of using space,” says Tachikawa, noting Japan's participation in the international space station. The delay also puts off the need for an immediate ramp-up in funding, however, with the report calling instead for a modest rise in annual spending from the current $2 billion to $2.6 billion over the next decade. Crewed activities will require more money, Tachikawa says, and “a good proposal [that would] gain the consent of Japanese citizens.”

    Scientists say they would welcome any new efforts by JAXA to build public support for space activities. “We've not been good at advertising the activities and accomplishments of Japan's space science,” says Kozo Fujii, an astronautical engineer who headed a delegation to the vision committee from ISAS. A larger JAXA budget built upon growing public support for space, he predicts, would also be a boon for science.


    Magnetic Scope Angles for Axions

    Charles Seife

    After 2 years of staring at the sun, an unconventional “telescope” made from a leftover magnet has returned its first results. Although it hasn't yet found the quarry it was designed to spot—a particle that might or might not exist—physicists say the CERN Axion Solar Telescope (CAST) is beginning to glimpse uncharted territory. “This is a beautiful experiment,” says Karl van Bibber, a physicist at Lawrence Livermore National Laboratory in California. “It is a very exciting result.”

    CAST is essentially a decommissioned, 10-meter-long magnet that had been used to design the Large Hadron Collider, the big atom smasher due to come on line in 2007 at CERN, the European high-energy physics lab near Geneva. When CERN scientists turn on the magnet, it creates a whopping 9-tesla magnetic field—about five times higher than the field in a typical magnetic resonance imaging machine. From a particle physicist's point of view, magnetic fields are carried by undetectable “virtual” photons flitting from particle to particle. The flurry of virtual photons seething around CAST should act as a trap for particles known as axions.

    Axions, which were hypothesized in the 1970s to plug a gap in the Standard Model of particle physics, are possible candidates for the exotic dark matter that makes up most of the mass in the cosmos. Decades of experiments have failed to detect axions from the depths of space, and many physicists doubt the particles exist (Science, 11 April 1997, p. 200). If axions do exist, however, oodles of them must be born every second in the core of the sun and fly away in every direction.


    CAST “telescope” hopes to detect hypothesized particles from the sun by counting the x-rays they should produce on passing through an intense magnetic field.


    That's where CAST comes in. “When an axion comes into your magnet, it couples with a virtual photon, which is then transformed into a real photon” if the axion has the correct mass and interaction properties, says Konstantin Zioutas, a spokesperson for the project. “The magnetic field works as a catalyst, and a real photon comes out in the same direction and with the same energy of the incoming axion.” An x-ray detector at the bottom of the telescope is poised to count those photons.

    The first half-year's worth of data, reported in the 1 April Physical Review Letters, showed no signs of axions. But CAST scientists say the experiment is narrowing the possible properties of the particle in a way that only astronomical observations could do before. “It's comparable to the best limits inferred from the stellar evolution of red giants,” van Bibber says, and he notes that plans to improve the sensitivity of the telescope will push the limits further. Even an improved CAST would be lucky to spot axions, van Bibber acknowledges, because most of the theoretically possible combinations of the particle's properties would slip through the telescope's magnetic net. Still, he's hoping for the best. “Maybe Nature will deal a pleasant surprise,” he says.


    Private Partnership to Trace Human History

    Elizabeth Pennisi

    In an unusual twist for population genetics research, two giants, one in publishing and the other in computing, have teamed up to trace human history. As Science went to press, the National Geographic Society in Washington, D.C., and IBM Corp. in White Plains, New York, were scheduled to announce the 5-year Genographic Project, which will collect 100,000 human DNA samples and from them determine patterns of human migration. In addition, the partnership will sell $99 DNA kits to people who want details about their past or want to contribute their genetic samples. Researchers are eager to see such an extensive survey done, but several wonder whether its organizers can avoid the problems that overwhelmed a similar survey of the world's populations.

    National Geographic's Spencer Wells, a population geneticist and popularizer of human history studies, will lead the new study, coordinating 10 research groups across the world. The teams will collect DNA samples locally, focusing on about 1000 indigenous populations. Alan Cooper of the University of Adelaide, Australia, also plans to gather DNA from preserved human remains found throughout the world.

    Led by IBM's Ajay Royyuru, the company's Computational Biology Center will store the data and analyze them for trends indicative of how people moved from region to region. Wells says the project's information will be placed in a publicly available database after the project has published its analyses. He estimates that the effort, which is funded by National Geographic, IBM, and the Watt Family Foundation, will cost about $40 million.

    DNA prospector.

    Spencer Wells's (center) new human diversity project needs indigenous people to donate DNA.


    The new project is reminiscent of the Human Genome Diversity Project (HGDP), which has limped along for more than a decade because of technical and political challenges. “It could generate all the ethical issues that stopped the HGDP,” says Ken Weiss, a geneticist at Pennsylvania State University, University Park. When it was conceived in the early 1990s, HGDP was to be a global effort with regional components, involving DNA samples from 500 populations. But neither the U.S. National Institutes of Health nor the U.S. National Science Foundation has been willing to foot the $30 million bill for HGDP, in large part because of outcries by indigenous groups worried about the commercial exploitation of their tissue and DNA. “It was certainly much harder [to do] than we expected,” says Hank Greely, a Stanford University law professor who helped promote HGDP.

    The Genographic Project may sidestep some of the ethical landmines with a pledge not to use its data for biomedical research. It will stockpile only DNA, whereas HGDP also maintains cell lines, which make the collection more valuable for biomedical research, says Stanford's L. Luca Cavalli-Sforza, who came up with the HGDP concept.

    To date, however, HGDP has amassed only about 1000 cell lines. Almost all had been previously collected by researchers for independent projects. “The Genographic Project [would be] 100 times more powerful than the present HGDP collections, and this makes it extremely interesting,” says Cavalli-Sforza, who was Wells's mentor and is on the Genographic advisory board.

    The new project, notes Weiss, “is privately funded, which probably removes some constraints, controls, and monitoring.” Wells argues that his teams and local residents will work out a satisfactory protocol for DNA collection and analysis. HGDP proposed to do the same, Weiss points out, but never won the confidence of indigenous groups. Wells and his colleagues are hopeful that that bit of human history won't repeat itself.


    EPA Kills Florida Pesticide Study

    Jocelyn Kaiser

    The nominee to head the Environmental Protection Agency (EPA) last week cancelled a controversial study to measure the exposure of children to pesticides 1 day after two legislators threatened to derail his appointment if the study went ahead. Some scientists who see value in the study are worried that acting EPA Administrator Stephen L. Johnson has placed politics above science in making his decision. EPA says that the controversy scared away potential participants and made the research impossible to carry out.

    The Children's Health Environmental Exposure Research Study (CHEERS) aimed to follow 60 children in Duval County, Florida, living in homes where pesticides were already being used. Last fall, environmental and patient activist groups complained that the study, although approved by several ethics boards, was fatally flawed by an offer of $970 and a camcorder to families as an inducement to participate (Science, 5 November 2004, p. 961). Critics also objected to a $2 million contribution from the American Chemistry Council, an industry lobby, that would have allowed EPA to measure exposures to other chemicals, such as flame retardants. Responding to the outcry, EPA suspended the 2-year, $9 million study pending review by a special advisory panel.

    That review was to begin in May. But on 8 April, 1 day after senators Barbara Boxer (D-CA) and Bill Nelson (D-FL) blasted the study at Johnson's nomination hearing and put a hold on any vote to confirm him, Johnson declared that the study “cannot go forward” because of “gross misrepresentation” of it. EPA spokesperson Richard Hood says agency scientists told Johnson—a 24-year EPA employee who would be the agency's first leader with a scientific background—that they didn't think they could enroll enough families because of the “furor” over the study in Duval County. Hood says the researchers feel “badly burned” by the uproar.

    Some outside researchers are disappointed, too. A properly designed study would have filled critical gaps in understanding whether children absorb pesticides mainly through the skin, by inhalation, or by ingestion, says environmental health researcher Timothy Buckley of the Johns Hopkins University Bloomberg School of Public Health in Baltimore, Maryland. “We need that kind of data to protect kids,” Buckley says. Kristin Shrader-Frechette, an ethicist at the University of Notre Dame in Indiana who was to chair EPA's special review panel, says she personally thinks the study contained “fixable” scientific and ethical flaws. Instead of being improved, however, the study became what she calls a “political football.”


    Model Shows Islands Muted Tsunami After Latest Indonesian Quake

    Richard A. Kerr

    In the first days after a magnitude 8.7 earthquake leveled buildings on the Sumatran islands of Nias and Simeulue on 28 March, experts wondered why it failed to generate a significant tsunami. After all, the monster quake that struck just to the north in December spawned a tsunami that killed more than a quarter-million people. Now that they've had a chance to locate the fault rupture more precisely and to run some simulations, they believe that the islands that bore the brunt of the March quake largely stifled its tsunami.

    Quakes generate tsunamis by moving the sea floor, along with a lot of overlying water. The March quake was not only about a third as large as its December predecessor, but it apparently had another disadvantage: It didn't reach as far, vertically. The December quake appears to have ruptured the fault—the inclined, deep-diving boundary between two tectonic plates—from tens of kilometers deep all the way up to the sea floor in the deep-sea trench off northern Sumatra, says seismologist Seth Stein of Northwestern University in Evanston, Illinois. The vertical displacement of the sea floor along the rupture would have transferred more of the quake's energy into heaving up the tsunami, he says. In contrast, the rupture caused by the March quake didn't breach the sea floor, which means that it would have transferred less energy to the water column.

    With and without.

    Simulations driven by the March quake off Sumatra failed to generate a far-ranging tsunami (green) until the islands overlying the quake (included at left) were removed (right).


    Further weakening any tsunami, the March quake occurred under relatively shallow water. (The deeper the water over a quake, the more water it will displace and the larger the tsunami will be.) The movements of the overlying islands, in fact, displaced no water at all—and that turned out to be a critical factor.

    When hydrodynamicists Costas Synolakis of the University of Southern California in Los Angeles and Diego Arcas of the National Oceanic and Atmospheric Administration's Pacific Marine Environmental Laboratory in Seattle, Washington, simulated the March quake's tsunami with the islands removed from their model, the resulting tsunami was much larger. Significant waves reached the distant islands of the Maldives south of India in the islandless simulation. “Had the two islands not been there” off Sumatra, says Synolakis, “we would have had another damaging transoceanic tsunami, although smaller in impact than the December one.”

    Such vagaries of tsunami generation are reinforcing the tsunami community's conviction that it won't be able to predict tsunamis reliably anytime soon from seismic observations alone; only a dense network of tsunami detectors on the ocean floor will do.


    Veterinary Scientists Shore Up Defenses Against Bird Flu

    1. Martin Enserink

    PARIS—It took just a few years for avian influenza to move from a veterinary backwater into the global spotlight. Now researchers are trying hard to catch up. At a meeting here last week, more than 200 bird flu scientists called for more aggressive research and control efforts—from improved surveillance to finding more humane ways of killing birds. They also launched a new international lab network to coordinate research and share virus strains.

    Asia's lethal H5N1 is grabbing most of the headlines, but it's not the only strain of so-called highly pathogenic avian influenza (HPAI) on the march worldwide. There were 15 known outbreaks of HPAI between 2000 and 2004, which killed or led to the culling of some 200 million birds, Ilaria Capua of the Istituto Zooprofilattico Sperimentale delle Venezie in Italy said at the meeting. In the 40 years before, she said, there were just 18 outbreaks, affecting 23 million birds: “We've gone from a few snowflakes to an avalanche.” Several strains other than H5N1—including H9N2 in China and Hong Kong, H7N2 in the United States, H7N3 in Canada, and H7N7 in the Netherlands—have also caused human infection, disease, or even death.

    Researchers aren't exactly sure what triggered the change or how big a threat it poses to humans. That's why meeting participants called for more funding to study the panoply of strains. Most also welcomed a new network, proposed by the meeting organizers, the World Organization for Animal Health (OIE) and the Food and Agriculture Organization of the United Nations. Mirroring a similar network for human influenza, the new structure, dubbed OFFLU, would pool veterinary expertise, stimulate closer collaboration with human flu researchers, and facilitate the exchange of samples—often a thorny issue because of intellectual property concerns, says Capua, whose lab will host the network's secretariat for the first 3 years. “It's a great idea, if they can get it to work,” says Nancy Cox, chief flu scientist at the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia.

    Other changes are afoot to limit the spread of avian influenza as well. OIE has proposed that countries be required to search for and report outbreaks of low-pathogenicity avian influenza (LPAI) as well as its higher-pathogenicity kin. LPAI outbreaks cause little mortality and are easy to miss, says OIE's Alejandro Thiermann—but they can evolve to become HPAI.

    At the same time, OIE plans to introduce a new strategy called “compartmentalization” that could help protect international trade during an outbreak. Currently, entire regions or countries are shut off from the international market when they have bird flu. In the future, parts of the poultry industry could keep their disease-free status if they can show that their entire operation—including, for instance, feed supply, farm workers, and vets—operates within a biosafe “compartment” out of reach of the virus. Thiermann hopes the measure, which will be formally discussed by OIE's 167 member countries next month, will spur investment in flu-proof poultry facilities.


    A Framework for Change?

    1. Gretchen Vogel

    European Union officials have proposed a €73 billion, 7-year funding program with money for individual grants and promises of less red tape. But will researchers believe them, and will political leaders foot the bill?

    Promising to mend its bureaucratic ways, the European Commission has unveiled what it hopes will be a new and improved version of its multiyear funding program known as Framework. “This is not just another Framework program,” Research Commissioner Janez Potocnik promised several times as he pitched the proposal to the European Parliament on 6 April and to journalists a day later. “We want to do more.”

    Indeed, the proposal—Framework 7—is twice as big as previous programs, boosting yearly funding from just over €4 billion ($5.2 billion) to more than €10 billion ($13 billion). Scientists greeted the report with cautious enthusiasm, praising the increased budget and the plans to launch the long-desired European Research Council (ERC), a Europe-wide grantmaking body that will fund individual scientists instead of the large and often unwieldy collaborations supported by previous Framework programs.

    The hopes are tempered, however, by two concerns: First, a pending battle over the size of the European Union's whole budget may scupper the grand plans for doubling research spending. Second, scientists have heard promises to simplify Brussels's bureaucracy before, without real results. “The problem with the commission is that they put out quite nice press releases, but they are not always able to follow up,” says Bart De Strooper of the Catholic University of Leuven in Belgium, who led a petition campaign calling for reform of the Framework program. “Especially with Framework 6, it was clear that they wanted to have a less complicated system, but it turned out to be more complicated than ever.” Nevertheless, he says, Brussels is sending the right signals: “They are listening. That's very clear.”

    On the budget front, Potocnik hopes to convince Europe's heads of government and finance ministers that the expanded Framework program is vital to keep Europe competitive in the face of an aging population and limited natural resources. The Lisbon Agenda, a plan laid out in the Portuguese capital in 2000 to boost Europe's economy by 2010, makes research a key driver for growth, calling on member countries to spend 3% of their gross national products on research and development—half from industry and half from government sources. “This is a moment of truth for us,” Potocnik told reporters at a press conference. “Will we be credible in implementing the things that we have agreed to in principle?”

    Big plans.

    Research Commissioner Janez Potocnik seeks a doubling of E.U. research funding.


    Support from the European Parliament seems strong, says Philippe Busquin, the former research commissioner turned parliamentarian. In March, the parliament endorsed the ideas of an ERC and a doubling of research spending. However, Europe's heads of government, who hold the E.U.'s purse strings, are gearing up for a bruising fight over the budget. The six countries that contribute more than they receive from Brussels—Germany, the United Kingdom, France, Sweden, Austria, and the Netherlands—want to cap their contributions to the E.U. at 1% of gross national income per year. The commission is proposing an average of 1.14% per year. If these net payers get their way in negotiations in the coming months, scientists' hopes could evaporate. “It all hinges on whether the money will be there or not,” says Helga Nowotny of the Vienna Science Center, who chairs the European Research Advisory Board, a panel that advises the commission. Without an increase in the overall budget, €10 billion for Framework per year seems unlikely, she says.

    The commission divides its proposal into four areas: Cooperation, Ideas, People, and Capacities. “Cooperation” projects—the networks of excellence and other large-scale projects familiar from previous Framework programs—take up nearly half the budget. The Cooperation money is split between nine themes—including health, energy, environment, transport, and space and security—and will be used to fund large collaborations, often involving dozens of labs. “Ideas” refers to the cutting-edge research the commission hopes the ERC will fund. Money for “People” will finance the Marie Curie program that helps European and international researchers study and work abroad. The program, which even critics say is a real success, is slated to receive just over €1 billion a year—twice as much as it has under Framework 6 (see sidebar, p. 343). “Capacities,” which includes infrastructure projects such as genomics data banks, radiation sources, and observatories as well as funding for science-and-society programs, will also receive about €1 billion per year. The budget also includes €1.8 billion for the E.U.'s Joint Research Center, which does food and chemical safety testing as well as climate and some nuclear research. Because of political sensitivities, funding for nuclear energy research in the EURATOM program is calculated separately. It is slated to receive €3.1 billion through 2011.

    The proposal includes funding for two new areas: socioeconomic research and security and space. Socioeconomics will receive nearly €800 million over 7 years. The €4 billion allotted for security and space will allow for closer cooperation with the European Space Agency, especially on the Galileo project to launch a fleet of global positioning satellites (Science, 25 April 2003, p. 571). It will also fund antiterrorism research, new border security technologies, and emergency-management strategies.

    The €1.7 billion a year planned for the ERC is the result of an unprecedented grassroots movement initiated by scientists just 3 years ago (Science, 3 May 2002, p. 826). Fed up with the large projects that strangled their research in top-down bureaucracy, science leaders began calling for a European body more like the U.S. National Science Foundation or the National Institutes of Health.

    That dream came true more quickly than many of its architects expected. Busquin championed the idea, and the commission endorsed it last June. Although some worried that the Brussels Eurocrats would take a solid-gold idea and transmute it into lead, most soon came to realize that the efficient path went through the commission. “There is no alternative” to having the commission involved, says Nowotny. “The times are gone when there was no E.U., and you could set up [international physics lab] CERN with treaties between governments. If you tried to do that now, it would take 20 years.”

    In recent months the pace has accelerated. In January the commission named a five-member Identification Committee to draw up a list of candidates for the 18-member governing council that will run the ERC. The committee issued a progress report in March outlining the qualities it is looking for in candidates. It promised to present its final list to the commission by June.

    Second only to their hopes for an ERC is a call to cut down on the paperwork required to apply for and administer Framework money (see sidebar, p. 344). “If there is one word that I have heard from every scientist who has entered my office, it is ‘simplify,’” Potocnik says. He promised that the commission will try. Although details won't be clear until later this year, the commission will propose that scientists applying for Framework 7 money go through a two-step process. The first application will require less paperwork and will involve only a concept proposal. Only those whose projects make a first cut will be asked to submit a full proposal.

    In a staff working paper on simplification that, perhaps tellingly, runs nearly 10 pages, the commission also promises to establish an electronic database of applicants that should help speed the application and evaluation process. In his presentation to Parliament, Potocnik urged delegates to ease some of the legal restrictions that bind the commission, leading to complex legal contracts instead of grants. “I hope we will gain the courage to give scientists more trust and autonomy than we have in the past,” he later told journalists. “We want this to happen.”

    The proposal outlined last week is far from the final say on Framework 7. The European Parliament now has a chance to scrutinize the commission's plan. Last time around, they weren't shy about sharing their opinion: The parliament offered hundreds of amendments to the Framework 6 proposal. The competitiveness council, comprising research ministers from all member states, will also offer comments. The commission will then submit a revised proposal, and the feedback loop will continue until the council of ministers adopts a final proposal.

    That process could take up to a year and could face unexpected hurdles. In the months before Framework 6 was adopted, a coalition of countries threatened to block the entire program over funding for human embryonic stem cell research, which is restricted or even forbidden in some E.U. countries. Another fight over that issue is unlikely, because in the expanded E.U. the opponents no longer have enough votes to block the program. But some other burning issue of the day could flare to the surface.

    As the process goes forward, scientists are likely to make some noise, says geneticist Kai Simons, a director of the Max Planck Institute for Molecular Cell Biology and Genetics in Dresden, Germany, and a longtime proponent of the ERC. “The change in atmosphere in the last 2 or 3 years is just incredible,” he says. The commission has become more open, Simons says, but even more important is the fact that scientists have begun to make themselves heard in Brussels and are seeing real results: “Everyone realizes there are going to be real benefits from this. For the first time, they see a hope.”


    E.U. Wins Over Researchers With a Ticket to Ride

    1. Eliot Marshall

    Although many European Union research programs are considered unwieldy and overbureaucratic, there is one that is universally popular, says longtime observer John Findlay of the University of Leeds, U.K. It works by encouraging young researchers to jump ship. This unusual effort, called the Marie Curie Fellowships program, gives researchers a taste of independence by funding them to leave home and move to a new place—all in the name of professional growth. Recognizing its appeal, E.U. Research Commissioner Janez Potocnik last week said the commission wants to double its funding—to €1 billion per year through 2013.

    This “mobility initiative” got its more user-friendly name in 1996. The Marie Curie fellowships are now an important route for talented researchers to sidestep local constraints, or “inbreeding,” as many say. Awarded by peer review, the fellowships go mainly to young researchers. They provide a few years' pay and are very attractive, says Jean-Patrick Connerade, president of the nonprofit advocacy group Euroscience, “because the research is untargeted” and relatively free of E.U. guidelines. According to E.U. data, applications nearly doubled between 2003 and 2004, to 4364, although only a small fraction will be funded.

    The E.U.'s plan this year calls for “more of the same,” according to Conor O'Carroll, Ireland's delegate to a Marie Curie oversight panel. The new Framework 7 plan is likely to expand support for non-E.U. researchers in Europe and allow more fellows to take their money outside Europe. To discourage fellows from disappearing, grantees from Europe must return home after several years or repay the total; they can get limited research support once they're back. The willingness to support outsiders reflects the E.U.'s growing confidence, says O'Carroll: It's a recognition that “science is international. … You have to be open to outsiders.”

    On the road.

    E.U. funds have paid for thousands of migrating fellows.


    More controversial, says O'Carroll, is a proposal to have national institutions in Europe chip in 25% “cofunding” of Marie Curie schemes and other programs—“which is not universally accepted at all.” This would expand the budget but also raise the pressure on national agencies to match the E.U.'s high standards for working conditions. (E.U. contracts promise health care, maternity leave, and other benefits.)

    The E.U. laid the foundation for extending such benefits on 11 March in a new “European Charter for Researchers” and a “Code of Conduct for the Recruitment of Researchers.” The code stipulates that all researchers, including Ph.D. candidates, should be recognized “as professionals,” that “teaching responsibilities should not be excessive,” and that researchers should have “equitable social security provisions.” These are only recommendations, but they're part of the E.U.'s plan to build Europe into a formidable “knowledge economy.”

    Young researchers are enthusiastic about this dream—as well as the fellowships—but they are not blind to flaws. Toni Gabaldon of the Centre for Molecular and Biomolecular Informatics in Nijmegen, the Netherlands, and vice president of an advocacy group called EURODOC, is cheered by the doubling of the Marie Curie budget. But he warns that as the application success rate falls to a “very low” level of 10%, a lot of people are wasting their time. Mathematician Dagmar Meyer, who heads the Marie Curie Fellowship Association, a society of ex-fellows, says that grant processing has completely bogged down; months-long delays have cropped up. E.U. official Sieglinde Gruber acknowledges the problem but says it will be fixed soon.

    Although technical glitches can be fixed, one thing is not likely to change quickly, Findlay says: the economic current that sweeps talented scientists from southern and eastern Europe to Scandinavia, Britain, and the United States. It's still the case, notes O'Carroll, that four out of five researchers who cross the Atlantic and stay in America for a few years never return. He's hoping the latest E.U. inducements will lower the ratio.


    The Dos and Don'ts of Getting an E.U. Grant

    1. Martin Enserink

    BRUSSELS—So you're an upwardly mobile European researcher and some European Union (E.U.) money might be just what your lab needs. But where to start? How do you get a piece of the €73 billion Framework 7 action?

    The good news is that a whole industry of trainers, consultants, liaison officers, and research managers has sprung up across the continent to lend a hand. On the downside, they all caution that, although the rewards are enticing, learning to play the Brussels game is frustrating and requires a major, long-term commitment: It's less about filling out forms than a whole new career choice. And there are plenty of pitfalls.

    For starters, your question is the wrong one, says Sean McCarthy, president of Hyperion, an Irish company specializing in Framework training. As he wrote in a paper posted on his Web site, “you don't go to Brussels looking for money for your research. You go there to help the European Commission solve a problem that they have identified.” It's not enough, say, to show that you're an expert in detecting low-level toxic compounds in water, McCarthy explains; you have to know the politics, economics, and business of water quality and show how your research will result in the prototype of a new sensor that Europe needs to clean up its water.

    Finding good partners is also crucial. The vast majority of the E.U.'s research money is distributed in chunks as large as €12 million or €15 million to consortia of 15 institutes or more. You can try to become the lead partner for such a group, but only if you're a European heavyweight who can persuade colleagues across the continent to join and your institute is prepared to help with the dizzying paperwork, says Willem Wolters, an E.U. funding specialist at Wageningen University in the Netherlands. Smaller players are well-advised to identify the hot shots and see if they can fill a niche in existing schemes, Wolters says.

    Grants guru.

    Don't appear to promote your own ideas, says Lene Topp.


    Once the European Commission issues a call for proposals that interests you, there are usually only 3 months to apply, says Lene Topp of Rambøll Management, a Danish company that organizes training programs; that's why it's important that you organize your consortium months or even years in advance. At the same time, it's crucial that the final proposal exactly fits the call, she says—don't try to slip in unrelated ideas, however brilliant. “Many applications go straight out of the window because they don't fit the criteria,” Topp says.

    The proposals that survive the first screening are then ranked by panels of independent experts, flown to Brussels by the commission. Try to get on one of these panels, Topp says, because it's a good way of getting to know the process and increasing your chances the next time around.

    Applying political muscle to get proposals to the top of the pile is generally not appreciated, but it is very common to lobby earlier on, when the commission and the European Parliament establish research priorities—a phase that is just starting for Framework 7. But beware: “Lobbying is an American concept. In Brussels it is better to describe the process as briefing,” McCarthy says. Whatever it's called, it can be worth it: Plant scientists, realizing 4 years ago that they were about to lose out in Framework 6, teamed up and very successfully briefed their way to a bigger share, Wolters recalls.

    If you're a lead partner and your proposal is among the ones selected, you will be invited to Brussels for contract negotiations, which can last several months. Signing a contract doesn't end your worries, however. You have to make sure that all partners honor their part of the deal. (If not, the commission can and sometimes will ask for its money back.) Also, as a result of the E.U.'s past financial scandals, there's a crushing burden to account for every euro, Topp says. Be prepared to nag your partners about missing train tickets and flight coupons. Indeed, Wolters warns, “if you're a prominent researcher and you take on one of these projects, you can spend so much time on management that by the end you're no longer a prominent researcher.”

    Although Framework 7 promises some relief from the paperwork (see main text), none of the experts expects Brussels to become simple anytime soon. Yet, despite it all, many researchers do like to participate in European programs, says Menno van der Klooster of Utrecht University in the Netherlands, not just because of the money and the ability to attract staff, but also because it offers new perspectives on their research and an international network that can prove invaluable. So good luck.


    Ibogaine Therapy: A 'Vast, Uncontrolled Experiment'

    1. Brian Vastag*
    1. Brian Vastag, a writer in Washington, D.C., is working on a book about ibogaine.

    Despite potentially harsh side effects, an African plant extract is being tested in two public clinical trials—and many clandestine ones

    On a snowy President's Day, an odd group of activists and scientists devoted to treating addiction gathered in an art gallery in the Chelsea warehouse district of New York City. As an all-night, all-day rave throbbed next door, panelists outlined the latest developments in a decades-long movement to mainstream a West African plant alkaloid, ibogaine, that purportedly interrupts addiction and eliminates withdrawal.

    Sustained by true believers who operate largely outside the academic medical world, research on the vision-inducing drug is gaining attention, despite its U.S. status as a banned substance. The Food and Drug Administration (FDA) approved a clinical trial in 1993, but the National Institute on Drug Abuse (NIDA) decided not to fund it after consultants raised questions about safety.

    The plant extract can be neurotoxic at high doses and can slow the heart. Yet a handful of scientists continue to study it for its potential in treating addiction. The enthusiasts who gathered in New York reviewed efforts to tease apart its antiaddictive and hallucinatory components.

    Although a PubMed search for “ibogaine” pulls up some 200 articles on laboratory studies, clinical reports cover just a few dozen patients. That's because patients seek treatment clandestinely. “Whether the FDA likes it or not, the fact of the matter is that … hundreds, probably thousands of people … have been treated with ibogaine,” said Stanley Glick, a physician and pharmacologist at the Albany Medical Center in New York who has documented ibogaine's antiaddictive potential in rodents. At the meeting, Kenneth Alper, a Columbia University assistant professor of psychiatry, estimated that more than 5000 people have taken ibogaine since an organized (but unregulated) clinic opened in Amsterdam in the late 1980s. Boaz Wachtel, an ibogaine advocate in Israel, believes that 30 to 40 clinics operate worldwide. Listed alongside heroin, LSD, and marijuana on the U.S. Drug Enforcement Administration's schedule I of banned substances, ibogaine is nevertheless legal in most of the world.

    “There's basically a vast, uncontrolled experiment going on out there,” said Frank Vocci, director of antiaddiction drug development at NIDA. The agency spent several million dollars on preclinical ibogaine work in the 1990s before dropping it.

    Ibogaine's promoters yearn for the legitimacy that a successful clinical trial can bring. They may soon get their wish. Later this spring, neuroscientist Deborah Mash of the University of Miami in Coral Gables, Florida, will launch a phase I safety trial in Miami. A second safety and efficacy trial, of 12 heroin-addicted individuals, is slated to begin this fall at the Beer Yaakov Mental Health Center in Tel Aviv. Both are being funded in an unusual fashion: by anonymous donations—$250,000 for Mash, a smaller amount for the Israeli study.

    Traditional high.

    Ibogaine is derived from a root bark used in the West African Bwiti religion as a way to “visit the ancestors.”



    For Mash, a tenured professor who runs an Alzheimer's brain bank and won attention in the 1980s for research on how mixing alcohol and cocaine damages the brain, the donation marks a victory. The holder of a patent claim on ibogaine, she has been trying for 12 years to give the drug a scientific hearing. FDA approved Mash's phase I study in 1993, but she abruptly halted the trial when NIDA rejected her funding application.

    Three years later, she moved offshore, opening a fee-for-service clinic on the Caribbean island of St. Kitts. The standard fee per patient of several thousand dollars is adjusted based on ability to pay, according to Mash. Critics derided the unorthodox move as a money grab, but Mash maintains that her motivations were scientific. “You know, somebody ought to test it. Either the damn thing works or it doesn't,” she said in a telephone interview.

    At the New York meeting, Mash's physician colleague Jeffrey Kamlet presented snippets of data from the 400 patients he and Mash helped treat at St. Kitts. (Patients took a single dose of ibogaine titrated to body weight and other factors.) He said that for up to 90 days posttreatment, patients reported “feeling wonderful”; physician evaluations also showed improvement in depression and drug-craving scores. The results mirror those from 27 cocaine- and heroin-addicted individuals treated with ibogaine at St. Kitts that Mash published in 2000 in the Annals of the New York Academy of Sciences.

    However, Mash has not published the bulk of her data. Her explanation: She does not want to stir up long-running controversies, including a patent dispute with Howard Lotsof, who discovered ibogaine's antiaddiction value as a young heroin addict in 1962.

    Multiple effects

    There is no consensus on precisely how ibogaine works, although researchers have shown that it inhibits the reuptake of the neurotransmitter serotonin, among other actions. In this way, “it's like supersticky, long-acting Prozac,” said Kamlet, president of the Florida Society of Addiction Medicine in Pensacola.

    It can also have effects similar to those of LSD or PCP. Like them, it jolts serotonin and glutamate systems and can cause hallucinations and feelings of depersonalization. In Gabon, the Bwiti religion revolves around “visits to the ancestors” induced by eating root bark from the shrub Tabernanthe iboga, the source of ibogaine. Many patients in the West also report emotionally intense, sometimes frightening visions: scenes from childhood, or past mistakes and regrets replayed and somehow released. Debate rages over whether these experiences are key to ibogaine's antiaddictive potential or simply a psychedelic side effect.

    Not every patient experiences visions, but animal and human pharmacokinetic data reveal a common physiological response: The liver converts ibogaine into its primary metabolite, noribogaine, which fills opiate receptors hungry for heroin or morphine. Mash believes that this dramatically reduces or eliminates withdrawal symptoms, and “that's why [addicts] don't feel dope sick anymore.” Ibogaine also stimulates nicotinic receptors in the cerebellum, an action that, according to Glick, contributes to ibogaine's long-lasting antiaddictive properties by modulating the dopamine reward circuit in the midbrain.

    Besides tweaking neurotransmitters, ibogaine appears to increase quantities of a protein in the brain called glial cell line-derived neurotrophic factor (GDNF), rodent studies suggest. Researchers at the University of California, San Francisco, recently observed this effect in the brain's dopamine-producing areas. Dorit Ron and colleagues reported in the January issue of the Journal of Neuroscience that addicted rodents lose interest in opiates when given either ibogaine or GDNF. But after injecting an anti-GDNF antibody that scoops the growth factor out of play, the team found that the animals go dope-crazy again.

    Ron goes further, suggesting that GDNF maintains and possibly even repairs frazzled dopamine receptors. She reported last year in the Journal of Neuroscience that genetically modified mice producing excess GDNF grow up to have denser dopamine connections in the ventral tegmental area, where the dopamine reward pathway begins.

    Mash and others suggest that the effects of the St. Kitts therapy lasted up to 3 months because unmetabolized ibogaine deposits in fat, creating a slow-release reservoir, and because metabolized ibogaine can stay in circulation for weeks. But government agencies are wary of ibogaine, in part because of its myriad effects. It slows the heart and, at very high doses, can destroy neurons in the cerebellum. FDA and NIDA cited these toxicity risks repeatedly in the 1990s.

    Glick has been trying to develop cleaner-acting derivatives. The best-studied, 18-methoxycoronaridine (18-MC), exhibits strong action at nicotinic receptors but “seems to lack all of the actions that make ibogaine undesirable,” said Glick. Mash and other ibogaine supporters claim that the neurotoxicity risks have been hyped. But the St. Kitts team closely monitors heart activity of volunteers, excluding any with irregular rhythms.

    While Glick tries to line up funding for clinical studies of 18-MC, Mash is betting on a formulation of the metabolite noribogaine. She and the University of Miami won patent rights to noribogaine in 2002 after a long-running dispute with Lotsof, who holds a patent claim on ibogaine. Mash hopes that, like 18-MC, noribogaine may offer antiaddictive effects without the scary trip.

    Meanwhile, Vocci is disappointed that Mash has not published her data from St. Kitts. “This big case series, no one knows what to make of it,” he said. “I would expect to see a spectrum of responses. Even though it's not a controlled study, it would still give us some idea whether or not she has anything worth looking at.” If Mash's new trial does produce promising data, ibogaine advocates will have a token of legitimacy to point to. But the circle of true believers seems to be expanding, Wachtel says, because users insist that ibogaine works.


    Experimental Drought Predicts Grim Future for Rainforest

    1. Erik Stokstad

    An extraordinary research effort in the Amazon has starved a tropical forest of rain, providing a glimpse of the potential effects of climate change

    For 5 years, Daniel Nepstad has been slowly killing trees throughout a hectare of his beloved Amazonian rainforest. In an elaborate experiment akin to an installation by the artist Christo, Nepstad's team set up a 1-hectare array of 5600 large plastic panels that diverted the rain and created an artificial drought. The point of the $1.4 million experiment is to provide the most detailed look ever at how tropical forests respond to such stress.

    The good news, as Nepstad, an ecologist at the Woods Hole Research Center (WHRC) in Massachusetts, and colleagues have reported in recent papers, is that the forest is quite tough. Although that's no great surprise—forests in the eastern Amazon have long experienced regular droughts from El Niño events—the team is discovering clever tricks that the trees use to survive when the soil becomes parched.

    What's worrisome is that when drought lasts more than a year or two, the all-important canopy trees are decimated. Everyone knows that a lack of water eventually kills plants. But by pushing the tropical forest to its breaking point, researchers now have a better idea of exactly how much punishment these forests can withstand.

    These kinds of data will be indispensable for predicting how future droughts might change the ecological structure of the forest, the risk of fire, and how the forest functions as a carbon sink, experts say. Given that droughts in the Amazon are projected to increase in several climate models, the implications for these rich ecosystems are grim, says ecologist Deborah Clark of the University of Missouri, St. Louis, who works at La Selva Biological Station in Costa Rica. The forests are “headed in a terrible direction,” she says. What's more, the picture includes a loss of carbon storage that might exacerbate global warming.


    Thousands of panels prevented most rain from reaching the forest floor.


    Basement to attic

    Nepstad got the idea for the experiment while working in the eastern Amazon in 1992 during an El Niño drought. Some forests there had dried out so much that they burned, apparently for the first time. To find out more, Nepstad teamed up with Paulo Moutinho of the Institute for Environmental Research in the Amazon in Belém and Eric Davidson of WHRC. They chose a field site in the Tapajós National Forest, 67 km south of Santarém, Brazil, in the lowlands that are predicted to be especially vulnerable to climate change. It's not as wet as true rainforest and has an annual dry season that lasts for up to 6 months.

    The setup required a year's worth of effort in hot, muggy conditions. With a crew of up to 15 local workers, the team outfitted two sites with four 30-meter-high towers, linked by catwalks to study the canopy. Working with hand tools to avoid disturbing the forest, the crew also dug five pits down to 11 meters in each site to enable researchers to regularly examine roots and soil water. “You can look from the basement to the attic of the forest,” says Nepstad. Even more earth was moved as workers dug a 1.5-meter-deep trench around the hectare-sized experimental site to prevent rainwater from seeping in from the surrounding forest. To control for the impact of digging on tree roots, they excavated a similar trench around the comparison plot.

    As has been done in similar but smaller experiments elsewhere, they then assembled a system of wooden rafters 1 to 4 meters above the forest floor. Some 5600 plastic panels, each 0.6 m by 3 m, rested on these rafters. “It's like the whole understory of the forest is wrapped in plastic,” says team member Rafael Oliveira, a plant ecophysiologist now at the National Institute for Space Research in São Paulo. The panels caught about 80% of the rain that fell through the canopy and diverted it to wooden gutters that drain to the trench. To mimic natural conditions, workers flipped each panel three times a week to allow leaves and other material to reach the forest floor.

    The forest was remarkably resilient—at first. As expected, photosynthesis slowed down to conserve water, and the roots drew water from ever deeper in the soil—ultimately as far down as 13 meters. These deep roots help irrigate the topsoil, the researchers found: At night, water flows from the tap roots and dribbles out of the larger network of shallow roots to be used after daybreak, as Oliveira and Todd Dawson of the University of California, Berkeley, will report in a paper accepted by Oecologia. This phenomenon, called hydraulic redistribution, had been seen in temperate forests but wasn't known to occur in the tropics.

    Extreme instrumentation.

    Towers and trenches revealed the inner workings of the forest.


    The canopy also had tricks up its sleeve. No one would have expected leaves to absorb rainwater, says Gina Cardinot, a grad student at the Federal University of Rio de Janeiro, because of their adaptations to prevent water loss. But unpublished research by Cardinot and Leonel Sternberg of the University of Miami in Coral Gables, Florida, suggests otherwise. Stable isotope tracers applied during the drought experiment indicate that two of three common species take up some water through their leaves. “All of this adds up to a forest with enormous drought tolerance,” says Nepstad.

    That's not to say there weren't changes. Trees in the experimental plot slowed their growth, and many of the smaller trees stopped growing entirely. And then, 4 years after the drought began, they began to die. The mortality rate was especially high in tall, canopy trees—up to 9% per year—as Nepstad's team describes in a paper submitted to Ecology. “These are astonishing effects,” says Clark, who notes that no one ever really knew exactly how much death was specifically due to drought.

    The loss of large, centuries-old trees has big implications. Gaps in the canopy allowed more light to reach the forest floor, drying out the leaf litter and increasing the risk of fire. According to a model of fire risk that Nepstad has devised, in press at Ecological Applications, the control plot is highly flammable for about 10 days a year. The experimental plot, by contrast, is now highly vulnerable for 8 to 10 weeks each year. Intense fires not only convert tropical forest to savanna, they also release a lot of carbon and generate smoke that can further dry out remaining forest. Even without fires, dead trees release large amounts of carbon when the wood and roots decompose.

    Severe drought also brought dramatic changes in the ability of the forest to store carbon, because of the slower plant growth. By the third year of drought, the experimental plot was storing only 2 tons of carbon as wood, whereas the control plot still tucked away 7 tons. “That's a profound reduction,” says John Grace of the University of Edinburgh, U.K.

    By putting hard numbers on these kinds of processes, the drought experiment will help refine climate models, says David Lawrence of the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. Already, Jung-Eun Lee and Inez Fung of the University of California, Berkeley, have shown in unpublished research that incorporating the hydraulic redistribution of water into the NCAR climate model makes it more accurate.


    Panels and gutters caught rain, enabling Daniel Nepstad and his team to mimic a drought.


    One important question is how broadly Nepstad's results can be extrapolated. In contrast to the Tapajós forest, large swaths of tropical forest farther west don't experience regular dry seasons. That could mean these forests haven't evolved coping strategies and might suffer even more dramatically when drought-stricken, Nepstad warns. On the other hand, wetter environments are more buffered. Nepstad deliberately picked a site with a water table so low that roots couldn't reach it. In contrast, Grace and Brazilian colleagues have finished a smaller scale experiment farther east where the water table was higher; they found less tree mortality.

    Another factor is the time scale. Five years is just a blink of an eye for a forest. Ariel Lugo, director of the U.S. Forest Service's International Institute of Tropical Forestry in Puerto Rico, suggests that climate change will be more gradual than the onset of this experiment, perhaps allowing forests to adapt: “You have to be aware those are worst-case scenarios.”

    The next step is to observe what happens after the end of a severe drought. Nepstad's team has removed the plastic panels and will study the two plots for another 2 years to see how—or whether—the forest recovers.


    New Wave of Electrical Wires Inches Closer to Market

    1. Robert F. Service

    The performance of wires made from yttrium, barium, copper, and oxygen is getting tantalizingly close to what is needed to compete with conventional conductors

    SAN FRANCISCO, CALIFORNIA—In this era of high-tech wizardry, not every flashy new technology rises to the top. Case in point: a first generation of wire made from high-temperature superconductors (HTS). It's been on the market for years and can carry more than 100 times the current of a copper wire of the same size. But because it costs more than 100 times as much as copper wire, it has made few inroads into the mass market. Researchers worldwide are banking on a second generation of HTS wire to drop prices and improve performance. These second-generation wires—made from yttrium, barium, copper, and oxygen (YBCO)—have been difficult to make in long lengths. Now, after a decade of slow and fitful progress, YBCO wires appear to be on the cusp of reaching the market.

    At a meeting here last month,* three companies reported that they've developed manufacturing techniques to turn out YBCO wires up to 100 meters long; a few years ago, the best the industry could do was a mere meter or so (Science, 1 February 2002, p. 787). What's more, some of the new wires can carry almost as much current as the best first-generation (1G) wires, and further improvement may be in the offing: Researchers have pushed the current-carrying capacity of short lengths of wire above 1400 amperes, and they are developing a slew of wiremaking techniques that promise to drop the costs of 2G wire considerably below those of its HTS predecessor.

    “We're getting to the point where it's very interesting,” says Dean Peterson, who directs the YBCO superconducting tape development program at Los Alamos National Laboratory in New Mexico. “We can start to think about tapes and coils.” Adds David Larbalestier, a superconductivity researcher at the University of Wisconsin, Madison: “There's no doubt now it can work. Now it's all about the cost.” But therein could lie the rub. Some electrical industry watchers worry that unless 2G HTS wire ends up vastly superior to conventional copper cables, ultraconservative utilities won't make the switch, leaving HTS companies high and dry. Smaller applications, such as making motors and magnets, may bolster the bottom line of HTS companies. But utilities represent the biggest bulk customers.

    A cut above.

    Second-generation superconducting wires are slit from wide tapes into thin ribbons.


    The switch to high-temperature superconductors seemed a foregone conclusion after researchers discovered them nearly 20 years ago. Unlike conventional low-temperature superconductors, which conduct electricity without resistance only below about 20 kelvin, the copper oxide ceramics discovered starting in 1986 were found to superconduct above a relatively balmy 77 K. That meant they could be cooled with relatively cheap liquid nitrogen rather than the far more expensive liquid helium. Pundits immediately promised everything from levitating trains to the ability to store power endlessly without losses. All these applications depended on turning the brittle materials into long, flexible wires and cooling the wires below the critical temperature at which they superconduct. Scientists created the first generation of wires by packing a powder of bismuth, strontium, calcium, copper, and oxygen (BSCCO) into silver tubes and then drawing them out into wires. That encased architecture, however, made it difficult to fine-tune the arrangement of the BSCCO superconductor for better performance. And the need for large amounts of silver has kept the price of those wires high.

    Ten years ago, researchers at Los Alamos National Laboratory in New Mexico and Oak Ridge National Laboratory in Tennessee came up with different schemes for making potentially cheap high-current-carrying wires out of YBCO. At similar temperatures, YBCO wires carried higher currents than 1G bismuth wires did and retained their superconducting abilities better under magnetic fields. Those advantages held out hope that the materials could be useful for making high-powered magnets and motors that generate such high fields. YBCO could also work just fine atop nickel and other cheap substrates, and it could be laid down on top of the substrates with atomic precision using techniques borrowed from the semiconductor industry. That versatility gave the 2G conductors “all kinds of potential,” says Jim Daley, who manages the superconductivity program at the U.S. Department of Energy in Washington, D.C. “You can completely engineer them.”

    Easier said than done. YBCO has proved a pain in the neck to work with. As it is deposited, YBCO forms an array of tiny grains. Unless the boundaries of those grains line up with one another more or less in the same direction, the pairs of electrons that make up a supercurrent find it impossible to hop from one grain to the next. The ability of the materials to superconduct drops, and the electrical resistance spikes.

    To make matters worse, magnetic fields in many cases still cause trouble. Those fields penetrate top-down through a superconductor that passes current along a wire from one end to another. As they do, they create tiny magnetic whirlpools called vortices. As long as the vortices remain stationary, or “pinned,” in HTS lingo, there's no problem, because superconducting electrons can simply wiggle their way around them. But as the current in a superconductor increases, the vortices start to meander. The moving vortices increase the material's resistance and choke off the supercurrent.

    Such issues have to be controlled throughout the whole length of each wire. YBCO wires “are only as good as their weakest link,” Peterson says. “Initially, there was some doubt whether we could get there and maintain the support we needed.” But in the United States and Japan, support for YBCO wires has remained relatively steady, allowing researchers primarily at national labs and companies to work through their litany of challenges. “We had to roll up our sleeves and tackle the problems one by one,” says Steve Foltyn, a 2G wire expert at Los Alamos. That has been the story of the past decade as researchers have sifted through dozens of changes to their materials and processes in search of the best combination.

    Possible fixes include a pair of low-cost techniques to deposit YBCO atop nickel or other metal substrates both cheaply and quickly. Researchers discovered that topping this substrate with materials such as cerium oxide (CeO2) helps orient the grains of YBCO that are grown on top. Finally, they came up with high-speed schemes to add a layer of copper or another normal conductor on top to carry the current in the event that the YBCO stopped superconducting.

    Ever closer.

    YBCO wires from the Japanese firm Fujikura are nearing the goals needed to compete with copper.


    Together, these advances have enabled companies for the first time to come within striking distance of the current-carrying benchmarks set down by the U.S. Department of Energy (DOE) and sister organizations in other countries. DOE's goals call for 2G wire to carry 300 amps over 100 meters by 2006 and 1000 amps over 1 kilometer by 2010. At the meeting, Thomas Kodenkandath of American Superconductor Corp. in Westborough, Massachusetts, reported that his company had used a high-speed wiremaking technique to make 10-meter lengths of wire that carry 272 amps at 77 K and 30-meter lengths that carry 186 amps. Venkat Selvamanickam of SuperPower in Schenectady, New York, reported that his company had turned out close to 100-meter lengths of wire that can carry up to 100 amps at 77 K, also using a high-speed process. Using a slower and thus probably more expensive process, researchers at the Nagoya Coated Conductor Center in Japan have turned out 105-meter-long wires that carry 159 amps throughout. “They're all getting to long lengths with higher currents,” says Larbalestier. What's more, he and others add, it's far easier to scale up production from 100 meters to make kilometers of wire than it is to go from making 1-meter-long wires in a lab environment to creating a production process. “If you can do 100 meters, you have a process typical of reel-to-reel processing,” Daley says.

    Improvements are still coming. At the meeting, Kodenkandath and others reported that by adding nanoscale impurities to their films, they can create “pinning centers” that grab hold of magnetic vortices and thereby allow wires to operate at higher magnetic fields. Foltyn also reported that his team at Los Alamos has come up with a new scheme for boosting the current-carrying capability of their wires. Researchers have tried for years to do that simply by growing thicker layers of YBCO in their wires, so that they can handle larger currents. But they've found that as they increase YBCO's thickness, the current-carrying capability drops off.

    One possible culprit was that as the YBCO layer thickens, the grains in the film get progressively more out of alignment and therefore carry less current. To solve the problem, Foltyn and his colleagues split a single thick layer of YBCO into several layers separated with cerium oxide film to reorient the grains. The result was striking. In one club sandwich-like tape with six alternating YBCO-CeO2 layers, the current topped 1400 amps. “This is a terrific result,” Larbalestier says. So far the new multilayer wire has been demonstrated only on a sample about a centimeter long. However, Larbalestier says, it's likely that the wire companies will pick up on the idea and begin trying it out on longer wires.

    Bright future?

    Utility experts anticipate that high-current wires (left) will help stabilize the power grid.


    Heartened by their progress, HTS companies are working furiously to scale up production. Last month, American Superconductor raised $45 million on the stock market to build a 2G “prepilot” production plant in Ayer, Massachusetts. SuperPower has already built a YBCO wire-production factory in Schenectady, New York. Selvamanickam says his company plans to begin producing up to 1000 kilometers of 2G wire next year. And although it's too early to know how much it will cost, he says, “it will be significantly lower than 1G.”

    “This has been a story of wonderful science and incredible work” in universities, the national labs, and superconductivity companies, says Paul Grant, a longtime superconductivity watcher, who recently retired from the Electric Power Research Institute in Palo Alto, California, to start his own consulting business. However, Grant says the approaching commercialization belies deeper questions about whether electric utilities and other heavy industries will be willing to buy HTS products.

    “What worries me is, where is the gold rush?” asks Grant. At best, he says, utilities are lukewarm in their interest in HTS products. Several times in the past, Grant argues, electric utilities have backed off even from new technologies that were widely expected to cost less in the long run. “You're dealing with an industry that is very lethargic and doesn't adopt new technology very easily,” he says. Daley agrees: “We're dealing with a regulated industry. It's not as easy for them to make investments [in new technology].” However, Daley says he is encouraged by the fact that three utility companies are participating in three separate pilot projects to install 1G HTS power cables. “We can only hope they'll stay behind it and keep active,” Daley says. Making motors and other devices from 2G wire should help HTS companies stay afloat. But if utilities bail on making the switch, all the high-tech prowess in the world might not be enough to turn HTS wires into true power players.

    • *Materials Research Society, 28 March-1 April.
