News this Week

Science  06 Aug 1999:
Vol. 285, Issue 5429, pp. 810


    E-biomed Morphs to E-biosci, Focus Shifts to Reviewed Papers

    1. Eliot Marshall

    Controversial plans for a Web-based repository of biomedical literature, to be hosted by the National Institutes of Health (NIH), have undergone another shift in the past few weeks. After the proposal drew fire from several scientific societies, NIH has dropped the notion of a “preprint” server containing mostly unreviewed scientific papers. Instead, it is putting together a model more like a “reprint” server, providing researchers access to publications that have mostly already gone through peer review by journals. Even the proposed name, “E-biomed,” has changed. To avoid infringing a title already held by Mary Ann Liebert, a New York-based publisher of biomedical journals, the service is being renamed “E-biosci.” NIH is now discussing the proposal with scientific publishers, and it could unveil its new plan “within a few weeks,” according to one government official.

    In the revised form, E-biosci would serve as host to scientific groups and journal publishers. After certification, these groups would be free to post on E-biosci virtually any material they chose—reviewed or not. NIH director Harold Varmus, who conceived the E-biomed project earlier this year with several colleagues, is hoping that journals will post the full text of the articles they are publishing. It would be up to the groups to decide whether to post unreviewed e-prints as well. Varmus has also made it clear that he is ceding another important point: NIH will not get involved in debates over who should own copyright to articles on E-biosci. In his original E-biomed proposal, he suggested authors would retain copyright, but in an “addendum” released on the NIH Web site on 21 June, he said this issue should be handled on a case-by-case basis (Science, 25 June, p. 2062). Journals that wish to retain copyright would do so.

    For many publishers, the advantages of participating in this clearinghouse of biological literature are not yet clear. Several are now talking with Varmus and other NIH officials to determine whether, and under what terms, they might participate. Among the questions they're considering are whether to provide short abstracts, full-text articles, or just digital links to their own Web pages. Those who agree to provide full-text articles must also decide whether to delay releasing articles to E-biosci until some time after the material is available to their own subscribers. Finally, the publishers are trying to figure out how to pay for editing and peer review if the papers they publish are free on the server.

    Among journal chiefs, perhaps the most enthusiastic backer of E-biosci is Nicholas Cozzarelli, editor of the Proceedings of the National Academy of Sciences (PNAS). Although he personally likes the idea of a biology preprint server—like the successful archives for physics and astronomy hosted by the Los Alamos National Laboratory—Cozzarelli says the notion that NIH might create one is now “out.” David Lipman, director of NIH's National Center for Biotechnology Information and a co-conceiver of E-biosci, agrees. Cozzarelli has embraced Varmus's new concept, and he says he would like to begin putting PNAS articles out through E-biosci as soon as possible. He's trying to persuade his board and other editors and publishers to go along.

    Other publications that might contribute to E-biosci, according to participants in these discussions, are Molecular Biology of the Cell, published by the American Society for Cell Biology; the British Medical Journal, published by the British Medical Association; Plant Physiology and The Plant Cell, published by the American Society of Plant Physiologists; and possibly a group of journals under the aegis of the European Molecular Biology Organization. EMBO chief Frank Gannon reports that a dozen European journal editors met in Heidelberg on 21 July and agreed, among other things, that they would be willing to “transfer full text to the free E-biosci after a lag period of up to 6 months.” The terms might be different in each case, and many organizations are now consulting their boards on how to proceed.

    Cozzarelli has already conducted an e-mail survey of PNAS's editorial board. He reports that the overwhelming majority of respondents (33 of 37 thus far) favor releasing material to E-biosci with no more than a couple of weeks' delay. He concedes, however, that “a lot of [other] people have reservations” about such a plan. He is taking a proposal to a meeting of the Council of the National Academy of Sciences, which owns PNAS, for review. The council will meet on 9 August to vote yea or nay. The board of the American Society for Cell Biology, meanwhile, plans to meet this week on whether Molecular Biology of the Cell should be placed in E-biosci with a 2-month delay between release to subscribers and release to the public, says executive director Elizabeth Marincola. She anticipates the board will support the proposal.

    Other publishers are reacting cautiously. Science, according to managing editor Monica Bradford, is discussing a scheme that would provide electronic links between E-biosci and the magazine's own Web site, offering access to text for a fee, or at a reduced rate after a postpublication delay. Editor-in-Chief Floyd Bloom says that Science has demonstrated with the online features it has implemented in the past 4 years “that we are committed to the core concept of making the scientific literature more accessible … to the scholarly community at the least cost.” But he notes that a decision to transfer Science's articles to E-biosci would require approval of its publisher, the American Association for the Advancement of Science, and “I have no such proposal ready to present at this time.”

    Maarten Chrispeels, editor of Plant Physiology, says that leaders of the American Society of Plant Physiologists recently voted to talk to Varmus about adding this journal and The Plant Cell to E-biosci. “We are very interested,” Chrispeels told Science in an e-mail, “but also worried” about losing subscription revenue and about the need to charge authors for publication in E-biosci to offset those losses. Chrispeels calculates that the overhead charge for minimal review and editing would run to $1500 per article. This is “a hefty sum,” he notes, and “prohibitive” for scientists in poorer countries. He worries also that it would drive authors into the arms of commercial publishers, which don't charge page fees. On the positive side, says Chrispeels, “we see a chance of integrating plant biology with the rest of the biological sciences” in one database. “Our stand,” he adds, is to “see what the bigger players are going to do.”

    Like many editors and publishers, Chrispeels says his colleagues like E-biosci as a concept, but “an awful lot of details need to be worked out” before any final decision is made.


    DOE Builds a Web Site for the Physical Sciences

    1. Eliot Marshall

    By October, if a plan under development at the Department of Energy (DOE) works out, the public will be able to tap into a comprehensive new database of scientific papers in the physical sciences called PubSCIENCE. It will offer Internet access to titles, authors, and abstracts from hundreds of journals, according to Martha Krebs, director of DOE's Office of Science, the project's sponsor. The goal, according to her staff, is to index just about every scientific journal that isn't already indexed in PubMED—the online collection of medical information based at the National Institutes of Health (NIH)—and to link abstracts back to each publisher's Web site. Unlike the E-biosci proposal being discussed by NIH (see previous story), DOE is not asking publishers for free access to the full text of articles.

    DOE has already signed up a few major publishers willing to help test the system. The initial participants include the American Association for the Advancement of Science (Science's publisher), The American Physical Society, Elsevier Science, and the Institute for Scientific Information. By October, DOE hopes to make available current information from 400 journals. It aims to increase its coverage to 2000 later.

    The project has a history that reaches all the way back to 1947, according to Walter Warnick, director of DOE's Office of Scientific and Technical Information, which is curating the database along with the Los Alamos National Laboratory in New Mexico. Half a century ago, the Atomic Energy Commission created “Nuclear Science Abstracts,” a compendium of references for nuclear physicists. When DOE took over the portfolio in the 1970s, it broadened the scope and created the Energy Database. Its clients were chiefly the thousands of scientists who work at DOE research centers. Now, DOE is building on this base to create a digital index of all physical science articles in English, linked electronically to their publication sources. This should allow readers to jump from almost any citation that turns up in a literature search to the publisher's Web site.

    Most DOE scientists can use this service now to access many full-text articles, either because physical science journals permit free use of archival material or because DOE has paid publishers for online access. Recently, according to DOE, the Government Printing Office expressed an interest in making PubSCIENCE available to the public as well, through the “GPO Access” Web site.

    If a tentative agreement works out as planned, DOE says, anyone with access to the Internet will be able to do simple searches on PubSCIENCE records, retrieve abstracts, and jump directly into an archive. Depending on the conditions set by the publisher, the reader may get immediate access to the full text of articles, or be required to pay a fee or provide a password at an entry gate. “We're not trying to replace publishers; we're trying to make it easier to get to the published material,” says Krebs.

    PubSCIENCE will overlap a bit with PubMED in the titles it indexes, Warnick concedes. Some topics like bioengineering will get double coverage. Publishers have responded enthusiastically, as PubSCIENCE is likely to bring customers to the door. DOE is preparing for a possible surge of interest. Warnick notes that data requests increased rapidly at PubMED when all barriers to public access were dropped in 1996. The number of searches climbed from a modest buzz of about 7.4 million per year to a torrent of 180 million this year. “We might not get that kind of usage at first,” says Warnick, but the machines can handle it if it appears.


    New Clues to Whooping Cough Pathology

    1. Evelyn Strauss

    The whooping cough bacterium, Bordetella pertussis, appears to be a master tactician. According to new findings, the pathogen, after invading the respiratory tract, induces some cells to kill certain of their neighbors with toxic gas. The result likely contributes to the intense gasping cough that not only gives the disease its name but also spreads the bacterium to other victims.

    Physicians and researchers have known for decades that the pathogen destroys the ciliated cells in the epithelial lining of the respiratory tract. The hairlike cilia sweep away mucus, but when they die, coughing provides the only way to clear the airway. Exactly how B. pertussis kills these cells has been a mystery, however. Now, results described in the July issue of Cellular Microbiology by microbiologist William Goldman of Washington University School of Medicine in St. Louis and Tod Flak, Goldman's former graduate student, may have revealed the microbe's diabolical modus operandi.

    Using a tissue culture system in which the microbe causes the same type of damage as in humans, the researchers found that two toxic substances produced by B. pertussis work together to kill the ciliated cells. One of those molecules, tracheal cytotoxin (TCT), has long been a suspect, but the other, endotoxin, is something of a surprise. It is usually known for causing widespread immune system stimulation, which can lead to shock. “This is different,” says Drusilla Burns, a microbiologist at the Food and Drug Administration's Center for Biologics Evaluation and Research in Bethesda, Maryland. “Endotoxin is not normally associated with specific pathology.”

    Even more surprising, the two toxins do not launch a direct attack on the ciliated cells. Instead, they work together to incite neighboring cells to produce a noxious molecule, nitric oxide (NO), which kills the ciliated cells by an as yet unknown mechanism. “This work might explain some of the pathology of pertussis,” says Ferric Fang, a molecular biologist at the University of Colorado Health Sciences Center in Denver. And, he adds, it might also “provide strategies for intervention.”

    Such strategies are badly needed. In developed countries, vaccination largely holds whooping cough in check, but the incidence of the disease in adults appears to be increasing; in developing countries, B. pertussis still kills from 300,000 to 500,000 people every year. And although antibiotics eliminate the bacteria, by the time the characteristic cough develops, the microbes are often already gone, having set a cascade of destructive events in motion. Drugs that inhibit NO production might allow the tracheal epithelium to recover more quickly.

    Earlier work by Goldman's group had shown that NO is involved in the attack on the ciliated cells and suggested that TCT acts as a trigger. But when the team tested whether TCT causes epithelial cells in cultured hamster tracheal tissue to activate production of an enzyme called inducible NO synthase (iNOS)—which makes NO—they got what Goldman describes as a “surprise result.” TCT had no effect at all on iNOS production in epithelial cells. The researchers concluded that something in addition to TCT may be required to coerce the epithelium to produce NO.

    Goldman and Flak suspected that this co-culprit might be endotoxin, because the team had previously uncovered another case of TCT-endotoxin cooperation, in preventing the growth of a single type of respiratory epithelial cell in culture. That precedent prompted the researchers to add endotoxin along with TCT to their culture system. The combination worked.

    Because not all the cells in the tracheal epithelium are ciliated, the team wondered whether the ciliated cells themselves or their neighbors produce the NO. As Goldman recalls asking, “Are the ciliated cells committing suicide, or are they being assisted by other cells in the epithelium?” Further studies provided an answer: TCT and endotoxin induce the nonciliated, mucus-secreting cells to produce the toxic gas.

    Goldman notes that both TCT and endotoxin are made by many other bacteria in addition to B. pertussis, although most of the other microbes recycle TCT rather than release it. “You've got almost an ironic situation,” he says, “where you have this extraordinary specificity of pathology and of nitric oxide production” spurred by two extremely common molecules. Endotoxin and TCT might collaborate to kill other ciliated cells in the body as well. Work by Raoul Rosenthal's group at Indiana University School of Medicine in Indianapolis and collaborators suggests that Neisseria gonorrhoeae destroys ciliated cells of the reproductive tract using these same two molecules.

    The researchers do not yet understand how NO kills the ciliated cells without harming the secretory cells that produce it. But, as Goldman points out, the strategy might be “exactly what [B. pertussis] needs,” because it allows mucus to accumulate while eliminating the normal way for expelling it. The result is a hacking cough—an ideal way to transfer the bacteria from one person to another.

    Many questions remain about how both NO and the TCT-endotoxin partners produce their effects. Researchers also need to find out whether B. pertussis operates the same way in humans. This might be addressed, says Erik Hewlett, a B. pertussis expert at the University of Virginia, Charlottesville, by seeing whether the iNOS expression patterns in trachea specimens from children who died from whooping cough mimic those seen in Goldman's experiments. If so, it might indicate that B. pertussis is putting its subversive tactics to work in environments other than the culture dish.


    House Panel Cuts Space Science, NSF Budgets

    1. Michael Hagmann*
    1. With reporting by Jeffrey Mervis.

    Two weeks ago, NASA's leaders were popping champagne corks to celebrate the 30th anniversary of the “giant leap for mankind” on the moon and the recent launching of the x-ray telescope Chandra. Then last week, the atmosphere in NASA's Washington, D.C., headquarters turned funereal: On 26 July, the House Veterans Affairs-Housing and Urban Development (VA-HUD) appropriations subcommittee cut $1.4 billion, or more than 10%, from NASA's current budget. This carved a massive $640 million slice out of space science, threatening future Mars missions and the Space Infrared Telescope Facility (SIRTF), scheduled for launch in 2001. In the same bill, the subcommittee proposed a $25 million cut (a reduction of 0.7%) in the budget of the National Science Foundation (NSF), with an $8.5 million (0.3%) increase for research.

    A few days later, though, the full House appropriations committee repaired some of the damage. It added back $400 million to NASA's budget, rescuing SIRTF and the Mars missions, yet leaving NASA's total appropriation about 7% below the current level. No one mended the hole in NSF's budget, and researchers were left wondering where this turbulent process will leave them in the fall.

    The cuts were necessary to stay within spending caps Congress adopted in 1997, which neither Congress nor the White House is willing at this stage to lift. “The allocation [for the fiscal year 2000 budget] was significantly lower than last year's budget, so we needed to find money,” a subcommittee staffer told Science. Because the subcommittee didn't want to cut housing programs and intended to boost spending for veterans' medical care—both in the subcommittee's jurisdiction—“we only had limited choices, and NASA was our first choice,” he adds.

    Space scientists are still reeling. Carl Pilcher, science director for Solar Systems Exploration at NASA, says many felt the subcommittee's message was: “You're done; finish what's on your plate, close the shop, turn off the lights, and go home.” Pilcher calls the vote “the most severe cut in NASA's 40-year history.” Planetary scientist Steven Squyres of Cornell University in Ithaca, New York, the chair of NASA's Space Science Advisory Committee, agrees. The initial proposal “was a going-out-of-business budget for space science. Except for missions that are well on their way to the launch pad, it would have effectively zeroed out all future missions,” he says.

    To repair the initial cuts in NASA's budget, the full appropriations committee shifted $400 million from the Corporation for National and Community Service (Americorps) to NASA. “I'm absolutely delighted about the $400 million, especially since it all goes into space science,” says Squyres. But others fear that killing the national service corps will invite a presidential veto.

    And although SIRTF survived, other space projects are still endangered. The “faster, cheaper, better” Explorer astronomy missions and the Discovery planetary missions, for instance, would be reduced by $60 million each. In addition, CONTOUR, a $50 million comet mission, would be canceled altogether. And the Research and Technology account would lose $100 million. Louis Friedman, executive director of the Planetary Society, warned in a statement that “the remaining cuts are still draconian and will throw NASA into turmoil.”

    NSF came through the ordeal clinging to a survival budget, but with future plans in tatters. Its strategy of bundling much of its requested $218 million increase into the Administration's multiagency $366 million information technology initiative was shot down by both the subcommittee and the full committee. The legislators gave NSF only $35 million of its $146 million request in that area and nothing for a requested $35 million terascale computer. Members were put off by the initiative's size, and they feared that any competition for the new machine would favor NSF's two existing supercomputer centers at the University of California, San Diego, and the University of Illinois, Urbana-Champaign. “Even if they had the money, they'd have wanted to phase it in,” explains a House aide. “They also didn't want to let the haves get even further ahead.”

    Next, the VA-HUD bill moves to the floor of the House, where a vote could occur before Congress begins a recess this weekend. Although space science supporters may try to seek changes, “appropriations bills are hard to overturn,” a Capitol Hill source says. “Right now the odds are with the bill passing” in the House. Then the process will begin all over again in the Senate, giving science lobbyists another chance. Says Squyres, “I don't think I've made my last visit [to Washington, D.C.] this summer.”


    Gene Linked to Faulty Cholesterol Transport

    1. Trisha Gura*
    1. Trisha Gura is a free-lance writer in Cleveland, Ohio.

    From a secluded island off the coast of Virginia in the Chesapeake Bay comes a genetic treasure that lay buried for centuries in the inhabitants and their descendants. This population carries a rare hereditary disorder, called Tangier disease—after the name of the island—characterized by a defect in cholesterol management that leaves the patients with yellow tonsils, oversized spleens, and scant amounts of the heart-protective high density lipoprotein (HDL) cholesterol in their bloodstreams. In the August issue of Nature Genetics, three groups now report that they have nabbed the gene at fault in Tangier disease—a discovery that not only sheds light on this condition, but may also lead to a better understanding of a much more common genetic deficiency that carries a high risk of heart attack.

    The gene identified by the three groups, which were led by Michael Hayden of the University of British Columbia in Vancouver, Gerd Schmitz of the University of Regensburg in Germany, and Gerd Assmann of the Westfälische Wilhelms University in Münster, Germany, encodes a member of a large family of proteins that shuttle molecules into and out of cells. Although this particular family member, known as ABC1 (for ATP-binding cassette protein 1), had been unearthed before, researchers had not yet pinned down its exact function.

    The new work suggests that ABC1 transports cholesterol from inside cells to the cholesterol-enveloping proteins waiting outside to carry it as HDL cholesterol particles to the liver for recycling back to cells in the body. Christopher Fielding of the Cardiovascular Research Institute at the University of California, San Francisco, who co-authored a Nature Genetics “News and Views” item on the three papers, notes that it's been 25 years since researchers learned how cholesterol gets into cells via the LDL receptor. But, he says, “this is the first time we know anything about how it comes out.”

    Even more promising is the finding by Hayden's team that defects in the ABC1 gene also pop up in patients with familial HDL deficiency syndrome (FHA), a disorder characterized by low blood levels of HDL cholesterol and a high risk of coronary disease. That result could be “a bonanza,” says cardiologist Dennis Sprecher of the Cleveland Clinic in Ohio. “There are piles of low HDL syndromes,” he notes, and the new discovery raises the possibility of developing drugs that protect against heart attack by targeting ABC1 and thus increasing HDL cholesterol levels in the blood.

    The search for the Tangier culprit was a sprint to the finish after a very long marathon. Donald Frederickson, then at the National Institutes of Health in Bethesda, Maryland, discovered the disease in 1961. Not until 1998, however, did longtime Tangier researcher Assmann—formerly a postdoc in Frederickson's lab—and his colleagues finger a region of chromosome 9 as the location of the Tangier disease gene. But that still left the major job of finding the gene itself.

    Enter Hayden's group in Vancouver. The researchers pinned down the approximate location of the gene by looking to see which chromosome 9 “markers” were consistently found in Tangier patients, but not in unaffected family members—an indication that the gene and marker are close to one another. Once they found such a link, the researchers had help from nature in pinpointing the exact gene: One patient was the child of two first cousins, and thus, Hayden's team reasoned, he likely had the same mutation in both copies of the gene. That conjecture, together with their analysis of FHA families, led the team to the ABC1 gene, which had been cloned in 1994 by Giovanna Chimini's group at the French research agency INSERM-CNRS in Marseille Luminy, but whose function was then still unknown. “Basically, we squeezed as much juice as we could from the genetic data we had,” Hayden says.

    Meanwhile, Schmitz's group in Regensburg was using a more biochemical approach to pinpoint the gene. They looked for genes that were expressed differently in the cholesterol-laden cells of Tangier patients than in normal cells. This search also fingered the ABC1 gene as the most likely candidate—an identification that was confirmed when the gene turned up mutated in all five Tangier patients that the Schmitz team studied. In addition, Chimini's group had already inactivated the gene in mice, and when Schmitz and his colleagues examined those animals, they found that they had developed symptoms similar to those of Tangier disease. Finally, Assmann's group found ABC1 through a combination of the techniques used by the other groups.

    Evidence from Hayden's and Schmitz's groups indicates that ABC1 normally transports cholesterol out of the cell. For example, Schmitz found that cells engineered to make extremely high amounts of the protein were so depleted of cholesterol that they died, while cells in which ABC1 synthesis was inhibited accumulated more, much as cells from Tangier patients do.

    So far, however, the research has not resolved a conundrum about Tangier disease: Even though all the patients have low HDL cholesterol concentrations in their blood—a condition thought to predispose to the artery-plugging lesions of atherosclerosis—some seem to escape coronary disease. That puzzle is heightened by the Hayden group's results linking ABC1 mutations to FHA, an HDL cholesterol problem in which patients do show an increased risk of heart attack. The Vancouver group has so far screened 20 FHA families for ABC1 mutations and found them in eight.

    One possible explanation for the apparent paradox, Schmitz says, is that the transporter has a dual role—both ferrying cholesterol out of the cell and directing cells such as macrophages, a type of immune cell, to specific locations in the body. His hypothesis is that a mutation in one spot may send cholesterol-laden macrophages off to the spleen, lymph nodes, or tonsils for processing—hence the oversized tissues characteristic of Tangier disease—whereas a different ABC1 mutation may cause the macrophages to stick to the vessel wall, in that case leading to atherosclerosis and heart disease.

    Further work will be needed to test those ideas, as well as to see how common ABC1 mutations are in FHA. But the new findings raise the possibility of developing drugs that protect against heart disease by raising blood HDL levels—a feat no one has yet accomplished. Even if ABC1 doesn't pan out as a target for such drugs, researchers may yet find other potential targets. “This should open the doors to a general research effort into what regulates cholesterol coming out of cells,” Fielding says. “There may be other mechanisms, but this is the first one, and it is a good place to start.”


    GPS's 'Dress Rehearsal' for Year 2000 Problem

    1. Richard A. Kerr

    It's a glaring Sunday in August, somewhere in the trackless wastes of Nevada beyond Death Valley, and you've had enough geological mapping for one day. Ready to head home, you check your GPS (Global Positioning System) receiver for the shortest route back to the truck. Surprise! You're not in Nevada anymore, the readout informs you, but close to downtown Los Angeles. Welcome to the Week Zero Problem, a design glitch that will befuddle thousands of GPS receivers come the 21st of this month. “It's serious enough to be called a ‘dress rehearsal’ for Y2K,” says John Lovell, director of quality at Trimble Navigation Ltd. in Sunnyvale, California. “Users who depend on GPS for geographic locations on land, at sea, or in the air could face serious safety hazards.” The receivers depend on knowing the time to function properly, and on the 21st, timepieces on the fleet of GPS satellites will roll over to zero like car odometers hitting 100,000 miles. Manufacturers believe they've got the problem well in hand with software fixes for the receivers, but they expect some older receivers to act up—and advise caution during that last week of August.


    Like the Y2K problem, the Week Zero Problem has its origins in an early programming decision. Just as the first computer software designers economized on computer memory by recording only the last two digits of the year, designers of the U.S. military's GPS opted to track time on the satellites by counting weeks—using only 10 bits in a binary code of 1's and 0's. That meant the GPS week counter could tally up to 2^10, or 1024, weeks before rolling over to week zero. The clock started on 6 January 1980, and since then GPS has penetrated all sorts of civilian markets, from monitoring earthquake faults to surveying roads, tracking freight, and navigating cars, ships, planes, and hikers. Now, the satellites' week counters will roll over at midnight, Greenwich Mean Time, on the night of 21 to 22 August.
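    The arithmetic behind that date is easy to verify. The snippet below is a minimal illustration in Python, not any receiver's actual firmware: it derives the 1024-week limit from the 10-bit field and adds it to the 6 January 1980 epoch.

```python
from datetime import date, timedelta

GPS_EPOCH = date(1980, 1, 6)   # day the GPS week counter started
WEEK_BITS = 10                 # bits allotted to the week field
MAX_WEEKS = 2 ** WEEK_BITS     # 1024 distinct week values

# The counter wraps back to zero after MAX_WEEKS weeks.
rollover = GPS_EPOCH + timedelta(weeks=MAX_WEEKS)
print(MAX_WEEKS, rollover)     # 1024 1999-08-22

# A receiver that ignores the wrap sees only the broadcast value,
# which maps every true week back into the first 1024-week cycle:
def broadcast_week(true_week: int) -> int:
    return true_week % MAX_WEEKS
```

    A receiver that takes the wrapped counter at face value thus dates post-rollover signals back to 1980, which is why older units end up searching for satellites on a 19-year-old schedule.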

    When operating properly, GPS can determine geographic positions anywhere in the world to within 100 meters by triangulating on at least three satellites. Each satellite beams a radio signal to a receiver, specifying the satellite's orbit and the precise time the signal set out. By checking three satellites and calculating how long each signal took to arrive, a receiver can triangulate its position. But when the rollover happens, the satellites will send out 19-year-old dates.
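    The positioning step can be sketched with a toy calculation. The example below is a simplified two-dimensional version of the idea (a real GPS receiver solves in three dimensions and uses a fourth satellite to cancel its own clock error); the function name and coordinates are invented for illustration, and each distance would come from the signal's travel time, d = c × t.

```python
# Minimal 2-D trilateration sketch: locate a point from three known
# anchors and the measured distances to each.
def trilaterate(p1, d1, p2, d2, p3, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise cancels the x^2 and y^2
    # terms, leaving two linear equations a*x + b*y = c.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # Cramer's rule for the 2x2 system
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Example: anchors at (0,0), (10,0), (0,10); true position (3,4).
x, y = trilaterate((0, 0), 5.0, (10, 0), 65 ** 0.5, (0, 10), 45 ** 0.5)
# recovers (3.0, 4.0) up to floating-point error
```

    With timing errors in the measured distances, the recovered position degrades accordingly, which is one reason a 19-year-old date can throw a confused receiver hundreds of kilometers off.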

    Exactly how a receiver reacts will depend on the model. Receiver manufacturers began building in fixes in the early 1990s that prevent confusion over what week it is, so most receivers will do fine. But some units will try to track satellites with the schedule of the original Week Zero and so may take from a few seconds to as much as 20 minutes longer than usual to locate satellites, warned the U.S. Department of Transportation (DOT) in early June. Spokesperson Sara Beane of Garmin Corp. in Olathe, Kansas, a leading GPS manufacturer, says that less than 20% of their GPS receivers—those more than 3 or 4 years old—have this problem, and Garmin offers a free software patch to fix it.

    Other GPS receivers will fare worse. They may never locate satellites and fail to work at all, says the DOT, or they may appear to work but display the wrong position. No one knows for sure just how many of the world's 10 million to 15 million GPS receivers will turn into pumpkins. The DOT maintains a Web site of contact information for more than 60 manufacturers worldwide,* but it's up to each manufacturer to test its products and provide upgrades. Magellan Corp. spokesperson James White in San Dimas, California, says that 99% of their GPS products should perform normally and that word is getting out to users about the problem. White sees the response to the rollover as an example of what can be done when, as in Y2K, people know about a glitch and can test for it and provide a fix. Still, as Lovell points out, “No one can predict precisely how GPS satellites and GPS technology will function in each and every application.” So when you go out that Sunday, you might want to bring the old map and compass.


    Science Board Floats $1 Billion Trial Balloon

    1. Jocelyn Kaiser

    After years of complaints from scientists and activists that it gives environmental research short shrift, the National Science Foundation (NSF) heard a similar message last week from its own governors. The National Science Board (NSB) issued a report recommending that NSF ramp up spending on environmental science from $600 million in 1999 to $1.6 billion in 5 years.

    Such a boost would jibe with the direction in which NSF director Rita Colwell, an ecologist, is steering the agency. Last year, she proposed a network of “biodiversity observatories” to study interactions among organisms (Science, 25 September 1998, p. 1935), a project that could get under way in 2000. But although NSF takes advice from the science board seriously, the prescribed boost is far from a fait accompli: Congress must approve any increase, and early indications are that NSF's overall budget request could face a tough time this year (see p. 813).

    The NSF panel that produced the report,* chaired by marine ecologist Jane Lubchenco of Oregon State University in Corvallis, reviewed scores of reports on environmental policy as well as hundreds of comments from organizations and individuals. It says NSF devotes about $600 million, or 20% of its budget, to worthwhile environmental research projects ranging from studies of microbes that thrive in hot springs to field sites that collect data on long-term trends, such as acid rain's effects on forest growth. But that's not nearly enough, concludes the report, which argues that environmental research “should be one of the highest priorities of the [NSF],” with additional funding for everything from more interdisciplinary research to objective reviews of data for policy-makers.

    According to the panel, one area ripe for more funding is ecosystem services, a field that blends social sciences and environmental science to get a handle on the economic benefits of, say, preserving watersheds, which filter contaminants from drinking water. Also high on the agenda is research on environmental technologies, such as remote sensing of landscapes and DNA chips that can identify which genes a microbe needs to thrive in a particular environment. “There are really exciting opportunities for progress,” Lubchenco says.

    As the NSB did last year, the task force rejected the notion that NSF establish an institute or new directorate. Overseeing environmental research, it says, can be done by a “high-visibility, NSF-wide organizational focal point” that would “[identify] gaps, opportunities, and priorities” and have “budgetary authority.” One possible model, says Lubchenco, is the agency's Office of Polar Programs.

    Some environmental researchers believe that approach doesn't go far enough. Ecologist H. Ronald Pulliam of the University of Georgia, Athens, says the vaguely defined entity NSF envisions may not accomplish the “change in culture” that's needed, among other things, to prevent interdisciplinary studies from falling through the cracks between the agency's single-discipline review panels. “If it's just more money, I think that's the wrong approach,” says Pulliam, who sits on the board of the Committee for the National Institute for the Environment, a Washington, D.C., nonprofit that advocates the establishment of an environmental institute within NSF.

    The overarching concern, however, is whether Congress will go along with a $1 billion boost earmarked for environmental science. Howard Silver, who heads a lobby group called the Coalition for National Science Funding, says he is skeptical that such funding will materialize anytime soon. But he applauds the agency for “thinking big.” As he says, “One can plant a seed.”

    • *Environmental Science and Engineering for the 21st Century: The Role of the NSF.


    A Plan to Save Hawaii's Threatened Biodiversity

    1. Richard Stone

    HONOLULU—Botanist Steve Perlman will gladly risk his life to help endangered species, shimmying to the top of a rare species of palm tree to pluck fruit with viable seeds, or rappelling down a cliff above pounding surf to dab pollen on a lonely dicot clinging to the rocks. But even such heroics aren't enough to stave off the danger looming over Hawaii's unique native habitats, now under siege from alien species and development. “We're fighting a losing battle,” says Perlman, who works at the National Tropical Botanical Garden in Kauai. “It's depressing, especially when you witness an extinction that could have been prevented.”

    Now researchers are hoping to turn the tide before the casualties become unbearable. At the Hawaii Conservation Conference here last week, an advisory group of government and university scientists and land managers unveiled a draft plan for a $200 million, 5-year initiative to preserve Hawaiian biodiversity. The plan, called Legacy 2000, gets a warm reception from conservationists. “I think it's dynamite,” says William Everett, president of the Endangered Species Recovery Council in La Jolla, California. For the initiative's architects, however, the hard work has only just begun: They must find a way to pay for it. “It will have to be a manna-from-heaven situation,” admits Robert Smith, manager of the U.S. Fish and Wildlife Service's (FWS's) Pacific Islands Ecoregion. The challenge, says Michael Buck, administrator of Hawaii's Department of Land and Natural Resources (DLNR), will be to convince people on the U.S. mainland that tackling Hawaii's ecological woes is just as important as, say, fixing the Everglades, a multibillion-dollar job that Florida and the federal government are about to embark on (Science, 9 July, p. 180).

    Formed over the last 5 million years from volcanic eruptions, the main Hawaiian islands once had a breathtaking variety of species—many found nowhere else on Earth—that evolved from a few hardy pioneers. But in only 1500 years or so of human habitation, Hawaii has lost two-thirds of its native forests and hundreds of species. According to FWS, Hawaii has more species on the federal endangered list—297—than any other state. Major culprits in this decline are habitat loss and alien species, such as weeds and feral pigs, that prey on the natives or flourish in the absence of predators. But although a decade-long slump in tourism revenue has resulted in scant state support for conservation programs, not all the news is bad. Managers are making inroads against a particularly nasty invasive plant called Miconia, and conservation programs are beginning to involve Hawaiians of Polynesian descent, boosting popular support for such measures.

    Hoping to parlay these successes into an ambitious program to protect more species across larger swaths of land, a panel composed of representatives from several federal and state agencies and the University of Hawaii, Manoa—the major players that manage or study Hawaiian species—drafted Legacy 2000. Highlights of the initiative include calls for $5 million a year for community-based conservation, $3 million a year for academic research on Hawaiian ecosystems, and $4 million a year for a slate of programs to find Hawaii's rarest species, bolster endangered species through captive propagation, and create a plant germ plasm storage network.

    Few scientists would quibble with those goals. But some experts point out an additional vital step: Hawaii must do a better job of interdicting species that slip across its borders. “One of the major threats to endangered species is the hemorrhaging of alien species into the state,” says DLNR's Fred Kraus. Legacy 2000 does have some provisions that target alien species, particularly stepped-up inspections of flights from Guam and elsewhere that may be carrying the brown tree snake. A particular problem, Kraus says, is that Hawaii now bars importation of only a few kinds of noxious plants. Regulations to close this loophole, Kraus says, are “sorely needed.”

    Smith and other managers plan to incorporate comments they have received at the conference into the proposal, then gear up to mount a major effort to sell it to potential funders in Congress, nonprofit foundations, and the private sector. The clock is ticking. “This is a pretty scary time for us,” says Buck, whose department manages more than half the state's land. “If we don't get the resources” to protect the native ecosystems before more species are lost, he says, “we may never get another chance.”


    U.N. Plans Its Future in Space

    1. Helen Gavaghan*
    1. Helen Gavaghan is a writer in Hebden Bridge, U.K.

    At a meeting in Vienna last week, the 188 member states of the United Nations adopted a declaration on space and human development—their first in 17 years—that emphasizes the importance of space science and technology for improving human health, studying the environment, and helping sustainable development and disaster management. Although the declaration does not commit any country to a specific course of action, it provides a road map for cooperation among the U.N.'s member nations as well as technical and political recommendations for future space activities. Delegates also approved a voluntary fund to pay for the action plan that accompanies the declaration and suggested that countries be allowed loans from the World Bank to enable them to use the international space station.

    “What the UNISPACE III meeting has given us is the bible for the U.N.'s involvement in space and human development for the next 10 to 15 years,” says Hans Haubold of the U.N.'s committee on the peaceful use of outer space. Adds U. R. Rao, former head of the Indian Space Agency, who presided over the meeting, the delegates “listened to the needs of the developing as well as industrialized countries.”

    Most of the conference focused on the application of space technology to environmental issues. If member states act on the decisions made, possible spin-offs could include the establishment of a global, space-based system for disaster management and special funds for regional remote-sensing centers. The meeting also called for a reduction in the growing swarms of space debris circling Earth and in the electromagnetic pollution that makes both ground-based astronomy and communication with spacecraft difficult.

    But basic science was not ignored: The gathering gave its blessing to several space science and astronomy proposals, including networking small telescopes in developing countries and building an orbiting world space observatory. Both of these sprang from a series of workshops organized jointly by the U.N. and the European Space Agency (ESA) that aim to improve astronomical research and education in the Third World. Each workshop, held in a developing country, brings local scientists together with the top international researchers in their field.

    Now that the concept of a world space observatory has the meeting's endorsement, possible participants will have to decide what sort of observatory they want and how to build and pay for it. It will most likely be a 1.5- to 2-meter ultraviolet telescope, says ESA's Willem Wamsteker, a former project scientist for the International Ultraviolet Explorer, who helped organize the workshops. None of the major space agencies have plans for an ultraviolet space telescope, so Wamsteker argues that such an instrument would be both scientifically important to all nations and a technical challenge for developing nations.

    The Vienna meeting also endorsed a plan to coordinate observations of variable stars by networking together existing small telescopes in places as far afield as Paraguay and Sri Lanka. Variable stars are a good target, according to Haubold, a professor of theoretical astrophysics, most recently at the University of Vienna, because they provide information about star structure and evolution and because there are many left that need cataloging. Such studies would also provide useful training in countries that do not have a strong tradition of modern astronomy. The participants are now discussing their ideas with the American Association of Variable Star Observers, and more concrete technical plans for turning these individual telescopes into a coordinated global network will be finalized at a meeting in Toulouse, France, next June.


    France Takes Share in British Synchrotron

    1. Alexander Hellemans*
    1. Alexander Hellemans is a writer in Naples, Italy.

    France has all but abandoned its plans to build a new, state-of-the-art synchrotron facility, called SOLEIL, which had been on the drawing board for 8 years. Earlier this week, the French government announced that it will instead invest heavily in a new synchrotron to be built in the United Kingdom, together with the British government and the Wellcome Trust, a London-based charity. That decision makes it unlikely that plans for SOLEIL will be realized, France's science minister Claude Allègre told Science. French scientists, who favored having their own facility on French soil, are disappointed.

    Materials scientists and biologists use synchrotrons to determine a compound's atomic structure from diffraction patterns produced when x-rays scatter off its internal atoms. SOLEIL, a 106-meter-diameter ring whose location in France was yet to be decided, would have given French scientists their own third-generation x-ray source. But Allègre had hinted several times that, with several other European synchrotrons to become available in the next decade, the $180 million project would be redundant. Instead, he has now opted to become a partner in DIAMOND, a new synchrotron ring to be constructed in Britain. The French and British governments and the Wellcome Trust will each donate $57 million over the next 7 years to build the machine, plus $10 million to $13 million yearly to operate it. Each will get an equal share of the facility's beamlines.

    Allègre says his decision was inspired by the need to further European scientific cooperation. “Of course, there are financial reasons [as well], but they are second to my wish to build a European [scientific] community,” Allègre says. “I will not approve large French projects anymore, my priorities are European instruments.” France's participation in DIAMOND, added to its use of the European Synchrotron Radiation Facility (ESRF) in Grenoble and LURE, an older x-ray source in Orsay, will guarantee the country's researchers access to x-ray beams, Allègre asserts.

    But French scientists disagree, arguing that the decision may stifle future research projects. Besides, says Yves Petroff, former director-general of ESRF and a member of SOLEIL's scientific council, using several European facilities will drive up travel and other costs and may be more expensive than SOLEIL would have been. “This decision has been taken only by the minister, and the scientific community has not been involved,” says Petroff. “We have been kept completely outside of it.”


    Common Ground for Fusion

    1. James Glanz

    After decades of division, researchers meeting in the Rockies found some surprising overlap in approaches as different as laser and magnetic fusion

    Fusion researchers strive to get atomic nuclei to overcome their antipathy and join together. But on a human scale, few fields are more fractious, riven by differences in technology, philosophy, and—recently—sharp funding cuts for some areas while other areas thrive. Now, the community of nuclear fusion researchers in the United States has sought some unity of its own. A 2-week meeting in Snowmass, Colorado*—conceived by physicists Hermann Grunder and Michael Mauel—went further than anyone expected in inching the field toward a consensus on scientific priorities. But as a careful diplomat might say, there is still a long way to go.

    The meeting was designed to help set a new course for the U.S. fusion program after Congress last year finally pulled the plug on U.S. funding for the International Thermonuclear Experimental Reactor (ITER), a $10 billion behemoth that had been the focus of one branch of the fusion power program throughout the 1990s. It brought together scientists who make the energy-producing fusion reactions in entirely different ways—by crushing a fuel pellet with laser pulses, say, or by trapping a much more diffuse gas in magnetic fields. Included were proponents of fresh new concepts as well as old reliables; nitty-gritty technologists sat cheek-by-jowl with basic scientists.

    “A number of people who came to Snowmass obviously are unwilling to talk to each other,” chuckles Grunder, the director of the Thomas Jefferson National Accelerator Facility in Newport News, Virginia. “That Snowmass happened is a major success for the fusion community.” Agrees Richard Siemon of Los Alamos National Laboratory (LANL) in New Mexico, “This was a very refreshing meeting—very open.”

    Participants found an unexpected degree of overlap in their research, especially in technology. For example, researchers in inertial fusion energy (IFE)—laser fusion—have planned to equip their reaction chambers with walls of flowing liquid metal; at the meeting, the concept grabbed the attention of researchers in magnetic fusion energy (MFE) as well. There was also some unanimity within subfields. The magnetic fusioneers agreed that their next step should be to ignite a plasma, creating a brief, self-sustaining fusion reaction, although they debated which of various proposed machines is the best bet for doing so. The IFE researchers, for their part, agreed that their current approach, relying on massive lasers, will have to give way to other technology if they are ever to build a working power plant.

    “We tried to make this meeting inclusive and welcoming to everyone,” says Mauel, who is at Columbia University. “We didn't want to say … ‘My experiment is better than your experiment.’” The areas of agreement that emerged should have a practical effect on the direction of the field, because the meeting's conclusions will be combined with those of other recent panels to help politicians and government agencies decide which fusion programs should be funded in the 2001 budget.

    Fusion energy is created when two hot, light nuclei collide and join, and the product then splits into a fast neutron and a new energetic nucleus. The neutron strikes a surrounding wall or blanket, depositing heat that, in a fusion power plant, would be converted to electricity. The scheme would generate no greenhouse gases and create vastly less long-lived radioactivity than fission plants do.
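
    For the deuterium-tritium fuel used in most current experiments (an illustrative aside; the reaction below is textbook nuclear physics, not spelled out in the article), the split of the roughly 17.6 MeV yield between the two products is:

```latex
{}^{2}\mathrm{H} + {}^{3}\mathrm{H} \;\rightarrow\; {}^{4}\mathrm{He}\ (3.5\ \mathrm{MeV}) + \mathrm{n}\ (14.1\ \mathrm{MeV})
```

    About four-fifths of the energy leaves with the neutron, which is why the surrounding wall or blanket, rather than the confined plasma itself, captures most of the output in a power-plant design.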

    But although the method works spectacularly well in hydrogen bombs, harnessing it in controlled fashion has proved elusive. Hot plasmas are inherently unstable, and they often break up and belch the heat needed to keep the fusion reactions going. To offset those losses, magnetic fusion experiments, which aim to create a steady fusion burn, tended to become ever bigger and more complex. ITER, which would have confined plasma in magnetic fields threading a doughnut-shaped device called a tokamak, was a case in point.

    Congress also mandated that MFE researchers join forces with their colleagues in IFE—traditionally a separate effort aimed in part at studying bomb physics—to come up with a common plan for the field (Science, 3 July 1998, p. 26). “The program's at a crossroads,” says Grant Logan, a physicist at Lawrence Livermore National Laboratory in California who has worked on both types of fusion and was an organizer of the conference. “So where should the scientific program go?” To begin answering that question, the meeting broke into half a dozen subgroups, which then presented summaries that were debated in plenary sessions by the approximately 300 attendees.

    The liquid-wall concept gathered support in breakout discussions between IFE and MFE researchers. Inertial fusion researchers have been exploring the concept as a way to cope with the energetic neutrons generated by fusion, which weaken and erode solid walls. The flowing, molten metals could not only be recycled, but could carry lithium, which would combine with the neutrons to make tritium—an extractable fusion fuel. For the first time, says Logan, “I was aware of people in magnetic fusion getting interested in liquid walls. They almost dominated the discussion.” Designing liquid walls for a magnetic fusion reactor is a challenge, because the strong magnetic fields they generate can interfere with the flow of liquid metals. But smallish, university-scale experiments could begin to address the challenges, says Logan.

    Another intersection between these long-separated areas of fusion research emerged from IFE. The IFE effort is thriving, with plans—and funding—to briefly crush and ignite fuel pellets with 200 converging lasers at the $1.2 billion National Ignition Facility (NIF), scheduled to be in operation at Livermore sometime after 2001. Yet although no one doubts the utility of NIF for studying bomb physics, some MFE researchers say they don't believe the concept will lead to a practical energy source. Among the biggest problems: Lasers are far too expensive and inefficient for a power plant.

    “Rightfully, the MFE community is saying we haven't worked out all those questions,” says Logan. One possible answer came from Sandia National Laboratory in Albuquerque, New Mexico, where the so-called Z machine has achieved a series of striking results by imploding a pellet of fuel using x-rays generated with blasts of electrical current (Science, 18 July 1997, p. 306 and 3 April 1998, p. 28). But Siemon of Los Alamos suggested a hybrid approach that might solve both the IFE's driver problem and the challenge of producing a stable plasma in MFE.

    Called magnetized target fusion, the concept would resemble the Z machine in using a burst of current to crush fusion fuel. But instead of a pellet, the fuel is a hot plasma caged in a magnetic field. The pulsed current would not only compress and heat the plasma but also amplify the magnetic field, enhancing its insulating properties and relaxing the need to start with huge fields. “I think it's kind of intriguing,” says Sandia's Craig Olson, who is working on the Z machine. “It's potentially relatively low cost.”

    The MFE community is also trying to get its house in order. As in a less comprehensive meeting last year (Science, 8 May 1998, p. 818), researchers generally agreed that creating a burning plasma should be their next major milestone. “What we're arguing about is the best way to do it,” says Dale Meade, head of advanced fusion concepts at the Princeton Plasma Physics Laboratory. One route might be the so-called ITER Lite, a slimmed-down version of the original that would cost roughly half as much. Another option, with a price tag of about $1 billion, would be Meade's Fusion Ignition Research Experiment—a smaller tokamak that would eschew ITER's superconducting magnet coils for plain copper. A tokamak called the Ignitor, being designed at the Massachusetts Institute of Technology, would also create very strong magnetic fields with copper coils and be still smaller and less expensive.

    The debate revealed that “there's a lot of potential yet to be discovered in the tokamak line,” says Ron Stambaugh, a physicist at General Atomics in San Diego. At the same time, Snowmass participants agreed that MFE researchers should explore reactor designs that rely on alternative ways of caging a fusion plasma (see following story).

    Similar conclusions about MFE appear in a draft report by the high-level Task Force on Fusion Energy of the Secretary of Energy Advisory Board, some of whose members were at Snowmass. Now its report and the results of Snowmass, along with a third report on fusion still being prepared by the National Research Council and other sources, will figure in the deliberations of the Fusion Energy Sciences Advisory Committee (FESAC). By September, FESAC will make comprehensive recommendations about fusion's roadmap, including the balance of funding between MFE and IFE and the next steps toward a burning plasma, to Martha Krebs, director of the office of energy research at the U.S. Department of Energy.

    “We delayed answering the charge from Martha Krebs … to be able to hear what people had to say at Snowmass,” says John Sheffield, a physicist at Oak Ridge National Laboratory and the University of Tennessee, who is the FESAC chair. By bridging some of their differences, U.S. fusion scientists may have helped shape their future.

    • *1999 Fusion Summer Study, held from 11 to 23 July in Snowmass, Colorado.


    Fusion Power From a Floating Magnet?

    1. James Riordon*
    1. James Riordon is a science writer in Greenbelt, Maryland.

    In one radical design for a magnetic fusion reactor, energy-producing plasma would be trapped around a levitating ring of superconductor

    At first glance, something seems to be missing from the diagram Jay Kesner is describing. With a wave of a pointer he indicates a pumpkin-shaped vacuum vessel, 3 meters tall and 5 across, designed to contain a plasma of hot electrons and ions. Kesner, a physicist at the Massachusetts Institute of Technology (MIT) Plasma Science and Fusion Center, explains that a ring hovering at the center of the diagram with no visible means of support is a superconducting magnet that weighs nearly 500 kilograms. The lack of supports is not a draftsman's oversight. Kesner and his colleagues plan to levitate the ring magnetically as part of a novel experiment that may ultimately lead to a simple, safe, and inexpensive fusion power source.

    The Levitated Dipole Experiment (LDX) is a 5-year study of a plasma confinement scheme inspired by observations of ionized gases trapped in the magnetic fields of planets like Jupiter and Earth. Funded by the Department of Energy, the $6 million machine, a collaboration between MIT and Columbia University in New York City, is under construction at the Plasma Science and Fusion Center on the MIT campus and should begin operation by the summer of 2000. In the current phase of the project, which will stop short of actual fusion, principal investigators Kesner and Michael Mauel of Columbia hope to determine whether a dipole-based machine—a sharp departure from current reactor designs—can generate the conditions for fusion. The project is part of a wave of experimentation now sweeping through the field of magnetic fusion as experimenters seek alternatives to current reactor designs (see sidebar).

    Thermonuclear fusion is the engine that powers the sun and stars. At tremendous stellar temperatures and the pressures of intense gravitational fields, hydrogen nuclei are driven together until they fuse, forming helium and releasing energy. Similar reactions occur briefly during the detonation of thermonuclear warheads. In magnetic confinement fusion machines, physicists mimic the conditions inside stars by heating plasma trapped in magnetic, rather than gravitational, fields.

    For nearly 30 years, doughnut-shaped magnetic confinement machines called tokamaks received the most attention and funding for potential fusion power production. These intricate devices have produced impressive bursts of energy and remain at the forefront of fusion research. But according to Dale Meade, who heads the Advanced Fusion Concepts group at Princeton University, tokamaks and related machines are plagued by various types of turbulence that cause the plasma to leak out. Surmounting these challenges, says Meade, requires either advances in machine design or dramatically scaled-up, and expensive, devices. “We know that we can overcome plasma turbulences by building huge systems,” explains Meade, “but it wouldn't be practical or attractive to persons interested in producing electricity.” The United States recently withdrew from the International Thermonuclear Experimental Reactor (ITER) tokamak project, a collaboration with Russia, Japan, and the European Union, in part due to the estimated $10 billion price tag.

    Levitated dipole reactors, in contrast, are the least complex fusion machines yet conceived. Current-carrying loops (like the superconducting ring at the heart of LDX) and common bar magnets generate dipole fields, the simplest of magnetic field configurations. So do planets, such as Jupiter. It was the Voyager II spacecraft's detection of plasma trapped in the fields of Jupiter's magnetosphere in the late 1980s that inspired Akira Hasegawa, then a Bell Labs physicist collaborating on the Voyager space missions, to propose the dipole design for a fusion machine.

    The Jupiter observations, along with theoretical predictions, suggest that dipole magnets could confine plasmas more efficiently, with weaker magnetic fields, than the complicated coils in tokamaks and related fusion machines. As LDX physicist Darren Garnier explains, in tokamaks and related machines, magnets push on the plasma from the outside, while the dipole in LDX will pull on the plasma from the inside. “I think it was Richard Feynman,” says Garnier, “who said trying to make [tokamak-style] magnetic confinement work is like trying to compress Jell-O with rubber bands.” Dipoles, on the other hand, pull on the plasma, just as gravity pulls down on Jell-O sitting in a bowl.

    In a planetary magnetosphere, plasma captured from the solar wind is lost as it follows the magnetic field lines into the poles, where the atmosphere neutralizes it. For a dipole formed by a current loop, however, field lines pass through the center of the loop unobstructed. The plasma forms a hot cloud trapped on the field lines surrounding the magnet and flowing through its center. To keep plasma from cooling down or sticking when it hits magnet supports or power cables, Hasegawa recommended doing without them. His scheme included a levitated, superconducting dipole loop with currents that flow perpetually once established.

    After 20 years of steady progress in tokamak technology, however, the scientific community was not yet ready for his proposal. “Timing is everything,” says Kesner, “and at that time only tokamaks were fundable.” That has changed, as the LDX project testifies.

    In the current design, a thermally insulated ring of niobium-tin wire will begin by resting in what Kesner calls a charging station at the base of the vacuum vessel. The wire, which becomes a superconductor below 15 kelvin, is cooled to about 5 kelvin, and a current is introduced. Researchers will use a crane to raise the ring about a meter and a half above the vessel floor, then switch on a magnet at the top of the chamber. Its field, while too weak to interfere much with the ring's, is strong enough to levitate the ring at the chamber center. There the coil should float for up to 8 hours, warming slowly, before it must be lowered and recooled.

    In addition to being simple, levitated dipole reactors could also be safer than other fusion schemes. Tokamaks and most other reactor designs fuse the hydrogen isotopes deuterium and tritium. These reactions generate copious neutrons, which deposit heat in the reactor walls. The heat generates power, but the neutrons ultimately render the reactor components radioactive, resulting in tons of hazardous material that must eventually be discarded. Because neutrons pose severe biological hazards, a tokamak reactor would also need to be heavily shielded.

    Dipole-based reactors, with their high plasma-confinement efficiency, should be able to generate higher temperatures and pressures, enabling them to burn more advanced fuels. These fuels mainly produce not neutrons, but energetic photons and electrically charged particles. The photons would heat the reactor, producing power, while the charged particles remain trapped in the magnetic fields. Dipole-based reactors must use these advanced fuels—neutrons, which can't be confined with magnets, would inevitably pierce the magnet, heating it until it ceased to function as a superconductor. As a bonus, the fusion products are less likely to make the reactor components radioactive or threaten bystanders.
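
    The advanced fuel in question is the deuterium-helium-3 mixture described in the next paragraph; its primary reaction (again standard nuclear physics, added here for illustration) yields only charged products:

```latex
{}^{2}\mathrm{H} + {}^{3}\mathrm{He} \;\rightarrow\; {}^{4}\mathrm{He}\ (3.6\ \mathrm{MeV}) + \mathrm{p}\ (14.7\ \mathrm{MeV})
```

    Both products carry electric charge, so magnetic fields can hold them. Side reactions between deuterium nuclei do still yield some neutrons, which is why the reduction in activation is substantial rather than total.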

    The fuel most frequently touted for a levitated dipole reactor is a mixture of deuterium and He3, a helium isotope containing two protons and one neutron. He3 is scarce on Earth, although conventional fission reactors produce enough He3 to conduct scientific experiments. But to fuel levitated dipole power plants, Kesner says, we may eventually have to mine the moon, where He3 is relatively plentiful. Kesner can afford to relax about the source of fuel for his reactor, as commercial energy production based on D-He3 fusion is several decades away—at best.
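    The appeal of D-He3 fuel can be seen in the energy released per reaction. A quick check of the Q-value of D + He3 → He4 + p, computed from standard tabulated atomic masses (using atomic rather than nuclear masses is fine here, since the electron counts on both sides cancel):

```python
# Q-value of the D + He3 -> He4 + p reaction from tabulated atomic masses.
masses_u = {
    "D": 2.014102,    # deuterium
    "He3": 3.016029,  # helium-3
    "He4": 4.002602,  # helium-4
    "H1": 1.007825,   # hydrogen-1 (the emitted proton, as an atom)
}
U_TO_MEV = 931.494  # energy equivalent of one atomic mass unit, MeV

def q_value(reactants, products):
    """Energy released (MeV) from the mass difference of a reaction."""
    dm = sum(masses_u[r] for r in reactants) - sum(masses_u[p] for p in products)
    return dm * U_TO_MEV

q = q_value(["D", "He3"], ["He4", "H1"])
print(f"D-He3 Q-value: {q:.2f} MeV")
```

    The result, roughly 18 MeV carried entirely by charged particles, is what makes the reaction attractive for a machine whose superconducting coil cannot tolerate a heavy neutron flux.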

    Meade, for example, thinks plenty of problems with the levitated dipole concept could yet emerge. He believes that tokamaks, or devices related to them, are still the best bet for future controlled fusion machines. “Nevertheless,” he says, “I think LDX is a wonderful research tool to help us understand the stability issues of plasma confinement in other machines and, of course, in astrophysics.” And after the recent ITER troubles, says Steve Fetter, a professor at the University of Maryland School of Public Affairs who studies energy and environmental policy, long-term research efforts like LDX are what the magnetic fusion field needs. “At this stage, it is better to let a hundred flowers bloom rather than focus so narrowly on the tokamak,” he says.

    In any case, few physicists expect fusion to be a viable energy source before the middle of the next century. Levitating a half-ton magnet may seem like an impressive feat of engineering sleight of hand, but it's a small trick compared to bottling the fusion genie that powers the sun and stars—the ultimate goal of plasma physicists like Kesner, Mauel, and their LDX colleagues.


    Many Shapes for a Fusion Machine

    1. James Riordon*
    1. James Riordon is a science writer in Greenbelt, Maryland.

    Despite the recent troubles of the International Thermonuclear Experimental Reactor, a project to build a giant doughnut-shaped machine called a tokamak, other tokamaks continue to lead the magnetic-confinement fusion field. In late 1997, the Joint European Torus in Abingdon, England, set a new record by generating a 16-million-watt burst of fusion power—still short of breakeven, but nearly twice the previous record set in 1994 in the Tokamak Fusion Test Reactor (TFTR) at Princeton University. (TFTR was decommissioned in April 1997.) But in labs around the world, researchers are working on alternative fusion machines that they hope will confine plasma more effectively or efficiently. One is the levitated dipole reactor being developed at the Massachusetts Institute of Technology and Columbia University (see main text); here are a few of the other, less radical alternatives:

    • Stellarators: Often considered tokamaks' most serious competitor, stellarators include helical magnet coils wound around a plasma chamber. The kinky magnetic fields that result may control turbulence better than the smooth fields in tokamak configurations. Major stellarator experiments include Japan's Large Helical Device and the Helically Symmetric Experiment at the University of Wisconsin, Madison, as well as projects in Spain, Australia, and Germany.

    • Spherical toroids: Shrinking the hole at the center of a tokamak changes the doughnut-shaped machine to something resembling a cored apple. Spherical toroids rely on interlocking coils to generate fields much as tokamaks do, but achieve much higher confinement efficiencies by maximizing the length of stable magnetic field lines. Two new spherical toroids, the Mega-Amp Spherical Tokamak at the Culham Science Centre in the U.K. and the National Spherical Torus Experiment at Princeton, produced their first plasmas early this year.

    • Reverse-field pinch (RFP): Relatively minor players in the fusion game for the moment, RFPs share the doughnut shape associated with tokamaks, but their magnets can be smaller because researchers induce a current in the RFP's plasma itself, making it flow around the machine like a river and generate its own magnetic field. The field squeezes—or pinches—the very plasma that produces it, helping to keep the plasma away from the chamber walls. Confinement efficiencies in the Madison Symmetric Torus at the University of Wisconsin rival those of tokamaks.

    • Spheromaks: Eliminating the hole in a tokamak altogether results in a spheromak, a device that, like the RFP, relies in part on plasma currents to generate confinement fields. Spheromak programs include the Swarthmore Spheromak Experiment at Swarthmore College in Pennsylvania and the Sustained Spheromak Physics Experiment at Lawrence Livermore National Laboratory in California.

  EUROPE

    JET Staff OKs Pay Settlement

    1. Judy Redfearn*
    1. Judy Redfearn writes from Bristol, U.K.

    The end is in sight for a 20-year-old dispute over disparities in pay and working conditions at Europe's premier fusion laboratory, the Joint European Torus (JET) near Oxford, U.K. Last month a majority of the 217 professional U.K. staff members voted to accept a 24 million euro (US$24.7 million) compensation package. Half of the money is expected to come from tightening JET's 1999 operating budget, including reducing the number of hours the machine will be running.

    The out-of-court settlement clears the way for changes in the management of JET, which since 1979 has been Europe's major contributor to the international effort to design a next-generation fusion machine. In January responsibility for the facility will be transferred from the European Commission to the United Kingdom Atomic Energy Authority (UKAEA), which will run it on behalf of the commission's Euratom program and Europe's 17 national fusion associations.

    Fearing a costly settlement, several associations had threatened to pull out if the dispute wasn't settled. The UKAEA had said that it couldn't afford to operate the facility alone and that it needed an agreement this month in order to prepare properly for the new management scheme. “Now I think our doubts will fall and we will take part,” says Roberto Andreani, head of the Italian fusion association.

    Historically, all of JET's professional staff members have been commission employees, except for U.K. nationals, who remained on UKAEA's payroll and salary structure. As a result, U.K. staff members worked side by side with nationals from other European countries who, as an inducement to sign up, earned higher salaries and were promised preferential treatment when applying for other Euratom posts. The U.K. staff complained to the European Court of Justice, which in 1996 found that this practice was discriminatory and ordered the commission to change its employment practices and negotiate a settlement. The European Parliament tried to mediate, but the staff turned down a 9 million euro offer and decided to go back to the European court.

    The cases were unlikely to be heard until the middle of next year, however. So once again the European Parliament stepped into the breach, with Detlev Samland, chair of the Parliament's budgets committee, acting as mediator. The Parliament must still give its approval to transfer public money to pay for the settlement, but little opposition is expected. “The Parliament has always considered that there has been a discrimination,” says one Parliament official.

    As for financing the settlement, some 9 million euros were set aside after the 1996 compensation offer, and an operating reserve contains another 3 million euros. The remainder will be drawn from JET's 80 million euro operating budget this year. In addition to a freeze on hiring, the lab's electric bill, a major expense, will be pared by reducing the hours of operation. “JET will have to walk for the rest of the year, at best,” says Francis Troyon, chair of the JET council.
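    The financing arithmetic above can be checked directly against the figures quoted in the article:

```python
# Sources of the JET pay settlement, in millions of euros (article figures).
settlement = 24.0        # total compensation package
set_aside_1996 = 9.0     # left over from the rejected 1996 offer
operating_reserve = 3.0  # existing reserve

# Whatever the earmarked funds don't cover comes out of the 1999 budget.
remainder = settlement - set_aside_1996 - operating_reserve
share_of_budget = remainder / 80.0  # against JET's 80M-euro operating budget

print(f"Drawn from 1999 budget: {remainder:.0f}M euros "
      f"({share_of_budget:.0%} of the operating budget)")
```

    The 12 million euros drawn from operations matches the earlier statement that half of the 24 million euro package would come from tightening JET's 1999 budget, and explains the cuts to running hours.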


    A Long March to Save Africa's Dwindling Wildlands

    1. Richard Stone

    A conservation biologist is setting out on a yearlong trek through the heart of Africa to survey plants and animals before they are lost

    Mike Fay was tramping through what he assumed was virgin forest in the Central African Republic a decade ago, hot on the trail of gorillas, when he stumbled across something out of place. Kneeling beside a yanga, a kind of forest pothole that collects water during rainstorms, he noticed tiny black specks in the dirt—kernels from an oil palm tree, which is usually associated with human settlement. “I thought, ‘That's strange,’” Fay recalls. “I'd been walking around this forest for 2 years and hadn't seen a single oil palm.” Fay later used carbon dating of kernels to establish that oil palms flourished in the region more than a millennium ago, around the time that linguists believe Bantu-speaking people were migrating across sub-Saharan Africa. To Fay, a biologist with the Wildlife Conservation Society (WCS) in New York City, the find suggests that the Bantus cultivated oil palms across huge tracts before moving on and letting the land revert to wild. “It appears that humans were everywhere,” in even the most remote Central African forests, he says.

    The discovery illustrates the rewards of probing the rainforest on foot rather than from behind the wheel of a Land Rover. Although not all experts buy Fay's argument that oil palms imply past occupation, most are convinced of the value of foot-slogging through the forest. And so they applaud Fay's newest venture: an ambitious effort to survey a 1400-kilometer-long swath of land, stretching from the Central African Republic southwest across Congo to the coast of Gabon, before large tracts succumb to the latest wave of human migration and exploitation. In a project set to get under way next week, Fay and a company of Babendjele and Bangombe guides and porters will spend a year trekking through forests, inventorying plants and animals and searching for additional signs of ancient settlements. Fay hopes that what he calls a “megatransect” will generate data that the three African governments and land managers can tap to cordon off ecologically valuable lands while encouraging sustainable logging and hunting in other regions.

    Whereas typical surveys cover only a small area, or examine only a few aspects of a wider terrain, the megatransect “offers an unprecedented opportunity for an ecological snapshot on a large scale,” says John Hart, a WCS senior scientist who has studied the effects of Congo's civil war on wildlife. “What Mike proposes to do is extremely important” for documenting the land before human activity irrevocably alters it, adds Claudia Olejniczak, an anthropologist at Washington University in St. Louis.

    Such a project requires an almost obsessive commitment to conservation, a kind of devotion that Fay has demonstrated before. In 1993, he led a campaign to persuade the Congolese government to set up a national park to protect 400,000 hectares of the Nouabale-Ndoki forest, a vast preserve teeming with forest elephants, an antelope called the bongo, and lowland gorillas. Based mostly on Fay's survey work, the government designated the park's core off limits while allowing sustainable logging, hunting, and tourism along the edges. “I tip my hat to his successful efforts” to set up the park, says the U.S. Forest Service's John Sidle, endangered species coordinator for the Great Plains National Grasslands, who calls Fay “a risk taker and adventurer.”

    Before Fay strikes out on foot, he plans to gather a bird's-eye view by flying his own Cessna over the study area's 17 connected forest tracts. He'll use a video camera and Global Positioning System information to roughly chart his course through the woods, recording locations of settlements, roads, and logging operations, particularly in areas with scant data on land-use patterns. Fay has pioneered this aerial approach to data collection in Central Africa, a region barely penetrated by ground surveys. “You can sense a village coming even 20 kilometers away,” he says. First weeds appear, then oil palms and other crops.

    But for the new project, the real fun starts at ground level. After striking out from a settlement at the periphery of the dry, sandy Ngoto forest in southern Central African Republic, Fay's team will follow human and animal trails deep into the forest, along a general compass line that should bring them to the next settlement some days later. Following trails rather than bushwhacking will be less invasive and save the group's strength, says Fay, who will note large mammal spoor and identify and sample plants on the route, which he expects to traverse at a rate of about 7 kilometers a day. Along for the ride will be a photographer for National Geographic, which is funding the project. “I want to be able to take away as many images as I possibly can,” Fay says.

    To get a sense of the relative wildlife density from one stretch of forest to the next, Fay intends to lure animals into the team's path. Six times a day, a guide will let out a melodious, high-pitched sound—like a cross between a goat's bleating and a cat's mewing—that mimics the distress call of a duiker, a small antelope. The call is known to attract a variety of mammals, including chimpanzees, leopards, genets, pythons, and duikers themselves. However, these “pulse of the forest” tallies, as Fay calls them, will not help to inventory mammals, such as buffalo and bongo, that ignore the wails.

    By the end of his journey, Fay hopes to have amassed a data trove that can help conservationists make a case for setting aside vital habitat for imperiled species, such as the forest elephant and the gorilla. Fay understands that many hectares will be developed for logging or other uses. But he hopes, for example, that some of the older forest areas might be spared.

    Such pragmatism may not please everybody, but it seems to work, particularly in the face of government practices that embrace logging and the conversion of forests to agricultural fields. Fay compares the situation in Central Africa today to the rapid colonization of western North America last century. “The parallels are overwhelming,” says Fay, who has steeped himself in the journals of the Lewis and Clark expedition, which blazed a trail for western colonists nearly 200 years ago. Fay feels for Meriwether Lewis, who was so distraught by the thought of developing the West that he committed suicide 3 years after the expedition ended. “He just couldn't handle the fact that the area west of the Mississippi was going to be completely colonized,” Fay says. “It just drove him crazy, he loved that place so much.”