News this Week

Science  09 Aug 2002:
Vol. 297, Issue 5583, pp. 912

    Congress Homes In on New Department's R&D Programs

    David Malakoff

    Science groups are feeling a little more secure about the role of research in the planned U.S. Department of Homeland Security (DHS). After weeks of frenzied lobbying, advocates for various research organizations are assessing their efforts to shape the mammoth agency and preparing for a final push in September when Congress returns after a monthlong recess. Research groups have “done a good job of elevating science's role in the department,” says April Burke of Lewis-Burke Associates, a Washington, D.C., lobbying firm that represents research universities and other science-related groups. “But we've got more work to do.”

    The proposed department, designed to shore up the nation's defenses against terrorism in the wake of last fall's attacks on the World Trade Center and the Pentagon and anthrax-laced letters, will be assembled largely from existing border-control and security programs. But it is also expected to start life next year with as much as a $2 billion budget for science and technology.

    The Senate's version of the department—approved by the Government Affairs Committee on 24 July—would create a $200 million research agency designed to spur antiterrorism technologies. The House bill, approved on 26 July by the full body, would establish the Homeland Security Institute and several university- and government-based research centers (see below), although it doesn't set specific spending targets. Both bills also call for DHS to take over selected nuclear nonproliferation and computer science programs currently run by the Department of Energy (DOE), an animal-pathogen laboratory run by the Department of Agriculture (USDA), and—if the White House requests it—certain DOE pathogen-research projects. “We may see some major new research programs” in the final bill, predicts George Leventhal, a lobbyist for the Washington, D.C.-based Association of American Universities, which represents 61 top research institutions.

    Two to tango.

    Rep. Dick Armey (R-TX) and Sen. Joe Lieberman (D-CT) will lead congressional efforts to resolve differences over the new department's science programs.


    Exactly how those programs will be managed, however, is still up in the air. The initial proposal from the Bush Administration in June (Science, 14 June, p. 1944) would have rolled an array of government research programs into DHS. But the White House dropped some of its most controversial ideas—such as transferring DOE's Lawrence Livermore National Laboratory—when it presented its formal plan to Congress. Since then, scientific organizations have made plenty of suggestions.

    Biomedical research groups, for instance, oppose the Administration's proposal to give DHS control of the National Institutes of Health's (NIH's) $1-billion-plus bioterrorism research portfolio, saying that it would hinder efforts to develop needed vaccines and drugs. “Adding a layer of bureaucracy isn't helpful if your goal is a better anthrax vaccine,” says Pat White, a lobbyist for the Federation of American Societies for Experimental Biology (FASEB) in Washington, D.C. “An experienced science agency needs to be in charge,” says Janet Shoemaker of the American Society for Microbiology (ASM), also in Washington.

    The House agreed. A panel led by Representative Dick Armey (R-TX) shifted control of bioterror research back to NIH but gave DHS a strong advisory role. In contrast, the Senate bill, crafted by Senator Joe Lieberman (D-CT), gives DHS the upper hand in setting spending priorities—largely because the White House and some senators worry that NIH might stray from studies directly related to bioterror threats. “NIH sometimes can be too curiosity driven,” says one Senate aide. ASM and FASEB are urging lawmakers to adopt the House version in the final measure.


    The two bills also differ on which agency should regulate research involving potential bioweapons. The Senate bill calls for DHS to swallow two oversight programs being established by the Centers for Disease Control and Prevention and USDA. But biomedical groups support the House approach, which leaves them where they are.

    There is greater agreement among legislators on other issues. Both bills call for using merit-based competition to award grants and encourage the department to keep the fruits of its research unclassified. They also create advisory bodies to be stocked with outside scientists and add a high-level science czar to oversee the department's R&D portfolio. The new position “will help ensure that science has a seat at the policy-making table,” says lobbyist Phillip Harman of Lewis-Burke, who pushed the idea on behalf of the California Institute of Technology in Pasadena and other clients. Harman also pushed the $200 million security research agency in Lieberman's bill, which is aimed at accelerating targeted technologies. Dubbed the Security Advanced Research Projects Agency, it is modeled on the Pentagon's DARPA.

    Neither the House nor the Senate is eager to see the new department create its own centralized laboratory, something that the Administration had proposed bestowing upon Livermore. New Mexico politicians have criticized the idea, and both bills now require the department to give that state's national laboratories—Los Alamos and Sandia—a shot at becoming the lead lab. Lawmakers are also wary of the possible transfer of any of DOE's pathogen sequencing and research projects to DHS. Such politicking is sure to continue as Congress nears the finish line in a race for what some historians say is the most significant reorganization of the U.S. government since the Great Depression.


    Texas A&M Draws Flak for Plans to Nab Antiterrorism Research Center

    David Malakoff

    Administrators at Texas A&M University have apparently decided that “homeland security” is a password to the federal treasury. Taking advantage of legislation now before Congress (see above) to create the Department of Homeland Security (DHS), the College Station institution has positioned itself to snag a lucrative center to coordinate university-based research on antiterrorism technologies. But science advocates are crying foul, saying that other universities should be able to compete for such a plum.

    This summer Texas A&M officials worked with state politicians to insert language into the House version of the bill that sets out 15 criteria for awarding the center. Critics say that the criteria are designed to prevent competition. For example, schools must be affiliated with a Department of Agriculture “training center” and show “demonstrated expertise” in wastewater operations and port security. “Stanford doesn't do sewerage or train teamsters, so it's out,” jokes one university lobbyist. “This has nothing to do with the nation's security needs and everything to do with one university's needs,” says a House aide.

    Texas A&M officials weren't available to comment, but campus sources say the language arises from a long-running campaign to establish the school as a hub of security expertise. In May, university regents voted to establish the Integrative Center for Homeland Security, and local Representative Kevin Brady (R-TX) proposed legislation that would allocate $120 million over 5 years for it. That bill stalled, but House backers grafted it onto the DHS bill, where it survived some testy debate and two close votes.

    Opponents, including House science panel chief Sherwood Boehlert (R-NY), say the language undermines the principle of awarding research funds competitively and restricts the applicant pool. But some schools say they can live with the rules. “We can compete,” says Scott Sudduth, a Washington, D.C.-based lobbyist for the University of California system. Texas A&M, meanwhile, hopes to win over senators by assembling a statewide consortium of institutions to be charter members of such a center.


    Panel Shines Light on Exploring the Sun

    Andrew Lawler and Charles Seife

    NASA should revive plans to send a spacecraft into the solar atmosphere, concludes a National Academy of Sciences panel that this week unveiled the first-ever strategic plan for the next decade of solar and space physics. Its report recommends that NASA and other government agencies launch probes throughout the solar system to study the sun and its interaction with the planets and the interstellar medium.

    The study—18 months in the making—offers a concrete set of priorities for solar research (see table). The panel's recommendations should help researchers obtain funding in a field that has traditionally lagged behind planetary exploration in the space sciences but that has enjoyed recent successes such as the U.S.-European SOHO mission (Science, 28 July 2000, p. 528).


    By and large, the 15-member panel endorsed NASA's vision of a flotilla of spacecraft of various sizes, as well as a handful of ground-based efforts. But it urged the space agency to resurrect a $650 million solar probe that would fly into the solar atmosphere to measure the sun's tumultuous plasmas, fields, and waves, despite technological and cost hurdles that include enduring temperatures of 2400 kelvin and an extra wallop of radiation. The panel, chaired by Lou Lanzerotti, a physicist at Lucent Technologies in Murray Hill, New Jersey, also puts a high priority on an as-yet-unfunded spacecraft, the Jupiter Polar Mission, that would study the interplay between the sun, Jupiter, and Jupiter's moons.

    “The Solar Probe, right now, is canceled, and we're telling them to change course,” says panel member James Burch, vice president of the Southwest Research Institute's Instrumentation and Space Research Division in San Antonio, Texas. “The Jupiter Polar Mission is not in the program right now. [The changes] might mean that they have to reshuffle the order of their solar terrestrial probes.”

    Sizzling science.

    Experts endorse canceled mission to the sun's atmosphere.


    As with a recently released study on planetary exploration (Science, 19 July, p. 317), the Lanzerotti panel grouped its ranked recommendations into large ($400 million-plus), medium ($250 million to $400 million), and small (less than $250 million) missions. Some of the experiments, such as the top-ranked Solar Probe, will study the sun directly. Others, such as the second-place Geospace Network, a group of satellites that will monitor Earth's environs, are intended to illuminate how Earth is influenced by the sun. The Solar Probe was the only large mission ranked, whereas nine missions each were included in the small and medium categories.

    The panel's plan includes missions for which NASA does not yet have funding. But it will all “fit within the budget we think is going to be available,” says Burch, a budget he expects to grow from the current $400 million to $650 million in 2008 and beyond. The panel also concluded that the technical hurdles facing these missions require a new level of cooperation among five government agencies—NASA, the National Science Foundation (NSF), the National Oceanic and Atmospheric Administration, and the departments of Defense and Energy—on basic research as well as operational programs. For example, it recommends that NASA continue research into advanced power, propulsion, and electronics for spacecraft while NSF improves the reliability of ground-based sensors and networks, some of which also operate in extreme environments.

    “I think it will help maintain the vitality and health” of the field, says Michael Calabrese, a program manager at Goddard Space Flight Center in Greenbelt, Maryland, who notes that a NASA-sponsored panel is working on a 25-year road map that will supplement the 10-year scope of this report. “That way you get two looks at this,” he says. In the meantime, the academy report gives NASA a way to lift missions out of the budgetary frying pan and into the solar fire.


    NIEHS Toxicologist Receives a 'Gag Order'

    Dan Ferber

    A toxic tiff at the National Institute of Environmental Health Sciences (NIEHS) seems to have escalated into a cause célèbre that has even caught the attention of a member of the House Committee on Government Reform. At the center of the dispute is James Huff, a 23-year veteran of NIEHS's carcinogen testing program and an outspoken critic of the chemical industry. Last month, after clashing with his supervisor, Huff received what he calls a “gag order,” a proposed agreement forbidding him from criticizing NIEHS in public. The agreement itself was soon circulating in e-mails, and when outsiders learned about it last week, NIEHS apparently withdrew the order.

    Huff, 64, is no stranger to controversy. Beginning in 1979, he helped develop a high-profile program at NIEHS that tests suspected carcinogens on mice and rats by feeding them chemicals over an entire lifetime. Regulators have used such long-term assays to decide which chemicals might cause human cancer—and have come under intense fire for using methods that industry believes exaggerate risk. Huff, the author of more than 300 published scientific papers, has defended the validity of these methods and publicly criticized attempts by NIEHS and industry officials to revise them. Last year Huff publicly blasted a $4 million NIEHS-industry research collaboration on the effects of chemicals on human reproduction and early development.

    The draft agreement, which Huff says he received 23 July, came after NIEHS scientific director Lutz Birnbaumer asked Huff to stop other research and prepare a report on a topic Huff isn't interested in. In an e-mail, Birnbaumer said that the disagreement arose because Huff “has refused to review and summarize” an area of cell biology “in a timely manner.”

    The NIEHS agreement would have required Huff “not to send any letters, emails or other communications that are critical of NIEHS as an Institute or its scientific work to the media, scientific organizations, scientists, administrative organizations, or other groups or individuals outside NIEHS.” It also stated that if Huff violated the agreement and couldn't provide a satisfactory explanation to the NIEHS director, he would have to retire or resign “voluntarily” within a week, and that he would have to retire by December 2003 in any case. Francine Little, an NIEHS administrator whose name appears on the memo, declined to comment on it, describing it as a “confidential personnel matter.” But she noted that it was part of a negotiation and not “a done deal.”

    News of the threatened action spread rapidly among toxicologists and public health advocates. Some said they were upset by what they saw as an attempt to silence internal dissent. Lorenzo Tomatis, former director of the respected International Agency for Research on Cancer in Lyon, France, who collaborates with Huff each summer at NIEHS, said the draft agreement “had the tone you would expect to find under a dictatorship.” And Christopher Portier, director of NIEHS's environmental toxicology program, said he had not seen the memo firsthand, but “it sounds somewhat extreme.”

    Congress is getting into the fray as well. Representative Dennis Kucinich (D-OH), in a letter he sent last week to NIEHS director Kenneth Olden and Little, demanded information on Huff's case and NIEHS policies on gag orders. “NIEHS should be determining the incidence of human illness caused by chemical, pollutant, and other environmental causes, not putting a gag order on one [of] its best scientists,” Kucinich wrote in an e-mailed statement to Science.

    Olden, who was away on vacation, could not be reached for comment. But David Brown, an assistant to Olden, said Olden telephoned Huff on 2 August and offered him a new job in the director's office. Brown concludes, “There's no story now.” Huff says he's encouraged by the offer but adds: “No commitments have been made. … I want to see what they put in writing.”


    Muon Measurements Muddle a Model

    Charles Seife

    Scientists at Brookhaven National Laboratory in Upton, New York, hope they've made a momentous discovery: They have confirmed a nagging discrepancy between the Standard Model of particle physics and the “magnetic moment” of the muon. Physicists are still debating just how significant the mismatch is, however.

    “That's what we're all asking ourselves,” says Frank Wilczek, a physicist at the Massachusetts Institute of Technology. It's possible that the discrepancy is a statistical glitch or a problem with the theoretical calculations, or it might be a sign of physics beyond the Standard Model.

    The new result, presented last week at a seminar at Brookhaven, is twice as precise as earlier results of the experiment, presented last year (Science, 9 February 2001, p. 958; 21 December 2001, p. 2449). In the experiment, known as muon g-2 (pronounced “g minus two”), scientists used a 14-meter-wide superconducting magnet in Brookhaven's Alternating Gradient Synchrotron to induce muons—heavier siblings of the electron—to curve around in a circle. In the process, they measured the muon's propensity to twist in a magnetic field, known as its magnetic moment. They have now measured the value to an uncertainty of 0.7 parts per million. “It's just an awesome experiment,” says Wilczek.

    Learning curve.

    Brookhaven's g-2 experiment tested theory by measuring how much muons spawned by proton collisions change orientation while circling in a magnetic field.


    The results give “a very nice, consistent picture” of the magnetic moment, says Boston University's Lee Roberts, a member of the muon collaboration. “But the question for the theoretical community is … what we should really be comparing it with.”

    Physicists would like to test the value against the Standard Model, the theoretical framework that explains how particles interact. The model predicts what the muon's magnetic moment should be. Unfortunately, at present it gives two different numbers.

    That's because the theory relies on other experiments to fill in data that aren't easily calculated from first principles. Physicists can get the missing information either by studying electron-positron collisions or by watching the decay of tau leptons, other heavy siblings of the electron. The two methods should agree, but they don't.

    According to team member James Miller of Boston University, this makes it hard to evaluate just how significant the disagreement between experiment and the Standard Model is. “We're not sure which number to take,” he says. Using tau-decay data, the difference is a mere 1.6 standard deviations, which is not considered significant. Using published electron-positron data, the number jumps to 2.6 standard deviations, which is considered interesting but far from conclusive. However, using new, unpublished electron-positron data from the Budker Institute of Nuclear Physics in Novosibirsk, Russia, the discrepancy jumps to 3.7 standard deviations—which, if it holds up, would be hard to dismiss as a statistical fluke.
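
    To see how those figures arise, here is a minimal sketch, in Python, of the standard significance calculation: the gap between the measured and predicted moments divided by their combined uncertainty. The input values below are hypothetical placeholders, chosen only so the two cases land near the 1.6- and 2.6-sigma numbers quoted above; they are not the collaboration's actual values.

```python
import math

def significance(measured, sigma_meas, predicted, sigma_pred):
    """Gap between experiment and theory, in standard deviations.

    The experimental and theoretical uncertainties are assumed
    independent, so they combine in quadrature.
    """
    return abs(measured - predicted) / math.sqrt(sigma_meas**2 + sigma_pred**2)

# Hypothetical values in arbitrary units, for illustration only.
measured, sigma_meas = 100.0, 2.0
theory_tau, sigma_tau = 96.0, 1.5  # stand-in for a tau-decay-based prediction
theory_ee, sigma_ee = 93.0, 1.8    # stand-in for an e+e- data-based prediction

print(round(significance(measured, sigma_meas, theory_tau, sigma_tau), 1))  # 1.6
print(round(significance(measured, sigma_meas, theory_ee, sigma_ee), 1))    # 2.6
```

    The point of the sketch is structural: with the experimental value fixed, the quoted significance depends entirely on which theoretical prediction, with its own uncertainty, is fed in.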

    “My first statement would be not to be in a hurry” to jump to a conclusion about the mismatch between theory and experiment, says Simon Eidelman, a physicist at the Budker Institute. Although Eidelman thinks that the Brookhaven experiment is “extremely beautiful from the physics point of view,” he says it's too early to tell whether there's a problem with the calculations, with experiments that feed into them, or with the Standard Model itself. “When and where all this will converge, I can't tell,” he adds.

    Eidelman might have to wait a while to find out: The muon collaboration has some more data yet to be processed that should bring the error bars down a bit, but the White House budget contains no funding to continue the Brookhaven experiments. Experiments that study the B meson, such as BaBar at the Stanford Linear Accelerator Center in California and Belle at KEK in Tsukuba, Japan, might help narrow down uncertainties in the theory. However, it will be at least half a decade before the Large Hadron Collider at CERN, the European particle physics laboratory near Geneva, shows for sure whether the Brookhaven result is the sign of new physics or just an interesting twist in the same old story.


    Panel Hears Ideas for Overhaul of NIH

    Jocelyn Kaiser

    Does the $23.5 billion U.S. National Institutes of Health need a major overhaul to trim its ever-growing fleet of 27 centers and institutes? Last week, an Institute of Medicine (IOM) panel that's begun investigating this question heard comments from current and former NIH directors. Two out of three said NIH would be better off if it were more centralized. But a former member of Congress who guided NIH funding injected a dose of reality, saying that “it is going to be a very daunting task” to overcome political pressures to maintain the status quo.

    Congress asked for the study in a report accompanying a 2001 spending bill. Lawmakers wanted to find out “whether the current NIH structure and organization are optimally configured.” The most prominent advocate of restructuring at that time was Harold Varmus, NIH director from 1993 through 1999. He spelled out his ideas in an article last year arguing that constantly adding new institutes, each with its own budget allocation, was becoming too cumbersome (Science, 9 March 2001, p. 1903). He called for reforming NIH into five institutes organized by disease group. In his plan, a sixth institute, “NIH Central,” would house the NIH director and have much more power to shift funds among institutes than the director has now.


    Harold Varmus thinks NIH needs fewer, not more, institutes.


    Varmus explored his ideas with the IOM panel, which is chaired by former Princeton president Harold Shapiro and includes James Wyngaarden, another former NIH director (1982 to 1989). Varmus explained that, with 27 institute chiefs squeezed into a room, “it's very difficult to feel you're actually molding things.” Administrators “got tired” of being pushed to do joint projects on zebrafish, mouse, and bioinformatics. “There is a serious misconnect between this checkerboard of institutes and how science is being done,” Varmus said.

    A leaner structure also received the support of Bernadine Healy, NIH director from 1991 to 1993, who suggested grouping NIH in four slightly different “clusters.” Healy, however, thinks more institutes are fine; she even suggested two new ones for nutrition and rehabilitation. Current NIH director Elias Zerhouni didn't take a stand on restructuring. He asked the panel to think not only about “organizational change” but also “better management tools” to “optimize performance.” He and others also suggested other questions, such as whether institute directors should have term limits.

    Abolishing institutes is easier said than done. The same disease advocacy groups that have pushed to double NIH's budget over 5 years to $27.3 billion in 2003 also support their favorite institutes, and most institutes have congressional champions as well. Debra Lappin of the Arthritis Foundation reminded the group that “the American public owns the NIH.” Redundancy, she suggested, could be a good thing, because consolidating could lead to “great orthodoxy” and “less competitiveness.”

    “Any attempt to eliminate individual institutes will meet probably very strong political resistance,” former Illinois Representative John Porter told the group. However, he thought giving budget authority to a cluster director to move money around institutes within that cluster “is possible.” This wasn't what he had in mind, Varmus said, but he acknowledged that panelists might be “forced politically to move away from the ideal world.” The panel is due to deliver its report in September 2003.


    Satellites Spy More Forest Than Expected

    Jocelyn Kaiser

    Fifteen years after people began chomping on Rainforest Crunch to help save the Amazon, experts still don't have a good handle on exactly how quickly tropical forests are disappearing. Now on page 999, scientists describe an effort to fill that data void: one of the first studies to assess humid tropical forest cover with satellite data rather than with on-the-ground measurements and guesswork.

    The study's conclusion—that deforestation rates were 23% lower between 1990 and 1997 than had been estimated—doesn't change the need for conservation, says remote-sensing expert and study co-author Hugh Eva of the European Commission's Joint Research Centre in Ispra, Italy. Tropical forest cover “is still disappearing at incredible rates.”

    For climate change experts, however, the study, known as TREES, means rethinking an important number: how much carbon dioxide land plants are absorbing. “It's a very big deal,” because predictions of global warming rely on that number, says ecologist David Schimel of the National Center for Atmospheric Research in Boulder, Colorado.

    The study is sure to be controversial. Some experts, including the authors of another new remote-sensing study, are already picking apart the TREES methodology—if not its overall conclusions.

    The source of most estimates of global forest loss has for years been the United Nations' Food and Agriculture Organization (FAO). Its foresters estimate trends by pooling data from more than 200 countries, but these reports are notoriously inaccurate. Countries don't use comparable techniques, and many lack the expertise or resources to do it rigorously (Science, 23 March 2001, p. 2294).

    Aiming for more reliable results, TREES, led by Frédéric Achard of the Joint Research Centre, applied a sampling strategy to remote-sensing data for humid tropical forests. (The researchers did not look at dry tropical forests, which cover less area and are being deforested more slowly.) They could have randomly sampled the entire forest. But to improve their accuracy, they sampled mainly where they thought deforestation was happening.

    Going, going …

    A new remote-sensing study gives firmer numbers for humid tropical forest loss from burning and clearing.


    To do this, they first identified deforestation “hot spots” using global maps they assembled from early 1990s low-resolution satellite data and by consulting with local and regional experts. Then they selected 100 sampling sites, statistically weighting them so more fell in hot spots. They compared high-resolution, before-and-after images of these 100 patches—representing 6.5% of the world's humid tropical forests—and calculated how much forest had been lost. Finally, they extrapolated these results to estimate the global deforestation rate for forests of this type.

    Between 1990 and 1997, the TREES team found, the world lost an average of 5.8 million hectares of humid tropical forest each year—an area twice the size of Maryland—give or take 1.4 million hectares. The highest percentage deforestation rates were in Southeast Asia, followed by Africa and South America. Whereas TREES found a net annual loss (after counting regrowth) of 4.9 million hectares, the latest FAO data for the same study area came up with a net 6.4-million-hectare loss.
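
    The extrapolation behind those figures can be read as a standard stratified-sample estimate: losses observed in the sampled patches of each stratum (hot spot or background) are scaled up by the inverse of that stratum's sampling fraction. Below is a minimal sketch of that logic; the stratum areas and observed losses are invented for illustration, tuned only so the overall 6.5% sampling fraction and the roughly 5.8-million-hectare-per-year result match the figures reported above.

```python
# Stratified extrapolation from sampled patches to a global estimate.
# Areas are in millions of hectares (Mha); the strata below are
# illustrative stand-ins, not the TREES team's actual data.

strata = {
    # name: (total stratum area, area sampled, forest loss seen in samples)
    "hot spots":  (300.0, 60.0, 6.00),
    "background": (800.0, 11.5, 0.15),
}

years = 7  # the 1990-1997 study period

total_loss = 0.0
for area, sampled, observed_loss in strata.values():
    total_loss += observed_loss * (area / sampled)  # inverse sampling fraction

sampled_fraction = (sum(s for _, s, _ in strata.values())
                    / sum(a for a, _, _ in strata.values()))

print(f"sampled fraction: {sampled_fraction:.1%}")           # ~6.5%
print(f"estimated loss:   {total_loss / years:.1f} Mha/yr")  # ~5.8
```

    Weighting the sample toward hot spots buys precision where change is concentrated, but, as the criticism below suggests, the estimate inherits any error in where the hot spots were drawn.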

    David Skole, for one, finds the study's approach less than convincing. “I dispute that they got the right hot spots,” says Skole, a remote-sensing expert at Michigan State University, East Lansing. By relying on maps from the early 1990s, he says, the study likely missed areas where deforestation began later. Christel Palmberg-Lerche, chief of FAO's Forest Resources Development Service, says her group also sees problems with how TREES found the hot spots.

    Skole is a co-author on another new satellite forest study. Led by Ruth DeFries of the University of Maryland, College Park, it is based on a global set of low-resolution images. DeFries uses an algorithm that calculates the amount of forest cover within each coarse pixel from its color and the time of year. The study, which is under review, also finds that the FAO forest loss estimates are too high for the 1990s, but it gets different results for each continent than TREES did. “The regional differences indicate that we still don't have a definitive answer,” says another co-author, ecologist Chris Field of the Carnegie Institution of Washington at Stanford University.

    Despite such uncertainties, Field and others say the two studies will help resolve a mystery known as the “missing sink.” The Intergovernmental Panel on Climate Change (IPCC), the expert group that has concluded that human activities are contributing to global warming, currently draws on studies based on FAO data to calculate how much carbon is released by burning and clearing tropical forests. The group assumed that such deforestation added 1.6 petagrams of carbon (or 1.6 × 10^15 grams) to the atmosphere each year in the 1990s. This carbon, plus 6.3 petagrams mainly from fossil-fuel burning, makes up total human-caused carbon emissions. On the other side of the equation, IPCC adds up where this carbon goes. About half of it stays in the atmosphere, the researchers say, likely contributing to global warming. Oceans take up 2.3 petagrams, and the rest—another 2.3 petagrams—has been assumed to be absorbed by temperate forests.

    The puzzle is that ground-based inventories of regrowing temperate forests have not found enough vegetation to absorb 2.3 petagrams of carbon. If emissions from deforestation are smaller than estimated, however, then this sink must be smaller, too. Both the TREES team and the DeFries group estimate land-use emissions at about 1 petagram of carbon. This doesn't take care of the entire “missing sink,” says ecologist Richard Houghton of the Woods Hole Research Center in Massachusetts, but “you're getting there.” The new estimates could change the results of climate models, says Schimel.
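
    The bookkeeping behind that argument is simple enough to spell out. Treating the atmospheric buildup and ocean uptake as fixed, the land sink is whatever residual balances the budget. The sketch below uses the round numbers quoted above (petagrams of carbon per year for the 1990s); the 3.3-petagram atmospheric term is implied by those figures (7.9 total minus 2.3 to oceans minus 2.3 to land) rather than stated directly in the article.

```python
# Carbon budget bookkeeping, petagrams of carbon (Pg C) per year, 1990s.
# The residual land sink is whatever is left after the atmospheric
# buildup and ocean uptake are subtracted from total sources.

FOSSIL = 6.3       # fossil-fuel and industrial emissions
ATMOSPHERE = 3.3   # buildup in the atmosphere (implied by the totals above)
OCEAN = 2.3        # ocean uptake

def residual_land_sink(deforestation):
    sources = FOSSIL + deforestation
    return sources - ATMOSPHERE - OCEAN

print(round(residual_land_sink(1.6), 1))  # 2.3 -> the classic "missing sink"
print(round(residual_land_sink(1.0), 1))  # 1.7 -> sink shrinks with new estimates
```

    Cutting the deforestation term from 1.6 to about 1 petagram thus shrinks the sink that temperate forests must account for from 2.3 to roughly 1.7 petagrams, which is why Houghton says the new numbers get there only partway.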

    These two studies aren't likely to be the last word. Achard and others say that what's needed is for experts in each country to help assemble a wall-to-wall, high-resolution satellite map. Falling costs for images and computers should make this feasible, notes Achard. Until then, scientists still won't be sure just how fast the tropical forest is vanishing.


    The Real Dirt on Rainforest Fertility

    Charles C. Mann

    Ancient Amazonians left behind widespread deposits of rich, dark soil, say archaeologists. Reviving their techniques could help today's rainforest farmers better manage their land

    IRANDUBA, AMAZONAS STATE, BRAZIL—Above a pit dug by a team of archaeologists here is a papaya orchard filled with unusually vigorous trees bearing great clusters of plump green fruit. Below the surface lies a different sort of bounty: hundreds, perhaps thousands, of burial urns and millions of pieces of broken ceramics, all from an almost unknown people who flourished here before the conquistadors. But surprisingly, what might be most important about this central Amazonian site is not the vibrant orchard or the extraordinary outpouring of ceramics but the dirt under the trees and around the ceramics. A rich, black soil known locally as terra preta do Indio (Indian dark earth), it sustained large settlements on these lands for 2 millennia, according to the Brazilian-American archaeological team working here (see sidebar).

    Throughout Amazonia, farmers prize terra preta for its great productivity—some farmers have worked it for years with minimal fertilization. Such long-lasting fertility is an anomaly in the tropics. Despite the exuberant growth of rainforests, their red and yellow soils are notoriously poor: weathered, highly acidic, and low in organic matter and essential nutrients. In these oxisols, as they are known, most carbon and nutrients are stored not in the soil, as in temperate regions, but in the vegetation that covers it. When loggers, ranchers, or farmers clear the vegetation, the intense sun and rain quickly decompose the remaining organic matter in the soil, making the land almost incapable of sustaining life—one reason ecologists frequently refer to the tropical forest as a “wet desert.”

    Fruits of labor.

    Soils enhanced centuries ago underlie a flourishing papaya orchard near Iranduba, Brazil.


    Because terra preta is subject to the same punishing conditions as the surrounding oxisols, “its existence is very surprising,” says Bruno Glaser, a chemist at the Institute of Soil Science and Soil Geography at the University of Bayreuth, Germany. “If you read the textbooks, it shouldn't be there.” Yet according to William I. Woods, a geographer at Southern Illinois University, Edwardsville, terra preta might cover as much as 10% of Amazonia, an area the size of France. More remarkable still, terra preta appears to be the product of intensive habitation by precontact Amerindian populations. “They practiced agriculture here for centuries,” Glaser says. “But instead of destroying the soil, they improved it—and that is something we don't know how to do today.”

    In the past few years, a small but growing group of researchers—geographers, archaeologists, soil scientists, ecologists, and anthropologists—has been investigating this “gift from the past,” as terra preta is called by one member of the Iranduba team, James B. Petersen of the University of Vermont, Burlington. By understanding how indigenous groups created Amazonian dark earths, these researchers hope, today's scientists might be able to transform some of the region's oxisols into new terra preta. Indeed, experimental programs to produce “terra preta nova” have already begun.

    The research is still in an early stage, but last month attendees at the first large-scale scientific congress* devoted to terra preta argued that its consequences could be enormous, both for Amazonia and for the world's hot regions in general. Population pressure and government policies are causing rapid deforestation in the tropics, and poor tropical soils make much of the clearing as economically nonviable in the long run as it is ecologically damaging. The existence of terra preta, says Wim Sombroek, former director of the International Soil Reference and Information Center in Wageningen, the Netherlands, suggests “that some kind of sustainable, intensive agriculture is possible in the Amazon, after all. If we can learn the principles behind it, we may be able to make a substantial contribution to human welfare and the environment.”


    The good earth

    Terra preta is scattered throughout Amazonia, but it is most frequently found on low hills overlooking rivers—the kind of terrain on which indigenous groups preferred to live. According to Eduardo Neves, an archaeologist at the University of São Paulo who is part of the Iranduba team, the oldest deposits date back more than 2000 years and occur in the lower and central Amazon; terra preta then appears to have spread to cultures upriver. By A.D. 500 to 1000, he says, “it appeared in almost every part of the Amazon Basin.”

    Typically, black-soil regions cover 1 to 5 ha, but some encompass 300 ha or more. The black soils are generally 40 to 60 cm deep but can reach more than 2 m. Almost always they are full of broken ceramics. Although they were created centuries ago—probably for agriculture, researchers such as Woods believe—patches of terra preta are still among the most desirable land in the Amazon. Indeed, terra preta is valuable enough that locals sell it as potting soil. To the consternation of archaeologists, long planters full of terra preta, complete with pieces of pre-Columbian pottery, greet visitors to the airport in the lower Amazon town of Santarém.

    As a rule, terra preta has more “plant-available” phosphorus, calcium, sulfur, and nitrogen than surrounding oxisols; it also has much more organic matter, retains moisture and nutrients better, and is not rapidly exhausted by agricultural use when managed well.

    The key to terra preta's long-term fertility, Glaser says, is charcoal: Terra preta contains up to 70 times as much as adjacent oxisols. “The charcoal prevents organic matter from being rapidly mineralized,” Glaser says. “Over time, it partly oxidizes, which keeps providing sites for nutrients to bind to.” But simply mixing charcoal into the ground is not enough to create terra preta. Because charcoal contains few nutrients, Glaser says, “high nutrient inputs via excrement and waste such as turtle, fish, and animal bones were necessary.” Special soil microorganisms are also likely to play a role in its persistent fertility, in the view of Janice Thies, a soil ecologist who is part of a Cornell University team studying terra preta. “There are indications that microbial biomass is higher in terra preta,” she says, which raises the possibility that scientists might be able to create a “package” of charcoal, nutrients, and microfauna that could be used to transform oxisols into terra preta.


    Surprisingly, terra preta seems not to have been created by the “slash-and-burn” agriculture famously practiced in the tropics. In slash-and-burn, farmers clear and then burn their fields, using the ash to flush enough nutrients into the soil to support crops for a few years; when productivity declines, they move on to the next patch of forest. But Glaser, Woods, and other researchers believe that the long-ago Amazonians created terra preta by a process that Christoph Steiner, a University of Bayreuth soil scientist, has dubbed “slash-and-char.” Instead of completely burning organic matter to ash, in this view, ancient farmers burned it only incompletely, creating charcoal, then stirred the charcoal directly into the soil. Later they added nutrients and, in a process analogous to adding sourdough starter to bread, possibly soil previously enriched with microorganisms. (In addition to its potential benefits to the soil, slash-and-char releases much less carbon into the air than slash-and-burn, which has potential implications for climate change.)

    Gaining ground.

    Soil enhanced with charcoal and fertilizer did best in tests (top). Rich, dark terra preta contrasts with poor red soil (bottom).


    In a preliminary test run at creating terra preta, Steiner, Wenceslau Teixeira of the Brazilian Agricultural Research Enterprise, and Wolfgang Zech of the University of Bayreuth applied a variety of treatments involving charcoal and fertilizers to test plots of highly weathered soil at a site outside the central Amazonian city of Manaus. They then planted rice and sorghum in each plot for 3 years. In the first year, there was little difference among the treatments (except for the control plots, in which almost nothing grew). But by the second year, Steiner says, “the charcoal was really making a difference.” Crops in plots with charcoal alone grew little, but plots treated with a combination of charcoal and fertilizer yielded as much as 880% more than plots with fertilizer alone.

    The “Bambi syndrome”

    Researchers believe the best use of the newly revived technique will be in a kind of updated version of precontact indigenous agriculture, which used methods very different from slash-and-burn. According to a pathbreaking 1992 analysis by William Denevan, a geographer emeritus at the University of Wisconsin, Madison, the slash-and-burn agriculture practiced until recently by most Amazonian cultures is probably a recent invention. In contemporary slash-and-burn, farmers shift from plot to plot every 2 to 4 years. But field experiments by archaeologists in Amazonia indicated that clearing the forest with stone tools was so difficult that rapid movement among areas would have been impractical, if not impossible. “What they found was that for a single moderate to big hardwood tree it can take more than 30 times longer to cut down that tree with a stone ax than with a steel ax,” Denevan says. “I argued that this meant that Indians had to stay with a piece of land in precontact times for much longer than they do now and had substantially different agricultural regimes.”

    Rather than planting annual crops, the precontact inhabitants of the Amazon mostly practiced a type of agroforestry, argues Charles R. Clement, a plant geneticist at the Brazilian National Institute for Amazonian Research in Manaus. Initial paleoecological analyses of charred plant remains from the Iranduba archaeological site show, in addition to annual crops such as manioc and maize, the wood from at least 30 species of useful trees. “They put down annuals until the orchards grew,” suggests Clement. “We'll have to find some modern equivalent to Indian agroforestry. Otherwise creating new terra preta”—if scientists learn how to do it—“will simply lead to the same kind of clearing we have now, except the land will last longer.” Indeed, research in Amazonia by Laura German of the International Center for Research in Agroforestry in Nairobi, Kenya, has shown that over time the nutrients in terra preta, when poorly managed, can decline to near-oxisol levels.

    New terra preta farms, researchers acknowledge, will be subject to novel problems, especially weeds. In small central Amazon plots, German says, weeds grow so fiercely on terra preta that they overwhelm crops—they are a principal reason that farmers on ancient terra preta sites move their fields. New techniques to control tropical weeds will have to be developed, says Cornell weed scientist Antonio DiTommaso, much as scientists have created methods to manage temperate-zone weeds.

    Some researchers hope that the more intensive agroforestry possible on terra preta would allow landowners to spare more tropical forest, especially near cities like Manaus, where the organic waste now overflowing dumps could be burned to provide charcoal. It might even be possible to reclaim cleared land. But because the benefit of increased yields depends on quickly transporting produce and fruit to large markets, the increased costs of terra preta may not be economically viable in remote parts of Amazonia. In addition, Clement argues that any success with terra preta will simply lure more people to work with it and that those people will end up clearing forest in the process. “Terra preta is about making the current process of development more rational and sustainable, not about conservation,” he says. “It's about creating the conditions for the forest to return more quickly after it's cleared, not about preserving it from development.”

    Even if Clement's view is correct, examining terra preta is still worthwhile, according to Susanna Hecht, a geographer at the University of California, Los Angeles. “We have to get over this Bambi syndrome of seeing all development in the tropics as necessarily catastrophic,” she says. “People have been farming there—farming hard—for thousands of years. We just have to learn how to do it as well as they did.”

    *First International Workshop on Anthropogenic Terra Preta Soils, Manaus, Brazil, 13–19 July.


    The Forgotten People of Amazonia

    Charles C. Mann

    In 1542, when the Spanish conquistador Francisco de Orellana and his crew of 60 became the first Europeans to travel the 6000-km length of the Amazon River, they reported that the lower third of the river was full of “very large settlements,” with each village no more than “a crossbow shot” from the next. All along the way, according to the chronicle of the expedition, the riverbanks “bristled” with armed warriors. Few believed Orellana at the time, and indeed, later explorers saw no sign of large indigenous cities. But this absence, researchers have observed, was itself a puzzle. With its plethora of fruit and fish, the Amazon is an enormously rich environment. Why wouldn't the people who created complex societies in the arid heights of Peru have filled it up?

    An answer was provided in the 1950s by Smithsonian researcher Betty Meggers and her husband, the late Clifford Evans. Basing their conclusions on their archaeological studies at Marajó, a huge island at the mouth of the Amazon, Meggers and Evans concluded that Amazonia is a “counterfeit paradise”—a region with such intractably poor soils that it cannot long provide the agricultural base they argued was necessary to support materially advanced cultures.

    The arguments by Meggers and Evans cast such a large shadow that for decades after their publication few archaeologists explored the main Amazon. “Why would you look when you know ahead of time that you're going to find nothing?” asks Eduardo Góes Neves, an archaeologist at the University of São Paulo. “So for almost an entire generation almost nobody in archaeology came to even the most obvious places in the basin.” To Neves and a few other young archaeologists, the “obvious places” are deposits of terra preta: the Amazonian “black earth” that soil scientists believe was created by precontact indigenous settlements (see main text).

    Intrigued by accounts of terra preta, Michael Heckenberger, now at the University of Florida, Gainesville, and James B. Petersen, now at the University of Vermont, Burlington, decided in 1994 to explore the central Amazon for possible archaeological investigation. The first terra preta deposit that they investigated, on a high riverbank near the hamlet of Açutuba, was thick with artifacts. There were so many broken pieces of ceramic in the 3-km-long site, Petersen said, “that it was a nuisance for farming.”

    Excavating at Açutuba in 1995, 1997, and 1999, Heckenberger and Petersen were joined by Neves and many of his students from São Paulo. Robert N. Bartone, an archaeologist at the University of Maine, Farmington, later joined the team, which calls itself the Central Amazon Project. Ultimately they excavated four sites intensively and explored another 30, all near the junction of the Amazon and the Rio Negro. On the evidence of carbon-dated ceramics, they argue that Açutuba was inhabited in two waves, from about 360 B.C., when terra preta formation began, to as late as A.D. 1440. “We haven't finished working, but there seems to be a central plaza and some defensive ditches there,” Petersen says. The plaza was at least 450 m long; the ditch, more than 100 m long and up to 6 m wide and 2 m deep. His conclusion: “Açutuba was a big, permanent settlement.”

    Ancient abundance.

    Archaeologist Eduardo Góes Neves examines abundant ceramic fragments near Iranduba, Brazil.


    The scale of Amazonian settlement, the team members believe, is demonstrated by their dig at Hatahara, a farm near the riverside village of Iranduba, about 40 km southeast of Açutuba. There they partially excavated the largest of 10 low, human-made mounds dating from about A.D. 900. They found a burial urn and its occupant in the center of the mound and eight more bodies 10 meters away, all nine apparently interred at the same time. Believing it unlikely that the group's first small dig hit the only concentration of human remains, Neves argues that the entire mound is likely to be full of burials—hundreds of them. “That suggests thousands of people lived here,” he says.

    Another indication of the site's population, Neves notes, is the huge number of ceramic fragments in the mound, many of which seem to have been deliberately smashed to build up its surface. According to a “rough, back of the envelope-type” calculation by Petersen, who specializes in ceramics, this single mound might contain more than 40 million potsherds. “Think of the industry required to produce that much pottery,” Neves says. “Then they just smash it. Look at the way they piled up this good soil—it's all wasteful behavior. I don't think scarcity was a problem here.”

    After the Central Amazon Project published its initial findings, Meggers sharply attacked them last fall in the journal Latin American Antiquity. Charging that the team members had ignored data from long-term, Smithsonian-backed surveys, she argued that they had confused multiple small reoccupations of the same site with continuous large occupation. The Central Amazon Project replied that the surveys, many of which remain unpublished, mainly involved “brief episodes of fieldwork at small samples … along vast stretches of major rivers,” not the detailed work they had performed.

    Even larger than the terra preta site at Açutuba is a deposit on a long bluff overlooking the mouth of the Tapajós, a lower Amazon tributary near the present-day town of Santarém. First mapped in the 1960s by Wim Sombroek, former director of the International Soil Reference and Information Center in Wageningen, the Netherlands, the zone of terra preta is more than 7 km long and 1 km wide, suggesting widespread human habitation. Indeed, when Orellana passed by the Tapajós, he reported that so many people poured down from the bluffs to meet him that the expedition turned tail and fled.

    The plateau has never been carefully excavated, but observations by geographers William I. Woods of Southern Illinois University, Edwardsville, and Joseph McCann of the New School University in New York City indicate that most of the Tapajós site is filled with ceramics—much like Açutuba, but even larger. If the agriculture practiced there was roughly as intensive as in the most complex cultures in precontact North America, Woods estimates, “you'd be talking about 200,000 to 400,000 people a few centuries before the Spanish came”—a city about the same size as Tenochtitlán, the Aztec capital, which then was probably the biggest city in the world. “Think of it,” says Woods, “a population on the same order as Tenochtitlán, at about the same time, here at the mouth of a river nobody has even heard of.”


    'Show Us the Cells,' U.S. Researchers Say

    Constance Holden and Gretchen Vogel*

    *With reporting by Pallava Bagla in New Delhi and Mark Russell in Seoul.

    One year after President Bush announced that some 60 human embryonic stem cell lines were available, U.S. scientists have their hands on just four

    Ali Hemmati-Brivanlou, a molecular embryologist at Rockefeller University in New York City, has been trying since last September to obtain samples of all the cells listed on the National Institutes of Health's (NIH's) registry of “available” human embryonic stem cell lines—which at the beginning of this month numbered 71. The results: two viable lines, one from WiCell in Madison, Wisconsin, and one from ES Cell International (ESI) in Melbourne, Australia. (ESI sent him two lines, but the other one won't grow, he says.) “Everybody has their own reasons why they should not be sending things out,” says Brivanlou.

    One year has passed since President George W. Bush announced, after much deliberation, that he would allow federally funded researchers to work with human embryonic stem (hES) cell lines—as long as the cells had been derived before he began his speech at 9:00 p.m. on 9 August 2001. The cell lines, which can in theory develop into any type of cell in the body and thus might someday be useful for treating disease, are controversial because their derivation requires the destruction of week-old embryos. In his speech, Bush also announced that “more than 60” such cell lines were available, taking the research community by surprise. Until then, most researchers suspected that perhaps a dozen hES cell lines had been derived. But a worldwide survey by NIH had turned up at least 64 cell lines on four continents, NIH officials said.

    A year later, the scientists' conservative estimate still seems closer to the mark. Although the NIH list has grown to include 71 “eligible” cell lines—derived in accordance with certain ethical standards before the specified date—practical and legal hurdles have kept most of the lines in the labs where they were derived. And because relatively few have been fully characterized, it's not clear that all of them are in fact bona fide hES cells. So far, just 16 cell lines are available for distribution, according to their proprietors. Of these, at most four are actually in the hands of U.S. researchers who aren't collaborating with the labs that derived the cells; another seven or so lines are expected to be available to the scientific public in the next few months.

    “The whole thing is going pretty slowly,” says a scientist who asked not to be identified—and who would like to use up to 10 cell lines in various experiments. He blames the delay on extensive negotiations over rights to the cells and layers of NIH bureaucracy. Even so, much of the community seems to agree with George Daley of the Massachusetts Institute of Technology's Whitehead Institute that, despite the slow progress, “NIH has been doing the best it can.”

    In an attempt to speed access to the cell lines, NIH has crafted a model materials transfer agreement (MTA) and funded a half-dozen groups that have derived cell lines so they can ramp up production. The agency has also procured cells for six intramural labs and given supplementary funds to close to 20 researchers so they can add hES cells to their ongoing research.

    But none of these efforts can ensure the quality of the cell lines, many of which are not ready for prime time. A San Diego company called CyThera, for example, is listed as having nine lines, but none is available yet. “We first have to find out whether the derivations will result in bona fide human embryonic stem cells,” says the company's president Lutz Giebel. Of the 19 lines listed at the University of Göteborg in Sweden, only three will be available in the near future, says neuroscientist Peter Eriksson; 10 others are on hold until the Swedish researchers develop new protocols for growing them more easily. And at Stockholm's Karolinska Institute, all six NIH-approved lines are frozen while work focuses on newer lines. “It is an open question if the ‘NIH’ lines can be successfully thawed,” says researcher Michael Andäng.

    Cell waiting.

    Brivanlou, shown here in front of images of Xenopus ova and embryos, has so far received only two cell lines.


    Brivanlou points out that both commercial and academic cell providers have little or no incentive to supply cells to competing groups—particularly at the modest going rate of $5000 per sample. Many of the labs holding the cells “are not at the outset thinking of supplying the scientific community,” Wendy Baldwin, NIH's deputy director for extramural research, admitted to the president's bioethics council in early July. Some plan to supply only collaborators, she said.

    Right now, four groups are emerging as the main suppliers of hES cells to U.S. researchers: WiCell; ESI; the University of California, San Francisco (UCSF); and the Athens, Georgia, branch of the Australian company BresaGen. WiCell and ESI are already distributing cells; the others plan to come on line soon.

    NIH has signed MTAs—which assign rights to discoveries made with the cells—with all four groups. (The agreements apply to NIH employees and serve as a model for researchers at other institutions.) And all four have NIH grants to help them characterize and boost their cell supplies. “It's very labor-intensive to expand those lines,” says Andrew Cohn of the Wisconsin Alumni Research Foundation (WARF), which owns WiCell. Jennifer O'Brien, a spokesperson at UCSF, explains that hES cells are far trickier to grow than mouse cells. They grow more slowly, and they also require so-called feeder cells to keep them in their undifferentiated state. They have to be watched constantly: If colonies are too sparse, growth is retarded; if cells are grown too close together, they differentiate.

    WiCell now dominates the landscape with its cells, which were first derived at the University of Wisconsin, Madison, in 1998. WiCell also holds a U.S. patent on hES cells—an issue that has complicated some MTA negotiations. Cohn says that WiCell has executed agreements with 90 researchers, more than 70 of them in the United States. So far, they've sent out 57 batches of cells. Once the agreement is signed and payment is in hand, Cohn says, the cells ship within a week. But although the registry says WARF has five eligible cell lines, Cohn says they've only been sending out one type. The other four should be available by January, he says.

    The other current supplier for U.S. researchers is ESI, a company created by researchers at Monash University in Australia, whose cell lines were derived at the National University of Singapore. ESI is listed as having six lines, four of which are viable and available, according to Chief Operating Officer Catriona King. The company has provided cells to 30 groups of researchers, including 10 in the United States. Another 30 deals are in the works, she says. It's slow going, says Daley, who has been waiting for 2 months for three ESI cell lines he has requested.


    This fall, UCSF will begin distributing one of its two cell lines, according to O'Brien. The fourth outfit, BresaGen, has so far sent its cells to only “two or three” collaborators, says Mike McDonell, vice president for cell services. McDonell says BresaGen has characterized all four of its lines but at the moment has just one growing for distribution. The company has been waiting for an NIH grant that will enable it to scale up production. “Getting federal funding is the only way that we'll be able to pursue the development of the cells, and I think that's true of most of those who have cells on the registry,” agrees Jeanne Loring, founder of Arcos, the parent organization of CyThera.

    Questions about commercial rights to the cells have also dampened some researchers' enthusiasm. “The whole [intellectual property] issue hangs like a pall over a lot of the work,” says Daley. WARF's broad patent covers all import and use of hES cells in the United States—so technically, institutions that send cells to U.S. researchers could run afoul of WARF's patent. WARF has said it will defend its patent claims and the rights of its licensee, Geron Corp. in Menlo Park, California. But it has also said it has no objection to U.S. researchers acquiring cells from other sources for research purposes as long as the agreement is “substantially similar” to the agreement between WiCell and NIH. Any exceptions require a separate agreement between the cell provider and WARF, says Cohn, noting that both ESI and UCSF already have such agreements.

    To respond to researcher concerns about how the government is handling stem cells, Baldwin says NIH director Elias Zerhouni has appointed a “SWAT team.” NIH will also soon be adding details to its stem cell registry, says Baldwin—for instance, information on publications and names of researchers using particular lines. Baldwin suspects that the list will ultimately include about 80 lines.

    Ascertaining how many of the cell lines actually work as advertised will take longer. To nudge this process along, Brivanlou is organizing a conference, to be held in November at the New York Academy of Sciences, where he hopes researchers can agree on a set of molecular markers to be used to evaluate stem cells. “If I send for a cell line from India, how do I know they are really undifferentiated?” he says. With mice and other vertebrates, “one can test molecular markers and correlate them directly” with the development status of the embryo. With hES cells, though, “this is uncharted territory.”


    Regulations Constrain Stem Cell Research Across the Globe

    1. Gretchen Vogel*
    1. With reporting by Michael Balter in France, Pallava Bagla in New Delhi, Xavier Bosch in Barcelona, Dennis Normile in Tokyo, and Mark Russell in Seoul.

    U.S. researchers aren't the only ones facing delays acquiring human embryonic stem (hES) cells (see main text). Many of their colleagues in Europe and Asia are constrained by similar government regulations, and some still face outright bans on the work, which is controversial because it entails the destruction of an embryo but promises new types of treatments for a host of diseases. Following is a snapshot of various regulations, many of which are still in flux.

    European Union: The E.U. has been struggling to find common ground among member states with very different laws. The latest proposal, put forward last week by the Danish presidency of the E.U., would impose a moratorium on the derivation of new hES lines until at least 31 December 2003. Specifically, the proposal would preclude support by the E.U.'s new $17 billion Framework 6 research program for any new research that would harm embryos. Like the Bush compromise, the proposal—expected to be accepted—would allow research to continue on existing hES cell lines.

    United Kingdom: As amended in 2001, the law governing embryo research allows the use and derivation of hES cell lines, as well as the use of nuclear transfer techniques to create such cells. Despite the liberal regulations, scientists don't seem to be lining up for the chance to derive new lines. The national oversight body, the Human Fertilisation and Embryology Authority (HFEA), has granted only two licenses for such work, and a spokesperson says no applications are pending.

    Netherlands: Under a new law that will take effect in September, research is allowed on human embryos when there is no reasonable alternative. The law prohibits so-called reproductive cloning.

    Sweden: Swedish law allows embryo research, and several groups have derived hES cell lines that are eligible for funding by the U.S. National Institutes of Health. In fall 2001, a national advisory committee recommended leaving current guidelines in place but outlawing reproductive cloning.

    Belgium and Israel: Research on embryos is allowed, under rules similar to the liberal Swedish policies.

    Germany: A new law permits scientists to import hES cell lines that were created before 1 January 2002, if approved by a national review committee. The committee will soon take up its first application, with two more expected to follow. That first application, from Oliver Brüstle and colleagues at the University of Bonn, might be approved this fall—2 years after Brüstle first applied for federal funds for the project, igniting heated public debate.

    France: A proposed law that would remove the ban on human embryo research—it would allow researchers to use embryos for medically important research—was put on hold when a new government came to power in June. The new government also suspended a regulation that would have let researchers import hES cell lines in the interim, until the law was passed. The health minister, Jean-François Mattei, has said he will not push for drastic changes in the proposed law. But his view might not be shared by President Jacques Chirac, who is opposed to embryo research.

    Stuff of life.

    Dense, rounded masses of human embryonic stem cells rest on elongated “feeder” cells.


    Spain: Research on viable human embryos is banned. However, the 1988 law does not mention stem cells, so a few scientists assumed they would be allowed to work with lines derived elsewhere. When their research on hES cells made front-page headlines, the government stopped the work; the health ministry claims no work is now under way on hES cells. An advisory body created in late July will compare the promise of stem cells from adults and embryos and propose new rules governing the research.

    Italy, Austria, and Ireland: Embryo research, including work on hES cells, is not allowed.

    Canada: Since March, Canadian scientists have been able to derive hES cells from human embryos left over after fertility treatment. The health ministry has ruled out research on human somatic cell nuclear transfer—whether for research leading to the derivation of stem cells or for reproductive purposes (Science, 8 March, p. 1816).

    Singapore: Although a half-dozen hES cell lines were derived in Singapore, the legal status of embryonic research is murky, as no laws cover the topic. In July, the government accepted the recommendations of the Bioethics Advisory Committee, permitting hES cell research and therapeutic cloning under the oversight of a national licensing body. Legislation, which includes a complete ban on human reproductive cloning, is being drafted.

    Japan: A 2-year-old law bans all human cloning experiments. Guidelines enacted last fall (Science, 3 August 2001, p. 775) permit the creation and use of hES cells. Researchers must receive approval from a local and a national ethics review board.

    Korea: Proposed legislation, expected to be introduced this fall, would allow research on surplus human embryos but prohibit reproductive cloning. Deciding whether to allow the use of somatic cell nuclear transfer procedures to create stem cells will be left to a bioethics committee not yet in place.

    India: No specific law governs embryo research, although guidelines issued in 2000 permit derivation and use of hES cells, subject to the approval of the bioethics committee at the scientist's institution. It is unclear whether scientists who develop hES cell lines will be able to send them abroad.

    Australia: A nationwide policy might soon replace the current patchwork of state regulations. A proposed law, expected to be debated this month, would permit researchers to work with hES cells and derive new cell lines from excess embryos created for fertility treatments. All research on nuclear transfer in human cells would be outlawed (Science, 12 April, p. 238).


    Water Scarcity: Forecasting the Future With Spotty Data

    1. Kathryn Brown

    While global water models warn of parched days ahead, scientists worry that another pressing scarcity is information

    Headlines warn that catastrophic thirst will soon choke northern China and mire the Middle East in more war. But a growing number of scientists say there's already a crisis—only it's not just water that's missing. It's information.

    Take Nairobi, Kenya's biggest city. About half the water delivered to Nairobi each year disappears—presumably lost to leaky factory pipes, irrigated corn fields, and other unmeasured drains. The data dearth is similar in Mexico City, Seoul, and Tehran, where no one knows where roughly a third of the water supply goes.
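
    Water utilities have a name for this missing fraction: unaccounted-for water, the share of supply that can never be traced to a metered use. The sketch below, in Python, shows the bookkeeping; the annual volumes are invented, chosen only to echo the rough shares quoted above.

        def unaccounted_for_water(supplied_m3, accounted_m3):
            """Fraction of supplied water that cannot be traced to any metered use."""
            return (supplied_m3 - accounted_m3) / supplied_m3

        # Hypothetical annual volumes (cubic meters), picked to mirror the
        # rough fractions in the text: ~50% for Nairobi, ~33% for Tehran.
        print(f"Nairobi-like system: {unaccounted_for_water(200e6, 100e6):.0%} unaccounted for")
        print(f"Tehran-like system:  {unaccounted_for_water(900e6, 600e6):.0%} unaccounted for")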

    In fact, in countries rich and poor, water data are often based on patchy estimates. Answers to even simple questions remain elusive: How much local water, on average, is there? How much, precisely, is going to farmers versus city dwellers? The knowledge gap shows little sign of closing.

    History shows how a data scarcity like this can hamper forecasts. During the 1990s, for instance, the amount of water actually withdrawn for human use worldwide was just half of what modelers had anticipated 30 years earlier. “The track record of water use projection has been abysmal,” remarks Robert Hirsch, head of hydrology at the U.S. Geological Survey (USGS).

    Along with confusion, water miscalculations have brought nasty surprises, such as conflict over the Colorado River, which in most years actually offers less total water than legally allotted to Western states and Mexico. “If you can't measure water, you can't manage it,” remarks Arthur Askew, director of hydrology at the World Meteorological Organization (WMO). In a data crisis, perhaps the only sustainable thing is uncertainty.

    Blurry data

    Water scenarios usually start with a fundamental fact: Of the roughly 1.4 billion cubic kilometers of water on Earth, less than 1% is fresh water that people can use. And it's not divided evenly. Water is a local treasure, prey to local problems, from pollution and politics to waste. “Overall, yes, there is still plenty of water on Earth,” says Jean-Marc Faurès, a water specialist at the U.N. Food and Agriculture Organization in Rome. The poverty occurs at home.

    Riding dry.

    According to one recent projection, nearly half the world's population will live in water-stressed river basins (above, China's Yellow River) by 2025.


    At this local level, accurate water estimates depend on supply and demand. But collecting such data is time-consuming and decidedly unglamorous. “If you open a large dam, your minister of public works will be pleased to cut the ribbon,” says Askew. “If, however, you install a stream gauge in the middle of a piece of grass in the back of a post office, nobody will inaugurate it. But after 20 years, that one gauge may have collected vital data essential for building a $100 million water project somewhere.”

    To assess how much water is available locally, governments typically rely on scattered networks of stream gauges, river runoff monitors, and similar tools. But an ad hoc group of the International Association of Hydrological Sciences reported 2 years ago that this infrastructure has begun to crumble: During the 1990s, water-monitoring networks shrank by roughly 90% in African countries, 30% in the former Soviet Union, and 20% in Canada. Some scientists are optimistic that remote-sensing satellites will soon be able to provide better estimates of the surface water available in watersheds, but Askew contends that on-the-ground monitors are still irreplaceable.

    And supply is just half the equation. “Most of the water scarcity arguments are based on availability,” says Peter Gleick, head of the Pacific Institute for Studies in Development, Environment, and Security in Oakland, California. “We really ought to be looking at water use. A country might be water-rich if you look at the resource, but some people may have very low use and others enormous use.” He points to the Sudan, which in theory has plenty of water available from the Nile River—but in reality passes on much of its water to Egypt and struggles to access the rest amid civil war.

    Tracking actual water use is even harder than estimating availability. “Water-use data are among the most unreliable in many developing countries because nobody actually measures how much water is taken from every well, river, and aquifer,” Faurès says. Instead, he adds, public ministries typically estimate how much land is irrigated, say, and then calculate how much water is used per hectare. Says Faurès: “It's not a measure but an estimate.”
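
    The calculation Faurès describes is easy to make concrete. In the sketch below, both inputs are themselves hypothetical estimates (no ministry figures are given in the text), so the reported "use" inherits all of their uncertainty.

        # Hypothetical ministry figures: neither input is measured directly.
        irrigated_area_ha = 1_200_000   # estimated irrigated land, hectares
        water_per_ha_m3 = 9_000         # assumed application rate, m3 per hectare

        reported_use_m3 = irrigated_area_ha * water_per_ha_m3
        print(f"Reported agricultural water use: {reported_use_m3 / 1e9:.1f} km3 per year")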

    Then the guesswork continues. Strapped by limited budgets and staff, international keepers of water data, such as WMO and the United Nations, usually have no way to verify these country-by-country statistics.

    It's not surprising that water data are often a low priority for developing nations, where millions struggle simply to find shelter and food. But rich countries also lack complete data. Much of the United Kingdom, for instance, does not use water meters—a classic source of consumption data elsewhere.

    Even in the United States, water information is relatively spotty. Since 1950, USGS has published national water-use data every 5 years. But the agency essentially reports each state's own estimates of groundwater and surface water withdrawals for irrigation, livestock, homes, and the like. The quality of state data varies considerably, depending on local politics and priorities, concedes Hirsch. Arkansas, for instance, has a much stronger water-data program than Washington.

    All washed up.

    Many earlier projections of future water use were clear overestimates. Today's estimates may also be unreliable.


    On the national level, the USGS National Water-Use Information Program, as it's known, does minimal analysis of overall trends, Hirsch says, because the survey can't afford it. The program has recently applied to Congress for additional funding.

    Water pressure

    Without good water data today, it's hard to know what will happen tomorrow. To make matters worse, conventional water models oversimplify the dynamics of water demand—often resulting in doomsday scenarios similar to those of the last century. By 2025, according to the World Resources Institute's 2000 Pilot Analysis of Global Ecosystems, at least 3.5 billion people, or 48% of the world population, will live in water-stressed river basins. But that projection, like many others, assumes for simplicity that current water consumption patterns will continue.
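
    The shortcut is simple to state: hold today's per-capita use fixed and scale it with population. A toy sketch, with invented baseline numbers, shows how completely that one assumption drives the headline figure.

        def fixed_pattern_projection(base_use_km3, base_pop, future_pop):
            """Scale water use with population, holding per-capita use constant."""
            return base_use_km3 * future_pop / base_pop

        # Hypothetical global baseline and population path, for illustration only.
        demand_2025 = fixed_pattern_projection(3800.0, 6.1e9, 7.9e9)
        print(f"Projected 2025 demand: {demand_2025:.0f} km3")
        # If per-capita use instead falls 10%, the projection overshoots
        # by that full margin; the assumption, not the data, sets the number.
        print(f"Same projection, 10% efficiency gain: {0.9 * demand_2025:.0f} km3")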

    Is that a fair assumption? At least in the United States, Hirsch says, annual water consumption has fallen by more than 10% since 1980. He attributes the savings to increased water efficiency, from home appliances and savvy landscaping to water pricing regulations. Europe has seen similar gains.

    In fact, Gleick says, a little water efficiency can go a long way for any population, wealthy or poor. “When Mexico City replaced 350,000 leaky toilets with more efficient ones, they saved enough water to meet the needs of 250,000 new residents,” he notes.
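
    Gleick's example is ordinary conservation-of-volume arithmetic. A rough sketch with assumed rates (the article gives only the two totals) lands near the figure he cites.

        # Hypothetical rates: swapping ~13 L/flush toilets for ~6 L/flush
        # models, at ~15 flushes per toilet per day, against a demand of
        # ~130 L per resident per day.
        toilets_replaced = 350_000
        litres_saved_per_toilet_day = (13 - 6) * 15   # 105 L/day per toilet
        demand_per_resident_day = 130                 # L/day per resident

        saved = toilets_replaced * litres_saved_per_toilet_day
        print(f"Residents supportable: {saved / demand_per_resident_day:,.0f}")

    With those assumed rates the arithmetic yields roughly 280,000 residents, in the neighborhood of the 250,000 Gleick quotes.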

    On a broader scale, such savings could add up. Factoring efficiency gains and conservation into a global water forecast, Gleick projects that total water demand in 2025 need not greatly exceed today's. “We are moving away from the assumption that increased well-being requires exponential increases in water,” Gleick says. “This is the most important point in the whole question of forecasting today.”

    Information is key. In Sri Lanka, local water managers and scientists from the International Water Management Institute are using satellite imagery to track crop growth and the amount of water used on crops, among other variables. The goal is to figure out where water is most needed—and how much is currently wasted. “Using this information, we can see our water use situation clearly,” reports Sri Lankan irrigation manager H. M. Jayatillake. But Sri Lanka is just one tiny corner of a world in which water remains largely a mystery.

    HIV/AIDS

    Malawi: A Suitable Case for Treatment

    1. Jon Cohen

    One country's efforts to secure help in tackling its AIDS epidemic indicate the gulf between needs and the resources available to meet them

    BARCELONA—One statistic haunted the international AIDS conference here last month: 40 million, the number of people estimated to be infected by HIV around the world. The unfolding catastrophe reflected in that figure had pricked the world's conscience at the previous international AIDS meeting in Durban, South Africa, 2 years ago. This year's gathering provided the first real opportunity to evaluate how well the world has responded. The resounding answer, in session after session and from protesters and panelists alike: not well enough.

    There has been some progress. In January, a new multinational organization, the Global Fund to Fight AIDS, Tuberculosis, and Malaria, opened its doors. Largely through its work, the World Health Organization (WHO) predicts that anti-HIV drugs could reach at least 3 million people in developing countries by 2005—10 times the number now being treated. But that still leaves a yawning gulf. And for countries bearing the brunt of the epidemic, getting desperately needed help is turning out to be a difficult and frustrating exercise. Just ask officials from Malawi.

    A small landlocked country in southern Africa with a per capita income of just $200 a year, Malawi is facing an unimaginable public health crisis. HIV has infected an estimated 1 million Malawians, 16% of the country's adult population. In an interview in October 2000, Malawi's vice president, Justin Malewezi, said that unless his country confronts HIV and AIDS more aggressively, “there will not be any Malawi in the future.” In many respects, Malawi's experience in putting together a plan to deal with its mounting AIDS disaster symbolizes the difficulties that poor countries and international organizations face in confronting the epidemic with inadequate funds and limited health services. In the end, after repeated interactions with donor organizations and the Global Fund, Malawi was forced to whittle down an ambitious plan to one that will barely make a dent in its problems.

    As Malewezi recounted in a talk here, soon after the Durban meeting, Malawi began assessing its needs and how to secure the resources to meet them. For 2 days in October 2000 an international group of experts met in Malawi's capital, Lilongwe, to begin hashing out a plan. “The first day was simply trying to break through the feeling that treatment was an impossibility and shouldn't even be considered,” recalls Peter Salk, scientific director of the foundation named after his father, polio vaccine pioneer Jonas Salk. Other participants included representatives from the Joint United Nations Programme on HIV/AIDS, the U.S. Centers for Disease Control and Prevention, WHO, Doctors Without Borders, the United Nations Development Programme, the World Bank, the University of Maryland's Institute of Human Virology, and even an AIDS advocacy group, Search for a Cure, based in Boston, Massachusetts.


    AIDS patient in Malawi, where 16% of adults are infected.


    By the following August—with additional input from Harvard University's Center for International Development, the Boston-based Management Sciences for Health, and the U.K.'s Liverpool School of Tropical Medicine—Malawi had a detailed proposal that called for steadily increasing treatment and prevention efforts over 7 years. In all, 300,000 Malawians would receive anti-HIV drugs, as well as treatments to prevent the opportunistic infections of AIDS. Prevention programs would be stepped up, the health system's infrastructure would be strengthened, and health care providers would receive salary supplements. The price tag: $1.6 billion.

    It would be an enormous sum for a single country, but there was reason for optimism. In a landmark speech in April 2001, U.N. Secretary-General Kofi Annan urged the world to form a new fund, which he said would need $7 billion to $10 billion each year to combat HIV/AIDS alone, and international donors were discussing the creation of what eventually became the Global Fund. In mid-September, Anne Catherine Conroy, a special assistant to Malewezi, and others from Malawi took their plan to experts at WHO. They got a splash of cold water.

    “The tactical advice they gave us was that just after September 11, the Global Fund is not going to even get $7 billion,” Conroy recalls. The WHO experts, who also had concerns about Malawi's capacity to handle the huge influx of funds, suggested that the request be scaled back to $500 million over 5 years. That would be enough to treat only 100,000 people.

    Conroy and others heeded WHO's advice. They then took their scaled-back plan to donor agencies that had long helped the country with its health care needs. “There was a real sense that antiretrovirals were not appropriate for Malawi and that we should be focusing on prevention alone,” says Conroy, ruefully noting that only 25% of the budget was for treatment. Donors also worried that the proposal was “too ambitious,” says Conroy, and that it overlapped some existing efforts.

    Rich irony.

    Sub-Saharan countries such as Malawi bear the greatest HIV/AIDS burden but have the fewest people receiving antiretroviral drugs.


    The formation of the Global Fund in January 2002 provided a new focus for Malawi's efforts. It was already clear, however, that the fund would come up well short of the $7 billion to $10 billion Annan had called for. (It has so far raised just $2.1 billion; see chart.)

    The fund required each proposal to have the endorsement of a broad coalition, including the government, academics, the private sector, and donors. At the insistence of the donor representative in its coalition, says Conroy, Malawi cut back its request again, to $306 million—enough to treat up to 40,000 people over 5 years. The independent Technical Review Panel that screened proposals for the Global Fund then told Malawi to scale back further still.

    The final plan, which the Global Fund expects to approve in the next few weeks, calls for expenditure of $200 million and treatment for just 25,000 people, less than 3% of Malawi's infected population. “We thought this was a serious misreading of the Malawi situation,” says Conroy. But they made the changes. “Do we want to get the money, or do we want to be complaining?” she asks rhetorically. Malewezi is more blunt: “Such a limited access to antiretroviral treatment will only exacerbate tensions in the country, raising the stakes for those left out,” he warned at the Barcelona meeting. “Political parties may exploit the situation, thus threatening stability and radicalizing the political landscape.”

    Fair share?

    Contributions to the Global Fund as a share of GDP. Actual amounts are indicated above each bar.


    At the Barcelona meeting, health economist Jeffrey Sachs, who helped Malawi with its proposal, blasted the process. “Because the Global Fund didn't have the money, agents of the donors were out there twisting the arms of the countries saying, ‘Don't ask for that much,’” charged Sachs, who recently left Harvard to head the Earth Institute at Columbia University. “We will be out in force all over the world to make sure that it does not happen again.”

    Richard Feachem, who became executive director of the fund in July, strongly challenges the notion that countries should tailor proposals to fit the Global Fund's budget. “The fund does not encourage country applicants or their advisers to artificially scale back the size of proposals in anticipation of what may or may not be available,” says Feachem, an epidemiologist who founded the University of California's Institute for Global Health. Feachem notes that even Malawi's scaled-back proposal calls for an immediate increase of 30% in what the country now spends on health, and that figure rises to over 100% in less than 4 years. There is a real question, he says, whether Malawi's health care system could absorb a bigger infusion of cash.

    Malawi actually is one of the luckier applicants to the Global Fund. Of the 322 proposals the fund received for its first round of awards, Malawi's was one of only 58 to make the cut. The real problem, says Feachem, is that “the Global Fund needs a massive increase in resources, and it needs it quickly.” That reality now haunts everyone fighting AIDS in the developing world.

    Can Sensors Make a Home in the Body?

    1. Robert F. Service

    Researchers dream of implanting sensors and other devices that interface with the body's biochemistry to treat diseases such as diabetes and even predict coming ailments. The immune system, however, is not so keen

    In early May, Jeff and Leslie Jacobs and their 14-year-old son Derek took a small step toward humanity's cyborg future. The Jacobses became the first people to receive tiny computer chip implants that store medical data, similar to the chips implanted in pets to identify them if they are lost. Jeff Jacobs, a 48-year-old dentist, has fought off numerous illnesses including cancer and a degenerative spinal condition, and he currently takes 10 different medications. He hopes that the chip under the skin of his upper arm could provide doctors with a detailed medical history in an emergency.

    Implanting things in the body is nothing new, of course. Artificial hips and pacemakers have been around for decades, and researchers have long dreamed of making everything from bionic arms, ears, and eyes to long-surviving artificial hearts. Most of those dreams of integrating the artificial with the biological still remain more fiction than science. But some might be on the verge of becoming real.

    By fits and starts, researchers and companies are beginning to devise “smart” implantable sensors that interface with the body's biochemistry and, in some cases, respond to changes in it. If successful, the novel implants could track early signs of diseases such as cancer, vastly improve the ability of people with diabetes to monitor and control their blood glucose levels, and even monitor internal healing following surgery. Implanted sensors, says Mauro Ferrari, a biomedical engineer at Ohio State University, Columbus, will enable doctors to continuously scan our bodies for signs of disease and begin treatment even before symptoms appear. “They will dissolve the barriers we have between diagnostics and therapeutics,” he says.

    But Ferrari and virtually all others who are working to develop implantable biosensors agree that success isn't coming quickly or easily. The key challenge is that virtually every implanted material triggers the body's defenses to wall off and isolate nonnatural materials, making a sensor essentially blind to the biochemistry outside those walls. And even if researchers overcome that hurdle, financial and ethical obstacles are lying in wait. Even so, many researchers are optimistic. The time when we all have some artificial device implanted in our bodies might be slow in getting here, says Jeffrey Borenstein, a biosensors expert at the Massachusetts Institute of Technology. However, he says, “I think the day is coming.”

    Sweet sensation

    The dream of reaching this day goes back centuries. Ever since Mary Shelley unleashed Frankenstein on the literary world in 1818—at the height of the Industrial Revolution—fiction writers have imagined constructing humans using the technology of their day. Recent fictional mergers of implants with biology range from the Six Million Dollar Man to the villainous Borg characters of Star Trek: The Next Generation.

    For today's real-world sensor developers, the big carrot is finding a way to let people with diabetes measure their glucose level without having to prick their finger to get a drop of blood. Diabetes now afflicts some 16 million Americans, and the test strips for analyzing those drops of blood are a $2-billion-a-year business.

    Biosensing companies have long hoped to tap into that market. In the mid-1990s, two companies—Futrex Medical Instrumentation and Biocontrol Technology—made a high-profile push to produce the first noninvasive glucose sensors, using techniques such as shining infrared beams through the skin. But their products proved too unreliable for the U.S. Food and Drug Administration and wound up costing investors hundreds of millions of dollars. “This field has a checkered history,” which poisoned the water for many sensor developers, says Russell Potts, who heads research at Cygnus, a California company developing a new glucose-monitoring system.

    Sensor designers realized that it was going to be necessary either to extract some bodily fluid or to go in and make the measurement in situ. But then they ran up against the problem that bedevils everyone who tries to make implanted biosensors: the immune response. Once any artificial material enters the body, a natural inflammatory response immediately kicks in, coating the intruder's surface with thousands of proteins and marshaling immune sentries called neutrophils and macrophages to remove it.

    When that can't be done, as in the case of a medical implant, the cells in this first wave typically merge and release a cascade of signaling proteins called chemokines and cytokines, which summon fibroblasts and other cells into the area. Within days, the newcomers form a tight-knit capsule around the implant, sealing it off from the rest of the body and making it virtually impossible for the sensor to track the surrounding biochemistry. Outwitting this natural immune response “is the most formidable challenge in this area,” says David Grainger, a chemist at Colorado State University, Fort Collins.

    One solution is to pull out the sensor before the fibrous capsule forms around it. In 1999, for example, Northridge, California-based Medtronic MiniMed began selling a tiny enzyme-based sensor that is implanted for up to 3 days at a time under the skin, where it measures glucose in the fluid that surrounds cells in skin tissue. The sensor is wired to a readout device outside the body and performs steadily until the body seals it off. Still, from a user's point of view, the Medtronic sensor is less than ideal. Users must have new sensors inserted every few days, making the devices impractical as permanent glucose monitors. And because only part of the sensor is implanted, there is always a risk of infection.
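
    An enzyme-based sensor of this kind reports an electrical current, not a glucose number; the current must be mapped to concentration, typically by calibrating against finger-stick blood readings. The Python sketch below shows a minimal version of that mapping; the paired readings are hypothetical, and the actual calibration scheme of any particular device is not described here.

        def calibrate(signal, reference):
            """Fit a least-squares line mapping sensor current to glucose."""
            n = len(signal)
            mx, my = sum(signal) / n, sum(reference) / n
            slope = (sum((x - mx) * (y - my) for x, y in zip(signal, reference))
                     / sum((x - mx) ** 2 for x in signal))
            return slope, my - slope * mx

        # Hypothetical paired readings: sensor current (nA) vs. finger-stick
        # glucose (mg/dL); real devices recalibrate as the sensor drifts.
        slope, intercept = calibrate([10, 18, 30], [80, 140, 230])
        print(f"glucose = {slope:.1f} * nA + {intercept:.1f} mg/dL")
        print(f"At 22 nA: {slope * 22 + intercept:.0f} mg/dL")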

    Going inside

    At the University of California, San Diego, David Gough and colleagues have pioneered an alternative that is fully implantable and promises longer-term results. Gough's team implants enzyme-based glucose sensors in a major vein of the heart and uses the steady flow of blood to prevent cells from grabbing onto the sensor and forming the capsule. A radio link reports glucose levels to a detector outside the body. Medtronic has licensed Gough's technology and is currently running clinical trials on an artificial pancreas that combines the venous sensor with an implanted pump that automatically delivers insulin based on the sensor's readings. Still, others remain cautious about the technology, fearing that if a clot should form around the sensor and become dislodged, it could make its way to the lungs and create a life-threatening embolism. Gough acknowledges the concern but says he believes that for some patients the benefit will be worth the small risk.

    Diabetes made easy.

    An implantable blood-glucose sensor (top) and insulin pump (bottom).


    For many researchers, the real hope for implanted biosensors lies not in evading the body's immune response but in trying to shape and control it so that artificial materials can be integrated into the body more naturally. “The Holy Grail of this business is the tissue-implanted probe, where the interface between the artificial and biological material resembles the biological tissue,” says William Reichert, a biomedical engineer at Duke University in Durham, North Carolina.

    That's the grail Kenneth Ward and his colleagues at Legacy Health System in Portland, Oregon, are seeking with their glucose sensor that can be implanted under the skin for long periods. To get around the encapsulation issue, Ward's team adds a protein called vascular endothelial growth factor (VEGF) to the wound site when the sensor is implanted. VEGF is a signaling molecule that triggers the production of blood vessels. In work with rats, Ward's team has found that adding VEGF to the site near a biosensor significantly increases the number and density of blood vessels in the surrounding tissue. “We're hoping that will allow glucose and oxygen from the blood to reach the sensor and improve the sensor lifetime,” Ward says. The company is still carrying out studies to confirm the effect, he adds.

    Francis Moussy of the University of South Florida, Tampa, agrees that adding VEGF is a good way to go. But he is concerned that a one-time infusion of the protein will not produce a lasting effect. “As soon as you stop the delivery of VEGF, the blood vessels will disappear,” Moussy says. So his group is developing a way to add a VEGF-producing gene to cells in the wound site to coax them into pumping out VEGF for months at a time. Moussy's team is also experimenting with coating the outside of its biosensor with a steroid called dexamethasone, which dampens the natural inflammatory response. Early-stage trials with animal models show that the gene therapy and the steroid each seem to work on their own, and now Moussy's team is working to put the two together.

    Others, however, are concerned about the use of VEGF. “It's dangerous,” says Buddy Ratner of the University of Washington, Seattle. Although VEGF is a natural part of wound healing, overproduction of the protein has been implicated in everything from cancer to diabetic retinopathy, a condition in which blood vessels overgrow the retina and disrupt vision, Ratner says. Ward responds that if care is taken to ensure that VEGF remains in its local environment, rather than entering the wider blood circulation, then it's doubtful that problems will arise. But Reichert adds that VEGF isn't the only protein worth trying. Dozens of cell-signaling molecules are part of the natural response to injury, including signaling proteins called interleukins, Reichert says. He suspects that in time researchers will use a cocktail of these molecules to help integrate biosensors into the body.

    Taking the strain

    Although glucose monitoring might promise the biggest payoff, other applications of in situ sensing are starting to gain attention as well. Micromechanics expert Shuvo Roy of the Cleveland Clinic Hospital in Ohio, for example, recently launched a project to create tiny pressure sensors that can be implanted during spinal surgery when neighboring vertebrae are fused together. Doctors fuse vertebrae for patients who have problems with the spongy discs that separate the bones in the spine. They then carry out ultrasound tests to ensure that the vertebrae fused properly. But in about 15% of cases, those readings are incorrect and often lead to another surgery.

    In the hope of eliminating these unneeded follow-up surgeries, Roy's team is crafting a microelectromechanical sensor to monitor the load being carried by the bones in the spine. The sensor contains a microscale rod that a surgeon connects to two vertebrae before grafting bone into the same region. As that bone fuses with the neighboring vertebrae, it takes over the load-bearing duties of the rod in the strain gauge, which transmits the changes in pressure to a radio receiver outside the body. So far Roy's team is still testing the system on goats. However, he says, “our preliminary assessment is it's not going to be a problem. We have not seen a toxic reaction in the site.” And because the sensor does not read biochemical signals, it shouldn't matter if immune cells wall it off from the body. Ultimately, Roy hopes to shrink the sensors to less than a cubic millimeter so that they can be implanted with a syringe.
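
    Mechanically, the rod and the healing graft behave like two springs in parallel: each carries load in proportion to its stiffness, so as fusion progresses the rod's share should fall toward zero. A toy model of the signal such a gauge would watch for, with invented stiffness values (the article gives none):

        def rod_load_fraction(k_rod, k_bone):
            """Two stiffnesses in parallel share load in proportion to stiffness."""
            return k_rod / (k_rod + k_bone)

        # Hypothetical stiffnesses (arbitrary units): the graft stiffens over
        # the weeks after surgery as the bone fuses.
        k_rod = 1.0
        for week, k_bone in [(0, 0.0), (4, 0.25), (8, 1.0), (12, 4.0), (16, 9.0)]:
            print(f"week {week:2d}: rod carries {rod_load_fraction(k_rod, k_bone):.0%} of load")

    In this picture, a load share that never declines would flag a failed fusion directly, without the follow-up imaging that now misreads about 15% of cases.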

    For any of these devices to make a difference in people's lives, it will take more than just technical success, says Amy Pope-Harman, a lung transplantation expert at Ohio State University Medical Center in Columbus. Pope-Harman says she watches effective medical technology stumble all the time, either because it doesn't mesh well with the way doctors practice medicine or because insurers won't cover the costs. Implantable biosensors will likely face both problems, she says. Will doctors have to be available 24 hours a day to respond to warnings from biosensors in their patients? And what if patients keep biosensor results secret from their doctor? Such real-world questions are only now beginning to emerge as the technology starts to show real promise.

    “We're a long way from Fantastic Voyage or the totally invaded body,” Pope-Harman says. But perhaps we're not so far from the first brief excursions.
