News this Week

Science  08 Sep 2000:
Vol. 289, Issue 5485, pp. 1662
    2001 BUDGET

    Research Gets Hefty Boost in 2001 Defense Budget

    David Malakoff

    The desert tortoise that plods across California's rugged drylands may be slow, but it will keep some fast company in next year's defense budget. The $289 billion measure, which President Bill Clinton signed last month, includes big boosts for a host of science programs, from mapping the endangered tortoise's wanderings to developing laser weapons. And with the two major presidential candidates pledging further boosts, the Pentagon's portfolio is attracting increasing attention from the life sciences community as well.

    Researchers are still digesting the details of the massive bill, which provides $5.1 billion more than the Administration requested for the budget year that begins on 1 October. So far, however, most scientists like what they see in the first science-related 2001 spending bill to clear Congress. Overall, the Department of Defense's (DOD's) science and technology (S&T) spending will increase 8%, to about $9 billion, exceeding the hopes of a blue-ribbon advisory panel. Within that total, basic and applied research go up a combined 10%, to $5 billion. Several disciplines scored big, with Congress providing more than $350 million for biomedical research and $40 million to top up a nanotechnology initiative. “Congress was pretty receptive to S&T initiatives this year,” says Robert Trew, head of DOD's basic research program.

    Still, the funding boost did not please everyone. Computer scientists, for instance, will experience cuts in some information technology (IT) programs. Some analysts worry that Congress and the Pentagon may be shortchanging long-term, high-risk research in favor of projects with a more certain payoff. And others, including former presidential candidate Senator John McCain (R-AZ), complain that the extra spending adds fat, not muscle, citing scores of nondefense projects that favor particular regions or states. The bill has so many biomedical research earmarks, McCain scoffed, that it's difficult to distinguish it from legislation funding the National Institutes of Health.

    Even critics, however, say that this year's defense budget heralds a refreshing change for researchers. Over the past decade, the Pentagon's S&T budget has typically shrunk or stagnated each year. The trend was especially troubling to universities, which depend on DOD for nearly $1 billion a year in research and fellowship funds.

    The decline also disturbed some Pentagon advisers, who warned that shortchanging research now would weaken the increasingly technology-dependent military. The issue led to a 1998 recommendation from the Pentagon's Defense Science Board, composed of outside experts, for a major hike in military spending on S&T, with a goal of $8.4 billion for 2001. More than a dozen scientific societies in Washington, D.C., also took up the cause, forming the Coalition for National Security Research. This year, their effort was fueled by a growing budget surplus and the impending presidential election, which prompted both Democrats and Republicans to tout their support for a strong defense.

    The results are impressive. A collection of programs that fund basic research at universities, for instance, will jump by nearly $100 million, to $354 million. Half of the increase goes to 18 earmarked projects, including $4 million to study everything from the physiology to the distribution of desert tortoises. The project, to be conducted by researchers chosen competitively, stems from a controversial planned expansion of the Army's Fort Irwin base in California, which would push into tortoise habitat. Another part of the increase will supplement university investments in new research equipment, such as high-resolution electron microscopes. The Pentagon will also fund 3-year fellowships for young researchers up front, protecting them against future budget vagaries.

    The university program isn't the only one to benefit this year. Congress approved $40 million for the military's share of the Administration's half-billion-dollar nanotechnology initiative, which aims to make everything from electronics to medical devices on a molecular scale (Science, 11 February, p. 952). Biomedical researchers will also be competing for new DOD funds, including $50 million in peer-reviewed grants on topics ranging from new imaging technologies for lung cancer to insect-transmitted diseases. In addition, Congress provided nearly $290 million for an ongoing program to combat breast, prostate, and ovarian cancer.

    Those sums could be a harbinger of even greater DOD support for the life sciences, says Mary Hendrix, new president of the influential Federation of American Societies for Experimental Biology (FASEB), which represents more than 60,000 researchers. In December, FASEB will add DOD to the list of federal agencies it monitors and spotlights in its annual budget recommendations, and FASEB is trying to “identify new areas of common interest” at DOD. “There is some ignorance among [biomedical researchers] about what types of projects DOD funds,” says Hendrix.

    Computer scientists didn't fare as well. Congress significantly pared IT programs run by the Defense Advanced Research Projects Agency (DARPA), in part, say aides, because lawmakers felt some programs were growing too fast while others duplicated work being done by other agencies. Although the exact impact of the cuts won't be known for months, “it's ironic that IT got cut just when everyone is saying that new computers and software will be so important to the future military,” says Robert Parker, head of the University of Southern California's Information Sciences Institute in Arlington, Virginia. “We need to get much more proactive in making the case” for Pentagon spending on IT.

    The cuts come at a time when some question DOD's stomach for the type of risky research that led to the Internet and radar-avoiding stealth aircraft. Jim Richardson, a former DARPA program manager who is now vice president of the Arlington-based Potomac Institute, believes that DOD's basic and applied research today “are focused on the nearer term than was the case 5 years ago,” although he admits that the shift is hard to quantify.

    Reversing that trend, he and other observers hope, will be high on the agenda of the next president. But although both Al Gore and George W. Bush have said they would significantly increase Pentagon R&D—with Bush calling for a $20 billion boost by 2006—neither has spelled out where the money would go. The details could emerge in January, when the new president submits his first budget to Congress.


    Molecule Shows Anasazi Ate Their Enemies

    Constance Holden

    It's official: Scientists say they have definite proof that prehistoric Indians in the Southwest not only killed, butchered, and cooked other human beings, but actually ate them. The evidence takes the form of a dried chunk of human excrement, or coprolite, containing a telltale human protein that could have gotten there only by being ingested.

    “The bottom line,” says biochemist Richard Marlar of the University of Colorado Health Sciences Center in Denver, is “yes, [cannibalism] did occur in the Anasazi population. Now it's up to the anthropologists and archaeologists to say how and why it happened.”

    In this week's issue of Nature, Marlar and colleagues claim their data refute those who would “take the archeological and osteological evidence and say there was a potential for cannibalism” but no proof that butchering and cooking led to actual consumption of human flesh. The coprolite, he believes, has forged the final link in this gruesome food chain. A handful of critics are unswayed, however, insisting that alternative interpretations of the evidence have not been fully explored.

    The 850-year-old coprolite was found near Cowboy Wash in southwestern Colorado. The Four Corners area contains a number of sites offering strong evidence of cannibalism: human bones disarticulated, cut, burned, and cast about in exactly the same fashion as the bones of animals known to have been used for food. Investigating this small settlement of three half-buried “pit houses,” scientists found two that contained mutilated remains of seven men, women, and adolescents—apparently massacre victims whose bodies were butchered. The coprolite was found in a cold fire pit in the third pit house.

    The scientists looked for traces of myoglobin, an oxygen-transporting molecule that occurs in skeletal and heart muscles but not in the gut. They developed an assay that distinguishes between human myoglobin and that of nine food animals. The assay found human myoglobin in the coprolite, but no traces of it in any of 25 control human fecal samples. It also identified beef myoglobin in feces from people who had recently eaten cooked beef. The authors report finding more human myoglobin on shards of a cooking pot, as well as human blood on two cutting tools.

    “It appears that in two of the pit structures, people cooked and processed human remains and then left the bones on the floors,” says the lead archaeologist at the site, Brian Billman of the University of North Carolina, Chapel Hill. “In the third pit structure, they appear to have cooked human remains in a pot and then, after they were done [eating], they defecated into the hearth and smashed the cooking pot.”

    Not everyone agrees, however. For example, Debra Martin of Hampshire College in Amherst, Massachusetts, says the fecal chunk could have been contaminated with human proteins during handling by scientists. And Peter Bullock of the Museum of New Mexico in Santa Fe questions whether the coprolite is even human. “It's most likely from a coyote,” a common scavenger in these parts, he says. But Billman says the coprolite lacks the bone chunks, hair from grooming, and fur that are almost always found in canine coprolites, and there were no canine tooth marks on the human remains.

    Those who worked on the site acknowledge that their findings pack a cultural wallop. “Unfortunately, it's a very emotional debate,” Billman says. But he doesn't agree that cannibalism “dehumanizes” early Native Americans: “This is, unfortunately, what human beings do.”


    Experts Downplay New vCJD Fears

    Michael Balter

    The thought of coming down with Salmonella after eating a tainted chicken sandwich may be alarming—but “mad cow disease”? That kind of chilling scenario was splashed across the mainstream British press last week based on a report from a leading U.K. lab that prions—abnormal proteins linked to bovine spongiform encephalopathy (BSE) and its human form, variant Creutzfeldt-Jakob disease (vCJD)—can jump from one species to another more easily than previously believed. Although some experts say the findings raise concern about a threat to human health, others decry the media frenzy and contend that the new research adds little to what is known about the mysterious, fatal diseases.

    What's beyond dispute is that scientists have only a vague idea about how vCJD victims become infected and when the brain begins to deteriorate. That is itself alarming given that the incidence of vCJD is rising 23% per year in the United Kingdom and may yet accelerate into a devastating epidemic (Science, 1 September, p. 1452). So neurologist John Collinge and his colleagues at St. Mary's Hospital in London decided to take a new look at the so-called “species barrier” that appears to block or slow transmission of prions between most species. By convention, the species barrier is determined by the time between experimental inoculation of an animal with prion-infected brain extracts (from an infected individual from another species) and the onset of symptoms, such as loss of balance.

    The best studied species barrier is that between hamsters and mice. Previous research has found that mice show no signs of disease over their lifetimes, about 2 years, after being infected with a hamster prion strain known as Sc237. Thus mice are considered highly resistant to this strain. Collinge's group replicated these experiments and confirmed that mice infected with Sc237 prions do not get sick. But the researchers went a step further, examining the brains of infected mice as they age.

    They discovered that good health was only skin deep. Brain slices revealed that some mice had neurological damage typical of that seen in some human and animal prion diseases, including clumps of prion protein. Moreover, using antibodies to distinguish similar proteins from one another, the researchers found that the brain tissue was riddled with mouse prions—not the hamster Sc237. Writing in the 29 August issue of the Proceedings of the National Academy of Sciences, the authors conclude that the hamster prions had converted healthy mouse prion protein into the aberrant form. Indeed, the researchers were able to make other healthy mice ill by injecting them with brain extracts from the asymptomatic mice, demonstrating that the newly created mouse prions are infectious.

    The study “undermines what we have thought about species barriers,” Collinge told Science. His group speculates that chickens, pigs, or other livestock fed BSE-infected animal feed may be silent carriers of the disease. That grim prospect set off a frenzy in the British press. Newspapers suggested not only that animals deemed healthy and incapable of acquiring BSE could transmit the disease, but also that people not displaying vCJD symptoms might infect others.

    Some researchers have taken Collinge's group to task for extrapolating from rodents to livestock. The findings “do not say anything directly about whether pigs, poultry, or sheep have become silently infected,” says biologist Moira Bruce of the Institute for Animal Health in Edinburgh, Scotland. She argues that the findings speak only to mice and add little to the prion debate. “All that has happened is to change the story from mice are resistant to mice are susceptible but with very long incubation periods.”

    Similar reservations are expressed by Oxford University mathematical biologist Angela McLean. Her investigations of the potential danger of BSE infection in sheep have come up reassuringly empty so far. Scientists, McLean says, “were already worried about preclinical infections” in livestock long before the Collinge study.

    Others, however, say that the findings might have public health implications. The paper “does suggest that species in which we've never seen disease might be carriers,” says epidemiologist Peter Smith of the London School of Hygiene and Tropical Medicine. Smith, who is acting director of the U.K.'s Spongiform Encephalopathy Advisory Committee, says the panel will discuss the study at its next meeting later this month.

    But whereas Collinge's group argues that their data should prompt a program of screening apparently healthy cattle for “subclinical” levels of BSE, Smith doubts the findings will persuade his panel to call for changes in anti-BSE control measures now in place. These controls, which ban the use of animal feed derived from animal carcasses and require cattle slaughtered for food to be no older than 30 months, “already take into account the possibility of subclinical infection,” says Smith. He notes that previous studies have turned up no evidence that pigs and chickens are infected with BSE.

    Although Smith thinks the findings might raise a warning flag, he too believes the media went overboard. But he's not surprised. “Anything that has BSE attached to it is going to produce a big media response,” he says.


    Brain Cells Turning Over a New Leaf

    Gretchen Vogel

    Many of us may pine for our lost youth, but we know we can't turn back the clock. Biologists had long assumed that the same is true in development—that once a cell becomes committed to a particular fate, it can't reverse its tracks and become something else. A spate of recent work seems to have turned this biological dogma on its head, however. New findings described on page 1754 offer the strongest evidence yet that certain cells can be steered onto new career paths after all.

    Previous reports that adult cells can be reprogrammed have been dogged by the question of whether scientists had really tapped a cellular fountain of youth or whether immature cells hiding in their culture dishes gave rise to unexpected cell types. In the current work, Toru Kondo and Martin Raff of University College London managed to coax rat oligodendrocyte precursor cells (OPCs), which scientists thought were irreversibly committed to becoming neuronal handmaidens called oligodendrocytes or astrocytes, into becoming neurons. Because Kondo and Raff ran several experiments to test the purity of their well-characterized OPCs, they say it is unlikely that the effect was due to undetected immature cells.

    “Until recently in most people's minds, there was no reverse arrow” for OPC development, says stem cell researcher Ron McKay of the National Institute for Neurological Disorders and Stroke in Bethesda, Maryland. With this work, he says, “it's pretty clear there is a reverse arrow.” If the feat can be duplicated with human OPCs, it raises the tantalizing prospect of a new approach to treating Alzheimer's or other diseases marked by the loss of functional neurons: OPCs drawn from fetal tissue or even an adult patient might be reprogrammed to serve as replacement neurons.

    Kondo and Raff began by isolating OPCs from the optic nerves of newborn rats and used previously described techniques—culturing the cells in fetal calf serum or exposing them to bone morphogenetic proteins—to turn them into astrocyte-like cells. They then treated the cells with basic fibroblast growth factor, a protein known to stimulate proliferation of neural stem cells. Nearly half started dividing, producing cells resembling nervous system stem cells, which the team induced to develop into neurons and astrocytes as well as oligodendrocytes.

    Fetal calf serum and fibroblast growth factor, it appears, had somehow induced the OPCs to revert to a more primitive state. “There had been all sorts of evidence that these were committed precursor cells,” says stem cell biologist Ben Barres of Stanford University. But the evidence that the OPCs' developmental clock can indeed be wound back, he argues, is “totally convincing.”

    Not to everyone, however. To test for renegade stem cells, the researchers cultured the OPCs in a series of factors that turn neural stem cells into neurons. After 2 weeks, they found no sign of neuronal markers in their cells. But neuroscientist Fred Gage of the Salk Institute for Biological Studies in La Jolla, California, says that “it still remains possible that there are undetected multipotent stem cells that exist within the culture.” Last fall, his team reported that cells from the optic nerve of adult rats could, when treated with similar proteins, also become neurons. But adult cells are harder to purify than those from fetal nerves, and Gage says he wasn't convinced that his team's preparation did not contain immature stem cells.

    The evidence that OPCs can rejuvenate in the lab is good news for eventual disease treatments. But it is not yet clear what it means for researchers who study how normal brains develop and repair themselves, notes Sean Morrison of the University of Michigan, Ann Arbor. “We have to be very careful to distinguish what happens to a cell in vivo and the potential it can have after being reprogrammed in vitro,” he says. Still, he adds, attempts to pin down the molecular signals that drive reprogramming should help scientists better understand the signals that govern normal development.

    Together with previous reports of shape-shifting cells, Barres says, the new work “raises the question, ‘What else can be reprogrammed?’” If more elixirs can be discovered for various cell types, then more degenerative diseases might fall victim to newfound fountains of youth.


    Biggest Extinction Hit Land and Sea

    Richard A. Kerr

    There was no hiding from the greatest mass extinction of all time. Two hundred and fifty million years ago, at the end of the Permian period and the opening of the Triassic, 85% of the species in the sea and 70% of the vertebrate genera living on land vanished. Whatever pummeled life in the sea did its dirty work in a geologic moment of less than half a million years (Science, 15 May 1998, p. 1007). Now from South Africa comes evidence that the Permian-Triassic extinction of land plants was equally brutal and swift.

    The new signs of ecological catastrophe come from rocks that started as sediments laid down in South Africa's Karoo Basin 250 million years ago. In a paper on page 1740 of this issue of Science, paleontologist-geologist Peter Ward and geomorphologist David Montgomery of the University of Washington, Seattle, and sedimentologist Roger Smith of the South African Museum in Cape Town report that the rocks tell of an abrupt switch in style of sedimentation, as if the land had been permanently stripped of the rooted plants that held it in place. “It looks a lot like we just lost the forests,” says paleontologist Gregory Retallack of the University of Oregon, Eugene. Something with the power of an asteroid impact seems to have shattered life on Earth. But in the absence of any trace of an impact, researchers are groping for an equally far-reaching explanation.

    The Karoo ecological disaster left its mark in the mud and sand laid down as water drained from the landscape. At seven spots scattered across 400 kilometers of the basin, Ward and colleagues found the same pattern of changing sedimentation. Their benchmark was the Permian-Triassic (P-T) boundary, a layer of rock marked by evidence of extinctions and a globally recognized shift in carbon isotopes. In tens of meters of rock laid down before the boundary, the researchers found sandstones filling broad channels, as if deposited by meandering rivers. Above the extinction bed, the river deposits are entirely different. They are typical of quick-flowing, “braided” river systems carrying large amounts of water and sediment in narrow, interconnecting channels.

    The group's preferred explanation is the loss of the larger rooted plants, including the recorded extinction of the treelike seed fern Glossopteris, that held the soil in place, especially along stream and river banks. Montgomery can often see the same sedimentary transition when forests are clear-cut today. At the P-T boundary, it seems to have been global. “The pattern they see is matched beautifully in Australia and Antarctica,” says Retallack, based on his own work. Everywhere the transition occurred, says Ward, “it was fast. This was a really rapid, short-term event.”

    The rapidity shows up best in marine sediments. In the 21 July issue of Science (p. 432), paleontologist Jin Yugan and his colleagues at the Nanjing Institute of Geology and Palaeontology in China and paleontologist Douglas Erwin of the National Museum of Natural History in Washington, D.C., showed that the devastating extinctions in the sea took place even faster than anyone had thought. Radiometric dating of the P-T outcrop at Meishan, China, had narrowed the generally accepted duration of the extinctions from millions of years to less than 500,000 years (Science, 7 November 1997, p. 1017). Now, by conducting a detailed census across the boundary of 333 species of everything from fish to microscopic foraminifera, they find that all the extinctions could have occurred in a single bad day 251.4 million years ago, says Erwin, just as many species clearly went extinct in a geologic instant 65 million years ago at the moment of an impact.

    For the moment, an impact is just one of several contenders to explain the P-T extinctions. The most discussed of the possible earthbound causes centers on lavas known as the Siberian Traps, whose million-year-long eruption coincided with the P-T extinctions as near as radiometric dating can place them (Science, 6 October 1995, p. 27). Four other extinctions, both major and minor, have now been linked in time with huge basaltic lava eruptions like that of the Siberian Traps (Science, 18 August, p. 1130). So far, however, no one has found a decisive link between cause and effect.

    To shorten the list of potential mass murderers, researchers are combing the P-T geologic record for clues. In south China, Jin and his colleagues reported, the spike in carbon isotopic composition precisely coincides with the extinctions. It may mark a collapse of biological productivity in the sea in parallel with the ecological disaster on land. Other researchers have reported a huge spike in the amount of fungal remains around the world. The spike, which begins just before the extinction in south China, may be a sign of massive decay on land. So far, however, the big picture refuses to snap into focus. The Siberian eruptions could be behind all this, says Erwin, but “it's still difficult to pin down the extinctions to a single mechanism.”


    New CNRS Chief Hopes to Deliver on Science

    Michael Balter

    PARIS—For years, French research ministers have prodded the nation's scientists to make their research pay off for society. But that message has been slow to sink in. Last week, the government gave the assignment to a biologist who has done exactly that, but who believes that the carrot works better than the stick.

    Geneviève Berger takes charge of the CNRS with orders to nudge France's basic research agency out of a malaise stemming from a steady decline in research funding, the graying of the agency's scientific cadre, and debates over how best to reform the $2.2 billion behemoth (Science, 30 July 1999, p. 647). “The CNRS has been slipping for several years,” says chemist Pierre Potier, a former government science adviser. “Madame Berger is very courageous to agree to lead it.”

    As director of the CNRS Laboratory of Parametric Imagery in Paris, in the early 1990s Berger co-invented the first instrument to use ultrasound to visualize human bones. The device is now used worldwide for diagnosing osteoporosis. “I would find it more satisfying that there are thousands of such machines distributed because of my research than to be cited thousands of times in the scientific literature,” Berger told Science. Still, the invention is the fruit of years of fundamental research, she notes, adding that “there is a continuum between basic and applied research, not a division.”

    Berger, 45, has spent the last 9 months as technology director in the Ministry of Research under research minister Roger-Gérard Schwartzenberg, who proposed her for the CNRS post. Whereas Schwartzenberg's predecessor, Claude Allègre, leaned heavily on CNRS researchers to spend time in the universities and industry and wanted to tie promotions to such activities, Berger says that “mobility should be encouraged by promotions, but in no case should researchers [who do not move] be penalized.”

    Berger's political baptism will come later this month after the government unveils its 2001 budget proposal, and researchers are rooting for her. “Berger is a first-class scientist,” says Etienne-Emile Baulieu, a lab director at the biomedical agency INSERM and the inventor of RU-486, the so-called “abortion pill.” She “will be very good for the CNRS.”


    Big Hikes Sought for Life Sciences, IT

    Dennis Normile

    TOKYO—Genomics, information technology, and ocean drilling are the big winners in the proposed 2001 budget for Japan's major science agency. The requests, submitted this week, indicate the government's continued commitment to science and technology despite flat overall spending for the fiscal year beginning next April.

    The increase “reflects the feeling among our political leaders that [science budget] increases are needed to secure the country's economic future,” says Nobuhiro Muroya, deputy director of planning for the Science and Technology Agency, which in April will become part of a new Ministry of Education, Culture, Sports, Science, and Technology. The growth also erases fears among many scientists that the government would shift its focus after having achieved a 5-year goal last year to double the country's spending on science and technology, to 17 trillion yen ($162 billion). The overall science budget, which includes several other agencies, won't be known until later this month, and it won't be finalized until December.

    One of the biggest jumps within the new superministry is in the life sciences, where spending at research institutes and on large projects will rise 25%, to $963 million. The increase particularly benefits the Genomic Sciences Center at RIKEN (the Institute of Physical and Chemical Research), where the budget will nearly double, to $152 million. The center plans to accelerate its work on the structural analysis of human proteins and to launch a new bioinformatics group.

    The requests for 2001 also point to a heavy investment in information technologies. A program to connect all the nation's universities in a high-capacity, high-speed communications network is seeking an 8% increase, to $406 million. Funding for research into next-generation networks and communications technologies would more than double, to $411 million.

    Funding for the ocean drilling program (Science, 13 November 1998, p. 1251) would rise by nearly 12%, to $79.5 million. The centerpiece of the program is a $350 million drill ship with expanded capabilities. The funding increase “means we now have the entire budget needed to complete the drill ship,” says Takeo Tanaka, head of the ocean drilling program at the Japan Marine Science and Technology Center. The boost in next year's budget also provides funds to operate the ship after its completion in 2004, and to plan a scientific program that will begin in 2006.

    Not all scientific efforts are faring so well, however. The National Astronomical Observatory failed to win construction funds for an array of millimeter and submillimeter radio telescopes jointly planned for the Atacama desert of northern Chile by the United States, Europe, and Japan. But Masato Ishiguro, director of the project for the observatory, hopes for money in time to begin work in 2002.


    Academies Fault Reno Over Handling of Lee

    David Malakoff

    The already tangled tale of Wen Ho Lee, the nuclear weapons physicist jailed for allegedly copying classified information from computers at Los Alamos National Laboratory in New Mexico, took another twist last week. While federal judges feuded over whether the suspect should be released to home detention before his trial, the country's scientific establishment attacked Attorney General Janet Reno over the way Lee is being treated.

    On 31 August the presidents of the nation's preeminent science academies released an open letter that blasts Reno. Lee has been “a victim of unjust treatment,” say the presidents, since he was jailed last December following a yearlong investigation into possible spying at Los Alamos. The letter—signed by Bruce Alberts of the National Academy of Sciences, William Wulf of the National Academy of Engineering, and Kenneth Shine of the Institute of Medicine—also complained that Reno had failed to respond to two earlier letters inquiring about his treatment, including alleged restrictions on contact with his family and the use of shackles while in solitary confinement. A one-page letter that they received in May from a Department of Justice (DOJ) official “was not a satisfactory response,” they said. Noting the academies' efforts on behalf of political prisoners in other countries, Wulf told Science that even some repressive foreign regimes “had done a far better job” of answering routine letters protesting the treatment of jailed scientists.

    But Wulf says the trio might not have publicly raked Reno over the coals had they known about another letter that DOJ sent to a workers' advocacy group at Lawrence Livermore National Laboratory in California. Two weeks before the academies released their letter, DOJ senior counsel Richard Rogers sent the Society of Professional Scientists and Engineers (SPSE) a three-page discussion of Lee's treatment, including assurances that he was shackled only when being moved and was allowed visits from his family. The letter, which responded to an SPSE letter protesting Lee's treatment, “is much more explicit than the one we got,” Wulf said after receiving a copy from Science.

    Rogers says that “it's unfortunate” that a copy wasn't sent to the academies. He says he is “sorry about” the oversight and isn't sure why it occurred.

    DOJ may be able to make amends with its next letter. Wulf and his colleagues also want to know how the government plans to punish an FBI agent who apparently gave misleading testimony at a court hearing last year that led a judge to jail Lee pending his trial, scheduled to start in November. Last week the judge ruled that Lee could go home, a step widely seen as a blow to the government's case. A few days later, however, a higher court unexpectedly stepped in to block the release, sparking what is likely to be another lengthy—and confusing—round of wrangling over Lee's fate.


    DNA Arrays Reveal Cancer in Its Many Forms

    1. Jean Marx

    The use of microarrays to determine gene expression patterns is providing a wealth of new information that should aid in cancer diagnosis and ultimately in therapy

    Over the years, cancer researchers and clinicians have learned that they are fighting not just one foe but a seemingly endless variety of adversaries. Indeed, the catchall name “cancer” hides the fact that there are actually hundreds of different cancers—each with its own unique characteristics. That's a major reason why cancer has been so hard to vanquish: Drugs that work against a cancer of one tissue—say, breast, or lung, or blood—often don't work against those of others. Even cancers of the same apparent type often vary widely in their responsiveness to a therapy; some retreat quickly while others relentlessly progress. Now, researchers have a powerful new tool that not only should help them sort out the differences that define the many types of cancer, but also should help identify new targets for therapeutic drugs.

    The tool is the microarray—a slide or chip systematically dotted with DNA from thousands of genes that can serve as probes for detecting which genes are active in different types of cells (Science, 15 October 1999, p. 444). Researchers doing the work say such arrays are providing an unprecedented amount of information about the genetic changes underlying cancer. “When we started out, we like all other biologists were used to studying one gene at a time,” says Paul Meltzer of the National Human Genome Research Institute (NHGRI) in Bethesda, Maryland. “When suddenly you don't have to do that, it changes things radically.” Richard Klausner, director of the National Cancer Institute (NCI) and a proponent of using microarrays to study cancer, agrees. This “really represents a new type of data,” he says.

    Microarray technology is already providing insights into cancer that would be difficult, if not impossible, to obtain using the gene-by-gene approach. In the past several months, researchers in several labs have used it to identify specific subtypes of a variety of cancers, including leukemias and lymphomas, the dangerous skin cancer melanoma, and breast cancer. In some cases, they can determine which cancers are likely to respond to current therapies and which aren't. Such information, predicts NCI's Louis Staudt, “will rewrite the cancer textbooks over the next 3 to 4 years.” In addition, the studies are giving researchers a fix on which genes are important for the development, maintenance, and spread of the various cancers, and are thus possible drug targets.

    Subdividing the enemy

    An early demonstration that it's possible to classify cancers based on their gene expression profiles came from a team led by Eric Lander and Todd Golub of the Whitehead Institute and the Massachusetts Institute of Technology (MIT) Center for Genome Research (Science, 15 October 1999, p. 531). They began by comparing the profiles of acute myeloid leukemia (AML) and acute lymphoblastic leukemia (ALL), two blood cancers that are often hard to tell apart by standard pathological examination of the diseased cells. The researchers isolated messenger RNAs (mRNAs), which are produced only by active genes, from the bone marrow cells of 38 patients with either AML or ALL; labeled the mRNAs with biotin; and then applied each sample separately to microarray chips carrying the probes for more than 6800 human genes that were prepared by the biotech firm Affymetrix of Santa Clara, California.

    After scanning the chips to determine how much mRNA had bound to each gene, the team used an algorithm to select the 50 genes whose level of expression differed most between AML and ALL cells. They selected the genes mathematically, “without human intervention,” Golub says, “because we started with the notion that we weren't smart enough to know which were informative.”
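    The kind of automated gene selection described above can be sketched in a few lines. This is a minimal illustration, not the team's actual algorithm: the signal-to-noise score (difference of class means over the sum of class standard deviations, which is in the spirit of the published approach), the toy data, and all variable names here are assumptions for demonstration.

    ```python
    import numpy as np

    def top_discriminating_genes(expr, labels, k=50):
        """Rank genes by a two-class signal-to-noise score and return the top k.

        expr   -- (n_genes, n_samples) expression matrix
        labels -- boolean array over samples, True for one class, False for the other
        """
        a = expr[:, labels]           # samples in class 1 (e.g., "AML")
        b = expr[:, ~labels]          # samples in class 2 (e.g., "ALL")
        # difference of class means, scaled by within-class spread
        score = np.abs(a.mean(axis=1) - b.mean(axis=1)) / (
            a.std(axis=1) + b.std(axis=1) + 1e-9
        )
        return np.argsort(score)[::-1][:k]  # indices of the k highest scores

    # toy data: 100 genes measured in 38 samples; genes 0-4 are made class-specific
    rng = np.random.default_rng(0)
    labels = np.array([True] * 20 + [False] * 18)
    expr = rng.normal(size=(100, 38))
    expr[:5, labels] += 3.0           # shift the first five genes in class 1 only

    picked = top_discriminating_genes(expr, labels, k=5)
    print(sorted(int(i) for i in picked))  # → [0, 1, 2, 3, 4]
    ```

    The ranking is purely mathematical, mirroring the "without human intervention" point: no gene is chosen because a biologist expects it to matter, only because its expression separates the two classes.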

    The researchers then went back and showed that the expression patterns of those genes could in fact identify which patients had AML and which had ALL in the original group of 38 and also in another group of 36 patients not previously studied. Although cancer specialists already knew that AML and ALL are separate diseases, “it was important to show that you could [tell them apart] by gene expression,” says Jeff Trent of the NHGRI.

    Since then, researchers have used gene expression patterns to reveal previously unknown cancer categories. Some of this work comes from Staudt's team, working in collaboration with Patrick Brown and David Botstein of Stanford University School of Medicine and their colleagues. These researchers focused on patients with diffuse large B cell lymphoma, a common type of non-Hodgkin's lymphoma that is diagnosed in more than 15,000 patients annually in the United States and follows a highly variable clinical course. “From the first,” Staudt says, “it was clear that 40% did wonderfully and were cured, while 60% succumbed to the disease.” The microarray analysis points to a possible reason why.

    In the first phase of the work, which was reported in the 3 February issue of Nature, the researchers created what they call a “Lymphochip” containing nearly 18,000 genes, most of which are expressed in normal and malignant lymphoid cells. They then prepared mRNAs from biopsy samples from 40 lymphoma patients, copied them into fluorescently labeled complementary DNAs (cDNAs), and then applied the cDNAs from each patient to a Lymphochip. “We found a great diversity in gene expression among the patients, despite [their having] the same diagnosis,” Staudt says.

    Computer analysis of the expression patterns showed, however, that the patients could be divided into two groups. One group expressed a set of genes characteristically turned on in B cells in the spleen and lymph nodes during an immune response. The other set didn't express those genes but did show activity of a set of genes that are turned on when blood B cells are stimulated to divide by an antigen. “On this basis,” Staudt says, “the patients could be thought of as having two different diseases.” Indeed, their clinical pictures also varied: Those with the expression pattern of the spleen-lymph node B cells fared much better, with 75% alive 5 years after diagnosis, while 75% of the other group did not make it to that milestone.
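    The unsupervised split described above can be illustrated with standard hierarchical clustering. This is a toy sketch under stated assumptions, not the published analysis: the ten synthetic "patients," the six-gene profiles, and the group labels are all invented for illustration; average-linkage clustering on correlation distance is simply one common choice for this kind of expression data.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    # toy expression profiles: 10 patients x 6 genes, with two hidden groups
    rng = np.random.default_rng(1)
    group_a = rng.normal(loc=[2, 2, 2, 0, 0, 0], scale=0.3, size=(5, 6))
    group_b = rng.normal(loc=[0, 0, 0, 2, 2, 2], scale=0.3, size=(5, 6))
    profiles = np.vstack([group_a, group_b])

    # average-linkage hierarchical clustering on correlation distance,
    # then cut the tree into two clusters -- an unsupervised split of the
    # same general kind used to divide the lymphoma patients
    tree = linkage(profiles, method="average", metric="correlation")
    clusters = fcluster(tree, t=2, criterion="maxclust")
    print(clusters)  # first five patients share one label, last five the other
    ```

    No diagnosis goes into the computation; the two groups emerge purely from which genes the profiles have turned on, which is why such an analysis can reveal "two different diseases" hiding under one name.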

    Staudt's team is now participating in a collaborative study called the Lymphoma/Leukemia Molecular Profiling Project, which will look at hundreds of patients with B cell lymphoma to see whether the results hold up. But they already have possible therapeutic implications. Lymphoma patients are usually treated first with chemotherapy, and if they relapse, they become candidates for a bone marrow transplant. But in the future, those patients who have a gene expression profile indicating a poor prognosis might move directly to bone marrow transplant, avoiding the first-line chemotherapy regimen, which can be debilitating.

    Blood cancers aren't the only ones in which microarray analysis is picking up previously undetected subgroups. In the 3 August issue of Nature, the NHGRI team reported that 31 melanoma patients could be subdivided into two distinct groups based on their gene expression patterns, even though there were no obvious pathological distinctions between the patients' tumors. It's too soon to tell whether these categories are correlated with clinical outcome, NHGRI's Meltzer says, although there are hints that they might be.

    For example, melanomas from the larger of the two clusters show reduced expression of a variety of genes involved in cell movement and, consistent with that observation, the tumor cells show reduced motility. That might mean the tumors are less invasive. One hint that this is the case: Only three of the 10 patients in that cluster for whom survival data were known had died, compared with four of the five in the smaller cluster. Because the fates of some patients are unknown, however, the result might not be statistically significant.

    Stanford's Brown and Botstein and their colleagues have also found that breast cancers show distinguishable patterns of gene expression. Again, the researchers could pick out two broad groups of the tumors, one of which was marked by expression of the gene for the receptor for the hormone estrogen while the other one wasn't. That was no surprise, because cancer physicians have long known that breast cancer cells that lack estrogen receptors tend to be more aggressive. But as the researchers reported in the 17 August issue of Nature, those broad groups of tumors may contain a variety of subgroups. For example, the branch containing the estrogen-receptor-negative cells included one subgroup showing a gene expression pattern similar to that of the basal cells of the mammary ducts and another characterized by high expression of the Erb-B2 oncogene. “Not described,” Botstein says, “is whether there is a difference in outcomes [for the subgroups]. We're addressing that now.”

    Cause and effect

    These early studies clearly show that it's now possible to detect wholesale changes in gene expression patterns, but researchers want to do more than just identify genes whose activity is turned up or down. They also want to find out which of those changes are important for cancer development and progression—the causes and not just the effects. Microarrays are helping out there, too. For example, also in the 3 August issue of Nature, Golub and Lander, with MIT colleagues Edwin Clark and Richard Hynes, used arrays to compare the gene expression patterns of highly metastatic melanoma cells with those of the much less metastatic cells from which they were derived. The comparison identified a suite of genes whose activity was apparently turned up as melanoma cells progressed to malignancy.

    Many of the genes were of the type that would be expected to contribute to metastasis—genes involved either directly or indirectly in the cell's ability to move to and invade new tissues, for example. The MIT team looked in further detail at one such gene, called RhoC, because work by others had shown that expression of this gene correlates with progression of a pancreatic cancer to metastasis. When the researchers introduced the human RhoC gene into a line of human melanoma cells that normally shows little tendency to spread and then inoculated mice with these cells, they found that the cells had become highly metastatic.

    In a similar fashion, researchers are using the arrays to determine how the activation of cancer-promoting oncogenes or the inactivation of tumor-suppressor genes perturbs the expression of other genes. As NCI's Klausner says, the arrays provide “a tool for understanding the relation between specific genetic defects and how they play out.” One recent example comes from Staudt and his colleagues, who used their Lymphochip to study the consequences of abnormal activation of an oncogene called BCL-6, a situation that commonly occurs in lymphomas. Although researchers knew that BCL-6's protein product represses the expression of certain genes, the NCI team's microarray analysis pinpointed the changes that could be leading to cancer.

    They found that BCL-6 activation leads to repression of a gene called blimp-1, which normally promotes the differentiation of B cells to become antibody-producing plasma cells, and also of a gene called p27kip1, which inhibits the cell division cycle. This two-part repression is effectively a double whammy when it comes to cancer development, because the net result is to lock cells in an undifferentiated, continuously dividing state. (The results appeared in the August issue of Immunity.)

    The MIT group, in collaboration with Robert Eisenman's team at the Fred Hutchinson Cancer Research Center in Seattle, has also looked at the changes elicited in cells by activation of the MYC oncogene. In results reported in the 28 March issue of the Proceedings of the National Academy of Sciences, they found that MYC activity turned up the expression of 27 genes, including some that promote cell division, and turned down the activity of another nine. “We're discovering the malignant pathways of these tumors,” Staudt says. “Now we can ask whether interfering with these pathways can help.”

    Other applications of microarray technology to cancer are also getting under way, including studies of how cancer cells respond to various chemotherapeutic agents and why some cells respond while others don't. “The applications are almost endless,” enthuses NHGRI's Meltzer.

    A flood of data

    As the current trickle of microarray studies swells to what is likely to be a flood, researchers will have to face what Klausner calls a “profound” challenge: how to deal with the masses of data that will come pouring out. “It's not just how to analyze the data,” Klausner says, “but how to share and compare them.” Right now, for example, people are using different “platforms,” as the arrays are called, as well as different methods of analyzing the data those platforms produce. This lack of standardization makes it difficult to relate the findings of the different labs and assess their quality.

    To deal with such issues, NCI has set up a Gene Expression Analysis Working Group, which will hold several workshops over the coming years similar to those held by the human genetics community to sort out the problems their field faced. The first workshop is in the planning stage, says NCI's Kenneth Buetow, and the questions it will deal with are still being worked out. “It's actually quite easy to find issues,” he says. “The hardest part is to identify where to start.”

    But it's important for the community to get together and solve the challenges so that it can get a better understanding of the changes that produce cancer and how to counteract them. “It's an amazing time,” Buetow says. “We're all having fun [with microarrays]. But at the end of the day, we have to translate this into clinical benefit.”


    Protein Arrays Step Out of DNA's Shadow

    1. Robert F. Service

    As technical obstacles yield, en masse protein testing is poised to take one of biochemistry's most exciting techniques into the heart of cellular chemistry

    As scientific tools go, DNA microarrays are the ultimate in multitasking, allowing researchers to track the activity of thousands of genes at once. That's made them wildly popular for tracking how patterns of gene expression change in diseases such as cancer (see p. 1670). It's also led to spin-offs, particularly efforts to create similar arrays of proteins lined up on surfaces, which among other things could be used to test the activity of potential drugs against thousands of protein targets all at once. Early progress has been slow; proteins are harder to synthesize than DNA, and plunking them down on solid surfaces tends to cause them to unfold and thereby lose their activity. Now, however, those barriers appear to be crumbling.

    On page 1760 of this issue, researchers at Harvard University report creating arrays of over 10,000 proteins on a piece of glass just half the size of a microscope slide. They then used their arrays to study a variety of protein functions, work that included identifying members of the array that bind to other free-floating proteins and to small, druglike molecules.

    “This is fantastic. I'm envious,” says biochemist Eric Phizicky of the University of Rochester in New York, who is also developing protein arrays. Phizicky says the Harvard team's work holds the promise of vastly speeding up drug development by enabling pharmaceutical companies to quickly screen potential drugs against protein targets, while ensuring that the candidates don't react with secondary proteins that can cause side effects. So it's a sure bet, he says, that other groups will race to exploit the new technology. “This is going to be the new way of doing things,” Phizicky says.

    Phizicky and others point out that protein arrays might even have some advantages over DNA arrays. After all, it's proteins, not DNA or RNA, that carry out the vast majority of chemical reactions in cells. What's more, DNA arrays detect either the messenger RNAs (mRNAs) made by active genes or DNA copies of the mRNAs, and the amount of mRNA in a cell often shows no correlation with the amount of protein that gets produced by the cell. Even more troubling, proteins can undergo innumerable slight chemical changes that can profoundly alter their activity. The bottom line, say protein researchers, is that if you want to know what's happening to a cell's proteins, you have to study the proteins themselves.

    That realization has sparked a number of early versions of protein array technology in recent years. In work reported in the 10 February issue of Nature, for example, biologist Stan Fields and his colleagues at the University of Washington, Seattle, devised something resembling a test tube version of an array, in which each test tube was a tiny well in a plate containing specially engineered yeast cells. This allowed the researchers to test the interactions of all 6000 yeast proteins with many of the others. And in the February issue of Analytical Biochemistry, Andrei Mirzabekov and colleagues at Argonne National Laboratory in Illinois and the Russian Academy of Sciences' Joint Human Genome Program in Moscow described a technique for creating arrays of proteins immobilized inside tiny gel packets dotted across a surface. But although the new techniques opened the door to making arrays of proteins, they still weren't as simple to use as DNA arrays.

    That's where the new work by chemical biologist Gavin MacBeath and chemist Stuart Schreiber comes in. To make their arrays, MacBeath and Schreiber used a robot originally designed to synthesize DNA arrays. The robot dips a quill-like tip into a well containing a single purified protein and then turns to a glass slide, where it spots down a tiny 1-nanoliter drop onto the slide. After the robot washes and dries the tip, it then repeats the procedure with a different protein, and so on to build the array.

    But the real key was getting the proteins to stick to the surface without denaturing. To accomplish that trick, MacBeath and Schreiber first coated their glass slide with a layer of a protein called bovine serum albumin (BSA), which provides a water-friendly surface that prevents the denaturing of proteins dropped on top. Next, they used the robot to make their array of protein droplets, each of them containing billions of protein copies. Finally, to anchor the proteins to the BSA surface, the researchers carried out a chemical reaction that caused lysine amino acids on the array proteins to bind to lysines in the BSA. Because lysines are present throughout the entire length of a protein, some copies always wind up binding with their chemically active regions open to the surface.

    With their arrays in hand, MacBeath and Schreiber demonstrated that they could test the actions of various proteins with ease. For example, they could easily detect the binding of a small, druglike molecule bearing a fluorescent tag to particular proteins in an array, which could help reveal novel drug targets. In hopes of using the protein arrays to speed drug discovery and other applications, MacBeath says that he and colleagues at the Massachusetts Institute of Technology and the University of California, San Francisco, are launching a new company, Merrimack, to commercialize the technology.

    Studying the functions of proteins in arrays isn't the only possible application of the technology, MacBeath says. Researchers also hope to array antibodies that bind to specific proteins. That would enable them to see which proteins are actually being produced in various tissues and, presumably, offer further clues to what causes various diseases. That goal, too, is fast approaching. Mirzabekov says his team has arrayed 10 antibodies in gel-pack-based chips and is now looking to scale up the technology. And MacBeath says he and collaborators have preliminary results showing that they should be able to pull this off with their new arrays. So while DNA arrays may have gotten a jump in the biochip business, it's a safe bet that protein chips won't be far behind.


    Virtual Institutes Gear Up to Do Real Research

    1. Wayne Kondro*
    1. Wayne Kondro writes from Ottawa.

    Alan Bernstein, president of the new Canadian Institutes of Health Research, wants to improve the quality of research—without a massive new infrastructure

    OTTAWA, CANADA—Alan Bernstein has never let administrative duties keep him out of the lab. When he was head of the Samuel Lunenfeld Research Institute at the Mount Sinai Hospital in Toronto, Bernstein, a geneticist, maintained an active research program on new cancer therapies. This spring, when he was offered the job of president of the brand-new Canadian Institutes of Health Research (CIHR)—an Ottawa-based collection of 13 virtual institutes that the government hopes will change the face of Canadian biomedical research—he accepted on one condition: He would work at least 1 day a week in his Lunenfeld lab.

    “It's important that the head of the CIHR not only be a scientist, but that he be seen to be a scientist,” says Bernstein. He believes his ongoing research will not only inform his decisions at CIHR but also send a message that the new, $330 million entity is about doing cutting-edge science, not pushing paper. His good friend Harold Varmus adopted the same strategy when he became head of the U.S. National Institutes of Health in 1993 and set up an intramural lab at the Bethesda, Maryland, campus. “Someone like Alan who's perceived as a scientist, rather than a politician or an administrator, brings a level of credibility to the job that's really important,” says Varmus, a Nobelist who left government last December to become president of Memorial-Sloan Kettering Cancer Center in New York City.

    Bernstein, 53, is in the enviable position of shaping an entire national research program from scratch. On 7 June the CIHR became the primary source of investigator-initiated extramural grants for the Canadian health sciences. It took over from the country's Medical Research Council (MRC), whose president, Henry Friesen, lobbied hard for CIHR in the hope that it would generate bigger biomedical research budgets and broaden the country's portfolio of health research. So far he's right: In addition to setting up new institutes on population-based and health services research (see table), the government has promised to provide an additional $72 million in fiscal year 2000–01 for electronic links—Web sites, chat rooms, and the like—to connect scientists at all the institutes.

    Bernstein hopes to strengthen Canadian biomedical research by earmarking as much as 30% of CIHR's overall budget for targeted research. The approach, new to Canada, would pour money into “very focused, very specific” questions, he explains, in areas such as the vascular complications of diabetes, the genetics of schizophrenia, and the high incidence of suicide among aboriginal peoples. It's a bold and potentially divisive strategy: Traditionally, the MRC spent most of its money on individual investigator grants, and any type of top-down direction was seen by scientists as an unwise intrusion into the untrammeled pursuit of knowledge. “Free spirits chafe at sort of being confined,” says Matthew Spence, president of the Alberta Heritage Foundation for Medical Research. “The trick is to have the strategic direction articulated in such a way that the scientific community moves toward it willingly and naturally.”


    • Aboriginal People's Health

    • Cancer Research

    • Circulatory and Respiratory Health

    • Gender and Health

    • Genetics

    • Health Services and Policy Research

    • Healthy Aging

    • Human Development and Child and Youth Health

    • Infection and Immunity

    • Neurosciences, Mental Health, and Addiction

    • Musculoskeletal Health and Arthritis

    • Nutrition, Metabolism, and Disease

    • Population and Public Health

    Bernstein believes scientists will embrace such an arrangement once they realize that they'll be setting the agenda. Under the operational model adopted for the CIHR, every Canadian biomedical or health scientist must choose to be linked with one of the 13 institutes. That affiliation allows them, if they so choose, to become involved in crafting a strategic plan that the institute will implement through a series of requests for applications.

    Each institute will be based at the home school of its director, who will fill what Bernstein sees as a “half-time job,” and observers are anxious to see whether a handful of research-intensive universities will capture the lion's share of the prizes. Bernstein, a Toronto native who has worked in the city for most of his career, says geographic distribution isn't an issue for him; he's more concerned about each institute's ability to move Canadian science into hot new areas. “We haven't done badly,” he says. “But we haven't done an A-plus job as a country.” All competitions will be administered through centralized peer review at the CIHR head offices in Ottawa.

    Colleagues say Bernstein's ability to anticipate trends will help him convince scientists that more strategic research is essential. Bernstein has “a very strong and, indeed, visionary understanding of where biomedical science is going,” says Abraham Fuks, dean of medicine at McGill University in Montreal. It doesn't hurt that Bernstein is “hard-nosed” and “bristlingly smart,” adds Phil Branton, outgoing McGill University chair of biochemistry and a close friend since the pair studied medical biophysics at the University of Toronto. “He has high international standards and believes there's no reason why Canada shouldn't be playing that game.”

    That game takes money, however. Bernstein says CIHR's budget must be doubled, to 1% of the country's total spending on health care, for Canada to remain globally competitive.

    The challenges facing CIHR have also tapped Bernstein's characteristically high level of energy and his sky's-the-limit attitude. He says he doesn't mind adding 1-hour flights between Toronto and Ottawa to his already hectic work schedule, and he plans to continue playing cello in an amateur string quartet to relieve stress. Least of all does he care whether maintaining his lab rattles the cages of Canadian tradition.

    “As I keep telling everyone around here, we have no traditions at CIHR,” he says. “We are a brand-new organization.”


    New Survey to Collect Global News You Can Use

    1. Jocelyn Kaiser

    A new 4-year, $20 million project hopes to synthesize what's known about the world's ecosystems and help policy-makers deal with those under siege

    The clear waters of Lake Victoria in eastern Africa once supported more than 350 species of cichlids, small bony fish that were a dietary staple of local villagers. Then fishery managers, in an attempt to bolster exports, introduced the larger, tastier Nile perch and Nile tilapia. The newcomers thrived in the lake's cool, deep waters and began gobbling up the cichlids. As the perch took over, fertilizer runoff from local plantations combined with sewage and industrial wastes to trigger vast algal blooms that sucked oxygen from the water. Today, up to half the cichlid species have vanished, residents who can't afford perch show signs of protein deficiency, and the once-healthy lake may soon become a dead zone.

    Scientists say the tragedy that unfolded at Lake Victoria over the past 30 years might have been prevented, or at least mitigated, had local authorities known more about how changes to one part of the environment affect the entire system—and what to do about them. Even today, they add, there is a role for experts in assessing the lake's woes and proposing remedies. Generating that type of information—both global and local—is the aim of the Millennium Ecosystem Assessment, a 4-year, $20 million effort that will get under way early next year.

    The assessment will turn loose ecologists and social scientists to gather and analyze data on the state of the world's ecosystems, assess nature's ability to provide essential “services” such as food and clean water, and project environmental trends such as deforestation, loss of species, and pollution (Science, 22 October 1999, p. 685). “It's simply a way of taking stock of what we know, what we need to know, and what are the consequences,” says ecologist Jane Lubchenco of Oregon State University in Corvallis. The project will also carry out regional assessments to help policy-makers cope with pressing problems and avert future disasters.

    Proponents say the assessment, funded mostly by the United Nations, the World Bank, and foundations, will transcend past efforts that viewed problems such as water scarcity and global warming in isolation. It's a “more holistic approach” than previous assessments, says Walter Reid, former vice president of programs at the World Resources Institute (WRI) in Washington, D.C. He's acting science director of the assessment, which will be highlighted next week at a meeting of world environment ministers in Bergen, Norway. Supporters have already won commitments for about two-thirds of the total funds and hope to have the rest by early next year.

    But some say that the assessment could end up a victim of its own lofty goals—to evaluate ecosystem health, supply data to implement treaties, and sell the message that ecosystems have economic value. “The challenge the [assessment] faces right now is that it's trying to do several extremely difficult things all at the same time,” says Harvard environmental policy expert William Clark. Still, if the assessment manages to focus attention and resources on environmental sore spots like Lake Victoria and add substance to fuzzy buzzwords like sustainable development, it will be a success, Lubchenco says. “It's tremendously exciting, because it's what the world needs right now,” she says.

    The Millennium Assessment is seen by many ecologists as the logical next step in taking stock of the global environment. An earlier scientific consensus that the ozone hole posed a serious environmental threat led to the 1987 Montreal Protocol, an agreement to phase out the use of ozone-destroying synthetic chemicals. Climate scientists have used a similar approach in the Intergovernmental Panel on Climate Change (IPCC), begun in 1988. The panel's 1995 report pointing to a human influence on global climate helped build support for the 1997 Kyoto treaty, with its plan to curb greenhouse gas emissions. Looking at ecosystems next is “a natural thing to do,” says Reid. “These really are the big issues that confront development around the world.”

    The next PAGE

    The complexity of most environmental problems demands looking at the big picture, Reid explains. Storing carbon in tree farms, for example, might seem like a good way to combat global warming, but it could also reduce biodiversity by replacing natural ecosystems with single-species plantations. “Part of what led the scientific community to this [point] was its frustration with things like the climate assessment, which is looking at only that one driving force,” Reid says.

    WRI has already laid the groundwork for the assessment, with Reid spearheading the project and WRI, the United Nations, and the World Bank providing the funding. For the past 2 years, WRI has conducted a Pilot Analysis of Global Ecosystems (PAGE), to be released next week as part of WRI's biennial World Resources 2000–01 report. Reid says PAGE was “a proof of concept that proved data and information are available at a global scale.” WRI also intended it to be useful to policy-makers. For example, as part of the assessment, agricultural scientists at the International Food Policy Research Institute in Washington, D.C., gathered data on fertilizers and yields to produce high-resolution global maps showing which regions are degraded to the point that they may no longer be able to support crops.

    Persuaded by the pilot project, environmental scientists are eager to proceed. The full assessment will “go into much more depth than we've ever done before,” says Robert Watson, environment department director at the World Bank. Those details include filling data gaps in such areas as global forest cover and how much soils erode naturally, as well as weighing trade-offs, such as growing more crops by converting forests to farmland versus using more fertilizers and pesticides. And unlike the in-house WRI effort, the full assessment will tap a broader community. “It will be the world's best scientists trying to assess what we know about a global issue and how it pertains to development in general,” says Watson, who anticipates IPCC-sized teams—500 authors and 2000 reviewers—for chapters on ecosystem conditions, their future, and possible policy responses.

    Like the IPCC, the ecosystem assessment will be primarily a literature synthesis, although the authors hope to fold in new data such as high-resolution Landsat 7 images of global land cover. They also expect to develop two dozen indicators for ecosystem health, such as the crop maps that were developed as part of PAGE. Ecologists think such measures can help pin down environmental trends. Lubchenco says she's looking for the ecological equivalent of the 42-year record of soaring carbon dioxide levels from Mauna Loa in Hawaii that illustrates how humans are altering global climate. “What we don't have yet for ecosystems is a single compelling icon or set of measurements,” she says.

    The assessment will also extrapolate environmental trends, much as the IPCC has looked at changes in vegetation and farming conditions stemming from global warming. For example, Millennium Assessment experts might examine how an anticipated doubling in nitrogen from fertilizers over the next 40 years will impact water quality, fisheries, and even human disease. “This is a new area for ecologists,” says ecologist Stephen Carpenter of the University of Wisconsin, Madison.

    But the assessment is intended to be more than a scientific exercise. Organizers hope to tailor the information to the needs of those implementing international environmental agreements. In doing so, they want to avoid the fate of a similar effort that involved only scientists, whose tome “sank like a lead balloon” among policy-makers, says Stanford ecologist Gretchen Daily (see sidebar). Toward that end, Reid has gotten endorsements from leaders of the implementing bodies of three major environmental treaties—desertification, wetlands, and biodiversity—all of which will have representatives on the Millennium Assessment's 30- to 40-member board.

    Another element designed to win over politicians is a series of 10 regional, national, and local-scale assessments that would, for example, examine solutions for places like Lake Victoria, or weigh plans to dam and divert the Mekong River in Southeast Asia, now the world's largest undammed river. Such assessments aren't new, Carpenter says, but “there are very few examples as integrated as what the Millennium Assessment envisions.” Village-scale assessments will take an even finer-scale approach. Madhav Gadgil of the Indian Institute of Science in Bangalore, India, has already done a pilot version that inventoried species and conditions such as erosion rates and water quality in a cluster of villages. Its findings, he says, have already affected what trees local people harvest and how they manage fish in streams. Recognizing that not all countries have the necessary expertise, organizers plan for these smaller scale studies to also train scientists in less developed countries so that, for example, they can analyze the Landsat 7 remote sensing data themselves.

    Although the assessment will offer advice to policy-makers, it will be in the form of scenarios rather than recommendations. And if the assessment is done properly, say proponents, the result will be not only the most rigorous, accessible data set ever on world ecosystems, but also a document that will be hard for governments to ignore. “This report will tell the truth [and] embarrass people that ought to be embarrassed,” says adviser Jose Goldemberg, an energy expert at the University of São Paulo.

    Reid and other participants don't want to promise too much: “Even if it meets its scientific goals, there's no telling whether it will have much impact on policy,” Daily says. But for the residents of Lake Victoria and other places around the world, the Millennium Assessment may offer communities the best chance to recover from an environmental disaster—and perhaps avoid the next one.


    Ecologists Hope to Avoid the Mistakes of a Previous Assessment

    1. Jocelyn Kaiser

    For ecologists, an attempt to assess the world's ecosystems (see main text) has a familiar ring to it. A few years ago, many of the same scientists poured their energy into the Global Biodiversity Assessment (GBA). But that exercise sank without a trace after participants failed to find a receptive audience. “When you confront people with a 1000-page thing [containing] everything people think is important, that's not a way to move the political agenda,” says emeritus ecologist Gordon Orians of the University of Washington, Seattle.

    The United Nations Environment Program (UNEP) commissioned the GBA, which was conceived as a way to help countries carry out the 1992 Convention on Biological Diversity, and the Global Environment Facility kicked in $3 million to fund it. About 300 scientists from 50 countries helped write chapters on topics such as the magnitude and distribution of biodiversity, inventorying and monitoring, and species' economic value, and up to 1500 reviewed it. The report is “an excellent [scientific] document,” says its organizer, Robert Watson of the World Bank. But as a policy tool, “it's not had the value many of us think it could have had.”

    The biggest mistake, say Watson and others, was a failure to determine ahead of time what policy-makers needed to know. “The scientific community just decided we needed this and did it,” says Jane Lubchenco of Oregon State University in Corvallis. She says this was also a political error: Countries were “not particularly welcoming,” because they feared being blamed for not adequately protecting species. Moreover, its sponsor put little effort into publicizing the study, Watson says.

    Participants in the new ecosystem assessment hope to avoid those mistakes. The study will have input from the secretariats of the biodiversity treaty and other treaties, and organizers also have earmarked funds for public outreach. For all its faults, GBA had “a lot of hidden value,” says Lubchenco, in stimulating research, framing questions, and linking experts around the world.
