News this Week

Science  07 Jul 2000:
Vol. 289, Issue 5476, p. 22



    Evidence Mounts That Tiny Particles Can Kill

    1. Jocelyn Kaiser

    Four years ago, the U.S. Environmental Protection Agency (EPA) ignited a firestorm when it declared that tens of thousands of people were dying each year from breathing tiny particles of dust and soot—and issued tough new regulations to crack down on these pollutants. Industry groups and many scientists assailed the decision, arguing that the data underlying the new particulate matter (PM) standard were inconclusive at best, and industry took its case to court. Now, a long-awaited study, by a group widely perceived to be politically neutral, comes in solidly behind the earlier EPA decision and strongly implicates particles in excess deaths.

    The study is the largest yet to examine the relation between daily levels of particles—which come mainly from soils, motor vehicles, and power plants—and deaths in the United States. Released last week by the Health Effects Institute (HEI) in Cambridge, Massachusetts, a nonprofit organization funded by industry and the government, the study* found that death rates in the 90 largest U.S. cities rise on average 0.5% with each 10-microgram-per-cubic-meter increase in particles less than 10 micrometers in diameter, known as PM10. That number is not much different from those found in earlier studies. But this time, the case is stronger because the breadth of the new study dispels any notion that the effect might have been caused by a pollutant other than PM10, or even hot weather. Indeed, although many questions remain about how fine particles kill people, the HEI study shows there's no mistaking that PM is the culprit, lead author Jonathan Samet of Johns Hopkins University says emphatically.
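    As a rough illustration of what that effect size means, the 0.5%-per-10-unit figure can be turned into excess deaths per day. The baseline numbers below are invented for the example, not taken from the study:

```python
# Back-of-envelope sketch of the NMMAPS effect size: daily death rates
# rise roughly 0.5% for each 10 microgram-per-cubic-meter rise in PM10.
def excess_deaths(baseline_daily_deaths, pm10_rise_ugm3, pct_per_10_units=0.5):
    """Extra deaths per day implied by a linear percent-per-10-unit effect."""
    return baseline_daily_deaths * (pct_per_10_units / 100.0) * (pm10_rise_ugm3 / 10.0)

# Hypothetical city averaging 200 deaths/day during a 30 microgram/m^3 PM10 spike:
print(excess_deaths(200, 30))  # -> 3.0 extra deaths that day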

    It was similar studies of the relation between day-to-day fluctuations in fine particles and death rates that raised the alarm about PM some 10 years ago. In cities such as Philadelphia, researchers found that on days when air pollution jumped yet remained within federal standards, there were more deaths and hospitalizations of elderly people for cardiac and lung disease. Although the increase was slight in each city, studies of the long-term effects of particles found it added up to a significant number of deaths, roughly 60,000 a year by some estimates. Lab studies showed that the tinier the particle, the more likely it was to lodge in the lungs, suggesting to EPA that it needed to target even finer particles than before—those less than 2.5 micrometers across.

    But in 1996, when EPA proposed a first-ever maximum level for PM2.5 together with tighter ozone standards, industry groups went on the warpath. In congressional hearings, scientists also raised a host of questions, among them whether the apparent link between deaths and PM levels was real or due to other pollutants (Science, 25 July 1997, p. 466). EPA went ahead with the standard but built in a 5-year delay to allow for more research. Meanwhile, a U.S. appeals court last year ruled that the science supported EPA's PM2.5 standard. This fall, the U.S. Supreme Court will look at a related legal question—whether EPA's reading of the Clean Air Act amounts to an unconstitutional delegation of legislative power by Congress.

    To help resolve the uncertainties in PM epidemiology, HEI in 1996 began funding the National Morbidity, Mortality, and Air Pollution Study, or NMMAPS. Samet's team and collaborators at Harvard scoured federal databases on daily deaths, weather, and air pollution. By including every major city with significant PM pollution and using the same methods to analyze each, the team achieved statistically stronger results than had previous single-city studies. The results varied by region—the rise in death rates was highest in the Northeast and lowest in the Southwest—but overall, the team found “a robust effect” from PM10 across the 90 cities, says Samet. Adds biostatistician Gerald Van Belle of the University of Washington, Seattle, “Whatever concern there was about a single-city idiosyncratic effect is no longer a tenable hypothesis.”

    Other recent research has also firmed up the case against fine particles. For example, studies of men equipped with heart monitors have found potentially harmful changes in their heart rates with rising PM levels (Science, 21 April, p. 424). A related NMMAPS study released in May showed that outdoor fine particle measurements are likely a reasonable surrogate for the particles that get inside homes, where people spend most of their time. And this month, HEI expects to release another study whose preliminary results appear confirmatory—a reanalysis of two controversial papers, one known as the Six Cities Study, on long-term deaths and PM levels in cities.

    Still, NMMAPS raises as many questions as it answers. “The issue now is, why is there this heterogeneity across the country?” says Sverre Vedal, a toxicologist at the University of British Columbia. One possibility is that PM in the Northeast contains more sulfate than in the West, due to coal-burning power plants. Figuring out which components of PM—sulfates, metals, acids, or ultrafine particles—inflict harm, and how they do so, is critical for determining what sources EPA should regulate. Lab researchers are now attacking those issues with “a huge wave of mechanistic work,” notes Samet, who expects some answers soon.


    Ames's Proposal for Lab Triggers Battle at NASA

    1. Andrew Lawler

    Just east of California's sprawling Silicon Valley sits Moffett Field, home to NASA's Ames Research Center. The 800-hectare former Navy base also includes an airfield, abandoned wind tunnels, and plenty of open space, the last a rare and valuable commodity in this booming region. Ames officials would like to trade the use of part of that land for a new building largely dedicated to NASA's nascent astrobiology program, the core of which is a 2-year-old virtual institute based at Ames (Science, 20 March 1998, p. 1840). Although some scientists applaud the idea, stiff opposition from other researchers and from officials at NASA headquarters could well sink it.

    The fracas began last year, when Ames managers decided to cash in on their prime real estate and their proximity to a rich supply of intellectual capital. “We want to create a research park with university, government, and industry partners,” says Bill Berry, Ames deputy director. “And we can make land available to our partners to further NASA's research agenda.” Ames officials envision a 32-hectare site with a 10,000-square-meter lab, primarily dedicated to astrobiology. Built by the potential partners, it would cost NASA about $15 million to outfit and another $15 million a year to operate. NASA Administrator Dan Goldin is intrigued by the idea, which Ames officials say has attracted the attention of local companies.

    It has also drawn criticism. Some researchers and officials at NASA headquarters argue that a brick-and-mortar project defeats the purpose of a virtual institute, and that a building would divert money from science. “There's no compelling reason to do this,” says David Black, director of Houston's Lunar and Planetary Institute. Adds one NASA manager: “This is not an obvious slam dunk, so why do it?”

    Faced with such conflicting views, NASA appointed an 11-person panel last fall to conduct an independent review of the proposed facility. Although its report is not due until September, NASA last month asked the chair, Stanford geologist Donald Lowe, for a preliminary summary to inform its planning for the 2002 budget cycle. The 10-page document, obtained by Science, says that the facility “represents an opportunity to establish a unique national laboratory.” In particular, the panel suggested that the facility could be used to design and test astrobiological experiments; to simulate the environments of other planets, comets, or asteroids; and to provide computational capability for everything from understanding planetary formation to the self-assembly of living systems. “I think it's a pretty good idea,” says panelist David Deamer, a biophysicist at the University of California, Santa Cruz. At the same time, the document warns that start-up and operating funds should not come out of the hide of other NASA science programs.

    The Lowe panel's findings rattled opponents of the new lab. According to NASA sources, Anne Kinney, the NASA headquarters official who leads the parent Origins program, wanted the Astrobiology Task Force—a group of a dozen outside researchers—to consider the findings. On 9 June, just 3 days after Lowe submitted his preliminary report, the researchers discussed the document in a teleconference; the next day they submitted a letter to Kinney. In it, task force chair Charles Beichman, an astronomer at the Jet Propulsion Laboratory in Pasadena, California, opines that the three research areas outlined by the Lowe panel are not compelling enough to warrant a new facility. “We do not agree that it is necessary to establish a national laboratory,” the letter states.

    Task force members say it wasn't a close call. “To spend $15 million a year on this when researchers are struggling along with $50,000 grants is not a good balance,” one member explains. Black also casts doubt on some of the plans themselves. The idea of creating an environmental simulation facility, for example, “is nuts. … You can't simulate the gravity on a comet,” he says.

    Next week the battleground shifts to Washington, D.C., when Beichman briefs NASA's space science advisory committee. Lowe is out of the country and will not attend the meeting, although another panel member might speak on his behalf. That arrangement infuriates the lab's supporters, who feel that NASA headquarters has ears only for the critics. Last week Ames's director, Henry MacDonald, wrote to Ed Weiler, NASA space science chief, to complain about how the Lowe report has been handled.

    But lab supporters may be fighting a losing battle. The head of the astrobiology institute, Nobel Prize-winning biologist Baruch Blumberg, has refused to take sides in the disagreement and did not return calls seeking comment. His silence, along with a lack of support from Weiler and the advisory committee, could doom any land-for-space deal.


    Mutation Points to Salt Recycling Pathway

    1. Ingrid Wickelgren

    One in five Americans and Europeans has high blood pressure, a dangerous disorder whose numerous genetic causes are only beginning to be revealed. Now a team of researchers at Yale Medical School has uncovered a piece of the puzzle: a gene mutation that leads to early-onset hypertension and may point the way to causes of more common forms of high blood pressure.

    On page 119, the Yale group reports finding a new mutation in the so-called mineralocorticoid receptor, a protein in kidney cells that is involved in the body's handling of salt. The receptor has long been thought to play a role in regulating blood pressure, but this is the first work to demonstrate how alterations in it could give rise to hypertension. In particular, the findings may explain why some women experience a sharp rise in blood pressure during pregnancy. “It's a genuine tour de force,” says nephrologist Friedrich Luft of the Max Delbrück Center for Molecular Medicine in Berlin, Germany. “It uncovers new and unexpected mechanisms for hypertension” in humans, he says, that could one day lead to better treatments and new diagnostic tools for the disorder.

    Moreover, the work produced a major, unexpected bonus: The researchers discovered a key mechanism that the entire family of steroid and related nuclear hormones—such as estrogen and thyroid hormone—appears to use to trigger the receptors they dock into. That finding could aid the design of drugs for conditions from breast cancer to congestive heart failure. “It's a beautiful example of a clinical investigation taken to a molecular level,” says molecular biologist Christopher Glass of the University of California, San Diego, School of Medicine.

    That investigation began about 2 years ago, when the Yale team, led by geneticist Richard Lifton, began systematically screening patients with hypertension for variants in a suite of genes they had previously linked to blood pressure abnormalities in humans. As part of that study, Lifton's group analyzed a blood sample, shipped to them by doctors at Albert Einstein College of Medicine in New York City, from a 15-year-old boy with severe hypertension.

    The boy, the Yale team discovered, had a mutation in the mineralocorticoid receptor. When activated by the steroid hormone aldosterone, this receptor normally triggers a cascade of molecular events that cause kidney cells to absorb salt and water for release back into the blood. This can help prevent dehydration on a hot summer day, but it also raises blood pressure.

    Lifton, postdoc David Geller, and their colleagues then found that this boy's mutation—a single changed DNA base pair—caused hypertension before age 20 in all the boy's relatives who had inherited it. They went on to discover why: The researchers detected activity from the mutant receptor in cultured cells—which had been engineered to glow when the receptor was activated—even in the absence of aldosterone. It appeared to be permanently stuck in the “on” position.

    To their surprise, they also found that progesterone, which gums up the normal receptor, strongly stimulated the mutant version. Because progesterone levels increase dramatically during pregnancy, Lifton wondered whether the hormone would exacerbate blood pressure problems in pregnant women who have this mutation. It seemed to: Worsening hypertension had plagued all five of the pregnancies of two women carrying the mutated receptor.

    Although Lifton does not think this particular mutation will turn out to be common, he says the findings suggest that the salt recycling pathway in which the mutant receptor acts could be important in hypertension. They also suggest that hormones like progesterone might in some cases overstimulate the pathway. The Yale team is now working to identify milder mutations in the mineralocorticoid receptor, and in other proteins in the pathway, that might lead to more common forms of hypertension, including some brought on by pregnancy.

    To determine why progesterone triggers the mutant receptor, Lifton's group teamed up with Yale structural biologist Paul Sigler to model the mineralocorticoid receptor on a computer, based on the known structure of the progesterone receptor, a close cousin. In the model, they confirmed that aldosterone interacted with its receptor using a precisely positioned hydroxyl group, which progesterone lacks. In its tête-à-tête with the mutant receptor, however, progesterone seemed to work by bringing two of the receptor's protein helices in contact with each other. “That suggested this interaction might be key to progesterone's ability to activate the receptor,” Lifton says. His team then proved that's the case by altering the receptor to prevent its helices from touching: Progesterone could no longer activate it.

    The Yale group then examined the crystal structures of other steroid hormone receptors. They saw interplay of the two protein helices in activated forms of every member of the family, including the estrogen and glucocorticoid receptors, suggesting that this point of contact has broad functional significance. The researchers are now trying to block this interplay in each steroid receptor to see if that does indeed prevent the receptor from being activated. If so, the insight might help researchers design drugs to block aberrant effects of any or all of these hormones. In particular, the mineralocorticoid receptor is considered a hot target for novel congestive heart failure treatments as well as new blood pressure drugs.

    And if the work helps pinpoint causes for common forms of high blood pressure, it might eventually lead to earlier identification of people at risk for the disorder, enabling preventive measures to be taken. Genetic insights might also help doctors make more informed choices when prescribing from the “Chinese menu” of blood pressure drugs, Lifton says: “In the long run, we'd like to tailor our medications to the specific underlying abnormality of each patient.”


    'Cluster' Prepares to Make a New Stand

    1. Alexander Hellemans*
    1. Alexander Hellemans writes from Naples, Italy.

    Scientists who were at the Kourou space center on 4 June 1996 will never forget watching Cluster die. One moment, the mission—four identical satellites carrying 11 instruments designed to produce the first three-dimensional (3D) maps of the magnetic fields and plasmas surrounding Earth—was lofting skyward over French Guiana. Then the rocket carrying it exploded, turning one of the European Space Agency's (ESA's) most ambitious scientific projects into fireworks. “We were in shock,” recalls principal investigator Nicole Cornilleau-Wehrlin of the Centre d'étude des Environnements Terrestre et Planétaires in Vélizy, France (Science, 14 June 1996, p. 1579).

    Dismayed scientists doubted whether ESA could muster the will, or the funds, to start over (Science, 28 June 1996, p. 1866). But Cluster is poised to fly again. If all goes well, four Cluster II spacecraft, built entirely from scratch, will reach orbit two at a time in mid-July and early August.

    “I applaud ESA's determination to fly it again,” says Cluster co-investigator Patricia Reiff of Rice University in Houston. “This is science that is not being done in any other mission before or in planning.”

    Credit for squeezing the Cluster II project into ESA's already tight science budget belongs to ESA's science director, Roger Bonnet, says principal investigator Donald Gurnett, a space scientist at the University of Iowa in Iowa City. “Bonnet really did a great job in convincing the European Community that they should be falling in,” Gurnett says. John Ellwood, Cluster's project manager at the European Space Research and Technology Centre (ESTEC) in Noordwijk, the Netherlands, agrees. “We didn't have the money initially,” he says. “It took us a year to get the mission going again.”

    To come up with the 318 million euros needed for Cluster II, ESA siphoned some funds from the operations budget of the first mission, rescheduled other missions, and took advantage of improved technology. Higher capacity memory chips alone saved millions of euros, Ellwood says, by enabling the new satellites to download data to one ground station instead of two. To economize on launch costs, ESA teamed up with STARSEM, a joint venture between Arianespace and the Russian Space Agency, which will launch the quartet of Cluster II satellites on two Soyuz rockets from the Baikonur Space Center in Kazakhstan. Soyuz, the old workhorse of the Soviet Union, is considered the most reliable launcher available, with a success rate of over 98.5% in 1600 launches. At 30 million euros per launch, it is also a bargain. “Two Soyuz cost less than an Ariane 4,” says Philippe Escoubet, Cluster's project scientist at ESTEC.

    During their planned 2 years of operation, the satellites will fly in a tetrahedral formation—the optimal configuration for 3D imaging. Ground controllers will vary the distances among the satellites in order to observe different parts of the magnetosphere, such as the polar cusps and the magnetotail. “If we have a perfect injection into orbit by Soyuz, some fuel will be left over, and we will be able to extend the project for a third year,” Escoubet says. One benefit of the delay is that the mission will be active when the sun reaches maximum activity later this year or next year.

    Among scientists, expectations are high. Says principal investigator Hugo Alleyne of the University of Sheffield in the United Kingdom: “In terms of understanding the solar wind, magnetospheric boundaries, and interactions, it will be a quantum jump.”


    New European Group Lobbies for Support

    1. Lone Frank*
    1. Lone Frank writes from Copenhagen.

    COPENHAGEN—Reeling from budget cuts and public doubts about genetically modified foods, European plant scientists are mounting an ambitious effort to persuade European Union (E.U.) officials to plow more money into their field. But their blueprint for change, intended to prevent them from falling farther behind their global counterparts, has so far failed to win any promises from E.U. commissioner and science chief Philippe Busquin.

    The plan was drawn up by the fledgling European Plant Science Organization (EPSO), an independent body that represents 30 leading labs from 20 European countries. The group was set up in February, and last month it presented Busquin with the 10-year plan. “There is an acute need to organize the research effort and to increase funding for plant science if Europe wants to stay competitive in this field,” says the group's chair, plant geneticist Marc Zabeau of the University of Ghent in Belgium.

    EPSO's top priorities include boosting funding for basic plant science and coordinating research activities on both national and E.U. levels. The fundamental studies would eventually translate into new ways to raise yields and slash pesticide use. “If science is to tackle these challenges, there is a dire need for more integration,” says plant geneticist Michael Bevan of the United Kingdom's John Innes Center. Zabeau sees parallel national initiatives in plant genomics in Germany and France as examples of unnecessary competition for scarce resources. At the same time, he acknowledges that collaborative research also faces many obstacles. “We have networks in place, but we need significant funding and an infrastructure that allows us to come together in practice,” says Zabeau.

    Under Framework 5—the E.U.'s research portfolio for 1999 through 2003—plant science no longer has its own pot of money but must compete with microbial and animal research. One result is a two-thirds reduction in the number of successful proposals (seven in the first round) compared with the previous Framework program. Among the casualties is a plan to complete work on the well-studied mustard, Arabidopsis thaliana. After the E.U. put up $20 million toward an international sequencing effort, “all funding was denied for the final phase,” says EPSO coordinator Karin Metzlaff.

    National funding for plant research has also declined, with Denmark and the Netherlands particularly hard hit. Dutch authorities are in the midst of implementing a 30% cut over 5 years and have ended funding for all collaborative projects. All Danish programs in plant biotechnology will expire by 2002, and no new initiatives are planned.

    That alarms industry, which relies upon a strong public-sector commitment to basic research, says Georges Freyssinet, head of global genomics research at Aventis CropScience in Lyons, France. “The next 5 to 10 years will be essential,” he says, warning that now is the worst possible time to be reducing government support for plant research. European companies may need to move research operations elsewhere if the political culture surrounding genetically modified foods does not improve, he adds.

    E.U. officials have already begun to plan for the sixth Framework Program, which starts in 2004. And although Busquin confesses that the plant science community has his ear, he says that “risk assessment and ethical aspects are important issues that have to be addressed.” The research chief also doesn't want to be seen as catering to a particular discipline: It is “impossible at this point to promise more funds for individual fields,” he says.


    Getting a Feel for Genetic Variations

    1. Robert F. Service

    Even before last week's news of the near-completion of the human genome sequence, researchers had set about the arduous task of figuring out just how the sequence differs among individuals—and how those variations may predispose people to various illnesses. Current genetics technologies make it relatively straightforward to determine where such differences lie on a given chromosome, say, chromosome 11. But determining which of the two copies of that chromosome the change resides on—a necessary first step toward linking those variants to diseases—has proved challenging indeed. Now a team of researchers at Harvard University and the Massachusetts Institute of Technology (MIT) has come up with a novel atomic imaging microscope that may dramatically speed this task.

    The microscope is a modification of the popular atomic force microscope (AFM), which uses an ultrasharp tip to map surfaces of everything from computer chips to DNA at the atomic level. The new version caps a conventional silicon tip with a carbon nanotube, an ultrathin, strawlike network of carbon atoms a mere nanometer or so across. By using this molecule-sized tip, the researchers—led by Harvard chemist Charles Lieber and MIT geneticist David Housman—were able to get their AFM to march down a strand of DNA and identify uniquely shaped reporter molecules engineered to tag the genetic variations. In this case, these variations are sites along a chromosome that harbor a one-letter difference in spelling called a single-nucleotide polymorphism (SNP).

    As they describe in the July issue of Nature Biotechnology, the Harvard/MIT group—which includes postdoc Adam Woolley, then-postdoc Chantal Guillemette, and grad student Chin Li Cheung—was able to feel its way around one of the most vexing problems in genetics: haplotyping. Haplotyping is the task of figuring out exactly which of several possible gene variants occurs on a given chromosome. The difficulty arises because each cell contains two copies of each chromosome, one from the mother and one from the father. At any location along the chromosome, geneticists can tell whether the two chromosome copies are identical—that is, whether they contain the same chemical letter—or different. But when the chromosomes differ—that is, contain a SNP—the researchers can't readily tell which letter belongs on which chromosome. And the exact spelling of each chromosome is essential information, because it may change a gene into a disease-causing form, says Housman.
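    A tiny sketch (illustrative only; the function and names are invented for this example) shows why unphased genotypes are ambiguous: with n heterozygous SNP sites, the two chromosome copies can be spelled in 2^(n-1) consistent ways, and only a direct read of one molecule, which is what the nanotube AFM provides, picks out the true pair:

```python
# Why haplotyping is hard: enumerate every pair of chromosome spellings
# consistent with an unphased genotype (per-site letter pairs).
from itertools import product

def compatible_haplotype_pairs(genotype):
    """genotype: list of per-site letter pairs, e.g. [('A','G'), ('C','C')].
    Returns the set of unordered haplotype pairs consistent with it."""
    pairs = set()
    for choice in product(*genotype):
        h1 = ''.join(choice)
        # The second chromosome gets the other letter at each site.
        h2 = ''.join(b if c == a else a for (a, b), c in zip(genotype, choice))
        pairs.add(frozenset((h1, h2)))
    return pairs

# Two heterozygous sites flanking one homozygous site: two possible phasings.
geno = [('A', 'G'), ('C', 'C'), ('T', 'G')]
print(len(compatible_haplotype_pairs(geno)))  # -> 2
```

    With three heterozygous sites the count grows to four, and so on; this combinatorial blow-up is what forces the statistical inference described below when chromosomes cannot be read directly.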

    Currently, explains Andrew Collins, a geneticist at Southampton University in the United Kingdom, researchers do family studies to look for disease genes. If they can't find suitable families, they look at the frequency with which different SNPs pop up in many individuals and then resort to statistical methods to infer the likely exact spelling of each chromosome. But this process is “prone to error,” he says.

    The nanotube-based AFM may change that by enabling researchers to forgo statistics and observe the SNPs on a chromosome directly. The researchers borrowed an idea from the standard sequencing method, which reveals the DNA's four chemical letters in living color by linking a different fluorescent dye molecule to each of them. But here, instead of using a fluorescent signal, the researchers added an oligonucleotide—a short strand of DNA designed to bind to a single complementary DNA fragment, in this case one surrounding a known SNP location. Each oligo was engineered to stick only when the SNP harbored a particular genetic letter—G, for example. To this oligo they hitched a reporter compound. As the AFM tip marched along the atomic hills and valleys of the DNA, a bump from the reporter compound told the researchers they had found their G SNP. By adding several oligo-reporter combinations, then simply reading down a section of a gene of interest, they could readily decipher whether a series of SNPs of interest was present on the same chromosome. Says Collins: “That's a very useful thing to have.”

    For now the Harvard researchers are looking at DNA strands around 1000 genetic letters long. That's on the short side for many geneticists trying to associate combinations of SNPs with disease. But Lieber says there's no reason the technique shouldn't work with sequences perhaps as long as 100,000 letters. Furthermore, by borrowing data-storage techniques, geneticists may be able to create arrays of hundreds of AFM tips working in parallel to carry out ultrafast haplotyping. If so, says Robert Waterston, a geneticist who heads the genome sequencing center at Washington University in St. Louis, Missouri, “[this] could be the start of something impressive.”


    Italian Scientists Seek to Reverse Budget Cuts

    1. Michael Balter

    PARIS—The closing session of the XIII International AIDS Conference in Durban, South Africa, next week will be a proud moment for Italy. That's when Italian researcher Stefano Vella becomes president of the International AIDS Society (IAS), which organizes these biennial conferences. But even as the Durban meeting highlights Italy's prominence in the AIDS community, the Italian government is gutting the country's national AIDS program.

    The cuts, which take effect in the 2000 fiscal year beginning 1 July, amount to a 36% reduction in extramural grant funding from the current year's level. They continue a trend begun in 1997. Italian AIDS researchers have known about the latest round of cuts for several months. But only after the late-April appointment of a cancer researcher, Umberto Veronesi, as Italy's new health minister, and in the run-up to the Durban meeting, have they begun to speak out about the cuts' harmful effects.

    “The national AIDS program is one of Italy's big success stories,” says Vella, who directs the AIDS clinical research program at the Istituto Superiore di Sanità (ISS) in Rome, the agency that provides nearly all extramural funds for AIDS research. “We don't want everything we have accomplished to be lost.” Their pleas have attracted international support. “Italian scientists are very important players in the global AIDS research effort,” says Anthony Fauci, director of the U.S. National Institute of Allergy and Infectious Diseases in Bethesda, Maryland.

    The 13-year-old Italian AIDS program first ran into trouble in 1997, when then-health minister Rosy Bindi froze AIDS funding for several months. The freeze was part of a government reordering of priorities that placed more emphasis on applied research (Science, 11 April 1997, p. 191). When the dust settled, the $13.6 million extramural program had been reduced to just over $10 million. And the decline has continued: The proposed budget for 2000–01 is only $6.3 million. “This will have a major impact,” says AIDS researcher Guido Poli of the San Raffaele Scientific Institute in Milan, who depends on ISS grants for about 80% of his lab's funding. “I will have to severely reduce many of my current projects, and it will affect our ability to pay young researchers and to participate in meetings.”

    Italian AIDS researchers are now hoping that Veronesi, who replaced Bindi when a new Italian government took office this spring (Science, 5 May, p. 791), will be more sympathetic to their cause. Now that Veronesi has had time to get settled into his job, Vella and his colleagues say they are hoping to meet with the minister to discuss how to reverse the funding cuts. Veronesi was unavailable for comment.

    If the cuts are not restored, the election of an Italian as IAS president may turn out to be a hollow reward. Says Fauci: “If [Italian researchers] are unable to pursue their scientific activities at full speed because of a lack of resources, the entire global AIDS research effort will suffer.”


    Radio Galaxies Return From the Dead

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    Even for deep-space objects, radio galaxies are odd beasts—so odd that scientists have trouble explaining why they exist at all. Now astronomers in the Netherlands have deepened the mystery by discovering that some radio galaxies live twice.

    The hallmark of a typical radio galaxy is a double blaze of radio energy, which erupts when thin jets of ionized matter shooting in opposite directions slam into intergalactic atoms at enormous speed, millions of light-years from the galactic core. The origin of the jets is still unknown. Most astronomers suspect that they stream from the poles of a whirling supermassive black hole, which sucks in nearby gases and spews part of them out again as plasma. But just how the black hole's engine works is anybody's guess.

    The plot thickened when Arno Schoenmakers of the Netherlands Foundation for Research in Astronomy in Dwingeloo and colleagues found evidence that the jetmaking machinery can shut down and fire itself up again after millions of years. In the 21 June issue of the Monthly Notices of the Royal Astronomical Society, they report finding what they call double-double radio galaxies—classic double-lobed galaxies sporting a second, older pair of radio lobes much farther out than the first.

    “Apparently, it's a very rare phenomenon,” Schoenmakers says. So far, he and his colleagues have turned up just eight examples: three in a large radio survey of the northern sky, carried out with the Westerbork Synthesis Radio Telescope in the Netherlands, and five others in the astronomical literature. All are enormous, with outer lobes stretching 2.5 million to 8 million light-years apart—dozens of times the diameter of our Milky Way. By studying the energy distribution of the radio waves (higher energies fade faster than lower energies), the astronomers estimate that the outer lobes are 100 million years old, give or take a factor of 2 or so. There is no trace of a jet between the older and younger lobes; evidently the polar geysers stayed dormant for millions of years before erupting again.

    What could give jets a new lease on life? Without a good theory of how jets form in the first place, astronomers can only speculate. According to David Merritt of Rutgers University in Piscataway, New Jersey, the jump-start might occur when two galaxies collide. “A merger could be responsible for the second jet,” he says, “since mergers tend to bring gas into the center where [it] can fuel the ‘engine’ that drives the jet.”

    But Schoenmakers thinks that scenario is unlikely. Because mergers can occur at any time in a galaxy's history, he says, at least some merger-driven double-doubles would be expected to be smaller and younger than the lobes he observes. “There seems to be a relation with age,” he says. “Something happens to the core when it has been active for a very long time.”

    Schoenmakers's favorite theory is that the central black hole is running out of fuel. Perhaps, after pulling a steady stream of gas into itself for millions of years, the black hole simply starts to exhaust the supply. “It's like your car is running out of gas,” he says, “and the engine starts to sputter.” If so, the radio galaxy's reincarnation may be a temporary hiccup before a final death.

    If the sputtering model is correct, the depths of space might hold radio galaxies with three sets of radio lobes—triple-doubles. “That would be a major discovery,” Schoenmakers says.


    Neighborhood Gamma Ray Burst Boosts Theory

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    Titanic explosions that emit powerful flashes of energetic gamma rays are one of astronomy's hottest mysteries. Now an analysis of the nearest gamma ray burst yet detected has added weight to the popular theory that they erupt during the death throes of very massive stars.

    Very large stars die spectacular deaths at a relatively young age. Some explode as brilliant supernovas; the most massive stars probably collapse into black holes. Computer simulations suggest that the birth of a black hole can be accompanied by the release of tremendous amounts of energy, so these “collapsars” are prime candidates for the source of gamma ray bursts. If so, the bursts should occur in star-forming regions, as massive stars die before they are able to escape from their cosmic cradle. Unfortunately, most galaxies where gamma ray bursts originate are billions of light-years away, so even the keen-eyed Hubble Space Telescope can't tell if gamma ray bursts occur in star-forming regions.

    On 25 April 1998, however, a relatively faint gamma ray burst was detected much closer to home—in a galaxy just 140 million light-years away. Moreover, the burst coincided with a supernova, suggesting a link between gamma ray bursts and the deaths of massive stars (Science, 19 June 1998, p. 1836). Stephen Holland of the Danish Center for Astrophysics with the Hubble Space Telescope in Aarhus and his colleagues observed the host galaxy of this gamma ray burst in June with the Hubble.

    The Hubble images of the galaxy, known as ESO 184-G82, show that the burst indeed occurred in a giant star-forming region. According to Holland's team, which released the images on 27 June, part of the light at the exact burst position may be the fading glow of the supernova, but the surrounding bright objects are either young star clusters or very hot, massive stars.

    The Hubble evidence would be even stronger, astronomers say, if the explosion in ESO 184-G82 weren't such a peculiar event. The supernova was much more powerful than average, but the gamma ray burst was only a fraction of a percent as intense as a normal burst. Team member Jens Hjorth of the University of Copenhagen in Denmark says he believes that “normal” gamma ray bursts also occur in star-forming regions. By analyzing light from the host galaxy of a more typical burst, which reached Earth on 23 January 1999, Hjorth and colleagues found evidence of knotlike structures that look like star-forming regions. But that galaxy was billions of light-years away—too far to get direct evidence of what lay inside it.

    Titus Galama of the California Institute of Technology in Pasadena, who in 1998 discovered the supernova that accompanied the gamma ray burst of 25 April, says the Hubble observations are “not surprising. This is where you expect [gamma ray bursts] to occur.” Hjorth agrees. “But we've had so many surprises in this field,” he says, “that it's good to see that something we expect is actually happening.”

    2001 BUDGET

    Spending Bills Show No Sign of Surplus--Yet

    1. Jeffrey Mervis*
    1. With reporting by Jocelyn Kaiser, Andrew Lawler, David Malakoff, and Robert Service.

    The U.S. Treasury may be awash in excess revenue—an estimated $228 billion next year and $4.2 trillion over the coming decade—but this year many science programs are feeling parched. At the midpoint of the annual budget cycle, health and defense agencies appear headed for large increases, while the National Science Foundation (NSF), the Department of Energy (DOE), NASA, and the National Institute of Standards and Technology (NIST) are facing a fiscal drought compared to their requests (Science, 11 February, p. 952).

    A lot could change before the 2001 fiscal year begins in October, however. Although the House has passed most of its spending bills, the Senate has yet to weigh in on many of them. In addition, tight caps on domestic spending, which are keeping many appropriations figures artificially low, could, thanks to the booming economy, be eased before legislators head home to campaign for the November elections. Finally, President Clinton has already threatened to veto spending bills that don't provide sufficient funds for the Administration's priority programs. The result is a confusing and wildly varied budget picture for science and technology agencies.

    The Administration, which requested big increases for many science agencies, says that science remains on its short list of priorities. “In the current economic climate, we have an obligation to invest in science,” declared Jacob Lew, director of the Office of Management and Budget, last week. “So we're going to stick to our guns for as long as it takes to get the job done right.”

    Republican congressional leaders, meanwhile, defend the spending bills as fiscally responsible, but they are surprisingly candid about the constraints they have faced. “[Although] NSF has fared comparatively well in the appropriations process, I would have preferred to see an increase in funding closer to the level requested,” admitted Representative James Sensenbrenner (R-WI), chair of the House Science Committee, last month when the House approved a 4.3% increase in NSF's budget rather than the 17.3% requested.

    Budget highlights to date include:

    NIH: Biomedical research advocates are edging closer to their goal of doubling the National Institutes of Health's budget by 2003. Last week the Senate approved a bill that would give NIH a $2.7 billion increase, to $20.5 billion. A House version passed on 14 June provides a $1 billion raise, which matches the Administration's request. Representative John Porter (R-IL), who heads the House spending panel that funds NIH, told colleagues voting on a 6% boost that he expects the final NIH tally to be closer to the Senate's figure.

    NSF: The $167 million increase the House approved would raise NSF's budget to $4.06 billion, far short of the $4.57 billion request. It blocks the agency from starting the $17 million EarthScope monitoring project and the $12 million National Ecological Observatory Network, while approving $12.5 million to continue work on HIAPER, a new high-altitude research plane that ranks lower among NSF's priorities. The $696 million education directorate would actually shrink by $2 million, with a $12 million Administration initiative to train information technology workers biting the dust. But the graduate teaching fellows program, a favorite of Director Rita Colwell, would rise by $10 million to $19 million.

    NASA: The space agency's hopes for a new mission to study the sun were dimmed by the House, which rejected the $20 million NASA wanted for its Living With a Star program. Lawmakers worried that the annual costs for the effort, which would place a series of sophisticated spacecraft around the sun to examine solar storms and the solar wind, would balloon to $177 million by 2005. Overall, the president's request for an additional $400 million was whittled down to a paltry $58 million. The House also criticized the agency for shortchanging programs to analyze space data and ignoring outside advice to do so, and ordered up a study to examine the problem.

    Energy: Last week the House approved a $17.3 billion budget that is 4%, or $686 million, over this year's spending level but some $853 million below its request. A $262 million bid for the Spallation Neutron Source, a $1.4 billion materials science facility under construction at Oak Ridge National Laboratory in Tennessee, was slashed to $100 million, which agency officials say is too low to move ahead. Legislators rejected some $30 million in upgrades to heavily used accelerators, synchrotrons, and research reactors and all $36 million of DOE's share of the multiagency Nanoscale Science Initiative. But there is optimism that the Senate will restore the funds. “The wrinkles will iron out in DOE's favor,” predicts a House aide.

    Defense: Last month the Senate approved a 10.5% increase, to $1.3 billion, for the Defense Department's basic research account, more than doubling the Administration's requested 5% boost. The House is poised to follow suit with an 11.5% increase.

    NIST: The House last week approved essentially steady-state funding for the agency's core chemistry, physics, and materials science labs but nixed plans for funding several new initiatives, including $50 million for a cybersecurity effort and NIST's $10 million chunk of the nanoscale initiative. It again killed the controversial Advanced Technology Program (ATP), an effort to share with industry the cost of high-risk applied research. A perennial target of Republicans since 1995, ATP rose swiftly from the dead last year and wound up with nearly $200 million. Supporters are again hoping for a rescue from the Senate.

    Other agencies: The Environmental Protection Agency's science and technology account got a $5 million boost from the House, to $650 million, although it falls $24 million short of the president's request. Within the Agriculture Department, the House and Senate disagreed on two competitive grants programs: the Senate added $2 million to the $119 million National Research Initiative, while the House cut it to $97 million and barred funding for the $120 million Initiative for Future Agriculture Systems. Last week's House vote would set the National Oceanic and Atmospheric Administration's overall funding $100 million below this year's $2.3 billion level and nearly $530 million below the president's request. At the same time, it removed language tying funding for global climate change research to Senate passage of the Kyoto Global Climate Change treaty, currently a dim prospect.


    Astronomers Overcome 'Aperture Envy'

    1. Robert Irion

    Small telescopes can thrive in the shadow of giant new observatories—but only if astronomers adapt them to specialized projects

    Last year, Michael Castelaz published an extrapolation that would change the face of astronomy if it ever came to pass. With his tongue firmly in his cheek, the East Tennessee State University astronomer examined the trend toward building bigger and bolder ground-based telescopes in the United States. Funding agencies would pay for their operation by shutting down smaller instruments, he predicted, eventually leading to a $1.2 billion, 42-meter “Ultimate Telescope” at the expense of all other facilities. Just 44 astronomers per year would have access to this behemoth, Castelaz wrote,* noting that “at least journals would be a bit lighter.”

    The humor had an edge to it. Castelaz captured the feelings of many users of small telescopes, who are gritting their teeth at perceived slights from those who dispense astronomy's budgetary morsels. For instance, the National Optical Astronomy Observatories (NOAO) of Tucson, Arizona, is transferring its 1-meter-class telescopes to private consortia of universities, thus taking them out of circulation for the community as a whole. Most recently, the National Research Council's “decadal review” of priorities in astronomy (Science, 26 May, p. 1310) scarcely mentioned what some astronomers view as the critical role of small telescopes in modern research. Instead, the report focused on giant projects such as a proposed 30-meter segmented-mirror telescope, which would cost the United States and international partners at least $500 million to build.

    Will these events make Castelaz more prophet than parodist? Probably not, says astronomer John Huchra of the Harvard-Smithsonian Center for Astrophysics (CfA) in Cambridge, Massachusetts. “You won't get a half-billion dollars by closing small telescopes,” he says. “If you use the right tools for the job, small telescopes can be incredibly cost effective. You can still do major things.” Indeed, many argue that small telescopes are playing a more—not less—important role these days, complementing the work of the giant facilities and performing observations for which only they are suited.

    Astronomers have learned to call upon the unique abilities of telescopes with apertures of 2 meters or less to monitor broad swaths of the sky and stare at the same objects night after night, sometimes for years. Various teams are turning small telescopes into robots, creating networks of scopes that span the globe and devoting them to survey projects that big telescopes don't have a prayer of tackling. Furthermore, serious amateurs now use electronic detectors as good as those used by professional astronomers a decade ago, making amateur-professional collaborations a growth industry.

    “Small telescopes may not make much front-page news, but they are the backbone of astronomy,” says Janet Mattei, director of the American Association of Variable Star Observers (AAVSO) in Cambridge, Massachusetts. Engineer Lou Boyd, who runs a fleet of robotic telescopes at the private Fairborn Observatory in southern Arizona, makes the same point more colorfully: “If you have a hospital lab with a CAT scanner, you still need to have people doing the blood work.”

    A growing voice

    Mattei, Boyd, and their colleagues are passionate about small telescopes, but until recently it was a quiet passion. “The voice of this majority of the astronomical community has been minimal,” says Terry Oswalt, program director for stellar astronomy and astrophysics at the National Science Foundation (NSF) and an astronomer at the Florida Institute of Technology in Melbourne. “It's nowhere near what you would expect for the number of people involved.” Astronomers began to tout the advantages of small telescopes en masse in October 1996 at a meeting at Lowell Observatory in Flagstaff, Arizona, where Percival Lowell gazed upon Mars at the turn of the century with a 0.6-meter refractor. Since then, most of the biannual meetings of the American Astronomical Society (AAS) have featured at least one session devoted to small-telescope issues.

    Clear themes have emerged from these gatherings. At the AAS meeting last month in Rochester, New York, several astronomers described the merits of fully automated small telescopes. These systems are designed to open the dome on their own when conditions are right and take data. Once running, they zip among preprogrammed targets in the sky more quickly and make observations more consistently than human operators ever could. “Robotic telescopes are ideal for performing repetitive tasks that require many nights throughout the year,” says astronomer Alex Filippenko of the University of California (UC), Berkeley. Astronomers might get just a few nights per year on large telescopes, he notes—not nearly enough to monitor the sky for rare events or track the long-term behavior of objects.

    Filippenko's own robotic system, the 0.75-meter Katzman Automatic Imaging Telescope (KAIT), is setting new records for finding nearby supernovae, the explosions of massive stars. KAIT replaced a small manual telescope at UC's Lick Observatory near San Jose. The telescope records images of several thousand galaxies, one by one, then repeats its observations every three to five clear nights. An analysis program flags whether any new flares of light have appeared in the galaxies since the previous images were taken. Undergraduates at UC Berkeley check KAIT's identifications the next morning to make sure they aren't asteroids, electronic blips, or other artifacts. In this way KAIT found 40 supernovae in 1999, more than twice the number spotted by any similar system. These explosions provide the calibration necessary to show whether much more distant supernovae have any odd features that would undercut their role as cosmic yardsticks. So far, says Filippenko, the nearby and distant supernovae display some differences, but none serious enough to scuttle the conclusion that the universe will expand more quickly as time passes (Science, 18 December 1998, p. 2156).
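
The detection loop described above—image each galaxy, subtract an earlier exposure, and flag any new point of light for human vetting—can be sketched in a few lines. This is a toy illustration under invented assumptions, not KAIT's actual pipeline: the array sizes, the crude noise model, and the `flag_transients` helper are all made up for the example.

```python
import numpy as np

def flag_transients(reference, new_image, n_sigma=5.0):
    """Flag pixels that brightened sharply since the reference image.

    A toy stand-in for difference imaging: subtract the archival image
    of a galaxy from tonight's image and report residual bright spots as
    supernova candidates, which humans must still vet against asteroids
    and electronic artifacts.
    """
    diff = new_image.astype(float) - reference.astype(float)
    noise = np.std(diff)  # crude noise estimate, fine for illustration
    candidates = np.argwhere(diff > n_sigma * noise)
    return [(int(r), int(c)) for r, c in candidates]

# A 16x16 synthetic "galaxy" with one new point source tonight.
rng = np.random.default_rng(0)
reference = rng.normal(100.0, 2.0, size=(16, 16))
tonight = reference + rng.normal(0.0, 2.0, size=(16, 16))
tonight[7, 9] += 500.0  # the new flare of light

print(flag_transients(reference, tonight))  # → [(7, 9)]
```

A real pipeline would align the images, match their point-spread functions, and estimate noise per pixel, but the flag-then-vet structure is the same.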

    Another automated system gaining renown is the Robotic Optical Transient Search Experiment (ROTSE) at Los Alamos National Laboratory in New Mexico. ROTSE uses small telescopes indeed: four paparazzi-style telephoto lenses mounted on a platform that can swivel quickly to any part of the sky. Astronomers built ROTSE to pursue the optical flashes associated with gamma ray bursts, powerful explosions in the distant universe that satellites can detect. Its most spectacular success came on 23 January 1999, when it triggered on an alert from a detector in orbit on NASA's Compton Gamma-Ray Observatory. Within 22 seconds, ROTSE captured the light from an extraordinarily bright burst several billion light-years from Earth (Science, 26 March 1999, p. 2003). Such early detections will help astrophysicists devise better models of the fantastically powerful blasts. A similar apparatus, the Livermore Optical Transient Imaging System (LOTIS), watches for flashes from the Lawrence Livermore National Laboratory in California.

    The nation's hotbed for robotic telescopes is Fairborn Observatory, 6 kilometers from the Mexican border near Nogales, Arizona. Boyd, the observatory's sole staff member, oversees eight scopes ranging from 0.25 meters to 0.8 meters across; five more are on the way. Astronomers from South Carolina to Vienna, Austria, receive data from the machines over the Internet. Greg Henry of Tennessee State University in Nashville has seen his career transformed by his four telescopes currently operating at Fairborn. “I'm literally getting 50 to 100 years' worth of data now compared to what I would get in a single year working manually,” says Henry. “And the cost of operation is a tiny fraction of what it was.”

    Henry's forte is examining long-term changes in the brightnesses of stars like our sun, which are vital for understanding how the sun may influence Earth's climate. The precision of each robot's data from night to night enables Henry to calculate fluctuations as small as one 10-thousandth of a stellar magnitude, the most precise readings ever achieved. That specialty landed Henry on front pages last November when one of his telescopes spotted the dip in light caused by a Jupiter-sized planet passing in front of another star (Science, 19 November 1999, p. 1451). Perhaps the best part, he says, is his observing schedule: “I program them and go home and get a good night's sleep.”

    Fixed stare

    Henry's observations illustrate a big advantage that small telescopes have over their behemoth brethren: They can be used for long-term observations that watch for changes over time. Such observations would be unthinkable with instruments at the Keck Observatory on Mauna Kea in Hawaii, where viewing time is in fierce demand and is booked months in advance.

    Astronomer G. Wesley Lockwood of Lowell Observatory has made good use of one small telescope's long, fixed stare. Since 1971, he has used a 0.55-meter telescope to monitor changes in the brightness of Saturn's large moon Titan. Last year, astronomer Ralph Lorenz and colleagues from the University of Arizona in Tucson combined this 30-year stretch of data with their own Hubble Space Telescope observations of Titan in 1994 and 1997 to devise a new model of how a 14.5-year seasonal cycle affects the moon's hazy atmosphere. The results appeared in the December 1999 issue of Icarus. Lockwood also uses the telescope to monitor the changing light levels from Uranus and Neptune.

    More recently, a new form of long-term observation has come into vogue: Astronomers have orchestrated globe-girdling networks to conduct uninterrupted studies of specific stars around the clock. One such array is the Whole Earth Telescope (WET), directed by astronomer Steven Kawaler at Iowa State University in Ames. The network consists of a dozen or more observatories worldwide that collaborate twice a year for about 2 weeks at a time. The network includes 1-meter to 2-meter-class telescopes in countries such as China, Honduras, Lithuania, and Brazil, where astronomers otherwise might not get the chance to participate in international projects.

    WET's main quarries are pulsating white dwarfs, the dense remnants of stars like our sun that have consumed their nuclear fuel. Some of these dead stars pulsate with periods of minutes to hours; no single observatory could follow their repeated cycles. “The idea behind WET is the inverse of the British Empire: The sun never rises on it,” says one of the network's coordinators, astronomer Donald Winget of the University of Texas, Austin. “We can keep an object in continuous view.” An unbroken stream of observations enables the WET team to conduct “asteroseismology” on the distant dwarfs, using their pulsations to gauge their internal structures. This, in turn, helps the researchers calibrate the ages of the dwarfs, which are among the best chronometers in the galaxy because they cool predictably, like Earth-sized cinders.

    Although WET's members are professionals, a similar global network draws mostly upon talented amateurs. Astronomer Joseph Patterson of Columbia University in New York City established the Center for Backyard Astrophysics (CBA) to study cataclysmic variables, binary star systems in which a disk of gas spiraling onto a white dwarf periodically unleashes energetic bursts. Patterson organizes science campaigns for the 20 or so members of his collaboration, who typically use 0.25-meter to 0.7-meter telescopes. “The CBA is more effective than any telescope I've ever used,” he says. “We focus on individual stars for hundreds of hours, which makes us very good at finding all of their rotation periods.” That intensive monitoring lets the team scrutinize curious wobbles and instabilities within many disks, which may control when the energy spigots from white dwarfs turn on and off.

    Patterson lauds his amateur assemblage. “It's a tool that virtually nobody else has,” he says. “In every country in the world, there are well-equipped amateurs interested in participating in research. They can be tremendous assets.” AAVSO's Mattei concurs. In May, she notes, some of her 600 observers worldwide monitored the fluctuations of a cataclysmic variable to help a team using the Hubble decide whether to observe it. If the star had gotten too bright, Hubble's operators would have postponed the study to avoid harming its detectors. “If not for amateur astronomers observing in their backyards with small telescopes, a lot of very successful observing runs with satellites would not be possible,” Mattei says.

    All-sky watchers

    When it comes to bright objects and broad patches of the sky, small telescopes are the only games in town. That point was made clear in a paper posted in May on the Los Alamos preprint archive by astronomer Grzegorz Pojmanski of Warsaw University in Poland. Using a telescope with a mere 8-centimeter lens at Las Campanas Observatory in Chile, Pojmanski's project, the All-Sky Automated Survey, unveiled 3400 variable stars in a typical slice of the Southern Hemisphere sky. Incredibly, just 11% of those stars were previously cataloged as variable. Pojmanski's robot has thus far examined less than 1% of the sky, so nearly a half-million bright variable stars may await discovery. “It's sort of embarrassing to the astronomical community that the bright sky is so poorly mapped,” says astrophysicist Bohdan Paczyński of Princeton University.
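
The extrapolation above is simple arithmetic, sketched here. The article gives only "less than 1%" for the sky fraction surveyed, so the value below is a hypothetical figure consistent with that statement, chosen purely to show how the half-million estimate arises.

```python
# Back-of-envelope check of the extrapolation. The sky fraction is an
# assumed, hypothetical value; only "less than 1%" appears in the text.
stars_found = 3400              # variable stars in the surveyed slice
frac_previously_known = 0.11    # share already cataloged as variable
sky_fraction_surveyed = 0.007   # hypothetical, consistent with "<1%"

new_per_slice = stars_found * (1 - frac_previously_known)
projected_undiscovered = new_per_slice / sky_fraction_surveyed
print(round(projected_undiscovered))  # → 432286, order of half a million
```

Any surveyed fraction a bit under 1% lands the projection in the hundreds of thousands, which is the point of the remark about the poorly mapped bright sky.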

    Other established surveys also use small telescopes to tile the sky systematically. For example, the Two Micron All-Sky Survey (2MASS), led by astronomers at the University of Massachusetts, Amherst, and the California Institute of Technology in Pasadena, is using automated 1.3-meter telescopes in Arizona and Chile to compile the most complete census of the cosmos to date at near-infrared wavelengths. That's the domain of relatively cool objects, such as red dwarf stars, as well as galaxies and galactic structures that ordinarily lie behind the Milky Way's obscuring bands of dust.

    Global reach. Networks of small telescopes monitor variable stars around the clock.


    Several ongoing programs scour space for dangerous asteroids using 1-meter or smaller telescopes, such as the University of Arizona's Spacewatch and the Air Force-sponsored Lincoln Near-Earth Asteroid Research (LINEAR) program at the White Sands Missile Range in New Mexico. And when they are not zeroing in on gamma ray bursts, both ROTSE and LOTIS take images of the whole sky at least twice each night, revealing a slew of variable stars, asteroids, and other transient objects. “There's a unique new niche here for small telescopes: the time domain,” says astronomer Jeff Bloch of Los Alamos, a member of the ROTSE team. “With new equipment and greater bandwidth, it will become economically feasible to do all the sky all the time.”

    Caught in the squeeze

    Such innovative tools may keep small telescopes in the scientific game. Nevertheless, at the large national and international observatories, the momentum clearly belongs to the growing fleet of 8-meter to 10-meter glass giants. “NOAO needs to focus on large facilities that transcend the capability of single universities,” says director Sidney Wolff. “It's not that small telescopes produce bad science, but the kinds of forefront questions we're asking today require us to choose more powerful instruments and multi-institutional collaborations.” That choice, she says, was forced by a steady erosion of NOAO's purchasing power—now 40% lower than it was 15 years ago.

    NOAO's main outpost, the Kitt Peak National Observatory (KPNO) southwest of Tucson, transferred two small telescopes to universities in the 1990s. The Southeastern Association for Research in Astronomy (SARA) refurbished one of them, a 0.9-meter telescope, and moved it to a new spot at Kitt Peak. Researchers at six SARA institutions, led by the Florida Institute of Technology, perform observations remotely over the Internet. A consortium led by Western Kentucky University in Bowling Green is preparing to roboticize an outdated 1.3-meter telescope at KPNO for research and real-time online astronomy education. However, two more popular 0.9-meter telescopes at Kitt Peak are on the auction block. NOAO also is likely to phase out some of its smaller telescopes at its Southern Hemisphere station at Cerro Tololo, Chile, after a new 4-meter telescope comes online there, Wolff says.

    Concerned astronomers have succeeded in saving most of the small telescopes that appeared to face extinction less than a decade ago. But the struggle to keep them open may continue, and their supporters wonder why. “I think it's essential to maintain a few well-instrumented small telescopes at the major sites,” CfA's Huchra says. “This trend hurts students the most. We're beginning to produce students who have never been to a telescope.” Lockwood agrees: “Small telescopes aren't as glamorous as the ones on Millionaire's Row in the exhibit hall, but they still carry much of the load in astronomy,” he said at a January meeting of the AAS in Atlanta. “We all have them in our blood.”

    • *International Amateur-Professional Photoelectric Photometry Communications 75, 67 (March 1999).


    Combined Insults Spell Trouble for Rainforests

    1. Bernice Wuethrich*
    1. Bernice Wuethrich writes from Washington, D.C.

    Interactions among fire, El Niño-driven drought, and fragmentation are increasingly putting tropical forests at risk

    For tropical ecologist William Laurance, the most striking thing about walking through a parched rainforest is the sound. He likens it to walking on cornflakes. Leaves and branches crackle underfoot. Rather than a thin layer of soggy leaves, ankle-deep dry debris covers the ground and provides fuel for fire that will further desiccate the forest, priming it for repeated burning.

    The ferocity and frequency of fires in the rainforests of Brazil and Borneo reached new heights during the 1997–98 El Niño, and scientists are now saying that a combination of interacting factors—including forest fragmentation, logging, and El Niño-driven drought—has altered the forests' fire regimes and is changing regional climates and reconfiguring the landscape. These interactions are synergistic, they say—that is, the whole is greater than the sum of its parts. The concept provides a new paradigm for understanding the dynamics of fragmented rainforests and for approaching their conservation, says William Laurance of the Smithsonian Institution and Brazil's National Institute for Amazonian Research.*

    The Amazon and Borneo support two of the world's largest remaining rainforests. About 14% of the Amazon's 5 million square kilometers is already deforested, says Laurance, and up to 40% of the forest suffers from the damaging effects of fragmentation. In Borneo, the situation is even more dire. About 80% of forest cover has been allocated to commercial logging and industrial plantations, says Lisa Curran, a tropical ecologist at the University of Michigan, Ann Arbor. In both regions, synergistic interactions are establishing “critical thresholds for deforestation above which the system may collapse,” she says.

    Laurance contends that this new awareness of synergistic interactions calls for rethinking conservation goals in the Amazon. Currently, about 3% to 4% of the Amazon is fully protected, with plans to protect 10%. But Laurance considers this goal inadequate, in part because it is based on an outmoded scientific view of fragmented landscapes.

    Although scientists have studied forest fragments for 3 decades, Laurance contends that their theoretical models have been dangerously simplistic. Most use the prism of landscape ecology, emphasizing edge effects—ecological changes associated with the artificial boundaries of fragments—as well as the size, shape, and connectivity of fragments, and the quality of their surrounding habitats. “The notion of synergisms is one more step toward reality,” Laurance contends. Curran concurs: “Synergisms look at complexity and ecological processes,” she says. Unique ecology, land use, and climate may all work in concert to change forest dynamics—sometimes in unpredictable ways. The outcome depends on the net effects.

    In Borneo, the net effects appear devastating. At work are an intensified El Niño and extensive logging and clearing of the land for plantations, which together are undermining an exceptional ecosystem. The Borneo rainforest is unique: Most of the canopy trees are dipterocarps that arc high across the canopy, distinguished by their method of reproduction, which is limited to El Niño years. When El Niño hits, hundreds of species of dipterocarps across 150,000 km2 of forest reproduce in synchrony. Triggered by the dry warmth, they flower, fruit, and within a 6-week period, disperse their seeds (Science, 10 December 1999, p. 2184). The bounty on the forest floor initiates a feeding frenzy by orangutans, wild boar, parakeets, and other animals. It also provides a valued resource for local people who gather the seed for use or sale. The seeds are so abundant that some remain to sprout.

    But Curran has found that the ecosystem is breaking down. In her study area, deep within a protected national park on Indonesian Borneo, many dipterocarp species have failed to produce a single seedling since 1991, and Curran blames interactions between drought and deforestation: “El Niño, the creator, has turned into the destroyer.” In the last 2 decades, El Niño droughts have become longer and more severe, a trend some scientists think is linked to global climate change. Although the protected national forest where Curran conducts her research is unlogged, many surrounding areas are denuded of trees or destroyed by fire. As a result, when seeds fall, wildlife from these degraded areas streams into the more intact forest and devours all the seed, leaving none to regenerate. In part because seed production can no longer keep up with wildlife, populations are plummeting, says Curran, citing the decline of Bornean orangutans from some 15,000 to 7000 in the last decade.

    There's another negative synergy involving seed dispersal. The trees' synchronized reproduction requires vast spatial areas. With reduced forest cover, the trees produce less seed—and less viable seed. Furthermore, as El Niño droughts have intensified, fires in the rainforest have worsened. During the 1997–98 El Niño in Indonesia, about 10 million hectares, an area the size of Costa Rica, went up in flames.

    Smoke from those fires kills vulnerable seedlings and causes health problems for people and wildlife. And it creates another kind of feedback. Last year, in an article in Geophysical Research Letters, Daniel Rosenfeld of the Hebrew University of Jerusalem showed that heavy smoke from forest fires can nearly shut down rainfall in tropical areas. Using cloud observations gathered by the Tropical Rainfall Measuring Mission satellite over Borneo, he determined that the particulate matter of smoke forms so many tiny condensation nuclei that no single water droplet gets big enough to fall. As a result, the moisture is spirited away in the smoke. Thus drought, fire, and smoke are affecting the regional climate, causing yet more drought, fire, and smoke.
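
    The droplet-size effect Rosenfeld describes can be illustrated with a back-of-the-envelope sketch (this is an illustration of the general mechanism, not his actual satellite analysis, and every number here is an assumed, order-of-magnitude value): if a fixed amount of cloud water condenses onto N nuclei, each droplet's radius scales as (water/N)^(1/3), so smoke-laden air with far more nuclei yields droplets too small to coalesce into rain.

```python
import math

# Toy illustration (NOT Rosenfeld's analysis): the same cloud water
# shared among more condensation nuclei yields smaller droplets.
# All values below are assumed, order-of-magnitude numbers.

def droplet_radius_um(lwc_g_m3, nuclei_per_cm3):
    """Mean droplet radius (micrometers) if a liquid water content of
    lwc_g_m3 (grams per cubic meter of air) condenses evenly onto the
    given number of nuclei per cubic centimeter."""
    water_volume = lwc_g_m3 * 1e-6        # g/m^3 -> m^3 of water per m^3 of air
    n_per_m3 = nuclei_per_cm3 * 1e6       # nuclei per m^3
    vol_per_droplet = water_volume / n_per_m3
    r_m = (3.0 * vol_per_droplet / (4.0 * math.pi)) ** (1.0 / 3.0)
    return r_m * 1e6                      # meters -> micrometers

LWC = 0.5                                 # g/m^3, assumed tropical cumulus value
clean = droplet_radius_um(LWC, 100)       # relatively clean air
smoky = droplet_radius_um(LWC, 5000)      # heavy smoke (assumed nuclei count)
print(f"clean air: ~{clean:.0f} um, smoky air: ~{smoky:.0f} um")
# Droplets much below roughly 14 um rarely coalesce into raindrops,
# which is the rainfall-shutdown effect described above.
```

Splitting the same water over 50 times as many nuclei shrinks each droplet's radius by a factor of 50^(1/3), roughly 3.7, pushing the smoky-air droplets well below the size needed for rain.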

    Very similar forces are also drying the Amazon. There, to probe the synergism between forest fragmentation and fire, ecologist Mark Cochrane at Michigan State University in East Lansing analyzed satellite images of two fragmented areas in the eastern Amazon over 12 to 14 years. Biologists already knew that trees near the edges of fragments have high rates of mortality. Wind shear from surrounding cleared areas can snap off tree crowns, and the woody vines that rope trees together ensure that when one tree falls it drags others along. But Cochrane suspected that fire amplified these effects.

    His two study areas had been rapidly settled and partly deforested following the paving of major roads. In both areas, more than half the forest is now within 300 meters of an edge. And in both regions, Cochrane found, more than 90% of the fires were associated with these edges, demonstrating a clear synergism between fragmentation and fire. “Once fire gets into these forests, it's usually the end of them,” says Cochrane, who has submitted this work for publication. “Even light ground fires kill lots of trees, creating fuel for much bigger fires in the future.”

    Also working in the Amazon, Laurance has found a similar synergism between fragmentation and drought. On 23 1-ha plots containing a total of 16,000 trees, he compared tree mortality data from up to 17 years before the 1997–98 drought with 1 year of data during the drought. “During the drought, mortality increased throughout the forest, but jumped up most within 70 to 100 meters of the edge,” he says. In an upcoming paper he reports that average annual mortality increased from 2.44% to 2.93% near edges and from 1.13% to 1.91% in interiors.

    Although indigenous people have long used fire to clear space for farming, catastrophic wildfires are recent in the Amazon. Charcoal dating and archaeological evidence suggest that they have occurred only when mega-El Niños caused prolonged droughts, about once every 400 to 700 years. Most of the time, the forest was too wet to burn, even when struck by lightning. But all that has changed. Today, more than half the fires in the Amazon are accidental, caused when blazes set to prepare pasture or agricultural lands burn out of control. Some parts of the Amazon now burn several times a decade.

    Today, as in the past, there are great variations in climate across the Amazon. Nearly one-half of the rainforest is at its physiological minimum for water, receiving just enough rainfall to survive. Trees in these southern, eastern, and north-central regions send roots down through 8 or 10 meters of soil to soak up moisture during rainless months. These regions of the Amazon are the most susceptible to drought, for should the soil moisture run out, trees will perish. These are also the regions where El Niño droughts pack their biggest punch.

    Daniel Nepstad, an ecologist at the Woods Hole Research Center in Massachusetts, has modeled which areas of the Amazon forest are dry enough to catch fire. In normal years, he estimates, few of the closed-canopy forests are vulnerable. But by the end of the 1997–98 drought, 1.5 million km2 were primed to go up in flames—that is, they held less than 250 millimeters of available water in the top 10 meters of soil.
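
    Nepstad's criterion lends itself to a simple classification over a grid of forest cells. The sketch below is a hypothetical illustration of applying his 250-mm threshold, not his actual model; the cell names and water values are invented.

```python
# Hypothetical sketch of Nepstad's flammability criterion: a forest
# cell is "primed to burn" when available water in the top 10 m of
# soil drops below 250 mm. Cell names and values are invented.

DROUGHT_THRESHOLD_MM = 250  # available water, top 10 m of soil

def primed_to_burn(available_water_mm):
    return available_water_mm < DROUGHT_THRESHOLD_MM

# available water (mm) per invented grid cell:
normal_year = {"west": 900, "central": 600, "east": 400}
drought_year = {"west": 500, "central": 230, "east": 110}

for label, cells in [("normal", normal_year), ("drought", drought_year)]:
    flammable = [name for name, w in cells.items() if primed_to_burn(w)]
    print(f"{label}: primed cells = {flammable}")
# In the normal year no cell crosses the threshold; under drought,
# two of the three do -- mirroring the jump to 1.5 million km^2 of
# vulnerable forest reported above.
```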

    Such findings call for a change in thinking, says Paulo Moutinho, an ecologist with IPAM, a scientific nongovernmental organization based in Belém, Brazil. “Fire in the Amazon may be more important than deforestation, because it is changing the original climate,” he says. “In an El Niño-dominated world, we need to expand our vision,” agrees Nepstad. “We've been fixated on deforestation and logging as the best measures of human effects on forests. But drought washes over those threats.”

    As scientists identify ways in which simultaneous ecological changes and land use practices are undermining rainforests, the debate continues over how much forest must be protected to prevent ecosystem collapse—and how best to protect it. Laurance estimates that perhaps half of the Brazilian Amazon must be fully conserved to maintain ecosystem function. “We need to think very big in terms of reserves, surrounded by a multiple-use forest,” he says. Tom Lovejoy, a Smithsonian biologist and consultant with the World Bank, thinks that although 50% may not need to be fully protected, more than that needs to be maintained in some kind of tree cover, while allowing for multiple land uses, such as agriculture and forestry. “The magic number no one yet knows is how much forest you need to sustain the hydrologic cycle” through which the Amazon generates nearly one-half of its rainfall, he says.

    Ana Cristina Barros, executive director of IPAM, argues as well that a large part of the Amazon should be used by people—for everything from small-scale agriculture to the extraction of forest products to logging. But it is essential at the same time to maintain the forest's hydrological, nutrient, and carbon cycles. “The key question is what is the best distribution of farmland, logged and exploited forest, and pristine forest [needed to preserve biodiversity],” she says.

    The Brazilian government and international donors have earmarked several hundred thousand dollars for Amazon conservation over the next 10 years. Yet, as part of a $40 billion nationwide infrastructure development program, Avança Brazil, the government also plans to pave some 4600 km of road in Amazonia, says Barros. If the past is any guide, she says, this will lead to more deforestation. Indeed, about three-fourths of deforestation in the region occurs within 50 km of major paved highways. Nepstad, too, argues that much of this fresh asphalt is unnecessary. Rather than opening up remote forest to development, he advocates intensifying development in the Amazon's 550,000 km2 that have already been deforested yet are largely underutilized.

    Lovejoy proposes using the Kyoto climate accord's Clean Development Mechanism to help conserve rainforest. It is, he says, “the single most important thing that can be done to slow tropical deforestation.” This mechanism enables companies and countries to offset their own carbon emissions by paying another for an ecosystem service. In this case, Brazil might receive payments for agreeing to preserve trees that would otherwise be burned and release carbon dioxide, a potent greenhouse gas, into the air. At the next meeting of the Conference of the Parties on the Climate Convention in November, delegates will decide whether this mechanism can be applied to protecting standing forests.

    Nepstad estimates that the Amazon stores 70 billion tons of carbon. The World Bank has estimated that Indonesia's fires in 1997 alone contributed more carbon to the atmosphere than did all human-made sources in North America—about 30% of all anthropogenic global carbon emissions. Curran is in the process of calculating the amount of carbon stored in the dipterocarp forests of Borneo, which she suspects is substantial. In addition to the carbon contained in their wood, dipterocarps form a unique symbiosis between their roots and fungi and, as a result, potentially store tons of carbon in roots below ground. Says Curran: “The challenge is to show local and regional governments that these forests are worth something standing, that they're worth more alive than dead.”

    But the outcome of the November debate is far from certain. Nepstad points out that the Brazilian governmental delegation itself opposes including standing forests as part of the Clean Development equation. One thing is certain: While the debate smolders, so do fires in the rainforest.

    • *Synergisms in fragmented landscapes were discussed last month during the Society for Conservation Biology meeting in Missoula, Montana, and are the subject of a number of forthcoming publications.


    New Insights Into Type 2 Diabetes

    1. Joe Alper*
    1. Joe Alper, a science writer in Louisville, Colorado, is managing editor of the online biotechnology magazine

    After discarding the old dogma, researchers are converging on a new hypothesis to explain this prevalent metabolic disorder

    After 30 years of chasing leads down one blind alley after another, researchers studying type 2 diabetes are optimistic that they are closing in on the elusive causes of the world's most prevalent metabolic disorder—although no one is willing to bet the bank on it. Using both biochemical and genetic approaches, diabetes researchers have identified multiple intracellular signaling pathways that appear to lie at the heart of this condition, which affects some 250 million people worldwide and is the leading cause of blindness, kidney failure, and amputation among adults.

    And in the process, they have thrown out much of the dogma of the past 10 years. The hallmark of type 2 diabetes is insulin resistance, a defect in the body's ability to remove glucose from the bloodstream despite the presence of normal or even elevated levels of insulin. For years, researchers thought a simple explanation, such as a malfunction in the insulin receptor, might lie behind this puzzling defect. But a decade of research has failed to turn up a direct link between insulin receptor malfunction and the disease, except for the rare mutation that accounts for less than 5% of cases. Indeed, “nearly every major feature of this disease that we thought was true 10 years ago turned out to be wrong,” says Morris White, a molecular biologist at the Joslin Diabetes Center in Boston. “We used to think type 2 diabetes was an insulin receptor problem, and it's not. We used to think it was solely a problem of insulin resistance, and it's not. We used to think that muscle and fat were the primary tissues involved, and they are not.”

    Now researchers are converging on a more complex explanation. Work from several groups, published over the past few months, suggests that the disease is triggered when the delicate balance between insulin production and insulin responsiveness goes awry. First, cells in muscle, fat, and liver lose some of their ability to respond fully to insulin, a hormone released by the pancreas after a meal. At the heart of this insulin resistance lie at least two related pathways that normally respond to insulin by signaling cells in these tissues to remove glucose from circulation and convert it into chemical energy stores. In response to growing insulin resistance, pancreatic cells temporarily ratchet up their production of the hormone. But in some people, that's when the second malfunction occurs: The insulin-producing cells give out, and insulin production falls. Researchers are only now gaining insight into the molecular mechanisms involved in this failure, says White. “It's only then, when the body loses the fine-tuned balance between insulin action and insulin secretion, that type 2 diabetes results.”

    If this picture proves correct, the results could be significant for the growing number of people with this disorder. “If we're right about the pathways that we think are involved, then the treatment of type 2 diabetes should be completely different in 5 years,” says Domenico Accili, a geneticist and head of the diabetes research unit at Columbia University.

    But not everyone is convinced. The most intriguing support for this new view has come largely from studies in knockout mice. Unfortunately, research into the human genetics of type 2 diabetes doesn't necessarily mesh with those animal data. One skeptic is Graeme Bell, a University of Chicago geneticist who has spent more than a decade searching for human genes related to type 2 diabetes. Bell believes that there will prove to be just a few biochemical pathways involved, but he's not sure that the ones now under scrutiny are the true culprits: “We still have work to do to pin down the causes of this disorder.”

    Even supporters of the new view caution that it's premature to declare victory. “I do think we've identified many of the major biochemical pathways involved in this disorder, but we still don't know why these pathways are not working properly, and that's a critical piece of the puzzle we're missing,” concedes Steven Shoelson, a senior researcher at Joslin.

    The shift in thinking began in the mid-1990s, when, after the insulin receptor proved to be a dead end, investigators turned to probing the intracellular signaling apparatus that responds when insulin binds to and activates its receptor. Some of the first leads came from White and members of his lab, who were studying mice with inherited defects in glucose metabolism. In the process they discovered two related proteins that are activated inside a cell when insulin binds to the insulin receptor. Named IRS-1 and IRS-2, these two proteins serve as docking stations for numerous other intracellular proteins. When assembled, the resulting molecular complex turns on a multistep signaling pathway that activates the glucose transporter, which in turn ferries glucose into the cell. The IRS-bound complex also activates a second pathway, called the Ras complex, that triggers gene expression in the cell, although the role of the Ras complex pathway in type 2 diabetes has been little studied to date.

    White says that although there is some overlap in the function of these two proteins, studies with knockout mice done over the past 2 years in his laboratory, as well as those of Accili and C. Ronald Kahn, a cell biologist and president of the Joslin Diabetes Center, have shown that IRS-1 plays a more prominent role in stimulating glucose uptake by muscle and fat, while IRS-2 is the major player in liver; IRS-2 also boosts insulin production by the pancreas.

    This tissue specificity may prove to be a critical insight for deciphering the mysteries of type 2 diabetes. Using tissue-specific knockouts of either IRS-1 or IRS-2 or both, Kahn, White, and Accili turned up some surprising results. The most notable one, published this past January in the Journal of Clinical Investigation, was that the liver plays a key role in the development of type 2 diabetes. “Most of us believed that glucose uptake by muscle alone was critical for the development of type 2 diabetes,” explains Kahn. Now, it appears that insulin resistance must occur in both muscle and liver for type 2 diabetes to develop, prompting the researchers to conclude that the insulin regulating system must fail at multiple points for the disease to arise.

    Experiments with knockout mice also highlight the importance of insulin production by the pancreas in the genesis of type 2 diabetes. In mice that developed the disorder, White found that the pancreas responded to the initial insulin resistance by overproducing the hormone. But the insulin-producing cells, known as β cells, eventually died, leading to full-blown diabetes. The insulin shutdown may result from a defect in the signaling pathway linked to IRS-2. Data from White's lab, published in Nature Genetics last September, suggest that insulin somehow turns on its own production by the β cells, and that this feedback stimulation only occurs when IRS-2 is present. White says that there may be some factor, still unknown, that eventually interferes with this IRS-2-mediated boost in insulin production and leads to insulin shutdown instead. “That's a very interesting result, because it could provide a biochemical mechanism linking insulin resistance and insulin production,” explains Accili.

    These same signaling pathways may also affect fat metabolism, explaining the association of type 2 diabetes with obesity. Patients with type 2 diabetes often have elevated serum levels of fatty acids. Gerald Shulman, a clinical physiologist at Yale University, has been using in vivo nuclear magnetic resonance to study glucose metabolism in both human and mouse muscle. In studies just submitted for publication, his group at Yale has shown that elevated plasma levels of free fatty acids suppress glucose uptake by interfering with the IRS-1 signaling pathway. “It may be that any perturbation that results in accumulation of intracellular fatty acids or their metabolites in muscle and liver sets off a cascade that leads to reduced IRS-1 and IRS-2 activity, leading to insulin resistance in these tissues,” explains Shulman. Such a decrease in IRS-2 activity could prove to be one of the factors that causes insulin production to drop as well, a possibility that Shulman is now investigating.

    What still confounds researchers is that, despite much searching, geneticists have failed to uncover any mutations in IRS genes in diabetics, or even much variation in the genes across the human population as a whole. This suggests that the signaling defects leading to human disease must be subtle, says Chicago's Bell, who believes that finding such mutations will be difficult. “I don't think that we're going to find mutations that knock out some protein completely, but that these will be mutations that cause a subtle decrement in the function of a protein,” he posits. “But I also think that insulin's regulation of sugar metabolism is so finely tuned that one or two such mutations will be enough to upset the entire system when combined with the proper environmental insults,” such as eating the typical high-fat American diet.

    Although numerous questions remain, Shoelson, among others, thinks the new findings can already be used to design more effective and badly needed drugs for type 2 diabetes. Such efforts are already under way, with pharmaceutical companies searching for compounds that stimulate the insulin signaling pathways and overcome insulin resistance. For example, researchers at both Merck and American Home Products are focusing on an enzyme that modifies the insulin receptor and enables it to turn on the IRS pathways more efficiently (Science, 5 March 1999, p. 1544). In addition, says Accili, both genetic and biochemical studies should eventually provide the means to distinguish between different subtypes of type 2 diabetes based on which points in the pathway are deficient in a given patient, enabling researchers to “better target specific drugs to specific patients.”

    Pathways of Insulin Control

    The insulin receptor is a tyrosine kinase that adds phosphate groups to two insulin receptor substrates, proteins named IRS-1 and IRS-2, that are located just inside the cell membrane. Once phosphorylated, these two proteins serve as docking sites for other intracellular signaling proteins that link insulin stimulation to two major intracellular pathways. One pathway, the so-called Ras transcription activation complex, produces a selective increase in gene expression that researchers have yet to characterize in any detail. The other pathway turns on the enzyme phosphatidylinositol 3-kinase (PI 3-kinase), which in turn phosphorylates a host of proteins, one of which is the glucose transporter that can then transport glucose into the cell.