News this Week

Science  04 Dec 1998:
Vol. 282, Issue 5395, pp. 1790



    On World AIDS Day, a Shadow Looms Over Southern Africa

    Michael Balter

    Paris: Each year, just before World AIDS Day on 1 December, United Nations AIDS officials release the latest statistics on the epidemic. This year's figures brought more bad news: An estimated 5.8 million people worldwide were newly infected with HIV in 1998, bringing the total number of HIV-infected people to 33.4 million. Over the same period, some 2.5 million people died of AIDS. Nearly 70% of the new infections occurred in sub-Saharan Africa, which continues to be the hardest hit region of the globe. In several African countries, more than one-fifth of the adult population is already HIV-positive, while in others—most notably South Africa—the epidemic is growing so explosively that this figure will probably soon be reached.

    No letup

    An estimated 5.8 million people became infected in 1998.


    “The worst is yet to come,” predicted Agathe Lawson, the Côte d'Ivoire-based representative of UNAIDS—the U.N.'s special AIDS program—at a press conference last week in Paris, one of several venues where UNAIDS officials unveiled the depressing global figures. Yet, despite these extraordinary numbers, AIDS activists and physicians continue to question whether political leaders are treating the epidemic with the urgency it deserves. In South Africa, this simmering issue has boiled over into a major public controversy. South African health officials have decided not to provide the antiviral drug AZT to HIV-infected pregnant women—despite its proven effectiveness in preventing transmission of the virus to their offspring—because, they argue, it is too expensive.

    Although Africa is currently taking the brunt of the epidemic, the new statistics show that no corner of the world will be spared the ravages of AIDS. Of particular concern is the growing HIV infection toll in India, a nation of nearly 1 billion people, where random sampling in rural areas has shown adult HIV infection prevalences reaching 2%, while among women who visited clinics for treatment of sexually transmitted diseases the figure is as high as 13.6%. Even in Western Europe and North America, where death rates from AIDS have plummeted thanks to cocktails of antiviral therapies, the proportion of the population infected with HIV is continuing to rise, with 74,000 new infections on the two continents during 1998.

    Nowhere, however, is the situation worse than in sub-Saharan Africa, where more than a dozen countries now harbor adult HIV infection prevalences of 10% or higher. On 30 November, at a press conference in Johannesburg, South Africa, UNAIDS executive director Peter Piot delivered the latest bad news personally: In four countries—Botswana, Namibia, Swaziland, and Zimbabwe—more than 20% of adults are now infected. Moreover, South Africa, where the epidemic did not take off until the early 1990s, registered about 700,000 new infections during 1998 among adults, defined as people between 15 and 49 years old. UNAIDS's senior epidemiologist, Bernhard Schwartländer, told Science that the adult HIV prevalence may now be as high as 15%, up from the 12.8% estimated for the end of 1997 (Science, 19 June, p. 1864).

    In the face of these dramatic increases, AIDS officials, physicians, and activists say they are perplexed by the South African government's decision not to treat HIV-infected pregnant women with AZT. Earlier this year, a clinical trial in Thailand conducted by the U.S. Centers for Disease Control and Prevention concluded that administering AZT over the last several weeks of pregnancy and during labor decreases mother-to-child transmission of HIV by more than 50%. And according to figures from South Africa's national health department, HIV prevalence among pregnant women is as high as 20% to 25% in some parts of the country. But a pilot program of the “short course” AZT regimen was axed in October by South African health minister Nkosazana Zuma, after a meeting with the health ministers of the nation's nine semiautonomous provinces. Zuma (who is a member of UNAIDS's program coordinating board) was unavailable for comment this week, but her special adviser, physician Ian Roberts, told Science that funds to support the program “were not available at the provincial level. … We consider the price of the drug unaffordable.”

    Yet many South African physicians who treat HIV-infected mothers and children argue that the therapy makes good economic sense. Glenda Gray, a pediatrician at the Chris Hani Baragwanath Hospital in Soweto—where nearly 1000 HIV-infected babies have been born this year—argues that the AZT short course is “ridiculously cheap,” especially now that the drug's maker, Glaxo Wellcome, has agreed to lower the price by more than 70% in developing countries. Indeed, some even poorer African countries, such as Botswana, have already decided to make the drug available to all HIV-positive pregnant women. Gray adds that a cost-benefit analysis that she and other South African colleagues recently carried out indicated that her country's government would actually save money on health care costs to HIV-infected children by providing short-course AZT, which costs under $70 at the reduced price, to pregnant mothers.

    Roberts says that he has not seen Gray's cost-benefit study. (Gray says, however, that she personally told Zuma about the study back in October and that preliminary details were presented at last June's international AIDS meeting in Geneva.) Roberts also cites the government's recent approval of a $14 million AIDS prevention campaign as evidence that “we are taking AIDS very seriously in this country.” As for future plans to purchase AZT for pregnant mothers, “if Glaxo makes the drug affordable or gives it free, we will definitely use it.” But some observers believe that South Africa's decision not to make AZT widely available to pregnant women reflects long-standing suspicions, dating back to the apartheid era, that pharmaceutical companies such as Glaxo Wellcome want to get a toehold in the South African market so they can later raise their prices. “There is a big issue in South Africa about affordable drugs for its population,” says one source who asked not to be identified.

    Piot told Science he agrees that the situation in South Africa is “complex” but adds that this is “no excuse” not to do something about the alarming rate of mother-to-child HIV transmission. “They clearly haven't done enough, that's for sure.”


    Kick-Starting the AIDS Vaccine Effort

    Michael Balter

    As the latest worldwide figures dramatically indicate, the AIDS epidemic shows no signs of slowing down (see main text). As a result, most AIDS experts have concluded that only a vaccine can turn the tide. But progress on this front has been painfully slow. Last week, the New York-based International AIDS Vaccine Initiative (IAVI) announced that it will invest $9.1 million in two new vaccine preparations, in hopes of breaking this deadlock. The vaccines could be ready for preliminary human testing by next year.

    The move comes 6 months after the release last June of IAVI's “Blueprint for AIDS Vaccine Development,” which argued that only a major acceleration of candidate vaccine testing could speed things up. “We are trying to widen the pipeline” of vaccine development, says Seth Berkley, president of IAVI, a private organization funded by an assortment of foundations as well as the World Bank and the British government.

    IAVI chose to support two vaccine strategies that have shown promise in experiments with rodents and monkeys. The first, developed by immunologist Andrew McMichael's team at Oxford University, combines DNA that codes for proteins in HIV's core with a modified vaccinia virus engineered to also express these proteins. The second approach, developed by AlphaVax, a biotech company based in Durham, North Carolina, uses a modified version of the Venezuelan equine encephalitis virus as a vector to carry HIV genes into host cells.

    Norman Letvin, an immunologist at Harvard Medical School in Boston and a member of IAVI's scientific advisory board, says that exploring new ways to expose the immune system to HIV proteins is a high priority in vaccine research: “We need to get as many new live vector technologies out there as possible.” And Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases (NIAID) in Bethesda, Maryland—which coordinates U.S. government-supported vaccine research—says that “both of the IAVI vaccine initiatives hold promise.” Should one or both of the approaches begin to show results in human trials, Fauci says, NIAID might eventually move in with additional support.

    To ensure that any successful vaccine candidates will be affordable in the developing world, IAVI has negotiated intellectual property agreements with potential vaccine manufacturers that require them either to produce the vaccines at a “reasonable price” or give IAVI the right to recruit other companies that will. AIDS experts will be watching closely to see if IAVI's approach can indeed break through the logjam in AIDS vaccine development. Says Letvin: “The way to get these programs going is to just do it.”


    Songbirds Stressed in Winter Grounds

    Bernice Wuethrich*

    *Bernice Wuethrich is an exhibit writer at the National Museum of Natural History in Washington, D.C.

    Human inhabitants of North America may dream of relaxing winter escapes to the Caribbean, but for the American redstart, a diminutive migratory songbird, winters down south are a time of stress. The birds compete there for choice, insect-rich habitat—a contest in which young and female redstarts often lose out to older males. And as a study on page 1884 shows, a lean winter down south can have a lasting legacy, limiting the birds' reproductive success during the next breeding season up north.

    The work, by avian ecologists Peter Marra and Richard Holmes of Dartmouth College in Hanover, New Hampshire, and Keith Hobson of the Canadian Wildlife Service in Saskatchewan, is the first to show that the quality of a migratory songbird's tropical wintering grounds can affect its survival and breeding success when it arrives in the north—an achievement avian ecologists call the Holy Grail of songbird biology. “Before now, no one has ever even come close to linking up quality of habitat in winter with reproductive success in the breeding grounds,” says avian ecologist Scott Robinson of the Illinois Natural History Survey.

    Based on an isotopic marker in the bird's blood that is keyed to winter habitat type, the work has also provided a crucial piece of information for conservation. Many Neotropical songbirds are in decline, yet biologists have concentrated their efforts on northern breeding grounds rather than southern wintering grounds. For the redstart, at least, it seems that winter habitats can be limiting. “Conservation efforts will need to focus on habitats used by these species throughout their annual cycle,” says Marra.

    Although researchers have long sought to link northern and southern populations of migratory birds, that goal has been as elusive as a tiny songbird in a forest canopy. Songbirds are too small to carry radio transmitters, and although thousands have been color-banded, only rarely has one been identified in both the north and south.

    The isotopic method offers a clever solution. Marra and Holmes measured levels of the naturally occurring stable isotope carbon-13. They capitalized on the fact that plants in certain habitats, such as wet mangrove or wet lowland forests, have less C-13 than do plants typical of dry scrub, due to differences in water use efficiency and photosynthetic pathways. This isotopic signature is passed up the food chain to plant-eating insects and insect-eating birds and shows up in their blood.

    Marra captured, weighed, and bled redstarts at the beginning and end of the winter season in Jamaica and Honduras. As expected, birds that had wintered in the mangroves and wet lowland forests had low levels of C-13, while birds that had wintered in dry scrub had high levels. Furthermore, the wet forest birds—which were 65% male—had maintained or gained weight, while the scrub-dwellers—which were 70% female—had lost up to 11% of their body mass and had elevated levels of the stress hormone corticosterone. Marra also witnessed the tactics older males use to retain control of choice habitat. In as yet unpublished work, he put decoys into the mangroves and watched the males dive-bomb them; that aggression apparently forces females and younger males into the scrub.

    The team predicted that because of their better physical condition, birds wintering in wetter habitats would be able to get an early start on the long spring flight to northern breeding grounds, where they would then have the pick of real estate for courting and nesting. Indeed, previous studies have shown that early comers of both sexes have more young. Birds that wintered in the scrub were predicted to leave late, setting them up to be losers in the mating game, says Marra.

    To test these predictions, he captured and bled arriving redstarts in the New Hampshire forest. Sure enough, he found that early arrivals—singing and ready to mate—carried the wet forest signature; they were mostly older males. Later arrivals carried the dry scrub signature and were typically scraggly younger males or females. Among the late arrivals, males were probably worse off than females, who could pair up as the second mate of an established male. It all points to the importance of winter habitat for redstart populations, suggesting that “when choice habitat is limited, individual fitness and ultimately populations can suffer,” Holmes says.

    That's important news for conservationists. Dramatic population fluctuations in migratory songbirds over the past 30 years have raised alarms. Although the American redstart is generally holding its own, populations in the Adirondacks and New Hampshire have been declining by about 3% per year. The take-home message, says behavioral ecologist Sidney Gauthreaux of Clemson University in South Carolina, is, “You may do everything on the breeding grounds to protect these Neotropical migrants, but we can still lose the species by having the winter habitat destroyed.” For example, mangroves, which shelter large numbers of migratory birds, are in decline worldwide; the Caribbean lost about 10% of its mangroves in the 1980s and continues to lose about 1% a year, says biologist Elizabeth Farnsworth of Harvard University.

    As prime habitat grows scarce, more and more birds will be forced into the scrub. “When good winter habitat is limited, even those birds that survive will be worse off and have less breeding success,” says avian ecologist Trevor Price of the University of California, San Diego. Females may pay the highest price, amplifying the redstarts' skewed sex ratio (like many migratory songbirds, redstarts have a higher proportion of males than females) and spelling trouble for the species. For redstarts and other migratory songbirds, the decline of wet winter forests could turn already difficult winters into one-way tickets south.

  X-RAYS

    Tabletop Laser Packs a Punch

    Alexander Hellemans*

    *Alexander Hellemans is a writer in Naples, Italy.

    When materials scientists and x-ray crystallographers talk about “light sources,” they are normally referring to facilities as big and pricey as electric power plants: synchrotron radiation sources, particle accelerators built to produce intense x-ray beams for probing the structure of matter. But now a team of scientists at Colorado State University in Fort Collins has made a light source you could take home: a tabletop x-ray laser that can deliver rapid-fire pulses of x-rays comparable to those of some synchrotron sources.

    “We have right now the same coherent power … as a third-generation synchrotron beamline,” says team leader Jorge Rocca. Described in an upcoming paper in Physical Review Letters, the tabletop device, powered by electric discharges, could relieve some of the seemingly insatiable appetite for new x-ray sources to study the structure of materials and biomolecules. Although the source emits only a single frequency of very low-energy, or “soft,” x-rays verging on the ultraviolet, wavelengths throughout the spectrum are of interest to researchers. “There are biologists who use wavelengths from visible ultraviolet to x-rays on the same machine to determine structures,” says Marie-Emmanuelle Couprie of LURE, which houses France's synchrotron in Orsay.

    In 1994 Rocca and his colleagues reported that they had demonstrated the world's first tabletop x-ray laser (Science, 4 November 1994, p. 732). While existing x-ray lasers use powerful pulses from a separate, usually huge, optical laser to ionize a gas into a plasma and then excite the ions so that they produce x-rays, Rocca's team used a different approach. They filled spaghetti-thin capillary tubes 18 centimeters long with argon gas and then used electric discharges both to create the plasma and excite the ions. Their laser was not very powerful, however, and it only produced one nanosecond-long pulse per minute. By 1996, the team had upped the power output per shot, but because of problems cooling the capillaries it could not improve on the rate of one shot a minute, well short of the millions of pulses achievable with a synchrotron source.

    Since then, the team has made several improvements. Instead of capillaries made of polyacetal, a very tough plastic, they use water-cooled ceramic ones made from alumina, which are stronger and conduct heat better, allowing the team to apply more rapid-fire electric discharges. “We have also made electrical changes to supply the power at the right rate,” says Rocca. In their upcoming paper, the team announces that it has achieved its goal. The device generates x-ray pulses with a wavelength of 46.9 nanometers at a repetition rate of 7 per second, producing an average output power of about 1 milliwatt—two to three orders of magnitude larger than that produced by some older synchrotron sources.
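    The quoted figures invite a quick back-of-envelope check. A minimal sketch, using only the numbers given above (46.9-nanometer wavelength, 7 pulses per second, roughly 1 milliwatt average power) together with standard physical constants; the derived values are illustrative, not figures from the Physical Review Letters paper:

```python
# Back-of-envelope arithmetic on the tabletop x-ray laser's quoted figures.
# Wavelength, repetition rate, and average power come from the article;
# everything else is standard physics.

H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
EV = 1.602e-19       # joules per electron volt

wavelength_m = 46.9e-9   # 46.9 nm soft-x-ray line
rep_rate_hz = 7          # pulses per second
avg_power_w = 1e-3       # ~1 milliwatt average output

# Average power divided by repetition rate gives the energy per pulse.
energy_per_pulse_j = avg_power_w / rep_rate_hz       # ~0.14 mJ

# Photon energy E = h*c / wavelength, converted to electron volts.
photon_energy_j = H * C / wavelength_m
photon_energy_ev = photon_energy_j / EV              # ~26 eV

photons_per_pulse = energy_per_pulse_j / photon_energy_j

print(f"energy per pulse:  {energy_per_pulse_j * 1e3:.2f} mJ")
print(f"photon energy:     {photon_energy_ev:.1f} eV")
print(f"photons per pulse: {photons_per_pulse:.1e}")
```

    At roughly 26 electron volts per photon, the output sits near the boundary between the extreme ultraviolet and soft x-rays, consistent with the article's description of the beam as verging on the ultraviolet.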

    Such a laser is no match for the top rank of high-energy, or “hard,” x-ray sources such as the European Synchrotron Radiation Facility in Grenoble, France, which structural biologists rely on to study the three-dimensional structure of proteins. “The study of proteins requires much harder x-rays” than the laser can produce, says Michel Bessiere of LURE. But the laser could conceivably fill the needs of some users of “soft” x-ray beams at sources like Berkeley's Advanced Light Source or Italy's Elettra for applications such as x-ray holography and spectroscopy. This could prove to be a boon for researchers queuing to run their experiments at today's facilities, which are seriously overcrowded. “You could fill every synchrotron-hour in Europe four times,” says Bob Cernik of Britain's SRS synchrotron at Daresbury Laboratory.

    The laser's potential for low cost and size could allow every university to have one of its own, but some synchrotron experts doubt that it is a serious contender yet. Couprie points out that it does not match the pulse rate or the reliability of synchrotrons, and its limited operating time—currently 30 minutes at five pulses per second—could also form an obstacle. Still, the intensity of the laser's pulses may make it useful for studying how the optical properties of plasmas change at very high radiation intensities, says Couprie.

    Rocca is well aware of the laser's shortcomings, and he and his team are already testing the laser with different gases in the plasma and hotter temperatures. Although his tiny laser is unlikely to topple the mighty synchrotrons, Rocca is sure it will find a niche. Agrees Cernik: “You need lasers and synchrotrons as well.”


    Steadying Influence for Neurons Identified

    Marcia Barinaga

    Like people, neurons sometimes need to be steadied a bit so that they don't overreact to stimuli. That role is one of several that fall to potassium channels, tiny protein pores that allow potassium ions to flow out of neurons. So far, researchers have identified the proteins that make up most of the 20 or so known types of potassium channels. But one channel with a major influence on neuronal excitability, the M-channel, has remained mysterious—until now.

    On page 1890, David McKinnon and Jane Dixon of the State University of New York, Stony Brook, and their colleagues report that they have identified the two proteins that together make up the M-channel. Their success is being heralded partly because it will help researchers understand how neural excitation is controlled. “This channel represents the most important regulator of excitability in many neurons,” says University of California, Berkeley, neuroscientist Ehud Isacoff.

    The M-channel may also be a key target for drug development. Even before the Stony Brook work, others had discovered that defects in the genes encoding the proteins cause a form of epilepsy. And M-channels are found in many brain areas including the hippocampus, where neural responsiveness can affect learning and memory. Knowing the identity of the channel's components will help researchers learn what turns it on and off and could lead to new drugs for epilepsy or Alzheimer's disease.

    McKinnon and Dixon study sympathetic neurons, which control things like heart rate and blood pressure. Like all neurons, sympathetic neurons fire in response to signals arriving from other neurons, which open channels that let positively charged ions flow into the cell. But some sympathetic neurons are more excitable than others, firing many more action potentials in response to a given stimulus.


    Sympathetic neurons missing the M-channel (lower panel) fire more than those that have the channel (upper panel).


    In earlier studies, McKinnon and Dixon's team specifically tested neurons for the M-current, the flow of potassium ions across the membrane under conditions in which M-channels would be the only potassium channels open. They found that the less active neurons have M-channels while the more active neurons lack them. That made sense, because M-channels let positively charged potassium ions flow out of the neuron during the period leading up to an action potential. That reduces the neuron's excitability by countering the inward flow of ions triggered by neural signals.

    The Stony Brook team used their knowledge of which neurons lack M-channels to help them search for the channel's protein components. In both types of neurons they screened through the RNA messages that indicate which proteins the neuron is making, to see whether any of the known potassium channel proteins were made only in the M-channel-containing neurons. KCNQ2, a potassium channel subunit that had not been linked to any known channel, fit the bill. In further experiments the researchers injected RNA encoding the different KCNQ subunits into frog egg cells and showed that KCNQ2 combines with another subunit, KCNQ3, to make a channel that behaves exactly like the M-channel.

    This was not the first time the two proteins had attracted attention. Earlier this year, while the Stony Brook team was doing those experiments, a team at the University of Hamburg in Germany and another at the University of Utah reported that mutations in the genes that encode KCNQ2 or KCNQ3 cause a hereditary form of epilepsy. Finding out that the two proteins encode the M-channel “really makes sense,” says Thomas Jentsch, a member of the Hamburg team, because “the M-channel has been shown to control neuronal excitability,” and epileptic seizures occur when neurons become uncontrollably excited.

    The prospect of controlling seizures via the M-channel already has drug company scientists intrigued. They “were all over the poster,” McKinnon says, when he presented his team's work at the annual meeting of the Society for Neuroscience in Los Angeles last month. DuPont neuroscientist Barry Brown, a co-author on the paper, says drug companies can now use the subunits to screen drugs. “If you could find a drug that actually opened or enhanced the activity of M-currents, it may be a good antiepileptic drug,” he says.

    In addition, several compounds developed by DuPont as memory enhancers for Alzheimer's patients had already turned out to block the M-current. “That implies that the M-current is also involved in cognition,” says Neil Marrion, a neuroscientist at the University of Bristol School of Medical Science in the U.K. “If you look at [animal models of] Alzheimer's, cell firing is actually dampened in the hippocampus,” he notes. The cognition-enhancing drugs may work at least in part, he suggests, by “jazzing up” the excitability of neurons in this important memory area.

    The DuPont drugs, along with the subunits and their genes, also provide a new set of tools for neuroscientists who study neural excitability. For example, the neurotransmitter acetylcholine enhances neurons' response to its excitatory signals by activating receptors that turn off the M-channel. But after years of research, no one has identified the intracellular messenger, triggered by acetylcholine, that turns the channel off. Having the subunits in hand “will help people to investigate what the messenger might be,” says Marrion, who has studied the M-channel for a decade.

    For instance, they can look for certain hallmark amino acid sequences in the channel proteins that provide clues to the kinds of regulatory molecules that act on the channel, mutate those amino acids to see the effects of losing that regulation, and even study the effects of altered forms of the M-channel in transgenic animals. “This work opens up whole new avenues,” Marrion says.


    Heat Shock Protein Mutes Genetic Changes

    Elizabeth Pennisi

    When Charles Darwin formulated his ideas about evolution, he did not really understand the source of its raw material: the inherited variation that he saw in plants and animals. And even modern evolutionary biologists struggle to explain how closely related organisms could come to look and act quite differently, sometimes in a relatively short period of time. New work now points to one possible explanation: Genomes apparently have a way of saving up mutations for a rainy day.

    In the 26 November issue of Nature, cell biologists Suzanne Rutherford and Susan Lindquist of the University of Chicago reported findings suggesting that the fruit fly genome contains a hidden reservoir of small mutations. Normally, the researchers find, these mutations are masked by HSP90, one of the so-called heat shock proteins that bind to other proteins to protect them against stresses such as high temperatures and also help newly made proteins fold correctly. But when HSP90 is out of commission, as might happen for example when an organism is under stress and the heat shock protein is tied up in its protective role, it can no longer stabilize mutant proteins and keep them working properly. Instead the mutations are revealed. Usually, they alter physical traits in harmful ways but may in some cases produce changes that help the organism adapt to the stress.

    Researchers already knew that some organisms have ways to increase mutation rates in response to stress, generating more genetic diversity for natural selection to act on (Science, 21 August, p. 1131). But this is the first clear example of any stockpiling of genetic changes. By permitting the organism to harbor a reservoir of mutations without harm under ordinary circumstances, HSP90 “gives [it] the capacity to evolve rapidly” when circumstances change, says Marc Kirschner, a cell biologist at Harvard University. “The work is really very cool,” says Patricia Foster, a bacterial geneticist at Boston University School of Public Health. “It's a wonderful concept.”

    Rutherford and Lindquist first wondered whether HSP90 might protect individuals against genetic mutations when they noticed that a few percent of fruit flies with mutations that disable the protein displayed one of a variety of developmental abnormalities: misshapen wings or legs; abnormal eyes, face, or bristles; or other odd physical flaws. The researchers then began breeding experiments to determine the cause of these abnormalities and HSP90's contribution to them.

    First they mated flies with similar mutations with one another. Not all the offspring were abnormal, however, and “that pattern indicated that there were multiple genes [involved]” even for a single abnormal trait, such as deformed eyes, says Lindquist. Normal flies resulted when a defective gene in one parent compensated for a different defective gene in the other. In addition, after several generations of mating only abnormal flies, further mating of those defective progeny with flies that make normal HSP90 did not make the abnormalities disappear. This suggests, Lindquist says, that the mutant HSP90 gene did not cause the changes directly. It also indicated that these defects had become so concentrated in the genome that HSP90 couldn't prevent abnormalities from showing up.

    It seemed to her, however, that when the flies didn't have too many genetic changes, the normal heat shock protein could mask the mutations—a function that is lost when HSP90 is disabled. Subsequent experiments proved that to be the case. When the researchers fed young normal fruit flies a substance that stifles heat shock protein activity, about 8% more of the resulting adult flies were deformed. But perhaps most intriguing, Rutherford and Lindquist found that even fruit flies with a normal HSP90 gene can develop abnormalities when they are raised in either unusually high or low temperatures, 30 or 18 degrees Celsius, well above or below the 25 degrees Celsius they favor.

    Based on these findings, Rutherford and Lindquist conclude that under normal conditions, HSP90 compensates for the small genetic glitches that would otherwise alter the stability and function of the fly's proteins. How the protein does so is still unclear. “It's probably fixing things in a variety of different ways,” Lindquist explains. For example, HSP90 might help a protein involved in fly development fold properly even when its amino acid sequence is not quite right because of a mutation. As a result, mutations can accumulate without any apparent effects.

    But if HSP90 itself is abnormal, or if unusual temperatures or other stresses deplete the supply of HSP90, then the consequences—either good or bad—of those mutations emerge. “If it happens to be good for the flies, then they [will survive] and can continue to express that trait,” Lindquist points out.

    This picture expands the role of heat shock proteins and other so-called chaperones that help fold proteins, notes Richard Morimoto, a molecular biologist at Northwestern University in Evanston, Illinois. More than just helping other proteins, these molecules may shape an organism's evolutionary potential. Depending on the context—such as the ambient temperature—HSP90 and possibly other chaperones can radically change the way an organism looks or acts. “It's a way you can dramatically change entire classes of proteins,” he suggests.

    Researchers have yet to learn whether other heat shock proteins work similarly and whether HSP90 masks genetic change in organisms other than the fruit fly. Morimoto expects the answer to be yes on both counts. HSP90's activity in Drosophila, he predicts, “is not going to be unique.”


    Visa Bill Creates NSF Scholarships

    1. Jeffrey Mervis

    A new law that allows U.S. high-tech companies to hire more foreign workers contains a windfall—and a headache—for the National Science Foundation (NSF). The windfall is a $27 million pot of money for college scholarships and school reform efforts, funded through a $500 fee that employers will pay the government for each visa application to bring in a foreign worker. The headache is figuring out how to set up and operate such a program, which would be a first for NSF.

    NSF's new responsibility is spelled out in the American Competitiveness and Workforce Improvement Act, which was wedged into the massive omnibus spending package that Congress approved shortly before it adjourned in October (Science, 23 October, p. 598). The scholarships, named after the bill's chief sponsor, Senator Spencer Abraham (R-MI), are meant to increase the pool of technologically adept U.S. workers available to fill vacancies at domestic information technology companies. Many of those jobs now go to foreign workers. “We wanted to look beyond the immediate crunch and get at the long-term problem of training more Americans,” says an Abraham staffer who follows the issue. “And NSF has a good reputation for running quality programs.”

    The legislation was a last-minute addition to the bill, which raises the cap on so-called H-1B visas from 65,000 to 115,000 this year and next before dropping back to 65,000 in 2002. The scholarship provision calls for NSF to run a competition that would award up to 10,000 $2500-a-year scholarships to low-income students pursuing associate, undergraduate, or graduate degrees in mathematics, engineering, and computer sciences. The exact income level has yet to be determined. In addition, NSF would receive roughly $6 million to be divided between systemic reform efforts in elementary and secondary schools (see p. 1800) and year-round enrichment courses in science, mathematics, and engineering. The money would be available annually through 2001.

    Although they welcome the money, NSF officials are concerned about the administrative burden of a new program. They would prefer to make the scholarships part of NSF's existing stable of programs aimed at strengthening the U.S. scientific labor force, including a rapidly growing advanced technology education initiative at community colleges. “A national scholarship program is a huge undertaking,” says Joseph Bordogna, acting deputy NSF director. “We haven't decided anything, but we're hoping to do something that is consistent with what we are already doing.”

    The closest thing to a scholarship program at NSF now is the agency's graduate research fellowships. But those awards are based on merit, not need, and they go to the institutions that students attend rather than to the students directly. “We're not geared up to run an individual scholarship program, including cutting thousands of checks,” says one senior education official. NSF officials are also disappointed that the bill appears to exclude most of the natural sciences, noting that science is increasingly interdisciplinary and that graduates often take jobs outside their academic specialty. Abraham's aide says those restrictions “were a deliberate attempt” to train people for the information technology industry.

    Educators at institutions that serve large numbers of low-income students say that the scholarships should help those already planning careers in information technology. But they warn that efforts to increase the flow of students into the field must start much earlier. “They may be interested, but many of our students don't have the skills to pursue the degrees that offer the high-paying jobs,” says William Edmonson, president of Panola College, a community college in east Texas with an NSF grant to reform undergraduate science and math instruction. “The foreign students who come here are at a distinct advantage because they've already taken the necessary courses in high school.”


    Drug May Suppress the Craving for Nicotine

    1. Ingrid Wickelgren

    They are tired of the scornful glances of strangers, of shivering in cold entranceways, of the fear they will die from their habit. For these reasons and more, each year 35 million smokers in the United States alone try to quit. But more than 90% start again within a year. Now, new evidence suggests that a drug used to treat epilepsy in Europe may one day help smokers kick the habit.

    In the January 1999 issue of Synapse, neuroanatomist Stephen Dewey at Brookhaven National Laboratory in Upton, New York, and his colleagues report that in baboons and rats, an epilepsy drug called gamma vinyl-GABA (GVG) suppresses a neurochemical hallmark of nicotine and other addictive drugs: a rise of the neurotransmitter dopamine in the brain's “reward centers.” It also stops behaviors in rats thought to mirror human cravings for nicotine.

    Current smoking cessation aids either deliver nicotine more safely, via patches, gums, or sprays, or combat depression in former smokers. But the Dewey team's work, if confirmed in human studies, may lead to a drug that could help many people stop smoking in a different way: by suppressing their “need” for nicotine and possibly by reducing the “high” as well.

    GVG may also have potential as a therapy for cocaine addiction, because the same team reported similar findings for that drug in the August issue of Synapse. “These preclinical data are tremendously interesting and provocative,” says Alan Leshner, director of the National Institute on Drug Abuse, who nevertheless cautions that further study is needed to determine whether GVG will be a useful—and safe—treatment for addiction.

    Dewey didn't originally set out to find an addiction drug. Although not approved in the United States, GVG has been used for years to treat epilepsy in 60 other countries. It works by irreversibly blocking a brain enzyme that breaks down the neurotransmitter γ-aminobutyric acid (GABA), which inhibits neural activity. Thus, brain GABA levels rise, dampening the excessive nerve firing that can lead to epileptic seizures. In 1990, Dewey, psychiatrist Jonathan Brodie at New York University School of Medicine, and their colleagues set out to see whether GVG could also treat schizophrenia. Some of this disease's symptoms have been linked to higher than normal levels of the neurotransmitter dopamine in certain brain areas, and the researchers reasoned that GVG might help by suppressing the dopamine-producing cells.

    In 1992, Dewey and Brodie showed that GVG does lower dopamine in a region of the baboon brain. But before pursuing GVG as a schizophrenia treatment, Dewey became intrigued by a Brookhaven colleague's work on drug abuse that focused on dopamine. “I thought, ‘I bet [GVG] will work’” to block the dopamine surge caused by addictive drugs, Dewey recalls. This surge is thought to underlie the “high” that keeps addicts coming back for more and perhaps also feelings that make them anticipate the arrival of a drug.

    In the current study, Dewey, Brodie, and their colleagues have shown that GVG can block the dopamine rush produced by nicotine. The researchers found that whereas nicotine injections double the dopamine levels in the reward centers of the brains of control rats, GVG given 2.5 hours before the nicotine could completely block the dopamine rise. And in positron emission studies that infer dopamine levels by detecting how much of a radioactive tracer can bind to dopamine receptors—low binding indicates high endogenous dopamine—the scientists saw something similar in baboons.

    To find out whether this change in brain chemistry has behavioral effects, team member Charles Ashby, a neuropharmacologist at St. John's University in Jamaica, New York, tested GVG's effects on a rat behavior called conditioned place preference, which is thought to reflect what happens in humans when particular environmental stimuli elicit drug cravings. First, Ashby and his colleagues gave rats repeated nicotine injections while they were in either of two connected boxes, one striped and the other plain, teaching them to associate nicotine with one of the boxes. Then, they let rats choose between the boxes after receiving a dose of either a control solution or GVG.

    As expected, control rats stayed in the box where they had received nicotine, but the rats given GVG displayed no preference, suggesting that GVG erased their attraction to places associated with the drug. “We think GVG stabilizes dopamine levels such that animals don't get the dopamine rush when they go to the chamber associated with the drug,” says Dewey. In humans, by extension, the treatment might dampen the intense drug cravings ex-smokers feel when they experience something—a sip of coffee, for example—that reminds them of cigarettes.

    And GVG may help combat cocaine cravings as well. This past August, Dewey, Brodie, Ashby, and their colleagues showed that GVG can prevent a cocaine-induced burst of dopamine in baboon brains. They further demonstrated that the drug blocks conditioned place preference in rats that have learned to prefer environments associated with cocaine injections.

    Of course, nobody can say whether GVG can help people stop smoking, or using cocaine, until it is tested in human smokers and cocaine users, something just now being considered by doctors at medical centers equipped to conduct such trials. And human tests may be delayed by concerns about the peripheral vision defects GVG causes in some epilepsy patients, which is why the U.S. Food and Drug Administration has not approved it. The much lower GVG doses needed to combat nicotine cravings may not cause these problems, however. Indeed, says Dewey, if further testing pans out, “we might be able to help people on any of a number of addictive drugs.”


    New Czar Aims to Sharpen France's Effort

    1. Michael Balter

    Paris: France is second only to the United States in spending on AIDS research, but in recent years the payoff has seemed disproportionately modest. Although French clinicians have conducted major studies to evaluate anti-HIV drugs developed elsewhere, France has created few new therapies of its own (Science, 16 January, p. 312). But French officials are hoping that will change soon. At a press conference earlier this week, France's new AIDS czar, immunologist Michel Kazatchkine, unveiled plans to harness basic AIDS research more tightly to eventual therapeutic goals, as well as to beef up the nation's AIDS vaccine effort.

    Kazatchkine—who in October replaced virologist Jean-Paul Lévy as director of the National Agency for AIDS Research (ANRS)—was joined at the press conference by Claude Allègre, France's minister of research, and Bernard Kouchner, the health minister. Both Allègre and Kouchner said that their presence was intended in part to scotch rumors, circulating over the past year, that the ANRS would be disbanded and its activities absorbed into other research agencies once Lévy stepped down. “We are not going to pull back or make a lesser effort in AIDS research,” said Kouchner.

    However, beginning next year, that effort will be much more tightly focused. For example, although about 25% of the ANRS's 1999 budget of $42 million will be spent on basic HIV research, half of that sum will now be reserved for “coordinated actions” designed to lead to new therapies. The other half of the basic research allotment will be awarded to researchers on the basis of grant proposals, although these will now be much more stringently judged than in past years. “The barrier for funding these projects will be raised higher,” Kazatchkine told Science in an earlier interview. “Basic [AIDS] research should not just bring one more piece to the puzzle but have the goal of identifying new targets for therapy.”

    Vaccine research will also receive a boost next year, up 18% from the roughly $6.4 million spent in 1998. Kazatchkine says that one key goal will be to involve new industrial partners in vaccine development. At the moment, only the Lyons-based pharmaceutical company Pasteur Mérieux Connaught (PMC) is working with ANRS on vaccines. “If PMC has been so dominant, it is because not many other [companies] have come in,” Kazatchkine says. Indeed, at the press conference, both Allègre and Kouchner decried the general reluctance of French companies to get involved in HIV research. “There is a black hole there,” said Kouchner. “The international companies developing new [anti-HIV] drugs are not French.”

    The new plans to focus more heavily on therapeutic goals drew mixed reviews from researchers who spoke to Science. “The way one develops a vaccine or finds a drug is not by going basic but through the most rigorous application of basic knowledge to research that is goal-oriented,” says virologist Marc Girard of the Pasteur Institute in Paris. And Françoise Brun-Vézinet, a virologist at the Bichat-Claude Bernard Hospital in Paris and a member of ANRS's scientific advisory council, says that focusing more effort on clinical AIDS research makes sense because “this is what has worked best” at the ANRS. Moreover, Brun-Vézinet adds, fundamental HIV research can still be accommodated in France's other research organizations, such as the giant biomedical agency INSERM.

    But some question whether it makes sense to limit the focus of ANRS's research to specific targets. An AIDS researcher who asked not to be identified says, “So far the other ‘coordinated actions’ of ANRS have not been impressive, and the vaccine effort has been groping in the dark. Raising the barrier is a good idea, but it should be on everything, not just basic research.”

    Despite the assurances from Allègre and Kouchner that the ANRS will continue to exist, the agency's mandate expires at the end of 2000, at which time the government will have to decide whether to renew it. That gives Kazatchkine 2 years to prove that French AIDS research can produce results. “I told the ministers it was absolutely premature to close down the ANRS,” Kazatchkine says. “If they asked me to take over, I guess they agreed.”


    Mixed Grades for NSF's Bold Reform of Statewide Education

    1. Jeffrey Mervis

    The Statewide Systemic Initiatives were supposed to transform how public schools taught science and math. But after 7 years, the National Science Foundation is still looking for answers

    Buffalo's Southside School sits in a tough neighborhood of this aging industrial city in upstate New York. Each day its 1400 elementary and secondary school students make their way to school amid poverty and crime. However, for several years the students received some high-profile help from Washington as part of an unusual national effort to improve the dismal state of U.S. science and math education.

    Then there were eight

    NSF has funded 26 Statewide Systemic Initiatives projects, with eight winning a second 5-year award and four being phased out early.


    Until this fall, Southside was a tiny piece of a massive program by the National Science Foundation (NSF) to encourage states to make comprehensive and lasting changes in the way they teach science and math. Beginning in 1993, Southside received almost $200,000 a year to fund summer workshops for teachers, new classroom materials, and a district coordinator, along with other programs that had suffered cuts in state funding. But despite a strong commitment from the building principal and many teachers, the program struggled to show progress. And although one might think that any help would be welcome in such a grim setting, a new principal who arrived in the program's fourth year shut down the project, saying that it was divisive, hadn't improved student performance, and wasn't a priority with higher-ups in the Buffalo school system (see sidebar on p. 1804).

    The rise and fall of the Southside project, one of 12 demonstration schools across the state, reflects the harsh realities of education reform in science and math. The program that funded it, NSF's Statewide Systemic Initiatives (SSI), is a radical departure from the agency's traditional practice of funding individual educational projects involving a relative handful of teachers, students, and school districts. It's a bold attempt—perhaps the most ambitious in NSF's broad portfolio of education programs—to achieve reform on many fronts at the statewide level. From training teachers and developing new curricula to rewriting state laws and reshuffling school management, it strives to change entire education systems rather than just tinkering with their component parts. At the same time, it has retained an element common to most NSF programs—a bottom-up approach that asks educators for their best ideas and doesn't assume the agency has all the answers.

    The SSI effort, launched in 1991 in 10 states, grew out of a bipartisan political promise to make U.S. elementary and secondary students the best in the world in science and math (see Policy Forum on p. 1830). By 1993, when New York joined the program, 25 states and Puerto Rico had been promised up to $10 million over 5 years to overhaul their science education systems. In 1994, NSF expanded its reach by creating sibling programs for large urban districts, called the Urban Systemic Initiatives (USI; see sidebar on p. 1802), and rural areas. Together, these initiatives crowned systemic reform as king of the educational hill at NSF. “This trilogy of efforts represents a singular strategy to achieve [success] for all of America's students,” Luther Williams, head of NSF's education directorate and the chief architect of the systemic reform program, told the agency's governing body, the National Science Board, in a 1994 presentation.

    But after 7 years, and nearly $600 million spent on the three programs, officials are still a long way from knowing whether systemic reform works—or even what constitutes success. A major assessment of the statewide efforts, a 5-year, $4.6 million evaluation by SRI International of Menlo Park, California, concluded this spring that the program's impact has been extremely hard to measure and that evidence of improved test scores as a direct result of the SSI reforms is even more tenuous. “The impacts of individual SSIs were positive but limited because no SSI was able to ‘go to scale’ and intensively affect all teachers statewide,” the report states. “Also, the [project's] impact was almost always uneven, affecting some districts, schools, teachers, or students much more than others.” An outside evaluation of New York state's program, for example, concluded that after 4 years, only four of the 12 original demonstration schools like Southside “were poised to carry on with their reform efforts.”

    That spotty record doesn't surprise the scores of educators, state officials, policy analysts, and researchers interviewed for this article. Although $600 million is a large sum by NSF standards, it's a drop in the bucket of national education spending. Many believe that the mixed record also reflects mistakes by NSF, a relatively small and obscure federal agency, in launching a high-profile educational initiative without adequate preparation, a comprehensive management strategy, or a clear and consistent idea of how to evaluate its impact. “People knew, deep down, that doing a whole state was an impossible task,” says Margaret Cozzens, who headed elementary and secondary programs at NSF for 7 years before leaving this summer to become vice chancellor for academic affairs at the University of Colorado, Denver. “But there was tremendous pressure to do something. So NSF got things started and then tried to figure out what works as it went along.”

    That experimental approach troubles some politicians familiar with the program. “As scientists, we dislike fuzzy thinking,” said Representative Vern Ehlers (R-MI), a physicist turned politician, at a 23 July hearing on systemic reform before the House Science Committee. “And I feel that SSI did not, and perhaps still does not, have clearly defined objectives that the states understand and are trying to achieve.” A state legislator who followed the Michigan SSI project before coming to Washington in 1995, Ehlers says he “liked the initial idea, but I'm concerned about whether the results have been worth what we have invested.”

    Despite sharing many of those misgivings, educators unanimously applaud NSF for launching the initiative. They think that the program, despite its flaws, has made a positive contribution to the national debate on how to improve science and math education. “I think that NSF is on the right track, and that even those states that have been canceled made a better use of the money than if it had gone to separate teacher enhancement programs or curriculum development,” says Iris Weiss, president of Horizon Research Inc. of Chapel Hill, North Carolina, which has evaluated several state projects.

    Many also believe that the SSI program has helped people at all levels—students, teachers, parents, public officials, and community leaders—even if the results aren't immediate and can't be measured easily. “I think every state has benefited,” says Nancy Mincemoyer, head of the Michigan SSI project, which ended last year. “When we asked people in Michigan what the impact of the SSI had been, they said, ‘It made us think. Nobody had asked us to think systemically before.’ It's a big science experiment, and NSF should be commended for sticking its neck out.”

    In the beginning

    Nobody said it would be easy to improve U.S. math and science education in the public schools. But NSF seemed well placed to take on the challenge. It already was giving researchers millions of dollars a year to develop new materials, to improve the preparation and continued training of teachers, and to study how children learn. The problem, NSF officials acknowledged, was that those efforts were piecemeal and not linked to a larger reform strategy.

    “There had actually been a ‘reverse systemic’ reform effort over the past 30 years, with a proliferation of separate projects,” says ex-NSF staffer Joseph Danek, who helped create the education directorate's Office of Systemic Reform in 1991 and who headed it until he retired from government in 1994. The variety may have addressed specific problems in individual communities, Danek says, but it diverted attention and resources from the goal of improving student achievement nationwide. In 1989, at a historic education summit, President George Bush and the nation's 50 governors confronted that goal by announcing their commitment to making U.S. students the best in math and science by the end of the millennium. Meeting that promise, they knew, would require changing the entire education system.

    The SSI program was NSF's response. At a May 1991 press conference held jointly with the National Governors' Association, Williams announced the first batch of SSI awards. The money was combined with funds from other sources that typically exceeded NSF's contribution, and in many cases it was also linked to existing local, state, and other federal initiatives—including some from NSF—that addressed specific concerns. “We recognized that the system itself was deficient,” Williams explains, “and we asked what NSF could do to enhance that infrastructure. We can improve training for teachers, for example, but if they can't implement what they've learned it doesn't help.”

    The decade has seen many efforts to overhaul science and math education from kindergarten through grade 12. But none has taken quite the sweeping, state-by-state approach that NSF has followed. Even so, NSF's status as a federal agency imposed certain limitations. From certifying teachers and approving curricula to setting the length of the school day and year, the nation's educational system is largely a responsibility of the state, not the federal, government. And the dollars reflect that balance of power: Despite all the federal programs, Washington provides only 6% of the approximately $400 billion spent each year on K-12 education in the United States.

    Recognizing that education rests firmly in the hands of local and state officials, NSF chose not to try to impose its vision of reform but rather allowed states to take the lead. “At the beginning we permitted them to do almost anything, as long as it was systemic,” Williams says. “The idea was for NSF to help states reform their systems and then pull out, letting them continue on their own.” The approach meant that NSF and the states were making up the rules on systemic reform as they went along. “We spent the first couple of years trying to figure out what to do,” admits Frank Watson, a longtime faculty member at the University of Vermont who this summer stepped down as executive director of the Vermont SSI project.

    The result was a bewildering array of projects that explored the universe of education reform. Montana wrote a new high school mathematics curriculum, while Vermont tackled its state policy-making apparatus as part of a broader campaign that resulted in a new funding formula. In California, a new administration jettisoned existing policies and practices and, combined with new rules on classroom size that swelled teacher ranks, plunged into an ongoing debate about what children should be taught. Some states, including New York and Puerto Rico, began their reform efforts in a handful of model schools and then tried to replicate their experiences on a statewide basis, with mixed success (see sidebars on pp. 1801 and 1804). Others, such as Connecticut and Maine, set up shop outside the regular bureaucracy and tailored their input to the needs of local districts.

    Increasing the number of years of math and science required for high school graduation, eliminating remedial courses, writing new textbooks, training teachers to use “hands-on” lessons rather than lectures, and drawing up new tests to measure progress—all were part of the new equation in many states. So too were Saturday academies, science fairs, family math, and other community-based activities. The goal, to get everybody involved in changing the system and improving student performance, was the same. But the approaches were strikingly different.

    A management headache

    As this multifaceted effort gathered force, NSF struggled with how to administer it. “NSF had a good idea, but it isn't God,” says Richard Cole, head of the Connecticut Academy for Education, which administers the state's SSI project, now halfway through its second 5-year award. “There were a lot of things that it wanted to accomplish, but it hadn't worked out the details.”

    Managing systemic reform is a challenge under the best of circumstances. “It's taken us 3 years to get them to understand what we are doing,” says a project director whose state competed successfully for a second round of funding. And the situation at NSF was far from ideal. State officials and educators say the agency dug a hole for itself from the start by making multimillion-dollar awards based solely on paper descriptions of what states hoped to accomplish. “In retrospect, it's clear that the states needed a planning year to try out some ideas before they entered the limelight,” says Weiss. Williams agrees. “One of the lessons we learned [for the USIs] is that we didn't want to do what we had done with the SSIs,” he says. “So we funded the USIs to do a year of planning, at $150,000, to understand how to do systemic reform.”

    Sharing the wealth

    Urban districts now far outpace states in the competition for a share of NSF's systemic reform dollars.


    The effort also suffered from heavy staff turnover. Some state programs were managed by as many as half a dozen NSF program officers during their 5 years of funding. “We had three, one of whom was good, during our 5 years,” says one state project director who requested anonymity. Adding to the instability was the departure in the fall of 1996 of Danek's successor as head of the systemic reform office, Peirce Hammond. (Hammond, who says he stepped down voluntarily, now works on similar issues at the U.S. Department of Education.) For the past 2 years Williams has personally handled the job along with his other duties, although this fall NSF finally advertised the position.

    Assessing the inaccessible

    Even before the program began, NSF officials say they recognized the need for close oversight and assessment of the effort. But trying to determine which programs were working proved to be a bigger problem than anyone imagined.

    In the first few years, NSF was content to let states describe what they were doing and what they hoped to accomplish. Their voluminous annual reports contained plenty of descriptive information but little about short-term impacts on student achievement. “Our plan was to show results with the class that entered school in 1991 and graduated in 2003,” says Cole. “We said from the start that we were in it for the long haul.” With projects using different assessment strategies to measure different activities, even comparisons among states were difficult. “The system was created in such a way that we couldn't tell whether a project was working,” says Weiss.

    However, the demand for greater accountability grew over time. NSF began to insist that states show quantifiable progress in student achievement in their annual reports. In 1995, for example, it sent out a directive that contained nearly 100 pages of questions about activities in the past year, including requests for reams of data on course enrollments and test scores. The new reporting requirements sent state officials and evaluators into a tizzy, as much of the data either had never been assembled or didn't exist. “It was a huge effort—we were frantic to fill in all the boxes,” says Charles Bruckerhoff of Curriculum, Research & Evaluation in Chaplin, Connecticut, who has evaluated Connecticut's SSI project.

    After much grumbling from state directors, Williams set aside the directive and in 1996, with the program in its fifth year, issued what he calls the six “drivers” behind systemic reform (see table on p. 1805). The first four describe what reform efforts should look like and who should be involved. The last two address student achievement, ranging from better test scores to more science majors in college, with a special emphasis on the performance of underrepresented groups, in particular Hispanics and African Americans.

    The drivers have become Williams's shorthand for describing what systemic reform is trying to accomplish and how to measure it. Although they didn't compel states to act in a certain way, the drivers imposed greater uniformity on how states reported what they had done. And project officials generally give NSF high marks for adopting such a management tool. “It was the first time I felt our work was being taken seriously,” says one project director about the state's initial review following introduction of the drivers. “We got direct feedback that was very useful.”

    Unnatural selection?

    Even before it had fully developed its assessment criteria, however, NSF made some tough decisions on which programs it would continue to support. The agency had funded the SSI program through a novel mechanism called a cooperative agreement, which allowed it to demand that states set annual goals and update them each year, with a penalty if they fell short. In addition, NSF used midterm reviews to let each state know how it was doing. Although many officials and educators felt that NSF kept changing the rules, agency officials saw the decisions to terminate some projects as proof of their fiscal prudence. Rhode Island, part of the first class of 10 SSI states, was booted out of the program in 1994 after a sharply critical midyear review. NSF officials felt that the program was never embraced by state officials nor firmly rooted in the schools. Over the next 2 years NSF cut off funding to three more states—North Carolina, Florida, and Virginia—before their scheduled 5-year run had ended. In the case of North Carolina, NSF's decision followed a report on the state's math and science instruction by a group of civic leaders who concluded that too many cooks were spoiling the educational broth.

    Because the program had been phased in over 3 years, NSF found itself pulling the plug on some states at the same time it was receiving applications for a second round of support from others. Educators hoped that those new funding decisions would provide a clearer picture of what NSF expected from systemic reform efforts. But they say NSF officials continued to send mixed signals about which projects were doing well. NSF has never spelled out its selection criteria beyond saying that the proposals were judged on quality and in accord with its usual peer-review practices.

    For example, several state officials have pointed to Colorado and Michigan as supposed models that, at some point, fell out of NSF's favor and lost in the second-round competition. Mincemoyer of Michigan says that “at past meetings we had been highlighted as a successful SSI, but we never got much feedback from NSF when it turned down our proposal.”

    Vermont, which made the cut, has only recently put in place a statewide assessment and, therefore, had no student achievement scores, much less gains, to publicize. “It was a very big problem for NSF at first,” confesses Watson. “But Vermont had never had a state assessment in math and science, and it took state officials a long time to recognize the need for one.” NSF officials say the decision to fund Vermont for a second time was based on the assumption that the new assessment would give the state a chance to reap the fruits of systemic reform efforts that had been planted over the first 5 years.

    At the same time, however, two of the four states that SRI identified as showing gains in the classroom—Montana and Ohio—failed to win new funds. Ohio's project, which provided a relatively small number of teachers with intense and ongoing training, was seen as “deep but narrow.” And by focusing almost exclusively on high school mathematics, the Montana project was seen as insufficiently systemic, although it produced a much-admired curriculum.

    To date, no state has succeeded in scaling up fully—going from a relatively small number of initial schools, teachers, and students to a statewide program. Although Williams says he would welcome a “how-to kit”—“I'd like a few states to stay in business long enough to be a fully reformed system and to write a paper for us on what they did,” he says—most educators and evaluators say that they can't even imagine what such a manual would look like, much less that it could be written. “Statewide systemic reform is not a phenomenon; it's 26 phenomena,” says Hammond.

    Show me the data

    In late October officials from the remaining eight SSI states gathered in a suburban Washington hotel with their systemic reform peers from urban and rural districts for an annual meeting to review their progress and to look ahead. Although their numbers—and the agency's investment in their efforts—have dwindled, state officials were reminded that the pressure to show results is as strong as ever. “In my 8 years at NSF, and 2 years as director [of systemic reform programs], I don't recall any time when we have been subjected to such sustained and challenging scrutiny,” Williams told them. “The pressure is coming from everywhere—Congress, the White House, the scientific community, the science board, and NSF's own leadership. And it's all about the need for agencies to show successful outcomes.”

    However, few states have data showing sustained and significant payoffs in the classroom that are tied to SSI reforms. At best, concludes the SRI report, half of the 22 SSI states that completed at least 5 years showed “credible evidence” of fostering better practices by teachers using an improved curriculum. Only four states—fewer than one in six—could point to better test scores flowing from their SSI activities. And in two cases the sample size was tiny—no more than a dozen classrooms with a few hundred students. The rest, the report said, were engaged in activities that did not translate directly into better test scores—for instance, new standards for a statewide curriculum, or new rules on how state funds should be distributed.

    The SRI evaluators are careful to point out that “a change in student outcomes was only one target for the SSI program.” But even discounting for other goals, as well as for tests not attuned to the new skills that the students have acquired, the evaluators conclude that “it seems likely that the SSI's impacts on student achievement were limited.”

    That message wasn't a surprise to many educators, who say it's unrealistic to expect even the best SSI projects to show great leaps in test scores given the program's relatively short life, the enormous challenges it addresses, and the insignificance of a $2-million-a-year program alongside a state's multibillion-dollar education budget. “You'd like to show remarkable results after 3 or 4 years, but frankly, that's impossible,” says Sandy Scofield of the University of Nebraska, Lincoln, former director of the Nebraska SSI and current head of the school's Center for Math, Science, and Computer Education. Adds Shirley Malcom, head of education programs at the American Association for the Advancement of Science (which publishes Science) and former science board member, “Frankly, I would be skeptical of any big changes in test scores in a few years.”


    SRI had already conveyed that message to NSF officials in case studies and related publications that had dribbled out over the past few years. In response, NSF officials have rushed to fund a new round of studies aimed at documenting and disseminating what a June 1998 announcement refers to as “inspiring” success stories. Working with grants of up to 3 years, the evaluators will examine all 26 SSI projects for concrete evidence of systemic change. The program announcement reminds researchers that “communicating the results of these impact studies is essential,” not just via scholarly articles but also through newspaper editorials, public presentations, and discussions with policy-makers.

    NSF officials acknowledge that the need for such evaluations reflects the infant state of knowledge about systemic reform. But they say they are proud of SSI's accomplishments to date. “It's been very successful in raising expectations,” NSF's Daryl Chubin, former head of evaluation, told Congress during the July hearing. “We can't guarantee success, but we can help states to be more vigilant.” Most project directors say they welcome the help in rallying support for systemic reform. “The [NSF] name has value at the local and state level,” says Frances Eberle, head of the Maine SSI project, which ended this summer. “We're disappointed we weren't renewed, but we're very pleased with what NSF has done for us.”

    Even so, many educators think that the payoff could have been much greater had NSF possessed a clearer idea of where it wanted to go and how to get there. “I laughed when I saw the [most recent evaluation] announcement,” says education policy analyst Nancy Saunders of the University of Colorado, Denver, who evaluated Colorado's SSI program, which ended last year. “My reaction was: They should have thought of these things 7 years ago. Now that they know so much, maybe they should start over from the beginning.”


    Puerto Rico Builds a Pyramid of Success

    1. Jeffrey Mervis

    Ask educators for a success story from the National Science Foundation's (NSF's) Statewide Systemic Initiatives (SSI) program and most will point offshore, to Puerto Rico. There, teachers in target schools have been trained in a new curriculum and student test scores have risen—and the reforms are spreading outward from this solid base. Funded initially in 1992, the Puerto Rico program was renewed for another 5 years in 1997. “When we looked at who to renew, we wanted to find models of what can work, like in Puerto Rico,” explains Luther Williams, head of the education directorate at NSF.

    Puerto Rico's school system is highly centralized. It's also large: If it were one district, it would be among the three largest in the country, behind New York City and on a par with Los Angeles. The SSI project is directed by Manuel Gomez, a physicist and administrator at the University of Puerto Rico, who runs it as an experiment—including a testable hypothesis, controls, data collection, and constant monitoring. The heart of the reform is training teachers to work with a new, standards-based curriculum—the same kind of changes made in other SSI projects. But Gomez's management strategies put his program over the top, observers say. “Manuel has succeeded for a variety of reasons,” says Shirley Malcom, a former member of NSF's oversight board and head of the education and human resources directorate at the American Association for the Advancement of Science (which publishes Science). “A big part is his insistence on doing things right.”

    As part of that rigor, Gomez has spent $1.2 million to compare students at SSI schools with their counterparts at other public schools and at the island's extensive system of private schools, using publicly available portions of national and international assessments translated into Spanish. In addition, every SSI student is tested each fall and spring to help assess their progress. Puerto Rico has also made its SSI project the hub for other state and federally supported efforts to improve science and math education.

    Gomez's major innovation has been to employ a pyramid system based on bringing systemic reform to one school at a time. He began with 6 weeks' summer training of teachers from seven middle schools. Next, Gomez converted the pilot schools into what he calls “dissemination centers” to train the next round of teachers. The following summer each center worked with teachers from eight to 10 schools; the most successful seven buildings became a second tier of dissemination centers. Eventually the project brought standards-based curricula into elementary schools and then into high schools. By this fall it had reached 400 schools, one-quarter of the island's total, and project officials expect to double that number in the next 2 to 3 years.

    “Everybody said it was a clumsy idea because it takes so long,” says Gomez. “But I said, ‘Be patient. It will work if we give it time.’” Getting all the teachers on board at a school is another key element, he adds. “I could train five teachers and call it an SSI school,” he says. “But if the teacher next door feels threatened, then he or she will go to the principal and try to get it squashed.”

    Such intelligent management strategies have paid off. In its evaluation report, SRI International of Menlo Park, California, singled out Puerto Rico as one of four states “with the most credible evidence” that the SSI project had raised student achievement. Gomez's approach is also beginning to spread beyond the island. This fall NSF gave the project a 3-year, $750,000 award to adapt the model to New York City's Urban Systemic Initiative, which hopes to use Puerto Rico's Spanish-language material in setting up its own dissemination centers. “We see it as an important step in applying what we've learned,” says the University of Puerto Rico's Norma D'Avila, co-director of the project.


    Urban Districts Grab the Spotlight

    1. Jeffrey Mervis

    The sprawling Statewide Systemic Initiatives (SSI) were the National Science Foundation's (NSF's) first efforts at systemic education reform. But that program, begun in 1991, is no longer the agency's flagship educational effort. That honor now belongs to the Urban Systemic Initiatives (USI) program, launched 3 years later, which targets the 25 cities with the largest number of poor children in the country.

    Many educators say the urban initiatives have a big advantage over NSF's statewide programs. City schools are usually run by a single administrator or school board—a definite advantage when you're trying to cut through layers of bureaucracy and overturn the status quo. State systemic efforts, in contrast, must negotiate among several, often competing, sources of power. “I think that NSF's approach is better suited to a smaller administrative unit, like a city,” says Tom Baird of the Florida Department of Education and former director of the Florida SSI, which was terminated 6 months early.

    Helping big cities, where the problems are seen as more urgent and the stakes even higher, is also politically sexier. “Congress wants to have more of them, and they certainly demonstrate NSF's commitment to working with the population in greatest need,” says Margaret Cozzens, a former senior NSF education official now at the University of Colorado, Denver.

    The USI program gives each site up to $3 million a year for 5 years, 50% more than SSI states receive, to support a similar mix of efforts to reform the district's entire science and math education program. Although many USI cities are located in SSI states, the two projects are managed separately and often have little more than a nodding acquaintance with one another.


    All told, 22 cities have been awarded grants since the program began. The sole casualty to date is Cincinnati, Ohio, which was terminated 1 year early after NSF officials decided that systemic reform had become lost amid a broader restructuring of the district. NSF officials also pulled the plug after 2 years on a similar systemic reform grant to the District of Columbia, which is not eligible for the USI program.

    One problem that has plagued the SSI program—a shortage of data on student achievement—hasn't been a stumbling block for USI sites. Because most urban districts perform so poorly, city officials have typically regarded rising test scores as the litmus test for any educational reform and, thus, have invested heavily in preparing teachers and students for such achievement tests. As a result, several cities have managed to show improvement across one or more grades in specific subjects. But the challenge for USI sites has been to show a direct connection between higher scores and NSF's investment.

    NSF officials sidestepped that issue in a 50-page booklet published in September that touts the accomplishments of the USI sites. It notes, for example, that Chicago elementary school students did better (by some unspecified percentage) on standardized tests in mathematics in the first year (1994–95) of the USI program. But a later table on middle school student achievement shows that the magnitude of gains shrank after the first year and that sixth graders in non-USI schools actually did better than their USI counterparts over a 3-year period. The booklet also notes that Chicago high schoolers began to do better in math after the introduction of reforms that go beyond the USI project, including accountability measures—in effect, no more social promotion—and an end to substandard and remedial courses like “consumer math.”

    Although such improvement is welcome, Chicago Public Schools CEO Paul Vallas says that the discipline-based USI reforms are only part of the reason. What makes a bigger difference in raising student achievement, Vallas told an NSF-sponsored field hearing this summer on systemic reform in his city, are efforts such as an expanded summer school, early child care, and before- and after-school activities. “The bottom line,” says Vallas, “is if you reach children earlier, if you keep them in school longer during the day and throughout the year, and if you provide them more instructional time, the children are going to perform better.”

    Whether or not the USI projects have been the catalyst for such changes, as NSF officials insist, their time may also be passing. Although Boston, Houston, and Indianapolis are still in the running for a first-time USI award, next spring marks the end of the line for seven cities in the first cohort, and NSF officials say they have not yet decided whether to fund a second round. Last month in Washington, Luther Williams, head of NSF's education directorate, reminded USI project directors that the two newest sites had first call on NSF's resources. He also encouraged cities to look for nonfederal partners, including industry, for continued support for their reform efforts.


    In New York, the Pieces Didn't Add Up

    1. Jeffrey Mervis

    When New York applied to the National Science Foundation's Statewide Systemic Initiatives (SSI) program in 1993, its strategy was to start with the toughest schools—in economically depressed, high-minority, inner city areas—and then build up from there. Buffalo's Southside School certainly fit the bill (see main text). But the rush to begin—Principal Ray Cooley recalls having less than a month to scramble to put together a proposal that would supplement some instructional changes he was already making—was a taste of the organizational problems that would plague the project throughout its 5-year existence.

    Cooley admits that systemic reform got off to a “rocky start” at Southside and that some veteran teachers were less than excited by the prospect of change. “I call it the year of the divorce, the year of tears,” says district coordinator Kathy Resutek, a former teacher who spent most of her time in the building. Still, less than 2 years into the program, state officials were pointing to a new approach to learning, including hands-on activities such as labs and field trips, that was transforming student attitudes toward science and math. “The kids loved coming to school,” says Cooley, who retired in 1997 after 18 years as principal. When he left, Cooley says, the program “was ready to take off.”

    But such changes in attitude and classroom behavior didn't translate into what many would consider success—improved student test scores. “We held our own in math, but we didn't do as well as I would have liked in science,” says Cooley. And his successor, Marilyn Brock, also a veteran administrator, says she wasn't impressed with what she found. “I asked to see the original proposal and the results to date, but there was no documentation of anything,” she says. “I had nothing to go on—I wasn't told anything about the project [by Buffalo school administrators] before I came here. I wasn't opposed to it, but there just didn't seem to be much interest [from downtown]—after all, the money was just a drop in the bucket compared to the district's overall budget.” Brock also says the project was divisive, embraced by certain teachers in certain grades but not by the faculty as a whole.

    Several Southside teachers who enthusiastically back the project dispute Brock's description. They say she saw the heavy use of teacher-led committees and other decentralized activities as a threat to her authority. Brock calls that accusation “one of many false rumors that was spread as soon as I arrived.”

    Sam Alessi, Buffalo's assistant superintendent for curriculum, believes poor communication was a major factor in the project's demise. Southside was “a very strong community, a family,” he says, and “Brock was seen as an outsider. She wanted a pause, to be brought up to speed, and they saw it as opposition. I don't think she planned to stop it.”

    Both Brock and her critics agree on one point, however: Once she raised questions about the project, neither the district nor the state tried very hard to put it back on track at Southside. Alessi says the state already had plans to put most of the SSI money for 1997–98 into other schools and that systemic reform is continuing with other funds. But he admits that reforming one school, much less scaling up throughout the district and state, is a slow and difficult process at best. “Systemic reform isn't something that you can do overnight,” he says about the extended effort—and mixed results—at Southside. “It takes years of discussions. That's why state systemic reform is so hard to do.”


    Starbirth, Gamma Blast Hint at Active Early Universe

    1. James Glanz

    Astronomers thought starbirth subsided at great distances, but new observations suggest that stars were forming and exploding as far as telescopes can see

    Somewhere beyond the greatest distances and earliest times that telescopes can reach lie the dark ages of the universe, the era between the big bang and the birth of the first stars. Those dreary times are even more remote than anyone had imagined, two recent developments suggest. In one, astronomers found frenzied star formation as far back as they could see. In another, a tremendous flash of gamma rays, among the brightest of these events ever recorded, may have originated at an even greater distance.

    These hints that the early universe was a much more active place than anyone suspected contradict the evidence of the Hubble Deep Field (HDF), a 10-day space telescope exposure that NASA unveiled 3 years ago, which probed as far back in time as possible in a tiny speck of the sky. The HDF suggested that star formation slows at the greatest distances, implying that the dark ages might lie just beyond the farthest reaches probed by the image. The new evidence hints, instead, that the HDF may have probed an unusually vacant region of the distant universe. The astronomers involved caution that contradictory evidence could emerge at any time. But for now all eyes are on the work, some of which has only recently been presented in talks and posted on the Los Alamos preprint server. The developments are “tremendously exciting,” says Dale Frail of the National Radio Astronomy Observatory (NRAO) in Socorro, New Mexico.

    Posted just last week was a paper by Charles Steidel of the California Institute of Technology and four colleagues describing star-formation rates far out in space and back into the cosmic past. Steidel began by estimating “redshifts”—a measure of distance—for about 1500 galaxies observed with several telescopes in an area of the sky that included the HDF but was about 200 times larger. Interstellar gases screen out the shortest wavelengths of light from distant galaxies, producing a characteristic cutoff in the spectrum called the Lyman break. By estimating how much the expansion of the universe had displaced each galaxy's Lyman break toward the red end of the spectrum, Steidel was able to gauge its redshift. Then, by recording the amount of ultraviolet light that hot young stars emit from the galaxies and allowing for some dimming by dust, Steidel worked out the star-formation rate as a function of redshift.
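    The redshift estimate Steidel's method yields follows from a one-line relation between where the Lyman break sits in the observed spectrum and where it sits at rest. A minimal sketch, with the 91.2-nanometer rest-frame Lyman limit and the observed wavelength below as our own illustrative assumptions rather than numbers from the paper:

```python
# Hedged sketch: estimating redshift from the observed position of the
# Lyman break. The rest-frame Lyman limit (91.2 nm) and the observed
# wavelength are illustrative assumptions, not values from the article.

LYMAN_LIMIT_NM = 91.2  # rest-frame wavelength of the Lyman break

def redshift_from_lyman_break(observed_nm: float) -> float:
    """Redshift z from the break's observed wavelength:
    z = lambda_observed / lambda_rest - 1."""
    return observed_nm / LYMAN_LIMIT_NM - 1.0

# A break shifted out to ~456 nm implies z ~ 4, roughly the most
# distant galaxies in Steidel's sample.
print(round(redshift_from_lyman_break(456.0), 2))  # → 4.0
```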

    “We were surprised by what we found,” says Steidel. The rate did not decline for as far back as they could see—out to a redshift of more than 4, corresponding to roughly 13 billion years ago, perhaps 90% of the way back to the big bang. “It's just a long plateau,” he says. That conclusion jibes with hints from other wavelength bands (Science, 17 July, p. 320), but it conflicts with earlier work that focused only on the HDF—work whose results acquired enormous influence, despite the warnings of some of the people who did the analysis.

    The HDF analysis, led by Piero Madau of the Space Telescope Science Institute (STScI) in Baltimore—Steidel was a co-author—found a steep decline in star formation at redshifts greater than 3. The observations had the advantage of using the Hubble Space Telescope, which could pick up intrinsically fainter galaxies than the ground-based telescopes of the latest Steidel work. But the conclusions rested on just 13 galaxies at a redshift of 4, compared to 244 for Steidel's work. “The issue with the HDF is that it's a small region of the sky,” says Madau. “HDF might have gone through a region empty of galaxies, a void.”

    Madau says he wants to wait for analyses of a second deep field image, made in the southern sky (Science, 27 November, p. 1621), to draw firmer conclusions about whether the original HDF is a statistical oddball, but that for now, “I think [Steidel] is probably right.” Steidel says that his rate could even represent a lower limit, because his analysis would have missed any light blocked completely by dust, rather than just dimmed. The influence of the HDF work “does reflect how people don't read the fine print,” says STScI's Mark Dickinson, who is a co-author on both papers.

    The second hint of an active young cosmos came last month, when STScI's Andrew Fruchter posted a paper on the Los Alamos server suggesting that a gamma ray burst (GRB) seen on 29 March originated at a redshift of about 5. GRBs—mysterious, seconds-long flashes of gamma rays—cannot be ranged directly, even though they briefly outshine all other known objects in the universe. But they often have “afterglows” at optical wavelengths, like the embers of a campfire, whose redshifts can be found. Last spring, two Caltech researchers made headlines by announcing a then-record redshift of 3.4 for a GRB (Science, 24 April, p. 514).

    Then, after the 29 March GRB, a team led by Greg Taylor and Frail of NRAO saw an afterglow in radio wavelengths and fixed its position precisely enough to lead several teams to the optical afterglow. Fruchter applied the Lyman-break method to the optical data and concluded that the burst probably originated at a redshift of 5. Because the event was among the 4% brightest of all GRBs seen from Earth, such a distance points to an explosive energy at the source that is “just staggering,” says Frail.

    He and others caution that a heavy dose of dust could mimic the Lyman-break absorption. But detailed modeling by Daniel Reichart and others at the University of Chicago supports the notion that the redshift is still around 5. At that distance, the intensity of the GRB adds further support to scenarios for these celestial explosions in which a powerful magnetic field surrounding the gamma ray source channels the rays in one direction, beaming them across the cosmos like a searchlight that is sometimes pointed toward Earth, says Jonathan Katz of Washington University in St. Louis.

    If GRBs are connected with the violent deaths of the massive, short-lived stars found in star-forming regions, as some researchers have suggested, then Fruchter's distant event “is totally consistent with what we're finding,” says Steidel. But he warns that the trend of an active young universe could prove just as fragile as earlier conclusions about these remote times.


    Geologists Take a Trip to the Red Planet

    1. Richard A. Kerr

    Toronto—When more than 5000 geologists gathered here from 26 to 29 October for the annual meeting of the Geological Society of America, a highlight was Mars. The latest data from the Mars Global Surveyor lend credence to an early ocean, a new view of data from Mars Pathfinder questions whether Mars had plate tectonics, and lab work suggests a new, definitive test for life on Mars.

    Mars Ocean Holds Water

    Many proposed parallels between our own planet and Mars have come and gone—canals, intelligent Martians, maybe even martian bacteria. But one putative similarity has survived a first test by the Mars Global Surveyor spacecraft: an ancient ocean on Mars.

    It has long been accepted that water gushed out of the martian highlands a couple of billion years ago, through massive channels that are still visible today. Many researchers argue that after reaching the northern lowlands, the water dispersed before it could get very deep. But some think repeated flows pooled there to form an ocean covering one-quarter of the planet before eventually seeping into subsurface deposits or evaporating and escaping to space. And so far, data sent back by Surveyor—which has been orbiting the planet since September 1997—are consistent with this provocative picture.

    “I was very skeptical when I started,” says Surveyor team member James Head of Brown University in Providence, Rhode Island. But after measuring the altitude of supposed shorelines, the smoothness of plains that would once have been ocean floor, and the volume of the basins, “I was surprised to see all three were consistent with predictions [of an ocean]. I don't know the answer, but it's tremendously exciting.” Not everyone is caught up in the excitement, however. Planetary geologist Michael Carr of the U.S. Geological Survey (USGS) in Menlo Park, California, is still “very skeptical of the whole ocean business,” saying it's unlikely that the water gushed out quickly enough to form an ocean. “There has been a tendency with Mars, if there's a simple solution versus an outrageous one, to choose the outrageous one because it's more interesting.”

    The outrageous possibility got a boost when Head and his Brown colleagues did three different analyses of data from the Mars Orbiter Laser Altimeter (MOLA) aboard Surveyor. One was straight altimetry—determining topography by timing a laser pulse's round trip from the spacecraft to the surface and back. That let the group test a proposal made by planetary geologist Timothy Parker of the Jet Propulsion Laboratory (JPL) in Pasadena, California, who back in the mid-1980s identified features in the northern lowlands as the shorelines of now-vanished oceans. One ocean seems to have filled the lowlands to the brim, while a later one only partly filled the depression. If the shorelines are real, the height of each one should be the same at every point.
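    The ranging step in laser altimetry reduces to simple arithmetic on the pulse's travel time. A minimal sketch, with the round-trip time and the spacecraft's radial distance treated as illustrative assumptions rather than actual MOLA parameters:

```python
# Hedged sketch of laser-altimetry ranging: one-way distance is
# range = c * (round-trip time) / 2, and the surface elevation follows
# by subtracting the range from the spacecraft's radial distance.
# All numeric inputs here are illustrative assumptions.

C_KM_PER_S = 299_792.458  # speed of light in km/s

def range_from_round_trip(t_seconds: float) -> float:
    """One-way distance (km) from a laser pulse's round-trip time."""
    return C_KM_PER_S * t_seconds / 2.0

def surface_radius(spacecraft_radius_km: float, t_seconds: float) -> float:
    """Radial distance (km) of the surface point below the spacecraft."""
    return spacecraft_radius_km - range_from_round_trip(t_seconds)
```

Repeating this measurement along the orbit track builds up the topographic profiles used to check whether a putative shoreline stays level.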

    A first look at the MOLA data suggested that the higher shoreline undulates too much to be real. But although MOLA showed that the altitude of the lower shoreline also fluctuates, mostly over a range of about 500 meters, it may have been perfectly uniform once. When the researchers removed the great bulge of the Tharsis volcanic region, which presumably pushed up the shorelines after the ocean had emptied, the altitude fluctuations fell, especially for the smaller ocean. Planetary scientist Bruce Banerdt of JPL and Parker reported similar results at the meeting.

    Head's second test compared the roughness of the surface inside and outside the shorelines. In the Mars ocean scenario, sedimentation would have smoothed the lowlands. MOLA had already shown that the northern lowlands are as smooth at the 100-meter scale as the abyssal plains on the floor of Earth's oceans, and on a larger scale they are flatter than any other known surface in the solar system (Science, 13 March, p. 1634). New MOLA data show, according to Head, that the smoothness is greatest inside the lower shoreline.

    In a third test, Head used MOLA data to calculate the volume of water needed to fill the northern lowlands up to each shoreline. The smaller ocean would have required only about three times the minimum amount of water thought to have flowed from the channels, he found—a plausible amount. “So far, we think the three pieces of data from the MOLA are consistent with the hypothesis of Parker,” says Head, at least for the smaller ocean. “None of the three tests is unequivocal; they're just consistent.”
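    Head's volume calculation amounts to integrating water depth over a topographic grid up to a trial shoreline elevation. A toy sketch of that integration, with a made-up grid standing in for the MOLA data:

```python
# Hedged sketch of the volume test: sum (shoreline - elevation) * area
# over every grid cell that lies below the trial shoreline. The grid,
# shoreline level, and cell area below are invented for illustration.

def ocean_volume(elevations, shoreline_m, cell_area_m2):
    """Water volume (m^3) needed to flood a gridded basin up to
    the given shoreline elevation."""
    return sum(
        (shoreline_m - h) * cell_area_m2
        for row in elevations
        for h in row
        if h < shoreline_m
    )

# 2x2 toy grid (elevations in meters), shoreline at 0 m,
# cells of 1 km^2 = 1e6 m^2; one cell sits above the shoreline.
grid = [[-500.0, -200.0],
        [-100.0,  300.0]]
print(ocean_volume(grid, 0.0, 1e6))  # → 800000000.0
```

Comparing the resulting volume to the water the outflow channels could plausibly have delivered is what makes the test quantitative.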

    Consistency isn't enough to convince Carr. Head “is doing the right thing,” he says, but the results so far don't solve a serious problem: getting enough water into the lowlands all at once to make an ocean and keeping it from freezing solid. Many researchers now think that early Mars was too cold for standing bodies of water.

    At the meeting, planetary geologist Alfred McEwen of the USGS and his colleagues noted that other processes could explain at least one oceanlike feature: the smooth lowland plains. They presented new Surveyor images of Elysium Planitia and Amazonis Planitia, two areas that lie between the two shorelines. Researchers who saw the images said they showed convincingly that Elysium and Amazonis are covered by vast fields of lava, which could account for their smoothness. “Here, it's clear the flat topography has to be due to the lava,” says McEwen. “Flat topography itself is not evidence for an ocean.”

    McEwen is quick to add that the lava fields don't rule out an ocean, either. But “everything needs more analysis,” he says. That will come next spring when Surveyor, after numerous delays, finally settles into its intended orbit close to the planet. Once it gets an even closer look, the outrageous ocean hypothesis should stand or fall.

    Life's Iron Mark

    To study life in the distant past, researchers rely on fossilized bones and shells. But fossils are rare, so scientists have also developed a more subtle method to detect signs of life: analyzing isotope ratios. Living things preferentially take up the lighter isotopes of carbon and oxygen, for example, leaving a “fingerprint” of life in the rock record even when no tissues have fossilized. At the meeting, geochemists added a powerful new element to their isotope fingerprint kit: iron.

    Geochemists Brian Beard and Clark Johnson of the University of Wisconsin, Madison, announced that they could detect bacteria's effect on iron isotopes, a fingerprint that appears to be unique to life and difficult to erase. “People have been looking more than a decade” for an iron isotopic effect, because it could be a more reliable indicator of life than carbon and oxygen isotopes, notes geochemist Richard Carlson of the Carnegie Institution of Washington's Department of Terrestrial Magnetism. “I think they've actually found one finally.” Already, researchers have used iron to confirm life's role in shaping mysterious eons-old rocks, and it holds promise for tackling questions ranging from life's role on the early Earth to its possible presence on Mars.

    Scientists realized decades ago that organisms would preferentially take up the lighter, rarer iron-54 over the more common iron-56 when filling their energy or other nutritional needs. And because of iron's great weight, factors such as temperature weren't expected to alter the isotopic fingerprint, as happens with carbon and other light isotopes. But iron is notoriously difficult to analyze. The very act of heating the sample to drive the iron into a mass spectrometer, for example, preferentially drives off the lighter isotopes, leaving the analyst to sort out how much of the light-isotope enhancement is due to life and how much to the machine.

    Beard and Johnson finally overcame that hurdle, in part by spiking some samples with a mix of two isotopes—5% iron-58, which is only a trace component in natural samples, and 95% iron-54. By determining how much the machine skewed the proportions of these isotopes, they could correct for instrumental effects and accurately gauge the ratio of iron-56 to iron-54 in the original sample. When they analyzed an ultrapure iron sample, they reduced the analytical error by a factor of 10 from previous analyses, to ±0.25 parts per thousand (per mil). Exactly the same ratio turned up in a variety of terrestrial and lunar rocks formed from magma. “Iron isotopes just aren't fractionated by the inorganic processes” that form such igneous rocks, concludes Beard.
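The spike trick works because the instrument's bias scales with the mass difference between isotopes: measuring how far the known 58/54 ratio of the spike is skewed tells you the bias per atomic mass unit, which can then be removed from the 56/54 ratio of interest. A sketch under a simple linear mass-bias assumption, with invented numbers rather than Beard and Johnson's actual values:

```python
# Sketch of correcting instrumental mass fractionation with an isotope
# spike of known composition. All numbers are illustrative, and a
# linear (per-amu) mass-bias law is assumed for simplicity.

def mass_bias_per_amu(measured_spike, true_spike, mass_diff):
    """Fractional instrumental bias per atomic mass unit, inferred from
    a spike whose true isotope ratio is known independently."""
    return (measured_spike / true_spike - 1.0) / mass_diff

def correct_ratio(measured, bias_per_amu, mass_diff):
    """Remove the instrumental mass bias from a measured isotope ratio."""
    return measured / (1.0 + bias_per_amu * mass_diff)

# Spike: known Fe-58/Fe-54 ratio; the instrument reads it slightly
# depleted in the heavy isotope.
true_58_54 = 0.05 / 0.95           # 5% Fe-58, 95% Fe-54 spike
meas_58_54 = true_58_54 * 0.992    # hypothetical 0.8% heavy-isotope loss
bias = mass_bias_per_amu(meas_58_54, true_58_54, mass_diff=4)  # 58 - 54

# Apply the same per-amu bias to the Fe-56/Fe-54 ratio of interest
# (mass difference of 2 amu).
meas_56_54 = 15.70                 # hypothetical raw measurement
print(correct_ratio(meas_56_54, bias, mass_diff=2))
```

Because the spike isotopes bracket the instrumental effect independently of the sample, any remaining light-isotope enrichment can be attributed to the sample itself rather than the machine.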

    Life, it turns out, is a different matter. Microbiologist Kenneth Nealson of the Jet Propulsion Laboratory in Pasadena grew bacteria that use iron locked up in the mineral ferrihydrite as an energy source and then release it in soluble form. He found that the dissolved iron had 1.2 ± 0.25 per mil less iron-56 than the mineral it came from. The iron of manganese nodules from the sea floor—thought to have grown over millions of years with the help of bacteria—was similarly depleted in the heavy isotope, Beard found. He also studied a banded iron formation—a finely layered deposit of iron ore whose origin may have involved life. He found that alternate, light-colored layers are isotopically heavier, possibly due to seasonal blooms of bacteria. If the isotopic composition of any sample departs from that of magma-derived rocks, he concludes, “it probably means that iron was processed biologically.”
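Shifts like the 1.2 per mil depletion Nealson and Beard report are conventionally expressed in "delta" notation: the sample's isotope ratio as a parts-per-thousand deviation from a reference. A short sketch with hypothetical numbers, using the uniform igneous-rock ratio as the reference:

```python
# Per-mil (delta) notation for an isotope-ratio shift relative to a
# reference. The baseline ratio here is hypothetical; the point is the
# parts-per-thousand convention used throughout the iron results.

def delta_per_mil(sample_ratio, reference_ratio):
    """Deviation of a sample's isotope ratio from a reference,
    in parts per thousand (per mil)."""
    return (sample_ratio / reference_ratio - 1.0) * 1000.0

reference = 15.700                    # hypothetical Fe-56/Fe-54 baseline
sample = reference * (1.0 - 0.0012)   # 1.2 per mil lighter, as for the
                                      # bacterially released iron
print(round(delta_per_mil(sample, reference), 2))  # -1.2
```

On this scale, the igneous baseline sits at 0 per mil, so any sample that departs from zero by more than the ±0.25 per mil analytical error is a candidate for biological processing.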

    “What they've done is great,” says geochemist Ariel Anbar of the University of Rochester in New York. “It's devilishly difficult to do, but it could be a really powerful tool.” Applied to some of Earth's oldest rocks, it could confirm hints from carbon isotopes that life was flourishing as long as 3.8 billion years ago. Iron isotopes in sediments laid down during the mass extinctions of 250 million and 65 million years ago could shed light on the biotic collapse in the oceans suggested by carbon isotopes. Iron should provide a new “biomarker” to be checked in any rocks that humans return from Mars. And the method may be able to detect the signature of life, if any, in the tiny iron oxide grains in martian meteorite ALH84001, which a few researchers argue were formed by bacteria. Settling that debate—Beard and Johnson's original goal in developing the method—would surely earn the technique its keep.

    Mars Rock Not So Earth-like

    When the Mars Pathfinder rover sent back word last year that some rocks on the Red Planet resemble volcanic rock from our own Andes, researchers wondered whether the similarity was a sign of a much deeper geologic kinship (Science, 1 August 1997, p. 638). Andesites, as they are called, form in volcanoes fueled by tectonic plates as they plunge into Earth's interior. So might Mars have had drifting plates at some time, too?

    Planetary geologists were intrigued but uneasy, because Mars has no obvious signs of past plate tectonics. Now they've come up with a more pedestrian scenario for the origin of the Pathfinder rocks: that they were cooked up in the dying days of an ordinary volcano. “We haven't proved anything,” says planetary petrologist Harry McSween of the University of Tennessee, Knoxville, “but we think this is a more plausible explanation than some early ideas.”

    Plate tectonics came up in early discussions because Pathfinder rocks turned out to be about 62% silica—the element silicon combined with oxygen—compared to the average of 45% to 50% silica expected in the martian crust. On Earth, rocks that are rich in silica form in the volcanoes of the Andes and Aleutians, which erupt over the sinking, water-laden crustal slabs of plate tectonics. The slabs' water helps distill extra silica from the rock below the volcanoes. The bit of martian andesite could have been a sign that billions of years ago, when Mars was both wetter and more geologically active, Earth-like plate tectonics shaped its surface.

    But on further consideration, McSween and his colleagues find that “the best match is icelandite, not andesite.” Icelandite, a volcanic rock that forms in small quantities at volcanoes like those of Iceland, Hawaii, and the Galápagos Islands, also has a high proportion of silica, making it a sort of andesite. But it has higher iron and lower aluminum abundances than Andes andesite—features also seen in the Pathfinder rocks, says McSween. The extra silica in icelandite is concentrated by the repeated cycle of melting, crystallization, and remelting experienced by magma that takes the longest possible route through a volcano late in its eruptive life—a process that does not require plate movements.

    Researchers think they have already spotted the kind of rock that could be volcanically distilled into a martian icelandite. McSween is thinking of an iron-rich basalt like the one that gave rise to the icelandites of the Galápagos, a type of rock that may have been common early in martian history. Meteoriticist Ralph Harvey of Case Western Reserve University in Cleveland has his eye on bits of magma found as inclusions in one of the chunks of Mars rock that have reached Earth as meteorites. And petrologists Michelle Minitti and Malcolm Rutherford of Brown University have cooked up something like a Pathfinder icelandite in the lab by starting with a rock matching the composition of another martian meteorite plus a bit of water.

    Whatever the starting material on Mars, says Harvey, “we don't have to invoke something we have no proof of, like plate tectonics.” Daniel Britt, a planetary geologist at the University of Arizona, Tucson, and a former Pathfinder team member, agrees. “You look on Mars, you see volcanism everywhere,” he says, and icelandite “is what you'd expect in the last gasp of a volcano.”
