News this Week

Science  11 Feb 2000:
Vol. 287, Issue 5455, pp. 942
  1. HIV TRANSMISSION

    AIDS Researchers Look to Africa for New Insights

    1. Jon Cohen

    SAN FRANCISCO, CALIFORNIA—A remarkable shift in emphasis occurred here last week at the largest annual AIDS meeting* held in the United States. Instead of reports from North America and Europe once again dominating the meeting, many of the most significant findings came from Africa, where 70% of the world's 33 million HIV-infected people live. As HIV infection rates have plateaued in most developed countries, investigators are increasingly turning to Africa, where the epidemic is still exploding. “When it moves that rapidly, there are many things you can do about it,” explains Anthony Fauci, head of the U.S. National Institute of Allergy and Infectious Diseases (NIAID).

    As the geographic focus shifts, so do the research questions. In North America and Europe, investigations center on elucidating the relationship between HIV and the immune system and also determining how best to use the 14 marketed anti-HIV drugs. But in sub-Saharan Africa, where infection rates among adults have climbed to 25% in some countries, such questions have little relevance, as only the elite can afford anti-HIV drugs. The main issue, instead, is how to stop transmission of the virus. As several reports at the meeting emphasized, the situation may be more complicated than previously assumed.

    One puzzling question is why HIV infection rates are so much higher in some African populations than in others. To find out, Anne Buvé, an epidemiologist at the Institute of Tropical Medicine in Antwerp, Belgium, launched a huge study in four countries that compared sexual habits, male circumcision practices, and rates of infection with various sexually transmitted diseases (STDs). Buvé's international research team randomly selected approximately 1000 men and 1000 women from four towns: Cotonou, Benin; Yaoundé, Cameroon; Kisumu, Kenya; and Ndola, Zambia. HIV prevalence in Cotonou and Yaoundé ranged from 3.3% to 7.8%; in Kisumu and Ndola, it ranged from 19.8% to a whopping 31.9%.

    To their surprise, the researchers found little connection between HIV prevalence and the lifetime number of sexual partners, contact with sex workers, condom use with sex workers, or the age at which a person first has sex. “We believe that any differences in sexual behavior were probably outweighed by differences in efficiency of HIV transmission during sexual intercourse,” said Buvé. Two clues about transmission factors emerged that led Buvé and colleagues to this conclusion. The two cities that had the highest HIV prevalence also had the lowest rate of male circumcision and the highest rates of infection with the ulcerative STD herpes simplex virus type 2 and with Trichomonas vaginalis.

    Buvé's study is “probably the clearest demonstration … that differences in HIV prevalence are not simply due to behavior,” said epidemiologist Kevin De Cock of the U.S. Centers for Disease Control and Prevention. The data, if borne out, might well force a rethinking of prevention campaigns that solely emphasize behavior change, without also screening for STDs, distributing condoms, and even encouraging male circumcision.

    Work by Ruth Nduati of the University of Nairobi in Kenya highlighted how difficult it is to transfer gains made in wealthy countries to poorer settings. One of the crowning achievements in AIDS prevention to date was a 1994 study showing that the anti-HIV drug AZT can block transmission of the virus from an infected mother to her newborn. As Nduati noted, in 1997, the United States recorded just 500 cases of mother-to-child transmission of HIV. But that same year, 600,000 other babies—90% of them in sub-Saharan Africa—became infected by their mothers, the vast majority of whom could not afford the drugs. Today, a $4 treatment can significantly reduce mother-to-child transmission at birth, but even that treatment, for those who can afford it, may not help many poor women in Kenya.

    Although several studies have shown that breast-feeding can transmit HIV, the Kenyan study is the first to evaluate whether using formula can affect transmission rates. As expected, about 20% of the babies in the study became infected before or at birth. Nduati and co-workers from the University of Washington went on to look at the uninfected babies, comparing 197 breast-fed babies to 204 who received formula. They found that an additional 16.2% of the breast-fed group became infected later, most (75%) during the first 6 months of life. “The results of this study have significant public health implications,” concluded Nduati. “Breast milk avoidance could potentially decrease mother-to-child transmission by 44%,” a reduction similar to that achieved with a short course of AZT or other anti-HIV drugs.
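
    That 44% figure is, in essence, an attributable fraction. A rough back-of-envelope sketch (not spelled out in the article, using only the rounded numbers quoted above and assuming the 16.2% refers to the initially uninfected breast-fed babies) runs as follows:

    \[
    \text{attributable fraction} = \frac{I_{\text{breast-fed}} - I_{\text{formula}}}{I_{\text{breast-fed}}}, \qquad
    I_{\text{formula}} \approx 20\%, \qquad
    I_{\text{breast-fed}} \approx 20\% + 0.8 \times 16.2\% \approx 33\%,
    \]

    which gives (33 - 20)/33 ≈ 0.4, or roughly 40%; presumably the study's exact cumulative-incidence estimates, rather than these rounded figures, are what yield the 44% Nduati quoted.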

    But this doesn't necessarily mean that formula-feeding would save lives. In many parts of Africa where water supplies are contaminated, switching to formula is likely to increase mortality from intestinal diseases. The cost of formula is also a serious barrier, especially in Kenya, which taxes formula so heavily that a 1-month supply costs $1000.

    Many AIDS researchers believe that the best hope of thwarting HIV lies in the development of a vaccine. But Oxford University's Rupert Kaul presented some unexpected findings from another study in Kenya that suggest researchers may need to rethink some fundamental precepts about what might constitute a working vaccine.

    Kaul collaborated with investigators from the University of Nairobi and the University of Manitoba who have been studying a group of Kenyan sex workers since 1984. About 15% of the 1900 women have remained uninfected by HIV, despite having an average of five clients per day and as many as 70 unprotected exposures to the virus each year. But Kaul reported that 10 of these women who took a 2-month break from sex work became infected when they returned. Kaul stressed that it's “not unexpected” that some of these women would become infected, noting that the 3-year cutoff they use to define someone as exposed but uninfected is “purely arbitrary.” What is odd, though, is that they became infected after taking a break from high-risk sex. “That's the most fascinating thing I've heard in months,” said NIAID's Fauci.

    While fascinating, the findings are nonetheless difficult to interpret. One possible explanation is that repeated exposure to the virus is needed to maintain whatever immunological barrier is protecting the women. If so, a vaccine may need to be given repeatedly. Another possibility under investigation is that these women developed antibodies to the foreign, or “allo,” white blood cells in semen. Because HIV also has pieces of human white blood cells on its surface, which the virus picks up from every cell it infects, these alloantibodies might block the virus from infecting the women's cells. Perhaps when the women stopped sex work, these antibody levels fell, rendering the women more vulnerable to HIV.

    The research tilt toward Africa and other developing regions will surely intensify during the next year, and not only because that's where HIV is hitting the hardest. This summer, some 10,000 AIDS researchers are expected to travel to Durban, South Africa, for the 13th World AIDS conference, which, in another sign of the times, will be the first one held in a developing country.

    • *7th Conference on Retroviruses and Opportunistic Infections, 30 January to 2 February.

  2. SCIENTIFIC ADVICE

    Academies Get Together to Tackle the Big Issues

    1. Robert Koenig

    BERN, SWITZERLAND—While heads of state and business leaders were capturing the headlines at last week's World Economic Forum in snowy Davos, an unheralded group of science academy leaders quietly gathered behind the scenes. Their goal: to take some crucial steps toward creating a new mechanism for providing impartial scientific advice to international organizations.

    The proposed new body, tentatively called the InterAcademy Council (IAC), would in some ways be an international version of the U.S. National Research Council, which sets up scientific committees to advise the U.S. government. Similarly, the United Nations or World Bank could ask the IAC to appoint international panels of leading researchers to address global issues such as the effect of genetically modified plants on agriculture, the threat of emerging infectious diseases, how to protect threatened ecosystems, and how to bridge the information divide between the world's rich and poor.

    Participants in last week's meetings in Davos and Zurich—which included the presidents of a dozen of the world's national science academies and several other officials—told Science that such a global science advisory body could be approved by mid-May. “We easily reached a general agreement on a plan after spending about 20 hours together,” says biochemist Bruce Alberts, president of the U.S. National Academy of Sciences, who had proposed such an international body last spring.

    While the Davos participants agreed on the general outlines of the IAC, Alberts says remaining details will be hammered out by a working group of academy presidents or their equivalents from France, India, Germany, and Brazil. That group will make its recommendations in Tokyo in May to the steering committee of the InterAcademy Panel on International Issues (IAP), a loose association of about 80 national science academies that holds conferences and issues statements on issues of common interest.

    F. Sherwood Rowland, a U.S. Nobelist who co-chairs the IAP and helped set it up in 1995, sees no “major stumbling blocks” in convincing the IAP to agree to the proposed new IAC. But Rowland, who attended the Davos meetings, says some issues remain to be resolved. They include avoiding duplication of effort among the planned IAC, the InterAcademy Panel itself, and the International Council for Science (ICSU), a Paris-based organization of scientific unions and national academies.

    At any given time, the IAC would be composed of representatives from 15 academies. Some larger academies, such as those of the United States, Russia, China, Japan, India, Brazil, and several European countries, would likely be selected for long-term—possibly 10-year—appointments on the council, while smaller academies would be appointed for 3-year terms initially. In general, the intention is to rotate the IAC membership among the 80 or so national science academies that are presently members of the InterAcademy Panel. The council would carry out such functions as naming the expert committees and approving the scope of studies requested by outside organizations. The IAC would have a small permanent secretariat, probably based in a European nation, such as Sweden or Switzerland, with good Third World rapport. “The procedure would be to first study the problems posed to see whether the IAC can form a well-balanced study group … and then run most of the work via the Internet,” said the French academy's president, chemist Guy Ourisson. The goal is for any studies to be turned around relatively quickly, ideally within a few months.

    Sudanese mathematician Mohamed Hassan, president of the African Academy of Sciences and executive director of the Third World Academy of Sciences, strongly favors the IAC concept, but he wants to make sure that “the developing world—where most countries either don't have academies or have weak ones—has an adequate level of representation.” Biochemist Wieland Gevers, president of South Africa's academy, says he hopes the formation of the IAC will spur smaller academies “to take more active science-advisory roles in their own countries, rather than just ceremonial roles.”

    Biochemist Ernst-Ludwig Winnacker, head of Germany's DFG granting agency, thinks the IAC “would be very useful.” Winnacker, who attended the Davos meeting, adds: “The question is whether the U.N. or other international organizations would take advantage of it.” Whether policy-makers will seek advice on burning scientific questions remains to be seen, but the Davos participants were upbeat. Says Alberts: “We all agreed that the world needs much more advice from scientists.”

  3. MATERIALS SCIENCE

    New Material Promises Chillier Currents

    1. Adrian Cho

    A moth frying on a bug lamp proves, suicidally, that an electrical current heats. But a current can also cool, if it runs through the right stuff. A handful of exotic semiconductors wick away heat while conducting electricity, and heat pumps fashioned from them chill solid-state lasers, radiation detectors, and fancy picnic coolers. Easily miniaturized and free of moving parts, such heat pumps offer clear advantages over mechanical refrigerators.

    Unfortunately, good thermoelectric materials are few and far between; research labs haven't turned up a promising new one for decades. Now a team led by Mercouri Kanatzidis, a chemist at Michigan State University in East Lansing, may have broken the impasse with a concoction of bismuth, tellurium, and cesium. “This is the first material that suggests we can do as well as or better than the materials we've had for the last 30 years,” says Frank DiSalvo, a chemist at Cornell University in Ithaca, New York.

    As Kanatzidis and colleagues report on page 1024, the new material (chemical formula CsBi4Te6) cools nearly as efficiently as the best material currently available. But whereas its decades-old rival conks out at -50° Celsius, the new stuff keeps working to roughly -100°C. It therefore promises to improve the performance of laser diodes, infrared detectors, and other electronic devices that run best cold. Researchers cheered the newcomer's arrival. “Kanatzidis is doing some super work in this field,” says Galen Stucky, a chemist at the University of California, Santa Barbara.

    Semiconductors come in two breeds, n-type and p-type. A thermoelectric heat pump consists of a chunk of one butted against a chunk of the other. In the n-type, electrical current is carried by negatively charged electrons, which, thanks to an awkward historical convention, are considered to flow in the direction opposite to the current. In the p-type, current is carried by positively charged “holes,” the shadows of absent electrons, which flow in the same direction as the current. When current flows out of the n-type semiconductor and into the p-type semiconductor, both the electrons and the holes stream away from the border between the two materials. Like steam whistling out of a teakettle, the fleeing holes and electrons carry away heat, cooling the junction.

    Of course, for the heat pump to work, the electrons and holes must usher heat away faster than it can flow back. And that means thermoelectric materials must have high electrical conductivity and low thermal conductivity. It also means each electron or hole must carry a healthy dollop of heat. Unfortunately, the three properties are closely connected and often work against one another. For example, boosting the electrical conductivity too high leaves wimpy electrons and holes that carry only tiny amounts of heat.
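
    The competing requirements described above are conventionally rolled into a single number, the dimensionless thermoelectric figure of merit (standard textbook material, not given in the article):

    \[
    ZT = \frac{S^{2}\,\sigma}{\kappa}\,T,
    \]

    where S is the Seebeck coefficient (roughly, the heat carried per unit charge), σ is the electrical conductivity, κ is the thermal conductivity, and T is the absolute temperature. A useful cooler needs ZT near 1 or better, which is hard to reach precisely because pushing σ up tends to push S down and to drag the electronic contribution to κ up with it.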

    Kanatzidis and his team set out to systematically improve upon the best material available, a compound of bismuth, antimony, tellurium, and selenium. The researchers first synthesized a potassium, bismuth, and selenium compound with promising properties. They then tried to replace the potassium and selenium atoms with heavier cesium and tellurium atoms, to make a softer substance that would better damp heat-carrying vibrations. But things didn't go quite as planned. The new material took on a surprising crystal structure, which, however, performed even better than expected. “In some ways, it's a little bit of serendipity,” Kanatzidis says. But others credit more than luck. “I don't think this is so much an accident as it is his very clever way of looking at things,” Stucky says.

    The new material needs fine-tuning, Kanatzidis says. He and his team have not yet optimized the levels of dopants, impurities added to control the numbers of electrons and holes. But even as it stands, the new material has promise, because it outperforms the competition at temperatures below zero degrees Celsius. “People are desperate to find a material that will work much below room temperature,” Kanatzidis says, “so it can take over where the other material leaves off.”

  4. ASTRONOMY

    Fine Details Point to Space Hydrocarbons

    1. Alexander Hellemans*
    1. Alexander Hellemans writes from Naples, Italy.

    On Earth, it's hard to avoid polycyclic aromatic hydrocarbons, or PAHs. Made up of two or more fused benzene rings, they turn up in cigarette smoke, car exhausts, or just about anywhere that hydrocarbons burn at high temperature. But for nearly 2 decades, astronomers have argued about whether these molecules are as ubiquitous in space as they are here at home. They were puzzled by the fact that certain interstellar gas clouds in regions of space too cold to produce thermal emissions nonetheless produce characteristic emission bands in their infrared spectra. These bands look teasingly similar to the emission bands produced by PAHs, but they are not exactly the same, and the failure of chemists to reproduce precisely the observed spectra in the laboratory has led some astronomers to argue that the emission bands are caused by small solid particles, such as soot or hydrogenated amorphous carbons. But astronomers picking over data obtained by Europe's Infrared Space Observatory (ISO), who met in Madrid last week,* believe they have evidence that clinches the case for PAHs in space.

    ISO, a 60-centimeter infrared telescope launched by the European Space Agency in November 1995, made over 26,000 observations before it ran out of the liquid helium needed to cool its telescope and detectors in April 1998. One of its tasks was to solve the riddle of PAHs. Some astronomers have suggested that PAHs could form in the atmospheres of carbon-rich stars and be ejected into interstellar space when the stars die. In the cold of space, such molecules would sometimes be struck by ultraviolet photons, causing the carbon atoms in the benzene rings to vibrate. The molecules then shed this vibrational energy as infrared light—a process called relaxation—just as sound is emitted from a gong when it is struck. Because of the numerous different shapes and sizes of PAHs, their spectra are dominated by bands of emission rather than discrete peaks.

    Critics of this model argue that the emission bands just don't match those of PAHs in the lab. “The spectroscopy obtained [in lab experiments] with solid materials, such as anthracite, give a better spectral agreement,” says Cécile Reynaud of France's Atomic Energy Commission in Saclay. PAH supporters counter that in cold interstellar space, there is simply not enough heat to produce thermal emissions from such solid particles. “With 15 degrees [kelvin], you won't have emission in the near infrared,” says Alexander Tielens of the University of Groningen in the Netherlands. Even a small amount of ultraviolet radiation, however, is enough to excite individual PAH molecules and trigger this relaxation, so their emission could occur even in the frigid environs of interstellar clouds.
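
    Tielens's 15-kelvin point can be checked with Wien's displacement law (a back-of-envelope estimate, not spelled out in the article): a blackbody at temperature T radiates most strongly near

    \[
    \lambda_{\max} = \frac{b}{T} \approx \frac{2898\ \mu\mathrm{m\,K}}{15\ \mathrm{K}} \approx 190\ \mu\mathrm{m},
    \]

    deep in the far infrared, so thermal emission from grains that cold contributes essentially nothing at the near-infrared wavelengths of a few micrometers where the disputed bands appear.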

    Now, a team led by Christine Joblin of the CNRS Space Study Center for Radiation in Toulouse, France, believes it has convincing evidence. By studying the high-resolution spectra obtained with ISO's short-wave infrared spectrometer, they found that the emission bands are made up of a forest of sharp peaks. “This structure cannot be explained by solids—one expects wide and smooth bands—and we can explain this structure by assuming that we are dealing with a family of molecules,” says Joblin.

    To finally settle the matter, chemists will have to duplicate this detailed emission structure in the laboratory. Pascale Ehrenfreund of Leiden University in the Netherlands says finding laboratory spectra of PAHs that match astrophysical spectra is just a matter of getting the right mix of ingredients. But “if you have a family of 200 molecules, it is very difficult to pick the right ones on Earth. … You need a lot of laboratory studies in order to really make a perfect fit,” she says.

    Douglas Hudgins and his colleagues in the Astrochemistry Laboratory at NASA's Ames Research Center in Moffett Field, California, are attempting to do just that. And Hudgins believes that the ISO data will make his work much easier. “Up to this point, the quality of our laboratory spectra was much, much better than the quality of available astronomical data,” but now the ISO spectra should make comparisons much sharper, says Hudgins. So will ISO data settle the matter? “Absolutely … I think there is overwhelming evidence supporting molecular PAHs and PAH ions in space.”

    • *ISO Beyond the Peaks. The 2nd ISO workshop on analytical spectroscopy, 2-4 February, Villafranca del Castillo, Madrid, Spain.

  5. PLANETARY METEOROLOGY

    Deep, Moist Heat Drives Jovian Weather

    1. Richard A. Kerr

    The weather on Jupiter is awesome but frustratingly mysterious. Great jets of 750-kilometer-per-hour wind girdle the planet, marking off distinctive, dark and light bands of clouds. And oval “storms” of various sizes and hues roll between the jets, ranging from small, short-lived eddies hundreds of kilometers wide to huge, long-lived swirls. The grandest of these ovals, the 20,000-kilometer Great Red Spot, has been at it at least since Galileo's day 300 years ago. But what, meteorologists have wondered, makes it all go? The sun bathes the jovian cloud deck in feeble light, while deep-seated heat energy that has lingered since the planet formed seeps out. Which of those two energy sources predominates in driving Jupiter's weather engine? And how does that energy get bundled into the small eddies on which the jets and spots feed?

    In two papers in this week's issue of Nature, researchers provide an answer: Deep heat funneled upward by local storms is a major driver of jovian weather. They show that much if not all of the deep heat escaping the interior flows up through towering thunderstorms. These disturbances can go on to become small eddies that eventually give up their energy to storms such as the Great Red Spot. “Where these little guys come from has always been the question,” says planetary meteorologist Timothy Dowling of the University of Louisville in Kentucky.

    Lightning, captured by the imaging system aboard the Galileo spacecraft orbiting Jupiter, was the key to running down jovian weather's major energy source. On Earth and apparently on Jupiter, lightning and the water-fueled engine that drives thunderstorms are intimately connected. In a terrestrial thunderstorm, water vapor in rising warm air condenses. The condensation releases latent heat, which pushes the rising air upward even harder. This “moist convection” engine can also segregate electrical charges into different parts of a cloud; the reuniting of those charges is lightning. Although Jupiter has no ocean or wet ground, moist convection seems to work much as it does on Earth. “We're using lightning as a beacon to tell us where the convection is” on Jupiter, says planetary meteorologist Andrew Ingersoll of the California Institute of Technology in Pasadena, who co-authored the Nature papers with Peter Gierasch and Donald Banfield of Cornell University and their colleagues.

    By observing a thunderstorm hundreds of kilometers long just west of the Great Red Spot in daylight and in darkness, Gierasch and his colleagues could gauge how much heat thunderstorms are extracting from Jupiter's interior. In daylight, they watched the clouds carried by the air move up and away from the storm, which told them how much warm air the storm was raising. During the night, they measured the lightning associated with the storm. Then they surveyed about half the planet to determine the global abundance of lightning. They calculate that moist convection delivers a large part of the heat that is known to leak out of the interior. The Galileo data “is very good evidence the heat comes up localized in convective systems,” says planetary meteorologist Conway Leovy of the University of Washington, Seattle, “just as it does in Earth's” lower atmosphere.

    But that is where the terrestrial analogy stops, argue Ingersoll and his colleagues in the second paper. The Great Red Spot, they note, is not the hurricane-like storm that textbooks often describe. Neither it nor the whole hierarchy of smaller ovals on Jupiter draw their energy directly from their own moist convection, as hurricanes do. Instead, they argue, it's a two-step process. If jets don't tear apart the lightning-riddled jovian thunderstorms and take on their energy, the planet's rotation turns the thunderstorms into stable swirls of spinning air, or small eddies. They rotate in the same direction as the ovals without any more energy input from moist convection. This shared sense of rotation among small eddies and ovals seals the fate of the eddies, while feeding the large ovals.

    The eddies that form near the same latitude as the Great Red Spot, for example, scurry westward as much as 400,000 kilometers before encountering the slower moving Great Red Spot, says Ingersoll. Being only a couple of hundred kilometers deep, jovian spots behave as if they are two-dimensional, notes Louisville's Dowling. In a two-dimensional fluid flow, he says, “everything that touches, merges.” The opposing winds at the point where eddy and spot contact tend to nullify each other; a common, quiet center forms between them; and they and their stores of kinetic energy merge.

    Such resupplying of energy ultimately derived from the interior needn't happen all that often, notes Dowling. Even without its weekly recharging, he notes, the Great Red Spot would keep on spinning for centuries. “The reason it can get away with it is there's no surface friction; there's nowhere to stand” on Jupiter, only an atmosphere that becomes more and more dense with depth.

    Although moist convection of deep heat now appears to be a prime driver of jovian weather, “there's still a puzzle,” says Leovy. Some jets, such as the one the Galileo probe fell through, seem to increase in strength the deeper they go. That renewed a long-standing debate: Is all the weather seen at the jovian cloud tops relatively shallow—rooted only a few hundred kilometers down—or does part of it penetrate thousands of kilometers, to near the planet's heart (Science, 12 September 1980, p. 1219)? Galileo won't solve all the mysteries, it seems.

  6. NEUROBIOLOGY

    A New Clue to How Alcohol Damages Brains

    1. Marcia Barinaga

    It's common knowledge that drinking alcohol can cause birth defects. Even so, 20% of women who drink continue to do so while they are pregnant, according to a 1996 report from the Institute of Medicine. As a result, roughly 1 infant in every 1000 born in the United States has fetal alcohol syndrome, characterized by facial abnormalities, stunted growth, and learning and memory problems. New work now provides some surprising insights into how alcohol may cause some of that damage.

    On page 1056, a team led by neuroscientist John Olney of Washington University School of Medicine in St. Louis and his former postdoc, pediatric neurologist Chrysanthy Ikonomidou (who is now at Humboldt University in Berlin), reports that alcohol works through the receptors for two of the brain's neurotransmitters, glutamate and GABA, to kill brain neurons in rat pups. The animals were exposed to one episode of high blood alcohol during the first 2 weeks after birth—a time when rat brains are going through developmental stages that occur in human brains during the third trimester of pregnancy. Although researchers have known for some time that alcohol exposure late in development causes brain damage in rodents, the new work provides the first possible explanation of how that damage occurs.

    In doing so, says pharmacologist Boris Tabakoff of the University of Colorado Health Sciences Center in Denver, it provides “the first step to understanding how you might control that damage,” possibly with drugs that block alcohol's effects on the receptors. But the work carries an even more important message for the public, says neurobiologist David Lovinger, who studies alcohol's effects on neurons at Vanderbilt University School of Medicine in Nashville, Tennessee. Late-pregnancy drinking “is really unsafe for the brain,” he says.

    Olney and Ikonomidou did not set out to investigate the effects of alcohol on the brain. They were following up on studies they performed last year on chemicals that block the so-called NMDA receptor for glutamate, an excitatory neurotransmitter in the brain. The team's earlier work showed that chemicals that block the NMDA receptor cause brain neurons to die by programmed cell death, or apoptosis, during the phase of development when the brain's neurons are forming connections with each other. The group went on to look for other drugs that trigger apoptosis and found that barbiturates and benzodiazepines, both of which suppress neural activity by activating the receptors for the inhibitory neurotransmitter GABA, caused similar neuronal die-offs.

    Because ethanol blocks NMDA receptors and activates GABA receptors, “we thought it would be wise to test ethanol,” Olney says. Indeed, the pattern of neuron death they saw with ethanol “seems to be a composite” of what they saw with GABA enhancers and NMDA blockers. That, Olney says, suggests that alcohol's effects on the brain are carried out through these two receptors.

    Other groups had previously noted that alcohol exposure late in development causes widespread neuron death in rat brains, and there had been hints that NMDA receptors are involved. For example, pharmacologist Paula Hoffman of the University of Colorado Health Sciences Center and others have shown that blocking NMDA receptors kills neurons from rat cerebellum maintained in lab cultures. The new work confirms that the same happens in whole animals, and in more areas than just the cerebellum, says Hoffman. What's more, she adds, the suggestion of a role for GABA receptors in the killing effects of alcohol “is quite novel.”

    Indeed, those results raise the possibility that benzodiazepines (Valium and its relatives)—which are sometimes given to newborns as anticonvulsants—may also kill neurons in the developing brains of human infants. “The benzodiazepines are considered extremely safe drugs,” says Tabakoff. “If this study is correct, one might need to reassess their safety in [infants] while the brain is still developing.”

    Ikonomidou points out, however, that the rats were given higher doses of benzodiazepines than those usually given to infants, and it will require more studies to say whether the drugs as they are typically used present a danger to infants. “Prolonged seizures themselves … can lead to irreversible brain damage, and it is imperative to try and stop them,” she notes, adding, “we do not have good alternative drugs to use.”

    Some alcohol researchers point out that understanding how alcohol kills neurons could spur the development of antidotes that would block its effects in pregnant women. But others see that as a long shot. The Olney group observed the dying neurons 24 hours after alcohol exposure, and the time window for preventing that damage may be too narrow for a “morning after” pill to work. And fetal alcohol researcher James R. West of Texas A&M University's Health Science Center in College Station points out that alcohol doesn't just kill neurons; it has other negative effects, such as causing neurons to grow incorrectly. Because of these multiple effects, there will be “no one silver bullet,” West says. “The key thing,” he adds, “is to find some way to keep mothers from drinking when they are pregnant.” But Tabakoff notes that some alcohol-addicted pregnant women may find it virtually impossible to quit, making progress toward developing a drug that could block even some of alcohol's damaging effects “very, very important.”

  7. STEM CELLS

    Wisconsin to Distribute Embryonic Cell Lines

    1. Gretchen Vogel

    Since the 1998 announcement that scientists had managed to grow human embryonic stem cells in lab culture, researchers have been clamoring for access to them. But they have been blocked on two fronts. One is proprietary—the biotechnology company Geron, which funded much of the work to derive the cells, has kept a tight rein on them. The other is regulatory. In the United States and many other countries, research is restricted because these cells are derived from early embryos or aborted fetuses.

    On 1 February, the University of Wisconsin (UW), Madison, took a major step toward lowering the first hurdle. In a move welcomed by many biologists, the university announced the creation of a nonprofit research institute to distribute the embryonic stem cell line derived by UW researcher James Thomson. But the legislative and regulatory limbo persists and may not be resolved before this summer.

    What makes these cells so desirable is their potential to develop into any tissue in the body, such as insulin-secreting pancreas cells or dopamine-producing neurons. Researchers envision using these cells to treat a variety of ailments, from diabetes to Parkinson's disease. Not surprisingly, private sector interest in stem cell research is high.

    Thomson's work at Wisconsin was supported by Geron Corp. of Menlo Park, California, and by the Wisconsin Alumni Research Foundation (WARF). Although Geron retains rights to certain commercial and therapeutic uses of the cells, WARF retained the right to distribute them to other researchers. WARF has now created the WiCell Research Institute, which will begin distributing stem cells to academic and industrial scientists late this spring. Thomson will serve as WiCell's scientific director but will also continue his research.

    Scientists will be asked to submit a confidential summary of their research plans to WiCell, which will review it “to make sure the cells are used appropriately and with adequate respect,” says WARF director Carl Gulbrandsen. For example, researchers will not be allowed to use the cells for reproductive cloning experiments or to mix the cells with intact embryos. Academic researchers will pay a one-time fee of $5000 for two vials of the cells, which should provide a virtually unlimited supply of cells in culture. The fee will cover quality control and technical support for the care of the fickle cells. Gulbrandsen says WARF's goal is to create a “not-for-profit subsidiary. We're not intending to make a profit off [academic] researchers, but we're also not intending to lose money.”

    WiCell will require researchers to certify each year that they are complying with the original agreement. In addition, the license for academics will apply only to research uses. If academics want to use the cells for profit, they will have to renegotiate with WiCell. Geron has commercial rights to “therapeutics and diagnostics with certain cell lines,” Gulbrandsen says, but is keeping the details of its license agreement under wraps. WARF has a preliminary agreement with Geron to notify WiCell's stem cell recipients of the areas in which the company has exclusive rights.

    Industrial researchers, on the other hand, will pay “a significant up-front fee” and a yearly maintenance fee, Gulbrandsen says. WARF will also collect a “flow-through royalty,” a percentage of any revenue that results from the use of the cells. Those revenues will have to come from cell uses not already licensed to Geron.

    Stem cell researchers welcome the new initiative. “It's an excellent idea to share these cells,” says Oliver Brüstle of the University of Bonn Medical Center in Germany.

    WiCell's first customers may come from overseas, because U.S. policy is unlikely to be settled by the time WiCell is ready to distribute cells, while other countries may move faster (see next story). A year ago January, a lawyer for the Department of Health and Human Services ruled that stem cells derived from embryos were not themselves embryos; therefore, the National Institutes of Health (NIH) could fund research on the cell lines without contravening a ban Congress imposed on embryo research (Science, 22 January 1999, p. 465). Draft guidelines, now under review, would allow NIH-funded researchers to work on stem cell lines derived by private organizations, such as WiCell, as long as the derivation met certain ethical conditions.

    On 31 January, NIH announced that it was extending the comment period on these guidelines for 3 weeks, until 22 February. NIH has already received thousands of letters, both pro and con, says Lana Skirboll, associate director for science policy. Although NIH has not tallied the responses, opposition has been significant. Skirboll says NIH now expects to issue the final guidelines no sooner than early summer.

    Debate also continues on Capitol Hill. Senators Arlen Specter (R-PA) and Tom Harkin (D-IA) have introduced a bill that would allow NIH to fund both the derivation and use of stem cell lines. A Senate hearing on the bill is scheduled for 22 February, and House committees are planning hearings as well. All of this will likely keep federally funded U.S. researchers from placing orders with WiCell anytime soon.

  8. STEM CELLS

    Report Would Open Up Research in Japan

    1. Dennis Normile

    TOKYO—Japanese researchers are cheering last week's release of a report to the government that endorses the use of human stem cells in research—work that until now has been on hold. The draft report outlines a process for both publicly and privately funded scientists to follow in deriving and working with stem cells. “It's a very important step forward,” says Shinichi Nishikawa, a professor of molecular genetics at Kyoto University's School of Medicine.

    The report was drafted by a special subcommittee of the bioethics committee of the Council for Science and Technology, the nation's highest science policy body. In giving the green light for research using embryonic stem cells, the subcommittee cites the potential for “very important results for the advancement of medicine, science, and technology.” Human stem cells, which theoretically can develop into any of the body's cells, may ultimately provide laboratory-grown replacement organs and treatments for diseases such as Parkinson's and Alzheimer's. Biologists are keen to use them as well to explore basic developmental processes. But the subcommittee said that research on human stem cells and related material must be strictly regulated.

    Under the report's proposals, stem cells could be created only from embryos left over from fertility treatments and only after donors granted their informed consent. Donor privacy would be strictly protected, and the stem cells could not be used to clone humans or be combined with animal embryos. Each research center using or deriving stem cells would have to create an institutional review board, which would approve all work and maintain detailed records. The board, made up of lawyers, ethicists, and scientists, would in turn report to a higher government body.

    These recommendations differ in two major ways from guidelines proposed in December by the U.S. National Institutes of Health (NIH) (see previous story). The Japanese rules allow government funding for both the derivation and use of stem cells. The NIH guidelines, in contrast, prohibit the use of public funds for the derivation of human embryonic stem cells. And whereas NIH's proposed rules apply only to NIH-funded work, the Japanese proposals address activity in the private sector as well, suggesting that the creation and distribution of embryonic stem cells be done on a not-for-profit basis. Payments to embryo donors would not be allowed, and fees for acquiring stem cells would cover only reasonable costs for their preparation and distribution. These differences would likely make academic labs the focus of stem cell creation in Japan, while for-profit companies take the lead in the United States.

    One gray area involves the role of the institutional review committees. The report recommends that they have broad discretionary powers to decide whether a project is ethically appropriate and if the researcher has the necessary expertise. The report does not set standards for making these judgments, however, and Kyoto University's Nishikawa, a member of the subcommittee, says that the review boards' role is likely to remain cloudy until they are up and running. Nishikawa also believes that some aspects of the proposed procedures “may require some reconsideration.” He notes that privacy rules might need to be revised, for example, if researchers and regulators require additional information on the donors before approving the use of stem cells for certain medical applications.

    The draft is now open for a month of public comment before it goes to the Science and Technology Agency and the Ministry of Education (Monbusho), which are expected to draw up final guidelines by April. Meanwhile, some research centers have already set up review boards, and scientists are eager to take the next step. “This report means we will be able to extend this work to human stem cells,” says Takashi Yokota, a professor at the University of Tokyo's Institute of Medical Science who has been using mouse cells to study basic stem cell mechanisms. He hopes for a chance to begin that work this spring.

  9. HIGH-ENERGY PHYSICS

    CERN Stakes Claim on New State of Matter

    1. Charles Seife

    Not since the big bang has matter been in such a state. For a few microseconds after the birth of the universe, quarks and gluons roamed free in a blazing hot jumble of matter known as a quark-gluon plasma. As the plasma cooled, the quarks and gluons condensed into more familiar particles and disappeared. On Thursday, scientists at CERN, the particle physics laboratory near Geneva, were expected to announce—gingerly—“compelling evidence” of a new state of matter that might be quark-gluon plasma reborn—unless, that is, it's something else.

    The announcement marks the close of a 6-year chapter in high-energy physics. Since 1994, CERN physicists have been using the Super Proton Synchrotron (SPS), a 6-kilometer circle of magnets, to smash lead atoms together at enormous speeds and with energies as large as 3.5 TeV (trillion electron volts). The scientists hoped the colliding nuclei would become so hot and so dense that their protons and neutrons would reverse cosmic history, melting back into a soup of component quarks and gluons.

    Now, however, CERN's instruments are about to lose their cutting-edge status. In May, a new accelerator known as the Relativistic Heavy Ion Collider (RHIC), up to five times as powerful as SPS, will come online at Brookhaven National Laboratory in Upton, New York. “The big thrust is going from CERN to Brookhaven,” says CERN physicist Maurice Jacob. So, Jacob says, it is an appropriate time to celebrate CERN's achievements.

    The obvious question for the epilogue is, did CERN succeed? As Science went to press, CERN physicists were planning to present results from seven lead-beam experiments and cautiously lay claim to the creation of a quark-gluon plasma-like state of matter—if only for 10⁻²³ seconds at a time. “A common assessment of the collected data leads us to conclude that we now have compelling evidence that a new state of matter has indeed been created … [that] features many of the characteristics of the theoretically predicted quark-gluon plasma,” Jacob and his fellow physicist Ulrich Heinz write in a paper summarizing the results.

    As early as 1996, CERN scientists saw evidence of a quark-gluon plasma in the unexpectedly low production of an elusive particle known as the J/Ψ (Science, 13 September 1996, p. 1492). Since then, CERN experiments have shown other hints of a quark-gluon plasma, such as anomalies in the distribution of “vector mesons” (such as the ρ particle) and in the production of “strange hadrons” (such as the Ω particle). “The excess in strangeness is quite spectacular; for the Ω, the production is 15 times normal,” says Jacob. “So something new is happening.”

    Yet none of the evidence has put the issue to rest. Columbia University's Bill Zajc, a spokesperson for a RHIC experiment, notes that other mechanisms might account for the destruction of J/Ψ particles, such as collisions with less exotic particles hurtling away from the nuclear smashup. “It's like being caught in machine-gun fire,” he says. RHIC scientist and Brookhaven physicist Sam Aronson agrees: “It's fair to say I don't find they've made a compelling argument for discovery at all.”

    CERN physicists are careful not to overstate their case. “While all the pieces of the puzzle seem to fit, it needs definite confirmation,” says CERN spokesperson and physicist Neil Calder. Achim Franz, a physicist who worked on two of the seven CERN experiments, agrees. “There's nothing there with a big red flag saying ‘I'm a quark-gluon plasma,’” he admits. “I don't think you'll see an event and say, ‘That's it!’”

    Such decisive evidence is exactly what some researchers hope RHIC will provide. “The hope is that when you go to RHIC, which has two to five times as much energy, the particles created will have crossed the threshold, and it will be easier to interpret the results,” says Wit Busza, a RHIC experimenter and physicist at the Massachusetts Institute of Technology. But Franz cautions against too much optimism. “I started as a postdoc in '86, and people said that SPS would find the quark-gluon plasma right away. It wasn't there,” he says. “I don't think there will be a threshold suddenly.”

    For now, the CERN experiments will continue, colliding atomic nuclei at lower energies to fill in gaps in the data. Most of the community's anticipation, however, is focused on CERN's next big step: an accelerator called the Large Hadron Collider, which will eclipse RHIC in 2005. Until then, CERN scientists must be content to celebrate a set of experiments that gave them a glimpse of something bizarre. “I don't know whether it's a quark-gluon plasma or not,” says Franz. “But if you take all the experiments together, it's something new and exciting.”

  10. HUMAN GENETICS

    Start-Up Claims Piece of Iceland's Gene Pie

    1. Martin Enserink

    For almost 2 years, Iceland's small scientific and medical community has been torn apart by a bold plan to pool the entire country's medical records in a database that would aid the search for disease-causing genes. In December 1998, parliament approved the creation of the database (Science, 1 January 1999, p. 13). And just a month ago, the health ministry gave one company, deCODE, exclusive rights to run it—a move that engendered heated opposition.

    Now, a small biotech start-up is providing an alternative for those critics who want to mine Iceland's genetic riches but dislike the arrangement with deCODE. The company, called UVS—after Urdur, Verdandi, and Skuld, three witches who according to old Icelandic sagas determine the fate of man—was established as an alternative to deCODE, concedes Snorri Thorgeirsson of the U.S. National Cancer Institute in Bethesda, Maryland. Thorgeirsson started the company along with Bernhard Palsson, a researcher at the University of California, San Diego, and Icelandic businessman Tryggvi Petursson.

    Founded 2 years ago, UVS went public last week and promptly announced three major research agreements—with the Icelandic Cancer Society, the National University Hospital, and the Reykjavik City Hospital. Within 6 to 8 weeks, UVS plans to open a lab outside Reykjavik, says Thorgeirsson.

    Scientists at UVS, like those at deCODE, believe that disease-causing mutations are easier to find in genetically homogeneous populations, such as Iceland's, whose genomes have less “noise” than those of more diverse societies. Both companies hope to profit by selling that knowledge to pharmaceutical companies, which can use it to develop diagnostic tests and drugs. But whereas deCODE's search will be helped by having medical records on almost everyone in Iceland, UVS says it can turn up valuable data by working with smaller groups of patients, who have volunteered to participate.

    UVS is teaming up with several scientists who fiercely oppose the national database. Critics have argued that procedures for obtaining patients' informed consent and safeguarding their privacy are inadequate. One of those is Jorunn Eyfjord, a geneticist at the Icelandic Cancer Society's lab, who will cooperate with UVS on three cancer projects. Another is Jon Johannes Jonsson, head of the department of biochemistry and molecular biology at the University of Iceland. Formerly a member of deCODE's scientific board, Jonsson broke with the company because he opposed the health database. Now, one of his faculty members, Reynir Arngrimsson, is scientific director of UVS, and the company will share laboratory equipment with his department. “I don't like to see a monopoly in Iceland,” says Jonsson, “and I don't like to see human genetics done in the way deCODE proposes to do it.”

    Kari Stefansson, founder and CEO of deCODE, is unfazed by the competition. The Icelandic government, keen to stimulate its budding biotech industry, has embraced deCODE's plan. And so far, only about 5% of Icelanders have asked that their medical data be excluded from the database—which shows that the general public has more confidence in deCODE than some scientists do, says Stefansson. Nor, he adds, does deCODE have a shortage of collaborators. The company, which employs almost 300 people, is working with many physicians on projects to identify genes involved in lung, prostate, colon, and skin cancer. Says Stefansson: “For me, it's a relief to have another company, so I'm not accused of monopolizing” Iceland's gene pool.

  11. SCIENCE POLICY

    Balancing the Science Budget

    1. David Malakoff

    Clinton's final budget proposes a hefty increase for basic research and tackles a growing imbalance between biomedical research and the rest of science. It's a far cry from his first year's budget

    The mood was bittersweet as Clinton Administration science policy bigwigs gathered on Monday to roll out the president's fiscal year 2001 R&D budget proposal—his last before leaving office. They had the happy job of presenting perhaps the strongest and most balanced basic research proposal in the Administration's history, one that asks Congress to boost civilian research spending by 7%, to nearly $43 billion, and to begin correcting a growing imbalance between biomedical research and less fashionable fields, from physics to mathematics. But their euphoria was tempered by regret among some basic science advocates that the Administration had waited so long.

    A combination of factors—from a booming economy to new players at the White House and in Congress—appears to have produced the kind of R&D request that some scientists have been dreaming of since the former Arkansas governor moved into the Oval Office in 1993. Although the previous seven budget proposals had requested hefty increases for an array of R&D programs, many in the science community worried that those budgets too often had emphasized trendy topics over time-tested investments in nitty-gritty basic science. But this year the basics get a big boost, led by a 17% increase for the $3.9 billion National Science Foundation (NSF) (Science, 4 February, p. 778). The 2001 budget requests a 4.5% raise for the $17.9 billion National Institutes of Health (NIH) and a 3% raise, the first in years, for the $13.7 billion NASA. Military researchers, however, received a mixed message, with the Department of Defense proposing to boost basic research by 4%, to $1.2 billion, while slashing applied work by 8%, to $3.1 billion.

    Unlike many earlier White House science proposals, the 2001 plan is expected to receive a relatively warm reception in the Republican-controlled Congress. “It's not a slam dunk by any means, but the sharp ideological differences” that crippled earlier proposals “seem to be softer now,” says Jack Crowley, head of the Massachusetts Institute of Technology's (MIT's) office in Washington. “I've seen a growing bipartisan acknowledgement of the importance of balanced funding in the basic sciences.”

    Both parties, for instance, seem to agree on the need for greater support of information technology research, although they may disagree on the details of the White House's request for an additional $514 million. Similarly, there is bipartisan fascination with the “small is beautiful” approach underlying the Administration's $495 million nanotechnology initiative, which aims to build sugar cube-sized computers and other miniwonders. Such bipartisan backing could crumble amid the politicking of a presidential election year. But it is already apparent that the president's proposal culminates a shift—in both style and substance—in the Clinton White House's approach to R&D funding.

    Indeed, the contrast between Clinton's first and last budget announcements is striking. When then-presidential science adviser Jack Gibbons unveiled the Administration's first R&D budget in 1993, he emphasized the need for applied research to help U.S. companies buffeted by increased foreign competition and the end of the Cold War. Indeed, Gibbons went out of his way at the time to insist that the Administration was not shortchanging basic science. “We aren't talking about taking resources from [basic sciences],” he said at a press conference, “but rather giving greater attention to … assisting the transformation of science into things that provide us our jobs.”

    At other 1993 briefings, officials highlighted the need to bring civilian and military science spending—long tilted in favor of defense—to parity. They also warned that U.S. scientists were falling behind their counterparts in Japan and Europe. “There was a lot of fear [then] of being overtaken in key technologies,” says David Hart, a science policy scholar at Harvard University.

    This year, in contrast, the focus is on “restoring the balance in the federal R&D portfolio,” noted NSF's Rita Colwell. There was little talk of foreign economic threats or of the relative merits of civilian and defense science. And Neal Lane, the president's science adviser, apparently felt little need to defend the Administration's enthusiasm for basic science. “It's so clear from every study … that the federal investment in science and technology is about as good an investment as you can possibly make,” Lane said last month, as Clinton gave a preview of this year's R&D proposal at the California Institute of Technology (Caltech) in Pasadena (Science, 28 January, p. 558).

    MIT's Crowley calls that speech “one of the most unprecedented and passionate presentations on science I have ever heard from an American president.” Still, one Administration official labels the shift in R&D strategy as “mostly rhetorical.” Indeed, White House budget statistics show that both basic and applied research funding have gone up significantly—52% and 32%, respectively—since Clinton's arrival. But they also show that defense and civilian R&D have essentially reached parity from a starting ratio of nearly 3 to 2 in favor of defense. “Times have changed, and we've changed with them,” says the official. For instance, the 1994 Republican takeover of Congress, the arrival of an unexpectedly strong economy, and the recent appointments of Lane and White House Chief of Staff John Podesta have all changed the dynamics of the science funding debates, observers say. They also cite increased pressure from research groups concerned about stagnation in nonbiomedical research budgets (see graph).

    The fierce battles that followed Newt Gingrich's arrival as House Speaker in January 1995 “were highly distracting because it put [the White House] on the defensive,” recalls an Administration official who worked in the Office of Science and Technology Policy at the time. “There was a lot less time to promote a science agenda.” Indeed, some science policy watchers remember Clinton's 1995 State of the Union speech—whose only mention of science was a one-sentence attack on Congress for appropriating “$1 million to study stress in plants”—as a low point. “I spent way too much time … answering indignant letters and e-mail from irate plant physiologists,” remembers Rick Borchelt, a White House press aide at the time who now works on a NASA project to boost public literacy in science.

    White House officials bounced back from that embarrassment, however, putting muscle behind initiatives ranging from global change to supercomputing and this year's darling, nanotechnology. Those efforts helped forge bipartisan working relationships with key pro-science Republicans, such as Senator Pete Domenici (R-NM), chair of the Senate Budget Committee, and Representative John Porter (R-IL), chair of a House Appropriations subcommittee that oversees the NIH budget. At the same time, lawmakers worried about the growing imbalance between spending on life sciences and that on other fields—including Senators Bill Frist (R-TN), Connie Mack (R-FL), and Jay Rockefeller (D-WV)—crossed party lines to sponsor several bills calling for a doubling of federal R&D spending in all areas. In 1997, the heads of 23 research societies—along with a few executives from high-tech industries—put their weight behind the concept, asking politicians to recognize the “interconnectedness” of scientific progress and to ensure that engineers, chemists, and other nonbiomedical researchers had the necessary resources. Today, Gingrich—a fellow at the American Enterprise Institute in Washington and an energetic campaigner for boosting basic research—says such developments “helped create an environment that was very bipartisan when it came to science.”

    Such efforts, however, ran headlong into financial constraints, especially legislated budget caps designed to restrain federal spending and eliminate the budget deficit. “The caps provided very little wiggle room for brash science initiatives,” notes a Republican Senate aide. But a surprisingly strong economy—and an array of accounting tricks that let Congress spend more than the caps allowed—opened the door to some unexpected research gains. In 1998 and 1999, to the delight of biomedical lobbyists, the big beneficiary was NIH, racking up $2 billion increases in both years.

    NIH may be forced to share the wealth this time around. Armed with data showing the widening gap between the biomedical sciences and other disciplines, Lane, Colwell, and science society heads hit the streets last fall, reminding anyone who would listen of the need for “balance” in the federal R&D portfolio. One important ally, lobbyists say, was Podesta, who last year took the lead in criticizing congressional efforts to cut some science programs. His and Lane's fingerprints, outsiders say, are all over a speech that Clinton gave last December that signaled this year's focus on balance. “I would like to make this point very strongly, because it's one that I hope to make more progress on next year: It is very important that we have a balanced research portfolio,” Clinton said at the 3 December 1999 address on economic growth. Last month Clinton offered some details at Caltech, and a week later mentioned it in his State of the Union address—to bipartisan applause.

    Indeed, the Administration has in some ways come full circle on R&D policy, says Harvard's Hart. “It's come around to emphasizing the conventional wisdom that drove R&D policy for a long time,” he says, adding that basic research has traditionally drawn broader support from lawmakers than have applied programs that favor a particular industry or sector of the economy.

    So far, the back-to-basics approach is drawing favorable reviews from science groups. And Gingrich says he is happy to see his one-time political opponent “finally get religion when it comes to science.” Although current Republican congressional leaders have been noticeably gentle in their reactions to Clinton's proposals, their acquiescence doesn't necessarily mean the request will glide through Congress, notes MIT's Crowley. President George Bush, he recalls, tried to double NSF's budget over 5 years. “But that idea never got out of the blocks,” he says.

  12. SCIENCE POLICY

    Plan to Reduce Number of New Grants Tempers Enthusiasm for NIH Budget Hike

    1. Eliot Marshall

    The Administration's proposed 2001 budget for the National Institutes of Health (NIH) is receiving less than a standing ovation from biomedical lobbyists. But they admit that the 4.5% increase at least keeps alive their campaign to double NIH's budget between 1999 and 2003, even if it falls well short of the 15% increase they want this year.

    The budget request is “positive,” says the Federation of American Societies for Experimental Biology (FASEB), an organization of 60,000 scientists based in Bethesda, Maryland, which acknowledges that it is far more generous than last year's presidential request for an additional 2.1%, an offer that Congress turned into a 15% increase by the time it had finished its work. The White House says it wants to give NIH a $1 billion boost, to $18.8 billion, although nearly a quarter of that increase is money that NIH must pass along to other agencies. FASEB president David Kaufman, a pathologist at the University of North Carolina, Chapel Hill, calls the request the “largest dollar increase ever requested by a president,” and Richard Knapp, government relations chief for the Association of American Medical Colleges (AAMC), sees this as a “good start.”

    Although FASEB and the AAMC liked the overall message, FASEB, at least, had “serious concerns” about the fine print. The main problem, according to Kaufman, is that the agency wants to reduce the number of new and competing grants to individual investigators, from 8950 this year to 7641 in 2001. In a prepared statement, Kaufman warned that this could “prove very discouraging to young investigators.”

    Acting NIH director Ruth Kirschstein defends the decision, saying that the total number of grants supported in 2001 would be the largest in NIH's history, topping 33,000. “NIH has built up a very large commitment base,” she says, so the number of new grants must be reined in until the base stabilizes. Another problem is the sharp rise in the average cost of an NIH grant, from $227,000 in 1992 to a projected $327,800 in 2001. As one House appropriations committee staffer said, “We've reached a point where we have to have a $1 billion increase [in the NIH budget] every year just to stay even.” To avoid a crunch, NIH managers plan to hold down annual increases in the size of new and continuing grants to no more than 2%.

    While NIH is trying to apply the brakes to the cost of grants, its latest budget contains two major new initiatives—one at universities around the country, and the other on its Bethesda campus. Kirschstein confirmed that NIH is planning to spend $110 million in 2000 and is seeking $147 million next year to fund an interdisciplinary program to create academic centers of excellence in biocomputing, known as the Biomedical Information Science and Technology Initiative (BISTI). National Cancer Institute director Richard Klausner has already been working with the National Science Foundation to help set up new BISTI training centers. Meanwhile, six NIH institutes involved in brain research are joining forces to fund a new, 18,500-square-meter neuroscience lab on the NIH campus. This budget contains the first down payment of $73 million to plan a project whose total cost is still up in the air, Kirschstein says.

    Congress begins the process of chewing over NIH's budget with a House hearing on 15 February, with lobbyists hoping that lawmakers will be as generous as in years past. Republicans and the White House have already agreed to raise mandated spending limits on the overall federal budget. That move is expected to allow NIH to spend $3 billion this year that Congress had previously delayed until 2001 in a maneuver designed to stay within those limits.

  13. SCIENCE POLICY

    Solar Missions Brighten NASA's Hopes for Space Science Research

    1. Andrew Lawler

    This year, for a change, NASA gets to share in the budget wealth. The White House request for a $435 million increase, to $14.04 billion, marks the first time that the Clinton Administration has proposed a significant increase for the space agency.

    Nearly half the boost—$206 million—would augment the current $2.2 billion for space science projects, including Mars exploration and a bevy of small missions. The centerpiece is a $20 million down payment on “Living With a Star,” a sun research program that aims to use a flotilla of spacecraft to track solar storms and coronal mass ejections, which can interfere with communications and electric power grids. Some of the satellites would unfurl solar sails, using the pressure of sunlight for propulsion rather than chemical or nuclear sources. NASA space science chief Ed Weiler hopes to receive more than $500 million for the program over the next 5 years.

    NASA is also seeking $78 million on top of this year's $248 million for Mars exploration. Part of the money would build a system of communications satellites that could help prevent a repeat of last year's devastating loss of two Mars spacecraft. And Weiler says he is open to a review of the entire Mars effort, including whether it is wise to focus primarily on a mission that would collect and return martian rock samples. “Maybe we shouldn't put all our eggs in that basket,” he says.

    The agency also is setting aside nearly $200 million—a $42 million boost—for the next series of Discovery micromissions chosen through peer review. And it will beef up work on instruments to detect life on other planets and moons. It also wants to restart work on missions devoted primarily to pushing technology rather than science. The effort, called “New Millennium,” was deleted last year by the White House and Congress.

    NASA is requesting 10% more for its $275 million life science program, with the increase spread equally between biomedical and microgravity research. But earth science funding would remain roughly flat at $1.4 billion, with the bulk of that money going for the Earth Observing System constellation of satellites.

    At NASA's budget roll-out, Administrator Daniel Goldin emphasized ties between the agency and the National Science Foundation as well as with the academic community. A planned study of the agency's interaction with academia is intended to lead to greater participation by researchers in NASA projects.

  14. SCIENCE POLICY

    Up, Down, and Sideways: How Other Research Agencies Fared

    1. Adrian Cho,
    2. Jocelyn Kaiser,
    3. David Malakoff,
    4. Jeffrey Mervis,
    5. Charles Seife,
    6. Erik Stokstad
    1. Reporting by Adrian Cho, Jocelyn Kaiser, David Malakoff, Jeffrey Mervis, Charles Seife, and Erik Stokstad.

    Washington resembled a three-ring circus on 7 February as each agency put the best face on its 2001 budget request. Here are highlights from those presentations:

    · NSF: The National Science Foundation's 17%, $675 million increase includes a 20% boost to its $3 billion research account and 5% more for education, although director Rita Colwell emphasized that the agency's total investment in people—students and teachers as well as researchers—would rise by 11%. NSF's new facilities account soars by 45%, to $139 million, led by $17 million to kick off a $75 million mobile seismic array and $12 million to begin a $93 million network of high-tech ecological observatories. But NSF declined to request anything for a $75 million high-altitude research plane despite a $6 million appropriation last year from Congress. Funding for biocomplexity jumps by 172%, to $136 million, and NSF's portion of the information technology initiative leaps 160%, to $326 million. (For a look at how NSF's budget request came about, see last week's issue, 4 February, p. 778).

    · Energy: After a year of being battered by allegations of espionage at the national labs, Department of Energy (DOE) Secretary Bill Richardson said “it's time to return to science.” The agency's $18.9 billion budget request includes an 8% boost, to $7.6 billion, for DOE's R&D programs. The core Office of Science would get a 12%, $337 million hike, to $3.2 billion, with basic research and computing programs getting the lion's share of the new riches. Biology, fusion, and physics research budgets would rise slightly. Richardson is also requesting $10 million for a Scientific Recruitment Initiative, saying that DOE will have to work harder to attract talent, as hiring at national laboratories has “suffered because of the espionage issue.”

    · Environment: “It's basically a stay-the-course budget” for the Environmental Protection Agency's Office of Research and Development, says ORD assistant administrator Norine Noonan. The 2001 request totals $530 million, $6 million below this year but an increase of $38 million after congressional earmarks are subtracted from the 2000 budget. The total includes a $5 million increase for research on the health effects of endocrine disrupters and a $7 million boost for epidemiological studies of soot's health effects. The office also wants to complete a report on the state of U.S. estuaries next year. The agency's extramural grants program would rise 13%, to $101 million.

    · Defense: Military research advocates got mixed news from the Department of Defense, which has requested a 4% jump in basic research and an 8% reduction in applied studies. The basic research account would rise $50 million to $1.2 billion, with much of the increase devoted to biowarfare defense and cybersecurity initiatives. Applied research funding would shrink by $271 million to $3.1 billion. But if last year's pattern holds true, Congress probably won't go along with the overall cut, which would put the defense research budget 10% below its 1993 level.

    · Agriculture: The department proposes growing its National Research Initiative by $31 million, to $150 million. The agency is also asking for $120 million for a second year of a separate extramural grants initiative for applied research (Science, 21 January, p. 402). But the congressional outlook is uncertain. House appropriators blocked new funds for the extramural grants program last year and may do it again. “We're going to try to … convince them that it's an appropriate expenditure,” says the agency's budget chief, Steve Dewhurst. And the Agricultural Research Service would get a $50 million boost to $956 million, including increases for research on everything from emerging diseases and invasive species to climate change and crop-based fuels.

    · USGS: The U.S. Geological Survey's national mapping program would be the main beneficiary of a requested 10% increase, to $895 million. It hopes to add $29 million to the $127 million it will spend this year on the mapping effort, which collects and distributes data on everything from coastal wetlands to historical trends in urban growth. The survey is also asking for a $22 million boost for biological research, $2.6 million to buy 150 new seismographs for earthquake-prone San Francisco and other cities, and $4 million for new real-time stream gages for flood forecasting.

    · NOAA: While its overall budget soars by 20%, the National Oceanic and Atmospheric Administration's research spending sticks closer to sea level, inching up just $5 million to $303 million. Overall, the agency is requesting $489 million more than last year, half of which would go toward marine sanctuaries and estuary reserves. Spending on climate and air-quality research would jump $25 million, with virtually all of that going to overhaul the agency's storm monitoring and reporting systems. Spending for research on the oceans and Great Lakes, however, dives by nearly $20 million.

    · NIST: The biggest chunk of the 12% increase for the $636 million National Institute of Standards and Technology, some $50 million, would establish an Institute for Information Infrastructure Protection to foster public-private partnerships aimed at keeping computer data secure. The agency's core science laboratories would also get a 20% boost, to $332 million. The Advanced Technology Program, long a target of Republicans, would rise by $33 million, to $175 million.

  15. U.S. SCHOOL REFORM

    Packard Heir Signs Up for National 'Math Wars'

    1. Jeffrey Mervis

    Computer heir David W. Packard keeps a low profile, but there are signs that his charitable institute may add math to its substantial investment in reading reform

    A California-based group of academics, including four Nobel Prize winners in physics and nearly 200 mathematicians, sent a tremor rippling through the national education reform movement last fall with a full-page advertisement in The Washington Post. The 18 November ad urged the Department of Education to withdraw its “premature” endorsement of 10 new mathematics texts for elementary and secondary school students and caused enough of a fuss to spawn a congressional hearing last week (see sidebar). It was the latest salvo in the “math wars” that have raged for years in California and elsewhere between advocates of different approaches to math teaching (Science, 29 August 1997, p. 1192). But for some educators, the most intriguing aspect was neither the message nor the list of signatures. What caught their eye was the man who paid for it: computer heir David W. Packard.

    Packard, a tall, slender 58-year-old former professor of ancient Greek and the oldest of four children of David and Lucile Packard, is a newcomer to the math wars. But he brings an impressive arsenal. In the past few years he has funneled nearly $30 million into California schools to support a phonics approach to reading, including a virtual makeover of one low-performing Sacramento elementary school. The money has come largely from a $15 billion foundation created by his legendary father, the co-founder of Hewlett-Packard, who died in 1996. In July, the rapidly growing parent foundation, which supports a range of activities including family planning, conservation, and marine research, gave the Packard Humanities Institute (PHI), of which David W. Packard is president, some $1.5 billion to pursue its own philanthropic agenda. And some math reformers are wondering if the suddenly flush institute—which by law must spend at least 5% of its endowment each year—intends to make an even bigger splash in their pool.

    Packard, whose foundation in the past has preserved old movie houses and promoted the use of technology in the humanities, declined to talk to Science about why he agreed to spend $70,000 for the ad. But he's clearly interested in the subject. “I see no need to cooperate in distracting your readers from the central issue of math instruction,” he wrote in a brusque faxed reply to a request for an interview. (After checking with Packard, PHI trustee Robert Glaser, former dean of the Stanford Medical School and a family friend, also declined to talk to Science.) Instead, Packard points to a statement issued shortly after the ad appeared, which says that the institute hopes “to encourage broad discussion and debate on this important topic.”

    The ad takes aim at curricula that follow guidelines from the National Council of Teachers of Mathematics, the National Academy of Sciences (NAS), and the American Association for the Advancement of Science (AAAS, publisher of Science), which last year rated some of the texts very highly. Proponents praise their multifaceted and conceptual approach to learning mathematics and use of daily-life applications, while critics say they stray from fundamental arithmetic concepts such as fractions and offer insufficiently rigorous instruction.

    The ad Packard paid for criticizes parts of some textbooks, as well as accusing the Department of Education of ignoring the views of working mathematicians in assembling its panel. Its supporters also believe that the department should leave selection of texts and curricula to state and local school boards, despite the fact that the federal agency was ordered by Congress to conduct such an exercise. “We were getting lots of letters from people who didn't like the books the panel had chosen but felt powerless to object,” says David Klein, a math professor at California State University, Northridge, and a leader of the group, Mathematically Correct, that had already posted the letter on its Web site. “We were looking for other ways to express our views.”

    That opportunity came via Williamson (Bill) Evers, a research fellow at Stanford University's Hoover Institution and another leader of Mathematically Correct. “I know David and meet with him regularly, and he has been looking for ways to support elementary school mathematics,” says Evers. “I mentioned the open letter, suggested that a newspaper ad might be appropriate, and put him in touch with [Klein].”

    Some math educators say the ad, instead of encouraging open debate, inflames an already politicized conflict. “I respect many of these people, but I think what they have done is very harmful,” says Hyman Bass, president of the Mathematical Sciences Education Board at NAS and a professor at the University of Michigan, Ann Arbor. Bass and others say the criticism will make it harder for the math and education communities to work together.

    The institute's statement draws a direct link between the math and reading wars, noting that “the debates over mathematics instruction obviously reflect many of the same highly emotional ideological issues familiar to us from the parallel debates about reading.” In reading, the battle pits phonics, an approach that stresses the need to decode sounds and letters, against whole language instruction, which emphasizes the importance of the meaning and context of words. Phonics, the prevailing approach to reading in the 1950s and '60s, has made a comeback after being supplanted by whole language in the '70s and '80s. Although most reading experts say that both elements are essential, each side has hardened its position by becoming affiliated with a broader political philosophy—conservative vs. liberal. PHI's statement on the math letter sharply critiques whole language instruction: “superficially plausible but demonstrably ineffective methods of teaching reading [have] caused great harm to children.”

    Education reformers are now taking a closer look at Packard's involvement in the reading wars for a sense of the tactics he may bring to the math campaign. They are discovering that, unlike many philanthropists, he is hands on and very demanding. That style is evident in his bankrolling of Pacific Elementary School, a poor Hmong-majority school whose reading scores are the lowest in the 53,000-student Sacramento school district. “David doesn't just give money away,” says Pacific's principal, Kathy Kingsbury, who arrived last summer. “He wants to do good—and he wants to see results.”

    Packard is spending $4 million this academic year to renovate the old school, build a new conference center next door, and hire reading coaches for each grade. It's a compromise: District officials vetoed his original plan to create a new school and instead convinced him that Pacific was an ideal test-bed for his ideas. In addition to attracting Kingsbury—a former teacher and reading coach who had been principal at another school—and paying for another administrator, Packard's largess also led to a significant turnover of the 30-member professional staff. One-third of the teachers decided to go elsewhere after learning that they would be required to stay an hour later (with pay) each day and to adhere closely to a phonics curriculum called Open Court Reading. As a result, Kingsbury has taken on the added burden of training nine rookie teachers.

    Math instruction at Pacific, as at every Sacramento elementary school, is based on material from Saxon Publishers Inc. of Oklahoma. It emphasizes repetition and strict adherence to the text as the best way to develop computational skills. If Packard decides to support elementary school math reform, observers speculate, he's likely to back a similar approach.

    But Saxon's curriculum is controversial: A recent AAAS ranking of middle school math texts omitted Saxon because “its philosophy, organization, and format … were not well suited to a benchmarks-based evaluation.” Saxon's president, Frank Wang, acknowledges that “we don't emphasize higher order thinking and open-ended problem solving.” Wang says the company, which also offers home-schooling materials aimed at a Christian audience, deliberately avoided the controversial Department of Education review “because we knew we'd get low ratings [based] on the criteria they were using.”

    If his reading track record is any guide, Packard is also likely to immerse himself in math reform. “He does everything himself, and he is in total control,” says Betty Flanary, coordinator for Packard's Reading Lions program, which is pumping $15 million a year into 27 California school districts that use Open Court. “David walks around with binders on every school. He analyzes the data, and he's not afraid to call school officials and ask what's going on in a particular classroom that isn't performing.”

    “A micromanager? I guess you could say that,” laughs Marion Joseph, a California State Board of Education member and leader of the phonics movement who is widely credited with introducing Packard to the reading debate. “But David is classically trained in literature and music, and [Open Court] appealed to his understanding of how we learn language and its structure.” Adds Jim Sweeney, Sacramento city schools superintendent, “He pushes very hard for full implementation of what he is supporting. But before he gets involved, he researches it thoroughly.”

    Packard has not said how long he will support Pacific, although Sweeney, Kingsbury, and others have warned him that lasting gains don't happen overnight. But whatever he decides to do next, those who have worked with him say he'll want to see results. “I can tell you that David Packard has made one heck of a difference in our schools,” says Sweeney. That ability to get his way is exactly what worries Bass and other math educators on the opposite side of the reform movement.

  16. U.S. SCHOOL REFORM

    Effort to Pick Best Math Texts Denounced at Congressional Hearing

    1. Jeffrey Mervis

    The math wars came to Capitol Hill last week. But the pedagogical question of how to improve elementary and secondary school math instruction took a back seat to the ideological debate over who should call the shots in publicly funded education.

    The 1 February hearing was held ostensibly to give Congress a chance to review the Department of Education's (ED's) recent selection of 10 exemplary math curricula, the first in a planned series by expert panels that will also cover science, technology, and other areas. A November ad in The Washington Post, signed by more than 200 mathematicians and other academics, attacked the exercise and demanded that the department repudiate the recommendations (see main text). Although the hearing gave Administration officials a chance to defend the process, it served mainly as a forum for critics of the federal government's role in education.

    The 14-member expert panel rated each curriculum on its quality, ease of use, suitability for a diverse population, and compatibility with national standards. Such measures, tested in two pilot studies, were intended to replace “anecdotal evidence” of success, noted expert panel co-chair James Rutherford of the American Association for the Advancement of Science (which publishes Science). Out of 61 voluntary submissions, five were scored exemplary and another five considered promising. In keeping with its mission to spread the word about high-quality programs, the panel neither rated the other programs nor released their names.

    But the critics were not convinced. “There is no valid research that shows any of these curricula are effective,” claimed Stanford University mathematician R. James Milgram. “With one possible exception, the programs represent a single point of view, with the teacher as a facilitator. There are no means for students to develop mastery of basic arithmetic operations. Algebra is also short-changed.” Milgram said that “the sad state of mathematics education is primarily the responsibility of two agencies, the National Science Foundation [NSF] and the Department of Education,” and Representative Michael Castle (R-DE), who chaired the hearing, said afterward that the government should do more research before it puts its “stamp of approval” on these programs.

    Neither Judith Sunley of NSF, which funded development of six of the 10 programs, nor Kent McGuire, head of ED's Office of Educational Research and Improvement, took issue with Milgram's critique at the hearing itself. Sunley said later that “there are many factors that affect the success or failure of a particular curriculum.” Part of the problem, noted Rutherford, is that “it's almost always possible to find information to support or criticize any program.”

    Much of the hearing focused not on content but on politics, however. “At the very least, the federal government should first do no harm,” asserted Susan Sarhady, a Plano, Texas, parent who has led an unsuccessful fight against one of the recommended programs, Connected Math, adopted 3 years ago by local school officials in Plano. Another parent, Mark Schwartz of Livonia, Michigan, incensed by his local district's use of another exemplary program, Core Plus, labeled U.S. public education “the largest monopoly in the world.” And Rachel Tronstein, a former student in that district, complained that she and her classmates were forced to follow 4 years of a curriculum “that simply does not work.”

    Democratic members defended the exercise and reminded their colleagues that it was mandated by a 1994 law. “It's critical that districts have access to solid research on what works,” said Representative Dale Kildee (D-MI) about the exercise, “and the Department of Education panel should be one of many tools.” But Representative William Goodling (R-PA), chair of the Education and the Workforce Committee, which convened the hearing, joined several of his Republican colleagues in denouncing the exercise. “I'm afraid that these federal expert panels sometimes take away from the power and good sense of local officials,” Goodling said.

    Representative Vern Ehlers (R-MI) pleaded with his colleagues to stop “talking about angels dancing on the head of a pin” and to start “looking at the larger issues” of what factors affect student learning. But exorcising politics from the discussion may be impossible. Tronstein, who is now a freshman at the University of Michigan, Ann Arbor, acknowledged that her concern for mathematics education wasn't the only reason she came to Washington. Glancing around the hearing room, her eyes wide with awe and ambition, she confessed: “I'd love to be back here someday, as a member.”

  17. AIDS RESEARCH

    Vaccine Studies Stymied by Shortage of Animals

    1. Jon Cohen

    NIH doesn't know how many Indian rhesus macaques its researchers need, nor how many are available. And that's a big problem

    Paul Johnson's lab at the New England Regional Primate Research Center is one of the best equipped in the world to study the immunology of SIV, the simian AIDS virus. Johnson also enjoys generous funding from the National Institutes of Health (NIH). So why does he have to wait 2 to 6 months to start an experiment? The answer is simple: The demand for rhesus macaques, the animal of choice for Johnson and a growing number of AIDS researchers, far outstrips the supply. But the reasons for that shortage are complex, encompassing everything from international trade to internal NIH politics. And AIDS researchers are worried that, if the shortage persists, it could hinder progress in the field. “It's a huge problem,” says Norman Letvin, a leading AIDS vaccine researcher based at Harvard's Beth Israel Deaconess Medical Center in Boston. “And it's going to get much worse.”

    For many years, AIDS researchers bemoaned the lack of a good animal model for testing vaccines, measuring the toxicity of various drugs, and exploring the disease's progression. But the Indian rhesus macaque was found to develop a disease that closely mimics human AIDS when infected with SIV, and the animal has steadily gained in popularity. NIH has stimulated the demand for these monkeys by doubling its budget for AIDS vaccine research in the past 5 years. To make matters worse, researchers in reproductive biology, malaria, and other fields also have begun to rely more heavily on rhesus macaques.

    But supplies are limited. India, once the main source, has banned exports of the species since 1978, and NIH is phasing out a domestic program to breed “clean” macaques that was begun after many imported monkeys were found to harbor pathogens. The resulting imbalance between supply and demand has caused delays of up to a year or more for animals with certain genetic features and has driven up the price of the animals to a level that is straining many researchers' budgets.

    Six months ago NIH's Office of AIDS Research (OAR) sponsored a meeting at which researchers concluded that there was a “severe shortage” of rhesus macaques and urged NIH to act quickly. But NIH officials say it's not clear what should be done, or which component of the $18 billion agency should take the lead. “There is a problem, but I'm not able at present to identify its dimensions,” says OAR director Neal Nathanson. The issue, he says, falls in the lap of another NIH branch, the National Center for Research Resources (NCRR), which, in addition to funding the New England facility and seven other primate facilities, specifically bankrolls breeding programs with both nonprofit and commercial suppliers. “I've conveyed my sense of the [problem] to NCRR,” Nathanson says.

    Some outside researchers wonder, however, if NCRR has the ability or desire to improve the situation. “It's a very isolated institute, and it doesn't collaborate well with other institutes,” says Alan Schultz, a former head of AIDS vaccine research at the National Institute of Allergy and Infectious Diseases who now works with the International AIDS Vaccine Initiative. Jerry Robinson, who oversees the regional primate centers for NCRR, agrees that there is a “real crisis.” But Robinson has had difficulty assessing the number of primates available for research, and no one at NCRR has tallied the number of animals required by NIH-funded researchers.

    Several factors complicate any attempt to make even a simple assessment of the available population, says Robinson. Primate centers and private colonies reserve some animals for breeding or behavioral research, and disqualify others that are too young or old. In addition, both commercial breeders and primate centers have been known to play favorites. “The people involved in monkey procurement seem like an old boys' network of people calling their friends,” says New England's Johnson. “Two people may call up and get different information about availability.”

    In addition, one size definitely doesn't fit all researchers. Scientists have become increasingly selective as they develop more sophisticated research tools and reagents. But the cost of many of the animals produced through NCRR-funded breeding programs now far exceeds that budgeted in the average NIH-funded grant. Add in the rising demand from outside the AIDS field and the result, Robinson says, is that “lots of NIH grants depend on macaques, and [the researchers] can't get the monkeys they need.”

    Most in demand are monkeys that don't have chronic infections. In the early 1980s, coincident with the start of the AIDS epidemic, primate researchers discovered that many rhesus macaques had been infected with what is now known as simian retrovirus. Although the virus has no relationship to SIV or HIV, it can kill monkeys by causing an AIDS-like disease. Many animals also harbored an array of other viruses that could harm either the monkeys or their handlers, including STLV, herpes B, and foamy virus. So 12 years ago, NCRR decided to fund select primate centers and commercial outfits to breed “specific pathogen free” (SPF) animals.

    Unfortunately, the SPF breeding program has done little to alleviate the crunch, and rising demand has driven up prices. Kay Izard, a zoologist who runs the SPF program at LABS of Virginia in Yemassee, South Carolina, says the top asking price for an SPF animal has risen from $2000 in 1987 to $5000 today. That's nearly double the per-animal price budgeted by most NIH-funded researchers, some of whom buy as many as 100 rhesus macaques a year.

    Three years ago, NCRR decided to phase out funding for the SPF breeding program. It was a tacit admission that the animals had become too expensive for academic researchers and that the program was instead subsidizing commercial users. “The long-range goal of NCRR was to set these colonies up and not to subsidize them forever,” says Robinson. But Izard and others have questioned that decision. “It's too bad they didn't extend the funding for it, since there is a real problem,” says Izard, noting that LABS has had to raise prices even higher since NCRR cut its support.

    On top of the shortage of SPF animals, AIDS researchers are also hard pressed to find females of breeding age, as well as a specific genetic type used for immunologic studies that help explain why vaccines fail or succeed. Specifically, researchers have developed reagents that allow them to measure killer cells—a critical component of the immune system—only in animals that have a marker on their white blood cells known as Mamu-A*01. “The wait for Mamu-A*01 animals is not quantified in months but in years,” says Johnson.

    NCRR soon will solicit new proposals from monkey breeders, says Robinson, in hopes of boosting supply. But new housing for breeding colonies is also essential, says Ronald Desrosiers, who heads the New England center in Southborough, Massachusetts. “You can give me all the [breeding] money you want,” he says, “but if I can't use it to build a building to put them in and the infrastructure to support it, it is impossible [to produce more animals].” Others suggest that centers in cold climates like New England should instead contract out the work to commercial firms in warmer climates, where animals can roam outside all year.

    Some researchers see foreign stocks as a partial solution. Marta Marthas of the California Regional Primate Research Center in Davis says researchers should look more carefully at rhesus macaques still available from China. Although studies suggest that Chinese macaques naturally control SIV infection more effectively than do monkeys of Indian origin, Marthas says many questions remain about their differences. David Watkins of the University of Wisconsin, Madison, hopes to implant embryos from Indian Mamu-A*01 animals into Chinese rhesus mothers. “[In vitro fertilization] is the way to go,” he says, although he adds that the technique is now inefficient.

    Whatever its cause, the shortage of rhesus macaques has highlighted the need for scientists to become more involved in breeding them. “What disturbs me the most is that this [issue] is looked at with great disdain by everyone,” says Marthas. “We have to think ahead. If we don't, we'll jeopardize not only our own experiments but those of future scientists.”

  18. STATISTICS

    Revealing Uncertainties in Computer Models

    1. Barry Cipra

    Computer simulations give the impression of precision, but they are founded on a raft of assumptions. Making uncertainties evident is a tough challenge

    As computers have become faster and smarter, scientists have used them to build models of complex phenomena—everything from wildfires on rugged terrain to traffic snarls on urban streets. These simulations can yield precise answers to problems a person with a pencil and paper would need millennia to solve. But just how reliable are these virtual solutions? Although the precise numbers and realistic pictures produced by computer simulations give an illusion of accuracy, a ravening swarm of assumptions, simplifications, and outright errors lurk beneath. New tools are needed, scientists say, to quantify the uncertainties inherent in calculations and to evaluate the validity of the models. But, as several recent uncertainty workshops* attest, the quest for such tools is itself an uncertain and challenging process.

    The worrisome scenario is not the old science fiction cliché of supersmart machines shucking the shackles of humanity, but rather semistupid programs placed in positions of responsibility—a kind of digital Peter Principle, in which computers rise to their level of incompetence. As long as human experts act as a buffer between the program and the real world, interpreting the results and gauging their reliability, computer models can get by without built-in error estimates. But when the models are turned over to nonexperts, or when the programs themselves are authorized to make decisions—an automated inventory control system, for example, that decides when to restock items—quantifying their uncertainty becomes urgent. Much of the global warming debate, for example, is fueled by the radically different numbers that different models produce. As the explosive growth of computer power allows researchers to tackle ever-bigger problems with ever-more-complex models, even the experts have a hard time sorting the scientific wheat from the numerical chaff.

    “We've been lucky in the past, because we've picked problems where the sources of uncertainty are negligible,” says mathematician Mac Hyman of Los Alamos National Laboratory (LANL) in New Mexico. Regulating the flow of oil through the maze of pipes and pumps of a refinery, for example, is a straightforward exercise, as is designing a bridge. But problems like anticipating the impact of global warming up the ante on uncertainty. The number of variables is hugely greater, and the scale of the problems makes even state-of-the-art computers creak under the computational load. Moreover, notes Hyman, in a few decades, most complex computer simulations will predict not physical systems, which tend to obey well-known mathematical laws, but social behavior, which doesn't.

    One mushrooming area of application is the Defense Department's “stockpile stewardship” program, which aims to maintain the reliability of U.S. nuclear weapons in part through computer simulations and virtual testing. “The consequence of making a mistake could be very, very high,” says LANL physicist David Sharp—imagine detonators that no longer behave as the simulations predict after they've sat around for 50 years. And the need for quantifying uncertainty, Sharp emphasizes, goes beyond weapons, “pervading absolutely everything.”

    At the very least, says Hyman, modelers need to start putting error bars on their output, and that is no easy task. Commonplace in reports of experiments, error bars show a range of values that is likely (or guaranteed) to include the “true” value. To calculate them, researchers must figure out the uncertainties in each step of the model and see how those uncertainties build on each other. Ultimately, however, the uncertainty enthusiasts would like to have a new variable type to work with when they write computer programs: probability distribution functions, or pdf's. The most famous and ubiquitous pdf is the bell-shaped Gaussian, or “normal,” distribution, but there are many others. Unlike error bars, which merely give a range in which the solution should fall, pdf's attach a likelihood to each possible value. Current computer languages, which are based on algebra rather than statistics, make calculating with pdf's an awkward, ad hoc affair. That's one reason modelers stick to exact numbers even when they're only estimates, Hyman points out. Extending languages to handle pdf's more easily should help.
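    To make the idea of a pdf as a program variable concrete, here is a minimal Python sketch. It is not drawn from any of the tools discussed at the workshops; the class name PDFVariable, the sample counts, and the example quantities are hypothetical. The sketch simply represents each uncertain quantity by Monte Carlo samples so that ordinary arithmetic propagates the uncertainty and an error bar can be read off at the end.

      import numpy as np

      class PDFVariable:
          """An uncertain quantity represented by Monte Carlo samples instead of a single number."""

          def __init__(self, samples):
              self.samples = np.asarray(samples, dtype=float)

          @classmethod
          def normal(cls, mean, std, n=100000, seed=0):
              # Draw n samples from a Gaussian pdf with the given mean and standard deviation.
              rng = np.random.default_rng(seed)
              return cls(rng.normal(mean, std, n))

          # Ordinary arithmetic on PDFVariables propagates uncertainty sample by sample.
          def __add__(self, other):
              return PDFVariable(self.samples + other.samples)

          def __mul__(self, other):
              return PDFVariable(self.samples * other.samples)

          def error_bar(self, level=0.95):
              # Return an interval containing the central `level` fraction of the samples.
              tail = (1.0 - level) / 2.0 * 100.0
              lo, hi = np.percentile(self.samples, [tail, 100.0 - tail])
              return lo, hi

      # Hypothetical example: an uncertain rate times an uncertain duration gives an uncertain yield.
      rate = PDFVariable.normal(2.0, 0.3, seed=1)      # e.g., units produced per hour
      duration = PDFVariable.normal(5.0, 0.5, seed=2)  # hours
      total = rate * duration
      print("95% error bar on the yield:", total.error_bar())

    The same sampling trick extends to any function of the inputs, which is roughly how uncertainty can be pushed through models far too complicated for pencil-and-paper error analysis.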

    Error bars and pdf's won't just make results easier to interpret, notes Gregory McRae, a chemical engineer at the Massachusetts Institute of Technology. By pinpointing areas that need attention, they can also help researchers design experiments. For example, imagine a simple, two-step chemical reaction, A → B → C, with uncertain rates of reaction. A mathematical model that keeps track of uncertainties as it computes the three time-varying concentrations can indicate which experiments will be most informative. A big error bar in the model's output partway through the reaction tells the laboratory scientist to measure the three concentrations at the corresponding time.
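    McRae's two-step reaction can be sketched in a few lines of Python. The rate constants, their spreads, and the crude Euler integrator below are assumptions chosen purely for illustration, not McRae's actual model; the point is only that propagating sampled rate constants through the kinetics reveals when the predicted concentrations are most uncertain, and hence when a measurement would be most informative.

      import numpy as np

      rng = np.random.default_rng(1)
      n_draws = 2000
      t_grid = np.linspace(0.0, 10.0, 201)
      dt = t_grid[1] - t_grid[0]

      # Uncertain rate constants for A -> B -> C (hypothetical log-normal spreads keep them positive).
      k1 = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n_draws)
      k2 = rng.lognormal(mean=np.log(0.5), sigma=0.5, size=n_draws)

      # Integrate dA/dt = -k1*A, dB/dt = k1*A - k2*B, dC/dt = k2*B for every draw (explicit Euler).
      A = np.ones(n_draws)
      B = np.zeros(n_draws)
      C = np.zeros(n_draws)
      spread_B = []  # width of the 95% interval on the concentration of B at each time step
      for _ in t_grid:
          spread_B.append(np.percentile(B, 97.5) - np.percentile(B, 2.5))
          A, B, C = A - k1 * A * dt, B + (k1 * A - k2 * B) * dt, C + k2 * B * dt

      # The widest spread marks the time at which a laboratory measurement would tell us the most.
      best = int(np.argmax(spread_B))
      print(f"Measure concentrations near t = {t_grid[best]:.2f}; 95% spread in B is {spread_B[best]:.3f}")

    A production model would use a proper ODE solver and far more draws, but the bookkeeping is the same.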

    But what's easy for a two-step process is well-nigh impossible for the highly complex systems that dominate modern modeling. “As we put more of our understanding into these models, it's getting harder to probe their uncertainties,” McRae says. The trouble is, error estimates, when done at all, tend to be added as an afterthought, which makes them less accurate. “Since uncertainty is important, why not put it in at the beginning?” McRae asks.

    That's where statisticians come in. “We need to be playing a bigger, more important role in working with the modelers,” says Sallie Keller-McNulty of LANL. Statisticians, whose forte is variability, could help modelers come to grips with the noise of real-world data. In weather models, for example, there is a tremendous correlation between precipitation patterns in different locations, but it's easy to overlook without a statistical analysis. Statisticians could also help develop mathematical equations that describe inherently probabilistic features of physical systems, such as the distribution of grass, pine needles, and other fuel in a wildfire, or the way drivers behave in a traffic simulation.

    Despite formidable hurdles ahead, the uncertainty crowd thinks modelers will eventually embrace error bars, pdf's, and other techniques of uncertainty analysis with much the same fervor carmakers now show in advertising safety features on cars. Scott Weidman, director of the Board on Mathematical Sciences at the National Research Council, sees a “very healthy sign” in the number of workshops devoted to uncertainty. “It's a sign of a maturing field that people can step back from individual problems and ask if there's a general methodology,” he says. “People are starting to think of a metaproblem: how to build a better science of computer science.”

    • *Two took place last December in Santa Fe and Los Alamos, New Mexico. A 2-day meeting on the “Evaluation of Complex Computer Models” (3 to 4 December) focused on how statisticians can address the problem, and a 3-day follow-up on “Predictability of Complex Phenomena” (6 to 8 December) looked at efforts currently under way.
