News this Week

Science 20 Aug 1999:
Vol. 285, Issue 5431, p. 49
  1. SCIENCE EDUCATION

    Kansas Dumps Darwin, Raises Alarm Across the United States

    1. Constance Holden

    Kansas Governor Bill Graves, a Republican, called it “a terrible, tragic, embarrassing solution to a problem that did not exist.” Six Kansas college and university presidents and chancellors warned that it “will set Kansas back a century and give hard-to-find science teachers no choice but to pursue other career fields.” Science educator Steven Case of the University of Kansas, Lawrence, says: “There's statewide outrage over what they did.” And the outrage was not confined to Kansas. National organizations last week joined the chorus of protests over the Kansas Board of Education's vote to eliminate not only evolution but anything hinting at the great age of Earth, and even some cosmological theories, from statewide standards for science teaching.

    In the first evolution-related uproar to go nationwide in the 4 years since Alabama required biology textbooks to carry disclaimers labeling evolution an unproven theory, the board voted on 11 August to remove evolution from a draft of the state standards. By 6 to 4, the board adopted standards* in which the word “macroevolution” has been deleted. (Microevolution, which refers to genetic changes within a species, is still allowed.) The “big bang” is gone, along with other hints of geologic time scales such as radioisotope dating. Mount St. Helens is supplied as an example of how some radical geologic changes can occur rapidly. The changes “are more extensive than what we've seen before” in states where evolution battles have raged, says Molleen Matsumura of the National Center for Science Education (NCSE) in El Cerrito, California, which monitors evolution debates around the country. “I think this is the first time that the big bang has bothered anybody,” observes retired physicist Jack Davidson, a member of the Lawrence school board.

    The standards promote what Case labels “bizarro” hints for how to think scientifically. For example, the 27-member committee of scientists and educators assigned to write the document submitted a standard for 8th graders on the “history and nature of science” that reads: “Display open-mindedness to new ideas. Example: Share interpretations that differ from currently held explanations on topics such as global warming and dietary claims. …” But the board substituted the following: “Learn about falsification. Example: How many times would we have to prove the theory [that all cars are black] is wrong to know that it is wrong? Answer: One car of any color but black and only one time. …” Says Matsumura: “There's tremendous emphasis on direct observation in the present. … All it takes is one pair of ‘contemporaneous’ human and dinosaur footprints to falsify current conclusions that dinosaurs preexisted humans by millions of years.” As Case notes, “Creationists like to limit the nature of science to falsifiability. … Disproving one thing proves the other.”

    The standards are supposed to be guidelines, based on national ones put out by the National Academy of Sciences, for teachers in Kansas public schools. They are not mandatory, but once they are tidied up by the board, a process that was still going on early this week, they will form the basis for statewide achievement tests starting in 2001. Observers expect that they will have their largest impact in small communities where, according to Case, 80% of Kansas's science teachers teach. And Case and others are predicting that Kansas students will show “significant declines in college entrance test scores” as a result. What's more, says Matsumura, there are “some expectations that science teachers will be taught in a manner to prepare them to be in accordance with state standards.”

    The national uproar came swiftly. Groups such as the American Civil Liberties Union and People for the American Way are making noises about suing. NCSE has received a “blizzard of phone calls,” says Matsumura. And several scientific societies have registered alarm. The American Geophysical Union's president, Fred Spilhaus, sent out a call for scientists to get more involved in local politics. And the American Physical Society is plotting ways of letting the state know that dumping Darwin is bad for its economy. It can point to one concrete example already: Broadcast Software International in Eugene, Oregon, has already let it be known that it's crossing Kansas off its short list for a new service center.

    Education board members who voted in the majority have suggested that public response is overblown. “I have no problem with the teaching of evolution,” board chair Linda Holloway told the Associated Press. The main moving force behind the action is said to be Steve Abrams, a veterinarian from Arkansas City and former Republican state chair, who got a fundamentalist group to draft an alternate set of standards which he then presented to the board. The version that is being finalized now is represented by the board as a “compromise,” says Case. Abrams, who did not return several phone calls from Science, said in a statement on Monday that the board's vote “shows a respect for diversity of opinion and … expands the learning opportunity of the children of Kansas rather than narrowing it.”

    But the board could find that its vote will come back to haunt it. The governor and legislators are talking about a constitutional amendment to bring the education board—which ironically was set up as an elected body to insulate it from politics—back under the control of the legislature. What's more, Case says educators have formed a group called Citizens for Science, which plans to supply local school districts with good standards. And, says Matsumura, “Each event like this makes more scientists become actively concerned. … There are more concerned scientists now than there were before Wednesday.”

  2. ENVIRONMENT

    Sharp Drop Seen in Soil Erosion Rates

    1. James Glanz

    CHICAGO—As any fan of detective thrillers knows, if there is a murder, there must be a body. For soil environmentalists, the “crime” is the use of farming practices that lead to massive amounts of erosion. According to some studies, the bodies—countless tons of precious topsoil—have been washing into rivers and streams at a rate that has changed little in the United States since the Dust Bowl days of the 1930s. But when Stanley W. Trimble went looking for those bodies, like a Lieutenant Columbo in coveralls, the plot took a surprising twist: Most of the expected corpses simply weren't there. Trimble's findings suggest that erosion rates are running much lower than generally estimated. Indeed, over the past few decades they appear to have been a tiny fraction of their historical peaks.

    Trimble, a professor in the department of geography and the Institute of the Environment at the University of California, Los Angeles—who also happens to be a Tennessee farmer—based his conclusions on 140 years of data on sedimentation in the heavily farmed Coon Creek Basin, which drains into the Mississippi River 25 kilometers south of La Crosse, Wisconsin. The study, reported on page 1244 of this issue, is being hailed for its scope, but it is also generating controversy. “Trimble's work suggests that rates of erosion in that region are much less than a lot of people seem to have thought,” says Pierre Crosson, an agricultural economist at Resources for the Future in Washington, D.C. But some critics question the study's methods, and others argue that the Coon Creek rates may not be typical even of the humid Midwest and eastern United States, let alone other areas. Trimble responds that “the burden of proof is on those who have been making these pronouncements about big erosion numbers. … They owe us physical evidence. For one big basin, I've measured the sediment and I'm saying, I don't see it.”

    The detail with which Trimble looked at the basin, which sprawls over 360 square kilometers around Coon Creek and its tributaries, is virtually unheard of in large erosion studies. The research benefited from a combination of lucky historical factors and what Trimble describes as scholarly “perspiration”—years of fieldwork. During the erosion crisis of the 1930s, the basin was chosen for intensive study by the Soil Conservation Service (now the Natural Resources Conservation Service) of the U.S. Department of Agriculture (USDA). Trimble tracked down the old monuments and markers—ranging from steel pipes set in concrete to nails pounded in trees—and used them, just as USDA did, as benchmarks for measuring how much sediment has accumulated in the basin from erosion of the rolling fields around it.

    Records from many such studies were scattered and lost during the confusion of World War II, but Trimble ran across the Coon Creek data in the National Archives in the 1970s. He also enlisted the help of geologist Stafford Happ, who had led some of the original USDA work. “He had a memory like an elephant,” says Trimble. “He'd say, ‘Yeah, I remember this elm over here. I'm sure we drove a nail in the west side.’” Trimble then resurveyed the soil profiles in dozens of sections across the basin's valley in the 1970s and again in the 1990s to see changes.

    To trace erosion rates further back in time, he dug down to find other markers—old roads, railroad beds, concrete dams, and house foundations—that marked soil levels all the way back to the turn of the century. At greater depths, he found the dark, richly organic soil of the original prairie, a benchmark for the soil level when European farmers arrived in the 1850s and eroded sediment first started to accumulate in the basin.

    The measured rates jumped in the late 19th century, skyrocketed in the 1920s and 1930s, and then dropped again as USDA pressed farmers there to stop using the traditional moldboard plow and adopt conservation practices like strip-cropping and leaving plant residue and stubble in the fields year-round to inhibit runoff. From the 1970s to the 1990s, sedimentation rates dropped to just 6% of their peak.

    Official USDA national averages for the last 2 decades have suggested a slight decline in soil loss, but the decline that Trimble reports is so precipitous that some experts find it hard to believe. Among them is David Pimentel, a Cornell University entomologist who claimed huge, continuing erosion losses in a paper in Science (24 February 1995, pp. 1088 and 1117) that has been criticized for including what he now concedes were outdated and erroneous data. Pimentel says he distrusts routine soil-science methods such as locating and dating the original prairie surface. Trimble has “got a good imagination,” he says, adding, in reference to the 19th century soil levels: “He wasn't there, and these are guesstimates at best.”

    Others, while accepting the Coon Creek study, caution against extrapolating the numbers too far. “I could not support that as a national average,” says Marty Bender, a research associate at the Land Institute in Salina, Kansas. “I think it helps confirm, in an indirect manner, that soil erosion has been decreasing to some extent.” “I think [this] is probably a really good study,” says John Reganold, a soil scientist at Washington State University in Pullman. “But the problem is, it's just one area.” In the wheat-farming Palouse area of the Pacific Northwest, Reganold points out, the soil and climate are not only entirely different, but farmers have been much slower to adopt conservation practices on their fields.

    Trimble acknowledges that the comparison probably works only for the humid Midwest and eastern United States, and concedes that even in those regions there are probably exceptions where poor farming practices or higher intrinsic erodibility come into play. And a second major conclusion from the study suggests that some of the benefits of slower erosion may be slow in coming. Trimble found that no matter what happens to erosion rates, the basin tends to store and release sediment in such a way that the amount delivered to the Mississippi River remains roughly constant over the decades. Sediment eroded from upland areas is in effect stored around Coon Creek tributaries and other geographic features and released later. Trimble mapped out in detail, for example, how the cutbanks and floodplains around the oxbows of the tributaries could change in shape and size over time and be transformed from sediment sources to sinks and back again.

    The constancy “is a big surprise,” says Olav Slaymaker, president of the International Association of Geomorphologists, who is at the University of British Columbia in Vancouver. And it could have major implications for controlling off-site damage by sediments and the pollutants that cling to them, says M. Gordon Wolman of the department of geography and environmental engineering at Johns Hopkins University. “It means that if I control the materials coming off a field or group of fields,” says Wolman, “it may be some time before I see the results of that—if I do within decades.” From a policy standpoint, he says, “this could be, to some people, very disturbing.” Environmentalists will long be debating the significance of that finding.

    Whatever the outcome of that debate, Slaymaker says that the scientific significance of the new study is clear: “It's the most comprehensive study of its kind anywhere in the world.”

  3. MISCONDUCT

    Fraud Finding Triggers Payback Demand

    1. Marcia Barinaga,
    2. Jocelyn Kaiser

    Officials at Lawrence Berkeley National Laboratory (LBNL) in California thought they were setting a positive example when they exposed allegedly fraudulent research conducted by one of their scientists on the effect of electromagnetic fields (EMFs) on living cells. Now they feel they are being punished for their forthrightness: The National Cancer Institute (NCI) has demanded that the lab repay more than $800,000 in grant money that was awarded to the researcher.

    “We think that to require us to pay back the money would set a very dangerous precedent,” says LBNL spokesperson Ron Kolb. “It discourages institutions from behaving responsibly.” But Marvin Kalt, director of NCI's Division of Extramural Activities, argues that the agency is only doing its job: “We're obligated to review whether we should recover” misused funds. Indeed, LBNL is not the first institution that funding agencies have dunned for repayment after a misconduct finding, although such cases seem to be relatively rare.

    The accused biologist, Robert P. Liburdy, published a pair of papers in 1992 that appeared to provide evidence that EMFs at the low strengths found in homes could have a physiological effect on cells by increasing the influx of calcium. His findings were taken as support for the hypothesis that EMFs could cause cancer. But a co-worker questioned the work, and in 1995, after a yearlong investigation, LBNL concluded that Liburdy had deliberately published fraudulent findings. The federal Office of Research Integrity (ORI) agreed in June that Liburdy's data did not support the claims in his papers (Science, 2 July, p. 23). Liburdy has denied wrongdoing (Science, 16 July, p. 337).

    ORI's conclusions apparently triggered NCI's demand that the lab return $804,000 in grants for Liburdy's research from 1991 until March 1994. The letter from NCI, dated 3 August 1999, says that “the rationale for this decision is that the misconduct that occurred affected the validity of the entire grant project.” All costs incurred by Liburdy “are unallowable,” the letter says, because the funds “were expended to support falsified research or obtained on the basis of falsified research.”

    The NCI demand has angered Mina Bissell, chair of the LBNL Life Sciences Division in which Liburdy worked. Bissell notes that the lab is returning the unspent portion of the funds, but to demand repayment of money that has been spent, she says, is punitive. “We have shown a lot of courage, more than most universities,” says Bissell of LBNL's handling of the case. “We did all the right things. What message does this [demand] send to other institutions?”

    Officials respond that the National Institutes of Health, of which NCI is part, states in its grant policy that awards may have to be returned if they're misspent. Kalt adds that NCI has previously recovered research grants in “a small number of [misconduct] cases”—although they are the exception. Although ORI doesn't track recovered grants, Lawrence Rhoades, director of ORI's Division of Policy and Education, says he's aware of only a few cases among the 100 or so misconduct findings by ORI since 1992. “Figuring out how much money the government should get back is not always easy to do,” explains Rhoades. Often the scientist found guilty played a small role in a study, or the overall conclusions are still valid.

    To Bissell this inconsistency adds to the unfairness of NCI's demand. The NCI letter gives LBNL 30 days to either pay or appeal the decision. Kolb says the lab plans to appeal.

  4. SPECIATION

    Mexican Pairs Show Geography's Role

    1. Bernice Wuethrich*
    1. Bernice Wuethrich is an exhibit writer at the Smithsonian's National Museum of Natural History in Washington, D.C.

    The forested mountain ranges that march across each side of Mexico's Isthmus of Tehuantepec are a naturalist's paradise, full of rare birds, mammals, and butterflies. They are a playground for evolutionary biologists too, because these nearly identical habitats were separated relatively recently, when climate change created an arid strip between them. Now, researchers who examined pairs of species on either side of the lowland report on page 1265 that they have new evidence that such geographic barriers are the major force driving the formation of new species.

    Biologists since Darwin have analyzed—and argued about—how species are born. Recently researchers have looked favorably on a version of Darwin's own idea: that populations of a single species can separate when they change their ecology, adapting to different temperatures, food resources, or other environmental conditions (Science, 25 June, p. 2106). But after examining 37 pairs of closely related species, one from each side of the arid Tehuantepec barrier, a team led by ornithologist A. Townsend Peterson of the Natural History Museum of the University of Kansas, Lawrence, found that members of each pair had similar ecological niches. In every case it appears that the geographic barrier, rather than any difference in ecology, was the critical factor in speciation, isolating the original populations so that they accumulated genetic differences and eventually became unable to interbreed, says Peterson. “Speciation is taking place simply because of geographic isolation,” not through ecological adaptation, he says.

    The idea of geographical speciation is well understood, but unambiguous examples are rare. “Showing it the way they did … is pretty clever,” says Robert Zink, curator of birds at the Bell Museum in St. Paul, Minnesota. Even so, he and others warn against dismissing the role of ecology in speciation, because the method needs refinement and there are counterexamples in which ecological differences have driven populations apart into species.

    Peterson and colleagues Jorge Soberón and Victor Sánchez-Cordero from the National Autonomous University of Mexico in Mexico City recognized the potential of the dry lowland to test speciation theories. The 300-kilometer-wide strip was once forested, but climate changes left it scrub-dry by 100,000 years ago, interrupting the ranges of forest species to the north and south.

    To carry out their test, the team used published literature to identify 37 pairs of sister species on either side of the isthmus and searched museum records to find out where specimens of each species were collected. They used these location data to define each species' ecological niche based on four conditions: temperature, precipitation, elevation, and vegetation. They then plugged these parameters into a computer program to determine the potential geographic range of each species.

    Peterson tested whether the observed ecological niche of one species could predict the niche and range of its sister. For all 37 species pairs, the answer was yes. For instance, the ecological parameters favored by a blue mockingbird also predicted the range of its counterpart south of the barrier, a blue-and-white mockingbird. This means that each species' niche remained stable throughout the speciation process. Such conservatism makes it “pretty clear that speciation did not take place in an ecological dimension,” says Peterson.
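
    In schematic form, the test works something like the sketch below. This is a minimal climate-envelope stand-in: the article does not name the team's actual program, and the field names, the simple min-max envelope logic, and the 90% threshold are illustrative assumptions, not details from the study.

```python
# Minimal sketch of the reciprocal niche-prediction test described above.
# All specifics (field names, envelope logic, threshold) are assumptions.

KEYS = ("temperature", "precipitation", "elevation", "vegetation")
# vegetation is coded here as a numeric class index purely for simplicity

def build_envelope(records):
    """Fit a niche as the min-max range of each condition across the
    museum localities where a species was collected."""
    return {k: (min(r[k] for r in records), max(r[k] for r in records))
            for k in KEYS}

def inside(envelope, site):
    """A site falls in the predicted range if every condition lies
    within the fitted envelope."""
    return all(lo <= site[k] <= hi for k, (lo, hi) in envelope.items())

def predicts_sister(records_north, records_south, threshold=0.9):
    """Reciprocal test: does the niche fitted to one species' localities
    capture the localities of its sister across the isthmus?"""
    env = build_envelope(records_north)
    hits = sum(inside(env, site) for site in records_south)
    return hits / len(records_south) >= threshold
```

    If niches had shifted during speciation, the envelope fitted on one side of the barrier would systematically miss the sister's localities; the reported result is that it did not, for any of the 37 pairs.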

    The work offers “an intriguing contribution” to the speciation debate, says biologist Thomas Lovejoy of the Smithsonian Institution. However, Trevor Price, an ornithologist at the University of California, San Diego, says he and others don't doubt that geography can create new species; what they want to know is when and how ecology plays a role, too. “We are now asking what is the role of ecology over and above geographic isolation,” he says. And he thinks Peterson's study may have overlooked some complexities. An ecological niche is much more than just four physical parameters, he notes. Herpetologist David Wake of the Museum of Vertebrate Zoology in Berkeley, California, adds that the literature identifying species as sisters may or may not be accurate, nor is it certain that all of the supposed sisters were born when the barrier appeared.

    Nonetheless, Wake and other researchers agree that Peterson's method of predicting geographic ranges from ecological data holds great promise. Peterson is now improving his method to incorporate dozens of ecological parameters and applying it to predict the potential distribution of invasive species. “If their analysis holds … they'll have the ability to predict where species will and won't occur as habitat changes,” says Zink. “[They could] forecast the fate of species”—a valuable power indeed.

  5. ANIMAL TESTING

    One Mouse's Meat Is Another One's Poison

    1. Laura Helmuth

    Just as government labs are gearing up for a major campaign to ferret out industrial chemicals and pollutants that mimic sex hormones, scientists have discovered that some of their favorite test subjects—lab mice—vary greatly from strain to strain in their sensitivity to the hormone estrogen. According to a report on page 1259, estrogen injected into young male mice sharply curtails testis growth and sperm production in some strains, while leaving a widely used lab strain—designated CD-1—essentially unperturbed.

    The findings raise concern about whether current animal tests adequately gauge the human health risks that hormone mimics may pose. The new findings “may have significant implications,” says reproductive biologist Earl Gray of the Environmental Protection Agency's (EPA's) health effects lab at Research Triangle Park, North Carolina. CD-1 has been the mouse of choice, he says, for studying hormone mimics. If CD-1 mice prove insensitive to such compounds, adds reproductive biologist Frederick vom Saal of the University of Missouri, Columbia, then “these are the last animals you'd want to use” for testing. Researchers caution, however, that they have not yet determined whether the animals show the same range of sensitivity to hormonelike chemicals as they do to the real McCoy.

    Jimmy Spearow, a reproductive geneticist at the University of California, Davis, says he first uncovered strain-to-strain differences in hormone sensitivity in the late 1980s. Then a few years ago, he got a surprise when reading papers on the physiological effects of hormonelike chemicals in mice. “I said, ‘Oh my God, they're using the most resistant strains!’” Spearow recalls. To probe further, he and colleagues implanted estrogen plugs under the skin of juvenile male mice from four strains. The capsules released doses of 17β-estradiol ranging from 0.2 to 2.0 micrograms per gram of body weight for 3 weeks.

    At the lowest dose, mice from the most sensitive strain, B6, developed testes that weighed 60% less than those of control animals. In CD-1 mice, however, the same dose reduced average testis weight by just 10%. Similarly, the numbers of maturing sperm dropped precipitously at the lowest doses in sensitive strains. CD-1 mice implanted with twice the amount of estrogen necessary to stop sperm production in other strains still produced roughly 90% of normal levels of maturing sperm. Overall, the researchers report, other strains are about 16 times more sensitive to estrogen than CD-1 mice.

    The findings are intriguing in part because CD-1 mice have been bred to produce large litters. Spearow speculates that the physiological factors responsible for fruitful parenthood have rendered the mice relatively impervious to outside estrogens. Indeed, after breeding two strains—S15/Jls and C17/Jls—over some 70 generations, the one Spearow selected for large litter size, S15/Jls, was less sensitive to estrogen than was the other strain. “It's a biologically plausible hypothesis,” says Gray, who thinks breeding for litter size “is certainly going to affect the reproductive system somewhere.”

    The new data suggest to vom Saal at least that “the risk assessment process substantially underestimates variability in animal populations.” Wild variation could undermine the fudge factor built into animal tests to protect human health: When setting safe exposure levels for people, researchers take the dose found to be safe in lab animals and divide it by a factor of 10—to account for variability from person to person—before setting a permissible exposure level. But the gulf between CD-1 and the most estrogen-sensitive strains is so great, vom Saal says, that it overwhelms the safety factor. “If mice have a substantial potential [genetic] variation in response to estrogen,” he asks, “why shouldn't one assume an equal amount of variability in response to hormones in humans?”
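
    A back-of-envelope check makes vom Saal's point concrete. Only the 16-fold and 10-fold figures come from the article; the regulatory procedure is simplified and the dose units are arbitrary.

```python
# Back-of-envelope check of vom Saal's argument, using only the figures
# quoted above; the regulatory procedure is simplified.

strain_sensitivity_ratio = 16   # sensitive strains vs. CD-1, per the study
human_safety_factor = 10        # divisor applied to the animal-safe dose

safe_dose_cd1 = 1.0                                            # arbitrary units
permitted_exposure = safe_dose_cd1 / human_safety_factor       # 0.10
sensitive_strain_threshold = safe_dose_cd1 / strain_sensitivity_ratio  # ~0.06

# The permitted exposure still exceeds the dose that affects the most
# sensitive strain, so a 16-fold spread swallows the 10-fold margin.
print(permitted_exposure > sensitive_strain_threshold)  # True
```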

    Regulatory agencies plan to take no chances. The EPA is now standardizing a test battery and screening procedures that its labs, starting in about 2 years, by law must use to evaluate tens of thousands of chemicals for hormonelike activity. Says Gary Timm, senior technical adviser in EPA's office of science coordination and policy, “Certainly we're interested in using the most sensitive species or strains.”

  6. ASTRONOMY

    A World With Two Suns

    1. Mark Sincell*
    1. Mark Sincell is a writer in Houston.

    Buried in the unusual twinkling of a star near the center of our galaxy, an international team of astronomers has uncovered the first evidence of a planet orbiting two stars at once. If other observers can confirm it, the Jupiter-sized planet will be more than an astronomical novelty. Its detection, which the Microlensing Planet Search collaboration describes in a paper posted on the e-print server at Los Alamos National Laboratory (xxx.lanl.gov/abs/astro-ph/9908038), will become a triumph for a new and potentially powerful technique for finding planets around other stars.

    Extrasolar planets are too dim to be seen directly. The several dozen detected so far have betrayed themselves by tugging their parent stars back and forth, creating a small, oscillating signal in the starlight. That technique, however, is sensitive only to massive planets—otherworldly Jupiters. But Einstein's General Theory of Relativity points to another, potentially more sensitive, way to find planets. Every concentration of mass—star or planet—bends the four-dimensional fabric of space-time. “It's like putting a bowling ball on a mattress,” explains co-author Andrew Becker, an astronomer at the University of Washington, Seattle. Just as the path of a marble curves as it rolls past the bowling ball, he says, light itself bends when it passes near a star or planet.

    This effect comes into play when a dim star in our galaxy passes almost directly between Earth and a second star, something that happens “about once per star per few hundred thousand years,” estimates astronomer Penny Sackett of the Kapteyn Institute in the Netherlands. The gravitational field of the intervening “lens” star bends and magnifies light from the background star, a process called gravitational microlensing. By observing tens of millions of different stars every night, several search teams are now regularly observing this telltale stellar brightening, which reveals clues to the nature of the intervening lens object.
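
    For readers who want the textbook version, the brightening behind a single point lens follows a standard formula. This is standard microlensing background, not material from the team's paper.

```latex
% Standard point-lens magnification: u is the lens-source separation in
% units of the Einstein angle \theta_E; D_L and D_S are the distances
% to the lens and the source.
\[
A(u) = \frac{u^{2}+2}{u\sqrt{u^{2}+4}}, \qquad
\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_S - D_L}{D_L D_S}} .
\]
% As the lens drifts across the line of sight, u(t) shrinks and grows,
% giving the smooth brighten-then-fade light curve of a single star;
% binaries and planets imprint extra structure on this profile.
```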

    When a single star acts as the lens, the background star simply brightens and fades, but a binary star has an irregular gravitational field that creates a pattern of bright and dark regions, “kind of like the bright lines at the bottom of a pool” in sunlight, says Becker. Such complicated twinkling “has been observed many times in the past several years,” says Sackett, “and in each previous case, a good model was produced with a simple two-lens system.” And then along came MACHO-97-BLG-41, which had an exceptionally odd twinkle.

    Within 10 days of its discovery on 19 June 1997, by the Massive Compact Halo Object (MACHO) survey team, astronomers realized that the flickering of MACHO-97-BLG-41 was too complex to result from a single-star lens. The Microlensing Planet Search collaboration began monitoring the 100-day-long event in collaboration with MACHO and the Global Microlensing Alert Network. At first astronomers thought they might have detected a planet around a single star (Science, 8 August 1997, p. 765). But they now report that accounting for this star's fluctuating light requires a binary system and a third object—a planet weighing three times as much as Jupiter and orbiting at about 7 astronomical units (1 AU is equal to the separation of Earth and the sun) from the two stars. “We have several hundred data points,” says Becker, “and they constrain this model very well.”

    “It's a pretty nifty result,” says University of California, Berkeley, astronomer and fellow planet-hunter Geoff Marcy, “but the architecture of the system is quite odd,” which makes him wonder whether the planet is real. He and other astronomers say the spinning binary, with its stars just 1.8 AU apart, could create “gravitational chaos” near the hypothesized planet, perhaps flinging it from the system. But Harvard astrophysicist Matt Holman, who has used a computer to simulate how a planet around a binary star would behave over millions of orbits, says that the planet is “most likely stable. … The planet is far enough away that it should feel the two stars as if they were one.”

    All the same, astronomers would like independent confirmation. A second team, the PLANET collaboration, also monitored MACHO-97-BLG-41 but is still analyzing the data. “We have a lot of extremely good data, and we see quite a bit of structure,” says Space Telescope Science Institute astronomer Kailash Sahu, a PLANET member, “but we don't yet have a complete solution.”

    If their analysis confirms the planet, “it will add an exclamation point to the suspicion that planets can orbit outside binary stars,” says Marcy. And although this planet is much larger than our own, he says, its discovery would be a boost for a technique that may ultimately detect another Earth.

  7. CHINA SPY INQUIRY

    DOE Blames Three at Los Alamos Lab

    1. David Malakoff

    Department of Energy (DOE) Secretary Bill Richardson last week recommended that the former director of Los Alamos National Laboratory and two other officials be disciplined for bungling a spy investigation. The move has prompted speculation that Richardson is looking to deflect blame from DOE headquarters, and it has placed a spotlight on the lab's contractor, the University of California (UC), which must formally discipline the officials.

    Richardson handed down his verdicts on 12 August, declaring that a new report on the incident by the department's Inspector General “makes it clear that DOE political and career management failed to give necessary attention to counterintelligence and security. … There was a total breakdown in the system and there's plenty of blame to go around.” Congressional investigations and internal DOE reviews revealed that a 3-year investigation into suspected espionage at the lab, which focused on computer scientist Wen Ho Lee, was crippled by a host of errors. They include a failure to look closely at suspects other than Lee and communication lapses that allowed the Taiwan-born researcher to have access to classified information for months after he was supposed to be moved to a less sensitive position. Lee, who was fired in March, has denied the spying allegations and has been charged with no crime.

    Richardson did not name names in releasing excerpts from the classified report (home.doe.gov/news/releases99/augpr/pr99213.htm) but said that 19 people at Los Alamos and DOE's Washington headquarters played roles in the Lee investigation. In only three cases, however, was the evidence of mismanagement “sufficiently strong” to warrant disciplinary action. Congressional sources told Science and other media that the trio fingered by the report consisted of metallurgist Siegfried Hecker, who directed Los Alamos from 1986 to 1997 and still works there as a scientist; Robert Vrooman, a retired chief of counterintelligence at the lab who now works for an outside contractor; and Terry Craig, another former counterintelligence officer still employed at the lab.

    Hecker is accused of “failing to follow through on” a DOE request to limit Lee's access to classified information, while Vrooman reportedly allowed Lee to remain in a sensitive position even after Federal Bureau of Investigation agents told him in October 1997 that the suspect could be moved without endangering the investigation. Craig was cited for botching a file search for a waiver that Lee had signed that would have allowed agents to search his desktop computer. While Hecker has not commented on the charges, Vrooman told The Washington Post this week that the case against Lee “was built on air” and that Lee was targeted largely because of his race.

    As Science went to press, Los Alamos lab director John Browne had not yet decided on the punishments. UC officials say it could be a week or more before they fully review the classified report, which UC President Richard Atkinson discussed with Richardson during a 10 August meeting in Washington.

    Some of Hecker's colleagues at Los Alamos say their former director is being asked to shoulder more than his share of the blame. Noticeably absent from Richardson's hit list, says one weapons researcher who asked not to be identified, were any DOE headquarters staff. “This makes it look like all the problems were out here, and nobody in D.C. made any mistakes,” he says.

  8. PHYSICS

    Making the Stuff of the Big Bang

    1. David Voss

    Starting this fall, a machine called RHIC will collide gold nuclei with such force that they will melt into their primordial building blocks

    BROOKHAVEN NATIONAL LABORATORY—If all goes well later this year, physicists here in the pine barrens of eastern Long Island will create miniature copies of the big bang by smashing together the bare nuclei of gold atoms traveling at nearly the speed of light. Reaching temperatures a billion times hotter than the surface of the sun, the protons and neutrons in the nuclei will melt into their bizarre building blocks: quarks and the gluons that hold them together. Out of this inferno will come an exotic form of matter called quark-gluon plasma, primordial stuff that may have been the genesis of all the normal matter we see around us. And most profoundly, the very vacuum, what we think of as empty space, will be ripped apart, revealing its underlying fabric.

    Brookhaven's subatomic demolition derby will take place at RHIC, the Relativistic Heavy Ion Collider, a superconducting racetrack in a tunnel 3.8 kilometers around. A decade in the making, costing $600 million, RHIC is the first collider designed specifically to create and detect this primordial soup. This racetrack is itself in a race, however. CERN, the European particle physics laboratory near Geneva, hopes to collide heavy nuclei to create the same quark concoction sometime around 2005 in the Large Hadron Collider, a giant accelerator now under construction. Already, one of CERN's existing accelerators may have spotted hints of the quark-gluon plasma (QGP). The traces are fleeting, however, and physicists hunger for confirmation. At RHIC, says director Satoshi Ozaki, “we can reach a completely new range of temperature and density,” where the signs of quark-gluon plasma should be unmistakable.

    Just this past June, technicians at RHIC started flowing liquid helium through the accelerator's superconducting magnets for the “big cooldown” and testing the machine with beams of gold nuclei. By this fall, provided the inevitable gremlins have been shooed away, the search for quark plasma will begin in earnest. It won't be easy. Each collision will last for only 10⁻²³ second and emit thousands of particles of nuclear debris, from which physicists will try to pick out the subtle signatures expected of QGP. But if captured, QGP would be a new and exciting plaything for physicists, holding clues to how quarks and gluons bind together in normal matter, and how the stuff around us first took shape.

    Now you see them

    RHIC physicist Bill Zajc likes to say that he and his colleagues are replaying the first moments of creation. “The first attempt to create this quark-gluon plasma was successful, about 10 billion years ago,” he says. It lasted just 10 microseconds after the big bang; then, as the universe expanded and chilled, the quarks paired up to form particles called mesons and combined in threes to form the protons and neutrons of everyday stuff. They have rarely emerged since—and never for earthly physicists. Try as they might, physicists have never been able to coax a naked quark into the spotlight.

    The theory developed to explain how quarks interact via the strong force, quantum chromodynamics (QCD), accounts for this behavior. Just as electromagnetic theory endows particles with electrical charge, QCD describes quarks as having a charge whimsically called “color,” either red, green, or blue. And just as electrically charged particles attract or repel each other by exchanging photons, quarks interact by tossing gluons back and forth.

    When quarks are close together in “hadrons” (mesons, protons, and neutrons), they almost totally ignore each other. But try to separate them by any distance, and the energy stored in the gluon field between them grows without limit, much as a piece of string that is limp when its ends are close together goes taut when it is stretched. If stressed too far, the string snaps. Likewise, as quarks are pulled apart the energy between them rises until something snaps, and the energy is transformed into new quarks and antiquarks. A lone quark quickly couples with the new quarks to form regular matter again, so that single quarks can never be taken captive. But QCD also predicts that at the temperatures and densities that existed just after the big bang, quarks and gluons become “deconfined”—released from their imprisonment and free to move around in a kind of quarky broth.
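
    In quantitative terms, this string picture is usually summarized by the standard “Cornell” quark-antiquark potential, textbook background that the article does not spell out.

```latex
% The string picture in quantitative form: the standard Cornell
% quark-antiquark potential (textbook background, not from the article).
\[
V(r) \approx -\frac{4}{3}\,\frac{\alpha_s \hbar c}{r} + \kappa r,
\qquad \kappa \approx 1~\mathrm{GeV/fm}.
\]
% The linear term \kappa r is the "string": the stored energy grows
% without bound as r increases, until quark-antiquark pair creation
% snaps it. Above the deconfinement temperature the linear term
% effectively melts away, freeing the quarks.
```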

    So run the movie backward, say the researchers at RHIC. Slam together lumps of quark-rich matter, such as heavy nuclei, with enough energy to heat the nuclei way past where atoms are ionized, even beyond where the nuclei themselves break apart into protons and neutrons. Crank it up to a trillion kelvin, where quarks and gluons become free again.
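
    A quick unit check shows why a trillion kelvin is the right ballpark; the conversion is standard, and the trillion-kelvin figure is the article's.

```latex
% Converting the article's "trillion kelvin" into particle-physics units:
\[
k_B T = \left(8.617\times10^{-5}~\mathrm{eV/K}\right)\times 10^{12}~\mathrm{K}
\approx 86~\mathrm{MeV},
\]
% within a factor of two of the roughly 150-200 MeV scale at which
% QCD calculations put deconfinement, hence "a trillion kelvin".
```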

    Physicists studying heavy-ion collisions at CERN say they may already have achieved this feat. CERN's venerable Super Proton Synchrotron (SPS) has been circulating beams of lead nuclei and smashing them together within a detector called NA50, which is tailor-made to detect the so-called J/psi mesons that flee the scene of a collision between heavy nuclei. The behavior of these ghostly hit-and-run mesons is leading researchers at SPS to claim a sighting of QGP.

    In nuclear collisions, the pure energy of the interaction can produce many pairs of quarks and antiquarks. Often, these matter-antimatter partners just annihilate each other, but occasionally they get locked in a sort of particle tango. They can dance away from the collision, separating and then rejoining their original partners to form a new particle —a meson—that survives just a moment. J/psi particles are exactly this kind of particle pas de deux executed by charm quarks, one of the six quark flavors, and their matching antiparticles.

    When a QGP forms, theorists predict that the immense energies in the collision volume should create quark-antiquark pairs of every flavor within energetic reach—up, down, and strange, along with some charm—in large numbers, a process called “nuclear democracy.” Now conditions resemble those in a crowded ballroom. With a huge number of pairs dancing close together, the separating dancers are likely to bump into someone who is not their original partner and whirl away as a new pair. Similarly, any charm quarks produced in the collision have many more partners to choose from; they don't have to run away with an anticharm partner, which causes the number of J/psi mesons to drop. “I am convinced that J/psi suppression is the gold-plated signature for QGP detection,” says NA50 spokesperson Louis Kluberg.

    He and his colleagues presented their latest results at the Quark Matter conference this past May in Torino, Italy. The data show that J/psi particle production in the energetic lead-lead collisions drops far below the level seen in collisions of lighter nuclei like oxygen, sulfur, and hydrogen. CERN's collegial competitors at Brookhaven are intrigued by the findings but aren't completely sold. “It is tantalizingly close to a real effect,” says RHIC associate project director Tom Ludlam, “but you cannot really make the statement yet” that CERN researchers have produced QGP.

    Even if they have created QGP, this fleeting glimpse provides little information about this exotic state of matter. Among other things, physicists would like to know what kind of phase transition separates ordinary matter from QGP. The appearance of the plasma could either be a first-order transition—a sharp change like water turning into steam—or a continuous second-order transition with no sharp change in properties. The answer has deep implications for theories of how the newborn universe cooled after the big bang and for understanding QCD. As theorist Frank Wilczek of the Institute for Advanced Study in Princeton, New Jersey, puts it, “We know QCD is correct, and it gets boring to prove that. … We can play the notes, and now we want to play some chords.”

    One chord might be heard during “freezeout” of the QGP, when the excess strange quarks populating the short-lived nuclear democracy might coalesce into “strangelets,” tiny clumps of quark matter rich in strange quarks. Another might come when the droplet of free quarks and gluons created in each collision interacts with the surrounding vacuum, which QCD pictures not as empty space but as a sea of “virtual” quark pairs that wink into and out of existence. Doing all that will take a machine capable of creating generous quantities of QGP, equipped with instruments that can go well beyond detecting J/psi suppression to pick up a dozen other subtler signals of the quark plasma's life and death. That's where RHIC comes in.

    Ring of ice

    Almost miraculously, the $600 million RHIC project is nearing completion on schedule and on budget, a feat the rank and file attribute to director Ozaki's deft management. Not only has he juggled the construction of a massive piece of scientific instrumentation, but he's had to run a kind of United Nations of physicists. The list of collaborators fills an entire viewgraph with fine print: more than 800 scientists from 19 countries, including India and Croatia.

    For Brookhaven it is sweet compensation for perhaps the most painful episode in the lab's history, the cancellation of ISABELLE, a huge proton collider intended as the successor to the Tevatron accelerator at the Fermi National Accelerator Laboratory in Illinois. ISABELLE's builders broke ground in 1979, only to see it killed in 1983. Some say the machine's superconducting magnets never did work right, others that ISABELLE got the ax to open the way for an even bigger but equally ill-starred machine: the Superconducting Super Collider. Whatever the reasons, the cancellation left ISABELLE's massive concrete tunnel empty except for the occasional jogger or Rollerblading physicist on lunch break.

    Coincidentally, at about this time a separate tribe of scientists—the nuclear physicists—concluded that a heavy-ion accelerator was a high priority for studying how the atomic nucleus is put together. In 1991, the U.S. Department of Energy (DOE) chose Brookhaven as the site for RHIC, and the empty ISABELLE tunnels, just north of the main campus, offered a natural home.

    The Alternating Gradient Synchrotron, a small, existing heavy-ion accelerator, will be put to work as a booster stage, generating ion beams for injection into RHIC's two counterpropagating rings, nestled close together in the tunnel. The beams will contain needlelike bunches of gold nuclei, each about 100 micrometers in diameter and less than a meter long. At six points around the ring, strong magnets will steer the beams so that they cross and collide, yielding something like 1000 gold-on-gold collisions every second at energies of some 200 billion electron volts per proton or neutron. Temperatures will reach a trillion degrees. Under those conditions, say theorists, the colliding nuclei should explode into an almost pure quark-gluon plasma.

    They will do so in full view of four detectors, designed to collect as many different kinds of information as possible about the collisions to prove that QGP exists. “It's a funny business,” says RHIC researcher Tim Hallman. “Usually in particle physics, people are looking for one decay, and they've designed a whole experiment around it. In our case there are lots of different signatures but no one thing that sticks out that is unambiguous. So we're in the business of measuring many different signals and correlating them to give an airtight case.”

    Two of the detectors, STAR and PHENIX, are the kind of grand-scale hardware normally associated with a place like CERN or Fermilab. Each costing $100 million, they are massive steel skeletons supporting a fine filigree of sensors, wiring, and high-speed optical fiber. To illustrate the capabilities of STAR (the Solenoidal Tracker at RHIC), Hallman, the detector's spokesperson, holds up a simulated collision that looks like a fireworks display on steroids. “STAR is a tour de force in keeping track of particle trajectories,” he says.

    As particles zing through STAR's central chamber, basically a giant can of gas with high-voltage electrodes spidering through it, they will leave trails of ionized molecules. Big, charged collecting plates at each end of the STAR chamber will suck up the ionized gas, carefully recording how much ionization is collected over time. By projecting this record back into space coordinates, a computer can reconstruct the path of the particles in the three-dimensional volume of the detector—thousands from every collision.
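
    In outline, the reconstruction amounts to trading drift time for position, roughly as in this sketch. The drift velocity and geometry below are invented round numbers for illustration, not STAR's real calibration.

```python
# Simplified sketch of how a time-projection chamber turns drift times
# into 3D points; the constants are illustrative assumptions.

DRIFT_VELOCITY_CM_PER_US = 5.5   # cm per microsecond, typical TPC gas
ENDCAP_Z_CM = 200.0              # pad plane distance from chamber center

def hit_to_point(pad_x_cm, pad_y_cm, drift_time_us, west_end=True):
    """The readout pads give x and y directly; the coordinate along the
    beam comes from how long the ionization electrons drifted before
    reaching the endcap."""
    z = ENDCAP_Z_CM - DRIFT_VELOCITY_CM_PER_US * drift_time_us
    return (pad_x_cm, pad_y_cm, z if west_end else -z)

# Thousands of such points per collision are then fit into curved tracks.
print(hit_to_point(12.3, -45.6, drift_time_us=20.0))  # (12.3, -45.6, 90.0)
```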

    The filigree of tracks should include telltale signs of QGP. One is a shower of unusual particles spawned in the rich quark soup. Strange critters like K mesons (kaons), lambdas, and omegas will shoot out, for example. And just as molecules in a gas scatter off their neighbors, energetic quarks zipping through the plasma would slow down as they bang into other quarks and gluons. STAR should be able to detect this attenuation with its spectrometers and so learn something about the stuff the quarks are traversing.

    PHENIX, so named “because it has risen from the ashes of three separate detector proposals,” says Zajc, the experiment's spokesperson, will track fewer particles than STAR—hundreds rather than thousands—with higher precision. It will concentrate on the lighter and more evanescent escapees—photons and leptons (electrons and muons). Because the photons and leptons are unaffected by the strong force, they can escape the dense quark matter and report back about conditions right in the thick of the collision, such as the temperature of the quark soup. Tipping the scales at about 4000 tons, “PHENIX is about the mass of a good-sized naval destroyer,” says Zajc—in part because of the 100-ton, 20-centimeter-thick steel plates that flank the collision point. The plates, which act as filters for the detectors that identify muons, are so big that few steel mills in the world could have fabricated them. Part of Russia's in-kind contribution to RHIC, they were made at a plant in St. Petersburg and sold to DOE at half price.

    RHIC also sports two smaller experiments, BRAHMS and PHOBOS, each built for a tenth the cost of STAR or PHENIX. BRAHMS (for Broad Range Hadron Magnetic Spectrometers) will measure the energy of charged hadrons flying away from the collision, another clue to the temperature and density at the very point where the nuclei are fragmenting and blowing apart. And PHOBOS is specifically designed to watch for the appearance of the plasma as the collision energy is ratcheted up. “We are looking for a phase transition to quark-gluon plasma,” explains Massachusetts Institute of Technology (MIT) physicist Wit Busza, the project spokesperson.

    Because material passing through a phase transition exhibits huge fluctuations in density, like water at a rolling boil, PHOBOS is designed to look for unusual variations in the total number of particles created in the collision. An array of relatively inexpensive silicon detectors surrounding the collision point, lithographically engineered with pixels to register particle hits, will allow it to do a gross head count of debris particles. A second component of the detector, a set of low-cost silicon spectrometers, will keep watch for any peculiar fluctuations in particle momentum as the gold nuclei get cooked into quark soup.

    These instruments will generate a torrent of data—about a petabyte (10¹⁵ bytes) every year, according to Barbara Jacak, a physicist who coordinates computing for PHENIX. “Think about the multigigabyte hard drive on your PC,” she says. “The raw data rate from RHIC would fill that in a few minutes.” To handle the particle track reconstruction and detector signal processing, RHIC will host a $7 million computing facility outfitted with a high-performance tape library and a computer farm of 1000 Linux workstations. Theorists, who are trying to wrest predictions of how the quark plasma should behave from the complex equations of QCD, are getting some major computing muscle too: a 600 gigaflops supercomputer, one of the most powerful outside military and corporate labs, that is the product of a collaboration between Brookhaven and the RIKEN research center in Japan.
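
    Jacak's comparison holds up with round numbers; the 4 GB drive size below stands in for a “multigigabyte” PC disk of the era.

```python
# Checking the quoted data-rate comparison with round numbers.

BYTES_PER_YEAR = 1e15                       # about a petabyte per year
SECONDS_PER_YEAR = 365 * 24 * 3600

rate = BYTES_PER_YEAR / SECONDS_PER_YEAR    # ~3.2e7 bytes/s, ~32 MB/s
drive = 4e9                                 # bytes in an assumed 4 GB drive
minutes_to_fill = drive / rate / 60         # ~2.1 minutes

print(f"{rate/1e6:.0f} MB/s; a 4 GB drive fills in {minutes_to_fill:.1f} min")
```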

    Set your pion lasers on stun

    “RHIC was built with the specific purpose of looking for the quark-gluon phase transition,” says Ozaki, “but once you build a machine of this size, you want to do other things too.” One is searching for the basis of proton “spin,” a quantum mechanical property that causes particles to act like tiny magnets. Ozaki convinced physicists at RIKEN to kick in about $20 million to outfit RHIC for experiments in which the gold beams will be replaced with protons. The protons will be polarized, so that each little nuclear magnet is lined up in one direction. When the protons collide, the pattern of debris should hold clues about how much of the proton's spin comes from quarks and how much from gluons.

    Many earlier experiments have studied nuclear spin, but RHIC will be a pioneer in probing another mystery: the vacuum and the sea of short-lived up and down quarks and antiquarks thought to fill it. Because pairs of quarks and antiquarks make up pi mesons, or pions, physicists refer to this sea as a pion condensate. But quarks can pair up in various ways, and theorists believe that in the vacuum as it exists today, they are paired up in only one of many equally likely arrangements.

    In the early stages of the big bang, however, the up, down, antiup, and antidown quarks in the vacuum flitted about, refusing to adopt any fixed pairing. Then, as the universe cooled, the condensing quarks had to pick a particular arrangement, or orientation, and stick with it. Magnets offer a good analogy, says Wilczek: “When you raise the temperature of a magnet above a temperature called the Curie point, all the spins become randomized and the magnetism disappears.” And when the material cools back down below the Curie temperature, the spins all line up in some direction—the symmetry of the material is broken.

    If the magnet is cooled in an externally imposed magnetic field, the spins would probably lock together in a direction different from the external field. After a time, though, the spins might all suddenly jump to align with the external field. When they did, the system would release coherent waves of spin energy called magnons.

    Like the magnet, theorists predict, the hot quark matter created in RHIC will shun any particular arrangement, but as it cools, it should lock into a specific mixture. This mix is unlikely to be identical to the mix in the surrounding cool vacuum, however, and Wilczek and Krishna Rajagopal (now at MIT) have predicted that eventually that little smudge of disoriented condensate will suddenly reorient, releasing energy in the form of pions moving in lockstep, like the photons in a laser. RHIC researchers hope to catch a glimpse of this “pion laser,” in the form of unusual ratios of neutral to charged pions. Even if no pion laser technology is likely to come out of such experiments, the signal would carry a profound message about the physical nature of the vacuum.

    But as Dan Beavis, project manager for BRAHMS, puts it, “I don't know how kind nature will be to us.” For RHIC is entering largely unknown territory. Predictions that some kind of transition to quark-gluon plasma will take place are so strong that everyone expects RHIC to see something. But beyond that, “theoretical guidance has been diffuse,” says Zajc. “So experimentalists are in the driver's seat on this one.”

  9. PHYSICS

    RHIC Physicists Go to Media School

    1. David Voss

    Last month, the tiny big bangs Brookhaven National Laboratory (BNL) researchers hope to ignite this fall in the Relativistic Heavy Ion Collider (RHIC) caused a flutter outside the laboratory. The Sunday Times of London ran a story with the headline “Big Bang machine could destroy Earth,” based on speculations that the atom-scale collisions within the accelerator could create clumps of “strange” matter that might catastrophically transform normal matter into something, well, strange—or make black holes that would suck up the planet. Already unsettled by reports of chemical and radioactive leaks from Brookhaven facilities, Long Island residents feared the worst—and Brookhaven caught another glimpse of public-relations Armageddon.

    E-mails from alarmed residents began flying around when the Times story appeared, according to lawyer Scott Cullen of Standing for Truth About Radiation (STAR), a local organization that has been a steady critic of Brookhaven's environmental record. Most physicists discount the disaster scenarios. As lab spokesperson Mona Rowe puts it, the energy of the collisions is like that of a “mosquito hitting a screen door.” But these days Brookhaven officials, stung by past public-relations debacles, realize they can't simply brush off public fears. “Whenever any scientist suggests a disaster scenario, we are obligated to take it seriously enough to decide if there is an issue,” says lab director John Marburger.

    In the past, BNL has provided plenty of ammunition to its critics, who include celebrity environmental activists summering nearby at the Hamptons. The lab was declared an Environmental Protection Agency Superfund site in 1989 owing to a history of contamination going back to its early days as an Army camp. Tritium leaks from the High Flux Beam Reactor, a flagship source of neutrons for research, were a public-relations disaster, and the reactor has been padlocked since January 1997. Brookhaven's Graphite Reactor, built in the lab's early days and decommissioned in 1969, has turned out to be a Pandora's box of leaking radioactivity and bad publicity. In May 1997, Associated Universities, the contractor that ran the lab for the Department of Energy, was fired; it was replaced in November 1997 with Brookhaven Science Associates. Stringent new regulations have been put into place at BNL, and researchers now attend mandatory seminars in environmental health and safety.

    Brookhaven staffers are now reaching out to the public, speaking at local schools and community events. The lab promptly countered the London Sunday Times story with a statement posted on the Web (Science, 23 July, p. 505), and Rowe says that several RHIC physicists have taken public speaking courses to prepare them for dealing with the public and the media. RHIC director Satoshi Ozaki says even seemingly small things can make a big difference in mending fences with the facility's neighbors. “One of the people living nearby complained about the noise from one of our compressors,” he says, “and I insisted that it be fixed by the end of the day. At the next town meeting, that same person said how delighted she was that the problem had been fixed,” notes Ozaki. “So we had gained a friend.”

    The legacy of mistrust will be hard to overcome, however. A survey the laboratory conducted late last year showed that a full 60% of the 766 residents polled still felt that Brookhaven was not doing enough to keep the community informed about its activities. And organizations like STAR are still convinced that the lab poses serious dangers and that the environmental impact review for RHIC has been inadequate. The outreach effort “has been mostly a PR shift,” says Cullen. “A lot of times they're willing to sit down and talk, but whether they are willing to actually do anything is another matter.”

  10. ECOLOGICAL SOCIETY OF AMERICA MEETING

    Getting to the Roots of Carbon Loss, Chili's Gain

    1. Jocelyn Kaiser

    SPOKANE, WASHINGTON—Some 3500 ecologists gathered here from 8 to 12 August for the annual meeting of the Ecological Society of America. Discussions ranged from the link between ancient deforestation and greenhouse warming to a sizzling ecological explanation for why peppers are hot.

    Chili Idea Gets Warm Reception

    From red-hot chicken wings to kimchi, many spicy foods get their sinus-clearing zing from hot peppers. For some ecologists, chili peppers have posed a burning question: Why are they hot? The fruit owes its spiciness to a chemical called capsaicin, and plants generally put energy into producing such toxins for a reason, such as to deter enemies. At the meeting, a researcher offered intriguing evidence suggesting that chilies wield their sting with the precision of a stiletto: to target seed-chewing mammals while leaving birds unscathed. Birds swallow the seeds but don't digest them, essentially acting as vessels for carrying chilies to new turf.

    The idea that plants make chemicals to deter predators isn't new, and many plants rely on animals to spread their seeds. But this may be the first well-documented instance of a plant using a chemical selectively, to repel some animals without affecting others that boost its chances to reproduce. “That seems to be the first very good case where you can poison your enemies but not your [friends],” says ecologist Judith Bronstein of the University of Arizona, Tucson. “It's a great story.”

    The story begins in South America, where chilies and their relatives—from jalapeños to poblanos and bell peppers—originated. With the exception of stoic humans, mammals don't care for capsaicin, which stimulates neurons that sense pain. That's why the chemical is a key ingredient in pepper spray and a growing number of other concoctions for warding off belligerent backwoods bears and backyard rodent pests. Birds, by contrast, seem to be impervious to capsaicin, apparently because the relevant receptor—an ion channel on their mucous membranes—has a different shape. Capsaicin binds to the ion channel and causes it to open, allowing ions to flow in and trigger nerve impulses the brain interprets as pain. Gary Nabhan, a plant ecologist at the Arizona-Sonora Desert Museum in Tucson, and grad student Joshua Tewksbury of the University of Montana, Missoula, speculated that the plants learned, so to speak, to exploit capsaicin's ability to set a mammal's tongue, but not a bird's, afire.

    To find out which wildlife eats chilies (Capsicum annuum var. aviculare), Tewksbury studied 150 hours of videotape, collected last summer, of around 25 chili bushes in 2500 hectares of mesquite shrubland in southern Arizona—the northern edge of the plant's natural range—which the U.S. Forest Service last spring designated a protected chili reserve. He found that the pea-sized fruits were eaten only by birds, mainly the curve-billed thrasher. The chilies were ignored by desert packrats and cactus mice foraging in adjacent bushes for hackberries, another red fruit that isn't spicy.

    Put into cages, five packrats and five mice turned up their noses at chilies, although they readily ate an altered version that wasn't spicy. But the seeds that passed through the rodents and showed up in their feces failed to germinate. The same kind of nonpungent seeds fed to birds germinated just as well after being expelled as those planted by hand. And spicy seeds eaten by birds germinated three times more often, although the researchers don't know why. Capsaicin also acts as a laxative in birds, helping them spread the seeds, Tewksbury says.

    That's not the only benefit chilies derive from being bird feed. Chili plants grow best in shade, as Nabhan and Tewksbury found out when they transplanted some to sunny spots. But not just any shade—the chilies usually turn up near hackberry bushes and other shrubs with small fruits dispersed by birds. Being clustered with other fruit-bearing shrubs lures more birds, which ensures that more of the chili fruit gets eaten, the researchers found.

    The duo hasn't yet tied up all the loose ends. For one, they plan to try the same experiments in Bolivia, where chilies first evolved, to see if the same patterns show up there. But Tewksbury thinks he and Nabhan can fairly conclude that for Arizona's wild chilies at least, “there's a strong benefit to being hot.”

    The North's Voracious Carbon Past

    Burned prodigiously since the industrial revolution, fossil fuels are the main source of the rising levels of carbon dioxide in the atmosphere in recent decades. But land conversion—logging, burning, and plowing over natural greenery—is also a big factor, contributing about 20% of the annual CO2 surfeit from human activity, mostly from slash-and-burn agriculture in tropical forests. The tropics, however, weren't always the only hot spots for forest destruction. A new analysis presented at the meeting suggests that vegetation razed by people throughout history, largely in northern countries, pumped an enormous amount of carbon into the air—on par with the billions of tons added by burning fuels since.

    The numbers drive home the fact that civilizations began spewing out greenhouse gases centuries ago and underscore the importance of land cover—as both a source and sink for carbon—in discussions today of how to implement the Kyoto Treaty, which is supposed to limit CO2 emissions. “People are so fixated on fossil fuel emissions. They aren't putting land-cover conversion into the equation as well,” says remote imaging expert Steven Running of the University of Montana, Missoula.

    The study, by geographer Ruth DeFries of the University of Maryland, College Park, and ecologist Chris Field of the Carnegie Institution of Washington in Stanford, California, is the first attempt to sum up all the carbon added to the atmosphere since people began to leave a significant mark on Earth's vegetation, chopping down forests, turning savanna into desert, and converting woods to farmland or, eventually, into parking lots. The researchers started out with a map of the world's potential vegetation at the dawn of civilization—based on factors like climate and soil type—and simulated a satellite map to compare with actual satellite maps of global vegetation today.

    Then, to estimate how much carbon was released as lands were altered, the duo and their collaborators plugged the before-and-after satellite maps into a computer model that converted the greenness of each area into how much carbon was stored in that landscape, be it tundra, grassland, or deciduous forest. The result: Over the course of civilization, land-use changes have liberated about 185 petagrams of carbon—an amount equal to about 75% of that released by burning fossil fuels, the researchers report in a paper in press at Global Biogeochemical Cycles. About one-third of that carbon was released before 1850, judging from a different research group's estimate of emissions since then. “It's a very reasonable” conclusion, says Running, who adds that the number may be even higher according to his own calculated map of prehistorical vegetation.
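
    The magnitudes are easy to check from the figures given (simple arithmetic on the article's numbers, not an independent estimate): if the 185 petagrams from land conversion equal about 75% of cumulative fossil fuel emissions, then fossil burning has released roughly

    $$C_{\text{fossil}} \approx \frac{185\ \text{Pg}}{0.75} \approx 250\ \text{Pg},$$

    and the one-third of land-use carbon released before 1850 comes to about $185/3 \approx 60$ petagrams, nearly all of it emitted before fossil fuels counted for much.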

    This dubious legacy of land destruction is, ironically, a boon today: In the United States, for example, regrowth of forests is sucking up a good chunk of the country's fossil fuel emissions (Science, 23 July, p. 544). Nowadays it's the tropics where land use is pumping out carbon instead of socking it away. But Field says that's no reason for northerners to gloat: “That doesn't make [today's reforestation] a virtuous thing.”

    Knowing how much carbon has been lost by altering landscapes also “sets the outer boundary of what reforestation is possible,” says Field, because “in concept, the [losses] could be reversed” if in some cases countries can let farmers' fields grow back to their natural state.

  11. HYDROLOGY

    Scarcity of Rain, Stream Gages Threatens Forecasts

    1. Erik Stokstad

    Hydrologists warn that the world's network of rainfall and stream gages—often a low priority in science budgets—is slowly eroding

    BIRMINGHAM, U.K.—On 1 March 1997, northern Kentucky was drenched with up to 25 centimeters of rain. The Licking River, which meanders through the town of Falmouth, rose a meter in only 3 hours and kept on rising. By evening, Falmouth's emergency siren was wailing and police were shouting evacuation orders through bullhorns. Most of the 2400 residents managed to flee, but the water came so fast, even shoving houses off their foundations, that some had to be rescued from rooftops. Four people in mobile homes drowned.

    The river had crested 4 meters higher and 6 hours earlier than the National Weather Service (NWS) had predicted. NWS officials admitted that they had underestimated the danger, but added that their forecasts had been severely hampered by the loss of a crucial gaging station 32 kilometers upriver, which was cut in a budget crisis in 1994. “It was like a flash flood,” says Mark Callahan of the NWS's Louisville office. “Without that gage, we were blind.”

    This kind of uncertainty is not unique. Around the world, the gages that measure rainfall and stream height are slowly disappearing, victims of a slow erosion in funding, according to hydrologists gathered here for the International Union of Geodesy and Geophysics from 19 to 30 July. At the meeting, some 400 hydrologists of the International Association of Hydrological Sciences (IAHS) issued a resolution calling rain and stream gages “an endangered species” and decrying “a severe decline in total quantity of data being collected worldwide.”

    That decline means that at a time when global warming may be exacerbating weather extremes and water shortages, scientists are less able to monitor water supplies, predict droughts, and forecast floods than they were 30 years ago, says John Rodda, president of the IAHS. And although remote sensing and other technologies offer new sources of climatic data, rain and stream gages remain crucial. “There really isn't any other way of finding out how much water is flowing down a river,” says Ed Johnson, NWS's director of strategic planning in Silver Spring, Maryland.

    Individual gages aren't terribly expensive when compared to, say, a satellite—new U.S. river stations cost about $35,000 to install and $10,000 a year to maintain and operate. But even in countries with robust science budgets, maintaining aging gages is often low priority, especially when the weather's good. “If you go too long without a flood, people tend to lose awareness of the risk,” says Duncan Reed, a modeler at the Institute of Hydrology in Wallingford, U.K. “They ask ‘Why are we spending this money?’” Yet scientists need decades of continuous data to predict extreme events such as floods or drought, says Reed. Compounding the problem is the fact that the most critical gages, such as those that monitor rainfall or snowpack in mountains, are often remote and expensive to maintain; therefore they tend to be shut down first. When that happens, says Rodda, “you have the least information from the places you most need.”
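
    A back-of-the-envelope tally from the costs quoted above (illustrative arithmetic, not a figure from the researchers) makes the sums concrete: a single new river station kept running for, say, 30 years of the continuous record forecasters need would cost about

    $$\$35{,}000 + 30 \times \$10{,}000 = \$335{,}000,$$

    or roughly \$11,000 a year averaged over its lifetime, a pittance beside a satellite mission yet enough to fall victim to a lean budget year.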

    Many of the countries whose hydrological networks are in the worst condition are those with the most pressing water needs. A 1991 United Nations survey of hydrological monitoring networks showed “serious shortcomings” in sub-Saharan Africa, says Rodda. “Many stations are still there on paper,” says Arthur Askew, director of hydrology and water resources at the World Meteorological Organization (WMO) in Geneva, “but in reality they don't exist.” Even when they do, countries lack resources for maintenance. Zimbabwe has two vehicles for maintaining hydrological stations throughout the entire country, and Zambia has just one, says Rodda.

    In South Africa, although the river gaging system is intact, the number of rainfall stations has plummeted from more than 4000 to about 1700. This is due in large part to urbanization, because daily rainfall reports typically come from farmers. “People are not inclined to do this service free of charge anymore,” says Gerhardus Schulze, director of the South African Weather Bureau.

    And in countries of the former Soviet Union, the problem is decentralization. The central Soviet hydrological service once collected all rainfall measurements and other data, but the new national hydrological agencies of countries in central Asia have much smaller budgets, notes Manfred Spreafico, director of hydrology with the Swiss Bundesamt für Umwelt, Wald und Landschaft in Berne. About 90% of the stations in the Aral Sea region are broken or idle; that means “it is now much more difficult to estimate water resources than it was 20 years ago,” says Rodda.

    As for the United States, the problem is river gages, whose numbers have dropped by about 6% over the past decade. Most of these stations are paid for through collaborative agreements between the U.S. Geological Survey (USGS) and more than 800 state and local agencies, but local priorities often shift, leading to the loss of gages with long-term records, says Robert Hirsch, chief hydrologist at the USGS.

    What's more, U.S. stations that record flow on small, free-flowing rivers have dropped by 22% since 1971. These stations are funded solely by the USGS, which has seen its gage budget shrink 9% in the past decade. The loss is “disturbing,” says Hirsch, especially because these naturally flowing rivers show how streams respond to changes in land use and climate, and so are vital to climate models.

    International officials have begun to address the problem, with several initiatives already under way. The World Bank has agreed to finance 25 new stations in Uzbekistan for a total of $2.5 million. And the World Hydrological Cycle Observing System, a program launched by WMO in 1993, is establishing 50 new monitoring stations in 10 countries in southern Africa. In the United States, Hirsch says he's beginning to see signs that Congress takes the problem seriously, and he hopes that next year's budget will include a $2.5 million boost for gages.

    But even with new stations, coverage in the developing world will be sparse. And in the meantime, valuable climatic records are not being kept, says Schulze. “There is going to be a 10- to 20-year gap; it's data lost forever.” For towns like Falmouth—where the upriver gage is still out of action—the losses might be even greater.

  12. CELL BIOLOGY

    How Chromatin Changes Its Shape

    1. Michael Hagmann

    A variety of modifications affect the protein-DNA complex known as chromatin, causing it to loosen—or tighten—as needed for cell function

    Buried in the depths of the cellular nucleus, the long threads of DNA are the ultimate control center of the cell. But how the cell's many molecular players convey to the genetic material the signals it needs to determine what to do has been a mystery, because the DNA is literally balled up with histones and other proteins in a tightly wound amalgam known as chromatin. Beginning about 3 years ago, cell biologists discovered the first crack in DNA's armor. They found that when a small chemical entity called an acetyl group is glued to specific places on the histone proteins, the chromatin fiber opens up, allowing the cell's gene-reading apparatus to gain access to the genetic material (Science, 10 January 1997, p. 155). New work is now revealing other ways in which histones can be coaxed to loosen—or tighten—their grip on the DNA.

    In recent months, researchers have produced evidence that two other chemical appendages—phosphate and methyl groups—can also make the DNA more accessible when they are attached to histones, boosting the activity of specific genes. One of those modifications, histone phosphorylation, also appears to be involved in other types of chromatin remodeling, such as the chromosome condensation that takes place prior to cell division. How the phosphorylation can have such different effects—opening up the chromatin for gene expression, but condensing it for cell division—is still an open question. But the finding suggests that histone modifications may have an impact far beyond gene activation.

    “There's no reason to believe that a lot of these [chromatin-remodeling] activities won't function in other processes,” such as DNA replication, recombination, or repair, says biochemist Jerry Workman of Pennsylvania State University in University Park. Indeed, the findings point to a picture of chromatin as a kind of giant receptor complex that can pick up and integrate a variety of external and internal signals and can change its appearance accordingly—opening up to permit access or winding into tighter coils. “It's like a Morse code where the different modifications could work in various combinations to bring about a biologically meaningful response,” suggests biochemist C. David Allis of the University of Virginia, Charlottesville, whose team is at the forefront of the work.

    The two faces of histone phosphorylation

    The first clue that adding phosphate groups to histones might alter chromatin and allow genes to be expressed came in 1991 from molecular biologist Louis Mahadevan of King's College in London. When certain growth factors stimulate cells, their signals are transmitted to the nucleus through the so-called MAP (mitogen-activated protein) kinase cascade—a series of kinase enzymes, each of which activates the next kinase in line by adding phosphate groups to it. Mahadevan observed that one of the five major histone proteins, H3, is phosphorylated as a result of growth factor treatment, and speculated that this change opens up the chromatin structure and allows the activation of the growth-promoting genes. In the years since, more and more evidence that MAP kinases activate genes by phosphorylating histones has trickled in from Mahadevan, as well as from James Davie's lab at the University of Manitoba in Winnipeg. It took researchers several years, however, to piece together exactly what was happening at the very end of the cascade.

    One clue came from Allis and his colleagues. In the course of other work on histone phosphorylation, they had generated an antibody that recognizes H3 only when it has a phosphate on a specific amino acid, serine 10. When the researchers tested the antibody on growth factor-stimulated cells, they saw that it clustered in some 100 sharply defined spots, or speckles, distributed all over the chromosomes. The number of spots roughly corresponded to the number of genes activated in the early stages of growth factor stimulation. To find the enzyme responsible, Allis's team fractionated cell extracts and came up with a protein with a molecular weight of 90 kilodaltons.

    At this point, Allis learned of results from molecular biologist Paolo Sassone-Corsi and human geneticist André Hanauer of the Institute of Genetics and Molecular and Cellular Biology in Strasbourg, France. In 1996, the two had shown that patients suffering from Coffin-Lowry syndrome, a hereditary form of mental retardation that often is accompanied by facial and other deformities, have mutations in the Rsk-2 protein, one of the kinases in the MAP kinase pathway. Sassone-Corsi's group also linked the Coffin-Lowry mutation to a defective response to growth factors, implying that the mutations somehow impede gene activation.

    Allis was intrigued by those findings, because Rsk-2 has about the same molecular weight as his mystery enzyme. He sent a sample of his antibody against phosphorylated histone H3 to Sassone-Corsi and his colleagues so that they could look for the punctuated pattern of antibody staining on cells taken from Coffin-Lowry patients. If this pattern did not appear during growth factor-stimulated gene expression, it would be a sign that the Rsk-2 mutation had interfered with histone phosphorylation—which in turn might be the cause for the defective response to growth factors in these cells.

    As the researchers reported in the 6 August issue of Science (p. 886), the Coffin-Lowry cells showed none of the bright speckles at all, evidence that histone H3 isn't phosphorylated in response to growth factor stimulation. In fact, Sassone-Corsi recalls, the results “were so black and white that the postdoc who had done the experiment thought something went wrong.”

    More recently, the Allis team obtained more direct evidence that H3 phosphorylation is linked to the expression of genes involved in growth stimulation. In experiments performed in collaboration with Davie's team and in press at the Journal of Biological Chemistry, the researchers used the phospho-H3-specific antibody to fish out histone H3 phosphorylated in the course of a gene response to growth factors, along with the snippets of DNA bound to the modified histone. And lo and behold, they found that two of those snippets came from the c-fos and c-myc genes, both of which are turned on early in growth factor responses. Taken together, these findings, says Davie, “establish histone phosphorylation as a crucial mechanism for gene regulation.”

    Rsk-2 may not be the only player in this arena, however. In an upcoming issue of the EMBO Journal, Mahadevan's group presents evidence that MSK1, a close relative of the Rsk family, might be the kinase that phosphorylates H3 in response to both growth factors and stress in mouse fibroblast cells. And geneticist Kristen Johansen of Iowa State University in Ames and her colleagues have found that this kinase localizes to areas of high gene activity in fruit fly chromatin, a further hint that it has a role in gene regulation. Because some MAP kinase subtypes don't activate Rsk-2, yet still lead to histone H3 phosphorylation, Mahadevan thinks that MSK1 could be the missing link.

    Meanwhile, others were pinning down a link between histone phosphorylation and a very different process: cell division, or mitosis, when the chromatin condenses and locks up the DNA in tight bundles. The first indication that histone phosphorylation might be important for this process had come in the early 1970s when scientists found that two of the histone proteins, H1 and H3, become highly phosphorylated when the chromosomes start condensing. But researchers couldn't determine whether this was a crucial prerequisite for chromosome condensation or a mere coincidence. “The best we had for a long time was this correlation,” Allis says.

    Now Allis's team, working with Martin Gorovsky and colleagues at the University of Rochester in New York, has turned to the ciliated protozoan Tetrahymena thermophila, which has two types of nuclei with drastically different properties. The Tetrahymena macronucleus, which harbors some 90 copies of each of the protozoan's chromosomes, provides all the cellular proteins the organism needs for normal growth. Its chromosomes don't get condensed before the cells divide; instead, the macronucleus simply pinches in half so that each daughter cell ends up with a macronucleus containing about half the multiple genomes.

    In contrast, the normally silent micronucleus, which comes into play only when two Tetrahymena cells want to have sex—that is, exchange genetic material—contains only two copies of the genome and divides by way of regular mitosis. Thus, Allis and Gorovsky reasoned that if they mutated the histone H3 at its major phosphorylation site, the serine at position 10, any resulting changes might only be manifest in the micronucleus. And that's exactly what they saw.

    Their results, reported in the 2 April issue of Cell, showed abnormal chromatin condensation and chromosome loss in the micronucleus of H3 mutant strains without any apparent effect on the macronucleus. In addition, antibody studies showed that the phosphorylated H3 covers the chromosomes along their entire length rather than just being found in isolated speckles, as in the growth-stimulated cells. “This establishes the role of H3 phosphorylation in chromatin condensation. The evidence is pretty much cast in stone,” says Alan Wolffe, a cellular biologist at the National Institute of Child Health and Human Development in Bethesda, Maryland.

    How one and the same event, the phosphorylation of serine 10 in histone H3, can lead to chromatin condensation in one setting and to chromatin unfolding and gene activation in another is still a mystery, however. As yet unpublished work from the labs of both Allis and Mahadevan provides an interesting hint: Phosphorylation in conjunction with some other modification, such as acetylation, tilts the balance in favor of gene activation. “Maybe the effect of H3 phosphorylation depends on what else is going on on that particular chromatin fragment,” speculates Penn State's Workman.

    An even simpler, albeit not mutually exclusive, scenario holds that H3 phosphorylation initially loosens up chromatin in both mitotic condensation and gene activation. In that event, says Davie, “whatever factors are available at the time this happens will determine the outcome.” During mitosis this would be the chromosome-condensing machinery, and during gene activation, transcription factors and coactivators, which might further open up the chromatin by acetylating it.

    Methylation joins the club

    Further complicating the picture, it now appears that the addition of methyl groups might modify chromatin structure as well. Molecular endocrinologist Michael Stallcup of the University of Southern California in Los Angeles came across this possibility while studying nuclear receptors, proteins that when bound to steroid hormones such as estrogen act as transcription factors that turn on specific target genes. First, though, the nuclear receptors recruit a number of “coactivators,” including a protein designated p160. And p160 itself recruits a set of helpers, as the team discovered when it found that p160 associates with an enzyme called CARM1. CARM1's structure, the researchers found to their surprise, indicates that it serves to add methyl groups to other proteins. (The team described their results in the 25 June issue of Science, p. 2174.)

    When Stallcup's team tested CARM1 in gene activation experiments, they found that it behaves like a bona fide coactivator, boosting the effect of nuclear receptors and p160 on gene expression. What's more, when the researchers mutated CARM1, disabling its methyl-adding activity, it lost its ability to boost transcription as well. “This isn't a proof that the methyltransferase activity defines how CARM1 works, but it seems weird if it were a pure coincidence,” Stallcup says.

    Now the hunt is on for CARM1's targets. Although Stallcup found that the enzyme can methylate H3 in the test tube, it attaches the methyl groups to the amino acid arginine, whereas histones are mostly methylated at lysines. For that reason, Davie for one thinks that CARM1's role as a histone methyltransferase still remains to be established; he and others contend that the enzyme might methylate other proteins such as transcription factors or fellow coactivators, regulating their activities, rather than the histone's. Even so, Allis predicts that the study will bolster interest in methylation as a means of reshaping chromatin. “Before Stallcup's paper, methylation wasn't even on people's radar screen,” he says.

    And the rest?

    Many researchers in the field do not expect the current momentum to slow down anytime soon. Histones are among the most heavily modified proteins in the cell, carrying many other attachments besides acetyl, phosphate, and methyl groups. Among their other molecular decorations are the small protein ubiquitin, ADP-ribose, and various sugars. Researchers suspect that some of these, too, will turn out to affect chromatin behavior.

    As Allis sees it, “All these [histone] modifications that have been buried in the chromatin literature for years will come back into fashion before long.” He cautions, however, that a great deal more work will be required to pin down the role of the modifications and to see in particular how they tie in with the numerous hormones, growth factors, and other external signals that influence cell activities. Stallcup agrees, saying, “We're a long way from having discovered all the players that can open up chromatin.” But he adds, “we have at least cracked open the door.”

  13. CELL BIOLOGY

    Clinging to Histones

    1. Michael Hagmann

    Structural biologist Ming-Ming Zhou of the Mount Sinai School of Medicine in New York City finds nothing more enticing than the intimate relations of proteins—often a key to understanding how they perform their functions in the cell. In his latest achievement, Zhou has taken a close look at a protein region, or “domain,” that may guide proteins to specific sites on chromatin, the complex of DNA and proteins making up the chromosomes. The homing may be a step in the remodeling of chromatin and thus in gene regulation (see main text).

    The region in question is the bromodomain, a conserved sequence containing roughly 100 amino acids found in some 30 chromatin-associated proteins, including all nuclear histone acetyltransferases (HATs), enzymes that add acetyl groups to certain of the histone proteins found in chromatin. This apparently loosens up the chromatin, thus allowing gene expression to occur. Even though scientists discovered the bromodomain in 1992, exactly how it might contribute to these activities has been unclear. “Bromodomains kept showing up in all these interesting proteins, but nobody had a clue about their function,” says molecular biologist Robert Kingston of Massachusetts General Hospital in Boston.

    Zhou and his colleagues set out to test one speculation: that bromodomains might be “some sort of homing addresses,” guiding proteins that contain the domains to specific sites on the chromatin. In the first stage of their work, the Zhou team used nuclear magnetic resonance to determine the structure of the bromodomain of a HAT protein called P/CAF. As reported in the 3 June issue of Nature, this revealed that the P/CAF bromodomain consists of a bundle of four helices that are tightly packed into a more or less cylindrical shape. On one end of the structure, the researchers detected a region that looked like it might be a docking site for other proteins: a little cleft consisting of hydrophobic, or water-shunning, amino acids. “Since hydrophobic amino acids are usually buried inside a protein, we thought this pocket may be a protein-protein interaction site,” says Zhou.

    To try to find out what the site might recognize, the researchers began mixing the P/CAF bromodomain with various peptides from other proteins, including fragments of histones H3 and H4—both unmodified and modified by acetylation and other changes that have been linked to chromatin remodeling. Zhou and his colleagues found that the P/CAF bromodomain binds only the acetylated H3 and H4 peptides, with the acetylated lysine fitting exactly into the hydrophobic pouch. The results suggest, Zhou says, that “bromodomains may be general binding modules for acetylated histones.” In such a scenario, an initial acetylation event could lure in other, bromodomain-containing HATs that would then lead to more widespread histone acetylation and, ultimately, gene activation.

    Some chromatin aficionados predict that the discovery of the acetyl-lysine-binding capacity of the P/CAF bromodomain is just the first glimpse of a complex homing system that might guide the hundreds of chromatin-binding factors needed to orchestrate the cell's activities. “Acetylation is just the tip of the iceberg. Underneath there's something that's even bigger—the different histone modifications together might act like molecular ZIP codes for the different [chromatin-binding] proteins,” says Alan Wolffe, a cellular biologist at the National Institute of Child Health and Human Development in Bethesda, Maryland.
