News this Week

Science  05 Jun 1998:
Vol. 280, Issue 5369, pp. 1522

    Cosmos in a Computer

    1. James Glanz


    A team of astrophysicists and computer scientists has journeyed to the far reaches of space and time by capturing the entire observable universe in a computer. They have created the first simulation of how gravity could have gathered ripples left by the big bang into colossal structures—walls, clumps, and filaments of galaxies—filling all of space. The result is a coarse-grained look at cosmic history within a cube 10 billion light-years on a side, a volume so big that if Earth sat in one corner, the far corner would hold some of the most distant galaxies and quasars ever seen.

    “This simulation marks a turning point in numerical cosmology,” says Michael Norman, a computational astrophysicist at the University of Illinois, Urbana-Champaign, who is not a member of the multinational simulation team, called the Virgo Consortium. Many times larger than any earlier effort, the whole-universe computation taxed the ingenuity of programmers and the number-crunching prowess of a 512-processor Cray supercomputer at the Max Planck Society's computing center in Garching, Germany. Other cosmologists have had little time to absorb the results, which were discussed late last week at a cosmology meeting in Paris by Jörg Colberg of the Max Planck Institute for Astrophysics (MPA) in Garching and which will be the subject of a talk by August Evrard of the University of Michigan at next week's American Astronomical Society meeting in San Diego. But they say they expect this model universe to be a powerful tool for interpreting data from large surveys of the real sky.

    Norman, for instance, notes that even the biggest survey of the real sky, the 5-year Sloan Digital Sky Survey (Science, 29 May, p. 1337), will map just one-hundredth of the visible universe, so astronomers can't be sure they're getting a true sample of galaxy clusters and voids. But because the simulation covers “essentially the entire visible universe,” he says, even rare structures should crop up often enough to settle questions about how typical they are. Adds Martin Rees, a cosmologist at Cambridge University in the United Kingdom, “It is splendid that the modeling and simulations have attained this level of sophistication just at the time when we are starting to get a flood of high-quality data.”

    To simulate cosmic evolution, the Virgo Consortium began by calculating how the slight density variations rippling through the matter of the very early cosmos might have grown. Those ripples are thought to have originated as “quantum fluctuations”—in essence, waves of uncertainty in particle positions—during the first instants of the big bang, when the entire observable universe was no larger than a grapefruit. Relatively simple calculations predict how radiation in the hot young cosmos would have smoothed out some of these perturbations while allowing others to intensify.

    As the universe expanded and cooled, gravity took over. And after about a billion years, when the cosmos was about 10% of its present age, concentrations of mass equaling tens of Milky Ways began to “go nonlinear,” becoming too closely bunched for simple mathematics to handle—just as seaside ripples on a calm spring day are less complex than breakers in a typhoon. At that point, the researchers could no longer trace cosmic evolution by solving a single set of equations. Instead they divided the mass in a cubic cosmos into a billion particles and calculated how each particle affected the motion of all its neighbors in 500 time steps. Over the course of the simulation, the cosmic wrinkles, like waves on a stormy seascape, became more and more pronounced as the particles attracted one another.
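The particle scheme described above can be illustrated with a toy direct-summation sketch. Everything here — the particle count, softening length, gravitational constant, and step size — is illustrative; the Virgo run used a billion particles, 500 time steps, and far more efficient algorithms than this O(N²) pairwise sum.

```python
import numpy as np

def nbody_step(pos, vel, mass, dt, soft=0.1, G=1.0):
    """One kick-drift-kick leapfrog step with direct pairwise gravity.
    A softening length keeps the force finite at small separations."""
    def accel(p):
        d = p[None, :, :] - p[:, None, :]        # d[i, j] = p[j] - p[i]
        r2 = (d ** 2).sum(-1) + soft ** 2
        inv_r3 = r2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)            # no self-force
        return G * (d * inv_r3[:, :, None] * mass[None, :, None]).sum(1)

    vel = vel + 0.5 * dt * accel(pos)            # kick
    pos = pos + dt * vel                         # drift
    vel = vel + 0.5 * dt * accel(pos)            # kick
    return pos, vel

# A tiny mock cosmos: 64 equal-mass particles scattered in a unit cube.
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, (64, 3))
vel = np.zeros((64, 3))
mass = np.full(64, 1.0 / 64)
for _ in range(50):                              # the real run took 500 steps
    pos, vel = nbody_step(pos, vel, mass, 0.01)
```

Because gravity acts between every pair, the particles slowly bunch up, which is the numerical analogue of the wrinkles growing steeper.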

    Each run of this stage of the simulation, says Simon White of Max Planck, a leader of the effort, required about 70 hours on the Cray T3E, which split the universe up among its 512 processors. To streamline communications among the processors and to even out the voracious memory requirements, says White, “the Virgo Consortium codes had to be rewritten from scratch.” Even so, the 600 billion bytes of raw data streaming from the computation filled memory banks almost to capacity, and the first attempts to download the data from active memory onto tapes crashed the Garching computer center.

    Yet the result is only a sketchy picture of cosmic evolution, the researchers acknowledge. Each of the billion particles is the equivalent of 10 galaxies or so, and to save computer time, the calculations leave out factors other than gravity, such as pressure and radiation, that govern galaxy formation. The team also stresses that the computations follow only the invisible “dark matter” that is thought to account for most of cosmic mass, and not the bright galaxies that, in nature, trace the dark mass like campfires dotting the ridgelines of hills. But Michigan's Evrard notes that the largest dark and bright structures are thought to be roughly equivalent.

    So far, the team has done calculations on two model universes. One is filled with enough matter to stop cosmic expansion after an infinite time, a condition called omega matter = 1 (see graphics). This is the cosmic recipe most theorists opted for in the past. The other, called the lambda model, is the kind of universe suggested by new measurements of the cosmic expansion rate: one that has an omega matter of 0.3 and a substantial cosmological constant, or large-scale repulsive force (Science, 27 February, p. 1298, and 30 January, p. 651).

    The team then created both “snapshots” of the fully formed universe and views of what an observer would actually see in looking deeper and deeper into the universe from one point. The farther the observer looks, the longer it has taken for light to arrive from that point, so the view gradually moves to earlier times in cosmic history along a wedgelike bundle of lines of sight. Even veteran researchers in the consortium were taken aback by the stunning detail seen in these time-lapse pictures, which were created mostly by Evrard. Says Carlos Frenk of the University of Durham in the U.K., who co-directs the consortium with White, “I was amazed that even at very early times, you see a huge amount of structure”—filaments and walls like the ones that observers have seen in the nearby universe. Agrees Evrard: “You start to pick out subtleties that you wouldn't have imagined. It blows your mind.”
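The distance-equals-lookback-time relation behind these wedge views has a simple closed form in the omega matter = 1 case, where the scale factor grows as t^(2/3) (an Einstein-de Sitter universe). The Hubble constant below is an assumed illustrative value, not a parameter quoted in the article:

```python
# Lookback time in the omega_matter = 1 (Einstein-de Sitter) model,
# where a(t) grows as t^(2/3). H0 is assumed for illustration only.
H0 = 70.0                         # km/s/Mpc (illustrative)
H0_per_gyr = H0 / 978.0           # 1 km/s/Mpc is about 1/978 per Gyr
t0 = 2.0 / (3.0 * H0_per_gyr)     # present age of an EdS universe, in Gyr

def lookback_gyr(z):
    """Time elapsed since light left redshift z, in Gyr (EdS model)."""
    return t0 * (1.0 - (1.0 + z) ** -1.5)
```

With these numbers, light from redshift 4 set out when the universe was roughly a tenth of its present age — the era, as the article notes, when structures were just going nonlinear.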

    So far, the lambda model seems to be a little closer to the real universe. Large clusters of galaxies form earlier in the lambda model than in the omega = 1 version, more in line with observations. Both models have difficulty accounting for some of the most massive and distant clusters seen in the real sky, however.

    As cosmologist David Weinberg of Ohio State University in Columbus and others see it, the simulations should be most valuable as a point of comparison for surveys of the real sky such as the Sloan and the Anglo-Australian 2-degree Field (2dF) Survey. In such surveys, explains Richard Ellis of Cambridge University, a co-principal investigator of the 2dF Survey, “it's as if one was conducting an opinion poll but was a bit worried that the sample questioned might be unrepresentative.” The new simulations, say Weinberg and Ellis, should help cosmologists distinguish true indicators of the type of universe we live in from statistical flukes. “You have a volume big enough that you can plunk a ‘theoretical observer’ down in the simulation and create an artificial Sloan survey” to see what kinds of features are typical, says Weinberg, who is a member of the Sloan collaboration.

    Already, says Frenk, the simulations are pouring out clues to everything from how often galaxy clusters would tend to clump together into superclusters to the likelihood that we might live in a relatively tenuous “bubble” within the cosmos, as observations are beginning to suggest (Science, 15 May, p. 1008). And because the team plans to make its data public, other researchers will be able to make their own journeys through the far reaches of this silicon universe.


    Tau Protein Mutations Confirmed as Neuron Killers

    1. Gretchen Vogel

    Long a suspect in the neuronal wreckage of Alzheimer's disease, the protein called tau looked like it was going to get off scot-free. Circumstantial evidence linking tau to the disease is plentiful. It is the major component of “tangles”—abnormal twists of proteins found in the dead brain cells of Alzheimer's sufferers—and the number of tangles seems to correlate with the severity of dementia. But no one could ever nail tau's guilt once and for all. Now, however, fresh revelations have reignited the case against tau.

    In the 3 June issue of Annals of Neurology, a team led by Gerard Schellenberg of the Veterans Affairs Puget Sound Health Care System in Seattle reports evidence suggesting that tau gene mutations cause a type of inherited dementia that is also characterized by brain tangles. And in the Proceedings of the National Academy of Sciences (PNAS) later this month, Maria Grazia Spillantini of the Centre for Brain Repair at Cambridge University in the United Kingdom and her colleagues will report that another mutation in tau causes a slightly different genetic condition. The list won't stop there. As-yet-unpublished work from other labs has linked tau mutations to more than a dozen other hereditary dementias, collectively called “frontotemporal dementia and Parkinsonism linked to chromosome 17” (FTDP-17).

    To longtime tau researchers, these discoveries are welcome, as they establish at last that the protein can play a primary role in causing at least some cases of neurodegenerative disease. That idea had lost favor because genetically engineered mice that make no tau at all suffer no obvious ill effects. More important, no one could find anything wrong with the tau gene in people with Alzheimer's or other neurodegenerative diseases. “When we would go around the country and give lectures [on tau], people would ask, ‘So when are you going to find a mutation?’” says John Trojanowski of the University of Pennsylvania in Philadelphia, one of the researchers who continued to work on the protein. If tau had any role at all, it was believed to be secondary to the neurodegenerative events set in motion by abnormal deposits of another protein, β amyloid, the product of a gene mutated in rare, hereditary forms of Alzheimer's.

    But the new work “really brings the tau pathway to the forefront” as an important contributor to neurodegeneration, says neurologist Kirk Wilhelmsen of the University of California, San Francisco. That doesn't mean that β amyloid isn't at fault in Alzheimer's disease. Some researchers suspect that it may work with tau to kill cells. But with the new evidence linking tau to neurodegeneration, researchers hope they'll be able to piece together what triggers formation of the protein's ruinous tangles and perhaps how that formation could be prevented.

    Schellenberg's team had been trying to find a cause of FTDP-17s for several years. The symptoms of the diseases that fall under that rubric vary widely, ranging from psychoses that resemble schizophrenia to difficulty speaking to Parkinson's-like tremors. And because many patients develop dementias, they are often misdiagnosed as having Alzheimer's disease. The tau gene was a prime suspect in the FTDP-17 diseases. Not only did the patients have dementias and tangles, but genetic studies by Wilhelmsen and other researchers had linked all of the FTDP-17s to a region on chromosome 17 where the gene is located.

    Even so, initial searches failed to find anything wrong with the gene in FTDP-17 patients. “The trend was to give up,” says neurologist Thomas Bird of the VA Puget Sound, a co-author on the Schellenberg paper. But “to be thorough,” says Schellenberg, postdoctoral fellow Parvoneh Poorkaj continued her search, looking for variations in the gene that might be found only in patients.

    After screening members of two affected families, she came up with nine gene variations. Eight of them were eliminated as possible causes of the neurodegenerative disease when they also turned up in healthy controls. Apparently, they were normal gene variations. But the ninth turned up only in the ill members of one family. Still, the researchers were cautious about concluding that the gene change caused the disease. It did not appear in a family with similar symptoms, and previous negative results had made them wary. “We had taken it as far as we could,” says Schellenberg, so they began presenting their findings at seminars and meetings, hoping to find corroboration from other groups.

    They soon found it. A team led by Spillantini, Michel Goedert of the Medical Research Council Laboratory of Molecular Biology in Cambridge, U.K., and Bernardino Ghetti of Indiana University School of Medicine in Indianapolis had been analyzing the tau gene in a family with an FTDP-17 disease called familial multiple system tauopathy with presenile dementia (MSTD). As they report in PNAS, they also found a tau gene anomaly in affected family members.

    Since then, Schellenberg's group has found a mutation in the gene in a second Seattle family with a different FTDP-17, and at least three other teams, including one led by Michael Hutton of the Mayo Clinic in Jacksonville, Florida, have reported at closed meetings that they have found mutations in other FTDP-17 dementias, their colleagues say. Hutton declined to comment on his own work, as it has been submitted to Nature.

    Many of the families have slightly different mutations, says Spillantini, which may explain why the patients often vary in their dementia symptoms and show different kinds of brain damage after death. But many researchers think all these defects might kill cells in roughly the same way: by somehow preventing tau from binding to the microtubules. These are the protein filaments that help provide support for cell structures such as axons, the long projections that form connections between neurons.

    In Schellenberg's families, for instance, the mutations occur right in the gene sequence coding for the part of the protein where it contacts the microtubules, preventing it from binding. Some researchers think this disruption could cause the microtubules to collapse. The destabilized axons would lose contact with their target neurons—and neurons die when they lose their connections.

    The Spillantini group's mutation may also affect tau binding to microtubules, she says, although the mutation is more subtle. It is not in the gene's protein-coding regions but in one of its introns—regions that are spliced out of the messenger RNA made from a gene before the mRNA is translated into the actual protein. In humans, the tau protein occurs in six different forms of varying lengths, depending on where the splices are made and which exons are included in the finished protein. The mutation seems to encourage the cell's protein-making machinery to grind out abnormal amounts of three of the longer forms.

    In fact, the brains of MSTD sufferers have several times as much of the longer tau versions as do normal brains, the Spillantini team reports. This imbalance, says Spillantini, may prevent proper binding to the microtubules, which could then destabilize the axons. She notes, however, that the disrupted-axon scenario faces a serious challenge, posed by the genetically engineered mice that make no tau protein yet seem healthy. She favors another possibility: that a disruption in binding leads to an excess of tau floating loose in the cell. Like out-of-work loiterers, the unbound proteins might aggregate and disrupt cell function.

    Although Alzheimer's patients don't seem to have any tau mutations, the protein may get more respect from Alzheimer's researchers now that there's more direct proof that tau defects can lead to nerve cell death. Indeed, John Hardy of the Mayo Clinic, whose team discovered the first mutation in the gene that encodes β amyloid, thinks that β amyloid may somehow work through disruption of tau function. That could explain the correlation between the degree of dementia in patients and the abundance of tangles in their brains, he notes. “The pathway from amyloid to dementia is likely to go through tangles,” he predicts. The big question now is how the two might interact to make tau into a killer.

    Understanding how tau turns deadly could have benefits beyond Alzheimer's and the tau diseases. Hardy and others have noted that a familial form of Parkinson's disease described last year (Science, 27 June 1997, p. 1973) is also caused by a mutated protein that accumulates in brain cells. Indeed, Goedert thinks the tau diseases could be a model for understanding how abnormal protein filaments might kill cells in a whole range of diseases, including sporadic dementias, Parkinson's disease, and Huntington's disease. “I would argue that cells die because they have these filaments,” he says. Just months ago, tau was a marginal suspect in brain diseases, but now it might help close several other unsolved cases.


    Old, Old Skull Has a New Look

    1. Ann Gibbons

    The origins of our species, Homo sapiens, are lost in a gaping hole in the fossil record. Between 1.4 million and 600,000 years ago, traces of human ancestors more or less vanish in Africa, where many researchers believe H. sapiens originated. Now a well-preserved skull of an early human found in the northeast African country of Eritrea has landed right in the middle of that gap.

    Dated to 1 million years ago, the skull shows a tantalizing mix of ancient and modern features, says Ernesto Abbate, a geologist at the University of Florence in Italy and head of the multinational team—including members from Italy, South Africa, and Switzerland—that reported the discovery in this week's issue of Nature. Although it has yet to be studied in detail, the skull “could be evidence of the emergence of H. sapiens characters earlier than previously thought,” says Lorenzo Rook, a paleontologist at the University of Florence. It also underscores earlier hints that northeastern Africa was a focal point of human evolution, says Tim White, a paleoanthropologist at the University of California, Berkeley: “This discovery is another significant step in establishing the Horn of Africa as the key to understanding human origins and evolution.”

    The skull was found in the remote, arid lands of the Northern Danakil Depression of Eritrea, about 50 kilometers from the Red Sea and 400 kilometers north of the famed Awash Valley in Ethiopia, which has yielded the remains of several other human ancestors. Rook was walking along a hilly slope in December 1995 when he noticed the right side of the skull, ear-side up, poking out of the sand and rock. “I called to my colleagues, and we immediately realized it was a Homo skull,” says Rook.

    When they had dug it out of the rock, their first impression was that the nearly complete brain case resembles that of H. erectus, a human ancestor that appeared in Africa 1.7 million years ago and persisted until at least 1.4 million years ago in Africa and much later in Asia. Like H. erectus, it has a pronounced brow ridge and elongated brain case, among other features, says Rook.

    The new skull would be the youngest African erectus. To date the stratum where it was found, the team looked in the rocks for the signature of known reversals in Earth's magnetic field and identified fossils from mammals that went extinct at known times. The resulting age of 1 million years puts the fossil right between the youngest H. erectus found in Africa—a 1.4-million-year-old fossil from Olduvai, Tanzania—and the oldest archaic form of H. sapiens, a 600,000-year-old specimen from Bodo, Ethiopia.

    A closer look at the fossil revealed modern-looking features mixed with the old. The skull is remarkably narrow, reaching its greatest width near the top, as in H. sapiens, rather than at the base, as in erectus. This suggests that some traits typical of H. sapiens had begun to develop 200,000 to 300,000 years earlier than expected, says Rook. “It contributes to the perception that H. erectus and H. sapiens have no distinct boundary between them,” adds paleoanthropologist Milford Wolpoff of the University of Michigan, Ann Arbor, who has seen photos of the skull.

    But others say such conclusions are preliminary because half the skull is still embedded in rock. Further hampering study, the skull, along with two incisors and pelvic fragments, has been kept under wraps at the Eritrea National Museum in Asmara. Eritrea, a nation just 5 years old, has yet to write regulations governing the study of such antiquities. The Italian team is working with Eritrean scientists and officials to arrange for the skull to be restored and studied.

    Ultimately, the fossil could help anthropologists make sense of other fossils found outside Africa that fall in the critical time gap, such as 800,000-year-old remains at Atapuerca, Spain, and H. erectus in Asia. No one knows whether these creatures are evolutionary dead ends or transitional players on the path to H. sapiens, but the new skull may help to sort out where they fit on the family tree. “This is clearly going to be a major player in future scenarios, but it has to be studied,” says Ian Tattersall, a paleoanthropologist at the American Museum of Natural History in New York.


    A Giant Snare for Monopoles

    1. David Kestenbaum

    Physicists have spent a lot of time looking for magnetic monopoles. They have good reasons to believe in the reality of these particles—the equivalent in magnetism of the fundamental bits of electric charge carried by electrons. And the existence of a single unit of magnetic charge would help answer some deep and nagging questions such as why the proton and electron have exactly the same amount of electric charge.

    But monopole searchers have had no luck. For instance, they have not found monopoles embedded in ancient rock from the moon, and they have not seen them float through detectors in the lab. Now physicists have looked for monopoles with the world's highest energy accelerator and, again, have come up spectacularly empty. The nondiscovery, now in press at Physical Review Letters, puts some new limits on the mass of this aspiring particle and has also sparked a bit of a debate about how to look for it.

    Little big bang. If colliding protons spawned a monopole, a pair of high-energy photons (green) might reveal it.

    Most searches have looked for monopoles that may have been drifting through the universe since the beginning of time. “There's every reason to expect they were produced in the big bang,” says Jeffrey Harvey, a physicist at the University of Chicago. The traditional way to look for these relic monopoles is with a loop of wire. If a monopole passes through, it induces a distinct blip of current in the wire, which can easily be detected. But such efforts have yielded nothing except for one 1982 sighting now viewed as a fluke. Many physicists suspect that the big bang made only a small number of monopoles, which are now too scattered to be easily detected.
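The "distinct blip" the loop method watches for is fixed by Dirac's quantization condition: a monopole passing through a superconducting loop shifts the trapped flux by h/e, exactly twice the flux quantum h/(2e) that ordinary supercurrents jump by — which is what made the lone 1982 candidate event stand out. A minimal sketch of the expected signal (the loop inductance is an assumed value, not from any particular experiment):

```python
# Signature a relic Dirac monopole would leave in a superconducting loop:
# the trapped flux jumps by h/e, twice the ordinary flux quantum h/(2e).
h = 6.626e-34          # Planck's constant, J*s
e = 1.602e-19          # elementary charge, C

dirac_flux_jump = h / e            # ~4.1e-15 Wb, the monopole signature
flux_quantum = h / (2 * e)         # step size of ordinary flux jumps

L_loop = 2e-6                      # loop self-inductance, H (assumed)
current_step = dirac_flux_jump / L_loop   # induced current step, A
```

The factor-of-two jump is unambiguous in principle, which is why the method's problem is flux (how rarely monopoles arrive) rather than signal size.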

    The “little bangs” created when particles collide in an accelerator, however, might also spawn monopoles, which could pop in and out of existence, influencing the energy or direction of the debris from the collision. In 1995, physicists looked for hints of monopoles made by colliding electrons and positrons at the Large Electron-Positron Collider at CERN, near Geneva, but found none.

    Thinking that higher energy collisions might improve the odds, Ilya Ginzburg, a theorist at the Institute of Mathematics in Novosibirsk, Russia, mentioned in a recent colloquium at the Fermi National Accelerator Laboratory, in Batavia, Illinois, that it should also be possible to observe monopoles in the collisions of protons and antiprotons in Fermilab's Tevatron accelerator. The monopoles, he and a colleague calculated, could boost the energy of photons produced in the collisions. “The idea was so new and interesting,” says Fermilab's Greg Landsberg, “I got really excited.”

    So Landsberg and colleagues on the D0 Experiment went back and sifted through the data from millions of collisions. They looked for pairs of high-energy photons emerging at large angles to the collision. Sadly, Landsberg says, “we found none.” The effort wasn't all for naught, however. To have escaped detection, certain types of monopoles would have to have a relatively large mass—in one case, more than 1580 times that of a proton.

    This new technique of trolling for monopoles has drawn some criticism. Kimball Milton and his colleagues at the University of Oklahoma, Norman, contend that monopole theory isn't far enough along to calculate reliably whether monopoles would really have this kind of effect. Milton prefers a more direct approach. If the monopoles are light (less than several hundred times a proton's mass), he says, it's possible that they are literally streaming out of the collisions and lodging in the detector. Milton and colleagues have gotten Fermilab to ship them bits of old detectors, which are passing through loops of wire to search for monopoles. “The odds are slim,” he admits, but if they find one, they'd have it for keeps. “We'd be rich and famous.”


    Putting Antimatter on the Scales

    1. Alexander Hellemans
    1. Alexander Hellemans is a science writer in Naples, Italy.

    One of the bearing walls of modern physics is that particles of antimatter and those of matter are perfect counterparts, down to their mass. That wall is standing strong, according to new results presented last week at a meeting of the American Physical Society's Division of Atomic, Molecular, and Optical Physics in Santa Fe, New Mexico. The international team has caged a proton and an antiproton in a trap and deduced that they have the same mass to within a part in 10 billion.

    Joe Lykken, a theoretical physicist at the Fermi National Accelerator Laboratory in Batavia, Illinois, notes that although the existing theory of particles and forces insists on mass equivalence, a speculative alternative called superstring theory may allow a small difference in mass. So Harvard University physicists Gerald Gabrielse, Anton Khabbaz, and David Hall, along with collaborators from the University of Bonn in Germany and elsewhere, decided to check. The team caught a single antiproton from the LEAR accelerator at Europe's CERN laboratory near Geneva in a web of electric and magnetic fields, where it spun in circles like a firefly in a jar. The researchers also introduced a negative hydrogen ion (a proton with two electrons circling it) into the same trap. Protons and antiprotons have opposite charges, but the hydrogen ion has the same charge as an antiproton, which makes the two easy to compare.

    To see if their masses differed, the team watched how fast the particles raced around inside the trap. If one particle were heavier, it would take slightly longer to complete an orbit. They used tiny electrodes to check. “Each time the particle passes one electrode … it induces a current to pass through a resistor, and that current we amplify and measure,” Khabbaz explains.

    The group found that the two raced around in almost identical circles, about 100 micrometers across, 90 million times per second. They concluded—after correcting for the tiny mass of the two electrons—that the proton and antiproton have the same mass to about 10 decimal places, 10 times better than previous measurements. “[The precision of] this result is extraordinary,” says Jook Walraven, a physicist at the Institute for Atomic and Molecular Physics in Amsterdam.
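The comparison rests on the cyclotron relation f = qB/(2πm): with equal charge magnitudes in the same field, the frequency ratio is the inverse mass ratio. In this sketch the field strength is inferred for illustration from the ~90 MHz orbit rate quoted above; the electron-mass correction for the H⁻ ion is the one the text describes (tiny binding energies are ignored):

```python
import math

# Cyclotron relation: f = q * B / (2 * pi * m). Equal charges in the
# same field mean the frequency ratio gives the mass ratio directly.
q = 1.602e-19        # elementary charge, C
m_p = 1.6726e-27     # proton mass, kg
m_e = 9.109e-31      # electron mass, kg

f_quoted = 90e6                        # orbits per second, from the text
B = 2 * math.pi * f_quoted * m_p / q   # implied trap field, ~5.9 T

def cyclotron_freq(mass, field=B):
    return q * field / (2 * math.pi * mass)

# The H- ion is a proton plus two electrons, so its mass (ignoring
# binding energies) is m_p + 2*m_e -- a correction of about 0.1%.
m_hminus = m_p + 2 * m_e
ratio = cyclotron_freq(m_p) / cyclotron_freq(m_hminus)   # = m_hminus / m_p
```

The 0.1% electron correction dwarfs the part-in-10-billion precision of the final result, which is why it must be subtracted rather than neglected.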

    The finding by no means rules out string theory, says Lykken: “We are very ignorant about string theory; we don't know how large the string effects may be.” He would like to see experiments with an even higher degree of precision: “These experiments don't cost very much money compared to other things you do in high-energy physics, and you have the potential for a spectacular result.”


    Yellowstone Rising Again From Ashes of Devastating Fires

    1. Richard Stone

    YELLOWSTONE NATIONAL PARK—Lanky and bespectacled, Jay Anderson towers over half-meter-high Douglas fir seedlings, a giant among thousands of dark-green midgets. The budding conifer forest has risen from the ashes of fires that torched more than a third of the park in the summer of 1988. “After the fire, this place looked totally sterilized,” says Anderson, an ecologist at Idaho State University in Pocatello. “It was hard to believe anything would survive here.” To his delight, however, the charred landscape near Tower Junction rebounded quickly, giving birth the next summer to a carpet of pinkish-purple fireweed and other herbs. Grander changes are unfolding more gradually: It wasn't until last year, for instance, that the Douglas firs “started poking their heads up” above the shrubs, says Anderson. “We breathed a sigh of relief.”

    Ten years after a conflagration that razed an icon and rekindled a debate over fire management in U.S. parks, scientists have found ample evidence that Yellowstone's ecosystems are thriving. “Mother Nature is taking care of herself pretty well,” says Anderson. Defying early predictions that the fires might open territory to invading weeds or shift the balance toward species that once struggled to maintain a niche in the park, the postfire ecosystems are shaping up to be essentially the same as those that prospered before the flames.

    Ashes to trees. Barren landscape after the fires (left) is now dotted with lodgepole pine (right).


    But this pretty picture of resurrection is not the whole story, scientists reported at a meeting* last week at Montana State University in Bozeman. Spurred by the fires, they have done studies showing that climate change may do what the fires of 1988 did not: drastically alter Yellowstone's ecosystems. Past climate changes have shifted vegetation patterns and the frequency and severity of fires. And computer models suggest future warming could do the same thing, turning Yellowstone within decades into a park that—apart from the spectacular geysers and other thermal features—bears little resemblance to the one that exists today. According to geographer Cathy Whitlock of the University of Oregon, Eugene, who is leading one modeling effort, “the projected changes we're seeing are dramatic and astonishing.”

    The fires—a conflagration the likes of which has occurred only every 200 to 300 years over the past few millennia—“stimulated an incredible body of research that has shaped how we think about landscape processes,” says Norman Christensen, dean of Duke University's School of the Environment. He and others say the blaze provided an unprecedented opportunity to study a large disturbance in ecosystems protected from logging and development. “This is a unique natural experiment,” says William Romme, an ecologist at Fort Lewis College in Durango, Colorado.

    To most experts, the fires in the Yellowstone area were an inevitable consequence of three converging factors: extreme weather, an accumulation of woody debris on the forest floor, and mature living trees that could burn well. Lightning ignited the first fire of 1988 on 24 May; rain extinguished it later that day. But 248 more fires triggered by lightning and people raged in Yellowstone over the next 6 months, thanks to the driest summer in 112 years. By the time it was over, more than 321,000 hectares—36%—of the park had burned, says Don Despain, a biologist with the U.S. Geological Survey (USGS) in Bozeman. The fires consumed an area that's “an order of magnitude bigger than anyone alive had ever experienced,” adds Romme.

    The flames also touched off a firestorm of criticism over the federal fire policy, which since 1972 has been to allow fires to burn unless they threaten people or property. When this policy was put to its biggest test in Yellowstone that summer, researchers now acknowledge, the public was generally unaware of fire's benevolent side. “Ten years ago, most folks had a pretty poor understanding of the ecological effects of fires,” says Robert Gresswell of the USGS in Corvallis, Oregon. For example, after surveying charred swaths of the park, former Senator Alan Simpson (R-WY) predicted it would not recover in 1000 years. Under intense pressure from local business owners who feared losing tourist revenue, Yellowstone officials departed from their own policy on 21 July 1988 and ordered all fires in the park suppressed; but despite efforts by some 13,000 firefighters and military personnel, the fires raged on.

    After the smoke had cleared, however, scientists swooped in to start long-term studies of Yellowstone's recovery. Among the early predictions were that large mammals, such as bear and elk, would suffer severely from food shortages and that aspens would gain ground on lodgepole pine, the region's dominant tree for the past 10,000 years.

    The forecast of a carcass-littered park turned out to be off the mark. “The effects of the fires were relatively insignificant,” says Michael Coughenour, an ecologist at Colorado State University in Fort Collins. Researchers watched as elk, bison, and other mammals foraged far more than expected on sugars in the charred debris—“caramel candy,” as some scientists refer to it. Although the mammal death rate rose after the fires, much of the blame can be pinned on the severe winter of 1988–89, says symposium organizer Linda Wallace, an ecologist at the University of Oklahoma, Norman. “We were very, very surprised” that the fires did not take a higher toll, she says. Even fish in charcoal-choked streams survived mostly unscathed. As for the park's forests, killing temperatures from the rapidly moving fires penetrated only an inch into the ground. “No land was made unfit for plant growth,” says Despain.

    Equally surprising was the failure of aspen, the only deciduous species common in the park, to claim more of the scorched ground. Its territory has been shrinking for decades. “As mature trees die, they aren't being replaced,” says Romme. He and others thought the fires might open up new habitat for aspen, a species whose root system can span hectares and sprout shoots even after the trees die; an entire stand is often a single individual.

    As predicted, the fires were a good tonic for aspen, triggering intense sprouting from roots and something “almost never seen in the wild,” says Romme: young aspens growing from seeds. But the shoots are a favorite food of elk, and every winter since the fires, elk have zeroed in on aspen shoots sticking out of the snow. As a result, says Romme, the fires have actually “hastened the demise of some stands.” The story is not over, however: Roots continue to send up shoots, and if the trees are able to mature, aspen “could still become a key component of succession,” Romme says. “We should have interesting results in a couple of years.”

    Scientists have also tried to put the fires into a historical context. After studying more than 50 fire-related debris flows—landslides that sweep through denuded land and can travel up to 100 kilometers an hour—that have occurred in Yellowstone over the past 3500 years, geologist Grant Meyer of Middlebury College in Vermont says he has found that “big fires … are strongly controlled by climate.” Such flows were much more common from A.D. 900 to 1300—known as the “Medieval Warm Period”—than during the period from 1300 to 1900, called the “Little Ice Age,” Meyer says.

    Oregon's Whitlock and grad student Sarah Millspaugh have come to similar conclusions after studying charcoal-laced sediments at the bottom of Yellowstone lakes. These studies on past fires have prompted Whitlock to gaze into the future to try to forecast how climate change might alter vegetation patterns and, perhaps, fire frequency and severity. Whitlock, along with Patrick Bartlein and Sarah Shafer, has created computer models to predict the changes in species distributions in the Yellowstone region that may occur in response to global changes from a doubling of atmospheric CO2. According to results reported in Conservation Biology in June 1997, the team predicts that warmer, wetter winters could help alter the ranges of various species in the U.S. Northwest, causing larch, scrub oak, and other trees not now found in Yellowstone to spread into the park. This new landscape could be vulnerable to more frequent, possibly smaller fires, Whitlock says.

    The prospect that the Yellowstone ecosystem is poised for a makeover is spurring fire ecologists and colleagues from other disciplines to try to organize a lasting Yellowstone research program affiliated with the National Science Foundation's network of Long-Term Ecological Research sites. But the fire policy won't change, say park officials and researchers. “If our mandate is to manage Yellowstone for future generations as an unhindered ecosystem, then putting out fires is counter to that mandate,” says Despain. “The fires have brought home the inevitability of change,” adds Duke's Christensen, “and the process of renewal that accompanies it.”

    *Yellowstone National Park 125th Anniversary Symposium, 11 to 23 May.


    One-Eyed Animals Implicate Cholesterol in Development

    1. Evelyn Strauss

    Evelyn Strauss is a free-lance writer in San Francisco.

    In ancient times, Homer depicted the one-eyed Cyclops as a terrifying and mysterious monster. Today we recognize infants born with cyclopia—marked by a single large eye—as victims of a defect that derails the normal development of the brain and face. But just how this developmental pathway ordinarily works has been far more mysterious than the ways of Homer's gods and heroes. Now biologists are dissecting it. At its heart, they are glimpsing a familiar molecule, cholesterol, in an entirely new role.

    Cyclopia and milder forms of the same developmental disorder result from a failure of the embryonic forebrain to subdivide properly. Defective genes can disrupt this process in people and animals, but so can certain toxins, some of them found in wild plants, and their workings are giving scientists new insights into the developmental pathway. As Philip Beachy, a molecular biologist at The Johns Hopkins University School of Medicine in Baltimore, and his colleagues report on page 1603, these toxins make the cells unable to respond to a critical developmental signal, perhaps because they interfere with the normal traffic of cholesterol within cells. A second group, at the University of Washington, Seattle, has carried out similar experiments, to be published in an upcoming issue of Development.

    The idea that a disruption in cholesterol transport may prevent embryonic cells from heeding the signal—a protein called Sonic hedgehog—comes on the heels of earlier work by the Beachy group showing that cholesterol also plays a role in activating the signal in the first place. Together, the findings provide some of the first clear evidence that cholesterol, long known as a structural component of cell membranes and as the raw material that the body converts into steroid hormones and bile acids, can also influence the signaling paths that guide development. “Everyone knew that cholesterol was important,” says Yvonne Lange, a cell biologist at Rush-Presbyterian-St. Luke's Medical Center in Chicago. “But that it could act on a [developmental] signaling process was entirely unanticipated. This work opens up a whole new role for cholesterol and raises a lot of interesting questions.”

    Among the most tantalizing: whether a mother's diet and cholesterol metabolism play some role in determining the severity of the birth defect that, in its most extreme form, manifests itself as cyclopia. One in 16,000 babies is born with some form of the defect, technically known as holoprosencephaly (HPE), says Maximilian Muenke, a human geneticist at the National Human Genome Research Institute in Bethesda, Maryland, and the Children's Hospital of Philadelphia. Early in pregnancy, before nature exerts quality control and flawed embryos are spontaneously aborted, the rate is much higher: one in 250. People with the mildest form of the disorder have signs as minor as a single upper front incisor; severe cases are marked by one eye in the middle of the face, below a protruding nasal structure, and serious brain abnormalities. Infants with full-blown cyclopia die soon after birth.

    In 1996, Beachy's group found that HPE-like symptoms, including cyclopia, develop in mouse embryos that lack a normal Sonic hedgehog (Shh) gene. Shh is the vertebrate counterpart of a fruit fly gene called hedgehog (hh), which instructs the nervous system to develop properly. The same gene is at fault in some human cases, Muenke and his colleagues Stephen Scherer and Lap-Chee Tsui at the Hospital for Sick Children in Toronto soon showed. Muenke says he has also found that mutations in other genes affecting the Shh signal can cause HPE. But many cases of HPE have not been traced to specific genetic lesions, opening a possible role for environmental factors.

    At least in animals, toxins that interfere with cholesterol metabolism can cause similar abnormalities. The first solid clue that cholesterol might play a role in development came 2 years ago, when Beachy and his colleagues discovered in test tube experiments that cholesterol prepares the Hh protein to deliver its message by reacting with the molecule, cleaving it, and then remaining bound to the now-active half (Science, 11 October 1996, p. 255). This was the first time anyone had seen cholesterol form a strong, covalent bond with a protein. It also got Beachy wondering whether cholesterol abnormalities could sabotage development in the same way as defects in Shh itself do.

    The researchers looked in the literature for clues that might help them connect what they had learned about cholesterol and Shh signaling with HPE. They struck pay dirt when they found reports of two classes of teratogens—compounds that induce birth defects—that mimic the effects of eliminating the Shh gene. Compounds in one group, which Charles Roux and colleagues showed over 30 years ago can induce HPE in rats, are known to interfere with the body's production of cholesterol. The other compounds came to light in the 1950s and early 1960s, when Richard Keeler and Wayne Binns traced a high incidence of cyclopia in lambs to chemicals in the plant Veratrum californicum, or corn lily, which the ewes had eaten. These compounds resemble cholesterol structurally.

    Beachy's team members reasoned that all of these teratogens somehow interfere with cholesterol's ability to perform the function they had observed in the test tube: activating and binding Shh. But they were about to get a surprise. Both Beachy's group and the team of John Incardona, Raj Kapur, and Henk Roelink at the University of Washington, Seattle, found that what these compounds actually do is render cells that receive the Shh signal unable to respond properly. The finding shows, says Beachy, that “there are at least two different roles for cholesterol in the [Shh] pathway. It's important for both the signaling protein and in the target cell.”

    This new role for cholesterol in Shh signaling emerged in Beachy's lab after Jeffery Porter, currently at Ontogeny Inc. in Cambridge, Massachusetts, and Michael Cooper treated cells and embryos with the teratogens and traced what happened to the Shh protein. The team found that the toxins did not affect the size of the molecule—suggesting it was being cleaved and modified properly—or where it ended up in the embryo. But “what really knocked our socks off,” says Beachy, is what happened when Cooper added the compounds along with purified active Shh to a piece of neural tissue from a chick embryo that cannot make the protein but can respond to it. The cells ignored the Shh protein, failing to turn on and off the genes that Shh normally controls. “There's an active signal present but no response,” says Beachy. “The defect induced by the compounds must be in the responding tissue.”

    A clue to the defect came when the researchers took a closer look at cells treated with the compounds and found an unusual distribution of sterols—cholesterol and related molecules. A “river of sterols,” as Beachy describes it, normally travels back and forth between the cell surface and the endoplasmic reticulum (ER)—the cellular compartment where cholesterol is made. The teratogens apparently dam up this river: Cholesterol builds to excessive levels on the cell surface, while levels in the ER appear unusually low.

    It makes sense that defective sterol trafficking might interfere with Shh signaling, says Beachy. A target cell protein called Patched, which binds Shh and plays a critical role in signaling, contains a stretch of amino acids that resembles the sterol-sensing domains in several other proteins. These proteins, which help regulate the cholesterol levels in the cell, use the domains to measure the amount of sterols present and adjust cholesterol production accordingly.

    No one yet knows for certain how to explain the new observations, says Beachy, but he proposes that a shortage of cholesterol at the ER, detected by the sterol-sensing domain of Patched, might lock the protein into an inactive state and keep it from relaying the Shh signal. “Cell proliferation is often a part of the response to Hedgehog proteins,” says Beachy. “Maybe a cell monitors cholesterol levels before it responds to Hedgehog,” to ensure that it has enough cholesterol to make new cell membranes. “That kind of system could allow the cell to ask, ‘Am I making enough?’” before it goes on to multiply. If not—or if a teratogen has interfered with cholesterol trafficking within the cell—Patched shuts down the pathway, and development goes awry.

    It's a plausible scenario, says William Mobley, a neuroscientist at Stanford University School of Medicine: “We have to start thinking of sterols as molecules that impact the function of signaling proteins within cells.” Incardona, however, thinks the teratogens' effect on cholesterol isn't the full story. He thinks that at least some of the compounds may interfere directly with Patched or some other component of the cell's response to Shh. “The trafficking defect may not be the main teratogenic effect,” he says. “In my hands, the plant compounds are teratogenic at concentrations well below those where they cause trafficking defects.”

    New studies of genetic HPE are reinforcing the connection between cholesterol and development. For example, mice that lack the gene for megalin, a cell surface protein that binds and internalizes cholesterol, show signs of HPE. Genetic aberrations that result in faulty cholesterol metabolism also may contribute to human HPE, as a disorder known as Smith-Lemli-Opitz syndrome (SLOS) indicates. SLOS patients have developmental delays, mental retardation, and, in some 5% of cases, HPE. They accumulate a biochemical precursor of cholesterol, and several recent studies have identified in these patients mutations in the gene for the enzyme that converts this precursor into cholesterol.

    So can eating a cholesterol-rich diet reduce the risk of birth defects in mothers at risk for having babies with HPE? In mother rats exposed to the teratogenic compounds that inhibit cholesterol biosynthesis, it apparently can, says Richard Kelley, a human geneticist at Johns Hopkins. “This really makes you wonder whether the mother's cholesterol metabolism will influence the severity of HPE in humans.” Kelley is quick to add that there are big differences in how rat and human mothers transport nutrients to embryos, and that consuming a lot of cholesterol doesn't mean it will reach the embryo. But such musings are sure to be put to the test as researchers explore cholesterol's surprising new role.
