News this Week

Science  10 Mar 2000:
Vol. 287, Issue 5459, p. 1722

    Reports Will Urge Overhaul and Delays to NASA's Mars Missions

    1. Andrew Lawler

    In the Hollywood movie Mission to Mars released this week, a human trip to the Red Planet goes mysteriously awry, and a team from Earth must quickly attempt a daring rescue effort. A similar real-life drama is under way, with teams of engineers, scientists, and managers working around the clock to save not a stranded astronaut but NASA's troubled Mars research program.

    At stake is a $1 billion effort stretching over a decade that aims to understand Mars's climate, examine the planet's resources, and search for signs of life. Following a 1996 report that a martian meteorite contained possible evidence of microscopic fossils (Science, 16 August 1996, pp. 864 and 924), NASA planned to launch an ambitious series of orbiting spacecraft and landers every 2 years, culminating in the return of a soil and rock sample to Earth in 2008. The fiery death of the Mars Climate Orbiter in September 1999 and the silence from the Mars Polar Lander 3 months later have shattered that plan. Now NASA is rushing to come up with a less daring but more realistic effort—one that agency officials warn will cost more and take longer to accomplish.

    Prompted by the failures, NASA Administrator Daniel Goldin ordered a battery of internal and external reports to reevaluate the agency's direction (see chart). Most are due for public release next week. Science has learned that the blueprint to emerge from these still-confidential reports would add smaller spacecraft to the currently planned missions, set up a communications and navigation network around the planet, and provide a longer term vision to take Mars research to 2020. Sources familiar with the deliberations say the revised plan would also delay some science missions—such as the lander being readied for a 2001 launch—and cost $300 million to $500 million more than NASA originally estimated.


    The Goldin-ordered reports will urge sweeping revisions in the way the Mars program is planned, managed, and executed. An independent panel chaired by former Lockheed Martin manager Tom Young is expected to criticize the Jet Propulsion Laboratory (JPL) in Pasadena for failing to manage the last two Mars missions adequately. It also will take both NASA headquarters and Lockheed Martin to task for their respective roles in the botched orbiter and lander projects. NASA officials are bracing for the worst. Goldin recently told his managers that the Young report will be the Rogers Commission of space science, referring to the devastating critique delivered by a panel that examined the 1986 Challenger disaster. Several congressional committees are eagerly awaiting the results of the Young panel and plan to hold hearings in the coming weeks to examine NASA's plans to recover from the two losses.

    This debate over Mars exploration will likely prove very public and very painful for the space agency in the short term, but space science chief Ed Weiler and outside scientists stoically predict that the attention could pay off in the long haul. “It's a wake-up call. We have an opportunity to correct the mistakes we made,” says Weiler, who adds that too little funding and too much complexity made the old Mars plan “an absolute disaster.” Michael Drake, a planetary scientist at the University of Arizona in Tucson, agrees: “Everybody was deluding themselves that it would work.”

    Indeed, until recently, that plan was seen by many NASA officials and academic scientists as ambitious but not unrealistic. The tentative evidence of microfossils in a martian meteorite triggered a wave of enthusiasm for exploring Mars that swept the public, the White House, and Congress. “The rock served us well by igniting public support, which translated into a greater budget and a new focus” on studying Mars, says Drake.

    The result was a strategy for a series of missions ultimately aimed at returning a martian sample to laboratories on Earth as early as 2005, although the target date was later revised to 2008. A panel of the National Research Council concluded in 1998 that the Mars exploration plan was “a well-thought-out and rational approach to achieving NASA's programmatic goals.” Although some researchers expressed concern about the pace and complexity of the missions, others were delighted with the program's newfound popularity.

    But orbiting Mars is tricky; numerous Russian spacecraft have missed their target, and NASA's massive Mars Observer apparently exploded in 1993 as it reached its destination. Landing is even tougher, given the distance from Earth and the uncertainties about the martian terrain. And collecting samples, ferrying them to orbit, and rocketing them to Earth, as proposed in the sample return, has never been done.

    Despite the odds, the 1997 success of Mars Pathfinder, with its innovative airbag landing and its tiny but indefatigable rover, raised hopes that NASA was up to the technical challenge. And the overall success of Mars Global Surveyor, despite some notable glitches, bolstered expectations that the 1999 missions would succeed. The subsequent losses of the lander and the climate orbiter stunned NASA managers and alarmed the scientific community.

    The climate orbiter's demise was quickly pegged to contractor Lockheed Martin's failure to convert English units to metric. The lander's fate, however, remains a mystery. “We still don't have a 100% smoking gun,” says Weiler. A panel led by retired JPL manager John Casani came up with several theories, including the premature shutdown of the descent rocket, but none is certain.

    In the aftermath, the finger-pointing has begun in earnest. Some scientists blame Goldin and the White House for pushing the program too hard and too fast with too little money. Others cite JPL's poor management of the projects and Lockheed Martin's underbidding of the Mars contract and problems on the factory floor. Weiler, however, says “everyone is to blame”—including the scientific community—for overconfidence.

    A JPL-led panel chaired by Charles Elachi last week briefed NASA officials and the Young panel on its proposed new exploration blueprint. The most pressing issue is whether to launch the 2001 lander, which is nearly identical to the one that failed, or to scrap the launch and use the hardware for a later mission. Weiler said after the briefing that JPL suggests the latter and strongly hinted that he sees no other option. Flying the mission next year, he added, would require adding a hefty package of communications hardware so that engineers could monitor the descent and avoid the still-unexplained fate that befell the December lander; this modification would halve the amount of science the lander could perform and boost costs by another $30 million to $40 million. On the other hand, the 2001 orbiter will likely get a green light, because the failure last year of a similar spacecraft was easily explained and easily corrected.

    The Elachi plan includes launch of a communications system and navigational beacons in the next few years to aid later orbiters and landers, while a series of “micro” missions would complement the larger landers by examining the planet more broadly, according to sources familiar with the details. A sample would likely not be returned to Earth before 2010. So far, however, Weiler is not satisfied with what he's seen of the Elachi plan, although he declines to be specific. “It's not even close,” he grumbles, adding quickly that “it's a work in progress” and that “the people at JPL have made a good start.” Weiler says that whatever the details, he will insist that the new plan include a contingency fund of at least 25%—far more than the current 10%—as well as funding for handling and studying the Mars samples. And he says that although “the sample return will be a major part of the new architecture, it will not drive it like the old one did.” Hashing out the details may take until summer, he says. “Let's slow down and do this right. Taking it slow sometimes is better than speeding up—and screwing up.”

    Going slow is fine with many scientists who say they were nervous about the earlier plan. “If you delay a cycle or so, it's not a disaster,” says David Black, an astrophysicist at Houston's Lunar and Planetary Institute. Michael Malin, a geologist who heads San Diego's Malin Space Science Systems, agrees. “Mars just isn't the place we thought it was,” he says. “A slower, more deliberate and diversified scientific investigation program would be a better long-term investment than an Apollo-like race to return samples.” So unlike the Hollywood version, NASA's Mars rescue mission likely will include more patience than daring.


    Talks of Public-Private Deal End in Acrimony

    1. Eliot Marshall*
    1. With reporting by Leslie Roberts and Elizabeth Pennisi.

    Any hope of getting publicly and privately funded scientists to work together on the human genome dissolved this week in a bitter dispute over who would control the raw data. The dispute went public on 5 March, when the Wellcome Trust, a British charity that funds genome research, released a letter spelling out the details of a controversy that has been simmering for months. As word spread that the trust had released the letter, tempers flared, triggering a flurry of finger-pointing as each side accused the other of sabotaging a potential collaboration.

    The principals in the dispute seem to think the chances of mending the break are slim. “I'm pretty angry,” says Tony White, CEO of PE Corp., chief backer of the private effort to sequence the genome. White even goes so far as to call the attitude of Francis Collins, director of the U.S. National Human Genome Research Institute (NHGRI), the chief U.S. funder of the public venture, “hypocritical.” Collins was more restrained, calling the experience “disheartening.” In retrospect, neither side was very eager to have the negotiations succeed.

    This is the latest and sharpest upset in a long-running dispute between scientists involved in determining the precise sequence of the 3 billion units of DNA in the human genome. PE Corp. of Norwalk, Connecticut, and its subsidiary, Celera Genomics of Rockville, Maryland, galvanized the field when Celera's president, Craig Venter, announced in 1998 that he was planning to sequence the entire human genome by 2001. Venter said he would patent “several hundred” genes and offer conditional viewing rights to everything in his database. Nonprofit centers, led by NHGRI and the Wellcome Trust, responded by stepping up their own efforts. They rushed ahead with plans to generate a “draft” version of the human genome early in 2000, pumping results into public databases, which could undercut Celera's claims of exclusivity.

    Some observers saw this as wasteful and urged the academics to collaborate with Celera. Celera did forge a successful partnership with one group of publicly funded researchers—those working on the genetics of the fruit fly (Drosophila melanogaster). Together, Celera and these university-based scientists cranked out the fly's genome with stunning speed (Science, 25 February, p. 1374). But attempts to collaborate on human DNA haven't gone smoothly.

    After unproductive discussions on sharing data in early 1999, Celera and NHGRI let the subject drop. Then last autumn, a newcomer began mediating between the public and private labs: Eric Lander, director of one of the best-funded academic sequencing centers, the Whitehead Institute/MIT Center for Genome Research in Cambridge, Massachusetts. As the talks grew more formal, Collins says, the public centers elected four colleagues to represent them. In White's recollection, Lander was “kicked off the team” and replaced by Collins; National Institutes of Health (NIH) director Harold Varmus (now president of the Memorial Sloan-Kettering Cancer Center in New York City); Robert Waterston, director of the genome center at Washington University in St. Louis; and Martin Bobrow, a medical geneticist at Cambridge University in the U.K. and a governor of the Wellcome Trust.

    These four met with a Celera team on 29 December. Then, claiming to have received no serious response from Celera after that session, they sketched out their unhappiness with Celera's bargaining position in a letter to Celera dated 28 February. In a telephone interview with Science, Bobrow confirmed that the trust gave this letter to the press on 5 March but said, “I don't know” exactly how this decision was reached. Bobrow says that the talks “are at an end,” in his view, because Celera “basically turned [its] back on the discussion.”

    Printed on NHGRI stationery, the six-page letter itemizes “fundamental differences” that emerged between the academics and the Celera group. The letter describes the talks as “discouraging” and suggests that the idea of combining data from the public and private efforts “is no longer workable.”

    The letter says that Celera sought to retain control over the human genome for as long as 5 years by requiring that everyone seeking access to data produced by the collaboration agree to Celera's licensing terms. According to White and Venter, these terms are simple: Shared company data may not be redistributed to others or used in a commercial product without Celera's permission. This would be enforced through a license that data users would agree to with a mouse click as they either start up software on a DVD-ROM or log on to Celera's Web site. According to the letter, however, Celera also wanted to control future uses of the data, including publication of a finished version of the genome produced by the publicly funded labs. And the letter mentions that Celera wanted to reach “beyond databases,” controlling technical applications such as DNA chips.

    The representatives of the nonprofit institutions who signed the letter claim that they offered Celera 6 to 12 months of unilateral control over merged human genome data on Celera's Web site. But Celera wanted more time, they wrote—and this, combined with other demands, was “not in the best interests of science or the general public.”

    But White insists that he only suggested that Celera be given 5 years' control over the DNA sequence if Celera went along with a request to share its raw data (such as “tracings” from DNA sequencing machines) with co-authors. Otherwise, he said, exclusive control might end in 2003, when the public effort to finish the genome is due to be completed. Similarly, White said, the discussion of long-term claims on DNA chips and other applications arose only in the context of sharing confidential trace data.

    The authors of the NHGRI letter were especially concerned that Celera might use data from the publicly funded labs in its own sequencing efforts, and, if no agreement were reached, might publish a scientific paper on the final sequence without consulting the academics who generated the data and deposited it in public data banks. “Publication of other groups' primary data without consent is considered to be a breach of scientific ethics,” the NHGRI letter scolds. Venter shoots back that NIH officials have talked about publishing data derived with Celera's help, but without seeking Celera's consent.

    This part of the dispute particularly annoys White. He fumes that the whole argument seems to boil down to who will get credit for completing the human genome. No, says Collins, the real issue is whether the human genome will be locked up in a “monopoly” for the next 3 to 5 years.


    Einstein Probe Remains Earthbound

    1. Andrew Lawler

    A NASA-funded effort by Stanford University to test Einstein's theory of relativity faces $70 million in cost overruns and an additional 6-month launch delay. This setback is the latest in a long-running series of technical snafus and political tensions that have plagued the experiment. Frustrated NASA managers say they will decide this summer whether to kill the Gravity Probe B effort, but they seem unlikely to carry through with the threat.

    Originally slated for a 1999 launch, the probe is designed to measure how Earth's mass warps space-time. Physicists say the results could provide hard evidence to prove Einstein's theory. To make those delicate measurements, the science package of the spacecraft is contained within a supercooled structure that resembles a giant thermos bottle. But problems with gyroscopes contained inside that package, coupled with inadequate cooling in the neck of that bottle, are forcing time- and money-consuming fixes. The situation is “annoying, very embarrassing, and very frustrating,” concedes the mission's principal investigator, Stanford physicist Francis Everitt.

    How much money and how much time those fixes will take is a matter of heated debate. Much hinges on a series of tests slated to begin in May and conclude later this summer. Everitt insists that there is at least a 50-50 chance that the probe can still be launched as scheduled by September 2001, with a cost overrun of about $40 million. But an independent panel assembled by NASA space science chief Ed Weiler puts the cost overrun at $70 million. The panel is also significantly less confident that Stanford will meet the September launch date. Rex Geveden, the program manager at NASA's Marshall Space Flight Center in Huntsville, Alabama, says, “No one is taking that date seriously.” He thinks April 2002 is more likely.

    Science obtained a copy of the report of the panel, which was led by retired Lockheed Martin engineer Parker Stafford. It concludes that “the project is in good shape from a technical standpoint,” but says Stanford needs an experienced integration and test manager as well as NASA engineers onsite to monitor progress. The Stafford panel also takes NASA to task for repeatedly threatening to cancel the program, which the report says undermined Stanford's confidence and led the university team to be less than forthright about the technical difficulties it encountered.

    Irritated NASA officials complain that Stanford managers have not been clear about technical problems, and they assert that the Stanford team has repeatedly gone over their heads by lobbying Congress for support. But Everitt insists Stanford has been up front all along. Nor does he think the extra funding will be a terrific strain, as NASA promised in 1994 to set aside $53 million in case of probe overruns.

    That money, however, is not in the bank. Weiler says he must cut other programs to save the probe. In jeopardy are the Europa orbiter, which could be delayed by more than 2 years, or plans for new spacecraft in a series of ongoing midsize science payloads, says the space science chief. The other option, he threatens, is to kill the probe, but Weiler admits he hesitates to do so given the $450 million and 3 decades already invested in it. Weiler intends to wait until critical tests are completed this summer before making a final decision. Even Everitt agrees that if the probe does not come through the tests with flying colors, then “it will be very reasonable for NASA to ask very serious questions.”


    Solar Physicists Get a Glimpse of the Far Side

    1. Robert Irion

    The far side of the sun seems inaccessible, obscured from our view by 1.4 million kilometers of hot, seething gas. But because the sun rotates once every 27 days, that hidden face emerges without fail to shine upon us and, at times, launch dangerous storms our way. Now, researchers may have learned how to detect storms brewing on the far side of the sun, 2 weeks before they swing toward Earth, thanks to a technique that literally hears the rumbling of big sunspots through the sun itself.

    Angry star. A new technique that reveals magnetic disturbances on the far side of the sun may give 2 weeks' advance warning of major solar storms, such as this massive ejection of gas on 27 February.


    The technique, dubbed “helioseismic holography,” relies on acoustic vibrations that ring the sun like the solar system's largest bell. On page 1799 of this issue, solar physicists Charles Lindsey and Douglas Braun of the Solar Physics Research Corp. in Tucson, Arizona, explain how they unraveled tiny stutters in those vibrations and traced them clear across the sun to a massive disturbance on the other side. Although their first acoustic hologram of this region is smudgy, the researchers say their images should improve. Says Lindsey: “In a very short time, we will monitor the far side of the sun routinely for large active regions.”

    That prospect has lit up the solar physics community. “It's so exciting that we can now manage to look at the far side of a star,” says Bernhard Fleck, the European Space Agency's project scientist for the Solar and Heliospheric Observatory (SOHO), which gathered the helioseismic data. “Using the entire sun as an acoustic lens is an elegant and brilliant idea.” Those who stare at the sun for its potential impacts on Earth are equally enthusiastic. “This presages a daily analysis of activity on a region of the sun that has been completely out of bounds to us,” says Ernest Hildner, director of the Space Environment Center at the National Oceanic and Atmospheric Administration in Boulder, Colorado.

    Although it took a decade for Lindsey and Braun to make their concept work, they maintain that it's simple in principle. Gas churns within Texas-sized convection cells near the sun's surface. Those motions propel acoustic waves into the star, where they skip and reflect until the entire sun hums in a complex, low-pitched cacophony. Many vibrations cancel out, but some reinforce one another to create resonant frequencies, like the deep tones within an enormous organ pipe. The Michelson Doppler Imager aboard SOHO and ground-based instruments spot those frequencies by using Doppler-shifted sunlight to reveal throbbings of the sun's surface, which rises and falls by tens of kilometers every few minutes.

    During their travels, the acoustic waves speed up when they encounter high gas pressures and temperatures deep within the sun or strong magnetic fields near the surface. Sunspots and other storm centers at the surface usually lie within vast regions of strong and tangled magnetic fields, called plages. These magnetic fields press the sun's surface downward by 100 kilometers or more. That shortens the distance that some acoustic waves travel, making them reflect within the sun more quickly than they otherwise would. It's like a dent in the organ pipe, Lindsey says: Waves bouncing off such a region get slightly out of phase with the rest and disrupt the resonant frequency.

    Lindsey and Braun saw such a pattern when they analyzed SOHO data from 1998. On 8 April, a major active region appeared on the eastern limb of the sun as it rotated. Tracing backward, the researchers calculated that the region lay on the far side opposite SOHO's vantage point on 28 and 29 March. Sure enough, the travel times of certain waves sped up by about 6 seconds on those days—a mere hiccup during their 3.5-hour journeys from the near to the far side, but enough to create a splotch in the acoustic signatures. Many such analyses over the face of the sun allowed the team to construct a fuzzy hologram of the hidden plage, which covered 300 square degrees of the sun's surface.
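    To get a feel for how subtle that signal is, here is a minimal back-of-envelope sketch using only the figures quoted above (a roughly 3.5-hour acoustic journey and a 6-second speedup); it is an illustration, not the authors' analysis:

```python
# Back-of-envelope: how small the far-side travel-time anomaly is,
# using the figures quoted in the article (illustrative only).
travel_time_s = 3.5 * 3600   # ~3.5-hour acoustic journey, near to far side
anomaly_s = 6.0              # waves crossing the plage arrive ~6 seconds early

fraction = anomaly_s / travel_time_s
print(f"fractional anomaly: {fraction:.1e}")  # about 5e-4
```

    A timing shift of a few parts in ten thousand is why many analyses over the sun's face must be combined before the hidden plage shows up as a splotch in the hologram.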

    That's a gigantic swath, but the technique can't yet visualize anything much smaller than 100 square degrees. “We're working on improving our resolution, but the larger regions are exactly the ones of most interest to space-weather forecasters,” says Braun, who now works at Northwest Research Associates Inc. in Boulder. The method can detect active regions within about 50 angular degrees of the center of the sun's opposite face, Braun notes, although he is now devising ways to extend the analysis to areas near the edge of the far side.

    Indeed, researchers at the Space Environment Center often see huge plages rotate into view on the sun's eastern limb “angry and ready to explode,” says Hildner. That gives the forecasters no more than a week's warning before the regions may take aim at Earth with a barrage of flares and coronal mass ejections, huge belches of plasma laced with magnetic fields. Solar physicists are still struggling to understand which plages will erupt and which outbursts will affect Earth once they arrive (Science, 24 December 1999, p. 2438). Even so, another week of advance warning may help electrical utilities or satellite operators plan for possible disruptions and put key instruments into a safe mode. “If this technique can reveal which active regions are growing in magnetic strength as they cross the far side of the sun, that's enormously promising,” Hildner says.

    Protecting humans in space may be the greatest benefit, especially with astronauts due to spend thousands of hours on spacewalks during the next decade to assemble the international space station. With far-side monitoring, “we probably will be able to give a general ‘all-clear’ notification that we see no evidence of big active regions for the next 2 weeks or so,” says William Wagner, discipline scientist for solar physics at NASA headquarters in Washington, D.C. However, such an alert system would require continuous listening and rapid analysis of the sun's acoustic symphony. That may fall to the next generation of solar satellite beyond SOHO or to a ground-based helioseismic network now being upgraded: the Global Oscillation Network Group, appropriately known as GONG.


    Buried Channels May Have Fed Mars Ocean

    1. Richard A. Kerr

    A team of geophysicists may have found a missing link in the growing body of evidence that Mars once had a major ocean. On page 1788 of this issue of Science, researchers analyzing gravity data from the Mars Global Surveyor (MGS) spacecraft report that they have detected a system of now-invisible, buried channels that delivered water from Mars's southern highlands into the northern lowlands billions of years ago. If they're real, these channels “greatly increase the chances of an ocean” on early Mars, says geophysicist Norman Sleep of Stanford University. But MGS geophysicist Roger Phillips of Washington University in St. Louis warns that “whatever it is, it's going to be tough to test.”

    Signs of an early ocean on Mars have been accumulating for years, but the evidence has been far from conclusive. First, geologists spied hints of a shoreline around the northern lowlands in 20-year-old Viking images, although preliminary analysis of more detailed MGS images has failed to confirm them. Then, after MGS topographic measurements showed the northern lowlands to be the flattest, smoothest known surface of broad extent in the solar system, planetary geologist James Head of Brown University and his colleagues marshaled a variety of MGS images and topographic data to suggest that the lowlands resemble an ocean basin that has been at least partially filled with water (Science, 4 December 1998, p. 1807, and 10 December 1999, p. 2134). But where would the water have come from? Catastrophic floods gushing toward the northern lowlands clearly cut deep channels into the edge of the southern highlands, but they would have been too infrequent and too small to have created and maintained an ocean in the freezing cold of Mars. And there's no sign of a Mississippi or Nile river that carried enough water to fill the lowlands.

    The new gravity data from MGS may provide the answer. The varying pull of gravity from place to place on Mars causes MGS to bob up and down, motions that show up as Doppler frequency shifts in the spacecraft's radio transmissions. After accounting for gravity variations due to the undulating surface—the extra mass of volcanoes and the dearth of mass over deep basins—the remaining variations can be attributed to variations in subsurface density. The deeper the crust extends into the denser mantle or the deeper sediment fills a channel in the denser crust, the lower the gravity.
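    The Doppler tracking described above follows the standard first-order relation Δf/f = v/c: a tiny change in the spacecraft's line-of-sight velocity shows up as a proportional shift in its radio carrier frequency. A minimal sketch, with a hypothetical carrier frequency and velocity perturbation chosen for illustration (not actual MGS mission values):

```python
# Illustrative sketch of one-way Doppler tracking (not MGS pipeline code).
C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(carrier_hz: float, velocity_ms: float) -> float:
    """First-order Doppler shift for a line-of-sight velocity (v << c)."""
    return carrier_hz * velocity_ms / C

# Hypothetical round numbers: an X-band carrier (~8.4 GHz) and a 1 mm/s
# velocity perturbation from a local gravity anomaly.
shift = doppler_shift_hz(8.4e9, 1e-3)
print(f"{shift:.3f} Hz")  # about 0.028 Hz
```

    Millimeter-per-second velocity changes thus map to frequency shifts of only hundredths of a hertz, which is why the surface-mass corrections described next must be made so carefully before interpreting what remains.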

    On a broad scale, MGS gravity has revealed a thinned crust beneath the northern lowlands, a sign of early plate tectonic-like processes on Mars (Science, 14 January, p. 218), but on a finer scale it shows a pattern that geophysicist Maria Zuber of the Massachusetts Institute of Technology and her MGS team members suggest marks a system of now-buried channels. Narrow zones of lower gravity extend from two clearly visible major flood channels—those leading from the Valles Marineris and Kasei Valles into Chryse Planitia, the plain where the Viking 1 lander still sits. Doubting that the martian crust just happened to thicken along narrow zones that run out from two major flood channels, Zuber, Phillips, and their colleagues attribute the lower gravity to channels cut early in martian history and filled with sediment less dense than crustal rock. And these channels are big: 200 kilometers wide, thousands of kilometers long, and at least 1 to 3 kilometers deep.

    Such size “indicates that these outflows brought an awful lot of water to the lowlands,” says Zuber, as well as a lot of sediment. The putative buried channels extend northward past the end of the flood channels at about 30°N, past the last signs of surface flow now put at about 45°N in MGS topography, to as far north as 75°N, which is well into the North Polar Basin. “You need a huge amount of sediment deposited in the northern part of Mars” to bury the channels, says Phillips. Presumably the sediment would have been carried by an equally huge amount of water.

    Oceans of water in the first eon of martian history “may very well be correct,” says Sleep. “The explanation they offer is a perfectly reasonable one,” agrees planetary physicist David Stevenson of the California Institute of Technology in Pasadena, but “the problem with gravity is there are always other explanations.” MGS geologist Michael Carr of the U.S. Geological Survey in Menlo Park, California, would prefer one of those. The gravity lows are real enough, he says, but they could reflect some thickening of the crust rather than sediment-filled channels. “I'm very skeptical,” he says, because the flow paths indicated by the buried channels do not follow “what you can see in the topography.”

    Phillips isn't yet ready to insist on the reality of buried channels. “They could be something else, I suppose,” he says. “It's a difficult hypothesis to test.” He and his colleagues will be doing their best by checking whether the buried channels could be sloping downward all the way north and whether the highlands could have supplied all the sediment that the buried channels imply.


    Panel Says NASA Must Show Results--Fast

    1. Andrew Lawler

    NASA should pull the plug on efforts to grow crystals in space unless it can show better results from its investment. That stern advice comes from a National Research Council (NRC) panel in a report that takes a critical look at the agency's oft-repeated scientific rationale for building the $100 billion international space station: using a microgravity environment to produce new crystals and to better understand cell growth.

    Weighty criticism. NRC panel says some microgravity studies need rethinking.

    NASA spends nearly $20 million annually on cell science and protein crystal growth, which will shift to the space station in the coming years. But the NRC panel said that significant changes are needed to warrant continued support for crystallography. Reviewing years of work on the U.S. space shuttle and the Russian space station Mir, for example, the panel concludes that “one cannot point to a single case where a space-based crystallization effort was the crucial step in achieving a landmark scientific result.” The panel also takes NASA to task for not dealing adequately with the perception that the agency is “not really interested in input from outside” in running its research program.

    Eugene Trinh, the new NASA microgravity sciences chief, accepts much of the criticism but says the agency already is addressing most of the issues raised. “We're way ahead of this report,” says Trinh, a former fluids researcher at NASA's Jet Propulsion Laboratory in Pasadena, California.

    Protein crystals serve as a foundation for much basic science in biology as well as drug development, and the near-complete absence of gravity in orbit offers the possibility of creating larger and purer crystals. But the more powerful beams coming from such new synchrotron sources as the Advanced Photon Source at the Department of Energy's Argonne National Laboratory outside Chicago are providing sharper pictures of these structures, making size less important. The role of microgravity in creating purer crystals is also ambiguous. Some 36 of 185 proteins and other biological macromolecular assemblies studied in space have shown higher resolution than the best results for those same materials on Earth, the panel notes, but it's not clear whether microgravity was the biggest factor contributing to those results.

    Overall, the panel concludes that the impact of microgravity crystal work on structural biology “has been extremely limited.” It urges NASA to fund competing work on Earth- and space-based crystals and to compare the results. If the results show no new breakthroughs from space-based projects, the panel warns, “then NASA should be prepared to terminate its protein crystal growth program.”

    Trinh says NASA will conduct such a competition and that the agency already intends to de-emphasize its former plans to grow crystals on a large scale. But he adds that the agency does not want to shut the door on potential commercial users of the station who might conduct crystal experiments. And Larry DeLucas, a crystallographer at the University of Alabama at Birmingham, says the past may not be a good guide to the future. He points out that the typical weeklong shuttle flight is often too short. “On the shuttle, 50% of our crystals grew too slowly” to be useful, he says. “The length of time is the real handicap,” not the environment.

    The panel also recommends that NASA reduce its emphasis on bioreactors, rotating vessels for growing cells aboard the station. The small amounts of data generated by the bioreactors, the difficulty in removing dead cells, and other technical issues could limit their usefulness, the panel members argue, and newer technologies, such as miniaturized culture systems and compact analytical devices, should be explored. But Trinh maintains that the bioreactors work well in the early stages of research.

    Going beyond biotechnology, the panel also takes a swipe at NASA's practice of “borrowing” money from the pot allocated to new research facilities to pay for station construction. That trend, the panel members warn, will erode trust in the agency's user community, because “continual uncertainty is demoralizing and discouraging” and because researchers want to use the best facilities. If it continues, the report states bluntly, “NASA will send a clear message that science on the [space station] has a low priority and will alienate the research community even more.”

    Another tough issue is how to undo the perception that NASA's biology program is a closed shop. Many of those involved in working groups or advisory committees “are … the same people who make up the pool of grantees,” the report notes. The panel urges the agency to expand its outreach efforts with the scientific community, and Trinh says NASA is doing just that. “We were really remiss,” he says. “But once we open up our program to researchers in academia and industry, it will be easy to show that we are not parochial.”


    Duo Dodges Bullets in Russian Roulette

    1. Richard Stone

    One is in the twilight of his career, a physicist virtually unknown beyond Russia's borders. The other is an oceanographer in his prime, a rising star outside his native Ukraine. What these two have in common is a tribulation that once spelled death for a scientific career, if not for the accused himself: Each was charged with a serious crime by his country's security apparatus. Now the two share happier circumstances. Last month, both won victories suggesting that the judicial systems in the young democracies of Russia and Ukraine are not inclined to rubber-stamp trumped-up accusations against scientists.

    In one case, 70-year-old Vladimir Soyfer of the Pacific Oceanological Institute in Vladivostok had been accused by the Federal Security Bureau (FSB), the successor to the KGB, of mishandling classified data. He won an initial court battle on 11 February, when a judge in Russia's Far East ruled that the FSB obtained the evidence on which the charges were based through an illegal search. The FSB has appealed the ruling, but if it stands, observers say, the decision would cripple the FSB's case.

    KGB target.

    Physicist Vladimir Soyfer is hoping for exoneration.


    The second researcher, Sergey Piontkovski, 46, of the Institute of Biology of the Southern Seas in Sevastopol, Ukraine, got even better news. He was preparing to stand trial on charges of financial improprieties relating to his Western grant when, on 25 February, the local prosecutor dropped the charges soon after meeting with a delegation from the European agency whose grant was at the center of the controversy.

    These victories, along with the recent acquittal of a Russian environmental activist, are huge morale boosters for former Soviet scientists, who have forcefully and publicly defended their colleagues. “It is a very important sign for me. I used to believe that the court is always on the KGB's side,” says Valentina Markusova of the All-Russian Institute of Scientific and Technical Information in Moscow.

    Before it dissolved in 1991, the Soviet Union was notorious for making its citizens pay for opposing its policies or getting too cozy with Western colleagues, and scientists were no exception. The constant was a presumption of guilt, until glasnost in the late 1980s laid the groundwork for the almost libertarian freedoms briefly enjoyed by Russians after the Soviet Union's dissolution. The pendulum soon swung back, however. In 1994, for example, Russia's security service charged a former chemical weapons researcher, Vil Mirzayanov, with revealing state secrets about a new class of nerve gas (Science, 25 February 1994, p. 1083). The arrest sparked an international outcry, and charges against Mirzayanov were subsequently dropped. Nevertheless, arrests of scientists and environmentalists have continued.

    Among those seized was activist Aleksandr Nikitin. He was charged with espionage and divulging state secrets after co-authoring a report for Bellona, a Norwegian environmental group, on nuclear contamination from Russia's Northern Fleet. Last December, a judge in St. Petersburg acquitted Nikitin, a former nuclear safety inspector and retired Navy captain, and last month, the American Association for the Advancement of Science (which publishes Science) gave him, in absentia, its 1999 award for scientific freedom and responsibility. But Nikitin is not yet out of the woods. His case is on appeal, and he has not received his passport for foreign travel.

    Only weeks ago, prospects were looking much bleaker for others who had been accused. Take Soyfer, whose nightmare began on 26 June 1999, when FSB agents raided his office, then descended on his home a week later. During the second raid they seized papers related to Soyfer's research on Chazhma Bay off Vladivostok, which was contaminated with radioactive materials after an accident involving a Soviet nuclear submarine in 1985. The work—sponsored by the Ministry of Atomic Energy and done in collaboration with the Radiochemical Safety Bureau of the Russian Navy, whose Pacific Fleet is based nearby—went well for 2 years, Soyfer says. But then, he claims, the safety bureau's new director took a disliking to him and called in FSB agents in Vladivostok to help oust him from the project. After seizing the research materials, the FSB charged Soyfer in early July with revealing secret information that “compromised the state and military security of the Russian Federation.” Soyfer denied the charge, and colleagues rallied to his side.

    Later that month, 11 top scientists and a deputy of the Duma, Russia's parliament, signed a letter to acting President Vladimir Putin, then prime minister, pointing out that the government had decreed earlier that ecological data could not be classified as state secrets. “We urgently request that you take measures to end the illegal persecution of V. N. Soyfer and other scientists,” they wrote. Even more valuable to Soyfer's defense, an expert panel of the prestigious Kurchatov Institute in Moscow reviewed the disputed data and stated in a 7 February letter that none were secret. “The FSB does not have grounds for its attack,” says Soyfer, who's waiting for the FSB to formally exonerate him in the wake of the court's ruling that the search was illegal.

    Whereas Soyfer's cause was buoyed by his Russian colleagues, Piontkovski drew most of his support from scientists outside Ukraine. His saga began on 16 October, when agents from the Ukrainian Security Bureau (SBU) seized documents and cash from the homes and offices of Piontkovski and two colleagues (Science, 29 October 1999, p. 879). The focus of the search was Western grants that involved analyzing and digitizing data on bioluminescence collected over the past 30 years, first by Soviet and then by Ukrainian and Russian ocean expeditions.

    After accusing the researchers of illegally passing data to the West, the SBU worked up charges of illegal currency transfers against Piontkovski: receiving and distributing funds under a grant from INTAS, a European agency that supports East-West scientific cooperation. Despite numerous appeals, the leadership of the Ukrainian Academy of Sciences failed to achieve any breakthroughs, but once the case was handed to the prosecutor, INTAS sprang into action.

    INTAS chief David Gould and a legal adviser flew to Ukraine on 9 February, meeting with officials in Kiev and then with the prosecutor in Sevastopol. INTAS was ready to pull the plug on 55 new grants to Ukrainian teams if other scientists faced the threat of prosecution simply for cashing INTAS checks, Gould says. The prosecutor dropped the charges less than 2 weeks later. As Science went to press, Piontkovski was in Kiev, seeking a visa for an extended stay in the United Kingdom or the United States.

    Now, observers are anxiously following the cases of three other former Soviet scientists whose fates remain up in the air. The FSB is still investigating Vladimir Tchurov, a colleague of Soyfer's at the Pacific Oceanological Institute, who is accused of selling sensitive acoustic technology to China, and last November it arrested Igor Sutyagin, an arms control researcher at the Russian Academy of Sciences' USA-Canada Institute in Moscow, on espionage charges.

    Meanwhile, in Belarus—where democracy is struggling to take hold—Yuri Bandashevsky, an outspoken critic of the government's response to lingering health effects of the 1986 Chornobyl disaster, has been imprisoned since last July on charges of taking bribes. (His case has not been tried.) After the trio of recent judicial triumphs, the hope is that good news will again come in threes.


    Hardy Microbe Thrives at pH 0

    1. Elizabeth Pennisi

    This one is extreme, even for an “extremophile.” While surveying the depths of an abandoned copper mine, a team of geomicrobiologists has detected a new microbe that survives in some of the most acidic waters on Earth, at a seemingly impossible pH near 0. That makes this critter, a member of the microbial kingdom Archaea, one of a few record-setting bugs that can survive in conditions usually toxic to life as we know it.

    Not only does this microbe, dubbed Ferroplasma acidarmanus, survive, but it positively thrives. Indeed, it accounts for the overwhelming majority of life-forms found at the inhospitable mine, report Katrina Edwards of the Woods Hole Oceanographic Institution in Massachusetts and her colleagues on page 1796. And that, in itself, is “remarkable,” says microbiologist John Baross of the University of Washington, Seattle. Most extreme environments studied so far—whether hot, frigid, acid, alkaline, or high pressure—support a diversity of life, usually a collection of hardy bacteria, notes Baross. But at the Iron Mountain mine near Redding, California, just this one microbe rules.

    The bug's hardiness is even more surprising considering its architecture. Most microorganisms have cell walls to shield them from harsh surroundings, but not these Archaea. They are encased in what appears to be just a simple cell membrane—a seemingly flimsy way to guard against sulfuric acid and the high amounts of copper, arsenic, cadmium, and zinc also present in the drainage. Yet that membrane “seems totally capable of maintaining them in environments that would destroy most other organisms,” marvels William Ghiorse, a geomicrobiologist at Cornell University. Moreover, because this microbe can't survive in water of a normal pH, “the high acidity seems to be essential to maintain the membrane.” The researchers are trying to figure out how this membrane works.

    Edwards helped track down F. acidarmanus while she was a graduate student working with Jillian Banfield, a mineralogist at the University of Wisconsin, Madison. They were trying to understand the role of microorganisms in the geochemical processing of sulfur and the generation of acid mine drainage. Earlier Banfield and Edwards had sampled water from 500 meters inside the same mine, using molecular probes designed to recognize genetic material specific to different types of organisms. At the time, they found no signs of the bacteria that are often cultured from mine drainage water (Science, 6 March 1998, p. 1519). But the probes did detect large populations of Archaea—microbes typically associated with other types of extreme environments, such as hot springs.

    Over the past 2 years, the team periodically resampled the mine's waters and from them has isolated and grown the one species that is predominant: F. acidarmanus, which grows best at a pH of 1.2 but can grow in a range from pH 0 to 2.5. Most other organisms recovered from acid mine drainage grow best at a pH of 2.5 and survive anywhere from a pH of 1 to 4. Other researchers have also come across traces of microbial life in highly acidic settings. But Edwards and Banfield are the first to collect and quantify these Archaea from a natural environment. They have since found the same or similar species at other sites throughout the mine.
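    Because the pH scale is logarithmic, the gap between the pH ranges quoted above corresponds to huge differences in hydrogen ion concentration. The following back-of-the-envelope conversion is standard chemistry (pH = -log10[H+]), not a calculation from the article, and the labels are illustrative:

    ```python
    def h_ion_molarity(pH: float) -> float:
        """Convert pH to hydrogen ion concentration in mol/L (pH = -log10[H+])."""
        return 10 ** (-pH)

    # Each unit of pH is a tenfold change in acidity.
    for label, pH in [("pH 0 (most acidic mine water)", 0.0),
                      ("pH 1.2 (F. acidarmanus optimum)", 1.2),
                      ("pH 2.5 (typical acidophile optimum)", 2.5),
                      ("pH 7 (neutral water)", 7.0)]:
        print(f"{label}: [H+] = {h_ion_molarity(pH):.2e} mol/L")
    ```

    At pH 0 the water is roughly a one-molar acid solution, some 20 times more concentrated in hydrogen ions than at this microbe's own optimum of pH 1.2, and ten million times more than neutral water.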

    If these microbes are widespread in the natural world, as these findings suggest, how they got there “is the $64 million question,” says Banfield. Given their need for such acidic habitats, it's unclear how they could spread. Princeton University geochemist Tullis Onstott wonders whether such microbes found in Earth's subsurface became established when these geological formations first formed and then existed in a dormant state through geological time until conditions became suitable for them to spring back to life. Whatever the answer, the discovery of this archaeal species suggests that yet more bizarre microbes may exist out there, perhaps bugs that survive at negative pHs. Predicts Onstott, “If you keep looking, you will find them.”


    Strong Economy Lifts Some Research Boats

    1. Wayne Kondro*
    1. Wayne Kondro writes from Ottawa.

    OTTAWA—Flush with revenues from an unprecedented economic boom, the Canadian government last week unveiled a series of budget initiatives that would reinvigorate academe while making major thrusts in high-energy physics, genomics, and environmental technologies. But there's no increase for the country's research councils, and that could mean continued hard times for many scientists.

    The first wave of good tidings came on 28 February with news of an additional $615 million in the 2000-01 budget for refitting university labs and $109 million for a national genomics initiative (Science, 3 March, p. 1569). The next day the federal government nudged the high-energy physics community into the winners' circle by announcing plans to spend roughly $136 million over 5 years on operations and upgrades at the national laboratory for particle and nuclear physics.

    “It's a huge vote of confidence in basic research in Canada,” says Alan Astbury, director of the Tri-University Meson Facility (TRIUMF) in Vancouver, built in 1974 to develop a Canadian research capacity in particle physics. “This will allow us to do what you might call real nuclear physics.” TRIUMF plans to use roughly $15 million of its windfall for a fourfold boost in the energy level of its nearly constructed Isotope Separator and Accelerator (ISAC), to 6.5 million electron volts. ISAC takes low-mass particles, evaporates the nuclei, ionizes them, and then accelerates them to higher energies. The upgrade will allow scientists to work at an energy level “which at the moment doesn't exist in North America,” Astbury says, explaining that accelerators like the Relativistic Heavy Ion Collider at Brookhaven National Laboratory in Upton, New York, operate at much higher energy levels and generate “very short-lived” nuclei. Another $10 million will go toward components for the Large Hadron Collider being built by CERN, the European particle physics center near Geneva, bolstering Canada's contribution to international physics.

    Stronger global ties are also expected from a national genomics initiative, first proposed in 1998 (Science, 3 July 1998, p. 20). Martin Godbout, acting president of the nonprofit corporation overseeing implementation of the initiative, says it will allow Canada to participate in a number of international consortia. The G-5 of genomics nations (the United States, United Kingdom, Germany, France, and Japan) “has become the G-6,” he noted with pride.

    Provincial governments are expected to at least match the federal contribution to Genome Canada, creating a genomics center in each of five specified regions. Each center will receive up to $15 million a year to pursue work of interest to industries such as agriculture, health, forestry, or fisheries. “They'll have to include proteomics and sequencing and genotyping, so the technology platform will be a prerequisite for each center. But the centers will be allocated by sector,” Godbout says.

    The $615 million for refurbishing university labs, which will be channeled through the Canada Foundation for Innovation (CFI), is seen as dovetailing with an earlier commitment to create 2000 new research chairs (Science, 22 October 1999, p. 651). The 3-year-old CFI program, which would have run through its initial $680 million next year, is intended to rejuvenate university research facilities, including networking and databases. With the CFI awash in applications, president David Strangway said the new monies will allow the foundation to clear its backlog, undertake more “strategic” competitions aimed at bolstering specific scientific sectors, and possibly create a $70 million pot for international joint ventures in such areas as information technology or biotechnology.

    Although the government invested heavily in university infrastructure and personnel, the nation's three research granting councils received no increase in their support for basic research grants, despite skyrocketing demand prompted, in part, by the new infrastructure programs. “It's an appalling budget,” complained Jim Turk, executive director of the Canadian Association of University Teachers. “The universities will be worse off.”

    Similarly, the National Research Council (NRC) was stiffed for the second straight budget despite Industry Minister John Manley's earlier vow to make it a top priority and an aggressive lobbying effort by NRC officials (Science, 18 September 1998, p. 1781). NRC president Arthur Carty says he's “devastated” and must now consider a range of cuts. His options include across-the-board reductions, closure of one of its institutes, or ending the agency's contribution to such national projects as one to develop fuel cells and another to build a synchrotron light source. “It's pretty morale-sagging,” Carty noted.


    Top French Researchers Spar Over Synchrotron

    1. Michael Balter

    PARIS—A lively and often heated debate broke out last week among some of France's top scientists over plans to scuttle a major French synchrotron radiation facility and instead back a joint venture across the English Channel. But for all their oratorical fireworks, the scientists, who spoke at a parliamentary hearing, were essentially haggling over a possible consolation prize: whether France should build a second, smaller synchrotron facility on its own soil.

    Scientists use x-rays produced by synchrotrons to probe the atomic structure of proteins and other compounds. French researchers had been counting on getting an advanced x-ray source, the long-planned SOLEIL facility, until research minister Claude Allègre canceled the project last year. Instead, Allègre opted for French partnership in a new synchrotron to be built with the British government and the Wellcome Trust, the mammoth British medical charity (Science, 6 August 1999, p. 819). Allègre's decision has become a flash point for protestors unhappy with his research priorities.

    A French parliamentary commission, which has been examining Allègre's decision to pull the plug on SOLEIL, organized the 2 March roundtable at the National Assembly. The forum was the last step before the commission, headed by National Assembly deputy Christian Cuvilliez and Senate member René Trégouët, releases its report later this month. But the odds have grown vanishingly small that the panel will recommend canceling what some French researchers sarcastically call the “Franco-Wellco-British” synchrotron. “The negotiations are too far along” to put a stop to the project, Cuvilliez told Science.

    The debate got off to a rollicking start when Nobel laureate Pierre-Gilles de Gennes, a physicist with the Collège de France, questioned the importance of synchrotron facilities. While “the machines are useful,” de Gennes said, “if we are speaking of major unexpected discoveries made during the last 15 years, the result is practically nothing.” These remarks drew a blistering response. Roger Fourme, head of biology at LURE, an aging x-ray source at Orsay that Allègre wants to shut down in the next few years, rattled off a list of proteins whose structures have recently been solved using synchrotron radiation. And Yves Petroff, director-general of the European Synchrotron Radiation Facility (ESRF) in Grenoble, complained that “de Gennes has not kept up with what is going on in this field,” adding that research done at ESRF has been featured on the covers of Science and Nature four times since the facility went online in 1994.

    Whatever the field's intrinsic value, others defended Allègre's decision to join forces with the British. “If SOLEIL had been constructed, it would have had half of the capacity the new synchrotron will have,” said geophysicist Vincent Courtillot, research director at the French science ministry. Besides, he said, SOLEIL's price tag—estimated at between $160 million and $300 million—would have taken too big a bite out of the ministry's budget when it's straining to fund new positions for young scientists and to boost basic lab budgets.

    On the other hand, Courtillot said, the government is “absolutely open” to the idea of building a smaller synchrotron in France, although he insisted that any such decision must be made in consultation with European partners. This attitude won approval from Nobel laureate Georges Charpak, a physicist at the CERN accelerator facility near Geneva. “It is clear that Europe is behind Japan and the United States in synchrotron radiation,” Charpak said. “But does this mean we should catch up country by country?”


    Prospects Brighten for Berkeley Synchrotron

    1. David Malakoff

    Two years ago, the future looked grim for the Advanced Light Source (ALS), a premier synchrotron at the Lawrence Berkeley National Laboratory in California. A 1997 report criticized the management of the facility and the quality of the science it produced, and the Department of Energy (DOE) responded by cutting its budget. Now, things are looking up. Last week, a DOE advisory panel gave the synchrotron a glowing review, and DOE officials are now planning to boost its budget. “At last, we are out from under a very dark cloud,” says ALS director Daniel Chemla, a physicist at the University of California, Berkeley.

    The ALS, opened in 1993, is one of four DOE-funded synchrotrons producing x-rays used to illuminate the structure of everything from computer chips to protein molecules. Researchers have flocked to the particle accelerators cum microscopes in growing numbers over the last decade. Tight budgets during most of this decade, however, have forced DOE to make some hard choices about equipment upgrades and operating funds for the ALS; the National Synchrotron Light Source at Brookhaven National Laboratory in Upton, New York; the Advanced Photon Source at Argonne National Laboratory in Illinois; and Stanford University's Synchrotron Radiation Laboratory in California.

    To help it decide spending priorities, DOE convened a team led by Massachusetts Institute of Technology physicist Robert Birgeneau. Its report, issued in October 1997, stunned ALS officials, who had expected that their $100 million machine would sail through the review. Instead, the report found that the ALS was poorly managed, relatively underused, and scientifically unproductive, and that its “soft” or long-wavelength x-rays were less attractive to scientists than the hard x-rays produced elsewhere (Science, 17 October 1997, p. 377). Within weeks, DOE slashed the light source's $33 million annual operating budget by nearly 10% and postponed some proposed upgrades.

    Lawrence Berkeley leaders moved quickly to address administrative shortcomings identified by the report, which some felt unfairly compared the youthful ALS to its more mature siblings. They spun it off as a semiautonomous unit and hired Chemla to develop a long-range scientific plan and mend frayed relations with ALS users, who then numbered fewer than 250. They also made technical changes that allowed the ALS to produce the hard x-ray beamlines coveted by many scientists, increasing its appeal (Science, 27 August 1999, p. 1344).

    By last year, the changes were having their intended effect: A review by the University of California, which runs the Berkeley lab, found that the number of users had grown nearly fourfold and that ALS researchers were regularly publishing in premier scientific journals. “The place has really turned around,” says physicist Nora Berrah of Western Michigan University in Kalamazoo, who leads the ALS user committee. “We have worked very hard together to show that this is a great place for doing science.”

    The ALS's scientific productivity impressed members of the recent DOE review panel, led by Yves Petroff of the European Synchrotron Radiation Facility in Grenoble, France. One reviewer, materials scientist Richard Smalley of Rice University in Houston, Texas, was initially doubtful he would find much of value at the ALS. But he was “blown away by what I saw” during the team's 2-day visit last month. And Birgeneau, who was invited by ALS officials to serve on an advisory committee after releasing his report, says he “really admires the way [ALS] responded—they could have launched an attack on the report instead.”

    Congress will ultimately decide whether those kind words will translate into more money. Anticipating the facility's strong showing, DOE basic sciences chief Pat Dehmer had already proposed increasing its budget by $4.4 million, to $35.4 million, for the budget year that begins on 1 October. In the meantime, Dehmer says, “any lingering prejudice against the ALS should be washed away.”


    An Appealing Snowball Earth That's Still Hard to Swallow

    1. Richard A. Kerr

    The startling claim that Earth has frozen over from pole to pole for millions of years at a time has intrigued many earth scientists but as yet convinced few

    The farther back in Earth's history you go, the weirder things seem to get. Back in the Neoproterozoic era 600 million or 700 million years ago, before life was much more than a green scum, the world was particularly bizarre. Glaciers often flowed to the sea seemingly everywhere, including the tropics. Yet, as soon as rocky debris churned out by those glaciers settled on the floor of iceberg-clogged seas, the climate seems to have flipped: Warm seawater supercharged with carbon dioxide began depositing carbonate rock right on top of the glacial detritus. At the same time, life appears to have suffered a near-death experience after eons of stability. Even iron deposits supposedly banished by Earth's now-abundant oxygen reappeared in the late Neoproterozoic for one final bow. And shortly thereafter, geologically speaking, a profusion of animal life—all the basic body plans seen today—burst on the scene full blown after evolution had stagnated for a billion years or more.

    For the past 18 months, geoscientists have themselves been thrown into turmoil by a bold explanation for all this weirdness: During the late Neoproterozoic, this theory goes, Earth suffered through at least two globe-engulfing ice ages that reinvigorated life by nearly snuffing it out. These deep freezes, according to this model, abruptly gave way to global warming that turned the entire planet into a sauna. Bizarre though it may seem, this hypothesis, widely known as the “Snowball Earth” scenario, is gaining ground.

    Researchers from geologists and geochemists to paleontologists and climate modelers are suddenly taking snowball Earth seriously enough to trek to Namibia's Skeleton Coast and California's Death Valley for rocks and to crank up computer models for climatic insights. And last year, most researchers agreed that one part of the sweeping hypothesis—the claim that glaciers once flowed into ice-covered tropical seas—is correct, even though this idea had been rebuffed for more than 30 years. A few scientists are even persuaded that the entire theory most likely is true. Snowball Earth “is a very plausible explanation for some very puzzling observations,” says geochemist James Walker of the University of Michigan, Ann Arbor. “It probably happened.”

    Most researchers, however, aren't yet ready to embrace the whole sequence of events. Big, in-your-face ideas can be a bit hard to take, even when they solve all your problems. After all, the idea that a 10-kilometer chunk of rock blew away the dinosaurs 65 million years ago took 10 years of often-bitter debate before gaining widespread acceptance; microbes from Mars look like they may never gain acceptance. Snowball Earth is gaining ground faster than the dinosaur killer did, but it has a long way to go. It is “interesting, intriguing, provocative,” says paleontologist Guy Narbonne of Queen's University in Kingston, Ontario. “It has focused us on a pivotal event in Earth's history, and it provides a testable hypothesis that links a number of observations, but I wouldn't say it's accepted.” For Narbonne and many others, there are still too many details that don't fit yet. “I just don't know whether it will stand up,” says paleontologist and molecular biologist Charles Marshall of Harvard University. But then, “I would have said the same thing about the K-T [Cretaceous-Tertiary] impact” in its first year.

    A “freeze-fry” world

    Although elements of the snowball Earth hypothesis have been around for several decades—geobiologist Joseph Kirschvink of the California Institute of Technology coined the term in 1992—it was a 1998 paper in Science (28 August, p. 1342) that made it a going concern among researchers. In that paper, geologist Paul Hoffman and geochemists Daniel Schrag and Galen Halverson of Harvard and geochemist Alan Kaufman of the University of Maryland, College Park, a former Harvard colleague, sketched a scenario that would resolve four paradoxes found in rocks laid down 750 million to 580 million years ago, late in the Neoproterozoic era and a few tens of millions of years before the explosion of animal diversity in the Cambrian period.

    First up in the Harvard scenario is global glaciation. Climate modelers have long believed that if the natural greenhouse warming of early Earth were to weaken, say because unusually severe weathering of continental rocks sucked carbon dioxide out of the atmosphere, Earth would freeze over. The sun was fainter in those days, so if the greenhouse waned, bright white ice and snow would creep from polar regions toward the equator, reflecting more and more of the sun's heat back into space and further cooling the planet.

    At some point, this albedo feedback effect would take over, the ice would push across all oceans to the equator, and Earth would be a snowball. This was the “White Earth solution” or “ice catastrophe” of a simple climate model run by Mikhail Budyko of the Leningrad Geophysical Observatory in the 1960s. At the time, however, climate modelers didn't think the real world ever iced over. How could life have survived, they asked, in a world in which the average surface temperature would have hovered around -50°C, not to mention the all-encompassing sea ice that would average a kilometer thick compared to the Arctic Ocean's few meters?
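    Budyko's runaway is easy to reproduce in a toy zero-dimensional energy-balance model. In the sketch below, the albedo ramp, effective emissivity, and heat capacity are illustrative values chosen for the demonstration, not Budyko's actual parameters:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # present-day solar constant, W m^-2
EPS = 0.61        # effective emissivity standing in for greenhouse warming

def albedo(T):
    """Crude ice-albedo ramp: bright snowball below 250 K, dark ice-free
    ocean above 280 K, linear in between (illustrative numbers)."""
    if T <= 250.0:
        return 0.62
    if T >= 280.0:
        return 0.30
    return 0.62 + (0.30 - 0.62) * (T - 250.0) / 30.0

def equilibrium(S, T, steps=5000):
    """Relax the global mean temperature T (kelvin) to equilibrium under
    solar constant S by stepping dT/dt = (absorbed - emitted) / C."""
    C = 1.0e8            # heat capacity per unit area, J m^-2 K^-1
    dt = 30 * 86400.0    # one model "month" per step
    for _ in range(steps):
        absorbed = (S / 4.0) * (1.0 - albedo(T))
        emitted = EPS * SIGMA * T ** 4
        T += dt * (absorbed - emitted) / C
    return T

# With a sun 6% fainter than today's, a warm start settles on a
# temperate branch...
warm = equilibrium(0.94 * S0, T=288.0)
# ...but the same faint sun cannot thaw a planet already frozen over:
cold = equilibrium(0.94 * S0, T=230.0)
```

    Once the planet lands on the cold branch, the sunlight that had sustained a temperate climate can no longer melt the ice; that hysteresis is the heart of the ice catastrophe.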

    In the Harvard snowball scenario, microbes, simple seaweeds, and probably animals of near-microscopic size eked out an existence around sea-floor hot springs or the occasional rift in the ice, even as those same hot springs were loading the oceans with minerals and sucking oxygen from the water. This went on for perhaps 10 million years until volcanoes came to the rescue, in a brutal sort of way. Over millions of years, carbon dioxide seeped from the interior through continental volcanoes to rebuild the greenhouse, and then some. Eventually, an atmosphere with 350 times today's carbon dioxide countered the snowball's albedo effect, melting back the ice in a century or so and bringing on a steamy climate with an average global temperature of +50°C and replete with corrosive acid rain. And the Harvard group believes there was more than one of these “freeze-fry events,” as Hoffman and Schrag dub them in a January Scientific American article. Geologists agree that there were at least two widespread, although not necessarily global, glaciations in the late Neoproterozoic; some researchers opt for three or four, including Kaufman, who sees all of them as global.

    Melting opposition

    Snowballs from hell didn't sit well at first with most earth scientists: too extreme, too bizarre, too deadly, too speculative. But several developments, some from the Harvard group, have softened resistance. One supportive piece of evidence, oddly enough, comes from one of snowball Earth's more vociferous critics—as paradoxical as it seems, glaciers really did flow into icy tropical seas. In the 1960s, geologist Brian Harland of the University of Cambridge noted how many Neoproterozoic glacial deposits—rock ground up by glacial ice movement and “dropstones” carried out to sea by icebergs—seemed to have formed in tropical latitudes. Although the glacial deposits appeared authentic enough to most geologists, few were convinced that they had formed near the equator.

    The problem was one of reliability. In order to tell at what latitude a deposit formed before 700 million years of peripatetic plate tectonics reshuffled it around the planet, paleomagneticians trace the alignment of magnetic minerals in the deposit. These minerals tend to line up with Earth's magnetic field at the time the deposit was formed: If horizontal, the deposit formed at the equator, where magnetic field lines running pole to pole parallel the surface; if vertical, at one of the poles, where field lines plunge toward the core. But rocks that have been around for a few hundred million years have a good chance of having their magnetic signatures later rewritten when the rock is reheated or chemically altered, so researchers remained skeptical.
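    The reasoning rests on the geocentric axial dipole relation, tan I = 2 tan λ, which ties the magnetic inclination I frozen into the minerals to the latitude λ of deposition. A minimal sketch (the inclination values are invented examples, not measurements from any real deposit):

```python
import math

def paleolatitude(inclination_deg):
    """Geocentric axial dipole relation: tan(I) = 2 * tan(latitude).
    Near-horizontal minerals imply an equatorial site; near-vertical
    ones imply a site close to one of the poles."""
    I = math.radians(inclination_deg)
    return math.degrees(math.atan(math.tan(I) / 2.0))

# A nearly flat-lying magnetization puts the deposit near the equator,
# while a steep one puts it at high latitude:
shallow = paleolatitude(10.0)   # roughly 5 degrees from the equator
steep = paleolatitude(80.0)     # roughly 71 degrees
```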

    Now comes paleomagnetic evidence of low-latitude glaciation that everyone can believe. In last August's Geological Society of America Bulletin, geologists Linda Sohl and Nicholas Christie-Blick (a snowball critic) of the Lamont-Doherty Earth Observatory in Palisades, New York, and paleomagnetician Dennis Kent of Rutgers University in New Brunswick, New Jersey, published a paper on the Neoproterozoic Elatina Formation glacial deposit in Australia. They had finally collected enough samples from the deposit, they reported, to recognize not just a low latitude of deposition, but also at least three flip-flops of Earth's magnetic field recorded as glaciers laid down sediment. That meant that the low latitude could not be a later remagnetization, which would have wiped out the reversals and imposed a single magnetization.

    “There's no reason to doubt it,” says Kent. “All the tests we've been able to do haven't been able to get around the low latitude.” Paleomagnetician Joseph Meert of Indiana State University in Terre Haute had been highly critical of earlier claims by paleomagnetists of low-latitude glaciation, but the latest Elatina data have changed his mind. “The data look pretty strong now,” he says. The Sohl results “make it pretty tough to argue against a low latitude [glaciation], at least for the Elatina.”

    Another out-of-place deposit is also lending credibility to the snowball Earth scenario, this one out of place in time rather than geography. When Kirschvink first coined the term snowball Earth, he pointed out how an ice-covered planet could resolve the paradox of iron formations associated with Neoproterozoic glaciations. Massive amounts of iron that originally spewed from sea-floor hot springs precipitated from the sea in the Archean eon, more than 2.5 billion years ago. But about 2 billion years ago, enough oxygen rose in the deep sea to cut off formation of such iron deposits by precipitating the iron before it spread through the oceans—except, it turns out, around the time of glaciations in the Neoproterozoic. “These iron formations have always been a thorn in the flesh,” says geologist Grant Young of the University of Western Ontario in London, Ontario. Kirschvink's scenario, adopted by the Harvard group, would have iron build up in the ice-covered, oxygen-free oceans, only to be precipitated when the ice melts away and oxygen reenters the ocean.

    In addition to tropical glaciation and strange iron deposits, the snowball Earth scenario solves the paradox of the juxtaposition of the low-latitude glacial deposits and hundreds of meters of carbonate rock. Around the world, Neoproterozoic glacial deposits are capped by tens or even hundreds of meters of carbonate rock whose sometimes bizarre textures indicate rapid precipitation from warm seas saturated with carbonate. (Unlike most minerals, carbonates are more soluble in cold water than in warm.) Read literally, the rock record says that tropical seas iced up and then abruptly thawed to a carbon dioxide-rich warmth.

    “I'm not a full supporter of snowball Earth yet,” says geologist Frank Corsetti of the University of California, Santa Barbara, but “its strong point is that it explains why we get glacial sediments right under carbonate rocks that are more indicative of warm water.” The juxtaposition implies “a weird flip” of climate, says Corsetti. That it flipped in a geologic instant in low latitudes “is doubly weird.” But to Hoffman, it fits nicely into the snowball scenario. “This is no paradox,” he says, “but a predictable consequence.”

    The Harvard group believes it has also found evidence for another predictable consequence of a snowball Earth: hard times for life. Plants alter the proportions of carbon-12 and carbon-13 in the environment by preferentially using the lighter isotope in photosynthesis. As plant organic matter gets stored in sediments, the relative abundance of the heavier isotope in the ocean and atmosphere increases. But volcanoes and sea-floor hot springs have the opposite effect: They spew carbon dioxide that is relatively rich in the lighter isotope, pushing the isotopic composition of ocean and atmosphere toward the lighter side. Given a more or less steady volcanic flux, the carbon isotopic composition of the environment as recorded in carbonates can be taken as a gauge of how well plant life is doing: The greater the relative abundance of the lighter isotope, the worse off the plants were. And that's exactly what Kaufman and his Harvard colleagues found in the “cap” carbonate overlying the glacial deposit in Namibia.
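    That gauge is conventionally expressed in delta notation: the per-mil deviation of a sample's carbon-13 to carbon-12 ratio from a reference standard. A sketch, using the commonly quoted PDB standard ratio and an invented sample ratio:

```python
R_PDB = 0.0112372   # 13C/12C of the PDB reference standard (commonly quoted)

def delta_13c(r_sample):
    """delta-13C in per mil. More negative values mean relatively more
    light carbon in the carbonate, i.e., photosynthesizers were burying
    less organic matter."""
    return (r_sample / R_PDB - 1.0) * 1000.0

# The standard itself scores zero; a sample enriched in light carbon,
# like the cap carbonates, scores negative:
baseline = delta_13c(R_PDB)
light = delta_13c(0.0111)   # hypothetical sample lighter than the standard
```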

    Leading up to the glaciation, the group reported, plant productivity, as measured by carbon isotopes in various carbonates laid down below the glacial deposits, dropped precipitously to near zero. When the record picks up again after the big thaw, it continues to drop until plant life seems extinguished, recovering only slowly. So the biggest carbon-isotopic shift in recorded history was another predictable result of snowball Earth.

    Appealing to a point

    All this paradox resolution has a strong appeal. “When you have a series of paradoxes, all of which are plausibly explained by one hypothesis,” says Hoffman, “that makes the hypothesis very, very attractive.” Adds Schrag: “It's one hypothesis that can explain all these incredible observations, each of which was mysterious.” And the appeal extends beyond Harvard. “The proposal is very attractive,” says geochemist Lee Kump of Pennsylvania State University, University Park. “I have a strong suspicion they've got the story right, although the details may evolve.”

    But those details trouble most researchers. Take the problem of life: It's still here. “Quite a few major clades [of organisms] made it through” the Neoproterozoic glaciations, notes paleontologist Narbonne, including such relatively complex organisms as multicellular green, red, and brown algae living on the sea floor and probably some simple animals. “It's difficult to imagine how these could have survived if the ocean were totally ice covered,” says Narbonne. “Things like cyanobacteria could have survived; they'll survive anything. But [the algae] need sunlight.” Paleontologist Bruce Runnegar of the University of California, Los Angeles, adds that the ocean's presumed lack of oxygen would present another seemingly insurmountable obstacle; oceanic anoxia would even be a serious problem for life around deep-sea vents, he says.

    Then there are the details of timing. Some events in the real world didn't happen when the snowball scenario would seem to call for them. In Scotland, notes Young, carbonates aren't just perched on top of glacial deposits but interlayered with them as well. And iron formations or iron-rich sediments around the world, while associated with Neoproterozoic glaciations, occur not only at the top of the glacial deposit, as predicted, but also down in the glacial deposits. In fact, notes Christie-Blick, the older of the two certain glaciations left smidgens of iron here and there and one huge deposit in northwest Canada, while the younger glaciation left virtually no iron behind. “Why isn't there more iron? The absence of iron formations everywhere is a big problem,” he says.

    Another problem lies in the glacial deposits themselves, Christie-Blick has noted. Even the Elatina Formation took on the order of a million years to be laid down, implying that much of it must have formed while Earth was ice covered. Yet with the oceans frozen over, there would have been no source of moisture for snow and therefore no glaciers moving to create those glacial deposits. “Paul [Hoffman] has always looked at the big picture,” says Christie-Blick. “I'd rather look at the details to see if they fit. When you look at them, they don't fit quite as well.”

    The Harvard group has responses to these and most other complaints about mismatches between the snowball scenario and the real Earth. In general, Hoffman and Schrag explain the apparent discrepancies in terms of a real snowball world that is far more complex and heterogeneous than most people imagine. As to the survival of life, for example, they envision—though there is no geologic record—numerous refuges for plant and animal life through millions of years of global freeze. Long-lived volcanism near the sea surface, as now occurs in places like Iceland and the Galápagos Islands, would have provided warm spots, says Schrag. And the ice would have varied in thickness, especially in the tropics, where ice as thin as 50 or 100 meters might have allowed frequent cracking and open leads. Thus, life could have hung on, they argue, while suffering the kind of stresses that might have snapped evolution out of its eon-long lethargy. After all, they note, the oddball Ediacara fossils, life's first attempt at large animals, appeared shortly after the last Neoproterozoic glaciation, followed shortly by the appearance of all the basic animal body plans seen today.

    The geological inconsistencies might be similarly resolved, Hoffman and Schrag say. Those cracks and leads in the ice might let enough air in to oxygenate at least the upper layer of the ocean, giving life a respite and providing a means of precipitating iron formations during the glacials. And the long-running deposition of glacial debris might have been sustained by moisture slowly but steadily sublimating from tropical sea ice, rather than evaporating from open ocean waters, and then falling as snow at higher elevations.

    No one has accused Hoffman and Schrag of failing to think big, but big ideas—even if they're right—take time to sink in. In the meantime, this one is stimulating thinking. “This hypothesis has really generated a lot of interest in rocks that haven't been worked on in years,” says Corsetti. And that's what will be needed to test the snowball Earth, says Christie-Blick. “None of the models work that well,” he says, “but something happened.” By broadening field studies beyond Namibia to places like China and Canada's Mackenzie Mountains, as is happening now, “maybe we'll figure it out.”


    Wildlife Deaths Are a Grim Wake-Up Call in Eastern Europe

    1. Robert Koenig

    After a cyanide spill blighted a major Hungarian river last month, scientists and officials are scrambling to come up with ideas for warding off future disasters

    SZOLNOK, HUNGARY—On the night of 30 January, a dike holding millions of gallons of cyanide-laced waste water gave way at a gold-extraction operation in northwestern Romania, sending a deadly waterborne plume across the Hungarian border and down the nation's second largest river. As Hungarians watched in horror, some 200 tons of dead fish floated to the surface of the blighted waters or washed up on the Tisza River's banks. Fish were only the most visible victims. The toxic brew also killed legions of microbes and threatened endangered otters and eagles that ate the tainted fish.

    After devastating the upper Tisza, the 50-kilometer-long pulse of cyanide and heavy metals spilled into the Danube River in northern Yugoslavia, killing more fish before the much-diluted plume finally filtered into the Danube delta at the Black Sea, more than 1000 kilometers downstream and 3 weeks after the spill. Scientists across Europe are now assessing the damage and planning how to speed the Hungarian river's revival. Some warn that in the upper Tisza and on the nearby Somes River, the accident could leave a poisonous legacy for several years if heavy metals are left to linger in the river sediments.

    The accident may be the worst case of water pollution ever in eastern and central Europe. Some environmentalists even fear that the spill, depending on the extent of the long-term damage, could become Europe's biggest environmental disaster since the explosion at the Chornobyl nuclear power plant in Ukraine in 1986. But many experts think that comparison won't hold water. So far, no people are reported to have been killed by the pollution in Hungary or Romania. And whereas the Chornobyl disaster was initially downplayed by Soviet authorities and most international help took years to mobilize, the response to the Tisza contamination has been swift and broad.

    In conjunction with the Hungarian and Romanian governments, the European Union's (E.U.'s) commissioner for environment, Margot Wallström, has formed a task force to suggest ways to prevent further cyanide releases from gold mines in the region, and to identify and mitigate similar “hot spots” in the Danube River Basin. Says Istvan Lang, a former secretary-general of Hungary's Academy of Sciences and the founding chair of the nation's environmental council, “This tragedy should become a case study for developing an international approach to restoring ecosystems and for preventing such disasters from happening in the future.”

    Fool's gold? For centuries, the quest for gold has taken a toll on the environment. In their search for the mythical El Dorado, 16th century Spanish conquistadors slashed and burned their way across the Americas. Today's miners are a different breed, in many cases seeking only to extract gold from tailings left over from mining other minerals. But one of their tools, used for more than a century, is particularly hazardous: cyanide, which separates gold from ore. Mining operations often store cyanide-laced sludge in diked-off lagoons.

    Such waste has escaped from lagoons in the past. In 1995, a mine in Guyana, South America, spilled 3 million cubic meters of waste water, contaminated with cyanide and copper, into the Essequibo River. Other major cyanide spills occurred in Latvia and Kyrgyzstan in the 1990s, and heavy-metal waste from a mine in Spain sickened wildlife in the Guadiamar River and Doñana National Park in April 1998, spurring a major cleanup effort.

    Last month's accident occurred in the Romanian mining town of Baia Mare, where an Australian-Romanian company, Aurul SA, has been using cyanide to treat tailings. At Baia Mare, waste water is stored in lagoons surrounded by earthen dikes. At about 10 p.m. on the night of 30 January, a dike ruptured, sending an estimated 100,000 cubic meters of waste water into a stream that flows into the Somes, a Tisza tributary that crosses into Hungary.

    It wasn't until early the following evening that Romanian officials notified Hungary of the plume; the “Danube Accident Emergency Warning System” in Bucharest broadcast the alarm about 2 hours later. Tests found staggeringly high levels of cyanide in the Somes in both countries. In the following days, scientists measured concentrations of metal complexes of cyanide in the upper Tisza of up to 12 milligrams per liter (mg/l)—about 100 times higher than the country's standard for a “very polluted” river, and far higher than the E.U.'s maximum limit on cyanide in drinking water (0.05 mg/l). That's an ample amount to harm fish, which tend to be more sensitive than humans to cyanide's deadly effects—caused when the compound strangles cells by cutting off their oxygen supply.
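    The margins in those numbers are straightforward to check. In the sketch below, the “very polluted” threshold is inferred from the reported “about 100 times higher” figure rather than quoted directly:

```python
peak_cyanide = 12.0        # mg/l, metal-cyanide complexes in the upper Tisza
eu_drinking_limit = 0.05   # mg/l, E.U. maximum for cyanide in drinking water
very_polluted = peak_cyanide / 100.0   # ~0.12 mg/l, inferred threshold

# The peak was roughly 240 times the E.U. drinking-water limit:
times_over_eu_limit = peak_cyanide / eu_drinking_limit
```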

    But cyanide wasn't acting alone. The plume also carried heavy metals, including copper and zinc, that are now more worrisome than the cyanide. Hungary's environment ministry has reported that copper concentrations temporarily shot up to 36 times the “very polluted” level; zinc and lead also skyrocketed.

    A team from the United Nations Environment Program just wrapped up 2 weeks of work along the rivers assessing the damage. The task now, says UN official Anders Renlund, is to “suggest what to do about the rivers, and what steps might be taken to prevent such a disaster from happening again.” A group from the U.S. Environmental Protection Agency also is expected to assess the situation later this month.

    Because the upper Tisza and the lower Somes were hit hardest, Hungarian scientists are concentrating their efforts there. Two fish species found only in the upper Tisza may have been pushed to the brink of extinction. Meanwhile, at Hortobagy National Park, Hungary's largest park, ecologist Gabor Szilagyi and his team have recovered the carcass of an endangered white-tailed eagle, and they are trying to save another sick eagle. Prospects may be bleak for the area's 400 protected otters, says Szilagyi: “We don't see as many footprints and other signs that they are gathering at the usual places.”

    Downriver, in Szeged, residents are wondering if the annual appearance of “Tisza flowers”—colorful mayflies that briefly cover the river—will take place at all this spring. And in Szolnok, scientists are analyzing river samples to find out how much pollution remains and how many organisms have survived—or how few. “There isn't much life in the river right now,” says microbiologist Eniko Szilagyi, who has examined poisoned plankton.

    Damage control. In his office next to the palatial Hungarian Academy of Sciences in Budapest, Lang points to a map showing the tributaries of Hungary's rivers—with more than 90% of that water coming from Romania, Ukraine, Slovakia, and Austria. A thicket of mines, chemical plants, oil refineries, and other sources of pollutants line those tributaries. “The management of environmental security cannot be stopped at the borders,” he says. The Hungarian government has filed lawsuits seeking monetary damages against the operators of the Baia Mare gold-extraction lagoon, and it has threatened to sue the Romanian government to help recover the cleanup costs. The accident has stoked bilateral tensions: Romanian officials accuse the Hungarian side of exaggerating the extent of the damage, while Hungarians assert that the Romanians are downplaying the spill.

    The accident has also prompted soul-searching within Hungary. The government's top science official, physicist Jozsef Palinkas, told Science that he was unhappy with his country's procedures for dealing with such emergencies. He will try to convince the government to “develop an early-warning system for detecting and dealing with environmental disasters.” But the World Wildlife Fund (WWF) and a dozen other groups that have banded together to form the “Tisza Platform” want a wider system and more enforceable international agreements on pollution. The WWF is also calling for emergency efforts to clean up the upper Tisza and Somes rivers and the surrounding watershed, it says, “because they will form the base from which the reintroduction of aquatic life in the contaminated waterways will start.”

    It may take months before Hungarian officials decide on a course of remediation, which could include seeding the river with pollutant-eating microbes. In the meantime, says hydrobiologist Oszkar Balazs, the algae and plankton that survived, as well as organisms that flow in from unaffected tributaries, will help breathe new life into the Tisza. Tibor Müller, who heads the Hortobagy Fish Farm, thinks the river will heal within 5 years. Others expect a slower recovery. Biodiversity will suffer for decades, predicts the WWF's Gyorgy Gado. However, he says, “life will return to the Tisza, eventually.”


    Tracing Leptin's Partners in Regulating Body Weight

    1. Trisha Gura*
    1. Trisha Gura is a writer in Cleveland, Ohio.

    Although the hormone leptin hasn't turned out to be a “magic bullet” for obesity, it has partners in controlling weight—any of which may be antiobesity targets

    When Jeffrey Friedman and his colleagues at The Rockefeller University in New York City discovered in 1994 that defects in a hormone called leptin make mice grossly fat, the news brought hope to millions of obese people. Could injections of a natural hormone do what the latest diet couldn't? The biotech firm Amgen Inc. of Thousand Oaks, California, was sufficiently impressed to plunk down $25 million for the rights to leptin, and the hormone seemed headed for the pharmaceutical big time. Since then, leptin's star has—perhaps predictably—fallen. The low point came last year when a clinical trial showed that high doses of leptin produced, at best, modest weight losses in a subset of obese patients. The hormone turned out to be anything but obesity's “magic bullet” (Science, 29 October 1999, p. 881).

    Is leptin, then, the pharmaceutical equivalent of a fad diet—an overhyped solution to a serious personal and national problem? Not at all, say obesity researchers. Even as the clinical efforts were faltering, basic lab research—much of it stimulated by the discovery of leptin—has been pushing ahead, sketching out a much clearer picture of just how the body controls its weight. And this information is providing many potential new targets for antiobesity drugs. Researchers have identified a wealth of molecules involved in weight regulation, some of which cooperate with leptin to suppress appetite, while others blunt the hormone's fat-busting effects.

    Neuroanatomists, meanwhile, are fast on the trail of how and where these molecules work. For the most part, the hot spot is the brain. But leptin also acts on muscle and fat tissue, and a cadre of endocrinologists is teasing out its effects there, too (see sidebar). “Compared to where we were even 3 to 4 years ago, we have an enormous base of new knowledge that we hadn't even a whiff of before,” says endocrinologist Jeffrey Flier of Beth Israel Deaconess Medical Center and Harvard Medical School in Boston.

    Researchers have been trying to solve the puzzle of what causes obesity for decades. Progress limped along, however, until Friedman's team set the field afire. They reported that mutations in the leptin gene cause hereditary obesity in a long-studied strain of mutant mice. Just 1 year later, Louis Tartaglia's team at Millennium Pharmaceuticals in Cambridge, Massachusetts, and their colleagues at Hoffmann-La Roche in Nutley, New Jersey, hunted down the receptor through which the hormone exerts its weight-suppressing effects. Those discoveries, says Tartaglia, “really provided some genetic entry points into these [obesity] pathways that biochemical strategies failed to identify.”

    As other investigators threw their data into the pool, leptin's role in obesity evolved into that of a lipostat: Fat stores rise, and so do levels of leptin, which is manufactured mainly by fat cells. The hormone then signals the brain to eat less and the body to do more. But clinicians soon learned that whereas defects in either leptin or its receptor cause obesity in certain mutant strains of mice and rats, defects in those genes very rarely cause obesity in humans. So far, geneticists have fingered only two individuals with defects in their leptin genes, and none with mutations in the genes encoding the leptin receptor. Indeed, because leptin is churned out in proportion to the size of the fat deposits, many obese people have high levels of the hormone in their bloodstreams. But for some perplexing reason, they fail to respond to it.

    Both puzzled and intrigued, researchers figured that they might get at leptin resistance by tracing out the components of the brain pathways through which the hormone exerts its effects. They knew the brain was a likely site of its action because earlier studies in which researchers had destroyed various brain regions of animals had identified at least four areas—mainly in the hypothalamus—involved in appetite control.

    Leptin's partners

    The first brain molecule found to interact with leptin was neuropeptide Y (NPY), a small protein that had long been known to boost appetite when injected into animals. Studies that involved crossing leptin-deficient mice with mice whose NPY gene had been knocked out showed that some—but not all—of leptin's appetite-dampening effects are due to its inhibition of NPY activity. Since then, researchers have unearthed at least a dozen more molecules that interact with leptin in the brain to control appetite. Perhaps the best studied is a member of the melanocortin family of proteins called α-melanocyte-stimulating hormone (α-MSH).

    Although α-MSH is best known for orchestrating the production of brown pigment by skin cells, researchers found that the neuropeptide functions differently in the brain: It blunts appetite. The clue that tipped them off to this new role came from studies of a mutant mouse strain, called agouti, that has a striking gold-colored coat and is grossly obese. These mice continuously crank out copious amounts of a protein, also called agouti. The protein, researchers learned, blocks α-MSH's action both on skin cells and in the brain, thus accounting for the animals' obesity as well as their lack of dark pigmentation (Science, 7 February 1997, p. 751). The link between α-MSH and leptin came with the finding that leptin-deficient mutant mice make very little α-MSH. The discovery suggests that leptin stimulates α-MSH production, which then turns down appetite.

    Other recent work suggests that defects involving α-MSH can lead to obesity in humans. Greg Barsh, a geneticist at Stanford University, estimates that mutations in the gene encoding the brain receptor for α-MSH, a protein called MCR-4, account for 2% to 3% of severe obesity cases, presumably because they prevent the peptide from exerting its appetite-suppressing effects.

    In addition, α-MSH is synthesized as part of a precursor protein called POMC (for pro-opiomelanocortin), which is chopped into fragments by a cellular enzyme to produce α-MSH plus several other peptide hormones. Two years ago, a group at Humboldt University in Berlin, Germany, identified mutations in the POMC gene as the cause of a rare human hereditary syndrome featuring severe obesity, red hair, and adrenal insufficiency. “The melanocortins must be exceedingly important, because upsetting any part of that system gives you obesity,” says obesity researcher Joel Elmquist of Beth Israel Deaconess Medical Center and Harvard Medical School.

    Still, mutations affecting leptin and α-MSH account for only a few percent of all cases of human obesity, and so researchers are looking at other factors that might be involved. One of them is a molecule linked to obesity by endocrinologist Terry Maratos-Flier's group at the Joslin Diabetes Center and Harvard Medical School.

    About the time leptin was discovered, Maratos-Flier was looking for molecules that might contribute to weight control. Using a technique called differential display, she homed in on a neuropeptide called melanin-concentrating hormone (MCH), whose gene is two to three times more active in the brains of ob/ob mice—one of the obese leptin-deficient strains—than in normal mice. In another variation on the obesity-pigmentation theme, MCH had originally been discovered as a hormone that lightens the color of fish scales. But Maratos-Flier's finding suggested that increased expression of the MCH gene—possibly in response to the ob/ob mouse's leptin deficiency—might also contribute to the animal's obesity.

    Indeed, when Maratos-Flier and her colleagues injected MCH into the brains of rats, the animals' food consumption shot up as the dose escalated. Conversely, when her group in collaboration with that of her husband, Beth Israel's Jeffrey Flier, knocked out the MCH gene in mice, the resulting animals ate less during the night—normal dinnertime for mice—and ended up 15% to 25% skinnier than their normal counterparts. What's more, the group found that the neurons making the peptide, which are located in the lateral hypothalamus, project to neurons in the cortex region of the brain, including those that orchestrate smell.

    Based on these results, Maratos-Flier proposes that MCH causes what she calls “the pizza effect.” When a person is satiated, neuropeptides like MCH are at their lowest levels, she says. But the smell of something tasty might trigger their release. “Even though you are not hungry and you don't need the calories, you still eat the pizza because you know it will taste good,” Maratos-Flier explains.

    So far, however, researchers have identified no MCH defects that might explain leptin resistance in obese people. But they are exploring other possibilities. One is that leptin may somehow be blocked from entering the brain in obese people. To see if that might be the case, Flier's group at Beth Israel is trying to track down the elusive molecular ferry that ships leptin across the blood-brain barrier. The investigators have found an attractive candidate: a shortened form of the leptin receptor that is presumably not tethered to the cell membrane, because it lacks the peptide segment that would normally hold it in place there.

    Last year, Flier's team showed that the cells that form the blood-brain barrier produce higher levels of the mRNA that encodes the shortened form of the receptor, called OBR-A, than any other cells in the body. The researchers have since shown that the protein is necessary to transport leptin across membranes formed by the cells in culture dishes. Whether transport problems spur leptin resistance in people, however, remains to be tested.

    Leptin resistance might also stem from molecules that interfere with signaling by the leptin receptor, such as one identified last year by Flier's group, in collaboration with Elmquist's. The researchers found that when they injected leptin into normal animals, the hormone rapidly juiced up production of a protein called SOCS3 (for suppressor of cytokine signaling-3) in cells in the hypothalamus that bear leptin receptors. SOCS3, in turn, bit the hand that fed it by blocking the leptin receptor from further signaling. This is presumably part of the normal mechanism for halting leptin signaling when the hormone has done its job. But in addition, Flier says, the finding “raises the possibility that this inhibitor might be mediating the resistance to leptin in obese people.” He and his colleagues plan to look for defects in OBR-A and SOCS3 function in obese patients.

    Making the connections

    With all these new actors coming onto the stage, the most daunting task is to cast them together in one leptin-conducted performance. Among those tackling the problem are Elmquist and his colleagues. They've identified two sets of neurons in one of the brain's feeding hot spots, the arcuate nucleus of the hypothalamus, that respond in opposite ways to leptin stimulation. One population produces appetite-inhibiting peptides such as α-MSH, and this neuron group responds to leptin in the expected fashion—by expressing genes that signal activation. In contrast, the other population makes two appetite-boosting proteins, NPY and agouti-related protein (AgRP), the human equivalent of the mouse agouti protein, and these appear to shut off in response to leptin.

    Elmquist attributes the difference to the fact that the neurons that make the appetite-stimulating peptides, but not the others, respond to leptin by producing, among other molecules, the leptin-receptor inhibitor SOCS3. But however it happens, the net result is a profound suppression of appetite. “It's intriguing that the same leptin receptor in the same nucleus could promote two distinct physiologic responses,” Elmquist says.

    And the neural circuitry reaches even farther. Studies by the Beth Israel team show that both sets of neurons also send projections to the neurons in the lateral hypothalamus that produce the appetite stimulator MCH. The MCH neurons, while probably inhibited by those producing α-MSH and perhaps stimulated by the NPY/AgRP neurons, also project into smell centers in the cerebral cortex and other nervous system regions that are responsible for complex behavior including feeding.

    In addition, the arcuate neurons are wired to another population of neurons in the lateral hypothalamus that spew out potent peptides that are called orexins, because they stimulate appetite. The orexins have been linked to arousal in mice, dogs, and people, and defects in such peptides cause a narcolepsy-like state. “These [MCH and orexin] neurons are really in a powerful position to regulate broad areas of the central nervous system,” Elmquist says. And at the center of it all sits leptin, regulating both appetite and feeding in the brain and possibly activity and calorie burning in the body.

    Obesity researchers admit that they still have a lot of work to do before they trace out all the molecules involved in weight control. But they already have a number of promising targets for antiobesity drugs. For example, if SOCS3 does down-regulate leptin-receptor activity, SOCS3 inhibitors might heighten leptin signaling and help curb appetite. And MCH has already caught the eyes of several pharmaceutical companies, which are scrambling to find small molecules that block its action.

    Indeed, obesity researchers can expect a glut of research leads to feast on. “There is so much data, it is turning out to be a remarkable story,” Elmquist says.


    Is Leptin a 'Thrifty' Hormone in Muscle and Fat?

    1. Trisha Gura

    Most Westerners find it much easier to put on pounds than to take them off—a problem that may have its roots in evolutionary history. While people in today's developed countries have a veritable glut of food, life for our hunter-gatherer ancestors was a constant struggle with famine. Thus, decades ago, researchers proposed a “thrifty genotype” hypothesis, which holds that early on, animals and people evolved mechanisms that allowed them to hoard calories—as fat—in times of abundance in preparation for famine later on. Recent work suggests that the hormone leptin, despite having won media fame for its role as a “fat buster” (see main text), may contribute to this metabolic thriftiness. “Leptin's role may really be as a regulator of the body's response to starvation,” says endocrinologist Jeffrey Flier of Beth Israel Deaconess Medical Center and Harvard Medical School in Boston, whose group first proposed the idea back in 1996.

    New evidence supporting that proposal comes from endocrinologist Luciano Rossetti and colleagues at the Albert Einstein College of Medicine in New York City. In August, the researchers reported that leptin injections given to rats that had fasted for 5 hours caused the animals to turn down the activity of the leptin gene in fat cells while turning it up from ground zero levels in muscle. The investigators determined this by measuring the cells' content of the messenger RNA (mRNA) that directs leptin synthesis.

    That was the first inkling that leptin might play a role in muscle. A clue to what it does there came in a second set of experiments. When Rossetti's group injected leptin into rats on a low-calorie diet, the hormone, surprisingly, had little or no effect on leptin mRNA production by fat cells, but sharply increased its synthesis by muscle cells. Apparently, Rossetti speculates, muscle cells amplify leptin production in times of food deprivation to guide that tissue toward burning fat instead of depleting its protein and carbohydrate stores. Work by Roger Unger's group at the University of Texas Southwestern Medical Center in Dallas and others indicates that Rossetti is on the right track. The Southwestern team found that leptin boosts the activity of nearly all the genes involved in lipid oxidation. “You need to protect the protein and glycogen that composes the muscle fibers so you would be able to undergo fight or flight,” Rossetti says.

    But when the investigators injected leptin into rats given a high-fat, high-calorie diet, they saw a different picture. Leptin mRNA levels plummeted in fat cells but did not change dramatically in muscle cells. This suggests, Rossetti says, that the leptin produced after a hearty meal curbs its own production in fat cells, apparently so that the animals will eventually eat again and increase their fat stores. At the same time, muscle will likely burn less fat than in the animals on the low-calorie diet with their higher leptin production. “This gives us a hint that constant overfeeding limits the effective leptin response,” Rossetti says. He suggests that this tendency to brake leptin's action may sow the seeds of leptin resistance in overweight people.

    Whether that's so remains to be seen. “Right now, nobody knows whether this is a trivial biological phenomenon or an important action of leptin,” Flier says. The answers will come, he says, from experiments in which rodents are genetically engineered so that their leptin or leptin-receptor genes are knocked out only in specific tissues. That way, the hormone's effects in muscle and fat can be separated from those in brain.
