News this Week

Science  28 Apr 2000:
Vol. 288, Issue 5466, pp. 586

    In Contrast to Dolly, Cloning Resets Telomere Clock in Cattle

    Gretchen Vogel

    When researchers announced 3 years ago that they had cloned Dolly the sheep, many scientists asked a question that sounds almost metaphysical: Are her cells older than she is? Because Dolly had been cloned from an adult cell, they wondered whether her own cells would show some of the hallmarks of a more mature animal. The answer came in 1998: Dolly's telomeres—the “caps” on the ends of her chromosomes—are shorter than normal. Because telomeres normally shrink with age, this was a disturbing sign that her cellular clock hadn't been reset to zero. Not only did the finding imply that Dolly might age unusually quickly, but it also dampened hopes that the cloning technique might someday be used to produce replacement cells for patients suffering ills such as liver failure or Parkinson's disease, because such cells may be too “old” to use. Now, a surprising finding has diminished those concerns.

    On page 665, physician Robert Lanza of Advanced Cell Technology (ACT) in Worcester, Massachusetts, and his colleagues report that cells from calves they cloned have telomeres that are longer than normal. Moreover, the cells show other signs of youth and can divide in culture many more times than normal cells do. Similar work in cattle and in mice, as yet unpublished, seems to support the ACT team's results.

    Why these findings are so dramatically different from those on Dolly is not yet clear. But, as Lanza points out, they suggest that tissues produced by cloning might last at least as long as the original cells—and perhaps longer. Pathologist and gerontologist George M. Martin of the University of Washington, Seattle, agrees. A technique that can lengthen the life-span of cultured cells, he says, should be very useful for tissue engineering.

    Ethical problems will remain, however. Cloning is accomplished by transplanting nuclei from somatic cells into eggs whose own nuclei have been removed. When the goal is to produce animal clones, embryos that develop from those cells are implanted in foster mothers in the hopes that they will produce live births. But for therapeutic cloning, the embryos would be allowed to develop just long enough to produce embryonic stem cells, which could then be used to generate replacement tissue. Using nuclei from a patient's own cells to produce the embryos should avoid rejection problems, because the resulting cells' nuclear DNA would be identical to that of the patient.

    But such research cannot be done in many countries because the procedure requires creating and then destroying a human embryo, and many also worry that therapeutic cloning would open the door to human reproductive cloning. Earlier this month, however, the influential Nuffield Council on Bioethics in Britain said that the potential benefits of therapeutic cloning outweigh the ethical concerns, and a British government panel is expected to rule in favor of the research. Other countries, including the United States, seem further from allowing such research to be done with government funds.

    When Lanza and his colleagues heard about Dolly's shortened telomeres, they decided to try a stringent test: Could they derive healthy animals from cells kept in culture until senescence, when they are no longer able to divide? To find out, the scientists obtained cells from a fetal calf and allowed them to replicate for several months, until near the end of their expected life-span. By then, Lanza says, the cells were showing characteristics of aging, including growing larger and accumulating cellular debris. They also had shortened telomeres.

    The researchers then transferred nuclei from nearly 1900 of the cultured cells into enucleated egg cells and eventually produced six calves. At birth, the animals showed the now-expected characteristics of cloned animals—they were larger than normal newborns and had high blood pressure and difficulty breathing. But by 2 months of age, the animals seemed healthy and normal.

    Indeed, when Peter Lansdorp and his colleagues at the Terry Fox Laboratory in Vancouver examined blood cells from the young cattle 5 to 10 months after birth, they found that the animals' telomeres were significantly longer than the telomeres of normal cattle the same age, and in some cases were even longer than the telomeres of normal newborns. Another sign that the cloning process had somehow turned back the clock on the animals' cells came when the team cultured fibroblasts, a type of connective tissue cell, from the calves' ears. The cells expressed high levels of a gene called EPC-1, which is typically found at high levels in young cells and may be involved in cell division and proliferation.

    In a related experiment, the team cloned five calf fetuses from adult cells kept in culture until senescence. They removed the fetuses at 6 weeks of gestation so they could compare their cells with those of normal fetal calves. The clones' cells divided an average of 93 times compared to only 61 for cells from normal calves. If this increased life-span extends to the whole animal, Lanza says, there is “a real possibility” that cloned animals might live as much as 50% longer than their normal counterparts—up to 180 to 200 years in the case of humans—an idea, he says, that “is going to raise an eyebrow or two.”
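
    The arithmetic behind that extrapolation is simple enough to sketch. The division counts are from the article; the human baseline life-span used below is an illustrative assumption (the article implies but never states one), which is worth keeping in mind when weighing the "180 to 200 years" figure:

```python
# Back-of-the-envelope check of the life-span extrapolation.
# Figures from the article: the clones' cells divided 93 times on
# average, versus 61 for cells from normal fetal calves.
clone_divisions = 93
normal_divisions = 61

increase = clone_divisions / normal_divisions - 1
print(f"Increase in replicative capacity: {increase:.0%}")  # ~52%

# If that roughly 50% gain carried over to the whole organism
# (a big "if", as the article's skeptics stress), a human baseline
# of 120-130 years (an assumed figure, not from the article) would
# scale to the quoted 180-200 years.
for baseline_years in (120, 130):
    print(f"{baseline_years} yr -> {baseline_years * 1.5:.0f} yr")
```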

    Other scientists are more cautious, noting that aging is extremely complex and is controlled by more than just telomere length. But cell biologist Leonard Guarente of the Massachusetts Institute of Technology says the evidence does suggest that the oocyte was able to “restore a youthful state” to the donor cell's nucleus. But he cautions, “What you want to know is, will these cloned animals live longer?” The scientists will have to wait a while to answer that question, as sheep can live 12 years and cows about 20.

    No one is yet able to explain the difference between Dolly and the cloned calves. It might be due to random variation, species differences, differences in cell type, or different methods of nuclear transfer. Telomere expert Jerry Shay of the University of Texas Southwestern Medical Center in Dallas hypothesizes that starting with relatively short telomeres in the senescent cells might prompt the early embryo to overcompensate and grow unusually long telomeres.

    Whatever caused the difference, the Lanza team's results are consistent with preliminary findings of two other groups. In as yet unpublished work, Xiangzhong Yang of the University of Connecticut, Storrs, has found that the telomeres in calves cloned from adult cells were of at least normal length. And Teruhiko Wakayama of The Rockefeller University in New York City says that he, with colleagues in Hawaii and Japan, found a similar pattern in telomeres of cloned mice.

    The researchers hope the findings will provide insights into the source of the egg cell's rejuvenating power. “Ultimately we want to understand how that reprogramming goes on in the oocyte so we could do it in vitro” and skip the embryo stage, Martin says. Several groups are working toward that goal, hoping to produce replacement tissues without the ethical baggage.


    NIH Nomination on Hold for This Year

    Eliot Marshall

    Four months after Harold Varmus resigned as director of the National Institutes of Health (NIH), the agency has learned that it will have to go without a permanent chief for at least the rest of this year and probably for part of 2001. Deputy NIH director Ruth Kirschstein, a veteran research manager and former head of the National Institute of General Medical Sciences, who took charge of NIH in January, will continue as acting chief.

    A federal official confirmed last week that Secretary of Health and Human Services Donna Shalala urged the Administration not to nominate a permanent replacement for Varmus at this time, and that White House officials agreed. In addition, sources say, Shalala consulted with the leading candidate for the NIH job, whose name has not been disclosed officially but privately is acknowledged to be Gerald Fischbach, director of NIH's National Institute of Neurological Disorders and Stroke. Varmus recruited Fischbach to NIH in 1998 from his position as chair of the departments of neurobiology at Harvard Medical School and Massachusetts General Hospital in Boston. Shalala and the candidate “mutually agreed” that it would be best not to send his name to the Senate for confirmation, the source said, primarily because time is running out for the Clinton Administration.

    Fischbach and NIH officials aren't discussing the decision. But a Senate Democratic aide who follows NIH affairs says that “NIH people were up here last week,” explaining that they had shelved the nomination because of the “short time frame for moving a name through the Senate.” In a nomination hearing, the aide said, “any nominee would have to expect tough questions regarding the use of fetal tissue and embryonic stem cells.” Even if the review went smoothly, the new NIH director would have only a few months in office before the arrival of a new Administration—and possibly a move to change NIH's leadership. The decision to stick with the status quo, the Senate aide argued, is also a “vote of confidence” in NIH and “a recognition that Kirschstein is getting high marks for her handling of the job.”


    Task Force Tinkers With Research Council

    Andrew Lawler

    After several years of public turbulence, the U.S. national academies of science and engineering are about to embark on some private upheaval. The chiefs of the National Academy of Sciences (NAS) and its sister groups, the National Academy of Engineering (NAE) and the Institute of Medicine (IOM), have set their sights on restructuring the National Research Council (NRC), the huge think tank-like operation responsible for most of the reports, meetings, and workshops carried out each year by the academies.

    A 15-member task force, chaired by retired Howard Hughes Medical Institute president Purnell Choppin and retired Honeywell vice president Gerald Dinneen, was formed in August 1998 and began meeting last spring. Its fourth and final session is scheduled for next month, with a report due in August. On the agenda are proposals that would streamline the Byzantine NRC structure, raise additional revenue from state governments and other nonfederal sources, and extend its influence beyond its bread-and-butter reports on topics ranging from defending the country against nuclear attacks to improving minority health care.

    “It won't be wallpaper,” predicts Mary Jane Osborn, a microbiologist at the University of Connecticut Health Center in Farmington and a member of the task force. But neither will it be as radical as the last review, instituted by then-NAS president Frank Press in 1981, which redrew the entire NRC map. “The layers of approval [for individual NRC reports] need streamlining, not removal,” says NAS president Bruce Alberts, who also chairs the NRC.

    There is widespread agreement that some sort of an overhaul is long overdue. The NRC, created in 1916, produces about 200 reports a year with help from a full-time staff of about 1000. However, its revenues—$182 million in 1998—have remained stagnant in recent years, and many customers have complained about the high cost and long wait associated with many NRC studies (Science, 9 May 1997, p. 900). In addition, its rigid and complicated structure—the council has seven commissions that oversee most of the work of innumerable boards, task forces, and working groups—is poorly suited to interdisciplinary problems.

    The question of restructuring the NRC has been on the table for several years. But a messy fight that resulted in the departure of former NAE president Harold Liebowitz (Science, 1 March 1996, p. 1222) and legal wrangling over whether the NRC must abide by a law that requires government advisory committees to conduct their business in public (Science, 14 November 1997, p. 1219) left top officials with little time to address possible changes. “We were kept busy with a series of crises, and now things have quieted down,” says Alberts. The task force's charge, he adds, is to come up with a system that is “more efficient while providing equal quality.” NAE president William Wulf says that the growing need for crosscutting approaches and the increased role of states in science and technology initiatives demand a review of the NRC's products, processes, and organization. “Everything is on the table,” says Wulf.

    Currently, the NRC's governing board approves requests for a report, which is then assigned to the appropriate body. A draft report, put together by a committee of outside experts, is typically reviewed by its parent commission, as well as by a separate report review committee that monitors the quality of the draft. Many voices inside and outside the NRC say that the commissions are a weak link in a process designed to ensure accurate and objective reports. “You could take out that layer,” says one former NRC official. Another former official complains about the drain on time, money, and effort from frequent “dog-and-pony shows” performed by staff to keep commissions up to date.

    Task force members agree that the commission structure should be revised to enhance the work of the boards. “You cannot mess with the boards—they are the ones in the trenches, the front-line troops,” says member Brad Parkinson, a Stanford University physicist and engineer. Alberts says he would welcome “more standardized procedures” for the commissions. But sources familiar with the task force's deliberations say it is unlikely that the commissions—created in the 1982 reorganization that followed the 1981 study—will disappear. That approach was tried unsuccessfully in the policy division, says Alberts.

    The scope of the NRC's work is also under scrutiny. Parkinson says the group is rethinking the mix of core activities in light of a flattening of federal requests. “Less [federal] work comes in over the transom,” says Thomas Deen, a former NRC staffer who sits on the task force. At the same time, he says that the NRC “is uniquely positioned” to help states in such areas as transportation, education, and health care delivery. The NRC derives only about 15% of its revenues from nonfederal sources.

    The panel is also looking at how to supplement the NRC's primary diet of reports with roundtables, workshops, fellowship programs, and other activities. “No one is trying to denigrate the studies, but there can be more synergy” in what the NRC accomplishes, Deen says. Creating a body of work in a specific area is another approach, notes IOM president Ken Shine, as the IOM has done to much acclaim with health care issues.

    Once the report is submitted, it will fall to Alberts, Wulf, and Shine to win over the membership, volunteer community, and NRC staff. “We have to be strategic,” says Alberts, who plans to start lobbying members this week during the academies' annual meeting in Washington. “It's not going to be simple to get members to recognize that changes will be good for the organization in the long run.” One thing that may not change, however, is the academies' penchant for secrecy. The document outlining the 1982 reorganization remains confidential, and academy officials say there are no plans to release the new report, either.


    Draft Report Affirms Human Influence

    Richard A. Kerr

    For the past several years, an international panel of climate scientists has been testing alternatives to the idea that people are affecting global climate. They examined climate's natural variability, changes in solar radiation, and volcanic outpourings, among other candidates. But none of those factors fit the past century's observed warming as well as the explanation they suggested in 1995: an increase in greenhouse gases generated by human activity. So last week, the group, the United Nations-sponsored Intergovernmental Panel on Climate Change (IPCC), released the draft of a new report concluding “that there has been a discernible human influence on global climate.” If those words hold up under further expert and governmental review, they would be the strongest official pronouncement yet that human-induced warming is real.

    “Something definitely seems to have happened” to the climate, says climate researcher Tim P. Barnett of the Scripps Institution of Oceanography in La Jolla, California, who reviewed part of an earlier draft. As this draft points out, “three of the last 5 years have been the warmest in the instrumental record,” which goes back 140 years. And three different records of temperature preserved in tree rings and elsewhere have now revealed the large, abrupt 20th-century warming to be unique in the past 1000 years.

    Exceptional century. Temperatures recorded in tree rings and elsewhere (purple) reveal that the 20th century (red is instrumental record) was unique in the millennium.


    The confident recognition of an anthropogenic climate effect—which could bolster calls for action to curb global warming—is the draft report's only major shift since 1995, when the IPCC found that “the balance of evidence suggests a discernible human influence.” The new report notes that there has been little progress in projecting the future of greenhouse warming, thanks to uncertainties about everything from climate models and the behavior of clouds to the vagaries of humans' burning of fossil fuels. Even so, the report, to be finalized later this year, should inform negotiations that culminate this fall on the implementation of the Kyoto Protocol for the reduction of greenhouse gas emissions.

    The IPCC gained confidence in identifying the 0.6°C warming of the past century as anthropogenic through a process of elimination. Since the previous report, researchers have run their improved climate models repeatedly and longer to look for alternatives—the natural ups and downs of temperature, solar variability, or volcanic emissions. None seems to suffice. And model simulations of the past century including rising greenhouse gases bear a strong resemblance to the actual warming.

    Barnett is cautious about declaring complete certainty, but “we have a change we can't explain with natural variations. There aren't many other options.” Climatologist Gerald North of Texas A&M University in College Station, who does greenhouse detection work but has not been involved in the IPCC process, is more confident: “There are too many independent pieces of evidence, and there's not a single piece of contradictory evidence,” he says. North is particularly impressed by the 1000-year temperature records. “The planet had been cooling slowly until 120 years ago, when, bam!, it jumps up,” he says. “We've been breaking our backs on [greenhouse] detection, but I found the 1000-year records more convincing than any of our detection studies” using climate models.

    Even greenhouse contrarians are tacitly going along with the IPCC's confident conclusion. Rather than dispute the reality of the warming or its cause, they have lately emphasized its modest size and inferred minimal future negative effects. Much of the warming, they note, has come at night, in the winter, and in areas that might stand some warming, such as Siberia.

    While the report seems to reflect broad support for the recognition of human-induced climate change, “we don't quite know what it means for the next 100 years,” admits North. The report offers nothing new on how much temperatures might rise given an added shot of greenhouse gases. It cites the same possible warming from a doubling of carbon dioxide—2.5°C with a range of 1.5° to 4.5°C—as did the 1990 and 1995 reports. Indeed, that range goes back to a National Academy of Sciences report of 1979. Uncertainties in the magnitudes of complicating factors such as solar variations and the effects of pollutant hazes have changed little since 1995.

    One change in the report—a more prominent role for socioeconomic factors—only increases the uncertainty. Depending on which of six possible scenarios for emission of greenhouse gases and cooling pollutant hazes is used, warming by 2100 could be between a modest 1°C and a sizzling 5°C. The range of warming created by economic, demographic, and policy assumptions in the scenarios “is similar to that due to uncertainty in models,” the report observes. With so much up in the air, the IPCC should have no lack of grist for its next report in 2005.


    AIDS Researchers Decry Mbeki's Views on HIV

    Jon Cohen

    Most governments that face a serious AIDS epidemic have taken a long time to acknowledge the fact. In South Africa, one of the hardest hit countries in the world, this pattern has a bizarre twist: President Thabo Mbeki has acknowledged that his country has an AIDS epidemic, but he has questioned whether HIV is to blame.

    Not only is Mbeki publicly flirting with scientifically discredited ideas about the cause of AIDS, but a leading skeptic of HIV's role in the disease has been invited to serve on a panel to discuss how South Africa should deal with the crisis. These moves are drawing international attention—and increasingly sharp attacks from AIDS researchers inside and outside South Africa, where the virus has infected one out of every 10 adults.

    Mbeki's questioning of the scientific evidence that HIV causes AIDS became front-page news around the world last week when The Washington Post revealed that he recently sent a letter about his views to President Bill Clinton, other heads of state, and U.N. Secretary-General Kofi Annan. In the letter, Mbeki decries the “orchestrated campaign of condemnation” that has been directed at him for seeking out the views of so-called AIDS “dissidents,” such as the University of California, Berkeley's, Peter Duesberg, who in 1987 began challenging the widely accepted scientific conclusion that HIV causes AIDS (Science, 9 December 1994, p. 1642). “We are now being asked to do precisely the same thing that the racist apartheid tyranny we opposed did, because, it is said, there exists a scientific view that is supported by the majority, against which dissent is prohibited,” wrote Mbeki in his 3 April letter. “The day may not be far off when we will, once again, see books burnt and their authors immolated by fire by those who believe that they have a duty to conduct a holy crusade against the infidels.”

    “I think the letter was emotional and irrational,” says Malegapuru William Makgoba, an Oxford-trained immunologist who in July became the first black head of South Africa's Medical Research Council. “This man will regret this in his later years. He displays things he doesn't understand.”

    Makgoba says Mbeki told him and others earlier this year that he became intrigued by the dissidents' views after reading about them on the Internet. In January, Makgoba says Mbeki sent him about 1500 pages of documents that question the so-called “HIV/AIDS hypothesis.” “It's pure rubbish,” says Makgoba. “They never provided any data and, at the same time, they are taking things out of context.” He told Mbeki as much in a letter that also offered detailed counterarguments. “His credibility as an African leader may suffer from this,” says Makgoba, who recently edited a book called African Renaissance, which has an introduction written by Mbeki.

    Parks Mankahlana, Mbeki's spokesperson, stresses that Mbeki has never said that he does not believe that HIV causes AIDS. “We've gone through all of his speeches,” says Mankahlana, who points out that Mbeki has increased support for AIDS research, encourages the use of condoms, and always wears an AIDS ribbon on his lapel. Mbeki, says Mankahlana, is simply exploring a range of views on the role that HIV plays in the disease. “The problem that the scientific world has is this: It has to do with human arrogance.”

    The dissidents' views are expected to be represented on a panel of about 30 AIDS “experts” that South Africa's Department of Health is convening to discuss how to address the country's epidemic. Duesberg says he has been invited and may well attend the panel's meeting next month. “I think after this letter, I have to go,” says Duesberg. “It's getting hot again, just like in the old days, thanks to Mbeki. I'm surprised that there's a place left on this planet where you can ask commonsensical questions.”

    In part because of Mbeki's stance, some AIDS researchers have threatened to boycott the international AIDS conference scheduled to be held in Durban this July. But Salim Abdool Karim, a leading South African AIDS researcher who chairs the scientific committee for the meeting, says he does not expect Mbeki's views to depress attendance. “In fact, it has encouraged some people to say, ‘I will attend the conference,’” Karim says. Karim, who conspicuously was not invited to sit on the health department's panel, hopes Mbeki will quickly declare that he believes HIV causes AIDS. “This should be resolved urgently, rather than making it an international issue,” says Karim.


    Heat Flow Runs Into Quantum Limit

    Adrian Cho

    Heat is a symphony of vibrations rippling through a material. But, just like the electrons flowing in an electrical current, the individual vibrations are really quantum mechanical waves. Now a team of physicists has found that they can filter out all but a handful of those vibrations by making them jiggle down a tiny beam only a few billionths of a meter thick. When they do that, the quantum mechanical nature of the vibrations sings out, as the amount of heat the vibrations will carry butts into a fundamental quantum limit.

    The findings, reported in the 27 April issue of Nature, raise the prospect of observing individual vibrations, called phonons. They also provide a warning for scientists and engineers hoping to create wires or machines only a few molecules thick: Such devices may overheat in a hurry.

    In 1988, physicists discovered that when electrons flow through a wire only a few nanometers, or billionths of a meter, thick, they move in a handful of quantum channels. So when researchers increase the voltage between the two ends of a tiny wire, the current passing through it climbs in a series of even steps as the channels open one by one. Now Michael Roukes, Keith Schwab, and their colleagues at the California Institute of Technology in Pasadena have overcome daunting technical challenges to catch heat behaving in a similar manner. “It's beautiful work,” says Alex Zettl, a physicist at the University of California, Berkeley. “I have nothing but praise for it.”

    Roukes and colleagues set out to measure the flow of heat in beams of silicon nitride a mere 60 nanometers thick and 200 nanometers wide. They heated each beam at one end with a minuscule electric heater and tracked the temperature difference between the two ends by measuring the temperature-dependent jostling of electrons in gold patches painted on either end. They then monitored the thermal conductance, the ratio of the heat applied to the temperature difference, as they cooled the beam toward absolute zero. To keep from melting the delicate device, the researchers kept the heat down to about a femtowatt, or a millionth of a billionth of a watt—roughly the power that would reach your eye from a lightbulb 60 miles away. “You have to control everything well below the femtowatt level,” says Schwab. “That's what's terrifying about this experiment.”

    As the researchers cooled the beam, the thermal conductance fell in proportion to the temperature cubed as the higher frequency vibrations, the flutes and violins of the thermal symphony, faded out. Then, below 1 kelvin, the researchers found that the thermal conductance began to decrease in direct proportion to the temperature. That meant they had winnowed out all but the bass fiddle, the four simplest vibrations, analogous to the lowest energy channel in quantum electrical conductance. The rate of decrease revealed a limit on how much heat these vibrations could carry. Two years earlier, physicists George Kirczenow and Luis Rego of Simon Fraser University in Burnaby, Canada, had predicted just this fundamental limit. “I was obviously hoping that they would see what we predicted,” Kirczenow says, “but I'm stunned that the agreement was so good.”
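
    The predicted limit is the quantum of thermal conductance, g₀ = π²k_B²T/3h per vibrational mode. The article does not spell the formula out, so the sketch below is a reconstruction from the published prediction, using CODATA values for the constants; it shows both the femtowatt-per-kelvin scale of the experiment and the direct proportionality to temperature that the team observed below 1 kelvin:

```python
import math

# Quantum of thermal conductance per ballistic phonon mode:
#   g0 = pi^2 * kB^2 * T / (3 * h)
kB = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s

def g0(temperature_k):
    """Maximum thermal conductance of one vibrational mode at temperature T."""
    return math.pi**2 * kB**2 * temperature_k / (3 * h)

# At 1 K, each mode carries at most about a picowatt per kelvin of
# temperature difference -- hence the femtowatt-scale heat flows.
print(f"g0 at 1 K:        {g0(1.0):.3g} W/K")      # ~9.46e-13 W/K
# The beam's four lowest modes together:
print(f"4 modes at 1 K:   {4 * g0(1.0):.3g} W/K")
# Linear in T, matching the observed low-temperature behavior:
print(f"g0 at 0.5 K:      {g0(0.5):.3g} W/K")
```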

    The observation may mean extra work for researchers striving to manufacture machines only a few molecules or atoms across, especially if they must run at low temperatures. Such tiny devices may have to get rid of their heat only through the limited channels, so they may tend to overheat. “When things get very small, these sorts of limits will come in,” Roukes says. “So you have to consider how you'll deal with them.”

    Now that they've struck a quantum chord, researchers would like to observe the individual phonons that are doing the vibrating. Roukes envisions an experiment in which an exquisitely sensitive detector registers a click for each phonon. And Zettl thinks that may be just the beginning: “There are going to be many, many experiments coming out of these results.”


    CITES Puts Off Plan to Hasten Shipments

    Wendy Williams*
    *Wendy Williams is a freelance writer in Mashpee, Massachusetts.

    Nairobi, Kenya—In a setback for scientists, an international trade body has shelved a proposal to simplify research on endangered species. The proposal would have waived an often-cumbersome permitting process for handling samples of everything from hair and DNA to cell lines derived from endangered species. The rules are mandated under the 25-year-old Convention on International Trade in Endangered Species (CITES), which three European countries lobbied to change. But unexpected opposition from the United States and several developing countries at a meeting here last week torpedoed the proposal until at least 2002.

    The restrictions are meant to squelch international trafficking in wildlife while granting exemptions for research samples. They prevent smugglers of animal parts—say, bear gall bladders used in traditional medicine—from masquerading as scientists by requiring a permit from both the originating and destination countries. Many countries are slow to issue permits, and in some the process hinges on paying a bribe. Fed up with the status quo, Germany, Switzerland, and the United Kingdom proposed an amendment that would eliminate the need for permits for biomedical research, diagnosing animal diseases, and DNA testing.

    Scientists at the meeting shared a few bureaucratic nightmares in hopes of bolstering their argument. For example, it took 7 months for a German group to get a permit from the U.S. Fish and Wildlife Service (FWS) to send blood from a St. Vincent's Amazon parrot to New York City for DNA analysis, says ornithologist Donald Bruning of the Wildlife Conservation Society in New York City. Such delays can be fatal for a sick animal needing a proper diagnosis, says Samuel Wasser of the University of Washington, Seattle, who studies stress hormones in scat in several African countries. By the time the blood sample comes back, Wasser says, “all we know is what the animal died from.”

    Although sympathetic to the plight of scientists, FWS's Donald Barry says that the European proposal would lead to “a serious erosion of domestic controls.” Customs agents can't judge whether someone with wildlife parts is a bona fide scientist, he adds. Instead, Barry suggests that countries issue blanket permits to certain institutions and scientists—a plan that Wasser fears would “create a scientific and institutional elite.”

    Delegates from several developing nations also went on the attack. “How can we ensure that these samples are not used for bioprospecting?” says Hesiquio Benitez Diaz, a biologist with Mexico's National Commission for the Knowledge and Use of Biodiversity. Permits help to keep track of biological resources that leave the country. If a drug is developed from an endangered plant, for instance, a country may be able to seek royalties by proving the plant's origin. But right now, says Diaz, “we just don't have a framework yet for controlling access to our own genetic resources on a global level.”

    In closing the biennial treaty meeting on 20 April, CITES officials instructed several working committees to resolve their differences before the next treaty meeting in October 2002.


    Relativity Passes a Tough Cosmic Test

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    Continents drift, empires rise and fall, stock markets whipsaw, but photons just keep rolling along. That reassuring idea—that the speed of light is independent of the velocity of the light source—has been a central postulate of special relativity for almost a century. Now, a rigorous test has shown that it is indeed true, to at least 20 decimal places.

    On 29 April, at the April meeting of the American Physical Society in Long Beach, California, astronomer Kenneth Brecher of Boston University will report how he confirmed Einstein's assumption by studying distant gamma ray bursts, violent explosions at the edge of the observable universe. Astronomers see the bursts as fleeting pulses of high-energy radiation, some less than a millisecond long, which they plot as peaked graphs of brightness versus time known as light curves.

    Nobody knows what gamma ray bursts are or where they get their staggering power. Even so, Brecher says, it's safe to assume that any matter capable of emitting such energetic radiation must be hurtling through space at relativistic speeds, at least a few percent of the 300,000-kilometer-a-second speed of light. And if the explosion flings particles in many directions, sources of the radiation must start out moving at different velocities with respect to Earth.

    If the speed of light did depend on the motion of the source, Brecher says, that “velocity dispersion” would give some photons slightly higher speeds than others. During the billions of years it takes the photons to reach Earth, those tiny differences would smear out the light curve of a gamma ray burst, spreading the peak over a longer time.

    By analyzing the light curves of a number of gamma ray bursts with extremely rapid brightness variations, Brecher found that any light speed differences must be smaller than 3 billionths of a millimeter per second. “The speed of light is really constant to a precision of one part in 10^20,” he says.
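The scale of that bound can be checked with a back-of-envelope calculation. The sketch below uses illustrative round numbers (a roughly 10-billion-year travel time and millisecond burst structure, both assumptions, not figures from Brecher's analysis): if photons leave the source with a speed spread dv, their arrival times spread by about dt = T * (dv / c), so the observed sharpness of the light curve caps dv/c.

```python
# Back-of-envelope version of the light-curve argument.
# Assumed inputs (illustrative, not from Brecher's actual analysis):
#   - travel time T of ~10 billion years
#   - unsmeared burst structure down to ~1 millisecond

C = 3.0e8        # speed of light, m/s
YEAR = 3.15e7    # seconds per year

T = 10e9 * YEAR  # assumed photon travel time, seconds
dt_max = 1e-3    # assumed sharpest observed variability, seconds

# A speed spread dv smears arrival times by dt ≈ T * dv / c,
# so dt < dt_max implies dv < c * dt_max / T.
dv_max = C * dt_max / T   # largest allowed speed difference, m/s
frac = dv_max / C         # fractional bound on light-speed variation

print(f"dv_max ≈ {dv_max:.1e} m/s")
print(f"dv/c   ≈ {frac:.1e}")
```

With these assumed numbers, dv_max comes out around 10^-12 m/s (a few billionths of a millimeter per second) and dv/c around 10^-21 to 10^-20, consistent with the one-part-in-10^20 precision quoted in the article.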

    Bradley Schaefer, an astronomer at Yale University who has used gamma ray bursts to test other tenets of relativity, points out a possible weak point in Brecher's argument: the assumption of velocity dispersion. “How do you know that the gamma rays aren't emitted by things that have the same velocity?” he says. “You can concoct some finely tuned scenarios where that is not the case.” That's possible, Brecher says, but he thinks such scenarios—in which gamma rays reaching Earth all came from particles with the same velocity relative to us—would have to be hopelessly contrived.

    Why do astronomers bother torture-testing a theory that almost nobody doubts is true? Schaefer describes the relativity tests as “anomaly searches.” “We push as hard as we can, hoping that something breaks,” he says. “Who knows what kind of subtle discrepancies we may find? That would be big news and would lead to a new important step” in physics. Brecher agrees: “No one expects great deviations, but one should test the theories as well as one can.”


    Milky Way Looks Like Big Kid on the Block

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    Fellow citizens of the Milky Way, take heart. No longer do you have to settle for second best. If two British astronomers are right, our underdog galaxy is, in fact, the heavyweight champion of the neighborhood.

    The neighborhood in question is the Local Group, a collection of some 30 galaxies in a region of space about 8 million light-years across. For decades, astronomers have believed that the heftiest of those galaxies is Andromeda. Earlier estimates put its mass at about twice that of the Milky Way, because it is larger and brighter and contains twice as many globular star clusters—spherical collections of hundreds of thousands of stars. But that's “poor evidence,” says astronomer Wyn Evans of Oxford University in the United Kingdom. Measurements of brightness, he points out, focus on the visible disk of a galaxy while ignoring the spherical halo of dark, unseen matter that surrounds it. Because Andromeda's disk is more massive than its counterpart in the Milky Way, astronomers assumed that Andromeda's total mass, including the halo, would also be larger.

    Evans and astronomer Mark Wilkinson of Cambridge University decided to put that assumption to the test. Together they analyzed velocity measurements of 37 objects that orbit Andromeda, such as small satellite galaxies and outlying globular clusters and planetary nebulae. Because the objects lie far from Andromeda's center, their speeds are determined by the galaxy's total mass—including the dark matter in the halo. From the speeds, Evans and Wilkinson estimate that Andromeda has a total mass of 1.2 trillion solar masses, about half the mass of the Milky Way, the duo reports in a paper slated for publication in the Monthly Notices of the Royal Astronomical Society. “Although the central region [of Andromeda] is certainly bigger and brighter,” Evans says, “the total mass turns out to be much less.”
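The logic of weighing a galaxy with distant tracers can be sketched with a simple virial-style estimator, M ≈ ⟨v²⟩r / G, which ties the enclosed mass to tracer speeds and orbital radii. This is a toy illustration with hypothetical tracer data, not Evans and Wilkinson's actual statistical method or their measurements:

```python
# Toy tracer-mass estimate (not Evans and Wilkinson's actual estimator):
# for satellites orbiting far outside a galaxy's disk, M ≈ <v^2 * r> / G
# gives the total mass enclosed, dark halo included.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # one solar mass, kg
KPC = 3.086e19     # one kiloparsec, meters

# Hypothetical tracers: line-of-sight speeds (km/s) and radii (kpc)
speeds_kms = [60.0, 90.0, 75.0, 110.0, 85.0]
radii_kpc = [100.0, 150.0, 120.0, 180.0, 140.0]

# Average v^2 * r over the tracers, then divide by G
terms = [(v * 1e3) ** 2 * (r * KPC) for v, r in zip(speeds_kms, radii_kpc)]
mass_kg = sum(terms) / len(terms) / G
mass_msun = mass_kg / M_SUN

print(f"estimated enclosed mass ≈ {mass_msun:.2e} solar masses")
```

The made-up numbers here yield a mass of order 10^11 solar masses; the point is only that faster or more distant tracers imply a heavier halo, which is why the surprisingly low satellite velocities around Andromeda translate into a surprisingly low total mass.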

    “This is a surprising result,” says Piet van der Kruit of the University of Groningen, the Netherlands. “Although there is no direct proof, everyone assumes that the proportion between the visible and the dark mass is more or less the same for every galaxy.” If Andromeda really does have a lightweight halo, van der Kruit says, astronomers may have to develop a more complicated picture of how galaxies evolve.

    So far, few are rushing to change their minds. “This is a really interesting attempt to determine the mass of [Andromeda],” says Paul Hodge of the University of Washington, Seattle, a leading expert on the Andromeda galaxy, “but I'm a little bit nervous about using the distant, outlying possible satellites.” Sidney van den Bergh of the Dominion Astrophysical Observatory in Victoria, Canada, agrees. “I don't believe it yet,” he says. “It all depends on the objects you use.” Van den Bergh, who has discovered many satellite galaxies of Andromeda and of the Milky Way, says that two of the most distant satellite galaxies cited by Evans and Wilkinson—Pegasus and IC 1613—may not really be part of the Andromeda subgroup. In that case, their velocities may reflect forces other than the gravity of Andromeda, which could make them useless for estimating the galaxy's mass.

    Evans concedes that “our error bars are pretty large.” But he says new data further support the findings. “Since we submitted our paper, we have acquired additional Keck Telescope measurements on five faint dwarf galaxies that were discovered only in 1998 and 1999,” he says. The velocities of the new satellites also point to a skimpier Andromeda.

    A final answer must wait for more and better velocity measurements, Evans says. NASA's Space Interferometry Mission and the European Space Agency's Global Astrometric Interferometer for Astrophysics, slated for launch in 2006 and 2009, respectively, will probably settle the matter.


    BOOMERANG Returns With Surprising News

    1. Charles Seife

    For months, cosmologists have been rumbling with excitement, awaiting a look at the data that an antarctic balloon brought back from the edge of the universe. Now the wait is over. On 27 April, a map published in the journal Nature gave scientists their most detailed glimpse yet of the primordial universe, revealing the shape of the cosmos and the distribution of matter shortly after the big bang. It was worth the wait: The data support most cosmologists' view that the universe is “flat,” but cast doubt on some key assumptions about the balance of matter it contains or the nature of its early expansion.

    The data came from BOOMERANG, a set of sensitive microwave detectors that a truck-sized helium balloon carried on a 10-day swing around the South Pole in late 1998. During the flight, BOOMERANG (a contorted acronym of “Balloon Observations of Millimetric Extragalactic Radiation and Geophysics”) probed a large swath of sky for fluctuations in microwave radiation, a constant electromagnetic hiss that bombards Earth from all directions, accounting for about 1% of the noise on our television sets. The cosmic microwave background (CMB) is the leftover glow from the big bang.

    “This is probably the farthest light that can be observed,” says Phil Mauskopf, an astrophysicist at the University of Massachusetts, Amherst, and one of 36 scientists on the international team that masterminded the project. Soon after the birth of the universe, photons were tightly coupled to the hot plasma that made up most of the universe. Light and matter acted as a single fluid. But about 300,000 years after the big bang, the universe cooled enough for the plasma to condense, and the photons escaped from their cage of matter. The CMB is a snapshot of that moment. Thus, by looking at little fluctuations in the CMB, astronomers can map the ripples in the light-matter fluid just as the photons broke free. “You're looking at the surface of the early universe,” Mauskopf says.

    Unfortunately, the microwave background is so faint that noise from the ground and the atmosphere tends to swamp the signal from the heavens. To escape the racket, the scientists sent their microwave telescope up in a balloon, lofting it into a wind current that circles Antarctica. True to its name, BOOMERANG swung around the South Pole, returning 10 days later almost exactly to its starting point. Because the instrument was aloft for so long and had very sensitive detectors, it was able to measure the CMB over a wide area with great precision and with very low noise.

    “It's really the first high-resolution map across a large part of the sky,” says Wayne Hu, a cosmologist at the Institute for Advanced Study in Princeton, New Jersey. The balloon experiment has a resolution of about one-third of a degree; the famous COBE satellite, which first detected larger scale fluctuations in 1992, has a resolution of about 7 degrees.

    The results are exciting—and in some cases surprising. According to relativity theory, the four-dimensional “sheet” of space and time that we live on can be curved. For years, astrophysicists have been figuring out the ways in which curved space might distort the images of distant objects, in hopes that astronomers would be able to tell by looking just which sort of space we live in—spherelike, saddle-shaped, or neither.

    “The best part is nobody has to know all of this any more, because the universe is flat,” says Scott Dodelson of the Fermi National Accelerator Laboratory in Batavia, Illinois. BOOMERANG and other recent CMB experiments show that the fluctuations are not distorted as theory says they would be in curved space. But whereas a flat universe is what astronomers expected, another bit of BOOMERANG data took them by surprise. According to theory, the ripples in the microwave background ought to exist on many different scales, each contributing a “peak” to the data. BOOMERANG saw a peak corresponding to roughly 1-degree-sized fluctuations and theoretically should have spotted a half-degree peak as well. It didn't.

    “With the simplest models, [the peak] should have been higher,” Hu says. “It should have been detected.” University of Pennsylvania physicist Max Tegmark is excited by the surprise. “That is extremely interesting,” he says. “The mischievous side of me wanted that to happen.”

    The missing peak means that simple models of how the universe formed and what holds it together cannot be correct. To explain the observations, cosmologists must add some new wrinkles, but each has its own problems. “You can raise the amount of ordinary matter, baryons, in the universe, pushing up the first peak and pushing down the second,” Hu says. “But you have to push it up significantly, something like 20% to 50%.” You could jack up the amount of dark matter in the universe as well, or “tilt” the properties of the engine that drives inflation, or lengthen the plasma phase of the universe. But each of those models requires rethinking basic assumptions, Tegmark says. “You'd have to be violent to one of the sacred cows of cosmology.” If he had to choose, Tegmark says he would rather add matter than accept the tilt or late-recombination theories. “Those are the two lesser evils of those four.” Hu, on the other hand, favors a combination of extra matter and tilt.

    The news is not all bad for the Standard Model, Tegmark says; the shape of the 1-degree peak eliminates some alternative theories to the inflationary model, such as the ones that assume that “topological defects” rather than inflation were responsible for the structure of the universe. “With topological defects, you only predict one peak, but a very broad one. This peak is way too narrow,” says Tegmark. “This really means that most of the rivals to the standard theory just died.” Thus, variants of the Standard Model are really the only game in town.

    Which variants prevail will depend heavily upon future results. Much of the BOOMERANG data has yet to be processed, and soon NASA will be launching a microwave-sensing satellite, MAP, which might catch sight of the second and even a yet-to-be-discovered third peak. That would tell scientists just how much invisible dark matter and baryonic matter there is in the universe and would help nail down the values of 10 or so cosmological parameters, such as the cosmological constant. “The measurements that come out of that are going to be much more sensitive ways of weighing the universe than other cosmological tools,” says Tegmark. “To me, this experiment really signifies the beginning of a new era.”


    NIH, Under Pressure, Boosts Minority Health Research

    1. Laura Helmuth

    Advocates complain that NIH isn't doing enough to address minority health issues, and Congress is considering a bill to create a powerful center on health disparities; in response, NIH is launching a high-level initiative

    Death rates are one of the grimmest measures of the disadvantages faced by minorities in the United States. Study after study has shown that African Americans are more likely to die of cardiovascular disease, cancer, diabetes, asthma, complications from childbirth, and many other causes. Recent data from Hispanics, Asian Americans, and Native Americans show that people in these groups, too, are more likely than whites to die from some diseases. “This is an emergency health problem,” says Beverly Coleman-Miller, a visiting scholar at the Harvard School of Public Health in Boston.

    Last week, the National Institutes of Health hosted members of minority communities from across the country at a meeting to discuss NIH's efforts to address the issue. The occasion was the 10-year anniversary of NIH's Office of Research on Minority Health (ORMH), but one message came through loud and clear: Almost everyone at the meeting, from social workers to institute directors, said that NIH hasn't directed as much of its power to studying and alleviating health disparities as it should. Overcoming these gaps, says NIH acting director Ruth Kirschstein, is a scientific challenge and moral imperative: “We have a responsibility to be sure that [NIH's] knowledge benefits all of our citizens, all of our communities.”


    Meeting participants cited several factors that impede NIH's work on minority health care issues, from the lack of minority members on grant review boards to a perceived institutional bias against behavioral research aimed at identifying better disease prevention strategies for minorities. This situation is aggravated by the fact that, as currently constituted, ORMH can't award its own grants but can only collaborate with the various institutes. In addition, several speakers pointed out that lack of trust engendered by the infamous Tuskegee syphilis project makes some African Americans wary of participating in medical research.

    But the message wasn't unremittingly bleak. NIH officials described an institutes-wide review currently under way to intensify its efforts, and Health and Human Services Secretary Donna Shalala pointed out that the White House budget request includes an additional $20 million for coordinating health disparities research at NIH. Also in the works is a controversial proposal in Congress to elevate ORMH to a full-fledged center, which would give it both greater visibility and the power to fund its own projects, although it's too soon to tell whether Congress will buy the idea.

    Navigating the bureaucracy

    Meeting participants cited several problems with the NIH bureaucracy—even in the ORMH. The office was established in 1990 in response to growing concern that minorities disproportionately face health problems. One of the studies that raised public awareness most dramatically was a 1990 finding that men living in Harlem had a lower life expectancy than men living in Bangladesh. The NIH director charged the office with the mission to “support and promote biomedical research aimed at improving the health status of minority Americans across the life-span” and to “encourage the participation of underrepresented minorities in all aspects of biomedical and behavioral research.”

    ORMH has some success stories. It is the chief sponsor of the Jackson Heart Study (JHS), a longitudinal, community-based study modeled on the Framingham Heart Study. Building on an earlier study, JHS aims to enroll 6500 African-American men and women this year in Jackson, Mississippi, which has one of the highest incidences of stroke in the country. The goals are to assess their risk factors for heart disease and stroke and evaluate therapies and screening methods for these diseases. ORMH has also funded clinical trials on diabetes management at Charles R. Drew University, a historically black university in Los Angeles. And supplemental grants from ORMH have allowed the University of California, San Francisco, to enroll more minorities in ongoing drug-abuse studies.

    The ORMH's power, however, is mainly symbolic. Its budget this year is just $97 million—out of an NIH total of almost $18 billion (although by some estimates, the total spent on research and training in minority health issues adds up to $1 billion). But its biggest limitation, critics of the current system say, is that the office cannot award its own grants. ORMH serves as what ORMH director John Ruffin calls “a catalyst, a change agent.” It can help identify and fund pilot projects, but it must team up with another institute that manages the research projects, even when the money comes out of ORMH's budget. Without real power, says the American Medical Association's senior vice president for professional standards, Reed Tuckson, “the office has faced obstacles of indifference and ghettoization.”

    Even with ORMH's help, researchers focusing on health disparities say that navigating NIH's obstacle course of a bureaucracy can be tough. Grant applications for such research are at a disadvantage, argues Mario de la Rosa of Florida International University in Miami, because “the review process is the heart and soul of NIH, and people in the review process don't understand our communities.” Angela Pattatucci Aragon of the Center for Scientific Review at NIH, which assigns review panels, adds that “there's a strong propensity to fund what has worked in the past,” and that “selects against research in our communities,” which have generally been poorly studied. Furthermore, some study sections fund projects that are “generalizable,” which handicaps applications for research on interventions specific to one minority community.

    Attitudinal biases pervade NIH, too, claims Tuckson. Studying many of the factors—such as smoking rates, access to cancer screening tests, or obesity—that have been identified as contributing to health disparities isn't considered “hard science,” Tuckson says. To circumvent such problems, speaker after speaker at the ORMH conference recommended that the office be elevated within the NIH hierarchy to a center, a change that would require a congressional directive.

    Representative Jesse Jackson Jr. (D-IL) and Senator Edward Kennedy (D-MA) have introduced bills into the House and Senate to do just that. Their proposal would create a National Center for Research on Domestic Health Disparities that would control its own budget, make its own grant funding decisions, and issue calls for research proposals. Support for a center was not unanimous, however. Keith Norris of Drew University says, “I worry that other institutes may feel they will no longer have to make the same effort” to study health disparities if a designated center is on the task. Former NIH director Harold Varmus also opposed such a move, pointing out that the wide range of scientific and social issues that contribute to health disparities requires the attention and expertise of all NIH institutes, not just a single center.

    Even if the Jackson and Kennedy bills don't make it through Congress this year, the $20 million discussed by Shalala should create a center within the ORMH to coordinate the health disparities research done by the institutes. It might also hold limited grant-making authority. But as Lovell Jones, director of the M. D. Anderson Cancer Center in Houston, points out, this proposal refers to a “small-c center” that would remain within the Office of the Director, rather than the “big-C Center” of the Jackson and Kennedy bills.

    A new NIH initiative

    Whether or not the ORMH metamorphoses into an independent center, NIH is hatching a systemwide plan to address health disparities. In January, a working group composed of all the institute directors—a very high-powered committee, as co-chair Yvonne Maddox, acting deputy director of NIH, points out—solicited 5-year plans from each institute. Committee co-chair Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases (NIAID), says the effort will allow NIH to “formalize and concretize some things we've done all along,” as well as initiate new research and training projects.

    The working group is focusing on six primary areas: cancer screening and management, cardiovascular disease, diabetes, HIV and AIDS, infant mortality, and mental health. Other conditions, such as glaucoma, deafness, and asthma, will come in for attention as well, Maddox says. Once the working group receives all the institutes' plans—some institute directors missed the 3 April deadline—it will integrate them and forward a trans-NIH plan to the ORMH director and its advisory committee, who will submit a final plan to the NIH director. Fauci expects the plan to be integrated into the 2002 fiscal year budget.

    Some of the initiatives proposed by the institutes will compensate for past oversights, Fauci says. For example, better tissue-typing methods are needed for African Americans, who disproportionately suffer from kidney disease and require kidney transplants. Possibly because the original tissue-typing banks, set up in the 1950s, were based on samples from whites, African Americans are more likely to suffer from transplant rejection. And National Institute of Mental Health deputy director Richard Nakamura says that more culturally sensitive standards are needed for diagnosing and treating mental disorders in minorities.

    The plan will also try to rectify what is often a dearth of minority participants in clinical trials. In some cases, this is due to failure of the trial organizers to make enough of an effort to include them. That's relatively easy to fix, says Fauci. In AIDS research, for example, a majority of new HIV infections occurs in minority populations—but earlier research projects enrolled few minorities in clinical trials. Then NIAID included in its grant scoring process a measure of how accurately the proposed project reflects the population suffering from infection with HIV. “That turned it around in a couple of years,” Fauci says.

    But clinical trial organizers face another problem: the reluctance of minorities to sign on. Shalala attributes this to many African Americans losing trust in the medical system after the Tuskegee syphilis project, which ran from 1932 to 1972. During that time, doctors at Alabama's Tuskegee Institute watched the course of syphilis in 399 black men—a disease they could have cured with a shot of penicillin once antibiotics became widely available after World War II.

    An ambitious project recently announced by the National Cancer Institute (NCI) as part of its 5-year plan may help overcome this lack of trust. NCI has initiated a $60 million program to establish a network of 17 regional cancer centers that will conduct research and run outreach and training programs in minority communities. The institutions serve, for example, the Cocopah and Paiute tribes in the southwest, rural populations in Appalachia, Latinos in Denver, and a multicultural population in Harlem.

    The hope is that these centers will provide better access to preventive health care, including education about nutrition and risky behaviors, and serve as centers for research on the disproportionate cancer burden in their communities. In addition, the research goals include making community members part of the teams that design and execute research projects. “If we don't understand the culture, the interface between culture and poverty, we cannot solve the cancer problem. We need to understand the [disease] in the context of what's happening to the human being,” says Harold Freeman, chair of the President's Cancer Panel. The NCI networks, he says, will help the NCI involve people “who will ask the questions that have not yet been asked or answered.”

    Ultimately, however, Tuckson urged participants at the ORMH conference not to focus just on bureaucratic reorganization or temporary plans: “I hope at the end of the day, it's irrelevant whether we have an office or a center.” What's needed, he says, is a “national conversation” about health that includes all Americans, not just “Leave It to Beaver's family.”


    Working in the Hot Zone: Galveston's Microbe Hunters

    1. Martin Enserink

    From a little-known school in a quiet Texas backwater, a world-class center for the study of infectious diseases is emerging

    GALVESTON, TEXAS—When malaria researcher Joseph Vinetz finished his postdoc at Johns Hopkins University 2 years ago, he had job offers from prominent universities in exciting cities on the East and West coasts—as well as one from a relatively unknown medical center in a small town in southeast Texas. “It felt like a choice between heaven and hell,” says Vinetz. “I chose hell.”

    Hell, in this case, was Galveston, population 59,070, a town on a flat barrier island in the Gulf of Mexico with boiling hot, sticky summers and a somewhat sleepy feel to it. Here, at the University of Texas Medical Branch (UTMB), a cadre of virologists and microbiologists is quietly building a topflight center for the study of emerging tropical and infectious diseases, and luring experts like Vinetz, who previously might have scoffed at the idea. The number of faculty members working in infectious diseases has roughly tripled to over 70 during the last decade; funding from the National Institute of Allergy and Infectious Diseases has shot up from $4.4 million in 1995 to almost $10 million this year.

    The newcomers are drawn to three departments—pathology, microbiology and immunology, and internal medicine—working together in a Center for Tropical Diseases, which has excellent facilities and a critical mass of bright minds to collaborate with. And they study everything from malaria, hantaviruses, dengue, yellow fever, and hemorrhagic fevers to Salmonella and hepatitis. “Every place has its good and bad times,” says Alan Barrett, a virologist who moved here from the University of Surrey in the United Kingdom 7 years ago, taking his research group of five with him: “This is our heyday. People want to come and work here.”

    In an example that astonished many in the field, Galveston now seems set to recruit C. J. Peters, a top virologist from the Centers for Disease Control and Prevention (CDC) in Atlanta. Although he hasn't signed a contract yet, Peters says he's “99% decided” that he'll go. “Good people always attract good people,” comments virologist Charles Calisher of Colorado State University in Fort Collins. “It's like a black hole!” “It's a fantastic group. They may well become the center for tropical medicine in the world,” adds Ian Lipkin, a molecular biologist at the University of California, Irvine, and one of the discoverers of the West Nile virus strain that surfaced in New York City last summer.

    UTMB is also planning to build a biosafety level 4 (BSL-4) lab—the type in which spacesuit-clad researchers work under the most stringent containment conditions—of which there are currently just a handful in the world. That would allow Galveston researchers to study any virus they want to and really put their center on the map, say colleagues.

    Back to the future

    If all that seems ambitious for such a small town, UTMB researchers like to point to Galveston's past. Once Texas's main seaport—during the 19th century, the island ranked as a Manhattan on the Gulf—Galveston was frequently plagued by scourges like yellow fever, cholera, typhoid, and dengue. Turning a vulnerability into a strength, the University of Texas opened its first medical school here in 1891; in its early years, it was one of the nation's finest. But in the 20th century, as the city itself slipped from prominence, the school lost its edge. A monstrous hurricane killed over 6000 Galvestonians in 1900 and flattened a large swath of the city, setting off a century of economic decline.

    As a result, UTMB didn't have a strong research tradition when David Walker, a pathologist from the University of North Carolina, Chapel Hill, seized the chance to become the new head of its pathology department in 1987. Walker had been fascinated by infectious diseases long before they became fashionable and was determined to make them the mainstay of his department. His own research interest was how intracellular bacteria such as Rickettsia and Ehrlichia cause disease, but he wanted UTMB to study other bacteria, viruses, and parasites as well, especially those occurring in Latin America. “I thought in Galveston we had to look south,” he says.

    The university supported Walker with money to gut a charming but run-down 1920s building on campus and turn it into a state-of-the-art lab with biosafety level 3 rooms, which was completed in 1995. But declining revenues in health care in recent years have tightened budgets at UTMB, as in many other academic health centers, and “we're just as pinched as any other department,” he says. Much of the growth had to come from outside funding. In addition, Walker says he strategically hoarded some $6 million between 1990 and 1995, when the university could still spend in style, then used it after the lab renovation to do what some colleagues say he does best: snatching up talent.

    Perhaps Walker's biggest coup came in 1995, when he hired Robert Shope and Robert Tesh, two world-renowned experts in arthropod-borne viruses, or arboviruses, from Yale University. Like many other places, Yale at the time was shifting focus to the AIDS pandemic, while more obscure viruses were falling from grace. And the lab came under heavy fire after a visiting French scientist became infected with the Sabia virus, an incident an investigation blamed on lapses in safety procedures. “Yale wasn't very supportive, and they would never invest in infrastructure,” says Tesh. “I decided I had to get out before I got too old to move.”

    Galveston seemed like an up-and-coming place, he says, and the facilities were great. Besides, he knew Barrett, and one of Tesh's postdocs, Scott Weaver, had just decided to move from the University of California, San Diego, to Galveston. And Galveston was just a short flight from Latin America, where he had many collaborators and did most of his fieldwork.

    Shope, 66 at the time, had led Yale's Arbovirus Research Unit for 24 years and earned himself a reputation as a senior statesman of virology; together with Nobelist Joshua Lederberg he had authored Emerging Infections: Microbial Threats to Health in the United States, a 1992 Institute of Medicine report that served as a wake-up call to the country. Few thought at his age he would leave New England for a new adventure in Texas. But once Tesh decided to leave, the choice was easy, says Shope, especially since the duo had built up the World Reference Center for Arboviruses together, a frozen treasure of thousands of virus strains and related reagents, funded by the National Institutes of Health. “It didn't seem feasible to have half of it at Yale and half of it in Galveston,” says Shope. So he retired and followed Tesh south, taking the collection along.

    Shope spends most of his time working with the collection and serving as a “walking encyclopedia” of virology, say colleagues. Researchers around the world use the collection to identify an unknown virus or characterize a particular viral strain. Tesh spends much of his time in the field in South America, studying the growing army of deadly agents there, such as yellow fever and arenaviruses, which cause hemorrhagic fevers. To better understand what causes new outbreaks, Tesh also plans to study how the ecology of a virus changes when the rainforest is cleared to make room for people. Leishmania, the parasite that causes leishmaniasis, once was mostly limited to forested areas, but now infects dogs and horses and has become an urban disease; the question is whether viruses are doing the same, says Tesh.

    With “the two Bobs,” as Shope and Tesh are fondly known, Galveston's lure rose exponentially. Vinetz, for instance, says the presence of a strong arbovirology group was one of the factors that persuaded him to come. Galveston's collective knowledge about mosquitoes and other insect vectors is a big plus, he notes. And it pays off: Last year, Vinetz found the gene encoding the mosquito gut protein that Plasmodium, the malaria parasite, uses to break through the insect's gut wall and travel to its salivary glands. Tesh also urged Charles Fulhorst, a virologist who was on Yale's payroll but had been stationed at the CDC for several years, to join him and Shope in Galveston. Fulhorst is one of the researchers who's eagerly awaiting the BSL-4 lab; at CDC, he practically lived in one, he says, and he's eager to do more work on some hantaviruses and arenaviruses. (The Sealy & Smith Foundation, which exclusively supports UTMB, has pledged to pick up the tab for Galveston's BSL-4 lab.)

    The promise of the new facility is also one of the things drawing CDC's Peters. A virus hunter with 3 decades of experience—he was immortalized in books like The Hot Zone and flicks like Outbreak—Peters currently directs CDC's Special Pathogens Branch. But there the emphasis is always on responding to outbreaks, he says; at an academic center like UTMB, Peters thinks he'll have the time to dig into more basic virological questions. For instance, he'd like to study the molecular mechanisms by which hemorrhagic fevers cause disease and death. “For my personal scientific fulfillment, I'd like to get to the bottom of some of these problems,” says Peters. “At CDC, that's not really our mission. … We don't have the funding, people, space, etc., to do that.”

    A different culture

    One of UTMB's big attractions, says Vinetz—and, in fact, it's repeated by almost everybody else on campus—is that researchers here seem to collaborate more easily than elsewhere. Perhaps it's because there just happen to be few prima donnas among senior researchers, some say. “Usually, big egos get in the way of research,” says Barrett. “It's very strange, but here everybody wants to work together.” Others say it's a Galveston thing, a unique local culture—call it an island state of mind. “It's a low-traffic, no-stress place, and it's very congenial,” says Vinetz. But whatever the cause, “it's an extraordinary scientific environment that you could find nowhere else in the world.”

    One example is a daring project in which UTMB virologists and pathologists have joined forces with the university's structural biologists—who probe the three-dimensional structure of proteins—to develop new defenses against the threat of bioterrorism. Led by Shope and David Gorenstein, who heads UTMB's Center for Structural Biology, the team first identified three virus groups that terrorists are likely to employ: flaviviruses (which include dengue, yellow fever, and the West Nile virus), alphaviruses (responsible for several types of brain infections), and the arenaviruses. Now, they are using combinatorial chemistry to find small molecules to thwart some key viral proteins in each group. The collaboration, funded by a $3.7 million grant from the Defense Advanced Research Projects Agency, is “a unique opportunity,” says Gorenstein: “In academics, you don't often work on the big picture.”

    Not that there aren't any drawbacks to working in Galveston. One problem, some say, is that UTMB doesn't have the stellar reputation of some of the universities that it draws its people from. In her previous job, says immunologist Lynn Soong, who studies leishmaniasis, if she needed reagents or knockout mice, she just had to say she worked with Dr. So-and-so at Yale, and colleagues from other institutions would send them along. Mentioning she's from UTMB doesn't do the job as quickly, she has found: “For some people, it's the first time they hear about this university.”

    Another frequent gripe—besides the unforgiving climate—is that Galveston is, well, a small town in Texas. For Vinetz, it didn't turn out to be the hell he expected; but senior researchers do concede that some people from the East and West coasts absolutely refuse to live here. “In job interviews, I always try to be up-front about it. This is not the center of the cosmos, and there's not a lot to do,” says Barrett. “You can work, you can work, or you can get in your car and drive to Houston.” But most newly arrived Galvestonians, including UTMB president John Stobo, say they have no regrets about moving here; they like to point out the city's good points, such as considerable historic charm and affordable housing. Stobo, a Massachusetts native and a former vice dean of Johns Hopkins University School of Medicine, has even taken to wearing cowboy boots, which he enjoys showing off. “Once they settle in, most people come to like Galveston,” he assures.

    And Galveston is trying to keep them coming. UTMB acknowledges that several parts of its infectious diseases profile could be beefed up; for instance, some “card-carrying epidemiologists” would be very welcome, says Stanley Lemon, who came here in 1997 to head the department of microbiology and immunology. Lemon studies the molecular biology of hepatitis C, but says he would also like to answer basic epidemiological questions, such as why the virus infects more Mexican Americans than Caucasians. UTMB is also interested in stepping up surveillance for emerging infections along the Mexican border, says Lemon. Last year, there was a large outbreak of dengue in Texas, and a girl died—the first U.S. casualty from the disease in over 3 decades. “That's a signal of what we can expect,” says Lemon. Walker adds that the group also would like to strengthen its efforts in vaccine development and bioinformatics.

    With new deadly pathogens popping up almost every year, there's certainly no dearth of study material. And new molecular techniques have made it possible to study and fight them right down to the molecular level, says Walker. “That was my dream, that was what I was hoping would happen,” he adds. “It's sort of fun to see so many people involved in doing it.”


    Information Technology Takes a Different Tack

    1. David Malakoff

    Challenged by a White House committee to change its ways, the National Science Foundation is looking for far-out ideas in computer science

    “Excellent,” raved one reviewer about a preproposal from James Allen, a computer scientist at the University of Rochester in New York. But Allen says another dismissed it as “impossible.”

    Normally, such wildly conflicting reactions would doom a grant application submitted to the National Science Foundation (NSF). But not this time. In fact, the skepticism may have helped: An NSF official, intrigued by the wide variation, plucked Allen's preproposal from the discard pile and gave him the green light to seek up to $3 million over 3 years as part of the agency's new information technology (IT) program.

    Crowded field.

    High interest will mean a low success rate for NSF's IT program.


    NSF isn't breaking out of its shell on a whim. By taking more risks than usual, officials hope to encourage researchers to submit proposals that are likely to be more innovative than those the agency traditionally supports. Here's how program manager Michael Lesk describes what he's looking for: “If somebody not in your research group but familiar with your published papers could predict your proposal, perhaps you should rethink it.”

    As lead agency for a 5-year, $5 billion federal IT program, NSF hopes its boldness will rub off on the other players. And some researchers think that such a fresh, frisky approach would work well in NSF's other programs, ranging from exploring the early universe to plumbing the ocean depths. “The IT program is heading in the right direction,” says computer scientist Jonathan Smith of the University of Pennsylvania, Philadelphia. “The question is how fast an agency that is very set in its ways can morph.”

    NSF officials are not averse to morphing, but they say it is up to each discipline to decide whether to adopt the more freewheeling approach being followed by IT program managers. If the reaction to the IT program is any guide, the community is ready: Computer scientists submitted more than 2000 proposals for the $90 million program, making it one of the largest competitions ever run by NSF's Directorate for Computer and Information Science and Engineering (CISE). Allen's idea—an interdisciplinary research center to study the mechanics of human speech and to create computers that can carry on conversations—was one of 200 that made the first cut; he'll know this summer whether it will be funded. “It's energized the community,” says CISE director Ruzena Bajcsy. “Anybody who is anybody in the discipline has applied.”

    The program's shakedown cruise hasn't been trouble-free, however. NSF officials fret that they are seeing too few innovative ideas from researchers, who perhaps haven't heard—or don't believe in—the IT program's new tune. They've also scrambled to recruit reviewers, as so many computer scientists are already involved in the competition. Researchers, on the other hand, worry that the program's low overall success rate (less than 10%) will scare away entrants next year. To increase the number of future winners, they are pushing Congress to double the program's budget in 2001, despite skepticism from some lawmakers.

    Overcoming doubts

    Whether or not Congress agrees, researchers say NSF deserves credit for grasping an opportunity to do things differently. That opening was created by the President's Information Technology Advisory Committee (PITAC), a high-profile panel packed with computer company CEOs and prominent academics. Last year, it issued a report aimed at recreating the federal funding climate of the 1970s and 1980s that produced the Internet and other computing revolutions. It called for a boost in federal spending on information technology by a total of $4.7 billion by 2005 (Science, 21 August 1998, p. 1125) and argued that basic research be reinvigorated through larger and longer grants to universities, which have been hard hit by the loss of talent to new dot-coms and the computer industry. To foster innovation, it recommended giving one federal agency the lead in gambling on high-risk concepts with potentially high payoffs.

    The Defense Advanced Research Projects Agency (DARPA) has traditionally played that catalytic role in information technology. Among its hits, for example, were the basic studies that led to networked computers and e-mail (Science, 3 September 1999, p. 1476). But with DARPA under increasing pressure to focus on military needs, PITAC decided that NSF, with a broader portfolio that covers the natural and social sciences, would be better able to lead the federal government's IT resurgence. However, PITAC urged the agency to shed its cautious reputation and adopt a more aggressive, DARPA-like style. The key to that approach is putting professional staff, not consensus-driven peer-review panels, in the driver's seat on funding decisions.

    “PITAC was quizzical because NSF has a reputation for funding incremental proposals,” says CISE executive director George Strawn, who supported the move to dust off little-used NSF rules that allow staff to play a larger role in picking grant winners. PITAC also worried that NSF's cautious reputation could hamstring any IT initiative. Researchers have been “trained to think that, unless half the work is already done, don't submit to NSF,” notes Penn's Smith, a CISE adviser.

    Congress gave the agency's IT budget a 45% boost to kick off the new program, and Strawn, Lesk, and Bajcsy took the revolutionary fervor to heart. CISE's first solicitation listed not just the seven areas that PITAC said needed more research—from writing more reliable software to understanding the social implications of the IT boom—but also a “revolutionary computing” category that included Lesk's litmus test for identifying the kinds of “highly innovative” proposals the agency was looking for.

    Rochester's Allen believes that his proposal, which he says “throws caution to the wind,” certainly fills the bill. His nine-member team, which includes three researchers not on Rochester's faculty, hopes to take advantage of new eye-tracking technologies, for instance, that show that speakers tend to look at the objects or people they are talking about. Such physical clues could be valuable to a computer trying to understand a conversation's context. The team also wants to study “disfluencies,” the “uhms” and “you knows” that seem to be little more than verbal refuse but may actually provide silicon-based ears with valuable information about the speaker's level of uncertainty or state of mind. If such research pans out, it could “radically change the way spoken dialogue systems operate,” says Allen, a member of CISE's advisory board.

    But revolutions can take time. Despite the outpouring of proposals, Lesk says it appears that many researchers still haven't picked up on NSF's new style. “I'm not seeing as many innovative proposals as I'd like,” Lesk told advisory board members earlier this month.

    Long odds

    All the finalists know that the chances of realizing their dreams are small. Bajcsy expects to be able to fund fewer than 4% of the initiative's more than 200 larger proposals, and just 12% of the 1156 smaller requests, which came in by 14 February. And dazzling scientific promise isn't the only criterion. Unlike her DARPA colleagues in the past, Bajcsy must also fulfill NSF's social mission, which includes everything from helping subpar schools improve their science departments to helping minorities and the poor leap the digital divide. But the community seems to have confidence in her judgment. “One of the most exciting things about this is that Ruzena will be able to make some of the final decisions, and she has excellent taste,” says Penn's Smith.

    Whoever makes the winners list, “we're going to disappoint a lot of people,” says Lesk, including more than 100 researchers who submitted proposals that earned an “excellent” rating from at least one reviewer. Among the high-ranked losers are ideas for large-scale ecological simulations of urban sprawl, better databases of human organs available to biomedical researchers, and social and economic analyses of electronic commerce. Bajcsy hopes another federal agency will find funds for these and other proposals, but she's made limited progress so far. The much bigger National Institutes of Health, for instance, is toying with its own biocomputing initiative (Science, 11 June 1999, p. 1742) and so far has nibbled on only one idea, a joint computing initiative in neuroscience. The low proposal success rate also shows Congress that “there is more excellent science out there than we will be able to fund this year,” says Bajcsy, who sees that demand as an argument for keeping IT research on PITAC's recommended 5-year upward funding slope.

    Key lawmakers seem supportive, if only because they are loath to offend potential sources of campaign contributions in the high-tech industry, which has largely backed the PITAC recommendations. The House, for instance, recently passed a bill sponsored by Science Committee chair James Sensenbrenner (R-WI) that endorses PITAC's call for significant increases in federal IT spending. It is uncertain, however, if the Senate will follow suit. The Senate Budget Committee, for instance, recently concluded that “the need for significant [government] spending on IT … seems long past.”

    For researchers in other fields, an even bigger question is whether the IT program's management style will spread to other NSF programs. NSF director Rita Colwell has repeatedly said that—if she had the money—she'd like to have every program office in the foundation dole out the larger, longer grants offered by the IT program. In the meantime, each NSF directorate has considerable latitude to run its programs as it sees fit. Although the IT program “reflects community input and desire,” says NSF spokesperson Mary Hanson, “no one [proposal review] model fits all of the range of disciplines … represented by NSF.”

    However, success breeds success. If Lesk and others can show positive results with the IT program, other NSF managers may also be tempted to take advantage of what Strawn calls “capabilities we probably haven't used enough in the past.”


    Science and Policy Clash at Yucca Mountain

    1. Richard A. Kerr

    EPA's draft standards for protecting groundwater are drawing fire from scientific groups, including a panel of the National Academy of Sciences

    In 1992, Congress tried to ensure that science would have the upper hand in a crucial battle over Yucca Mountain, the proposed repository in Nevada for the nation's highly radioactive wastes from nuclear power plants and national defense activities. It passed a law directing the Environmental Protection Agency (EPA) to come up with public health and safety standards to protect the public from radioactivity that will inevitably leak from Yucca Mountain over the millennia. And it decreed that the standards be “based upon and consistent with the findings and recommendations of the National Academy of Sciences.”

    EPA unveiled draft standards last summer that it said “emphasize prevention of groundwater pollution” in protecting public health and the environment. But the proposal quickly drew a chorus of objections. Among the dissenters was the academy's own Board on Radioactive Waste Management (BRWM), whose findings and recommendations were supposed to form the basis of EPA's proposed rules. The board argued that, among other problems, EPA's supporting data were out of date and that the agency was needlessly duplicating the protection of human health by proposing two sets of standards: one covering all possible ways humans could be exposed to radiation, and the other focusing on groundwater.

    Protecting groundwater separately “may greatly complicate the licensing process and have but a negligible impact on protection of the public,” the board observed last fall in its comments on the proposed standards. The board is hoping to persuade EPA to change course when it issues final standards early this summer. “As geologists, we want to see science involved in policy,” says David Applegate, who has been following developments as government affairs director at the American Geological Institute in Alexandria, Virginia. In Yucca Mountain, “we're seeing just how complicated that is.”

    To protect human health, EPA proposed that those living near the repository should not be exposed to more than 150 microsieverts of radiation per year from the repository in any fashion. That “all-pathways” standard drew some flak from groups, including the Nuclear Energy Institute, an industry group; the Department of Energy, which must obtain a license to operate the repository; and the Nuclear Regulatory Commission (NRC), which would issue the license in accordance with EPA standards. A higher, 250-microsieverts-per-year standard would be almost as safe and far easier to ensure with confidence, these groups argued. People are exposed to 3000 microsieverts of background radiation every year, from cosmic rays to the radiogenic potassium in bananas.

    Perhaps more disturbing for critics is EPA's insistence on a standard to protect the groundwater around Yucca Mountain, separate from the all-pathways standard. EPA proposes an exposure standard of 40 microsieverts per year from groundwater, assuming users of the groundwater ingest 2 liters of groundwater per day. “Groundwater is a valuable resource,” says the preamble to the proposed rules, “with many potential uses,” such as drinking water, irrigation, stock watering, food preparation, showering, and industrial use. Adds Frank Marcinowski of EPA's Office of Radiation and Indoor Air: “If there's contaminated water migrating off site, then you're imposing the cost of cleaning it on someone else. It's an equity issue—let's prevent the pollution before it happens.”

    “It was not scientifically logical to add in the groundwater standard,” responds John Ahearne, director of Sigma Xi in Research Triangle Park, North Carolina, and current chair of the BRWM. In its comments, the BRWM contends that “if EPA wishes to establish groundwater standards on the basis of science, it must make more cogent scientific arguments to justify the need for this standard.” Ahearne doesn't see the justification so far: “You have to ask, Why are you protecting the resource? If it's because of human health, you're back to the first [all-pathways] standard.” John Greeves of the NRC goes further: “We're saying there's no need for a separate groundwater standard. There's no country I know of that has anything but all-pathways standards.” In addition, the BRWM told EPA, the all-pathways standard is based on the latest understanding of radiation effects, while the groundwater standard goes back to 40-year-old data.

    Although EPA has been regulating groundwater pollution as a matter of course for decades, it has never established a formal radiological standard for untreated groundwater, as the NRC pointed out. The proposed 40 microsieverts per year was set in the 1974 Safe Drinking Water Act for drinking water “at the tap,” that is, after treatment of raw water from the ground, rivers, or lakes. But that number has a checkered history, according to William Mills of Olney, Maryland. Mills, as a member of the Public Health Service working for the EPA, helped develop the 40-microsieverts drinking water standard from work he had been doing on the Great Lakes. “We couldn't make even a guesstimate [of risk] to better than a factor of 10,” he says. “It was quite arbitrary in many ways.”

    EPA argues, however, that the standard is “scientifically and technically achievable.” Says Marcinowski: “It's been applied at Superfund sites, low-level [radioactive] waste facilities, and WIPP,” a site for radioactive transuranic wastes in New Mexico. He adds that EPA is now updating the 40-year-old data underlying the standard so that risks of contracting cancer should fall in the range that EPA traditionally regulates—a 10⁻⁴ to 10⁻⁶ risk.

    Congress, alarmed at what Senate Energy and Natural Resources Committee chair Frank Murkowski (R-AK) has called EPA's “unrealistic” proposed standards, passed legislation this session that would transfer environmental regulatory responsibility at Yucca Mountain from the EPA to the NRC. But President Clinton has promised to veto it. A lot is at stake in this tussle, in the view of some observers. If EPA sticks to its tough standards, they believe, it could add yet another burden to the already strained authorization process at Yucca Mountain. And if this repository falls through, notes geophysicist Mary Lou Zoback of the U.S. Geological Survey in Menlo Park, California, and the BRWM, “we won't be able to go anywhere else.”


    The Science of Astrobiology Takes Shape

    1. Robert Irion

    Mountain View, California—About 600 researchers from 30 countries came to NASA's Ames Research Center from 3 to 5 April, eager to help mold a new field. At the First Astrobiology Science Conference, every talk, it seemed, touched on a new discipline. Topics ranged from prebiotic chemicals to icy life, but the conference was a story unto itself.

    Something Old, Something New

    The night before the meeting began, Air Force One brought President Clinton to Ames for a Silicon Valley fund-raiser with the dot-com crowd. Signs and badges for the conference caught the eyes of a vigilant Secret Service agent, whose question Ames director Henry McDonald overheard crackling over a private radio: “What the hell is astrobiology?”

    Answers varied. To atmospheric chemist James Kasting of Pennsylvania State University, University Park, it's an exercise in rebranding. “Astrobiology is not a new field,” he says. “It's a new name for an old field.” Indeed, NASA Administrator Daniel Goldin invoked the term 5 years ago while attempting to broaden NASA's program in “exobiology,” the study of possible life beyond Earth. With the new name in place, NASA invited a wide slice of the scientific community—including earth scientists, chemists, oceanographers, planetary scientists, molecular biologists, zoologists, and paleontologists—to work out the content. A series of workshops led to an official astrobiology “roadmap,” complete with three fundamental questions, four operating principles, 10 goals, and 17 objectives. They boil down to investigating how life arose and survived on Earth, how it might have done so on other worlds, and how we might go about finding it and recognizing it. That's the formula Goldin is banking on to unite the best minds of today and excite the laureates of tomorrow.

    Judging from the response to the meeting, the old-new field is off to a golden start. Organizer Lynn Rothschild, an evolutionary biologist at Ames, was swamped by 370 submitted abstracts with a mere month's notice, and more than twice as many people showed up as she initially expected. Diverse talks kept the audience in place until the end. Disciplinary walls teetered at the egalitarian poster sessions, where members of the National Academy of Sciences presented alongside graduate students. “I felt a visceral sense of excitement,” says geologist Peter Ward of the University of Washington, Seattle. “I think that was an almost universal reaction.”

    Participants aren't yet rushing to call themselves astrobiologists; rather, the title is becoming a “second identity,” says planetary geologist Bruce Jakosky of the University of Colorado, Boulder. “I'm retooling to be an astrobiologist. I now talk to microbiologists on a regular basis. That's something new.”

    Jakosky attributes much of that new focus to the Astrobiology Institute, a virtual center that NASA established in 1998 at Ames. The institute now has about 420 contributing scientists and students at 11 member universities and research centers. It also has a director with cachet: biochemist Baruch Blumberg, winner of the 1976 Nobel Prize in physiology or medicine for his work on the hepatitis B vaccine.

    The institute has a modest budget by the standards of national science: $13 million in this fiscal year and a projected $16 million to $17 million next year. That means most of the institute's investigators have received just $10,000 to $20,000 annually for each person on their peer-reviewed projects. “We're trying to shovel more money to the teams,” Blumberg acknowledges. Even so, the institute's ranks will expand by another three to four member institutions this year.

    The fiscal situation makes some scientists wary. “Astrobiology has arrived, but it's fragile,” says biochemist David Deamer of the University of California, Santa Cruz. “It all depends on the continued interest from NASA headquarters. Some new blood has come into the field, but the money got spread out too thinly, and everyone is complaining that it's not enough.”

    Nonetheless, Deamer and his colleagues say they were inspired by the conference and by visions of a web of new collaborations weaving among the attendees. As planetary scientist David Morrison, director of astrobiology and space at Ames, proclaimed in the aisle of the overflowing lecture hall: “I'm beginning to think astrobiology is real.”

    Shades of Europa in Arctic Sea Ice

    Planetary scientists who dream of probing for microbes on Jupiter's icy moon Europa would love to explore similar settings on Earth. One popular choice is Lake Vostok, a freshwater lake buried under 4 kilometers of ice in the heart of Antarctica. A primitive ecosystem may eke out life in Vostok's dark waters (Science, 10 December 1999, p. 2094). However, new research points to a better and more accessible match: microscopic pockets of salt water frozen inside the Arctic Ocean's winter ice.

    Bacteria make happy homes within those pockets at temperatures of −15°C and below, a team reported at the meeting—the most frigid climes yet seen for living microbes. Researchers had thought that in such extreme cold, organisms trapped inside isolated microscopic pores would freeze or starve. But scans using magnetic resonance imaging (MRI) and microscopy show that watery microveins lace the ice and connect the briny abodes to a surprising degree.

    The developments are “exciting and encouraging” for the prospects of microbial habitats on Europa, says planetary geologist Robert Pappalardo of Brown University in Providence, Rhode Island. Slushy upwellings heated by the moon's tidal flexings should warm the ice near Europa's surface to −10°C or −20°C, he says. Further, the Galileo spacecraft apparently has spotted salty deposits on the moon, suggesting that its ocean and overlying ice are anything but fresh. “You'll get these brine pockets everywhere,” Pappalardo says.

    When seawater freezes on Earth, it leaves behind liquid-filled pores like the air bubbles in a sponge. Salts and other impurities concentrate within the pores, keeping them fluid even as the ice grows colder and harder. Bacteria, diatoms, and other organisms spend their winters there, presumably subsisting on a bare minimum of nutrients.

    No one had ever studied the physics and biology of this seasonal ecosystem without melting the ice, until a team led by marine microbiologist Jody Deming of the University of Washington, Seattle, and geophysicist Hajo Eicken of the University of Alaska, Fairbanks, devised a way to spot bacteria within ice at temperatures well below freezing. The researchers collected sea ice near Barrow, Alaska, in March 1999 and again last month—the coldest part of the winter there. Then, in a cold room in Fairbanks, graduate students Karen Junge and Aaron Stierle and postdoctoral researcher Christopher Krembs applied salty solutions of a bacterial stain to specially prepared ice sections. The stain attached to the DNA of the bacteria, making the cells fluoresce when exposed to ultraviolet light.

    The team found dozens of intact bacteria dotting the ice samples from 1999 at temperatures ranging from −2°C to −15°C. Early results from last month's ice show similar concentrations of bacteria down to −20°C. Another stain, on melted samples, showed that the bacteria are respiring, not just dormantly waiting out the winter.

    Of course, the organisms must also eat. For that, they need fluid moving through the ice and transporting nutrients. To determine whether the brine can flow, Eicken and his colleagues at the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven, Germany, used a cold-adapted MRI probe to detect liquids within intact samples of ice. They found that a network of narrow veins among the pores persisted down to −30°C—within 5 degrees of the coldest temperatures that marine ice reaches naturally on Earth. “Even very cold, hard ice contains small wet areas that can and do support life on this planet,” Deming says.

    Such refuges may be the first place to look for life on Europa if a lander ever visits the moon. “The environments may have tremendous similarities,” says planetary scientist Richard Greenberg of the University of Arizona in Tucson. Greenberg thinks Europa's icy crust may be thin enough for tidally induced cracks to penetrate to the ocean beneath. If that's true, he says, “the arctic ice is probably a closer analog to Europa than Lake Vostok.”

    Can Amino Acids Beat The Heat?

    Astrobiologists would love to divine the birthplaces of amino acids, the links in the protein chains of life. Some probably formed on the primitive Earth as lightning and ultraviolet light from the sun energized the atmosphere. An even richer source may have been space, where chemical reactions within the wombs of star-forming regions may forge amino acids in abundance. To help spark life, however, they had to withstand the fiery plunge to Earth's surface. New work presented at the meeting suggests that some amino acids survived that trip aboard two seemingly fragile hosts: comets and dust.

    Planetary scientists have long known that organic compounds could make the passage aboard meteorites. For instance, fragments of the carbon-rich Murchison meteorite, which decelerated relatively gently in the atmosphere and fell onto Australia in 1969, contain amino acids (including a few not found on Earth) that stayed cool beneath a red-hot outer rind. “As long as some chunks remain, the surface heating seems to go only a few millimeters deep,” says impact specialist H. Jay Melosh of the University of Arizona in Tucson.

    In principle, comets—which carry more carbon-rich material than most meteorites—ought to be even better sources of the chemicals that assemble to make amino acids. Most comets, however, slam into Earth at tremendous speeds because of their distant orbits. Intense atmospheric shock pressures and high temperatures should prove fatal to amino acids, planetary scientists believed. Even so, some researchers held out hope that a fraction of the cometary bounty would survive.

    Now, experiments led by geophysicists Jen Blank of the University of California, Berkeley, and Greg Miller of UC Davis have boosted that optimism. The team encased watery solutions of five amino acids inside steel disks and fired steel projectiles at them with a 12-meter-long gun. The impacts, at speeds up to 1.5 kilometers per second, pummeled the amino acids with up to 200,000 times Earth's atmospheric pressure and temperatures as high as 600°C. Such conditions might be felt by a comet slicing into Earth's atmosphere at a very low angle, Blank says: “This is the closest anyone has achieved in the lab to recreating the conditions of a cometary impact.”

    Blank and Miller found that 40% to 70% of the amino acids survived, perhaps because the pressure stabilized them and prevented the heat from breaking their chemical bonds. Moreover, some of them paired in a flash to form dipeptides, the first step toward amino acid chains. The experiments are encouraging, Melosh says, although he warns that time scales are vastly different in the atmosphere. “Extreme conditions in a real impact last up to a second,” he notes—a million times longer than collisions in the lab. The extra time might let destructive reactions penetrate throughout an impactor.

    On a far smaller scale are micrometeorites and dust particles, which add some 40,000 tons of extraterrestrial material to Earth each year. Some tiny particles decelerate and drift to the ground, but they can still roast at more than 1000°C for several seconds—enough to destroy amino acids. However, geochemists Daniel Glavin and Jeffrey Bada of the Scripps Institution of Oceanography in La Jolla, California, have now demonstrated a way for organic compounds to escape the mini-infernos.

    Glavin and Bada extracted grains from part of the Murchison meteorite and heated them in a partial vacuum. Glycine, the simplest amino acid, sublimed into a vapor when the temperature reached 150°C and survived as the rest of the grains heated to 800°C. Such a vapor would recondense in a cold trail behind the micrometeorites, Glavin says, creating tiny rains of glycine that seed the surface. Because interplanetary dust was rampant in the young solar system, he says, “this may have been a good way to get some of the prebiotic molecules onto the early Earth.” However, the news is not entirely rosy: Other amino acids in the Murchison sample did not sublime and were destroyed by the heat. This summer, the Scripps team will analyze antarctic micrometeorites—in which the researchers have detected extraterrestrial amino acids—to see whether they will sublime in the lab.

    The experiments suggest that space delivery of amino acids billions of years ago was a “plausible” companion to organic synthesis in the atmosphere, says David Deamer of UC Santa Cruz: “There's just no doubt that some of the amino acids survive the impacts. It's become surprisingly convincing.”
