News this Week

Science  20 Mar 1998:
Vol. 279, Issue 5358, p. 655

    Ames Tackles the Riddle of Life

    Andrew Lawler


    Mountain View, California—Three years ago, NASA's Ames Research Center here was preparing to jettison large pieces of its scientific program after being ordered to narrow its mission and meet strict budget ceilings. A last-minute proposal from scientists won it a reprieve, however, and has led to the birth of a new scientific discipline: the study of how life might arise across the universe. Now NASA is betting that a series of startling discoveries in recent years will provide a rich payoff both for Ames and for humanity's understanding of its place in the cosmos.

    “There's been an explosion of discoveries … that are really reinvigorating our quest for understanding life,” NASA space science chief Wes Huntress told an audience last week at a conference at George Washington University in Washington, D.C. Adds Bruce Alberts, a biologist and president of the National Academy of Sciences, “This is a great adventure.” Indeed, recent findings of ancient terrestrial fossils, possible fossils of martian microbes, an ocean on a moon of Jupiter, planets circling other stars, and evidence of life in solid rock 1 kilometer below Earth's surface have revitalized the once esoteric debate over how life adapts to extreme conditions and whether it could exist outside the atmosphere's protective blanket.


    Europa.

    A 2003 mission will look for tidal changes on the jovian moon caused by a proposed ocean beneath its cracked ice sheet.


    Next month, Ames will unveil its new Astrobiology Institute. NASA is on the verge of picking a director to run the institute, which will serve as the hub of a network of research teams tackling a range of topics. Although the institute's initial annual budget is a modest $4 million, NASA Administrator Dan Goldin predicts that it could eventually grow into a $100-million-a-year behemoth with astronomers, biologists, chemists, geologists, and researchers from an assortment of other specialties. “It seems to include everything from the big bang to elephant ecology,” says Ken Nealson, a former University of Wisconsin microbiologist who was recently hired by NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California. “It's up to us to define it.” But skeptics worry about that fuzziness and fear that the institute could become mired in bureaucracy.

    Cosmic imperative?

    For years, NASA has funded research in exobiology—the study of potential life on other planets—and paid for missions like Viking, which sampled martian soil for signs of microbial life. Its work complemented the portfolio of agencies such as the National Science Foundation (NSF), which backed research on life in extreme environments on Earth and on evolutionary biology. But Huntress says astrobiology connotes a wider search for understanding the origin and development of life everywhere, from lichen in dry Antarctic valleys to complex organic compounds in interstellar clouds to the biological signatures of microbes on other planets. “It's imperative we broaden our thinking,” he says.

    Until recently, however, the search for life beyond Earth was often derided by researchers, politicians, and the public alike. NASA's tiny SETI (Search for Extraterrestrial Intelligence) project—a systematic search for radio signals from outer space—was killed in 1993 by lawmakers who dismissed it as wasteful spending to find little green men. And there have been few obvious reasons to bring together chemists, geologists, biologists, and astronomers interested in the overarching question of life's emergence. “It's a term looking for a definition,” sniffs one NASA biologist about the concept of astrobiology. Indeed, Ames researchers working on the topic in the 1970s rejected the word “because it sounded a bit too close to astrology,” recalls the SETI Institute's John Billingham, a former chief of life sciences at Ames.

    But a stream of apparently unrelated findings has earned the field new respect. Discoveries of life in extreme environments on Earth—such as in hot vents deep in the ocean and on basaltic rock hundreds of meters below the surface—are challenging traditional notions about life's adaptability. And geologists have uncovered fossils as old as 3.7 billion years, an unmistakable sign that life developed extremely rapidly after Earth cooled. At the same time, a NASA spacecraft has sent back convincing evidence that an ocean flows below the icy surface of Jupiter's moon Europa, perhaps providing an environment for life to begin, while the Hubble Space Telescope may have spotted planet-forming regions and ground-based instruments have fingered extrasolar planets.

    Even the highly controversial claim that an ancient martian meteorite found in Antarctica contains evidence of fossilized microbes has served to legitimize the debate. “The Mars rock crystallized people's interest,” says Gregory Wray, a biologist at the State University of New York, Stony Brook. “Suddenly [life beyond Earth] sounds like more of a reality.” Popular culture also helped to alter public perceptions. Last year's Hollywood film Contact, for example, lionized scientists involved in the search for life beyond Earth.

    Beta Pictoris.

    Earth-like bodies could arise from a protoplanetary disk like this one spotted by the Hubble telescope.


    It's a far cry from 1995, when Ames—saddled with an aging aeronautical research facility—was locked in a grim struggle for survival. “We were facing possible closure,” says Harry MacDonald, the new center director. With an ultimatum from NASA headquarters to narrow its mission and cut costs, Ames managers reluctantly took steps to gut the science program. But scientists at the center made a last-ditch bid for the job of coordinating NASA's fledgling effort to understand life in the universe. Their arguments prevailed, transforming the center into a lab with the dual missions of information technology and astrobiology.

    The new institute will focus attention on this line of research and supplement efforts by other agencies, NASA officials say. Goldin has already spoken with Rita Colwell, the nominee to lead NSF, about greater research collaboration in the field, and NSF program managers, for example, are involved in the review of institute proposals. “It's the right time,” says Mary Jane Osborn, a University of Connecticut microbiologist who serves on the National Research Council's space studies board. “Never mind the Mars rock, which triggered public interest; there's a growing realization that there is good science to be done.” Adds Ursula Goodenough, a biologist at Washington University in St. Louis: “It's a real opportunity to fill a vacuum. The idea is very compelling, and the questions that need to be addressed are staggeringly underfunded.”

    Martian rocks.

    NASA hopes its rovers will arrive in 2001 and return samples to Earth by 2008.


    That's been true historically at NASA, whose life sciences program is held in low esteem by the biological community. The National Institutes of Health prefer research with more direct medical applications, say researchers, while NSF has difficulty with proposals that touch on a variety of fields. Goldin, however, insists he can offer an attractive home to an array of life scientists despite an agency culture dominated by engineers and physicists. “How can we watch the explosion in biology with blinders on?” he asks. Bringing biologists into the fold, he says, will allow the agency to tailor its programs and instruments to the search for life elsewhere.

    Two experiments

    Goldin is not the only one excited by the new institute. When NASA officials announced last fall that they wanted help from a cadre of research teams, “I expected 20 to 25 proposals,” says Gerald Soffen, director of NASA's university programs. Instead, the agency received twice that many. Soffen compares it to the California gold rush: “I was blown away—it was like Sutter's Creek.” The proposals—ranging from ocean vent expeditions to studies of complexity theory and astropsychology—are now undergoing peer review, and the institute's new director is slated to meet in May with members of the seven or eight winning teams.

    Each successful team, made up of dozens of investigators from many institutions, will receive funding commitments for up to 5 years. The winners will communicate over a sophisticated new high-speed NASA computer network, reducing travel costs without diminishing interaction. A small staff of civil servants at Ames will coordinate the effort. “We don't want to waste our money on bricks and mortar,” says Goldin.

    The institute will be a test both of a virtual institute and of the viability of the field. “We're trying to do two experiments: one to foster collaboration over great geographic distances, and the other to stimulate new ways to look at problems that are multidisciplinary in the academic community,” says David Morrison, Ames's space science chief. For a model, he points to the growth of planetary science, a field that NASA helped to create 25 years ago. He predicts that within a decade astrobiologists will be enrolled in Ph.D. programs, publishing in their own journals, and conducting field expeditions to hydrothermal vents in the ocean or lakes in the Antarctic to set the stage for space missions.

    The opportunity to get in on the ground floor of a new discipline is part of astrobiology's attraction, say researchers. “NASA managers are saying, ‘Tell us what it is, we'll fund it, and that will define it,’” says JPL's Nealson. “Rather than being told what the problem is, we're being told to decide.” Nealson has taken NASA at its word, proposing a diverse team from California's Lawrence Berkeley National Laboratory, the Carnegie Institution in Washington, the University of Wisconsin at Madison, New York's University of Rochester, and NASA's Marshall Space Flight Center in Huntsville, Alabama, that would study ways to detect life in rocks. “Each represents a technology not available at JPL or [the California Institute of Technology, which operates JPL],” he notes.


    Evidence of cyanobacteria, left, from ancient Earth and elsewhere, like this martian meteorite, could illuminate how life developed.

    NASA (right)/SCHOPF ET AL.

    That ambiguity is too much for some researchers. “I'm a little befuddled,” says one evolutionary biologist whom NASA has consulted for advice. He's uncomfortable with NASA's lack of experience in biology, and he worries about shoring up a space life sciences program that “is not very credible.” But although he may be cynical about the concept, he says the prospect of a stable funding source is certainly attractive.

    The astrobiology effort also has made waves within NASA. Goldin assigned it to the office of space science rather than life sciences and microgravity research, a move that exacerbated tensions between the two bureaucracies. Life sciences managers so far have refused to contribute any funding. Joan Vernikos, a former Ames researcher who is now NASA's life sciences chief, says that having the organization run by civil servants rather than a fresh batch of outside scientists “could put the bureaucratic reins on the academic community” and make it harder for the institute to support top-notch science. She says she wants to see a string of achievements before she spends scarce funds. But Ames officials say the real issue is turf. “They are very concerned this could turn out to be a raid on their budget,” explains MacDonald.

    MacDonald, Morrison, and other proponents believe that keeping the institute closely tied to Ames will give biologists and others a greater voice in shaping future NASA missions at the same time that it strengthens the parent center. The agency already plans an ambitious set of space flights and is considering a joint effort with NSF to explore Antarctica's Lake Vostok. “The institute will serve as an intellectual center,” says Huntress. “Its job is to make astrobiology credible.”

    Antarctic dip.

    New forms of life may lurk in a lake under Russia's Vostok Station, which will also serve as a technology test-bed.

    R. STONE

    That task seemed insurmountable 3 years ago. “The idea has succeeded beyond my wildest dreams,” says one Ames researcher involved in the 1995 proposal. Morrison admits that a will to survive may have been the original force behind the institute, but he notes that the recent findings in so many disciplines suggest that astrobiology “has taken on a life of its own.” Now the question for Ames researchers is whether the same is true for worlds beyond Earth.


    Russia Removes Obstacles to Projects

    Richard Stone

    Happy endings are rare in Russian science these days, but with his pen and his word Prime Minister Viktor Chernomyrdin appears to have ended a pair of long-running battles that had imperiled major international projects in astrophysics and seismology. Last week, Chernomyrdin pledged that the Russian government would withdraw an earlier threat to sell 60 tons of gallium at the heart of a neutrino detector in southern Russia, and his science minister revealed that a new decree, signed by Chernomyrdin, will end customs snafus that have dogged a global seismic monitoring network.

    The revelations added drama to the 10th meeting of the Russian-U.S. Commission on Economics and Technology, headed by U.S. Vice President Al Gore and Chernomyrdin. The Gore-Chernomyrdin Commission (GCC), as it's called, is meant to nurture and troubleshoot joint efforts in such areas as the international space station, nuclear disarmament, and disease surveillance, as well as showcase multimillion-dollar deals between U.S. and Russian companies. While the GCC meeting in Washington, D.C., last week featured some standard fare—including gripes about Russia's failure to keep the first space station component on track for a June launch—Chernomyrdin's commitment to preserve the two high-profile projects stole the show.

    The first piece of good news came from Russian science minister Vladimir Fortov, who announced that a 5 March decree will settle a protracted dispute between Russian customs officials and scientists who run 12 seismological monitoring stations in Russia. The sites, some of which were set up in the late 1980s, are part of the Global Seismographic Network and are operated by the Russian Academy of Sciences (RAS), the U.S. Geological Survey, and Incorporated Research Institutions for Seismology (IRIS), a nonprofit university consortium based in Washington, D.C. The Russian sites provide rapid data on earthquakes and for studies of Earth's deep interior and help monitor compliance with the Comprehensive Test Ban Treaty barring nuclear weapons tests.

    Trouble began soon after the Soviet Union fissioned in 1991, when Russia's fledgling customs bureau set out to collect duties on the country's mushrooming imports and exports. “The biggest problem,” says IRIS President David Simpson, was that Russian customs began demanding retroactive duties on equipment already on Russian soil, even though it was exempt under Soviet-era agreements. And it would sometimes take 2 to 3 months to get data tapes through customs in the Russian Far East, he says, diminishing the data's value.

    According to Fortov, the decree—stamped confidential by the Russian government—orders the duty-free import of instruments and other materials needed for bilateral research projects, including the seismology network. U.S. officials, who had not seen the decree by the time Science went to press, are cautiously optimistic that the problem is solved; Fortov was expected to release more details at an 18 March press conference in Moscow.

    Also willing to take a tentative sip of champagne are researchers affiliated with the Soviet-American Gallium Experiment (SAGE), a solar neutrino observatory in the Caucasus mountains that has been in operation since the mid-1980s. SAGE has been beset by troubles of late: Last year the Russian government threatened to sell—and thieves tried and failed to steal—the detector's ultrapure, liquid gallium, prompting 12 Nobel laureates to appeal to Chernomyrdin to save SAGE (Science, 19 December 1997, p. 2045).

    Responding to a query at the GCC meeting from White House science adviser Jack Gibbons, Chernomyrdin stated unequivocally that the Russian government would not sell the gallium. “This is clearly good news” for neutrino physicists, Gibbons says. Now U.S. and Russian officials are hoping Chernomyrdin will issue a formal decree to change the gallium's status as a national reserve held for Russian industry to an RAS asset, says the Russian head of the SAGE collaboration, Vladimir Gavrin. Until that happens, Gavrin says, the gallium's “long-term future in science will not be assured.”


    Cancer Warriors Claim a Victory

    Eliot Marshall

    The news couldn't have come at a better time for cancer researchers: Just as Congress began working on the 1999 biomedical budget, a group of experts announced last week that the United States has “turned the corner in the war on cancer.” That was the word from David Rosenthal, president of the American Cancer Society, as he and other public health leaders released encouraging data at a press conference in Washington, D.C. According to their report (published in the 15 March issue of Cancer), a sea change occurred in 1992. In that year, cancer rates that had been rising steadily from the 1930s through the 1980s reversed and began to drift downward.

    The average death rate for all types of cancer, which had been rising at 0.4% per year from 1973 to 1990, dropped 0.5% per year from 1990 to 1995. At the same time, the incidence of new cases (based on a sample of 9.5% of the population) began to recede. After climbing at an annual rate of 1.2% from 1973 to 1990, cancer incidence has been declining by 0.7% annually in recent years.
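    Because these annual rates compound, their cumulative effect over the periods cited is easy to check. Here is a minimal arithmetic sketch, assuming each figure is applied as a constant annual percent change over the stated spans (17 years of rise, then 5 years of decline — an assumption for illustration, not how the report computed its trends):

```python
def compound_change(annual_pct, years):
    """Cumulative multiplier from a constant annual percent rate of change."""
    return (1 + annual_pct / 100.0) ** years

# The article's death-rate figures: +0.4%/yr over 1973-1990 (17 years),
# then -0.5%/yr over 1990-1995 (5 years).
rise = compound_change(0.4, 17)   # about a 7% cumulative rise
fall = compound_change(-0.5, 5)   # about a 2.5% cumulative fall
```

    Small annual rates therefore translate into modest but measurable shifts over a decade or two, which is why a half-percent-per-year turnaround counted as news.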

    Turning point.

    New cancer cases (left) and deaths began to decline in 1992.


    Many biostatisticians—including Harry Rosenberg of the National Center for Health Statistics in Hyattsville, Maryland, a co-author of the report—agree that one cancer is driving these overall trends: lung cancer. The decline in smoking, together with a long-term drop in stomach cancer deaths and more recent declines in deaths from breast and colorectal cancer, has helped push down average cancer death rates. People began to give up cigarettes after the surgeon general branded them a cancer risk in 1964. The effects of that shift in behavior showed up first in lung cancer rates among men, which have been declining since 1984. For women, who started smoking later, lung cancer incidence is still climbing, although its rate of increase has slowed since 1994. Another factor in the decline in mortality rates for some cancers is the widespread use of improved detection methods, such as mammography, that can catch cancers early.

    The importance of smoking trends and early detection means, says John Bailar, a biostatistician at the University of Chicago, that “the government has had little role” in directing the recent improvements, which reflect decisions by millions of individuals to improve their lifestyle. Basic research may have resulted in an explosion of new knowledge about the molecular processes that lead to cancer, but these findings have had little impact on overall cancer figures, Bailar and others argue.

    National Cancer Institute (NCI) director Richard Klausner says, however, that the data “do not reveal the real improvements in the quality of life for cancer survivors” made possible by improvements in therapy and medical care. It is very difficult, Klausner argues, to ascribe causes to any changes in cancer rates—other than to those related to tobacco. But “we know that for certain cancers, screening and therapy have made a difference” in prolonging life, he says. For example, he points out that clinical trials have established that surgery plus adjuvant therapy—including tamoxifen, which is designed to block cancer in a second breast, and mixed drug cocktails known as “polychemotherapy”—have reduced deaths from breast cancer. As for contributions from basic research, Klausner predicts that NCI-supported studies of cancer genetics should pay off in the future with improved diagnostics and screening—and, before long, in new methods of targeting chemotherapeutic agents more effectively. But at the moment, he concedes, “this is a hypothesis.”

    Klausner also cautions that although many trends in the “cancer report card,” as the authors called it, are favorable, there's no cause for complacency. The incidence of cancer of the skin and lymph system, for example, continues to increase, and African Americans have not shared in the improvements seen in the Caucasian population—partly because blacks may have poorer access to screening and therapy. Blacks were specifically at greater risk for developing and dying of breast and prostate cancer.

    For whites, one of the most dangerous cancers is melanoma, whose incidence is rising at a rate of about 2.5% a year. Despite its lethality, however, melanoma is often manageable if caught early; deaths have been declining at 0.4% a year since 1990, reversing a previous upward trend. But the battle against non-Hodgkin's lymphoma, a cancer of the lymphatic system, has not been going so well. Incidence rose at 3.5% a year in the 1980s; although the increase slowed after 1990, incidence continued to climb at a rate of 0.8% per year. It remains lethal: For unknown reasons, the death rate from this cancer actually increased faster—by 1.9% per year—in the 1990s than it did in the 1980s.

    The most likely explanation for the rise in skin cancer, says Brenda Edwards, an NCI statistician and co-author of the report, is that “we have a lot more leisure time to spend at the beach and on the tennis courts,” where people get sunburned. There is no consensus on why non-Hodgkin's lymphoma is climbing among older people, says NCI researcher Lynn Ries, adding that “studies are under way.” Also targeted for investigation, says Edwards, is a rise in brain cancer among the elderly.


    Asteroid Scare Provokes Soul-Searching

    Gretchen Vogel

    Is a false alarm better than no alarm at all? Astronomers are debating that question this week after their very public reassessment of the threat posed by asteroid 1997 XF11, which is headed for a close encounter with Earth in 2028. On Wednesday, 11 March, initial calculations suggested that there was a small possibility that the object—thought to be at least a kilometer wide—might collide with Earth. But just hours after a press release had put the story on the network news and the front pages of the morning's newspapers, those estimates were revised. The asteroid will probably pass about 950,000 kilometers away (more than twice the distance to the moon), and the chance of a collision is essentially zero. For a second day, the story made the front pages, only this time the news was that there was no need to worry.

    Scientists who study near-Earth asteroids have been lobbying for more attention and more funding, but some say this is not the kind of spotlight they need. “It reflects horribly on our credibility,” says Richard Binzel of the Massachusetts Institute of Technology. “I would rather have the honest answer out there, maybe on page 7. That would be better than coming back the next day [with a revision].” To head off future false alarms, he and others say that before the press is notified of a threatening asteroid or comet, word should go out to a small community of colleagues so that they can come to a consensus before the media storm breaks out. But others, including Steve Maran of the American Astronomical Society, who distributed the press release, counter that the benefits of the publicity outweigh the confusion, and say too much behind-the-scenes conferring is fodder for conspiracy theorists.

    The story started out like a script for one of the asteroid-impact movies due out this spring: Brian Marsden of the International Astronomical Union's (IAU's) Central Bureau for Astronomical Telegrams in Cambridge, Massachusetts, a clearinghouse for new discoveries in astronomy, published a notice on the organization's e-mail circular, asking his colleagues to take a closer look at an asteroid that seemed to be headed for a very close encounter with Earth. The latest observations, made a week before, suggested that the object would pass especially near Earth in October 2028, and the uncertainty in Marsden's preliminary calculations seemed to leave room for a collision.

    The circular, which included an uncharacteristic exclamation point, was followed by the press release from Marsden (distributed by Maran). Within hours, the news of the close encounter—and the chance of a collision—was all over the media.

    But the rest of the story didn't follow the script. A few hours after they received the circular, Paul Chodas and Donald Yeomans of NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California, got the latest data from Marsden. Their analysis indicated that the asteroid would pass very near Earth, but with no real possibility of a collision. And by the next day, Eleanor Helin and Ken Lawrence at JPL had located some old images of the asteroid, taken at Mount Palomar Observatory in 1990. The additional pictures gave the scientists several more dots to connect in calculating the object's path. With the new information, everyone, including Marsden, agreed: The chance of a collision was nonexistent.

    Chodas and Yeomans—veterans at tracing orbits from sparse data—say that if Marsden had sent them the data before he sent out the circular, they could have eliminated the possibility of a collision right away. Their method takes careful account of the uncertainty in available observations of the asteroid. By projecting the effect of those uncertainties forward 30 years, the astronomers calculated an “error ellipsoid,” a region of space where the asteroid is likely to be at a given time. The data Marsden had on Wednesday yielded a long, narrow ellipsoid that came close to, but excluded, Earth's orbit. Although another analysis of the same data by Karri Muinonen of the University of Helsinki in Finland predicted a 1 in 50,000 chance of a collision, Yeomans says that probability is “so small that we have a better chance of being hit by an undiscovered asteroid in the next 30 years than have the asteroid 1997 XF11 hit in 2028.”
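    The error-ellipsoid approach amounts to propagating observational uncertainty into a distribution of possible positions and asking how much of that distribution overlaps Earth. As a rough illustration of the reasoning—not the JPL method, which propagates a full three-dimensional orbital covariance forward 30 years—here is a toy one-dimensional Monte Carlo sketch; the 100,000-kilometer uncertainty is a purely hypothetical figure chosen for the example:

```python
import random

EARTH_RADIUS_KM = 6378.0

def collision_fraction(nominal_miss_km, sigma_km, n=200_000, seed=1):
    """Sample plausible miss distances from a Gaussian around the nominal
    estimate and return the fraction that fall inside Earth's radius.
    (A 1-D stand-in for projecting an orbital error ellipsoid onto Earth;
    gravitational focusing and the full 3-D geometry are ignored.)"""
    rng = random.Random(seed)
    hits = sum(
        abs(rng.gauss(nominal_miss_km, sigma_km)) < EARTH_RADIUS_KM
        for _ in range(n)
    )
    return hits / n

# Revised 1998 estimate: ~950,000 km nominal miss distance.  Even with a
# generous hypothetical sigma of 100,000 km, no sampled trajectory hits.
p = collision_fraction(950_000, 100_000)
```

    The sketch shows why the revised answer was so decisive: with the nominal miss distance more than nine of these hypothetical standard deviations from Earth, essentially none of the plausible trajectories intersect it, whereas a narrow ellipsoid grazing Earth's orbit can yield a small but nonzero probability like Muinonen's 1 in 50,000.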

    Marsden, however, says he would do it again the same way. “On the basis of the data that were available [Wednesday], it seemed that there was a small possibility [of a collision],” he says. “I thought the important thing was to get observations” of the object, which was dim and fading fast. “How could I persuade someone on a big telescope to look at this? The best way I know is to put out an IAU circular.” The press release, he says, was designed to anticipate questions from reporters who learned of the announcement. He adds that the press attention heightened public awareness of asteroid hazards, which will be a net gain for the field. “One of the things I hope would come from this is a wide interest in doing these searches,” he says.

    But to other astronomers, the consciousness-raising doesn't offset the embarrassment. Next time, they say, the data should be distributed to other researchers before any public notice. “When this happens again, and it will happen again, the data should be placed on the [near-Earth object] Web page immediately so we can grab it. We should do an analysis, notify other observers, and then come to a consensus,” says Yeomans. “All those steps took place, it's just that they took place in the glare of the media.”

  SPAIN

    Funding Outlook Improves, But Job Crisis Remains

    Nigel Williams

    Madrid—Like many other young Spanish researchers, Marta Alvarez is looking for a job. For the past 5 years, she has been working on and off as a postdoc in her home country, most recently at the Physical Chemistry Institute in Madrid. But she has reached the end of a 5-year scheme the Spanish government devised to help its talented overseas scientists find permanent jobs back home, and her appointment is about to end. “Many people are disillusioned and are having to look abroad for work again or face unemployment,” she says. “It's a very bleak prospect.”

    Alvarez, a chemist, was drawn to a job in science during the boom years of the 1980s. The Socialist Party was then in power, and it made training more scientists a priority. Many of those students are now postdocs—some in Spain but many abroad—and most of them are eagerly looking for work in Spain. But jobs in the universities and at Spain's National Research Council, the CSIC—which runs 96 basic research institutes—are hard to come by. “It is very competitive and extremely difficult to win a permanent position,” says Margarita Salas at the Center for Molecular Biology at the Autonomous University of Madrid. “The fate of a group of extremely well-trained people is in jeopardy. It's the first time Spain has had so many good people,” says Mariano Esteban, director of the National Biotechnology Center (CNB) in Madrid.

    The new right-wing government, which ended 13 years of Socialist rule in May 1996, has begun to respond to these cries of distress. Last month, it provided a substantial boost in funding for R&D in its 1998–99 budget, giving the National Plan—which provides money for competitive, targeted research, focused in part on regional development—a 21% boost. The increase followed a rise in science's political prominence: Last year, the prime minister himself took charge of an interministerial commission that oversees science policy and funding, and the government has also created a new Office of Science and Technology to bolster policy. “The government holds science as a state priority,” the secretary of state for research, Manuel Jesus Gonzalez, told Science. “Research and development spending has decayed in recent years, but we have now reversed that downward shift.”

    Researchers have welcomed these developments, but only as a first step. The current plight of young scientists, they say, is a result of shifts in policy over the past 15 years that have put science on a roller-coaster ride. The Socialists doubled research spending during the 1980s, and the number of permanent research positions almost doubled to 2000 over the decade. As a proportion of gross national product, however, research spending peaked at only 0.85%, well short of the European Union average of 1.9%. During the first half of the 1990s, the Socialists' enthusiasm for science waned as the government was forced to cut spending to reduce the national budget deficit. The new government had promised to restart growth in research spending, but in its first budget last year, research funding was stagnant.

    Last month's budget has finally begun to bring science back up from a long dip in this bumpy ride. And some regional governments are also showing increased interest in funding research. Francisco Rubia, the director of research for the Madrid region, says: “We are keen to support the best research in Madrid. We are now spending 0.86% of our total budget—4500 million pesetas [$30 million]—on research, and our target is 2.0%.” According to Salas, “Madrid regional funds are extremely important for us, as they are more flexible than national funds.”

    But Spain's researchers say much more will be needed to tackle the pressing job crisis. “We're not recruiting young scientists, and the average age of staff at the CSIC is now 46 to 47. That's too old,” says Manuel Espinoza, director of the Center for Biological Investigation in Madrid. “The total scientific staff employed by the CSIC is now 1800, whereas in France the number of equivalent research posts is many times higher.” And the situation is about to get worse. In 1992, the Socialists created several hundred new positions on 3-year contracts to help postdocs working abroad obtain work back home. Alvarez was among those in this program. But Alvarez, like many of her colleagues who took the contracts, has been unable to find a permanent job, and the government extended some of the contracts for 2 years. All of the extended contracts will soon come to an end.

    The budget increases are one response to this problem. They will result in more temporary jobs for scientists through the National Plan, which for the first time now allows its grants to be used in part for the payment of salaries. The government has also been sympathetic to a request from the CSIC's president, Cesar Nombela, to create up to 150 new, permanent CSIC research positions this year, although some of these will be promotions for current staff. “I am confident the government will fund a substantial expansion,” says Nombela. But senior researchers question whether this will be sufficient. “It is just so hard for young researchers to get established,” says Salas. “One result is that people are leaving science who are certainly good enough to continue,” says molecular biologist Juan Ortin of the CNB.

    An increasing number of researchers are now calling for a more radical solution: a complete overhaul of the career structure of the CSIC to make it easier for young researchers to get onto a career track. CSIC staff members are tenured civil servants, and there is now wide support for a new kind of appointment for independent researchers on renewable contracts but with no automatic path to a civil service position. “Everybody wants a more flexible system, and it has to happen in the long term,” says Ortin. “I'm confident we'll get support for a more flexible system, but that requires changes to legislation through parliament,” says Nombela. And the message is getting through to the government. “We want to develop methods to incorporate young people into research without a necessary transition to public employment in the end,” says Gonzalez.

    Despite the government's bold words and budget promises, Spanish researchers, after witnessing so many ups and downs in the fortunes of science over the past 15 years, remain to be convinced of its commitment. “Spain really needs science, but I'm not convinced the politicians know how best to go about taking that into account,” says Alvarez.


    Gore Pushes Whole Earth Channel

    1. Andrew Lawler

    If Vice President Al Gore has his way, a NASA satellite will soon be beaming back continuous pictures of the whole Earth from 1.6 million kilometers away to anyone with a television or an Internet connection. The simple idea, announced last week, caught earth scientists by surprise, irritated Republican lawmakers, and sent the space agency scrambling to define the educational and scientific benefits of such a mission.

    An array of U.S., European, Japanese, Chinese, and Indian satellites already monitors Earth from geostationary orbit, about 36,000 kilometers up, providing crucial meteorological data. But Gore envisions a telescope and camera mounted on a small spacecraft, to be launched by 2000, that would hover at the point between Earth and the sun where the two bodies' gravitational pulls balance in a way that lets a craft circle the sun in step with Earth. That site would provide a vantage point for a constant view of the planet's entire sunlit face. An “Earth-Span” channel broadcasting this image in real time, said Gore during a 13 March speech at the Massachusetts Institute of Technology, “will awaken a new generation to the environment and educate millions of children around the globe.” But some lawmakers worry that the project, which NASA says could cost as much as $50 million, may endanger funding for other science efforts. And they wonder whether it could be done better by industry.
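    The distance quoted above corresponds to the sun-Earth L1 point, where a spacecraft can circle the sun in lockstep with Earth. A back-of-the-envelope check, using standard astronomical constants rather than figures from the story, comes from the Hill-radius approximation:

```python
# Estimate the sun-Earth L1 distance with the Hill-radius approximation:
# r ~ R * (m_earth / (3 * m_sun))**(1/3), measured from Earth toward the sun.
R_AU = 1.496e8      # Earth-sun distance, km
M_SUN = 1.989e30    # kg
M_EARTH = 5.972e24  # kg

r_l1 = R_AU * (M_EARTH / (3 * M_SUN)) ** (1 / 3)
print(f"L1 distance from Earth: {r_l1:,.0f} km")  # roughly 1.5 million km
```

The result is about 1.5 million kilometers, consistent with the figure NASA cites for the mission.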


    Galileo snapped this picture of Earth in 1990 on its way to Jupiter.


    NASA Administrator Dan Goldin has embraced Gore's idea, which White House sources say came in the form of a middle-of-the-night brainstorm. Goldin gave the assignment to Ghassem Asrar, NASA's new earth science chief, who says such a satellite offers several advantages over current systems. It could immediately gather and transmit data that otherwise would have to be stitched together in a time-consuming and difficult fashion, he says, and provide coverage of the polar regions. One satellite would also make it easier to track hurricanes or incoming solar storms. Gore has dubbed the satellite Triana, after the sailor on Christopher Columbus's historic voyage who spotted land.

    The proposal surprised most of the scientific community. “I don't know what's involved,” says Harvard University's Steve Wofsy, who chairs NASA's advisory panel on earth sciences, adding that he lacked sufficient detail to comment. M. R. C. Greenwood, president of the American Association for the Advancement of Science (which publishes Science) and chancellor of the University of California, Santa Cruz, discussed the idea briefly with Gore last week before the announcement and is supportive. “It's a good idea, but it hasn't been fully fleshed out yet,” she says.

    Some House lawmakers, however, are dubious about its value. Representative Dana Rohrabacher (R-CA), who chairs the space subcommittee that authorizes NASA funding, dismissed the idea as one of several “high-profile, politically motivated projects” that could be done by the private sector. “There's no reason this has to be done in a normal NASA fashion,” says David Gump, president of LunaCorp of Arlington, Virginia, which wants to put a rover on the moon with the ability to send pictures back to Earth.

    House Science Committee chair James Sensenbrenner (R-WI) says he would like Gore or his staff to testify before his committee about the project. “I want to know where the money is coming from and why this project got to the head of the line,” he told Science. “This means less money for those efforts that were peer reviewed.”

    White House and NASA officials say they are open to suggestions on how to assemble the mission, adding that education rather than science is the driving force. Gore “just wants to see if this is feasible,” a White House aide says. “If people find better ways to achieve the goal, then we'll be supportive.”


    Japan Readies Helical Device to Probe Steady-State Plasmas

    1. Dennis Normile

    Toki, Japan—Japan's effort to understand and harness the power that drives the stars will take a big step forward on 31 March when plasma physicist Atsuo Iiyoshi steps up to a panel in a cavernous control room here and pushes a button. Barring last-minute glitches, he will initiate production of the first plasma in a fusion reactor that is the largest of its kind. For Iiyoshi, director-general of Japan's National Institute for Fusion Science (NIFS), the $650 million Large Helical Device (LHD) is the next step in making this type of reactor a contending design for a commercial fusion power plant.

    For Japan, the machine marks the newest component in what is arguably the world's most ambitious fusion research program. The LHD is just one element in the country's $250-million-a-year fusion research budget—some $18 million more than the United States is spending. Japan also operates one of the world's most advanced tokamaks, the JT-60, and it's the odds-on favorite to host the planned $10 billion International Thermonuclear Experimental Reactor (ITER). “They're certainly going to give other countries a run for their money [in fusion research],” says Robert Goldston, director of the Princeton Plasma Physics Laboratory in New Jersey.

    Inside story.

    Iiyoshi with a model of the Large Helical Device, in which helical coils create a powerful magnetic field to confine burning plasma.

    D. NORMILE (top) / NIFS (bottom)

    The design for the LHD was pioneered at Princeton in the 1950s with a device called a stellarator. However, Princeton and most other labs around the world later turned their attention to a competing fusion device, the tokamak. Both approaches rely on heating a plasma of ionized light atoms so that they fuse into heavier atoms, releasing energy in a process that mimics the sun's power plant. To maintain the required temperature of 100 million degrees Celsius, the plasma is confined by a magnetic field that spirals through a doughnut-shaped vessel.

    The difference between the two machines lies in how that field is created. In a tokamak, the field is the sum of what is generated by a current sent through the plasma itself and coils that fit like rings around the doughnut. In a helical device, the coils themselves are wound in a helix around the doughnut.

    Neither approach is trouble-free. The current running through the plasma in a tokamak can only be applied in short pulses, limiting the duration of the magnetic confinement and, thus, the fusion reaction. Large currents also can cause a phenomenon called disruption, a sudden rapid loss of energy that can damage reactor components. The magnetic field in a helical device is independent of the plasma and can run in a steady state, allowing a continuous fusion reaction. But the plasmas in the early stellarators lost energy at rates an order of magnitude greater than that of the best tokamaks. In addition, their helical coils were hard to build and their magnetic fields were difficult to analyze.

    While much of the rest of the world abandoned helical devices in the 1960s, a few groups, notably at Kyoto University in Japan and the Max Planck Institute for Plasma Physics in Garching, Germany, continued working on the energy-loss problem. Proponents see the LHD and a German device of comparable size but different configuration, now under construction, as opportunities to show that a helical device could be an alternative to the tokamak for commercial power reactors. “There is no unique solution for fusion power yet,” Iiyoshi says.

    Japan's fervent interest in fusion starts from one simple fact: The nation imports all of its oil. The economic devastation wrought by the oil shocks of the 1970s is still a vivid memory here, fueling the search for alternative energy sources. Policy-makers also believe that fusion research will be a boon to the country's heavy industry. Osamu Motojima, NIFS's director of research operations, points proudly to the LHD's superconducting coils, which required advances in everything from the material of the wires to a new machine to wind them. “Very few countries could build something like this,” Motojima says.

    The project has also gotten a boost from ongoing competition between rival agencies. “Happily, Japan has two [science] ministries,” says Iiyoshi. With JT-60 and Japan's ITER efforts supported by the Science and Technology Agency, NIFS benefited from a willingness of its funding agency, the Ministry of Education, Science, Sports, and Culture (Monbusho), to fund an alternative.

    Despite the LHD's domestic importance, Iiyoshi stresses that it will be an international facility and that any scientist can apply for time on the machine. “They are more than just open, they're very eager for people to come here,” says John Rice, a research scientist at Massachusetts Institute of Technology now working on diagnostic devices at NIFS.

    Iiyoshi says that the parallel efforts on helical devices and tokamaks are complementary and necessary. Tokamaks are far ahead of helical devices in terms of the plasma densities and temperatures achieved. In the last few years, tokamaks have crept closer to break even, where the energy generated by the fusion reaction equals the energy put into heating the plasma (Science, 3 October 1997, p. 29). Günter Grieger, director of the new Greifswald branch of the Garching institute, where the new Wendelstein 7X stellarator is being built, adds that their track record makes tokamaks the obvious choice for the next step—investigating the self-sustaining fusion reaction using the deuterium and tritium that would be used in future reactors. “For ITER, it is the right way to go,” Grieger says.

    But ITER is not the last word in fusion reactors. While ITER plans to operate only in pulses of up to about 1000 seconds, the Japanese and German machines will confine the plasma for hours or even days. “We will be able to investigate parameters of steady-state plasma physics in ways that tokamaks can't,” Iiyoshi says. That contribution, says Princeton's Goldston, “makes the LHD an important part of the world fusion research effort.”

    Helical device proponents hope to make more than just a contribution. Iiyoshi predicts that the performance of the LHD and the Wendelstein 7X will put helical devices back in the running by 2015, when it's time to design a demonstration reactor. “It could be the choice if we have great success with the LHD experiments,” he says, a process that starts after he pushes the button.


    Training Lasers to Be Chemists

    1. Robert F. Service

    This News story accompanies a special issue on reaction dynamics, which discusses chemists' efforts to understand and influence the inner workings of chemical reactions (see p. 1875).


    With the abandon of newlyweds on their honeymoon, molecules undergoing a chemical reaction come together in an embrace that can both transform them and create a whole family of offspring in no time. However, chemists, like meddling relatives, are continually searching for ways to coax the partners into producing just the offspring they're looking for. This is not always an easy task, but over the past decade or so researchers have been trying to light the way to particular molecular offspring by delivering precisely tuned, precisely timed laser pulses to the parent molecules as they react.

    The potential of this technique—known as coherent control—is enormous. Reactions produce many valuable compounds in only minute quantities, lost among many other products. Coherent control could coax such reactions to produce much more of the desired product. It also has the potential to push reactions down previously unknown pathways, yielding products that might never be produced by traditional methods. But the difficulties have turned out to fully equal the promise, and so far the technique has been successfully applied only to a smattering of simple reactions. The reason: For any molecule containing more than just a handful of atoms, the task of calculating precisely what sort of laser pulse will push it in the desired direction becomes impossibly complicated. “We can't do the experiment because we don't know what the best shaped pulse would be,” says Philip Bucksbaum, a laser physicist at the University of Michigan, Ann Arbor.

    In recent months, however, several research groups have made remarkable progress by turning to an age-old engineering principle: feedback. They put the control of the input lasers in the hands of a computer program capable of learning, such as a genetic algorithm. After the lasers fire a random array of pulses, the program analyzes the results, calculates how to achieve a better result, and fires another array of improved pulses until it finds the best one possible. For researchers looking to control the reactions of complex molecules, the upshot is that “it's no longer necessary to know everything about your system in advance,” says Bucksbaum.

    So far, attempts at using computer-controlled feedback to optimize laser control of molecules have been limited to simple demonstrations, such as finding the best light pulse for triggering fluorescence from a dye molecule. But these demonstrations have been so successful that many researchers see feedback as the most promising way to apply coherent control to everything from cancer therapy—where tailored light pulses would prompt optically active molecules to generate cell-killing reactive oxygen—to electronics manufacturing, where pulses would make precise changes in the molecular structure of the polymers used to create circuit patterns. “Learning in quantum systems is a very important development,” says Herschel Rabitz, a theoretical chemist at Princeton University. “All of the applications [of coherent control] will have to embrace this.”

    The right light

    In conventional chemistry, researchers typically use heat, mechanical mixing, and pressure to influence how compounds react. But this “shake and bake” approach isn't terribly selective, as the changes in external conditions affect all atoms in the molecule indiscriminately. What chemists would like to be able to do is manipulate the electronic structure of the molecule, encouraging individual bonds to break and reattach, and so produce a desired product.

    Lasers and other precisely controlled sources of electromagnetic energy offer this possibility. Atoms connected by bonds act something like balls connected by a spring: Knock one of the balls and the spring jiggles or vibrates back and forth. In molecules, heat energy keeps the balls moving and springs jiggling all the time, with each pair of atoms having its own characteristic vibration frequency. Pulses of electromagnetic radiation—most often light—can excite these natural vibrations, causing them to jiggle faster or slower. If tuned just right, the light can cut an electron free or jiggle a bond so much that it breaks, thereby forcing the molecule along one desired reaction path. Researchers began to have enticing success with this approach in the early 1990s (Science, 14 October 1994, p. 215).

    Among the first reactions to be controlled was the breakdown of the simple molecule H-O-D, in which one of the hydrogens in water is replaced by deuterium, hydrogen's heavy twin. Breaking a single bond in the molecule produces either H-O and D, or H and O-D. Because H and D have different masses, H-O-D's two chemical bonds vibrate at slightly different frequencies. By sending in a laser pulse at one frequency, a chemist can increase the stretching of, say, the H-O bond, making it more likely to break and create H and O-D. Such experiments have become an increasingly popular way to learn about the dynamics of reactions, because they reveal the precise amount of energy needed to drive particular reactions (see Article on p. 1875).
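    The frequency difference that makes this trick work falls straight out of the ball-and-spring model: for the same bond stiffness k, the vibration frequency scales as the square root of k divided by the reduced mass of the atom pair, so the heavier deuterium drags its bond to a lower frequency. A minimal sketch, using standard atomic masses rather than data from the experiments described:

```python
# Harmonic-oscillator model: frequency ~ sqrt(k / mu), mu = m1*m2/(m1+m2).
# The bond stiffness k is the same for O-H and O-D; only the reduced mass
# differs, so k cancels out of the frequency ratio.
def reduced_mass(m1, m2):
    return m1 * m2 / (m1 + m2)

m_H, m_D, m_O = 1.008, 2.014, 15.999  # atomic mass units

mu_OH = reduced_mass(m_O, m_H)
mu_OD = reduced_mass(m_O, m_D)

ratio = (mu_OD / mu_OH) ** 0.5
print(f"O-H vibrates ~{ratio:.2f}x faster than O-D")
```

The ratio comes out near 1.37, which matches the observed O-H and O-D stretch frequencies (roughly 3700 versus 2700 wave numbers) and is what lets a tuned laser pick out one bond over the other.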

    When it comes to more complex molecules, however, things get messy fast. What complicates the picture is that the movement of one spring between two atoms affects the movement of neighboring springs. H-O-D only has two interacting springs, but in complex molecules with 50 or 100 atoms, any change in vibration between one pair instantly feeds into dozens of other bonds. Tracking all these changes quickly becomes an impossible task. “We have to have the knowledge of these changes to know what kind of light [pulse] will help us meet the goal” of steering the reaction down one particular pathway, says Kent Wilson, a chemist at the University of California, San Diego. With complex molecules that's simply not possible. As a result, he adds, “we realized we're going to have to do something else.”

    Learning a lesson

    That something is feedback. People use feedback all the time to learn skills. And feedback is ubiquitous in the machine world, appearing in everything from airplane guidance-control systems to home thermostats, says Rabitz. Nevertheless, he adds, “it has been a long time coming for the [coherent control] community to think that this is the way to deal with molecules.”

    Rabitz and Richard Judson of Sandia National Laboratory were the first to explore the approach. Back in 1992, the pair proposed a feedback scheme that would identify the best pulse for exciting a simple molecule. The idea was to deliver a photon blast to a sample and send the results to a computer, which would use this information to predict what shape the next pulse should be. In a computer simulation of an actual laser experiment, Rabitz and Judson reported in Physical Review Letters (9 March 1992), the computer quickly learned to create the optimal light pulses needed to excite two-atom molecules to rotate in a unique manner.

    Although the simulation suggested that the idea held promise, it wasn't until recently that Wilson's team tested it on real materials in the lab. In the 28 November 1997 issue of Chemical Physics Letters, Wilson and San Diego co-workers Christopher Bardeen and Vladislav Yakovlev, along with colleagues at Brown University and Princeton, showed that feedback systems can manipulate real-world molecules.

    In their experiment, the researchers used large dye molecules—each containing about 100 atoms—that absorb laser light at one frequency and reemit it as fluorescence at a slightly lower frequency. They set two separate goals for their setup. First, find the type of light pulse that would create the brightest fluorescence possible from dye molecules in an organic solvent, no matter how much energy was put in with the laser pulse. Second, find the pulse that would produce fluorescence most efficiently, getting the most light out for the least put in.

    Light pulses can vary in many ways, such as duration, colors, and chirp, a measure of how the color composition of a pulse changes from start to finish. For their initial set of pulses, Wilson and his colleagues chose more or less random combinations of features, such as a short duration, red and yellow colors, and a chirp that shifted the light more toward the yellow by the end. After each pulse was fired, a detector recorded how much fluorescence came out of the sample.

    Then a computer running a genetic algorithm program treated the pulse characteristics like genes to produce new pulses for the next round. The computer ignored pulses that produced poor results while recombining the characteristics of the successful ones, marrying, for example, the duration of one successful pulse with the color of another and the chirp of a third. The computer then analyzed this second round of pulses and produced a third, and so on.
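    The loop described above can be sketched as a toy genetic algorithm. Everything here is illustrative: the three “genes” stand in for duration, color, and chirp, and the fluorescence function is a hypothetical surrogate for the detector reading, which in the lab is measured rather than computed:

```python
import random

# Hypothetical stand-in for the measured fluorescence: peaked at an unknown
# optimum pulse (duration, color, chirp). In the real experiment this number
# comes from a photodetector, not a formula.
TARGET = (0.3, 0.7, -0.2)

def fluorescence(pulse):
    return -sum((p - t) ** 2 for p, t in zip(pulse, TARGET))

def random_pulse():
    return tuple(random.uniform(-1, 1) for _ in range(3))

def crossover(a, b):
    # "Marry" genes: take each characteristic from one parent or the other.
    return tuple(random.choice(pair) for pair in zip(a, b))

def mutate(pulse, rate=0.1):
    return tuple(p + random.gauss(0, rate) for p in pulse)

def evolve(generations=40, pop_size=30, keep=10):
    population = [random_pulse() for _ in range(pop_size)]
    for _ in range(generations):
        # Fire the pulses, rank by brightness, discard the poor performers.
        population.sort(key=fluorescence, reverse=True)
        survivors = population[:keep]
        # Recombine and perturb the successful pulses for the next round.
        population = survivors + [
            mutate(crossover(random.choice(survivors), random.choice(survivors)))
            for _ in range(pop_size - keep)
        ]
    return max(population, key=fluorescence)

best = evolve()
print("best pulse:", [round(p, 2) for p in best])
```

Because the best pulses of each round survive unchanged, the algorithm can only improve, and after a few dozen generations it homes in on the optimum without ever being told what the response function looks like — the same property that makes feedback attractive for molecules too complex to model.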

    The team found that after about 17 iterations the computer had come up with solutions to the two different problems. For creating the brightest fluorescence, the best pulse turned out to be one containing photons over a relatively broad range of frequencies, with a chirp that moved from low to high frequencies. For sparking the most efficient fluorescence, the best pulses contained a narrow range of frequencies and the chirp was irrelevant. The results are “very interesting,” says Bucksbaum, because it would have been impossible to come up with this solution through sheer calculation alone. “It's just the direction we want to be going in.”

    It's a direction others are also beginning to take, testing the feedback skills that will ultimately be put to use guiding chemical reactions. At a chemistry meeting last November in Cancún, Mexico, physicists David Reitze and Jeffrey Krause of the University of Florida, Gainesville, presented results from a feedback experiment done with optical crystals that absorb photons at one frequency and reemit them at twice the frequency. They trained a pair of lasers on one such crystal and had their computer determine the combination of color, timing, and beam mixing that turned out the most light at double the frequency. In fact, Krause explains, the answer to the problem was already known in advance, but he and Reitze just wanted to see if their computer-controlled optics system would come up with the right answer. “It got it right away,” says Krause.

    Charles Marcus and his colleagues at Stanford University have been experimenting with a feedback system that relies on voltages rather than light pulses and controls not a reacting molecule but a tiny semiconductor island known as a quantum dot. But the goal is the same: manipulating the behavior of electrons through feedback. Ordinarily, electrons can flow smoothly through the dot. But add a magnetic field, and the electrons change their interactions with the walls of the quantum dot, causing the amount of current that passes through the dot to fluctuate wildly. With a computer-controlled feedback system that compensated instantly for any change in conductance by applying a voltage to the dot, Marcus and his colleagues found that they could easily squelch the fluctuations and return the electron flow to a steady trickle.

    So what's likely to come of the new enthusiasm for feedback in coherent control? Commercial applications will probably be limited, say Rabitz and others. At least for now, using lasers remains an expensive way to process bulk chemicals, although coherent control could gain a foothold in medicine and microelectronics. But the new approach is likely to produce valuable scientific insights into complex molecules. The technique, says Krause, gives researchers the full benefit of hindsight: By knowing what type of light pulse produces the best result, they can determine how their molecule must have wiggled and jiggled to produce it. As a result, says Krause, “this is a way of learning about the dynamics of a system as well as learning how to control it.”


    Gentle Force of Entropy Bridges Disciplines

    1. David Kestenbaum

    Anyone who has tried to reunite two socks after a tumble in the dryer is well acquainted with entropy. Over time the universe slides on a one-way course toward disorder. Cream mixes irretrievably into coffee. Our bodies lose heat to the cold winter air. But sometimes, it turns out, entropy can be a force for organization. Under certain conditions, the clothes dryer will, in effect, pair up socks.

    Physicists have recently rediscovered this strange phenomenon, in which an increase in entropy in one part of a system forces another part into greater order. Now their enthusiasm is spreading. Engineers are awakening to the possibility of harnessing the force of entropy to build ordered structures. And a recent paper in Physical Review Letters showing how this ordering force can push on membranes has rekindled speculation that living cells might take advantage of this little-known trick of physics. The idea is “exquisitely interesting,” says Adam Simon, a biophysicist at Merck & Co., Inc., in West Point, Pennsylvania. “If entropic forces are playing a role, that's a complete rethinking” of what goes on in a cell.

    Making space.

    Big spheres can make more space for little ones by clustering together or moving into a corner. That's because large spheres and container walls are edged by a forbidden area (brown) where a small sphere cannot fit.


    The idea that entropy could tidy up as well as tear down actually goes back to a 1958 paper by two Japanese physicists. They described how two large particles in a sea of smaller ones would eventually find each other and stick together. By snuggling close together, the two freed up a little bit of space for the smaller particles to bang around in, increasing the overall disorder.

    The revival began in a drop of water at the University of Pennsylvania in 1992. Peter Kaplan, then a graduate student there, had been mixing up a colloid cocktail. He let loose two sizes of microscopic plastic spheres in a bit of salty water. Instead of a uniformly murky solution, Kaplan saw small structures forming around the edge of the container. The larger spheres were packing themselves into tightly ordered crystals. It was like watching cream unstir itself from coffee, or a movie of erosion run backward.

    After a bit of head scratching, Kaplan, his adviser Arjun Yodh, and colleagues realized that they were seeing a strange side of entropy. The large spheres were crowding toward the edge of the drop to make more room for the small spheres to be disorderly, just as people waiting out a dance number in a discotheque crowd against the walls to make more room on the dance floor. Because each sphere contributes equally to the overall entropy, the increase in entropy from the many small spheres more than compensated for the reduction in entropy from the few large ones.
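    The bookkeeping behind that trade-off can be made concrete. In the depletion picture worked out by the two Japanese physicists mentioned above (Asakura and Oosawa), each large sphere is wrapped in a shell of one small-sphere radius that small-sphere centers cannot enter; when two large spheres touch, their shells overlap, and the overlap volume is handed back to the small spheres. A rough sketch, with arbitrary illustrative radii:

```python
import math

# Depletion picture: a big sphere of radius R excludes small-sphere centers
# from a shell of thickness r around it. When two big spheres touch (center
# distance 2R), their exclusion shells of radius R + r overlap, and that
# lens-shaped volume becomes newly available to the small spheres.
def overlap_volume(a, d):
    """Lens-shaped overlap of two spheres of radius a at center distance d."""
    return math.pi * (4 * a + d) * (2 * a - d) ** 2 / 12

R, r = 0.5, 0.05  # big- and small-sphere radii, micrometers (illustrative)
freed = overlap_volume(R + r, 2 * R)
print(f"volume freed per contact: {freed:.5f} um^3")

# For dilute small spheres at number density n, the entropy gain is roughly
# dS ~ k_B * n * freed, so a denser sea of small spheres pushes harder --
# which is why the crystals only form above a threshold concentration.
```

The same geometry explains why the big spheres head for walls and corners: a flat wall carries its own forbidden layer, and a sphere pressed against it reclaims even more volume than two spheres touching each other.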

    The force of entropy acts through the uncoordinated movements of the smaller spheres. “You can literally watch it happen under a microscope,” Yodh says. The researchers made videos and saw that the random bombardments of the small spheres would eventually bring a big one near to the wall, where it would stick, pelted on its exposed sides by the smaller beads. Then the big spheres would pile up one by one. It takes a certain density of small spheres to trigger the pileup, says Yodh. Add a bit more water, “and [the crystal] just melts.”

    Now Yodh and his colleagues have taken the work another step further. In a paper in the 12 January issue of Physical Review Letters, they showed that, in an arterylike tube filled with small and large spheres, entropy will nudge the large spheres to the edge of the tube, and then toward the place where the tube bends the most. “In a microscope it looks very magical,” says Anthony Dinsmore, one of the authors, who is now at the Naval Research Lab in Washington, D.C. Actually, he explains, the big spheres are just surrounding themselves with as much wall as possible to make more room for the little ones.

    Dinsmore adds that if the wall is flexible, the large spheres should embed themselves in it. That would completely take them out of the picture and maximize the entropy. And that, says Dennis Discher, a biologist at the University of Pennsylvania, just might explain the mysterious process in which the nucleus of a red blood cell precursor buds off and is shed before the cell heads out into the bloodstream as a sack of hemoglobin. Maybe the nucleus, like a large ball, gets shoved out when enough hemoglobin—which plays the role of the small spheres—builds up inside the cell.

    As Simon notes, the conditions in a cell are just right for entropic forces to get a foothold. Cell fluids are salty, which would tend to screen out the much stronger electrostatic forces that could overwhelm entropy's gentle influence. And, he says, there are plenty of proteins and other objects floating around that could play the role of the small spheres. Indeed, biochemists Steve Zimmerman and Allen Minton at the National Institutes of Health anticipated some of the current speculation by showing in the 1980s that DNA ligation—which stitches DNA strands together—can proceed 10 times faster in vitro when tiny molecules are added to the mix. “It's a whole different kind of chemistry going on” when things get crowded, Minton says.

    Engineers now hope to harness this force for their own purpose: developing crystals that reflect visible light perfectly at all angles. These long-sought “photonic band gap” materials—ordered arrays of particles spaced about a wavelength of light apart—could improve lasers or keep fiber-optic cables from leaking light when bent into a sharp U-turn, says David Pine of the University of California, Santa Barbara. But scientists have had trouble getting the tiny spheres to arrange themselves in the right crystal structure.

    Pine hopes a simpler method may work: Lay down a regular network of ridges on a sheet, then use the entropic force to nestle metal spheres into the ridges. These spheres would form the first layer of the crystal, which would provide a template for subsequent layers. The only problem, he says, is making the pattern of ridges with the necessary 0.5-micrometer accuracy. Still, he says, the entropic force is ideal. “You can control its strength by adjusting the size and concentration of spheres,” he says. If Pine is right, he'll have made an ordered niche in an increasingly disordered world.


    Toxicologists Shed New Light on Old Poisons

    1. Jocelyn Kaiser

    Seattle—Nearly 5000 scientists flocked here earlier this month for the Society of Toxicology's largest-ever annual meeting. While the elements outside were cool and rainy, the elements indoors were hot: One presentation suggested that mercury exposure might increase malaria risk, and a symposium offered new insights into an old enigma, how arsenic causes cancer.

    A Mercury-Malaria Link?

    The gold rush now under way in developing countries from the Philippines to sub-Saharan Africa and Brazil has wreaked environmental havoc—and a toxicologist's nightmare. The often crude processes used to extract gold from sediments can expose workers to heavy doses of the element mercury, a neurotoxin. At the meeting, researchers discussed hints from animal studies that along with its known risks, mercury may also be lowering workers' immunity to malaria.

    Millions of people are now employed in gold-mining operations in the tropics, and statistics from Brazil show that mining areas are malaria hot spots: Nearly 80% of cases occur in three Amazonian states with intensive mining. Prosaic factors such as standing water near mines—breeding grounds for mosquitoes that transmit the disease—may be contributing to the outbreaks. But another possible factor that has been overlooked until now, says Ellen Silbergeld of the University of Maryland School of Medicine in Baltimore, is the effect of mercury on the immune system.

    One widespread mining process involves mixing sediments with mercury to extract clumps of gold-mercury amalgam, then evaporating the mercury, leaving the gold behind. In blood and urine samples from workers in the Brazilian state of Para, Silbergeld's group and collaborators in Brazil last summer detected mercury levels up to 100 micrograms per liter, five times the level at which adverse effects are seen in people, according to the World Health Organization. Mercury exposure is well-known as a trigger for the immune system to attack the body's own tissues, leading to kidney failure, and it has been shown to skew levels of the immune system's T helper cells, increasing the numbers of some kinds of T helper cells and decreasing others.

    To probe for a link between mercury and malaria, Silbergeld's group turned to mice used in malaria studies by the University of Maryland's Abdu Azad. Silbergeld's team injected two strains of mice with mercury chloride or a saline control over 11 days, giving low enough doses so that the mercury-treated animals had no weight loss or other overt signs of toxicity. Next, the researchers injected the mice with a nonlethal strain of malaria. After 12 days, when parasite blood levels peaked, mice exposed to mercury harbored up to five times more parasites than controls did, they found. “We saw that [exposed mice] had a substantial reduction in their resistance to malaria,” Silbergeld says.

    The study is “suggestive but not decisive,” says immunotoxicologist Allen Silverstone of the State University of New York, Syracuse. The researchers need to offer a cellular explanation of how mercury might lower an organism's defenses against malaria, he says. Silbergeld says her group plans to study the question and also investigate whether mercury lowers resistance to cholera and other pathogens as well.

    Clues to Unsolved Arsenic Case

    In Agatha Christie stories, a sip of an arsenic-laced cocktail means instant death. But arsenic also acts more insidiously. As far back as 1888, physicians noticed that some people who ingested an arsenic-based skin tonic later developed skin cancer. Arsenic has since gained a fearsome reputation as a carcinogen, which has led the Environmental Protection Agency (EPA) to propose drastically tightening the U.S. standard for arsenic in drinking water. Yet how chronic arsenic exposure causes cancer has remained a century-old mystery. And in a plot twist that has only deepened the puzzle, it seems that only people, not lab rats, get cancer from arsenic.

    But now toxicologists finally have some solid leads as to how arsenic may do its dirty work. At the meeting, scientists reported that various mammal species—and possibly some human populations—differ sharply in how they metabolize arsenic. The finding suggests that certain individuals may be genetically predisposed to the element's toxic effects. Researchers also presented evidence that arsenic may turn cancer genes on or off by interfering with a key gene-regulatory process called DNA methylation.

    The lack of an animal model for arsenic toxicity has hindered research, but recent work has nonetheless turned up clues to the mystery. In general, humans and other mammals defang inorganic arsenic compounds—arsenate or arsenite, the forms usually found in ground water—by adding methyl groups to make two compounds thought to be less toxic, monomethylarsonic acid (MMA) and dimethylarsinic acid (DMA). But researchers now realize that this methylation may be a double-edged sword. While tacking on methyl groups may protect against arsenic's immediate toxic effects, it could also explain why chronic arsenic exposure can lead to cancer.

    Hints of a link between how people metabolize arsenic and cancer come in part from Vasken Aposhian's group at the University of Arizona, Tucson, which has found that the activity of a family of enzymes that catalyze these reactions, arsenite methyltransferases, varies widely across mammalian species. Aposhian says that unlike other mammals that convert arsenic to MMA and DMA, South American mammals like the marmoset and guinea pig convert little or none of the element to methylated forms. He suggests in an article in press at Environmental Health Perspectives that these species might have evolved an alternative way to tolerate inorganic arsenic in the blood—perhaps by binding it to proteins.

    Intriguingly, people may have the same geographic split in their ability to detoxify arsenic. Aposhian's team has been studying a group of indigenous villagers in Chile, who for thousands of years apparently have been drinking water laced with dangerous levels of arsenic, but who have no signs of cancer. Aposhian hopes to isolate some of the group's arsenic-metabolizing enzymes from lymphocyte samples. It's “fascinating stuff,” says the National Institute of Environmental Health Sciences' Michael Waalkes, who says some populations may have a genetic variant that causes them to metabolize arsenic differently. That could explain why people exposed to arsenic in Chile and other geographic regions, such as Mexico, seem to be less susceptible to arsenic-related cancers than are Taiwanese and Indians, Waalkes says.

    A possible link between arsenic methylation and cancer fits squarely with a booming research area: the role of methylation in switching on or off cancer-related genes. When cells add methyl groups to arsenic, they deplete a compound called SAM that's needed to methylate DNA and tag genes that should be turned off. If the pool of SAM is depleted by arsenic, that might hinder the cell's ability to control gene expression, says Waalkes. His lab has done experiments showing that rat liver cells treated with arsenic are “hypomethylated”—that is, the genome doesn't have as many methyl groups as it normally would. He's also shown that an oncogene, c-myc, switches on in these cells.

    But arsenic might also have a deleterious effect in some cells by boosting methylation, says Marc Mass, a toxicologist at EPA's research lab in Research Triangle Park, North Carolina. His group has found in experiments on human lung cells that arsenite spurs the activity of an enzyme that attaches methyl groups to the p53 tumor suppressor gene. The methyl groups are added to a region of the gene where they would slow its transcription—and thus increase the risk of cancer, says Mass.

    Clearly, the mystery is far from solved. And that leaves scientists in the uncomfortable position of assessing EPA's proposal to reduce maximum arsenic levels in drinking water without being able to point to a proven mechanism of arsenic's carcinogenicity. EPA wants to reduce those levels from 50 micrograms per liter to as few as 2 micrograms per liter in 2001—a plan that could cost utilities up to $1.5 billion a year.


    Missing Link Ties Birds, Dinosaurs

    1. Ann Gibbons

    Paleontologist Catherine Forster and colleagues were working in their lab one weekend in 1995, chipping the skeleton of an ancient bird from a block of sandstone, when they noticed that the bird's half-buried second toe seemed unusually large. Forster joked that perhaps this specimen would help settle the long-running battle over whether dinosaurs gave rise to birds by having a long sickle claw like some dinosaurs. Half an hour later, the whole toe was exposed—and, amazingly, the raven-sized bird had a “wicked-looking” sickle claw, fit for a Velociraptor or other dromaeosaurid dinosaur. “We knew then that this was a really primitive bird, walking in the gray area between bird and dinosaur,” says Forster, of the State University of New York, Stony Brook.

    On page 1915, Forster and her colleagues describe this new species of primitive bird from Madagascar, called Rahona ostromi (Rahona, for menacing cloud in Malagasy, and ostromi in honor of Yale University paleontologist John Ostrom). R. ostromi, which lived 65 million to 70 million years ago, had feathered wings like a modern bird, but a long bony tail and a sickle claw like a meat-eating theropod dinosaur. Although it lived 80 million years after the first known bird, Archaeopteryx, R. ostromi may be one of the most primitive birds known and joins a gallery of recently discovered creatures that seem part bird and part dinosaur, researchers say. “It's a great discovery,” says Archaeopteryx expert Peter Wellnhofer of the Bavarian State Collection of Paleontology and Historical Geology in Munich, Germany. “This fossil is very strong support for the theropod ancestry of birds.” But this find won't end the fight over bird origins; researchers skeptical of a dinosaur ancestry say that Forster's team may have mistakenly combined bird and dinosaur bones.

    Bird of passage.

    A slashing claw (above) on an ancient bird dug up in Madagascar suggests that it was in transition from dinosaurs to birds.


    In 1995, in their second field season in the sandstone hills of Madagascar, Forster and paleontologist Scott Sampson of the New York College of Osteopathic Medicine in Old Westbury, on a dig led by David Krause, also of Stony Brook, dug up a long, slender lower wing bone with quill knobs for feather attachment. It lay just above, although not attached to, several hind limb bones that fit together, as well as a long, bony tail like that of Archaeopteryx. The researchers knew they had something special, so they chopped out a 27-kilogram block of stone containing the bird and shipped the whole thing home to New York.

    After preparing the specimen, they found that the bird's pelvic and pubic bones resemble those of Archaeopteryx and other early birds—making R. ostromi surprisingly primitive for its time in the late Cretaceous, when modern birds had already taken flight. What's more, the bird's last six dorsal vertebrae have an extra articular face, as seen in theropods but not in modern birds. These features, plus that sickle claw, make R. ostromi look even more like a theropod than Archaeopteryx does, says Forster.

    The new specimen joins a collection of strange-looking birds and dinosaurs, such as Confuciusornis and Protoarchaeopteryx from China, Mononykus from Mongolia, and Unenlagia from Argentina, whose combinations of features are hard to explain if birds evolved from some predinosaurian reptile, argues University of Chicago paleontologist Paul Sereno. “In the past 5 years, we've discovered so many wonderful intermediate forms that are close to the transition from dinosaurs to birds,” he says. The best way to explain these specimens' half-bird, half-dinosaur appearance, he says, is that birds evolved from dinosaurs. Animals such as R. ostromi then retained many primitive dinosaurian traits for millions of years, making it “a living fossil in its own time,” says Wellnhofer.

    Researchers like John Ruben of Oregon State University in Corvallis, who argues against a dinosaur origin of birds, agree that R. ostromi looks like a dinosaur—because, they say, its hind limbs actually come from a small dinosaur. “I think it's a chimera—a little dinosaur hindquarter, with a bird's forelimbs,” Ruben says. Agrees University of Kansas paleo-ornithologist Larry Martin, “It's another dinosaur trying to hit it big as a bird.” Martin thinks that the hind limb belonged to a dinosaur and that the wing bones could have been those of another ancient bird found at the same site, Vorona berivotrensis; the only known skeleton of that bird is missing its wings. “They owe us a close explanation why this can't be that bird,” says Martin.

    As discussed in the paper, Forster can't rule out that the wing bones and hind limb come from two different animals. But she contends that the hind limbs are clearly bird legs, possessing avian traits such as an opposable big toe and a small fibula, or lower leg bone. “They make a pretty good case that there are subtle avian characters in the hind limb,” agrees University of Pennsylvania paleontologist Peter Dodson.

    He adds that even if the bones all come from a bird, “the overall impression is that it's dinosaurian. It's exciting because if it's a single animal, it's sitting on the fence, somewhere between birds and dinosaurs.” But whether R. ostromi is what it appears to be probably won't be settled until another specimen—complete with wings, tail, and slashing claws—rises from the sandstone of Madagascar.


    The Bare Bones of Catalysis

    1. Erik Stokstad

    Nature has given us millions of enzymes, the chemical workhorses that speed up reactions inside living organisms. So you'd think that bioengineers who use enzymes in test tubes or industrial vats could simply choose the best one for the task at hand. But the sad truth is that good enzymes are hard to find. So biochemists have been harnessing an artificial version of evolution to refine natural enzymes, making new variants that work faster, longer, and at higher temperatures. Now researchers have used test tube evolution to create a redesigned enzyme that still performs the function of its natural counterpart.

    By tearing apart an enzyme and pushing its fragments through a round of mutation and selection to recover the original function, chemist Donald Hilvert of the Swiss Federal Institute of Technology in Zürich and his colleagues report on page 1958 that they have come up with a new, smaller version that is equally adept at the original job: helping to assemble amino acids. Researchers hope this strategy of stripping an enzyme or other protein to its bare essentials will reveal how particular kinks and folds dictate how that protein works. It might also lead to tiny molecules that retain a therapeutic protein's function while lasting longer in the body. This molecular miniaturization is “on the frontier of protein design,” says David Eisenberg of the University of California, Los Angeles.

    Bent up.

    Enzyme's monomers add amino acids (red) and turn into smaller, working enzymes.


    One of the first successful downsizings came in 1996, when chemist Andrew Braisted and protein engineer James Wells of Genentech in South San Francisco, California, chopped away at one binding site of protein A, a bacterial protein that binds to a class of antibodies called G-type immunoglobulins. When one of three helices that help form this binding site is truncated, they found, the molecule and the antibody don't get together. To try to patch up that relationship, the researchers turned to evolution. They randomly fiddled with nucleotides at specific locations within the gene encoding the two full helices, creating about 100 million new versions. Next they inserted these mutated genes into viruslike particles called phages, which expressed the protein on their surface. Any new proteins with truncated helices that worked like the original would bind to an antibody stuck to the bottom of a plastic well.

    Braisted and Wells rinsed away phages that had not bound and collected ones that did. They mutated the genes again and repeated the selection process. After three rounds of mutation and selection, the duo had evolved a new truncated peptide that bound to the antibody with nearly the same affinity as the original protein. “It's like taking an animal, amputating one of its legs, and evolving it so it can walk again,” says Michael Hecht, a chemist at Princeton.

    Now Hilvert and his colleagues—Gavin MacBeath at Harvard University and Peter Kast at the Swiss Federal Institute of Technology, all working at The Scripps Research Institute in La Jolla, California—have taken this approach a step further by applying it to an enzyme. Their target was chorismate mutase (CM), an enzyme that helps bacteria and higher plants make certain amino acids. Like protein A, CM sports helices linked by short strings of amino acids, called turns. However, CM is a dimer: two identical monomers, each consisting of three helices, locked in a tight embrace. Hilvert's group set out to part them and force a single monomer to develop the dimer's ability to catalyze a chemical reaction needed to make the amino acids.

    Splitting the dimer required the researchers to bend one of the helices into a “U” by inserting a new turn. Because it is nearly impossible to predict how a turn will alter the structure or function of a helix, Hilvert's team decided to let natural selection pick a winner. They created a “library” of DNA sequences encoding millions of versions of the enzyme, each one with a different turn, and slipped these genes into a strain of Escherichia coli that can't make CM. Next, they added a selection pressure: The bacteria were grown on food lacking the amino acids that CM helps make. This meant that bacteria with monomers that work like the dimer would flourish, while ill-equipped bacteria would perish. “We let the organisms fight it out,” says Hilvert. When the team sampled the proteins made by the survivors, they found that 0.05% of the variants were monomers that could work as well as the original dimeric CM.

    “It's an impressive accomplishment,” says Frances Arnold, a chemical engineer at the California Institute of Technology in Pasadena who has applied evolution to protein design (Science, 19 August 1994, p. 1032). Re-creating an enzyme's function in a molecule with a different structure, Arnold says, “is a very difficult design problem—and he let nature tell him what the answers are.”

    Redesigning proteins this way may have practical payoffs. For instance, says Eisenberg, therapeutic or industrial proteins cranked out by engineered bacteria sometimes clump into insoluble lumps. “Maybe you could make an alteration that could keep [the protein] as a monomer” that would not stick together as dimers and larger aggregates, he says. And because test tube evolution is a strategy with “very few design constraints,” says Hilvert, “it's possible that we'll find surprises.”


    Animals Thrive in an Avalanche's Wake

    1. Kevin Krajick
    1. Kevin Krajick is a writer in New York City.

    Avalanches are efficient killers, hurtling down mountain slopes with forces of up to 54 tons per square meter, sweeping away trees, wildlife, and humans alike. At least 30 people have died in avalanches this year in the United States and Canada; in the French Alps, another 13 lost their lives at one swipe in late January. But in a series of recent and ongoing studies, biologists are finding that avalanches also have a positive side: The stubbled terrains they create provide key habitat for everything from deer to grizzly bears to wolverines.

    “Just about everything in the mountains keys into [avalanche tracks] in one way or another,” says Rick Mace, a biologist with the Montana Department of Fish, Wildlife, and Parks in Kalispell, Montana. As a result, several agencies are now moving to protect the tracks, which are coming under threat from logging operations.

    Slides boost plant diversity, researchers say, because they first clear the land, gouging out permanent open chutes up to a kilometer long and 100 meters wide through terrain that would otherwise be densely covered by conifers. Indeed, when the snow is gone, forested slopes in avalanche country look as if they have been clawed by giant fingers; in some regions, these chutes cover 20% of the mountainside. But plants and shrubs that can't grow under shady conifers spring up in these sunny chutes, providing important nutrition for large mammals as well as a variety of birds.

    Mountain travelers have in fact long suspected that avalanches boost biological diversity, much as natural fires do. “Just hang out and you'll see a parade of animals go by,” says Matt Besko, a habitat biologist for the British Columbia Ministry of Forests. But most studies have been narrowly focused, done by specialists on certain animals or on plants. Only recently, after realizing that avalanche slide zones are threatened by increased logging, have researchers begun to study their overall ecology and appreciate their central role in mountain ecosystems.

    One of the most extensive of these studies comes from the Columbia Basin Fish and Wildlife Compensation Program (CBFWCP) in Athalmere, British Columbia. For an as-yet-unpublished report, the agency hired a contractor to pull together previous and ongoing research relevant to slides; much of the ongoing research is being sponsored by the CBFWCP, Parks Canada, and the British Columbia Ministry of Forests. Researchers found 350 avalanche paths along a single 32-kilometer creek drainage. And by observing radio-tagged animals from aircraft and on foot for thousands of hours, biologists found that the slide tracks are vital winter habitat.

    Because mountain forests are often buried by 3 meters or more of snow in winter, grazing animals not only can't forage, they flounder when they walk. But avalanches often sweep slide tracks practically clear of snow, so the tops of plants may stick out. Mountain goats and deer frequently trot out to graze; mountain caribou, which subsist on tree lichens in winter, appear to use slides for easier traveling, says Parks Canada warden John Flaa.

    Of course, hanging out in an avalanche track in winter is just as dangerous for animals as it is for people. Since 1988, biologists have outfitted 60 mountain caribou with radio collars; six of them have been killed in avalanches, says Flaa. He estimates that slides are responsible for up to 15% of mountain caribou mortality.

    But death for one animal often means an opportunity for another. In this case, it seems that wolverines, elusive and rarely studied scavengers, are among the winners. By radio tagging and following 41 wolverines for the past 3 years, CBFWCP biologists John Krebs and Dave Lewis found that the animals spend much of their time patrolling avalanche tracks, apparently seeking fresh carrion; the researchers have also found cached carrion near slides. “I'm convinced this is their number one winter food source,” says Krebs.

    Through radio tagging, Krebs also managed to track a few females to their midwinter birthing dens, the locations of which have long been a mystery. The dens lay snugly under snow-covered jumbles of boulders and logs in the so-called runout zones near the bottom of the avalanche paths. Tunnel complexes go in as far as 20 meters—“real fancy places, convenient to food and very protected under all that snow,” says Krebs.

    Slides are crucial habitat in spring, too. The snow covering the tracks melts off long before the snow in the surrounding woods does, and caribou move in to graze on early grasses and forbs. And when grizzlies awake in spring, they mainly eat starchy avalanche-lily roots dug from slide tracks, says Roger Ramcharita, a University of British Columbia doctoral student following radio-tagged bears and analyzing scat for another multiagency study. Tracks make up only 10% of his study area in the Columbia Mountains, he says, but 40% of his spring bear radio locations are in them.

    After studying radio-tagged grizzlies in the Swan Mountains of northwestern Montana for 10 years, Mace came to the same conclusion: Avalanche tracks made up only 5% of his study area, but grizzlies spent 60% of their time there. And it's not just food they're after: Females ready to breed go to the same spots in the same chutes year after year and wait for males to show up, says Mace, who published part of this work with his colleague John Waller in a recent issue of Journal of Wildlife Management. “These sites have historical resonance for bears,” he says.
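    The disproportion that Ramcharita and Mace describe can be summarized with a simple use-versus-availability ratio, the kind of habitat selection index commonly used in wildlife studies. The sketch below is illustrative only, not the analysis from either study; it just plugs in the percentages quoted above:

```python
def selection_ratio(prop_used, prop_available):
    """Use-vs.-availability ratio: values above 1 mean a habitat
    is used more than its share of the landscape would predict."""
    return prop_used / prop_available

# Percentages quoted in the studies above
grizzly_spring = selection_ratio(0.40, 0.10)  # Ramcharita: 40% of locations, 10% of area
grizzly_swan = selection_ratio(0.60, 0.05)    # Mace: 60% of time, 5% of area
print(round(grizzly_spring, 1), round(grizzly_swan, 1))
```

    By this crude measure, avalanche tracks are used roughly 4 to 12 times more heavily than their area alone would predict.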

    Avalanche paths are now attracting newer creatures: loggers. Rising timber prices have made it profitable to cut trees in steep, remote drainages adjacent to slide tracks, particularly in British Columbia. But logging removes nearby cover, without which bears, caribou, and other animals won't venture into a track. Logging roads often cut directly across runout zones, bringing snowmobilers and hunters, says Besko.

    Given the mounting evidence showing the ecological importance of slides, the British Columbia Ministry of Forests has begun requiring buffer zones around tracks and is pushing to close off roads. U.S. Forest Service managers hope for stricter rules, too. “These places are obvious gems,” says Besko. “We have to protect them.”


    Exploring How to Get at—and Eradicate—Hidden HIV

    1. Jon Cohen

    Dedham, Massachusetts—Arthur Ammann, head of the American Foundation for AIDS Research (AmFAR), placed a small placard next to a slide projector sitting in the back of a lecture hall at a small scientific meeting he co-organized 2 weeks ago. “Don't even think about it,” the placard warned the three dozen scientists who attended the weekend gathering in this quaint Massachusetts town.

    Ammann's sign was simply a warning to participants that the meeting was supposed to be a slide-free, discussion-heavy think tank. But 3 years ago, that same sentiment—don't even think about it—would have applied equally well to the topic of the conference* itself: how to eradicate HIV from an infected person. Now, the remarkable success achieved by various combinations of anti-HIV drugs—called highly active antiretroviral therapy, or HAART—has allowed researchers not only to think about whether the last bits of virus can be eliminated from the body, but also to speculate about how it might be done. “Up until now, we've been talking about an ocean of HIV in patients,” said Richard Kornbluth, a researcher at the University of California, San Diego, who specializes in HIV infection of scavenger white blood cells called macrophages. “Now it's like the tide has gone out with HAART.”

    Gray matter.

    There are no black-and-white solutions to ridding the brain of cells like these HIV-producing glial cells.


    Despite this sea change, however, discussions at the meeting made it clear that eliminating HIV from the body may be every bit as difficult as attempting to roll back the ocean itself. The main obstacle: Even in people whose viral load is “undetectable,” the virus continues to lurk in cells throughout the body. It hides out, under the immune system's radar and out of reach of antiviral drugs, in “latently” infected blood cells and in sites such as the brain, eye, and testes.

    “The issue is answering the question: Is eradication possible?” said David Ho, head of the Aaron Diamond AIDS Research Center in New York City. He and others proposed several strategies, including injecting people with immune system messengers that could flush out the virus so that the patient's own immune system could pick it off and looking for ways to sneak drugs past borders such as the one that protects the brain. “Whether you succeed or not,” said Ho, “you're going to learn a lot of science.”

    The hiding place that should be the most vulnerable to current treatments, Ho thinks, is latently infected blood cells, which have the viral genome woven into their own but do not actively produce new copies of the virus. Keeping patients on HAART to hold the virus at bay and waiting for the latently infected cells—which include macrophages and T lymphocytes—to simply die off is one approach. Using mathematical models recently developed by his team, Ho calculates that the macrophages would die off first, leaving the more stubborn T cells around for—at a minimum—5 years. That's a long time to be on HAART, which requires taking dozens of pills a day and also can cause serious side effects.
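    The article does not spell out Ho's model, but its core arithmetic is exponential decay: given a half-life, the time for a population of infected cells to dwindle below a single cell follows directly. Here is a minimal sketch of that calculation; the reservoir size and half-life are hypothetical round numbers, not figures from Ho's study:

```python
import math

def time_to_clear(n0, half_life_days, threshold=1.0):
    """Days for an exponentially decaying population of n0 cells
    to fall below `threshold` cells."""
    decay_rate = math.log(2) / half_life_days
    return math.log(n0 / threshold) / decay_rate

# Hypothetical: 1 million latently infected T cells with a 91-day half-life
days = time_to_clear(1e6, half_life_days=91)
print(f"{days / 365:.1f} years")  # a multiyear timescale, as in the estimates quoted
```

    Even with generous assumptions, the logarithmic dependence on the starting population means the timescale is set mostly by the half-life of the most stubborn cell type, which is why the long-lived T cells dominate the estimate.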

    Other results suggest that Ho's estimates about macrophages may be too optimistic. Donald Mosier, a mouse researcher at The Scripps Research Institute in La Jolla, California, has found that HIV infection somehow lengthens the life-span of macrophages in mice. And at the meeting, Suzanne Crowe from Australia's Macfarlane Burnet Center for Medical Research revealed that she still finds latently infected macrophages in “undetectable” patients who have been on HAART for up to a year. “It may well be that the macrophage is one of the toughest nuts we have to crack when it comes to eradication,” said Crowe (Science, 29 November 1996, p. 1464).

    Battle plans.

    Instead of simply waiting for the latently infected blood cells to die off, some researchers proposed a more active approach. The idea is to compel latently infected cells to produce new copies of HIV, drawing the attention of the immune system so that it can destroy the infected cells. HAART, meanwhile, would theoretically prevent any new HIV that was “flushed out” from infecting virgin cells.

    Latently infected T cells might be flushed, proposed Aaron Diamond's Martin Markowitz, by injecting patients with the mouse antibody OKT3, which specifically tells T cells to copy themselves. Latently infected macrophages might be unmasked, others suggested, with immune system messengers like granulocyte-macrophage colony-stimulating factor (GM-CSF), tumor necrosis factor, and interleukin-12, all of which stimulate macrophage reproduction.

    Harvard immunologist Abul Abbas suggested yet another strategy, aimed at the whole class of quiescent T cells that can harbor latent virus. “Identify stimuli that are particularly good at keeping these memory cells alive and then inhibit them,” said Abbas. He gave the example of interleukin-15 (IL-15), which in mice blocks the “death by neglect” that memory cells usually undergo if they don't receive growth stimulation. Theoretically, then, blocking an IL-15-like substance could speed the death of memory cells.

    On the road to eradication, the last hairpin turn requires stopping HAART. It's possible that if there are still tiny amounts of virus left, the person's immune system can then mop them up, ultimately clearing the infection. But Markowitz and Ho worry that the immune system may not do its part. In several of their patients, HAART has so successfully controlled the infection that they no longer mount strong immune responses against HIV. To try to thwart this resurgence, Ho and Markowitz conclude that it makes sense to boost these patients' immune systems with an HIV vaccine. “Before you stop the drug, wouldn't it be wise to enhance the immune response?” asked Ho.

    John Coffin, a leading retrovirologist from Tufts University, didn't think so, sparking the hottest debate of the meeting. Coffin argued that while such vaccinations might hamper any remaining HIV's attempt to spread, they could not prevent it. “You're not going to eradicate the virus with an immune response,” Coffin contended. “So far that hasn't been good enough to remove the infection from a single individual.” Ho countered with a calculation suggesting that in patients who only have latently infected cells, a few cells—50 to 75—come out of latency each day and produce HIV. Vaccinating, he proposed, would give the immune system enough punch to clear that small number of producing cells and, over time, eradicate the infection.
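    Ho's figure invites a back-of-envelope check: if the immune system cleared every cell that exits latency and no new cells became infected, the reservoir would drain at that daily rate. The reservoir size below is a hypothetical round number chosen for illustration, not a measured value; only the 50-to-75-cells-per-day range comes from Ho's calculation:

```python
def depletion_years(reservoir_cells, exits_per_day):
    """Years to empty a latent reservoir, assuming every cell that
    exits latency is cleared and none are replaced."""
    return reservoir_cells / exits_per_day / 365.0

# Hypothetical reservoir of 100,000 cells; 50 to 75 cells exit latency daily (Ho's range)
for rate in (50, 75):
    print(f"{rate}/day -> {depletion_years(1e5, rate):.1f} years")
```

    Under these assumptions the reservoir empties in a few years, which is why Ho argues that a vaccine boost able to mop up such a trickle of producing cells could, over time, finish the job.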

    But several researchers suggested that complete eradication of HIV from the blood may be the wrong target in any case. Instead, they asked, why not enlist the immune system—rather than drugs—to control the virus for extended periods? That goal, Scripps's Mosier proposed, could be achieved by cycling patients with both undetectable viral loads and waning HIV immunity off therapy, and then cycling them back on when their virus reemerges. Several studies, he noted, have shown that such “treatment holidays” do not allow the virus to develop resistance to the drugs. And this repeated exposure to the virus would keep boosting immune responses to it. “It surprises me that you're even actually thinking about vaccines [to boost immunity] when you have got the natural infection of immunized people to begin with,” said Mosier.

    Brain drain.

    Even if HIV can be eliminated from the blood, however, it might not be possible to rid the body of it altogether because of the virus's propensity for skulking around the central nervous system. “I'm very worried that there's a viral reservoir in the brain,” said Stuart Lipton, who studies HIV neuropathogenesis at Harvard's Brigham and Women's Hospital. There's no accurate way to measure levels of HIV in the brain, where it can stimulate the production of toxins that produce dementia. But it seems to be protected there by the so-called “blood-brain barrier,” which keeps both drugs and immune cells that might fight the virus out of the brain.

    Anti-HIV drugs that do reach the brain are a double-edged sword, warned immunologist Angela McLean of Oxford University in the United Kingdom. Only low concentrations of the drugs likely make it past the blood-brain barrier, she noted, and “suboptimal” levels of anti-HIV drugs allow the virus to develop resistance to the agents. “The last thing you want is intermediate drug levels,” cautioned McLean.

    To make matters worse, the brain is but one “sanctuary site” for HIV. Other areas of the body that have similar barriers include the testes, the eye, the thymus, and the spinal cord. And even less is known about HAART's impact on HIV in these sites. “The sanctuary issue is still up in the air,” said Ho.

    The think tank participants proposed a practical way to learn much more about the impact of anti-HIV drugs on these remote compartments: autopsies of relatively healthy, HIV-infected people. Although many autopsies have been done on people who have died from AIDS, by definition they have failed treatment and their bodies hold no clues to how HAART affects viral levels in the brain and other hard-to-access sites. Retrovirologist Ashley Haase of the University of Minnesota Medical School in Minneapolis suggested that researchers make a concerted effort to find HIV-infected people who have died from car accidents and other traumas. “One thing that might come out of this meeting is the need to develop such a tissue bank,” said Haase.

    In the end, researchers could not agree on whether eradication is possible, but they are clearly nowhere near done thinking about it.

    *Cellular and Systemic Reservoirs for HIV Replication Under Highly Active Antiretroviral Therapy, Massachusetts Institute of Technology's Endicott House, organized by AmFAR and Treatment Action Group, 27 February-1 March.


    Getting an Inside Look at Cells' Chemistry

    1. Kevin Boyd
    1. Kevin Boyd is a free-lance science writer in San Francisco.

    New microscopic sensors can size up the chemical status of a living cell. Unveiled early this month in New Orleans at the Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy by a group at the University of Michigan, Ann Arbor, the sensors fit inside the cell and monitor its changing chemistry on the spot. Among other things, they could help researchers understand how an individual cell responds to a dose of drugs or toxins.

    Cells constantly adjust their levels of ions such as calcium and potassium in response to external stimuli. Biologists usually measure these changes by injecting dyes into a cell or poking it with electrodes. But Michigan chemist Raoul Kopelman and his colleagues wanted to find a less invasive method.

    The key turned out to be a class of molecules, called ionophores, which normally latch onto positively charged ions and help pull them across the cell membrane. Kopelman and his grad student Heather Clark packaged ionophores specific for different ions into the surfaces of particles less than 100 nanometers across, made of a plasticlike polymer. As each ionophore sucks up its target positive ion, it compensates by releasing positively charged protons, which activate a fluorescent dye embedded in the particle.

    To test their minisensors, dubbed PEBBLEs (Probes Encapsulated By BioListic Embedding), the researchers injected them into brain cells or mouse eggs with a “gene gun,” which is normally used to shoot DNA into a cell via a puff of helium. The brightness of the PEBBLEs' glow revealed the concentrations of the ions they were tailored to detect. Cells don't seem to mind the PEBBLEs, says Kopelman, probably because they take up little space—each occupies only a millionth of an average cell's volume.
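
    The millionth-of-a-cell figure is easy to sanity-check. The diameters below are assumptions for illustration (a 100-nanometer particle, the upper size the article cites, inside a roughly 10-micrometer cell), not measurements from the researchers:

    ```python
    import math

    def sphere_volume(diameter_m):
        """Volume of a sphere given its diameter, in cubic meters."""
        r = diameter_m / 2
        return (4 / 3) * math.pi * r**3

    pebble = sphere_volume(100e-9)  # assumed 100 nm PEBBLE
    cell = sphere_volume(10e-6)     # assumed 10 um cell

    ratio = pebble / cell
    print(f"PEBBLE/cell volume ratio ~ {ratio:.0e}")
    ```

    Because volume scales with the cube of diameter, a 100-fold difference in size yields a million-fold difference in volume, consistent with Kopelman's claim.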

    “I think it's beautiful,” says Stephen Weber, a chemist at the University of Pittsburgh. He notes that because PEBBLEs are used with a microscope, researchers can determine the exact three-dimensional position of each sensor as well as its readout. Says Kopelman: “You can now follow chemistry on a single-cell level and see how any change affects it within seconds, minutes, or hours.”
