News this Week

Science  04 Jul 1997:
Vol. 277, Issue 5322, pp. 24
  1. SHARING MATERIALS

    The Mouse That Prompted a Roar

    1. Eliot Marshall

    Restrictions on sharing mice engineered with DuPont's patented technology have drawn protests from prominent researchers; some organizations, including HHMI, have accepted the conditions

    Jamey Marth doesn't like to ask permission to send a colleague a research tool refined in his own lab. But the University of California, San Diego, geneticist no longer freely gives away a type of mouse that's prized for its power to reveal gene function. He waits until he gets approval from E. I. du Pont de Nemours and Co. of Wilmington, Delaware. As a result, he says, he has been forced to make “heart-wrenching” decisions to withhold the animals from some researchers.

    Marth finds himself in this predicament because DuPont holds a patent on a powerful method of manipulating genes in the mouse. The company doesn't want transgenic mice created with this technology, called Cre-loxP, handed around loosely from one lab to another. So it insists that researchers using the mice acknowledge the company's rights to the animals, share any money that may be made on discoveries from the technology, and distribute the animals only to other researchers whose institutions have agreed to these terms. (DuPont also has an exclusive license to distribute the “Harvard oncomouse,” a tumor-prone animal valued in cancer research, and it is trying to control its use as well.)

    The Cre-loxP technique isn't the only basic research tool whose use is restricted by patent rights (see sidebar). But it has become a lightning rod for scientists chafing at restrictions on the free flow of research materials. Harold Varmus, director of the National Institutes of Health (NIH), has sent a letter to DuPont protesting the company's policy. It can be “an incredible burden for the individual investigator” to comply with the administrative requirements, Varmus believes, and he worries that the legal fallout will “slow things down, make research unattractive, and turn people off.” Varmus has established a panel to look into this and other restrictions on sharing materials. The issue is also coming to a head because the largest breeder and distributor of lab animals in the United States—The Jackson Laboratory of Bar Harbor, Maine—has declined to sign an agreement with DuPont and is not distributing any Cre-loxP mice, making the animals hard to come by.

    DuPont licensing executive Robert Gruetzmacher says the company has no desire to hamstring basic researchers. “Our philosophy is: Let's make it as easy as practical for the researchers to use [DuPont's patented technology] for research, but gosh, if they go beyond the research and get into a commercial mode, let's see if we can't capture some of that fairly.”

    Inventing a better mouse

    The technology at the center of this battle was not always so popular. In fact, the inventor of record—geneticist Brian Sauer, a former DuPont employee who is now a staffer at NIH's National Institute of Diabetes and Digestive and Kidney Diseases—says he got little response when he first presented his Cre-loxP system in public at a biotech poster session in 1985 in San Francisco. “It was the kind of thing where you stand around for a long time. … Not many even stopped by.”

    Molecular scissors.

    One mouse expresses the Cre enzyme in selected tissues; the mate carries a targeted gene flanked by loxP markers. In offspring, cells expressing Cre delete the targeted gene.

    Sauer's technique adapts a natural gene-splicing system from a bacteriophage—a virus that infects bacteria—for use in cells of complex organisms (eukaryotes). It is based on two genetic elements of the P1 bacteriophage: a gene called cre that expresses an enzyme not normally seen in higher organisms, and a stretch of DNA called loxP. They work together like a powerful editing machine. When Cre encounters two loxP sites in a stretch of genetic code, it clips out the intervening DNA, along with one of the loxP sites, reattaching the ends to make a seamless strand.
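
    The editing logic maps neatly onto a few lines of code. The sketch below is purely illustrative and is not from the article: it models a DNA construct as a string, uses the canonical 34-base-pair loxP sequence, and stands in a hypothetical placeholder for the target gene.

        # Toy model of Cre-mediated excision between two loxP sites (illustration only).
        LOXP = "ATAACTTCGTATAATGTATGCTATACGAAGTTAT"  # canonical 34-bp loxP site

        def cre_excise(dna: str) -> str:
            """Clip out the segment between the first two loxP sites,
            leaving a single loxP site behind, as the Cre enzyme does."""
            first = dna.find(LOXP)
            second = dna.find(LOXP, first + len(LOXP))
            if first == -1 or second == -1:
                return dna  # fewer than two loxP sites: nothing to excise
            return dna[:first + len(LOXP)] + dna[second + len(LOXP):]

        # Hypothetical construct: upstream DNA, loxP, target gene, loxP, downstream DNA.
        construct = "GGGCCC" + LOXP + "ATGAAACCCTTTTAA" + LOXP + "AAATTT"
        print(cre_excise(construct))  # prints upstream + a single loxP + downstream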

    Sauer says he already had the idea of turning this editing technique into a kind of molecular scissors when he arrived at DuPont in 1984. He reasoned that if loxP signals were inserted on either side of a target gene and the DNA were exposed to the Cre enzyme, the target gene would be snipped out. The system worked in eukaryotic cells and even in mice. Sauer and DuPont filed for a patent on using Cre-loxP to modify DNA in eukaryotic cells. It was granted in 1990.

    Since then, Cre-loxP mouse technology has taken off and—according to independent scientists—been significantly improved in taxpayer-funded labs. Among those who have used and improved the system are Marth, Klaus Rajewsky and Werner Müller at the University of Cologne in Germany, Heiner Westphal at NIH, Susumu Tonegawa and colleagues at the Massachusetts Institute of Technology, and others (Science, 1 July 1994, p. 26). The system's main value is in creating “conditional mutants,” mice in which a specific gene is bracketed for deletion in particular cells producing the Cre enzyme. The technique is also being used to create a variety of other genetically engineered mice, including straightforward “knockouts.”

    Rajewsky says that published papers do not reflect the growing importance of the technology. “It takes a long time to breed and to analyze the conditional mutants,” he says, and results are just coming out. Rajewsky reports that the Volkswagen Foundation is sponsoring a new program on conditional mutagenesis in Germany. At its inaugural scientific meeting last week in Cologne, 15 of 30 attending groups said they are using Cre-loxP technology, according to Rajewsky. “The interest is huge,” says Arthur Beaudet of the Baylor College of Medicine in Houston, who chairs NIH's mammalian genetics peer-review section.

    Thomas Caskey, genetics chief at Merck & Co. in Whitehouse Station, New Jersey, says, “I think this concept is going to have extremely broad applications in trying to understand gene function.” He foresees a system in which researchers could draw from a shared library of mice with cre expressed in a variety of cells. Investigators could develop their own animals with loxP targeted genes and breed the two lines to get Cre-loxP offspring with genes inactivated in specific tissues. According to Caskey, Merck is entertaining proposals right now to create a library of cre mice for distribution to researchers. But, of course, anyone who wanted to participate would have to come to an agreement with DuPont—and that may take some negotiating.

    Resistance at NIH

    DuPont is trying to control the technology through no-cost “research licenses.” Institutions that sign up agree that their researchers will share the mouse only with other licensees, and they may be asked to pay unspecified royalties on commercial discoveries enabled by the Cre-loxP system. Commercial outfits must negotiate their own, expensive licenses, which can run to more than $100,000.

    DuPont says some 70 institutions have agreed to sign research licenses. They include the Howard Hughes Medical Institute (HHMI) of Chevy Chase, Maryland. All HHMI investigators—who are employees of the institute—must therefore abide by DuPont's rules. This includes Marth, an HHMI investigator since 1995. Maxwell Cowan, HHMI's chief scientific officer, explains, “We felt [signing the agreement] was the right thing to do. … We had a number of investigators who were using the technology and had obtained animals prepared with that technology from others.” DuPont holds a valid patent, Cowan says, and has the right to enforce it. The only other option was to instruct investigators not to use Cre-loxP mice—calling a halt to many research projects. Accepting DuPont's terms seemed the wise thing to do, says Cowan, adding: “We have a difference of opinion with Harold Varmus on that.”

    The NIH has not signed an agreement with DuPont, but has “friendly negotiations” under way, says Varmus. DuPont is allowing NIH researchers to continue using animals made available to them by Sauer years ago, before the company began trying to license all nonprofit institutions. But Varmus protested the company's license conditions in a 28 March letter to DuPont's president, John Krol. The restrictions, Varmus wrote, “will seriously impede further basic research and thwart the development of future technologies that will benefit the public.” Varmus said in an interview that he is just as upset when university or NIH scientists try to patent research tools: “There are investigators here who would like to seek intellectual protections for everything they do, and I don't find it very appealing.”

    Varmus, who uses transgenic mice in his own lab, has scheduled a meeting in late July with DuPont executives to discuss the issue. In addition, he's creating a small panel—including two experts in gene patenting, Rebecca Eisenberg of the University of Michigan, Ann Arbor, and John Barton of Stanford University—to advise him on how NIH should respond to the threat of “reach through” provisions in sharing agreements in which a company or researcher lays claim to discoveries not yet made.

    Many U.S. and European researchers are also said to be quietly objecting to DuPont's terms. Bruce Alberts, president of the U.S. National Academy of Sciences, recently singled out restrictions on Cre-loxP technology in a statement on commercial barriers to basic research. Cologne's Rajewsky says he finds the strings attached to DuPont's license “much too strong for a basic technology.” One mutinous researcher even admits to simply ignoring the rules and sharing mice with trusted peers. But the most important outsider is the Jackson Lab.

    Jackson Lab and DuPont have been at a standoff in negotiations on Cre-loxP for 2 years. Kenneth Paigen, director of the Jackson Lab, says DuPont's terms have not been accepted because they would burden the lab and its clients with too many legal constraints. The result: The public distributor for the biomedical community neither accepts nor sends out Cre-loxP mice. Research is suffering, some researchers say. “The most serious practical problem we have at the moment,” says Rajewsky, is that these mice “cannot be distributed by commercial mouse breeders like Jackson Lab,” making it hard for the growing numbers of interested researchers to get animals. Beaudet, who says “NIH is making a huge investment in developing these mice,” says the stalled talks between Jackson Lab and DuPont “could have broad implications” for biomedicine.

    There are signs that DuPont may be willing to compromise. Gruetzmacher, DuPont's licensing chief, says, “We continue to modify our license to make it better. We are learning to make concessions to make it work.” Many researchers are hoping the meeting between Varmus and DuPont later this summer will bring the stalemate to an end. Varmus is optimistic, but he has bigger objectives in mind. “The community really needs to rethink what a patent is for,” Varmus says. He expects to explore that question in broad discussions this year.

  2. SHARING MATERIALS

    Battling Over Basics

    1. Eliot Marshall

    The clash over who may and may not use mice that have been genetically engineered with a patented technology (see main text) is just the latest skirmish in a decade-long battle over commercial controls on basic tools in biomedical research.

    The most prominent case arose in the late 1980s, when biologists were steamed about controls on the polymerase chain reaction (PCR), a method of amplifying DNA sequences. It is now accessible to just about every lab doing DNA research, but basic scientists once feared that licensing fees might put it out of reach. The Cetus Corp., the original owner of the patents on PCR and its key reagent, Taq polymerase, initially tried to get all users to take out licenses. But many balked, some complained about the high fees, and a few threatened a boycott. When the furor was at its peak in 1991, Cetus sold its PCR rights to Hoffmann-La Roche, the Swiss pharmaceutical company. Roche set up a multicategory licensing system with special terms. Although Roche announced that it would not pursue people who were doing pure science, it has kept tabs on “infringers” who do not take out a license for use of Taq. Roche also claims in a lawsuit due to come to trial soon that researchers can be compelled to obtain such licenses.

    Soon after the PCR flap died down, another battle flared up. The issue: who should own rights to fragments of human genes called expressed sequence tags (ESTs). Industrial DNA sequencing outfits are seeking patents on thousands of ESTs, even those with poorly understood biological function. One company—Human Genome Sciences of Rockville, Maryland—also offers researchers access to its proprietary EST database, provided they sign a restrictive license agreement and share rights to future discoveries. National Institutes of Health director Harold Varmus and Bruce Alberts, president of the National Academy of Sciences, recently asked the U.S. Patent and Trademark Office not to grant patents on ESTs (see p. 41). So far, only a handful of ESTs have been covered in patents, although thousands are awaiting review.

  3. POLITICAL SCIENCE

    NSF Candidate Study Irks Congress

    1. Jeffrey Mervis

    Members of the House of Representatives, who face reelection every 2 years, have an intense interest in what motivates potential candidates to run for office. But some are not eager to have social scientists looking into the question. At least 70 members have complained that the research, a $175,000 study funded by the National Science Foundation (NSF) to learn why many qualified people don't enter the fray, is a waste of money, and there have been calls for two separate investigations.

    Welcome to the latest flap involving NSF's social and behavioral sciences directorate. Two years ago, the directorate survived an attempt by retired Representative Robert Walker (R–PA), then chair of the House Science Committee, to kill it. Now, NSF officials find themselves in the uncomfortable position of defending a social science project—and by extension the peer-review process that funded it—to skeptical members of Congress at the same time NSF's annual budget is under consideration.

    In the hot seat are political scientists L. Sandy Maisel of Colby College in Waterville, Maine, and Walter Stone of the University of Colorado, Boulder. The duo began their study last year by asking leading political figures in 200 randomly selected districts to name persons well qualified to run for Congress. This fall, the researchers will send out a questionnaire asking those would-be candidates about all the factors affecting their decision to run or to stay on the sidelines. Maisel says he believes the study “could tell us a lot about how a democracy works,” in particular the factors that discourage such persons from becoming candidates.

    But that's not how Representative William Clay (D–MO) sees it. Clay, a 15-term House veteran and the most outspoken critic of the study, has tried unsuccessfully to obtain the names of the districts being sampled (almost half the total of 435) to see if they are, indeed, random. But his main concern, he says, is that there are many more deserving problems facing the nation. “There is never any shortage of good and qualified people [in my district] who feel they could serve in Congress,” he declared in a recent press release. “One thing we, as Americans, have never been short of is politicians running for office.” Some 70 House colleagues have expressed their opposition to the NSF study by adding their names to a letter to the editor that Clay sent to two newspapers that have covered the controversy.

    In its own letter to every House member, NSF defends the study and explains the rationale behind it, the methodology, and the rigorous selection process by which the grant was awarded. “I expect you may have a natural interest in [this study]—it's your field, it's what you do,” writes Bennett Bertenthal, head of the directorate. “[But] the general objective of this study is straightforward. The researchers are interested in understanding the reasons individuals do not run for office.”

    Maisel says he's glad that NSF “has stood up” to Clay's attack, which he characterizes as an attempt to demonstrate that “he's more qualified to judge the quality of a research proposal than a peer-review panel.” At the same time, Maisel doesn't think the fight is really about the quality of the research. “We've tried to explain that it's not an attempt to find candidates, nor does it pose a threat to any incumbent,” says Maisel. “We've also tried to explain why confidentiality is so important. But [Clay] doesn't seem to want to hear any of that.” Adds Maisel: “I think this episode shows that members are concerned about their electability, not about the science that NSF is funding.”

    Clay has asked the General Accounting Office, the congressional watchdog agency, to investigate the award. And last week, the House spending panel that oversees NSF (see p. 28) approved language asking NSF's inspector general to look into whether the researchers are following their protocol in carrying out the study. Those investigations may give the combatants a cooling-off period. But any truce could be short-lived. A floor vote on the NSF bill later this month will give House critics another forum to air any remaining concerns.

  4. SPACE STATION

    Accident Clouds U.S. Future on Mir

    1. Andrew Lawler

    Supporters of the international space station are waiting to see if the gash ripped in the Mir space station last week will wreck more than just the science module that served as living quarters and laboratory for U.S. astronauts. Lawmakers on Capitol Hill are pressuring NASA Administrator Dan Goldin to cancel plans for further long-term stays by astronauts on Mir until the agency certifies that the Russian station meets or exceeds U.S. safety standards. Such a step could jeopardize U.S.-Russian collaboration, including the scientific and technical experience that NASA officials say is an important element in assembling and working on the space station.

    The accident, in which a Russian cargo module rammed into the Spektr science module while being guided remotely by a Mir cosmonaut, has put NASA in an awkward situation. While the agency is responsible for the safety of its astronauts, the shuttle-Mir program is the cornerstone of U.S. and Russian space cooperation. It serves both as a barometer of goodwill between the two nations and as a mechanism for scientific exchange. The impact of a U.S. withdrawal, says Marcia Smith, an analyst with the Congressional Research Service, “would depend on how gracefully it's done.”

    Damage control.

    Collision between a cargo ship and the Spektr module could weaken U.S.–Russian space ties.

    SOURCE: NASA

    The immediate issue for NASA is how to react to language in its 1998 authorization bill, passed by the House of Representatives and pending in the Senate, that calls for the safety certification. The measure is the brainchild of Representative James Sensenbrenner (R–WI), who chairs the House Science Committee and has been a relentless critic of U.S. cooperation with Russia in space. Sensenbrenner's concerns have been fed by Russia's slow pace in funding its portion of the international space station—the first module of which is slated for launch in just 1 year—as well as a fire and assorted other technical problems aboard the 11-year-old Mir.

    “I, for one, can no longer sit idly by as mishap after mishap occur while we continue to plan the next shuttle mission to Mir hoping for, but not really expecting, the mission will succeed without a potentially life-threatening situation,” Sensenbrenner said hours after the accident. Sensenbrenner demanded that Goldin immediately launch an independent review of Mir safety and complete it before the next U.S. crew arrives at the station in September to relieve U.S. astronaut Mike Foale. Sensenbrenner told the NASA chief in a private meeting last week that he wants the agency to abide by the certification measure, even though it is not yet law.

    One Administration official downplayed the need for an independent review. “NASA already has experts working on this,” he says. In the meantime, “we're committed to our space partnership with Russia.”

    But if NASA does conclude that safety standards are not up to par on Mir and forbids U.S. astronauts to work on the station, it would end what Administration officials say is a critical effort to conduct a host of experiments—from biology to engineering to materials science—in preparation for the international station. “The experience has been difficult, but we're learning from it,” says the Administration official. The current mission is the sixth of the nine U.S. visits scheduled.

    For now, however, Foale will have little science to conduct. Much of the U.S. equipment—including a protein-crystal experiment and some biological devices—is in Spektr, which is sealed off until cosmonauts can patch the hole and repressurize the chamber. The first priority, however, is to fix the damaged power system; a space walk has been scheduled for next week to examine Spektr from the outside.

    It's not clear whether the equipment is still in working order after being exposed to the vacuum and cold of space. The module, which was attached to Mir in 1995, also contains Russian geophysical and remote-sensing equipment. But the larger question is whether Mir can continue to serve as an experiment for cooperation among the former superpower rivals.

  5. ITALY

    New Rules Provoke Scramble for Funds

    1. Susan Biggin
    1. Susan Biggin is a science writer in Venice.

    VENICE—Italian university researchers are in a state of near panic this month following the mid-June call for grant proposals under the government's new scheme for funding university research collaborations. Not only are the deadlines tight—applications are due by the end of July—but the ministry for universities and research (MURST) has instituted stringent new requirements for receiving funds. Any joint research between separate groups within a university or groups at different universities must get up to 60% of its funding from a separate source, public or private, to qualify for the new MURST grants.

    Across the country, there is now a mad rush to find academic partners with the right funding connections. Already, some researchers are criticizing the new system. Physicist Giorgio Benedek of Milan University says he and his colleagues are still trying to make sense of the new rules. Earth scientist Claudio Eva of Genoa University fears that “the rich will get richer, the poor poorer.” Finding partners will be impossible in some cases, particularly in the humanities, he believes, and “many [researchers] are not even going to try.” Eva thinks that the new system “is a way to level down, to kill off scientific research, especially frontier, innovative, individual, new directions,” because the rules tend to favor large, well-organized collaborations.

    Since 1993, MURST has channeled its funding for university research into two streams. Of the $150 million annual budget, 60% has gone straight to the universities to distribute to their own researchers; the remaining 40% ($60 million) has been awarded as grants to collaborative projects. Fourteen subject committees, elected by researchers, awarded the grants without peer review—a system many believed was open to abuse, because committee members could have a personal interest in individual projects. Last month, however, Science Minister Luigi Berlinguer announced a thorough overhaul of the system.

    Berlinguer upped the collaborative project budget to $90 million, to include additional funding for large items of equipment. He also replaced the subject committees with a single panel of five senior academics (see table), who will decide each application based on the reports of two referees. According to Carlo Calandra, a member of the new grants committee and president of the National Institute for Physics of Matter in Genoa, the only criterion for selection will be scientific quality. The emphasis, he adds, will be on basic research, with no particular requirement that proposals be innovative or have immediate applications.

    Berlinguer also introduced the new requirement of cofunding: Intrauniversity applicants must have 60% of their funding up front, while interuniversity collaborations must secure 40%. Cofunding will allow the support of “more significant and expensive projects,” rather than the many smaller projects of the past, says a MURST spokesperson. Franco Jovane, also on the new grants committee and director of the National Research Council's Institute of Industrial Technologies and Automation in Milan, told Science that the new mechanism will allow university scientists to pursue “free research … driving the entire research system ahead.”

    In spite of the challenge researchers may face in finding partners, the grants committee is likely to receive a mountain of proposals from the nation's 60 universities, from which it will make its selection by the autumn. “Every individual research group will be considering an application,” says Gianni Orlandi, dean of engineering at University “La Sapienza” in Rome. But Calandra believes that if the committee is stringent this year, there will be fewer proposals in the future. “The best improvement is … that we will be selecting only a few, specific proposals and financing them completely. In the past, there was the tendency to finance everything,” he says.

  6. U.S. SCIENCE POLICY

    House Study Tackles New Era in R&D

    1. Andrew Lawler

    As a physicist, Representative Vern Ehlers (R–MI) is comfortable working with theoretical models. But as vice chair of the House Science Committee, he lives in the realm of the practical. Over the next year, Ehlers intends to tap both worlds as he carries out a request by House Republican leaders to formulate a new policy on U.S. support for science that will serve the community and satisfy both Congress and the White House.

    Ehlers is a rare breed: a scientist-turned-politician. He has a Ph.D. in nuclear physics and taught at Calvin College in Michigan before being elected to the state legislature in 1985 and the U.S. Congress in 1994. Last week, he outlined his plans for the science policy study before a meeting of the Science Coalition, a group of university and industry science advocates, whose members offered cautious support for his effort. Observers say it is too early in the process to predict whether the new study will leave a more lasting mark on policy than previous reports.

    The study hopes to come to grips with the impact on science of the end of the Cold War and the grim reality of budget constraints, according to Ehlers. “We no longer can say that we must do research or the Russians will get ahead of us,” he noted. At the same time, he said, the fragile ties between basic and applied research need to be strengthened, large international projects require better coordination, and scientists must take a much more active political role: “We need a fresh look at the role of research in society.”

    Ehlers says he wants to encourage more partnerships between the federal government and states, industry, universities, and foreign nations. He also cited the Department of Agriculture's cooperative extension services as a model for turning discoveries into products that benefit society. “We need that for other sciences as well … to diffuse knowledge more rapidly.”

    Ehlers hopes to keep his study short and concise. He is struggling to win funding for only one full-time staff member and will hold a minimum of public events. “We've had enough hearings [on this topic] in the last decade,” he said, referring to an effort by the House Science Committee in 1985–86 that conducted nearly 20 hearings and generated a dozen papers but never a final report. “I'm deliberately trying to keep it simple.”

    The first step, said Ehlers, will be a roundtable—likely in September—bringing together 40 or so “fairly high-level scientists” to develop ideas and formulate an agenda for the study. Their work would be circulated and then vetted at a public hearing in an effort to win the backing of political leaders and the scientific community. “It's not going to have any impact unless Congress and the Administration buy into it,” he added.

    Ehlers asked for help in making a case for the value of continued federal support for science. “Scientists have to be more active,” he said. “They can't be priests of a cult who descend from a mountain [periodically] and say, ‘I need money.’” He noted that it took researchers years to discover that the Internet was not only a way for scientists to communicate but also a tool to talk with the public and Congress.

    Science Coalition members say they welcome the new study and wish him well. “It's a daunting task, but an essential one,” says Jack Crowley, lobbyist for the Massachusetts Institute of Technology, who has helped to organize the Science Coalition. Democrats, meanwhile, are keeping close tabs on the effort. Representative George Brown (D–CA), ranking minority member of the Science Committee, says the review could help shape the debate over R&D spending and that the study “ought to be bipartisan.” Brown says he plans to suggest topics that Ehlers might address.

  7. APPROPRIATIONS BILL

    House Panel Boosts NSF, NASA, EPA

    1. Andrew Lawler,
    2. Jeffrey Mervis

    The first major science spending bill for 1998 to start moving through Congress contains good news for researchers. But federal officials and lobbyists are cautioning that funding levels approved last week by a House subcommittee for NASA, the National Science Foundation (NSF), the Environmental Protection Agency (EPA), and other agencies could be reduced as the bill grinds its way through the congressional mill. “It's wonderful, but we'll see if it lasts,” says Howard Silver, chair of the Coalition for NSF Funding, an advocacy group. “I think there are probably a lot of people taking credit for this mark.”

    The appropriations bill, for the fiscal year that begins on 1 October, would increase NSF's overall budget by 6.6%, to $3.49 billion. That's $120 million more than the president's request, and close to a 7% target set by a coalition of scientific societies (Science, 21 February, p. 1055). The lion's share of that added boost is a $90 million allocation to build a new South Pole research station that would replace a deteriorating 20-year-old facility. The research account would rise by 4.3%, to $2.54 billion; education activities would go up by 2%, to $633 million; and two new facilities—a millimeter array and a polar-cap observatory—would move ahead as planned.

    For NASA, the subcommittee provided $148 million more than the $13.5 billion presidential request, although the figure falls short of the agency's current budget. Two-thirds of the added funds would go to cover unanticipated increases in space station costs caused by Russian delays in building station hardware. The panel also said NASA could move another $150 million from other agency accounts into the space station budget if necessary. The remaining $48 million of the increase above the request would be spread out among a host of science, aeronautics, and technology programs, including NASA's new Space Biomedical Institute in Houston and the National Space Grant Colleges and Fellowships program.

    The science and technology account at EPA also fared well, receiving $656 million—$41 million above the president's request. That includes $35 million for research on the health effects of particulate matter—which will more than double what EPA now spends—and $5 million more for ozone research, both to support new air-quality standards.

    For all three agencies, the tougher fight will be in the Senate, where legislators will have almost half a billion dollars less to spend on the same bill, which also funds politically popular housing and veterans' programs. The appropriators “are making us temporarily cheerful, but in the long run, we may have more to cry about,” says Representative George Brown (D–CA), the ranking minority member of the House Science Committee.

    The House Appropriations Committee is scheduled to take up the bill next week, with the Senate panel expected to swing into action later this month.

    With additional reporting by Jocelyn Kaiser.

  8. PUBLIC HEALTH

    Magnetic Field-Cancer Link: Will It Rest in Peace?

    1. Gary Taubes

    It could be the obituary. For 18 years, researchers have explored the question of whether low-frequency electromagnetic fields (EMFs) can cause cancer. But they have neither come up with solid support for the hypothesis, nor managed to put it to rest. This week, a team led by epidemiologists Martha Linet of the National Cancer Institute (NCI) and Leslie Robison of the University of Minnesota, Minneapolis, reported the results of the most carefully controlled study yet: a $5 million, 5-year investigation into the possible link between magnetic-field exposure and childhood leukemia. “The results are very clear,” says Robison. “They're negative.”

    The study, which appeared in the 3 July issue of The New England Journal of Medicine (NEJM), could mark the end of a trail that researchers began following in 1979, with the first suggestions of a link between residential exposure to high EMFs and childhood leukemia. Since then, a series of studies has turned up vague epidemiologic associations between EMFs from various sources and everything from leukemia to breast cancer to brain cancer. A series of expert reviews—the latest being a National Research Council (NRC) study of residential EMFs released last year—has concluded that there is little evidence to support most of these claims (Science, 8 November 1996, p. 910). But a weak positive association between childhood leukemia and certain home wiring configurations has refused to go away.

    No need to worry?

    How various home magnetic-field levels (left, in microtesla) and wiring configurations (right) affect the risk of childhood leukemia.

    SOURCE: NEJM

    With the new study, it might—at least for most scientists. “The study will be less easily criticized than previous studies, simply because it was conducted so carefully,” says Lawrence Fischer, a toxicologist at Michigan State University in East Lansing and chair of the study's advisory group. David Savitz of the University of North Carolina, a member of the NRC panel and the author of a study that had linked wiring configurations to childhood leukemia, describes the study as “quite impressive … a massive undertaking.” He says, “This will certainly be interpreted as a negative bottom line.” The researchers measured residential EMFs directly as well as inferring them from wiring codes, the strategy that had led to positive results in the past. They also enrolled childhood leukemia cases immediately after diagnosis rather than years later, when EMF exposure at the time of the disease's onset may be hard to assess.

    The study was launched at the behest of Congress in the late 1980s. While still in the planning stage, says Linet, the NCI received a proposal from Robison and his colleagues to take a comprehensive look at risk factors for acute lymphoblastic leukemia (ALL), which accounts for 80% of all childhood leukemias. Robison chairs the epidemiology and cancer-control strategy group for the Children's Cancer Group (CCG), a consortium of over 100 institutions that identify and treat about half the children with cancer in the United States and pool their cases for study. Linet and her NCI colleagues realized that by piggybacking an EMF study on Robison's project, they could take advantage of the CCG's infrastructure and cases.

    The other advantage of working with the CCG was that the researchers would be notified of eligible cases within several days of diagnosis. “So interviews and contacting families could be done in a relatively expedient way,” says Robison. The EMF subgroup of the ALL study eventually collected 638 children with childhood leukemia and 620 matched controls in nine midwestern and mid-Atlantic states and assessed their EMF exposures.

    In doing so, they hoped to avoid some of the weaknesses of previous studies. Three U.S. studies that had reported an association between childhood leukemia and EMFs had relied on local wiring configurations—in particular, the thickness of the wires going into the houses and the distance from the houses to nearby power lines—as a surrogate for actual field strengths. Those studies, as well as others, had found no cancer-EMF link when they actually measured the field strengths inside the children's houses. But these direct measurements were open to criticism because they were made years after diagnosis.

    To learn the best way of assessing residential EMF exposure, the NCI team did a preliminary study in which they had 30 children wear 24-hour dosimeters “about the size of a beeper,” says Linet. “[They were] put on children in pouches tied with thick plastic ties that the kids couldn't undo.” The dosimeters monitored the children's exposure to EMFs in their homes, schools, and day-care centers—data that guided the team in designing its measurements. “We wanted to make sure that area measurements would reflect children's actual personal exposures,” says Linet. During the study itself, the NCI team measured field strengths in each child's bedroom over 24 hours and made spot measurements in the kitchen, the family room, and the room where the mother had slept during her pregnancy.

    Comparing the EMF exposure of the cases and controls, the researchers found no association between an increased risk of childhood leukemia and magnetic fields of 0.2 microtesla (µT) or more, which were the levels that previous investigators had associated with the cancer. The researchers did find what Linet calls a “hint” of an association in homes with field strengths of 0.4 to 0.499 µT—a result Savitz calls “not flatly negative.” But the numbers of cases and controls at those field strengths were small—just 14 and 5, respectively. “The study was not designed to address the question of leukemia risk … at the very highest EMF levels,” says Robison. Above 0.5 µT, in any case, the hint vanished.

    Perhaps most important, points out Harvard School of Public Health epidemiologist Dimitrios Trichopoulos, the study also looked for a link between wiring configuration—the EMF proxy that previous positive studies had relied on—and the risk of ALL. While the wiring codes turned out to be reasonably good estimators of actual EMF exposure, they showed no association whatsoever to ALL—“Nada,” says Trichopoulos. “Flatly negative,” agrees Savitz.

    The results, NEJM Deputy Editor Edward Campion says in an accompanying editorial, suggest it is time to “stop wasting our research resources” on the EMF-cancer question. Those familiar with the emotional tenor of the issue, however, are not sanguine that the study will convince the public at large. As Fischer puts it, “Many people, when they hear the result, will think, ‘This can't be right.’” But, he adds, “they will think that not on any scientific basis, but because of their emotional involvement with the disease.”

  9. PLANETARY SCIENCE

    Does Mathilde Have a Broken Heart?

    1. Richard A. Kerr

    LAUREL, MARYLAND—Seen in the first images from a passing space probe, the asteroid Mathilde was stunning enough, adorned with incredibly deep, shadowed craters. But when the probe radioed back the first clues about the interior of the 52-kilometer asteroid last week, researchers were in for another surprise: They found only a third of the mass they had expected. The discovery supports the claim that most asteroids are heaps of rocks loosely held together only by their own gravity. Eons of banging about in the asteroid belt, it seems, have reduced Mathilde, and perhaps most asteroids, to piles of flying rubble.

    Mathilde, the largest asteroid to date to be viewed up close, was imaged by the low-cost, Discovery-class Near Earth Asteroid Rendezvous (NEAR) spacecraft, which is on its way to an encounter with the asteroid Eros in 1999. The spacecraft carries no instrument that could directly probe an asteroid's interior, but NEAR's radio managed to do so indirectly, as researchers reported at a press conference here last week. As the spacecraft passed within 1200 kilometers of Mathilde, the asteroid's feeble gravity slightly deflected NEAR's path. By monitoring the Doppler frequency shift in the spacecraft's radio signal during the flyby, researchers inferred that Mathilde slowed down the spacecraft by 1 millimeter per second—about the speed of a sluggish ant, said NEAR team member Donald Yeomans of the Jet Propulsion Laboratory (JPL) in Pasadena, California.

    Not as solid as she looks.

    Gravity data suggest Mathilde's interior is disrupted.

    JHU APL

    From that minuscule slowing, Yeomans calculated a mass for Mathilde of about 10^17 kilograms, or a millionth the mass of Earth's moon. Assuming an average diameter of 52 kilometers (a preliminary value determined from NEAR images), Mathilde has a density of just 1.3 grams per cubic centimeter—not much more than water. But meteorites thought to have been chipped off this type of common asteroid are typically twice as dense, or 2.6 grams per cubic centimeter. “We've got an object considerably lighter than we thought,” says celestial mechanicist Yeomans. “If it were any lighter, it could float.”
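
    The density figure follows from simple arithmetic. The sketch below is added for illustration only and treats the irregular asteroid as a sphere of the quoted diameter, so it lands near, rather than exactly on, the published value.

        import math

        # Back-of-the-envelope check of Mathilde's bulk density from the quoted numbers.
        mass_kg = 1.0e17        # roughly 10^17 kg, inferred from NEAR's Doppler data
        diameter_km = 52.0      # preliminary average diameter from NEAR images

        radius_m = diameter_km * 1e3 / 2
        volume_m3 = (4.0 / 3.0) * math.pi * radius_m**3   # sphere approximation
        density_g_cm3 = (mass_kg / volume_m3) / 1000.0    # convert kg/m^3 to g/cm^3

        print(f"{density_g_cm3:.1f} g/cm^3")  # ~1.4 with these rounded inputs, near the 1.3 quoted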

    Although Mathilde's apparent low density may rise somewhat as researchers take better account of its irregular shape, it's unlikely that the final estimate will approach the density of solid rock, notes asteroid specialist Alan Harris of JPL. The most likely explanation, he says, is that Mathilde is a conglomeration of blocks, boulders, and loosely compacted debris. There have been earlier signs that asteroids are rubble piles (Science, 1 January 1993, p. 28; 26 April 1996, p. 485), but this is the most direct evidence yet, says Harris. Mathilde's apparent density bears on problems such as the collision history of asteroids and strategies for protecting Earth from future asteroid collisions.

    The finding could also have implications for NEAR's future if Eros, too, is a low-density body with unexpectedly weak gravity. In January 1999, when controllers aim to put NEAR into orbit around Eros, they will have to keep Mathilde's broken heart in mind.

  10. PARTICLE PHYSICS

    Case for Neutrino Mass Gathers Weight

    1. Andrew Watson
    1. Andrew Watson is a science writer in Norwich, U.K.

    “Neutrinos, they are very small / They have no charge and have no mass / And do not interact at all,” runs the poem by John Updike. Physicists have known for decades that Updike is mistaken on one count at least: Neutrinos do interact with matter, albeit very feebly. Many are now convinced, however, that he is also wrong about the mass. Three new experimental results, announced last week at a meeting on the Italian island of Capri near Naples, add to hints that neutrinos might indeed have a very small mass.

    The three experimental groups approached the question from different directions, two of them by using massive underground detectors to capture neutrinos streaming from the upper atmosphere, and one by studying neutrinos made with an accelerator. But all three believe they may be seeing signs of “neutrino oscillations,” in which neutrinos—which come in three “flavors” called electron, muon, and tau—spontaneously switch from one flavor to another. Oscillations can take place only if neutrinos have mass. “We have evidence, in fact, I believe strong evidence, that oscillations are occurring,” says Bill Louis, spokesperson for the accelerator experiment—the Liquid Scintillator Neutrino Detector (LSND) at Los Alamos National Laboratory in New Mexico.

    Whether the humble neutrino has a mass is a weighty matter for physicists. Massive neutrinos could help account for the universe's “missing mass,” the extra heft astronomers believe must be out there but have not been able to find; neutrino oscillations might also explain why the sun appears to produce fewer neutrinos than theorists expect; and neutrinos with mass would offer a first step outside the Standard Model—the tried-and-tested description of nature's fundamental particles and forces. “The Standard Model, as it stands, doesn't have room for massive neutrinos,” says Oxford University physicist Hugh Gallagher, a member of the Soudan 2 experiment in the Soudan iron mine in northern Minnesota, one of the two groups studying atmospheric neutrinos.

    Hints of oscillations had already emerged from a previous LSND experiment (Science, 10 May 1996, p. 812) and from earlier atmospheric neutrino studies. But the trio of new results may be more compelling. The two atmospheric neutrino experiments offer better statistics and rely on two different detector technologies, and the LSND group has devised new ways to produce and detect their neutrinos. Even so, theorists are scrutinizing the new results very carefully before rushing to rewrite their models. One reason for the skepticism is that the two sets of results seem to deliver different messages about how neutrinos oscillate. “The LSND results are apparently not consistent with atmospheric neutrino results,” says Yoji Totsuka, spokesperson for Japan's Super-Kamiokande group, the other team reporting atmospheric neutrino results.

    The Soudan and Super-Kamiokande claims rest on a single calculation: the relative numbers of electron and muon neutrinos created when cosmic rays collide with particles in the upper atmosphere. “This ratio is simple to calculate and is quite robust,” says theorist Tom Gaisser of the University of Delaware.

    Both Soudan 2 and Super-Kamiokande aim to measure this ratio to see if any of the neutrinos have oscillated to a different type en route between the upper atmosphere and the detectors deep underground. Based in the Kamioka laboratory west of Tokyo, the Super-Kamiokande detector snares neutrinos in a 50,000-ton water tank watched by 13,400 photodetectors. An electron neutrino crashing into nuclei in the water produces an electron; a muon neutrino produces a muon. The electron and muon, being charged, create distinctive flashes of light that are picked up by the detectors. Soudan 2 works on a similar principle, but relies on 1000 tons of corrugated iron sheets interspersed with sensitive charged-particle detectors.

    “We have observed a smaller muon-neutrino-to-electron-neutrino ratio as compared to the expectation of the atmospheric neutrino flux calculations,” says Kenzo Nakamura, reporting Super-Kamiokande's results at Capri. The Soudan result is “in line” with Super-Kamiokande's, according to Gallagher. “What we've measured is a result which is only about 60% or 65% of what we expect based on the Standard Model,” he says, adding, “The fact that they measure essentially the same result in very, very different detector technologies serves as a strong indication that what we're measuring is not some artifact of our experimental apparatus.”
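
    What the experiments actually compare is a ratio of ratios. The toy calculation below is added for illustration with invented event counts; the only real input is that flux calculations predict roughly two muon neutrinos for every electron neutrino.

        # Schematic atmospheric-neutrino "double ratio" (event counts are invented).
        # R = (observed muon/electron events) / (expected muon/electron events).
        observed_muon, observed_electron = 120, 100
        expected_muon, expected_electron = 200, 100   # flux models predict roughly 2:1

        R = (observed_muon / observed_electron) / (expected_muon / expected_electron)
        print(f"R = {R:.2f}")  # 0.60, i.e. about 60% of expectation, like the figure Gallagher cites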

    Instead of relying on nature to supply their neutrinos, the LSND team at Los Alamos drives a proton beam from an accelerator into a water target to create particles called pions. These then spawn other particles, including muon neutrinos. Thirty meters away is the neutrino detector itself, consisting of 167 tons of mineral oil under the watchful gaze of 1220 photodetectors. The new LSND experiment, reported at Capri by team member Geoff Mills, detected a couple of dozen excess electron neutrinos in the beam, which originally contained only muon neutrinos. If what the group sees really is due to oscillations, then these results, together with their earlier ones, show that “roughly about a third of a percent of the muon neutrinos … will turn into electron … neutrinos,” says Louis.

    Weaving together the threads of evidence is not straightforward, however. “The solar neutrino deficit, atmospheric [neutrinos], and Los Alamos—they are not all consistent,” says Adam Para of the Fermi National Accelerator Laboratory near Chicago. “At least one of them must be due to something else, or something that we didn't think of,” he says. Louis, however, believes these are simply wrinkles that theorists will be able to iron out. “They could be seeing different aspects of oscillations,” he says.

    Others, such as physicist Douglas Morrison at the CERN particle physics lab near Geneva, are less sanguine. Morrison suspects that dubious assumptions behind the calculations, rather than oscillating neutrinos, might explain both the solar and atmospheric shortfalls. Although Gallagher, Nakamura, and others say the uncertainty in estimates of the ratio of electron to muon neutrinos generated by cosmic rays is about 5%, “my feeling is that the error is bigger than 5%,” Morrison says. As for the LSND result, he says, “I think the majority of the community is skeptical.”

    But Oxford's Wade Allison, another member of the Soudan 2 team, thinks the evidence for oscillations cannot be discounted: “I am completely convinced there is a real [neutrino] problem, and I also believe that neutrino oscillations are the only show in town to explain that problem.” New experiments will win the day, he says, by revealing more of the properties of neutrino oscillations than simple ratios, such as actual mass differences and identity-switching probabilities. Adds Allison: “We've really got to try and tie the thing down, and then it will be convincing.”

  11. ALZHEIMER'S RESEARCH

    New Lesion Found in Diseased Brains

    1. Wade Roush

    For years, researchers praying for clues that might lead to a new treatment for Alzheimer's disease have had two congregations to choose from: the “BAPtists,” who attribute the disease mainly to the β-amyloid protein (sometimes known as BAP) found in plaques that riddle the brains of Alzheimer's patients, and the “Tauists,” who suspect that the misbehavior of a neuronal protein called tau is more central. Now, with the detection of a third type of lesion that has apparently lain hidden ever since Alois Alzheimer began studying the disease 90 years ago, a new sect may be in the making.

    The new lesions are known for now as “AMY plaques,” because they were initially mistaken for amyloid plaques. They appear to be nearly as widespread in the brains of Alzheimer's patients as the more familiar plaques and tangles of tau proteins, according to a team led by neuropathologist John Trojanowski and neuroscientist Virginia Lee at the University of Pennsylvania School of Medicine in Philadelphia, who report their work in the July issue of the American Journal of Pathology. That means the AMY plaques could represent an unrecognized cause of the dreaded memory-depleting disease, which strikes 5% of people over age 65.

    “I feel very excited about it,” says neuroscientist Zaven Khachaturian, director of the Alzheimer's Association's Ronald and Nancy Reagan Research Institute in Chicago. “This opens new vistas for us in terms of conceptualizing what's happening in the disease, and it may even give us new diagnostic tools and new targets for treatment.” But he and others note that this promise won't be realized soon, because researchers still aren't certain how or even whether the familiar amyloid plaques and neurofibrillary tangles cause neuronal deterioration.

    The Penn researchers discovered the new lesions by accident while trying to learn more about the tau protein, which is normally found inside neurons but congeals in extracellular masses in the brains of Alzheimer's patients. Two years ago, Trojanowski and Lee reported from studies of biopsied and autopsied tissue that tau collects this way when phosphatases, enzymes that remove excess phosphate groups from tau, somehow fail to do their job (Science, 10 February 1995, p. 793).

    To learn whether trouble at tau's phosphate-binding sites contributes to this problem, Marie Luise Schmidt, a researcher in Trojanowski and Lee's laboratory and the study's lead author, reviewed a set of 59 new antibodies designed by Lee to recognize and bind to these sites. While testing the antibodies on tissue slices from Alzheimer's brains, however, Schmidt found that four of them—especially one called AMY 117—didn't bind to tau at all, but sought out plaques instead.

    The team suspected that the purified proteins used to create the antibodies weren't pure—that they contained molecules from amyloid plaques, a glitch they had encountered before. But when Schmidt stained brain slices using both AMY 117 and an antibody to β amyloid, she was astonished to find that the two antibody types gravitated to two different sets of plaques—one of which had never before been glimpsed. The new lesions resemble the amyloid plaques from the outside, but inside they lack the core of β-amyloid protein that traditional staining techniques recognize. So the proteins used to make the antibodies must have been contaminated after all—but with proteins from the AMY plaques, not the amyloid plaques. The existence of the new lesions “couldn't have been suspected without these new antibodies,” says Trojanowski.

    All 32 of the Alzheimer's brains the team examined exhibited the new lesions, usually close to, but not overlapping, the amyloid plaques. That means Alzheimer's researchers now have an entirely new set of pathological mechanisms to explore. Explains Trojanowski: “It could be that if you sweep away amyloid plaques and tangles and still have these AMY plaques, you would only be getting rid of two-thirds of the symptoms.”

    The Penn team is currently purifying the AMY 117-binding protein with the goal of cloning the gene encoding it. That could eventually lead to an improved means of diagnosing Alzheimer's based on detecting the protein in the blood or cerebrospinal fluid, and perhaps even to a way of blocking the formation of new plaques.

    But will Alzheimer's researchers welcome this new denomination to their increasingly ecumenical field? “Heavens, yes,” says neuromolecular biologist Marcelle Morrison-Bogorad, associate director of the Neuroscience and Neuropsychology of Aging program at the National Institute on Aging. “The more angles we have, the closer we'll be to understanding what Alzheimer's actually is.”

  12. AIDS THERAPIES

    The Daunting Challenge of Keeping HIV Suppressed

    1. Jon Cohen

    ST. PETERSBURG, FLORIDA—When the media last year trumpeted the great advances being made against HIV, speculating about cures and even the end of AIDS, the researchers' own caveats tended to get drowned out. But at a meeting held here last week, emerging data about treatment failures sounded a discordant note that was hard to miss: While powerful new drug combinations are delaying disease and death, they have serious limitations—and clinicians and patients who ignore these shortcomings do so at their peril.

    More than 200 leading AIDS researchers from around the world gathered here from 25 to 28 June for the workshop, which focused on HIV drug resistance, treatment strategies, and the possibility of eradicating the virus from an infected person. One presentation after another reinforced the message that keeping HIV at bay, even with the most potent three-drug cocktails now available, remains a daunting challenge. “Triple combination therapy can fail for a variety of reasons,” said John Mellors of the University of Pittsburgh Medical Center, one of the meeting's organizers. As this reality sets in, infected people may end up feeling that their hopes were raised too high last year, he warned. “The pendulum will swing back.”

    The meeting went into fine detail about why these failures occur. New, more sensitive assays that measure levels of HIV indicate that even the best treatments have a difficult time completely suppressing viral replication, which gives drug-resistant mutants a chance to appear. Less surprisingly, many treatments also fail because patients don't “comply” with therapies that require taking dozens of pills—many of which have serious side effects and dietary restrictions—each day. Although there were encouraging findings about new treatments allowing the immune system to recover if the virus can be suppressed, researchers spelled out just how distant the goal is of completely rebuilding a full range of immune responses in an HIV-damaged body (see sidebar).

    Last year's surge in hope was driven by dramatic findings about the wallop delivered by combinations of two drugs that attack HIV's reverse transcriptase (RT) enzyme with one drug from a newer class of compounds directed at the virus's protease enzyme. Several studies showed that such triple combinations—and even some cocktails of RT inhibitors alone—can drive the amount of HIV in a person's blood, the “viral load,” down so low that the most sensitive tests could not detect any virus for more than a year in many patients. Researchers warned, however, that just because HIV couldn't be found didn't mean it wasn't there—nor did it mean the virus wasn't replicating.

    One of the more disconcerting findings reported here is that, just as researchers feared, the “undetectable” HIV reported last year can routinely be detected with a more sensitive test. A year ago, the most sensitive tests could measure viral levels down to 500 copies of HIV per milliliter of blood; new tests now measure as few as 20 copies per milliliter. Brian Conway of the BC Centre for Excellence in HIV/AIDS in Vancouver, Canada, used such a test in a 151-person study comparing two RT inhibitors to three RT inhibitors. One year after treatment began, 27% of the patients receiving one of the two-drug combos had fewer than 400 copies of HIV per milliliter. But when the samples were reanalyzed with an assay that went down to 20 copies, only 12% had “unquantifiable” levels. And when Conway ran the more sensitive test repeatedly on one sample that registered below 20 copies, he detected the virus in three of 11 runs. “A lot of people hear ‘unquantifiable,’ and they think ‘zero,’” said Conway.

    John Coffin of Tufts University in Medford, Massachusetts, suggested that researchers shift their focus to the number of virally infected cells, which by current estimates is about 1000 times higher than the HIV copy number detected. So 20 copies, noted Coffin, would equal 20,000 infected cells. “If you say a person has less than 20,000 infected cells, it might give even the most optimistic patient pause,” said Coffin.

    Considering that many patients have tens of thousands of copies of HIV when they begin therapy, a drop to, say, 400 copies is hardly bad news. The problem, however, is that if HIV is detected, it's replicating and can mutate into resistant strains. Dale Kempf of Abbott Laboratories near Chicago, maker of the protease inhibitor ritonavir, underscored this point. Kempf presented a study that analyzed patients who had failed various ritonavir regimens—including triple-drug combos—because viruses resistant to the drug had emerged. He found that the most telling gauge of whether treatment would eventually fail was how low a person's viral load fell. If the therapy knocked HIV down to 200 to 1000 copies per milliliter, viral levels would rise again, on average, 128 days after treatment began. In contrast, for people whose viral levels went below 200—the limit of the assay Abbott used—it took an average of 199 days for treatment to fail. “Very low viral load must be achieved to ensure a durable response, and we don't think 200 is low enough,” concluded Kempf.

    Douglas Mayers of the Naval Medical Research Institute in Bethesda, Maryland, added yet another eye-opening finding about drug failures by showing that they often have nothing to do with resistance (see diagram). Mayers's genetic analysis of HIV from 37 patients who failed various RT-protease inhibitor combinations revealed that in 22% the virus carried no mutations that would confer resistance to any of the drugs. Another 24% were resistant to RT inhibitors but not to protease inhibitors. “It was a real surprise,” says Mayers. “These data suggest that as much as 40% of failure is related to compliance.” While the drugs would presumably still work in these patients if they took them, one danger, said Mayers, is that many physicians might see a patient's virus rebounding, assume resistance had developed, and switch to other drugs. Failure may also occur, he said, in people with unusually fast metabolisms, who process the drugs so quickly that the compounds have too little time to do their work.

    Humble pie.

    Treatment failures often occur without resistance to reverse transcriptase (RT) or protease inhibitors.

    SOURCE: D. MAYERS

    Patients who fail one therapy can often switch to other combos of the 11 anti-HIV drugs now on the market. Unfortunately, many mutations that confer resistance to one drug also render similar drugs useless. The best hope for people for whom several drugs have failed is new drugs that attack novel HIV targets. But there is a dearth of such drugs in the pipeline, says meeting co-organizer Charles Boucher of University Hospital in Utrecht, the Netherlands. “I'm confident that [these drugs] are not going to be developed in the next 3 to 4 years,” says Boucher. “It's going to be a disaster.”

    Other clinicians at the meeting warned that they already are running out of options for many of their patients who have tried and failed several drug regimens. “I think three-drug therapy is an incredible advance, but our ability to treat people who fail is troubling,” said Margaret Fischl of the University of Miami School of Medicine. “We don't have anything to offer them.” So in spite of the vast improvement in treatment that the new therapies have brought, there were no trumpets playing victory tunes here.

  13. AIDS THERAPIES

    Recovering From the Ravages of HIV

    1. Jon Cohen

    New anti-HIV treatments are staving off disease and death in thousands of people. But will a patient's immune system actually recover from the ravages of an HIV infection if the virus is kept at bay? On page 112 of this issue, immunologist Brigitte Autran of the Hôpital Pitié-Salpêtrière in Paris and co-workers show that even severely compromised immune systems can make a significant recovery after state-of-the-art drug treatment has kept HIV levels suppressed for a year. Yet their work also makes it clear that fully rebuilding an HIV-ravaged immune system is a tall order, especially given the limits of treatments available today (see main text).

    Autran and colleagues focused on white blood cells, or T lymphocytes, that have a receptor known as CD4 on their surfaces. HIV selectively infects these immune system cells and, both directly and indirectly, leads to their destruction. In time, HIV-infected people are left with so few CD4s that their immune systems can no longer fend off even the wimpiest bacteria, viruses, or fungi. A key benefit widely seen in people receiving potent new treatments is that CD4s dramatically rebound. Yet in all but the healthiest of infected people, CD4s do not return to normal levels. And it's not clear whether patients really regenerate CD4s or just “redistribute” ones that have been sequestered in the lymph nodes and other tissues. This, in turn, determines just how effective the “new” CD4s are at fighting infections.

    The Autran group analyzed the CD4s that returned in eight adults taking a powerful combination of three anti-HIV drugs. After 1 year, the drugs had driven down the level of virus in the blood of all of the patients, and CD4 cells had jumped from a mean of 165 cells per cubic millimeter to 327. (The average number in an uninfected person is 900.) But because not all CD4s are created equal, the researchers used other markers on the surfaces of these cells to assign them to either the “memory” or the “naïve” subset. Memory cells respond only to invaders they have seen before, while naïve cells can launch an immune response—and create memory cells—against newcomers.

    During the first 4 months of treatment, returning CD4s were mostly memory cells, and they came back so fast that they were unlikely to be regenerated cells. “The massive increase at the beginning of treatment is redistribution,” says Autran. But after that initial phase, the naïve population rose steeply, indicating that new cells were being generated—and providing a more diverse “repertoire” of CD4s able to respond to new invaders, or antigens. “It's a nice, three-color snapshot,” says immunologist Donald Mosier of The Scripps Research Institute in La Jolla, California, of the work. “It's the best analysis of which T cells come back after triple-drug therapy that I've seen.”

    Work by Mark Connors, Clifford Lane, and colleagues at the National Institute of Allergy and Infectious Diseases adds a darker shading to the picture, however. Their study, published in the May issue of Nature Medicine, shows that CD4s of HIV-infected people often lack the full range of cell surface proteins that they need to function properly. So even naïve cells, Lane says, may be handicapped.

    Scripps's Mosier adds that the chances are “slim to none” that anti-HIV drugs will eventually allow the immune system to completely restock itself with fully functional CD4s, because of the limits both of the drugs and of the immune system's capacity to repair itself. Says Mosier, “I'd frankly be surprised if you could continue to do this year after year.” Autran holds out hope, however. She notes that people not infected by HIV who receive bone marrow transplants lose their CD4s and see them return in the same two-phase pattern her lab observed. “I'm quite optimistic that if we could diminish the level of viral replication enough, we could approach that situation,” she says.
