News this Week

Science  28 Aug 1998:
Vol. 281, Issue 5381, pp. 1258


    Medical School Caught Up in Pennsylvania Hospital Debacle

    1. Constance Holden

    When molecular geneticist Darwin Prockop came to the MCP-Hahnemann School of Medicine in Philadelphia 2 years ago, the future seemed bright. Prockop, who runs a gene therapy center at the medical school, was one of about a dozen top scientists recruited by MCP-Hahnemann in the past few years with promises of hefty salaries and abundant research support. But this spring, prospects for Prockop and his colleagues at the nation's largest private medical school took a nosedive. The organization that runs the school, the Pittsburgh-based Allegheny Health Education and Research Foundation (AHERF), after many months of hemorrhaging funds, filed for bankruptcy last month. “It's literally unprecedented for a medical school to be caught up in this kind of bankruptcy proceeding,” says Jordan Cohen, head of the Association of American Medical Colleges.

    Next month, Allegheny's eight Philadelphia hospitals will be put on the block, and a court has ordered an academic committee to come up with a plan by mid-October for restructuring MCP-Hahnemann and the three professional schools that make up Allegheny University of the Health Sciences. Officials hope they will be salvaged as an intact, independent university, but researchers like Prockop, whose budgets are being severely squeezed, are worried. There is a “very real” chance that much of the expensive talent lured to Allegheny during its expansionary binge will jump ship, says Prockop.

    The Allegheny debacle is an extreme example of the turmoil at U.S. academic medical centers triggered by the rapidly changing health care economy, as managed care and severe cutbacks in insurance payments have turned hospitals that were once university cash cows into financial drains. But observers contend that a unique set of circumstances, and misjudgments by Allegheny's former president Sherif S. Abdelhak, pushed Allegheny over the edge. They include a massive expansion at a time when many other health centers were contracting, and an extremely competitive medical market in Philadelphia that crimped Allegheny's anticipated revenues, says health consultant Gerald Katz of Plymouth Meeting, Pennsylvania. The financial collapse was a “colossal disaster and one that could have been avoided” with more prudent management, says Donald Faber, chair of the MCP-Hahnemann neurobiology department. (Abdelhak could not be reached for comment.)

    Until recently, Allegheny was riding high under Abdelhak, who spent 12 years at the helm before the AHERF board fired him last June. “Some people were very impressed with what Abdelhak was doing,” says Faber. But he did “two things [that] brought the whole system down,” contends biochemistry professor Gerald Soslau. He purchased several hundred private practices, paying top salaries to physicians whose patient referrals were supposed to generate revenue for the system's hospitals. But the return turned out to be “minimal,” says Soslau. Then last year, Allegheny bought up two Philadelphia hospitals, one of which was soon closed, that came saddled with $160 million in debt.

    “I think Abdelhak was operating under a set of assumptions that used to be the case but were no longer the case,” says Katz. Philadelphia, he says, has too many hospital beds; much of the population is enrolled in health maintenance organizations, which have highly restrictive reimbursement standards; and only two insurance providers cater to 80% of the population, giving them power to “pretty much dictate the prices” of services. When the Philadelphia branch of AHERF filed for bankruptcy on 21 July, Allegheny claimed it was $1.3 billion in debt.

    Things really began to unravel in the spring, when officials used a number of endowments and special funds to pay pressing expenses, according to reports in the Philadelphia Inquirer. AHERF reportedly quickly restored the money to one endowment, worth $3.4 million, after the donor complained. Allegheny officials are now “reviewing all restricted funds,” says spokesman Thomas Chakurda, to determine if any “were moved inappropriately.”

    The researchers recruited during the expansionist phase are now feeling the squeeze. Prockop, who came from Jefferson Medical College in Philadelphia in 1996, says he discovered last month that a foundation-donated fund containing $360,000 for diabetes research that he had brought with him to Hahnemann had been “depleted.” Prockop anticipates that the school will renege halfway through his 5-year contract, and he says he is “actively pursuing” possibilities for moving his 20-person lab elsewhere.

    Another scientist who got burned is Howard Ozer, a cancer researcher at Hahnemann Hospital who was lured from Emory University last year with the promise of a $5 million-a-year budget for his research and treatment center. Unlike Prockop, who gets half his budget from the National Institutes of Health (NIH), Ozer is heavily dependent on funding from the hospital and is now preparing a slimmed-down, $1.2 million budget. He is guardedly optimistic about prospects when the hospitals are taken over by new proprietors; nonetheless, he says, "if I had it to do all over again, I would not have come to this institution."

    The financial crisis “should have been seen much earlier, and probably could have been dealt with if it had been,” says Ozer. Faber remembers site visitors from a licensing board a year or so ago remarking that “you're the only people who are expanding while the world is contracting.”

    Indeed, many other U.S. academic medical centers have made wrenching adaptations to the managed care economy. Both Stanford University and New York University, for example, have merged their hospitals with other systems. In 1996, restructuring at the University of Southern California's medical school led to a lawsuit by scientists complaining that attendant pay cuts constituted an assault on tenure. And the University of Pennsylvania last month reported that its health system ran a deficit of $100 million in the fiscal year ending in June. Georgetown University Hospital reported losing $27 million last year, but spokesperson Paul Donovan says the hospital expects to turn a profit again next year as a result of management reforms aimed at making the clinical, teaching, and research functions all self-supporting.

    Observers are hoping the worst is also over for Allegheny. The bankruptcy court last month approved an emergency $100 million loan to AHERF from Madeleine LLC, an investment consortium, to cover salaries and expenses until a purchaser takes over the hospitals. So far, three national hospital chains are lined up to bid on the eight Philadelphia hospitals that will be sold on 29 September. Last month, Congress passed a measure allowing medical students to continue getting federal student loans worth some $43 million a year. And NIH won't cut off the roughly $56 million in grants Allegheny researchers receive each year, says NIH spokesperson Don Ralbovsky.

    “This is a fragile time for us,” says Faber, who is a member of the university restructuring committee. But “the hope and expectation is that … we will come out of this as a viable, freestanding university.”


    Did an Ancient Deep Freeze Nearly Doom Life?

    1. Richard A. Kerr

    For most of its history, Earth has been a comfortable place for life. Even during the regular ice ages of the past million years and the huge impact 65 million years ago, most organisms either adapted to the new conditions or found refugia and survived. Now, researchers propose that about 700 million years ago Earth suffered a series of enveloping ice ages that nearly snuffed out life. Coated by ice and snow from pole to pole, the planet slept on for millions of years, according to this theory, until it was finally roused by its own volcanic emanations.

    On page 1342 of this issue of Science, geologists Paul Hoffman and Galen Halverson and geochemist Daniel Schrag, all of Harvard University, and geochemist Alan Kaufman of the University of Maryland, College Park, present isotopic and geological evidence from Namibia for an ancient “snowball” Earth that threatened the diverse but still simple organisms that then constituted life on Earth. “It's mind-boggling that such events may have happened,” says Hoffman. While rocks from the time do record at least two ice ages, not everyone is convinced that the glaciation was so extreme. “This is really interesting, but it's really speculative,” says geochemist Louis Derry of Cornell University. “There are significant questions about the data.”

    Traces of a snowball?

    A glacial deposit is abruptly capped by warm-water carbonate rock.


    The data come from rock deposited about 700 million years ago on the edge of a long-vanished ocean, in what is now Namibia in southwest Africa. The rock section under study—carbonate topped by a jumbled deposit of debris dumped into the ocean by glaciers, followed by a distinctive “cap layer” of carbonate—preserves a tracer of ancient life's productivity: two isotopes of carbon in the same ratio as existed in the ancient ocean. Photosynthetic organisms tend to remove more carbon-12 than carbon-13 when they draw in carbon dioxide, causing the ratio of carbon-13 to carbon-12 in the water to rise, while chemical precipitation of dissolved carbonate onto the sea floor removes equal proportions of each isotope, leaving the ratio unchanged.

    Other researchers have traced isotopic changes in the Namibian rocks, but Hoffman and his colleagues have the most complete record. Well before the ice age, the isotopic ratio suggests that carbon was removed from the world ocean through about half chemical and half biological processes, says Hoffman. But as the ice age approached, the portion due to biology began to decrease, as would happen if the ice were gradually spreading across the globe. The glacial deposit itself doesn't preserve a faithful isotopic record, says Hoffman. But afterwards, the cap carbonate record suggests that biological productivity had dropped all the way to zero, and recovered only slowly. “It's difficult to imagine any other mechanism that would shut down productivity on that scale other than global glaciation,” says Hoffman.
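    The inference from isotope ratios to productivity rests on a standard steady-state mass balance: carbonate δ13C equals the input (volcanic) value plus the biologically buried fraction times the photosynthetic fractionation. The sketch below illustrates that arithmetic with round textbook numbers, which are assumptions for demonstration, not values from the Hoffman group's paper.

    ```python
    # Steady-state carbon isotope mass balance (illustrative values, not the paper's).
    # delta_carb = delta_input + f_org * eps, where f_org is the fraction of carbon
    # removed biologically and eps is the photosynthetic fractionation (~25 per mil).

    DELTA_INPUT = -6.0   # per mil, approximate volcanic/mantle input value (assumed)
    EPSILON = 25.0       # per mil, typical photosynthetic fractionation (assumed)

    def carbonate_delta13c(f_org):
        """delta-13C of carbonate precipitated from seawater, in per mil."""
        return DELTA_INPUT + f_org * EPSILON

    # Half biological, half chemical removal (the pre-glacial state):
    print(carbonate_delta13c(0.5))   # isotopically heavy carbonate
    # Biological productivity shut down entirely (the cap carbonate):
    print(carbonate_delta13c(0.0))   # ratio falls back to the volcanic input value
    ```

    The point of the model is the direction of the shift: when photosynthesis stops removing carbon-12 preferentially, the carbonate record drops back toward the unfractionated input value, which is the signature Hoffman reads as a global shutdown.
    
    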

    No one knows just why the ice age began, but as highly reflective snow and ice spread to lower latitudes, more sunlight would have been reflected from Earth, chilling the planet further until a runaway glaciation enveloped even the tropics, says Hoffman. He assumes that there were at least a few breaks in the ice or patches of bare ground where microbes and multicellular algae survived to later give rise to all life today. But across most of its surface, he says, “Earth just sat there.”

    The group suggests that eventually, volcanic carbon dioxide oozing from the interior over millions of years created a greenhouse effect powerful enough to break the ice's grip. Then, this high carbon dioxide level drove the deposition of the cap carbonate.

    To find out how long Earth had to wait for this volcanic rescue, Hoffman and his colleagues estimated the duration of the isotopic event by calculating the ancient sedimentation rate. They used the rate at which tectonic processes formed the Namibian Basin, and assumed that it filled with sediment as it formed. They concluded that the isotopic excursion took at least 10 million years.
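    The duration estimate amounts to dividing the thickness of the excursion interval by a tectonically derived sedimentation rate. The numbers below are placeholders chosen only to show the arithmetic; the paper's actual thickness and subsidence figures are not given in this article.

    ```python
    # Duration of the isotopic excursion = section thickness / sedimentation rate.
    # Both inputs are hypothetical placeholders; the method is the point.

    thickness_m = 400.0               # thickness of the excursion interval (assumed)
    accumulation_m_per_yr = 4e-5      # basin subsidence = sediment fill rate (assumed)

    duration_yr = thickness_m / accumulation_m_per_yr
    print(f"{duration_yr:.1e} years")   # on the order of 10 million years
    ```
    
    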

    This long duration “pretty well destroys” another explanation for the isotopic spike, says Hoffman—that the ocean overturned suddenly. That idea was proposed in 1996 by paleontologist Andrew Knoll, also of Harvard, and his colleagues, who suggested that the cap carbonate was deposited from carbonate-rich waters welling up from the deep sea. But such upwelling would have lasted less than 100,000 years.

    Not everyone is ready to accept the idea of a frozen Earth. Prolonged isotopic excursions are unlikely, says Knoll colleague Dick Bambach of Virginia Polytechnic Institute and State University in Blacksburg, and require unusually strong data to back them up. And geochemist Martin Kennedy of the University of California, Los Angeles, also has carbon isotopic data from Namibia, but they show no deep productivity decline before the Namibian glacial deposits. His evidence “is very different than theirs,” he says. Snowball Earth “is a novel and creative idea, but I don't think the data support it.”

    Furthermore, Kennedy argues that if Earth really was a snowball, strontium isotopes should respond too. The ratio of strontium-87 to strontium-86 in the oceans should have dropped as the glaciation cut off rivers enriched in strontium-87 by weathering of the continents, he says. But his unpublished data show that during the carbon excursion, the strontium ratio rose sharply, indicating more continental erosion, not less. “It's one of the greatest shifts in history,” he says.

    Hoffman offers a rebuttal on all points. His team's carbon isotope data resemble other published records, he says. And Schrag says the strontium ratio would have been kept high, first by acid from undersea volcanoes dissolving strontium-bearing carbonate sediments, and later, after the glaciation, by greenhouse-induced weathering of continental rock. But Derry, who has worked with members of both groups, says that if the strontium data hold up, snowball Earth "has a problem." It remains to be seen whether this snowball can take the heat.


    NIH, DuPont Declare Truce in Mouse War

    1. Eliot Marshall

    A contentious, 2-year legal wrangle that set molecular biologists against company lawyers ended last week when the DuPont Pharmaceuticals Co. of Wilmington, Delaware, agreed to relax the terms under which it allows scientists to share a popular type of laboratory mouse.

    On 19 August, Harold Varmus, director of the National Institutes of Health (NIH), announced at a scientific meeting that NIH has hammered out a memorandum of understanding with DuPont that will make it easier to transfer genetically engineered mice from NIH labs to other nonprofit institutions. (The full text of the agreement is available on the Web.) The agreement lifts several restrictions DuPont had placed on the use of mice created with the company's patented "cre-lox" system—an efficient method of editing DNA at a specific site on the mouse genome. It is used chiefly to explore gene function. Varmus describes the pact as "a milestone in the cooperative relationship between academia and industry." And NIH staffers say they hope other companies will use the model to make patented research tools more accessible.

    The flap over cre-lox mice began about 3 years ago. In an effort to tighten control over products on which it holds patents, DuPont began contacting researchers, asking them to sign an agreement that would limit their freedom to use and share the cre-lox technique (Science, 4 July 1997, p. 24, and 1 July 1994, p. 26). DuPont asked that anyone using cre-lox methods send the company prepublication copies of their scientific reports. The company also tried to acquire commercial rights to future inventions that might arise from experiments involving a cre-lox animal. In addition, DuPont's lawyers warned researchers not to share cre-lox mice with colleagues unless the recipient agreed in advance to DuPont's terms.

    Many scientists balked. For example, Jackson Laboratories of Bar Harbor, Maine, a nonprofit research center that breeds and distributes mice to scientists around the world, negotiated for 2 years, but failed to reach an agreement with DuPont. The impasse prevented Jackson from distributing cre-lox mice, making it difficult for some scientists to acquire animals. Varmus, who had pushed for making new genetic tools widely accessible before coming to NIH, sided with Jackson in 1997 and joined in boycotting DuPont's terms. But after more than a year of negotiations, NIH and the president of DuPont's research labs, Paul Friedman, found common ground in June, according to NIH's director of technology transfer, Maria Freire. They signed the papers in August.

    Their agreement says that NIH scientists are free to share cre-lox mice with other nonprofit research labs, provided they sign a simple transfer agreement indicating the recipient won't give the material to anyone else and that DuPont keeps commercial rights. DuPont is not asking to preview publications, nor does it claim extensive "reach-through" property rights on second-generation discoveries, as in the past. However, the company does insist that commercial uses of the technology must be covered by a license. DuPont also plans to retain strict control of the use of cre-lox genetic modifications in agricultural research and in the production of mouse embryonic stem cells. The most significant aspect of the agreement, according to a Jackson Lab staffer, may be its universality: DuPont has said that all researchers who receive federal funding—not just those who work at NIH—will be covered by the liberal rules, effectively freeing up the nonprofit world.


    Small Businesses Get Extra Boost From NSF

    1. Jeffrey Mervis

    Twenty years ago, the National Science Foundation (NSF) had the then-radical idea of providing federal funds to help budding scientist-entrepreneurs turn research findings into products. The idea grew into the government-wide Small Business Innovation Research (SBIR) program, now a billion-dollar operation spread across 10 agencies that provides small companies with two rounds of federal support before they must stand or fall on their own. This month, NSF gave its portion of the program a new twist, adding a third round of funding for companies that aren't quite ready to cut the federal cord. The move is likely to rekindle debate over just how well the program is working in generating an economic payoff from federally funded research.

    Congress created SBIR in 1982 and modeled it after the original NSF experiment. The program—which is funded by a controversial 2.5% "tax" on the R&D budgets of all major research agencies (Science, 17 May 1996, p. 942)—awards up to $100,000 for a feasibility study of a potential product, called Phase 1, and up to $750,000 for additional research on a prototype, called Phase 2. The law stipulates that Phase 3, the company's entry into the marketplace, must occur without the help of government funding. Now, NSF has tinkered with those rules by adding a component, dubbed Phase 2b, that allocates an additional $100,000 for 12 more months to companies that have lined up investors willing to put up at least $200,000. NSF, which currently limits Phase 2 awards to $400,000, is testing the idea with four companies this year and plans to expand it to more than 100 next year using money from its existing SBIR pot.
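    The funding ladder described above can be restated as a small table. The dollar figures are the ones in the article; the total simply adds up NSF's own award ceilings for the three federally funded phases.

    ```python
    # NSF's SBIR funding ladder, as described in the article.
    phases = {
        "Phase 1":  {"cap": 100_000, "purpose": "feasibility study"},
        "Phase 2":  {"cap": 400_000, "purpose": "prototype research"},   # NSF's ceiling; the statute allows up to $750,000
        "Phase 2b": {"cap": 100_000, "purpose": "bridge toward market",
                     "private_match_required": 200_000},
        # Phase 3 (market entry) must proceed without federal funds.
    }

    # Maximum federal support a company could receive from NSF before Phase 3:
    nsf_total = sum(p["cap"] for p in phases.values())
    print(nsf_total)   # -> 600000
    ```
    
    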

    The rationale, say NSF officials, comes from a survey that found most fledgling companies aren't ready for the free market after only 2 years of federal support, and that a small percentage of the companies aided by SBIR generate most of the jobs and revenue. “SBIR is not working as well as it should,” says Kesh Narayanan, head of NSF's industrial innovation division, who conceived the extended funding idea. “We wanted to find ways to encourage more companies to take the next step [toward commercialization].”

    SBIR's supporters generally regard NSF's new twist as fine-tuning an already worthy activity. “Our commercialization rate is much higher than most university technology transfer programs,” says Dan Hill of the Small Business Administration, which coordinates the government-wide program and approved NSF's experiment. “I don't see the additional federal support as a crutch, but rather as a way for a company to do more R&D while it lines up investors. And since NSF is buying more research, it's a win-win situation for both parties,” adds Hill.

    However, others say that NSF may be giving the companies a little too much nurturing. “It's extremely tricky to find the right balance between federal incentives and the commercial sector,” says Tom Moss, head of the Government-Industry-University Research Roundtable at the National Academy of Sciences. And Harvard University economist Josh Lerner says that successful companies tend to use SBIR as seed money to attract private investors and that “it's not healthy for companies to avoid the need to go out into the market.”

    One company participating in NSF's pilot program, Polatomic Inc. of Richardson, Texas, is also looking at the government as a primary customer. The company received $100,000 from NSF based on money it has lined up from NASA's Jet Propulsion Laboratory in Pasadena, California, to help it develop an instrument called a vector/scalar laser magnetometer, which can measure a planet's magnetic field from orbit. “We didn't want to restrict the source of their outside funding,” says Narayanan. “As long as it's for the benefit of the federal consumer, what does it matter who's putting up the money?”

    Polatomic's chairman, industrial physicist Bob Slocum, says that the company hopes someday to have customers besides NASA and the Navy, which is interested in using it on submarines. Slocum adds that a modified version of the device should also appeal to private companies, who could use it to identify mineral and oil deposits, locate toxic waste sites, and detect buried explosives. But geophysicist John Connerney of NASA's Goddard Space Flight Center in suburban Maryland, which builds a different type of magnetometer for space observations, sees the new injection of federal funding as a sign that the company hasn't really built a better mousetrap. “If Polatomic was truly a commercial business, I would think they would have identified paying customers by the end of Phase 2,” says Connerney, who in the past has reviewed SBIR proposals for NSF.

    Another grantee, Auxein Corp. of Lansing, Michigan, says it needs the additional funding to conduct more field trials of a plant growth stimulant based on the natural hormone gamma aminobutyric acid, which acts as a neurotransmitter in animals. Chief scientist Alan Kinnersley says the company has lined up $2 million from three investors and found another company interested in becoming the exclusive distributor for its use in horticulture. But sales have fallen short of projected levels. “We have faced an uphill battle educating people about what AuxiGro can do,” he explains, “including a 30-year history of biostimulants that have failed to live up to their promise.”

    Narayanan says he doesn't expect dramatic results from Phase 2b. But he says the SBIR program needed a boost. “We had two choices,” he says. “We could sit back and hope for the best. Or we could try something new.”


    Los Alamos Magnet Leads the Field

    1. Robert F. Service

    It shrieks like Godzilla, harnesses the power of 80 diesel locomotives, exerts a force strong enough to crumple the strongest reinforced steel beams, and now it's open for business. Today, officials at the Los Alamos National Laboratory in New Mexico are scheduled to cut the ribbon on a new magnet capable of generating fields of 60 tesla, about 1 million times stronger than the Earth's own magnetic field.
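    The "about 1 million times" comparison checks out against a typical surface geomagnetic field of roughly 50 microtesla; that Earth-field figure is an assumed round number, not one from the article.

    ```python
    # Ratio of the magnet's 60 T field to Earth's surface field.
    magnet_tesla = 60.0     # from the article
    earth_tesla = 50e-6     # typical mid-latitude surface value (assumed)

    ratio = magnet_tesla / earth_tesla
    print(f"{ratio:.2e}")   # about a million times stronger
    ```
    
    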

    Brute force.

    A 1000-ton generator sends current to a refrigerator-sized magnet (right), its coils separated by channels for liquid nitrogen coolant.


    The new magnet isn't the first to reach such high fields, but what it does have is staying power. The best existing machines create roughly the same field but can hold it for only about 1 millisecond. The new magnet, with a 1000-ton generator capable of powering a modest-sized city, sustains that field for about 100 times longer.

    “It's a unique facility that will allow experiments that are impossible to do elsewhere,” says Simon Foner, a high-field magnet expert at the Massachusetts Institute of Technology in Cambridge. The magnet is expected to be most useful for looking at the electronic and magnetic behaviors of a variety of materials, such as semiconductors, high-temperature superconductors, and the layered magnetic materials used to make computer disk drives.

    In high-temperature superconductors, for example, free-flowing electric current and magnetic field are like oil and water: They don't mix. Hence, researchers can use the new magnet to wipe out the superconductivity of the materials at temperatures at which they would normally work, thereby allowing them to study the transition between the superconducting and nonsuperconducting states.

    Los Alamos physicist Scott Crooker and his colleagues became acquainted with the magnet's capabilities this spring, during its testing phase. The researchers looked at the behavior of layered semiconductor materials designed to confine large numbers of electrons in a thin sheet within the device. Crooker and his team are still analyzing the results, but it was instantly clear that the new machine made life easier. “The beauty of the long-pulse magnet is that we have 100 times longer to collect data and get meaningful results,” he says. “The machine is a real tour de force.”

    The brute behind that force is a building-sized generator that arrived at its present role via a circuitous route. It was built in 1980 to convert energy from a steam turbine in a Tennessee nuclear power plant into electricity. When plans for the power plant were scrapped, Los Alamos picked up the generator to power the high-field magnets needed to confine the energetic plasma in a planned nuclear fusion reactor. But when the fusion reactor itself was later cancelled, the pulsed-magnet builders snagged the homeless generator.

    Today, the machine has been reengineered to act first as a motor, which slowly takes power from the grid to crank up a giant spinning shaft, and then as a generator to convert that mechanical energy back into an enormous pulse of electricity.

    But the magnet designers did a lot more than simply apply an electrical sledgehammer. “From an engineering point of view, it's quite an accomplishment,” says Foner. The generator provides some 1.4 billion watts of power, much of which is dumped straight into the metal coils of the refrigerator-sized magnet. That huge amount of current puts an enormous strain on the coils. The magnetic forces essentially try to squash the magnet into a pancake, and the energy dissipating in the coils generates heat that would also melt the metals in seconds if allowed to stick around.
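    A back-of-the-envelope energy budget shows why cooling dominates the design. The pulse power and length come from the article; the coil mass, and the assumption that all of the pulse energy ends up as heat in the copper, are illustrative guesses rather than published figures.

    ```python
    # Rough energy and heating estimate for one long pulse.
    power_w = 1.4e9     # generator output, from the article
    pulse_s = 0.1       # ~100x the ~1 ms pulse of capacitor-driven magnets

    energy_j = power_w * pulse_s
    print(f"{energy_j:.1e} J per pulse")

    # Adiabatic temperature rise if that energy all heats the copper coils
    # (coil mass is an assumed placeholder, not a published number):
    coil_mass_kg = 5000.0
    c_copper = 385.0    # J/(kg*K), specific heat of copper near room temperature

    delta_t = energy_j / (coil_mass_kg * c_copper)
    print(f"~{delta_t:.0f} K temperature rise per pulse")
    ```

    Even under these generous assumptions a single pulse heats the coils by tens of kelvin, which is why the heat must be shunted out immediately and the coils re-flooded with liquid nitrogen between shots.
    
    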

    To prevent an implosion of the coils, the magnet's designers made them out of reinforced copper laced with aluminum oxide, surrounded by specially coiled stainless steel supports, says James Sims, a mechanical engineer at Los Alamos and the magnet's chief engineer. And to prevent a meltdown, the researchers take two main precautions. First, they shunt the power out of the magnet as quickly as possible after the pulse has finished to minimize heating in the coils.

    Second, Sims and his colleagues designed the magnet with nine separate sets of coils that nest within one another like Russian dolls. The coils are kept cool by ultracold liquid nitrogen, circulating through channels between them. Just before an experiment, the researchers drain the coolant to prevent it from vaporizing, fire up the magnet, and then flood the coolant back in.

    Even with its improved design, the new 60-tesla magnet may not top the high-field heap for long. Researchers at the Los Alamos branch of the National High Magnetic Field Laboratory already have designs on the drawing board for a 100-tesla pulsed magnet. A prototype of the new machine is expected next year and the full 100-tesla device is scheduled to be built by 2002.


    A New Route to Treating Schizophrenia?

    1. Ingrid Wickelgren

    Last month's shootings of two Capitol Hill police officers were a sad reminder of how far we are from a cure for schizophrenia, the debilitating mental illness that plagues the alleged murderer. But work described in this issue suggests a new approach to schizophrenia drugs that may someday lead to better therapies for the condition, which afflicts about 1% of the U.S. population.

    A calming influence.

    By binding to a metabotropic glutamate receptor (mGluR), LY354740 prevents the PCP-induced surge in glutamate excitation.


    On page 1349, neuroscientists Bita Moghaddam and Barbara Adams of Yale University School of Medicine and Veterans Administration Medical Center in West Haven, Connecticut, report that a new drug that lowers brain levels of the chemical glutamate—one of the neurotransmitters that relay messages between neurons—can block schizophrenia-like symptoms in rats, without apparent side effects. “It's impressive work,” says schizophrenia researcher Jon Horvitz of Columbia University in New York City. “This is a promising avenue for a drug that attenuates schizophrenic symptoms.”

    Such drugs are badly needed. Current schizophrenia medications, known as neuroleptics, work by blocking the action of another neurotransmitter, dopamine, and they are far from ideal. While patients who take them often see reductions in paranoia and hallucinations, the drugs offer little relief from other symptoms, such as poor attention spans, jumbled thoughts, and difficulty interacting with other people. What's more, the neuroleptics often cause troubling side effects, including uncontrollable tremors similar to those in Parkinson's patients. “We are desperate for compounds that might treat psychosis that are not primarily dopaminergic,” says schizophrenia specialist David Pickar of the National Institute of Mental Health (NIMH) in Bethesda, Maryland.

    The psychoactive drug phencyclidine, or PCP, offered tantalizing hints that a different approach might work. Researchers have known for many years that PCP—"angel dust"—induces schizophrenia-like symptoms in healthy people, an effect that's been attributed to its ability to block the N-methyl-D-aspartate (NMDA) receptor, one of a number of receptors in the brain through which glutamate exerts its effects. That finding led scientists to hypothesize that a depression of glutamate transmission in the brain might contribute to schizophrenia. Efforts to treat schizophrenia with drugs that rev up the NMDA receptor didn't pan out, however, largely because the agents can cause serious side effects such as seizures. Meanwhile, scientists and drug developers focused on dopamine, as evidence accumulated that the effectiveness of neuroleptics is proportional to their ability to block dopamine transmission in the brain.

    But the Yale workers took a new look at glutamate, using rats treated with PCP, which develop symptoms, such as frantic running and incessant head-turning, thought to parallel psychotic symptoms in humans. As expected from previous work, the drug raised dopamine concentrations in the rats' brains. But much to the surprise of the researchers, for reasons as yet unknown, PCP also caused a surge in brain glutamate levels. This suggests that abnormally high, rather than low, glutamate activity might underlie the rats' reactions to PCP. Moghaddam then speculated, she recalls, that “if we block glutamate activation, maybe we can block these behavioral effects.”

    To avoid side effects, Moghaddam and Adams wanted to block glutamate activity only in those parts of the brain where it might be elevated. So, they turned to a drug called LY354740 that is under development at Eli Lilly & Co. in Indianapolis for other psychiatric disorders such as anxiety. LY354740 stimulates a subgroup of the so-called metabotropic glutamate receptors, which sit on the terminals of glutamate-releasing neurons and act as regulators of glutamate levels: As shown in rat experiments by Lilly's Darryle Schoepp and colleagues, LY354740 dampens output of the neurotransmitter when its levels get too high but doesn't interfere with normal levels. By stimulating these receptors, Moghaddam and Adams hoped, they could selectively lower glutamate levels in their PCP-treated rats.

    The plan worked. In seven control rats, PCP caused glutamate levels to rise more than twofold in the prefrontal cortex, one of the brain regions that goes haywire in schizophrenia, but the six animals given the Lilly drug experienced no such glutamate rise, although their dopamine levels surged. The treated rats also escaped the symptoms that PCP induced in the controls. They stayed calm and showed little head-shaking behavior.

    What's more, the drug might even act on schizophrenia symptoms that neuroleptics don't relieve. With food rewards, the researchers trained other rats to alternately visit the two arms of a T-shaped maze. This is a test of working memory, a type of short-term memory used to make decisions or draw conclusions that is often severely impaired in schizophrenia. Control rats treated with water and PCP suffered memory lapses, choosing the wrong arm about half the time under certain conditions. But rats pretreated with LY354740 made a wrong choice just 30% to 40% of the time, about the same rate as normal rats. This suggests that the drug reduces this PCP-induced cognitive deficit.

    Of course, many a drug that has looked promising in animals has failed in human trials. Researchers worry, for example, that PCP-induced symptoms in rats do not accurately reflect schizophrenia in humans. Lilly will not say whether the company plans to test LY354740 in schizophrenia patients, but it has conducted early clinical trials of the drug for other diseases, including anxiety and nicotine withdrawal. “It's a very exciting approach to a number of psychiatric disorders,” says Steven Paul, the head of Eli Lilly's research labs. Pickar of NIMH hopes that schizophrenia will be among them. “If the compound will be tolerated well,” he says, “you have something that's got to get into humans.”


    Top Honors Go to Math With a Physics Flavor

    1. Allyn Jackson*
    1. Allyn Jackson is a writer and editor for the Notices of the American Mathematical Society and lives in Munich, Germany.

    Berlin—Mathematicians officially anointed four new superstars when the 1998 Fields Medals were presented here last week at the opening ceremonies of the International Congress of Mathematicians. There is no Nobel Prize in mathematics, and the Fields Medal, awarded every 4 years by the International Mathematical Union (IMU), has become the discipline's highest honor. Unlike Nobels, Fields Medals are traditionally awarded only to mathematicians no older than 40 and are intended as much to encourage future work as to recognize past achievement.

    The four new medalists are Richard E. Borcherds of the University of Cambridge and the University of California, Berkeley; William Timothy Gowers of Cambridge; Maxim Kontsevich of Institut des Hautes Etudes Scientifiques, Bures-sur-Yvette, France, and Rutgers University; and Curtis T. McMullen of Harvard University. In addition, Peter Shor of AT&T Laboratories in Florham Park, New Jersey, received the IMU's Nevanlinna Prize—meant to be the equivalent of the Fields Medal in theoretical computer science—and Andrew Wiles of Princeton University was given a special one-time award for his proof of Fermat's Last Theorem.

    Much of the work honored by the medals shows the influence of physics. “I think that's not an accident,” says Borcherds. “At the moment, theoretical physicists are churning out enormous numbers of amazing new ideas. My guess is that this is going to continue well into the next century.” McMullen remarks that when he was in graduate school, gauge theories from particle physics were all the rage. “I used to think, ‘I don't do that kind of trendy mathematics,’” he says. But now he finds himself working on mathematics connected to the notion of renormalization, a kind of scaling technique used in physics to study phase transitions such as the change from ice to water.

    The links to physics are strongest in Kontsevich's work. He first gained international attention for his doctoral thesis, in which he proved a conjecture of Edward Witten, a mathematical physicist at the Institute for Advanced Study in Princeton. The conjecture made the surprising prediction that one could use certain calculations in algebraic geometry to produce a solution to an equation from a completely different area, the Korteweg-de Vries equation from the study of nonlinear waves. Kontsevich has also done important work in topology. For example, he produced a vast generalization of the notion of linking numbers for knots—numbers that give a measure of how intricately two knots are entangled—which originated in the 1800s with the mathematician Carl Friedrich Gauss.
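    For reference (this formula is standard textbook material, not quoted from the article), the linking number that Gauss introduced and that Kontsevich's invariants generalize can be written as a double integral over the two curves $\gamma_1$ and $\gamma_2$:

```latex
\mathrm{lk}(\gamma_1, \gamma_2)
  = \frac{1}{4\pi} \oint_{\gamma_1} \oint_{\gamma_2}
    \frac{(\mathbf{r}_1 - \mathbf{r}_2) \cdot (d\mathbf{r}_1 \times d\mathbf{r}_2)}
         {\left| \mathbf{r}_1 - \mathbf{r}_2 \right|^3}
```

    The result is always an integer, counting (with sign) how many times one curve winds around the other.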

    Although McMullen has worked in a variety of mathematical areas, from topology to the theory of computing, much of his work has focused on dynamical systems—systems that evolve over time. McMullen has studied systems in which a simple process is iterated many times to produce complicated dynamics. These processes depend in very subtle ways on certain parameters, which McMullen has analyzed through renormalization. He has also used ideas from dynamical systems to produce important results in other areas, such as topology.

    Borcherds's route toward physics started in finite group theory. An example of a finite group is the collection of integers from 1 to 12 under the operation of “clock arithmetic,” so that, for instance, 8 + 5 = 1. The concept sounds simple, but it gives rise to an enormous variety of mathematical flora and fauna. Mathematicians have worked for decades on classifying all the finite groups. One of the strangest they have uncovered is the “monster group,” which has some 10^53 elements and a little-understood structure. Conjectures about the monster came to be known as “monstrous moonshine” because they were considered so improbable, but Borcherds solved one of the most famous of these conjectures by inventing a fruitful new concept, called a vertex algebra. Vertex algebras have since proven important in physics as well, especially in conformal field theory, an underpinning of string theory.
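    The clock-arithmetic example can be checked directly. Here is a minimal sketch (Python, for illustration only) verifying that the numbers 1 through 12 under wrap-around addition satisfy the group axioms, with 12 playing the role of the identity:

```python
# Clock arithmetic on {1, ..., 12}: a small finite group.
# The sum wraps around, so 8 + 5 = 13 becomes 1, as in the article.

ELEMENTS = range(1, 13)

def clock_add(a, b):
    """Add two clock positions, wrapping 13 back around to 1."""
    return (a + b - 1) % 12 + 1

# 8 + 5 wraps to 1
assert clock_add(8, 5) == 1

# 12 acts as the identity ("adding twelve hours changes nothing")
assert all(clock_add(a, 12) == a for a in ELEMENTS)

# Every element has an inverse: some b that brings a back to 12
assert all(any(clock_add(a, b) == 12 for b in ELEMENTS) for a in ELEMENTS)

# The operation is associative (checked exhaustively over all triples)
assert all(clock_add(clock_add(a, b), c) == clock_add(a, clock_add(b, c))
           for a in ELEMENTS for b in ELEMENTS for c in ELEMENTS)
```

    The monster group obeys exactly the same axioms; the difference is that its multiplication table has roughly 10^53 rows and columns.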

    In contrast to the work of the other medalists, Gowers's work has little if any connection to physics. It focuses on objects that form part of the standard tool kit for many areas of mathematics: infinite-dimensional Banach spaces. Named after the Polish mathematician Stefan Banach, who worked in the 1930s, these spaces are akin to the familiar Cartesian plane, which is the natural home of two-dimensional vectors, except that they exist in infinite dimensions. Gowers solved a number of famous problems, originally stated by Banach, that had gone unsolved for decades.

    Shor's work has received far more attention outside mathematics than has that of the Fields Medalists. After doing important research in combinatorics and graph theory, Shor startled the world in 1994 by proving that a quantum computer—based on the ability of atoms or particles to exist in several quantum states at once—could solve a real problem: factoring numbers at a speed vastly greater than that of conventional computer algorithms. Besides sparking great interest among scientists and mathematicians, the work also has raised concerns about the security of cryptographic codes based on the difficulty of factoring large numbers. Since then, Shor's results in quantum error-correcting codes and fault tolerance have raised hopes that quantum computers might one day become a reality (see Science, 7 August, p. 792).

    But the longest and loudest applause from the crowd of 3500 mathematicians assembled in Berlin's International Congress Center came when the IMU presented the conqueror of Fermat's Last Theorem with a special one-time tribute. In 1994, the last time the Fields Medals were awarded, a gap remained in Wiles's proof. With the gap long since repaired, many believe that Wiles, now 45, has produced results of such rare beauty and significance that the IMU should have made an exception to its “no older than 40” rule and awarded him a Fields Medal. One mathematician remarked that Wiles has already gotten so many prizes he doesn't need a Fields Medal. No, said another, the Fields Medal needs Wiles.


    Packing Challenge Mastered At Last

    1. Barry Cipra

    Johannes Kepler is best known for his laws of planetary motion. But mathematicians also remember him for a vexing problem in geometry: proposing, but not proving, that the densest possible packing of same-sized spheres is the arrangement familiar today to anyone who's ever admired a pyramid of oranges in a grocery store. Known to crystallographers as the face-centered cubic lattice packing, it fills a little over 74% of space (π/√18, to be precise).
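    The figure is easy to verify numerically; a quick check (Python, for illustration):

```python
import math

# Density of the face-centered cubic sphere packing: pi / sqrt(18)
density = math.pi / math.sqrt(18)
print(f"{density:.4%}")  # about 74.05%, "a little over 74%" of space
```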

    For nearly four centuries Kepler's conjecture has remained one of those mathematical Everests, like Fermat's Last Theorem, that people tackle for the sheer challenge of it. There's never been much doubt that the conjecture is true; the question has always been whether anyone can prove it. The answer, finally, appears to be Yes.

    Thomas Hales, a mathematician at the University of Michigan, Ann Arbor, recently announced the completion of a lengthy analysis that appears to provide a rigorous proof of Kepler's conjecture. Hales's analysis, parts of which have already been published, combines 250 pages of mathematical reasoning with computer programs that enumerate and check thousands of crucial details. “It's really an amazing achievement,” says Neil Sloane, a sphere-packing expert at AT&T Laboratories. Hales's proof has yet to undergo close scrutiny, but Sloane is impressed with what he's seen so far. “He's documented everything very carefully,” Sloane says. “Nobody's raised a single doubt about his work.”

    That's important, in light of the Kepler conjecture's recent history. In 1990, Wu-Yi Hsiang, a mathematician at the University of California, Berkeley, announced that he had solved the sphere-packing problem (see Science, 1 March 1991, p. 1028). However, Hsiang's proof encountered a buzz saw of criticism from experts, including Hales, who said there were numerous flaws and gaps in the proof's reasoning (see Science, 12 February 1993, p. 895). In 1994, Hales wrote a detailed critique of Hsiang's proof for The Mathematical Intelligencer, calling on Hsiang to withdraw his claim. (Hsiang reportedly stands by the correctness of his proof, but could not be reached for comment.)

    Hales's own proof follows a strategy first outlined by the Hungarian mathematician László Fejes Tóth in 1953. The strategy is to reformulate the conjecture in “local” terms, reducing it from a question about infinitely many spheres filling all of space to a series of questions about how certain finite arrangements of spheres fit together.

    To carry out that strategy, Hales invented a new way to allocate the empty space in a packing to individual spheres. The standard approach assigns to each sphere its “Voronoi cell,” consisting of the points that are closer to it than to any other sphere. If every sphere in every possible packing occupied no more than 74% of its Voronoi cell, the Kepler conjecture would follow immediately. But Voronoi cells don't give a consistent measure of a packing's density. The spheres in some packings occupy as much as 75.5% of their Voronoi cells, although this high density is invariably canceled out by the low density of nearby cells.

    Hales calls his alternative a “star decomposition.” Roughly speaking, a star is a modified Voronoi cell with a batch of tetrahedral protuberances. A second key ingredient of his proof is a novel way to “score” the local density of each star. In Hales's convention, the stars in the face-centered cubic lattice packing all have a score of 8 points. Any counterexample to the Kepler conjecture would have to include stars with a score greater than 8.

    “The proof gives a classification of all the stars that can potentially be a counterexample to the Kepler conjecture,” Hales says. The list is a long one, but finite: A computer program found 5094 different types of stars, any one of which could conceivably have a score greater than 8. Each type then had to be ruled out, by showing that the largest score for each type of star stayed less than 8.

    To do so, Hales had a computer convert each inequality estimate into a series of problems in linear programming, a mathematical method that is widely used for optimization problems in industry. So that no star with a winning score would slip through the gaps of round-off error, the computer did the conversions using a mathematically rigorous technique called interval arithmetic. Most of the cases were easily disposed of, but some required extra care. “At the very end, it came down to 50 or so cases that the general arguments didn't rule out,” Hales recalls. Hales and a graduate student, Sam Ferguson, looked at those cases one by one.
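    Interval arithmetic, the technique that kept round-off error from hiding a winning score, is simple to sketch: every quantity is carried as a pair of bounds, and each operation widens the bounds outward so the true value can never escape. A toy version (Python; an illustration of the idea, not Hales's actual code) might look like this:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A closed interval [lo, hi] guaranteed to contain the true value."""
    lo: float
    hi: float

    def __add__(self, other):
        # The sum of two values lies between the sums of their bounds.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product lies between the extremes of the four corner products.
        corners = (self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi)
        return Interval(min(corners), max(corners))

# If a star's score provably lies in [7.2, 7.9], it is rigorously below 8
# and the case is ruled out as a counterexample:
score = Interval(7.2, 7.9)
assert score.hi < 8

# Combining uncertain quantities preserves rigorous enclosure: the true
# product of values near 2 and 3 is trapped inside the result.
x = Interval(1.9, 2.1)
y = Interval(2.9, 3.1)
assert (x * y).lo <= 6.0 <= (x * y).hi
```

    A production implementation would additionally round each lower bound down and each upper bound up at every step, so that floating-point error itself stays inside the enclosure.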

    One particularly problematic case, a local arrangement called a pentahedral prism, consisting of 12 spheres surrounding a 13th central sphere, became the subject of Ferguson's doctoral dissertation. “I only handled one case,” Ferguson notes proudly. “It just ended up being the worst case.” Initial calculations only said the pentahedral prism had a score less than 10. Ferguson had to refine the analysis to get a bound below 8.

    Early this month, the final pieces fell into place: The last potential counterexamples had been eliminated. Hales is careful not to claim too much. “This is not a refereed paper, and so it should be taken accordingly,” he says. “I think [the experts] should be able to judge the overall strategy and methods and that sort of thing quite quickly, but I expect it'll be several months before the details will have been carefully checked.”

    Assuming all goes well, proofs of other sphere-packing conjectures might follow. “I'll be curious to see what other problems can be solved by similar methods,” Hales says. One possibility: A computationally intensive approach could suffice to prove the Kepler conjecture in four dimensions rather than three—a packing you'll never see at the grocer's.


    Wanted: A Better Way to Boost Numbers of Minority Ph.D.s

    1. Jeffrey Mervis

    With set-aside programs under fire, “majority” institutions are being asked to find new approaches to achieve diversity

    For years, the National Science Foundation (NSF) has reserved a portion of its prestigious graduate research fellowships for minority students seeking to launch a career in science. By holding a separate competition among underrepresented minorities—notably, African-Americans, Hispanics, or American Indians—for approximately 15% of the 900 slots, NSF officials hoped to increase their pitiably small number in academic science. But yesterday,* when NSF announced the rules for its 1999 awards, the minority component was gone.

    The decision was triggered by a recent pretrial settlement of a white student's lawsuit claiming that the separate competition discriminated against majority students (Science, 26 June, p. 2037). And it's part of a much broader review of some two-dozen NSF programs, which last year received $110 million to help diversify the U.S. scientific work force. “This is a big issue,” says Joe Bordogna, acting deputy director of NSF, who oversees the effort.

    NSF is not alone in questioning such activities. In the wake of a string of legal and political reversals for affirmative action programs, federal agencies, universities, and private foundations are seeking ways to increase the number of minorities in science without running afoul of the law. They range from trying to change the culture of research universities to promoting mentoring and building bridges between predominantly white institutions and historically black colleges. Some also try to “prime the pump” by reaching down into high schools or even earlier, and others also target women, a group underrepresented in many fields.

    “There's no magic bullet,” says cell biologist James Wyche, associate provost at Brown University in Providence, Rhode Island, and executive director of the Leadership Alliance, a consortium of 25 universities formed to address the issue that held its 6th annual symposium in Washington, D.C., last month. “We all need to come up with better ways to increase diversity.” The dismantling of affirmative action programs, say others, doesn't indicate that the hurdles facing minority students who embark on a scientific career have disappeared. “Some people think that the playing field is now level,” says microbiologist John Ruffin, head of NIH's Office of Research on Minority Health. “But nothing could be further from the truth.”

    A shaky record

    This rethinking is coming at a time when, despite more than a generation of programs aimed at giving minorities a greater opportunity to compete for scientific careers, all but Asian-Americans remain dismally underrepresented in science. While African-Americans, Hispanics/Latinos, and American Indians comprise 23% of the U.S. population, they make up only 4.5% of those holding scientific doctorates. When physicist and university administrator Walter Massey, president of Morehouse College and former NSF director, challenged graduate science departments almost 10 years ago to produce more minority Ph.D.s from underrepresented groups, he noted that many of them had yet to produce a single minority Ph.D. Recent figures show little change in that situation (see tables).


    As universities struggle to improve their record, two events have complicated their efforts. One was a 1996 California referendum, Proposition 209, that makes it illegal for state institutions to use race-based criteria in admissions and hiring decisions. The other is a 1996 federal appellate court ruling, Hopwood v. Texas, that imposed a similar prohibition in Texas, Louisiana, and Mississippi. This fall, residents of Washington State will vote on an anti-affirmative action referendum, and educators are also awaiting the outcome of a pending suit that accuses the University of Michigan of discriminatory admissions policies. These decisions would apply only to specific states or institutions, however; the Supreme Court has yet to go beyond its 1978 Bakke ruling, which allows education officials to use race as a factor in their decisions.

    That situation has created widespread confusion over whether a particular law, court decision, or agency policy applies to a specific action by an individual institution. “As private and public institutions, we're not supposed to have set-asides,” says Gary Ostrander, associate dean for research in the arts and sciences at Johns Hopkins University in Baltimore. “But federal agencies like NSF and NIH can have [campus-based] programs that target minorities and that we must administer without violating the law. It's a fine line that we all walk.” Private philanthropies like the Howard Hughes Medical Institute and the Sloan Foundation are also struggling with the issue.

    Many science educators say that the recent judicial and legislative activity has cast a pall over efforts to attract minorities into the profession. “It has a chilling effect on the groups you are trying to reach,” says Herbert Nickens, head of minority programs for the 123-member Association of American Medical Colleges (AAMC), which in April issued a ringing 13-page defense of affirmative action in medical education. It calls the termination of such programs “a threat to diversity [that is] even more serious than the backlash in the mid-1970s.” Notes Nickens: “Even as you're trying to drum up interest in academic science, it sends the clear message to minority students that, ‘We don't want you.’”

    However, others say that there's no point longing for something that's not coming back. “I think phase I of affirmative action is dead, and I don't lament its passing,” says Richard Tapia, an applied mathematician at Rice University in Houston, Texas, and a member of the National Science Board, which oversees NSF. “It gave us a jump start, but it was never supposed to be permanent. Now, we have to find other ways to achieve real gains.”

    New approaches

    One way is to urge the “majority” universities that churn out the lion's share of U.S. Ph.D.s to lure and retain minorities without setting quotas. Next month, for example, in addition to revamping its fellowship program, NSF hopes to award up to $2.5 million over 5 years to each of eight or so universities that have promised to graduate more minority Ph.D.s in the natural sciences and engineering. “I've been prodding them for years to do something like this,” says Representative Louis Stokes (D-OH), who inserted language last fall into NSF's 1998 spending bill that created the Minority Graduate Education (MGE) program and who provided for its continuation in the House version of NSF's 1999 budget bill. “I think we've been making minimal progress, and I think that putting more money into the effort will help.” Last month, NSF received more than 50 applications featuring partnerships between majority and minority institutions, mentoring programs, networking, and other ways to funnel more minority students into graduate programs and to lower the barriers to success.

    Although educators welcome the new program, many caution that it will be hard to change the culture at research universities—even at those with an exemplary record of trying to boost minority participation in science and engineering. For example, Purdue University has earned a national reputation for providing research opportunities for minority undergraduates under a program begun by a trio of faculty members that included biologist Luther Williams, now head of NSF's education directorate that sponsors the MGE program. But the institution still has a long way to go, says Purdue biologist Joe Vanable. “The climate here is no worse than anywhere else, but it's not good,” he says.

    So, Vanable led a group that penned a proposal in response to NSF's new program requesting money for, among other things, intensive workshops aimed at opening the minds of his colleagues on matters of race. But senior university officials questioned the likely impact of such efforts and refused to submit it to NSF. Vanable remains bitter about the decision. “It's a question of priorities,” he says. “Once, when enrollment was declining, we were threatened with budget cuts if we didn't improve the numbers. So, we met weekly to come up with ways to succeed. That's never happened with the recruitment of minorities or the hiring of minority faculty.”

    Vanable's lament that minority issues are slighted by the scientific mainstream is common among educators who have labored to raise participation rates. “What I'd like more than anything is a national summit on the subject so that we can see what everybody else is doing,” says Joel Oppenheim, director of the Sackler Institute of Graduate Biomedical Sciences and associate dean of graduate studies at New York University (NYU), who travels around the country seeking minority applicants for the university's summer research program as well as for his graduate school. “We've never had one.”

    Looking for results

    Because trying to change the culture of an institution can be a long, slow process, some funding organizations focus on departments and individuals with a good track record. “We've decided to concentrate on helping the people who have shown that they can make it happen,” says Ted Greenwood, who runs the Sloan Foundation's effort to increase the number of minorities receiving Ph.D.s in science and engineering.

    Sloan's program, begun in 1994, tries to boost Ph.D. output by giving faculty members and departments $30,000 to support each additional minority student. It also keeps score, trimming the grants of those who fall short by failing to recruit or retain the expected number of students. Although the program has yet to graduate its first Ph.D., Greenwood hopes that his annual budget of $3 million, including a grant to Purdue's biology, chemistry, and engineering departments, will eventually add 100 minority doctorates a year to the existing academic pool.

    NSF's revamped graduate fellowship program is looking for racially neutral ways to serve minorities without reserving a certain number of spots for them. At the urging of the Justice Department, which wanted to avoid a politically charged trial and a possible precedent-setting defeat, NSF officials settled the case for $95,400: paying $14,400 to the student, Travis Kidd, and $81,000 to his lawyers.

    The new selection criteria for next year's class of fellows are expected to downplay the importance of scores on standardized tests, in particular the Graduate Record Exam (GRE). One approach would set a threshold score above which every applicant is deemed acceptable and then use other factors that many people believe are equally relevant to success in science, including creativity, determination, and real-life experience, to choose the winners.

    Tapia, who advocates such a threshold as a way to increase diversity without using set-asides, says such an approach would be a marked improvement over current practices. “The misuse of standardized tests has been the worst enemy of minorities,” he says. Rice's freshman class contains a significant number of underrepresented minorities with “substantially lower SAT scores than the university at large,” he says, who were chosen on the basis of other, race-neutral criteria. “But once admitted,” Tapia says, “they are on a par in terms of retention rates and grade point average.”

    Applying the same policy to graduate admissions, he says, has allowed his colleagues to assemble a computer and applied mathematics department of some three dozen students that is one-third minority and more than half women. “And we graduate at those rates,” he says proudly. Tapia currently receives funding from the Sloan program and is hoping to win NSF funding to replicate that success at a consortium of universities.

    But administrators of other programs aimed at minorities worry that, whatever guidelines are used, the results may not make up for the loss of the minority fellowships. “[The settlement] could have a devastating effect,” warns NIH's Ruffin. “These things are very competitive, and the people who make the decisions bring to the table their own sense of what makes someone most qualified. They may not be biased, but they may not know all the factors involving minority students.”

    Indeed, choosing the appropriate factors is so problematic that most federal agencies, including NIH, sidestep the issue by making grants to institutions, which are then free to use their own selection criteria. Through bridge and partnership programs with colleges and universities that have large numbers of minority students, majority universities also can tap a much larger pool of minority students than exists on their own campus.

    At Johns Hopkins University, for example, that means linking up with two nearby historically black universities—Coppin State and Morgan State. A summer research fellowship program funded in part by the Howard Hughes Medical Institute, for instance, uses those connections to team up a dozen or so minority undergraduates with Hopkins's world-class faculty. “It happens that the program involves Coppin and Morgan State, whose student body happens to be more than 90% black,” says Hopkins's Ostrander. “We will not take race and gender into consideration in the selection process. However, if it turned out that the top 10 applicants for the program were white males, we probably wouldn't run the program.”

    Granting sources like Hughes are also treading warily. The philanthropy is still a defendant in a suit brought by a white high school student denied entry to a summer science camp for minorities run by Texas A&M University, which U.S. officials agreed to settle last December (Science, 2 January, p. 22), and top officials declined to be interviewed on the subject. “There is a changing legal climate that raises questions we have to address,” says Hughes spokesman David Jarmul. “But we think our commitment to supporting programs that increase participation by minorities and women is compatible with the law and consistent with our goal of training biomedical researchers.”

    NYU's Oppenheim also wants to do good science in an atmosphere that fosters diversity. And that requires a major commitment from “majority” institutions such as his. In 1990, he began a summer research program to attract minorities to NYU's graduate schools. It was through the program, which is now “color-blind” and currently 75% majority, that he befriended Savoy Brummer, an African-American graduate of Howard University.

    Now a second-year medical student in the honors (research-oriented) program at the medical school, Brummer has spent the past three summers at NYU doing research. He says that Oppenheim, who is white, has been an immeasurable help as he takes his first steps into a career in medical research and that trust, not a desire to remove racial barriers, is the key to their close relationship. “I decided to come to NYU because Joel promised to stay here until I'm done,” he says. “In a way, I've put my life in his hands. And he's always there for me.”


    NIH Concocts a Booster Shot for HIV Vaccines

    1. Jon Cohen*
    1. Jon Cohen is on leave from Science, writing a book about AIDS vaccines.

    After being criticized for moving too slowly on AIDS vaccine research, NIH is putting more urgency into the push to develop and test candidates

    When the National Institutes of Health (NIH) decided 4 years ago not to fund large-scale efficacy trials of the leading AIDS vaccines then under development, the move underlined a stark and sobering message: A decade had passed since HIV had been unmasked as the cause of AIDS, yet researchers had not even found a vaccine promising enough to justify the expense of a full-scale test. The field lost what little momentum it had. Now, NIH is trying to give AIDS vaccine research a shot in the arm.

    Neal Nathanson, a viral epidemiologist who in May left a long career at the University of Pennsylvania to take over NIH's Office of AIDS Research (OAR), says revitalizing the vaccine program is his top priority. “NIH money watered the basic science field, and we've let 100 flowers bloom,” he says. “Now, we have to figure out some way of harvesting those and getting them eventually into [human] trials.” In recent interviews with Science, Nathanson laid out how he and other NIH officials plan to speed vaccine development. The steps include boosting funding; creating a new peer-review study section to evaluate vaccine proposals; launching a set of standardized, comparative tests of candidate AIDS vaccines in monkeys; and trying to stimulate partnerships between U.S. investigators and colleagues in other countries. NIH even announced last week that it will collaborate in analyzing results from tests of one of the very vaccines it declined to sponsor 4 years ago—tests that are now being carried out by a private company.

    These changes come at a time when NIH has been under heavy criticism for not doing enough to speed the search for an AIDS vaccine. “Pretending to fill the leadership gap, marginally increasing public funds, and improving part of the grant evaluation and awards process does not add up to the full mobilization we need,” the AIDS Vaccine Advocacy Coalition—an activist group that analyzes obstacles to vaccine development—complained in a stinging critique of the government's efforts that came out in May.

    And NIH is having trouble making good on a pledge, announced by President Bill Clinton with great fanfare in May 1997, to establish a Vaccine Research Center on NIH's campus to speed the AIDS vaccine search by gathering intramural researchers from different disciplines under the same roof. Although a site has been selected for the center, a joint venture of the National Cancer Institute (NCI) and the National Institute of Allergy and Infectious Diseases (NIAID), a director has not yet been hired. “The Vaccine Research Center is something we hold tremendous hope for,” says Nobel laureate David Baltimore, who heads an influential NIH advisory group known as the AIDS Vaccine Research Committee (AVRC). “But, frankly, it has been quite difficult to find a director who has the eminence and capabilities.”

    Baltimore says he's delighted at the new leadership of the AIDS vaccine effort, though. It includes not only Nathanson, a former AVRC member, but also Margaret Johnston, who as of next month will run NIAID's AIDS vaccine program. Johnston, who left NIAID in 1996 to work at the International AIDS Vaccine Initiative—a private organization that has been critical of NIH's efforts—says, “I wouldn't go back unless I was absolutely convinced that the NIH was taking AIDS vaccine development more seriously.”

    One indication that NIH is taking the problem more seriously is money: Next year's $1.73 billion AIDS budget calls for spending nearly $180 million on vaccines, a 79% jump from 4 years ago (see table). This increased funding has begun to lure researchers into the field. “Many of my colleagues never considered doing vaccine research before,” says AVRC member Beatrice Hahn, a prominent AIDS researcher at the University of Alabama at Birmingham. “I didn't consider working in vaccine research, whereas now it's the focus of my lab.” NIAID also is reaching out to exceptional researchers outside the United States, allowing them for the first time to be the principal investigators on NIAID-funded AIDS vaccine collaborations.
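
    The funding arithmetic above can be double-checked; a minimal sketch in Python (the $180 million figure and the 79% increase are from the text, the rest is simple division):

```python
# Back-of-envelope check of the budget figures quoted above:
# roughly $180 million for vaccines after a 79% jump over 4 years.
current_vaccine_budget = 180e6   # dollars, next year's request
jump = 0.79                      # 79% increase versus 4 years ago

# If next year's figure is 1.79 times the old one, the baseline follows.
baseline = current_vaccine_budget / (1 + jump)
print(f"Implied vaccine budget 4 years ago: ${baseline / 1e6:.0f} million")
```

    The implied baseline is on the order of $100 million, so the quoted jump amounts to nearly doubling vaccine spending over the period.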


    Asked whether this surge of money might support second-rate grants that decrease the average quality of AIDS vaccine research, Nathanson says he “won't lose a moment's sleep” if quality declines as more people are funded. “Speed and cost efficiency are totally different parameters, and you can't have both at the same time,” he says. The crucial thing, he says, is to explore all promising ideas as quickly as possible. “The way this epidemic is going, any other approach would be intellectually absurd and ethically unconscionable.”

    Johnston points out that the traditional NIH peer review system has, in any case, often given vaccine proposals short shrift. The problem, she says, is that vaccine research tends to be empirical, and study sections favor more basic research. To correct this bias, NIH—after being prodded by OAR (pre-Nathanson)—plans to form a new study section in November, the Vaccines Special Emphasis Panel, to evaluate grant applications for AIDS and other diseases.

    One problem the panel will face, however, is judging which approaches show the most promise. Researchers still are debating which immune responses a vaccine must trigger to provide protection. “It's really true that the road is unclear,” says Baltimore. And empirical results from monkey tests that challenge vaccinated animals with SIV, HIV's simian cousin, have been confusing, too, because labs have used different reagents and protocols, making it difficult to compare results. Now, says Nathanson, it's time “to take an orderly, logical approach.”

    Nathanson is advocating a large-scale, standardized comparison of dozens of vaccines in monkey tests. “We'd like to bracket the [candidates] from the very effective to the ineffective,” says Nathanson, who is gambling that the monkey model will reflect what happens in humans. Human versions of the most promising vaccines then would move forward. Primate researcher Norman Letvin of Harvard Medical School is helping to organize these studies. “I am very excited,” says Letvin. But he adds that “to do this is moving mountains because everyone sitting around the table has their own interests.”

    In parallel with the increased emphasis on monkey studies, NIH is more aggressively pursuing human trials. NIAID's new Integrated Preclinical/Clinical Program awards grants to academic researchers who promise to move candidate AIDS vaccines into human trials, and Nathanson says NCI has a plant in Frederick, Maryland, that academics could ask to manufacture vaccines for human studies. “In the past, NIH almost exclusively relied on companies to bring products into clinical trials,” explains Johnston.

    The drive to learn more from human studies even led NIH last week to announce that it would take part in tests of one of the vaccines it rejected 4 years ago. The vaccine—a preparation made from HIV's envelope protein, called gp120—is made by VaxGen of South San Francisco, which during the past few years raised private money to stage efficacy trials. The trials, the first of their kind, began in the United States this June. NIH will conduct laboratory studies of the immune mechanisms behind the vaccine's successes or failures. “Potentially valuable science will be captured because VaxGen itself will do limited studies with licensure as the goal, not scientific understanding,” says Baltimore, who, interestingly, is one of the vaccine's many critics (Science, 30 January, p. 650).

    Nathanson, who has been a prime player in the backroom negotiations with VaxGen, has worked hard to spread this pragmatic point of view. “Obviously, the VaxGen trial can make important contributions in a variety of different ways to developing a vaccine,” he says. “That's all I really care about.”

  11. Coaxing Big Pharma Onto the Playing Field

    1. Jon Cohen

    The National Institutes of Health's efforts to inject new life into AIDS vaccine development are focused mostly on its traditional clients in academic science (see main text). But one key player is sitting largely on the sidelines: Big Pharma. Now, the World Bank is considering an initiative to coax more companies onto the playing field.

    Pharmaceutical companies have not avoided AIDS vaccines altogether. Pasteur Mérieux Connaught has a significant AIDS vaccine program, and Merck says it's beefing up its efforts. Novartis has invested heavily in Chiron, a biotech that has a long-standing HIV vaccine program, and Wyeth-Lederle recently purchased a smaller biotech in the running, Apollon. But industry is cautious about potential returns from new vaccines, especially when the greatest demand for them is in poor countries, and it has mainly left the commercial field to small, cash-starved biotechs. “All along, the biggest problem has been how do you deal with the economic incentives for companies,” says Seth Berkley, president of the International AIDS Vaccine Initiative (IAVI), a private group that has been holding workshops on the topic for the past 4 years. Enter the World Bank.

    Building on an idea that came from an IAVI workshop, the World Bank has formed a task force (which includes Berkley) to plan a project called Innovative Financing Instruments for HIV/AIDS Vaccines, which would provide a guaranteed market for AIDS vaccines. “This project is considered a high priority within the bank and has generated quite a bit of excitement,” says Amie Batson, a management consultant who co-chairs the task force and specializes in linking the vaccine industry to public health needs.

    The task force is exploring the idea of creating a guaranteed purchase fund for AIDS vaccines. Under this plan, countries would secure either a loan or a gift—depending on their economic status—from the World Bank that they would promise to spend on a working AIDS vaccine. “If there were a large pot of gold at the end,” says Batson, companies might be more willing to invest now in the science. Batson says the task force also is discussing other incentives, such as awarding a monetary prize to the discoverer of a working vaccine. With the blessing of the World Bank's board, these programs could be in place by next summer.

    Berkley says without more involvement from industry, intriguing scientific leads will languish. “If you increased the NIH budget four times, that would be great—it would move the science forward,” he says. “But ultimately, if the corporations aren't there to begin to take on that science and really put a serious effort into making vaccines … it's not going to happen.”


    Botanical Gardens Cope With Bioprospecting Loophole

    1. Alan Dove*
    1. Alan Dove is a writer based in New York City.

    Experts will draft guidelines ensuring that indigenous peoples profit from specimens collected from their countries years ago

    With thousands of species from around the world collected under one roof, a botanical garden may seem like a mother lode for drug companies and scientists interested in probing exotic plants or fungi for novel biochemicals and genes. Not only do botanical gardens and seed banks house up to a third of the world's vascular plant species, but the majority of an average garden's holdings may come with an economic bonus: Because they were collected before the biodiversity treaty was signed by 160 countries in Rio de Janeiro 5 years ago, they don't come under its protection. Thus, any company that struck botanical gold in a collection may not be obliged under the treaty to pay a cent to the country in which the specimen originated.

    Botanical gold. Guidelines could spell out who profits from garden holdings, such as this Exacum (Gentianaceae) from Madagascar.


    Alarmed by this prospect, watchdog groups and scientists worldwide are stepping up efforts to craft regulations to ensure that source countries are compensated for products derived from specimens gathered by botanical gardens. And representatives of gardens and arboreta themselves are also confronting the issue. They will meet in Kirstenbosch, South Africa, in September to hammer out a consistent, nonbinding policy that they hope to finalize within a year.

    Although garden officials back such efforts, they argue that they are policing themselves effectively. “If we collect anything in a foreign country as a herbarium specimen or for cultivation, it's available for sampling [only with] permission from the country where we got it,” says Peter Raven, director of the Missouri Botanical Garden in St. Louis, who says this policy is common at other botanical gardens. But some activists argue that general rules are needed to back up those assurances.

    Agreements between gardens and drug companies “are being put together in a very ad hoc way,” asserts Edward Hammond, a program officer at the advocacy group Rural Advancement Foundation International in Winnipeg, Canada. Hammond points to an arrangement between the New York Botanical Garden (NYBG), the drug company Pfizer Central Research, and several Hawaiian botanical gardens, in which Pfizer pays an undisclosed sum for plant research in exchange for the right to license promising compounds. After investigating the agreement in 1995, Hammond claims it was “very clear” to him that the proposal “had nothing to do with compensating countries of origin or [the] people from whom these collections came.”

    Not so, say NYBG officials. The garden forges agreements only with those groups—including Pfizer—that pledge to return a fixed portion of future royalties from plant products to the source country, says Hans Beck, an NYBG botanist. Each specimen “has careful records that will allow us to identify, inform, and compensate the original collaborators, decades from now when a product might be developed,” adds Michael Balick, director of NYBG's Institute of Economic Botany. The garden's official policy says that compensation would amount to half of all net royalties from any discovery.

    The gardens' own efforts to craft fair guidelines at the Kirstenbosch meeting will be difficult, however—partly because it's often unclear where a plant came from. “How far back in time do you go in terms of how long people have lived in an area, and where do different genes originate?” asks Allan Stoner of the U.S. Department of Agriculture's National Germ Plasm Research Laboratory. Seed banks are already having problems working up guidelines compatible with the biodiversity treaty, Stoner says, because the United Nations Food and Agriculture Organization favors free exchange of agriculturally useful plants. “Many countries are still trying to figure this whole thing out.”

    Balick and others agree, however, that the time is ripe to tackle thorny ownership questions. Although, he says, “I don't know of any commercial drug that has come from a botanical garden collection,” the gardens should be players in this debate. “If companies are interested in natural products such as plants,” says Balick, “they come to where the botanists are—which is gardens.”


    Miocene Survivors: Armed to the Teeth?

    1. Jocelyn Kaiser

    The land where buffalo roam and deer and antelope play is a pale shadow of an earlier time, at least for hoofed species. Ten million years ago in the Miocene, the North American continent was home to an array of grazing mammals, including camels, rhinos, and as many as 20 different species of horses. Most of these species died out about 6 million years ago, probably because of a sudden climate shift. At the meeting, paleontologist Steven Stanley of Johns Hopkins University aired a provocative view as to what may have helped cause the extinction: scratchy grass.

    In the new climate, Stanley suggests, softer grasses gave way to more abrasive plants, so that only certain horses and other grazers with teeth able to withstand tough chewing survived.

    Stanley's idea attempts to explain a mammalian mystery: The species that survived the Miocene are almost all animals with very high-crowned molars, while those that died out have lower-crowned teeth. That pattern was laid out in a 1993 Paleobiology paper by Richard Hulbert of Georgia Southern University in Statesboro. Vertebrate paleontologists David Webb and Bruce MacFadden of the Florida Museum of Natural History in Gainesville, who helped document the extinction, have an explanation for why most browsing mammals—low-crowned animals such as some camels and deerlike species that nibbled on soft leaves on trees and shrubs—might have died out. They argue that in the new, drier climate at the end of the Miocene, the savanna—a grassland with scattered trees—became only grassland. But that doesn't explain why many grazing animals, which should have been able to survive on grasses alone, also vanished.

    Stanley put this puzzling extinction data together with other recent clues to the vegetation changes of the time. By looking at carbon isotopes in the teeth of more than 500 horselike and other fossil mammals, Thure Cerling, a geochemist at the University of Utah in Salt Lake City, and others sought clues to animals' diets and thus their habitats. As Cerling's group reported in Nature last September, the teeth reveal a major shift across North America, and much of the rest of the world, in the composition of late-Miocene plants: Wheat, bluegrass, and other plants known as C3 (because of their photosynthetic pathway) gave way to tougher C4 plants, such as crabgrass and Bermuda grass, which prefer a drier, more tropical setting. The reason, the group suggested, might be a global drop in atmospheric carbon dioxide levels, which would have favored C4 plants.

    Stanley now links this switch to C4 grasses to the mammal extinctions. Paleontologists generally think that grazers evolved high-crowned teeth so they could tolerate the wearing effects of grasses. Stanley's hunch: Horses with shorter teeth died young because their teeth were worn down by the even more abrasive C4 grasses. He says he “dug around in the literature to try to test this idea” and found support: Botany papers showed that C4 grasses, which have more veins than C3 plants, produce more than five times as many abrasive silica bodies by leaf area.

    Stanley concludes that the grass change doomed animals that were not long in the tooth. “If you need long teeth to eat grass with silica, then you need longer teeth to eat grass with more silica,” he says. Stanley says this could also explain another twist to the mystery noted by Webb, MacFadden, and Hulbert—that a pocket of shorter-toothed grazers persisted on the U.S. Gulf coastal plain, which stayed moister than the rest of the continent and still supported savanna where grazers could supplement a C4 diet.

    Although Hulbert calls Stanley's idea “plausible,” he and others say it's not yet convincing. Other theories could also explain the demise of the short-toothed grazers, notes paleontologist Paul Koch of the University of California, Santa Cruz. For example, the new mix of grasses would have had less protein than C3 grasses, so horses would have had to chew more of it—putting longer-toothed horses at an advantage regardless of the grass' silica content. One strategy to test Stanley's hypothesis might be to study the tooth wear rates of modern species, says Hulbert—for example, by comparing the teeth of U.S. bison with those of bison in Canada, where there is more C3 grass. These modern grazers might offer clues to why their ancestral cousins weren't so lucky.


    Nitrogen as Forest Fertilizer Falls Short

    1. Jocelyn Kaiser

    From car exhaust to agricultural fertilizer runoff, nitrogen pollutants are blitzing Earth's ecosystems, triggering fish-choking algal blooms and other ecological problems. But environmentalists have had at least one reason to cheer this nitrogen glut: The nutrient should fertilize tree growth, spurring forests to soak up human-made carbon dioxide that would otherwise fuel global warming. Such an effect has been a leading explanation for where a big chunk of the world's carbon emissions disappear to—a mystery known as the “missing sink.”

    At the meeting, however, ecologist Knute Nadelhoffer of the Marine Biological Laboratory in Woods Hole, Massachusetts, described experiments in the northeastern United States and Europe that suggest nitrogen isn't spurring much forest growth at all. His group and others traced the path of nitrogen compounds added to experimental forest plots and found that, at least in the short term, the nitrogen winds up mostly in soils, not in trees. If the finding holds up, “it has some pretty big implications” for trying to find the missing sink, says terrestrial ecosystem modeler Alan Townsend of the University of Colorado, Boulder.

    Over the last several years, scientists have traced the fate of about 75% of the carbon dioxide pumped into the atmosphere each year by human activity. But a whopping 1.8 petagrams of carbon remains unaccounted for. Global carbon dioxide measurements suggest that the carbon is disappearing into the land, not the oceans (Science, 24 July, p. 504).

    One leading idea: As vegetation is plied with more and more nitrogen, it grows faster and thus takes up more carbon dioxide. The amount of nitrogen falling onto the Earth's surface as rain and soot has more than doubled since the 1960s, fed by fossil-fuel burning and ammonium nitrate from manure and fertilizers (Science, 13 February, p. 988). Townsend has estimated that if 80% of deposited nitrogen goes into tree biomass, it could account for up to 1.3 petagrams of the missing carbon.
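
    The carbon-budget arithmetic in the two paragraphs above can be sketched out; a minimal back-of-envelope calculation in Python, treating the quoted figures (75% of emissions traced, 1.8 petagrams missing, up to 1.3 petagrams potentially explained by nitrogen) as given:

```python
# Rough carbon-budget arithmetic using the figures quoted above.
traced_fraction = 0.75      # share of annual emissions accounted for
missing_pg = 1.8            # petagrams of carbon unaccounted for per year

# If 1.8 Pg is the untraced 25%, total annual emissions follow directly.
total_emissions = missing_pg / (1 - traced_fraction)   # Pg C per year

# Townsend's upper-bound estimate: nitrogen fertilization could soak up
# as much as 1.3 Pg of that missing carbon.
n_fertilization_pg = 1.3
share_of_gap = n_fertilization_pg / missing_pg

print(f"Implied total emissions: {total_emissions:.1f} Pg C/yr")
print(f"Upper-bound share of the gap from N fertilization: {share_of_gap:.0%}")
```

    On these numbers, nitrogen fertilization at its upper bound would account for roughly three-quarters of the missing sink—which is why the low tree-uptake figures reported below are so consequential.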

    But there's been a dearth of data on the fate of nitrogen once it hits the ground. That's what Nadelhoffer's team set out to gauge in the Harvard Forest's red pine and oak stands. In 1991 and 1992, the group sprayed four 900-square-meter plots every month with water doped with nitrogen-15, a stable isotope found naturally only in minute amounts. The researchers traced where the nitrogen went by feeding wood, root, and soil extracts into a mass spectrometer, which measures isotopic composition. The results, now in press at Ecological Applications, were a surprise: At more pristine plots, the nitrogen stayed mostly in the soil, with as little as 5% of the nutrient ending up in the trees within 2 years after spraying began. In plots that had already been heavily fertilized since 1988—at levels comparable to polluted parts of Europe—as much as 25% of the nitrogen showed up in the wood within 2 years. But that's far less than levels that would account for even half the missing sink, points out Nadelhoffer. “We thought we might see a much higher level,” he says.

    A similar experiment by Nadelhoffer's group in a Maine conifer forest yielded comparable results, and he reported at the meeting that other teams in six European forests are finding nitrogen uptake levels in trees running no higher than 30%. Nadelhoffer thinks that most nitrogen gets bound up in organic complexes in soil humus that would become accessible to tree roots only slowly.

    These nitrogen experiments aren't the final word, Nadelhoffer cautions, because there could be some lag in the time it takes trees to absorb nitrogen from the soil. Townsend also notes that the sites didn't cover all forest types. “We need a few more years of data and sites to really put the nail in the coffin,” he says. And forests may still be taking up the extra carbon dioxide, but not because of extra nitrogen, he adds. If nitrogen fertilization is ruled out, says Townsend, “we're going to have to start looking somewhere else.”


    Green Thumb for the Southwest

    1. Kathryn S. Brown*
    1. Kathryn S. Brown is a writer in Columbia, Missouri.

    In the U.S. Southwest, color arrives and vanishes suddenly, as fleeting wildflower blooms paint the stark desert. But this dramatic landscape may not last forever. New data presented at the meeting suggest that global warming could shift the Southwest from a chiefly desert biome to one overrun by grasses and shrubby trees.

    The new view stems from efforts to examine how vegetation will shift with rainfall patterns following increased atmospheric concentrations of carbon dioxide (CO2). For 20 years, climate change models have suggested that global warming will alter rainfall in many regions of the world. One often-cited example is the Southwest, home of summer rains dubbed the Arizona monsoon. Ecologists would like to know how plants will respond—and migrate—if such storm systems intensify over the next 50 to 100 years.

    Ecologists Jake Weltzin at the University of Notre Dame in South Bend, Indiana, and Guy McPherson of the University of Arizona in Tucson tested how the shrubby oak Quercus emoryi—which dots lower tree lines across the Southwest—might respond to higher average summer rainfall. Working at Arizona's Fort Huachuca Military Reservation from 1994 to 1996, the duo watered oak plots—sheltered from precipitation by plastic film and cut off from groundwater by film-lined trenches—with various amounts of rain collected nearby. They found that high summer “rains,” or 5.25 millimeters of precipitation per day, produced three times as many oak seedlings as dry summer conditions of 1.75 millimeters per day. Similar increases in watering in wintertime had no such effect.

    Even if prolonged droughts punctuate increased precipitation from global warming in the Southwest, shrubby trees should still thrive: Higher atmospheric CO2 levels typically favor woody plants, which respond by growing more and evaporating water more slowly, says Wayne Polley of the Agricultural Research Service in Temple, Texas. In greenhouse experiments reported at the meeting, Polley found that increasing CO2 concentrations from 370 ppm to 700 ppm doubles the number of honey mesquite seedlings that survive drought. A one-two punch of summer rains and CO2, therefore, could lead to an explosion of shrubby oaks and other woody plants, McPherson says. “We're talking about these plants going from less than 5% to more than 30% of the canopy cover in many areas of the Southwest,” he says.

    Woody plants aren't the only likely desert invaders. In a third talk, ecologist Ronald Neilson of the Forest Service in Corvallis, Oregon, predicted that U.S. grasslands could spread into the Great Basin and grow denser in California and Texas. Neilson unveiled new, more finely resolved analyses of vegetation changes predicted by the Mapped Atmosphere-Plant-Soil System (MAPSS) computer model under a doubling of atmospheric CO2. According to Neilson's conclusions—some of which appeared in the June issue of Global Change Biology—the Bermuda high, a tropical air mass that fuels the Arizona monsoon, would intensify. As a result, plants such as love grasses and grama grasses could eat away at low-lying desert, Neilson says.

    Ecologists and atmospheric scientists alike welcome the new data, but they stress that pinning down any link between climate change and vegetation is tricky. During a global warm-up 11,000 years ago, “the various vegetation types didn't just migrate [en masse] around the map of North America,” says James Brown of the University of New Mexico in Albuquerque. Jerry Meehl of the National Center for Atmospheric Research in Boulder, Colorado, adds that the Southwest's complex landscape of mountains and desert makes forecasting regional climate change difficult. “There is a chain of uncertainty that runs through links between CO2, rainfall, and vegetation,” Meehl says. “But to get at actual ecosystem impacts, these are the kinds of studies you have to do. This type of research can be very useful.”


    Stroke-Damaged Neurons May Commit Cellular Suicide

    1. Marcia Barinaga

    Recent evidence that neurons die from apoptosis in oxygen-deprived brains and also in the brains of Alzheimer's patients suggests new approaches to therapy

    When human beings commit suicide, it's almost always a tragedy. But within the organism, the cellular suicide that goes by the name of apoptosis is vital to life. It prunes tissues during embryonic development and removes damaged cells in the adult in a neat, orderly way. But apoptosis has a dark side as well: If it's turned on at the wrong time, crucial cells may die off. Over the past few years, researchers have come to suspect that's just what happens when a stroke or heart failure deprives the brain of oxygen.

    When the blood supply to part of the brain is blocked, as in a stroke, neurons in the most severely affected area die immediately from oxygen starvation, known as ischemia. But a long-standing puzzle in neurology is what causes the more gradual loss of neurons in the region outside the stroke's core, where the oxygen supply is reduced but not eliminated. Recent experiments in rats and mice suggest a possible explanation: Some cells that might otherwise recover from the ischemia may be dying because the injury triggers their suicide programs. When researchers induced brain ischemia in lab rodents by temporarily cutting off blood flow to the animals' brains, for example, they found dying neurons there that show some key criteria of apoptosis. In particular, the cell death seemed to be controlled by caspases, protein-clipping enzymes that orchestrate the cell's death program (Science, 3 April, p. 32).

    These findings haven't proved that the dying neurons are truly undergoing apoptosis, because the cells don't completely fit the textbook description. Because of all the stereotyped criteria it is expected to meet, “apoptosis seems to be a charged word,” says neuroscientist Michael Moskowitz of Harvard Medical School in Boston. “Maybe it's better to just talk about ‘caspase-mediated cell death.’” But whatever it's called, he says, the cell death should be a good target for therapeutic drugs aimed at limiting stroke damage.

    Indeed, recent work already suggests that caspase inhibitors should be added to the list of potential new drugs for stroke. What's more, research on a related problem—brain damage caused by oxygen starvation during or just after birth—hints that the same drugs may be able to limit that type of damage as well.

    Until about 5 years ago, most researchers thought that neurons killed during strokes die not by an orderly program of apoptosis but simply by breaking apart in an uncontrolled form of death called necrosis. That's a messy way to go—necrotic cells spill their contents, which can attract potentially harmful inflammatory cells to the damaged site. But in the mid-1990s, several research groups questioned that dogma and produced evidence suggesting that at least some of the brain neurons damaged by stroke succumb instead to apoptosis. This is a much neater way of dying in which the cell disassembles its DNA and breaks up its contents into membrane-wrapped packets, which can be cleared away without causing inflammation.

    The first clue came when researchers showed that the protein-synthesis inhibitor cycloheximide reduces brain damage in laboratory rats given experimental strokes. They chose cycloheximide for the experiments because it was known to inhibit apoptosis. But as neuroscientist and stroke researcher Dennis Choi of Washington University in St. Louis notes, “that early work can be criticized because [cycloheximide] is relatively nonspecific.” Because the drug blocks protein synthesis generally, researchers couldn't be sure which of its many effects was behind the stroke results.

    But around the same time, several groups studying rat brain neurons following strokes found other signs of apoptosis: DNA breaks and an overall granular appearance caused by the condensation of the chromosomes in the decomposing nuclei. They concluded that the neurons were dying by apoptosis.

    Those findings soon became controversial. Neuroscientist Alastair Buchan of the University of Calgary, one of the researchers who found the DNA breaks, had second thoughts about whether the brain neurons in which they occur are truly undergoing apoptosis. Under closer examination with electron microscopy, he says, the contents of the dying cells don't appear to be neatly packaged in membranes, as they are in apoptotic cells. Instead, “the membranes are completely shot.”

    In addition, a collaborator of Buchan's, John MacManus of the National Research Council of Canada's Ottawa lab, who has done more extensive tests on the DNA in the dying neurons, found that it is cut up differently than the DNA in true apoptosis. Molecular tests showed that instead of being blunt, the ends of the DNA are ragged, suggesting that it is being chopped not by the enzyme that normally cuts the DNA during apoptosis but by some other DNA-degrading enzyme.

    MacManus and Buchan argue that the neuron death in stroke is something between apoptosis and necrosis. They see this not as a setback but as an opportunity. “There may be processes in effect here … that are specific to ischemic neurons,” says MacManus. And that, Buchan adds, could “yield us a lot of therapeutic targets, which we badly need.”

    Others maintain that neurons dying by apoptosis shouldn't be expected to look exactly like other apoptotic cells. Nancy Rothwell of the University of Manchester in the United Kingdom notes that apoptosis was first described in dividing immune cells. “I would be surprised if the criteria established in one type of cell are going to look identical in another,” she says, especially in neurons, which do not divide.

    The view that apoptosis is the cause of death for some of the neurons killed by stroke got a big boost last year when Michael Moskowitz, Junying Yuan, and their collaborators at Harvard Medical School reported evidence suggesting that caspase activity plays a role in cell death after stroke. The team induced strokes in mice whose caspases had been blocked in either of two ways: by using peptides that inhibit a range of the enzymes, or by a genetic engineering trick that specifically inhibits a particular caspase called caspase-1.

    Caspase inhibition by either method decreased the area of stroke damage by up to 40% or 50% compared with control animals, says Moskowitz. What's more, the neurons were not only saved from dying but seemed to remain in working order; the animals had fewer movement and sensory impairments than controls.

    Although many researchers consider caspase activation to be an even more reliable hallmark of apoptosis than the DNA changes measured in the earlier experiments, these results don't settle the issue. Caspase-1, the caspase knocked out in the genetically altered mice and one of those targeted by the peptide inhibitors used, lives a double life. In addition to playing a direct role in apoptosis, it also cleaves and activates interleukin-1β (IL-1β), a signaling molecule of the immune system that triggers inflammation. And inflammation after strokes also causes neuron damage and death.

    Rothwell's team at the University of Manchester has shown that IL-1β activity goes up in the brains of rats after stroke and seems to contribute to the neuron damage that ensues. When they block IL-1 action with a protein called IL-1 receptor antagonist (IL-1ra), Rothwell reports, “it reduces ischemic brain damage by greater than 50%.” Indeed, she adds, “there is considerable commercial interest” in developing IL-1ra into a possible stroke therapy.

    But not all the effects of caspase inhibitors can be explained away by inhibition of inflammation. Moskowitz says his group detected the neuron protection before the time when inflammatory changes typically occur in the brain. What's more, other evidence suggests that caspase-3, which has no direct role in inflammation but is key to apoptosis, also helps kill neurons after ischemia. In May, Moskowitz, Yuan, and their colleagues reported that they had used antibodies that specifically recognize activated caspase-3 to show that caspase-3 activity increases in rodents' brains after stroke. Also in the past few months, Roger Simon and his colleagues at the University of Pittsburgh School of Medicine and a group at Eli Lilly and Co. in Indianapolis reported that caspase-3 gene expression is also elevated in rats with global ischemia, the type of brain oxygen starvation that occurs during heart failure. “This puts the apoptosis story on a firmer footing,” says Moskowitz.

    Researchers will now want to follow up on this work by exploring the safety of known caspase inhibitors for use in stroke therapy, and by searching for even better drugs. And this may not be the only type of ischemia in which they might prove helpful.

    Most neuroscientists agree that neurons in the brains of newborn rats deprived of oxygen undergo a death that has all the features of textbook apoptosis, including the intact membranes missing in adult brains. This led neurologist David Holtzman of Washington University to wonder whether caspase inhibitors might prevent cerebral palsy in human babies who have suffered ischemia during difficult births, and in those born weighing less than 1500 grams, half of whom develop some form of ischemic brain damage due to their immature lungs and brains. If neural apoptosis is to blame, says Holtzman, “one would predict that if you inhibited caspases … you would protect the brain.”

    Animal tests of his hunch have looked promising. Holtzman's team reported earlier this year that apoptosis inhibitors given to newborn rats within 3 hours after the researchers had induced brain ischemia protected brain neurons from death. They are now studying rats treated this way to see if the neurons they save go on to function normally. At least one pharmaceutical company is interested in trying this approach to treat infants at risk for cerebral palsy, Holtzman says. So in young as well as old brains, protecting neurons from untimely death may provide a new lease on life.

  17. NEWS

    Is Apoptosis Key in Alzheimer's Disease?

    1. Marcia Barinaga

    Cell suicide may play a role in Alzheimer's disease, although the value of blocking it is not yet known

    Researchers studying diseases as devastating as Alzheimer's want to explore every lead that could produce a treatment or cure. So while they have several good potential culprits in the nerve cell death that characterizes the disease—most notably a toxic protein called β amyloid (Aβ)—neuroscientists are also following up on evidence that some of the dying neurons commit a form of suicide called apoptosis. What remains to be seen is whether the cellular equivalent of suicide intervention will help slow the disease's progress.

    Researchers have found that the brains of Alzheimer's patients contain dying neurons that display certain characteristic signs of apoptosis, such as DNA breaks. Even more intriguing, three proteins already linked to Alzheimer's pathology—Aβ itself, along with two others called presenilins—seem to drive cells into apoptosis when conditions are right. Because of the lack of good animal models of the disease, no one has been able to test whether inhibitors of apoptosis can protect against cell death in Alzheimer's, as they can in animal models of stroke (see p. 1302).

    But if apoptosis does turn out to play a role in the nerve cell loss, the finding could lead to badly needed new Alzheimer's therapies—a goal that keeps researchers interested despite the uncertainties. “Will [preventing apoptosis] wind up being a good thing?” asks Steven Younkin of the Mayo Clinic in Jacksonville, Florida. “That is an open question … but I don't think that means you shouldn't pursue [it].”

    The notion that apoptosis occurs in Alzheimer's came to light in 1993 with the work of two research teams, Carl Cotman's at the University of California, Irvine, and Gianluigi Forloni's at the Institute of Pharmacological Research in Milan, Italy. The teams showed that Aβ, which builds up in the brains of people with Alzheimer's, causes cultured neurons to die by apoptosis—also known as programmed cell death because it involves the activation of a genetic program for dismantling cells—rather than simply by falling apart in an uncontrolled form of death called necrosis. “This made a prediction that [apoptosis] might be present in human brains” affected by the disease, Cotman says.

    Shortly thereafter, his team found signs of the broken-up DNA typical of apoptosis in brains from people who had died from Alzheimer's. But many researchers remain skeptical. “The notion that the neurons that die in Alzheimer's disease undergo apoptosis is a very sound concept that hasn't been totally proven,” says Younkin. “There are problems looking in Alzheimer's brains and knowing whether you are looking at true apoptosis or just fragmentation of DNA that resembles apoptosis.”

    Cotman cites other data, including evidence for the activation of enzymes called caspases, which play a role in apoptosis, that convinces him that programmed death is really occurring in Alzheimer's brains. No one yet knows, however, exactly how Aβ might cause the apoptosis.

    But Aβ is only one of the links emerging between Alzheimer's disease and apoptosis. Some researchers studying the presenilins, the protein products of two genes that are mutated in some inherited forms of Alzheimer's, suspect that these proteins might help to regulate apoptosis. Early evidence came from Luciano D'Adamio and his colleagues at the National Institute of Allergy and Infectious Diseases. In 1996, they discovered that cultured neuronlike cells known as PC12 cells that had been engineered to make presenilin 2 are much more sensitive to apoptosis triggers—including Aβ—than normal cells. What's more, PC12 cells engineered to make the mutant form of presenilin 2 that causes Alzheimer's disease were even more likely to die.

    That report verified what other researchers had also observed when they engineered cells to make presenilins, says Alzheimer's researcher Rudy Tanzi of Harvard Medical School in Boston. He and others speculated, however, that the engineered cells might be dying because they make huge amounts of presenilin, which could be overloading the cells' protein transport pathways. Dora Kovacs in Tanzi's group has ruled out that explanation, at least for the mutant presenilins. She recently found that cultured neurons making just normal amounts of mutant presenilin 1 look healthy but are still more easily pushed into apoptosis by various forms of stress.

    Just how the mutant presenilins might increase neuronal susceptibility to apoptosis remains unclear. One possibility is that normal presenilin protein actually helps keep the brakes on apoptosis, and that the mutations somehow interfere with that function. Results reported last month by Jean-Pierre Roperch, Adam Telerman of the Fondation Jean Dausset-CEPH in Paris, and their colleagues support that picture. They found that two apoptosis-promoting proteins, p53 and p21, both turn off presenilin 1 production, and that turning off presenilin 1 by other methods also favors apoptosis. “If a cell wants to undergo apoptosis, it has to turn presenilin 1 down,” says Tanzi. “That suggests presenilin 1 is normally antiapoptotic.”

    That's also suggested by an observation reported last year, first by Tanzi's group and then by D'Adamio's group and a team led by Helmut Jacobsen at Hoffmann-La Roche in Basel, Switzerland. These researchers found that in cultured neurons undergoing apoptosis, presenilins are cut by the caspases, protein-cleaving enzymes that are activated as part of the death process. One interpretation is that “not only does the cell turn off expression of new presenilin 1,” says Tanzi, referring to the French team's results, “but it takes existing presenilin 1 and renders it inoperable by cutting it with caspase.” If the presenilins are antiapoptotic, then cells engineered to make the normal proteins are presumably dying not because of any specific activity of the presenilin but because of the protein glut that some researchers suspect.

    The caspase studies also suggest why mutant forms of presenilin might be less protective than the normal proteins: Researchers have observed that they are clipped more readily than their normal counterparts. Taken together, says Tanzi, the results suggest that “if you don't allow the presenilin to be clipped by the caspase, you could attenuate the amount of cell death.” Such a strategy might lead to a new therapeutic approach, designed to block presenilin cutting. An approach like that might protect against both familial and the much more common nonhereditary Alzheimer's, if caspase cleavage of normal presenilin plays a role in that form of the disease.

    But that is a big “if,” and indeed the scenarios about possible presenilin function are very speculative and have yet to be tested. The majority of Alzheimer's researchers think that if apoptosis is involved in Alzheimer's, it is induced by Aβ, and the effect of the mutant presenilins is simply to raise Aβ levels. Several Alzheimer's labs have shown that the mutant presenilins cause neurons to make more Aβ, particularly the most harmful form, which contains 42 amino acids. “That would suggest that mutations in presenilins cause Alzheimer's by generating more Aβ 42,” says Alzheimer's researcher Ben Wolozin. “Occam's razor says you don't need to invoke other things to explain it. Why make it more complicated?”

    In the face of these differing views and unproven hypotheses, more work is clearly needed to resolve how—and even whether—apoptosis figures in the development of Alzheimer's disease. But even if apoptosis is established as contributing to Alzheimer's pathology, many neuroscientists doubt that blocking it will slow or halt the progress of the disease. “I think [apoptosis] is very far down the cascade of events driven by these pathogenic [Aβ] molecules,” says Alzheimer's researcher Sam Sisodia of the University of Chicago.

    Sisodia and others believe that by the time apoptosis occurs, Aβ has already damaged the neurons so severely that they cannot be salvaged. Indeed, they argue, blocking apoptosis, a very neat way of eliminating damaged cells, could make things worse, leading to messy necrotic death, which triggers harmful inflammation.

    The lack of good animal models for Alzheimer's makes this a difficult controversy to resolve. But given the possibility of a good therapeutic payoff, Alzheimer's researchers will undoubtedly continue to try.

