# News this Week

Science  25 Feb 2000:
Vol. 287, Issue 5457, pp. 1374
1. GENOMICS

# Fruit Fly Genome Yields Data and a Validation

WASHINGTON, D.C.—The humble fruit fly has just soared to the top of the genome charts. Using an approach dismissed as unworkable a mere 2 years ago, a team of publicly and privately funded scientists announced last week that they had decoded more than 97% of the genome of Drosophila melanogaster. As with all genome projects, parts are missing: The team sequenced only gene-containing regions, and about 1600 gaps remain. Even so, Drosophila, long studied by geneticists, is the largest organism yet sequenced and only the second multicellular one. What makes this milestone especially noteworthy, however, is that it validates the controversial “shotgun” approach. As such, it could pave the way for a public-private effort to complete the human genome, said J. Craig Venter, president of Celera Genomics in Rockville, Maryland, the private half of the team.

The last two big successes of the genome project, the nematode (Science, 11 December 1998, pp. 1972, 2012) and human chromosome 22, recently published in Nature (Science, 24 September 1999, p. 2038), were both done using the “clone-by-clone” approach. This involves determining the order of the bases in a series of overlapping clones, whose locations on the chromosomes are known.

In May 1998, Venter stunned the genome community when he said he would tackle the human genome with the whole-genome shotgun approach that he had pioneered on microbial genomes (Science, 18 June 1999, p. 1906). To “shotgun” a genome, researchers shred the entire genome into random pieces, sequence all the pieces, and then reassemble them in the correct order with the aid of a supercomputer. At the time, critics argued that Venter would be unable to put the millions of DNA fragments back together. As a test case, Venter teamed up with Gerald Rubin and the Berkeley Drosophila Genome Project to try the fruit fly.
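The reassembly step that critics doubted can be illustrated with a toy greedy overlap assembler. This is only a sketch for intuition: Celera's actual software was far more sophisticated, and the function names and the toy genome below are invented for demonstration.

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of `a` that matches a prefix of `b`."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(reads, min_len=3):
    """Repeatedly merge the pair of reads with the largest overlap."""
    reads = list(reads)
    while len(reads) > 1:
        best = (0, None, None)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    n = overlap(a, b, min_len)
                    if n > best[0]:
                        best = (n, i, j)
        n, i, j = best
        if n == 0:  # no overlaps left: an unclosable "gap" remains
            break
        merged = reads[i] + reads[j][n:]
        reads = [r for k, r in enumerate(reads) if k not in (i, j)]
        reads.append(merged)
    return reads

# A toy "genome" shredded into three overlapping reads, then reassembled:
genome = "ATGCGTACGTTAGC"
reads = [genome[k:k + 7] for k in (0, 4, 8)]
contig = greedy_assemble(reads)
assert contig == [genome]  # the single contig matches the original
```

In a real genome, long repeated stretches make overlaps ambiguous, which is why many doubted the approach would scale to millions of fragments.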

The effort “worked better than anyone expected,” Rubin reported at the annual meeting of the American Association for the Advancement of Science, which publishes Science. Geneticists and molecular biologists are ecstatic. “[Venter and Rubin] have really pushed the envelope of what's possible,” raved Daphne Preuss, a geneticist at the University of Chicago. Added geneticist Lawrence Goldstein of the University of California, San Diego: “The quality of what I saw was really exceptional.”

One key to their success was an assembly program designed by Celera's Eugene Myers. In short order, the program was able to assemble the 120 million bases into 26 long stretches, or “scaffolds.” Myers relied on existing genome maps to order these stretches. Still, the program left 1800 gaps, which Myers reduced to 1600 by adding sequence data from his academic collaborators. What's more, the shotgunned data matched already finished fly sequence quite well. Myers is confident that this approach will work on the far larger human genome. But skeptics are waiting to see how difficult the remaining gaps are to close, a task Rubin's team is taking on, before giving the thumbs-up.

Meanwhile, analyses so far suggest the fruit fly could have as many as 13,000 genes, half of whose functions are unknown, said Celera's Mark Adams. With the sequence in hand, Goldstein expects research to “catapult ahead.” For Rubin, this achievement is sweet because everyone worked together well: “It has been one of the most pleasurable scientific experiences that I've had in my academic career.”

2. INTELLECTUAL PROPERTY

# HHS Probes Genesis of Gene Sequencer

1. Jon Cohen*
1. With reporting by Eliot Marshall, Leslie Roberts, and Elizabeth Pennisi.

During the past 6 months, biologist Leroy Hood and members of his former lab at the California Institute of Technology (Caltech) in Pasadena have become ensnared in a tangled federal probe of the origins of the DNA sequencing machines that now play a central role in decoding the human genome. Responding to subpoenas, the researchers have been turning over files from more than 15 years ago to inspectors from the U.S. Department of Health and Human Services (HHS).

Hood's Caltech lab developed the key technology behind the sequencing machines and the reagents needed to run it that now are marketed by PE Corp. At issue in the HHS probe is whether Hood's lab used grant money from the National Science Foundation (NSF) as part of that research. If it did, then Caltech, which holds the patent on the technology, may improperly have received royalties from sales of the machine to government researchers.

Nobody is accusing the researchers of wrongdoing, and Hood and Caltech vigorously deny that NSF funds were involved in the key research. But the issue has developed into a cause célèbre since the Los Angeles Times broke the story last week. A member of Congress is concerned that the federal government may have been ripped off, and Caltech president David Baltimore has responded with a public statement.

Caltech officials, the researchers themselves, and lawyers from PE all say they are baffled by the probe. It's unclear who would owe money to the federal government, and the amount at stake may total less than a million dollars, says PE attorney Joseph Smith. Indeed, one of the most puzzling aspects of the affair is why the HHS Inspector General's (IG's) office launched the probe in the first place. Officials from the office decline comment. But the researchers involved are not so reticent. “What they're doing is wrong,” says Hood, who's now at the University of Washington, Seattle. “It's a witch-hunt.”

The probe focuses on a 1980 change to patent law known as the Bayh-Dole Act and a 1988 contract between Caltech and Applied Biosystems Inc. (ABI), a company that PE later purchased. Bayh-Dole encourages universities to patent inventions developed with government money but stipulates that the government should not pay a royalty fee if it uses the invention.

Although the Bayh-Dole Act has been widely praised for helping to move technology from universities to industry, it has created a perplexing dilemma: It's often difficult to separate how labs spend their federal and private funding. Hood says his lab developed what's known as the four-color fluorescent dye DNA sequencer in the early 1980s with money from the Weingart Foundation, Monsanto, the Baxter Foundation, and Upjohn. Hood's lab then received grants from the National Institutes of Health (NIH), but Hood says he used that money solely for studying the immune system.

PE attorneys note that the Hood lab first filed a patent on the DNA sequencer in January 1984, 20 months before receiving an NSF grant in September 1985. John Wooley, an NSF officer who led a team of reviewers on a site visit in March 1985, confirms that the machine was already operational then. “We saw it working, and we saw sequence data,” says Wooley, who is now an official at the University of California, San Diego.

At the time, however, Caltech and the researchers made some statements that have come back to haunt them. In June 1986, Caltech and NSF both issued press releases in conjunction with a Nature paper about the machine; all of these documents note NSF's support of the research. “It was a courtesy to NSF,” says Baltimore: “As far as we can tell, this falls into the category of ‘No good deed goes unpunished.’” Tim Hunkapiller, who is named in the patent for the sequencer and now runs Discovery Bioscience in Seattle, blames Hood's “grantsmanship” in part for the acknowledgments to NSF.

According to PE attorney Smith, Caltech further confused the issue in 1988 when it included language in a license agreement with ABI that said the university would not receive royalties for DNA sequencers sold to the federal government. But Smith, who helped craft the document, says the Caltech attorneys mistakenly included this language because a postdoctoral student in Hood's lab received an NIH fellowship—which Bayh-Dole explicitly exempts.

Why the HHS IG's office became interested in this case in unclear, because NSF does not fall under its jurisdiction. In the past, however, the IG has criticized NIH for not closely monitoring whether universities comply with Bayh-Dole when they patent NIH-funded inventions. And NIH has purchased several PE sequencing machines—which now cost up to $300,000—and the expensive reagents needed to run them. Science has learned that NIH has passed on to the IG complaints that it received from a Boston lawyer, George Corey, that questioned the government funding of Hood's lab during the time it developed the DNA sequencer. Corey did not want to discuss the matter publicly. If the genesis of the probe is uncertain, so are the stakes. “We don't even know the answer to that,” says Baltimore. PE pays Caltech a royalty of 2% on sales of its sequencers, but Smith says the company already gives the government a 6% discount on the few sequencers it purchases. That “the government [is] trying to get anything here is really kind of silly,” says Smith. Texas Representative Ralph Hall, the ranking Democrat on the House Science Committee, doesn't think it's silly at all. “We have been looking into charges that the American taxpayers may have been overcharged for these sequencers,” Hall told the Los Angeles Times. The General Accounting Office (GAO) also has a congressional mandate to look into government-wide compliance with Bayh-Dole and has issued three reports on the topic in the past 2 years. With this combination of Congress, HHS, GAO, patent lawsuits, and the media, Caltech's role in the development of the DNA sequencer will likely remain in the spotlight for months to come. 3. GENE PATENTS # Patent on HIV Receptor Provokes an Outcry 1. Eliot Marshall For the past 5 years, a biotech company that set out to methodically sequence and commercialize human genes has been telling the world that it would beat everyone else to the Patent Office. 
Last week, that company—Human Genome Sciences Inc. (HGS) of Rockville, Maryland—made good on its boast. It won a U.S. patent on a human gene that plays a key role in HIV infection. The gene codes for a cell surface receptor called CCR5 that HIV uses to gain entry to a cell. The news gave HGS a big boost: Its stock, after declining a week before, skyrocketed on 16 February to a record high, $188 per share, gaining more than 21% in a day. But academic scientists who had also chased this gene and, unlike HGS, published scientific papers showing that HIV uses the receptor, were dumbfounded—especially because HGS did not know of the AIDS connection when it filed its patent.

The patent decision “takes my breath away,” says Robert Gallo, director of the Institute of Human Virology at the University of Maryland, Baltimore. “As a society, we have to ask if it's fair” to give the main commercial prize to the company that simply sequences a gene rather than to those who do the hard work of figuring out its biological function, says Gallo. Several groups, including Gallo's, that played critical roles in identifying the suite of receptors that HIV uses to slip inside cells have also applied for patents, but their claims were filed after HGS's. (A patent expert at the National Institute of Allergy and Infectious Diseases notes that the Patent Office could still issue other, competing patents on CCR5, creating a nasty legal traffic jam.)

“I'm flabbergasted,” says virologist Christopher Broder. “I can't believe” the Patent Office made a decision to reward what he calls “armchair” biology research by HGS. Broder, now at the Uniformed Services University of the Health Sciences in Bethesda, Maryland, was a member of a National Institutes of Health team led by Edward Berger and Philip Murphy that in 1996 published a detailed analysis of how CCR5 works as a “coreceptor” for HIV on the surface of immune system cells (Science, 10 May 1996, p. 872, and 28 June 1996, p. 1955). Berger was traveling and couldn't be reached for comment, but Broder said that “it is rather upsetting to all of us to learn that this company is obtaining patents, despite the fact that we made the discovery first.” Murphy says he, too, was “a little surprised” by the patent. HIV researchers John Moore of the Aaron Diamond AIDS Research Center in New York City and Robert Doms of the University of Pennsylvania, Philadelphia, were also dismayed to learn that HGS's sequencing effort seems to have given the company priority over those who first published studies on CCR5's function.

But HGS's chief executive, William Haseltine, a former HIV researcher himself, says his company didn't simply churn through sequencing data to obtain priority. He says the company's strategy from the outset was to use sequencing as an entry point to research, and then to move quickly from databases to “wet biology” and pharmaceutical development. HGS targeted the family of cell receptors that includes CCR5—the seven-transmembrane group—for special attention because they have been hugely successful pharmaceutical targets. Drugs aimed at these receptors, which include antiulcer and allergy remedies, account for $40 billion in sales annually, he says. Haseltine recalls that CCR5—which has a different name tag in HGS's database—turned up in a batch of unknown gene sequences in an early sequencing scan of the human genome begun in 1993. He says the company's computer analysis tagged it as a probable seven-transmembrane receptor, and it was cloned in a cell line and expressed. “Very quickly, we discovered that it was a chemokine receptor … and we used that information to write that description of the gene in a patent,” Haseltine says. The patent, filed in June 1995, included “many” other genes. He says HGS has 13 patents on receptors in this family. “We also realized that chemokine receptors might be viral receptors,” Haseltine explains, so the patent was written to cover generic medical uses of CCR5, such as for therapies to block or enhance the function of the receptor. Haseltine confirms, however, that the connection between CCR5 and HIV was not known when the initial patent was filed. Indeed, HGS's patent doesn't mention AIDS or HIV. But HGS's press release on 16 February 2000 describes the patent as covering “what is believed to be the critical entry point for the AIDS virus.” HGS has licensed the patent to its pharmaceutical partners for AIDS drug development, including a new deal with Praecis Pharmaceuticals Inc. 
of Cambridge, Massachusetts, to develop “peptide mimetic drugs,” according to Haseltine. HGS itself will try to develop antibody-based therapies to block or treat HIV infection.

Haseltine says he understands why other scientists are disappointed and agrees that they deserve recognition for their work in elucidating the function of CCR5. HGS, he claims, is ready to share data and reagents with them. “We would not block anyone in the academic world from using this for research purposes.” But if anyone wants to use the receptor to create a drug, HGS will enforce its claim.

4. MOLECULAR GENETICS

# Spider Genes Reveal Flexible Design

1. Erik Stokstad

As engineers, spiders leave most other creatures in the dust. Dragline silk, which the eight-legged acrobats use for bungee jumping or rigging webs, is fabulously strong; a cable not much thicker than a garden hose could support two fully loaded Boeing 737 jetliners without breaking. And yet the so-called flagelliform silk used in web spirals is elastic enough to stretch over 200%. With seven kinds of silk, many spiders weave complex and resilient works of art.

Now, the most extensive look yet at spider-silk DNA, reported on page 1477, reveals that the gene for flagelliform silk has some marvelous architecture of its own. Like other silk genes, the DNA that codes for flagelliform silk is a repetitive string of nucleotides. But these nucleotides are grouped in a way that may be an engine for generating diversity. Although the finding may not have an immediate impact on the way artificial silk is designed, it has excited people who think about spider evolution. “This structure is totally different from anything we've seen in spiders,” says Catherine Craig, an evolutionary biologist at Tufts University in Medford, Massachusetts. “It looks like a system that could allow considerable variation in the composition of flagelliform silk proteins.”

Many arthropods make silk: spiders, silkworms, butterflies, and even some honeybees.
Trying to mimic and improve this wonder fiber, outfits from the U.S. Army to DuPont have investigated its biomechanics and genetics. Since 1990, several teams have discovered 10 spider silk genes. But no one had looked at the genetics of the highly elastic flagelliform silk, so called because the glands that produce it resemble whips. So evolutionary biologist Cheryl Hayashi and molecular biologist Randolph Lewis of the University of Wyoming, Laramie, sequenced genes for the flagelliform silk of the tropical spiders Nephila clavipes and Nephila madagascariensis.

Silk proteins are made of repeating amino acid motifs. Some researchers think this chain of springlike helices is a key to the strength and elasticity of silk, and the regularity may help the liquid proteins self-assemble into fibers as they are pulled out of the silk glands. Not surprisingly, as Hayashi and Lewis sequenced the Flag gene, they found repeated stretches in the DNA that coded for three regular amino acid motifs. Curiously, these motifs kept turning up in the same order.

But Hayashi and Lewis also came across a stretch that didn't code for amino acids. Such noncoding sequences, or introns, are common in other genes but had never been seen before in spider silk DNA. Then Hayashi found another, and another—almost all the same size. “That's when I started to be surprised,” she says. The 12 introns take up about half the gene, and they alternate with active coding regions called exons. Even more unusual, most of the introns are very similar—two of them are 99.9% identical—suggesting that the introns are more highly conserved than the exons, which code for amino acids. That's like taking better care of your CD cases than the compact discs themselves.

Hayashi and Lewis believe that all these peculiarities are tied to the monotony of the exons. When a sequence is highly repetitive, enzymes that copy the DNA can lose their place and make an error.
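The “99.9% identical” figure quoted above is a percent-identity measure between aligned sequences. As a rough illustration of how such a number is computed (the sequences below are invented toy data, not the real Flag introns):

```python
def percent_identity(a, b):
    """Percentage of positions at which two aligned sequences agree."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return 100.0 * matches / len(a)

# Toy 1000-base "introns" differing at a single position:
intron_1 = "ACGTACGTAC" * 100
intron_2 = intron_1[:500] + "G" + intron_1[501:]  # one substitution
print(percent_identity(intron_1, intron_2))  # 99.9
```

In practice the two sequences would first be aligned (allowing insertions and deletions) before counting matches; this sketch assumes they are already the same length.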
That could explain why the exons tend to vary more in length than do the introns, which are less repetitive. The exons are also rich in the nucleotides cytosine and guanine, which when found in strings may increase the chance of genetic mistakes. Either factor might have led to new genetic diversity, the researchers say.

On the other hand, the repetitive architecture may lead to “homogenization” of the gene, which occurs when one repeat overwrites another repeat. Such rewriting is more likely when the repeats are similar in sequence. “The amazing similarity across the Flag introns almost certainly means that this kind of recombination event led to the redundancy of the motifs,” says Andrew Clark, an evolutionary geneticist at Pennsylvania State University, University Park.

This tug-of-war between mutation and homogenization may put the spider in a bit of an evolutionary tangle, Hayashi and Lewis say. Although the gene structure may be good for allowing mutations—and coming up with an even better type of silk—the same architecture means that those improvements might be weeded out. “Purely because of the repetitiveness, it might be very hard to stabilize these genes and maintain an optimal sequence,” Hayashi says.

Silk is crucial for spiders; aside from catching insects, they also use it as a safety line and to make egg cases. That makes spiders great animals for exploring the link between the evolution of an important behavior and the evolution of a family of proteins, Craig says. And because the protein is directly extruded—rather than hidden inside the body like most proteins—the physical effects of mutations on silk proteins and spider survival are much easier to study. Those kinds of experiments may be far off, but probing the genes behind the proteins is a key step. “Now we know the structure of the gene,” Craig says, “and that's fantastic.”

5. PLANETARY SCIENCE

# NEAR Finds a Battered But Unbroken Eros

1. Richard A. Kerr

When the Near Earth Asteroid Rendezvous (NEAR) spacecraft went into orbit around the asteroid Eros last Monday, planetary scientists were expecting to see a body little worn by the ravages of time. According to conventional theory, Eros had escaped from the main belt of asteroids between Mars and Jupiter shortly after it was formed in a catastrophic collision of two larger asteroids. With little time spent in the cosmic shooting gallery of the main belt, Eros should have arrived near Earth less blemished by impacts than siblings that remained behind. But data from NEAR reveal that the asteroid took a heavier beating than expected, NEAR team members announced last week; the pitted surface suggests a slower, unconventional passage out of the main belt. Early NEAR returns also hint at Eros's innermost nature.

Every image returned by the NEAR spacecraft on its arrival shows a surface almost covered with impact craters. The more craters, the longer a body has been exposed to the interplanetary elements, so “Eros does have an ancient surface,” says NEAR project scientist Andrew Cheng of the Applied Physics Laboratory (APL) in Laurel, Maryland, which built the spacecraft and is running the mission for NASA.

A group of planetary dynamicists is suggesting that Eros suffered such a battering because it was slow getting out of the main asteroid belt following the impact that tore it from its parent body. Most asteroids that escape toward Earth are blasted by a collision into one of two narrow zones, as William Bottke of Cornell University and Alessandro Morbidelli of the Observatory of the Côte d'Azur in Nice, France, point out. They usually remain in these regions for less than about a million years before Jupiter's powerful gravity slings them inward, so near-Earth asteroids like Eros should look relatively young.
Instead, they argue, a gentle push from sunlight may have nudged Eros gradually into one of numerous, weaker “escape hatches” created by Jupiter and Saturn or even by little Mars (Science, 13 August 1999, p. 1002). Such a route would have taken Eros hundreds of millions of years longer to approach Earth.

Despite the prolonged pounding, Eros still seems to be in one solid piece. Impacts have pummeled the asteroid Mathilde, which NEAR flew by in 1997, into a flying bunch of boulders. Such collections can appear solid when covered with fine debris. But by combining NEAR estimates of the volume and mass of Eros, the radio science team led by Donald Yeomans of the Jet Propulsion Laboratory in Pasadena, California, calculates that the asteroid has a density of 2.4 grams per cubic centimeter, about the same as Earth's crust. That doesn't leave much room for the nooks and crannies of a rubble pile, Yeomans says: “It now looks like we have a fairly solid object” in Eros.

NEAR is getting hints about Eros's parent body as well. For decades, planetary scientists have been debating whether asteroids like Eros, whose spectral color makes them the most common type in the main belt, could be the source for the ordinary chondrites, the most common type of meteorite to fall to Earth. If so, Eros, like its parent body, would have the same primitive composition throughout. That's one reason NEAR was sent to Eros. But ground-based observations by Scott Murchie of APL and Carlé Pieters of Brown University had caught a hint of different colors on opposite sides of Eros. The colors, imperceptible to the human eye, suggest different mineral compositions. That implies that Eros's parent became hot enough inside to melt and form different minerals in separate places, a process the parents of ordinary chondrite meteorites never went through.
NEAR team member James Bell of Cornell now tells Science that “we do see [color] differences from place to place” on Eros, confirming the ground-based observations. That still doesn't necessarily mean Eros is mineralogically differentiated and an unsuitable source for ordinary chondrites, cautions Bell. Surface colors may have been altered by poorly understood “space weathering” over Eros's ages and ages of exposure (Science, 9 February 1996, p. 757). Best to wait, he says, for more NEAR data to get to the heart of Eros.

6. NEUROSCIENCE

# New Stroke Treatment Strategy Explored

1. Laura Helmuth

Strokes are among the most prolific killers in the developed world, claiming about 160,000 lives each year in the United States alone. Those who survive are often left physically and mentally impaired. Although new treatments can limit stroke damage, they have to be administered quickly to work. Now, researchers working with rats have come up with a new strategy that may guard against brain damage by taking advantage of the brain's response to injury.

In this work, described on page 1453, neuroscientists in effect immunized rats against the brain nerve cell death caused by stroke and severe seizures. The team, led by Matthew During of the University of Auckland in New Zealand and Thomas Jefferson University in Philadelphia, vaccinated animals with a virus that had been genetically engineered to contain part of the NMDA receptor. Excessive stimulation of this receptor, which occurs when massive amounts of the neurotransmitter glutamate are released in the wake of a stroke, contributes to neuronal death. During's team theorized that the modified virus would stimulate the production of antibodies that could slip into the stroke-damaged brain, seek out NMDA receptors, and prevent them from being excited to death. In the vaccinated rats, this strategy proved surprisingly effective.
The protection against stroke damage “was as good as could possibly be expected,” says neuroscientist Brian Meldrum of King's College Institute of Psychiatry in London. But the technique could pose dangers to humans. Neuroscientist James McNamara of Duke University in Durham, North Carolina, cautions that “immunizing people with neural antigens might have unwanted effects,” including encephalitis or learning disruptions.

Other NMDA receptor-blocking drugs are already used clinically, but for optimal effectiveness, they have to be given within about an hour of a stroke, and many stroke patients do not reach help so quickly. To develop a treatment that could act almost immediately, During and his colleagues took advantage of the fact that the blood-brain barrier—the membrane that normally prevents large molecules such as proteins from entering the brain—breaks down after a stroke or other trauma. NMDA antibodies circulating in the bloodstream could then steal into the brain.

To test this, his team modified an adeno-associated virus so that it carried a piece of DNA that codes for a portion of the NMDA receptor. The researchers fed rats one dose of the virus and waited 1 to 3 months for antibodies to build up in the blood. Then, to simulate a stroke, they injected an artery leading to the brain with a compound that causes the artery walls to squeeze shut. The resulting lack of blood ordinarily devastates a large portion of the brain. But in the vaccinated rats, the lesion was 70% smaller than in controls.

During's team also induced a state of prolonged, intensifying seizures known as status epilepticus by injecting another set of rats with kainate, a compound that stimulates the release of large amounts of glutamate. In humans, long-lasting status epilepticus seizures can kill cells in the hippocampus. In the group of untreated rats, 68% progressed into full-blown status epilepticus and also suffered major cell death in the hippocampus.
But just 22% of the vaccinated animals seized, and their hippocampal damage was barely detectable.

As impressive as the results were, says McNamara of Duke, it's still not clear from this early study exactly how the antibodies subdue the NMDA receptor. But the approach is promising, he adds: “If the results are indeed mediated by antibodies, then [having antibodies available during] the fleeting and localized disruption of the blood-brain barrier would be advantageous.”

Also unclear is how directly the strategy could be translated into a human treatment. The NMDA blockers already in use to treat stroke can cause hallucinations and other symptoms of psychosis in humans. While that may be an acceptable risk for someone who would otherwise lose more brain tissue to a stroke, it poses a problem for a pretreatment scheme that would allow NMDA inhibitors to circulate before an injury. The antibodies could also impair learning by blocking or damaging NMDA receptors, although During says that preliminary results suggest that this doesn't happen.

And During is not wed to the vaccination technique. Directly injecting NMDA antibodies could also do the trick, he says. During hopes to eventually test that in humans by treating people at extremely high risk for stroke with NMDA antibodies. After a hemorrhage on the surface of the brain, for example, people have a 50% chance of suffering a stroke within the next week. During's hope is to protect the brain before disaster strikes.

7. NEUROSCIENCE

# Death Triggers Regrowth of Zebra Finch Neurons

1. Gretchen Vogel

Until recently, scientists assumed that adult brains were doomed to constant, steady decline. Shortly after birth, it seemed, neurons lost their ability to grow, and cells that died could not be replaced—a gloomy outlook indeed. But in the past few years, scientists have found that certain kinds of neurons can grow in adult brains, including those of humans.
Not only has this turned scientific dogma on its head, but it has also provided the first glimmer of hope for those suffering from degenerative brain diseases or paralyzing spinal cord injuries. Exactly what allows some brain neurons to grow while others can't remains mysterious, however. Now, new work in songbirds may provide a clue.

In the 24 February issue of Neuron, neuroscientists Constance Scharff and Fernando Nottebohm of The Rockefeller University in New York City, Jeffrey Macklis of Harvard Medical School in Boston, and their colleagues report that the brains of adult male songbirds can recruit certain kinds of neurons to grow and replace cells that have died. What's more, the experiments suggest that the new neurons are fully functional. But the recruitment is selective; only certain cells have this phoenixlike quality. The work suggests that “the brain can make new neurons, but it can't make all of them,” explains neuroscientist Allison Doupe of the University of California, San Francisco. Researchers now hope to discover what causes that selective recruitment so that they can eventually trigger the growth of all types of neurons.

The new findings emerged from the Nottebohm team's investigations into a long-studied phenomenon in canaries. Male canaries use complex songs to court their mates. Each fall, however, the birds curiously lose their ability to carry a tune, and the songs become muddled. Then in the spring, just in time for courting season, the canaries regain the ability to sing their intricate songs.

Decades of research by Nottebohm and others have provided a partial explanation for these changes. They have uncovered the key brain areas that govern this song behavior. One is the high vocal center, or HVC. Some neurons in this region send axons into the so-called RA region, which helps to control the muscles involved in singing.
It turns out that in canaries, many of these HVC → RA neurons die each fall, around the same time the birds forget their complex songs. A short time later, a spurt of neuron growth occurs. Researchers have long wondered what triggers the neuron growth, as well as whether the cell death and regrowth affect singing ability.

Macklis, Nottebohm, and Scharff suspected that the stimulus for the neuronal regrowth might simply be the death of old neurons. To test that idea, they turned to the study of zebra finches. Unlike canaries, zebra finches do not forget their mating songs, nor do the neuron populations of their HVC region fluctuate seasonally. Rather, a few new neurons constantly grow into the region.

To see whether neuronal death indeed triggers regrowth, the team used a method devised by Macklis that could kill selected types of neurons in living animals. Working in zebra finches, the team selectively killed the kind of neurons that normally regrow—the HVC cells that send axons to the RA region. Some of these birds, like the canaries in fall, lost their ability to sing complex songs—an indication that loss of the HVC neurons indeed causes the canaries' decline in singing. The brain-damaged zebra finches “sound like they're muttering,” compared to the birds' usually clear and precise notes, Scharff says.

Then the researchers injected the zebra finches with a radioactive compound that would enable them to see whether any neurons regrew to replace those that died. They found that neuronal death did seem to trigger cell growth. Birds in which HVC → RA neurons had been killed had three times as many new neurons as did control birds. And as the neurons grew back, the zebra finches' singing improved. After a few weeks, one bird had recovered its old song almost perfectly, and the three others had made partial improvements. This ability to regrow seems to be specific to HVC → RA neurons, however.
When the team used the same technique to kill neurons that project from the HVC to the so-called “area X,” which are not normally produced in adult birds, no replacement neurons grew. Discovering why the HVC → X neurons failed to regrow could help scientists find ways to grow particular types of neurons in mammals, says developmental neuroscientist Ron McKay of the National Institute of Neurological Disorders and Stroke in Bethesda. Nottebohm agrees. “We have to figure out ways to loosen those restrictions” on regrowth, he says—an achievement that would surely be something to sing about.

8. BROOKHAVEN

# Meltdown on Long Island

1. Andrew Lawler

A nasty battle that led to the closing of a neutron-scattering facility at Brookhaven National Lab demonstrates what happens when researchers lose the fight for the hearts and minds of the public

Any illusions scientists at Brookhaven National Laboratory had that they could continue their half-century of fundamental research in splendid isolation were shattered the night of 16 January 1996. More than 700 people—both lab employees and Long Island neighbors—crammed Brookhaven's Berkner auditorium that chilly winter night to hear about contaminated water found in some local drinking wells. Angry and worried residents wanted warm reassurances; instead, Brookhaven and Department of Energy (DOE) scientists spoke dispassionately about risk analysis, flow rates, and billions of parts per gallon and promised to hook neighbors up to public water. “People went absolutely nuts,” recalls local activist Jean Mannhaupt, who was at the tumultuous meeting. “People were shaking [with anger] as they came up to the mike,” adds lab public affairs manager Mona Rowe. “It was pretty frightening,” adds another witness. After a decade of concerns and complaints about the DOE lab's waste dumps and balky sewage plant, residents now feared direct and immediate harm from their scientific neighbor.
For those at the fateful 1996 meeting, it was no surprise that all hell broke loose exactly 1 year later, when DOE and lab managers announced that radioactive water from a spent-fuel storage pool under a silver-domed research facility called the High Flux Beam Reactor (HFBR) had been leaking into the ground unnoticed for more than a decade. Although the leak was small and posed no immediate threat to the public, its discovery began a fierce and sometimes bizarre drama starring movie actors, supermodels, lawmakers, Cabinet secretaries, antinuclear protesters, and petition-wielding scientists. Energy Secretary Bill Richardson cut short the fracas last November when he ordered the reactor permanently closed, citing budget constraints. But for the hundreds of Brookhaven and outside researchers who depend on the HFBR's steady stream of neutrons, the decision was a victory of cynical politics over scientific truth. “It was completely out of our control from the beginning,” says lab physicist John Tranquada. “Science was never an issue.” Nicholas Samios, who led Brookhaven for 15 years until 1997, still can't believe the outcome. “The total leak was less than the tritium in an exit sign—it was not a problem.”

But others say that the problem had little to do with flow rates or picocuries. Instead, Brookhaven scientists and their managers were so intent on producing excellent research that they largely ignored the need to recruit allies among neighbors, local politicians, and federal officials. When the time came to defend the HFBR, says Robert Crease, a State University of New York (SUNY) philosopher who has written a history of the lab, “there was no one there.”

The demise of HFBR is also a warning to other researchers—whether manipulating plant genes, experimenting with animals, or using nuclear reactors—that ignoring community and political concerns can be hazardous to their professional health.
“This is a terribly important lesson to learn,” says Geraldine Richmond, a chemist at the University of Oregon, Eugene, and chair of DOE's Basic Energy Sciences Advisory Committee (BESAC). “We can't be isolated in our ivory towers.” Adds Crease: “What can happen at Brookhaven can happen anywhere.”

## Twelve men and a phone booth

Ironically, Brookhaven is itself a child of politics. As the United States moved from a hot to a cold war, physicists from a handful of prestigious northeastern universities yearned for a nuclear research facility nearby that would be a scientific equal to Los Alamos in New Mexico and Oak Ridge in Tennessee. In 1946 the government awarded a contract to a consortium called Associated Universities Inc. (AUI), which chose an old Army base at the eastern end of Long Island as the lab's site.

In the early days, neighborhood issues were virtually nonexistent: Set deep in the pine barrens and downwind from New York City, the 13,000-hectare lab had few neighbors. A few small communities of potato farmers and fishers dotted the eastern half of Long Island, while the occasional beach resort clung to the southern coast. The facility sat at the muddy headwaters of the Peconic River and on top of the freshwater aquifer that extends the length of the sandy island.

The lab kept a low profile even as it grew into a sprawling collection of accelerators, reactors, and other facilities with more than 3000 employees, a $400 million annual budget, and a stream of out-of-state visitors. It lacked the heroic legacy of Los Alamos and the economic clout of Oak Ridge. Instead, Brookhaven scientists went about their business largely invisible to the outside world. “We just faded into the background,” recalls one researcher.

AUI preferred it that way. Unlike the large and powerful University of California, which runs Los Alamos, AUI was “12 men and a phone booth,” jokes lab physicist Stephen Shapiro. AUI's board, made up of distinguished academics, focused on the lab's scientific direction and left the details of administration to onsite employees. Nor did the organization court politicians and bureaucrats. Brookhaven never had an influential patron, like Los Alamos has in Senator Pete Domenici (R-NM), nor the staunch backing of an entire state political apparatus, as Oak Ridge does. “In Washington, a lot of people didn't know Brookhaven existed,” says Peter Bond, a physicist who served as interim director in 1997-98.

For decades, the strategy succeeded brilliantly. Work at the Cosmotron accelerator generated a Nobel Prize in physics, and its successor, the Alternating Gradient Synchrotron, produced three more laureates. There were ominous stumbles, such as the cancellation in 1983 of a large accelerator called Isabelle, but by and large the lab established a reputation among researchers for serious science without the hype. “[Samios] told me to keep my head down, do my research, and leave the rest to him,” recalls one scientist. Coming online in 1965 as the lab's third reactor, HFBR fit in well with the lab's low profile. It lacked the scientific sex appeal of the proton accelerator at the Fermi National Accelerator Laboratory in Batavia, Illinois, with its expansive promise of exploring conditions during the first moments of creation. However, the reactor's intense beams of neutrons, which can penetrate deeply into materials, gave physicists, biologists, and chemists a tool to probe the hidden structure of everything from crystals to ceramics to polymers to blood plasma. The potential applications drew a steady stream of researchers from companies and universities around the world.

As the reactor aged, however, problems began piling up alongside its impressive research results. Designed for a 25-year life at a maximum power level of 40 megawatts, the reactor was upgraded to 60 megawatts in 1982 before safety concerns forced operators to scale back power to 30 megawatts. In March 1994, a fire in the casing surrounding a target of uranium-235 used in an experiment raised questions about the lab's safety procedures. About the same time, officials also began to worry about wear and tear on the aluminum beam tubes exposed to the neutron streams, and reactor operators were eager to replace the cylindrical vessel holding the fuel elements—an upgrade that would cost as much as $200 million.

The lab also faced a growing list of environmental problems. It had inherited chemical dumps from the site's days as an Army base, adding piles of low-level radioactive glass and discharges of heavy metals and plutonium into the Peconic River. In 1985, the well of an elementary school just outside the gates showed evidence of increased levels of tritium, although its source was unclear, and a handful of home wells tested positive for various chemical compounds. In 1988, a team of DOE environmental and safety inspectors—who dubbed themselves “the revenge of the C students” because of their tough stance on waste generated by scientific research—visited DOE labs and pinpointed Brookhaven as one of the hot spots. The following year, it was designated a Superfund site, which mandates increased scrutiny and cleanup of a badly polluted area. By 1996, five of six underground plumes containing concentrations of wastes that exceeded drinking water standards had migrated beyond the lab's gates.

None of these setbacks was as damaging to the lab's image as the tritium leak, however. It was discovered in late 1996, when the reactor was closed for routine maintenance.
To save money, designers had decided not to use a second liner in the storage pool underneath the reactor, a precaution taken at other DOE reactors built in that era. The result was a spreading plume of water, with tritium concentrations more than twice New York state's standards.

Both researchers and environmental activists agree that contamination at Brookhaven is far less severe than at weapons production areas such as South Carolina's Savannah River or Washington state's Hanford Reservation. But the environmental problems at Brookhaven are complicated by its location on the Peconic and above the aquifer in an area with a large and rapidly increasing population. It isn't the only source of pollution—industrial plants have sprung up in the area, and potato farmers are heavy users of powerful pesticides. But Brookhaven drew the wrath of neighbors who rely on well water. Part of their anger stemmed from fears that the lab was doing secret and potentially dangerous research, and part from the contractor's insistence until 1987 that it was legally exempt from Suffolk County water-purity regulations. “The perception was that Brookhaven thought itself above the law,” says activist and social studies teacher Connie Kepert. “The lab operated like a foreign country,” adds Adrienne Esposito, another community activist.

Samios and some other current and former Brookhaven officials reject the notion that Brookhaven was an irresponsible neighbor. And the DOE science office, which is ultimately responsible for the department's civilian labs, continued to give AUI high marks for its performance right up until the contractor was abruptly fired in 1997. But David Schweller, who headed the DOE office at the lab for most of the 1980s, says that lab managers were too focused on science at the expense of assuring safety and protecting the environment. His memos on the growing environmental troubles at the lab were ignored by both AUI and the DOE hierarchy.
“I was told we're here to do science and to peddle my papers elsewhere,” he says.

## Contact sport

Brookhaven's neighbors were more than willing to take on the lab, however. “Activism is a contact sport here,” says Mannhaupt, a community leader who focused attention on Brookhaven's problems (see facing page). In the 1980s, for example, a powerful coalition of antinuclear, environmental, and community groups successfully blocked the planned start-up of the Shoreham nuclear power plant, several kilometers from Brookhaven. “Long Island is a hotbed for all environmental, safety, and health issues,” says Schweller. It is also a hotbed for rumors of government cover-ups, such as in the recent mysterious crash of a commercial jet offshore or wilder talk of dead aliens kept in Brookhaven tunnels.

The January 1997 announcement of the reactor leak only confirmed the suspicions of county officials, activists, and neighbors. After receiving complaints for many years, the lab promised in 1994 to dig test wells near the reactor. But DOE and the lab did not allocate money for them until 1996. In retrospect, that was a mistake, says Bill Gunther, the lab's senior environmental adviser. So when DOE and lab managers tried to explain that the plume was confined well within the lab grounds and posed no obvious threat to public health, their words were greeted with deep suspicion. “People felt like they had been lied to again,” says Mannhaupt.

The lab's precious isolation, while helpful in conducting its research, now proved a tremendous handicap. There had been efforts to connect with the community through open houses and a speakers' bureau, but the outreach, according to lab biologist Dieter Schneider, “was geared to the high school level—a little on the trivial side.” Samios says AUI lacked the expertise to carry out a strong public relations campaign before or during the HFBR crisis, and he doubts that such a campaign would have made a difference.
“The public is only interested if someone says there's a big danger,” he argues. But Esposito, who lives in nearby Patchogue, says the problem was not PR, but scientific arrogance. “The attitude toward the public was that it was ignorant and stupid and could not understand,” she says.

## Cleaning house

News of the leak put Brookhaven in an unaccustomed spotlight. DOE's oversight office quickly launched an investigation and concluded that the lab “has not kept pace with contemporary expectations for protection of the public, the workers, and the environment.” The report also strongly criticized DOE for its confusing hierarchy, ineffective oversight, and poor management of the lab. Energy Secretary Federico Peña, fresh from Senate confirmation, flew to Brookhaven on 1 May and fired AUI—an unprecedented move in the department's history and one that rocked the DOE complex.

Some lab officials say Peña's decision was a cynical attempt to assert his power. They see it as an effort to blame the lab for the troubles and shield DOE from criticism. But DOE managers maintain it was the only way to deal with a contractor that had been lax in dealing with both the environment and the public. Paul Martin, the Harvard physicist who chairs the AUI board, declined to discuss the ouster. And AUI hasn't gone out of business—it still operates the National Radio Astronomy Observatory for the National Science Foundation. But Brookhaven's Shapiro says that the organization “failed on the operations end” of overseeing the lab. DOE's Schweller agrees. “Just because you're a great scientist doesn't mean you're a great administrator.” Even Samios, who was appointed by AUI, says that the consortium had fossilized into “a self-perpetuating board” heading “a powerless organization.”

The removal of AUI stoked the controversy.
The congressional General Accounting Office investigated, local media like Newsday kept up a drumbeat of coverage, and demonstrators outside the main gate soon were carrying skull-and-crossbones signs recalling Chernobyl and Three Mile Island. Protesters in white jumpsuits and surgical masks rallied against the HFBR, one man went on a hunger strike, and an auxiliary Catholic bishop from Detroit warned that the lab's polluting ways ran counter to God's wishes. “It became a circus,” says Samios with undisguised disgust.

Although DOE promised to appoint a new contractor, conduct a full study of the environmental impact of restarting the reactor, and maintain frequent contact with citizens' groups, it and the lab were losing ground. “Why on Earth would we trust that institution after Hanford and Rocky Flats?” says Alec Baldwin, who grew up on Long Island. “They've never been forthcoming. They lied and lied and lied and covered up for decades. The whole lab is corrupt.”

In the summer of 1997, Baldwin and others formed Standing for Truth About Radiation (STAR), an organization based in fashionable East Hampton with an impressive board of wealthy New Yorkers and well-known activists (see story). Their goal was to prevent a restart of what they believed was a dangerous and dirty reactor. “You don't put a reactor in the middle of a crowded island with a sole-source aquifer,” says Helen Caldicott, STAR's founding president. “And Brookhaven has made a terrible toxic cocktail.”

Brookhaven scientists were infuriated by what they felt was a gross exaggeration of the dangers. “The word ‘nuclear’ wiped out all sense of reason,” says physicist Tranquada. “You don't yell radiation on a crowded island,” says another researcher. Even local activists were offended by what they viewed as the arrival of Johnny-come-latelies more interested in combat than cooperation.
“Environmental carpetbaggers,” snorts Mannhaupt, who was open to the idea of restarting the reactor if it could be done safely. “We disagreed with the environmentalists, who said there is no solution,” adds Kepert.

The increasingly cantankerous fight was a political powder keg for local Representative Michael Forbes, a rookie Republican in the midst of switching parties. After expressing concern about the reactor, Forbes apologized to lab employees in July 1997 for his earlier doubts and said “it is a safe reactor.” Then he flipped again. At a hastily scheduled press conference in September 1997 in nearby Mineola, Forbes and Senator Alphonse D'Amato (R-NY) stunned lab managers by announcing that they opposed restarting the reactor. “That was the death knell of the HFBR,” says one Brookhaven manager.

Washington and Long Island sources cite Forbes's desire to attract contributions from wealthy Democrats on the island's east end by aligning himself with environmental causes. Forbes, who declined repeated requests by Science to discuss the issue, also used Congress's power of the purse by inserting language into DOE spending bills forbidding the lab from spending money toward the reactor's restart. Some lawmakers such as Domenici disliked the language. But faced with the unusual circumstance of a lawmaker denying funds to his own district, they supported their colleague 3 years running. In the meantime, lab officials hoped that an environmental impact statement supporting restart would convince Forbes to lift the funding ban.

## Celebrity politics

Feeling under the gun, Brookhaven scientists adopted some of the tactics of their opponents. They gathered more than 18,000 names on a petition urging the reactor's restart, held rallies, sent heaps of letters to Richardson, and spent innumerable hours at community meetings. But time was running out.
The outside researchers that make up BESAC had supported restart of the reactor under three conditions—if it could be put online in a timely fashion, ultimately double its power, and not impose a budget strain—and it soon became clear that none of those conditions could be met. “We were clearly getting out of range of a prompt restart,” says Martha Krebs, who at the time was DOE science office chief. The proposal to double the power evoked fierce opposition from the community, the estimated costs of restarting HFBR began to climb, and the environmental impact statement was mired in DOE bureaucracy.

The controversy heated up further in January 1999, when Baldwin, Caldicott, and STAR counsel Scott Cullen met with new Energy Secretary Richardson. The STAR officials proposed an environmental assessment to be overseen by the community rather than DOE. “We didn't say shut down the HFBR,” says Baldwin. Cullen believes Richardson was sympathetic to the idea of a separate study but unwilling to pay for it. Caldicott says that STAR officials reminded Richardson that “he has political aspirations with [Vice President Al] Gore, and if he didn't shut [the HFBR] down, there would be political ramifications.”

In April, the scientists had their turn. Meeting in the secretary's office overlooking the Smithsonian castle, Robert Birgeneau, dean of science at the Massachusetts Institute of Technology, and Frank Bates, a chemical engineer at the University of Minnesota, Minneapolis, and head of the Neutron Scattering Society, argued vehemently for a $200 million upgrade as well as a restart, according to sources familiar with the meeting. They said other facilities could not match the HFBR's capabilities.

But managers of the department's science programs sang a different tune at the 45-minute meeting, the sources add. Krebs and DOE basic energy sciences chief Pat Dehmer told Richardson that BESAC's conditions would be hard to meet. A restart likely could not happen before 2002, they said, and would add $10 million to the annual outlay of more than $20 million that DOE already was spending on the downed reactor. The extra cost, worried BESAC chair Richmond, could wipe out funds set aside for hundreds of graduate student stipends. And then there was the threat of lawsuits from antinuclear and environmental groups, which could lead to further delays and political headaches.

Richardson held a third meeting in October, with model Christie Brinkley, a newly named STAR board member, and her husband, architect Peter Cook, shortly after the couple met with President Bill Clinton during a visit to Washington. Cullen, who also attended, said Brinkley expressed her concerns about the HFBR, including the potential exposure by workers to cancer-causing substances and local groundwater contamination. Although Mannhaupt complains about the Administration's willingness to meet with celebrities but not local activists, DOE officials insist that the meetings were not pivotal moments in the fight over the reactor. “I see no evidence STAR had an impact on the decision or the process,” says Krebs, who did not attend either meeting.

Krebs had her own worries, which included costly upgrades under way or planned for two other DOE neutron sources and an increasingly difficult struggle to fund a $1.3 billion Spallation Neutron Source at Oak Ridge. She and BESAC members say they grew convinced that closing the HFBR, although it would pose short-term problems for the neutron-scattering community, was for the long-term benefit of researchers given the pressing need for the Oak Ridge facility. On 16 November, Richardson announced that the reactor would be shut down. Emphasizing that the HFBR posed no health threat, he declared that “we need to focus our limited resources on productive research.”

## Blame game

The decision—and its timing—infuriated lab officials as well as many activists. Both sides were gearing up for a public debate over the environmental impact statement, the draft version of which stated that there was no pressing reason to keep the reactor closed. “Washington took public participation out of our hands,” says Mannhaupt angrily. “It's a hell of a way to run science policy, and a hell of a way to work with a community,” adds local civic activist and former Brookhaven employee Don Garber. Even STAR officials complained. Baldwin and Cullen say that Richardson deliberately defused public outrage over the leaks and waste dumps at the lab by abruptly ending the debate.

In a meeting with reporters, Richardson insisted that budgetary reasons, not politics, were behind his decision: “I don't like to close scientific facilities, but it made no sense to restart it.” He added that “my scientists unanimously said 6 months ago that we should shut it down.” Both Dehmer and Krebs confirm that they recommended closing the facility prior to Richardson's decision. “It's never easy to make a decision like this, but sometimes you have to,” says Dehmer. Now DOE and Brookhaven must decide whether to mothball or decommission the reactor—the latter would cost hundreds of millions of dollars.
Meanwhile, the lab's new contractor, Brookhaven Science Associates, gets good marks from all sides, and director John Marburger has been successful in establishing a basic level of trust between activists and the lab. Even Caldicott gives him grudging praise. “I respect him,” she says, while insisting that all radiation-related activities at the lab should cease.

The emotional aftermath is harder to calculate. Lab scientist Tranquada remains deeply upset about the decision. “I'm still overcoming the loss,” he says, his voice breaking in frustration. “There's a facility with 15 instruments and no place to put them.” Longtime Brookhaven materials scientist James Hurst retired in December, complaining that DOE “should have drawn the line in the sand” to prevent antinuclear groups from thinking they could shut down other facilities as well. “Maybe I'm naïve, but I think this should have been about science,” he says.

Beyond the blame game, however, some researchers and managers say they've learned a hard lesson about an axiom—perception is reality—that is taken for granted by politicians. “As a scientist, you believe in marshaling the facts and proceeding in a logical fashion,” says Shapiro, “but politics adds so many variables.” Adds SUNY's Crease, “People don't take in facts nakedly, and it is naïve to say that facts speak for themselves.”

What happened at Brookhaven, say Crease and others, should be a stark warning to scientists about the growing public fears over everything from research involving fetal tissue to genetically modified crops. “It's very sobering,” says Bates. “There are thousands of labs around the country doing work that may not be in vogue, and I hate to think they will become cannon fodder for local politicians.” To win over such critics, says Mannhaupt, scientists need to fight fire with fire. “They need a kick-ass logo, they need a hip-hop song,” she says.
But the battle over HFBR also demonstrates that researchers who ignore those who fund and regulate them can't expect help when the going gets tough. “Scientists have to look beyond their own self-interest to the neighborhood in which they work,” says Richmond. “If we want to be part of a community,” she warns, “we can't act like prima donnas.”

9. BROOKHAVEN

# Grassroots Activist Earns Respect—and Learns Some Hard Lessons

1. Andrew Lawler

Jean Mannhaupt was roller-skating with her young daughters when her neighbors walked up and asked her to sign a petition. Tests by the Suffolk County, New York, health department showed unacceptable levels of benzene in a neighbor's well water, and the group wanted the county to test their wells in the Mastic Beach community.

That moment nearly 20 years ago launched Mannhaupt, now 45, on a determined march to understand what pollutants threatened her and her family and how to stop them. She grew into a formidable and pragmatic critic of Brookhaven National Laboratory in nearby Upton, earning the support of her neighbors and the grudging respect of scientists. But she unexpectedly found herself fighting other environmental and antinuclear activists in the battle over the High Flux Beam Reactor (HFBR), and she was accused of selling out after accepting funds from Brookhaven and an office at the lab.

An unlikely activist—a high school graduate of a Catholic school in Queens with a blue-collar husband and three kids to raise—Mannhaupt needed help in understanding the benzene threat. And she was pleasantly surprised to discover lab scientists who were happy to explain how groundwater moves, the nature of volatile organics, and their health effects. Although she failed biology, math, and chemistry in high school, she says “the left side of my brain opened up when I was 26. … Scientists couldn't come down to me, so I had to go up to them.” Says Bill Gunther, Brookhaven's senior environmental adviser, “I'm much in awe of her.
She started with limited technical understanding and she learned. She won't let go until she does.” She and her neighbors learned about politics too; they scored $11 million in government funding to hook up to public water. They also compared stories of children in their school district with learning disabilities and physical ailments.

Mannhaupt, a woman with an imposing and energetic presence, proved a natural organizer who could also speak the scientists' language. “They'd be amazed you would point out a data gap in a soil or monitoring well screening,” she recalls with her strong Long Island accent. “I don't think they really believed that anybody in the community could read them.”

Although Mannhaupt was a harsh critic of Brookhaven's environmental record, she also valued its research activities and felt that HFBR could be fixed and reopened given sufficient public debate and input. That flexibility contrasted with the increasingly strident attacks on the lab and the reactor from antinuclear and other environmental groups. Those groups drew media coverage through their celebrity members and more confrontational tone, as well as criticism from Mannhaupt and others. “All of a sudden, from the left wing, come people like [the actor] Alec Baldwin,” she says. “They are not schooled enough, technically or scientifically. You can't just stamp your feet and get what you want.”

But the biggest blow to Mannhaupt's cause came in 1996 after she accepted $25,000, as well as office space, from the lab for the Community Working Group she headed, an umbrella organization that included more than 30 local groups. “I thought we'd reached the utopia of grassroots activism,” she says. “The community finally had the opportunity to become a true partner at the table.”

But both her longtime supporters and opposing groups say that the perception of a conflict of interest destroyed the group's credibility as an independent voice. The loose alliance of Brookhaven skeptics was shattered, leaving room for groups like the Standing for Truth About Radiation (STAR) foundation. “I got trapped,” concedes Mannhaupt.

She refuses to apologize, however, and says others have a lot more to be held accountable for. She accuses Energy Secretary Bill Richardson of selling out the lab's neighbors by denying the public a chance to comment on the reactor's environmental impact statement, and Representative Michael Forbes (D-NY) and groups like STAR of using the HFBR controversy to solicit money from wealthy and environmentally conscious Long Islanders. She also takes the lab to task for failing to connect with more citizens like herself and for polluting the area.

She also hasn't given up the fight. Parlaying her environmental knowledge into a job with a private water-testing company, Mannhaupt has formed a new nonprofit organization, Neighbors Expecting Accountability and Remediation, to tackle environmental problems at Brookhaven as well as at other nearby sites that pollute. “I'm never happy with the lab,” she says. “If I were, I would go home.”

10. BROOKHAVEN

# Antinuke Leader Uses STAR Power in Fight to Close Reactor

1. Andrew Lawler

Worth Austin spent his days dumping barrels of radioactive waste in the ocean. It was the 1950s, and ocean dumping was the preferred way to get rid of some wastes generated by the Graphite Research Reactor at Brookhaven National Laboratory.
Today, his great-grandson, Scott Cullen, is involved in dumping something even larger—Brookhaven's High Flux Beam Reactor (HFBR). As counsel for the Standing for Truth About Radiation (STAR) foundation, the 29-year-old Cullen played a key role in blocking the reopening of the HFBR, the successor to Austin's graphite reactor. It's an unusual role for a local boy who grew up in nearby Bellport, who spent summers at the Brookhaven pool, whose grandmother worked at computer processing for the physics department, and whose grandfather was a sergeant on the security force. Whereas Brookhaven was a black box to many of his friends, his exposure to the lab gave him what he calls “a deep appreciation for science—I was fascinated with it as a child.”

But it was another reactor a few kilometers down the road that ignited his interest in environmental causes. The fight to prevent the opening of the Shoreham commercial nuclear plant, which Cullen followed as a teenager, convinced him that nuclear power was “a failed experiment” and that “radiation in the environment is [naturally] decreasing—and we shouldn't be creating more.”

Cullen, a lean man with a serious air, returned home in the summer of 1996 while in law school at the University of Vermont and fell into the orbit of Bill Smith, an environmental activist on the eastern end of Long Island. Smith had founded a group called Fish Unlimited, which has long complained about Brookhaven's pollution. He convinced Cullen to get involved in fighting the lab's environmental legacy. Cullen played a central role the following summer in creating the foundation, which seeks “objective medical and scientific” data on the extent of that pollution, according to STAR literature. Funded primarily by wealthy individuals, the group has held a symposium on the health effects of radiation, paid for a public opinion poll on the reactor, and agitated for government funds to conduct independent testing of Brookhaven's toxic plumes.
“Our organization thinks there shouldn't be reactors on Long Island,” says Cullen, noting the high density of population and the common aquifer from which residents get their water. “I don't think the HFBR leak was a huge health risk,” he admits, but it was “a symptom of a larger problem”—the lab's history of failure to protect the environment.

The eastern end of Long Island is home to a host of celebrities and wealthy New Yorkers, and STAR's star-studded board includes actors Spalding Gray and Alec Baldwin, plus model Christie Brinkley and her architect husband Peter Cook. Antinuclear activist and pediatrician Helen Caldicott was its founding president.

Local activists and lab scientists rail against the group's glamorous cast and its no-compromise approach, saying that STAR has exaggerated the real dangers and ignores the work done over the years by neighbors. “I don't think they listen,” says Brookhaven's senior environmental adviser, Bill Gunther. “I'm not sure they care, or if they want a meaningful dialogue.”

In the modest single-room walk-up office he shares with other staff members in East Hampton, Cullen bridles at the suggestion that his group is an idle hobby of rich environmentalists, noting that STAR's supporters include women across Long Island who worry about rising breast cancer rates. He adds that lab neighbors were ineffective in dealing with the lab before STAR came on the scene and says that Baldwin and Brinkley enhance the hands-on work that he does daily, from attending meetings to making phone calls. HFBR supporters, he maintains, are loath to admit the real reason behind the reactor's demise. It wasn't money or glamour, he says: “It was closed because people on Long Island didn't want it.”

11. BROOKHAVEN

# Neighbors of the Reactor Pursue a Not-So-Civil Action

1. Andrew Lawler

Residents suspect that their well water is making their children sick. The alleged culprit, a big local company, says not to worry.
The residents find a lawyer and sue, triggering a long and bitter legal fight. Those events are depicted in the recent movie A Civil Action, starring John Travolta and based on a true story. A counterpart may be playing soon in a Long Island court, with Brookhaven National Laboratory as the defendant.

More than a dozen neighbors are seeking damages stemming from the lab's alleged negligence in handling dangerous chemical and radioactive wastes. The $1 billion complaint was filed in February 1996, 1 month after the lab admitted that some pollutants had migrated beyond its fences. This spring the plaintiffs will ask Justice Howard Berler of the Suffolk County Supreme Court to certify them as a class, allowing them to sue Brookhaven as a group rather than as individuals. If Berler agrees, then what is already a protracted and expensive judicial battle will move into high gear.

The residents hope to include anyone living within a 16-kilometer radius of the lab who believes they were exposed to its hazardous materials and that their health has suffered as a result, along with anyone who has had to pay to monitor their health to detect potential injuries and any property owners who have suffered economically as a result of the wastes. The suit charges that the contaminants caused illnesses ranging from cancer to headaches, and that “a real perceived fear of contracting health problems” led to emotional and psychological problems and a decline in property values. The suit blames the lab for its failure to “properly construct, maintain, repair, inspect, and update its scientific facilities and to take reasonable measures to minimize toxic discharges.”

“There is no justification for this suit,” replies Brookhaven counsel Michael Goldman, who leads the defense. Although former lab contractor Associated Universities Inc. is named as a defendant in the suit, the Department of Energy is ultimately responsible for paying the legal bills of its contractors. The current contractor, Brookhaven Science Associates, is overseeing the case.

Berler has so far refused Brookhaven's initial requests to have the suit dismissed. But even the plaintiffs' lawyer, Richard Lippes, a partner with the Buffalo firm of Allen, Lippes & Shonn, admits that it is extremely difficult to establish clear links between specific contaminants—assuming their origins can be identified—and specific ailments. And although a jury trial would be a disturbing prospect for Brookhaven, officials can take some comfort in the outcome of the movie: Travolta lost the case, at least on his first try.

12. ECOLOGY

# The Unbearable Capriciousness of Bering

1. Robert A. Saar*
1. Robert A. Saar is a writer in Irvington, New York.

Scientists have labored hard to untangle the web of life in the Bering Sea; some strange new kinks have them wondering just what the web ought to look like

The short-tailed shearwater flies a long way for a good meal, migrating some 15,000 kilometers every summer from the seas south of Australia to prime feeding grounds in the Bering Sea, off the coast of Alaska. In July 1997, however, something went terribly wrong. Scientists estimate that about a half-million carcasses of this brownish-gray relative of the albatross were bobbing lifelessly in the water and washing up on shore. The scale of the die-off was “unlike any we had ever seen,” says marine ecologist George Hunt of the University of California, Irvine, a 25-year veteran of Bering Sea research. The birds, it turned out, had starved to death.

The death toll was the latest sign of an ecosystem under siege. From unprecedented algal blooms to fewer salmon returning to spawn in Alaskan rivers and declines in fur seal and sea lion populations, the Bering Sea's ecological balance is shifting before researchers' eyes. The data now point to an assault on the food web from top and bottom: Fishing and hunting are taking out predators, while climate changes are reshaping the community of tiny marine plants and animals that sustain higher life-forms. It's an “ecosystem sandwich,” says Robert Francis, a fisheries oceanographer at the University of Washington, Seattle, and chair of a 1996 National Research Council study of the Bering Sea.

Scientists want to understand this sandwich better, because the Bering Sea is such a remarkable crucible of life. Nearly half of its 2.3 million square kilometers is a shallow shelf, less than 180 meters deep. In winter, when waters of varying depths mix well, the shallows are fed by a rich load of nitrogen, phosphorus, and silica nutrients dredged by currents from the sea's deep basin. Each spring and summer, these enriched shelf waters nourish microscopic green plants—phytoplankton—that are consumed by zooplankton including tiny shrimp, themselves food for fish, birds, and some mammals. “This is one of the most productive of the world's ocean areas,” says Elizabeth Sinclair, a marine mammalogist with the National Oceanic and Atmospheric Administration (NOAA) in Seattle. Indeed, the Bering Sea is home to the United States' largest single-species fishery, with an annual walleye pollock harvest exceeding 1 million metric tons.

Although weather plays an important role in shaping habitats everywhere, “the link between climate and the ecosystem is stronger [in the Bering Sea] than in many places,” says Nicholas Bond, a research meteorologist at the University of Washington, Seattle. The strength and direction of the winds buffeting the sea determine how much of the surface is covered by ice in winter and how fast the ice edge retreats toward the Arctic Ocean in summer. The extent of open water is critical to phytoplankton populations and the creatures that depend on them: Ice, especially when covered by snow, acts as a sunblock, limiting photosynthesis by the phytoplankton.

Also critical to the ecosystem are air and water temperatures. When the surface freezes, the salt concentration increases in the underlying water. As ice melts in spring, the fresh water dilutes the saltiness of the surface water, lowering its density. Heat from the sun further reduces density to a depth of about 20 meters. Lower density is a problem, because the greater the density difference between surface and deeper waters, the harder it is for the layers to mix and bring nutrients to the surface. During calm, warm summers, the phytoplankton quickly exhaust the nutrient supply. It then takes strong storms to stir the pot and restock the shallows.
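The stratification mechanism described above can be sketched with a toy calculation. The linearized equation of state below uses rounded, assumed coefficients and made-up temperatures and salinities; it illustrates why melt and sunshine widen the density gap between surface and deep water, and is not an oceanographic model:

```python
# Illustrative sketch (not an oceanographic model): a linear equation of
# state shows how spring melt and summer sun stratify the shelf waters.
# All coefficients and input values below are rough, assumed numbers.

RHO0 = 1027.0   # reference seawater density, kg/m^3 (assumed)
ALPHA = 2.0e-4  # thermal expansion coefficient, 1/degC (assumed)
BETA = 8.0e-4   # haline contraction coefficient, 1/psu (assumed)
T0, S0 = 2.0, 32.0  # reference temperature (degC) and salinity (psu)

def density(temp_c, salinity_psu):
    """Linearized seawater density about the reference state."""
    return RHO0 * (1.0 - ALPHA * (temp_c - T0) + BETA * (salinity_psu - S0))

# Winter: cold, salty, well-mixed column -- surface and depth are similar.
winter_surface = density(temp_c=-1.0, salinity_psu=32.5)
deep_water     = density(temp_c=1.0,  salinity_psu=32.5)

# Summer: ice melt freshens, and the sun warms, the top ~20 meters.
summer_surface = density(temp_c=8.0, salinity_psu=30.5)

print(f"winter surface-to-deep gap: {deep_water - winter_surface:+.3f} kg/m^3")
print(f"summer surface-to-deep gap: {deep_water - summer_surface:+.3f} kg/m^3")
```

In winter the surface is actually slightly denser than the water below, so the column overturns and mixes; in summer the surface ends up a few kilograms per cubic meter lighter, and only a strong storm can stir nutrients back up.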

Even for an ecosystem unusually vulnerable to climatic whims, “in the last 3 years, elements of the climate and the biology in the Bering Sea have been crazy,” says Francis. Just before the shearwater die-offs, scientists realized something unusual was going on. In late June and early July 1997, a microscopic alga called ehux (Emiliania huxleyi), never before seen in the Bering Sea but common elsewhere, bloomed over an area larger than Nebraska. Ehux has minute reflective carbonate plates, which in such vast numbers are visible from space.

Unusual conditions may have favored the ehux bloom in 1997, says Phyllis Stabeno, a physical oceanographer with NOAA in Seattle. Very strong storms in May churned up an extra helping of nutrients from the Bering's deeper waters. Then, a summer of light winds and more sunshine than usual warmed the surface waters to about 2 degrees Celsius above normal. Similar conditions in 1998 triggered an ehux bloom that covered twice the area of the 1997 bloom, extending through the Bering Strait and into the Arctic Ocean.

Although the connection remains unproved, the bloom may have sounded the death knell for the many shearwaters in 1997. Hunt speculates that ehux's appearance upset the food web. A tenth the diameter of diatoms—normally the dominant phytoplankton in the Bering Sea—ehux probably makes a good meal for smaller zooplankton such as copepods, which are the size of rice grains. Larger zooplankton like euphausiids (small shrimp) probably cannot eat enough ehux to grow beyond a small population. Diatoms and other large types of phytoplankton, the normal food for euphausiids, were crowded out by the smaller ehux. And euphausiids had been the mainstay of the shearwater diet.

Field data lend support to this scenario. Copepod numbers were at least 10 times higher in 1997 than in earlier years, while euphausiids had dwindled to a tenth or less of their usual numbers. That's why Stephan Zeeman, a biological oceanographer at the University of New England in Biddeford, Maine, finds Hunt's ideas plausible. “The shift of the phytoplankton population to ehux and away from the more common and much larger diatoms is bound to cause changes in how the food web works,” he says.

If vicissitude is the only constant in the Bering Sea, it is one the region's denizens seem able to adapt to. For example, although euphausiid populations were low again in 1998, shearwaters survived, albeit a bit underweight for the return flight to their wintering grounds near Tasmania. Hunt and his co-workers found that the shearwaters augmented their diet with more fish than they had been observed to consume in the past. For reasons that are still not fully understood, the shearwaters were unable to find enough food of any type (fish or invertebrates) during the late summer of 1997. Meanwhile, to the relief of the fishing industry, the walleye pollock population thus far appears to have been unaffected by ehux.

Keeping scientists guessing, ehux bloomed again last year—even though water temperatures were colder than average. The erstwhile stranger now seems to be a member of the ever-changing Bering Sea family. “Ecosystems can occupy any number of distinct states if pushed hard enough by changed environmental conditions,” says Suzanne Strom, a plankton ecologist at Western Washington University's Shannon Point Marine Center in Anacortes, Washington. “All of a sudden, there is a new community, in this case including ehux, and it may not want to go back to its original state.” Scientists will be keeping close watch to see how long this new food web arrangement will last—and which species will next fall victim to the Bering Sea's capricious cold shoulder.

13. ECOLOGY

# New Corn Plant Draws Fire From GM Food Opponents

1. Dan Ferber*
1. Dan Ferber is a writer in Urbana, Illinois.

Improved methods for controlling corn rootworms could help the farmer, but a novel corn strain genetically modified to do just that is raising safety concerns

The next battle in the war over genetically modified (GM) foods may be shaping up in Washington, D.C. In December, Monsanto Corp. applied to the U.S. Environmental Protection Agency (EPA) for approval to field-test and sell a new line of modified corn. The plant carries a bacterial gene that produces a toxin that kills one of the toughest insect pests of corn, the so-called corn rootworm complex, actually the larvae of three related beetle species. A go-ahead for field tests could come as early as next month, and Monsanto says it hopes to be able to sell seeds to farmers in time for planting next year. But biotech watchdogs are urging EPA to reject Monsanto's applications unless a host of safety questions are answered.

Currently, corn rootworms, which infest virtually all of the 30 million hectares of corn planted annually in the United States, are controlled either by rotating corn with another crop, such as soybeans, whose roots are not palatable to the pests, or with chemical insecticides. For the most part, the insecticides are applied to the 5.7 million hectares where crop rotation isn't practiced. Even so, they cost farmers about $200 million each year—about one-fifth of the total spent on insecticides for all crops in the United States.

The new GM corn could dramatically reduce this pesticide use, at least for a few years, experts say. “Farmers have been waiting for something like this for a long time,” says Fred Yoder, a spokesperson for the National Corn Growers Association who farms about 400 hectares of corn, soybeans, and alfalfa near Plain City, Ohio. Indeed, finding a new rootworm control strategy is becoming increasingly important as rootworms are becoming resistant, not just to chemical pesticides but to the crop rotation strategy as well. In the past few years, some beetles have adapted to lay eggs in soybean fields that hatch to attack the next year's corn crop.

But despite the potential benefits of a rootworm-resistant corn plant, the new strain is raising concerns among environmental and consumer groups. Monsanto scientists created the modified corn by putting in the gene encoding one of the Bt toxins, so called because they come from the soil bacterium Bacillus thuringiensis. But this toxin, from B. thuringiensis tenebrionis, is different from the ones previously used to produce corn resistant to the European corn borer, which don't deter rootworms. And because no transgenic crop expressing this toxin has ever been planted commercially, all sorts of safety questions should be addressed anew, says Charles Benbrook, an agricultural consultant in Sand Point, Idaho, who works with consumer and environmental groups.
He and representatives of three public-interest groups, including Consumers Union, the publisher of Consumer Reports, urged the EPA in written comments to reject the application to field-test the corn unless Monsanto proves that it is not toxic to other species, including beneficial beetles such as the pest-eating ladybird beetle. The company also needs to show, Benbrook says, that the toxin breaks down in soil and doesn't harm soil organisms, a particular concern given recent work showing that the roots of other Bt corn plants secrete their toxins into the soil (Nature, 2 December 1999, p. 480), where they can bind to soil particles and remain active for months.

Finally, the groups opposing the new corn strain worry that it could lead to the development and spread of Bt-resistant corn rootworms, which could ultimately make the technology useless. Entomologist Michael Gray of the University of Illinois, Urbana-Champaign, considers such resistance inevitable. He points out that the worms eat little else except corn roots, which means they can't escape to so-called refuges that are free of the toxin as European corn borers can, and that they've already demonstrated their adaptability by becoming resistant to chemical pesticides. “Any notion that they will not develop resistance to transgenic [corn] is foolhardy,” Gray says.

But Randy Krotz, director of business and industry affairs for Monsanto, says that the company has been working for more than 2 years with corn-rootworm experts from midwestern universities to devise a plan to contain resistance to the Bt toxin should it develop. He also says that Monsanto has already conducted its own studies, some of which were submitted with the applications, showing that the corn does not secrete the Bt toxin into the soil. Even if it did, he adds, toxicity studies show that the new corn line does not harm nontarget organisms such as earthworms, ladybird beetles, lacewings, and more than nine other species.
“The environmental profile is great,” Krotz says.

Buoyed by these results, Monsanto is pushing the technology as fast as it can. The company applied in mid-December to conduct large-scale field tests, and it applied less than 2 weeks later for full commercialization, even though the field tests, if approved, would not be completed until late next summer. Other biotech companies say they are not far behind. Novartis Biotechnology is developing its own transgenic corn for corn rootworm control, and a joint effort by Dow Agrosciences and Pioneer Hi-Bred International has already yielded rootworm-resistant Bt corn lines.

EPA will take a hard look at the new technology before deciding whether to approve it, says Steve Johnson, associate deputy assistant administrator for the EPA's Office of Prevention, Pesticides, and Toxic Substances. “If we feel that there is additional research needed, we'll not grant the EUP [Experimental Use Permit for field trials] or the full license,” he says. But under the most favorable scenario for Monsanto, the EUP could be granted as early as March, although the agency won't decide whether to let Monsanto commercialize the corn until after data are in from the field tests. Meanwhile, biotech watchdogs are getting ready to bark if EPA approves the crops quickly. “There will be a global reaction,” Benbrook says.

14. FUNDAMENTAL CONSTANTS

# Getting a More Precise Grip on the Physical World

1. Andrew Watson*
1. Andrew Watson writes from Norwich, U.K.

The new recommended values of physical constants are the result of decades of work by metrologists, pushing technology to the limit in search of greater precision

Planck's constant, the Rydberg constant, Avogadro's number, big G. Every student of physics knows this honor roll: the fundamental constants, the seemingly immutable numbers that underpin our description of physical reality.
But the high priests of metrology who tend this list know that these numbers are far from unchanging—and they know the sweat that goes into improving their accuracy. At the U.S. National Institute of Standards and Technology (NIST), Britain's National Physical Laboratory (NPL), and other temples to the 10th decimal place, researchers labor for decades to improve the accuracy of these numbers by a few more parts per billion. They do not do this solely out of a zealous desire for precision. An extra decimal place always harbors the potential for a glimpse of new physics, says Clive Speake of the University of Birmingham, U.K. “Whenever you try to make a measurement of one of these fundamental constants at an accuracy higher than other people [have made], you're actually delving into a new physical world,” he says.

Last summer, the latest official set of values of the fundamental constants—a major revision of the 1986 values, referred to as the “1998 CODATA recommended values”—was posted on the NIST Web site (physics.nist.gov/constants). The fundamental constants “have gone to another factor of 10 in accuracy in many cases,” says Brian Petley of NPL in Teddington, southwest London. “Science and technology have marched on another decimal place.”

That may not sound impressive, but behind those simple numbers lies groundbreaking technology combined with the amazing skill of experimenters around the world who spend their working lives teasing out increasingly refined values of the constants. Each new constant is not the result of a single measurement; it is an amalgam of many results, often obtained using different techniques. The challenge is to sift through all new measurements related to each constant, reject any duds, and create a consistent set of values for the scientific public. Consistency is the watchword: It is normally not possible to adjust a single value of one constant without affecting the values of others, because of the intimate relationships among them.
The researchers with the unenviable task of compiling the new set of constants were Peter Mohr and Barry Taylor of NIST in Gaithersburg, Maryland. The 1998 values were the fruit of a 4-year effort, which Mohr and Taylor carried out under the watchful eyes of a 13-strong international Task Group on Fundamental Constants, appointed by the Paris-based Committee on Data for Science and Technology (CODATA). The full story, including which measurements made the grade and which didn't, plus an explanation of why one constant is now known less accurately than it was in 1986, is revealed in a mammoth paper, penned by Mohr and Taylor, which will appear in the November/December issue of the Journal of Physical and Chemical Reference Data (still in press) and will also be published in the April issue of Reviews of Modern Physics.

Testing QED. One of the incentives for improving the accuracy of several of the constants in the list—those relating to electromagnetic interactions of particles—is to see if they confirm the predictions of quantum electrodynamics (QED), undoubtedly the most precise theory in science. Among this set of constants is the so-called magnetic moment anomaly. Because the electron is a spinning charge, it behaves like a tiny magnet whose strength is defined as its magnetic moment. However, as quantum theory describes, the electron is surrounded by a cloud of virtual particles that pop in and out of existence for brief periods of time. This cloud of ephemeral particles affects the electron's magnetism, producing a small additional contribution. Physicists use ever more accurate measurements of this anomaly as precision tests of QED.

The champions at measuring the magnetic moment anomaly are Hans Dehmelt and Robert Van Dyck and their collaborators at the University of Washington, Seattle.
They used a Penning trap, which relies on magnetic and electric fields to trap individual electrons, and then measured the anomaly through the electron's response to high-frequency electromagnetic fields. The value used in the 1998 constants actually dates from the team's experiments in 1987. Although efforts to improve these results have been under way ever since, they have not yet borne fruit. The new recommended value runs to 11 significant digits, with an uncertainty of just four parts per billion.

Even at this level of accuracy, however, the measurers are still trailing behind the QED theorists. According to Taylor and Mohr, theorists have been plugging away for decades at successively more intricate layers of detail in QED to reach an accuracy of one part per billion for the magnetic moment anomaly. The first layer, calculated by the American physicist Julian Schwinger in the 1940s, is by today's standards a graduate student exercise. In stark contrast, the third tier has taken 3 decades of theoretical sweat; it was only finally pinned down in 1996. And since the 1970s, Toichiro Kinoshita of Cornell University in Ithaca, New York, has spearheaded efforts to harness ever faster computers to evaluate what happens next. At this level of refinement, theorists have to consider not only the upsetting effects of virtual electrons and photons, but also rarer particles such as the electron's heavier cousin the muon, and strongly interacting particles in the form of various mesons.

Fine detail. Through QED, the magnetic moment anomaly is linked to one of the most important numbers of all: the fine structure constant, which defines the strength of the quantum electromagnetic interaction described by QED. Physicists would never be happy with simply using the one constant to guarantee the other, however. The fine structure constant “is ubiquitous in all areas of physics, such as atomic physics, condensed matter physics, and even nuclear physics,” says Kinoshita.
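The flavor of those layered calculations can be conveyed with one line of arithmetic: Schwinger's first-layer result for the electron's anomaly is simply α/(2π), where α is the fine structure constant. In the sketch below, the numerical inputs are rounded approximations, not the official recommended figures:

```python
import math

# Back-of-envelope check: the first "layer" of the QED series, Schwinger's
# alpha/(2*pi), already lands within roughly 0.15% of the measured electron
# magnetic moment anomaly. Values below are rounded approximations.

ALPHA = 1 / 137.035999        # fine structure constant (approximate)
A_MEASURED = 1.1596521869e-3  # electron anomaly from Penning-trap work (approx.)

a_schwinger = ALPHA / (2 * math.pi)
relative_gap = (a_schwinger - A_MEASURED) / A_MEASURED

print(f"Schwinger term alpha/(2*pi) = {a_schwinger:.10f}")
print(f"measured anomaly (approx.)  = {A_MEASURED:.10f}")
print(f"relative difference         = {relative_gap:.2%}")
```

The remaining few-parts-in-a-thousand discrepancy is exactly what the second, third, and higher tiers of the series must account for, which is why closing it demands decades of theoretical effort.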
This gives researchers a powerful tool, because they can measure the constant's value using a variety of methods from different areas of physics and compare the results. “If they all agree,” says Petley, “then you know that you understand all of the phenomena involved, unless you're pessimistic and say they all agree with the same wrong answer.” The value for the fine structure constant quoted in the new list, based on input from numerous measurements from many different types of experiments, is accurate to four parts per billion.

The fine structure constant itself is connected, via the charge of the electron, to what might be viewed as one of the listing's most significant advances, the Planck constant. This number links the energy of a packet of light to its frequency, and the fact that its minuscule value is not zero is the starting point for quantum mechanics and therefore much of modern physics. The new value for the Planck constant, derived principally from a measurement made at NIST in 1998, has the smallest uncertainty to date, about nine parts in 100 million, 15 times better than NIST's previous figure and more than twice as refined as a similar measurement made at NPL in 1990.

These great strides in measuring the Planck constant are largely due to a machine called a moving-coil watt balance, first conceived by Bryan Kibble at NPL in 1975. The watt balance produces a value for the Planck constant by comparing a power measured mechanically, in terms of meters, kilograms, and seconds, to the same power measured electrically through two highly accurate quantum-mechanical phenomena, the Josephson effect and the quantum Hall effect. The heart of the balance is a multiturn circular coil that moves vertically in a radial magnetic field. In the first part of the experiment, researchers move the coil at a speed of about 2 millimeters per second and measure the voltage induced in the coil.
In the second part, researchers pass a precisely measured quantity of current through the coil and then measure the force that the magnetic field exerts on the coil by balancing the force against the gravitational pull acting on a standard mass. Because of the relationships that exist between the Josephson and quantum Hall effects and the electronic charge and the Planck constant, combining the results of the two parts of the experiment yields a value for the Planck constant. “The beauty of the moving-coil watt balance is that the geometry of the coil, its dimensions, and the strength of the magnetic field drop out—one never needs to measure them,” says Mohr. “Rather, all one needs to measure is the speed of the coil, the induced voltage, the coil current, and the local acceleration due to gravity.” The watt balance, says Taylor, “has been a very important advance.”

Rydberg riddle. The new Planck constant gives metrologists satisfaction, but the Rydberg constant is causing a bit of a headache, even though it is one of the most precisely known values on the new list. The Rydberg constant links the frequency of light emitted from a hydrogen atom to an electron's hops up and down the rungs of the energy-level ladder. Its recently revised value is a string of 14 digits. “The precision of the Rydberg constant is a gauge of our ability to make precise frequency measurements and our quantitative theoretical understanding of the hydrogen atom,” says Mohr. But there is a problem. “In doing a systematic comparison of theory and experiment to determine the Rydberg constant, we have found that there is a well-defined systematic difference between theory and experiment,” says Mohr. Mohr and Taylor think the difference, which is small, likely stems from uncalculated QED contributions, or from the fact that the effective “charge radius” of the proton is larger than that deduced from electron scattering experiments performed elsewhere, or both.
“The point is that we have identified a possible problem in our understanding of the hydrogen atom,” says Mohr.

But Mohr and Taylor and their Task Group colleagues have an even bigger thorn in their sides: big G, the gravitational constant. The new value has a factor of 10 greater uncertainty than the 1986 figure, reflecting what Taylor calls the “current turmoil” arising from a measurement at the German standards lab, the PTB, in Braunschweig in 1994 (Science, 18 December 1998, p. 2180). “The researchers there made a measurement of big G and did a very careful and thorough job, as you might imagine, and got a value which was in gross disagreement with everyone else's value: 42 standard deviations,” says Taylor.

“The problem is big G is so difficult to measure,” says Birmingham's Speake. Gravitational pull is an extremely feeble force, and researchers must measure it—by gauging the attraction between heavy weights in the lab—in a world awash with gravity from Earth and every other object nearby. In big G experiments, clouds overhead, a trench dug outside the lab, and the height of the water table all alter the gravitational gradient and can influence the experiment. Speake says that the German experiment, the result of about 15 years of dedicated effort by highly respected researchers, would have been sensitive to the gravitational tug of a mouse a meter or more away, or a person 40 meters away.

Sparked by the fuss, there is a “tide of new measurements that rule out the PTB result,” says Speake. But it makes metrologists uneasy simply to overwhelm an errant result by force of numbers, as they may be missing something vital, explains Speake. “We have to understand why that number is so large,” he says. “It's a very worrying situation.” Despite the release of the revised values, such concerns keep Mohr and Taylor busy.
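Newton's law makes plain just how feeble the signal is. The masses and distances below are assumed for illustration only, but they give the flavor of the sensitivity Speake describes:

```python
# A rough sense of why big G is hard to measure: the Newtonian attraction
# between bench-top test masses is tiny. Masses and distances are assumed,
# illustrative values; G is an approximate recommended value.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2 (approximate)

def gravity(m1, m2, r):
    """Newton's law of gravitation: F = G * m1 * m2 / r**2."""
    return G * m1 * m2 / r**2

f_lab = gravity(10.0, 10.0, 0.10)     # two 10 kg masses, 10 cm apart
f_person = gravity(10.0, 70.0, 40.0)  # a 70 kg person 40 m from one mass

print(f"two 10 kg masses, 10 cm apart: {f_lab:.2e} N")
print(f"70 kg person 40 m from a mass: {f_person:.2e} N")
```

The bench-top signal is under a micronewton, and the bystander's perturbation is smaller still by four orders of magnitude, yet a sufficiently sensitive apparatus must contend with both.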
They are looking forward to news of yet another fine structure constant measurement, this time from a Stanford University team using the new technique of atom interferometry. Also in the pipeline is an improved measurement from Brookhaven National Laboratory in Upton, New York, of the muon magnetic moment, in which the impact of the strong nuclear force is vastly greater than for the electron. It's a business that creeps forward: Fundamental constant measurements are “usually evolutionary, not revolutionary,” says Taylor.

15.

# Can Old Cells Learn New Tricks?

1. Gretchen Vogel

Stem cells found in adults show surprising versatility. But it's not yet clear whether they can match the power of cells from embryos

Stem cell biologist Margaret Goodell has never seen her work on muscle and blood development as particularly political, so she was surprised when last month the Coalition of Americans for Research Ethics (CARE), a group that opposes the use of embryos in research, invited her to speak at a congressional briefing in Washington, D.C. She was even more astonished to find herself quoted by conservative columnist George Will a few weeks later.

Goodell gained this sudden notoriety because her work, and that of other teams around the world, just might provide a way around the moral and political quagmire that has engulfed stem cell research to date. Since their discovery in 1998, human embryonic stem cells have been one of the hottest scientific properties around. Because these cells can theoretically be coaxed to differentiate into any type of cell in the body, they open up tantalizing possibilities, such as lab-grown tissues or even replacement organs to treat a variety of human ills, from diabetes to Alzheimer's. Politically, however, human stem cells have been a much tougher sell, as they are derived from embryos or fetuses. Indeed, most research is on hold as policy-makers grapple with the ethics of human embryo research.
Enter Goodell, whose work suggests that stem cells derived from adults, in this case from mouse muscle biopsies, can perform many of the same tricks as embryonic stem (ES) cells—but without the ethical baggage. Both CARE and George Will seized upon her work as an indication that research on ES cells could remain on hold with no appreciable loss to medicine. “There's a lot less moral ambiguity about the adult stem cells,” says bioethicist and CARE member Kevin Fitzgerald of Loyola University Medical Center in Chicago.

But can adult stem cells really fulfill the same potential as embryonic stem cells? At this stage, the answer is by no means clear. Indeed, scientists caution that it is too early to know whether even ES cells will produce the cornucopia of new tissues and organs that some envision. “It is still early days in the human embryonic stem cell world,” says stem cell biologist Daniel Marshak of Osiris Therapeutics in Baltimore, which works with adult-derived stem cells.

From a scientific standpoint, adult and embryonic stem cells both have distinct benefits and drawbacks, and harnessing either one will be tough. Although scientists have been working with mouse ES cells for 2 decades, most work has focused on creating transgenic mice rather than lab-grown tissues. Only a handful of groups around the world have discovered how to nudge the cells toward certain desired fates. But that work gained new prominence in late 1998, when two independent teams, led by James Thomson of the University of Wisconsin, Madison, and John Gearhart of The Johns Hopkins University, announced they could grow human stem cells in culture. Suddenly the work in mouse cells could be applied to human cells—in the hope of curing disease.

The beauty of embryonic stem cells lies in their malleability. One of their defining characteristics is their ability to differentiate into any cell type.
Indeed, researchers have shown that they can get mouse ES cells to differentiate in lab culture into various tissues, including brain cells and pancreatic cells. Studies with rodents also indicated that cells derived from ES cells could restore certain missing nerve functions, suggesting the possibility of treating neurological disorders.

Last summer, Oliver Brüstle of the University of Bonn Medical Center and Ronald McKay of the U.S. National Institute of Neurological Disorders and Stroke and their colleagues reported that they could coax mouse ES cells to become glial cells, a type of neuronal support cell that produces the neuron-protecting myelin sheath. When the team then injected these cells into the brains of mice that lacked myelin, the transplants produced normal-looking myelin (Science, 30 July 1999, p. 754). And in December, a team led by Dennis Choi and John McDonald at Washington University School of Medicine in St. Louis showed that immature nerve cells that were generated from mouse ES cells and transplanted into the damaged spinal cords of rats partially restored the animals' spinal cord function (Science, 3 December 1999, p. 1826). Although no one has yet published evidence that human ES cells can achieve similar feats, Gearhart says he is working with several groups at Johns Hopkins to test the abilities of his cells in animal models of spinal cord injury and neurodegenerative diseases, including amyotrophic lateral sclerosis and Parkinson's disease.

While Gearhart and his colleagues were grappling with ES cells, Goodell and others were concentrating on adult stem cells. Conventional wisdom had assumed that once a cell had been programmed to produce a particular tissue, its fate was sealed, and it could not reprogram itself to make another tissue. But in the last year, a number of studies have surprised scientists by showing that stem cells from one tissue, such as brain, could change into another, such as blood (Science, 22 January 1999, p. 534).
Evidence is mounting that the findings are not aberrations but may signal the unexpected power of adult stem cells. For example, Goodell and her colleagues, prompted by the discovery of blood-forming brain cells, found that cells from mouse muscle could repopulate the bloodstream and rescue mice that had received an otherwise lethal dose of radiation.

Bone marrow stem cells may be even more versatile. At the American Society of Hematology meeting in December, hematologist Catherine Verfaillie of the University of Minnesota, Minneapolis, reported that she has isolated cells from the bone marrow of children and adults that seem to have an amazing range of abilities. For instance, Verfaillie and graduate student Morayma Reyes have evidence that the cells can become brain cells and liver cell precursors, plus all three kinds of muscle—heart, skeletal, and smooth. “They are almost like ES cells,” she says, in their ability to form different cell types.

These malleable bone marrow cells are rare, Verfaillie admits. She estimates that perhaps 1 in 10 billion marrow cells has such versatility. And they are recognizable only by their abilities; the team has not yet found a molecular marker that distinguishes the unusually powerful cells from other bone marrow cells. Still, she says, her team has isolated “a handful” of such cells from 80% of the bone marrow samples they've taken. Although the versatile cells are more plentiful in children, Verfaillie's team has also found them in donors between 45 and 50 years old.

Verfaillie's work has not yet been published, nor have her observations been replicated. Even so, many researchers are excited by the work. The cells “look extremely interesting,” says hematologist and stem cell researcher Leonard Zon of Children's Hospital in Boston. Stem cell biologist Ihor Lemischka of Princeton University agrees. “I'm very intrigued,” he says, although he cautions that data from one lab should not outweigh the decades of research on mouse ES cells.
Besides skirting the ethical dilemmas surrounding research on embryonic and fetal stem cells, adult cells like Verfaillie's might have another advantage: They may be easier to manage. ES cells tend to differentiate spontaneously into all kinds of tissue. When injected under the skin of immune-compromised mice, for example, they grow into teratomas—tumors consisting of numerous cell types, from gut to skin. Before applying the cells in human disease, researchers will have to learn how to get them to produce only the desired cell types. “You don't want teeth or bone in your brain. You don't want muscle in your liver,” says stem cell researcher Evan Snyder of Children's Hospital in Boston. In contrast, Verfaillie says her cells are “better behaved.” They do not spontaneously differentiate but can be induced to do so by applying appropriate growth factors or other external cues.

Adult stem cells have a drawback, however, in that some seem to lose their ability to divide and differentiate after a time in culture. This short life-span might make them unsuitable for some medical applications. By contrast, mouse ES cells have a long track record in the lab, says Goodell, and so far it seems that they “are truly infinite in their capacity to divide. There are [mouse] cell lines that have been around for 10 years, and there is no evidence that they have lost their ‘stem cell-ness’ or their potency,” she says.

For these and other reasons, many researchers say, adult-derived stem cells are not going to be an exact substitute for embryonic or fetal cells. “There are adult cell types that may have the potential to repopulate a number of different types of tissues,” says Goodell. “But that does not mean they are ES cells. Embryonic stem cells have great potential. The last thing we should do is restrict research.” Right now, she says, stem cell specialists want to study both adult and embryonic stem cells to find out just what their capabilities might be.

That may be difficult.
At the moment, human ES cells are unavailable to most researchers because of proprietary concerns (see next story) and the uncertain legal status of the cells. Internationally, most research on human ES cells is on hold while legislatures and funding agencies wrestle with the ethical issues. In the United States, the National Institutes of Health is the government agency that would fund the research, and currently, researchers are not allowed to use NIH funds for work with human ES cells. Many European countries, too, are still developing new policies on the use of the cells (see Viewpoint by Lenoir, p. 1425).

The final version of NIH's guidelines for use of embryonic and fetal stem cells will not appear before early summer, says Lana Skirboll, NIH associate director for science policy. The draft guidelines would allow use of NIH funds for ES cell research as long as the derivation of the cells, by private institutions, met certain ethical standards (Science, 10 December 1999, p. 2050). But several members of Congress are considering legislation that would overrule the guidelines and block federal funding of ES cell research. At least some of that debate is likely to focus on whether adult stem cells do in fact have the potential to do as much as their embryonic precursors.

16. # The Business of Stem Cells

1. Eliot Marshall

Human stem cells have become one of the hottest areas in biotech as several companies have jumped in to try to exploit them commercially

When biologist James Thomson announced 15 months ago that he had grown human embryonic stem cells in a petri dish, scientists were excited about their potential uses in medicine. These cells, which are capable of developing into almost any other type of cell in the body, may one day provide an unlimited source of replacement tissues for treating human diseases. Some elected officials were less enthused, however; they were more concerned about the cells' source—human embryos. For now, at least, U.S. government rules that protect the embryo have put the cells off limits to most publicly funded researchers. But they aren't off limits for private companies. As a result, commercial enterprises now have the field almost exclusively to themselves.

One company, Geron Corp. of Menlo Park, California, has secured a commanding position. Geron not only bankrolled Thomson's work—gaining first rights to exploit the cells commercially—but it also funded the isolation of a second type of very early or “primordial” cell from human fetal tissue by John Gearhart of The Johns Hopkins University. Now, the company is gearing up an intensive research program aimed at turning both of these discoveries into therapeutic products. “We certainly have invested heavily” in the field, says Geron CEO and president Thomas Okarma, noting that exclusivity is the reward for “being smart and lucky.”

While Geron has nabbed the early lead in exploiting embryonic and primordial fetal stem cells, almost a dozen other biotech firms are elbowing their way into a crowded field to develop therapies using so-called “adult” stem cells. Once thought to be less versatile than primordial stem cells because they have already made a commitment to become particular cell types, these cells are now turning out to have greater than expected capabilities (see previous story). What's more, they pose fewer ethical problems because they can be obtained from sources other than embryos or aborted fetuses. And the companies using them argue that it may require less work to transform them into specialized cells for transplantation.

The whole field has a gold-rush aura, with biotech companies betting heavily on their own technologies and stock prices swinging on the latest announcements. Some companies are already moving into clinical trials for products that, they are quick to point out, might serve a vast pool of patients: the estimated 2 million people with severe osteoarthritis or Parkinson's disease.
“The great enthusiasm for stem cells,” says Ronald McKay of the National Institutes of Health (NIH), “is based on the idea that they can be manipulated and have highly reproducible properties. I think it's a very important step in biomedical research … to be able to use them directly in therapy.”

## Embryonic potential

Much of this heady anticipation was sparked by Thomson's and Gearhart's success in growing primordial stem cells. But academic researchers have been on the outside looking in, wondering when—and under what conditions—they may get to work with the new cell lines. For Gearhart's line, the answer is entirely up to Geron and its subsidiary, Roslin Bio-Med of Midlothian, Scotland: Geron controls all uses of the cell line through an exclusive license from Hopkins.

The company has less control over Thomson's cells, however. His institution, the University of Wisconsin, Madison (UW), insisted on retaining the right to distribute the cells to academics. On 1 February, UW established a new nonprofit subsidiary—WiCell Research Institute Inc., directed by Thomson—that will provide stem cells to approved applicants. The university has already received more than 100 requests, including 12 from private companies, according to Carl Gulbrandsen, director of the Wisconsin Alumni Research Foundation (WARF), which handles UW's patents.

Gulbrandsen says that anyone who wishes to use Thomson's cells will have to promise not to share them with others, not to “mingle” them with human embryonic cells to make a human clone, and not to attempt to grow them into embryos. WiCell will review each applicant's research agenda annually, Gulbrandsen says, but WARF insists that “our intention is to make these cells widely available and at a low cost for academic researchers.” Distribution hasn't begun yet, and federally funded researchers will have to wait until government rules for working with embryonic cells are finalized (Science, 10 December 1999, p. 2050).
Okarma also says his company won't try to go it alone in developing embryonic stem cells. Geron intends to recruit outsiders to work with its scientific staff to “drive” the stem cells into specific applications. In December, Geron held a meeting with 45 researchers at the Asilomar conference center in Monterey, California, to begin building a collaborative network. The conference brought together experts in cell regulation, gene insertion, and nuclear transfer (cloning), Okarma says. But the agenda and guest list are confidential.

Geron is also still very much involved in basic research. In the past year, Okarma says, company scientists have produced cardiac muscle cells and three types of nerve cells from the stem cells. They have also had “some success” in introducing new genes into stem cells to control their differentiation into specialized cells. Indeed, Okarma predicts, the first commercial payoff will come from identifying genes that either initiate, or help maintain, the development of specific cell types. The information will be useful, he hopes, in designing new therapies and screening candidate drugs.

## New cells, familiar sources

Primordial cells like Thomson's and Gearhart's have captured most of the attention, but adult stem cells have so far attracted far more investment. Many companies have focused on the hematopoietic stem cells of bone marrow, which give rise to all types of blood cells. Typical of this group are Nexell Therapeutics Inc. of Irvine, California, and Aastrom Biosciences of Ann Arbor, Michigan, both of which are developing systems to isolate such cells and grow them in large quantities, chiefly to aid in restoring cancer patients' immune systems after intense radiation or chemotherapy.

Osiris Therapeutics Inc. of Baltimore, Maryland, has identified a different type of cell, called mesenchymal stem cells, in the stroma, the supportive tissue that surrounds bone marrow.
It has patented systems for isolating and producing these cells and launched two clinical trials. Initially, Osiris is using the cells to help restore bone marrow in cancer patients, as the other companies are doing. Meanwhile, because mesenchymal cells can differentiate into cartilage, muscle cells, and possibly even some neuronlike cells, according to Osiris, the company is investigating whether they can be used to replace cartilage in arthritis patients, fix damaged tendons, and repair brain tissue. To help in these endeavors, Osiris's chief scientific officer, Daniel Marshak, says, “we are making the cells available” to all nonprofit labs through a private distributor, “so that everybody in the research community can move the field forward.”

Neural stem cells came on the scene later than the hematopoietic and mesenchymal cells, but in the past year they have become hot items because of their potential for treating patients whose brains have been damaged by disease or trauma. Indeed, investors are so keen on this idea that each new neural stem cell discovery seems to attract immediate investment. And the field is highly competitive.

Layton BioScience, a small private company in Atherton, California, has already begun clinical trials. It developed a line of cells derived from a germ line tumor that behave like neural stem cells, according to CEO Gary Snable. In 1998, University of Pittsburgh neurosurgeon Douglas Kondziolka transplanted the cells into the brains of 12 stroke patients and later reported that brain scans revealed increased glucose uptake in the affected area in several patients, an indication that the cells were alive and metabolically active. In late 1999, Layton licensed a different cell line derived from human fetal tissue and patented by neuroscientist Evan Snyder of Children's Hospital and Harvard Medical School in Boston.
Snyder's team has shown that the cells will engraft in the brains of experimental animals and is now testing them in models that mimic human diseases and spinal cord injury in preparation for a potential clinical trial next year. Snyder worries, however, that the field is becoming so hot that its credibility could be damaged by hype, and he says he aims to help deflate exaggerated claims.

Another small private company, NeuralSTEM Biopharmaceuticals of Bethesda, Maryland, plans to exploit human neural stem cells derived from embryos. Karl Johe, a former researcher in McKay's lab at NIH and now at NeuralSTEM, discovered a method of isolating and growing these cells in animals. NIH released the patent on the cells to NeuralSTEM, which was founded by McKay, attorney Richard Garr, and another investor. Garr, the CEO, says the company's first goals are to produce cells that can be transplanted into Parkinson's disease patients and to develop vectors that can deliver therapeutic proteins to the brain.

A similar project is taking shape on the West Coast, under the direction of Nobuko Uchida, who previously worked in immunologist Irving Weissman's lab at Stanford University. Uchida is now chief of neurology research at StemCells Inc., which Weissman helped found. StemCells is a subsidiary of a public company known as CytoTherapeutics Inc., in Sunnyvale, California, which announced last year that it was shedding all other investments to focus entirely on stem cells. It aims to commercialize Uchida's pending patent on a method that uses surface markers to isolate adult neural stem cells from brain tissue. Once the cells are in hand, the goal is to use them to treat patients with neurodegenerative diseases.

Another company that aims to attack the same medical problems is Neuronyx Inc., which just set up shop this month in Malvern, Pennsylvania, with backing from Hubert Schoemaker, the former CEO of Centocor.
Johnson & Johnson recently bought Centocor for $4.9 billion, and Schoemaker is using some of the proceeds to create his new company, which hopes to exploit embryonic stem cells for an agenda to be developed by research chief Tony Ho, a neuroscientist recently hired from Johns Hopkins.

Although most of this new business activity is taking place in the United States, several companies have sprung up elsewhere. ReNeuron, a small British company with a staff of about 17, is trying to commercialize stem cell work by three faculty members at the Institute of Psychiatry in London. With backing from the large biotech fund called Merlin Ventures, ReNeuron has established a line of neuroepithelial stem cells derived from fetal tissue. According to CEO Martin Edwards, the company hopes to begin transplanting these cells into stroke patients in a clinical trial “around the end of 2000.” Stem Cell Sciences, based in Melbourne, Australia, which has ties to embryologist Austin Smith of the University of Edinburgh in Scotland, is raising money for unspecified therapies using stem cells.

It is of course far too early to judge the likelihood of success for any of these investments. But one thing is certain: We will be hearing a lot more about the promise of stem cells in the next few years.

17. # Fetal Neuron Grafts Pave the Way for Stem Cell Therapies

1. Marcia Barinaga

A decade of experimental treatments using fetal neurons to replace brain cells that die in Parkinson's disease can provide lessons for planning stem cell therapies

Swedish neuroscientist Anders Björklund and his colleagues may have caught a glimpse of what the future holds for the treatment of failing organs. For more than 10 years, Björklund has been part of a team at Lund University in Sweden that has been grafting neurons from aborted fetuses into the brains of patients with Parkinson's disease. In many cases, the transplanted cells have dramatically relieved the patients' symptoms, which include slowness of movement and rigidity. That is just the kind of therapy that stem cell researchers hope to make routine for treating all sorts of degenerative diseases, if they can coax the cells to develop into limitless supplies of specific cell types that can be used to repair or replace damaged organs.

Although the current Parkinson's treatment uses fetal cells that have already developed into a particular type of neuron, the promising results represent a “proof of principle that cell replacement actually works,” says Björklund. The results have given researchers increased confidence that, if they can manipulate stem cells to develop into the kind of neuron the Lund group and others are using—a big challenge—the new cells would take over the work of damaged cells in the brains of Parkinson's patients. If so, Parkinson's treatment could be among the first applications of stem cell therapy.

The successes have also increased the urgency of developing stem cell treatments, because despite their promise, there are many reasons that fetal cells will never be widely used to treat Parkinson's disease. The reasons range from ethical concerns, such as the protests by antiabortionists that led the governor of Nebraska to urge that research involving fetal tissue be shut down at the University of Nebraska (Science, 14 January, p. 202), to the fact that there will never be enough fetal tissue to treat all the people who need it. Parkinson's disease afflicts 1 million people in the United States alone.

Researchers are now looking closely at the results from fetal cell transplants for lessons that will help guide future work with stem cells. There are still many hurdles to overcome, but this first round of cell replacement in the brain sets a “gold standard” that stem cells must meet if they are to become the basis for new Parkinson's treatments, says neuroscientist and stem cell researcher Evan Snyder of Harvard Medical School in Boston.

Parkinson's disease is a logical candidate for cell replacement therapy, in part because conventional treatments have had limited success. The disease is caused by the death, for unknown reasons, of a particular group of brain neurons that produce dopamine, one of the chemicals that transmit signals between nerve cells. Afflicted people lose the ability to control their movements, ultimately becoming rigid. Treatment with levodopa (L-dopa), a drug that is converted to dopamine by the brain, alleviates these symptoms, but as the neurons continue to die, L-dopa's effectiveness wanes. Researchers first tried replacing the dopamine-producing cells by grafting into the affected region cells from the adrenal medulla, a part of the adrenal gland. These cells are not neurons, but they make dopamine and can be coaxed to become neuronlike. The treatment reversed Parkinson's symptoms in rats, but produced little lasting improvement in human patients, probably because the cells died or stopped making dopamine, says John Sladek, chair of neuroscience at the Chicago Medical School.

Researchers have had better luck grafting immature neurons taken from aborted human fetuses. Dozens of patients who have received these experimental dopamine-neuron grafts over the past 10 years have had up to a 50% reduction in their symptoms. And the effects appear to last. Using positron emission tomography to image the brain, Olle Lindvall of Lund University and a team of colleagues in Lund and at Hammersmith Hospital in London reported in the December issue of Nature Neuroscience that, in one patient, the transplanted neurons are still alive and making dopamine 10 years after the surgery. That's encouraging, says neurotransplant researcher Ole Isacson of Harvard Medical School: Whatever killed the brain's own dopamine-producing neurons doesn't seem to have killed the transplanted cells.

Still, fetal cell transplants are plagued by problems that can never be overcome. Aside from ethical concerns about scavenging neurons from aborted fetuses, there are practical issues. It takes six fetuses to provide enough material to treat one Parkinson's patient, in part because as many as 90% to 95% of the neurons die shortly after they are grafted. Indeed, Lund's Björklund says, the cell supply is so limited that researchers have not even been able to test some possible avenues for fetal cell transplants. The neurons that die in Parkinson's originate in a brain region called the substantia nigra and send their long axons to several other areas, where they release dopamine. So far, researchers have put the cell grafts into only one of these areas, the putamen—and even there, they have not yet transplanted enough neurons to restore normal dopamine levels in most cases.

Even if researchers can develop techniques that diminish the fetal cell die-off, there will never be enough fetuses available to make this an “everyday procedure,” says Sladek. What's more, the brain material recovered from aborted fetuses “comes out in a form that makes it difficult to … standardize” in terms of quality and purity, Björklund says. This is likely why some patients do far better than others—uncertainty that would be unacceptable in a standard medical treatment.

Consequently, researchers are pinning their hopes on cultured stem cells. They would eliminate a continuing dependence on aborted fetuses, although the ethical concerns won't be completely laid to rest unless researchers can use stem cells obtained from adults rather than embryos (see p. 1418). And the supply of cultured cells could be unlimited, allowing tests of grafts into the putamen and possibly into other brain areas as well. The cell treatment, moreover, could be standardized and controlled to assure a more predictable outcome. “The ability to grow the cells of interest will make this a routine technology,” predicts neuroscientist Ron McKay, whose team is working on ways to culture neural stem cells at the National Institute of Neurological Disorders and Stroke.

But to make this brave new world of cell replacement technology a reality, researchers must first learn how to keep stem cells dividing for many generations in culture and then be able to trigger them to differentiate into the type of neuron they want. Stem cells presumably have the ability to differentiate into any of the several different types of dopamine neurons the brain contains, but it may be crucial to use the specific type of dopamine neurons that die in Parkinson's. Researchers doing the fetal cell transplants specifically select these neurons—known as nigral neurons because they originate in the substantia nigra—when they harvest neurons for grafting from fetal brains. Nigral neurons are “genetically programmed and designed to be a dopamine neuron in the appropriate brain circuit,” Sladek says.

Among other things, nigral neurons may respond better to local conditions, producing just enough dopamine. Experience with L-dopa treatment has shown that too much of the neurotransmitter can be just as problematic as too little, causing uncontrollable, jerky movements in patients. Researchers worry that stem cells coaxed to develop into dopamine neurons may become one of the nonnigral types and will not regulate their dopamine output in the appropriate ways. “It is like putting the right alternator into your car,” Sladek says. “If you put in one designed for another model of car, it may not work as well.”

Getting stem cells to differentiate into the right type of neuron may be only part of the problem. Neurons in their natural environment are surrounded by support cells called glia, which nurture the neurons and even modulate their activity, and optimal cell transplants may require replacing not only dopamine neurons but also the glia that normally surround them, Harvard's Snyder suggests.

But some researchers believe that the brain itself may be able to overcome the hurdles of producing both the proper neurons and the support cells they need. Snyder's lab has shown in animal experiments that stem cells put into the brain can be influenced by the brain environment to differentiate into both neurons and support cells. He envisions someday putting stem cells into Parkinson's brains and letting the brain tell them which cell types to become.

Even if it turns out not to be quite that simple, Parkinson's poses a much less daunting challenge for cell replacement therapy than do other neurological disorders. “The dopaminergic system is a fairly easy system to work with compared to sensorimotor or visual systems or spinal cord,” says Isacson. That, he says, is because the nigral neurons lost in Parkinson's disease have a diffuse and relatively nonspecific network of connections in the brain areas they link up to, rather than the very intricate and precise connections made by neurons in many other parts of the nervous system.

The treatment of most other brain disorders would likely require coaxing new neurons to make very precise connections, a task that no one is sure how to achieve. But in the case of Parkinson's disease, simply getting neurons to release dopamine in the correct general area helps patients. Because of that difference, Isacson predicts that “it will take some time to get other diseases to benefit from all these discoveries.” Nevertheless, a successful Parkinson's treatment based on stem cells would still be a dramatic achievement. “You would help a huge number of patients,” he says, “as many as the surgeons could do.”