# News this Week

Science  12 May 2000:
Vol. 288, Issue 5468, pp. 938
1. SPACE BIOLOGY

# Goldin Shakes Up NASA's Life Sciences Program

1. Andrew Lawler

These are boom times for the life sciences, as the National Institutes of Health (NIH) and other federal agencies rack up record budgets to study everything from the workings of the brain to life in deep-sea vents. But one community has yet to cash in on the biological bonanza: Faced with repeated delays to the international space station and surrounded by a sometimes hostile engineering culture, NASA life scientists are struggling to garner resources from inside the agency and respect from their often skeptical peers outside.

Last week, NASA Administrator Dan Goldin moved decisively to shake up the agency's troubled $300 million life and microgravity sciences effort. On 1 May he announced the transfer of the office's chief, Arnauld Nicogossian, to a new position and the start of an expedited search for his successor. Meanwhile, NASA chief scientist Kathie Olsen is reviewing the agency's biological program, which has come under repeated criticism from the National Research Council for its inbred culture, insufficient peer review, and tendency to overstate its findings (Science, 10 March, p. 1728). Life scientists hope the moves are the first steps toward creating a world-class research program at an agency that has traditionally given short shrift to the field. “There is an opportunity here to bring new blood into the leadership of life and microgravity sciences, and to spearhead a new approach with the outside research community,” says Ken Baldwin, a physiologist at the University of California, Irvine. “It would be really exciting if Dan [Goldin] is willing to make a fresh start so we could have a program worthy of the $60 billion [space station] lab we are building,” adds Andrew Gaffney, a Vanderbilt University cardiologist and former NASA scientist and astronaut.

Any major reform effort faces formidable obstacles, however. The entire life and microgravity sciences and applications program accounts for only 2% of the agency's $14 billion annual budget, which is devoted mainly to large engineering projects, and only 10% of that slice goes for fundamental biological research. About half is spent on a host of microgravity research projects in biotechnology, fluid physics, and combustion and materials science, with another quarter for biomedical research on measures to counter the health hazards of space living. The rest involves technology associated with human space flight and astronaut health. “The priority has always been, and rightly so, the health and safety of astronauts,” notes Frank Sulzman, who retired last year as a manager in NASA's life and microgravity sciences office. The agency's culture also is vastly different from that found in universities. “NASA has a quasi-military mindset, not accustomed to the open debate of academe,” says Sulzman. “It's follow the leader, and dissent is disloyalty.” In addition, he says, NASA's scientific acumen is shaped by astronomy and astrophysics, which are based on observational research. “The culture at NASA just doesn't get experimental research, in which you need more controls and to set parameters in objective and rational ways,” says Sulzman. Unfortunately, those controls and parameters often conflict with other needs on a crowded and complex spacecraft. Many scientists say that NASA's course in the life sciences has been set over many years by Nicogossian, a physician by training, and his allies George Abbey and Carolyn Huntoon, the current and former heads, respectively, of NASA's Johnson Space Center in Houston. “The Abbey-Huntoon-Nicogossian Mafia,” says Gaffney, “ran the program not on the basis of what was the best science. … Maybe those days are over.” Other researchers privately concur with that analysis. 
Adds one: “Arnauld used a physician's approach—and that approach is not one that has helped the life sciences flourish.” Neither Abbey nor Huntoon could be reached for comment, but Nicogossian bristles at such criticism. He says his office is improving interdisciplinary research, strengthening ties with NIH, and reaching out to the broader life sciences community. “We are set to change things,” says Nicogossian, who will assume the new job of chief health and medical officer for the agency. Nicogossian will run programs concerned with astronaut health and safety, leaving the life and microgravity sciences office with a stronger research focus. Other sources say that a new cadre of prominent biologists hired by Goldin is behind the changes. Olsen, a neurobiologist, says now is “an opportune time” to reconsider the office's focus as part of her review. And Nobel Prize-winning biologist Baruch Blumberg—who now heads the Astrobiology Institute at NASA's Ames Research Center in Mountain View, California—says that as a result of the review of biology, “there is going to be more of it,” although the details have not been worked out. NASA plans to pick a new life sciences chief within the next few months, and outside scientists are hoping for a researcher with management and program experience. “We need leadership by someone who is actively immersed in biological research,” says Gary Stein, a cell biologist at the University of Massachusetts Medical School in Amherst. Outside researchers say the new chief needs to strengthen peer review, encourage more ground-based experiments as precursors to space flight, and build stronger relationships outside the traditional community. “You need to elevate peer-reviewed, ground-based science to build a stable of stellar scientists, so you have the crème de la crème once the station is ready,” says Baldwin. 
One pressing problem is how to maintain interest in a program with few research opportunities until the space station is ready to do science in 2005. “We're at a very critical stage in building the space station,” says Gaffney, “and not to do good science once it is ready would be an absolute tragedy.”

2. HUMAN GENOME PROJECT

# Chromosome 21 Done, Phase Two Begun

1. Elizabeth Pennisi*

1. With reporting by Leslie Roberts.

Chalk up two achievements—one public relations, the other scientific—for the international consortium in the race to sequence the human genome. This week, on the eve of an annual genome meeting at Cold Spring Harbor Laboratory in New York—and, perhaps, to preempt an expected statement from rival Celera Genomics of Rockville, Maryland—the international consortium announced on 8 May that it has completed (almost) “phase one” of the project, the rough draft of the human genome. (85% of the promised 90% of the draft sequence is now available in GenBank, says the consortium.) On 9 May, the consortium entered “phase two” and turned its collective sequencing firepower to “finishing” the human genome—that is, producing a 99.99% accurate sequence. Celera head J. Craig Venter dismissed the announcement as an “artificial” milestone and “science by press release”—a tactic he has used himself. But Venter lauded the consortium's other announcement: the complete sequence of chromosome 21, only the second chromosome to be finished. Already, this chromosome, published electronically this week and in print in the 18 May issue of Nature, has reached the gold standard for which the consortium is striving for the entire genome, says Yoshiyuki Sakaki, who directs the human genome sequencing effort at the Institute of Physical and Chemical Research (RIKEN) outside Tokyo and whose team did half the sequencing.
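The announcement's "85% of the promised 90%" compounds to a bit over three-quarters of the whole genome; a one-line check (this is our arithmetic on the consortium's figures, not a claim from the announcement itself):

```python
# "85% of the promised 90%": what fraction of the entire human genome
# that puts in GenBank (our arithmetic, using the article's figures).
promised = 0.90    # fraction of the genome pledged for the rough draft
delivered = 0.85   # fraction of that pledge reported as available
in_genbank = promised * delivered
print(in_genbank)  # about 0.765, i.e. roughly three-quarters of the genome
```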
Together, the 62 scientists from 13 labs have determined the identities of 33.5 million bases of the long arm and another 280,000 bases of the short arm of chromosome 21. Just three clone gaps remain—stretches, each about 30,000 bases long, that could not be determined with current technology. By contrast, chromosome 22, which is roughly the same size and was completed last December, has 10 such gaps (Nature, 2 December 1999). In another technical tour de force, the chromosome 21 team can also claim the longest contiguous stretch of DNA ever sequenced, at 25.5 million bases, says Sakaki. Sequencing proceeded so quickly in part because several groups interested in Down syndrome had begun mapping chromosome 21 in the 1980s, before the Human Genome Project even existed. In Down syndrome, an extra copy of chromosome 21 results in mental retardation, heart problems, and other abnormalities. Now the task of figuring out just what goes wrong in this devastating disease will be far easier. Instead of finding the expected 800 to 1000 genes, gene prediction programs came up with just 225. “We now have a real definition of who the [genetic] players are,” says Roger Reeves, a geneticist at Johns Hopkins University School of Medicine. The paucity of genes on chromosome 21—one 7-million-base stretch contains just one gene—may have broader implications as well: If the gene prediction programs prove correct, then the entire human genome could have fewer than the 100,000 genes previously estimated. As sequencers enter the home stretch, competing claims of genome accomplishments should be coming fast and furious. But, says Reeves, it will be quite a while before any sequence beats the accuracy and completeness of chromosome 21.

3. STRUCTURAL GENOMICS

# Protein Data Justice for All

1. Robert F. Service*

1. With reporting by Michael Hagmann in Cambridge, U.K.
Scientists who crack protein structures and colleagues who want to decipher what these proteins do are on the verge of a watershed agreement that would usher structural biology into the genomic era. The carefully crafted guidelines are designed to help coordinate international financing of publicly funded protein structure efforts and ensure prompt release of structure data so that no team has an unfair advantage in working out the functions of unknown proteins. The guidelines, being finalized as Science went to press, come at a time when robotics and computer automation promise to transform structural biology into a high-speed effort, dubbed “structural genomics,” in which researchers will churn out thousands of protein structures in the next 5 years. Nurturing this souped-up approach, the National Institutes of Health (NIH) this fall plans to fund up to six structural genomics pilot centers to establish and test techniques for high-throughput x-ray crystallography and nuclear magnetic resonance (NMR) spectroscopy, the workhorse technologies of structural biology. Similar approaches are being adopted or considered in Japan, the United Kingdom, France, Brazil, and Germany (Science, 17 March, p. 1954). To help coordinate these efforts, officials at NIH and Britain's Wellcome Trust last month brought some 50 leading protein specialists to the Wellcome Trust Genome Campus in Cambridge, U.K., for a brainstorming session on how to release data quickly and fairly. They had a gulf to bridge. “Bioinformatics experts want fast access” so they can run follow-up experiments, says John Norvell, who heads the structural genomics program at the National Institute of General Medical Sciences in Bethesda, Maryland. But “experimentalists want time to check the data.” The issue boils down to how protein structures are gleaned and checked before release, says Wayne Hendrickson, a structural biologist at Columbia University. 
Computer programs, fed data from x-ray crystallography and NMR experiments, generate the likeliest set of three-dimensional coordinates for all of a protein's atoms. Bioinformatics experts initially wanted guidelines that mandate the release of those computer predictions the instant they are produced. Such a policy would be similar to the way sequence data from the publicly funded human genome project are posted daily on the Web. “That did not fly,” says Tom Terwilliger, an x-ray crystallographer at Los Alamos National Laboratory in New Mexico. Experimentalists maintain that protein structure analysis is more complex than spitting out raw genome sequence data. Each modeling prediction must be vetted, Hendrickson says. Several participants, he says, felt there's “no need to abandon the current standards of investigators making the decision” on when data are ripe for release. But although structural biologists will still make the call on when data are solid, they won't be allowed to withhold a structure for the sake of determining its function. That means changing the status quo. When a protein structure is submitted to a journal today, Hendrickson says, it's almost always accompanied by findings—from experiments that alter key amino acids in the protein, for instance—that allow scientists to make educated guesses about how the protein works. But with a high-speed approach to solving protein structures, says Norvell, “publishing will have to be done in a different way.” NIH and other agencies that plan to pour money into the structural genomics centers don't want to freeze out biologists not associated with the centers. 
According to Hendrickson, “everyone agreed that the concept should not give those groups a privileged status.” As a compromise, researchers will be asked to publish their results—most likely in electronic format or as a brief summary in a specialist journal—within 2 to 4 weeks of finishing a structure, says Stephen Burley, a structural biologist at The Rockefeller University in New York City. “The moment the paper is posted on the Web,” he says, “the coordinates would be placed in the Protein Data Bank,” which is freely available to all researchers. The burden will be primarily on funders to enforce the timelines. They're accustomed to that, Burley says: Agencies regularly use their leverage over purse strings to ensure that structural biologists submit coordinates to the Protein Data Bank as soon as findings are published. As an additional prod, structural biologists plan to add a little peer pressure. Hendrickson and others say the new guidelines call on each structural genomics center to keep a log on the Web of which structures they are attempting to solve. They would chart milestones such as cloning, isolating, and purifying a protein, and coaxing it to form a crystal. This will not only help to prevent several groups from working on the same projects, says Hendrickson, but “it will put internal pressure on the groups that they wouldn't be able to hold something forever.”

4. GLOBAL WARMING

# Some Coral Bouncing Back From El Niño

1. Dennis Normile

Coral reefs in the Indian and Pacific oceans seem to be recovering more quickly than expected from a recent devastating “bleaching” caused by high ocean temperatures. New research suggests that the nascent recoveries may be partly due to the unexpected survival of juvenile coral that somehow avoided the brunt of the environmental assault.
“It may indicate that reefs are more resilient than we had thought,” says Terry Done, a senior research scientist at the Australian Institute of Marine Science in Cape Ferguson who studies reefs in the Indian Ocean. However, the coral would not be able to mature and recover from the repeated bleaching forecast to accompany projected global warming, he adds. Coral stressed by heat or disease expel zooxanthellae, the symbiotic algae that give the otherwise white coral its color, through an imperfectly understood mechanism. But severe bleaching, a result of an excessive loss of zooxanthellae, can kill the coral. Record-high sea surface temperatures in tropical regions during the 1997-98 El Niño-Southern Oscillation led to the most extensive coral bleaching event ever seen. And because coral reefs host a disproportionate share of marine life, their loss represents a serious blow to the ecosystem. Done studies a reef outside one of the Lakshadweep Islands off India's west coast that, like many Indian Ocean reefs, looked “like a graveyard” after the 1998 bleaching. But in a March survey of the reef, Done found a surprising amount of new coral, of a size that indicates it must have settled and begun growing at just about the time of the bleaching. Their presence poses a mystery. “If they drifted in after the bleaching, it isn't clear where they would have come from,” Done says, as several species of adult coral were virtually wiped out in the bleaching. But if they arrived before the bleaching, as he suspects, “how did they survive?” One bleaching hypothesis holds that the high water temperatures disrupt enzymes in the zooxanthellae, which then create toxic byproducts during photosynthesis. Done speculates that the new coral had minimal zooxanthellae in their tissue, which would have spared them these toxic effects. And by lodging in cracks and fissures, the coral were shielded from the sun's normal ultraviolet radiation that would have accelerated production of the toxins.
Other scientists are seeing similar recoveries, although they say the recovery mechanism is not clear. William Allison, a marine biologist with Sea Explorers Association in the Maldives, also reports seeing “survival of corals that must have settled during or shortly before or after the bleaching event.” And Robert Richmond, a marine biologist at the University of Guam in Mangilao, says that recent surveys of reefs near Palau have turned up “some very nice [coral] recruits in areas that had been hit by the bleaching events.” Although these corals likely have arrived since the bleaching, he says that any recovery is “good news.” However, not all reports are so upbeat. In a February survey of reefs near Belize in the Caribbean that virtually collapsed during the El Niño bleaching, Richard Ransom of Dauphin Island Sea Lab in Alabama found “no signs of recovery.” Even the recovery events in the Pacific and Indian oceans, he notes, are patchy and seem to depend on the absence of other stress factors. That connection puts the fate of the coral reefs in the hands of any future global warming trend. Coral have a good chance to recover from a one-time, short-term disturbance like bleaching if water and substrata quality are sufficiently high and there is a viable adult population within a reasonable distance, says Richmond. “But if you then kick it and stomp on it, there is a limit to the stress it can take,” he adds. Done notes that it may become harder for coral reefs to enjoy the decade or more of undisturbed growth needed for a full recovery if global warming triggers more episodes of severe bleaching. Under such a scenario, he says, “this recovery won't do the reefs much good. They'll no sooner get 1 or 2 years old before they'll be wiped out again.”

5. CLIMATE CHANGE

# Panel Estimates Possible Carbon 'Sinks'

1. Jocelyn Kaiser

1. IPCC Special Report: Land Use, Land-Use Change, and Forestry.
Available in June; E-mail: ipcc-sec@gateway.umo.ch

The December 1997 Kyoto Protocol was a major milestone on the road to addressing human-induced climate change, but it left unanswered many questions that have stalled the treaty's ratification. One of the most vexing: Just how much carbon can a forest or farmland hold, and how easy is it to measure? The reason it's an issue is that the protocol allows developed countries to meet tough carbon emissions targets—5% below 1990 levels, between 2008 and 2012—both by reducing emissions from fossil fuels and, in a controversial option, by “sequestering” carbon in forests or other types of land. A new report* attempts to answer the question scientifically, but it is unlikely to make the option more palatable politically. Some European countries and environmental groups object to allowing developed countries to get too much credit for carbon sinks because they think it will let energy-guzzling countries such as the United States off the hook. Storing carbon in forests and fields won't stave off the need to find renewable energy sources, and it's too difficult to accurately track how much carbon they hold, say such groups as the World Wildlife Fund. Leery that countries won't cut emissions and that sequestration won't work, these critics want to limit how much of a country's emissions can be reduced this way. Leaving the political issues aside, the new report—written by a scientific panel of the United Nations-sponsored Intergovernmental Panel on Climate Change (IPCC) and approved by delegates from over 100 countries this week at a meeting in Montreal—adds long-awaited numbers and a how-to manual to what has, until now, been a murky academic debate. “There was a lot of mystery for the delegates. Now the scientists have really aired it out, and there's a good source of information,” says Mike Coda, a climate change analyst with The Nature Conservancy in Arlington, Virginia.
The 460-page report runs through a dozen types of land uses, such as regenerating forests or converting cropland to grassland, spelling out just how much carbon might be socked away in each. The answer is, a lot. The report, informally known as the “sinks” report, suggests that, at first glance, the 41 developed countries could meet their Kyoto goals of cutting 200 megatons of carbon emissions per year entirely through land use changes rather than by reducing emissions from fossil fuels (see table). But following this route could be tricky, cautions the report, which offers a plethora of caveats concerning the difficulty of measuring how much these lands store. For forests, however, the tally is mandatory: Countries will get credit for the forests they have grown or be docked for the trees they have lost since 1990. With so much at stake, just defining what a forest is has been controversial. For example, define a forest as 70% canopy cover, and countries with savanna forests, which have sparse canopies, might feel free to mow them down. Also crucial, the report notes, is whether forests that are harvested commercially and replanted should be counted. But answering those questions is far simpler than estimating how much carbon might be stored by changing the use of other types of land, such as farmland. The protocol makes a vague reference to sequestering carbon through these “additional activities”; the IPCC panel ended up with the tough task of clarifying this option. First, scientists had to agree on how much of a particular land type exists globally, and then how much carbon it might hold if its management changed. Improving agricultural practices over the 1300 megahectares now in use, for instance, could save 125 megatons of carbon a year, the panel estimated. But experts caution that such estimates are optimistic and difficult to verify. 
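As a rough plausibility check on those numbers (this is our back-of-envelope arithmetic on the article's figures, not the panel's methodology), the cropland estimate implies a modest per-hectare sequestration rate:

```python
# Article's figures: improving practices on 1300 megahectares of farmland
# could save an estimated 125 megatons of carbon per year.
area_ha = 1300e6           # 1300 megahectares, expressed in hectares
saving_t_per_yr = 125e6    # 125 megatons of carbon, in tonnes per year

rate = saving_t_per_yr / area_ha   # tonnes of carbon per hectare per year
print(round(rate, 3))              # about 0.096 t C/ha per year

# Against the developed countries' combined Kyoto shortfall of roughly
# 200 megatons per year, that single activity would cover well over half.
share = 125 / 200
print(share)  # 0.625
```

The tiny per-hectare rate is one way to see why the report cautions that such savings would be hard to measure and verify.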
Compared to forests, which “are pretty easy to see from space,” tracking carbon soaked up by fields is “a lot harder,” says panelist Richard Houghton of the Woods Hole Research Center in Massachusetts. The report also discusses the feasibility of allowing developed countries to offset emissions by planting, protecting, or managing forests in developing countries (Science, 24 July 1998, p. 504). Such mechanisms can have “benefits,” says the report, but there are risks, for instance, that displaced people will deforest lands elsewhere. Some European countries want to limit such offsets, maintaining that developed countries should reduce their own use of fossil fuels instead. Now that this report has laid out carbon accounting options, countries must decide which ones to pursue before the next major meeting of Kyoto parties in November to finalize the treaty.

• *At rate of current activities using IPCC definitions.

6. CIRCADIAN RHYTHMS

# Two Feedback Loops Run Mammalian Clock

1. Marcia Barinaga

As a grandfather clock keeps time with an oscillating pendulum, the 24-hour rhythm of the biological clock is also maintained by oscillations—in this case by oscillating levels of proteins. But the biological clock has two oscillations moving in counterpoint, the levels of one set of proteins cresting while the others are low, and vice versa. Results described on page 1013 now show how the mammalian form of the biological clock keeps those opposed oscillations in sync. A team led by Steven Reppert of Harvard Medical School in Boston reports that the key to this regulation seems to be a pair of proteins that enter the cell nucleus together but then apparently split their duties. One, called CRYPTOCHROME (CRY), turns off a set of genes, while the other, PERIOD2 (PER2), turns on a key gene.
The work “explains how genes can be activated in two opposite phases,” says clock researcher Paul Hardin of the University of Houston, whose group recently made a similar discovery about the clock of fruit flies. Researchers are excited by the way the new work clarifies the role of CRY in the mammalian clock. In fruit flies, CRY, which absorbs light, helps reset the clock in response to light (Science, 23 July 1999, p. 506). But the mammalian clock, deep in the brain, doesn't receive direct light input, so researchers wondered what function CRY could be serving there. Reppert's team has now “firmly established” that CRY is a central component of the clockworks, where it turns off key clock genes, says circadian rhythm researcher Steve Kay of the Scripps Research Institute in La Jolla, California. What's more, it seems able to do this alone, without the aid of PER2, the protein previously thought to do the job. The Reppert team's findings build on work on the fruit fly clock, which features a negative feedback loop similar to the one in which CRY participates. In flies, the feedback is accomplished by PER together with a protein called TIMELESS (TIM). The per and tim genes turn on in the morning, and the two proteins accumulate in the cytoplasm during the day. By evening, when they reach a critical concentration, they pair up and go to the nucleus to shut down their own genes. This feedback helps keep PER and TIM protein levels oscillating up and down every 24 hours. But that is only half of the story. A protein called CLK oscillates in counterpoint with PER and TIM; its levels rise as theirs fall and vice versa. CLK is a positive regulator that pairs with a protein called CYC to turn the per and tim genes on. Indeed, PER and TIM shut their genes off by binding to and inactivating CLK and CYC. 
Mammalian clocks use many of these same proteins, although mammals have three PERs and two CRYs, and in some cases—such as that of the CRYs—the proteins play different roles in the two clocks. In mammals, CYC's counterpart is a protein called BMAL, and it is BMAL whose levels cycle in counterpoint to PER's. To get a better fix on how the mammalian cycling works, Reppert collaborated with two research teams that had produced mice with mutant clock genes. Gijsbertus van der Horst of Erasmus University in Rotterdam, the Netherlands, and his co-workers brought to the collaboration a strain of mice that lacks functional cry genes, and Cheng Chi Lee of Baylor College of Medicine in Houston contributed mice with a mutant per2 gene. Neither strain has a working clock, but the clocks are not identically broken. It's like “hitting a clock with a sledgehammer in different ways,” Reppert says. By examining how the clockworks have failed, the researchers gained clues to the function of the protein encoded by the mutant genes. For example, results with mice that have a mutation in the per2 gene suggest that its protein acts as a positive gene regulator, switching on a key clock gene. Reppert's Harvard colleagues Lauren Shearman and David Weaver deduced that from their observation that, in per2 mutants, the bmal gene is expressed at lower levels than in normal mice, implying that PER2 normally turns bmal on. Work with the cry mutants showed that the CRY proteins normally turn off the per and cry genes. Van der Horst's team had already shown that per gene activity is high in the cry mutant mice. Now, Sathyanarayanan Sriram, a postdoc with Reppert, has confirmed in cell culture studies that CRY protein alone can turn the genes off, without any help from PER. CRY apparently down-regulates the genes much as PER and TIM do in flies—by binding to CLK and BMAL and taking them out of action. 
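The regulatory circuitry described above can be written down explicitly. In the sketch below, the gene and protein names follow the article's account; the signed-edge encoding and the odd-number-of-repressions parity rule are standard feedback-loop bookkeeping, not something from the paper itself:

```python
# Each regulatory step is (regulator, target, sign): "+" = turns on,
# "-" = turns off. Names follow the article's description of the
# mammalian clock; the encoding is ours.
positive_loop = [
    ("PER2", "bmal", "+"),   # PER2 turns the bmal gene on
    ("BMAL", "per2", "+"),   # BMAL (with CLK) turns the per genes on, after a delay
]
negative_loop = [
    ("BMAL", "cry", "+"),           # BMAL (with CLK) also turns the cry gene on
    ("CRY", "bmal_activity", "-"),  # CRY binds CLK/BMAL, shutting per and cry off
]

def net_sign(loop):
    """A feedback loop is net-negative iff it contains an odd number of
    repressive ('-') steps; otherwise it is net-positive."""
    repressions = sum(1 for _, _, sign in loop if sign == "-")
    return "-" if repressions % 2 else "+"

print(net_sign(positive_loop))  # "+"  self-reinforcing loop via PER2 and BMAL
print(net_sign(negative_loop))  # "-"  self-limiting loop via CRY
```

Interlocking one net-positive and one net-negative loop in this way is what allows the two protein pools to crest in opposite phases.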
Experiments with the cry mutant mice also suggested a second role for CRY: It seems to stabilize PER2. That conclusion came out of the fact that despite all the per gene activity in the clock cells of the mutants, the researchers could find no PER protein there. That suggests PER protein is quickly degraded in the absence of CRY. The bmal gene's activity was also low in the mutants, as PER is needed to turn the gene on. Putting it all together, the team came up with two interlocking loops by which proteins feed back on the expression of their own and other genes. In one loop, PER2 turns on the bmal gene. BMAL, after a delay, returns to turn on the cry and per genes, triggering the second loop. In that loop, CRY and PER proteins accumulate and then pair up and enter the nucleus, where CRY turns off the cry and per genes and PER2 once again turns on bmal. This picture is similar to what Hardin's team found last year in fruit flies, although in the fruit fly clock both PER and its partner, TIM, seem to work together to turn off their own genes and turn on CLK (Science, 22 October 1999, p. 766). The paper “is a great move in the right direction,” says Scripps's Kay. “It is doing what needs to be done, which is to work out the real mechanics of the clock” in mammals. Many questions remain, such as how PER regulates the bmal gene, says Kay, but with the studies moving along like clockwork, those answers are sure to follow soon.

7. PHYSICS

# A Slow Carousel Ride Gauges Gravity's Pull

1. Charles Seife

Sometimes progress starts with a big step backward. After 14 years of gravitational confusion, physicists at the University of Washington, Seattle, have released the most precise measurement yet of the strength of gravity, thanks to a clever new device. Although scientists have been studying gravity since the time of Newton, they have had little luck measuring its pull.
The strength of gravity, represented by a universal constant nicknamed “big G,” is puny; huge amounts of mass exert only a small gravitational attraction. As a result, seismic disturbances, minute electric and magnetic fields, and even the mass of a nearby graduate student can mess up laboratory measurements of G. Such measurements date back to the end of the 18th century, when the English physicist Henry Cavendish dangled a dumbbell-shaped pendulum from a thread and placed heavy masses nearby. By measuring how much the dumbbell twisted under the attraction of the masses, Cavendish obtained a fairly good measurement of big G. Over the years, Cavendish-like torsion pendulums and other devices yielded better and better values. In 1986, the National Institute of Standards and Technology (NIST) published a value with an uncertainty of only 1.3 parts in 10,000. Then things started to go downhill. Also in 1986, the PTB, the German equivalent of NIST, performed a technically exquisite experiment that yielded a value 42 standard deviations away from other measurements. “That was quite startling,” says NIST's Peter Mohr. “Nobody knows quite what was wrong with it.” To make matters worse, in 1995, physicists realized that, because the pendulum wires in Cavendish-style torsion devices are not perfectly elastic, they don't twist in quite the way that scientists had assumed. “[It] should have been obvious,” says Riley Newman, a physicist at the University of California, Irvine. “You get a version of G which is too big.” NIST hiked its uncertainty about big G by a factor of 12, to a mortifying 15 parts in 10,000. Enter the big G whizzes of Seattle. At last week's meeting of the American Physical Society,* physicist Jens Gundlach announced that he and his colleagues had eliminated the string-twisting bias and measured big G with an error of a mere 14 parts per million—about 10 times better than previous measurements.
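Put on a common scale, the uncertainties quoted above compare as follows (our arithmetic on the article's figures):

```python
# Relative uncertainties in big G cited in the article.
nist_1986 = 1.3 / 10_000     # 1.3 parts in 10,000 (1986 value)
nist_revised = 15 / 10_000   # widened ~12x after the 1995 torsion-fiber problem
gundlach = 14 / 1_000_000    # 14 parts per million (new measurement)

print(round(nist_1986 / gundlach, 1))  # ~9.3: the "about 10 times better"
print(round(nist_revised / gundlach))  # ~107x tighter than the revised error bar
```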
The key to the newfound precision was keeping their experimental apparatus in constant motion. Gundlach's team mounted the pendulum's support on a turntable that rotates about once every 20 minutes. As the ends of the pendulum approached the attractor masses—four 8-kilogram steel balls—they felt the increased gravitational force. But whenever the pendulum began to twist, a laser sensor triggered a switch that accelerated the turntable, counteracting the torque. “The torsion fiber hardly gets twisted,” says Gundlach. “The gravitational acceleration is transferred to the turntable,” getting rid of the string-twisting bias. Meanwhile, the attractor masses rotated in the opposite direction from the turntable with a period of 5 minutes. That second rotation screened out unaccounted-for gravitational influences from the outside world by turning them into a periodic signal that could easily be subtracted from the data. “You can walk up to this thing, and it won't affect the value,” Gundlach says. The result was a value of G (tentatively 6.67423 ± 0.00009 × 10⁻¹¹ m³ kg⁻¹ s⁻²) far more precise than physicists need for practical purposes. “It's one of the fundamental constants,” Gundlach says. “Mankind should just know it. It's a philosophical thing.”

• * 29 April to 3 May, Long Beach, California.

8. PALEOCLIMATOLOGY

# Viable But Variable Ancient El Niño Spied

1. Richard A. Kerr

A climate record preserved in the bottoms of now-vanished lakes in New England shows traces of El Niño as far back as 17,500 years ago. The findings, reported on page 1039 of this issue, indicate that the tempo of tropical Pacific warmings remained roughly constant, but the beat strengthened and weakened for thousands of years at a time. “It's nifty stuff,” says paleoclimatologist David Rea of the University of Michigan, Ann Arbor.
“This is new information—how long does a climate mode last, and when does it switch to the next mode?” The results fit well with new climate models that suggest that periods of weakened El Niños rhythmically alternate with the current mode of strong El Niños. Driving these swings, at least in the models, are periodic variations of solar heating as Earth wobbles on its spin axis. The record of these orbitally induced climate shifts over past millennia may help researchers understand how climate oscillations like El Niño will respond to future greenhouse warming.

The latest record of ancient El Niño is a byproduct of century-old geologic work. Early in the 20th century, Swedish geologist Ernst Antevs came to New England to sort out the history of the great ice sheets' retreat northward beginning 20,000 years ago. He could map the sequence of sediments washed out from the ice sheet and deposited in a lake at the ice's edge, such as Glacial Lake Hitchcock, which filled the Connecticut River valley. But to trace the ice front's position in New York, Vermont, and New Hampshire at a given time, Antevs needed a time marker that would tell him that sediments in widely separated lakes were laid down at the same time.

Antevs found such a marker in the seasonal layering of glacial sediments. Each summer, glacial meltwater would carry a heavy load of coarse sediment into the lakes, and each winter, meltwater would slow to a trickle, laying down a thinner, finer-grained layer. Antevs could tell which year, out of thousands, one of these layers, or varves, was laid down, because climate changes from year to year and decade to decade influenced the amount of melting across New England, creating unique patterns of varve thickness.

Geologists Tammy M. Rittenour and Julie Brigham-Grette of the University of Massachusetts, Amherst, which sits on Glacial Lake Hitchcock deposits, shared Antevs's interest in glacial history.
They took a digitized version of his thickness records of more than 4000 summer-winter varve pairs published in the 1920s, filled a few gaps by adding some new thickness measurements (including some from a core they drilled on campus), and verified a 4000-year varve chronology spanning the glacial retreat from 17,500 to 13,500 years ago.

But the varve record offered tantalizing hints about climate as well. Rittenour (who is now at the University of Nebraska, Lincoln) and Brigham-Grette teamed with statistical climatologist Michael Mann of the University of Virginia in Charlottesville to extract a history of climate variability over that same 4000-year period. The thicker a varve, the warmer the weather in New England that year. They found that New England climate varied with the same 2.5- to 5-year beat that El Niño currently follows in the Pacific. Apparently, then as now, El Niño's influence reached into New England.

That came as a bit of a surprise, because a 12,000-year climate record retrieved from an Ecuadorian lake, close by the tropical Pacific, shows no sign of El Niño's torrential rains in that period (Science, 22 January 1999, p. 467). But the New England varves hold a clue to reconciling the two records. The Ecuadorian record may contain only the strongest El Niños, the ones whose heavy rains could wash sediment into the lake, while the New England record may contain both large and small El Niños. In New England, the apparently larger El Niños steadily faded until they were more or less gone 13,500 years ago. If those were indeed the larger ones, their absence left the Ecuadorian record with nothing big enough to record from the end of the ice age until strong El Niños returned in the past 5000 years.

Climate modelers using two very different paleoclimate models think they see how El Niños might have waxed and waned this way over the millennia.
Amy Clement and her colleagues at the Lamont-Doherty Earth Observatory in Palisades, New York, used a simple model of the tropical Pacific, and Zhengyu Liu of the University of Wisconsin, Madison, and his colleagues used a complex global model. Both groups find that Earth's wobbling could be responsible. Today, Earth's spin axis is tilted so that the planet is closest to the sun in January, during Northern Hemisphere winter, giving the planet a little extra solar heating then. But Earth wobbles like a top, taking 21,000 years for one wobble, so that 11,000 years ago its spin axis pointed in the opposite direction, putting Earth closest to the sun in Northern Hemisphere summer. In the models, that extra summer heat is enough to strengthen winds over the tropical Pacific at a time of the year critical to initiating a new El Niño, discouraging it from forming, says Clement.

That Earth's orbital variations may be fiddling with El Niño's vigor has climate researchers excited. “This is a real opportunity to look at El Niño under other climatic base states,” says paleoclimate modeler John Kutzbach, a collaborator of Liu's at Wisconsin. By seeing how El Niño reacts when prodded by changes in the rest of the climate system, he notes, researchers should get a better idea of how El Niño will behave in the future. The recent results “say, yes, if we have a different base state, we get a different El Niño,” he says. “And the greenhouse will obviously be a different base state.”

9. HUMAN GENOME

# Mapping a Subtext in Our Genetic Book

1. Michael Hagmann

Paris: As scientists race to finish a rough draft of the human genome, a European consortium is about to launch an effort to pinpoint every key spot in our genetic code where cells turn genes on and off by adding a methyl group to cytosine, one of the four nucleotides that make up DNA.
Variations in methylation between healthy and ailing tissues “might give us a better understanding of what goes wrong” in some diseases, says Alexander Olek of Epigenomics, a biotech start-up in Berlin. The consortium announced at the Genomes 2000 meeting here last month that it has won initial funding for the project, which aims to grind out methylation maps for every kind of tissue.

Since the 1960s, researchers have accumulated more and more evidence suggesting that methylation plays many roles, including silencing the extra X chromosome in females. And methylation gone awry is linked to cancer and rare inherited diseases such as ICF syndrome, an immune disorder distinguished by facial abnormalities such as low-set ears. The mysterious instructions on where and when a cell ought to add methyls are somehow transmitted to daughter cells during cell division. Many scientists believe the resulting “epigenetic” variations are “a way of regulating genes in a hereditary manner without changing the genetic code,” says Olek. “There are hundreds of thousands of methylation signals in the human genome, and the methylation pattern pretty much reflects the pattern of gene [activity].”

For now these patterns are as inscrutable as hieroglyphics were before the discovery of the Rosetta stone. “We would like to find out what different methylation patterns mean,” says Stephan Beck of the Sanger Centre in Cambridge, U.K. Sanger—along with Epigenomics, the Max Planck Institute for Molecular Genetics in Berlin, the French National Genotyping Center in Paris, and others—formed the Human Epigenome Consortium (HEC) last December to undertake the twin Herculean tasks of compiling methylation patterns for every tissue and raising an undetermined wad of cash to get going. To provide insights into biological processes, Olek estimates that HEC must identify about 400,000 cytosines that undergo methylation.
HEC will use several tools to hunt down these cytosines, including a DNA chip developed by Epigenomics that maps methylation patterns. The effort will be a slog: Multiply the number of target cytosines by roughly 100 tissue types, Olek says, and “you've got almost the same amount of work as the entire human genome.”

HEC met its first milestone last month, when the European Union gave it $1.1 million for a pilot project in a small region—about 4 million base pairs, or 1/1000 of the human genome—containing the genes for the major histocompatibility complex (MHC), proteins that present snippets of pathogens to immune cells. Researchers will look at MHC regions in about 20 tissues, mainly various classes of immune cells, and they will compare methylation patterns of inactive cells with ones riled up by pathogens or by autoimmune disease.

Although HEC will make its pilot data freely available to the scientific community—along with methylation patterns in healthy tissues once the full project gets under way—Epigenomics plans to set up a proprietary database on differences between healthy and diseased tissues to flag, for example, patterns associated with tumors. “If they can develop assays for the early detection of tumors, that's going to be key for cancer treatment,” says Adrian Bird, a molecular geneticist at the University of Edinburgh.

Some experts are less enthusiastic. “Whether it will be as revelatory as assumed remains to be seen,” says Aravinda Chakravarti, a human geneticist at Case Western Reserve University in Cleveland. He thinks the project should also probe other regulatory processes, such as how DNA is packaged into chromatin. “This might tell us how valuable a methylation map really is.”

Others say the time is ripe to bring the heavy guns of sequencing to bear on methylation. “For much too long people have ignored the fact that DNA consists of five bases instead of four,” says Beck. Olek hopes the fifth base will entice other major sequencing centers to join the HEC. “They'll have to find other projects to keep them busy after the human genome is finished,” he says.

10. ASTRONOMY

# Astronomers Detect More Missing Matter

Forget about dark matter, the mysterious invisible stuff that scientists say holds the galaxies together. The plain old ordinary matter that makes up stars, planets, and you and me presents puzzles of its own. The early universe contained up to 10 times more of it than would fit into all of today's galaxies, and no one knows where it all went. Now new data from the Hubble Space Telescope suggest that much of it may be hiding out in the open, in enormous clouds of ionized gas stretching between galaxies.

Astrophysicist Todd Tripp and astronomer Edward Jenkins of Princeton University in Princeton, New Jersey, and astrophysicist Blair Savage of the University of Wisconsin, Madison, found a clue to the missing matter's whereabouts when they pointed the Hubble at a particularly bright, young quasar and dissected its light. In the ultraviolet region of the quasar's spectrum, the researchers found pairs of absorption lines, wavelengths at which the intensity of the light dipped. The lines indicated that oxygen VI, oxygen stripped of five of its eight electrons, lay between the quasar and Earth, soaking up some of the light. And where there's oxygen, there's sure to be hydrogen, as hydrogen is by far the most abundant element in creation. So, as the researchers report in the 1 May issue of Astrophysical Journal Letters, they concluded that intergalactic space is filled with clouds of invisible ionized hydrogen. “This brings us toward a census of all the matter in the universe,” says Jane Charlton, an astronomer at Pennsylvania State University, University Park.

Such a census is necessary because most of the ordinary matter in the universe has gone missing over the past 10 billion or 12 billion years. When astronomers peer at very distant quasars, they see a forest of absorption lines due to hydrogen atoms. The lines indicate that, when the universe was just a couple of billion years old, it was filled with enormous clouds of hydrogen. But when they look at younger, not-quite-so-distant quasars (say, 5 billion years old), researchers find far fewer lines. Apparently, a few billion years ago, as galaxies swarmed into clusters and the universe generally grew lumpier, most of the hydrogen atoms disappeared.

Or maybe they just became invisible. To absorb light, a hydrogen atom must be whole; a raw nucleus, stripped of its electron, can't do the trick. In the original hydrogen clouds, perhaps one atom in a million retained its intact, light-hungry form. According to a theory developed over the past decade, the clouds collapsed into a network of denser filaments. During the collapse, the gas heated to temperatures between 100,000 and 10 million kelvin—hot enough to pop the electrons off the remaining neutral hydrogen atoms, but cool enough to prevent ricocheting hydrogen nuclei from giving off copious x-rays. “Most of the matter today, if it's in this form, is very difficult to observe,” says Renyue Cen, a theoretical astrophysicist at Princeton University, who predicts that half of all ordinary matter exists as such intergalactic “warm hot gas.” If that is true, then like a set of misplaced keys, the missing hydrogen may be hiding more or less where it's been all along.

The new Hubble observation supports the theory and suggests that ionized gas clouds may contain as much matter as all the galaxies combined. But Tripp and colleagues say it's too early to tell precisely how much matter the clouds account for. The oxygen VI measurement reveals only the gas at the lower end of the expected temperature range. Moreover, the researchers must estimate the ratio of oxygen to hydrogen. “By far that's the biggest uncertainty here,” Savage says. “And figuring that out is not going to be easy.”

11. PALEOANTHROPOLOGY

# A Glimpse of Humans' First Journey Out of Africa

1. Michael Balter,
2. Ann Gibbons

New skulls, dates, and simple stone tools from Georgia reveal traces of people who may have been the first humans to travel outside Africa—a stunning 1.7 million years ago

Tautavel, France: One day last May, an archaeology student named Gotcha Kiladze was showing an archaeological site in Dmanisi, Georgia, to a group of visiting schoolchildren. Dmanisi, 85 kilometers southwest of the Georgian capital of Tbilisi, had already yielded a few ancient human bones and stone tools, but outside researchers had been skeptical about their age and which species they represented. Then on this spring day shortly before the season's excavations were scheduled to begin, the team's luck changed. It had rained heavily in previous days, and suddenly Kiladze spotted the glint of a bone poking out of the sediments. After careful excavation, and to the joy of the team, it turned out to be the skull of an early hominid, apparently a young adult male. Two months later, another student unearthed a second skull, this time what appeared to be a female adolescent, including part of the face and the upper jaw with four teeth.

Kiladze and the rest of the multinational team are now basking in the glow of these discoveries, which are presented on page 1019 of this issue and were also shown off last month at a meeting convened here to explore new evidence about the first Europeans.* The meeting, organized by French prehistorian Henry de Lumley, was something of a coming-out celebration for the Dmanisi fossils, with talk after talk on the skulls, the accompanying stone tools, and their ages. The talks and paper present a startling conclusion: that the Dmanisi humans were at least 1.7 million years old—the most ancient undisputed human fossils outside Africa.

Anthropologists say that the skulls, high-quality casts of which were on display during the meeting, closely resemble hominid crania of about the same age from East Africa. Although researchers have claimed that humans reached the island of Java in Indonesia by this time, the Dmanisi people appear to represent the first known wave of humans who left Africa and then colonized much of the rest of the globe. They are the first hominids with clearly “African” features found outside that continent—and they used only a simple stone tool kit, called Oldowan tools, to accomplish their journey. “They look African,” says archaeologist Ofer Bar-Yosef of Harvard University, who has visited Dmanisi several times. “I would give [Dmanisi] the credence of being the oldest known site in Eurasia with Oldowan stone tools.”

By today's definitions, the site is indeed in what is loosely called “Eurasia,” although many European researchers are eager to claim the Dmanisi skulls for their own continent. “This discovery has doubled again the age of humans in Europe, or at least at the gates of Europe,” human anatomist Giacomo Giacobini of the University of Turin in Italy told Science. Enthused University of Rome paleoanthropologist Giorgio Manzi during a panel discussion: “This is the missing link between Africa, Europe, and Asia!”

Yet it is not at all certain that these early immigrants out of Africa were directly ancestral to the humans found at key European sites such as Atapuerca in Spain, Ceprano in Italy, or Tautavel (see map). The fossil humans at all those sites are at least a million years younger than those at Dmanisi. Rather, many researchers agree that, as the Georgian team proposed in their talks, the people represented by the Dmanisi specimens may have been the forebears of the first humans in Asia, who are thought to have appeared between 1.8 million and about 1 million years ago, depending on which sites are accepted, and who are usually called Homo erectus. “Geographically this makes a lot of sense,” says paleoanthropologist Peter Andrews of the Natural History Museum in London, who visited Dmanisi last year and saw the skulls. “These are the first hominids of the genus Homo outside of Africa, and they are already a third of the way to China.”

## A question of age

Despite the fanfare the Dmanisi skulls are receiving, for the team that discovered them it is really an old story. In 1991, prehistorians from the Georgia State Museum in Tbilisi, the Georgian National Academy of Sciences, and the Roman-German Central Museum in Mainz, Germany, launched a new round of excavations at the Dmanisi site, which during the 1980s had yielded important remains of ancient plants and animals. At the end of the first season, they discovered a hominid lower jaw in a layer of lake sediments, just above a deposit of volcanic rock dated to at least 1.8 million years ago (Science, 24 January 1992, p. 401). Yet a number of researchers questioned the date, arguing that the jawbone's shape suggested a younger age and that the bone might have been deposited much later than the volcanic layer. Those questions persisted after a human foot bone was found in the same area in 1997.

But in the paper in this issue the team presents new analyses that support an early date. And at Tautavel, Georgia State Museum paleontologist David Lordkipanidze presented their arguments—and appeared to convince most of his audience of the fossils' antiquity. The skulls, jawbone, and foot bone were all found just above the volcanic layer, which has now been dated by three different teams. The first rounds of dating, using the decay of potassium-40 to argon-40, came up with dates ranging from 1.8 million to 2.0 million years ago. The newest round uses a related but even more accurate technique called argon-argon dating and was performed by respected geochronologist Carl Swisher of the Berkeley Geochronology Center in California. Swisher's results put the lower volcanic level at about 1.85 million years ago.

As for the key date of the stratigraphic layer in which the human bones were found, the team argues that it is fixed by new paleomagnetic measurements, which take advantage of known changes in Earth's magnetic polarity, as well as by associated fossils of other species. The team argues that the geology of the site and the bones of other animals show that the hominid-bearing layer was deposited close to a boundary between normal and reverse polarity, known as the Olduvai-Matuyama boundary and well dated elsewhere to 1.77 million years ago. For example, the fossil layer includes teeth from key species of the rodent genus Mimomys, known to have lived only between 1.6 million and 2 million years ago, according to unpublished work by paleontologists Jorge Agusti of the Paleontological Institute in Sabadell, Spain, and Aleksander Mouskhelishvili of the Georgia State Museum. “The fauna actually points older—the fauna says Olduvai. And the magnetics say Matuyama, so it's got to be right in between,” says Swisher. “I feel confident that we know where the boundary is. I think we're pretty safe and tight.”

Christophe Falguères, a geochronologist at the National Museum of Natural History in Paris, told Science that the team has “a reasonable argument” in its claim that the skulls are that early. The problem, he said, “is to know how much time has passed” between the formation of the radiometrically dated volcanic layer and the deposition of the fossil-bearing sediments. But Andrews, who says he examined the volcanic layer carefully during his visit to Dmanisi, told Science that what he saw was consistent with the claim that the fossil-rich layer was deposited shortly after the underlying volcanic rock had formed. “To me [the volcanic layer] looks extremely fresh,” he says, “very ragged and uneven, many sharp edges,” suggesting that the rock had not had time to weather before the next layer of sediment was deposited. Taking the dating evidence together, “there is no doubt that humans were living at Dmanisi at least 1.7 million years ago,” de Lumley concludes.

Any lingering skepticism about the dates tends to be dispelled by the primitive size and shape of the bones, say those familiar with the new fossils. The skull considered to be an adult male has a braincase of about 780 cubic centimeters (cc), as Leo Gabunia of Georgia's National Academy of Sciences, lead author on the Science paper, told a rapt audience at Tautavel; modern humans pack about 1500 cc of brain capacity. The skull of what may be an adolescent female measures about 650 cc. In addition to their small size, the skulls have a number of other features—including high temporal lines, prominent brow ridges, and a marked constriction of skull width behind these ridges—which Gabunia and co-authors argue resemble an early species of human called H. ergaster, which lived in Africa between 1.9 million and 1.4 million years ago. (Typical of the lively debate in this field, some researchers do not recognize H. ergaster and refer to these specimens as simply early H. erectus.) The team has provisionally classified the Dmanisi skulls as H. ergaster—the first known representative of this species outside Africa.

After seeing Gabunia's talk or reviewing the new paper, most prehistorians who spoke to Science agreed that the Dmanisi fossils closely resemble the H. ergaster specimens, including the so-called Nariokotome Boy, an almost complete skeleton of an adolescent male dated to about 1.6 million years ago. For example, paleoanthropologist Alan Walker of Pennsylvania State University, University Park, held a cast of Nariokotome Boy's lower jaw next to the Dmanisi jawbone found in 1991 and concluded that “they look just the same.” Andrews says that if the skulls had been found “in an early … African setting, no one would have been surprised.” Even many who have seen only photographs are convinced. “They are astonishing!” says Dan Lieberman, a paleoanthropologist at George Washington University in Washington, D.C. “They could be Nariokotome Boy's brother.”

But a somewhat more reserved view is taken by Günter Bräuer, a paleoanthropologist at the University of Hamburg in Germany, who had earlier questioned the age of the Dmanisi jawbone. In contrast to Walker, he argued that that specimen more closely resembled later H. erectus fossils than early African H. ergaster. But given the new dating analyses, Bräuer now says that he sees “no reason to doubt” the older date, although he suggests that it should be confirmed by direct dating of a tooth from one of the skulls with another method. He adds that the differences he perceives in the fossils suggest considerable genetic variation in these traits at this time.

If the dating is confirmed, it makes the Dmanisi fossils a rare prize. The skulls are far older than any European fossil—the next oldest hominid from the continent is a crushed skull from Ceprano, Italy, dated at 800,000 to 900,000 years ago. In Asia, a jaw fragment and two worn molars from Longgupo Cave, China, have been dated to 1.9 million years ago, but many researchers aren't sure that the bones are really human (Science, 17 November 1995, p. 1116). And Swisher reported argon-argon dates in 1994 that put Java H. erectus fossils at 1.8 million years ago (Science, 25 February 1994, p. 1118). Those dates have recently come under attack, however, both at the meeting and in a commentary by two Dutch researchers in the April issue of the Journal of Human Evolution (JHE). The researchers argue that although Swisher probably dated an underlying volcanic layer correctly, the fossils may come from a higher level no more than about 1 million years old. Swisher, however, defends his dates at the Indonesian site of Sangiran as at least as solid as those at Dmanisi, noting that the authors of the JHE article never visited the site. He adds that as-yet-unpublished dates support the accuracy of the geological framework at Sangiran, putting fossils from various layers at ages ranging from 1.8 million years to 1 million years ago.

## Simple tools, long journeys

Whatever the resolution on the Java dates, the next oldest undisputed traces of early humans outside Africa are primitive stone tools at ‘Ubeidiya, Israel, dated to as early as 1.5 million years ago. These tools represent an early stage of the relatively advanced Acheulean technology, which includes “handaxes” and other symmetrical tools carefully crafted according to a preconceived plan. Similar tools began to show up in Africa about 1.6 million years ago, and some researchers proposed that the invention of the tools spurred colonization out of Africa at that time.

But the Dmanisi results may shatter that theory. Not only are the Georgian fossils older than the earliest known Acheulean tools in Africa, but the Dmanisi people had only simple stone tools, as archaeologists Antje Justus of the Roman-German Central Museum and Medea Nioradze of the Georgia State Museum reported at Tautavel. The Dmanisi tools resemble the Oldowan tools found in East Africa's Olduvai Gorge as early as 1.8 million years ago—simple stone flakes, scrapers, and various “chopping tools.” If the Dmanisi people managed to travel thousands of miles with these simple tools, “this means that the Oldowan adaptation was more complex than people thought,” comments Milford Wolpoff, a paleoanthropologist at the University of Michigan, Ann Arbor. “The overwhelming discovery is that there are Oldowan-using people colonizing outside of Africa.” Adds paleoanthropologist Susan Antón of the University of Florida, Gainesville, a co-author on the paper: “If we are right about when hominids left Africa, it is biology, not new tools, that prompted their dispersal. As soon as you get larger body sizes and brains, you see shifts in what they eat and how far they range that ultimately lead them out of Africa.”

Indeed, back in 1989 Walker and biological anthropologist Pat Shipman of Pennsylvania State University, University Park, proposed that once australopithecines evolved into the genus Homo and became more carnivorous, they would have expanded their range, because carnivores need large territories. But in 1989, there was no sure evidence of such an early expansion. Now there is. “As soon as Homo erectus evolves in Africa, they're out,” notes Walker, who prefers this species name to H. ergaster. Ian Tattersall, a paleoanthropologist at the American Museum of Natural History in New York City, agrees: “The story is looking strong that up until 1.9 million years ago the only hominids were archaic [types] restricted to the forest fringes [of Africa]. Somehow, a modern body was acquired, then all hell broke loose and this strutting biped … was mobile enough to set out of the forest fringes and walk out of Africa.”

Although the evidence from Dmanisi supports the idea that hominids got out of Africa fast and early—and that H. ergaster was the first traveler—exactly where they went next and how they eventually got to Europe remains a mystery. Leaving the contested Java and Longgupo sites aside, the oldest accepted dates for humans in Asia are roughly 1.1 million years for hominid fossils found at Gongwanling in China (see map), says de Lumley. At the meeting, Gabunia proposed that the Dmanisi people went east and eventually gave rise to Asian H. erectus. Several researchers at the meeting found that quite plausible, given the dates and features of the fossils, but others are skeptical. “No one knows who were the ancestors of Asian H. erectus,” says Harvard's Bar-Yosef. “There was more than one wave out of Africa,” and so there are many possibilities for H. erectus's ancestors.

When it comes to Europe, the evidence is even more puzzling, for there is an almost 1-million-year gap between Dmanisi and any European site. And the much younger stone tools at European sites such as Gran Dolina at Atapuerca, dated to about 780,000 years ago, are still the simple Oldowan type, as are those at most sites in Asia. The Acheulean technology is rarely seen outside Africa until about 500,000 years ago, yet it's hard to fathom why Acheulean toolmakers would be less mobile than Oldowan toolmakers.

One explanation, championed by paleoanthropologist Nicholas Rolland of the University of British Columbia in Victoria and raised again at the Tautavel meeting by archaeologist Sarah Milliken of Oxford University, is the “long journey” hypothesis: that early humans first ventured out of Africa via the Middle East with Oldowan tools close to 2 million years ago, but were prevented by climate and geography from turning west. Instead they dispersed east, taking a southern route to China and even Java. Then they turned north and started west again across more central reaches of Asia and Europe. In this view, the Oldowan-like tools at Atapuerca and Ceprano may have been left by the descendants of these early wanderers.

Whether this scenario holds water will depend on future research. “We just don't have enough sites and enough evidence yet,” says Oxford University archaeologist Derek Roe. He predicts that “within the next 5 years there will be more dramatic discoveries of the magnitude of Dmanisi.” Swisher adds that researchers have just begun to mine Dmanisi's rich fossil beds. But whatever routes early humans took in their colonization of the globe, the evidence already in from Dmanisi makes it clear that they got an early start out of Africa, arriving in Georgia not long after their first African appearance. “Once emancipated from the forest,” says Tattersall, “you find these guys moving very, very fast.”

• * The First Inhabitants of Europe, Tautavel, France, 10 to 15 April 2000.

12. BIOMEDICINE

# Gene Therapy on Trial

1. Eliot Marshall

A flurry of reports and congressional hearings, sparked by the death of a volunteer in a study at Penn, is due in the next few weeks. The Penn episode points up a central problem: The field still lacks an ideal vector

Dusty Miller, a veteran gene therapy researcher, wants to test a new idea for treating cystic fibrosis. He has engineered a strain of virus to create a new “vector” to inject useful genes into cells. He has tested it in his lab at the Fred Hutchinson Cancer Research Center in Seattle, getting “wonderful” results in mice. Although he can't guarantee that it's safe for human use, he's confident that it is. Yet he's hesitating about testing it in patients, stretching out preliminary research while using an established but, he thinks, less efficient vector in volunteers. He's being supercautious, he says, because the “climate for gene therapy” has turned cold.

The chill set in on 17 September 1999. That's when Jesse Gelsinger, a young volunteer, died in a gene therapy trial at the University of Pennsylvania in Philadelphia, triggering a blitz of media and government attention. The Food and Drug Administration (FDA) has issued Penn a warning letter and shut down all clinical trials at Penn's Institute for Human Gene Therapy while it investigates what happened. The chill intensified last week when FDA made public a warning letter to cardiac specialist Jeffrey Isner of St. Elizabeth's Medical Center in Boston, alleging infractions of FDA rules in a gene therapy trial for heart disease in which one patient's cancer could have been exacerbated by the treatment and, FDA contends, a death was not properly reported. Isner's studies are now on hold. FDA also halted several other gene therapy trials around the country last winter while investigating vector toxicity.

And the climate is likely to become even more inhospitable over the next few weeks, when a blizzard of reports and hearings are expected. The Senate Health committee is planning a public hearing, its second on the Penn case. The House Commerce subcommittee on oversight and investigations has a probe under way; Penn, according to an official, has sent the committee “truckloads” of files, many of them demanded by Representative John Dingell (D-MI), the committee's fearsome inquisitor. The National Institutes of Health (NIH) has two groups looking into what happened. Penn is conducting two inquiries of its own: one led by its provost and another by an outside panel, due this week. In addition, Penn is concerned that Gelsinger's family may sue. Meanwhile, FDA and other agencies are scrutinizing gene therapy programs around the country. Miller, for example, says FDA inspectors have paid two surprise visits to his lab this year, demanding to see colleagues' lab notes.

Public attention in this round of reports and investigations is likely to focus on who's to blame for errors, whether patients were adequately informed of the risks, and whether the tangle of relationships among companies, investigators, and institutions has created unacceptable conflicts of interest in the field (see sidebar, p. 954). Many clinicians fear that support for gene therapy will buckle under the onslaught.

At the scientific level, what happened at Penn holds two important lessons that are likely to get swamped in the publicity over the next few weeks. The first is the story of the vector James Wilson, director of Penn's gene therapy institute, and his team used: a patented version of a common respiratory tract virus—adenovirus—that had been stripped of certain genes to make it more innocuous. Researchers had once pinned their hopes on adenovirus vectors, believing they would overcome a basic problem that has dogged gene therapy since its inception: the difficulty of getting genes into target cells and, once there, getting the genes to express their proteins. Now some investigators think that, because of their inherent problems, adenovirus vectors may be limited to narrow uses. The problem is, every vector that has been investigated also has limitations (see sidebar, p. 953).

The second lesson involves the nature of clinical research itself. Although it's a shock when a patient dies in a toxicity test, says a clinician who has supervised many such trials, it is not unusual. “If you were to look in [a big company's] files for testing small-molecule drugs,” he insists, “you'd find hundreds of deaths.” Often, warning signs become clear only in retrospect, and many clinicians believe that's what happened in the Penn trial. Hints of toxicity had cropped up in previous experiments done by Wilson and others, but the Penn team may have been misled in one crucial respect by animal data that did not translate to humans.

But others suggest that clinicians at Penn should have been more sensitive to the risks, especially because they were injecting a potentially toxic vector into relatively healthy volunteers. “There were many places where this should have been stopped,” says Huntington Willard, a molecular geneticist at Case Western Reserve University in Cleveland and a member of the American Society of Human Genetics board. Several leaders in the field have said that they knew that directly injecting the livers of volunteers with huge quantities of immunogenic viral particles (38 trillion at the highest dose) was risky. But they did not intervene, and the trial was given a green light by several local and federal agencies. Today, Willard sees “a very strong parallel” between a rush to the clinic in gene therapy and the space shuttle Challenger explosion. “It takes an event like that,” he says, to let people see “just how dangerous some of this stuff really was.” Willard concludes that “we need to take a much more sober view of where this field is going.”

## Design by committee

Regardless of what the critics think, says Arthur Caplan, director of Penn's Institute for Bioethics, people designed this gene therapy trial with the best intentions. He recalls how Mark Batshaw, a pediatrician at Penn in the early 1990s, now at Children's National Medical Center in Washington, D.C., wanted to save children born with a deadly liver problem. The disease occurs when a gene on the X chromosome is missing or defective, producing too little of a liver enzyme, ornithine transcarbamylase (OTC), that's needed to remove ammonia from the blood. Many infants become comatose at birth and die. Some with mild deficiencies—like Jesse Gelsinger—can survive if they keep to a strict diet and take compounds that help eliminate ammonia. But there's no substitute for natural OTC. And even mild deficiencies can be deadly. Gelsinger, for example, neglected his OTC regimen and nearly died in 1998. Caplan says Batshaw “was the pivotal guy” in Penn's OTC gene research: “He was tired of burying babies.”

Batshaw, Wilson, and a surgeon at Penn named Steven Raper, the principal investigator, devised a plan in 1994-95 to transfer healthy OTC genes into people who lack them. (Through a Penn spokesperson, Raper and Wilson declined to comment.) The objective, according to the protocol, was to develop “a safe recombinant adenovirus” that could infect the livers of patients and release OTC. Wilson's institute at Penn and the private company he founded had additional goals: to develop vectors for treating liver diseases and other illnesses.

The improved adenovirus vector developed at Penn seemed like a “wonder vector” back in 1995, Miller recalls. It was easy to grow, versatile, capable of infecting both dividing and nondividing cells, targeted the liver (as everyone assumed), and was quick to express genes in tissue. This vector was the right tool, Batshaw still argues: “Adenovirus is the only one that works rapidly enough, even now.” He explains that whereas most other vectors take 3 to 6 weeks to begin working, adenovirus vector starts to express genes within 24 hours. This could be crucial for treating newborns with severe cases of the disease. You need quick action, he says, “if you're trying to get kids out of hyperammonemic coma” and prevent death or mental retardation. “Our plan was to use the adenovirus to get them out of coma; that would last for a few months,” then go to second-stage gene therapy with a different vector—one problem with this vector is that gene expression is of limited duration—or possibly to liver transplantation.

But the plan changed when ethicists looked at it. Caplan, who was recruited to Penn shortly after Wilson, argued that it would be preferable to begin with adult volunteers because the trial was designed only to test toxicity. Later, infants could be enrolled. The initial subjects would have no chance of benefiting, in part because adenovirus vector can be given only once. It sets up an immune response that usually causes the body to eliminate the vector if it is used again. This meant that no one who took part in this trial could hope to benefit from adenovirus gene therapy at a later time. Even in ordinary circumstances, Caplan says, obtaining parental consent for experiments on children is “a problem.” But it's especially tough “if you're trying to explain to parents in the middle of a crisis that you're only doing a safety study” that would not help a critically ill child. Caplan argued that it was “wrong to do nontherapeutic research on someone who cannot consent.”

Batshaw and other OTC experts then took part in a meeting of the National Urea Cycle Disorders Foundation, run by parents of OTC children, to talk about these issues. “At the end of a 2-hour conference,” Batshaw recalls, “they came to the same conclusion: It would be better to treat the adults.” NIH's Recombinant DNA Advisory Committee (RAC) agreed in a 1996 discussion.

The RAC review was just one of many ethics and safety reviews the trial had to clear before it could begin. The process brought about several small changes and one double reversal. Some members of the RAC thought the plan to inject adenovirus vector directly into a hepatic artery was too risky. But the majority gave consent, provided that the vector was put in a peripheral vein. Penn agreed to this change.

In 1997, safety reviewers at the FDA argued that it would be less risky to go directly into the liver. FDA at that time was worried that gene therapy experiments might alter human germ cells and pass risky genes to future generations. FDA's experts felt that channeling the virus vector into the hepatic artery would concentrate it in one lobe of the liver, limiting overall exposure. Everyone assumed that adenovirus had a strong affinity for human liver cells and would be quickly concentrated in them. The Penn team agreed to go back to its original plan of inserting the vector directly into the hepatic artery. But Wilson neglected to inform the RAC that his team was taking FDA's advice. Wilson apologized to the RAC in December 1999.

Routine toxicology studies in mice, rhesus monkeys, and baboons were reassuring, Penn concluded, although they indicated toxicity at high doses. For example, early versions of the adenovirus vector plus OTC gene damaged the liver of rhesus monkeys, and monkeys given the highest doses died. The improved vector to be used in the clinical trial—from which a different viral gene was removed—appeared to be less toxic, although baboons still showed liver inflammation at high doses. The Penn team proposed using a maximum dose in humans that would be about 5% of the dose that produced maximal toxicity in nonhuman primates. And they proposed climbing toward that level in five threefold increases, with each step involving three patients.

Satisfied with Penn's plan and responses to queries, FDA gave the trial a green light in 1997. The first of 18 volunteers, a woman, was given a 2-hour infusion of vector with OTC genes on 7 April 1997. Most patients experienced fever and other moderate symptoms. The 10th and 12th patients exhibited signs of liver stress, with liver enzymes in serum higher than the normal upper limit (8 and 5.3 times higher, respectively). FDA later reprimanded Wilson's team for failing to pause and consult FDA by phone at this point. The trial proceeded “like a train,” says one outside clinician, until it was halted abruptly on 17 September 1999 when Gelsinger, the 18th patient, died.

## Surprising toxicity

After Gelsinger's death, Wilson led scores of researchers in a months-long search for a cause. As possibilities were eliminated, the Penn clinicians were left with one conclusion: Gelsinger died from a massive immune response to the adenovirus vector itself.

The “most unexpected finding” in the postmortem, Raper said at a RAC meeting in December 1999, was that precursors for red blood cells in Gelsinger's bone marrow had been wiped out. The Penn team concluded that this probably did not happen in the short 4-day period of gene therapy. Raper and Wilson speculated at the RAC meeting that a preexisting parvovirus infection might have done the damage. In addition, Batshaw notes, it's possible Gelsinger had inherited a mutation that caused an exaggerated response to adenovirus. But no evidence for either theory has been found. The blood cell problem remains unexplained but appears not to have been the cause of death.

Wilson and Raper also noted that Gelsinger's blood contained high and sustained levels of interleukin-6 (IL-6), a cell signaling protein (cytokine). Even now, researchers don't understand why it was so high, but they do know that IL-6 often surges after an insult to the body, contributing to inflammation. Raper called it “an immune revolt.” A systemic inflammation flooded Gelsinger's lungs with fluid, causing acute respiratory failure and death.

Vector designers have long known that adenovirus triggers an immune response, but for gene therapy trials, they have taken out some of its genes in an attempt to reduce its immunogenicity. Wilson and Ronald Crystal at the New York Hospital-Cornell Medical School in New York City, among others, have patented forms of adenovirus with bits of the genome removed. For the OTC trial, Wilson used a 1996 version of the vector with two key genes deleted.

Some researchers—such as Art Beaudet of Baylor College of Medicine in Houston and Inder Verma of the Salk Institute in La Jolla, California—say there were warning signs that vectors containing any active adenovirus genes were risky and could cause inflammation. The most dramatic early sign came in a 1993 gene therapy trial conducted by Crystal. He was using an early adenovirus vector to inject healthy genes into the lungs of cystic fibrosis patients. During the experiment, a subject known as “patient number three” developed a severe inflammatory reaction, including a rapid increase in IL-6. Crystal reported later that he saw patients' IL-6 levels rise in serum “within 2 to 4 hours after vector administration,” and that the peak IL-6 “correlated well” with vector dose. Crystal felt that the inflammation had not been caused by adenovirus itself but by the large volume of fluid used to deliver it. Animal studies had not warned of this possibility, he wrote: It “was a surprise.”

Some see a parallel with Gelsinger's reaction: “The patient had high IL-6 levels in plasma, the whole syndrome, including a single-lobe ARDS [adult respiratory distress syndrome],” the proximate cause of Gelsinger's death, says one clinician. The patterns, he says, are “similar.” Beaudet, who saw a baboon die of adenovirus toxicity in a preclinical study, also sees a similarity.

But Crystal does not think the 1993 and 1999 cases are comparable. In a RAC meeting last December, he said the inflammation in 1993 was the only serious adverse event attributable to adenovirus in his team's “140 administrations of vector.” It occurred “when we were using a larger volume to administer the vector to the bronchi” and a primitive vector containing more viral genes.

Crystal wasn't the only one, however, to report an inflammatory response. Among others, Richard Boucher of the University of North Carolina, Chapel Hill, also ran into the problem in 1994-95 while treating cystic fibrosis patients. He abruptly stopped the trial. “We had two concerns,” Boucher recalls. One was that adenovirus “just didn't work,” because “it didn't get in” to the targeted cells. And second, “if you pushed [the dose] you got into troubles from flat-out protein load.” The North Carolina group followed up with animal studies and concluded that adenovirus vector was stimulating nerve fibers in the epithelium and triggering an inflammatory response. Boucher concluded in 1995 that “it was a capsid protein problem”—a reaction caused by the virus's outer shell—and sent his findings to FDA and published them.

An expert who followed these results, speaking on background, says: “In retrospect, we really should have learned more” from Crystal's experience. “We knew this stuff was toxic back in 1993” for use in the lung. “Why did we think that a damaged liver would be any different?” But, although cytokine release may seem important now, this expert still doesn't think it points to a “clear answer.” It only suggests that, “in some people, you get a whopping cytokine response.” Robert Warren, an oncologist at the University of California, San Francisco, pointed out at the December RAC meeting that he gave 25 cancer patients adenovirus vector doses nearly as large as the one given to Gelsinger, “and we have not seen anything close to this problem.” However, several patients did have other serious adverse reactions, including loss of blood pressure.

The Penn team was taken aback by the lung inflammation, but in view of that reaction, it was astonished to see little liver damage. Relying on mouse studies, they had expected to see adenovirus concentrated in the liver. Instead, as a postmortem revealed, the vector was everywhere. To figure out what happened, Wilson gave the vector intravenously to mice. Tagged adenovirus vector first appeared in macrophage or scavenger cells in the liver, called Kupffer cells (which secrete IL-6). Later, it reached the intended target, primary liver cells (hepatocytes). This “may not be a good thing,” Wilson said at the RAC meeting in December, because low doses of vector might not put enough OTC genes into hepatocytes, and high doses might saturate nontarget organs. This might explain the low gene transfer rate (less than 1%).

Animal data may have given clinicians false hope that adenovirus would work well in the human liver. A key docking site adenovirus uses to enter a cell, known as the Coxsackie adenovirus receptor (CAR), is much more abundant in mouse livers than in human livers. In fact, “rodent models might be misleading” for gene therapy, says Jeffrey Bergelson of the Children's Hospital of Philadelphia. Again, the warning signs were there before Gelsinger entered the Penn experiment but may have become obvious only in retrospect. Bergelson published a paper in 1998, a year after the trial began, reporting that he found “barely detectable” signs of CAR in human liver, while signs of CAR were “off the wall” in mouse liver. One implication, Bergelson notes, is that clinicians relying on the mouse model may find it necessary “to give higher and higher doses” to deliver genes to the human liver.

## A mortal blow for adenovirus?

Expert opinion is divided on whether the tragic events at Penn should spell the end of the once-promising adenovirus vector for treating genetic diseases. The key question is whether the virus can be re-engineered to eliminate the immune response.

Researchers have been trying for more than a decade to create a tamer adenovirus. The virus is shaped like an icosahedral box studded with “penton” bases that support long fibers—described by FDA gene therapy specialist Philip Noguchi as “a cannonball with spikes.” The box, or capsid, shields the genome. Modifications such as those used by Wilson and Crystal have focused on editing out key bits of DNA inside the capsid that are expressed early during infection of a cell, genes labeled E1 through E4, which trigger immune reactions. The goal is to make the vector as stealthy as possible. The fewer viral proteins the immune system “sees,” the less likely it will attack. And the longer the vector survives, the better its chances of delivering therapeutic genes.

For the OTC trial, Wilson used a version with E1 and E4 genes deleted. In his cystic fibrosis trials, Crystal has used a version with E1 and E3 deleted, which he claims can even be given safely in repeat doses. Since switching to an inhaled spray containing this new vector, Crystal says, “we have had no significant serious toxicities.”

Some scientists have also attempted to create fully “gutless” vectors by hollowing out all viral genes and replacing them with substitutes. They include Jeffrey Chamberlain at the University of Michigan, Ann Arbor, Beaudet and Larry Chan at Baylor, and a group at Merck in Whitehouse Station, New Jersey, under former executive Thomas Caskey. Beaudet and Caskey say researchers in their labs have observed virtually no toxicity when their gutless vector is given to mice at high doses. However, it is hard to eliminate contamination by live “helper” virus and to produce high-concentration batches.

High doses may still be required to produce a clinical benefit, and, as Boucher suggested in 1995, high doses may run into toxicity from capsid proteins. Wilson suggested as much in the December RAC meeting, and Noguchi and FDA toxicologist Anne Pilaro have raised this possibility in several meetings. So has Salk's Verma, who co-authored a 1998 study of adenovirus vector that called for a “reevaluation” of its use in long-term gene therapy.

Recently, FDA staffers heard from another scientist who concluded 5 years ago that adenovirus capsid protein toxicity was a problem: Prem Seth, senior scientist at the Human Gene Therapy Research Institute in Des Moines, Iowa. Based on studies he did in the mid-1990s, he concluded that “empty capsids appear to be immunogenic, like intact virus,” and produce similar effects, like cytokine release. He never published the data, because “there wasn't much interest.”

This analysis suggests that even gutless vectors may be dangerous in some circumstances, but the jury is not in. “It's still debatable,” says Chamberlain. Beaudet agrees: “Based on our published mouse data,” he says, “we think the capsid proteins are not a big problem.” But he concedes that there are “not convincing data yet” from nonhuman primates to settle the issue.

As far as Noguchi is concerned, “the most critical issue for the field right now” is determining the risk of these new, “safe” vectors. “Are there two types of toxicity with adenovirus or just one?” he asks. Is the shell itself a problem, in addition to viral gene expression? “What is its inherent toxicity? Is this the dose-limiting thing? We need to rethink these hard questions.”

For many people in the field, however, the critical question over the next few months is whether they will be able to continue gene therapy trials while everyone rethinks these questions.

13. BIOMEDICINE

# Improving Gene Therapy's Tool Kit

1. Eliot Marshall

More than 4000 patients have been enrolled in gene transfer experiments over the last decade, but until now the research has produced few unambiguous results. Last month, a French research team announced the first clear success. Marina Cavazzana-Calvo and Alain Fischer of the Necker Hospital for Children in Paris reported that they had put a healthy gene into the bone marrow of two children with a rare, lethal immune disorder (SCID-X1), enabling the children to leave a protective bubble for the first time (Science, 28 April, p. 669). It was welcome news, adding substance to the promise that gene therapy will be used to cure genetic diseases. At the same time, however, it was a reminder of how difficult it has been to find ways of transferring genes into patients. The method used by the French team is one of the oldest in the tool kit, a “vector” based on a mouse virus. It is just one of many being developed, each with its own risks and advantages:

Retroviruses. The vector used to treat the SCID children in France was derived from the Moloney retrovirus, an RNA virus that infects mice. Because it inserts its genes into the host's genome, any genes artificially added to the vector are expressed for a long period. It is efficient and seems to produce no strong immune response, but it only works in cells that are actively dividing. Its other main disadvantage is that it integrates into DNA randomly. Gene therapists say there is a remote but real chance that if a retrovirus landed in the wrong location, it might promote cancer.

Adenovirus. Many vectors based on this “cannonball with spikes,” as one expert calls it, are being developed for gene therapy. A common DNA virus that infects the human respiratory tract and eyes, it was the basis for the vector used to treat a liver enzyme deficiency at the University of Pennsylvania. Adenovirus is easy to grow in the lab, and it readily infects both dividing and nondividing cells, expressing genes without inserting itself into the host cell's genome or posing a risk of cancer. But adenovirus proteins stimulate strong immune reactions that clear the vector from the body, making it ineffective for long-term therapy. High doses may be required to transfer enough genes to produce a health benefit. But high doses also can produce powerful toxic reactions when given intravenously, as Penn's researchers discovered.

Adeno-associated virus (AAV). This parvovirus is nearly invisible to the human immune system and readily infects human dividing and nondividing cells. It requires a “helper,” adenovirus, to replicate, and when it integrates into host DNA it does so at a known and apparently safe location. It has some disadvantages: It's more difficult to grow to high concentrations than adenovirus, and it has a small genome, restricting the amount of therapeutic DNA it can carry. Researchers Mark Kay of Stanford University and Katherine High of the Children's Hospital of Philadelphia recently used this vector to transfer genes for a blood clotting protein (factor IX) into patients with hemophilia B. Two patients in the first cohort of a safety trial appeared to improve, and dogs lacking factor IX have shown benefits for as long as 2 years. In addition, Kay has developed a technique that may double AAV's gene-carrying capacity.

Lentiviruses. These slow-growing retroviruses are promising candidates for vectors, according to one champion, gene therapy researcher Inder Verma of the Salk Institute in La Jolla, California. He likes their “unique advantage of introducing genes into dividing and nondividing cells” and their ability to survive without producing a strong immune reaction in the host. The AIDS virus belongs to this family. And despite its fearful origins, Verma is convinced that HIV can be tamed to create a useful vector for gene therapy, although clinical trials may be a long way off. Herpes simplex virus, a DNA virus from a separate family, is another candidate vector, prized because it can infect nervous system cells, which are resistant to other vectors.

14. BIOMEDICINE

# Gene Therapy's Web of Corporate Connections

1. Eliot Marshall

Mark Kay, a researcher at Stanford University who has chalked up several recent triumphs in gene therapy, says there was a time when he advised patients directly about enrolling in his studies of hemophilia B. But not any more. Because he is on the scientific board of a company backing this research—Avigen of Alameda, California—he says he keeps an arm's length from clinical work. He lets others who have no stake in the business handle patients. “I still give talks,” he says, “but I always mention that I am on Avigen's board and that I get remuneration for this.”

Welcome to the new world of genetic medicine. Researchers in gene therapy have become extremely sensitive about perceived conflicts between their financial and scientific portfolios, following the death last year of a volunteer in a clinical trial at the University of Pennsylvania's Institute for Human Gene Therapy (see main text). The trial used a technique for inserting genes into cells that was developed and patented by the institute's head, James Wilson. Wilson and Penn itself have a financial stake in a company Wilson founded to develop the technology.

Wilson's business connections are not unusual. Company sponsorship is pervasive in gene therapy—and for good reason, according to Ronald Crystal, another pioneer in the field, now at the New York Hospital-Cornell Medical School in New York City. Crystal, who developed early cystic fibrosis treatments, says that scientists had to turn to private investors because the clinical tools they need are “very expensive” to develop and were not likely to be funded by National Institutes of Health (NIH) grants.

Indeed, W. French Anderson, the former NIH scientist who filed one of the first applications to perform a clinical trial in gene therapy and who also holds one of the first broad patents in the field, left NIH to pursue this research. In 1987 Anderson helped launch one of the first companies in the field, Genetic Therapy Inc. of Gaithersburg, Maryland. By seeking private money, researchers “flipped the whole paradigm of drug development on its head,” Crystal says: It put academic clinicians in charge of developing their own medical products—not just testing products created by others. “We are playing the role of a pharmaceutical company.”

Crystal holds patents on many gene therapy inventions. He, too, founded a company: GenVec of Gaithersburg, Maryland, which exploits his discoveries under license agreements. In return, GenVec helps pay for Crystal's studies at the New York Hospital. Data from Crystal's efforts to grow new blood vessels in patients with heart disease, for example, are featured in GenVec's press releases. But Crystal says that, to avoid conflicts, he does not get directly involved in patient care. Nevertheless, his dual roles as clinician and businessman recently drew press attention, because he asked the government not to disclose a report he filed on deaths that had occurred among his patients. Crystal and outside reviewers had concluded that the deaths were caused by the patients' underlying disease, not gene therapy. Crystal's request, which was not honored, did not violate federal guidelines.

It was the Penn case, however, that brought potential conflicts of interest to the fore. Like other leaders in the field, Wilson holds patents on several gene therapy delivery techniques, one jointly with Francis Collins, director of the National Human Genome Research Institute. And in 1992, Wilson founded a company—Genovo of Sharon Hill, Pennsylvania—which has R&D agreements with two larger companies, Biogen Inc. and Genzyme, both in Cambridge, Massachusetts. Genovo uses some of the revenue from these deals to help support Penn's gene therapy institute, reportedly providing about $4 million a year. The institute, which has a budget of about $25 million, also receives federal grants and other revenue. Penn's guidelines do not allow faculty members to hold an executive position in an outside business such as this. But Wilson, an unpaid consultant to Genovo, holds equity in the company, as does Penn.

News reports have spotlighted an apparent conflict between Wilson's and Penn's responsibility to give primary attention to the needs of patients and their obligation to provide data to corporate sponsors. The gene therapy trial in which the patient died was not expected to benefit the enrolled patients, but it had a good chance of developing information that could improve the prospects of Genovo. Although Wilson was involved, his connection to Genovo apparently did not violate university or NIH guidelines on conflict of interest because Wilson was not directly involved in the recruitment or care of patients in the clinical trial, nor did Genovo finance the trial. Arthur Caplan, director of Penn's Institute for Bioethics, says Wilson was just “the vector supplier,” and it is “irresponsible” to suggest he was influenced by financial interest. (Wilson declined to comment through a Penn spokesperson, who also declined to respond to questions about the university's policies submitted by fax.)

The furor over this case prompted the American Society of Gene Therapy, of which Wilson was president in 1999, to issue a statement on conflicts of interest in April. It essentially echoes the NIH guidelines. It says that members who are “directly responsible for patient selection, the informed consent process and/or clinical management in a trial must not have equity, stock options or comparable arrangements in companies sponsoring the trial.” Crystal supports it, saying, “we already had that in place” in his clinic. Anderson, likewise, says he has followed this rule in all 16 clinical trials he's been involved in.

The American Society of Human Genetics (ASHG)—whose membership is less directly involved in gene therapy—also issued a statement in April calling for caution in gene therapy. But it stopped short of ruling on conflicts of interest. ASHG president Ron Worton of Ottawa Hospital Research Institute says: “We debated … a ban on recruitment of patients by physicians who have a financial interest,” but board members didn't want to take that step, arguing that “this is something traditionally policed by the universities.” But, as the Penn case illustrates, universities themselves may have potential conflicts to be policed.

15. SCIENTIFIC COMMUNITY

# National Academy of Sciences Elects New Members

The National Academy of Sciences last week elected 60 new members and 15 foreign associates. More details are available at national-academies.org/nas.

Newly elected members and their affiliations at the time of election are:

Alexei A. Abrikosov, Argonne National Laboratory, Argonne, Illinois; Peter C. Agre, Johns Hopkins University; J. Roger P. Angel, University of Arizona, Tucson; Marsha J. Berger, New York University; Howard Brenner, Massachusetts Institute of Technology (MIT), Cambridge; Steven P. Briggs, Novartis Agribusiness Discovery Unit, San Diego; Robert L. Byer, Stanford University, Stanford, California; Moses H. W. Chan, Pennsylvania State University, University Park; Rita R. Colwell, National Science Foundation, Arlington, Virginia; Eric A. Cornell, National Institute of Standards and Technology and University of Colorado, Boulder; Robert J. Cousins, University of Florida, Gainesville; Francis A. Dahlen Jr., Princeton University; Jack E. Dixon, University of Michigan Medical School, Ann Arbor; Kenneth B. Eisenthal, Columbia University, New York; Stanley Fields, Howard Hughes Medical Institute (HHMI) and University of Washington, Seattle; Jean M. J. Frechet, University of California (UC), Berkeley; Lila R. Gleitman, University of Pennsylvania; Sen-Itiroh Hakomori, Pacific Northwest Research Institute and University of Washington, Seattle; Susan E. Hanson, Clark University, Worcester, Massachusetts; Martha P. Haynes, Cornell University, Ithaca, New York; Arthur M. Jaffe, Harvard University; Charles A. Janeway Jr., HHMI and Yale University; William A. Jury, UC Riverside; Jon H. Kaas, Vanderbilt University; Thomas Kailath, Stanford University; James P. Kennett, UC Santa Barbara; Richard D. Kolodner, UC San Diego; Robert H. Kraichnan, Robert H. Kraichnan Inc., Santa Fe, New Mexico; Simon A. Levin, Princeton University; Roderick MacKinnon, HHMI and Rockefeller University, New York City; Robert W. Mahley, UC San Francisco and Gladstone Foundation, San Francisco; Joan Massague, HHMI and Memorial Sloan-Kettering Cancer Center; Barbara J. Meyer, HHMI and UC Berkeley; Jacob Mincer, Columbia University; Michael E. Moseley, University of Florida, Gainesville; William T. 
Newsome III, HHMI and Stanford University; David R. Nygren, Lawrence Berkeley National Laboratory, Berkeley, California; Eric N. Olson, University of Texas Southwestern Medical Center, Dallas; Peter Palese, Mount Sinai School of Medicine, New York City; Jeffrey D. Palmer, Indiana University, Bloomington; George C. Papanicolaou, Stanford University; Walter C. Pitman III, Columbia University; Alejandro Portes, Princeton University; Akkihebal R. Ravishankara, National Oceanic and Atmospheric Administration, Boulder, Colorado; Douglas C. Rees, HHMI and California Institute of Technology, Pasadena; Kenneth A. Ribet, UC Berkeley; Richard H. Scheller, HHMI and Stanford University; Joseph Schlessinger, New York University Medical Center; Eric M. Shooter, Stanford University; Robert M. Silverstein, State University of New York, Syracuse; Sean C. Solomon, Carnegie Institution of Washington, Washington, D.C.; Peter J. Stang, University of Utah, Salt Lake City; Leonard Susskind, Stanford University; Leslie G. Ungerleider, National Institute of Mental Health, Bethesda, Maryland; Grace Wahba, University of Wisconsin, Madison; Robert H. Waterston, Washington University, St. Louis; Rainer Weiss, MIT; Michael J. Welsh, HHMI and University of Iowa, Iowa City; Tim D. White, Cleveland Museum of Natural History and UC Berkeley; Reed B. Wickner, National Institute of Diabetes and Digestive and Kidney Diseases, Bethesda, Maryland.

Newly elected foreign associates, their affiliations at the time of election, and their country of citizenship are:

Simon K. Donaldson, Imperial College of Science, Technology, and Medicine, University of London (U.K.); Reinhard Genzel, Max Planck Institute for Extraterrestrial Physics, Garching (Germany); Shirley Jeffrey, Commonwealth Scientific and Industrial Research Organization, Hobart (Australia); Yoshito Kaziro, Tokyo Institute of Technology, Yokohama (Japan); Willem J. M. Levelt, Nijmegen University and Max Planck Institute for Psycholinguistics, Nijmegen (Netherlands); Shigetada Nakanishi, Kyoto University (Japan); Roddam Narasimha, National Institute of Advanced Studies, Indian Institute of Science, and Jawaharlal Nehru Center for Advanced Scientific Research, Bangalore (India); Eviatar Nevo, University of Haifa (Israel); Armando J. Parodi, University of Buenos Aires (Argentina); A. M. Celal Sengor, Istanbul Technical University, Istanbul (Turkey); Nicholas J. Shackleton, University of Cambridge and Godwin Institute for Quaternary Research, Cambridge (U.K.); T. N. Srinivasan, Yale University (India); Bruce W. Stillman, Cold Spring Harbor Laboratory, Cold Spring Harbor, New York (Australia); Akira Tonomura, Hitachi Ltd., Saitama (Japan); Martinus Veltman, University of Michigan, Ann Arbor (Netherlands).