News this Week

Science  01 Dec 2000:
Vol. 290, Issue 5497, pp. 254


    How Prevalent Is Fraud? That's a Million-Dollar Question

    1. Eliot Marshall

    Charles Turner still doesn't know whether his experience was like finding a rare bad apple in the barrel. But he is sure that there was something rotten in the survey data going into his federally funded study of sexual behavior. And he knows that it has taken him 2 years to pluck out the spoiled fruit and piece together a clean report for publication.

    Turner, a social scientist at City University of New York/Queens College, offered his cautionary story last month at a conference* called by a key federal watchdog agency to announce a $1 million grants program to investigate the prevalence of fraud, data fabrication, plagiarism, and other questionable practices in science. The 8-year-old Office of Research Integrity (ORI), a small unit within the Department of Health and Human Services, hopes to support studies aimed at gauging the frequency of misconduct and how to raise ethical standards.

    Turner's story was the most dramatic of a series of case studies presented at the ORI conference. In 1997, he explained, the National Institutes of Health funded his proposal to ask 1800 Baltimore residents about their sexual behavior. The project, an epidemiological look at AIDS and other sexually transmitted diseases such as gonorrhea and chlamydia, was managed by the Research Triangle Institute (RTI) of Research Triangle Park, North Carolina. Eleven months into the study, Turner, who has an appointment at RTI, got a call from a data-collection manager who was troubled by the apparent overproductivity of one interviewer. A closer look revealed that the worker was faking results; the address of one interview site, for example, turned out to be an abandoned house. The worker was dismissed, and others came under suspicion.

    After “a horrible 6 months” pulling apart the entire study, Turner and his colleagues discovered an “epidemic of falsification” that they linked to a cessation of random quality checks. As the schedule slipped, says Turner, some staffers may have felt pressure to hurry up. Despite a “significant” loss of money and time, the investigators painstakingly plucked out data from tainted sources, sorted the remains, and pieced together a final report that has been submitted for publication.


    Turner says the exercise taught him several hard lessons, the most important being to “validate the work yourself.” Scientists should start analyzing survey data as soon as it is submitted, he says, with a sharp eye for anomalies. Turner says he doesn't know if other projects have faced similar problems, because most journal articles don't discuss the issue. And the incident never became public, he says, because no one was ever publicly accused of wrongdoing and the institute chose to avoid the risk of litigation.
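
Turner's advice to validate survey data as it arrives can be sketched in code. A minimal screen, with data shape, names, and threshold all hypothetical rather than taken from the RTI study, flags interviewers whose completion counts are statistical outliers of the kind that first raised suspicion:

```python
from statistics import median

def flag_outlier_interviewers(completions, threshold=2.0):
    """Flag interviewers whose completion counts deviate far from the
    group median, using the median absolute deviation (MAD), which is
    robust to the very outliers we are hunting.

    completions: dict mapping interviewer id -> completed interviews
    Returns the set of ids whose robust z-score exceeds `threshold`.
    """
    counts = list(completions.values())
    med = median(counts)
    mad = median(abs(c - med) for c in counts) or 1  # avoid divide-by-zero
    return {
        who for who, count in completions.items()
        if abs(count - med) / mad > threshold
    }

# One interviewer reporting far more interviews than peers stands out.
weekly = {"A": 12, "B": 14, "C": 11, "D": 13, "E": 41}
print(flag_outlier_interviewers(weekly))  # → {'E'}
```

A flag is only a starting point, not proof of fraud; in Turner's case the decisive step was following up in person, down to checking interview addresses.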

    How often does misconduct like this occur? There appears to be no consensus on the answer, although science historian Nicholas Steneck of the University of Michigan, Ann Arbor, co-chair of the conference, has drawn up a range of estimates. At the low end is an estimate of 1 fraud per 100,000 scientists per year. That's based on 200 official federal cases that fit a narrow definition that counts only fraud, data fabrication, and plagiarism, out of a community of 2 million active researchers over 20 years.

    At the same time, Steneck notes that 1 in 100 researchers “consistently report” in surveys that they know about an instance of misconduct. A broader definition yields even more hands. There is a “troubling discrepancy,” Steneck observed, “between public statements about how ‘rare’ misconduct in research supposedly is and the more private belief on the part of many researchers that it is fairly common.”

    A study of students at one campus suggests that the practice of massaging data is common, but the behavior decreases as students advance toward a career in science. Biologist Elizabeth Davidson and colleagues at Arizona State University in Tempe asked students in seven introductory biology and zoology courses whether they manipulated lab data to obtain desired results. A huge majority—84% to 91%—admitted to manipulating lab data “almost always” or “often.” Most said they did this to get a better grade. Other studies, however, show that the willingness to fake data declines sharply as students move on to graduate and professional-level work, leading Davidson to speculate that their behavior improves as the “research becomes important to them personally.”

    Some institutions have attempted to remedy the problem of scientific misconduct with special education programs. The University of Minnesota, for example, reported on an ambitious ethics training program at the medical school that in 1 year spent $500,000 on 60 workshops and signed up 2200 researchers as participants. But Steneck and others say that it's hard to measure the effectiveness of such training, and that the meager results to date are disheartening.

    A study of 172 University of Texas students enrolled in a “responsible conduct of research” course, for example, found “no significant change” in attitudes after training, says Elizabeth Heitman of the University of Texas School of Public Health in Houston. The finding is consistent with what Steneck has seen, including a 1996 study that found that people who had gone through a training course were actually more willing to grant “honorary authorship” to colleagues who had not performed research than were those who had not been trained.

    ORI director Chris Pascal says his office has received several favorable comments about the new grants program and that 70 scientists interested in the topic showed up last month for an ORI workshop on how to apply for biomedical research grants. The first round of winners will be announced next year.

    *ORI Research Conference on Research Integrity, 18–20 November, Bethesda, Maryland.


    Too Little, Too Late, at the Climate Talks

    1. Richard A. Kerr

    Under pressure from too many complex issues, too many divergent views, and too little time to forge consensus, international negotiations aimed at reducing greenhouse gas emissions collapsed last week. The most obvious bone of contention was whether the United States, the world's biggest source of human-made greenhouse gases, should be allowed to meet much of its obligation without actually cutting its own emissions. The United States softened its controversial stance in the final hours, but European negotiators found even the scaled-back U.S. position unacceptable. Although the negotiators headed for home with nothing tangible to show for their efforts, they say the rule-setting process is not over, just suspended.

    Filling in the details of the Kyoto Protocol crafted by governments in 1997 was obviously going to be tough (Science, 3 November, p. 920). “The fundamental problem is that you have several intersecting issues and a complicated set of coalitions,” says economist Henry Jacoby of the Massachusetts Institute of Technology. The mix becomes even more daunting when you add in an agreement on targets for reducing greenhouse emissions in developed countries—an average 5% reduction of emissions below their 1990 level—that was reached before anyone established how those reductions could be made. The United States found itself in a particularly tight spot, facing the need for a two-thirds majority in the U.S. Senate to ratify the treaty and a hot economy that would require a 30% reduction in emissions over the next 10 years relative to business as usual.

    To lessen the economic pain, the U.S. negotiators latched onto several “flexibility” opportunities allowed by the protocol. One is to let growing forests and soils soak up carbon dioxide emitted by burning fossil fuels. Initially, U.S. negotiators proposed that almost 310 million tons of carbon—about half of the U.S. reduction target—be accounted for by its forest and soil “sinks” (Science, 3 November, p. 922). Not fair, countered E.U. negotiators as well as observers from some environmental groups, claiming that such generous use of sinks amounted to “rewriting the Kyoto targets.” Many of those same forests and soils were soaking up carbon dioxide in 1990, they pointed out, without any effort on the part of the United States government. The Europeans insisted that the United States must actually reduce its greenhouse emissions rather than rely largely on sinks or another Kyoto option involving trading credits for emission reductions made in other countries.

    By the final hours—actually, during a last-minute extension of negotiations—the U.S. team agreed to let sinks account for just 50 million tons of its mandated 620-million-ton reduction. But by then, “there was a lack of time and a lack of trust,” says Jennifer Morgan of the World Wildlife Fund in Washington, D.C. A compromise on sinks carefully crafted by the U.S. team and a small group of European negotiators was rejected by the full European contingent as time ran out.

    In several other problem areas, progress was made but nothing settled. Left undecided, for instance, was the extent to which a country can buy emission-reduction credits from another country—such as Russia, where a sagging economy has resulted in large emission reductions since 1990. Nor did the negotiators agree on how compliance might be enforced. And it remains unclear how much help developing countries would get to cope with climate change. The protocol mandates that developed countries transfer money and technology to help these countries make the transition to cleaner energy production. But developing countries did not even have the opportunity to weigh in before the talks ended.

    They may get another chance in a few months. In an unusual move, the parties to the protocol agreed to meet again, probably in Bonn in May, to take another stab at setting rules. “The parties aren't letting the protocol fail,” says economist Michael Toman of Resources for the Future in Washington, D.C. “They're still far apart, but these things don't come easily.”


    Sweden to Get Tough on Lingering Compounds

    1. Lotta Fredholm*
    1. Lotta Fredholm is a freelance writer based in Stockholm.

    STOCKHOLM—For generations the Orrefors Kosta Boda glassworks has earned international acclaim for its fine leaded crystal art glass. But its handiwork may soon go the way of gasoline: lead-free. A Swedish government panel has called for banning from commerce any substance that persists in the environment and accumulates in organisms. The guidelines also place a heavier burden on industry to prove that a chemical is safe, whether it's a new compound or one that has been on the market for decades. “Today, substances are treated as if they were suspects in court: They are regarded as innocent until their harm is proven beyond reasonable doubt,” says pollutants expert Bo Wahlström, a senior scientific adviser at the United Nations Environment Programme in Switzerland. Sweden, he points out, “wants to change this by proposing that chemicals prove their innocence before they are marketed.”

    Getting the lead out.

    Proposed Swedish guidelines would drum lead out of commerce, forcing the Orrefors Kosta Boda glassworks to change its time-honored recipe.


    In January, the administration of Prime Minister Göran Persson will present the guidelines to the Swedish parliament, which is widely expected to approve the new chemical policy. But the guidelines could have an impact far beyond Sweden. On 18 and 19 December, the European Union's (E.U.'s) environmental ministers will meet to begin planning a new pan-European chemical policy. To date the E.U. has promoted the risk-analysis principle, meaning that a chemical must be proven harmful to be banned from use. But Sweden will be in a strong position to argue for its approach, which is grounded in the so-called precautionary principle, because Sweden assumes the rotating E.U. presidency in January. Although Swedish environmental minister Kjell Larsson insists he will not impose his country's cautious stance upon his E.U. colleagues, he says Sweden will place a high priority on bringing a new E.U.-wide chemical policy into force during its presidency.

    Established by the government in April 1999, the Swedish Chemicals Committee, a 15-member panel composed of representatives from government, industry, and academia, heard testimony from scientific experts before drafting its report highlighting the dangers posed by chemicals that linger in the environment and accumulate in bodily tissues. A classic example of this threat is polychlorinated biphenyls (PCBs), a family of compounds used for insulating electrical lines. In 1971, Sweden was among the first countries to ban PCBs after they were linked to reproductive problems in eagles and seals that preyed upon PCB-tainted fish in the Baltic Sea. But the report goes a step further, arguing that a chemical need not be proven toxic—as PCBs are—to be drummed out of commerce: It need only be proven persistent and bioaccumulative. If a compound meets these two criteria, “we can be pretty sure that it will also be harmful in the long run,” says environmental chemist Bo Jansson of the Institute of Applied Environmental Research at Stockholm University.

    The new guidelines call for banning from commerce any compound that has a half-life of more than 8 weeks in tests simulating an aqueous environment and is 2000 times more likely to accumulate in fish tissue than in seawater—a standard measure of biological uptake. The panel also calls on the government to hold all 100,000-odd chemicals in the European Inventory of Existing Commercial Chemical Substances to the same standards as new chemicals entering the market. Only about 1 in 10 chemicals now in commerce has been tested for persistence and bioaccumulation. The onus will fall on companies that market products containing these chemicals. “We believe that [the chemical industry] should be carrying out the testing,” says committee chair Arne Kardell, former director-general of the Swedish National Food Administration. According to Kardell's panel, the government should mandate that the most abundant chemicals—the 2500 or so that are imported or produced at a level of 1000 tons or more each year—be tested for persistence and bioaccumulation by 2005. Less widely used chemicals would have an additional 5-year grace period. Any chemical that survives this testing must run the standard gauntlet of toxicology tests.
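
The two numerical criteria translate directly into a simple screening rule. As a sketch (the function and its inputs are illustrative, not part of the committee's actual procedures):

```python
def fails_swedish_criteria(half_life_weeks, bioconcentration_factor):
    """Return True if a substance meets both proposed ban criteria:
    persistence (half-life over 8 weeks in simulated aqueous tests)
    and bioaccumulation (roughly 2000-fold or greater uptake in fish
    tissue relative to seawater)."""
    persistent = half_life_weeks > 8
    bioaccumulative = bioconcentration_factor >= 2000
    return persistent and bioaccumulative

# A PCB-like profile, long-lived and strongly accumulating, is banned.
print(fails_swedish_criteria(half_life_weeks=30,
                             bioconcentration_factor=100000))  # → True
# A short-lived compound passes even if it accumulates.
print(fails_swedish_criteria(half_life_weeks=2,
                             bioconcentration_factor=5000))    # → False
```

Note that toxicity appears nowhere in the rule; that is the point of the proposal, with standard toxicology testing reserved for chemicals that survive this first screen.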

    Industry can see the writing on the wall. Even before the committee got going, the International Council of Chemical Associations launched a program in 1998 in which, as a first step, participating companies share the costs of testing the 1000 most commonly used chemicals for persistence, bioaccumulation, and toxicity. The companies are footing the testing bill—as much as $220,000 per chemical—to “gain credibility and confidence from the public,” says Rainer Koch of Bayer, which is leading the industry task force. And, of course, to get a jump on complying with any new Swedish rules.

    The guidelines represent a long-sought victory for scientists who have fought to see the rules adopted. “When we tried to express this view 5 years ago, we were called ‘fundamentalists,’” says Jansson. Now that the view is about to be adopted as government policy, outside experts are cheering Sweden on. “The committee's proposal makes a tremendous amount of sense,” says Linda Birnbaum, research director of experimental toxicology at the U.S. Environmental Protection Agency in Research Triangle Park, North Carolina. “Extreme persistency and extreme bioaccumulative properties go hand in hand with toxicity.” Birnbaum, however, doubts that Sweden's approach will be adopted any time soon by the United States, which follows the risk-analysis approach.

    Whether Sweden can persuade the rest of the E.U. to adopt such an aggressive policy is unclear. “The political desire for starting this work … will not happen as long as the public doesn't demand a change,” predicts environmental scientist Finn Bro-Rasmussen of the Technical University of Denmark in Copenhagen. But for some companies in Sweden, at least, their products will never be the same. With lead striking out under the new paradigm—it's persistent, bioaccumulative, and toxic—Orrefors Kosta Boda will have to devise new recipes for its crystal. Barium, for instance, gives the same luster as lead, but it is lighter. Says Orrefors spokesperson Karin Lindahl, “We will have to educate our customers not to choose their glass by weight but only by its beauty.”


    NASA Blasted for Rising Costs, Cancellations

    1. Andrew Lawler

    When NASA cancelled a project last month that would have sent a tiny rover crawling over an asteroid, planetary researchers went into orbit. In a rare public statement, several senior scientists said that the cancellation is symptomatic of larger problems in the U.S. planetary science program. They warned that spiraling costs are threatening a fleet of planned missions and also called for a sweeping reexamination of the outer solar system effort.

    The nanorover was scheduled to ride aboard Japan's Muses-C mission, which will return samples of an asteroid to Earth. But cost estimates tripled in the past year, to $60 million, prompting its manager, the Jet Propulsion Laboratory in Pasadena, California, to recommend canceling it. NASA headquarters concurred. The news comes just 3 months after NASA put a Pluto mission on hold because of rising costs (Science, 17 November, p. 1270). Earlier this year, NASA also abandoned a 2001 Mars lander and bowed out of a European comet mission.


    Wes Huntress and AAS decry pattern of delayed and canceled missions.


    “The cancellations and delays never seem to stop,” says Wesley Huntress, director of the Carnegie Institution of Washington's Geophysical Laboratory, NASA's former space science chief, and vice chair of the American Astronomical Society's (AAS's) 1200-member planetary sciences division. “The planetary exploration program is in a crisis mode.”

    In a public statement issued on 14 November, the AAS division blamed the financial problems on “a pattern of underbidding” and an overemphasis on the “cheaper” portion of NASA's commitment to launching faster, cheaper, better spacecraft. To control the cost growth, the division recommends increased competition and external peer review. “We understand NASA is trying to wrestle with this beast,” says division chair Mark Sykes, a planetary scientist at the University of Arizona in Tucson. “But there is the prospect for more cancellations.”

    Agency officials acknowledge the problem. “This is an unusual set of circumstances,” says Jay Bergstralh, NASA's acting science director of the planetary exploration effort. “And there is anxiety in the community.” The 1999 failures of two Mars missions have made for more conservative—and therefore more costly—estimates, he says, citing a report earlier this year that attributed the Mars failures in part to a lack of money for adequate testing.

    AAS isn't the only outside group calling for changes. This week NASA's own space science advisory committee planned to send a letter to Ed Weiler, the agency space science chief, backing increased competition and reiterating the importance of missions like Pluto and Europa. “It's time to take a very careful look at the entire [planetary] program and fix it,” says Steven Squyres of Cornell University, who chairs the panel. Huntress and Sykes also want an outside study of NASA's outer planetary program, but agency officials say that a NASA-led inquiry might come up with better solutions more quickly.

    Bergstralh admits that officials at Japan's Institute of Space and Astronautical Science in Tokyo are “not very happy” with NASA's decision on the nanorover. A proliferation of scientific instruments, he says, drove up costs on what began as a small technology demonstrator. However, it's possible that NASA may want to provide communications and navigation support in exchange for some data.


    DOE Drops Plan to Restart Reactor

    1. Robert F. Service

    The U.S. Department of Energy (DOE) has abandoned the idea of restarting a controversial nuclear reactor at the Hanford Nuclear Reservation in Washington state. Some biomedical researchers are applauding the decision to pull the plug on the Fast Flux Test Facility (FFTF), which they feared would drain scarce resources from other DOE research programs. “It's the right decision,” says Kenneth Krohn, a radiation oncologist at the University of Washington, Seattle, about the department's 21 November announcement. “The FFTF is just too costly.”

    The reactor was opened in 1980 as a breeder test reactor but was shuttered in 1993 after an independent review found that the facility was too expensive to operate. DOE officials later considered using it to produce radioactive isotopes for cancer treatment and plutonium for deep-space probes, both of which DOE feared could run short in the future. But last week officials decided that the cost of the restart, at $314 million over 5 years plus about $80 million a year to operate, was too high and support too thin. Regional environmental groups had been active in opposing any restart.

    Instead, DOE plans to make do with existing facilities and to build a less expensive neutron accelerator that could produce tritium, a short-lived isotope of hydrogen critical to nuclear weapons, and meet other needs. “The department remains committed to its core nuclear science and technology role,” said Energy Secretary Bill Richardson in a written statement. “We expect to meet the nation's foreseeable needs for years to come using our current facilities.”

    No longer in flux.

    Hanford research reactor will be decommissioned.


    The decision to scrap the Hanford reactor, which had become a partisan issue, came as something of a surprise. A draft version of the environmental impact statement released this summer described a rapidly growing need for medical isotopes and raised concerns of future shortages. It also said that NASA needs continuous supplies of plutonium-238 to power batteries for deep-space scientific missions. The FFTF, it noted, could easily fill both these needs and also conduct nuclear research.

    At public meetings, however, FFTF opponents argued that DOE had overestimated the need for both types of materials, and that other sources were plentiful. They also noted that NASA is planning to switch to a new space battery technology that will need only 2 to 3 kilograms a year of Pu-238. Such a small quantity could easily be purchased from Russia at just $10 million a kilogram, a fraction of the cost of restarting FFTF.
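
The opponents' cost comparison is straightforward arithmetic. As a sketch over a 5-year horizon matching the restart estimate (the purchase figure assumes the article's upper bound of 3 kilograms a year; the comparison itself is illustrative):

```python
# Figures from the article, in millions of dollars.
restart_capital = 314     # one-time FFTF restart cost, spread over 5 years
operating_per_year = 80   # annual FFTF operating cost
price_per_kg = 10         # Russian Pu-238, $10 million per kilogram
kg_per_year = 3           # upper end of NASA's projected annual need

years = 5
fftf_cost = restart_capital + operating_per_year * years
purchase_cost = price_per_kg * kg_per_year * years

print(f"FFTF restart plus operation over {years} years: ${fftf_cost} million")
print(f"Buying Pu-238 from Russia over {years} years: ${purchase_cost} million")
```

Even before counting the reactor's other expenses, purchasing comes out at roughly a fifth of the restart-and-operate total, which is the opponents' point.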

    To defray costs, DOE officials searched for companies willing to lease reactor beamtime to make the isotopes needed for diagnostic procedures and cancer treatments, as is done at other reactors for other isotopes. Although several companies seemed interested, DOE undersecretary for nuclear energy William Magwood said that none provided the firm commitment DOE needed to proceed with the FFTF restart.

    Richardson is expected to ratify the department's announcement next month before the Clinton Administration leaves office. That would set the stage for officials to drain the reactor's sodium coolant, after which the reactor cannot be restarted. But supporters, including local officials who lament the loss of jobs, haven't thrown in the towel. Gerald Pollet, head of an environmental group that has led the fight against FFTF, says he is wary of last-ditch efforts to undo Richardson's decision. “It's not a done deal,” says Pollet, whose Heart of America Northwest is preparing legal action to preserve DOE's position. He notes that Senator Slade Gorton (R-WA), whose status depends on the results of a recount, has been a longtime supporter and that a Bush Administration might well appoint an energy secretary who favors restarting the reactor.

    Even if the FFTF fades away, however, DOE may be facing other reactor battles. DOE officials plan to spend $60 million on a conceptual design and research program over the next 2 years for a new neutron accelerator facility, at a site not yet selected, that could provide a backup supply of tritium as well as break down high-level radioactive nuclear wastes into less dangerous byproducts. Senator Pete Domenici (R-NM) led efforts to add the money to this year's budget over the objections of the Clinton Administration, which wanted the funds for other domestic programs.

    NIH

    Higher Profile for Minority Health

    1. Jocelyn Kaiser*
    1. With reporting by Laura Helmuth.

    For years, some biomedical groups and health activists have pushed the National Institutes of Health (NIH) to devote more attention to the health of U.S. minorities. Last week, they got their wish: President Bill Clinton signed into law a measure that elevates NIH's office of minority health to the National Center on Minority Health and Health Disparities. The move comes with the promise of a bigger budget and greater autonomy to pursue studies on why blacks, Hispanics, and other groups suffer disproportionately high rates of diseases such as heart disease, prostate cancer, and diabetes.

    Center of attention.

    Past and current health officials gather to watch President Bill Clinton sign a bill creating a new NIH minority health center.


    NIH created the Office of Research on Minority Health (ORMH) as an administrative home for minority health activities in 1990, but put it on a short leash. It's part of the director's office, and it must broker partnerships with other institutes to fund any studies. Former NIH Director Harold Varmus objected to the idea of creating a center devoted to minority health research, arguing that the problems would be better addressed by NIH-wide initiatives (Science, 28 April, p. 596). But some legislators felt that a center was needed to give health disparities studies the attention they deserved. After a previous attempt by Representative Jesse Jackson Jr. (D-IL) fell short, Senators Edward Kennedy (D-MA) and Bill Frist (R-TN) prevailed on their colleagues, winning final passage of S. 1880 on 31 October.

    The new law gives the center the power to award grants for basic and clinical research independently of other institutes. It also dangles the promise of a doubled budget in 2 years. Although the bill authorizes $100 million—not much more than ORMH's current $87 million—“the intent is that it will be [$100 million] over and above the current budget,” says a staffer for Kennedy. Appropriators have already started the ball rolling, putting $117 million for the center into the 2001 funding bill for NIH that is still pending, according to Dale Dirks of the Association of Minority Health Professional Schools.

    Another provision will forgive up to $35,000 a year in student loans for any researcher conducting studies of health disparities. James Hildreth, a molecular immunologist who's leaving an administrative position at Johns Hopkins University to become assistant director of the center, says the provision “allows more people to make the choice” to study the issue. The legislation also authorizes about $50 million for other Department of Health and Human Services agencies to study ways to reduce disparities in health care outcomes and to educate physicians on treating minority populations.

    The changes will give the center “more impact, more influence, more power,” says Anthony Fauci, head of the National Institute of Allergy and Infectious Diseases. Fauci and NIH acting deputy director Yvonne Maddox are leading a working group examining health disparities research across the institutes that will help shape NIH's priorities for addressing health disparities in 2002.

    Although major biomedical lobbying groups have pushed for the center's creation, some observers question the decision by NIH principal deputy director Ruth Kirschstein to put longtime ORMH director John Ruffin in charge instead of conducting a national competition. Ruffin has fought successfully to fund specific studies and establish new clinical centers at historically black medical schools, but a source who requested anonymity says he's “not very aggressive in moving the agenda.” However, others say that Ruffin's familiarity with top NIH officials could be critical to the center's success. “It seems prudent to me to maintain the core infrastructure of the office to ensure a smooth transition,” says Keith Norris, a clinical researcher at Charles R. Drew University of Science and Medicine in Los Angeles.


    Where the Brain Monitors the Body

    1. Laura Helmuth

    As any klutz will attest, coordination is complicated. Just to keep track of their limbs, for example, people and animals use information from several senses, such as vision, touch, and proprioception, which tells them their body's position. Indeed, large portions of the brain are devoted to keeping track of these sensations and dictating the body's movements. “As you interact with the world, you need constant information about where the body is,” says neurophysiologist Lawrence Snyder of Washington University in St. Louis. Researchers haven't known exactly where all those signals are integrated, but now a team may have located some of the neurons that first make these multisensory connections.

    On page 1782, a team led by psychologist Michael Graziano of Princeton University reports evidence that a small region of the parietal cortex of the monkey brain known as area 5 may enable the monkey to integrate many sources of information about its body and thereby update its mental model of what the body is doing. The researchers based this conclusion on their finding that some area 5 neurons fire at their fastest rates when the visual feedback from a monkey's arm matches the proprioceptive feedback, an indication that the neurons are sensitive to both streams of information.

    Neuroscientists had suspected for some time that parts of the parietal cortex, located below the crown of the head, might be involved in maintaining a coherent representation of the body. One indication of this came from instances in which people with damage in the parietal cortex fail to recognize one of their limbs. Such patients might wake up startled, thinking someone put a fake leg in the bed.


    Neurons in area 5 aren't fooled by flipped arms.


    Graziano and his colleagues were inspired to look for multisensory neurons a few years ago when they uncovered “roundabout evidence” that neurons in another movement area, called the premotor cortex, are sensitive to both vision and proprioception. If neurons in areas that process the body's movement and sensations also respond directly to vision, they reasoned, such neurons might be key to integrating the different kinds of signals that provide a coherent model of the body. Graziano and colleagues then decided to track down where this integration starts—where in the brain's body-sensory system vision first makes an appearance.

    To do this, the researchers devised a technique for giving a monkey information from both vision and proprioception; this would enable the researchers to identify neurons that are sensitive to whether the information matches. After fitting the monkey with a long collar that restricts its near-body vision, the researchers hide one of the animal's arms beneath a shallow ledge. They then place a realistic, stuffed monkey arm or other objects on top of the ledge, either in the same position as the hidden arm or on the other side of the body. Because of the collar, the fake arm might appear, from the monkey's perspective, to be coming from its own body.

    When the researchers recorded the responses of single neurons in area 5 of the monkey's brain, they found cells that are sensitive to whether the sight of a fake arm matches the feel of its real arm. Neurons that respond to one arm didn't change their firing rate when the researchers placed apple slices on the ledge or lined the fake arm up with the monkey's other, also hidden, arm. But when the fake arm was aligned in the same position as the real, hidden, arm, 29% of the neurons changed their firing rate.

    What's more, these neurons weren't fooled by mismatched arms: Right-arm-sensitive neurons didn't fire strongly when a fake left arm was put in the right arm's place; likewise, no neurons ramped up their firing if the fake arm was placed in a different position than the real arm, say, with the palm near the animal's body rather than the shoulder. And neurons upstream of area 5—those that participate in earlier stages of body-sensation processing—didn't respond to the fake arm at all, suggesting that area 5 is the first to integrate different streams of input.

    Earlier research had shown that area 5 responds to proprioceptive signals, says Snyder, but this new result suggests that “the information processed by area 5 is more multisensory, more abstract” than simple proprioception. And if area 5 neurons integrate signals from many channels, Snyder says, they might be the first stages of a “representation of where the body is in space.”

  8. INDIA

    Disease Data Stolen in Lab Break-In

    1. Pallava Bagla

    NEW DELHI—The hard drives of nine computers, containing epidemiological data gathered from around India, have been stolen from the Indian Council of Medical Research (ICMR). The missing data, stored on personal computers in the council's Epidemiological and Communicable Diseases (ECD) unit, include published and unpublished information collected by 16 regional centers on the incidence of AIDS, malaria, tuberculosis, and other killers. Health officials say they have no idea who stole the drives, or for what purpose.

    The hard drives were removed on the night of 10 November from the ICMR's third-floor offices. The thieves systematically dismembered functional computers after breaking open the locks on as many as six rooms but did not touch other, more expensive equipment on the premises. They also left undisturbed the council's main bioinformatics computer center on the ground floor.

    ECD chief Lalit Kant says he is “heartbroken” by the break-in, which represents the loss of years of “sweat and blood.” Individual data sets still exist in the regional centers, he notes, but what is now missing is an overall picture of communicable diseases in India. Kant speculates that the theft might have been an inside job, because the burglars knew exactly which computers to target and because the resale value of the hard disks is negligible. Police have no suspects in the case, which is under investigation.

    ICMR officials say they have no idea how the thieves might use the data, although Kant suggests that his unit might have been targeted for harassment by “jealous” competitors. Director-General Nirmal Kumar Ganguly says that “no sensitive data have been lost,” but building security has been tightened considerably since the theft. “Even my bags are now searched when I leave the office every day,” he notes.


    Gene Jocks, Data Crunchers Hit Jackpot

    1. John Pickrell*
    1. John Pickrell is a science writer in London.

    LONDON—Unveiling its science spending plan for the next 3 years, the U.K. government last week announced major new investments in three key areas: tracking disease genes, leveraging the Internet for data analysis, and supporting emerging industries such as nanotechnology and bioengineering. Although these programs cut across a range of disciplines funded by the U.K.'s science councils (see table), the government also bestowed a long-anticipated gift on astronomers: membership in the European Southern Observatory (ESO), which will give U.K. researchers access to the world's largest optical telescope.

    From 2001 to 2004, the U.K. government will pour $8.2 billion into science, $1 billion more than the science budget for the previous 3 years—a 7% increase after adjusting for inflation. The biggest winner is genomics, whose centerpiece will be the U.K. Population Biomedical Collection, a project in which several centers will take DNA samples from half a million volunteers and then chart each person's medical condition and lifestyle over the coming years. The effort should allow researchers to hunt down genes linked to chronic illnesses such as cancer and heart disease. “The collection promises to be one of the most exciting scientific initiatives of recent times,” says Sir George Radda, chief executive of the Medical Research Council.

    Also capturing a large share of new funding are projects that aim to digest once-unfathomable amounts of data, an area the U.K. government calls “e-science.” Epitomizing the challenge is the Large Hadron Collider (LHC), a proton accelerator scheduled to come on line at CERN, the European particle physics laboratory near Geneva, in 2005. Each of the LHC's four detector experiments, which will probe the nature of matter and the origins and evolution of the universe, is expected to churn out more than 1 petabyte (1 quadrillion bytes) of data—equivalent to a mile-high stack of CD-ROMs—every second. Much of the $137 million in new funding will go to DATAGRID, a pan-European project based at CERN to develop shared databases.

    Climate change research, too, will benefit from this database approach. Besides getting a $10 million infusion for climate modeling, the Natural Environment Research Council (NERC) will fund projects aimed at determining whether the British Isles are on the brink of a rapid cooldown. Thanks to the Gulf Stream, part of the so-called conveyor belt in the Atlantic Ocean that brings warm water northward and cold water southward, northwestern Europe is currently 5° to 10°C warmer, on average, than other regions at these high latitudes. Some scientists believe that global warming might destabilize the conveyor belt. If so, the conveyor belt could weaken or turn off sometime in the next century—as it appears to have done in the past—and that could trigger much icier conditions in northwestern Europe. In response to that threat, NERC will spend at least $15 million over the next 5 years on studies of the North Atlantic aimed at assessing the likelihood of a conveyor breakdown and the impact that would have on the United Kingdom. Says a NERC spokesperson, “We hope that this program will act as a trigger to enable other countries to get funding for research in this field.”

    Astronomers have special reason to celebrate this week: Pending final negotiations, starting in 2002 they will have access to all ESO facilities, including the vaunted Large Millimeter Array and the Very Large Telescope, both situated in the Atacama Desert in Chile. They will also get in on the planning of ESO's proposed next-generation ground-based scopes, dishes about 50 meters in diameter that would form a facility called the Overwhelmingly Large Telescope, or OWL. “This is excellent news,” says Mike Edmunds of the University of Wales, Cardiff, who chairs the Particle Physics and Astronomy Research Council's (PPARC's) astronomy vision panel. PPARC's decision to cast its lot with ESO shows that whereas U.K. politicians are bitterly divided over how tightly they should yoke Britain's economy to the European Union's, scientists themselves are no isolationists. “Entry to the ESO is consistent with the trend towards global cooperation,” notes Luke Georghiou, a science policy expert at Manchester University.


    But the commitment to ESO—$14 million a year for the next decade—could entail sacrifices at home, because PPARC's budget rise likely won't cover the entire ESO bill. “The main worry is what could be cut domestically to pay for it,” says Georghiou. A PPARC spokesperson discounted rumors that ponying up ESO dues would force the closure of the Jodrell Bank Observatory in Cheshire and its venerable Lovell radio telescope now being refurbished under a grant from the U.K.-Wellcome Trust Joint Infrastructure Initiative. But he declined to comment on where the savings might come from.

    One budget item that the government left untouched is academic paychecks. “The level of pay for university researchers remains the biggest outstanding problem,” says Georghiou. “It has fallen well behind other professions, making recruitment of the best talent increasingly hard.” But even if the take-home pay isn't great, at least scientists in genomics and Internet databases, for instance, should have a fair shot at snaring some additional cash to keep their labs in top form.


    New Recruits for French Prion Research

    1. Barbara Casassus*
    1. Barbara Casassus is a freelance writer in Paris.

    PARIS—As panic over “mad cow disease” engulfs France and threatens to spread to other countries in Western Europe, French research minister Roger-Gérard Schwartzenberg last week unveiled detailed plans for spending $27 million the government has earmarked for prion disease research in 2001. Next year's budget for studying prions—infectious, abnormal proteins linked to bovine spongiform encephalopathy (BSE) and its human form, variant Creutzfeldt-Jakob disease (vCJD)—will triple France's current prion research spending.

    Earlier this month, both Spain and Germany reported their first BSE cases, sparking fears of a major Europe-wide epidemic. In France, meanwhile, the sharp jump in BSE cases this year prompted the government to respond with a whopping increase for prion research (Science, 17 November, p. 1273). Some of the new cash will be spent to recruit 100 researchers and technicians in 2001, including 25 postdocs, to add to the 240 scientists now working full- or part-time on prion diseases in France, Schwartzenberg said at a 23 November press conference. Another 20 researchers will be recruited in 2002 and 2003. “It is good we will be able to pay” researchers and technicians, says immunologist Jean-Yves Cesbron of Joseph Fourier University in Grenoble, but the extra resources have come “a bit late.” Cesbron wonders where the postdocs will be found and is worried that the French prion research effort will be too dispersed. “Very few laboratories working in this area have critical mass,” he says.

    The cash infusion also will be used to expand current research efforts (see table) and fund entirely new projects, such as work at the University of Montpellier using lemurs to study how prions cause disease in primates. A major focus will be animal models for prion diseases, including transgenic mice, cattle, and sheep. The beefed-up research effort catapults France to nearly the same level as the United Kingdom—the country hardest hit by both BSE and vCJD—in prion research spending. The U.K. government spends about $34.4 million a year, with an additional $1.4 million coming from the Wellcome Trust charity. France is now “well placed compared to other European countries,” Schwartzenberg said. Although the spending boost comes hard on the heels of what the French press has called the country's budding “national psychosis” over BSE, Schwartzenberg played down suggestions that the panic itself fueled the increase. This is “not the principal” reason, he said. “We are convinced that the increase in the number of [BSE] cases … justifies a greater research effort.”


    Sanger Will Sequence Zebrafish Genome

    1. Gretchen Vogel

    As the international human genome project nears completion, the Sanger Centre in Cambridge, U.K., has settled on a new effort to keep its sequencing machines humming: the genome of the zebrafish, a model organism much loved by developmental geneticists. After nearly 4 years of lobbying biomedical funding agencies (Science, 14 February 1997, p. 923), scientists who study the 4-centimeter Danio rerio are delighted. “It's just such a dream,” says Leonard Zon of Children's Hospital in Boston, who studies the development of blood cells in the zebrafish and who, with Nobel Prize winner Christiane Nüsslein-Volhard and others, was a strong advocate for the sequencing project.

    Developmental biologists value Danio rerio for its transparent embryos, which allow easy viewing of a developing vertebrate nervous system, heart, and other organs. Researchers typically expose male fish to DNA-altering chemicals and then breed the fish to create embryos carrying a range of mutations. This enables them to examine the mutants for intriguing characteristics, such as brittle bones or faulty digestive systems, and from there to find the defective genes (Science, 19 May, p. 1160). Although the mouse genome now being sequenced by two groups—a public-private consortium and Celera Genomics in Rockville, Maryland—will help scientists assign functions to many human genes, having the full DNA sequence of a more distantly related vertebrate will uncover additional gene functions that are missed in human-mouse comparisons, says developmental geneticist Marnie Halpern of the Carnegie Institution of Washington in Baltimore.

    The Sanger Centre gave its unofficial endorsement to the project earlier this year (Science, 5 May, p. 787), but the research team, led by Jane Rogers and Richard Durbin, worked out the details only last month. The Wellcome Trust announced on 21 November that it would fund the effort, spending more than $7 million for the first year of the 3-year project.

    Sanger scientists will use the shotgun technique, fracturing the entire zebrafish genome into smaller pieces, sequencing each one, and reassembling them in order with the aid of sophisticated computer programs. A first draft of the sequence could be done by fall 2001, Rogers says, with the finishing touches completed by 2004.
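    The reassembly step described above can be illustrated with a toy greedy assembler. This is a deliberate simplification for illustration only, not the Sanger Centre's actual pipeline: real assemblers must cope with sequencing errors, repetitive DNA, and millions of reads, and the example sequences here are made up.

```python
def overlap(a, b):
    """Length of the longest suffix of a that is also a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(reads):
    """Repeatedly merge the pair of reads with the largest overlap."""
    reads = list(reads)
    while len(reads) > 1:
        # Score every ordered pair and pick the best overlap.
        n, i, j = max(((overlap(a, b), i, j)
                       for i, a in enumerate(reads)
                       for j, b in enumerate(reads) if i != j),
                      key=lambda t: t[0])
        merged = reads[i] + reads[j][n:]  # join, dropping the shared overlap
        reads = [r for k, r in enumerate(reads) if k not in (i, j)]
        reads.append(merged)
    return reads[0]

# Three overlapping "reads" shotgunned from a made-up 15-base "genome":
reads = ["ATGGCGTA", "GCGTACGTT", "CGTTAGC"]
print(greedy_assemble(reads))  # → ATGGCGTACGTTAGC
```

The greedy overlap merge captures the core idea—shared sequence at fragment ends tells the computer which pieces were neighbors in the original genome—while glossing over everything that makes assembly hard in practice.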

    Rumors a few months ago that Celera might launch a zebrafish sequencing project had researchers worried that the Sanger Centre might drop the project and that the data would not be as freely available, so they intensified their lobbying. But Celera's Mark Adams says the company has no plans to sequence the zebrafish and is glad that Sanger is doing so.

    Recent advances, such as the creation of improved genetic maps, have already shortened the time it takes to identify the genes responsible for the intriguing mutations in zebrafish. But the entire 1.8-billion-base-pair sequence will speed the job considerably, Zon says: “This will really set the field on fire.”


    Stem Cells: New Excitement, Persistent Questions

    1. Gretchen Vogel

    Bone marrow cells that become neuronlike cells in the brain fascinate scientists, but ample uncertainties must be resolved before such results can be translated into therapeutics

    If only bodies were as easy to fix as automobiles, diseases like diabetes or heart disease would be vanquished. Worn-out, defective cells would be readily replaced, new organs inserted, and impossible illnesses cured. Sometimes, such a world seems just around the corner—if you don't read the caveats too closely. Almost every week brings another report of the uncanny abilities of versatile stem cells, when transplanted into mice or rats, to form new blood vessels, strengthen weak bones, and even seek out and begin to repair damaged spinal cords and brains (Science, 24 November, p. 1479).

    Tantalizing interlopers.

    Cells derived from bone marrow transplants, identified by green fluorescent protein (top) or a green-stained Y chromosome, expressed neuronal markers, stained red, in the brains of mice.


    Much of the excitement has focused on the ability of these partially developed cells present in early embryos, fetal tissue, and several adult tissues to change course and become different types of cells—a proto-brain cell morphing into a muscle cell, say, or a bone marrow cell into a liver cell. For many years researchers assumed that a cell's fate was sealed, irrevocably, early in development. But increasingly, experiments are undermining that idea. In the latest example, two independent research teams report in this issue that, in mice, adult cells from the bone marrow can enter the brain and become neuronlike cells. The two papers strengthen the notion that cells from adult tissue, when prodded with the right signals, can change trajectories, abandoning their original identity and assuming a new one. If a similar phenomenon occurs in human brains—still a big if—it could mean that easily accessible cells from bone marrow might someday be used to treat a wide range of neurological diseases—without raising the ethical concerns that accompany the use of embryonic cells.

    But there's a catch. Can the dramatic findings that so far have grown out of work with stem cells taken from mice be repeated in humans? Research on human cells lags behind, in part because of ethical debates restricting the use of cells derived from human embryos and fetuses (see sidebar), but also because of certain characteristics of human cells themselves. Human cells grow more slowly and divide less often in culture than their mouse counterparts. And once transplanted, usually into rodents, human stem cells are proving decidedly less predictable. What's more, scientists are at a loss to explain the surprising behavior of both human and mouse stem cells. The molecules that control the cells' unusual fate-switching and tissue-rescuing abilities remain elusive, making it difficult to test the observations in culture, especially with human cells. Any human treatments, suffice it to say, are years away.

    The latest papers highlight the personality-switching abilities of mouse stem cells while also reflecting some of the uncertainty typical of the field. Éva Mezey of the National Institute of Neurological Disorders and Stroke (NINDS) and her colleagues describe on page 1779 how they transferred bone marrow cells from normal adult mice into a strain of mice that cannot produce immune system cells. Usually, mice without immune systems die within a day of birth, but a bone marrow transplant can rescue them, and they grow normally following the transplant. To trace the fates of the transplanted cells, the team members injected bone marrow from adult male mice into newborn female recipients. One to 4 months after the transplants, the scientists killed the mice and examined their brains. In all of them, the researchers found cells containing Y chromosomes—unmistakable proof that they came from the male donors.

    That observation in itself was not surprising: Scientists have known for years that cells from the immune system can enter the brain, and recent reports have shown that cells present in bone marrow could become astrocytes and microglia, two of the brain's supporting cell types. The unexpected result was that a small percentage of the male-derived cells expressed protein markers typical of neurons, the brain's key signaling cells, suggesting that the bone marrow cells had, upon reaching the brain, become neurons. Until a few years ago, scientists did not think mammals produced any new neurons at all after childhood—much less that foreign bone marrow cells could be coaxed into such a feat.

    In independent work, cell biologist Helen Blau, graduate student Tim Brazelton, and their colleagues at Stanford University also found evidence for the versatility of adult bone marrow cells. As reported on page 1775, the team members injected bone marrow cells from adult mice into otherwise normal mice that had received a lethal dose of radiation to kill their bone marrow cells. The researchers used bone marrow from mice genetically engineered to express green fluorescent protein in their cells so they could track the injected cells. Several months after the transplant, the researchers found glowing green cells throughout the brains of recipient mice. To determine what type of brain cells the bone marrow had become, the team members stained brain sections to detect neuronal-type markers. To their surprise, they, too, found transplant-derived cells expressing multiple neuronal proteins.

    Despite both teams' independent results, other scientists caution that protein markers can be misleading. Mature, functional neurons can be notoriously difficult to identify using cell markers, and both teams failed to detect more than a few cells with the characteristic shape of a mature neuron, with long extensions reaching out to other cells. The transplanted cells are “expressing certain features of neurons, but there's a lot we don't know,” says developmental neuroscientist Ron McKay of NINDS.

    And if the cells truly are neurons, the scientists still need to decipher exactly which bone marrow cells enter the brain and what molecular signals draw them there. Neuroscientist Anders Bjorklund of Lund University in Sweden suspects that the age and condition of the recipient mouse might influence the recruitment of bone marrow cells to the brain. Mezey and her colleagues worked with newborn mice, and it might be easier for stem cells to infiltrate those still-developing brains. In Brazelton and Blau's work, the adult recipient mice received a high dose of radiation that killed not only bone marrow but also any dividing cells in the brain. Perhaps such an assault prompted the migration of cells, Bjorklund speculates.

    Blau's team is now working to characterize the molecules that control the recruitment process. “We need to find out what factors we can deliver to make cells divide and home in and take up residence in the right place,” she says. Indeed, a detailed understanding of such factors would probably have to precede any clinical applications, McKay says.

    A web of possibility.

    In mice, stem cells from adult brain and bone marrow have shown encouraging potential for repairing a variety of tissues and organs and perhaps curing disease.


    Although clinical applications are a long way off, recent work supports the idea that human bone marrow might also have multiple talents, although exploiting them may still be a challenge. Researchers led by Darwin Prockop of Tulane University in New Orleans and Ira Black of Robert Wood Johnson Medical School in Piscataway, New Jersey, reported in the August Journal of Neuroscience Research that human marrow stromal cells, a subset of bone marrow cells, began to resemble neuronal-type cells in culture. And this summer, Malcolm Alison of the Imperial College School of Medicine in London and his colleagues reported in Nature that at least a few human bone marrow cells became liver cells in patients who had received bone marrow transplants. In the study, women who had received bone marrow transplants from male donors had liver cells that contained Y chromosomes—most likely derived from the transplanted bone marrow cells. That finding is consistent with previous reports of similar phenomena in mice (Science, 14 May 1999, p. 1168), suggesting that bone marrow cells might someday be useful in treating liver disease.

    In contrast, the human embryonic stem cells and fetal germ cells that made headlines in November 1998 because they can, in theory, develop into any cell type have so far produced relatively modest results. Only a few papers and meeting reports have emerged from the handful of labs that work with human pluripotent cells, whose use has been restricted by legal and commercial hurdles. Last month, a group led by Nissim Benvenisty of The Hebrew University in Jerusalem, in collaboration with Douglas Melton of Harvard University, reported in the Proceedings of the National Academy of Sciences that they could nudge human embryonic stem cells toward a number of different cell fates. But the results did not produce easy answers; some cells expressed markers from several kinds of lineages.

    The work suggests that it will not be simple to produce the pure populations of certain cell types that would be required for safe and reliable cell therapies—much less the hoped-for replacement organs, says stem cell researcher Oliver Brüstle of the University of Bonn in Germany. Brüstle was one of the first to show that mouse embryonic stem cells could help treat an animal disease model, in which neurons lack their insulating coat of myelin. Even so, he is cautious about the near-term prospects in humans. Says Brüstle: “At present, it looks like it is really difficult to differentiate these [human] cells into more advanced cell types.” Melton agrees. “It's unlikely anyone will ever find a single growth factor to make a dopaminergic neuron,” as some might have hoped, but the work provides “a starting place,” he says.

    Simply keeping human embryonic stem cells alive can be a challenge, says Peter Andrews of the University of Sheffield in England. For more than a year, he and his colleagues have been experimenting with embryonic stem cell lines that James Thomson derived at the University of Wisconsin, Madison. “They're tricky,” Andrews says. It took several false starts—and a trip to Wisconsin—before the researchers learned how to keep the cells thriving, he says. Melton uses almost the same words: Human embryonic stem cells “are trickier than mouse,” he says. “They're more tedious to grow.”

    Researchers from Geron Corp. in Menlo Park, California, are having some luck. Company researchers have been working with human embryonic stem cells as long as any team has, because Geron funded the derivation of the cells and has an exclusive license for their commercial use. They reported in the 15 November issue of Developmental Biology that cell lines derived from a single embryonic stem cell continued to replicate in culture for 250 generations. This is important, says Geron researcher Melissa Carpenter, because it means that a single human embryonic stem cell, which might be modified in the lab, could produce an essentially unlimited supply of cells for therapy. That was known for mouse embryonic stem cells but had not been shown in humans before. Even so, Geron researchers seem no closer than other groups to devising therapeutic uses for stem cells. At the annual meeting of the Society for Neuroscience last month, the company reported that it had attempted to transplant human embryonic stem cells into rats. Undifferentiated cells injected into the brain did not readily differentiate into brain cells; instead, they stayed in a disorganized cluster, and brain cells near them began to die. Even partially differentiated cells, the team reported, tended to clump together; again, nearby brain cells died.

    Still, Melton is optimistic. “How easily can we translate what we know in the mouse to the human? There's nothing we've found that makes me think it can't be done,” he says.

    The most important next step, say several stem cell researchers, is to identify the molecular processes that underlie the impressive feats of stem cells. Many of the purported breakthroughs are simply observations, Bjorklund says, which may eventually be explained by events unrelated to stem cell versatility. “That is going to be one challenge for those working in the field,” he says. “One has to come up with a deeper understanding of the mechanisms involved to get anywhere.” Blau agrees: “We have to understand the rules” to find out how to better play the cell-replacement game.


    Stem Cell Scorecard

    1. Gretchen Vogel

    As researchers continue to explore the potential uses of stem cells obtained from a variety of sources (see main text), governments around the world are grappling with whether to allow research on stem cells derived from human embryos. Governments are cautious yet increasingly open to the new research, which may eventually yield treatments for a variety of diseases from Parkinson's to diabetes.

    Japan: The Council for Science and Technology, the country's highest scientific advisory group, was scheduled to begin discussions of final guidelines governing the use of stem cells this week. Those guidelines are expected to closely resemble a draft document released last spring (Science, 11 February, p. 949). Until those guidelines are in place, scientists in Japan are not allowed to derive or work with human embryonic stem (ES) cells, says Shin-Ichi Nishikawa of the University of Kyoto. One use of the technology is likely to be banned: A law prohibiting human reproductive cloning has passed the lower house of parliament and is expected to pass the upper house, says Koichi Morimoto, science counselor at the Japanese embassy in Washington, D.C.

    Germany: Although German law forbids research that harms an embryo, it does not prohibit import of already-derived ES cells, according to Oliver Brüstle of the University of Bonn. Brüstle has applied for a grant to do just that and is waiting to hear back from Germany's funding agency, the DFG. Legislation is unlikely to change in the near future, he says.

    Sweden: The government is reviewing its guidelines for stem cell and cloning research, even as plans are under way at Stockholm's Huddinge Hospital to launch a project to derive new stem cell lines from embryos that will be available for basic research, says neuroscientist Anders Bjorklund of Lund University.

    United Kingdom: The U.K. may be the most permissive. In a report supported by Prime Minister Tony Blair, an advisory committee recommended in May that researchers be allowed to conduct nuclear transfer experiments with human cells (Science, 25 August, p. 1269). Some scientists would like to learn how to transfer the nucleus from a patient's cell into an enucleated egg in order to derive perfectly matched stem cells for treating the patient. Parliament is expected to debate revisions to the law governing research on embryos in the next few months.

    Australia: In Australia, policies are in flux. State law varies across the country. Victoria, for example, prohibits the derivation of ES cells, but other states have no regulations. The National Health and Medical Research Council, the country's biomedical funding agency, has issued guidelines on research in human-assisted reproduction technology; these state that human ES cells may not be developed with the aim of cloning an individual. A National Parliamentary Committee is currently considering the status of therapeutic cloning, stem cell research, and related matters. The committee is expected to report before the end of the year.

    European Union: An E.U. ethics advisory board recommended in November that the E.U. fund research using all types of stem cells, especially those derived from adult tissue. Because research on ES cells is still preliminary, the advisory board discouraged work that would create new embryos for research. Plenty of “excess” embryos already exist in fertility clinics, destined to be discarded, the report says.

    United States: Although the National Institutes of Health (NIH) issued guidelines for funding work with human pluripotent stem cells in August, federally funded researchers will not be able to begin such work until next spring at the earliest. The guidelines allow NIH-funded scientists to use embryonic or fetal stem cells only after careful ethical review of the methods used to derive those cells. Because no scientist has yet submitted all the documentation needed for the review, says NIH associate director for policy Lana Skirboll, the Human Pluripotent Stem Cell Review Group will not meet this month as originally scheduled. Committee members should be announced before year's end, Skirboll says, and the next scheduled meeting is in April 2001.

    That will be well into the new presidential Administration, and if Governor George W. Bush prevails, the climate at NIH may change. Bush has stated that he is opposed to the NIH funding work with ES cells, and a Bush-appointed director could prohibit any ES cell work.

    Such an outcome could prompt stem cell supporters in Congress to act. Senator Arlen Specter (R-PA), for instance, last year introduced a bill that would have authorized the NIH to fund work on the derivation and use of human ES cells. But despite star-studded hearings in which actors Mary Tyler Moore, Michael J. Fox, and Christopher Reeve testified in favor of the research, the bill died in the Senate (Science, 13 October, p. 261). Specter has said he will reintroduce the bill in the new Congress.


    Fossils Come to Life in Mexico

    1. Erik Stokstad

    MEXICO CITY—New insights into ancient life came by land and sea at the 60th annual meeting of the Society of Vertebrate Paleontology, held here from 25 to 28 October. Stunningly preserved fossils from Mongolia continued to impress; biomechanical models were also big, including one that clocks the swimming speed of cruising ichthyosaurs.

    More Family Life for Dinos

    As a sandstorm howled about them, 15 Protoceratops hatchlings huddled in a nest. Suddenly, an onslaught of sand, perhaps from a collapsing dune, buried them alive. Uncovered more than 65 million years later at a dig in Mongolia, their tiny bones have given paleontologists the first evidence that this common herbivore cared for its young. Meanwhile, other Mongolian fossils suggest that a predatory dinosaur cited as a model parent may not have been so nurturing after all. Taken together, the new finds point to “radically different strategies” of dino parenting, says Hans-Dieter Sues, a paleontologist at the Royal Ontario Museum in Toronto.

    The care of dinosaur young has fascinated scientists since the 1980s, when well-preserved nests discovered in Montana showed that duck-billed Maiasaura padded their nests with plants and carefully arranged the eggs to prevent them from rolling. The new report of the first known Protoceratops nest comes from a 1994 discovery by Narman Dakh of the Mongolian Paleontological Center in Ulaan Baatar, which David Weishampel, a paleontologist at the Johns Hopkins University School of Medicine in Baltimore, described for the first time at the meeting. The 15 skeletons, each just 16 centimeters long, lay belly down on one side of the nest. All had their heads facing away from the prevailing wind, as indicated by patterns in nearby fossil dunes. Weishampel believes that the young dinosaurs were siblings and were probably being cared for by an adult. “There certainly is a family life here,” says Peter Dodson, a paleontologist at the University of Pennsylvania in Philadelphia.

    Carnivores can be caregivers, too. In the past decade, paleontologists have found two striking examples: a pair of nests on which predatory Oviraptor mothers apparently died while brooding their eggs. At the meeting, Weishampel described hints that closely related dinosaurs, the oviraptorosaurs, may have become self-sufficient shortly after hatching. Three fossilized eggs from a Mongolian nesting site—also about 65 million years old—contain partial, articulated bones, he said. They are much more developed than bones from other oviraptorosaur embryos, and the type of tissue deposition indicates rapid growth. These facts suggest to Weishampel that the dinosaurs hatched relatively well developed and may have been itching to leave the nest.

    How Fast in the Water?

    While dinosaurs roamed the Mesozoic land, equally exotic reptiles—ranging from long-necked plesiosaurs to huge, powerful lizards—swam the oceans. Their teeth and skeletons offer many clues to how they lived, but watertight estimates of their typical swimming speeds have been few and far between. At the meeting, paleontologist Ryosuke Motani of the Royal Ontario Museum in Toronto described a new method for calculating both the speed of sleek marine reptiles called ichthyosaurs and how energetic they must have been to achieve it. “The neat thing about this study is that it is the first opportunity to have an empirical estimate of potentially raised metabolism in this group,” says Glenn Storrs of the Cincinnati Museum Center and the University of Cincinnati.

    Motani became interested in the speed of ichthyosaurs last year while trying to figure out how deep the animals could dive. He thought they could stay submerged for about 20 minutes, judging from a correlation between body size and dive duration in modern swimmers. To know how deep they could go, though, he needed their typical speed. Clocking speed can be tricky for extinct land animals, because they could have walked in many different ways. For swimming creatures, however, the laws of hydrodynamics dictate that an animal's speed depends strongly on its shape. Because ichthyosaurs were shaped much like modern tuna, Motani conservatively estimated that they swam at tuna speed, about 1 meter per second. Thus, he calculated, they probably could have dived at least 500 meters.
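The depth estimate above follows from simple arithmetic: 20 minutes of swimming at roughly tuna speed covers about 1200 meters, and splitting that between descent and ascent allows a depth on the order of 600 meters, consistent with the article's "at least 500 meters." A minimal sketch, using the figures quoted in the article and assuming (for illustration) a straight down-and-back dive profile:

```python
# Back-of-the-envelope dive-depth estimate for an ichthyosaur (sketch).
# Numbers come from the article; the round-trip dive profile is an
# illustrative assumption, not Motani's actual calculation.

cruise_speed_m_per_s = 1.0    # tuna-like cruising speed assumed for ichthyosaurs
dive_duration_s = 20 * 60     # ~20 minutes submerged, from the size-duration correlation

# Total distance swum during one dive, then halved for descent plus ascent.
total_distance_m = cruise_speed_m_per_s * dive_duration_s  # 1200 m
max_depth_m = total_distance_m / 2                         # 600 m

print(max_depth_m)  # 600.0 -- consistent with "at least 500 meters"
```

Any time spent foraging at depth would shorten the transit legs, which is one reason the published figure is hedged as a lower bound.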

    To check his results, Motani developed a mathematical model that estimates the most efficient swimming speed from features of a fossil, such as the length of the animal and the shape and size of its tail fin. Then he tested the model on 12 species of whales, fishes, and other living marine swimmers. It correctly predicted their typical swimming speeds. Motani then plugged in the dimensions of an Early Jurassic ichthyosaur called Stenopterygius, one of a few fossils preserved with its tail fin intact. The model indicated that the roughly 2-meter-long ichthyosaurs would have swum most efficiently at speeds between 1.3 and 1.8 meters per second. That's on a par with tuna of the same size but slightly slower than whales, which have a different body plan.

    Next, Motani compared his results with those of a model designed in 1988 by Judy Massare of the State University of New York, Brockport. Massare's model calculated the relative speeds of swimming animals from their size, shape, and metabolic rate—the last an unknown in the case of ichthyosaurs, of course. Motani fine-tuned the model and tried it with three different metabolic rates: an average of rates of modern marine mammals; an average for terrestrial reptiles; and an intermediate rate, like that of leatherback turtles and some tunas. The cruising speed for the turtle-and-tuna group matched the range of the ichthyosaurs, suggesting that the creatures had similar metabolisms.

    Ichthyosaurs needed a fast metabolism to propel their sleek, streamlined bodies to full speed, Storrs says. And compared with their modern land-dwelling cousins, adds David Norman of the University of Cambridge, United Kingdom, reptiles that spent their lives immersed in cold water would have required extra energy just to stay warm.


    Beating Up on a Young Earth, and Possibly Life

    1. Richard A. Kerr

    Bits of melted rock in lunar meteorites tell of a brutal battering suffered by the moon and Earth almost 4 billion years ago, as life was getting started

    Call it tough love or growing pains—Earth owes its character to the school of hard knocks. It achieved something like its present size by colliding with innumerable chunks of rock as big as cities. Then, about 4.5 billion years ago, a rogue Mars-sized body plowed into the nascent planet and splashed off enough rock to form the moon. The inner solar system quieted down considerably within a few hundred million years after that, and relative peace prevailed. But early analyses of lunar rocks returned by Apollo astronauts hinted at a sudden violent episode 600 million years after Earth's birth. Seemingly out of nowhere, a hail of objects pummeled Earth, the moon, and perhaps the entire inner solar system. Now this “late heavy bombardment” is getting strong support from analyses of rocks the astronauts never saw: meteorites that fell to Earth from the moon's farside.

    Lunar meteorite analyses reported on page 1754 of this issue reveal a burst of impacts on the moon 3.9 billion years ago and nothing before that. Cosmochemists Barbara Cohen, Timothy Swindle, and David Kring of the University of Arizona, Tucson, conclude that the moon and Earth endured a storm of impacts 100 times heavier than anything immediately before or after. Such a lunar cataclysm would have scarred the moon with the great basins that now shape the man in the moon. On Earth, the same bombardment would have intervened in the evolution of life, perhaps forcing it to start all over again. “To me it seems highly likely there was a lunar catastrophe,” says cosmochemist Laurence Nyquist of NASA's Johnson Space Center in Houston. But skeptics wonder how the solar system could have held off delivering such a devastating blow for more than half a billion years.

    Even a pair of binoculars reveals that the moon has had a rough time of it. Analyses of impact-battered Apollo rocks suggested that violent collisions about 3.9 billion years ago—dubbed the terminal lunar cataclysm—disrupted the isotopic composition of moon rocks. Then, in the early 1990s, geochronologist Brent Dalrymple of Oregon State University in Corvallis and planetary scientist Graham Ryder of the Lunar and Planetary Institute in Houston determined precise ages of 12 bits of Apollo rock apparently melted in 12 different impacts. They found a flurry of impacts 3.9 billion years ago but none older. If the impact rate had simply tailed off from the formation of Earth and the moon about 4.5 billion years ago, as dynamical astronomers insist it must have, Dalrymple and Ryder should have found impact melts as much as 4.2 billion or 4.3 billion years old. Failing that, they concluded that a burst of impacts 3.9 billion years ago had overwhelmed the few impacts that preceded it.

    Lunar sample freebie. Meteorites blasted off the moon hint at an earlier cataclysm of impacts.

    The lunar cataclysm wasn't immediately accepted, however. Critics such as planetary scientist William Hartmann of the Planetary Science Institute in Tucson, Arizona, pointed out that the apparent surge might just mean that cratering was obliterating all traces of earlier impacts until it gradually slowed to the point where some could survive. And all of the dated moon rocks had come from the equatorial region of the moon's nearside, Hartmann noted, where one or two of the huge, basin-forming impacts could dominate the record.

    With the geographic constraints in mind, Cohen and her colleagues turned to the other source of moon rocks, the meteorites blasted off the moon by large impacts. Cohen chose four lunar meteorites containing bits of impact melt and dated 31 of those bits representing at least seven different impacts. None was older than 3.9 billion years. More telling, none contained the distinctive “KREEP” material (rich in potassium, rare earth elements, and phosphorus) that covers much of the nearside and tags all Apollo samples. The lunar meteorites seem to have sampled the moon far from Apollo landing sites, even on the moon's farside.

    The latest results from the moon are pushing even the doubters toward a lunar cataclysm. “When we started this study,” says Swindle, “I thought this would be the way to disprove it. We haven't proved there was a cataclysm at 3.9 billion years, but it passes the test.” Hartmann agrees, and he now concedes that obliteration of an earlier impact record may be harder than he had thought. “The way out may be a compromise scenario,” he says. “Maybe there was a fairly big spike [superimposed on the tail] 3.9 billion years ago, and we're just arguing over how big that spike was. But you would still have the serious problem of where you store this stuff for 600 million years” before dropping it on the moon.

    Astronomers still don't have any good idea of the cataclysm's source. Simulations show that the gravity of Earth and the other terrestrial planets would have cleared the inner solar system of threatening debris within a few hundred million years. Collisions in the asteroid belt can shower Earth with debris, notes Brett Gladman of the Observatory of Nice, but a cataclysm would require the breakup of a body larger than 945-kilometer Ceres, the largest asteroid. The chance of that happening any time in the past 4.5 billion years is nearly nil, he notes.

    As a last resort, researchers look to the outer reaches of the solar system. Dynamical astronomer Harold Levison of the Boulder, Colorado, office of the Southwest Research Institute and colleagues show in a paper to appear in Icarus how the newly formed Neptune and Uranus could have tossed icy debris, along with some asteroids, inward in sufficient quantities to resurface the moon, give Mars a warm and wet early atmosphere, and sterilize Earth's surface with the heat of the bombardment (Science, 25 June 1999, p. 2111). The only catch, says Levison, is that the two large outer planets would have had to form more than half a billion years later than currently thought. Levison is toying with the idea that Uranus and Neptune started out between Jupiter and Saturn, where his simulations suggest they could have orbited for hundreds of millions of years before flying out into the lingering debris beyond Saturn and triggering a late heavy bombardment. “That's my fairy tale,” he says. Maybe that's just what young planetary bodies need.