News this Week

Science  17 Dec 1999:
Vol. 286, Issue 5448, pp. 2238

    Capturing the Promise of Youth

    Gretchen Vogel

    In 1999, researchers recognized the extraordinary potential of stem cells, immature cells with the ability to become different kinds of tissue—and perhaps to heal many kinds of illness

    Old age may have wisdom, but it will always envy youth for its potential. The same might be said for cells, for a young cell's most important trait is its ability to choose among many possible fates and become, say, a neuron passing electrical signals or a blood cell carrying oxygen. Late last year, in a technological breakthrough that triggered a burst of research and a whirlwind of ethical debate, two teams of researchers announced that they had managed to prolong the moment of cellular youth. They kept embryonic and fetal human cells at their maximum potential, ready to be steered into becoming any cell in the body.

    Building on that achievement, in 1999 developmental biologists and biomedical researchers published more than a dozen landmark papers on the remarkable abilities of these so-called stem cells. We salute this work, which raises hopes of dazzling medical applications and also forces scientists to reconsider fundamental ideas about how cells grow up, as 1999's Breakthrough of the Year.

    Stem cells may one day be used to treat human diseases in all sorts of ways, from repairing damaged nerves to growing new hearts and livers in the laboratory; enthusiasts envision a whole catalog of replacement parts. Despite such promise, many in society object to using stem cells derived from human embryos, a debate that is sure to continue into 2000 and beyond.

    But another astonishing development that occurred in 1999 may ease the ethical dilemma. In defiance of decades of accepted wisdom, researchers in 1999 found that stem cells from adults retain the youthful ability to become several different kinds of tissues: Brain cells can become blood cells, and cells from bone marrow can become liver. Scientists are now speeding ahead with work on adult stem cells, hoping to discover whether their promise will rival that of embryonic stem (ES) cells. Thus, 1999 marks a turning point for this young field, as both science and society recognized—and wrestled with—our newfound power to manipulate a cell's destiny.

    Open future

    Researchers have long been familiar with certain types of stem cells. Back in 1981, they discovered how to culture mouse ES cells, treating them with just the right growth factors to keep them dividing but forever immature. Today such cells are common research tools, used, for example, to produce tens of thousands of knockout mice that lack specific genes. In humans, scientists had managed to identify and culture stem cells from several types of adult tissue, such as bone marrow and brain, but had done little work on stem cells from embryos.

    Then last November two company-funded teams announced that they had isolated and cultured human embryonic and fetal stem cells, coaxing them to pause their development before they became committed to any particular fate (Science, 6 November 1998, pp. 1014 and 1145). Many scientists worldwide were immediately eager to join the field but were hampered by rules in many countries that restrict public funding for research that destroys a human embryo. Instead, while ES cell research moved ahead in a few privately funded labs, ethical debates on the wisdom of using human embryos in experiments sprang up in all sorts of public forums.

    For example, the U.K. imposed a 1-year research moratorium to allow for public discussion, and a French high court advised lifting that country's ban on human embryo research. And a U.S. presidential advisory panel recommended that public funds be available for all types of stem cell research. In proposed rules released this month, the National Institutes of Health is a bit more restrictive, prohibiting researchers from using federal funds to derive cell lines from an embryo but allowing them to work with certain cell lines created using private funds (Science, 10 December, p. 2050).

    Career switching

    While the public and policy-makers mulled the ethical implications, researchers galvanized by last November's announcements rushed down another avenue of research—on stem cells from adults. Scientists had known for decades that certain kinds of stem cells lurk in adult tissues, for example in bone marrow and skin. But most scientists had assumed that the adult-derived cells have a limited repertoire. Just as years of training usually commit a concert violinist to a career in music, so scientists had assumed that when a young cell takes on an identity and turns various suites of genes on or off, that genetic programming irreversibly commits it to becoming one of just a few cell types.

    But this year several startling results have shown that in some cases, those early commitments can be rewritten. In January, Italian and U.S. scientists reported that stem cells taken from the brains of mice could take up residence in the bloodstream and bone marrow and become mature blood cells—a leap roughly equivalent to a music student becoming a successful professional baseball player. That means signals in the immediate environment can in some cases override a cell's history, implying that nature allows developing cells far more freedom than scientists had imagined.

    Many scientists initially balked at that idea, but a string of new results seems to back it up. Texas researchers found just a few weeks ago that muscle stem cells could become blood cells, for example. Other scientists reported that stem cells found in rat bone marrow could become liver cells—raising the tantalizing possibility that cells now routinely harvested in bone marrow transplants may have much broader uses. And this fall Pennsylvania scientists reported that mouse marrow cells injected into the brains of newborn mice could develop into brain cells.

    As such basic research leaped forward, biomedical scientists sought new ways to use stem cells to help people. For example, researchers this year found that healthy bone marrow stromal cells, which give rise to bone, could strengthen bones and prevent fractures when injected into three children with the bone-weakening disease osteogenesis imperfecta. And in a promising step toward a possible treatment for neural disorders such as multiple sclerosis, researchers in Boston injected neuronal stem cells from healthy mice into the brains of newborn mutant mice that lacked a key protein, part of the protective myelin sheath around neurons. The treated mutant mice produced the missing protein and were less prone to a characteristic tremor. Another group injected bone and muscle stem cells into mice that lacked the protein dystrophin—missing in patients with Duchenne's muscular dystrophy—and within 2 weeks the animals produced the missing protein.

    Applications of ES cells could be even more dramatic. This year another set of mutant mice, unable to produce myelin, received mouse ES cells in their brains; the cells soon spread through the brain and produced myelin. And Missouri researchers this month reported that mouse ES cells could help restore some movement to the limbs of partially paralyzed rats.

    With dramatic results like these coupled with growing public acceptance, the stem cell field is poised for progress. If it lives up to its early promise, it may one day restore vigor to aged and diseased muscles, hearts, and brains—perhaps even allowing humans to combine the wisdom of old age with the potential of youth.


    Pharmaceutical Research and Manufacturers of America stem cell site

    American Society of Cell Biology

    Coalition of Americans for Research Ethics

    AAAS/Institute for Civil Society report on stem cell research



    The Runners-Up

    The News and Editorial Staffs

    Science honors nine additional major discoveries in fields that span the universe, from the edgy dance of subatomic particles to the biological wizardry that imprints memories.

    First Runner-Up: Genomics Speeds Ahead. In December 1998, the publication of the first genome of a multicellular organism, the nematode Caenorhabditis elegans, ushered in a new era in genomics—that of rapidly comparing thousands of genes in complex organisms. This year every facet of genomic technology accelerated, from sequencing to database management. As a result, genomics swept through biology, as researchers compared the sequences as well as the expression patterns of many thousands of genes at once.

    In terms of sequencing, additional complex genomes are now due much sooner than predicted. The fruit fly genome is expected in the next few months rather than years, and the international, publicly funded effort to determine the order of the 3 billion bases that make up the human genome has kicked into hyperdrive, promising to deliver a rough draft of at least 90% of the genome by March 2000. Indeed, the first installment, the sequence of chromosome 22, arrived in 1999. Also this year, the U.S. and British governments pledged more than $100 million to help sequence the mouse genome, which is about the same size as our own and contains many of the same genes.

    At the same time, researchers polished off the genomes of several key microbial pathogens, including Chlamydia pneumoniae, which causes respiratory infections; the food-borne pathogen Campylobacter jejuni; and a virulent strain of Mycobacterium tuberculosis. And two chromosomes of the malaria parasite Plasmodium falciparum were sequenced, and two teams mapped its genome. All these data should yield new targets in the battle against these pathogens.

    The outpouring of sequence data also sparked an explosion of technologies to integrate and analyze the flood of information—and thereby to probe questions ranging from how the immune system responds to how humans evolved. Many scientists turned to DNA-chip and low-budget microarray technologies, which allow them to compare gene expression rapidly in adults and embryos, or in healthy and diseased tissues. Faced with a data deluge, bioinformatics experts this year revved up their efforts to link different kinds of information, hoping to allow researchers of all stripes to capitalize on the growing genomic riches.


    TIGR Microbial Database

    How the other half cools. Four years ago, researchers cooled rubidium atoms into a quantum-mechanical superparticle called the Bose-Einstein condensate, a stunning feat lauded as Molecule of the Year in 1995 (Science, 22 December 1995, p. 1902). This year, physicists at the same lab knocked over another milestone, cooling a vapor of ornery atoms in the first step toward creating an even more bizarre macroscopic quantum state.

    Every particle of matter comes in one of two types: bosons and fermions. Bosons are content to share the same quantum state, as photons do in laser light and as rubidium atoms do in a Bose-Einstein condensate. But fermions, which include electrons, quarks, and about half the atoms in the periodic table, hate to be forced into the same state; instead, they try to occupy adjacent states, as electrons do when filling energy shells around an atom.

    This year, however, Colorado researchers discovered a new trick to help tame wild fermions. The scientists put fermionic potassium atoms into different quantum states and induced atomic collisions between them. The collisions carried energy away from each fermion, leaving the resulting vapor ever cooler and putting the gas as a whole into a state in which atoms are precisely arranged in a ladder of energies.

    This cooling method clears the way for an even more mind-boggling possibility: At still lower temperatures, fermions may be coaxed into pairing up into bosonlike composites, as electrons do when they travel without resistance through superconductors. Such an achievement may lead to the next state of quantum matter: an atomic superfluid of fermions, the first step toward a new generation of atomic clocks and lasers.


    JILA site on BEC studies

    Georgia Southern University's site, with BEC links

    Journal of Research of the National Institute of Standards and Technology, special issue on BEC

    Rally for ribosomes. After a prolonged drum roll, the curtain came up this year on the structure of one of the cell's most important players, the ribosome. This massive protein-RNA complex produces proteins, somehow accurately translating genomic information into each of the tens of thousands of molecules necessary for life. Structural biologists had struggled for decades to probe this complex molecular machine but were thwarted until recently by its intricate tangle of 54 proteins and three RNA strands.

    In 1999, however, three x-ray crystallography groups—capitalizing on earlier, blurred glimpses of the molecules plus innovative ways to analyze x-ray data—partially resolved the structures of the ribosome's two subunits. In addition, a fourth team described a slightly less detailed structure of the entire ribosome, showing how the two components interact. That group also captured the ribosome bound to messenger RNA, which carries the genetic data needed to make proteins, as well as to transfer RNA, which supplies the amino acids needed to build the proteins.

    This first good look at the ribosome's internal connections only made researchers eager for more—in particular, for new views of this star cellular player in action. In the next year or two, expect structures with near-atomic-level resolution to reveal the precise connections between each protein and RNA. And advances in cryoelectron microscopy, in which electron microscopy specimens are quick-frozen in glassy ice, promise to image ribosomes at work as they move along the messenger RNA. For ribosome fans, 1999 opened what promises to be a long, successful run of discoveries.

    Plentiful planets. Theorists think the universe is teeming with planets, but astronomers detected the first extrasolar planet only 4 years ago. They spotted the wobbles induced in its parent star, an indirect method that left room for skepticism. But in 1999 astronomers unequivocally confirmed the existence of exoplanets: They caught one transiting the face of its star, significantly dimming that star's brightness for a few hours every 3.5 days. Astronomers even sketched the planet's characteristics: It is 200 times more massive than Earth, has a diameter significantly larger than Jupiter's, and like Jupiter is composed mainly of hydrogen and helium.

    Also in 1999, another team found tantalizing (but still unconfirmed) evidence of starlight reflected off a different exoplanet. And discoveries of new planets flowed in steadily all year, bringing the total number of known extrasolar planets to nearly 30. The latest haul even contains worlds that orbit in the habitable zone of their parent stars, where liquid water and life could exist. Unfortunately, all planets detected so far are gas giants like Jupiter, lacking a solid surface. Moreover, most of them move in very eccentric orbits, unlike the planets in our solar system, leaving theorists scrambling to explain why. Although the string of new data confirms that planets are common, at this stage it seems that planetary systems configured like our own are rare indeed.
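A transit's depth follows from simple geometry: the planet blocks a fraction of the star's disk equal to the ratio of their cross-sectional areas. As a rough sketch (the radii below are illustrative assumptions for a planet somewhat larger than Jupiter crossing a Sun-like star, not figures from the article):

```latex
% Fraction of starlight blocked during a transit:
\frac{\Delta F}{F} \;\approx\; \left(\frac{R_p}{R_\star}\right)^{2}
% Assuming R_p \sim 1.3\,R_{\mathrm{Jup}} and a Sun-like star
% (R_\star \approx 10\,R_{\mathrm{Jup}}):
\left(\frac{1.3}{10}\right)^{2} \;\approx\; 0.017
\quad\text{(a dip of roughly 1--2\%)}
```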


    Discovery of Extrasolar Planets site

    The Extrasolar Planets Encyclopaedia

    Making memories. From an old nursery rhyme to the name of someone you just met, memories of all kinds are etched into your brain by a molecular cast of epic proportions. The actors in this drama are molecules rushing into and out of the synapse, the gap between neurons. Their movements strengthen the connections between certain neurons, making those synapses more likely to fire in the future. Exactly how the enhanced firing calls up and stores memories remains a mystery, but this year researchers used new methods to film a neuron's most intimate connections for the first time.

    Researchers have long cast one molecule, the NMDA receptor, in a starring role in learning and memory: When activated, this receptor makes a neuron more sensitive to incoming signals, a process called long-term potentiation (LTP). But this past spring the spotlight swung to an unsung molecule called the AMPA receptor. Using a new laser microscopy technique, researchers watched fluorescently labeled AMPA receptors stream into the little nubs known as spines that form the receiving end of synapses. The AMPA receptors then allowed activation of the spines' NMDA receptors, triggering more LTP.

    Also playing at the synapse film festival were movies made by the same technique that showed brand-new spines pop up next to synapses that had undergone LTP. And last month researchers showed that those new spines quickly form twin synapses next to the original, effectively doubling its strength.

    Another celebrated result came in September, when a Princeton team genetically engineered mice to produce more of a certain NMDA receptor found in young animals. The altered mice were better at learning mazes, dramatically demonstrating the large-scale effects of this molecular drama.


    Site showing movies of new synaptic spines being born

    Flat and happy. Cosmologists can hardly believe how their luck has changed. New observations often send theorists scurrying back to their equations, but this year, measurements of one of the cosmos's most basic numbers gave the answer theorists had hoped for: The universe appears to contain just the amount of matter and energy that the most elegant picture of its origins requires.

    This theory, called inflation, holds that a burst of expansion in the first instant of time stretched space almost perfectly flat, so that parallel light rays stay parallel forever. But until recently, the universe seemed irredeemably curved. Because mass can change the shape of space-time, cosmologists thought a flat universe implied a specific density of matter. And as hard as they tried, they could only find a third of the density needed.

    Last year, a possible salvation appeared: Measurements of distant exploding stars suggested that a mysterious energy in empty space is accelerating the expansion of the universe. Energy can curve space just as matter can, and the finding suggested that energy might fill in for matter and flatten the universe (see last year's Breakthrough of the Year, 18 December 1998, p. 2156).

    Now, the cool cosmic glow called the microwave background has yielded a sign that it does. The microwave glow retains the faint imprint of lumps in the newborn universe, in the form of slightly hotter and cooler regions. These fluctuations are a kind of test pattern for cosmic geometry: Space is flat if their apparent size on the sky tends to be about 1 degree of arc. Observations from high in the Andes, the South Pole, and balloons now agree that the 1-degree peak is there, and theorists are cheering.


    Max Planck Institute for Astrophysics site on cosmic microwave background experiments

    Tutorial on the cosmic microwave background, maintained by Wayne Hu of the Institute for Advanced Study

    New light on photonics. Semiconductors transformed the communications and computing industries by channeling electrons faster than giant old vacuum tubes could. And many observers expect that in the 21st century these industries will be transformed again, by photonic crystals, latticelike structures that have the potential to manipulate photons as semiconductors do electrons—but at the speed of light. In 1999, researchers hailed a string of firsts in crafting photonic structures, paving the way to true photonic optical devices.

    Photonic crystals are made of a lattice of materials of differing optical properties, which excludes light of certain wavelengths. Introducing a defect into the crystal allows the forbidden wavelengths to travel along the defect. This year, several independent groups demonstrated the crystals' potential as light guides. One team created a photonic bandgap mirror, which completely reflects light of certain wavelengths, and rolled it into a tube that can steer light around sharp corners without dissipating. Other groups took a different tack, using a pattern of holes in an optical fiber cable to trap light. These light guides can carry high-intensity light that could damage conventional optical fibers and promise advances in industrial welding and machining.

    Also this year, California researchers developed the first photonic crystal laser. Using photonic waveguides to link dozens or hundreds of such lasers within a crystal would create an integrated optical circuit, with uses ranging from telecommunications circuits to lightning-fast optical computers.


    The Ultimate Collection of Photonic Band Gap Research Links, maintained by Yurii A. Vlasov of the NEC Research Institute

    The Photonic and Sonic Band-Gap Bibliography, maintained by Jonathan P. Dowling of the Jet Propulsion Laboratory

    John Joannopoulos' photonic crystal research group at Massachusetts Institute of Technology

    Optoelectronics Group at the University of California, Los Angeles, headed by Eli Yablonovitch

    Axel Scherer's Caltech Nanofabrication Group

    The University of Bath Optoelectronics Group, headed by Philip Russell

    Tracking distant ancestors. Geochemists added a billion years to the history of complex cells in 1999, by finding chemical traces of eukaryotic cells—which make up plants, fungi, and animals, including humans—in rocks 2.7 billion years old. Their discovery not only rewrites our early history, it demonstrates the power of a promising technique that marries geochemistry and biology and may help researchers read the most ancient chapters in the history of life.

    Scientists have argued for decades over the origins of eukaryotes, which unlike bacteria or archaea have a nucleus. But clear fossil evidence has been sparse; the oldest undisputed eukaryote fossils are less than 2 billion years old.

    Although eukaryotes can be hard to distinguish in fossils, these cells do make more complex molecules than their simpler cousins, a fact researchers took advantage of this year. They extracted chemical residues from Australian shale and found traces of steranes, derivatives of cholesterol and related molecules that today are made only by eukaryotes. That suggests that eukaryotes—or at least their close precursors—are much older than many scientists had thought.

    This same rock sample also included molecules characteristic of cyanobacteria, or blue-green algae—microbes credited with producing much of the early Earth's oxygen—pushing back their origins by 600 million years. In response to the new finds, biologists are gathering more chemical and fossil evidence, hoping to fill in the gap between the two kinds of data.


    Mystery flashes unveiled. Gamma ray bursts—cosmic explosions that emit more power in a few seconds than the sun does in 10 billion years—have posed a mystery for more than 30 years. But since 1997, astronomers have employed an armada of ground-based telescopes to make rapid follow-up observations of the bursts' feeble afterglows. As a result, researchers now seem to be closing in on the true nature of the massive blasts: At least some are apparently the birth cries of black holes, formed when rapidly rotating, supermassive stars collapse upon themselves. Powerful jets of matter from the nascent black hole plow through the collapsing star, blowing it to smithereens in a titanic supernova explosion and producing huge amounts of gamma rays in the process.

    The first tentative link between a gamma ray burst and a supernova was made back in April 1998. Now this year's findings by teams from Pasadena, California, Chicago, and Amsterdam seem to close the case, by revealing the telltale glows of supernovae superimposed on the fading afterglows of at least three gamma ray bursts.

    Details of the explosion mechanism may fill in as astronomers continue to pounce quickly on the bursts. And two NASA satellites carrying rapid-response telescopes, to be launched in January and in 2003, are likely to shed even more light on the physics of these colossal explosions.


    NASA ‘Imagine the Universe!’ site

    Cosmic Gamma Ray Bursts site
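The power comparison in this item can be checked with back-of-the-envelope arithmetic; the only input is the solar luminosity, and the burst energies quoted are the commonly cited isotropic-equivalent figures, not values from the article:

```latex
% Sun's total energy output over 10 billion years:
E_{\odot} \;\approx\; L_{\odot} \times 10^{10}\,\mathrm{yr}
\;\approx\; (3.8\times 10^{26}\,\mathrm{W})(3.2\times 10^{17}\,\mathrm{s})
\;\approx\; 1.2\times 10^{44}\,\mathrm{J}
% A bright burst releases a comparable or larger energy
% (of order 10^{44}\text{--}10^{47}\,\mathrm{J}) in mere seconds,
% so its power vastly exceeds the Sun's lifetime-averaged output.
```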




    Peering Into 2000

    We turned our crystal ball on high to spot fields that may rise to the top in the first year of the new millennium. Here are Science's picks for areas to watch in 2000.

    Unraveling Alzheimer's. Next year should bring the acid test for the popular but controversial notion that amyloid plaques cause the mental deterioration of Alzheimer's disease. This year researchers tentatively identified two enzymes called secretases, which make particular cuts in a precursor protein and together free the plaque-forming amyloid fragments. In the next few months, drug companies plan the first trials of inhibitors of these enzymes in people.

    X-ray astronomy. By early next year, a trio of orbiting telescopes will be observing the x-rays that stream from the most energetic objects in the universe. NASA's Chandra, launched last summer, is already revealing as much detail at x-ray wavelengths as most scopes discern in visible light. This month, Europeans launched the X-ray Multi-Mirror mission to pick up even fainter x-rays, and a Japanese satellite tuned to the very shortest wavelengths is set to follow early next year. Expect new insights into exotic corners of the universe such as black holes, neutron stars, and supernova remnants.


    NASA's Chandra site

    Smithsonian Astrophysical Observatory Chandra site

    XMM site at the European Space Agency

    Japan's Institute of Space and Aeronautical Science satellite site

    Epigenetics explosion. As sequenced DNA piles up, expect genetics to take another great leap next year into epigenetics, the study of how DNA sequences are packaged and expressed. Epigenetic changes in gene expression, such as silencing one copy of a gene, aren't linked to DNA alterations. Yet studies have suggested that such changes can be passed from one cell to another and sometimes from parent to offspring, violating Mendelian principles. In 2000, watch for more insights on how epigenetics affects gene regulation, development, and even cancer.


    1999 Gordon Research Conference on Epigenetics Program



    Restoring rivers. Big plans are afoot to undo past damage wrought by diverting and damming U.S. waterways. Next year officials will mull over or begin huge projects aimed at restoring water flow to the Everglades, salmon runs in the Snake River, and wetlands in the San Francisco Bay delta. Much more than simple cleanups, these projects all have long timetables, hefty price tags—a proposed $7.8 billion for the Everglades alone—and their share of critics. In 2000, expect more debate as ecologists worldwide watch the outcome.

    New paths to nanocomputers. Researchers seeking to shrink computer chips to molecular dimensions have long been stumped by a key problem: how to wire up their minicomponents—mere molecules that perform as electrical switches and wires—into circuits in which every device works perfectly. This year, however, a handful of teams scored initial success with a bold strategy that allows some components to fail and simply routes around circuitry that doesn't work. And several teams created molecular and nanotube electronic devices that can connect to other components. Expect new types of components and more complex circuits in 2000.


    Hewlett Packard nanocomputing site

    U.S. National Science and Technology Council report on Nanostructure Science and Technology

    Farewell to polio. In the 1950s, polio crippled more than half a million children every year. But a World Health Organization (WHO) campaign to wipe the disease off the planet by the end of 2000 has slashed cases by more than 90% since 1988 and pushed the virus into ever-shrinking corners of South Asia and Central Africa. Still, 1999 brought thousands of scattered new cases and a major outbreak in Angola. Boosted by an infusion of cash from Bill Gates and Ted Turner, WHO has stepped up immunization and surveillance, and observers say that the all-but-final blow to this scourge is likely to come next year.


    WHO site, A World Without Polio

    The Polio Information Center Online (PICO)




    Scorecard '98

    Each year we choose six fields that we predict will make news in the next 12 months. But forecasting is no exact science; here's whether Science's crystal ball was cloudy or clear.

    Coming of age, slowly.

    This year's crop of studies linked genes that control aging to key hormones such as insulin. And as predicted, there was a flurry of work on telomeres, DNA sequences on the ends of chromosomes thought to shorten with age. But the telomere studies didn't deal directly with aging, and the basic biology behind why and how cells—and people—grow old remains a mystery.

    Biowarfare worries.

    Although there were no attacks, last year's concerns about bioweapons grew more visible in 1999, as did efforts to counter the threat. A new book, Biohazard, documented a former Soviet scheme to pack long-range missiles with pathogens. The U.S. Department of Defense began vaccinating troops against anthrax attacks by rogue nations or terrorists, and other agencies announced growing budgets for biowarfare defense.


    U.S. Defense Advanced Research Projects Agency Web site on countermeasures R&D

    Crystal-clear materials.

    1999 was a banner year for photonic crystals—latticelike structures that promise to control photons the way semiconductors control electrons (see Runner-Up, p. 2242). Researchers reported the first photonic crystal laser and developed photonic crystal waveguides that steer light around sharp corners, a key step toward crafting optical circuits.

    Still seeking a sink.

    Despite a busy year, climate researchers puzzling over where Earth's excess carbon dioxide is being stored have yet to answer the question. A study of land-use records since 1700 found that North America's plants may not be sopping up massive amounts of CO2, as proposed in 1998. But researchers are still waiting for new data on how much carbon is sequestered by soils, oceans, and forests.


    Overview of sink debate from Global Change magazine

    USGS sink research site

    No relief from sneezing and wheezing.

    Some clues surfaced as to the complex causes of allergies and asthma, including new data suggesting that a treatment's success may depend on the patient's genetic makeup. But although many clinical trials began in 1999, they have yet to produce good news for patients.


    American Academy of Allergy, Asthma, and Immunology

    National Jewish Medical and Research Center site

    Climate mysteries.

    Understanding climate swings from millennium to millennium is proving to be just as hard as forecasting the atmosphere's day-to-day gyrations. This year researchers linked the Little Ice Age of the 18th century with a 1500-year climate cycle, and they concluded from layered sediments in a high Andean lake that El Niño has waxed and waned over millennia. But the underlying causes of such long-term swings remain elusive.





    NASA Tangles With the Metric System

    The year's prize snafu struck in September, when the $87 million Mars Climate Orbiter (MCO) plowed deep into the martian atmosphere and perished. Within days, the root cause surfaced: Navigators had inadvertently used English units of pound-seconds instead of metric newton-seconds as they guided the spacecraft. But as in most catastrophic failures, a slew of otherwise harmless oversights and miscommunications combined to turn the English-metric confusion lethal.
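The scale of the mix-up is easy to see with a little arithmetic. The sketch below is purely illustrative (it is not NASA flight software); it only shows the well-known conversion factor of 4.44822 newtons per pound-force, which means an impulse delivered in pound-seconds but interpreted as newton-seconds is wrong by a factor of about 4.45.

```python
# Illustrative sketch, not NASA's actual navigation code: why confusing
# pound(-force)-seconds with newton-seconds matters. One pound-force is
# 4.44822 newtons, so a thruster impulse logged in lbf*s but read as
# N*s understates the true impulse by a factor of ~4.45.

LBF_TO_N = 4.44822  # newtons per pound-force

def lbf_s_to_newton_s(impulse_lbf_s: float) -> float:
    """Convert an impulse from pound-force-seconds to newton-seconds."""
    return impulse_lbf_s * LBF_TO_N

# A firing reported as 10 lbf*s is really about 44.5 N*s; software
# expecting metric units but fed the raw English-unit number would
# model a push ~4.45 times weaker than the one actually delivered.
reported = 10.0                       # value delivered in lbf*s
actual = lbf_s_to_newton_s(reported)  # true impulse in N*s
error_factor = actual / reported
```

Accumulated over months of small trajectory-correction firings, a systematic error of that size is more than enough to drop a spacecraft far below its intended approach altitude.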

    The curse of the Red Planet continued this month when the Mars Polar Lander spacecraft, due for a 3 December touchdown, went missing. This time the problem wasn't units of measurement—a beefed-up team of navigators had honed the spacecraft's approach to within a gnat's eyelash, with nary a pound-second in sight. But nothing has been heard since the spacecraft went quiet, as planned, during final approach. That leaves MCO's lesson to be remembered: Talking “faster, cheaper, better” isn't necessarily the same as achieving it.



    Creationists Win in Kansas

    1. Constance Holden

    Scientists who realized long ago that evolution is the key to biology—and assumed that the rest of the world agreed—got a rude awakening courtesy of the Kansas State Board of Education, which in August voted to drop evolution from statewide science teaching standards. Exactly what students are taught will depend on their teachers, but as the standards form the basis for new state tests, students won't have to know much about evolution at exam time. And the Kansas educators tossed more than Darwin into limbo—they worded other passages of the standards so as to cast doubt on the big bang and radiometric dating too.

    “Where is the evidence for that canine-looking creature that somehow has turned into a porpoise-looking creature?” wondered school board chair Linda Holloway on NBC TV. “I haven't seen that evidence.” Kansas promptly became a laughingstock in some quarters. “Uh, sorry Dorothy, it's Kansas all right—Oz is not this strange,” gibed Robert Park of the American Physical Society in his newsletter.

    But Kansas educators are not alone. Since 1996, Alabama textbooks have carried a disclaimer saying that evolution is only a “controversial theory.” And even though a Louisiana appeals court ruled against a similar practice—oral disclaimers by teachers—in August, Oklahoma ordered publishers this fall to insert a disclaimer just like that used in Alabama. And just as newly elected scientists on New Mexico's school board got that state's evolution prohibition reversed this fall (Science, 22 October, p. 659), a Kentucky panel was eliminating “evolution” and substituting “change through time” in its standards.

    Perhaps the most disturbing aspect for scientists is that these moves reflect widespread beliefs: About 35% of American adults think that the Bible is literally true, including its accounts of creation and the age of Earth, compared to only 7% in the U.K., according to the International Social Survey, which is run from the University of Chicago. And in a 1997 Gallup poll, 68% of Americans said that “creationism should be taught along with evolution” in public schools.

    Presidential candidates have been quick to pick up on that sentiment: When queried after the Kansas decision, not one took an unequivocal stand in favor of evolution. “I believe children should be exposed to different theories about how the world started,” said Republican front-runner George W. Bush. “I personally believe my children were not descended from apes,” added Republican Gary Bauer. Even Vice President Al Gore couldn't resist a waffle: He favors teaching evolution, said a spokesperson, but “localities should be free to teach creationism as well.” Gore later backed off, saying creationism should be confined to religion courses.

    All this suggests that the future of evolution in many U.S. schools is “grim,” says Eugenie Scott of the National Center for Science Education in El Cerrito, California. All the same, scientists can stem the tide, say Scott and others; observers note that four of the five religious conservatives on the Kansas board will be up for reelection next year. Science will win out in the end, predicts Wayne Carley, head of the National Association of Biology Teachers in Reston, Virginia. “It took 300 years to accept Galileo. Another 300 years and maybe the U.S. will catch up with the rest of the world.”


    National Center for Science Education

    The Creation Science Association for Mid-America

    Kansas Association of Biology Teachers



    GM Foods Under Attack

    The debate over genetically modified (GM) foods exploded in 1999, becoming a worldwide public relations disaster for the biotech industry and casting science in the role of villain. Although most fiery in the United Kingdom, where headlines warned of “the horrors of GM foods” and “the mad forces of genetic darkness,” the public fervor spread through Europe, leading the European Union to suspend the introduction of new GM crops pending new legislation, which could be 3 years away (Science, 26 November, pp. 1662-1668).

    The effects even reverberated in the United States, where the GM revolution had been proceeding all but silently. U.S. farmers, who planted roughly half of their corn, cotton, and soy fields with transgenic crops this year, watched with dismay as their export markets shrank. And at recent public hearings organized by the Food and Drug Administration, many speakers voiced concerns that the crops—often tailored to resist insects or herbicides—might be hazardous to human health or could cross-pollinate with wild plants and create “superweeds.”

    This eruption of public feeling was fueled by a few critical studies published this year. One showed, for example, that monarch butterfly caterpillars in the lab died when fed transgenic pollen containing Bt, a bacterial insecticide. Another reported that rats' gut linings were somewhat swollen after the animals ate transgenic potatoes. But such work was preliminary and controversial—it's not clear yet how much Bt caterpillars would eat in the wild, and Britain's Royal Society called the potato study “deeply flawed.”

    Critics have pointed out that the lack of good data works both ways—there's no clear evidence that transgenic crops are totally innocuous. But this year's turbulence seems to have stemmed less from data and more from the public's knee-jerk reaction to moving genes from one species to another, a fear sometimes compounded by peculiar political circumstances. In Great Britain, for example, public trust in food safety laws had been eroded by the government's attempts to play down the mad cow disease crisis in the early ’90s. British biotech opponents also happen to have a prominent organic farmer on their side: the Prince of Wales, who once likened tinkering with genes to playing God. Resistance may have been further inflamed by unspoken resentment against big corporations, perhaps with a pinch of anti-Americanism; that might explain why U.S. biotech giant Monsanto was a favored target, and why French farmers directed their anger at local McDonald's.

    Whatever the causes, the power of protest groups is a fact of life that agricultural biotech firms are slowly learning to deal with. Meanwhile, they hope resistance will fade within a few years, as new GM fruits and vegetables with extra vitamins or antioxidants tempt consumers. The new wave could be a real boon for developing countries. Rice with added vitamin A and iron could help prevent blindness and anemia in millions, for example.

    But attitudes toward GM foods will have to thaw considerably before such benefits materialize. It may take years before we know whether the 1999 backlash was a mere ripple in the introduction of biotech crops—or whether millions of consumers have renounced them for good.


    Monsanto's “Life Sciences Knowledge Center”

    Greenpeace's genetic engineering campaign



    Gene Therapy Death Prompts Review of Adenovirus Vector

    1. Eliot Marshall

    For the past 3 months, one-third of the 250 faculty and staff members connected with the University of Pennsylvania's Institute for Human Gene Therapy have been studying a single case. They've been trying to understand why Jesse Gelsinger, a relatively fit 18-year-old with an inherited enzyme deficiency, died on 17 September, 4 days after doctors at Penn injected a genetically altered virus into his liver.

    Gelsinger was the first patient in a gene therapy trial to die of the therapy itself, as James Wilson, who heads the Penn institute, confirmed at a public meeting last week. His death is the latest blow to a field that has been struggling to live up to the promise and hype surrounding the first gene therapy trials a decade ago. And Penn isn't the only one investigating the accident; two federal health agencies and many companies with a stake in the field are digging into the details. Preliminary results of those investigations suggest that a central factor in the tragic events that led to Gelsinger's death is one that has dogged the field from its inception: the difficulty of transferring genes to human cells and getting them expressed.

    Wilson, the chief of Penn's clinical team, appeared with co-investigators Mark Batshaw and Steven Raper at a special public meeting at the National Institutes of Health (NIH) in Bethesda, Maryland, on 8 and 9 December to examine what went wrong. He faced members of the Recombinant DNA Advisory Committee (RAC)—a safety group that advises the NIH director—and a special RAC working group headed by geneticist Inder Verma of the Salk Institute in La Jolla, California. With his peers sitting in front of him on an elevated stage and a large audience at his back—including Gelsinger's father, reporters, and photographers—Wilson described in excruciating detail how Gelsinger had died. It was a tense session.

    After releasing stacks of clinical data and answering questions for 2 days, however, Wilson and colleagues said that they didn't fully understand what had gone amiss. They reported that the vector they used—a crippled form of adenovirus combined with a gene to control Gelsinger's ammonia metabolism (the gene for ornithine-transcarbamylase, or OTC)—invaded not just the intended target, the liver, but many other organs (see bar graph). This triggered an “activation of innate immunity,” the Penn clinicians wrote, followed by a “systemic inflammatory response.” Within hours, Gelsinger's temperature shot up to 104.5 degrees Fahrenheit. He went into a coma on the second day and was put on dialysis and then on a ventilator. His lungs filled with fluid. When it became impossible to oxygenate his blood adequately, he died.

    The Penn team had given Gelsinger a massive dose of the vector—38 trillion virus particles, the highest dose in this 18-patient trial—to try to get enough functioning OTC genes into his liver. But even so, only 1% of the transferred genes reached the target cells. None of the patients in the trial showed significant gene expression.

    Why weren't the high-dose effects foreseeable, and why were so few genes transferred? Animal trials had indicated a higher transduction rate, but the Penn team doesn't know why the adenovirus vector works less efficiently in humans. As for the toxicity, they suggested that Gelsinger's reaction was an anomaly. At the meeting, Wilson and his colleagues reviewed earlier animal and human data, which they said gave no hint that their dose-escalation study was moving into a lethal range. Other clinicians who spoke at the meeting described “manageable” adenovirus toxicity that they had seen in earlier trials, most of them at lower doses, but Wilson spoke of running into a surprisingly steep “elbow” of toxicity in the dose-response curve.

    Looking for clues to what happened, the Penn team discovered that Gelsinger's bone marrow was severely depleted of erythroid precursor blood cells—which could not have happened quickly, they felt. To Wilson, it suggested that Gelsinger might have had an undetected genetic condition or a parvovirus infection, either of which might have helped trigger the harsh immune response. Wilson also noted that the vector had been taken up most rapidly by immune cells—“not encouraging” for the efficiency of future trials, he said, because it meant that these cells, not target cells, would be affected first and possibly disseminate the immune response.

    Most gene therapists at the meeting agreed that Gelsinger's reaction was unusual. For example, Robert Warren, a clinician at the University of California, San Francisco, said that he had not seen serious toxicity—except in one patient who lost blood pressure—in the more than 40 cancer patients he has treated with adenovirus vector. Ron Crystal of Cornell University's New York Hospital said that he had administered similar vectors 140 times and seen a serious immune reaction only in one cystic fibrosis patient in 1993.

    One researcher publicly challenged the notion that Gelsinger's reaction was unusual. Art Beaudet, chair of molecular and human genetics at Baylor College of Medicine in Houston and a member of RAC's special working group, said: “You don't need to evoke anything weird” in this case. Because adenovirus is known to trigger an immune response and because the Penn team kept raising the dose, Beaudet suggested, a sharp immune reaction might not be so strange. Thomas Caskey, a research executive at Merck & Co. of Whitehouse Station, New Jersey, also told Science that he thought the Penn trial had been “pushing the edge of the envelope.” Beaudet and Caskey are both testing a new, “gutless” adenovirus vector stripped of all native genes, which they say has eliminated most toxic immune responses in animals. It hasn't been used in clinical trials, however.

    Members of the RAC and the special working group praised the investigators for sharing data and ended the session with mild proposals. Summing up, Verma urged clinicians to adopt a common index for dosing patients and a standard measure of virus particle concentration. These are “obvious” ideas, he said. Others suggested that RAC assemble a database on vectors and their effects; that clinicians screen patients more carefully; and that researchers collect better data on the fate of vectors in the body.

    Gelsinger's death “did some damage to gene therapy, and it did some damage to clinical research,” a key federal official said when the public meeting was over. The Food and Drug Administration has already announced that it found “deviations” from the protocol in Penn's conduct of the trial, including a decision to treat Gelsinger even though he had an ammonia level before the trial that was 30% to 60% higher than the agreed limit. The FDA is expected to issue a report and possibly a reprimand soon. The RAC and another NIH advisory group are thinking about how to improve the monitoring of gene therapy and will send recommendations to the NIH director for action early next year.


    Promising Antibiotic Candidate Identified

    1. Martin Enserink

    The ever-increasing resistance of pathogenic microbes to antibiotics has raised the specter of a “postantibiotic era” in which doctors are powerless to treat many bacterial infections. Even vancomycin, a longtime antibiotic of last resort, appears to be losing its punch. Now researchers have come up with a possible replacement—one that they say could lead to a whole new generation of antibiotics. It's not a brand-new invention, however. In fact, it has been used as a food preservative for over half a century.

    The compound, nisin, is a peptide—a small proteinlike molecule—produced by Lactococcus lactis, a bacterium that can turn milk sour, to kill its competitors. Microbiologists have known for decades that nisin is an effective killer, but they never knew exactly how it worked. On page 2361, a Dutch-German team provides an answer. They show that the peptide latches onto a molecule on many bacterial cell membranes known as Lipid II—the same target used by vancomycin. The result suggests that nisin, or derivatives of it, could one day replace vancomycin as a broad-range antibiotic. And because nobody has ever found a bug that is resistant to nisin, the researchers hope that nisin and related compounds might trump the problem of bacterial resistance. “This looks like a very significant contribution,” says biochemist Norman Hansen of the University of Maryland, College Park.

    Nisin is only one of many antimicrobial peptides that have recently caught researchers' interest; others have been isolated from frog skin, insects, and plants, for instance. Most kill bacteria by sticking to and punching a hole in their fatty cell membranes. Some studies suggested that nisin might work the same way. But whereas large quantities of the other peptides are needed to do the job, limiting their usefulness, nisin always stood out because it is effective at concentrations up to 1000 times lower, making it a popular preservative for dairy and many other products. “There has always been this question about why nisin is different,” says Tomas Ganz, who studies antimicrobial peptides at the University of California, Los Angeles.

    Other work had suggested that this might be because nisin has a different mode of action: Like vancomycin, it might bind to Lipid II, which is a precursor of the bacterial cell wall, a tough protective layer that lies outside the membrane. The binding would rob bacteria of their ability to build cell walls, eventually killing them.

    The new study, from biochemists Eefjan Breukink and Ben de Kruijff of Utrecht University in the Netherlands, together with colleagues at three other institutions, indicates that both explanations are in fact partially correct. For example, the researchers found that nisin resembles magainin, a peptide antibiotic derived from frogs, in that it kills Micrococcus flavus bacteria within a few minutes by forming pores in the cell membrane. But they also found that vancomycin interferes with nisin's ability to make these holes, presumably because it competes with the peptide in binding to Lipid II. Conversely, when the researchers fused the bacterial membranes to artificial membranes loaded with extra Lipid II, nisin's pore-forming power was bolstered.

    Apparently, says Breukink, Lipid II is a special key that nisin uses to punch its deadly holes—a key that other antimicrobial peptides lack. He does not yet know exactly how Lipid II helps nisin form pores. But he is sure that the peptide attaches to a different part of the lipid than vancomycin does, which may explain why bacteria have become resistant to vancomycin but not to nisin.

    Now that researchers know that Lipid II is such an Achilles' heel for bacteria, they can try to devise a whole range of compounds that exploit it. The low doses needed would reduce the risk of side effects, Ganz says, and could help make the drugs economically feasible. And by tinkering a little with the nisin gene, researchers could easily produce many slightly different derivatives, for instance if resistance arises. “This holds the promise of giving access to huge numbers of antibiotics through relatively simple means,” says Hansen.

    But many hurdles will have to be overcome. For one, nisin can only kill Streptococcus, Staphylococcus, and other so-called gram-positive bacteria. Another problem is that peptides have a short lifetime in the body and a higher risk of triggering allergic reactions than conventional antibiotics have. Still, the new study may help motivate the pharmaceutical industry to overcome such obstacles, says Hansen: “They just have never recognized the potential of these antimicrobial peptides.”

  10. SPACE

    Europe Lofts X-ray Observatory

    1. Alexander Hellemans*
    1. Alexander Hellemans writes from Naples, Italy.

    To the relief and delight of engineers and x-ray astronomers, Europe's new space workhorse, the Ariane 5 launcher, deposited a $640 million x-ray observatory into orbit on 10 December. If all goes well, the European Space Agency's X-ray Multi-Mirror Mission (XMM) will capture images of very distant sources of fluctuating x-rays, such as those produced by black holes or supernova explosions.

    Onlookers at the Kourou, French Guiana, spaceport had their hearts in their throats as the Ariane 5 rocket lifted off. They remembered the fireworks caused by the first Ariane 5, which exploded in June 1996 while carrying a squadron of four space probes. The launch proceeded smoothly, however, and XMM was gradually brought into its final, elongated, 48-hour orbit that will keep it largely out of Earth's radiation belts, reports Giovanni Bignami, science director of the Italian Space Agency, who witnessed the launch. “The solar panels have also opened with no problem,” he says.

    The 10-meter-long spacecraft carries a set of three x-ray telescopes that together contain 58 mirrors with a total surface area of 120 square meters. These mirrors focus the x-rays onto charge-coupled device (CCD) cameras that capture images of the observed objects and also measure the wavelength of the x-rays. Two telescopes are also connected to diffraction gratings that spread out the x-rays according to wavelength so that researchers can study x-ray spectra with a much higher precision than that from the CCDs. XMM's scopes have lower resolution than those of Chandra, the x-ray observatory launched by NASA in July, but they excel at sensitivity—they are 5 to 15 times more sensitive, depending on the wavelength, and can pick up fainter or more fleeting signals. Bignami, who was the principal investigator for the prime focus CCD cameras until 1998, expects that because of its elongated orbit and better shielding procedures, the CCDs will not suffer the same radiation damage that has slightly impaired some of Chandra's detectors. The CCDs can be closed off with an aluminum shield whenever XMM enters the radiation belts near Earth or during a solar flare.

    Like Chandra and Astro-E—a Japanese observatory that will be launched to look at shorter wavelength x-rays in January (Science, 30 July, p. 652)—XMM will focus its attention on x-ray producers such as hot gases, supernova remnants, jets of material squirting out of exploding stars, and massive black holes at the centers of galaxies. Astronomers are anxiously anticipating XMM x-ray data from enigmatic black holes. Because their x-ray outputs can fluctuate rapidly, XMM's sensitivity will be an advantage: it can collect more photons in a shorter exposure time from weak sources, says Jeffrey McClintock of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts. With XMM, one can see “very special things a black hole can do and that nothing else can do,” he says.

    According to John Heise of SRON, the Dutch Foundation for Space Research in Utrecht, the Netherlands, this new generation of observatories that together cover the entire range of x-ray wavelengths will cause a revolution in x-ray spectroscopy comparable to the revolution caused by optical spectroscopy in astronomy in the 1930s. For example, because of the huge gravitational pull of a supermassive black hole, the gases rotating around it travel at speeds close to that of light. X-ray spectrometers will allow astronomers to directly measure the velocity of the gases in these accretion disks by looking at Doppler shifts. “This would, in my view, be the first real indication of the existence of a supermassive black hole,” says Heise. He also hopes that by turning its scopes on the faint x-ray afterglow of gamma ray bursts and tracking how the afterglow decays, XMM will help astrophysicists figure out what fuels these fantastically powerful explosions.
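The measurement Heise describes rests on a simple relation. For gas moving much slower than light, the line-of-sight velocity follows from the fractional shift of a spectral line: v ≈ c·ΔE/E₀, where E₀ is the line's rest energy and ΔE the observed shift. The sketch below is a minimal illustration of that non-relativistic approximation; the iron K-alpha rest energy (about 6.4 keV) is a standard value, but the observed energy used here is a made-up number, not XMM data.

```python
# Minimal sketch of the Doppler-shift velocity estimate described in
# the text. Non-relativistic approximation (valid for v << c): a line
# emitted at rest energy E0 but observed at energy E implies a
# line-of-sight velocity v = c * (E0 - E) / E0, with positive v
# meaning the gas is receding (line shifted to lower energy).

C_KM_S = 299_792.458  # speed of light, km/s

def doppler_velocity(rest_energy_kev: float, observed_energy_kev: float) -> float:
    """Line-of-sight velocity in km/s from an X-ray line's energy shift."""
    return C_KM_S * (rest_energy_kev - observed_energy_kev) / rest_energy_kev

# Hypothetical example: an iron K-alpha line (rest energy ~6.4 keV)
# observed at 6.2 keV corresponds to gas receding at ~3% of light speed.
v = doppler_velocity(6.4, 6.2)  # roughly 9,370 km/s
```

Speeds of that order and higher are exactly what make the innermost accretion disk around a supermassive black hole detectable: the approaching and receding sides of the disk smear a single emission line into a broad, skewed profile that X-ray spectrometers can resolve.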

  11. NIMH

    Mental Health Agency Shrugs Off Critics

    1. Constance Holden

    An advocacy group last week slammed the National Institute of Mental Health (NIMH) for spending money on topics the group says have little relevance to severe mental illness—everything from AIDS and Alzheimer's disease to research on vole mating behavior. But the criticisms, widely reported in the press, are misplaced, according to NIMH officials and many observers.

    The attack came from the National Alliance for the Mentally Ill (NAMI), a Washington, D.C.-based lobbying group for families of the mentally ill, and its research arm, the Stanley Foundation. For years NAMI has urged NIMH to spend a larger share of its resources on studying the most crippling and costly mental illnesses: schizophrenia, bipolar disorder, depression, and obsessive-compulsive disorder. Convinced that NIMH was paying only lip service to its concerns, NAMI appointed a committee led by Stanley Foundation director Fuller Torrey, a schizophrenia researcher formerly at NIMH, to review about $420 million worth of NIMH-funded projects in 1997. It concluded that only a little over one-third of this spending was for research on major mental illnesses, and of that only a small fraction went to clinical and treatment-related research. The panel also noted that NIMH was putting more money into AIDS ($60.2 million) than into schizophrenia ($57.1 million).

    Armed with the analysis, Torrey's group last week fired a public broadside at NIMH, accusing the institute of straying into research on diseases—particularly AIDS and Alzheimer's—and on basic neuroscience already being pursued by better endowed divisions of the National Institutes of Health. The panel also asserts that NIMH is supporting “almost no behavioral research that is relevant to severe mental illnesses,” instead probing matters such as infant sleep disorders and “how married couples [with new babies] make judgments about fairness in the division of housework.” Contends Torrey, “Many people at NIMH are very comfortable with a research portfolio which covers every form of human behavior ever described.” Taking a cue from former Senator William Proxmire and his Golden Fleece Awards in the 1970s, NAMI issued a list of projects it flogged as unworthy of funding.

    NIMH defends its research strategy. In a statement, the institute explained that the AIDS dollars were congressional earmarks but insisted the money has been well spent, as mentally ill people are at high risk for the disease. As for the non-AIDS research in its budget, which in fiscal year 2000 amounts to roughly $970 million, 80% goes to research directly related to mental illnesses. And NIMH points out that since 1997, it has launched four big clinical trials on major illnesses.

    Without singling out projects, NIMH director Steven Hyman told Science that there are a few studies on NAMI's hit list, taken on before his arrival at NIMH in 1996, that he's “not pleased to be funding.” Hyman promises to continue phasing out questionable or irrelevant research, which he says amounts to a trivial portion of his budget. Overall, however, Hyman says the NAMI report misses the mark, arguing that it presents “a very shortsighted and to me shocking eschewal of neuroscience.” Other advocacy groups have sprung to NIMH's defense. The American Psychological Society, for one, urged the agency to “stand firm in the face of these unwarranted and divisive attacks.” Indeed, says Elliot Gershon of the University of Chicago, a schizophrenia researcher formerly at NIMH, NAMI's complaints are outdated: The institute began shifting its focus away from behavioral studies and into the biology of mental illnesses years ago.

    NAMI leaders portray themselves as rendering a public service. “We're helping Dr. Hyman with our report,” claims schizophrenia researcher Irving Gottesman of the University of Virginia, Charlottesville, who helped compile it. “It will give him ammunition to resist encroachments on NIMH's original mission.”


    Changes to Missions Could Delay Science

    1. Andrew Lawler,
    2. Richard Kerr

    The silence from Mars is leading to a lot of talk on Earth. With two Mars probes lost in less than 3 months, NASA is hurriedly organizing a blue-ribbon panel to reexamine its ambitious plans for a series of flights that would bring back martian soil and rocks in 2008. Meanwhile, NASA managers are considering whether to send additional navigation and communication systems to Mars to guide future spacecraft, a safety step that could delay some experiments.

    NASA Administrator Dan Goldin was expected this week to name the members of a panel that will examine not just the future Mars program but also how the Pasadena, California-based Jet Propulsion Laboratory (JPL) and contractor Lockheed Martin of Bethesda, Maryland, managed the ill-fated Mars Climate Orbiter and Mars Polar Lander missions. “Whatever the panel says we ought to do, we're going to go fix it,” Goldin told CNN on 11 December. The panel's report is due in March. A separate failure review board, to be set up shortly, will examine what went wrong with the $165 million polar lander that failed to phone home after descending into the martian atmosphere on 3 December.

    Ed Weiler, NASA's space science chief, says JPL will develop a revised roadmap for Mars exploration, which the panel will then critique. The current plan includes launching an orbiter, lander, and rover in 2001; a 2003 launch of a lander and rover to collect samples; a 2003 mission to place small communications satellites around the planet; and a 2005 lander to gather up the samples and fly them back to Earth.

    The panel's most pressing task is to figure out what to do with the 2001 mission. Components that will ease the lander onto the surface, very similar to those on the polar lander, are already arriving at JPL in preparation for the planned April 2001 launch, says project manager George Pace of JPL. “We had confidence that the design was going to work [before the polar lander was lost],” he says. “What does it take to return it to flight status? That is a little hard to say. We don't know what the failure was.”

    To save weight and money, mission designers did not include a transmitter to send back flight data during the lander's entry into the atmosphere, its descent via parachute, and its rocket-assisted landing. Investigators will study all possible failure points, repeating the steps taken by its designers and by outside experts after the September loss of the Mars Climate Orbiter.

    Goldin appears reluctant to abandon NASA's plans for 2001, saying, “If there's any possibility that we could go back and land, maybe a little different way, we're going to do it.” The existing hardware also could be salvaged for a different mission. One option, says Weiler, is to turn the lander spacecraft into an orbiting telecommunications satellite with high-resolution cameras that could scout out safe landing sites for later missions and provide a stronger link between landers and Earth. Although some science would have to be postponed, he says, such an arrangement would boost the chances of success for later spacecraft.

    The additional navigational tools reflect NASA's view that martian geography may have contributed to the lander's failure. The craft was headed toward a poorly understood terrain in the south polar region. Although images returned from orbit by the Mars Global Surveyor showed the targeted landing site to be relatively smooth, “we don't have a lot of experience yet in interpreting [those] images,” says 2001 project scientist Stephen Saunders of JPL. “Mars looks like a completely different planet at a resolution of a few meters versus the tens of meters” available before Mars Global Surveyor, he says, and it could still harbor lethal hazards too small for Surveyor to see.

    Project scientists are currently eyeing two large zones as potential landing sites for the 2001 mission. “It could be we should put more emphasis now on the smoother area” just north of the equator near Sinus Meridiani, says Saunders. The region may be a dried, mineral-laden lake bed.

    So far, criticism of NASA and the Mars effort in Washington has been muted. Recent media polls show that a majority of the American public supports continued planetary research, and President Bill Clinton assured reporters on 8 December that he firmly backs Goldin's approach of “faster, cheaper, better” missions. For the moment, members of Congress seem willing to withhold judgment until the panel has had its say.


    Researcher Rebuked for 20-Year-Old Misdeed

    1. Michael Hagmann*
    1. With reporting by Robert Koenig in Berlin.

    The Max Planck Society, Germany's premier research organization, announced on Monday that its president will issue a formal censure to neuroscientist Peter Seeburg, director of the Max Planck Institute for Medical Research in Heidelberg, for publishing data in a 1979 paper that Seeburg has said were false.

    Seeburg's censure is the latest chapter in a drawn-out scientific melodrama involving a court battle between the University of California (UC) and biotech pioneer Genentech of South San Francisco over patent rights to engineered human growth hormone (Science, 11 June, p. 1752). Seeburg, a co-inventor on a UC patent at the center of the dispute, testified last April that shortly after he moved to Genentech in 1978, he took DNA samples that he had helped prepare while working at UC San Francisco. He also said he and Genentech colleagues falsified technical data in a Nature paper to cover up the origin of the samples. Prompted by this testimony, Max Planck president Hubert Markl earlier this year ordered a scientific misconduct investigation.

    Only after Genentech agreed to pay UC $200 million in mid-November to settle the suit could the Max Planck investigation finally conclude. While litigation was pending, UC did not answer Max Planck inquiries about whether Seeburg was allowed to take the samples, says Klaus Hahlbrock, Max Planck vice president and a member of the investigating committee. UC ultimately acknowledged that, at the time, there were no unequivocal regulations barring Seeburg from taking the samples, he says. Seeburg himself declared repeatedly that he felt he was entitled to do so according to “most scientists' ethical standards.”

    Although the committee didn't accept that argument entirely, the investigation focused on the alleged falsified information in the 1979 Nature paper. Seeburg's admission was hotly contested by co-author David Goeddel, then at Genentech and now chair of the South San Francisco biotech company Tularik, and several other former colleagues. The Max Planck committee took Seeburg's admission at face value, however. The committee concluded, says Hahlbrock, that “a falsified description in a publication cannot be tolerated, no matter if it dates back 20 years,” and recommended that Seeburg be censured—a rare and exceptional measure. The censure does not directly affect Seeburg's position at Max Planck, says Markl, but it “will be put into his personnel record.”

    For his part, Seeburg told Science that he may donate part of the $17 million he and four other former UCSF researchers will each receive as part of the settlement to a charity or research foundation. But most of all, he hopes that his censure will be the final episode in this painful saga and that he will once again be able to “concentrate on doing science.”


    Big Blue Aims to Crack Protein Riddle

    1. Robert F. Service

    IBM last week announced a $100 million research initiative to build a supercomputer 500 times more powerful than the current record holder. Dubbed “Blue Gene,” the technology test-bed's initial goal will be to model how proteins fold into the three-dimensional shapes that allow them to orchestrate life within the cell. If successful, the 5-year effort could allow drug researchers to go right from the sequence of a disease-related gene to the predicted structure of its protein, in order to identify targets for therapeutic drugs. Down the road, Blue Gene and its kin could also revolutionize other computationally intensive disciplines, such as modeling climate change and the evolution of galaxies.

    Researchers who model protein folding agree that Blue Gene's ability to run 1 quadrillion (10^15) mathematical operations per second (also known as 1 petaflops) will be a big step for the field. “Petaflops computers certainly make you salivate,” says Stephen Mayo, a protein-folding expert at the California Institute of Technology in Pasadena. It won't be easy to serve up this feast, however. Supercomputers built by IBM and Intel for the national weapons laboratories currently reign as the world's fastest, at 2 trillion operations per second (2 teraflops). “But there's no way to get up to a petaflops using [the same] technology,” says Monty Denneau, a mathematician at IBM's T. J. Watson Research Center in Yorktown Heights, New York, and Blue Gene's chief architect.

    The chief obstacle is power consumption and heat: Denneau says a petaflops machine that used the same amount of energy for each operation as current teraflops machines “would take a dedicated [power] reactor”—and would quickly immolate itself. To speed computation while cutting power consumption, IBM plans to come up with an “ultraminimalist approach” for both hardware and software, which will reduce the complexity of the processors but increase their ability to communicate and work in tandem. Both the new chips and the software are still on the drawing boards, but “I think the plan makes a lot of sense,” says Arvind, a computer architecture expert at the Massachusetts Institute of Technology, who goes by a single name.

    As a test for their new machine, Denneau and his colleagues have chosen one of the toughest challenges in biology. Inside cells, newly synthesized chains of amino acids take a second or less to fold into a functional protein. Every one of tens of thousands of atoms in the chain and surrounding water molecules pulls or pushes on its neighbors to determine the final shape. But even though researchers have measured the forces between atoms in great detail and can easily predict how a handful of amino acids will interact, precisely modeling the folding of proteins has been out of reach.

    The most ambitious efforts, called all-atom simulations, calculate the interatomic forces and their effects for every possible pair of atoms in the protein chain. Even for a small protein, an all-atom simulation of just a fraction of the folding process takes months of supercomputing time.
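    The scale of the problem comes straight from the pair counting. A toy Python sketch (illustrative only; it uses a simple Lennard-Jones potential, not the elaborate force fields of real biomolecular codes) shows how the work grows with the square of the atom count:

```python
import itertools
import math

def lj_force_pairs(positions, epsilon=1.0, sigma=1.0):
    """Accumulate pairwise Lennard-Jones forces over every atom pair.

    positions: list of (x, y, z) tuples. Returns per-atom force vectors.
    The loop visits N*(N-1)/2 pairs, which is why all-atom protein
    simulations are so expensive.
    """
    n = len(positions)
    forces = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i, j in itertools.combinations(range(n), 2):
        dx = [positions[i][k] - positions[j][k] for k in range(3)]
        r = math.sqrt(sum(d * d for d in dx))
        # Force magnitude from U(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6)
        sr6 = (sigma / r) ** 6
        f_mag = 24 * epsilon * (2 * sr6 * sr6 - sr6) / r
        for k in range(3):
            forces[i][k] += f_mag * dx[k] / r   # push atom i
            forces[j][k] -= f_mag * dx[k] / r   # equal, opposite on j
    return forces

# Pair count grows quadratically: 10 atoms give 45 pairs, 100 give 4950.
print(len(list(itertools.combinations(range(100), 2))))  # 4950
```

For the tens of thousands of atoms in a protein plus its surrounding water, every tiny time step means hundreds of millions of such pair evaluations, repeated over enormous numbers of steps.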

    IBM's answer, Blue Gene, will consist of more than 1 million processors, each capable of 1 billion operations per second (1 gigaflops), assembled on 64 racks. To reduce power consumption, IBM researchers are doing away with a type of fast but power-hungry on-chip memory called cache. In its place, they're moving onto the chip a slower type of memory called DRAM, which is traditionally located off the chip. This should bring big power savings by eliminating the need to go off the chip for additional data.
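    The headline numbers hang together arithmetically, as a quick back-of-the-envelope check confirms (the per-rack figure below is simple division, not an IBM specification):

```python
# Aggregate throughput: roughly 1 million processors at 1 gigaflops each.
processors = 1_000_000
gigaflops_each = 1e9                    # operations per second per chip
total_ops = processors * gigaflops_each
print(total_ops)                        # 1e15 ops/s = 1 petaflops

# Speedup over the fastest machines of the day (2 teraflops).
print(total_ops / 2e12)                 # 500x, matching the article

# Rough processors per rack, given 64 racks (illustrative division only).
print(processors // 64)                 # 15625
```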

    To compensate for the slower memory speeds, the IBM team is planning to turn to a technique known as multithreading, the computer equivalent of a multitasking commuter who eats breakfast, drives, and talks on the phone all at the same time. In this case, each processor will work simultaneously on eight separate problems. That way if a processor is waiting for a bit of data to come in from memory to complete one computation, it can still be working on others at the same time. Finally, because chip failures are inevitable in an array of a million processors, Blue Gene's software will be designed to reroute data to working devices if a processor or connection fails in midsimulation.
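    The same latency-hiding idea can be mimicked in software. In this rough Python analogue (a sleep stands in for a memory stall; real hardware multithreading switches contexts in a cycle or two, and the eight-task figure mirrors the article's eight threads per processor), independent tasks overlap their stalls instead of serializing them:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def computation(i, stall=0.05):
    """One of eight independent problems: stall on 'memory', then compute."""
    time.sleep(stall)   # stand-in for waiting on a slow DRAM fetch
    return i * i

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(computation, range(8)))
elapsed = time.perf_counter() - start

# Eight 50-ms stalls overlap rather than adding up: well under the
# 0.4 s a strictly serial run would take.
print(results)
print(round(elapsed, 2))
```

The payoff is the same as for the multitasking commuter: the processor's total useful work per second goes up, even though each individual task still waits just as long for its data.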

    Even with a new petaflops machine, it will take about 1 year to simulate the complete folding of a typical protein. And even then the protein-folding problem may not be solved. Peter Wolynes, a protein-folding expert at the University of Illinois, Urbana-Champaign, explains that the all-atom approach of computing the interactions between pairs of atoms may not be enough. It may turn out that to get the right answer, researchers will have to compute the interactions among many atoms at once as they tug and push on each other, which would vastly increase the problem's complexity and require still more computing muscle. “My suspicion is that you won't need all that additional stuff,” says Wolynes. But if it turns out you do, at the end of the day “you would have learned that you can't solve it.”

  NASA

    NRC Panel to Propose Station Institute

    1. Andrew Lawler

    Researchers have long criticized NASA for giving scientists short shrift in funding facilities and experiments on the multibillion-dollar international space station now under construction. Now the space agency is about to get some controversial advice on how to fix the problem.

    In a few weeks, Science has learned, a panel of the National Research Council (NRC) will recommend that NASA create a nongovernment institute to plan and manage science on the station. The report, requested by NASA, aims “to increase the visibility, voice, and clout of the research community” in the decade-long life of the station, says Cornelius Pings, president emeritus of the Association of American Universities and chair of the NRC panel drafting the document. Adds Michael Katovich, a physiologist at the University of Florida, Gainesville, and a panel member, “We scientists want to ensure that the best science gets done—that the station is used as a platform not for Mars exploration or for engineering goals, but for science.” But the report's suggestion for an institute seems certain to spark opposition from NASA centers involved in the space station. It also presents a formidable challenge in trying to reconcile the competing research interests of scientists and engineers from different disciplines, sectors, and countries.

    Although Pings declined to discuss details, others familiar with the draft report said the new organization should be designed to circumvent NASA's notorious bureaucracy. “It's awful; every experiment on Spacelab required a couple of truckloads of paper,” says New Jersey space consultant and panel member Judith Ambrus of the lab module designed to ride in the space shuttle bay. The new institute would give researchers a mechanism for access to NASA ground facilities, says one source, as well as financial and technical help in building experiments and planning and managing research time on the station. The NRC report will also suggest that crew members be specially trained to do primarily science during a tour on the orbiting base.

    The NRC report proposes creating the institute by around 2002, 3 years before the station is ready. But the panel is still wrestling with whether the new entity should be chartered and funded directly by Congress, and therefore be independent of NASA, or function like the Space Telescope Science Institute in Baltimore, which contracts with NASA to manage the Hubble Space Telescope. NASA recently examined more than a half-dozen ways to structure the institute—from a university and industry consortium to a government corporation like Comsat—without reaching a conclusion, says Mark Uhran, a NASA life and microgravity sciences manager who was instrumental in requesting the NRC report.

    Although the idea of a separate organization has won early rave reviews from some officials at NASA headquarters, the reception at two centers with key roles in station management—Johnson Space Center in Houston and Marshall Space Flight Center in Huntsville, Alabama—is likely to be chillier. “We have a sense they won't like it,” a panel member says. “But the [NASA] administrator wants this to happen.” Katovich believes that the institute would relieve pressure on the centers by looking after scientific matters: “The centers clearly are critical for station development, but once everything is up and running, they can move on to other things.”

    Some center officials support the need for changing current practices. “Everyone is very eager to make it simpler to do research,” says John David Bartoe, a former astronaut and solar physicist who heads the station's research program at Johnson. And Katovich and others say that the lengthy process to put experiments into space demands that NASA act as quickly as possible. “Part of the problem is that it takes so long to get something done—there are so many hoops,” he says. “And while safety is still paramount, there are ways to do this less bureaucratically.”

  PCR

    Taq Polymerase Patent Ruled Invalid

    1. Robert F. Service

    Be careful with whom you pick a fight. Seven years ago the Swiss pharmaceutical giant Hoffmann-La Roche took a swing at Promega Corp., a small biochemicals supplier based in Madison, Wisconsin, suing it for infringing a Roche patent on a key enzyme for molecular biology research. The enzyme, called Taq polymerase, is a crucial element of the polymerase chain reaction (PCR), the ubiquitous technique used to replicate snippets of DNA. But last week it was Promega that was still standing after a federal judge in San Francisco, California, ruled that the patent for Taq is invalid because its original holder, the now defunct Cetus Corp., obtained it by deliberately misleading the U.S. Patent and Trademark Office (PTO).

    “We were very glad to see the ruling,” says Promega chief technical officer Randall Dimond. “We felt for a long time that the only reason the patent was issued in the first place was due to a misrepresentation of the science.” The ruling means that, for now, Promega can continue to sell Taq without paying royalties to Roche. But the war between the companies is far from over. The invalidated patent governs only one form of Taq and a small part of the current Taq market, but Promega is challenging Roche's patents on PCR itself and may do the same with other Taq patents Roche holds. And it claims this ruling makes these patents vulnerable. Roche, meanwhile, says it will appeal last week's ruling and is pressing on with a suit charging Promega with encouraging academic researchers to violate Roche's PCR patents.

    This David vs. Goliath battle started shortly after Roche bought rights for the Taq and PCR patents from Cetus for $300 million in 1991. At that time Promega had a license from Cetus to sell “native” Taq enzyme, purified from the bacterium Thermus aquaticus, for uses other than PCR. Roche argued that Promega, which was undercutting Roche's price, was in fact marketing Taq for use in PCR and asked the company to renegotiate its license. When Promega declined, Roche sued. But Promega lawyers fought back with a countersuit claiming that the original Taq patent should be disallowed. They argued that Taq had been purified earlier by others, but Cetus scientists had misrepresented their data to convince the PTO that the enzyme they had purified was different.

    Three years ago Judge Vaughn R. Walker ruled that Cetus had indeed misled the patent office in obtaining the native Taq patent. But in order to decide whether to void the patent, the judge needed to determine whether it had been done intentionally. Last week, Walker said the evidence was clear: In a 65-page ruling, the judge outlined instances in which former Cetus scientists David Gelfand and Susanne Stoffel misstated or omitted data that “afford no inference other than that they were made to deceive.”

    In a written statement to Science, Gelfand and Stoffel maintained that the Taq they purified was different from previous versions. “We feel that the judge's ruling was both wrong and unfair to us,” they state. “This situation … puts all scientists at risk of having their raw data and good faith interpretations of it be misrepresented in court.”

    The ruling invalidates the patent only for the native Taq enzyme, or n-Taq. Roche has since patented a recombinant enzyme produced in Escherichia coli, which is licensed widely and now dominates the Taq market. The pharma giant also has several patents on the PCR process itself. So the new ruling is not likely to affect Roche's existing business immediately. But Brenda Furlow, Promega's general counsel, argues that because these patents are related and involve some of the same researchers, the judge could wind up invalidating other Taq or PCR patents as well. “There is a real possibility that additional patents could fall,” says Furlow.

    Melinda Griffith, head counsel for Roche Molecular Systems—the U.S. subsidiary that handles Roche's PCR business—flatly disagrees. “The fact of the matter is that this relates only to the n-Taq patent,” says Griffith. “This does not or should not affect any other patents.” Kate Murashige, a biotechnology patent expert at the Washington, D.C., law firm of Morrison and Foerster, agrees, calling the invalidation of related patents “conceivable but unlikely.” Wall Street, too, seemed unworried: The stock price for Roche Holdings, AG, Hoffmann-La Roche's parent, dipped only a couple of points in end-of-the-week trading.

    Despite Promega's victory in this round, Roche officials haven't thrown in the towel. “This decision has not changed the fact that Promega is infringing our PCR patents and inducing others to do it as well,” says Griffith. Because of that, Griffith says, Roche plans to press its original case against Promega. Furlow counters that Promega will seek reimbursement for court costs as well as for royalties the company paid on Taq from 1991 to 1995. Judge Walker will map out the future the case is expected to take when he meets with the combatants at the end of next month.


    Licenses Suspended at 3 Toronto Hospitals

    1. Wayne Kondro*
    1. Wayne Kondro writes from Ottawa.

    Ottawa, Canada—Citing violations of nuclear safety regulations, the government has blocked the use of radioisotopes for research at one of Canada's largest academic/medical hospital complexes. The action, taken on 2 December by the Atomic Energy Control Board (AECB), has halted all experiments involving radioactive compounds in 91 labs at the Toronto-based University Health Network (UHN). Although roughly 20 of the labs may be allowed to resume normal operations as early as this week, the AECB has said that patient care could be affected next spring if the network does not make the necessary improvements.

    “They've been given a timeline,” says AECB license assessment officer Richard Cawthorn. “If they don't meet that, they will have used [up] all of their get-out-of-jail-free cards.” Although AECB's mandate covers only worker and public safety, Cawthorn says, it will have little choice but to take “further licensing action that will restrict the clinical areas” if the problems are not corrected by 1 April. UHN chief operating officer Michael Guerriere says he expects all 91 labs to be in compliance and operational within 3 to 4 weeks.

    The suspension follows a routine investigation of safety practices at UHN, which has a license to conduct experiments with radioactive materials at five buildings housing cardiac, neurosciences, organ transplant, and oncology programs. The inquiry found many deficiencies, including excessive contamination levels and unreported losses of sealed radioactive materials—problems that Cawthorn says indicated that the hospitals were exerting little control over the radioactive materials being used. Some 55 instances of noncompliance were discovered during site visits at the three hospitals in November. The occupational safety program “was essentially nonfunctional,” according to a report from a site visit in October.

    Guerriere attributes the lapses in safety practices to confusion caused by a merger in January 1998 of Toronto General Hospital, Toronto Western Hospital, and Princess Margaret Hospital that created UHN. “We had separate radiation safety programs at each of the three sites that were managed differently,” says Guerriere. “We've also had the entire lab group that used to be at the Wellesley Hospital [in Toronto] moved across into our labs, and we've restructured them quite significantly. A lot of labs have moved around, and we didn't pay sufficient attention to these training issues during that process.”

    One bench scientist says that the shutdown has thus far been little more than a temporary inconvenience. “We're just going to go with the flow, follow the plan, and get recertified,” says physician Michael Tymianski, whose research includes work on radiochemicals such as calcium-45. “The whole thing has been quite gentlemanly. Everybody has gotten reassessed and reevaluated, and our plan is to put things back together again.”

    The AECB has laid down nine directives that UHN must follow, including creation of a comprehensive radiation safety program covering all operations and facilities and adequate training of employees. “They basically have to build a program that has organization—a proper management structure, an oversight hazard control function, [and the ability] to control doses of radioactive materials,” says Cawthorn. Guerriere says the UHN is already on the road to recovery by filling its vacant position of corporate radiation safety officer and reaching agreement with the control board on a “remedial training program.”

    Although no one on staff is known to have received an excessive dose of radiation, the board has ordered UHN to perform a thyroid bioassay on one worker. Cawthorn says he's unaware of any instances in which patients were exposed to excessive radiation, either, but adds that patient care is “not within the AECB mandate.”


    Ancient Continent Opens Window on the Early Earth

    1. Carl Zimmer*
    1. Carl Zimmer is the author of At the Water's Edge.

    In Canada's cold North, geologists have mapped the outlines of a lost continent and started to decipher its history, which began more than 4 billion years ago

    Acasta River, Northwest Territories—On the banks of the Acasta River, deep in the barrens of Canada's Northwest Territories, stands a lonely shed. There's no sign of humanity for many miles—just low hills, a few black spruce, a thick cover of reindeer moss, and an endless succession of lakes and rivers. The only way to get here is by canoe or by float plane. But this shed, filled with geologists' hammers, boats, and camping gear, marks a special place. Over the door is a plank on which someone has painted, “Acasta City Hall. Founded 4 Ga.” It commemorates the surrounding rocks, some of which are over 4 billion years old—the oldest rocks on Earth.

    Geologists first recognized the antiquity of the Acasta region in the early 1980s, but at the time it wasn't clear whether its oldest rocks could say much about Earth's early history. Most of the ancient landmass to which the rocks belonged had been remelted, buried in the mantle, or eroded down to sand. What little remains has been deformed and shot through with intrusions of younger rocks. As a result, geologists had no clear idea how the oldest rocks were related to the surrounding formation (known as the Slave Province). It was as if most of the pieces to the ancient puzzle had been thrown out and the few remaining ones were tattered and torn.

    But that is now changing. For the past 5 years, geologists led by Wouter Bleeker and Richard Stern of the Geological Survey of Canada (GSC) have been studying this remote territory intensively. Systematically traversing the region, they have mapped the Acasta area and adjacent parts of the Slave Province, and they have collected samples for precise dating based on uranium and lead isotopes to help them piece the jigsaw puzzle back together. “This is the oldest terrestrial crust known on Earth, so we need to understand it in as much detail as we can,” says Bleeker.

    Now the researchers have started to publish their results in several journals, including special issues of the Canadian Journal of Earth Sciences (July 1999) and Geoscience Canada (March 1998). They report that they have placed the Acasta rocks within a relatively large protocontinent. What's more, they can trace the history of that continental nucleus for 1.5 billion years after its beginnings 4 billion years ago. “The incredible longevity of history in this area is mind-boggling even for a geochronologist,” says Stern.

    In the process they are shedding light on some of the most important questions about the early Earth. When did the cores of the continents first become stable? Was this early crust created and destroyed by plate tectonics, as crustal rock is today? Or did some other process dominate the younger planet, as the new research hints? In the past, geologists have debated these questions fiercely but have had little evidence to draw on. Now they have a window on the planet's formative years. “[Bleeker] is doing it right,” says Warren Hamilton, a geologist at the Colorado School of Mines in Golden. “He's going where the rocks and the dates take him. He's not working backward from a model and ignoring the data that don't fit.”

    Old rock, young Earth

    By the 1980s geologists working in the region knew that the Slave Province represented an ancient core of a continent, known as a craton. The rocks dated back into the Archean eon, before 2.5 billion years ago, but beyond that their age was unclear—and given that the Archean takes up almost the first half of Earth's history, that was a lot of ambiguity. So Sam Bowring, now at the Massachusetts Institute of Technology, set out to determine the age of an Acasta rock by analyzing crystals known as zircons that had formed as the rock cooled and crystallized.

    Zircons can act as timekeepers because they trap uranium in their lattice when they form, and the uranium then steadily decays into lead. Although zircons are remarkably durable, even they can get damaged over 4 billion years. New zircon crystals can grow on top of old ones, and old crystals can suffer radiation damage—all of which can throw off the uranium dating method. So researchers use a device called a Sensitive High-Resolution Ion Microprobe (otherwise known as a SHRIMP) to home in on undamaged parts of a zircon crystal. The SHRIMP fires a narrow beam of oxygen ions to blast atoms off a patch of the zircon's surface a few micrometers across, then analyzes the atoms' isotopic composition.
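    The clock itself is a single exponential. In this minimal Python sketch (the decay constant is the standard value for uranium-238; real U-Pb geochronology uses both uranium isotopes and concordia analysis to catch the damage and overgrowth described above), the measured lead-to-uranium ratio fixes the age:

```python
import math

LAMBDA_U238 = 1.55125e-10   # decay constant of 238U, per year

def upb_age(pb206_u238_ratio):
    """Age in years from a measured 206Pb/238U atomic ratio.

    Each parent 238U decays to one daughter 206Pb, so
    Pb/U = exp(lambda * t) - 1; solve for t.
    """
    return math.log(1.0 + pb206_u238_ratio) / LAMBDA_U238

def ratio_at(age_years):
    """Inverse: the Pb/U ratio a closed-system zircon of that age shows."""
    return math.exp(LAMBDA_U238 * age_years) - 1.0

# A 4.03-billion-year-old Acasta zircon should carry about 0.87 atoms of
# radiogenic 206Pb for every remaining atom of 238U.
print(round(ratio_at(4.03e9), 3))        # 0.869
print(round(upb_age(0.869) / 1e9, 2))    # 4.03 (billion years)
```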

    Bowring teamed up with researchers at Australian National University (ANU, the birthplace of SHRIMP technology) in Canberra to analyze the Acasta rock. In 1989 they reported that zircons in a few samples were at least 3.96 billion years old, edging out other ancient rocks elsewhere in the world by a few hundred million years. (Geologists have found individual zircons in Western Australia that date back further, to 4.28 billion years, but they had eroded out of their original rock and were incorporated into much younger sedimentary rocks.)

    Later, some of the rocks at Acasta sneaked past the 4-billion-year mark. The Geological Survey of Canada purchased a SHRIMP from ANU in 1995 and within a few months was using it for a systematic survey of the rocks around Acasta. “Nobody would want to go digging around in a place like that without such a tool. You couldn't solve anything without it,” says Stern. In 1997 the Canadians announced that they had dated the rocks to 4.03 billion years ago. Meanwhile, Bowring and Ian Williams of ANU dated more samples from Acasta and got essentially the same age. Since then the Canadian team has pushed back the date even further—to 4.055 billion years ago. The rocks formed when Earth itself was a little over 500 million years old, its interior still seething hot from its formation and its surface pummeled by asteroids and comets.

    Now the mapping project led by the GSC's Bleeker is beginning to work out how these oldest rocks—found to date in only a few square kilometers of the Slave—are related to the formations that surround them. In many parts of the Slave Province, a recurring geological pattern can be seen exposed on riverbanks and hillsides. The lowermost, oldest layers of rock—which include the 4-billion-year-old specimens—are made of gneiss, a metamorphic rock created under high temperatures and pressures. The top of the gneiss is weathered and uneven, suggesting that the rock was exposed for a long period, eons ago. Above lie much younger layers of rock—quartzite, banded iron formations, and volcanic rocks—creating a time gap called an unconformity. Previous researchers had noticed this succession of layers here and there in the Slave Province, but Bleeker has now traced the stratigraphy across hundreds of kilometers. John Ketchum, another geochronologist on the team, has dated two volcanic layers above the unconformity, and they consistently date to just over 2.8 billion years. “Correlations over such large distances are tricky,” says Bleeker. “But we feel we've got it worked out.”

    The researchers conclude that these Archean gneisses were not scattered, unrelated bits of crust but were united, at least 2.9 billion years ago, in a protocontinent. It then experienced uplift, possibly as a hot mantle plume pushed it up from below. As the continent rose, its surface eroded, creating the unconformity. Then it began to subside (perhaps as the plume dissipated) and sank far below the surface of an ancient ocean, where sediments could cover it: first quartz-rich sandstone, and then other sediments that formed the banded iron formations. Finally volcanic rocks spread over its surface as the Slave protocontinent was rifted apart about 2.8 billion to 2.7 billion years ago.

    Nested in the heart of this continental nucleus are the 4-billion-year-old Acasta gneisses, as the researchers will report in Current Research 2000, Geological Survey of Canada. Their position below the unconformity indicates that these ancient rocks don't represent some exotic tectonic sliver driven into the Slave hundreds of millions of years after it formed. “This work places the Acasta gneisses firmly in this larger geological entity,” says Stern, implying that they mark the very core of the continent. “Perhaps it was a piece of crust that survived [impacts of giant] meteorites and avoided getting stuffed down into the mantle, and it acted like a seed for further crust formation. At this stage of the game, we don't really know why this place is so special.”

    A protosupercontinent?

    The new maps of the region around the Acasta will help geologists search for more rocks over 4 billion years old. But researchers are less interested in breaking records than in decoding the 1.5-billion-year history of this protocontinent. One pressing question is whether it was actually part of an even bigger landmass. In more recent Earth history, continents seem to have evolved in a cyclical pattern, colliding to form a supercontinent, which later rifts apart and then joins back together in a different arrangement. The last supercontinent was Pangaea, which started splitting up about 200 million years ago; its predecessor was Rodinia, which aggregated about 1 billion years ago.

    Before Rodinia, the record is shadowy, but some researchers have speculated that a supercontinent existed as early as 3 billion years ago. Bleeker thinks the protocontinent he and his colleagues have mapped may have been part of it. Along the eastern edge of the territory they mapped, for example, the old gneisses, the distinctive unconformity, and overlying quartzite and banded iron formations are all missing. That suggests that the eastern part must have rifted off at some point and drifted away. “If you see the head of an elephant cut off at the neck,” says Bleeker, “you know there must be a body lying around somewhere.” Bleeker suggests that parts of the body may already have been identified: Archean cratons of the same age in Zimbabwe, Wyoming, and elsewhere have been reported to have quartzite layers and banded iron overlying gneisses as well. “These may be several pieces of a larger continent of which the Slave nucleus is just one remnant,” he says.

    By connecting these continental nuclei, geologists may also gain insight into how the early Earth behaved. On the modern Earth, plate tectonics assembles and destroys continents. They float atop rigid plates, which are pushed around the planet by the mantle's roiling heat. Where the plates rift apart, continents are divided and new oceans form; where they collide, mountains are pushed up, or one plate often dives beneath the other in a subduction zone, fueling volcanoes that build new crust on the overlying plate. But Bleeker thinks several features of the ancient rocks indicate that plate tectonics may not have shaped the Slave protocontinent during its first billion years. “I'm not sure plate tectonics will be the best explanation,” he says, at least for the growth of the oldest crust.

    Many geologists have envisioned an early start for plate tectonics, with small tectonic plates crashing into each other as much as 4 billion years ago. They point out that the earliest kinds of rocks, such as gneisses, have a chemistry similar to rocks formed today by plate tectonics. They've also pointed to places, including parts of the Slave, that seem to show signs of plate-tectonic activity such as sea-floor spreading or accretionary prisms—the piles of sediments scraped off a plate as it descends into the mantle.

    But others have argued that plate tectonics would have been impossible on the early Earth. They point out that 3 billion to 4 billion years ago the planet was much hotter than it is today—too hot, they say, for rigid continental plates to form. They also argue that the hot surface layer would have been too buoyant to sink in subduction zones, and without sinking plates, the whole plate-tectonic cycle grinds to a halt. And they've questioned whether geochemistry alone can say anything certain about how rocks were formed. “It's groupthink, with a bunch of people who don't know what they're comparing [Archean geology] to,” says Warren Hamilton of the Colorado School of Mines, who helped formulate the theory of plate tectonics in the 1960s and 1970s but has become an outspoken opponent of the idea that it operated in the Archean.

    Believers in ancient plate tectonics will find little comfort in the new results. Bleeker says the oldest parts of the Slave province seem to lack key signatures of plate tectonics, such as the long, narrow belts of accreted and deformed rocks. The mapping has even eliminated some supposed relics of ancient plate tectonics. Bleeker and his co-workers say, however, that they aren't coming down against plate tectonics, only calling it into question. “Plate tectonics is an attractive paradigm, because we understand it well,” says Bleeker. “But that doesn't necessarily mean it is the best model for the early Archean.”

    Exactly what process might have been at work to build the ancient continent remains a matter of speculation. “That's where things get much more tenuous,” admits Hamilton. In the nucleus of the old continent, rocks differing in age by a billion years or more are sometimes juxtaposed. That's not what geologists expect from the gradual accretion of crust at plate boundaries, but it could be the handiwork of episodic volcanic outbursts, fed by broad plumes of rock that rose periodically from deep in the mantle. “We need some stretching or cracking, and then it comes bubbling up,” says Hamilton. These lava flows would have gradually built up the continental nuclei. Eventually the planet cooled enough for plate tectonics to take over. Bleeker speculates that the breakup of the continental nucleus his team has documented at 2.8 billion to 2.7 billion years ago might represent the dawn of plate tectonics.

    Paul Hoffman of Harvard University cautions that looking only at a small area like the Slave won't be able to resolve the debate. “I hate to be negative, but … a fragment of crust the size of the Slave Province is simply too small to ever work out the governing tectonic boundary conditions.” Hoffman accepts the ancient geography that Bleeker and his colleagues have mapped, but he says that geologists will need to compare its distinctive stratigraphy with what they find on other fragments of ancient crust to conclude anything about ancient tectonic processes.

    Bleeker agrees: “It's critical to try and find matching pieces scattered around the globe.” But he adds that Earth's oldest crust is a good place to start. “The Slave Province is a wonderful laboratory for all these questions.”


    Does Cancer Therapy Trigger Cell Suicide?

    1. Elizabeth Finkel*
    1. Elizabeth Finkel writes from Melbourne, Australia.

    The notion that drugs and radiation kill cancer cells by causing them to self-destruct has guided drug searches. But some cancer experts aren't convinced

    In the dark trenches of the war against cancer, a ray of light seemed to shine through a few years ago: a simple notion about what makes cancers susceptible to radiation or chemotherapy. The key, many cancer researchers came to believe, was the presence or absence of the p53 tumor suppressor gene, which produces a protein that cells need in order to commit suicide when they are damaged or stressed. As long as p53 remained functional, the theory went, cancer cells damaged by radiation or chemotherapy would self-destruct. But if the cascade of genetic changes that led to the cancer also inactivated the p53 gene—which happens in about 50% of all human malignancies—the cells could shrug off the worst that oncologists could throw at them and continue to multiply. As a result, many researchers are looking for ways to restore cancer cells' ability to undergo apoptosis, as cell suicide is more technically called, in order to make them sensitive to cancer therapies.

    But some specialists—among them many oncologists who treat tumors with radiation—don't buy this picture, at least not for solid tumors such as cancers of the lung, breast, prostate, and colon. They say that the kinds of test tube assays that point to apoptosis, and p53 in particular, as critical to cancer therapy may not accurately reflect what happens in cancers that have developed naturally in the body. What's more, they argue, using assays based on apoptosis to screen for cancer drugs might cause researchers to miss drugs that kill by other means. “People have become enamored with apoptosis—everything begins and ends with apoptosis—and that's not right,” says radiation oncologist Martin Brown of Stanford University School of Medicine.

    Recently, the radiation oncologists have been finding support for their position in a variety of studies showing that p53 gene status does not correlate with a cancer cell's susceptibility to therapy. Not only may cells having a functional p53 gene fail to respond, but those lacking one may be killed easily. But these findings are not easy to interpret, because other genetic factors can also determine whether cells will undergo apoptosis. This makes it difficult to tell whether apoptosis is in fact key to cancer therapy response.

    “As with everything in biology and cancer, the situation is much more complicated than we would like,” says cancer gene expert Bert Vogelstein of The Johns Hopkins University School of Medicine. Still, it's critical to learn just what determines whether radiation and drugs will kill cancer cells. As Vogelstein notes, “In the long term, understanding these complexities is likely to enhance our ability to develop better therapies and tailor such therapies to the specific characteristics of individual tumors.”

    Apoptosis is a rapid and tidy form of suicide that cells may opt for when their DNA is damaged by radiation or toxic drugs. A great deal of evidence has shown that the decision to self-destruct is controlled by a gene circuit, with the p53 protein serving as the key damage sensor that tells the circuit when to kick in. Over the years several lines of evidence have pointed to the importance of p53-induced apoptosis in determining whether blood cell tumors will respond to treatment.

    Pediatric oncologist David Fisher of the Dana-Farber Cancer Institute in Boston cites acute lymphoblastic leukemia, a blood cell cancer that usually afflicts children, as an example. These tumors typically carry an intact p53 gene when they are first diagnosed and virtually melt away when treated with chemotherapy, Fisher says. Recurrences occur in 20% to 40% of cases, however, and when the tumors come back, half of them carry a defective p53 gene and are now resistant to further therapy.

    No solid evidence

    But although the evidence is convincing for the blood cell tumors, a definitive answer about how p53 influences the response of solid tumors to therapy has been much harder to come by. Simply collecting data from patients doesn't settle the issue, in part because it can be hard to determine whether a cancer has a working p53 protein. Even looking at biopsy samples from primary tumors that are shrinking after treatment to see whether the cells are dying by apoptosis hasn't provided conclusive results because apoptosis is rapid and leaves no traces.

    So in 1994 Scott Lowe, then at the Massachusetts Institute of Technology, and colleagues studied experimental tumors in mice. They transplanted cancer cells in which the p53 gene had been inactivated into the animals. The resulting tumors were resistant to both radiation treatments and the chemotherapeutic drug Adriamycin. But tumors formed by cells with an intact p53 gene were sensitive to both types of therapy (Science, 4 November 1994, p. 807). Furthermore, the cells in the shrinking tumors showed such hallmarks of apoptosis as having their DNA chopped into regularly sized bits.

    Work on cultured cancer cell lines echoed these findings. Take the National Cancer Institute's (NCI's) drug-screening program, which since 1990 has tested 60,000 potential drugs against a panel of 60 human cancer cell lines. The screening indicated that many current cancer drugs work most effectively in cells with an intact p53 gene, while cells with inactivating p53 mutations tend to resist the compounds (Science, 17 January 1997, p. 343). So entrenched did the idea become that cell biologist Michael Strauss of the Max Delbrück Center for Molecular Medicine in Berlin told the German magazine Der Stern in a 1996 interview that patients whose tumors bear inactivating p53 mutations would not benefit from radiation or drug therapy. Indeed, the interview bore the headline “Jede zweite Therapie ist überflüssig”–that is, “Every second treatment is superfluous.”

    Radiation oncologists think that's far too sweeping a conclusion, at least for solid tumors. “We don't worry about doctors,” because they treat patients regardless of their p53 status, says one such clinician, Lester Peters of the Peter MacCallum Cancer Institute in Melbourne, Australia. But he adds, “We do worry about patients who might look up the information and decide treatment is futile.”

    Stanford's Brown argues that the assays on which Strauss's conclusion was based are flawed. Lowe's transplanted cancer cells were highly artificial, having been made cancerous in the culture dish by the introduction of two oncogenes, E1A and RAS. Many studies, including some by Lowe himself, have shown that these oncogenes raise p53 levels, making cells totter on a knife edge of survival. Thus, Brown says, the cells are poor models for solid tumors because they would never withstand the insults encountered on the way to forming the tumor. Lowe concedes that the model he used was not ideal, but says that finding a model that accurately reflects what happens in natural tumors is a major challenge. “This is the question I've been struggling with for years,” he says.

    Brown says that short-term assays like those carried out in the NCI screen to test cancer drugs are also misleading. In those assays, researchers typically bathe cancer cells in high doses of the drug for 2 to 3 days and check the growth response. Cells with intact apoptotic circuitry freeze in their tracks, while those lacking p53 keep going. But the problem is that even if cells survive 48 hours, that doesn't mean they can survive and proliferate in the long term. “p53 [status] tells you how a cell dies but not whether it dies,” Brown maintains.

    More than one way to die

    He and other radiation oncologists favor a nonapoptotic mechanism, partly because of their clinical experience. The behavior of tumors, they say, seems to indicate that rather than self-destructing, cancer cells die when they try to divide. “Most solid cancers take several weeks to respond [to treatment] because [their cells] have to undergo mitosis,” Peters explains. In fact, he says, one can use a tumor's growth rate to predict how long it will take for a response to become apparent—within weeks for fast-growing tumors, whereas slow-growing tumors, such as pituitary adenomas, take years to go away.

    More direct evidence for that point of view comes from the so-called clonogenic assays that radiation biologists traditionally use to study the effects of radiation on cancer cells. In these assays, cells are exposed to radiation or drug therapy, seeded onto a culture plate, and then followed for about 12 days, long enough for six or seven divisions. In these assays, cells from solid tumors typically don't die until they divide, at which time the daughter chromosomes break when they try to separate. In other words, death appears to be a consequence of mechanical damage rather than a rapid self-destruct signal.

    What's more, these assays don't show a consistent link between cells' ability to undergo apoptosis and their susceptibility to anticancer therapies. In 1995, for example, cell biologist Robert Schimke of Stanford University tested HeLa cells, a line of cultured cells derived from a human cervical cancer, in both clonogenic and short-term assays. He found that although cells engineered to produce high levels of Bcl-2, a protein that inhibits apoptosis, survived drugs in the short term by resisting apoptosis, they nevertheless had sustained a fatal blow, as evidenced by the fact that they could not form colonies in the clonogenic assay. “Our experiment showed that apoptosis is not necessarily a measure of the success or failure of therapy,” Schimke says. Schimke's experiment has since been followed by a host of others using clonogenic assays, which have come to similar conclusions.

    Brown and his colleagues think solid cancers are unlikely even to retain the capacity to undergo apoptosis because of the way they develop. As these tumors grow, they outstrip their blood supply, at least temporarily, leaving their cells deprived of food and oxygen—stresses guaranteed to drive most cells to press the self-destruct button. So only those cells that have disabled their apoptosis circuitry are likely to make it to the point of forming solid tumors. If so, radiation and drugs that shrink solid tumors must therefore be acting by other mechanisms.

    Yet the issue is far from settled. Many biologists say the clonogenic assays are as unnatural as the short-term tests that pick up apoptosis. Cells individually seeded onto a plate to test their growth are a far cry from cells growing in a tissue, where their fate may be influenced by contact with neighboring cells or the extracellular matrix. For instance, Caroline Dive and John Hickman of the University of Manchester in the United Kingdom found that lymphoma cells made resistant to apoptosis by an extra copy of the bcl-2 gene had no long-term survival advantage in a standard clonogenic assay. But when the culture dishes were made to resemble tissue by coating them with the extracellular matrix protein laminin and adding the growth factor IL-4, apoptosis-resistant cells had a survival edge.

    Further complicating efforts to untangle the situation is the fact that cells' responses to therapy may vary depending on the drug used. One recent example comes from the Vogelstein team at Johns Hopkins. In experiments reported in the August issue of the Journal of Clinical Investigation, the researchers specifically inactivated the p53 gene in a line of cultured colon cancer cells. Although the mutants did become resistant to 5-fluorouracil, a drug widely used to treat colon cancer, they became more sensitive to another cancer drug, Adriamycin, and to gamma radiation.

    In addition, p53 status by itself is not enough to indicate whether cells are capable of apoptosis. Other components of the apoptosis circuit can determine the final outcome. For example, mutations that activate the oncogenic potential of Bcl-2 and its relatives are well-known derailers of apoptosis. And recent work shows that the second most common mutation in solid cancers—disruption of the chromosome locus that includes the p19 tumor suppressor gene—also results in the failure of apoptosis.

    In work reported in the 15 October issue of Genes and Development, Lowe, now at Cold Spring Harbor Laboratory on Long Island, Clemens Schmitt (also of Cold Spring Harbor), and their colleagues inactivated the p19 gene in a strain of mice already prone to B cell lymphomas because the animals carry an active myc oncogene. The researchers found that the resulting animals developed an aggressive lymphoma that closely resembles the cancers seen in animals with inactivating p53 mutations; among other things, they were highly resistant to chemotherapy. These results mean that researchers wanting to establish whether apoptosis is important in how cancer cells die will have to determine exactly which genes are defective in resistant cells, an effort already going on under the aegis of the NCI in Bethesda, Maryland.

    “Brown is right in saying the answer's not known yet; we have to bite the bullet and get into these experiments,” says Dive. The hope is that order will soon emerge from the chaos, says Vogelstein: “Whether the models we have now are correct is not as important as the fact that cancer researchers are for the first time getting some real insights into why drugs fail, and more importantly, why they work at all.”


    Europe Stresses Prevention Rather Than Cure

    1. Michael Hagmann

    European research managers have woken up to the issue of fraud. But rather than policing it, they aim to nip it in the bud

    Ringberg, Bavaria, AND Edinburgh, Scotland—Drummond Rennie, deputy editor of The Journal of the American Medical Association, says he had a strong sense of déjà vu when he attended a recent conference on scientific misconduct in British biomedical research at Edinburgh's venerable Royal College of Physicians. “The U.K. is almost exactly 20 years behind the U.S. [in dealing with scientific misconduct],” says Rennie, who estimates he has accumulated some 250,000 travel miles to attend meetings on scientific misconduct. “It's really striking—it's all the same questions and arguments that used to come up” in the United States in the late 1970s and early 1980s.

    European researchers would likely agree that, until recently, institutions on this side of the Atlantic maintained a concerted head-in-the-sand policy toward fraud and other forms of research misconduct. “A lot of [U.K. researchers] thought that this only happens in other countries,” says Stephen Lock, former editor of the British Medical Journal (BMJ). Povl Riis, one of the founders of the Danish Committee on Scientific Dishonesty, encountered the same attitude in Denmark in the early 1990s. “Many researchers think that a high IQ goes hand in hand with high moral values, which is, of course, absolute nonsense.”

    But a rising tide of retracted papers and some high-profile fraud cases are finally stirring research officials into action. Ethics committees and working groups are now hard at work in the United Kingdom, Germany, and other countries churning out new guidelines and procedures for good lab practice and publication. And at two recent conferences,* one in the U.K. and one in Germany, scientists from both countries engaged in some serious navel gazing. The focus was on prevention rather than cure. “The main goal was to find out what circumstances would favor scientific misconduct and to try and create an environment that would prevent it from happening in the first place,” says Rüdiger Wolfrum of the Max Planck Institute for Comparative Public Law and International Law in Heidelberg, organizer of the German gathering. This view is shared on the other side of the English Channel. “There is little value in lengthy discussions about a definition of scientific misconduct as done in the U.S. A better approach seems to me an emphasis on implementing good research practice guidelines,” says Graeme Catto, vice principal of the University of Aberdeen, who helped organize the Edinburgh conference.

    Although Britain took an early lead in Europe in tackling the issue—with a 1991 report on scientific fraud in medical research from the Royal College of Physicians—it took a while to capitalize on that head start. “That's where matters stopped. The report hasn't even been publicized widely and certainly not implemented,” says Lock. A case in point, says George Alberti, president of the Royal College of Physicians in London, was a recent investigation of British medical schools, which revealed that “local mechanisms [for dealing with misconduct] were in chaos or nonexistent.”

    One reason for the slow progress in Britain is the wide variety of funding sources: two separate research councils; the health and agriculture ministries; and numerous charities, which provide the largest slice of the pie. This contrasts sharply with the situation in Denmark, the first European country to take firm action against scientific fraud, where “90% of the funds are paid for by the government, which makes it easier to have a government-regulated framework,” Lock says. Britain's Medical Research Council (MRC) did finally adopt procedures to investigate scientific misconduct in 1997. It followed those up last year with guidelines for good clinical practice and is currently working on a booklet defining good research practice. But few other bodies have made any efforts, Lock says.

    Germany was thrown more precipitously into self-analysis over misconduct by a major scandal in the mid-1990s involving allegations of manipulated or falsified data that might affect more than 50 publications by two biomedical researchers (Science, 15 August 1997, p. 894). This served as a late wake-up call for the scientific establishment by revealing that German science was ill prepared to deal with scientific misconduct. In the wake of the scandal, Germany's main granting agency, the Deutsche Forschungsgemeinschaft (DFG), which funded the suspect research, and the Max Planck Society embarked on a mission to combat fraud, and both have now developed procedures for investigating and sanctioning scientific misconduct. Many smaller grant-giving institutions as well as German universities—which have to adopt similar procedures to receive DFG money—are now following suit.

    The October meeting, which took place at the Max Planck-owned Ringberg castle in the Bavarian Alps, is part of the next, preventive stage in Germany's crusade against scientific misconduct. In June 1998 Max Planck president Hubert Markl asked an interdisciplinary working group of Max Planck scholars to come up with a code of conduct for scientists that would ensure good research practice, akin to the ones already drawn up by the U.S. National Academy of Sciences in 1992 and the DFG in 1997. A report is due early next year and is likely to incorporate many of the points discussed at the meeting, says the chair of the working group, Wolfgang Edelstein of the Max Planck Institute for Human Development in Berlin. Both the Ringberg and the Edinburgh meetings, held just 1 week later, came to similar diagnoses: The main ingredients for an effective antimisconduct recipe are the proper training and education of scientists at all levels on what constitutes acceptable ethical behavior, some cultural changes to diminish the steep hierarchy in certain research institutions and to encourage open scientific discussions, and—especially at the German meeting—an emphasis on good publication practice.

    “There's no appropriate scientific education for medical doctors in Germany, although they're expected to do basic research” in order to become professors, says Ulrike Beisiegel of the University Hospital Eppendorf in Hamburg. Even standard procedures such as keeping comprehensive lab journals and notebooks are often lacking, says Gordon Duff of the University of Sheffield: “Record keeping is little short of disgraceful in most [U.K.] universities.” The consequences of a loose collection of unnumbered sheets of hieroglyphic scribbles with no dates can be dire: “The Imanishi-Kari case [which cast a shadow over U.S. research in the late 1980s and early 1990s, before charges were dismissed by an appeals board in 1996] wouldn't have taken 10 years if she'd kept proper records,” says MRC chief executive George Radda.

    Many of those who met in Edinburgh and Ringberg argued that education in research ethics should be an obligatory part of an advanced degree such as a Ph.D. But even that may not be a panacea. “One shouldn't expect too much from formal [ethics] courses; there are tons of lawyers who finish their formal legal training and still go work for the Mob,” cautions Markl. Whatever the merits of ethics classes, there was little dispute about the best way of fostering ethical and responsible behavior in the lab: “You have to teach it by example; the young scientists have to see it day in, day out. It's about a culture, a climate,” says Lord Kilpatrick of Kincraig, former president of Britain's General Medical Council and chair of the panel that drew up the Edinburgh meeting's closing statement.

    Some delegates also suggested that the very structure of research institutions might be creating an environment that encourages research misconduct. Extremely hierarchical structures, such as those in most medical institutions but also in many Max Planck Institutes where an almost almighty director sits atop a pyramid of researchers, may put pressure on younger staff members. Often it takes some courage for junior group members to come up with ideas that go against the group leader's school of thought. “It's important to take away the fear of talking openly from students and to develop a culture of criticism and open discussions,” says Marinus Lamers of the Max Planck Institute for Immune Biology in Freiburg.

    If institutions are reluctant to adopt guidelines or don't comply with given procedures, Rennie has a simple remedy: “Cut the funds.” The Edinburgh meeting's closing statement also reflects this attitude. It recommends withholding grants unless institutions have in place a proper system for dealing with scientific misconduct and guidelines for good research practice.

    Delegates were generally enthusiastic about the progress made at the two meetings. Lock thinks the Edinburgh consensus statement is “very encouraging. Now is not the time for nitpicking and dampening people's enthusiasm. Let's see how it evolves,” he says. From his vantage point, Rennie, the traveling salesman of scientific misconduct, keeps his fingers crossed that European colleagues can keep the ball rolling. He says he is mildly surprised, however, “that they invent the wheel all over again instead of looking at the experiences in other countries.” But then again, he says “maybe you've got to do this. You have to carry the community along and get them involved. People here wouldn't believe in a system simply copied from the U.S.”

    • *Ringberg Conference on Ethics in Research, 20 to 23 October, and Joint Consensus Conference on Misconduct in Biomedical Research, Edinburgh, 28 to 29 October.


    Indian Agency Eyes Moon in Shift Toward Research

    1. Pallava Bagla

    A new rocket able to deliver lunar payloads complements efforts by the Indian Space Research Organization to focus more on scientific challenges

    Lucknow, Bangalore, AND New Delhi—After 3 decades spent focusing on the earthly pursuits of weather forecasting and communications, India's space agency is lifting its sights. The Indian Space Research Organization (ISRO) is putting together plans for a mission to the moon, and it is also hoping that private industry will take over its launch and communications satellite business so it can concentrate on scientific payloads and advanced technology.

    Krishnaswamy Kasturirangan, ISRO's chair, floated the idea of going beyond Earth orbit last spring after the successful launch in May of the country's new Polar Satellite Launch Vehicle (PSLV). He noted that the 300-ton rocket, which placed three satellites into orbit, “can [also] undertake a mission to the moon.” Toward that end, ISRO convened a symposium at this fall's annual meeting of the Indian Academy of Sciences* entitled “An Indian Case for Going to the Moon.” Scientists from the space department showcased the agency's willingness and capacity to undertake such a scientific mission and received a positive response from the 200 researchers in attendance, many of whom see it as a matter of national pride as well as technological prowess. “We cannot afford not to go to the moon,” asserts Narendra Bhandari, a planetary scientist at the Physical Research Laboratory in Ahmadabad who has worked extensively on moon rock samples and asteroids.

    Kasturirangan believes that the country's political leaders, who must approve such a plan, also may be ready to take the next step. He points to a 20% rise in the agency's space science budget in the past 2 years, to $8.9 million in 2000, built in part on a success rate of 75% for nearly two dozen launches with a variety of payloads. The proposed moon mission is also seen as a way to galvanize support for other basic research projects. And space officials from other countries believe that India is up to the task. “ISRO is today technically fully capable to take on a mission to the moon,” says Serge Plattard, director of international relations for CNES, the French space agency. “It is natural to reach out to new frontiers when you have reached maturity.”

    Although details must still be worked out, agency officials are weighing two kinds of lunar missions. One would send a 275-kilogram satellite past the moon, and the second would place a 140-kilogram satellite in a lunar orbit. Those payloads are comparable in weight to recent U.S. missions like the Lunar Prospector and Clementine. The spacecraft would carry out studies of the moon's core and collect high-resolution images of its surface. The first mission could also include a gamma ray spectrometer to map the surface of the moon at a resolution of about 10 meters, a fourfold improvement over previous efforts.

    However, while Kasturirangan talks of a launch late in the next decade, NASA's James Dodge of the Mission to Planet Earth office warns against overoptimism. “ISRO should keep in mind that the moon is an expensive business,” he says. “India could well undertake quite a few Earth missions at the cost of a single moon mission.”

    But even as ISRO looks toward the moon, it is keeping its eye on earthly applications. Last month India signed a preliminary agreement with France to launch the first-ever Indo-French space mission to study tropical weather and climate. The 500-kilogram payload, christened Megha-Tropiques, will help in the study of moisture and temperature in the convective systems that are particularly intense in intertropical regions. The two countries will split the $100 million cost of the project: The satellite bus is to be provided by France, the scientific instrumentation developed jointly, and the finished satellite launched from India on the PSLV in 2005.

    ISRO sees the French mission, which uses optical instruments, as one of a series of custom-made remote-sensing satellites it hopes to develop. It wants to improve synthetic aperture radar and other sophisticated sensors and cameras that can look through the thick band of clouds that envelops India during the monsoon season. It also plans to continue work on launchers such as the Geo-Stationary Satellite Launch Vehicle, capable of orbiting 2000-kilogram satellites, which is scheduled to make its debut early next year.

    To do that, however, ISRO must withdraw from more mundane commercial jobs such as the development of communications satellites and new launchers. “The private sector has to take over this activity,” says Kasturirangan, who compares India's situation to what France faced 25 years ago before it created Arianespace Corp. as the commercial arm of CNES. Science and technology cabinet minister Murli Manohar Joshi echoed that sentiment recently at an international space science symposium in New Delhi, noting that “it is not always possible to maintain many [space] systems with taxpayers' money. Suitable mechanisms should be developed to recover costs from the beneficiaries.” A policy of transferring ISRO's know-how to domestic industry also is in keeping with other government efforts to bow out of many sectors of the economy.

    By spinning off its launchers, says India specialist Didier Aubin, an electronics engineer and director of sales and marketing at Arianespace, ISRO “can focus its energies on new opportunities.” And that's exactly what it should do, says Narendra Kumar, director of the Raman Research Institute in Bangalore and president of the Indian Academy of Sciences, which hosted the ISRO symposium. “ISRO has really come of age, and this shift toward research-oriented outer space missions is a necessary step.”

    *65th annual meeting of the Indian Academy of Sciences, Lucknow, 30 October 1999.


    Scanners Get a Fix on Lab Animals

    1. Robert F. Service

    Purpose-built machines with improved resolution are allowing researchers to monitor novel drugs in vivo and watch gene expression, without wielding a scalpel

    To see what is going on inside the human body, physicians can turn to advanced medical imagers such as computed tomography (CT) scanners and magnetic resonance imaging (MRI) machines. But for biomedical researchers working with animals, the standard tool has been the scalpel. Imagers lacked the necessary resolution to make out the tiny features inside the brains or other organs of small animals such as mice. And in any case, the companies building the machines have had their sights fixed on clinical settings, drawn by the multibillion-dollar market—until now.

    “The interest in small-animal imaging has been growing exponentially,” says G. Allan Johnson, who specializes in producing high-resolution MRI images of small animals at Duke University in Durham, North Carolina. The reason, paradoxically, is a revolution in human imaging: These techniques are no longer limited to simply mapping the internal anatomy. Thanks to new probes that give off a detectable signal when they encounter specific molecules, medical imaging is now beginning to shine a spotlight on particular molecular events within cells, such as illuminating when genes get turned on or when a cancer cell turns malignant (Science, 15 May 1998, p. 1010). Because the probes—like drugs—can be tested more easily in animals than in people, imaging research has been turned on its head, sending researchers and imaging companies rushing to develop scanners able to see fine detail in rats, mice, and rabbits. “Virtually all the imaging techniques are being developed for small animals,” says Johnson.

    This month, for example, a Knoxville, Tennessee-based company called Concorde Microsystems is set to roll out the first commercially available positron emission tomography (PET) scanner for small animals, dubbed Micropet. Already the company has back orders from drug companies, academic researchers, and government research labs for some 13 machines. MRI and CT machines for small animals are already on the market. And researchers at the University of California, Los Angeles (UCLA), are working on a new scanner that combines PET's ability to monitor activity in particular tissues with CT's ability to reveal general anatomy. MRI specialist Kamil Ugurbil of the University of Minnesota, Minneapolis, says of the new class of machines: “I think these will become relatively abundant and popular in the near future.”

    Along with allowing researchers to test new molecular imaging techniques before they are used in human patients, all this new hardware is starting to generate handsome research payoffs, many of which were on display at a recent meeting entitled “Imaging in 2020” in Jackson Hole, Wyoming. The meeting, the first of its kind, brought together everyone from radiologists and surgeons to physicists and chemists working to design novel imaging probes and advance the state of molecular imaging. The results on display ranged from images tracing specific neural connections in the olfactory bulbs of rats to maps showing the activity of newly introduced genes in genetically engineered animals.

    “There is very significant work you can do with animals that you can't really do in humans,” says Ugurbil. Researchers can keep animal subjects immobilized for extended periods of time, inject them with potentially toxic image-enhancing contrast agents, alter their genes, and subject them to higher magnetic fields or x-ray doses than are permissible with human patients. And unlike the one-time snapshots obtained by dissecting a research animal, the images can be cross-indexed to produce complex digital atlases that track key molecular players in tissues throughout the body.

    Building the hardware hasn't been easy. That's because most imaging schemes have relatively poor resolution, explains Michael Huerta, who helps oversee extramural funding at the National Institute of Mental Health in Rockville, Maryland. Clinical MRI systems, for example, can see features about 1 cubic millimeter in size. “In humans, that resolution is not bad,” says Huerta. “But in a mouse, that volume occupies a lot of the brain.”

    Much finer detail is now within reach of virtually all types of imaging systems. MRI, for example, uses a combination of strong magnetic fields and radio waves to make certain atomic nuclei in a sample, typically those of protons in water, resonate and reveal their location. By boosting these magnetic fields and using advanced software algorithms, researchers have managed to improve the resolution in animal systems to about 50 micrometers on a side, an 8000-fold reduction in volume. With Micropet, meanwhile, new detectors capable of picking up fainter gamma ray signals from the radioactive probe compounds used in PET have improved resolution to 2 millimeters on a side, a 10-fold reduction in volume.
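    The quoted gains follow from cubing the ratio of linear resolutions. A quick sanity check in Python (voxel sizes taken from the text above; isotropic voxels assumed):

```python
def volume_reduction(old_mm: float, new_mm: float) -> float:
    """Factor by which voxel volume shrinks when linear resolution
    improves from old_mm to new_mm (assumes cubic, isotropic voxels)."""
    return (old_mm / new_mm) ** 3

# Clinical MRI (~1 mm voxels) vs. animal MRI (50 micrometers = 0.05 mm):
print(volume_reduction(1.0, 0.05))   # 8000-fold, matching the text

# PET: a 10-fold volume reduction at 2 mm implies the older scanners
# resolved roughly 2 * 10**(1/3) mm (inferred here, not stated in the text)
print(2 * 10 ** (1 / 3))
```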

    In one demonstration of what these new high-resolution machines can do, Ugurbil reported using a high-field animal MRI scanner on cats to produce the first-ever MRI pictures of activity in brain structures called ocular dominance columns. An MRI variant called functional MRI, or fMRI, can measure metabolic activity in various tissues by tracking blood flow. But it had lacked the resolution to see these columns—collections of cells in the brain that all respond to particular visual cues, such as horizontal or vertical lines—forcing researchers to map the columns with tiny electrical probes inserted into the brain. Now, with their new high-magnetic field MRI machine, Ugurbil and his colleagues were able to see the columns, each a mere 300 micrometers or so in size. “It's not a new neuroscience discovery. But for fMRI, it's a very big discovery to map such small regions,” says Ugurbil. And that advance, he adds, should allow his team to make detailed studies of the development of vision in individual cats over time without ever having to harm the animals.

    Other teams got equally impressive results. Johnson and his Duke colleagues, who were among the first to push MRI resolution down to 50 micrometers on a side, were able to make MRI movies of the lungs of a guinea pig as it breathes. What makes this feat particularly impressive is that MRI normally can't see lungs at all. The lungs contain very little water, which MRI normally tracks. To get around this problem, Johnson and his colleagues used another trick besides their high-resolution imager: hyperpolarized helium. Like other magnetic atoms, helium atoms act like tiny bar magnets. In a typical helium gas, these magnets point in all directions, meaning that the gas has no net magnetic field. But in recent years, labs around the world have come up with schemes to use circularly polarized light waves to align the bulk of the atomic magnets in helium gas. When the inert gas is then breathed by a subject, such as a guinea pig, it gives off a strong magnetic signal that is picked up by the MRI machine.
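    The cancellation-versus-alignment idea can be illustrated with a toy spin model. Everything here is a simplification for illustration (spins restricted to up/down, arbitrary atom count and polarization fraction), not a model of real optical pumping:

```python
import random

def net_signal(n_atoms: int, polarization: float) -> float:
    """Mean magnetization when a fraction `polarization` of spins is
    forced to point up and the rest point in random directions
    (toy model: each atom is +1 or -1 along one axis)."""
    spins = []
    for _ in range(n_atoms):
        if random.random() < polarization:
            spins.append(1)                       # optically pumped: aligned
        else:
            spins.append(random.choice([1, -1]))  # thermal: random direction
    return sum(spins) / n_atoms

random.seed(0)
print(net_signal(100_000, 0.0))   # ordinary gas: spins cancel, near-zero signal
print(net_signal(100_000, 0.5))   # hyperpolarized: strong net signal
```

With zero polarization the random spins cancel, which is why ordinary helium is invisible to MRI; aligning even a fraction of them produces the strong net signal the scanner picks up.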

    Another unusual MRI tracer enabled Alan Koretsky, a chemist with the National Institute of Neurological Disorders and Stroke in Bethesda, Maryland, to track anatomy in the brain. Koretsky and his colleagues used manganese, which is chemically similar to calcium, the element that helps trigger the release of neurotransmitters from neurons so they can communicate with their neighbors. Unlike calcium, however, which exits the cell rapidly, manganese sticks around on the inside. And because the element has a strong magnetic signature, it enables MRI to track neural activity. As a bonus, manganese can also travel between cells, which allows researchers to track neural connections.

    Koretsky reported that when he and his colleagues exposed mice to a strong odor spiked with manganese, images of the neural connections in the olfactory bulb—a pea-sized neural relay station in the brain's frontal lobe—lit up on their MRI machine. “We were able to watch as the manganese moves through neurons and see exactly where neurons [travel],” says Koretsky. Scott Fraser, a developmental biologist at the California Institute of Technology in Pasadena, says the new work is “really fascinating,” because the technique holds out the potential to image neural development during the first months of life.

    Lighting up particular tissues is only half the fun. Over the past 2 years, several labs have shown that they can also trace the activity of implanted genes that have been tailor-made to make a protein that can be picked up by an imager. Traditionally, researchers have had to sacrifice an animal to check its gene expression, which yields only a snapshot of that point in the animal's life. By imaging gene expression in live animals, researchers will get more of a movie, tracking expression over time and correlating it with the animal's behavior and health. Down the road, these techniques should also aid drug development. “You can look at what genes are turned on before or after drug delivery,” says Ronald Blasberg, a PET specialist at the Memorial Sloan-Kettering Cancer Center in New York City, offering clues to how novel drugs work in the body.

    Imagers will also likely prove useful for monitoring how altering a gene affects an animal's development and function. Johnson, for example, presented a technique he and his colleagues have dubbed “rapid phenotyping,” which aims to correlate genetic changes with alterations in tissue development. The researchers use a technique called magnetic resonance microscopy that takes a series of digital MRI slices through an animal (1000 in this case), which they then “stack” one atop another to produce a three-dimensional reconstruction of its body. By comparing such 3D maps of one animal with maps of another that has an altered gene, they will be able to see how the change affected the animal's physiology.
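    The stack-and-compare step can be sketched with NumPy. The slice count comes from the text; the 64x64 slice size and zero-filled placeholder arrays are assumptions made purely for illustration:

```python
import numpy as np

# Placeholder data: 1000 MRI slices, here 64x64 pixels each
slices = [np.zeros((64, 64), dtype=np.float32) for _ in range(1000)]

# "Stacking" the 2D slices one atop another yields a 3D volume
volume = np.stack(slices, axis=0)    # shape: (1000, 64, 64)

# Comparing the 3D map of one animal with that of a genetically
# altered animal is then a voxel-wise operation
altered = np.zeros_like(volume)
difference = volume - altered        # nonzero voxels flag anatomical change
print(volume.shape)
```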

    Johnson and others agree that these new techniques will be a boon for understanding animal genetics. “There are 2 million genetically engineered mice sold in [the U.S.] each year,” says Michael Phelps, a PET imaging specialist at UCLA. “That's expected to triple in a few years,” he adds, and imaging could help researchers track the altered genes' activity and effects. Animal imaging has now gained such momentum that in September the National Institutes of Health announced that it will spend nearly $1 million on the initial development of a new computerized mouse imaging atlas. The project will attempt to duplicate the success of the human brain mapping project begun in 1993, which collects various types of imaging data—CT scans, MRI, histology, and so on—and converts them all into digital form. Researchers can then combine the various elements in a single image, seeing instantly how a discovery made with one imaging method relates to everything else that has been learned.

    Such maps should allow researchers to move from comparing a few animals to comparing hundreds or thousands, thus providing much more reliable information about biology and disease. Researchers interested in tracking the expression of one particular gene over time, for example, will be able to see how its expression compares to the expression of other genes detailed by other teams. Multiple teams will also be able to work together to chart the many developmental steps that take place shortly after birth, says Huerta: “It will be a very rich information space.” The same will likely be said of animal imaging in general.
