News this Week

Science  01 Jan 1999:
Vol. 283, Issue 5398, pp. 12
  1. BIOLOGY

    RNA Molecules May Carry Long-Distance Signals in Plants

    1. Evelyn Strauss

    Like all multicellular organisms, plants have a long-distance transport system that carries nutrients and messenger molecules to far-flung organs. But plant biologists had thought that plants send only small and simple signaling molecules along these highways, a set of pipelines throughout the plant called the phloem. Large molecules, they thought, could not negotiate the narrow access channels leading to the phloem. On page 94, however, researchers report the discovery of a carrier protein that can apparently truck large RNA molecules into the phloem, suggesting that RNA transport may be part of a plant “information superhighway.”

    “The work identifies what may be an essential component in a new system plant cells use to talk to each other,” says Richard Jorgensen, a molecular geneticist at the University of Arizona in Tucson. “We're getting at the components of a system that controls how RNA is moved between cells and around the plant.”

    The finding, by William Lucas, a plant cell biologist at the University of California, Davis, and colleagues, provides the first glimpse into the mechanism of a transport system that might control gene expression in distant cells. This “highway” could help resolve some long-standing mysteries about how information travels among plant parts such as leaves and flowers, researchers say. Not everyone is persuaded that the new protein actually carries RNA from cells into the phloem, but the work fits with previous clues that plants use big molecules to transmit information, says Robert Turgeon, a plant physiologist at Cornell University. “There's a possibility that what we're looking at, in fact, is a very sophisticated situation where regulation occurs by macromolecules over very long distances,” he says.

    The researchers made their discovery by extrapolating from the behavior of plant viruses. Scientists knew that to spread infection throughout a plant, these microbes must move large nucleic acid molecules into the phloem, where they enter a high-pressure stream that sweeps them to distant tissues. To get into this stream, a molecule must pass through narrow channels that feed into cells called sieve elements, which form the phloem tube. The channels are ordinarily too small to allow passage of a big nucleic acid, but viruses somehow manage to widen them with so-called viral movement proteins.

    Lucas's team guessed that the viral movement proteins were mimicking plant proteins that do the same thing. To find the plant counterparts, the researchers applied an antibody against a viral movement protein to pumpkin phloem sap, where it bound to a 16-kilodalton protein called CmPP16. They then detected this protein and the RNA that encodes it in the sieve elements, which have neither nuclei—and thus produce no RNA—nor the machinery to make proteins. That suggests that the protein and RNA had moved in from adjacent cells. To test the idea, they injected CmPP16 and a variety of RNA molecules into plant cells and found the molecules in neighboring cells.

    Finally, by grafting a piece of a cucumber plant onto a pumpkin plant, the researchers showed that CmPP16 and its mRNA can move long distances. In phloem sap from the cucumber graft, they found the pumpkin CmPP16 and its mRNA, indicating that these large molecules had traveled into the graft. Together, the findings suggest that CmPP16 acts like a viral movement protein, ferrying RNA into the phloem. “This is the first plant-encoded movement protein that's been found,” says Jorgensen.

    But not everyone is convinced. The work doesn't actually show that CmPP16 and its mRNA traverse the channels that enter the phloem, and so doesn't rule out the possibility that the protein and mRNA found in the sieve elements are remnants from immature sieve elements, which do contain nuclei, says Turgeon. And the microinjection technique used in the experiments delivers such an overwhelming load of molecules into cells that finding the molecules nearby doesn't necessarily reflect a natural system for moving them into the phloem, says James Carrington of Washington State University.

    Even if CmPP16 is actually trucking RNAs into the phloem for long journeys, at the moment no one is quite sure what those RNAs are doing. “It's nice work, and now what we need are good functional tests to assign significance to these proteins and nucleic acids in the phloem,” says Carrington.

    Researchers already have several ideas, based on previous hints that RNAs might be used in plant signaling. Cells receiving distant RNA molecules could translate them directly into new proteins, for example. And plants are known to battle viruses by dispatching some kind of messenger molecule to distant cells, where it degrades viral RNA. This system requires that the “turn off” message be sequence specific, suggesting that the messenger is a nucleic acid, possibly another RNA. Plants could use a similar mechanism for developmental or physiological purposes, says Jorgensen. A hint that they do comes from the current study, where two cucumber proteins that may be related to CmPP16 disappeared during the grafting experiment. Lucas says that when the pumpkin RNAs arrived in new tissue, they may have shut down production of those proteins.

    Indeed, an RNA messenger system might solve long-standing puzzles about how different parts of plants talk to each other. For example, scientists have long known that some substance travels from leaves to buds, conveying the signal to flower in response to cues such as day length. But no one knows the identity of this messenger. The new results inspire speculation that it is an RNA, says Winslow Briggs, a plant physiologist at the Carnegie Institution of Washington at Stanford. For those studying traffic on the plant information highway, the new RNA-transport molecule could be a good ride.

  2. HUMAN GENETICS

    Iceland OKs Private Health Databank

    1. Martin Enserink*
    1. Martin Enserink is a science writer in Amsterdam.

    Ending months of furious and, at times, bitter debate, the Icelandic parliament has given a private company permission to build a database containing the health records of the entire nation. But critics of the legislation, passed 16 December by a sizable majority, immediately pledged to find ways to block its implementation.

    The new law grants one company, deCODE Genetics from Reykjavik, the right to establish and commercially exploit a nationwide database created through agreements with hospitals, clinics, and individual physicians to submit their patients' medical records. The company expects this information to greatly speed up its search for disease-causing genes, on which diagnostic tests and therapies could be based. Icelanders belong to a very homogeneous gene pool, making disease genes much easier to spot here than in other populations.

    The Icelandic government hopes the database, which will also be available to health officials, will improve the country's health care system. It also sees genetics as a promising way to generate high-tech jobs for the country's small, fish-based economy. “We have quite a few people abroad who have educated themselves in this field. Now, they can come home and work on this,” says Siv Fridleifsdottir, vice-chair of the Committee on Health in the Althingi, the Icelandic parliament. But the deCODE bill, introduced last spring and then revised over the summer, has touched off a sulfurous battle within the research community (Science, 14 August 1998, p. 890, and 30 October 1998, p. 859). “This has totally destroyed the scientific atmosphere,” says Eirikur Steingrimsson, a geneticist at the University of Iceland.

    Critics of the bill say it violates basic ethical principles because patients will not be asked for their consent before their records are deposited in the database. They argue that there should be more safeguards to secure privacy, and that one company should not have the commercial rights to a whole nation's gene pool. Over the past few months, dozens of medical, scientific, and patients' organizations testified against the bill in committee hearings. “We look at this as a black day in the medical and scientific community,” says psychiatrist Tomas Zoega, chair of the Ethics Committee of the Icelandic Medical Association. “But the battle will keep on going.”

    deCODE's founder and president, Kari Stefansson, says that many opponents have acted out of professional envy rather than ethical concerns. “A subpopulation of people working in biomedicine in Iceland feels that we have disrupted their lives simply by our size,” says Stefansson, a former Harvard University geneticist. “They have great difficulty recruiting people in their labs and competing with us.” Now that the bill is passed, he adds, “I expect that there will be a lot of reconciliation.” Adds University Hospital gastroenterologist Bjarni Thjodleifsson, who is working with deCODE on a genetic study of inflammatory bowel disease, “This is a revolutionary bill, and people are unduly paranoid about their position. As the dust settles, matters will clear up, and trust can be obtained.”

    With only two defections from the ruling coalition, the bill passed parliament by a vote of 37 to 20. Still, the debate opened many wounds in the body politic. Critics claim that deCODE had too much influence in drafting the bill. In particular, they point to a last-minute addition that allows deCODE to link the database's medical information to existing genealogical records and to genetic information that the company collects in its own studies—an arrangement that critics say will make it relatively easy to identify individual patients and learn sensitive details about them. “I have never witnessed such a stronghold [on the parliament] by one company that has interests in a law,” says Social-Democrat Össur Skarphedinsson, chair of the health panel.

    But Stefansson says the company was not trying to hide anything. “This [database link] had been the idea that was discussed from day one,” he says. “If the politicians say they didn't know about it, they are being very disingenuous.” He also denies that the company has received any special favors. “You can have a stronghold simply by the power of your idea.”

    Despite their defeat, deCODE's critics haven't given up. One recourse, says Zoega, is to ask the Icelandic and European courts to overturn the law on the grounds that it violates an individual's right to privacy. In addition, the bill allows individuals to notify the surgeon-general if they oppose use of their data, and the medical association may place ads and provide patients with the necessary forms, he adds. Already, 44 general practitioners and 109 hospital specialists have pledged not to send information to the database unless a patient specifically requests them to do so. “We will certainly be dragging our feet,” Zoega says about participating in the data collection.

  3. ASTROPHYSICS

    Has a Dark Particle Come to Light?

    1. James Glanz

    PARIS—A strange new particle may have left its mark in a mountain in central Italy. Its appearance, which has yet to be confirmed, would not be entirely unexpected, but it would have profound implications. Reclusive and ponderous, in this case with about 60 times the mass of the proton, such WIMPs (for weakly interacting massive particles) could account for some or all of the mysterious dark matter that astronomers believe far outweighs the galaxy's glowing stars and gas clouds.

    Hoping to detect particles of dark matter, researchers have set up WIMP detectors in underground laboratories around the world (see Science, 21 March 1997, p. 1736). Now, a collaboration working at the Gran Sasso laboratory in Italy's Apennine Mountains has picked up the strongest hint so far of passing WIMPs: a particle count that appears to vary with the seasons, as the Earth's orbit carries it into a galactic wind of WIMPs and then away again. The Gran Sasso result “is in favor of this modulation” with about a 99% level of statistical confidence, said Pierluigi Belli, a University of Rome physicist who is a member of the collaboration, called DAMA (for DArk MAtter).

    The claim, which Belli announced here on 16 December at a gathering called—despite the location—the Texas Symposium on Relativistic Astrophysics and Cosmology, faces plenty of skepticism. But if WIMPs are real, they might settle a long-standing problem. The Milky Way and other galaxies spin so quickly that the gravity of their ordinary, luminous matter is not enough to keep them from flying apart. Perhaps 90% of the galaxies' mass has to consist of some unseen matter to add the extra gravitational glue. Theories of how elements were forged in the big bang, however, limit the universe's complement of baryonic matter—the ordinary stuff of which planets, stars, and people are made—to less than is required to make up the deficit.
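
    The reasoning behind that missing-mass estimate can be made explicit with a standard piece of dynamics (textbook physics, not a calculation from the DAMA work): balancing gravity against the centripetal acceleration of a star on a roughly circular orbit gives the mass enclosed within its orbit.

    ```latex
    % Circular-orbit mass estimate (standard dynamics, illustrative only):
    \frac{v^{2}}{r} \;=\; \frac{G\,M(<r)}{r^{2}}
    \qquad\Longrightarrow\qquad
    M(<r) \;=\; \frac{v^{2}\,r}{G}
    % Rotation speeds v that stay high out to large r imply M(<r) grows roughly as r,
    % i.e., mass keeps accumulating well beyond the visible stars and gas.
    ```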

    WIMPs have become a favorite candidate for fleshing out galaxies to the required mass, in part because they are natural consequences of some speculative theories of particle physics. In a theory called supersymmetry, which many theorists hope will extend the current picture of particles and forces, each known particle has a still-undiscovered massive partner. A WIMP of about the mass suggested by the DAMA results could be the lightest of these supersymmetric partners, a particle called the neutralino.

    Detecting a WIMP is a matter of setting a trap and waiting. DAMA consists of nine 9.7-kilogram crystals of doped sodium iodide—a material that scintillates, or generates a flash of light, when one of its nuclei or electrons recoils after interacting with another particle. Photodetectors gather the light and the results are stored on computers for analysis.

    Because of natural radioactivity in the rock and other materials surrounding the detectors, “there is a huge mountain of background signals,” said Bernard Sadoulet of the Center for Particle Astrophysics at the University of California, Berkeley. To sort any WIMP signal from this noise, the DAMA researchers looked for a subtle seasonal variation in the scintillation counts. When the galaxy formed from a collapsing cloud of gas, the cloud's stately rotation was amplified, like that of a ballerina drawing in her limbs, so that the visible matter of the galaxy now spins rapidly, carrying the sun around the galactic center at some 220 kilometers per second. But the WIMPs would not have collapsed because they can't radiate photons to shed energy. Like the primordial gas cloud, they should hardly rotate at all. As a result, the sun should encounter the WIMPs as a kind of wind, said Sadoulet. Because Earth's orbital motion adds to the sun's velocity in the summer and subtracts in the winter, the WIMP signal should show a slight annual modulation.
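
    The size of the expected seasonal effect follows from simple velocity addition. The sketch below uses the 220-kilometer-per-second solar speed quoted above together with standard assumed values for Earth's orbital speed, the projection of the orbit onto the sun's motion, and the date of maximum (early June); none of those assumed values come from the article.

    ```python
    import numpy as np

    # Back-of-the-envelope sketch of the expected annual modulation of the WIMP "wind."
    # V_SUN is the figure quoted in the article; the other values are standard assumptions.
    V_SUN = 220.0     # km/s, speed of the sun through the galactic halo
    V_EARTH = 30.0    # km/s, Earth's orbital speed (assumed)
    COS_INCL = 0.5    # ~cos(60 deg), projection of Earth's orbit onto the sun's motion (assumed)
    T_PEAK = 152      # day of year when the two velocities add most directly (~early June, assumed)

    def wind_speed(day_of_year):
        """Approximate detector speed through the WIMP halo on a given day (km/s)."""
        phase = 2.0 * np.pi * (day_of_year - T_PEAK) / 365.25
        return V_SUN + V_EARTH * COS_INCL * np.cos(phase)

    days = np.arange(365)
    speeds = wind_speed(days)
    print(f"summer peak {speeds.max():.0f} km/s, winter trough {speeds.min():.0f} km/s, "
          f"fractional swing ~{(speeds.max() - speeds.min()) / (2 * speeds.mean()):.1%}")
    ```

    The detector's speed through the halo swings by only a few percent over the year, which is why the count-rate modulation is subtle and statistically delicate to extract from the radioactive background.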

    In 1997, the DAMA group presented weak hints of a modulation. And now, based on 180 days of data collected from November 1996 to July 1997, they are more confident in claiming that they have seen “an effect satisfying all the distinctive requirements for a WIMP-induced process,” as Rita Bernabei of the University of Rome, the DAMA group leader, puts it.

    An unambiguous WIMP detection would delight theorists. But in sharp exchanges after Belli's talk, experimenters took aim at everything from the DAMA group's statistical analysis techniques to the fact that data presented so far cover mainly the rising part of the modulation. “You have not shown us that the signal is going up and down,” said Sadoulet, “which would be much more convincing to the community.”

    “Yes, of course,” Belli shot back. “This is a work in progress.” Bernabei says that the collaboration is analyzing additional data “to verify the reproducibility of the effect—with proper features—over several cycles.” Other evidence for the reality of WIMPs could also come from efforts to create supersymmetric particles in an accelerator at CERN, the European laboratory for particle physics in Geneva, and from other dark matter detectors such as those at Sadoulet's laboratory. Said Antonio Masiero, a theorist from the International School of Advanced Studies in Trieste, Italy, “Other WIMP experiments are close, so it is starting to be exciting.”

  4. NEUROBIOLOGY

    Filling in the Blanks of The GABAB Receptor

    1. Ingrid Wickelgren

    Valium and its copycat drugs soothe jangled nerves by augmenting the actions of the brain's own sedative, a neurotransmitter known as γ-aminobutyric acid (GABA). They do this by binding to one of the cell-surface molecules through which GABA exerts its effects, the GABAA receptor. But neurons have other GABA receptors that could also serve as drug targets for treating disorders ranging from epilepsy to pain. Now, four research teams have discovered a feature of this second class of GABAB receptors that could open the way to more effective and subtle manipulations of the brain's GABA system.

    The groups—one reporting its results in this issue of Science—have found that the GABAB receptor is not a single molecule but instead consists of two different proteins, neither of which is effective on its own. This marriage of two disparate proteins to produce a functional receptor offers greater opportunities for drug design, as researchers can now target each protein separately as well as the receptor as a whole. And it has researchers speculating that the same kind of marriage, called a heterodimer, might also turn up in other members of the receptor class to which GABAB belongs. These are known as G protein-coupled receptors for the kind of protein that relays their signal into the cell, and they number some 1000 in all.

    “This is pretty wild,” says neurobiologist Roger Nicoll of the University of California, San Francisco. “No one had ever shown that these [G protein-coupled] receptors can form heterodimers.” Kenneth Jones at Synaptic Pharmaceutical Corp. in Paramus, New Jersey, whose group reported its findings in Nature, says the research “has major implications” both for understanding the workings of this large class of molecules, which also includes receptors for the neurotransmitter serotonin and for opiates, and for developing novel drugs to block or stimulate them.

    The discovery solves a mystery that arose early in 1997 when molecular biologist Bernhard Bettler at the drug giant Novartis in Basel, Switzerland, and his colleagues cloned the first gene for a GABAB component, a protein called GBR1. When inserted into cells, however, GBR1 could not perform a key function of natural GABAB receptors: opening membrane channels that allow potassium ions to flow out of the cell. Now, Bettler's team and three others have found out why.

    Aware that something seemed to be missing from the receptor cloned by the Bettler team, groups led by Hans-Christian Kornau of the biotech firm BASF-LYNX Bioscience AG in Heidelberg, Germany, and by Fiona Marshall at Glaxo Wellcome's Molecular Pharmacology unit in Stevenage, England, coaxed yeast cells to express the tail of GBR1. The tail was to serve as a bait for picking up any proteins that interact with it and might be needed for GABAB function. Both teams turned up the same protein, which Kornau dubbed GBR2. Like GBR1, GBR2 turned out to contain seven hydrophobic regions that could thread through the lipid-rich cell membrane and two ends that could project inside and outside the cell. This structure suggested that GBR2 is also a receptor, and thus that two receptor molecules may operate as a duet in cells.

    Meanwhile, the Novartis, Synaptic, and Glaxo teams were searching the GenBank human gene database for proteins resembling GBR1 in hopes of finding one that would do a better job of reproducing the GABAB receptor's functions. Remarkably, they all picked out the same protein that had popped up in the yeast. But when the researchers coaxed cultured cells to express GBR2, along with the requisite potassium channels, this receptor also failed to produce robust potassium currents in response to GABA treatment.

    Thinking they had missed the active part of the GABAB receptor, the Synaptic team was about ready to give up when they looked at the expression patterns of both GBR1 and GBR2 in sections of rat brain, and noticed a striking overlap. This overlap, which the other scientific teams also saw, suggested that the two proteins may work together in individual neurons.

    And that's what all the groups have now shown. When they expressed both GBR1 and GBR2 in cultured frog or human cells, the cells produced potassium currents. “It worked beautifully,” says Jones. The Novartis group went one step further: Aided by specific antibodies, they demonstrated that the two proteins are closely associated on individual brain neurons. In addition, the Kornau group has mapped the site where the two proteins interact.

    The BASF-LYNX group reports its findings on page 74; the other three papers appeared in the 17 December 1998 Nature. Together, the four papers suggest that GBR1 and GBR2 cooperate in at least two ways. First, the proteins are likely to help each other transmit GABA's signal within a neuron, allowing the neurotransmitter to activate the potassium channels. In addition, GBR2 may help shuttle GBR1 to its final location on the cell membrane, since the Glaxo team showed that GBR1 does not get to the membrane unless GBR2 is present.

    Whatever the nature of the partnership, by providing a fully functional receptor, the discovery of GBR2 should help researchers design new drugs that work through GABAB. Drugs that target GABAB might, for example, provide a new range of therapies that help depress the excessive neuronal firing characteristic of epilepsy, pain, and anxiety, or perhaps help relieve neuronal inhibition to bolster memory or ameliorate depression. As yet, however, nobody can predict how successful such drug development efforts will be. “We're still far from a direct clinical application,” says Kornau, “but knowing this receptor's structure is a significant step forward.”

  5. PLANETARY SCIENCE

    Moon-Forming Crash Is Likely in New Model

    1. Dana Mackenzie*
    1. Dana Mackenzie is a writer in Santa Cruz, California.

    The greatest accident in Earth's history was probably no accident at all, according to new computer simulations of the early solar system. Planetary scientists believe that sometime in the first 100 million years after the solar system took shape from gas and dust, a Mars-sized planet smashed into Earth. The impact liquefied Earth's surface and ejected a huge blob of material that coalesced into the moon. Far from being a chance encounter that defied all the odds, the new simulations suggest, an impact like this is expected to occur in the solar system's first 100 million years.

    “The lesson is that giant impacts are common,” says Robin Canup of the Southwest Research Institute (SWRI) in Boulder, Colorado, who developed one of the models. “They're not the wild, ad hoc event that they were once believed to be.” Her simulations, which were announced last month at the Origin of the Earth and Moon conference in Monterey, also tracked for the first time how smaller collisions following the giant impact could have tweaked Earth's rotation rate and the tilt of its axis to match what is seen today.

    Don Davis and William Hartmann, now of the Planetary Science Institute in Tucson, Arizona, proposed the giant impact theory in 1975. By the mid-1980s, it had emerged as the leading explanation for the moon, largely because every other theory appeared to have fatal flaws. For example, “co-accretion,” in which the Earth and moon grew up together, failed to explain why the moon has a much smaller iron core than Earth. The “fission” model, in which the moon spun off from the outer layers of Earth, failed to explain why the moon has an iron core at all. But planetary scientists embraced the impact scenario reluctantly, says Hartmann. “The big objection in those days was that a giant catastrophic collision seemed ad hoc to all the other workers.”

    Testing the idea by simulating the motion of the dozens of “protoplanets”—the building blocks of the planets—in the early solar system was impossible until recently. Computers weren't up to the task, and existing mathematical methods limited the length of time that could be modeled. After 100,000 orbits or so, the accumulated errors in the computations would cause the planets to fly away to infinity or spiral into the sun. Canup, Craig Agnor of the University of Colorado, Boulder, and Harold Levison of SWRI, however, use a new method called “symplectic integration,” which keeps the energy of the virtual solar system from drifting, enabling researchers to model tens of millions of orbits.
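
    The flavor of why symplectic schemes help can be seen in the simplest one, the kick-drift-kick “leapfrog.” The sketch below is generic and assumes code units in which the gravitational parameter is 1; it is not the Canup group's code, which uses a more elaborate mixed-variable symplectic method for many bodies, but it shows the key property: energy errors stay bounded instead of accumulating.

    ```python
    import numpy as np

    # Minimal kick-drift-kick ("leapfrog") integrator for a test particle around a unit mass.
    # Leapfrog is the simplest symplectic scheme; its energy error stays bounded over many
    # orbits instead of growing steadily, so orbits don't spuriously escape or spiral inward.
    GM = 1.0  # gravitational parameter in code units (assumed)

    def accel(r):
        return -GM * r / np.linalg.norm(r) ** 3

    def leapfrog(r, v, dt, n_steps):
        for _ in range(n_steps):
            v = v + 0.5 * dt * accel(r)   # half kick
            r = r + dt * v                # full drift
            v = v + 0.5 * dt * accel(r)   # half kick
        return r, v

    def energy(r, v):
        return 0.5 * v @ v - GM / np.linalg.norm(r)

    r0 = np.array([1.0, 0.0])
    v0 = np.array([0.0, 1.0])                            # circular orbit, period 2*pi in these units
    r, v = leapfrog(r0, v0, dt=0.01, n_steps=100_000)    # roughly 160 orbits
    drift = abs(energy(r, v) - energy(r0, v0)) / abs(energy(r0, v0))
    print(f"relative energy drift after ~160 orbits: {drift:.2e}")
    ```

    With a non-symplectic scheme of similar cost, the energy error would instead grow steadily with time, eventually producing exactly the artificial escapes and in-spirals described above.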

    Canup's simulations begin about 10 million years after the birth of the solar system, when gas and dust would have coalesced into about two dozen protoplanets, and continue until the full-fledged planets have settled into stable orbits, typically after about 100 million years. At this point, the inner solar system nearly always contains only four or five planets that have swept up all the rest. Usually, one or two of these planets have experienced impacts large enough to form a moon, Canup found. Two other research groups, relying on symplectic modeling techniques, have gotten similar results.

    Planetary scientists agree that the new simulations are not the last word. “The problem with the simulations is that they are all primitive in one way or another,” says Jack Lissauer of NASA's Ames Research Center in Mountain View, California, a member of one of the other modeling groups. In particular, all three models assume that colliding objects stick together like lumps of clay; in reality, many of the collisions probably threw out debris, affecting the size and orbit of the resulting body. (The new models don't actually show debris flying off to form the moon; they simply show impacts big enough to do the job.)

    Still, Canup's simulation may resolve an inconsistency in the giant impact scenario: the difficulty of producing a collision large enough to get a moon-sized body into orbit, but with low enough angular momentum to produce the orbit seen today. Most plausible impacts would have resulted in a lunar orbit much farther out.

    Earlier this year, Alastair Cameron of Harvard University proposed one solution. His model assumes the Earth was only two-thirds its present size when the impact occurred, allowing the impactor to be small and eliminating the angular momentum problem. If the collision took place early enough in planetary history, plenty of debris would have remained to feed the growth of Earth to its present size. But there's a snag: comparisons of the chemical compositions of the Earth and moon imply that the Earth was fully formed, or nearly so, when it spawned the moon.

    Canup's simulations provide a different way out. They show that before or after the giant impact, Earth could have experienced a shower of small impacts, which could have slowed the rotation of the Earth-moon system. “Smaller impacts are very effective at tweaking the spin of a planet, even though they add very little mass,” Canup says. Lissauer, however, is not convinced that Canup's solution escapes the problem of the similarity in Earth-moon composition, noting that his “back-of-the-envelope” calculation shows that the impactors would have added at least 4% of Earth's present mass after the moon had been born.

    The new impact simulations also give astronomers who are searching other stars for planets like our own an extra tool. By making giant impacts a likelihood and smaller impacts a near certainty, the simulations suggest that the birth throes of moons like our own might be visible in other solar systems. Alan Stern of SWRI, who has calculated that a moon-forming impact should be detectable from a distance of 400 light-years, says the glow of such a cataclysm “would be the only way of detecting an Earth-sized planet in another solar system directly.”

  6. HUMAN CLONING

    Korean Report Sparks Anger and Inquiry

    1. Michael Baker*
    1. Michael Baker writes from Seoul.

    SEOUL, KOREA—Last month's report by doctors at a Seoul fertility clinic that they had cloned a human embryo has set off a storm of scientific doubts and public anger here. Announced on 14 December by a team from Kyunghee University Hospital, the procedure has also sent Korean politicians scrambling to fill a hole in rules adopted last year by the Ministry of Health and Social Welfare that cover genetic research but not human cloning.

    A press release from the Kyunghee doctors described a procedure using cells donated by a patient, a woman in her 30s, for unspecified experimental purposes. The researchers say they removed the nucleus of an egg cell and inserted the nucleus of an adult cumulus cell—a kind of cell that surrounds the egg within the ovary—before chemically activating the new cell. The reconstituted egg cell then divided twice in vitro before the researchers ended the experiment.

    Fertility specialists Kim Bo Sung and Lee Bo Yun, two members of the Kyunghee team, say they closely followed a technique used by University of Hawaii researchers to clone mice (Nature, 23 July 1998, p. 369). But they have declined to provide details and say that no results will be published until further work is done. Lee says he thought that pre-implantation-stage research might be helpful in treating infertile couples, but that the team ended the experiment at the four-cell stage because of ethical concerns.

    Other scientists here have joined a global chorus of skepticism about these claims. They point out that it is common for ordinary unfertilized eggs to divide twice when stimulated, and they have derided the Kyunghee team for grandstanding. Seo Jung Sun, head of a four-man committee appointed by the Korean Medical Association to investigate the Kyunghee results, says he has “lots of questions” for the researchers, including whether the introduced genes were actually directing the cell divisions.

    An expert at one of Seoul's biggest in vitro fertilization clinics, who asked not to be identified, says the Kyunghee doctors “are not experts” on cloning and have few published papers. He sees the experiment as a response to heavy competition among what he calls “test tube baby centers” (there are up to 30 in Seoul, and 80 nationwide), noting that such publicity might be expected to drum up more business.

    The announcement nevertheless produced a strong public outcry, with newspaper editorials evoking the specter of “large numbers of Adolf Hitlers.” In a brief telephone interview, Lee professed “surprise” at the strong reaction. Ethical oversight of research is sparse in Korea, and many university programs have no ethics committees to judge experiments. The Korean Fertility Society visits fertilization centers, but offers no certification or regulation.

    Responding quickly to the outcry, several legislators say that they want to ban all human cloning experiments except those that relate to disease research. One proposal already before the National Assembly would give the job of reviewing such experiments to a committee of representatives from government, religious groups, research, and industry. Seo says he hopes any legislation would still permit in vitro research with embryos up to 14 days old, but that it may be difficult to find support for such an approach. “Before [the experiment], congressmen were cooperative,” he says. “But now they are really anxious.”

  7. MOLECULAR BIOLOGY

    DNA Chips Give New View of Classic Test

    1. Elizabeth Pennisi

    It's a simple experiment, one that cell biologists have been doing for at least a quarter-century. Take a culture of connective tissue cells called fibroblasts, deprive them of nourishing serum for 2 days, then add back the serum and watch the genes that turn on as the fibroblasts grow. By now, you might think that biologists would have the cells' responses pretty well figured out. You'd be mistaken.

    The standard view has been that the serum's growth factors and other nutrients switch on the fibroblasts' cell proliferation program, stimulating them to divide. But on page 83, molecular biologists Patrick Brown and Vishwanath Iyer of Stanford University and their colleagues report a very different picture. Using a DNA chip that allowed them to monitor more than 8600 genes at once, the Stanford team found that the serum not only stimulates cell division, it also turns on genes needed for wound healing.

    The work demonstrates the power of DNA chips for looking at how entire batteries of genes coordinate their activity. It also shows that even isolated cells can react as if they were still in intact tissue, initiating gene changes that would bring about the cell-to-cell interactions needed for wound healing. With this new approach, says Jennifer Lippincott-Schwartz, a cell biologist at the National Institute of Child Health and Human Development in Bethesda, Maryland, “Pat Brown is offering us a whole new way of looking at cellular connections.”

    Brown and his colleagues have been perfecting the DNA microarrays that were crucial to this experiment for the past several years. With a customized machine, they cover glass slides with microscopic dots of immobilized DNAs, each representing a different gene. Exposed to fluorescently labeled DNA copied from the mRNA made by the corresponding gene, a spot will light up—a sign that that gene is active. To date, the researchers have shown they can use these arrays to monitor gene expression in a variety of organisms, including yeast and the plant Arabidopsis (Science, 23 October 1998, p. 699). When they were ready to try it using human DNA, they turned to the serum response system because the genes involved had supposedly been so well characterized. “It was a way to check out the [microarray] system and to learn new things,” Brown says.

    After the team made an array representing about 8600 human genes, Iyer withdrew the serum supply that nourished his cultures of human fibroblasts to get the cells to shut down most of their genetic activity. Two days later, he added back a 10% serum solution. To see which genes were affected by the serum addition, Iyer purified mRNAs from subsets of the cells at various intervals during the next 24 hours, labeled the cDNAs made from those mRNAs with fluorescent dyes, and exposed each batch to the array. By monitoring which DNAs in the array bound the cDNAs, he and his colleagues were able to tell which genes were active at what times. With the aid of a computer program that examined the 500 most active genes, the researchers grouped those with similar activity patterns.
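
    The grouping step can be illustrated with a generic sketch. The article does not specify the Stanford team's exact algorithm, so the toy example below simply clusters fabricated time-course profiles by their correlation—a common way to group genes whose expression rises and falls in the same pattern. All profiles and parameters here are invented for illustration.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    # Toy version of grouping genes by the shape of their response over time.
    # Rows are (fabricated) genes, columns are time points after serum addition;
    # the real data set covered ~8600 genes sampled over 24 hours.
    rng = np.random.default_rng(0)
    early = np.array([0.1, 2.0, 1.5, 0.5, 0.2, 0.1])   # fast, transient response
    late = np.array([0.1, 0.2, 0.5, 1.0, 1.8, 2.0])    # slow, sustained response
    profiles = np.vstack([early + 0.1 * rng.standard_normal(6) for _ in range(5)] +
                         [late + 0.1 * rng.standard_normal(6) for _ in range(5)])

    # Correlation distance groups genes with the same response *shape*,
    # regardless of their absolute expression levels.
    tree = linkage(profiles, method="average", metric="correlation")
    labels = fcluster(tree, t=2, criterion="maxclust")
    print(labels)   # the five "early" genes and the five "late" genes land in separate clusters
    ```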

    The computer program showed a coordinated response to serum by 28 genes known to be involved in controlling cell proliferation. The fastest to respond—some turning on within a few minutes after the serum exposure—were genes that make proteins that regulate the expression of other genes. These prodded the cells to copy their DNA and divide. Some of the active genes turned off within an hour; others remained active for several hours. A different subset of genes quieted down in response to serum, including those that keep the cell in a nondividing state.

    But other sets of genes not involved in cell division also responded to the serum. Serum addition activated eight genes whose proteins elicit immune responses, 19 genes known to be involved in rebuilding damaged tissue, and a dozen whose proteins stimulate the growth of new blood vessels. The fibroblasts essentially reacted to exposure to serum in culture much as they would in the body if blood had seeped into a fresh skin wound. “If you look at the papers and the review articles [about the serum response model], everything is interpreted in terms of how [the results] fit into cell proliferation,” Brown points out. “But that's not what the model was about.”

    Given that fibroblasts are known to help with healing, these findings are perhaps not surprising. Researchers “will be slapping their foreheads” for not having recognized it sooner, admits Stanford's Gerald Crabtree, and will see their work in a new light.

    The work is also a taste of what is to come as researchers use microarrays to analyze gene expression in other mammalian systems. As Harold Varmus, director of the National Institutes of Health, pointed out last month in San Francisco at a meeting of the American Society for Cell Biology, “we will [eventually] be looking at the totality of gene behavior in individual cells, even [in] whole organisms” with microarrays. And that, he predicts, “is going to change our view of how life works.”

  8. TOXICOLOGY

    EPA Ponders Pesticide Tests in Humans

    1. Jocelyn Kaiser

    The volunteers drank corn oil spiked with a poisonous chemical as doctors watched for symptoms like sweating, headaches, and nausea. Inhuman torture or scientific necessity? That 1997 experiment was in fact performed legally on paid subjects in England by a commercial lab, but it is among several such experiments that spurred the Environmental Protection Agency (EPA) to convene an expert panel last month to lay the groundwork for its first-ever set of rules for testing the toxicity of pesticides on people.

    The panelists, who met on 10 and 11 December in Arlington, Virginia, wrestled mightily with the issue without pinning it to the mat. Some argued that pesticide experiments on humans might be permissible in the absence of alternatives such as studies of farm workers exposed to a pesticide on the job, although they insisted on a stringent ethical review of the experimental protocols. But others urged EPA to reject human data, especially from studies done merely to market a product. “I heard a lot of ethical and scientific concerns about those data,” says Lynn Goldman, who stepped down on 31 December as head of EPA's Office of Prevention, Pesticides, and Toxic Substances, which hopes to issue draft regulations by this spring.

    The current debate is an outgrowth of efforts to beef up protection against pesticide toxicity, which have spawned a backlash that could increase the number of tests done on humans. The agency now sets safe levels at one-hundredth the pesticide concentration found to have no effects on animals, partly on the assumption that humans might be more sensitive to the chemicals than lab rats. A 1996 law aimed at protecting children could lead to another 10-fold reduction in acceptable levels of toxicity. That further tightening has so concerned pesticide companies that some have proposed dumping animal-only tests in favor of direct tests in adult humans.
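
    The arithmetic behind those safety factors, with a purely hypothetical animal no-effect level plugged in for illustration:

    ```latex
    % Hypothetical no-effect level, for illustration only:
    \text{no-effect level in animals} = 10\ \mathrm{mg\,kg^{-1}\,day^{-1}}
    \;\Longrightarrow\;
    \text{current safe level} = \tfrac{10}{100} = 0.1\ \mathrm{mg\,kg^{-1}\,day^{-1}},
    \qquad
    \text{with the extra tenfold factor: } \tfrac{10}{1000} = 0.01\ \mathrm{mg\,kg^{-1}\,day^{-1}}
    ```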

    The EPA was already concerned about this possibility when the Environmental Working Group, a Washington, D.C.-based activist group, reported last July on the use of human volunteers in recent pesticide tests mostly in the United Kingdom. The publicity prompted EPA to check its own files, which contained the results of eight human no-effects studies—some recent, others decades old. In most cases, it was unclear whether the studies had been approved by an ethical panel called an Institutional Review Board, as required by a government-wide standard called the Common Rule. “We were terribly concerned … because to observe no adverse effect levels, somebody's going to have to have an adverse effect,” Goldman says.

    Last month's panel, with experts on topics ranging from bioethics to toxicology, examined both the science and the ethics behind such testing. Several observers noted that many of the tests submitted to EPA included only a handful of subjects—too few to yield statistically significant results. If human tests are done, they “should be scientifically valid,” said computational biologist Chris Portier of the National Institute for Environmental Health Sciences in Research Triangle Park, North Carolina.

    At the same time, University of Rochester toxicologist Bernard Weiss pointed out that human data can offer valuable insights on topics such as toxicity mechanisms or individual sensitivity differences. Others noted that the tests, however unsettling at first glance, are similar to a Phase I clinical trial, where healthy volunteers are often used to test a candidate drug for side effects. University of Pennsylvania bioethicist Arthur Caplan says that human pesticide testing “makes me morally queasy, but not to the point where I'd say, ‘ban it.’” Like other committee members, however, he emphasized that EPA should require companies to comply with the Common Rule, which is now applied only to agency-funded research.

    To some panelists, however, the value to society of a new pesticide pales in comparison with the discovery of a new drug and, therefore, requires a higher standard. As Gary Ellis, who oversees human subject protection for the National Institutes of Health, put it, “The importance of knowledge to whom? Those interested in better growing of cotton in west Texas?” Considerations such as these caused at least one committee member to suggest that EPA reject any data involving human experiments.

    The committee had hoped to prepare a recommendation during the meeting itself. But the 16 members decided instead to submit comments for a draft advisory report that would be sent to EPA by 1 January. The EPA plans to use the report in preparing a draft policy on human testing. Given the range of opinions already expressed, however, EPA officials can expect a lively debate during the comment period that follows its release.

  9. EUROPEAN SPACE AGENCY

    Flat Budget Keeps Space Science on Edge

    1. Helen Gavaghan*
    1. Helen Gavaghan writes from Hebden Bridge, U.K.

    Europe's space powers have postponed until spring a decision on the long-term science budget for the European Space Agency (ESA), although they have temporarily approved level funding and kept alive a planned Mars mission. The delay angers scientists who have been lobbying for an increase in the agency's new 5-year budget, which begins today. But some officials said they were heartened that agency officials didn't make further cuts in a budget that, since 1996, has lagged behind inflation.

    “I am disappointed,” says Hans Balsiger, chairman of ESA's science program advisory committee. “I told the council that they can't keep patting us on the head, telling us how well we are doing and, at the same time, taking money out of our pockets.” But Roger Bonnet, head of space science at ESA, is pleased with the decision to approve $3.5 million for Mars Express, a mission to orbit the planet that's planned for launch in 2003. “It is the first positive sign from council since Toulouse [an October 1995 meeting at which funding was capped for 3 years] that they are concerned about science. It does at least reverse the trend [of budget cuts].”

    On 16 December, ESA's ruling council met in Paris to approve a 1999 space science budget of $408 million, including the earmark for Mars Express. Last year's budget was $407 million. But they deferred discussion of longer-term spending plans until a May meeting in Brussels of science ministers from the 14 member states.

    The delay is feeding anxiety among European space scientists, who have drawn up an ambitious agenda of exploration. Those missions—including an x-ray multimirror telescope and a cluster of instruments to study the Earth's magnetosphere to be launched in 2000, a gamma-ray laboratory in 2001, the Rosetta comet mission in 2003, and a far-infrared telescope in 2007—had counted on steadily rising expenditures. “If we get funding at the mid-1998 level and this [year's] increase is in line with inflation, we can just about do all of these missions,” says Balsiger, looking ahead to the spring meeting. “If it is less, then we will have to cancel Mars Express.”

    The delay highlights the precarious level of support for space science throughout Europe. “We see problems in both the short term, 5 years, and the long term, 10 years,” says Paul Merton, director of space science at the British National Space Center. Britain supports a budget that would keep pace with inflation, Merton says, but would like ESA to save money on its large missions to make room for the rest of the science program. In Germany, the new government is carefully reviewing its current level of expenditures on science. “The Social Democrats spoke favorably of basic research prior to the election but said nothing specific about space science,” notes Manfred Ottobein, who oversees space and microgravity science at the German Space Agency. And French scientists don't want ESA programs to be hurt by their country's agreement with NASA to share the cost—still uncertain—of a sample return mission to Mars, says Phillippe Masson, a former science adviser to the government.

    While scientists look to the spring ministerial meeting for salvation, some observers warn that even a modest increase may not solve the long-term problems facing European space science. “People are scrabbling for the last million Euros,” says Roy Gibson, a former ESA director-general. “No program is worth doing if it is eroded every year by inflation.”

  10. ASTROPHYSICS

    Galaxies Seen at the Universe's Dawn

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    Astronomers from the State University of New York, Stony Brook, have won a race to the edge of the universe. After 3 weeks of working around the clock on infrared data from the Hubble Space Telescope, they may have shattered previous records for the most distant stars and galaxies, pushing the frontier of the visible universe to distances so great that they are seen just a few hundred million years after the big bang. “We knew other groups were working on the same data,” says one of the astronomers, Ken Lanzetta, “so there was a lot of hurry.”

    Together with his Stony Brook colleague Amos Yahil, postdocs Alberto Fernandez-Soto and Sam Pascarelle, and students Hsiao-Wen Chen and Noriaki Yahata, Lanzetta analyzed data from a very small patch of sky in the southern constellation Tucana, where the Hubble gathered light for 10 straight days last October (Science, 27 November 1998, p. 1621). NASA released data from this observation, called Hubble Deep Field South, on 23 November. By 18 December, the team had published its results on the Internet (sbast4.ess.sunysb.edu/hdfs/home.html): a catalogue of 323 distant galaxy candidates, along with their redshifts—a measure of distance, and hence age. The farthest galaxy spotted previously has a redshift of 5.64, meaning that the expansion of the universe has stretched its light by a factor of 6.64. Lanzetta and Yahil now claim to have found 14 galaxies with redshifts of between 5 and 10, and another five candidates with redshifts larger than 10.

    At a redshift of 10, galaxies are seen when the universe was only 9% of its current size and probably just a few hundred million years old. “We are getting back to a significant fraction of the age of the universe,” says Yahil. “These are the last few percent to the big bang.” If he and Lanzetta are right, galaxies and stars formed much earlier in cosmic history than most theorists had imagined.
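
    The numbers above follow directly from the standard definition of redshift (general cosmology, not anything specific to this analysis):

    ```latex
    % Standard redshift relations:
    1 + z \;=\; \frac{\lambda_{\mathrm{observed}}}{\lambda_{\mathrm{emitted}}}
          \;=\; \frac{a_{\mathrm{now}}}{a_{\mathrm{then}}}
    \qquad\Longrightarrow\qquad
    z = 5.64:\ \text{light stretched by } 6.64\times;
    \qquad
    z = 10:\ \frac{a_{\mathrm{then}}}{a_{\mathrm{now}}} = \frac{1}{11} \approx 9\%.
    ```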

    There is a catch, however. Because the galaxies are extremely faint, Lanzetta and Yahil did not measure their redshifts from a spectrum—the usual procedure. Instead, they deduced redshifts by comparing each galaxy's brightness in measurements at different color ranges. In very distant galaxies, for example, interstellar hydrogen gas blots out a part of the ultraviolet spectrum. By comparing observations at different colors, astronomers can estimate where this drop-off falls in the spectrum of a distant galaxy, and thus how much its light has been red-shifted. Because even ultraviolet light is shifted all the way into the infrared for very distant galaxies, Lanzetta and Yahil's analysis relied heavily on observations by Hubble's NICMOS infrared camera.
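
    As a rough illustration of why the infrared camera was essential, the sketch below tracks where the hydrogen break lands for a few redshifts, assuming the break sits near the rest-frame Lyman-alpha wavelength of 121.6 nanometers. The real analysis fits full template spectra to the measured fluxes in each filter rather than tracking a single line.

    ```python
    # Why the most distant candidates only show up in the infrared: the hydrogen break,
    # assumed here to sit near the rest-frame Lyman-alpha wavelength of 121.6 nm, is
    # redshifted in wavelength by a factor (1 + z).
    LYMAN_ALPHA_NM = 121.6

    def observed_break_nm(z):
        """Observed wavelength (nm) of the hydrogen break for a galaxy at redshift z."""
        return (1.0 + z) * LYMAN_ALPHA_NM

    for z in (3.0, 5.64, 10.0):
        wl = observed_break_nm(z)
        regime = "optical" if wl < 1000.0 else "infrared"
        print(f"z = {z:5.2f}: break near {wl / 1000.0:.2f} micrometers ({regime})")
    ```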

    Such “photometric” redshift measurements are considered less reliable than measurements based on a spectrum. Yahil notes, for example, that some of the candidate high-redshift galaxies could be old elliptical galaxies—whose light is very red—at smaller distances. Charles Steidel of the California Institute of Technology in Pasadena, who pioneered the photometric redshift technique a few years ago, sees “no reason to doubt the results,” but says, “You have to have a lot of faith at these high redshifts.” And theorist Jim Peebles of Princeton University says, “Lanzetta and Yahil have demonstrated a good track record for redshift estimates in the [Hubble Deep Field North], so I expect people will take the observations in the south seriously, but not [consider them] definitive.”

    Confirming the results could take the Next Generation Space Telescope, an orbiting infrared telescope to be launched in 2007, although Lanzetta and Yahil have floated plans to build an array of cut-rate telescopes, with giant mirrors made of spinning mercury, to gather the light needed to make spectra of distant galaxy candidates. But if the findings do hold up, they will add to mounting evidence that the early universe was a far more active place than astronomers had thought.

    When the Hubble Space Telescope observed Deep Field North, just over 3 years ago, the NICMOS camera wasn't installed yet, and the most remote objects remained invisible because all of their light is shifted into the infrared. Based on Deep Field North, astronomers concluded that few stars had formed at redshifts larger than 3. “That notion is beginning to crumble,” says Yahil. Steidel, for example, has found evidence of rapid star formation at redshifts greater than 4 (Science, 4 December 1998, p. 1806). And Yahil now says, “At the moment, it's even not clear if we see a drop in the star formation rate between redshifts of 5 and 10.”

    How the new data will affect theorists' picture of cosmic structure formation isn't clear yet. “I think the conclusions are not a problem for the usual ideas if the galaxies at a redshift of 10 are quite rare,” says Peebles. But as astronomers push farther out and back in time, the usual ideas may come under more and more pressure.

  11. ASTROPHYSICS

    Microwave Hump Reveals Flat Universe

    1. James Glanz

    PARIS—The physicist Richard Feynman once said that even if a camel's tail appeared under the flap of a tent, he wouldn't believe in the camel until he could see the hump. Now, astrophysicists may be seeing the full dimensions of a hump in measurements of the cosmic microwave background (CMB), a faint microwave glow on the sky that is the afterglow of the big bang. The observations, made by two microwave telescopes at the South Pole and announced here in mid-December at the Texas Symposium on Relativistic Astrophysics and Cosmology, are likely to be far more welcome than a camel in a tent. The hump appears to be a long-sought sign that the cosmos contains the full complement of matter and energy that theorists have long postulated.

    The hump is actually a measure of ripples in the CMB, which record slight irregularities in the matter and energy of the early universe. The apparent size of the irregularities indicates the shape of the universe, just as the apparent size of an object viewed through a magnifying glass says something about the shape of the glass. In a plot of “power,” or abundance, of ripples of various sizes, a peak at a size of about 1 degree on the sky would indicate a universe that is spatially “flat.”
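
    A widely quoted approximation—a rule of thumb, not a figure from the article—ties the peak's position to the total density of the universe:

    ```latex
    % Approximate location of the first peak in the angular power spectrum:
    \ell_{\mathrm{peak}} \;\approx\; \frac{220}{\sqrt{\Omega_{\mathrm{tot}}}},
    \qquad
    \theta_{\mathrm{peak}} \;\approx\; \frac{180^{\circ}}{\ell_{\mathrm{peak}}},
    % so a flat universe (Omega_tot = 1) puts the peak near one degree,
    % while an open, low-density universe pushes it to smaller angular scales.
    ```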

    A flat universe—a universe with a “critical density” of matter and energy—is a key prediction of the best-accepted theory of how the big bang got started, called inflation. In a universe consisting entirely of matter, the critical density of matter would be just enough for gravity to slow its expansion to a stop after infinite time. But if a major ingredient of the universe is a hypothetical energy in empty space, it could be flat with much less matter, and its expansion could even be speeding up, as recent measurements of distant exploding stars suggest it is (see Science, 18 December 1998, p. 2156).

    At least two earlier sets of measurements had hinted at the 1-degree hump. One set, made from a telescope in Saskatoon, Canada, showed points on the power plot rising from large angular scales to a possible peak at 1 degree; another, made by the Cambridge Anisotropy Telescope in the United Kingdom, seemed to show a drop on the far side of the peak at smaller scales. But astrophysicists wanted more details of the hump before believing it—details that the South Pole experiments, Python and Viper, now appear to have provided.

    The two experiments, part of the Center for Astrophysical Research in Antarctica, take advantage of the thin, dry air at the South Pole for a clear view of the CMB. They surveyed opposite sides of the peak, Viper looking for ripples on scales from 1/6 degree up to a degree, and Python for ripples measuring from one degree up to several degrees. Their findings dovetail, too. “We definitely see rising power up to this [1-degree] scale,” said Python's Kimberly Coble of the University of Chicago. “We see a decrease” at large scales, said Viper's Jeff Peterson, of Carnegie Mellon University.

    Both results are preliminary, although Python members say they are preparing a paper on the results. But Neta Bahcall of Princeton University, who is not involved in the work, said, “It's a very nice observation. They do see a rise; they do see something falling down. It's very suggestive.”

  12. Eastern Europe's Research Gamble

    1. Robert Koenig

    The European Union's Framework research program this year hopes to admit 10 central and eastern European states. For the new recruits, playing in the big league has a price, and a prize worth playing for

    PRAGUE, WARSAW, AND BUCHAREST—Nearly a decade after the fall of the Berlin Wall, the former communist countries of central and eastern Europe are still waiting impatiently outside the door of the European Union (EU), eager to join the club. But early this year, scientists in as many as 10 of those countries will get a taste of life on the inside: for the first time, becoming full participants in the EU's new flagship research program, Framework 5. For cash-strapped nations from the Baltic to the Black Sea, the chance to join the Framework represents a major opportunity. But it also constitutes a high-stakes gamble: Governments will pay for participation up front and hope that their researchers win back the cost of admission in grants.

    It's a serious wager. After initial subsidies end, Framework 5 subscriptions will consume as much as a tenth of the already-meager national research budgets of some countries. And the competition for grants will be intense: The post-communist region's researchers will be up against teams from research powerhouses such as Germany, France, and the United Kingdom. “The challenge is for our scientists to bring their research—and their grant applications—to a high international level,” says Andrzej Wiszniewski, a former university rector in Wroclaw who is now the minister-level chairman of Poland's KBN granting agency.

    The chance for Polish and other scientists to test their mettle against international competitors is the result of a major expansion of the Framework program. Nineteen countries participated in Framework 4, which ended last month: 15 EU members plus four that joined as associates. The $17.6 billion Framework 5—which begins its 4-year run next month—is expected to encompass 31 countries, including 10 from post-communist Europe. Parliaments in a few of these countries could still block entry, but Brussels expects firm commitments by the time of Framework 5's formal kickoff in late February. Although researchers from some of these aspiring associates took part in previous Framework programs, they were admitted only on a project-by-project basis. Now, they will be able to form their own collaborations, with at least one partner from an EU country, and apply to Brussels for grants.

    For most of these countries, however, the cost of associate membership in Framework 5 is just the ante in a higher-stakes game: the political maneuvering to become full EU members in the next round of expansion, probably between 2003 and 2007. For favored countries such as Poland and Hungary, which have earned high marks for their economic and scientific restructuring, eventual EU membership offers the prospect of playing for serious money: the EU's “structural funds.” Intended to beef up infrastructure in regions with low per-capita income, structural funds include substantial sums for R&D projects, such as new laboratories and computer networks. For new member nations, the amounts of money they expect to receive from the structural funds would dwarf what their scientists win in Framework 5 grants.

    Although science represents only a small factor in the negotiations to join the EU, associate membership in Framework 5 is a crucial first step in that process. Hence, officials in countries across the region are putting in countless hours trying to bring their research structures up to EU standards, as well as scraping together the annual fees required to join Framework. They see participation in Framework 5—and, later, joining the EU itself—as the best hope for scientists in post-communist Europe to climb out of the depression of the past decade. “We have no alternative: We must join the EU and take part in its research frameworks,” says Rudolf Zahradnik, a physical chemist who is president of the Czech Academy of Sciences.

    At what cost?

    With so much at stake, many scientists in central and eastern Europe are nervous. Over the past 6 months, Science has interviewed more than 60 researchers and science administrators, from Gdansk on Poland's north coast to Bucharest in southern Romania, and a clear pattern has emerged: The region's science administrators are more enthusiastic about joining Framework 5 than are the scientists themselves. In Hungary, Pal Venetianer, a molecular biologist at the Biological Research Center in Szeged, supports Framework but worries that “if we are not successful in grant applications, the result will be a net loss for Hungarian R&D.” In Ljubljana, Dragan D. Mihailovic, a physicist at the Jozef Stefan Institute, says Slovenian science did well under the project-by-project participation in Framework 4, but “I'm not at all sure that we will be able to get back, in research grants, the sums that Slovenia's government will pay.”

    Officials at the EU science directorate in Brussels offer no guarantees, but say they have fashioned the new Framework—especially its fee structure—to ease the financial burden on central European countries. Each associate member is expected to pay an annual subscription based on the relative size of its gross domestic product (GDP). But when post-communist nations said they could not afford the fees, the EU allowed new associates to begin by paying only 40% of their fee the first year, 60% the second, and 80% the third before paying the full amount in Framework 5's final year. They can also use grants from an EU aid program to pay up to one-third of their Framework fees. Says Ranier Gerold, an EU research directorate official who until recently supervised science programs for the region, “We realize that financing is a problem, and we have done what we can to help.”
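
    To see how that phase-in plays out in practice, here is a minimal sketch of the arithmetic, assuming a purely notional full annual fee. The 40%, 60%, 80%, 100% schedule and the one-third aid ceiling are the terms described above; the 10-million-Euro figure, and the choice to use the maximum aid, are invented for illustration.

        # Sketch of the Framework 5 phase-in arithmetic described above.
        # The full_fee value is a hypothetical placeholder, not a real EU figure.
        PHASE_IN = [0.40, 0.60, 0.80, 1.00]   # share of the GDP-based fee due in years 1-4
        MAX_AID_SHARE = 1.0 / 3.0             # up to one-third can come from EU aid grants

        def net_payments(full_fee, use_max_aid=True):
            """Amount a new associate pays from its own budget in each of the 4 years."""
            payments = []
            for share in PHASE_IN:
                gross = full_fee * share
                aid = gross * MAX_AID_SHARE if use_max_aid else 0.0
                payments.append(round(gross - aid))
            return payments

        # Example with a notional full annual fee of 10 million Euros:
        print(net_payments(10_000_000))
        # -> [2666667, 4000000, 5333333, 6666667] Euros paid from the national budget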

    Those measures have made initial membership more palatable. For example, Estonian officials estimate that their first-year payment will represent only 4% of their government's research budget. But once the initial subsidies run out, Framework fees are likely to spur some tough decision making. Says Stanislovas Zurauskas of the Lithuanian science ministry: “Framework will open up new opportunities for us, but it presents a challenge to the nation's R&D system.”

    In many post-communist countries, research is so underfunded that any additional expenses may well cause problems. R&D spending as a percentage of GDP is well below 1% in most post-communist countries, where research systems—along with their economies as a whole—have endured harsh reforms this decade to make the difficult transition from discarded socialist models to leaner, competitive free-market approaches. While R&D budgets are finally on the rise again in Hungary and a few other countries, economic problems have crippled research budgets elsewhere. Bulgaria's prime minister, Ivan Kostov, concedes that “science financing is a serious problem” for many post-communist nations. But, he says, “It is precisely because we consider science and education a national priority that we are launching a large-scale reform,” which is likely to include Framework associate membership.

    Even relatively prosperous Poland faces strains. Some scientists—angered by the Finance Ministry's efforts to reduce basic-research funding—formed a “Save Polish Science” committee last summer that helped convince the government to at least preserve, and perhaps increase, the level of R&D spending. A leader of that effort, physicist and former Warsaw University rector Andrzej-Kajetan Wroblewski, says “scientists are aware that the ‘regular’ science budget will be reduced” as a result of Framework fees. “But joining the 5th Framework seems to be a necessity. Poland simply can't stay away from it if we ever want to join the EU.”

    In Romania, researchers were disheartened last year when the government cut its contribution to the nation's already-anemic research budget to about 0.2% of GDP. But many scientists are determined to bite the bullet in order to reap the benefits of wider European research networking. “It will be costly, but it is a price which we have to pay,” says Ionel Haiduc, a chemist who is vice president of the Romanian Academy. “We have to be involved to avoid isolation.” Elmars Grens, a molecular biologist who is a member of Latvia's science council, takes a similar view. Joining Framework, he says, is “the only way to preserve international collaboration and make sure that good science survives.”

    Some researchers relish the challenge ahead. “I'm optimistic that Hungarian scientists will be talented enough to get back in grants what our government contributes in fees,” says Norbert Kroo, a solid-state physicist who recently became Hungary's deputy education minister for science policy. In Slovakia, computer scientist Ivan Trebaticky, the Education Ministry's head of scientific cooperation, says, “It's now up to our scientists to show that Slovakia can get as much out of Framework 5 as the government put in.”

    Many basic researchers fear, however, that they may not be competing on a level playing field: While their national R&D budgets are heavily weighted toward basic research, Framework 5 is more oriented toward applied research. “The Framework looks as if it is mainly an industrial-research program,” complains physicist Robert Blinc, vice president of the Slovenian Academy. In response, EU officials say Frameworks are by definition oriented mainly towards applied research—addressing EU priorities—but they also include some basic-science projects. “There are basic-science aspects to this program,” says Finnish physicist Jorma Routti, the top civil servant at the EU's research directorate. But he adds that “it makes no sense to duplicate science-driven research conducted at the national level.”

    An EU-commissioned report by the consulting firm Coopers & Lybrand found that in most post-communist countries there is an overemphasis on basic science, while industrial research is underdeveloped. The report recommended major efforts toward a more balanced mix. Hungary, for one, has pursued that course, fostering the creation of several major industrial R&D centers. And some central European officials argue that joining Framework 5 will nudge their national programs in a similar direction. Polish Prime Minister Jerzy Buzek, a former chemical engineering researcher, told Science: “I am absolutely convinced that applied science in Poland will help determine our economic growth and our potential for joining the EU.”

    Membership in sight

    Like Buzek, many central European science officials view participation in Framework 5 as a stepping stone to full membership in the EU club. Associate membership in Framework 5 “will improve our strategic orientation toward the EU,” says Slovenia's foreign minister Boris Frlec, a chemist. Judging from the experiences of scientists whose nations have joined the EU over the past decade—notably, Ireland, Spain, and Portugal—membership comes with substantial benefits, especially the chance to compete for structural funds. “For Spain, EU structural funds have been of tremendous value to improving the state of research,” says Rafael Rodriguez, a materials scientist with the Spanish Council for Scientific Research.

    Lajos Nyiri, the former president of Hungary's OMFB R&D agency, argues that there are many other hidden scientific benefits of EU membership, such as increasing research collaborations, opening up new markets that will heighten the business demand for R&D, and making the nation more attractive to international investors. “We studied what happened in Ireland and Portugal and found great benefits for science and technology there,” he says. Indeed, the next expansion round could create a new de facto demarcation across central Europe—separating the ‘haves’ from the ‘have-nots.’ Last year, the EU tapped five central European states as the most likely candidates for membership: Poland, Hungary, the Czech Republic (see p. 25), Estonia, and Slovenia (see p. 25). But the nations excluded from the next expansion round may take many more years to get into the elite club.

    Whether the EU opts to expand quickly or slowly into post-communist Europe, nearly everyone agrees that it will be decades before the level of science in the region matches that of the west. Even the former East Germany still lags behind the west—despite massive efforts to bolster its research base. Hubert Markl, president of Germany's Max Planck Society, says that “countries like Poland and Hungary have been making tremendous efforts to improve their research. But, depending on economic developments, it may take 20 or 30 years before a full equilibrium is reached.”

    When Slovak immunologist Michael Novak did research at Cambridge University in 1989, a British colleague predicted that the first 10 years after the Iron Curtain's fall would be the worst for central Europe's scientists, and “it might take 20 to 30 years for scientists here to fully regain their former prominence.” Says Novak, who now directs the Slovak Academy's Institute of Neuroimmunology in Bratislava: “I didn't believe him, but now I see that it may well take another 20 years before this region's scientists reach the same position as researchers in the west.”

  13. Will the Euro Help Grants Flow?

    1. Robert Koenig*
    * With reporting by Alexander Hellemans.

    BRUSSELS—Managers of national research programs in Europe have traditionally kept a close watch on international money markets. If their nation's currency weakened, the costs of participating in multinational European projects could suddenly go through the roof, prompting cuts in domestic projects. For most members of the European Union (EU), such gyrations are about to become a lesser concern. Today, 11 of the 15 EU countries take a major step toward adoption of a single European currency, the Euro. The long-term result, many researchers predict, will be more international mobility among scientists and more reliable budgeting for cross-border collaborations.

    In the first phase, exchange rates between the Euro and the currencies of the participating members will be permanently fixed. While national currencies will remain legal tender, foreign exchange and some bank transactions will be done in Euros. Euro notes and coins will be phased in gradually after 2001, until their use becomes mandatory on 1 July 2002.
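
    For research administrators, the practical change is that conversion arithmetic is settled once and for all. The toy sketch below converts a notional Euro-denominated contribution into two participating national currencies; the contribution amount is invented, and the rates shown are the widely cited permanently fixed conversion values for the Deutschemark and the French franc.

        # Toy illustration of budgeting under permanently fixed conversion rates.
        # The contribution amount is invented; the rates are the widely cited
        # fixed Euro conversion values for the Deutschemark and the French franc.
        FIXED_RATES = {
            "DEM": 1.95583,   # Deutschemarks per Euro
            "FRF": 6.55957,   # French francs per Euro
        }

        def national_cost(euro_amount, currency):
            """Convert a Euro-denominated subscription into a national currency."""
            return euro_amount * FIXED_RATES[currency]

        # A notional 2-million-Euro contribution to a joint facility:
        for currency in FIXED_RATES:
            print(currency, round(national_cost(2_000_000, currency)))
        # DEM 3911660 and FRF 13119140 -- the same figures every year, by construction.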

    The president of Germany's Max Planck Society, Hubert Markl—a biologist who has appeared in ads that back the Euro—regards the currency as “a stabilizing element for the development of European research.” Says Markl: “It makes it easier to move scientists in EU nations, and it should strengthen our ability to find more synergies in European research, especially in ‘big science’ projects.” Peter Day, a solid-state chemist and former director of the Institute Laue-Langevin in Grenoble who is now at the Royal Institution in London, also sees potential budgeting advantages. “No national research council now knows exactly how many French francs or Deutschemarks and so on they have to set aside” for membership in international projects, observes Day.

    Ernst-Ludwig Winnacker, a biochemist who heads Germany's DFG granting agency, agrees that the introduction of the Euro will speed scientific integration and cooperation: “Once the Euro is in common use—and a scientist in Munich is paid in the same currency as a researcher in Paris—some psychological barriers will disappear, and we'll see even more international cooperation.” Jorma Routti, a Finnish physicist who heads the EU's research directorate, foresees mainly indirect advantages for science: “The common currency will strengthen the interaction of Europe's scientific and financial spheres to provide more opportunity for launching high-tech companies.”

    These predictions will, however, be of little comfort to researchers in the United Kingdom, Denmark, Sweden, and Greece—the four EU countries that opted not to join the Euro. “I deal with a European contract at the moment,” says Bob Cernik, assistant director for physical science at Britain's Daresbury Laboratory. “We are paid in ecu [the prototype for the Euro] and, … as a consequence of [the strength of the pound], I lost something in the region of [$425,000] from [the project] budget because of currency fluctuations.” From a personal point of view, Cernik says: “If we were part of the Euro, that, of course, would make planning easier.” Cernik and his colleagues will continue to keep a close eye on the money markets.

  14. EXPANDING EU: SLOVENIA

    Money and Mentors Hold Onto Young Researchers

    1. Robert Koenig

    LJUBLJANA, SLOVENIA—When biochemist Vito Turk strolls around the modern research campus where he works, he sees a healthy mix of younger and older scientists. “We don't have a major ‘brain drain’ problem,” boasts Turk, who directs Slovenia's premier basic-research center, the Jozef Stefan Institute. “Slovenian scientists tend to return home.” For a country the size of New Jersey, with a population of only 2 million, Slovenia—the first republic to break away from the former Yugoslavia—has a remarkably resilient research sector.

    Much of the credit for training—and retaining—a talented pool of young Slovenian scientists goes to the government's “2000 Young Researchers” program, which provides research “mentors” and laboratory work for graduate students. It is Slovenia's innovative way of plugging the brain drain that has severely hurt research throughout post-communist Europe. “In this part of Europe, many talented young people who formerly studied the sciences are now pursuing careers in economics, law, and business,” says Robert Blinc, a physicist who is vice president of Slovenia's science academy. He says the Young Researchers program “has helped revitalize Slovenian research institutes.”

    According to the research ministry, a mere 3% of new Ph.D.s leave Slovenia for good, although many of them do postdoc research abroad. Under the Young Researchers program, the government provides funds to universities and research institutes for M.A. and Ph.D. students, covering the costs of their university education and their salaries as researchers. “It gives students both a mentor and a place to work,” says biomedical engineer Renata Karba, a 1995 graduate of the program who is now a counselor to Slovenia's science minister. “Mentors want as many of these young Ph.D. students as possible, because they are good research assistants.”

    The main architect of the Young Researchers program, Boris Frlec—a chemist who is now the nation's foreign minister—told Science that it “has been successful beyond our original expectations. It was meant to revitalize the then-aging Slovenian research community and, indeed, has yielded so far about 1900 highly trained young researchers.” That includes nearly 80% of all students who got their Ph.D. in Slovenia last year, when the program—which covers all fields of science, including social sciences—accounted for about 20% of the Research Ministry's total budget.

    Most scientists say it is worth the high cost. Dragan D. Mihailovic, a physicist who returned to the Stefan Institute after studying at Oxford University, calls the program “extremely effective,” adding: “The biggest benefit is that young Slovenian researchers gain great self-confidence from the experience of working side by side with some top-flight scientists.”

    But there is a downside to the 10-year-old program. Milos Komac, who heads the ministry's Division for Scientific Programs, says, “the problem is finding appropriate jobs for these talented young people.” While the goal was for two-thirds to go into industrial research, Komac says, most new Ph.D.s “tend to stay at institutes and universities.” Adds Komac: “We are now producing more Ph.D.s than we can decently employ.”

  15. EXPANDING EU: CZECH/SLOVAK

    Bringing Diverging Paths Back Together

    1. Robert Koenig

    BRATISLAVA—The breakup of Czechoslovakia into the Czech and Slovak Republics 6 years ago today interrupted seven decades of joint scientific effort. Since then, Slovak and Czech scientists have, in most cases, formed separate research organizations pursuing different national research policies as the two nations—partners from 1919 to 1993—set off on diverging paths. Last year, the Slovaks found out just how far those two paths had diverged: When the European Union (EU) selected five central European nations as fast-track candidates for membership, it included the Czech Republic but left out its Slovak neighbor. In a few years' time, the Slovaks may find themselves almost entirely surrounded—apart from a short border with Ukraine—by EU members.

    In some ways, it is unfair to directly contrast Czech and Slovak science, as the Czech Republic is a larger country with a longer scientific tradition and a stronger industrial sector. But comparisons are inevitable. According to data produced by Philadelphia's Institute for Scientific Information on the citation impact of scientific papers from 1993–97, Slovakia ranked last out of 33 European nations (excluding former Soviet republics). The Czechs ranked only slightly better at 29th. Since the Czech-Slovak split, the Slovak research budget has lagged behind its Czech counterpart, and Czech researchers have taken part in more EU-funded research projects.

    While both the Czechs and Slovaks have restructured their academy institutes—cutting employment rolls to half their 1989 levels—they both continue to struggle with tight science budgets, low salaries, and—to a greater extent in Slovakia—the loss of young researchers to the west. A recent report on Slovak science by the consulting firm Coopers & Lybrand, commissioned by the EU, found that “there is little cooperation between research institutes and the universities or industry, and organizational structure is often poor.” In contrast, the consultants found the Czech Republic—despite some nagging problems—to be “relatively advanced in its scientific transformation.”

    While some Slovak scientists fear falling further behind the Czechs after the EU's next round of expansion, others are optimistic that Slovak research will be able to catch up with its neighbors. They cite two reasons: the government's move to join the EU's Framework 5 program as an associate member (see p. 22) and the recent defeat in general elections of a nationalist anti-EU government that was cool to the Czechs. The new Slovak administration has made overtures both to its neighbor and to Brussels.

    Among the optimists is Jozef Simuth, a molecular biologist who is deputy secretary-general of the Slovak academy. His laboratory has collaborated with both U.S. and European partners and he believes Slovak researchers will fare well in Framework 5. Similarly, Michael Novak, who directs the academy's Institute of Neuroimmunology, says joining Framework 5 will have significant long-term advantages for Slovak science. “It's vital that Slovak scientists have the opportunity to get involved in these EU research projects,” he says. Peter Biely, a researcher at the Chemistry Institute, agrees: “Slovak scientists have a great deal to offer, but we need support from Western Europe.”

    The Slovaks are well placed to profit from collaboration: Their capital, Bratislava, is close to Vienna and to Hungary, and just a few hours' drive from scientific centers in the Czech Republic. “We expect to get a big boost from Framework 5, and from improving our scientific cooperation,” says Ivan Trebaticky, a computer scientist who directs the Education Ministry's international cooperation branch. And relations between the two republics are thawing. Trebaticky's Czech counterpart, Peter Krenek—who heads the Czech Education Ministry's department of international cooperation—says the two countries expect to sign a new bilateral agreement on science and technology cooperation later this month. Observes Krenek: “There are no problems between Czech and Slovak scientists—just problems between politicians, which we are now overcoming.”

  16. EXPANDING EU: U.S. AID

    End of Joint Programs Leaves Researchers Feeling Jilted

    1. Richard Stone*
    * With reporting by Robert Koenig.

    During the Cold War, countries behind the Iron Curtain strived to keep their research under wraps and their scientists politically isolated from the West. Particularly vigilant against the corrupting influence of scientific intercourse was the Czechoslovak government, which refused to allow even the token research exchanges with the United States that were occurring under a more liberal regime in neighboring Hungary. But when Czechoslovak authorities eased their stranglehold on culture in 1988, the U.S. National Science Foundation (NSF) quickly jumped in, signing a bilateral agreement that June to fund joint research and workshops.

    “By the time the Velvet Revolution took place in 1989,” says an NSF official, “we were able to open doors to them.” The State Department soon followed, launching a joint research fund there in 1991. Although the sums of money provided by the U.S. side were small, the program “had a very positive psychological impact,” says Josef Syka, director of the Institute of Experimental Medicine in Prague.

    Those heady days are, alas, over. Three years ago, the State Department ended its contributions to joint research fund programs throughout eastern and central Europe, as the focus of attention shifted to the plight of former-Soviet weapons scientists. While collaborations sponsored by NSF and other agencies continue, the former Soviet satellites must now find the bulk of their support in the harsher realities of post-communist Europe (see p. 22).

    The State Department programs helped catalyze efforts to remodel research programs along western lines, incorporating peer-reviewed grant competitions and other strategies, according to researchers in participating countries. They required that host countries provide matching funds and bypassed ingrained Soviet-style systems in which institute directors divvied up budgets according to their whims. Instead, expert panels from the United States and each host country reviewed proposals and selected the best science to fund. “The program was not considered so much as aid but more as a chance for collaboration,” says Syka.

    The joint funds—ranging from tens of thousands of dollars per year in Slovakia to more than $1 million a year in Poland—were a fraction of what the European Union (EU) was spending in the region. “It wasn't the sums of money that were so important,” says Ivan Trebaticky, director of international scientific cooperation at Slovakia's Education Ministry. Rather, it was the spirit of cooperation they engendered.

    But the State Department's whirlwind romance with central European researchers, and its sudden departure, has left the recipients of that attention feeling jilted. “Seduced and left alone” is how Istvan Szemenyei, former science counselor at the Hungarian Embassy in Washington, D.C., puts it. The loss has left some researchers speculating about ulterior motives. “I think the U.S. has already changed its attitude to these countries and sees them as prospective competitors rather than partners,” says plant biologist Andrzej Jerzmanowski of the University of Warsaw. A State Department official says the program was an innocent victim of budget cuts. “We had not intended the programs to end unilaterally and abruptly,” says the official, who requested anonymity. She adds that her division has since helped hook up central European researchers with programs at NSF, the National Institutes of Health, and other U.S. agencies.

    Although no white knight has appeared to replace State Department aid, some nonprofit foundations have stepped into the breach. The most prominent is the Howard Hughes Medical Institute, which in 1995 awarded 5-year grants—averaging about $35,000 a year—to 50 young biomedical researchers across the region (Science, 14 July 1995, p. 155). And while it has focused mostly on stimulating free-market economic growth, the Andrew W. Mellon Foundation has provided a handful of grants for projects that directly benefit researchers, including $350,000 worth of computer equipment for the Czech Academy of Sciences' Institute of Information Theory.

    The challenge for the region's scientists will be to preserve what connections they have with U.S. colleagues while attempting to weave themselves into the fabric of the EU. “Our cooperation with the EU is clearly on the rise,” says Andrzej Wiszniewski, president of Poland's KBN granting agency. “But I don't think we should cut back on our scientific ties with the U.S.”

  17. NANOTECHNOLOGY

    Borrowing From Biology to Power the Petite

    1. Robert F. Service

    Nanotechnology researchers are harvesting molecular motors from cells in hopes of using them to drive nano-sized devices

    If you received a molecule-sized car, snowmobile, or jet ski for Christmas, you've probably realized by now that the thing is totally useless. It just sits there on your microscope slide like an inert dust speck, incapable of going for a spin around the cover slip. Okay, so molecular vehicles are pure fantasy. But their immobility is a problem that's all too real for would-be builders of nano-sized devices. Such devices are so small that there's no obvious way to power them. Now, researchers are turning to biology for a possible solution: molecular motors from living things.

    Cells are packed with protein-based motors powered by the chemical fuel of life, adenosine triphosphate, or ATP. These motors ferry cargo, flex muscles, and even copy DNA. And at a recent meeting,* two groups, one led by Carlo Montemagno of Cornell University in Ithaca, New York, and the other by Viola Vogel of the University of Washington in Seattle, reported taking the first baby steps toward harnessing these motors to power nanotechnology devices. Like molecular mechanics, the researchers have unbolted the motors from their cellular moorings, remounted them on engineered surfaces, and demonstrated that they can in fact perform work, such as twirling microscopic plastic beads. “What we're really trying to do is make engineered systems that tap into the energy system of life,” says Montemagno.

    The effort still has a long way to go. But the early work is already generating enthusiasm in the community. “I think it's a very productive path to follow,” says Al Globus, a nanotechnology expert at the National Aeronautics and Space Administration's Ames Research Center, Moffett Field, California. If the effort does pan out, it could help researchers make everything from tiny pumps that release lifesaving drugs when needed to futuristic materials that heal themselves when damaged.

    For their molecular motor, Montemagno and his colleagues turned to one of the cell's heavy lifters: ATPase, a complex of nine types of proteins that work together to generate ATP. While tiny—it measures just 12 nanometers across and 12 high—this cellular motor is remarkably sophisticated, containing a cylinder of six proteins surrounding a central shaft. ATPase converts the movement of protons within the cell's energy powerhouse, the mitochondrion, into a mechanical rotation of the shaft, a motion that helps catalyze the formation of ATP. But the motor can also run in reverse, burning ATPs to rotate the shaft and move protons.

    Last year, Hiroyuki Noji and his colleagues at the Tokyo Institute of Technology and Keio University in Yokohama, Japan, captured this rotational motion on camera for the first time (see Science, 4 December, p. 1844). They dangled a fluorescent-tagged molecule off the end of the shaft, fed the motor ATP, then put it under a microscope and took sequential pictures of the shaft as it rotated in circles around the cylinder.

    Based on the number of rotations produced by a given amount of ATP, the researchers calculated that the motor operates at near 100% efficiency—“well above the efficiency of motors we're capable of building,” says Montemagno. “If the motor was as big as a person, it would be able to spin a telephone pole about 2 kilometers long at about one revolution per second.”
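
    For readers curious how such an efficiency figure is reached, here is a back-of-the-envelope version of the calculation. The inputs are typical values from the published literature on this class of rotary motor, not numbers reported in this article: roughly three ATP molecules hydrolyzed per revolution, a torque on the order of 40 piconewton-nanometers, and some 80 to 90 piconewton-nanometers of free energy released per ATP under cellular conditions.

        # Back-of-the-envelope efficiency estimate for a rotary ATP-driven motor.
        # All numbers are typical literature values used purely for illustration,
        # not figures reported by the groups described in this article.
        import math

        TORQUE_PN_NM = 40.0        # assumed torque, piconewton-nanometers
        ATP_PER_REV = 3            # one ATP hydrolyzed per 120-degree step
        DELTA_G_ATP_PN_NM = 90.0   # assumed free energy per ATP, piconewton-nanometers

        work_out_per_rev = TORQUE_PN_NM * 2.0 * math.pi      # mechanical work per turn
        energy_in_per_rev = ATP_PER_REV * DELTA_G_ATP_PN_NM  # chemical energy per turn
        efficiency = work_out_per_rev / energy_in_per_rev

        print(f"work out per revolution: {work_out_per_rev:.0f} pN nm")
        print(f"chemical energy in:      {energy_in_per_rev:.0f} pN nm")
        print(f"efficiency:              {efficiency:.0%}")  # roughly 90% or better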

    That result inspired Montemagno and his Cornell colleagues—George Bachand, Scott Stelick, and Marlene Bachand—to see if they could use the ATPase rotary motor to move man-made objects. They started by genetically engineering two changes into ATPase proteins, one to stick the motors to metal surfaces and the other to provide an attachment site for the beads that they wanted the motor to move.

    To make the first change, the team added an amino acid sequence loaded with histidine, which binds tightly to metals, to the base of the proteins that form the motor's cylinder. Next they used electron beam lithography to pattern an array of nickel islands—each roughly 40 nanometers across—atop a glass microscope cover slip. When they then spritzed water on top to keep the proteins happy and added the motors, the base of the cylinders bound to the nickel islands, causing the motors to stand upright.

    To attach the beads, which were made of plastic or a plastic/iron composite and coated with a small organic molecule called biotin, Montemagno and his colleagues added cysteine, a sulfur-containing amino acid, to the top of the central shaft. The cysteine's reactive sulfur gave the shaft a handle for attaching a small protein called streptavidin, which carries multiple biotin-binding sites and could in turn bind the biotin-coated beads. When the researchers then added ATP fuel to the solution atop the slide and used a laser-based interferometer to track the beads' movement, they could see their array of motors twirling in endless loops, like a dance floor of nano-sized dervishes. “I had the thing running for well over 2 hours at a time,” says Montemagno. “It was seriously cool.”

    But whirling beads—impressive as they may be—are still a long way from nanorobots rooting through the body. So Montemagno's team is pressing ahead. They're currently working on replacing the beads with tiny magnetic bars. If the motors spin the bars, the researchers will be able to measure precisely how strong the motors are by applying an outside magnetic field: By increasing the field until the motors can no longer spin, they will be able to probe the limit of the motor's power.

    What's more, the spinning bars should generate an electrical current that might eventually be used to power devices, such as chip-based drug delivery pumps or chemical weapons sensors implanted in the body. But these uses, Montemagno says, are just the beginning. “There's 100,000 different things you could do with these motors,” he says.

    Washington's Vogel says much the same thing about her team's contraption, a nanoscale monorail in which a collection of molecular motors all lined up on a surface pass a tiny tube hand over hand down the line. Vogel based her monorail on one of the cell's own transport systems, which consists of tracks made of microtubules, tube-shaped assemblies of a protein called tubulin, and small motors made of another protein, kinesin. In cells, the kinesin motors latch onto the fixed microtubules and churn like steam engines from one end of the line to the other, ferrying molecular cargo such as proteins and lipids. But for their experiment, Vogel and her colleagues John Dennis and Jonathan Howard reversed these roles, fastening kinesin motors to a surface and having them shuttle microtubules down the line from one motor to the next.

    Biophysicists studying kinesin motors had done related experiments in the past. But in those, Vogel says, the kinesins were in random locations on surfaces. When microtubules and ATP were then added, the kinesins shuttled microtubules in all directions. To control the transport, the Washington team had to line up the kinesins. Here, the researchers took a low-tech approach. They simply rubbed a block of polytetrafluoroethylene, or PTFE, across a glass slide, causing molecules of the chainlike polymer to rub off and coat it. The scraping acted something like a hairbrush, getting all the PTFE chains to line up on the surface, creating a series of grooves running for micrometers along the slide.

    After submerging the slides in water and coating them with a small protein called casein, to protect overlying proteins, they added the kinesin motors, which settled into the grooves. They then sprinkled on a few microtubules, which were tagged with fluorescent compounds so they could be seen, and dropped some ATP fuel into the solution.

    By turning on a xenon lamp to set the microtubules aglow and letting their cameras roll, Vogel and her colleagues could see the kinesins push their tubular cargo in one direction, moving it hand over hand down the parallel grooves. “Even though kinesins move on the nanoscale, we could watch the microtubules move on the micron scale,” says Vogel.

    For now, the team is using the monorail to study the performance of their motors. But down the road, Vogel says that the tiny rail lines could be used to transport replacement components for self-healing biomaterials for medical implants. If this and other efforts to motorize the nanoworld are successful, those microscope slides may soon see their first traffic jams.

    *Sixth Foresight Conference on Molecular Nanotechnology, Santa Clara, California, 13 to 15 November 1998.

  18. MEETING: AMERICAN GEOPHYSICAL UNION

    From Eastern Quakes to a Warming's Icy Clues

    1. Richard A. Kerr

    SAN FRANCISCO—A record 8300 researchers gathered here from 6 to 10 December for the fall meeting of the American Geophysical Union. At this smorgasbord of earth and planetary science, the topics ranged from the future of giant earthquakes in southeastern Missouri to evidence that ancient climate changes took place in lockstep in the tropics and Greenland.

    No More New Madrid Quakes?

    Residents of what is now southeastern Missouri suffered through the horrific winter of 1811–12, devastated not by the weather but by the Earth itself: Between December and February, the three largest earthquakes to hit eastern North America in historic times destroyed the town of New Madrid. The most violent of the quakes—which rivaled any quake in California—momentarily reversed the flow of the Mississippi River, shot plumes of sand and water 10 meters in the air, and rang church bells 1000 kilometers away in Charleston. Ever since, scientists have wondered when the next New Madrid quake will strike, and a 1992 study suggested it might be soon—in the next few hundred years. But at the meeting, geophysicists monitoring the New Madrid region for signs of strain reported that the next big jolt shouldn't hit for 5000 or 10,000 years, if then.

    “I think we've vastly overestimated the seismic hazard of New Madrid,” says geophysicist Seth Stein of Northwestern University in Evanston, Illinois, who led a group that has surveyed the area for the past 6 years. A damaging but smaller quake is still possible, says Stein, but “the hazard of large earthquakes is very, very small.” Not everyone is quite so confident. Because geophysicists don't really understand why the New Madrid faults ruptured in the first place, notes Paul Segall of Stanford University, another magnitude 7 to 8 “can't be dismissed at this point. … The simplest assumption is, if [big quakes] happened in the past, they can happen in the future.”

    The new results come from satellite-based searches for movement of the land above the buried faults that zigzag across far southeastern Missouri, Tennessee, and Arkansas. If stress is building up along a locked fault, driving it toward eventual rupture in a large quake, the land on either side should be deforming, shifting the surface in opposite directions. Researchers can detect such subtle motions—a few millimeters per year across tens of kilometers—using the Global Positioning System (GPS), an array of military satellites in precisely known orbits. By comparing minute differences in the arrival times of a satellite's radio signal at two sites, researchers can determine the distance between markers tens of kilometers apart to within a few millimeters. Repeating the measurements over a period of years reveals any movement of the markers and the ground beneath them.
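
    As a rough illustration of how repeated campaigns become a deformation rate, the sketch below fits a straight line through a short series of baseline measurements. The survey years echo those mentioned in the text, but the millimeter-scale offsets are invented for illustration and are not the Northwestern team's data.

        # Turn repeated baseline surveys into an apparent deformation rate.
        # Survey years follow the campaigns mentioned in the text; the millimeter
        # offsets below are invented for illustration, not real survey data.
        years = [1991.5, 1993.5, 1997.5]
        baseline_change_mm = [0.0, 1.8, -0.5]   # change relative to the first survey

        def slope(xs, ys):
            """Ordinary least-squares slope, here in millimeters per year."""
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            den = sum((x - mx) ** 2 for x in xs)
            return num / den

        rate = slope(years, baseline_change_mm)
        print(f"apparent motion: {rate:+.1f} mm/yr")
        # With few-millimeter measurement noise, a rate this small is
        # indistinguishable from zero -- the "boring" result described below.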

    After conducting GPS surveys of 24 sites across the New Madrid area in 1991, 1993, and 1997, Stein and his colleagues presented “what is undoubtedly the most boring set of GPS data you'll ever see,” he told his audience. “We found no observable motion. All the sites we have been measuring have been staying right where they are.” Motion across a fault that generated one of the big quakes is 1 ± 1 millimeter per year. “It seems very unlikely we're accumulating the kind of strain we need for a magnitude 8 in the future,” said Stein.

    His data are a far cry from the ominous 1992 findings of Segall, geophysicist Mark Zoback of Stanford, and their colleagues. They compared their own 1991 GPS survey with a nonsatellite, 1950s survey and found that ground near the fault was moving at about 5 to 8 millimeters per year—about one-quarter of the movement seen on the San Andreas fault of California. But Segall agrees that their conclusion “doesn't seem to be supported by the more recent results,” including their own continuing GPS survey.

    Even 1 millimeter per year of deformation wouldn't lead to another magnitude 8 for 5000 years or more, Stein notes. The 1811–12 quakes and some earlier ones in the geologic record must have resulted from a pulse of activity that has now died out, he says. But other geophysicists caution that no one knows just what's causing New Madrid seismicity, so they are wary of making predictions. Because New Madrid is in the middle of the continent, far from a plate boundary like the San Andreas, “we don't have a model like plate tectonics to help us understand” what drives the earthquakes, says Segall.
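
    The arithmetic behind Stein's estimate is simple, if one assumes (the article does not state this explicitly) that a New Madrid-scale rupture involves several meters of slip, a typical figure for magnitude 8 earthquakes: re-accumulating 5 meters of offset at 1 millimeter per year takes 5 m ÷ 1 mm/yr = 5000 years, and at half that rate 10,000 years, consistent with the 5000-to-10,000-year figures quoted earlier in this story.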

    It could be that strain reaccumulated rapidly in the first century or so after 1812, Segall says, and is now building very slowly toward another quake in the next century or two. Further GPS surveys in the next few years should show whether the land is absolutely motionless, or is ever so slowly inching toward the next upheaval.

    Tropical-Polar Climate Link

    Fifteen thousand years ago, as Earth began to shake off the chill of the last ice age, it plunged into climatic turbulence, with rapid bursts of warming and a brief return to glacial conditions. Some 14,700 years ago, for example, temperatures in Greenland jumped 5°C in less than 10 years—twice the warming that greenhouse gases are predicted to produce in the coming 100 years. In search of a cause for this and other climatic jolts, researchers had been eyeing the high-latitude ocean, where abrupt changes in circulation might have triggered warming or cooling. Now, analyses of gases trapped in the ice of Greenland and Antarctica suggest that they should start looking elsewhere: to the tropics. “Greenland and the tropics march in lockstep,” says geochemist Jeffrey Severinghaus of the Scripps Institution of Oceanography in La Jolla, California.

    The new work, reported at the meeting by Severinghaus and geochemist Edward Brook of Washington State University in Vancouver, doesn't reveal the warming's ultimate cause, says glaciologist Richard Alley of Pennsylvania State University in State College. But he calls it “tremendous science,” because “it tells a very clean story” about how events in the tropics may drive climate change at higher latitudes. And because the change happens in the blink of a geologic eye, the finding implies that the swift-moving atmosphere, not the sluggish ocean, must have carried the climate signal to high northern latitudes.

    Severinghaus and Brook identified the link between the Greenland warming and changes in the tropics in tiny bubbles trapped in ice drilled from the Greenland ice sheet. The temperature signal was a change in the isotopic composition of the air trapped in the bubbles (Science, 14 June 1996, p. 1584). Warming or cooling at the surface creates a temperature gradient in the snow blanketing the ice, affecting how the isotopes diffuse through the snow's pores. When the accumulating snow turns into ice, sealing in the gas, the temperature signature is preserved.

    For example, when the surface suddenly warms, as it did 14,700 years ago in Greenland, nitrogen-15, the heavier isotope of nitrogen, tends to diffuse downward toward the colder layers of snow, leaving lighter isotopes above. In ice that researchers had dated to that time by counting annually deposited ice layers, Severinghaus and Brook found a sudden increase in the proportion of nitrogen-15 and concluded that Greenland had warmed 10°C over several decades and 5°C in less than 10 years—perhaps as little as 1 to 3 years.

    While isotopes provided a thermometer, methane in the same ice provided the link to the tropics. Methane is produced by organic decomposition in everything from mangrove swamps in the tropics to tundra bogs in the Arctic. But Brook and Severinghaus fingered a tropical source by comparing the methane in Greenland ice with a new record in an ice core drilled from Taylor Dome in Antarctica. Both records show an abrupt increase of about 30% at the time of the warming, and the methane levels in the two cores differ by only 3% to 4% throughout the change. Such similar methane distributions at the two poles are only possible if the methane surge came from a site that could feed both hemispheres—the tropics. And because the beginning of the methane rise as measured in Greenland and the abrupt warming there coincide within a few decades, Severinghaus and Brook assume that the tropical and high-latitude events were connected.
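
    The logic of that inference can be captured with a crude two-box model of the atmosphere, one box per hemisphere, in which air mixes between the hemispheres on roughly a 1-year timescale while methane is destroyed over roughly a decade. Those timescales are round, textbook-style values and the source scenarios below are hypothetical, but the sketch shows the key point: a source confined to northern high latitudes would leave a much larger steady-state difference between the poles than the few percent actually measured, whereas a source near the equator, feeding both hemispheres, keeps the difference small.

        # Crude two-box (hemisphere) model of atmospheric methane, to illustrate
        # why a small interpolar difference points to a low-latitude source.
        # Lifetime and mixing time are rough illustrative values; the source
        # splits are hypothetical scenarios, not ice-core measurements.
        TAU_CHEM = 8.0   # methane lifetime against chemical destruction, years
        TAU_MIX = 1.0    # interhemispheric exchange time, years

        def interpolar_difference(frac_north):
            """Steady-state (north - south) / mean concentration when a given
            fraction of the global methane source sits in the northern box."""
            asym = 2.0 * frac_north - 1.0                     # ranges from -1 to +1
            tau_eff = 1.0 / (1.0 / TAU_CHEM + 2.0 / TAU_MIX)  # effective damping time
            return asym * 2.0 * tau_eff / TAU_CHEM

        for frac, label in [(1.0, "all boreal"),
                            (0.6, "mostly tropical, slight northern bias"),
                            (0.5, "evenly split (tropical)")]:
            print(f"{label:40s} ~ {interpolar_difference(frac):.1%}")
        # all boreal                               ~ 11.8%  (far larger than observed)
        # mostly tropical, slight northern bias    ~  2.4%
        # evenly split (tropical)                  ~  0.0%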

    Even though methane is a greenhouse gas, it could not have caused the high-latitude warming, because the methane increase did not precede the temperature change. Instead, the climate change probably came first, boosting methane by warming and moistening the tropical source regions. And Severinghaus suspects it was the tropics—the firebox of the climate engine because of the vast amount of heat and moisture there—that warmed first and triggered the immediate warming of higher latitudes.

    “This is a crossroads” in paleoclimatology, says isotope geochemist James White of the University of Colorado, Boulder. “You start to understand mechanisms more than you have before.” But Alley cautions that “it doesn't yet give us the answer,” the ultimate cause for abrupt climate change. To find it, researchers will have to study more climate records—from the muddy bottoms of oceans and lakes and from high-altitude tropical glaciers as well as polar ice sheets—if they are to explain at last why Earth tumbled so erratically into its present warmth.
