News this Week

Science  21 Dec 2001:
Vol. 294, Issue 5551, pp. 2442
  1. BREAKTHROUGH OF THE YEAR

    Molecules Get Wired

    1. Robert F. Service

    In 2001, scientists assembled molecules into basic circuits, raising hopes for a new world of nanoelectronics

    Computer chip technology and scientific breakthroughs have been marching in step for decades. Without computers, scientists couldn't track climate change, sequence the genomes of entire organisms, or image the human brain at work. But the ability to cram ever more circuitry onto silicon chips now faces fundamental limits. Ironically, it's now possible to make the innards of a circuit—the transistors, resistors, capacitors, and wires—so small they no longer function.

    In recent years, scientists have tried to get around these limits by going for the ultimate in shrinkage: turning single molecules and small chemical groups into transistors and other standard components of computer chips. It's a provocative idea, but many have doubted that researchers would ever manage to link such devices together into more complex circuits. Today, those doubts are diminishing. This year, researchers wired up their first molecular-scale circuits, a feat Science selects as the Breakthrough of 2001. If researchers can wire these circuits into intricate computer chip architectures, this new generation of molecular electronics could provide the computing power to launch scientific breakthroughs for decades to come.

    It's easy to see the allure of computing with molecules. Today's state-of-the-art computer chips pack some 40 million transistors onto a slab of silicon no bigger than a postage stamp. The smallest features in these miniature landscapes measure just 130 billionths of a meter, or nanometers, across. In another 10 years or so, chip engineers expect to shrink whole transistors—not just individual features—down to about 120 nanometers per side. Small as this seems, it's still gargantuan compared with molecules, which occupy roughly 60,000 times less area. Chips made with components at that scale would harbor billions of devices.
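    To see where those figures come from, here is a quick back-of-the-envelope check in Python; the 0.5-nanometer molecular footprint and the 2-centimeter chip edge are illustrative assumptions, not numbers from the article.

        # Rough scale comparison: a projected ~120-nm transistor versus a
        # single-molecule device.
        transistor_side_nm = 120.0   # projected transistor size, ~10 years out
        molecule_side_nm = 0.5       # assumed footprint of a small molecule

        area_ratio = (transistor_side_nm / molecule_side_nm) ** 2
        print(f"area ratio: {area_ratio:,.0f}")   # ~57,600, i.e., "some 60,000 times smaller"

        # A postage-stamp-sized chip (~2 cm on a side) packed at molecular density:
        chip_side_nm = 2.0e7   # 2 cm expressed in nanometers
        print(f"devices per chip: {(chip_side_nm / molecule_side_nm) ** 2:.1e}")  # ~1.6e15, easily billions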

    Good connections.

    Molecules can now be crafted into working circuits. Constructing real molecular chips will be a big challenge.

    ILLUSTRATION: C. SLAYDEN

    Dreams of such unbridled computer power started soon after the dawn of the computer age itself. In 1974, Ari Aviram of IBM and Mark Ratner, then of New York University, suggested building computers from the bottom up by turning individual molecules into circuit components. But their suggestion remained little more than a pipe dream until the advent of scanning probe microscopes in the 1980s gave researchers the tools to probe individual molecules and move them around at will. That led to a spate of studies in the late 1990s showing that individual molecules could conduct electricity like wires or semiconductors, the building blocks of modern microprocessors.

    Turning individual molecules into devices was not far behind. In 1997, groups led by Robert Metzger of the University of Alabama, Tuscaloosa, and Chong-Wu Zhou of Yale University created molecular diodes, one-way current valves that are among the most basic and essential elements in the chip designer's tool kit. In July 1999, another group led by James Heath and Fraser Stoddart of the University of California, Los Angeles (UCLA), created a rudimentary switch, a molecular fuse that carries current but, when hit with the right voltage, alters its molecular shape and stops conducting. Within months, still another team led by Yale University electrical engineer Mark Reed and Rice University chemist Jim Tour reported making molecular-scale devices that could control a current just as a transistor does.

    By the end of 2000, researchers had amassed a grab bag of molecular electronic devices but no demonstrations of wiring them together. 2001 brought a world of difference: Five labs succeeded in hooking these devices together into more complex circuits that could carry out rudimentary computing operations. In January, a team led by Charles Lieber, a chemist at Harvard University, got the ball rolling. In the 26 January issue of Science, Lieber's team reported arranging indium phosphide semiconducting nanowires into a simple configuration resembling the lines of a ticktacktoe board. The team then used a technique called electron beam lithography to place electrical contacts at the ends of the nanowires, showing that the array was electronically active. The tiny arrangement wasn't a circuit yet, but it was a first step: It showed that separate nanowires could communicate with one another.

    The next step came at the American Chemical Society meeting in April. Heath and his colleagues at UCLA reported making their own crossbars of semiconducting wires. In this case, however, Heath's team placed molecules called rotaxanes, which function as molecular switches, at each junction. By controlling the input voltages to each arm of the crossbar, the scientists showed that they could make working 16-bit memory circuits.

    But molecular crossbars were only part of the success story. Researchers also made heady progress with their favorite nanomaterial, carbon nanotubes. These tiny straws of carbon are among the hottest materials in the nanotech world because they have an atomically perfect structure, resembling a rolled-up sheet of chicken wire. Depending on how these carbon sheets are rolled—so that chains of carbon atoms circle or spiral around the tubes—the nanotubes' electrical properties can be engineered to serve as either conductors or semiconductors.

    In the 26 August online edition of Nano Letters, a team led by Phaedon Avouris of IBM reported making a circuit out of a single semiconducting nanotube. By draping the nanotube over a pair of electrodes and controlling the two halves of the device independently, the team coaxed it to work like a simple circuit called an inverter, another basic building block for more complex circuitry.

    In addition to carrying out rudimentary processing, the IBM circuit demonstrated another key advantage: “gain,” the ability to turn a weak electrical input into a stronger output, which is a necessary feature for sending signals through multiple devices. Circuits with even stronger gain came just over 2 months later in a pair of reports in the 9 November issue of Science. The first, from Cees Dekker's team at Delft University of Technology in the Netherlands, also relied on carbon nanotubes. In 1998, Dekker's group was the first to report making a nanotube-based transistor. This year, his team wired such transistors into a range of logic circuits. By carefully controlling the formation of metal gate electrodes, Dekker's group was able to create transistors with an output signal 10 times stronger than the input. Lieber's group at Harvard, meanwhile, constructed circuits with its semiconducting nanowires, in this case made from silicon and gallium nitride.

    Finally, in a report published online by Science on 8 November, a group led by physicist Jan Hendrik Schön of Lucent Technologies' Bell Laboratories in Murray Hill, New Jersey, reported similar success in crafting circuits from transistors made from organic molecules that chemically assemble themselves between pairs of gold electrodes.

    Backed by this string of accomplishments, molecular electronics is rapidly moving from blue-sky research to the beginnings of a technology. Experts in the field have few illusions, however, that molecular electronics will replace conventional silicon-based computing anytime soon, if ever. Researchers now face the truly formidable task of taking the technology from demonstrations of rudimentary circuits to highly complex integrated circuitry that can improve on silicon's speed, reliability, and low cost. Reaching that level of complexity will undoubtedly require a revolution in chip fabrication. But as chip designers race ever closer to the limits of silicon, the pressure to extend this year's breakthroughs in molecular electronics will only intensify.


  2. BREAKTHROUGH OF THE YEAR

    The Year of Living Dangerously

    1. David Malakoff

    When the World Trade Center towers collapsed on 11 September, seismographs recorded an impact equivalent to that of a mild earthquake. The aftershocks for the international science and engineering community, however, could be of much greater magnitude—from reshuffled budgets to new restrictions on research, information-sharing, and international collaboration. Indeed, some scientists say the terrorist assaults, and the subsequent anthrax mail attacks, could mark the beginning of a sobering new era of scientific soul-searching, akin to that which followed the development of the atom bomb.

    The new landscape “will present some scientists with opportunities and others with obstacles,” says Lewis Branscomb, a science policy specialist at Harvard University, who is helping lead a U.S. National Academies study of what scientists can do to respond to terrorism.

    The immediate consequences of the attacks were far-reaching. Engineers analyzed why the towers fell—and debated what should be done to make skyscrapers safer. Gene researchers scrambled to cobble together a DNA-testing system capable of generating and handling the cascade of data needed to identify the thousands of crushed and incinerated victims. Public health specialists struggled to identify and treat those threatened by anthrax. Microbiology sleuths took to their labs, hoping to finger the deadly strain's origin.

    Shake-up.

    Twin Tower and anthrax attacks have rattled the science community.

    CREDIT: CORBIS/SYGMA

    The U.S. Congress, meanwhile, passed sweeping new security laws requiring everything from the installation of new security technology at airports to criminal background checks for scientists working with deadly biological agents. It debated barring foreign students from studying certain sensitive scientific topics at U.S. universities. And lawmakers pumped billions of new dollars into developing ways of detecting biowarfare agents—and creating new vaccines that could render them harmless. Other nations, such as the United Kingdom, launched similar initiatives.

    The shake-up is far from over. Economic damage caused by the attacks has erased a U.S. government budget surplus, raising fears that some research spending might be trimmed next year to pay for the war against terrorism. Universities are reconsidering research programs involving potential bioweapons in light of increased regulation and community concerns. Journal editors, research funders, and scientists have begun debating whether some information—such as the genomic details of candidate bioweapons—is just too sensitive to be released publicly. “If the scientific literature is no longer used for the good it was intended, we will be left with no choice but to restrict information access at a cost to human health,” The Lancet Infectious Diseases warned in a 1 December editorial.

    Many researchers hope such crackdowns can be avoided. But they are again pondering how to work for good while keeping the fruits of their labors from being used for evil.


  3. BREAKTHROUGH OF THE YEAR

    The Runners-Up

    1. The News and Editorial Staffs

    Science celebrates nine other areas in which important findings were reported this year, from the subatomic realm to the atmosphere and beyond.

    First runner-up: RNA ascending. RNA molecules, long viewed as little more than couriers shuttling messages or amino acids around the cell, are turning out to be remarkably versatile. In 1998, researchers showed that double-stranded RNA could shut down genes in the nematode Caenorhabditis elegans, a phenomenon very similar to the gene silencing already known to occur in plants. Molecular biologists realized that this RNA interference (RNAi) could be a boon to studies of gene function, and interest in RNA has since exploded. This year, they discovered that RNAi can quell gene activity in mouse and human cells as well.

    Structure solved.

    RNA-building enzyme revealed.

    CREDIT: P. CRAMER ET AL.

    Short RNAs clearly play important biological roles. Dozens of these molecules, each encoded in the genome like any other gene, are now known in the nematode and fruit fly. Some 100 of these tiny RNA “genes” have been found in the gut bacterium Escherichia coli, and some 200 were uncovered in DNA from mouse brain tissue. In the nematode and fruit fly, they seem to be involved in development; in E. coli, they may facilitate rapid responses to environmental change and could serve similar functions in mammals.

    The process by which the more familiar messenger RNA (mRNA) is generated also gained new respect in 2001. During transcription, mRNAs are built to match the sequence of the active gene, and they then instruct the cell's protein factory, the ribosome, how to build a protein. The discovery of key proteins involved in splicing together mRNA's coding regions added details to this scenario. Also, in June, high-resolution pictures captured the yeast RNA polymerase II—which builds the mRNA based on the gene's sequence—in action. During transcription, the polymerase opens and closes to bring DNA in and pump newly made RNA out, all through the same opening.

    Another set of small nuclear RNAs turns out to play a surprising role. These RNAs combine with proteins to form the spliceosome, which removes noncoding sequence from nascent mRNA. Researchers discovered this year that spliceosome RNAs, not the proteins, control the removal of unwanted sequence. This has spawned a new field, “ribozymology,” aimed at clarifying and harnessing RNA's enzymatic potential. In one experiment, researchers forced RNA enzymes, or “ribozymes,” to evolve by selecting for those that are able to replicate RNA. These results help show that RNA could have preceded proteins in the earliest life-forms.


    So what's neu? Neutrinos have almost no mass, so it's amazing that anyone noticed they were missing. But this year, the 30-year-old case of the missing solar neutrinos was cracked.

    Case closed.

    Sudbury observatory finally nabbed missing neutrinos.

    CREDIT: SNO

    One of the great triumphs of science in the past century has been understanding how stars burn nuclear fuel. Yet almost as soon as physicists began counting the neutrinos streaming from the sun, they found that something was wrong: Expected products of the stellar fireball—neutrinos of a particular type called “electron” neutrinos—were not nearly as abundant as the straightforward theory required.

    In 1998, physicists presented evidence that seemed to explain the discrepancy: They showed that neutrinos “oscillate” from one flavor (such as the electron neutrino) to another (such as the muon or tau neutrino). If the electron neutrinos turned into another variety as they streamed away from the sun, that would explain the shortfall of electron neutrinos. But there was still no direct evidence that neutrino oscillations are to blame for the solar neutrino paradox.

    This June, all that changed, thanks to the Sudbury Neutrino Observatory (SNO), a 1000-ton sphere of heavy water 2 kilometers below Earth's surface in Sudbury, Ontario. SNO scientists announced that they had found the missing solar neutrinos—and that the particles were, indeed, changing flavor.

    SNO was able to count the number of electron neutrinos coming from the sun and compare it with the total number of neutrinos of all flavors. The number of electron neutrinos was still too small to match nuclear physicists' theories, in agreement with past experiments. However, the total number of incoming neutrinos matched expectations: The predictions were correct, but the neutrinos were swapping identities on the way to the detector.
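    That logic can be written out explicitly. The sketch below uses approximate fluxes from the June 2001 results, in units of 10^6 neutrinos per square centimeter per second; the published values carry uncertainties that are omitted here.

        # Approximate solar neutrino fluxes (1e6 per cm^2 per second);
        # uncertainties omitted for simplicity.
        phi_electron = 1.75   # electron neutrinos only (SNO charged-current signal)
        phi_total    = 5.44   # all flavors, inferred with Super-Kamiokande data
        phi_model    = 5.05   # standard solar model prediction

        print(f"electron-type fraction arriving: {phi_electron / phi_total:.0%}")  # ~32%
        print(f"total flux vs. prediction: {phi_total / phi_model:.2f}")           # ~1.08

        # Only about a third of the arriving neutrinos are still electron-type,
        # yet the total matches the solar model: the rest changed flavor en route.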


    Genomes take off. Barely 3 years ago, researchers were edging hesitantly toward the starting gate in the race to sequence the human genome. But egged on by competition, a private effort and an international public consortium galloped into the home stretch early this year, publishing the first drafts of the 3.3 billion bases that define our species. The drafts cover the gene-containing portions of our DNA but have gaps of varying sizes. The big surprise when the drafts were revealed: Our genetic code contains only about 35,000 genes, not even twice as many as the lowly worm Caenorhabditis elegans. (By year's end, however, the number began creeping upward.) All told, genes account for less than 2% of the DNA, the rest consisting of repetitive stretches of sequence or mobile genetic elements that have inserted themselves into our chromosomes.
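    A quick arithmetic check of the “less than 2%” figure, sketched in Python; the average protein-coding length of 1,400 bases per gene is an illustrative assumption, not a number from the article.

        # Does 35,000 genes square with "less than 2%" coding DNA?
        genes = 35_000
        avg_coding_bases = 1_400   # assumed average coding length per gene
        genome_bases = 3.3e9       # draft genome size cited above

        coding_fraction = genes * avg_coding_bases / genome_bases
        print(f"coding fraction: {coding_fraction:.1%}")   # ~1.5%, under 2%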

    These drafts only fueled the hunger for more sequence. By the end of the year, about half the human genome sequence had been put into final, finished form. And the sequences of more than 60 other organisms are now done. Most are microbial genomes, including nitrogen-fixing bacteria, human pathogens, and organisms that cause food poisoning. A draft sequence of the Japanese puffer fish genome—the most compact genome of any vertebrate—is now in the public databases. The sequencing of the rat, mouse, zebrafish, a freshwater puffer fish, a malaria mosquito, the fungus Neurospora crassa, and fission yeast is proceeding at full speed, with the last two virtually done. Similarly, private companies announced that they have rough copies of the rice and mouse genome sequences.


    Superconductor surprises. Visions of resistance-free electricity transmission tantalized researchers this year, as two teams discovered vastly different superconductors at higher than expected temperatures.

    Hopes ran high in January, when researchers announced that one of the simplest compounds in the chemistry stockroom—magnesium diboride—becomes a superconductor at 39 kelvin. Although still chilly, that nearly doubled the previous record transition temperature for a metallic superconductor.

    But the big question was whether MgB2 is kin to the high-temperature ceramic superconductors discovered 15 years ago or more like conventional metal superconductors. The answer could shed light on how the high-temperature superconductors work—a riddle still without a solution.

    Simple superconductor.

    A metal compound—MgB2—set a new record.

    CREDIT: ILLUSTRATION: C. SLAYDEN

    By March, the theoretical questions were settled: MgB2 is an interesting but conventional superconductor. Although not the paradigm shifter researchers had hoped for, it proved that simple compounds can still generate surprises. The second surprising superconductor may be more promising. C60 molecules—popularly called buckyballs—had long been known to superconduct at 18 K when “doped” with alkali metals. Last year, physicists pushed the transition temperature to 52 K. In August, researchers announced that when they stuffed organic molecules into a crystal of C60, the distance between the buckyballs increased and the transition temperature rocketed to 117 K. If they can spread out the C60s even more, the researchers hope, the material might even superconduct at room temperature, opening new possibilities for superconducting molecular electronics.


    Guide me home. How do the neural projections called axons find their way to the correct destination during embryo development? Several studies over the past year helped sort out this finely tuned neuronal traffic control system.

    In the 1990s, researchers identified four major families of molecular signals that tell wandering axons to either “come hither” or “shove off.” At about the same time, researchers identified the receptors on the developing axon's surface that respond accordingly. And it turned out that axons could turn receptors on and off sequentially—allowing an axon to be attracted to the spinal cord initially, say, but then become repelled by it as the axon continues its voyage across the body.

    Whither?

    These axon tips are testing the waters for guidance cues such as netrin and Slit.

    CREDIT: E. STEIN AND M. TESSIER-LAVIGNE

    Simple stop-and-go signals aren't enough to generate complex neural circuitry, though; somehow the growing tip of the axon, called the growth cone, integrates conflicting signals. At the very end of last year, researchers reported that three different receptors for one guidance molecule, called Slit, could combine in various ways to steer axons onto nearby, distant, or intermediate paths. A study this year showed that, when presented with conflicting signals (one stop and one go), growth cones don't get confused: At the cell surface, a receptor for Slit latches onto and “silences” a receptor for a different signal, called netrin.

    Additional studies this year focused on how the message received at the axon's surface translates into axon movement. It turns out that the signals activate regulators that instruct the actin cytoskeleton to build up the axon in one direction or another.

    Ultimately, understanding how the developing axon finds its way could help repair adult nervous systems. Human axons don't regrow once severed because something stops them. Modifying those signals could lure damaged axons—which do have growth cones—to grow back to where they're needed.


    Climatic confidence. A major milestone was reached in January when the United Nations-sponsored Intergovernmental Panel on Climate Change (IPCC) officially declared that “most of the observed warming over the last 50 years is likely to have been due to the increase in greenhouse gas concentrations.” But the panel was vaguer than ever about future warming.

    All the elements.

    Extreme weather, melting snow, and gathering clouds may signify global warming.

    CREDIT: NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION

    A better understanding of climate change finally allowed IPCC to pin much of the blame for rising temperatures on human activities. With better computer models and more data, it has become clear that even a combination of volcanic debris, solar flickering, and the natural jostlings of the climate system could not explain the 0.6°C warming of the past century. An impressive new line of evidence came from the compilation of temperature records extracted from tree rings and other climate proxies. From a millennial perspective, the 20th century stands out as unnaturally warm.

    The new scientific consensus spurred on negotiators working out how to implement the Kyoto Protocol for the control of greenhouse gas emissions, with the notable exception of the U.S. delegation. President George W. Bush pulled the United States out of the Kyoto process, citing high costs, an unfair advantage for developing countries, and remaining scientific uncertainties. Those uncertainties were evident in the IPCC's projections of possible warming by the end of the century, which ranged as high as a sizzling 5.8°C. Part of that spread arose from unresolvable technological and socioeconomic unknowns, but much of it lies in scientists' inability to pin down climate's sensitivity to increasing greenhouse gases. That uncertainty hasn't been narrowed in 20 years.


    Cancer in the crosshairs. Thirty years after President Richard Nixon and the U.S. Congress declared a “War on Cancer,” a new breed of cancer drugs is entering the clinic. This year, the U.S. Food and Drug Administration (FDA) approved a drug called Gleevec for use in treating one kind of leukemia. It's a milestone because Gleevec is the first small-molecule drug that works by targeting the specific biochemical defect that causes the cancer.

    Gleevec's development is an outgrowth of 3 decades' search for genetic changes that lead to cancer. For example, in chronic myeloid leukemia (CML)—the cancer treated with Gleevec—the fusion of two genes, called BCR and ABL, produces an abnormally active kinase enzyme that fuels the growth of the cancer cells. Because Gleevec inhibits that kinase, it is very effective against CML, particularly in the leukemia's early stages.

    Growth-regulatory kinases such as BCR-ABL are a prominent target of the new wave of anticancer agents. Indeed, 3 years ago FDA approved a monoclonal antibody called Herceptin for treatment of metastatic breast cancer. Herceptin binds to and blocks a growth factor receptor, itself a kinase, whose activity is thought to foster the growth of some breast tumors. Other drugs directed at kinases, including both small molecules and monoclonal antibodies, are now in preclinical and clinical trials against various cancers, as are a variety of drugs aimed at correcting other cancer-fostering defects. Dozens of clinical trials are now under way worldwide to test the efficacy of targeted cancer drugs.


    Banner year for Bose-Einstein. Cooled atoms that march in quantum lockstep were big news in 1995, and this past October that work garnered physics Nobel honors for its discoverers. But the field hasn't rested on its laurels.

    In March, two groups figured out how to make Bose-Einstein condensates (BECs) of metastable helium, a state in which high energy is stored in the atom's electrons. Each metastable atom is like a tiny bomb waiting to explode, so the result might one day lead to laserlike beams of atoms for carving nanocircuits out of silicon.

    Atomic vortex.

    Superfluid swirls were another BEC triumph this year.

    CREDIT: W. KETTERLE ET AL./MIT

    Researchers got their hands on other new condensates as well this year. One team cooled a mixture of lithium isotopes to create a kind of quantum pressure between atoms, similar to the force that keeps white dwarfs and neutron stars from collapsing. And in October, another group wrestled potassium atoms into a BEC. Both experiments ignited hope that even more members of the periodic table could join the club, leading to BECs with wholly new properties.

    Scientists this year also acquired new skills in manipulating the condensates. They observed a kind of atomic supernova, dubbed a “bose nova,” in a BEC, as part of the atomic vapor collapsed on itself, throwing off a miniature shock wave of atoms akin to the blast from a collapsing star.

    Lasers and BECs also made a good mix as researchers learned how to form a BEC using light alone, with no magnetic fields. Similarly, physicists trapped arrays of BEC bunches to create the first “squeezed” states of atomic matter, perhaps one route to future ultraprecise measurements.


    Carbon consensus. Researchers who had been puzzling over how much carbon dioxide is absorbed by U.S. forests and fields have finally reconciled their conflicting results. The outcome will help hone estimates of how much the planet may warm in future years.

    A “missing sink” had to exist: Even after scientists accounted for known carbon sinks, such as the oceans, less of the CO2 from burning fossil fuel remained in the atmosphere than expected. Three years ago, researchers who put atmospheric CO2 measurements into computer models concluded that North America is a humongous sink—big enough that plant ecosystems seemed to be sopping up most CO2 emissions. Researchers tallying carbon uptake with land surveys, however, found a much smaller sink.

    Big sponge.

    Researchers pinned down how much carbon is being absorbed by U.S. forests.

    CREDIT: KATHLEEN BROWN/CORBIS

    This summer, the two camps published a joint study in which their results finally match up. New atmospheric analyses covering a longer period—all of the 1980s—found a slightly smaller sink than before in the lower 48 U.S. states. And a more thorough ground-based tally of where the carbon goes—one that included overlooked pieces of the sink, such as decaying wood and forest soils, sediments in reservoirs, and exported wood and grains—yielded a bigger number. The result is a sink of about 0.5 petagram per year, or about one-third of current U.S. emissions.
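    The quoted fraction implies a U.S. fossil-fuel emissions figure of roughly 1.5 petagrams of carbon per year; the one-line check below makes that explicit (the emissions number is inferred for illustration, not stated in the study).

        sink_pg_per_yr = 0.5        # reconciled U.S. carbon sink
        emissions_pg_per_yr = 1.5   # inferred U.S. fossil-fuel carbon emissions

        print(f"fraction of emissions offset: {sink_pg_per_yr / emissions_pg_per_yr:.0%}")  # ~33%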

    With the U.S. carbon sink firmed up, researchers now hope a similar approach will pinpoint carbon uptake in other regions, such as the tropics, where it seems to fluctuate most. The bad news: Because much of the U.S. sink is due to ecosystems recovering from past exploitation, this sponge is steadily shrinking and will taper off within 100 years.

  4. BREAKTHROUGH OF THE YEAR

    Peering Into 2002

    Science's editors once again read the tea leaves for hot research areas in the coming year.

    Stem cells abroad. The Bush Administration has limited federal support for human embryonic stem (ES) cell research to work on cell lines derived before 9 August 2001. That still leaves the door open for unfettered research in privately funded labs and in countries with less restrictive rules. Look for progress in translating results from mouse to human ES cells as governments around the world clarify their rules and more scientists gain access to the human cell lines. But also watch for legal and commercial entanglements as companies race to stake their claims in the wide-open field.


    Proteomics. Genes tell cells what proteins to make, so figuring out how proteins interact is vital for leveraging genetic knowledge in medicine and biotech. It's a tough assignment: Although there may be as few as 35,000 genes, there could be millions of distinct proteins. But the will is there: Biotech companies and funding agencies are pouring hundreds of millions of dollars into untangling the proteome. Next year could see the first protein-based drug targets emerge from biotech proteomics.


    Eyes on the sky. Early in 2002, the second of the Gemini project's 8-meter telescopes will be dedicated in Chile, following its sibling, which saw first light in Hawaii 2 years ago. The Very Large Telescope array (also in Chile), now fitted with new “adaptive optics,” reportedly sees as well as the Hubble Space Telescope. Next year, big sky efforts such as the Sloan Digital Sky Survey should continue to produce solid results. Also on the horizon is the Virtual Observatory, a vast network of astronomical databases. Often called the World-Wide Telescope, it should open the heavens to rich discoveries in coming years.


    Next in genetics. Chronic diseases generally result from the interplay of multiple genes. Geneticists have made much progress pinning down the genetic basis of single-gene disorders, but the roots of more complex diseases have remained elusive. With the human genome sequence in hand, researchers expect to begin determining the relative contributions of various genes to problems such as heart disease, cancer, and diabetes.

    Optical clocks and constants. Because they're based on higher-frequency visible light waves rather than microwave radiation, optical clocks are an order of magnitude more accurate than previous instruments. They should lead to more precise global positioning systems and a new generation of experiments to test and challenge the fundamental constants of nature. In 2002, expect an increasing pace of research as optical clocks become the gold standards by which other important measurements are judged.

    CREDIT: ILLUSTRATION: TERRY E. SMITH
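    The frequency argument behind that accuracy gain is simple, as the sketch below shows; the 10^15-hertz optical frequency stands in for a typical clock transition and is an assumption, while the cesium value is the microwave transition that defines the second.

        # A clock that ticks faster divides time more finely: for a comparable
        # ability to locate the line center, fractional precision scales with
        # the transition frequency.
        cesium_hz = 9_192_631_770   # microwave transition defining the SI second
        optical_hz = 1.0e15         # assumed optical transition frequency

        print(f"frequency headroom: {optical_hz / cesium_hz:,.0f}x")   # ~100,000x

    Even after practical losses, that headroom is why an order-of-magnitude accuracy gain over microwave standards is within reach.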

    Visualization of complex systems. New imaging technology and ever-faster computers will come together to allow a closer look at biological molecules and their interactions. Recent lab successes with electron cryomicroscopy and electron microscope tomography are merging with computer simulation to create new views of how proteins work with each other. And labeling techniques for tagging proteins, lipids, and other biomolecules are evolving rapidly toward the goal of watching cell signaling as it occurs in space and time.

  5. BREAKTHROUGH OF THE YEAR

    Scorecard 2000

    In which the editors face the music and polish their crystal balls.

    Infectious diseases.

    There were few breakthroughs in drug development for the world's major killers—HIV, tuberculosis, and malaria—and new vaccine candidates have only recently entered the testing phase. On the other hand, interest in tackling infectious diseases kept surging—especially after the 11 September attacks and the spate of anthrax letters.

    New views of the ocean.

    A satellite called SeaWiFS provided the first multiyear, global picture of photosynthesis on both land and sea and detected a burst in ocean plant growth during an El Niño-La Niña shift. SeaWiFS also revealed how large “planetary” waves pump nutrients to the ocean surface. However, data from NASA's Terra satellite, which will add ocean fluorescence to the picture, are still awaiting validation.


    RNA surveillance.

    It has been a banner year for RNA interference (RNAi) (see Runners-Up, p. 2443). 2001 saw the identification of the enzyme, appropriately called Dicer, that kicks off the whole RNAi process by chopping double-stranded RNA precursor molecules into small RNAs. Dicer has also been implicated in the control of gene expression by small RNAs during development. Just last month, the RNAi field took another leap forward with the finding that the cell can boost RNAi by amplifying the number of small RNAs through the action of a cellular RNA-directed RNA polymerase.

    Follow the money.

    With the exception of the National Institutes of Health, presidential candidate George W. Bush's support for research during the campaign was AWOL this spring in his first proposed budget. But legislators repaired much of the damage, giving several science agencies more than they had requested. At the same time, research budgets in most of Europe and Asia were protected from the worst effects of a global downturn, because governments continue to see science and technology as a good way to bolster their long-term economic prospects.

    Quark soup.

    In 2001, scientists at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory in Upton, New York, saw evidence of “jet quenching”: a sign that nuclear particles might have melted into their component parts, creating a quark-gluon plasma. Although the results are still open to interpretation, and theorists are talking about new concepts such as “colored glass” states, it has been a good year in quark-gluon plasma research. If all goes well at RHIC, next year will be even better.


    Cellular asymmetry.

    Although 2001 has been a quiet year for asymmetry—how a cell tells its left from its right and its top from its bottom—the asymmetric distribution of messenger RNAs (mRNAs), proteins, and stem cells in the developing Drosophila embryo has been mapped. Also, in developing flies the distribution of Wingless mRNA and Stardust protein was found to be important for establishing epithelial cell polarity.

    CREDIT: CRYSTAL BALLS: TERRY E. SMITH

  6. BREAKTHROUGH OF THE YEAR

    Breakdowns of 2001

    Bush Mystery Science Theater. Never has a modern president tackled so many hot scientific issues with so little help. In his first year, Bush laid out new policies, from stem cells and missile defense to climate change and the amount of arsenic that should be allowed in drinking water. Then came anthrax. Through it all, many of the appointees who might normally be expected to offer technical advice—from the White House science adviser to the heads of the National Institutes of Health (NIH) and the Food and Drug Administration (FDA)—were invisible. It took a record 10 months to install science adviser John Marburger in his post, and the NIH and FDA positions remained unfilled in mid-December. The lackadaisical pace drew criticism, but White House officials pointed to an increasingly onerous appointments process, while thorny politics made recruiting for some jobs difficult.

    An early meeting of the new president and his science team.

    CREDIT: ILLUSTRATION: TERRY E. SMITH

    Where no budget has gone before. Spacewalkers continue to build the international space station in orbit. But on the ground, the research facility seems to get smaller and smaller. Frustrated by cost overruns on the program, the White House earlier this year ordered NASA to make drastic cuts, and the result has infuriated researchers. They argue that the crew will spend the vast majority of its time keeping the station operating rather than doing science. A blue-ribbon panel in November backed a strong research effort but called for the space agency to prove it can finish building the modest version before committing to the more ambitious design. Those findings, however, triggered a revolt among the international partners, who say that the smaller version would leave their own research programs high and dry.

  7. U.K. RESEARCH FUNDING

    Universities Raise Their Game, But the Money Doesn't Flow

    1. Andrew Watson*
    1. Andrew Watson is a writer in Norwich, U.K.

    NORWICH, U.K.— British researchers won warm praise last week for their talent, but it came with a splash of fiscal cold water.

    Reporting on a mammoth review carried out every 5 years or so to allocate funding for research infrastructure, Howard Newby, head of the Higher Education Funding Council for England (HEFCE), beamed that the U.K.'s researchers are “performing better than any [country's] in the world at the present time.” Unfortunately, these hardworking researchers have raised the level of their game so fast that they have stripped bare the government fund designed to support them.

    HEFCE and its sister councils in the rest of the U.K. use a formula based on the results of the Research Assessment Exercise (RAE) to distribute $2 billion of research infrastructure funds—paying for computers, libraries, and staff, for example—each year. The hugely improved results mean that they are $290 million short in the financial year that begins in August. On 14 December HEFCE's board voted unanimously to respect the RAE results but to tinker with the funding formula to funnel scarce resources to those departments with the highest rating.

    The rich-getting-richer approach rankles officials at middle-ranking universities, where science departments appear to have struggled in vain. “[This] throws into question the whole incentive system that was here previously in the RAE, that if you did better you got more money,” says Ben Martin, head of the Science Policy Research Unit at the University of Sussex. “I think once you move outside the top few, discontent with the whole system may be quite substantial.”

    Begun in 1986, the review this year enlisted 60 expert panels to assess the best four recent papers from 50,000 U.K. academics. Other indicators, such as invitations to international conferences, journal editorships, and lab visits from well-known researchers, were also considered. The 17-member medicine panel alone faced 14,000 individual items over 5 months. “It was certainly quite a busy period over the summer,” says panel chair Leszek Borysiewicz, an immunologist at London's Imperial College. Nearly 300 international experts were called upon to double-check the top rankings, which are meant to compare departments, not individuals, says John Rogers, RAE manager. “97% of the people we asked confirmed that the panels had got it right,” he says.

    Star performer.

    With its 30 top-rated departments, Cambridge University's funding is safe.

    CREDIT: MICHAEL S. YAMASHITA/CORBIS

    Each department is scored on a seven-point scale, with the coveted 5 and 5* representing international excellence and a 4 going to departments whose work is virtually all deemed of national excellence. Levels of 3a, 3b, and below are assigned to departments with less stellar track records. Under the existing formula, the funding councils' infrastructure funds kick in for departments that achieve a grade of 3b or higher. The higher echelons receive disproportionately more: A 5* wins 4.05 times as much funding per capita as a 3b department.
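    To make the weighting concrete, here is a minimal sketch of how such a formula could divide a pot of infrastructure money. Only the 4.05:1 ratio between 5* and 3b departments comes from the article; the intermediate weights, the function name, and the example departments are hypothetical.

        # Hypothetical per-capita weights; only the 4.05:1 endpoint ratio
        # between 5* and 3b is from the article.
        weights = {"3b": 1.0, "3a": 1.5, "4": 2.25, "5": 3.375, "5*": 4.05}

        def allocate(pot, departments):
            """Split a funding pot across (grade, staff) departments in
            proportion to staff numbers times the grade weight."""
            shares = {name: weights[grade] * staff
                      for name, (grade, staff) in departments.items()}
            total = sum(shares.values())
            return {name: pot * share / total for name, share in shares.items()}

        # Two equally staffed departments: the 5* department draws 4.05
        # times as much money as the 3b department.
        print(allocate(1_000_000, {"A": ("5*", 20), "B": ("3b", 20)}))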

    The latest RAE results, which include humanities and social sciences, show a huge improvement, with several institutions showing dramatic gains in the last 5 years across a spectrum of subjects (see table). For example, the University of Manchester increased its number of 5 and 5* departments from 28 to 37. But at least one high-profile institution has seen its star dislodged: The University of Oxford was toppled from the number one spot by archrival University of Cambridge, slipping to number three behind Imperial College. Altogether, 55% of researchers now work in departments of international standing, compared with 31% just 5 years ago. “There's absolutely no doubt that the quality of the work that was submitted to us has gone up,” says Bristol University's John Enderby, who headed the physics panel.


    To probe for any bias in the assessment, HEFCE asked Evidence, a consulting firm in Leeds, to check the results using citation analysis. “The citation analyses show that the U.K.'s average performance has improved,” says Evidence chief Jonathan Adams. Since the last RAE, Britain's average number of citations per paper has risen from 23% above the world average to 38% above it. “It's a pretty stunning performance,” says Adams.

    HEFCE officials view the improvement in research quality as a validation of their approach of channeling more money into strong departments. “This is actually selectivity at work. We gave more money last time to those that were 5 and 5*. And what did they do with it? They invested it in more staff,” says Bahram Bekhradnia, HEFCE's director of policy. Newby says there is anecdotal evidence of large staff turnovers in the past 5 years in which high-caliber researchers tended to replace underperforming staff members. Indeed, says Bekhradnia, “there's been a lot of evidence of deliberate policies to bring in new blood.”

    Sadly, many university departments whose improvements are not enough to place them in the upper echelon will find the RAE an exercise in frustration. The government has ruled out any extra cash for the funding councils to maintain current funding levels while also rewarding those departments that have improved their rating. Faced with the cash crunch, the HEFCE board has “committed ourselves to funding 5*'s: They will get the full amount,” says Newby. But the simple solution—putting 3a and 3b departments into the pool that won't receive funding—would not erase the shortfall. The HEFCE board meets again on 23 January to decide on a final formula.

    Skewing the funding even further toward the top-rated institutions is likely to provoke a furor in some departments. Research funding is already very selective, says Roderick Floud, president of Universities UK, an umbrella organization representing all of the U.K.'s universities. “It should not be made more selective,” he says. Enderby says he would welcome a “flatter” funding model: “Small centers of excellence must not be squashed out.”

    “There is always a dilemma … about research funding, about whether one puts all the resources toward the most excellent alone, or whether one holds some money back to fund research capability in areas which are not currently very strong but which have the potential to get stronger in the future,” says Newby. As with most dilemmas, it's easier to describe than to resolve.

  8. BIOMEDICAL RESEARCH

    Case Institute a False Start

    1. David Malakoff*
    1. With reporting by Eliot Marshall.

    Ten weeks after leaving the top job at the $4 billion National Cancer Institute (NCI) to start a new biomedical research institute, biologist Richard Klausner has jumped again, this time to help the government combat terrorism. He will serve as a liaison between the U.S. National Academies and the government's antiterrorism efforts while maintaining a lab at NCI. Klausner's surprise move means that the $100 million organization he was supposed to lead has folded before it even began.

    Klausner announced his departure from NCI on the morning of 11 September, just as the planes crashed into the World Trade Center and the Pentagon (Science, 14 September, p. 1967). He said he had a commitment from Jean Case and her husband, America Online founder Steve Case, to fund the Case Institute of Health, Science and Technology, with him at the helm. In October, he told Science that the Cases had agreed to spend nearly $100 million over the next few years on bioinformatics studies and other life sciences research, including awarding grants to outside researchers and hiring an in-house science team.

    Turning to terror.

    Richard Klausner will become the National Academies' point person on antiterrorism.

    CREDIT: SAM KITTNER

    Those plans were shelved earlier this month, Klausner says, after his volunteer work as the co-chair of a new National Academies panel on antiterrorism began to take the lion's share of his time and interest. “The Cases and I felt that launching something new from scratch wasn't doable” given his time commitment to antiterrorism efforts, he says. “The nation's interests come first, and we fully support [Klausner's] decision,” says Jean Case.

    Klausner, who says that his new position “is a calling,” estimates that it will take up half his time for at least a year, leaving him an opportunity to continue running his NCI laboratory. National Academy of Sciences president Bruce Alberts says Klausner will be responsible for keeping an eye on the academies' many studies on terrorism-related subjects, helping fulfill government requests for immediate technical assistance, and sharing potentially useful ideas with the White House Office of Science and Technology Policy (OSTP). OSTP head John Marburger says Klausner's knowledge of Washington will make him “a very effective interface” between science and government.

    Jean Case says there are “no plans to revisit” the idea of creating the institute. Klausner's two assistants have shifted to other work within the larger Case Foundation, which supports a variety of education and technology programs.

  9. PARTICLE PHYSICS

    Sign of Supersymmetry Fades Away

    1. Adrian Cho*
    1. Adrian Cho is a freelance writer in Boone, North Carolina.

    Like that extra $223.78 that seemed to enrich your checking account before you realized your mistake, a tantalizing hint of new elementary particles has vanished once physicists double-checked their math. The culprit: an extra minus sign in one of the calculations.

    In February, researchers at Brookhaven National Laboratory in Upton, New York, reported that a particle called the muon was 4 billionths more magnetic than predicted by the Standard Model of Particle Physics (Science, 9 February, p. 958). The muon's magnetism depends on other particles that flit in and out of existence too quickly to be directly observed, so the tiny discrepancy between the measured and predicted values suggested the presence of new, unaccounted-for particles. Many physicists interpreted the result as possible evidence of supersymmetry, a theory that predicts a much more massive partner for every type of particle known today.

    But the true cause of the discrepancy is far less exciting. Misled by an extra minus sign, theoretical physicists underestimated the predicted value of the muon's magnetism. To find that value, physicists must add up the results of a series of hugely complicated calculations, one for each combination of particles emitted and reabsorbed by the muon.

    In 1995 two groups independently calculated the contribution for a combo known as “hadronic light-by-light scattering,” but both got the sign of the answer wrong. If the negative results are made positive, the theoretical prediction for the muon's magnetism goes up enough to shrink the discrepancy by nearly half. That leaves the difference only slightly bigger than the theoretical and experimental uncertainties, spoiling the case for new particles.
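    The arithmetic of the sign flip is easy to lay out. The numbers below, in units of 10^-10 on the muon anomaly, are rounded approximations of the published values, chosen to show the logic rather than reproduce the exact figures.

        # Approximate values in units of 1e-10 on the muon anomaly.
        discrepancy = 43.0      # measured minus predicted, February 2001
        light_by_light = -8.5   # hadronic light-by-light term as computed in 1995

        # Flipping the term's sign raises the prediction by twice its magnitude,
        # so the measured-minus-predicted gap shrinks by the same amount.
        shift = 2 * abs(light_by_light)
        print(f"corrected discrepancy: {discrepancy - shift:.0f}")  # ~26, nearly half of 43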

    Are you positive?

    The sign of this elaborate expression originally led physicists astray.

    CREDIT: M. KNECHT AND A. NYFFELER

    The sleuths who cracked the case were Marc Knecht and Andreas Nyffeler of the Center for Theoretical Physics in Marseille, France. The pair repeated the calculation after improving a key mathematical ingredient called a “form factor.” “We thought perhaps with our better description we could get an improved value. But we didn't expect to get the opposite sign,” Nyffeler says.

    Knecht and Nyffeler's paper appeared on 6 November on the Los Alamos preprint server, leading the other two groups to double-check their work. Both found mistakes that affected the outcome.

    In a paper posted on the Los Alamos server on 6 December, Masashi Hayakawa of the KEK laboratory in Tsukuba, Japan, and Toichiro Kinoshita of Cornell University in Ithaca, New York, report that in their 1995 calculation they had inadvertently introduced an extra minus sign when using a specialized computer program to help grind through the staggering quantities of algebra. “We misunderstood the program in a subtle way,” Kinoshita says.

    Meanwhile, Johan Bijnens of Lund University in Sweden says that in 1995 he and collaborators Elisabetta Pallante of the International School for Advanced Studies in Trieste, Italy, and Joaquím Prades of the University of Granada in Spain performed the calculation in two different ways, and now they think they've found a different mistake in each approach. “We're still checking it, but it seems that we have an overall sign error, too,” Bijnens says.

    The winnowing of the discrepancy is disappointing, says Lee Roberts, an experimental physicist at Boston University and spokesperson for the Brookhaven team: “You always want to be the first to discover something new and exciting.” However, the discovery of the mistake has an upside, Roberts says. The scrutiny that unearthed the mistake has also improved the various calculations in ways that put the theoretical prediction on firmer ground. “I view it as a positive step that we have a lot more confidence in the theory than before,” Roberts says.

    The Brookhaven experimenters hope to publish an even more precise measurement for the muon's magnetism in March based on four times as much data as they used in their original measurement. Those results should allow physicists once again to say whether the magnetism of the muon adds up to a tiny sign of something big.

  10. GERMAN RESEARCH

    Helmholtz Reforms Mollify Scientists

    1. Gretchen Vogel

    BERLIN— A controversial plan to inject more competition into Germany's largest research organization, the Helmholtz Association of National Research Centers, has been modified to ensure that basic research remains a priority. Physicist Walter Kröll, the newly elected president of the association, discussed details of the plan last week at a press conference following the first meeting of the reorganized Senate.

    The reforms, to take effect in 2003, will replace a system of block grants for the 15 member institutes, among them the DESY synchrotron in Hamburg, the German Cancer Research Center (DKFZ) in Heidelberg, and the Earth Sciences Research Center in Potsdam. Instead of each institute receiving its own grant, the federal and state governments will divide the association's $1.5 billion yearly budget into so-called program areas. A review committee will determine how to distribute each program's budget between the centers based on each center's performance and its research priorities.

    In May, more than 4300 scientists and other Helmholtz employees signed a petition protesting the reorganization (Science, 11 May, p. 1038). They feared that the new system would give too much power to the federal government to decide research priorities and would reduce the ability of scientists to follow the hottest areas. In response, the government has modified the association's constitution, including making “basic research” a fundamental goal and bolstering the presence of scientists in the 22-member Senate, a governing body that also includes bureaucrats and politicians.

    New president.

    Walter Kröll takes over at the revamped Helmholtz Association.

    CREDIT: HELMHOLTZ-GEMEINSCHAFT

    Molecular immunologist Peter Krammer of the DKFZ, one of the most outspoken opponents of the plan last spring, says the changes have satisfied many of his concerns. Particularly important, he says, the new constitution states that institutes may spend up to 20% of their total expenditures on research unrelated to their central focus—allowing scientists at the DKFZ, for example, to pursue promising research in immunology or developmental biology. “I see no basis for fundamental opposition at the moment, but let's see how it works,” he says.

    Albrecht Wagner, director of DESY, says his initial concerns over long-term funding for large-scale projects such as the planned TESLA linear accelerator were addressed in a letter from federal research minister Edelgard Bulmahn to Kröll promising that labs would be free to sign international agreements extending beyond the normal funding cycle.

    Kröll, the head of the German Air and Space Research Institute in Cologne, said last week that the reforms will strengthen the society. “There is no getting around it; the reforms must be implemented as soon as possible,” he said. “We want the Helmholtz Association to be a leading pillar of European research and to make a globally recognized mark.”

    Most scientists would agree, especially now that their concerns have been acknowledged. “The old system definitely had to be changed,” Krammer says. “If the new system fulfills the purpose, that is, to stimulate more dynamite research, then that's wonderful.”

  11. CANADIAN BUDGET

    Most Initiatives Stalled; Health Research Grows

    1. Wayne Kondro*
    1. Wayne Kondro writes from Ottawa.

    OTTAWA— A 7% increase when many other budgets are being squeezed might seem cause for celebration. But to Marc Renaud, the head of Canada's Social Sciences and Humanities Research Council, the 7% boost in his 2002 budget falls so far short of what a recent parliamentary committee agreed was needed that he is thinking about heading back to academia. And Renaud might have company: Last week's budget blueprint from federal Finance Minister Paul Martin served up more losers than winners for a scientific community with a long wish list fueled by a once-robust economy.

    To nobody's surprise, the biggest new wrinkle in Canada's $86 billion budget for the year beginning 1 April 2002 is $1.1 billion for counterterrorism initiatives. Those programs will gobble up a big chunk of the overall budget increase of 4.7%, putting a squeeze on the rest of the government. As a result, astronomers were shut out in seeking money for some $100 million in new facilities over the next decade. Separate proposals to create the National Academies of Canada (Science, 27 October 2000, p. 685), the Biomolecular Interaction Network Database (Science, 8 June, p. 1813), and a program to improve university information technology departments (Science, 20 April, p. 413) also failed to win support.

    At the same time, Canada's three research councils that fund academic research fared well, at least on paper. The Canadian Institutes of Health Research, the largest of the three, will receive a 15% increase, to $353 million. “It'll allow our institutes to continue the momentum and the gains they've made over the last year,” says a pleased Alan Bernstein, who took over the newly constituted body some 18 months ago (Science, 8 September 2000, p. 1675). The budget for the Natural Sciences and Engineering Research Council will go up 7%, to $327 million.

    The new budget leaves astronomers up in the air, however. First on their list of proposed projects is a 5% stake in the $650 million Atacama Large Millimeter Array radio telescope, which the United States, Europe, and Japan are building in Chile's Atacama Desert. The National Research Council still hopes to “scratch and save up” $31 million over 5 years, says NRC president Arthur Carty—but only if it can get the Canada Foundation for Innovation to chip in nearly half the total.

    Heading home?

    Social sciences head Marc Renaud says the tight budget may drive him from Ottawa.

    CREDIT: SOCIAL SCIENCES AND HUMANITIES RESEARCH COUNCIL

    Even the long-rumored federal foray into providing overhead costs for publicly funded research (Science, 27 October 2000, p. 687) was watered down in the battle against terrorism. Dreams by university administrators of a permanent $250-million-a-year program turned into a “one-time” allocation of about $125 million. Left unresolved is whether to give small universities a bigger slice of the pie to help expand their research capacity.

    Still, educators hail the government's recognition of its role in supporting the indirect costs of council-funded research. “It is a substantial start,” says Martha Piper, president of the University of British Columbia in Vancouver. And they're not giving up on their campaign for a permanent fund. “We might get it all 2 or 3 years down the road,” says Robert Giroux, president of the Association of Universities and Colleges of Canada.

    Renaud is not so sanguine about the budget prospects for his council. Earlier this year, legislators reached a consensus that the social and behavioral sciences have been chronically underfunded and proposed a rapid doubling of the council's budget. So the $6 million increase, to $91 million, was a big disappointment. “There seems to be nothing that can convince this government of the usefulness and the need to support the social sciences,” says Renaud, a sociologist who was at the University of Montreal when he was recruited in 1997 to rescue a body embroiled in internal bickering (Science, 11 April 1997, p. 195). “The question for me is: Am I going to stay in Ottawa?”

  12. CIRCADIAN RHYTHMS

    A Time to Rest: Clock Signal Identified

    1. Marcia Barinaga

    Pet hamsters love to run on their wheels at night, the time when their wild cousins venture out of their burrows to scurry around in search of food. Wheel-running—or its natural equivalent—is just one of many activities and physiological states that follow a daily rhythm, cycling up and down about every 24 hours, under the control of an internal biological clock known as the circadian clock. Researchers have made tremendous progress recently in identifying the molecular gears and levers that run the clock, but they have remained in the dark about an equally important part of the clock: the signals it sends out to control circadian behaviors.

    Now, on page 2511, a team of researchers led by Charles Weitz at Harvard Medical School in Boston reports that it has discovered the first known output signal from the mammalian clock, a molecule called transforming growth factor alpha (TGF-α), known for its role in cancer and embryonic development.

    To run or not to run.

    Hamsters and other nocturnal rodents run at night when TGF-α levels are low.

    CREDIT: PHOTO RESEARCHERS INC.

    Clock researcher Michael Menaker of the University of Virginia in Charlottesville calls the work “a pioneering effort” that opens the way for clock researchers to begin to study the neural circuits by which the clock controls behavior and physiology. Joe Takahashi of Northwestern University in Evanston, Illinois, agrees: “One of our big gaps in understanding is how signals leave the [clock].” This paper, he says, begins to fill that gap.

    Weitz's team based its quest for clock signals on work by several research teams showing that the brain's suprachiasmatic nucleus (SCN), home of the mammalian clock, secretes something that controls wheel-running. Weitz postdoc Achim Kramer searched for molecules secreted by SCN neurons, and Weitz scanned the literature for known SCN products. “About 20 secreted peptides have been documented from SCN,” Weitz says, “and no one knows what most of them do.”

    Weitz's team took a selection of newly discovered and previously published SCN products and collaborated with Fred Davis's lab up the street at Northeastern University in Boston to test each candidate molecule's effects on wheel-running in hamsters. They infused each molecule continuously for 3 weeks into the brain near the SCN and monitored the animals' wheel behavior. Of 32 molecules they tested, TGF-α stood out: It completely stopped the animals from running in their wheels. Other experiments done with Tom Scammell across the street at Beth Israel Deaconess Medical Center showed that the hamsters weren't paralyzed or otherwise impaired. They just lost their nocturnal urge to run.

    The researchers hypothesized that TGF-α may be produced by the clock during the day to inhibit running. If so, they reasoned, TGF-α production by the SCN should be high during the day and low at night—and that is exactly what they found. What's more, TGF-α would need to act on a nearby brain area called the subparaventricular zone (SPZ), which controls daily running rhythms. And when they looked in the SPZ, they found that its cells contained the epidermal growth factor (EGF) receptor, which is also the receptor for TGF-α.

    To confirm that TGF-α acts through the EGF receptor to block wheel-running, the researchers tested EGF, a molecule that works through the same receptor but is not made by the SCN. They found that it also suppressed wheel-running. What's more, Weitz's team found that mutant mice with impaired EGF receptors don't seem to register TGF-α's signal; they run during the day more than normal mice do.

    The mutant mice were oblivious to external signals as well. Normal mice stop running at night if the lights are turned on. This response, called “masking,” seems to work like an emergency backup system to assure that nocturnal rodents aren't scurrying around when predators can see them. The mutant mice were “grossly deficient” in masking, Weitz says.

    Masking seems to be controlled by a signal carried directly to the SPZ by a small subset of ganglion cells, neurons found in the retina. Because the mutants with impaired EGF receptors kept running despite the light, the researchers reasoned that the EGF receptor also mediates masking, and they went looking for the signal that triggers it. They found that a small percentage of ganglion cells—about the percentage that are known to connect to the brain area called the hypothalamus, where both the SCN and the SPZ are located—contain EGF. “We don't know that those cells project to the hypothalamus yet,” says Weitz, “but the numbers and distribution are very similar to the subset of ganglion cells that do.” Weitz cautions that the data are not conclusive, but they suggest that both light and the clock may use the same molecular pathway to send the “don't run” signal. Clifford Saper of Harvard Medical School plans to use a method developed in his lab to see whether the EGF-containing retinal cells connect to the hypothalamus.

    Meanwhile, Weitz and colleagues will continue to test molecules secreted from the SCN, looking for others that regulate wheel-running. For example, there must be a positive regulator that triggers the running activity at night. With such a molecule in hand, Weitz says, they can then look at how the positive and negative signals interact in the SPZ, whether they are received by the same or different neurons, and how those neurons interact to tell the motor centers of the brain to run or not to run.

  13. LASER TECHNOLOGY

    Hot New Beam May Zap Bandwidth Bottleneck

    1. Robert F. Service

    Still waiting to download full-length movies over the Internet or chat on a video phone? Blame the “last mile problem.”

    Long-distance fiber-optic cables transmit billions of data bits every second, plenty for these applications. But it would require digging up city streets, and more, to connect those cables to your home or office. Companies have recently been pursuing an alternative to laying fiber called free-space optics, in which an infrared (IR) laser beams data to a receiver on your rooftop. But the only cheap semiconductor lasers available aren't really up to the job. They work at a relatively short wavelength of about 1.5 micrometers, and the beams typically travel only a few hundred meters before being absorbed by water vapor in the air.

    A report published online this week by Science (http://www.sciencexpress.org/) suggests a new way to go the distance. A Swiss research team led by Jérôme Faist and Mattias Beck of the University of Neuchâtel describes making a new semiconductor IR laser with a wavelength of about 9 micrometers. Because water vapor—and even rain, snow, and smog—absorbs only a tiny amount of light at that wavelength, free-space optical systems built with the new laser should work at distances of 2 kilometers.

    “We're very excited about it,” says Jim Plante, president of Maxima Corp., a San Diego, California, company that is working to develop free-space optics technology. “With this technology we can conquer the weather.” The longer wavelength lasers could also open a wealth of new research opportunities in atmospheric chemistry and medical diagnostics.

    The Swiss group's lasers operate in a section of the spectrum known as mid-IR. Mid-IR lasers that generate light in a chamber filled with CO2 gas have been around since the 1960s. But these lasers are bulky and expensive and must be cooled with cryogenic liquids. Research teams have begun to make semiconductor chip-based models that are small and potentially cheap, but they also require very low temperatures to keep them from burning out. Even then, the lasers spit out only intermittent pulses, rather than the continuous stream of light that is preferable for most applications.

    Filling the gap.

    A new laser may put short-range data flow on a par with long-range fibers.

    CREDIT: ADAPTED FROM JIM PLANTE/MAXIMA CORPORATION

    To solve these problems, Faist, Beck, and their colleagues designed a novel structure for a “quantum cascade laser.” QCLs, first developed in 1994 by Faist and Federico Capasso at Lucent Technologies' Bell Laboratories in Murray Hill, New Jersey, are an offshoot of traditional semiconductor lasers. In standard chip-based lasers, different semiconductor alloys emit different wavelengths of light. Gallium arsenide, for example, emits red light, whereas gallium nitride emits blue. In a QCL, however, the wavelength of the light is determined by the thickness of the active materials used. Each device consists of numerous semiconductor layers. When an electric current flows through them, electrons cascade down an electronic waterfall, an energetic staircase with numerous steps; when an electron hits a step, it emits a laser photon.
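
    The arithmetic behind that design rule is simple. A photon's energy and wavelength are linked by E = hc/λ. As an illustrative check (the numbers are assumed here, not taken from the Swiss paper), a 9-micrometer mid-IR photon carries

        E = hc/λ ≈ (1.24 eV·µm)/(9 µm) ≈ 0.14 eV,

    much smaller than the band gaps of standard laser alloys such as gallium arsenide (about 1.4 eV). That is why QCL designers tune the thickness of the layered staircase, rather than the choice of alloy, to set the emission wavelength.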

    That's the theory. In reality, however, not all the electrons cascading in mid-IR QCLs are converted to photons. Many are absorbed by the semiconductor lattice, which generates enough heat to burn out the devices quickly. Researchers coped with the heat buildup by cooling the lasers to cryogenic temperatures and generating staccato pulses instead of a continuous beam.

    The Swiss researchers, however, decided to make the laser shorter and narrower. “That allows electrons to tunnel more efficiently through the whole structure” and generate less heat, Beck says. Embedding the active region of the device in indium phosphide, an excellent heat conductor, whisked away the remaining heat. The new lasers not only work at room temperature but can produce a continuous beam of light without burning out.

    Plante says the approach could revolutionize free-space optical communications. Although companies specializing in the technology have raked in more than a billion dollars in investments, Plante says, the industry has been struggling to put shorter wavelength lasers to work. “The use of mid-IR allows you to get the job done,” he says. His company has already begun testing Faist and Beck's new QCLs, the latest versions of which can transmit data hundreds of times faster than a standard ISDN or T1 line.

    The novel mid-IR lasers could be equally important for spectroscopy studies designed to detect particular gases in the air, says Frank Tittel, an applied physicist at Rice University in Houston. Potential users include atmospheric scientists measuring pollutants and medical researchers diagnosing disease from compounds in a patient's breath. Cheap and portable mid-IR QCLs, says Tittel, “would be a great jump forward.”

  14. NOBEL STATEMENT

    Laureates Plead for Laws, Not War

    1. Wayne Kondro*
    1. Wayne Kondro writes from Ottawa.

    OTTAWA— A majority of the world's living Nobel laureates issued a statement last week urging industrial nations to work cooperatively to address conditions that they believe contribute to global terrorism and unrest in the developing world. The statement, signed by 110 laureates and released at the 100th anniversary of the prizes, identifies poverty, global warming, and the spread of arms as a combustible mix, and it points to several international agreements as examples of the kinds of measures that should be encouraged. Ironically, a few days after the statement was released, the Bush Administration announced that the United States is withdrawing from one of those agreements, the 1972 Anti-Ballistic Missile (ABM) Treaty.

    The message “is a call, not to arms, but to disarm the source of major tensions in the world,” says the University of Toronto's John Polanyi, a 1986 chemistry laureate and the driving force behind the letter. “It's also a call for replacement of war by law.” If Alfred Nobel could give away an immense fortune to reward achievements in science, literature, and peace, Polanyi decided, the least his colleagues could do is think idealistically about how to improve the world.

    Bully pulpit.

    Some 110 living Nobel Prize-winners have signed on to John Polanyi's 100th anniversary statement.

    CREDITS: (TOP TO BOTTOM) PETER BREGG; PRESSENS BILD/HENRIK MONTGOMERY/AP

    The statement, signed by luminaries in science, medicine, literature, and world affairs (see www.sciencemag.org/feature/data/nobel.shl for full text and list of signatories), says: “The most profound danger to world peace in the coming years will stem not from the irrational acts of states or individuals but from the legitimate demands of the world's dispossessed. … If, then, we permit the devastating power of modern weaponry to spread through this combustible human landscape, we invite a conflagration that can engulf both rich and poor.”

    “Science alone, technology alone, is not sufficient to deal with these issues,” says Massachusetts Institute of Technology chemist and 1995 Nobel recipient Mario Molina. “We need strong commitments and values from society that technology and science are put to good use.” The statement mentions the Kyoto Convention on Climate Change, the Strategic Arms Reduction Treaties, the Comprehensive Test Ban Treaty, and the ABM Treaty as agreements fostering a similar spirit of community. U.S. withdrawal from the ABM Treaty is a “serious mistake,” Polanyi says, adding that nations are “fooling themselves” if they think safety can be found behind the protective walls of new missile screens. Crawling into an armed hole offers only the pretense of safety, Polanyi argues.

    Polanyi began talking to his Nobel colleagues last July about drafting the statement. About 30 laureates declined, he says, for reasons ranging from its omission of population control to a general distaste for political commentary. Many laureates who initially thought that the statement might be presumptuous or an oracle of the future changed their minds, he notes, after deciding that “the alternative, having a high level of education and some public prominence and not saying what you believe, is even worse.”

    The message shouldn't be labeled liberal or conservative, Molina says. “To me, it's rational. It's the only means to provide stability in the long run. It's also what we think is fair and justifiable from an ethical point of view.”

    Molina says there is no formal plan to achieve official recognition of the document among governments or international bodies. “But there might well be next steps that each of us take as individuals,” he says. Adds Polanyi: “I don't think one can afford to discount the thinking of scientists in an age of science.”

  15. SCIENTIFIC MISCONDUCT

    Psychologist Made Up Sex Bias Results

    1. Constance Holden

    The career of a promising young social psychologist lies in ruins following her admission that she “fabricated” five experiments on social discrimination that she conducted while at Harvard University.

    Last week the Office of Research Integrity (ORI) of the Department of Health and Human Services announced that Karen Ruggiero, 33, who last year moved to the University of Texas (UT), Austin, “engaged in scientific misconduct by fabricating data in research supported by the National Institutes of Health.” A September report from Harvard assistant dean Kathleen Buckley to Harvard's Standing Committee on Professional Conduct cites a 21 August letter in which Ruggiero acknowledged that the manuscripts were based on “fabricated” data. In addition to retracting four published studies, Ruggiero is banned from receiving federal research funds or serving on government advisory committees for 5 years. A woman who answered the phone at her Texas home declined to discuss the case.

    Ruggiero was an “up-and-coming social psychologist,” according to social psychologist James Sidanius of the University of California, Los Angeles. She was prominent in a field called “the psychology of legitimacy,” in which scientists examine how and why the underdogs in social hierarchies adapt to, and even endorse, the systems of which they are a part. Her support included $38,707 from the National Institute of Mental Health.

    A Canadian who received her Ph.D. in psychology at McGill University in Montreal, Ruggiero spent 4 years at Harvard before accepting a tenure-track position at UT last year. After her departure, according to documents released by ORI under the Freedom of Information Act, Harvard received allegations that two of her published studies might contain fabricated data. According to an article in the Austin American-Statesman, a graduate student notified Harvard after Ruggiero balked at giving him her notes. She resigned from UT in June.

    In her 21 August letter, Ruggiero says that she “did not run” 600 subjects whose data were reported in two papers. A 1999 paper concluded that members of high-status groups are more likely than members of low-status groups to blame failures on discrimination. A 2000 article involved women students who took a test of “creativity” that was graded by male graduate students. Those given low grades were more likely to blame themselves rather than to suspect discrimination, unless they had learned that men had received high marks from the same graders. Ruggiero retracted both papers this summer and has agreed to retract two more. In her letters to the journals she takes full responsibility for her actions, and ORI director Chris Pascal says that neither Harvard nor ORI believes that any co-authors were involved in the misconduct.

    Harvard psychologist Herbert Kelman says Ruggiero's admission of misconduct “came as a complete shock.” He characterizes her as “very well organized, very hard-working. … In some respects quite a perfectionist.” Michael Domjan, chair of the UT psychology department, says there is no evidence that Ruggiero engaged in any improper conduct while at UT. The situation, he says, is “both tragic and unfortunate.” Pascal says that ORI has just begun a $1 million grants program to find out more about why some scientists cheat.

  16. ASTRONOMY

    Dark Dwarf Galaxy Gets Even Darker

    1. Robert Irion

    Nine galactic midgets roam our Milky Way's neighborhood, dim collections of stars called dwarf spheroidal galaxies. These objects are “the smallest and wimpiest of all galaxies,” says astronomer Raja Guhathakurta of the University of California, Santa Cruz. Yet the wimps pack a heavyweight punch: They appear to contain dark matter in quantities that far exceed the combined masses of their stars.

    Now, astronomers have their first compelling evidence that the dark matter around a dwarf galaxy sprawls deep into space, creating an extended halo similar to what is thought to shroud all bigger galaxies. Further studies of this cocoon, whose composition remains a mystery, promise to illuminate the early history of our own galaxy, which presumably acquired its heft by eating such dark-matter snacks.

    The new data come from the Draco dwarf spheroidal, a smudge of stars about 250,000 light-years away in a dragon-shaped constellation near the Big Dipper. A team led by astronomer Jan Kleyna of Cambridge University, United Kingdom, used the 4.2-meter William Herschel Telescope in the Canary Islands to study 159 giant stars in Draco, many of them in the galaxy's barely detectable outskirts. An instrument with more than 100 individually controlled optical fibers let the team collect light from scores of the faint stars for hours at a time. The instrument yielded spectra of the stars, which revealed their motions through space.
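
    Those motions are read off the Doppler shift: a star's spectral lines move in proportion to its velocity along the line of sight, v = c·Δλ/λ. As a rough illustration (with assumed numbers, not values from the study), a shift of 0.02 nanometer in a line at 600 nanometers corresponds to v ≈ (0.02/600) × 300,000 km/s ≈ 10 kilometers per second, about the size of the random stellar motions that betray a dwarf galaxy's hidden mass.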

    Where's Draco?

    Star motions point to a shroud of dark matter around this barely visible galaxy.

    CREDIT: DIGITIZED SKY SURVEY/SPACE TELESCOPE SCIENCE INSTITUTE

    “The achievement of this study is the richness of those observations and the number of [stellar motions measured] at great distances from the galaxy's center,” says astronomer James Binney of Oxford University, U.K. Those remote stars are the most critical for tracing the galaxy's mass, Binney notes, because the motion of each one depends on how much matter lies within the entire radius of its orbit.
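
    The underlying estimate is Newtonian: stars with typical random speed σ at radius r stay bound only if the mass enclosed within r is roughly M(r) ≈ σ²r/G. With illustrative round numbers (assumed here, not quoted from the paper) of σ ≈ 10 kilometers per second at r ≈ 500 parsecs, that works out to M ≈ 10⁷ solar masses, far more than Draco's sparse stars can supply and consistent with the heavy dark halo described below.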

    Stars at Draco's margins dart so quickly that the galaxy probably contains about 200 times more mass than its visible stars suggest, the team found. Moreover, models developed by Kleyna, Cambridge colleagues Mark Wilkinson and Gerard Gilmore, and Wyn Evans of Oxford point to a vast halo of dark matter surrounding Draco, not just a knot among the stars. “The dark matter is not just mixed in with the stars,” Kleyna says. “It extends much farther into space.” The team reported its results in the 20 December issue of Astrophysical Journal Letters.

    The dark matter's presence is not a surprise, says Guhathakurta. However, its distribution in a large halo makes Draco look like a tiny precursor to a mature galaxy, not just a puff of stars with some invisible weight sprinkled in. “It's something people believed, but mostly from extrapolation,” he says. The calculations fit with the “bottom-up” view of galaxy formation, in which the gravitational fields of big galaxies shred smaller ones and assimilate their stars, gas, and dark matter.

    Draco and its kin are the best labs for reconstructing how that process works, says astronomer Mario Mateo of the University of Michigan, Ann Arbor. “If galaxies really are built from smaller components, these dwarfs are the quanta,” he says. “They are the closest places to which we can point and say, ‘There is dark matter here.’” But Draco is still hoarding the secret of its dark matter's composition.

  17. MOLECULAR EVOLUTION

    Genome Duplications: The Stuff of Evolution?

    1. Elizabeth Pennisi

    The controversial—and formerly unprovable—proposition that evolution moves forward through duplication of entire genomes is getting support from current advances in molecular biology

    One of biology's greatest mysteries is how an organism as simple as a one-celled bacterium could give rise to something as complicated as a human. Thirty years ago, prominent geneticist Susumu Ohno of the City of Hope Hospital in Los Angeles put forth what many of his colleagues then considered an outrageous proposal: that great leaps in evolution—such as the transition from invertebrates to vertebrates—could occur only if whole genomes were duplicated.

    The resulting redundancy—each chromosome would have a twin—would enable the thousands of extra gene copies to evolve new functions, Ohno suggested. Thus the copies could become fodder for large innovations in body plans and other modifications, even as the functions of the original genes were maintained. At first, geneticists were either enthusiastic or appalled, but by the late 1980s, they had lost interest in the idea because they simply lacked enough data to support or refute Ohno's theory.

    Now, however, an explosion of DNA sequence information, as well as software for finding sequence similarities, has made it easier for researchers to detect signs of genome duplications. “Technological and theoretical advances, as well as computer power, are allowing us to empirically test [Ohno's] ideas,” says Axel Meyer, an evolutionary biologist at the University of Constance in Germany. He and increasing numbers of others are finding duplicated genes and chromosome segments that suggest Ohno might have been right after all.

    The emerging data have not persuaded all of the skeptics, however. They maintain that evolutionary change could have been fueled by duplication of individual genes or perhaps segments of chromosomes—without invoking anything as dramatic as genome duplications. “You can't prove that [genome duplication] didn't happen, but [if it did], it didn't have a major impact,” says Austin Hughes, a molecular evolutionist at the University of South Carolina, Columbia, who has done his own genome analyses. “For me, it's a dead issue.”

    Diversity explained.

    Genome doublings could have prompted the development of ever more complex organisms.

    CREDITS: (TOP TO BOTTOM) ADAPTED FROM PETER HOLLAND; M. D. STOKES/SCRIPPS INSTITUTION OF OCEANOGRAPHY; DAVID A. NORTHCOT/CORBIS; WAYNE BENNETT/CORBIS; JEFFREY L. ROTMAN/CORBIS

    For others, the issue is very much alive. Even biologists unfamiliar with Ohno's ideas are now combing genome data for new clues about how humans and other higher species came to be. “We want to understand where new genes came from,” explains Kenneth Wolfe, a molecular evolutionary biologist at the University of Dublin, Ireland. Understanding gene evolution could have practical benefits as well, he adds. More insights about redundant genes could help geneticists sort out gene functions and perhaps even pin down disease genes.

    Controversial proposal

    The idea that duplication of existing genes provided the raw material for evolution actually predates Ohno's proposal by nearly 35 years. In the 1930s geneticists realized that evolving organisms had to have a way of maintaining old functions even as they gained new ones. Gene duplication fit the bill: One copy could stay the same while the other took on new functions as base changes accumulated.

    But in Ohno's 1970 book Evolution by Gene Duplication, he went a big step further, proposing that not just genes but whole genomes duplicated. He thought that two rounds of duplication had occurred: one making possible the transition from invertebrates to vertebrates and the other leading to the diversity seen in vertebrates. Eventually this idea came to be known as the 2R hypothesis.

    The idea was not well received. “Found throughout the book are inconsistencies, misrepresentations and errors of fact,” the University of Chicago's Janice Spofford wrote in a review for Science in 1972 (Science, 11 February 1972, p. 617). She said, for instance, that Ohno misstated the amount of active DNA in bacteria and mice and failed to consider that the horseshoe crab, a primitive nonvertebrate, has far too much variation in protein makeup to fit his scenario.

    But Ohno insisted that organisms could handle extra genomes far more readily than they could extra copies of a particular gene. A second copy of a single gene would likely translate into production of too much of that gene's protein, which could be deleterious to the organism. In contrast, duplication of the entire genome would mean that production of all proteins would increase proportionally.

    2R revival

    Only with the rise of comparative genomics have researchers been able to address Ohno's hypothesis seriously. Some have matched genomes against themselves to find stretches of DNA that were once identical and therefore represent gene, chromosome, or whole-genome duplications. Others have sought out the same genes in many species, expecting that an increase in the number of any one gene across species might signal ancient genome duplications. We now have “completely different data” than were available in the 1970s, says Meyer.

    In 1993, for instance, Jürg Spring, a geneticist at the University of Basel in Switzerland, found that vertebrates have four copies of a gene for a cell surface protein called syndecan, whereas the fruit fly Drosophila melanogaster has but one. “I got stimulated to ask whether the fourfold duplication was general,” Spring recalls. He found more than 50 instances in which the fruit fly has a single copy of a gene, whereas the human has multiple copies.

    Researchers have also found signs of genome duplication at a key evolutionary crossroads: where vertebrates diverged from invertebrates. Some of this work comes from Peter Holland and Rebecca Furlong of the University of Reading, U.K., who are studying Amphioxus, the fishlike invertebrate thought to be most closely related to the vertebrates' ancestor. Recently, Holland and his colleagues compared some 84 gene families—groups of closely related genes—in Amphioxus to the equivalent families in other organisms.

    In many, they found that where Amphioxus has one copy, vertebrates have more. These cases include genes for enzymes and for proteins that control gene activity or are involved in cell signaling. Although some genes have just two or three copies instead of four, Holland thinks they each had four at one time. “There are lots and lots of duplications, and [that] is very consistent with two rounds of vertebrate [genome] duplication in quick succession,” he concludes.

    Longtime skeptic Wolfe also has new evidence that supports genome duplication in the prevertebrate lineage. Wolfe and his Dublin colleagues searched the draft sequence of the human genome for clusters of genes that exist on more than one chromosome. They located 80 pairs of clusters of varying sizes. That's “far more than you would expect by chance,” Wolfe says. What's more, the largest cluster pair, located on chromosomes 2 and 12, matched up across 34 million bases. Genome or chromosome—and not gene—duplications are the most likely explanation, he concludes.

    Next, Wolfe's team tried to estimate when these copies first appeared. These data showed that “there had been a burst of duplication events around 330 million to 500 million years ago,” he says. That's not exactly the time that Ohno predicted but is still in the ballpark, Wolfe adds.

    Ohno would be pleased, because “these data are consistent with the 2R hypothesis,” comments John Postlethwait, a genome researcher at the University of Oregon in Eugene. Indeed, says Wolfe, “I've moved from being skeptical to being more open-minded” about Ohno. Others, however, have yet to be convinced, although new evidence of yet another genome duplication, this time in the fish lineage, is helping Ohno's cause.

    Going another round

    Late in his career, Ohno added a third—and much later—round of genome duplication to his original two. He decided that there might also have been a genome duplication in fish, occurring just after the divergence of the lobed-fin fishes that led to land-based organisms. In 1998, Postlethwait's team found evidence for that third round of duplication (Science, 27 November 1998, p. 1711), and he and others postulated that these extra genes might have fueled the evolution of fish diversity.

    Postlethwait and his colleagues were looking at the HOX genes, which play a key role in regulating the development of higher organisms. They found that the zebrafish has seven HOX clusters, not the three or four known to exist, say, in mammals. More recently, other researchers discovered similar numbers of HOX gene sets in the puffer fish, in a Japanese species called medaka, and in cichlid fishes. Because these fish are distant piscine cousins of zebrafish, the findings imply that this duplication occurred in their common ancestor.

    Wild idea.

    Susumu Ohno (1928–2000) said genomes were duplicated in evolution.

    CREDIT: CITY OF HOPE HOSPITAL, LOS ANGELES

    Despite this accumulating evidence, Manfred Schartl, a geneticist at the University of Würzburg in Germany, and others who want to accept this scenario are still cautious. He points out, for example, that it would be very difficult for the first tetraploid fish—those with four rather than the usual two copies of each chromosome—to engage in sexual reproduction. He also says that other work, in which researchers have traced the histories of individual genes, points to duplication of single genes in fish, not the doubling of whole genomes. “The problem [with gene duplication] really is that you can find examples that are in agreement with one or the other hypothesis,” Schartl notes.

    Marc Robinson-Rechavi and his colleagues at the École Normale Supérieure de Lyon in France, for example, determined the lineages of related genes from three fish species to see whether the genes duplicated before or after the fish groups diversified. These duplications “did not occur all together at the origin of fishes, as would be expected if they are due to an ancestral genome duplication,” Robinson-Rechavi says. Although Meyer and his colleagues take issue with Robinson-Rechavi's analysis, others are not so sure how to interpret these findings.

    Duplication rebutted

    Indeed, the same problem plagues efforts to pin down those earlier genome duplications proposed by Ohno. In one study, South Carolina's Hughes pieced together the histories of 134 gene families in the Drosophila, Caenorhabditis elegans, and human genomes by looking at how the sequences of these related genes changed through time. If Ohno was right, Hughes says, the family history of any gene should show that an ancestral gene gave rise to two descendants simultaneously and that later in time, each of those two yielded two more—again at the same time. But almost three-quarters of the gene families he examined, including the HOX family, had different histories.

    Hughes also took a close look at the order of genes on supposedly duplicated chromosomes, an analysis that he says also failed to support Ohno's hypothesis. If a whole chromosome was copied, then most or all of the genes should be more or less in the same order in both. But often they are not. “Everything we've looked at [fails to] support the hypothesis,” Hughes concludes. He proposes instead that the genes occurring on multiple chromosomes moved to these different locations as a group and then stayed together because it was advantageous to the genome.

    But Holland isn't giving up that easily, and his scenario could be a way of reconciling Hughes's findings with Ohno's proposal about the two duplications early in vertebrate evolution. He thinks the inconsistencies highlighted by Hughes might be resolved by assuming that the time between the two rounds of duplication was much shorter than Ohno imagined.

    By Ohno's thinking, the first round produced two copies of each chromosome, or four total, because the chromosomes exist as pairs. At first those copies randomly paired off, but eventually they became different enough to have preferred partners, and each set of four became two sets of two, restoring diploidy, the typical chromosomal arrangement. Only after that had happened, which Ohno proposed would take many millions of years, would the second round of duplication have taken place.

    At a meeting* in April in Aussois, France, Holland suggested instead that the second duplication occurred before the four chromosomes produced by the first duplication diverged, thus producing eight roughly equivalent chromosomes. If that had been the case, then the recombination and switching of parts of chromosomes that typically takes place between chromosome pairs would have involved all eight, with different genes moving around at different times. Thus, gene order would vary from chromosome to chromosome, and neighboring genes could appear to have duplicated at different times instead of all at once. This scenario would confound analyses such as that done by Hughes.

    Other molecular events may change the genome in ways that obscure its true evolutionary history. Hiccups in DNA replication can spit out extra copies of genes or additional pieces of chromosomes. Mobile genetic elements can move genes and gene pieces around. And frequently, one copy of a gene loses its function and becomes unrecognizable as a gene. Sorting through all this to get a clear picture of how each organism's genome reached its present state will be hard, perhaps even impossible, says Meyer. Improvements in dating genes and identifying what instigates changes in a genome can help, however.

    But if the work resolves how the evolution of genomes prompts the evolution of new organisms, it will make possible a much better understanding of our own recently sequenced genome. If researchers can figure out the histories of families of genes, they will be in a much better position to sort out which genes are equivalent between, say, human and mouse or human and zebrafish. Knowing that will help tremendously as researchers try to pin down the functions of human genes in mice or other organisms that are more amenable to genetic manipulations than humans. No matter what, says Hughes, “we have to really understand how the genome is arranged.” And that is one thing that he and Ohno would agree on.

    • * The Jacques Monod Conference on Gene and Genome Duplications and the Evolution of Novel Gene Functions, Aussois, France, 26 to 30 April.

  18. PATENT INFRINGEMENT

    High Court Asked to Rule on What Makes an Idea New

    1. David Malakoff

    Ten years after a U.S. company sued a Japanese firm for patent infringement, the Supreme Court will hear “the biggest patent case in decades”

    When is imitation innovation—and when is it piracy? The U.S. Supreme Court will hear conflicting answers to those questions early next month in a patent infringement case that is being watched closely by academic and industrial groups.

    The case, referred to as Festo,* centers on a 150-year-old legal concept known as the “doctrine of equivalents.” The doctrine is designed to prevent businesses from making minor changes to a patented technology and then claiming it as a new invention. Companies that have patented proteins, for instance, have invoked the doctrine to prevent competitors from marketing molecules that have slightly different amino acid sequences but perform the same biological function.

    Last year, however, a federal appeals court stunned many experts by ruling that the doctrine doesn't apply to any patent claim that was narrowed during the review process before the patent was issued. Because that happens to most patent claims, the ruling could have a broad impact—especially, some experts claim, in the biotechnology industry. The business, biotechnology, and patent law communities have filed dozens of friend-of-the-court briefs since the high court agreed in June to hear the case; oral arguments are scheduled for 8 January.

    Supporters say that last year's ruling clarifies the law and should prevent nuisance lawsuits while it fosters better written patents and greater innovation. But many major research universities disagree, joining critics who predict that it will open the door to wholesale copying and undermine thousands of patents. With billions of dollars in licensing revenues potentially at stake, “this is the biggest patent case in decades,” says Susan Braden, an attorney at Baker & McKenzie in Washington, D.C., who is representing the universities.

    This isn't the first time that the Supreme Court has looked at Festo. The case entered the federal court system in 1992, when the Hauppauge, New York-based Festo Corp. charged that SMC Co. of Japan had infringed on its patent for a cylindrical part of a robotic arm by producing a similar mechanism. Festo won several early rounds, but in 1997 the Supreme Court asked the U.S. Court of Appeals for the Federal Circuit—a specialized court that handles patent cases—to review some tricky technical issues. In November 2000, the appellate court produced a 170-page decision that favored SMC, although seven of the 12 judges involved wrote separate opinions.

    The court concluded that Festo had altered its cylinder patent during review by the U.S. Patent and Trademark Office (PTO), triggering what lawyers call “prosecution history estoppel.” Estoppel is designed to prevent a patent applicant from narrowing a claim to satisfy PTO's requirement that inventions be original, then using the doctrine of equivalents to expand the reach of the patent and quash competitors. But the court declared “unworkable” current rules governing when estoppel could be claimed and ordered a new “complete bar” against invoking the doctrine to defend altered portions of patents. The court said it could still be used to defend unaltered claims.

    The flaw in that ruling, critics say, is that virtually all key patent claims undergo revision during the tug-of-war between applicant and examiner. Patent seekers typically word their applications as broadly as possible, whereas PTO's examiners routinely narrow the language to leave more room for innovation.

    Some biotech companies complain that Festo is especially inappropriate for DNA-based patents, because the engineered proteins that may become blockbuster drugs can be created by thousands—if not millions—of slightly different but related DNA sequences. “Festo provides a road map for a would-be copyist to avoid [literal] infringement,” argues a brief filed by Chiron, an Emeryville, California-based biotech company that wants the court to overturn the ruling. To legally copy a patented protein, it says, a clever forger could simply review a patent's public case history and then create a twin with functionally equivalent substitutes.

    That's already happened, claims one British biotech company. In its brief urging Festo's reversal, Celltech describes its suit against MedImmune, a U.S. company. MedImmune makes Synagis, a hot-selling drug that protects premature babies from lung infections. But Celltech says Synagis's hybrid mouse-human antibody differs from one of its creations by just a single amino acid out of more than 1300—a big enough difference to avoid literal infringement but not enough to alter the molecule's function.

    Celltech has asked a U.K. court to find MedImmune guilty of infringing on its U.S. patent under the doctrine of equivalents, but the Festo decision has stalled the case. “Celltech's situation will be a common occurrence if [the Festo decision] stands as written,” warn Celltech's attorneys. The case will likely be thrown out if the Supreme Court upholds the earlier ruling.

    In its own brief supporting the lower court decision, MedImmune says Celltech's lawsuit is just the kind of “wasteful litigation” that the Festo ruling will help prevent. Celltech is using “creative [legal] arguments … in order to reclaim what [it previously] surrendered to the Patent Office” and win hefty licensing fees on Synagis, the company claims. Before the Festo case, it adds, uncertainty about how judges would apply the doctrine of equivalents often prompted companies “to take licenses and pay royalties as ransom, to avoid litigation.”

    Industry pioneer Genentech of South San Francisco, California, agrees that Festo will reduce that uncertainty. “Biotechnology will not only survive Festo, but will thrive under it,” its brief predicts. One benefit, the company says, will be that firms will fine-tune their applications before filing them.

    Chiron, however, sees a darker future. It warns that Festo will cause “an explosion in the verbiage a patent contains” as applicants try to cover every foreseeable minor variation on an invention. For protein patents, “mathematics alone make describing every possible variant impossible,” it says, noting that 10³⁶ nucleotide sequences could code for one type of protein. “The resulting patent claims would be a stack of paper miles high,” the company warns. Genentech dismisses that argument, saying that such fears “derive from a naïve misapprehension” of current patent law and PTO policies.
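
    Chiron's 10³⁶ figure is easy to reproduce on the back of an envelope. The genetic code is redundant: 61 sense codons specify just 20 amino acids, roughly three codons per amino acid on average, so the number of DNA sequences encoding a given protein grows exponentially with its length. For a modest protein of 75 residues (an illustrative size, not one cited in the brief), that comes to about 3⁷⁵ ≈ 6 × 10³⁵, or roughly 10³⁶ distinct sequences for a single, identical protein.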

    Research universities have their own worries about the Festo ruling. In urging the court to overrule the decision, 20 schools and university groups—including Stanford, the Massachusetts Institute of Technology, the University of Wisconsin, and the 63-member Association of American Universities—note that over the last decade academic institutions have won more than 16,000 patents that have generated more than $4 billion in licensing fees. Among the patents are key discoveries that helped launch the biotechnology industry, create important cancer drugs, and produce popular products such as the Gatorade sports drink. If the doctrine of equivalents is weakened, they warn, academia will have less incentive to innovate—and the public will suffer.

    Other players—from the Solicitor General of the United States (the government's lawyer) to the Institute of Electrical and Electronics Engineers—have their own prescriptions for curing Festo headaches. There is also the issue of whether the decision, if upheld, should apply retroactively to 1.2 million existing U.S. patents.

    Next month's oral arguments are expected to draw a capacity crowd—a rarity for a patent case—with a star-studded legal cast that includes former Supreme Court nominee and Festo lawyer Robert Bork as well as SMC advocate Arthur Neustadt, a top patent attorney. But the curtain probably won't drop on this legal drama until late spring, when the court is expected to release its ruling.

    • * Festo Corporation v. Shoketsu Kinzoku Kogyo Kabushiki Co. Ltd. (a.k.a. SMC Co.). U.S. Supreme Court Docket 00-1543.

  19. MATERIALS RESEARCH SOCIETY MEETING

    Assembling the Supersmall and Ultrasensitive

    1. Robert F. Service

    BOSTON, MASSACHUSETTS—Graced by unseasonably mild temperatures, about 3500 materials scientists, chemists, and physicists gathered here for the semiannual Materials Research Society (MRS) meeting.* Hot topics included reports of using viruses to assemble nanoparticles and of novel composites that pave the way for ultrasensitive sensors.

    Liquid Crystals Via Virus

    Angela Belcher doesn't just borrow ideas from biology in making new materials—she borrows whole organisms. At the MRS meeting, Belcher, a chemist at the University of Texas, Austin, described making liquid crystalline films out of viruses engineered to tote tiny inorganic nanoparticles. Polymer-based liquid crystalline films already form the heart of flat-screen computer displays. And although the viral liquid crystals aren't likely to displace their polymer counterparts anytime soon, they point the way forward in one of the hottest areas of materials science, merging the organizational prowess of biology with the electronic, magnetic, and structural properties of inorganic compounds.

    Liquid crystals, first discovered in 1888, have molecular structures that are poised between the rigid order of solid crystals and the bumper-car chaos of liquids. Displaymakers use this malleability to their advantage: By applying an electric field, they reorient the liquid crystal, a change that switches the material from transparent to opaque. In Belcher's experiments, the viruses take on the ordering duties themselves. “The virus is making the liquid crystal and orienting it,” says materials scientist Samuel Stupp of Northwestern University in Evanston, Illinois.

    Self-assembly.

    An atomic-force microscope image (right) shows that pencil-shaped viruses toting nanoparticles have stacked themselves like cordwood.

    CREDIT: ANGELA BELCHER/UNIVERSITY OF TEXAS

    Viral films aren't Belcher's first foray into using living organisms to make materials. Two years ago, Belcher and her Texas colleagues reported that viruses could be engineered to express, for example, proteins that bind semiconductor nanoparticles made from gallium arsenide but not silicon or cadmium selenide (Science, 24 December 1999, p. 2442). That work held out hope that microbes could eventually help researchers shepherd nanoparticles into complex arrangements that are possibly useful for storing electronic data or wiring up molecule-sized electronic devices. The new work brings such potential applications much closer, as it shows for the first time that biological systems can organize nanoparticles into macroscopic structures.

    To make the films, Belcher and her graduate student Seung-Wuk Lee started by inserting random mutations into the viral DNA for a protein that sits on one outer tip of long, pencil-shaped viruses. When the viruses infected their bacterial hosts, they reproduced and expressed the engineered protein on their coat. The researchers then turned the viruses loose in a solution containing nanoparticles made of zinc selenide. The best nanoparticle-binding virus was then allowed to reproduce, making billions of copies. Finally, by simply changing the concentration of the viruses in solution, Belcher and Lee found that they could coax the viruses to pack themselves and their nanoparticle cargoes together in different liquid crystalline arrangements. The viruses' ability to form films “was a surprise to us,” says Belcher.

    In one such film, for example, the viruses lined up in successive rows, like rows of pencils all with zinc selenide erasers at one end (see figure above). Other films have viruses lining up in either a zigzag pattern or all facing the same direction but not organized into orderly rows. And each arrangement, Belcher's team found, produced unique iridescent patterns when viewed with a microscope under polarized light, a common characteristic of liquid crystalline films. Unlike polymer liquid crystals, however, the viral films cannot yet be easily switched between different configurations.

    Although the new films' optical properties make them visually striking, their ability to align nanoparticles over relatively large distances could be even more important in the long run. Stupp points out that nanoscience researchers have developed numerous techniques to pattern nanoparticles and other nanosized objects at small length scales. Extending such patterns to the macroscale has proven difficult. But because the Texas team's viruses can assemble themselves into films that can be picked up with a pair of tweezers, they may offer researchers new ways to stack metallic nanoparticles into nanowires or magnetic nanoparticles for data storage.

    Swell New Sensor

    Whether we notice them or not, sensors that detect small changes in temperature, pressure, and even the presence of chemicals are all around us. Many of them spot changes in their environment by tracking the effect on a material's ability to conduct electricity. Increase the temperature or pressure of the material, for example, and the conductivity can drop by about 50% to 90%. At the meeting, Jim Martin, a physical chemist at Sandia National Laboratories in Albuquerque, New Mexico, unveiled a plastic material embedded with magnetic particles that does quite a bit better, changing its conductivity by as much as 100-billion-fold. And it's versatile, responding not only to temperature and pressure changes but to the presence of some chemicals as well.

    That sensitivity is “off the charts,” says Todd Christenson, an electrical engineer and sensors expert at Sandia who is not affiliated with Martin's group. “It's incredible.” That could ultimately make the new materials useful in a wide range of ultrasensitive detectors.

    Power lines.

    Conducting particles aligned by a magnetic field offer sharp warnings of changes that break their connections.

    CREDITS: J. MARTIN/SANDIA NATIONAL LABORATORIES

    Martin got interested in the sensing ability of plastic composites after hearing a talk about the way some commercial sensors work. Those sensors harbor a random mixture of tiny conductive carbon particles in an insulating polymer matrix sandwiched between two electrodes. Flip the switch, and current hops between connecting carbon particles. But jack up levels of a property such as temperature and the polymer swells, breaking the contact between some of the carbon particles and causing the conductivity to drop. Such devices can vary widely and unpredictably in performance, however, because each winds up with a different ordering of carbon particles, Martin says.

    Martin suspected that researchers could make the sensors more sensitive and reliable by stringing the conducting particles between the electrodes like tiny wires. In that arrangement even slight environmental changes would change the polymer enough to knock the particle chains out of alignment. To get that alignment, Martin's team decided to use a magnetic field, which causes magnetic particles to line up in the direction of the field lines (see figure above). Because carbon particles aren't magnetic, the Sandia researchers used nickel particles, each about 2 millionths of a meter across. They electroplated each particle with a thin layer of gold to prevent the nickel from oxidizing when exposed to air, a change that would destroy the particle's conductivity. They then embedded their gold-coated nickel particles in an uncured polymer and exposed the combo to an external magnetic field. Once the particles were aligned in chains, the researchers cured the polymer and began their tests.
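
    A toy calculation suggests why chained particles should be so much twitchier than a random mixture. If current must pass through an unbroken string of particle contacts, and swelling lowers the probability p that any one contact survives, then the fraction of strings still conducting falls as p raised to the chain length, so a percent-level change in p swings the conductance by many orders of magnitude. The short Python sketch below illustrates that scaling; the chain length and probabilities are hypothetical, not measurements from Martin's devices.

        # Toy model of a chained-particle sensor: a string of N particle
        # contacts conducts only if every contact survives. Polymer swelling
        # lowers each contact's survival probability p, so the fraction of
        # still-conducting strings collapses as p**N.
        N = 1000  # hypothetical number of contacts between the electrodes

        for p in (1.0, 0.999, 0.99, 0.95):
            intact = p ** N  # fraction of chains left unbroken
            print(f"contact survival p = {p:.3f} -> conducting fraction ~ {intact:.2e}")

    Even with these made-up numbers, dropping p from 1.0 to 0.95 cuts the conducting fraction by more than 20 orders of magnitude, the same flavor of exponential sensitivity that Martin's team reports.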

    The results were dramatic. Bumping up the temperature just 5 degrees Celsius slashed the electrical resistance by five orders of magnitude. An 80-degree change made it plummet by 10 orders of magnitude. Modest pressure changes produced an effect of 11 orders of magnitude. And even exposing the polymer to a chemical vapor of toluene caused the polymer to expand and altered the resistance by nine orders of magnitude. “We were surprised by how well it worked,” Martin says.

    Martin says he has no immediate plans to commercialize the materials: “We're still at the level of science.” Some bugs still remain to be ironed out as well. For example, many of the particles oxidize over time when exposed to air, probably a result of incomplete electroplating. But once the composites are fine-tuned, they are likely to be a hot commercial property, Christenson predicts.

    • * 26–30 November.

  20. PLANT BIOTECHNOLOGY

    For Plants, Reproduction Without Sex May Be Better

    1. Anne Simon Moffat*
    1. Anne Simon Moffat is a freelance writer in Chicago.

    A better understanding of how plants reproduce asexually by apomixis may boost efforts to develop improved crops

    Mention embryo formation, and thoughts usually turn to sex: the union of sperm and egg to produce a new individual. But several common plants, including dandelion, citrus, mango, and certain forage grasses, can do without sex, producing their embryos from unfertilized immature sex cells or even from ordinary somatic cells. Although plant biologists have known about this bizarre form of reproduction, called apomixis, for about a century, only recently have they begun to get a glimmer of insight into how plants achieve the feat.

    Within the past few years, researchers have identified a variety of genes that may determine whether plants reproduce sexually or asexually by apomixis. Eventually, they hope to use these genes to develop new strains of apomictic crop plants. The payoff could be enormous. Because the progeny of apomixis are identical to the parents, traits transferred into an apomictic plant, whether by classical breeding or genetic engineering, wouldn't be lost in the genetic shuffling that occurs during sexual reproduction.

    Apomixis also offers a possible way to avoid the degeneration of breeding stocks of some vegetatively propagated plants, such as potato and cassava, that accumulate pathogens through repeated use. “Apomixis could play a major role in feeding the growing population of our planet,” says plant developmental biologist Ueli Grossniklaus of the University of Zürich, Switzerland. “It could promise social and economic benefits that would challenge those of the Green Revolution.” As a result, “apomixis is a very, very hot area of research,” says Robert Goldberg, a plant biologist at the University of California, Los Angeles.

Funding agencies are beginning to pay attention. In 2001, for example, the European Union funded the $2 million ApoTool program, coordinated by Sacco de Vries of the University of Wageningen in the Netherlands, to further the identification of genes that might go into a custom-designed apomict. In addition, the International Maize and Wheat Improvement Center in Mexico and various funding agencies in the United States—including the Department of Agriculture (USDA), the National Science Foundation, the Department of Energy, and the Rockefeller Foundation—are providing modest, but increasing, support for the research. And several biotech companies—such as Ceres of Malibu, California, which was co-founded by Goldberg, and Apomyx of North Logan, Utah—have research efforts devoted to apomixis, although they are keeping quiet about the number of dollars invested.

    A measure of this increased worldwide interest was evident at last April's XVIth International Congress on Sexual Plant Reproduction in Banff, Canada, where 170 people signed up for the apomixis section alone. At the previous meeting, 2 years earlier, just 140 scientists had attended the entire meeting.

One development that helped spark this interest came in 1998 when two teams of classical plant breeders, working independently, demonstrated the feasibility of introducing the apomixis trait into crop plants that normally reproduce sexually. Bryan Kindiger, Phillip Sims, and Chet Dewald of the USDA's Agricultural Research Service (ARS) Southern Plains Range Research Station in Woodward, Oklahoma, developed apomictic corn by crossing the grain with an apomictic relative, eastern gamagrass (Tripsacum dactyloides). Their success didn't come easily.

    A long slog.

This corn-gamagrass hybrid was one of many produced on the way to an apomictic corn strain.

    CREDIT: KEN HAMMOND/USDA

    The team had to perform at least 5000 backcrosses to get a fertile grass-corn hybrid and thousands more crosses to get apomictic corn. “It was a long haul, a difficult and elusive challenge,” says Dewald, whose team worked on the project for nearly 20 years. After a similarly long slog, Wayne Hanna and his colleagues at the ARS Coastal Plains Research Station in Tifton, Georgia, transferred apomictic genes from the forage grass Pennisetum squamulatum to its cultivated relative, pearl millet, a popular birdseed.

    So far, the new corn and millet apomicts aren't fertile enough for commercial application, but fertility is improving. “We've gone from having 4% or 5% fertility in corn to about 10%, but we're still a long way from a commercial crop,” says Dewald, who adds that 80% fertility would be a desirable first goal for a commercial cultivar.

    Despite these successes, conventional breeding of new apomicts faces a big obstacle: Few valuable food plants have close apomictic relatives that can be used for crossbreeding. To get around this, plant researchers are hoping to engineer an apomictic plant. They are trying to identify mutations of genes coding for key steps in sexual development that could produce a form of apomixis when transferred into other species. “I think it can be done with the newer, molecular tools,” says Dewald, who began breeding natural forage apomicts some years before molecular tools became available.

    As in much of plant genetics these days, the search is focusing on the model plant Arabidopsis thaliana. In recent years, researchers have identified hundreds of mutations in Arabidopsis and a few other plants that affect embryo development. And they have found that the products of some of these genes can elicit embryo formation from the vegetative tissue of Arabidopsis. In 1998, for example, Goldberg—with John Harada of the University of California, Davis; Bob Fischer of UC Berkeley; and their colleagues—cloned a gene called LEAFY COTYLEDON 1 (LEC1). The researchers showed that the gene, which was already known to be involved in embryo formation, can trigger the formation of embryolike structures when it is expressed in leaves.

    Last August, the same team described a second gene, LEAFY COTYLEDON 2 (LEC2), with similar activity. And in unpublished work, Kim Boutilier of Plant Research International, a nonprofit research institute in Wageningen, cloned the BABY BOOM gene from canola (Brassica napus) and showed that when that gene is expressed in Arabidopsis it has an even more dramatic effect than LEC1, leading to the production of hundreds of embryos on seedlings. The group has applied for a patent on the gene.

The proteins produced by the LEC and BABY BOOM genes are thought to regulate gene activity directly, but genes that make other types of proteins can also enhance the production of embryos from somatic cells. One example, found this year by de Vries, is a gene called SERK (for somatic embryogenesis receptor kinase) that increases sensitivity to growth-inducing compounds, such as plant hormones.

    Getting embryos to form is only half the battle in producing viable seed, however. Conventional seed formation requires two fertilizations, one of the egg, which yields the embryo, and another of the so-called central cell that goes on to make the endosperm that nourishes the embryo. As a result, apomixis researchers are also focusing on isolating the genes involved in endosperm development.

In the late 1990s, three groups—one including Goldberg, Harada, and Fischer; another led by Zürich's Grossniklaus; and the third led by Abdul Chaudhury of the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Canberra, Australia—described mutations in three genes that lead to formation of endosperm and partial seeds in the absence of fertilization. All three genes, called FERTILIZATION-INDEPENDENT ENDOSPERM (FIE), FERTILIZATION-INDEPENDENT SEED (FIS2), and MEDEA, encode polycomb proteins, which resemble polypeptides found in the fruit fly, mouse, and other species and are associated with condensed chromatin and repression of gene expression.

    About a year ago, Rod Scott of the University of Bath, U.K., Hugh Dickinson of the University of Oxford, U.K., Fischer, and their colleagues extended this work. They showed that reducing DNA methylation, which can be achieved by means of mutations in any of several genes, improves endosperm development in FIE mutants.

    A typical apomict.

    The dandelions that proliferate in your lawn can do so without the aid of sexual reproduction.

    CREDIT: SACCO DE VRIES

    Meanwhile, a few researchers are taking a very different tack to identify genes that might be used to engineer apomicts: Instead of looking for genes that influence seed development in Arabidopsis, they are probing select apomicts to see if they can find the genes involved in this asexual reproduction. For example, Peggy Ozias-Akins of the University of Georgia, Tifton, is studying Pennisetum, grasses related to millet; Ross Bicknell of Crop and Food Research Ltd., a government-sponsored company in Christchurch, New Zealand, and Anna Koltunow of CSIRO in Adelaide, Australia, are studying the dandelion relative Hieracium aurantiacum. Bicknell notes that this work is still in the very early stages and accounts for only 10% to 20% of all funds devoted to apomixis research. Indeed, he says, “we exist as the lunatic fringe of apomixis research.”

    Still, Bicknell cites some progress. A few years ago, his team identified a mutant form of Hieracium that had lost the ability to reproduce by apomixis and used ordinary sexual reproduction instead. Since then, he and his colleagues have discovered more than 80 such mutants. By comparing their gene expression profiles with those of plants that are naturally apomictic or sexual, the researchers hope to identify regulatory pathways that are critical for apomixis. Bicknell says they have already picked up several genes that seem to be important, although he declines to offer details, because the work was done under contract to Ceres and the genes may be patented.

    Not everyone thinks that tinkering with a few or even generous clusters of genes will be enough to make apomicts, however. These researchers note that classical breeding programs had to add at least one complete alien chromosome, and often more, to transfer the desired trait into new species. John Carman, a plant evolutionary biologist at Utah State University in Logan, goes so far as to claim that “apomixis genes [per se] don't exist.” Instead, he suggests, the process likely involves many genes located at different places in the genome that influence traits such as the timing of flower and embryo development. Even apomixis researcher Goldberg describes the study of apomixis as “far more daunting than the study of parthenogenesis in animals.”

    Nevertheless, what researchers have learned so far has raised their hopes. “The versatility of reproductive systems and the variability in known apomicts suggest that apomixis can be engineered,” predicts Grossniklaus—even though the plant engineers' approach may not be the same as Mother Nature's.

  21. UNDERGRADUATE EDUCATION

    Can Universities Be Bribed to Train More Scientists?

    1. Jeffrey Mervis

    Economist Paul Romer has persuaded Congress to test his theory of why too few U.S. students major in science and engineering. But is money the real roadblock?

    Stanford University economist Paul Romer readily accepts the conventional wisdom that the United States isn't producing enough scientists and engineers to ensure a healthy economy. But his explanation of who's to blame, and how to fix the problem, is anything but conventional.

    Romer argues that U.S. universities deliberately underproduce science and engineering graduates because they are so expensive to train. The traditional weeding-out process is simply a smokescreen for holding down costs, he says. His solution: Pay the universities to turn out more scientists and engineers. “Most schools will do the right thing if you make it worth their while,” asserts Romer, who has spent 15 years analyzing the factors behind long-term economic growth.

    His fresh insights into what has traditionally been seen as an intractable problem have made Romer the darling of politicians and business leaders who believe that the federal government should be playing a bigger role in training the next generation of scientific talent. His ideas have formed the basis for new legislation, the Technology Talent Bill (S. 1549 and H.R. 3130), that would create a competitive grants program at the National Science Foundation (NSF) for universities that promise to boost the number of undergraduates majoring in science, mathematics, and engineering. The concept is so appealing politically that last month Congress gave NSF $5 million to start a pilot project to test Romer's thesis even before it took up the authorizing legislation (Science, 16 November, p. 1430).

Most educators agree that the country needs more scientists—and are delighted that Congress is willing to tackle the problem. But the vast majority take strong exception to Romer's analysis. They say it ignores a vast body of literature on why students avoid or drop out of the sciences, which points to factors, from the field's unappealing image to high family expectations, that have nothing to do with an institution's unwillingness to pay the bill. Romer's explanation also fails to account for the steady growth in the life sciences, they note, or for the realities of higher education, where departments compete for students and universities flaunt their scientific prowess.

    Filling the pipeline.

    Paul Romer says degrees will follow the dollars.

    “That's nuts,” says William Saam, chair of the physics department at Ohio State University, Columbus, when asked about Romer's argument. “Of course we could do better with more resources. But we are working hard to increase the number of majors, and so is every other physics department in the country.”

    The power of an idea

    A self-proclaimed naïf in the corridors of political power, the 46-year-old Romer has a pedigree that opens doors. He's the son of Roy Romer, a former governor of Colorado and head of the Democratic National Committee who's now superintendent of Los Angeles schools. He's also been touted in the media as a potential Nobel Prize-winner for his pioneering work on endogenous growth theory: the idea that economic growth is driven by new technology.

    Technology, in turn, depends on a steady flow of scientific talent. But Romer argues that U.S. colleges and universities winnow out a large percentage of students who express interest in science and engineering through tough grading and a survivalist mentality in introductory courses. Faculty and administrators then cite the need for high academic standards as a rationale for their behavior, which Romer says is deeply rooted in the culture of science. The emphasis on research over teaching at many top schools reinforces such behavior, he argues.

    But money can change such attitudes, he says, provided departments are rewarded for churning out more science graduates. As proof, he cites a 3-year-old Canadian program that pays Ontario universities up to $5000 for each new computer science and engineering student. So far, undergraduate enrollment at the 17 eligible universities has jumped by 145% in engineering and by 180% in computer science.

One of Romer's earliest and most important political converts was Senator Joe Lieberman (D-CT), the driving force behind the Technology Talent Bill and a staunch advocate of doubling the NSF budget and increasing federal support for training the next generation of scientists. “We've been wrestling with this issue for a long time,” says one Lieberman staffer. “So the idea of rewarding the gatekeepers if they can produce more majors was very appealing.” Romer's message also warms the hearts of high-tech business leaders, who complain that they must import tens of thousands of foreign-born workers because the U.S. talent pool is too shallow. As a result, Romer has broken bread with groups ranging from the New Democrat Network to the Washington, D.C.-based Council on Competitiveness, a coalition of CEOs that lobbies for increased government spending on research and training.

    Romer's initial proposal for priming the technology pump, outlined in a June 2000 working paper (www.nber.org/papers/w7723), would have offered training grants to undergraduate science departments and portable fellowships for graduate students as well as tax breaks for industry. However, its multibillion-dollar-a-year price tag scared off more than a few supporters. As a result, the Technology Talent Bill calls for a $25 million a year pilot program for undergraduates, and it gives NSF plenty of leeway to set the rules of the competition. Staffers say Lieberman and others envision it growing to $200 million annually if it proves successful.

    Theory vs. reality

    Romer's argument, and the legislation that is based on it, rests on two key assumptions: There is a large reservoir of qualified students interested in majoring in the natural sciences and engineering, and U.S. universities have excess capacity to handle such an influx. But many educators question whether either premise is true.

    Romer and his supporters cite the traditionally high attrition rate in the sciences as proof that many students are being pushed out of fields they want to pursue. A 1992 study by Alexander and Helen Astin of the Higher Education Research Institute (HERI) at the University of California, Los Angeles, for example, found persistence rates of only 40% to 50% for first-year students declaring an interest in the natural sciences and engineering. But such shifts in interest are not unusual for first- and second-year students, say educators, and the percentages have held steady over the years, according to data from HERI, which has surveyed incoming freshmen for 35 years.

    Many educators join with Romer in decrying what Michael Teitelbaum of the Alfred P. Sloan Foundation in New York City calls the “boot camp” mentality in many top-tier science departments. But Romer stands alone in attributing it to economic causes. “That's a weird one,” says Roman Czujko of the American Institute of Physics in College Park, Maryland, which closely tracks science enrollment and graduation trends. “Departments have to justify their size to the dean,” he says, “and the best way to do that is with more majors. Besides, university presidents recognize that having a vibrant science program is essential for attracting top students in all fields.”

    Numbers game.

    The last decade has seen big shifts in enrollment by discipline.

    SOURCES: (LEFT TO RIGHT) AIP STATISTICAL RESEARCH CENTER; CPST; U.S. DEPARTMENT OF EDUCATION, NCES

    Retention and attrition in higher education are affected by many factors, say other scholars. Elaine Seymour, a sociologist at the University of Colorado, Boulder, and co-author of a 1997 book on why undergraduates drop out of science, cites “appalling teaching” as a much bigger reason for driving students away. The solution, she says, requires systemic reform of the science curriculum at all levels and improved teacher training.

    Romer's assumption that there's unused capacity for educating more scientists and engineers gets equally short shrift. “I can't turn out any more B.S. students in engineering without a major investment in space and faculty,” says Janie Fouke, dean of engineering at Michigan State University in East Lansing. “And my colleagues are in the same boat. We all want to produce more graduates, in particular women and people of color. But the scale of an award that would accomplish that, in terms of size and duration, is probably much bigger than anything that is likely to be offered.”

Canadian educators say it's not clear whether the Ontario program is relevant to the U.S. context. The additional money, by itself, wouldn't have been enough to justify expansion, says Sujeet Chaudhuri, dean of engineering at the University of Waterloo, the province's top technology institution, which declined to participate in the program's first year because the government's doubling target was too high. Instead, it was the new authority to charge higher tuition—10% to 15% more a year for the next 3 years—that made it economically feasible for the department to expand. Demand has held steady, says Chaudhuri, but that might not be the case in a more competitive U.S. market. In addition, U.S. administrators say that political realities would make it impossible for their universities to levy tuition hikes of that magnitude.

    Romer says his economic solution allows plenty of room for fine-tuning. “I say that people will respond to incentives, but I can't tell you what the exact number should be,” he says. “And I'm not trying to sell this as a solution to the problem of underrepresentation in science, although I think that whatever you do to eliminate weed-out mode will disproportionately benefit those groups.”

    For most academics, however, what he is selling is too facile. “It's typical of an economist to pick out one thing,” scoffs Colorado's Seymour. “It won't hurt, but it won't fix the problem.”

  22. UNDERGRADUATE EDUCATION

    Undergraduate Data Show a Shift, Not a Decline, in Interest

    1. Jeffrey Mervis

    Amid all the hand-wringing about the declining interest in science among U.S. students, one significant fact has been largely ignored: Undergraduates today are just as likely to earn bachelor's degrees in the sciences as they were when Jimmy Carter was president.

To be sure, there have been shifts within the natural sciences and engineering, as interest in particular fields has waxed and waned. Policy-makers have been quick to cite such worrisome numbers as a 37% drop in the number of computer science degrees awarded since 1984 and a 25% drop in physics majors since 1988. But declining interest in those relatively small fields has been more than offset by a spectacular rise in the life sciences, up 83% in the past decade after a dip during the 1980s. At the same time, computer engineering has been hot throughout the decade, topped by a 35% jump last year that brings the 2000 enrollment to triple the 1990 level. And women have a greater presence in almost every scientific discipline. Overall, data from the National Science Foundation (NSF) show that the fraction of U.S. undergraduates choosing to major in science and engineering has stayed remarkably constant—roughly one in three—for more than a generation.

    Staying power.

    Students don't seem to have lost interest in science.

    SOURCE: NSF/SRS

    The numbers suggest that any program that relies on financial rewards to pump out more science majors may be ignoring the natural ebb and flow of students from one technical field to another based on perceived opportunities. At the same time, the overall stability of the scientific talent pool is no cause for complacency. “For 35 years the number of [science, mathematics, engineering, and technology] graduates has oscillated around one-third of the total B.A. pool,” agrees Norman Fortenberry, head of NSF's undergraduate programs. “But today's world requires a greater level of technological sophistication. So it's more important than ever that we find ways to reach that other two-thirds.”
