News this Week

Science  12 Mar 1999:
Vol. 283, Issue 5408, p. 433

    NIH Weighs Bold Plan for Online Preprint Publishing

    1. Eliot Marshall

    The National Institutes of Health (NIH) is considering throwing its weight—and money—behind an ambitious Web-based publishing venture that could radically change the way biology papers are disseminated. Harold Varmus, director of the NIH, revealed at a congressional hearing last week that he likes the idea of building a one-stop, public source for biomedical research papers, and he's been exploring ways of doing it. NIH leaders are not ready to discuss specifics because their proposal is still in its infancy. They want to avoid false steps, knowing that almost anything they propose will be seen as a threat to traditional journals. But they are drawing up plans this spring.

    The news that Varmus openly supports this idea—at least in principle—has cheered a small group of researchers and database experts who have been trying to foment a revolution in scientific publishing. Led by genetics researcher Patrick Brown of Stanford University and David Lipman, director of NIH's National Center for Biotechnology Information (NCBI), they've drawn up several proposals for an electronic preprint or “e-print” repository for biology papers, modeled loosely on the e-print archive at Los Alamos National Laboratory in New Mexico, which has become a major forum for results in many areas of physics and astronomy. The proposed venture would include methods for conducting a “streamlined” version of traditional peer review of submitted papers. The details are in flux, but they're likely to be defined and released “in a relatively short time,” according to Brown. Lipman discussed their plans with the Howard Hughes Medical Institute (HHMI), and now he and his colleagues are also turning to the NIH. With NIH's deep pockets behind it, the dream could quickly become a reality.

    Varmus met twice in February with proponents and has already discussed the project with, among others, Richard Klausner, director of the National Cancer Institute, and Steven Hyman, director of the National Institute of Mental Health. Both, he said, seemed favorable. Ari Patrinos, director of the Department of Energy research program that funds genome research, is supportive as well. Lipman says: “I've been impressed by the amount of discussion and careful thought given to the idea by [NIH] directors” and other leaders. “It's very exciting.”

    But the concept is likely to run into some heavy fire. Varmus acknowledges that some scientific societies depend entirely on their journals for income, and that a public electronic publishing center might destroy their budgets by siphoning off subscribers. These societies might object to a government agency taking over the service they now provide. Journal editors, who are likely to see such a venture as a threat, are also expected to raise substantive objections. And postdocs might be concerned that an e-print publication would lack the prestige of a paper in a traditional journal, making it less valuable on a curriculum vitae.

    Brown and Lipman had been discussing the idea of building a Web preprint site for all of biology for more than a year. But the idea gained new impetus in December at the Cold Spring Harbor Laboratory on Long Island, when biologists and database specialists got together for discussions that originally had a narrower focus. Brown had organized a session to discuss a new repository for gene expression data. He and Richard Young of the Whitehead Institute for Biomedical Research at the Massachusetts Institute of Technology told how their research has already led them to do their own electronic publishing of sorts. In what is known as functional genomics, they use microarrays of DNA sensors to monitor the simultaneous functioning of thousands of genes, producing huge sets of data. The only way to make sense of the information, Young says, is to visualize it in a dynamic computer display. For that reason, he and Brown began publishing results on their Web pages and sharing new data over the Internet, and they were ready to spread the gospel. “We realized that what we need is a centralized mechanism,” Young says, to serve the functional genomics community.

    Members of the European Molecular Biology Laboratory (EMBL) also have been talking about building such a facility, Young says. And Alvis Brazma of the European Bioinformatics Institute at the Wellcome Trust's campus at Hinxton near Cambridge, U.K., is attempting to put together a group to support a new EMBL functional genomics data center in Europe. He's organizing a planning session in Europe later this year.

    But at the Cold Spring Harbor meeting, Paul Ginsparg, the physicist who started the Los Alamos e-print server, described his project and urged biologists to broaden their efforts. According to attendees, Brown's idea for a broad-spectrum biology site has won support from other scientists, including Gerald Rubin of the University of California, Berkeley, Leland Hartwell of the Fred Hutchinson Cancer Research Center in Seattle, and David Botstein of Stanford. By January, Brown and Lipman had put together a rough description of the e-print server they wanted to build. It was ambitious.

    The core of Lipman and Brown's scheme would be a Web-based arrangement like the one started by Ginsparg. The Los Alamos server accepts papers from all sources, stores them in categories, and makes them available freely over the Internet. Ginsparg does not review, edit, or correct the submissions. But the proposed biology server might differ from this model in one important way: It might include a “filter,” perhaps a board of editors that would help sort papers according to subject, significance, and quality. Not all papers would go through the filter. But those that did, in this scenario, might be given to two reviewers, and their signed comments would be published alongside the original paper. This would help maintain standards and give the site some prestige.

    Lipman informally presented his idea to the staff of HHMI in Chevy Chase, Maryland, in late January. One observer says the audience, including institute president Purnell Choppin, was enthusiastic; another, that the reception was “lukewarm.” Officially, HHMI has no comment. Lipman and Brown then took a version of their plan to Varmus on 16 February, and Varmus and NIH staffers spent another day, 27 February, taking the proposal apart and putting it back together again. NIH officials are trying to come up with a plan that traditional journals might embrace.

    NIH continues to make revisions. In one recent version, the traditional print journals and scientific societies would be invited to team up with NIH in creating a universal biology research archive. In this scheme, journals might place their own stamp of approval on electronic papers deemed worthy of it. But the proposal may undergo further changes before it is released.

    Varmus disclosed his interest in such a scheme on 4 March during the final day of a 2-week NIH budget review chaired by Representative John Porter (R-IL) in the House appropriations subcommittee for education, labor, and health and human services. Porter, raising a concern about the increasing cost of scientific journals, asked Varmus whether NIH was doing anything about it. “We are,” Varmus replied, explaining that he and Lipman have been exploring ways to disseminate full-text scientific articles by grantees to other grantees, essentially as a government service. He later said that NIH could potentially save “millions of dollars” by distributing research results over the Internet, bypassing traditional journal subscription fees that eat up a lot of grant money.

    Journal editors who have heard of the proposal remain skeptical. For example, Tony Delamothe, an editor at the British Medical Journal in London, who organized an e-print experiment at the BMJ's Web site, says he senses some “messianism” in schemes for reorganizing scientific publishing. He says he hasn't heard of a good way of conducting rigorous peer review online. And Ed Rekas, director of publications for the Federation of American Societies for Experimental Biology, sees a risk of “destroying the scholarly journal system that has served science so well for centuries.” He wouldn't want to put a penny of public money into an e-print server.

    Brown acknowledges that there are many “psychological barriers” to overcome. But he is convinced that the shift to Internet publishing is inevitable, and that it will increasingly be viewed as a good thing. Brown says he believes the current system is terribly inefficient. Traditional journals represent a “balkanized” form of science in which information is fragmented into literally thousands of publications, he says. And their methods of disseminating data and processing peer judgments are “klugey.” At present, Brown says, “there's no such thing as a scientist who takes a journal and reads it from cover to cover.” And “there's no single journal that satisfies the need of any scientist.” Everyone puts together his or her own “virtual journal,” Brown says, consisting of an article from one publication, a paragraph from another, a news item, 20 abstracts, 50 titles, and so on. “Some people actually Xerox these things and put them in a folder to take on a plane, so their virtual journal is almost a physical entity.” Brown asks: Why not reorganize the data flow so that every biologist can get access to everything he or she needs “in a sensible way,” from a single site on the Internet?

    NIH will be weighing alternative proposals for an e-print server this spring. Varmus says he will pay close attention to the community's concerns. Although NIH can afford to move quickly if it wants to, he notes, the funding power “doesn't mean a thing if the scientific community doesn't want to play.”


    Atom Lasers Get More Laserlike

    1. David Voss

    From high-tech weapons to rock-and-roll light shows, lasers are celebrated for their ability to shine a narrow, tightly focused beam of light exactly where you want it. Researchers around the world are working to give beams of atoms the same ability, essentially creating “atom lasers” that could make measurements of length and time with unprecedented accuracy or even build microscopic structures atom by atom. But the few atom lasers built so far produce an output that is more like a blob than a beam and is propelled out of the device by gravity, so it can only be directed straight down. Now a team of researchers in the United States and Japan reports on page 1706 that by carefully nudging the atom cloud at the heart of an atom laser with light, they have produced an atom beam that is far more like a laser beam.

    “We're trying to do for atoms what the laser has done for optics,” says team leader William Phillips of the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland. The NIST approach has two advantages, notes Wolfgang Ketterle of the Massachusetts Institute of Technology (MIT): “First, it reduces the spreading of the beam, and second, it means you can point the beam in any direction.” With such a beam, says Keith Burnett of Oxford University, “you'd be able to have the same control over matter that you have over light.”

    The active ingredient in atom lasers is a Bose-Einstein condensate (BEC), a trapped vapor of atoms cooled down to a temperature near absolute zero. Without the jostling produced by thermal energy, the atoms all condense into the single lowest quantum mechanical state. Right after making the first BEC in 1995, physicists realized that BECs could, in theory, emit a laserlike beam of atoms whose wave properties were “coherent”—identical and in step, just like the light waves in a conventional laser. Ketterle and his MIT group produced the first atom laser in 1997 by tricking part of the atom vapor into leaving the pack (Science, 31 January 1997, p. 637). A BEC is held together by a magnetic field, but if the atoms are slightly tweaked with a pulse of radio waves, they ignore the field, and a burst of atoms in the same coherent quantum state drops out of the trap. More recently, in work to be reported in Physical Review Letters, Theodore Hänsch's group at the Max Planck Institute for Quantum Optics in Munich has coaxed a more laserlike, continuous atom beam from a BEC trap. But these atom lasers could still only point downward. “The MIT result was the landmark experiment,” says Phillips, “and we now have the opportunity here to make a highly directional beam.”
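
    The radio-wave “tweak” can be made concrete with a standard textbook sketch (the notation below is conventional atomic physics, not taken from the article): a trapped atom's energy rises with field strength, so weak-field-seeking atoms pool at the field minimum, and a resonant rf photon flips an atom into a state the field can no longer hold.

```latex
% A trapped atom in hyperfine state |F, m_F> sits in the potential
U(\mathbf{r}) = g_F \, m_F \, \mu_B \, \lvert \mathbf{B}(\mathbf{r}) \rvert ,
% so weak-field seekers (g_F m_F > 0) collect at the field minimum.
% An rf photon resonant with the local Zeeman splitting,
\hbar \omega_{\mathrm{rf}} = g_F \, \mu_B \, \lvert \mathbf{B}(\mathbf{r}) \rvert ,
% flips atoms into m_F = 0, for which U \approx 0: those atoms no
% longer feel the trap and drop out under gravity as a coherent burst.
```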

    Going down.

    Output from early atom lasers was intermittent and propelled by gravity. The new directional NIST laser is boosted by light.


    Phillips and his co-workers, from NIST, Georgia Southern University in Statesboro, and the University of Tokyo, were able to kick the atoms out in a real beam using a technique called Raman scattering. When a photon scatters off molecules in a fluid, it sometimes exits as a photon with a slightly longer wavelength, leaving behind a tiny bit of its energy. The team used this effect to give the atoms in their cooled vapor a slight kick of momentum to get them traveling in a beam.

    One potential showstopper was that the photons used to induce Raman scattering might also get absorbed and destroy the ultracold BEC by heating it. So the researchers hit their atom cloud with two laser beams. One pumps the atoms to a higher energy level, and the second stimulates them to jump back down immediately. But the photons in the second beam, which define the size of the jump, have a slightly lower energy than those in the first, so a little bit of the photon momentum is left behind. This is just enough to produce a narrow, tightly focused atomic beam, as well defined as the laser beam from a laser pointer, spreading out with an angle of only about a tenth of a degree. And although the atom laser is not fully continuous—the Raman lasers are pulsed on and off—the emitted atom pulses overlap enough to form a nearly continuous beam.
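
    The size of that leftover momentum kick can be estimated from photon recoil alone. A back-of-the-envelope illustration (assuming sodium atoms and visible light; these particular numbers are not taken from the paper):

```latex
% Two-photon Raman kick, beams nearly counterpropagating:
\Delta p = \hbar (k_1 + k_2) \approx 2\hbar k = \frac{2h}{\lambda}
% For sodium (m \approx 3.8 \times 10^{-26}\,\mathrm{kg}) and
% \lambda \approx 589\,\mathrm{nm}, the recoil velocity is
\Delta v = \frac{2h}{m\lambda} \approx 6\,\mathrm{cm/s} ,
% a gentle, well-defined push compared with the near-zero thermal
% velocities in the condensate, which is why the beam stays narrow.
```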

    Having an atomic ray gun with laserlike precision opens up a whole host of applications. With a beam of atoms all in the same well-defined state, better atomic clocks and high-tech meter sticks can be made. Researchers typically define such fundamental standards by counting wavelengths of optical light like the ticks of a clock or the marks on a ruler, but the quantum mechanical waves from atoms are much smaller, allowing far more precise counts. The longer term dream, however, is atomic holography. Just as a conventional hologram is made by interfering beams of photons to create a three-dimensional image, so an atom hologram could combine beams of atoms to build a 3D solid object. Such a technique could be used to grow nanostructures for integrated circuits or biotechnology.

    Researchers caution that atom lasers won't be pumping out microprocessor chips in the near future, because the number of atoms in the laser beam is so small. “We are talking about femtograms coming out of the trap,” says Ketterle. But some space-age measurement applications may not be far over the horizon. “The European Space Agency has shown interest in gravitational wave measurements with atom optics,” says Burnett.


    Attractive Wire Guides Atoms Out of Trap

    1. Alexander Hellemans*
    1. Alexander Hellemans is a writer in Naples, Italy.

    It's not easy to manipulate individual atoms. For a start, you must confine them in a magneto-optical trap—a device that holds a small cloud of atoms in a magnetic field and cools them down to almost absolute zero—and then your options are still pretty limited. The most researchers have been able to do is to tweak the trap's magnetic field to move the atoms around, allow them to drop under gravity, or kick them out with pulses of light (see p. 1611). But now, researchers at Innsbruck University in Austria have gained a new mastery of atoms with current-carrying wires that can guide atoms just as optical fibers guide light.

    The technique, described in the 8 March issue of Physical Review Letters, could be used in small nanostructures that would manipulate atoms as handily as integrated circuits guide electrons. Ultimately such “atom optics” might even provide components for quantum computers. This research is a “step in the right direction” toward the creation of integrated atom optics, says physicist Wolfgang Ketterle of the Massachusetts Institute of Technology.

    The Innsbruck researchers, led by physicist Jörg Schmiedmayer, took advantage of the small magnetic field that atoms gain from the quantum mechanical “spin” of their electrons. An atom is repelled by an external magnetic field that is parallel to its own and attracted to one that is antiparallel. So the researchers took some cold lithium atoms from a trap and nudged them in the direction of a current-carrying wire that generates its own magnetic field, looping around the length of the wire. Any approaching atoms with a magnetic field antiparallel to the wire's get caught up in this field and begin to orbit the wire. If they also have momentum parallel to the wire, they will spiral along it, guided by the magnetic field.

    There are not many circumstances when you can spiral atoms down a free-floating wire, so the team also attempted a slight variation on the technique. They superimposed an additional magnetic field across the wire that has the effect of canceling the wire's field on one side, creating a “tube” of field parallel to the wire that has a minimum in its center. Atoms whose field is parallel to the wire's, and which would normally be repelled by it, can get trapped in the region of low field and channeled along the tube next to the wire. The team was able to take pictures of the atoms following the wire by briefly illuminating them with lasers, causing them to luminesce.
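
    The geometry of this “side guide” follows from elementary magnetostatics: the wire's field falls off as 1/r, so a uniform bias field cancels it exactly along one line parallel to the wire. A minimal sketch (symbols are standard, not drawn from the article):

```latex
% Field of a straight wire carrying current I, at distance r:
B_{\mathrm{wire}}(r) = \frac{\mu_0 I}{2\pi r}
% A uniform bias field B_b applied perpendicular to the wire cancels
% B_wire on one side of it, at the radius
r_0 = \frac{\mu_0 I}{2\pi B_b} ,
% leaving a line of near-zero field parallel to the wire where
% weak-field-seeking atoms are confined and can travel freely.
```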

    Riding the tube.

    Atoms caught in magnetic field tube next to current-carrying wire.

    CREDIT: PHYSICAL REVIEW LETTERS 82 (10), 2014 (1999)

    This side-guiding technique is particularly appealing to atom opticians because it could allow them to construct wires on a surface and, with the help of an external magnetic field, channel atoms along specific paths just above the surface. Says Schmiedmayer: “You could mount the side-guide wire on a surface and use nanofabrication technology to produce very small wires and structures,” devices that might serve as the hardware for a quantum computer. “In fact, we are trying to make such small structures now.” Ketterle adds that the Innsbruck group isn't the only one developing techniques to guide atoms. “There are quite a few efforts made now to use magnetic fields, wires, miniaturized permanent magnets, and laser light to build new traps, new guides, and new mirrors for atoms,” says Ketterle.


    Model Indian Deal Generates Payments

    1. Pallava Bagla

    NEW DELHI— A native community and an Indian research institute this month will collect the first payment from a landmark agreement to market a herbal tonic derived from a local plant. The agreement is seen as a model for so-called bioprospecting efforts endorsed by the 1992 Convention on Biological Diversity.

    The medicine is based on the active ingredient in a plant, Trichopus zeylanicus, found in the tropical forests of southwestern India and collected by the Kani tribal people. Scientists at the Tropical Botanic Garden and Research Institute (TBGRI) in Trivandrum, Kerala, isolated and tested the ingredient and incorporated it into a compound, which they christened “Jeevani”—giver of life. The tonic is being manufactured by the Aryavaidya Pharmacy Coimbatore Ltd., a major Ayurvedic drug company.

    The process marks perhaps the first time that cash benefits have gone directly to the source of the knowledge of traditional medicines, says Graham Dutfield, an ecologist with the Working Group on Traditional Resource Rights at the University of Oxford, U.K. “It is a replicable model because of its simplicity,” he says about a chain of events that began well before the international biodiversity treaty was signed.

    TBGRI scientists learned of the tonic, which is claimed to bolster the immune system and provide additional energy, while on a jungle expedition with the Kani in 1987. A few years later, they returned to collect samples of the plant, known locally as arogyapacha, and began laboratory studies of its potency. In November 1995, an agreement was struck for the institute and the tribal community to share a license fee and 2% of net profits. Another agent from the same plant is undergoing clinical tests for possible use as a stamina-building supplement for athletes.

    The first $21,000 payment, to be shared by the tribal community and the institute, is due later this month. P. Pushpangadan, until last month the institute's director, predicts that the deal will “not only generate mass employment but also be a money spinner for the poverty-stricken tribals.” He compares its potential value to the booming market for ginseng, cultivated in Southeast Asia.

    But some scientists are skeptical of such claims, and a few even question the agreement's underlying principles. Dan Janzen, an ecologist at the University of Pennsylvania, Philadelphia, who works in Costa Rica and helped broker a deal between the government and Merck for bioprospecting there, calls it a “publicity and political move by the participants, the benefits of which may disappear within a few years.” Although he acknowledges the value of drug search and discovery in the wild, he deplores a mindset that assumes the “produce of those at some lower class” should be readily available for exploitation “by the decision-makers of a given society.”

    Pushpangadan, now head of the National Botanical Research Institute in Lucknow, disagrees, saying that he would not have fought against the odds for 12 years unless he was sure that the arrangement would benefit the Kani. And botanist Peter Raven, director of the Missouri Botanical Garden, considers this agreement a “very good model for future” partnerships throughout the developing world. The current agreement must be renegotiated in 7 years, and the tribal community is expected to use the money for health care facilities and schools.


    New Findings Reveal How Legs Take Wing

    1. Gretchen Vogel

    It doesn't take training at Kentucky Fried Chicken to know the difference between a chicken wing and a leg. But it's taken researchers a long time to figure out the molecular signals that tell the developing embryo which kind of limb to make. Now, work from at least four labs points to a set of proteins that appear to play a leading role in separating legs from wings.

    In the past decade, developmental biologists have made impressive progress in identifying the genes that control a limb's growth from trunk to tip and from front to back. But almost all of those genes turned out to be the same in both arms and legs. That left researchers wondering just what genes control the many shape differences between the limbs. “To identify the genes that convey ‘legness’ is amazing,” says developmental biologist Cheryll Tickle of the University of Dundee in the U.K., who has helped uncover many of the embryonic genes that structure the limbs.

    The first clues came last year, when several groups reported that in a number of vertebrates, at least three genes are expressed only in either forelimbs or hindlimbs. Pitx1 and Tbx4 are found in the legs, while Tbx5 is expressed in wings and arms.

    On the other hand.

    A mouse embryo lacking the Pitx1 gene (right) forms short, slender, armlike hindlimbs.


    Suspecting that these genes might help differentiate the limbs, several groups began to examine their effects in embryos, and results are now rolling in. Last month, developmental biologists Juan Carlos Izpisúa Belmonte of the Salk Institute for Biological Studies in La Jolla, California, and Michael Rosenfeld of the University of California, San Diego, and their colleagues reported in Genes and Development that in genetically engineered mice lacking the leg-specific Pitx1, the hindlegs have short, thin bones that resemble forelimbs. Although the animals' fore- and hindlimbs are not identical, there is “an element of armness that's come to the leg,” says Rosenfeld.

    The opposite experiment—expressing Pitx1 in the wings of developing chicks—seems to bring a bit of leg to the wing, as Izpisúa Belmonte and Rosenfeld reported, and as Clifford Tabin and Malcolm Logan of Harvard Medical School in Boston now report on page 1736 of this issue. To express Pitx1 in the forelimb, Tabin and Logan inserted the gene into a virus and then infected the chick embryonic region destined to become the wing bud. Although the wing was not completely transformed, the results were striking. Chicken wings normally bend downward at the equivalent of the wrist, but in the infected wing the bones grew straight, similar to the junction between a chicken's ankle and foot. The affected wings did not grow feathers and often sprouted claws. (Izpisúa Belmonte and Rosenfeld's experiments yielded similar results.) Logan and Tabin also found changes in the chick's muscle structure: Infected wings developed the four muscles characteristic of the chicken drumstick, but lacked the usual seven wing muscles.

    Pitx1 apparently doesn't produce these changes alone; it seems to exert its influence by turning on another leg-specific gene, Tbx4. In infected wings, Logan and Tabin found the Tbx4 gene turned on wherever the Pitx1 gene was active. Tbx5, the forelimb gene, was also active, however, which may explain why neither group found complete transformations from wings to legs.

    At the Nara Institute of Science and Technology in Japan, developmental biologist Toshihiko Ogura and his colleagues have found even more dramatic transformations by working with Tbx4 and Tbx5. Ogura declined to discuss the as-yet-unpublished work, but those familiar with the study, such as Sumihare Noji of the University of Tokushima, who heard Ogura present it at a meeting, say that Ogura's team used electroporation to misexpress the Tbx4 and Tbx5 genes in the developing wings and legs. And they got almost complete transformation from both hindlimb to wing and wing to hindlimb. These more dramatic results may be due to interactions between the proteins that are not yet understood, or to the electroporation method Ogura used to insert the new gene, which triggers gene expression in 2 to 3 hours instead of 12.

    These genes may shape arms and legs in all vertebrates; scientists have already identified human equivalents of both Tbx4 and Tbx5. A rare disorder called Holt-Oram syndrome, which causes severely shortened arms and heart defects, has been traced to defects in the human TBX5 gene. Versions of the genes are also present in the newt, and scientists are searching for the zebrafish versions, hoping for clues to the genes' evolutionary history.

    Based on other work on developmental signals, the researchers suspect that Pitx1, Tbx4, and Tbx5 influence “wingness” or “legness” by altering a cell's response to similar growth factors. The difference between legs and wings could be due to subtle variations in when and where the common growth factors are active, leading to minor changes in the growth of analogous sets of bones in the two limbs, researchers say. “You have all the same cell types, but a slightly different pattern” in the different limbs, says Tabin. “And these genes turn out to be the heart of that difference.”

    The next step, says Logan, is to figure out how the limb-specifying genes fit into the hierarchy of signals that build arms and legs. For example, although Pitx1 seems to turn on Tbx4, no equivalent gene has been found to direct Tbx5 expression. And scientists are eager to discover exactly which growth factors and other genes Tbx4 and Tbx5 help to control. “It's a very complicated story,” says Rosenfeld, “one that's not solved at all.”


    Videotapes Expose Classroom Faults

    1. Steve Olson*
    1. Steve Olson writes from Washington, D.C.

    Test scores have documented how poorly many U.S. students do in math compared to their counterparts around the world. But why are their scores so low? A new analysis* just released by the U.S. Department of Education suggests that what happens in the classroom—what students are asked to do and how the material is taught—could provide at least part of the answer. And the data are in living color rather than being buried in answer sheets or questionnaires.

    Poor grades.

    The quality of the math taught in U.S. middle schools, judged by such measures as complexity and the use of deductive reasoning, is markedly lower than in Japan and Germany.


    In 1994–95 researchers videotaped 231 eighth-grade mathematics classes in the United States, Japan, and Germany as part of the Third International Mathematics and Science Study (TIMSS). Conceived as an interesting sidelight to the main event—international tests of student achievement among fourth graders, eighth graders, and students in the last year of secondary school—the tapes have become one of the most informative and influential parts of TIMSS. Although confidentiality agreements prevent most of the tapes from being shown publicly, educators who have watched excerpts of a few that have been released and shown at conferences and workshops say they are shocked at the shortcomings of U.S. pedagogy. “I've seen teachers with tears of admiration in their eyes after watching some of the lessons from other countries,” says one researcher who has analyzed them. “It's amazing to see a lesson play out so well.”

    The tapes highlight teaching methods long suspected of pulling down U.S. achievement levels. “What the videotapes show—for example, that U.S. teachers cover more topics in a class than do teachers in other countries—is very consistent with earlier findings about the U.S. curriculum,” says William Schmidt of Michigan State University in East Lansing, the national coordinator for U.S. TIMSS research. “And common sense says that the characteristics of the curriculum are going to have an effect on student achievement.”

    The classrooms videotaped were chosen at random from the pool of those that participated in TIMSS. A single lesson in each classroom was videotaped, and the video was then digitized, translated, transcribed, and put on a CD. Six coders, two from each country, watched the lessons and quantitatively analyzed the mathematics presented and the actions of students and teacher. In addition, a group of experienced college mathematics teachers read transcripts and judged the quality of each lesson after the source was disguised.

    Perhaps the most distressing finding, from the educators who judged the transcripts, was the subpar mathematical content of the U.S. classes (see graphic). “Any particular indicator of quality might be questioned,” says James Hiebert of the University of Delaware, Newark, a contributor to the study. “But no matter what indicator was used, most people agreed that the quality of the lessons was lower.” Also, U.S. students typically were taught subjects one to two grades below those taught to their peers in Germany and Japan.

    Teaching practices were remarkably uniform within each country, but they differed sharply from nation to nation. In Japan, teachers usually present their students with a problem and then let them work on it individually and in small groups so that students have to struggle with the relevant mathematical concepts. In the United States and Germany, teachers tend to drill students on concepts they have just described. The result, says the principal investigator for the study, James Stigler of the University of California, Los Angeles, is that “American and German students tend to practice routine procedures, while Japanese students are doing proofs.”

    Most of the teachers videotaped said that they had implemented practices from the standards issued by the National Council of Teachers of Mathematics (NCTM), which break the mold of traditional U.S. math classes by emphasizing high-level problem solving and greater flexibility in attacking a problem. But the tapes show little evidence of such innovations. “At the policy-making level, states and districts have been very influenced by the standards,” says Glenda Lappan of Michigan State, the current NCTM president. “But there aren't many places where teachers [are encouraged] to move forcefully toward the standards.”

    However, Stigler and Hiebert say that teachers should not shoulder all the blame for the poor performance of U.S. students. “Teachers have been encouraged to teach the way they do,” they write in a forthcoming book. “They have been provided no system in which they might spend time and energy studying and improving teaching.”

    Stigler and Hiebert also caution against interpreting the video findings too broadly. Japan's high scores on TIMSS might derive from other factors, they say, including extensive preparation for high school entrance exams. And German students did roughly the same as U.S. students on the TIMSS tests despite differences in pedagogy. “This is a study with a sample size of just three countries,” says Stigler. “We should be careful not to say that particular teaching practices produce high student learning.”

    But educators are hoping to draw exactly those types of conclusions from a much larger project already under way. Stigler's research team and collaborators in six other countries are videotaping hundreds of eighth-grade math and science classes. The results, due out in 2001, are likely to produce more painful introspection for U.S. educators.

    • *The TIMSS Videotape Classroom Study, available at

    • The Teaching Gap, by James Stigler and James Hiebert, to be published in August by Free Press.


    Memory for Order Found in the Motor Cortex

    1. Ingrid Wickelgren

    Every time we kick a ball, shake a hand, or spoon ice cream to our lips, we rely on a strip of tissue on the top of the brain known as the motor cortex to tell our muscles what to do. Now it appears that the motor cortex can do far more than simply orchestrate movements. On page 1752, a team led by neuroscientist Apostolos Georgopoulos of the University of Minnesota in Minneapolis and the Veterans Affairs Medical Center there reports that neurons in the motor cortex can also do a kind of thinking: They can help recognize and remember the sequence of events in time, at least as a prelude to movement.

    Monkey see, monkey remember.

    In this task, the monkey has to remember and point to the spot that appeared just after the one that turned blue.


    If the findings hold up, researchers may have to rethink their view of the motor cortex, and perhaps of other brain regions, too, says Steven Wise, a neuroscientist at the National Institute of Mental Health (NIMH) in Poolesville, Maryland. The work, which Wise describes as “unprecedented,” may mean, he says, “that the information needed to perform complex cognitive tasks is distributed very widely” in the brain. In that event, prospects for recovering from brain injuries may someday be brighter. If healthy areas share some functions of the damaged brain areas, Wise speculates, clinicians may be able to boost those functions and stimulate more complete recovery.

    The new findings extend more than 2 decades of experiments indicating that in certain circumstances neurons in the motor cortex are active even when an animal isn't moving. In 1976, for example, Jun Tanji and Ed Evarts, then at NIMH, found activity in this region when an animal was preparing to move but not yet moving. And in the 1980s and early 1990s, Georgopoulos's team found that neuronal activity in the motor cortex can provide information about the direction of an upcoming movement and can also serve as a memory for the spatial locations of individual stimuli to which the animal was supposed to move.

    Georgopoulos then wondered whether motor cortex cells also could keep track of several spatial locations when they are presented in a sequence. So in 1993, with the aid of his grad student Adam Carpenter and a Minnesota colleague, neuroscientist Giuseppe Pellizzer, Georgopoulos began training a monkey on a task that requires memorizing the order of events in time. The monkey watched a series of yellow spots pop up on a screen in a random order around an invisible circle. Then, after three or four of them were present on the screen, one spot would turn blue. The monkey's job was to move a cursor to the spot that had appeared right after the blue spot. It took a year and a half for the monkey to master the task, and when it did, the team spent an incredible 3 years training a second monkey to do it with up to five spots.

    As the monkeys performed their task, the researchers recorded the responses of hundreds of neurons in the animals' motor cortices and found, as expected, that many of them showed a change in activity while the monkey was watching the spots, before any movement occurred. But they were surprised to discover that hardly any of these responses was specifically related to a spot's location. Instead, the researchers found that more than one-third of the recorded neurons showed an abrupt increase in firing only when a spot arrived in a certain place in the sequence—first, say, or second—no matter what its location. These neurons seemed to be specifically sensitive to “serial position.”

    “What is truly impressive is the magnitude and robustness of the effect,” comments Patricia Goldman-Rakic, a Yale University neuroscientist. “Serial position is represented at least as prominently as movement direction.” No single neuron could indicate a spot's serial position with certainty. But the more neurons monitored, the better their aggregate responses could specify this position, and the pattern of activity of 16 cells or more could pinpoint a spot's serial position with 100% accuracy, Georgopoulos says.

    The data suggest that the motor cortex can play an active role in processing abstract information and is not simply a slave of cognitive regions that tell it what movements to direct. Georgopoulos emphasizes, however, that the region does not do this job alone, but in concert with the rest of the brain. Neurons in the motor cortex, he says, are “participants in a dynamically changing network” of cells in different brain regions that share whatever cognitive duties are demanded by a task.

    Because of the wealth of evidence showing that the motor cortex is an engine of movement, some neuroscientists are skeptical of these conclusions. For instance, some think it's possible, despite the Minnesota team's results, that the neural responses seen in the study somehow correspond to a monkey's thoughts of moving in the direction of each new spot. To address that concern, says Carl Olson, a neuroscientist at Carnegie Mellon University in Pittsburgh, he would like to see how neurons in the motor cortex respond during a task in which the monkey must remember order, but doesn't ever have to move toward any of the remembered stimuli.

    Georgopoulos argues, however, that if the monkey were thinking about moving toward the spots, the neural responses would have been linked to location, not serial position. Still, he agrees that the results are a surprise and says he and his co-workers did a lot of “soul-searching” over the data. It goes to show, he says, that “we hardly know anything about the brain.”


    Proposed Access Rules Split Community

    1. David Malakoff

    Conceived in controversy, the Grand Staircase-Escalante National Monument in southern Utah is the largest U.S. reserve ever created specifically for science. Now, 3 years after President Bill Clinton set it aside, the vast preserve continues to be a flash point, as scientists take sides over a draft plan to preserve its scientific treasures without damaging them.

    Clinton enraged state politicians—and surprised researchers—in 1996 when he abruptly set aside the 770,000 hectares of spectacular canyon-carved desert for scientific research. Although critics accused Clinton of pandering to environmental activists, White House officials said the move was the best way to head off a coal mine that threatened the area's fossil-rich rock formations, archaeological sites, and rare ecosystems. Now, the government has upset some scientists as well with a proposal that could limit access to the monument.

    Last November, the Bureau of Land Management (BLM) unveiled a draft plan* to close roads and limit routine excavations within the preserve, which lies in a remote area in southern Utah. Ecologists and archaeologists are cautiously praising the plan, but some geologists and paleontologists say that the restrictions will hamper their ability to carry out research. The plan is open for public comments until 15 March, and BLM has been getting an earful.

    The rugged region was one of the last places in the continental United States to be mapped. In naming the reserve, Clinton paid tribute to pioneering geologist Clarence Dutton, who in 1880 described the area's terraced cliffs as a “grand staircase” for researchers to climb back into geologic time. Clinton also touted the reserve's “exemplary opportunities for geologists, paleontologists, archaeologists, historians, and biologists.” And he gave BLM the tough job of balancing protection and use of the monument, setting a September deadline for a management plan.

    The draft plan offers five options. Under its “preferred alternative,” BLM would close roughly half of the monument's 3500 kilometers of paved and dirt roads, ban cross-country trips on all-terrain vehicles, and discourage surface-disturbing research on about 60% of the reserve. In the restricted zone, digs would be permitted only to study “unique” or “extremely high value” fossils, artifacts, and rock formations. The plan would also require research permits, with proposals that the monument's managers deem controversial—such as major digs in the restricted zones—subject to review by an independent scientific advisory panel.

    Those suggestions have worried some geoscientists. Cracking down on motorized access and surface-disturbing research “could cause some great difficulties” for fossil researchers, says vertebrate paleontologist Jeffrey Eaton of Weber State University in Ogden, Utah, who notes that field studies often require sifting tons of soil for tiny fossil fragments. Adds paleontologist David Gillette of the Museum of Northern Arizona in Flagstaff, “It's just not practical to hike into every site, and you sometimes need heavy equipment.” Although Gillette hopes for “flexibility” in the permit system, he worries that the science advisory board may “open the door to needless bureaucracy.”

    Some geologists interested in studying the monument's oil and coal formations are more critical. The disturbance restrictions “are ludicrous” for large studies, says Utah State Geologist Lee Allison, who is attempting to rally opposition to the plan. If adopted, he says, the rules “would lead me to do science anywhere but there.”

    But monument chief Jerry Meredith says the BLM's intent is not to prohibit any research. “It would just require some projects to pass a higher test,” he says. “Our responsibility is to first protect and preserve these resources—and then allow responsible study.”

    Archaeologists applaud the “preserve first” attitude. They say unregulated visitation represents the greatest threat to the monument's 2800 known rock art sites and ancient Anasazi dwellings, which date back 1000 years. Biologists and environmentalists are also supporting the restrictions as a way to provide a rare opportunity to study large tracts of undisturbed land. “We don't need roads open just so someone can remove rocks,” says soils biologist Jayne Belnap of the U.S. Geological Survey in Moab, Utah. Indeed, the Wilderness Society would like to see more of the monument closed off. “Scientific research should be welcomed, but the monument's remote, unspoiled character should not be sacrificed to promote it,” says Greg Applet of the group's Park City, Utah, office.

    Meredith has already received more than 6000 comments, which will be analyzed in preparation for the final guidelines this fall. “I know paleontologists aren't excited about packing in backhoes on their backs,” he says. But, so far, he isn't promising to put them in the driver's seat.


    Eastern Europe's Social Science Renaissance

    1. Constance Holden

    A decade after seizing their destinies, the former East Bloc countries are struggling to emerge from an intellectual dark age for disciplines once subjugated to serve the state

    Under the guise of scholarship, the mission of the Institute of Scientific Atheism of the Czechoslovak Academy of Sciences, located in the heavily Catholic city of Brno, was to generate philosophical arguments supporting the notion that religion only interferes with the upward march of communism. When authorities finally swept the institute into history's dustbin in 1993, a few of its staff members who had done research on public attitudes toward religion found a haven at the University of Brno. Many of the rest, the political ideologues, turned their talents to business. “People like that often had good commercial instincts,” academy sociologist Michal Illner notes dryly.

    Devoted more to propaganda than research, dozens of institutes like the one in Brno littered the landscape when eastern Europe threw off nearly a half-century of Soviet control in 1989. Although some of these countries had managed to sustain world-class work in such areas as math, physics, and chemistry, disciplines at the heart of the social sciences—perceived as threats to the communist regimes—were subverted or allowed to languish. But now, although resources are still perilously scarce, eastern Europe, with massive help from institutions in the West, is resurrecting its social sciences. Tended by hosts of visiting scholars and money from private foundations, new institutions designed to help these countries blossom as capitalist democracies are sprouting like tender shoots on a fire-ravaged field. Indeed, eastern European social scientists have no more rewarding subjects for study than themselves as the area undergoes an unprecedented social and economic transformation. “These are golden times” for the social sciences, says Illner.

    A daunting challenge is to direct the energies of the next generation to where they are sorely needed. The brightest new Ph.D.s are neither in academia teaching nor in government drawing up enlightened policies. Instead, they are heading for the private sector, where the money and prestige are. Princeton economist Orley Ashenfelter worries that, without the services of people trained in modern economics, political science, and other policy-making fields, architects of post-Cold War societies in eastern Europe are “trying to build bridges without any engineers.”

    Ideological Trojan horses

    In 45 years under Soviet dominion, eastern Europeans had very little contact with Western social science. Economics was Marxist economics—heavy on history, light on equations; sociology was virtually banned from many university curricula; and political science served as a conduit for communist lecturing. Young politicians in Poland, for example, would attend the Communist Party's Higher School of Social Science, where they would learn about class struggles and the “dialectic” between historical forces driving societies ever closer toward communist ideals. In classes with names such as “Scientific Communism” and “Historical Materialism,” professors expounded on a society based on the marriage of science and socialism, says science historian Loren Graham of the Massachusetts Institute of Technology.

    But serious intellectual activity fermented beneath the surface. In Poland, Hungary, and Czechoslovakia, there were “islands of freedom” in some social science fields, says Josef Syka, vice chair of the R&D Council of the Czech Republic, especially in the research academies where, quarantined from students, top scholars were able to pursue unorthodox inquiries.

    While Czech scholars kept a handful of social science disciplines—such as ethnography and linguistics—alive in their academy, colleagues in Poland maintained a far more extensive clandestine culture that has helped them emerge from the intellectual dark ages. “If you were rejected by the official system, you could go underground” to publish books and papers, called bibula, on censored topics such as anticommunist uprisings abroad and the dismal performance of Marxist economies, says Polish political scientist Marek Kaminski of New York University and the University of Warsaw's Central and East European Economic Research Center. Such activities did entail a risk: Kaminski spent 5 months in jail in 1985 for managing the publishing venture for Solidarity, the union movement born in the Gdansk shipyards in 1980 that, by the end of the decade, had toppled the government.

    Poland's avant-garde movement provided a lifeline to Western thought for researchers in other countries, too. In sociology, Polish translations of Western literature and other publications “became an ideological Trojan horse” in eastern Europe, says Edmund Mokrzycki of the Institute of Philosophy and Sociology of the Polish Academy of Sciences.

    While Poland spread the subversive gospel on sociology, Hungary was able to sustain a decent reputation in economics, partly due to its strong tradition in mathematics. Politics too played a role. In 1968, Hungary spiced its centrally planned economy with free-market mechanisms, an experiment often referred to as “goulash communism.” “Hungarian economists made names for themselves explaining what was wrong with the communist system,” says economist Janos Kornai, who divides his time between Harvard University and a Hungarian think tank, Collegium Budapest.

    The new shoots

    But whatever intellectual capital eastern Europeans possessed, they faced awesome obstacles after seizing their destinies a decade ago. The biggest problem was a one-two economic punch. Accompanying the wrenching shift to the free market—and the loss of captive consumers in the Soviet Union—have been steep drops in government support for research and education. In Hungary, for example, R&D spending has slid from 1.8% of the total budget in 1989 to 0.7% in 1997, even as the budget itself has shrunk, says Andras Roboz, science attaché to the Embassy of Hungary in Washington, D.C.

    A more insidious obstacle to reform, says Kaminski, is the persistence of “petrified hierarchies” in universities and academies. Budget woes have made inroads there, making it easier to purge resistant outposts of communist ideology, says Petr Kratochvil of the Czech Academy of Sciences, where severe budget cuts enabled officials to kill 25 institutes—including the one for scientific atheism and the Lysenko-tinged Institute of Developmental Biology—and reduce the staffs by half. But universities are still home to “a lot of tenured deadwood” who balk at change, says Polish political scientist Karol Soltan, now at the University of Maryland, College Park.

    Aware of the stultifying legacy of the communist system, experts early on concluded that the most effective way to reorient social sciences would be to create new institutions, or new departments in old ones, rather than try to redirect existing entities.

    The first big initiative came in 1991, when billionaire financier George Soros founded the Central European University (CEU) in Budapest, now supported with a 20-year commitment from Soros's Open Society Institute. Housed in a restored baroque palace in the heart of Budapest, CEU has about 60 faculty members, plus dozens of visiting Western professors who lecture on everything from international affairs to “gender and culture.” CEU is the “crown jewel” of postcommunist higher education initiatives, says Andrew Kuchins, associate director of Stanford's Center for International Security and Arms Control. CEU, adds Kornai, is “a very good sign that you have something outside the state-run hierarchy of higher education and research.”

    CEU's major achievement so far has been to furnish people with master's degrees that set them up to qualify for Ph.D. programs abroad, usually in the United States or Britain. Twenty graduates are now at Stanford, which has a special program for eastern European students, says sociologist Laszlo Bruszt, chair of the CEU board. The university, which has a sociology department in Warsaw, intends to reach deeper into the pipeline of young social scientists by establishing an undergraduate college as well.

    Another promising growth on the postcommunist landscape is a new school in the Czech Republic. A joint venture of Prague's noted Charles University and the Czech Academy of Sciences, the Center for Economic Research and Graduate Education-Economics Institute was designed with the ambitious aim of creating “the next economic elite,” according to co-founder Jan Svejnar, who shuttles between Prague and a post at the University of Michigan, Ann Arbor. The center, started in 1991, awards U.S.-style Ph.D.s in economics, sponsors research, and holds seminars for government and business people. The goal is to sprinkle economists throughout academia, the government, and the private sector, says Svejnar.

    The center, whose professors do both research and teaching, is drawing raves. “It's the only institution of its kind anywhere in eastern Europe,” says Hungarian-born economist Richard Quandt of Princeton University, who runs an Andrew W. Mellon Foundation program that subsidizes the school. “They are really producing Western-style Ph.D.s.” His colleague Ashenfelter says the graduates are getting “snapped up by the world market” for positions in government and international finance.

    Most of the new initiatives, however, come in more modest packages. Take the Institute for Social Studies (ISS) at the University of Warsaw, established in 1991. Affiliated with the University of Michigan's Institute for Social Research in Ann Arbor, ISS does research and graduate training in sociology, economics, political science, and psychology. Since 1992, it has also run the Polish General Social Survey, which Kaminski calls “the most methodologically advanced project in empirical sociology in eastern Europe.” Modeled on the U.S. General Social Survey, it involves annual interviews with 2000 Poles to track values, beliefs, health, and well-being.

    Looking to youth

    As with institutions, it seems to be easier to form than reform individuals, observers say. “We found it is virtually impossible to retrain any of the existing economists,” says Svejnar. Those schooled in the Marxist tradition, he says, “basically found it too difficult” to adapt to rigorous, math-based methodologies.

    Quandt and others are placing their hopes on the next generation. In the Czech Republic, university enrollment has doubled since 1989, and new campuses have proliferated, says Syka. Regionwide, the social sciences appear to be keeping pace. In the first half of the 1990s, the ranks of Hungarian university students in economics and social sciences swelled by more than 150%, while tripling in Poland, says Tamas Tarjan of the Institute of Economics in Budapest.

    A major problem, however, is that these freshly minted social scientists are not going into jobs where their skills may be needed most—in academia and in government, where they could tend the machinery for a democratic society. Universities, with their rock-bottom salaries, no longer carry the prestige they did under communism. “Higher education in eastern Europe is a totally unmitigated disaster,” says Polish economist Stanislaw Wellisz, who teaches at both Columbia and Warsaw universities. Although governments are putting high priority on expanding the availability of higher education, “professors come last in the pecking order,” he says, noting that “a full professor earns about the same salary as a janitor. … If you are bright and have any ambition, you are not going to go into academic life.” That's bad news for students, Wellisz says: “Faculty members treat their academic position as a base for … other work and, therefore, devote very little time to teaching and to research.”

    Channeling fresh talent into the government is also difficult. According to Quandt, the Mellon Foundation funded an economics honors program at Warsaw University designed to prepare students to go abroad for Ph.D.s and then come back to “put Polish economics on its feet.” Instead, he says, “95% of the graduates got such incredible offers” from banks and companies that they never went on for Ph.D.s.

    Countries must battle external as well as internal brain drains. In Poland, says Wellisz, “the brightest ones go for graduate study abroad and stay abroad.” Most eastern European universities aggravate the problem by failing to take obvious steps such as aggressively recruiting rising stars, says Hungarian sociologist Ivan Szelenyi, now at the University of California, Los Angeles. He says three of his best Hungarian students at UCLA have been snapped up for tenure-track jobs at top U.S. schools. “If my colleagues in Hungary would listen to me, they would create fast tracks for these people. They would offer a chair and full professorship,” says Szelenyi.

    Observers say it will take many years to heal the deep wounds in eastern Europe's social sciences. “The way mathematicians survive in Poland is very simple: They do subcontracting work for Silicon Valley” while retaining their faculty positions, says Wellisz. But when social scientists enter the private sector, he says, they are lost to the world of teaching and scholarship. Some believe it will take a generation for modern social science to gain a firm foothold in eastern Europe. Wellisz offers a more pessimistic view: “Unless the [educational] system is changed, it will take an infinity.”


    Bright Spots in a Bleak Russian Landscape

    1. Constance Holden

    Harley Balzer recalls vividly the signs of “enormous damage” to Russia's social sciences inflicted by decades of disinformation and inept Soviet policies. When he attended a conference on “civil society” last summer in St. Petersburg, says Balzer, director of the Center for Eurasian, Russian, and East European Studies at Georgetown University in Washington, D.C., “I was appalled at the lack of ability to think systematically” among some of the country's top social scientists. “Essentially the discussion was mush.”

    Its economy in tatters, Russia trails many of its former satellites in pulling the social sciences from the ideological morass into which they sank during the Soviet era. Many universities are repackaging old, dogma-riddled courses, notes Irina Dezhina, a public policy analyst at Moscow's Institute for the Economy in Transition. For example, she says, a standard course once called “the Scientific Approach to Communism” is now called “the Social and Economic History of the 20th Century.” But the teachers often “teach exactly the same stuff” as before. A streak of nationalism in the intellectual community also threatens to undermine reform, says science historian Loren Graham of the Massachusetts Institute of Technology (MIT). “There are still quite a few Russian scholars who believe Western social science and humanities represent a certain ideological viewpoint which they don't wish to embrace,” he says.

    Nonetheless, some beacons have appeared on the horizon—think tanks, renewed departments in universities, and freestanding institutions for graduate education. As in eastern Europe (see main text), reformers are pushing innovation more than conversion. “While things are bleak, there are bright spots,” says Kennette Benedict of the MacArthur Foundation, who says much of the initiative—and most of the cash—is coming from outside the country. One high-profile endeavor was cooked up by Israeli economist Gur Ofer of Hebrew University in Jerusalem, a prominent expert on the Soviet economy, along with Valery Makarov, director of the Central Economics Mathematics Institute of the Russian Academy of Sciences. Together they designed and launched the New Economic School (NES), a graduate school that opened in Moscow in 1992. Supported by the MacArthur and Ford Foundations and international financier George Soros, the NES, headed by Makarov, awards about 40 master's degrees a year, preparing Russians to enroll in top-notch Ph.D. programs abroad. The first fruits have just come in: Two newly minted Ph.D.s—one from Harvard, one from MIT—are now at the Stockholm Institute for Transition Economics, studying how to integrate Russia into the world economy.

    Also making a mark is the Moscow School of Social and Economic Sciences, founded by Teodor Shanin, a prominent Russian rural sociologist who spent most of his career at Manchester University in the U.K. before taking leave to return to Moscow and run the school. A star faculty member is sociologist Tatiana Zaslavskaia, whose pathbreaking public opinion surveys in the early days of glasnost—revealing widespread demoralization, apathy, and economic distress in the Soviet Union—won her a position as adviser to former Soviet leader Mikhail Gorbachev.

    Fresh students are treated like tabulae rasae at the European University, a new independent private graduate institution in St. Petersburg organized by sociologist Boris Firsov and supported entirely by money from the MacArthur, Ford, and Soros Foundations. It now has about 210 students pursuing M.A.s or Ph.D.s in everything from economics to ethnology. Teaching students to think analytically is the number-one priority, says Vadim Volkov, dean of the faculty of political sciences and sociology: “Basically, we retrain them and resocialize them.” The Russian faculty are young—mostly in their early 30s—and all were trained in the West, says Volkov, 33, who earned his doctorate at Cambridge University. The students must be fluent in English and, in a break from Russian tradition, spend much of their time writing instead of preparing for rote oral exams.

    Experts hope these fledgling efforts will sow the seeds for an indigenous crop of world-class social scientists. And there's more to come. Last November, the MacArthur and Carnegie Foundations held a meeting to map out future support strategies for the social sciences in the former Soviet Union. One theme of the meeting, according to Volkov, is that communism is so deeply rooted in Russia that “total rejection of the past is impossible and unwise,” so more attention should be given to promoting “gradual transformation from within” at universities and academies throughout Russia's regions.

    So far, the initiatives in Russia have resembled intensive-care patients hooked to an array of tubes—their survival depends on infusions of foreign funds. Even before last summer's financial crisis, it was hard—in a country with no tradition of private philanthropy—to rally Russian support for these institutions. Russia needs not just money but time to move past parochial attitudes that are as stifling as Marxist dogma, says Balzer: “Russia is still a country where they make a distinction between world science and ‘fatherland’ science.”


    Scientists—and Climbers—Discover Cliff Ecosystems

    1. Kevin Krajick*
    1. Kevin Krajick is a writer in New York City.

    Researchers venturing onto remote bluffs find them to be oases of diversity, but rock climbers are taking out species even as scientists discover them

    They're vertical, they're made of rock, and you can't see them up close without risking your neck. So it's not surprising that few biologists have paid much attention to cliffs. But lately, some hardy researchers have dangled from ropes alongside high bluffs, and they are finding unusual and ancient communities that don't exist in the flatlands below.

    These first forays have turned up surprisingly diverse communities, including rare plants and lichens, birds, and trees nearly 1000 years old. “Cliffs protect themselves very well by being so inaccessible, so they can have unusual communities even in heavily populated areas,” says Jerry Freilich, former ecologist for California's Joshua Tree National Park. Joshua Tree and other parks are commissioning new studies on these hard-to-reach habitats, largely because a boom in rock climbing is putting unprecedented pressure on them, says Freilich, now science director for the Nature Conservancy of Wyoming.

    Life on the edge.

    Ancient trees on the Niagara Escarpment are adapted to harsh cliff conditions.


    Wildlife biologists have long known that raptors such as peregrine falcons and red-tailed hawks nest on cliffs, where predators can't get at their young. And a few researchers cataloged sea-cliff plants in Ireland and Britain in the 1980s. But until fairly recently, there have been no studies of cliffs as distinct ecosystems. “Look at how hostile they appear. No one really viewed them as habitat,” says Richard Knight, a professor of wildlife biology at Colorado State University in Fort Collins.

    Knight and graduate student Richard Camp recently discovered that some of Joshua Tree's granite spires are actually islandlike centers of diversity. They found 60% more bird species, and three times as many plant species, on the cliffs than on the flat, arid desert floors below. From top to bottom, the cliffs provide all sorts of niches: Rock wrens and white-throated swifts rush in and out of cracks where they nest in great chirping masses, while the prairie falcons that prey on them incubate eggs on nearby ledges. Rock faces concentrate infrequent rains, dribbling moisture down to ledges and cliff bases to supply trees and succulents such as Quercus oak and staghorn cactus that won't grow elsewhere; lazuli buntings and other Neotropical migrant birds use this vegetation for nesting and food.

    Researchers are still figuring out what makes some of these rocky, windswept sites so rich. One reason is that cliffs create a classic “edge effect”—a break in the normal landscape that is often more diverse than, say, the monotonous interior of a forest. Winds that bring insects and seeds from all over may also play a role. For whatever reason, “we do know the Joshua Tree cliffs are a distinct place,” says Knight.

    Up, up, and away.

    Climbers on granite spires in Joshua Tree National Monument put cliff plants and birds at risk.


    And because cliffs are so inaccessible, organisms once widespread may end up clinging to them as sanctuaries. About 5 years ago, in a boat off the Hawaiian island of Kauai, biologists from the National Tropical Botanical Garden there spotted what they believe were the last surviving individuals of Munroidendron racemosum, a primitive-looking tree with long, pendulous branches. The trees were sprouting from volcanic cliff ledges that looked as if they were about to crumble into the sea. All the others of their kind, once common on the island, had been eaten by human-introduced goats that couldn't reach this one last refuge. The biologists rappelled down, rescued seeds, and have since repropagated the species, says Paul Cox, director of the botanical garden.

    Cliffs in the midwestern and southern United States also are home to a host of endangered species that have either been pushed there or just prefer rocky spots. They include such plants as mud warts and water hyssops, which grow in shallow seasonal pools that form in cliff rocks in Minnesota, and lichens such as Parmelia stictica, which cling to vertical faces.

    Lichens are often a major component of cliff ecosystems, notes biologist Michael Farris of Hamline University in St. Paul, Minnesota, but these low-profile organisms are hard to identify and poorly known. So lichen biology remains a wide-open field. Last summer, rock climber Peter Smith, then a master's student in biology at Appalachian State University in Boone, North Carolina, surveyed one small part of the walls of nearby Linville Gorge and quickly came up with 23 genera inhabiting several distinct zones according to moisture. He also spotted one entirely new species, since named Fuscidea pallida. “He only did 12 transects, which makes us think there are many more undiscovered things up there,” says Gary Walker, Smith's adviser.

    In addition to their diversity, parts of cliff ecosystems can be remarkably ancient. Botanist Doug Larson and dendrochronologist Peter Kelly of the University of Guelph in Ontario, Canada, have found that some of the eastern white cedars dominating the 800-kilometer-long Niagara Escarpment of the Great Lakes region are up to 800 years old; well-preserved dead trees are more than twice that age.

    Many of the cedars have multiple root systems attached directly to soil-less solution hollows and cracks in bare rock. Larson and his colleagues have found dense colonies of algae, bacteria, and fungi penetrating 1 to 3 millimeters into these apparently solid rocks. Larson hypothesizes that these so-called cryptoendoliths—previously known mainly from Antarctica—may help nourish the trees. Larson also notes that the cedars are apparently adapted to slow growth rates; in fact, they are among the slowest-growing plants known, adding only a couple of layers of cells each year, compared to perhaps 600 layers for their cousins on flat land. Twisted trunks may reach about a meter in diameter, but some 200-year-old specimens are no bigger than a toilet plunger. Larson believes slow growth assures longevity and thus survival of the species. “It's an advantage—if they grew fast, gravity would drag them off before they got a chance to reproduce,” he says.

    Unfortunately, scientists are not the only ones discovering cliffs. Last year, 4 million people went rock climbing in the United States alone, and they left their mark on these fragile ecosystems, as Knight and Camp report in studies in the December 1998 issue of Conservation Biology and the April issue of the Wildlife Society Bulletin. Some Joshua Tree prominences are now hung with so many ropes that they look like Gulliver tied down by Lilliputians. To keep regular routes safe, climbers routinely “garden” them, pulling plants and soil out of cracks and wire-brushing lichens off protruding handholds.

    Not surprisingly, Knight and Camp's studies show that climbers reduce plant cover and drive off birds. Independent botany consultant Victoria Nuzzo of Rockford, Illinois, showed that climbers reduced lichen cover and species by half and took out three-quarters of threatened cliff goldenrod plants at one site in northern Illinois's Mississippi Palisades State Park. Perhaps worst of all, climbers on the Niagara Escarpment are clearing the way by cutting down the old trees. Survivors may be used to fasten ropes, which strips their bark. Dendrochronologist Kelly has meticulously documented the damage; he dated one tree that germinated in 1215—and had its main axis sawed off in 1992.

    Because the recognition of cliff life is so new, few parks have gotten around to making rules. As studies build, that may change. “I like to think that the more we learn about these places, the more we can demonstrate how special they are,” says Kelly.


    Engineering Metabolism for Commercial Gains

    1. Joseph Alper*
    1. Joseph Alper is a writer in Louisville, Colorado.

    Researchers are using genetic engineering to turn bacteria into chemical reactors that perform multistep synthesis of bulk chemicals

    The chemical industry is going back to the future. Until the 1930s, most bulk chemicals came from microbes, which made them by fermenting biomass such as corn and potatoes. But after learning how to “crack” petroleum into simpler hydrocarbons, chemists took over. They devised complex, multistep schemes to convert these building blocks into bulk chemicals as well as smaller scale specialty products. Now, microbes are poised to reenter the bulk chemical business.

    Two decades of advances in microbial genetics and a new understanding of cells' metabolic pathways are helping researchers turn microbes into one-pot chemical reactors, able to perform multiple enzymatic steps to convert sugars and other raw materials into industrial chemicals or pharmaceuticals. By combining several chemical steps into one reaction vessel, so to speak, the strategy can save large amounts of money. As a result, the chemical industry is now getting set to reintroduce fermentation as an economical means of producing many bulk chemicals.

    Microbial industry.

    Equipped with genetically engineered enzymes (green), bacterial metabolism can transform glucose into propanediol.

    For example, DuPont, in Wilmington, Delaware, is planning to put a modified bacterium to work turning glucose into 1,3-propanediol, a monomer that can be linked to form a polyester called polytrimethylene terephthalate, now found in some carpeting and textiles. “We have a tremendous opportunity here to make an impact with a highly efficient and cost-effective biological process,” says Richard LaDuca, the project coordinator at Genencor International in Rochester, New York, which is working with DuPont. Two different multistep processes are now used commercially to make 1,3-propanediol.

    Genencor is also working with Eastman Chemical, of Kingsport, Tennessee, to commercialize a microbial process that transforms glucose into 2-keto-L-gulonic acid, the key intermediate in the industrial synthesis of ascorbic acid (vitamin C). The collaboration—which included several other companies and Argonne National Laboratory, in Argonne, Illinois—engineered an undisclosed bacterium to carry out the four-step metabolic pathway. According to chemical engineer Michael Cushman, Eastman's project director, this biological process is now, “without a doubt, the cheapest way to make ascorbic acid.” If adopted, this one-step process would replace the current seven-step method.

    Other chemical companies are also trying to harness microorganisms to produce bulk chemicals. But they are generally tight-lipped about their efforts, because of both the financial stakes and the strategy's history of difficulties. “Replacing chemistry with biochemistry was one of the very first things to cross people's minds when genetic engineering first came about in the early 1980s,” says Douglas Cameron, recently hired away from the University of Wisconsin, Madison, by food-processing giant Cargill to build a metabolic engineering group at its Minneapolis research and development center. “But to do this on a commercial scale was a far more difficult task than anyone thought.”

    “Putting the new enzymes into an organism is really the easy part,” adds Bernhard Palsson, professor of bioengineering at the University of California, San Diego. Indeed, it can be almost trivial, says Cameron, who is more forthcoming than many others working in industry. Developing bacteria capable of producing 1,2-propanediol—used today as a food additive, particularly for making semimoist pet food—took him and his group just a month, he notes.

    They took advantage of Escherichia coli's ability to convert glucose into small amounts of the compound methylglyoxal as a normal part of sugar metabolism. They knew that either of two enzymes—aldose reductase or glycerol dehydrogenase—would turn methylglyoxal into 1,2-propanediol. By consulting online databases, the group identified the appropriate genes for the enzymes and engineered them into E. coli. Current production of 1,2-propanediol by this engineered E. coli is a mere 0.2 grams per liter, “but these are our initial results and far from optimized,” explains Cameron. He sees no reason to doubt that further engineering will increase production to the 100-grams-per-liter level needed to make the process commercially viable.

    But coaxing a bacterium to shift much of its metabolic resources into making a particular compound is a challenge nonetheless. The production of an individual metabolite via a particular pathway is affected by the ebb and flow of dozens of other pathways in a cell's metabolism. “Eventually, you have to start looking at metabolic fluxes in the organism, in an attempt to choose pathways to get rid of or down-regulate in order to shunt more metabolic energy into the pathway you've engineered,” says Palsson.

    He and others, including James Bailey of the Swiss Federal Institute of Technology in Zurich, Gregory Stephanopoulos of the Massachusetts Institute of Technology, and Jens Nielsen of the Technical University of Denmark, have developed computer models of these metabolic fluxes in E. coli. One model correctly predicted the metabolic effects of 73 different mutations, Palsson says. He is now trying to predict which of E. coli's metabolic pathways are absolutely essential for life, and is working with Harvard geneticist George Church to knock out the pathways one by one to see if the model's predictions match reality. Ultimately, these models should help researchers who have introduced new enzymes into an organism to plan a second round of metabolic engineering, bolstering or shutting off specific pathways to maximize the amount of product.
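    The core idea behind such flux models can be illustrated in a few lines: at steady state, metabolism is treated as a linear program that maximizes flux through a desired pathway subject to mass balance (the stoichiometric matrix times the flux vector equals zero) and per-reaction capacity limits. The toy network below is purely illustrative—the metabolites, reactions, and bounds are invented for the example and do not come from any published E. coli model.

    ```python
    # Minimal flux-balance sketch: maximize product export in a toy
    # glucose -> intermediate -> product pathway. Hypothetical network,
    # not a published model.
    import numpy as np
    from scipy.optimize import linprog

    # Rows: metabolite balances (glucose, intermediate, product).
    # Columns: reactions (uptake, glucose->M, M->product, product export).
    S = np.array([
        [1, -1,  0,  0],   # glucose: produced by uptake, consumed by step 1
        [0,  1, -1,  0],   # intermediate: produced by step 1, consumed by step 2
        [0,  0,  1, -1],   # product: produced by step 2, consumed by export
    ])

    # Capacity bounds on each reaction flux (arbitrary units).
    bounds = [(0, 10), (0, 8), (0, 5), (0, 100)]

    # Maximize export flux (column 3) by minimizing its negative,
    # subject to steady state: S @ v = 0.
    res = linprog(c=[0, 0, 0, -1], A_eq=S, b_eq=np.zeros(3), bounds=bounds)
    print(res.x)  # -> [5. 5. 5. 5.]; export is capped by the M->product step
    ```

    The bottleneck structure is the point: the optimum is set by the tightest capacity in the chain, which is why engineers look for pathways to down-regulate or bolster in a second round of engineering.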

    Microbiologist Mary E. Lidstrom of the University of Washington, Seattle, is hoping to do the same kind of metabolic engineering on Methylobacterium extorquens, a bacterium capable of growing on one-carbon sources such as methanol. Because methanol is easy to make from methane, found in natural gas, genetically engineered Methylobacterium could replace some of the existing chemical processes for turning this readily available feedstock into the dozens of commodity chemicals that go into the manufacture of almost every polymer now in use. Lidstrom has already created an efficient vector system for introducing new genes into the organism, and she has worked out most of the metabolic pathways this organism uses to grow on methanol. Her research group is also sequencing the remainder of the organism's genome. That, she says, “will give us the tools to greatly reduce the time it takes us to engineer new pathways in this organism.”

    The promise of genomics is what makes metabolic engineers so hopeful these days. “With all of the genome sequences we now have and with a better understanding of cellular metabolism, we now have the tools to engineer new metabolic pathways and increase yield of a desired product on a time frame that competes with chemistry,” says Genencor's LaDuca.


    For Radioactive Waste From Weapons, a Home at Last

    1. Richard A. Kerr

    Independent scientific oversight—and understanding that no site is perfect—helped create the world's only certified deep radwaste repository

    For 40 years, scientists and engineers have been crisscrossing the United States searching for likely places to store the mounting tons of radioactive waste created by nuclear weapons production and by nuclear power plants. But everywhere they have looked, they have found geological and political problems. Yucca Mountain, Nevada—the site decreed by Congress as the sole site to be studied as a repository for the nation's most radioactive wastes—is still years from accepting its first curie (see p. 1627). And other nations are even further from actually storing waste: They are still trying to narrow their choices of possible disposal sites.

    Yet in this frustrating saga, there is one lone success story: the Waste Isolation Pilot Plant (WIPP), a multibillion-dollar effort to bury long-lived radioactive wastes in deep salt beds 40 kilometers east of Carlsbad, New Mexico. Unlike any other deep radwaste facility in the world, WIPP has managed to gain approval from scientists and regulators as a safe repository, and even many locals are behind the project. Of course, not everyone is enamored of WIPP. It still faces two lawsuits, filed by environmental groups and the New Mexico attorney general, that challenge its science and due process. But government scientists and lawyers say they're optimistic they'll get favorable judgments. If so, bomb-related wastes could start to be entombed as early as the end of the month.

    Salty solution.

    Rooms dug deep into salt beds at WIPP (right) may soon store nuclear waste trucked from bomb production facilities (left).


    It's not that WIPP is scientifically a perfect site; indeed, one of the lessons of its history is that there is no such thing. “You never feel quite as comfortable about a site as the day you start to study it,” says geophysicist Wendell Weart, who spent more than 20 years as the lead scientist on the project. “If there's anything we've learned” in the course of repository site searches, adds Kevin Crowley, director of the National Research Council's (NRC's) Board on Radioactive Waste Management, “it's that the natural setting is a lot more complicated than we thought it would be. These are first-of-a-kind efforts; they're running into a lot of surprises.” Indeed, he says, “it's fair to say things did not go all that smoothly at WIPP, especially in the early days.”

    The story of how WIPP overcame these obstacles to reach its current status as the world's first certified deep radwaste facility may hold lessons for others struggling along the same path. If there's one overriding lesson, observers say, it is that the technical surprises were handled in an open spirit of scientific inquiry. And a key to this process was an independent scientific advisory board, which provided a thorough—and very public—check on the project scientists.

    Deep salt

    The road to Carlsbad began in the early 1970s, with a surprise beneath the cornfields of Kansas. Since the 1950s, scientists had pointed to salt as one of the most promising geologic media for a radwaste repository. Laid down in evaporating seas long ago, salt is rock-solid and essentially impermeable. It flows to seal up any excavated cavity and leaves clear traces of any past intrusion of water, the bane of any repository intended to entomb wastes for millennia. In 1970, the Atomic Energy Commission (AEC), a predecessor of the Department of Energy (DOE), tentatively selected an abandoned salt mine near Lyons, Kansas, as a radwaste repository.

    Although the nuclear industry was still in its infancy, highly radioactive spent fuel rods from civilian power plants were already piling up. And for decades nuclear weapons production had been generating liquid high-level wastes along with plutonium-contaminated debris—rags, protective clothes, and tools—called transuranic, or TRU, wastes. Although TRU wastes are not as “hot” as spent fuel, they remain radioactive for so long—hundreds of thousands of years—that they must be disposed of in deep, remote sites.

    But the case for a Lyons repository proved to be literally full of holes: The site turned out to be punctured by old oil and gas wells, so scientists worried that wastes might leak out of similar, undetected holes. The AEC withdrew Lyons from consideration and sent Oak Ridge National Laboratory and the U.S. Geological Survey on a search for the most promising salt beds in the country. In 1973 they chose the Delaware Basin salt beds of southeastern New Mexico.

    Although the state of New Mexico repeatedly challenged the subsequent site characterization process, Weart says, the original scientific search meant that the atmosphere was not poisoned by a political decision to dump wastes on the politically weakest state available. It's a far cry from the situation at Yucca Mountain, where Congress, rather than any scientific board, chose the site. Local New Mexican politicians, accustomed to potash mines, oil and gas wells, and nearby nuclear test sites, have even been consistently supportive.

    In spite of the rigorous selection process, WIPP managers soon began to encounter one scientific surprise after another. The site had seemed straightforward enough geologically: beneath the arid scrub-covered surface lies about 300 meters of clay, silt, and sedimentary rock. Below that, there's 600 meters of salt, undisturbed for 250 million years. The idea was simply to dig shafts down to about 655 meters, carve out rooms, stash the drums of wastes there, and fill it back up. Over the centuries, the drums may corrode and be crushed, but the enclosing salt was expected to securely entomb the wastes.

    Complications, however, began popping up rapidly. When engineers drilled on the proposed site in 1975, they found the “flat” salt layer to be so disturbed that it was tilted to near vertical, suggesting that the geology was surprisingly complex. Worse, below the salt, the drill hit a pocket of pressurized brine that surged to the surface. If that happened while wastes were stored there, some radioactive material would be sure to escape.

    That setback presented Weart and his team with a dilemma: Should they go public and risk an immediate backlash, or keep mum and risk an even bigger outcry later? DOE had tended to be secretive, a legacy of weapons-related research, but in this case Weart, who worked for DOE's scientific adviser, Sandia National Laboratories, decided it was time to break that tradition. “My first job in addressing the public community was to tell them ‘we don't have a suitable site here,’” he says. “We came right out and told the public unless we can find an alternative, there's no place for a repository” in the Delaware Basin salt formation. That openness proved to be a plus, says Weart; “it probably helped our credibility.” Geologist Roger Anderson of the University of New Mexico, Albuquerque, a longtime critic of WIPP, doesn't recall DOE being a paragon of openness but concedes that over time, “their skills at dealing with the public have increased enormously.”

    The site was relocated to another part of the salt beds, but the project team soon got a dose of déjà vu: A 1981 bore hole at the new site again brought up pressurized brine. And other problems arose in quick succession, as critics such as Anderson raised the specter of natural processes that could disrupt the repository. Anderson charged that groundwater would eat into the repository from the west. Later, once shafts and tunnels were dug, any freshly exposed surface oozed briny water from the “dry” salt. That could accelerate decomposition of the wastes and greatly complicate predicting their behavior. The problems “did cause us a great deal of visibility in the press,” recalls Weart, who says he continued to be open about the project's ups and downs.

    Watching over DOE

    One reason the project was able to weather these storms is that its task was made easier in 1979: Congress decreed that WIPP would handle only the lower-level TRU wastes. Scientists agree that the high temperatures created by the “hotter” wastes could draw water from the salt and increase the risk of leakage, making WIPP unsuitable for those wastes.

    Even so, the project might not have succeeded as a repository for any wastes at all if not for the creation in 1978 of an independent technical oversight body, the New Mexico Environmental Evaluation Group. EEG studies—and questions—all DOE reports, visits the site, often with critics, and generally looks over DOE's shoulder regarding every major decision. EEG works “on behalf of the state, but we are not part of the state” political system, says longtime EEG geologist Lokesh Chaturvedi. “That has given us a great deal of credibility.” Weart agrees. “Although at times DOE considers [EEG] to be an annoyance,” he says, on balance he's found the group's badgering to be helpful. And EEG experts say they have been able to play a key role mediating between DOE and outside critics.

    When the 1981 well hit brine, for example, EEG suggested that DOE do detailed hydrological analyses, at a cost of several million dollars. That showed that the brine came from an isolated, contained reservoir, which could be avoided by simply relocating the site elsewhere in the beds. Without that discovery, “the project would have died,” says Chaturvedi. EEG also helped resolve other problems, for example, by getting DOE to drill another deep hole, showing that Anderson's dissolution threat wasn't real.

    That kind of oversight—from the EEG and the NRC's WIPP Committee as well as eight external peer reviews—has proven crucial, says current WIPP Committee chair John Garrick, a risk assessment consultant in Newport Beach, California. “This was such a pioneering effort dealing with a controversial issue,” he says, “that the oversight may have made as much of a contribution to certification as the good work done by the DOE.” Anderson, who is not part of either lawsuit, says he still has some concerns, notably the pressurized brine problem and the effect of climate change on the permeability of the overlying rock. But he agrees that “a lot was found out about the site because the EEG has been watchdogging it.”

    Meanwhile, the WIPP managers were broadly investigating the site, pursuing 116 different scientific studies, many of them on basic questions. But scientific curiosity doesn't necessarily lead to regulatory compliance. “How much science is enough?” asks Leif Eriksson, director of GRAM Inc., a DOE contractor in Albuquerque. “There has to be a cutoff. From a scientist's perspective, there's always going to be things he's interested in and wants to know about.”


    To focus the WIPP effort, DOE requested in 1994 that Sandia determine what was known about WIPP and what was needed to comply with Environmental Protection Agency (EPA) regulations. Studies that helped fill the knowledge gap—such as those on shaft seal design and water flow immediately above the salt—went forward, and those that wouldn't make much difference were stopped. “We accelerated the schedule for compliance [by] several years” and saved about a third of a billion dollars, says Les Shephard, WIPP project manager at Sandia at the time.

    It worked. In May of last year, after 25 years of study, the EPA certified that there is a “reasonable expectation” that WIPP will contain all but a tiny fraction of its TRU wastes for the next 10,000 years. No natural process is likely to disrupt the isolation provided by the enclosing salt for tens of thousands if not millions of years, according to the DOE analysis. In the end, after countless calculations of fluid flow, radionuclide transport, and human exposure, DOE concluded that the most likely exposure route is through holes drilled for oil millennia hence, when the presence of the repository may be forgotten. Even then, according to DOE calculations, the maximum annual radiation dose to an individual would be 32 times lower than the EPA limit and 768 times less than the average natural background radiation in the United States.

    Nevertheless, not everyone is ready to accept WIPP. Environmental groups, including the Natural Resources Defense Council and the Environmental Defense Fund, have joined with the New Mexico state attorney general to sue EPA and DOE to stop the project. They charge that EPA hasn't realistically evaluated the threat from future drilling and that DOE hasn't fulfilled all legal requirements concerning transfer of the land and appointment of a regulator. In the DOE case, an injunction now prevents waste from being stored in WIPP. But oral arguments on lifting that injunction start this week in the U.S. District Court in Washington, D.C. If DOE wins quickly, WIPP could begin accepting waste by the end of the month. This, finally, should be it, says Weart. “There's a process laid out by Congress that has been followed religiously,” he says. “We do have a robust repository.”


    Yucca Mountain: A Hotter Case to Handle

    1. Richard A. Kerr

    As the world's first approved radwaste site prepares to open for business in New Mexico (see main text), another high-profile proposed repository, Yucca Mountain in Nevada, faces continued obstacles on the road to successful licensing. Part of the difference is scientific—Yucca Mountain is slated to accept more highly radioactive wastes, and the mountain's geology “is just a lot more complicated,” says geologist Rodney Ewing of the University of Michigan, Ann Arbor. But part of the Yucca Mountain site's difficulties also lie in its troubled political history.

    From day one, the Yucca Mountain repository was a political creation. Whereas the New Mexico site, known as the Waste Isolation Pilot Plant (WIPP), was the winner in a scientific search for a place to store waste from weapons production, Yucca Mountain was the loser in a political football game. It was one of many sites under consideration as a permanent home for high-level wastes, until Congress in 1987 simply designated the long ridge adjacent to the Nevada Test Site as the sole place to be studied. That abrupt act did not sit well with Nevadans, who have always adamantly opposed the repository. And although Yucca Mountain, like WIPP, has a technical oversight group, there's a crucial difference: The Nevada panel is part of the governor's office and is a predictably staunch opponent of the repository, while the New Mexican scientific advisory group is politically and scientifically independent.

    The task set at Yucca Mountain is also far more ambitious than at WIPP. The Nevada site is to store highly radioactive and thermally hot spent fuel rods from nuclear power plants. When the repository leaks, as scientists agree it eventually will, those wastes will be a far larger source of radiation than the contaminated clothing and rags inside WIPP. And the heat from these highly radioactive wastes could have unpredictable effects on both the waste itself and the enclosing rock.

    What's more, the rock of Yucca Mountain is riven with innumerable cracks that let rainwater ooze through to the proposed repository 300 meters below the mountaintop. Tests in 1996 revealed that water was flowing to the repository 10 to 30 times faster than estimated. That means more water to help break down the waste containers and carry the radioactivity into aquifers.

    “Things are not as simple as we thought,” says Kevin Crowley, director of the National Research Council's Board on Radioactive Waste Management. “A lot of these [geologic] processes are proving very hard to characterize and measure. WIPP is a more homogeneous system whose behavior is a lot easier to predict.”

    To make the situation at Yucca Mountain more predictable, the Department of Energy (DOE) has begun to emphasize artificial barriers—enclosing wastes in layers of metal and minerals—whose behavior can be thoroughly studied in the laboratory. “With the natural system,” says Crowley, “you have to take what nature gives you. Engineered barriers simplify the problem.” Even so, a blue-ribbon panel recently criticized DOE for some sizable holes in its calculations of waste release rates through the combined engineered and natural barriers of the current repository design (Science, 26 February, p. 1235). WIPP's accomplishments are good news for the radwaste community, but there's no guarantee that Yucca Mountain will be able to follow the same path to success.

    KOREA

    Subsidy Helps New Grads Find First Science Jobs

    1. Michael Baker*
    1. Michael Baker writes from Seoul.

    New government program gives thousands of students with advanced degrees a temporary spot in an economy trying to rebound

    SEOUL, KOREA— Kang Hae Won, with a freshly minted Ph.D. in nutrition from New York University, confronted a bleak job market when she rejoined her husband in Korea in early 1998. Along with some 2500 Korean students who received advanced degrees last year in science and engineering, she was seeking her first research job just as the country's economy took a nose dive and thousands of workers were being laid off. Unable to find a paid position, she signed on as an assistant to biochemist Park Tae Sun of Yonsei University in Seoul for a project that had not yet been funded. In spite of her tenuous status, Kang considered herself fortunate to be working in her field on a project that offered a chance for publication.

    Several months later, her financial situation also improved. In September, she was accepted into the first wave of the government's new Research Intern Program, which places new graduates in scientific positions and subsidizes their pay for up to 1 year. The program is intended to carry them through tough times, keep their skills sharp, and stock universities and institutes with young talent.

    The intern program, which pays $870 a month to Ph.D.s and $700 to those with master's degrees, is the largest of three recent initiatives for scientists and engineers. Unemployed but experienced scientists are being dispatched to small and medium-sized companies as part of a “Science and Technology Corps” that keeps them active, as well as tapping into their expertise. Other jobless scientists work for similar pay on displays for the National Science Museum.

    The government hopes to enroll 5000 scientists and engineers over 2 years in these programs. But the demand is even greater. Last year, the number of positions for scientists at public laboratories in Korea dropped by 8.7%, and private companies trimmed their R&D payrolls by 6%. With another 2500 graduating with advanced degrees this year, the employment picture remains grim. “The programs are too small for all those people,” says Chung Sung Chul at the Science and Technology Policy Institute in Seoul. “There is enormous pressure for jobs.”

    The intern program is run by the Korea Science and Engineering Foundation (KOSEF), which matches job seekers to vacancies. Kang and Park asked to be paired, but graduates without any institutional links can apply individually to KOSEF and wait for a match. From a pool of 1800 applicants, all with degrees received in the last 2 years, the Research Intern Program so far has placed 1300, about half of them at universities. About 40% are engineers, in line with the discipline's overall share of a typical graduating class, followed in descending order by agriculture and fisheries science, biology, chemistry, physics, geoscience, and mathematics.

    Although the program is not a perfect solution—the fit isn't always right, interns must keep one eye out for their next job, and their short tenure limits the type of work that can be done—it offers a life preserver to recent graduates floundering in a suddenly troubled job market. “I never realized the real world would be so different from my ideal workplace,” says Lee Sung Joon, who graduated last year with a master's in computer graphics. “When I was turned down again and again, I was quite devastated.”

    Lee is now working at Seoul's Gunzamorey Computer Co., a small company normally unable to hire those with advanced degrees. Gunzamorey handed Lee a project—to develop a marketable device for converting digital video signals to analog—that would normally be assigned to a well-trained team. Lee's formal education doesn't help much, but he has responded to the challenge by assigning himself lots of supplementary reading and routinely logging 12-hour days.

    Being an intern isn't the same as holding a permanent job, of course. Some interns must struggle to win the respect of their peers. “People see interns as useless people—just another part-time worker rather than a colleague,” complains one participant, who requested anonymity. And some senior scientists say the intern program doesn't allow enough time for substantive research that can generate publishable results. Cho Yoon Hae, a scientist conducting protein-structure research at the Korea Institute of Science and Technology, says his lab, which has three interns, spent the first 3 months integrating the new arrivals into the routine. Others wish that research expenses were included in the program.

    Interns who have promised to work in Korea in exchange for government funding of their graduate education face a particularly tough future in an economy where domestic jobs are scarce. Geoscientist Lee Yong Joon, a recent graduate of Texas A&M University, has the scientific and language skills to compete for a job overseas, but he owes the government 3 years in Korea in return for his training. “Yong Joon is a good scholar and researcher,” says Lee Jin Han, his project supervisor at Korea University. Unfortunately, the school's budget may be too tight to hire him after the intern subsidy is up. In the meantime, Lee Yong Joon is making a contribution by peering at hyper-thin slices of rock, searching for signs of a new fault in central Korea.

    Some of the biggest beneficiaries of the new programs are companies needing help with shop-floor issues or in the lab. The Science and Technology Corps dispatches experienced teams or individuals to provide technical assistance. At Kumho Life and Environmental Science Lab in Kwangju, for example, 10 of its 55 workers are paid by the program and work alongside full-time employees to crystallize proteins, breed transgenic plants, and conduct studies on environmental stress signals. The Corps employees have appropriate backgrounds and learn quickly, says Song Pil Soon, one of Kumho's principal investigators. Song would like to keep them on when the government funding expires, but Kumho, like many companies in Korea these days, is under a hiring freeze.

    Despite these obstacles, a KOSEF survey found that 90% of institutions and interns are satisfied with the program, and that 75% believe the interns stand a good chance of finding a permanent job after the government subsidy ends. “I'm very happy and grateful to the government,” says Kang, whose 1-year appointment runs through August. “And I'm sure that I will find a [permanent] position when the economy improves.”


    Watching DNA at Work

    1. Robert F. Service

    Few biomolecules can claim to be as well studied as DNA. Ever since James Watson and Francis Crick unraveled the structure of the double helix at Cambridge University in 1953, researchers have spent billions of dollars and untold sleepless nights trying to understand and manipulate the molecule of life. They've sequenced entire genomes for more than 20 organisms, and the human sequence is expected to succumb in the next few years. But in virtually all these studies, researchers examine the collective, herdlike behavior of many thousands of copies of particular DNA fragments. Now, in a handful of biophysics labs around the world, a quiet revolution is under way: Researchers are getting a fix on the behavior of single DNA strands.

    Just as an ecologist uses radio collars to track the movements of individual animals, these researchers are using tools such as lasers and magnets to gain a wealth of new insights into how DNA twists, turns, and stretches. In the very latest work in this rapidly emerging field, investigators are going beyond the simple mechanics of the DNA molecule to get an intimate look at the way different proteins work to cut, copy, and splice it. Such studies can reveal, for example, the precise force that a single protein-based motor exerts as it crawls along a DNA strand translating the code into RNA. These intricate observations even show that at some points the motor works fast and efficiently, and at other times it stalls out.

    “By looking at single molecules, you can answer questions you can't possibly address with the broad-brushed approach of traditional biochemistry,” says Princeton University biophysicist Steven Block. Such studies, Block adds, “have opened up these whole new areas of micromolecular mechanics,” the study of molecule-sized machines. By engineering the genes for these protein machines, researchers will be able to see how alterations to their structures affect their operation. This will “allow you to study the molecular mechanisms that lead to motion,” says Block. Adds Jeff Gelles, a biochemist at Brandeis University in Waltham, Massachusetts, “I think that virtually all the DNA enzymes will be studied in this manner.”

    DNA and its companions aren't the only molecules being pursued by single-molecule researchers. Chemists are using the approach to track changes to a variety of individual molecules. But DNA and its workmate proteins are generating interest in part because of DNA's central role as the cell's database, as well as more practical considerations. Among them are DNA's titanic size (the largest human chromosomes span a whopping 9 centimeters when fully extended) and its rugged nature. “DNA is a very robust molecule to work with, so it's a good one to start on,” says Vincent Croquette, a biophysicist at the Ecole Normale Supérieure (ENS) in Paris.

    The key to manipulating DNA was the development of optical traps, which use tightly focused laser beams to snag tiny plastic spheres and other particles, as well as related techniques for manipulating tiny objects. By linking spheres to either end of a DNA strand and manipulating the laser beams, researchers can test the mechanics of the molecule, in essence seeing how much force DNA can withstand when stretched and twisted. Through the mid-1990s, teams found that the molecule, when pulled apart at the ends, initially stretches like a rubber band. Pull harder and the double helix actually unwinds somewhat, stretching the strand even further before breaking altogether. While such knowledge may seem arcane, it's turning out that DNA's mechanical properties are critical to the way some enzymes work with it.

    In one recent study, for example, Croquette, David Bensimon, and their ENS colleagues revealed the mechanical details of a new type of coiling pattern that is thought to occur when DNA is transcribed to RNA in the cell nucleus. The researchers attached one end of a DNA strand to a magnetic bead and the other to a glass slide. Then they applied a magnetic field to rotate the bead, twisting the helical strand so that it coiled back on itself, similar to the way a helically coiled telephone cord can become twisted. When they altered the magnetic field to pull back on the bead, they found that only a small applied force creates a new type of structure—essentially turning the DNA helix inside out—that has previously been shown to occur when the enzyme RNA polymerase (RNAP) transcribes DNA into RNA. This suggests—although it has yet to be confirmed—that the enzyme induces the awkward twist to copy a DNA strand more easily.

    Single-molecule studies are leading to other insights into the way RNAP manipulates DNA. In the 30 October 1998 issue of Science, for example, a team led by Michelle Wang, a former postdoc in Block's lab at Princeton, reported that they had managed to gauge the force that a single RNAP molecule can exert as it crawls along a DNA strand, churning out an RNA version of the DNA code in the process. To do so, the team “set up the world's tiniest tug-of-war and followed it,” says Block.

    First they anchored an RNAP molecule to a glass slide. They then connected one end of a piece of double-stranded DNA to a plastic bead, which they could move back and forth with a laser-based optical tweezer, and fed the other end to the RNAP. The enzyme normally walks along the DNA strand, unzipping the pair of single strands in the double helix, reading the DNA code from one strand, and assembling RNA versions of each base pair. In this case, however, because the RNAP was fixed in place, it was forced to pull the DNA past it like a sailor hauling in a jib sheet. When the strand was pulled taut, the researchers were able to tug on the bead and determine that RNAP can pull with a whopping 25 piconewtons of force—four times that exerted by myosin, the protein that contracts muscles. “RNA polymerase is the most powerful mechanoenzyme yet studied,” says Block.
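    The force readout in such a tug-of-war comes from the trap itself: near its center, an optical trap acts on the bead like a linear spring, so the force is simply the trap stiffness times the bead's displacement from the trap center. A back-of-the-envelope sketch of that relation, using a purely illustrative stiffness value (the article does not give the experiment's actual calibration):

```python
# An optical trap near its center behaves like a Hookean spring:
# F = k * x, where k is the trap stiffness (calibrated separately)
# and x is the bead's displacement from the trap center.
# The stiffness below is illustrative, not from the experiment.

def trap_force_pN(stiffness_pN_per_nm: float, displacement_nm: float) -> float:
    """Force on a trapped bead, in piconewtons."""
    return stiffness_pN_per_nm * displacement_nm

# Example: a 0.25 pN/nm trap with the bead displaced 100 nm
# resists with 25 pN -- the stall force reported for RNAP,
# about four times the force the article attributes to myosin.
print(trap_force_pN(0.25, 100.0))  # 25.0
```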

    But it's not the only one coming under close scrutiny. At the Biophysical Society meeting last month in Baltimore, Maryland, Block's postdoc Thomas Perkins reported that the Princeton team had succeeded in performing an experiment similar to their “RNAP tug-of-war” with a completely different type of motor protein called lambda exonuclease. This enzyme—part of a bacteria-infecting virus—chews up one of DNA's double-helical strands, leaving behind just a single strand.

    In this case, when the researchers set up their tiny force meter they found that the motor was capable of exerting at least 5 piconewtons of force. However, unlike RNAP and other molecular motors, which are powered by the ubiquitous cellular fuel adenosine triphosphate, lambda exonuclease is fueled directly by the energy liberated in DNA's broken bonds. The new experiment, says Perkins, now allows researchers to explore just how many DNA bases lambda exonuclease can clip in one stretch and whether this occurs in a steady or sporadic fashion, information that is likely to be crucial for researchers working to sequence single strands of DNA (see sidebar).

    Still other examples of fantastic minimotors that manipulate DNA continue to emerge nearly every month. At the University of California, Berkeley, for example, Carlos Bustamante and his colleagues are using single-molecule techniques to study the complex way in which the cell manages to pack up to a meter's worth of DNA into its nucleus, just a few millionths of a meter across. At the heart of this packing process are chromatins, assemblies of DNA wrapped around a series of proteins. “The structure of chromatin is of central importance to gene expression,” says Bustamante, because that structure determines which genes are readily accessible to other proteins called transcription factors that turn on gene expression. So Bustamante's team is pushing and pulling on single chromatins in an effort to learn about the forces that hold these collections together.

    Meanwhile, at the ENS, Bensimon, Croquette, and their colleagues are unraveling the mysteries behind an enzyme called topoisomerase, which unties knots that form in DNA as it is unpacked in the nucleus and copied during cell division. Understanding how this motor works, says Bensimon, could prompt the development of novel cancer drugs capable of blocking the enzyme and thereby preventing cancer cells from replicating.

    Even these examples are just the beginning. “It's a field that is only just starting,” says Bensimon. Groups around the globe are already gearing up to take a look at a wealth of other enzymes, such as transcription factors, as well as helicases and gyrases, which help pack and unpack DNA coils within the nucleus. With single-molecule biophysics, says Bensimon, “there is a lot of molecular machinery in the cell that is now open to study.”


    Deconstructing DNA for Faster Sequencing

    1. Robert F. Service

    The international race to sequence the human genome has turned gene sequencing into a high-speed—and high-profile—endeavor. At its heart are machines that create thousands of copies of DNA fragments as a first step toward decoding the sequence, one nucleotide base pair at a time. But a handful of groups around the world are working to steal a bit of the sequencing limelight with new approaches that decode single copies of DNA. Because single-molecule sequencing has the potential to speed up the sequencing process, “[it] could become very important,” says Jay Trautman, one of the technique's pioneers at the biotech company Praelux—formerly Seq—in Lawrenceville, New Jersey.

    The push for the technology comes from the fact that current DNA decoding schemes sequence relatively short stretches of DNA, each about 1000 base pairs long. Researchers wanting to sequence a gene containing, say, 100,000 base pairs must sequence overlapping fragments of the gene and then use complex computer programs to reassemble the pieces in the right order. Single-molecule sequencers, in contrast, hope to sequence DNA segments as long as 50,000 base pairs, which would simplify and speed up the task of putting the puzzle back together.
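    The reassembly step those computer programs perform can be pictured with a toy greedy merge: repeatedly find the pair of fragments whose suffix and prefix overlap most, fuse them, and continue until one sequence remains. This is only a sketch of the general idea, not any sequencing group's actual software:

```python
# Toy greedy assembler: merge fragments on their longest
# suffix/prefix overlaps until a single sequence remains.

def overlap(a: str, b: str) -> int:
    """Length of the longest suffix of a that is a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def assemble(fragments: list) -> str:
    frags = list(fragments)
    while len(frags) > 1:
        # Find the ordered pair with the largest overlap and merge it.
        n, i, j = max(((overlap(a, b), i, j)
                       for i, a in enumerate(frags)
                       for j, b in enumerate(frags) if i != j),
                      key=lambda t: t[0])
        merged = frags[i] + frags[j][n:]
        frags = [f for k, f in enumerate(frags) if k not in (i, j)]
        frags.append(merged)
    return frags[0]

reads = ["ATGGCA", "GCATTC", "TTCGGA"]
print(assemble(reads))  # ATGGCATTCGGA
```

Real assemblers must also cope with sequencing errors, repeats, and unknown fragment orientation, which is what makes long single-molecule reads so attractive: fewer, longer pieces mean a far simpler puzzle.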

    The key to the technique's long sequencing lengths is a protein called an exonuclease, which degrades DNA by munching its way through tens of thousands of individual bases, one by one, like a molecular Pac Man. In theory, once each base is clipped off, researchers can channel it away and identify it with a laser-based detector.

    But spying single bases isn't a simple task. The most common approach—being pursued by chemical physicist Richard Keller's team at Los Alamos National Laboratory and other groups at the Max Planck Institute for Biophysical Chemistry in Göttingen, Germany, and the Karolinska Institute in Stockholm, Sweden—is to initially tag each of the four types of bases in DNA with its own fluorescent compound. The researchers then fix one end of the DNA strand to a tiny plastic bead that is then held steady with a laser inside a tiny flow chamber. An exonuclease on the other end of the DNA clips off bases from one of DNA's two strands, which then flow past another laser, revealing their presence with a bright flash.

    At the Photonics West meeting in San Jose, California, in January, Keller's team reported successfully detecting and identifying individual labeled nucleotides cleaved from DNA. But for now they're still plagued by false-positive signals: The fluorescent dye compounds stick to the beads when the DNA strands are attached and later wash past the laser at a rate of about three per second. The exonuclease, meanwhile, only clips one to five bases off the strand every second. “We're optimistic” that the technique will eventually pan out, says Keller. “But we're not there yet.”

    In search of faster sequencing speeds, Trautman and his colleagues are trying to accomplish the job on “native” DNA. Without tags in the way, exonuclease can clip about 12 bases a second, says Tom Perkins, Trautman's collaborator at Princeton University. But it's trickier to detect untagged bases. Though still in the early stages, the Praelux team is working on a way to repeatedly drag a DNA strand across a surface, clip bases off with the exonuclease, and fix them to the surface where they can be chemically modified for easy identification.

    Even if the novel sequencing approaches don't pan out quickly, single-molecule DNA technologies are still likely to bear fruit. The technique, for example, has already shown promise for DNA fingerprinting, which could speed forensic testing in the field and help detect potential bioweapons threats. One way or another, expect to see single-molecule DNA studies grabbing a little limelight.
