News this Week

Science  27 Aug 1999:
Vol. 285, Issue 5432, pp. 1334
  1. EARTHQUAKES

    In Turkey, Havoc From a Falling-Domino Fault

    1. Tim Appenzeller

    There were plenty of cruel surprises when the magnitude 7.4 earthquake struck northwestern Turkey early last week. Apartment houses and hotels collapsed when shaky soil turned to jelly and shoddy concrete crumbled. The rescue effort was slow and chaotic. By the end of the week, the death toll was rising into the tens of thousands and recriminations were flying. But one aspect of the disaster offered no surprise at all: the behavior of the ultimate culprit, the North Anatolian fault. Although seismologists could not have predicted when the earthquake would strike, they had long expected it.

    Last week's cataclysm was just the latest in a series of massive earthquakes that have marched along 1000 kilometers of the fault since 1939. The sequence began that year in eastern Turkey with a magnitude 7.9 earthquake that killed more than 20,000, then moved westward in six other major events. “This is probably the most spectacular example of earthquake progression known,” says Ross Stein of the U.S. Geological Survey (USGS) in Menlo Park, California. The most recent quake in the series had been a magnitude 7.0 that struck in 1967. The quiet stretch of fault to the west, passing near the port of Izmit, “was the obvious candidate” for a future earthquake, says Nicholas Ambraseys of Imperial College London.

    Such knowledge wasn't much help to the residents of the area when the quake finally came, in the wee hours of 17 August. Even though the dangers of the fault had been well publicized and decent construction standards were on the books, “there was a total absence of enforcement of the regulations,” says Ambraseys. As rescuers tried to free people trapped in the wreckage, geologists took to the field to map the rupture, which had started 17 kilometers belowground near Izmit, 100 kilometers east of Istanbul. By late last week, Aykut Barka of Istanbul Technical University and his colleagues had traced the break for more than 100 kilometers and measured up to 4 meters of ground displacement along the fault. The eastern end of the rupture overlaps the site of the 1967 earthquake, says Ambraseys. “It extends the earthquake sequence to 2000.”

    The ultimate impetus for the earthquakes is the collision of two tectonic plates carrying the Arabian Peninsula and Eurasia. Caught in between, Turkey is being squeezed to the west like an orange seed, geophysicists say. The slip takes place along two major faults, the North Anatolian and the East Anatolian, which bracket the country to the north and south. Historical records show that both faults have experienced periodic clusters of large earthquakes, says Ambraseys, with clusters flip-flopping between the faults every century or two. The previous major cluster on the North Anatolian fault struck in the late 1600s and early 1700s, flattening Izmit at least once.

    And something about the fault—perhaps the way it is broken into a series of segments—makes it uniquely prone to orderly earthquake sequences. “We shouldn't get too comfortable with the falling-domino character of this fault, because it's not the way most other faults work,” says Stein. But 3 years ago he, Barka, and James Dieterich of the USGS tried to explain how the dominoes fall. They calculated how each earthquake, while relieving stress along one segment of the fault, transfers stress to the next, unbroken segment. Because of the 1939–67 earthquakes, “we suggested that there was an increased hazard in the region of Izmit,” says Dieterich. The calculated risk, high right after the 1967 earthquake, had dwindled over time, however. He and his colleagues put it at 12% in the next 30 years —“not a huge effect by any means,” Dieterich says.
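
    The stress-transfer reasoning can be summarized with the standard Coulomb failure criterion, given here as a textbook formulation for orientation rather than as the USGS team's exact model:

        \Delta \mathrm{CFS} \;=\; \Delta\tau \;+\; \mu' \,\Delta\sigma_n

    where \Delta\tau is the change in shear stress resolved onto the neighboring, unbroken segment, \Delta\sigma_n is the change in normal stress (positive for unclamping), and \mu' is an effective friction coefficient. A positive \Delta\mathrm{CFS} nudges the segment toward failure; in rate-and-state treatments such as Dieterich's, the resulting boost to earthquake probability is largest just after the triggering shock and decays with time, which is consistent with the dwindling hazard estimate described above.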

    Other researchers viewed the fault segment as dangerous simply because it was a seismic gap, where the stresses of plate motion had not been released by a recent earthquake. In theory, such stresses can be dissipated slowly and harmlessly, by fault “creep.” But Massachusetts Institute of Technology seismologist M. Nafi Toksöz and colleagues in Turkey monitored the Izmit section of fault using the satellite-based Global Positioning System (GPS), which is sensitive to the tiny ground movements that might indicate creep. Two years ago, he says, they raised the alarm: “Our GPS result showed the fault was truly locked, and stress was accumulating.”

    Toksöz and others note that in spite of these warning signs, no one knew exactly when an earthquake might strike, or whether it would be a single large event or a series of smaller, less destructive tremors. “What's unclear to me, and probably never will be clear, is why it waited 30 years” after the 1967 quake, says Stein. Even so, seismologists are already speculating about the North Anatolian fault's next move. As James Dewey of the USGS's National Earthquake Information Center in Golden, Colorado, puts it, “Is this the end of the sequence, or does it go on from here?”

    The fault continues west from the ruptured area, diving under the Sea of Marmara and passing within 30 kilometers of Istanbul. “If there's any stress transfer mechanism, it's scary,” says Toksöz. “We are convinced that the next segment is locked, it's a seismic gap, and it's sitting right next to Istanbul, with 12 million people.” Toksöz is planning to extend his network of GPS stations to this next fault segment, to monitor any buildup of stress that might indicate that the seismic activity on the North Anatolian fault is about to take another mighty step to the west.

  2. PALEOANTHROPOLOGY

    Kenyan Skeleton Shakes Ape Family Tree

    1. Carl Zimmer*
    1. Carl Zimmer is the author of At the Water's Edge.

    In the summer of 1993, fossil hunter Boniface Kimeu walked down a slope in the Tugen Hills of north central Kenya and noticed a single tooth sticking out of a wall of rock. He returned with other researchers from the Baringo Paleontological Research Project, who worked their way deeper into the rock and discovered a jaw with more teeth, plus bones from the spine, rib cage, arms, wrists, and hands. Kimeu had found the most complete ape fossil known from about 11 million to 16 million years ago—a crucial transition time when primitive apes looking something like howler monkeys were evolving into the ancestors of the living great apes, including humans. Now, after 6 years of preparation and study, a research team presents the find on page 1382 of this issue. Paleontologists say it shines light into an extremely murky phase of ape evolution, forcing researchers to reexamine the family tree of our distant ancestors and offering a glimpse of new connections across continents.

    Initially the research team, led by paleoanthropologist Steve Ward of the Northeastern Ohio Universities College of Medicine in Rootstown, Ohio, thought the skeleton belonged to a 15-million-year-old primate called Kenyapithecus, a controversial genus once considered the ancestor of modern apes. But closer analysis proved this idea only partly correct. The team now argues that the new fossils, as well as some previously collected specimens of Kenyapithecus, are actually so different from the original Kenyapithecus fossils that they belong in an entirely new genus. The new primate, which the team calls Equatorius, is not a close relative of living apes after all, but it does record apes' first steps down from the trees—a crucial evolutionary step that our own ancestors must have taken independently. And the reclassification suggests that Kenyapithecus was part of a great migration of apes out of Africa. The new find is “extremely important,” says David Begun, a paleoanthropologist at the University of Toronto. Adds paleoanthropologist Peter Andrews of The Natural History Museum in London: “It shows general evolutionary patterns in Africa and a migration between Africa and Europe and Asia. That's a lot from just one new name.”

    Kenyapithecus was first discovered by pioneering paleontologist Louis Leakey in 1961. At a site called Fort Ternan in western Kenya, he found an upper jaw and a few teeth dating back 14 million years, which he dubbed Kenyapithecus wickeri. Then in 1965 he sifted through bits of jaw and teeth from Maboko Island in Lake Victoria, 100 kilometers away, and decided that they represented a second species, K. africanus, which paleontologists now date back to about 15.5 million years ago. Impressed by its modern-looking teeth, Leakey declared Kenyapithecus to be “a very early ancestor of man himself.”

    Since then, as paleontologists have gathered more data on the evolution of apes—the lineage of large primates that includes humans, chimps, gorillas, and orangutans—they have argued about where Kenyapithecus fits into the picture. The first apes seem to have arisen from monkeylike primates in African forests more than 20 million years ago, and by 10 million years ago they had blossomed into a huge radiation reaching across Europe and much of Asia. But sometime in the next 10 million years, almost all went extinct.

    Kenyapithecus, with its scant fossil record but intriguing teeth, has been cast in many different roles in this story. A few researchers stand by Leakey's original idea (Science, 18 April 1997, p. 355), but many consider Kenyapithecus to be more primitive. They see our own roots in later European apes, which might have evolved from Kenyapithecus.

    The new fossil suggests that both sides might be right, because it shows that Kenyapithecus is not one genus but two. Ward recognized the split thanks to the nearly complete set of teeth of the new find, particularly the canines and incisors, which set it apart from K. wickeri. “That was the catalyst that caused us to carefully reassess everything else, and everything else just about fell into place,” says Ward. When they looked with fresh eyes at the specimens assigned to Kenyapithecus, they realized that the new fossil and other K. africanus material lacked many of the distinctive features of K. wickeri, including details of the canines. They conclude that K. africanus and K. wickeri were profoundly different apes, belonging to two separate genera. “Everything we tried kept pointing in the same direction,” says Ward.

    In their report, the researchers therefore rechristen Kenyapithecus africanus as Equatorius africanus, so named because all known specimens come from near the equator. This was the earliest known ape to occasionally leave the treetops for the ground, about 15 million years ago when the dense African rainforests began to turn into a more open woodland, says Jay Kelley, a co-author from the University of Illinois, Chicago.

    The new fossils, together with previously collected specimens, give a clearer picture of the functional changes made by this pioneer. In Ward's words, it was “an animal about the size of a big adult male baboon, an animal whose arms and legs were about equivalent length, with a long, flexible vertebral column and powerful grasping hands and feet. We're dealing with an animal that spent considerable time on the ground but also used the trees a great deal.”

    Meanwhile, K. wickeri shows tantalizing similarities with modern apes. “We're looking at Equatorius on one side of the divide and wickeri on the other,” says Ward. His team sees links in K. wickeri to Eurasian fossils, in particular, a still-unnamed 14-million-year-old ape found at a site called Paşalar in Turkey. That link may mean that K. wickeri was “a participant in the [early] radiation out of Africa,” says Ward.

    This new view of Kenyapithecus has “very important implications for the whole picture of [ape] evolution,” says Carol Ward (no relation to Steve) of the University of Missouri, Columbia. Comparing two genera allows researchers to study the arrow of evolution from primitive to derived traits, she says.

    But not everyone agrees with all of Steve Ward and colleagues' interpretations. Begun, for example, thinks that Equatorius, rather than Kenyapithecus, may resemble the ancestral ape that migrated out of Africa. He is now preparing a report on a 16-million-year-old German fossil of an ape called Griphopithecus, which looks much like Equatorius. That would suggest that the first ape to migrate out of Africa was a much more primitive, earlier branch. “It's really Equatorius that shows the earliest connection [to Europe],” argues Begun.

    Meanwhile, the crucial question of which ape made it through the Miocene to give rise to the living great apes and humans remains a mystery. Indeed, by erasing K. africanus, Ward and his colleagues have reduced one contender, Kenyapithecus, to little more than the handful of teeth that Leakey found at Fort Ternan. “It's tempting and tantalizing to think of Kenyapithecus as an early member within the great ape clade, but we really can't say that at this point. The material at Fort Ternan is just too limited,” says Kelley. For that, paleontologists will have to wait for Kimeu or some other sharp-eyed fossil hunter to find more complete fossils, skeletons whose parts may eventually let them make sense of the whole.

  3. RESEARCH RISKS

    California Probes Prison Teens Study

    1. Marcia Barinaga

    BERKELEY, CALIFORNIA—A study in California of an antiviolence drug given to incarcerated teenagers, which was recently hailed as a model for work with such a rarely studied population, is now under attack for possibly violating laws to protect inmates involved in medical research. The investigation centers on whether the research met legal requirements that all inmates in a study have a reasonable chance of benefiting from their participation.

    In 1996 Stanford University psychiatrist Hans Steiner set out to measure whether divalproex sodium (Depakote), an antiepilepsy drug that is already widely used to treat violence in teenagers, is actually effective for that use. Steiner conducted the study at the O. H. Close Youth Correctional Facility in Stockton, where he has worked with youths for 15 years. Seventy teenaged boys whose aggressive, violent behavior met the criteria for a psychiatric condition called conduct disorder were divided into two groups and given different doses of the drug for 7 weeks. Steiner says boys in both groups showed a “reduction of distress” during the trial, and the high-dose group had a moderately increased ability to control violent urges. The study was funded by Abbott Laboratories and the California Youth Authority (CYA), which runs the facility.

    Studies like Steiner's, in which drugs are tested on teenaged inmates, are “very, very rare,” says Markus Kruesi, a child and adolescent psychiatrist at the Institute for Juvenile Research of the University of Illinois, Chicago. Kruesi hailed Steiner as a pioneer last fall when he introduced Steiner's work at the annual meeting of the American Academy of Child and Adolescent Psychiatrists (AACAP) in Anaheim, California. Steiner says he hoped his presentation would encourage more colleagues to follow in his footsteps. “Child psychiatrists are remarkably absent in these institutions,” he says. “I think it's a mistake.”

    The still unpublished work caught the eye of new CYA director Greg Zermeño in March after Steiner applied to do a follow-up study on its long-term effects. Zermeño opened an investigation after declaring that there had been a breakdown in his office's review process when the project was approved. Governor Gray Davis has ordered the state inspector general to investigate, and the CYA has asked the state attorney general's office to help with its review. Kathy McClelland, Stanford's research compliance director, says the study protocol was approved by the university's institutional review board, and Stanford assumed that CYA had done its own review. Thomas Puglisi, director of human subject protection at the National Institutes of Health, says his office is following the progress of the state investigations but has no plans to do its own. Puglisi noted that Steiner appears to have gone beyond legal requirements for informed consent by obtaining parental approval and assigning advocates to the boys.

    The controversy, first reported in the Los Angeles Times on 16 August, centers largely on the issue of whether subjects receiving the low dose could reasonably have expected to benefit, as the law requires. The newspaper reported that Steiner, in his presentation at the AACAP meeting and in a recent interview with the Times, said he chose a dose so low as to have no effect. If that is true, it might be considered a placebo, which would generally be seen as not providing subjects with a reasonable chance to benefit.

    Steiner vigorously denies that the low dose was a placebo in disguise, although he says a true placebo arm would have made the analysis more powerful. But even the low-dose arm could well have had a therapeutic effect, he says, citing the experience with other psychoactive drugs, including Haldol and commonly used antidepressants, which turned out to be effective at doses much lower than those once given. Psychiatric experts will weigh those arguments during the state's review.

    The use of a low-dose arm as a placebo “would certainly fail to meet the intent of the regulations,” says bioethicist Jeffrey Kahn, director of the University of Minnesota Center for Bioethics. But Kahn says that juvenile inmates can also lose out if the laws protecting them are so strict as to prohibit potentially beneficial research from being done. One could argue, he says, that juvenile and adult prisoners are no more compromised in their ability to make informed choices than seriously ill hospital patients who routinely serve as study subjects, and that inmates are selectively being denied “the benefits of research participation … because of the current law.”

    Larry Stone, executive medical director of Laurel Ridge Hospital in San Antonio, Texas, and a past president of the AACAP, supports that view. “Certainly prisons are the places where … we should be experimenting with a number of things for rehabilitation,” including medications, he says. “My concern is that good, solid researchers who are trying to do that do not get persecuted and tried in the media because of some quirk in the system.”

  4. MATHEMATICS

    Proving the Perfection of the Honeycomb

    1. Dana Mackenzie*
    1. Dana Mackenzie is a writer in Santa Cruz, California.

    Why do bees build their honeycombs out of hexagonal cells? As early as the first century B.C., Marcus Terentius Varro—Rome's answer to Isaac Asimov, the most prolific science writer of his day—speculated that it had to do with the economy, rather than the symmetry, of the design. From Varro to the present, scientists have assumed that a hexagonal lattice allows bees to store the most honey in a single layer of equal-sized cells, while using the least beeswax to separate them. Until this summer, however, no one could prove that a honeycomb was the sweetest solution. Now, a mathematician has removed all doubt: Bees do it best. The result also confirms the intuition of human engineers, who have relied on honeycomb composite materials made of paper, graphite, or aluminum to reduce the weight of components for cars, planes, and spacecraft with little sacrifice in strength.

    Last month, at the Turán Workshop in Mathematics, Convex and Discrete Geometry in Budapest, Thomas Hales of the University of Michigan, Ann Arbor, presented his proof that a hexagonal honeycomb has walls with the shortest total length, per unit area, of any design that divides a plane into equal-sized cells. (The proof has also been available on the Web since June.) Hales says that he began working on the honeycomb conjecture just last year, after solving a similar conjecture on the packing of spheres (Science, 28 August 1998, p. 1267). That problem, called the Kepler conjecture, stated that the densest packing of spheres is a face-centered cubic lattice, the pattern a grocer makes when he stacks apples. The proof had taken years. “After the Kepler conjecture, I expected every problem to be very difficult,” Hales says. “In this case, I feel as if I won the lottery.”
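
    For orientation, both results can be stated quantitatively (standard figures, not spelled out in the article). The Kepler conjecture says that no packing of equal spheres can fill more than

        \pi/\sqrt{18} \approx 0.7405

    of space, the density of the grocer's face-centered cubic stack. The honeycomb theorem says that any partition of the plane into cells of unit area requires wall length of at least

        \sqrt[4]{12} \approx 1.861

    per cell, exactly what the regular hexagonal grid achieves: a unit-area hexagon has side s = \sqrt{2/(3\sqrt{3})}, and its share of the shared walls is 3s = \sqrt[4]{12}.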

    Both questions, Hales explains, can be viewed as versions of the same physical problem: how bubbles of equal volume are distributed in foams. “In a really wet foam, the faces of each cell are not surfaces but have thickness,” Hales says. Because a sphere uses the smallest possible surface area to hold a given volume, the bubbles acquire a spherical shape, with the foam material filling up the interstices. Minimizing the amount of extra foam translates to maximizing the number of spherical bubbles per unit volume—the problem Hales solved last year.

    A honeycomb is more like a dry foam. “In a dry foam, the walls have zero thickness,” Hales explains. With no space between them, the bubbles in a dry foam affect one another's shape and thus can't all be spheres. And although Hales showed that a wet foam of equal-sized bubbles will form a regular lattice, with all its cells the same shape, no one knew whether the same is true for a dry foam. In 1994, Denis Weaire and Robert Phelan of Trinity College in Dublin had found an arrangement of bubbles with equal volume but different shapes, which used 0.3% less wall area than the best known single-shape arrangement.

    Weaire and Phelan's discovery led to renewed scrutiny of the honeycomb—in effect, a two-dimensional foam, because bees build just a single layer of cells. Mathematicians realized they still didn't have an adequate explanation for why the cells in a honeycomb had the same shape. Could an optimal honeycomb, like Weaire and Phelan's foam, have cells of different shapes, with some being pentagons or heptagons, say, or having curved sides?

    Any individual bubble can improve its perimeter-to-area ratio by rounding its sides out to circular arcs, or by adding more sides (because a heptagon is rounder than a hexagon). But that improvement comes at a cost to its neighbors. As mathematicians have long known, the topology of the plane forces the average number of sides to be six, so any heptagon must be balanced, for example, by a less efficient pentagon somewhere else. Similarly, one bubble's outward-curving arc will curve inward to the adjacent bubble and make its perimeter-to-area ratio too large. The question was how to calculate these trade-offs to find out whether one bubble's gain outweighs its neighbor's loss, making the overall arrangement more efficient.
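
    The “average of six sides” fact follows from Euler's formula for planar networks; a standard back-of-the-envelope argument, not spelled out in the article, runs as follows. A large network with V vertices, E edges, and F cells satisfies V - E + F = 2, and in a honeycomb-like partition three walls meet at each vertex, so 3V = 2E. Because each wall is shared by two cells, the average number of sides per cell is

        \frac{2E}{F} \;=\; \frac{2E}{2 - V + E} \;=\; \frac{2E}{2 + E/3} \;\longrightarrow\; 6

    as the network grows, so every seven-sided cell must be paid for by a cell with fewer than six sides somewhere else.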

    “Hales's bright idea was that no single cell can do better than a hexagon if appropriately penalized for having more than six sides or outward curves,” says John Sullivan of the University of Illinois, Urbana-Champaign. Although other mathematicians, including Weaire, had discovered a penalty for the number of sides, Hales is the first to find the right penalty for the curvature of the sides and to combine both penalty terms.

    Other geometers seem quite pleased with the proof. Unlike Hales's proof of the Kepler conjecture, which involved thousands of elaborate computer calculations, the proof of the honeycomb conjecture does not require a computer at all. “The overall idea just seems right,” Sullivan says. “There should be an easy reason for a pattern this simple, and I think Hales has found it.”

  5. DEVELOPMENTAL BIOLOGY

    Selenium's Role in Infertility Explained

    1. Evelyn Strauss

    Many proteins lead multiple lives, depending on environmental conditions or on the presence of particular partners. It's rare, however, for a protein to change its stripes completely, acting as a soluble enzyme under some circumstances and an insoluble structural component under others. But that's what new research suggests for a particular selenium-containing protein of sperm—a discovery that may help explain the long-standing mystery of why selenium deficiency in lab and domestic animals leads to male sterility.

    On page 1393, biochemists Leopold Flohé of the Technical University of Braunschweig in Germany, Fulvio Ursini of the University of Padova in Italy, and their colleagues report that the protein, previously identified as an enzyme that helps rid developing sperm of dangerous reactive oxygen molecules, moonlights as part of the glue that holds together mature sperm. “This is a new function for a selenoprotein—to form a structure, not just to carry out a reaction,” says Raymond Burk, a selenium expert at Vanderbilt University in Nashville, Tennessee.

    Although selenium deficiency is rarely a problem for humans, who get the element from common foods such as seafood, liver, lean red meat, and grains grown in soil that is rich in selenium, scientists showed decades ago that animals fed selenium-deficient diets produce sperm that break in the middle and can therefore no longer fertilize eggs. Beyond demonstrating that selenium is concentrated in the midpiece, the region between the head and tail, of normal sperm, scientists made little headway in explaining this effect. Several years ago, they thought they had an answer: The selenium deficiency might be interfering with another protein they had identified in the mitochondrial capsule, a structure that holds the energy-producing mitochondria in the sperm midpiece.

    But that idea dropped out when sequence analysis of the corresponding gene revealed that in some perfectly normal animals it doesn't encode the amino acid that carries selenium—evidence that the element isn't required for the protein's function. “There's been a question of whether there is such a thing as a real structural selenoprotein in sperm,” says Thressa Stadtman, a selenium biochemist at the National Heart, Lung, and Blood Institute in Bethesda, Maryland.

    Flohé and his colleagues have now shown that there likely is, by studying a known selenoprotein called phospholipid hydroperoxide glutathione peroxidase (PHGPx). The enzyme, which likely protects the developing sperm cell against damage by converting toxic peroxides to harmless alcohols, climbs to extremely high levels in testes. Because the levels are much higher than would be expected for protection against the amounts of peroxides probably present in that tissue, Flohé describes the situation as “kind of strange.” About 2 years ago, however, his team's work began pointing to a structural role for the protein. Their analysis of the mitochondrial capsule showed that PHGPx is its most abundant component, accounting for about 50% of the capsule material.

    But even though it constitutes such a large proportion of the capsule, tests revealed that the protein from mature sperm had lost its enzymatic activity, apparently because the protein molecules had become linked together in an inactive form. Based on these findings, the researchers propose that PHGPx acts as a soluble enzyme early in sperm development and later polymerizes into a protein mesh that contributes to the structural integrity of the midpiece. If so, says Burk, “this [work] may explain the head-to-tail separation seen in sperm of selenium-deficient animals.”

    In addition, the result opens the door to a better understanding of the mechanisms underlying normal sperm development, presumably in humans as well as in animals, say experts. Currently, Stadtman notes, the triggers for the switch from active enzyme to inactive structural protein are not known. “The next step is to work out the signals that tell the sperm to undergo this developmental change,” she says. Indeed, as anyone who has juggled identities knows, timing is one key to success.

  6. NEUROBIOLOGY

    New Role Found for the Hippocampus

    1. Laura Helmuth

    When you remember a friend, your first day of work, or your address, you're fully aware of what you're remembering. But memory has another guise: nonconscious skills like riding a bicycle or knowing how to tie your shoes. Subjectively, the two kinds of memory seem very different, and the brain structures responsible for them are different as well. Results reported in this month's issue of Nature Neuroscience suggest, however, that the hippocampus, a twist of tissue deep in the brain long believed to help form only conscious memories, also serves certain memories that don't rise to the level of awareness.

    The report comes from psychologists Marvin Chun of Vanderbilt University in Nashville, Tennessee, and Elizabeth Phelps of New York University. Chun and Phelps compared how normal people and those with anterograde amnesia, a memory defect caused by damage to the hippocampus, respond to certain complex patterns. They found that the normal subjects, but not the amnesiacs, could learn to remember repeated patterns they weren't consciously aware of. “What they've contributed,” says neuroscientist Larry Squire of the University of California, San Diego, “is probably the first demonstration that the hippocampal system can be dissociated from conscious awareness.”

    The finding bolsters an emerging theory that the role of the hippocampus in memory is to relate different elements of experience —a role that is suggested by the fact that it receives information from many other brain regions. “Anatomically, the hippocampus is a big convergence zone,” says Squire. The hippocampus and interconnected neural structures register and temporarily hold new information, he says, binding the various elements—place, odors, sounds, people —that constitute a remembered episode. By showing that the hippocampus binds the spatial relations of a dozen objects, but without awareness, this study suggests, says Squire, “that relational work may be more fundamental to the work of the hippocampus than awareness.”

    Indeed, the growing evidence that the hippocampus is involved in tying together the various aspects of memory was one of the inspirations for Chun and Phelps's study, which they began while they were both still at Yale University. Like much other research on human memory, their work relies on studying patients with severe anterograde amnesia. Because of hippocampal damage, say from stroke or a brain infection such as encephalitis, such people can remember their distant pasts but cannot create new memories. However, they can learn new skills and habits. With repeated practice, for example, an amnesic patient could learn to trace a complex pattern while looking at his or her hand in a mirror, but never remember seeing the pattern before. Such findings had persuaded researchers that the hippocampus is needed for forming conscious memories but not for learning unconscious skills, so what Chun and Phelps found came as something of a surprise.

    For their study, they chose a mental skill. The task was to pick a sideways letter T out of a field of 11 Ls and push a button to indicate which way the T was tilting. Over the course of 240 trials, the 15 control subjects and four amnesic patients got faster at finding the target T.

    Half of the patterns were generated randomly and were unique. But unbeknownst to the subjects, the other half were repeated over and over again throughout the course of the experiment. None of the subjects noticed the repetition, and no one could pick out the repeated patterns at the end of the experiment. Even so, as the experiment proceeded, the control subjects responded faster to these repeated patterns than they did to newly created ones. The amnesia patients, however, responded to the old forms only as fast as they responded to new ones.
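
    The design can be sketched in a few lines of code. This is an illustrative mock-up only: the placement grid, the number of repeated layouts, and everything beyond the figures quoted above (240 trials, one T among 11 Ls, half the displays repeated) are assumptions, and the real displays were far more carefully controlled.

        import random

        rng = random.Random(42)
        GRID = [(col, row) for col in range(8) for row in range(6)]   # hypothetical placement grid

        def new_layout():
            """Pick 12 positions: one for the sideways 'T' target, 11 for the 'L' distractors."""
            cells = rng.sample(GRID, 12)
            return {"target": cells[0], "distractors": cells[1:]}

        # Half of the displays reuse a fixed set of layouts over and over; the rest are fresh.
        repeated_set = [new_layout() for _ in range(12)]            # 12 repeated layouts is an assumption

        trials = []
        for block in range(10):                                     # 10 blocks x 24 displays = 240 trials
            block_trials = [("repeated", lay) for lay in repeated_set]
            block_trials += [("novel", new_layout()) for _ in range(12)]
            rng.shuffle(block_trials)
            for condition, layout in block_trials:
                tilt = rng.choice(["left", "right"])                # the T's tilt is drawn fresh each time,
                trials.append((block, condition, layout, tilt))     # so only the spatial layout repeats

        # The analysis compares mean search times on "repeated" versus "novel" trials across blocks:
        # control subjects speed up more on the repeated layouts; patients with hippocampal damage do not.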

    The work shows that even though people with damage to the hippocampus can learn simple patterns and skills, this brain structure is crucial to unconscious recognition of more complex patterns. Some theorists think the hippocampus's role in relating different elements of an experience may explain its role in conscious memory as well. Neuroscientist Howard Eichenbaum of Boston University says that awareness might emerge as the hippocampus ties together the interconnected properties of an experience. Alternatively, he says, conscious awareness might be just one element of an experience, stored with all the other information we register about an event.

  7. CHINA

    Academy Seeks to Tap University Elites

    1. Lin Gu*
    1. Lin Gu writes for China Features in Beijing.

    BEIJING—Scrambling to retain its dominant position in the country's research establishment, the Chinese Academy of Sciences (CAS) is offering its first-ever university scholarships to both undergraduate and graduate students at 20 leading universities. The just-announced scholarships, for the upcoming school year, are seen as a recruitment tool for the academy's shrinking network of research institutes as well as a way to forge closer ties to the increasingly important university system. The academy intends to offer positions to every scholarship winner who wishes to join a CAS institute after graduation.

    CAS's 122 institutes, traditionally set off from the rest of the scientific community, once ruled the roost, attracting the best young talent by offering better working conditions, newer facilities, and greater prestige than most universities and other research institutions. But times have changed. Academy officials are in the midst of closing at least a third of those institutes and cutting even deeper into the workforce, even as they strive to raise overall research quality by consolidating programs and improving conditions for an elite core of researchers (Science, 8 January, p. 150).

    At the same time, China is beefing up its top universities, several of which offer scholarships funded by private companies, in an attempt to foster world-class science as well as entrepreneurship. The new academy scholarships are an effort to link those two trends. “It will further close the gap between the two institutions that existed under the old system by integrating the innovation, processing, dissemination, and application of knowledge,” says Min Weifang, executive vice president of Beijing University, one of the 20 schools (see table) participating in the new program.

    The academy will provide up to 20 scholarships at each school, with 10 slots for undergraduates and five each for master's- and doctoral-level students. Graduate scholarship winners will also be able to participate in the research and educational programs run by individual CAS institutes, which currently enroll more than 11,000 students. The universities were chosen based on their excellent reputations both domestically and abroad, says Li Hongwei, an official with CAS's graduate education section. “We are badly in need of talented students to replace an ever-increasing brain drain,” says Li. The schools themselves will choose which students will get the awards. The university agreements, which represent an annual commitment of $85,000, run through 2001 and are seen as a “trial run,” says Sun Dianyi, a CAS press officer. “If we see that the efforts are really paying off, the investment is sure to increase, with more money for more universities.”

  8. PHYSICS

    Wiggling and Undulating Out of an X-ray Shortage

    1. Robert F. Service

    Biologists and other researchers are lining up at synchrotrons to probe materials and molecules with hard x-rays, and demand is growing fast; new magnet technologies may soon relieve the crush

    These are heady days at the world's most powerful x-ray sources. Materials scientists, chemists, and biologists are flocking to the stadium-sized machines for front-row views of the structure and chemical behavior of materials and molecules. But like fans at a World Series game, they have to fight the crowds. And there is a waiting list for tickets, especially for the best seats—those at a handful of synchrotrons that produce highly energetic, or hard, x-ray beams, which are ideal for figuring out the structure of complex proteins all the way down to the atomic level. “There's been an explosion of interest in the hard x-ray range,” says Herman Winick, a synchrotron physics expert at the Stanford Synchrotron Radiation Laboratory (SSRL) in Menlo Park, California.

    Among the triggers for this explosion have been spectacular demonstrations of the power of synchrotron beams to determine the precise shapes of important molecules, providing insights into how they work and clues to the design of drugs to interact with them. The latest, published just this week, is the long-sought structure of the ribosome, the cell's protein factory (see sidebar on p. 1343). And the demand for such information is growing fast as labs around the world identify more and more of the intricate trysts that take place between proteins as they convey biochemical signals between and within cells. Once researchers figure out which proteins interact in these signaling pathways, they are eager to know the intimate details of how they fit together. Adding to this information demand are the vast numbers of new proteins being discovered through genome sequencing efforts around the world—a deluge that has prompted the National Institute of General Medical Sciences (NIGMS) to launch a $3 million a year program to automate crystallography for structural genomics (see sidebar on p. 1345). As a result, says NIGMS director Marvin Cassman, waiting lists for structural biology beamlines around the world are growing. Indeed, demand outstrips supply by a factor of 4 for hard x-ray “beamtime” at one busy x-ray port at SSRL.

    Help is on the way, however. The number of hard x-ray beamlines devoted to structural biology, protein crystallography, and environmental science is set to increase dramatically in the next few years. One reason: New techniques for making x-rays are enabling smaller, cheaper machines to create x-ray beams nearly as powerful as those produced by top-of-the-line, $1 billion behemoths like the Advanced Photon Source (APS) in Argonne, Illinois. These advances are bringing powerful new synchrotrons within reach of countries like China and Canada, as well as prompting biomedical research funders—such as the U.S. National Institutes of Health (NIH) and Britain's Wellcome Trust—to collectively pump hundreds of millions of dollars into synchrotron construction. “There really is a pretty big surge in new capabilities that is coming online,” says Columbia University crystallographer Wayne Hendrickson.

    Slow start

    Biologists and other researchers now lining up for access to beamlines have come a long way from the early days of synchrotron radiation studies, when they lived off the wastes produced by particle physics experiments. For accelerator physicists, the x-rays generated by particles that speed around their giant atom-smashers are a drag, literally. As charged particles zip around an accelerator ring, guided by massive magnets, they shed x-ray photons like baggage flying out of a convertible screeching around a tight corner. The departing x-rays rob the particles of energy, making it harder to accelerate them to higher speeds.

    But the particle physicists' losses turned out to be other researchers' gains. Engineers figured out a way to corral the unwanted x-rays and focus them down into a hair-thin beam that could be trained on samples ranging from crystallized proteins to minerals squeezed in a diamond anvil to mimic pressures at Earth's core. Early experiments with these penetrating probes were anything but straightforward, however. To carry out the first synchrotron-based protein crystallography experiments in 1975, for example, Keith Hodgson and his SSRL colleagues essentially had to build their own makeshift beamline to collect their data. “It was a very difficult way of discovery,” says Hodgson. But it was worth the effort: The 1000-fold increase in x-rays compared with the small lab-based sources then available sped up experiments dramatically, and the results quickly dispelled fears that the intense x-ray beams would destroy delicate samples.

    Synchrotrons were no panacea, however. The finicky accelerator rings in the early machines often malfunctioned, going off-line for weeks at a time—a major inconvenience for researchers visiting a facility for a tightly scheduled block of time. Protein crystallographers, one of the largest groups of users today, also faced the challenge of transporting fragile protein crystals, painstakingly grown in the lab, to the facility. As a result, says crystallographer Janet Smith of Purdue University in West Lafayette, Indiana, the chance of an experiment working out in the early days was no better than 50-50. Indeed, for most crystallographers, it was less hassle to keep using the plodding but trusty x-ray tube in the lab.

    Most of those teething problems have now been solved. New beamlines and accelerator rings “are pretty much bulletproof,” rarely going off-line during scheduled operation, says Smith. And researchers now stabilize their crystals by freezing them with liquid nitrogen, making them easier to transport and more resistant to radiation damage from the even stronger x-ray beams at today's sources. That has increased the chance of a successful experiment to well over 90%, she says. And increased computer power and the ability to select the precise wavelength of x-rays have made it possible to solve novel protein structures in just days or weeks, rather than the months to years it took a few decades ago.

    As a result, “everybody wants to go to synchrotrons now,” says Smith. The crowds include not just the traditional users but some new groups as well. “At one time, determining a protein structure was the province of x-ray crystallographers,” notes Cassman. “That's not totally true anymore. Cell biologists who find a protein want to know the structure [themselves].”

    Souped-up wigglers

    A few years ago, prospects for meeting this surging demand were grim—and expensive. Although bio-centered beamlines were—and still are—being added at the large hard x-ray facilities, space was going fast. New hard x-ray machines were running about $1 billion apiece, and few were likely to be funded. X-rays from the relatively inexpensive soft x-ray machines didn't have enough juice to penetrate protein crystals. But recent improvements in the technology of the magnets at the core of the particle accelerators have dramatically changed the picture by allowing low-energy accelerators to create the abundant high-energy hard x-rays prized by biology researchers.

    One source of x-rays from a synchrotron is the machine's particle-steering “bending” magnets. But other pieces of equipment called undulators and wigglers, essentially strings of magnets with alternating polarity, dramatically increase the x-ray output. As the charged particles fly through the devices, the alternating magnetic field causes them to wiggle back and forth like a slalom skier whipping through a series of tight gates. In an undulator, for example, the x-rays emitted at each turn add up, concentrating the output at wavelengths set by the regular interval in the array of magnets, shrunk enormously by the particles' near-light speed. Much of the radiation emerges at a fundamental wavelength fixed by that spacing; the rest comes out at shorter-wavelength “overtones” of it. The phenomenon is a bit like playing a note on a violin, which can sound not only the note itself but also harmonic overtones.

    These shorter wavelength, or higher energy, x-ray overtones are key to generating abundant hard x-ray beams prized for experiments. The higher the harmonic, the more energetic are the x-rays. But slight defects in the magnetic structures of undulators long made it impossible to generate x-rays beyond the third and fifth harmonics. So engineers had to rely on another strategy to get those third and fifth harmonics into the hard x-ray range: raw power. They cranked up the speed of the particles whipping around the synchrotron racetrack so that they have more energy to shed when they pass through the alternating magnets. The APS, for example, hurls electrons around the ring at 7 giga electron volts (GeV).
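
    For reference, the standard undulator relation from accelerator physics (a textbook formula, not quoted in the article) makes these trade-offs explicit. On axis, the wavelength of the nth harmonic is

        \lambda_n \;=\; \frac{\lambda_u}{2 n \gamma^2}\left(1 + \frac{K^2}{2}\right), \qquad K = \frac{e B_0 \lambda_u}{2\pi m_e c},

    where \lambda_u is the magnet period, \gamma is the electron energy in units of its rest energy, and K is a dimensionless field-strength parameter. At 7 GeV, \gamma \approx 13{,}700, so the factor 2\gamma^2 \approx 4 \times 10^8 compresses a centimeter-scale magnet period into angstrom-scale x-rays, and climbing to a higher harmonic n shortens the wavelength, and hence hardens the beam, further.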

    In the early 1990s, Pascal Elleaume and his colleagues at the European Synchrotron Radiation Facility in Grenoble, France, cracked the harmonic barrier. They showed that a standard magnet-tweaking practice called shimming—essentially precisely shaping a magnet's field by adding tiny bits of magnetic material to it—could greatly improve an undulator's performance. “[That] made it possible to reach much higher harmonics, thus opening the way for lower energy rings to reach hard x-rays with high brightness,” says SSRL's Winick. Instead of needing a billion-dollar, 6- to 8-GeV ring, researchers could now think about generating the high fluxes of hard x-rays with intermediate-sized rings that have energies of about 2.5 to 3 GeV and cost $100 million to $200 million. A related set of advances is also paving the way for even smaller, 1-GeV machines to produce hard x-rays. The key is the use of wigglers that harbor high-field superconducting magnets to bend the particle beam more tightly, which causes the charged particles to emit more energetic x-rays.
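
    The payoff from high-field superconducting wigglers can be seen from the critical photon energy of radiation emitted in a bending field, a standard rule of thumb (the specific field values below are illustrative assumptions, not figures from the article):

        E_c[\mathrm{keV}] \;\approx\; 0.665\, E^2[\mathrm{GeV}]\, B[\mathrm{T}].

    A 3-GeV ring with a conventional 1.4-tesla bending magnet radiates with a critical energy near 8 keV, while the same beam threading a 6-tesla superconducting wiggler reaches roughly 36 keV, comfortably within the hard x-ray range used for protein crystallography.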

    New intermediate-energy synchrotrons are currently on the drawing board in Canada, the United Kingdom, Switzerland, and China. Some of these machines have been in the planning stages for nearly a decade and have recently been revised to take advantage of the new magnet technology. Five years ago, for example, Canada decided to build a 1.5-GeV soft x-ray machine, but the plans were revamped in 1997 when it became clear that a machine with only a little more power could provide hard x-rays. Canada is now planning a 2.9-GeV facility, which is expected to open in 2003. Planned synchrotrons in the United Kingdom, China, and Spain also underwent recent redesigns to take advantage of the new hardware. “It makes sense to build accelerators that will not break the bank,” says Joan Bordas, who is heading Spain's effort to build its first synchrotron, the Synchrotron Light Source, which has yet to secure its final funding.

    Numerous existing facilities are also planning upgrades to take advantage of the new insertion devices. Last month, for example, NIH announced plans to provide an initial $14 million toward a $53 million upgrade of the SSRL facility, which will use the new magnet technology to become a high-flux “third generation source” (Science, 30 July, p. 650). Similarly, the CAMD facility in Baton Rouge, Louisiana, and Germany's BESSY-1 ring, which is being donated to the Middle East (Science, 25 June, p. 2077), are planning upgrades with superconducting magnets to generate hard x-rays. But the Advanced Light Source at Lawrence Berkeley National Laboratory in California probably stands to gain the most from the new technologies. The facility will soon be retrofitted with new superconducting bending magnets, and its fortunes appear to be turning around after a recent poor scientific review (see sidebar on p. 1344).

    As many experts see it, these developments have virtually eliminated the need for additional APS-sized and -priced machines. Indeed, “it's unlikely another one will be built in the near future, if ever,” says SSRL's Hodgson. But Hodgson and other synchrotron experts are quick to add that the new technology in no way makes the behemoth machines obsolete. Even with the new technology, the smaller sources cannot produce beams as bright or as tightly focused as those of their big siblings. Winick and others predict that most of the growth in crystallography will take place at the smaller machines, freeing up beamtime on the big synchrotrons for experiments that take full advantage of their capabilities, such as making x-ray movies of proteins in motion. “The bread-and-butter stuff can be done without having to go to the very large machines,” says Bordas.

    It will take several years before many of the upgrades and new facilities are completed. But if the new generation of magnet technology lives up to its billing, other smaller, cheaper synchrotrons will likely follow. For biologists, that will provide a welcome relief from today's crowds.

  9. PHYSICS

    Ribosome Finally Begins to Yield Its Complete Structure

    1. Elizabeth Pennisi

    Crystallographers have long regarded the ribosome, the cell's protein factory, as the Mount Everest of their field. In fact, many have considered it insurmountable. A conglomerate of some 54 different proteins and three RNAs, it lacks the symmetry and repetitions that have eased the way to solving the structures of bigger entities, such as viruses. Now, two research teams have succeeded in setting up base camp.

    Crystallographer Thomas Steitz and biophysical chemist Peter Moore of Yale University and their colleagues report in the 26 August issue of Nature that they have worked out the structure of the larger of the two distinct subunits that form a complete ribosome. And in an accompanying paper, an independent team led by Venki Ramakrishnan, a biochemist who has just moved from the University of Utah, Salt Lake City, to the Medical Research Council Laboratory of Molecular Biology in Cambridge, U.K., has determined the structure of the smaller subunit. Both teams attained a resolution of about 5 angstroms. Although still too fuzzy for atomic detail, the images show the overall arrangement of the proteins with respect to the RNAs—enough detail that researchers consider this a milestone for ribosome studies.

    For years, ribosome experts have had to make do with indirect approaches to studying the ribosome's structure, such as mutating the various components to see how the changes affect protein synthesis. With the new structures, says Joachim Frank, a physicist at the New York State Department of Health Wadsworth Center in Albany, “we can now begin to put all the bits and pieces together.” These structures and higher resolution ones expected within the next year or two represent “a first step in a whole new direction,” agrees Albert Dahlberg, a molecular biologist at Brown University in Providence, Rhode Island, that will reveal just how these proteins and RNAs interact as they piece together a protein's amino acids.

    Ramakrishnan's team tackled the smaller ribosome subunit—designated the 30S subunit from the rate at which it sediments in the ultracentrifuge—from the bacterium Thermus thermophilus. It took 2 years of tinkering to get the material they needed—good crystals with heavy atoms inserted as landmarks for interpreting x-ray diffraction data. From the 140 crystals they screened at the National Synchrotron Light Source at Brookhaven National Laboratory in Upton, New York, they were able to use six to build maps of the electron density of the subunit. The detail was sufficient to reveal certain key features, such as where the RNA had looped back on itself to form a double helix and where the amino acids of the proteins formed distinctive spirals called α helices, says Brookhaven's Malcolm Capel, who helped both teams with their diffraction studies.

    Aided by a neutron-scattering map that showed the approximate locations of all the proteins in the subunit, plus the structures of seven that had already been individually determined by Ramakrishnan and Stephen W. White of St. Jude Children's Research Hospital in Memphis, Tennessee, and others, Ramakrishnan's team also managed to pinpoint the positions of eight of the 21 proteins. In addition, with help from biochemical data previously obtained by others indicating which proteins are in contact with the RNA, Brian Wimberly in Ramakrishnan's lab traced the RNA's path in the center of the molecule.

    Meanwhile, Moore and Steitz's team focused on the structure of the larger, 50S ribosomal subunit, which they obtained from a salt-loving microbe, Haloarcula marismortui. Some 20 years ago, biocrystallographer Ada Yonath of the Weizmann Institute of Science in Rehovot, Israel, had shown that it was possible to crystallize this subunit, but she and others had been unable to generate detailed x-ray diffraction data. Moore and Steitz found that part of the problem may be the fact that the giant molecule—containing 34 proteins and two RNAs—tends to make twinned crystals. Unless one realizes that twinning has occurred, such crystals look fine but ultimately give confusing x-ray diffraction results. Consequently, says Moore, they wound up rejecting the majority of the crystals they made. For this reason, and because of its size, “this was a bigger mountain to climb than was the 30S subunit,” Steitz notes.

    In many cases, the structures confirmed what researchers already thought. For example, Steitz and Moore's team could see a tunnel in the 50S subunit that others had observed in electron microscopy studies. But there were surprises as well. The structures indicate, for example, that protein-protein interactions may be more important than thought in making sure the right amino acid is picked during protein assembly. Previous research had shown that mutations in two particular 30S proteins cause mistakes in that selection, and researchers had assumed that the mutations affect the proteins' interaction with the ribosomal RNA. But the structure shows that the mutations actually affect the interface between these two proteins. “This is a hint that the proteins are actually doing something,” says Ramakrishnan.

    He and others expect many more surprises as they improve crystallization, diffraction, and analysis to achieve ever higher resolution. At least two other teams, Yonath's and that of Harry Noller of the University of California, Santa Cruz, expect to publish ribosome structures in the upcoming months, setting the stage for an intense race for the first high-resolution one. And, that, says Moore, is what makes these particular images so exciting: “Having gotten this far, it's now clear that we can go the rest of the way.”

  10. PHYSICS

    Upgrade Brings Hope to Berkeley's Advanced Light Source

    1. Robert F. Service

    Recent advancements in synchrotron technology (see main text) have come to the rescue of the Advanced Light Source (ALS), a “soft” or low-energy x-ray synchrotron at Lawrence Berkeley National Laboratory in California. Two years ago, ALS researchers were stunned by a harsh review from a scientific advisory panel assembled by the U.S. Department of Energy (DOE), which oversees the facility. But last month, another panel concluded that the ALS is steering a much-improved course, thanks in part to upgrades that enable it to generate hard x-rays for protein crystallography and other uses.

    Completed in 1993 at a cost of $100 million, the ALS became one of the world's premier sources for soft, or long-wavelength, x-rays, which it generates from a beam of electrons accelerated to 1.9 giga electron volts. That makes the facility a prime spot for studies such as tracking the electric and magnetic behavior of superconductors or mapping the chemical identities of a wide variety of elements in a sample. But in 1997, an influential DOE advisory panel headed by Massachusetts Institute of Technology physicist Robert Birgeneau concluded that demand for synchrotron beamtime was growing fastest for hard x-rays, the short-wavelength, high-energy radiation that x-ray crystallographers need for probing molecular structures. If money was tight, the panel concluded, DOE should spend it first on upgrades of hard x-ray machines such as the Stanford Synchrotron Radiation Laboratory and the National Synchrotron Light Source at Brookhaven National Laboratory in Upton, New York—even before paying the regular operating budget at ALS (Science, 17 October 1997, p. 377). The recommendation left many ALS researchers concerned about the future.

    Now, however, the ALS is getting a new lease on life. Two years ago, it opened its first x-ray crystallography beamline, which benefits from a new superconducting magnet, called a wiggler, that can wring hard x-rays from the ALS's moderate-energy electrons. And more hard x-rays are on the way. A set of three new “superbend” magnets, capable of turning out hard x-rays, is being installed at a cost of only about $4 million. When completed in 2002, the new magnets could supply hard x-rays to another 12 protein crystallography beamlines, bringing the total to 13—about the same number as planned for the nation's premier hard x-ray source, the Advanced Photon Source in Argonne, Illinois. “Nobody anticipated the Berkeley ring would be much good for this kind of science,” says Janet Smith, an x-ray crystallographer at Purdue University in West Lafayette, Indiana. “But now the ALS is flourishing.”

    The makeover caught the eye of ALS reviewers, who were asked by the University of California, which runs the lab for DOE, to assess the facility's progress. The review concluded that the shift toward hard x-rays, along with changes in management and efforts to be more responsive to users, ranked as “excellent/outstanding.” The glowing review isn't expected to have any immediate impact on the ALS's budget, says Tom Russell, a polymer physicist at the University of Massachusetts, Amherst, who served on both the current review team and the Birgeneau panel. However, he adds, “it can't hurt.”

  11. PHYSICS

    The Automated Approach to Protein Structure

    1. Robert F. Service

    If Thomas Earnest and a handful of like-minded colleagues have their way, robots may someday put human crystallographers out of work. Earnest, who heads a new crystallography beamline at the Advanced Light Source in Berkeley, California, is working with several other groups to automate every stage of determining protein structure—from generating the proteins to crystallizing them, blasting them with x-rays, and analyzing the data. “We're really trying to minimize the amount of human intervention necessary,” says Earnest.

    It's easy to see why. Genome projects around the world are now churning out the DNA sequences that code for proteins by the tens of thousands every year. The Human Genome Project alone is set to complete the sequencing of the 80,000 to 100,000 human genes by the end of 2001, and researchers have no idea what most of the proteins they encode actually do. They will be looking for clues from the molecules' three-dimensional structures.

    Getting those structures will be a huge challenge with today's techniques. Researchers must first coax bacteria to overexpress the protein they are interested in, isolate the protein, find conditions that cause it to crystallize, and purify the crystals. Next comes the trip to a synchrotron, where the crystals are bombarded with x-rays. If they actually diffract the x-rays in a way that provides useful information, the data must be collected and analyzed. From start to finish, determining a single structure can take months or years; amassing the current tally of some 10,500 structures has taken decades. “You just can't solve enough structures with the normal techniques,” says Earnest. “At current rates, it would take decades to get this accomplished.”

    That's why a variety of labs are looking to robots for help. Earnest, Ray Stevens—a molecular biologist with a joint appointment at the Novartis Institute for Functional Genomics (NIFG) and The Scripps Research Institute in La Jolla, California—and their colleagues are working on a robotic system for loading crystals, checking whether they diffract, and collecting data; they hope it will dramatically boost the pace of work at synchrotron beamlines. Whereas today's top-of-the-line crystallography beamlines can collect complete data sets on two crystals per day, “we should be able to do 10 to 15 data sets, if not more,” says Earnest. NIFG researchers are also developing similar robotic systems for the front end of the effort: overexpressing, isolating, and crystallizing proteins.

    Meanwhile, on the software side, Peter Kuhn and his colleagues at Stanford, as well as Robert Sweet's team at Brookhaven National Laboratory in Upton, New York, are testing programs that allow users to run the beamline machines from workstations at their home institutions. And Tom Terwilliger of Los Alamos National Laboratory in New Mexico and his colleagues recently unveiled a new software package called Solve, which quickly converts raw diffraction data into 3D protein structures.

    Help is also starting to roll in from funding agencies. This summer the National Institute of General Medical Sciences launched a $3-million-a-year program to fund up to six high-throughput crystallography pilot centers to test some of the new technology. In time, researchers should be able to stay at their home institutions and simply send their samples to a synchrotron by overnight mail for processing. Such a strategy won't work with the most complex proteins, which will need extra experimental care. However, says Earnest, “most proteins are amenable to a FedEx approach.”

  12. CHEMISTRY

    Brazil Lobbies for First Nobel

    1. Cassio Leite Vieira*
    1. Cassio Vieira is a free-lance writer in Rio de Janeiro.

    Brazilian scientists hope to convince Swedish colleagues that one of their patriarchs of biodiversity, Otto Richard Gottlieb, deserves a Nobel Prize

    RIO DE JANEIRO—Brazil's science community has begun an unusually visible campaign to promote the Nobel Prize candidacy of Otto Richard Gottlieb, a revered figure in that country, for his contributions to the study of biodiversity and plant chemistry. But the effort is a long shot. In addition to the difficulty of touting someone from the developing world working in a relatively obscure field, Gottlieb's advocates are up against the secretive world of the Nobel selection process.

    The Royal Swedish Academy of Sciences, which awards the chemistry prize, does not accept unsolicited nominations and does not discuss who is under consideration. But for 3 years running, its chemistry committee, as part of its normal outreach efforts, has invited the Brazilian Chemistry Association to submit the names of worthy candidates. The association has used the opportunity to promote the 79-year-old Gottlieb, born in Czechoslovakia of a Brazilian mother, who moved to Brazil on the eve of World War II.

    Gottlieb's candidacy has attracted extensive media attention here, thanks to the efforts of Peter Rudolf Seidl, a chemist at the Federal University of Rio de Janeiro. Earlier this year, Seidl began making the rounds of government offices and scientific societies to build support for Gottlieb's candidacy, which by then had been endorsed by the Brazilian Chemistry Society. With public pressure building, last month the Brazilian Academy of Sciences took the unusual step of throwing its weight behind the effort, although academy officials are still debating how best to communicate that support to their Swedish counterparts, which award Nobel Prizes in physics, economics, and literature, too.

    Trained in Brazil, Gottlieb spent decades in the Amazonian rainforests, helping to create a taxonomy system that classifies plants not by their physical characteristics but by the chemicals they produce. Colleagues say he made important contributions to the field of natural products at a time when Brazil was virgin territory to most of the world's chemists. “He played a big role in developing what is now called biodiversity,” says Norman Lewis, director of the Institute of Biological Chemistry at Washington State University in Pullman and regional editor for Phytochemistry, which is planning a special issue next year in Gottlieb's honor. “He recognized that plants are factories that produce a variety of chemical compounds, depending on their environments.”

    Before Gottlieb began his work, Seidl says, studying biodiversity was like trying to repair a Swiss watch without understanding how its sophisticated mechanism works: “He provided the theoretical base, quantifying and bringing coherence to the study of this area.” Chemistry Nobelist Roald Hoffmann of Cornell University agrees that Gottlieb has made major contributions to several fields. “He is the premier Brazilian organic chemist and one of the world's outstanding phytochemists and biogeochemists as well,” says Hoffmann. “His work deserves the highest honors of our profession, including the Nobel Prize.”

    Gottlieb retired from the University of São Paulo in 1990. But he has transformed the living room of his modest apartment in the Copacabana neighborhood here into a minilaboratory that serves as home base for a dozen researchers, graduate students, librarians, and a secretary working on one project or another. Despite the recent outpouring of media attention to the academy's efforts, Gottlieb maintains a deep humility about his status in the scientific community. “I think there are people who are more clever and capable than I am,” he says. “I am a product of this nation, and by this nation I have been more than adequately compensated.”

    The uphill campaign to capture Brazil's first Nobel Prize has the strong backing of the country's scientific establishment, members of which are speaking out on Gottlieb's behalf. “The nomination of Professor Gottlieb for the Nobel Prize would do his work justice and would be a great distinction for Brazil,” says José Israel Vargas, special assistant to the Brazilian president. A former minister for science and technology, Vargas is currently president of the Third World Academy of Sciences (TWAS), which awarded Gottlieb its chemistry prize in 1991. “TWAS supports the nomination of Dr. Gottlieb for the Nobel Prize. And so do I,” says its executive director, Mohamed H. A. Hassan.

    Gottlieb is also well known for building the country's scientific infrastructure, having seeded the faculties of many of Brazil's leading universities and research institutes with more than 100 of his students. “If there was a popular vote in Brazil for the Nobel, he'd be near the top of the list,” says Lewis, who adds that Gottlieb is still going strong thanks to an “insatiable curiosity.”

    It is not clear that any of this advocacy will make any difference, however. “I am sorry to inform you that most of your questions cannot be answered due to our secrecy rules,” says Lennart Eberson, chair of the chemistry committee and a professor at the University of Lund in Sweden, in response to an e-mail query from Science. “The only thing I can say is that we every year ask a large number of individuals to nominate candidates for the Nobel Prize. Many nominations are received, and each candidate is evaluated.”

  13. INTERNATIONAL BOTANICAL CONGRESS MEETING

    New Ways to Glean Medicines From Plants

    1. Trisha Gura*
    1. Trisha Gura is a free-lance writer in Cleveland, Ohio.

    ST. LOUIS—Among the developments featured at the XVI International Botanical Congress, held here early this month, were two new approaches for getting plants to serve as protein factories. In one, plant roots were induced to secrete the desired proteins; in the other, tobacco mosaic virus was used to ferry new genes into plants.

    Holy Alliance?

    It's hard to imagine a human disease being cured by a combination of a virus and tobacco—two notorious agents of disease. But that was the scenario described at the congress by Guy della-Cioppa of Biosource Technologies Inc., a plant biotech company located in Vacaville, California. His team devised this unlikely alliance by engineering the tobacco mosaic virus (TMV) to carry human genes and growing the virus in tobacco plants to make therapeutic products. So far, the Biosource group has produced proteins that might be used to treat either of two diseases: Fabry's disease, a rare hereditary condition caused by lack of an enzyme called α-galactosidase, which often leads to death from kidney failure, heart attack, or stroke; and a blood cancer called non-Hodgkin's lymphoma. Because adult plants in the field or greenhouse can be easily infected by TMV, the company claims that the process is faster and cheaper than more conventional genetic engineering techniques, in which proteins are made by bacteria or in cultured mammalian cells.

    “It's a good application,” says Chris Somerville of the Carnegie Institution of Washington in Stanford, California. Still, he cautions that “there are limitations.” The proteins might be contaminated with harmful allergens, and spraying fields with a genetically engineered virus raises environmental concerns, although the company says its early tests have allayed them.

    To make their engineered TMV, company scientists link the human gene either to the regulatory sequences for the viral coat proteins, which infected cells make in large amounts, or to the coat protein genes themselves. In one field trial, begun in the early 1990s, the researchers found that plants in small outdoor plots that were infected with such a virus produced what they call “respectable” amounts of α-trichosanthin, a protein then being investigated as a possible AIDS drug. Also encouragingly, they found no detectable virus outside of plants after 2 or 3 days, an indication that it is unlikely to spread in the environment.

    Since then the Biosource group has put the technique to work making α-galactosidase. The impetus came from Roscoe Brady of the National Institutes of Health (NIH) in Bethesda, Maryland, whose team wanted to see if they could help Fabry's patients by treating them with the enzyme, which would require up to a gram of protein per patient per year. Field trials conducted this summer at the company's manufacturing facility in Owensboro, Kentucky, suggest that the infected tobacco plants can meet the demand, yielding “tens of grams” of the protein per acre of tobacco, says della-Cioppa. The NIH researchers are now testing the TMV-generated enzyme in mice lacking the enzyme as a prelude to human trials.

    With so few Fabry's patients, Biosource CEO Robert Erwin says α-galactosidase won't be much of a moneymaker, although it gives Biosource an opportunity to demonstrate the feasibility of its technique. But a project Biosource researchers are doing with a group from Stanford could lead to a bigger market.

    For over a decade, the Stanford researchers, led by immunologist Ronald Levy, have been trying to combat non-Hodgkin's lymphoma by getting the patients' own immune systems to destroy their tumor cells, which are derived from antibody-producing B cells. This involves vaccinating the patients with their own specific tumor antigens. The Levy team had been making the antigens—actually the unique antibodies displayed by the malignant B cells—by harvesting the patients' tumor cells, fusing them with immortal cells, and getting the resulting hybridomas, as they are called, to produce the antibodies—all of which can take from 9 to 12 months.

    By contrast, with the TMV method it took less than 30 days to go from antibody genes cloned out of the patients' tumor cells into TMV to plants producing the antigens, and only a day or two more to purify them. What's more, Erwin says, “we can grow a full-course vaccination from the number of plants that would fit on a desktop.” He estimates that the tobacco-produced antigen could go into clinical trials in humans by next spring, provided the U.S. Food and Drug Administration gives its approval.

    Some researchers, such as plant biochemist Elizabeth Hood of ProdiGene in College Station, Texas, express caution, however, raising the possibility that pathogens and allergens might slip through the purification process. But della-Cioppa says the chances of contamination are no greater than for proteins produced in bacteria or cultured mammalian cells, and that the purification procedures are largely the same in any event. If he's right, Biosource's technology may give both tobacco and viruses a chance to improve their reputations.

    Getting to the Root of the Matter

    The term “manufacturing plant” is taking on a whole new meaning as scientists shuttle human genes into plants such as tobacco or corn, hoping to turn the plants into medical protein factories. But although the gene transfers usually succeed, extracting the pure proteins from the plant material, which may contain pathogens or allergens, can be costly. Now, one group of researchers proposes solving this problem by going back to the plants' roots.

    At the botanical congress, Ilya Raskin of Rutgers University in New Brunswick, New Jersey, described how his team has induced tobacco plants to secrete three different foreign proteins from their roots. If the plants are grown hydroponically in nutrient-laden liquids, the proteins can be extracted from the liquids, thus minimizing the number and cost of purification procedures. And the plants don't have to be harvested to get at the protein but can keep producing as long as they live. More recently, the team has extended the work, prodding plant roots to secrete their own defense chemicals, some of which may have potential as antimicrobial or anticancer drugs.

    “It's a nice piece of work with unique approaches and unique applications,” says plant geneticist John Finer of the Ohio State Agricultural Research and Development Center in Wooster, who wrote an editorial that accompanied the Raskin team's paper on the protein results in the May issue of Nature Biotechnology. Still, Finer cautions, “a lot of work will be needed to show that it can be useful on a commercial scale.”

    In the first phase of the work, Raskin and his colleagues simply demonstrated that plants can secrete a foreign protein from their roots. The researchers tested seeds from tobacco plants into which Uwe Sonnewald's team at the Institute for Plant Genetics in Gatersleben, Germany, had introduced a bacterial gene encoding an enzyme called xylanase, which can digest a blue-colored form of the sugar xylan, turning it clear. When the seeds sprouted on petri dishes coated with nutrient agar containing the blue-colored sugar, the transgenic plants produced “clear zones” around their roots, showing that they were secreting the enzyme.

    The researchers then went on to create two transgenic tobacco strains of their own. They introduced the gene for green fluorescent protein (GFP) into one and a human gene encoding an enzyme called secreted alkaline phosphatase (SEAP) into the other, along with regulatory sequences that would allow these genes to be expressed in plants. As with the xylanase gene, the group coupled the GFP gene to a short stretch of DNA encoding a “signal sequence,” which tells the root cells to secrete the protein. The SEAP gene already had its own secretion signal.

    Grown hydroponically, the plants carrying the GFP gene exuded the protein from their roots, turning the hydroponic fluids a bright fluorescent green. The SEAP transgenic plants produced even more compelling results, churning out an average of 5.8 micrograms of enzyme per gram of dry root per day—a figure that went up to 20 micrograms when the researchers used a different gene construct with a stronger promoter. For comparison, Raskin points out that maize engineered to produce the protein avidin makes about 230 micrograms of the protein per gram of dry seed weight. He estimates that at the higher production rate, the tobacco roots could surpass the productivity of the transgenic corn over the life of the tobacco plants.
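    As a rough, hypothetical back-of-the-envelope check on that estimate (not from the article), the sketch below treats the quoted per-gram figures as directly comparable, dry root versus dry seed, assumes the roots secrete at a constant 20 micrograms per gram per day, and uses an arbitrary 150-day plant lifetime for illustration.

```python
# Illustrative comparison of cumulative SEAP secretion from tobacco roots versus
# the avidin content of transgenic maize seed, using the figures quoted above.
# The 150-day lifetime and the direct dry-root vs. dry-seed comparison are
# simplifying assumptions introduced here, not claims from the article.

ROOT_RATE_UG_PER_G_PER_DAY = 20    # SEAP yield with the stronger promoter (article)
MAIZE_AVIDIN_UG_PER_G_SEED = 230   # avidin per gram of dry maize seed (article)
PLANT_LIFETIME_DAYS = 150          # assumed growing period, not from the article

# Days of steady secretion needed for the roots to match the per-gram seed figure
breakeven_days = MAIZE_AVIDIN_UG_PER_G_SEED / ROOT_RATE_UG_PER_G_PER_DAY

# Cumulative root output over the assumed lifetime
lifetime_output_ug_per_g = ROOT_RATE_UG_PER_G_PER_DAY * PLANT_LIFETIME_DAYS

print(f"Break-even after about {breakeven_days:.1f} days of secretion")
print(f"Cumulative output over {PLANT_LIFETIME_DAYS} days: "
      f"{lifetime_output_ug_per_g} micrograms per gram of dry root")
```

    Under those assumptions the roots match the seed figure after roughly 12 days of secretion, which is what makes a lifetime advantage plausible if secretion really holds steady.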

    Raskin's team has since branched out in a new direction: coaxing plant roots to churn out higher levels of their own defense chemicals. In unpublished work presented at the meeting, Raskin showed that this can be done, for example, by exposing the roots of plants grown hydroponically to a fragment of a bacterial cell wall or a toxin from a fungus. “Basically, we press different biochemical triggers and see what comes out,” he says.

    The Raskin team has so far collected about 5000 samples of materials exuded by the roots of 700 different plant species. Some have been screened by the National Cancer Institute's Natural Products Branch, which found that several of these crude extracts killed various cancer cells in lab culture.

    Whether the strategy will lead to any new anticancer agents or other drugs remains to be seen. But even if it does, some plant researchers caution that it may not be possible to produce either natural compounds or engineered proteins on an economically feasible scale. Elizabeth Hood, a plant researcher at ProdiGene in College Station, Texas, who has helped pioneer the production of avidin in maize, describes Raskin's technique as “a neat idea.” But she adds, production “depends on greenhouse space and hydroponics, and those are very expensive.”

    Raskin counters that much supermarket produce, including tomatoes and cucumbers, is grown hydroponically, and consumers don't seem to object to the price. And he points out that making therapeutic proteins in mammalian or bacterial cells requires even more costly incubation techniques and sterile conditions. “Rhizosecretion is not going to work for all proteins,” he maintains, “but it will work for some.”

  14. ASTRONOMY

    Subaru Sees an Unruly Pair

    1. Dennis Normile

    A pair of embryonic stars, swaddled in gas and dust, emit parallel jets in an image from Japan's new 8.3-meter Subaru Telescope on Mauna Kea in Hawaii. Material is still collapsing to form these protostars, a system called L1551 IRS5, about 450 light-years from Earth. Theorists believe that it spirals inward to form a disk around each newborn star's equator. Some of the material acquires so much momentum, however, that it is thrown out in jets emanating from the poles.

    These jets, which stretch 1500 times the distance from Earth to the sun, are pointed toward Earth and are visible partly because a stellar wind from the protostars has swept away material along the line of sight. A second set of parallel jets likely points in the opposite direction but is hidden by intervening dust and gas.

    The Hubble Space Telescope discovered the jets, but Subaru is the first Earth-based telescope to see them. Its analysis has revealed the temperature of the jets—several thousand degrees—and their composition, showing that the hot gas is rich in ionized iron. Resolving the jets in such detail from the ground “is a pretty good trick,” says Alan Boss, an astrophysicist at the Carnegie Institution of Washington. “It's really a tribute to this great telescope,” he says, adding that Subaru's analysis could also sharpen astronomers' understanding of star formation.
