News this Week

Science 31 Jul 1998:
Vol. 281, Issue 5377, p. 622

    New Study Points to Eurasian Ape as Great Ape Ancestor

    Ann Gibbons

    Scientists trying to trace the origins of the great apes—including humans—have come up with several candidates for the honor of being the last common ancestor to the living great apes. The most widely accepted nominees, such as Kenyapithecus, Proconsul, and Morotopithecus, have been extinct African apes (Science, 18 April 1997, p. 355). But new results reported in the 30 July/15 August issue of Current Biology point to a different—and highly controversial—conclusion.

    Based on a synthesis of data from both fossil analyses and comparisons of DNAs obtained from living apes and monkeys, molecular evolutionist Caro-Beth Stewart of the State University of New York, Albany, and molecular anthropologist Todd Disotell of New York University propose that the ancestor was an unknown ape from Europe or Asia that dispersed into Africa 10 million years ago. To a small group of paleontologists who had already been advocating a link between fossil apes from Europe and modern apes, the findings are welcome news, especially because DNA and fossil data have not always been in such good agreement. “It is gratifying that independent lines of evidence suggest a similar interpretation,” says one such paleoanthropologist, David Begun of the University of Toronto.

    But the agreement between molecules and fossils is winning few new converts. “I don't think we have enough evidence to settle the issue,” says Harvard University paleoanthropologist David Pilbeam, who thinks the weak link is the fossil data. Not only is there a gaping hole in the fossil record in Africa during the time when the living African apes evolved, but Pilbeam also doubts whether the fossilized bones of extinct apes will ever offer enough clues to fill in the branches of the ape family tree reliably.

    In coming to their conclusion, Stewart and Disotell constructed a phylogenetic tree showing the evolutionary relationships of extinct and living apes and Old World monkeys. First, they placed living apes and monkeys on branches of the tree, based on data from dozens of different studies, by their group and others, comparing the DNA sequences of the primates. Differences in the sequences indicated how close or far apart the primates' perches should be. To determine when the branches split apart, the researchers applied dates from a molecular clock used by molecular phylogeneticist Morris Goodman of Wayne State University in Detroit and colleagues, which converts DNA differences into a time since two lineages split apart.
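The clock arithmetic described above can be sketched in a few lines. The rate and distance values below are illustrative stand-ins, not Goodman's actual calibration:

```python
def divergence_time(genetic_distance, subs_per_site_per_myr):
    """Convert a pairwise genetic distance into a split time.

    After two lineages split, differences accumulate along both
    branches, so the time since divergence is d / (2r).
    """
    return genetic_distance / (2.0 * subs_per_site_per_myr)

# Illustrative numbers only: 2% sequence divergence at a rate of
# 0.001 substitutions per site per million years per lineage
# implies a split roughly 10 million years ago.
print(divergence_time(0.02, 0.001))
```

The factor of 2 is the step most easily missed: each lineage accumulates change independently, so the observed distance reflects twice the elapsed time.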

    To see where extinct primates fit in this molecular tree, Stewart and Disotell consulted several new computational syntheses of data from the fossil record, by Begun's group and others. They put these fossils on the molecular tree and found that the dates when extinct species diverged from each other were remarkably similar to those predicted by the molecular studies. “It all fell together so perfectly that I felt like I was cheating,” says Stewart.

    She and Disotell then labeled their tree of monkeys and apes to show the geographical homes of both living and extinct species. Finally, with the help of a computer, they set about testing migration scenarios that could produce such a family tree. The scenario that worked best shows that at least one African ape left the continent about 18 million to 20 million years ago, “when Africa was chockful of apes,” says Disotell. Its descendants eventually gave rise to an array of early apes in Asia and Europe, including the ancestors of gibbons and orangutans in Asia. By about 10 million years ago, one of those Eurasian apes moved back to Africa, where it became ancestor to the living African apes and humans.

    The authors favored this scenario because it requires only two migrations—one out of and one back into Africa. In contrast, say Disotell and Stewart, if an extinct ape in Africa gave rise to all the great apes, at least six of its descendants would have had to migrate out of Africa to produce gibbons, orangutans, and at least four lineages of fossil apes in Eurasia. That seems unlikely, says Stewart, as the continents probably were not continuously connected by the forest habitat critical for most primates.
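The migration counting behind this comparison resembles Fitch parsimony over geographic labels: walk the tree from the tips, and each time two branches disagree about their homeland, score one move. The toy tree below is a hypothetical illustration, not Stewart and Disotell's published phylogeny:

```python
def min_changes(tree):
    """Fitch parsimony: the minimum number of state changes (here,
    intercontinental moves) needed to explain the tip labels.

    A tree is either a leaf label (str) or a (left, right) tuple.
    Returns (possible ancestral states, change count).
    """
    if isinstance(tree, str):
        return {tree}, 0
    (ls, lc), (rs, rc) = min_changes(tree[0]), min_changes(tree[1])
    common = ls & rs
    if common:                    # children agree: no new move needed
        return common, lc + rc
    return ls | rs, lc + rc + 1   # disagreement: one move at this node

# Toy tree, labeled by continent only: an African outgroup, Asian
# gibbons/orangutans, a Eurasian fossil ape, and African great apes.
toy = ("Africa", (("Asia", "Asia"), ("Eurasia", "Africa")))
states, changes = min_changes(toy)
print(changes)  # → 2, the out-of-Africa-and-back scenario
```

Two changes suffice on this labeling, which is the sense in which the out-and-back scenario is the most parsimonious.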

    Begun notes that the migration of a Eurasian ape back to Africa in the past 10 million years fits with his finding that two fossil apes from Europe—the 9-million-year-old Ouranopithecus and the slightly older Dryopithecus—share more traits with the modern African great apes than do extinct African apes. In his view, the modern apes must be close relatives of the Eurasian ancestor.

    But other paleoanthropologists, such as Monte McCrossin and Brenda Benefit of Southern Illinois University in Carbondale, sharply contest Begun's interpretation of the fossil record, which they argue rests on incorrect identifications of fossil ape anatomy. In particular, they contend that Kenyapithecus, a 14-million-year-old ape from Kenya, resembles living African apes more closely than do Ouranopithecus or Dryopithecus.

    The fossil record is simply too fragmentary to settle this question, says Pilbeam: “The fact you could end up with people saying it's Africa, Asia, and Turkey—these are basically trial balloons being floated.” The tracks of great ape ancestry are still faint.


    Clinton's R&D Chiefs Waiting on Sidelines

    Andrew Lawler

    With the November congressional elections approaching, Senate Republicans are in no mood to rubber-stamp President Bill Clinton's choices for senior Administration posts. That was clear last week at a hearing on the nomination of U.N. envoy Bill Richardson to head the Department of Energy (DOE), as Republican senators blasted Clinton's handling of nuclear waste, the nuclear stockpile, and global climate change issues.

    Richardson is the latest high-level Administration R&D nominee held hostage to partisan wrangling between Congress and the White House over issues unrelated to the nominees' qualifications. Science advocates worry that the delays could jeopardize R&D programs as the Administration begins work on the 2000 budget.

    Perhaps the most frustrated of the science officials-in-waiting is one who has already been confirmed: microbiologist Rita Colwell, who was approved on 22 May as director of the National Science Foundation (NSF). The problem is that her predecessor, Neal Lane, is still awaiting confirmation as director of the White House Office of Science and Technology Policy and, thus, has not officially vacated his NSF post. Neither appointment is controversial; Lane won plaudits from Republican and Democratic senators at his confirmation hearing, and lawmakers dispensed with a hearing for Colwell altogether. Rather, the delays are due both to White House tardiness in completing paperwork and to an election-year reluctance by Republicans to help Clinton rebuild his team.

    The partisanship was on display last week during a 4-hour grilling of Richardson, a former Democratic member of Congress from New Mexico, by the Senate Energy and Natural Resources Committee. The sternest words came from Senator Larry Craig (R-ID), who warned that he is prepared to oppose Richardson's confirmation until the White House lays out a clear plan to store spent nuclear fuel from commercial reactors. Under Senate rules, the objections of a single senator can delay a floor vote once the committee acts.

    DOE is legally bound to store the waste, but the long-term storage facility at Yucca Mountain in Nevada will not be ready until well into the next century. Some lawmakers want an interim facility, but the White House opposes this option because it could divert money and attention from the long-term solution. Craig complained that the White House has not allowed previous energy secretaries to negotiate an interim plan with Congress. But Richardson, who insisted that decisions about nuclear waste disposal “will be based on science and not politics,” reminded legislators that “I can't deal with these issues until you confirm me.”

    Republican lawmakers also questioned the Administration's efforts to curb global climate change, pursued despite stiff congressional opposition to the Kyoto treaty negotiated last December, and blasted its oil and gas policies, which they maintain are hurting domestic producers. And some, such as Senator Jesse Helms (R-NC), argued that the department should be abolished, although the idea has garnered little real political support in either the House or Senate.

    At the same time, many senators said that holding up the confirmation would be counterproductive. “DOE needs a leader, a Cabinet officer, as quickly as possible,” said Senator Pete Domenici (R-NM), who chairs the panel that appropriates DOE funding. Former DOE secretary Federico Peña left last month to join Vestar Capital Partners, a New York investment firm, leaving Deputy Secretary Betsy Moler as acting chief. Moler is expected to resign once a new secretary is confirmed; Administration officials say a leading candidate to succeed her is T. J. Glauthier, who now oversees energy, space, and science issues at the Office of Management and Budget.

    Administration officials hope all of the R&D nominees will be confirmed before Congress leaves in early August for a monthlong recess. That would let them play a role in developing agency requests for the 2000 budget, which are submitted in the fall. Science advocates fear that the absence of senior officials like Richardson and Lane could hurt R&D programs. But given the Senate's backlog of some 140 nominees, a stack of other pressing business, and continuing partisan tensions, the would-be R&D chiefs may be forced to cool their heels for a while longer.


    U.S., Ukraine Launch New Chernobyl Lab

    Richard Stone

    Every summer for the past 6 years, U.S. ecologist Ron Chesser dons his moonsuit and respirator and prowls the marshes near the Chernobyl nuclear power plant. The site is not on any travel agent's list of popular destinations, but it does offer Chesser exactly what he wants—a supply of voles, striped field mice, and other small mammals that are markers for the ecological health of a region 12 years after the world's worst nuclear accident. At the end of every field season, however, Chesser must leave behind certain samples, such as highly radioactive biological tissue or soil, that cannot be taken out of the country and transported to his lab at the Savannah River Ecology Laboratory in Aiken, South Carolina. “It's been pretty limiting,” he admits.

    But things are about to get a bit easier for Chesser and other researchers who venture into Chernobyl's forbidden zone. Last week, Vice President Al Gore and Ukraine President Leonid Kuchma unveiled plans for an International Radioecology Laboratory at Chernobyl, funded jointly by the U.S. and Ukrainian governments. The lab, which will study everything from genetic mutations in local wildlife to radionuclide movement and cleanup technologies, should be up and running by next summer. “We place great hopes in this new facility,” says Anatoly Nosovsky, director of the Slavutych Laboratory of International Research and Technology, a nearby research center devoted to nuclear safety and cleanup technologies.

    When the Chernobyl power plant's reactor number 4 exploded on 26 April 1986, it released into the air as much as 150 million curies of radiation, much of which settled onto nearby land. Authorities created a 30-kilometer “exclusion zone” around the nuclear plant, evicting more than 135,000 people and limiting access mostly to plant workers, cleanup crews, and scientists. As a result, the exclusion zone has become a unique ecological laboratory in the shadows of the still-operating power plant.

    But what little money the Ukrainian government spends for research in the exclusion zone goes mostly to study hazards from the nuclear fuel remaining in the burned-out reactor core and the weakening sarcophagus that covers it (Science, 19 April 1996, p. 352). “There are not enough experts in radioecology” in the zone now, says geologist Valentin Radchuk, who heads the Department of Scientific Programs for Ukraine's Cabinet of Ministers. Foreign ecologists can stay only for short stints, and they often must tailor their research to fit whatever analyses can be done on equipment at Chernobyl or nearby Kiev. The new lab will be able to tackle many problems, including contaminated groundwater and wind-borne radioactive dust.

    Recognizing a compelling need for the lab, officials from various Ukrainian Ministries and the U.S. Department of Energy last October began hammering out the details. The $1.3 million agreement, signed at the second meeting of the U.S.-Ukraine Joint Commission in Kiev, calls for Ukraine to house the facility and pay its utility bills, and for the United States to furnish it with top-of-the-line instruments for separating radionuclides and carrying out other analyses. The lab should also help cut through red tape that stymies work in the most dangerous areas in the exclusion zone. “The greatest contribution of the new lab,” says Robert Baker of Texas Tech University in Lubbock, who collaborates with Chesser at Chernobyl, is that “we'll be more likely to get permission to work in areas most in need of research.” Scientists from the United States and Ukraine will meet in Chernobyl next month to draw up a list of necessary equipment and discuss their research strategy.

    In the meantime, Ukrainian officials are searching for a suitable home for the lab. The leading candidate is an unfinished building intended as a hotel-resort in the ghost town of Pripyat, situated across a lake from the nuclear plant. Such a setting would also serve as a constant reminder of the accident. “It's very sobering,” says Chesser. “You never get complacent.”


    Outside Insider Named to Head EPA Research

    Jocelyn Kaiser

    The White House this week tapped a veteran Washington insider for the top research post at the Environmental Protection Agency (EPA), a job vacant for over a year. The choice of Norine Noonan, a biologist-turned-bureaucrat without previous ties to EPA, is raising eyebrows. But some observers say Noonan's expertise as a scientist who knows the ropes in Washington—she spent a decade on Capitol Hill and at the White House before becoming vice president for research and dean of the graduate school at the Florida Institute of Technology in Melbourne—will stand her in good stead in defending the $500 million research budget at EPA, an agency often accused of giving science short shrift. “I think that's what they need in that job,” says Howard University toxicologist Bailus Walker.

    The previous chief at EPA's Office of Research and Development, marine ecologist Bob Huggett, presided over a sometimes painful overhaul of EPA science launched in 1994 that includes shifting research dollars from agency staff to outside scientists and forcing EPA researchers and risk managers to work more closely together (Science, 21 January 1994, p. 312). Huggett left in June 1997 to become research vice president at Michigan State University in East Lansing.

    Noonan earned a Ph.D. at Princeton University in biochemistry and cell biology in 1976 but soon moved on to a congressional science fellowship and then to the White House Office of Management and Budget (OMB), where she oversaw budgets for the National Science Foundation and NASA. “She was very professional, very hard-nosed, asked all the right questions,” a House staffer says. “I would rather have had a really strong scientist again,” admits Linda Birnbaum, a dioxin researcher at EPA's health effects lab in Research Triangle Park, North Carolina. But she and others say they're relieved a nominee has finally been chosen.

    Noonan must now be confirmed by the Senate. Her “first order of business,” she says, “is to get to know the organization.” EPA watchers and Noonan both agree she has a lot to learn. Her selection, she says, “is as interesting a choice for me as it is for them.”


    Vanishing Pools Taking Species With Them

    Kathryn S. Brown*
    *Kathryn S. Brown is a science writer in Columbia, Missouri.

    Near the end of Noble Drive in San Diego, past a row of condos, the city has erected a chain-link fence to protect a patch of dried mud. To understand why, one must look beneath the surface—or wait a few months. Come winter, the rainy season, this sun-baked plot turns into a pond teeming with fairy shrimp and plants, some of which are on the federal endangered species list.

    These unusual species spring to life in rainwater ponds, called vernal pools, that linger until late spring or summer every year before evaporating. But strategies to save these ecosystems are falling short, according to new data presented last month at a joint meeting of the Ecological Society of America and the American Society of Limnology and Oceanography in St. Louis. Surveys suggest that up to a third of vernal-pool crustaceans thought to have existed in California in the mid-1800s have gone extinct. “It's death by 1000 small wounds,” warns ecologist Gordon Orians of the University of Washington, Seattle. “If we were to lose just one pond or one species, would it matter? Probably not. But the first one goes. Then, the next. And the next. Finally, the cumulative effect on biodiversity is devastating.”

    The crustaceans are dwindling because the pools themselves are a vanishing breed. It is hard to track the ephemeral habitats, formed when rainwater collects in depressions lined with thick clay. But historical soil surveys in California's Central Valley suggest that a century ago, vernal pools occurred on 1.64 million hectares in this region alone, says Bob Holland, an ecologist who contracts for state agencies. Now, the pools return to less than 400,000 hectares in the valley, he says. Fueling the decline are development and agriculture, says Ellen Bauder, an ecologist at San Diego State University. In San Diego, she says, over 90% of vernal pools spotted in aerial photos 70 years ago no longer come back.

    For years, hardly anyone noticed the pools were disappearing—until scientists started counting species. Bauder first documented the decline of vernal pool plants, 13 of which are now endangered. Then in 1992 and 1994, a team led by biologist Marie Simovich of the University of San Diego sampled vernal pools in San Diego and throughout the Central Valley. They tallied 80 crustacean species, many existing in only a few pools. Losing these crustaceans could have ramifications up the food chain, says Simovich. Fairy shrimp, for example, are eaten by mallards and other migratory birds that winter in California.

    Plugging vernal pool loss and Simovich's numbers on species range into a computer model that forecasts extinctions, Jamie King of the Environmental Protection Agency in Annapolis, Maryland, has estimated that up to a third of the crustaceans that lived in the Central Valley's pools 150 years ago have since gone extinct. “Given that most crustacean species occur only in a few pools, you don't have to lose much habitat before you lose a lot of diversity,” King says.
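King's point—that narrowly distributed species vanish quickly as habitat disappears—can be illustrated with a toy simulation. The pool counts and occupancy numbers below are invented for illustration and are not taken from his model:

```python
import random

def surviving_species(n_pools, n_species, pools_per_species, loss_frac, seed=0):
    """Toy extinction forecast: each species occupies only a few pools,
    and a species goes extinct when every one of its pools is destroyed."""
    rng = random.Random(seed)
    destroyed = set(rng.sample(range(n_pools), int(loss_frac * n_pools)))
    survivors = 0
    for _ in range(n_species):
        pools = rng.sample(range(n_pools), pools_per_species)
        if any(p not in destroyed for p in pools):
            survivors += 1
    return survivors

# With ~75% of pools gone (roughly the Central Valley decline) and
# species restricted to 3 pools apiece, a large share of the 80
# hypothetical species lose every pool they occupy.
print(surviving_species(1000, 80, 3, 0.75))
```

With only three pools per species, the chance of losing all of them under 75% habitat loss is roughly 0.75³ ≈ 42%, which is why modest-seeming occupancy numbers translate into large projected extinctions.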

    Hoping to thwart further losses, the U.S. Fish and Wildlife Service in the past year has bought two San Diego tracts with vernal pools and says it plans to buy more. The city itself is guarding some pools, including the one on Noble Drive. And Miramar, the Marine Corps Air Station just outside San Diego, has hired contractors to restore 116 vernal pools on its 9300-hectare base—an anticipated 5-year, $1 million project that will involve, among other things, sculpting depressions and stocking them with fairy shrimp, plants, and other vernal species.

    But conservation strategies on private lands—which aim to replace destroyed vernal pool habitat with an equal amount of newly created habitat—are bogged down in disputes. Some landowners complain about regulators spying on private property in the hopes of catching citizens filling in mere puddles. “It's a nightmare,” says Bruce Blodgett, director of national affairs for the California Farm Bureau Federation in Sacramento, who worries that farmers could lose cropland to restored vernal pools. Scientists decry the strategy for another reason: “This ‘no net loss’ approach ignores the fact that some fairy shrimp species live in one pool but not in another,” says Simovich, who wants to see pools with rare species conserved, not re-created. “The conflict,” adds Bauder, “is getting worse.”


    Did Twisty Starlight Set Stage for Life?

    Robert Irion*
    *Robert Irion is a science writer in Santa Cruz, California.

    In their quest to trace the origins of life on Earth, scientists keep confronting a puzzle: How did vital molecules get their distinct twists? Nearly all the amino acids in proteins are “left-handed” (L), a designation for one of two mirror-image configurations of atoms around a carbon center. On the other hand, the sugar backbones of DNA and RNA always spiral to the right. This uniform handedness, or homochirality, could have arisen in the course of evolution, either by chance or because such shapes somehow aid DNA replication or protein synthesis. Or it may have preceded life: Some researchers argue that our infant solar system was seeded with L amino acids formed in cool interstellar clouds, which then rode to Earth aboard comets, meteorites, and dust.

    That scenario receives a boost this week with a report on page 672 describing the first evidence of a possible space-borne mechanism. A team led by Jeremy Bailey of the Anglo-Australian Observatory near Sydney has spotted circularly polarized infrared light—in which the electromagnetic wave rotates steadily—streaming from a region of intense star birth in the Orion Nebula. Ultraviolet (UV) light polarized this way can selectively destroy either left- or right-handed (D) amino acids, depending on the direction of spin. If similar radiation bathed the dust around our newborn sun some 4.6 billion years ago, says team member James Hough of the University of Hertfordshire in Hatfield, England, “it could have created the necessary precursors to life's [handedness]. This process would produce a much higher excess [of L amino acids] than anything that could occur on Earth.”

    The findings are “quite exciting,” adds organic geochemist John Cronin of Arizona State University in Tempe, who has found a surplus of L amino acids in two meteorites that hit Earth this century and thinks such space-borne amino acids might have set the pattern for ones made later on Earth. Origin-of-life experts have a different spin. “There are so many problems” with the scenario, says biogeochemist Jeffrey Bada of The Scripps Institution of Oceanography in La Jolla, California, who doubts that large quantities of amino acids from space would have survived the journey to Earth or hung around long enough to influence early biology. “I doubt this will settle the issue of how homochirality arose.”

    Those who favor an unearthly genesis for homochirality have for years pointed to circular polarization as a possible trigger. Astronomers have seen high levels of such radiation near binary stars and in other exotic settings with strong magnetic fields. Now, Bailey's team has found it in an environment much like the one that spawned our solar system. They studied the Orion Molecular Cloud, a cauldron of star formation, with an infrared camera on the 3.9-meter Anglo-Australian Telescope. They found that up to 17% of the infrared light streaming from Orion was circularly polarized, presumably by scattering off fine dust grains aligned in a magnetic field. “That was a big surprise,” says Bailey, who had expected levels of 1% to 2%.

    Infrared light, however, does not pack the energy needed to destroy organic molecules. That would take UV light. Although Bailey's colleagues could not see UV light from Orion because of obscuring dust, they calculate that a similar percentage of UV light should also be circularly polarized. If such light from a nearby star cascaded through our early solar system, it could have broken the bonds in enough D amino acids to yield one extra L amino acid for every 10 molecules—enough of an excess for early organisms to seize upon and amplify. Other planetary systems, depending on the direction of polarization, might see an excess of D amino acids.
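The size of such a photochemical excess can be sketched with simple first-order photolysis kinetics. The rate ratio and UV dose below are arbitrary illustrative values, chosen only to show how a ~10% excess of the kind the scenario requires can emerge:

```python
import math

def enantiomeric_excess(k_L, k_D, dose):
    """Remaining L-excess after circularly polarized UV destroys the
    two mirror forms at different first-order rates.

    Starts from a racemic (50:50) mixture; ee = (L - D) / (L + D).
    """
    L = math.exp(-k_L * dose)
    D = math.exp(-k_D * dose)
    return (L - D) / (L + D)

# Hypothetical rates: the D form is destroyed 5% faster than the L.
ee = enantiomeric_excess(k_L=1.0, k_D=1.05, dose=4.0)
print(f"{ee:.3f}")  # → 0.100, i.e., a ~10% L excess
```

Note the trade-off the kinetics makes plain: a larger excess requires a larger dose, which also destroys more of the total amino acid stock—part of why skeptics question whether enough material would survive.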

    Even so, Bailey and Hough acknowledge, many events must fall into place to render their scenario plausible. Those steps include making huge amounts of amino acids in space and delivering them to Earth without losing the surplus to “racemization”—the spontaneous transformation of homochiral molecules to an even-handedness that happens quickly at high temperatures and in water. “I consider each of those steps to be possible,” says planetary scientist Christopher Chyba of the University of Arizona, Tucson, noting Cronin's recent discovery of L amino acid excesses ranging from 3% to 9% in the Murchison meteorite, which fell in Australia in 1969 (Science, 14 February 1997, p. 951), and in a 1949 meteorite from Kentucky. “The open question is, would such an excess be important to the origin of life?”

    Bada and his colleague at the University of California, San Diego, chemist Stanley Miller, think not. “Once the amino acids get to Earth, they would racemize in very short order,” Miller says. “I've always felt that homochirality arises by chance.”


    A Sulfurous Start for Protein Synthesis?

    Gretchen Vogel

    Although Charles Darwin proposed that life originated in a warm, nourishing broth, new evidence supports a less cozy idea: that the cradle of life was more like a Puritan minister's version of hell—a sulfurous swirl of superheated water and oozing magma. On page 670, chemist Claudia Huber and patent attorney Günter Wächtershäuser report that they have re-created a crucial step in assembling the ingredients of living cells—the linking together of amino acids into short, proteinlike chains called peptides—under just such harsh conditions.

    Although other researchers have achieved a similar feat in the lab, they generally did so with the help of additives or conditions not likely on early Earth. The amino acids had to be kept dry, for instance, or be activated by compounds not found in nature. In contrast, Wächtershäuser says, his system “uses nothing more than what is available in volcanic exhalations”—the magma and pressurized gases that suddenly hit cooler ocean water at cracks in Earth's crust. James Ferris of the Rensselaer Polytechnic Institute in Troy, New York, agrees: “These peptides are made under plausible prebiotic conditions. You don't have to throw anything in that is artificial.”

    Indeed, says evolutionary biologist Norman Pace of the University of California, Berkeley, the peptide formation is “very exciting,” both for its novel chemistry and for what it may imply about life's origins. These experiments support Wächtershäuser's theory, originally proposed in 1988, of how the first ingredients of living organisms might have assembled on the surface of minerals near underwater volcanic gas vents.

    Wächtershäuser had made his suggestion in the wake of reports from geologists that cast doubt on the idea that the first life-forms on Earth might have arisen in what Darwin called a “warm little pond.” Those reports suggested that such a temperate pond might not have existed 4 billion years ago, when life is thought to have had its genesis, because Earth was much hotter then, seething with volcanoes and enduring a bombardment of comets and asteroids.

    So, Wächtershäuser, who holds a Ph.D. in organic chemistry, proposed instead that the iron and nickel sulfide minerals that collect near the volcanic vents might have catalyzed the formation of the first biomolecules from carbon monoxide and other gases belched from escaping magma. The sulfide minerals carry a positive charge on their surfaces, and Wächtershäuser believes organic molecules with negative charges would have accumulated there and continued to react with each other, forming many of the precursors for life.

    The basic biomolecules he had to explain include amino acids, the building blocks of proteins. Last year, Wächtershäuser and Huber, of the Technical University of Munich, took one step toward demonstrating the feasibility of making amino acids in conditions similar to those at the vents. They showed they could make activated acetic acid, a starting material for amino acid synthesis, by mixing carbon monoxide and hydrogen sulfide with a slurry of nickel sulfide and iron sulfide particles at 100 degrees Celsius (Science, 11 April 1997, p. 245).

    The researchers have not yet shown that this recipe can produce actual amino acids, but the current work indicates that if amino acids do form at the vents, they could hook up to form peptides. Huber and Wächtershäuser added amino acids to the same sulfide slurry, and within a few days they could detect a range of dipeptides, consisting of two amino acids linked together, as well as a few tripeptides, containing three amino acids. The researchers propose that, catalyzed by the iron and nickel sulfides, carbon monoxide and hydrogen sulfide bind to the amino acids and convert them into a reactive form.

    Not all specialists in the origins of life are convinced that this lab demonstration proves that the same thing could have happened naturally, however. Stanley Miller, a biochemist at the University of California, San Diego, says that concentrations of carbon monoxide, which activates the amino acids in Wächtershäuser's reaction, are much lower in nature than in the experiment. And even if the reaction could occur in nature, it would not be adequate to form proteins that contain many amino acids, says Miller, who favors a cooler beginning for biomolecules.

    Pace adds a caution that applies to any lab effort to create biomolecules. “It's a very long leap,” he says, “from [mineral] surface chemistry to a living cell.”


    Molecular Imaging Beats Limits of Light

    Rob van den Berg*
    *Rob van den Berg is a science writer in Leiden.

    Leiden, The Netherlands—Researchers can map single atoms or molecules on surfaces almost as routinely as cartographers map hills and lakes, thanks to instruments like the scanning tunneling microscope. But below the surface, they start to lose their bearings. Microscopes equipped with sensitive detectors can pick up individual fluorescent molecules in a liquid or solid, but they generally cannot distinguish the molecules if they are separated by less than a wavelength of light. In today's issue of Chemical Physics Letters, however, physicists bring new accuracy to subsurface molecular imaging.

    The researchers, Jürgen Köhler of Leiden University and his colleagues, took advantage of tiny differences in the way chemically identical fluorescent molecules respond to light, depending on their immediate surroundings. Even close neighbors can be distinguished if they are probed with a laser that delivers light at a very precise frequency, the group showed. They managed to determine the positions of seven molecules in a matrix of another material to within a few tens of nanometers, perhaps a tenth of a wavelength of light. That is more than 10 times the resolution of earlier techniques—“a beautiful experimental demonstration of a way to increase the resolution of optical measurements,” says W. E. Moerner of the University of California, San Diego, a pioneer in single-molecule spectroscopy.

    Molecular imaging has been hampered by the diffraction limit, an intrinsic blurring of light that prevents two sources from being resolved when they are close together. Only near-field microscopy, which makes use of tiny optical fibers that are narrower than a single wavelength, can pinpoint fluorescent molecules with subwavelength accuracy. However, it sacrifices depth information for two-dimensional precision.

    In 1995, Eric Betzig, of NSOM Enterprises, proposed a way to get around the diffraction limit. Each molecule in a solid matrix finds itself in a slightly different structural environment because of random strains and imperfections. As a result, each one has an absorption line at a slightly different frequency. This shift is generally very small, but at low temperatures it can be resolved with a tunable laser that generates a precise frequency of light. “Molecules which can normally not be spatially separated are clearly distinguished,” says Köhler.

    Köhler and his colleagues illustrated the method on a sample of pentacene, an aromatic hydrocarbon, in a host crystal of p-terphenyl. Pentacene fluoresces strongly when excited by laser light. By moving the focus of the laser through the sample in three dimensions and determining the position of the fluorescence maximum for each molecule, the group could pinpoint its location with an accuracy well below the diffraction limit.
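The localization step described above — scanning the laser focus and pinpointing each molecule from its fluorescence maximum — can be sketched in a few lines. This is an illustrative one-dimensional toy, not the group's actual analysis; the spot width, step size, and molecule position are invented numbers:

```python
import numpy as np

# Hypothetical numbers: a diffraction-limited spot ~300 nm wide,
# sampled every 50 nm along one scan axis.
SPOT_SIGMA_NM = 300.0
STEP_NM = 50.0
TRUE_POS_NM = 137.0  # the molecule's actual position (unknown to the experimenter)

def fluorescence(x_nm):
    """Idealized fluorescence signal at laser focus position x (no noise)."""
    return np.exp(-0.5 * ((x_nm - TRUE_POS_NM) / SPOT_SIGMA_NM) ** 2)

# Scan the focus across the sample and record the signal.
scan = np.arange(-1000.0, 1000.0 + STEP_NM, STEP_NM)
counts = fluorescence(scan)

# Locate the maximum to sub-step accuracy with a parabolic fit
# through the brightest sample and its two neighbors.
i = int(np.argmax(counts))
y0, y1, y2 = counts[i - 1], counts[i], counts[i + 1]
offset = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)  # in units of STEP_NM
estimate = scan[i] + offset * STEP_NM

print(round(estimate, 1))  # recovers ~137 nm, well below the spot width
```

Even on a 50-nanometer grid, fitting the peak recovers the position to within a couple of nanometers in this noise-free sketch — the same principle that lets the Leiden group beat the diffraction limit, since each molecule can be isolated by its absorption frequency before its maximum is located.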

    Thomas Schmidt of the University of Linz in Austria thinks that by using more strongly fluorescing molecules and computerizing the setup, the group should be able to image molecules in minutes rather than hours. That could open the way to minute, three-dimensional mapping of the cell. Researchers might, for example, label genes with different fluorescent molecules, then determine the precise positions of these marker molecules to learn, say, how the DNA twists and coils. Köhler and his colleagues, says Niek van Hulst of the University of Twente in the Netherlands, “are pushing optical microscopy to its limits.”


    A Shadow Falls on Hepatitis B Vaccination Effort

    1. Eliot Marshall

    Critics blame the widely used vaccine for many ills, citing anecdotes and a theory of molecular mimicry, similar to one proposed for Lyme disease arthritis—but scant data

    Hepatitis B seems to be the perfect target for a vaccine. Spreading quietly through blood contact, sex, and birth, the virus currently infects 350 million people worldwide, according to the World Health Organization (WHO)—mostly without producing symptoms. But in a fraction of cases, those infections lead to liver failure or liver cancer, deadly complications that each year kill an estimated 1 million people around the world and about 4000 in the United States.

    In the 1980s, public health officials in Europe and the United States sought to reduce this toll by mandating the immunization of adults in high-risk categories, such as health-care workers, and, in 1991, all newborns. But now a shadow has fallen across the vaccination campaign.

    A growing number of those who have received the vaccine shots—although just a tiny fraction of the 200 million immunized—claim to have experienced serious adverse effects. Their medical complaints cover a spectrum of autoimmune and nervous system disorders, including rheumatoid arthritis, optic neuritis, and neurodegenerative illnesses that resemble multiple sclerosis (MS). Now, some researchers are proposing that the vaccine tricks recipients' immune systems into attacking their own tissues. And a legal onslaught may be beginning.

    Already, several groups are seeking compensation from governments and manufacturers or demanding that mandatory vaccination be stopped. On 17 July, for example, an alliance of antivaccine activists in France filed a lawsuit in Paris against the national government, accusing it of understating the vaccine's risks and exaggerating the benefits for the average person. The attorneys, who seek a criminal inquiry, claim to represent 15,000 people. And in the United States, several East Coast lawyers representing clients with hepatitis vaccine claims held a summit near Washington, D.C., in July to discuss strategy. ABC's 20/20 program, meanwhile, is filming a documentary on hepatitis vaccine risks that may be aired this fall.

Vaccine safety officials interviewed by Science say they have seen no evidence that autoimmune diseases like MS are appearing at a higher rate among vaccinated people. Indeed, a French government study last year found that vaccinated people were less likely to have MS. “These fears [of the hepatitis B vaccine] are quite unfounded,” according to Mark Kane, director of the hepatitis vaccine program at WHO. But he and other health officials worry that scary publicity about the vaccine could interfere with the drive for immunization.

    They also worry that they may get caught in what Robert Chen, vaccine safety chief at the Centers for Disease Control and Prevention (CDC) in Atlanta, refers to as a Catch-22. Chen and his colleagues say they are taking the claims of injury seriously. Epidemiological studies to see if there is a link between hepatitis B vaccination and MS—the most publicized concern—have already begun, and some data may be available next year. But, perhaps mindful of a similar controversy over whether silicone breast implants cause autoimmune diseases, Chen and his colleagues fear that any study—even if results prove negative—will add legitimacy to the claims. According to Chen, who commissioned one of the new studies, “The folks who oppose the vaccine will say … that just because we are looking at it, that must mean there is an association” between the vaccine and the illnesses.

    Concern about the vaccine appeared early in France and now seems to have gained the most attention there. Physician Philippe Jacubowicz, who heads an organization in Paris called REVAHB, has collected data on more than 600 cases of illnesses, many with MS-like symptoms, in people who had received the hepatitis B vaccine. In addition, patient advocacy groups in Britain and Canada have studied more than 100 cases each, as has an outspoken U.S. accuser of the hepatitis B vaccine, Bonnie Dunbar, a molecular biologist at Baylor College of Medicine in Houston.

    A developer of contraceptive vaccines herself, Dunbar is a forceful critic. And she is motivated by personal experience: Her brother developed immune problems that she believes were triggered by the hepatitis B shots he had to get when he became a health care worker. Dunbar says that when she began investigating, she found that other medical colleagues had experienced or knew about such reactions. One nurse, for example, attributed a dozen cases of MS to vaccination.

    To support her case, Dunbar is culling data from a list of more than 20,000 reports of miscellaneous adverse reactions to hepatitis B vaccination, filed with the Food and Drug Administration's (FDA's) Vaccine Adverse Event Reporting System (VAERS). FDA officials themselves have so far identified 111 MS cases in VAERS that appeared after vaccination, but they say a review of the medical records from these cases has turned up no evidence that they were actually caused by the vaccine.

    Dunbar thinks the FDA may be overlooking a possible biological mechanism. To explain the apparent bad reactions, she postulates that a hepatitis B surface protein used as an antigen in the recombinant vaccine (HBsAg) may provoke an autoimmune attack on a similar protein in the nerves or other tissues of a genetically susceptible group of vaccine recipients. This “molecular mimicry” scenario is at least plausible.

    In this issue of Science, for example, researchers report evidence that the Lyme disease organism can trigger arthritis in this way (see pp. 631 and 703). And other molecular biologists have published papers arguing that the herpesvirus triggers MS and an eye disease called stromal keratitis through molecular mimicry. Still others think the Coxsackie virus induces diabetes through such mimicry. To be sure, these scientists have laboratory results to support their proposals—something Dunbar lacks, although she plans to undertake such studies in collaboration with an immunogeneticist and a hepatitis virus expert at the University of Oklahoma. A grant application they submitted to the National Institutes of Health (NIH) has now been turned down twice, however. Dunbar says she may even try to pay for the research herself.

Other vaccine experts are skeptical of the molecular mimicry thesis. Neal Halsey, a leader of the American Academy of Pediatrics and director of the vaccine safety center at Johns Hopkins University, thinks those who attribute risk to the vaccine have not begun to make a case. He says, “I am not finding any scientific evidence that there are any cross-reacting antigens” in the vaccine that might trigger an attack on nerve tissue. Halsey also points out that infection by the natural hepatitis B virus has not been identified as a risk factor for MS; why, he asks, would a fragment of virus protein used in a vaccine be riskier? And Kane notes that although the prevalence of MS is highest among people in northern Europe and North America, hepatitis B rates are highest near the equator. One would expect an overlap, he says, if the virus and MS were biologically linked.

    Still, claims that the hepatitis B vaccine triggers autoimmune disease caused one vaccine manufacturer—Merck & Co. of Whitehouse Station, New Jersey—to sponsor a daylong review of the available data in Atlanta on 21 March 1997. When the session ended, the participants, including Kane, Chen, an NIH expert in molecular mimicry, Army researchers, and scientists from the chief vaccine makers—Merck, SmithKline Beecham of Philadelphia, and Pasteur Mérieux Connaught (PMC) of Lyon, France—agreed that the available data were very sketchy. They found no association between hepatitis B vaccine and the onset or exacerbation of MS. But they concluded, according to the minutes of the meeting, that “epidemiologic studies should be conducted because of public concern.”

    At least three studies have been launched, according to Robert Sharrar, a Merck medical officer. Merck is spending about $260,000 to help obtain hepatitis B immunization data from an ongoing, independent study of nurses' health in Boston. PMC is helping to fund a study of immunization run by MS clinics in France. And Chen confirms that CDC is collecting data from four health maintenance organizations on the West Coast for a study of MS and hepatitis vaccination. Sharrar says the Boston study could yield data next summer. The CDC project may take longer.

    Although public health officials are confident that the hepatitis B vaccine is safe, they know they are likely to face more claims of vaccine-induced injuries in the future. This infuriates some proponents of universal immunization. “This vaccine prevents cancer,” says Halsey. “For me, it is incredible that people are not taking into account the potential harm to public health they are doing” by raising an alarm.

    Chen is more philosophical. Now that millions of people are receiving hepatitis B shots each year, he says, many will blame it for any misfortunes that follow. “It's human nature,” he says, to attribute cause to almost anything that precedes a tragedy.


    Possible Cause Found for Lyme Arthritis

    1. Steven Dickman*
    1. Steven Dickman is a writer in Cambridge, Massachusetts.

    For many unlucky Lyme disease sufferers, the disease has a painful way of lingering. Weeks or months after the tick bite that transmits the disease-causing bacterium, some patients develop arthritis. Usually, the condition disappears following antibiotic treatments. But in roughly 10% of the patients, it persists after the infection has vanished. This has been a major mystery. As rheumatologist Brian Kotzin of the University of Colorado Health Sciences Center in Denver asks: “Why this perpetual response in the joint if the bug is not there any more?” The answer, researchers now say, is an immune response that goes awry.

    On page 703, Allen C. Steere of the New England Medical Center in Boston, Brigitte T. Huber of Tufts University School of Medicine, and their colleagues report the discovery of a striking resemblance between a protein found on the outer surface of the Lyme disease organism—the spirochete Borrelia burgdorferi—and a protein carried on human cells. This suggests that some people develop the persistent arthritislike condition because the infection triggers immune cells that attack both the spirochete protein and their own normal cellular protein.

    Immunologists are intrigued by the finding because it may be one of the few cases in which both the precise trigger for an autoimmune attack and its target in the body have been uncovered. “This is perhaps a unique opportunity to work front to back,” from the trigger to the misplaced immune response, says rheumatologist Leonard Sigal of the Robert Wood Johnson Medical Center in New Brunswick, New Jersey. As such, it could help researchers design new drugs or vaccines for Lyme disease arthritis.

    Perhaps even more important, the discovery could have implications for efforts to develop vaccines against Lyme disease, of which there are about 16,000 new cases every year. Just last week, The New England Journal of Medicine published the results of two large-scale clinical trials of Lyme disease vaccines. Both were very effective, but both are made from the very same spirochete protein linked to autoimmune arthritis by the Steere and Huber team.

    In theory, the protein might provoke autoimmunity in some people who receive the vaccines, as some patients and researchers now claim the hepatitis B vaccine is doing (see story on page 630). “This is an issue of concern,” says Steere, who was also a principal investigator on one of the vaccine trials. Indeed, he notes, arthritis did develop in several subjects who received the vaccine, although a number of the controls also became arthritic. “Ongoing surveillance will be important” to detect any problems, he adds.

    The path to the current finding began about 10 years ago. Immunologists know that some people are genetically predisposed to autoimmunity because of natural variations in their so-called HLA molecules, which reside on certain immune cells and help determine to which antigens a person responds. And in 1989, the Steere team found that the patients who get persistent Lyme-related arthritis very often carry a particular HLA variant designated DRB1*0401. Because that same variant is associated with rheumatoid arthritis, an autoimmune disease, it seemed that Lyme disease arthritis might itself result from an autoimmune attack, possibly triggered by a B. burgdorferi antigen that resembles some component of human joint tissue.

    So, Steere and his colleagues looked for antigens on the pathogen that HLA-DRB1*0401 might recognize. And in 1994, they found one that might fit the bill: a B. burgdorferi protein called OspA (for outer surface protein A) that was frequently recognized by the T type of immune cells from treatment-resistant patients but recognized only uncommonly by T cells from those whose arthritis responded to treatment. OspA had also become the primary component of the Lyme disease vaccine because it provided a protective antibody response in animals injected with it.

    Steere's group then joined forces with immunogeneticist Huber to look for human proteins that might resemble OspA. Using a computer algorithm based on a previous lab analysis of DRB1*0401's peptide-binding abilities, they showed that DRB1*0401 binds to a particular nine-amino acid segment of OspA when it triggers an immune response. Then, the scientists searched a database looking for human proteins that contain the same sequence and might therefore be the target of an autoimmune attack initiated by OspA. They turned up one candidate, a protein called hLFA-1, found on blood and other cells.
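The database search in that last step amounts to scanning protein sequences for an exact nine-residue window. A minimal sketch of such a scan follows; the epitope and the sequences are placeholders, not the real OspA or hLFA-1 sequences:

```python
# Illustrative sketch of the database-search step: scan candidate
# human protein sequences for a 9-residue window matching an epitope.
EPITOPE = "YVLEGTLTA"  # hypothetical 9-amino-acid query, not the real one

proteins = {
    "candidate_A": "MKTAYVLEGTLTAGQR",    # contains the 9-mer
    "candidate_B": "MSSLDKFFAGRENLYFQG",  # does not
}

def find_epitope(epitope, sequences):
    """Return every (protein, offset) where the exact 9-mer occurs."""
    hits = []
    for name, seq in sequences.items():
        start = seq.find(epitope)
        while start != -1:
            hits.append((name, start))
            start = seq.find(epitope, start + 1)
    return hits

print(find_epitope(EPITOPE, proteins))  # [('candidate_A', 4)]
```

Real searches of this kind also score near-matches rather than demanding identity, but an exact-window scan is the core idea: one shared stretch is enough to flag a protein as a possible autoimmune target.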

    When the researchers studied T cells from 11 patients with treatment-resistant Lyme arthritis, they found that nine of them carried T cells that respond strongly to the key sections of both OspA and hLFA-1. T cells taken from controls with other forms of arthritis did not show those responses. These findings suggest, Steere says, that T cells originally triggered to recognize OspA go on to attack the unfortunately similar part of hLFA-1 found on the patients' own cells.

    Steere and Huber are quick to admit, however, that the evidence for this scenario is circumstantial. One worry is that they couldn't find hLFA-1-reactive T cells in all 11 patients, although Kotzin says that may just be because T cells from joint fluid can be hard to study. “That they are able to demonstrate such a specific response from these T cells is really remarkable,” he adds.

    But if that specific response to hLFA-1 causes the arthritis, asks immunologist Kai Wucherpfennig of the Dana-Farber Cancer Institute in Boston, why is the inflammation confined to the joints, while hLFA-1 is found on cells all over the body? One explanation: T cells naturally have hLFA-1 on their surfaces and therefore carry it along when they move into joints to combat B. burgdorferi infections. This may exacerbate immune-cell responses to hLFA-1, resulting in a vicious cycle in which the joints become permanently inflamed. Nonetheless, Wucherpfennig wonders whether “there may be other important targets that have not yet been identified.”

Researchers might be able to settle these issues if they could recreate treatment-resistant, Lyme-related arthritis in an animal such as the mouse. So far, however, they haven't been successful, perhaps because mice have a different form of LFA-1. “This is an initial observation,” says Steere. “Now, one sets forth on the next phase of the journey.”


    Field Emitters Finding Home in Electronics

    1. Dennis Normile

    New materials promise smaller, more efficient way to deliver electrons to flat-panel displays, cameras, and microscopes

    Tsukuba, Japan—The big vacuum tubes and copper wiring that filled the backs of early televisions have long since been replaced by tiny semiconductor chips and printed circuit boards. The one holdout against miniaturization is the cathode ray tube (CRT), which retains all the bulk and inefficiency of 50 years ago. Various flat-panel display technologies have emerged, but none matches the brightness and resolution of the CRT, in which a heated filament spews a stream of electrons that light up phosphors on the image screen. But time may finally be catching up with the CRT.

    A recent meeting here* highlighted advances in a technology that uses an electric field rather than heat to wrest electrons from an emitter material. A key step forward in this technology, called field emission, is the use of more durable materials that promise slimmer, more energy-efficient, and portable displays for personal computers. Perhaps more significantly, researchers are also applying field-emission technology to microscopes, image sensors, and other devices that have traditionally used electron guns, like those in CRTs, as electron sources. New uses for field emitters “have been popping up like mushrooms,” says Chris Holland, a field-emission specialist at SRI International in Menlo Park, California. That activity, he says, is a good indication of how quickly the field is maturing.

    So long, silicon. These diamond tips (magnified 7000 times) form part of an emitter array developed by researchers at Vanderbilt University.


    Field emission itself is not a new idea. But the pointlike cathodes—traditionally made of silicon or tungsten—have tended to break down quickly in the powerful electric fields. Researchers have long had their eye on diamond film as a cathode material, but it has proven tough to fabricate into the sharply pointed shapes needed for electron emission. That obstacle has also hindered efforts to add an extra terminal, or gate, to the diamond cathodes—a control electrode in addition to the cathode and anode—that allows more precise control of the flow of electrons. At the meeting, however, two research groups reported success in fabricating three-terminal, or gated, diamond arrays. In a display, for example, a gated array would mean faster, more precise switching of pixels on and off.

    One team, at Vanderbilt University in Nashville, Tennessee, started with a pyramidal mold for the deposition of diamond film. They added terminals and backing, using conventional semiconductor fabrication techniques. Finally, the mold material was etched away. The result was a uniform pattern of diamond pyramids in a three-terminal configuration. Electrical engineer Jim Davidson says the array “could significantly outperform silicon devices in speed, power [consumption], and reliability” once some practical problems are solved. Among these is a tendency to get reverse current flow between the gate and the tip, which limits the three-terminal performance.

    Wolfgang Mueller of Research 2000 Inc., a start-up company in Westlake, Ohio, described tests on an approach that dodges the problem of fashioning an array of diamond points. Starting with a silicon substrate, he adds a grid of an insulating material topped by a second conductive layer laid down in wide lines. The wells between the grid lines then get a sprinkling of diamond crystals. Passing a current from the substrate through the diamond crystals to the top conductive layer induces field emission in the diamond that should send streams of electrons to an anode above the semiconductor sandwich. This would be an extremely simple and easy way to fabricate a three-terminal array. However, Mueller and his team have yet to add the anode and make precise measurements of emission currents. “It is very recent work,” he explains.

    While researchers are still struggling with diamond emitters, a group led by Yahachi Saito, an associate professor of electrical and electronic engineering at Japan's Mie University, is staking its efforts on carbon nanotubes as field emitters. The team mounted a wafer of multiwalled carbon nanotubes on a substrate and treated the surface with an etch to remove carbon debris and expose the tips of the nanotubes. Then, they put this cathode in a thumb-sized vacuum tube with an aluminum film anode and a phosphor screen. This CRT lighting element emits a bright light that remains stable for 10,000 hours, promising to bring the brightness and resolution of the CRT to flat-panel displays. Ise Electronics Corp., which is collaborating on the work, has already adapted the device for use in a rudimentary flat-panel display.

    But Saito says there are still formidable challenges to commercializing the technology. Scaling up the production of the device is one hurdle, as is making large quantities of the nanotubes themselves, now created by passing an arc discharge between graphite electrodes in a helium environment. “We've succeeded with an experimental device, but it's not clear yet if we can successfully commercialize it,” Saito says.

    A team at Delft University of Technology and Philips Research Laboratories in the Netherlands is studying carbon nanotubes as a possible electron source for electron microscopes. The big advantage of carbon nanotubes in that environment would be their ability to emit electrons in an extremely narrow energy range. “The more monochromatic the source is, the better electron lenses can be used to focus the beam,” says physicist Martijn Fransen. A single, multiwalled carbon nanotube produced an energy spread of just 0.11 electron volts, compared with a spread of 0.3 to 0.6 eV from the best electron sources currently being used. “The difference doesn't seem like much, but it's quite important to get [the value] lower,” he says. The biggest challenge, according to Fransen, is reducing the variability in emission characteristics among nanotube samples.

    Indeed, field-emission researchers are now thinking far beyond flat-panel displays, which inspired the early work on the technology. Other groups reported attempts to use field emitters in image sensors, mass spectrometers, and even in a surge absorber for high-speed computer communication lines, in which the surge of energy is diverted to emit electrons. But much more work is needed before emitters can deliver on their promises. “The easy experiments have been done,” says Ian Milson, a researcher at EEV Limited, a British company considering ways to commercialize field-emission technologies. “The problems left are the difficult ones.”

    • * The Second International Vacuum Electron Sources Conference, Tsukuba, Japan, 7–10 July.


    The Sahara Is Not Marching Southward

    1. Richard A. Kerr

    From a satellite perch, the supposed steady encroachment of desert into Africa's Sahel appears instead to stem from climate variation

    Twenty-five years ago, the Sahel—the narrow band of barely habitable semiarid land stretching across Africa at the southern edge of the Sahara—was thought to be on the verge of disappearing. Images beamed around the world showed deep drought, starving millions, and land stripped of vegetation by man and beast as human activity turned the land into permanent desert. Sub-Saharan Africa became the embodiment of “desertification,” the human-driven, irreversible expansion of the world's deserts. Thought to be gobbling up tens or even hundreds of thousands of square kilometers of arable land every year, desertification provoked an outpouring of aid and an international treaty.

    For the past decade, however, scientists armed with ecological studies have been fighting the idea that such desertification is widespread or largely human induced. Now, they have satellite images to bolster their argument that much of what has been called desertification was instead the reflection of natural ups and downs of rainfall.

    “We're not trying to say desertification is not happening,” says climatologist Sharon Nicholson of Florida State University in Tallahassee, lead author of one of two recently published satellite studies. “We're saying the scenario of the Sahara sands marching southward at the hands of humans is wrong.” The studies show that natural climate variation has shifted the desert's edge, with no net effect on the amount of vegetation. Although people may be degrading the drylands of the Sahel, mainly by changing the mix of plants, they aren't expanding the desert.

    Keeping tabs on an expanse of land that spans a continent requires the lofty vantage point of a satellite. Both of the new studies—by remote-sensing specialist Stephen Prince and his colleagues at the University of Maryland, College Park, and by Nicholson and her remote-sensing colleagues—rely on images from a series of National Oceanic and Atmospheric Administration satellites carrying an Advanced Very High Resolution Radiometer (AVHRR).

    Intended to map snow, ice, and clouds, the AVHRR is good at recording vegetation changes as well. A ratio of surface brightness in the red part of the spectrum, where chlorophyll absorbs light, versus brightness in the near infrared, where green leaves efficiently scatter light, provides a “greenness” index that can gauge surface properties—for example, the proportion of the surface covered by vegetation.
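The greenness index described above is conventionally computed as a normalized difference of the two bands. A minimal sketch, with invented reflectance values for three surface types:

```python
import numpy as np

# Hypothetical AVHRR-style reflectances for three pixels:
# bare sand, sparse grass, dense shrub.
red = np.array([0.30, 0.12, 0.06])  # chlorophyll absorbs here
nir = np.array([0.32, 0.30, 0.45])  # green leaves scatter strongly here

# Normalized difference vegetation index: the standard form of the
# red-versus-near-infrared "greenness" ratio described above.
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))  # greener surfaces score higher
```

Bare ground scores near zero because it reflects the two bands about equally, while densely vegetated pixels approach one — which is what lets the index trace the desert's shifting edge from orbit.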

    When Nicholson and her colleagues calculated the greenness index for the entire AVHRR record—1980 to 1995—they found the edge of the Sahara doing a frenetic tango paced by rainfall, rather than a steady march. In the western Sahel, the area covered by their study, the southern boundary of the Sahara advanced southward and then retreated at least three times, moving as much as 300 kilometers over several years. “There is no progressive ‘march’ of the desert over West Africa,” they conclude.

    The satellite record also does not reveal any long-term degradation of the vegetated land. The greenness index can be related to the amount of rainfall to produce a measure of rain-use efficiency—the amount of green plants produced per unit of water. If something other than drought (overgrazing or soil erosion, for example) lowered plant productivity, rain-use efficiency would drop. But it remains constant if productivity simply varies with rainfall. Neither Nicholson, looking at the western Sahel over 16 years, nor Prince and his colleagues, who considered the full breadth of the Sahel between 1982 and 1990, found any net change in rain-use efficiency.
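The logic of the rain-use-efficiency test can be sketched directly; the yearly numbers below are invented for illustration:

```python
# Sketch of the rain-use-efficiency test: if greenness simply tracks
# rainfall, then greenness per unit of rain stays flat even through
# droughts. Numbers are invented for illustration.
rainfall_mm = [520, 300, 410, 250, 480, 350]           # yearly rainfall
greenness   = [0.26, 0.15, 0.205, 0.125, 0.24, 0.175]  # NDVI-like index

rue = [g / r for g, r in zip(greenness, rainfall_mm)]
spread = max(rue) - min(rue)

# Greenness halves in the dry years, but efficiency barely moves:
# the vegetation is responding to rain, not degrading.
print(spread < 0.0001)  # True
```

Had overgrazing or erosion been cutting into productivity, the dry-year points would fall below the wet-year trend and the efficiency would drift downward — the signature neither study found.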

    “I would say the remote-sensing observations confirm what most ecologists believed in the mid-1980s,” says ecologist Dean Graetz of Australia's Earth Observation Center, part of the Commonwealth Scientific and Industrial Research Organization in Canberra. “The deserts aren't advancing; the Sahara was never marching south. Policy-makers are still impressed by the word ‘desertification.’ It is hypnotic, but it's not appropriate.” Land degradation is a better term, he says, reflecting the more localized effects of such activities as grazing and foraging for fuel. Ecologists working in the Sahel have shown that around villages and wells, for example, overgrazing by cattle can shift vegetation from grasses to equally green, but less palatable, shrubs.

    And that kind of degradation can be reversible, says ecologist William Schlesinger of Duke University. “If humans cause degradation, humans have the power to remediate the damage. That's encouraging.” Graetz, however, fears that drought and misuse of the land will keep taking a toll on the health of the Sahel, even if they don't threaten its existence.



    Sphere Does Elegant Gymnastics in New Video

    1. Dana Mackenzie*
    1. Dana Mackenzie is a mathematics and science writer in Santa Cruz, California.

    A tour de force of computer graphics gives the simplest solution yet to the venerable problem of turning a sphere inside out

    More than 40 years ago, a University of Michigan graduate student named Stephen Smale laid down a challenge for future mathematicians. He proved an abstract theorem that had a startling corollary: An elastic sphere can be turned inside out, or “everted,” without tearing or creasing it—providing the sphere can pass through itself, ghostlike. Smale did not give an explicit recipe for this sleight of hand, however. Since then, topologists have turned spheres inside out in media ranging from hand-drawn pictures to chicken-wire models to computer animations, but their solutions always seemed more complex than necessary.

    Now, mathematicians George Francis and John Sullivan of the University of Illinois, Urbana-Champaign, have created a computer animation of a sphere eversion that is the simplest possible by several criteria. Demonstrated in a 6½-minute video that will premiere next month at the International Congress of Mathematicians in Berlin and was shown in an abbreviated form at this month's Siggraph 98 convention in Orlando, Florida, their solution provides the most satisfying answer yet to Smale's challenge. It also shows how topologists are turning to computer graphics to solve some of their hardest problems. “In a real sense, the eversion question has become a benchmark for the use of computer technology in attacking problems of surfaces in three-dimensional space,” says Thomas Banchoff of Brown University.

    A French mathematician, Bernard Morin, is generally credited with finding the first explicit eversion of the sphere in the early 1960s, and a computer-animated video called “Outside In,” based on an idea by topologist William Thurston of the University of California, Davis, offers what may be the best-known example. Thurston's approach, however, is far from optimal. First, it allows the occurrence of many topological “events”—moments when two surfaces pass through one another, or when the curves of self-intersection abruptly change configuration. Second, it introduces an ornate pattern of corrugations to enable the sphere to twist around any potential kinks.

    In the new eversion, Francis and Sullivan minimized bending by assigning their elastic surface an “energy” that increases when it is bent more tightly. At all stages of their eversion, the surface automatically keeps the lowest possible energy. And thanks to work done more than 10 years ago by Robert Kusner, a mathematician at the University of Massachusetts, Amherst, they already had an optimal configuration for the halfway point in the eversion, where the bending energy reaches a maximum.

    Topologists had shown that at some stage in any sphere eversion, four sheets of surface must pass through the same point. They knew that a surface with such a “quadruple point” must have a bending energy of 16π, expressed in a dimensionless unit. (By comparison, the starting energy of any sphere, regardless of size, is 4π.) In 1983, Kusner had found a surface with exactly that energy—a surface that looks very much like the halfway surface in Morin's eversion.
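    The bending energy in question is what mathematicians call the Willmore energy: the integral of the squared mean curvature over the surface. The scale invariance noted above is easy to check numerically; the sketch below (in Python, with a function name of our choosing) exploits the fact that a round sphere of radius r has constant mean curvature H = 1/r, so the integrand is the same for every radius.

    ```python
    import numpy as np

    def bending_energy_of_sphere(radius, n=2000):
        """Numerically integrate the bending (Willmore) energy
        W = integral of H^2 dA over a round sphere.  A sphere of
        radius r has constant mean curvature H = 1/r; the longitude
        integral is done analytically, leaving a 2*pi factor."""
        theta = np.linspace(0.0, np.pi, n)  # colatitude
        # H^2 * area element, with the phi integral folded in as 2*pi
        integrand = (1.0 / radius) ** 2 * radius ** 2 * np.sin(theta) * 2.0 * np.pi
        dtheta = theta[1] - theta[0]
        # trapezoid rule over theta
        return float(np.sum((integrand[:-1] + integrand[1:]) * 0.5) * dtheta)

    # The energy is scale invariant: every round sphere scores 4*pi
    # (about 12.57), while any surface containing a quadruple point
    # must score at least 16*pi.
    print(bending_energy_of_sphere(1.0))   # ~12.566, i.e., 4*pi
    print(bending_energy_of_sphere(5.0))   # same value for any radius
    ```

    The (1/r)² curvature factor exactly cancels the r² in the area element, which is why the starting energy of any sphere, regardless of size, is 4π.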

    Kusner proposed that one could give this surface a little push, as one might push a chair that is precariously balanced on two legs, and let nature take its course. A nudge in one direction, he proposed, would cause it to collapse into a sphere; a nudge in the opposite direction would cause it to collapse into an inside-out sphere. Then, by running one sequence backward and the other forward, one could create a complete eversion, in which the original sphere evolved into Kusner's surface and then into its inside-out alter ego.

    But there were doubts. Kusner's surface might not be as unstable as the chair on two legs: Given a small push, it might just return to the balance point. Or it might indeed collapse to a sphere when pushed one way, and to the same sphere (not an inside-out one) when pushed the other way. Finally, in its quest to minimize energy, the surface might pinch off into two separate spheres. The animation by Francis and Sullivan shows, however, that the eversion works according to plan.

    To create it, the two researchers enlisted software tools that had not existed when Kusner did his work. Each frame of their video uses between 1000 and 2000 triangles to approximate the elastic surface. Both the number of triangles and the way they are connected change during the animation, making it nearly impossible to describe the intermediate surfaces by standard mathematical techniques. Instead of computing the movement of the surface as a whole, the software had to follow each piece separately.

    When Francis and Sullivan ran the computation, they found that their energy-minimizing approach offered a bonus. Not only did it minimize bending, but it turned out to have the smallest possible number of topological events as well. To Banchoff, a member of the jury for the VideoMath section of the International Congress of Mathematicians, the video by Francis and Sullivan “represents a new level of elegance.” Now, Sullivan hopes to apply the energy-minimizing approach to other classical topology problems, such as smoothly deforming a torus (an inner-tube shape) so that a stripe painted around the central hole changes places with a perpendicular band, running around a “meridian” of the inner tube.


    Physicist-Turned-Politician Seeks Middle Ground

    1. Andrew Lawler

    Representative Vern Ehlers, a former professor, is completing one of his biggest assignments: setting out a course for U.S. science in the next century

    When House Speaker Newt Gingrich (R-GA) called for a sweeping review of science policy last summer, he said it was time for a dramatic new vision for science and technology after the Cold War and on the brink of the millennium. He gave the job of pulling together that vision to Representative Vern Ehlers (R-MI), the number two Republican on the House Science Committee.

    Now, on the eve of unveiling his report, Ehlers knows he faces a tough sell. The more detailed the recommendations, the more critics it will attract, including those who may reject it as a partisan document serving the man who requested it. But a failure to take a stand on the important issues facing the community could turn the report into a political bookend, unread and ignored.

    Finding a middle ground is no easy task, even for a man recently named one of the three brainiest U.S. House members by Washingtonian magazine. For example, although Ehlers suggests that consolidating research agencies may be a good idea, he hastens to add that there are “many different options.” The report, he says, “will not make any recommendations” on the matter or lay out detailed options. “We will simply point out the problem.”

    Ehlers's background—he calls himself the first research physicist to serve in Congress—may disarm some potential critics. He has a reputation as a moderate Republican and environmentalist. He also holds a Ph.D. in nuclear physics from the University of California, Berkeley, did research at Lawrence Berkeley National Laboratory, and taught for 17 years at his undergraduate alma mater, Calvin College in Grand Rapids, Michigan. And, despite a 23-year career in politics, he retains the serious, self-effacing, and soft-spoken quality of a small-college professor.

    “I didn't fit the typical mold,” he says, recalling his first try at public office. “Scientists don't generally run. And people who get elected have hair.” But the voters didn't seem to mind, electing him as county commissioner, and later state legislator, before sending him to the U.S. House of Representatives in 1995.

    The science policy study is proving to be one of the biggest challenges of his political career. “The most frustrating part is the lack of time to do the kind of job I would like to do,” he told Science. “I don't want to put the kiss of death on the report, but it was a very complex and time-consuming task, and it comes on top of my regular duties, which take 80 hours a week.”

    Time is not his only challenge. Neither House Democrats nor the White House has shown much enthusiasm for the review, and a series of hearings held to gather input on a host of science-related issues played to half-empty hearing rooms. But Ehlers, a devout Christian who has rankled some researchers with his opposition to human cloning, is hoping that his scientific colleagues will ultimately embrace his project as a well-intentioned attempt to stir debate on an enormously complicated and important subject. “Nothing would sink it faster than them saying, ‘Oh, this is just another study,’” he says.

    What follows is an edited transcript of a 21 July Science interview with Representative Ehlers in his Capitol Hill office.

    On the report's potential impact:

    I'm not trying to produce the most comprehensive science policy but one that Congress will take action on in the form of a resolution. We would also like to get some indication of approval from the President's Committee of Advisers on Science and Technology. Even if nothing is ever adopted—and I expect something will be—we've already had a major impact. A lot of things have come together since we started work on the report: a Senate bill [to double R&D spending] and Newt's public statements in support of increased science funding, which in turn led to the president putting substantial increases in his budget. This has all focused a lot of attention on science funding and the need to set priorities.

    I'm hoping to finish a draft before the August recess, but it's a very slow process. Once Newt Gingrich and [Science Committee Chair James] Sensenbrenner [R-WI] have vetted it, [the report] will go public. But that is just the first step toward what I hope is a long-term process in which Congress will actively focus on science policy, reviewing it at least every 5 years.

    On the need for a new science policy:

    First, we're not doing a good job of setting priorities. Second, the Superconducting Super Collider was killed, which indicated [a lack of communication on the need for basic research]. And then there is the space station, which is getting very, very expensive. And look at what's happened with science education—we're not doing well.

    So, although there is no catastrophe, there are a lot of indications that science is not in the healthy state it has been for the past half-century. And the time when military competition provided the built-in support for science is over. That constituency has diminished and nearly disappeared. Now, economic competition is at the forefront. The science community has to develop a new constituency and stop bemoaning the loss of the old one.

    On oversight of federal R&D programs:

    There is a need to consolidate some of the science decision-making in Congress. [Former Energy Secretary] Admiral [James] Watkins loves to point out that when he wanted to get his oceanographic initiative passed a few years ago, he had to work with 43 different subcommittees and committees in the House and Senate. In our report, we don't offer solutions outside the jurisdiction of our committee, but we will point out this issue. I am sure the Rules Committee will work on this next year.

    It's just as bad on the Administration end. A Department of Science is not a good political option—there's not enough support for it. A more realistic option would be to consolidate some of the science functions within an existing agency or a new one without Cabinet status. That way, you are more likely to end up with a technically or scientifically oriented person heading it rather than a political person. The question is how you would make all this fit together. Or the president could be encouraged to go the route of a very strong scientific adviser who has considerable say over the operation of the nation's science establishment.

    On math and science education:

    We need a more coordinated effort. The National Science Foundation (NSF) is doing a better job [than the Education Department], and math and science education certainly should be in NSF's hands. I see no need to have it in both places. We also have to energize the state and local governments, which brings up a whole host of issues. And although we spend $300 billion a year on education, we spend about 0.01%—some $30 million—on education research. Not too many corporations would survive if they spent that percentage on research.

    On federal spending:

    We need to reform entitlement programs [such as Social Security and Medicare], because they can eat up any surplus we generate. Entitlements were a quarter of the budget in 1962; now, they are half. If we don't get them under control, by 2010 we'll be spending all of our revenue on entitlements and interest. That will leave nothing for defense and domestic discretionary spending [where science programs reside].

    On linking basic and applied research:

    We've been shoving our [federal] money more toward basic research, while industry has been driven by international competition and their stockholders to focus on the shorter term payoffs. What has developed is a Valley of Death: As basic research becomes more basic, applied research is shifting more toward product development.

    We need to stop talking about the Commerce Department's Advanced Technology Program (ATP), which basically gives money to industry [for applied research] they should be doing anyway, and Cooperative Research and Development Agreements between the federal government and industry. Instead, we should focus on setting up partnerships involving governments, industries, and universities. In the report, we won't get into details of which approach is better, but we'll discuss the elements of good partnerships—which ones work and which ones don't. Why should we tie ourselves to a limited model like ATP? We need to go back to first principles.

    Look at Monsanto Corp., which provides direct funding to individuals at Washington University in St. Louis for basic research. Monsanto is not buying researchers and telling them what to do; it is providing grants to faculty in the hope that someday that research will pay off for that company. Perhaps, we need tax credits or tax breaks for any corporation that provides that kind of funding for university research. And states have to play a better role—they are much more into economic development than the federal government.

    On large funding increases for the National Institutes of Health (NIH), compared with other agencies:

    It's a very dangerous trend. NIH depends very strongly on work done by NSF, the Department of Energy, and also NASA to a certain extent. They are constantly dipping into the well of ideas generated by research in chemistry, physics, and biology. If we continue to give more money to NIH and less to the others, someday that well is going to be dry.

    On how well scientists lobby Congress:

    Historically, [their grade is] probably a D+. But they are improving tremendously. Particularly in the past couple of years, scientists have become more politically astute and more politically involved.


    Gravity Teases Details From Ancient Cosmic Birthplaces

    1. James Glanz

    By refracting light from the far edge of the universe, gravity is giving astronomers a fine-scale view of the dim clouds that spawned stars and galaxies

    Gravity bends light rays, and it is also letting astronomers break a seemingly inflexible rule. Ordinarily, the farther away an object is, the more difficult it is for observers to resolve its finest details. But when the titanic gravity of an entire galaxy bends light from a quasar—a brilliant object at the far edges of the universe—astronomers can sometimes pick up details just 100 light-years across in huge, distant clouds of nearly primordial matter veiling the quasar's light. The feat, recently achieved by a team at the California Institute of Technology (Caltech), depends on the chance alignment of a quasar, a cloud, and a foreground galaxy along a single line of sight. But in sheer resolving power, it beats the Hubble Space Telescope by a factor of 20.
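    The factor-of-20 claim survives a back-of-the-envelope check with the small-angle approximation. In the sketch below, the ~0.05-arcsecond figure for Hubble's optical resolution is our assumption, and cosmological corrections to the distance are ignored; it is illustrative only.

    ```python
    ARCSEC_IN_RAD = 4.8481e-6  # radians per arcsecond

    def smallest_resolvable_ly(resolution_arcsec, distance_ly):
        """Small-angle estimate of the finest detail (in light-years)
        a telescope with the given angular resolution can pick out at
        the given distance.  Distance is treated naively; cosmological
        angular-diameter corrections are ignored (illustrative only)."""
        return resolution_arcsec * ARCSEC_IN_RAD * distance_ly

    # ~0.05 arcsec is roughly Hubble's optical resolution (assumption).
    hubble = smallest_resolvable_ly(0.05, 10e9)
    print(round(hubble))        # ~2400 light-years at 10 billion ly
    print(round(hubble / 20))   # a 20x gain -> ~120 ly, consistent
                                # with the ~100 ly quoted in the text
    ```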

    The strategy relies on quasars that appear as multiple images in the sky—a sign that the gravity of the foreground galaxy has captured light rays emanating from the quasar on slightly different paths and slung them all toward Earth. Near the quasar, where the rays have not yet had a chance to diverge much, they may have passed through one of the pristine clouds of material that emerged from the big bang. Traveling on paths just a few tens of light-years apart, the rays can pick up clues as to how the cloud might be swirling and clumping in the early stages of galaxy formation. “It's extraordinary,” says Chris Impey, an astronomer at the University of Arizona's Steward Observatory. “You can get a level of detail on the universe billions of years ago that exceeds anything else you could [resolve], except in your own backyard in the Milky Way.”

    In an early fruit of the technique, the Caltech team of Michael Rauch, Wallace Sargent, and Thomas Barlow saw dramatic differences in the density of carbon, silicon, and iron—clues to the presence of stars—along two nearby light paths through a cloud. One ray, but not the other, may have passed through a zone of stellar activity—perhaps a galactic building block. If that connection can be tightened, says Charles Steidel of Caltech, “this really gives you a handle [on galaxy formation] in exquisite detail.”

    These clouds, at distances of up to 10 billion light-years and more, are thought to be the birthplaces of the universe we know today, containing the raw material for the walls, bubbles, and filigree patterns traced out by galaxies. They are too distant and dim to be seen directly. Instead they are detected as spikes of absorption in the spectra of quasars, strange galaxylike beacons that were among the first objects to form in the universe. Although the same element, hydrogen, is responsible for most of the absorption spikes, the expansion of the universe shifts the spikes toward the red end of the spectrum by an amount that depends on each cloud's distance. As a result, the light of a typical quasar, passing through many clouds on its way to Earth, has a spectrum that bristles with separate absorption spikes—the so-called Lyman-α forest.
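    The mechanism that builds the forest can be sketched in a few lines: each cloud stamps its absorption spike at a wavelength stretched by its own redshift. This minimal Python sketch uses the rest wavelength of hydrogen's Lyman-α line, 1215.67 angstroms; the function name and example redshifts are ours.

    ```python
    LYA_REST_ANGSTROMS = 1215.67  # rest wavelength of hydrogen's Lyman-alpha line

    def observed_wavelength(z_cloud):
        """Cosmological redshift stretches each cloud's absorption
        spike: lambda_obs = lambda_rest * (1 + z)."""
        return LYA_REST_ANGSTROMS * (1.0 + z_cloud)

    # Clouds at different distances (hence different redshifts) stamp
    # spikes at different wavelengths, building up the 'forest' in a
    # single quasar spectrum.
    for z in (2.0, 2.5, 3.0):
        print(f"z = {z}: {observed_wavelength(z):.1f} angstroms")
    ```

    A quasar's light crossing many clouds therefore arrives bristling with spikes, one per cloud, each spike's wavelength encoding that cloud's distance.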

    A quasar's light ordinarily pierces each cloud at just a single point. This one-dimensional view gives a sketchy picture of the clouds' structure, although astronomers have learned that they tend to be lumpy on scales of millions of light-years. To add the missing dimensions, the Caltech team has analyzed spectra from three multiple-image quasars, captured by the high-resolution spectrograph on the 10-meter Keck Telescope in Hawaii.

    For each quasar, they compared spectra from the different images, looking for signs that the light had passed through the same Lyman-α clouds. Then, from the redshifts of the spectral spikes, they worked out how far the quasar's light had traveled before it reached each cloud and how far the multiple light paths had diverged at that point. That gave them the dimensions of any structures that might cause the cloud's spectral fingerprint to vary.

    The Caltech team's first result, which has been presented at scientific conferences in Europe, revealed little structure on scales smaller than 1000 light-years in most clouds. But in about a dozen other clouds with higher gas densities, they found factor-of-2 changes in the density of highly ionized carbon over distances of a few thousand light-years. The density changes, along with slight velocity variations, could be the mark of “minigalaxies,” about 10 times less massive than the Milky Way—and possibly caught in the act of merging, says Rauch.

    Most striking were a few cases where the spectra revealed clouds lying relatively close to the quasars, where the light paths had diverged very little. The team looked along each sight line for the absorption spikes produced by singly ionized silicon, carbon, and iron, and neutral oxygen. These ions should mark the densest parts of the clouds, where plentiful electrons combine with and eliminate doubly and triply ionized atoms. In one case, the team saw factor-of-10 variations in the ion densities over a distance estimated to be less than 100 light-years, along with signs that the atoms are swirling within the cloud.

    “It's a wonderful thing to be able to look at the scale of interstellar clouds in our own galaxy, but see it in a galaxy when the universe is 10% of its current age,” says David Weinberg of Ohio State University in Columbus. The wind from a single star or a supernova in the early universe could be responsible for the density variations, says Rauch.

    He and his colleagues want to collect more examples before they firmly commit themselves to any interpretations. But Rauch thinks the technique could ultimately free astronomers who want to understand the universe's early days from having to depend on the few objects that can be seen directly, such as quasars or especially bright galaxies. “The question,” he says, “is to what extent are [observed distant] galaxies giving us a typical picture?” Details gleaned from the dim Lyman-α clouds, he thinks, may give a truer sample of the early universe.
