News this Week

Science  07 Aug 1998:
Vol. 281, Issue 5378, pp. 758
1. PHYSICS

Practical Tests for an 'Untestable' Theory of Everything?

1. David Kestenbaum

At the end of a formal dinner recently, physicist Joe Lykken leaned across the table to relay an odd bit of gossip. “There could be extra dimensions, and they might be this big!” he confided, holding his thumb and forefinger about a nickel-width apart. It was unusual behavior for a string theorist. Not because “extra dimensions” sounds outlandish—string theorists are used to dealing with multiple dimensions. But they are definitely not in the business of predicting things that can be easily tested by experiment. “When I wrote [a paper on this] I really hesitated because it's easy if you're a little loose for people to think that you're a crackpot,” says Lykken, who works at Fermi National Accelerator Laboratory in Batavia, Illinois, “but the idea is on solid ground.”

If it's right, then physicists may soon have their first experimental evidence that string theory—a grand “theory of everything” that attempts to tie all the known forces together in a single framework—is more than just mathematics. String theory postulates a total of 10 dimensions, seven of which are assumed to be “compactified,” that is, curled up on scales of just 10^−33 centimeters—so small as to be out of reach of any conceivable experiment. But now Lykken and several other groups are considering the possibility that a few of those dimensions could unravel a bit, opening up onto scales that precision measurements in accelerators or even on a benchtop might actually probe. The work has drawn considerable interest at physics conferences over the last month.* “Taking them seriously as [large] dimensions that can affect things is a new thing basically this year,” says University of Michigan, Ann Arbor, theorist Gordon Kane. “It's really profound. It's hard to say it strongly enough.”

Kane, Lykken, and others caution that the idea is so new, and indeed, so unusual, that it may have fatal flaws no one has thought of yet. Still, “I think it's very exciting,” says Brian Greene, a theorist at Columbia University. It could mean that “at some accelerator in the next decade you'll see all kinds of new particles.” He adds: “In my gut I think it's likely to be wrong.” But if it's correct, “it would rock the foundation of physics.”

And the foundation does need shaking up. Modern physics rests on two enormously successful but disconnected theories—quantum mechanics, which describes the behavior of subatomic particles, and Einstein's theory of gravity. So far no one has figured out how to tie the two together, except possibly with string theory. In this picture, matter's fundamental bits are strings that live in 10 spatial dimensions. Compactify seven of those dimensions, and those strings look like particles. As on a violin, the strings can vibrate, and crudely speaking the “notes” correspond to different everyday particles such as electrons or quarks, along with exotic particles that exist only in all 10 dimensions, such as the postulated graviton, which conveys the force of gravity.

As part of this unification, string theory folds the four forces—gravity, electromagnetism, and the strong and weak forces at work inside the nucleus—into one. Proponents say that experiments have already revealed clues to this unification of forces. At high energies in accelerators, electromagnetism and the weak force turn out to be different manifestations of a single electroweak force. Other accelerator experiments have shown that the strengths of the strong and electroweak force start to converge when they are probed at increasingly high energies. When extrapolated with theoretical models, the strong and electroweak forces seem to merge at very high energy, called the GUT (grand unified theory) scale. Unfortunately, that energy is about a trillion times higher than will be reached even at the Large Hadron Collider (LHC), a massive accelerator now being built at CERN in Switzerland.

But the GUT scale, although too high for experimenters, was too low for string theorists: The theory predicted—apparently with very little wiggle room—that unification of all the forces, including gravity, would occur at an energy 20 times higher still. This puzzle spawned some of the recent work with extra, “large” dimensions. In 1996, Edward Witten at the Institute for Advanced Study in Princeton, New Jersey, and Petr Horava, now at the California Institute of Technology (Caltech) in Pasadena, offered a way to close the gap between the string scale—the scale at which gravity should get strong enough to meet up with the other forces—and the GUT scale. The pair showed that if one of the compactified dimensions was allowed to grow a bit, the string scale slid conveniently down to the GUT scale.

Nothing dictated that the dimensions had to be any particular size, so Lykken thought, “Why stop there?” He then wrote a paper looking at toy models where the extra dimensions were even bigger, and the string scale fell 12 orders of magnitude, to a point just above the energies that had been probed by accelerators.

Lykken's paper raised a lot of eyebrows in March 1996, but neither he nor others took it too seriously. “We were taught from birth” that gravity wouldn't get strong enough to unify with other forces until very high energies, Greene recalls. And most theorists assumed that the “large” dimensions needed for low-energy unification would have already shown up in dozens of experiments. But recent work has demonstrated one way for the extra dimensions to have avoided detection. The trick is that only gravity experiences the extra dimensions, while the other forces and particles are confined to the three dimensions of the world we know.

Earlier this year, Stanford University physicists Nima Arkani-Hamed and Savas Dimopoulos, with Gia Dvali of the International Center for Theoretical Physics in Italy, showed that if gravity lived in more than three dimensions, it could be very strong at short distances but would peter out into its normal, weak self at distances greater than the size of those extra dimensions. And, Arkani-Hamed points out, “gravity has only been accurately measured down to a millimeter or so.” Researchers are now planning at least two tabletop experiments to see whether gravity's strength grows at smaller distances, down to a micrometer.
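The short-distance behavior described above can be sketched numerically. This is an illustrative toy model, not from the article: the function name, the crossover size `R`, and all numbers are assumptions chosen for the example. The only inputs from the text are that gravity follows an inverse-square law at distances larger than the extra dimensions and strengthens faster below that scale.

```python
def gravity_strength(r, R=1e-3, n=2):
    """Relative gravitational pull at separation r (meters) in a toy model
    with n extra dimensions of size R (here, a millimeter); normalized so
    the two regimes match at r = R."""
    if r >= R:
        return 1.0 / r**2                       # ordinary inverse-square falloff
    return (1.0 / R**2) * (R / r)**(2 + n)      # steeper falloff inside scale R

# At one-tenth of the crossover scale, two extra dimensions make gravity
# about 100 times stronger than the inverse-square extrapolation predicts.
newton = 1.0 / (1e-4)**2
print(gravity_strength(1e-4) / newton)          # close to 100
```

With more extra dimensions (larger `n`), the short-range enhancement grows even faster, which is one reason tabletop tests at sub-millimeter distances are considered sensitive probes of the idea.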

Reaction to this theory is mixed: “It's a long shot,” says John Schwarz, a string theorist at Caltech. He and others suspect that an extra, “large” dimension for gravity might be inconsistent with various astrophysical measurements. The supernova explosion of 1987, for instance, should have produced gravitons that would have carried energy into the extra, “large” dimensions and cooled the star quickly. But neutrinos from the explosion came in at approximately the expected numbers and over the right time period for a star cooling in three dimensions.

Arkani-Hamed and colleagues say their theory survives this challenge if there are more than two of these “large” dimensions, or if there are two that are smaller than about 10 micrometers. The theory has survived other assaults as well. “At first I thought ‘This is crap, I'm going to rule it out,’” says Stanford University physicist Scott Thomas, “but it turns out it's completely consistent with experimental data.” Tom Banks, a theorist at Rutgers University in New Brunswick, New Jersey, however, says that the theory still needs to be checked against certain precision measurements made at accelerators.

A third group is also independently exploring the consequences of an additional “large” dimension, this time in a string theory picture where all particles and forces can experience it. Keith Dienes, Emilian Dudas, and Tony Gherghetta at CERN have found that allowing the electromagnetic, weak, and strong forces to leak into an extra dimension on a scale of 10^−19 centimeters makes the three unify at a very low energy. The extra dimension is small enough that it might have escaped notice thus far. “Everyone thought [the extra dimension] would destroy the unification” and the theory, Dienes says, but add it “and bingo, the [three forces] unify almost immediately.”

Combined with the work by Arkani-Hamed and colleagues, showing how gravity can be made to get strong at low energies, that opens the tantalizing possibility that unification, and hence a theory of everything, might be revealed at energies that would be probed by the LHC, Dienes says. If that's true, he adds, the LHC will show that the strengths of the forces are hurrying to meet at an energy far lower than anyone had expected.

A negative verdict on this and other schemes to add new dimensions to the real world might come earlier. “It could be that next week someone will come up with a very simple argument why none of this can be true,” Dienes says. “But we've been at this since the end of March, and nobody has knocked us out.” Others point out that the theory might work on paper but still not be the one that runs the universe. Still, comments Juan Maldacena, a theorist at Harvard University, “in this field, any idea that is not obviously false is interesting.”

• * International Conference on High Energy Physics, 23–29 July in Vancouver, Canada, and SUSY '98, 11–17 July in Oxford, England.

2. SCIENCE APPOINTMENTS

Physicist Named Japan's Education Minister

1. Dennis Normile

For the first time in recent memory, Japan has a Minister of Education, Science, Sports, and Culture with hands-on experience as a researcher and educator. On 30 July, physicist Akito Arima, former president of the University of Tokyo, took the post as head of the Ministry of Education (Monbusho) in the Cabinet formed by the new prime minister, Keizo Obuchi. Monbusho oversees all the national universities as well as several dozen national research institutes.

Arima, 67, had been president of the Institute of Physical and Chemical Research (RIKEN), outside Tokyo, since retiring from the University of Tokyo in 1993. He resigned from RIKEN earlier this year and on 12 July was elected to the upper house of Japan's Diet.

The science community is elated to have a friend in such a high place. “When he was president of the University of Tokyo, he put extraordinary effort into improving the research environment,” says Yoji Totsuka, director of the university's Institute for Cosmic Ray Research. “We're hoping he can do even more in a higher position.” Hirotaka Sugawara, director-general of the High-Energy Accelerator Research Organization (KEK) in Tsukuba, seconds the approval. Arima, whose specialty was nuclear physics, can be counted on “to emphasize that basic research is also important” at a time when pressure is increasing for research to be economically strategic, says Sugawara.

Whether the United States will continue to participate in reshaping the project depends on whether DOE can persuade Representative John McDade (R-PA), who chairs the House panel that funds DOE, and Representative James Sensenbrenner (R-WI), who heads the Science Committee, to lift a hold they have placed on extending the ITER agreement, which expired in July. Both lawmakers are loath to spend more money on the project until a thorough review of the U.S. fusion effort is complete (Science, 3 July, p. 26), and they directed DOE officials not to sign an extension of the ITER agreement when the partners met 21 to 23 July in Vienna. McDade's panel has also declined to appropriate the $12 million DOE has requested to continue work on ITER in 1999. Senior DOE officials and White House staff have been unable to meet directly with McDade on the matter, and their entreaties to his staff have failed. ITER supporters hope that Sensenbrenner will be persuaded to support the project during a mid-August visit to Japan. Aoe told DOE Undersecretary Ernest Moniz in a 14 July letter that all parties must sign the agreement in order for work on ITER to continue. The U.S. decision, he wrote in the strongly worded missive, would determine the project's fate and “the future fusion programs” of all four partners. Hidetoshi Nakamura, director of the Science and Technology Agency's Office of Fusion Energy, explains that Japan's ability to work on ITER is based on a four-party international agreement. Without an agreement, “efforts [in Japan] would have to be suspended,” he says. That would mean disbanding the teams of scientists and engineers working on the project. 
But Hiroshi Kishimoto, executive director of the Japan Atomic Energy Research Institute, which heads Japan's ITER design efforts, emphasizes that if the United States drops out entirely, “The other three parties—Japan, Europe, and Russia—will consider other possibilities to continue the joint work.” Europe also is willing to proceed without the United States, say fusion officials, but Japan's participation is key, since it wants to host the facility and is willing to pay the largest share of the project's cost.

The congressional ban on extending the agreement is already hampering U.S. efforts to convince the other project partners to consider alternatives to ITER Lite as a hedge against a failure of the scaled-down design to win political backing, says Anne Davies, U.S. fusion program chief. She says that because of time, money, and resource constraints, the partners rejected a U.S. proposal that the ITER team work simultaneously on the design of smaller and cheaper machines that could be parceled out to various countries. But the partners agreed to cooperate with a U.S. effort to examine such options. “We want our partners to join us in doing so, and they may in some limited way,” Davies said.

Congress recessed last week until September without an agreement between the House and Senate on a final 1999 DOE spending bill. That will provide DOE officials with additional time to make the case for ITER to lawmakers. The project's fate may be riding on their powers of persuasion.

4. SUPERCOMPUTING

Computer Experts Urge New Federal Initiative

1. Jennifer Couzin

Last week, 200 experts from academia, industry, and government gathered in Washington, D.C., to help put together a potential major research initiative: an effort spread among several government agencies to build the next generation of U.S. supercomputers.
The National Workshop on Advanced Scientific Computation—hastily convened by the Department of Energy (DOE) and the National Science Foundation (NSF), which are now preparing their fiscal year 2000 budget requests—reached broad agreement that the government should invest about $1 billion over the next 4 years to develop a national network of supercomputers for civilian use, together with supporting technology and cutting-edge software.

The ultimate goal would be to construct two 40-teraflop machines by 2003, each of which would be 200 times more powerful than the best supercomputers in universities today. (A teraflop is 1 trillion operations per second.) To allow scientists across the country access to the new machines, workshop participants also agreed to urge the government to bankroll a network of scientific and support centers. The workshop's organizers—who include DOE Undersecretary Ernest Moniz and Larry Smarr, director of the Illinois-based National Center for Supercomputing Applications—have put together a 10-page draft proposal that they will pass along to DOE this week for consideration in its budget preparations.
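The figures above imply a baseline for today's machines; a quick back-of-envelope check (the arithmetic is mine, not the workshop's):

```python
# A 40-teraflop target that is 200 times more powerful than today's best
# university supercomputers implies those machines run at roughly
# 0.2 teraflops (200 gigaflops). 1 teraflop = 1e12 operations per second.
target_flops = 40e12
speedup = 200
todays_best = target_flops / speedup
print(todays_best / 1e12)   # today's best, in teraflops
```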

If the proposal is approved, it would provide a civilian counterpart to the Accelerated Strategic Computing Initiative, a 2-year-old DOE project to develop a 100-teraflop machine in the next decade that would be used to model the behavior of nuclear weapons. Although some universities have constructed high-end computing systems, their machines cannot keep pace with the demands of scientists for faster number-crunching capabilities for tasks such as mapping climate change, simulating combustion systems, or studying a microbe's interaction with its environment. “This [supercomputer] initiative is the most cost-effective way of leveraging this new world of science and technology,” says James Langer, a physicist at the University of California, Santa Barbara, and chair of the workshop.

DOE and NSF are not the only potential participants in the initiative. The National Institutes of Health, the National Oceanic and Atmospheric Administration, and NASA, among other agencies, are also interested in taking part and contributing funds, says Michael Knotek, program adviser for science and technology in Moniz's DOE office. “Everybody sees here a real opportunity,” says Robert Eisenstein, assistant director of mathematics and the physical sciences at NSF.

“We've got to move fast to do it right,” says Langer. But he and other participants acknowledge that the program's ambitious goals won't be easy to achieve. Even if the White House includes the initiative in its 2000 budget request and Congress endorses the plan, attracting the hundreds of experts needed to implement it from a relatively small pool of computer science graduates will pose a challenge. And “some of the development requires machines not available for 3 to 4 years,” says Paul Messina, who directs the Center for Advanced Computing Research at the California Institute of Technology and helped organize the conference.

But scientists were heartened by the level of consensus achieved at the workshop among experts of varied backgrounds. “The science is ready for this kind of activity,” says James Baker, administrator of NOAA. “The scientists are there; they know what to do; they just need the technology.”

5. PHYSICS

Gravity Measurements Ride the Atom Wave

1. Robert F. Service

Gravity may be the law of the land, but the force it applies varies slightly depending on the rocks beneath our feet. In the 3 August Physical Review Letters, researchers report that they have devised a sensitive new scheme for mapping these variations that relies on the quantum mechanical nature of atoms. The device could eventually be useful for searching out new oil and gas deposits, which reveal themselves in tiny gravity anomalies.

Devised by Yale University physicist Mark Kasevich and his colleagues, the scheme builds on the bizarre dual nature of matter, which behaves—so says quantum mechanics—as solid particles at some times while resembling light waves at others. Since the late 1800s, instruments called interferometers have split light waves, allowed them to travel separately for a distance, and then recombined them. The result is a shadowy interference pattern, created because waves that converge in phase form light patches and those that cancel each other out form dark areas. In 1991, several research teams showed that “matter waves” of atoms can produce the same effect.
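The recombination just described follows the standard two-path interference formula. A minimal sketch (illustrative only, not the actual atom-interferometer readout, and the function name is invented):

```python
import math

def fringe_intensity(phase_difference):
    """Normalized intensity when two equal-amplitude waves recombine:
    1 when the waves arrive in phase (bright patch), 0 when they arrive
    half a cycle apart and cancel (dark patch)."""
    return math.cos(phase_difference / 2) ** 2

print(fringe_intensity(0.0))        # in phase: bright fringe
print(fringe_intensity(math.pi))    # half a cycle apart: dark fringe
```

Anything that shifts the relative phase of the two paths, such as gravity acting differently on them, moves the pattern between bright and dark, which is what makes the interferometer a sensitive probe.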

Typical atom interferometers work by dropping a collection of ultracold cesium atoms down a vacuum tube while hitting them with a series of laser pulses. The first of these pulses effectively places the atoms in two separate energy states at the same time, one moving faster than the other. These “atom waves”—two for each atom—split and move apart. Another pulse brings the two together again. In the meantime, however, the force of gravity has slightly different effects on the separated waves because they follow different trajectories. It alters the way they recombine, affecting the interference pattern, which a third laser pulse reads out.

One interferometer wouldn't be enough for measuring gravity in the field, says Kasevich. The problem is vibration, which can make gravity's tug appear weaker or stronger by moving the instrument closer to Earth's center or farther away. Using two instruments, one atop the other, gets around the problem. Both experience the same vibrations, but the difference in the two measurements—the gravitational gradient—stays constant. It varies only when the actual pull of gravity changes. So Kasevich and his colleagues stacked one atom interferometer on top of another, a meter apart.
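The common-mode rejection Kasevich describes can be shown in a short simulation. All numbers here are invented for illustration; the point from the text is only that vibration shifts both stacked instruments identically, so it drops out of their difference:

```python
import random

random.seed(0)
g_top, g_bottom = 9.80000, 9.80031   # assumed true accelerations (m/s^2), sensors 1 m apart
readings = []
for _ in range(1000):
    vibration = random.gauss(0, 1e-3)                     # common-mode shaking, hits both sensors
    top = g_top + vibration + random.gauss(0, 1e-6)       # plus tiny independent sensor noise
    bottom = g_bottom + vibration + random.gauss(0, 1e-6)
    readings.append(bottom - top)                         # vibration cancels in the difference

gradient = sum(readings) / len(readings)   # gravitational gradient per meter of separation
print(gradient)                            # close to 3.1e-4, despite vibration 3 times larger
```

Note that the common-mode vibration here is thousands of times larger than the gradient signal, yet the differenced readings recover the gradient to high precision.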

In their initial tests of the approach, the researchers gauged gradients between the devices as small as one part in 10,000,000. They have since improved the sensitivity of the setup 100-fold. Made over a broad area, such measurements can generate a map of gravitational gradients, useful for everything from prospecting for oil to warning a submarine navigator when his ship is nearing the sea floor.

The new device “is a very impressive first step” toward measuring gravitational gradients with atom interferometers, says Dave Pritchard, an atom interferometry pioneer at the Massachusetts Institute of Technology. For now, the mechanical gradiometers traditionally used to look for oil and gas deposits still beat the atom-based device in sensitivity. Part of the trouble, says Yale team member Jeff McGuirk, is that some vibrations can cause unwanted movements in the instrument's laser-directing mirrors, affecting the paths of the laser pulses through the interferometers. But McGuirk adds that the group has already tested a scheme for compensating for the vibrations, which should improve the sensitivity by another factor of 10 to 100, good enough to beat the competition, he says.

6. PROTEIN CHEMISTRY

A Two-Piece Protein Assembles Itself

1. Gretchen Vogel

MONT-ROLLAND, QUEBEC—Proteins do many of the trickiest jobs in living cells, catalyzing reactions, passing signals, and providing basic structure. Now scientists have discovered a bacterial protein with yet another talent: seamlessly splicing together two other protein pieces. At an evolutionary biology meeting here last week,* molecular biologist Xiang-Qin Liu reported that he and his colleagues have identified a molecular matchmaker, a protein-within-a-protein called a split intein, which brings together two pieces of protein encoded on very different parts of the chromosome, knits the pieces together, and then neatly cuts itself out.

Scientists had theorized that a bit of protein with such clever action might exist, and several protein engineers had made artificial versions in the lab, but “it is gratifying and very exciting” to have an example in nature, says molecular biologist Henry Paulus of the Boston Biomedical Research Institute. Although many proteins are made from several subunits that clump together, this is the first time anyone has found a natural mechanism that actually splices two disparate protein fragments together into an unbroken amino acid chain. Researchers predict that other split inteins will surface from newly sequenced genomes, and they hope the find will lead to new ways to manipulate proteins in biotechnology.

The finding, published in the current issue of the Proceedings of the National Academy of Sciences, may also offer clues to the origin of more run-of-the-mill inteins—stretches of extraneous amino acids that interrupt proteins. Inteins are similar to the better known introns, sequences of extra DNA that commonly interrupt genes. Introns, however, are cut out of the RNA code for making a protein before the code is translated into an amino acid sequence. Inteins, on the other hand, are encoded in both RNA and DNA; only after they are translated into proteins do they remove themselves and splice the interrupted protein back together. In a process similar to some intron splicing mechanisms, the intein forms a loop, bringing the protein fragments together, and then catalyzes the formation of a normal peptide bond between them.

Researchers discovered the first intein in 1990, and to date scientists have found more than 70 examples, but no one had yet found a split intein, with the DNA encoding its loop-forming ends located in different places on the chromosome.

Liu and his colleagues at Dalhousie University in Halifax, Nova Scotia, made their discovery while “mining” the complete genome of a cyanobacterium called Synechocystis. They found that the genetic code for a key protein called DnaE, which helps to replicate DNA, was split between two genes separated by a very long stretch of other DNA. They also found telltale signatures of intein ends in the DNA in both genes.

Two other groups independently found the same signatures, but Liu's group is the first to report biochemical evidence that the intein works. The enzyme is too rare to be detected in Synechocystis, so the team inserted copies of the two genes, intein signatures included, into Escherichia coli bacteria and forced the bacteria to overexpress these genes. Three proteins were produced in quantity: the products of the two individual genes and a third, larger protein the same size as the other two spliced together, minus the intein fragments. The team examined parts of this large protein's amino acid sequence, including the suspected splicing site, and found that it was identical to the predicted DnaE protein, similar to those found in other bacteria. Thus they concluded that the split intein is active in cells.

Researchers hope that additional work on the split intein, which lacks a DNA-cutting sequence seen in most inteins, may eventually help solve the mystery of how inteins arose in the first place. Researchers have argued whether the original inteins had the DNA-cutting function—which is suspected of helping inteins spread from one genome to another—or were simply ancient protein manipulators, sewing together protein fragments to make new and improved enzymes. Liu's team is studying DnaE genes from closely related species, seeking clues to what this intein looked like before the splitting event.

The find may also help protein engineers find better ways to manipulate and produce proteins. Some therapeutic proteins, such as human growth hormone, are toxic in high amounts to the organisms enlisted to manufacture them. With a split intein, researchers could make the protein in two pieces in different organisms and assemble them later, Paulus says. Based on studies of regular inteins, at least two teams have already had some success at producing artificial split inteins. But Paulus says that perhaps nature does it better: “The fact that it can occur in [nature] means it's potentially a very efficient process.”

• *The annual meeting of the Canadian Institute for Advanced Research Program in Evolutionary Biology, 25–29 July.

7. PHYSICS

Accelerator Gets Set to Explore Cosmic Bias

1. Andrew Watson*
1. Andrew Watson is a science writer in Norwich, U.K.

An understanding of why the universe is biased in favor of matter may have come a step closer with a burst of collisions in a particle accelerator that has a bias of its own. Called the Asymmetric B Factory and based at the Stanford Linear Accelerator Center (SLAC), the machine collides a beam of electrons, accelerated in a ring 2200 meters around, with positrons, their antimatter partners, accelerated to lower energies in a second ring of the same size. The collisions spawn B mesons, particles containing heavy bottom quarks, and the energy mismatch flings the B's off to one side for study. On 23 July, just days after the positron ring was completed, the two rings collided particles for the first time—a critical step in the long process of getting this novel facility up and running, which should be completed early next year.

“We're very excited about what we have managed to do,” says project leader Jonathan Dorfan. “It's definitely a milestone,” agrees George Brandenburg of a competing facility, CESR, the Cornell Electron-Positron Storage Ring. The B mesons made in the Stanford machine, CESR, and other colliders around the world should enable physicists to probe a phenomenon called CP violation, a subtle effect that distinguishes matter from antimatter and could explain why we live in a matter-dominated universe. The asymmetric Stanford machine could offer an especially sharp view of the phenomenon, because it boosts the short-lived B mesons to a large fraction of the speed of light, extending their lifetime through the time dilation predicted by Einstein's theory of relativity.

The new machine, built on time and on budget at a cost of $177 million, uses electron and positron beams from the existing SLAC linear accelerator. It stores the 9.0-billion-electron-volt (GeV) electrons in the old, rebuilt PEP ring, while a new ring stores the lower energy, 3.1-GeV positron beam. The two superbright beams are brought into collision at a single crossing point, where the BaBar detector, now nearing completion, will watch for the creation and subsequent decay of about 100 million B mesons per year.

“The asymmetric energies make the design of the interaction region very complicated,” says SLAC's John Seeman. The challenge, Dorfan explains, was designing a set of magnetic optics that can handle two beams of different energies simultaneously. The payoff, he believes, will be a better understanding of the symmetry between matter and antimatter, and why it breaks down.

In almost all particle interactions, matter and antimatter show a basic equivalence, CP symmetry. CP symmetry holds that the behavior of a set of particles and that of the matching antiparticles look identical—one system is a mirror-image of the other, with all the particle spins reversed. But, mysteriously, some exotic particle systems violate CP symmetry. “CP violation is one of the remaining enigmas of the standard model of particle physics,” says Andreas Schwarz at the DESY accelerator center in Germany. It “can be linked to the very fact that matter dominates over antimatter in the universe.”

B mesons, containing either a bottom quark or its antiparticle, are thought to show especially strong CP violation when they decay, making them ideal for probing this gray area in particle physics. That has spurred a worldwide surge of interest in accelerators that can mass-produce B mesons. Cornell, which lost out to SLAC 5 years ago in a competition for government funding for an asymmetric collider, will upgrade both the CESR accelerator and its CLEO detector in the middle of next year.
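Those beam energies fix the boost the B mesons receive. As a rough cross-check using standard relativistic kinematics for head-on ultrarelativistic beams (textbook formulas, not spelled out in the article):

```python
import math

# For head-on ultrarelativistic beams of energies E1 and E2, the
# center-of-mass energy is 2*sqrt(E1*E2) and the produced system moves
# with beta*gamma = (E1 - E2) / (2*sqrt(E1*E2)).
E1, E2 = 9.0, 3.1                      # beam energies quoted above, in GeV
cm_energy = 2 * math.sqrt(E1 * E2)     # about 10.6 GeV
boost = (E1 - E2) / cm_energy          # beta*gamma of the B-meson system
print(round(cm_energy, 2), round(boost, 2))
```

The resulting center-of-mass energy of roughly 10.6 GeV and modest boost are what let the short-lived B mesons travel a measurable distance before decaying, the time-dilation effect the article mentions.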
DESY has a B meson project of its own, says Schwarz. And across the Pacific the sun is rising on the world's other asymmetric B factory, under construction at KEK, the Japanese high-energy physics lab near Tokyo, which is likely to produce its first collisions by the end of the year.

For now, Dorfan and his team are still coaxing their new machine to its full brightness and learning how to operate it efficiently. “We're not about to start physics next week,” says Dorfan. At about the end of the year, the 1000-ton BaBar detector will be slotted into place, and by next spring the machine will begin exploring the universe's fundamental bias.

8. SPACE

Engineers Dream of Practical Star Flight

1. James Glanz

Why settle for poking through the clutter of the solar system when you can break out into interstellar space? That was the mood last week at a workshop on Robotic Interstellar Exploration in the Next Century, held at the California Institute of Technology in Pasadena and sponsored by NASA's Jet Propulsion Laboratory (JPL). Engineers took the opportunity to engage in some uninhibited thinking about practical—or, at least, plausible—ways to propel, control, and communicate with an interstellar probe.

One enthusiast is NASA Administrator Daniel S. Goldin, who has directed NASA's Office of Space Science to investigate the possibilities for interstellar flight. The notion is also getting a boost from the recent discovery of planets around other stars. Although the first interstellar probes would probably aim for nearby interstellar space, the ultimate goal would be to reach other planets within, say, 40 light-years of Earth. “If you can find them and image them, maybe you should think about visiting them,” says JPL deputy director Larry Dumas. That idea, says Dumas, “is so audacious that it stimulates and confounds at the same time”—which is exactly the point, say researchers.
The requirements of a journey thousands of times longer than any spacecraft has ever taken are so daunting that some people find them laughable. But even skeptics say that some of the novel propulsion, robotics, and communications concepts discussed at the meeting could pay off for travel within the solar system, if not to the stars. “I think it is enormously valuable and stimulating,” says Louis Friedman of the Planetary Society in Pasadena. “I would just caution that the reality of interstellar flight is far off.”

The scientific interest is already there, says Richard Mewaldt, a physicist at Caltech who spoke at the workshop. The solar system sits inside the heliosphere, a bubble blown into the ionized gases of the interstellar medium (ISM) by a wind of particles from the sun. The ISM reflects the makeup of the galaxy billions of years ago, before the solar system formed, and researchers would like to probe its composition and magnetic fields. They would also like to sample cosmic rays in the ISM, because many of them can't penetrate the heliosphere, and survey two distant reserves of comets: the Kuiper Belt just outside the orbit of Pluto and the Oort Cloud in nearby interstellar space. A spacecraft at the right location in the ISM could even use the sun as a colossal gravitational lens to bend light rays from objects in the far reaches of the universe, magnifying them. “There's science to be done all the way,” says JPL's Sam Gulkis.

But just to reach the heliosphere's edge, perhaps 100 Earth-sun distances (100 AU) from the sun, in a reasonable time, a craft would need a propulsion mechanism that is thousands of times more powerful than conventional, chemical rockets yet doesn't require carrying large amounts of fuel. (Today's spacecraft would take at least 30 years to make the journey.)
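The scale of that propulsion challenge is easy to check with a back-of-envelope calculation (our own sketch, not one from the workshop), assuming straight-line travel at constant speed:

```python
# How fast must a probe cruise to cover 100 AU in a given time?
AU = 1.496e11      # meters per astronomical unit
YEAR = 3.156e7     # seconds per year

def cruise_speed_km_s(distance_au: float, years: float) -> float:
    """Average speed in km/s needed to cover distance_au in the given time."""
    return distance_au * AU / (years * YEAR) / 1e3

v_30yr = cruise_speed_km_s(100, 30)   # ~16 km/s, roughly what today's probes manage
v_10yr = cruise_speed_km_s(100, 10)   # a "fast" heliopause mission needs ~47 km/s
```

The 30-year figure matches the speed of the fastest outbound spacecraft of the day, which is why the article's "at least 30 years" estimate holds; cutting the trip even to a decade demands several times that speed, and reaching the nearest stars multiplies the requirement by thousands.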
Three approaches have emerged as contenders, says Henry Harris, the JPL researcher who organized the workshop: thrusters or sails driven by Earth-based lasers, matter-antimatter annihilation, and nuclear power. In the first concept, a laser fired from the ground is reflected off a mirror and focused into a chamber at the back of the spacecraft, heating gases that then rush out of a rocket to generate thrust. The concept “is very efficient, because you're leaving your engine on the ground,” says Harris. Before the craft leaves Earth's atmosphere, ambient air could serve as the propellant.

At the workshop, Leik Myrabo of Rensselaer Polytechnic Institute in Troy, New York, described actual flight tests in which he fired a 10,000-watt laser into a Coke-can-sized facsimile of a spacecraft and lifted it about 30 meters off the ground, says Harris. He says that million-watt lasers, which already exist, could fling objects into orbit, at a calculated cost of about $500 per kilogram for the electricity.

Outside the atmosphere, such a probe would need to carry its own supply of propellant, which could be bulky. A better strategy for harnessing laser power might be to equip a craft with a large, reflective sail that would catch and deflect the beam from a laser—or even plain old sunlight—and accelerate under the bombardment of photons. Harris, who leads a program involving several NASA labs, the Army, the Air Force, and the Department of Defense to develop space sails, calculates that a ground-based, 46-billion-watt laser firing at a craft that has a 50-meter sail could send 10 kilograms to Mars in 10 days. A billion watts “is a lot,” allows Harris, with more than a touch of understatement—it's roughly the output of an average electric power station.
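Harris's sail numbers can be sanity-checked with the textbook photon-pressure formula (a sketch of our own, not his model): a perfectly reflective sail illuminated by a beam of power P feels a thrust F = 2P/c. The Mars distance used below is a rough value for a close approach, and beam divergence and the sail's own mass are ignored.

```python
c = 2.998e8            # speed of light, m/s

def sail_thrust(power_w: float) -> float:
    """Thrust in newtons on an ideal reflective sail: F = 2P/c."""
    return 2.0 * power_w / c

F = sail_thrust(46e9)          # ~307 N from a 46-billion-watt beam
a = F / 10.0                   # ~31 m/s^2 acceleration for a 10-kg craft

# Mars at a close approach is roughly 7.8e10 m away; covering that in
# 10 days requires an average speed near 90 km/s ...
v_avg = 7.8e10 / (10 * 86400)

# ... which, at this acceleration, the craft reaches in under an hour,
# so the trip is essentially a long unpowered coast.
t_burn = v_avg / a
```

The arithmetic shows why such a scheme is even discussed: a modest-sounding thrust of a few hundred newtons, applied to a 10-kilogram craft, builds up enormous speed very quickly.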

Another propulsion concept, based on the annihilation of matter with antimatter, faces even bigger scientific hurdles. But it too would require only small masses of fuel to power a craft into deep space—assuming sufficient quantities of antimatter could be produced and stored. Still more futuristic engines would scoop hydrogen right out of interstellar space and use it as fusion fuel.

“These three technologies may have the capability of getting us to the nearest stars in a reasonable time—10 to 100 years,” says Gulkis. Once a probe gets into interstellar space, communications delays of hours, weeks, or years rule out controlling the spacecraft from the ground. So other talks at the workshop dealt with ways to get an interstellar probe to operate autonomously during its long, lonely voyage. Another challenge comes at the journey's end: sending back data across a distance of light-years. Laser beams aimed at Earth might be the answer, some participants suggested. Because the lasers could be more tightly focused than radio beams, they could in principle be millions of times more efficient.
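The efficiency claim for laser downlinks follows from diffraction: a beam's divergence angle scales with its wavelength, so an optical beam spreads far less than a radio beam from a comparable aperture. The sketch below uses illustrative numbers of our choosing (a 1-micrometer laser through a 1-meter telescope versus a 3.6-centimeter X-band signal from a 3.7-meter dish), not mission specifications.

```python
def divergence(wavelength_m: float, aperture_m: float) -> float:
    """Diffraction-limited beam divergence, theta ~ 1.22 * lambda / D (radians)."""
    return 1.22 * wavelength_m / aperture_m

theta_laser = divergence(1e-6, 1.0)     # optical beam from a 1-m telescope
theta_radio = divergence(3.6e-2, 3.7)   # X-band beam from a 3.7-m dish

# For a receiver at a fixed distance, the captured fraction of the transmitted
# power scales as 1/theta^2, so the laser link's relative advantage is:
gain = (theta_radio / theta_laser) ** 2   # on the order of 10^7 to 10^8
```

With these numbers the laser beam is tens of millions of times more concentrated at Earth, consistent with the "millions of times more efficient" figure quoted at the workshop.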

“The programmatic requirements are daunting,” concedes Goldin. But if researchers meet the challenge, “it opens up the prospects for some truly innovative missions,” he says. “It may be a probe to sample the interstellar medium … or a mission to explore the Kuiper Belt. But one thing is for sure: It will literally be out of this world.”

9. BIOTECH REGULATIONS

Paving the Way for British Xenotransplants

1. Nigel Williams

The transplantation of animal organs into humans moved a step closer in Britain last week, when the government circulated to hospitals a set of national guidelines intended to ensure that proposed clinical trials don't put patients or the public at risk of new diseases. The proposed trials remain highly controversial, but by providing a regulatory framework, the new rules encourage companies to move ahead, say officials at Imutran, a Cambridge-based company working toward eventual xenotransplantation trials.

Xenotransplants could help meet a serious shortage of donor organs: Globally, only one person out of every three who need new organs will find a donor this year. Organs from animals such as pigs could make up the deficit. Although the human immune response would normally destroy transplanted animal tissue, researchers can now produce “humanized” animal organs from pigs; the pigs have been genetically engineered so that their tissue doesn't produce the molecules that trigger an early portion of the human immune response.

But there are concerns that animal tissues might harbor hidden pathogens, which might pass from animal to human and threaten patients or even the general public. For example, studies at the Institute for Cancer Research in London have found that in the test tube, a pig retrovirus can infect human cells, raising fears that such retroviruses could pass to xenotransplant recipients. “It is a question of balancing the needs, which are real because there are never going to be enough human transplants, against the possible dangers of using animals,” says Lord Habgood, chair of the U.K. Xenotransplantation Interim Regulatory Authority (UKXIRA) and a former bishop of York who trained as a pharmacologist.

In the United States, these worries have led the Food and Drug Administration to consider xenotransplants among the other biomedical technologies it regulates, but in Britain the regulatory machinery had lagged behind. Now Britain has its first national review procedures for assessing xenotransplant risks. All applications for clinical trials will be scrutinized by the authority, which will then make a recommendation to the health minister for a final decision. Human trials will take place only “if and when we are fully satisfied that the risks associated with such procedures are acceptable, taking account of all the available evidence at the time,” Health Secretary Frank Dobson said. UKXIRA is also building up a long-term surveillance plan to monitor any infections arising from xenotransplants. Its recommendations are only advisory at the moment, but Dobson said that they may become legally binding if there's a great deal of public concern.

Animal rights organizations, which oppose the use of animals as organ donors, said the new rules were “a very backward step in terms of animal welfare [that] could pose serious health risks to the human population,” as Mike Baker, chief executive of the British Union for the Abolition of Vivisection, put it. But biotech companies welcomed the new framework. Officials at Imutran, which is now a division of the Basel, Switzerland-based biotech giant Novartis, say that the path toward trials is now more straightforward, as it's easier to deal with a standardized national approach.

Imutran researchers are scanning for pig viruses in 160 patients worldwide who have received small portions of pig tissue, such as blood vessel valves; they are also studying monkeys that received pig grafts. If the results, expected later this year, are promising, the company will apply for a human trial. The first such trials may examine the benefits of using a “humanized” pig liver outside the body as temporary support for a patient awaiting a human organ.

10. SCIENCE POLICY

USGS Nominee Breaks Ground

1. Jocelyn Kaiser

The White House has tapped a new chief for the U.S. Geological Survey (USGS), the Interior Department's science agency. Last week President Clinton announced his intention to nominate Charles Groat, a geologist who's a familiar face in policy circles but little known among researchers.

Groat, associate vice president for research at the University of Texas, El Paso, has headed Louisiana's geological survey, served on several National Research Council panels, and spent 2 years as the American Geological Institute's executive director. Groat says it's a “fair appraisal” that his expertise lies in “applications of science to decision-making.” He adds: “That's frankly what I think the survey needs more than anything else.”

Some grumble that Groat lacks the research muscle of past USGS directors. As a rule of thumb, “the best thing for the survey is someone whose scientific credentials are unimpeachable,” says Debra Knopman, a former Interior official and USGS geologist now at the Progressive Policy Institute in Washington, D.C. “You want someone who's above the fray.” Sources say Groat's name may not have been on a secret list of potential directors provided to Interior by the National Academy of Sciences. Interior Secretary Bruce Babbitt declined to comment, but defended Groat's scientific credentials: “Look at his bio—what he's done and where he's taught,” he said.

Others say Groat could give the embattled agency a boost. His “policy experience will bring a new perspective to the role of USGS director,” says Mary Lou Zoback, a USGS geophysicist in Menlo Park, California. USGS is coping with staff cuts and a drive to make it customer-oriented (Science, 19 September 1997, p. 1755). Moreover, the agency has always struggled to defend its mission, which includes mapping, hydrology, seismology, volcanology, and—since the National Biological Service was folded into it in 1996—biology. Indeed, 3 years ago Congress came within a hair of eliminating USGS. If confirmed by the Senate, Groat says he plans to “raise the profile” of the agency—without cutting science: “The issue is making sure the fundamental science is aligned with what the future needs are.”

11. PLANT GENETICS

Agricultural Biotech Faces Backlash in Europe

1. Nigel Williams

Genetically modified foods have met virtually no consumer resistance in the United States, but in Europe they are provoking fears about safety and environmental damage

London—Guy Watson, Britain's largest grower of organic vegetables, may seem an unlikely warrior in a battle that is roiling Europe's food industry and sending tremors through the board rooms of U.S. biotechnology companies. But Watson's peaceful farm at bucolic Buckfastleigh in southwest England has been on the front lines of an increasingly bitter struggle. Land adjoining the farm is being used by the National Institute of Agricultural Botany, which has won approval for a trial of genetically modified maize, and Watson believes that these experimental plants may contaminate his own organic maize when they release pollen later this month. So he challenged the trial in Britain's High Court, seeking to get the test plants destroyed. “I'm disappointed and angry. This is not what consumers want, and things are moving too fast for the full environmental impact to be assessed,” he says.

Watson's legal quest failed last month, when the court upheld the institute's right to grow the experimental crop. But the case received huge media attention, and his fight has become a cause célèbre that led to an emergency debate in Parliament last week. It is the latest battle in a continent-wide campaign by consumer, environmental, and conservation groups to prevent genetically modified crops being grown on European soil or being imported from outside. In the United Kingdom, a poll this year found 77% of people want genetically modified crops banned, while 61% do not want to eat genetically modified food—attitudes typical of those in many European countries. Austria and Luxembourg are locked in dispute with the European Union (EU) over a genetically modified variety of maize that the EU has approved but they have banned from being planted in their fields. And Norway has banned all products from crops containing antibiotic-resistance marker genes, which have been used in the development of several crop species. Critics fear the transplanted genes could be transferred to other species.

Even Britain's royal family has joined the debate: Prince Charles, who farms his estate in western England organically, wrote a high-profile newspaper article earlier this summer attacking the development of genetically modified crops. “I happen to believe that this kind of genetic modification takes mankind into realms that belong to God and to God alone,” he said. Numerous crop trials have been destroyed by protesters, and one U.K. group trashed a display of genetically modified wheat by the Biotechnology and Biological Sciences Research Council (BBSRC) at a spring farming show this year.

Supermarkets have responded quickly to such public concerns about genetically modified food. In the United Kingdom most of the major retailers have introduced plans to label products containing genetically modified ingredients even before a proposed EU regulation forces them to do so. And some are committed to reducing or eliminating such products (see sidebar).

U.S. and multinational biotechnology companies are increasingly alarmed and surprised at the level of resistance in Europe to what they see as safe and innocuous technology. They view the new techniques, which have been embraced by many farmers in the United States and elsewhere with little public concern, as a seamless extension of traditional plant breeding. The United States has already approved more than 30 genetically modified crops for commercial use, with many more under trial. From a standing start in 1996, 27% of U.S. plantings of soybean are now genetically modified to carry resistance to herbicides, and the share is expected to grow rapidly. Some European consumers “are not accepting this product and the benefits of biotechnology as quickly, and that is creating trade problems,” Hendrick Verfaille, president of the multinational biotech company Monsanto, told a recent conference of U.S. and Canadian seed traders in Toronto.

The EU has tried to bring order to the situation, but its directives, which guide national regulations, have come under fire from biotech companies as too opaque and ineffective and from critics for not taking wider public concerns into account. “Our biotechnology industry has expressed considerable frustration at the cumbersome and unpredictable procedures in the [EU] and at the length of time it takes for the EU to review and approve products for commercialization,” says Tim Galvin of the U.S. Department of Agriculture's (USDA's) foreign service in Washington, D.C., who gave evidence to a British House of Lords inquiry on the introduction of genetically modified crops last month. “Unless Europe can sort out its review processes, we could see a trade war developing.”

Grassroots movement

The reasons for Europe's apparent Luddism are many and complex. In some countries, there is a general abhorrence of any genetic manipulation because of Nazi abuses of genetics in the name of science. There is also a general distrust of the food industry and official regulators, following numerous scares from salmonella, through Escherichia coli, to bovine spongiform encephalopathy (BSE or mad cow disease). Opponents argue that, although consumers may be taking risks by eating genetically modified food, all the benefits go into the pockets of (often U.S.-owned) biotech companies. And there are genuine differences between farming practices in the United States and Europe, where many farms are still small and family-run and wildlife is dependent on particular farming techniques that critics fear will be changed by the new crops.

Yet the vehemence of the opposition to trials of genetically modified crops is surprising in view of Europe's willingness to embrace biotechnology for medical and other uses. There has, for example, been little ethical concern about the introduction of genetically engineered insulin for treating diabetes, or a genetically engineered version of the enzyme chymosin for cheesemaking. Ironically, because chymosin is traditionally extracted from calves' stomachs, the innovation has made cheese more acceptable for many vegetarians. And genetically modified food is already on sale: in 1996 Britain approved the sale of a tomato paste produced from plants modified to delay fruit-ripening, which was voluntarily labeled as genetically modified. The product sold well when it was introduced, says a spokesperson for one of the retailers selling it.

There are also clear signs that Europeans do see the benefits of genetic manipulation. In Switzerland, a national referendum in June on a proposal to severely restrict all transgenic research on animals and plants was defeated by a 2-to-1 margin (Science, 12 June, p. 1685). But it is not a blind acceptance. “In medical genetics, the public may have an eventual gain in terms of better diagnosis and treatment. By contrast, in agriculture the only clear beneficiaries of genetically modified crops are agrochemical companies, who get to retain their market share, while the public, and the environment, is left with the potential risks to their health,” says biologist Tom Wakeford of the University of East London.

The trigger for the current wave of opposition was the unannounced arrival in Europe last year of products derived from genetically modified soybeans imported from the United States. Because there is no requirement to differentiate between modified and conventional beans in the United States, European consumers found that, unknowingly, they were eating foods that may have contained soybeans with genes for herbicide resistance. “I think that recent history with multinational companies bringing food products into Europe shows how important that early voluntary decision to label was,” says geneticist Don Grierson, who led the work to develop the genetically modified tomato used in the tomato paste sold in Britain. “People were outraged because they wanted to be treated—rightly—as individuals with minds of their own,” he says.

Although regulatory bodies have determined that the modified soybeans present no health hazards, tampering with the food chain without public consultation touches an extremely raw nerve—especially in Britain, which is still blighted by the legacy of BSE. “BSE was a watershed for the food industry in this country. For the first time people realized that merely attempting to ensure a culinary end product was safe to eat was not a good enough approach. We had to look at the entire process by which food is produced,” says a spokesperson for Britain's Soil Association, which licenses organic growers.

There has also been a huge increase in demand across Europe for organically produced products. Already this year 140 British farmers have applied for accreditation as organic producers—445 are currently licensed—and the number of European organic producers has risen by 24% to 62,000 since 1996. Some countries, such as Sweden and Austria, are now almost 9% organic in terms of land area compared with about 2% 5 years ago.

More systematic samplings of public opinion have provided little comfort for the biotech industry. A “citizen's panel” project, organized by the University of East London earlier this year, provided an opportunity for 12 members of the public with no specialist knowledge of biotechnology to give their verdict on the technical issues after putting questions to a range of expert witnesses. The panel concluded that genetically modified foods provide no benefit to the consumer and that the risks they pose, both to long-term human health and to the environment, are unknown. However, they were not against laboratory research continuing into possible future benefits.

France also recently held a high-profile public “consensus conference” on genetically modified crops. A polling organization identified 14 lay people who had no prior scientific knowledge; they were then given intensive briefings and posed questions to experts. After that intense exposure to the issue, the panel called for the prohibition of antibiotic marker genes in transgenic crops, separation and labeling of transgenic and unmodified products, and a legal liability on any unforeseen consequences of introducing a transgenic product into food or the environment.

Environmental backlash

Public fears about safety are not the only problem agricultural biotech companies face in trying to market genetically modified products in Europe. Critics have also raised concerns about the possible environmental effects of introducing crops that might change farming practices. They argue, for example, that planting herbicide-resistant varieties could lead to changes in the use of herbicides that, in turn, might damage critical habitats. “There is insufficient assessment of any wider environmental impact of the effects of management practices that may be changed in growing the crop,” says population biologist Brian Johnson, adviser on genetically modified organisms to the conservation body English Nature.

Changing farming practices is a key issue, conservationists say, because the farming environment in Europe is different from that in the United States. “In the U.S., you have farming, or you have wildlife, with only 28% of the land cultivated,” says Johnson. “In Europe, farming and wildlife are intimately interlinked with 80% of U.K. land cultivated. So the impact of genetically modified crops, and the new management plans for the use of pesticides for herbicide-resistant crops, may have a devastating impact on wildlife species, many of which have already been highly damaged by intensification,” he says.

“Narrow strips of land around field margins left to grow weeds and other wild plants provide a vital habitat and food source for many creatures, and are highly vulnerable to changes in management practices,” says a spokesperson for Britain's Royal Society for the Protection of Birds (RSPB), which, with more than 1 million members, is Europe's largest conservation charity. Any changes in pesticide use that could destroy these plants could have a serious impact on wildlife. “There's no requirement to look at the effects of a genetically modified crop on other organisms. It's a very flawed process,” says Johnson. English Nature and the RSPB have called on the government to introduce a moratorium on commercial release until further work on the environmental impact of genetically modified crops can be better assessed.

European researchers are also beginning to find evidence of a potential environmental impact of genetically modified crops themselves and the need to monitor their effects carefully. Some groups have found evidence that genes from genetically modified crops can be transferred to native species via pollen. Other work by researchers at the Swiss Federal Research Station for Agroecology in Zurich has shown that lacewings, a natural predator of aphids, may be harmed by eating aphids on maize modified to express an insecticidal protein from Bacillus thuringiensis. Studies led by Nick Birch at the Scottish Crop Research Institute in Dundee have also found a similar effect with genetically modified potatoes containing a novel lectin, which reduces aphid attack without killing them. Ladybirds that feed on these aphids suffered significant loss of viability of their eggs compared to ladybirds feeding on control aphids. “There is a problem with monitoring programs. They have been a little bit forgotten,” says Marcel Bruch, a biotechnology adviser to the Luxembourg government. But Swiss drug and agrochemicals giant Novartis says that extensive studies on its modified maize show that it is as safe as conventional maize in terms of its impact on beneficial insects and other wildlife.

Regulatory disharmony

Biotechnology companies hoping for relief on the regulatory front are facing disappointment there, too. The EU's attempts to ensure that uniform approval procedures for genetically modified crops are adopted across Europe seem to have stalled. In 1992, the European Commission, the EU's executive in Brussels, approved a directive spelling out licensing procedures for trials of genetically modified crops in the field and their commercial release. Each national government was required to incorporate it into its own law. According to the directive, if a crop is licensed for commercial growing following trials in one or more member states, then all member states must include the crop in their national lists of varieties approved for sale and cultivation.

That aim was soon put to the test. In 1995, the French government approved the commercial release in France of a genetically modified maize developed by Novartis. That approval was endorsed by the Commission in 1996 so that growers across all 15 member states could adopt the new crop. Austria and Luxembourg, however, refused to adopt it. Meanwhile, in France, after the Socialist Party wrested power from the conservatives in the 1997 general election, it bowed to pressure from Green Party colleagues and the public and last November announced a moratorium on any further approval or commercial releases of genetically modified crops.

Critics also contend that national licensing systems are open to abuse because safety data submitted to regulatory bodies comes from industry, and industry is also responsible for following up any permitted release. “They need to tear up [the directive] and start again,” says biologist Mark Williamson of the University of York, who also presented evidence to Britain's House of Lords inquiry.

The European Commission is now consulting with interested parties on major amendments to the directive. At the same time, the EU has also introduced plans to enforce the labeling of products containing genetically modified ingredients, starting later this year. But again, the plan has infuriated U.S. authorities. “The proposed [labeling] regulations have a questionable scientific basis and are ambiguous and impractical,” says USDA's Galvin.

Biotech firms go public

Concern about public opinion has led the U.K. government to establish a new panel to develop public consultation on the future of the biosciences. Britain, which held the presidency of the EU for the first half of this year, championed the need to bolster biotechnology. The science minister, John Battle, told a special conference in Brussels in June that issues of public perception had to be addressed. “The debate about biotechnology is still to be won,” he said. And Tom Wakeford, a member of the new consultation panel, says it will have to be careful to allow the public to distinguish between genetic engineering directed toward medical, as opposed to agricultural, applications. “There are fundamental differences in each case as to who are the risk takers and who are the beneficiaries,” he says.

The biotechnology industry has also begun to take its case to the public. Monsanto, which now backs European calls for labeling of genetically modified products, has been running a newspaper advertising campaign in Britain and France. In Britain, the BBSRC has launched a touring exhibit, In-gene-ious, to raise public awareness about biotechnology. Spokesperson Monica Winstanley says the stand has attracted a great deal of interest from farmers and the public wanting to know more about the technology. “We're trying to get to the bottom of what people are concerned about—concerns that are amenable to a realistic response.” How the technology has been handled by the multinational companies is one perceived problem, she says. But some protesters have tried to block the message: At Britain's premier agricultural show last month pots of genetically modified wheat were attacked.

But in spite of current stiff resistance, even the European states that have taken the hardest line are keeping the door slightly ajar. “We don't, in principle, oppose the development of biotechnology,” says Georg Rebernig, a member of the Austrian representation to the EU in Brussels. “Our concern is that there is greater transparency and harmonization on risk assessment,” he says. “The biotechnology industry has huge potential, but it can't force products down people's throats. It's vital the industry does everything possible to regain the trust of the people.”

Others also believe the industry can reverse its current fortunes in Europe. “Our view is that we need more time to do more research on the wider impacts of genetically modified crops. This first generation of crops can be seen as quick and dirty. We'd like to see more sophisticated gene modification of crops and their assessment to show that they don't damage the environment,” says Johnson of English Nature. “We support the development of genetically modified crops that can bring environmental benefits.”

12. PLANT GENETICS

Can Regulations Requiring Labeling of Genetically Modified Foods Work?

Public pressure to label food containing genetically modified ingredients, as well as impending labeling legislation from the European Union (EU), has sent retailers and food manufacturers scrambling to find ways to determine whether the products they sell contain such ingredients. But critics question whether a meaningful labeling system can be achieved.

The proposed EU regulation would require labeling of foods in which “foreign” DNA or protein resulting from genetic modification can be detected. But the U.S. Department of Agriculture (USDA), commenting on the EU proposals, points out that no standard tests or limits of detection have been outlined. As a result, it says, many products may end up being labeled “may contain” or “may be produced from” genetically modified crops, with little benefit to the public. “The U.S. encourages industry to disseminate information concerning genetically engineered foods,” says Tim Galvin of the USDA, “but does not believe that labeling is the most practical way.”

The problem this poses for food companies is highlighted by soybeans. More than a quarter of soybean plantings in the United States—a major supplier to Europe—are now genetically modified varieties, and soybean products, such as flour, oil, and lecithin, are used in a wide variety of processed foods. Because, under U.S. regulations, genetically modified varieties are considered equivalent to the conventional product, no labeling is required, and the major processing companies have not attempted to segregate the two types.

One U.K. retail chain, Iceland, claims that it can get around these problems. Earlier this year, it announced that it will not sell any product containing genetically modified ingredients at all. The ensuing publicity may be one reason the company's sales recently shot up by 14%, says technical manager Bill Wadsworth. “Our aim is a balance between the use of lab tests and an effective audit trail from our sources,” he says. The tests will use the polymerase chain reaction and other methods to detect novel DNA and proteins. The company says it will first eliminate flour and oils from genetically modified soybeans, then products derived from other genetically modified plants, and finally, products from animals that have been raised on genetically modified feeds.

Other companies are also seeking sources of unmodified soybeans, and one major chain, Sainsbury's, promises that products containing genetically modified soy will be kept to a minimum. Sainsbury's, along with another chain, Safeway, introduced the first genetically modified food to Britain—tomato paste in 1996. The paste initially sold well, says a company spokesperson, but the current publicity surrounding genetically modified foods may hurt sales.

Not all biotechnology companies oppose labeling. Monsanto, in its current newspaper advertising campaign, supports the move in the hope that, if it is open about genetically modified foods, it will eventually be able to win over reluctant consumers.

China Hopes to Move FAST on Largest Telescope

1. Li Hui*
1. Li Hui is a reporter for China Features in Beijing.

Chinese astronomers have the go-ahead to design a 500-meter dish that they hope will anchor a major international project

Beijing—The terrain in southwest Guizhou Province—hundreds of round depressions, each surrounded by hills a few hundred meters high—already looks like a scene from another world. If astronomers get their wish, it will someday sprout a collection of instruments that would make it look even more like the backdrop to a science fiction movie.

China has embarked on a project to build the world's largest radio telescope, a spherical dish 500 meters in diameter, in this haunting landscape. The facility could make China a major player in the field. “Perhaps we can even achieve something that will bring a Nobel Prize to China,” says project director Peng Bo of the Beijing Astronomical Observatory. But Chinese scientists are hoping for even more: They see the telescope as the forerunner of a billion-dollar, internationally funded radio array that would probe the very earliest stages of the universe.

Astronomers around the world are looking on with interest. Several years ago, an international team of astronomers began putting together plans for such an array, a cluster of instruments that, in combination, would form a collecting area 1 kilometer on a side. Operating at wavelengths of several centimeters to a meter and at frequencies up to 10 gigahertz, the array would be able to peer back in time, looking for traces of atomic hydrogen, the building block of the universe, which emits a very weak spectral line at a wavelength of 21 centimeters. It could also probe for heavier molecules, including carbon monoxide, that indicate star formation, and study exotic objects such as pulsars and the physics of black holes.

But such collecting power doesn't come cheap. The reigning individual heavyweight of radio astronomy, a 305-meter dish in Arecibo, Puerto Rico, run by Cornell University for the National Science Foundation, would cost about $100 million to replicate. And the square-kilometer array—with a collecting area of 1 million square meters—would require roughly 25 such dishes. (Arecibo has an effective collecting area of 40,000 square meters.) “There is nothing in the square-kilometer array that can't be done, from a technical perspective, except that it would cost many billions of dollars,” notes Britain's Peter Wilkinson of the University of Manchester's Jodrell Bank observatory. So finding ways to save money is critical.

Enter Guizhou. Its plentiful limestone formations, called karsts, provide naturally occurring bowls in which the large receiving dishes can be suspended. “The geology is similar to Arecibo, and they have the largest number of such depressions anywhere in the world,” says Richard Strom of the Netherlands Foundation for Radio Astronomy, who has visited the site and has been active in planning the array. Having an existing hole in the ground reduces construction costs by as much as 90%, estimates Wilkinson. Still, the costs of scaling up are formidable. “It's a huge global undertaking that's unlikely to be decided before 2010,” says Wilkinson.

Chinese officials have decided not to wait before taking the first step, however. This spring, the Ministry of Science and Technology approved $800,000 for preliminary work on a 500-meter spherical telescope, known as FAST. The money is widely seen as a down payment on the $10 million facility, which would have twice the sensitivity and sky coverage of Arecibo at a fraction of the cost. If the design phase goes well, scientists hope to earn a place on the government's list of megascience projects that it will fund in the next 5-year plan, which begins in 2000.
FAST, they hope, would then be the prototype for the square-kilometer behemoth later in the 21st century. First, however, the scientists must overcome several logistical problems.

Although a parabola is the preferred shape for a movable dish because it can focus radiation of any wavelength at a single point, FAST—the size of 16 football fields—is too large to rotate. Instead, it will use a fixed, spherical reflector, which makes positioning easier by providing identical views along any axis. However, a sphere focuses radiation along a line rather than at a point. To gather the radiation, Arecibo, which also has a spherical dish, uses a long, rodlike waveguide suspended above the dish and kept rigid by a 160-ton platform. Such a waveguide can handle only a narrow range of radiation, however, so Arecibo recently added two special correcting mirrors to focus the radiation to a point, allowing the telescope to operate across a broad spectrum. But it's an expensive technical fix.

The Chinese design embodies a lightweight and less costly solution that, in effect, would turn the spherical dish into a parabola. The idea is to build the dish with hexagonal elements, roughly 12 to 15 meters on a side, that could be independently adjusted. “At this size, the difference between a parabola and a circle is only a few feet,” says Strom. Differing combinations of panels would be rearranged as the telescope tracked objects moving across the sky.

The design also modifies the system used at Arecibo and other facilities to collect and amplify the signal before it is processed. (That system, called a feed, moves in tandem with the illuminated part of the reflecting surface.) Lasers will accurately detect the position of the feed system in real time, and the information will be sent back to the central computer. More than 40 scientists from research institutes and universities across the country are now working on FAST.
“Technologically, we can make the telescope all by ourselves,” says Peng. “But we would welcome foreign collaboration.”

Once the telescope is built, China hopes to convince the international science community that Guizhou, with its geography and its isolation from sources of electronic interference, is an ideal place to build the square-kilometer array. Long before then, however, foreign scientists say that an operational FAST would provide a big boost for Chinese astronomy. “China would leap to the forefront of radio astronomy,” says Wilkinson, part of a delegation from the British Royal Astronomical Society that is scheduled to visit next month. “And people seem to be very impressed with what they've seen so far.”

14. RUSSIAN MUSEUMS

Fight Erupts Over Rights to Profits From Holdings

1. Richard Stone

Zoological Institute leads resistance to efforts by Russian Academy of Sciences to share revenue from exhibits and specimens with a new commercial agency

Russia's premier zoological institute is battling its parent body over control of an important source of research funds—revenue from traveling shows and products that showcase its vast holdings. The fight pits the Zoological Institute (ZIN) in St. Petersburg and like-minded institutes against a new agency of the Russian Academy of Sciences (RAS) called the International Academic Agency (IAA) Nauka. The outcome could affect not only ZIN's 15 million specimens, including a prized mummified baby mammoth named Dima that was unearthed in 1977, but also the operations of dozens of other state-owned institutions struggling to adapt to the free market.

Nauka was created last year after RAS's leadership declared that its “museums, various precious collections, archives, and libraries” were “realizing only feebly” the possible revenue from copies, molds, models, and secondary samples of their collections.
RAS set up the agency as a joint venture with Pleiades Publishing Inc., a U.S.-based firm that invested $245,000 in start-up funds. Institutes must receive permission from the academy's presidium to organize any exhibition that bypasses Nauka.

IAA Nauka director Nikolai Parin says that by scouting out new opportunities for exhibitions or by putting together shows involving material from several institutes, his agency should boost revenue flowing into museum coffers. “The whole is always greater than the parts,” says Parin, who adds that his agency's cut will vary, depending on the agreement. “We're looking forward to collaborating with every RAS museum, and we hope we will.” Some 50 museums are on Nauka's list of potential clients.

But researchers at some RAS institutes aren't convinced. ZIN officials say they fear that Nauka will take a big bite out of scarce revenue that supports ZIN's museum. For example, a 1996 exhibition in Germany commemorating the life of naturalist George Steller, discoverer of the sea cow that now bears his name, netted ZIN 20,000 German marks worth of high-quality microscopes. ZIN argues that Nauka, with the academy's blessing, intends to transform Russia's vast scientific collections into mere commodities, and that Nauka's commercial partner, Pleiades Publishing, stands to profit from Russia's precious collections. “To turn these treasures into property—it is a crime!” says Roald Potapov, director of ZIN's museum.

ZIN director Alexander Alimov has retained legal advisers to help the institute force major changes to the 10-year deal offered by Nauka. One of the lawyers, Konstantin Isakov, says the proposed agreement that Nauka has floated to ZIN and other institutes conflicts with Russian laws on “export of cultural values” and on “guarding cultural monuments.”

To make their point, ZIN officials cite the experience of a sister organization, Moscow's Paleontological Institute. PIN has been mired in lawsuits, investigations, and controversies involving, among other things, fossils that have disappeared from its collections. But it has formidable assets: Some 57 fossil skeletons—including a prized 70-million-year-old, $10 million Saurolophus angustirostris—are each valued at $100,000 or more. These specimens are an important source of revenue for an institute where scientists earn only about $100 a month. For example, a recent Russian dinosaur exposition—featuring unique Permian fossils such as the only known Estemmenosuchus uralensis and the species-describing type specimen of Scutosaurus karpinskii—generated $105,000 for PIN during a 7-month show ending last May at the new City Museum in St. Louis.

Similar support came from a 4-year-long Great Russian Dinosaurs exhibition, organized in August 1993 by PIN, the Monash Science Centre in Clayton, Australia, and the Queen Victoria Museum in Tasmania. The exhibition, featured on the cover of Time magazine's Australia edition, “was put together by scientists, and the funds all flowed back into research and education in one way or another,” says paleontologist Patricia Vickers-Rich, science-centre director. The activities, she says, have “definitely helped PIN survive.”

Nauka will now get a share of such proceeds. A 21 October draft “framework” agreement calls for PIN and Nauka to develop “commercial usage of museum exhibits, objects from collections, from archives, and other unique materials,” as well as to make for sale “reconstructions, copies, and casts” of “original paleontological samples,” with Nauka taking a 15% cut. But the percentage in the final agreement—signed last December by PIN director Alexei Rozanov, Parin, RAS vice president Rem Petrov, and Pleiades president Alex Shustorovich—was not revealed, leading to speculation that it may be higher. “Rozanov called it a commercial secret,” complains Masha Hekker, a PIN paleontologist and outspoken Rozanov critic who was dismissed last December and is now fighting her dismissal in court. Seven PIN scientists, including Hekker, wrote to the All-Russian Paleontological Society, saying that the Nauka deal “looks like the beginning of the privatization of collections and other property of the institute.”

And one U.S. collaborator with PIN shares such misgivings. Charles Dean Pruitt, a self-employed mathematician who hooked up with PIN serendipitously in 1993 during a visit to Moscow, organized the St. Louis exhibit and is now negotiating with Nauka to organize shows early next year at the Kansas City Children's Museum and afterward at the Florida International Museum. Pruitt questions the need for Nauka to be involved: “It's unfortunate that the efforts of the existing team of specialists and experts at PIN are being duplicated.” Income from the show, Pruitt says, “goes a long way to keep these people in science instead of selling pencils in a kiosk.”

A subsequent agreement has fueled fears that institutes may get little revenue from some activities. As part of a joint program, PIN staff members earlier this year made two casts of a fossil of an ancient flightless bird called Diatryma steini. The casts were then sold for $5000 each to two German museums. Igor Novikov, who is acting director of PIN while Rozanov recovers from heart problems, says PIN's share was “almost 50%.” Parin defends the figure, saying that PIN staff who made the casts “earned much more than their regular income” and that the money has helped the institute purchase materials to make its own casts. “No casts would have been made at all if it was not for the Nauka effort,” says Parin.

PIN's top staff members seem to agree. “We have to follow the order of the presidium,” Novikov says. “But the agreement also will be fruitful for our institute.” Other key staff members have accepted Nauka as well. “We have to organize these exhibitions to support our research,” says paleontologist Alexander Karhu, PIN's exhibitions supervisor.

Besides the Zoological Institute, Nauka has approached three other institutions. They are the Kunstkamera—a world-renowned collection of pathological specimens and medical oddities in St. Petersburg that was started by Peter the Great—and two Moscow outfits, the Botanical Garden and the Archaeological Institute. Kunstkamera director Chuner Taksami says he too opposes the deal on the table from Nauka.

Staff at these institutes are anxiously watching the showdown between ZIN and Nauka. But, with most Russian scientists and officials spending large chunks of the summer at their dachas, or summer homes, Potapov says the dispute won't be resolved until September at the earliest. That means Dima and other Russian scientific icons must wait a bit longer to find out who'll be profiting from their next public appearance.

15. MAMMALIAN EVOLUTION MEETING

New Views of the Origins of Mammals

1. Dennis Normile

Hayama, Japan—Paleontologists and molecular biologists take different approaches to questions of evolution and often come to different conclusions. Fifty researchers of mammalian evolution from both sides of the fence tried to find common ground here at the International Symposium on the Origin of Mammalian Orders from 21 to 25 July.

Rallying Round the Tertiary Radiation

In recent years, researchers who determine how long species have been diverging based on differences in their DNA have pushed back the dates of emergence of modern mammals—the predecessors of everything from whales to tree shrews—to as much as 100 million years ago. That's far earlier than fossils suggest, but the DNA researchers blame the discrepancy on the notoriously incomplete fossil record. At the meeting, however, two paleontologists went on the offensive, claiming that a close look at the fossil record shows that it is complete enough to date the origin of the modern mammalian orders. If the DNA “clocks” can't agree with the fossils, says the author of one study, paleontologist J. David Archibald of San Diego State University, then “the problem is with the molecular clock.”

Paleontologists have long held that the modern mammalian orders emerged and differentiated into families, genera, and species after the Cretaceous-Tertiary (K-T) extinction 65 million years ago. That event wiped out the dinosaurs and presumably gave mammals more evolutionary breathing room. But many in the molecular camp have argued that several orders of mammals, including primates and rodents, arose more than 35 million years before the K-T boundary. Divergence within a few if not many of the modern orders was well under way during the Cretaceous, they say (Science, 1 May, p. 675).

Now Archibald and ecological modeler Douglas Deutschman, also of San Diego State University, have surveyed the fossil record and report renewed support for the traditional paleontological view. Archibald notes that 15 of the 18 extant orders of placental mammals first appear in the fossil record during an evolutionarily short period of 16 million years in the early Tertiary. Statistically, says Archibald, the probability of this clustering occurring randomly due to gaps in the fossil record is “vanishingly small.”

What's more, Archibald found no evidence that paleontologists have neglected the period before the K-T, creating a gap in the fossil record that might result in an apparent explosion of diversity later. He found roughly equal numbers of fossil sites and specimens for the 5-million-year periods immediately before and after the K-T boundary. Yet only 11 genera of placental mammals have been found in the fossil record in the period before the boundary, while 139 placental genera have turned up in the 5 million years after it. “Something happened to cause this explosion of speciation,” he says.

The reality of the early Tertiary radiation was echoed by John Alroy, a paleontologist at the National Museum of Natural History in Washington, D.C., who did a similar analysis focused just on North American mammals and came to a similar conclusion. If molecular analysts can't find evidence of this explosion in speciation, he says, then they “don't know anything about the evolutionary process.” Archibald suggests that something happened at the time to cause molecular clocks to speed up, making the splits among mammals appear earlier than they actually were.

The paleontologists' arguments were “a real eye-opener,” says Michael Stanhope, a molecular biologist at Queen's University of Belfast, showing that the fossil record cannot be lightly dismissed. “I think there is a good chance we're missing something about the way DNA sequences evolve.”

But not everyone was convinced. Peter Waddell, a phylogeneticist at the Institute of Statistical Mathematics in Tokyo, says that fossil evidence of the ancestors of aardvarks, tree shrews, and rabbits, among others, is missing. “The fossil record is not picking up things we know are there,” he says, “so why close the book on [other] missing modern forms in the late Cretaceous?”

Others caution that the molecular evidence may never exactly fit the fossil data. Molecular phylogeneticists can only work with extant species, for example, and so will “never be able to reconstruct all those species which arose but died out,” says molecular biologist David Mindell of the University of Michigan, Ann Arbor. There may also be a gap between the moment two DNA sequences begin to differ and the moment a species actually divides into two.

“We are hoping the molecular and morphological data converge,” Waddell says, “but molecular people wouldn't necessarily be unhappy if they don't.”

Judged by its DNA, a whale is just an overgrown hippopotamus with an unusual lifestyle. Researchers who learn how living animals are related by studying their DNA have tended to group the cetaceans—whales, dolphins, and porpoises—with the even-toed ungulates, or artiodactyls, which include cows, pigs, and hippos. By some analyses, hippos are the closest living whale relatives. But to paleontologists, who study fossils, that conclusion has long been anathema. Instead, they contend that cetaceans descended from extinct hyenalike mammals called mesonychians. Now the fossil record may be opening the door to a whale-ungulate connection.

At the meeting, Hans Thewissen, a paleontologist at Northeastern Ohio Universities College of Medicine in Rootstown and an expert on whale evolution, described analyses of new specimens of early whales and whale ancestors his team collected in Pakistan. The new specimens weaken the link between the whales and the mesonychians, which was primarily based on similarities in the teeth. But they support the idea that whales are cousins of the ungulates, if not actual members of that group, he reported. “I think there is no doubt that they are very closely related to artiodactyls,” says Thewissen.

One blow to the mesonychian link came from two specimens of a 50-million-year-old whale, a member of the family Pakicetidae. Analysis by a colleague of Thewissen's, Maureen O'Leary of the State University of New York, Stony Brook, showed that its teeth are not as highly evolved as those of the mesonychians, making it unlikely that whales are the descendants of that group. But on the question of whether the cetaceans are an actual subgroup of the Artiodactyla, as the molecular biologists think, this and other fossil whales don't give a clear answer.

Thewissen says that five morphological features of the early whales, including features of the skull, upper teeth, and feet, are “not inconsistent” with the hippo hypothesis. In particular, the new pakicetid skulls have holes over the eye sockets, known as supraorbital foramina. These features are not known in modern whales but are common to all artiodactyls.

But the last molar on the lower jaw, which has three sections in artiodactyls, has just two in whales. And in artiodactyls, the astragalus, one of the anklebones, has a rounded head and other characteristics that make the ankle much more flexible than it is in any other mammal. Thewissen recently discovered an anklebone from an early whale ancestor that still had legs. It lacks the rounded head, although in other respects it is similar to an artiodactyl astragalus.

Still, Thewissen thinks the morphological evidence, although mixed, opens the door to some kind of relation between the whales and the ungulates. He adds that there is now “considerable doubt” that cetaceans are closely related to mesonychians. That conclusion got a thumbs up from paleontologists at the meeting. For example, John Alroy of the National Museum of Natural History in Washington, D.C., says pulling the mesonychians out of the picture makes a closer cetacean-artiodactyl link plausible. But O'Leary says “it's [still] difficult to connect hippos with whales in the fossil record.”

The molecular camp, for its part, viewed Thewissen's conclusion as just a first step toward ultimate vindication. As Norihiro Okada, a molecular biologist at Tokyo Institute of Technology, put it: “I think paleontologists may discover more [features common to early cetaceans and early hippos] in the near future.”