News this Week

Science  09 Oct 1998:
Vol. 282, Issue 5387, pp. 206

    NASA to Buy Research Time to Bail Out Russian Agency

    David Malakoff
    With reporting by Elena Savelyeva in Russia.

    U.S. scientists planning research projects aboard the international space station have long fretted that their experiments would get short shrift from astronauts too busy putting the station together to spend much time conducting science. But a deal negotiated with Russia last week could provide them all the experiment time they want. On 2 October, NASA announced that it will pay Russia's bankrupt space agency $60 million for thousands of hours of cosmonaut time over the next 5 years as well as the rights to store experiments aboard a Russian-built module. The agreement is intended to prevent the $50 billion space station project—whose first components are scheduled to be launched next month—from falling even further behind schedule (Science, 1 May, p. 666).

    NASA and Russian officials say the deal will allow Russia to finish work on the crucial service module, a major piece of the station to be launched next year. And many space scientists are enthusiastic, because it will shift astronaut time into running the extensive array of experiments they want to deploy on the station while it is being assembled. But some members of Congress are attacking the plan, which was expected to be presented formally to a House Science Committee hearing on Wednesday, as a giveaway. “It looks like we are going to pay for work that the Russian astronauts were going to do anyway, simply to keep the project moving,” said one congressional aide.

    Space scientists “have gone on record repeatedly as being very concerned about the number of astronaut hours available for research” during the 5-year assembly phase, says David Larson, a State University of New York, Stony Brook, materials scientist who serves on several panels that advise NASA on station science. “There has been a long-term debate about how well you can use the house while you build it,” adds NASA's W. Michael Hawes, deputy director of the Development Office of Space Flight.

    The new agreement, both NASA officials and scientists say, could go a long way toward alleviating the scientists' concerns. According to agency officials, station astronauts will have roughly 30,000 work hours to divide among various tasks during the assembly process, including construction, operations, and maintenance (see graphic). About 10,000 hours had been reserved for research, split evenly between the United States and Russia. Russia had planned for its cosmonauts to use most of their time on Russian-sponsored experiments. But the country's fiscal crisis has cast doubt on whether those experiments will fly. If they don't, cosmonauts will spend most, if not all, of their science hours tending to U.S.-led research, from medical studies to chemistry experiments. “To potentially double our access [to labor] for that amount of money is a very good buy,” says aerospace engineer Gerard Faeth of the University of Michigan, Ann Arbor, another NASA adviser.

    But some observers wonder if space scientists will be able to take full advantage of the windfall when—and if—it arrives. They note that the $60 million to be paid to the Russians will come at least partly out of the station's engineering research budget, and that NASA has already borrowed other research funds to cover cost overruns. “We will finally have the crew, but will [NASA] have the funds to cover the supporting cadre of scientists on the ground?” asks one researcher familiar with the issue. He is also skeptical that most of the newly purchased hours will end up being used for science. Other researchers question the value to science of the storage space that NASA is purchasing, noting that it may lack the electrical power or environmental conditions necessary to support many experiments.

    For congressional critics of the space station, however, such technical concerns take a back seat to the deal's foreign policy implications. In particular, many Republicans and some Democrats are worried that it signals the Clinton Administration's intention to stand by its increasingly unreliable Russian partners, who have announced they will not be able to meet obligations to build transport spacecraft and other essential equipment without U.S. financial assistance.

    The Administration initially had sought Russian involvement to cut costs and to prevent Russian rocket scientists from selling their skills to hostile governments. But as Russia's economic troubles threatened to cripple the station, those reasons have been eclipsed by frenzied efforts to craft a bailout package for Russian companies. In September, for instance, NASA officials estimated it could cost at least $660 million over the next 4 years to prop up Russia's space station enterprises. The news prompted vitriolic attacks on the station from House Speaker Newt Gingrich (R-GA), among others, raising doubts about whether Congress would back the plan. “Everyone except the White House appears to have woken to the reality that it is time to change the program,” claims one Republican staffer.

    Indeed, one NASA official wonders if the new agreement marks the end of Russia's major role in the space station project. “It certainly sends the signal that, rather than a partner, they are more like a hired hand,” he says. But Russian Space Agency spokesperson Sergey Gorbunov says that's not the case. “NASA is just renting some space [from us],” he says.


    Optical Circuits Turn a Corner

    Andrew Watson*
    *Andrew Watson is a free-lance science writer in Norwich, U.K.

    The ultimate aim of today's telecommunications researchers is to quit dealing with electricity. Communication systems already zip messages across the globe via satellite as microwaves or through optical fibers as infrared light, but at either end of such transmissions the messages must be converted into electrical signals and passed through electronic circuits—a process that slows them down considerably. The solution is to develop circuits that can process the infrared or microwave signals directly. On page 274 of this issue of Science, a team of researchers from Sandia National Laboratories in Albuquerque, New Mexico, and the Massachusetts Institute of Technology (MIT) describes a crucial element of such optical circuits: an artificial structure called a photonic crystal that can transmit light with minimal loss and make it turn a corner.

    Photonic crystals manipulate light in much the same way as semiconductor chips manipulate electricity. But the tiny components of a photonic circuit need to be wired together, and existing technology, such as fiber optics, is too crude for the job, like joining the rooms of your house with a 12-lane freeway. “This experiment models the future wiring” of tomorrow's optical microcircuits, says photonic crystal pioneer Eli Yablonovitch of the University of California, Los Angeles. “I think it could really revolutionize the way that we are making optical circuits,” says Katie Hall of Lincoln Laboratory in Lexington, Massachusetts.

    The key feature of photonic crystals, which were first demonstrated by Yablonovitch in 1991, is a repeating pattern of reflective elements, spaced at roughly the wavelength of the light or other electromagnetic waves to be manipulated. The Sandia experimenters, led by Shawn-Yu Lin, made a photonic crystal from columns of alumina, or aluminum oxide, each a half-millimeter in diameter, set in a grid. Their spacing, about a millimeter apart, enabled the array to manipulate electromagnetic waves of millimeter wavelength, somewhere between the microwave and infrared parts of the spectrum.

    At the surface of each column, part of each wave is reflected and part passes through. “The multiple reflections create a range of frequencies for which the propagation of electromagnetic waves is not allowed inside the crystal,” says team member Pierre Villeneuve of MIT. The photonic crystal's repeating pattern causes the reflected waves to superimpose out of step, so that peak meets trough and they cancel out. “There's destructive interference between all the different waves that bounce back and forth,” says Villeneuve. Exactly which waves have the correct frequency to rebound around the crystal and cancel out is determined by the diameter of the rods and the spacing between them.
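    The interference condition the team describes can be sketched with a toy one-dimensional Bragg calculation. The actual Sandia-MIT lattice is two-dimensional and its band structure is more involved; the effective index below is an invented illustration value, so treat this only as a rough sketch of why a roughly millimeter spacing rejects millimeter waves.

```python
# Illustrative 1D Bragg-stack estimate of where a photonic band gap sits.
# The Sandia crystal is 2D; this one-dimensional analogy only shows why
# a ~1 mm lattice spacing rejects millimeter waves. n_eff is an assumed
# effective refractive index for illustration, not a measured value.

def bragg_wavelength(spacing_mm: float, n_eff: float = 1.0, order: int = 1) -> float:
    """Reflections from successive rows add in phase when 2 * n_eff * a = m * lambda."""
    return 2.0 * n_eff * spacing_mm / order

# Rods spaced about 1 mm apart reflect most strongly near a 2 mm wavelength,
# i.e. in the millimeter-wave band between microwave and infrared.
center = bragg_wavelength(1.0)
print(f"gap centered near {center:.1f} mm")
```

    Waves near that wavelength interfere destructively inside the lattice and are expelled, while waves well away from it pass through, which is why the same structure acts as a filter.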

    Although they work beautifully as filters, photonic crystals get really interesting when you add defects—in the case of the Sandia-MIT work, a missing row of alumina columns—which can support a wave otherwise banned from the crystal interior. This offers the prospect of micromanaging light within the body of the crystal. “If you open up a channel in that photonic crystal … the light's going to follow the small channel you've carved out,” says Villeneuve. Lin and fellow Sandia experimenters Edmund Chow and Vince Hietala found that they could pass millimeter waves along the missing row of alumina columns with virtually no loss.

    MIT theorists led by team member John Joannopoulos had predicted that under the right circumstances, waves would turn a corner from one such corridor into a second. When the researchers added a second corridor at right angles to the first, they found that they could get waves to do just that, cornering in a distance roughly equal to their wavelength. Reproduced at higher frequencies, this bending would mean that infrared waves—of interest to the telecommunications world—could turn through 90 degrees in about a micrometer, 1000 times tighter than anything possible using optical fibers.
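    The arithmetic behind that factor of 1000 is straightforward, assuming (as a round number not stated in the text) that bending light in a conventional optical fiber requires a radius on the order of a millimeter before leakage becomes severe:

```python
# Rough check of the "1000 times tighter" comparison. The ~1 mm fiber
# bend radius is an assumed round number for illustration; the ~1 um
# photonic-crystal bend comes from the article (a turn within roughly
# one wavelength of 1.5-um telecom light).
photonic_bend_m = 1.0e-6   # ~one micrometer, per the article
fiber_bend_m = 1.0e-3      # assumed millimeter-scale fiber bend
ratio = fiber_bend_m / photonic_bend_m
print(f"photonic-crystal bend is ~{ratio:.0f}x tighter")
```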

    The team's eventual aim is to integrate numerous components, such as waveguides, filters, light sources, and modulators, onto a single photonic crystal. The challenge, however, is manufacturing such chips, because the pillars of an infrared photonic crystal have to be fashioned accurately on a scale of micrometers. The necessary size reduction is “pretty tricky,” says Hall. But Villeneuve says light-bending photonic crystals are already in sight: “The first samples at 1.5 microns, which is the telecommunications wavelength, have been fabricated: They are waiting on Shawn's [Lin's] desk ready to be tested.”

  SPAIN

    R&D Budget Request Reverses Long Decline

    Alicia Rivera*
    *Alicia Rivera is a science writer in Madrid.

    Madrid—In an effort to invigorate its poorly funded science community, Spain's conservative Popular Party government this week announced plans to ask for a major increase in research funding in the 1999 budget. Currently, Spain spends just 0.8% of its gross domestic product (GDP) on research and development, placing it firmly in the bottom tier of European research spending. Observers expect parliament—in which the Popular Party has an overall majority—to approve the increase by the end of the year.

    Excluding military programs, the research portfolio will go up about 8%, to $1.8 billion. The government has cited biotechnology and medical research as priority areas for a cash infusion, but it says other fields—including marine biology, energy, and transportation—will also receive special treatment.

    Huge increases in research spending during the 1980s under the former Socialist government pushed government funding for R&D to 0.9% of GDP in 1992. Since then, Spain's economy has faltered, and R&D funding has stagnated or slipped slightly each year. “We are quite behind other developed countries,” says Jesús Avila, a researcher at the Center for Molecular Biology in Madrid. Researchers hope that the 1999 budget will reverse this trend and bring Spain nearer to the European Union average of 1.9% of GDP. “We must continue this trend,” says Fernando Aldana, director of the Office of Science and Technology in Madrid, which oversees distribution of the government's science funds and authored the R&D request. “Spain can only achieve competitiveness through the sustainable growth of the investment in R&D.”

    The main concern among researchers is to find work for the large numbers of young scientists who were trained during the Socialist-led science boom in the 1980s and cannot now find permanent positions or get adequate funding (Science, 20 March, p. 1844). “The big groups have money, but the new ones—some of them very good—must be funded to avoid a disastrous situation in the future,” says Avila.

  U.S. R&D BUDGET

    Three Spending Bills Bolster Research

    David Malakoff
    With reporting by Jeffrey Mervis and Jocelyn Kaiser.

    Congress last week sent a mostly upbeat message to researchers that it is ready to give basic science a fresh infusion of cash for the 1999 fiscal year, which officially began on 1 October. It put the finishing touches on a trio of spending bills that would give the National Science Foundation (NSF) a 7% increase, science spending at the Department of Energy (DOE) a 10% boost, and basic research at the Department of Defense a 6% rise. “We're delighted,” says DOE science chief Martha Krebs, echoing reaction from the research community to the new spending levels.

    The numbers were hammered out by House-Senate conference committees that negotiated compromises between different versions of the bills passed by each chamber, and in some cases have already been approved by both bodies. Not all of the research agencies covered by the bills were lifted by the rising tide, however. NASA and the Environmental Protection Agency (EPA), for example, received essentially flat science budgets from House and Senate conferees. And as Science went to press, Congress had not yet decided what to do with three other key funding bills—including one that could give the National Institutes of Health a double-digit increase—that have become mired in election-year politicking.

    White House officials say President Clinton is likely to sign the bills that emerged last week, once Congress has completed its work. Here are some highlights:

    · NSF: Officials are very happy with the conferees' 7% solution to their $3.67 billion budget and the relatively free hand it gives them. The 8.8% increase in the research account, to $2.77 billion, is less than the Administration's 12% request, but it allows for healthy growth in programs ranging from human-computer interactions to life in extreme environments. Legislators agreed to drop proposed Senate earmarks for several research centers, although they directed NSF to spend $22 million more than it requested to support Arctic research and added $10 million to the $40 million plant genome initiative begun last year by congressional directive. They also added $10 million to a $38 million program to help 18 “have-not” states as part of a broader concern for schools outside the top 100 grant recipients.

    Congress was more specific about how it would like NSF's $663 million education and training directorate to grow. It boosted the directorate's budget by $30 million, but told NSF to spend $13.5 million of the increase on two programs to increase minority participation in science and $10 million on informal science education. On major facilities, legislators once again ignored NSF's request for $21 million for a Polar Cap Observatory in Canada, although they tacked $17 million onto NSF's request for $22 million to nearly complete financing of its $145 million renovation of the South Pole station.

    · DOE: Legislators not only gave DOE's science programs a $217 million boost, to $2.7 billion, but also moved fusion energy from its politically exposed position as a separate budget item into a new four-division Office of Science covering existing energy research programs. Krebs says the change, which also added $1.6 million to the Administration's request for fusion, reflects DOE's success in retooling the $223 million fusion program to emphasize university research rather than technology demonstration. The new office's $809 million Basic Energy Sciences account includes a $130 million boost to begin construction of the Spallation Neutron Source, a $1.3 billion facility scheduled to open in 2005 at Oak Ridge National Laboratory in Tennessee. “It's more than enough to get us off to a good start” but not enough to prevent delays, says project manager Bill Appleton about the funding, which fell $27 million short of the request.

    Congress also ordered an end to DOE's once-grand plans for the $10 billion International Thermonuclear Experimental Reactor. It provided about $12 million to shut down the project, despite DOE efforts to explore scaled-down alternatives. The science office's two other research portfolios—the $696 million health and environment program and the $335 million high-energy and nuclear physics program—received a total of $9 million more than requested, but legislators earmarked $40 million for specific projects that were not in the request.

    · NASA: Space scientists won't see much new funding in 1999. Legislators gave the agency's science programs $2.1 billion, $61 million more than the Administration's request but about the same as last year's budget. Lawmakers also met the Administration's request for $2.27 billion for the embattled international space station (see p. 206).

    · Defense: For the first time since 1993, the Defense Department's spending on basic research will grow faster than the expected rate of inflation. The 6.1% increase, to $1.1 billion, still puts basic science spending 25% below the 1993 level, however. The defense bill also includes more than $200 million for several major biomedical programs, including $135 million for breast cancer research, $58 million for prostate cancer research, and $10 million for a new ovarian cancer program.

    · EPA: The conferees agreed to fund the agency's science and technology account at $650 million, a 3% rise and 3% above the president's request. Within that amount, legislators boosted the agency's program on particulate matter research by $18 million, to $47 million. They also quelled a controversy over energy research related to global change policy in the wake of the Kyoto treaty. “The final language makes clear that we can proceed with commonsense actions to reduce greenhouse gases and to pursue other important environmental goals,” says Todd Stern, the Administration's climate change coordinator.

    Legislators were less generous to the Administration's $110 million a year Next Generation Internet project. For the second straight year, DOE, which had asked for $22 million, was blocked from spending any money. At the same time, plans by the two biggest partners, Defense and NSF, to spend $50 million and $25 million, respectively, remained on track.


    Geologists See Mars in the Canadian Arctic

    Robert Irion*
    *Robert Irion is a science writer in Santa Cruz, CA.

    Mars aficionados of all stripes long to walk upon the Red Planet, but a trip there is a dim and distant possibility. So some planetary geologists have opted to learn about Mars by studying similar regions on its sister planet, Earth. Now a band of NASA scientists has found what may be the most Mars-like setting yet: the Haughton meteorite crater in the hostile Canadian Arctic.

    This 20-kilometer-wide basin may offer clues to the early evolution of the martian landscape, says planetary scientist Pascal Lee of the NASA Ames Research Center in Mountain View, California, who next week will present his team's work at the American Astronomical Society's Division for Planetary Sciences meeting in Madison, Wisconsin. Landslides, sinuous valley networks, and other Haughton landforms bear an eerie resemblance to terrain on parts of Mars, says Lee: “This little microcosm has an amazing variety of geologic features that may have direct analogs on Mars.”

    Scientists who have accompanied Lee to Haughton share his hopes for the 23-million-year-old crater, located on Devon Island, an uninhabited slab of rock west of Greenland and far north of the Arctic Circle. “It's an impact crater in a cold polar desert, so it may be an excellent analog for the climate and geologic processes on early Mars,” says planetary geologist Aaron Zent of NASA Ames. By studying the crater's ice-sculpted terrain, the team may be able to deduce whether similar martian features arose during icy conditions, rather than during a warm and wet period postulated for early Mars. Although NASA has not yet promised more funds to study it, the crater already has captured the imagination of enthusiasts at the private Mars Society, who plan to build a prototype Mars base there.

    Other martian stand-ins on Earth range from ice-covered lakes in Antarctica to wind dunes among volcanic debris in Iceland and Death Valley. But Lee's team says that images from the Viking and Mars Global Surveyor orbiters, plus air and ground surveys of Haughton, show that the crater looks startlingly like parts of Mars in several distinctive ways.

    The similarities may stem in part from how ice and subsurface permafrost interact with the crater's shattered terrain, says Lee. For example, old ice exposed within crater walls might trigger the small landslides seen at Haughton and some martian craters, in which thin layers of rock detach from steep slopes. Haughton once held a lake, and its preserved lake sediments are ice-rich—the only ones known that may match the icy lake sediments in some martian craters. And ice locked within jumbled impact deposits at Haughton has melted to form odd cup-shaped basins at the heads of long valleys, a process proposed for similar valleys on Mars. Finally, Lee's team suspects that fans of narrow channels seen in a plateau next to the crater were carved by meltwater beneath a stationary glacial cap. The channels are the only ones on Earth sharing the strange branching patterns of some martian valley networks. “The wasting away of an ice cover might explain many of these features without requiring rainfall,” Lee says. “Mars might not need to have been that warm in the past.”

    Planetary scientist Steven Squyres of Cornell University in Ithaca, New York, cautions that such similarities could be deceptive. “We don't know what early Mars was like,” he says. “We need to apply what we see at all terrestrial Mars analog sites with a great deal of caution and humility.” Lee agrees but says that studies of the crater's geologic evolution and glacial action will help.

    More study is just what the private Mars Society, founded this year by aerospace engineer Robert Zubrin of Indian Hills, Colorado, is planning. The society has ambitious plans to put a “Mars Arctic Research Station” at Haughton, to help field scientists learn how to interpret Mars-like terrain and test drills, robots, and other mission technology, Zubrin says. Lee, a Mars Society member along with other NASA scientists, is consulting on the structure, which could take 2 years to build and cost about $1 million in funds that Zubrin's group is raising.

    At the moment, NASA has no comment on these grand plans. Officials are waiting for reports from Lee's team—based on two seasons of fieldwork funded by small grants from NASA, the National Research Council, and the National Geographic Society—before deciding on future support, says Carl Pilcher, the agency's science director for solar system exploration. But geologists who have seen Haughton are eager to do more work there. “Haughton has a lot of Mars-like geology in a very compact place,” says astrogeologist James Rice of the University of Arizona, Tucson. “If I can't go to Mars, this may be as close as I can get.”


    Probing the Milky Way's Black Heart

    Govert Schilling*
    *Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    Astronomers have taken their closest look at the mysterious center of our galaxy—and uncovered a further mystery. At the very center of the galaxy lies a black hole with a mass millions of times greater than the sun's. The black hole is invisible, but just outside it, electrons torn from matter falling into the black hole gyrate around magnetic field lines, broadcasting radio waves. By mapping the radio emission with the Very Long Baseline Array, a system of linked telescopes that spans North America, a group of Taiwanese and American astronomers has found that the emitting region is drastically elongated, suggesting that the black hole is somehow shooting jets of material out of the plane of the galaxy.

    “It's an interesting result,” Cambridge University astronomer Martin Rees says of the map, which offers the most intimate view ever of the immediate surroundings of a giant black hole. Rees, who in 1982 was the first to suggest that the radio emission from the galactic center comes from hot gas circulating near a supermassive black hole, adds that “the jetlike shape inferred in the new observations suggests that the emission may come mainly from an outflow”—a conclusion that runs counter to many models of the radio source's structure.

    From the tremendous speeds of the stars whirling around the Milky Way's central radio source, called Sagittarius A*, astronomers had calculated that it must harbor a black hole with a mass equivalent to 2.6 million suns. The region is invisible to optical telescopes because of intervening dust clouds, says team leader Kwok-Yung Lo of the Academia Sinica Institute of Astronomy and Astrophysics in Taipei, so the most detailed view of it comes from synchrotron radiation, the radio waves emitted by fast-moving electrons spiraling in a strong magnetic field. “The intrinsic size and structure of [the radio source] are crucial for our understanding of the immediate vicinity of the massive black hole,” he says.

    Earlier attempts to gauge the size and shape of the radio source were unsuccessful because of scattering by interstellar electrons, which made the radio source look larger than it really is, just as a streetlight looks larger when viewed in the mist. However, these blurring effects vary with wavelength. By combining near-simultaneous measurements at five different radio wavelengths, Lo and his colleagues—Zhi-Qiang Shen from Taiwan and Jun-Hui Zhao and Paul Ho of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts—were able to extract the true size and structure of the source from the scattering. The team presented the results last month at a workshop on the galactic center in Tucson, Arizona, and will publish them in the November Astrophysical Journal Letters.
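    The multiwavelength trick can be illustrated with a toy calculation. Scatter-broadening by interstellar electrons grows roughly as the square of the observing wavelength and adds in quadrature with the intrinsic size, so measurements at two or more wavelengths pin down both unknowns (the team used five, which also overdetermines the fit). All numbers below are synthetic, not the team's data.

```python
import math

# Toy version of the deblurring idea: the apparent angular size obeys
#   theta_obs^2 = theta_int^2 + (k * lam^2)^2
# where k is a scattering constant. Two measurements at different
# wavelengths suffice to solve for the intrinsic size theta_int.

def intrinsic_size(theta1, lam1, theta2, lam2):
    """Solve the two-wavelength system for theta_int (same angular units)."""
    k2 = (theta1**2 - theta2**2) / (lam1**4 - lam2**4)
    return math.sqrt(theta1**2 - k2 * lam1**4)

# Synthetic data: true intrinsic size 0.5 (arbitrary units), k = 2.0.
k, true_size = 2.0, 0.5
obs = lambda lam: math.hypot(true_size, k * lam**2)
recovered = intrinsic_size(obs(0.7), 0.7, obs(1.3), 1.3)
print(f"recovered intrinsic size: {recovered:.3f}")
```

    With real data the observed sizes carry measurement errors, so the actual analysis fits the wavelength dependence rather than solving it exactly, but the principle is the same: the blur can be subtracted because it scales predictably with wavelength.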

    In the plane of the galaxy, they found, the radio source measures 150 million kilometers across—about the distance from Earth to the sun and about 10 times the calculated diameter of the black hole itself. In the perpendicular direction, the source stretches nearly four times that distance. Those proportions imply that the black hole is somehow spurting out material, probably in two opposite directions. But Lo says that the observations don't support a number of scenarios that astronomers have invoked to explain radio emission from Sagittarius A*.

    For instance, the so-called coupled disk-jet model, which holds that the radio emission comes from jets expelled from a disk of material spiraling into the black hole, predicts a smaller jet for the Milky Way's black hole than Lo and his colleagues have measured. In another popular model, the synchrotron radiation comes from extremely hot electrons in the inner parts of an accretion disk, but because the electrons would occupy a near-spherical region, the model is hard to reconcile with the observed elongated shape, says Lo.

    Rees cautions that there is a slight chance that the elongated shape is not genuine. It might instead be due to scattering in a preferred direction, as a result of asymmetric turbulence in the gas surrounding the black hole. But with the galactic-center black hole only 26,000 light-years away, Lo and other radio astronomers have a good chance of ultimately sorting out its puzzles.


    NSF Spells Out an Electronic Future

    Jeffrey Mervis

    The National Science Foundation (NSF) has pledged to go electronic with its entire grantsmaking process by October 2000. The agency plans to handle all grant applications, reviews, financial reports, and other communications about awards through an interactive World Wide Web-based system. The new policy, announced last month, puts NSF at the head of the pack of federal research agencies moving toward a paperless system of doing business, and it has added urgency to efforts to ensure that different agencies develop compatible digital systems for handling research grants.

    NSF's new deadline has been welcomed by academic research administrators, who have been urging NSF and other agencies to speed up and clarify their efforts to move research administration online. “It's fantastic,” says Pamela Webb, director of sponsored programs for the University of California, Santa Barbara. “Even if they don't make it [by 2000], it sends a strong message that this is the way the government is going and that universities need to get on board.”

    NSF's system will be based on FastLane, a prototype Web-based system begun in 1994. The agency has already required universities to use FastLane for certain competitions, and some institutions also have used it to meet various reporting and administrative responsibilities for other grants. In a letter last month to university presidents and other grantees (Important Notice 123), NSF director Rita Colwell invites the research community to help NSF go all the way—to “a fully integrated electronic proposal and award system that will provide a quick, secure, paperless record and transaction mechanism for all NSF awards, from program announcement to award closeout, by October 2000.”

    NSF officials say that going fully digital is the only way the agency and the community can manage a growing workload. “FastLane started out as a demonstration project, and now it's becoming the way we do business,” says Jean Feldman, head of policy for NSF's grants and contracts office. “We think we can meet the deadline, but success also depends on how ready the community is.”

    Coming soon. NSF has set deadlines for doing its business electronically on its FastLane Web site. [Table omitted]

    University administrators say they are generally pleased with FastLane. “It's the first project to move beyond the pilot stage into production, and that's neat,” says Webb, whose institution has used it to submit more than 100 proposals. But the system is far from perfect. Institutions must redo portions of an application to capture and manipulate the data for internal purposes, a serious handicap for institutions that generate large numbers of proposals. And the Web site can be slow in responding during peak periods. More significantly, FastLane is only one version of an all-electronic world.

    Other agencies are pursuing different paths tailored both to their own needs and those of the communities they serve. The National Institutes of Health is developing a system, called NIH Commons, which recognizes that its large and diverse base of grantees may not want an interactive, Web-based system like FastLane. It is registering schools now for a test using noncompetitive grant renewals. Other agencies do the bulk of their R&D business with contractors, often large companies with intricate accounting procedures, that may be more comfortable with an existing technology known as electronic data interchange.

    To prevent an electronic Tower of Babel from developing, a consortium of 11 federal agencies and 65 private institutions is working on a project called the Federal Commons that aims to provide an electronic translation service between different systems. The goal is to offer institutions a common menu for interacting with the government, from which they would choose the system that suits their needs. For example, a university that uses a Web-based system might submit its proposal to NIH via NSF, which would convert it to a format compatible with NIH's system before sending it along. And it might be routed through another agency's server for other aspects of federal grants management.

    Such a division of labor would allow universities to put their resources into perfecting a single form of communication, while faculty members would be spared having to learn multiple systems. “If a Federal Commons comes off, and I think it will, you'll be able to pick and choose what interface to use,” says Northwestern University's Barbara Siegel, chair of the executive committee of the Federal Demonstration Partnership. “And the government will present a common face to the world.”

    In the meantime, Siegel says, she's glad NSF has taken the first step. “Now that NSF's plan is out on the street, we have something to point to. It should help focus attention on what needs to be done.”

  8. JAPAN

    Court Hears Fight Over Safety of Lab

    1. Dennis Normile

    TOKYO—Ten years after filing suit to block construction of a government laboratory, and 6 years after it opened for business, a retired professor and neighborhood activist is finally getting his day in court. Last week, Shingo Shibata argued before a Tokyo district court that the National Institute of Infectious Diseases (NIID) in downtown Tokyo poses unacceptable health risks to the densely populated community and that its operations should be halted. The institute disagrees. “We think the safety precautions are equal to or exceed those at America's National Institutes of Health,” says NIID Director-General Shudo Yamazaki.

    The case is being watched closely. “This is the first time a [Japanese] citizens' group has really assembled considerable scientific support,” says Keisuke Amagasa, a writer and environmental activist. The group has also won the backing of such global activists as Jeremy Rifkin of the Washington-based Foundation on Economic Trends. Environmentalist groups hope that a victory—which many consider unlikely—would set a precedent in a country where successful challenges to government institutions are rare. But even a defeat, says Amagasa, would give other groups pointers for waging similar battles. Scientists are concerned that the fallout from this case could be burdensome regulations and nuisance lawsuits.

    The NIID is the national government's main facility for research on and surveillance of infectious diseases. The lab's stores of pathogens include dengue and hantaviruses, which fall into a category defined by the World Health Organization (WHO) as causing serious disease but not ordinarily spreading from one infected individual to another. The six-story lab is located in downtown Tokyo, wedged between the 40,000-student Waseda University and a rehabilitation center for the handicapped. A midrise apartment complex sits across a narrow street, a major hospital is a stone's throw away, and the NIID building casts its afternoon shadow on a row of single-family homes, including that of Shibata, who is a professor emeritus of philosophy and sociology at Hiroshima University.

    Shibata's protest began after NIID, then known as the Japan National Institute of Health (JNIH), announced plans to move to the site in 1986. The site was already notorious as the headquarters of the Japanese Army's infamous Unit 731, which used prisoners of war as human guinea pigs for biological warfare and human endurance experiments. Unhappy with JNIH's explanations of safety procedures, Shibata and others started lobbying politicians and persuaded the local ward assembly to adopt a resolution opposing construction. In December 1987, the protest escalated when riot police were called to clear the construction site. The following March, area residents and members of the Waseda faculty and staff formed a committee and sued the institute and the Ministry of Health and Welfare, its governmental sponsor, seeking a ban on experiments. Although the building was occupied in 1992, the suit continues to wend its way slowly through the legal system.

    Safety first.

    NIID's Yamazaki says biosafety precautions match or exceed U.S. standards.


    The plaintiffs, who now number more than 200, claim that neither the location nor the safety practices of the NIID meet WHO recommendations. A succession of scientific witnesses, including some researchers from NIID, testified earlier that a more isolated site would reduce public health risks and that NIID's crowded and cluttered labs compromise safety. But NIID officials say the concerns are groundless. “There has never been a single case of disease caused by an escaped organism in Japan,” says Yamazaki.

    Key elements in the legal battle are expected to be reports by outside experts—a rarity in such disputes—on the lab's safety procedures and the risk to the community. The plaintiffs called on microbiologist Christopher Collins, a former coordinating editor of the WHO Laboratory Biosafety Manual, and David Kennedy, a medical equipment safety specialist formerly with the United Kingdom's Department of Health. The defendants recruited Vinson Oviatt, a former biosafety official at the U.S. National Institutes of Health in Bethesda, Maryland, and at WHO in Geneva, and Jonathan Richmond, director of health and safety for the U.S. Centers for Disease Control and Prevention in Atlanta. The four toured the facilities together in June 1997.

    The experts' findings supported the positions of their clients. A 27-page report submitted to the court by Collins and Kennedy finds problems with everything from protective clothing to the handling of solid waste. Regarding a pathogen freezer storage compartment, for example, Collins and Kennedy write: “There did not appear to be a procedure, or materials available, for dealing with accidental leakage or breakage of vials and release of contents.” They also cite WHO recommendations that “high-level containment or high-risk laboratories should be located away from patient or public areas. …” The report's conclusion: “… it is difficult to see how the protection of staff and public from exposure to infection can be properly ensured.”

    In contrast, the 13-page Oviatt-Richmond report concluded that nothing was amiss. “All the biological safety level/P3 laboratories meet or exceed the standards,” reads one of 27 “positive findings.” Its opening lines convey its essential message: “The National Institute of Infectious Diseases poses no biosafety threat to the outside surrounding community as a consequence of its work with infectious diseases. No serious breaches in biosafety were observed.”

    Dangerous neighbor?

    Shibata says NIID (right) is too close to apartment buildings (left), a university campus, and a local hospital.


    Institute officials say that Collins took out of context the WHO recommendations for siting such labs away from public areas. “That section [of the WHO recommendations] is referring to a lab located within a hospital,” says Hiroshi Yoshikura, NIID deputy director-general; he says it seeks to avoid siting labs in areas where the public and patients circulate. Collins acknowledges that the section is ambiguous, referring both to labs within hospitals and to free-standing institutes. But he maintains that NIID's location is inappropriate.

    A ruling is not expected for a year or so, and even Shibata admits that victory is a long shot. “When citizens sue the government in Japan, usually the government wins,” he says. The activists hope that a positive ruling will lead to national standards for laboratories handling biologically hazardous materials and to independent inspections to ensure compliance. Earlier this year the Ministry of Education, Science, Sports, and Culture (Monbusho) released a manual intended to standardize biosafety practices at universities, an activity that researchers say was not related to the NIID controversy.

    Yamazaki even sees some advantages to an inspection system. Recent revelations of official cover-ups of accidents and mishandling of radioactive material at nuclear power research facilities have caused public faith in government research institutes to plummet, he says, and an inspection system “could reassure residents.”


    Cresson Told to Explain Questioned Contracts

    1. Nigel Williams

    The European Commission's head of research, Edith Cresson, was called last week before the European Parliament's research committee to answer allegations that a personal friend from her home town of Châtellerault, France, had been given consultancy contracts with the commission for work he may not have been best qualified to carry out. Rene Berthelot, a dentist, had been contracted to gather information on the state of research in France. But Cresson, who sent a deputy to answer the committee's questions, told Science this week that Berthelot had been the right person for the work.

    Cresson's decision not to appear before the committee angered some members. “In the Netherlands if a minister is called to answer questions by Parliament they come straight away,” says Dutch member Elly Plooy van Gorsel. Cresson's deputy, Hendrik Tent, told the committee that Berthelot was not merely a dentist but had additional medical and legal qualifications, and that he was hired in the normal way. But his short statement failed to satisfy all members, and the committee insisted that Cresson submit a written statement herself. “There's nothing wrong with awarding a contract to someone you know who is qualified,” says committee member Giles Chichester, “but if it is a personal friend, then it is a different matter. It looks like cronyism.”

    Cresson, a former French prime minister, is the head of directorate-general XII (DGXII), which is responsible for the European Union's science, research, and development activities, including the current 5-year, $15 billion Framework program, which covers everything from nuclear fusion to biotechnology. The contracts were awarded by DGXII to Berthelot soon after Cresson's appointment in 1995. Cresson says that she was unhappy with the large number of priority research topics she inherited and wanted urgently to assess the state of research. “I wanted someone to look at AIDS, cancer, and also technical innovations to help us focus priorities.” Berthelot had the networking skills to gather information quickly and reliably, she says.

    The timing of the allegations, which first surfaced in the press, was unfortunate for Cresson. The commission and Parliament are currently locked in combat over a number of financial problems and accusations of corruption, focused largely on contracts for humanitarian aid carried out by the Luxembourg arm of a company run by businessman George Perry. Audit inquiries allegedly have failed to account for millions of dollars of commission funds and Luxembourg police are investigating the so-called Perrylux affair. Berthelot had once worked for Perry as a consultant, and the French newspaper Liberation published an article last week linking Cresson, via Berthelot's DGXII contracts, to the “real mess” it says surrounds humanitarian funds. Tent told the committee that Berthelot's work had nothing to do with the Perrylux affair, and Cresson says she is suing the newspaper.

    Two members of the science committee, Claude Desama and Gordon Adam, say Cresson does not need to defend herself any further. “I felt we had been given a full explanation,” says Adam. But others want more specific information about whether the normal process had been followed in hiring Berthelot. Says Wim van Velzen: “There have been serious allegations in the press, and it is only normal that Parliament asks for Mrs. Cresson's response. If she does not provide us with a formal statement in due course, I will ask her to appear before the House after all.”


    Bug Vanquishes Species

    1. Dan Ferber*
    1. Dan Ferber is a science writer in Urbana, Illinois.

    For the first time, scientists have documented a case of an infection wiping out the last remnants of an entire species. The victim was a type of land snail that scientists were trying to pull back from the brink of extinction in a captive-breeding program. Experts say the finding, reported in this month's issue of Conservation Biology, points up the urgent need to guard against infectious diseases when nursing species off the endangered list. “Captive breeding is not always a safe haven,” says conservation biologist Stuart Pimm of the University of Tennessee, Knoxville.

    South Pacific land snails are rare to begin with, but they have taken a hit in the last few decades after residents of Raiatea, in the Society Island chain some 5000 kilometers south of Hawaii, began importing predatory snails from Florida in 1986 to eat another imported snail that had become a pest. The predators, it turned out, preferred the taste of the native snails, and by 1991 they had driven several species to the brink of extinction.

    That year Paul Pearce-Kelly and colleagues at the Zoological Society of London captured the last known individuals of one species, Partula turgida, to try to save them through a breeding program. But 4 years ago the snails began dying off mysteriously. When the population had dwindled from 296 individuals to fewer than 10, veterinary pathologist Andrew Cunningham of the Institute of Zoology in London and parasitologist Peter Daszak of Kingston University in Kingston-upon-Thames, England, set out to find out why.

    Before they could solve the puzzle, however, the remaining snails had died. After slicing open the last five bodies, Cunningham noticed something odd: scads of protozoan-like spores in the digestive glands and reproductive tracts, suggesting that a parasite had infected the snails. Daszak put the spores under an electron microscope and spotted spiral tubes—a hallmark feature of Microsporidia, a family of protozoa known to infect aquatic snails. Closer scrutiny revealed that the spores belong to a new species of microsporidian in the genus Steinhausia. The parasite had ravaged the snails' digestive glands, Daszak says, persuasive enough evidence for him and Cunningham to conclude that it had finished off the snails. Because the apparent killer does not seem to infect other land snails, by killing off P. turgida it may have sealed its own fate.

    “It's great that somebody's finally got a concrete example of an infectious disease leading to the extinction of a species,” says ecologist Andy Dobson of Princeton University. He says the finding should serve notice to endangered-species recovery programs that they must closely monitor the cause of death of individuals in their care. And the snail's demise hammers home the danger of allowing a species to slip so far that it has vanished from its habitat and ended up on captive-breeding life support. Says Pimm, “Species need to be in the wild, not in zoos.”


    Human Rights Fades as a Cause for Scientists

    1. James Glanz

    Now that China has replaced the former Soviet Union as the major focus of concern, human-rights activists meet with indifference or even hostility

    Last spring, the annual gathering that is proudly billed as the largest physics meeting in the world became a nightmare for the one human-rights group officially represented there. The meeting, held in Los Angeles in March and sponsored by the American Physical Society (APS), had its usual scientific dazzle, as participants flocked to talks on quantum dots, micromachines, and ultrafast lasers. But problems for the Committee on the International Freedom of Scientists, or CIFS—a bylaw committee of the APS—started even before the first overhead projector flashed to life. Then the problems got worse.

    A workshop on human rights had to be canceled when only two of the thousands of physicists at the meeting registered for it. Then a Chinese graduate student in physics destroyed literature at a CIFS booth, enraged by what he said were false depictions of human-rights abuses in his home country. To top it off, an APS official then ordered CIFS representatives to remove the materials that had just been defaced, saying that boldface words like “torture” and “repression” on posters and flyers were too inflammatory.

    For human-rights activists in the scientific community, the combination of apathy from many scientists, ambivalence from others, and hostility from a few has become a familiar experience in recent years. Many organizations have seen scientists' interest in human rights slip into a long-term decline after decades of raucous agitation on behalf of Soviet dissidents during the Cold War—and a brief resurgence after the crushing of a student-led pro-democracy movement at Tiananmen Square in Beijing, on 4 June 1989, in which hundreds of people were killed.

    One factor in the change was the end of the Cold War, which undercut a broad consensus among scientists on human-rights issues. There is, for example, no longer a central, symbolic figure like the physicist Andrei Sakharov during his exile to Gorky in the former Soviet Union. The People's Republic of China (PRC), which has replaced the Soviet Union as the major focus of human-rights activism, also has a far less hostile relationship with the United States. And although the PRC keeps strict clamps on political expression, it has loosened the reins on personal and economic freedoms.

    Equally important are the views of the large population of expatriate Chinese scientists here. “You should know that there is a tremendous split within the expatriate Chinese scientific community,” says Irving Lerch, director of International Affairs at the APS and co-chair of the Committee on Scientific Freedom and Responsibility at the American Association for the Advancement of Science (AAAS, publisher of Science). “The feelings are very strong,” Lerch says.

    Although many expatriates deplore Tiananmen and the imprisonments since then, others resent criticism of their homeland or believe a strident approach to human rights is counterproductive. In fact, some of the most powerful voices in that community—such as Chen Ning Yang,* the Nobel Prize-winning physicist at the State University of New York, Stony Brook—have not condemned the 1989 massacre, saying it may have been necessary to preserve stability and economic progress in the PRC.

    News reports and the watch lists of human-rights groups are still replete with cases of harassed and imprisoned scientists and other academics. But the numbers—and the scientists' international profile—are lower than in the immediate aftermath of Tiananmen, feeding the disagreement. Even younger scientists who recognize continuing human-rights problems in the PRC tend to feel that exiled dissidents, and the Western press that covers them, give an unfair picture of China. “Generally the Chinese students here feel that especially the bad side of China has been exaggerated out of proportion, because the media tries to make big news,” says Kezhao Zhang, a physics graduate student at the University of California, San Diego, who was president of the local chapter of the Chinese Students and Scholars Association from 1996 to 1997.

    Fang Li Zhi, the astrophysicist in exile who is now at the University of Arizona, Tucson, notes that as the PRC released into exile high-profile dissidents like himself, others slipped out of the public eye. “My case was very visible,” he says. Most of the dissident scientists who remain, he says, “are unknown people.” The same is true for most of the scientists on human rights watch lists outside the PRC, mostly in the Middle East, Africa, and Latin America.

    Activists are quick to add that some human-rights organizations, especially those that have adapted to the fractured post-Cold War political scene, have been thriving. But the overall mood shift has left the scientists who are still speaking out about human rights bewildered and frustrated. “It's my academic generation, and probably my generation in a larger sense, that has completely dropped the ball,” says William Dorland, 32, this year's CIFS chair and a physicist at the University of Maryland, College Park. “I'm not riding on a high horse. It's just disappointing that so few people think this is an important part of what it means to be a scientist.”

    From one voice to many

    Many learned societies have made human rights a cause, but for years physicists were the standard-bearers. After World War II, a long line of physicists expressed humanitarian views in the face of the destructive power of the atomic bomb their field had created. Many Soviet dissidents, like Sakharov, were physicists, which drew the APS further into the human-rights movement.

    By any measure, the current difficulties with the human-rights programs of the APS come at a particularly ironic time. During its 100th anniversary next year, the APS will showcase its activist history in the human-rights struggle at a huge gathering in Atlanta. And the president of APS for 1998, Andrew Sessler of Lawrence Berkeley National Laboratory in California, has attained an almost legendary status as a veteran of those struggles.

    In the late 1970s, Sessler, Berkeley's Morris Pripstein, and others formed a group called SOS—focusing on the cases of imprisoned scientists Sakharov, Yuri Orlov, and Natan Sharansky. After years of pressure from SOS and other scientific organizations, all three were released. Sessler recalls taking disruptive actions like picketing a talk in the United States by Nikolai Basov, a Soviet physicist and Nobel laureate who Sessler says signed a petition against Sakharov. Sessler's sign said, BASOV: GREAT SCIENTIST, LOUSY HUMAN BEING. Newspapers ran pictures of Sessler and his sign. “The picture went back to Moscow,” says Sessler. “What an embarrassment! They send their Nobel Prize winner and instead of being noticed for all his great accomplishments in science, here is this character getting all the PR.”

    Not all of Sessler's colleagues favored such direct confrontation. Although Sessler and others hoped to declare a moratorium on travel to scientific conferences on Soviet soil, some scientists favored continuing engagement. “In the end, there was a kind of compromise between the two camps,” says Lerch. “The compromise was, if you feel you have to attend meetings over there, at least acknowledge that the regime is doing something wrong.” So scientists would preface their technical talks with a “dedication” to particular dissidents, or publicly visit them at some point during the trip. The compromise permitted the international scientific community to speak nearly with one voice.

    When it came to China, physicists' voices were heard again, but this time the messages were mixed. The silence of several prominent Chinese scientists about the killings in Tiananmen Square set the tone for much of the debate that has followed. Such is the scientific prestige of Yang and the man with whom he shared the Nobel Prize in 1957—Columbia University's Tsung Dao Lee—that they made a ringing statement simply by refusing to condemn the killings. One Chinese-American physicist with moderate political views, who asked that his name not be used, asks a question that was initially on many of his colleagues' minds: “How come these big shots remained silent over Tiananmen? They should be the spiritual leaders.”

    Asked this question, Yang responds: “I knew the situation was very complex. The view which the dissidents would like me or anybody else to take is a one-sided view.” He adds: “I did not want to say anything publicly about it also because the fundamental position of the Chinese government, now, has some truth in it. That is the following. If the bloodshed did not happen, China might end in a tremendous turmoil, and all subsequent economic progress since 1989 would not be possible. There is great truth in this statement. This is usually discarded as nonsense or propaganda.”

    Lee did not respond to several requests for comment on his views, but almost a decade ago he took a step that went even further, in the eyes of many physicists. Just a few months after Tiananmen, he met with Deng Xiaoping and allowed himself to be photographed shaking hands with the Chinese leader. The gesture caused “an outpouring of indignation and shock,” recalls Joseph Birman, a physicist at the City College of New York and chair of the human rights committee of the New York Academy of Sciences. After the meeting, according to newspaper reports, Lee announced that Deng had privately expressed a new, conciliatory stance on the killings. The new stance by Deng and the government never materialized publicly, however.

    Since then the division among Chinese-American physicists has only deepened, as the government has continued to suppress political dissent. This year, for example, the physicist Wang Youcai was imprisoned for attempts to organize an opposition party. Lin Hai, a software engineer in Shanghai, was arrested in July for subversion after distributing Chinese e-mail addresses to U.S.-based Internet publications that promote democracy, according to rights groups. Xu Liangying, the elderly translator of Einstein's works who has petitioned the country to adopt political reforms, continues to live under tight surveillance in Beijing. And Chinese scientists who live in the United States have recently been detained and interrogated during visits to see relatives in the PRC.

    But with the increase in personal freedoms and economic prosperity, along with the release of some high-profile dissidents, some Chinese physicists began to feel that direct confrontation was not the best route to progress on human rights, says Cheuk-Yin Wong of Oak Ridge National Laboratory in Tennessee, vice chair of the Overseas Chinese Physics Association (OCPA). “Nobody can defend” the continuing human-rights violations in the PRC, says Tsung-Shung Lee, a physicist at Argonne National Laboratory in Illinois, but “if you mix the scientific activity with politics, there's no boundary.”

    Other expatriates, especially younger students, view human-rights criticisms as groundless attacks on their homeland. And even moderate Chinese often say that criticism of the PRC here is excessive. Says Frank Shu, an astrophysics professor at the University of California, Berkeley: “Across the board I think among all Chinese, most of us feel that in the Western press this is given more emphasis than is commensurate with the degree of violations that take place. There is an attempt to put China in the role of bogeyman.”

    But Betty Manyee Tsang, a physicist at Michigan State University in East Lansing who is an OCPA member and a former CIFS chair, argues that by distancing itself from human-rights issues, the Chinese-American science community has discouraged wider support. “A lot of American physicists said, ‘The Chinese physicists are not involved with human rights. Why should we care?’”

    Some American scientists and scientific administrators call such sentiments overwrought. But although human-rights subcommittees at the AAAS, the American Chemical Society, and the National Academy of Sciences (NAS) remain active in advancing their cause, their parent organizations are increasingly decoupling scientific activities from activism. The NAS suspended its relations with the Soviet Academy of Sciences during the Cold War and with Chinese institutions on 7 June 1989. The APS sponsored decades of resistance to Soviet repression and abandoned a major scientific exchange program with the PRC because of Tiananmen.

    By 1993, however, the NAS had decided to reestablish contact with the Chinese Academy of Sciences. And in 1994, when Burton Richter, the Nobel laureate and director of the Stanford Linear Accelerator Center, was APS president, the society negotiated a “memorandum of understanding” with the Chinese Physical Society. The agreement was intended to stem the pirating of APS technical journals in China—there were fewer than 10 paid subscriptions in the entire country—and foster various forms of scientific collaboration, which had come to a sudden halt with the 4 June killings. After intense debate, including an impassioned speech by Fang and opposition by CIFS, the APS council narrowly approved a memorandum with no explicit mention of human rights.

    Richter, who favored separating the issues and whose influence probably ensured victory for that approach in the final vote, says that any attempt to link human rights to the agreement would have doomed it. It was, he says, “unrealistic in the extreme” to think that Chinese scientists would ever sign an agreement with such language. Says Richter, “I believe personally that in situations like this, it takes two different groups to advance things. It takes a group like Fang Li Zhi to protest—to make the protest vocal and heard all over. And it takes another group to try and keep the doors open.” The AAAS is also keeping human-rights concerns separate from other initiatives. Although its own Science and Human Rights watch list chronicles human-rights violations in China and elsewhere, AAAS has made subscription agreements with the Natural Sciences Academy in China that do not mention human rights.

    Flash point

    The conflict between the activism of a few scientists and the indifference or hostility of most was in full view at the APS March meeting. Problems began with a workshop sponsored by CIFS and AAAS titled “International Freedom of Scientists: What the Physics Community Can Do and Is Doing.” When only two of the 5056 physicists who registered for the meeting signed up for the workshop, it was canceled.

    A far stronger message about how the world had changed awaited the activists the next day. On the morning of Monday, 16 March, in the meeting's bustling registration area, Dorland set up the usual CIFS display encouraging those in attendance to sign a petition on behalf of a list of Chinese scientists thought to have been imprisoned “because they have engaged in the peaceful exercise of their right to freedom of expression.”

    The petition—which was sponsored by CIFS, the Committee on Scientific Freedom and Responsibility of the AAAS, the Committee on the Human Rights of Scientists of the New York Academy of Sciences, and the independent, New York- based Committee of Concerned Scientists—highlighted two of the 18 scientists, engineers, and physicians listed by a variety of watch groups as imprisoned or missing. One was Zhu Xiangzhong, who was reportedly sentenced to seven and a half years' imprisonment in 1989 for dissident political activities; the other was Lu Yanghua, a student at Lanzhou University who was detained in April or May 1992 for connections with underground dissident organizations. Prominently displayed on the petition, and on larger posters that followed its design, were words such as “torture,” “repression,” “intimidation,” and “unlawful imprisonment.”

    At some point during the morning, when Dorland stepped away from the booth for a few moments, the words caught the eye of Patrick Shuanghua Dai, a graduate student in physics at Tufts University in Medford, Massachusetts. “I looked through the poster [and] got very angry,” says Dai, who is from the PRC. “It looks like some terrible human-rights abuse in China,” he says. “But I live in China, and I know it's not true. It's just a lie.”

    Dai says he looked around for someone to talk to about the material, and not finding anyone, he started tearing down the posters. He threw away the material at the booth and then found thousands of flyers and began throwing those out, too. That was when Dorland returned to the room. “He was scooping them up as I saw him,” says Dorland. “I said: ‘What are you doing?’ I don't think I hollered, [but] it was a dramatic moment. He started telling me that we were supporting common criminals and we didn't know the true situation in China.” Says Dai, “I think I did the right thing”—just what an American in China might do, he says, if confronted with similar material about his or her country.

    The material had been cleared through Lerch's office and had been used in the same basic form for at least 2 years. So Dorland called in Judy Franz, the APS executive officer, to back him up. But now it was Dorland's turn to be shocked. After scolding Dai, Franz told Dorland she had decided that words such as “torture” and “repression” were unnecessarily inflammatory and would have to be removed from the display.

    Fighting words.

    A human rights petition displayed at last spring's American Physical Society meeting before (left) and after (right) a Chinese graduate student objected to it.

    How stark was the contrast with the heyday of APS activism? “If Sakharov was in Gorky and Orlov and Sharansky were in prison and some Russian student came tearing through a display like that,” says Lerch, “I think there would have been a riot.” Instead, Dorland and Jenifer Kirin, then an international programs assistant at APS, dipped into CIFS's shoestring budget and, in the local copy center, ran off a few flyers without the offending words.

    “I was very unhappy about [Franz's] decision,” says Dorland. Franz will only say, “I thought it was a very small incident. It was unfortunate that we had a young student who got more emotional than he should have. CIFS is an important part of APS, and we will continue to support them in the work that they do.”

    Dorland notes that no one in APS has questioned the facts of the cases publicized in Los Angeles. Liu Gang, the dissident who as a physics student helped organize protests on Tiananmen Square and who escaped to the United States in 1996 after he was released from prison, says “I can tell you that I experienced torture by policemen and common prisoners sent by policemen [in my] solitary cell, when I was in prison.” Science has what is said to be a firsthand account from a source close to Zhu, which reports that while in prison, Zhu was beaten with a bench by an orderly and severely injured, then went on a hunger strike to protest the subsequent lack of medical treatment. Zhu survived, but the dissident network has not been able to confirm his release at the end of his prison sentence 2 years ago.

    “We take great efforts to be accurate,” Sessler, the APS president, says about the reports on Zhu and Lu. “Very rarely if ever has somebody found that we misspoke, and certainly that hasn't been so in this case.” As to the enforced revision, says Sessler, “we're down to a level of detail of what's appropriate to get people's attention.”

    Sessler says the APS isn't wavering from its commitment to human rights. “There's a long tradition, which I fully support, of the Physical Society being involved in public affairs which concern physicists,” he says. “We are concerned about any physicist, anywhere in the world, who is suffering a human-rights attack.”

    Although that concern is harder than ever to translate into an effective campaign, a handful of organizations say they have found ways to adapt to the changed political landscape. “We're making a greater effort to match cases of imprisoned scientists with individual scientists in their field … who are interested in that country,” says Carol Corillon, director of the Committee on Human Rights of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. This tailored approach, she says, produces committed activism by scientists who have useful connections. The committee recently reported, for example, that after a visit by committee members, Guatemala indicted three top military officers for the murder of an anthropologist.

    There may no longer be a single political route to success in the campaign for human rights. But if the heirs to Sessler's Cold War activism can master the new landscape, scientists may once again find human rights as fascinating as lasers and quantum dots.

    • * The order of Chinese family and given names varies with individual preference.


    Molecular Methods Fire Up the Hunt for Emerging Pathogens

    1. Michael Balter

    Combining an early warning system with genetic techniques, microbiologists have stepped up the hunt for emerging pathogens in the United States

    When a 3-year-old Connecticut girl was hospitalized with an often-fatal type of kidney failure last year, doctors at first suspected that she was infected with Escherichia coli O157:H7, a dangerous strain of bacteria that can cause kidney failure in children. But all attempts to culture this and other pathogens failed. Fortunately, the girl recovered, and the case became one of the thousands of unexplained illnesses put on the books every year in the United States.

    This time, however, the story didn't end there. To track down the mystery pathogen, doctors turned over samples of the girl's blood taken during the height of her illness to a specialized pathogen lab in California, via the Unexplained Illness Working Group, a network of infectious-disease experts coordinated by the Centers for Disease Control and Prevention (CDC) in Atlanta. The California lab used sensitive molecular and immunological probes to identify the pathogen: an unknown strain of enterovirus, a large group of microbes that includes the poliovirus. This information came too late to help the Connecticut girl, but researchers are still probing the virus's genome to see if it matches one of the more than 70 known enterovirus strains, or if it is a new pathogen.

    This is just one example of how new molecular technologies are speeding the hunt for microbes that have recently begun to attack human hosts, or so-called emerging pathogens. To fight these bugs, researchers are now going beyond the traditional means of identifying pathogens—culturing them in petri dishes and test tubes—and isolating the DNA or RNA that makes up their genomes. The enterovirus that infected the Connecticut girl, for example, was spotted by matching a segment of its RNA to that of other known enteroviruses. The Unexplained Illness Working Group, created by the CDC in 1994, is one leader in this effort, focusing not on the tropics, home to infamous viruses such as Ebola, but on the familiar settings of U.S. hospitals and clinics, where new and deadly strains may also emerge.

    The network serves as an early warning system for dangerous microbes as well as a focal point for research on new tests. And over the past year the team has revved up to full speed: Some 200 cases of unexplained illness are under active investigation, and results are starting to emerge. The network has tracked down possible new strains of enterovirus—implicated in a number of recent outbreaks of childhood disease in the United States and Asia—and has uncovered evidence that microbes once thought innocuous can cause disease. For example, the team has found that human herpesvirus 6, previously thought to be benign when it infects children, is behind some cases of childhood encephalitis. Once new pathogens have been identified, says CDC epidemiologist Bradley Perkins, the working group's Atlanta-based coordinator, the ultimate goal is to develop a “diagnostic test that a clinician could order in the hospital.”

    The evidence so far suggests that some unidentified killers may already be out there. In up to 14% of deaths caused by infection in people between the ages of 1 and 49, no known microbe could be identified as the culprit, according to surveys carried out over the past few years by the working group and other collaborators.

    But finding these silent killers isn't easy. The time-honored means of identifying an invading microbe is by taking blood and tissue samples and trying to culture the organism in various artificial growth media, then identifying it either under the microscope or with physiological and immunological tests. But many organisms can't be cultured, either because no one knows the conditions in which they thrive or because they can't exist outside the body. “Trying to culture organisms has been the downfall of previous attempts to look for unknown pathogens,” says Perkins. “It is daunting to think that you are going to be able to culture something when you don't know what it is and you don't know what its growth conditions are.”

    Work reported by Stanford University microbiologist David Relman at a meeting on emerging infectious diseases earlier this year* underscores the limits of the traditional approach. Relman's lab, which collaborates closely with the working group, took a sample of the microbial communities that live in the spaces beneath the teeth and gums and divided it into two parts. One half was cultured in a variety of standard bacterial growth media, and the other half was analyzed with the polymerase chain reaction (PCR), which can amplify minute amounts of DNA or RNA. All of the cultured bacteria belonged to known genera, and only 11% appeared to be new species. But of the bacteria identified through PCR analysis, 30% represented new species, and 13% appeared to be from entirely unknown genera—implying that a staggering number of unidentified microbes live in the human mouth alone.

    Of course, most unknown organisms probably don't cause disease. To find the ones that do, the working group relies on experts at hospitals and clinics in four states—California, Connecticut, Minnesota, and Oregon—who spot unusual illnesses, then funnel specimens through the network to specialized labs, often run by these states' health departments. The labs hunt for clues about the organism involved, then try to pin down its identity using molecular techniques.

    In the case of the Connecticut girl, for example, scientists at the California health department's Viral and Rickettsial Diseases Laboratory in Berkeley suspected that an enterovirus might be responsible, because this type of virus infects many tissues and is implicated in a number of recent outbreaks of childhood diseases, such as encephalitis and liver failure. The researchers used a technique called consensus PCR, so-called because it tests for conserved genetic sequences that are shared by closely related organisms—in this case by all known strains of enterovirus. Such conserved sequences were found in the girl's blood, and an antibody test against enteroviruses then confirmed that these microbes were involved.
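The logic behind consensus PCR can be sketched in a few lines of code: design the test around sequence regions shared by every known strain, so that even a never-before-seen relative carrying the conserved region is flagged. In this toy sketch (all sequences and function names are invented for illustration), shared k-mers stand in for the conserved primer-binding sites a real assay would target:

```python
# Toy illustration of the idea behind consensus PCR: the assay targets
# regions conserved across all known strains of a virus family, so an
# unknown strain that shares the conserved core is still detected.
# All sequences here are invented for illustration.

def conserved_kmers(strains, k=8):
    """Return the k-mers present in every known strain."""
    shared = None
    for seq in strains:
        kmers = {seq[i:i + k] for i in range(len(seq) - k + 1)}
        shared = kmers if shared is None else shared & kmers
    return shared

def consensus_hit(sample, strains, k=8):
    """True if the sample contains any region conserved across all strains."""
    targets = conserved_kmers(strains, k)
    return any(sample[i:i + k] in targets for i in range(len(sample) - k + 1))

known_strains = [
    "ATGGCCGATTACAGGTTTCACGA",
    "ATGGTCGATTACAGGTATCACGA",
    "CTGGACGATTACAGGTTTGACGA",
]
patient = "GGCGATTACAGGTCCAT"  # an "unknown strain" sharing the conserved core
print(consensus_hit(patient, known_strains))   # the conserved region is found
```

The real assay, of course, works biochemically — primers anneal to the conserved sites and PCR amplifies whatever lies between them — but the detection logic is the same: shared sequence, not strain identity, is what triggers the hit.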

    Consensus PCR was also used by scientists at CDC and their collaborators to nail the microbe behind the hantavirus outbreak in the southwestern United States in 1993. And Relman led a team that used it to identify the bacterium that causes Whipple's disease, a debilitating syndrome that causes symptoms such as diarrhea and weight loss. The rod-shaped bacteria were spotted in affected tissue in the 1960s but hadn't been cultured.

    When researchers have no clues about what kind of organism is responsible, they sometimes turn to another powerful PCR technique called representational difference analysis (RDA). In this method, both healthy and diseased tissues are subjected to a variant of PCR that “subtracts” sequences common to both specimens, leaving only the genome of the infectious agent. Using RDA, in 1994 molecular biologists Patrick Moore and Yuan Chang at Columbia University in New York City isolated small bits of DNA from a skin lesion in an AIDS patient afflicted with Kaposi's sarcoma, a cancer that often affects HIV-infected people. From these genetic segments the pair was able to sequence the complete genome of a new virus called KSHV, now thought responsible for the cancer.
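The subtraction at the heart of RDA can be illustrated with a toy sketch. Real RDA performs the subtraction biochemically, through repeated rounds of hybridization and amplification; here, plain set arithmetic on invented fragment sequences stands in for that process:

```python
# Toy sketch of the logic behind representational difference analysis (RDA):
# fragments shared by diseased and healthy tissue are "subtracted," leaving
# candidate sequences from the infectious agent. All sequences are invented.

def subtract(diseased_fragments, healthy_fragments):
    """Fragments unique to diseased tissue: candidate pathogen DNA."""
    return set(diseased_fragments) - set(healthy_fragments)

host = {"AACGTGCA", "TTGGCCAA", "GGCCTTAA"}   # fragments of the host genome
pathogen = {"CGCGATAT", "TATAGCGC"}           # fragments of the unknown agent
diseased = host | pathogen    # infected tissue carries both genomes
healthy = host                # matched healthy tissue carries only host DNA

print(subtract(diseased, healthy))  # only the pathogen fragments remain
```

In the Kaposi's sarcoma work, it was exactly such leftover fragments — sequences present in the lesion but absent from healthy tissue — that gave Moore and Chang their first handholds on the KSHV genome.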

    Powerful as they are, these techniques are laborious and time-consuming—and not always successful. So researchers are working to improve their methods. For example, Relman, working with Stanford biochemist Patrick Brown, has begun exploring the use of “pathogen detection chips,” using the DNA chip technology developed by Brown and others (Science, 24 October 1997, p. 680). The idea is to put DNA strands from known pathogens on the chip and then see whether labeled DNA from an unknown microbe binds to any of the fixed pathogen DNA.

    Relman and Brown have also begun using this DNA microarray technique as “cellular scouts” to detect changes in gene expression in the host, for example in immune system cells. Their goal is to create profiles of the gene expression changes triggered by specific kinds of invaders. Thus they would use the body's own response to identify pathogens, an approach Perkins calls “totally revolutionary.” Preliminary results are promising: Brown's lab has shown that white blood cells express different genes when exposed to different combinations of immune signaling proteins called cytokines, which are produced in response to infection. If related microbes provoke similar cytokine responses, Moore says, the scout approach might quickly narrow the hunt to certain groups of organisms.

    Meanwhile, the search for new methodologies continues. And researchers say that the molecular identifications thus far suggest that combining heightened surveillance with state-of-the-art techniques is bound to pay off sooner or later. Says Moore of the working group: “I think they have a really good chance of making some great discoveries.”

    • *International Conference on Emerging Infectious Diseases, Atlanta, Georgia, 8–11 March.


    19th Century Rules of Causation Outdated?

    1. Michael Balter

    For more than a century, three postulates set down by the German microbiologist Robert Koch have guided the hunt for disease-causing microbes. Koch argued in 1890 that to prove an organism causes a disease, microbiologists must show that the organism occurs in every case of the disease; that it is never found as a harmless parasite associated with another disease; and that once the organism is isolated from the body and grown in laboratory culture, it can be introduced into a new host and produce the disease again. (An oft-stated fourth postulate, that the microbe must be isolated again from the second host, was not part of Koch's original formulation.)

    But Koch's criteria are now being swept aside by new technology. Many of the microbes isolated in recent years by molecular techniques that pull segments of their DNA or RNA directly out of infected tissues (see main text) cannot be grown in culture. That makes it impossible to fulfill all three of Koch's postulates. As a result, more and more biologists are agreeing with Stanford University microbiologists David Relman and David Fredricks, who argued a few years ago in the journal Clinical Microbiology Reviews that it's time to modify Koch's strict requirements.

    Relman and Fredricks say that researchers can instead build a convincing case against a microbe by examining a wide variety of molecular circumstantial evidence, such as how tightly the suspect microbe is associated with infected tissues and how closely the time course of the disease correlates with the amount of microbial genetic material present. But some microbiologists caution that loosening the rules of evidence too much could lead to mistaken convictions, especially because the human body harbors many seemingly harmless microbes.

    Charting the narrow course between accepting molecular evidence and making mistakes is tricky. “If you play the game that you have to fulfill Koch's postulates, you fall into a line of thinking that has been obsolete for decades,” says Patrick Moore, a molecular biologist at Columbia University in New York City. He and his wife, Columbia's Yuan Chang, used molecular methods rather than Koch's tenets to finger a herpesvirus as the cause of Kaposi's sarcoma, an AIDS-associated skin cancer.

    Still, Moore admits that there is room for error. “The literature is rife with examples of people finding various viruses in tumors and other diseases,” he says. For example, last year Makoto Mayumi and colleagues at the Jichi Medical School in Tochigi-Ken, Japan, used molecular techniques to isolate a virus they called transfusion-transmitted virus (TTV). They found this virus in many patients with hepatitis and chronic liver disease and proposed that it might be causing these syndromes. But earlier this year in The Lancet, hepatologist Nikolai Naoumov and his colleagues at the University College London Medical School compared patients with liver disease with healthy controls—and found that both groups were infected with TTV at roughly the same rate, implying that it does not cause liver disease. “With the increase in molecular fishing, there is a greater chance of identifying foreign genomes in humans which are not necessarily pathogenic,” Naoumov says.

    Despite such reservations, most researchers agree that Koch's original postulates no longer work. “We need to define new rules of causality,” says Moore. “And I think that [Relman and Fredricks] have made a really wonderful attempt to tackle that.”


    How Matter Can Melt at Absolute Zero

    1. David Voss*
    1. David Voss, a former editor of Science, is a physics writer in Silver Spring, Maryland.

    New tools are revealing the crowd behavior of electrons at close to absolute zero, where freezing and melting are governed by quantum mechanics

    If you cool a piece of ice to the very lowest temperatures possible, about the last thing you would expect it to do is melt. But physicists are learning that some exotic crystals made of electrical charges or electron spins do indeed engage in something like that odd behavior: They “melt” at absolute zero, changing from one phase to another. These phase transitions are strictly in the weird realm of quantum mechanics, a world dominated by large fluctuations in energy and momentum even at the lowest temperatures, where classical physics would insist that the opportunity for change is frozen.

    Letting go.

    As a lattice of electrons called a Wigner crystal (left) is compressed, the quantum fluctuations in the electrons' positions grow until the lattice melts (right).

    Theorists have been exploring these transitions since the 1950s, but now experimenters are actually seeing these strange metamorphoses in the laboratory. With sophisticated tools for building semiconductors and improvements in low-temperature analysis, physicists have been able to watch the melting of an electron crystal and the unusual flip-flops of two-dimensional “gases” of electrons. This new kind of crowd behavior is fascinating in its own right, they say. “These are things that just can't be discussed in terms of single particles at all,” says Subir Sachdev, a Yale University physicist. But it may also have some practical implications: By searching for hints of quantum phase transitions in high-temperature superconductors, researchers are hoping to spring the lock on the stubborn mystery of how these materials work.

    Classical phase transitions, like the melting of an ice cube, are driven by thermal energy. Heating the ice above the freezing point causes molecules in the solid water to vibrate. Eventually, the jiggling becomes so wild that the molecules are no longer content with their orderly seating arrangement in the ice crystal, and they break loose to become liquid water. Heat them even more, and they don't even want to slosh around in a liquid—they vaporize into steam. Reverse the process, the jiggling slows down, and the molecules fall back into place. The variable that controls all of these transformations is temperature.

    Quantum phase transitions are a completely different beast. What opens the way to a quantum phase transition is a change not in temperature, but in some other parameter like the density of a material or the strength of an external magnetic field. Because they don't need thermal energy, quantum transitions can theoretically take place at zero temperature. In fact, they can only be studied at low temperatures, where the thermal fluctuations that cause particles to jiggle subside, revealing the quantum fluctuations in position and momentum that actually trigger the phase transitions.

    Steven Girvin, a theorist at Indiana University, Bloomington, likes to explain the notion of a quantum phase shift by citing a textbook concept known as the Wigner crystal. This orderly arrangement of electrons, named after Eugene Wigner, who proposed it in 1934, can form when the electron density is low and the particles sort themselves into a stable spatial array, like soldiers standing in formation. Arrayed charges like these have since been observed in the layers of electrons that collect on the surface of liquid helium and in confined electron layers in sandwichlike semiconductor structures called quantum wells.

    To illustrate how quantum mechanics can trigger a phase transition in this arrangement, Girvin suggests a thought experiment based on Heisenberg's uncertainty principle, which holds that as a particle's position becomes better known, its momentum gets fuzzier. “As you make the crystal more dense, the electrons become more confined,” he says. “But then the uncertainty principle takes over and the fluctuations in momentum grow.” Squeezing on the crystal causes the electrons to become restless; eventually they melt into a conducting quantum “liquid.” The change is analogous to the heated water molecules jiggling out of their comfortable crystalline lethargy.
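Girvin's thought experiment can be made quantitative with a standard back-of-the-envelope scaling (a textbook estimate, not taken from the article). Each electron in a two-dimensional layer of density $n$ is confined to a cell of size $\Delta x \sim n^{-1/2}$, so the uncertainty principle gives

```latex
\Delta p \gtrsim \frac{\hbar}{\Delta x} \sim \hbar\, n^{1/2},
\qquad
E_{\text{kin}} \sim \frac{(\Delta p)^2}{2m} \propto n,
\qquad
E_{\text{Coul}} \sim \frac{e^2}{\Delta x} \propto n^{1/2}.
```

The zero-point kinetic energy grows faster with density than the Coulomb energy holding the lattice together, so compressing the crystal eventually lets quantum fluctuations win, and the lattice melts — exactly the restlessness Girvin describes.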

    The melting of a Wigner crystal is just one example of a whole class of quantum phase transitions that theorists have studied, in which a material can transform from a conductor into an insulator and back as some nonthermal variable is changed. That variable can be the density, as in the Wigner crystal example. Or the amount of disorder in the material can be varied, which can trap or release charge carriers and change a conductor into an insulator or vice versa. These transitions don't occur in everyday materials, but they provide new kinds of insight into what it means to be a metal or an insulator. And they expand notions of how matter can transform itself, says Thom Mason, a condensed matter experimentalist at Oak Ridge National Laboratory in Tennessee. “Now we have a whole new class of phase transitions that only occur at low temperature,” he explains. “This isn't just small changes in the third decimal place. The characteristics of these systems are completely different.”

    Quantum melting revealed. Actually seeing these transitions was beyond experimenters' reach until recently, because it required the ability to control and probe assemblages of electrons with exquisite precision at close to absolute zero. Now experimenters are armed with precisely controlled magnetic fields and analytical tools such as neutron scattering, along with quantum wells and other new semiconductor architectures that can corral electrons into two-dimensional gases or one-dimensional wires. They are observing how these transitions actually take place—and confirming that some of the textbook scenarios really do occur in the real world.

    Light on the quantum world.

    An experiment probes quantum phase transitions by scattering laser light from electrons trapped in a pair of energy wells (inset). As a phase transition approaches, the electrons become easier to excite and the scattering frequency changes.


    Mansour Shayegan and his co-workers at Princeton University, for example, have seen a Wigner crystal melting, just as Girvin envisions it. Theorists predicted nearly 2 decades ago that electrons confined to a two-dimensional layer would be insulating, locked in a Wigner crystal, unless they were disturbed by magnetic fields. Yet in 1994, Sergei Kravchenko at the City University of New York, studying a two-dimensional electron layer trapped in a quantum well, found hints of a transition to a metallic, conducting state in the absence of magnetic fields. Working at temperatures below 1 kelvin, Shayegan's group found out why. As they will report in Physical Review Letters, the material always changed from insulating to conductive at a specific density of electrons. That density, according to numerical simulations by Bilal Tanatar and David Ceperley at the University of Illinois, Urbana-Champaign, is precisely the value at which a Wigner crystal state should melt.

    Experimenters have also probed another set of quantum phase changes seen in low-temperature electron layers, this time in a strong magnetic field. In what is known as the quantum Hall effect, the resistance to the flow of current along the layer of electrons changes in steps as the magnetic field is increased, a discovery that won Klaus von Klitzing of the Max Planck Institute for Solid State Research a Nobel Prize in physics in 1985. Von Klitzing realized that each jump in resistance corresponded to a quantum phase change—in this case, a change in the traffic pattern of the electrons flowing through the layer.

    Like cars moving through an urban obstacle course of traffic circles, the electrons rotate around the magnetic field lines as they flow through the layer. A stronger field puts more traffic circles in the flow, causing the cars to speed up in some directions but slow down in others and affecting the resistance. Because quantum mechanics demands that the magnetic traffic circles be filled in a particular way—with particles moving in discrete, quantized orbits—the resistance does not rise steadily with magnetic field. Instead, it jumps to a series of plateaus as the electron gas executes quantum phase transitions from one state to the next.
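The plateau values themselves follow a strikingly simple rule — the standard integer quantum Hall result, stated here for context:

```latex
R_{xy} = \frac{h}{\nu e^{2}} \approx \frac{25{,}813\ \Omega}{\nu},
\qquad \nu = 1, 2, 3, \dots
```

Here $\nu$ counts the number of filled quantized orbits (Landau levels); each jump from one plateau to the next is one of the quantum phase transitions of the electron gas, and the combination $h/e^{2}$ is so reproducible that it now serves as a resistance standard.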

    Exactly what happens right at these transitions is still unclear. One theoretical prediction made in the 1980s is that near each phase transition, the modes of excitation in the electron gas—oscillations in the positions of the charges or in the orientations of their spins—can “soften,” much like piano strings losing their elasticity and dropping in pitch. Because quantum mechanics equates energy with frequency, the softening means that the energy required to cause a change in the electron layer goes toward zero, making it unstable. The slightest nudge, such as an infinitesimal change in the magnetic field, will induce the phase transition.

    To probe this possibility, researchers at Lucent Technologies Bell Laboratories in Murray Hill, New Jersey, and universities in Italy, the United Kingdom, and the United States studied the transitions by scattering laser light off the electron layers in a quantum Hall system. At some wavelengths, the light is scattered more strongly by the electrons, indicating that it is exciting the collective modes. But Aron Pinczuk of Bell Labs explains that because the excitations in a single-layer quantum well have such high energies, they can only be probed with x-rays—a difficult or impossible experiment. So he and his colleagues used a double-layer device in which the energy levels of each layer interact to produce excitation modes at a lower energy. These modes have energies in the infrared part of the spectrum, which is easier to monitor with laser light (Science, 7 August, p. 799).

    Just as predicted, Pinczuk and his colleagues saw the mode softening: The frequency of the infrared scattering peak changed in the double electron layers as they neared the quantum Hall transitions. “This mode softening is part of the basic conceptual framework that we all use to understand these systems,” he notes. “It's not been seen before in an electron gas.”

    Going through a phase. Several groups of condensed-matter physicists are now looking into the possibility that quantum phase transitions may hold the key to their field's biggest mystery, high-temperature superconductivity (HTSC). Conventional superconductivity, which is seen at very low temperatures, entails a classical phase transition: The transition to zero resistance occurs as the temperature is lowered, making the thermal fluctuations weak enough to allow electrons to form pairs that travel without loss through a crystal lattice.

    Yet the high-temperature superconductors, made of copper oxides, behave nothing like conventional ones. Although the temperature has to be lowered to turn them into superconductors, their properties seem to depend strongly on impurities in the copper oxide crystal. “How did it occur to Bednorz and Müller [discoverers of HTSC] to look for a superconductor in the most terrible conductor, a ceramic insulator?” marvels Shou-Cheng Zhang, a theorist at Stanford University. “By doping atoms into copper oxide, they found that somehow this insulator becomes a perfect conductor.” The fact that the change depends on something besides temperature, namely the doping of other elements into the crystal lattice, suggests a quantum phase transition.

    Last year, Zhang came up with a possible scenario for this phase transition by applying mathematical tools borrowed from high-energy physics. The insulating state of the high-temperature superconductors is an antiferromagnet, a delicate balance of alternating spins, like an array of tiny bar magnets pointing alternately north and south. Zhang and his colleagues found that they could mathematically “unify” antiferromagnetism and superconductivity, much as particle physicists have learned to unify the electromagnetic force with the nuclear weak force. This unification allowed Zhang to describe the transition to superconductivity in a high-temperature material as a process radically different from the one in a conventional superconductor, where the resistance drops to zero precisely when the electrons form pairs.

    “The pairing can actually occur at high temperatures, well above those at which the material is superconducting, but the fate of these pairs is governed by lower temperature effects,” Zhang says. As the temperature is lowered, the pairs can form an antiferromagnetic quantum solid or melt into an electron-pair superconductor. At zero temperature, Zhang's model suggests, these two ground states might emerge as two different quantum phases, with composition alone governing the transition between them.

    So far, it's only a theoretical picture, but experimenters are beginning to find hints of quantum phase transitions in high-temperature superconductors. Mason and his co-workers at Oak Ridge, for example, probed single crystals of La2-xSrxCuO4 with neutrons to measure magnetic fluctuations in the material when it had been doped almost enough to make it into a superconductor (Science, 21 November 1997, p. 1432). The neutrons are like a beam of tiny bar magnets bouncing off the spins in the sample; sorting out the scattered neutrons tells how the spins are fluctuating. Although the experimenters could not reach absolute zero—the place where quantum phase transitions come into their own—they found that the spin fluctuations varied with temperature in a way that would be expected if a quantum phase transition were lurking somewhere nearby.

    Not only could quantum phase transitions lift the lid on HTSC, but Mason and others think that studying these transitions may equip physicists to solve equally tough puzzles in the future. The research, says Mason, is part of physics's quest for universality—for ways to explain the behavior of complex, many-body systems in terms not of microscopic details but of large-scale properties. “Because of universality, the things we are exploring now can be applied to cases we don't know about yet,” he says. “It's a remarkable thing that you can experiment on a simple system, then discover there's a whole other set of transitions that follow the same behavior.”



    New Clues to Movement Control and Vision

    1. Ingrid Wickelgren

    Last month, neuroscientists gathered at Boston University School of Medicine for a conference on the brain, held to commemorate the late computational neurobiologist David Marr, who pioneered theories about brain regions as varied as the cerebellum, the visual system, and the cerebral cortex. The meeting's topics were similarly diverse, ranging from the workings of the retina to computer models of the cerebellum

    Guiding a Robot's Movements

    When a tennis ball flies over the net, U.S. Open champion Lindsay Davenport has to predict where it will go so she can race to hit it. If she fails to guess the right spot, she'll lose the point. It's crucial to be a split-second ahead of the game, and computational neurobiologist Terrence Sejnowski of the Salk Institute in La Jolla, California, aided by a robot his team developed, has new evidence to support the idea that this skill is learned with the help of a brain area called the cerebellum.

    At the meeting, Sejnowski showed that a software model of the human cerebellum gave the 2-centimeter-wide cylindrical robot an ability it didn't have before: It could predict, a second in advance, the position of a moving light based on the light's pattern of movement. The result suggests that the cerebellum learns how to anticipate where a fast-moving object will go. Other work had already suggested that the cerebellum—known for its role in stabilizing the body, moving the eyes, and performing multijoint movements—might have a role in learning to make very short-term predictions. The details of Sejnowski's model have not yet been published, but if the software really is a good model of the cerebellum, says computational neuroscientist Tomaso Poggio of the Massachusetts Institute of Technology, the new work could be “an important and necessary step” toward nailing down this function for the cerebellum.

    Sejnowski's group originally set out to use the robot as a real-world test of a model they had made of a brain circuit that governs motivation and reward-seeking. This circuit, shared by both humans and bees, is thought to assess the chances of rewards that may appear in the next few seconds and direct behavior to maximize a reward. The model had accurately simulated, on a computer screen, the behavior of bees given a choice of stationary blue or yellow flowers that offer different amounts of nectar (the reward). The computer bees learned to move to the colored flowers that had the highest probability of containing nectar.
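The reward-driven choice behavior described above can be sketched as a toy prediction-error learner. The flower colors are from the article, but the probabilities, learning rate, and function name below are illustrative assumptions, not the published model's parameters:

```python
import random

def simulate_bee_learning(p_blue=0.8, p_yellow=0.2, trials=400,
                          lr=0.1, seed=0):
    """Toy sketch of the reward circuit: each flower color carries a
    running estimate of expected reward, and the gap between the reward
    received and the reward predicted nudges that estimate.
    Parameters are illustrative, not the published model's."""
    rng = random.Random(seed)
    value = {"blue": 0.5, "yellow": 0.5}  # predicted reward per color
    for _ in range(trials):
        # mostly exploit the color predicted to be more rewarding,
        # with occasional exploration of the alternative
        if rng.random() < 0.1:
            color = rng.choice(["blue", "yellow"])
        else:
            color = max(value, key=value.get)
        p = p_blue if color == "blue" else p_yellow
        reward = 1.0 if rng.random() < p else 0.0
        # prediction error drives the update
        value[color] += lr * (reward - value[color])
    return value
```

After training, the learner's value estimates approach the underlying nectar probabilities, so it preferentially visits the richer color—the behavior the computer bees showed on screen.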

    For the robot, the reward was not nectar but the light emitted from a horizontal array of diodes that flashed in sequence from left to right and then back again. The reward circuit caused the tiny robot to roll toward the array of diodes, because this makes the light appear brighter. But because the light moved from left to right, changing position every second, the robot would lose track of it. The robot seemed unable to predict where the light would go next.

    Sejnowski and his graduate student, Olivier Coenen, wondered whether the robot might do better if it had a software version of the cerebellum. This brain structure was already known to have a role in predicting movement. Some of this work involved the so-called vestibulo-ocular reflex, in which the eyes move to compensate for a head movement so that the image on the eyes remains still. Studies in monkeys suggested that the cerebellum delivers signals that predict how much the eyes must move in response to a future head movement. That way the brain can plan the eye movement before the head turns.

    Sejnowski suspected that this predictive role might extend to tuning other types of motion to sensory cues as well. So, he enlisted the aid of engineers Marwan Jabri and Jerry Huang at the University of Sydney in Australia to help him and Coenen program the robot with a model of cerebellar learning based on recent physiological data. In this model, the cerebellum cooperates with a brain structure called the inferior olive, which compares sensory data about what actually happened with the cerebellum's prediction of what should have happened. If the two differ, the olive transmits that difference to the cerebellum, changing connections between simulated neurons in a way that is thought to mimic learning. In this case, the learning is supposed to reduce the difference between prediction and reality, improving future predictions and thus performance.
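That learning loop can be sketched in miniature: a linear predictor stands in for the cerebellum, and the prediction error stands in for the olive's teaching signal. Everything here—the function name, the delta-rule update, the learning rate, the toy diode sweep—is an illustrative assumption, not the unpublished Sejnowski-Coenen model:

```python
def train_predictor(sequence, history=3, lr=0.05, epochs=200):
    """Minimal sketch of the error-driven loop described above: a linear
    'cerebellum' predicts the light's next position from its recent
    positions; an 'inferior olive' computes the mismatch between
    prediction and reality, which adjusts the connection weights.
    Returns the weights, bias, and the final epoch's summed squared error."""
    weights = [0.0] * history
    bias = 0.0
    for _ in range(epochs):
        total_error = 0.0
        for t in range(history, len(sequence)):
            recent = sequence[t - history:t]
            prediction = bias + sum(w * x for w, x in zip(weights, recent))
            error = sequence[t] - prediction  # the olive's teaching signal
            total_error += error * error
            # delta-rule update nudges weights to shrink future mismatch
            weights = [w + lr * error * x for w, x in zip(weights, recent)]
            bias += lr * error
    return weights, bias, total_error

# a light sweeping left to right and back across a diode array,
# with positions centered on the middle diode
sweep = [-2, -1, 0, 1, 2, 1, 0, -1] * 10
```

Training on the sweep drives the prediction error far below what an untrained predictor produces, which is the qualitative effect the cerebellar software had on the robot.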

    Equipped with software based on this model, the robot's performance did improve: It learned to predict the light's movement well enough to follow, and sometimes lead, the light by turning its body. Sejnowski believes an animal's ability to track a moving target is similarly “improved” by the cerebellum. “The model has taught us that each piece of brain can be greatly helped by other parts of the brain,” Sejnowski says. “These parts fit together in a temporal progression,” with the cerebellum acting within a second and the reward circuit acting within several seconds, followed by longer-term predictors like the cerebral cortex.

    Sejnowski says his group now plans to test whether the model will enable the robot to perform other animal-like behaviors thought to depend on the cerebellum. He also hopes the robot will “evolve” to perform even more complicated tasks as additional brain structures are added to it. “We might be able to reproduce simple creatures, then more complex creatures, and ultimately, ourselves,” he says.

    Seeing in the Dark

    Although humans are not nocturnal creatures, we can see well enough at night to find our way to the bathroom or walk along a beach guided only by starlight. We owe this ability to our rods, specialized cells in the retina of the eye that can respond to as little as a single photon of light, triggering neural impulses that will eventually relay an image to our brains. Scientists have largely worked out the chemical reactions that enable rods to make these responses. They know much less, however, about an equally important step: how the rods turn off that response when the light that elicited it is gone, thereby keeping the eyes ever-ready to perceive changes in light patterns at night.

    Probing vision. The micrograph shows an electrode recording the activity of a single rod in a cluster.


    Now, neurobiologist Denis Baylor of Stanford University School of Medicine and his colleagues have pinned down a key reaction in resetting the rods. At the meeting, Baylor described new results with knockout mice showing that an enzyme called rhodopsin kinase, which adds phosphate groups to the rods' light-sensing protein rhodopsin, initiates the chain of events that returns rhodopsin to its off state. The work, which neuroscientist Edward Pugh of the University of Pennsylvania, Philadelphia, calls “really elegant,” could also help researchers understand—and perhaps design better treatments for—diseases in which rods become dysfunctional. In such ailments, which include retinitis pigmentosa, people's night vision may be so impaired they cannot go out at night without someone to guide them.

    The first inklings of how rhodopsin is shut down after it absorbs a photon and triggers a neural signal came in the late 1980s and early 1990s. Working with suspensions of rod proteins, biochemists found that rhodopsin inactivation requires two types of proteins: a kinase, which attaches a phosphate group to one of several amino acids clustered at one end of the molecule, and arrestin, a protein which then sticks to phosphorylated rhodopsin. As a result, rhodopsin can no longer bind to a so-called G protein, which transmits the light-initiated signal through the cell.
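The two-protein shutdown sequence can be caricatured as a tiny ordered state machine, which makes the dependency explicit: arrestin can act only after the kinase has. The function and key names below are invented labels for illustration, not a real biochemistry library:

```python
def fresh_rhodopsin():
    """A light-activated rhodopsin molecule before inactivation begins."""
    return {"phosphorylated": False, "arrestin_bound": False}

def apply_kinase(r):
    """Step 1: the kinase attaches phosphate groups to rhodopsin."""
    r["phosphorylated"] = True
    return r

def apply_arrestin(r):
    """Step 2: arrestin sticks only to phosphorylated rhodopsin."""
    if r["phosphorylated"]:
        r["arrestin_bound"] = True
    return r

def can_signal(r):
    """Rhodopsin can activate the G protein until arrestin caps it."""
    return not r["arrestin_bound"]
```

Running the steps out of order leaves rhodopsin still able to signal, mirroring the biochemists' finding that both proteins, acting in sequence, are required for inactivation.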

    Roughly the same thing happens in intact rods, as Baylor's group, in collaboration with Melvin Simon, Jeannie Chen, and their colleagues at the California Institute of Technology (Caltech) in Pasadena, showed in 1995. When the researchers genetically altered mice so that their rhodopsin lacked the postulated phosphorylation sites, their rods recovered very slowly from light, taking about 20 times as long to return to the resting state as did rods in normal mice. And last year, the team also confirmed a role for arrestin by showing that rods can't reset all the way in mice that lack a working gene for that protein. It still wasn't clear, however, which enzyme performs the crucial phosphorylation. In test tube studies, the reaction could be catalyzed equally well by protein kinase C, a kinase that is commonly involved in cells' signaling paths, and by an enzyme specific to rods named rhodopsin kinase.

    To find an answer, Jason Chen in the Caltech group genetically engineered mice to inactivate the rhodopsin kinase gene. Rods from these animals, Stanford postdoc Marie Burns found, recovered as slowly after light exposure as did those of mice without the rhodopsin phosphorylation sites. This shows, Baylor says, that “rhodopsin kinase is the enzyme that normally turns off rhodopsin,” and that protein kinase C plays a minor role, if any.

    Now Baylor's team is probing the mystery of how the activity of a single rhodopsin molecule generates a cellular response that is exactly the same every time a photon is absorbed, a property necessary for us to make sense of what we see. His group is also studying how other proteins that mediate a rod's response to light are reset. Solving such mysteries will shed more light on our still-hazy view of how we see in the dark.
