# News this Week

Science  24 Jul 1998:
Vol. 281, Issue 5376, pp. 494
1. INTERNATIONAL COLLABORATION

# Indian Scientists Shaken by Bomb Test Aftershocks

1. Pallava Bagla

new delhi, india—Indian and Pakistani scientists are beginning to pay a price for last May's atom bomb tests—a price that many believe is unfairly penalizing civilian science. Although the U.S. government took steps last week to limit the impact of economic sanctions, individual agencies began suspending all interactions with scientists from a long list of Indian and Pakistani research institutions, denying entry to some and questioning the status of others already in the country, restricting the exchange of lab materials, and canceling ongoing projects. Japan and Sweden have also taken punitive steps, leaving scientists from the subcontinent worrying that the stigma of the bomb tests may affect them for months or years to come.

About a half-dozen Indian scientists say they have been prevented in recent weeks from participating in international events in the United States. The list includes Rajagopala Chidambaram, chair of India's Atomic Energy Commission and vice president of the International Union of Crystallography, who was scheduled to speak at last week's meeting of the union in Arlington, Virginia. Other Indian and Pakistani scientists in U.S. research programs, meanwhile, have come “under review,” according to a spokesperson for the Department of Energy, and one Indian scientist has already been dropped from a collaboration at DOE's Argonne National Laboratory near Chicago, Illinois. Some exchanges of laboratory materials appear to have been affected. Even the Centers for Disease Control and Prevention (CDC) in Atlanta has cited the bomb test sanctions in refusing to send out a rabies virus clone.

Under laws designed to prevent the spread of nuclear weapons and missile technology, the United States Commerce Department is blocking the shipment of goods to India and Pakistan that could have a military use. Meanwhile, a multiagency working group at the State Department is deciding on a case-by-case basis whether joint scientific and technical projects should go forward. That has left the bureaucracy struggling to formulate general guidelines, and many agencies seem uncertain about the new rules.

U.S. diplomatic officials, for example, offered competing accounts of why Chidambaram, who is involved in India's atom bomb program, was unable to get a visa to attend the crystallographers' meeting. A State Department official in Washington, D.C., says that Chidambaram simply withdrew his request for a visa. But a U.S. embassy spokesperson here says, “Visa procedures [are] under review as a result of the nuclear tests.” Chidambaram says that U.S. officials in New Delhi held his application for 10 days, then returned his passport and fees on 9 July.

Less prominent Indian researchers have also been affected by the chill. A young civilian astronomer who asked to remain anonymous says he was prevented from taking a job in a private remote-sensing center in Virginia because his visa has been withheld for 2 months. “I had sold all my household belongings and was ready to fly out within a week,” he says. He wonders why civilian scientists should be “made to suffer just because a nation has to be punished.”

Crystallographer Krishan Lal of the National Physical Laboratory in New Delhi says he was denied a visa to visit the United States, even though he works at a civilian research facility. This violates the principles of the International Council of Scientific Unions, says Lal. The council's U.S. representative, the National Academy of Sciences, apparently agrees: Earlier this month, it sent a letter of appeal on Chidambaram's behalf to the U.S. embassy in New Delhi.

At least one U.S. scientific agency—DOE, which runs the U.S. nuclear weapons program—is also taking a hard line. Recently departed DOE Secretary Federico Peña issued a memorandum on 16 June suspending all DOE laboratory collaborations and visits involving “Indian and Pakistani foreign nationals from nuclear institutes and related entities.” DOE has drawn up a list of more than 65 Indian and Pakistani research institutions covered by the suspension, including all of India's civilian nuclear centers and the Indian Space Research Organization. The rules can be waived for individual projects and scientists only by appeal to the secretary. Meanwhile, DOE spokesperson Carmen MacDougall says the agency is reviewing “half a dozen” foreign scientists now at DOE labs to see whether they should return home.

Equally alarming to some Indian scientists are the problems they now face in acquiring supplies and equipment. Scientists at the Indian Institute of Science in Bangalore are still waiting for CDC to ship rabies complementary DNA clones it promised more than 3 months ago. “We were requested to interrupt all biologic shipments until diplomatic resolutions occur,” a CDC official wrote curtly to Indian officials last month.

Sweden has gone even further. It has ended a $500,000 collaboration with India started last December in the fields of environment, energy, and food processing. And Mohan Gopal Kulkarni, a polymer chemist at the National Chemical Laboratory in Pune, says his collaborative project on biodegradable polymers with Y. Tokiwa of the National Institute of Biosciences & Human Technology in Tsukuba, Japan, has been delayed indefinitely “because of the recent nuclear tests.”

Despite these setbacks, most Indian officials seem confident that their country can ride out the storm. Raghunath Anant Mashelkar, secretary of the department of scientific and industrial research, feels that “the new round of sanctions can easily be brushed aside, as India has literally grown up in this atmosphere of technology denials.” But some are not so sure. They warn that Indian science will suffer as a result of this international isolation, and that the repercussions from the nuclear blast will be felt even if official sanctions are lifted soon. One engineer worries that civilian research will pay a heavy price for what he calls “the romantic indulgences of a few nuclear scientists.”

2. ANIMAL CLONING

# Cloned Mice Provide Company for Dolly

1. Elizabeth Pennisi

Dolly, the cloned sheep, can no longer be considered a fluke. As the first—and, at the time, only—mammal cloned from an adult cell, she was greeted first with awe and, later, with doubts (Science, 19 December 1997, p. 2038; 30 January, pp. 635 and 647). Dolly was up against the dogma that DNA from mature cells could not start over and guide an egg's development into a complex, multicellular organism. Before that dogma could be overturned, skeptics argued, the cloning experiment that yielded the lamb needed to be replicated.

Now, they've gotten their wish. In this week's issue of Nature, Ryuzo Yanagimachi and his team at the John A.
Burns School of Medicine at the University of Hawaii, Honolulu, provide the first scientific report confirming that cloning from adult cells is not only possible but repeatable. In it, they describe experiments that have so far yielded more than 50 cloned mice. Two other reports in the same issue describe DNA analyses proving that Dolly and the ewe she was cloned from are indeed genetically identical, as would be expected of clones. And in Japan, two calf clones born 5 July but not yet fully described in the scientific literature (Science, 10 July, p. 151) have apparently passed similar tests. “[Cloning] is a real phenomenon,” comments Richard Schultz, a developmental biologist at the University of Pennsylvania, Philadelphia.

These achievements may reignite the ethical frenzy that followed the first reports of Dolly, primarily because of worries that the technology will be applied to humans. And they will certainly spur renewed vigor among companies vying to apply these technologies. For example, they might be used to clone herds of cattle that produce therapeutic proteins in their milk. “We intend to commercialize [the mouse technology] on a broad range of animals,” says Laith Reynolds, CEO of ProBio America, a Honolulu-based company that is now supporting the mouse work in Hawaii.

To clone mice, Yanagimachi, working with Teruhiko Wakayama of the University of Tokyo in Japan, devised a variation of the technique used by Ian Wilmut of the Roslin Institute in Scotland and Keith Campbell of PPL Therapeutics to create Dolly. The idea is to get nuclei from adult cells into eggs whose own nuclei have been removed. The resulting cells can then be triggered to develop into embryos, which can be implanted in foster mothers. But while the Roslin team got the adult cells to fuse with enucleated eggs by subjecting them to an electrical pulse, Wakayama uses a very fine needle to take up the donor cell nucleus, which he very gently and quickly injects into an enucleated egg.
“He is very careful to make sure as much of the donor cytoplasm is gone as possible,” says Schultz. That cytoplasm could contain factors that might thwart proper development.

The Hawaii group also took a different approach to initiating egg development. In the Roslin team's case, the same electrical pulse that fused the cells prompted the egg's activation. Wakayama first lets the cells sit for up to 6 hours to give the egg cell time to alter the donated DNA so that its developmental genes can be expressed again. Then, the Honolulu team triggers development of the eggs by putting them into a culture medium containing strontium, which stimulates the release of calcium from the eggs' internal stores—the same signal that tells fertilized eggs it is time to start dividing.

For some reason, the Honolulu team's strategy worked best with cumulus cells, which surround an egg as it matures. Over the past year, the group has used them to create some 50 clones, confirming their clonal origins by comparing the DNA of the newborn mice to that of the animals that provided the nuclei. The cloned mice seem normal: The group has cloned some clones and mated others, creating healthy young in both cases. All told, “it's a very compelling paper,” says Michael McClure, a cell biologist at the National Institute of Child Health and Human Development in Bethesda, Maryland.

Just as compelling are the results of two DNA analyses, conducted independently by the PPL-Roslin team and a group from the Hannah Research Institute in Ayr, Scotland, and the University of Leicester in the United Kingdom, to evaluate Dolly's origins. Some researchers thought that the limited DNA analysis Wilmut's team originally performed to show that Dolly is not an offspring of its surrogate mother was not convincing. So, the two groups made a more detailed comparison of DNAs from Dolly, from the cultured udder cells used as nuclear donors, and from the ewe that provided the cells.
Wilmut's team, working with a local company called Rosgen, analyzed 10 microsatellites—short stretches of DNA known to vary between unrelated individuals. “They all had identical patterns,” comments Robert Wall, a geneticist at the U.S. Department of Agriculture in Beltsville, Maryland. The second team compared DNA fingerprints—patterns created by chopping up DNA with enzymes and sorting the fragments by size—from the same three sources, and also looked at DNA obtained from animals belonging to the same herd as the ewe. Dolly's fingerprints matched the donor ewe's but not those of the herd. “These two [reports] are fairly powerful demonstrations that Dolly is what they say she is,” Wall concludes.

The two Japanese calves, obtained by fusing oviduct cells from one cow with enucleated eggs from another, apparently are as well. Last week, Yukio Tsunoda, a professor of animal reproduction at Kinki University's Faculty of Agriculture in Nara, announced that DNA testing has confirmed that the calves are offspring of the oviduct cell donor. “I think there is no mistaking that they have repeated the Roslin procedure,” says Tomohiro Kono, a developmental biologist at Tokyo University of Agriculture. The Japanese team, including researchers from both Kinki and the Ishikawa Prefectural Livestock Research Center in Nara, also said that they have an additional four cows pregnant with cloned embryos.

Now, cloning researchers can move on to other challenges, such as trying to improve their success rates, currently a few percent at most. This effort should be helped by the ability to study cloning in mice, which have shorter life cycles and require much less care and space than, say, sheep or cows. With improvements, says Wilmut, nuclear transfer “is going to be a very reliable, robust [cloning] method.”

3. SPACE SCIENCE

# Negative Review Galls Space Crystallographers

1. Jennifer Couzin

The promise of space-grown protein crystals has been a major selling point for the international space station. Larger and more perfect than Earth-grown versions, they could reveal new molecular details and new targets for drug designers. But a group of academic scientists who issued a review of the field last week said that crystal-growth experiments NASA has already flown aboard the space shuttle have not lived up to expectations. The seven members of the American Society for Cell Biology (ASCB) said in their report that the field has made “no serious contributions” to scientific knowledge and there is “no justification” for continuing such studies in space.

Released on the eve of the House vote on NASA's 1999 budget, the report was distributed at a 15 July press conference at the Capitol by Representative Tim Roemer (D-IN) as he sought support for an amendment to eliminate the space station. At press time, the amendment was not expected to pass, but the report has infuriated some protein crystallographers. “I think the report is absolutely wrong,” says Larry DeLucas, a crystallographer at the University of Alabama, Birmingham. “I can't believe the [ASCB] would get behind a statement like that.” DeLucas says his NASA-funded space-based research helped reveal a protein structure that has contributed to the ongoing development of influenza drugs. And Daniel Carter, a biophysicist who formerly worked for NASA and now directs New Century Pharmaceuticals, a company in Huntsville, Alabama, that receives funding from NASA, says his space-based work crystallized proteins 10 times larger than those grown on the ground, making them available for structural studies. Of the ASCB report, Carter says, “it just seems to be more of an opinion than a review of the facts.”

Members of ASCB's panel were unanimous in giving NASA's crystallography program bad marks.
So far, the $9-million-a-year effort has not lived up to claims that it would aid drug development for Alzheimer's disease and breast cancer, says the chair of ASCB's panel, biologist Donald Brown of the Carnegie Institution of Washington, D.C. “The [Earth-based] crystal community doesn't feel that real gains have been made in space,” says Brown. Another member of the panel, Harvard University crystallographer Stephen Harrison, says he conducted a literature search for crystals grown in microgravity conditions and determined that “none of the modest successes reported” had made a “significant impact” on drug design or structural biology.

A third member of the panel, Washington University biologist Ursula Goodenough, explains that “it became untenable for those of us in the ASCB to sit back” without pointing out the lack of productivity in the field. The panel concluded that there was no justification for conducting these experiments aboard the space station. Indeed, the members of this panel—none of whom receive funding from NASA—determined that, except for studies of astronaut physiology, virtually all NASA's life sciences research should be ground-based.

NASA sought to downplay the criticism. Joan Vernikos, the director of NASA's life science program, says she is “perplexed” by the report and argues that “it is not the general consensus of the community that [space-based protein crystallography] is a useless program.” Recently, she says, a group of outside scientists appointed by NASA and the National Institutes of Health issued a report praising the crystallography research. And NASA-funded crystallographers argue that the quality of data will improve as they move to long-term experiments on the station.

Despite the anger the ASCB report has aroused, Goodenough stresses that it is “meant to be positive” in urging NASA to increase ground-based research in the life sciences, such as analyzing satellite photographs of the Earth. Whether NASA managers will take the criticism in that spirit and adjust their programs remains to be seen.

4. FRANCE

# Allègre Sets Tough Targets for Research

1. Michael Balter

paris—Last week, while football supporters were still dancing in the streets following France's victory in the World Cup, French government ministers adopted an ambitious plan to win a similar prize for French science. The 15 July closed-doors meeting, attended by ministers from all spheres of government and chaired by Prime Minister Lionel Jospin, set the goals high: Over the next 4 years, France will attempt to double the impact of its scientific publications, triple its international patents, and create 400 new high-technology companies. “We are going to make radical changes,” said geochemist Claude Allègre, France's research and education minister, at a press conference immediately after the meeting.

Although Jospin chaired the session, Allègre and his staff had worked out the new strategy many weeks earlier. Despite the fanfare surrounding its announcement, some researchers expressed skepticism that its goals are feasible. Over the years, many French scientists have grown weary of the steady stream of research strategies, blueprints, national consultations, and commissions that have emanated from successive governments. “The objectives are correct,” says Philippe Froguel, director of the human genetics department at the Pasteur Institute in Lille. “My only question is, how is all this going to be carried out?”

Still, two features of the new plan have made researchers sit up and take notice: First, the government intends to move toward a system of peer-reviewed grants to finance publicly funded research. This would be a radical step for French science, which relies on arcane formulas to distribute most research funds to government labs. Second, evaluation of research in the giant public research institutions, such as the basic research agency CNRS and the biomedical research agency INSERM, will gradually be taken away from internal committees and put in the hands of external review panels made up of both French and foreign scientists. (Also see Editorial by Allègre on p. 515.)

Pierre Chambon, director of the Institute of Genetics and Molecular and Cellular Biology near Strasbourg, says that if these reforms are adopted, “it will be a fantastic change.” Chambon, who has been chosen to head the external review panel for the CNRS, says that a peer-review system will funnel more research money to “active young people, good people,” rather than allowing funds to wind up concentrated in the hands of senior lab directors, whose productive years may be behind them. Although details of the new plan are still sketchy, Allègre mapped out at the press conference a series of steps designed to shake French research out of its current doldrums (Science, 16 January, p. 312). To foster the creation of new high-tech companies, the research ministry will create liaison committees between researchers and industry at each university and research agency. Special funds will be allotted to finance applied research projects, and the external evaluation committees will be charged with organizing peer-review boards to vet grant proposals in various key fields.

Allègre may, however, have a hard fight transforming France's huge and entrenched public research system. Some researchers fear the consequences of a peer-review system on basic research. “There are certain lines of research that are not fashionable and that are difficult to finance,” says microbiologist Richard D'Ari of the Jacques Monod Institute in Paris. “These areas should be protected.” Moreover, in recent months researchers have mounted considerable resistance to organizational reforms Allègre has already proposed for the CNRS and INSERM (Science, 6 March, p. 1442).

Late last month, Allègre discreetly invited top figures in French research to a daylong meeting at the Ferrières château, east of Paris, to garner support for his ideas. The gathering generally endorsed Allègre's plan, which was essentially the same as that agreed upon by the ministers last week, but some of his goals met with skepticism. For example, doubling the relative citation impact of French papers—the number of times research articles are cited divided by the total number of articles published over a period of time—would make France the world leader in the scientific impact of its publications, a target some called unrealistic. “Everybody was very surprised to see this goal announced,” says a French researcher who asked not to be identified.

But geophysicist Vincent Courtillot, Allègre's chief adviser, told Science that research officials hope to at least approach this goal by encouraging researchers to publish fewer articles but with higher quality. “We can publish half as many papers that are twice as good,” Courtillot says. The other two goals—increasing the number of patents and creating new high-tech companies—may be within the realm of the possible, researchers told Science, but are still ambitious. “If each university created four or five new companies, we could do it,” says Froguel. On the other hand, Chambon says, “France is not very well equipped” to create start-up companies. “It is much easier in the United States, where you can hire people, fire people, and create a new firm in a few days. The system to help scientists start companies and find venture capital is very underdeveloped in our country.”
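
The arithmetic behind Courtillot's "half as many papers that are twice as good" remark is easy to check: relative citation impact is just citations received per article published over the window. A toy calculation (all figures below are invented for illustration, not from the article):

```python
# Back-of-envelope check of the "citation impact" ratio the article defines:
# citations received divided by articles published over a period.
# All numbers are hypothetical.

def citation_impact(citations: int, papers: int) -> float:
    """Citations per published article over some time window."""
    return citations / papers

# A hypothetical country publishing 40,000 papers that draw 120,000 citations:
impact = citation_impact(120_000, 40_000)        # 3.0 citations per paper

# Halving output while keeping total citations constant (each remaining paper
# cited twice as often) exactly doubles the ratio, as Courtillot suggests:
fewer_better = citation_impact(120_000, 20_000)  # 6.0 citations per paper
assert fewer_better == 2 * impact
```

The point of the sketch is that "doubling impact" does not require more citations in total; concentrating the same citations on fewer, stronger papers achieves it by definition of the ratio.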

Despite these reservations, and the prospect of stiff resistance from some scientists, to many researchers it seems that the future of French research hangs in the balance. “[The government] shouldn't ask the heads of the labs if they want to change, but the young scientists, who want to work and do science now,” Chambon says. “If Allègre doesn't succeed in changing French research, it will be a long time before anything will happen.”

5. SPACE SCIENCE

# Chain of Errors Hurled Probe Into Spin

1. James Glanz

The loss of the $1 billion Solar and Heliospheric Observatory (SOHO) has exposed a long chain of software and control errors, a NASA review panel has found. The report says a pair of software bugs laid the groundwork for a disastrous command from controllers at NASA's Goddard Space Flight Center in Greenbelt, Maryland, which caused SOHO to spin out of control early on 25 June. Science has learned that the debacle also revealed other foul-ups, which played smaller roles in the loss: a third software bug, which may have distracted the flight operations team during the crisis, and the failure of three of SOHO's four emergency batteries.

The panel, which issued its preliminary report last week, is not writing an epitaph for SOHO, which had been watching the sun for two-and-a-half years and was expected to keep gathering data into the next millennium. The spacecraft, a joint project of NASA and the European Space Agency, is now mute and is thought to be tumbling in an orientation that prevents its solar panels from collecting sunlight and generating power. As it moves around the sun over the next few weeks, however, the panels may rotate sunward again and enable the craft to answer NASA's calls. But Joseph Gurman of NASA Goddard, the U.S. project scientist for SOHO, says the debacle holds important lessons. “We ended up with a system that was more complex than was consonant with the very highest degree of safety,” he says.

The incident centered on two gyroscopes, called A and B, which sense the spacecraft's roll—its rotation around its longest axis, which is normally aimed at the sun. Because of slight imbalances and electronic inaccuracies in the gyros, they must be calibrated occasionally to determine their “drift,” or the amount of actual roll SOHO has when the gyros read zero. To help with the fine-tuning, gyro B's output is set to 20 times its normal sensitivity during calibration.
The first software error—introduced during a recent effort to streamline the SOHO software—left gyro B in its hypersensitive mode after calibration rather than resetting it. “It was left indicating roll rates 20 times greater than actual,” says Gurman. This error caused trouble when controllers at Goddard began a second routine procedure, in which SOHO's thrusters are fired to hold the spacecraft steady while a set of wheels, accelerated to twist the spacecraft during maneuvers, are slowed from the high rates of spin they acquire over time. As soon as the procedure was finished, gyro B began telling SOHO, incorrectly, that the spacecraft was spinning 20 times too fast. On the evening of 24 June, the spacecraft went into a mode called Emergency Sun Reacquisition (ESR). Triggered automatically if SOHO detects anomalies in its orientation, ESR fires thrusters to reorient the craft toward the sun.

At that point, a second critical software bug did its damage. To save wear and tear, gyro A shuts off while the wheels are braked. But because a necessary command sequence had been omitted from onboard software during a rewrite about a year ago, gyro A, unknown to controllers, had failed to come back on. Gyro B's erroneous output had, by then, been reset, but it conflicted with gyro A's false zero. And while this problem was developing, other errors cropped up in software written for a recent move to a new control room. Controllers hustled down a hall to the original room, where they hoped the instrument readings would be more reliable. Here, they took the fateful step described succinctly in the report: “A rapid decision was made that gyro B was faulty and should be commanded off.”

When gyro A reads zero, SOHO's thrusters fire briefly to compensate for its drift and stabilize the spacecraft. But because gyro A continued to give its false zero, the thrusters had been firing continuously, spinning SOHO faster and faster.
Its befuddled sensors triggered two more ESRs, eventually sending it flailing out of control, like a ballerina tripped in midpirouette. Communications faded rapidly as the solar panels lost sunlight—far too rapidly, one panel member says. Surprised controllers now found that undetected failures sometime earlier this year had taken out three of the four batteries.

Michael Greenfield, an official in NASA's Office of Safety and Mission Assurance who co-chaired the panel, puts the blame squarely on the controllers' decision to turn off gyro B. “The team had sufficient time—over 24 hours the spacecraft would have been stable—to reevaluate what to do,” he says. “You generally stop, call in experts, senior management. That was not done.” Others, however, point out that many individual failures contributed to the loss. “We've got to avoid any finger-pointing,” says Goddard's Art Poland, the previous SOHO project scientist, who emphasizes the scientific successes the mission has already scored. “Each of us can share part of the blame,” agrees Gurman, who was on vacation at the time. “If I had been driving the Titanic, would it have hit the iceberg?”

6. GLACIOLOGY

# West Antarctica's Weak Underbelly Giving Way?

1. Richard A. Kerr

The news out of West Antarctica remains unsettling. Early this month, researchers sifting through mud drilled from beneath the West Antarctic Ice Sheet reported that the massive pile of ice had disintegrated to next to nothing at least once in the past 1.3 million years (Science, 3 July, p. 17), presumably during a warm interlude between ice ages. Now, space radar images hint that the ice sheet may be weakening again in today's warming world. One of the glaciers flowing from the ice sheet into the sea—a glacier that has long been seen as the ice sheet's weak point—is eating into stabler ice at a startling rate.
The observations, reported on page 549 of this issue of Science by radar scientist Eric Rignot of the Jet Propulsion Laboratory in Pasadena, California, show that the “grounding line” of the Pine Island Glacier—where ice resting on its bed gives way to floating ice—has been retreating inland at a rate of more than a kilometer per year, presumably because the glacier is losing mass by melting at its base. “That's not catastrophic yet,” says glaciologist Richard Alley of Pennsylvania State University in State College, “but most models indicate [that the retreat] would speed up if it kept going.” And that, say some glaciologists, might be a first step toward the collapse of the entire West Antarctic Ice Sheet, which covers one-quarter of Antarctica.

In 1981, University of Maine glaciologist Terence Hughes dubbed Pine Island Glacier and the adjacent Thwaites Glacier the “soft underbelly” of the ice sheet because they seemed particularly vulnerable in warm climates like today's. They lack the extensive floating ice shelves thought to buttress other glaciers that are draining the ice sheet, he noted. They are also exposed to the relatively warm South Pacific Ocean. Once these vulnerable ice streams began to give way, Hughes speculated, the generally downward slope of the seabed that the ice sheet rests on would accelerate the grounding line's retreat and the accompanying thinning of the ice, ultimately leading to the complete collapse of the ice sheet within a couple of centuries. The result would be a sea level rise of more than 5 meters—enough “to back up every sewer in New York City,” as one researcher puts it, not to mention flood any low-lying coast, from all of South Florida to the city of Bangkok.

Rignot looked for early signs of collapse in observations of the 33-kilometer-wide Pine Island Glacier made between 1992 and 1996 by the radars aboard the European Earth Remote Sensing satellites ERS-1 and ERS-2.
In his computer analysis, he allowed radar signals reflected from the glacier during satellite passes a few days apart to interfere with each other. The resulting interference pattern, sensitive to small vertical motions, revealed the subtle flapping of the floating ice as ocean tides raised and lowered it. The grounding line at the hinge of the flapping ice shelf retreated into the ice sheet at a rate of 1.2 ± 0.3 kilometers per year during the 4-year period, Rignot concluded. “That's a significant retreat,” he says. “I would say it's surprisingly large,” agrees radar glaciologist Mark Fahnestock of the University of Maryland, College Park. “It could potentially lead to a collapse” of the ice sheet.

But researchers aren't panicking yet. Their primary reservation, which Rignot shares, is that 4 years “is a very short interval,” says Alley. “Glaciers do weird things.” For example, one of the ice streams draining into the Ross Sea stopped flowing about 100 years ago, and another slowed by 50% during the past 35 years. More radar surveillance should tell whether the glacier's retreat is continuing, and measurements of exactly how the seabed slopes beneath the glacier should indicate whether the retreat really will accelerate. “That guarantees a very high priority will be to map the sea floor in that whole area,” says Hughes.

Getting to know Pine Island Glacier better will not be easy. It's “a hideous place to work,” says Alley. It's so remote, “you can't get there from anywhere, and the weather stinks.” But to see what the warming world might hold, glaciologists are already breaking out their cold-weather gear.

7. NEUROBIOLOGY

# How the Brain Sees in Three Dimensions

1. Marcia Barinaga

When Renaissance painters solved the problem of depicting three-dimensional (3D) scenes on flat canvases, their paintings blossomed into realistic representations of the world.
Our brain must solve this problem every day to reconstruct 3D views from images that fall on the 2D surface of our retinas. Researchers have long known that we use various cues to accomplish this, such as the stereoscopic effect of binocular vision and the relative sizes of objects. Now, a team at the California Institute of Technology in Pasadena has made a surprising discovery about the neurons that apparently translate distance cues for the brain.

Most neuroscientists thought that neurons sensitive to object distance would be located in the so-called “where” processing stream, a set of brain areas that receive information from the primary visual cortex and use it to compute spatial relationships that, among other things, guide movements, such as the reach of a hand toward an object. But on page 552, Caltech's John Allman and Allan Dobbins and their co-workers report finding brain neurons outside the “where” stream that register depth, as indicated by correlations between their firing rates and the absolute distances of objects.

“This paper is going to attract a lot of interest,” predicts Robert Desimone, director of intramural programs at the National Institute of Mental Health. Terrence Sejnowski, a neuroscientist at the Salk Institute in La Jolla, California, agrees, noting that it suggests that depth-sensing neurons are found throughout the visual cortex, their information combining with the 2D map that already exists in each visual cortical area to provide the areas with full 3D maps of visual space.

The Caltech team identified the depth-perception neurons by recording the activity of neurons in monkeys' brains as the animals looked at bars of light displayed on a computer screen at various distances from the monkeys. The team looked in two brain areas, the primary visual cortex and a nearby area called V4, and found that some neurons in each area respond best to light bars that produce a particular size image on the retina.
Because the size of the retinal image changes with the screen's distance, that means the brain's response to any bar would also change with distance—no surprise there. But when the researchers kept the size of the retinal image constant by varying the size of the light bar as they changed the position of the screen, they still found, Allman says, that “distance was having a very powerful modulatory effect” on some neurons. There were “farness neurons” whose responses increased as the screen moved away, “nearness neurons” whose responses grew stronger when the screen moved near, and other neurons that peaked in between.

The researchers then monitored the firing rates of these neurons as they selectively removed visual cues for distance. Some neurons stopped registering distance when one eye was covered, suggesting that they depend on binocular cues. Others worked monocularly as long as the monkey had a broad view of the room and the monitor, but lost depth perception when the monkey viewed the image through a tiny hole. And some neurons continued to register distance when both context and binocularity were removed. Allman and Dobbins think those neurons may respond to cues such as the focus of the eye, which varies with distance, or the angle of gaze, which shifts inward toward the nose as an object gets nearer.

“It is interesting that different cells appear to be tuned to different kinds of depth cues,” says neuroscientist Mel Goodale, of the University of Western Ontario, in London. But perhaps the most intriguing aspect of the work, he says, is that the Caltech team found neurons sensitive to object distance, not in the “where” stream, where conventional wisdom suggested it to be, but in primary visual cortex and in V4, which is part of a second processing stream, the “what” stream, which specializes in the identity of objects. This suggests the trait may occur throughout that stream, and perhaps the whole visual cortex.
This invites researchers to “rethink the ‘what’ pathway” and the role distance information plays in its mission, says Sejnowski. Size is relevant to an object's identity, he says, and the “what” stream would need distance information to compute size. “In retrospect,” says Desimone, “it makes perfect sense” that visual maps in the ‘what’ stream would be three-dimensional. “But honestly,” he adds, “I was surprised.”

8. PHYSICS

# First Ticks of a Super Atom Clock

1. David Kestenbaum

Bose-Einstein condensates have yet to make the leap from quantum toy to tool. But in prodding these curious aggregates of supercold atoms, physicists have elicited some hints of future practicality. Now, a group led by Carl Wieman at JILA and the University of Colorado in Boulder has fashioned a crude clock based on the quantum ticks of these balls of atoms. A much-refined version might one day replace the traditional atomic clocks that keep the world on time.

Today's atomic clocks are pegged to the frequency of light emitted when a cesium atom flips between two slightly different configurations—one in which the spin of the electron and the nucleus point in the same direction, and one in which they point in opposite directions. To get an observable signal, clocks usually watch millions of atoms. With such large numbers, however, the atoms interfere with each other electrically, smearing out and shifting the precise spacing of atomic levels, which blurs the regular ticking.

Condensates offer a way to get more bang for the atom. A condensate forms when a cloud of atoms is cooled to within a hair of absolute zero and all the atoms leap into the same quantum state “like lemmings,” Wieman says. Then, it turns out, instead of acting like individual clocks, their pendula swing in perfect harmony, which can amplify the signal manyfold. “That's a huge difference,” says physicist Steven Chu of Stanford University. Wieman and colleagues didn't set out to make a condensate clock.
Condensates, like all quantum objects, have wavelike properties, and the team was studying how the waves of two condensates interact over time. They confined a few hundred thousand rubidium atoms in a magnetic trap and cooled them into a condensate. Then they split it using a radio frequency burst to form a second, overlapping condensate, in a different spin state from the first. The quantum waves corresponding to the two states have frequencies that differ by a small but precise amount, just like the two states of cesium that have been used to keep time. The frequency difference can be inferred by allowing the two condensates to interact briefly and watching how many atoms jump from one to the other.

The team expected that as the condensates sloshed around in the trap, the frequency difference would get washed out. But when they measured the populations of the condensates with a laser, they found that the frequency difference was durable enough to keep time. “You can think of this as the first Bose-Einstein condensate clock,” Wieman says. But he cautions, “I wouldn't want to push the accuracy [of] it much.” Jason Ensher, a member of the team, notes that it is only good to about a billionth of a second—over a million times less accurate than the best atomic clocks.

Another practical limitation of this version is that the laser burst that reads the time destroys the clock. And the magnetic fields of the trap will probably blur the frequencies of the two states and degrade the precision of such a clock, points out Christopher Oates, a physicist at the National Institute of Standards and Technology in Boulder. Condensate clocks may keep time for future generations, he says, but for now the idea is “still in diapers.”

9. ENTOMOLOGY

# Earth's Unbounded Beetlemania Explained

1. Virginia Morell

It's a tale every evolutionary biologist knows by heart: Asked what he had concluded about the Creator from studying creation, the great biologist J. B. S.
Haldane reputedly quipped that the Creator “had an inordinate fondness for beetles.” And indeed, the 330,000-odd species in the order Coleoptera—the beetles—far exceed the number in any other plant or animal group. “It's a saying that's always in the back of your mind,” says Brian D. Farrell, an evolutionary entomologist at Harvard University. On page 555, Farrell hands the credit for this diversity to the beetles' own fondness for a leafy diet.

Although the Coleoptera arose some 250 million years ago, “age alone doesn't explain” their diversity, says Farrell. Instead, his research shows that the appearance of flowering plants some 100 million years ago set leaf-eating beetles on speciation's fast track. “It's a classic case of coevolution,” says Farrell. “The plants were like a new, unoccupied island, and the herbivorous beetles were among their first colonizers—that's what opened the door for their dramatic radiation.”

To reach that conclusion, Farrell merged paleontology, phylogenetics, biogeography, and plain old natural history. “This kind of analysis is what evolutionary biology is all about,” says Harvard's E. O. Wilson. “He's addressed two of the most important problems in the field: what determines the number of species [in each taxon] and why some groups, like the leaf-eating beetles, are just over the top in terms of success.” Adds Thomas Eisner, a chemical ecologist at Cornell University, “It's a classic study. Charles Darwin would be proud.”

The first beetles were not vegetarians—primitive beetles living today eat detritus and fungi—but it only took them 50 million years, or as Farrell says, “the evolutionary equivalent of about 50 seconds,” to figure out they could survive on cycads, ferns, and conifers. Many of these earliest herbivores dined on the interior sappy bark or stems of such plants, while their larvae munched the nitrogen-rich, pollen-bearing structures inside the cones.
Farrell thought that tissue-eating behavior could have prepared certain beetle species for the appearance of the juicy flowering plants, or angiosperms.

The idea that plants and insects might be dancing an evolutionary pas de deux was first suggested in 1964 by Stanford University ecologist Paul Ehrlich and Missouri Botanical Garden botanist Peter Raven. “They had this cool vision early on,” says Farrell, “about how plants could be the driving force in insect evolution, and vice versa.” The hypothesis made sense because many insects are restricted to feeding on certain groups of plants—and many plants have defenses targeting insects. The idea took a hit 5 years ago, when Conrad Labandeira of the National Museum of Natural History found that the appearance of angiosperms had no effect on the number of insect families (Science, 16 July 1993, p. 310).

But the picture was different when Farrell applied a finer lens to the most successful insects of all, the beetles. He analyzed highly conserved DNA sequences from 115 species of the herbivorous beetle subfamilies, sampling up to six species from each, to create a phylogenetic tree showing the likely evolutionary relationships of today's beetles and when they diverged from common ancestors. He mapped his tree on floor-to-ceiling charts hung in his office, compiling onto them data from fossils, species' dietary habits, and the biogeography of present-day beetles to identify which beetle ancestors ate which host plants on which continents and over what time periods. “I felt like a photographer in a darkroom,” he says, “watching as the chemicals made a picture emerge.”

The developing image revealed a tight link between plants and beetle diversity. While cycad and conifer-feeding beetles formed the family tree's trunk, angiosperm-eaters dangled from the top branches.
Two related superfamilies, Chrysomeloidea (such as the Colorado potato beetle) and Curculionoidea (which includes weevils), seem to have benefited particularly from the blossoming of a leafy, green world. Together, their known 135,000 species comprise some 80% of all herbivorous beetles and almost half of all herbivorous insects. And their population boom coincides with the rise of angiosperms. “They show an increase in diversity by several orders of magnitude,” Farrell says. “Well over 100,000 new species of beetles arose because of that move to angiosperms.”

The findings, says Farrell, “show how moving into a new environment, where there's no competition, can free you for an explosive adaptive radiation.” Or, as Eisner pithily puts it, “it shows what happens if you eat your vegetables.”

10. DEVELOPMENTAL BIOLOGY

# How Plants Pick Their Mates

1. Evelyn Strauss*

1. Evelyn Strauss is a science writer in San Francisco.

Plants don't flirt. They don't gaze too long or blush. They produce flowers but don't show up at the door with a bouquet. Even without such behaviors, though, they are very choosy about their mates and rarely cross-fertilize with other species in the wild. Now, researchers have shown for the first time that the female parts of plants make snap judgments—good ones—about whether they have something in common with the pollen grains on their threshold.

At the Society for Developmental Biology meeting at Stanford University last month, Daphne Preuss, a plant geneticist at the University of Chicago, and her colleagues reported that the receptive female part, the stigma, of the experimental plant Arabidopsis grabs pollen of the same species and holds on so tightly that even the sheer force of a centrifuge can't separate the two. But pollen from a different species just falls off.
What's more, the stigma apparently discriminates among its suitors by conversing with the cell wall of the pollen grain, a structure most scientists had deemed to be as uncommunicative as, well, a wall. Understanding how plants block inappropriate mating might allow researchers to “overcome those barriers and produce new hybrid crops,” says Preuss. “For example, cold-tolerant plants could be crossed with plants that produce high-quality fruits.”

Because plants can't move, many are literally at the mercy of the winds for introducing sperm, carried by pollen, to the eggs in the female part of flowers. Factors such as species-specific pollinators ensure that the right pollen ends up in the right type of flower, but pollen from other species often makes its way to the stigma as well. So, the female tissue raises many obstacles to hinder mismatched sperm on their way to the egg, explains Dina Mandoli, a plant developmental geneticist at the University of Washington, Seattle. But the earlier the female foils any interloping sperm, the better, and “no one had looked at the very first step before,” she says.

To find out whether the female tissue assesses potential partners upon first contact, Greg Zinkl, a postdoc in Preuss's lab, brushed an Arabidopsis anther, which carries pollen, on a stigma. He dislodged unbound pollen by adding a solution containing detergent and centrifuging or vortexing the stigma. Then, he counted the pollen grains still attached—and found that many had made an instant connection. “This is very abusive treatment,” says Preuss. “We'd expect things to pop off if they could, but amazingly, the pollen sticks very well.” When Zinkl repeated the experiment with pollen from the unrelated petunia plant, the pollen fell off.

When the researchers looked at the plant surfaces through an electron microscope, they saw no physical structures that might explain how the plant parts embrace—no entangling hooks and loops.
“Whatever it is has to be very small—some kind of molecular Velcro,” says Preuss. “We're at the level of chemistry—not large cellular structures.” Although the group hasn't yet tracked down the sticky molecules, a preliminary search for mutant pollen cells that don't adhere properly suggests that structural defects in the cell wall ruin the attraction.

This adds to a growing realization that cell walls play an active role in cell-cell signaling, researchers say. “People have always had this idea that the cell wall is an inert matrix—just cork, boxes that hold cells in place,” says Scott Poethig, a plant developmental geneticist at the University of Pennsylvania, Philadelphia. “But it's becoming clear that the cell wall is a playground where a lot of molecules can interact and talk with each other.” Whatever the precise mechanism, it seems that the instant alliance between pollen and stigma is based on surface attraction. But among plants, at least, this type of union sticks.

11. CLIMATE CHANGE

# Coming to Grips With the World's Greenhouse Gases

1. Karen Schmidt*

1. Karen Schmidt is a writer in Washington, D.C.

Scientists are mounting an ambitious effort to trace the flux of carbon dioxide between land and air; their success will help decide the fate of the Kyoto climate change treaty

Forester Matt Delaney always dreamed of working in the tropics, so when he was asked to help measure carbon stored in 634,000 hectares of humid forest in Bolivia, he jumped at the chance. During 6 months last year, he and 10 Bolivian foresters fought their way through an area twice the size of Rhode Island to sample trees, soil, and ground cover at 625 sites. Along the way, they had to contend with relentless bees and the odd piranha attack. “There was never anything easy about it,” says Delaney, who works for Winrock International, a nonprofit in Morrilton, Arkansas. But the survival school-cum-carbon project is paying off.
The team, he says, now has a good sense of how much carbon dioxide has been sucked out of the atmosphere by this primeval forest—a necessary baseline for tracking changes in carbon storage if the world were to warm in the coming decades.

Such information is not meant to fill almanacs: Scientists and policy-makers of all political persuasions perceive the world's forests as a vast sponge for CO2 belched out by factories, cars, and other societal sources. Canny business leaders view carbon storage as a way to offset industrial CO2 emissions, thus balancing their carbon books and meeting reduction goals set for each country under the climate change treaty that 174 nations agreed to last December in Kyoto, Japan. Environmental groups, meanwhile, see forest preservation as a savior for staving off global warming. “Forests are one of the few ways to take CO2 out of the atmosphere—there's no technology on the horizon to do that,” says Tia Nelson, senior policy adviser at the Nature Conservancy in Washington, D.C.

Researchers are only now grasping the monumental task of understanding the capacity of Earth's surface—primarily its oceans and forests—to absorb greenhouse gases. To track CO2 flux between land and air, scientists are ramping up a global network of monitoring stations; their data could help in assessing whether nations are adhering to the Kyoto treaty (see p. 506). And, in 2 years, engineers will undertake a novel pilot project to pump CO2 into the Pacific Ocean as a possible way to reduce atmospheric gas concentrations (see sidebar on p. 505).

But no scientific issue is thornier than gauging how much CO2 can be socked away in forests. Negotiators hammering out the details for implementing the Kyoto accord have agreed that changes in forest cover since 1990 can be counted for—and against—a nation trying to meet its CO2 emissions obligations.
But the treaty is hazy about how to calculate forest carbon stocks and whether nations can use forestry projects in developing countries to claim carbon credits. Efforts to clarify the CO2-forest connection have just begun in earnest, with a U.N. workshop scheduled for September, and the results will be critical to the treaty's success. “If policy-makers are not well informed about the carbon cycle,” says Michael Apps of the Canadian Forest Service in Edmonton, Alberta, “they will fail to meet the [treaty's] objective of stabilizing greenhouse gas concentrations in the atmosphere.”

## Seeing the forest for the trees

The Kyoto agreement requires 38 nations to come up with a baseline assessment for 1990 of their greenhouse gas emissions from all sources—including changes in land use—and of their carbon stocks. Then, they must reduce emissions by an average amount (7% for the United States) during the years 2008 to 2012. These targets will be adjusted for “verifiable changes in carbon stocks” because of “direct human-induced activities of afforestation, reforestation and deforestation since January 1, 1990.”

In broad terms, researchers estimate that deforestation currently results in about 20% of CO2 emissions worldwide, and that afforestation—planting new forests—and reforestation draw CO2 from the atmosphere. But those generalizations provide a shaky basis for binding, international commitments that have huge economic consequences.

One big problem is that scientists are now 8 years late—and counting—in setting the 1990 baseline. “I'm not aware of any country that has a scientifically rigorous inventory for forests in 1990,” says Darren Goetze, a biophysicist at the Union of Concerned Scientists. Even for the United States, he says, a national picture of forest density was compiled from satellite data only 5 years ago—a picture that's still being filled out by ground measurements.
This uncertainty, says Dan Lashof, a senior scientist at the Natural Resources Defense Council, could result in countries being rewarded for savings that simply result from more accurate field data. “This could undermine any significance of the Kyoto protocol,” he says. What's needed to plug the gaps in knowledge about carbon sequestration in forests, Lashof says, is a comprehensive system for monitoring carbon stocks that combines remote sensing with ground-based sampling—including a global effort to track CO2 (see page 506)—and which provides consistent data from country to country.

The $10 million project in Bolivia is one of the largest attempts to replace the generalizations with real numbers for a specific tropical forest. The Noel Kempff Climate Action Project, as it is called, came about in 1997 when an unlikely coalition of The Nature Conservancy, the Bolivian government, a Bolivian foundation called Friends of Nature, and three U.S. companies (American Electric Power, British Petroleum America, and PacifiCorp) bought logging rights and began training local people on how to protect and use the forest—for example, by harvesting hearts of palm and breeding orchids instead of clearing the land to grow corn. The project is guided by mounting evidence that carefully managed forests can soak up CO2 (Science, 18 July 1997, p. 315).

Data gathered last year suggest that this forest swath sequesters about 15 million metric tons of carbon that would have been released over the next 3 decades if timber companies, farmers, and other developers were allowed to deforest the land. To arrive at that figure, the team first had to measure carbon in the forest's trees, leaf litter, ground cover, and soil. Next, they estimated how much carbon would have been lost to development. To do that, they sampled nearby farm fields—carbon paupers compared to mature forest—to project how much CO2 would be released if 13,000 hectares of forest were razed for cropland, and they estimated how much carbon would have been lost to proposed logging. By both stemming carbon loss and perhaps allowing the sink to continue accumulating, the coalition hopes to use the predicted carbon savings as “credits” to meet targets for reducing their CO2 emissions.
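The avoided-emissions accounting described above is, at its core, a subtraction: the carbon stock under protection minus the stock under the development scenario. The sketch below illustrates that arithmetic; the per-hectare stocks are invented for illustration (the article reports only the 13,000-hectare cropland scenario and the roughly 15-million-ton total), not figures from the project's actual inventory.

```python
# Hypothetical sketch of the carbon-credit arithmetic behind projects
# like Noel Kempff. Per-hectare stocks below are assumed values, not
# numbers from the real inventory.

def carbon_credit(baseline_stock_t, developed_stock_t):
    """Credit = carbon retained by protection: the stock that would have
    been lost under the development (logging or cropland) scenario."""
    return baseline_stock_t - developed_stock_t

forest_per_ha = 150      # tonnes C/ha in mature humid forest (assumed)
cropland_per_ha = 15     # tonnes C/ha in a farm field, a "carbon pauper" (assumed)
area_razed = 13_000      # hectares projected for cropland (from the article)

cropland_loss = carbon_credit(forest_per_ha * area_razed,
                              cropland_per_ha * area_razed)
print(f"credit from avoided cropland conversion: {cropland_loss:,} t C")
```

A full project estimate would repeat this for each threat (proposed logging as well as cropland) and for each carbon pool the team sampled: trees, leaf litter, ground cover, and soil.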

But it is far from certain the strategy will work. Any benefits could be erased if conserving forest carbon in one place simply shifts deforestation elsewhere. Such concerns were on the table in Kyoto, where negotiators hotly debated whether to give credits and debits for forest alterations at all. Many parties believed that “by allowing forest sinks to be counted, it's almost like giving permission to extract more fossil fuels and do more oil exploration,” says Jennifer Morgan, climate policy officer at the World Wildlife Fund. Because forest carbon stocks could be a deciding factor in whether a country meets its emissions target, she says, “we ought to be very cautious” about how this part of the treaty is implemented.

## A matter of definition

World Bank environmental chief Robert Watson, chair of the United Nations' Intergovernmental Panel on Climate Change, agrees on the need for caution. To make the treaty work, he says, nations must adopt environmentally sound definitions of forest alterations—an unclear definition could backfire. For example, if “deforestation” leaves out destruction from fire, and “reforestation” is defined as planting trees where forest used to grow naturally, then some countries might encourage burning down forests and replanting them later for credit. This would result in a net carbon loss to the atmosphere because the new forest could take decades to store as much carbon as the original forest. Scientists will discuss possible definitions at the September workshop.

Experts agree that those definitions and the calculations for determining whether countries meet the Kyoto goals must take into account different forest types. “If you have a forest like part of the Brazilian Amazon that regrows slowly … and is inefficient to harvest and use, then it may be best to leave it protected,” says Gregg Marland of the Oak Ridge National Laboratory in Tennessee. But for fast-growing forests that can be harvested easily, such as in the southeastern United States, he says, “then it's best to make forest products” that sequester carbon or displace other carbon-intensive materials such as cement.

The global picture may seem overwhelming, but sequestration efforts such as the Noel Kempff project in Bolivia suggest that it is possible to take steps now. Whether such schemes will count under the treaty is expected to be a hot topic at a treaty meeting in Buenos Aires in November, in part because most Latin American countries—which already have forest conservation programs they want to bolster—are lobbying for it. But Nelson of the Nature Conservancy predicts the issue will not be resolved soon: “Working out how to do these projects is very complicated, because there's so much room for interpretation, so many agendas, and so much at stake.”

12. # Closing the Carbon Circle

1. Jocelyn Kaiser,
2. Karen Schmidt

The Kyoto treaty has begun to reshape the efforts of thousands of climate change scientists around the world. The treaty's spotlight on the carbon cycle is already helping to drive a major overhaul of the multibillion-dollar U.S. Global Change Research Program (Science, 12 June, p. 1682). This week's lead Focus stories explore the major scientific uncertainty underlying the Kyoto treaty—how much carbon is stored and released by the world's forests (see page 504)—and how a global monitoring system will track carbon dioxide flux between land and air (see p. 506).

Bedeviling negotiators attempting to implement the treaty is a huge mass of carbon, some 1.8 petagrams (1 petagram is 10^12 kilograms), released into the air each year that currently cannot be accounted for in known land and sea carbon sinks (see above). “We see uptake but nowhere near big enough,” says land-use expert Richard Houghton of the Woods Hole Research Center. Scientists have four main approaches for ferreting out this missing carbon: sampling CO2 levels in the air, measuring carbon in forests and analyzing human-altered landscapes, erecting towers that trace CO2 flux between land and air, and using satellites to estimate photosynthesis—and thus CO2 consumption by plants—across ecosystems. The treaty has made it more important than ever to pursue all these approaches, climate researchers recently argued in Science (29 May, p. 1393). Says National Oceanic and Atmospheric Administration ecologist Dennis Baldocchi, “I'm in favor of mixing all four models, almost like the blind men and the elephant”—but hopefully with greater success.

At a meeting of the treaty parties in Bonn last month, the Intergovernmental Panel on Climate Change (IPCC) was asked to prepare a report due out in mid-2000 on the key uncertainty—the role of forests in the carbon cycle. “Our job is to say what can possibly be done and how we can ensure compliance with the protocol,” says IPCC chief Robert Watson. Solving these uncertainties is a huge task. However, Watson says, “the only thing that matters at the end of the day is what the concentrations of greenhouse gases are.”

13. CLIMATE CHANGE

# A Way to Make CO2 Go Away: Deep-Six It

1. Jocelyn Kaiser

While climate experts try to get a better fix on the carbon cycle, some engineers are taking what they view as the obvious next step: devising schemes for funneling excess CO2 into the deep ocean, unminable coal seams, abandoned oil fields, and isolated aquifers. One idea now leading the pack is to pipe CO2 waste from power plants to the bottom of the sea.

The notion is not as farfetched as it may sound. In a project that has won support from some environmentalists, the Norwegian oil company Statoil has since August 1996 shunted 1 million tons of CO2 per year, a byproduct of natural gas production, into a salt aquifer in the North Sea—CO2 that otherwise would have added 3% to Norway's carbon emissions. The CO2 should eventually form a bubble under the formation's shale roof.

Because there aren't enough aquifers in convenient places to hold all the world's extra CO2, scientists are gearing up to test another approach on a pilot scale in the Pacific Ocean. Last December, the U.S., Japanese, and Norwegian governments agreed to launch a $4 million experiment off Hawaii's Kona Coast in the summer of 2000. Engineers will extend a 2- to 3-kilometer pipe down a steep slope, so that it reaches about 1000 meters below the ocean surface. Liquid CO2 sprayed out of a nozzle will mix with seawater and be swept away by deep currents. Scientists will follow the fate of the CO2—300 tons over 30 days—with sea-floor instruments and from remote vehicles.

About 90% of the CO2 produced on Earth already gets sucked up by oceans eventually, so, in a way, “we're trying to help nature along here,” says Massachusetts Institute of Technology chemical engineer Howard Herzog.

But major uncertainties loom over this approach, says Ben Matthews, a chemical oceanographer at the University of East Anglia in the United Kingdom. For example, he notes, solid hydrates formed from CO2 and water could clog the pipe, and CO2-laced water—more acidic than normal seawater—could harm marine life. CO2 could also resurface if bottom currents, which ought to circulate the CO2 for centuries, were to shift. Another contentious issue is that preparing CO2 from power plants—scrubbing off impurities and pressurizing the gas—could require up to 30% of the energy that produced the original CO2. Until the experiment is done, says Perry Bergman of the Energy Department's Federal Energy Technology Center, “no one is ever going to know whether it makes any sense or not.”

14. CLIMATE CHANGE

# New Network Aims to Take the World's CO2 Pulse

1. Jocelyn Kaiser

An expanding array of carbon dioxide monitoring towers around the world could help scientists pin down carbon sinks and enforce the Kyoto Treaty

This spring, a gleaming, 50-meter aluminum tower rose among the aspens and red maples in a Michigan forest, “almost like an extra tree,” says Jim Teeri, director of the University of Michigan Biological Station near Pellston. Towers also sprouted in a California grassland, a Quebec wetland, and a Costa Rican rainforest. Fitted with devices for sensing faint whiffs of carbon dioxide, the towers are the latest tool for answering a key question in climate change models: how much carbon is sequestered by ecosystems.

Until now, researchers have picked at the edges of the carbon-cycle problem, either modeling fluxes globally or looking at tree growth and other clues to carbon storage. But a worldwide network of 70 or so towers now running or about to come online will soon churn out a stream of data on how much CO2 is socked away in various soil and plant types. That information should, over the long haul, help refine models of global warming as greenhouse gases continue to build up in the atmosphere. “To see what the terrestrial biosphere is going to do in the future, data from these sites are crucial,” says Dave Hollinger, a U.S. Forest Service ecologist.

The need for such data has become more pressing now that 38 nations have pledged to slash carbon emissions under the Kyoto treaty. Tracking CO2 flux between land and air is “enormously important for what becomes of the treaty,” says NASA ecologist Tony Janetos. Indeed, notes Riccardo Valentini of the University of Tuscia in Italy, a flux tower network “can be an independent way to verify [the cuts] that the Kyoto protocol requires.” Others caution that the science of tracking CO2 is still in its infancy. “It seems like a promising approach,” says ecologist and climate modeler David Schimel of the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, “but it's also exploratory.”

The towers will help probe a long-standing mystery: the so-called “missing carbon.” Of the 7.1 petagrams of carbon released by fossil fuel burning and biomass destruction each year, only about 3.3 petagrams stay in the atmosphere, and the ocean absorbs another 2, leaving unaccounted for a whopping 1.8 petagrams—enough carbon to fill a soccer field with a pile of coal 230 kilometers high. When scientists feed data on atmospheric CO2 levels collected worldwide into climate models and subtract fossil fuel emissions, the results point toward the Northern Hemisphere as a major carbon sink. But forest inventories and land-use studies fail to explain where all the carbon goes, says Schimel, perhaps because these approaches do not fully account for soils, which may absorb as much as two-thirds of the missing carbon.
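
The bookkeeping behind the “missing carbon” is a simple budget residual: whatever known reservoirs fail to absorb must be going somewhere unmeasured. A minimal sketch, using the approximate figures cited here:

```python
# Annual global carbon budget, in petagrams of carbon per year, using the
# rough figures cited in the article. The "missing sink" is simply the
# residual that known reservoirs fail to account for.
emissions = 7.1        # fossil fuel burning + biomass destruction
atmosphere_gain = 3.3  # measured buildup of CO2 in the atmosphere, as carbon
ocean_uptake = 2.0     # estimated absorption by the oceans

missing_sink = emissions - atmosphere_gain - ocean_uptake
print(f"Missing carbon sink: {missing_sink:.1f} Pg C/yr")  # → 1.8
```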

The towers will track carbon by measuring CO2 breathed in and exhaled by plants and soils. The technique, known as eddy covariance and pioneered in the 1970s, uses wind velocity sensors and infrared gas analyzers to measure CO2 in air drafts. But “lots of funny things can happen,” says Harvard University atmospheric chemist Steven Wofsy. For example, weak nighttime drafts and other factors can lead to underestimates of CO2 release by 10% or more. Despite this drawback, Wofsy's team has shown that warm temperatures alone do not spur carbon storage at the Harvard Forest—abetting factors include a long growing season, cloud-free summer days, and less snow cover (Science, 15 March 1996, p. 1576). And a study in Canada found that as Earth warms, boreal ecosystems may turn into major sources of CO2 from thawing peat (Science, 9 January, p. 214).
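
At its core, the flux measurement pairs rapid readings of vertical wind speed with CO2 concentration: the time-averaged covariance of the two gives the net flux. A minimal sketch, with synthetic illustrative data (the series below is invented, not tower data):

```python
import random

# Minimal sketch of an eddy-covariance flux estimate: the net CO2 flux is
# the covariance of fluctuations in vertical wind speed (w) and CO2
# density (c), sampled rapidly (the towers record ~10 readings per second).
def eddy_flux(w, c):
    """Return cov(w, c): mean product of each series' deviations from its mean."""
    n = len(w)
    w_mean = sum(w) / n
    c_mean = sum(c) / n
    return sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / n

# Synthetic half-hour of 10 Hz data. Updrafts carry air the canopy has
# depleted of CO2, so w and c covary negatively: a net downward flux (uptake).
random.seed(0)
w = [random.gauss(0.0, 0.3) for _ in range(18000)]           # m/s
c = [20.0 - 0.5 * wi + random.gauss(0.0, 0.1) for wi in w]   # mmol CO2 per m^3

print(f"flux = {eddy_flux(w, c):.3f} mmol m^-2 s^-1")  # negative: net uptake
```

The sign convention is the useful part: a negative covariance means the ecosystem below the sensor is absorbing CO2, a positive one means it is releasing it—which is exactly the distinction the nighttime-draft problem blurs.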

Eager to find out what the towers might reveal about other biomes, researchers running 24 North American flux towers are now organizing a long-term network, called Ameriflux, to monitor ecosystems as diverse as tundra, cropland, and old-growth forest. The towers will measure CO2 about 10 times a second over the next 3 years. The Ameriflux team is also linking up with a 3-year-old network in European forests and other towers in Japan, the Amazon, Australia, Siberia, and Southeast Asia. Called Fluxnet, the data-sharing project is funded by NASA, which wants to use the data to calibrate an Earth Observing Satellite (slated for launch next year) that will estimate how much CO2 plants absorb. Another aim is to pool data on the World Wide Web so that modelers can combine them with data from inventories, satellites, land-use studies, and CO2 measurements from airplanes. They can then test predictions of how much carbon the different ecosystems now sequester and how much they will absorb as greenhouse gas levels rise.

Early results are exceeding expectations. For example, tower data appear to confirm that in temperate zones, landscapes nearer to the equator are likelier to serve as CO2 sinks. In Italy, for instance, forests absorb as much as 5 tons per hectare each year. The amount stored drops off further north, and a Swedish boreal forest, where peat may have begun to thaw, actually releases about 0.5 tons of carbon per hectare a year. Thus, the towers may be more helpful than expected in closing in on the missing sink, proponents say. Inspired by these results, NCAR's Schimel has suggested to a White House panel on climate change research that Ameriflux expand its network to perhaps 100 towers, if the cost per tower could be brought down.

Some are even more optimistic. At a Fluxnet meeting last month, scientists reported preliminary findings that European forests absorb a net total of up to 0.28 petagrams of carbon a year—a third of the continent's industrial emissions. According to Valentini, who directs Euroflux, the next step is to add data from grasslands and croplands and plug them into “more sophisticated models” of fluxes between soils, plants, and the atmosphere.

A global network of 250 towers coupled with satellite and weather data might allow monitors to see whether countries are living up to their Kyoto commitments, Valentini argues. But such statements make some U.S. scientists antsy. If the towers are seen as “tools for the carbon police,” says one Ameriflux researcher, Congress may set out to kill the flux program. Others worry that tower fever will lead agencies to underfund other methods needed to ferret out the missing sink. Even with a larger network, it will be difficult to extrapolate local CO2 fluxes to a regional level, says land-use expert Richard Houghton of the Woods Hole Research Center in Massachusetts. “I don't think they can do it at the accuracy you'd need,” he notes.

Flux tower scientists acknowledge that the program is still proving itself—but they say it is on the right track. Until recently, Wofsy admits he had doubts about how useful the towers would be for closing the carbon cycle: “A year ago, I would have said we're not trying to do that.” Now, he adds, the data are more encouraging. “We might be able to make more progress than we thought.”

15. NEUROGENETICS

# New Gene Tied to Common Form of Alzheimer's

1. Jean Marx

A mutation in a protein that may help scour toxins from between neurons appears to increase the risk of late-onset Alzheimer's disease

Over the past half-dozen years, researchers hoping to pin down the cause of the devastating brain degeneration of Alzheimer's disease have seen their list of potential culprits grow. They've found, for example, that mutations in any of three different genes can cause some cases of early-onset Alzheimer's, which strikes in middle age. In addition, they've identified a variant of another gene that increases an individual's risk of developing the much more common form of the disease that occurs later in life. But researchers have been all too aware that none of these discoveries could fully explain the late-onset Alzheimer's disease that afflicts so many families. Now, they have an important new suspect to add to their lineup.

Earlier this week, at the Sixth Annual International Conference on Alzheimer's Disease and Related Disorders, which was held in Amsterdam, neurogeneticist Rudy Tanzi of Harvard's Massachusetts General Hospital in Boston reported genetic evidence indicating that a common mutation in the gene encoding a protein called α2-macroglobulin (α2M) makes the people carrying it more susceptible to developing the neurodegenerative condition as they age. (The results will also appear in the August issue of Nature Genetics.) At present, no one knows how many Alzheimer's cases might be linked to the mutation, but the number could be large, given that an estimated 30% of the population carries the mutation. “I think that [the new mutation] is probably the strongest risk factor for whether you get Alzheimer's late in life—as strong as or stronger than ApoE4,” says Alzheimer's expert Sam Sisodia of the University of Chicago, referring to the only other gene currently linked to the late-onset form of the disease.

The new gene and its protein could also make sense of how several other proteins already implicated in Alzheimer's might contribute to the disease. Work by other researchers suggests that the normal α2M protein acts as a kind of cleanup crew for neurons by binding to several proteins that could have toxic effects and sweeping them out of the space between neurons. These include, for example, the small protein β amyloid, already notorious as a possible cause of Alzheimer's. The mutation may put this cleanup crew out of commission, or at least slow it down, leading to β amyloid deposition and nerve cell death.

ApoE4, a variant of a lipid-carrying protein called apoE, may fit into this picture as well, for one way α2M may prevent β amyloid deposition is by binding the peptide and transporting it into cells for degradation—a step that uses the very same receptor that apoE uses to enter cells. ApoE4 or excess amounts of other apoEs might block the α2M-β amyloid complex from binding to the receptor, preventing the cleanup crew from removing its sweepings. All that makes the Tanzi team's discovery “scientifically very interesting,” says Steven Hyman, director of the National Institute of Mental Health (NIMH)—and perhaps a clue to new Alzheimer's therapies.

The apoE4 link was one of the clues that first alerted Tanzi and his colleagues to the gene encoding α2M. They reasoned that if ApoE4 is a risk factor for Alzheimer's, then other proteins that bind to the apoE4 receptor, a cell surface protein known as LRP (for low-density lipoprotein receptor-related protein), might be risk factors as well. α2M, which was then known primarily as an inhibitor of many of the body's proteases, or protein-splitting enzymes, is one such protein. The researchers then performed a standard genetic association analysis to see whether a common variant of the A2M gene that carries a particular five-nucleotide deletion is associated with Alzheimer's in a group of families that they and teams at Johns Hopkins University School of Medicine and the University of Alabama at Birmingham had collected under the aegis of the NIMH's Alzheimer's Disease Genetics Initiative. The results of this test were disappointing, however.

But other work began suggesting that the researchers might nonetheless be on the right track. “While we were screening,” Tanzi recalls, “papers began appearing implicating α2M biologically in Alzheimer's disease.” For example, Dennis Selkoe's team, also at Harvard, found that α2M has paradoxical effects on one protease: While preventing the protease from degrading large proteins, it apparently triggers the protease to break down β amyloid, an action that should prevent toxic β amyloid deposits from forming. The α2M protein also interacts with a variety of cytokines, molecules that influence immune activity, suggesting that it might somehow damp down dangerous inflammatory reactions in the brain.

In addition, several teams, including those of Steven Paul at Lilly Research Laboratories in Indianapolis; Sudhir Sahasrabudhe at Hoechst Marion Roussel Inc. in Bridgewater, New Jersey; and Guojun Bu at Washington University School of Medicine in St. Louis, showed that α2M binds to β amyloid itself, with two potentially protective effects. In test tube assays, it prevents the formation of the insoluble β amyloid fibrils that are considered most toxic to neurons—and does so effectively enough, Paul's group showed, to protect cultured neurons against β amyloid toxicity. What's more, Bu's team showed that cells take up and degrade the α2M-β amyloid complex, apparently as a result of its binding to the LRP receptor.

These protective functions suggested that a defective A2M gene should be an Alzheimer's risk factor, so the Tanzi team tried again to find an association in the Alzheimer's families. This time, though, they turned to a powerful new method of analysis called “family-based association.” Using methods developed by statistician Nan Laird of the Harvard School of Public Health and Tanzi's Harvard colleague Deborah Blacker, the researchers compared the frequencies of the mutant A2M allele in Alzheimer's patients and their unaffected siblings. And here they struck paydirt.

The analysis showed a highly significant association between the A2M deletion and the presence of Alzheimer's disease—as strong as the association with ApoE4, Tanzi says. For instance, when researchers removed the possible confounding influence of ApoE4 by looking only at families lacking that ApoE variant, they found that the frequency of the mutant A2M gene in Alzheimer's patients was four times greater than in their unaffected siblings. The result was “pretty striking,” Tanzi says. “Somehow not having the deletion [in A2M] protects you against Alzheimer's disease when you get old.”
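The heart of the family-based comparison is contrasting the mutant-allele frequency in patients with that in their unaffected siblings. A minimal sketch with hypothetical counts, chosen only to illustrate the fourfold difference the article cites (they are not the study's data, and the real analysis uses specialized statistics that account for relatedness within families):

```python
# HYPOTHETICAL illustration of the core comparison in a family-based
# association test: mutant A2M allele frequency in Alzheimer's patients
# versus their unaffected siblings. Counts are invented for illustration.
def allele_freq(mutant_chromosomes, total_chromosomes):
    """Fraction of sampled chromosomes carrying the mutant allele."""
    return mutant_chromosomes / total_chromosomes

affected_freq = allele_freq(40, 200)   # hypothetical: 40 of 200 chromosomes
sibling_freq = allele_freq(10, 200)    # hypothetical: 10 of 200 chromosomes

ratio = affected_freq / sibling_freq
print(f"patients {affected_freq:.2f} vs unaffected sibs {sibling_freq:.2f}"
      f" -> {ratio:.0f}x enrichment")  # → 4x enrichment
```

Because siblings share family background, the comparison sidesteps the population-stratification artifacts that can produce spurious hits in ordinary case-control association studies.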

At least one additional recent genetic study also points to A2M as a possible Alzheimer's gene. Alison Goate of Washington University School of Medicine and her colleagues have been conducting a genomewide screen looking for such genes. In a paper in press at the Journal of the American Medical Association, they report that a genetic linkage study of a subset of the families studied by the Tanzi team picked up the region on chromosome 12 where A2M is located. “I can't say our linkage is to A2M, but I can say it is a good candidate for what we are seeing,” Goate says. What's more, the deletion the Tanzi team is studying may not be the only A2M mutation associated with Alzheimer's. Brad Hyman of Harvard, working with Tanzi, has identified a second, independent mutation in the gene that also appears to increase the risk of the disease.

If these common mutations do predispose carriers to Alzheimer's, the discovery could help explain a puzzling observation about ApoE4. Some people carrying ApoE4, even those with two copies, appear not to get Alzheimer's, an idea supported by another study reported in the August Nature Genetics. In it, John Breitner of Johns Hopkins School of Medicine and his colleagues screened nearly 5000 people for Alzheimer's disease and determined each person's ApoE4 status.

In agreement with previous work, they found that people with two ApoE4 copies get the disease earlier than people with one, who get it earlier than people with none. But the analysis also indicated that some ApoE4 carriers would not get the disease no matter how long they lived. “This suggests that there is a window of time in which if you're going to get Alzheimer's disease, you do. ApoE4 is defining the window, but not who is getting it,” Breitner says. Other genes—such as A2M—may determine the “whether,” he suggests.

Still, before researchers conclude that A2M mutations do in fact play such a role, they would like to see the Tanzi team's finding confirmed in other populations besides the NIMH families. Even before that happens, though, they will begin focusing on just how the A2M mutations might lead to Alzheimer's. “It's certainly exciting, but at this point, it's still early and it's not clear how the biology of this will work out,” Paul says.

But thanks to the previous biological work, researchers have several hypotheses to test. They will want to know, for example, whether the deletion affects α2M's ability to carry β amyloid into cells or prevent β amyloid fibril formation. They will also want to test whether ApoE4 fits into the picture as Tanzi and others propose. Paul's team has already shown that mice lacking the gene for apoE are protected against amyloid deposition, although it remains to be seen whether that's because apoE can interfere with α2M's cleanup operation. If it does, that could explain how apoE4 accelerates the development of Alzheimer's.

Those kinds of possibilities, sure to spur a new round of research, are what make the new work “very, very interesting,” says neurobiologist Steven Moldin of NIMH. “It offers a set of very explicit hypotheses that can be tested.”