# News this Week

Science  19 Aug 2005:
Vol. 309, Issue 5738, pp. 1162
1. WILDLIFE BIOLOGY

# 'Genetic Rescue' Helps Panthers but Puts Researchers on the Spot

Ten years ago, the Florida panther seemed on the brink of extinction. Now, a new analysis concludes that a risky experiment to reinvigorate the panther population has paid off. But both the conclusions and the methodology of the analysis are proving controversial.

In 1995, wildlife biologists transplanted eight female panthers from Texas to south Florida in a last-ditch attempt to reverse the worrisome effects of inbreeding, including heart murmurs and defective sperm. A team of biologists led by ecologist Stuart Pimm of Duke University in Durham, North Carolina, has now analyzed a decade's worth of panther data and concluded that hybrid cats with Texas ancestry are surviving better than purebred Florida panthers and expanding the species' range of habitats. “This will be the strongest demonstration that a genetic introduction program can have a major positive impact on an endangered species,” says conservation biologist Paul Beier of Northern Arizona University in Flagstaff.

The work, which is being released this week by Animal Conservation, is not without critics. Some doubt the introduction of Texas panthers deserves full credit for the population rebound. The group officially compiling the data analyzed by Pimm's team is also crying foul.

The decision to transplant endangered animals, especially large, charismatic predators such as panthers, is a political and scientific hot potato. Most efforts, such as the return of wolves to Yellowstone National Park, have been reintroductions into areas with no survivors. One exception is an effort with the prairie chicken, which was deemed a success (Science, 27 November 1998, p. 1695). Pimm had doubts that a population as inbred as the Florida panthers would benefit, arguing that conservation efforts ought instead to focus on land conservation and restoration.

After lengthy consultations with scientists and stakeholders, state and federal agencies permitted the capture of eight females in Texas and their release in south Florida. Biologists with the Florida Fish and Wildlife Conservation Commission (FFWCC) joined contractors in tracking the Texas panthers using radio collars and studying 54 offspring from the five Texas females that reproduced.

FFWCC publishes an annual report on the panthers, but it has not yet published a peer-reviewed study of the introduction. The delay frustrated Pimm. Working with Oron “Sonny” Bass Jr., an experienced panther biologist at Florida's Everglades National Park, and Duke doctoral student Luke Dollar, Pimm combed through the demographic data and movements of panthers contained in the annual reports.

Pimm's group calculates that the survival rate of the hybrid kittens was three times higher than that for the 118 purebred Florida kittens. Once adults, hybrid females also survived “considerably better” than purebreds did, Pimm says. Hybrid males, however, had shorter life spans than purebreds.

Still, the team concluded that hybrid males are expanding into new habitats, such as grasslands. That finding is controversial because it contradicts the long-standing policy of FFWCC and the U.S. Fish and Wildlife Service, which have determined that forests are the key for panther survival, a view that was contested by a scientific review team appointed by the agencies in 2002. “You certainly don't want to give up areas to developers by assuming that panthers cannot occupy them,” says Pimm.

However, David Maehr of the University of Kentucky in Lexington, who led the FFWCC panther team from 1985 to 1994, maintains that panthers depend on forests and says he has a paper in press that will bolster this view. Any expansion by panthers into Everglades National Park may not last, he says: “It is dangerous to suggest that these often-flooded and low-prey-density areas will be a long-term benefit to panther recovery.”

Maehr and other critics have additional bones to pick with Pimm's analysis. To compare kitten survival rates, for example, Pimm pooled all hybrids, irrespective of how much Texas ancestry they had, and set them against purebreds. “I would have much more confidence if the people who had collected the data had made this conclusion,” says Phil Hedrick, a population geneticist at Arizona State University in Tempe. “I think they're being much more cautious than Pimm is.”

Darrell Land, FFWCC's current panther team leader, says that Pimm's team may have acted unethically. “We feel they are seeking to publish other people's data. They never talked to us,” says Land, noting that he and others have been working on a publication. But John Gittleman of the University of Virginia in Charlottesville, an editor of Animal Conservation, disagrees because the data are public: “I think that independent assessment is perfectly within their rights.”

2. EVOLUTION

# Kansas Prepares New Standards

1. Yudhijit Bhattacharjee

The Kansas Board of Education last week endorsed science standards that would allow for the teaching of alternatives to evolutionary theory. Scientists say the new draft standards are a thinly disguised attempt to slip intelligent design (ID) into the curriculum by highlighting uncertainty and gaps in current scientific thinking. But it's an open question whether they will translate into changes in the classroom.

The 6-4 vote by the deeply divided board represents the latest skirmish in a long-running battle that has attracted national attention. The new standards follow May hearings that were boycotted by national scientific organizations, which saw them as a way to confer scientific legitimacy upon ID. The hearings were scheduled after an advisory panel set up by the board to revise the standards voted against including alternatives to evolution. The board is expected to adopt the standards this fall after an external review.

The 123-page draft document* calls on students “to learn about the best evidence for modern evolutionary theory, but also to learn about areas where scientists are raising scientific criticisms of the theory.” Board member Kathy Martin, who voted with the majority, says that “these standards will ensure that our students learn to analyze scientific evidence critically. … They are the best thing to have happened to education in Kansas.”

That's not what most scientists think, however. Although the standards do not mention ID—the idea that some features of living systems are best explained by an intelligent cause—the draft “is littered with language that is routinely used by intelligent design advocates,” says Steven Case, committee chair and a biologist at the University of Kansas (KU) in Lawrence. The Kansas draft standards, he and others say, contain distorted definitions of evolutionary concepts and misstatements about biology. Biological evolution, for example, is described as “postulat[ing] an unguided natural process that has no discernable (sic) direction or goal”—a statement that Case says introduces the false idea that science addresses the purpose and meaning of natural phenomena. And Case says the statement that “the sequence of the nucleotide bases within genes is not dictated by any known chemical or physical law” deliberately ignores the fact that scientists are still exploring the organization of nucleotide bases. “If you say the sequences are not dictated by any known chemical or physical law, which is itself untrue, you could go one step further and ask if the sequences are dictated by a divine law,” says Case.

The new standards may not represent anything more than a moral victory for ID proponents, however. None of the controversial items in the standards has been marked for assessment, which means they won't show up in state assessment tests, says John Poggio, co-director of KU's Center for Educational Testing and Evaluation, which designs and coordinates those examinations. And because most school districts tailor their curriculums to the tests, he adds, the revisions may have little impact on the classroom.

Even so, Poggio says test designers might drop some evolution-related questions from the tests. Martin sees that as an ideal solution, arguing that “some students have deeply held convictions about this topic, which puts them at a disadvantage while answering questions on a test.”

Apart from battling the standards, many scientists have also targeted a statewide election in November 2006 involving the seats of five board members, including four conservatives. Sue Gamble, one of the four board members who opposed the standards, says that a wholesale reshuffling is the only way to stop “this assault” on science education. But she worries that a debate over evolution might “polarize the state further” and overshadow the bigger issue of how best to train Kansas students for the workplace.

3. HIGH-ENERGY PHYSICS

# Costs Force NSF to Cancel Brookhaven Project

1. Jeffrey Mervis,

The National Science Foundation (NSF) has withdrawn its support for a high-energy physics project planned for the Department of Energy's Brookhaven National Laboratory in Upton, New York, after deciding that its budget couldn't handle the soaring costs. The decision, unusual for NSF, effectively kills the Rare Symmetry Violating Processes (RSVP) project just before construction was to begin on its two massive detectors.

“These are compelling experiments, and the scientific rationale for doing them is still strong,” says Michael Turner, head of NSF's math and physical sciences directorate. “It was a very difficult decision, but the increased costs were too much to bear.”

RSVP consisted of twin experiments. One, MECO, would have examined whether a subatomic particle called the muon could transform into an electron, an interaction not allowed by the prevailing theory of particles, the Standard Model. The other, KOPIO, would have looked for unexpected differences in the behavior of matter and antimatter by studying a specific decay of a particle called a K0 meson to another called a π0 meson, a neutrino, and an antineutrino. The rare decay is allowed by the Standard Model, but researchers hoped to see a deviation from the predicted rate, which would be a sign of undetected particles or interactions.

Originally approved in 2000 as a $145 million project at Brookhaven, RSVP last fall received its first $15 million in construction funds from Congress. That triggered a fresh review of the project that bumped its construction costs to $282 million. Its lifetime operating costs tripled, from $80 million over 5 years to $250 million over 8 years.

The main culprit in the increase was a required upgrade of the lab's aging Alternating Gradient Synchrotron (AGS), the accelerator that would provide a beam of protons for the experiments. Since 2002, AGS has been used primarily to feed particles into the much larger Relativistic Heavy Ion Collider (RHIC), which studies nuclear physics. Not only did AGS need to be tweaked to meet the more exacting requirements of RSVP but also its entire operating budget would have fallen on RSVP if RSVP outlasted RHIC.

Turner says these added costs had to be weighed against the potential scientific gains from several large physical science projects on the drawing board, including an underground laboratory to house experiments in physics, geology, and biology; a giant segmented mirror telescope; and an energy-recovery linear accelerator that would power an x-ray source for materials science research. In addition, he says RSVP's higher operating costs would have eaten into the directorate's existing budget for investigator grants.

Scientists involved in RSVP say that they anticipated the foundation's decision after both House and Senate spending panels this spring yanked the project from NSF's 2006 budget request. “Given Congress's position, I didn't see what else the National Science Board could do,” says Michael Zeller of Yale University, co-spokesperson for KOPIO, RSVP's matter-antimatter experiment.
RSVP's demise opens the field to non-U.S.-led efforts, notably the MEG experiment to begin next year at the Paul Scherrer Institute in Villigen, Switzerland, and a pair of proposed experiments at the Japanese Proton Accelerator Research Complex in Tokai. Meanwhile, with the exception of neutrino experiments, all accelerator-based particle physics experiments in the United States will likely shut down within a few years. “To see the accelerators coming to an end in the U.S.—if they are—is amazing to me,” says William Willis, a physicist at Columbia University and project manager for RSVP. “Things looked a lot different a few years ago.”

4. U.S. POLAR SCIENCE

# NSF Taps Russian Vessel for Antarctic Icebreaking

1. Jeffrey Mervis

With one eye on its wallet and the other on Congress, the National Science Foundation (NSF) has decided to charter a Russian icebreaking vessel this winter to clear a path to its major research station in Antarctica. The cost-saving move appears to be at odds with pending Senate language that NSF should continue its historic reliance on U.S. government ships (Science, 1 July, p. 31). But the decision dovetails with a new report from an NSF advisory panel recommending less costly and more reliable ways to resupply McMurdo Station, the hub of NSF's Antarctic operations.

McMurdo, the largest of NSF's three Antarctic stations, sits at the end of a sound that must be cleared of ice every austral summer. The workhorses of that effort have been two 30-year-old icebreakers owned and operated by the U.S. Coast Guard. But the NSF panel calls this resupply system “inherently risky.” The Coast Guard ships are increasingly frail, it notes, a condition exacerbated by the calving of a massive iceberg in 2000 that produced unusually thick and persistent sea ice in the sound.
The system is also expensive and inefficient: In addition to growing maintenance and repair costs, the ships themselves consume about a quarter of the nine million gallons of fuel delivered each year to operate McMurdo and the inland South Pole station.

Earlier this year, NSF hired the Krasin, a Russian-owned and -operated icebreaker, to help the U.S. Polar Star crunch through the ice (Science, 21 January, p. 338). This winter, says NSF polar chief Karl Erb, the agency wants to use the Krasin as the lead dog and hold the Polar Star in reserve. “The Coast Guard thinks we should have two icebreakers,” he says, “but we think that one will do it because storms have pushed the icebergs away.” It's a win-win situation, he says. “The Krasin is cheaper to operate and more fuel-efficient,” he notes. “The $5 million we'll save by keeping Polar Star in reserve could be put toward fixing the Polar Sea. And the Polar Star will be available next year [when it's otherwise scheduled for major repairs] if we don't use it this year.” Last week, the plan was endorsed by the National Science Board, NSF's oversight body.

Deferring to a foreign vessel isn't what Senator Patty Murray (D-WA), who represents the state where the icebreakers are berthed, was thinking when she slipped restrictive language into NSF's pending 2006 budget. “The NSF director shall procure polar ice breaking services from the Coast Guard,” says the Senate report accompanying the spending bill. NSF is allowed to shop elsewhere “if the Coast Guard is unable to provide” such services, it notes, before adding that NSF and the White House should “work jointly to ensure that the Coast Guard fleet is capable of meeting NSF's future ice breaking needs.”

That language could be altered or dropped in an upcoming conference to reconcile differences with the House, which told NSF in its report to use “the most cost-effective means of obtaining icebreaking services.” NSF Director Arden Bement says he hopes that legislators will see the benefits of leasing the Krasin, which he says is consistent with existing U.S. policy to ensure access to Antarctica and promote polar science. NSF is responsible for carrying out that policy in a fiscally and environmentally prudent manner, he notes. The Coast Guard says the decision rests with NSF; Murray's office declined comment.

Meanwhile, the advisory panel to NSF's polar programs presented Erb with 68 pages of innovative options to reduce NSF's dependence on icebreakers and, at the same time, improve operations throughout the Antarctic continent. Their proposals include building a runway that would allow the South Pole station to be resupplied by planes from New Zealand, improving NSF's ability to move supplies over land to the pole and various remote field sites, and running a leaner operation at McMurdo. They also suggest that NSF explore using heavy-lift blimps and contracting with commercial operators to reduce its dependence on military transportation. Even so, the report notes that NSF may someday need access to a new icebreaker capable of resupplying McMurdo.

5. SPACE AND EARTH SCIENCES

# Budget Woes Greet NASA Science Chief

1. Andrew Lawler

An engineer and former astronaut with a background in biology is taking the helm of NASA's $6 billion science program. Mary Cleave, who has been briefly in charge of the space agency's beleaguered earth science effort, now faces the tough task of reining in spiraling costs on several major science projects and ensuring the repair of the Hubble Space Telescope. At the same time, she'll try to protect the overall research budget from cuts to feed the space shuttle, station, and a new space flight vehicle that is central to the exploration vision of President George W. Bush.

Cleave is one of several senior appointments made last week by NASA chief Michael Griffin, who has known her for years. But despite her current job, she is not a familiar face to space and earth scientists. “She doesn't have experience doing science, and she doesn't have long experience working with the scientific community,” says one researcher who has worked with Cleave. He adds, however, that she is “very focused” on abiding by the research goals laid out by recent reports on long-term planning for astronomy, solar system exploration, and earth sciences from the National Academies in Washington, D.C.

NASA chief scientist James Garvin, who has known Cleave for a decade, predicts she will be “a strong, pro-science” manager and that her experience with human space flight will help her make the case for research. Charles Kennel, director of the Scripps Institution of Oceanography in San Diego, California, and chair of the NASA Advisory Council, adds that “during the next few years, science will be the engine of NASA's public relations success. [Cleave's] job needs someone who understands the science and can create the strong support to carry out its science mission.”

Cleave studied microbial ecology and civil and environmental engineering and flew twice on the space shuttle in the 1980s. She was NASA project manager for an ocean color sensor spacecraft before joining headquarters in 2000 and only became chief of earth sciences last year after the office was merged with the space science office. Her new deputy, Colleen Hartman, is a physicist who ran NASA's solar system program before working in the White House Office of Science and Technology Policy and the National Oceanic and Atmospheric Administration.

One of Cleave's first challenges will be to manage the $1 billion cost overrun on the James Webb Space Telescope now under development. An internal report due next month is expected to consider alternatives that include drastically scaling back the instrument's capabilities. Another challenge, the fate of the Hubble, won't be resolved until after the shuttle flies again, an event that could be pushed back until as late as next spring. Both the Webb overrun and the Hubble mission could force Cleave to cut back in other areas.

Despite Cleave's background, oversight of biological sciences will fall not to her but to the space operations office led by engineer William Gerstenmaier, until now chief of the space station effort. This month, NASA began to cancel contracts to build long-planned facilities related to biological research on the space station, including work on the advanced animal habitat and the plant research units.

6. IMMUNOLOGY

# Versatile Development Gene Aids Insect Immune Response

1. Elizabeth Pennisi

Just like people, insects get infected by a multitude of microbes. But unlike people, they don't produce millions of distinct antibodies that can bind to and thwart pathogens with great specificity. Instead, insects were thought to depend on just a small number of molecules that recognize features common to many microbes.

But new results published online this week in Science (www.sciencemag.org/cgi/content/abstract/1116887) from a group at Harvard Medical School in Boston, Massachusetts, point to a more complex insect immune system. Drosophila melanogaster can muster its own army of proteins against microbial invaders, says Harvard's Dietmar Schmucker, a developmental neurobiologist.

To fight infection, the fruit fly has harnessed a complex gene previously known for its role in differentiating nerve cells and guiding their extensions, called axons, Schmucker and his colleagues report. The gene, called Dscam for Down syndrome cell adhesion molecule, stands out among genes because it has 116 coding regions, most of which can mix to encode up to 38,000 subtly different proteins in neurons. “We had thought that Dscam has a role exclusively in axon patterning,” says James Clemens, a neuroscientist at the University of California, Los Angeles. That the gene works in the immune system, too, “is a very intriguing discovery.”

Although much more work needs to be done to establish Dscam's immune function, the findings hint that the gene's molecules function like primitive antibodies, guiding scavenging cells to particular pathogens. “It could be an early step” in the evolution of adaptive immunity, the ability of an immune system to remember and respond ever more effectively against infection, suggests Brian Lazzaro, an evolutionary geneticist at Cornell University.

Over the past 5 years, researchers have established that the proteins made by Dscam in insects vary from nerve cell to nerve cell, helping define neuronal identities (Science, 6 February 2004, p. 744). This reminded Schmucker of the specificity seen in vertebrate immune cells and prompted him to look beyond the nervous system for Dscam proteins.

Using antibodies that recognize such proteins, Schmucker's postdoc Fiona Watson found the molecules in fruit fly hemolymph—the insect equivalent of blood serum—and on the surfaces of fat body cells and immune cells called hemocytes. Graduate student Roland Püttmann-Holgado also showed through microarray studies that the insect's immune system used a wide variety of Dscam proteins.

When Watson inhibited Dscam expression in hemocytes using the RNA interference technique, she found that they gobbled up 30% fewer bacteria. In other tests, the researchers demonstrated that the versions of Dscam made by fruit flies bound with different affinities to the bacterium Escherichia coli, possibly indicating that variants are tuned to specific pathogens. “Maybe these [Dscam proteins] allow fruit flies a sophistication that we haven't seen before,” says Schmucker.

Vertebrate immune and nervous systems are also known to share genes. In 2003, for example, researchers discovered that a key vertebrate immune system gene complex that forms the unique MHC molecules on the surface of T and B cells is also active in the nervous system. “I think we will find other examples of this,” says Clemens.

Whereas the human version of Dscam encodes only a few proteins and has no obvious immune role, Schmucker notes that flour beetles—which diverged from fruit flies about 250 million years ago—use Dscam proteins in the same way as fruit flies do. It “is clearly a very ancient process” in insects, says Brenton Graveley, a molecular biologist at the University of Connecticut, Farmington.

7. ENVIRONMENTAL SCIENCE

# Sperm Whales Bear Testimony to Worldwide Pollution

1. Dan Ferber

Early results are in from the first-ever global survey of toxic contaminants in marine mammals—and they're not pretty. Sperm whales across the Pacific, even in midocean areas thought to be pristine, are accumulating human-made chemicals called persistent organic pollutants (POPs). DDT was the most common pollutant, followed by polychlorinated biphenyls. The survey's sponsor now plans to take a similar worldwide look at contaminants in people.

“It doesn't matter where you are, these animals are polluted,” says biologist Roger Payne, president and chief scientist of the Ocean Alliance, a Lincoln, Massachusetts-based conservation group that funded the whale work. Data from the survey were slated to be announced this week after the research vessel Odyssey sailed into Boston Harbor, completing its 5-year investigation of pollution across the world's marine food webs (Science, 11 June 2004, p. 1584). The Odyssey's 12-person crew surveyed sperm whales, which range the globe and eat fish and giant squid. These massive mammals were thought to accumulate POPs in their tissues, making them a likely indicator of the health of the world's oceans.

Researchers shot nearby sperm whales with an arrow that removes a small core of skin and blubber without harming the whale. Samples from 424 whales were then analyzed by toxicologist Celine Godard of the University of Southern Maine in Portland. Her preliminary findings showed that whales in the Sea of Cortez, between the west coast of Mexico and Baja California, had nearly twice the levels of CYP1A1, an enzyme that detoxifies pollutants, as whales in an area of the mid-Pacific thousands of kilometers from land. One suspected cause for the disparity is agricultural runoff. (Whales near the Galápagos Islands have even higher CYP1A1 levels, but Payne is not sure why.) To make sure regional variations are real, the team is measuring contamination in tissue samples from prey species that never leave the region, says toxicologist John Wise of the University of Southern Maine.

Preliminary tests by ecotoxicologist David Evers and colleagues at the BioDiversity Research Institute in Gorham, Maine, show that mercury levels were higher in skin samples from sperm whales near the Galápagos and in the Sea of Cortez compared with whales elsewhere in the Pacific. Sperm whales may provide a much-needed global standard to compare mercury pollution in different regions, Evers says.

Peter Ross of the Canadian Department of Fisheries and Oceans predicts that the results, once published, will “build a case that these chemicals move around the planet with relative impunity.” Payne's team is planning to circumnavigate the globe in 2006 and 2007 to test for pollutants in people who live near especially contaminated areas.

8. ASTRONOMY

# Second Failure Cripples Suzaku Satellite

1. Dennis Normile*
1. With reporting by Robert Irion.

TOKYO—Astronomers trying to answer questions about the evolution of galaxies and the mechanics of black holes cheered mightily last month when Suzaku, a joint U.S.-Japanese satellite, settled into its orbit around Earth. Launched on 10 July, Suzaku was a replacement for a 2000 mission lost due to a rocket failure. For 19 days, its main instrument, the x-ray spectrometer (XRS), worked perfectly during calibration tests, measuring the energy of individual x-ray photons to an unprecedented level of accuracy. “We thought we were on our way,” says Richard Kelley, XRS principal investigator for NASA, which jointly developed the mission with the Japan Aerospace Exploration Agency (JAXA).

Then they started noticing a glitch. To achieve its unprecedented resolution, XRS uses liquid helium and frozen neon packed around the instrument in a cryogenic container called a Dewar to maintain a supercooled temperature of 0.06 kelvin. On 29 July, anomalous temperature readings led controllers to conclude that helium was leaking into the Dewar's vacuum space. The leaks were sporadic. But in one climactic incident last week, enough helium entered the vacuum space to degrade its insulating capabilities. The remaining helium evaporated into space, rendering XRS useless.

“Now there is a lot of frontier science we just won't be able to do,” says Hajime Inoue, an astrophysicist and project manager for Suzaku at JAXA's Institute for Space and Astronautical Science. Timothy Heckman, an astronomer at Johns Hopkins University in Baltimore, Maryland, planned to use Suzaku to study winds of hot gas ejected from galaxies rich with newborn stars. XRS would have gauged the wind speeds and the specific gas ingredients for the first time. “This was a revolutionary capability to help us understand how galaxies evolve and propel heavy elements into space,” Heckman says.

Kelley says that the spectrometer's brief performance validated its design and engineering. The failure of its cryogenic system is expected to spur a search for alternative mechanical cooling schemes on future missions, such as NASA's proposed Constellation-X mission, which would use similar ultracooled instruments on four satellites to measure x-rays with exquisite sensitivity.

Suzaku carries two instruments that are unaffected by the loss of the cryogenics and are still functioning. One is the hard x-ray detector, and the other is a collection of four x-ray charge-coupled-device cameras. Together, the instruments cover a wide energy range that Inoue says should provide new data on violent astrophysical phenomena occurring near black holes and within active galaxies, which are centered on supermassive black holes. The original observation program was based on using XRS. Mission managers will now select other observational targets to make best use of the surviving instruments.

9. NUCLEAR POWER

# Is the Friendly Atom Poised for a Comeback?

1. Eliot Marshall

The threat of global warming and high fossil fuel prices have inspired talk of a revival of nuclear power, but skeptics say it is a poor investment and a worse security risk

“Nuclear power faces stagnation and decline.” So warned a group of scientists in a sweeping review published 2 years ago by the Massachusetts Institute of Technology (MIT) in Cambridge.* Led by chemist John Deutch and physicist Ernest Moniz, both of MIT, the study concluded that nuclear power was in trouble and deserved a helping hand from government. Despite high construction costs, the authors argued, the United States should triple the number of nuclear power plants by midcentury because they can deliver electricity without emitting greenhouse gases such as CO2. The MIT group proposed a hefty tax on carbon emissions to help get this cleaner energy source moving.

The political and economic environment has changed dramatically since that report came out. On 8 August, President George W. Bush signed into law the first major U.S. energy bill in a decade. Although it does not tax carbon, it promises subsidies across the board for new investments in renewable energy, such as wind and solar power, and a grab bag of more than $6 billion in benefits narrowly tailored for builders of new nuclear reactors (Science, 5 August, p. 863).

The bill was a plum for the nuclear power industry—one of several events that have got people talking about a “nuclear renaissance.” Indeed, that's the title of a book published earlier this year by physicist and energy policy analyst William Nuttall of the University of Cambridge, U.K. One reason for optimism, Nuttall points out, is that oil and natural gas prices have shot up since 2003, making non-fossil-fuel energy more attractive. Meanwhile, some public leaders have cited nuclear power as a way to reduce the impact of global warming—and even some environmental advocates seem to agree.

Although a few Asian countries never got off the nuclear bandwagon, new ones are now climbing aboard to meet rapidly growing electricity demand. India, with the most reactors under construction in the world, is planning a unique system that relies mainly on thorium rather than uranium fuel (see p. 1174). Japan continues work on fast neutron reactors that can “breed” plutonium (see p. 1177). And China announced in April that it will more than quadruple its nuclear electric capacity by 2020, buying among other designs a new “pebble bed” reactor that shuts down if it overheats. Nuclear advocates in the West also hope that advanced reactor designs can help overcome the lingering memories of Three Mile Island and Chornobyl (see p. 1172).

Does all of this amount to a nuclear renaissance?
Skeptics point out that it would take a huge leap in the pace of plant construction simply to maintain nuclear power's current global share of electric output—about 17%—let alone increase it. Many aging U.S. and European reactors will have to be dismantled in the next couple of decades. Even new ones remain more expensive than coal- or gas-fired systems. And governments are not imposing stiff taxes on carbon emissions, the one strategy the MIT report said would tip investment decisions toward nuclear. Moreover, even if the economics were to favor nuclear power, two issues will continue to dog the industry: fears of nuclear weapons proliferation and disputes about how to dispose of high-level wastes (see p. 1179).

Optimists still think that the problems can be fixed. Reiterating his view of 2 years ago, Deutch says: “If nuclear power can get its costs down and address the important issues of waste management and proliferation, its future will be very bright.” The next few years may reveal just how bright.

## Apocalypse pending

The threat of global warming is perhaps the key factor in the rethinking of nuclear power. The nuclear industry, in particular, has seized on it as a reason to switch from fossil fuel to the atom. For example, John Ritch, executive director of the London-based World Nuclear Association (WNA), an advocacy group backed by power supply companies, told an audience in Idaho last month that unless the world cuts greenhouse gases, it will “face catastrophic climate change, with the severest consequences for sea levels, species extinction, epidemic disease, drought, and extreme weather events that could combine to disrupt all civilization.” WNA suggests that the best solution would be to raise the number of nuclear electric plants in the world from 441 today to 5000 by the end of the century. That is the most ambitious scheme anyone has proposed, but so far, it has few takers.

A more modest proposal—to maintain the nuclear share of electricity at the current level as a “bridge” to future clean energy technologies—has struck a chord, however. David King, science adviser to the U.K. government, has spoken publicly about the need to keep nuclear power as a clean energy option. Britain, the world's most visible campaigner for action on global warming, faces a common dilemma, as King explained to the Independent newspaper in May. He described a looming “gap” in clean energy production. About 27% of U.K. electricity now comes from nuclear power, he noted, but without a “new build,” only one reactor unit (Sizewell B) will still be running in 2025, producing an estimated 4% of the needed electricity. King said he was “not a great fan of nuclear” but was willing to consider it because “the climate change issue is so important.”

A recent U.K. government forecast lends weight to King's analysis: Solar panels, windmills, and wave-driven generators cannot pick up the slack anytime soon. An electricity strategy issued in May by the U.K. Council of Science and Technology, which reports to King, notes that “the existing policy to reduce CO2 will not be sufficient … since the nuclear stations are likely to be replaced by carbon-based technology (e.g., gas) in the short term.” And even the United Kingdom, which has championed the international effort to curb CO2 emissions, is failing to meet its self-imposed CO2 reduction goals. Physicist David Wallace, vice president of the Royal Society in London, warned in May that “our emissions are clearly going in the wrong direction,” and that U.K. government forecasts of achievable CO2 reductions have been “frankly unrealistic.” Royal Society president Robert May has written that “it is difficult to see how we can reduce our dependence on fossil fuels without the help of nuclear power.”

A few leaders in the green movement have endorsed the idea of using nuclear power as a bridge to cleaner systems in the future—including U.K. ecologist James Lovelock. Creator of the “Gaia” metaphor that describes Earth as a living organism, Lovelock published a broad appeal last year. “Only one immediately available source [of energy] does not cause global warming, and that is nuclear energy,” he wrote. “I entreat my friends in the movement to drop their wrongheaded opposition [to it].” A few others, such as Greenpeace co-founder Patrick Moore, have made similar statements.

But environmental advocacy groups are not following. Stephen Tindale, executive director of Greenpeace International in London, says it's “misleading” to suggest that “the green movement is suddenly embracing nuclear power on the back of Lovelock's statement.” He sees nuclear revival talk as “a big distraction” from the need to invest in moderate-scale, renewable energy systems. He adds that Moore is “vehemently opposed to everything that Greenpeace stands for” and now makes his living “by being anti-Greenpeace.” Likewise, the head of Friends of the Earth in London, Tony Juniper, says, “we have reviewed our position on nuclear power,” in part because of the urgency of the climate change issue, and concluded that it is a “false solution” pushed as part of “a clever public relations campaign” by “nuclear industrial interests.” The Natural Resources Defense Council has also reviewed its policy recently, says NRDC physicist Thomas Cochran in the Washington, D.C., office, and concluded that nuclear couldn't survive without massive subsidies.
As a June NRDC issue paper says, nuclear “suffers from too many security, safety, and environmental exposure problems and excessive costs to qualify as a leading means to combat global warming pollution.”

Cochran offers a scenario to illustrate why he doesn't see nuclear as a good option. He begins with a modest goal: avoiding a small amount (0.2°C) of global warming at the end of this century. He calculates that relying on nuclear electricity for this benefit would require increasing the number of reactors in the world from the current 441 to at least 700 by midcentury and holding that number steady for 50 years. Allowing for retirement of obsolete equipment, he suggests, this will require building 1200 new plants in all, at a rate of about 17 per year. The support requirements, he argues, would be staggering: a dozen new fuel-enrichment plants; the same number of Yucca Mountain-sized waste repositories if there were no reprocessing—or, with reprocessing, hundreds of thousands of tons of material to guard. Because just 8 kilograms of diverted plutonium would be enough to “take out lower Manhattan,” a nuclear renaissance isn't worth the risk, Cochran says.

The MIT review 2 years ago acknowledged that “shortcomings” in the international safeguards on nuclear materials “raise significant questions about the wisdom of a global growth scenario” for nuclear power. It did offer a fix: Tighten up the management of nuclear materials by the International Atomic Energy Agency (IAEA) and persuade France, Japan, Russia, and the United Kingdom to cut down the traffic in plutonium by shutting their reprocessing factories. But those changes have not occurred.

The threat of global warming may not have sparked a nuclear renaissance yet, but it is breathing new life into a debate over nuclear power that, in many countries, had been quiescent for the past few years. •

* “The Future of Nuclear Power,” funded by MIT and the Alfred P. Sloan Foundation, MIT, Cambridge, Massachusetts, 2003.

10. MAP

# Nuclear Power's Expanding Territory

1. Mason Inman

In the past half-century, nuclear fission has emerged from behind a wall of military secrecy to become a widely used source of commercial electricity.

11. REACTORS

# Nuclear Industry Dares to Dream of a New Dawn

1. Daniel Clery*
1. With reporting by Gong Yidong of China Features in Beijing.

Reactor builders think that fossil fuel prices and climate fears will revive nuclear power. But will new reactor designs overcome the concerns of utilities and the public?

The nuclear industry is biding its time. Amid all the hullabaloo about climate change, rising prices of natural gas, dwindling oil stocks, and the environmental impact of wind farms, the makers of nuclear power plants feel that their time is about to come. Sometime soon, they believe, people will realize that the only carbon-free way to keep our society humming along—and fuel the rapidly growing economies of China and the developing world—is to use nuclear reactors. “The signposts are there for a renaissance” of nuclear power, says Peter Wells, marketing manager for GE Energy's nuclear business.

The industry has not been idle during the 2 decades since the Chornobyl accident brought reactor building to a virtual standstill. Designs for light water reactors (LWRs), the main type in use today, have been thoroughly reworked. They are now simpler and incorporate so-called passive safety measures—simple systems that automatically kick in when something goes wrong. A trickle of orders from countries such as Japan, Korea, and China has kept companies afloat, and the energy bill signed by President George W. Bush this month contains generous measures to coax U.S. power utilities to start building nuclear again. But many nuclear experts think that the coming boom will not be a simple rerun of nuclear power's heyday in the 1960s and '70s.

For a start, many more countries want nuclear power, but not all want the 1000-plus-megawatt-sized plants favored by large industrialized nations. They want reactors to be quick to build and safe and easy to run, whereas the leading nuclear nations want to ensure that spent fuel can't be diverted to other purposes. In some cases, the plants may not even generate electricity. Alternative uses include powering desalination plants in arid areas, providing heat for petrochemical processes, and even generating hydrogen for the much-touted hydrogen economy.

In such situations, some experts say, large monolithic LWRs do not fit. Instead, they point to the high-temperature gas-cooled reactor. Plants cooled with air or carbon dioxide have been around for decades, but a few companies are in the process of reinventing them for the 21st century. New-generation plants are cooled with inert helium, which directly drives a gas turbine to generate electricity. They work best at smaller sizes—a few hundred megawatts—and run at much higher temperatures than conventional reactors, between 500° and 1000°C. High temperature makes energy conversion more efficient and suits applications such as hydrogen production.

But perhaps their best trick is that they go one better than passive safety: Their cores are designed so that a runaway nuclear reaction simply can't happen. You can fire up such a reactor to full power, vent away its coolant, pull the control rods right out, and nothing bad will result. “It's a walkaway reactor,” says Dave Nicholls, chief technology officer of South African reactor builder PBMR (named after its Pebble Bed Modular Reactor). “You can come back in a few days and sort things out.”

Enthusiasts say gas-cooled reactors will eventually displace LWRs. Although they don't achieve the economies of scale possible with big plants, reactor builders can make a virtue of their small size by mass-producing components and shipping them to construction sites by road or rail. And if utilities want big megawatts, they can install a battery of small reactors at the same site, sharing facilities. Twenty years from now, “gas-cooled reactors will begin to dominate. Every new reactor ordered will be gas-cooled,” says Mike Campbell, senior vice president at U.S. nuclear company General Atomics.

Not everyone agrees that the nuclear industry is poised for revolution. “All big utilities look at the costs and want the cheapest possible electricity,” says Philippe Garderet, vice president for research and innovation at French reactor company AREVA. “There just isn't a market” for small reactors.

The Bush Administration, however, is prepared to take a gamble. The new energy bill authorizes $1.3 billion for the Department of Energy (DOE) to construct a new experimental nuclear reactor at the Idaho National Engineering and Environmental Laboratory. Industry watchers expect this Next Generation Nuclear Plant (NGNP) to be a high-temperature gas-cooled reactor for producing electricity and hydrogen. “We need to show that gas will work. That's why the NGNP is so vital for the next step into gas,” says nuclear engineer Andrew Kadak of the Massachusetts Institute of Technology in Cambridge.

## Liquid vs. gas

Although nuclear power generation has long been dominated by water-cooled reactors, there have been frequent attempts to establish gas-cooled designs. The first—Britain's Dragon reactor, which began operating in 1965—led to a number of carbon dioxide-cooled plants in the U.K., some of which are still in use today. General Atomics pioneered their use in the United States, and in the early 1970s it had orders for 10 machines. All were canceled when the 1973 oil crisis led to a collapse in energy demand. Meanwhile, water-cooled reactors were getting larger and larger and increasingly complex. Then the twin shocks of Three Mile Island in 1979 and Chornobyl in 1986 caused a major rethink of reactor design.

Most of the plants being built today in Asia and elsewhere are “evolutionary” improvements on the water-cooled designs from the boom years. Westinghouse's current offering, the AP1000, uses gravity, natural circulation, and compressed gas to cool its core in an emergency. As a result, the reactor has 50% fewer valves, 83% less piping, 87% less control cable, and 35% fewer pumps than a conventional plant. With less equipment, there is less to go wrong. Similarly, GE's latest design, the Economic Simplified Boiling Water Reactor, holds emergency cooling water high up in the reactor vessel. If anything gets too hot, a release valve is automatically triggered and water flows down under gravity. “The reactor then remains below water level, and you don't get the core exposed,” says GE's Wells.

But, according to Kadak, “these evolutionary designs are still too expensive. No one is buying.” At the vanguard of the movement to sweep aside such leviathans are two efforts to build small gas-cooled demonstrator reactors, one in South Africa and one in China, by around 2010. Both use a reactor design that has its origins in the postwar scramble to find new uses for atomic power.

Just after World War II, researchers at what was soon to become the Oak Ridge National Laboratory in Tennessee investigated a reactor for generating electricity designed by physical chemist Farrington Daniels of the University of Wisconsin, Madison. He proposed encapsulating enriched fuel in small graphite balls, placing a large number of them in a reactor vessel, and cooling them with helium. The design, known as a pebble bed reactor, was considered too complicated and was abandoned in 1948.

In the 1950s, German physicist Rudolf Schulten resurrected the idea and built a small demonstrator reactor, which operated for 22 years beginning in 1968. In 1985, a firm in Germany also built a commercial-scale reactor, but both machines were closed down soon after the Chornobyl accident.

There the pebble bed story might have ended, except that in the 1990s, South African utility company Eskom began looking for new power plants. South Africa has abundant coal, so power is cheap. But the coalfields are all in the high interior of the country; Eskom wanted a new type of plant to power coastal cities. Pebble bed seemed to fit the bill, so Eskom licensed the German technology. Today the company PBMR is poised to start building a demonstrator plant at Koeberg near Cape Town, which it hopes to connect to the grid in 2010. “Nuclear must change technology to meet the needs of society,” says PBMR's Nicholls.

The pebble bed design is simple. Tiny flecks of low-enriched uranium are coated in layers of silicon carbide and carbon to make particles 1 millimeter across. Some 15,000 such particles are then mixed with graphite powder and pressed into a sphere the size of a tennis ball, which is again coated and hardened. Each “pebble” is only 4% uranium. When the reactor is ready for commissioning, engineers load 456,000 pebbles into the ring-shaped core. Control rods run through cavities in the graphite reflector material around the edge. The helium coolant simply flows through the pile of balls, is heated, and drives a turbine directly connected to a generator.

One great benefit of the pebble bed design is that it does not need to be shut down to rearrange or renew the fuel. Instead, every day some pebbles are taken from the bottom of the reactor and weighed to see if they still have usable fuel inside; those that do are fed back onto the top of the pile. In this way the fuel is continually moved around to achieve an even burn and full utilization. Each pebble passes through the reactor six times over the course of 3 years. Much of the equipment is straight off the shelf, Nicholls says. “We're not trying to push the state of the art at the component level,” he says. “We just put it together better.”
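The circulation scheme invites some back-of-envelope arithmetic. The sketch below uses only the figures quoted above; the six-month pass length is inferred from six passes in 3 years, so the daily throughput is an estimate, not a PBMR specification.

```python
# Rough totals implied by the pebble bed figures quoted in the text.
particles_per_pebble = 15_000   # coated 1-millimeter fuel particles per pebble
pebbles_in_core = 456_000       # pebbles loaded when the reactor is commissioned
passes_per_pebble = 6           # each pebble cycles through the core 6 times...
residence_years = 3             # ...over roughly 3 years

# A full core holds about 6.8 billion coated fuel particles.
total_particles = particles_per_pebble * pebbles_in_core

# If six passes span 3 years, one pass lasts about 6 months, so the plant
# must pull (and mostly re-feed) on the order of 2500 pebbles per day.
days_per_pass = residence_years * 365 / passes_per_pebble
pebbles_handled_per_day = pebbles_in_core / days_per_pass
```

By this estimate, the daily handling rate is a tiny fraction of the core, which is what allows refueling to proceed without ever shutting the reactor down.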

Meanwhile, researchers at the Institute of Nuclear and New Energy Technology (INET) at Tsinghua University near Beijing, China, also took a leaf out of Schulten's book during the 1990s and in 2003 fired up their 10-megawatt High-Temperature Reactor. According to INET director Zhang Zuoyi, this experiment-sized pebble bed has been steadily churning out power ever since. On three occasions, he says, the team has tested the reactor's safety by pulling out its control rods and leaving it to its own devices—producing a short-lived rise in temperature but no danger to the reactor.

Pebble beds are considered inherently safe because their cores are only sparsely loaded with nuclear material; they also exploit a natural ability of uranium-238, the nonfissile isotope that makes up the bulk of uranium fuel. As the temperature of the reactor rises above its normal operating level, uranium-238 starts to become better at absorbing neutrons, the particles that spark the nuclear chain reaction. So when the coolant or the reaction-damping control rods are removed, the reactor temperature begins to rise, but as uranium-238 starts to make the core less reactive, it cools naturally by radiation and conduction. “We can calculate the peak temperature the fuel will reach,” says Nicholls.
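The self-limiting behavior Nicholls describes can be caricatured with a toy feedback loop. This is purely illustrative, not a reactor model: the coefficients below are invented, and only the qualitative point survives, namely that power which falls as temperature rises (standing in for uranium-238's growing neutron absorption) plus passive cooling yields a bounded, calculable peak temperature rather than a runaway.

```python
# Toy negative-feedback loop -- NOT a reactor model; all constants invented.
def settle_temperature(steps=20_000, dt=0.01):
    T = 500.0      # normal operating temperature, degrees C
    P0 = 100.0     # nominal power, arbitrary units
    alpha = 0.2    # power shed per degree above 500 (stand-in for U-238 effect)
    k = 0.05       # passive cooling by radiation/conduction, per degree
    for _ in range(steps):
        power = max(P0 - alpha * (T - 500.0), 0.0)  # hotter core, less power
        cooling = k * (T - 25.0)                    # heat lost to surroundings
        T += (power - cooling) * dt                 # simple Euler step
    return T

# With no control rods in the loop at all, the temperature does not diverge:
# it settles where feedback and cooling balance the power, here at
# (100 + 100 + 1.25) / 0.25 = 805 degrees C -- a computable peak.
```

The point of the sketch is the sign of the feedback: because the power term shrinks as temperature climbs, the peak can be calculated in advance, just as Nicholls claims for the real fuel.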

With this experience in its pocket, the INET team and the company Chinergy are planning to build a commercial prototype in Shandong province in the east of China by 2011. INET also signed an agreement last month to join a consortium with Westinghouse to put in a bid to build the NGNP in Idaho. Westinghouse is one of the backers of the PBMR, and the South African company is part of the consortium. Pebble bed enthusiasts hope that their design will be chosen for this $1.3 billion test reactor.

The pebble bed approach is not the only way to make a high-temperature gas-cooled reactor. General Atomics, for example, has developed the Gas Turbine Modular Helium Reactor (GT-MHR). As in pebble beds, the uranium fuel starts out as tiny coated particles, but instead of pebbles, the fuel for the GT-MHR is formed into hexagonal prisms about the size of two large paint cans stacked up. The prisms are arranged in an array in the reactor core and stacked 10 high. Japanese researchers have built an experimental “prismatic” gas-cooled reactor, the High Temperature Test Reactor, which has been operating successfully since 1998. Arkal Shenoy, director of the GT-MHR project at General Atomics, says the design is pretty well worked out now. “We're waiting for someone to say 'Do you want to build this thing?'”

Shenoy says that in a conventional reactor, one-third of all systems are safety-related, and you hope you will never have to use them: “We've eliminated the need for safety systems. The physics is such that the worst case of accident can never happen.”

## Idaho or bust

Despite all the advantages of the new generation of gas-cooled reactors, proponents concede that utilities are going to be wary of unproven technology. “Without a full demo reactor, utilities won't buy. They're used to 90% availability. No amount of analysis will get you this,” says Shenoy. The South African and Chinese demo reactors are being heavily subsidized by their governments, and U.S. researchers hope their government will follow that example. “Until the NGNP is finished, you won't see a gas reactor being built in the U.S. We need to reduce the risk [for utilities],” says General Atomics' Campbell. “It must be an Administration priority. Otherwise it won't be real.”

Researchers are also confident that DOE will want a high-temperature gas-cooled reactor because of its interest in hydrogen production. “All the buzz about the hydrogen economy really comes from gas-cooled reactors,” says Nicholls. There are various ways of extracting hydrogen from water, including electrolysis and thermochemical splitting, and they are all much more efficient at high temperature. “Nuclear is the only really practical source of hydrogen, and the only nuclear technology that gets you there is the high-temperature gas-cooled reactor,” Nicholls says.

One thing these reactors do not do is resolve the issue of waste. The highly encapsulated fuel in gas-cooled reactors is very effective at containing nasty fission products, and it would be extremely difficult for any potential terrorist to extract any usable bomb-grade material from it. But the downside is bulk. All that graphite and multiple coatings make for large volumes of waste.

The nuclear industry in the United States has never reprocessed its spent fuel, nor has the government come up with an accepted solution for long-term waste storage. Despite this, few believe the United States should embark on fuel reprocessing anytime soon because that would open a Pandora's box that the public is just not ready for. An influential 2003 report on the future of nuclear power, co-chaired by former CIA director John Deutch, concluded that for the next 50 years, a once-through fuel cycle was the best option for the United States. “Once-through will dominate for many years,” says Regis Matzie, chief technology officer at Westinghouse Electric.
“Reprocessing is very costly in comparison, and utilities always take the least-cost route.”

Few, however, believe that this situation can continue forever. “I don't see how we can expand nuclear with the way we are doing it today. We have to clean up the fuel cycle, and [reprocessing] may be the only way to do it,” says Campbell. “It's a 100-year problem, not a 10-year problem.”

Farther down the road than the NGNP, 25 or more years from now, a new breed of reactor will be needed that can destroy much of its own waste. DOE has begun looking for such designs through a program called Generation IV and has enlisted a handful of other countries to collaborate. Beginning in 2000, a panel of more than 100 international nuclear experts sifted through many proposed designs and whittled them down to six generic types worthy of further study. Some of these are quite exotic, including one cooled by molten lead and another in which the fuel itself is a circulating mixture of molten salts. All but one of the six Generation IV designs have the ability to burn up the more long-lived products of the fission reaction.

Nevertheless, industry experts seem underwhelmed by the prospect of such futuristic reactors. “They're too far out, too speculative, and I can't see the advantage,” says Matzie. But France's AREVA, which already has experience of building fast neutron reactors for destroying waste, is looking that far ahead. “AREVA must be ready to produce plants with fast neutrons. We know how to do it, but we have 20 or 30 years to develop better, cheaper, safer technology,” says Garderet.

U.S. reactor makers appear more focused on the near term, waiting for that spark that will set their industry burning again. “The Bush Administration is clearly supportive of nuclear power. This provides a window of opportunity: If steps are not taken by 2008, the opportunity will be lost,” says GE's Wells. Matzie agrees: “A big banner will go up when U.S. utilities start buying again. Once the U.S. starts building and establishes a track record, it will be time for others to do the same.”

12. RETHINKING NUCLEAR POWER

# India's Homegrown Thorium Reactor

1. Pallava Bagla

KALPAKKAM, INDIA—For more than 5 decades, India has followed its own path on nuclear power. After refusing to join the Nuclear Nonproliferation Treaty and detonating a nuclear device in 1974, it was excluded from the international group that shares fission technology. In isolation, it launched an ambitious nuclear electric program that relies heavily on homegrown technology.

What makes India's strategy unique is its plan to build commercial reactors that run not on uranium but on a lighter element, thorium-232. India has one of the world's largest reserves of thorium—about 225,000 metric tons—but little uranium ore. Thorium does not fission; when irradiated with neutrons from a source material such as uranium-235, however, some of the thorium becomes uranium-233 (U-233), which does fission and can sustain a nuclear reaction.

In 1958, India announced that it was embarking on an ambitious, three-stage plan to exploit its thorium reserves. The first stage required building pressurized heavy-water reactors powered by natural uranium; they yield plutonium as a byproduct. Twelve are now operational. The plan called for stage two to kick in after sufficient plutonium had been extracted from spent cores; it would be used as a fuel in future fast-neutron reactors, which can irradiate thorium and produce U-233 as a byproduct. In the third stage, Advanced Heavy Water Reactors will burn a mixture of U-233 and thorium, generating about two-thirds of their power from thorium. Other nations—including the United States, Russia, Germany, and Israel—have studied the route but have not attempted to use it to generate electricity.

Stage two of this grand strategy began officially last October.
In the sleepy southern township of Kalpakkam, a government-owned company began building a 500-megawatts-of-electricity (MWe) fast-breeder reactor that will use fast neutrons to produce U-233. In its core, the reactor will use a “seed” fuel containing uranium and plutonium oxide; this source will send neutrons into a surrounding thorium blanket.

Indian atomic energy officials are confident that this exotic fuel system can be scaled up from a smaller, 40-megawatt Fast Breeder Test Reactor (FBTR) that has been running in Kalpakkam without major problems since 1985. This reactor and other research projects at the Indira Gandhi Center for Atomic Research in Kalpakkam have demonstrated, IGCAR officials say, that India has mastered the new technology. In a “bold step forward,” says Anil Kakodkar, chair of the Atomic Energy Commission (AEC) in Mumbai, researchers at IGCAR in May of this year successfully extracted high-purity plutonium from the unique plutonium-rich mixed carbide fuel discharged from FBTR.

AEC anticipates that the fast breeder at Kalpakkam will cost about $700 million and produce 500 MWe. The long-term goal, according to Kakodkar, is to increase nuclear electric output from 3360 MW today to “around 275 gigawatts” by the middle of this century.

Construction at Kalpakkam ran into trouble early this year: The 26 December 2004 tsunami flooded the foundations of the reactor building and set the schedule back by 4 months, says Baldev Raj, IGCAR's director. But he says that the work is now on track and predicts that the reactor will go critical as planned in September 2010.

Mujid Kazimi, a nuclear engineer who studies thorium fuels at the Massachusetts Institute of Technology in Cambridge, says India's approach to breeding nuclear fuel from thorium is “slightly more complicated” than fuel breeding planned elsewhere in the world. But he adds, “everything they have reported to date indicates they are on track.”

India cannot go it entirely alone, however. It still requires uranium, including for two boiling water reactors it bought from General Electric in the 1960s, and that may be one reason it is interested in opening nuclear trade with other countries. At a meeting last month with Prime Minister Manmohan Singh, President George W. Bush called India “a responsible state” with “advanced nuclear technology.” The opening could lead to future exchanges of personnel and technology—and possibly fuel. Singh reassured Parliament, however, that the deal would not undermine India's nuclear self-sufficiency.

13. ASIA

# Asia's Demand for Electricity Fuels a Regional Nuclear Boom

1. Gong Yidong,
2. Dennis Normile*
1. Gong Yidong is a writer with China Features.

While Western governments debate the pros and cons of replacing old nuclear power plants, India, China, and Japan are investing rapidly in new systems

Nuclear power may have fallen on hard times in some parts of the world, but not in Asia. Demand for electricity is growing steadily across the region, and a number of countries have seized on nuclear fission as a secure energy source that avoids coal's air-choking carbon and sulfur emissions. And as oil and gas prices rise to record high levels, nuclear energy is starting to look more affordable. The result—on paper at least—is a boom. “Sixteen of the 25 nuclear power plants currently under construction worldwide are in Asia,” says Akira Omoto, director of the Division of Nuclear Power of the International Atomic Energy Agency in Vienna, Austria.

China is embarking on a nuclear power plant building spree. Korea and India are both beefing up their nuclear electric grids. And Japan—despite public opposition that has blocked one project—plans an expanded nuclear power network that includes a controversial fast neutron reactor. Other novel designs are being tried in China and India.

## Steep climb

China has the most ambitious nuclear plans of any country in the world, although it is starting from a small base. Nine nuclear power plants are operating in China today, accounting for about 2% of the national power output. Two more reactors are under construction in eastern Jiangsu Province; they will come online by the end of this year, raising total nuclear capacity from 6.7 gigawatts of electric power (GWe) to 8.7 GWe. For comparison, the 104 nuclear power plants in the United States now produce more than 10 times as much power, about 98 GWe. China's target is for nuclear power to supply 6% of the nation's electrical energy needs, or about 40 GWe, by 2020.
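Those targets carry an implicit assumption about demand growth. Treating the quoted capacity shares as a rough proxy for shares of output (a simplification, since capacity factors vary by plant type), the article's own numbers imply that China expects its total generating base to roughly double by 2020:

```python
# Cross-check of the quoted Chinese figures; inputs are the article's numbers,
# and share-of-capacity is used here as a crude proxy for share-of-output.
nuclear_now = 6.7      # GWe of nuclear capacity today (nine plants)
share_now = 0.02       # nuclear's ~2% share of national power output
nuclear_2020 = 40.0    # GWe targeted for 2020
share_2020 = 0.06      # targeted ~6% share in 2020

implied_total_now = nuclear_now / share_now        # ~335 GWe overall today
implied_total_2020 = nuclear_2020 / share_2020     # ~667 GWe overall in 2020
growth = implied_total_2020 / implied_total_now    # roughly a doubling
```

In other words, the 2020 nuclear target is ambitious twice over: the nuclear fleet must grow sixfold while the grid it feeds is itself expected to double.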

JNC engineers have been studying more economical fast reactors that would use lead-bismuth and helium gas as potential coolants. They concluded that sodium is still the most promising option. Simplifying the cooling system and using more compact heat exchangers could improve efficiency. But these modifications depend on perfecting a high-chrome-content steel alloy that at present is too brittle. JNC is working with Japan's steel companies to improve it. If the materials research and other modifications pan out, JNC's studies show, they could build a new advanced sodium-cooled fast reactor plant about one-sixth the size of Monju but five times as productive, at 1500 MWe. This would cut the cost per kilowatt of capacity to $1600. Sagayama says they hope to start building a demonstration commercial fast reactor in about 2015.

The China Institute of Atomic Energy is a few steps behind, with an experimental fast reactor with a power capacity of 20 MWe due to be commissioned in 2008. China plans to follow up with a 600-MWe prototype by 2020 and commercial-scale fast reactors around 2030. India is hoping to complete a 500-MWe Prototype Fast Breeder Reactor by 2010, and Korean researchers are designing a 600-MWe fast reactor.

All these countries foresee an important role for fast reactors. Japan's long-term plan, Sagayama says, calls for fast reactors to replace conventional reactors completely, although there is no target date. And Huang Guojun, a CNNC deputy general manager, told a conference earlier this year that fast reactors will be the “main type of nuclear reactor to be used in China” by the middle of this century. Researchers in both countries say fast reactors will be needed to recycle scarce nuclear fuels. Xu notes that if nuclear power accounts for 20% of China's power needs in 2050, the country would have to acquire 75% of all known easily accessible uranium deposits.

Japan might still have a problem with public acceptance.
Hideyuki Ban, co-director of the Citizens' Nuclear Information Center (CNIC), a Japanese antinuclear group, says the delays and accidents at Monju and other facilities make it clear that “Japan's fast breeder program is in trouble.” He says the government's main nuclear advisory commission has been unwilling to “change a policy established 50 years ago,” and the program seems to be running on inertia. Also targeted by critics is Japan's large-scale reprocessing plant at Rokkasho, at the northern tip of Honshu Island, which is capable of producing 8 tons of plutonium a year. In May 2007, the plant is scheduled to start converting plutonium into so-called mixed-oxide fuel that can be used in conventional reactors, a stopgap until commercial fast reactors come online. Ban's bottom-line worry is that “if more and more countries acquire [reprocessing] technologies, there will be no controlling the proliferation of nuclear weapons.” CNIC is calling for a moratorium on all plutonium production.

In China, the public seems less aware of such controversies, says Xue Ye, executive director of Friends of Nature, the country's largest environmental organization. With just nine plants scattered throughout the country, “people have not felt their existence.” He thinks that could change, however, as more plants are constructed. “The Chornobyl explosion is still vivid in the memory of most Chinese,” he says. Beijing residents have protested the construction of laboratories handling dangerous pathogens near their homes; he thinks similar protests could interrupt China's plans for nuclear electricity. The big question facing Asia's booming nuclear industry is whether it can stay ahead of this nascent public opposition.

14. RETHINKING NUCLEAR POWER

# Down to Earth: Lingering Nuclear Waste
Mason Inman

Few countries have a concrete plan for disposing of long-lived radioactive waste, but those that do are converging on the same idea: Dig a deep mine, store the material in robust containers, and rely on geology to keep it out of the biosphere for tens of thousands of years. The concept seems simple. But the potential flaws are hotly debated, and only a few projects are under way.

First off the blocks were the United States and Finland, which have actually chosen locations. Yucca Mountain, Nevada, in effect became the official U.S. site in 1987 but has been delayed by continuous legal battles and scientific questions. The Department of Energy (DOE) has spent $5 billion on planning and does not have a target opening date.

About 51,000 metric tons of high-level waste have been earmarked for Yucca Mountain; if all U.S. nuclear power plants run to the end of their current licenses, DOE spokesperson Allen Benson says, the country will have about 120,000 metric tons to dispose of—70% more than the site is legally permitted to hold. But “people are just assuming … the law will be changed, allowing Yucca Mountain to expand,” says geologist Steve Frishman of the Nevada Agency for Nuclear Projects, which monitors the project.

DOE's design relies on elevation and waste heat to keep water out of the repository, which would be carved into volcanic tuff. But water is its bugbear. Critics say that the site isn't dry enough, the rock is fractured and leaky, and the oxidizing environment will corrode the waste containers.

Both Finland and Sweden are well along with designs very different from the U.S. approach. Finland is planning a $3 billion repository near the community of Eurajoki on the country's southwestern coast, and Sweden will choose one of two candidate sites in 2008. Their repositories are to be built in granite and would maintain a relatively cool, nonoxidizing environment that could work even if damp. The Swedish and Finnish designs “have vastly reduced the uncertainties” about how well they will contain the waste, says geologist Allison Macfarlane of the Massachusetts Institute of Technology (MIT) in Cambridge. Eurajoki's deep-bedrock repository will hold about 5600 tons of waste, 40% more than existing Finnish plants could produce in their lifetimes. Sweden voted in 1980 to phase out nuclear power; if the decision sticks, its $2.5 billion repository will store about 8000 metric tons of waste.

France, Japan, and Russia are exploring possible burial sites but keeping wastes near the surface indefinitely while they decide what to do next. All now reprocess spent fuel to extract usable isotopes, which they argue conserves fuel and reduces waste. But a blue-ribbon review published in 2003 by MIT concluded that, for many decades, reprocessing will cost more than “once-through” use of fuel. And because reprocessing increases the risk that material will be diverted to a dirty bomb or a nuclear weapon, the MIT group argued that it should be stopped.

Many countries hope to reduce waste eventually through “transmutation”: bombarding highly radioactive elements with neutrons to convert them to less threatening isotopes. But this method is expensive, and it may not be practical for decades, if ever.

Russia and some countries with small nuclear programs that would have difficulty funding a repository are exploring the possibility of pooling wastes in shared, multinational repositories. The chief advocate of this approach is the International Atomic Energy Agency (IAEA), in part as a way to help keep dangerous waste under lock and key.

But repository plans everywhere face a significant first hurdle: winning public support. As IAEA Director General Mohamed ElBaradei has said, “Once the first country has succeeded in placing a geological repository in service, … the road ahead for other countries will be made much easier.” So all eyes are on the groundbreaking projects in the United States, Finland, and Sweden.
