# News this Week

Science  17 Apr 2009:
Vol. 324, Issue 5925, pp. 318
1. U.S. SCIENCE POLICY

# NIH Stimulus Plan Triggers Flood of Applications--and Anxiety

1. Jocelyn Kaiser*
1. With reporting by Eliot Marshall and Jeffrey Mervis.

The past month has been a blur for neuroscientist Chiara Cirelli. She's been so busy preparing grant applications to the National Institutes of Health (NIH) that she has stopped reviewing papers by colleagues. She's barely set foot in her lab at the University of Wisconsin, Madison, not to mention her garden.

Cirelli's group already has two basic R01 research grants and a center grant for its sleep research. But the chance to use a piece of NIH's $10.4 billion share of the recent stimulus package to hire students and postdocs, buy equipment, and expand their studies of adult animals to young mice is proving irresistible. “I just can't not do it,” says Cirelli. Nor can thousands of her peers. As a result, biomedical scientists across the country have been cranking out grant applications for the $8.2 billion in NIH funds for extramural research in the American Recovery and Reinvestment Act (ARRA) of 2009. “It's like the day after Thanksgiving, when everybody's lined up at Wal-Mart for the new Tickle Me Elmo doll,” says Brian Dedecker, a cell biologist at the University of Colorado, Boulder, who has decided not to enter the fray.

NIH's shelves are stocked with infrastructure grants, extensions of existing awards, and enticing new competitions, in particular Challenge Grants of up to $1 million apiece. Research deans say the proposals flooding their offices reflect the pent-up need across a 5-year period in which NIH's budget lost 13% of its purchasing power.

The excitement, however, is tinged with anxiety. The ARRA grants require scientists to report quarterly on how they're spending the money and how many jobs are created and saved. With up to 10,000 applications for perhaps 200 Challenge Grants, the competition is also expected to cause a spike in applications for NIH's regular R01 grants in the next few years as investigators who come up empty recycle their proposals. That surge will be combined with scientists reapplying in 2011 to continue work funded with ARRA money. If NIH's budget—now $30.3 billion—doesn't keep pace, the pain could be even more acute than what scientists experienced after a 5-year doubling of NIH's budget ended in 2003.
“What really worries me is that we could fall off a cliff again,” says Karen Antman, dean of the Boston University School of Medicine. At the same time, she says, “I've never seen the faculty so happy” for a chance to compete for additional funding.

As dictated by the Recovery Act, $7.4 billion is split among NIH's 27 institutes and centers, many of which plan to spend the lion's share on 2-year awards for proposals that just missed the cut last year. Another $1.3 billion is going to construction and instrumentation. The NIH director's office has divvied up about two-thirds of its allotted $800 million among four competitions: $200 million for the Challenge Grants, another $200 million for Grand Opportunities (GO) Grants of at least $1 million, $100 million for new faculty hiring at core academic facilities (Science, 3 April, p. 27), and $21 million for summer research experiences for students and teachers. Five institutes have joined to offer $60 million for autism research, and more announcements are expected. The agency hasn't yet determined the final split between funding existing proposals and new competitions.

The recovery money has thrown a lifeline to investigators who missed the funding cutoff last year and were surviving on other funds. Heidi Hamm, chair of pharmacology at Vanderbilt University in Nashville, Tennessee, says four of 24 people in her department have been told that they will likely receive 2-year grants. “That's huge,” she says. Many others, including Cirelli, are requesting supplements to existing grants or proposing an extension of the work, known as a competing supplement.

The most alluring prize, however, seems to be the Challenge Grants, which can cover research on any of 15 broad categories, from bioethics to stem cells. The 12-page maximum for proposals is only half the current length of an R01 application, and applicants don't need to show preliminary data. Susan Bryant, vice chancellor for research at the University of California, Irvine, says her campus will be submitting “over 200” such applications. Johns Hopkins University molecular pathologist Anirban Maitra is working on three Challenge Grants, two for early detection of pancreatic cancer and one for nanotechnology.
“My philosophy is, let's try to maximize our chances,” says Maitra.

University of Minnesota cancer biologist Peter Bitterman is submitting a Challenge proposal and a GO proposal, both involving drug discovery. He says it's worth the effort even though the odds are long. “It's obvious. Once you've prepared a grant [application] and it's ready to go, it can be submitted as an investigator-initiated R01.”

Processing these proposals is expected to strain the federal online site, Grants.gov, through which all NIH applications must pass. Antman says she's asked her grants staff to work on weekends and at night when the volume of traffic is lower: “You push the button 20 times and it doesn't go through,” she says. Some universities have moved up NIH deadlines by a week or more to ease the strain on harried staff.

Once the proposals are submitted—each competition has its own deadline, and the rules for supplements vary by institute—reviewing the Challenge Grants, GO grants, and other competing proposals is “going to be a major amount of work,” says Sally Rockey, acting director of NIH's office of extramural research. For the Challenge Grants, NIH expects to test out a new “editorial board” model in which proposals will be emailed to subject experts. The board will then meet in person to assign global scores.

The scope of the proposals is expected to be very broad. Drawing from their strategic plans, institutes have described nearly 900 topics as worthy of consideration, with 150 highlighted as priority areas. Although NIH is under no obligation to fund work in any single area, the competition is nonetheless expected to be fierce. One saving grace: Institutes have the option of tapping their recovery money to award additional Challenge Grants.

Some researchers are passing on the current competitions. Instead, they've elected to wait a few months to submit a regular R01 proposal for 4 years of funding, betting that they'll have a better shot because of a dip in applications.
Early-career investigators have a second reason to eschew Challenge Grants. NIH now relaxes the pay line for young scientists who haven't yet received an R01, and officials have decided that Challenge Grants are large enough to strip new investigators who win them of their protected status. “I'd give up my advantage in applying for an R01. That's a big deal,” says Dedecker, who came to Colorado in 2006 and hasn't hit NIH pay dirt yet. He adds, only half in jest, “Besides, I would rather have my regular grant rejected in a thoughtful fashion.”

The big question is whether NIH's budget will grow at anywhere close to the rate needed to handle the blizzard of new ideas and expanded scientific work force. Last month, Acting NIH Director Raynard Kington warned a congressional spending panel that success rates could drop “several points” if NIH does not receive a “substantial” budget increase. The National Institute of Allergy and Infectious Diseases, for example, is already warning on its Web site that it “will be very difficult for people to secure funding” from NIH's regular budget once the stimulus money runs out. To help smooth things out, NIH has announced that researchers can ask for a 1-year unfunded extension of their grant. But that may not be enough to avoid a second crash landing.

2. U.S. SCIENCE POLICY

# NSF Is Keeping It Simple

1. Jeffrey Mervis

The National Science Foundation (NSF) is using its $2 billion windfall from the American Recovery and Reinvestment Act to fund more proposals already in the pipeline, reducing what officials say is a $2-billion-a-year backlog of good ideas. That's quite a contrast from the National Institutes of Health (NIH), which has rolled out an array of mechanisms, including new competitions and several types of supplemental grants targeting various audiences, to allocate its $8.2 billion in research funds from the $787 billion stimulus package (see main text).
Two weeks ago, an advisory panel for NSF's math and physical sciences directorate spent a large chunk of its 2-day meeting discussing how various federal agencies are spending the stimulus money. Panel members were particularly concerned about NSF's policy not to top off existing grants with supplemental awards, citing new competitions at NIH to support teachers and students, to broaden participation, and to welcome back researchers whose scientific careers have been interrupted.

NSF Director Arden Bement defended his approach during a lunch with the committee, explaining that Congress has told NSF to boost success rates—approximately one in four grant proposals is funded—within the parameters of the overall goal to create or preserve jobs. “Our philosophy is KIS—keep it simple,” he said. “That's the only way to get [the money] out the door in a timely fashion.” Bement also said the “robust” tracking requirements of the legislation preclude supplements to existing projects. “We can't mix the money,” he said.

Despite those stipulations, he said, NSF isn't ignoring the needs of early-career scientists. NSF can still use its regular budget to support requests for supplements from current grantees, he said, citing $7 million in supplemental grants that NSF's math division has allocated from its just-passed 2009 budget to create an additional 30 or more postdoctoral positions at the seven NSF-funded mathematics research institutes. “All this angst over supplements is not well-founded,” he scowled.

Sally Rockey, acting head of NIH's extramural research program, says that NIH has managed to go where NSF feared to tread by making each supplement a separate grant. However, she emphasizes that “institutions will have to track [the spending] carefully.”

The lack of alternatives bothered Héctor Abruña, a chemist at Cornell University and a member of the advisory panel. “Where's his vision?” asked Abruña. “NSF seems more concerned with process, reducing paperwork, and making things easier for program managers than with the challenges facing the scientific community.”

Panelist and Yale University chemist William Jorgensen also spoke in favor of supplements, noting that they would boost undersized NSF grants as well as help fledgling scientists. But Jorgensen, a longtime NSF grantee who's not currently receiving NSF funding, says he's not unhappy that the foundation is concentrating on boosting success rates. Giving the money to established scientists, he says, “sends the right message to PIs [principal investigators] and is very clean.”

3. REPRODUCTIVE BIOLOGY

# Study Suggests a Renewable Source of Eggs and Stirs More Controversy

1. Dennis Normile

A new study has turned up the heat on an already blazing—and sometimes nasty—debate over one of the dogmas of reproductive biology: that female mammals start life with a limited number of eggs and cannot produce new ones after birth.

The work, done in mice by stem cell biologist Ji Wu and colleagues at Shanghai Jiao Tong University in China, suggests that adult ovaries harbor primitive germline cells that can give rise to new eggs, or oocytes, and those in turn, when fertilized, can produce healthy offspring. If confirmed, the implications for reproductive biology and the possible treatment of infertility are enormous.

“It's a beautiful paper,” says Evelyn Telfer, a reproductive biologist at the University of Edinburgh in the United Kingdom. It is “very tight,” she says, and shows “quite clearly that there are cells in the ovary that have the capacity to form germ cells” that are progenitors of eggs.

Developmental geneticist Robin Lovell-Badge of the National Institute for Medical Research in London takes the opposite view. “To me, this is a very incomplete piece of work that will only add to the confusion,” he says.

For at least 50 years, the theory that female mammals do not produce new eggs after birth has been thought to explain why fertility declines with age; in women, menopause is believed to commence when the store of eggs is exhausted. Reproductive biologist Jonathan Tilly and colleagues at Harvard Medical School in Boston kick-started the current debate with a 2004 paper that claimed that oocytes in mouse ovaries die too quickly for a limited supply to last the animals' reproductive life span. They also identified cells expressing a gene unique to the germ cells that give rise to sperm and eggs. When they grafted pieces of normal ovaries into mice with ovaries expressing green fluorescent protein (GFP), they later observed green fluorescing oocytes in the grafted tissue, presumably generated by the host's germ cells (Science, 12 March 2004, p. 1593).

Some subsequent studies supported Tilly's findings, while others contradicted them. In a review published in the January 2009 issue of Biology of Reproduction, Tilly cited 16 previous commentaries that were split on whether the preponderance of evidence supported the orthodox or heretical view. “The viciousness of the debate has scared a lot of people away from this area,” Tilly says.

In particular, no one had succeeded in culturing the putative female germline stem cells (FGSC) until Wu's team tried an overlooked technique called immunomagnetic isolation. In this approach, tiny magnetic beads coated with an antibody latch onto a protein expressed only by germline stem cells. A magnetic screen collects cells with attached beads. The group then cultured FGSCs collected from the ovaries of newborn and adult mice through numerous cycles of division, even freezing and thawing them without any apparent ill effects. Next, they infected the cells with a harmless virus carrying the GFP gene and transplanted them into the ovaries of sterilized mice. After mating with normal males, the females gave birth to healthy, fertile offspring, some of which carried GFP throughout their bodies. The group's paper was published on 12 April by Nature Cell Biology.

Telfer, who co-authored two critical reviews of Tilly's work, says the paper “vindicates” Tilly's 2004 conclusion that germline stem cells exist in mouse ovaries. But not everyone is convinced. “Large claims require searing critical scrutiny, and I will reserve final judgment until it is replicated elsewhere,” says Roger Gosden, a reproductive biologist at Weill Cornell Medical College in New York City. “Replication has been a historic problem for many stem cell studies.”

Lovell-Badge says there are alternative explanations for many of the group's findings, and additional experiments would have bolstered the claims had they been done. He also questions how immunomagnetic isolation could work on a protein found within cells and not on the surface.

In an e-mail exchange with Science, Wu pointed to reports by others indicating that a part of the protein extends beyond the cell membrane. Jeffrey Kerr, a developmental biologist at Monash University, Clayton, in Australia, agrees that the Wu team captured germline stem cells. The “molecular signatures are commensurate with known characteristics of other stem cells including germ cells,” he says. “Based on the data reported, these cells qualify as FGSCs.”

Telfer adds that “by producing live young, these cells have passed the ultimate test to prove their germline credentials.” Tilly predicts confirmation won't be long in coming, as Wu and her colleagues “spell out a simple protocol [for isolating FGSCs] that any lab can now try,” he says.

Both sides agree that if the findings are confirmed and if FGSCs are found in human ovaries, they might offer new options to treat human infertility caused by cancer therapies, disease, or aging. “Identification and isolation of such [cells] in the human ovary may revolutionize the field of fertility preservation,” says Kutluk Oktay, a reproductive endocrinologist at New York Medical College in Valhalla. “I have not been excited about a scientific piece like this for a long time.” David Albertini, a reproductive scientist at the University of Kansas Medical Center in Kansas City, adds a caution: “Let's be careful about how relevant this is for humans.”

Telfer expects that the debate will soon shift to the physiological role these cells normally play. She suspects that FGSCs are usually quiescent and become active only if the ovary is damaged. Wu, on the other hand, thinks that “under normal conditions, [FGSCs] can renew themselves and differentiate into oocytes.” “This paper opens up so many questions, we could spend an hour talking about possibilities,” says Tilly.

4. GEOCHEMISTRY

# Great Oxidation Event Dethroned?

1. Richard A. Kerr

The wait was worth it for the pond scum. For more than a billion years, one-celled denizens of early Earth had held their own under an atmosphere devoid of oxygen. Finally, 2.4 billion years ago, photosynthesis had built up free oxygen so that life could begin to move toward complex, active organisms and eventually to us. At least, that's the story a clever isotopic analysis seemed to tell, muffling a decades-long debate over when oxygen first gained the upper hand.

But new laboratory results reported in this issue challenge that mainstream scenario by showing how supposed signs of an early lack of oxygen could have come from unrelated geochemical reactions. Still, the Great Oxidation Event will likely reign a while longer. “Now we can go back to arguing the preponderance of evidence,” says geochemist Ariel Anbar of Arizona State University, Tempe. “I think it still would be on the side of there not being a lot of oxygen” before 2.4 billion years ago.

By 2000, mineralogical and isotopic evidence had convinced most researchers that not even a whiff of oxygen was in the air before the event. But some, such as geochemist Hiroshi Ohmoto of Pennsylvania State University (PSU), University Park, still argued that oxygen was abundant much earlier.

Then in 2000, geochemist James Farquhar of the University of Maryland, College Park, came up with a nifty technique involving sulfur isotopes. The proportion of one isotope to another of the same element can change during a chemical reaction. Normally, the change depends on the masses of the isotopes. But Farquhar found isotopic shifts among three sulfur isotopes before 2.4 billion years ago that hadn't depended on isotope mass. As far as anyone knew, such “mass-independent fractionation” (MIF) could have happened only under solar ultraviolet radiation in an oxygen-free atmosphere—and MIF sulfur disappeared 2.4 billion years ago. That result “shut down the discussion” about oxygen's arrival, says Anbar.

Undaunted, Ohmoto and Yumiko Watanabe of PSU set out to confirm theoretical hints that MIF sulfur can also be produced in the presence of hot, solid proteins. On page 370, they and Farquhar—who performed the critical isotopic analyses—report that, under conditions crudely mimicking ancient hot springs, two amino acids can indeed produce very small degrees of MIF. “The results defy all expectations,” says early-life specialist Bruce Runnegar of the University of California, Los Angeles.

In a carefully worded conclusion, the paper suggests that the newly discovered reactions reopen the question of early oxygenation. “This is at least a possibility that we should be thinking about,” Ohmoto says. Personally, he adds, “I feel the findings we made fit nicely to what I've been saying for years.” Farquhar, on the other hand, thinks the lab results may be relevant to the geologic record but considers it “much more likely” that the MIF signal marks the first appearance of atmospheric oxygen, he writes in an e-mail.

Other specialists, including Runnegar, agree with Farquhar. The effect in the lab is small—10% of the amount of MIF found in the geologic record—observers note, and there's no convincing reason why the hot-spring reactions would have shut off at 2.4 billion years. “I think it is still more likely to be of atmospheric origin,” says geochemist Shuhei Ono of the Massachusetts Institute of Technology in Cambridge, “so the current idea of an [early oxygen-free] atmosphere still holds.”

5. SCIENCENOW.ORG

# From Science's Online Daily News Site

Blasting for Ice on Mars. The Phoenix lander has given scientists a close look at the ice in one spot high in the martian arctic, but researchers have also been surveying fresh craters across the planet for signs of frozen water. Now two teams reported at the recent Lunar and Planetary Science Conference in The Woodlands, Texas, that they have detected ice that is relatively pure—far purer than expected on a dusty, dirty planet. The finds lend support to a scenario of an ancient “iceball Mars” in which ice encased much of the planet.

What You See Is What You Feel. Stare at a waterfall long enough, and nearby stationary objects such as rocks and trees will seem to drift up. The optical illusion is called motion aftereffect, and it may trick more than just your eyes, according to a new study published in Current Biology. When subjects watched a stationary stripe on a computer screen after a machine stroked their fingertips, the motion of the stroking created the illusion that the stripe was moving. The discovery demonstrates for the first time a two-way crosstalk between touch and vision, challenging long-held notions of how the brain organizes the senses.

Want to Stop Malaria? Target the Geezers. Kill ’em fast, kill ’em young. That's been the unofficial motto in insect control for the past 50 years. But a new paper in PLoS Biology suggests that, at least in the case of malaria, the strategy may be misguided. By choosing insecticides that act more slowly, or that specifically target older mosquitoes, researchers may be able to prevent the evolution of pesticide resistance, a problem that has long bedeviled malaria-control efforts.

6. EARTHQUAKE PREDICTION

# After the Quake, in Search of the Science--or Even a Good Prediction

1. Richard A. Kerr

Did technician Gioacchino Giuliani successfully predict last week's quake that crumbled buildings in the Italian city of L'Aquila, killing more than 270? He thinks so, and he's been all over the Italian media since then, claiming credit and demanding an apology from Italian authorities who silenced him a week before the quake and from Italian scientists who said there was no merit in his methods.

Neither side is backing down. The scant documentation of Giuliani's methods of prediction that has begun to surface offers no real evidence of the technique's efficacy, scientists say. Giuliani's recent predictions were wrong or reported after the quake struck, they note, and earlier efforts to correlate releases of radon gas—the marker on which Giuliani bases his predictions—with the seismic record are unconvincing.

“I think Giuliani is speaking in good faith,” says Warner Marzocchi, a chief scientist at the National Institute of Geophysics and Volcanology in Rome, “but all the things he's presented, may I say, are at a very low level from a scientific point of view. That does not mean radon is not a potential precursor, [but] I didn't see any evidence the method could work.”

As central Italy suffered a disquieting “earthquake swarm”—a surge in seismic activity—beginning this past January, Giuliani began attracting national attention by aggressively promoting his seismic predictions through the media. On 24 March, in an interview posted on the Italian-language blog Donne Democratiche (www.donnedemocratiche.com/?p=2219), he explained how he and two colleagues got into quake prediction and what their work meant for the quake-prone region around the city of L'Aquila, northeast of Rome. In 2000, while working on a particle physics experiment in a subterranean laboratory of the National Institute of Nuclear Physics near L'Aquila—where Giuliani still works—they incidentally detected a rise in radon at the same time an earthquake struck Turkey more than 1200 kilometers away.

Radon-earthquake connections had spurred scientists in the 1970s and '80s to try to predict quakes, but decades of work came to nothing. Levels of radon seeping from the ground rose and fell a lot, it seemed. Sometimes quakes followed; often they didn't. Undeterred, Giuliani and his colleagues designed and built five radon monitors that now dot the region around L'Aquila.

Asked what light he could shine on the intensifying seismic activity of the L'Aquila region, Giuliani gave Donne Democratiche a prediction: The swarm of low-level quakes was a “normal phenomenon” for the region, was not a precursor to a larger event, and would diminish by the end of March. On 30 March, the largest event in the series up to that time—a magnitude 4.0—struck L'Aquila.

About a week before the 6 April magnitude-6.3 quake, Giuliani made his second prediction. He has not responded to repeated inquiries from Science, but according to media reports, Giuliani told the mayor of the town of Sulmona, 55 kilometers to the southeast of L'Aquila, to expect a damaging earthquake within 6 to 24 hours. As widely reported in the media, vans mounted with loudspeakers blared warnings to residents to flee. Sulmona never got its quake, but by then Italian authorities had told Giuliani that he was panicking an already jittery populace and they would not allow him to publicize any predictions.

That meant Giuliani's third claimed prediction—a forecast of the L'Aquila quake, which Giuliani told reporters he had shared with colleagues—went unverified. After the fact, Giuliani told the media he had found alarming rises in radon levels in the hours before the big one, even as two of the strongest quakes in the intensifying swarm struck. As levels of both radon and seismic activity rose, his predictions mounted as well, until he was foretelling an imminent quake of greater than magnitude 4.0, he told reporters and talk show hosts. A quake did indeed strike within hours, but it was 1000 times more powerful than that minimum prediction. Such an open-ended prediction of magnitude—from minimally damaging to catastrophic—is of little use to those responsible for public safety, scientists say.

Marzocchi, who works on the forecasting of earthquakes and volcanoes, has examined two Italian-language documents containing examples of Giuliani's radon records used to make predictions: a patent application (www.wipo.int/pctdb/en/wo.jsp?WO=2004061448) and a chronological account of the method's development (www.chiocciolandia.it/index2.php?option=com_docman&task=doc_view&gid=2&Itemid=38). He is not impressed. “It's very hard to find anything good in this work,” says Marzocchi. The problem is too many peaks in radon records that are too short, he says (see figure). Earthquakes (marked “ev” in the figure) are associated with supposedly precursory radon peaks with no obvious rhyme or reason. For example, there's no correlation between the size of the peaks and the magnitudes of the subsequent quakes. “These figures are unacceptable from a scientific point of view,” he concludes.

7. CLIMATE CHANGE

# New Push Focuses on Quick Ways to Curb Global Warming

1. Eli Kintisch

NASA climate modeler Drew Shindell knew his research would raise eyebrows. But he was overwhelmed by the response to a paper published last week in Nature Geoscience that modeled the causes of Arctic warming over the past century. “‘Did you really say aerosols are responsible for half or more of the warming in the Arctic?’” he says, describing a typical e-mail.

He did. Carbon dioxide may get all the attention, Shindell says, but black carbon—a component of soot—is also an important factor in global warming. He and other scientists say that reducing emissions of black carbon and other short-lived pollutants that contribute to global warming could buy the world crucial time while governments begin the slow overhaul of global energy systems that will be required to reduce emissions of CO2, which comprise 77% of all greenhouse gas emissions. “Short-lived carbon forcers like methane, black carbon, and tropospheric ozone contribute significantly to the warming of the Arctic,” Secretary of State Hillary Clinton said in a speech last week. “Because they are short-lived, they also give us an opportunity to make rapid progress if we work to limit them.”

Dirtier air has slowed global warming over the past century by blocking solar radiation. But the four short-lived pollutants that scientists are targeting actually warm the atmosphere. Methane and hydrofluorocarbons (HFCs) are greenhouse gases like CO2, trapping heat radiated from the ground. Black carbon and tropospheric ozone, an element of smog, are not greenhouse gases, but they warm the air by directly absorbing solar radiation. Compared with CO2, which can persist in the atmosphere for up to 3000 years, black carbon remains for only 2 weeks and methane for no more than 15 years.

Environmental activists such as Durwood Zaelke of the nonprofit Institute for Governance and Sustainable Development want Clinton to ask the eight Arctic nations, whose foreign ministers will meet in Norway on 29 April, to create a partnership to support technology and joint demonstration projects that limit diesel emissions globally and particulates from cookstoves and chimneys in the developing world. The U.S. Environmental Protection Agency is also considering including particulate emissions in an upcoming ruling on using the Clean Air Act to fight climate change, says the agency's Paul Gunning. “It's important,” says University of California, San Diego, atmospheric scientist V. Ramanathan. “The joint benefits for human health and climate would be considerable.”

For methane, Rafe Pomerance of the nonprofit Clean Air-Cool Planet would like the Obama Administration to broaden its well-respected Methane to Markets Partnership, which features demonstration programs to limit emissions from farms, landfills, and energy installations.

Representative Henry Waxman (D-CA), chair of the House Energy and Commerce Committee, has asked the Administration to propose adding HFC language to the Montréal Protocol, which controls ozone-destroying chemicals. (HFCs don't destroy ozone, but they are 1400 times more potent than CO2 as a warming agent, making up 2% of world greenhouse gas emissions and rising fast.) The Administration says it is considering doing so, although it must act by next month to be considered by the next official meeting of treaty participants in November. Last month, Waxman proposed the first-ever federal regulation of HFCs as part of a massive climate bill he introduced in draft form.

8. SCIENCEINSIDER

# From the Science Policy Blog

When the U.S. government funds energy research at U.S. national laboratories or universities, who owns the resulting inventions? During a recent tour of the national laboratories, Energy Secretary Steven Chu said he hoped federally funded scientists would “share all intellectual property as much as possible.” But Representative James Sensenbrenner (R-WI) of the House Science and Technology Committee has demanded clarification on the Department of Energy's policies in the area. Experts say that solving the climate crisis will require nations to share their technologies more widely.

A researcher who pricked herself last month with a needle containing the Ebola-Zaire virus has returned to the Bernard Nocht Institute in Hamburg, Germany. The researcher was inoculated within 48 hours after the accident with a Canadian experimental vaccine. Based on a livestock pathogen called vesicular stomatitis virus, it had never been tested in humans before. The patient is now healthy, but scientists don't know whether the vaccine actually prevented an infection. Scientists are looking for telltale antibodies in the researcher's blood for clues as to whether the vaccine made a difference.

Irish scientists were wondering whether their golden age of research had ended after the government announced spending cuts that will hit 3000 publicly funded Irish scientists to the tune of roughly €6000 each. This comes on the heels of budget cuts that reduced pay by a further €2000. More cuts to government-funded science in Ireland are expected in the next 2 years.

Elsewhere … An online story suggested that the U.S. Centers for Disease Control and Prevention could have done more in 2007 to publicize data on lead poisoning. Woes continued at Grants.gov as stimulus-hungry scientists swamped the site. A new approach to particle physics could make giant accelerators, now the staple of modern experimental physics, obsolete.

For the full postings and more, go to ScienceInsider.

9. NEWSMAKER INTERVIEW

# John Holdren Brings More Than Energy to His Role as Science Adviser

1. Jeffrey Mervis

President Barack Obama's three domestic priorities—energy, health care, and education—provide John Holdren with a road map for serving as the president's science adviser. They also point to three different ways in which the 65-year-old physicist, on leave from Harvard University's John F. Kennedy School of Government, may carry out his second job, that of director of the 50-person Office of Science and Technology Policy (OSTP) within the White House.

On energy, Holdren told Science last week in one of his first interviews since his Senate confirmation 19 March, he hopes to wield considerable influence. “Energy is one of my big things. I'm going to pay a lot of attention to energy,” says Holdren, who has extensive experience in energy, climate, and nuclear-proliferation issues. At the same time, Holdren signaled that the President's Council of Advisors on Science and Technology (PCAST), co-chaired by medicine Nobelist and former National Institutes of Health director Harold Varmus and Eric Lander of the Massachusetts Institute of Technology, is likely to be the nexus for any health care debates within OSTP. He acknowledged that he expects the president to rely heavily on Education Secretary Arne Duncan, a fellow Chicagoan and basketball buddy, for guidance on improving U.S. schools, with OSTP playing a complementary role in reforming science, technology, engineering, and mathematics education.

Three weeks into his job, Holdren says OSTP and government scientists are “energized” by his boss's bold promise to “restore science to its rightful place.” In a conversation with Science's Jeffrey Mervis, Holdren spoke frankly on issues ranging from nuclear proliferation to the teaching of evolution. The following is an edited transcript; a complete version is available online at Science's policy blog, ScienceInsider.

Q: Are you concerned that reporting requirements for the American Recovery and Reinvestment Act (the $787 billion stimulus package) will hamstring U.S. scientists? Or is that the price to pay for this massive influx of funding?

J.H.: There's clearly a tension there. When you do something as big as the recovery package, there's tremendous pressure to make sure that you don't just push the money out the door without any attention to assessment and evaluation. But the other side of the coin is that you don't want to burden people who are doing good work with a degree of reporting requirements that impair their productivity in any significant way. So it's a fundamental tension, and I'm not sure that we've got it exactly right. … If you overburden researchers with reporting requirements, then you've done a bad thing. And we'll try to avoid that.

Q: Do you expect OSTP to play a bigger role in national security?

J.H.: Steve Fetter is assistant director at large, so I can deploy him on energy, climate change, and nuclear weapons. Steve has a background very similar to my own, and Steve has a portfolio similar to mine, and when I can't be in two places at once, I have complete confidence that Steve will be bringing the same things to the table. We will ultimately have an associate director who will be dual-hatted in the [National Security Council]. But I also have a role in the NSC. Whenever science and technology are on the table, I'm there.

Q: Is building a reliable replacement warhead necessary to win Senate approval of the Comprehensive Test Ban Treaty (CTBT)?

J.H.: My personal view—I don't make the policy, but I provide advice—is that we do not need a new warhead. [A National Academies' report I led] concluded that the safety and effectiveness of the current nuclear stockpile could be maintained indefinitely without developing new warheads, by monitoring the situation and making modifications if necessary.
My personal view is that designing a nuclear warhead and deploying it would throw out a good part of the baby with the bath water. It negates a substantial advantage to ratifying the test ban treaty because it would send a message to the world that the United States still thinks that it can and should design and deploy new warheads when circumstances require it. If that's the case, what have you accomplished with CTBT?

Q: Will additional shuttle missions be needed to complete the space station?

J.H.: The current plan is to get an additional shuttle mission to the space station within the 2010 framework. … If that can't be done and things slip, then consideration will be given to going beyond that date. And that would be the last shuttle mission. There will be a gap in our capacity to put people in space with U.S. vehicles, because we will not have a follow-on to the shuttle ready before 2015.

Q: Will it be only 5 years?

J.H.: I wouldn't want to speculate. It's going to be at least that long. I don't see any way we can do it before 2015, and if things go as they often do, it might be a little later than 2015. And what we'll have to do in that interim period is rely on our international partners, which means the Russians. It might also be the Chinese, depending on how our relationship develops.

Q: Do you have confidence in China's ability to launch our astronauts?

J.H.: I think it's possible in principle to develop the required degree of confidence in the Chinese. I put it out there only as speculation, but I don't think it should be ruled out.

Q: Will your review of scientific ethics include a review of conflict-of-interest policies at each agency?

J.H.: I think it has to look at that. I wouldn't prejudge what we're going to say. But the question is, “What are the appropriate boundaries?”

Q: What about full disclosure for all National Institutes of Health (NIH) grantees?

J.H.: I don't feel comfortable prejudging that.
It's not a domain with which I'm closely familiar. I would be interested in the views of Harold Varmus and Eric Lander on that. They are co-chairs of PCAST, which has not yet been fully constituted. … And since I have, as co-chair of PCAST, the former director of NIH, and one of the smartest people I know, I'm not going to go on record on that issue without talking first to Harold.

Q: Will the portfolios of the associate directors be science, technology, energy/environment, and national security/international affairs?

J.H.: Yep. Although when you say energy, the title will be environment, and how energy will be handled remains to be seen. It depends in part on who we recruit for technology. Right now, the only associate director who has been nominated is Shere Abbott, for environment.

Q: So you haven't decided where energy will go?

J.H.: Well, energy is one of my big things. I'm going to pay a lot of attention to energy. Energy is one of Steve Fetter's big things. And we have Kevin Hurst, a senior policy analyst who's been working on energy. So right now we have a strong energy team, and we'll be bringing even more energy capability on board.

Q: Given the Administration's energy team—Steve Chu, Carol Browner, Lisa Jackson, among others—what special expertise and perspective do you bring?

J.H.: Number one, of the people you just named, the only other scientist is Steve Chu. And Steve Chu and I, in the interagency working group on energy and climate, represent the science and technology side. Steve and I are both knowledgeable about a wide variety of energy technologies, and we are very close partners. We both know a fair amount about climate science, and we have others working for us who know even more. Carol Browner, the former EPA director, is a brilliant analyst of policy and regulation. And we have at the table Larry Summers, Christina Romer, and Peter Orszag, who cover the economic side.
We also have Cabinet secretaries who have big stakes in the energy issue, and they bring to the table important constituencies.

Q: How will OSTP handle science education?

J.H.: It'll be within the associate director for science. Everybody has a stake in it, however. And we will have an associate director for science who is known for his or her commitment to strengthening science, technology, engineering, and math education. That's already clear.

Q: So you have somebody in mind?

J.H.: I do. And this is a big deal for the president. His commitment to education is clear, and it's shared by the education secretary, Arne Duncan. We're going to do a lot in that domain.

Q: Staying with education, do you think that the Texas state school board's recent decision to add a skeptical view of the study of evolution and the fossil record weakens the state's science standards and national efforts to improve science education?

J.H.: Well, I have not reviewed that decision carefully. But my impression from reading about it is that it was not a step forward but rather a step backward. Of course, all science needs to be skeptical. It's hard to be against skepticism. But when you get into the domain of promoting particular views about the basis for skepticism of evolution, and those views are not really valid, then I think we have a problem. I think we need to be giving our kids a modern education in biology, and the underpinning of modern biology is evolution. And countervailing views that are not really science, if they are taught at all, should be taught in some other part of the curriculum.

Q: Is there anything you can do?

J.H.: I'm not aware of any leverage we have, at OSTP or within the federal government, over the science curriculum in Texas, other than exhortation. We can argue and we can beg and we can try to educate. But we have no authority to act.
Q: Were you troubled by the recent National Academies' report that one in six life scientists say they have self-censored some of their research because of security concerns, and is there anything you can do?

J.H.: That is a tough one. I think security concerns in the biological domain are real, and we cannot be cavalier about the propagation of findings that could be used by terrorists to harm us. But what the right approach to managing those risks is, is something we'll continue to struggle with. There was self-censoring within the nuclear physics community in the late [19]30s and '40s, when it became clear to scientists that there was potential for weapons of vast destructive power. And I think that was a good thing.

10. PHYSICS

# Fusion's Great Bright Hope

1. Daniel Clery*
1. With reporting by Robert F. Service.

New realms of chemistry and physics. Nuclear tests without nukes. A giant step toward fusion power. Even if the National Ignition Facility works as planned, how much can it really deliver?

In November 1957, Gordon Gould, a grad student at Columbia University, jotted down in a notebook some ideas on how to make a laser, a term that he coined. Possible uses for such a device, he noted, included spectrometry, interferometry, radar, and nuclear fusion—all 3 years before a laser was actually demonstrated. Gould's innovations were disputed: He was not included in the 1981 Nobel Prize for the laser, and for 3 decades he fought in the patent courts to assert his inventions. He was eventually successful, and many of the applications he dreamed up, such as heating and evaporating materials, measuring distance, communications, and television, have come true—apart, that is, from nuclear fusion.
Next year, researchers at the Lawrence Livermore National Laboratory (LLNL) in California hope to tick that box off Gould's list. Despite his foresight, Gould could not have imagined the lengths to which scientists and engineers would have to go to bring his prediction to reality. LLNL's National Ignition Facility (NIF), which was officially completed last month, is a laser on a truly epic scale. The building housing it is 10 stories high and covers an area the size of three football fields; for a very brief instant, its beams deliver a power of 500 terawatts, more than the power-generating capacity of the entire United States.

If all goes according to plan, some time in 2010 the power of those beams will be directed at a small beryllium sphere filled with hydrogen isotopes. The resulting implosion will crush the hydrogen to a temperature and pressure higher than in the core of the sun. If NIF's scientists get everything right, the hydrogen isotopes will do what they do in the sun: fuse together into helium nuclei and release a huge store of energy. NIF's principal aim is to reach “ignition”: a self-sustaining fusion burn that gives off more energy than was put in to make it happen—something that so far has occurred only in nuclear explosions and stars. “People have been waiting for this moment for a long time,” says NIF Principal Associate Director Edward Moses.

The achievement could have profound implications for our future energy supply. If the fusion gain—the ratio of energy out over energy in—is high enough and a laser could be developed to spark such ignitions at a steady rate, laser fusion could provide almost limitless energy with little radioactive waste. “This will ignite a change in the political debate,” says Mike Dunne, head of the Central Laser Facility of the Rutherford Appleton Laboratory near Oxford, U.K. Energy production is not NIF's raison d'être, however.
Its funding comes not from energy or science budgets but from the coffers of the National Nuclear Security Administration (NNSA), the agency tasked with the maintenance and security of nuclear weapons and naval reactors. NIF's overarching role is to provide hard data that can confirm computer simulations of nuclear explosions. In the absence of nuclear testing, NIF is the best way weapons designers can know what happens when one of their bombs goes off. But basic science should be a big beneficiary: Researchers plan to use NIF to simulate the interiors of supernovas, stars, and giant planets, as well as to shed light on how materials behave under such previously unattainable conditions. “Really, it's a very, very exciting period for all of plasma physics,” says Jacques Ebrardt of France's Atomic Energy Commission, one of the leaders of a rival fusion project, Laser Megajoule.

After a dozen years of construction, researchers are keen to see these dreams realized soon. NIF staff members think they won't have long to wait. “We're feeling pretty confident,” says Moses. But some other researchers say such temperatures and pressures are uncharted territory, and controlling them may not be as straightforward as NIF's proponents think. “It's just very, very complicated. Shots even close to this power have never been done before,” says Steven Cowley, director of the Culham Science Centre, the U.K. fusion research lab near Oxford.

Some think NIF is bound to fail: that the leap in laser technology is too great or that we don't yet understand enough about how plasmas and other materials will behave under these conditions. Practical questions have also dogged NIF. Technical and managerial problems early on stretched out construction by 7 years and drove up costs; at $3.5 billion, the price tag is several times the original estimate. Some say that money should have paid for several smaller, less risky facilities.

NIF is perhaps one of the most scrutinized scientific projects in recent history, the subject of countless reviews, panels, and investigations. But the time for predicting its future has passed. That future will soon be decided by a brilliant flash of light and whether it does what researchers hope it will do. Either it will usher in a new era of fusion research, or some hard questions will have to be answered. If NIF works, “we're going to have a gold rush of people being interested. It'll grab the attention of the world,” says Robert McCrory, director of the Laboratory for Laser Energetics at the University of Rochester in New York state.

## “Every scale of problem”

The route to fusion that has won the most attention and funding is magnetic confinement fusion, which uses huge electromagnets to confine a hot but low-density plasma inside a vessel known as a tokamak. The premier magnetic fusion device, which aims to show large energy gain for extended periods, is ITER, currently being constructed by a worldwide collaboration in southern France. Meanwhile, a smaller community has attempted to achieve fusion by imploding small capsules of fuel using light or particle beams—a technique known as inertial confinement fusion (ICF) because inward inertia of the implosion holds the fuel in place.

The first experiments with ICF were carried out in the 1960s using ruby lasers soon after they were invented. But a key paper by LLNL physicist John Nuckolls in 1972 predicted that ignition would need laser pulses of 1 kilojoule and that high gain would require 1 megajoule (MJ). There followed a series of attempts at fusion: During the 1970s, LLNL built increasingly powerful lasers—Janus, Cyclops, Argus, and finally Shiva, a 20-beam, 10-kilojoule laser with amplifiers made from neodymium-doped silica glass. With every attempt, however, researchers encountered new difficulties with power-draining interactions between the beam and the plasma and achieving a smoothly symmetric implosion of the fuel capsule.

In 1980, researchers at Rochester developed crystals that could triple the frequency of high-intensity laser light, converting it from infrared to ultraviolet, which interacts with plasma less and causes a better implosion. Rochester soon put the crystals into practice with its 24-beam Omega laser, and LLNL followed suit with Nova in 1984. Funding for fusion stagnated during the 1980s, but Nova and Omega advanced the science enough that by the late 1980s and early 1990s, several labs were developing designs for a next-generation machine.

In 1992, the United States stopped testing nuclear weapons, and new methods were needed to ensure that existing weapons would still work when needed and that new weapons could be developed without testing. In discussions between the national weapons laboratories, officials decided that an ICF device was needed to validate computer simulations of nuclear explosions. In 1994, a design for NIF emerged that would produce a 1.8-MJ ultraviolet beam at a cost of just over $1 billion, with completion penciled in for 2002.

Problems emerged with the design soon after construction began in 1997. Capacitors failed in the pulse power modules that supply current to the flash lamps that pump the laser amplifiers, and there were persistent problems with dust on optical surfaces: Powerful beams would heat up the dust specks and damage the surfaces. NIF staff members hid delays and cost overruns from government officials, and in September 1999, NIF Associate Director E. Michael Campbell stepped down after anonymous tips revealed he had not finished a claimed doctorate from Princeton University. Those revelations caused the Department of Energy (DOE) to carry out a thorough reevaluation of the project, and Congress ordered an independent review by the Government Accountability Office. GAO's damning report prompted DOE to rebaseline the project with a new cost estimate of about $4 billion and completion slated for 2008.

Moses took over the troubled project in 1999 and found “every scale of problem,” he says. He worked to develop a “partnership” with vendors and a cultural shift among the staff so that they would speak up if there were a problem. He tackled the dust issue by building a huge clean room where optical elements are enclosed in sealed units that could easily be slotted into and out of the beamline. He also began commissioning the beamlines one at a time, beginning in 2001, rather than all of them in parallel, so that any bugs in the first completed beamlines could be corrected in later ones. “That had a huge impact,” says Mark Newton, leader of NIF's engineering division.

Under the new management, NIF has pretty much kept to the revised schedule and budget, culminating in last month's official completion and, according to Moses, a test shot with an energy of 1.1 MJ. Researchers will now test all parts of the system before taking a shot at ignition. They will make sure that all 192 beamlets can be focused, smoothed, and targeted accurately; that they can direct them into the ends of the hohlraum, the gold cylinder that houses the fuel capsule; and that they can get a capsule to implode symmetrically. “By the end of the campaigns, we'll have a pretty good idea of what to expect,” says Gilbert Collins, leader of NIF's shock physics group. Moses is similarly confident: “Ignition is a grand challenge. Our aim is to do it in 2010,” he says.

Other researchers have heaped praise on LLNL's achievements. “The laser is really quite tremendous and truly awe-inspiring,” says particle physicist Roy Schwitters of the University of Texas, Austin, who chairs the JASON Defense Advisory Group, an organization of scientists that assesses defense-related projects, such as NIF, for government clients. Cowley is similarly effusive: “NIF is a triumph of laser construction.” But few agree with Moses that NIF will be able to move rapidly to ignition by next year. “The schedule looks almost unattainable,” says Cowley. And according to nuclear engineer David Hammer of Cornell University, “There needs to be more than one miracle for everything to work in time for the first ignition experiment.”

## Wrestling instabilities

Most researchers cite two areas where nature may throw NIF a curve ball: laser-plasma interactions (LPI), which affect the beams as they enter the hohlraum; and hydrodynamic instabilities (HDI), which can cause the fuel capsule not to implode symmetrically. Both effects plagued earlier ICF experiments, and NIF researchers have spent years simulating and testing ways to control them. But NIF's huge energies may still spring surprises. “Until they put a beam into the hohlraum, we won't know what nature will do,” says Hammer.

LPI happens when the beams enter the hohlraum and hit its inside wall, kicking up enough gold atoms to create a plasma inside the cylinder. The interaction of the beams with this plasma can reduce the power deposited into the beryllium capsule and can preheat the fuel, making it harder to compress. The plasma can even reflect some of the beam out through the hole again, reducing efficiency. “We have a woeful ability to predict” LPI, says Dunne. Mordy Rosen of LLNL's Weapons and Complex Integration Directorate agrees. “We're going to a place we've never been before. It's going to be a new game,” he says. Nevertheless, he adds, “I think we've got what it takes to respond to those issues that come up.”

Steven Bodner, retired head of laser fusion at the Naval Research Laboratory in Washington, D.C., says NIF has even bigger problems: He believes the quality of its beams is not up to specification. The design describes a maximum beam spot size that will fit into the hohlraum without touching the entrance hole or the fuel capsule, as well as a bandwidth the beams must be detuned to in order to combat LPI. Bodner says that in results released so far, beams have achieved both of these criteria, but not at the same time while operating at full power. “If they can't focus the beam into the hohlraum, they can't get ignition,” he says. Bodner thinks NIF's chance of reaching ignition “is worse than a snowball's chance in hell.” NIF counters by citing the conclusion in February this year of the National Ignition Campaign Review Committee, which stated that “each and every one of the laser performance completion criteria has been met or exceeded.”

Compressing the fuel capsule is also fraught with difficulties, collectively known as HDI. If you imagine trying to squeeze a balloon with your two hands, you'll see what the exploding capsule is trying to do: compress the contents uniformly without bits of it bulging out again. Many things can cause HDI: The bath of x-rays coming from the heated hohlraum may not be uniform or there may be some flaw in the capsule or fuel layer. Even under ideal conditions, instabilities are inevitable, researchers say. The key to beating them is speed: “We need to do it fast enough so instabilities don't get big. It's extremely hot and high pressure. It wants to blow itself apart,” Rosen says.

## Breaking glass

Apart from LPI and HDI, other issues could prove a headache for NIF's managers. According to some outside LLNL, the risk of damage to the laser optics has not gone away. The energy contained in each laser pulse is not huge, but because it is pumped through in only a few nanoseconds, the power is enormous. Hammer says NIF can cope with a certain amount of damage to glass, “but if 192 beams destroy several optical elements, a lot of optics is involved.” The “triplers,” which convert the final beam into ultraviolet, are particularly tricky, he says, and because they are very close to the target chamber, they could do substantial damage if they explode. Moses says scientists have done a huge amount of research on damage mechanisms and removing defects from surfaces. “We've shown we can get surfaces to work at full performance. … That was our biggest challenge.”

Some experts also worry that NIF's choice of beryllium for the capsule material, which requires more energy to explode than alternatives such as plastic, leaves little margin for error in reaching ignition. In 2005, a JASON panel investigated NIF's chances of achieving ignition. Noting plans to start out at energies of about 1 MJ, it concluded “that success in the early attempts at ignition in 2010, while possible, is unlikely.” The panel was invited back to view progress in January of this year, but its report has yet to be released by NNSA. Hammer, who co-chaired the panel, says that in his own opinion there's still not enough power available. He thinks that a couple of years after the first attempts at ignition, they will have a 50:50 chance of success. “They will throw everything at it to get there,” says Cowley. “By 2010, they might, but if they operate it for a long time they'll learn how to do it.”

## Illuminating the stars

Some researchers are less concerned about the trials of reaching ignition than what they can do once it's achieved. These are the plasma physicists, planetary scientists, and astrophysicists who want to use NIF to do basic research. Twenty percent of time at NIF is earmarked for basic research, and several groups are gearing up to take advantage of it. Planetary scientist Raymond Jeanloz of the University of California, Berkeley, is preparing experiments for NIF that will replicate pressures at the cores of giant planets. “NIF will give us 100 times the energy we can currently deposit into samples,” he says. “We will begin to turn the page on a new kind of chemistry that wasn't accessible before.” Cowley, who has worked in astrophysics as well as plasma physics, says ignition at NIF will produce “an unbelievable neutron flux if you get really close”—conditions akin to extreme astrophysical events such as supernovas. This will open up new opportunities in the burgeoning field of experimental astrophysics. “There are wonderful things you can do with NIF,” Cowley says.

Also hoping to do wonderful things, although with less visible results, are the weapons scientists involved in stockpile stewardship. Ever since the idea of NIF was first mooted, it has faced controversy over how useful it really will be to weapons research, including sniping from other national laboratories that benefited less from NNSA's largess. “I've never viewed it as relevant to weapon design. The parameters are very different, it's orders of magnitude wrong,” says Bodner. A 2007 report on stockpile stewardship from the Federation of American Scientists concluded that the nation's nuclear weapons were being kept safe and reliable through careful monitoring and the judicious replacement of parts. “The NIF could be ended without reducing the confidence in the existing nuclear stockpile,” it said.

NIF's relevance to weapons “has been reviewed for 20 years by blue-ribbon panels, everyone under the sun,” says Moses. “The community has spoken, the NNSA continues to fund us, that's pretty much put to bed.” What's more, France is spending billions constructing Laser Megajoule, a similar machine that will carry out its first experiments by the end of 2012, also aiming for ignition and weapons verification. “The architecture is basically the same,” says Ebrardt, and some components, such as the amplifier glass, were developed jointly by the two teams.

Nevertheless, just as NIF reaches the stage at which it can prove itself, the tide of politics is flowing away from its original mission. President Barack Obama has spoken much more about nuclear disarmament than about maintaining a credible deterrent, and his appointments and funding decisions show a keen interest in developing new sources of energy. It's perhaps no coincidence that most news coverage of NIF's completion last month focused on its significance for energy, not weapons. LLNL researchers have also been busy developing designs and technology for fusion-energy projects that would come after NIF (see sidebar, p. 328). “[NIF] is not a power-production machine,” Collins acknowledges, but it “will unveil the science needed to get there.”

For NIF researchers, waiting to see if a dozen or more years of work will pay off, there is now some respite from the constant probing and questioning of NIF's abilities and rationale. “Some of our most serious critics are waiting and seeing. The rhetoric has really dropped down,” says Collins. Rosen, for one, is ready. “It's up to us now,” he says. “Mother Nature is waiting.”

11. PHYSICS

# A Long, Winding Road to Ignition

1. Daniel Clery

The laser system of the National Ignition Facility (NIF) pushes the boundaries of technology. If all goes according to plan, the complex series of reactions will ultimately generate more energy than the laser pumped in.

The National Ignition Facility (NIF) beam starts life in one of two ytterbium-doped optical fiber lasers known as the master oscillators. These produce an infrared flash (1053 nanometers in wavelength) with an energy measured in nanojoules. This flash is split into 48 beams and passed through 48 preamplifier modules, slabs of neodymium glass pumped with bright light just before the beams arrive. Four passes through the preamplifiers boost the total energy 10 billion times to about 6 joules. Each of the 48 beams is then split further into four beamlets.

The 192 beamlets pass through the power amplifier into the main beamline, which includes the 48 main amplifiers, each made of 11 1-meter-long slabs of neodymium-doped phosphate glass. Just before the beam is first generated, the amplifiers are pumped full of light by 7680 xenon flash lamps, storing 400 megajoules (MJ) of electrical energy. As the beams pass through, the amplifiers dump that energy into the beam. An optical switch called a Pockels cell traps the light between two mirrors so that the beams pass back and forth through the amplifiers four times before they are switched back up through the power amplifier and on toward the switchyards. The beams now have a total energy of 6 MJ.
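The amplification figures above amount to a rough energy budget. The sketch below checks that the quoted numbers are mutually consistent; the seed energy is an assumed value (the article says only "nanojoules"), chosen so that the stated 10-billion-fold preamplifier gain yields the quoted 6 joules.

```python
# Back-of-envelope energy budget for the NIF beamline, using figures
# quoted in the text. The seed energy is assumed, not a published value.
seed_energy_j = 6e-10                           # master oscillator, ~0.6 nJ (assumed)
preamp_gain = 1e10                              # "boost the total energy 10 billion times"
after_preamps_j = seed_energy_j * preamp_gain   # about 6 J, as the article states

after_main_amps_j = 6e6      # 6 MJ of infrared after the main amplifiers
on_target_j = 1.8e6          # 1.8 MJ of ultraviolet reaching the target

print(f"after preamps: {after_preamps_j:.1f} J")
print(f"IR-to-UV survival: {on_target_j / after_main_amps_j:.0%}")
```

Run as written, this confirms about 6 J after the preamplifiers and shows that only 30% of the 6 MJ of infrared survives the final conversion and transport losses as the 1.8 MJ delivered on target.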

The 10-story-high switchyards use mirrors to route the beams into the 10-meter-wide target chamber from all directions around the sphere. Just before entering the chamber, the beams pass through the final optics assemblies, which condition the beams and step down their wavelengths. Frequency converters made from thin sheets cut from single crystals of potassium dihydrogen phosphate convert the infrared beams first to green (527 nanometers) and then to ultraviolet (351 nanometers), which is much more effective at heating the target. Losses bring the total energy down to 1.8 MJ. But because the flash is only 20 nanoseconds long, its power is 500 terawatts, more than the generating capacity of the entire United States. All told, the beam travels 305 meters from master oscillator to target, a journey that takes 25 nanoseconds.
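
The energy budget along the beamline can be tallied from the figures quoted above. The following is a rough bookkeeping sketch, not an engineering model; the seed energy is inferred from the quoted gain, and dividing energy by pulse length gives only an average power:

```python
# Rough bookkeeping of the NIF beamline figures quoted in the article.
n_beams = 48
n_beamlets = n_beams * 4                # each beam split into four beamlets

total_after_preamps = 6.0               # joules, after four preamplifier passes
gain_preamps = 1e10                     # "10 billion times"
seed_energy = total_after_preamps / gain_preamps   # ~0.6 nJ total seed flash

total_after_main_amps = 6e6             # joules (6 MJ) after the main amplifiers
total_on_target = 1.8e6                 # joules left after frequency conversion

pulse_length = 20e-9                    # seconds
avg_power = total_on_target / pulse_length         # 9e13 W, i.e. ~90 TW average
# The 500 TW quoted in the article is the peak of the shaped pulse;
# a pulse's peak power always exceeds its average, so the figures agree.

print(n_beamlets, seed_energy, avg_power / 1e12)   # 192 beamlets, ~90 TW avg
```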

The target is a tiny, hollow sphere made of beryllium about the size of a peppercorn. Inside is 150 micrograms of deuterium and tritium, two isotopes of hydrogen, chilled to 18 kelvin so that they form a uniform layer of ice on the inside of the sphere. The target capsule sits at the center of a tiny gold cylinder about the size of a pencil eraser, called a hohlraum. The beams shine into the ends of the hohlraum, heating its inside surface to such an extreme temperature that it emits a pulse of x-rays. The x-rays cause the beryllium capsule to explode, and the outward blast drives the deuterium-tritium ice inward toward the center of the capsule.

If the implosion is completely spherically symmetric, it will compress the fuel to a density 100 times that of lead. But the fuel still needs a spark to ignite fusion. A shock wave from the original beryllium explosion arrives in the center and heats the core of the fuel to 100 million kelvin. As nuclei fuse in the core, they release enough heat to trigger more fusion in the surrounding fuel in a chain reaction. If all goes according to plan, the reactions will generate enough heat to make the fusion burn self-sustaining and will generate more energy than the laser pumped into the hohlraum, a result known as “ignition”—one of the ultimate goals of NIF.
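
To get a feel for that compression, a back-of-envelope calculation shows how small the fuel becomes. This is an illustration of my own; the density of lead (about 11.3 grams per cubic centimeter) is an assumed standard value, not a figure from the article:

```python
import math

# How big is 150 micrograms of deuterium-tritium fuel once compressed
# to 100x the density of lead? (Illustrative only; lead's density of
# 11.3 g/cm^3 is an assumed textbook value.)
fuel_mass_g = 150e-6
density_lead = 11.3                      # g/cm^3, assumed
density_compressed = 100 * density_lead  # ~1130 g/cm^3

volume_cm3 = fuel_mass_g / density_compressed
radius_cm = (3 * volume_cm3 / (4 * math.pi)) ** (1 / 3)
radius_um = radius_cm * 1e4
# ~32 micrometers: the fuel from the peppercorn-sized capsule ends up
# in a sphere thinner than a human hair.
print(f"compressed fuel radius ≈ {radius_um:.0f} µm")
```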

12. PHYSICS

# What's Next for ICF?

1. Daniel Clery

If the National Ignition Facility reaches its goal of ignition, it will be a triumph of plasma science. But physicists will still be far from showing that inertial confinement fusion is a viable energy source for the future.

If the National Ignition Facility (NIF) reaches its goal of ignition—a self-sustaining fusion burn that produces more energy than was put in to create it—researchers will celebrate a triumph of plasma science. But they will still be far from showing that inertial confinement fusion (ICF) is a viable energy source for the future.

One key stumbling block for an ICF energy reactor is laser technology. NIF managers hope to perform about two shots a day because of the time needed to let optical elements cool down, check for damage, replace any damaged parts, and install a new fuel capsule. At that rate, with each shot producing fusion burns of 20 megajoules—its initial target—NIF will barely generate enough power to keep a single light bulb glowing. According to Steven Cowley, director of the Culham Science Centre, Britain's fusion research lab near Oxford, “laser fusion has all the problems of magnetic fusion, but ICF also has to find a laser that can fire many times per second and is 20% to 30% efficient, plus how to make fuel pellets at low cost.”

The National Nuclear Security Administration, which funds NIF, has also been backing the High Average Power Laser (HAPL) program, bringing together researchers at national labs, universities, and industry to develop the technology needed for such a power reactor. Its goals include a laser that can fire as many as 10 shots a second, optics that can withstand that much power for long periods, a target chamber that can absorb the neutrons produced by fusion and convert their energy into heat, and a target factory that can churn out fuel capsules at the required rate. That's a tall order: At 10 shots per second, more than 850,000 fuel capsules would be needed every day.
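
The capsule arithmetic is a quick check of the article's figures:

```python
# At the HAPL target rate of 10 shots per second, how many fuel
# capsules would a reactor consume per day? (Simple check of the
# article's ">850,000 per day" figure.)
shots_per_second = 10
seconds_per_day = 24 * 60 * 60          # 86,400
capsules_per_day = shots_per_second * seconds_per_day   # 864,000

# For contrast: NIF itself plans about two shots per day.
nif_shots_per_day = 2
print(capsules_per_day, capsules_per_day // nif_shots_per_day)
```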

The favored laser design is a krypton fluoride gas laser pumped with electron beams that is being developed at the Naval Research Laboratory (NRL) in Washington, D.C. NRL's Electra laser has recently demonstrated continuous operation for 10 hours firing 2.5 shots per second at ultraviolet wavelengths. Researchers still have to ensure that a working laser can keep that up for years and boost its power to the levels needed for ICF.

The Lawrence Livermore National Laboratory in California, home of NIF, is working on a high-repetition-rate version of the neodymium-doped glass lasers used on NIF. Livermore's Mercury laser dispenses with the inefficient and slow flash lamps used to pump NIF's laser amplifiers and replaces them with solid-state laser diodes. Mercury has shown a repetition rate of 10 hertz firing at infrared wavelengths.

The HAPL project is currently stalled, however, because it received no funding in the 2009 omnibus funding bill. “Hopefully, through some combination of actions by the new Administration and Congress, the challenge of funding HAPL and other such inertial fusion energy research in the U.S. will be resolved soon,” says Steven Obenschain, head of NRL's laser plasma branch.

In Europe, researchers are plotting a slightly different route to laser fusion energy. In traditional ICF, a single laser pulse plays two roles: compressing the fuel and sparking fusion at its center. An alternative, known as fast ignition fusion, uses one laser to compress the fuel and a second pulse, with extremely high power (10¹⁵ watts) but short duration, to set off the fusion burn. The advantage is a significant reduction in the energy requirement of the lasers. “If it works, it could lower the energy necessary to get high gain, making the economics more tantalizing,” says NIF Principal Associate Director Edward Moses.

The idea of fast ignition was conceived 15 years ago, and early experiments with the Gekko laser at Osaka University in Japan suggested it might work. Researchers at the University of Rochester in New York state are hoping to put it to some sterner tests with their newly upgraded Omega EP laser. But a design study funded by the European Union is planning something bigger: a dedicated fast-ignition facility with high repetition rates, dubbed HiPER. “We're putting together all the building blocks so that politicians can make a decision,” says HiPER Director Mike Dunne of the Rutherford Appleton Laboratory's Central Laser Facility near Oxford, U.K. He's hoping construction could start in 2015. Some caution, however, that fast ignition should learn to walk before it tries running. Says nuclear engineer David Hammer of Cornell University, “Fast ignition is one of those attractive ideas that haven't been tested yet.”

13. CONSERVATION BIOLOGY

# Will Captive Breeding Save Africa's King of Beasts?

1. Jerry Guo*
1. Jerry Guo is a writer in New Haven, Connecticut.

One nonprofit claims its lion reintroduction program will ensure a stable population of the iconic predator; many experts are dubious.

VICTORIA FALLS, ZIMBABWE—Two young male lions trail a middle-aged South African couple through parched savanna. The waist-high cats may appear to be stalking human prey, but suddenly, off script, they bound into the bush. Handlers with a company called African Encounter chase after the captive-bred lions and with threatening waves of sticks cow them back onto the trail. The predators are coaxed to pose for pictures with the two tourists, who have paid $200 for the chance to take a wilderness stroll with the tame beasts.

“Walking with lions” may be little more than a petting zoo with claws. But the nonprofit African Lion and Environment Research Trust (ALERT) claims its captive-breeding and reintroduction program, supported by tourism revenue from the African Encounter operation, is “ensuring the future of the African lion.” ALERT, which also runs a breeding facility in Gweru, Zimbabwe, opened a lion-walk center in Zambia last December and plans to release a pride with seven female members into a 4000-hectare site there later this year. “No one has done what we've done,” says David Youldon, ALERT's chief operating officer.

Some experts are unimpressed. They argue that ALERT's program diverts donations and volunteer attention from efforts to stem what they say is the greatest threat facing lions: dwindling habitat. “There's no sound science behind what they're doing,” charges Paula White, a lion ecologist at the University of California, Los Angeles, Center for Tropical Research. “In most cases, lion reintroductions are poorly thought out, do little to benefit conservation, and use valuable resources that could be used to benefit existing populations desperately in need of protection,” adds Andrew Loveridge, a research fellow at the University of Oxford in the U.K. who studies lions in Zimbabwe. He doubts ALERT's program is an exception. Youldon disagrees and says critics are missing the point. “We're realists,” he says. “We think there has to be a commercial aspect.”

ALERT's center, started in 2005, follows a standard reintroduction protocol. Before release, captive-bred animals live in incrementally larger enclosures and are weaned off human contact. ALERT claims that “walking with lions” helps juveniles, taken from mothers when they are 3 weeks old, bond in a pride. “If it was pure science, you wouldn't do it,” acknowledges Pieter Kat, a wildlife geneticist at Investigação Veterinária Independente in Lisbon and a scientific adviser to ALERT. “There is a tourism aspect involved, but they have to make ends meet.”

ALERT's initial reintroduction foray met with mixed results. In August 2007, the center released seven lions into a 200-hectare enclosure near Gweru. Within 2 months, males had killed two females—uncharacteristically lethal aggression thought to be linked to a captive upbringing. ALERT removed the males and reconstituted an all-female pride that thrived for a year before the center was shuttered for renovations. The pride led a sheltered existence in the enclosure, says Roseline Mandisodza, a Zimbabwean ecologist who studied it for her master's degree. The release site was too small and had too few competing predators such as hyenas, cheetahs, or leopards to simulate hunting conditions in the wild, she says. “There is very little or no chance of [their] survival in the wild.”

To be fair, carnivore reintroduction is a high-risk endeavor. In 2001, Urs and Christine Breitenmoser, co-chairs of the World Conservation Union Wild Cat Specialist Group, reported that only 30% of felid releases are successful, and almost all of these were translocations, in which wild animals were moved from one habitat to another. Captive breeding and release is a more drastic approach but may be the only hope for critically endangered species such as the Iberian lynx, estimated at fewer than 200 individuals, and the Amur leopard, of which only 30 remain in the wild. Lions are a different story. With some 23,000 lions in Africa, the most pressing need is habitat preservation, not adding to an ample population, argues Luke Hunter, executive director of New York City-based cat conservation nonprofit Panthera. “Reintroduction of captive-bred animals as a means to establish wild carnivores is probably the last resort,” he says.

Unless, that is, the lion population were to crash—a possibility that ALERT says supports its program. Feline immunodeficiency virus (FIV), once thought to be a relatively harmless cousin of HIV, infects more than 90% of wild lions. In a recent survey of 68 lions in Botswana, Melody Roelke, Stephen O'Brien, and colleagues at the U.S. National Cancer Institute found that 70% exhibited at least one AIDS-like symptom. ALERT's captive-bred animals offer an opportunity to study FIV in a controlled environment, says Kat. “We can now track the progress of this virus among individuals, a difficult thing to do in the wild,” he says. Later this year, Kat and colleagues at the University of Glasgow in the U.K. will begin taking blood samples from infected lions and examining them for immunodeficiency.

But to Hunter and others, the FIV threat appears minimal and hardly justifies captive breeding—or walking with lions. “Even if ALERT was going to succeed, so what?” Hunter asks. “It's not an answer at any scale that's going to matter.”

14. LINGUISTICS

# How Many Languages? Linguists Discover New Tongues in China

1. Michael Erard*
1. Michael Erard is a freelance writer in Portland, Maine. With reporting by Chen Xi in Beijing.

Researchers working in remote China have uncovered dozens of languages that had been hidden by mountainous terrain and administrative practice.

After a long day in the field, deep in the mountains of southwestern China near the border with Vietnam, retired environmental health professor Gary Shook was surprised to meet another American, Jamin Pelkey, staying in the same government guesthouse. The two exchanged pleasantries.

“I'm collecting tiger beetles,” explained Shook, who had found four new species in the region. “What about you?”

“I'm collecting new species of languages,” replied Pelkey, then a graduate student at La Trobe University in Australia doing fieldwork for his dissertation. In 2006, Pelkey and his wife were gathering linguistic data in 41 villages in a 100,000-square-kilometer area of Yunnan Province. Over the course of a year, they drove 15,000 kilometers across rugged terrain in a Jeep. At the end, Pelkey had identified 24 languages associated with the Phula ethnic group, 18 of which had never been defined scientifically before. Until Pelkey's work, these languages had been invisible because their speakers were lumped together under a single ethnic label, the Yi, which is officially considered to have one language.

At a time when hundreds of languages are disappearing because children don't learn them and adults don't speak them, it may seem surprising that many existing languages have never even been named (though they are not “new,” especially not to the people who speak them). Yet there are potentially hundreds of undiscovered languages in China, Burma, the Amazon, and elsewhere, linguists say. Pelkey's 24 are listed for the first time this month, in the latest edition of Ethnologue: Languages of the World, an authoritative, worldwide gazetteer of languages maintained and published by SIL International, a nonprofit based in Dallas, Texas. This newest edition of Ethnologue lists 6909 living languages from 156 countries, including 83 “new” languages from 19 countries.

Pelkey's new entries are the most from any single country. China is “one of the last places on earth where there are large numbers of unreported and undescribed languages,” says linguist David Bradley of La Trobe, who also works in Yunnan. The reasons have to do with geography, history, and politics. Bradley speculates that Yunnan alone may have over 150 languages, and Western and Chinese linguists are now surveying the region more thoroughly. “In the last few years, there's been very much a heightened interest [by Chinese] in their diversity and a desire to study and work on language maintenance,” says linguist Arienne Dwyer of the University of Kansas, Lawrence. Yet this interest in linguistic diversity sometimes conflicts with the notion of a multiethnic but unified Chinese state. “The reason that language is particularly sensitive is that, in southwestern China, language was the principal way of categorizing people,” says Thomas Mullaney, a historian at Stanford University in Palo Alto, California.

## Language lumping

How can there be so many undiscovered languages in one region? One reason is the remoteness of villages. “Yunnan has so many mountains, and transportation was so limited before the Communists started building roads, and ethnic groups have been proliferating for so many centuries there,” Pelkey says. “The astonishing thing would be to walk into the situation and find only a few dozen languages.”

Yunnan is most frequently identified by the colorfully embroidered clothes and quilted hats of the non-Han ethnic groups who have called the mountains and lowlands home for thousands of years. Because their languages were rarely written down, linguistic change went unchecked. Local and imperial governments had little interest in languages, leaving them uncounted.

Centuries of isolation widened the gap between varieties descended from the same parent tongue. Today, the 500 speakers of Alo Phola can't understand speakers of a sister language spoken less than 8 kilometers away, says Pelkey. One of Pelkey's main criteria for judging language separateness is “mutual intelligibility,” or how well speakers of different varieties are able to understand each other. Among speakers of the 24 Phula languages, mutual intelligibility is so low that if they ever got together, they would have to communicate in a regional variety of Mandarin, Pelkey says.

Many Chinese languages are being described only now in part because a tradition of lumping ethnic groups together has masked the extent of the diversity. Chinese social scientists of the 1930s and '40s streamlined the number of ethnic minority groups, whose classifications were based mainly on language. “The logic was, ‘It does no one any good to have an ethnic group of 100 people,’” says Mullaney. In the 1950s, about 50 surveyors spent 6 months in Yunnan and divided a population of 2 million into 20 official groups, even though 212 ethnic group names had been recorded. In 1991, China permanently froze the number of recognized ethnic nationalities, known as minzu, at 56: the majority Han plus 55 minority groups, 25 in Yunnan. Until the 1980s, it was forbidden to suggest that China had more than 55 languages, Bradley wrote in 2005 in the International Journal of the Sociology of Language. “Any additional linguistic entities had to be classified as ‘fangyan.’” Although the word fangyan is often translated as “dialect,” it refers more specifically to “a language spoken in a specific area,” or a “topolect,” in contrast to yuyan, or an autonomous language.

This legacy has led to some disagreement between Chinese and Western linguists over what counts as a language. “We are very strict, while foreign researchers are very loose,” says linguist Sun Hongkai of the Chinese Academy of Social Sciences in Beijing. Sun began doing fieldwork in 1953 in Sichuan Province and Yunnan and has helped identify 19 languages. He promotes a method different from that of Western linguists, saying that the boundary between a language and a dialect should be determined by comparing grammatical patterns, vocabulary, and sound rules. If they are similar, the varieties are dialects of the same language. Other Chinese scholars add that varieties that come from the same parent language and have the same writing system must be fangyan, not yuyan, and reject mutual intelligibility as unscientific. Using such criteria, the roughly 230 European languages would be fangyan of a handful of languages. Chinese linguists “are still constrained by political realities as well as the traditional macrocategories imposed by the Han Chinese majority on their minorities,” says Bradley.

For example, in a 2008 report to UNESCO on endangered languages in China, Sun listed a single language for the Yi minzu. Although some of the Phula languages Pelkey described are endangered, they cannot be identified as such because the Yi officially have only one language. So it may be harder to target those languages with revitalization resources.

## Into the field

All the same, since the 1980s, Chinese linguistic diversity has become an open secret, and Chinese researchers have become freer to identify new languages as yuyan, says Bradley. In 1992, Sun helped establish an academy project on new languages, for example. Overall, Chinese linguists have identified a total of 134 languages, and the 80 identified in the last 25 years are called yuyan, not fangyan.

The Chinese have also opened their doors to foreign researchers such as Pelkey, who studied under Bradley. In 2005, Pelkey joined SIL International, the world's biggest player in describing minority languages. SIL has a Christian goal: It describes and analyzes languages to aid in Bible translation and literacy projects. In the past, a Christian organization might have had difficulties in China, but such survey work has been encouraged recently because it helps to provide education in mother tongues and coordinate language revitalization, Pelkey says. Pelkey did his research using his affiliation with SIL, which has been registered as a nongovernmental organization in Yunnan since 2004.

Pelkey stayed 3 to 5 days at a time in Yunnan villages, interviewing 10 or so local people. Using a list of 1200 words, he would say a word in southwestern Mandarin Chinese and show a picture of the object, then record people saying the word in their language. On breaks, he recorded people telling stories and played recordings of people from other villages in order to determine mutual intelligibility.

Originally, Pelkey had hypothesized that the languages associated with the Phula ethnonym were related to each other. All are tonal languages with a default subject-object-verb word order and very simple word structure. However, he found that although they have the same ancient ancestry, they're not siblings or even distant cousins. Using a distance matrix, a tool from evolutionary biology that is new to historical linguists, Pelkey determined that Azha (spoken by 53,000 people) and Pholo (spoken by 30,000) don't share the recent ancestry of the other 22 Phula languages. Thus these two would not be fangyan even by Chinese criteria. The speakers of all these languages have been subsumed under the Yi minzu.
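
The distance-matrix idea can be illustrated with a toy sketch. The word lists and varieties below are entirely made up, and real surveys compare cognates and sound correspondences across roughly 1200-item word lists rather than raw spelling distance; this only shows how pairwise distances reveal which varieties cluster together:

```python
from itertools import combinations

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# Aligned word lists for three hypothetical varieties (invented forms).
varieties = {
    "A": ["ma", "tsi", "lo", "ni"],
    "B": ["ma", "tsi", "lu", "ni"],   # close to A
    "C": ["ba", "ki", "ro", "mi"],    # far from both
}

def distance(v1: str, v2: str) -> float:
    """Mean normalized edit distance across the aligned word lists."""
    pairs = zip(varieties[v1], varieties[v2])
    return sum(edit_distance(a, b) / max(len(a), len(b))
               for a, b in pairs) / len(varieties[v1])

matrix = {(x, y): distance(x, y) for x, y in combinations(varieties, 2)}
for (x, y), d in matrix.items():
    print(f"{x}-{y}: {d:.2f}")
# A and B cluster together; C would sit on a separate branch of the tree.
```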

For some communities, linguistic description and discovery are welcomed, but others are uncomfortable with losing traditional affiliations, linguists say. In Sichuan, Bradley says, speakers of 20 to 25 languages in the Tibetan minzu strongly reject any claim that they're anything but Tibetan and so don't want distinct languages to be identified as such.

The 24 new Phula languages included in Ethnologue have now acquired something of an official status internationally because they have been assigned identification codes by the International Organization for Standardization (ISO). Such language codes are used in software, digital archives, and library collections and are an official recognition that a speech variety meets ISO's definition of a “language.” It remains to be seen how the Chinese government will react to this recognition. Says Mullaney, “When people start to talk about there being new languages out there, it really starts to pull the thread out of this idea that there are a set number of minzu.”

Pelkey hopes a discussion will ensue. “You start out with assumed categories, then you find a lot of diversity inside them, and then you use a scientific approach to modify your understanding,” he says. “The two don't have to be in dissonance, and they don't have to be in consonance, either.” Otherwise, defining a language invites so much controversy that discovering new species of beetles looks like a walk in the park.