# News this Week

Science  06 Feb 2004:
Vol. 303, Issue 5659, pp. 740
1. NUCLEAR PHYSICS

# New Chemical Elements Probe the Shoals of Stability

1. With additional reporting by Daniel Clery.

Nuclear scientists from Russia and the United States have penciled in two new entries on the periodic table. The new superheavy elements disintegrate within a second or so, but their fleeting existence suggests that an “island” of relatively stable nuclei lurks just beyond experimenters' grasp. “Our interest is not just to find one more element,” says Yuri Oganessian of the Joint Institute for Nuclear Research (JINR) in Dubna, Russia. “The key point is to conclude that the island of stability exists.”

Researchers have long known that some atomic nuclei hold together better than others. The protons and neutrons within a nucleus nestle into a series of quantum-mechanical shells, and certain “magic numbers” of protons and neutrons fill the shells and confer additional stability. For example, 82 is a magic number for protons, as is 126 for neutrons. Bismuth, whose nuclei contain 83 protons and 126 neutrons, is the heaviest element that doesn't undergo radioactive decay.

Theorists suspect that the next magic number for neutrons is 184 and the next magic number for protons may be 114. So superheavy nuclei containing nearly 114 protons and 184 neutrons should be relatively stable. Instead of instantly falling to pieces in a process called spontaneous fission, they ought to disintegrate more slowly by spitting out alpha particles, which consist of two neutrons and two protons. If dry land signifies absolute stability, nuclei near the jumbo magic numbers are still a bit wet, says Kenton Moody, a nuclear chemist at Lawrence Livermore National Laboratory in California. “The undersea mountain of stability might be a more appropriate name,” Moody says.

Island or seamount, in recent years nuclear physicists and chemists have been wading toward their goal by smashing together heavy nuclei. Using JINR's ion accelerator, the Livermore and JINR scientists produced 114- and 116-proton elements that lasted many times longer than their slightly lighter cousins (Science, 22 January 1999, p. 474). Now, the team reported this week in Physical Review C, it has created four nuclei of element 115 by blasting calcium-48 ions, which have 20 protons and 28 neutrons, into atoms of americium-243, which have 95 protons and 148 neutrons. In a fraction of a second, the nuclei decayed into another new element, number 113, which appears to last as long as a second.
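The fusion bookkeeping described above can be checked with a few lines of arithmetic. This is only a back-of-envelope sketch: the number of neutrons that evaporate during fusion is an illustrative assumption, not a figure stated in the text.

```python
# Back-of-envelope check of the fusion arithmetic described above.
# Proton (Z) and neutron (N) counts are taken from the article.
ca48 = {"Z": 20, "N": 28}    # calcium-48 beam ion
am243 = {"Z": 95, "N": 148}  # americium-243 target atom

# The compound nucleus simply sums the protons and neutrons.
Z = ca48["Z"] + am243["Z"]   # 115 -> element 115
N = ca48["N"] + am243["N"]   # 176 neutrons

MAGIC_N = 184  # next predicted neutron magic number

# Illustrative assumption: a few neutrons boil off during fusion
# (the article does not give the exact number). With ~3 evaporated,
# the surviving nucleus sits roughly a dozen neutrons short of 184,
# consistent with the article's "about a dozen neutrons short".
evaporated = 3
print(Z, N - evaporated, MAGIC_N - (N - evaporated))  # -> 115 173 11
```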

Such superheavy nuclei should help researchers develop a more unified theory of the nucleus, says Richard Casten, a nuclear physicist at Yale University in New Haven, Connecticut. The nuclei still fall about a dozen neutrons short of the magic 184. To reach that number and the center of the island, researchers need a facility that can accelerate rare unstable nuclei, such as the proposed Rare Isotope Accelerator (RIA), says Konrad Gelbke, director of the National Superconducting Cyclotron Laboratory at Michigan State University. “If you will ever do it,” Gelbke says, “RIA is your only shot.”

In the meantime, the team from Dubna and Livermore will continue to feel around the edges of the island of stability, says Mark Stoyer, a nuclear chemist at Livermore, to see if it's “a tiny Gilligan's Island or as large as Cuba.” Either way, all agree the water is getting shallower.

2. FRANCE

# Researchers Issue an Ultimatum

1. Barbara Casassus*
1. Barbara Casassus is a writer in Paris.

PARIS—The directors of hundreds of French research centers say they will stop performing administrative duties on 9 March if the government does not meet demands to restore budget cuts and jobs. Researchers in France's government-funded research agencies have signed a petition, launched on the Internet on 7 January, that already boasts more than 37,500 signatures. The petition calls for the government to pay the agencies what it still owes them for 2002, step up recruitment of young researchers, and hold a conference on the future of French research.

Relations between researchers and the government have been deteriorating since deep budget cuts were imposed last year. Biologist Alain Trautmann of the Cochin Institute in Paris, a spokesperson for the petitioners, says: “I feel as though we are standing on the edge of an abyss. If we don't change direction completely, we will fall in.”

The protesters are demanding reinstatement of some 550 permanent posts for young researchers and engineers that were converted into 3- to 5-year contracts this year, and reversal of the government's decision to create no new posts for university lecturers or professors. If the directors carry out their threat on 9 March, labs could close, because staff members are not covered by accident insurance if they have no boss.

President Jacques Chirac and his government have tried to placate the researchers. The government acknowledges that it owes research agencies $240 million for 2002 and has promised to come up with the money by next year. It has also promised legislation to reform the research system, an audit by 20 February to determine why labs and the government disagree on budget figures, and no spending cuts this year. But researchers are not satisfied. Last week up to 10,000 marched through Paris in protest. Then, in a further concession late last week, French Academy of Sciences President Étienne-Émile Baulieu told Science that research minister Claudie Haigneré had endorsed the idea of a conference and agreed that it be organized by researchers.

3. BIOMEDICAL POLITICS

# Sex Studies 'Properly' Approved

1. Jocelyn Kaiser

There is nothing improper in the National Institutes of Health's support for studies of human sexuality, NIH Director Elias Zerhouni has informed Congress. In a 26 January letter to congressional leaders, Zerhouni rejected criticisms from an independent conservative group, which challenged NIH grants last year and asked Congress to investigate. Many scientific societies saw this as an ideological attack on peer-reviewed science and objected; last week, several said they were reassured by Zerhouni's response.

NIH began reviewing its portfolio after the House nearly eliminated four sexual research grants in July 2003. Several lawmakers raised questions about more grants at a 2 October hearing. A staffer at the House Energy and Commerce Committee then forwarded to NIH a list of 198 studies compiled by the Traditional Values Coalition, a conservative advocacy group (Science, 31 October 2003, p. 758). Coalition director Andrea Lafferty called the research, on topics ranging from AIDS risk among drug users to teen-pregnancy prevention, “smarmy” and a waste of taxpayers' money.
Not so, concludes Elias Zerhouni in a two-page letter sent to Commerce Committee chair Billy Tauzin (R-LA) and seven other House and Senate members. The “peer-review process … worked properly,” and “I fully support NIH's continued investment in research on human sexuality,” Zerhouni wrote in a letter he drafted himself, according to an NIH spokesperson.

An attached six-page summary explains three widely reported grants. A study of prostitutes and truck drivers may help prevent heterosexual transmission of HIV, the letter says. A conference on sexual functioning could shed light on dysfunction, “a major cause of divorce.” And understanding the sexual behavior of older men “has important implications for families.”

The summary does not explicitly discuss other controversial studies—for example, on transgendered Native Americans and Asian prostitutes in California—but says, for example, that sex worker studies “are certainly not promoting an illegal activity”; they are “trying to stop the devastation it can unleash” by spreading disease. “Some of this research has unseemly titles because, frankly, the research involves looking at difficult, albeit real, components of the human condition,” the summary concludes. It also explains the peer-review process, saying that all the grants are “scientifically justified,” received fundable quality scores, and are “connected to clear public health priorities.”

Advocates of biomedical research praise the letter. “We are pleased that NIH has reiterated its support for sound science” involving “significant” public health issues, says Karen Studwell, a policy analyst for the American Psychological Association. (APA, AAAS—publisher of Science—and other groups issued statements defending NIH's peer-review process.) But the conservative critics aren't satisfied. Zerhouni's letter “was unconvincing,” says the coalition's Lafferty. “This is just an attempt to disguise all of this as science.” Some lawmakers may not be placated, either.
Although Tauzin's spokesperson declined to comment, Commerce Committee member Representative Mark Souder (R-IN), who questioned a study of women's reaction to pornography, called the letter “an unbelievable rationalization.” Observers expect the sex grants to come up later this year during congressional hearings on NIH's 2005 budget appropriation.

4. CONDENSED-MATTER PHYSICS

# New-Style Matter Opens Cool Middle Ground

1. Charles Seife

Once more, scientists have ventured into the frontier of very cold matter. The creation of a new state of matter, known as a Fermi condensate, promises an exciting new way to understand the mysterious physics of high-temperature superconductors.

Last November, two teams of researchers announced that they had created a type of matter known as a Bose-Einstein condensate (BEC)—a cluster of particles that acts like a single, enormous quantum-mechanical object—out of atoms belonging to a class of particles known as fermions (Science, 14 November 2003, p. 1129). A fermion, by nature, cannot act just like its neighbors. So it was a major achievement to get fermions to pair up, condense into a BEC, and march in quantum-mechanical lockstep.

Last week one of the two groups, led by Deborah Jin of the Joint Institute for Laboratory Astrophysics (JILA) in Boulder, Colorado, announced at a press conference that it has taken the next step. As in the earlier experiment, the team used lasers and magnetic fields to cool a potassium gas to a few hundred millionths of a degree above absolute zero. An additional magnetic field adjusted how much the potassium atoms attracted one another. In the BEC experiment, Jin and her colleagues used these supplementary magnetic fields to bind the fermions into loose “molecules” that could condense into a BEC. In the present experiment, however, Jin's group tuned the fields so that the potassium atoms repel one another slightly. “The binding cannot occur,” says Jin.
Nevertheless, the fermions still pair up in a sense, each partner affecting its counterpart's motion enough that the fermions can still condense. This condensation is more similar to what happens in a superconductor, where repulsive electrons form “Cooper pairs,” than what happens in a BEC. Although BECs and superconductors are related, their behavior is explained by theories based on different assumptions. The region where they overlap—precisely where this experiment probes—is murky. “It's a very new regime; it's someplace where the theory isn't clear,” says Jin.

Eric Cornell, a leader of a different JILA cold-temperature physics group, says that scientists used to think that a condensate was either BEC-like or purely superconductor-like. “There was no compromise,” he says. But Jin's work proves this idea false, he says. Indeed, Jin's preliminary data seem to show that there's a gradual change from the BEC to the superconducting regime. “We don't see a sudden change in the behavior of the system,” she says.

Jin believes that the work may illuminate the physics behind high-temperature superconductors: materials that conduct electricity with no resistance at liquid-nitrogen temperatures rather than the much colder liquid-helium temperatures of traditional superconductors. That understanding, she adds, could someday lead to practical applications. In the meantime, says Cornell, the experiment is “a technological and scientific tour de force.”

5. SOUTH ASIA

# India, Pakistan Hold First Science Talks

1. Pallava Bagla

NEW DELHI—The science ministers of India and Pakistan met last week for the first time in the two countries' 57-year history. The meeting anticipated a “composite dialogue” later this month between the leaders of the two nuclear powers, which have fought three wars and threaten each other along a tense border.
Researchers in areas such as biotechnology, nanotechnology, natural product chemistry, and information technology are likely to be the first to feel a warming in relations, as the two sides agreed to set up expert panels to explore these areas. “I am here to build bridges” and encourage scientists to join hands, Pakistan's federal minister for science, technology, and higher education, Atta-ur-Rahman, told Science. India's Murli Manohar Joshi, minister for human resource development, science and technology, and ocean development, was said to be more circumspect about the 45-minute meeting, although a government statement later pointed to “opportunities for expanding bilateral scientific cooperation.”

In addition to opening doors for working scientists, such moves could also improve the professional status of many who live on both sides of the border, such as Shazia Jamshed, a pharmacologist at the University of Karachi who is married to Indian organic chemist M. J. Siddiqui. Jamshed came to India in 1999 and was accepted into a doctoral chemistry program at the University of Delhi, but the foreign office refused to grant her permission to enroll. Now working as a medical transcriptionist, Jamshed hopes that the fruit of ongoing scientific talks might “change the suspicious mindset of the bureaucracy” and allow her to pursue her career.

The director of India's well-regarded Central Drug Research Institute in Lucknow is hoping for a more immediate payoff. The institute has been pursuing high-gum-yielding varieties of a wild plant, Commiphora mukul, that yields the popular lipid- and cholesterol-lowering drug gugulip. “But the best plants grow in Pakistan,” says institute director Chhitar Mal Gupta. In the past, he notes, the institute has worked jointly with the HEJ Research Institute of Chemistry in Karachi on research involving medicinal plants and natural products chemistry.

6. INTELLECTUAL PROPERTY

# Court Tells Nichia to Pay Blue LED Inventor $180 Million

1. Dennis Normile

TOKYO—Japanese courts last week delivered stunning monetary awards to two corporate researchers who claimed that they had received inadequate compensation for inventions produced for their employers. The plaintiffs hailed the victories as progress on the road to better treatment for corporate scientists. But others warned that the awards could spell the road to ruin for companies by undermining the potential payoff of a breakthrough discovery.

In the most eye-popping decision, handed down on 30 January, the Tokyo District Court awarded $180 million to materials scientist Shuji Nakamura for his development of a blue light-emitting diode (LED) while employed by Nichia Corp. of Anan, Tokushima Prefecture. That's a million times more than the $180 that the company originally paid Nakamura, now a professor at the University of California, Santa Barbara, for the rights to a key LED patent. In a separate case decided 1 day earlier, the Tokyo High Court ordered Hitachi Ltd. of Tokyo to pay $1.5 million to former company researcher Seiji Yonezawa for three key technologies that are at the heart of CD players and other optical disk devices.

Katsuya Tamai, a professor of intellectual property law at the University of Tokyo, says the decisions reflect the ambiguity in existing patent laws. Currently, patents are given to individuals, who may cede rights to their employers in exchange for “reasonable” compensation. The two recent cases hinged on the definition of that term, which is not spelled out in the law. Although a growing number of researchers have won suits alleging that they were treated unfairly by their companies, until now the awards have been small.

What may have tipped the scales in the Nakamura case is the fact that LEDs are a multibillion-dollar industry. Blue LEDs can be combined with previously developed red and green LEDs in giant outdoor displays and to produce white-light devices that could supplant conventional light bulbs. The court determined that Nichia had earned more than $1.1 billion in profits from the technology since it was commercialized in 1993. In the Hitachi case, the appellate court took the unusual step of quadrupling the award of a lower court, which had ruled that Yonezawa was entitled to a slice of the company's domestic licensing but not its foreign activities.

Nichia, in a statement posted on its Web page, criticized the court decision for an “excessive” interpretation of the provisions of the patent law. It has already appealed the ruling. (Hitachi is also planning to appeal to the country's highest court.) In the meantime, Japan's leading daily economics newspaper, Nihon Keizai Shimbun, called the rulings “out of touch with the realities [of business] in Japan.” It warned that a rash of such verdicts could “strip profits from many technology-oriented companies.” Tamai says he worries that more large awards could pressure corporations to move their laboratories offshore. What's needed, he says, is reform of the patent law.

Not surprisingly, Nakamura offers a more positive take. Larger awards, he told a press conference, will create financial incentives for scientists “that will have everyone striving to make discoveries. … I think this will fuel the dreams of young people interested in science.”

7. GENETICS

# Development Gene May Give Nerve Cells a Sense of Identity

1. Elizabeth Pennisi

Under the microscope, it's hard to tell brain cells apart. But similarities can be deceiving: Neurons acquire unique identities during development, each finding its correct place in the brain and connecting with the appropriate neighbors. A new study suggests that, in fruit flies at least, a gene called Dscam has the flexibility to endow specific groups of neurons, even individual cells, with “Hello, my name is” tags. The gene comes in 38,000 flavors.

Like many genes, Dscam, which stands for Down syndrome cell adhesion molecule, consists of protein-coding regions called exons interspersed with noncoding regions. It has more than 100 exons, some of which—or even fragments of which—can become active separately in a process called alternative splicing. Each combination of expressed gene segments creates a different protein. (In humans, Dscam lacks the variability seen in its insect versions.)

Although immune system genes were known to mix and match their exons, researchers were initially surprised to find another gene in which so many combinations were possible, says evolutionary developmental biologist Andrew Chess of the Massachusetts Institute of Technology (MIT). To track the variations, the team used DNA microarrays, which reveal which genes are most active in a given sample. Chess, MIT's Guilherme Neves, and colleagues assessed the activity of about 19,000 Dscam variants possible from the three most readily divisible exons. They exposed the microarray to genetic material belonging to fruit fly embryos, larvae of different stages, and adults.
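The isoform counts quoted here follow from simple multiplication across the alternatively spliced exon clusters. A minimal sketch, using the exon-cluster sizes published for Drosophila Dscam (12, 48, and 33 mutually exclusive alternatives in three extracellular clusters, plus 2 alternative transmembrane exons; these specific counts are not given in this article):

```python
# Combinatorics behind the isoform counts quoted above, using the
# exon-cluster sizes published for Drosophila Dscam (not stated in
# this article): 12, 48, and 33 mutually exclusive alternatives in
# three extracellular clusters, plus 2 transmembrane alternatives.
from math import prod

extracellular_clusters = [12, 48, 33]
transmembrane_alternatives = 2

# "about 19,000 Dscam variants possible from the three most readily
# divisible exons": one choice per cluster, so the counts multiply.
three_cluster_variants = prod(extracellular_clusters)  # 19,008

# "38,000 flavors" once the transmembrane choice is included:
total_isoforms = three_cluster_variants * transmembrane_alternatives  # 38,016

print(three_cluster_variants, total_isoforms)  # -> 19008 38016
```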

Different combinations of Dscam variants were active at different ages, the researchers report online this week in Nature Genetics. They also found variety in Dscam patterns among individual neurons. Further analyses showed that each variant formed independently of the others and that the patterns of gene expression were somewhat random. The variability “may help each cell know it's different from its neighbor,” Chess proposes.

“This is the first evidence for the possible presence of distinct Dscam molecules in individual [nerve] cells,” says Tzumin Lee, a cell biologist at the University of Illinois, Urbana-Champaign. Next, the team hopes to demonstrate “whether and how the presence of distinct Dscam molecules helps provide for the huge diversity and specificity in the central nervous system.”

8. PSYCHOPHARMACOLOGY

# FDA Weighs Suicide Risk in Children on Antidepressants

1. Constance Holden

Amid growing concerns that some antidepressants may make some children suicidal, the U.S. Food and Drug Administration (FDA) acknowledged a possible risk in a public hearing this week. Members of its drug advisory panel urged that, pending more information, stronger warnings should be put on labels for selective serotonin reuptake inhibitors (SSRIs). There's a “signal” in the data from some clinical trials of children taking SSRIs, officials said at the hearing in the crowded ballroom of a hotel in Bethesda, Maryland. The agency put forth plans for a thorough review of research to find out what, if anything, that signal means.

SSRIs have been generally accepted as the only antidepressants that are safe and effective for children and adolescents. Gianna Rigoni of FDA said that in 2002, 10.8 million prescriptions were written for those under 18. Leading the pack are paroxetine (Paxil) and sertraline (Zoloft), which in recent years have surged ahead of pioneer fluoxetine (Prozac).

But with popularity has come an increasing number of reports of adverse effects. Critics claim that SSRIs lead to extreme agitation, hostility, suicide, and homicide. Many of the speakers at the public meeting shared heart-stopping stories about young family members who had killed themselves.

Defenders of the drugs counter that there's no good evidence that they are harmful. Last week, the American College of Neuropsychopharmacology announced the results of a review of 15 clinical trials of paroxetine and six other SSRIs covering more than 2000 youths. There were no suicides. And the difference between drug and placebo groups in “percent of youth with suicidal behavior or ideation” was not statistically significant in any trial. Finally, the review noted that since prescribing has become widespread, there has been a dramatic drop—averaging 33%—in rates of youth suicide in 15 countries.

Those challenging the drugs have different numbers. Foremost among SSRI critics is David Healy, director of the North Wales Department of Psychological Medicine in Bangor, U.K. He contends that companies have kept the lid on research that fails to show efficacy for their drugs. He says he has had access to the files of all the trials conducted on Paxil by GlaxoSmithKline and that the rate of suicidality among the SSRI takers is three times that for the placebo group.

Healy estimated that SSRIs lead to 100 excess suicides each year. But Columbia University psychiatrist John Mann counters that far more lives would be saved with more SSRI use. The “vast majority” of the 4000 young people who kill themselves each year in the United States are not taking antidepressants, he says, and SSRIs might ease their misery.

Last year, the United Kingdom's Medicines and Healthcare Products Regulatory Agency banned the prescribing of all SSRIs except fluoxetine to people under age 18. Some want FDA to follow suit. It has been gradually tightening surveillance of SSRIs since last spring, reported Thomas Laughren of FDA's Division of Neuropharmacological Drug Products. In October, it warned doctors that studies have not ruled out increases in suicidal thinking among youth given SSRIs and reminded them that only fluoxetine is approved for use on children.

Some SSRI critics claim that the United Kingdom acted on the basis of data that have been withheld from public view. But Laughren told those at the hearing that everyone probably had the same information. “It doesn't appear that the U.K. did any analysis other than what the drug companies did,” he said. The agency is asking a group at Columbia University to come up with a detailed classification of suicidal and self-injurious behaviors that will help them reanalyze 25 clinical trials with SSRIs. The studies cover not only depression but also anxiety disorders including obsessive-compulsive disorder and attention deficit hyperactivity disorder. The agency will reassess the debate at another meeting in late summer.

9. SPACE-BASED ASTRONOMY

# Hubble Huggers Get a Reprieve

1. Andrew Lawler

Under pressure from a senior U.S. senator, a bevy of astronomers, and hordes of interested amateurs, NASA is taking another look at its recent decision not to send another shuttle flight to service and upgrade the Hubble Space Telescope. Harold Gehman, the retired admiral who led the investigation last year into the Columbia shuttle disaster, will conduct his own review of the agency's decision to seal Hubble's fate.

Last week's reversal marks a victory for Senator Barbara Mikulski (D-MD), who criticized NASA chief Sean O'Keefe for his abrupt decision. The ranking member of the panel that funds NASA, Mikulski received a standing ovation from staff at the Space Telescope Science Institute in her hometown of Baltimore when she visited it shortly after O'Keefe's announcement. But Mikulski warned the enthusiastic crowd not to assume that Hubble has been rescued.

O'Keefe has concluded that a servicing mission would put astronauts in jeopardy, because the telescope is in a different orbit from the international space station and there would be no safe haven in the event of trouble. In order to conduct such a mission safely, NASA officials say, a second shuttle would have to be readied on the pad and a second crew trained for a possible rescue. That would be a complicated and expensive venture, and NASA is loath to devote time to anything that would divert resources from the president's goal to complete the orbiting laboratory by 2010. A shuttle mission to Hubble was planned for as early as 2006.

But Mikulski believes Hubble shouldn't be written off so readily. In a 21 January letter to O'Keefe, the veteran legislator asserted that Hubble “is the most successful NASA program since Apollo … and cannot be terminated prematurely with the stroke of a pen.” O'Keefe proposed to her that Gehman conduct an independent review of the matter, and the senator agreed. “She respects Admiral Gehman and will respect his opinion,” says her press spokesperson Amy Hagovsky. Gehman's report on the Columbia accident did not take a stand on whether a Hubble mission would be too risky.

Gehman's review is expected to be completed within 2 months.

10. NATION BUILDING

# Reports to U.N. Propose Bigger Role for Science

1. Jeffrey Mervis

Two reports delivered this week to United Nations Secretary-General Kofi Annan make a strong pitch for developing nations to build up their scientific institutions if they hope to improve conditions in their countries. The two reports, written independently, both underscore the importance of improving universities, funding the best science through peer review, and providing government leaders with impartial technical advice. Whereas those elements are woven into the fabric of scientifically advanced nations, the reports note, they are often lacking in the rest of the world.

“Too many poor countries see science as a luxury, something that's only for the rich,” says U.S. National Academy of Sciences President Bruce Alberts. “Even worse, some of them see it as a nuisance because it requires truth-telling.” Alberts is co-chair of the InterAcademy Council (IAC)—a group of 90 national science academies formed in 2001 to provide advice to governments around the world—which produced one of the reports. The other, still in draft form, was written by a task force formed to help implement the goals of the U.N.'s 2000 Millennium Summit.

The two reports were compiled by scientists, educators, and policymakers from around the world, and both maintain that countries cannot overcome myriad food, health, environmental, and other problems without help from the scientific community. The U.N. task force, for example, advocates “entrepreneurial universities” that would encourage faculty members to tackle these pressing problems at the same time that they pursue excellence in teaching and research. “Higher education is important, but the goal is not just churning out more graduates,” says Harvard University's Calestous Juma, who co-chaired the panel. “Universities also need to serve as incubators for business and incorporate those skills into the curriculum.” Adds Mamphela Ramphele of the World Bank, a former vice chancellor of the University of Cape Town, South Africa, and a panelist for the IAC report, “You need a higher-education community that wakes up to its responsibility to be a champion of reform.”

The two reports also put considerable weight on the need for scientists to advise their governments on how best to allocate limited resources. Juma cites the need to reach the most senior officials who determine national policy, and Alberts talks about the value of “merit-based institutions” for all sectors, from dispensing wisdom to hiring faculty and funding the best research proposals. “It's not easy to get rid of the corruption and the cronyism,” Alberts says. “But the cost of not moving forward is to be left further behind.”

Systemic reform is needed, he adds, beginning with each country's own scientific infrastructure. “We decided to focus on institution building as the best way to make a difference,” he says about the council's first-ever report. “There's all this terrific science out there on the Internet, for example. But it's no good to a developing nation without the talent and the mechanisms to use it.”

11. SCIENCE POLICY

# 2005 Budget Makes Flat a Virtue

1. David Malakoff*
1. With reporting by Yudhijit Bhattacharjee, Jocelyn Kaiser, Jeffrey Mervis, Charles Seife, and Erik Stokstad.

Federal science managers are facing tough choices as nondefense programs get little if any increase under President Bush's proposed budget for the upcoming year.

As a doctoral student in ecology 3 decades ago, Paul Gilman learned all about environmental stress. Now, as head of the Environmental Protection Agency's (EPA's) research office, he's trying to adapt to a stressful fiscal environment: Under the Bush Administration's 2005 budget proposal released this week, EPA's R&D spending would drop significantly, forcing Gilman to find new ways to stretch his dollars.

It's a skill many scientists may have to learn next year. Facing an expensive war in Iraq and soaring federal budget deficits, President George W. Bush on 2 February unveiled a $2.4 trillion budget blueprint that would barely increase the budgets of most major nondefense science agencies. Overall, government spending on R&D would rise 4.7%, to $132 billion, for the 2005 fiscal year that begins 1 October. Nearly three-quarters of the new money would go to defense-related programs, however. The Pentagon's technology programs would grow 7%, to nearly $70 billion, and the Department of Homeland Security's applied science efforts would get a 15% boost, to $1.2 billion.

Nonsecurity science, in contrast, would rise by just 2.5% overall. Leading the pack is NASA, with a 5.6% rise, to $16.2 billion. Its science budget, however, would grow by only 0.5%, to $6.6 billion. Following are the National Science Foundation (NSF), slated for a 3% boost, to $5.7 billion, and the National Institutes of Health (NIH), with a 2.7% increase, to $28.8 billion. The Department of Energy's (DOE's) Office of Science, meanwhile, would remain flat at $3.4 billion. Bush Administration officials say those numbers look pretty good considering that the White House held the entire discretionary, nondefense budget—the $489 billion piece of the spending pie that pays for everything from food safety to environmental protection—to just a 0.5% increase. “This is an Administration that's been good to R&D,” says White House science adviser John Marburger, adding that research spending is at record highs by several measures.
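As a back-of-envelope check, the prior-year baselines implied by the figures quoted above can be recovered from the stated growth rates. This is only a sketch using numbers in the text; rounding in the article's figures makes the results approximate.

```python
# Recover the implied FY2004 baselines (in billions of dollars) from
# the FY2005 totals and percent increases quoted in the article.
fy05_figures = {
    "total R&D": (132.0, 4.7),  # amount ($B), increase (%)
    "NASA": (16.2, 5.6),
    "NSF": (5.7, 3.0),
    "NIH": (28.8, 2.7),
}

for agency, (amount, pct) in fy05_figures.items():
    # If FY05 = FY04 * (1 + pct/100), then FY04 = FY05 / (1 + pct/100).
    baseline = amount / (1 + pct / 100)
    increase = amount - baseline
    print(f"{agency}: FY04 ≈ ${baseline:.1f}B, increase ≈ ${increase:.2f}B")
```

For example, the overall R&D figure implies a roughly $126 billion FY2004 base and an increase of about $5.9 billion in new money.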

Science advocates of all stripes aren't impressed, however. “I am very disappointed. … We just have to find a way to do better,” said House Science Committee chair Sherwood Boehlert (R-NY). “The nation must pull itself out of our deficit spiral, but we cannot do so by shortchanging research,” added Nils Hasselmo, president of the Association of American Universities in Washington, D.C. “It's looking pretty ugly,” sums up Tom Jones, a co-chair of the Coalition for National Security Research, which monitors Pentagon research budgets.

Even NSF Director Rita Colwell, who as a presidential appointee is duty-bound to defend White House budgets, appeared to struggle with the numbers her agency has been dealt. “We're pleased to receive a 3% increase when many agencies are facing cuts,” she began her public briefing. But “I have no doubts that NSF merits” a 5-year doubling trajectory, she said, to $9.8 billion by 2007, called for in a 2002 law that is not binding on Congress. “And the $19 billion budget recommended [last week] by the National Science Board is fully justified and, frankly, necessary,” she concluded.

Now, science backers are looking to Congress—which began examining the White House plan this week—to ease their pain. They may not find much relief, however. “With a $500 billion deficit [this year], and the bill for Iraq still due, it's going to be hard to find much cash,” predicts a veteran budget aide.

One major focus for research advocates will be NIH's budget, which enjoyed double-digit increases from 1998 to 2003. “There's no question that this [year's request] is a slowdown,” said Tommy Thompson, Secretary of the Department of Health and Human Services, NIH's parent agency. But given NIH's “huge” recent increases, “this is one area where we felt we could tighten the belt,” he said.

The request includes a 7.5% increase for biodefense research, bringing the total to $1.7 billion. Some funds would go to building 20 new biosafety level 3 labs. There is also $47 million to develop treatments to counter a nuclear or radiological attack. NIH's new Roadmap—initiatives that cut across NIH's 27 institutes and centers—would get $237 million. And an obesity initiative would get a 10% boost, to $440 million. But funding some of these priorities would come at the expense of new grants (see sidebar).

NSF's overall 3% increase, to $5.75 billion, papers over reductions in the agency's budget that will displease many of its constituents. For example, its bread-and-butter research account is projected to go up by 5%, or $200 million. But $80 million of that is the remnant of a much bigger math and science education partnership program that the White House wants to move to the Department of Education (Science, 16 January, p. 295). Colwell says the new accounting reflects the fact that the agency “is now focused on other [education] programs.” With another $50 million of the increase going to beef up NSF's share of a $1 billion governmentwide nanotechnology initiative, most of the foundation's research directorates would receive only a 2% increase.
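The NSF arithmetic is easy to check. A minimal sketch (the roughly $4 billion research-account base is inferred here from the "5%, or $200 million" figure; it is not stated directly in the article):

```python
# NSF 2005 research-account arithmetic, in $ millions.
# The $80M education-partnership remnant and the $50M nanotechnology
# share are booked inside the nominal 5% increase, leaving far less
# for the research directorates themselves.
increase_total = 200                    # "5%, or $200 million"
base = increase_total / 0.05            # implied account base: $4,000M
education_remnant = 80                  # moved-in partnership program money
nano_share = 50                         # NSF share of nanotech initiative
left_for_directorates = increase_total - education_remnant - nano_share
pct_for_directorates = 100 * left_for_directorates / base
print(f"{pct_for_directorates:.1f}%")   # roughly the 2% the article cites
```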

NSF's education programs overall take an 18% hit. In addition to the loss of the partnership program, there are sharp drops in programs serving K-12 teachers, minority institutions, museums, and states with few research-intensive universities. But NSF plans a 10% boost in the number of graduate fellows supported through three popular programs. And the agency's major new research facilities account would continue to grow, with money for three new projects: a high-energy physics experiment, a national ecological observatory network, and an upgrade of an ocean-drilling vessel.

In a very constrained budget, the biggest surprise is a $75 million management initiative. The money, which represents nearly half of NSF's overall $167 million increase, would be spent on an improved, more secure computer system; better training; and more staff. “This is NSF's single greatest need this year,” says Colwell.

NASA's rising budget is tied to the beginnings of a major reorganization aimed at implementing the White House's new plan for returning to the moon and going to Mars (Science, 30 January, p. 610). As part of the shakeup, the agency plans to accelerate construction of the James Webb Space Telescope, but other projects would be delayed.

At DOE, the biggest news was Secretary Spencer Abraham's plan to merge two of its national laboratories, the Idaho National Engineering and Environmental Laboratory and Argonne-West, both near Idaho Falls. The goal is to create “the world's premier nuclear energy technology center within a decade,” Abraham said. The laboratory will concentrate on next-generation nuclear power plants, research on space nuclear power, and a new fuel cycle for nuclear plants.

A flat research budget, however, spells trouble for DOE's three fusion energy research facilities. The trio—in New Jersey, Massachusetts, and California—would lose 4 weeks of run time, to 14 weeks, under a $264 million fusion budget. And neither the Brookhaven nor Berkeley national lab will get funding for support buildings that Office of Science head Ray Orbach says are “desperately needed.”

Core science programs at the National Institute of Standards and Technology would get a 20% boost, to $482 million, including nearly $60 million to build and renovate laboratories in Colorado and Maryland. But the White House once again calls for abolishing the Advanced Technology Program, which funnels tens of millions of dollars a year to companies for early-stage research. Congress has repeatedly rejected the idea.

Meanwhile, Gilman faces a 12% cut, to $572 million, in EPA's research account. Administration officials say most of the decrease stems from removing congressionally ordered “earmarks.” But the request also cuts the agency's Science to Achieve Results (STAR) grants program by 34%, to $66 million, and a related STAR fellowship program by 40%, to $5.9 million. Gilman says the cuts are the result “of having to come to grips with other priorities,” such as developing a new generation of clean school buses.

Outsiders give Gilman credit for doing a good job under strained budget circumstances. But Granger Morgan of Carnegie Mellon University in Pittsburgh, Pennsylvania, says that the Administration shouldn't be forcing its science managers to be so creative. “You can't argue that the country needs science-based regulation and then not make the investment,” he says.

12. SCIENCE POLICY

# Highlights From the Budget

1. Jocelyn Kaiser,
2. David Malakoff,
3. Charles Seife

## New Grants Are Sickly Measure for NIH

The biomedical community has long measured the health of the National Institutes of Health by the annual number of new and competing grants it funds. By that measure, NIH is about to flat-line.

The pressure on new NIH grants results from the modest 2.7% increase the president has proposed. The bulk of the $764 million is committed to priorities such as biodefense and the cross-NIH Roadmap, as well as to 4-year grants already under way. “This is a difficult budget, and we're doing everything we can [so] it doesn't damage … the most important aspects of the budget,” says NIH Director Elias Zerhouni.

That means institutes are scrambling to find ways to preserve the investigator-initiated grants that make up about 54% of NIH's overall budget. But unless Congress comes to the rescue, some pain seems inevitable. Although the overall number of NIH grants is expected to hit an all-time high of 39,986, the 2005 budget request would support just 10,393 new and competing grants, the same figure as in 2003. And that comes after a dip of 258 in this year's total, the result of NIH funding more grants than it had planned to support in 2003 and getting less money than expected in 2004 for research.

Moreover, to help shore up grant numbers, NIH is squeezing grant size: The new batch will be only 1% bigger than this year's (continuing grants will grow by 1.9%), well below the projected inflation rate for biomedical research costs of about 3.5%.

The silver lining in the dark NIH cloud is that the research community had expected even worse. The Federation of American Societies for Experimental Biology (FASEB) last month predicted that new grants would plummet to 9925 in 2005. But 2005 “really is a halt to the growth” of the previous half-decade, says FASEB's Howard Garrison. And success rates—the portion of submitted grant proposals that are funded—will dip to 27%, the first time since 1996 that it's been below 30%, FASEB notes. “The message will come out loud and clear to people who depend on NIH for grant support: The situation is becoming more difficult as the opportunities in biomedical research are increasing,” says Garrison.
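The success-rate figure implies a rough application volume. A small sketch (the application count below is derived here, not stated in the article, and assumes the 27% rate applies across the 10,393 new and competing awards):

```python
# Implied NIH application volume behind FASEB's 27% success rate.
# success_rate = awards / applications, so applications = awards / rate.
new_awards = 10_393          # new and competing grants in the 2005 request
success_rate = 0.27          # FASEB's projected success rate
implied_applications = new_awards / success_rate
print(round(implied_applications))  # ~38,493 proposals competing for funds
```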
## Reviews Play a Bigger PART of Budgeting

When White House officials rolled out the Environmental Protection Agency's (EPA's) budget this week, they revealed some bad news for ecosystems research: A $132 million EPA program that funds an array of in-house and competitive grants to monitor and model ecosystems would lose $20 million.

The program is one of 58 government research efforts that the White House Office of Management and Budget (OMB) put under the microscope last year in a somewhat controversial process called PART (Program Assessment Rating Tool). The exercise judges programs on relevance to the agency's mission, quality, and performance in an attempt to give taxpayers the most bang for their buck. But some science advocates worry that the reviews are simply window-dressing for politically driven budget decisions.

In EPA's case, OMB reviewers concluded that the ecosystems program suffered from poor organization, unclear goals, and inadequate oversight. “We found the research was not well connected to [EPA's] program offices; it needed refocusing,” says OMB official Marcus Peacock. So the White House has proposed stripping $20 million from the program and giving the money to states to improve their water quality monitoring. EPA could get the funds back if it puts its house in order, says OMB.

Other “PART-ed” programs judged lacking—and slated to lose funding—include several applied research programs at the Department of Energy focused on oil and gas development. But a complete list from the Administration suggests that a poor PART doesn't always have budget consequences: An EPA research program on particulate air pollutants that hasn't “demonstrated results” has retained its funding—at least for now. The same goes for the space station, which OMB also judged to be poorly managed. Critics say the EPA decision typifies the Administration's disregard for environmental protection, and they are vowing to reverse the cut. The outcome will be closely watched, for OMB is planning to subject scores of other R&D programs to PART this year.

## Moon Rocks and Hard Knocks

Last week, at the Super Bowl, the National Football League had a fake astronaut plant a flag on an artificial moon in tribute to the fallen astronauts aboard Columbia. If only it were that simple. The reality of sending humans back to the moon and then to Mars will cost many billions of dollars and require NASA to cut back many other projects. That's life in a budget world where flat has become a four-letter word.

One project potentially under siege is the Beyond Einstein initiative, which will use satellites to study fundamental questions about cosmology, black holes, and dark energy. That sort of exploration doesn't quite fit in with the Administration's new initiative, so it is taking a back seat to other projects. NASA's comptroller, Steven Isakowitz, says that missions like Beyond Einstein's x-ray observatory Constellation-X and laser interferometer LISA, as well as the unrelated earth-science satellite Global Precipitation Measurement mission, are going to be “deferred” by a year or more. “We will maintain funding at [previously projected] levels,” he says, but NASA will delay some of the planned spending on those projects. Nicholas White of NASA's Goddard Space Flight Center says that he is concerned, but the community can “learn to live with” a delay if the overall funding is maintained.

Even as some astronomers and earth scientists wonder about their futures, some planetary scientists have cause to celebrate. Despite less money for exploring the outer solar system, NASA hopes to allocate $70 million as a down payment for lunar exploration. Mars missions are getting a 16% increase over last year's budget, with the intent to double spending on the Mars program by 2009. Those projects, says Isakowitz, might result in a mission that would return samples from Mars as early as 2013.

13. JOHN DELANEY PROFILE

# Marine Geologist Hopes to Hear the Heartbeat of the Planet

1. David Malakoff

University of Washington marine geologist John Delaney extols the science—and the poetry—of building a network of observatories on the ocean floor

Many scientists turn to poets for inspiration. But marine geologist John Delaney actually took one along for a voyage to the bottom of the sea. The 1991 submarine dive that sent Maryland poet laureate Michael Collier 2200 meters down to boiling volcanic vents off the Pacific coast typifies Delaney's expansive vision, say friends and colleagues. “John's a dreamer, an instigator. … He rejects limits,” says Margaret Tivey, a geochemist at the Woods Hole Oceanographic Institution in Massachusetts.

And the University of Washington, Seattle, researcher knows how to make that vision appeal to others. “The first time I heard John give one of his talks, I felt like I was at a rock concert—I wanted to pull out a lighter and salute him,” says Oscar Schofield of Rutgers University in New Brunswick, New Jersey.

Delaney calls himself “rather impractical.” Still, he's shown a pragmatic bent, from helping establish a long-running program to study underwater volcanism to leading a herculean expedition that hauled massive “black smoker” chimneys off the sea floor.
Now the tall, 62-year-old researcher stands on the verge of realizing one of his wildest dreams: a $200 million plan to wire an entire tectonic plate off the Pacific Northwest with a spider web of sensors, pumping gigabytes of real-time data directly to scientists ashore. Dubbed NEPTUNE—for North-East Pacific Time-series Undersea Networked Experiments—the project is jockeying to become part of a broader National Science Foundation (NSF) plan to build a trio of ocean observatories that would enable scientists to keep a constant watch on the sea. “We're going to listen to the heartbeat of the planet,” Delaney says in his sonorous baritone, displaying the poetic turn of phrase that has become a hallmark of his public persona.

But funding for the observatories isn't yet certain, and not all marine scientists are on board. Some fear that the program will siphon funds from other projects; others question the approach itself. Physical oceanographers, for instance, “would probably not go down this road first to solve their problems,” says Carl Wunsch of the Massachusetts Institute of Technology in Cambridge.

Fittingly for a man captivated by volcanoes, Delaney made his debut in the afterglow of another kind of explosion. The son of a Navy engineer and his wife, he was born beside the U.S. Navy base in Pearl Harbor, Hawaii, on 8 December 1941, the morning after Japanese bombers had reduced much of the U.S. fleet to smoking hulks. Growing up in Charlotte, North Carolina, he developed basketball skills that won him a scholarship to Lehigh University in Bethlehem, Pennsylvania. Graduating with a geology degree, he turned down an offer to assist a hometown college basketball coach named Al Maguire, who went on to win a national championship, and headed to graduate school instead. “Al said I couldn't dribble, but I could think,” Delaney says.

He ultimately enrolled in a doctoral program at the University of Arizona in Tucson, working as a prospector for mining firms on the side. But he didn't get “serious about school,” he says, until he was nearly trapped in an abandoned mine while sampling. A trip to the Galápagos, which included camping inside a recently active volcano, hooked him on studying volcanism. The journey almost didn't happen: Delaney's adviser “couldn't afford to take me,” he recalls. “So I put $2500 on his desk and said, ‘I'm going.’”

Using samples donated by another researcher, Delaney ultimately wrote a thesis that examined how the volatile gases in sea-floor basalt—a volcanic rock—behave when bottled up by the sea's crushing pressure. “It was magic,” he says. “I was given this garbage bag full of basalt that came from the sea floor!”

The work won him a temporary post at Washington, where he was assigned to teach oceanography—a course he'd never taken. Although the head of his hiring panel soon suggested that Delaney start looking for another job, it wasn't long before students began to praise their 36-year-old lecturer. Within a few years Delaney had won a top teaching prize and secured a permanent position.

But Delaney still hadn't found his niche as a scientist. That occurred during a 1980 dive in the submersible Alvin. “It changed my life. I realized I wasn't a laboratory researcher.” His work increasingly revolved around understanding the dynamics of the nearby Juan de Fuca Plate, a relatively small and accessible chunk of the Pacific crust rife with earthquakes, volcanoes, and thriving chemosynthetic communities of tubeworms and bacterial snow. Delaney was also honing his administrative skills. He helped organize the NSF-funded RIDGE program, a multidisciplinary assault on the midocean ridges where crustal plates creep apart.
Within RIDGE, Delaney and others sparked controversy by proposing to divert already-planned cruises to undersea eruptions along the Juan de Fuca immediately after they had been pinpointed by newly available sensors. The rerouting paid off, however, giving researchers an unprecedented firsthand look at the almost apocalyptic events that shape the sea floor.

Still, many researchers were frustrated by the limitations of traditional ship-based studies. In the early 1990s, Delaney, Alan Chave, a Woods Hole geophysicist, and others began to explore what it would take to install instruments that could keep a constant watch on the plate—and stream data back to land through a cable that could also provide the instruments with a steady source of desperately needed power.

## A passion for networks

It wasn't a new idea. Marine scientists had been experimenting with cabled instruments for decades (see sidebar), and Japan had already instrumented several offshore sites. But Delaney's allies envisioned more: a sensor net that could dispatch robotic observers to fast-moving episodes—from eruptions to plankton blooms—that researchers often miss, and a communications grid that would offer anyone with a computer an instant window onto the sea. Thus was born NEPTUNE, which aims to link dozens of nodes bristling with physical, chemical, and biological sensors with more than 3000 kilometers of fiber-optic cable.

The idea initially made little headway, but Delaney was “incredibly persistent,” says Kendra Daly, a biological oceanographer at the University of South Florida in St. Petersburg. “He kept going, cajoling, long after most people would have given up and gone away.” Adds Wunsch: “John may not be the world's greatest marine geologist, but he's got this spark and passion that we as a community sometimes lack.” The commitment has paid off.
Four years ago, NSF formally endorsed the “regional observatory” concept, bundling it into a $245 million initiative that also includes coastal sensors and open-ocean buoys. About half the funds would go to the regional system, with NEPTUNE a leading candidate. The next step is up to Congress, which next year will be asked to open the spending spigot.

Delaney's allies, however, didn't wait for Congress. Last October, the Canadian government gave the University of Victoria nearly $50 million for a northern leg of NEPTUNE, starting with a project off Vancouver Island dubbed VENUS. And Delaney's team has raised about $25 million for related work in the United States, including a second pilot project—MARS—set for California's Monterey Bay.

MARS and VENUS will tackle what Delaney admits are a host of daunting technical issues, from building workable sensors to waterproof sockets—often an Achilles' heel for cabled instruments. Even if the pilots pan out, however, NEPTUNE still must overcome a flat NSF budget and concerns that it could overtax a thin research fleet. Another fear is that the observatories will become oceanography's version of the space station: a huge infrastructure that supports relatively little science.

Delaney welcomes the debate, saying that NEPTUNE and its sister observatories “will only benefit from more discussion, more ideas.” But he fiercely challenges the notion that the projects will monopolize resources. “I want to find exciting, important ways to argue for decades of new funds, not get by on what we've got,” he says.

Delaney drives home that message in the dozens of talks he gives each year before everyone from congressional aides to schoolteachers. The word is also getting out through the media: His work has been featured in books and documentaries, including a PBS NOVA show on a dramatic 1998 mission he led to recover several “black smokers”—chimneys that belch superhot water—from the Juan de Fuca Ridge. The mission recovered bacteria that thrive in the chimney's record-high temperatures, and several of the formations are now on display at the American Museum of Natural History in New York City. But Delaney doesn't want to become a similar kind of display: fascinating to outsiders but no longer useful to fellow scientists. “I'd like to reach a broader audience,” he says, but not at the cost of his credibility.

Delaney's desire to communicate may also explain the poetry that leavens his technical talks and the shipboard “poetry nights” that have become a tradition on his cruises. His selections, often delivered from memory, range from the earthy rhythms of Robert Frost and Robert Service to the ethereal images of the Japanese haiku master Basho. And he is fond of T. S. Eliot's observation that “we shall not cease from exploration.”

Indeed, Delaney says that if he were starting his career today, he'd probably want to work in planetary exploration. He's participated in NASA workshops on a probe to Europa, the jovian moon that some believe holds an ocean under its frozen surface. Submerged fires on Europa, he believes, could be fueling life beneath the ice—just as they did on Earth. “When it comes to life, it takes an ocean,” he jokes, borrowing from a slogan popularized by Hillary Clinton.

Collier, the University of Maryland, College Park, poet and longtime friend who went down in Alvin more than a decade ago, believes poetry “is another way for John to articulate his wonder and his enthusiasm for science.” Delaney is “incredibly inclusive,” he says. “He wants to share.”

14. JOHN DELANEY PROFILE

# A Cautionary Tale From Bermuda

1. David Malakoff

Fifty years ago, legendary oceanographer Henry Stommel of the Woods Hole Oceanographic Institution in Massachusetts set out to establish his own ocean observatory. Its fate offers both hope and caution to advocates of today's crop of underwater facilities, such as NEPTUNE (see main text).

Like today's architects, Stommel designed a multipart observatory to collect a steady, long-term stream of data on ocean conditions, recalls physical oceanographer Carl Wunsch of the Massachusetts Institute of Technology in Cambridge, a former student of Stommel's. One element is now known as “Hydrostation S”: a spot 20 kilometers southeast of Bermuda, at a depth of 3000 meters, where scientists regularly measured temperature, salinity, and dissolved oxygen from the sea's surface to its floor. Another was a set of drifting buoys, fitted with radio transmitters, for tracking water movements. There was also a power cable connected to several instruments located thousands of meters off Bermuda. Then, as now, the cable was seen as a promising way to provide power and accurate timing and to move data ashore.

But Stommel's dream turned into a nightmare, Wunsch says. The weather didn't cooperate, electrical connectors sprang leaks, instruments failed, and good help and steady funding proved hard to find. “Funders didn't want to commit to open-ended data collection,” he says. Within a few years, most of the station was abandoned—with one significant exception. Today, Hydrostation S is the source of one of the world's few long-term records of a changing ocean.

Skeptics predict that the next generation of observatories will face similar crippling problems. But supporters take heart from the continuing stream of data from Hydrostation S. It is a model, they say, that new observatories can first replicate and then expand.

15. DNA FORENSICS

# Buried, Recovered, Lost Again? The Romanovs May Never Rest

1. Richard Stone

DNA studies in the 1990s appeared to prove that the remains of the last Russian tsar and his family had been found; a new analysis raises questions

In the summer of 1991, the remains of nine people were unearthed from a shallow grave in central Russia. Forensic experts concluded that the skeletons likely were those of the last tsar, Nicholas II, the tsarina, and three of their five children, whose bodies disappeared after they were shot by the Bolsheviks in July 1918. DNA studies in the mid-1990s supported that claim, and in 1998, a special government panel affirmed the bones to be those of the Romanovs, Russia's ill-starred royal family, along with their doctor and three servants.

A new study, however, challenges this verdict. In the current issue of the Annals of Human Biology,* a team led by molecular systematist Alec Knight of Stanford University resurrects questions about the discovery of the remains and mounts a blistering attack on the original DNA analysis, contending that the results were tainted. Knight's group also performed the first analysis of the remains of the Grand Duchess Elisabeth, the tsarina's sister. All told, says Knight, “the evidence does not support the claim that the remains are those of the Romanov family.”

“That's nonsense,” fires back Pavel Ivanov, a molecular biologist at the Engelhardt Institute of Molecular Biology in Moscow who carried out the original DNA studies with Peter Gill of the U.K. government's Forensic Science Service and several colleagues. Ivanov and Gill contend that the new claims are flawed.

The new findings “do not add to the slim doubts about the remains. But they do not take away from the doubts either,” says Evgeny Rogaev, a molecular geneticist at the Center of Mental Health in Moscow and the University of Massachusetts Medical School in Worcester, who was invited by Romanov descendants and the Russian government to conduct an independent examination.

The debate could create a stir in Russia. The Russian Expert Commission Abroad, an expatriate group that doubts the authenticity of the remains, is calling for a new examination, claiming support from the Russian Orthodox Church. The plea, however, may fall on deaf ears: “The case is closed,” says a Putin Administration official.

The lingering uncertainty stems in part from the intrigue surrounding the demise of the Romanovs, said to have been killed and disposed of near Ekaterinburg. The discovery of a communal grave in the vicinity was reported in 1989. Two years later Russia's chief forensic medical examiner unearthed nine badly damaged skeletons and, after a series of forensic tests, came up with a tentative ID.

The Gill and Ivanov team amplified short tandem repeats from DNA, which confirmed that the purported tsar, tsarina, and three girls belonged to the same family. If correct, the bodies of one daughter and the tsarevich, Alexei, were missing. More curious was an analysis of mitochondrial DNA (mtDNA), which is passed from mother to child. The team compared mtDNA fragments from the nine skeletons to blood samples from the Duke of Edinburgh, Prince Philip—a grandnephew of the tsarina—and two living descendants of the tsar's maternal grandmother. The tsarina and the children matched Prince Philip, but the tsar's mtDNA was heteroplasmic—it carried both cytosine and thymine at one site, whereas the relatives had only thymine. Despite that apparent mismatch, the group reported in the February 1994 issue of Nature Genetics that the odds were at least 700 to 1 that the remains were those of the Romanovs.

Two years later, Ivanov, working with a U.S. team, provided what appeared to be the clincher: The mtDNA of the remains of the Grand Duke Georgij Romanov, the 28-year-old brother of the tsar who died of tuberculosis in 1899, showed the same heteroplasmy. Rogaev in 1997 and 1998 sequenced DNA from the femur presumed to be from Nicholas II and from the blood of a nephew of Nicholas II. In findings submitted to the Russian commission, he concluded that the DNA matched.

Knight and colleagues contend, based on recent historical research, that the discovery and removal of the remains was “characterized by extreme irregularities at every level,” and that “crucial evidence has been proven fraudulent.” The most damning shortcoming, they charge, is that samples of old DNA must have been contaminated with “fresh” DNA that skewed the analysis. They argue that a sequence of 1123 base pairs, for example, was too long to have come from old bones.

The Knight team contributes some new data: a DNA analysis of the Grand Duchess Elisabeth's shriveled finger. The sequence did not match the reported sequence of the tsarina. To Knight's group, this means that “it is probable that the Ekaterinburg remains were misidentified.” But the sequence didn't match Prince Philip's mtDNA either, so it might not have come from Grand Duchess Elisabeth, says Ivanov. Knight's team acknowledges that it may have been from a contaminant.

The Knight paper is “intriguing,” says Anne Stone, an expert on ancient DNA at Arizona State University in Tempe, who sequenced the purported remains of the Wild West outlaw Jesse James. Notwithstanding the Elisabeth-Prince Philip puzzle, she says, “the ancient DNA work looks like it was performed according to today's standards and looks good.”

Gill is not impressed, insisting that his team's forensic DNA work “set the standard.” The new paper so completely mischaracterizes his work, he says, that it “comes across as vindictive and political.”

Rogaev says that further DNA studies would be worthwhile. Knight agrees; one possible way to settle the identity of the tsarina, he says, is to invite other living descendants of Queen Victoria, her maternal grandmother, to donate blood for mtDNA sequencing. Ivanov, meanwhile, contends that the scientific argument is too weak to warrant debate. And what about the Romanovs? They must be turning in their graves.

• * A. Knight et al., Molecular, forensic and haplotypic inconsistencies regarding the identity of the Ekaterinburg remains. Annals of Human Biology (2004).

16. ROBERT AYMAR INTERVIEW

# The Man to Finish the Job

1. Daniel Clery

Robert Aymar's task is to complete the construction of the world's most powerful particle accelerator at a CERN that is cowed and short of cash

GENEVA, SWITZERLAND—When Robert Aymar took the reins of CERN at the beginning of the year, Europe's premier particle physics laboratory was in flux. Construction of the Large Hadron Collider (LHC), the world's most powerful particle accelerator, was forging ahead (see sidebar on p. 756). But behind the scenes, the lab was still struggling to make changes prompted by a 3-year-old financial crisis.

The crisis began brewing in 1996, when, after much political wrangling, CERN's 19 member states agreed to build the LHC but skimped on its budget (Science, 3 January 1997, p. 19). CERN's council ordered cutbacks to improve efficiency. In September 2001, however, CERN director-general Luciano Maiani revealed that the LHC was overspending on its $1.6 billion budget by $300 million for hardware alone, mostly due to unforeseen problems with the superconducting magnets and underground construction. The council set up an External Review Committee (ERC), headed by Aymar, to assess CERN's problems. Following changes made by CERN management, some recommended by the ERC, the LHC will now be completed a year later than scheduled, money will be shifted to the LHC from other CERN programs, and numerous non-LHC experiments will be closed for 1 year (Science, 29 March 2002, p. 2341; and 28 June 2002, p. 2317).

Aymar was tipped to head the lab in December 2002, and he comes with impressive credentials. He led the construction of France's Tore Supra fusion tokamak from 1977 until its first plasma was achieved in 1988. In 1990 he became head of physical sciences research at France's Atomic Energy Commission, and in 1994 he joined the International Thermonuclear Experimental Reactor (ITER) project as director, becoming international team leader in 2001.

When Science interviewed him in mid-January, Aymar, 67, had moved into his new office only that morning, and the smell of fresh paint still hung in the air.

Q: Thinking back a few years, if I had said to you then that in 2004 you would be building the LHC rather than the ITER, what would have been your reaction?

A: I would not have been so surprised, because it was agreed that I was too old to begin construction of a big project that would last for 10 years. And I was involved with the LHC right from the beginning because I was chairman of the committee [that assessed the original LHC proposal]. I did not fight for it at all, but I did not refuse when I was asked.

Q: As a non-particle physicist, what skills do you bring to the job?

A: My experience of managing different disciplines was proved 10 years ago. I have experience of being involved with large projects, with real technical goals and large budgets. At the same time I have run large laboratories in different disciplines.

People asked me to accept this job, and I said I'm not a particle physicist. You would assume, to drive a laboratory such as CERN, you would need to be a specialist. But it was suggested, and I agreed, that if my deputy is a particle physicist who is well known in the field—it is Jos Engelen [former head of the National Institute for Nuclear and High Energy Physics in the Netherlands]—and we work together hand in hand, it should not be too difficult.

Most of the member states, especially the large ones, felt that CERN had not changed its procedures and management for quite a long time. At a time when laboratories everywhere were being forced to build large facilities with less money, CERN was not subjected to this pressure, and the time had come to do something.

Q: In hindsight, do you think it was wise to embark in 1996 on building the LHC without contingency in the budget, as this turned out to be a major factor in the financial problems in 2001?

A: At that time, the member states wanted to see CERN applying more rigor, more awareness of cost. So aside from the LHC budget, they ordered CERN to reduce its staff by up to 20%, and they cut its budget by 9% year on year. Unfortunately, this was done without any rational analysis. It was an austerity measure applied without correlation to the scientific product.

Without this analysis, the management continued to make more commitments beside the LHC, on other programs, without considering the implications for budget and personnel. Previously there had always been plenty of resources at CERN, and nobody realized, certainly in-house, that when a difficulty arose the member states would not provide more money. So at some point, a crisis was bound to happen.

Q: And that happened in 2001?

A: Exactly, and after that they had to make savings everywhere, squeeze every part of the program, and they asked the ERC to look at the full program, not just the LHC.

The ERC made a long list of recommendations, mostly on how to make the council and the management more aware of the relationship between objectives and resources. This used a lot of simple management tools that are known everywhere but were not applied here at CERN. The tools were made available, savings were made, budgets were squeezed, everyone acted properly, and I think the situation has improved. As usual, unexpected things may happen, but I believe we will have the first shot at physics in the summer of 2007.

Q: In your 2002 ERC report, CERN was charged with “serious weaknesses” in cost awareness and control, contract management, and financial reporting. What are you doing to remove these weaknesses?

A: I think the most effective thing we have proposed, which has already been put in place, was to take everything that has to be made for the LHC—hardware and software—identify it, define it, and put a cost on it. These items are then put under the responsibility of a group of people who will have a quasi-contract with the management to do the job with an agreed budget and deadline. These “work packages” are the most appropriate way to improve people's awareness of cost and schedule. As soon as you have people who feel responsible and agree to do the job in the agreed time scale, you are almost there.

This is a complete change of philosophy for CERN. Previously, people were always interested in the excellence of the technical solution, improving it all the time, but the cost and schedule was something else. This reverses the priorities: You have to complete the job, and sometimes the compromise is not to make it perfect but just make it work.

Q: The ERC suggested scaling back non-LHC work to cut costs and divert personnel to the LHC. To what extent has that been done?

A: The cost of personnel was never included in the costs of the LHC. This was the way of doing things: You cost the hardware and you cost what is called industrial services, when you have to hire people to make something. Staff from CERN were supposed to be free. But because of the non-LHC commitments that the management continued to make, these people were not available to the LHC.

Another assumption was that non-LHC programs cost almost nothing. Running the older accelerators was supposed to be free. But this isn't true. There is a very large cost. The ERC looked in detail at the commitments of all these experiments, and most were coming to a stop at the end of 2004, with plans to restart soon in 2005. We proposed instead to stop them for 2 years, although this was later reduced to 1 year. This will provide an opportunity for reflection, to see if the experiments need any improvements, such as a better detector. When the time comes, people will cry; this is normal, but I think this is not wasted time.

The staff, the technicians freed up in this way, can then move to LHC work. There is a lot of work involved in testing all the magnets. We have a very large installation, and to run it day and night all year long needs a lot of people.

Q: The ERC also suggested that, during these lean times, CERN should collaborate more closely with other particle physics laboratories.

A: This is something that I feel very strongly about. The convention which created CERN states that the council has two functions. One is to supervise the laboratory, and the second is to steer particle physics in Europe. This gives CERN a special role: not as one laboratory among all the others, but as the laboratory with connections to all the others in Europe.

We are now working with other labs on, for example, an injector for the Next Linear Collider and proton injectors which can be used on the LHC or a superconducting proton linac. There are plenty of small items like that. We have larger ones, mostly involving CLIC [CERN's concept for the next linear collider]. I am now pushing for CLIC R&D to have more organized collaborations and become a collective project for the whole of Europe so that we have a proof of concept ready when the time comes to decide on the linear collider.

Q: CERN is also playing a leading role in GRID computing, which essentially uses resources across the Internet as one giant computer. What role will this play?

A: It is absolutely compulsory. The number of events we have to look at with the LHC is such that without the capacity to store and analyze this huge amount of results at a number of sites, it would be impossible to make it work. This is a central activity, and it will have repercussions everywhere. If we can work reliably and safely with a distributed system, we will meet our goals.

Q: Last fall, CERN celebrated the anniversaries of two major discoveries, neutral currents in 1973 and the W and Z particles in 1983. What will CERN's next big discovery be?

A: LHC is the facility to provide new discoveries. No other lab will do that to the same extent. It is surprising that the Standard Model of high-energy physics is good, but the questions that are not answered by the Standard Model are completely open. There are plenty of theories, all different, and you cannot tell which is likely to be true. The LHC will provide perhaps not all answers, but a lot of answers. That will be a reward for people who have been working 20 years on this. And I am very happy that that will come—after me, but it will come.

17. ROBERT AYMAR INTERVIEW

# A Leviathan Takes Shape Beneath Geneva's Gentle Environs

1. Matin Durrani*
1. Matin Durrani is deputy editor of Physics World.

## The mathematical heart

Mathematical models showed long ago that there is some method to the apparent madness of the fibrillating heart. Ventricular fibrillation is first and foremost a malfunction in the heart's electric circuitry. In a normal heartbeat, electrical activity starts near the top, in the atria; shoots to the bottom along special highly conductive muscle cells; and rises through the ventricles before dying away. When ventricular fibrillation sets in, however, one or more “spiral waves” of electrical activity start pinwheeling around in the cardiac muscle, like the “Mexican wave” in a soccer stadium. Just like sports fans, the cells in the ventricles start paying more attention to the wave than to the game. That is, they ignore the normal pacing signals coming from the atrium.

Heart modelers who study fibrillation are more interested in the electrical behavior of the heart than its mechanical pumping, because the electricity drives the pump. So they begin with what mathematicians call a reaction-diffusion equation, which expresses the two ways that electrical signals travel through the heart: by diffusion of ions from cell to cell through gap junctions, and by currents that pass through the cell membranes.
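As a concrete sketch, assuming the standard monodomain form rather than the specific models used by the groups cited, such a reaction-diffusion equation reads:

```latex
\frac{\partial V_m}{\partial t}
  = \nabla \cdot \left( D \, \nabla V_m \right)
  - \frac{I_{\mathrm{ion}}(V_m, \mathbf{w})}{C_m}
```

The first term on the right captures the cell-to-cell spread of the membrane voltage $V_m$ through gap junctions (diffusion, with diffusivity $D$); the second captures the currents $I_{\mathrm{ion}}$ passing through the cell membranes, gated by recovery variables $\mathbf{w}$ and scaled by the membrane capacitance $C_m$.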

Valentin Krinsky, a Russian biophysicist, and Arthur Winfree, an American mathematician, were the first to realize that rotating spiral waves arise naturally as solutions to reaction-diffusion equations. Two interwoven spirals emerge from an inactive core or “phase singularity,” like the two flavors in a spiral lollipop: a spiral of active cells and a spiral of resting cells. The spirals rotate as individual cells take turns charging up or relaxing. In three-dimensional models, the spirals become scrolls wrapped around a “filament” that meanders through the heart muscle.
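A minimal way to see such dynamics emerge from a reaction-diffusion equation is to simulate the two-variable FitzHugh-Nagumo model, a toy stand-in for the far more detailed ionic models cardiac groups use. The grid size, parameter values, and broken-wavefront initial condition below are illustrative choices, not taken from the work described:

```python
import numpy as np

def laplacian(u, h=1.0):
    # 5-point stencil with no-flux (Neumann-style) boundaries via edge padding
    p = np.pad(u, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * u) / h**2

def step(v, w, dt=0.05, D=1.0, a=0.1, eps=0.02, b=0.5):
    # FitzHugh-Nagumo kinetics: v is the fast "voltage", w the slow recovery variable
    dv = D * laplacian(v) + v * (1.0 - v) * (v - a) - w
    dw = eps * (b * v - w)
    return v + dt * dv, w + dt * dw

n = 100
v = np.zeros((n, n))
w = np.zeros((n, n))
v[:, :5] = 1.0        # excite a stripe along one edge...
w[: n // 2, :] = 0.3  # ...but leave the upper half refractory, breaking the front

for _ in range(500):
    v, w = step(v, w)
```

Breaking an excitation wavefront against a refractory region, as in the initial condition above, is the classic numerical recipe for setting a rotating wave pinwheeling around a phase singularity.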

It took experimenters a while to catch up, because it is not easy to map the heart's electric field as it races around the organ five times per second. Voltage-sensitive dyes solved the problem, and by 1993 researchers at the State University of New York (SUNY) Upstate Medical University in Syracuse had produced the first video images of spiral waves on the surface of a rabbit heart: ghostly yin-yang shapes twirling on a heart-shaped radar screen, many times per second. Scroll waves have not been seen yet, because no one has imaged electric fields inside the muscle.

Although the basic reaction-diffusion models explain why spiral waves exist, they do not explain how to start or stop them. In fact, Keener says, “the heart is the only medium in which we know how to get rid of these waves. The mechanism that works in cardiac tissue doesn't work in any chemical oscillators. It's special, it's unique—and that makes it a mystery.”

As in any mystery, there is an abundance of suspects—perhaps too many. One of them is the “virtual electrode” theory. About a decade ago, John Wikswo, a biomedical engineer at Vanderbilt University in Nashville, Tennessee, noticed that an electric current applied to the heart creates several spots of positive and negative voltage—not just a single spot under each electrode of the defibrillator. Wikswo explained this observation with a model that treats the heart as if it were a coaxial cable. Like a coaxial cable, the heart has two different conductors: the insides of the cells and the outsides. The positive and negative spots, or “virtual electrodes,” are places where current is flowing through the cell membranes, from one conductor to the other. Those transmembrane currents, Wikswo and others believe, are responsible for calming the spiral waves and returning the heart to normal.
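The two-conductor picture is commonly formalized as the bidomain model; in a standard statement (a sketch, not necessarily the exact formulation used in the work described), the intracellular and extracellular potentials $\phi_i$ and $\phi_e$ satisfy:

```latex
\nabla \cdot (\boldsymbol{\sigma}_i \nabla \phi_i) = \beta I_m, \qquad
\nabla \cdot (\boldsymbol{\sigma}_e \nabla \phi_e) = -\beta I_m, \qquad
V_m = \phi_i - \phi_e
```

Here $\boldsymbol{\sigma}_i$ and $\boldsymbol{\sigma}_e$ are the conductivity tensors of the two domains, $\beta$ is the membrane surface-to-volume ratio, and $I_m$ is the transmembrane current density. Virtual electrodes appear wherever an applied current is forced across the membrane from one domain to the other, which happens where the two conductivity tensors are not simply proportional to each other.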

Unfortunately, the coaxial-cable approach predicts that transmembrane currents should exist only in the outer millimeter of the heart—not deep enough to stop fibrillation. Natalia Trayanova, a biomedical engineer at Tulane University in New Orleans, Louisiana, has shown that by taking into account the electrical effects of the twisting fibers that make up heart muscle, a 3D model can create virtual electrodes extending into the interior of the heart. Trayanova's simulation gets rave reviews from some of her colleagues: “I can't say enough good things about Trayanova's model,” says Paul Belk, a biomedical engineer at Medtronic Inc. in Minneapolis, Minnesota, which manufactures defibrillators. But Wikswo warns that some of the assumptions in the model have not been tested in the lab.

Keener, however, thinks virtual electrodes alone can't explain defibrillation. A virtual electrode, Keener says, would have to cover a scroll wave filament completely to wipe it out (see figure). It's like fighting a forest fire with fire: You have to completely surround it, or it might escape. Even then, the shock has to be timed just right, or it will just trigger another wave of fibrillation. And as a fibrillating heart has several “fires” burning at once, it is very unlikely that they will all be doused at one stroke.

Keener thinks the real key to defibrillation lies at the level of the cells themselves. The cell membrane, he says, responds more vigorously to a positive voltage than a negative one. Applying a positive voltage to it is like pouring gasoline on a fire: It helps release the energy already stored inside the dry wood. The energy propagates from cell to cell until the “fire”—the region of positive electric potential—engulfs the whole heart. Then the fire burns out, and the scroll waves are gone. If the individual cells do indeed act as batteries, then the electric potential should follow a “sawtooth” profile, with one end of each cell positive and the other negative. Earlier experiments failed to detect such a sawtooth, but Arkady Pertsov, a biophysicist at SUNY Upstate Medical University, is gearing up to search with new technology.

So are the mathematical models getting closer to a solution or just adding to the confusion? “I think we're getting pretty close,” says Brad Roth, a biophysicist at Oakland University in Rochester, Michigan. “I don't think the cardiologists will be convinced by mathematical models, but they will be intrigued enough to do experiments. I see our role as motivating the experiments.”

Certainly Medtronic is taking models seriously. Belk and fellow engineer Paul de Groot use them to determine the best placement for the electrodes of an ICD. They are also working on “pain-free” defibrillators that restore the heart's normal rhythm by a gentle pacing signal instead of a giant jolt.

“The best thing about a model is that you can see exactly what's going on,” says Belk. “You can stick 500 electrodes on the heart. [Animal experiments typically manage 30.] You can induce tachycardias on the computer and play around with different pacing schemes. But at the end of the day, you don't have to trust your model very much. You ask yourself, ‘Does that make sense?’ If the result is valid, almost invariably the answer is ‘Jeez, I should have thought of that.’”