News this Week

Science  17 Mar 2006:
Vol. 311, Issue 5767, pp. 1532
  1. NUCLEAR PHYSICS

    Researchers Raise New Doubts About 'Bubble Fusion' Reports

    1. Robert F. Service

    Bubble fusion is again generating heat, but not the kind Rusi Taleyarkhan was hoping for. Last week, Purdue University in West Lafayette, Indiana, announced that it was launching a review into allegations that Taleyarkhan—a nuclear engineer at Purdue and the field's chief proponent—had obstructed the work of Purdue colleagues by removing shared equipment, declining to share raw data, and trying to stop them from publishing results that countered his own published work.

    The allegations, which Purdue University Provost Sally Mason calls “extremely serious,” were first made public in last week's print and online issues of Nature. The review also follows a meeting in Taleyarkhan's lab, attended by other researchers trying to replicate his work, at which Taleyarkhan attempted to demonstrate bubble fusion in action. Several participants say the attempt was a dismal failure. And, adding more heat to the debate, a new analysis of data in Taleyarkhan's latest publication casts doubt on the source of a purported signature of fusion.

    In an interview with Science, Taleyarkhan says he was blindsided by the charges. “It came as a major shock to me when I first heard about it on Tuesday [7 March],” Taleyarkhan says. The following day, Taleyarkhan met with the university administration and agreed to the review. “We decided we as a university need to provide a point-by-point response,” Taleyarkhan says.

    Evidence that fusion occurs at the heart of collapsing bubbles has been controversial from the beginning. Fusion, the process that powers the sun, requires the intense pressures and temperatures needed to smash atomic nuclei together with enough force to combine, releasing enormous energy in the process. On Earth, fusion researchers have tried to replicate the process with the help of intense lasers and magnetic fields. But 4 years ago, Taleyarkhan, then at Oak Ridge National Laboratory in Tennessee, and colleagues published a paper in Science claiming that the pressure and heat at the center of collapsing bubbles in an organic solvent had also produced the telltale signature of fusion (Science, 8 March 2002, pp. 1808 and 1868). The work held out enormous hope, because if it could be scaled up, it promised near-limitless energy.

    Embattled.

    Fusion researcher Rusi Taleyarkhan says inquiries will support his lab's evidence of nuclear reactions in hot collapsing bubbles.

    CREDIT: LYNN FREENY/U.S. DEPARTMENT OF ENERGY FILE PHOTO

    In their experiments, Taleyarkhan and his colleagues started with a small cylinder of acetone, a common organic solvent, in which all the hydrogen atoms had been replaced by deuterium, a sister isotope with an additional neutron. The researchers bombarded the cylinder with intense ultrasound and zapped the deuterated acetone with a pulse of neutrons or, in the group's most recent experiment, alpha particles. The combination caused bubbles to form, swell, and then collapse, producing a tiny flash of light, a phenomenon known as sonoluminescence. According to the authors, it also fused pairs of deuterium atoms, creating either tritium and a proton or helium-3 and an extra neutron, which were counted by the group's detectors.
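
    For reference, the two well-established branches of deuterium-deuterium fusion, which the article describes in words but does not write out, are (standard nuclear physics, stated here as background rather than taken from the article):

    \[
    \mathrm{D} + \mathrm{D} \;\rightarrow\; \mathrm{T}\,(1.01\ \mathrm{MeV}) + \mathrm{p}\,(3.02\ \mathrm{MeV}),
    \]
    \[
    \mathrm{D} + \mathrm{D} \;\rightarrow\; {}^{3}\mathrm{He}\,(0.82\ \mathrm{MeV}) + \mathrm{n}\,(2.45\ \mathrm{MeV}).
    \]

    The two branches occur with roughly equal probability, and the 2.45-MeV neutron from the second is the characteristic energy a detector should register if deuterium is really fusing, which is why the energy spectrum of the recorded neutrons figures so prominently in the dispute described below.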

    The work drew fire from other researchers who either could not reproduce the results or challenged them on theoretical grounds. Since the original Science paper, Taleyarkhan and colleagues have published two other papers in Physical Review E and Physical Review Letters (PRL)—both prestigious peer-reviewed journals—offering further evidence of bubble fusion. But the effect has yet to be confirmed by researchers who have never been affiliated with Taleyarkhan.

    It hasn't been for lack of effort. Last year, the U.S. Defense Advanced Research Projects Agency (DARPA) supported efforts by Seth Putterman, a physicist at the University of California, Los Angeles (UCLA), to replicate Taleyarkhan's results. Taleyarkhan and sonoluminescence expert Kenneth Suslick of the University of Illinois, Urbana-Champaign, also received funding. With independent confirmation still lacking, on 1 March, DARPA convened a contractors' meeting in Taleyarkhan's lab at Purdue in hopes that they could all see tabletop fusion in action. But Putterman and others at the meeting say it didn't go well. “The trip from the point of view of reproducing his experiment was a waste of time,” Putterman says.

    For starters, the acoustic device that generates the bubbles wasn't working well, says meeting attendee Felipe Gaitan, chief scientist at Impulse Devices, a company in Grass Valley, California, working to commercialize bubble fusion. Instead of creating a largely clear solution with a few bubbles that would concentrate the acoustic energy, the acetone was clouded with bubbles. “We expected he wouldn't see any [results],” Gaitan says. But Taleyarkhan claimed the experiment was producing fusion.

    Rather than measuring fusion's excess neutrons with a standard device called a scintillation detector, however, Taleyarkhan measured them with plastic neutron traps. The devices are common among nuclear engineers but not among physicists, because they can't measure the precise energy level of recorded neutrons—an important clue to their source. Taleyarkhan says that, unlike scintillation detectors, plastic traps need not be calibrated, and they show irrefutable evidence of the presence of neutrons. But Putterman notes that because plastic traps take hours to process, the group had no time for control experiments needed to interpret the results. “It was very frustrating,” Gaitan adds.

    At the meeting, Putterman also presented calculations made by his graduate student Brian Naranjo that questioned the conclusions of Taleyarkhan's most recent paper, published in January. The calculations suggested that the energy levels of the neutrons Taleyarkhan reported are not what the Purdue group should have seen if deuterium atoms were in fact fusing. Instead, Naranjo said, the results are a far better match for what the scintillation detector would have registered in the presence of californium-252, a radioisotope commonly used in nuclear laboratories.

    Putterman says Taleyarkhan told him he does have californium-252 in his lab but keeps it enclosed in a shielded vault. Robert Block, a nuclear engineer at Rensselaer Polytechnic Institute in Troy, New York, and a co-author with Taleyarkhan, argues that cosmic rays and other background neutrons in the experiment could have made its readings resemble the expected signature of californium. But Putterman counters that when Naranjo calculated how the detectors would register radioactive cesium and cobalt that are used to calibrate the device, the result was a near-perfect match to the calibration data Taleyarkhan published in his paper. Still, Taleyarkhan says, it's hard for him to assess Naranjo's work until it has been published in a peer-reviewed journal. (Naranjo says he has submitted the work to PRL.) “We are trying to address the issues brought up by UCLA,” Taleyarkhan says. And that will be done through publications. “That's how we believe science should be conducted,” he adds.

    Nevertheless, it now appears that DARPA is preparing to pull the plug on the effort to replicate Taleyarkhan's results. When a DARPA representative at the 1 March meeting suggested that the “successful” experiment be crated up and shipped to UCLA for independent verification, Putterman says, Taleyarkhan balked, saying he was too busy with teaching and research commitments. Because Putterman's lab has been unable to independently verify the results, the agency says the program won't proceed. “If that had been successful, DARPA would have considered moving into a second phase that would have focused on whether the results can be scaled up,” DARPA spokesperson Jan Walker said in a statement on Friday.

    Despite the latest round of controversy, Putterman and other sonoluminescence researchers all say the idea of bubble fusion remains worth exploring. Unlike the discredited notion of cold fusion, in which deuterium atoms supposedly fuse in a hunk of palladium metal at room temperature, bubble fusion invokes collapsing bubbles that are calculated to produce temperatures in the millions of degrees, possibly high enough in that tiny volume to allow atoms to fuse.

    For now, however, the immediate hurdle for Taleyarkhan will be convincing Purdue officials that the effect and his methods are sound. In guidelines issued late last week, Purdue officials said their review would be conducted by three senior Purdue professors and overseen by Peter Dunn, Purdue's associate vice president for research. An initial fact-finding phase will be completed by 1 June. Mason said the results of the review will be made public. Taleyarkhan says he is confident he will be vindicated: “We stand by whatever data we have presented,” he says.

  2. CHEMISTRY

    Columbia Lab Retracts Key Catalysis Papers

    1. Robert F. Service

    For synthetic chemists working to craft new molecules, a carbon atom surrounded by hydrogens can be as hard to handle as a greased pig. Undaunted, in recent years researchers have scrambled to devise schemes for plucking select hydrogens off carbon and replacing them with other atoms that offer an easier handhold. A pioneer of this subfield, known as C-H activation, Columbia University chemist Dalibor Sames has developed a wealth of advances along with his group members. But some of the lab's results are now in doubt.

    Last week, the Journal of the American Chemical Society (JACS), a leading chemistry journal, printed corrections for three papers from the Sames lab. Two of the papers on C-H activation catalysts were fully retracted, and part of a third was withdrawn. In each case, the retractions say that the work was disavowed after Sames group members could not reproduce the results following the departure of Bengü Sezen, a former Sames group graduate student, who was the lead author of the two retracted papers and a co-author of the third. JACS Editor Peter Stang says the corrections came at the request of the Sames group. Sames did not reply to repeated phone and e-mail messages from Science.

    Susan Brown, Columbia's director of public affairs, says the university has launched a review of the case, but that she cannot comment on its scope or timing. “It's our policy not to comment on reviews while they are ongoing, so the integrity of the process can be maintained,” Brown says.

    In an e-mail exchange, Sezen, who is now a Ph.D. candidate in the group of University of Heidelberg molecular biologist Elmar Schiebel, according to the group's Web site, says the retractions came as a surprise. “Professor Dalibor Sames or anyone else from Columbia University did not contact me regarding the retractions,” she says. For the two retracted papers, Sezen named two other Sames group members who she says repeated her work while she was out of town. For the third paper, Sezen says her contribution was “limited to an intellectual one.” But Kamil Godula, one of the Sames group members Sezen cited, says in an e-mail that the reactions worked only when Sezen was in town. The other Sames group member Sezen mentioned did not return messages from Science.

    Deactivated.

    Researchers withdrew two synthetic-chemistry papers and part of a third after failing to reproduce the results.

    Justin Du Bois, a synthetic chemist at Stanford University in California, calls the retractions “a bit of a blow” to the subfield of C-H activation: “These were definitely important papers,” he says. Sezen has at least five publications on C-H activation with Sames in addition to those corrected in JACS. Benjamin Lane, a former Sames group member now working as a chemist with the pharmaceutical company Biogen in Cambridge, Massachusetts, says some of Sezen's work has been replicated and has been used by chemists in the pharmaceutical industry. Says Lane, “She has done some good things and made an impact on the field.”

  3. COSMOLOGY

    Magnet Experiment Appears to Drain Life From Stars

    1. Michael Schirber

    It's an unassuming experiment: to see how a magnetic field affects polarized laser light. And the rotation the researchers saw was tiny, a mere 100,000th of a degree. If the result is true, however, the implications are huge. According to researchers in Italy who conducted the experiment, this slight twist in the beam—the result of disappearing photons—suggests the existence of a lightweight, never-before-seen neutral particle, which, if made in stars, would siphon off all their energy.

    Even theorists who find that scenario far-fetched are struggling to explain the disappearance of the photons. “I'm skeptical of the particle interpretation,” says theoretical physicist Georg Raffelt of the Max Planck Institute for Physics in Munich, Germany. “But there are no other obvious explanations.”

    Standard physics predicts a very small rotation in a beam's polarization in a magnetic field due to ordinary particles popping in and out of the vacuum. But when researchers at the PVLAS experiment at Legnaro National Laboratory of Italy's National Institute for Nuclear Physics turned on their 5-tesla magnet in 2000, they immediately saw a rotation 10,000 times larger than expected, says PVLAS member Giovanni Cantatore of the University of Trieste. The rotation is caused by the loss of a small number of photons whose electric fields line up with the magnetic field. This selective disappearance is what physicists would see if the missing photons were converting into neutral particles with about one-billionth the mass of an electron.

    “If you believe the signal is real, then the interpretation is a new particle,” says theoretical physicist Andreas Ringwald of DESY, Germany's particle physics center near Hamburg. But Ringwald thinks most physicists believe the rotation comes from some subtle artifact of the instruments. The PVLAS team has spent 5 years looking for such systematic effects: They have rotated and reduced the magnetic field, added air to their vacuum system, and changed the frequency of the laser. “All this time we have tried to make the signal go away,” Cantatore says. It hasn't. The PVLAS team doesn't claim to have discovered a new particle. “It is important to be careful,” Cantatore says. A paper in Physical Review Letters is due this month.

    “These are very serious, very competent people,” says Pierre Sikivie of the University of Florida, Gainesville, who also looks for novel particles with magnetic fields. Still, he has a “wait-and-see attitude,” because the implications would be “revolutionary.”

    The PVLAS particle, if it exists, has the makings of an axion, a hypothetical particle that some cosmologists propose is the invisible missing dark matter that makes up a large chunk of the mass of the universe. However, the particle suggested by the PVLAS experiment is not what the theorists ordered. It couples so strongly to photons that the axion-search experiments currently scattered around the globe should have seen loads of them coming from the sun (Science, 15 April 2005, p. 339). Such a stream of invisible particles out into space would drain a star of its energy in a few thousand years. But we know stars, including our sun, last for billions of years. Raffelt says the PVLAS particle would need “crazy properties” to match astrophysical constraints, but there is no fundamental reason it can't behave that way.

    A twist in the tale.

    By rotating the polarization of a laser beam with magnets, this experiment may have found never-before-seen particles.

    CREDIT: THE PVLAS COLLABORATION

    The PVLAS collaboration plans to settle the question with an experiment involving two magnets separated by a wall. On one side, part of a laser beam would be converted into a flux of PVLAS particles, which would fly straight through the wall. On the other side, the second magnet would reconvert some of the particles back into photons, at a rate of one every 2 seconds, Cantatore predicts. Ringwald is proposing a similar experiment at DESY, and CERN, the European particle physics lab near Geneva, Switzerland, is also considering one.
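
    A rough figure of merit from the axion literature helps explain why the predicted rate is so low. It is standard background, offered here as an assumption rather than something given in the article: for a nearly massless particle, the probability that a photon crossing a transverse magnetic field B over a length L converts into an axion-like particle with photon coupling g is approximately

    \[
    P_{\gamma \rightarrow a} \;\approx\; \left( \frac{g\,B\,L}{2} \right)^{2}
    \]

    in natural units. A photon-regeneration experiment of the kind described above pays this suppression twice, once to convert the photon and once to convert the particle back, so its signal scales as the square of an already tiny probability; that double penalty is consistent with Cantatore's prediction of roughly one regenerated photon every 2 seconds even for a coupling strong enough to explain the PVLAS rotation.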

    Although most physicists doubt the reality of this particle, they are curious to see what comes of it. “People want to give the idea a fair hearing,” Sikivie says. “If it turns out to be true, it will be a theoretical challenge to explain, but also an opportunity.”

  4. PLANETARY SCIENCE

    Minerals Point to a Hot Origin for Icy Comets

    1. Richard A. Kerr

    HOUSTON, TEXAS—Scientists analyzing the first samples returned from a comet announced startling news this week. They are finding not the unprocessed “stardust” thought to have glommed together in the frigid fringes of the early solar system, but bits of rock forged in white-hot heat. The discovery may mean that the disk of dust and gas from which all planetary bodies formed was far more violently mixed than previously thought.

    At the Lunar and Planetary Science Conference here, leaders of the 150-strong Stardust science team told how team members on four continents have been slicing, dicing, and analyzing 10-micrometer particles collected by the Stardust spacecraft. It swept by comet Wild 2 two years ago and returned its samples to Earth on 15 January. Working first on the larger particles snared in the Stardust collectors, analysts are finding mineral crystals such as forsterite, pyroxene, anorthite, spinel, and titanium nitride. These “are all minerals that formed at moderately high to extremely high temperatures,” Stardust principal investigator Donald Brownlee of the University of Washington, Seattle, later told a press conference at NASA's nearby Johnson Space Center. “These are hot minerals from the coldest place in the solar system,” the comet-forming region beyond Neptune.

    A hot one.

    This 2-micrometer bit of comet Wild 2—a magnesium-rich olivine called forsterite—formed at a high temperature, perhaps near the young sun.

    CREDIT: NASA/JPL-CALTECH/UNIVERSITY OF WASHINGTON

    The minerals must have formed at 1400 K or hotter, Brownlee said, especially a couple of particles resembling the so-called calcium-aluminum inclusions (CAIs) known from meteorites. In contrast, the dust the analysts expected to find in comets would be submicrometer in size and lacking in any crystalline structure. That's the form such grains would have taken as they condensed from vapor in deep space after being blown off other stars.

    Brownlee offered two possible solutions to the hot-and-cold conundrum. The crystals “could have come from the innermost region of the [still-forming] solar system,” he said. Astrophysicist Frank Shu of National Tsing Hua University in Taiwan has advanced that idea to explain CAIs and once-molten droplets called chondrules that dominate the most common type of meteorite coming from the asteroid belt (Science, 20 June 1997, p. 1789). Shu argues that the young, violently active sun would have blasted nearby solids to their melting points and magnetically flung them—including CAI and chondrule particles—out over the disk as far as the comet-forming region. Alternatively, says Brownlee, the Stardust minerals may have crystallized from melts near other stars and reached the forming solar system by some unspecified means.

    “If this were astronomy, we'd stop there,” Brownlee told his colleagues. Astronomers have nothing to go on but the electromagnetic spectrum, which would yield no further information in this case. “But we have samples; that will solve this mystery.” The key will be isotopes, he said. The mix of isotopes in solar system material is wildly different from that of other stars, he noted, as evidenced in rare bits of interstellar material long known from meteorites. “We'll know in weeks or months,” says Brownlee.

  5. U.S. REGULATORY POLICY

    Courts Ruled No Forum for Data-Quality Fights

    1. Jocelyn Kaiser

    A federal appeals court ruled last week that the public can't sue federal agencies over their compliance with a controversial law on the quality of scientific data. The decision is a victory for environmentalists and government watchdog groups, which have accused industry of using the so-called Data Quality Act (DQA) to delay new regulations.

    The 2000 act, which requires federal agencies to set standards to ensure the quality of information they disseminate, allows critics to petition agencies that they believe have not met the standards. Many such petitions have been filed, largely by industry groups challenging reports on topics such as the effects of toxic chemicals. But petitioners have no recourse if rebuffed.

    In May 2003, the Salt Institute and the U.S. Chamber of Commerce filed a DQA petition to obtain unpublished data from DASH-Sodium, a study funded partly by the National Heart, Lung, and Blood Institute (NHLBI) (Science, 30 May 2003, p. 1350). The study found that eating less salt lowered participants' blood pressure, and NHLBI has cited these findings in recommending that all Americans lower their salt intake. But DASH researchers had failed to break down the data for subgroups (such as white men under age 45 without hypertension), argued the industry groups, which demanded that NHLBI release these data for independent analysis. After NHLBI rejected the request, the groups sued the Department of Health and Human Services (HHS), NHLBI's parent agency.

    In November 2004, a Virginia federal district court turned down the suit, a decision upheld on 6 March by the U.S. Court of Appeals for the 4th Circuit in Richmond, Virginia. The panel of three judges found that the DQA “does not create any legal right to information or its correctness,” and for that reason, the plaintiffs lacked legal “standing” to pursue their case.

    The decision is “very broad” and will likely stand because it's from “a very conservative panel,” says University of Maryland law professor Rena Steinzor of the Center for Progressive Reform. But proponents of the law say they aren't giving up. “I'm deeply disappointed. I feel that Congress intended that the Data Quality Act should be enforced,” says Richard Hanneman, president of the Salt Institute.

    NHLBI has been providing a limited data set to qualified researchers since January 2004. But the Salt Institute has not requested the data because “there's no assurance” its request would be granted, Hanneman says. Jim Tozzi of the industry-funded Center for Regulatory Effectiveness, who helped craft the legislation, is thinking about suing HHS over its position that marijuana has no accepted medical benefit. “A dozen people with diseases” might have a better shot at convincing a court they have standing, says Tozzi.

    Meanwhile, the Chamber of Commerce is pondering whether to push for legislation that would open up any DQA decision to legal challenge. Steinzor predicts that such an effort will mobilize opponents of the act to maintain the status quo.

  6. U.S. DEPARTMENT OF ENERGY

    Can Energy Research Learn to Dance to a Livelier Tune?

    1. Eli Kintisch

    Turning basic research into commercial technology has never been easy. But it's especially hard in the energy sector, where problems such as cutting greenhouse gas emissions and finding less-polluting energy sources resist easy solutions despite laboratory breakthroughs by the country's best minds.

    Last week, Congress took the first steps toward addressing that problem, as legislators embraced the concept of creating a small, nimble agency within the mammoth Department of Energy (DOE). The Senate Committee on Energy and Natural Resources voted out a bill (S. 2197) that would authorize an Advanced Research Projects Agency-Energy (ARPA-E) modeled on DARPA, the 48-year-old agency within the Pentagon. The next day, a House panel devoted a 2-hour hearing to the concept, proposed last fall in a National Academies report on U.S. technical competitiveness (Science, 21 October 2005, p. 423).

    “A small, agile, DARPA-like organization could improve DOE's pursuit of R&D much as DARPA did for the Department of Defense,” wrote the academies panel in a report described at the House hearing by Nobelist Steven Chu, director of DOE's Lawrence Berkeley National Laboratory in California. ARPA-E's “transformational” research, the panel said, “could lead to new ways of fueling the nation and its economy, as opposed to incremental research on ideas that have already been developed.” Chu, like others, said fixing a depleted basic science base was the top priority for energy research, but that ARPA-E could help “bridge the gap between basic energy research and development/industrial innovation.”

    Like DARPA, ARPA-E would employ a small staff of program managers who would leave industry and academia for short stints with the government. The agency should have the freedom to start and stop programs quickly and—again like DARPA—be attuned to the spectrum of research from basic discovery through prototypes, before handing projects over to the private sector for commercialization.

    Melanie Kenderdine of the Gas Technology Institute in Washington, D.C., says the staff could bridge the sort of communication gaps between vehicle research and fuels work that she witnessed as a senior DOE official during the Clinton Administration. Although Dan Arvizu, director of DOE's National Renewable Energy Laboratory in Golden, Colorado, fears that a new agency could grow unwieldy, he likes DARPA's focus on the “entire spectrum” of research. That philosophy, he says, could bolster agency initiatives on ethanol and photovoltaics by focusing on basic scientific questions in addition to technological improvements.

    Even so, experts warned legislators that simply recreating DARPA wouldn't work. The military is a different breed of cat than the energy sector, says former DARPA Director Frank Fernandez. The government would do a better job fostering new technologies by using taxes or mandates on existing energy sources, says House Science Committee Chair Sherwood Boehlert (R-NY), who calls himself an “open-minded skeptic.”

    Risky business.

    Nobelist Steven Chu backs a new agency that would fund “out-of-the-box,” high-payoff energy research.

    CREDIT: ROY KALTSCHMIDT/BERKELEY LAB

    Even supporters are wary of any new agency that might drain resources from President George W. Bush's request for a 14% increase in the 2007 budget of DOE's Office of Science. The Senate bill, which has 65 co-sponsors in a body of 100, would authorize a $250-million-a-year operation. But Senator Pete Domenici (R-NM), who chairs the Senate panel that controls DOE and who introduced the bill, hasn't endorsed a specific funding level.

    When President Dwight D. Eisenhower created DARPA's forerunner after the Soviet Union launched Sputnik, he had to overcome the objections of military leaders. To succeed, ARPA-E's supporters will have to convince the Bush Administration that it won't “distract” from DOE's other initiatives, as Energy Secretary Samuel Bodman told a Senate panel earlier this month. But they are on their way to capturing another key element of DARPA's longevity—the support of Congress.

  7. RUSSIAN SCIENCE

    Moscow Plans Tighter Control of Science Academy's Research Money

    1. Andrey Allakhverdov,
    2. Vladimir Pokrovsky*
    1. Andrey Allakhverdov and Vladimir Pokrovsky are writers in Moscow.

    MOSCOW—The Russian Academy of Sciences (RAS) is facing a tough new challenge from the government, according to its leaders. Members say they were shocked earlier this month to learn that the Ministry of Finance is proposing changes that could give bureaucrats more authority over science funding decisions and radically alter research management.

    One of the proposed changes would take away the academy's independent control over the distribution of public science funds now in its purview, sources say. The academy would no longer be funded as a line item in the national budget; instead, the government would allocate funds to the Ministry of Education and Science, which would redistribute money to various programs, including to RAS. Academicians fear this would give bureaucratic and political considerations too much weight in decisions.

    Losing independence?

    Russian Academy of Sciences President Yuri Osipov has urged the finance minister to drop proposed changes.

    CREDIT: YURI KADOBNOV/AFP/GETTY IMAGES

    In addition, the Finance Ministry has proposed that an institute's private earnings should go to the state rather than to the institute that earned them. Currently, 40% of RAS's revenue comes from independent sources such as grants and contracts for commercial, defense, and consulting work. Both finance proposals are circulating in the ministries but have not been approved by the cabinet or the Duma.

    RAS Vice President Alexandr Nekipelov, among others, sees these moves as an attack on the academy's independence. Says Nekipelov: “These amendments will lead to financial collapse of the whole structure of the academy.” In an “epic” struggle more than a year ago, says Nekipelov, the government tried to take control of academy resources (Science, 24 September 2004, p. 1889). “But we managed to assert that important organizations like the academy, Moscow University, and others … would preserve the right to be in charge of the funds at their disposal.” Now, Nekipelov says, “these amendments have come up all of a sudden,” renewing the struggle. The proposal to transfer outside earnings to the state is “absurd,” he argues, because it could drive academic institutions out of “innovative R&D activities” that could foster new industries. RAS President Yuri Osipov has written to the finance minister urging that the proposals be dropped.

    The Ministry of Education and Science seems to think RAS officials are being alarmist, however. Innovation policy department chief Alexandr Khlunov says his ministry argued that the proposed budget changes “must not affect the ability of research institutions to do research.” As a result, Khlunov said, the changes will be flexible, but he declined to give specifics. “It doesn't make any difference who provides funds,” he says: Managers should manage, and “scientists should do research.”

    The proposed changes may not affect the other big provider of basic research money, the Russian Foundation for Basic Research. Director Vladimir Lapshin says it is too early to say what will happen: “There are many amendments at the moment … and too much confusion.” He suggests that a government review of science funding could be useful if it leads to more competitive peer-reviewed science, which is already increasing “every year.” But he is adamant on one point: “It takes too long” from the moment the budget is approved to when researchers get their money; something must be done to speed this up.

  8. PARTICLE PHYSICS

    Linear Collider Partners Woo Newly Opened India

    1. Pallava Bagla

    NEW DELHI—With the wheels of Air Force One barely off the tarmac following U.S. President George W. Bush's visit, which ended India's 3 decades as a nuclear pariah state, a delegation of U.S. and European physicists arrived here last week to discuss India's involvement in the International Linear Collider. ILC is a multibillion-dollar particle accelerator that researchers hope will study the exotic species of particles that existed just after the big bang. “We all hope that India will become a key partner in this global collaboration,” says Pier Oddone, director of the Fermi National Accelerator Laboratory near Chicago, Illinois. According to some, India could even host the machine.

    Sanctions have been in place against India since 1974 because of its clandestine nuclear weapons program and its refusal to sign the Nuclear Nonproliferation Treaty. But under the U.S.-Indian deal agreed upon earlier this month, India would be able to trade in civilian nuclear technology with other countries in exchange for opening up a majority of its nuclear facilities to international inspection.

    Despite their exclusion from U.S. research programs, Indian researchers have made names for themselves in high-energy physics. India has observer status at CERN, the European particle physics lab near Geneva, Switzerland, and has contributed precision equipment worth more than $20 million to the construction of CERN's Large Hadron Collider.

    With a price tag that could reach $8 billion, ILC will be a global project. India is being considered for a position as an “equal partner,” says Barry Barish, director of ILC's Global Design Effort: “Early participation in ILC will enable India to integrate their program in the development stages with the world program and bring back new expertise, rather than just contributing some technology to a large external project.”

    “India is seriously considering to join the project,” says nuclear physicist Valangiman Subramanian Ramamurthy, secretary of India's Department of Science and Technology. Some think India could play an even greater part, with its combination of skilled scientists and engineers and low labor costs. According to Carlo Pagani, a member of the visiting delegation from Italy's National Institute for Nuclear Physics in Milan: “It just might be advantageous for the world to house the project in India or China.”

  9. INFECTIOUS DISEASES

    Report Concludes Polio Drugs Are Needed—After Disease Is Eradicated

    1. Jennifer Couzin

    As efforts to wipe out polio intensify in the handful of countries where the disease still occurs naturally, public health experts are thinking about what comes next. In a report released last week, a seven-person committee appointed by the National Research Council in Washington, D.C., argued for developing an antipoliovirus drug in the event of a posteradication outbreak. But whereas everyone on the panel endorsed that advice in principle, not all felt it was achievable.

    Antivirals might seem unnecessary for a disease that will be declared eradicated. But ever since efforts to stamp out polio began in 1988, public health officials have known that their success might create a difficult dilemma: The very oral polio vaccine used to prevent the disease can spur fresh outbreaks, because it contains live but weakened versions of the three types of poliovirus. Vaccinated individuals, particularly immune-deficient ones, shed the virus and can transmit it to the unvaccinated. That poses a problem, because several years after the disease is declared eliminated, countries may stop vaccinating their residents. “What are we going to do then, when vaccine virus is still circulating around?” asks Samuel Katz, an infectious-disease specialist at Duke University in Durham, North Carolina, who was the committee chair. “If we get outbreaks again and go in with oral vaccine and control them, you're perpetuating the dilemma.”

    The report, requested by the World Health Organization (WHO) in Geneva, Switzerland, and the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia, calls for developing a safe, orally administered antiviral that prevents and treats polio. But some panelists question the prescription. “I started to wonder, ‘Is this going to be realistic?’” says committee member Neal Nathanson, associate dean for global health programs at the University of Pennsylvania School of Medicine in Philadelphia. Among other things, he wonders how easy it would be to persuade thousands of healthy people to take an antiviral drug for a disease they may not get. Although vaccine-driven outbreaks are real—last year, both Indonesia and Madagascar suffered outbreaks of paralytic polio caused by vaccine-derived viruses—Nathanson notes that they have been “more or less self-limited.”

    Moreover, developing a new drug can take years, and WHO anticipates that transmission of polio will end in about a year. James Hogle, a Harvard University structural biologist, adds that it's unclear who would fund such drug development, because a polio antiviral is unlikely to rake in anything approaching a profit. “It's rather late in the game to do this,” he says.

    But with the endgame in sight, Bruce Aylward, WHO's coordinator of the global eradication initiative, worries that “you're going to have an increasingly vulnerable world to polio.” Antivirals haven't been pushed until now, says Aylward, because WHO only recently became confident that it could stamp out the disease.

  10. SCIENTIFIC COMMUNITY

    Bias Claim Stirs Up Ghost of Dolly

    1. Gretchen Vogel,
    2. Eliot Marshall*
    1. With reporting by John Bohannon.

    A hearing into a scientist's claim that he was the target of harassment and racial discrimination has put under a microscope the lab where Dolly the sheep was cloned. It also has prompted the man widely recognized as Dolly's creator, Ian Wilmut, to give detailed evidence on who deserves credit for the successful experiment—and precisely how much. In testimony, Wilmut gave himself less than a third of the credit.

    The investigation arises from a suit brought by molecular biologist Prim Singh. He charges that Wilmut, then a researcher at the Roslin Institute in Midlothian, U.K., bullied him and stole his ideas. Seeking $1.74 million, he claims that Roslin passed him over for promotions because of his race and forced him to quit after he lodged a complaint against Wilmut in 2003. Wilmut and Roslin have denied the charges. (A previous discrimination claim that Singh filed was dismissed last year.) Singh did not work on the Dolly project, but testimony in his case provides an inside view of the team that pulled off one of the world's most famous biology experiments.

    An employment tribunal in Edinburgh began hearing testimony from Singh and other witnesses in November 2005, but it was testimony last week from Wilmut himself that caught wider attention. Singh's lawyers questioned Wilmut about the famous paper describing Dolly, published in Nature in 1997. Wilmut, who is now at the University of Edinburgh, was lead author and has received most of the public credit. But in court he said that he had neither developed the key technology nor conducted the experiments that led to Dolly's birth. When Singh's lawyer asked him if the statement “I did not create Dolly” was true, Wilmut answered “Yes.” He said he played a coordinating role in the project but that his colleague Keith Campbell, now at the University of Nottingham, deserved “66%” of the credit for the breakthrough.

    Murky origins.

    A discrimination hearing has reignited old resentments among the team that cloned Dolly the sheep.

    CREDIT: REUTERS

    Other members of the team offered independent views to journalists covering the case. Bill Ritchie, a technician at Roslin, says he and Karen Mycock, another technician, did the nuclear transfer procedures. But neither is listed as an author on the paper. Alan Colman, now CEO of ES Cell International in Singapore, who was working at Roslin's sister institute PPL Therapeutics at the time of the Dolly experiments, says that authorship questions on the paper were controversial from the start. He says Ritchie and Mycock made important contributions to the project, but adds that Wilmut did not take an undue share of the credit. “Ian conceived the program, worked on it for many years, and hired the right people to get it done,” he says.

    Roslin itself, meanwhile, is planning a complete makeover and change of location. After a positive scientific review last fall, says director Harry Griffin, Roslin has been approved to join a new outfit in 2009 called the Edinburgh Bioscience Research Centre at the University of Edinburgh School of Veterinary Studies. The U.K. government is pledging $60 million to the merger, which will also bring in experts in prion diseases from the nearby Institute for Animal Health. Griffin says its leaders aim to raise another $52 million for a research facility employing 500 scientists. He would not comment on the Singh case.

  11. SPACE SCIENCE

    A Space Race to the Bottom Line

    1. Andrew Lawler

    Flush with new discoveries, NASA's space and earth scientists now must figure out how to get by on $3 billion less than they expected—without triggering a civil war

    Space science is getting plenty of headlines these days. A new spacecraft is on its way to Pluto, one just arrived at Mars, and another may have spotted water on Saturn's moon Enceladus. But last week, two dozen senior researchers met in a windowless Washington, D.C., conference room to try to avert what some fear could turn into a civil war among earth and space science disciplines scrambling for science's decreasing share of the space agency's budget.

    The go-go years of the past decade came to a crashing halt last month, when NASA's 2007 budget request pulled more than $3 billion out of the long-term science plan (Science, 10 February, p. 762). NASA has since canceled two missions close to launch, deferred a handful for a year or two, and effectively killed a half-dozen others slated for orbit in the next decade. To cope with the rapidly unfolding crisis, members of the National Academies' Space Studies Board assigned themselves the task of building a united front among notoriously fractious disciplines to make the best use of scarce dollars. They don't have much time. “Everyone recognizes that we are in this together—and we have to solve it together,” says board member Daniel Baker, a space physicist at the University of Colorado, Boulder.

    The unprecedented effort to find an acceptable alternative to NASA's 2007 budget request before legislators act on the bill this summer has the blessing both of the agency and Congress. Space agency chief Mike Griffin says he is willing to consider the results (see sidebar, p. 1542). And congressional staffers are cheering them on. “I hope you folks will have the answer to the problem—because we don't,” Richard Obermann, a minority staffer with the House Science Committee, told the board on 6 March. Adds David Goldston, the committee's chief of staff, “Whatever pattern is set this year, it will be the pattern for the foreseeable future.”

    Out of business?

    Griffin and other Administration officials dismiss the idea that a $5.3 billion request for research in 2007 represents a crisis for the field. “There is still a very large overall science budget, just not as large as had been hoped,” says Griffin. “NASA's science budget is almost as large as the entire [budget for the] National Science Foundation. I'm unable to see the level of damage here that those who are concerned about it seem to see.” Indeed, the proposed 1% boost in NASA science over current levels beats out the average 0.5% cut borne by nondefense discretionary programs across all federal agencies.

    Scientists, congressional staffers, and NASA science staff say this statement is true but misleading. Two years ago, the agency planned to boost its science budget by $1.5 billion by 2009. As recently as last year, the increase was still $1 billion by 2010. Based on such optimistic figures, NASA in recent years began funding work on an ambitious array of projects, most to meet scientific goals set by the National Academies in its various decadal plans.

    But those projects are costing far more than planned. The most dramatic example is the James Webb Space Telescope (JWST), whose price tag is now $4.5 billion—$1 billion above the planned cost. A host of other projects are in the same boat. Costs for the Stratospheric Observatory for Infrared Astronomy (SOFIA) have ballooned from $400 million to $650 million, and several projects considered by the academies to be mid-size efforts now have grown to the size of flagship missions. “The problem is an enormous growth in the cost of doing programs; the numbers don't add up,” says Thomas Young, a former aerospace executive and board member.

    To cope with rising costs and a shrinking budget outlook, NASA officials are taking drastic steps to curtail spending and limit new starts—mostly by deferring missions, canceling troubled projects, and reducing the money scientists receive to analyze research data. As a result, the number of new science missions launched will decline from a dozen this year to one in 2010. In the meantime, aging spacecraft will begin winking off. “This looks like we're going out of business,” Baker says.

    Defer and delay

    For some disciplines, that is no exaggeration. “The last mission we have in earth sciences is in 2012,” frets board member Berrien Moore, co-chair of another academies' panel writing that discipline's first decadal plan. “After that, we'd better be going to Mars!”

    Congress forced NASA 2 years ago to reverse planned cuts in several earth science missions. But in recent weeks, the agency has canceled the Deep Space Climate Observatory (Science, 6 January, p. 26) and Hydros, a $170 million effort to study soil moisture. NASA officials say that Hydros was a backup to two other missions now in the works, and so it never was a confirmed project—a point disputed by some researchers. The agency also will delay the Global Precipitation Mission by 30 months and slow a precursor mission for a national environmental satellite system by 18 months.

    For solar physics, the top-ranked mission in a 2003 decadal study by the academies—a magnetosphere mission—now will not be launched until 2013. Two other high-ranked missions—two separate constellations of small satellites to examine the interaction between the ionosphere and the thermosphere and understand how energy moves in Earth's magnetotail—are on indefinite hold.

    Rising costs and flat budgets also will force NASA to compete several new astrophysics flights. Constellation X—a group of four orbiting telescopes that will image the x-ray universe—will face off against the Laser Interferometer Space Antenna, designed to detect gravitational waves, and a Joint Dark Energy Mission with the Energy Department. The winner will get a green light to start work in earnest in 2009 or 2010 for a launch later in the next decade. The other two will have to wait their turn.

    NASA also has stopped early work on the Terrestrial Planet Finder, a spacecraft that researchers had hoped to orbit in the next decade in search of Earth-sized planets. The Space Interferometry Mission, another planet-hunting mission, won't be orbited until 2015 or 2016, and its cost has grown to $4 billion.

    Stanford University astrophysicist Roger Blandford also fears for the future of the Explorer program, NASA's attempt to launch smaller missions run by principal investigators. The agency earlier this month canceled the Nuclear Spectroscopic Telescope Array, which was to open up the high-energy x-ray sky, and postponed the next solicitation for an Explorer from 2007 to 2008—delaying the launch of the next mission to 2014 at the earliest.

    Planetary scientists are perhaps most bitter about the 2007 budget request. Their program, complains Reta Beebe, a board member and an astronomer at New Mexico State University in Las Cruces, “has unfortunately become the source of funds supporting other NASA programs.” She and others note that of the $3.1 billion taken out of the 5-year budget projections for science, nearly all came from planetary missions. NASA recently canceled the Dawn mission to the asteroids Vesta and Ceres, rejected pleas to begin a large mission to Jupiter's moon Europa, and cut the astrobiology budget by a whopping 50%. (On 10 March, Griffin agreed to review the decision on Dawn.) The agency also abandoned plans to launch a Mars sample return by 2016.

    “The proposed budget transforms an existing, vibrant program into a stagnant holding pattern,” says Beebe. “The damage is immediate and increasingly irreversible. … We are reenacting the events of the 1970s,” she says, when a series of exciting missions was followed by a 15-year drought.

    Yet even that grim prediction doesn't match the crisis in the space life and microgravity sciences field, which had $1 billion for both ground- and space-based research as recently as 2004. With the advent of the exploration initiative, that figure has plummeted to near zero. Donald Ingber, a Harvard University biologist and board member, insists that such cuts will make long-term human space flight impossible, given unknowns about radiation hazards and the impact of microgravity on human health. “This will set the manned program back by decades,” he warns.

    Civil war or solidarity?

    Short of an abrupt cancellation of the shuttle and station programs, there are few prospects for a dramatic change in science's fortunes. Indeed, this year's overall increase of 3.2% for NASA may look good in a few years, board members fear. And even if the shuttle is retired in 2010 once the space station is complete, the space agency's budget documents note that the dividends will go into the exploration program rather than science.

    “We're not going to be able to execute the decadal [studies] as they exist,” concludes Lennard Fisk, board chair and a geophysicist at the University of Michigan, Ann Arbor. A 1% increase in NASA's science budget, he says, translates into “a major retrenchment.” And scientists say they would rather make the hard choices than leave them to NASA managers. If they don't, Blandford warns, “choices that should be scientific and technical will be left to the political process.”

    After hours of discussion, board members broadly agreed to protect research funds for the university community and for smaller missions. That decision puts larger efforts in each discipline on the chopping block. Moore suggested that to find earth science savings, the $430 million Landsat mission slated for launch by 2010 could be reviewed, and astronomers privately and cautiously suggest that deferring JWST by a few years could rescue smaller astrophysics missions in the near term. The largest planetary mission now scheduled is the Mars Science Laboratory, slated for a 2009 launch; among solar physicists, the big-ticket item is the Solar Dynamics Observatory due for orbit in 2008.

    But some researchers already are parrying the attack on larger programs. William Smith, president of the Association of Universities for Research in Astronomy, warned in a 6 March letter to the House Science Committee that NASA's great observatories provide $70 million annually for research analysis. Canceling flagship missions “would simply shift the imbalance, not eliminate it,” he says. Attempts to defer, descope, or kill JWST or the Mars Science Lab also would provoke major battles in each discipline and in Congress.

    Even if NASA likes what the space board proposes, the fight is sure to move quickly to Capitol Hill, where projects with the most political muscle could triumph despite the academies' priorities. Still, researchers say they have to try. “Dividing a growing pie is not all that difficult,” says George Paulikas, board vice chair and a retired aerospace executive who is heading up the academies' effort.

    But the alternative to consensus is too awful to contemplate, he adds. Can independent-minded scientists agree on a plan that spreads the pain around? “Stay tuned until April,” he says.

  12. SPACE SCIENCE

    Bumpy Ride for Data-Driven NASA Chief

    1. Andrew Lawler

    “Show me the data” proclaims the framed sign over Michael Griffin's desk. It is a warning to visitors to his ninth-floor office at NASA headquarters in downtown Washington, D.C., that the 56-year-old aerospace engineer and applied physicist brooks little idle chatter, speculation, or wheeling and dealing. “I don't try to be blunt,” says Griffin, nearing his first anniversary as head of the $16.6 billion U.S. space agency. “I just tell the truth.”

    Griffin's two predecessors, Daniel Goldin and Sean O'Keefe, were known for being mercurial, visionary, and political—anything but plainspoken. And Griffin's no-nonsense approach to fixing what ails the U.S. space program—from a crippled space shuttle program and a half-completed international space station to an overmortgaged science portfolio—has earned him a large reservoir of good will in the White House and on Capitol Hill.

    But that pool is drying up (Science, 10 March, p. 1359). President George W. Bush wants NASA to focus on building a new rocket to take humans back to the moon. Lawmakers are pressing to keep space shuttle jobs and contracts intact as long as possible. NASA's international partners worry that the agency still might back out of finishing the space station, leaving them and their hardware in the lurch. And scientists need continued big annual budget increases to build the many ambitious missions planned for the next decade (see main text, p. 1540).

    To keep these disparate groups happy, Griffin last fall asked the White House to give NASA a whopping 8.8% increase in 2007. He warned the Office of Management and Budget (OMB) that a lesser boost would force him to halt the growth in science. When Bush decided to ask Congress for a 3.2% increase, Griffin kept to his word—prompting an angry reaction from scientists and their allies on Capitol Hill.

    When he is under fire, Griffin's refreshing forthrightness can come across as political insensitivity. He dismisses the community's outcry as “a hysterical reaction, a reaction out of all proportion to the damage being done.” But those words are likely to antagonize rather than assuage science advocates. Griffin is famous for responding rapidly to e-mails; he carries his BlackBerry everywhere. Yet he's uncomfortable with the face-to-face socializing and back scratching that his predecessors practiced so adroitly. “You have to form relationships, not just send an e-mail,” says one who has worked closely with him. Although Griffin responds rapidly, adds another, “his impatience often shows.”

    Griffin's style befits his career as a project engineer, industry manager, and lab department head. He also spent a difficult few years at NASA headquarters overseeing the agency's doomed lunar and Mars exploration program, an idea that George H. W. Bush proposed but Congress ultimately ignored.

    This time around, he knows he will need help from all quarters. “I need the scientific community to worry about more than just what happens to science,” Griffin says. To win them over, however, he may want to serve up a slice of tact and empathy along with the data.

    What follows is an edited transcript of a 7 March interview with Science's Andrew Lawler.

    Q: You told Congress in 2003 as a private citizen that NASA needed $20 billion to do everything on its plate. What's changed?

    What's changed is that I am an agency head. Every agency head would like to have more money. The average [nondefense discretionary program] took a half-percent decrease in the 2007 budget—so we got 3.7% above average. I think that is extraordinary. My response is to say thank you. Is it as much as we would like to have? Of course not.

    Q: How much would you like to have?

    I'm not going to answer that question.

    Q: In a November letter to OMB, you asked for 8.8%.

    There are months of work that go into preparing a budget with all kinds of trades, and that was a missive from a snapshot in time.

    Q: Has your promise last year not to take “one thin dime” out of science come back to haunt you?

    No. I found we could not complete the station and the shuttle and make any kind of progress in replacing the shuttle with the CEV [Crew Exploration Vehicle] and the CLV [Crew Launch Vehicle] without restricting the growth of science. We just ran out of money.

    Q: Why should science take the fall?

    Your readers should understand that everybody in NASA paid the piper. I cannot accept an argument that manned space flight operations got everything they wanted when they in fact took a huge whack.

    Q: But isn't it the science aboard the station that is taking the whack?

    I chose to assemble now and utilize later.

    Q: Why is there no post-2010 plan to do science on the station?

    I inherited what I inherited. Clearly, the [National Academies'] report [on space station science] is very specific and unequivocal in its position that we don't have a good space station utilization plan. But we have several years now to develop one, and we will.

    We still have an extraordinarily healthy science program. Some missions have been delayed, some things of a doubtful nature have been canceled, and a couple of things are on the chopping block because the promised technical performance has not come true.

    Q: Is there any prospect of ending the shuttle program before 2010, thereby freeing up money for exploration and science?

    We're flying out the shuttle program in an orderly and disciplined way and using it to finish the space station. We have been working on it for 20 years, and we have multiple international commitments. Other things we would like to do—including exploration and science—are going to have to sacrifice for the next few years to allow that to come true.

    CREDIT: LUKE FRAZZA/AFP/GETTY IMAGES

    Q: Is that why NASA canceled NuSTAR and the asteroid mission Dawn—and soon maybe the flying observatory SOFIA [Stratospheric Observatory for Infrared Astronomy]?

    Dawn was canceled because it was overrun by 20%. That's a matter of project discipline. Dawn's cancellation has nothing to do with [the NASA] budget. SOFIA is so far overrun on cost and schedule that only if we can convince ourselves that it is past its technical problems—well, the question is, can its people get to the finish? I insist on imposing discipline on our projects.

    Q: But why are the shuttle and station exempt from this rule?

    Or the James Webb [Space Telescope]? Our highest priority missions will be completed. And other things in science are suffering to pay for James Webb. So what's your point?

    Q: Why should projects like Webb be exempt, if the smaller ones often are being managed more innovatively than large projects?

    Our science program is structured to pay appropriate and ample respect to National Academy priorities. Now at present, in the astrophysics line, James Webb is the highest priority. Fifteen minutes after I arrived at NASA, I learned there was a billion-and-a-half shortfall in James Webb. My choice is either to continue to respect the academy's priority and find the money from lower priorities, or I could disrespect the academy's priorities and cancel James Webb. That is a bind. My choice will generally be to respect academy priorities. If the academy revisits the issue of whether a single flagship mission is worthy of the sacrifice of numerous lesser, possibly more innovative, more timely missions, that would be a judgment for the scientific community to make. I'm listening. But I do not view that as a judgment that a NASA administrator ought to make.

    Q: NASA's credibility as a nonpartisan purveyor of science was damaged in the flap over recent complaints by agency scientist Jim Hansen. What are you doing to change that?

    Even Jim Hansen has not said that anyone has interfered with his publication of his technical conclusions. Jim said he was inappropriately denied an interview he should have been able to conduct. And I think he was right. And the person who denied him that interview is no longer here. I can only assure you, as the head of NASA, that no one here wants or will tolerate any restriction on the prerogatives of technical people to publish their conclusions to their community and have them be debated on their merits in their communities.

    Q: What do you say to those scientists who are angry at the 2007 request?

    We have a space program that requires some sacrifices. I'm sorry they have to happen, but they do.

  13. GENE SEQUENCING

    The Race for the $1000 Genome

    1. Robert F. Service

    Fast, cheap genetic analyses will soon become a reality, and the consequences—good and bad—will affect everybody

    MARCO ISLAND, FLORIDA—Computers aren't the only things getting better and cheaper every time you turn around. Genome-sequencing prices are in free fall, too. The initial draft of the first human genome sequence, finished just 5 years ago, cost an estimated $300 million. (The final draft and all the technology that made it possible came in near $3 billion.) Last month, genome scientists completed a draft of the genome sequence of the second nonhuman primate—the rhesus macaque—for $22 million. And by the end of the year, at least one company expects to turn out a full mammalian genome sequence for about $100,000, a 3000-fold cost reduction in just 6 years.
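
    For readers who want to check the arithmetic, the quoted figures line up. A minimal sketch in Python, using only the costs cited above:

        # Cost figures cited in this article, in dollars.
        initial_draft = 300_000_000  # first human genome draft, ~5 years ago
        projected_cost = 100_000     # full mammalian genome, end of this year
        print(f"{initial_draft / projected_cost:,.0f}-fold reduction")
        # prints: 3,000-fold reduction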

    Charting islands.

    Glowing dots on a glass slide mark cloned DNA being sequenced.

    CREDIT: OMEAD OSTADAN/SOLEXA

    It's not likely to stop there. Researchers are closing in on a new generation of technology that they hope will slash the cost of a genome sequence to $1000. “Advances in this field are happening fast,” says Kevin McKernan, co-chief scientist at Agencourt Bioscience in Beverly, Massachusetts. “And they are coming more quickly than I think anyone was anticipating.” Jeffrey Schloss, who heads the sequencing-technologies grant program at the National Human Genome Research Institute (NHGRI) in Bethesda, Maryland, agrees. “People are roundly encouraged and nervous,” Schloss says—encouraged because their own technologies are working, and nervous because their competitors' are too.

    A host of these novel sequencing technologies was on display last month at a meeting here.* Although no one at the meeting claimed to have cracked the $1000 genome sequence yet, researchers are getting more confident that it's a real possibility. “From what I've listened to the last few days, there is no physical principle that says we shouldn't be able to do a $1000 genome,” says Harvard University sequencing pioneer George Church.

    Even today, the declining cost of genome sequencing is triggering a flowering of basic research on broad-ranging topics such as how gene activation is regulated and how genes are linked to cancer. And as prices continue to drop, sequencing will revolutionize both the way biologists hunt for disease genes and the way medical professionals diagnose and treat diseases. In fact, some researchers say cheap sequencing technology could finally usher in personalized medicine in a major way. “The promise of cheap sequencing is in the understanding of disease and biology, such as cancer, where the genome changes over time,” says Dennis Gilbert, chief scientist of Applied Biosystems, the leading gene-sequencing-technology company, based in Foster City, California. “It will enable different kinds of science to be done.” Of course, as with other forms of high technology, that promise brings new risks as well. Researchers expect cheap sequencing to raise concerns about the proliferation of bioterrorism agents as well as patient privacy.

    Free fall.

    As with computer technology, the plunging cost of DNA sequencing has opened new applications in science and medicine.

    CREDIT: ADAPTED FROM GRAPH PROVIDED BY JEFFREY SCHLOSS/NHGRI

    The race is on

    The first group to produce a technology capable of sequencing a human genome for $1000 will get instant gratification, as well as potential future profits: In September 2003, the J. Craig Venter Science Foundation promised $500,000 for the achievement. That challenge has since been picked up by the Santa Monica, California-based X Prize Foundation, which is expected to up the ante to between $5 million and $20 million. But the competition really began in earnest in 2004, when the National Institutes of Health launched a $70 million grant program to support researchers working to sequence a complete mammal-sized genome initially for $100,000 and ultimately for $1000. That program has had an “amazing” effect on the field, encouraging researchers to pursue a wide variety of new ideas, says Church. That boost in turn has led to a miniexplosion of start-up companies, each pursuing its own angle on the technology (see table).

    All are racing to improve or replace a technology first developed by Fred Sanger of the U.K. Medical Research Council in the mid-1970s that is the basis of today's sequencing machines. The technique involves making multiple copies of the DNA to be sequenced, chopping it up into small pieces, and using those pieces as templates to synthesize short strands of DNA that will be exact complements of stretches of the original sequence. The synthesis essentially mimics the cell's processes for copying DNA.

    The technology relies on the use of modified versions of the four bases that make up DNA, each of which is tagged with a different fluorescent marker. A short DNA snippet called a primer initiates the synthesis at a specific point on the template DNA, and the altered bases—which are vastly outnumbered by normal bases in the mix of reagents used to perform the synthesis—stop the process when one of them is tacked onto the end of the growing DNA strand. The result is a soup of newly synthesized DNA fragments, each of which started at the same point but ends at a different base along the chain.

    Today's sequencers separate these fragments by passing the soup through tiny capillaries containing a gel; the shorter the fragment, the faster it moves through the gel. The process, known as capillary electrophoresis, is so effective that each fragment that emerges from the capillary is just one base longer than the one that preceded it. As each fragment emerges, it is hit by a laser, which causes the altered base at the fragment's tip to fluoresce. A computer records the identity of these bases and the sequence in which they appear. Eventually, the process generates billions of stretches of sequence that are fed into pattern-recognition software running on a supercomputer, which picks out overlaps and stitches the pieces together into a complete genome sequence.
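
    That final stitching step is, at its heart, an overlap-detection problem. The toy Python sketch below illustrates the idea with a greedy merge of the most-overlapping fragment pairs; it is an illustration only, with invented sequences, and production assemblers use far more sophisticated algorithms and error models:

        def overlap(a, b, min_len=3):
            """Length of the longest suffix of a that is a prefix of b."""
            for n in range(min(len(a), len(b)), min_len - 1, -1):
                if a.endswith(b[:n]):
                    return n
            return 0

        def assemble(fragments):
            """Repeatedly merge the pair of fragments with the largest overlap."""
            frags = list(fragments)
            while len(frags) > 1:
                best_n, best_i, best_j = 0, None, None
                for i, a in enumerate(frags):
                    for j, b in enumerate(frags):
                        if i != j and overlap(a, b) > best_n:
                            best_n, best_i, best_j = overlap(a, b), i, j
                if best_i is None:
                    break  # no usable overlaps remain
                merged = frags[best_i] + frags[best_j][best_n:]
                frags = [f for k, f in enumerate(frags)
                         if k not in (best_i, best_j)]
                frags.append(merged)
            return frags

        # Three overlapping reads reassemble into one contiguous sequence.
        print(assemble(["ATGGCGT", "GCGTACGT", "ACGTTTA"]))
        # prints: ['ATGGCGTACGTTTA']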

    A long list of refinements in capillary electrophoresis systems, coupled with increased automation and software improvements, has driven down the costs of sequencing 13-fold since these machines were introduced in the 1990s. Most of the new technologies aim to miniaturize, multiplex, and automate the process even further. They fall into three main camps. The first, called sequencing by synthesis, tracks bases as they are added to a growing DNA strand. Second is a group of techniques that sequence single DNA molecules. Finally, nanopore-sequencing technologies coax DNA to wriggle through a tiny pore and read the bases either electronically or optically as they go by.

    Sequencing-by-synthesis strategies have a head start. Indeed, one company, 454 Life Sciences Corp. in Branford, Connecticut, already has a commercial instrument; it sold 20 of them last year. The company's technique, called pyrosequencing, first chops a genome into stretches 300 to 500 base pairs long, unzips the double strands, discards one strand, and links the other to compounds tethered to a plastic bead—each bead gets just one strand. These snippets are then copied by the polymerase chain reaction (PCR) until the copies cover each bead. The beads are separated on a plate containing as many as 1.6 million wells and dosed with a series of sequencing reagents and nucleotides. Every time a nucleotide is tacked onto a growing DNA chain, the reaction triggers the release of a compound called pyrophosphate, which in turn prompts a firefly enzyme called luciferase in the well to give off a flash of light. By correlating the recorded flashes from each cell with the nucleotides present at the time, a computer tracks the systematic sequence growth of hundreds of thousands of DNA snippets simultaneously.
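
    The readout logic of pyrosequencing can be sketched in a few lines. In the hypothetical decoder below, nucleotides are assumed to be flowed in a fixed repeating order, and the intensity of each flash is treated as roughly proportional to the number of identical bases incorporated in that flow; the flow order and signal values are invented for illustration:

        FLOW_ORDER = "TACG"  # assumed flow cycle, for illustration only

        def decode_flows(intensities):
            """Convert per-flow light intensities into a base sequence."""
            seq = []
            for i, signal in enumerate(intensities):
                base = FLOW_ORDER[i % len(FLOW_ORDER)]
                count = round(signal)  # e.g., ~2.0 means a 2-base homopolymer
                seq.append(base * count)
            return "".join(seq)

        # Flows of T, A, C, G with intensities 1, 0, 2, 1 read as "TCCG".
        print(decode_flows([1.05, 0.02, 2.1, 0.9]))  # prints: TCCG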

    In August 2005, 454 Life Sciences researchers reported that they had sequenced the nearly 600,000-base genome of a bacterium known as Mycoplasma genitalium with an accuracy of 99.4%, as well as the larger 2.1-megabase genome of Streptococcus pneumoniae (Science, 5 August 2005, p. 862). At the Florida meeting, Michael Egholm, 454's vice president for molecular biology, reported that they had since sequenced four different microbial genomes, each with greater than 99.99% accuracy. “In a 6-month period, we have dramatically improved the data quality,” Egholm says. Higher accuracy is critical because two genomes being compared, such as those of normal cells and cancer cells, could differ in only one part per million.

    David Bentley, chief scientist for Solexa in Little Chesterford, U.K., also reported heady progress. Like 454's approach, Solexa's turns separate snippets into roughly 1000 exact copies. Instead of attaching individual DNA strands to a separate bead, Solexa researchers fix each strand to a different spot on a glass slide, much as they do in standard microarrays. They then duplicate those strands, creating myriad tiny DNA islands. Finally, in a step akin to Sanger sequencing, they use nucleotides with four different colors and standard microarray optics to simultaneously track the growth of strands complementary to those attached to the slide. Bentley reported that his team had sequenced a 162-kilobase stretch of human DNA and compared it to the standard reference sequence worked out by the Human Genome Project. Their sequencing turned out to be more than 99.99% accurate and spotted all 162 single-nucleotide polymorphisms (common single-base mutation sites) known to exist in that stretch of DNA.
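
    The comparison Bentley describes reduces, in the simplest case, to lining up the new sequence against the reference and flagging every mismatched position. A minimal sketch, with made-up sequences standing in for real reads:

        def find_snps(reference, sample):
            """Return (position, reference base, sample base) per mismatch."""
            assert len(reference) == len(sample), "aligned sequences required"
            return [(i, r, s)
                    for i, (r, s) in enumerate(zip(reference, sample))
                    if r != s]

        # One single-base difference, at position 3.
        print(find_snps("ACGTACGT", "ACGAACGT"))  # prints: [(3, 'T', 'A')]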

    Pore fit.

    In this computer simulation, DNA wriggles through a 1.5-nanometer pore in silicon. Such devices hold the promise of sequencing DNA electronically.

    CREDIT: IMAGE COURTESY OF ALEKSEI AKSIMENTIEV, KLAUS SCHULTEN, AND GREGORY TIMP/UNIVERSITY OF ILLINOIS, URBANA-CHAMPAIGN

    Church has developed a slightly different sequencing approach, part of which Harvard has licensed to Agencourt. In this approach, called sequencing by ligation, researchers start with a short snippet of DNA bound to a bead or a surface. They then add a short stretch of DNA called an anchor primer that binds a known starter sequence on the DNA snippet. Additional nine-base primers, known as query primers, are then added to the mix. These primers come in each possible sequence combination, and each has a labeled A, G, T, or C at just one position. If a short primer with a correct complementary sequence binds to the DNA, an enzyme called ligase stitches it to the anchor primer to hold it to the surface, and the other primers, which bind less tightly, are washed off. The mix is then hit with a blast of laser light to reveal the color of fluorescence that gives away the identity of the newly bound base. Finally, the query and anchor primers are stripped away, and another anchor primer is added as the first step to identifying the next base in the template strand. Agencourt's McKernan said their version of the technology could currently sequence some 200 million bases a day and may reach 3 billion a day by August.
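
    Conceptually, each ligation cycle interrogates one position of the template: whichever query primer hybridizes reveals, through its label, the base complementary to the template at that position. The sketch below stands in for that chemistry with a simple lookup; it is a conceptual illustration with an invented template, not the real decoding pipeline:

        COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

        def one_ligation_cycle(template, pos):
            """Base that a matching query primer's label would reveal at pos."""
            return COMPLEMENT[template[pos]]

        # Reading the first five positions, one cycle at a time, yields
        # the complement of "GATTA".
        template = "GATTACA"
        print("".join(one_ligation_cycle(template, i) for i in range(5)))
        # prints: CTAAT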

    Slow start, strong finish?

    Despite these advances, sequencing by synthesis has its drawbacks. One is that the techniques read relatively short DNA snippets—usually several hundred base pairs in length or less, compared with the 1000 or so in current capillary systems. That can make it hard to reassemble all the pieces into a continuous genome sequence. Another drawback is that they rely on PCR, which is expensive and can introduce copying errors. Greater experience with the new sequencing technologies may improve matters. 454's Egholm, for example, says his team has developed a prototype version of their technology that increases read lengths from 100 base pairs to 400. Several groups are developing ways to sequence a single copy of a long DNA strand, thereby achieving longer reads and avoiding PCR.

    One approach being pursued by VisiGen Biotechnologies in Houston, Texas, anchors a polymerase—the enzyme that tacks new nucleotides to a growing DNA chain—onto a surface and feeds it a template strand. As the polymerase then adds fluorescently labeled bases to a complementary strand, an advanced optical system detects the tiny flashes from the single molecule, allowing a continuous sequence to be read. A variation of this approach by LI-COR Biosciences in Lincoln, Nebraska, anchors single-stranded DNA and polymerase molecules to an electrode, and then uses an electric field to drive nucleotides linked to fluorescent nanoparticles in solution toward the polymerase. In the instant between the time when the polymerase latches onto the nucleotide and the time when it cleaves the nucleotide from the nanoparticle, the researchers reverse the electric field, driving away nucleotide-nanoparticle pairs not bound to the DNA. Then they snap a picture to see the color of the fluorescent particles still bound to the polymerase. Once the nucleotide is cut free, the nanoparticle drifts away, and the process is repeated to identify the next base. At the meeting, LI-COR's John Williams predicted that this technique could produce read lengths of up to 20,000 bases.

    Generation next.

    Companies racing for the $1000 genome sequence strive simultaneously for low cost, high accuracy, the ability to read long stretches of DNA, and high throughput.


    But another technology altogether may hold the most revolutionary potential, Church says. Called nanopore sequencing, this family of techniques aims to sequence DNA strands as they thread their way through tiny synthetic or natural pores, each just 1.5 nanometers or so across. Numerous groups are pursuing nanopore-sequencing techniques, but researchers acknowledge that they have far to go. “We're still learning about the science of nanopores,” Schloss says.

    No group has yet reported using such a setup to sequence DNA one base at a time. But in a series of papers beginning in 1996, researchers led by John Kasianowicz and Daniel Branton at the National Institute of Standards and Technology in Gaithersburg, Maryland, reported that they could use protein-based pores embedded in a lipid membrane first to detect snippets of DNA and then to differentiate snippets with all A's from those made up of C's. But because proteins and lipids are fragile, other groups have begun making their pores out of silicon and other electronic materials in hopes of developing a more robust technology that can also integrate transistors and other electronic devices. In most versions of nanopore technology, researchers use tiny transistors to control a current passing across the pore. As the four different DNA bases go through, they perturb that electric signal in different ways, causing the voltage to spike upward or drop in a way that identifies the nucleotide passing through.
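
    In software terms, base calling from such a device is a signal-classification problem: match each measured perturbation against the characteristic level of each base and pick the closest. The sketch below uses nearest-neighbor matching against invented current levels; real nanopore signals are far noisier and demand serious statistical machinery:

        # Hypothetical per-base blockade levels, invented for illustration.
        REFERENCE_LEVELS = {"A": 0.82, "C": 0.65, "G": 0.90, "T": 0.71}

        def call_bases(blockades):
            """Assign each measured blockade to the closest reference level."""
            return "".join(
                min(REFERENCE_LEVELS,
                    key=lambda b: abs(REFERENCE_LEVELS[b] - x))
                for x in blockades
            )

        print(call_bases([0.80, 0.66, 0.92, 0.70]))  # prints: ACGT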

    At the meeting, for example, chemist Gregory Timp of the University of Illinois, Urbana-Champaign, reported that his team has generated electrical readings of DNA moving through nanopores. Unfortunately, the DNA wriggled back and forth so much that the researchers had trouble teasing out the sequence of bases in the chain. But Timp says he and his colleagues are finishing a second-generation device that uses electric fields to keep the movement of the DNA under control. If it works, the technology promises to read long stretches of DNA without the need for expensive optical detectors.

    “We have to worry now”

    No matter which technology or technologies make it to market, the scientific consequences of lower sequencing costs are bound to be enormous. “I think it's going to have a profound impact on biology,” says Yale University molecular biologist Michael Snyder.

    Some early progress is already on display. At the Florida meeting, for example, 454's Egholm reported that he and his colleagues used their technology to identify as many as four genetic variants of HIV in single blood samples, in contrast to today's technology, which identifies just the dominant strain. The technique, Egholm says, could eventually help doctors see the rise of drug-resistant HIV strains in patients at the earliest stages. In another study, they quickly analyzed the sequence of non-small cell lung cancer cells and identified the specific mutations that give rise to drug resistance.

    In similar studies, Thomas Albert and colleagues at NimbleGen Systems, a biotechnology firm in Madison, Wisconsin, used their version of sequencing-by-synthesis technology to identify the mutations in Helicobacter pylori—the microbe responsible for ulcers—that cause resistance to a drug known as metronidazole, as well as the mutations in the tuberculosis-causing bacterium that trigger resistance to a new TB drug. The power of such studies is “unbelievable,” Snyder says, because they hold out the hope of enabling doctors to tailor medicines to battle diseases most effectively. Some personalized-treatment strategies are already in use: Herceptin, for example, is targeted to patients with a specific genetic form of breast cancer. But cheap sequencing should make them far more widespread, Church says.

    Basic researchers are looking at the early benefits of cheap sequencing as well. At the meeting, for example, Snyder talked about his team's use of gene chips to map the sites where transcription factors—proteins that control when genes are turned on—bind to the genome. The technology is effective, but gene chips are expensive. So Snyder is turning to cheap sequencing technology to rapidly sequence the millions of DNA fragments needed to pinpoint those binding sites. Church says he is also using cheap sequencing techniques to propel his group's synthetic-biology efforts to create an extensive tool kit of microbial “parts” that can be mixed and matched to enable microbes to perform new functions.

    Like most new technologies, ultracheap sequencing casts shadows as well. For starters, Church says, it's hard to imagine what privacy will mean once anyone with a sample of your DNA can determine who you are, your heritage, and what diseases you're likely to inherit. Celebrities and politicians may soon face a world hungry to scrutinize their genes. Among ordinary people, many analysts worry that insurers and employers will use genetic information to screen out those at high risk for disease. Finally, the same sequencing technology that could potentially help create beneficial new microbes, such as ones tailored to turn out large amounts of hydrogen gas to power fuel cells, could also make it easier to create new bioterrorist pathogens.

    “We have to worry about these issues now, because we will be sequencing with very high throughput in 10 years,” Timp says. Schloss notes that NHGRI has long supported research on ethical, legal, and social concerns. However, he adds, “it's very hard to do it in the abstract.” With technology advancing at a rapid clip, neither the benefits nor the concerns of ultracheap sequencing are likely to remain abstract for long.

    • *Advances in Genome Biology and Technology Conference, Marco Island, Florida, 8–11 February 2006.

  14. PATIENT PRIVACY

    Rule to Protect Records May Doom Long-Term Heart Study

    1. Jocelyn Kaiser

    Researchers are still grappling with how to conduct medical studies while complying with federal and state laws to keep patient data private

    For 25 years, heart disease researchers have tapped the medical records of more than 40,000 Minnesotans for findings on everything from sex differences in heart attack survival to the role of cholesterol-lowering drugs in saving lives. But the well may be drying up: State and federal privacy laws could make it impossible for epidemiologists at the University of Minnesota, Twin Cities, to continue to collect the hospital data they need.

    Barely beating.

    Patient privacy laws could shut down a heart study led by Minnesota epidemiologist Russell Luepker.

    CREDIT: UNIVERSITY OF MINNESOTA

    The problem stems from a federal privacy rule that took effect 3 years ago and that affects biomedical researchers around the country. The rule “still is slowing down or substantially discouraging researchers from certain studies,” says Susan Ehringhaus, associate general counsel for regulatory affairs of the Association of American Medical Colleges (AAMC). Prominent on that list is the Minnesota Heart Survey, which periodically reviews patient records from hospitals around Minneapolis to analyze factors in heart disease and stroke survival such as ethnicity, procedures, and medications. “They're leaders on this,” says epidemiologist Steven Shea of Columbia University. “We would lose a very important, very high-quality lens on what's going on over time.”

    In 1996, Congress passed the Health Insurance Portability and Accountability Act (HIPAA) to make it easier for people to retain or switch their health insurance coverage. In April 2003, the Department of Health and Human Services (HHS) began to implement one provision, called the Privacy Rule, that gives patients access to their medical records and restricts how health care providers use them (Science, 9 July 2004, p. 168). One key change from existing practices requires researchers outside the provider organization to obtain written consent from each patient in order to use the patient's records or, if that is impractical, to get a waiver from their institutional review board (IRB). Researchers can also receive a data set stripped of identifying information. The onus is on health care providers, who can be fined or jailed for violating the rules.

    A National Institutes of Health (NIH) spokesperson says most researchers have received waivers and managed to continue their studies. But the law continues to lead to delays, say some researchers, and a review of HIPAA in the February 2006 Annual Review of Medicine suggests that the higher costs—the government has estimated $600 million over 10 years—are causing researchers to revise or scrap some studies out of concern the work will become prohibitively expensive.

    The Minnesota Heart Survey is one example of a study that has been hit particularly hard by the double whammy of federal and state laws. Investigators need identifiers such as Social Security number and birth date to match the medical data with death records, says principal investigator Russell Luepker. Although hospitals once allowed his team to view patient files, he paid a research foundation affiliated with the hospitals to collect the data after Minnesota implemented a new privacy rule in 1997. Last summer, however, the foundation folded, and Luepker hasn't found a replacement.

    Luepker can't simply get an IRB waiver under HIPAA because Minnesota's privacy law requires each patient to give consent. The hospitals ask patients to sign a general consent for use of their records, but it's not easy to get written consent from a sick person admitted to a hospital for a heart attack or stroke, notes Luepker. Not everyone returns mailed consent forms, he adds, and some hospitals are even reluctant to send them.

    So Luepker has been talking to lawyers from each of the 22 hospitals to work out a way to obtain the identifiers even for patients who haven't signed a form. If that approach fails, Luepker says he won't apply for a renewal of two large NIH grants that expire in June. “I'm quite frankly very worried,” Luepker says about a situation first reported by The Minneapolis Star Tribune. “For us it's quite bad. This long-running study may stop running, and we're vain enough to think it's produced some very good information.” Luepker says a related telephone population survey will continue.

    Other studies face new limitations as institutions interpret their responsibilities under HIPAA. For example, at the University of Michigan, Ann Arbor, researchers who once recruited subjects for a survey of acute coronary disease care by telephone had to get their written permission first. In a 23 May 2005 paper in the Archives of Internal Medicine, Kim Eagle and others reported that consent rates dropped from 96% to 34% when they switched from phone calls to mail. Subjects also tended to be older, healthier, and married. Roberta Ness of the University of Pittsburgh in Pennsylvania says she must now rely on patients' doctors to recruit prospective patients for preeclampsia and cancer studies.

    Studies that pool data from many centers are also feeling the impact of HIPAA. A database of clinical data that an Alzheimer's disease consortium uses to develop better diagnostics and treatments has been delayed while contributing researchers obtain IRB waivers to record the ages of subjects over 90, says one investigator who asked not to be named. An international trial of a drug for brain injury was hamstrung by the refusal of many U.S. hospitals to divulge ages, the exact time of injury, and other data on patients screened for the trial, reported a Dutch team in the February 2006 issue of Intensive Care Medicine.

    Efforts to ease the load on researchers have so far been unsuccessful. For example, in 2004 a panel that advises the HHS Office for Human Research Protections recommended nine changes, including eliminating a requirement that hospitals account for each use of a patient's data for research; shortening the list of identifiers; and allowing patients in a study approved under the federal Common Rule, which protects human subjects, to authorize use of their data for future, unspecified research. “There is still a need to bring some sense to these regulations,” says former panelist attorney Mark Barnes of Ropes & Gray in New York City. AAMC has gone further, urging that any research already approved under the Common Rule should be exempt. The HHS Office of Civil Rights says its staff “continues to listen to the concerns of the research community” and is working with researchers “to enable important research to move forward.”

    Meanwhile, researchers are doing their best to get by. The University of Michigan's IRB, for example, eventually allowed Eagle's team to send prospective patients a letter saying their records could be part of the survey unless they mailed back a postcard to opt out. Only 5% have objected, says Eagle, and many “are delighted that we're doing the study.” New York University's Douglas Morse, who's had trouble finding patients for an oral cancer study in Puerto Rico through pathology labs, says that life under HIPAA is like coexisting with an infected toe. “You might be able to get around, … but the result might not be everything you hoped for.”

  15. RESEARCH FUNDING

    China Bets Big on Big Science

    1. Hao Xin,
    2. Gong Yidong*
    1. Gong Yidong writes for China Features in Beijing.

    For a few lucky research fields, a new government road map for science is like winning the lottery

    BEIJING—He Fuchu, a major general in the People's Liberation Army, is combat ready. “Advanced countries compete fiercely to control the high ground in protein research,” says He, using military jargon to describe his primary objective as director of the Beijing Protein Research Center. Now He, a vice president of the Chinese Academy of Military Medical Sciences, is about to get a substantial war chest to fund his center's research in proteomics, a big winner in China's new 15-year plan for science and technology (S&T).

    The long-awaited S&T plan, a set of marching orders handed down to scientists last month, may set the tone of science in China for years to come. It specifies 16 major engineering projects, including design of large aircraft, moon exploration, and drug development. Four major basic research programs are highlighted: protein science, topics in quantum physics, nanotechnology, and developmental and reproductive science. Although not stated in the plan, R&D spending by all sources, industry included, will rise from 236 billion yuan ($30 billion) in 2005 to 900 billion yuan ($113 billion) in 2020, Chinese officials announced last month. Basic research is slated to climb from 6% of R&D expenditure in 2004 to as much as 15% in 15 years.

    Ready for liftoff.

    A large share of China's R&D spending will be funneled to a favored few projects.

    SOURCE: CHINA'S STATE COUNCIL

    With government coffers flush, Chinese scientists had hoped the new plan would give a bigger boost for basic research. However, “basic science is still not playing a central role in the government's mind,” asserts Shing-Tung Yau, a mathematician at Harvard University. As in the past, scientific activity will be yoked tightly to economic development. “New scientific knowledge and inventions need to be industrialized and transformed,” says Lu Yongxiang, president of the Chinese Academy of Sciences (CAS). A buzzword permeating the document and on the lips of science officials is “innovation”: the key, the plan states, to reducing China's reliance on imported technology and intellectual property. Industry is expected to shoulder a heavier load than it currently does. For encouragement, the plan offers companies tax incentives to spend more on R&D.

    Although the details have not been filled in, the plan has been hailed as a noble attempt to reshape a landscape of patchy scientific talent into a cohesive community churning out innovations that rival the West's. The plan is “an important platform for China to transform from the largest developing country to a world powerhouse,” says Duan Yibing, a science policy expert at the CAS's Institute of Policy and Management.

    Others are hesitant to jump on the bandwagon. They worry that a heavy emphasis on applied science and megaprojects will stifle creativity. “The most innovative ideas come from very few creative scientists at rare moments, whereas planning of large-scale projects requires the consensus of many scientists,” says Yi Rao, a neurobiologist at Northwestern University in Evanston, Illinois, and deputy director for academic affairs of China's National Institute of Biological Sciences (NIBS). “It is unrealistic to expect very innovative science projects to come out of planning.”

    Muffled criticism

    Drafting the S&T plan was not straightforward. Twenty working groups involving 2000 scientists and officials wrangled over the document for close to 3 years, revising it a dozen times at a cost of $10 million. The buck stopped with Prime Minister Wen Jiabao, who chaired a ministerial committee overseeing the working groups. Since becoming China's prime minister in March 2003, Wen has made a “scientific approach to development” a theme of his administration, backed by steady increases in R&D funding. “I believe that Prime Minister Wen had the best intentions when he decided to increase funding and, at the same time, required scientists and engineers to come up with visionary plans on how to use the funds,” says Rao.

    It quickly became clear that Wen hoped to replicate the success of China's first S&T plan, a 1956 blueprint that led to the creation of scores of CAS institutes, produced the nation's first atomic and hydrogen bombs, and sent up its first satellite. Although the government never spelled out “two bombs and one satellite” as a goal, people associate these triumphs with the 1956 document, and Wen was determined to rekindle past glory by embracing large projects.

    Deliberations slowed, however, when some scientists openly questioned the new plan's emphasis on big programs. In the fall of 2004, as the working groups were putting the finishing touches on the plan, Nature published a compilation of essays, some sharply critical of elements of the plan and of the Ministry of Science and Technology (MOST), the lead agency for crafting and implementing it.

    In one essay, three prominent Chinese scientists—Rao; Bai Lu, a neuroscientist at the U.S. National Institutes of Health; and CAS biophysicist Chen-Lu Tsou—asserted that MOST's spending lacks transparency and gives bureaucrats too much power over scientists. The authors recommended stripping MOST of its budgetary authority and bolstering mechanisms for awarding peer-reviewed grants. In a second essay, Mu-ming Poo, a biologist at the University of California, Berkeley, and director of CAS's Institute of Neurosciences of the Shanghai Institutes for Biological Sciences, blasted waste and poor accountability, which he said are inevitable in big science projects. Chinese media devoured the broadsides.

    MOST complained to the General Administration of Press and Publication. The oversight body squelched the debate, banning distribution of Nature's China supplement and warning Chinese editors not to play into the hands of foreign forces. “What's most difficult for me to understand was their assertion that we were in cahoots with foreign publications,” says Liu Dun, editor-in-chief of Science and Culture Review, a small journal ordered to scrap plans to publish debates on China's S&T structural reform. Discussions of the S&T planning process were purged from Chinese media, and several critics were bounced from working groups.

    After more than a year's delay, the S&T plan emerged—with big science front and center.

    Supersized

    The four basic science programs deemed most strategic are areas in which China has already invested considerable sums. Each megaprogram is expected to receive about $1 billion over the next 15 years, says a researcher close to government planners. “There are surely more chances for innovation” in hot areas such as nanotechnology, says Xie Sishen, chief scientist at the National Center for Nanoscience and Technology (NCNST). The center was created in late 2003 by merging CAS's nanoscience center and research groups at Beijing University and Qinghua University. The move, some say, anticipated the high profile awarded by the new S&T plan.

    Dissenting voices.

    Megaprojects are not fertile ground for innovations, argues Yi Rao (top). Yigong Shi (bottom) worries that too few scientists will control the purse strings.

    CREDITS (TOP TO BOTTOM): ELIZABETH J. RAO; DENISE APPLEWHITE/PRINCETON UNIVERSITY

    The plan places NCNST and the Beijing Protein Research Center in the driver's seats of the megaprojects in their respective fields. That disturbs some observers. “I am resolutely against the system of one chief scientist” controlling tens of millions of dollars of research funds, says Yigong Shi, a molecular biologist at Princeton University. In August 2004, Shi and 10 other members of the Society of Chinese Bioscientists in America—a group of Chinese biologists working in the United States—wrote an open letter to Wen expressing concern about the big biology projects in the draft S&T plan. They claimed that such projects would fail to achieve their goals and would strangle competition.

    Features of the other two basic science megaprograms may make them more appealing to small teams. Scientists who helped shape the program on developmental and reproductive biology say they intend to establish a merit-based system to distribute funds. The program “probably will stimulate the interaction among genetics, developmental biology, and evolution, which is a very promising direction,” says Zhang Ya-ping, director of CAS's Kunming Institute of Zoology.

    Some critics worry that money will be wasted and that expensive new instruments will languish because there are too few skilled scientists to use them. “The number of basic-science scholars is far from satisfactory,” Yau says, despite government programs to entice talented expatriates and foreigners to work in China.

    Others see a strategic flaw: Enshrining narrow priorities in a 15-year plan could make it hard to change course in the future, warns Yau. “It is very bad to commit money [over a long term] to directions that are considered to be important now,” Yau says, noting that the plan ignores “many important areas”—including his own, mathematics. Indeed, some predict an exodus from disciplines not in vogue. “Scientists may shift their research focus to favored areas in the plan. If they don't, they can hardly get funding,” says Deng Xingwang, an agricultural biotechnologist and director of NIBS. Even the country's bastion of basic research funding, the National Natural Science Foundation of China, seems to toe the line. Although its budget is slated to increase by $50 million to between $400 million and $500 million this year, sources say, its 2006 handbook stresses “an integration of the national strategic need and the independent development of science.”

    Another worry is that big programs may be impervious to adequate oversight. Because almost everybody in a field in China will be involved in a big science project, nobody can objectively evaluate it, as Rao and his colleagues pointed out in their essay. Some have suggested bringing in expats to conduct reviews. “The government should establish a more open mechanism so that overseas Chinese scientists can take part,” says Shi.

    Duan says the critics will be proved wrong. “By catering to the national need, basic research will enjoy an opportunity for development by leaps and bounds,” he says. “There is still much room for the free exploration driven by curiosity.” Others see the plan as a multibillion-dollar gamble.
