News this Week

Science  02 Sep 2005:
Vol. 309, Issue 5740, pp. 1468


    Chimp Genome Catalogs Differences With Humans

    1. Elizabeth Culotta

    Anyone who has ever looked into the eyes of a chimpanzee has wondered what separates them from us. Now, in a raft of papers in this week's Nature and other journals, including Science (see pp. 1457, 1498, and 1499), international teams of researchers present a genetic answer to that question.

    Scientists produced a rough draft of the chimpanzee DNA sequence, aligned it with the human one, and made an intimate comparison of the chimp and human genomes. “It's wonderful to have the chimp genome,” says geneticist Mark Adams of Case Western Reserve University in Cleveland, Ohio, who was not an author on the papers. “It's the raw material … to figure out what makes us unique.”

    The papers confirm the astonishing molecular similarity between ourselves and chimpanzees. The average protein differs by only two amino acids, and 29% of proteins are identical. The work also reveals that a surprisingly large amount of genetic material—2.7% of the genomes—has been inserted or deleted since humans and chimps went their separate evolutionary ways 6 million years ago.

    But those hoping for an immediate answer to the question of human uniqueness will be disappointed. “We cannot see in this why we are phenotypically so different from the chimps,” says Svante Pääbo of the Max Planck Institute of Evolutionary Anthropology in Leipzig, Germany, a co-author on one Nature paper and leader of a study in Science comparing gene expression in chimps and humans. “Part of the secret is hidden in there, but we don't understand it yet.”

    Instead, the papers delve deeply into the genomic differences between us and our closest living relatives, revealing a flurry of relatively recent insertions and deletions in both human and chimp DNA, and mutational hotspots near the ends of chromosomes. “[A] genome is like the periodic table of the elements,” says Ajit Varki of the University of California, San Diego. “By itself it doesn't tell you how things work—it's the first step along a long road.”

    The researchers in the Chimpanzee Sequencing and Analysis Consortium deciphered DNA taken from an adult male named Clint; the draft sequence was announced but not formally published in 2003. Now the team, led by Robert Waterston of the University of Washington (UW), Seattle, confirms in Nature the oft-cited statistic that on average only 1.23% of nucleotide bases differ between chimps and humans.

    But as suggested by earlier work on portions of the chimp genome, other kinds of genomic variation turn out to be at least as important as single nucleotide base changes. Insertions and deletions have dramatically changed the landscape of the human and chimp lineages since they diverged. Duplications of sequence “contribute more genetic difference between the two species—70 megabases of material—than do single base pair substitutions,” notes Evan Eichler, also of UW, Seattle, who led a team analyzing the duplications. “It was a shocker, even to us.”

    The total genetic difference between humans and chimps, in terms of number of bases, sums to about 4% of the genome. That includes 35 million single base substitutions plus 5 million insertions or deletions (indels), says Waterston.

    All in the family.

    Genome data reveal a few surprising differences between chimps and humans but overall confirm our close kinship.


    Somewhere in that catalog of 40 million evolutionary events lie the changes that made us human. But where? In another Nature paper, a team led by Barbara Trask of UW, Seattle, and the Fred Hutchinson Cancer Research Center reports that almost half of the indels in the regions near the ends of chromosomes are unique to humans. Many of the insertions contain gene duplications, which in other organisms have fostered evolutionary novelty by allowing one copy of a gene to adapt to a new function without disrupting the original. “It'll be very exciting to see how many indels actually made a difference in our own evolution,” says David Haussler of the University of California, Santa Cruz.

    To narrow the number of genes that might have been favored in the primate lineage, Waterston's team searched for genes evolving more rapidly than the background rate of mutation. Among both human and chimp lineages, genes involved in ion transport, synaptic transmission, sound perception, and spermatogenesis stood out. The researchers also used the chimp data to identify 585 genes evolving more quickly in people, including genes involved in defense against malaria and tuberculosis. And they uncovered a handful of regions of the human genome that may have been favored in “selective sweeps” relatively recently in human history; one region contains the FOXP2 gene, proposed to be important in the evolution of speech.

    Overall, however, “the vast majority of changes between humans and chimps appear to be neutral, and there's no smoking gun on which are the important changes for making us human,” says Adams.

    One notable finding was that the fastest evolvers among human proteins are transcription factors, which regulate gene expression. Thirty years ago, Mary-Claire King and Allan Wilson proposed that altered gene regulation could solve the paradox of how a few genetic changes drove the wide anatomic and behavioral gulf between humans and chimps. “That's how you could get lots of morphological change without much nucleotide substitution. But there's been no evidence for it until now,” says Eichler. Given the chimp data, “people will rethink the regulatory hypothesis,” predicts Huntington Willard of Duke University in Durham, North Carolina.

    Another Nature paper addresses a controversy about whether the human Y chromosome will vanish within some 10 million years. Geneticist David Page of the Whitehead Institute in Cambridge, Massachusetts, and colleagues report the detailed sequence of the “X-degenerate” region of the chimp Y, which contains functional genes once paired with those on the X but now being slowly eroded by deleterious mutations. Page's team then compared human and chimp Ys to see whether either lineage has lost functional genes since they split.

    The researchers found that the chimp had indeed suffered the slings and arrows of evolutionary fortune. Of the 16 functional genes in this part of the human Y, chimps had lost the function of five due to mutations. In contrast, humans had all 11 functional genes also seen on the chimp Y. “The human Y chromosome hasn't lost a gene in 6 million years,” says Page. “It seems like the demise of the hypothesis of the demise of the Y,” says geneticist Andrew Clark of Cornell University in Ithaca, New York.

    Although the chimp genome should be a boon for biomedical studies, an accompanying Nature commentary by Varki and colleagues calls for moderation, using principles generally similar to those that guide human experimentation. The similarity of the two genomes underscores the importance of an ethical approach to our closest living cousins, says Waterston.


    Final NIH Rules Ease Stock Limits

    1. Jocelyn Kaiser

    The National Institutes of Health (NIH) in Bethesda, Maryland, has relaxed ethics rules issued 6 months ago that many feared would drive talent away from the agency. NIH Director Elias Zerhouni last week announced that the agency's final rules would no longer require all employees to limit their stock in biotech or drug companies. But NIH will retain a blanket ban on consulting for industry.

    The revised rules seem to please both NIH scientists and outside critics. “Dr. Zerhouni has done an admirable job addressing a difficult yet critical issue,” said House Energy and Commerce Committee chair Joe Barton (R-TX), whose committee held several hearings on the subject.

    The rules appear to end a controversy that has roiled NIH since late 2003, when the Los Angeles Times raised questions about several senior NIH researchers who had been paid large sums to consult for drug or biotech companies. NIH eventually found at least 44 cases in which researchers didn't receive proper ethics approval and nine possible criminal violations. To address the problem, Zerhouni issued interim ethics rules in February 2005 that banned all biomedical consulting—even for nonprofits—and limited all employees' ownership of drug company stock (Science, 11 February, p. 824).

    The interim rules outraged many NIH employees. Some senior intramural scientists cited the rules as a factor in their departure, one institute director threatened to leave, and a newly hired one delayed his arrival.

    After receiving 1300 mostly critical comments, NIH “decided to adjust in terms of degree,” Zerhouni told reporters. Stock limits will now apply only to about 200 senior staff, including directors and other top managers of NIH's 27 institutes and centers. By next February, these employees and their families must limit their stock to $15,000 in any one company “significantly involved” in biomedicine. Previously, this limit would have applied to 12,000 lower-level employees, and about 6000 senior staff would have had to divest all their drug company stock. Those senior staff and clinicians will now have to report their holdings for review.

    Tight reins.

    NIH Director Elias Zerhouni says final rules are “most restrictive” in the field.


    NIH will no longer ban work done for associations, such as serving as an officer of a scientific society. The final rules also allow compensation for reviewing scientific grants and for giving a single lecture—the interim rules exempted only entire courses—and make clear that approval is not needed for hobbies, such as coaching youth soccer.

    The NIH Assembly of Scientists' executive committee “is very pleased” by the changes, says member Cynthia Dunbar of the National Heart, Lung, and Blood Institute. “Morale should improve markedly,” she adds. Howard Garrison of the Federation of American Societies for Experimental Biology expressed relief that NIH scientists can maintain ties to professional associations.

    Dunbar says concerns remain that the industry consulting ban will harm recruitment and retention. Zerhouni says he decided to retain the ban after concluding NIH doesn't have “adequate systems” to prevent abuses. He added, however, that NIH intends to review the rule within a year. Although NIH scientists can still work with companies through cooperative agreements, some outside biomedical leaders suggest that's not enough: “It is also important to continue to seek ways to foster appropriate interactions with” industry researchers, says Phil Pizzo, dean of the Stanford University School of Medicine, who served on a 2004 NIH advisory panel that favored allowing some industry consulting.

    Not everyone thinks the final rules solve NIH's ethics problems. “There's a whole variety of things involving laundered money going to people whose views are favorable,” such as drug company-sponsored education courses, says Sidney Wolfe, of the Washington, D.C.-based watchdog group Public Citizen. But Zerhouni defended the new plan as “the most restrictive of any rules we know about in the world of biomedical research.” The final regulation was to take effect this week when it was published in the Federal Register.


    Germany Poised to Elect First Scientist-Chancellor

    1. Gretchen Vogel

    BERLIN—German opinion polls predict that the country will elect its first chancellor trained in the natural sciences later this month. A victory for the Christian Democratic Union (CDU) on 18 September over the ruling Social Democrats would mean a government led by Angela Merkel, who holds a Ph.D. in physical chemistry—a result that could produce significant changes for German scientists.

    Merkel has been a politician for longer than she worked as a scientist, and her training is seldom mentioned in a campaign dominated by economic issues. More is being made of two other milestones stemming from a CDU victory: the country's first female chancellor and the first from the former East Germany. But some scientists hope that Merkel's previous career, and the fact that her husband is a well-respected chemistry professor, might give them a sympathetic ear in the chancellery—and boost science's profile. “The first natural scientist as a chancellor would be a wonderful message for the country,” says biologist Hubert Markl of the University of Constance, former head of Germany's Max Planck Society and its DFG funding agency.

    Markl is quick to add that the current chancellor, Gerhard Schröder, is also “very pro-innovation,” and party politics is likely to play a larger role in shaping science policy than the next chancellor's Ph.D. For example, if Schröder pulls off a come-from-behind victory, scientists hoping to work with human embryonic stem (ES) cells could get a boost. Schröder has said that he would like the Bundestag to revisit the laws that ban research on embryos and allow scientists to import only those ES cell lines derived before 1 January 2002.

    The CDU provided much of the support for this legislation, and while Merkel has been quiet on the subject, several high-ranking party members have said that there would be no move to relax the law in a CDU-led government. That stance might be challenged, however, by the CDU's preferred coalition partners, the Free Democrats (FDP). Like Schröder, the FDP favors relaxed laws that would allow derivation of human ES cells and human nuclear transfer experiments.

    Quantum leap.

    Angela Merkel, who studied physics and quantum chemistry, is likely to be Germany's next chancellor.


    The potential coalition partners have fewer disagreements on two other hot scientific issues: nuclear power and genetically modified crops. Both the CDU and FDP support a relaxing of the current government's policy of phasing out all nuclear power plants by 2020. Some say this policy, pushed by the government coalition member Green Party, has made it difficult for nuclear physics departments in Germany to attract students.

    Both the CDU and FDP say they would relax restrictions on genetically modified crops. The Greens have supported tough curbs on the technology, pushing through a law that holds planters legally responsible for pollen that escapes and contaminates a neighboring field (Science, 25 June 2004, p. 1887). Scientists say that the measure effectively rules out all field research with genetically modified plants. Several politicians expect a more permissive law to be high on the agenda of a CDU-FDP coalition.

    All parties agree on the need to boost science funding, a step the Bundestag took this summer by passing a 5-year, $2.8 billion science spending package (Science, 1 July, p. 33). The CDU says that it wants to go even further, adding $1.25 billion over 4 years so the DFG can fund overhead costs.

    What voters are most concerned about, however, is whether Merkel can tackle the country's economic woes. At least some observers say her scientific training might be an advantage. Last month, the influential Süddeutsche Zeitung wrote that Merkel had demonstrated both meticulousness and tenacity in her 1986 dissertation on the calculation of rate constants in hydrocarbon decomposition reactions. Such qualities, the paper said, could be usefully applied to the equally complex problems facing Germany.


    Scientists Scramble to Curb Webb Overruns

    1. Andrew Lawler

    NASA plans to reduce the sensitivity of the successor to the Hubble Space Telescope to beat back rising costs that threaten to overwhelm the project. But despite a wealth of suggestions from scientists on how to cut costs, the agency likely will still face a shortfall of more than $500 million of the $3.5 billion needed to build, launch, and operate the James Webb Space Telescope (JWST).

    The trouble surfaced this spring, when agency officials found a $1 billion overrun in the project that they blamed on a host of technical issues (Science, 13 May, p. 935). A panel convened to examine the crisis last week recommended several ways to minimize the shortfall and avoid future cost increases. As a result of the cost-cutting measures, NASA science chief Mary Cleave has tentatively given the project a green light pending a final decision by NASA administrator Mike Griffin, telescope officials say. NASA is also projecting a 2-year slip in the launch, to 2013.

    The higher price tag could not come at a worse time for a science program choking on the costs of another space shuttle mission to Hubble, overruns in other science efforts, and the seemingly endless woes of the space shuttle program. Still, a 2001 report by the National Academy of Sciences labeled JWST the top priority for astronomy and astrophysics in the coming decade. NASA officials also remember well the uproar following the attempt by former NASA chief Sean O'Keefe to cancel a Hubble shuttle servicing mission—a decision Griffin reversed.

    Less polished?

    Making the segmented mirror less sensitive could reduce the costs of the James Webb Space Telescope.


    JWST scientists remain adamant that dramatic cuts to the size of the mirror or the major instruments are not an acceptable option. But the science panel, led by astronomer Peter Stockman and JWST scientist Mattias Mountain, both of Baltimore, Maryland's Space Telescope Science Institute (Mountain was recently named its director), did find significant savings in other areas.

    The mirror is designed to capture wavelengths from 0.6 to 28 microns. But thanks to advances in adaptive optics that can screen out perturbations in the Earth's atmosphere, the team agreed that the lower limit could be raised to 1.7 microns. That change would require one less cycle of polishing, at a savings of $150 million. Although the telescope would be less capable of observing at shorter wavelengths, future ground-based telescopes could compensate, says Eric Smith, JWST program scientist at NASA. That change disturbs some scientists, like Robert O'Dell of Vanderbilt University in Nashville, Tennessee. He says that degrading JWST's performance will make studies of nebulae and star formation now possible with Hubble more difficult.

    Relaxing stringent requirements designed to limit dust on the mirror could save a similar amount in test-related hardware, says Stockman. And the telescope likely would require one less testing cycle, knocking off another $100 million.

    Although the major instruments would not dramatically change, the team did recommend saving weight, mass, and support costs by dropping one portion of the Canadian fine guidance sensor designed to image at shorter wavelengths. A small French-made coronagraph could also be abandoned if necessary, the team said. NASA has yet to discuss those options with Canadian or French officials.

    The savings could total $430 million, Stockman estimates, and significantly reduce future risk, which also saves money. “I don't think we're going to find $1 billion,” adds Smith. “But hundreds of millions … is most welcome.”


    Base Commission Alters Pentagon's Wishes on Labs

    1. Jocelyn Kaiser,
    2. Eli Kintisch

    A federal panel tasked with restructuring U.S. military facilities delivered a mixed bag to researchers last week.

    The Armed Forces Institute of Pathology (AFIP) in Washington, D.C., got a reprieve from a recommendation to shut down most of its functions (Science, 20 May, p. 1101), and the Army's Night Vision Lab in Fort Belvoir, Virginia, fought off a move to Aberdeen Proving Ground in northeastern Maryland. But the Army signal processing research and electronics laboratories at Fort Monmouth, New Jersey, are headed to that site.

    The Base Realignment and Closure (BRAC) process, last completed in 1995, drew up a list of hundreds of closures and restructurings in the military's vast network of bases, labs, and offices. The commission voted last week, and its recommendations, which include major closings in Texas and Georgia, now go to the White House and then to Congress.

    The Defense Department recommended in May that the president “de-establish” the AFIP except for its museum and tissue repository. The College of American Pathologists and other groups lobbied the BRAC commission to save the $100 million a year, 820-staff pathology institute, arguing that the research staff was essential for roles such as helping prepare for bioterror attacks. The panel's decision that its functions “be absorbed” into other federal or civilian facilities is a “glimmer of good news, but the devil is in the details,” says former lab pathologist William Travis, who left AFIP this year for Memorial Sloan-Kettering Cancer Center in New York City.

    One question is whether AFIP will stay intact and move to a new building, he says. At least one piece is already splitting off: The chair of veterinary pathology announced last week that his department expects to move to an annex in Silver Spring, Maryland. A conference this week was to explore the future of its renowned 3 million-case tissue repository in light of the proposed breakup.

    Lobbying against the move to Maryland, former Fort Monmouth research director Robert Giordano cited a poll that showed that only 20% of the 5000 technical civilian staff would follow the lab. The resulting “brain drain,” he warned, would decimate crucial military research positions. Similar arguments were made against the move of the night vision lab, which conducts work in lasers, radar, and infrared light.


    NIH, Chemical Society Look for Common Ground

    1. Jocelyn Kaiser

    U.S. government officials and a scientific society are batting ideas back and forth on how to keep a new federal chemical database from overlapping with an existing private one. So far, they are still searching for common ground.

    A dispute between the National Institutes of Health (NIH) and the American Chemical Society (ACS) broke out after NIH's National Center for Biotechnology Information (NCBI) last fall launched PubChem, a database of small molecules with potential use as biological probes or as drugs, including data from a new screening initiative. ACS complained to NIH and Congress that PubChem's listing of chemical structures, though modest in size so far, duplicated its Chemical Abstracts Service (CAS) Registry, a massive, subscription-only chemical database that is a critical source of income for the society. Earlier this year, after discussing whether NIH should scale back the scope of PubChem, the House and later the Senate instead asked NIH to “work with private sector providers to avoid unnecessary duplication and competition” (Science, 17 June, p. 1729).

    In early August, ACS president William Carroll made NIH an offer: The society would donate $10 million and up to 15 staff members over 5 years to build NIH a free database of chemicals with attached bioassay data. NIH expressed many concerns about the proposal, however, in a four-page letter to Carroll from NIH Director Elias Zerhouni.

    Paper trail.

    NIH's Elias Zerhouni countered the American Chemical Society's offer to build NIH a chemical database.


    In the 22 August letter, Zerhouni notes that NIH wants to integrate PubChem with other public biomedical databases, which NCBI staff—not a chemistry organization—“are in an ideal and unique position” to do. NIH is also concerned about which molecules ACS would include, arguing that the database cannot be limited to compounds with biological data because such bioactivity may remain to be discovered. In addition, Zerhouni explains, the plan would violate federal rules requiring that any such agreement be open to bidding from other companies.

    Zerhouni offered a six-part “alternative structure” that would avoid overlap between PubChem and CAS but strengthen the ties between the two databases. Among those changes, NIH would pay ACS to make sure PubChem entries contain the same numbers that CAS uses to register each molecule to “maximize the interactiveness” of the two databases. NIH would agree not to include nonbiomedical information that CAS now offers, such as chemical reactions and patents. NIH also wants to set up a working group, with chemical database companies as members, that would offer NCBI advice on how to run PubChem.

    The letter says NIH is open to developing a “retrospective process” for removing chemicals from PubChem that are deemed of no use for biomedical research. NIH officials have noted in the past that it would be very hard to rule out any chemicals. For example, ACS initially claimed that an explosive called HDX should not be included in PubChem, but an NCBI official pointed out that the National Cancer Institute has found that HDX has activity in antitumor assays.

    Both sides say they are committed to finding a compromise. In a 23 August letter to ACS members, Carroll says the society is “studying” this proposal but maintains that NIH should “[take] advantage of the CAS Registry.” ACS spokesperson Nancy Blount said that a national ACS meeting in Washington, D.C., earlier this week prevented society officials from speaking with Science, but that ACS will “continue to have the best interest of science in mind.” Likewise, NIH spokesperson John Burklow says that “we are hopeful our proposals will resolve the issues.”


    Microbiologist Resigns After Pitch for Antianthrax Product

    1. Martin Enserink

    A scientist's enthusiastic endorsement of a skin lotion against anthrax has ended his career at the University of Texas Medical Branch (UTMB) at Galveston. On 17 August, John Heggers, a microbiologist and plastic surgeon specializing in burn treatment, resigned after the university's Scientific Integrity Committee (SIC) concluded he engaged in “egregious” misconduct by making “false and excessive statements” about the purported antianthrax lotion, a blend of citrus oils, plant herbs, and seed bitters that sells at $179 for half a liter.

    On 1 February 2005, the report says, Heggers wrote a letter on UTMB letterhead to Bio-Germ, the Dallas company that produces the lotion, in which he said his research had demonstrated the product's efficacy and safety; “we believe it will be successful against Smallpox, the Plague, and other pathogens possibly used by terrorists” as well, he wrote, adding that the lotion “should be rolled out to our Nation's First Responders, Military and, as soon as possible, to the citizenry of our Country.” Bio-Germ posted the letter on its Web site, according to the 29 June university investigation, along with a videotaped interview in which Heggers made similar statements.

    All you need.

    Bio-Germ says its $249 Protection Kit, which includes antianthrax lotion (also sold separately), provides “an effective shield against infection from anthrax.”


    Heggers, 72, has been at UTMB for 17 years and had a co-appointment at the Shriners Burns Hospital; he was not involved in UTMB's sizable federally funded biodefense program. He did carry out one anthrax study, but the committee says it did not support his claims. In a 2004 paper in the online Journal of Burns and Wounds, Heggers described tests of several topical antibacterials, “nutriceuticals,” and herbal products against strains of Bacillus anthracis, which causes anthrax. The paper claims that the Bio-Germ lotion and many other products killed the microbes, but the result is irrelevant, the panel says, because the tests used vegetative B. anthracis growing in a petri dish, not the spore form used in weaponized anthrax. Heggers has no data on plague and smallpox, according to the panel, which calls his recommendation for mass deployment of the lotion “utterly irresponsible scientifically.”

    The SIC says Bio-Germ paid Heggers's expenses to attend several meetings about homeland security but no honoraria. Two of Heggers's co-authors, Johnny Peterson and Ashok Chopra, say they did not see the manuscript of the paper in the Journal of Burns and Wounds before it was posted and found its conclusions “misleading.” The paper was removed from the journal's Web site in early June, they say.

    Heggers could not be reached for comment. But in a 15 July letter to UTMB President John Stobo, Heggers claimed that the university had been “intimidated” by the Dallas Morning News, which first reported the story, and that the panel was not qualified to judge him. In his 17 August resignation letter to Stobo (copies of both letters were made available to Science), Heggers acknowledged “several misstatements.” In an e-mail to a News reporter that he attached to the resignation letter, Heggers said, “on reflection, I think my hope and enthusiasm outran my scientific caution.”

    At UTMB's request, Heggers's testimonials have been removed from Bio-Germ's Web site. “It's an embarrassment,” says David Walker, executive director of UTMB's Center of Biodefense and Emerging Infectious Diseases.


    Homeland Security Ponders Future of Its Animal Disease Fortress

    1. Martin Enserink

    The Alcatraz of animal diseases may come ashore. Last week, the U.S. Department of Homeland Security (DHS) announced that the Plum Island Animal Disease Center (PIADC)—which studies the most devastating agricultural diseases on a tiny speck in the Atlantic off Long Island, New York—will be replaced by a new facility that may be located elsewhere. The state's politicians, who oppose expanding the lab's remit but don't want it to close, immediately blasted the proposal. But some scientists say they would welcome leaving the remote, impractical location.

    DHS took over responsibility for Plum Island from the U.S. Department of Agriculture in 2002. In a fact sheet issued last week, the department said the 50-year-old lab is “nearing the end of its lifecycle” and will be replaced by a new National Bio and Agro-defense Facility (NBAF) with a stronger focus on bioterrorism. DHS is launching a study to determine the facility's mission, its preferred location, and whether it needs a biosafety level 4 (BSL-4) lab, the highest level of biological containment. The study should be completed by 2006, and the facility could open in 2011.

    Few contest that the dilapidated complex at Plum Island needs an extreme makeover. But adding a BSL-4 facility, or moving it, is controversial. Because most of the diseases studied there—such as foot-and-mouth disease and classical swine fever—don't infect humans, the lab operates at BSL-3 plus, which resembles BSL-4 except that researchers don't wear space suits. Scientists have long argued that the U.S. needs a BSL-4 facility for agricultural diseases to allow the study of agents, such as the Nipah and Hendra viruses, that sicken farm animals as well as humans.

    But Long Island residents and local politicians fear an escape of the deadly viruses and have resisted those plans (Science, 26 May 2000, p. 1320). In 2003, former DHS secretary Tom Ridge assured Sen. Hillary Clinton (D-NY) and Rep. Timothy Bishop (D-NY) that no BSL-4 would be built on Plum Island—a promise DHS says it will honor. Clinton and Bishop want the facilities upgraded, they wrote “in distress” to DHS secretary Michael Chertoff last week, not moved off the island.

    A DHS spokesman says that all options are still on the table—including building a new lab without a BSL-4 on Plum Island. But Harley Moon, an emeritus professor at Iowa State University in Ames who directed Plum Island in the mid-1990s, says moving the lab ashore would be the best option for several reasons. Operating the lab on an island is expensive, he says, the researchers are “intellectually isolated,” and Long Island's high cost of living hinders recruitments. Moon suggests moving it to an agricultural research center, such as those in Georgia, Colorado, or Iowa, where “the community and the policy makers understand the importance of the lab's mission.”


    An Earlier Look At Baby's Genes

    1. Jocelyn Kaiser

    The increasing ability to analyze fetal DNA from maternal blood should lead to better prenatal diagnoses of genetic disease—and confront future parents with tough information and choices

    The smiling, dark-haired woman chatting with Katie Couric on NBC's popular Today show explains why she wants to know the sex of her third baby just 7 weeks into her pregnancy. Holly Osburn of Glastonbury, Connecticut, the mother of two daughters, says her house is full of pink, purple, and green, and “we're anxious to find out if we're going to … maybe have to paint the nursery blue.”

    So Osburn has sent dried spots of her blood to a Massachusetts company offering Baby Gender Mentor, a new $275 test that promises to detect a fetus's sex from maternal blood as early as 5 weeks after conception. After Couric conducts a discussion with a physician about the pros and cons of the test, a spokesperson for a company selling it online delivers the big news live to millions of viewers: It's a girl! Osburn's smile wavers. “Another one,” she says. Then she regains her composure, assuring the TV audience that “a third is great.”

    While watching this in June, “my jaw dropped,” says Diana Bianchi, a prenatal geneticist at Tufts University School of Medicine in Boston and one of a small number of researchers who have spent more than a decade trying to detect sex and genetic disorders from fetal cells and DNA in a mother's blood. She notes that “at home” fetal DNA tests such as Baby Gender Mentor aren't yet considered scientifically and ethically vetted. “I'm concerned about whether this is ready for prime time,” says Bianchi.

    Ready or not, noninvasive fetal diagnosis is here. Tests based on fetal DNA circulating in a woman's blood are expected to replace invasive prenatal tests, such as amniocentesis, that are typically done later in pregnancy and pose a small risk of miscarriage. Researchers have already used fetal DNA from maternal blood to successfully test for genes inherited from a father that cause diseases such as cystic fibrosis and the blood disorder thalassemia. They are now refining their techniques and moving on to bigger challenges, such as identifying Down syndrome. If this work pans out, fetal genetic testing could be as cheap and routine as many other diagnostic tests, such as ones for HIV, says molecular biologist Sinuhe Hahn of the University Women's Hospital in Basel, Switzerland.

    Earlier and easier fetal DNA testing will certainly raise ethical questions. For example, some researchers worry that gender tests will lead to abortions by parents who desire a baby of a specific gender. The ethically explosive applications extend beyond sex selection. If fetal DNA testing can one day routinely reveal whether an early fetus has genes that predispose it to cancer or other diseases, parents-to-be could be facing much more difficult decisions than what color to paint the nursery.

    For now, researchers are grappling with how to get a clear, consistent signal from relatively few molecules of fetal DNA floating in a sea of maternal DNA. When a diagnosis could lead parents to end a pregnancy, they note, accuracy is crucial. “It's very important that we get it right,” says medical geneticist Maj Hulten of the University of Warwick, U.K.

    Broadcast news.

    Through a new test (inset), expectant mother Holly Osburn, along with Katie Couric and Today viewers, learned the apparent gender of her 7-week-old fetus.


    One in a million

    Researchers have known for more than 3 decades that a few fetal cells of various types are present in a pregnant woman's blood. While there may be only about two to six fetal cells per milliliter of blood during pregnancy, some of these cells can linger for several decades after birth and may even contribute to postnatal tissue repair or disease in the mother (Science, 21 June 2002, p. 2169). The first proof that such cells could be used to diagnose a fetal condition came in 1991 from Joe Leigh Simpson's lab at Baylor College of Medicine in Houston, Texas. Using an antibody against CD71, a surface marker that tends to flag red blood cells of fetal origin, his team separated these cells from most maternal blood cells. They then used fluorescence in situ hybridization (FISH), in which colored probes bind to chromosomes, to detect Down syndrome, which is caused by an extra chromosome 21, and another chromosomal disorder.

    Other labs soon reported similar results, exciting researchers who saw the technique as a promising alternative to amniocentesis and chorionic villus sampling (CVS). These diagnostic tests, which collect fetal cells by inserting a needle into the womb either late in the first trimester or during the second trimester, carry up to a 1% risk of miscarriage. In 1994, the National Institute of Child Health and Human Development (NICHD) launched a validation study in which five labs used fetal cells from maternal blood to look for Down syndrome in 2744 pregnancies. The results, published in 2002, were just modestly encouraging: The researchers found only enough fetal cells to detect 74% of Down syndrome cases. In contrast, CVS and amniocentesis are 99% accurate.

    The authors of the NICHD study concluded that the current techniques—which involve physically separating the fetal and maternal cells—would have to improve before blood-borne fetal cells could provide reliable diagnoses. The key will be an antibody or other compound that can more efficiently separate out the fetal cells, which make up only about one out of every million cells in a mother's blood, says Simpson. “Once that occurs, the field will turn around overnight,” he says.

    A few teams, including Simpson's at Baylor, and at least two companies are also pursuing an alternative approach, attempting to isolate fetal cells, called trophoblasts, from cervical swabs of pregnant women. The trophoblasts make up about 1 in 100,000 cells in a swab, and so should be easier to distinguish from maternal cells than fetal blood cells, says Farideh Bischoff of Simpson's group. Yet to be proved is whether researchers can extract enough cells without sampling so high in a woman's cervix that the technique becomes invasive, Bianchi notes.

    Oh, boy.

    Red marks the Y chromosome in a male fetal cell amid maternal blood cells.


    Free and easy

    Noninvasive fetal testing took off in a new direction several years ago after Dennis Lo, now at the Chinese University of Hong Kong, and co-workers discovered that maternal blood contains more than fetal cells. There's also fetal DNA floating freely, outside of cells, he found. Lo was inspired to look by two 1996 Nature Medicine articles on detecting tumor DNA in the blood of cancer patients. He reasoned that like a tumor, the fetus-derived placenta is a fast-growing tissue that might shed DNA.

    The hunch paid off: Using a form of polymerase chain reaction (PCR) to detect a gene called SRY on the Y chromosome of male fetuses, Lo's group reported in 1998 that fetal DNA is much more plentiful in a future mom's bloodstream than are fetal cells. Levels rise during pregnancy to as much as 3% to 6% of the cell-free DNA in a mother's plasma, then plummet within 2 hours of birth. The fetal DNA seems to come mainly from the placenta, Bianchi and others have shown.

    Lo's group soon showed that this fetal DNA could be used to diagnose potentially lethal conflicts in Rh factor, a protein on the surface of red blood cells. If an Rh-negative woman carries an Rh-positive fetus, her immune system can create antibodies against the baby's blood cells, causing anemia for the fetus. This sensitization can be prevented by injecting the pregnant mother at certain points in pregnancy with Rh immunoglobulin, a step often taken as a precaution without knowing the fetus's Rh status. But many research groups have now shown they can reliably test the blood of Rh-negative pregnant women for fetal DNA that reveals the functional form of the Rh gene. Such a test has been offered since 2001 by a few research labs in Europe.

    Several groups have since reported they can detect other disease mutations passed on from the father, such as ones causing cystic fibrosis, beta-thalassemia, a type of dwarfism, and Huntington's disease. The results haven't always been reproducible, partly because smaller mutations are difficult to pick up from a mixture of fetal and maternal DNA. Other promising findings are still being debated. Lo's group reported in 2000 that intact fetal DNA in fragments of dying cells could be analyzed for Down syndrome, and last year a biotech company claimed that treating maternal blood with formaldehyde could boost the amount of fetal DNA recovered. Only some labs have been able to replicate these experiments.

    Two advances in the past year have clearly boosted the potential reliability of fetal DNA tests, however. Both involved studies looking for mutations that trigger beta-thalassemia, which leads to severe anemia and is most common in people of Asian and Mediterranean descent. Last summer, a report in the Proceedings of the National Academy of Sciences by Lo's team and the San Diego-based firm Sequenom Inc. said that inherited beta-thalassemia point mutations could be diagnosed in 12 fetuses much more reliably if mass spectrometry and PCR, rather than PCR alone, were used to analyze the fetal DNA.

    Earlier this year in the Journal of the American Medical Association, Hahn's team in Basel reported another approach for detecting beta-thalassemia mutations caused by a single nucleotide change. The group took advantage of a finding by Lo's group that the fragments of fetal DNA found in the mother's blood are typically less than 300 base pairs in size, compared with more than 500 base pairs for cell-free maternal DNA. By using electrophoresis to increase the ratio of the shorter segments in blood samples, the Swiss team successfully detected the presence or absence of four common beta-thalassemia point mutations in 28 of 31 fetuses. While the mass spectrometer needed for the Sequenom-Lo method costs $300,000, the Swiss team notes that its approach could cost as little as $8 per sample, within the economic reach of developing countries.
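
    The logic of this size-selection step can be sketched in a toy simulation (all numbers below are illustrative assumptions, not data from the Basel study): if fetal fragments cluster near 200 base pairs and maternal fragments above 500, simply discarding everything longer than 300 base pairs sharply raises the fetal share of what remains.

```python
import random

random.seed(0)

# Hypothetical model of cell-free DNA in maternal plasma:
# ~5% of fragments are fetal and short (~200 bp); the rest
# are maternal and long (~550 bp). Lengths drawn from
# normal distributions for illustration only.
def make_sample(n=10000, fetal_frac=0.05):
    fragments = []
    for _ in range(n):
        if random.random() < fetal_frac:
            fragments.append(("fetal", random.gauss(200, 40)))
        else:
            fragments.append(("maternal", random.gauss(550, 80)))
    return fragments

def fetal_fraction(fragments):
    fetal = sum(1 for origin, _ in fragments if origin == "fetal")
    return fetal / len(fragments)

sample = make_sample()

# Size selection, mimicking electrophoretic enrichment:
# keep only fragments shorter than 300 bp.
enriched = [(o, length) for o, length in sample if length < 300]

print(f"fetal fraction before enrichment: {fetal_fraction(sample):.1%}")
print(f"fetal fraction after enrichment:  {fetal_fraction(enriched):.1%}")
```

    Under these assumed distributions, almost no maternal fragments survive the 300-bp cutoff while nearly all fetal ones do, so the fetal fraction jumps from a few percent to a large majority—which is what makes downstream mutation detection tractable.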

    Several teams are now racing to try these techniques—or combine them—to reliably detect cystic fibrosis and other genetic diseases, says Hahn. “They will open up a lot of new applications,” Lo agrees.

    One major caveat is that the studies so far have only been able to detect mutations passed on by the father. Because there's not yet a way to completely separate fetal DNA from the maternal DNA in a woman's blood, it's not possible to tell if a mutation possessed by the mother has been inherited by the fetus or if researchers are just seeing the mother's DNA. One possible solution may be an epigenetic marker, such as methyl groups attached to a gene, that distinguishes fetal DNA from a mother's. Lo's group showed in 2002 that they could make such a distinction. Another potential strategy is to use messenger RNA molecules produced only by the fetus and not the mother. Several groups have recently shown, for example, that RNA produced by placental genes can be detected in maternal blood.

    Detective squad.

    Dennis Lo (center) and his group at Chinese University of Hong Kong have pioneered noninvasive prenatal testing using cell-free fetal DNA.


    Seeing double

    Diagnosing Down syndrome noninvasively through fetal DNA is the big prize luring researchers. The potential demand for such a test is huge, says Boston University's Charles Cantor, chief scientific officer of Sequenom Inc., because the rate of Down syndrome is at least 1 in 270 for mothers over 35. Doctors can screen for the disorder in the first trimester by using ultrasound to measure the dimensions of the fetus's neck and checking the levels of several protein markers in maternal blood; this combination picks up 85% of cases, albeit with a false positive rate of 2% to 6%. The International Down Syndrome Screening group last year called for this noninvasive strategy to be offered to all women, but a firm diagnosis still requires subsequent amniocentesis or CVS. At $1000 or more, the cost of those two tests limits routine use to women over 35, which means most Down syndrome births now occur in younger women.

    Yet while Down syndrome is easy to detect if fetal cells are in hand, it's harder using cell-free DNA. The reason is that this condition is caused by an extra chromosome, rather than a mutation that can be detected with PCR. So far, for Down syndrome, fetal DNA can be used to only slightly improve screening: Overall fetal DNA levels are higher in women carrying fetuses with Down syndrome and some other aneuploidies. Adding a fetal DNA quantity test to other serum markers for Down syndrome would boost the detection rate from 81% to 85%, Bianchi's group has shown.

    Still, the real prize is a straightforward, noninvasive fetal DNA diagnostic for Down syndrome that's as accurate as amniocentesis and CVS. One possible solution is to discover an epigenetic marker for Down syndrome that would allow Down-specific DNA sequences to be amplified with PCR. Another is to look for fetal mRNA from a gene expressed by chromosome 21 but not by the mother's cells. Cantor estimates that two dozen groups are working on the problem and predicts it will be solved in 3 years.

    Baby signs.

    Cell-free fetal DNA levels rise during pregnancy, as shown in three future moms.


    Ethical minefield

    Indeed, while research on noninvasive fetal testing is very competitive—Lo and other investigators have certainly applied for many patents—cooperation is common. Cross-lab studies like the one sponsored by NICHD have nurtured the field, and they are continuing thanks to a new 5-year, €12 million European Union project called Special Advances in Fetal Evaluation (SAFE) that involves 52 institutional partners. “I think this is a positive example of a new technology being rigorously investigated before it filters into practice,” says gynecologist Wolfgang Holzgreve of the Basel group.

    The need for caution makes some scientists uncomfortable with Baby Gender Mentor. The company offering the test, Acu-Gen Biolabs in Lowell, Massachusetts, claims it works at 5 weeks of gestation at 99.9% accuracy. But Bianchi questions that figure, noting that a cross-lab study of gender detection published last year found that sensitivity varied widely among labs. A company spokesperson says the 99.9% figure is based on 20,000 births but notes that the company won't publish results until it has patented its technology.

    There's little chance for outside experts to scrutinize that accuracy claim. Food and Drug Administration approval is not needed as long as the blood sample goes to a lab and the test is sold as a service rather than as a kit. Like other genetic tests, “[this] is opening up gaps in the oversight system,” says Kathy Hudson, director of the Genetics and Public Policy Center at Johns Hopkins University in Baltimore, Maryland. It's not just the U.S. that does not regulate such testing. A Canadian company called Paragon Genetics has been offering a fetal DNA gender test for more than 2 years. The firm's quiet marketing of it hasn't drawn as much criticism as Baby Gender Mentor, in part because it follows the practice of many fetal DNA researchers by using fresh maternal blood, instead of dried blood spots. It also suggests that samples be taken 10 weeks into pregnancy.

    As for concerns that some couples could use fetal DNA gender tests to end a pregnancy, Paragon Genetics lab director Yuri Melekhovets argues that parents can already do that based on ultrasound tests early in the second trimester. Still, Lo's group has gone so far as to stipulate in licensing agreements with companies that its technology can't be used for sex selection. The SAFE project, meanwhile, is funding a study of the implications of using early fetal DNA testing, especially if costs fall enough to make it feasible for couples in countries such as India and China where female children may be viewed as less desirable. “Especially in 'one child' countries, there is a risk that this [test] can be abused,” says Hulten, the SAFE project's coordinator.

    Another troubling ethical issue for some is how abortion rates could be affected by the advent of widespread, accurate fetal DNA testing for many genetic diseases. Although abortions may increase, Bianchi points out that mothers who keep a child with a disease could also benefit from the prenatal diagnosis. A survey by her group found that mothers who went to term after learning that they were carrying a fetus with Down syndrome were better able to cope psychologically once the child was born than mothers who learned of their baby's disorder at birth. Based on that finding, if fetal DNA testing fully comes of age, it may provide many potential parents with news that's difficult to hear, but it could also give them time to decide what's right for them and accept their decision.


    Laser Facility Faces Burning Questions Over Cost, Technology

    1. Eli Kintisch

    The National Ignition Facility is the world's biggest laser. So maybe it's no surprise that sparks are flying over its fate

    Nobody ever said recreating a thermonuclear explosion in a laboratory was going to be easy. But this year, the Department of Energy's (DOE) long-troubled National Ignition Facility (NIF) has suffered a series of political and fiscal blows that threaten to sink the stadium-sized laser. With doubts recently raised about the project's technical progress, Congress must decide whether to give DOE enough money to continue building the $3.5 billion facility, housed at Lawrence Livermore National Laboratory in California.

    NIF is the linchpin of DOE's $5.5-billion-a-year stockpile stewardship research program—an effort to use science to assure the effectiveness of existing nuclear weapons without actually testing them. Its 192 lasers are designed to heat a pea-sized capsule of heavy hydrogen to 100 million degrees to achieve a fusion reaction simulating the guts of a nuclear warhead. Although the project has spent 80% of its estimated budget, only eight of the 192 lasers are operational, and a recent report by the Jasons, an elite outside group, says there is “substantial technical risk” of not achieving fusion ignition by DOE's stated 2010 goal.

    The Bush Administration has requested another $337 million for 2006, an amount that the House of Representatives endorsed before the Senate slashed it to $103 million in June. While the lower funding level would effectively kill the project, NIF project head Ed Moses is optimistic. “We've been through this before, and we'll get through it again,” he says.

    Approved in 1993, NIF has never had a smooth ride. The nuclear weapons community has long been divided on its usefulness for assessing U.S. nukes, and the project, initially pegged at $2 billion, is 3 years past its original planned completion date. In 2000, management scandals led to a reshuffling. Last year, after DOE pushed the target ignition date back to 2014, Senator Pete Domenici (R-NM), a recurrent foe and chair of the spending panel that controls DOE's budget, threatened to “do everything in my power” to force the lab to confront “pressing technical issues.” Lab officials subsequently altered their schedule to achieve ignition by 2010.

    Still, a string of review boards have blessed its mission. Just last year, for example, a review by the Defense Science Board concluded that “[L]aser performance parameters … have been demonstrated.” And Livermore Director Michael Anastasio says, “I believe the project is going well.”

    Bright idea.

    Energy Secretary Samuel Bodman (right) tours NIF with Livermore's Ed Moses.


    The Jasons' report, delivered in June, takes another look at the lasers, which have been used for 400 shots, as part of its inquiry into whether NIF will be ready by 2010. NIF scientists point to successful results separately using the three kinds of so-called “beam smoothing” devices, during which proper focus and laser spot-size were achieved. But the report urges the project to demonstrate that the three techniques can work simultaneously, at full energy and high frequency.

    Arrayed in a 10-meter sphere, the activated lasers turn the target into a plasma, which can reflect some laser light in a phenomenon known as backscattering. The Jasons' report says the problem is “a serious potential risk” and calls for more study, including real tests of the system. But NIF doesn't plan to do those tests, since managers have halted further laser-target experiments until all 192 lasers are built.

    The Jasons also said that Livermore's decision to conduct the first experiments in 2010 at 1.0 megajoules (MJ) rather than the intended operating level of 1.8 MJ would reduce their chances of success. The reduction, NIF managers say, will let them ramp up gradually, but critics say the real goal is to protect expensive optics. The full 1.8 MJ would allow the use of larger, more robust targets. With smaller targets, the Jasons fear, irregularities such as asymmetrical squeezing may prevent ignition. Critics as well as supporters also now worry that the laser won't be able to deliver full power. “The laser operates at one-third the total energy without damage,” says plasma physicist Stephen Bodner, a consultant to the Jasons' study and longtime NIF critic.

    Despite its criticism, the panel wants the project to be completed, says Jasons study leader David Hammer of Cornell University in Ithaca, New York, noting the report hailed engineering feats including diagnostics and other supporting technologies. In fact, NIF Associate Director George Miller predicts that getting the 192 lasers, the target, and the diagnostics working correctly together will pose a more formidable challenge than reaching 1.8 MJ. As for the backscattering, he says the plasma encountered in the early tests is different from what they'll encounter in the final product. “To create ignition-relevant plasmas, you have to complete NIF,” Miller said in an e-mail. Lab officials also believe that the smoothing performance shown in separate tests is adequate because the three techniques do not interact. But they acknowledge a full-power test shot with all three smoothing techniques simultaneously won't be performed until early next year.

    Within Congress, Sen. Dianne Feinstein (D-CA) is expected to lead the charge for full funding. Domenici is unhappy that other laboratory stockpile stewardship programs—several of which, NIF supporters note, are located in his home state—are being cut to fund NIF. So more funds for those programs could win him over. Congressional supporters also hope the White House will step in. In February, DOE Secretary Samuel Bodman called NIF the “core” of U.S. stockpile stewardship; now Bodman's spokesperson says the secretary needs to learn more about the project. The administration also failed to react officially to the Senate cuts. To survive, the biggest laser ever built just might need a little more firepower.

    THE LAW

    Vioxx Verdict: Too Little or Too Much Science?

    1. Andrew Lawler

    A widow's victory in the first of thousands of cases against Merck's pain reliever sidesteps important questions about the drug and a patient's death

    In her closing argument last month in a Texas courtroom, Merck lawyer Gerry Lowry said that science is “what this [Vioxx] case is all about.” The team representing the pharmaceutical giant presented reams of data to show that Vioxx was not to blame in the death of the plaintiff's husband, a 59-year-old triathlete. So when 10 of 12 jurors decided that Merck was liable for producing and marketing a drug that could be deadly, some reports painted the outcome as a triumph of hot emotion over cold facts.

    But observers say that neither legal team dealt forthrightly with two important scientific questions: How safe are Vioxx and similar drugs, and did the victim, a Wal-Mart employee named Robert Ernst, actually die from one of the drug's known side effects? “Both sides avoided the scientific complexity,” says Garret FitzGerald, a cardiologist and pharmacologist at the University of Pennsylvania in Philadelphia who has criticized COX-2 inhibitors, the class of drugs to which Vioxx belongs, despite having received research funding from Merck. That complexity, which received only minimal attention during the 5-week trial, centers on how Vioxx affects the heart—and how new techniques can pinpoint death by blood clot even if it's not visible during a standard autopsy.

    Carol Ernst's suit on behalf of her husband, who died in 2001 after taking Vioxx for 8 months, is one of more than 4000 pending suits against the one-time blockbuster drug that Merck pulled from the market last September after a study showed an increased risk of heart attacks and strokes. The jury's brief deliberations and the size of the verdict—$253 million, some 10 times what Texas law allows—led some observers to conclude that jurors disregarded the factual evidence in favor of emotional arguments put forth by the plaintiff's attorney Mark Lanier. “The jury never quite got it—a lot of people just don't want to hear data,” says Thomas Wheeler, a pathologist at Baylor College of Medicine in Houston, Texas, who testified as a paid witness on Merck's behalf. In fact, jurors told the Wall Street Journal after the verdict that the science presented during the trial “sailed right over their heads.”

    Running question.

    Whether arrhythmic victim Robert Ernst also suffered a blood clot (right) went unanswered at the trial.


    But questioning the jury's scientific competence—a majority had high school educations—may be missing a more important issue. “It is all too easy to blame the jury for being stupid,” says Shari Diamond, a research psychologist and law professor at Northwestern University in Chicago. She and others believe the jury was convinced by studies showing that Vioxx can cause abnormally high rates of heart attack and stroke. Neil Vidmar, a law professor and psychologist at Duke University in Durham, North Carolina, adds that although “science seemed to be the issue” during the trial, “I suspect the strongest evidence [against the defense] was that Merck had covered up.” Internal memos and e-mails cited in the trial show that Merck scientists were aware of cardiovascular concerns about Vioxx long before a 2000 article in The New England Journal of Medicine first raised the issue publicly.

    Unlike older anti-inflammatory drugs, Vioxx and other COX-2 inhibitors are thought less likely to upset the stomach or cause gastric bleeding. The compounds—which include Pfizer's Bextra and Celebrex—act on a narrower set of inflammatory enzymes than their painkilling predecessors do. That action, researchers believe, may instigate and accelerate blood clotting that could lead to heart attacks and strokes (Science, 15 October 2004, p. 384).

    However, Ernst's death certificate listed arrhythmia rather than a heart attack brought on by a blood clot. Neither Vioxx nor other COX-2 drugs have been associated with arrhythmia in any study. And the autopsy showed no signs of clotting that could have generated a heart attack. Wheeler, a pathologist who has specialized in prostate rather than heart research, told the jury that Vioxx could not have been responsible for Ernst's death. “And I stand by that,” he told Science last week.

    But FitzGerald and others aren't so sure. Although there was no evidence of a clot, he says, “that doesn't mean [Ernst] didn't have one.” Eric Topol, a cardiovascular researcher at the Cleveland Clinic in Ohio who has been a vocal critic of Vioxx, notes that clots traditionally have been difficult to track because they sometimes dissolve only to reform. “This is not a static phenomenon.” Clots also can embolize, or shower downstream, leaving little trace. It's even possible to find clues of a vanished clot using microscopic examinations of heart tissue, a method pioneered by the late Michael Davies.

    Such a fine-tuned autopsy was not performed on Ernst. As a result, says Topol, “they spent 5 weeks discussing the wrong topic. The issue was [what causes] sudden cardiac death, not [that Ernst suffered an] arrhythmia.” The sharp distinction Merck tried to draw was blurred by testimony from the coroner Maria Araneta that a blood clot may have been dislodged and dissolved during attempts to save Ernst's life.

    Merck's general counsel Kenneth Frazier said after the trial that “the jury was allowed to hear testimony that was not based on reliable science and that was irrelevant.” He vowed to appeal that case, in part on that basis, and to fight some 4200 pending cases “one by one,” although last week he told The New York Times that Merck might be willing to settle select cases.

    Yet even if upcoming trials involve heart attack and stroke victims, they inevitably will require sorting through many complex factors, says Topol. And a courtroom, Wheeler notes, is a difficult place to debate complicated research issues. So even if science is what the Vioxx debate is all about, it's a strain developed for the courtroom, not the lab.


    The Quest for Dark Energy: High Road or Low?

    1. Adrian Cho

    A space telescope could reveal the mysterious stuff that is blowing the universe apart—if those on the ground don't do it first

    Seven years ago, astrophysicists asked a simple question: “How far?” The answer overturned our understanding of the cosmos.

    Since 1929, researchers had known that the universe is expanding. But they assumed the expansion is slowing as the universe's own gravity tugs against it. Two teams set out to observe the slowing by measuring the distances to exploding stars known as supernovae. To the researchers' surprise, the farthest supernovae were farther than expected. That meant the expansion of the universe is accelerating as if driven by some weird space-stretching “dark energy.”

    “When we first saw the result, I assumed our data was miscalibrated,” recalls Saul Perlmutter of Lawrence Berkeley National Laboratory and the University of California, Berkeley. But within a few years, studies of the afterglow of the big bang—the “cosmic microwave background”—and other measurements bolstered the case for dark energy and showed that it accounts for a whopping two-thirds of the universe. “The amazing thing about this discovery is how quickly people accepted it,” Perlmutter says.

    Yet researchers still don't know what the mysterious stuff is. They believe the answer lies in observing thousands of supernovae and millions of galaxies. Sometime in the next decade, NASA and the U.S. Department of Energy (DOE) are expected to launch a $600-million space telescope designed to measure dark energy, the Joint Dark Energy Mission (JDEM).

    But even as they lay their plans for the satellite, researchers are debating whether they could hammer out key properties of dark energy with observations from the ground. NASA, DOE, and the U.S. National Science Foundation have received dozens of proposals for measuring dark energy from terra firma, and the agencies have assembled a Dark Energy Task Force to evaluate them and report back by year's end. The task force's report will help the agencies set their near-term priorities and will inform another panel studying proposed methods and technologies for JDEM.

    Figuring out what can be done from the ground may be key to keeping JDEM affordable. “Clearly, you should only do from space what you have to do from space,” says Rocky Kolb, a cosmologist at the Fermi National Accelerator Laboratory in Batavia, Illinois, and chair of the task force. But deciding what's best done where is tricky, says Charles Bennett, an astrophysicist at Johns Hopkins University in Baltimore, Maryland, and co-chair of the JDEM science definition team. “We don't know what dark energy is, and there are different ways to measure it and different aspects to measure,” Bennett says. “There are unknowns in all directions.”

    Don't look down. A proposed space telescope such as SNAP (left) faces competition from today's ground-based telescopes such as the Canada-France-Hawaii Telescope on Mauna Kea.


    Weird and weirder

    So far theorists have dreamed up three ideas of what dark energy might be, each one a challenge to the current conception of the universe, says Sean Carroll, a theoretical physicist at the University of Chicago. “There are no uninteresting possibilities,” Carroll says, “which is what makes it so exciting.”

    Perhaps the simplest explanation is that dark energy is part of the vacuum itself, so that space naturally tends to stretch as if driven by some inherent constant pressure. In 1917, Albert Einstein proposed such a pressure, or “cosmological constant,” to counteract gravity and keep the universe from imploding. He later abandoned the notion as unnecessary when astronomers found that the universe is in fact expanding. But Einstein's orphaned idea may be the thing that drives the acceleration of the universe.

    If so, it will vex particle physicists. For decades they've known that, thanks to quantum mechanics, the vacuum roils with particles popping in and out of existence, and that such “virtual” particles give the vacuum an energy that could serve as the cosmological constant. Unfortunately, the energy physicists calculate is far too big to fit the data. In the past, theorists have assumed for the sake of simplicity that some still-unknown principle cancels everything out to make the vacuum energy add up to zero. If there is a cosmological constant, Carroll says, that tidy fix won't work.

    Alternatively, the dark energy might come from some sort of particle or interaction that propagates through space much as light does and provides the space-stretching push. Such “quintessence” theories skirt the problem with the vacuum energy, but they run into difficulties with other aspects of particle physics. For one, theorists must explain why the new particles don't interact with those already familiar to us.

    Finally, the accelerated expansion might not be driven by dark energy at all. Rather it could signal that across billions of light-years, gravity no longer works as Einstein's general theory of relativity predicts it should. “It's very hard to change gravity on large distances without changing it at short distances, too,” says Gia Dvali, a theoretical physicist at New York University. But that's a good thing, he says, because it means that theories that modify gravity may be easier to test.

    Researchers hope to distinguish among the possibilities simply by measuring how the density of dark energy changed as the universe expanded. If dark energy is a cosmological constant, then the density should have remained constant (see figure, below). And if the density varied, then dark energy must be something else. To tell the difference, researchers must trace the history of the expansion of the universe, which is encoded in the ancient light from far-flung stars.
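    In the standard parameterization (not spelled out in the article), the distinction turns on the dark energy's equation-of-state parameter w, the ratio of its pressure to its energy density. As the universe's scale factor a grows, the density dilutes as

    ```latex
    \rho_{\mathrm{DE}}(a) \propto a^{-3(1+w)}
    ```

    so a cosmological constant (w = -1) keeps its density fixed forever, while any other value of w makes the density change as space expands.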

    Loaded question. Dark energy acts like a spring that stretches space. If it's a cosmological constant, the spring grows with space to maintain a constant push (top). Otherwise, the push may vary (bottom).


    How red? How far?

    The quest boils down to asking two questions about some astronomical object, such as a supernova or a cluster of galaxies: How far away is it? And how red is its light? The record of the universe's expansion lies in the combination of the two answers, the so-called “distance-redshift relation.”

    As space expands, light zipping through it stretches to longer and redder wavelengths, much as sound waves in a slide whistle shift to lower pitches as the whistle's plunger descends. Light's wavelength increases more quickly if space is stretching faster. So to accumulate a given amount of stretch, or “redshift,” light would have had to travel longer and farther if the universe had expanded more slowly billions of years ago than if the universe had always expanded at its current rate. Astronomers first glimpsed dark energy by noting that supernovae whose light had been stretched by 20% to 100%—that is, with redshifts of 0.2 to 1.0—were farther away than expected (Science, 27 February 1998, p. 1298). The leading proposals for JDEM—designs named SNAP, JEDI, and Destiny—all aim to measure thousands of supernovae with redshifts as high as 1.7.
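    The stretch-to-redshift arithmetic described above can be sketched in a few lines of code (the wavelength values here are purely illustrative):

    ```python
    def redshift(wavelength_observed, wavelength_emitted):
        """Redshift z is the fractional stretch of the light's wavelength:
        z = (observed - emitted) / emitted."""
        return (wavelength_observed - wavelength_emitted) / wavelength_emitted

    # Light emitted at 500 nm and observed at 600 nm has been stretched
    # by 20%, i.e., z = 0.2; stretched to 1000 nm, it would have z = 1.0.
    z = redshift(600.0, 500.0)
    ```
    
    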

    But instead of a supernova, the object in question could be the distance between galaxies, says Daniel Eisenstein, a cosmologist at the University of Arizona in Tucson. Because of a phenomenon known as “baryon acoustic oscillations,” galaxies show a slight tendency to space themselves at a specific distance. That distance, about 500 million light-years, is determined by how far sound waves traveled in the plasma that filled the primordial universe before atoms formed. By surveying millions of galaxies of a given redshift and measuring how far apart they appear in the sky, researchers can determine their distance and deduce the distance-redshift relation, Eisenstein says.
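    The standard-ruler logic reduces to small-angle trigonometry: a known physical length that subtends a smaller angle on the sky must be farther away. A toy sketch, assuming simple Euclidean geometry (real surveys use cosmological angular-diameter distances, and the numbers below are illustrative):

    ```python
    import math

    def distance_from_ruler(ruler_length, angular_size_deg):
        """Small-angle estimate: distance = physical length / angle in radians.
        Euclidean simplification of the baryon-acoustic-oscillation ruler."""
        return ruler_length / math.radians(angular_size_deg)

    # A 500-million-light-year ruler that subtends 5 degrees on the sky
    # lies roughly 500 / 0.087, or about 5,700 million light-years, away.
    d = distance_from_ruler(500.0, 5.0)
    ```
    
    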

    Galaxies might also reveal the evolution of dark energy through a more subtle effect. From studies of the cosmic microwave background, researchers know that almost all the matter in the universe is undetected “dark matter,” which fills space with vast filaments that contain the galaxies. Gravity from those threads bends light from more distant galaxies and distorts their appearance so that neighboring galaxies in the sky seem to line up (Science, 17 March 2000, p. 1899). Such “weak gravitational lensing” depends on the distances to the dark-matter “lens” and the observed galaxy. So by comparing the lensing of millions of galaxies at different redshifts, researchers hope to decipher the distance-redshift relation.

    Finally, researchers might probe dark energy simply by counting clusters of galaxies in a patch of the sky, says Joseph Mohr, an astronomer at the University of Illinois, Urbana-Champaign. The number of clusters at a given redshift reveals how much volume the patch contains, and the size of the patch in the sky reveals how far away it is, just what cosmologists need to know to trace the expansion.
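    The counting argument can be sketched as a toy calculation, assuming Euclidean geometry and a uniform cluster density (both simplifications of the real analysis):

    ```python
    def patch_depth(n_clusters, clusters_per_volume, solid_angle_sr):
        """Toy version of the cluster-counting argument: the count fixes the
        surveyed volume (N = density * V), and a cone of solid angle omega
        reaching out to depth d encloses V = omega * d**3 / 3; solve for d."""
        volume = n_clusters / clusters_per_volume
        return (3.0 * volume / solid_angle_sr) ** (1.0 / 3.0)
    ```

    Counting more clusters in the same patch of sky implies a larger volume and hence a greater depth, which is the distance information cosmologists need.
    
    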

    The ups and downs

    The question now is where best to make the observations. All agree that supernovae with redshifts greater than 1 can be studied only from space because their light stretches to infrared wavelengths that would get lost in the infrared glare of the atmosphere. Beyond that, consensus vanishes.

    “If you're looking at stellar objects, you're better off in space,” says Anthony Tyson, an astrophysicist at the University of California, Davis. “If you're looking at galaxies, you're better off on the ground.” Tyson leads the team developing the Large Synoptic Survey Telescope (LSST), a ground-based behemoth with an 8.4-meter mirror and a camera with 3 billion pixels, which if funded could take its first look at the sky in 2012 (Science, 27 August 2004, p. 1232). Imaging the entire sky and some 3 billion galaxies, LSST should best space-based measurements of weak lensing and baryon acoustic oscillations, Tyson claims.

    But others say such cut-and-dried standards are too simplistic. For example, atmospheric distortions can mimic weak lensing, says Gary Bernstein, a cosmologist at the University of Pennsylvania in Philadelphia. So a space-based weak-lensing study might do better than LSST even if it counted only a few hundred million galaxies. Similarly, the redshifts of galaxies are harder to measure from the ground, says Arizona's Eisenstein, so earth-bound baryon acoustic oscillation measurements could prove impractically slow.

    Given the uncertainties, researchers have proposed different strategies for JDEM. The infrared Destiny space telescope would measure only supernovae, says Jon Morse, an astrophysicist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. “Destiny is focused like a laser on this one problem,” he says. But that's taking a risk, says Yun Wang, a cosmologist at the University of Oklahoma in Norman and leader of the JEDI project. “We still don't know that supernovae will give you the precision you need to really know what dark energy is,” she says. JEDI and SNAP would measure supernovae, weak lensing, and baryon oscillations.

    The biggest uncertainties surrounding JDEM may be more political than technical. Both NASA and DOE list JDEM as a priority, but neither has committed to building it. Researchers say they're ready to start now, for a launch before 2011. But JDEM may not launch until 2017. And in the meantime, ground-based measurements will continue to whittle away at our ignorance—and JDEM's potential scientific impact.

    “You shouldn't look at a space mission as an improvement over what you know today,” says Johns Hopkins's Bennett. “You should look at it as an improvement over what you'll know tomorrow.” But as tomorrow gets pushed further into the future, such prognostications grow as murky as the nature of dark energy itself.