News this Week

Science  25 May 2001:
Vol. 292, Issue 5521, pp. 1462
  1. U.S. ENERGY POLICY

    Administration's Energy Plan Is Short on Scientific Details

    1. David Malakoff
    1. With reporting by Andrew Lawler.

    Researchers hoping for a power surge from the Bush Administration's new energy policy got barely a jolt when the White House released its much-anticipated report last week. Although the high-profile statement declares that scientific and technological breakthroughs are essential for stable U.S. energy supplies, it says little about which fields should be emphasized or how much money they should get. And even some of its specific recommendations, such as funding renewable energy research with revenue from opening a wildlife refuge to drilling, are unlikely to win approval in Congress.

    “The ‘music’ in this report is right on key,” says House Science Committee chair Sherwood Boehlert (R-NY), who applauded the tone of the report. “But the ‘words’ are still sometimes vague or dissonant.”

    The 170-page plan* was written by a task force led by Vice President Dick Cheney. It makes more than 100 recommendations for preventing future shortages, from allowing drilling in Alaska's Arctic National Wildlife Refuge and building new nuclear power plants to giving tax breaks to consumers who buy energy-efficient cars and homes. Critics say the plan favors developing new energy supplies over promoting energy-saving strategies and gives short shrift to environmental concerns. “It's heavily biased in favor of the most polluting fossil fuels: coal and oil,” charges the Natural Resources Defense Council, an environmental group with headquarters in New York City. And it is a “disastrous development” for international efforts to reduce emissions of greenhouse gases, adds Jan Pronk, head of the United Nations forum on climate change.

    The report also sidesteps some tough political issues. For example, it promises only to “continue to study the science” behind whether to build a nuclear waste repository beneath Nevada's Yucca Mountain.

    The Cheney task force has been tight-lipped about whom it consulted, and some researchers say it didn't seek enough technical advice. The strategy “lacks any serious science and technology perspective,” says physicist Ernest Moniz of the Massachusetts Institute of Technology (MIT) in Cambridge, the top scientist and third-ranking Department of Energy (DOE) official in the Clinton Administration. “This is a well-intentioned slapdash job,” adds MIT chemist John Deutch, who led the Central Intelligence Agency under Clinton.

    Still, scientists are savoring a few tidbits. Fusion researchers are happy about language that encourages continued funding for studies of building devices that—like the sun—can fuse atoms to produce more energy than they consume. Nuclear energy researchers hope that a call to boost fission power might revive their discipline, which has seen academic interest and research reactors dwindle over the past decade. And materials scientists are cheered by an endorsement of efforts to create superconductors, materials that allow wires to carry far more current with less loss. “I'm glad that someone had the foresight to include us,” says Dean Peterson, head of a superconducting research center at the Los Alamos National Laboratory in New Mexico, part of a $37 million DOE program.

    The Cheney report, however, is silent on future funding for such programs. Instead, it directs Secretary of Energy Spencer Abraham to conduct a series of reviews and make recommendations to the White House. DOE officials dismissed speculation that the White House might use the strategy's release to shore up the president's stingy 2002 budget request for DOE research, which included cutting some nuclear and renewable energy research programs by up to 30%.

    Even some of the plan's more specific recommendations have been greeted with skepticism. Renewable energy advocates, for instance, predict that Congress will block a plan to generate more than $1 billion over the next decade for renewables research by selling drilling leases in the Arctic refuge. And although Congress is likely to be friendlier to a proposed $2 billion expenditure over the next decade on “clean coal” technologies, which seek nonpolluting ways to burn coal, Boehlert says they “need just as much reevaluation as do the alternative energy R&D programs the policy seems to distrust.”

    Researchers are puzzled by other recommendations. One asks the President's Council of Advisors on Science and Technology (PCAST) to “make recommendations on using the nation's energy resources more efficiently.” But plasma physicist John Holdren of Harvard University notes that he chaired a seemingly identical 1997 PCAST effort. “It's not clear to me [whether] the task force was aware of our results,” says Holdren, who had no contact with Cheney's team. “I don't know if it makes sense to do it again.”

    The energy report's lack of scientific perspective, Holdren says, highlights the new Administration's failure to connect with the technical community. But Congress may be more receptive to researchers' advice when it starts to translate the strategy's sketchy outline into legislation. In the Senate, for instance, a bipartisan group—including Pete Domenici (R-NM), Jeff Bingaman (D-NM), and Joe Lieberman (D-CT)—has already suggested boosting DOE's renewable research programs. Last week House Democrats released their own energy strategy. It calls for doubling the DOE's science budget over 5 years and creating a “science czar” to ensure that the best science guides any overall energy road map.

  2. EMBRYONIC STEM CELLS

    Court Asked to Declare NIH Guidelines Legal

    1. Gretchen Vogel

    Seven prominent stem cell scientists, together with three patients, have filed suit against the U.S. Department of Health and Human Services (HHS) and the National Institutes of Health (NIH). They are charging that the Bush Administration's failure to fund work on human pluripotent stem cells is causing “irreparable harm” by delaying potential therapies.

    Last August, NIH issued guidelines to govern federal funding of work on human pluripotent stem cells (HPSCs) (Science, 1 September 2000, p. 1442). The move paved the way for NIH-funded scientists to conduct research that can now be done in the United States only with private funds. But it ignited a controversy because the cells—which in theory could be coaxed to become any cell type in the body—are derived from human embryos or fetal tissue obtained from elective abortions. In February, the new Bush Administration asked HHS Secretary Tommy Thompson to review the guidelines; Thompson in turn told NIH to put its process for implementing them on hold (Science, 20 April, p. 415).

    On 8 May, the scientists, who work with HPSCs using private funding, joined forces with actor Christopher Reeve (paralyzed by a spinal cord injury), Parkinson's disease advocate James Cordy (who suffers from the disease), and Chicago business executive James Tyree (who has type I diabetes). In their complaint they ask the U.S. District Court for the District of Columbia to declare that the NIH guidelines are legal and to compel NIH to fund research on the cells. The plaintiffs include James Thomson of the University of Wisconsin, Madison, who first isolated HPSCs from embryos, and John Gearhart of Johns Hopkins University, who isolated HPSCs from fetal tissue. The list also includes the three researchers who have asked NIH to certify that cell lines they have derived meet the guidelines, a prerequisite for their use by other federally funded researchers. They are Roger Pedersen of the University of California, San Francisco, and Alan Trounson and Martin Pera of Monash Medical Center in Melbourne, Australia. Stem cell researchers Dan Kaufman of the University of Wisconsin, Madison, and Douglas Melton of Harvard University are also listed as plaintiffs.

    In dispute.

    Suit seeks implementation of guidelines that would permit U.S. government funds to be used for research on pluripotent stem cells derived from human blastocysts (above).

    CREDIT: ALAN TROUNSON/MONASH INSTITUTE FOR REPRODUCTION AND DEVELOPMENT

    In the 22-page complaint, the plaintiffs argue that by halting NIH's review process, HHS is failing in its “statutory duty to fund scientifically meritorious research projects.” They note that the 1993 NIH Revitalization Act specifically bars the executive branch from blocking funding for research on transplanting fetal tissue; they also argue that embryonic stem cell lines, as opposed to embryos, are fetal tissue. The review is causing irreparable harm by “delaying research using HPSCs … by restricting collegial sharing of cell lines with other scientists, and by discouraging talented young researchers from joining their labs or entering the field of HPSC research,” the plaintiffs assert. The patients charge that the HHS review is “preventing or delaying the development of potential treatments” for conditions such as paralysis, Parkinson's disease, and diabetes.

    This is not the first time the issue has been in the courts. Nightlight Christian Adoptions, an agency that arranges adoptions of extra human embryos created as a part of fertility treatments, filed suit on 8 March to block the NIH guidelines. That suit was put on hold when Thompson asked NIH to suspend its review process.

    The government has 60 days to respond to the complaint, says Jeffrey Martin of Shea & Gardner in Washington, D.C., who is representing the plaintiffs pro bono. “Much of the conversation about legal arguments was one-sided until we filed our case,” says Martin. He hopes the arguments in the suit will help persuade the Administration to allow the guidelines to proceed.

  3. U.S. SCIENCE EDUCATION

    Lawmakers Vie to Shape NSF Program

    1. Jeffrey Mervis

    Congress abhors a vacuum. So this spring, after President Bush proposed a $200-million-a-year science and math education program to be run by the National Science Foundation (NSF) but offered scant details (Science, 13 April, p. 182), legislators jumped at the chance to influence one of the hottest political debates of the year. The result is a slew of bills that would flesh out Bush's sketchy plan to forge partnerships between university researchers and local school districts. Chances appear good that one or more of them will be adopted this year, although funding levels remain up in the air.

    Making the biggest splash is a plan introduced last week by the chair of the House Science Committee, Representative Sherwood Boehlert (R-NY). The National Mathematics and Science Partnerships Act (HR 1858) would authorize $267 million a year in programs to strengthen teacher training and professional development. The bill would establish a new NSF grants program that would link universities and nonprofit organizations with local schools and businesses to improve math and science instruction in elementary and secondary schools. It would also provide scholarships for science and math majors or scientists wishing to become teachers, give teachers grants to do university-level research, and create four NSF-funded centers to study how children learn.

    “One of the failings of our current [public education] system is that we don't take advantage of all the expertise residing in our universities and businesses,” said Boehlert in a prepared statement. “My bill is an effort to do just that.”

    Slightly different versions of the Bush plan are embodied in HR 1 and S 1, the main Republican vehicles for the president's overall effort to rework federal support for elementary and secondary schools. However, those bills would funnel most of the partnership money to local and state school districts through the Department of Education.

    Partners aplenty.

    Representative Sherwood Boehlert's science and math education bill is one of many that address a new NSF program.

    CREDIT: RICK KOZAK

    Boehlert's bill avoids a controversial provision in a related education bill sponsored by Representative Vernon Ehlers (R-MI) that would require NSF to fund the salaries of master teachers at private as well as public schools. Ehlers says he hopes to move ahead with his bill, HR 100, which suffered a surprise defeat last fall (Science, 10 November 2000, p. 1068). But other observers predict that some of Ehlers's provisions will be folded into the chair's bill. Boehlert also hopes to join forces with House Democrats, who earlier this month introduced a partnerships bill, HR 1693, that places a greater emphasis on increasing the participation of underrepresented groups and boosting educational technologies.

    Boehlert's plan appears to be closely aligned with NSF's thinking on Bush's partnership program, which officials first learned about in late January. Judith Sunley, head of NSF's education directorate, expects to issue an announcement this fall on how the program will be run. “We hope that our legislation will influence what [NSF] decides to do,” says Boehlert aide David Goldston, who expects the bill to be marked up by the full committee next month.

    Whatever their differences, these bills simply give NSF permission to carry out specific programs. The money to run them comes from appropriators, who will shortly start carving up some $661 billion in discretionary spending for the 2002 fiscal year, which begins in October.

    Political trade-offs are likely to shape NSF's overall budget, currently $4.4 billion and scheduled for a 1.3% boost. Despite widespread support for improving precollege math and science education, for instance, the 11% increase Bush has proposed for NSF's education programs might be vulnerable. On 16 May a House spending panel discussed shifting some education money into the foundation's core research programs to offset the cuts and the freeze on major research facilities called for in the president's April budget. Comparing those cuts to a requested 13.5% increase for the National Institutes of Health, appropriations subcommittee chair James Walsh (R-NY) said after the hearing that “we may need to put more money into the physical sciences” to improve the balance of federally funded research.

  4. EUROPEAN HIGHER EDUCATION

    Thirty Nations Pledge to Harmonize Degrees

    1. Robert Koenig

    PRAGUE—European universities pride themselves on their unique histories and independence. But that autonomy can be a disadvantage in a world where intellectual talent is increasingly free to ignore national boundaries. Last week, the education ministers of 30 European nations agreed to bolster efforts to bring their systems of higher education closer together. They endorsed reforms that should make it easier for students and researchers to move freely among Europe's sometimes disparate universities.

    Meeting here on 18 and 19 May, officials from Germany to Malta agreed to establish by 2010 a “European Higher Education Area” that would include elements such as comparable degrees and transferable credits (see table). In addition, the parties aim to develop a European joint degree program that would be based on courses taken at two or three institutions in different countries. Lowering these and other administrative barriers should help slow or reverse a brain drain from European to U.S. institutions, they believe.

    “We have made strides toward the goals of stimulating mobility and encouraging more reforms in higher education that already are taking place in much of Europe,” says Sweden's education and science minister, Thomas Östros. He co-chaired the meeting with Eduard Zeman, education minister of the Czech Republic, who believes that the integration of Europe's universities into the mainstream will accelerate the overall process of European unity. “Today's universities are tomorrow's Europe,” he says.


    The movement toward a common set of administrative practices began at a small Paris meeting in 1998 and was fleshed out the next year at Bologna. There ministers from two dozen countries called for replacing the complex, confusing, and sometimes nontransferable degree systems in many European countries with “easily readable and comparable degrees.” That would include the use of bachelor's and master's degrees in most fields. Other goals include a strengthened accreditation system, often called “quality assurance,” and a freer flow of students and researchers across national borders.

    Although there has been progress, some obstacles won't be easy to remove. “There are still great differences in the systems of quality assurance for higher education in European countries,” says Jacob Henricson, a Swedish student who chairs the National Union of Students in Europe. And some officials warn that going too far could cause an unwanted homogenization of higher education. France's education minister, Jack Lang, for example, argued strongly for the importance of maintaining diversity—especially in languages and curricula—in the midst of harmonizing degree and credit systems. Others defended the value of keeping the typical European degree system for medicine, engineering, and law, in which students begin their specialized studies right after secondary school and do not earn a bachelor's degree along the way.

    The U.K.'s education minister, Tessa Blackstone, said that she would oppose any effort to interfere with a particular university's curricula and independence. But she welcomes better coordination to combat “huge competition from U.S. universities.” These and other issues will be discussed at the next meeting of ministers, set for Berlin in spring 2003.

  5. HUMAN EXPERIMENTATION

    Bioethics Panel Urges Broader Oversight

    1. Eliot Marshall*
    1. With reporting by Gretchen Vogel.

    The U.S. government should create a single office to monitor both public and private research involving human volunteers, says a new report by a presidential commission. The report, 5 years in the making, concludes that major changes are needed in the oversight of clinical research. But it's not clear if the new occupant of the White House is listening.

    The National Bioethics Advisory Commission (NBAC) adopted its findings on 15 May and will publish a complete report in late summer.* What NBAC cannot do, however, is ensure that the government pays attention to these long-debated conclusions. The president who sought this advice—Bill Clinton—is gone, and the new Administration is far from attuned to these concerns, with no science adviser and a skeletal scientific staff.

    The new NBAC report is the broadest of several the panel has issued since 1996. Chaired by Princeton University president Harold Shapiro, the panel has tackled a variety of focused issues, from how to obtain consent from people with impaired judgment to the ethics of running tissue collections. The recommendations issued last week, which represent its cumulative investigations, “reflect a dual commitment to ensuring the protection of those who volunteer for research while supporting the continued advance of science.”

    The commission did not find evidence of widespread problems in either public or private human subjects research, says its executive director, Eric Meslin. But it “felt that the principle of respect for participants in human research is one that extends to all participants, regardless of funding source.” At present, privately funded research comes under government review only if it is part of a government-funded project or is being used to support drugs under review by the Food and Drug Administration.

    The new office, which the panel labeled the National Office for Human Research Oversight, would set policy, issue and enforce regulations, oversee research, and disseminate information. It would not fall under the jurisdiction of the Department of Health and Human Services (HHS), parent to the 1-year-old Office for Human Research Protections (OHRP), the current overseer of federally funded clinical research. But Robert Rich of Emory University in Atlanta, Georgia, president-elect of the Federation of American Societies for Experimental Biology (FASEB), doesn't see the need for such a change. The existing OHRP “has all the tools it needs” to oversee human subjects research, he says, and a panel housed outside HHS would be subject to greater political pressures.

    NBAC also seeks to clarify the definition of human subjects research and cover a broader swath of activities. The government, it says, should keep tabs on any “systematic collection or analysis of data [involving human subjects] with the intent to generate new knowledge.” The policy would also apply to medical databases and tissue collections. All studies of more than minimal risk should be approved by an independent review board, NBAC says, and at least 25% of the members of such panels should be “unaffiliated with the institution” and focus on “nonscientific areas.”

    Although most of NBAC's recommendations are designed to close loopholes in the current system of research monitoring, some are designed to remove excessive paperwork. For example, the panel says that reviews of informed consent should focus on the process and substance of informing volunteers rather than on obtaining a signed document, and it suggests that some procedures should be exempt from routine consent requirements if the risks are truly “minimal.”

    The new report also could be the panel's swan song. Meslin says the commission has received no word on whether its charter, which expires in October, will be extended. Even if the NBAC report is ignored, however, the idea of a new oversight body has been embraced by the biomedical community. On 23 May a consortium of six university and research groups—including FASEB—announced the formation of the Association for the Accreditation of Human Research Protection Programs. The association will operate a voluntary national accreditation program to monitor clinical research carried out with either public or private funding.

  6. CELL BIOLOGY

    Protein Clumps Hijack Cell's Clearance System

    1. Laura Helmuth

    Although Parkinson's, Huntington's, amyotrophic lateral sclerosis, and other neurodegenerative diseases cause very different behavioral symptoms, inside the neuron they look a lot alike. All are marked by big intracellular clumps of protein that scar neurons targeted by the disease. But researchers haven't known whether these protein clumps cause neurological damage themselves or are mere byproducts of some other system gone awry.

    Now a study on page 1552 suggests that protein aggregates can directly damage cells by hijacking a cellular quality control mechanism, the ubiquitin-proteasome system (UPS). Normally, the UPS seeks out and destroys misshapen proteins inside the cell—including those that tangle up into clumps in neurodegenerative disease. But in cells artificially induced to churn out clump-prone proteins, the system stalls. Protein tangles thus apparently “initiate a vicious cycle,” says Susan Lindquist of the University of Chicago. “The study suggests very nicely that there is an interplay between the two: As proteins start to accumulate, they put stress on the UPS. This in turn causes more proteins to accumulate, which in turn puts more stress on the UPS.”

    Overwhelmed.

    Cells with clumps of proteins (yellow) can't clear an aberrant protein marker (green).

    CREDIT: N. F. BENCE ET AL.

    A stressed-out UPS spells trouble for the cell. As many as 80% of the proteins a cell produces don't fold correctly, points out Alfred Goldberg of Harvard Medical School in Boston; the UPS destroys them before they cause damage. Ubiquitin tags abnormal proteins for destruction, and the proteasomes chew up those ubiquitinated proteins. If proteasomes are stifled by inhibitors, even healthy cells accumulate abnormal proteins in thickets similar to those seen in neurodegenerative diseases. These cells can't reproduce, and they're likely to die.

    Earlier research uncovered suggestive links between the UPS and neurodegenerative diseases. Mutations in UPS-related genes can cause early-onset forms of Parkinson's disease, for instance. And protein clumps inside diseased neurons are studded with both ubiquitin and proteasomes.

    To test directly whether protein clumps can hobble the UPS, Ron Kopito and colleagues at Stanford University developed an easy-to-read tracer, composed of a fluorescent protein attached to a protein fragment that sends an “eat me” signal to the UPS. If the clearance system is working efficiently, it destroys the tracer and the cell's fluorescence quickly dims.

    The researchers engineered cells to produce one of two proteins—one involved in cystic fibrosis and the other in Huntington's disease—both of which tend to aggregate. In cells with clumps of these proteins, the researchers found, the fluorescent tracer glowed robustly, indicating that the UPS was powerless to break it down.

    Although they're not sure how tangled proteins shut down the UPS, one possibility is that “the proteasome is, as we put it, frustrated,” says Kopito. Abnormal proteins are sucked into the proteasome through a small opening, like someone slurping a strand of spaghetti. If the protein is tangled into a clump, the proteasome can't pull it in. But, because the proteasome is a “possessive enzyme” that holds onto its prey, it also can't let go. “The proteasome is basically out of commission,” says Kopito, which prevents it from chasing down new, badly built proteins, and the problem escalates. Either player in the drama—an excess of misfolded proteins or a glitch in the UPS—could trigger the cycle, Kopito says.

    A malfunctioning UPS could be partly at fault in other neurological diseases as well. Alzheimer's disease is defined in part by protein tangles inside the cell similar to those modeled in the current study. But other protein clumps, amyloid plaques outside neurons, have received more attention lately (see “New Clue to the Cause of Alzheimer's”). Kopito says it's possible an overloaded UPS or similar mechanism could allow plaques to form. He also suspects UPS overload in bovine spongiform encephalopathy, or “mad cow disease,” and its human cousin, Creutzfeldt-Jakob disease, which are both characterized by clumps of abnormal protein fragments. If so, says Kopito, “this could be an explanation” for how these mysterious diseases kill neurons.

  7. NEUROBIOLOGY

    New Clue to the Cause of Alzheimer's

    1. Jean Marx

    Alzheimer's disease sneaks up on its victims, robbing them first of short-term memory and then ultimately of all ability to think and reason. Like several other neurological diseases, Alzheimer's seems to be caused by abnormal protein deposits accumulating in the brain (see “Protein Clumps Hijack Cell's Clearance System”). In Alzheimer's, the prime suspect is a small protein called amyloid beta (Aβ), which is a major component of the pathological plaques that stud the patients' brains. But what causes that protein to accumulate?

    Most Alzheimer's researchers have focused on the enzymes that free Aβ from its precursor protein (Science, 29 September 2000, p. 2296). By releasing too much Aβ, overactive enzymes could lead to abnormal Aβ deposition in the brain. Recently, however, researchers have been approaching the problem from a different direction: looking for enzymes that degrade the peptide. If the enzymes are underactive, the result could also be Aβ buildup. “It's just as important to find the [Aβ] degradation pathway as the synthesis pathway,” says Rudolph Tanzi, an Alzheimer's researcher at Massachusetts General Hospital in Boston.

    Alzheimer's suspect.

    The red staining shows the distribution of neprilysin in the normal mouse brain.

    CREDIT: S. FUKAMI, K. WATANABE, N. IWATA, AND T. C. SAIDO

    On page 1550, a team led by Takaomi Saido of the RIKEN Brain Science Institute in Saitama, Japan, now provides direct evidence in mice that a protease called neprilysin could be a natural Aβ-degrading enzyme. “It might play an important role in clearing Aβ in the brain,” says Alzheimer's researcher Sangram Sisodia of the University of Chicago Pritzker School of Medicine. But he and others caution that more work will be needed to confirm that neprilysin does indeed break down Aβ. Neprilysin is not the only candidate for that job, however.

    Last year, Dennis Selkoe's team at Harvard Medical School in Boston showed that a neuronal protein called insulin-degrading enzyme can break down the peptide in lab cultures. If a deficiency of either—or both—of these enzymes turns out to contribute to Alzheimer's, the enzymes could provide new targets for drugs to prevent or treat the disease, which afflicts 4 million people in the United States alone.

    Saido's team identified neprilysin as a possible Aβ-degrading enzyme about a year ago. In that work, the team injected mouse brains with Aβ and also with a series of specific protease inhibitors to see which would reduce the breakdown of the peptide. This led the researchers to neprilysin and three related protein-destroying enzymes. Because neprilysin had the greatest activity against Aβ in test tube studies, the scientists collaborated with Craig Gerard's team at Harvard Medical School, which had knocked out the neprilysin gene in mice, to see how that affected the animals' ability to degrade the peptide.

    When injected into the brains of normal animals, Aβ is broken down within about 30 minutes. But in the knockout mice, the researchers found, almost all of the peptide persisted. By crossing knockout mice with normal animals, the team produced mice that have just one active copy of the neprilysin gene. These animals yielded intermediate results: More of the injected Aβ persisted than in normal animals, but less than in the complete knockouts. That's important, Saido says, because it “indicates that even partial reduction of neprilysin activity, which could be caused by aging, will elevate Aβ and thus cause Alzheimer's.”

    The Saido team also found that natural Aβ levels in the knockout mice were highest in brain regions, such as the hippocampus and cortex, where Alzheimer's plaques are most prominent. This and recent findings from Patrick McGeer and his colleagues at the University of British Columbia in Vancouver are consistent with the possibility that neprilysin deficiency could contribute to Alzheimer's, says Saido. When McGeer's group analyzed neprilysin levels in the brains of patients who had died of the disease, they found the lowest levels in the high-plaque regions. “If you have adequate amounts of neprilysin, you never accumulate Aβ,” McGeer concludes.

    Despite the biochemical evidence suggesting that neprilysin deficiency could lead to Alzheimer's, to clinch the case researchers are waiting for two additional findings. One would be the demonstration that neprilysin deficiency increases plaque formation in an Alzheimer's mouse model—an experiment that Saido describes as the “top priority.” The other is genetic evidence linking the neprilysin gene to human Alzheimer's.

    Tanzi's group at Harvard has found hints of such a genetic linkage between human Alzheimer's and the region on chromosome 3 where the neprilysin gene is located. It was just short of statistical significance, however, and he hasn't published the work. But a neprilysin deficiency could be caused by a defect in the machinery needed for expressing the gene, as well as in the gene itself, Saido speculates.

    Meanwhile, Selkoe and his colleagues are following up on the insulin-degrading enzyme. Among other experiments, they are knocking out the gene in mice to see whether that alters Aβ handling in the brain. If at least one of these enzymes proves to be a bona fide Aβ-degrading enzyme in the brain, Alzheimer's researchers will have an important new line of investigation to pursue in their efforts to tame this devastating disease.

  8. SOLID STATE PHYSICS

    Rubbery Liquid Crystal Gives Lasers a Stretch

    1. Andrew Watson*
    1. Andrew Watson writes from Norwich, U.K.

    You won't find it in gumball machines, but Heino Finkelmann's stretchable laser may be the cutest example of elasticity since Silly Putty. Finkelmann, a physical chemist at the University of Freiburg, Germany, created a new type of artificial rubber with the light-altering properties of a liquid crystal display. Then, colleagues at Kent State University in Kent, Ohio, showed that it could be used as a new type of laser—one that emits light without mirrors or a cavity and that changes its wavelength when pulled like taffy.

    Behind the fun lies serious physics. Finkelmann's rubber is a photonic bandgap material, one that can block light in a controlled way. Such materials are important as tiny light guides in the world of optical communications, but they are expensive and tricky to engineer. “Here's a hope that one can make naturally occurring bandgap materials that therefore are very inexpensive, and here they're also tunable,” says Azriel Genack, a physicist at the Queens College of City University of New York in Flushing. “It's just a question of mixing the chemicals together.”

    The transparent rubber is about as stretchy as an office rubber band. It is also a liquid crystal, a member of a class of twilight materials whose structures lie partway between the neat ordering of a true crystal and the haphazard disorder of a liquid. In a typical liquid crystal, long, rodlike molecules spontaneously line up with one another. Finkelmann's material varies the theme: The rodlike molecules twist like the steps of a spiral staircase, a structure first observed in derivatives of the cholesterol molecule. In such so-called cholesteric liquid crystals, the helical structure is described by its pitch, the distance needed to make one complete twist.

    What makes cholesteric crystals useful, Finkelmann explains, is their ability to capture light. “If light is coming in the direction of the pitch axis, light of a given wavelength is reflected,” he says. That means that if light of just the right wavelength enters the liquid crystal material, the twisted molecules can trap it just as mirrors do in a regular laser. Where to get such fine-tuned light? The trick, Finkelmann says, is to impregnate the material with a fluorescent dye that, when excited by an external laser, gives off light of just the right wavelength to be trapped. If the exciting laser is powerful enough, the light emitted by the dye escapes as laser light at wavelengths both just above and just below the forbidden, trapped wavelength band.
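
    In textbook terms, the tuning follows from the standard cholesteric reflection relation; the sketch below is a back-of-the-envelope statement under common assumptions (affine, volume-conserving deformation of the rubber), with symbols that are our shorthand rather than notation from the forthcoming paper.

        % Along the helix axis, a cholesteric of pitch p with ordinary and
        % extraordinary refractive indices n_o, n_e reflects light in the band
        %     n_o p < \lambda < n_e p,
        % centered at
        \lambda_c \approx \bar{n}\, p, \qquad \bar{n} = \tfrac{1}{2}(n_o + n_e)
        % Lasing occurs at the edges of this band. Stretching the rubber by a
        % factor s in both in-plane directions thins the volume-conserving
        % film by s^2, so p -> p / s^2 and \lambda_c shifts toward the blue --
        % the color change Finkelmann describes.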

    Last year, Genack demonstrated such a fluorescent-lasing mechanism in a liquid. Finkelmann saw similar results presented at a conference by Peter Palffy-Muhoray of Kent State. Finkelmann's lab had just succeeded in creating cholesteric ordering in a rubber. “It was straightforward to transfer this concept to our rubbers,” he says.

    The result, as Finkelmann, Palffy-Muhoray, and three colleagues will report in an upcoming issue of the journal Advanced Materials, is a material with its spiral axis oriented through its thickness. Shine a laser beam on one side, and laser light of a different frequency comes out the other. Stretch the rubber in two directions at once (see figure), and the molecular spirals shorten, causing the emitted light to shift toward the blue end of the spectrum. The color of the light leaving the rubber changes before your eyes, Finkelmann says.

    The new work is a “significant step” in the progress of liquid crystal lasing, says physicist Cliff Jones of ZBD Displays in Malvern, United Kingdom. Applications may be in the offing. “I can imagine that this may lead to a low-cost, tunable laser,” Jones says. Genack says he is interested in using the rubber to develop practical devices for display technology.

  9. GENE SILENCING

    A Faster Way to Shut Down Genes

    1. R. John Davenport

    When scientists say a new technique ranks up there with the polymerase chain reaction (PCR), you know it's big. That's how at least one researcher is viewing a report this week of a new method for shutting off specific genes in mammalian cells. Although still in its infancy, the approach, known as RNA interference (RNAi), makes a currently arduous process much faster, much as PCR did for copying segments of DNA.

    “It's going to totally revolutionize somatic cell genetics in mammals,” predicts Phillip Zamore of the University of Massachusetts Medical Center in Worcester. “Instead of devoting 6 months to a year figuring out how to turn off expression [of a mammalian gene], people will be able to go in and in a week turn off the expression of 10 genes.”

    RNAi is a process in which double-stranded RNA (dsRNA) molecules turn off, or silence, the expression of a gene with a corresponding sequence (Science, 26 May 2000, p. 1370). A variety of organisms, including plants, fruit flies, the roundworm Caenorhabditis elegans, and likely mammals, seem to enlist RNAi to fight off viruses and restrain the movement of transposons, pieces of DNA that can hop around and disrupt a genome.

    Recently scientists have also harnessed this process as a research tool, injecting dsRNA into cells to turn off specific genes, thereby garnering clues about their function. That work has yielded some stunning results; last year, for instance, scientists used the technique to systematically block expression of nearly all the genes on two C. elegans chromosomes. But so far, silencing genes using RNAi has not worked well in mammalian cells; although successful in mouse embryos, dsRNA triggers a global shutdown of protein synthesis in other mammalian cells.

    Running interference.

    An siRNA stops its target from making protein (left); siRNAs to other genes have no effect (right).

    CREDIT: THOMAS TUSCHL

    Now scientists at the Max Planck Institute for Biophysical Chemistry in Göttingen, Germany, report in the 24 May issue of Nature that they have finally overcome this barrier: By using specially constructed dsRNA molecules that are shorter than those tried to date, they have shut down target genes in several different cultured mammalian cell lines. The technique won't replace gene knockout strategies, but RNAi does offer some advantages. It is faster and less labor intensive than making a knockout, and genes that are lethal when knocked out in embryos can be analyzed with RNAi in cell culture.

    The new work grew out of studies of RNAi in Drosophila. Although much of the RNAi mechanism is mysterious, researchers know that dsRNA triggers the degradation of the messenger RNA (mRNA), blocking synthesis of the protein product. Earlier this year, Thomas Tuschl and his Max Planck co-workers discovered that an RNAi intermediate, called siRNA (small interfering RNA), instigates degradation of an mRNA in the fruit fly. Tuschl wondered whether that intermediate, a 21-base dsRNA with two bases overhanging on each end, would be better than longer dsRNAs at silencing genes in cultured mammalian cells.

    In the new work, Tuschl and his team tested synthetic siRNAs that matched the sequence of four different genes that express cytoskeletal proteins. In four different cell lines derived from human and monkey, expression of three of the four genes was lowered substantially—by as much as 90%. The fourth gene is highly expressed, which may make silencing it more difficult, say the authors. Control genes not targeted by the siRNAs were not affected. The team has subsequently tried RNAi on other genes, and according to co-author Klaus Weber, nine out of 10 genes can be “knocked down” with the method.

    This targeted silencing in mammalian cells appears to work by circumventing a global cellular process not present in lower organisms. Injection of long dsRNAs into mammalian cells (which probably mimics invasion by a virus) induces a broad interferon response that reduces the translation of many genes and can even trigger cell suicide. siRNAs appear to be short enough to sneak under the radar of the interferon system but long enough to specifically target single genes.
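
    To make the 21-mer geometry concrete, here is a minimal Python sketch of the strand design described above. The target sequence, function names, and the UU overhang are illustrative assumptions, not the Max Planck group's protocol or code.

        # Two 21-nt RNA strands whose central 19 nt pair up, leaving a
        # 2-nt 3' overhang on each end -- the siRNA geometry in the text.

        COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

        def reverse_complement(rna: str) -> str:
            """Reverse complement of an RNA sequence (5'->3' in, 5'->3' out)."""
            return "".join(COMPLEMENT[base] for base in reversed(rna))

        def design_sirna(core19: str):
            """Build (sense, antisense) 21-mers from a 19-nt stretch of the
            target mRNA: a 19-bp duplex core plus 2-nt 3' overhangs (UU here;
            dTdT is a common chemical choice)."""
            assert len(core19) == 19, "duplex core must be 19 nt"
            sense = core19 + "UU"                          # same sequence as the mRNA
            antisense = reverse_complement(core19) + "UU"  # strand that guides degradation
            return sense, antisense

        # Hypothetical 19-nt stretch of a target mRNA (5'->3'):
        sense, antisense = design_sirna("GCAAGCUGACCCUGAAGUU")
        print(f"sense:     5'-{sense}-3'")
        print(f"antisense: 5'-{antisense}-3'")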

    Still, Zamore cautions that although the technique is promising, it is too soon to know whether it will deliver. Genes are not always turned completely off, for instance, and RNAi may not be effective for genes whose protein products are unusually stable or highly abundant. Fifteen years ago, antisense methods for gene silencing and gene therapy offered similar hopes, but that has largely been a bust. Unlike antisense, however, RNAi seems to take advantage of an existing biological pathway, which just might give it a leg up.

  10. QUANTUM PHYSICS

    Microscale Weirdness Expands Its Turf

    1. Charles Seife

    Schrödinger's pipe dreams aside, no sane physicist would try to do a quantum-mechanical experiment with a cat. A cat is a large object, which tends to follow the classical laws of prequantum physics; quantum mechanics tends to hold sway over small objects, such as atoms. But physicists in Austria are breaking down the tidy distinction between large objects and small ones. In a paper that has been posted on the Los Alamos preprint archives (arXiv.org/abs/quant-ph/0105061) and submitted to Physical Review Letters, they show that a cluster of 70 carbon atoms—a veritable monster by quantum-theory standards—is governed by the archquantum law known as the Heisenberg Uncertainty Principle. These clusters are the largest, hottest, and most complicated objects that have been shown to obey Heisenberg's law, and they are helping scientists understand the increasingly fuzzy divide between the quantum and the classical.

    “It's a very good idea to try to cross that boundary,” says Christopher Monroe, a physicist at the University of Michigan, Ann Arbor. “I think these are wonderful experiments.”

    Decades ago, experimenters showed that very small things such as neutrons and protons obey Werner Heisenberg's dictum that the better you understand an object's position, the less you are able to predict its momentum. For example, when you increasingly constrict a beam of neutrons by forcing it through smaller and smaller slits, the particles take on a greater and greater range of possible momenta. As objects get larger, though, the Heisenberg Uncertainty Principle and other quantum effects such as interference become harder to measure. Some physicists, including Roger Penrose of the University of Oxford, U.K., argue that quantum effects break down as objects get larger, causing big objects to behave classically while small ones behave quantum mechanically. Anton Zeilinger of the University of Vienna in Austria disagrees. “A transition from quantum to classical as you go from micro- to macroscopic … is not going to happen in my expectation,” says Zeilinger. “It's just a question of the skill of the experimenter and how much money [there is] to perform the experiment.”

    Zeilinger and his colleagues decided to probe the hazy border between quantum mechanical and classical by looking at an object that lies between both worlds: the C70 molecule. The ungainly bigger brother of C60 (buckminsterfullerene), C70 is much more massive than the particles that physicists usually use to test the laws of quantum mechanics. At the same time, C70 is much smaller than a cat, so it can be stuffed through a small slit with a lot less difficulty. “You don't usually think of looking at such a complex system as quantum,” says Monroe, who calls the massive molecule “a nice middle ground.”

    Zeilinger's team shot a beam of C70 through an adjustable slit whose size ranged from 20 micrometers to 70 nanometers. Using a sensitive laser detector, the researchers measured the range of momenta of the C70 molecules that had passed through the slit. When the slit got smaller than 4 micrometers, the C70 began to behave like quantum objects: As the slit size decreased, the range of the molecules' momenta got broader and broader. In other words, the more that was known about the molecules' positions, the less was known about their momenta. Despite its size, the C70 was behaving “very, very precisely” as a quantum-mechanical object should, says Zeilinger. And although he notes that C70 is still too small to disprove Penrose's breakdown theory, it has extended the quantum domain farther than before. “We can work upwards slowly,” he says. Cat lovers beware.
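
    The broadening is the uncertainty relation applied directly to the slit; here is a back-of-the-envelope sketch (ours, not the paper's analysis):

        % The slit confines the molecule's transverse position to
        % \Delta x \approx a (the slit width), so Heisenberg's relation
        \Delta x \,\Delta p_x \ge \frac{\hbar}{2}
        \quad\Longrightarrow\quad
        \Delta p_x \gtrsim \frac{\hbar}{2a}
        % demands a transverse momentum spread that grows as a shrinks --
        % the fan-out in C70 momenta the detector recorded once the slit
        % dropped below a few micrometers.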

  11. ASTRONOMY

    'Invisible' Astronomers Give Their All to the Sloan

    1. Ann Finkbeiner*
    1. Ann Finkbeiner is a science writer in Baltimore, Maryland.

    The builders of the Sloan Digital Sky Survey have put their careers on hold for as much as a decade. Their reward: a chance to work on the most ambitious astronomical project in history.

    Jim Gunn is by any measure a superb scientist. He's the Eugene Higgins Professor of Astrophysics at Princeton University, a member of science academies, winner of prizes. In the course of a 35-year career he has calculated the composition of the early universe, collected quasars, surveyed distant clusters of galaxies, and cataloged the evolution of stars. He helped invent cosmology as an observational science, and along the way he designed and built the instruments that make the observations possible. But in the last decade, he says, “I basically disappeared from the science scene.”

    In 1986, Gunn came up with an idea for a digital camera that could conduct the largest, deepest, and most informative survey of the sky yet. Since then, he and a group of colleagues, from grad students to senior professors, have spent a large fraction of their careers turning his brainchild, now called the Sloan Digital Sky Survey, into reality. “I don't say I don't regret it,” Gunn says. “But if you undertake something as ambitious as the Sloan, you have to make a decision” about your career.

    Astronomers are doing more and more of these surveys of large fractions of the sky in large fractions of the spectrum, each survey resulting in terabytes of data and each requiring years of astronomers' time. The scientific return is magnificent, but the professional return, say participating astronomers, is anything but assured. For example, once the Sloan begins operations, its builders have first crack at authoring papers from the data, but the name of any one builder may get lost among those of co-authors. And as of 5 June, when the data become publicly available, the builders will have no advantage at all. Young builders fail to establish the name recognition that helps them get or keep jobs. Older builders disappear from the scientific literature and can't compete for grants. When the astronomers on the Sloan were asked how the survey had affected their careers, the first thing they did was laugh.

    But astronomers join large surveys anyway, just as high-energy physicists collaborate on multiyear international experiments, biologists spend years mapping and sequencing DNA to decode the human genome, and space scientists take decades to launch giant observatories. In the case of the Sloan, young astronomers say they're willing to gamble that being part of the survey will get them jobs. Older astronomers say such surveys are their field's future, and somebody has to do the work. “Progress requires huge directed projects and lots of time,” says Richard Kron of the University of Chicago, “and time is just the cost of admission. But the grand and bold thing is what we wanted to do.”

    The Sloan is astronomy's grandest and boldest survey yet. Past surveys provided the statistical basis for knowing which of the universe's inhabitants are common, which are rare, what families they fall into, where they prefer to live, and how they are born, age, and die. Until recently, however, most surveys in optical wavelengths have used photographic plates that detect light less sensitively and less accurately than a digital camera's charge-coupled devices (CCDs).

    The Sloan is astronomers' first large-scale digital survey, and in addition to making higher resolution images of fainter objects over a wider field of view, it finds their distances to boot. And it does all this fast: A 1991 survey by Michael Strauss, now at Princeton University, and his colleagues took 5 years and found the distances to about 5000 galaxies; the Sloan, in the first year of operation, found the distances to 90,000. By the time the Sloan is finished in 2005, it will have measured the shape, brightness, and color of 100 million objects and the distances to a million galaxies and 100,000 quasars. This June, data from the first 18 months will be arranged in a searchable, publicly available archive. And everything—from the telescope to the CCDs, the camera, the two spectrographs, the software for running them, the software for correcting and analyzing the data, and the archive itself—has been designed and built by the Sloan's participants.

    Not only have team members committed their time, but depending on their seniority and to varying degrees, they've also changed or deferred their careers. Princeton astronomer Jill Knapp's first job with the Sloan, in 1991, was writing the phone book-sized grant proposal to the National Science Foundation (NSF); her second job was coordinating the software that turns the camera's data into a catalog. “God knows what I do, but it sure takes up time,” she says. “You're sort of 100% into it.” And her pre-Sloan research on the life cycle of stars? “All gone,” she says, “all gone.”

    For David Schlegel's doctoral degree, he studied the large-scale motions of galaxies, but for the two and a half years of his postdoc at Princeton he worked on Sloan software; he raises Knapp's estimate of the commitment needed to 200%. Connie Rockosi is still a graduate student at the University of Chicago but has worked on the camera for 10 years: The standing joke, she says, is “that I'm the person with the largest fraction of her life on the Sloan.” Kron says he's still doing his pre-Sloan research (“I take a random patch of the sky and study it to death”) while helping with Sloan operations strategy, although he admits, “My publication rate has gone down.” Alex Szalay of Johns Hopkins University in Baltimore studied the evolution of the structure of the universe until he joined the Sloan in 1992 and began designing the public archive. “It was like a maelstrom, it catches you up and draws you in,” he says. “I did write papers, but essentially all my thoughts have been about Sloan.”

    Among other changes, most Sloan astronomers are newly negotiating a switch to big science. Astronomers traditionally work, and publish papers, in groups of two to four. “The whole reward system in astronomy, the way one's career advances, is by the work that you do in a small group,” Strauss says. The Sloan, however, is a collaboration of 11 institutions and 100-plus astronomers. One name hiding among 100 co-authors doesn't stand out: The risk with the Sloan is that an astronomer might invest years in the survey and wind up professionally invisible.

    The Sloan collaboration has tried to alleviate this risk by following a complicated set of operating principles. The right to use the data before they become public goes to researchers at any of the participating institutions. The right to list themselves as co-authors on any scientific paper goes to the so-called builders, those who have spent 2 years full-time on the project; their names, currently numbered at 100 or so, are listed in alphabetical order. The right to first authorship goes to those with prime responsibility for a given paper; their names precede the alphabetized list of builders. “I don't think that our authorship system is perfect,” says Zeljko Ivezic, a postdoc at Princeton who is first author on one Sloan paper so far, “but I cannot think of any better one.”

    But despite the operating principles, an astronomer who has neglected, say, galaxy evolution to spend years designing data reduction software still might end up buried in authorial midlist along with dozens of others. In a field in which visibility depends on the names on scientific papers, how do such people get hired or promoted or famous? “It's actually a very difficult question. How can the world know that they did a superb job on the software?” says Strauss. “That's a question we don't have a complete solution to.”

    Generations.

    For Connie Rockosi, Richard Kron, and John Peoples, survey work strikes different balances of risks and rewards.

    CREDIT: REIDAR HAHN/FERMILAB

    Scientific invisibility poses different problems for astronomers at different stages of their careers. For younger astronomers, says Szalay, working on the Sloan “is a long shot; it's a calculated risk.” The postdocs and untenured faculty members say they must gamble on getting jobs. “What I'm banking on careerwise is just having made a reputation within the survey for being able to get things done,” says Schlegel. “To be honest, I don't know how it's going to play out.” Ivezic says he's used his early-data rights to mark out his own science: The best bet “for youngish people,” he says, “is to define their own science that they will be recognized by. So I hope I'm on the right track with my ideas.”

    Tenured astronomers have more leeway, Szalay says: “We could still blow 5 years of our lives and not destroy our careers.” Instead, the main problem facing Sloan's tenured astronomers is funding. Astronomers who launch satellites analyze their data with funding from NASA. High-energy physicists who build colliders analyze their experiments with funding from the Department of Energy (DOE). The Sloan's funding “is patchwork,” Kron says—a combination of sponsors that includes the Alfred P. Sloan Foundation, DOE, NASA, NSF, and 13 private and public institutions. And all the funding, says the survey's director, John Peoples of Fermi National Accelerator Laboratory in Batavia, Illinois, “is strictly to get the data into the databases.” Princeton's Gunn agrees: “There's no money for science.”

    In principle, Gunn says, younger astronomers should be able to get grants, but older astronomers who have to support graduate students and postdocs are in a more precarious position: “If you haven't done anything for a long time, it's very hard to get funding.” So far, regardless of age or standing, Peoples says, “almost none of the Sloan astronomers have succeeded in obtaining new grants for research.”

    So why do they sign on with the Sloan, let alone stick it out for years? The most obvious reason is that, with data from a volume of the universe 100 times larger than that of any previous survey, Sloan's science leaves its astronomical predecessors in the dust. “We can answer questions people have been puzzling over for years,” says Strauss. “We already have more data than [they] ever did.” The data should finally answer astronomy's broadest question: how the universe grew up. That is, how the featureless gas of the infant universe evolved into the lacy network of galaxies observable today, and within that network, how the galaxies themselves evolved. “The relationship between galaxy evolution and large-scale structure,” says Strauss, was “purely in the realm of speculation 5 years ago.”

    Moreover, the Sloan's laboriously homogenized database allows astronomers to do the first real statistical studies of the sky, enabling them to define the typical and identify the weird. Astronomers plot 100,000 stars, say, by the brightness of their light in specific colors (ultraviolet, blue, green, red, and infrared), trace how they line up, and then look for the objects that fall off the line. “Going after objects of unusual colors has been a lot of fun,” says Strauss. “It's basically shooting fish in a barrel.” So far, Sloan astronomers have snagged eight of the 10 most distant quasars known. Close by, they've found stars that are barely stars at all, small and dim and so cool that their atmospheres contain molecules of water and methane. On the day of “the methane stars news” in May 1999, Princeton's Knapp says, “the grapevine—well, I came in and there were 200 e-mails from all over the world—‘Jill,’ ‘Jill,’ ‘Jill, what's going on?’ I've never seen a thing like that before or since.”
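
    Schematically, such a search amounts to fitting the locus that ordinary stars trace in a color-color diagram and flagging whatever falls far off it. The Python sketch below illustrates the idea on synthetic data; the color pair, cubic fit, and 5-sigma cut are our illustrative choices, not the Sloan pipeline.

        # Toy color-outlier search: fit the stellar locus in a color-color
        # plane, then flag objects far off it (quasar/cool-dwarf candidates).
        import numpy as np

        def find_color_outliers(g_r, r_i, nsigma=5.0):
            """Fit a cubic locus r-i = f(g-r); return indices of objects whose
            residual from the locus exceeds nsigma standard deviations."""
            coeffs = np.polyfit(g_r, r_i, deg=3)        # the line most stars follow
            residuals = r_i - np.polyval(coeffs, g_r)   # distance off the locus
            return np.flatnonzero(np.abs(residuals) > nsigma * residuals.std())

        rng = np.random.default_rng(0)
        g_r = rng.uniform(0.2, 1.4, 100_000)               # synthetic g-r colors
        r_i = 0.5 * g_r + rng.normal(0.0, 0.05, g_r.size)  # a tight synthetic locus
        r_i[:10] += 2.0                                    # plant ten "weird" objects
        print(find_color_outliers(g_r, r_i))               # recovers indices 0-9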

    Another reason everyone cited for sticking it out is a spirit of community. “I just find it incredibly satisfying to bat ideas around with other people,” Gunn says. Szalay says it was a “thrill to work [with] the people you really admire through your whole career.” Knapp agrees: “These are fantastically good people. It's just a privilege to be in the same room with them.” In particular, Ivezic says, “it's wonderful for someone young”—a judgment that 29-year-old Rockosi confirms. “I've got 200 people asking me when I'm going to finish my thesis,” she says. “I think they even care.”

    The deepest reason for sticking with the Sloan is that it's the shape of things to come. Eventually, astronomers hope to merge the Sloan's database with the databases of surveys in other wavelengths, from x-ray through optical to infrared and radio, and create a full-spectrum digital sky. “Once we figured out where the Sloan is leading,” says Szalay, “it was obvious that the problem we are facing is much bigger than the Sloan.” Szalay, along with astronomers around the world, is putting together the so-called Virtual Observatory, a group of linked databases in 15 different wavebands, all searchable by the same yet-to-be-created methods (Science, 14 July 2000, p. 238). Everyone calls this era of grand integration the new astronomy. In the new astronomy, says Gunn, “you no longer have to go to a telescope to answer your questions. You can just look it up in the database.”

Sloan's younger astronomers in particular are betting their careers on the new astronomy. “Once you finish Sloan and cross-correlate with all the other catalogs,” says Eric Peng, a graduate student at Johns Hopkins University, “astronomy's going to be different. Partly I want to learn how to do science in that mode, with statistics, having large databases.” Ivezic wants to compare 10 million stars from the Sloan with another 10 million from an infrared survey, the Two Micron All Sky Survey: “It is impossible that there would be no good results.” Peter Kunszt, who spent his postdoc at Johns Hopkins working on the software to run Sloan's archive, was hired at CERN, the European particle physics laboratory near Geneva, to help create the archives for its next big experiment, the Large Hadron Collider. “All this working on the Sloan paid off,” he says. “CERN's job description fit my person 100%.” Szalay thinks that by giving young scientists the experience of writing scientific software, “we will see in 10 years from now that we actually trained a whole generation of people who are much better adapted for this new world. I think we gave them skills which they could not have otherwise.”
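    The comparison Ivezic describes rests on one workhorse operation: matching objects between two catalogs by sky position. Here is a hedged sketch of that step, assuming a simple nearest-neighbor match on a k-d tree; the function names and the 2-arcsecond tolerance are invented for illustration, not taken from either survey's software.

        import numpy as np
        from scipy.spatial import cKDTree

        def radec_to_xyz(ra_deg, dec_deg):
            """Sky coordinates to unit vectors, so 3D distance tracks angle."""
            ra, dec = np.radians(ra_deg), np.radians(dec_deg)
            return np.column_stack((np.cos(dec) * np.cos(ra),
                                    np.cos(dec) * np.sin(ra),
                                    np.sin(dec)))

        def cross_match(ra1, dec1, ra2, dec2, tol_arcsec=2.0):
            """For each object in catalog 1, index of the nearest catalog-2
            object within tol_arcsec, or -1 if none lies that close."""
            tree = cKDTree(radec_to_xyz(ra2, dec2))
            # Chord length on the unit sphere equivalent to the angular tolerance.
            tol = 2.0 * np.sin(np.radians(tol_arcsec / 3600.0) / 2.0)
            dist, idx = tree.query(radec_to_xyz(ra1, dec1), distance_upper_bound=tol)
            idx[np.isinf(dist)] = -1     # query marks misses with dist = inf
            return idx

        # Tiny mock catalogs: the lone object matches catalog 2's first entry.
        print(cross_match(np.array([10.0]), np.array([20.0]),
                          np.array([10.0, 50.0]), np.array([20.0, -5.0])))  # -> [0]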

    Ribbons of stars.

    Sweeping the sky, the Sloan's five-color digital camera (bottom) records everything in sight.

    CREDITS: (TOP TO BOTTOM) SDSS; MICHAEL A. CARR/SDSS CAMERA PROJECT

    In the end, the payback on such a large investment is intangible; it's the reward of working on something truly important, and it looks surprisingly like altruism. “I know what the potential of this thing is,” says Gunn. “I'm old enough and there are enough technical problems left that I'm not going to have time to do much science with it. So I've managed to work that around in my head in a way that I can just feel fatherly about this. These are my children doing this wonderful thing with this project that I've worked on for some substantial fraction of my life. And I'm happy to have had my part in making it happen.”

    “Your own papers have legacy value of 2 to 3 years,” says Szalay. “But you'd like to leave a legacy that's hardwired. It's so rare that you can be part of something that really changes the field. And you know, to be able just to say, ‘I worked on this thing, I was one of the 100 who did this.'”

Rockosi says she has heard science called a tapestry woven by multitudes of hands. “Sloan will be used by many, many people for many, many years. Sometimes I wake up at 3 a.m. and think, ‘What am I doing?’ But even if I do no other astronomy, I've contributed a lot. And it's not just a thread I'm contributing to, it's the warp.”

  12. ASTRONOMY

    Funding the Sloan

    1. Ann Finkbeiner*
    1. Ann Finkbeiner is a science writer in Baltimore, Maryland.

    Astronomers around the world will get their first look at the Sloan Digital Sky Survey's treasure trove of information next month, when the survey opens up its database for the first time. Even the highly focused group of scientists who put their careers on hold to see the venture through (see main text) must have wondered if this day would ever come. The financial “ups and downs of the survey have been tortuous,” says Jim Gunn of Princeton University. “We've nearly lost it so many times.” It survived thanks to an extraordinary patchwork of funding. The Sloan survey may look like the epitome of big science, but it was stitched together with small-science funds.

The project began in 1988, when Gunn, Steve Kent of Harvard University, and Richard Kron and Don York of the University of Chicago wanted to survey the sky using a small, dedicated telescope fitted with two spectrographs that together take 640 spectra simultaneously and the largest digital camera yet built, which makes images in five colors. “We didn't have any money,” says Gunn, “and so Don and I tried to drum up support at our respective universities.” Gunn's department chair, Jerry Ostriker, told him, “You build it and I'll pay for it.” (Ostriker, Gunn says, “is a mastermind at finding money.”) Not long after, says Gunn, a Princeton alumnus with a career on Wall Street “walked into Jerry's office and said, ‘I've always been interested in astronomy and I want to give some money. Do you want any?’ And that was the seed money that got us going.”

    Shortly thereafter, Ostriker happened to hear that the Alfred P. Sloan Foundation was looking for a bigger-than-usual project to fund. “And boy,” says Gunn, “did we have a deal for them.” The survey group sent in a proposal, and the foundation wrote a check for $8 million, Gunn says, “which we confidently said would be over half the capital cost.” The foundation's support, says Gunn, “is why it's the Sloan Digital Sky Survey.”

    Meanwhile, astronomers at the University of Chicago talked John Peoples, the director of the Department of Energy's Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, into joining the survey and writing a letter to the president of the University of Chicago. The university's president wrote back, “If Fermilab is in, it must be OK,” Peoples says; so both Chicago and Fermilab joined. A little later, Ostriker was on a site visit at Johns Hopkins University in Baltimore, which had a reputation for building spectrographs. “Jerry realized that we could be of help to Sloan,” says Hopkins's Alex Szalay. Hopkins joined and built two spectrographs. About the same time, astronomer Bruce Margon urged his school, the University of Washington, Seattle, to join; their engineering group built the 2.5-meter telescope in Apache Point, New Mexico. Chicago's eventual responsibility was the software—the “pipeline”—that got the data from the spectrographs into a database at Fermilab. Princeton's responsibility was the camera, which Gunn designed and built, and the pipeline that got data from the camera into the same Fermilab database. Fermilab's responsibility was to integrate the pipelines and feed them into a common archive, which was designed and built by Hopkins, and which becomes public in June. Technologically, Gunn says, “we were really pushing the envelope.”

    “Now, it is a sad thing about us,” Gunn adds, “that from the very beginning we have always been tremendously behind schedule and tremendously over budget.” The initial budget was low, Szalay says, “because the expectation was that faculty members at the different institutions would all chip in 50% of their time.” The reality proved much less rosy, Gunn recalls: “If you think we were 3-year-olds, you're right.”

    Vision thing.

    The Sloan Survey's 2.5-meter telescope in New Mexico went to work after years of planning—and much creative fund raising.

    CREDIT: REIDAR HAHN/FERMILAB

    A second expectation was that the pipelines and archive could be built with off-the-shelf astronomical software. Unfortunately, says Jill Knapp of Princeton, that software “was never intended to handle anything remotely as big as the Sloan.” The Sloan software had to be written from scratch, Szalay says—more than a million lines of code in all. And since “we didn't have volunteers flocking in to write difficult code,” Gunn says, “we had to pay people.”

    With the equipment built and programmers paid for, after management crises, several near-bankruptcies, and a broken secondary mirror, a completed Sloan will have cost $84.1 million, Peoples says—not including the “incalculable but tangible value” of faculty and administrators' time. The Sloan Foundation has contributed about $20 million; Fermilab, Los Alamos National Laboratory, and the Navy together, over $28 million. The National Science Foundation and NASA have each put in about $5 million; the universities together, $9.8 million; and international partners, over $9 million. The survey still needs another $6.5 million. Peoples, who is now the survey's director, says, “Trying to keep this afloat has been interesting.”
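    Taking the stated figures at face value, a back-of-the-envelope tally (in millions of dollars) shows how the patchwork adds up:

    \[
    20 + 28 + (5 + 5) + 9.8 + 9 \approx 76.8, \qquad 84.1 - 76.8 \approx 7.3,
    \]

    which squares, within the rounding implied by “about” and “over,” with the $6.5 million still outstanding.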

    The upshot of Sloan's tight budget, Gunn says, is that it has no funds earmarked for research. When NASA budgets its big observatories, by contrast, it includes a few percent for the costs of research using the data. “We felt the budget should be to produce data, not to analyze it,” says Kron. “Otherwise it would drive costs to the unaffordable.” Gunn would like an institute set up to disburse research funds, “but I've been banging my tiny feet about it for a long time without any success.” Peoples says that raising money for research “is on my mind, but first we have to finish the survey and pay the bills.”

    In the end, the Sloan survey's troubles may be the inevitable price of a unique, ambitious (Peoples prefers the word “pioneering”) project run by professional astronomers who are amateur managers. Since a management overhaul 3 years ago, however, the survey has stayed on schedule, financially and technically. Hirsh Cohen was vice president of the Sloan Foundation and has been the survey's program officer. “The winds were strong and the waves were high,” he says, “but I'd go through the adventure again. It's setting a new mode of study: reaching out and grabbing in data by the gigabyte. We're proud of it.”

  13. ACADEMIC COLLECTIONS

    Japan Shakes the Dust off Treasure Trove of Specimens

    1. Dennis Normile

    Valuable specimens that have long languished out of sight of both researchers and the public are being cataloged and put on display in university museums

    KYOTO—For decades, Japanese university scientists have traveled around the world collecting everything from fossils and mineral samples to pottery shards and plant specimens. But after being shipped home, the material all too often gathered dust in basements and odd corners around campus. Its value was lost not only to researchers, but also to professors who wanted their students to get a hands-on view of doing science. “I had to send the students to a big museum somewhere to see good specimens,” says Kyoto University paleontologist Terufumi Ohno.

    Next week Kyoto University will begin to turn that situation around with the opening of a $50 million museum that showcases its fine collection of paleontological samples. It's part of a trend that's sweeping Japan's campuses: Seven university museums have been established in the past 4 years, with five more on the drawing boards. Kyoto's museum is the first to receive government funds for a new building, with proper exhibit and storage facilities. “A lot of resources have been wasting away,” says Yoshihide Akatsuka, deputy director of the Scientific Research Institutes Division of the science and education ministry, which estimates that such academic storehouses contain up to 90% of the 25 million scientifically valuable samples in Japan.

    One reason for the sorry state of the current collections is the historic autonomy of Japanese professors. Individual departments or laboratories are unofficially responsible for the samples, but there has been little time and money to catalog and display the material properly. A lot of specimens have simply been thrown away, suspects Tokuhei Tagai, a mineralogist at the University of Tokyo Museum. “It's really a shame,” he says.

    One small exception has been at the University of Tokyo, which in 1966 set up what in English was called a “museum.” But its Japanese name, “materials depository,” more accurately described its function. “There was little curatorial work and no exhibits or displays,” says Tagai.

Indeed, when university officials were preparing an exhibition to mark the university's 120th anniversary in 1998, their search of departmental storerooms turned up a treasure trove. Among the materials were a book of botanical specimens collected in Japan by the renowned 19th century German naturalist Philipp Franz von Siebold and hundreds of photographic plates of the campus just before the devastating 1923 earthquake. The officials discovered more than 6 million items, most of them spottily cataloged and all very difficult for scientists and scholars to access. In 1997, the university converted the materials depository into a museum, just as an Education Ministry advisory committee recommended the creation of additional university-based museums.

    Museum officials say the increased accessibility of the collections is already paying off scientifically. Archaeologists from the University of Tokyo, for example, recently used jade samples from the museum's collection to prove that the jade used in Korean ornaments that are up to 5000 years old must have come from Japan. That conclusion supports theories that traders crossed the Japan Sea earlier than historical records indicate.

    On display.

    Kyoto University's Terufumi Ohno stands in front of the new University Museum. Its opening next week is part of a move to showcase valuable academic collections like the exhibit “Bone” at the University of Tokyo (below).

    CREDITS: (TOP TO BOTTOM) D. NORMILE; COURTESY OF UNIVERSITY OF TOKYO MUSEUM

    The universities also hope that the new displays will be popular with the public. Drawing on materials from its collection, Tokyo has mounted an exhibit of human and animal skulls and skeletons, called “Bone,” that explains how scientists use bones to establish evolutionary links and make deductions about prehistoric environmental conditions. Kyoto is using its permanent exhibit to highlight ongoing research, such as a diorama with catwalks strung through a tree canopy to simulate the university's Ecology Research Center in Borneo. Another corner features Kyoto's Primate Research Institute, with video clips that show how wild chimpanzees use stones to crack open nuts or how a mother deals with the death of her infant. Nobuo Shigehara, a primatologist at the institute, hopes to update the exhibits periodically if he gets sufficient money to do so.

    The Education and Science Ministry is providing small sums to support some of the museums, and ministry officials are cautiously optimistic that they can provide modest funding increases. But a governmentwide campaign to reduce the number of employees has forced universities to operate their museums with undersized crews. Tokyo has nine faculty members assigned to the museum, while Kyoto has just six—and no trained curators or technicians. “Colleagues from overseas laugh when I explain this,” says Kyoto's Ohno, who welcomes the chance to show his students some of the school's collection, including Precambrian bivalve fossils that he studies. “They wonder how we can have a museum without curators and technicians.”

    Ohno hopes to ease the burden by using 30 retired professors who have offered their time, either to lecture or to help classify and catalog specimens. Tokyo's Tagai says that much of the restoration and cataloging must be done piecemeal, often as part of setting up a new exhibit. Despite these obstacles, both men hope that the opening of the Kyoto museum signals a new era of openness and support for scientific collections.

  14. EGYPTIAN SCIENCE

    A Biotech Gambit in the Desert

    1. Lone Frank*
    1. Lone Frank is a science writer in Copenhagen.

    To create a homegrown biotechnology industry from scratch, Egypt must reverse a decades-long scientific exodus

ALEXANDRIA—Ringed by oil refineries and factories, three glass and concrete pyramids rise like a fata morgana from the sand 20 kilometers south of this ancient Mediterranean harbor. This pharaonic vision, the Mubarak City for Scientific Research and Technological Applications (MUCSAT), is buzzing with scores of scientists who have flocked here hoping to put their country on the biotech map. “Mubarak City is a tremendous step for Egypt,” insists immunologist Malak Kotb of the University of Tennessee, Memphis. “MUCSAT will propel Egypt into the world of advanced technology,” predicts the country's minister of higher education and scientific research, Moufid Shehab.

    President Hosni Mubarak issued a decree to create MUCSAT in 1993 and handed the project to then-research minister Venice Kamal Gouda, an ardent believer in Egypt's potential to develop a biotech industry from scratch. (The country does not yet have a single biotech company.) Gouda put together a network of expatriate molecular biologists to help come up with a winning formula. Various snags strung out the construction of the main labs—housed in the pyramids—for 7 years, but the $24 million center finally opened last August with two of 10 planned institutes—one devoted to informatics, the other to genetic engineering—up and running. Institutes on everything from lasers to desert research are slated for launch over the next 7 years.

    Part of the rationale for this postmodern Giza is to try to stanch a decades-long brain drain that has seen the exodus of many of Egypt's finest minds, including Nobel laureate chemist Ahmed Zewail, who was born in Alexandria and now works at the California Institute of Technology in Pasadena. “Many [expatriates] choose not to return because conditions for doing research are difficult,” says geneticist Samia El-Temtamy of the National Research Center in Cairo. Hardships include salaries of about $200 per month, aging equipment, and red tape that can delay orders for months. “If [politicians] seriously want this project to succeed,” says molecular biologist Hisham Moharram, “they need to face the hornet's nest of dealing with bureaucracy.”

    Another challenge is the country's economy, now in the third year of a serious recession triggered by a soaring budget deficit and declining exports. Mubarak City scientists are banking on promises of steady funding from the government and on the continued commitment of Mubarak, who in speeches often cites investment in science and technology as a key to economic recovery. However, MUCSAT is supposed to set up a technology incubator and eventually attract a large share of its funding from industry.

    Niggling setbacks have taken some of the gloss off the venture. The uneven pace in finishing the labs, for example, has left researchers in an odd bind: Although MUCSAT has outfitted the pyramids with big-ticket items such as DNA sequencing machines, a shortage of less expensive but vital pieces of equipment, such as freezers and centrifuges, has slowed many experiments. Such poor planning has given some researchers misgivings about MUCSAT's management. “Our administration is not up to the task,” asserts Moharram, whose group is waiting for the installation of a tissue culture lab, now months overdue. And although the government has promised $31 million over the next 5 years for salaries and overhead, Mubarak City lacks such essentials as journal subscriptions and a travel budget for sending scientists to conferences abroad.

    Postmodern Giza.

    Biotech takes root in Egypt.

    CREDIT: L. FRANK

    Although far from ideal, MUCSAT's facilities have sparked collaborations with more impoverished, but experienced, colleagues in academia. “In universities, equipment is typically scattered and often not working,” explains physicist Salah Arafa of the American University in Cairo. MUCSAT—which does not grant degrees—and the universities have jointly enrolled about 100 master's and Ph.D. students to date.

    Without the lure of MUCSAT's budding science mecca, some of these students might have pursued degrees abroad. “Egypt needs to create distinguished universities and a mechanism for selecting the truly talented,” says Zewail, who is helping establish a technical university near Cairo. So far, Mubarak City has managed to retain some fine young minds, says Osman Shinaishin of the U.S. National Science Foundation, who has visited MUCSAT twice under the auspices of the Gore-Mubarak initiative, which funds joint projects. That program and one run by the United Nations pay for Egyptian scientists in foreign labs to set up collaborations with colleagues back home. “We cannot force scientists to return,” says Abdul Latif El Sharkawy, secretary-general of Egypt's Supreme Council for Research Centers and Institutes. Instead, he says, “we try to make use of those who elect to stay abroad.”

    Much of Mubarak City's research is geared toward projects with an imminent payoff: culturing bacterial strains that are more efficient at processing sewage, for example, or developing cheap diagnostic tests for human diseases. “Mubarak City is really the only place to go” to set up a biotech lab in Egypt, says U.S.-trained immunologist Maha El Demellawy, now at MUCSAT. She's collaborating with her mentor, Kotb, on a project to look for genetic susceptibility to hepatitis C, now rampant in Egypt. Scientists are free to pursue basic research, if they can find funding. “We strongly encourage researchers to apply for grants,” says MUCSAT director Ahmed El-Diwan.

    Of Mubarak City's 160 scientists, half have worked abroad. Some cite patriotism as a key factor in their decision to come home. “Many of us sense a responsibility to help boost the country's socioeconomic development,” says MUCSAT biochemist Hesham El-Enshasy, who trained in Germany. He and others are hoping that their dreams of a biotech industry, like so many other desert visions, do not evaporate before their eyes.

  15. GEOLOGY

    Thermal Features Bubble in Yellowstone Lake

    1. Kevin Krajick*
    1. Kevin Krajick is the author of Barren Lands: An Epic Search for Diamonds in the North American Arctic.

    Newly discovered vents, spires, and other features make the seemingly placid lake a center of geothermal activity—and raise the possibility of dangerous blowouts

Yellowstone Lake has always seemed a haven of normality amid the thermal oddities of surrounding Yellowstone National Park. Visitors check out the earth's heated contortions at Mammoth Hot Springs and Old Faithful, then go to Yellowstone Lake to fish for trout. And although scientists have been investigating the park's thermal wonders for decades, only now are they focusing on the 350-square-kilometer water body. On the deep bottom, researchers have discovered a panoply of unusual hydrothermal vents, spires, craters, and domes—activity so intense that the calm lake now appears to be the true centerpiece of the park's geothermal features.

    Venting much of the park's heat and affecting creatures from bacteria to bears, the submerged geothermal features promise to shed new light both on Yellowstone's complex subterranean workings and on the biology of extreme environments. Unfortunately, they also appear prone to massive explosions of steam and may be much more dangerous than huggable favorites like Old Faithful. “It's a startling discovery,” says John Varley, Yellowstone's director of research. “It would be the most spectacular part of the park, if you could see it.”

Geologists, geochemists, geophysicists, and biologists are now zeroing in on the lake, probing it with thermal surveys, scuba gear, a small remotely operated vehicle (ROV), and high-resolution sonar. With 40% of the bottom surveyed, several teams plan a big push this summer to cover more, and the National Park Service is hosting a major conference in October to present the results. Biologists, for instance, are hoping for more data on the unusual organisms clustered in and around the vents and are planning to take biological samples with a snakelike “vental rooter.” “I think there are layers of unseen creatures down there,” says microbiologist Russell Cuhel of the University of Wisconsin, Milwaukee (UWM).

    Hydrothermal pressure cooker

Yellowstone Lake straddles a corner of one of Earth's largest and most active volcanic calderas, so it makes sense that it hosts volcanism-related features. However, the scale and variety now being brought to light are far beyond expectations. The lake is literally a hot spot, says geophysicist Robert Smith of the University of Utah in Salt Lake City, who presented decades' worth of data on its heat flow at the fall 2000 meeting of the American Geophysical Union in San Francisco. Although the whole Yellowstone caldera is hot—it consistently puts out an average of 2 watts of heat per square meter—some spots in the lake radiate 20,000 times the average background. Overall, the lakebed may account for 20% of the 8987-square-kilometer park's heat. The lake, thought to reach a maximum depth of 120 meters, holds water that is cold near the bottom, about 4 degrees Celsius, but in some localities Smith and his partner David Blackwell of Southern Methodist University in Dallas have found sediments hitting 118 degrees just 3.5 meters beneath the lake floor. Seismic profiling suggests that the heat comes from huge pockets of high-pressure steam circulating up from magma chambers. These pockets are peculiar to the lake, because they are kept from exploding only by the weight of the water. “It's like a pressure cooker: Lift the lid, and it will blow you away,” says Smith.
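    A rough calculation shows what those numbers imply:

    \[
    2\ \mathrm{W/m^2} \times 8987\ \mathrm{km^2} \approx 18\ \mathrm{GW}, \qquad 0.20 \times 18\ \mathrm{GW} \approx 3.6\ \mathrm{GW} \text{ over } 350\ \mathrm{km^2} \approx 10\ \mathrm{W/m^2},
    \]

    so the lake floor averages roughly five times the caldera's mean output, and the hottest patches, at 20,000 times the 2-watt background, approach 40 kilowatts per square meter.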

    And evidence is building that the lid has repeatedly been lifted in the recent past. For decades, scientists thought that the lake's level had steadily sunk since glaciers receded 12,000 to 15,000 years ago—a trend that would be expected to allow steam to escape gradually and thus prevent giant blowouts. But long-term measurements of the caldera's ground level now show that the bed of the Yellowstone River, which drains the lake, “breathes” up and down heavily with magmatic processes, cyclically damming and undamming the outflow, says Daniel Dzurisin of the U.S. Geological Survey (USGS) in Vancouver, Washington.

Even in the short term, the elevation of the outflow rose nearly a meter from 1923 to 1984, then deflated, then started rising again in 1995. And on longer time scales, water levels apparently go through huge fluctuations—as does the pressure, or lack thereof, on the steam pockets. That idea is bolstered by the discovery of old beaches both above and below the current shoreline and separated by as much as 30 meters in elevation, say USGS geologist Ken Pierce of Bozeman, Montana, and National Park Service archaeologist Ken Cannon. Stone tools tied to paleo-Indian cultures plus carbon dating of organic material in the beaches suggest major up-and-down cycles of 1000 to 3000 years.

    All this fluctuation may trigger immense buildups and explosive releases of steam. A USGS bathymetry team led by Lisa Morgan of Denver, Colorado, has taken a close look at several lake bays plus separate nearby lakes and confirmed that they are deep blowout craters, lined with shattered breccia. These include Mary's Bay, which blew up 10,800 years ago, and Indian Pond, dated at 3000 years old. Under one old beach, jumbled projectile points and a buried layer of large pebbles point to a tsunami-like wave correlated with the Mary's Bay blowup. And since 1997, Morgan's team has also spotted dozens of smaller craters—still radiating heat—on the lake floor.

    Blowouts could also follow movements in a large fault line recently observed to run into the lakebed, notes geologist William Locke of Montana State University in Bozeman. Whatever the trigger, says Pierce, “you don't want to be around when something happens.”

    Thus there's “a possibility”—although not “a probability”—that picnickers might look up some afternoon to see a tsunami bearing down, admits park geologist Paul Doss. He plans to take precautions, especially because the USGS has also spotted domelike structures on the lakebed, apparently pumped up by steam—possible precursors of craters. Some are only a few meters across, but one is a kilometer long and peaks at 50 meters. There are no such domes known on dry parkland, possibly because steam there escapes gradually, says Morgan, who wants to put tiltmeters on the domes to warn of sudden growth.

    Mysterious spires

    Starting in 1997, the USGS and a UWM team headed by biogeochemist Val Klump also started spotting more benign features: forests of mysterious pinnacles and spires, apparently formed by hydrothermal activity, jutting 6 meters or more from deep canyons and craters. One set of spires is lined up in a 0.4-kilometer-long rank, possibly aligned with a fault. Too narrow to be clearly resolved by sonar, the spires were missed by surveys until the ROV shared by the USGS and UWM took down a video camera.

    Such spires are seen chiefly in seas, not lakes, and their formation in Yellowstone remains something of a mystery. Most ocean spires occur when hot fluids emerge from a hydrothermal vent and loads of metallic sulfide minerals precipitate into the cold water. Precipitation may be partly responsible for the Yellowstone spires, but other processes are apparently at work too. A retrieved 1.5-meter chunk is mostly silica, like the rocks below, but contains what look like silicified remains of bacteria, as well as the siliceous shells of diatoms. Dried, the chunk is like a sponge, weighing only a few pounds; a computerized tomography scan done by the UWM team suggests that the interior is a maze of interconnecting spaces, unlike the weighty sea spires, which have a central channel like a chimney. Morgan thinks that organisms may have formed the labyrinthine structures directly over vents, while other researchers think microbes and diatoms may have simply drifted down from the water column. UWM's Cuhel suggests that sediment may have squeezed up through cracks in the dying glaciers to form the spires; one fragment is dated at 11,000 years, just after the ice receded.

    The puzzle is made all the more difficult by the fact that no one has yet found a spire actively venting. Within the park there are a few similar-looking structures now on dry land, for example in Monument Geyser Basin, that may have been formed underwater years ago, says Morgan.

    Spire-builders.

    A chunk of a silica-rich spire reveals the cylindrical shells of diatoms.

    CREDIT: GREG MEEKER AND ISABELLE BROWNFIELD/USGS

    Although the spires appear dead, there are at least 150 hot vents in the lake, many swarming with creatures. Some vents have built themselves little fingers or beehive piles of minerals, while others are just dark, quarter-size holes in the sand, bubbling hot fluids or gases. Some are much hotter than Yellowstone's land vents, says Klump, whose team has scuba-dived to shallow vents and now uses an ROV to reach deeper ones. The vents pour out carbon dioxide, methane, hydrogen sulfide, and iron—all possible grist for the metabolisms of various microbes—as well as mercury, thallium, barium, arsenic, molybdenum, tungsten, boron, and lithium, which the organisms may not find quite as desirable.

    The ROV has in fact spotted bacteria and other creatures swirling around vents “like little cyclones,” says Cuhel. Meterwide fissures display the colors of the rainbow along their length, depending on the specialized kinds of bacteria living there. Samples show dozens of strains including a new sulfur- and heat-loving genus, Thermodesulfovibrio, described in 1994 by microbiologist James Maki of Marquette University in Milwaukee, Wisconsin. Nearby, stringy white mats of more common chemosynthesizers like Beggiatoa cling to any available substrates, including stray tree branches. Diners on the bacteria include sizable water fleas, hydra, and snails, as well as worms that live by the squirming handful in the sediments. Identical species are normally found in shallow water, but at the food-heavy vents, they appear to have adapted to live in far denser colonies as far down as 100 meters.

The grazers are in turn pursued by 5- to 8-centimeter predatory leeches, which are sometimes found “fried”—apparent victims of hot-water blasts, says UWM invertebrate ecologist Jerry Kaster. At one vent dubbed the “Trout Jacuzzi,” native cutthroat trout—usually cool-water, near-surface dwellers—have been observed charging bacterial mats, although no one knows what if anything they are eating. The only organisms missing are weird macrofauna such as the giant mussels and tubeworms seen at deep-sea vents; the lake is too young for such organisms to have evolved.

    The vents' effects may even filter up to land. USGS geochemist Pat Shanks has shown that lake water has high concentrations of arsenic and mercury, probably derived from the vents; he has also shown that trout, topping the aquatic food chain, concentrate mercury, sometimes near the federal danger zone of 1 part per million. The trout in turn feed ospreys, otters, and grizzly bears—not to mention people—and a couple of lakeside bears have turned up with elevated mercury levels in their blood. Whether wildlife—or tourists—suffer any ill effects is unknown, but at least for the animals, the benefits may outweigh the drawbacks, suggests Morgan: “Because of all the nutrients going to the bacteria and going up the chain, there may be a lot more food in the system.”

    In any case, Yellowstone Lake may need more protection from humans than vice versa. Since news of the spires leaked out in the Billings Gazette last year, recreational divers have been pressing officials for maps; they want to go down and take a look. So far, Doss and Morgan have stonewalled, fearing visitors might cause damage. “It could be dangerous down there,” says Doss, “but it's also very beautiful and delicate.”

  16. EVOLUTION

    Putting Limits on the Diversity of Life

    1. Richard A. Kerr

    A new way to analyze the fossil record is suggesting that life, despite its many evolutionary innovations, long ago hit a limit to its diversity

    Life only gets bigger and better, or so one might infer from the history of marine life that paleontologists have drawn up in the past few decades. The number of distinctly different kinds of organisms living in the sea may have taken a hit now and again, but over the long haul ocean life has only become more and more diverse, the data seemed to show. But now a recount of fossils is giving paleontologists second thoughts about that ever-upward trend.

    In a new record of life published this week in The Proceedings of the National Academy of Sciences (PNAS), 25 paleontologists demonstrate a fresh approach to extracting a history of life from an imperfect fossil record imperfectly sampled by paleontologists for 180 years. And the group's first, preliminary results suggest that previous studies may indeed have overstated life's penchant for diversification. Although far from the last word, the method marks a turning point in the study of paleontological databases, says co-author Richard Bambach of Virginia Polytechnic Institute and State University in Blacksburg. “This isn't the end of the story, but it's a step in the direction we want to go,” he says.

    Paleontologists wouldn't need a new record of diversity if their predecessors had done it right the first time—laboriously sampling and documenting all the accessible exposures of fossil-bearing rock around the world. Instead, earlier researchers searched most intensively nearest to home—mostly Europe and North America—and often pursued paleontological novelty rather than systematic surveying. Making the best of a bad situation, one paleontologist spent 20 years compiling a now-classic diversity record from the hodgepodge of published and unpublished data. John “Jack” Sepkoski of the University of Chicago, who died in 1999 at the age of 50, combed reports of marine fossil finds and created a “phone book” that listed each fossil genus by name, the first time it appeared in the fossil record, and the last time it appeared. From that database he plotted a diversity curve (see left figure) that rose sharply 500 million years ago as life burgeoned in the Cambrian “explosion,” plateaued in the Paleozoic era, and then rose steeply and steadily from the Mesozoic to the present day. Read literally, Sepkoski's curve implied that the great Permian-Triassic mass extinction at the end of the Paleozoic 250 million years ago somehow reignited life's drive to diversify. Life in the seas became more active, more predatory, and continually more varied as the so-called modern fauna found new ways to thrive.
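    The phone-book format is spare enough that a diversity curve falls out of it almost mechanically: count how many genus ranges span each slice of time. A minimal sketch, with invented genera and dates rather than anything from Sepkoski's compilation:

        def diversity_curve(ranges, bin_edges):
            """ranges: (genus, first_Ma, last_Ma) tuples, ages in millions of
            years ago; bin_edges: descending ages delimiting time bins.
            Counts the genera whose ranges overlap each bin."""
            counts = []
            for older, younger in zip(bin_edges, bin_edges[1:]):
                counts.append(sum(1 for _, first, last in ranges
                                  if first >= younger and last <= older))
            return counts

        phone_book = [("Genus_A", 520, 252),   # invented entries, in Ma
                      ("Genus_B", 400, 66),
                      ("Genus_C", 500, 0)]
        print(diversity_curve(phone_book, [550, 250, 0]))   # -> [3, 2]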

    Upward or onward?

Where Sepkoski's numbers (left) told of steadily increasing marine diversity, a recount (right) shows a 400-million-year plateau.

    CREDITS: (LEFT) MARK NEWMAN/SANTA FE INSTITUTE/PNAS; (RIGHT) ALROY ET AL./PNAS

A nice story. But has diversity really tripled since the Paleozoic? “We may have been misled for 20 years,” says paleontologist Douglas Erwin of the National Museum of Natural History in Washington, D.C. “There may be some real problems.” The biggest may be that the harder paleontologists look at a particular rock outcrop or at outcrops from the same slice of geologic time, the more kinds of fossils turn up. With the meager information in his phone book-style compilation, Sepkoski had no way of correcting for the wide range of sampling intensities from time to time or place to place or for other potential biases.

    To fill the gap, paleontologists John Alroy of the University of California, Santa Barbara (UCSB), Charles Marshall of Harvard University, 23 colleagues, and dozens more contributors are assembling the Paleobiology Database. Housed at the National Center for Ecological Analysis and Synthesis at UCSB, the still-growing database records a range of information beyond first and last appearance. Perhaps most important, it contains the reported occurrences of a genus through time and space.

    Drawing on the compiled information, Alroy and his colleagues sliced geologic history into 10-million-year intervals and used statistics to try to even out the effects of varying sampling intensity in each interval. To bracket as many ways of correcting for bias as possible, the group tried four ways to standardize the data and two ways to count fossil occurrences—a total of eight different statistical recipes.
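    One of those standardization ideas can be shown in miniature: draw the same number of occurrences from every interval and count the genera that turn up, averaging over many random draws so heavily and lightly sampled intervals meet on equal footing. The toy version below illustrates the principle only; the paper's actual recipes are more elaborate.

        import random

        def subsampled_diversity(occurrences, quota, trials=1000, seed=42):
            """Mean number of distinct genera among `quota` occurrences drawn
            at random (without replacement) from one interval's records."""
            rng = random.Random(seed)
            return sum(len(set(rng.sample(occurrences, quota)))
                       for _ in range(trials)) / trials

        # A heavily and a lightly sampled interval, compared on equal footing:
        rich = ["A"] * 50 + ["B"] * 30 + ["C"] * 20 + ["D"] * 5
        sparse = ["A"] * 10 + ["B"] * 5 + ["C"] * 2
        quota = min(len(rich), len(sparse))
        print(subsampled_diversity(rich, quota), subsampled_diversity(sparse, quota))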

    All eight approaches had much the same effect on the diversity curve (see right figure). “What's surprising is that diversity in the post-Paleozoic up to the Oligocene [24 million years ago] doesn't shoot up as extremely as was originally found in Jack's data set,” says co-author David Jablonski of the University of Chicago. If diversity really hasn't risen much since the Paleozoic, then all of life's innovations in the oceans since the days of the trilobites—from new predators to armored snails and burrowing clams—have been unable to break through some set ceiling on diversity.

    “This paper represents a real step forward,” says ecologist Michael Rosenzweig of the University of Arizona in Tucson. “For the first time, a large group of people is saying paleobiology has been making a mistake, that it's very important to deal with sampling issues. And when you try to get rid of the biases, the diversity curve looks a lot flatter.”

Rosenzweig, like Alroy and his colleagues, regards the paper as a progress report on the road to a larger and more thoroughly analyzed paleo data set. But even the most sophisticated data mining, the scientists warn, might not extract the true history of diversity. Paleontologist Jeremy Jackson of Scripps Institution of Oceanography in La Jolla, California, thinks existing data are still too biased to do the trick. It may be that “the information isn't there to begin with,” he says. The compilers of the Paleobiology Database intend to find out.
