News this Week

Science  25 Oct 2002:
Vol. 298, Issue 5594, pp. 718
  1. HIGH-ENERGY PHYSICS

    Wayward Particles Collide With Physicists' Expectations

    1. Charles Seife

    EAST LANSING, MICHIGAN—Physicists' quest for a new state of matter has taken a bewildering turn. At a meeting here last week,* researchers from the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory in Upton, New York, announced results that, so far, nobody can explain. By slamming gold atoms together at nearly the speed of light, the physicists hoped to make gold nuclei melt into a novel phase of matter called a quark-gluon plasma. But although the experiment produced encouraging evidence that they had succeeded, it also left them struggling to account for the behavior of the particles that shoot away from the tremendously energetic smashups.

    “The more I think about it, the more I think it's not completely wacko,” William Zajc of Columbia University, spokesperson for one of the four particle detectors at RHIC, said privately at the conference. Zajc ruminated for a few moments and then corrected himself. “Well, it is completely wacko,” he said. “We don't get it. I really don't know—on a fundamental level.”

    The confusion comes from PHENIX, one of the four detectors, which probed the differences between “hard” and “soft” nuclear collisions. Nuclei are collections of protons and neutrons, and at low energies, they behave like hard objects. Smash one nucleus into another, and the components scatter like billiard balls. But scientists think they behave differently in very high-energy collisions. Neutrons and protons are made up of particles known as quarks and gluons, and at very high temperatures and pressures these particles should burst their bindings and roam free, forming a state of matter known as a quark-gluon plasma. In that case, theory predicts that the particles in the smashup would no longer bounce cleanly off one another; the melted mess would be sloppier, the particles splashing off one another like droplets of water instead of rebounding like chunks of ice. By analyzing the sprays of particles created by colliding various atoms, the RHIC physicists hoped to determine whether collisions become softer as the nuclei get bigger and carry more energy—a sign of a quark-gluon plasma, a state of matter that hasn't existed since the big bang.

    Hard riddle.

    At the Relativistic Heavy Ion Collider (top), protons and pions born from the same explosions inexplicably show earmarks of different origins.

    CREDITS: (TOP TO BOTTOM) BNL; RHIC

    Last year, RHIC seemed to be seeing just that. For example, trackers found proportionately fewer high-momentum particles spraying away from powerful gold collisions, a phenomenon known as jet quenching (Science, 26 January 2001, p. 573). Although jet quenching could be due to some new, subtle effect caused by the particles' travels through dense nuclear matter, it is consistent with the creation of a quark-gluon plasma: The particles slow down as they fly through the sticky, soft goop in a plasma, rather than merely ricocheting off the components of the nucleus.

    This tidy picture has just become considerably messier. With the higher energies and better statistics of RHIC's second year of running, physicists could classify the particles zooming away from the collisions. What they saw was a shock.

    Measurements at PHENIX indicate that some of the particles flying away from the smashup are moving more slowly than normal, as one would expect in a soft collision, but others are caroming out of the wreck as if from a hard collision (see figure). Scientists know of no plausible mechanism for this discrepancy. “It's a true puzzle,” says Zajc.

    Part of the problem is that most of the particles PHENIX detects are born after the collision—spawned from more or less identical quarks and gluons (collectively dubbed “partons”) that scatter off one another at the moment the two atoms crash together. The flying partons only then recombine into two-quark or three-quark ensembles (“hadrons,” such as protons and neutrons). Because identical partons are doing the scattering, the hadrons they produce should all look as if they were born in the same sort of collision, soft or hard.

    But that isn't what PHENIX sees, says Julia Velkovska, a Brookhaven physicist who works on the PHENIX experiment. Pions, two-quark ensembles made of up and down quarks and antiquarks (and a handful of gluons) bound in an uneasy package, “behave more or less exactly like predicted” for a particle traveling through a sticky medium like a quark-gluon plasma, she says. Protons and antiprotons, three-quark ensembles built from the same up and down quarks and antiquarks (plus a handful of gluons), instead behave as if they were formed by a hard collision.

    “Gee whiz,” said Sean Gavin, a theorist at Wayne State University in Detroit, Michigan, when told of the results for the first time. “That's really interesting.” But so far neither he nor anybody else can account for the difference. Zajc suggests that exotic gluon configurations might make two-quark ensembles (mesons such as pions) behave differently from three-quark ensembles (baryons such as protons). Velkovska says that perhaps a parton flying away from a collision somehow “knows from the beginning that it's going to be a baryon.” But both admit that these are wild guesses at the moment.

    James Thomas of Lawrence Berkeley National Laboratory in California, who works with the RHIC detector called STAR, says that data due to be collected in 2004 will reveal whether a similar pattern holds with heavier baryons and mesons, such as the lambda baryon and the K meson. The next RHIC run, however, will collide deuterium with gold and protons with protons—a lower energy regime than gold-on-gold collisions. If the anomaly disappears under these lower energy conditions, physicists will be much more confident that this effect and others stem from the formation of some sort of dense plasma, rather than from partons traversing the nucleus.

    • * 2002 fall meeting of the American Physical Society's Division of Nuclear Physics, 9–12 October.

  2. NATIONAL SCIENCE FOUNDATION

    White House Concerns Block Doubling Bill

    1. Jeffrey Mervis

    Call it a case of double or nothing. Legislators thought they had worked out a deal to authorize a 5-year doubling of the National Science Foundation (NSF) budget, a cherished goal of science lobbyists, as part of a comprehensive bill covering myriad NSF programs. But a last-minute objection from the White House sent lawmakers home last week with nothing to show for their efforts. Angry legislators from both parties accuse the Office of Management and Budget (OMB) of sabotaging the long-awaited agreement, which lobbyists hope can be salvaged when Congress returns after the 5 November elections.

    The money to run an agency comes from appropriations bills, most of which are still pending 1 month into the 2003 fiscal year. But authorizing legislation provides detailed and binding instructions on how an agency should operate. The House of Representatives passed its version of the NSF authorization bill (H.R. 4664) in June, a 3-year blueprint with annual increases intended to put NSF's current budget on a doubling track. Last month two Senate panels approved a different version (S. 2817) that provided for a full doubling, to nearly $10 billion, by 2007. In addition, the bills require NSF to publicly rank proposed major research facilities and give greater hiring and budget autonomy to the National Science Board, NSF's presidentially appointed oversight body. NSF Director Rita Colwell had previously raised strong objections to both items (Science, 27 September, p. 2187).

    Hold on.

    Senator Jon Kyl (R-AZ) is apparently in no hurry to double NSF's budget.

    CREDIT: AP PHOTO/APTN

    Although the full Senate has yet to vote on the measure, on 10 October House and Senate negotiators resolved their remaining differences and prepared for a pro forma vote by each body on identical bills. But on 16 October Senator Jon Kyl (R-AZ) raised a parliamentary objection, blocking a vote in the Senate. Sources say that his “hold” reflects OMB's concerns that a 5-year doubling is arbitrary—a point science adviser John Marburger has made repeatedly—and runs counter to the Administration's long-term budget strategy. Congressional aides nevertheless feel that they were blindsided; they say the Administration never formally objected to the provisions. “It came up at 11:59 p.m.,” says one frustrated staffer. “And now it's 12:01.”

    Congress is now weighing an OMB counterproposal that shortens the bill to 3 years and removes the word “doubling” from its title. But although that might be acceptable to some members, it rankles others. “It's a doubling bill,” says one aide. “And it's not a random increase; we were very careful to spell out our priorities.”

    In fact, the 91-page bill discusses several NSF programs in great detail. The annual ranking of proposed research facilities, for example, is intended to clear up the community's confusion over the status of various projects that the board has approved but for which NSF has not requested funding. And the science board provisions are meant to ensure that the NSF director does not wield undue influence over the board. “The board can certainly live with the provisions in the authorization bill,” says board chair Warren Washington, a climate modeler at the National Center for Atmospheric Research in Boulder, Colorado. Washington says that the board is already developing a ranking of pending facilities projects and hopes to polish the list at its 20 November meeting.

  3. EVOLUTIONARY GENETICS

    Jumbled DNA Separates Chimps and Humans

    1. Elizabeth Pennisi

    BALTIMORE, MARYLAND—For almost 30 years, researchers have asserted that the DNA of humans and chimps is at least 98.5% identical. Now research reported here last week at the American Society for Human Genetics meeting suggests that the two primate genomes might not be quite as similar after all. A closer look has uncovered nips and tucks in homologous sections of DNA that weren't noticed in previous studies.

    The results are quite exciting, says Michael Conneally, a human geneticist at Indiana University Medical Center in Indianapolis. With this research, “we can really find out so much more about evolution,” he predicts.

    In the past 3 decades, biologists have used all sorts of biochemical methods to assess differences between genomes, particularly those of humans and chimps. As more DNA sequence became available over that time, many researchers began to look at short stretches of DNA and count the number of single bases that didn't match the equivalent bases in another species—known as single-nucleotide polymorphisms. In contrast, some cytogeneticists have taken a more global view of the genomic landscape, mapping out differences in how chromosomes appear under the microscope.

    Now two research teams have spotlighted the middle ground, using so-called gene chips to evaluate millions of bases of DNA in a single experiment. The chips—some of the most powerful to date—carry snippets of known genetic material that, when paired up with DNA in a test sample, tell researchers what genetic code is present.

    With this wide-ranging view, genomicists Kelly Frazer, David Cox, and their colleagues at Perlegen Sciences in Mountain View, California, have detected insertions and deletions ranging from 200 bases to 10,000 bases in length that differ between chimps and humans, each of which has a genome of about 3 billion bases. Evan Eichler and Devin Locke, geneticists at Case Western Reserve University in Cleveland, Ohio, have studied changes extending about 150,000 bases. “A significant fraction of the variation [between chimps and humans] is present in these [two types of] rearrangements,” Frazer reports.

    Loosened family ties.

    Gene-chip studies reveal previously unrecognized differences between these two species.

    CREDITS: (TOP TO BOTTOM) MICHAEL LEACH/PHOTO RESEARCHERS; CORBIS

    The Perlegen team used chips densely packed with small pieces of DNA, each 25 bases long. The chip is studded with “13 billion unique [pieces],” Cox points out. The researchers assessed the resemblance between the chimp's chromosome 22 and the equivalent human chromosome, 21. They compared 27 million bases, and “much to our surprise, we found around 57 areas of rearrangement between the human and the chimp,” says Cox.

    There seemed to be no rhyme or reason to the changes; they occurred just as frequently outside coding regions as within. The density of these differences is “a little bit higher than anyone would have predicted,” says Eichler. “The implications could be profound,” he adds, because such genetic hiccups could disable entire genes, possibly explaining why our closest cousins seem so distant.

    Instead of using small bits of DNA, Locke, Eichler, and their colleagues deposited on a chip a series of bacterial artificial chromosomes, each of which contained about 150,000 bases of human DNA. The chip sported almost 2500 sequences covering 360 million bases in all. They compared this DNA to DNA from Asian and African great apes and found 63 chunks that were missing or added. The deletions and insertions they uncovered, which were larger than those picked up by the Perlegen team, tended to be close to large duplicated regions, Locke reported at the meeting, although the researchers aren't sure how to interpret this finding. The frequency of such genetic differences suggests, Frazer says, that “these rearrangements are playing a much bigger role [in evolution] than we expected.”

    Locke's and Frazer's results come as no surprise to Roy Britten of the California Institute of Technology in Pasadena, who has analyzed the chimp and human genomes using a customized computer program. He compared 779,000 bases of chimp DNA with the sequence of the human genome, both found in the public repository GenBank. Single-base changes accounted for 1.4% of the differences between the human and chimp genomes, and insertions and deletions ranging up to 31 bases long accounted for an additional 3.4%, he reported in the 15 October Proceedings of the National Academy of Sciences. Locke's and Frazer's groups didn't commit to new estimates of the similarity between the species, but both agree that the previously accepted 98.5% mark is too high.
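The split Britten reports, single-base changes on one hand and insertions or deletions on the other, is easy to illustrate with a toy calculation. The sketch below is an invented illustration only, not Britten's actual program (the sequences and the function name are made up, and real analyses involve far more careful alignment): it walks a gapped pairwise alignment and tallies substitutions and indel bases as separate percentages, the two quantities reported as 1.4% and 3.4%.

```python
# Toy sketch (not the actual analysis pipeline): given a gapped pairwise
# alignment of two hypothetical sequences, tally single-base substitutions
# and indel bases separately, mirroring Britten's two-part breakdown.

def divergence_breakdown(seq_a: str, seq_b: str):
    """Return (substitution %, indel %) over the aligned columns.

    seq_a and seq_b are equal-length aligned sequences in which '-'
    marks a gap (a base present in one species but not the other).
    """
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    subs = indels = 0
    for a, b in zip(seq_a, seq_b):
        if a == "-" or b == "-":
            indels += 1   # column where one species lacks the base
        elif a != b:
            subs += 1     # single-base substitution
    n = len(seq_a)
    return 100 * subs / n, 100 * indels / n

# Invented 20-column alignment: 1 substitution and a 2-base indel.
human = "ACGTACGTAC--GTACGTAA"
chimp = "ACGTACGTACGGGTACGTAC"
print(divergence_breakdown(human, chimp))  # (5.0, 10.0)
```

Taken together, the article's figures of 1.4% in substitutions plus 3.4% in indels put the overall divergence near 5%, which is why both groups regard the long-quoted 98.5% similarity mark as too high.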

    Such findings leave researchers eager to scrutinize the full chimp sequence. Japanese, German, South Korean, Taiwanese, and Chinese researchers formalized a chimp genome project in 2001 (Science, 23 March 2001, p. 2297); that program recently got a boost when the National Human Genome Research Institute in Bethesda, Maryland, listed the chimp as a high priority for sequencing by its high-throughput centers. The sequence should be ready in mid-2003.

  4. PROTECTING HUMAN SUBJECTS

    Koski Steps Down After Bumpy Ride

    1. Jocelyn Kaiser

    The first director of a federal office created to beef up safety in clinical trials is heading back to academia after running into some bumps within the government and earning mixed reviews from outsiders.

    Greg Koski, a Harvard anesthesiologist, says his decision to leave after 2 years is not related to the political winds blowing through his office, including a decision this summer to dismantle its advisory committee. But sources say that a lack of support from his bosses might have helped speed his return to academe.

    Koski was recruited by then-Department of Health and Human Services (HHS) Secretary Donna Shalala to lead the newly promoted Office for Human Research Protections (OHRP) after a death in a gene-therapy trial brought increased scrutiny of patient safety in research. Once there, he worked to persuade institutions that protecting patients required obeying their “consciences” as well as federal rules. Since Koski's arrival, OHRP has begun developing a system in which institutions—rather than the government—grade themselves on their oversight programs. A report earlier this month from the Institute of Medicine supports this approach, as well as voluntary accreditation of human-subjects protection programs.

    Bioethicist Mary Faith Marshall of the University of Kansas Medical Center in Kansas City says Koski was a “tireless ambassador” in campaigning for “shared goals.” David Korn of the Association of American Medical Colleges (AAMC) in Washington, D.C., agrees that Koski “deserves a lot of credit” for promoting the idea that protecting human subjects should involve the entire institution. But Korn is still waiting to see if the office follows AAMC's advice on the issue of reducing financial conflicts of interest. Koski says he hopes final guidelines will be out by the end of the year.

    Patient advocate.

    Greg Koski was a “tireless ambassador” for shared responsibility.

    CREDIT: HHS

    Some patient advocates and members of Congress, however, are pushing for mandatory standards. And one government official is skeptical that Koski accomplished much with OHRP's more than doubled budget and staff. “His brief tenure was reminiscent of a placebo: Some people thought it worked,” quipped the official.

    Koski insists that his departure “is not a political decision” but rather marks the end of a 2-year leave from Harvard in Massachusetts, where his family still lives. However, his time at HHS was not always smooth sailing. Koski often failed to follow proper procedures for developing policies and releasing information, one official says.

    The office was also caught up in a revamping of HHS advisory committees (see p. 732). Koski acknowledges that he did not expect HHS's decision to let lapse the charter of OHRP's advisory panel, which Marshall, the chair, says “shocked and dismayed” members. HHS now plans to convene a smaller group, with 11 members instead of 17, and has revised its charter to include specific topics, such as protection of fetuses and embryos, that reflect the Bush Administration's opposition to abortion. Fetuses are already mentioned in federal regulations for protecting human subjects, but Koski says including embryos is “a change.” Korn says AAMC is “very concerned” about the membership of the panel: “I just hope it isn't packed with ideologues.”

  5. SEISMOLOGY

    Suit Ties Whale Deaths to Research Cruise

    1. David Malakoff

    The U.S. National Science Foundation (NSF) has a whale of a problem involving sounds, lawsuits, and the high seas.

    Last week an environmental group asked a federal judge to suspend an NSF-funded sea-floor mapping expedition off Mexico that it claims led to the deaths of two whales. NSF rejects a link between the deaths and the air guns used by shipboard researchers to generate sound waves, adding that the researchers were following the law. But the incident has curtailed an expensive international mapping project and reignited controversy over the impact of noise on marine mammals. The case, filed 17 October in San Francisco, California, by the Idyllwild-based Center for Biological Diversity (CBD), could also lead to new regulation of U.S. researchers who use sound at sea.

    The controversy began 25 September, when five vacationing marine biologists sailing in Mexico's Gulf of California happened upon two freshly beached Cuvier's beaked whales (Ziphius cavirostris). The group, which included several beaked-whale experts, tried to inform Mexican colleagues about the unusual find. In the process, they discovered that the Maurice Ewing, a research vessel owned by Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York, was conducting a seismic survey nearby. The ship was bouncing sound pulses produced by blasts of compressed air off the gulf's floor to map the margins of the continental plate.

    Too close for comfort?

    The research vessel Maurice Ewing sailed near the island in the Gulf of California where beaked whales beached.

    SOURCE: LDEO

    Human-created noise, including the use of sonar by military vessels, has been linked to other strandings of beaked whales, a poorly understood group of species (Science, 26 January 2001, p. 576). Although there is still no clear explanation of how sound might harm the whales, the gulf strandings “just seemed too coincidental, given the history,” says Barbara Taylor, one of the vacationers and a whale researcher at the government's Southwest Fisheries Science Center in La Jolla, California.

    Five days after the incident, Lamont officials temporarily halted the $1.6 million, 6-week cruise to review environmental precautions. Lamont's director, Michael Purdy, says there is no clear link between the mapping and the strandings, noting that the Ewing appears to have been at least 50 kilometers from the animals when they stranded. But cruise managers ultimately decided to reduce noise levels, drop half the planned routes, increase efforts to spot and avoid whales, and end night work, when whale monitoring is impossible. Lamont is also paying for aerial surveys to look for new strandings and helping outside researchers study the noise signature produced by the Ewing's air-gun array. The cruise is scheduled to end 4 November.

    The additional precautions, however, don't satisfy some whale experts. Air-gun signals can travel 10 or more kilometers from the ship, they note, a distance far beyond the gaze of sentries. Beaked whales are notoriously difficult to spot, they add, moving quickly and staying submerged for 30 minutes or more. “The Ewing should cease operations; they don't have a workable plan,” says John Hildebrand, a whale and acoustics specialist at the Scripps Institution of Oceanography in La Jolla.

    The incident has also raised the complicated legal question of whether the researchers had the proper permits. Last April, Hildebrand asked the U.S. Marine Mammal Commission, a government advisory body, to review the planned cruise after becoming concerned about surveying in an area known to be rich in beaked whales. Lamont and NSF officials, however, concluded that the cruise did not need U.S. marine mammal permits as it would occur in Mexican waters, and the Mexican government approved the mission. The biodiversity center contests that interpretation of U.S. law, saying that cruise planners should have consulted with U.S. regulators.

    As Science went to press, the judge was still deciding whether to halt the cruise. Even if it continues, CBD attorney Brendan Cummings hopes that the case will clarify which laws apply to U.S. research vessels. It is certain to intensify interest in a separate lawsuit, in which environmentalists are attempting to block the U.S. Navy from deploying a new sonar that some researchers say could harm whales. Remarkably, the Mexican incident occurred on the same day that more than a dozen beaked whales stranded off the Canary Islands in the eastern Atlantic, following naval exercises conducted by U.S. and Spanish vessels.

  6. ENDOCRINOLOGY

    Divorcing Estrogen's Bright and Dark Sides

    1. Greg Miller

    Despite concerns about the risks of hormone replacement therapy for postmenopausal women, one benefit has not been challenged: It makes bones stronger. Now a study on page 843 suggests that it might be possible to tease apart the various effects of estrogen, maintaining its benefits while reducing its risks. A synthetic hormone has been shown to boost bone strength in mice without affecting reproductive organs.

    Estrogen makes women less likely to develop osteoporosis and suffer debilitating fractures. But this boon apparently comes with increased risk of breast cancer, pulmonary embolism, heart attack, and stroke (Science, 19 July, p. 325). Reasoning that estrogen's effects on various tissues might be mediated by different cell signaling cascades, a team led by Stavros Manolagas of the University of Arkansas for Medical Sciences in Little Rock has been identifying synthetic hormones that activate only a subset of these pathways.

    Whether such compounds will prove useful in humans remains to be seen, but other researchers and clinicians say the new study is a promising first step. “If it holds up, then it's quite important,” says molecular endocrinologist Geoffrey Greene of the University of Chicago. “If compounds like estrogen could be used to maintain bone density with few or no side effects in aging women, that would be huge.”

    The researchers gave a compound named estren to adult female mice whose ovaries had been removed. As with menopause, ovariectomy curtails estrogen production and eventually leads to a decline in bone density. Estren reversed this change and restored bone strength as effectively as—and in some cases more effectively than—estrogen.

    Choosing the right message.

    Synthetic hormones that bypass the traditional estrogen pathway (left) and activate the “nongenotropic” pathway (right) might prevent bone loss without side effects.

    ILLUSTRATION: C. SLAYDEN

    Estren apparently strengthens bones by tinkering with the cellular construction crews that constantly remodel them. At any given time, Manolagas says, there are 5 million to 10 million sites on a human skeleton where cells called osteoclasts dig tiny trenches in the bone that are filled in by bone-forming osteoblasts. After menopause, osteoclasts outpace osteoblasts, making bone more porous and brittle. Manolagas's team found that estren (as well as estrogen) tips the balance in the other direction: Both compounds encourage osteoclasts to self-destruct while prolonging the life of osteoblasts.

    Despite their similar effects on bone, estren and estrogen have markedly different effects on the reproductive organs, the team found. In ovariectomized mice, the uterus loses nearly two-thirds of its weight. Estrogen, but not estren, prevents this loss. And whereas estrogen stimulated the growth of cultured breast cancer cells, estren did not.

    Manolagas says these differences arise because estren doesn't activate the pathway by which estrogen acts on the reproductive organs. In that pathway—traditionally thought to be the only means for estrogen signaling—the hormone diffuses into the nuclei of cells, where it binds to its receptor and a complex of other proteins that together regulate the transcription of certain genes.

    Recently, Manolagas and others have suggested that estrogen can activate a “nongenotropic” pathway, whereby estrogen alters gene expression by means of a biochemical cascade that kicks off when the hormone binds receptors outside the nucleus—an idea that is still controversial. Last year Manolagas's team reported that estrogen's effects on osteoblasts and osteoclasts seem to be mediated by this pathway. The new study suggests that estren activates this pathway but not the traditional one, which would explain its preferential effect on bone. (No one is sure which pathway might be responsible for estrogen's apparent link to heart disease.)

    The study, says Greene, supports the existence of nongenotropic pathways. Whereas most evidence has come from experiments with cultured cells, he says, the new study is “one of the most striking demonstrations” of such effects in live animals.

    Many osteoporosis drugs are in the pipeline, but none boasts estrogen's track record for preserving bone, says Susan Ott of the University of Washington, Seattle. Time will tell if estren can bring out the best of estrogen therapy and leave its dark side behind.

  7. PALEONTOLOGY

    Cuts at Dino Monument Anger Researchers

    1. Erik Stokstad

    Vertebrate paleontologists are up in arms about a plan to cut back on research at Dinosaur National Monument. Although monument officials say the move will benefit paleontology in the long run, some scientists charge that the plan is misguided. “This is a big step backward,” says Kenneth Carpenter of the Denver Museum of Natural History in Colorado.

    The monument, which straddles the border between Utah and Colorado, includes a dramatic quarry face and 853 square kilometers of fossil-rich backcountry. It is well staffed for a site run by the U.S. National Park Service, boasting a paleontologist, a preparator, and a curator.

    Last year, monument officials were charged with drawing up a 5-year plan that would eliminate nine positions out of 50. In the new plan, announced earlier this month, the preparator and paleontologist positions will be combined into a “physical sciences resource manager.” Research will be less than half-time. The monument's chief of administration and acting supervisor, Susan Richardson, says that the change will ultimately boost the monument's paleontology program, because the new manager will focus on attracting scientists, students, and funding there and on coordinating volunteers. “I think we're going to be able to make it way larger,” she says.

    Chipping away.

    Paleontologists are concerned that cuts at Dinosaur National Monument will hurt research.

    CREDIT: JAMES L. AMOS/CORBIS

    But paleontologists say that using volunteers to prepare unique fossils is risky. What's more, the long training process and high turnover could make a volunteer program an “enormous waste of time,” Carpenter says. He also doubts whether many paleontologists would have time to prospect for fossils in an area that's new to them.

    Amy Henrici of the Carnegie Museum of Natural History in Pittsburgh, Pennsylvania, studies some of the oldest frog fossils in North America—which monument paleontologist Dan Chure found and has lent her. “It's very important for me to have them out there looking for more fossils,” she says. The planned changes raise the prospect that her research, and that at the monument, would slow or halt, she says.

    But Greg McDonald, who coordinates paleontological research for the National Park Service, says that each superintendent must decide how to meet the needs of his or her park. “There's the ideal,” McDonald says, “and then there's the practical.”

  8. EUROPEAN RESEARCH

    Directive Could Give Postdocs Permanency

    1. Kirstie Urquhart*
    1. Kirstie Urquhart is European editor for Science's Next Wave, where a longer version of this article appears (intl-nextwave.sciencemag.org/cgi/content/full/2002/10/22/1).

    CAMBRIDGE, U.K.—Richard A'Brook has worked for more than a decade at the University of Dundee in Scotland—a steady job, you would think, but the epidemiological statistician is himself a statistic. He's one of a legion of contract researchers whose job security depends on their employers' fundraising prowess. Now, however, a new directive from the European Commission (EC) could lead to radical changes in how contract researchers, mainly postdocs, are employed, particularly in the United Kingdom and Ireland. It's a “right to equal treatment,” A'Brook says.

    The EC's Directive on Fixed-Term Work, incorporated into U.K. law earlier this month, mandates that E.U. nations “prevent the abuse of fixed-term contracts through their continuous use.” Precisely how the rule is implemented is up to each country, explains the EC's Andrew Fielding, because “the situation on the ground is so different” from one country to the next.

    In the United Kingdom, universities and other employers will be forced to give permanent positions to any research staff members whose positions are renewed and run longer than 4 years, unless they can offer good reasons for not doing so. The directive also requires that workers on fixed-term contracts be given “equal treatment” in terms of benefits such as paid holidays, maternity leave, and representation on departmental committees. Most E.U. nations already have rules protecting contract workers, but the directive is causing a considerable stir in the British Isles, where such regulation has been lacking until now.

    According to the U.K.'s Association of University Teachers (AUT), some 59,000 academics are now on fixed-term contracts in the United Kingdom. And few of these researchers are bright-eyed trainees happy to live with uncertainty and low status: An AUT study showed that the proportion of research-only fixed-term contract staff members in the United Kingdom aged 30 or older rose from 53% in 1995 to 63% in 2001. But because the clock started ticking on the new rules only this summer, they'll have to wait until July 2006 to see whether their status will change.

    Larger, research-intensive campuses and institutions are not anticipating making wholesale changes. “This is really just enshrining in legislation good employment practice,” says a spokesperson for the University of Cambridge, one of the country's biggest employers of contract research staff. Similarly, institutes run by Cancer Research UK will continue to employ postdoctoral researchers on fixed-term contracts as part of the charity's philosophy of regularly bringing in new blood and ensuring that postdocs move on for career development.

    But the changes could have profound effects at smaller institutions, some of which are choosing to make life-changing decisions well before crunch time. The attitude at Robert Gordon University (RGU) in Aberdeen is “let's just get on with it rather than wait for others to force us to do it,” says RGU's human resources director, Robert Briggs. Last August, the Scottish university moved its entire contract research staff onto the same open-ended contracts as the rest of its academic staff.

    A'Brook anticipates a groundswell of support for RGU's approach. Real change, he says, “will depend on local unions or staff representatives actually pushing institutions to implement these things.”

  9. LOW-DOSE RADIATION

    U.N. Faces Tough Sell on Chornobyl Research

    1. Paul Webster*
    1. Paul Webster is a writer in Moscow.

    MOSCOW—The United Nations is mounting a last-ditch effort to reinvigorate flagging interest in the long-term health consequences of the Chornobyl disaster. At a meeting of U.N. agencies in New York City earlier this week, the U.N.'s Office for the Coordination of Humanitarian Affairs (OCHA) established a new organization, the International Chernobyl Research Network, to mount a coordinated research program on the lingering impacts of the world's most serious nuclear reactor accident. A concerted scientific effort is necessary, it argues, “if the evidence is not to be lost forever.” Prospects for the new initiative are unclear, however. OCHA itself has no money to launch new research projects, and expert opinion is split on the network's scientific potential.

    The Chornobyl network is the brainchild of Keith Baverstock, the European radiation health adviser to the World Health Organization (WHO). A lack of coordination among international agencies, he says, has hampered research on the health impacts of the April 1986 explosion at the Chornobyl Nuclear Power Plant, which spewed roughly 200 Hiroshima bombs' worth of radiation across a region of Eastern Europe inhabited by 2 million people. As a result, he contends, much Chornobyl research has been unsound.

    Baverstock is hoping that governments and international organizations will commit new funds for the initiative. The network could be modeled after WHO's effort to coordinate research on the health effects of electromagnetic fields, a program supported by $150 million in research commitments from governmental and nongovernmental research programs worldwide, says Mike Repacholi, coordinator of WHO's Radiation and Environmental Health Unit.

    Scientific paradise lost?

    The U.N. hopes to rally interest in one last push for a major research effort on Chornobyl.

    CREDIT: R. STONE

    Partly to help guide the new network, WHO plans a systematic review of the literature on low-level radiation. WHO has a head start on this assessment thanks to the U.N. Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), which 2 years ago issued a comprehensive survey of Chornobyl health research. UNSCEAR charged that many studies suffer from “methodological weaknesses,” including spotty diagnoses and disease classification, poor selection of control groups, and inadequate radiation-dose estimates. Apart from an increase in mostly treatable thyroid cancer in children, UNSCEAR concluded, “there is no evidence of a major public health impact.”

    The biggest challenge, UNSCEAR warned, is to estimate radiation doses reliably. Recent studies suggest that doses might have been lower than originally thought. “A lot of people thought the Soviets were underestimating the dose,” says UNSCEAR scientific secretary Norman Gentner. “It's turning out the opposite was the case.”

    The lowered dose estimates suggest that any lingering health effects apart from thyroid cancer, if they exist, will be hard to detect. But that doesn't mean researchers shouldn't try, says Dillwyn Williams, a thyroid cancer expert at the University of Cambridge, U.K. “I do believe that there are large uncovered areas of research,” he says. Priority areas, he adds, should be new case-control studies on breast and lung cancer and genetic effects, under the umbrella of a comprehensive long-term population study.

    Few Chornobyl researchers anticipate undiscovered health effects. “It appears unlikely that excess for solid cancers can be seen and can be related to radiation exposure,” says Albrecht Kellerer, director of the University of Munich's Radiobiology Institute, who has been involved in a decade-long German-French project on Chornobyl. But he's keeping an open mind on blood cancers. “Even if there is little expectation to find a radiation effect,” Kellerer says, it would be worthwhile to monitor childhood leukemia—and to continue surveillance on thyroid cancer—among the roughly 200,000 people living in Chornobyl-contaminated areas.

    Kellerer believes, however, that the hunt for knowledge about the health risks from long-term exposure to low-dose radiation could be pursued more fruitfully elsewhere. His group has won support from the European Commission to move its focus from Chornobyl to the region around the Mayak nuclear facility in the southern Urals of Russia, where extensive radioactive contamination in the surrounding watershed came to light after the Cold War. Mayak, he says, has opened “a vast new chapter of radiation epidemiology.”

    Such views don't augur well for the U.N.'s fundraising effort, which began this week with discussions aimed at generating research commitments within U.N. agencies and will continue at a follow-up meeting next month. Besides seeking funding commitments from outside the U.N., the effort aims to arrive at a consensus on “what research exists and what's needed,” says David Chikvaidze, Chornobyl coordinator for OCHA in New York City. Judging by researchers' increasing ambivalence about their chances to make breakthroughs with Chornobyl data, the U.N. might need to set modest expectations.

  10. PUBLIC HEALTH

    Creeping Consensus on SV40 and Polio Vaccine

    1. Dan Ferber

    At first it seemed impossible: The widely celebrated polio vaccine that was given to millions of people in the 1950s was contaminated with a monkey virus—a virus that causes cancer in animals.

    Since the virus was discovered in the monkey kidney extracts used to make the Salk vaccine some 40 years ago, concern has risen that the vaccine, which wiped out polio in the United States, might have triggered an epidemic of cancer (Science, 10 May, p. 1012). Now, at the request of the U.S. Congress, an expert panel of the Institute of Medicine (IOM) has issued the most definitive judgment to date, allaying most—but not all—of those fears. The virus, known as SV40, has not caused a wave of cancer, the panel concluded. But it might be causing some rare cancers, and more research is needed to find out.

    Since the contamination was detected, government researchers and others have published epidemiological studies that they said proved that the vaccine was safe. But questions remained, because the virus kept turning cultured cells cancerous, and it kept causing tumors in animals. That debate heated up in the past decade, after researchers began finding SV40 DNA in four types of rare human cancers—the same kinds it causes in animals—and press reports emphasized that tens of millions of people could have been exposed.

    The IOM committee examined the 4 decades of epidemiology to see whether people exposed to SV40-contaminated vaccine have a higher risk of cancer. Although the studies were flawed, the panel concluded, they were good enough to show that no cancer epidemic has occurred. But millions of people might have been infected with SV40 from the contaminated vaccine, the panel wrote. And the evidence is strong enough to suggest, but not prove, that the virus can sometimes cause human cancer. “We acknowledge that SV40 at least could have a carcinogenic effect, but epidemiological evidence does not suggest that it actually did,” says IOM committee member Steven Goodman, a biostatistician at Johns Hopkins School of Medicine in Baltimore. Even so, he adds, “there's a body of evidence [on SV40 carcinogenicity] that has to be taken quite seriously.”

    The committee stressed the need for more reliable and sensitive tests to detect SV40 in human tissue, especially tests for anti-SV40 antibodies in human blood. Once those tests are devised, researchers could test human tissue samples from before 1955 for the monkey virus to ascertain whether it really came from contaminated polio vaccine. In addition, the panel said, there's enough evidence that the virus is spreading in humans that the issue should be studied further.

    But overall, the IOM report “really closes the book on the discussion” of past epidemiological work, says pediatric oncologist Robert Garcea of the University of Colorado Health Sciences Center in Denver. Although SV40 might yet turn out to cause cancer in humans, the risk, if any, is “not remotely in the ballpark” of well-known carcinogens such as tobacco smoke or asbestos, adds Goodman.

    The report seems to have satisfied protagonists across the spectrum, although they're drawing different conclusions. One reason, Goodman explains, is that IOM took extraordinary precautions to prevent conflicts of interest, excluding anyone who had ever sat on a government vaccine panel or received money from government or industry for vaccine research.

    Virologist Janet Butel of Baylor College of Medicine in Houston, Texas, is “gratified that they recognized the biological evidence” implicating SV40 in cancer. Keerti Shah of Johns Hopkins School of Medicine, a long-time skeptic, calls the report “a positive step,” although he'll need better assays before concluding that SV40 is indeed present in humans. If the National Institutes of Health follows the panel's suggestions, more money should soon be available to probe the link.

  11. ATTOPHYSICS

    X-ray Flashes Provide Peek Into Atom Core

    1. Adrian Cho*
    1. Adrian Cho is a freelance writer in Grosse Pointe Park, Michigan.

    Using ultrashort pulses of x-rays, physicists have taken a “movie” of electrons frenetically rearranging themselves deep inside an atom. The technique opens the way for a new class of experiments in which researchers should be able to trace and control changes within atoms that take place in billionths of a billionth of a second, or attoseconds.

    Researchers have been steadily developing sources that pump out soft x-rays or ultraviolet light in pulses only a few hundred attoseconds long (Science, 30 November 2001, p. 1805). But the new result marks an important milestone in the emerging field of attophysics, says Philip Bucksbaum, a physicist at the University of Michigan, Ann Arbor: “This is the first demonstration of a real experiment. It's not just measuring the pulse itself.”

    Electrons stacked deep in an atom behave a bit like gumballs piled into a vending machine: Pop one out from the bottom of the heap, and the others move to fill the void left behind. But whereas anyone with a keen eye can see the gumballs shift, the electrons rearrange themselves far too quickly to be directly observed with even the fastest particle or radiation detectors. The shuffling reveals itself, however, when the atoms are prodded with equally speedy bursts of electromagnetic radiation, Markus Drescher of the University of Bielefeld in Germany, Ferenc Krausz of the Vienna University of Technology in Austria, and colleagues report in this week's issue of Nature.

    The researchers hit krypton atoms with a one-two punch: a blast of soft x-rays only 900 attoseconds long, followed by a flash of laser light roughly seven times longer. The x-rays stripped electrons from the krypton atoms in a process called photoionization, and some of the atoms lost an electron from a particular inner shell, creating a core hole. An electron from an outer shell then fell into the vacancy. In the process, it gave up some of its energy to yet another electron, which then flew out of the atom with a specific energy. The core holes decayed through this complicated Auger process within a few femtoseconds, and the researchers hoped to track precisely how their numbers dropped.

    Sign of the times.

    In the electron spectrum, an extra peak (red) next to the main one lets physicists time changes in the atom.

    CREDIT: ADAPTED FROM M. DRESCHER ET AL., NATURE 419, 803 (2002)

    Through a trick of quantum mechanics, the researchers reduced the problem of clocking the decay of holes to one of counting electrons. As the Auger electron emerged, it could absorb a photon from the laser pulse or even radiate a matching photon into the passing stream of light. That quantum interaction slightly increased or decreased the energy of a fraction of the electrons—a fraction determined by the shape of the laser pulse.

    By counting those energy-shifted sideband electrons, the physicists could deduce how many core holes remained unfilled when the laser pulse arrived. To see how that number fell with time, they simply varied the delay between the x-ray flash and the laser pulse. “We basically record a series of snapshots,” Krausz says. “We reconstruct the motion from them” in the same way a movie recreates a moving image from still ones.

    The researchers found that the krypton core holes decayed with a lifetime of about 8000 attoseconds, or 8 femtoseconds. That relatively long lifetime matches what other researchers had inferred by indirect methods. However, the new measurement marks the first time anyone has used attosecond x-ray sources and lasers to time the flickering changes within an atom directly.
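    The delay-scan logic can be illustrated with a toy numerical model (this is not the team's actual analysis; only the roughly 8000-attosecond krypton lifetime comes from the report, while the noiseless exponential decay and the log-linear fit are simplifying assumptions for illustration):

    ```python
    import numpy as np

    TAU_AS = 8000.0  # assumed core-hole lifetime in attoseconds (8 fs, per the report)

    def sideband_signal(delay_as, tau_as=TAU_AS):
        """Fraction of core holes still unfilled when the laser pulse arrives
        a given delay after the x-ray flash (idealized: instantaneous pulses,
        no noise), which sets the sideband-electron count."""
        return np.exp(-delay_as / tau_as)

    # "Record a series of snapshots": scan the delay between x-ray flash and laser pulse.
    delays = np.linspace(0.0, 24000.0, 13)   # 0 to 24 fs in attoseconds
    counts = sideband_signal(delays)

    # Recover the lifetime from the snapshots with a log-linear least-squares fit.
    slope, _ = np.polyfit(delays, np.log(counts), 1)
    fitted_tau = -1.0 / slope
    ```

    With ideal data the fit returns the input lifetime exactly; in the real experiment the finite pulse durations and counting noise broaden and scatter the points, but the same delay-scan principle applies.
    
    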

    “I imagine that everybody is going to adopt this technique,” says John Hepburn, a chemist at the University of British Columbia in Vancouver, Canada. In the meantime, Drescher and Krausz say that their first priority is to apply the method to a shorter-lived core hole, to confirm that they really can trace attosecond changes.

  12. BACTERIOPHAGE THERAPY

    Stalin's Forgotten Cure

    1. Richard Stone

    Bacteriophage therapy, pioneered in Stalin-era Russia, is attracting renewed attention in the West as a potential weapon against drug-resistant bugs and hard-to-treat infections

    TBILISI—Last December, three woodsmen in the mountains of Georgia stumbled upon a pair of canisters that were, oddly, hot to the touch. The men lugged the objects back to their campsite to warm themselves on a bitterly cold night. That turned out to be a terrible mistake: The canisters, Soviet relics once used to power remote generators, were intensely radioactive and burned two of the men severely. The victims were rushed to the capital, Tbilisi, where doctors plied them with antibiotics but failed to prevent staphylococcus bacteria from invading the deep wounds. Septic shock seemed just around the corner. Then a kinder legacy of the Soviet Union came to the rescue.

    Georgian doctors turned to a therapy virtually unknown in the West: They unleashed the bacteria's natural predators. The doctors covered the open wounds with novel biodegradable patches impregnated with bacteriophages, viruses that infect bacteria. The business card-sized PhageBioDerm patches, recently licensed for sale in Georgia, eliminated the infection, and within a few weeks the woodsmen were stable enough to go abroad for treatment to replace the lost skin.

    The episode shows that a unique brand of medicine from the Stalin era is alive and well in this remote corner of the world. More surprisingly, phage therapy might be about to stage a comeback in the West. After a brief fling with phages before World War II, Western physicians ignored the therapy for more than half a century once the use of penicillin became widespread. “Phages were relegated to the dustbin of history,” says Richard Carlton, president of Exponential Biotherapies Inc. (EBI), a firm based in Port Washington, New York, that is one of more than two dozen racing to reclaim phages from that dustbin.

    Driving phage therapy's potential rehabilitation is the accelerating crisis of antibiotic resistance. Some cases are particularly chilling: Last July, for example, U.S. health authorities reported the first instance in which a stalker of hospital wards, methicillin-resistant Staphylococcus aureus, had also acquired full resistance to vancomycin, often used as a last-ditch treatment. “The window of opportunity for new antibiotics is rapidly closing,” asserts Janakiraman (“Ram”) Ramachandran, a former president of AstraZeneca India who 2 years ago launched GangaGen Inc., a phage-therapy start-up in Bangalore.

    Although phages offer hope against drug-resistant bacteria and could soon find a role as a treatment for burns, diabetic ulcers, and other open wounds, experts concur that these viral breeds are unlikely to knock antibiotics off their pedestal for most infections. “Phages are certainly not going to replace chemicals,” says Alexander Sulakvelidze of Intralytix, a company in Baltimore, Maryland, that's exploring a potential market for PhageBioDerm or similar products in the United States. Nor is it evident how the U.S. Food and Drug Administration will regulate phage products, or if it will be permissible to alter phage strains after FDA approval to counter bacterial resistance. For these reasons, many experts expect that phage concoctions for livestock and food-borne pathogens will find their way to market first (see sidebar, p. 730).

    Irradiated.

    Georgian doctors covered this man's radiation burns with patches that released bacteriophages into the wound as they degraded.

    CREDIT: DAVID JIKIA

    Despite the uncertainties, proponents say that as a medicine, bacteriophages have a lot going for them. “They are the most abundant life forms on Earth,” notes phage biologist Elizabeth Kutter of Evergreen State College in Olympia, Washington. A drop of seawater or sewage teems with millions of phages; any exposed surface of our bodies, not to mention our digestive tracts, is carpeted with them. “Mother nature gives you an endless source of phages,” says Sulakvelidze. And unlike most antibiotics, they are very specific, Kutter says: “Phages can kill off a small fraction of the microbial population and leave the rest intact.”

    Phages are like minuscule smart bombs that home in on particular bacterial strains. Anecdotal evidence from decades of Soviet practice suggests that this results in far fewer side effects than use of antibiotics. And whereas drugs lose effectiveness as they are metabolized, phages replicate in their hosts, gaining strength in numbers and thus increasing potency.

    EBI, the fastest company off the blocks, has completed safety testing in healthy volunteers of a phage against vancomycin-resistant enterococci and plans to launch a clinical trial in patients with VRE in the middle of next year. “We really are living in a brave new world,” says Toney Ilenchuk, vice president of Biophage Pharma in Montreal, Canada. He and others are watching EBI closely, because its experiences could determine how quickly the first phage therapies against human diseases reach the market—or whether the approach slips back into obscurity in the Western world.

    Many are rooting for a comeback. “We need to do something, have some alternative to antibiotics,” says Diane Schaak of the Rowland Institute for Science at Harvard University. “Phage therapy could be a wonderful way to go.”

    A checkered past

    The first whiff of the microscopic predators came in 1896, when British chemist E. H. Hankin reported that water straight from the sewage-ridden Ganges and Jumma rivers could kill the cholera pathogen. It wasn't until 2 decades later that a pair of scientists, working independently, concluded that the bacteria slayers must be microbes themselves. In 1915, British bacteriologist Frederick W. Twort described an “ultramicroscopic virus” that somehow killed bacteria in solution. But it was noted biologist Félix d'Herelle of the Pasteur Institute in Paris who made the critters famous: He and his wife coined the term bacteriophage in 1916 after d'Herelle isolated an “anti-Shiga” microbe from the feces of patients with dysentery and grew it in the bacterium that causes the disease.

    D'Herelle was also the first to comprehend the promise that phages held as a disease treatment. In 1919, he and his colleagues made a phage preparation for a 12-year-old boy with severe dysentery. After guzzling 100 times the intended dose to check its safety—“the first clinical safety trial,” jokes Sulakvelidze—they gave the diluted preparation to the boy, who recovered fully within a few days.

    Old Georgian recipe.

    Microbiologist Zemphira Alavidze examines bacterial colonies decimated by phages. At the Eliava Institute, hardy strains are cultivated and refrigerated the old-fashioned way.

    CREDITS: MUTSUMI STONE

    Over the next several years, d'Herelle helped set up phage-therapy trials across the globe. “He would go to villages and observe who was recovering on their own from an illness, isolate phages from these people, and grow them in the lab,” says Ramachandran. Phage therapy was off to a flying start, and it gained in popularity after the 1925 publication of Arrowsmith, a novel by Sinclair Lewis in which a doctor deploys phages against an outbreak of bubonic plague in the West Indies.

    Back then, “phages seemed like a miracle answer to many devastating infectious diseases,” says Kutter. The drug giant Eli Lilly and a plethora of entrepreneurs piled into the phage business, but their record was spotty. In some patients the concoctions worked well, whereas in many others they had no effect. The mixed results were grist for a damning critique of phage therapy from the American Medical Association in 1934.

    Phage enthusiasts are quick to disassociate modern approaches from the field's early days. Little was known then about phages or bacteria, so patients often took phages that were not suited to their infections. In addition, says EBI's Carlton, “they didn't purify these products well enough.” Preparations were often loaded with endotoxins produced by bacteria in the suspensions used to cultivate phages, and they were rarely tested before use to see if the phages were viable. But whereas Western physicians abandoned the fickle medicine, Soviet scientists kept the faith.

    Stalin's antibiotic alternative

    Sunlight streams through a picture window into an office suffused with the yeasty smell of agar as Amiran Meipariani removes a logbook from a desk drawer. The silver-haired and -moustached bacteriologist scrolls down a record in cramped Cyrillic handwriting of the last batches of medicinal bacteriophages shipped abroad by the Eliava Institute here in Tbilisi. On the wall behind him is a 1930s photograph of the man who started it all in Georgia, a dashing young scientist with oiled black hair and deep-set eyes. Under Giorgi Eliava's intense gaze, Meipariani, who has worked here for 45 years, puts a finger on the most recent entries in his log: 88,600 phage tablets for intestinal illnesses and 497,000 tablets for prophylaxis against Salmonella, both shipped to Central Asia in 1989. That was the beginning of the end of the Eliava Institute's golden era.

    The Soviet enterprise got under way in 1923, when Eliava, who had spent 5 years with d'Herelle in Paris, founded a bacteriological research center with the blessing of Soviet dictator Josef Stalin. Eliava's phage program got a big boost in 1933, when d'Herelle left Yale University to join his protégé in Tbilisi. He stayed until tragedy struck a few years later: Eliava fell into disfavor with the dreaded Lavrenty Beria, later head of the KGB, and was executed. The devastated institute eventually recovered and continued its pioneering work, including the development in the 1940s of phages against anaerobic infections such as gangrene. Soviet authorities placed a high value on the Eliava Institute's work. When it came to ordering new equipment and supplies of enzymes, says microbiologist Mzia Kutateladze, who joined the institute in the late 1980s, “we got whatever we wanted.”

    Heady days.

    Félix d'Herelle (seated) and Giorgi Eliava (right) in Tbilisi shortly before Eliava was shot.

    CREDIT: COURTESY OF A. SULAKVELIDZE

    The Soviet military was perhaps the biggest consumer of phage preparations, many of which were produced in Russia—and still are—according to Georgian techniques. “Antibiotics were expensive, while phage preparations were very cheap,” explains Meipariani. The military's enthusiasm did not ebb after the Soviet meltdown. During the civil war in the early 1990s, Georgian soldiers fighting in the breakaway Abkhazia region carried spray cans filled with phages against five bugs: Staphylococcus aureus, Escherichia coli, Pseudomonas aeruginosa, Streptococcus pyogenes, and Proteus vulgaris. Phage preparations were also widely available in many Russian cities alongside antibiotics. And in a handful of towns such as Tolyatti, an auto-manufacturing center, clinics “rarely used antibiotics,” instead relying almost exclusively on phages, says Zemphira Alavidze, a microbiologist at the Eliava.

    By the time the Soviet Union dissolved in 1991, the Eliava had only the means to produce phages for a still-thriving domestic market in newly independent Georgia. This minimal production, says Meipariani, has helped “preserve the tradition.” Cooking up phages means more than following a recipe, he says: “You need a good mind and good hands.” Visitors concur. “There is really no substitute for their collective experience over the past 70 years,” says Tony Smithyman, managing director of SPS, a phage-therapy company in Sydney, Australia. He's one of many Westerners who have made a pilgrimage to Tbilisi to learn the art of phage therapy.

    The Eliava Institute was not alone in pursuing such therapies, as phage centers sprang up elsewhere in Eastern Europe. Perhaps the most important data in the English literature on the therapy's effectiveness come from the Institute of Immunology and Experimental Therapy in Wroclaw, Poland. Researchers there compiled a detailed report on the successful treatment of more than 500 patients with bacterial infections in the mid-1980s, but the results appeared in the obscure Archivum Immunologiae et Therapiae Experimentalis and only recently were excavated for wider dissemination. The Wroclaw institute is itself experiencing a renaissance, busily culturing medicinal phages and forging ties with Western labs.

    Promises and perils

    Whereas antibiotics relegated phage therapy to a historical footnote in the West, bacteriophages themselves, particularly the lunar lander-shaped T4 that infects E. coli, became the darlings of biologists. Beginning with Max Delbrück's famous “phage group” in the early 1940s, molecular biologists exploited the simplicity and ease of handling of a few lab phages to do the following: confirm that DNA is the genetic material, show that messenger RNA is involved in its translation, reveal that the unit of recombination is the nucleotide, and clarify how genes are turned on and off. But the mainstream view on phage therapy was summarized in a passage from Gunther Stent's classic 1963 textbook Molecular Biology of Bacterial Viruses: “Just why bacteriophages, so virulent in their antibacterial action in vitro, proved to be so impotent in vivo has never been adequately explained.”

    Stent suggested that the therapy failed because antibodies mop up phages infused into the body. In the early 1970s, a young researcher at the U.S. National Institutes of Health (NIH), Carl Merril, tested that notion on a germ-free colony of mice using a common lab phage, lambda. His group discovered that before the mice could even develop antibodies, the phages were cleared from the bloodstream, primarily by the spleen.

    Several years later, Merril and NIH colleague Sankar Adhya wondered whether some particularly hardy strains might evade clearance. If so, these could be harvested and studied and perhaps serve as the basis for an improved therapy. To find out, the researchers proposed injecting mice with billions of lambda phages and seeing if any persisted hours later. Teaming up with EBI's Carlton, they found that phage mutants that were around many hours longer than run-of-the-mill phages were much more effective at rescuing mice from otherwise lethal infections. Their 1996 report, in the Proceedings of the National Academy of Sciences (Vol. 93, p. 3188), was widely hailed as a basis for selecting promising phage strains. For the first time in more than half a century, Western experts were taking the disparaged approach seriously again.

    In a sign of the changing times, the Cold Spring Harbor Laboratory in New York, once a powerhouse in basic research on bacteriophages, will hold its first-ever Banbury meeting on phage therapy next month. However, there are still big gaps in our understanding of how phages work: Exquisite studies of phage genetics have revealed little about how these viruses behave in their natural environments or when introduced into the human body. “Decades of neglect of phage biology have left us woefully unprepared to take rapid steps in such uses as therapeutics,” says Ry Young, a phage biologist at Texas A&M University in College Station.

    What is known is that phages come in two flavors. “Lytic” phages infect a bacterium, hijack its molecular machinery, and replicate madly until the bacterium's cell wall gives out and it expires: the killing mechanism common to all phages. Lytic phages are ideal for therapy. But half of all phages, it is thought, are “temperate,” meaning that they often integrate their DNA into that of their host. “You wouldn't know the phage is there,” says Adhya, because they hibernate in the form of genetic code before the viral DNA tears itself free again and the virus begins replicating—sometimes taking some of the host DNA with it. At the same time, temperate phages can protect their host from attack by other phages. And they can abet pathogens: Certain temperate phages carry the genes for the toxins released by bacteria that cause diphtheria and cholera, for example.

    One serious concern is that a temperate phage could make off with host genes connected with virulence or resistance. These phages could, in principle, wreak havoc by integrating such genes into a new host. “I would worry about the transference of such genes,” says Harvard's Schaak. After all, she asks, “Why would a phage wipe out its host?” Schaak speculates that some phages that are lytic in the test tube could acquire the genes to turn temperate in the body. Although most experts discount that possibility, Sulakvelidze says that in informal conversations, FDA officials have indicated that firms must guarantee that their phages are stably lytic. “We're doing much more rigorous characterizations of phages than has been done in the past,” he says, including DNA sequencing.

    Another hurdle that could make or break phage therapy is that bacteria inevitably develop resistance to phage strains just as they do to antibiotics. The problem might not be as severe for would-be phage therapists: A study from the early 1980s found that mutations conferring resistance in E. coli occurred less frequently following phage therapy than they did following antibiotic therapy. And proponents say it's much easier to tackle resistance with phages than with drugs. “You can generate a new phage variant in a week” by selecting those that don't lose virulence in culture, says Ramachandran, who envisions a regulatory process in which panels of phages are put through clinical trials for FDA approval. Adhya's team, meanwhile, is developing a “master” phage that could undergo rigorous FDA review. Such a master phage could be subtly tinkered with, for example by altering a single gene that affects host susceptibility. Modified strains presumably would take less time and money to approve, Adhya says.

    Some scientists hope to bypass these issues altogether by extracting the active components from phages. For example, Vincent Fischetti and his team at Rockefeller University in New York City describe in the 22 August issue of Nature how they used a phage's lytic enzyme to kill the anthrax bacterium in the test tube. In a similar vein, Young's lab at Texas A&M reported last year in Science (22 June 2001, p. 2326) that one type of phage makes peptides that act like penicillin, blocking cell-wall synthesis in bacteria and causing the cell to explode when it tries to divide.

    Some companies are trying to exploit such eccentricities of phages in their quest for new drugs. For instance, PhageTech in St. Laurent, Canada, is studying a myriad of phage “killer” proteins that derail the host metabolism to make it easier for the phages to reproduce. It is now screening libraries of small molecules for killer protein analogs that could act as antibiotics. And Schaak is taking a novel tack. Earlier this year, she and a few colleagues launched MicroStealth Technologies, which aims to use phages as delivery vehicles for antimicrobial peptides that are only active inside bacterial cells.

    Phage futures?

    Although the experiences in Georgia and elsewhere in Eastern Europe have helped establish sound methods for selecting and cultivating phages, it's unclear how much of that data will be useful to companies intending to bring phage therapy to the West. “I'm not knocking the work in the East,” asserts EBI's Carlton, “but the FDA pretty much has to discount it.” That's not a universal view. Kutter, for one, argues that Eastern European findings on the use of phages to treat conditions such as diabetic ulcers and osteomyelitis, in which poor circulation can render antibiotics toothless, “are particularly impressive and incontrovertible. They have excellent cure rates.” But it seems that message isn't reaching the right ears. “We just get a blank stare when we talk to regulators,” says Ilenchuk, who says his firm, Biophage Pharma, will target “compartments” such as the mouth or intestines rather than dive into injectables.

    FDA has not yet issued written guidance on how it intends to regulate phage therapy. But many unknowns will be cleared up as EBI takes its VRE phage through trials. “They are blazing the trail for us,” says Asher Wilf, who last year founded Phage Biotech Ltd. in Rehovot, Israel. Navigating through the regulatory waters will be “a big challenge,” says Ilenchuk. “It would take just one of us to screw it up,” he says, recalling the troubles encountered in the early days of bringing blood substitutes to the market. FDA's emerging stance could also determine if and when large pharmaceutical companies get into the game. “Big pharma is waiting to see proof of concept before they do deals,” says PhageTech co-founder Michael DuBow, now at the Université Paris-Sud XI in Orsay, France.

    In addition to regulatory and scientific uncertainties, phage enthusiasts might face one more hurdle: public acceptance. The time is ripe to start educating the public about phages, says Ilenchuk. “We have to get the message across that phages are everywhere.” He and others assiduously avoid the “v” word. Rather than refer to them as viruses, says Schaak, “I call phages a natural delivery system.” Others prefer a straight-shooting approach. “Phages are viruses, and if they are to be used as therapeutic agents, we need to respect their origin and use our modern scientific methods to assure that they are safe,” says NIH's Merril.

    No matter how it's sold, most experts believe that in light of the increasing perils of antibiotic resistance, the once-scorned Soviet therapy will ultimately find a niche in modern Western medicine. A half-century of antibiotic use has taught us that “you cannot win the war against bacteria,” says Sulakvelidze. But with phages, he says, “at least you can try to shift the ecological balance in our favor.”

  13. BACTERIOPHAGE THERAPY

    Food and Agriculture: Testing Grounds for Phage Therapy

    1. Richard Stone

    Last month, the U.S. Food and Drug Administration tightened another screw in its effort to curb the spread of antibiotic resistance from the burgeoning use of agricultural drugs. The agency aired draft regulations requiring manufacturers to test potential livestock pharmaceuticals for their ability to help pathogens acquire resistance to human drugs. But farmers are concerned that they could be left with fewer weapons to combat Listeria and other food-borne pathogens that cause several hundred deaths each year in the United States alone. “When farmers are told they can't use any antibiotics used in humans, they say, ‘What do we use?'” says Toney Ilenchuk. His firm, Biophage Pharma in Montreal, Canada, believes it has part of the answer: bacteriophages against Salmonella and pathogenic strains of Escherichia coli.

    Ilenchuk and other advocates are also eyeing the use of phages—viruses that attack bacteria—in food processing. The Baltimore, Maryland-based firm Intralytix already has a permit from the U.S. Environmental Protection Agency to test a phage against the bacterium Listeria monocytogenes in a food-processing plant, although in this trial the phages cannot be applied to surfaces that come into contact with food. The company hopes to have its anti-Listeria phage on the market by early next year. Intralytix is also developing phage preparations to spray on eggs to reduce Salmonella contamination. It would be too daunting to go after all 2400 or so Salmonella serogroups, “but we can target the five or six serogroups most commonly associated with human illness,” says Intralytix co-founder Alexander Sulakvelidze, who directed the State Microbiology Laboratory in Tbilisi, Georgia, before emigrating to the United States a decade ago.

    A man and his phage.

    Alexander Sulakvelidze shows off a phage preparation sold in Georgia; electron micrograph of bacteriophages that kill Salmonella.

    CREDIT: SAM KITTNER

    Fresh-cut produce might also be a candidate for phage treatment: A team at the U.S. Department of Agriculture (USDA), working with Intralytix, has found that phages are more effective than chlorine at ridding cut fruits and vegetables of Salmonella. At least two dozen other phage firms worldwide are hoping to get a foothold in the food and livestock market.

    Shadowing the corporate push is surging academic interest in using phages to improve food safety. For example, a team led by microbiologists Donna Duckworth and Paul Gulig of the University of Florida, Gainesville, has isolated phages against Vibrio vulnificus, a bacterium sometimes found in raw oysters that can trigger severe illness in people. The researchers have used these phages to cure infected mice and are trying to harness the phages for depurating oysters for human consumption. In a similar vein, Elizabeth Kutter's team at Evergreen State College in Olympia, Washington, is working with USDA to find phages that can clear the deadly E. coli O157:H7 from cattle guts. The strain doesn't seem to bother cows, but outbreaks traced to undercooked hamburgers and unpasteurized fruit juices have killed scores of people.

    All these potential applications would require regulatory approval. But enthusiasts hope that the very ubiquity of phages will make them an easy sell. They are “the ultimate clean, green, ecofriendly disease-control system,” pitches Tony Smithyman of SPS, a phage-therapy firm in Sydney, Australia. “We're working in tune with nature.”

  14. TOXICOLOGY

    Overhaul of CDC Panel Revives Lead Safety Debate

    1. Dan Ferber

    Just as an advisory committee began looking at evidence for setting a stricter lead-exposure standard, it got reorganized

    How far should society go in protecting children from exposure to low levels of lead? The question, hotly debated in decades past, is suddenly back on the front pages after public health advocates and a group of Democrats in Congress accused the Bush Administration of trying to load an influential advisory panel with friends of the lead industry. They suspect that the Administration wants to head off an effort to tighten the definition of lead poisoning. A government spokesperson acknowledges that a panel that advises the Centers for Disease Control and Prevention (CDC) in Atlanta is getting new members. But he denies that the Administration has any policy in mind and says that the new panelists are well qualified.

    The changes to CDC's Advisory Committee on Childhood Lead Poisoning Prevention are the latest of several that have raised concerns about scientific advice (see Editorial, p. 703). This change comes at a critical time for the CDC committee: It is examining studies suggesting that lead is harmful below the allowable level of 10 micrograms per deciliter (μg/dl) of blood. If the panel decides that even this amount of lead is harmful, it could recommend lowering the allowable exposure level. This would likely prompt tighter cleanup regulations, which would be expensive for the lead industry and owners of housing with lead-based paint to implement.

    The CDC changes also come at a critical time in litigation over lead's toxic effects. Rhode Island, following the model of states that sued the tobacco industry, is suing lead-paint producers to recover the cost of treating lead-poisoned children and removing crumbling paint from old buildings. Some 38 million U.S. housing units still had lead-based paint in the late 1990s, according to a recent survey. Lowering the level deemed harmful would increase the number of children at risk and probably boost damage claims.

    In the 1960s, doctors diagnosed lead poisoning if the blood level was above 60 μg/dl, exposure that can cause severe abdominal spasms, kidney injury, and brain damage. After the United States began phasing out leaded gasoline in 1976, average blood lead levels plummeted. But epidemiological studies in the 1980s and 1990s revealed that low levels still damaged children's ability to think, concentrate, and hear. CDC continued to reduce the allowable lead level—to 30 μg/dl in 1975, 25 in 1985, and 10 in 1991. The current level has been endorsed by the National Academy of Sciences, the American Academy of Pediatrics, and the World Health Organization. But about 900,000 U.S. children under age 6 still have blood lead levels of 10 μg/dl or more, according to CDC.

    Subtle poison.

    Bruce Lanphear found that even low-level exposure to lead—as from old paint—can affect children's test scores.

    CREDIT: (TOP TO BOTTOM) CINCINNATI CHILDREN'S HOSPITAL MEDICAL CENTER

    New studies in the past decade indicate that children's ability to think and learn might suffer at still lower levels. For example, in Public Health Reports in 2000, preventive medicine expert Bruce Lanphear of Cincinnati Children's Hospital Medical Center reported that data from a nationwide health study suggest that lead levels below 10 μg/dl lowered children's scores on reading and arithmetic tests. And there is no good treatment.

    The CDC committee began evaluating the new research last year. Meanwhile, a group of senators, including Jean Carnahan (D-MO), began pressuring Secretary of Health and Human Services (HHS) Tommy Thompson to lower the standard from 10 to 5 μg/dl. CDC seemed agreeable: The St. Louis Post-Dispatch last summer quoted Richard Jackson, head of the CDC division that deals with childhood lead toxicity, as predicting that the committee would recommend lowering the lead standard and CDC “would go along.”

    But Jackson isn't commenting now. And the Administration's critics see the latest committee changes as a reversal. The CDC's advisory panel overhaul was documented by an environmental group, the Natural Resources Defense Council, and a report released on 8 October by Representative Edward Markey (D-MA). The critics blasted Thompson for rejecting several qualified nominees to the panel, including Lanphear, who has publicly advocated halving the permissible level, and Michael Weitzman of the University of Rochester, New York, who had been on the committee since 1997. The Markey report also raised questions about some experts nominated to the panel, including Joyce Tsuji of Exponent, a California-based consulting firm whose clients include a major lead smelter; Sergio Piomelli of Columbia Presbyterian Medical Center in New York City, who in 1991 opposed lowering the lead standards; and pediatric toxicologist William Banner of the University of Oklahoma Health Sciences Center in Oklahoma City, who provided written testimony on behalf of lead-industry defendants in Rhode Island. Banner argued in that testimony that lead levels below 70 μg/dl do not injure the nervous system.

    Public health advocates are worried about what will follow the panel overhaul. “I would be very concerned that these scientists were placed on this committee to represent the interest of their clients,” says pediatrician Susan Cummins of the National Academy of Sciences, who chaired the CDC panel in the mid-1990s. But Cummins adds that, although there's no evidence for a threshold below which lead is safe, “you reach a point where there really isn't much you can do beyond moving a family out of their home.”

    The new CDC nominees defend their inclusion on the panel. Banner says that he has “an open mind” about lead poisoning and dismisses allegations in the Markey report as “absurd.” Tsuji declined the nomination because of perceptions that she might have a conflict of interest, she says. Piomelli says that in the past he frequently opposed the lead industry on gasoline standards and has testified for defendants and plaintiffs.

    Summing up the Administration's view, HHS spokesperson Bill Pierce says that he thinks the new roster of members will improve the panel as a whole and that “the secretary wants to see a breadth and depth of opinion.”

  15. INDIA

    Missing Generation Leaves Hole in Fabric of Research

    1. Pallava Bagla

    Top fellowships go begging and outside hires stir controversy as India wrestles with a shortage of talent for leadership positions

    NEW DELHI—The first atmospheric neutrinos were detected nearly 4 decades ago in an Indian gold mine 3 kilometers underground. The team performing the experiment was led by physicist M. G. K. Menon, who went on to a distinguished career that included a stint as India's science minister. But this month, when the Nobel Prize in physics was awarded to a U.S. and a Japanese scientist for pioneering research in understanding these elusive particles (Science, 18 October, p. 527), India's contribution to the field rated barely a footnote. The experimental facility in the Kolar mine has long since been shuttered for budgetary reasons, and Indian scientists are nowhere near the forefront of this suddenly red-hot area of astrophysics.

    India's slide in neutrino science is symptomatic of what Menon and others have labeled a “missing generation” of Indian scientists. “There is a lack of leaders in the age group of 45 to 55 years,” says C. N. R. Rao, a former science adviser now at the Jawaharlal Nehru Centre for Advanced Scientific Research in Bangalore. The problem was triggered by a serious brain drain to the West that began in the 1970s and has been compounded in recent years by talented students bypassing science for better paying fields. The phenomenon takes many forms, from a dearth of worthy candidates for a prestigious midcareer research fellowship to consternation over the pending appointment of the first outsider to lead one of the country's flagship labs, the Tata Institute of Fundamental Research (TIFR) in Mumbai. It's also contributed to a graying at the top of the scientific hierarchy.

    “Without question there is a certain crisis,” says Menon, a member of the TIFR search committee that chose Shobo Bhattacharya, a 51-year-old Indian-born condensed matter physicist who has spent his entire career in the United States, most recently at the NEC Research Institute in Princeton, New Jersey. Ironically, the crisis comes as a scientist, aeronautical engineer A. P. J. Abdul Kalam, holds the largely ceremonial position of president for the first time in history and the science minister, physicist M. M. Joshi, is a member of the Cabinet of Prime Minister A. B. Vajpayee.

    Menon's assessment is based on current trend lines, which point downward. India lags far behind China and most other Asian nations in the percentage of its college graduates who major in the natural sciences and engineering, according to data compiled by the U.S. National Science Foundation. In addition, the government allowed funding of university research to stagnate for much of the past decade. The result has been a decline in productivity. A recent international study of scientific publishing, for example, found that the number of Indian papers in peer-reviewed journals had dropped by 24% since 1980 and the country's global ranking had slipped from eighth to 15th, despite its status as the world's second most populous nation.

    Thin at the top.

    India's president, engineer A. P. J. Abdul Kalam (top, right), and physicist M. G. K. Menon (top, left) are part of a shrinking corps of homegrown senior science managers; Shobo Bhattacharya (bottom) was recruited from the United States to head the Tata Institute for Fundamental Research.

    CREDITS: (TOP TO BOTTOM) P. BAGLA; TIFR

    “With our scientific output on the decline, this [leadership crisis] was inevitable,” says Goverdhan Mehta, director of the Indian Institute of Science in Bangalore and incoming president of the International Council of Scientific Unions. “And the problem is going to become even more acute.” Some scientists see Bhattacharya's selection, which awaits final approval by the prime minister, as the final straw.

    One program already feeling the effects of the talent gap is the government's premier midcareer fellowship program. Begun in 1997 to mark the country's 50th anniversary, the Swarnajayanti (Golden Jubilee) Fellowship program offers research funding, salary supplements, and generous travel allowances to top scientists between the ages of 30 and 40. But only five of the 25 slots, on average, are filled each year because of a lack of qualified candidates. Demand is not the problem: This year, for example, 411 scientists applied for the lucrative fellowships.

    Some scientists say that the age ceiling should be lifted to allow more experienced scientists to apply. But Valangiman Subramanian Ramamurthy, a nuclear physicist and secretary of the Department of Science and Technology, says that enlarging the pool would undermine the objective of identifying and nurturing promising talent. He also rejects the idea of settling for less than the best: “There can be no compromise on quality just to make up the numbers,” he insists.

    Several Asian countries, including aspiring powerhouses such as China, Taiwan, South Korea, and Singapore, have acquired world-class talent by tapping native-born researchers now working abroad. That approach, which can be a quick way to infuse new blood into tired institutions and foster international collaborations, makes sense to Raghunath Anant Mashelkar, a polymer scientist and director-general of the Council of Scientific and Industrial Research. “Ruthlessly go for best,” he says about such efforts.

    But the strategy is controversial in India, where many scientists see the hiring of such recruits as an implied criticism of domestic talent. “It is inconceivable that excellent scientists with adequate administrative skills working in India were not available for the job [at TIFR],” says Padmanabhan Balaram, a molecular biophysicist at the Indian Institute of Science in Bangalore and editor of the well-regarded journal Current Science. He and other scientists also fret that Bhattacharya's unfamiliarity with the notorious Indian bureaucracy could put him at a disadvantage in battling for resources.

    Menon says that the search committee looked high and low for domestic talent, even perusing the membership lists of the country's scientific academies, before deciding upon Bhattacharya. Bhattacharya, who studies the dynamics of complex and cooperative systems, declined to discuss his plans for the institute until his appointment has been finalized. But he suggests that he will cast a wide net, telling Science that “TIFR must engage the public in its vision of the future of science in India and India's role in the global science enterprise.”

    Whereas the TIFR appointment has generated considerable discussion, the Department of Atomic Energy caused barely a ripple last year when it chose an Indian-born U.S. citizen to lead another prominent institute within its fold, the Harish-Chandra Research Institute in Allahabad. Ravi Kulkarni, a mathematics professor at the City University of New York, says he had to demonstrate his Indian roots before being picked for the job. “I think that they wanted to make sure I was not a CIA agent,” Kulkarni told an audience of Asian-American scholars this summer at a symposium in New York. In the end, he says, his extensive knowledge of Indian philosophy and the Sanskrit language won over his future employers.

    The slim pickings within the domestic ranks have meant longer tenures for those at the top. With the exception of Anil Kakodkar, secretary of the Department of Atomic Energy, the secretaries of seven major scientific departments have exceeded their scheduled terms, including those at the departments of science and technology, biotechnology, space, and ocean development; the Council of Scientific and Industrial Research; the Indian Council of Medical Research; and the Defence Research and Development Organisation. When they do step down, warns Pavagada Venkata Indiresan, a former director of the Indian Institute of Technology in Chennai, their successors might be career bureaucrats, as has already happened at two ministries.

    Given the magnitude and duration of the shortage of senior talent, Indian scientists are not expecting any quick fixes. But they agree that the problem can no longer be ignored. “All the government agencies should have a discussion and arrive at an action-oriented program,” says Rao. “This is a matter of serious concern.”

  16. THE INSTITUTE FOR GENOMIC RESEARCH MEETING

    Gene Researchers Hunt Bargains, Fixer-Uppers

    1. Elizabeth Pennisi

    BOSTON, MASSACHUSETTS—Technology buffs, bioinformaticists, and hardcore experimentalists rubbed elbows here 2 to 5 October at TIGR's 14th International Genome Sequencing and Analysis Conference. They met to discuss better ways to gather and use genomic information, a vast array of which is now at their fingertips. Highlights included discussions of chromosome evolution and new low-cost sequencing approaches.

    Do-It-Yourself Repair Kit

    Men have a reputation for trying to fix things without asking for help from others. The same might soon be true of the Y chromosome, that knobby piece of the human genome that makes men men, says David Page, a geneticist at the Whitehead Institute for Biomedical Research in Cambridge, Massachusetts.

    Some biologists have theorized that the Y chromosome is destined to decay because it lacks a twin to help it keep its genes intact. Other chromosomes come in pairs that intertwine during meiosis. This allows matching genes, or alleles, in one chromosome to change places with their doubles. This recombination sheds faulty DNA and keeps each chromosome pair well matched. Females carry two X chromosomes, enabling X's to recombine, but males carry an unmatched X and Y. “If a piece [of DNA] does not participate in crossing over, then its genes begin to rot,” Page explains.

    But now Page and his colleagues have discovered that the Y chromosome does have matching genes—within itself. These repeated genes might allow the Y chromosome to somehow fix problem DNA. It seems this chromosome has “taken out an insurance policy,” says Stanley Letovsky, a bioinformaticist at Boston University.

    The new find might improve the Y chromosome's reputation. “For centuries, the Y chromosome has been called a junk heap,” Page points out. A few genes at its very tips are similar enough to genes at the tips of the X chromosome to successfully recombine. Researchers have proposed, though, that with no apparent way to swap out harmful mutations on most of its length, the Y chromosome has become ever more dysfunctional, full of dead or dying genes. Indeed, some geneticists have gone so far as to predict that the Y chromosome will one day self-destruct, perhaps taking males with it. A superficial view of the chromosome seems to confirm this dire warning: More than half of its 59 million bases are apparently meaningless sequence.

    But when Page and his colleagues sequenced 24 million Y chromosome bases that do contain genes, they found a surprise. About one-third of that DNA consists of complex blocks that are repeated two or more times along the chromosome, Page reported at the meeting. Furthermore, the blocks tended to be arranged in eight huge palindromes. Each is composed of coding regions that are mirror images of each other, separated by small spacers.

    Like father like …

    This male's handyman bent extends to his Y chromosome.

    CREDIT: TOM STEWART/CORBIS

    The few Y chromosome genes that are shared with the X chromosome, in contrast, tend to exist as single copies. They are active in many types of cells, carrying out a variety of housekeeping functions. The genes in the repeated blocks thus far appear to be active only in the testes.

    Page and his colleagues chopped DNA from these palindrome blocks into small segments and found that one-third of the pieces had almost perfect matches to other parts of the Y chromosome. And 18% had perfect matches that stretched as far as 2000 bases. “There are sequences on the Y chromosome that are effectively functioning as alleles,” Page reported.

    The researchers mapped the positions of the alleles and found that they were on opposite sides of a given palindrome, one in reverse order relative to the other. Palindromes sometimes encompassed multiple genes and even smaller palindromes. The large palindromes spanned upward of 3 million bases, Page reported. In contrast, pseudogenes—genes that had ceased to function—were outside these blocks of DNA.
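    The arm-matching arrangement Page describes can be pictured with a toy sketch: in a DNA palindrome, one arm is the reverse complement of the other, so the two arms can act as each other's alleles. The sequences below are invented for illustration, not actual Y-chromosome data.

    ```python
    # Toy illustration of an inverted repeat (DNA palindrome): two "arms"
    # separated by a spacer, where the second arm is the reverse complement
    # of the first. Sequences here are made up for the example.

    COMPLEMENT = str.maketrans("ACGT", "TGCA")

    def reverse_complement(seq: str) -> str:
        """Return the reverse complement of a DNA string."""
        return seq.translate(COMPLEMENT)[::-1]

    def is_inverted_repeat(arm1: str, arm2: str) -> bool:
        """True if arm2 mirrors arm1, i.e., the pair forms a DNA palindrome."""
        return arm2 == reverse_complement(arm1)

    # A miniature palindrome: arm + spacer + reverse-complement arm.
    arm = "GATTACA"
    spacer = "NNN"
    palindrome = arm + spacer + reverse_complement(arm)

    print(is_inverted_repeat(arm, palindrome[-len(arm):]))  # True
    ```

    Because the second arm carries the same genetic information as the first, a sequence read off one arm effectively matches the other, which is why Page could speak of the arms "functioning as alleles" on a single chromosome.
    
    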

    Although 2 years ago Page and his colleagues suggested that genes with multiple copies might be predisposed to errors, they now propose that in many cases this arrangement sets the stage for the Y to work out kinks in its genes without help from another chromosome. Good genes might replace bad neighbors down the line. “It's not that anyone has seen this happen,” cautions Svante Pääbo, a geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, “but it seems that one copy on one part of the palindrome [fixes] another without getting changed itself.”

    This apparent solution to the Y chromosome's lack of a recombinatorial partner might be ancient. Page's team has analyzed parts of the chimpanzee Y chromosome and found similar palindromes. “That puts [this DNA structure] back 5 million years,” says Letovsky, one of many male researchers pleased to see his DNA redeemed.

    Bargain-Basement Sequencing?

    There's a certain amount of bravado among genomics researchers about how far the cost of sequencing genomes has dropped—from $1 a base in 1985 to 10 cents a base this year. But with today's grand total running from tens to hundreds of millions of dollars for one genome, the price needs to get cheaper for sequencing's full potential to be realized. At the TIGR meeting, researchers debated how to achieve their new goal: the $1000 genome. “What is exciting [now] is how quickly this notion is gaining momentum,” says George Church, a biophysicist at Harvard University.
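    The scale of the cost gap is easy to check with back-of-the-envelope arithmetic, assuming a human genome of roughly 3 billion bases and a single sequencing pass (real projects read each base several times, so actual totals run higher, consistent with the "tens to hundreds of millions" quoted above):

    ```python
    # Rough per-genome costs implied by the per-base prices quoted in the text.
    # Working in cents keeps the arithmetic exact.

    GENOME_BASES = 3_000_000_000  # approximate size of the human genome

    cost_1985 = GENOME_BASES * 100 / 100  # $1.00 per base -> dollars
    cost_2002 = GENOME_BASES * 10 / 100   # 10 cents per base -> dollars
    target = 1_000                        # the "$1000 genome"

    print(f"1985 price tag: ${cost_1985:,.0f}")   # $3,000,000,000
    print(f"2002 price tag: ${cost_2002:,.0f}")   # $300,000,000
    print(f"Further reduction needed: {cost_2002 / target:,.0f}x")  # 300,000x
    ```

    In other words, the tenfold drop between 1985 and 2002 is dwarfed by the roughly 300,000-fold further reduction the $1000 genome demands, which is why Hawkins argues that incremental improvements alone cannot get there.
    
    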

    There's strong motivation to trim sequencing costs to the bone, allowing more researchers to spell out more genomes, says J. Craig Venter, president of the Center for the Advancement of Genomics in Rockville, Maryland. The newly deciphered human genome and the rapid unraveling of the genomes of a half-dozen other vertebrates have whetted researchers' appetites. Sequencing the genomes of additional species would make evolutionary studies stronger and improve the understanding of DNA outside genes.

    In addition, geneticists are hoping that the genomes of thousands of people will be deciphered, making it easier to track down genetic risk factors for diabetes, heart diseases, and other disorders. Other researchers foresee using cheap DNA sequencing technologies to monitor the environment for specific microorganisms, including biowarfare agents.

    Dozens of corporate and academic research groups are working feverishly to satisfy the hunger for bargains. In some cases, researchers are finding ways to squeeze every base they can from current sequencing technology. Others are trying new ways to decipher genetic codes. All are moving toward reducing the amount of chemicals used in sequencing reactions, which represent sequencing's biggest cost. This approach often entails shrinking DNA samples, sometimes going down to just a few molecules (Science, 12 March 1999, p. 1669).

    But getting down to bargain-basement prices won't be easy, says Trevor Hawkins, a sequencing expert at Amersham Biosciences in Sunnyvale, California. Pushing existing techniques to the limits “will get us down to $30,000 genomes in the next few years, but taking that next leap is going to take a new technology.” There are good new technologies in the wings, but Hawkins worries that, even if they work, developers may not have the know-how or resources to bring their products to market.

    One new technology that researchers are nevertheless excited about “reads DNA like a ticker tape,” claims Eugene Chan, head of U.S. Genomics Corp. in Woburn, Massachusetts. The traditional approach, in comparison, sequences bits of DNA and then pieces them back together. The basis of the U.S. Genomics approach is a lattice built of closely packed, nanometer-sized posts. The lattice acts as a comb: DNA's spirals straighten out as they squeeze between the posts. The DNA emerges as a linear molecule—the ticker tape—whose fluorescently labeled bases can be read as they exit.

    Currently, the U.S. Genomics technology can't read every base. And although Chan boasts that 3 billion bases can stream through the lattice in 45 minutes, his team has yet to come up with a way to handle all that data. The technique can now process DNA molecules of about 200,000 bases, but he hopes it will be able to take care of whole chromosomes and whole microbial genomes by next year.

    Nanocomb.

    DNA is straightened as it moves through this lattice prior to sequencing.

    CREDIT: U.S. GENOMICS

    The equipment will be quite expensive, says Chan. But once the system is in place, the $1000 genome should be a breeze. “We think we can do it for much less,” he predicts.

    Lower-tech approaches can also make genome sequencing cheaper, says Church. Since 1999, he and Robi Mitra, also of Harvard, have been working out a sequencing system that uses off-the-shelf equipment. All they need are slide racks commonly used in histology labs, microarray scanners that are now part of most genomics operations, and machines that control chemical reactions by cycling between warm and cool temperatures. “From scratch, [the equipment] currently costs about $40,000,” Church says. Once in place, he calculates that the cost for a quick pass across the human genome could come down to about $750.

    Church, Mitra, and their colleagues start by mixing up a gel containing the DNA under study, chemicals that promote DNA replication, and short DNA sequence tags needed for the polymerase chain reaction. They deposit the gel as a thin layer on a microscope slide. The primer tags capture random bits of the DNA to be sequenced, which then multiply and form piles of DNA called polonies.

    To determine the order of the bases, the researchers add fluorescently labeled bases, one base at a time, to the polonies. The polonies' single-stranded DNA builds up a matching strand using the newly added bases, whose order the researchers record. Over the past month they've made great progress on the software needed to streamline this process, Church notes.
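The read-out logic described above — extend a matching strand one labeled base at a time and record each incorporation — can be illustrated with a toy sketch. This is not the Church/Mitra protocol itself (which detects fluorescent signals, not lookup tables); it only shows how recording the incorporated bases reconstructs the template's complement:

```python
# Toy illustration of sequencing-by-synthesis read-out: a single-stranded
# template in a polony extends a matching strand one base at a time, and
# the order of incorporated bases is recorded.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def read_by_synthesis(template: str) -> str:
    """Return the bases incorporated opposite the template, in order."""
    incorporated = []
    for base in template:
        # In the real experiment a fluorescent label identifies which base
        # was incorporated; here we simply look up the Watson-Crick partner.
        incorporated.append(COMPLEMENT[base])
    return "".join(incorporated)

print(read_by_synthesis("GATTACA"))  # CTAATGT
```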

    Church's and Chan's teams are but two of about a dozen groups pushing back the frontiers of sequencing. At the meeting, Susan Hardin, president of VisiGen Biotechnologies Inc. in Houston, Texas, described progress she's made in exploiting the natural chemistry of DNA replication to distinguish one base from another more efficiently. Others, such as Michael Weiner of 454 Corp. in Branford, Connecticut, are working out ways to do many thousands of sequencing reactions in parallel.

    All of the teams must still surmount technical problems, says Hawkins, and for the time being, he thinks improvements to sequencing machines now in use—including the one his company makes—hold the key to bringing down sequencing costs. But Venter says it's possible there will be “a radical change” in how people sequence within 5 years. Either way, Church says he's poised for exciting times: “Like the World Wide Web in 1993, this project may zoom forward.”

  17. BIOINFORMATICS

    The Human Genome in 3D, at Your Fingertips

    1. John Bohannon

    Swishing through thickets of genes with a “light saber” in a virtual-reality room could become a fun way to make sense of the flood of genome data

    AMSTERDAM—In a dark chamber in the bowels of Stichting Academisch Rekencentrum Amsterdam (SARA), the Dutch national supercomputing facility, Anton Koning thrusts his hand into a galaxy. “This group here is interesting,” he says, pointing to a cluster of glowing points above his head. The computer scientist thumbs a hand-held mouse, and the cluster becomes enmeshed in gleaming red lines. The galaxy is made of genes—not stars—and the red lines, Koning explains, show which are expressed in the same tissues. With another mouse click, small pennants emblazoned with names and functions pop up from each gene. “Human surfactant, human pulmonary,” Koning reads. “Potential drug targets for lung disease. So let's see where they sit and compare with known disease loci.” Click, click, and the 23 human chromosomes appear behind the galaxy. He pokes at the cluster with what looks like a light saber shooting from his mouse, and a ray extends from each gene to its location on a chromosome.

    Saragene, the virtual-reality software driving this galactic tour of the human genome, could become a powerful new tool for scientists struggling to tap its hidden treasures. Even the most seasoned experts are daunted by the mountains of human genome data churned out by sequencing and microarray technologies. By making it possible to display vast amounts of data and interconnections all at once, virtual reality will soon offer an indispensable way to extract meaning from gargantuan data sets, predicts Peter van der Spek, director of bioinformatics drug discovery at Johnson & Johnson Pharmaceutical Research and Development in Beerse, Belgium, who initiated the Saragene project with SARA a year ago. The system could become widely available within a year. “There are some basic software engineering problems to overcome,” says Antony Cox, who tackles such problems for the Wellcome Trust Sanger Institute near Cambridge, U.K., “but this is a project with legs.”

    Koning, bushwhacking through thickets of genes with his light saber, zeroes in on a smaller group of genes. “Now we're getting somewhere,” he says, as more pennants appear. Some reveal G protein-coupled receptors (GPCRs) that are like x's on a treasure map to drug discoverers. GPCR proteins bind signal molecules such as hormones (or imposters such as drugs) and trigger gene expression. Diseases marked by aberrant GPCR activity—schizophrenia, hypertension, and asthma, to name a few—would rate on a pharmaceutical most-wanted list. “Let's see about mouse models,” says Koning, as homologous genes appear in his galaxy. What Koning has done in seconds with Saragene, van der Spek explains, could take hours in two dimensions and cannot be viewed all at once.

    May the force be with you.

    Saragene allows researchers to display vast amounts of interconnected genomic data all at once.

    CREDIT: SARA

    The virtual-reality chamber Koning is standing in is called a CAVE (for Cave Automatic Virtual Environment), a system designed in the early 1990s and now in use at about 100 facilities, mostly universities. Koning wears a pair of glasses with shutters that open and close 120 times per second in synchrony with the lights that project the galaxy into the chamber. The slightly shifted perspectives flashing alternately at each eye fool the brain into seeing the projection in three dimensions. A sensor on the glasses relays to the computer the position of Koning's head in space, which allows him to move in and around the data intuitively.
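The arithmetic behind the shuttered glasses is simple: because left-eye and right-eye frames alternate, each eye effectively sees half the total rate. A quick sketch, using only the 120-per-second figure from the article:

```python
# Active-stereo timing: shutters alternate between eyes, so each eye
# receives every other frame.
shutter_rate_hz = 120                       # shutter cycles per second
per_eye_rate_hz = shutter_rate_hz / 2       # frames each eye actually sees
frame_interval_ms = 1000 / shutter_rate_hz  # time between successive frames

print(f"Each eye sees {per_eye_rate_hz:.0f} frames per second")  # 60
print(f"A new frame every {frame_interval_ms:.1f} ms")           # 8.3 ms
```

At 60 frames per second per eye, the alternation is far above the flicker-fusion threshold, which is why the brain fuses the two offset views into a stable three-dimensional scene.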

    Because of supercomputing charges, wielding a light saber in a CAVE costs about $1000 per hour. The motivation for Johnson & Johnson, which pays the bills in the Saragene marriage, is to speed up the process of drug discovery. Before “wet-lab” research can begin, van der Spek's bioinformatics team hunts for promising drug targets in silico. By developing this as a public tool with the technical team at SARA, says van der Spek, Johnson & Johnson is shaping Saragene to fit its needs: “It's win-win.”

    Although the hardware Koning plays with is pricey, most of the data are free. Stored on site at SARA is a copy of Ensembl, a database of annotated genomes maintained in the public domain by the Sanger Institute and the European Bioinformatics Institute, based in Cambridge. The keepers of Ensembl were given a demonstration of Saragene 2 weeks ago and are now considering teaming up with Johnson & Johnson and SARA to add a virtual-reality interface to Ensembl, allowing any facility with a CAVE to jack into the Saragene world. “I found it pretty exciting,” says Cox, who would lead the design modifications if Ensembl incorporated a Saragene interface, a decision that could come by next summer.

    Cox is particularly attracted to the possibility of using Saragene for comparing gene regions between species, a famously difficult problem in two dimensions. In the search for drug targets, candidate genes are often found with few clues to function, but insight can be gleaned by comparing a gene's region on the human chromosome with the matching chromosome region from another species in which gene functions are better defined. Making such comparisons for more than two species when evolution has shuffled genes across chromosomes becomes very messy on a computer screen. “Stepping into the data” with Saragene, as Koning puts it, makes multispecies comparisons “much easier.”
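The multispecies comparison Cox describes boils down to asking, for each pair of neighboring genes in one species, whether their homologs remain neighbors in another. A minimal sketch with hypothetical gene names (the real problem involves thousands of genes shuffled across many chromosomes, which is what makes a flat screen so unwieldy):

```python
# Hypothetical example: gene order on a human chromosome region, and the
# positions of the homologous genes in a second species.
human_region = ["GENE_A", "GENE_B", "GENE_C", "GENE_D"]
mouse_position = {"GENE_A": 0, "GENE_C": 1, "GENE_B": 2, "GENE_D": 3}

def conserved_adjacencies(order, other):
    """Count adjacent gene pairs in `order` whose homologs are also
    adjacent in the other species."""
    kept = 0
    for left, right in zip(order, order[1:]):
        if left in other and right in other:
            if abs(other[left] - other[right]) == 1:
                kept += 1
    return kept

# Here only GENE_B/GENE_C stay next to each other after the shuffle.
print(conserved_adjacencies(human_region, mouse_position))  # 1
```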

    Another enthusiast is Gert Vriend, director of the Dutch Centre for Molecular and Biomolecular Informatics in Nijmegen. Vriend, who recently visited SARA, is interested in using Saragene to study evolutionary biology. “Looking at a phylogenetic tree in a CAVE is really cool. You can see much more information at once,” he says. And rather than displaying one lineage at a time, Saragene creates a phylogenetic “forest.” “You can put thousands of sequences in one single tree,” beams Vriend.

    Just as Koning seems to be closing in on a culprit in lung disease, his light saber lodges in chromosome 13. Even supercomputers freeze sometimes. After a reboot, he once again loses himself in a galaxy of genes.
