News this Week

Science  24 Oct 1997:
Vol. 278, Issue 5338, pp. 564
  1. BIOMEDICAL POLICY

    Whose DNA Is It, Anyway?

    1. Eliot Marshall

    This year's special issue on the genome features gene families. Viewpoints and Articles begin on page 601. Related News articles below and on pages 568 and 569 look at access to genetic materials, the Human Genome Diversity Project, and the Environmental Genome Project.

    Three NIH institutes are trying to open up collections of DNA to spur hunts for disease genes, but clinicians fear that molecular biologists will skim the cream off years of their painstaking work

    “I thought the movie business was rough,” says Los Angeles filmmaker Jon Shestack, but in the past 2 years, “I've learned that this business is much rougher.” Shestack—who produced the blockbuster movie Air Force One—is talking about the rush to gather and analyze DNA from families who may carry disease-related genes. He acquired an insider's view of the “brutal” academic struggle for priority when he tried to persuade researchers who have collected genetic data on autism, a disorder that's been diagnosed in his own 5-year-old son, to share their collections. Shestack hoped that sharing might speed the pace of research. But when he approached some of the best and brightest researchers in the field, he says, they told him they'd rather keep control of their own materials.

    Shestack and his wife, Portia Iversen, took matters into their own hands. They started an organization 2 years ago called Cure Autism Now (CAN), which, along with lobbying for more resources for autism research, will collect and share its own DNA samples. With help from Lee Ducat, founder of the Juvenile Diabetes Foundation, who used a similar strategy in the 1980s for diabetes research, Shestack and Iversen have created the Autism Genetic Resources Exchange (AGRE). In the past 6 months, Shestack claims, “we have enrolled 200 families.” AGRE is training 15 people to conduct diagnostic interviews, getting ready to collect DNA, and contracting with the Coriell Cell Repositories in Camden, New Jersey, to transfer the DNA to “immortalized” cell lines that will be made available to qualified researchers on a nonexclusive basis. Shestack says this should open the field to new researchers—perhaps even “a gunslinger with a big lab” who may be new to autism research but can get quick results. “A lot of people don't like it,” Shestack observes, including researchers who have devoted years to the field. “That's too bad,” he says, but “it would be irresponsible for us to do anything else” but share data.

    That kind of claim, along with counterarguments from medical researchers, will be reverberating throughout the medical genetics community in the next few months as institutes at the National Institutes of Health (NIH) try to do on a larger scale what CAN is doing for autism. As investigators turn from the relatively straightforward task of finding single genes that cause rare disorders to the bigger challenge of finding genes that might interact to cause common diseases such as autism, many are running into the same problem: They need access to high-quality clinical data and DNA from huge numbers of affected families. But clinical researchers, who spend years painstakingly diagnosing patterns of illness in families and collecting blood samples, are wary of handing these materials over to molecular biologists—gunslingers with big labs—who might cream off the results and the glory. They say they deserve a chance to work with the data, and they worry that the quality may suffer as investigators, faced with a deadline for releasing their data, rush to collect and analyze it.

    The tide seems to be turning against these arguments. Like CAN and the diabetes support groups, some NIH institute directors don't want to wait for investigators to offer up their data and are drafting policies to make it easier for geneticists to get their hands on materials collected at NIH's expense. “There's been a lot of gritting of teeth” over this, says geneticist Aravinda Chakravarti of Case Western Reserve University in Cleveland, Ohio. “But I think we have to start with the principle that competition is good…and we should make these collections as widely available as possible.”

    Opening the data banks

    These tensions are coming to a head this fall at the National Heart, Lung, and Blood Institute (NHLBI). As Science went to press, NHLBI chief Claude Lenfant was planning to meet with his advisory council on 23 October to consider a set of recommendations to open up the institute's extensive collections of clinical data. Lenfant notes that NHLBI has “enormous resources, close to 1.5 million blood samples,” including 50 years of clinical data collected from subjects in a heart study based in Framingham, Massachusetts. The samples, which are under the control of individual investigators and their universities, in some cases could be extended to cover three generations. Lenfant worries that “these blood samples could remain on the shelf forever unless we address the issue [of collaboration] in a very positive way.”

    Lenfant and his advisory council will be taking their cues from a report put together by a “Special Emphasis Panel” on genetics co-chaired by Eric Lander, director of the Whitehead Institute/MIT Center for Genome Research in Cambridge, Massachusetts, and Roger Williams, director of the University of Utah Cardiovascular Genetics Research Unit in Salt Lake City. The panel says investigators have regarded DNA they've collected as “a precious, finite resource” and do not readily share it. To solve this problem, it urges NHLBI to establish a central service for immortalizing DNA in cell lines and maintaining the cells in a repository. Overseers would select material for the repository, ensure privacy, and pass on requests for access. NHLBI-funded investigators would be expected to make their materials generally available through this system.

    The panel also suggests that NHLBI put money into small grants covering the cost of shipping materials, particularly to anyone interested in testing out a new hypothesis. Moreover, to make sure that NHLBI's resources are widely publicized, the panel says the institute should create a World Wide Web site to provide detailed information about clinical data collected by grantees, listing all studies of more than 150 people.

    NHLBI will probably endorse all these recommendations, says Lenfant. But a big debate is likely when it tackles one key question: How long should an investigator be allowed to retain exclusive control of data he or she has collected before turning it over to the central repository? The panel doesn't suggest a deadline; it merely proposes that new grants be offered as an incentive to encourage more rapid release of data from completed projects. “I would say about a year” is reasonable, says Lenfant.

    The issue has prompted a review at the National Institute of Mental Health (NIMH) as well. On 19 September, NIMH director Steven Hyman's advisory council adopted most of a special report recommending changes to clear the way for a new “genetics initiative.” The report, written by an eight-member panel chaired by Samuel Barondes, a University of California, San Francisco, psychiatrist, and including Chakravarti and the ever-present Lander, recommended some organizational changes that “we have already implemented,” says Hyman. But the council couldn't agree on the panel's recommendation that all genetic materials gathered at NIMH expense should be shared “after a 12- to 18-month proprietary period.”

    “We had a very vigorous debate” on this at the council meeting, says Hyman. Members focused on the question of “how do you trade off the need to provide an incentive to investigators [by rewarding them with exclusive use of materials] versus the need to share DNA samples.” Although the council reached no conclusion, Hyman says NIMH will set a policy soon. He claims that “you won't meet a serious geneticist who doesn't agree that we need to share.” But Hyman also recognizes that the questions of what to share and from what point to measure the 18-month time limit “are not niggling issues” for investigators.

    Problems in compulsory sharing

    Hyman has good reason to be cautious. Efforts by NIMH and the National Institute on Alcohol Abuse and Alcoholism (NIAAA) to open data from prior collaborations have turned out to be highly controversial.

    NIMH led the way 2 years ago, when it took control of the interview results, clinical data, and DNA gathered by researchers at nine sites who had collected material from hundreds of families affected by Alzheimer's disease, schizophrenia, and manic depression. NIMH began making all of it available at cost to qualified scientists. Many companies interested in making products for mental disorders—including Merck, Pfizer, Millennium, Sandoz, and others—immediately bought the immortalized DNA samples. Many academic researchers have also acquired the materials to use as a means of testing a hypothesis in a second data set.

    Some researchers think NIMH erred in forcing its researchers to yield up this raw material so rapidly. One East Coast psychiatrist, who asked not to be named, believes sharing is desirable, but says the choice should be the clinician's: “This is our life's work,” this researcher says. Michael Conneally, a longtime leader of genetics research at Indiana University School of Medicine in Indianapolis, says many of his colleagues “felt that NIMH jumped the gun and were a little hasty in letting [materials] out.” He thinks “these groups didn't get a really good chance at trying to find genes before the materials were released.” Because it is hard to gather clinical data—particularly for behavioral disorders—Conneally thinks the clinical teams should have “a minimum of 5 years” of exclusive time to analyze the DNA they've collected, and “a maximum of 10 years.”

    The NIAAA leadership is sympathetic to this view. Enoch Gordis, NIAAA's director, says, “Of course, the molecular biologists would like to get their hands on all this stuff and win their Nobel Prizes, but the work was done by others.” He also argues that the biggest effort, deserving the lion's share of credit, is not the molecular biology but “finding the families, chasing after them, interviewing them—this is very labor intensive.”

    Gordis himself has had to weigh these interests in negotiating the release of data from the massive Collaborative Study on the Genetics of Alcoholism. COGA, which has been funded by NIAAA at a cost of $57 million since 1989, has collected material on 4100 individuals affected by alcohol dependence, and it is about to publish its first major findings. The principal investigator of the multicenter effort, Henri Begleiter of the State University of New York in Brooklyn, says one publication, to appear later this year, will debunk claims of linkage between alcoholism and the gene for the D2 receptor, and others will identify several “hot spots” on chromosomes that COGA researchers have linked to alcoholism. Now, NIAAA wants to make COGA's materials more widely available.

    After a year of negotiating, NIAAA and the COGA investigators agreed last year that all the materials supporting this work will be released to the world in September 1999, roughly a decade after COGA was launched. Meanwhile, Gordis says, COGA is entertaining and approving requests for material on a case-by-case basis. “It's a situation where everybody is right,” says Gordis, and he thinks COGA has made a reasonable compromise.

    That deliberate pace has advantages that go beyond fairness, clinical researchers add. Those working with families with autistic children, for example, worry that forcing data to be released may erode quality. Geneticist Gerard Schellenberg of the University of Washington, Seattle, points to a widespread concern that the rush to collect families could lead to duplication of data sets. If investigators conduct their gene hunts in groups of families that are not truly independent, this could undermine the value of the statistical results. In addition, Schellenberg worries about moving too fast and prematurely “freezing” the diagnostic criteria used to select individuals for analysis. It is important, he thinks, to permit flexibility so that different theories can be fully tested.

    But advocates of pooling family data have answers for such concerns. “This isn't a magic act,” says CAN's Shestack: “All you have to do is keep good records” that place a unique identifier on each sample. He also thinks it should be possible for any investigator to sort the data according to any particular interest—once the raw materials have been characterized and made available.

    In the end, the affected families seem likely to prevail in this discussion, and the NIH institutes and the researchers they fund will no doubt find a way to adjust. For as Shestack none too subtly points out, the subjects in autism research outnumber the investigators by 60,000 to one.

  2. HUMAN GENETICS

    Gene Prospecting in Remote Populations

    1. Eliot Marshall

    In a new gold rush, genetics researchers are scouring odd corners of the world for families whose DNA is likely to carry interesting genes. They won't be freely sharing what they find, however, because their backing comes from companies like Sequana Therapeutics Inc. of La Jolla, California; Millennium Pharmaceuticals Inc. of Cambridge, Massachusetts; and Genset S. A. of Paris. All are banking on discovering and patenting a gene for a common ailment, and perhaps parlaying it into a therapy.

    One such collaboration has helped finance the research of Noë Zamel and Arthur Slutsky, physiologists at the University of Toronto. Zamel had traveled to the island of Tristan da Cunha in the south Atlantic to gather genetic data that may lead to an asthma gene. Sequana later contacted him and made a deal. This kind of research is “very, very expensive,” notes Slutsky. Even doing one search means you must interview hundreds of families, draw blood, extract DNA, scan each individual for hundreds of genetic markers, and hope the statistical analysis will put out a positive signal. Because the government of Canada has clamped down on spending, it can't offer much help for big projects like this.

    But help came after Zamel returned in 1993 from his trip to Tristan da Cunha, where he had gone to collect blood from the island's tiny population, descendants of a group who settled it in the early 1800s and today suffer a high incidence of asthma. Sequana signed an agreement with the Toronto scientists' own home organization, the Mount Sinai Hospital Corp. in Toronto. “It was clear that we didn't have the resources to do [big-scale] genetics” without help, Slutsky observes. “Collaboration with Sequana made all the sense in the world.” The company is expecting to put about $70 million into the asthma project, according to Zamel, who adds: “Try to find [a grant for] $70 million; even at the [U.S. National Institutes of Health], that's impossible.”

    But there's a downside to this collaboration: Like many other academic groups funded by industry, the Toronto team cannot publish the valuable information it has collected on these subjects until the patent rights to asthma genes and possibly claims on related therapeutic products have been secured. Zamel says the legal agreement between his hospital, Sequana, and Sequana's partner, Boehringer Ingelheim of Germany, is “like a telephone book.” Clearly this is one area of genetic studies where public resource sharing won't occur anytime soon. And it's an area that's likely to grow as many other companies jostle for access to DNA from genetically isolated groups that may point the way to disease genes.

    Sequana is among the more aggressive DNA prospectors. Through its academic partners, it is gaining access to the DNA of several populations, in addition to Tristan's. One of the oldest to be studied, according to Zamel, is a small Jewish trading community that settled in southern India more than 2000 years ago. The tightly knit group relocated to villages near Jerusalem in modern times, although it remains somewhat closed. Zamel says the incidence of asthma in this population is high—perhaps as much as 25%.

    The Sequana team also expects to have access through the University of Santiago, Chile, to people living on Easter Island, an isolated spot in the South Pacific. About 1000 of the 3000 inhabitants, Zamel says, are of Polynesian descent, and their DNA may be valuable. “We haven't collected any blood samples yet,” he says, only questionnaires. Sequana plans to check for asthma genes in other communities, as well—including an extended family of 170 in the Brazilian highlands outside Rio de Janeiro, a family of 120 in a small village in China, and small family groups that include asthmatics from Australia, Toronto, and San Diego.

  3. HUMAN GENETICS

    Tapping Iceland's DNA

    1. Eliot Marshall

    Harvard geneticist Kari Stefansson returned to his native Iceland last year to launch a remarkable enterprise: He aims to gather up and index the heredity of the entire nation—clinical records, DNA, family histories, and all. The result, he hopes, will be the world's finest collection of family data for studying the genetic causes of common diseases. He intends to make this resource available for a fee, and for pharmaceutical companies, part of the price will be free treatment for Icelanders if the research leads to a therapy.

    With venture capital of $12 million, Stefansson has launched a profitmaking company—deCode Genetics Inc., based in Reykjavik, Iceland—begun collecting genetic data, assembled a staff of 90, and already published a study. Now he's seeking partners to support studies that could lead to new diagnostic tests and drugs. And his proposal is attracting interest from both academic and industrial scientists, who anticipate that deCode will provide access to genealogical and clinical information of extraordinary quality on Iceland's isolated population. Says Mary Kay McCormick, director of the asthma project at the La Jolla, California, pharmacogenetics firm, Sequana Therapeutics Inc., “The Icelandic population would be very valuable” for gene hunters because of its long stability and homogeneity—perhaps even more valuable than the much-studied Finnish population.

    To make the project possible, Stefansson, deCode's president and CEO, has forged a partnership with local government and academic leaders, and he claims that nearly all of them fully support the project. But deCode must also win the public's trust, because the company's chief asset is Icelanders' DNA.

    Iceland's 270,000 citizens offer a valuable resource, because they have been isolated from the outside world since the Vikings settled the island more than a millennium ago. In addition, Stefansson says, Icelanders passed through two bottlenecks that further increased their genetic homogeneity—an attack of bubonic plague in the 1400s that narrowed the population from 70,000 to 25,000 and the eruption of the volcano Hekla in the 1700s, which brought widespread famine. Because so many Icelanders share the same ancestors, and because family and medical records are so good (the national health service began in 1915), it should be easier to identify a genomic locus linked to disease among Icelanders than it would be in an outbred population.

    Stefansson concedes that, in a sense, “we are mining the consequences of isolation and two natural disasters.” But, he says, “there should be some sort of poetic justice” to make up for Iceland's harsh past, and he views his company as “the mechanism to do that.” For example, he asks all drug companies using his data to pledge that they will distribute any drugs that result from those data to the people of Iceland free of charge during the life of the patent.

    DeCode Genetics also goes to great lengths to reassure Icelanders that their privacy will be protected. No medical records are brought into the company until identifiers have been removed and replaced by encrypted IDs, Stefansson says. The list that links names to coded IDs is kept in a guarded room, inside a double-locked safe that can be opened only by a supervising clinician. The company has taken a position that it will never go back to inform individuals about the results of studies on their DNA: The company's research is entirely anonymous, as consent forms indicate. But it will make available to Icelanders, free of charge, any diagnostic tests the company itself develops.

    Stefansson declines to say how many Icelanders have agreed to provide DNA so far. But deCode has “initiated projects in 25 common diseases,” including multiple sclerosis, psoriasis, preeclampsia, inflammatory bowel disease, aortic aneurism, and alcoholism. He adds that all but two of the families academic clinicians have contacted so far have agreed to contribute blood for research. The company is now looking for commercial partners for its studies. DeCode has no standard format for academic collaborations, but Stefansson says his “own personal bias is to publish results as swiftly as possible.” He notes, however, that “our principal duty is to make sure that research results are turned into benefits for patients.” That may require some publication delays to secure patents on important discoveries.

    One indication of the power of deCode's data, he says, is that it took only 2 1/2 months for the company's researchers to complete the analysis that appeared in the September issue of Nature Genetics linking a locus on chromosome 3 to an inherited disorder that causes shakiness, known as familial essential tremor. And Stefansson predicts that “a flurry of papers” based on deCode research will appear by the end of the year. For him, it is “a dream come true.”

  4. ITALY

    Crisis Over, 5-Year Plan Back on Track

    1. Susan Biggin
    1. Susan Biggin is a science writer in Venice.

    VENICE—Italian space scientists emerged from 5 days of torment last week as a government crisis put an eagerly awaited 5-year plan for the Italian Space Agency (ASI)—including a big boost for space science—in jeopardy. The center-left coalition government of Romano Prodi had been teetering on the edge of collapse after members of the Communist Refoundation Party walked out. Italy's 1998 budget—of which the ASI plan is a part—would have been shelved if the government had fallen. But on 14 October, the Communists returned, and 2 days later the government won a vote of confidence from Parliament. The budget and ASI's plans are now back on track.

    The 1998 to 2002 plan represents another step in the rehabilitation of an agency that has been dogged by problems since its establishment in 1988. Weak management and poor policy-making led successive governments to place the agency under the guidance of special commissioners. But since the appointment of Sergio De Julio as ASI president a year ago, it has been getting back on its feet. The new plan, which will be officially presented to the government's financial planning committee on 31 October, includes a budget request of $3.8 billion for the 5 years. Funding for 1997 was $0.62 billion.

    ASI's scientific director, Giovanni Bignami, points out that the three most important programs in the new plan are space science, the international space station, and Earth observation, each of which takes up about one-fifth of the budget. Space science received a significant boost over the minimal funding of previous years. “For the first time in the history of ASI, science is the biggest budget line,” says Bignami. This boost has won the approval of Italian researchers, including Antonio Ruberti, European Union research commissioner and former Italian research minister.

    The remainder of the budget will support three smaller programs: telecommunications, technology development, and a new launcher. A key component of the technology program will be a new series of low-cost, small national satellites, to be launched at the rate of about five per year. The new launcher, to be developed by Italy in collaboration with other countries, will be designed to loft small satellites up to a weight of 1 ton on a commercial basis.

    While ASI aims to bolster researchers and industry at home, “we remain staunch supporters of European Space Agency [ESA] programs,” says Bignami. ASI will also continue to pursue a number of international programs: Italy is already in the process of building a logistics module for the space station, while more long-term plans include participation in various NASA-led missions to Mars and the moon, and collaboration with Russia on the Spektrum series of scientific satellites.

    The plan also reshapes the structure of ASI into four divisions—scientific, technology and applications, strategic, and administrative—echoing the directorate structure of ESA. Bignami says the new structure should solve some of the agency's management problems. In the past, he says, members of the scientific and technical committees also took on executive responsibilities. The new structure separates top management from the committees.

    Despite the enthusiasm for the new plan, some researchers are skeptical that the requested budget will cover all the new initiatives. Although Italy is Europe's third largest spender on space, ASI's budget is still about half the amount Germany spends on space, and one-third of what France spends. “Having only one-twentieth of NASA's budget, we had to come up with some choices,” De Julio told Science, “concentrating on things we know we can do well.”

  5. HUMAN GENETICS

    NRC OKs Long-Delayed Survey of Human Genome Diversity

    1. Elizabeth Pennisi

    This year's special issue on the genome features gene families. Viewpoints and Articles begin on page 601. Related News articles below and on pages 564 and 569 look at the Human Genome Diversity Project, access to genetic materials, and the Environmental Genome Project.

    A proposed international survey of genetic variation across the entire human population just got a cautious nod of approval from the National Research Council (NRC), after having been mired in controversy for several years. In a report released on 21 October, a 17-person NRC committee of scientists, ethicists, and lawyers concluded that the Human Genome Diversity Project (HGDP), first suggested by Stanford University population geneticist Luca Cavalli-Sforza in 1991, is worth pursuing because it can lead to a better understanding of the origin and evolution of humans.

    But the committee recommended a less international, less technically ambitious project than some of its planners had envisioned. In particular, it said project organizers should set aside secondary goals, such as searching for new disease genes. Those goals would complicate the logistics of the project, the report said, and make it more difficult to protect the privacy and other rights of those who donated DNA. “[The HGDP] is meritorious and warrants support,” says the committee's chair, William Schull, a geneticist at the University of Texas School of Public Health in Houston. “But we could see that the ethical and legal issues might be the ultimate stumbling block that would doom the project.”

    Both proponents of the HGDP and potential backers welcomed the report's conclusion. The original intent had been to study human evolution, they note, and they had always intended to take whatever steps are necessary to be fair to study participants. “We agree with the concept of instituting a variety of safeguards for individuals and their communities,” says biological anthropologist Dennis O'Rourke of the National Science Foundation (NSF). Adds Cavalli-Sforza, “This is the strategy that we have already chosen.” But while he considers the report a green light for the HGDP, neither the NSF nor the National Institutes of Health (NIH) is close to putting up the estimated $10 million to $30 million such an endeavor might cost. NIH may even consider supporting other approaches to studying variation in the genome.

    Cavalli-Sforza and his colleagues proposed the HGDP as a way of getting a handle on questions ranging from human migration patterns to customs influencing patterns of human reproduction. Along the way, it might find mutated genes associated with diseases. The project would involve collecting DNA samples—from blood, hair, or cheek cells—from up to several hundred members of some 500 well-defined populations around the world. DNA from some families would be included. A predetermined set of genes—or markers—from each sample would be analyzed and compared to see how closely the populations are related. Then 25 of the samples would be immortalized in cell lines and the rest of the DNA would be banked for future use by researchers seeking to answer genetic, evolutionary, or anthropological questions.

    But the idea precipitated heated discussion among geneticists and anthropologists about which populations should be studied and how the research should be carried out—what markers should be analyzed, for example (Science, 20 November 1992, p. 1300). In addition, the project drew strong protests from groups concerned that indigenous populations would be exploited because researchers might try to patent their DNA for use in medical tests or other products without sharing the profits with the original donors (Science, 4 November 1994, p. 720). Taken aback by the furor, the NSF and the NIH—two possible HGDP funders—asked the NRC in 1996 to help make sense of all the conflicting views.

    In its report, entitled “Evaluating Human Genetic Diversity,” the committee came up with five possible strategies for the project, listing the potential legal, ethical, scientific, and economic pitfalls of each. Ultimately, it recommended one that would limit the biomedical value of the HGDP. “If the primary goal is to get some feel for [human] diversity, then to obligate investigators to a very [strong] biomedical commitment would stultify the program from the outset,” says Schull. “Let's focus on one target rather than focus on too many and have none of them be satisfied.”

    For example, in order to get a broad picture of diversity as economically as possible, with a minimum of ethical and legal complications, the report suggests that the project obtain samples from unrelated individuals within a given population and make sure the identities of the donors could not be linked to the samples. Besides protecting privacy, this approach would also free researchers from having to obtain further consent from donors for each new study done.

    But that strategy would mean that the data wouldn't be very helpful to researchers hunting disease genes. Gene hunting requires knowing what diseases the donors had and examining DNA from as many members of affected families as possible. But precisely because it is ill-suited to making discoveries with potential commercial value, anonymous DNA collection raises fewer ethical and legal concerns.

    The NRC report also recommended leaving open which genes or markers to analyze. New technologies under development, the committee pointed out, will allow researchers to do comprehensive analyses, even on small samples that could have many markers in common. “It seems to be shortsighted to fix on a set of markers when a year from now we could do much more,” says Schull. And the report concluded that establishing and maintaining cell lines would be so costly, and create such logistical problems, that the fledgling HGDP would do best to avoid them.

    Nor should the project try to set up a truly international effort until after the complex negotiations required to safeguard the rights of study participants are completed. For now, the committee concluded, the work should be limited to studies originating in the United States.

    On the whole, HGDP planners are relieved to have the panel's endorsement. Now, says NSF's O'Rourke, he and his colleagues can start figuring out how to make the HGDP a reality. Nothing will happen before fiscal year 1999 because no funding was budgeted for the HGDP in 1998.

    But the report has raised further questions about whether NIH will participate. “NSF from the start was more committed to the idea because of the clear benefit for anthropology,” says Judith Greenberg, a developmental biologist with the National Institute of General Medical Sciences in Bethesda, Maryland. She and her NIH colleagues will be weighing their options. “It might be possible that you don't have to collect lots and lots of samples from populations all over the world,” she notes. “There might be alternative ways that might be more cost effective and [that] avoid some of the legal and ethical issues.”

  6. HUMAN GENETICS

    Environment Institute Lays Plans for Gene Hunt

    1. Jocelyn Kaiser

    This year's special issue on the genome features gene families. Viewpoints and Articles begin on page 601. Related News articles below and on pages 564 and 568 look at the Environmental Genome Project, access to genetic materials, and the Human Genome Diversity Project.

    Bethesda, Maryland—Twelve people died after members of the Aum Shinrikyo cult unleashed a potent nerve gas called sarin in the Tokyo subway 2 years ago. Some of these victims, scientists now know, may have been much more vulnerable than others. Circulating in the blood of 25% of Asians and 10% of Caucasians is a version of the enzyme paraoxonase that converts sarin to a less toxic chemical about 10 times more quickly than the enzyme found in most people.

    The paraoxonase gene is one of dozens that toxicologists think make some individuals more susceptible to the effects of pollutants and other environmental chemicals, contributing to everything from cancer to birth defects and Parkinson's disease. Hoping to ferret out dozens more of these “environmental susceptibility” gene variants, National Institutes of Health (NIH) scientists are putting together a major effort to sequence DNA from perhaps 1000 people to try to demonstrate a link between certain genes and patterns of disease. “This is information that can really revolutionize public health policy” by making it possible to identify and protect people susceptible to hazards, says Ken Olden, director of the National Institute of Environmental Health Sciences (NIEHS), whose scientists conceived the so-called Environmental Genome Project.

    But the path to this revolution is far from certain. Top genome experts and population geneticists invited to an NIEHS symposium here last week to discuss the project's feasibility pointed to many potential obstacles. First, there's the novelty factor: No genome project has yet attempted to survey genetic diversity for a large number of human disease genes. Then there's the expertise factor: NIEHS, an NIH branch in Research Triangle Park, North Carolina, is best known for toxicology—not genomics—and will need plenty of help from the rest of the scientific community, some experts cautioned. And finally, there's the turf factor: National Human Genome Research Institute director Francis Collins has outlined an ambitious proposal to create a public data bank of so-called single nucleotide polymorphisms, or SNPs—minute variations in sequenced genes (Science, 19 September, p. 1752)—that might overlap with NIEHS's project. “There's a lot of unknowns in this endeavor, and we have to proceed cautiously,” acknowledges NIEHS scientific director Carl Barrett. But the project's uncertainties shouldn't hold it back, says symposium co-organizer Lee Hartwell, head of the Fred Hutchinson Cancer Research Center in Seattle. “There's no reason why we can't get started.”

    The idea for the undertaking follows several decades of work on common variations of genes involved in activating or detoxifying drugs and chemicals that we breathe, drink, or eat. “Each person basically has his own unique fingerprint of drug-metabolizing enzymes and receptors, so we all handle drugs [and chemicals] differently,” says Dan Nebert, director of the Center for Ecogenetics and Environmental Science at the University of Cincinnati.

    Variations, or alleles, of the gene for paraoxonase—an enzyme that breaks down toxic organophosphate compounds, including many insecticides—are one example. Many other genes, such as those in the cytochrome P450 and NAT families—which metabolize carcinogens—can increase cancer risk, especially in smokers. For instance, a smoker with certain NAT variants may have up to a sixfold greater risk of bladder cancer than nonsmokers with other NAT variants. (These cancer susceptibility genes will be discussed in a special report on cancer in the 7 November issue of Science.)

    Such variants appear to be much more common and less dangerous than mutations in genes such as the breast cancer gene BRCA1 that increase disease risk dramatically, seemingly independently of environmental stimuli. But the sheer prevalence of P450 and other variants could result in a major population risk. “If the allele is common,” explains NIEHS molecular epidemiologist Jack Taylor, “it can account for an incredibly large fraction of disease in the population.”

    So far, researchers have managed to indict only a handful of environmental risk genes. In just a few cases are there “really solid human studies that truly demonstrate” a link between one of these genes and disease, says toxicologist John Groopman of Johns Hopkins University. Carefully sifting through the genome to find new candidates, Taylor and others say, could have profound implications for encouraging susceptible people to avoid certain exposures and for setting safer standards for workers and the public (see sidebar).

    But NIEHS officials are unsure about how to begin the hunt and which genes to target first. At the meeting, Barrett sketched out NIEHS's preliminary thinking: First, scientists would set up a DNA repository from 1000 individuals representative of the major U.S. ethnic groups. Next, teams would use some combination of DNA chips and sequencers to resequence alleles of roughly 200 “candidate” genes involved in everything from DNA repair to digestion (see table). At about 10 cents per nucleotide, NIEHS says this step could cost $200 million. Next, researchers would sort out which alleles are common enough to be classified as polymorphisms—versions shared by 1% or more of the population. The project would then sponsor molecular, animal, and, finally, population studies—of sick people who had been exposed to a suspect chemical, for example—to find out how important these polymorphisms are to disease.
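
    A quick back-of-envelope check of the $200 million figure above, assuming an average of roughly 10,000 bases resequenced per candidate gene (a number not given in the article, used here only for illustration), is consistent with the quoted cost per nucleotide:

    \[ 1000 \text{ people} \times 200 \text{ genes} \times 10^{4} \text{ bases/gene} = 2 \times 10^{9} \text{ bases} \]
    \[ 2 \times 10^{9} \text{ bases} \times \$0.10/\text{base} = \$200 \text{ million} \]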

    Although experts at the meeting endorsed the project's overall goals, they found plenty to fault in the game plan itself. For instance, some meeting participants said, NIEHS may have greatly underestimated the project's price tag, which relies on savings from untested technologies such as the DNA chips for resequencing being developed by the biotech firm Affymetrix Inc. of Santa Clara, California. Also unclear is how much to sequence—whether regulatory regions should be included, for example.

    The biggest hurdle, however, is the crucial step that follows sequencing: determining the relevance of polymorphisms to disease. The key questions, says Barrett, are “how much variation to expect and whether you can distinguish important changes from unimportant ones.” Scientists argue that any effort to find suspect polymorphisms is complicated by the often dizzying variation of alleles. “We should not be too facile in thinking we can find causal sites,” says Penn State geneticist Ken Weiss. Adds National Cancer Institute clinical sciences director Ed Liu: “I'm increasingly impressed with the complexity of what's involved and how little data we have that would guide us.”

    The bottom line, experts say, is that NIEHS shouldn't try to go it alone—particularly in the early stages of sample gathering. “If we end up with several different efforts for collecting samples, that will really be too bad,” says Collins, who urged NIEHS to collaborate on the SNPs database. Indeed, Barrett says, NIEHS is now likely to join that effort, which, as Collins outlined at the meeting, would aim to assemble a DNA repository representative of at least some of the U.S. population. NIH will host a meeting on 8 and 9 December to decide “how to design the sample set,” Collins says. In the meantime, NIEHS plans to start compiling a list of candidate genes. (NIEHS is soliciting suggestions at: www.niehs.nih.gov/dirosd/policy/egp/home.htm)

    Even more uncertain is how NIEHS will move from genotype to phenotype. “Everybody realizes we really don't know how to do it yet,” Hartwell says. One way to approach this, suggests Liu, would be to start small—by resequencing, say, just 10 genes—and assess how well functional and epidemiological studies succeed at tying variations in those genes to disease before proceeding to the next batch of genes. Another idea is to pick a gene whose variance is already well understood—NAT2, for example—and characterize its alleles from scratch using DNA repository samples to see how approaching the problem from the genomic end compares to the earlier toxicology and molecular epidemiology studies.

    According to Collins, “it's going to be a few months to see how this shakes out.” Barrett agrees and says that NIEHS may delay putting out its initial request for grant proposals for $10 million worth of seed money until late next year, after key details of the project's design are worked out. There's no doubt, however, that NIEHS's grand plans have got others in the community revved up about environmental genes. “NIEHS is way out in front,” says genome sequencer Glen Evans of the University of Texas Southwestern Medical Center in Dallas. “This is kind of the wave of the future.”

  7. HUMAN GENETICS

    A More Rational Approach to Gauging Environmental Threats?

    1. Jocelyn Kaiser

    One of the biggest potential payoffs from an environmental genome project (see main text) is that it could help policy-makers devise rules that better protect sensitive individuals. “To have intelligent environmental regulatory policy, one has to begin to unravel the role of genetics in determining the differences in susceptibility,” says National Institute of Environmental Health Sciences (NIEHS) director Ken Olden.

    New direction. The time is right for NIEHS to search for susceptibility genes, says Olden. [Photo: Duane Hall]

    Olden's words are music to the ears of members of Congress who have been clamoring for better science behind regulations. Risk assessors at the Environmental Protection Agency (EPA) and elsewhere now craft rules with a standard fudge factor to try to protect sensitive individuals: They set the permissible exposure level to a chemical, for instance, at a tenth of that deemed acceptable for the general population. Data on the prevalence of susceptibility genes could reduce the need for guesswork, says NIEHS's George Lucier, who's helping write the EPA's dioxin reassessment. “As we get more and more information on the variation of environmentally relevant genes across the population,” he says, “we'll be able to more frequently…use real data.”

    In some cases real data might result in a less stringent standard and in others a tighter one. For example, some people may have a version of a detoxifying enzyme that makes them five times more sensitive than others to a pollutant. Risk assessors, then, might permit an exposure that's a fifth that of the acceptable level for the rest of the population—an exposure twice as high as they might set using the standard fudge factor. “It could go either direction,” Lucier says. “It could be a 100-fold factor or a twofold factor.” But political factors and economics, in some cases, will inevitably pull rank on science—particularly if the sensitive population is tiny. “It would obviously become extremely expensive to protect a few individuals,” says University of Washington, Seattle, toxicologist Dave Eaton.
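
    As a minimal worked version of that example, write E for the exposure level deemed acceptable for the general population (E is just a placeholder symbol, not a value from the article). The standard tenfold fudge factor and a data-based fivefold factor then give:

    \[ E_{\text{standard}} = \frac{E}{10} \qquad\qquad E_{\text{data-based}} = \frac{E}{5} = 2 \times \frac{E}{10} \]

    That factor of 2 is the sense in which a measured fivefold difference in sensitivity would permit an exposure twice as high as the default assumption allows, while a measured 100-fold difference would cut the other way.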

    Indeed, both scientists and regulators may struggle to “digest and understand the meaning and importance” of the initial data on environmental genes, says George Gray of the Harvard Center for Risk Analysis. For one thing, several genes may be involved in defining an individual's risk. Take, for instance, benzene, which at high exposures has been linked to human leukemia. After sampling liver tissue from 10 people, a team led by toxicologist Michele Medinsky of the Chemical Industry Institute of Technology in Research Triangle Park, North Carolina, found a 14-fold difference across the samples in the activity of CYP2E1, an enzyme that converts benzene into chromosome-damaging metabolites. If this variation is due to genetics, says Medinsky, then people with lower CYP2E1 activity would be “relatively protected” from benzene toxicity. But because people also vary in their activity levels of other enzymes that detoxify the benzene metabolites, the risk for someone with fast-acting CYP2E1, she asserts, “in fact might not be elevated at all.”

    Another bedeviling issue is how this information might be used to alter workplace exposure levels. One test case might be beryllium, an industrial metal that can cause an incurable lung disease. Four years ago, scientists found a genetic marker of susceptibility to beryllium disease carried by 30% of the population; 97% of a group of workers with the disease had the marker. Employers are now debating whether to screen workers for it. One worry is that a worker with a susceptibility gene could be denied a job.
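
    To see why screening is contentious, it helps to run the quoted marker figures through a purely hypothetical case. Suppose, for illustration only (the article gives no incidence figure), that 2% of a group of 10,000 exposed workers would eventually develop beryllium disease:

    \[ \text{cases: } 0.02 \times 10{,}000 = 200, \qquad \text{marker-positive cases: } 0.97 \times 200 \approx 194 \]
    \[ \text{marker carriers overall: } 0.30 \times 10{,}000 = 3000, \qquad 194/3000 \approx 6\% \]

    On those assumed numbers, screening out every carrier would bar roughly 3 in 10 workers from the job even though only a small fraction of them would ever fall ill, which is the trade-off behind the worry about job denial.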

    The trend in risk assessment is to factor in genetic susceptibility by developing “ranges and distributions” of risk, rather than a single number—leaving it to managers to work out how to use the information, says Gray. EPA's proposed cancer risk guidelines encourage this kind of analysis, but don't specify how to do it. Says Lucier, “I think the regulatory agencies need to really start getting their thoughts together about how this information will be used.”

  8. NUCLEAR PHYSICS

    An Element of Stability

    1. Richard Stone

    Three teams of nuclear physicists are racing to disprove the idea that superheavy elements inevitably become more unstable as they get bigger. In their sights is element 114, which is postulated to be exceptionally long-lived

    DARMSTADT, GERMANY—Physicists in three labs around the world are gearing up for an expedition to a remote island, where they hope to find a bit of relief from the unstable world they are used to. Getting there will hardly be a pleasure cruise, however. In fact, it is shaping up to be an all-out race, and the first to arrive could capture one of physics's biggest prizes.

    High points. Yellow peaks in this chart of isotopes mark those with full nuclear shells of protons or neutrons or both. [Chart: Peter Moller/LANL]

    This tantalizing territory is a so-called “island of stability” that is postulated to exist around the superheavy element 114. The explorers who are hoping to reach this island are relying on theoretical extrapolations for evidence that it even exists. But if they find it, the discovery “would revolutionize the fields of heavy-element nuclear physics and nuclear chemistry,” says Ken Gregorich, a nuclear chemist at Lawrence Berkeley National Laboratory (LBNL) in California, one of the three labs that are about to embark on the search.

    The chart for this voyage is the periodic table of elements. Like all maps it looks simple, but hides a wealth of complexity. Each of the 112 elements currently in the table has a collection of shadowy siblings, isotopes with different numbers of neutrons in their nuclei. While all the isotopes of a particular element have essentially identical chemical properties, their nuclear characters can be very different—for example, many are highly radioactive with half-lives measured in microseconds.

    Nature provides us with 94 different elements, up to and including plutonium, but since 1940 physicists have used nuclear reactors and particle accelerators to produce new elements with ever greater numbers of protons and neutrons in their nuclei. They have forged 20 elements, but the task is getting harder and harder and the elements more and more unstable—the latest, element 112, lasts a mere 280 microseconds.

    Theorists predict, however, that this trend toward instability will reverse as researchers approach element 114. One particular isotope of 114, theorists say, is situated at the center of an island of stability, with many of the isotopes around it more stable than the superheavy isotopes that researchers can make today. That's the promised land that is beckoning labs in California, Germany, and Russia. But the race is more than just an intellectual one. One laboratory, the Institute for Heavy Ion Research (GSI) here in Darmstadt, has dominated this field for the past 2 decades, indisputably discovering five of the last six known elements. But soon GSI will have some real competition: LBNL and the Joint Institute for Nuclear Research (JINR) in Dubna, near Moscow—both of which have long histories of forging new elements—are putting the finishing touches on new facilities and will soon join GSI in pursuit of the superheavies.

    All three institutes are planning experiments to forge superheavy elements as early as next spring, efforts that will be discussed at a meeting on 27 and 28 October of the Welch Foundation for Chemistry in Houston. “Element 114 is still the Holy Grail of this field,” says nuclear theorist Rayford Nix of Los Alamos National Laboratory (LANL) in New Mexico.

    Magic quest

    These nuclear navigators are taking their bearings from the arrangement of protons and neutrons in atomic nuclei. Current nuclear theory says that they form themselves into concentric “shells”—akin to the electron shells that surround the nucleus—with neutrons in one set of shells and protons in another. Each shell has a particular number of spaces for particles, and just as a complete electron shell makes for a chemically very stable element—one of the noble gases—a full nuclear shell gives that nucleus added stability. Helium, oxygen, calcium, tin, and lead, for example, each have a full shell, or “magic number,” of protons. So should element 114. Lead-208 is the largest known isotope that is “doubly magic,” also possessing a magic number of neutrons (126). Theorists predict that the next magic number of neutrons is likely to be 184, so the isotope that researchers at GSI, LBNL, and Dubna have their sights set on is the doubly magic ²⁹⁸114. Isotopes in the vicinity of ²⁹⁸114 are expected to have half-lives lasting up to years. This is “one of the fundamental predictions of modern nuclear theory,” says Dubna nuclear physicist Alexander Yeremin.
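
    The isotope labels in this story simply combine those proton and neutron counts: the mass number A of a nucleus is the sum of its protons Z and neutrons N. For the two doubly magic nuclei discussed here (lead has 82 protons, a figure not spelled out above):

    \[ ^{208}\text{Pb}: \; Z = 82, \; N = 126, \; A = 82 + 126 = 208 \]
    \[ ^{298}114: \; Z = 114, \; N = 184 \text{ (predicted)}, \; A = 114 + 184 = 298 \]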

    The hunt for the doubly magic 114 began in the 1960s, when physicists first suggested that closed nuclear shells might lead to stable superheavy elements. Some physicists were then predicting that 298114's half-life might even be millions of years. “There was a lot of enthusiasm,” says GSI director Hans Specht, “because some theoreticians felt that these elements would have exotic properties”—ones that could lead to new materials, fuels, or even weapons.

    But after just a few years' attempts to reach it, the island of stability seemed farther away than ever. The discovery of elements 104 to 106—whose first known isotopes had half-lives measured in milliseconds—suggested that in still heavier nuclei, sheer size would overwhelm the predicted stabilization of closed shells, and that such nuclei would split apart, or fission, instantly after formation. Conjuring fleeting glimpses of elements beyond 106 would require accelerators with more energy and detectors more sensitive than were available at existing facilities.

    At that time, the favored technique for creating superheavy elements was “hot fusion,” which involved smashing beams of helium and other light elements into a target made of a heavy element such as plutonium: In this reaction, each new fused nucleus must “cool off” by shedding neutrons. In the early 1970s, Dubna researchers pioneered a new approach, dubbed “cold” fusion, in which they fired a beam of moderately sized isotopes, such as cobalt, at the heavier stable elements lead or bismuth in the hope the two would have just enough energy to fuse. The Dubna group managed to produce isotopes of fermium (100) and rutherfordium (104), but their calculations suggested that to forge elements heavier than 106, cold fusion would require an additional burst of energy—an “extra push”—to get the atoms to fuse. “Many people believed that cold fusion would never work,” says GSI physicist Gottfried Münzenberg. In the end, Dubna shelved the technique.

    There were, however, no experimental data to back the extra-push theory, so while Dubna abandoned cold fusion, GSI took a gamble on the technique. “They believed in our method and adopted it,” says Dubna team leader Yuri Oganessian. In the mid-1970s, GSI physicists led by Peter Armbruster set up a beamline for cold fusion and constructed a new detector, dubbed the Separator for Heavy Ion Reaction Products (SHIP), to weed out the rare superheavy isotopes from the showers of collision debris. Their bet paid off in 1981, when they fired a chromium beam at a rotating bismuth target and detected the predicted signature of element 107: a unique chain of alpha-particle decays. “Each decay chain has its own fingerprint,” says GSI team leader Sigurd Hofmann.

    Dubna's postulated “extra push” turned out to be unnecessary: Within 3 years, cold fusion had delivered elements 108 and 109 to the GSI group. However, the collisions that spawned each successive element were rarer.

    To extend the periodic table even further, GSI embarked on a program in 1989 to increase the sensitivity of the SHIP detector 10-fold, finishing their modifications in 1994. That year, all three labs claimed to have created element 110—GSI and Berkeley via cold fusion, and Dubna via hot fusion. Who was first remains unresolved, but, in contrast to years past when there were bitter disputes between the labs over priority and the right to name new elements, most members of the three groups now favor deciding on a name for element 110 together. Later that year, however, GSI researchers had the limelight to themselves again when they forged 111, and in February 1996 the group fused zinc with lead to create 112—an event so rare it took about 3 × 10¹⁸ zinc ions fired at the lead to create two atoms of element 112.

    Since then GSI scientists have been running experiments to calculate the exact beam energy it will take to create 113 from a beam of zinc-70 blasted at bismuth. The group will try for 113 next spring and, if successful, will next try to forge ²⁸³114 from a beam of germanium ions and a lead target—an experiment that will push their equipment's sensitivity to its limit. “114 is at the borderline of possibility,” says Specht.

    Back in the race

    If GSI manages to create element 114 next spring, the 169-neutron version it is shooting for would still have fewer neutrons and be far less stable than the doubly magic ²⁹⁸114. Aiming to get even closer to this isotope is the Berkeley team, headed by LBNL's Gregorich and including two éminences grises of nuclear physics, Glenn Seaborg and Albert Ghiorso. Last fall, LBNL brought in GSI's Victor Ninov as a visiting scientist to help design its new $600,000 Berkeley Gas-filled Separator (BGS), which is expected to be ready for test runs in February 1998, after the lab finishes installing new magnets in its 88-inch (224-cm) cyclotron. One of the first BGS experiments, set for next fall, will aim to prove that LBNL was the first to create 110; the group will reproduce and extend the original work—whose partial decay chain suggested a sighting of one atom of element 110.

    Later next year, the Berkeley group plans to bombard plutonium-244 with calcium-48 in an attempt to create element 114 with 174 neutrons. Such an isotope would be closer to ²⁹⁸114, and should be far more stable than the version GSI will try to make. The LBNL team could also change its target to curium-248 in an attempt to make element 116 with 176 neutrons, also thought to be in the island of stability. “The BGS will have the best sensitivity for these reactions,” says Gregorich. But he acknowledges that the timetable depends on the untested BGS: “We haven't even found the bugs yet.”

    Also in the hunt for a stable version of 114 is the Dubna group, which, like LBNL, is finishing an upgrade. In collaboration with researchers at Lawrence Livermore National Laboratory (LLNL) in California, Dubna will use hot fusion to try to forge superheavies. The group has installed a new ion source for its cyclotron and is now working to jack up the intensity of its calcium-48 beam, a project it expects to complete next month. “We hope that the [new] source will make more efficient use of the very rare and expensive calcium-48,” says Livermore's Ron Lougheed. Early next year, Dubna will launch a series of experiments to produce neutron-rich versions of elements 110, 112, and 114. By producing a more neutron-rich version of 114, says Lougheed, “we may have a better chance at discovering how much stability these spherical shells add to element 114.”

    A dark horse in the race is Argonne National Laboratory (ANL) in Illinois, which by the end of the year will have finished installing a new ion source on its ATLAS accelerator. Some time next year, Argonne researchers hope to join the hunt for superheavy elements. However, acknowledges Argonne physicist Walter Henning, “we do not have the experience that the other laboratories have.…I don't count us a favorite to win this race.” Henning tips GSI to be the clear favorite.

    No matter which country plants its flag first, scientists are excited about having the island of stability in their sights. If such elements are discovered, says Gregorich, they “are sure to be the center of study for a large part of the nuclear physics and nuclear chemistry communities for many years.” For physicists, measuring the decay properties of these elements should provide new insights into how the atomic nucleus sticks together, while chemists would have a virtual terra incognita to explore: how these stable superheavies react chemically with other elements. Further in the future are attempts to create superheavies such as element 126, which some nuclear-shell theories suggest could be even more stable than 114. Beyond 126, says GSI nuclear chemist Matthias Schädel, “theoretical models simply aren't good enough to predict what might happen.”

  9. NEUROBIOLOGY

    Enzyme Linked to Alcohol Sensitivity in Mice

    1. Elizabeth Pennisi

    In the movie Days of Wine and Roses, the drunken stupors of the alcoholic couple add to the pathos of the story. But in the laboratory of Hiroaki Niki at the RIKEN Brain Science Institute in Wako City, Japan, the drunken stupors of certain genetically altered mice are cause for celebration: These mice are providing new insights into how alcohol affects brain neurons, and hints of why some people are more susceptible to alcohol addiction than others.

    The mice Niki and his colleagues are studying lack a functional gene for an enzyme called Fyn tyrosine kinase, which previous work had already linked to memory and learning. The researchers found that animals without the kinase stay drunk longer than normal mice, apparently because the enzyme normally counteracts alcohol's depressive effects on neurons, causing acute tolerance. When it is missing, the neurons—and the animals—take much longer to recover.

    In the past, one theory held that sensitive individuals simply break down alcohol more slowly, but this study suggests that sensitivity to alcohol is determined at least in part by levels of Fyn tyrosine kinase. If so, individual differences in enzyme levels could explain why some people are more prone to have trouble with alcohol than others.

    “Initial insensitivity [to alcohol] seems to be a strong predictor of alcoholism later in life,” notes pharmacologist Adron Harris of the University of Colorado Health Sciences Center in Denver. And because too little Fyn kinase increases alcohol sensitivity, too much of the enzyme might lead to lowered sensitivity. If that is the case, he adds, the Fyn kinase gene may be a target for researchers searching for genes that might predispose people to alcoholism. “Any gene encoding a protein that influences alcohol action becomes a candidate gene for alcoholism,” Harris says.

    Fyn kinase first became a center of attention 5 years ago. By producing mice lacking the Fyn kinase gene, Eric Kandel's team at Columbia University College of Physicians and Surgeons in New York City had shown that the enzyme is needed for spatial learning and long-term potentiation, a long-lasting alteration in nerve-cell excitability thought to underlie memory (Science, 18 December 1992, p. 1903).

    Independently, geneticist Takeshi Yagi of the National Institute for Physiological Sciences in Okazaki had made a similar knockout mouse strain to learn more about the enzyme's function in brain development. Niki then teamed up with Yagi to look at how the mice responded to various drugs, including sedatives such as ethanol. To assess alcohol sensitivity, Niki's RIKEN collaborator Tsuyoshi Miyakawa injected ethanol into the mice, put them on their backs in V-shaped troughs, and measured the time it took for the inebriated animals to stand upright.

    They found that while mice not given alcohol stand up immediately, normal injected mice take anywhere from 3 to 40 minutes, depending, Niki says, “on how much alcohol they were administered.” But for each blood level of alcohol tested, the mice lacking Fyn kinase took about twice as long to get up as the mice with a functioning Fyn kinase gene, the Japanese team reports.

    To try to find out how the tyrosine kinase affects neuronal responses to alcohol, Miyakawa and Yagi then assessed the enzyme's activity in knockout and normal mice both before and after alcohol exposure. They already knew that a biochemical effect of the tyrosine kinase is to add phosphates to a component of the NMDA receptor, a multiprotein complex in the nerve cell membrane that receives a specific chemical signal from adjacent cells.

    As expected, Niki's team found that giving ethanol to the knockouts didn't change NMDA receptor phosphorylation in their brains. But in normal animals, the researchers found that within 5 minutes of ethanol administration, phosphorylation of the receptor increased in a brain structure called the hippocampus, then dropped off slowly.

    This phosphorylation appears to be correlated with the neurons' recovery from suppression by the alcohol. When Nobufumi Kawai's group at the Jichi Medical School in Minamikawachi-machi monitored the electrical activity of nerve cells from the hippocampus in culture, they found that, at first, alcohol made the nerve cells less responsive and less likely to fire. Soon, however, the effect diminished in mice with Fyn kinase, presumably because the NMDA component became phosphorylated. After about 5 minutes, the cells were just as active as they had been initially, despite continued exposure to ethanol. In contrast, nerve cells from mice lacking Fyn kinase were still only half as responsive as normal after 15 minutes.

    Alcohol researchers are impressed with Niki's effort. “There's a nice correlation between the electrophysiology, phosphorylation, and the behavioral effects,” notes Harris's Colorado colleague Paula Hoffman. “It's quite convincing.” They note that this new role for Fyn kinase also fits well with what Kandel's group first discovered using knockout mice. “Tolerance, like learning, is an adaptation of the brain to an external stimulus,” Hoffman adds.

    Both Harris and Hoffman caution, however, that the work still leaves many questions unanswered. “I don't think you can explain all of intoxication through the NMDA receptor,” agrees neurobiologist David Lovinger, who does alcohol research at Vanderbilt University in Nashville, Tennessee. Even so, these researchers agree that Niki's results do offer another clue to the brain chemistry behind an important societal problem, one that, in life as in Days of Wine and Roses, can have all too tragic outcomes.

  10. EARLY UNIVERSE

    Cosmologists Celebrate the Death of Defects

    1. Andrew Watson
    1. Andrew Watson is a writer in Norwich, U.K.

    Cosmologists are sounding the death knell for one of the two main theories vying to explain how the universe unfolded in the fraction of a second after the big bang. In a bruising encounter with real data, the cosmological model known as defect theory was itself found to be defective. “A really good candidate for explaining large-scale structure [of the universe] has failed,” says Paul Steinhardt of the University of Pennsylvania.

    Fatal curves. Defect models do not fit the level of temperature fluctuations seen in measurements of the cosmic microwave background. (Illustration: L. Carroll; source: N. Turok/Univ. of Cambridge)

    Cosmologists are far from downcast, however. Even champions of defect theory are oddly upbeat about its demise, announced in several recent papers that compared its predictions with measurements of the cosmic microwave background, the radio hiss that reveals the contours of the early universe. “I'm not in the slightest discouraged. I regard this as a great success,” says Cambridge University's Neil Turok, the leading proponent of defect theory. The reason? The death of defects heralds a new era in which cosmologists can test their theories against data from new generations of ground- and space-based instruments. “[We] are at the threshold of a golden age of cosmology,” says Andreas Albrecht of London's Imperial College. “The demise of the standard defect scenarios is the first great success of this new golden age.”

    The starting point of modern cosmology is the big bang, the birth of the universe between 10 billion and 20 billion years ago. But theories of the big bang say little about how the universe's largest structures, such as galaxies and clusters of galaxies, came into being. Cosmologists believe these features were seeded by irregularities in the early universe, which are evident in the irregularities of the microwave background. “What created the inhomogeneities in the universe that produced galaxies and large-scale structure?” asks Steinhardt. This, he says, is “a basic puzzle of cosmology.”

    One possibility is inflation, a brief period of superfast expansion first proposed by Alan Guth of the Massachusetts Institute of Technology in the early 1980s. The surge of growth, between 10⁻³⁴ and 10⁻³² seconds after the big bang, would have amplified microscopic quantum fluctuations in the infant universe into large density variations. Areas of higher density would then draw in more and more matter and create galaxies.

    According to the rival hypothesis, defect theory, the seeds of structure were sown in a transition that took place about 10⁻³⁶ seconds after the big bang, when the observable universe was no bigger than a grapefruit. Until then, nature's fundamental forces had been united in a single force, but as the infant universe cooled from unimaginably high temperatures, they disentangled into the distinct forces seen today. This disentangling took place through a process called symmetry breaking. Defects, misalignments in the fabric of space, “are formed because different regions underwent symmetry breaking in a different manner,” says Turok, who in 1989 proposed many of the defect models now in question.

    Such defects resemble the faults in crystals formed by rapidly freezing a liquid: Particles do not have the time to arrange themselves into a single, faultless crystal, so boundaries form between regions with different alignments. Cosmic defects amount to localized concentrations of energy density, which attract matter: “That would have seeded the origin of galaxies and clusters of galaxies,” says Turok.

    Now, however, Turok has found fault with his own defect theory, as he reported in the 1 September issue of Physical Review Letters (PRL). Working with Ue-Li Pen at Harvard College Observatory and Uros Seljak of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, Turok has found that at least the simpler versions of the theory are in conflict with observations of the microwave background made by NASA's COBE satellite.

    The data were collected in the early 1990s, but fashioning a test of the defect theories was not easy. “The defect models are highly nonlinear and notoriously difficult to calculate,” says Albrecht. “What we've done is invent a technique which we can use to calculate what the effect of any event happening in the early universe will be on certain observable quantities,” explains Turok. The heart of what Princeton University astrophysicist David Spergel calls “an elegant calculation” is a trick for taking a theory of what is fundamentally a random process, the evolution from original defects to large-scale structures, and rewriting it as a sum of a series of nonrandom, ordered, predictable processes.

    Turok and his collaborators took the measured fluctuations in the cosmic microwave background, fed them into their new defect-evolution algorithm, which runs on a supercomputer, and used it to predict the distribution of galaxies we see in the universe today. “What we found…is that the level of fluctuations predicted for the galaxy distribution was wrong by about a factor of 2,” says Turok. “A factor of 2 discrepancy is regarded as fatal.”

    Complementary calculations from other groups support that conclusion. In the 6 October PRL, Albert Stebbins at Fermi National Accelerator Laboratory near Chicago and his collaborators published a more limited calculation that reaches conclusions Turok calls “qualitatively…very similar to ours.” And Albrecht, along with Richard Battye and James Robinson of Imperial College, has done a simplified version of the calculation that also “zeroed in on the predictions which cause defect models the most problems,” according to Albrecht. Taken as a whole, the new defect theory results are “a complete, irreconcilable disaster, in my judgment,” says Steinhardt. “I think the defect enthusiasts agree.”

    But the proponents of inflation theory should hold off popping the champagne corks just yet, warns Stebbins. “At the moment, the inflationary theories do a better job of fitting the microwave-background observations and the observations of large-scale structure,” says Spergel. But inflation theory has problems of its own. “The simplest inflation model fails by just as much as the simplest defect models do,” says Turok, as it predicts more galaxy clustering than is seen, he explains. What gets inflation off the hook is its adjustability.

    But for cosmologists, perhaps just as important as the fate of this or that theory is the fact that ideas can now be precisely tested against experiment. “Cosmology is in the midst of an experiment-driven scientific revolution,” says Spergel. “A few years ago, theories that deviated from observations by factors of 2 would be considered successful.” Now, the demand is for a few percent or better. Says Turok: “We are just on the threshold of getting the data which will test any theory to the limit.”

  11. EVOLUTION

    Bacteria Diversify Through Warfare

    1. Virginia Morell

    ARNHEM, THE NETHERLANDS – It's a civil war in there for many gut-loving bacteria, and the battles between the strains may help explain a microbial mystery: why Escherichia coli and other microbes are so genetically diverse. Peg Riley, an evolutionary biologist at Yale University, notes that while two humans might differ in 0.05% of their DNA, E. coli strains vary by 5%—“more diversity than you expect to find [in a single species],” she says. At the recent meeting here of the European Society for Evolutionary Biology, Riley described evidence that a chemical arms race could be helping to drive this genetic diversification by dividing group from group and descendants from ancestors.

    Mortal combat. Bacterial colonies shrivel when exposed to bacteriocins, chemical weapons made by competing strains. (Credit: P. Riley)

    The weapons in question are colicins, members of a class of chemical compounds collectively known as bacteriocins, which bacteria use to defend themselves and kill other, closely related strains. These weapons are often deployed in the gut, which houses several dominant strains of E. coli in the average mammal. When a new strain begins competing with a resident strain and resources grow scarce, both may release colicins.

    “Colicins may be their number one line of defense and offense,” says Riley. Designed to recognize specific receptors on other E. coli cells, the colicins are transported inside the enemy bacterium and kill it by disrupting cellular functions, for instance by chewing up the DNA.

    Each strain escapes harm from its own weapon by producing an immunity protein that turns off its own colicin's killer mechanism. “If it's not their strain of colicin, then they die,” explains Riley. “Normally, the bacteria don't have immunity to anything but their own colicin,” she adds. But just as a strain of E. coli can develop resistance to antibiotics (see related story), it can also evolve resistance to its competitors' colicins.

    That ability led Riley to suspect that like superpowers in an escalating arms race, the E. coli are under constant pressure to develop new defenses and weapons—and that they do so by a seldom-seen form of evolution called positive selection. Most mutations are harmful, and nearly all mutations—whether “good” or “bad”—are simply lost through genetic drift, explains Riley. “We don't have many examples of ‘good’ mutations that overcome the power of genetic drift. We suspected that this might be one.”

    Riley and her colleague Ying Tan tested their idea with several strains of E. coli carrying extra immunity genes that give them protection from the colicins of other strains as well as their own. In nature, natural selection should give an individual bearing the extra gene a huge advantage. “Because it's protected, it won't be swept out of the population by drift. That buys it time” to increase in numbers, says Riley. Indeed, the researchers found that just a single “superimmune” cell put in a flask with 100 million or more ancestral bacteria always ended up invading its competitors.

    Riley thinks such a strain's initial advantage might open the way to an additional—if Oedipal—blessing: a second genetic change that alters the E. coli's colicin and turns the strain into a “superkiller” that can eliminate its ancestor as well as other strains. “As the strain increases in frequency [because of the extra immunity gene], the greater the chance that it will evolve this second mutation: a colicin that its ancestor doesn't recognize” and thus can't disarm, explains Riley. Evidence that such genetic changes can happen comes from Japan, where researchers have actually created a superkiller strain via a single point mutation.

    Inevitably, says Riley, the advent of a superkiller strain will elicit a response from other E. coli, as new strains develop that carry new immunity proteins and new colicins that can overcome the variant strain's defenses. “This kind of experiment shows that such positive selection can act to produce more and more variety,” says Riley.

    Riley and Tan's findings go far to explain a molecular puzzle that Riley uncovered 5 years ago. In studying the evolutionary history of colicins to determine which were ancestral to which, she found an odd pattern in their DNA sequences: There was always a block, centered on the immunity gene and the end of the colicin gene, with astonishingly high levels of diversity. “I remember thinking, Wow! What could possibly explain that?” she recalls. She now thinks she has the answer: “This is the region that selection is actually acting on” as the E. coli evolve.

    “It's super work,” says Bruce Levin, an evolutionary biologist at Emory University in Atlanta, Georgia, “and goes a long way toward explaining how that enormous variation in E. coli arises and is maintained.”

  12. EVOLUTION

    Antibiotic Resistance: Road of No Return

    1. Virginia Morell

    ARNHEM, THE NETHERLANDS – They are one of medicine's biggest headaches: bacteria that have evolved resistance to those former wonder drugs, antibiotics. Now it appears that—contrary to everyone's hopes and microbiologists' expectations—these troublesome microbes will remain resistant long after doctors stop prescribing the drugs. That was the grim prognosis offered by Bruce Levin, a population geneticist at Emory University in Atlanta, at the recent meeting here of the European Society for Evolutionary Biology. Drawing on studies of bacteria from a day-care center and in the lab, he said, “I'm afraid I can't be optimistic. We can't go back again” to antibiotic-sensitive bacteria. “The best we can do is slow the pace at which resistance evolves and increases in frequency.”

    A match for medicine. E. coli is unaffected by six of eight antibiotics. (Credit: Bruce Levin)

    Researchers had hoped that bacteria that have become resistant to overused antibiotics would “evolve backward,” losing their resistance, because the resistant strains wouldn't be able to compete with the sensitive ones once the drugs were removed. “Theoretically, the genes responsible for resistance are supposed to adversely affect the bacteria's fitness,” Levin explains. “You're altering a gene's normal function and therefore expect it to have a disadvantage.”

    But a random survey last year of Escherichia coli bacteria collected from a day-care center in Atlanta by Levin and an Emory undergraduate, Bassam Tomah, suggested that the theory may not hold up. In a quarter of the samples taken from the diapers of 25 infants, the researchers found strains of E. coli still resistant to streptomycin, an antibiotic doctors have rarely used for the last 30 years. Adding to this puzzle are bacteria in Richard Lenski's long-term evolution study at Michigan State University in East Lansing. These E. coli originally carried a streptomycin-resistance mutation in a gene called rpsL, a change known to markedly reduce the bacteria's fitness. Yet, after evolving in an antibiotic-free environment for 10 years, or 20,000 generations, Lenski's bacteria are still streptomycin-resistant. “Why didn't that gene revert to its sensitive state, when it only required the change of a single DNA base?” asks Levin.

    To find out, Levin's colleagues Stephanie Schrag and Véronique Perrot allowed laboratory cultures of E. coli with rpsL mutations to evolve in an antibiotic-free medium for 16 days, or 160 generations. They then competed these evolved bacteria against drug-sensitive E. coli and found that the evolved strains were almost as fit as their drug-sensitive rivals. “That suggests that they evolved a compensatory mutation,” says Levin—a second genetic mutation that makes up for the loss of fitness from the first.

    Schrag and Perrot, with Levin and Nina Walker, confirmed that suspicion by making their evolved E. coli strain drug-sensitive again. They replaced the bacteria's streptomycin-resistant rpsL gene with a sensitive version of the gene, then set this genetically altered strain and the resistant strain against each other in another fitness-competition bout. The genetically altered E. coli failed miserably—implying that the compensatory mutation reduced its fitness when not paired with the resistance gene.

    The interaction between the two mutations would act as a kind of ratchet, preventing bacteria from reverting to sensitivity. “The compensatory mutations establish an ‘adaptive valley’ that virtually precludes that population of resistant bacteria from returning to drug sensitivity,” explains Levin. And that explains why the bacteria in Lenski's lab and possibly those in the children's diapers have not lost their resistance. “Those that revert, that make that one change, are at a disadvantage,” explains Levin. The team is now trying to identify the gene that carries this compensatory mutation.

    Levin suspects that the same kind of compensatory mutations “will almost certainly be found in other resistant bacteria.” But already, the findings have “clear, practical—and rather frightening—implications,” says Marlene Zuk, an evolutionary biologist at the University of California, Riverside. “It's not enough to stop using antibiotics; the bacteria aren't going to revert to what they were before”—and antibiotics that have lost their effectiveness won't become powerful weapons again.

  13. PALEONTOLOGY

    Does Evolutionary History Take Million-Year Breaks?

    1. Richard A. Kerr

    The history of life is one continuous upheaval, or so strict Darwinists would have it. Species come and go continually, as creatures either adapt to a changing environment and ever-shifting competition and evolve into new species, or become extinct. But lately, a small group of paleontologists has been asserting that evolution sometimes takes a holiday. In the fossil record of hundreds of millions of years ago, they point to examples of entire communities of marine animals that remain snared for millions of years in something close to stasis, then plunge into a brief frenzy of extinction and new species formation.

    Claims of such “coordinated stasis” have galvanized the paleobiology community into a frenzy of its own as researchers try to test the idea by studying how other animal communities fared over tens of millions of years. The first results to come in are “a mixed bag,” concedes paleontologist Carlton Brett of the University of Rochester in upstate New York, who, with Gordon Baird of the State University of New York, Fredonia, first proposed the concept of coordinated stasis in 1992, based on 400-million-year-old marine fossils from New York. “The pattern we have seen is holding up well in our rocks,” he says, but “that pattern is perhaps toward the extreme end of a range.” Indeed, most studies of similar fossil records have found little evidence for prolonged periods of evolutionary stasis.

    Yet confirmation of even occasional episodes of coordinated stasis in the fossil record could have major ramifications for understanding evolution. One proposed explanation for the stasis is that the species in the static ecosystems interacted so tightly that there was no room for change. If so, the more fluid ecosystems of recent times, in which individual species react independently to evolutionary pressures, may not be the evolutionary norm. The brief upheavals of accelerated evolution said to begin and end the periods of stasis are more widely accepted, but just as intriguing, hinting at little-understood evolutionary dynamics (see sidebar).

    The classic case of coordinated stasis comes from the animals that lived in ocean-bottom muds during the early Silurian to middle Devonian periods, about 440 million to 380 million years ago. Those muds hardened into fossil-bearing shales that are now found in Ontario, New York state, and Pennsylvania. Studying the rocks nearly a century ago, paleontologist Herdman Cleland noted that the array of fossil species, including the mollusklike stalked brachiopods, corals, mollusks, echinoderms such as starfish, and trilobites, seemed to change very little over many millions of years.

    It was not until the early 1990s that Brett and Baird, drawing on fossil specimens collected over 20 years, quantified the stability that Cleland had reported. They identified 14 intervals, generally running 3 million to 7 million years each, during which 60% or more of species persisted with little change. Within each interval, extinction, speciation—the formation of new species—and immigration of species from outside the now-vanished ocean basin were all more or less on hold, until the interval ended in a period of drastic turnover lasting just a few hundred thousand years.

    The herky-jerky pattern hearkens back to the revolutionary concept of punctuated equilibrium proposed by Niles Eldredge of the American Museum of Natural History in New York City and Stephen Gould of Harvard University in 1972. They argued that species tend to persist unchanged for millions of years before abruptly giving rise to a new species, instead of evolving gradually. Coordinated stasis “is punctuated equilibrium at a higher level, the ecological level of the community,” says Douglas Erwin of the National Museum of Natural History in Washington, D.C. But while there is finally some strong evidence for the reality of punctuated equilibrium (Science, 10 March 1995, p. 1421), many paleontologists have had a hard time swallowing the idea that all the species in a community could be held in check at once.

    Even more startling was the explanation for coordinated stasis advanced by Paul Morris of the Paleontological Research Institution in Ithaca, New York, Linda Ivany of the University of Michigan, Ann Arbor, and Kenneth Schopf of Harvard in 1995. They proposed that the ecological interactions among species—those that compete, prey on each other, or depend on each other—might have been so specific and intricate that a new species or an invader from outside the community could not break in. This interdependence was so strong, they suggested, that even if the community came under pressure from, say, climate change, it might not change—instead, it might shift en masse to a more suitable environment in deeper or shallower water. That kind of interdependence has no parallel in modern ecosystems, which constantly reorganize in the face of environmental change.

    Prompted by this provocative hypothesis, paleontologists are now searching for stasis in their own data sets, with mixed results. During the past 10 years, Mark Patzkowsky of Pennsylvania State University and Steven Holland of the University of Georgia, Athens, noted the comings and goings of brachiopod species in a roughly 20-million-year interval of the Ordovician period about 450 million years ago, as recorded in rocks in Tennessee, West Virginia, and Virginia. “Of all the studies out there, ours is most similar to Brett and Baird's,” says Patzkowsky. “Do we have the same pattern?” he asks. “The answer is no.”

    The record does include some intervals when evolutionary churning slowed, but even then the rates of speciation and extinction “are much higher than what Brett and Baird report,” says Patzkowsky. Fewer than 10% of the late Ordovician species persisted through an interval, compared with 60% in the Devonian. Stephen Westrop of Brock University in St. Catharines, Ontario, found similar patterns in the trilobites late in the Cambrian period, 520 million years ago.

    But the news for coordinated stasis is not all bad. Mark Harris of the University of Wisconsin-Milwaukee and Peter Sheehan of the Milwaukee Public Museum inventoried species in a brachiopod-dominated community from about 430 million years ago, early in the Silurian period just following the Ordovician. “The story's very similar to that in the Devonian of New York,” says Sheehan.

    To Schopf, the mixed findings suggest that “different parts of the fossil record may show this pattern more clearly than others. Some parts may not show it at all.” Some paleontologists see an exciting implication: The rules of the ecological game that enforced stasis may have changed over time, locking up ecosystems at some times but not others. Westrop, however, thinks the sporadic appearance of stasis implies that it is an artifact, appearing when the ecosystem is dominated by a group of animals in which individual species happen to be unusually resistant to extinction. For example, a species whose larvae disperse widely because they are free-floating in the sea should endure longer than one whose less mobile offspring could be wiped out by a local disaster. Westrop thinks he sees examples of a species duration effect in the Cambrian: The trilobite community, made up of typically short-lived species, was relatively unstable, while brachiopods, which had longer species durations, were more stable.

    No one expects paleontologists to agree anytime soon on how common coordinated stasis is or what accounts for it. “Everybody's data sets aren't quite comparable,” says Patzkowsky. “They're from different times, the rocks are different, the rock exposures are different, the animals are different. It makes it difficult to compare patterns.” And repeating any one of these massive studies would take a decade of a researcher's time and would mean infringing on another paleontologist's field area. The best hope, says Patzkowsky, is “a more consistent comparison of data sets,” using procedures that limit the number of variables, including the inevitably subjective process of defining species in the fossil record.

    Brett, who started it all, thinks there will be at least one sure payoff. “I think the idea of coordinated stasis has stimulated a lot of people to look at the record more closely,” says Brett. “Maybe that's the main benefit of it all.”

  14. PALEONTOLOGY

    When Evolution Surges Ahead

    1. Richard A. Kerr

    While paleontologists poring over their records of ancient sea creatures debate whether the evolutionary turnover of species ever slows toward stasis (see main text), they do agree that every few million years, it can speed up dramatically. All three studies done to test the idea of stasis in 400- and 500-million-year-old communities of mollusklike brachiopods showed these evolutionary “events.” So did studies of 500-million-year-old trilobites and 300-million-year-old crinoids—stalked marine animals that look like flowers.

    These events fall well short of the species turnovers that follow global mass extinctions, which strike every hundred million years or so on average: They are more frequent and may be limited to a single ocean basin. But in each one, upwards of 60% of species seem to be replaced over a period of a few hundred thousand years. At least some of these boundary events could be artifacts of the geologic recording process, says Steven Holland of the University of Georgia, Athens. At times when sea level falls rapidly, he says, a dearth of sediment deposition can compress the geologic record in shallow-water sediments, making the evolutionary clock seem to speed up. But by examining the events in sediments from deep water as well as shallow, he and Mark Patzkowsky of Pennsylvania State University have identified at least one event about 455 million years ago that is clearly real.

    Now the question is what triggered this and other surges. Holland and Patzkowsky identify several possibilities in the environment. Their studies of the sediments that record the evolutionary surge show a drop in sea level, a change in ocean circulation, and perhaps a change in water temperature. In general, “it looks like the coincidence of several rapid environmental changes is what undoes the system,” says paleontologist Carlton Brett of the University of Rochester in New York state. Together, he suggests, the changes push organisms so hard that they cannot adapt or move to more suitable conditions fast enough. As a result, leisurely change—or outright stasis—gives way to upheaval.

  15. NOBEL PRIZES

    Masters of Atom Manipulation Win Physics Prize

    1. James Glanz

    Three physicists who developed techniques for cooling atoms to near absolute zero won this year's physics prize. The chemistry prize went to three researchers for groundbreaking work on key enzymes involved in the body's cellular energy cycle. The champion of the hypothesis that infectious proteins underlie a class of degenerative brain diseases won the physiology or medicine prize earlier this month (Science, 10 October, p. 214).

    At the end of the last century, physicists were locked in a debate over whether atoms really existed or were creatures of theory. Near the end of this century, atoms are not only real; they are so commonplace that physicists cool, trap, bounce, and toss them with the facility of microscopic jugglers. That facility is due in no small part to techniques developed by three masters of atomic manipulation who have been awarded the 1997 Nobel Prize in physics. These masters showed how atoms could be chilled to millionths of a degree above absolute zero and trapped by bombarding them with laser light.

    Work by the three recipients—Steven Chu of Stanford University, Claude Cohen-Tannoudji of the École Normale Supérieure in Paris, and William Phillips of the National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland—has proved invaluable for basic physics, because the ability to slow and trap isolated atoms allows them to be studied at leisure. It has also led to dozens of applications, including astonishingly accurate clocks, high-precision devices for measuring the pull of gravity, and lasers made of coherent “waves” of atoms. The work has “opened up a new world, a new field,” says Daniel Kleppner of the Massachusetts Institute of Technology.

    The techniques honored by the prize rely on a few basic facts of optics and atomic physics: Atoms can absorb and reemit photons of light at specific frequencies, and those photons carry momentum. If a laser beam's frequency is tuned just below one of those absorption frequencies, a stationary atom will be nearly oblivious to the light, but if the atom is moving into the beam, the so-called Doppler shift will raise the frequency seen by the atom, like a police whistle heard from a passing car. The atom can then absorb the beam's photons, whose momentum has the effect of tennis balls bouncing off an oncoming basketball.
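
    As a rough sketch of the physics just described (the sodium numbers are illustrative, not taken from the text): an atom moving with speed v toward the laser sees the light Doppler-shifted up in frequency,

    \[
    \nu_{\text{seen}} \approx \nu_{\text{laser}}\left(1 + \frac{v}{c}\right),
    \]

    so a beam tuned just below the atomic resonance is absorbed preferentially by atoms heading into it. Each absorbed photon carries momentum \(h/\lambda\); for a sodium atom and its 589-nanometer resonance line, that amounts to a velocity kick of only about 3 centimeters per second, so bringing an atom moving at several hundred meters per second nearly to rest takes tens of thousands of absorption-and-reemission cycles.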

    Working with ions rather than atoms, researchers including David Wineland of NIST in Boulder, Colorado, showed in the 1970s that bathing the particles in laser light could lower their velocity in every direction—cool them, in other words. In 1985, Chu and co-workers were the first to use this strategy to chill neutral atoms, which are much harder to control, because static electric fields have little effect on them. Their initial efforts cooled the atoms to 240 millionths of a degree above absolute zero (240 microkelvins). By 1988, Phillips had pushed the temperature down to 40 microkelvins, while also developing better ways to measure it.

    Just how cold atoms can get in a bath of laser light depends on how narrow a margin is left between the frequency of the lasers and the frequency absorbed by the atoms at rest. But the lasers can't be tuned too close to a rest frequency, because most absorption windows span a range of frequencies. To get around this limit and push the temperatures even lower than Chu and Phillips had reached, Cohen-Tannoudji and others showed how to exploit narrower “windows within windows,” created by absorption processes that become apparent only in the presence of intense light.

    Still more recently, Cohen-Tannoudji and his group have come up with tricks to keep the windows open for moving atoms but close them entirely for stationary ones, leading to even colder temperatures, down to a fraction of a microkelvin. “It's exciting to see how far one can go,” says Cohen-Tannoudji.

    “What's really so dramatic is that you can have such marvelous control over these atoms,” says Kleppner. By adding spatially varying magnetic fields to the cooling setup or making use of other optical forces, he notes, Phillips, Chu, and others showed how to trap the atoms once they are cooled, holding them steady for lengthy experimentation.

    Out of those successes has come an explosion of new science. In 1995, Eric Cornell of NIST and the University of Colorado in Boulder and colleagues used laser cooling along with other techniques to make atoms so cold that they formed a Bose-Einstein condensate, in which their quantum-mechanical waves all overlapped to create a strange new state of matter (Science, 14 July 1995, p. 152). Phillip Gould of the University of Connecticut at Storrs and others are playing atomic billiards: studying how atoms behave when they collide at extremely low velocities. “The only way to [do] that is with laser cooling,” says Gould.

    Chu and co-workers are also putting chilled atoms to work by kicking them out of a laser trap. Timing the atoms as they rise and fall in an “atomic fountain” gives the most precise measurements of Earth's gravity ever made. Oil companies are now using the device to search for deposits, which are marked by tiny dips in local gravity, says Chu.

    At NIST in Boulder, researchers are turning a similar fountain into an atomic clock that should eventually be accurate to one part in 10¹⁶. An atomic clock's accuracy depends on how long the atoms can resonate within a microwave cavity, says Tom Parker of NIST, and the atomic fountain boosts that time from the tens of milliseconds of standard technology to a second or so. In fact, a team in France has already built a fountain clock that keeps time to a few parts in 10¹⁵, says Parker. “There's much more to come,” says Chu. “Most of the applications, I didn't dream of in 1985.”
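
    A rough way to see why that interrogation time matters (the cesium frequency is the standard value; the linewidth scaling is the usual Ramsey-style argument, not a detail given in the text): the width of the resonance traced out during a measurement lasting a time T is roughly

    \[
    \Delta\nu \sim \frac{1}{2T}, \qquad \frac{\Delta\nu}{\nu_0} \sim \frac{1}{2\,T\,\nu_0}, \qquad \nu_0 = 9{,}192{,}631{,}770~\text{Hz for cesium},
    \]

    so stretching T from about 10 milliseconds to a second narrows the fractional linewidth a hundredfold, from roughly 5 × 10⁻⁹ to 5 × 10⁻¹¹. The quoted accuracies of parts in 10¹⁵ or 10¹⁶ then come from splitting that narrow line and averaging over many atoms and many fountain cycles.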

  16. NOBEL PRIZES

    Chemistry Prize Taps the Energy of Life

    1. Robert F. Service

    Every day our cells synthesize multikilogram quantities of a power-packed molecule known as adenosine triphosphate (ATP), a cellular fuel that drives processes ranging from the firing of nerve cells to muscle contraction. Last week, this year's Nobel Prize in chemistry went to three researchers for their “pioneering work” on enzymes that create and burn this ubiquitous compound. American Paul Boyer and Briton John Walker shared half of the $1 million prize for deducing the remarkable molecular machinery of an enzyme called ATP synthase, which catalyzes the production of ATP. Danish physiologist Jens C. Skou took the other half of the prize for his discovery of another enzyme that's the body's biggest ATP user.

    Researchers identified ATP synthase in 1960, in the cellular power plants known as mitochondria, where ATP is constructed. A series of enzymes inside these power plants breaks down energy-rich compounds formed by the metabolism of nutrients, liberating energy that's used to pump hydrogen ions across a membrane inside the mitochondria. The pumping leaves an internal region with a relative shortage of hydrogen ions. In the early 1960s, British biochemist Peter Mitchell proposed that hydrogen ions streaming back through the membrane somehow cause ATP synthase—which is bound into the membrane—to create ATP.

    By the 1970s, researchers had determined that ATP synthase consists of three sets of protein assemblies: a wheellike structure lodged in the mitochondrial membrane, a rod with one end fixed to the wheel's hub, and a large cylinder that wraps around the other end of the rod and sticks into the internal region of the mitochondria. Several labs subsequently showed that ATP is created at a trio of sites on the cylinder, and that the rod plays a key role in turning on the catalytic activity at these sites. But again, the underlying mechanism was mysterious.

    Boyer, a biochemist at the University of California, Los Angeles, put the pieces of the puzzle together. He theorized that hydrogen ions cause the wheel to spin as they pass through the mitochondrial membrane back to the central region, much as rushing water turns a water wheel. Because the rod is attached to the wheel, it spins too, causing the other end to rotate within the stationary cylinder. This rotation slightly alters the structure of the trio of active sites within the cylinder, causing each in turn to snag the building blocks of ATP, synthesize a molecule of ATP, and release it.

    “It was a startling new idea,” says Joseph Robinson, a professor of pharmacology at the State University of New York (SUNY) Health Science Center in Syracuse. It wasn't Boyer's only one: He and his co-workers found that rather than using energy to link chemical building blocks together, as most enzymes do, ATP synthase uses energy to bind the building blocks to the enzyme and kick out ATP molecules once they have formed. The formation of ATP itself doesn't require excess energy in the special catalytic environment of the enzyme, but it takes considerable energy to pry the molecule loose so that it can be put to use elsewhere. “It's a beautiful little molecular machine,” says Boyer.

    In 1994, Walker and his colleagues at the Medical Research Council Laboratory of Molecular Biology in Cambridge, United Kingdom, verified some of Boyer's theories by using x-rays to create an atomic-scale map of the catalytic portion of the enzyme, made up of the cylinder and rod. The resulting three-dimensional (3D) structure “had a tremendous effect,” allowing researchers to see exactly how the enzyme's mechanism worked, says Robinson's SUNY colleague Richard Cross.

    But despite this progress, ATP synthase hasn't yet yielded all its secrets. Researchers have yet to determine just how the rush of hydrogen ions through the membrane spins the enzyme's wheel. “That's the $64,000 question,” says Robert Fillingame, a biochemist at the University of Wisconsin, Madison, whose own group is racing to determine the atomic structure of that part of the molecule.

    Skou, at Aarhus University in Denmark, approached the fuel cycle from the other end. In 1957, he discovered the first enzyme that uses energy stored in ATP to pump ions across cell membranes. Researchers already knew that ions were often more heavily concentrated on one side of a cell membrane than on the other, “but people didn't know how the transport of those ions was accomplished,” says Robinson.

    Working with cell membranes from crab nerves, Skou discovered that an enzyme in these membranes was breaking down ATP when sodium and potassium ions were both present in the surrounding medium. After a series of tests in which he systematically altered the concentration of these ions and watched their effect, Skou surmised that an enzyme in the membrane pumps sodium out of cells and potassium in. Skou's enzyme, known as sodium, potassium-ATPase, uses about one-third of all the ATP the body generates to shuttle these ions back and forth, an activity essential for nerve signal firing as well as a host of other cell functions. Hundreds of related ATP-using transport enzymes have since been discovered.

    Like ATP synthase, however, the detailed workings of this ATP-burning enzyme are still shrouded in mystery. Researchers have worked out the protein's amino acid sequence, and they know which regions are embedded inside cell membranes and which are not. But it's proving hard to grow crystals to determine the 3D structure. And without the structure, says Skou, “we still do not understand fully how this machine works.”
