News this Week

Science  18 Jun 1999:
Vol. 284, Issue 5422, p. 1898

    Top Scientists Lock Horns in Research Reform Debate

    Michael Balter

    PARIS—A nationwide debate on the future of French research heated up last week, as some of the country's leading scientists presented often conflicting reform proposals at a daylong hearing held here at France's National Assembly. The 9 June hearing, attended by about 200 researchers and conducted by parliamentary deputies Pierre Cohen and Jean-Yves Le Déaut, was the penultimate event in a parliamentary inquiry launched last February by Prime Minister Lionel Jospin at the request of research minister Claude Allègre (Science, 5 March, p. 1442). The inquiry will conclude on 26 June with a mass colloquium in Paris, after which the deputies will make their recommendations to Jospin.

    Allègre had resisted the notion of holding a national debate before moving ahead with what he considered essential reforms. But his attempts to forge closer relations between the universities and public research agencies—such as the basic research agency CNRS and the biomedical agency INSERM—provoked such fierce resistance from many researchers, who feared the measures would dilute the quality of French science (Science, 18 December 1998, p. 2162), that the government was obliged to consult with France's scientific community, via a Web-based forum and numerous small hearings and consultations held throughout the country.

    The latest hearing, which featured two invited panels—one made up of established researchers and the other of younger scientists—was largely dominated by the divergent views of two well-known figures in the debate: molecular biologist Pierre Chambon, director of the Institute of Genetics and Molecular and Cellular Biology near Strasbourg, and chemist Henri-Edouard Audier of the Ecole Polytechnique near Paris. Chambon briefly presented his “wish list” of reforms—originally drawn up by a committee commissioned by Allègre and headed by Chambon but never published—which include ending the “researcher for life” status of French public service scientists. Chambon argues that the universities should be given the major role in recruiting young researchers and that once researchers have obtained university or research agency positions, they should undergo evaluations by international scientific committees every 5 years to continue receiving research funds. Those who fail to pass muster, he says, should be assigned to teaching or administrative duties.

    As for the public research organizations, Chambon argued that the research funds of agencies such as the CNRS should be given only to the best scientists—whether they are university teachers or full-time public researchers—rather than being spread around only among their own staff, as is currently done. “We must transform the research organizations into powerful granting agencies,” he said. Chambon also advocated eliminating the current rung of permanent entry-level positions in the universities and research organizations and replacing them with temporary postdoctoral positions, which are rarely available in France at present.

    But the Chambon plan drew criticism from Audier and some other members of the panel. “I am absolutely opposed to replacing [the entry-level positions] with postdocs,” Audier said. “How can we hang on to the best scientists if we offer them a mediocre salary over 4 or 5 years when industry can offer them a good salary right away?” And panel member Alain Deshayes, a former researcher with the National Institute for Agronomic Research who now serves as a research director for the Swiss corporation Nestlé, expressed concern about the consequences of “sending researchers in decline into teaching.” Audier agreed: “Pierre Chambon thinks creativity is a function of age. I don't believe that.”

    Chambon countered that it is “competence, not age” that matters most, adding that if a scientist is brilliant enough to pass an evaluation every 5 years he or she could still be a researcher for life. As for the lack of funding for postdoctoral positions in France, Chambon argued that its major effect is to force French researchers to do their postdocs abroad. “It doesn't seem to bother anyone if [they] take on insecure positions in the United States,” he said. But although the panel members disagreed on many issues, all were in accord that no serious rapprochement between the universities and the research agencies would be possible until the current heavy teaching loads of young university teachers were reduced. “It leaves us little time for research,” said panel member Isabelle Kraus, an assistant professor of physics at the Louis Pasteur University in Strasbourg. “We have to do research in the evenings and at the weekend.”

    Given these differing visions for French research, and the high stakes for France's future scientists, the final colloquium of the inquiry on 26 June is sure to be heavily attended: Cohen and Le Déaut have reserved the 1000-seat auditorium at the Sorbonne for the event.


    How Aztecs Played Their Rubber Matches

    Erik Stokstad

    When 16th century Spanish clerics came to the New World, they were enthralled by a fast-paced and sometimes bloody sport. Teams of up to six athletes would whack heavy, solid balls through hoops several meters above the stone courts using anything but their hands or feet. Apart from the occasional postgame human sacrifice, what most astonished the Spanish were the ricocheting balls. “I do not understand,” wrote Pedro Martyr, the official historian of the Spanish court, in 1530, “how when they hit the ground they are sent into the air with incredible bounce.” For Europeans used to playing with pigskins, the rubber balls were practically miraculous.

    The native Americans made their seemingly magical material, Martyr wrote, by collecting sap from lowland trees and mixing in juice from a vine. Four centuries later, this crude recipe has finally given up some of its secrets. On page 1988, researchers describe how the Olmec, Maya, and other ancient Mexican and Central American cultures turned raw latex into rubber. This feat of chemistry, which converts the slippery polymers in raw latex to a resilient structure, was not duplicated until the mid-19th century. “It's a marvelous example of technology demonstrated at an incredibly early stage,” says Frank Bates, a polymer chemist at the University of Minnesota, Minneapolis.

    The ball game, invented at least 3400 years ago, was an important ritual for many Mesoamerican societies. To the Maya, for instance, the game—called chaah—reenacted portions of their creation story. By the 5th century A.D. many towns had central stone courts, some of which could hold thousands of spectators. Leaders tested prophecies through tournaments, rival cities took out their aggressions on the court, and the rich placed huge wagers. According to a 16th century codex, the Aztec capital Tenochtitlan demanded 16,000 rubber balls each year as tribute from one province. The ballmakers “were the ancient equivalent of Rawlings,” the sporting goods manufacturer, says Warren Hill, an archaeologist at the New World Archaeological Foundation of Brigham Young University in Provo, Utah. These societies also used rubber for a host of other products, including religious figurines, incense, and even lip balm.

    Last summer, Massachusetts Institute of Technology (MIT) archaeologist Dorothy Hosler and undergrad Michael Tarkanian traveled to Chiapas, Mexico, to gather the raw materials for rubbermaking mentioned in ancient documents. To their surprise, they saw farmers collecting latex by slashing the bark of Castilla elastica trees, then mixing in juice from pulverized morning glory vines that wrap around the trees—just as the 400-year-old texts described. “It was amazing,” recalls Tarkanian. “After about 10 minutes, a mass of rubber rose to the surface. We formed it into a ball that would easily bounce over your head.”

    The pair brought the ball, as well as raw latex and vine juice, back to their lab. A battery of tests showed that the homemade rubber was about twice as elastic as dried latex, which cracks when handled. With MIT materials scientist Sandra Burkett, the researchers probed the material with nuclear magnetic resonance spectroscopy, finding unidentified organic compounds in the latex that were absent from the rubber.

    The team speculates that some of these mysterious compounds might be plasticizers, which would keep the latex runny by preventing its polymer molecules from linking to each other. (Modern rubber is made by cross-linking polymers.) If the vine juice dissolves the plasticizers, the researchers thought, polymer molecules would be more likely to entangle and form a rubbery mass. Although they failed to find direct evidence for cross-linking, they did discover vine juice components—traces of sulfonyl chlorides and sulfonic acids—that can react with polymers, stiffening segments and making them more likely to interact. The team says that only a few such entanglements would be enough to give the rubber its spring.

    Understanding ancient rubbermaking “teaches us how conscious these people were of their environment and how they were able to manipulate it,” Hosler says. She and her colleagues next plan to test rubber made with varying amounts of vine juice to see whether the Olmec, Maya, and Aztec could have engineered rubber with specific elasticities. No matter what they find, the Mesoamericans have earned the respect of modern chemists. “To discover [the process] and refine it to make those products is impressive,” says Bates. “They probably had a pretty good R&D team.”


    Mutant Fruit Flies Respond to Lorenzo's Oil

    Marcia Barinaga

    The 1993 movie Lorenzo's Oil raised the profile of adrenoleukodystrophy (ALD), a fatal hereditary brain disease that strikes one in every 20,000 boys. The film told the true story of how one patient's parents set out to find a cure. Their brew of fatty acids, now known even among researchers as “Lorenzo's oil,” didn't become the cure they had hoped for. But on page 1985, Kyung-Tai Min and Seymour Benzer of the California Institute of Technology in Pasadena report that a component of the oil prevents neural decay in fruit flies with a similar condition. The finding might spark new research on the oil, and it already has researchers enthused about the potential of the mutant flies for studying what causes ALD and how it might be treated.

    “Using the fruit fly as a model … is extremely exciting,” says neurologist Hugo Moser, who studies ALD at the Kennedy Krieger Institute at Johns Hopkins University School of Medicine in Baltimore. Neurologist Dennis Choi, of Washington University School of Medicine in St. Louis, agrees. Benzer and Min “didn't just simply create the model”; they also showed that it could identify “something that is of interest for human treatment,” he says. “That is a validation.”

    Benzer and postdoc Min made their discovery as part of a project to find fruit fly mutations that mimic those that cause human neurodegenerative conditions. First they mutated flies with P elements, little bits of DNA that jump around the fly's genome, inactivating genes. Min then looked for mutants with shortened life-spans and examined them for signs of dying neurons. In one such mutant, one of the neural layers in the flies' eyes had an abnormal bubbly appearance, prompting the researchers—who had already named other neurodegenerative mutants after foods—to call it bubblegum.

    When Min cloned the mutated gene, it turned out to encode a protein whose sequence suggests it is an acyl coenzyme A (CoA) synthetase, a type of enzyme that helps break down fatty acids. The involvement of that enzyme brought ALD to mind, Min recalls, because a similar enzyme is impaired in ALD. In the human disease, which is passed from mother to sons on the X chromosome, the impairment is indirect, because the primary mutation is in a gene encoding one of a class of proteins that transport substances across membranes. Researchers do not know how the transporter affects the synthetase, but the synthetase's decreased activity results in high blood levels of very long chain fatty acids (VLCFAs) in ALD patients and—also for poorly understood reasons—progressive degeneration of brain neurons. The ALD link prompted Min to see whether VLCFA levels are also high in the mutant flies, and sure enough, they were.

    The researchers then tested whether the flies respond to Lorenzo's oil, which is a mixture of two fatty acids. In humans, the oil lowers VLCFA levels by slowing their synthesis, although so far in clinical trials it has shown little or no effect on disease progression. Benzer and Min saw something similar when they treated adult mutant flies with glyceryl trioleate oil, one of the components of Lorenzo's oil. The flies' VLCFA levels dropped, but their brains still had dying neurons. Min tried another approach, beginning treatment when the flies were larvae. When those flies grew to adulthood, says Min, “the pathology was not there.”

    On the face of it, that finding suggests that starting oil treatment early might ward off the neuron death. But giving the oil to humans before symptoms begin doesn't seem to have increased its effectiveness, says Moser. One reason may be that humans have a barrier that prevents easy transfer of many substances—including Lorenzo's oil—from the blood to the brain. Fruit flies lack such a barrier, and Moser says the fly results may rekindle efforts to alter Lorenzo's oil to enable it to enter the brain.

    ALD and the fly condition differ in other ways as well. Not only are the mutations in different genes, but flies lack an inflammatory response, which may contribute to the human disease. And the sick fly neurons look different from the dying neurons in ALD. Although flies may be an imperfect model, there are enough similarities to expect that “we can learn a lot” from them, says biochemist Paul Watkins, who studies ALD at the Kennedy Krieger Institute.

    The flies should allow researchers to quickly screen potential ALD treatments for their effects on diseased brain cells, says Choi. Compounds identified this way would not be a sure bet against the human disease, he cautions, but they “would be good starting points” for further study. Watkins adds that the mutant flies may provide insights into the molecular basis for ALD. His lab is searching for the acyl CoA synthetase that is key in ALD, and although they have found six human versions of the enzyme so far, none is abundant enough in the brain to fit the bill. The defective fly enzyme belongs to a different subfamily than the known human enzymes, and because its mutation causes neuropathology, he says, it “could be the one we're missing.”

    If any of these applications proves fruitful, Benzer will no doubt be pleased. He says his goal was to establish flies as a disease model that, although not perfect, would provide “hints on physiology which may or may not extrapolate to humans.” And he has his sights set beyond ALD; bubblegum, he says, is just one of a “whole zoo” of mutant fly strains developed by his lab that suffer neural degeneration and might turn out to be useful models of other human diseases.


    Canada Dedicates New Human, Animal Labs

    Jeffrey Mervis

    The world health community has a new weapon in the fight against emerging infectious diseases. Last week the Canadian government dedicated a $100 million facility in Winnipeg, Manitoba, that features side-by-side maximum containment laboratories for the study of viruses that kill animals and humans. The labs are Canada's first biosafety level 4 facilities and represent the only site in the world where scientists will be able to work with livestock to study both the medical and veterinary aspects of zoonotic diseases—diseases that hop from animals to people, including such recent pathogens as the Ebola, Nipah, and Hendra viruses.

    “It's an outstanding facility, well designed and constructed,” says Jonathan Richmond, who oversees the level 4 lab at the U.S. Centers for Disease Control and Prevention (CDC) in Atlanta and has served as a technical adviser for the Canadian lab. Typically, level 4 labs specialize in either human or animal diseases; as a result, the animal labs lack medical expertise, whereas the medical labs lack the capacity to work with large animals such as pigs and cattle. “What's exciting is that [the Winnipeg lab] will bring together a critical mass of people to understand the pathogenicity and progression of diseases that affect both animals and humans,” says Richmond.

    The lab complex, called the Canadian Science Center for Human and Animal Health, grows out of a political promise made in the 1980s by then-Prime Minister Brian Mulroney to spur economic development in the western provinces. The center, which also contains several level 2 and 3 labs for less hazardous organisms, is operated jointly by two federal agencies, Health Canada and the Canadian Food Inspection Agency. Both agencies have transferred personnel from Ottawa to the new facility in the past year.

    The human level 4 lab is headed by Heinz Feldmann, a 40-year-old German physician and virologist trained at CDC and the University of Marburg in Germany who has spent the past decade studying viral hemorrhagic fevers such as Ebola and Marburg. Feldmann hopes to continue his research on those and other pathogens, collaborating with partners in the United States, Germany, and elsewhere. But first he must assemble a six-person staff to set up the lab and win approval to open shop, a process he hopes will be completed before the end of the year. Then the lab must demonstrate its ability to work with other research labs in diagnosing infectious agents from around the world. “It will take some years to get the research program up and running,” says Feldmann, who adds that the government so far has given him “sufficient support … even though I don't yet have a budget.”

    His colleagues wish him well, but some say it won't be easy. “It's a wonderful facility, and an engineering marvel, but it could be tough for him to get people to come out there because of where it's located,” says Susan Fisher-Hoch, director of the level 4 lab under development at Fondation Marcel Mérieux in Lyon, France. “Then he has to draw up his protocols and get ready for business. It'll be a real challenge.”

    At the same time, Fisher-Hoch and others say that the Canadian center offers scientists a unique opportunity to work with such zoonotic diseases as Nipah virus, a newly identified pathogen that is transmitted from pigs to humans and recently caused a spate of deaths in Malaysia (Science, 16 April, p. 407). And Brian Mahy, director of viral and rickettsial diseases at the CDC, says the location is an advantage for U.S. scientists. “If the CDC wants to study Nipah in pigs, we have to work with [the Australian Animal Health Laboratory at] Geelong. It would be a lot more convenient to work with Winnipeg.”

    Tony Della-Porta, head of technical services for the Geelong lab, says there's more than enough business to go around. “There's a real need for all of us to work together against these newly emerging diseases,” he says. “We really don't know what will pop up next. And we have to be ready to fight it with everything we've got.”

  KOREA

    Spending Boost Aims to Reform Academia

    Michael Baker*
    *Michael Baker writes from Seoul.

    SEOUL—Undergraduates taking a biology course at Seoul National University (SNU), Korea's top university, may choose from among offerings by five departments. Such duplication adds to the teaching load of faculty members and leaves less time for research. The large number of departments offering courses also lowers research productivity by imposing additional barriers to joint projects that require a pooling of resources. This month the Korean government unveiled an ambitious plan that addresses those and other problems facing academic researchers as part of a $1.17 billion, 7-year program to reform higher education.

    The new effort, called Brain Korea 21 (BK21), is intended to raise the country's ranking among global scientific powers by shaking up an overly centralized and hidebound system. It aims to improve graduate training by giving money in selected areas to new research consortia that cross institutional boundaries. It also promises to ease the teaching load for those researchers by lowering the number of undergraduates in their departments. At the same time, the government hopes to strengthen the country's capacity to do research by funneling undergraduate students and more resources into promising provincial universities.

    Announced on 4 June, the project is the country's latest attempt to make more efficient use of its rising investment in science and technology (Science, 2 January 1998, p. 24). Government officials hope that BK21 will begin to remove the barriers between departments, end inbreeding and cronyism among faculty, and reduce the intellectual gap between a handful of elite universities and dozens of provincial institutions. “BK21 is aimed at changing the basic structure of universities in Korea,” says policy analyst Chung Sung Chul of the government's Science and Technology Policy Institute in Seoul.

    Brain Korea 21 will pour money into such burgeoning fields as medical and agricultural biotechnology, information technology, chemical engineering, and materials science along with the traditional disciplines of physics, chemistry, and biology. At the same time, undergraduate enrollments in departments that receive government funding as part of newly formed consortia will be cut by 30% to lighten the teaching load on faculty members who will be carrying out the research. Funding will also go to beef up provincial universities and to lessen the frantic competition for entry into SNU.

    Targeted research. The Brain Korea 21 initiative hopes to raise scientific productivity in several sectors.

    Scientists working in the targeted fields see the plan as a way out of an inefficient and stale departmental system. “There is no collaboration or communication,” says Lim Jeong Bin, a microbiology professor at SNU, who wants to see “real restructuring.” He adds that “If [the government] doesn't do anything, the universities won't change on their own.” The targeted money is intended to help Korea become a master in certain areas rather than remaining a jack of all trades, says Kim Sun Ho, assistant director of BK21 at the Ministry of Education, which is overseeing the program. “Universities have many subjects and faculties,” he says, “but they lack specialization.”

    Success will require a break from tradition for Korean faculty, however. To qualify for a biotechnology grant, for example, 30 of SNU's 50 biology professors must join with 15 professors from another university to create a new biotechnology graduate research program. The remaining 20 SNU professors will be left to teach undergraduate courses to a smaller student body. Although Lim says that departmental barriers to such collaborations are high and that organizing such a cross-departmental group is somewhat “unpractical,” he agrees that “the overall direction is right.” The new rules also mandate that half of the graduate students admitted to a new program come from other universities and that universities develop a more independent system for evaluating professors.

    Faculty members in fields not singled out for special attention—in particular, mathematics—are worried that students will shun their departments for other majors and that research funding will dry up. Their fears are stoked by the government's decision to increase from 270 to 2000 the number of students who are exempt from military service if they pursue Ph.D.s in strategic areas. But education ministry officials say that any discipline may compete for funding and that the money comes on top of existing funding.

    Critics also say the program is skewed toward applied science, and they fear that emphasis will starve basic research. Chung laments that only 20% of the project's funding is actually flowing directly into research, with the bulk divided among undergraduate and graduate training, equipment and materials, and scientific exchanges. “Education is getting too much, and research is receiving too little,” he says.

    But ministry officials argue that fostering strong local research universities is in the national interest, that a larger network of high-quality institutions will improve Korean science, and that limited resources require them to set priorities. They also hope that training more graduate students locally will save money and ease the brain drain caused by students who remain abroad. “The 21st century is going to be a knowledge-based society, and we want to move with the changes,” says Kim.


    Supreme Court Limits Scope of Appeals

    David Malakoff

    The U.S. Supreme Court has limited a special federal court's power to second-guess decisions by government patent examiners. Last week's 6-3 ruling disappointed many biomedical and computer companies, who say it will make it harder to appeal patent rejections. But legal experts say it will be years before the impact is clear.

    The case, Lehman v. Zurko, stems from the rejection of a patent application for cybersecurity software written by Mary Ellen Zurko, now with Iris Associates in Westford, Massachusetts, and her colleagues at the Digital Equipment Corp. (DEC). DEC—now Compaq Computer Co. of Houston, Texas—appealed the 1994 decision by the U.S. Patent and Trademark Office (PTO) to the Court of Appeals for the Federal Circuit, a panel that hears patent and other technical cases (Science, 27 November 1998, p. 1622).

    Last May the court ruled that the denial was “clearly in error.” The PTO challenged the judges' right to question its expertise, while industry groups and patent attorneys warned that if the PTO prevailed, the appeals system would be turned “on its head.” In a 10 June decision* written by Justice Stephen Breyer, however, the high court sided with the PTO. It found that the appeals court was not following a 1946 law that requires judges to defer to government expertise unless an agency acts in an “arbitrary and capricious” manner.

    Some attorneys say the ruling will make it harder for companies to win a review from the PTO if their patent application has been rejected. But others say the change will be hard to detect anytime soon, as fewer than 100 of the 100,000 annual denials end up in court. Says Ernest Gellhorn, a law professor at George Mason University in Fairfax, Virginia, who represented Zurko: “There is room for interpretation, and it will take time to build up the case law.”

  JAPAN

    Corporate Ties Still Off Limits for Academics

    Dennis Normile

    Efforts to form closer ties between industry and academe in Japan suffered a setback last week when government officials refused to bend a rule that prohibits university professors and other civil servants from holding positions with private companies. The Ministry of Education, Science, Sports, and Culture (Monbusho) had lobbied the National Personnel Agency for permission to allow an economics professor to serve as an outside director of Sony Corp. Observers see the agency's refusal as a setback to all academic researchers hoping to become involved in start-up companies or industrial collaborations derived from their work.

    The National Civil Service Law bars civil servants, which includes faculty members at national universities, from holding positions at private companies. Its primary intent is to keep bureaucrats at arm's length from the companies they regulate. This spring Iwao Nakatani, a prominent economics professor at Hitotsubashi University in Tokyo, was nominated to be an outside director of Sony, and Monbusho hoped to use the case to convince the Personnel Agency to exempt university professors from the law. It argues that such a change is needed to allow the private sector to tap the universities' scientific and management expertise.

    Indeed, the law has been interpreted in recent years to allow faculty members to serve as unpaid consultants to private companies. In lieu of a fee, a company typically makes a contribution to the professor's research funds. But despite pressure from Monbusho, the Personnel Agency refused to allow Nakatani to straddle the worlds of academe and industry by taking a spot on Sony's board.

    “It's a shock,” says Katsuya Tamai, a professor of intellectual property law at the University of Tokyo, who is involved in setting up an organization to help university professors license their patents and sell their expertise. “It's hard to interest investors [in a start-up business] if the person who understands the technology best can't be directly involved.”

    But Shinichi Yamamoto, director of the University of Tsukuba's Research Center for University Studies, warns that any changes to the law need to be considered carefully. “At the moment, the duties of university professors are not clearly defined,” he says. That ambiguity, he adds, makes it difficult to determine what level and what kind of outside activities would be consistent with their university responsibilities.

    Nakatani's plans had already prompted the government to form a committee drawn from Monbusho, the Personnel Agency, and other government bodies to study the issue and report back this fall. But Monbusho officials now say that they are not optimistic about finding a quick solution.

    Nakatani, who could not be reached for comment, is reportedly planning to resign his professorship to clear the way for election to the Sony board at a shareholders' meeting on 29 June.


    Tulane Inquiry Clears Lead Researcher

    Jocelyn Kaiser

    An investigation into whether fraud played a role in an influential report on the health effects of hormonelike chemicals has drawn to a murky close. In a letter on page 1932, the chancellor of Tulane University Medical Center in New Orleans announces that endocrinologist John McLachlan “did not commit, participate in, or have any knowledge of any scientific misconduct” in preparing the paper, which was published in Science 3 years ago and later retracted (25 July 1997, p. 462). The conclusions about the researcher who conducted much of the work are not so clear-cut, however: Tulane found that Steven Arnold's data fail to support “the major conclusions” of the paper.

    Questions about the paper, which claimed that mixtures of pesticides could have potent hormonal effects, have reverberated partly because of the prominence of its senior author, a former scientific director of the National Institutes of Health's National Institute of Environmental Health Sciences in North Carolina. “I'm just glad it's starting to clear up for John McLachlan,” says Earl Gray, a reproductive toxicologist at the Environmental Protection Agency's health effects lab in Research Triangle Park, North Carolina. Arnold, who resigned from Tulane in 1997, eventually found work at the Roswell Park Cancer Institute in Buffalo, New York. A co-worker said last week that Arnold had just finished his last day and was planning to begin business school in the fall. Arnold could not be reached; McLachlan, through an assistant, declined to comment.

    In the paper at issue, the Tulane team used yeast cells with a human estrogen receptor to test the potential estrogenic effects of different compounds. They found that pairs of several pesticides were 1000 times more potent at triggering estrogenic activity than were individual chemicals on their own. The prospect that pesticides could mimic the female sex hormone raised alarm bells among toxicologists and environmentalists and helped convince Congress to include provisions in two 1996 laws requiring manufacturers to screen thousands of chemicals on the market for estrogenic activity. Within a few months after the Science article appeared, however, other labs reported that they could not replicate its results. In July 1997 the authors retracted the paper, and that August Tulane announced an inquiry into what happened.

    Although it absolved McLachlan, a Tulane faculty committee “concluded that [Arnold] provided insufficient data to support the major conclusions of the Science paper” and that the “independent review of Arnold's data does not support the major conclusions,” writes chancellor John C. LaRosa. This ambiguous denouement—neither exoneration nor a misconduct finding—is not surprising, says Chris Pascal, acting director of the federal Office of Research Integrity. A decade ago, universities were “always trying to find one or the other,” he says, but now they “realize there's a lot that falls in a gray area.”

    Whether McLachlan's lab should have kept better tabs on its raw data is another question. “Most institutions don't enforce data retention or data recording,” Pascal says. But Gray says he wouldn't expect a lab chief to check the underlying data “if you have a lot of trust in somebody in your lab.” The study's sponsor, the W. Alton Jones Foundation in Charlottesville, Virginia, says it's ready to close the books on the affair.

    As for the hypothesis that hormonelike chemicals are dramatically more potent in combination, “it's kind of fallen by the wayside since that paper was retracted,” says Gray. But he and others are convinced it's important to test chemical mixtures, because in the real world mixtures abound and the results can be additive. Says John Sumpter, a reproductive toxicologist at Brunel University in the United Kingdom: “I don't think there's any disagreement on that.”


    A High-Stakes Gamble on Genome Sequencing

    1. Eliot Marshall

    Craig Venter's commercial venture to sequence the human genome, and those of several other complex organisms, has shaken up the international Human Genome Project; but how will it make money?

    ROCKVILLE, MARYLAND—In nondescript office buildings here in the northern suburbs of Washington, D.C., where government contractors and other “beltway bandits” ply their trade, a remarkable experiment is taking shape. Celera Genomics, a new company that moved into its quarters just 10 months ago, is preparing to become the world's biggest producer of genome data. This year, it aims to start cranking out original biology on a truly industrial scale and marketing it to the world.

    Before it can sell information, of course, Celera must obtain it. And to do so, it must finish installing a phalanx of robotic DNA sequencing machines that are still arriving from the factory, get them running smoothly, create a database, and devise software to mine the data. Somewhere along the line it must also figure out who its long-term clients are and how to make a profit. These are just a few of the challenges the company faces this year. Yet even as Celera tries to convert raw biology into a business, the president, J. Craig Venter, is promising that he will “give away” one of the first, and most important, fruits of this $300 million-plus venture—the DNA sequence of the human genome. The schedule calls for finishing the human genome—the Holy Grail of a separate, nonprofit international project, now in its 10th year—and those of at least three other complex organisms (see table) in just 18 months. How can Celera do this and make money?

    Venter acknowledges that he hears this question a lot. Business people ask where Celera's profits will come from, while dubious academics ask whether the business agenda is compatible with collegial sharing of data. Venter—never one to mince words—responds that the questioners just “don't get it.” Celera must succeed in two worlds, he said in a recent interview: “The scientific community thinks this is just a business project, and the business community thinks it's just a science project. The reality is, it's both.” The “business model only works if [we do] absolutely world-class science,” Venter explains, “and the science model only works if it's world-class business.” In his view, he is implementing a “radical change” in biology, an approach that enjoys “the best of both worlds”—private funding and academic freedom. As a result, he says, he will be working more openly than most companies or academic labs, for both the science and the finances will be open to scrutiny: “Everything will be out in the open.” This, Venter insists, “is the opposite of secret.”

    Many of Venter's peers in academic research don't buy it. They suspect that the science will prove more difficult than expected, or that Celera will have to scale back its promises to share data. And some are offended by Venter's style, especially his wisecracks about the public genome project—gibes he can't seem to resist—and his habit of predicting accomplishments in advance. As one genetics lab director says: “I tell my students they should never announce results before the experiment” is complete. It isn't the way science is done, he says.

    But Venter seems to relish breaking the rules and battling skeptics. Over the next 18 months, his boasts will be put to the test in a glare of publicity. He stands to succeed or fail spectacularly.

    A basic biology factory

    What sets Venter's experiment apart is its scale and promised speed. When Celera's operation is running under full steam later this year, it will be the largest DNA sequencing center in the world. The project is built around an army of 230 freshly minted robots that determine the precise order of nucleotide bases in DNA—the genetic instructions. Getting DNA sequence has been a tedious and, until now, extremely labor-intensive process.

    Celera has pinned its hopes on a new type of machine—the PRISM 3700, made by the PE [Perkin-Elmer] Biosystems Corp. of Norwalk, Connecticut—that greatly reduces the need for technical support. Instead of scanning DNA as it migrates through 96 lanes in a series of poured slab gels, it sends the DNA through 96 capillary tubes filled with polymer. In older machines, gels must be poured and reagents frequently reloaded, interrupting the sequencing. But the robot moves DNA and reagents through the tubes continuously, requiring attention only once a day. The system produces a steady flow of data—signals representing the DNA bases adenine, cytosine, guanine, and thymine. At Celera, these machines crowd the lab, yoked to a network of optical fiber, working in chilly silence.

    The 3700 machine is central to the project in another way: Its manufacturer, Perkin-Elmer, came up with the idea and is bankrolling the venture. Venter says that early in 1998, Michael Hunkapiller, the PE president who developed the 3700, approached him. PE was ready to provide enough 3700s and “possibly the funding to sequence the human genome.” Was Venter interested? Venter says he was intrigued. Venter thought he could use a shortcut method—a “whole-genome shotgun” strategy—that he had used in 1995 to complete the Haemophilus influenzae microbial genome with stunning speed. Venter asked for a promise that, “once we had sequenced the genome, we weren't going to keep it secret.” PE executives agreed, “but they turned it back to me,” according to Venter, saying that “if you want to use a couple of hundred million dollars and sequence the human genome and give it away, come up with a business model that allows you to do that.” Venter has been working on that conundrum ever since.


    After reaching an agreement, PE and Venter announced in May 1998 that they were creating a new company that would sequence the entire human genome by 2001. The target was several years earlier than the planned completion date set by the international Human Genome Project, funded mostly by the U.S. government and Britain's Wellcome Trust. It turned the sequencing world upside down (Science, 15 May 1998, p. 994).

    The news broke on the eve of the annual meeting of academic genome scientists at the Cold Spring Harbor Laboratory (CSH) in New York. Some thought the timing was deliberate. They suspected that Venter, who had been snubbed by the genome research elite in the past, wanted to show them up. A year later, resentment still lingered: Attendees at the 1999 CSH meeting, held last month, warned that Celera was planning to use public resources and grab most of the credit. One center chief grumbled that Celera would leave others “the scut work” of filling gaps.

    The sharpest on-the-record critique of Venter's plan came from biologist Maynard Olson, an intellectual leader of the genome project from its earliest days, now at the University of Washington, Seattle. Speaking at a House Science Committee hearing on 17 June 1998, Olson scorned Perkin-Elmer's “science by press release.” He objected to a slapdash “biotech style,” full of hustle and PR, taking over his field. He predicted there would be “over 100,000 serious gaps” in Venter's version of the human genome and expressed a concern that the publicly funded Human Genome Project would lower standards to keep pace. The latter is already happening: At last month's CSH meeting, leaders of the international project backed a plan to produce a rough “working draft” of the human genome by spring 2000—about a year before Celera's target delivery date—and to complete an accurate, finished version by 2003 (Science, 28 May, p. 1439).

    Venter, taking a broad swing at his critics, says they're worried about the old “academic funding order.” As he told one interviewer last year, “If I were on the other side of this, I would feel upset and threatened, too.” The U.S. National Human Genome Research Institute (NHGRI), which provides most of the U.S. funding for the Human Genome Project, may be defensive about having spent hundreds of millions of dollars to create genomic “sequence-ready maps” for a strategy that has now been abandoned, he says. Yet he points out that NHGRI cannot take credit for two key sequencing tools: bacterial artificial chromosomes or BACs, funded mainly by the U.S. Department of Energy, and capillary sequencers, developed by industry. He notes that “98% of the sequencing in the world is done on Perkin-Elmer machines,” which even “the NIH [National Institutes of Health] labs are gearing up to buy.”


    In public, NHGRI director Francis Collins takes all this with a smile: He says he welcomes the private investment as “complementary” to the government's work. (Offstage, Collins isn't so chipper. For example, he warns that Celera's statements about releasing data to the public are “disturbingly ambiguous.”) But some researchers have embraced Celera's arrival. Bioinformaticist Gabor Marth of Washington University in St. Louis says competition in science is “healthy” and that the field will benefit from Celera's quick pace. Richard Gibbs, director of the genome center at Baylor College of Medicine in Houston, says: “Things have been good since we got a shot in the arm from Craig's activities. The level of excitement is up. … People will get their data more quickly. So, who should complain?”

    World-class robotics

    The jousting between Venter and the public genome project continues, but less noisily now that Venter's venture is up and running. Venter boasts, “I have $338 million in the bank” and an amazing scientific lab under construction. Celera got most of this money from PE after PE sold its Analytical Instruments division in January. In addition, Venter reports that the company has secured $100 million in income for 5 years from several clients, who will get an early look at the sequence data. And on 27 April, when Perkin-Elmer was reorganized under a new umbrella called the PE Corp., its stockholders got new shares—one share of PE Corp. and one-half share of Celera—for each Perkin-Elmer share they had held. Celera's stock is traded publicly (at $17 a share, down from a peak of $29), but Venter says Celera isn't using stocks to raise money.

    Celera's first objective is to finish converting the office buildings into a data factory—and to produce data. Huge cooling ducts, big enough to walk through, were still being finished in May. The sequencing and PCR machines generate lots of heat, and the 3700s require a cold environment. New roof supports were installed to hold the chillers. To prevent even a moment's interruption of electricity, Venter has installed a big generator and hired a fuel truck to deliver diesel. Plans call for 230 of the model 3700 machines to be installed by the end of June; at this writing, about 213 are on site.

    A big challenge will be to get those machines to live up to their promise. The first 3700s were rushed to buyers in February with minimal testing or tinkering, and they have not yet achieved targeted efficiencies. At the CSH meeting last month, Washington University researcher Elaine Mardis reported that the performance of the 3700 was “a bit disappointing.” They're hard to install, and some units are balky. PE has promised eight runs per day (each run is assumed to use all 96 capillaries per machine, and each capillary produces a “read length” of 500 or more bases). But in May, the machines at Washington University were averaging only five runs a day, with read lengths of 500 bases. Venter says Celera's are delivering six runs a day, with read lengths of 500 to 750 bases. He says he expects the machines will hit nine or 10 runs a day “in a few months.” If so, Celera would generate more than 100 million base pairs of data a day. Already, in less than a year, its capacity has grown to 70 million base pairs a day—the world's largest.
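    The throughput figures above can be checked with simple arithmetic. The sketch below (a hypothetical helper, using only the numbers quoted in the text) multiplies machines, runs per day, capillaries per run, and read length into a daily base count:

```python
def daily_bases(machines, runs_per_day, capillaries=96, read_length=500):
    """Raw bases of sequence per day for a fleet of capillary sequencers."""
    return machines * runs_per_day * capillaries * read_length

# PE's promised rate: 8 runs/day on all 230 machines at 500-base reads
print(daily_bases(230, 8))    # 88,320,000 bases/day
# Venter's hoped-for 10 runs/day pushes past 100 million bases/day
print(daily_bases(230, 10))   # 110,400,000 bases/day
```

    At the higher rate, the roughly 3 billion bases of the human genome would be covered once in about a month of sequencing, which is why Celera's timetable hinges so heavily on the machines reaching their advertised duty cycle.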

    Just as bold as the plans for generating raw sequence data are those for computer-based research. The space that holds Celera's main brain—where cables from hundreds of machines converge—is worthy of a sci-fi movie. To reach it, you pass a guard, flash a proximity card, punch a code on a key pad, hold your hand in a biometric scanner, then enter a glass cage watched continuously by TV cameras. Here the data funnel into a digital maze being created by a partnership of Celera and the Compaq Corp. of Houston.

    Compaq is building Venter the world's most powerful civilian computer. Its retail value, according to Celera's computer chief, Marshall Peterson, is well over $80 million, although Celera is getting a big discount. The main system is a 64-bit machine powered by 1200 top-of-the-line Compaq Alpha processors, connected in parallel and capable of crunching data at almost 1.3 teraflops, or 1.3 trillion floating point operations per second. The only rivals, says Compaq executive Ty Rabe, are those used for classified work by the U.S. government, notably a monster called “ASCI Red” built by Intel to model nuclear explosions.

    Ultimately, Celera will employ its computer power to analyze the many organisms it plans to sequence. By aligning and comparing whole genomes, Celera hopes that archived genetic data on the mouse and fly will spotlight new human genes and reveal their functions. But to begin, the computers will be put to work piecing together DNA fragments into complete genomes. This is a lesser task, but hard enough that the consensus a few years ago was that this approach simply couldn't be used to complete the human genome. Some still think Celera may stumble at this hurdle.

    Celera is taking a very different tack from the Human Genome Project. NHGRI has funded a multicenter effort that began with a massive investment in genome maps, sets of easily identifiable landmarks on the genome. The idea was that the maps could be used to coordinate the detailed work of sequencing, to come later. That strategy was scrapped, following Celera's announcement, for a faster coordinating method: using a common set of BAC clones into which human DNA has been inserted. Each BAC will have a unique DNA fingerprint and will be anchored by other identifiers to a location on the genome. Five big nonprofit labs have divvied up the genomic landscape and are conferring weekly on who's working on which areas. The challenge now is to get the BACs processed and machines running.

    Celera, in contrast, is skipping the mapping and coordination for the whole-genome shotgun approach. (It will benefit, however, from the fingerprinted BACs, for they add structure to the data.) Celera is breaking the entire genome into random clones and sequencing each clone. The clones overlap, so each end sequence, like the shape of a jigsaw puzzle piece, should make a unique match with the end of another clone. With new pattern-recognition software, Celera plans to assemble the sequences of several hundred thousand clones into a complete genome.
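    As an illustration of the jigsaw-puzzle idea (a toy sketch under simplifying assumptions, not Celera's actual algorithm or software), overlapping fragments of a short string can be greedily merged back into the original wherever a suffix of one fragment matches a prefix of another:

```python
def merge(a, b, min_overlap=3):
    """Join b onto the end of a if a suffix of a matches a prefix of b."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return None

def assemble(fragments):
    """Greedy overlap assembly: repeatedly merge the best-overlapping pair."""
    frags = list(fragments)
    while len(frags) > 1:
        best = None
        for i, a in enumerate(frags):
            for j, b in enumerate(frags):
                if i != j:
                    m = merge(a, b)
                    # shortest merged string = largest overlap
                    if m and (best is None or len(m) < len(best[2])):
                        best = (i, j, m)
        if best is None:
            break  # no overlaps left; gaps stop a real assembler too
        i, j, m = best
        frags = [f for k, f in enumerate(frags) if k not in (i, j)] + [m]
    return frags

# Overlapping "clones" of the sequence ATGCGTACGTTAGC
reads = ["ATGCGTACG", "GTACGTTAG", "GTTAGC"]
print(assemble(reads))  # ['ATGCGTACGTTAGC']
```

    Real genomes are far less forgiving: repeated sequences make many overlaps ambiguous, which is why repeats worry critics of the whole-genome shotgun approach.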

    Celera wasn't the first to suggest doing the human genome this way. In fact, James Weber, director for medical genetics at the Marshfield Medical Research Foundation in Marshfield, Wisconsin, proposed this strategy to NHGRI several years ago. With help from Eugene Myers, then a professor of bioinformatics at the University of Arizona, Weber proposed that NHGRI fund a pilot project to shotgun the human genome. But reviewers said it would be “expensive and risky,” Weber recalls, and NHGRI declined. Weber and Myers eventually published their proposal in Genome Research in December 1997. The same issue carried a rebuttal by biocomputing expert Philip Green at the University of Washington, Seattle. He called the idea “extremely inefficient” and twice as costly as NHGRI's approach.

    Even as NHGRI was rejecting the Weber-Myers approach for the human genome, Venter was busy trying it out on a small scale. The Institute for Genomic Research (TIGR)—a nonprofit outfit Venter founded, now run by his wife, Claire Fraser—used it to sequence microbe genomes that were 2 million bases long. That gave Venter confidence that the 3 billion base pairs of the human genome were within his sights. And, to help with the task, last year he hired Myers away from Arizona. Says Nathan Goodman, a bioinformaticist at Compaq: “When I heard he had hired Gene Myers, my confidence that you could do it shot up.” It seemed no longer the “vague mutterings of a genius” but “a plan to be taken seriously.”

    Green still has doubts. He thinks even Celera's crack informatics team will have trouble figuring out where certain bits of human DNA belong. The human genome contains many “repeat” sequences that may lack unique identifiers. Celera could have a “very big problem” locating the repeats, he says.

    Myers, however, is confident that Celera's number-crunching power, combined with new software, will prove the critics wrong. “Truthfully,” says Myers, “we have code in place and running that will do human [genome assembly] in less than 3 months,” although he admits, “we're still working on getting those repeats resolved.”

    A crucial test of Celera's strategy will come in the next few months. As a kind of muscle-flexing exercise, the company is assembling the genome of the fruit fly, Drosophila melanogaster, in collaboration with an academic team led by Gerald Rubin of the University of California, Berkeley. It is supposed to meet some exacting deadlines. According to a memo issued in February, Celera should begin releasing raw data to the public—and simultaneously to Rubin's team at Berkeley—starting “about July 1, 1999.” The first data are now expected in late July. The job is to be finished by January 2000.

    Venter says Celera will simultaneously begin sequencing the human genome, the rice genome, and then—possibly as the human work is winding down—the mouse genome. He wants to get started on the mouse “as soon as possible”—perhaps in March—he says, because it will be “essential for interpreting human data.” Celera plans to align the genomes, one atop another, for detailed analysis. Boasts Venter: “We are going to discover the circuitry of biology here.”

    What's for sale?

    If much of this information is to be publicly available, how can Celera make a profit selling it? Venter answers this question in several ways. First, he points to the support of Celera's “early access” clients. He says the $5 million annual fees they pay show that, even though genomic data have been available for years, important customers will pay a premium for Celera's work. Second, a lot isn't being given away: Celera intends to patent several hundred human genes and a large set of human single nucleotide polymorphisms for use in individually tailored medicine. Venter hasn't set the terms for data release of three of the four initial genome projects, or for others on a list of possible targets in agriculture, such as the cow, corn, wheat, soya, and apples.

    Finally, Venter says he's not interested in exclusivity because he doesn't want to peddle intellectual property. He wants to create “a big information company,” not just for sequence data but for the analyses pouring out of his computers—something like Bloomberg News, which distributes financial data over a closed network. His future customers, Venter likes to say, are not just companies and universities, but “anybody with a genome.” He wants to reach the masses.

    Celera's first customers, however, are elite: They include the pharmaceutical companies Amgen Inc. of Thousand Oaks, California; Novartis Pharma of Basel, Switzerland; and Pharmacia & Upjohn (P&U) of Bridgewater, New Jersey. William Boyle, who heads Amgen's cell biology lab, says it's easy to explain his interest in Celera: “It's not so much what the information is,” says Boyle, but “the timing and the pace with which it will become available. … It's getting a first look” at the entire human genome. Over time, Boyle says, Amgen expects to collaborate with Celera in comparative genome projects to elucidate the hierarchy, organization, and function of genes.

    Les Hudson, head of global research at P&U, says his company is not interested in raw data but in building tools to analyze diseases. Like the other two early clients, P&U has its own, fenced-off computer server located at Celera, which it can access remotely. Hudson expects that the company's bioinformatics group, located at the Karolinska Institute in Stockholm, will use the system to search genomes for “druggable targets.” And Novartis's chief of research, Paul Herrling, says that Celera's “key aspect is speed.” Novartis, he adds, will use the collaboration to develop “new high-powered computer tools to analyze and annotate” genomic data.

    Venter sees his company as an elaboration of a model “validated” by Incyte Pharmaceuticals of Palo Alto, California. This is the concept that you can make a profit marketing nonexclusive access to genomic data, selling information services rather than information ownership. He says he plans to offer data at “a reasonable price” to everyone—including university scientists and citizens who want to learn about their health. But Celera hasn't disclosed the terms.

    Because Celera's plans for releasing human genome data remain cloudy, some researchers suspect Venter is having trouble finding a solution that satisfies the company's business plan. One genome center leader predicts Venter will have to curb his academic ambitions to protect the company's investment in data. Celera may grant access, this researcher predicts, but only to those who sign a contract promising not to share information or use it commercially. Indeed, although Celera originally talked about putting human data in GenBank, the public repository at NIH, NIH officials report that discussions are stalled. NIH's stance on the need to share research tools without such strings, issued last month (Science, 28 May, p. 1445), may make it harder to work out an agreement.

    The never-ending questions about public data release are irksome to Venter. It is “inappropriate for us to be discussing what might or might not be happening with human data vis-à-vis GenBank right now,” Venter says. “We're going to make the data available to the scientific community on our Web site, like we've always promised.” At the moment, he says, “our goal is to get Drosophila done. We're going to let our accomplishments speak for themselves.” He adds: “That's the beauty of genomics: Sooner or later you have to come up with the data. If you do, you win; if you don't, you lose.”

    In just 1 month, Celera is scheduled to begin releasing sequence data from the Drosophila genome; in 3 months, it plans to start putting human genome data on its Web site. Soon, everyone will be able to judge for themselves who won.


    Experiment Uses Nuclear Plants to Understand Neutrinos

    1. Dennis Normile

    Physicists hope a novel facility being built in a Japanese mine will shed light on the elusive neutrino—and Earth's radioactive heat source

    Neutrino research and nuclear reactors go back a long way. The first neutrinos ever detected, in a 1956 experiment by physicists Clyde Cowan and Frederick Reines that later earned a Nobel Prize, emanated from a nuclear plant. But since then the relationship has cooled. In recent years, physicists trying to understand these elusive particles have targeted the high-energy neutrinos coming from space or from accelerators at high-energy physics labs, because of the logistical problems of siting detectors at the right distance from enough reactors. Now the old flame is reviving in a Japan-United States collaboration that is building a massive underground snare for neutrinos emitted by Japan's nuclear power plants; the detector may hold the key to neutrino puzzles that are hard to unlock with other approaches.

    Called KamLAND (Kamioka Liquid scintillator Anti-Neutrino Detector) and located beneath the mountains of central Japan, the detector will catch antineutrinos—the antimatter counterparts of neutrinos—from the country's 51 nuclear power reactors, as well as neutrinos directly from the sun. By studying how the neutrinos behave on their way to the detector, the project members hope to add to recent evidence that neutrinos—assumed until recently to be massless—do have mass. And because nuclear reactors produce neutrinos in similar energy ranges to those produced in the sun, KamLAND may help physicists explain the so-called solar neutrino deficit: the shortfall—by up to one-half—in the observed versus expected number of neutrinos from the sun. As a bonus, KamLAND could also yield clues to the distribution of radioactive elements in Earth's crust and how their decay contributes to the heat generated within the planet.

    “KamLAND is a great experiment,” says John Bahcall, a neutrino expert at the Institute for Advanced Study in Princeton, New Jersey. He is particularly excited about the ability to investigate the solar neutrino anomaly under what amounts to laboratory conditions, that is, knowing the conditions under which the neutrinos were created: “I never expected to live to see a laboratory test of a solar neutrino explanation.”

    KamLAND is a collaboration of three Japanese and 10 U.S. institutions, led by the Research Center for Neutrino Science of Tohoku University in Sendai. It uses a mine cavern once occupied by Kamiokande, an earlier neutrino detector that has been succeeded by Super-Kamiokande, now running in a separate cavern in the same mine. Super-Kamiokande made worldwide headlines last year by offering evidence of mass for at least one of the three flavors, or types, of neutrinos. Both of these detectors consisted of huge tanks of water outfitted with photomultiplier tubes, which pick up the flash of light generated when an occasional high-energy neutrino interacts with a proton in the water.

    In contrast, KamLAND will use 1200 cubic meters of a liquid scintillator, a chemical soup that luminesces in response to neutrinos at lower energies. The liquid is confined in a 13-meter-diameter spherical balloon surrounded by layers of inert oil and water intended to cut background noise. With 1280 photomultiplier tubes to pick up the luminescence, KamLAND will cost an estimated $20 million, all coming from Japan's Ministry of Education, Science, Sports, and Culture (Monbusho). U.S. collaborators have asked the Department of Energy for $7.8 million to provide another 650 photomultiplier tubes, which would increase the sensitivity of the detector.

    After it starts taking data in 2 years, KamLAND could bolster the neutrino mass claims from Super-Kamiokande. Those claims were based on signs that muon neutrinos made by cosmic rays colliding with air molecules were “oscillating,” or changing into another type, on their way to the detector—something the laws of quantum mechanics forbid if both particles are massless. But Super-Kamiokande's case for oscillations had a weak point, because it relied in part on calculations of how efficiently cosmic rays should produce neutrinos in the atmosphere.

    A number of so-called long-baseline experiments are attempting to remove the uncertainty by sending streams of neutrinos generated in accelerators through a near detector to a far detector so the neutrinos can be counted at both ends of their trip. These experiments, however, are aimed at the muon neutrino and energy ranges associated with atmospheric neutrino oscillation. KamLAND will focus on electron antineutrinos and the solar neutrino anomaly.

    Atsuto Suzuki, a professor of physics at Tohoku University and head of the collaboration, says there's no need to place a detector at the source because the neutrino-producing reactions of commercial nuclear reactors are well understood. Instead, Suzuki and his colleagues will simply compare the number of electron antineutrinos detected at KamLAND with the number made by the reactors to determine whether some of them are oscillating into undetectable muon antineutrinos. “It's an amazing coincidence that Kamioka is just the right distance from these reactors” for the oscillations to show up if neutrinos do indeed have mass, says Stuart Freedman, a physicist at Lawrence Berkeley National Laboratory in California and one of the U.S. spokespersons for the collaboration.

    Evidence of oscillations may shed light on the solar neutrino deficit. The current favorite explanation for the deficit is that the missing solar neutrinos, on their way to Earth, are oscillating into flavors not seen by the detectors. But theorists have four different scenarios for how this might happen. Suzuki says that KamLAND will be able to investigate all four, using the reactor neutrinos for one and its observations of solar neutrinos to examine the others. KamLAND also will be sensitive to critical neutrino energies that have eluded previous detectors.

    In addition, KamLAND will be looking downward at Earth's own internal processes. The decay of radioactive isotopes of uranium and thorium is one of the major sources of Earth's internally generated heat, but nobody knows just how much heat this source produces or how the uranium and thorium are distributed within the crust and mantle. Fortunately, the low-energy antineutrinos generated by this decay fall within KamLAND's range of sensitivity, and their signature can be distinguished from reactor antineutrinos. By tracking neutrinos coming from the deep Earth to their origins, investigators hope to get a better fix on the nature and location of the planet's internal heat source.

    Suzuki expects KamLAND to yield most of its useful data within the first few years, although the experiment is capable of running for a decade or longer. If it succeeds, it will add another link to the chain that connects neutrinos with nuclear reactors.


    SNO Closes In on Solar Neutrino Riddle

    1. Mark Sincell*
    1. Mark Sincell is a free-lance science writer in Tucson, Arizona.

    The first neutrinos have been spotted colliding with heavy water molecules in a giant tank at the bottom of an Ontario nickel mine. Announced last week, the events mark the inauguration of the Sudbury Neutrino Observatory (SNO), a new facility that physicists hope will finally solve the solar neutrino problem, which has been haunting the field for decades.

    Generated by the sun's nuclear processes, nearly a billion solar neutrinos—ghostly subatomic particles that can easily pass through Earth without hitting anything—shower down on each square centimeter of the planet's surface every second. Although neutrinos come in three “flavors,” electron, muon, and tau, existing detectors can only see the electron variety, and they only see half as many coming from the sun as theorists had predicted. To resolve this discrepancy, physicists have proposed that half of the neutrinos switch flavors, or “oscillate,” on their way from the sun's center to Earth. Other neutrino experiments have been gathering indirect evidence for oscillations by comparing the number of neutrinos from a known source with the number observed in a detector, and a new project in Japan called KamLAND could firm up the case (see main text). But SNO should provide the most direct test yet of the theory.
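
    For reference, the flavor-switching probability the article alludes to is usually written, in the simplified two-flavor vacuum case (the full treatment involves three flavors and, for solar neutrinos, matter effects inside the sun), as:

```latex
P(\nu_e \to \nu_e) \;=\; 1 - \sin^2(2\theta)\,
  \sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\; L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right)
```

    Here θ is the mixing angle between flavors, Δm² is the difference of the squared neutrino masses, L is the distance traveled, and E is the neutrino energy. A deficit of roughly half the expected electron neutrinos corresponds to near-maximal mixing, with the oscillating factor averaging to one-half over the sun's broad range of L and E.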

    The key is its ability to see several varieties of solar neutrinos at once. SNO contains 1000 tons of ultrapure heavy water, water in which the hydrogen atoms have been replaced with deuterium, whose nucleus, the deuteron, contains both a proton and a neutron. When an electron neutrino collides with a heavy water molecule, it can split the deuteron into its neutron and proton and eject an electron. Other neutrino flavors split the nuclei but don't scatter electrons. By counting both neutrons and electrons, SNO should be able to measure both the total number of incoming neutrinos and the fraction of electron neutrinos, says physicist and SNO spokesperson David Wark of Oxford University in the United Kingdom. If SNO finds that the shortfall of electron neutrinos is made up in other flavors, it will provide strong support for oscillations.
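
    As a minimal arithmetic sketch of this logic, with hypothetical event counts (not SNO data) and detection efficiencies ignored:

```python
# Hypothetical illustration of SNO's flavor accounting, not real data:
# electron-ejecting (charged-current) events come from electron neutrinos
# only, while neutron-producing (neutral-current) events count all three
# flavors equally.
cc_events = 1000   # assumed charged-current count: nu_e only
nc_events = 2000   # assumed neutral-current count: all flavors

electron_fraction = cc_events / nc_events
print(f"electron-flavor fraction: {electron_fraction:.2f}")
# A fraction well below 1 means the "missing" electron neutrinos arrive
# in other flavors, the signature of oscillations.
```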

    “It is an extremely important experiment,” agrees physicist Paul Langacker of the University of Pennsylvania, Philadelphia. “They will very likely ascertain definitively whether neutrino oscillations are taking place.” Unfortunately, physicists will have to be patient: Neutrinos collide with matter so rarely that SNO will detect only some 20 neutrinos every day. As a result, says Wark, “it will be at least a year” before SNO has an answer.
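
    The year-long wait follows from simple counting statistics; a rough sketch, assuming pure Poisson counting and ignoring backgrounds:

```python
import math

rate_per_day = 20   # roughly 20 detected neutrinos per day, per the article
days = 365          # about one year of running

n_events = rate_per_day * days
relative_uncertainty = 1 / math.sqrt(n_events)   # Poisson counting statistics
print(f"{n_events} events, ~{100 * relative_uncertainty:.1f}% statistical precision")
```

    After a year the tally is known to roughly 1 percent, tight enough in principle to tell whether the electron-neutrino shortfall really shows up in the other flavors.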


    Survival Test for Geophysics Center

    1. Richard Stone

    In the mountains of Kyrgyzstan, a research station that monitors earthquakes and nuclear tests faces an uncertain future

    BISHKEK, KYRGYZSTAN—When the ground trembled beneath the Lop Nor nuclear test site in western China on 27 January, the shock waves lit up a string of sensors in Central Asia and jolted an international scientific network to life. Within seconds the recordings were uploaded to a Russian satellite and sent via the Internet halfway around the world to the United States, where analysts began decoding the seismic signatures. Did China resume nuclear testing after signing the Comprehensive Test Ban Treaty in 1996, 2 months after its last blast? Or was the power unleashed by a natural event: an earthquake or a meteorite strike, perhaps? The answer was important to security and diplomacy—even before allegations of espionage in U.S. weapons labs suggested that China has acquired knowledge to upgrade its nuclear arsenal.

    Quake, not nuke

    The three KNET stations nearest China's Lop Nor test site recorded these signals on 27 January and transmitted them to the United States in a matter of seconds.


    Thanks to 10 sensors in mountainous Kyrgyzstan, one of 15 countries created after the breakup of the Soviet Union, scientists were able to determine quickly that the event was an earthquake measuring 3.9 on the Richter scale. The seismic patterns recorded by the network, about 1200 kilometers west of Lop Nor, “provided essential information for detecting and discriminating the earthquake,” says Frank Vernon, a research seismologist at the University of California, San Diego, who oversees the Kyrgyz Broadband Seismic Network (KNET). The nonclassified data helped reassure treaty-monitoring agencies that they weren't seeing an encore to last year's Indian and Pakistani nuclear tests, which KNET tracked as well.

    But, in spite of such successes, the seismic sentinel faces an uncertain future. KNET is currently operating with stopgap funds from the U.S. State Department, which run out on 1 July. A proposal to maintain KNET is pending at the U.S. Civilian Research and Development Foundation (CRDF), a nonprofit in Arlington, Virginia. And the array is not the only important geophysics facility in Kyrgyzstan that's in jeopardy. When the U.S. government stepped in last year to prop up KNET, it also helped set up an International Geodynamics Research Center (IGRC), based at a Russian field station outside Bishkek, the Kyrgyz capital. Initial funding for the center is also drying up. Now, geophysicists are waiting to hear whether the U.S. National Science Foundation (NSF) or other agencies will ante up funds to help keep the center afloat.

    Geophysicists who study the rapid mountain building in the mighty Tien Shan range, which dominates Kyrgyzstan and spills into neighboring countries, have a big stake in the outcome of these two funding decisions. The seismic network churns out a wealth of data for research as well as treaty monitoring, and Western researchers say the center provides an invaluable base to study a region where the crust is deforming at an intriguingly fast rate because of stresses generated as India plows into Asia. “This is a pretty exciting part of the world,” says David Simpson, president of IRIS, a Washington, D.C.-based consortium of universities involved in seismological research. “They have magnitude 6 [earthquakes] like California has magnitude 3's,” says Steve Roecker, a geophysicist at Rensselaer Polytechnic Institute in Troy, New York. If the funding for either facility ends, says Vernon, “I am afraid that the earth science community will lose a valuable resource which, once lost, cannot be resurrected.”

    The fact that there are valuable research resources at all in Kyrgyzstan owes a lot to a Russian geophysicist named Yuri Trapeznikov. In 1978 Trapeznikov, of the Institute of High Temperatures (IVTAN) in Moscow, was tapped to open a field station in Bishkek to study rock layers in the Tien Shan using a device called a magnetohydrodynamic generator. Developed at a Soviet military institute, the machine shoots huge bolts of current into the ground that can travel tens of kilometers through the crust to receiving instruments. Changes in electrical resistance give clues to the forces compressing the rock layers. “IVTAN is a world leader” in these sorts of measurements, says Vernon.

    Trapeznikov, a hulking figure with a slight stoop and a booming baritone, ran IVTAN's Bishkek Proving Grounds until he died from a heart attack in April. Colleagues credit him with creating a bastion of solid scientific expertise. “He had a talent for raising research personnel,” like one raises a family, says Gennady Schelochkov, IVTAN's deputy director. Indeed, Trapeznikov told Science in an interview last winter, presiding over a research fiefdom in the hills above Bishkek suited him well. “As a child I dreamed of becoming the chief of a collective farm,” he said. “I think my dream came true. Here we had everything necessary—it was like a collective farm.”

    Trapeznikov managed to sustain a thriving program for a time after the Soviet Union dissolved. First he and Vladimir Zeigarnik, executive director of IVTAN Association in Moscow, pulled off a political coup: They helped persuade Kyrgyz officials to let Russia retain the field station. And Zeigarnik kept funds flowing so the Bishkek outfit could pull its weight in international collaborations. When the U.S. government sent Global Positioning System equipment to the Kyrgyz highlands in 1993, for example, a phalanx of IVTAN trucks and staff carried out the installation at 85 sites. But the crumbling Russian economy eroded the station's resources to the point that Western researchers over the last few years have started to pay the lion's share of expenses for joint projects.

    Unstable ground

    Yuri Trapeznikov's geophysics bastion, overlooking the Kyrgyz range, is hoping for an infusion of Western cash.


    The IVTAN field station's plight caught the attention of Askar Akayev, president of Kyrgyzstan. A physicist and former president of the Kyrgyz Academy of Sciences, Akayev in 1995 sent letters to U.S. Vice President Al Gore and Viktor Chernomyrdin, then Russia's prime minister, suggesting that an international center be established at the Bishkek Proving Grounds. Akayev foresaw that a center would “raise the profile of Kyrgyz science very much,” says Tynymbek Ormonbekov, science chair at Kyrgyzstan's Ministry of Education, Science, and Culture. But the idea languished until July 1997, when Akayev met Gore in Washington, D.C. This time more was at stake: Funding for KNET, deployed 6 years earlier, was drying up.

    The Soviet Union had invited the United States to set up KNET and a network in the Caucasus mountains after a devastating earthquake in Armenia in 1989. Political instability in the Caucasus doomed that network before it was deployed, but KNET got off the ground thanks to funds from NSF and the U.S. Department of Defense. The idea was that IRIS, which managed the project, would install the sensors—which can detect seismic waves ranging in frequency from 0.008 to 50 Hertz—and leave them to the Kyrgyz Institute of Seismology to maintain, as it was in the country's interest to monitor earthquake hazards, says Simpson. But with the Kyrgyz economy languishing, the seismic array by 1997 seemed destined to deteriorate.

    After his meeting with Akayev, Gore instructed the State Department to find a way to give both KNET and IGRC a 1-year lease on life, with the hope that the Kyrgyz government would rally continuing support. State officials farmed the job out to CRDF, which supports mostly former Soviet weapons scientists and has a system for transferring funds to Central Asia. By March 1998, an agreement was in place in which CRDF would transfer $150,000 to maintain KNET for 1 year (much of the expense goes to renting helicopters and other vehicles to service the remote sensors) and $100,000 to launch IGRC. The Russians kicked in $50,000 and the Kyrgyz $5000. Last year, KNET ran 99% of the time and lost less than 3% of recorded data. “There are no seismic networks in the United States that can approach these data returns,” says Vernon.

    Meanwhile, Western scientists who have set up collaborations with the IGRC are thrilled to have a base in this corner of the world. “This is the best place to study intracontinental deformation,” says Roecker. He's part of a team led by Brad Hager, a geophysicist at the Massachusetts Institute of Technology, that's in the middle of a 5-year NSF grant to explore mountain building in the Tien Shan. Already the team, which includes Kyrgyz collaborators, has made a surprising finding: The area encompassing the mountains is compressing 20 millimeters a year—that is, the land south of the mountains is squeezing up against the land to the north, thrusting the Tien Shan higher—about twice as fast as earlier estimates. The deformation rate suggests that the Tien Shan range is about 10 million years old, corroborating the idea that the Tibetan Plateau rapidly rose as much as 2.5 kilometers some 5 million to 10 million years ago—a phenomenon that would have perturbed air circulation patterns, perhaps strengthening the monsoon.
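
    The age inference is a back-of-envelope division. Assuming, for illustration, a total crustal shortening of about 200 kilometers (a figure not given in the article; only the 20-millimeter-per-year rate is), the arithmetic runs:

```python
# age = total shortening / shortening rate
rate_mm_per_yr = 20          # present-day compression rate, from the article
total_shortening_km = 200    # assumed total shortening, for illustration only

age_yr = total_shortening_km * 1e6 / rate_mm_per_yr   # 1 km = 1e6 mm
print(f"implied age: {age_yr / 1e6:.0f} million years")
```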

    The findings also suggest that millions of people are sitting on a powder keg. Central Asia has endured some devastating earthquakes, including one in 1966 that leveled Tashkent, the capital of Uzbekistan. In the 1800s two magnitude 8 temblors shook the Tien Shan, and three more struck between 1902 and 1911. The historical record and the high deformation rate, the researchers say, indicate that the Tien Shan is primed for an imminent major earthquake.

    U.S. scientists say their colleagues in Bishkek are critical to keeping an eye on the situation. “For a Western team to make similar-quality measurements, they would have to live in Kyrgyzstan, speak the languages fluently, understand the local culture, and be able to work within the regional political systems,” says Vernon. That's “a nearly impossible order in my experience.” Instead, westerners can draw on local researchers—for a fraction of the cost of sending additional staff to Central Asia. “They are a great source of cheap labor and expertise,” says Roecker.

    Indeed, Hager's group finds the IGRC so valuable that it plans to seek funds from NSF to support the center. Without this support, Zeigarnik, who will manage IVTAN's Bishkek branch and IGRC from Moscow, foresees “a remarkable narrowing of our research activities.” Meanwhile, a decision on the CRDF funding for the seismic network is expected in the next few weeks. For KNET, it's all or nothing. If the grant doesn't come through, says Vernon, “there are no contingency plans. So we have to hope for the best.”

    Even if money materializes, the scientists at IVTAN and IGRC know they face a daunting challenge in preserving Trapeznikov's legacy. Hager represented his Western colleagues at a memorial service for the dynamic former director in Bishkek last month. He says he encountered three strong emotions from the local researchers: sadness at the loss of their leader, determination to continue his dream, and fear. “I felt from them a sense of the jitters,” says Hager. “They seem scared of the awful big job ahead.” But they are resolved to see it through.


    Telescope Builders Think Big—Really Big

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    They have seen the future, and it is a telescope with a mirror the size of a football field and a structure rising to match the Great Pyramid

    BÄCKASKOG, SWEDEN—In the 389 years since Galileo Galilei first turned his 4-centimeter telescope toward the heavens, telescope sizes have grown steadily, culminating in today's 10-meter mammoths. Now some astronomers are talking about climbing off this steady slope and taking an unprecedented leap to telescopes 25, 50, and even 100 meters across. Says Matt Mountain, director of the Gemini Observatory on Mauna Kea, Hawaii, “We are talking about a very large step.”

    The idea of bypassing the incremental approach and jumping straight to a giant scope got its first airing last summer at a workshop on “maximum-aperture telescopes” in Madison, Wisconsin (Science, 4 September 1998, p. 1428), and was renewed here earlier this month. Meeting at the medieval Bäckaskog Castle in southern Sweden for a 2-day workshop organized by Lund University, some 70 astronomers, optical and structural engineers, and observatory directors turned over plenty of ideas about how to build a gargantuan telescope and found no obvious showstoppers. Says Ray Wilson, a retired telescope designer from the European Southern Observatory (ESO), “Let us rejoice if there's technical diversity. All solutions discussed at this conference are valid.”

    A 100-meter telescope is the stuff of astronomers' dreams. Although an array of more modest instruments that merges its light to form a so-called interferometer, such as ESO's Very Large Telescope at Cerro Paranal, Chile (Science, 1 May 1998, p. 671), provides the same ultrasharp vision, a giant mirror would gather much more light. That would let it detect and spectroscopically analyze the light of extremely dim sources. For instance, it would be able to dissect light reflected from nearby extrasolar planets to see if their atmospheres harbor compounds that are the signature of life. And it could inspect extremely distant galaxies, which offer a view of the very early universe, as if they were on Earth's doorstep.

    Technology should now make it possible to realize these visions at a less-than-astronomical cost, or so researchers hope. In the past, each doubling of telescopes' aperture has increased the cost sixfold, says ESO's Roberto Gilmozzi. “According to this rule of thumb, a 100-meter telescope would cost at least $20 or $30 billion,” he says. He and his ESO colleagues believe they could build one for less than $1 billion—less than twice the total cost of ESO's Very Large Telescope. Even so, “you don't go from paper studies to a 100-meter telescope,” says Jerry Nelson, director of the Keck Observatory at Mauna Kea. “We need 25-meter prototypes.”
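
    Gilmozzi's figure can be reproduced from the rule of thumb itself: if each doubling of aperture multiplies the cost by six, cost scales as diameter to the power log2(6), about 2.6. A sketch, taking roughly $100 million for a 10-meter telescope as an assumed reference point:

```python
import math

def cost_scale(d_from_m, d_to_m, factor_per_doubling=6.0):
    """Cost multiplier implied by 'factor_per_doubling' per doubling of aperture."""
    exponent = math.log2(factor_per_doubling)   # about 2.58
    return (d_to_m / d_from_m) ** exponent

scale = cost_scale(10, 100)   # from today's 10-m class to 100 m
print(f"cost multiplier: {scale:.0f}x")
# Applied to an assumed ~$100 million 10-m telescope, this lands in the
# tens of billions of dollars, consistent with the $20-30 billion estimate.
```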

    Different visions of the next step toward the 100-meter dream-scope were on view at the Bäckaskog workshop. For example, even though most of the proposals called for segmented mirrors, made of hundreds or even thousands of individual pieces of glass, some telescope designers argued that it would be possible to make a 50-meter mirror out of a single glass slab. Such a monolithic mirror might offer better image quality and would not require such sophisticated supports to keep it optically true.

    Casting, polishing, and aluminizing a single slab five times the size of a basketball court could be done at the telescope site, or the mirror could be transported from the factory to the site by airships, says Wilson. “Large monolithic mirrors are something we have to keep in mind,” agrees Mary Edwards of glassmaker Corning Inc. of Corning, New York, which produced the 8.3-meter monolithic mirror blank—the largest ever—for the Japanese Subaru telescope, recently completed at Mauna Kea.

    Lund astronomers Torben Andersen and Arne Ardeberg, while agreeing that a 50-meter monolithic mirror “is a very attractive possibility, with many advantages,” have a different vision for their Extremely Large Telescope. They propose a “mega-Keck solution”—a giant mirror built of segments about the same size as those in the 10-meter Keck Telescope. The 50-meter mirror would consist of 585 hexagonal segments of 63 different types. Together, the 2-meter segments would form a parabolic reflecting surface, focusing starlight to a 4-meter secondary mirror some 70 or 80 meters above the primary. The secondary would reflect the light back through a central hole in the main mirror, where it would be analyzed by cameras and spectrographs.
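
    The segment count is roughly what geometry demands. A quick consistency check, reading "2-meter segments" as hexagons 2 meters across the flats and ignoring inter-segment gaps and the central hole:

```python
import math

primary_area = math.pi * (50 / 2) ** 2       # 50-m circular aperture, ~1963 m^2
hex_area = (math.sqrt(3) / 2) * 2.0 ** 2     # regular hexagon, 2 m flat-to-flat
segments = primary_area / hex_area
print(f"~{segments:.0f} segments")           # within a few percent of the quoted 585
```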

    The size and hence the number of the individual segments is not fixed. “We even discussed a design with 104,000 15-centimeter segments,” says Andersen. However, getting so many mirrors properly aligned with computerized actuators would be a major problem, he says. But Nelson says the technology for such a task may soon be available: “Smaller segments will make [production] much simpler. I don't know where the limit is. One-centimeter segments could even be flat.”

    Another way to cut costs is to abandon the usual parabolic mirror, which reflects light to a precise focus, in favor of a spherical mirror, at the cost of some distortion, which would have to be corrected later. The advantage would be that all segments of the mirror would be identical, with exactly the same easy-to-polish curvature. “This is a low-risk option with more bang for the buck,” says Thomas Sebring of the National Optical Astronomy Observatories (NOAO) in Tucson, Arizona. Together with astronomers from the University of Texas and Pennsylvania State University, Sebring is proposing a 30-meter telescope based on the design of the Hobby-Eberly Telescope at McDonald Observatory in Fort Davis, Texas.

    Besides having a spherical mirror, Sebring's instrument, also called the Extremely Large Telescope, would be built on a rotating platform, aimed at a fixed altitude of 55 degrees above the horizon, making the structure easy and cheap to build. “We're talking about $250 million, which is probably a conservative estimate,” Sebring says. The design would limit the amount of sky the scope could survey, however, and to compensate for image distortions created by the spherical primary mirror, it would need at least three additional corrective mirrors, introducing additional light loss. This makes little sense to Nelson. “The purpose of a telescope is to collect light,” he says. “If you're throwing away light, you're throwing away money—tens of millions of dollars.”

    That drawback has not stopped a team at ESO from proposing an instrument with a full 100-meter spherical mirror, a behemoth known as the “Overwhelmingly Large,” or OWL, Telescope. With a primary mirror as large as a football field and a telescope structure nearly as high as the Great Pyramid, the OWL is an exercise in superlatives. OWL would have 10 times the light collecting area of all professional telescopes ever built before, says project manager Gilmozzi. And unlike less ambitious giant telescope concepts, OWL would be fully steerable.

    ESO recently established a special project office for OWL. Optical engineer Philippe Dierickx says a final choice for the optical design of OWL is expected at the end of this year, but the current design incorporates 2000 identical 2.3-meter mirror segments. A mirror factory would have to produce one segment per day to complete the job in 8 years. Mechanical engineers Enzo Brunetto and Franz Koch have completed detailed designs for the highly modular Eiffel Tower-like telescope structure. “It will consist of 4100 identical pipes and 850 nodes, fitted together like the elements of a construction kit,” says Gilmozzi.

    OWL would require a hangarlike enclosure, which would slide over the telescope when it is in a horizontal position, and four petallike, air-conditioned mirror covers to keep the reflecting surface cool during the day. The total moving mass is expected to be some 17,000 tons—more than 35 times the moving mass of the 5-meter Hale telescope at Palomar Mountain in California, for example. Gilmozzi hopes to complete a design study in 2002 and build the scope for $900 million. OWL is an ambitious project, but “we're using the technology we have,” says Gilmozzi. “The telescope could be fully commissioned some 20 years from now, just around the time of my retirement.”

    Because of the costs of the monster telescopes discussed at the Bäckaskog workshop, they will certainly require international cooperation. They will also require major advances in the field of adaptive optics, needed to compensate for the blurring effect of Earth's atmosphere. Current adaptive optics systems use small, deformable mirrors in the light path to compensate for atmospheric blur, but unblurring the image in a 50- or 100-meter telescope will be much harder, because the distortion could vary across the width of the mirror. “We need to reconstruct a three-dimensional view of atmospheric disturbances,” explains adaptive optics specialist Roberto Ragazzoni of Padua University in Italy. Despite these challenges, even cautious people like Nelson say there are no serious limits on the size of ground-based telescopes. “These things don't violate the laws of physics.”


    Lofty Observatory Gets Boost

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    The United States and Europe have breathed life into plans to build a giant new astronomical observatory in Chile that could be fully operational in 2009. Last week, science officials from both continents signed an agreement in Washington, D.C., laying out a 3-year plan for the design and development of the Atacama Large Millimeter Array (ALMA).

    Located 5000 meters above sea level on the Chajnantor plain in the Chilean Andes, ALMA (Spanish for “soul”) will be Earth's highest continuously operated observatory. It will consist of 64 12-meter dishes, observing the universe at millimeter and submillimeter wavelengths. This relatively unexplored part of the electromagnetic spectrum, between infrared and radio waves, opens a window into some of the coolest and dustiest objects in the universe, such as the clouds of dust and gas that form planetary systems, as well as into the farthest reaches of space and time. ALMA will have a collecting area of some 7000 square meters, larger than a football field and far surpassing any existing millimeter-wave telescope. And its high, dry location is largely free of atmospheric water vapor, which absorbs millimeter waves (Science, 19 March, p. 1836).
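
    The quoted collecting area checks out as plain circle arithmetic:

```python
import math

dishes = 64
dish_diameter_m = 12
area = dishes * math.pi * (dish_diameter_m / 2) ** 2
print(f"total collecting area: about {area:.0f} square meters")
# ~7238 m^2, i.e. "some 7000 square meters"
```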

    “It will take us back to the era where we see galaxies form,” says Bob Dickman, coordinator of the Radio Astronomy Unit at the U.S. National Science Foundation (NSF). “No matter how distant the first galaxies are, ALMA will detect them,” adds Ewine van Dishoeck of Leiden University in the Netherlands. By combining signals from multiple dishes—a technique called interferometry—the array will create images of these distant objects as sharp as a single imaginary dish spanning the 10-kilometer width of the array. Interferometry is a household word in radio astronomy, but it requires great finesse at the shorter millimeter and submillimeter wavelengths that ALMA will observe.
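
    The "single imaginary dish" claim follows from the diffraction limit: angular resolution is roughly wavelength divided by the longest baseline. A sketch at an assumed 1-millimeter observing wavelength:

```python
wavelength_m = 1e-3   # 1 mm, within ALMA's band (assumed for illustration)
baseline_m = 10e3     # the 10-km width of the array, from the article

resolution_rad = wavelength_m / baseline_m
resolution_arcsec = resolution_rad * 206265   # radians to arcseconds
print(f"diffraction-limited resolution: ~{resolution_arcsec:.3f} arcseconds")
```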

    Major partners in the agreement are the U.S. National Radio Astronomy Observatory and the European Southern Observatory, an intergovernmental organization with eight member states. Research institutes in France, Germany, the Netherlands, and the United Kingdom will also take part, while Japan is expected to join later. Europe will chip in $16 million and the United States $26 million for the first phase of design and development; in 2001 the partners will make a final decision about whether to proceed. The observatory's total cost is expected to exceed $400 million.


    Microbes Feature as Pathogens and Pals at Gathering

    1. Evelyn Strauss

    CHICAGO, ILLINOIS—Unlike Leeuwenhoek, who guarded the lens that best magnified the “wee animalcules” that he first saw, today's microbiologists enthusiastically broadcast both their methods and their findings. At the American Society for Microbiology meeting, held here from 30 May to 3 June, some 14,000 researchers assembled to discuss findings on topics ranging from a novel protection against Leishmania to communication by gut bacteria.

    A Bite in Time

    Avoiding insects might seem a logical way to avoid insect-borne diseases. But new results suggest that for at least one disease, leishmaniasis, the best protection might be the bite of the very same fly that transmits it. At the meeting, immunologist David Sacks of the National Institute of Allergy and Infectious Diseases (NIAID) reported that mice bitten by sand flies that do not carry Leishmania, the tiny protozoan that causes the infection, resist infection later, possibly because something in the flies' saliva revs up the animals' immune response.

    Leishmaniasis afflicts hundreds of thousands of people in tropical and subtropical areas every year, making it the second most common protozoan disease after malaria. Most commonly, the microbe causes a skin infection, but in some cases it infects the internal organs and can be fatal. No one has been able to develop an effective vaccine, so some researchers hope the surprising effect of the sand fly bite may open a chink in Leishmania's armor. “There's a great deal of potential [in this work] in terms of understanding transmission, which boils down to prevention,” says Duane Gubler, an epidemiologist at the Centers for Disease Control and Prevention in Fort Collins, Colorado.

    Researchers have known for some time that components in sand fly saliva can enhance the infectivity of the parasites they transmit. Among other things, they block blood clotting in the vicinity of the wound, thereby aiding feeding by the insect and also transmission of the parasite. In the current work, Sacks wanted to see if exposure to the saliva of uninfected sand flies might lead to some kind of immune response that would neutralize these enhancing effects.

    He and his research team at NIAID and the Walter Reed Army Institute of Research in Washington, D.C., filled small vials with sand flies, clamped the vials against the ears of six mice, and allowed the insects to bite. Infected flies produced lesions on most of the ears within 3 weeks. The lesions healed, but most of them still harbored parasites. In a parallel experiment, a second set of mice was exposed to uninfected flies twice, at 2-week intervals, before being bitten by infected flies. Only three of the 12 ears developed lesions, and after they healed, all were parasite-free.

    These results suggest that the prior bites not only decrease the incidence of disease, but also reduce the capacity of mice to serve as reservoirs of infection. “Allowing uninfected flies to bite these mice provides as much or more resistance [to Leishmania] as the best vaccines known,” says Sacks.

    But that didn't happen quite the way he expected. Rather than neutralizing the enhancing effects of the saliva, the repeated bites apparently protected the animals by inducing an immune response that would fight the parasite more directly. He found that production of interferon γ increased in the animals. This molecule, one of many cytokines that regulate immune responses, stimulates cell-mediated immunity, a response specialized for thwarting pathogens that reside within host cells, as Leishmania does.

    Because the majority of flies in areas where leishmaniasis is endemic are uninfected, the protective effect of prior bites might explain why the severity of Leishmania infections varies from one individual to another, Sacks says. It might also explain anecdotal reports that children and newcomers to parts of the world where the parasites and sand flies live tend to suffer more serious illness than adult natives. He notes that researchers have ascribed such resistance to immunity against the parasite itself gained during earlier infections. Instead, he says, “we're introducing the idea that the history of exposure to vector saliva has a profound effect.”

    The findings suggest that sand fly saliva might be a useful component of an antileishmaniasis vaccine. And they point to an irony, says Sacks: “One reason the Army has been keeping [the flies used in the experiment] is to figure out how to keep them off soldiers. Maybe what they should be doing instead is allowing their soldiers to come into contact with uninfected flies before they encounter the infected ones.”

    How Yeast Mitochondria May Meander

    Like good parents, normal cells equip their offspring properly before sending them off into the world. That means, among other things, ensuring they each receive the organelles called mitochondria. Thought to be the descendants of symbiotic bacteria, mitochondria provide cells with energy, and a dividing cell employs active mechanisms to ensure that each offspring gets its fair share. About 5 years ago, for example, researchers found that in dividing yeast, mitochondria seem to amble to the emerging bud along cables of the protein actin. Results described at the meeting by cell biologist Liza Pon of Columbia University may now indicate what propels them.

    Pon's work suggests that mitochondria move with the aid of a seven-protein complex called Arp2/3, which helps propel the movements of crawling cells by initiating the addition of actin subunits at the interface between the cell membrane and actin polymers. “If it's true, it's very interesting and novel because everyone has focused on the Arp complex in cell motility, not organellar movement,” says David Drubin, a cell biologist at the University of California, Berkeley. The findings also highlight the link between mitochondria and their proposed evolutionary predecessors, the Rickettsia bacteria, because these microbes also use Arp2/3 to propel themselves, building a tail of actin to push themselves from cell to cell.

    Pon originally set out to test the idea that mitochondria move with the aid of the motor protein myosin, which is known to run along actin filaments and is what powers muscle contraction. But she found that mitochondria could still zip along actin cables in yeast strains whose myosin genes were defective. “We hit the wall looking at myosin,” Pon said. That spurred her to look for other molecules that might yoke mitochondria to actin.

    She eventually identified six proteins that might fit the bill. Sequence analysis revealed that one of them was Arc15p, a member of the Arp2/3 complex. With fluorescently labeled antibodies that bind to Arc15p, she confirmed that the protein lies on the mitochondrial surface along with another subunit of the Arp2/3 complex, Arp2p.

    To determine whether these proteins are in fact needed for the mitochondria to move, Pon examined a yeast strain with a mutant arp2 gene whose protein is functional at 25°C but not at 39°C. Although actin organization appeared normal, the mitochondria ceased to move when she raised the temperature to 39°C. An arc15 mutant gave similar results. Finally, in a mutant strain containing an abnormally high number of actin cables, Pon found that a drug interfering with actin polymerization decreased the number of moving mitochondria and rendered those that did move more sluggish.

    Based on these results, Pon suggested that yeast mitochondria push themselves ahead by polymerizing actin behind them, in effect building an actin tail like those of their supposed bacterial ancestors. Unlike the bacteria, however, the mitochondria move along actin tracks. This mechanism may be special to budding yeast. Other cells use another type of cable, the microtubule, to move their mitochondria, but yeast microtubules are mainly used to pull the nucleus apart during cell division.

    Pon and others caution that more work will be required to confirm the finding. “It's an intriguing hypothesis, and it bears closer examination, but there are some published experimental results that aren't consistent with a direct role for actin in mitochondrial inheritance in yeast,” says Michael Yaffe, a cell biologist at the University of California, San Diego. For example, he says, certain mutant strains of yeast have no detectable actin cables but still seem to be able to divvy up their mitochondria properly.

    Assuming Pon's results hold up, they might say more about the facility with which subcellular particles can co-opt actin than about the evolutionary roots of mitochondrial movement, says Julie Theriot, a cell biologist at Stanford University School of Medicine. Pon's observations might simply suggest that “it's easy for an organelle—or a microbe—to pick up this kind of motility.” Indeed, several labs have shown that other organelles—endosomal vesicles that transport proteins within cells, for example—can apparently move in a similar way, says Daniel Portnoy, a microbiologist at the University of California, Berkeley. “This may be just the tip of the iceberg of actin-based mechanisms in organelle movement.”

    Communication in the Gut

    Disease-causing bacteria give microbes a bad name, but the fact is, humans have lots of microbial pals. Among them are the bacteria that dwell within our intestines and protect us from disease, help digest food, make vitamins, and even help shape the immune system. But although such indigenous bacteria are clearly important, scientists don't know much about how they operate. “The [human gut] epithelial lining is about an inch [2.5 cm] thick with bacteria, and we have no idea what they're doing,” says Stuart Levy of Tufts University School of Medicine in Boston, the president of ASM. New results reported by Jeffrey Gordon, a molecular biologist at Washington University School of Medicine in St. Louis, provide the first detailed insights into how a gut bacterium communicates with its mammalian host, presumably to the benefit of both.

    Gordon described how he and Lora Hooper, a postdoc in his lab, are dissecting the molecular signaling system by which Bacteroides thetaiotaomicron, a common bacterium in both the mouse and human intestine, induces the intestinal lining to make a carbohydrate that the bacterium uses for food. Their results suggest that the microbe can control the signal so that it communicates the need for the carbohydrate only when the sugar runs low.

    What the mouse gets in return is not yet clear. The bacterium's success undoubtedly makes it more difficult for pathogens to invade, and its alteration of the host environment probably helps build a healthy microbial community, suggests Gordon. “This is the first clear molecular definition of how a commensal bacterium plays a role in the development of a mammalian cell environment,” says Levy. It could ultimately point to ways for keeping the gut microflora healthy when it is under assault, for example, in patients taking broad-spectrum antibiotics.

    The current finding is an outgrowth of studies the Gordon team has been performing for several years on mice raised in a germ-free environment. Normally the intestine begins producing particular carbohydrates that contain the sugar fucose shortly after birth. But in the germ-free animals, the researchers found, production of this compound wanes as animals are weaned. By introducing various bacterial species individually into the animals, Gordon's group found that B. thetaiotaomicron was the missing factor, somehow signaling gut cells to make the fucosylated carbohydrate (Science, 6 September 1996, p. 1380).

    Since then, Hooper has identified some of the molecular switches that control this signaling. Using a variety of genetic and biochemical techniques, she showed that the bacteria make a protein called FucR, which regulates both how the bacteria metabolize fucose and how they signal the host cells to make more of it. When the bacteria have ample fucose, the sugar binds to FucR. This binding both relieves the inhibition the protein would otherwise exert on the genes needed for fucose breakdown and turns off the signal to the intestinal cells.

    Hooper found, for example, that a B. thetaiotaomicron strain carrying a mutation that eliminates FucR, and thus can no longer sense fucose, constantly induces production of the sugar. The same mechanism would explain why a different mutant strain has the opposite behavior: It never tells the host to make the sugar. The researchers found that because of an enzyme deficiency, this mutant can't break down fucose, so the sugar builds up. As a result, Gordon and Hooper hypothesize, the sugar is always bound to FucR, which keeps the signal permanently “off.”

    Gordon says the ability of bacteria such as B. thetaiotaomicron to modify the intestinal ecosystem probably plays an important role in assembling and shaping its microbial communities. “At birth, the intestine is an unoccupied wilderness,” he says. “How is this mass society of microbes assembled and maintained along the length of the gut? We're trying to lay the groundwork for understanding that by unraveling the schemes hatched by microbes for surviving and prospering in this competitive ecosystem.”
