News this Week

Science  02 May 1997:
Vol. 276, Issue 5313, p. 668
  1. U.S. Science Policy

    OSTP Gears Up for Change

    1. Andrew Lawler

    Jack Gibbons is planning to leave after more than 4 years as the president's science adviser. White House officials see an MIT professor and old Washington hand as someone who can revitalize the office.

    As an April Fools' Day joke, Jack Gibbons told some of his staff that he intended to resign as President Bill Clinton's science adviser and retire from government service. The announcement was a prank, but it appears that only the timing was wrong. Administration sources say that the 68-year-old Gibbons will step down as head of the Office of Science and Technology Policy (OSTP) by the end of the year. The leading candidate to replace him, they say, is John Deutch, a Massachusetts Institute of Technology (MIT) chemistry professor with extensive government experience as the Pentagon's second in command and head of the Central Intelligence Agency.

    If he gets the job, Deutch would have a mandate from the White House to revitalize OSTP, which has been pushed to the sidelines in debates over national science policy. Insiders say they hope that Deutch's arrival would add weight and political savvy to an office viewed as lacking clout by many in Congress, the Administration, and the scientific community. That may prove tough, however, given that OSTP has no direct budgetary authority and the president's science adviser must tread a fine line between being an advocate for the research community and a loyal team player.

    Gibbons told Science last week that he has informed Clinton and Vice President Al Gore that he wants to leave office well before the end of the Administration. But, he adds, “I'm still here indefinitely.” Indeed, he says he is in the process of choosing two new associate directors–as of this week, all four positions are filled by acting heads or nominees awaiting Senate confirmation. He quickly adds, however, that they will be “first-rate people who would wear well with any successor.”

    Earlier this week, the White House moved to quash rumors that Gibbons's departure is imminent and that Deutch is the heir apparent. It issued a statement that Gibbons “may desire to leave his post near the end of the year,” but added that “any contemplation of his successor is premature speculation.” However, sources have told Science that Deutch is the front-runner, although the transition is not expected until the end of the year because of Deutch's personal obligations. He did not return calls seeking comment.

    An affable administrator with long experience in Washington, Gibbons is widely credited with his close working relationship with Gore, his creation of the National Science and Technology Council (NSTC), and his help in warding off R&D cuts proposed by the last Republican Congress. But many science and technology advocates believe that a powerful political insider like Deutch would help give the office more political clout. Stricter White House management in Clinton's second term has left OSTP with less direct influence in setting policy, while the NSTC has never garnered the authority of other such White House councils. The science adviser's office itself lacks firepower: Two of Gibbons's original lieutenants–those who oversaw the environment and national security–left last spring, and their successors still have not been confirmed by the Senate. Replacements for the other two slots (science and technology) haven't even been nominated. The technology post has had an acting head since October.

    This power vacuum has not gone unnoticed. A recent outside study calls for a stronger and more focused office, while two executive agencies are making moves to expand their influence in R&D policy. Moreover, lawmakers and their staffs worry that the science office is too quiet. “We didn't hear from Gibbons on any of our authorization bills,” says Representative James Sensenbrenner (R-WI), who chairs the House Science Committee. “And we could have used some input.”

    Gibbons gently dismisses such criticism. “We're not in disarray,” he says, citing a host of reports and policy initiatives led by his office, as well as the NSTC and the President's Committee of Advisors on Science and Technology (PCAST), which he co-chairs. In addition to “holding the line” against significant cuts proposed in the past 2 years by Republicans, he says, the president's 1998 budget is his fifth straight request for greater R&D funding. “But we have suffered because of vacancies not filled,” Gibbons adds.

    Sharper focus

    Outside groups say that the problems go deeper than a few tardy appointments. The Carnegie Commission on Science, Technology, and Government–a panel of distinguished scientists, university presidents, and industry managers–recently offered up new advice to the Administration, based on a review of OSTP's record during Clinton's first term. The report, issued in March, calls for a stronger and more focused policy shop in the executive office. It recommends additional employees, a chief of staff to handle internal matters, and a written agreement between OSTP and the Office of Management and Budget (OMB) that spells out OSTP's role in determining the R&D budget. The report also recommends a major reorientation of the NSTC, which includes representatives from all agencies involved in R&D matters. Instead of focusing on programs, the report says, the council should devote itself to a “small number of priority policy issues of concern to the president.” Meanwhile, the Carnegie panel says key players such as OMB are turned off by the large number of panels–now about 60–involved in NSTC's work and have lost their enthusiasm for the process.

    Gibbons disputes many of these conclusions. He notes that the council has played a key role in developing important policy initiatives, such as one aimed at tackling emerging infectious diseases and announced by Gore last July. The council has since worked to coordinate federal research efforts on Lyme disease, hantavirus, and Ebola, among others. Gibbons also points out that the number of NSTC subcommittees is shrinking–the 10 environmental and natural resources subcommittees have dwindled to five, for example. “It took a couple of years to put a strategy in place,” he says. “Now we're streamlining operations.” He also rejects the idea of a formal agreement with OMB. “That would be a mistake,” Gibbons says. “Relationships have to be nurtured, and you can't do that with a formal agreement.”

    One area that Gibbons and the Carnegie panel see eye to eye on, however, is the need for a better structure to deal with the administrative burdens of the job. He says he is in “an advanced stage” of thinking about hiring a chief of staff. Administration sources say the post will soon be filled by Holly Gwin, a lawyer and former OSTP counsel.

    As OSTP struggles to find its proper niche, other science and technology agencies are stepping into the political vacuum. The mild-mannered National Science Board (NSB), for example, recently endorsed a plan to offer broader science policy advice to the government (Science, 14 March, p. 1555). “We're not trying to do OSTP's job, and we don't think we'll be in conflict,” insists Stanford University chemist Richard Zare, the board's chair. He says that the board simply wants to develop a new vision that goes beyond its direction and oversight of the National Science Foundation's programs. Gibbons says he has encouraged the NSB to think more broadly, but there are limits. “You can't have two PCASTs,” he warns, adding that “I'm concerned that they first do their job [to oversee] the foundation.”

    At the Commerce Department, the Technology Administration has become more involved in interagency efforts because OSTP lacks the staff to coordinate activities. “We're probably doing more work” as a result of the vacancies, says chemist Mary Good, who heads the Commerce Department office. For example, she says Commerce is “carrying the burden” in the effort to coordinate state and federal R&D efforts. At the same time, Administration sources point to tension between her operation and the one Gibbons runs because of a widespread view that she is lobbying for his job. “I'm too old for that kind of thing–running OSTP,” insists Good. And Gibbons declines to criticize his colleague, saying only that “Mary has a strong personality … and she has her own ideas about technology policy.”

    A tighter rein

    Major reshuffling in the White House after Clinton's reelection also is making life more difficult for OSTP managers. The more freewheeling structure of the first Administration has given way to a tighter organization and a new deputy chief of staff, John Podesta, who takes a keen interest in science and technology issues. So, while Gibbons still maintains close contact with Gore, he now must navigate another level of White House management.

    Before taking the job, Podesta served as president and general counsel of Podesta Associates, a Washington lobbying firm with many high-tech clients concerned with R&D issues. “I have some background in it,” he says. “I try to work closely and coordinate with Jack,” as well as with Gene Sperling, head of the National Economic Council. “We try to set goals and objectives,” he adds. For example, Podesta says his shop played the central role in coordinating the Administration's response to the recent news about the cloning of sheep (Science, 7 March, p. 1415).

    Gibbons, however, says that “I don't think I've gotten a single piece of paper from John Podesta. … It's more a communications link.” As for the cloning issue, Gibbons points out that he briefed the president on the subject, although “we let [Podesta's office] know what we were doing.”

    But some observers say a larger role for Podesta would be good news for R&D advocates. “There is a lack of attention to science and technology issues at the higher levels of the Administration, primarily because of the ineffectiveness of OSTP,” says lawyer Ken Kay, a long-time science and technology lobbyist who worked under Podesta and now heads a business consulting firm, Infotech Strategies. “Having him there is nothing but a plus.”

    Podesta won't comment on any personnel changes, but he says the second Clinton Administration views science and technology policy “less as an esoteric academic aside and more as a primary part of our agenda for keeping the economy strong.” The next few weeks will see “high-level presidential attention on his vision of the role of science and technology in people's lives and in the economy,” he predicts–including initiatives that will entail changes to the Administration's budget priorities.

    New blood

    If Deutch were to succeed Gibbons, Administration officials believe he would be more of a political infighter with better access to Clinton, which would result in a more powerful NSTC, a more active PCAST, and an OSTP that focuses more heavily on technology issues and how they interact with economic and regulatory policy. But it will be hard for anyone to overcome the handicaps built into the job.

    A lack of authority over agency R&D budgets, minimal staff, and the inherent tension of serving both a president and a typically apolitical academic community are all part of the position. Some scientists, for example, complain that Gibbons has been too partisan in his dealings with the Republican Congress, while others insist he has not played enough of the political game. That conundrum has plagued all science advisers. “It's one of the worst jobs, because you are caught between the White House and the research community,” says one Administration official.

    Gibbons's preference for consensus and his easygoing manner, however, have allowed turf fights to break out among his associate directors, hindering progress on education and technology initiatives and making it harder for the science adviser to be heard at the Cabinet table, say Administration sources. They add that this posture has opened the door for others, such as Kathleen McGinty, chair of the Council on Environmental Quality, to expand their influence through such means as winning personnel slots at the expense of OSTP. “He's intellectually bright and unbelievably nice, but sometimes he needs to be a bit more authoritarian and dictatorial,” says one former White House associate.

    Gibbons, a courtly Virginian, doesn't apologize for his management style. “If you have to be a bully, that's a sad commentary,” he says. In the long run, he says, being polite and seeking consensus pay off. But in the cutthroat world of the White House, that style may put OSTP at a disadvantage. “He's from a gentlemanly era,” says one colleague. “You need a tougher person.” The question is whether any successor to Gibbons–even someone with the savvy and access of Deutch–can forge a truly powerful office that puts R&D in the limelight.

  2. U.S. Science Policy

    Deutch Treads MIT-Washington Axis

    1. Andrew Lawler

    Presidential science advisers typically have stronger ties to the research community than to the upper echelons of Washington political life. But John Deutch, currently the leading candidate to replace Jack Gibbons as science adviser and head of the Office of Science and Technology Policy, has long divided his attention between both sectors.

    The 58-year-old Deutch, a chemistry professor at the Massachusetts Institute of Technology (MIT), certainly has strong roots in academia. He holds a Ph.D. from MIT, which he joined in 1970 as a faculty member. Later, he became chemistry department chair, dean of science, and ultimately provost.

    Deutch has hardly lived an ivy-covered existence, however. Taking leave from MIT in 1977, he joined the Energy Department during the Carter Administration and eventually became undersecretary. After President Clinton's election in 1992, he joined the Defense Department and wound up as deputy defense secretary before moving over to head up the Central Intelligence Agency. He returned to MIT in December when Clinton named a new national security team.

    Deutch is likely to be named soon to the President's Committee of Advisors on Science and Technology. Given his close ties to senior White House officials and key lawmakers, Deutch is expected to have an easy time winning confirmation if his name is submitted once again to the Senate.

  3. Reform in Japan

    War on Debt Puts Big Science Under Fire

    1. Dennis Normile

    TOKYO–Japanese researchers are keeping a wary eye on governmental reform efforts that could significantly affect scientific activities. Two advisory councils, charged with streamlining the bureaucracy and shrinking a ballooning national debt, are said to be taking a hard look at spending on big science projects, including some high-profile international endeavors, and may even question the need for one of the country's major scientific agencies.

    The efforts, which will gather steam over the coming months, stem from promises made during last fall's election campaign by the Liberal Democratic Party (LDP). Shortly after receiving a vote of confidence last October, Prime Minister Ryutaro Hashimoto created two panels to recommend sweeping changes: the Administrative Reform Council, charged with producing a plan by November to streamline the bureaucracy, and the Fiscal Structure Reform Council, which is supposed to report in June on ways to tackle the nation's snowballing debt. The panels–made up of academic, business, and civic leaders and chaired by the prime minister–include just one scientist: physicist Akito Arima, president of the Institute of Physical and Chemical Research (RIKEN). News about their private deliberations is now beginning to trickle out, raising anxiety about how science will fare in the reviews.

    Few dispute the need for fiscal reforms. As a percentage of gross domestic product, Japan's $2 trillion debt–incurred in an effort to revive a sluggish economy–is among the highest of all the industrialized nations. Cutting overall government spending, however, will make it difficult to follow a plan adopted last year to boost R&D spending significantly over the next 5 years (Science, 28 June 1996, p. 1868). Indeed, the fiscal reform panel recently told the head of the Science and Technology Agency (STA) that it would not exempt the 17-trillion-yen ($136 billion) spending plan from its deficit-cutting efforts (Science, 25 April, p. 519).

    Science has learned that the committee intends to take a hard look at big science projects. The list includes space activities, including the international space station; the International Thermonuclear Experimental Reactor (ITER); and the nation's extensive nuclear-power research program. “Some people [within the fiscal reform council] say that these projects should be suspended,” says Minoru Yonekura, an official in the STA's planning department. Although official support for ITER and the space station remains strong, he predicts that “compared to 5 years ago, it is going to be very difficult to launch new big science projects.”

    One compromise may be to stretch the 5-year plan over 7 years. Arima says he “may agree with the fiscal reform council” on the need for an extension as long as the goal of increased governmental support for science is maintained. Despite pressure from the committee, Yonekura says STA hopes to keep the current schedule for growth on track: “We will make efforts to maintain the current [5-year] plan.”

    Keeping the plan intact isn't the only thing Yonekura and his colleagues may have to worry about, however. Their entire agency could fall victim to the far-reaching reforms that LDP officials have promised voters. One popular idea is halving the current number of Cabinet-level ministries and agencies, and the Administrative Reform Council has been asked to produce a plan to achieve that goal. Later this month, that council will begin to quiz the 21 ministries and agencies on their functions and relations with other agencies.

    A prime candidate for pruning is the $5.8 billion, 41-year-old STA, in large part because of its relative youth, small budget, and widely dispersed constituency. Two ministries that may be asked to absorb it are the Ministry of Education, Science, Sports and Culture (Monbusho) and the Ministry of International Trade and Industry.

    Not surprisingly, STA officials oppose any change in status. Nobuaki Kawakami, who monitors administrative reform for the agency, says any merger would likely result in “reducing the [government's] promotion of science and technology.” Arima believes that it is too early to forecast STA's fate.

    In addition to the tasks set for it by the politicians, the administrative reform committee has received pleas for help from various groups. For example, 13 professors at the University of Tokyo Medical School submitted a petition in March to the Administrative Reform Council, to another government reform committee, and to Monbusho, seeking greater autonomy from government-wide rules for the medical school and hospital, particularly on budgetary and personnel affairs.

    But the strategy appears to have backfired. The letter was seen by some government officials as a plea to privatize the hospital, and the group antagonized the medical school's faculty council by not consulting it first. Hiroyuki Yoshikawa, who retired last month as president of the university, says that a majority of faculty members at the University of Tokyo and at other national universities share the group's concerns, but that the letter has sown confusion as to what university faculty members think about reform. One professor who signed the letter declined to discuss the matter, saying it had “become a big problem.”

    Even so, the resulting furor may not amount to anything. Arima says the Administrative Reform Council is likely to concentrate on Cabinet-level ministries and agencies and leave university reform for another time. However, that still gives researchers plenty to worry about.

  4. High-Energy Physics

    Peña to Review LHC Agreement

    1. Andrew Lawler

    The U.S. government plans to review its tentative agreement with Europe to help build the Large Hadron Collider (LHC), to make sure it is a good deal for this country. The review, announced last week by Energy Secretary Federico Peña, comes at the urging of Representative James Sensenbrenner (R-WI), who chairs the House Science Committee. Agency officials say they are confident that most of the lawmaker's concerns can be met with only minor changes to the proposed partnership, while European managers insist that the current agreement already addresses most of Sensenbrenner's worries.

    Department of Energy (DOE) officials hope to provide $450 million worth of hardware for the accelerator and its two main detectors, with the National Science Foundation chipping in an additional $80 million for the detectors. The LHC, with a total budget of $5 billion, is slated for completion in 2005 at CERN in Geneva; DOE and CERN managers signed a draft agreement in February spelling out U.S. and European responsibilities. But some House members, including Sensenbrenner, think the United States is getting a raw deal, and the House Science Committee has denied specific funding for the project in the 1998 DOE authorization bill. The bill should reach the House floor next week.

    Skeptics in Congress harbor deep resentment toward Europe's perceived indifference to the fate of the Superconducting Super Collider (SSC), which was voted down by Congress in 1993. Last week, at a colloquium sponsored by the American Association for the Advancement of Science (AAAS, which publishes Science), Sensenbrenner accused CERN officials of having “raised an upright finger” at U.S. requests for financial backing for the SSC. His main concern now is that the United States lacks an adequate management role in the LHC and that cost overruns could lead to requests for additional U.S. funds. He also wants a contractual agreement to ensure U.S. access to the complex and a pledge of financial support from European science managers for any future accelerator in the United States, in return for U.S. help with the LHC. Finally, House staffers say the committee chair would like CERN to revise procurement practices that he believes discourage the purchase of U.S. goods.

    “The answers to all his questions are in the agreement we negotiated,” says Christopher Llewellyn Smith, CERN's director-general. “It is a mutually very beneficial deal, and I think that with explanation and time, Congress will understand it is a good deal.” As a CERN observer, Llewellyn Smith says, the United States will have a forum to express its concerns. CERN already has a written open-access policy for non-European researchers, he notes, and the LHC agreement includes further assurances. As for unexpected costs, he says, “we think the chance of overrun is very small” given that the device is being built in an existing tunnel.

    Sensenbrenner met with Peña on 23 April; 2 days later, at the AAAS colloquium, Peña said that some of the chair's concerns are legitimate. Saying that he is willing to “go back and make some changes,” Peña noted that some revisions may not require action by the full CERN council. Other Administration officials agree with Llewellyn Smith's overall assessment about access and costs, and they are loath to demand a larger U.S. management role. “We don't want to be a CERN member,” says one official. “That would be too expensive.”

  5. Science and Commerce

    Disclosing Data Can Get You in Trouble

    1. Eliot Marshall

    Even if you hold no stock in a company sponsoring your research, you can get yourself into serious legal trouble if you speak too freely–even to your spouse–about results that could affect the company's stock price. Milton Mutchnick is learning that the hard way. On 10 April, the U.S. Securities and Exchange Commission (SEC) filed insider-trading charges against Mutchnick, a gastroenterologist and well-known expert in liver diseases at Wayne State University School of Medicine in Detroit, along with his former assistant, Rangarao Panguluri. The SEC has accused them of illegally disclosing early, negative results in a clinical trial of a hepatitis-B drug (thymosin alpha 1), allowing friends and relatives to beat the market by selling off stocks.

    The SEC's associate director of enforcement, Thomas Newkirk, says this is the first insider-trading case the SEC has brought against clinical researchers. He claims that the “tippees” who received the early information from Mutchnick and Panguluri in 1994 avoided financial losses by quickly selling stock in two companies–Alpha 1 Biomedicals of Bethesda, Maryland, which sponsored the trial Mutchnick supervised, and SciClone Pharmaceuticals Inc., of San Mateo, California, which had bought the rights to market thymosin alpha 1 overseas. Three days after the clinical results were leaked in April 1994, Alpha 1 Biomedicals issued a press release saying that the clinical data would not support an application to market the drug in the United States, and it abandoned the project. SciClone's stock also took a hit, but the company didn't give up. Under drug-export rules that Congress simplified in 1996, it has been selling thymosin alpha 1 in China for hepatitis B and other infections. And the company announced this year that it plans to market the drug in at least 25 foreign countries.

    Panguluri, now a physician in private practice in Anaheim, California, is contesting the SEC charges. Neither he nor his attorney responded to phone messages left at their offices. Mutchnick agreed to a judgment last month that requires him to pay the government $163,494.75–a fine equivalent to the stock losses the SEC claims his friends and relatives avoided because they got access to inside information. The judgment, in which Mutchnick neither admits nor denies the SEC's charges, also bars him from speaking out against the agreement.

    Mutchnick thought he and his colleagues had insulated themselves from potential conflicts of interest when they ran the clinical trial of thymosin–the first major test of the drug. “None of the investigators [in the thymosin trial] held stock” in companies backing the drug, Mutchnick says, adding, “I thought that gave immunity” to conflict-of-interest and insider-trading charges. But, in this case, the researchers are charged with violating a section of the 1934 Securities Exchange Act by “disclosing or misappropriating … material, nonpublic information concerning the Phase III trial of thymosin.”

    Mutchnick, noting that the experience has been “very hurtful,” claims he got into trouble in part through “my own naiveté.” As the principal investigator, he says, “it never occurred to me that I couldn't talk about my impressions” of how the trial was going. Mutchnick was both principal investigator and, during years of toxicity testing, holder of the investigational new drug (IND) permit, which gave him a sense of ownership of the data, he says. In the final year of the phase III trial, which tested for efficacy, Mutchnick turned the IND over to Alpha 1 Biomedicals, he says, because he “couldn't handle all the paperwork.” But he continued to discuss the trial.

    He also met with stock analysts and investors, at Alpha 1's behest and expense, to talk about the progress of the trial. (Indeed, after the stock plummeted in 1994, a group of investors sued Alpha 1, claiming Mutchnick and the company had exaggerated the drug's value; the plaintiffs eventually settled out of court.) These meetings, which the SEC did not challenge, gave Mutchnick the idea that it was all right to share information about the clinical trial with other interested third parties, he says. Besides, he argues, a scientist must speak openly about his research: “I've never said ‘No comment’ in my life. … I couldn't say ‘No comment’ for 5 months,” while waiting for the company to decide what the data signified and make a public announcement.

    According to the SEC complaint, Mutchnick and Panguluri unblinded the data on patients receiving a placebo or thymosin at about 3 p.m. on 25 April 1994. They found “an equal response rate across both groups with respect to the disappearance of viral DNA,” the SEC says, a strong indication “that the study had failed to demonstrate thymosin to be effective in the treatment of hepatitis B.” The SEC alleges that Mutchnick visited his sister and brother-in-law at 7 p.m. and told them the bad news. Before the stock market opened the next morning, the SEC charges, his brother-in-law had placed an order to sell stock in Alpha 1 Biomedicals and SciClone Pharmaceuticals.

    The SEC even takes Mutchnick to task for sharing the bad news with his wife, Renee, claiming that Mutchnick “knew, or should have known, or acted with reckless disregard of the fact, that Renee Mutchnick was likely to disclose the information to others. …” The Mutchnicks, according to the judgment, spread the news to three friends and to Renee's father and sister, all of whom sold stock the next day or the day after. The SEC also alleges that Panguluri tipped off two doctors in the Anaheim medical practice he was negotiating to join, and that they and their friends quickly dumped stock. On 28 April 1994, Alpha 1–pressured by a massive sell-off of stock–issued a press release disclosing the negative results of the clinical trial.

    Today, Mutchnick says his initial impression of the trial results was hasty and erroneous, and he believes that thymosin alpha 1 was “defamed” by the “premature” press release of 28 April. He claims that on reanalysis, his own clinical data show that the drug is useful in treating hepatitis B. Some of the data were presented at a meeting in 1995, but have not been published as yet. Another study of 33 patients based in Bologna, Italy, published in 1996, found that thymosin alpha 1 was better than interferon α in controlling hepatitis-B infection. And several other foreign studies (not yet published) indicate–according to SciClone–that thymosin is effective for controlling hepatitis B and C.

    SciClone's chief financial officer, Mark Culhane, says the company did a meta-analysis of data from these studies to support its application to foreign health authorities to sell thymosin, under the name Zadaxin, in China and the Philippines. It is already planning to market thymosin in Singapore and Taiwan. For Culhane, the logic of the marketplace may be more germane than lingering quibbles about the clinical data: He notes that thymosin “is being sold” right now, “which is the ultimate confirmation” of its value. SciClone, unlike Mutchnick, may yet profit from the drug.

  6. AIDS Research

    Montagnier to Head New York Center

    1. Michael Balter

    PARIS—Luc Montagnier, whose group here at the Pasteur Institute first isolated HIV in 1983, surprised the world of AIDS research last week by revealing that he intends to team up with an American entrepreneur to create a new research institute in cell and molecular biology–focusing principally on AIDS. The new institute will be at Queens College in Flushing, part of the City University of New York. Montagnier, who has just completed a 6-year term as head of the Pasteur's AIDS and retrovirus research department, will maintain his own laboratory at the Pasteur Institute and will continue to work with two organizations he co-founded: the World Foundation for AIDS Research and Prevention, and the Luc Montagnier Center, an AIDS research institute in Paris.

    Some AIDS specialists wonder how much Montagnier will be able to achieve in the United States with so many other demands on his time. “It's not clear what he can do there that is not already being done,” says a colleague at Pasteur who asked not to be identified. The 64-year-old Montagnier–who is nearing retirement at Pasteur–is “already running three other operations at once. He may not be able to keep up,” the researcher says. Montagnier told Science he plans to commit “a large part” of his time to Queens College. (According to sources at Pasteur, Montagnier will be replaced as head of the AIDS and retrovirus department by hepatitis-B expert Pierre Tiollais.)

    Montagnier was lured to Queens by college alumnus Bernard Salick, former chief executive of Salick Health Care, a chain of 24-hour cancer-care and kidney-dialysis outpatient clinics. Salick was ousted as CEO of the chain earlier this month, after a takeover by the British pharmaceuticals giant Zeneca, but he is donating $4.5 million of his own money to start the center. In addition, Salick and Queens officials will attempt to raise some $15 million from the state of New York and matching funds from private sources. “The outlook is very good,” says Queens spokesperson Ron Cannava, “because Salick has tremendous contacts with the pharmaceutical industry.”

    According to Montagnier, the center will hire five prominent researchers, who will be appointed senior-level professors at Queens College. Montagnier says that the center's AIDS research will be focused primarily on finding therapies “that would relieve patients from having to be treated every day for the rest of their lives,” as well as development of an AIDS vaccine. Although some researchers from his Pasteur lab or the international network of researchers supported by his foundation may move to the new institute, Montagnier says that “it will be mostly an American center, run by Americans.” Montagnier adds that he has already begun sounding out some U.S. scientists about coming to Queens, although he declines to give names at this point.

    The new institute will take about 2 years to construct. But Montagnier says he is eager to begin work as soon as he can negotiate temporary lab space at Queens. Asked what attracted him to set up shop across the Atlantic, Montagnier says that in the United States, research discoveries can be exploited much more rapidly than in France: “There is a greater potential for having findings applied by industry and biotechnology companies, and more opportunity to interact with those groups.”

  7. Working in China

    Geoscientists Seek Common Ground on Collaborations

    1. Li Hui,
    2. Xiong Lei

    Li Hui and Xiong Lei are reporters with China Features.

    NANJING—Chinese stratigraphers were elated when an international panel last August overwhelmingly endorsed a site in eastern China's Zhejiang Province as a reference point for the middle Ordovician period, roughly 460 million to 470 million years ago. The designation represented an international seal of approval for the nation's scientific prowess and its ability to be the steward for a site that would draw researchers from around the world. By November, however, excitement about the 15-to-1 vote had turned to anger when the International Commission on Stratigraphy (which gives the final blessing to such sites) was decidedly less enthusiastic about awarding China this paleogeological plum. It approved the Huangnitang site by a vote of 11 to 5, with two abstentions. “We were shocked by such a low favor rate,” admits stratigrapher Jin Yugan of the Nanjing Institute of Geology and Paleontology (NIG&P), which has worked extensively at the site.

    [Figure: History by the dozen. Chinese stratigraphers have proposed 12 sites as models for the study of important geological boundaries. Source: NIG&P]

    The second vote followed a campaign by a U.S. paleontologist for a moratorium on all proposed Chinese sites until the government guarantees international access to those sites (see sidebar). The effort, by Spencer Lucas of the New Mexico Museum of Natural History, arose from a dispute over the ground rules for an expedition last summer to a remote site in northwestern China. That expedition focused attention on potential problems arising from joint research projects in China (Science, 1 November 1996, p. 715).

    The increasing number of such cooperative ventures creates a pressing problem for both sides. Since 1979, for example, there has been a sevenfold increase in collaborative projects organized by the Chinese Academy of Sciences (CAS), to about 4000 in 1995, and foreign scientists make about 7000 visits there each year. Although these joint efforts take place in many fields, the country's geological richness and its strong national program make international collaborations especially popular in the geosciences. The stringent logistical requirements of many such expeditions mean that understanding the rules is essential for a successful collaboration.

    The stakes are high on both sides. For the Chinese, collaborating with foreign counterparts means access to Western technology, the sharing of research costs, and global recognition of their efforts. They put a premium on such interactions: Some 10% of the 300 researchers at NIG&P, for example, hold ranking positions in 25 international scientific organizations.

    For foreign collaborators, the wealth of potentially valuable sites throughout the country, many of them thought to be unique to China, is a major attraction. Especially important in that respect is the Qinghai-Tibet Plateau in the country's wild west—the site of Lucas's botched expedition and a potentially rich source of nonmarine fossils across the 250-million-year-old Permian-Triassic boundary. “We have done extensive work in geological basic research,” says Zhao Xun, deputy director of the Chinese Academy of Geological Sciences (CAGS). “That is why foreign geoscientists are interested in working with us.”

    But what are the ground rules for good cooperation? It's important to follow local practices and customs, obey the rules, acknowledge the professional contributions of colleagues, and pay a fair price for services provided. Even with the best of intentions, however, Zhao says that, in practice, both sides have sometimes found collaboration to be a bittersweet experience.

    A matter of respect. For many Chinese scientists, the most important element is mutual respect. Academician Sheng Jinzhang of NIG&P recalls one unsettling episode in the early 1980s, when a group of Japanese stratigraphers wanted to work jointly on a stratotype section that his team had explored for many years. The Japanese scientists planned to start from scratch, he said, while he “suggested strongly that the joint research should be based on what the Chinese scientists had done, and that improvements be made where necessary.” His point was clear. “In the end,” he said, “they acknowledged our study of fossils, while I admired their work on sedimentary rocks.”

    Some Chinese researchers also complain that their overseas colleagues sometimes fail to respect their advice on China's rules for collecting samples. CAGS geophysicist Guo Jingru recalls an unhappy episode a few years ago with some German geologists. “We were collecting data for our high-temperature and high-pressure research in Tibet when I found that my German colleagues had collected some rock samples,” he says. “When I told them that rock collecting was illegal because it was not in the bilateral agreement, they refused to budge, saying that as long as it is beneficial to science, they were justified to do anything. I was really mad.” Only when they realized they would have trouble getting these samples out of the country, he says, did they change their minds.

    China is not alone, of course, in protecting its geological and fossil resources. Liu Jiaqi, director of the CAS's Institute of Geology, recalls that while visiting Yellowstone National Park a few years ago, he was tempted to hammer off a rock sample. “But my American host said to me: ‘Do you want to go behind bars?' I understood him and dropped the idea.”

    And sometimes Chinese scientists can send a mixed message. Zhang Miman, of the well-known Institute for Vertebrate Paleontology and Paleoanthropology in Beijing, believes that some Chinese researchers have turned a blind eye to the collection and shipping of precious samples and specimens in return for personal gain, including round-trip tickets to a foreign country and the inclusion of their names on research papers written by the foreign parties. She emphasizes, however, that such behavior violates national sovereignty.

    Money matters. Not surprisingly, money is often a source of friction. Geologist Larry Brown of Cornell University is a senior scientist on the International Deep Profiling of Tibet and Himalayas (INDEPTH) project (Science, 17 November 1995, p. 1144; 6 December 1996, pp. 1684, 1688, 1690, 1692, and 1694). He says his team had to cancel plans to film some of its work after learning about the high cost of access fees that non-Chinese participants were charged to enter Tibet. The negotiations, he says, left him with the impression that some Chinese are “more concerned with financial than scientific issues.”

    But CAGS geologist Xiong Jiayu, who is also head of the academy's Science and Technology Division, says many of the conditions are not set by scientists. For instance, Tibet has a local rule that forbids a single vehicle from going out to the field, he says. And housing costs are almost always higher for foreigners, he notes, because of rules that prohibit them from staying in the same, inexpensive guest houses typically used by their Chinese colleagues.

    Then there is the matter of setting aside what INDEPTH scientists euphemistically label a “public relations” budget. The money is used to smooth out any obstacles to progress, from a recalcitrant truck driver to an uncooperative local official. Once INDEPTH's foreign participants understood the importance of having such a budget, says Zhao Wenjin, a senior geophysicist at CAGS and co-leader of the project, they were able to carry out their work much more easily.

    Veterans of joint projects say that one way to avoid mishaps in navigating the complex regulations, differing procedures, and rigid social norms in Chinese society is to put everything in writing. “Our Chinese colleagues are always faithful to the very word,” says Brown. “The disagreements usually come over issues that developed during the course of the experiment.”

    Other tips from experienced collaborators are to select a partner carefully and be sensitive to cultural differences. Don't assume, for example, that a nod or smile means one's collaborator has understood the conversation or agreed to a particular course of action. For vertebrate paleontologist Richard Tedford, of the American Museum of Natural History in New York, there's really only one thing to remember: “Communicate, communicate, communicate,” he stresses. “That's the secret.”

    Despite the occasional problems, China and its global partners are eager to reap the scientific advantages that flow from collaboration. In January, the International Union of Geological Sciences, in addition to ratifying the Ordovician site, took a small step toward cementing those ties by embracing a draft of guidelines compiled by geoscientists from 22 nations. The guidelines say that each visiting scientist “must respect not only the sovereignty, laws, and environment of the country in which he or she conducts research, but also the dignity and intellectual rights of its scientists.” In other words, mind your manners when traveling abroad.

  8. Paleontology

    Pact to Open Up New Fossil Trove

    1. Ann Gibbons

    The latest example of China's importance to paleontologists around the world is the discovery of a stunning trove of dinosaur and bird fossils in northeast China. And a new international collaboration is expected to be the key to unlock its secrets.

    Scientists at the National Geological Museum in Beijing and The Academy of Natural Sciences in Philadelphia are hoping to reach an agreement for long-term exploration and characterization of the site in the Yixian formation of the Liaoning Province of northeast China. The academy's Don Wolberg estimates that the effort could cost $1 million over 5 to 8 years and involve mapping and dating the site, drilling, and an analysis of its geology, flora, and fauna.

    The fossil beds, covering a period some 120 million to 130 million years ago at the border of the Jurassic and Cretaceous periods, are the resting place of hundreds of early birds and dinosaurs. They include a female dinosaur called Sinosauropteryx, which had the carcass of a mammal in its gut and an egg in its oviduct–making it the earliest known evidence of a dinosaur preying on a mammal and the oldest internal organ preserved in the fossil record (see photo). The rich beds also have produced two notorious fossils in the past year–the purported “feathered” dinosaur called Sinosauropteryx prima (Science, 1 November 1996, p. 720) and a candidate for the oldest modern-looking bird, called Liaoningornis (Science, 15 November 1996, p. 1083).

    [Photo: David Bubier/The Academy of Natural Sciences]

    “It appears to represent a blank page in a chapter of Earth time not seen before,” says Yale University paleontologist John Ostrom, who in March led a reconnaissance trip to the site by a group of U.S. and German paleontologists.

  9. Working in China

    No Moratorium on Trust

    1. Jeffrey Mervis

    Veterans of international collaborations with China say that trust is the lubricant that makes such projects run smoothly. But the pump temporarily ran dry in the feud between Chinese geoscientists and Spencer Lucas, a paleontologist at the New Mexico Museum of Natural History.

    “Anybody who approves a GSSP [global stratotype section and point] is taking a big chance,” says Lucas, who fought with his Chinese collaborators last summer over the rules governing a field project and ultimately left the country without any samples. “Who knows what they will do at a site once it gets approved?” But Liu Dunyi, a geochronologist with the Chinese Academy of Geological Sciences (CAGS) and a vice president of the International Union of Geological Sciences, says that Lucas turned a simple misunderstanding into an unwarranted attack on the country. “[The incident] was an isolated case that has nothing to do with China's official policy,” he says. “Professor Lucas has made it an excuse to negate all international collaborations in China and mar the reputation of Chinese scientists.”

    Last fall, Lucas urged the global geosciences community to impose a moratorium on its process of approving Chinese sites as models for understanding Earth's history until high-ranking state authorities promise free and open access to them. Although that campaign may have affected the voting on one site (see main text), it does not appear to have had a lasting effect on the process.

    Now there are signs that the breach may be closing. Last month, Lucas and his co-investigators on the ill-fated expedition invited their Chinese colleague, CAGS's Cheng Zhengwu, to sign a joint research agreement and to participate in an analysis of the samples, now being kept in Beijing, once the material is sent to the United States. “We're happy to work with them, but we can't offer them any money,” says paleomagnetist John Geissman of the University of New Mexico. “There's nothing in the original grant for analysis.” The invitation followed a recent letter to Lucas from Cheng, recounting the incident and ending with his wish “to cooperate with the U.S. side and send the samples to the USA as soon as an agreement is signed.”

  10. Neuroscience

    Estrogen Stakes Claim to Cognition

    1. Ingrid Wickelgren

    Recent work is showing that the female hormone estrogen has many effects on brain neurons that could give it the ability to improve such higher mental functions as learning and memory

    Over the years, the so-called “female” sex hormones have come in for their share of scorn. One notorious example was in 1970 when Edgar Berman, personal physician to former Vice President Hubert Humphrey, declared that women were unfit for many jobs because of their “raging hormonal imbalances.” But recently one female hormone–estrogen–has begun to reveal some unsuspected talents: an apparent ability to preserve and even improve some of the brain's highest functions. “We used to think [estrogens] influenced only sexual behavior; now we know they also influence learning and memory,” says one pioneer of the work, neuroendocrinologist Victoria Luine of Hunter College of the City University of New York.

    Estrogen has won this new respect through a wealth of cellular and molecular studies showing that the hormone can empower brain cells involved in thinking in many ways: It boosts the cells' chemical function, spurs their growth, and even keeps them alive by shielding them from toxins. Earlier work had shown that estrogen stimulates nerve-cell growth in the brains of developing embryos, but until recently, no one realized that the hormone could exert similar power over cognitive portions of the adult brain. Now, says Phyllis Wise, a reproductive endocrinologist at the University of Kentucky in Lexington, “There are clear-cut, basic science data showing a biochemical substrate for a memory effect.”

    What's more, recent work suggests that estrogen's neuronal effects have functional consequences. Human epidemiological studies and small clinical trials indicate that the hormone improves memory in both healthy women and female patients with Alzheimer's disease, and may even stave off that disease if given to women after menopause (see sidebar). If confirmed, such findings will not only help secure estrogen's status as a memory molecule, but also may lead to better treatments and prophylactics for Alzheimer's disease and possibly normal, age-related memory loss as well. And even men may benefit if researchers can design appropriate drugs that don't cause feminization or other unwanted side effects. Studies suggest that the male brain is sensitive to estrogen, which a brain enzyme in men synthesizes from testosterone.

    The first hints that estrogen might affect cognition came 2 decades ago, although the investigators didn't realize it at the time. While studying how the hormone might control reproductive behavior, Luine, then working with Bruce McEwen at Rockefeller University in New York City, gave estrogen to female rats in which the ovaries had been removed, then looked for changes in brain areas thought to govern reproduction. Among other things, the researchers saw an increase in levels of an enzyme called choline acetyltransferase (ChAT) in certain neurons of the basal forebrain.

    Because ChAT makes acetylcholine, the chemical those neurons use to communicate with other nerve cells, it looked like estrogen might be revving up activity in the basal forebrain cells. But because the part of the basal forebrain Luine and McEwen examined was not widely known to be involved in learning and memory, the researchers didn't make the connection to cognition.

    Then, in the early 1980s, Luine stumbled across a review article describing a massive loss of acetylcholine-releasing neurons in the basal forebrains of patients with Alzheimer's disease. She realized that these neurons must play a role in cognition–and that estrogen might have a therapeutic effect in Alzheimer's disease.

    What's more, Luine did further animal studies that linked estrogen changes to two brain areas more commonly associated with memory and learning: the hippocampus and cerebral cortex. Because basal forebrain neurons send long projections called axons to both areas, she reasoned that the extra ChAT produced in the basal forebrain in response to estrogen could reach the hippocampus and cortex through the basal forebrain axons. And in fact, Luine found that ovariectomized female rats given estrogen did have more ChAT enzyme in the hippocampus and frontal cortex than control animals did.

    Molding the brain

    Within a few years, Catherine Woolley, Elizabeth Gould, McEwen, and their Rockefeller colleagues uncovered another way in which estrogen might act on neurons involved in learning and memory: by helping to build and maintain synapses, the specialized structures through which one neuron communicates with another.

    Synapses form at points of contact between axon endings and tiny branches, called spines, that jut out from the shorter neuronal projections known as dendrites on the target cell. In the 1970s, Dominique Toran-Allerand, of the Columbia University College of Physicians and Surgeons in New York, had shown that estrogen stimulates the sprouting of axons and dendrites from developing mouse neurons in cell culture, but the hormone was not thought to affect adult neurons.

    Yet, the Rockefeller team found that depleting adult female rats of estrogen by removing their ovaries caused a loss of spines from certain hippocampal cells. By contrast, ovariectomized rats that received estrogen injections had hippocampal cells with almost the same number of spines that rats with ovaries had. The study “opened up our thinking about what hormones could do in an adult animal,” says Luine.

    Two new studies now have suggested that the estrogen-induced dendritic changes actually affect neuron function, by linking them to a molecule that plays an important role in cognition. This is the NMDA receptor, a membrane protein that detects incoming signals from the neurotransmitter glutamate. In the first study, completed late last year, John Morrison and Adam Gazzaley at Mount Sinai School of Medicine in New York, working with Nancy Weiland and McEwen at Rockefeller, found 30% more of the protein in certain hippocampal neurons from ovariectomized female rats treated with estrogen than in the cells of untreated animals. The increase was concentrated in the same hippocampal region where the Rockefeller team had found an increase in spines.

    And just last month, Woolley and Philip Schwartzkroin, now both at the University of Washington, Seattle, along with Rockefeller's Weiland and McEwen, determined that the additional NMDA receptors fostered by estrogen are active in transmitting neuronal signals. After first confirming that estrogen replacement induces a 30% rise in both NMDA receptors and spines in the hippocampuses of ovariectomized female rats, the researchers electrically stimulated hippocampal neurons under conditions in which only NMDA receptors are active. They found that neurons from estrogen-treated rats responded to this stimulation with larger currents than did neurons from control rats. “That says that the new spine synapses are mainly NMDA-type synapses,” McEwen notes.

    The growth-factor theory

    But how might estrogen induce the nerve cell growth needed for such synapse formation? Work done over the past few years by Toran-Allerand's group suggests one possibility: The hormone may cooperate with neurotrophins, potent stimulators of nerve-cell growth, such as nerve growth factor (NGF). In 1992, for example, Toran-Allerand and her colleagues found receptors for both estrogen and the neurotrophins on the same neurons in the rodent basal forebrain. A year later, they found the same receptor pairing in neurons in the cerebral cortex and hippocampus.

    But the functional significance of this finding didn't become clear until 1994, when the Columbia team discovered that estrogen increases the expression of NGF receptors in cultured rat cells and, conversely, that NGF enhances the binding of estrogen to the same cells. This hinted that each type of molecule may act in the cell nucleus to boost the expression of the other's receptor, allowing estrogen and NGF to amplify each other's growth responses.

    More recently, Toran-Allerand has taken this work further with results suggesting a possible new mode of action for estrogen. The hormone is supposed to exert its effects by forming a complex with its receptor, which then helps turn on certain genes in the cell nucleus. Such a mechanism could account for the hormone's ability to increase the synthesis of the NGF receptor and also that of the ChAT enzyme.

    But at last year's Society for Neuroscience meeting, Toran-Allerand's group presented evidence indicating that estrogen also co-opts NGF's own growth-stimulatory pathway. In tissue slices from the cerebral cortexes of developing rats, the researchers found that estrogen activates a key class of molecules in the NGF signaling pathway: cytoplasmic enzymes called extracellular signal-regulated kinases that help relay the NGF signal from its receptor to the nucleus. In that way, estrogen could regulate many more genes than anyone thought it could. But even though this flies in the face of the current dogma on estrogen action, neuroscientists, including James Simpkins of the University of Florida, Gainesville, are warming up to the idea. “It's clear that many effects of estrogen do not involve what we learned in school about the estrogen receptor,” he says.

    Chemical shield

    Indeed, estrogen seems to have yet another trick up its sleeve. Through a mechanism involving neither its own receptor nor the growth factor-signaling pathways, the hormone can directly protect brain cells from toxins. The first hints of this shielding effect emerged from experiments that Simpkins and his Florida colleagues performed in 1994. Normally, 80% to 90% of human neuroblastoma cells, which were derived from a cancer of the peripheral nervous system, die within 2 days when placed in culture fluids lacking blood serum, which contains factors essential for their growth. But the Simpkins team found that estrogen prevents this cell death.

    What's more, the protective effect seemed to have little to do with the estrogen receptor: It occurred in other cell types that lack the receptor, and various forms of estrogen–which have different affinities for the estrogen receptor–were equally good at keeping the neurons alive.

    So how might estrogen be protecting nerve cells? Recent data show it can act as an antioxidant, soaking up highly reactive molecules called free radicals, which can kill a cell by fracturing its membrane lipids, proteins, and DNA. In 1995, Christian Behl and his colleagues at the Max Planck Institute of Psychiatry in Munich, Germany, reported that high estrogen concentrations reduce the neuron-killing effects of several toxins that boost production of free radicals. Among these are glutamate, a neurotransmitter that is toxic to cells in high concentrations, and β amyloid, a protein that accumulates in the brains of Alzheimer's patients and is thought by some to be a cause of their neuronal degeneration.

    Simpkins's team has confirmed Behl's results, using lower estrogen concentrations similar to those found in the body. “Our data clearly indicate that estrogens, at physiologically relevant concentrations, eliminate most of the oxidation” in a cell that results from its exposure to β amyloid, says Simpkins.

    Memory medicine?

    Researchers are now amassing evidence that estrogen's cellular effects improve mental function. Experiments in female rats and monkeys have linked high blood levels of estrogen to better performance on cognitive tasks thought to involve the hippocampus, where the hormone stimulates synapse formation. And a few studies in people have also supported the estrogen-cognition link.

    For example, psychologist Barbara Sherwin of McGill University in Montreal studied 18 women in their 30s who were being treated for fibroid tumors of the uterus with a medication that suppresses estrogen production. The women's verbal memory scores dropped after they began taking the drug. The decline was then reversed in women who received 8 weeks of estrogen-replacement therapy, but not in those who got a placebo. The results, Sherwin wrote last year in the Journal of Endocrinology, “strongly suggest that estrogen serves to maintain verbal memory in women.”

    Such results, together with the mounting data on estrogen's beneficial effects in Alzheimer's, are spurring investigators to develop drugs that might bolster brain function without promoting reproductive cancers in women (one side effect of estrogen therapy) or feminine characteristics in men. Indeed, the Behl and Simpkins teams have already discovered a clue that may aid in the design of drugs that imitate estrogen's antioxidant effect.

    In the April 1997 issue of Molecular Pharmacology, Behl's group reports that rodent hippocampal neurons exposed to estrogens and various related steroids were protected against β amyloid and glutamate only when the steroid had a hydroxyl group dangling from a particular place on one of its molecular rings. Simpkins independently reported similar results at last year's meeting of the Society for Neuroscience and is already working with medicinal chemists to synthesize neuroprotective estrogens that lack the hormone's other effects.

    Finding a new estrogenlike drug that works well in the body, however, may require an even more detailed understanding of estrogen's maneuverings through neurons. It's still not clear, for example, precisely how estrogen helps neurons skirt death by oxidation, or exactly what molecular steps it takes to induce neuronal sprouting, or even which other molecules might influence its encounter with its nuclear receptor. “We're very early in the game, even on a basic research level,” Simpkins warns.

    Still, many researchers believe that estrogen will soon reveal its other mind games, given the army of investigators assigned to its case. “It's going to happen very rapidly,” says McEwen. And as it does, a hormone once considered a key only to reproduction may open new doors to our brains and keep us mentally sharp beyond our reproductive years.

  11. Estrogen

    A New Weapon Against Alzheimer's?

    1. Ingrid Wickelgren

    Back in the mid-1980s, a pioneering young doctor named Howard Fillit unwittingly rekindled true love between a poet and his girlfriend–an 80-year-old woman named Elsa who had Alzheimer's disease. Before Fillit treated her, Elsa was quiet, apathetic, and unable to learn pairs of words, even after seeing them dozens of times. Afterward, she was much more alert, talkative, and able to remember the words after just a few trials. The treatment that brought on this transformation was estrogen, and its effects won this woman not only a new mind but also a marriage proposal from her boyfriend.

    Fillit, who was then at Rockefeller University in New York City, had begun a pilot study of estrogen after one of his colleagues there, Victoria Luine, found laboratory evidence that the hormone protects the neurons that deteriorate in Alzheimer's. He found that besides Elsa, two of the six other women in the study also showed gains in their cognitive skills. But when Fillit applied for funding for further studies of estrogen's efficacy in Alzheimer's disease, several agencies, including the National Institutes of Health, roundly refused–on the grounds, he recalls, that his applications had “no scientific merit.” Now, barely a decade later, the possibility that estrogen might help Alzheimer's patients has become one of the hottest topics in the field.

    At least five small treatment trials have replicated Fillit's findings, and other studies suggest that the hormone might prevent or delay the onset of Alzheimer's in women. Men also could benefit, if analogs of the hormone that lack its feminizing properties can be developed. And while experts caution that these human studies aren't definitive, a flood of other work has shown that estrogen has neuronal effects consistent with a role in cognition (see main text). As a result, Alzheimer's researchers are becoming “cautiously optimistic” about estrogen's ability both to treat and stave off dementia, says Victor Henderson, a neurologist at the University of Southern California School of Medicine.

    Among the recent clinical trials engendering this optimism is the first double-blind, controlled study in which Alzheimer's patients received the standard estrogen dose given to postmenopausal women in the United States. Study leader Sanjay Asthana, of the Veterans Affairs Medical Center in Tacoma, Washington, reported at last year's annual meeting of the Society for Neuroscience that five of six women who had mild Alzheimer's disease and wore an estrogen patch for 2 months showed noteworthy improvements in verbal memory and attention. For example, from a list presented to them 20 minutes earlier, they could remember an average of twice as many words as they could at the start of the study. “We found a direct relationship between the level of estrogen in the blood and improvement in memory,” says Asthana. The effect disappeared when the treated women went off estrogen, however.

    While Asthana's results are encouraging, experts say they need to be confirmed in larger studies, such as a multicenter trial sponsored by the National Institute on Aging, which will include 120 women with Alzheimer's disease. Those results are expected in 1999. Meanwhile, other studies are strengthening the case that the hormone can also slow the development of Alzheimer's.

    That case began building in 1994, when an epidemiological study by Henderson and his colleagues suggested that estrogen use by women may lower their risk of Alzheimer's disease by at least 45%. These results were open to question because they relied upon death certificates for the diagnosis of Alzheimer's; however, in the past year, at least two large trials in which dementia was diagnosed by experts during the trial have tied estrogen use to a diminishing risk of Alzheimer's.

    One of these comes from a team led by Richard Mayeux of the Columbia College of Physicians and Surgeons in New York City. At the start of this study, the researchers questioned 1124 elderly women, all of them mentally healthy at the time, about their estrogen use. The women were then followed for 1 to 5 years. At the study's end, only 6% of the 156 estrogen users had developed Alzheimer's disease, compared to 16% of the 968 women who had never taken the hormone. Moreover, estrogen users who did develop dementia did so significantly later. The researchers calculated that taking estrogen for more than a year would lower a woman's risk of developing Alzheimer's by up to 5% annually. “From the newest data, it looks like estrogen might benefit women by delaying things,” says epidemiologist Walter Kukull of the University of Washington, Seattle.

    Still, to be confident about that, experts would like to see similar results from more prospective studies such as Mayeux's. Better yet would be a positive outcome from the first randomized intervention study of estrogen's ability to stave off Alzheimer's disease. This effort, part of the government-sponsored Women's Health Initiative, will test the mental acuity of 8000 elderly women each year for 6 to 9 years, to find out whether the Alzheimer's incidence is lower in women given estrogen than in those given a placebo.

    If it is, there will be a push to devise a treatment that can mimic estrogen's properties in the brain without producing growth elsewhere–an action thought to lead to reproductive cancers in women and feminizing effects in men. In the meantime, doctors may recommend estrogen to postmenopausal women who wouldn't accept the cancer risk just to bolster their bones or their hearts–but who will take that chance to safeguard their brains.

  12. Cell Biology

    Force-Carrying Web Pervades Living Cell

    1. James Glanz

    “Don't fight forces; use them.” The words of the engineer and architect R. Buckminster Fuller might turn out to be a motto for the living cell. Investigators have traditionally pictured the cell's cytoskeleton of protein fibers as mainly a supporting mechanism. Recent findings, however, have hinted that mechanical forces on the cell can affect everything from the way proteins bind to DNA to whether a malignant cell develops into a full-blown tumor. And now a team of cell biologists–inspired, in part, by Fuller's structural ideas–has demonstrated that mammalian cells are densely “hard-wired” with force-carrying connections that reach all the way from the membrane through the cytoskeleton to the genome.

    Making connections. A force-carrying network (below) extends from the cell membrane into the nucleus, where a micropipette pulls out linked chromosomes (above).

    The team, at Harvard Medical School and Children's Hospital in Boston, combined micromanipulation, video microscopy, and highly specific molecular “adhesives” to show that tugging on particular receptors at the surface of a living cell triggers nearly instantaneous rearrangements in the nucleus. The experiment, by Andrew Maniotis, Donald Ingber, and their collaborators, is being hailed as a triumph that required, among other things, “a really masterful use of reagents that have become available only in the last 5 years,” says Stuart Newman of the New York Medical College in Valhalla.

    More important, he adds, “It puts the cytoskeleton in a new light: as a mechanism for signal transduction rather than just as a supporting mechanism.” Although various researchers have suggested that cytoskeletal connections could transmit regulatory information through the cell, this direct demonstration “really puts [that conjecture] on the map,” according to Newman. The mechanical communications system, if that's what it is, even extends through the nucleus, as the team found in a follow-up experiment in which it reached directly into the nucleus and plucked out structures such as individual chromosomes. They too were linked–by elastic strands of DNA. “It would suggest that everything in the nucleus is in fact connected,” says Jeffrey Nickerson of the University of Massachusetts Medical Center in Worcester. “From my point of view, that's remarkable, and it's wonderful.”

    Other cell biologists, such as Zena Werb at the University of California, San Francisco, say that to establish the significance of what it has seen, the Harvard group still must show whether the connections are important in, say, regulating specific genes. But few researchers doubt that the results will raise the profile of mechanical forces in the cell. They are also likely to draw attention to the concept that inspired the experiments: the idea that the cell owes its shape and many of its properties to a “tensegrity” structure–a design principle described by Fuller.

    Tensegrity (tensional integrity) structures gain shape and strength by combining elements that resist compression with a network of other elements under tension, creating a “prestressed” system, explains Ingber. A bow used to shoot an arrow is one example, as are Fuller's own geodesic domes and the gravity-defying, strut-and-cable sculptures of the artist Kenneth Snelson. Ingber argues that the cell's internal skeleton shares properties with these structures, because it combines structural elements that resist compression, called microtubules, with others that are strong under tension–the actin microfilaments and the intermediate filaments.

    Because tensegrity structures act as a force-carrying network, the model predicts that forces applied to cell surface receptors anchored to the cytoskeleton will quickly propagate into the cell interior. Using live human and cow endothelial cells, which line blood vessels, Maniotis, Ingber, and Christopher Chen set out to test this hypothesis.

    First, they coated 4.5-micrometer beads with fibronectin, a protein that binds only to integrin receptors–cell surface structures moored to the cytoskeleton through the cell membrane. With a manually operated micromanipulation device, Maniotis then used a micropipette “like a golf club” to move the beads about 10 micrometers a second, while monitoring the cell with the video microscope.

    As the group reported in the 4 February issue of the Proceedings of the National Academy of Sciences, the video microscope captured almost instantaneous movements and realignments of nuclear structures–dense structures called nucleoli suddenly lining up, for example, or moving toward the edge of the nucleus. Pulling on other membrane receptors that aren't linked to the cytoskeleton had no such effect, suggesting that the rearrangements were not caused by a “sausage-casing” effect of tensing the membrane.

    “These studies are compatible with a prestressed cytoskeletal system, [which is] part of the tensegrity model proposed by Ingber,” says Avri Ben-Ze'ev of the department of molecular cell biology at the Weizmann Institute of Science in Israel. Mina Bissell of California's Lawrence Berkeley National Laboratory, who has shown that mechanical stresses on human malignant cells can determine whether they proliferate into a tumor or lie dormant, adds: “They have beautifully demonstrated the nature of the physical connection” between a cell's surroundings and genes in the nucleus. “I think we are looking at very similar phenomena.”

    In a second paper, which appeared last week in the Journal of Cellular Biochemistry, Maniotis, Ingber, and Krzysztof Bojanowski describe evidence that this kind of mechanical continuity extends into the nucleus. Using micropipettes that had been carefully drawn so that their tips were roughly a wavelength of light in thickness, the researchers harpooned structures in the nucleus itself. “You read a lot of Moby Dick and you practice,” says Maniotis. He and his collaborators then drew the structures–nucleoli and chromosomes–out of the breach in the nucleus. Even though chromosomes extracted from dead, fixed cells typically appear isolated, the experiment showed that in the living cells, the chromosomes and nucleoli were always connected by flexible strings. The connections turned out to consist of DNA, as the team determined by treating the strands with a range of chemicals that snip only specific molecules.

    If this is the case, the clean separation between chromosomes seen in other micrographs may be an artifact of sample preparation. “On a [heretical] scale of one to 10,” says Ingber, this is “an 11.” But the Harvard group is not alone in this heresy. Mohamed El-Alfy and Charles Philippe Leblond of McGill University in Montreal recently reported at a conference that they, too, have seen the DNA connections in all phases of the cell cycle–even though they used an entirely different technique based on electron microscopy. “This is really strong proof that there are bridges between chromosomes,” at least in many cells, says El-Alfy.

    What it all shows, says Peter Davies, director of the Institute for Medicine and Engineering at the University of Pennsylvania, is that “the cell structure is an integrated entity where the parts are connected.” Although it will take further work to learn whether these links really do carry signals to the genes in the nucleus, researchers seem to agree that the results are changing the way biologists think about the cell. “For a long time, the mechanical and engineering aspects of cell biology were not appreciated,” says David Bensimon, a biophysicist at the École Normale Supérieure in Paris. “This type of experiment … certainly suggests that there is much to learn about the possible mechanical control of DNA and gene expression and regulation.” Although geodesic domes may be out of fashion now, their principles could live on in biology.

  13. Optical Communications

    Chaos Keeps Data Under Wraps

    1. Alexander Hellemans
    1. Alexander Hellemans is a science writer in Paris.

    PARIS—Not long ago, chaos was a bugbear of electronic and optical systems, creating seemingly unavoidable noise and uncertainty in computation and communication. Now, says physicist Edward Ott of the University of Maryland, College Park, “People have begun to … use it for practical purposes.” The latest example came last month when a group of French scientists announced that it had harnessed chaos to protect optical-communications signals from eavesdroppers.

    Caught in the loop.

    This electro-optic feedback loop creates chaotic frequency fluctuations that mask the data carried.

    Researchers learned how to use chaotic fluctuations to hide data in electronic signals several years ago. Creating chaotic fluctuations in frequency in an optical signal, then reliably removing them at the receiver, is a logical next step, and that's what the French group has done. The new scheme, developed by Jean-Pierre Goedgebuer, Laurent Larger, and Alexis Fischer of the University of Besançon, is exceptionally fast, allowing encrypted signals to make full use of the high transmission speeds of fiber-optic cables. Daniel Gauthier of Duke University calls it “an interesting first demonstration,” adding: “The high-speed aspect of it would be quite useful.”

    With the rapid growth of communications, encryption has developed into an industry. Encryption schemes that rely on numerical “keys,” however, have a drawback: the processing time needed to encode and decode the information. Chaos offers a faster solution. Just as a microphone brought too close to an amplifier creates feedback, some types of circuits can become caught in a loop and generate chaotic fluctuations in frequency, masking the information they are carrying. Researchers realized that if they could generate an identical copy of the chaotic signal at the receiver, they could quickly subtract it from the signal to recover the original data.

    Some electronic circuits lend themselves to this trick because their chaos is “deterministic”: two circuits built with the same physical parameters and fed the same input will settle into the same chaotic signal. In 1993, electronic engineers Kevin Cuomo and Alan Oppenheim of the Massachusetts Institute of Technology were the first to harness deterministic chaos for encryption in an electronic communication system (Science, 23 July 1993, p. 429).

    Now the Besançon group has shown that the principle can be applied to beams of light. The team encrypted a data signal carried by a laser beam by controlling the laser diode with an electro-optic feedback loop. The loop consists of a “birefringent” filter that responds nonlinearly to changes in beam frequency by transmitting more or less of the beam, and a photodetector that converts the flickering light into an electronic signal. The data signal is superimposed on this fluctuating electronic signal, which is then amplified and fed back to the laser diode, varying its frequency chaotically to create the encrypted signal.

    The receiver contains an identical feedback loop with the same settings. When an encrypted beam arrives, it is split in two with a beam splitter. One part passes through the feedback loop and causes a second diode laser to emit an identical chaotic signal—but this time without the data signal having been added in. This chaotic signal is then inverted and superimposed on the other part of the incoming encrypted signal, canceling out its chaotic component to reveal the original data. This process is fast and secure, says Larger: “For the type of chaos we are using, called hyperchaos, the signal appears very much as random noise.”
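
    The masking-and-cancellation principle is easier to see in a toy numerical model than in the optics themselves. The sketch below is purely illustrative: the logistic map, its parameter values, and the bit message are invented and have nothing to do with the Besançon electro-optic loop. It hides a small data signal inside a deterministic chaotic carrier, and a receiver that shares the generator's parameters and starting state regenerates the same chaos and subtracts it to recover the data. In the real systems the receiver does not need the starting state—it locks onto the incoming chaotic signal itself—but the arithmetic of the final cancellation step is the same.

```python
# Toy illustration of chaotic masking (not the Besancon electro-optic scheme):
# both ends share the parameters of a deterministic chaotic generator, so the
# receiver can regenerate the transmitter's chaos and subtract it out.

def logistic_chaos(n, r=3.99, x0=0.6123):
    """Generate n samples of a deterministic chaotic sequence (logistic map)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def transmit(data, scale=0.01):
    """Hide a small-amplitude data signal inside a large chaotic carrier."""
    chaos = logistic_chaos(len(data))
    return [c + scale * d for c, d in zip(chaos, data)]

def receive(signal, scale=0.01):
    """Regenerate the same chaos from the shared parameters and cancel it."""
    chaos = logistic_chaos(len(signal))
    return [(s - c) / scale for s, c in zip(signal, chaos)]

data = [0, 1, 1, 0, 1, 0, 0, 1]              # the "message" bits
recovered = receive(transmit(data))
print([round(v) for v in recovered])          # -> [0, 1, 1, 0, 1, 0, 0, 1]
```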

    Goedgebuer says the system could be valuable where high speed and low cost are critical, such as encrypting cable-TV transmissions. Jean-Yves Le Traon of France Telecom's research branch in Lannion, which supported the research, expects applications on a limited scale in about 5 years: “We hope to increase security in this way …, especially if we combine optical with numerical encryption.”

    Gauthier says he's impressed that the company is taking a serious interest in this exotic concept: “The industry is so set in its standards that trying to do something like this would have to be a major shift. If they are really convinced it is cheap, they might be willing to switch over.”

  14. Paleoclimatology

    Second Clock Supports Orbital Pacing of the Ice Ages

    1. Richard A. Kerr

    For a while, it looked as if a water-filled crack in the Nevada desert might doom the accepted explanation of the ice ages. Twenty years ago, the so-called astronomical theory had carried the day. Oceanographers had found evidence implying that the march of ice ages over the last million years was paced by the cyclical stretching and squeezing of Earth's orbit around the sun, which would have altered the way sunlight fell on the planet's surface. But in 1988, researchers scuba diving in Nevada's Devils Hole came up with a climate record—captured in carbonate deposits in the crack—that seemed to contradict this chronology (Science, 6 April 1990, p. 31).

    Benchmarks.

    Terraces on the New Guinea coast record the high sea levels of past interglacial periods.

    Arthur Bloom/Cornell University

    The Devils Hole record traced climate swings of about the same length as the marine record, but they were out of step with the variations of Earth's orbit. Most glaringly, these carbonates indicated a profound warming trend, which appeared to signal the end of the penultimate ice age, thousands of years before orbital variations could have begun to melt the ice. If the Devils Hole chronology was a true record of the world's ice ages, researchers would have to dump the astronomical mechanism and look for something new.

    But now, after almost a decade of wrangling over whether inaccuracies in the dating of one record or the other might account for the conflict, an arbiter has come forward. On page 782 of this issue of Science, geochronologists Lawrence Edwards and Hai Cheng, of the University of Minnesota, and Michael Murrell and Steven Goldstein, of Los Alamos National Laboratory in New Mexico, present the preliminary verdict. They used a new clock, based on the radioactive decay of uranium-235 to protactinium-231, to check the dates of both the Devils Hole record and records of sea-level change in Barbados coral. The result: The marine record is right, and the astronomical theory is on solid ground.

    But the new findings haven't settled the issue cleanly: Puzzlingly, the Devils Hole record seems to be correct as well. To most oceanographers, this bolsters their contention that the Devils Hole and marine records “are two fundamentally different beasts,” says Steven Clemens of Brown University. He and others suggest that while the marine records trace the ebb and flow of the ice ages, Devils Hole may chronicle only the climate of a region as small as southwestern North America. “They have convinced us that both kinds of dates are pretty firm,” adds geochronologist Teh-Lung Ku of the University of California. “That gives us another layer of confidence that dating isn't the problem.”

    The original confirmation of the astronomical theory came in the late 1970s from sea-floor sediment, where the ratio of two oxygen isotopes traces how much water was locked up in the ice sheets when the sediment was deposited. This ice-volume signal showed that water flooded into the ocean from melting ice sheets about 128,000 years ago, marking the start of the last warm interglacial period. That was just when orbital variations would have maximized the amount of sunlight falling on Northern Hemisphere ice sheets. The coincidence had helped convince oceanographers that orbital variations paced the ice ages.

    But because marine sediments are so difficult to date, the team of oceanographers studying this isotope record, the so-called spectral mapping group (SPECMAP), had only two or three direct dates for the sediment of the past million years. To fill in the chronology, they simply counted the “ticks” left in the isotope record by two known periods of the orbital clock, 21,000 and 41,000 years long. But a better, more directly dated record tended to confirm the sea-floor chronology. Corals that formed at the ocean's surface when melting ice pushed up the sea level, then died when the ice returned and sea level fell, have left terraces that can be dated by measuring the accumulation of thorium-230 from the decay of uranium.

    Some of the coral dates were at odds with the astronomical predictions, however, and the Devils Hole record posed an even starker contradiction (Science, 9 October 1992, p. 220). Oxygen isotopes in the finely layered carbonate deposited there by ground water record the temperature of the atmosphere when the rain or snow fell. When Isaac Winograd of the U.S. Geological Survey in Reston, Virginia, Kenneth Ludwig of the Berkeley Geochronology Center in California, and their colleagues applied uranium-thorium dating to the carbonate layers, they found that the warmth of the last interglacial began 140,000 years ago in Nevada. That was long before orbital changes would have begun warming the Northern Hemisphere.

    One possible explanation for the conflict was that one or both sets of dates had been skewed by chemical alteration of the rock, which could add or deplete the uranium or thorium. Any exchange—say, with ground water seeping through an old coral bed—would change the setting of the thorium clock and produce an erroneous age. “There was always some degree of uncertainty about the thorium-230 ages,” says Edwards, “because you never had a good way of determining whether the age was accurate or not.”

    Now Edwards and his colleagues have come up with the first reliable check on age-altering chemical changes. This check involves the element protactinium, which is also a decay product of uranium but comes from a different isotope than thorium. Chemical alteration should have similar effects on both clocks. But because the uranium “parents” of protactinium and thorium decay at different rates, an altered sample will yield different thorium and protactinium ages, while reliable ages will agree. To get the most precise reading from this second clock, Edwards and his colleagues directly counted individual protactinium atoms with a new mass-spectrometric technique, instead of trying to estimate the element's abundance by measuring its own slow decay. “It's a technological tour de force,” says Ludwig.
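
    The logic of that cross-check can be put in rough numbers. In the simplest textbook form—assuming a closed system with no initial daughter isotope, and ignoring the uranium-234 excess correction that real coral dating requires—each decay pair yields an independent age, and a chemically altered sample gives itself away when the two ages disagree. The sketch below only illustrates that arithmetic, with rounded half-lives and invented activity ratios; it is not the Minnesota–Los Alamos team's procedure.

```python
import math

# Rounded half-lives from standard tables, for illustration only.
LAMBDA_TH230 = math.log(2) / 75_700    # thorium-230 half-life, years
LAMBDA_PA231 = math.log(2) / 32_800    # protactinium-231 half-life, years

def age_from_ratio(activity_ratio, lam):
    """Closed-system age (years) from a daughter/parent activity ratio,
    assuming no initial daughter: ratio = 1 - exp(-lambda * t)."""
    return -math.log(1.0 - activity_ratio) / lam

def ages_agree(th_ratio, pa_ratio, tolerance=0.05):
    """Return both ages and whether the two clocks agree within `tolerance`."""
    t_th = age_from_ratio(th_ratio, LAMBDA_TH230)
    t_pa = age_from_ratio(pa_ratio, LAMBDA_PA231)
    return t_th, t_pa, abs(t_th - t_pa) <= tolerance * max(t_th, t_pa)

# A hypothetical unaltered sample ~125,000 years old: the two clocks match.
th = 1 - math.exp(-LAMBDA_TH230 * 125_000)
pa = 1 - math.exp(-LAMBDA_PA231 * 125_000)
print(ages_agree(th, pa))
# Post-depositional uranium gain or loss shifts the two ratios by different
# amounts, so the thorium and protactinium ages would no longer match.
```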

    Edwards applied this procedure to samples from both the Barbados coral and Devils Hole. He found that “most, but not all, of the samples [of Barbados coral] that we thought were accurate are.” But the protactinium-thorium approach also supported the dating of Devils Hole samples. This should put to rest any question about the reliability of the Devils Hole ages, says Ludwig.

    Oceanographers are quite content with this split decision. It puts the last interglacial, as recorded in the coral, somewhere around 129,000 to 120,000 years ago—about where oceanographers always had it. Although the earlier warming at Devils Hole appears to be real, it must have been a local event, say climate researchers. Thomas Crowley of Texas A&M University notes, for example, that some marine records from midlatitudes in the Atlantic and Pacific show signs of an early warming without a massive melting of glacial ice. The ocean's most frigid waters might have retreated far enough northward to allow continental midlatitudes (including Devils Hole) to warm, but not so far as to allow wholesale glacial melting, he says.

    Winograd, though, continues to think that Devils Hole might be recording global climate changes. He says that if Edwards had checked more coral ages, the marine record might have fallen into line with Devils Hole. He notes that some corals from places such as Hawaii, Australia, and the Bahamas put the interglacial sea level rise as early as 134,000 years ago, and some maintain a high sea level as late as 110,000 years ago. “These dates, if correct, are clearly incompatible with [orbital] forcing,” says Winograd. “If Edwards had redated these, that would have been new.”

    Edwards will be extending his testing of coral ages beyond Barbados, but he suspects that the anomalous ages found elsewhere will not hold up. Geochronologist James Chen of the California Institute of Technology, who produced some of the older dates, agrees with Edwards, saying those dates can be “peculiar” and “should not be taken too seriously.”

    Other records are also yielding new support for the orbital theory. Oceanographer Maureen Raymo of the Massachusetts Institute of Technology has dated marine cores without using the shorter orbital cycles to measure time. She simply assumed that sediment accumulated steadily between the oceanographers' three traditional dates and then combined the oxygen isotope records of 11 cores longer than 800,000 years. The average ages of the seven ice-age deglaciations in that interval came “very, very close to SPECMAP ages,” she says.

    Even a terrestrial source—Jewel Cave, 1500 kilometers north-northeast of Devils Hole in the Black Hills of South Dakota—seems to be lining up on the side of orbital pacing. Cave carbonates analyzed by Derek Ford and his colleagues at McMaster University in Hamilton, Canada, and Joyce Lundberg of Carleton University in Ottawa, put the end of the penultimate ice age between 131,000 and 129,000 years ago, with the interglacial warmth lasting until about 119,000 years ago. “We've got it pretty much nailed exactly where the ocean-core people put it,” says Ford. If such claims hold up, the orbital theory of the ice ages will win a second round.

  15. Medical Imaging

    New Technique Maps the Body Electric

    1. David Ehrenstein
    1. David Ehrenstein is a science writer in Bethesda, Maryland.

    Medical imaging is built on gifts from physics, among them x-rays, nuclear magnetic resonance, and radioactive decay. Now, yet another imaging technology may be emerging from the physics world. At last month's Vancouver meeting of the International Society for Magnetic Resonance in Medicine, two National Institutes of Health scientists introduced Hall-effect imaging (HEI), a strategy for combining ultrasound with magnets and electrodes to map variations in the electrical conductivity of tissues.

    Lighting up.

    In a pig kidney (top), Hall-effect imaging maps conductivity variations (bottom), while conventional ultrasound (middle) is sensitive to density.

    H. Wen

    Those variations may allow clinicians to distinguish tumors, fatty plaques, or areas of ischemia from normal tissue, something that's difficult with ordinary ultrasound. The technique also should provide better contrast than ultrasound, while retaining a cost advantage over magnetic resonance imaging, say the developers, Han Wen and Robert Balaban, of the National Heart, Lung, and Blood Institute. So far, the most complex specimen that Wen and Balaban have imaged is a pig kidney, but other medical-imaging specialists believe their technique could eventually have broad applications.

    “This has the potential to be a very valuable approach to imaging,” says Nathaniel Reichek, head of cardiology at the Allegheny General Hospital in Pittsburgh and a leader in the field of cardiac imaging. He notes, however, that “the approach stands just on the doorstep of development. It's a very long path from there to widespread use.”

    The method makes use of the Hall effect. Electric charges (for example, ions in biological tissue) follow curved paths when they move in a magnetic field. Because positive and negative charges curve in opposite directions, they diverge, giving rise to a so-called Hall voltage. In biological tissue, an ultrasound beam can create the motion while an external magnet imposes the field. The size of the resulting Hall voltage is determined largely by the tissue's conductivity.

    To maximize the signal-to-noise ratio, Wen actually runs the system in reverse—applying voltage pulses through electrodes and picking up the resulting vibrations with an ultrasound detector. After some mathematical manipulation, the ultrasound signal gives a profile of the conductivity along the direction the detector is pointed. As with conventional ultrasound, the detector must be moved to collect a full three-dimensional image.
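
    Roughly speaking, each instant of the detected sound reports on one depth along the beam: the force that launches the sound at a given spot scales with the local current—which the tissue's conductivity helps determine—and with the magnetic field, while the one-way travel time tells the detector how deep that spot is. The following one-dimensional caricature is emphatically not Wen and Balaban's reconstruction; every name and number in it is invented, and it simply maps a made-up conductivity profile into arrival times and relative signal strengths.

```python
# Crude 1-D caricature of Hall-effect imaging's time-to-depth mapping.
# A voltage pulse drives current through tissue sitting in a magnetic field;
# here each depth is assumed to emit sound with strength proportional to its
# conductivity, and the one-way travel time identifies the depth.
# All values are invented for illustration.

SOUND_SPEED_MM_PER_US = 1.5     # approximate speed of sound in soft tissue

def simulate_profile(conductivity_s_per_m, step_mm=1.0, b_field=1.0, current=1.0):
    """Return (arrival_time_us, relative_amplitude) for each depth step."""
    readings = []
    for i, sigma in enumerate(conductivity_s_per_m):
        depth_mm = i * step_mm
        arrival_us = depth_mm / SOUND_SPEED_MM_PER_US   # one-way trip only
        amplitude = sigma * b_field * current            # toy source strength
        readings.append((round(arrival_us, 2), amplitude))
    return readings

# A low-conductivity fatty layer sandwiched between layers of muscle.
profile = [0.6, 0.6, 0.1, 0.1, 0.6, 0.6]                 # made-up values
for t_us, amp in simulate_profile(profile):
    print(f"t = {t_us:5.2f} microseconds   signal ~ {amp:.2f}")
```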

    Like ultrasound, HEI requires a continuous acoustic path from skin to imaged tissue, so it cannot see through bones or the air in lungs, for example. The technique also requires a magnet and a voltage pulser, which aren't needed for conventional ultrasound. But because HEI stimulates acoustic waves in the tissue, the ultrasound needs to travel only one way rather than round trip, as in conventional ultrasound imaging. That should give HEI deeper penetration and better resolution. Wen believes, however, that the method's real strength is its sensitivity to conductivity variations.

    He explains that when conventional ultrasound imaging of the interior of a blood vessel reveals a bulge, for example, “it's very hard to tell whether that bulge is just a harmless fibrous lesion, or whether it's actually a more dangerous fatty plaque. Now, potentially you could use this technique, because there's a big difference in conductivity between these two types of tissue.” Wen and Balaban also propose using HEI to observe the stages of tumor development, to diagnose ischemic (oxygen-deprived) heart tissue, and perhaps to distinguish breast tumors from cysts.

    Last summer, Wen tested the system on a slice of bacon and found that it revealed the fat and muscle layers with greater contrast than ultrasound could; he has since moved on to pig kidneys. Before applying the system to entire animals, he wants to add an improved voltage pulser and real-time image processing. (Currently, the analysis must be done after the data are collected.) He plans to begin imaging animals within the next year.

    Reichek thinks it's a worthwhile effort and hopes eventually to use HEI to diagnose scar tissue and other abnormalities of the heart: “Genuinely novel approaches to biological imaging don't come along all that often, and this is one of them.”

  16. Neuroscience

    A Mitochondrial Alzheimer's Gene?

    1. Marcia Barinaga

    Researchers have known for years that energy metabolism is abnormally low in the brains of patients with Alzheimer's disease (AD). Now, a team led by Robert Davis, at the San Diego biotech company MitoKor, and neurologist W. Davis Parker, of the University of Virginia School of Medicine in Charlottesville, offers a possible genetic explanation: Most AD patients seem to have inherited high levels of a mutant form of cytochrome oxidase (CO), a mitochondrial enzyme that is a key part of the cell's energy-producing machinery.

    The finding, which is reported in the 29 April issue of the Proceedings of the National Academy of Sciences, could lead to a diagnostic test for the disease. Beyond that, it supports earlier suspicions that poor energy metabolism may contribute to the neurodegeneration that occurs in AD. Indeed, Alzheimer's researcher Bruce Yankner of Harvard Medical School calls the result “by far the strongest evidence” yet that a CO defect is involved in triggering the disease.

    The current work is an outgrowth of Parker's 1990 discovery that CO activity is low in the blood platelets of AD patients. Subsequent studies also found low CO activity in Alzheimer's brains. The enzyme is there, but it doesn't work correctly, which suggests that it might be mutated, says Parker.

    Three of the 13 proteins that make up the CO molecule are encoded not in the nucleus, but in the mitochondrial DNA. The team looked at those genes, as studies have shown that children of mothers with AD are more likely to get the disease than are children of affected fathers. That suggests a mitochondrial gene could be causing a predisposition to AD, as virtually all of the thousands of mitochondrial genomes we inherit come from our mothers.

    The team found a relatively common variant of the mitochondrial genome in which two of the CO genes are consistently mutated. The variant turned up both in AD patients and normal controls, but it made up a higher percentage of the mitochondrial genomes in the patients: In 60% of the 506 AD patients examined, more than 20% of their mitochondrial genomes had the mutant form, while only 20% of the 95 controls (normal subjects and people with other neurological diseases) had mutation levels that high. One-fifth of the AD patients had mutation levels of 32% or more, higher than any controls. MitoKor is creating a diagnostic test based on the mutation assay.

    The team showed that these CO gene mutations have physiological effects by transferring mitochondria from AD patients into cultured cells that lack mitochondrial DNA. The resulting cells had impaired energy production, reflected by their high production of oxygen free radicals, the damaging molecules produced when energy-generating processes don't run to completion.

    The findings could provide several links with other areas of AD research. Free radicals, which damage cell membranes, have been implicated in the destruction of neurons in Alzheimer's. And 4 years ago Yankner and his colleague Dana Gabuzda linked CO activity to another possible cause of AD: They poisoned CO in cultured cells and found an increase in a direct precursor of β amyloid, the protein that forms the core of the senile plaques found in the brains of AD patients.

    The new work falls short of proving that CO mutations help cause AD, but the idea deserves to be explored, says Yankner: “It is not an area that has attracted a great deal of attention in Alzheimer's research, but it might now.”

  17. Evolutionary Biology

    Catching Lizards in the Act of Adapting

    1. Virginia Morell

    When evolutionary biologists transplanted small populations of Anolis sagrei lizards from Staniel Cay in the Bahamas to several nearby islands 20 years ago, they thought that the reptiles would go extinct. Indeed, that was the outcome the researchers planned to study. But instead of expiring, the small brown lizards, like Oklahoma land-rush settlers, flourished—even though their new homes differed dramatically from their original island. And the “extinction” study turned into a demonstration of evolution in action.

    Reptile moves.

    New home leads to shorter legs in Anolis lizards.

    Patti Murray/Animals, Animals

    In the current issue of Nature, Jonathan Losos of Washington University in St. Louis and his colleagues report that the transplanted lizards appear to be in the first stages of an adaptive radiation, undergoing the kind of body changes needed to inhabit a new environment. Such changes could in time turn each island's population into a separate species—the same process that led to the great diversity of finches that Darwin spotted on the Galápagos Islands, and to the galaxy of Anolis lizards themselves (150 species in the Caribbean alone). In particular, the researchers saw the lizards' hindlimbs grow shorter, an apparent adaptation to the bushy vegetation that dominates their new islands.

    The change was rapid, but others have also demonstrated the speed with which organisms can adapt in the wild (Science, 28 March, pp. 1880 and 1934). More important, by drawing on previous studies of anole adaptations, the Losos team was able to predict precisely how the lizards' bodies would change in response to their new homes. The leg-length change they observed might not be genetic, some researchers note; it could be environmental—the equivalent of a body-builder's muscles. But if it is rooted in the genes, then the study is strong evidence that isolated populations diverge by natural selection, not genetic drift, as some theorists have argued. “It's the first attempt to make a prediction about how the theory of evolution will work—and then show that it does happen as predicted,” says University of California, San Diego, evolutionary biologist Trevor Price.

    Once Losos and his colleagues realized that their transplanted lizards were surviving instead of becoming extinct, they decided to study how they were adapting. They focused on how the habitat change affected the length of the animals' hindlegs because Losos had previously demonstrated that the trait correlates with the lizards' preferred perch. For instance, species living on tree trunks have longer legs than do those living on twigs, apparently because they can trade the agility that comes with shorter legs—crucial on bushy vegetation—for the increase in speed that longer limbs provide. Because Staniel Cay, the home of the founding population, is covered with scrubby-to-tall forest, the anoles there are long-legged.

    But the 14 lizard-free islands that the researchers seeded with the Staniel Cay pioneers have only a few trees; most of their vegetation is bushy and narrow-leafed. “From the kind of vegetation on the new islands, we predicted that the lizards would develop shorter hindlimbs,” says Losos. And although they are not as ground hugging as a Chevy low-rider, statistical analysis shows that, after 10 to 14 years on the new islands, these anoles do indeed grow shorter rear legs than do their ancestors.

    For this change to be the first step toward the formation of new species, though, the change in limb length would have to be a genetic trait that would be passed on to the animals' progeny. But some researchers doubt that such rapid morphological change, coming in a few generations, could have a genetic basis. “They have seen a lot of evolution in a very short time,” notes evolutionary biologist David Wake of the University of California, Berkeley. Losos concedes that there is as yet no evidence that the shorter hindlimb length is strongly heritable. Without that, he notes, he can't exclude the possibility that the lizards' legs develop differently on the bushy vegetation than on trees because the different vegetation types exert different stresses on the leg bones.

    This kind of “developmental plasticity” would depend only on the type of environment and might not be transmitted to the offspring. “They don't have the evidentiary data to support either hypothesis,” says Wake. Indeed, he even questions whether the Losos team has in fact demonstrated that the legs get shorter. He points out that the researchers did not take leg measurements from the founding populations on the new islands—not realizing that they were going to thrive. Instead, they measured lizards living today on Staniel Cay and used those animals' legs as the baseline for comparison with the transplanted populations.

    To help resolve these issues, Losos is now raising anoles in a variety of laboratory environments to see if the lizards consistently change morphologically from the stresses of living on different surfaces. If they do, this would favor the developmental plasticity idea.

    He's also trying to take his natural radiation experiment one step further by introducing two closely related species of lizards on lizard-free islands. “That's the next step,” he says. “If they compete and alter their habitat use in the presence of this other species, will that lead to a [further] morphological change?” It's that kind of competition, between populations that had started to diverge and were then reunited, that researchers say led to the rich diversity of Galápagos finches.

    But whether the lizards continue to evolve depends largely on the winds of fate, says Losos. These islets are periodically swept by hurricanes that could whisk away every trace of anolian evolution—the outcome he originally pictured for them.

  18. Biodiversity

    Dams Drain the Life Out of Riverbanks

    1. Nigel Williams

    A team of researchers studying biodiversity on riverbanks in central and northern Sweden has some bad news for managers of water resources: Plant communities along the banks of rivers dammed for hydroelectric power contain significantly fewer species than those alongside neighboring free-flowing rivers. These findings, reported on page 798, are likely to fuel debates in Sweden and elsewhere on licenses for new dams and other water-regulation schemes. “The results of this study are likely to be applicable to regulated rivers in many other regions,” says ecologist Stuart Pimm at the University of Tennessee, Knoxville.

    Draining away.

    Plant life never fully recovers on the banks of dammed reservoirs.

    Christer Nilsson/Umeå University

    Riverbanks provide a variety of environments for plant life. The inflow of nutrients and sediments; changing water levels over the seasons, which can create specialist niches; and waterborne dispersal of seeds all contribute to a rich diversity of species. It has long been known that the construction of dams and changing water levels can have a disastrous impact on original bank communities, but there is considerable debate among ecologists about how well plant populations recover and reestablish themselves alongside new, regulated water courses. “There have been many studies on the physical changes in regulated water courses and how plants adapt to these changes, but no one until now has looked systematically at the effects of regulation on plant biodiversity,” says river ecologist Jack Stanford at the University of Montana's Flathead Lake Biological Station in Polson.

    Ecologist Christer Nilsson and colleagues Roland Jansson and Ursula Zinko of the University of Umeå in Sweden set out to answer this question by studying the vegetation at almost 90 sites alongside hydroelectric power “schemes”—each of which can incorporate a number of dams and reservoirs. Some dams in the region date back to the 1920s, and “the regulation patterns of these systems have not changed since the schemes were first built,” says Nilsson. The team also studied several undisturbed rivers in the same region as controls—something that is becoming difficult in many other areas of the world.

    The team compared both the simple number of species and an index of “species richness,” which compensates for differences of riverside areas. Half the samples were at main storage reservoirs, where water levels can vary substantially, and half were alongside smaller dammed reservoirs, or “impoundments,” downriver, which directly feed the turbines and show much less variation in water level.
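
    The story does not spell out how the richness index compensates for area, so the sketch below shows only one generic way of putting plots of different sizes on a common footing: rescale each species count with a species-area curve (S = c * A^z, with z around 0.25 as a common rule of thumb). The function, the exponent, and the numbers are all hypothetical; this is not necessarily the index the Umeå team used.

```python
# Generic, hypothetical illustration of area-compensated species richness,
# using the species-area rule of thumb S ~ c * A**z. Not necessarily the
# index used in the Swedish riverbank study.

def area_adjusted_richness(species_count, area_m2, z=0.25, reference_area_m2=1000.0):
    """Rescale an observed species count to a common reference area."""
    return species_count * (reference_area_m2 / area_m2) ** z

free_flowing_bank = area_adjusted_richness(45, area_m2=2500.0)   # invented counts
regulated_bank    = area_adjusted_richness(30, area_m2=600.0)
print(round(free_flowing_bank, 1), round(regulated_bank, 1))
```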

    The team found about one-third fewer species around storage reservoirs than at comparable undisturbed sites, and the species-richness index there was only about half that of the undisturbed sites. At impoundment sites, the richness index was comparable with that at control sites, but 15% fewer species were crammed into the much narrower band of habitat than in natural rivers.

    By studying the vegetation alongside regulation schemes built between 1 and 70 years ago, the team also found that the number of species growing along the water's edge appears to increase slowly after the construction of a dam. They found that the number appeared to increase for an average of 34 years alongside storage reservoirs, and for 13 to 18 years alongside impoundments. Once plant communities reach these ages, however, their development appears to halt. “No further increase in species numbers has occurred, and the community looks permanently different,” says Nilsson. “And evidence from the oldest dam schemes suggests that these communities are now declining in species number.”

    Some ecologists suggest that this impoverishment in riverside plant life might have a significant impact on the aquatic ecosystems. “We're just beginning to get some glimmers of the effects on biodiversity of changes in these habitats,” says Pimm. “These changes are likely to lead to higher local extinction rates for some species.”

    The Swedish results are likely to fuel current debate on the relicensing of schemes to regulate water. “Many Swedish hydroelectric schemes were built about 30 years ago and must now undergo assessment for relicensing under Swedish requirements,” says Nilsson. Concerns about their environmental impact have already led to proposals for a 5% increase in flow through the schemes. But, Nilsson says, “Our results suggest that for the same reduction in generating capacity, it would make more sense to close down one in 20 schemes and restore their original river courses.” Stanford agrees: “It's becoming increasingly clear that we have to restore some of the natural attributes of river systems.”

  19. Microbial Biology

    Microbiology's Scarred Revolutionary

    1. Virginia Morell

    Microbial biology is undergoing a renaissance, and in this special section we explore bursts of progress in areas ranging from how pathogens invade cells to the origins of life. News reports beginning on this page focus on the extremophiles—unusual microbes that flourish in harsh conditions—while the Articles that follow probe a broad range of topics, including plant-microbe interactions and the astonishing diversity of microbes.

    Carl Woese revised the tree of life and started a new age in microbial biology by recognizing a third domain of life—but he paid the price for his radical ideas

    Shortly after dawn on 3 November 1977, evolutionist Carl Woese picked up the morning newspaper from his front lawn in Urbana, Illinois, and thought of how people across the country were doing the same thing. “Soon,” he remembers thinking, “they'll be reading about me and my discovery.” And he wondered how much his world would change. For Woese (pronounced “woes”) had just announced his discovery of the Archaea—a group of one-celled organisms so different from all other living things, including bacteria, that he had placed them in a separate domain of life. It was as if a colony of alien creatures had suddenly been discovered living secretly in the backyards of suburbia. That day, as Woese had expected, the Archaea hit page one of newspapers from the Urbana News Gazette to The New York Times.

    However, Woese soon found that the world doesn't stop even if you've created a new paradigm for understanding life on Earth. Later that same day, he ordered coffee from a young person at a fast-food counter and asked if she knew who he was. She shook her head, so he offered some clues: “I'm Carl Woese, and I discovered the third domain of life.” Then she smiled in recognition. “Oh yeah,” she said. “You're Bob's dad.”

    Telling this story today in his paper-strewn office, Woese tilts his head back and laughs: “That [comment] put everything in perspective for me.” Keeping things in perspective has been a challenge for Woese, in part because after the first burst of publicity, not only the general public but also most microbiologists ignored his tripartite tree of life. Few researchers in the United States followed up on his work; grad students did not clamor at his door; and his funding stayed modest. “People didn't understand the size of my contribution,” says the slightly built 69-year-old scientist, who still works in the same converted lab, full of rusty sinks and pipes, where he made his discovery. “They had no appreciation for what not having a microbial phylogeny meant and therefore no appreciation for what having one would mean.”

    By now, Woese has gotten his due: a MacArthur Foundation grant, election to the National Academy of Sciences, and honors including the Leeuwenhoek medal, microbiology's top honor. There are even rumors of a Nobel Prize in the offing. Yet he still appears to feel the sting of that first rejection, perhaps because he never received praise from microbiology's leading figures when he needed it most. His story dramatically illustrates the price paid by those who start a scientific revolution—even a spectacularly successful one. “Carl knew he had seen the Truth, that he'd seen God in his table of gene relationships, and I knew it when I saw it,” recalls Norman Pace, an evolutionary microbial biologist at the University of California, Berkeley, who has been a friend and colleague of Woese's since the 1960s. “But very few other people in the field were convinced. They dismissed him or ignored him, and maybe because Carl is a shy, introverted person, he took this very hard.”

    Tree of life.

    The Woese family tree shows that most life is one-celled, and that the oldest cells were hyperthermophiles.

    SOURCE: OTTO KANDLER

    Most microbiologists today have just about forgotten the initial skepticism. Their science is bursting with discovery, and much of the ferment is due to what many now term the “Woesian revolution”: his single-handed revelation that microbes, far from being just one of life's five major kingdoms, are actually two of its three broad domains: Bacteria, Archaea, and Eukarya (which includes all multicellular organisms, from plants to people). The stunning implication: Most life is one-celled, and all Eukarya are but a twig on what amounts to a great microbial tree of life. Perhaps most important, Woese's research—triumphantly confirmed last summer by DNA sequencing of a complete archaeal genome—has enabled microbiologists to place their organisms in the Darwinian fold. With an evolutionary tree in hand, biologists can look at microbes as they do the rest of life—as organisms with histories and evolutionary relationships to one another, and to all other organisms.

    “It's as if Woese lifted a whole submerged continent out of the ocean,” says Günter Wächtershäuser, an evolutionary biologist at Germany's University of Regensburg. “Now we can look at the continent—the microbial biosphere—in detail and with purpose.” That, he says, “finally makes biology a complete science, because for the first time the study of evolution includes all organisms.”

    From physics to RNA

    Perhaps part of the reason Woese had a hard time convincing fellow microbiologists about his work was that he wasn't officially a microbiologist, or even a biologist for that matter. As an undergraduate at Amherst College in Massachusetts, he studied physics, and his Yale University doctorate was in biophysics—the term first used to describe molecular biology. He stumbled into the microbial world as a postdoc at Yale in the mid-1950s; there he investigated the development of ribosomes (the cell's protein-synthesis machines) and became interested in the origin of the genetic code. After stints at General Electric in New York and France's Louis Pasteur Institute, he got a job at the University of Illinois in 1964 and has been there ever since.

    Woese wanted to unravel the evolutionary history of DNA and RNA; to do this, he knew he needed a family tree, or phylogeny, that encompassed all organisms. At the time, most of the microbial world was lumped into a single group known as the prokaryotes, defined as organisms that lack a nucleus. “We had a real evolutionary understanding of the plants and animals, but that left out the whole world of bacteria. So I thought that's what I would do first: Bring in the prokaryotes,” says Woese.

    Counting catalogs. Woese with one of the thousands of oligonucleotide films he analyzed. DEWEY HENTGES

    He was not the first to tackle this problem. In the 1930s, the leading microbiologist C. B. van Niel of Stanford University's Hopkins Marine Station had cited the classification of the bacteria as the key unresolved issue in microbiology. But after decades of fruitless labor trying to classify bacteria by their shape and metabolism, he and his former student, Roger Stanier of the University of California, Berkeley, decided that the bacteria simply could not be phylogenetically ordered. “The ultimate scientific goal of biological classification cannot be achieved in the case of bacteria,” Stanier wrote in the second edition of his leading textbook, The Microbial World. In Woese's view, that attitude consigned microbial research to the Dark Ages. “It was as if you went to a zoo and had no way of telling the lions from the elephants from the orangutans—or any of these from the trees,” he says.

    Yet that is where bacterial classification stayed for the next 2 decades. Woese, however, didn't think the problem insoluble. “I hadn't been trained as a microbiologist, so I didn't have this bias,” he explains. His physics background led him to believe that “the world has deep and simple principles, and that if you look at it in the right way,” you can find these. What's more, he was convinced that the molecular revolution, then in its infancy, offered just the tools to decipher the problem. He turned to ribosomal RNAs (rRNAs), nucleic acid sequences found in ribosomes. Woese knew from his previous research that these sequences are among the most conserved elements in all organisms, making them excellent recorders of life's evolutionary history. They are also abundant in cells, so they're fairly easy to extract. And because they are found in all organisms, from Escherichia coli to elephants, their similarities and differences could be used to track every lineage of life. Woese calls them “the ideal tool.” But this, too, went against the prevailing scientific tide, says Wächtershäuser, because everyone else was building family trees using proteins, which were then much easier to work with.

    In 1966, of course, none of today's efficient methods for sequencing genetic material existed. Instead, Woese relied on a tedious, labor-intensive technique known as oligonucleotide cataloging. In this method, an rRNA molecule, which is a long string of four nucleotides (adenine, cytosine, uracil, and guanine, or A, C, U, G), was broken into small fragments by cutting it at every G residue. Each of these fragments or oligonucleotides was then broken into subfragments with enzymes that sliced at different residues; this allowed Woese to reconstruct the sequence of the original rRNA fragment. These oligonucleotides were usually quite short by today's standards—from six to 20 nucleotides—but long enough that most occurred only once in an rRNA. So Woese could seek matching oligonucleotides in other microbes to determine how closely they were related.
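
    The comparison step lends itself to a simple calculation. What follows is a minimal sketch in Python, not Woese's actual procedure, of how two catalogs might be scored for similarity, assuming a Dice-style overlap coefficient computed over the longer oligonucleotides (six bases or more), the ones informative enough to be nearly unique within an rRNA; the catalogs themselves are made up for illustration.

        # Illustrative sketch only: scoring two oligonucleotide catalogs with a
        # Dice-style overlap coefficient. The "catalogs" here are invented examples,
        # not real rRNA data.

        def catalog(fragments, min_len=6):
            """Keep only oligonucleotides long enough to be (nearly) unique in an rRNA."""
            return {frag for frag in fragments if len(frag) >= min_len}

        def similarity(cat_a, cat_b):
            """2 x shared oligonucleotides / total oligonucleotides in both catalogs."""
            shared = len(cat_a & cat_b)
            return 2 * shared / (len(cat_a) + len(cat_b))

        # Hypothetical catalogs for two organisms; each fragment ends at a G,
        # as in the cutting scheme described above.
        organism_1 = catalog(["AAUCCG", "CUUAACG", "UUCCAAG", "ACG", "AACCUUG"])
        organism_2 = catalog(["AAUCCG", "CUUAACG", "UACCUAG", "ACG", "AACCUUG"])

        print(f"similarity = {similarity(organism_1, organism_2):.2f}")  # higher = more closely related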

    Physically, these RNA fragments appeared as fuzzy spots on film, and Woese stored thousands of such films in large, canary-yellow Kodak boxes. It took him a full year to make and read his first catalog. He was one of “only two or three people in the world” to learn this backbreaking technique, he says, and he labored alone. “Carl was the only one who could read the films,” recalls William Whitman, a former graduate student of Woese's and now a microbiologist at the University of Georgia, Athens. Soon, every flat surface in Woese's office sported a light box with an oligonucleotide film clipped to its surface. Other films hung in front of Woese's “luminescent wall”—a sheet of translucent plastic with lights inside that blocked the windows and stretched the length of his room. “He stood there all day, every day, looking at these, searching for patterns,” says Whitman. A stash of Dr Pepper and a chin-up bar kept him going. Nevertheless, the work was of such mind-numbing tedium that Woese himself says it left him “just completely dulled down.”

    He worked this way for a decade, completing the rRNA sequences of about 60 diverse bacteria and arranging them by genetic similarity. Slowly he began unraveling the tangle of microbial relationships, publishing phylogenies of chloroplasts and mitochondria—cellular organelles thought to have originated as symbiotic bacteria—and groups of bacteria. In some cases, he made surprising findings, for example that the anaerobic bacteroids and the aerobic flavobacteria are related. During this period, Woese received modest $50,000 grants from NASA and served the university by teaching molecular biology. But with each untangled group, he added another twig to his tree of life.

    Then in 1976, he attached an entire limb. Ralph Wolfe, a close colleague at the University of Illinois, suggested that Woese try his technique on an odd group of bacteria that produced methane as a byproduct. “We knew of about eight different methanogens at that time, and nobody knew where they fit,” recalls Wolfe. “They were diverse morphologically—rods, spirals, marblelike cells—but they all had the same kind of biochemistry. That's what intrigued Carl.”

    But when Woese studied their sequences, the methanogens did not register as bacteria. “They were completely missing the oligonucleotide sequences that I had come to recognize as characteristic of bacteria,” he explains. Thinking the sample had somehow been contaminated, he ran a fresh one. “And that's when Carl came down the hall, shaking his head,” says Wolfe. “He told me, ‘Wolfe, these things aren't even bacteria.’ And I said, ‘Now, calm down, Carl; come out of orbit. Of course, they're bacteria; they look like bacteria.’” But, as Woese now knew, morphology in bacteria meant nothing. Only their molecules told the story. And the molecules proclaimed that the methanogens were not like any other prokaryote or eukaryote—they were something unto themselves, a third branch of life.

    The silent treatment

    That's what Woese and NASA put in their press release the next year, announcing what he then called the archaebacteria (he has since dropped the “bacteria”); the scientific paper appeared in the Proceedings of the National Academy of Sciences (PNAS), with Wolfe as one of the co-authors. But it was the newspaper accounts of the discovery that most scientists first read, and by and large, they were skeptical. Woese's tree, after all, overturned one of biology's most basic concepts—that life was divided into two large groups; it seemed outrageous to claim a third.

    And Woese's solitary years at his light table had left him with a reputation as an odd person, “a crank, who was using a crazy technique to answer an impossible question,” as one researcher put it. His tiny snippets of rRNAs were considered too fragmentary to be reliable indicators of evolutionary relationships, says Pace. Molecular biologist Alan Weiner of Yale University recalls that many leading biologists thought Woese was “crazy,” and that his RNA tools couldn't possibly answer the question he was asking.

    Few said anything to Woese directly, or even responded in journals. “The backlash was rarely if ever put into print,” says Woese, “which saddens me because it would be helpful to have that record.” Instead, many researchers directed comments to Wolfe, who was well established and highly regarded. Recalls Wolfe: “One Nobel Prize winner, Salvador Luria, called me and said, ‘Ralph, you're going to ruin your career. You've got to disassociate yourself from this nonsense!’” Ernst Mayr of Harvard University scoffed to reporters that the notion of a third domain of life was nonsense, an opinion that he and a handful of other skeptics hold to this day. “I do give him credit for recognizing the archaebacteria as a very distinct group,” says Mayr, who insists on keeping the word bacteria attached to the Archaea. “However, the difference between the two kinds of bacteria is not nearly as great as that between the prokaryotes and eukaryotes.”

    Woese's retiring nature didn't help. He says he has “an almost visceral” dislike of meetings, feeling that they are largely political gatherings. He seldom even attended the annual meetings of the American Society for Microbiology (ASM) and so had few opportunities to argue in person on behalf of the Archaea. It was left to Wolfe to parry the criticisms and catch the gossip. At one ASM meeting in the early 1980s, for example, R.G.E. Murray, the editor of microbiology's bible, Bergey's Manual, passed Wolfe in the hallway. He didn't bother to stop, but “just waved his hand dismissively and muttered, ‘The archaebacteria are only bacteria.’” (Murray finally put the Archaea in the manual in 1986, but kept them as a subgroup within the kingdom Prokaryotae.)

    In Germany, however, Woese was hailed as a hero from the beginning, thanks largely to the influential microbiologist Otto Kandler, who had also realized from his analysis of methanogen cell walls that these microbes were different from other bacteria. Thus, he was prepared to accept Woese's work. “Kandler's word was not taken lightly,” says Wächtershäuser. “When he said we should study three things [the three branches of life], as Woese was saying, then everyone did.” German researchers such as Wolfram Zillig of the Max Planck Institute near Munich plunged into the chemical structure of the Archaea, ultimately coming up with more evidence of the group's uniqueness. It was Kandler who organized the world's first Archaea conference in 1981; the previous year, he held a special Archaea seminar at the Big Hall of the Botanic Institute at the University of Munich. Knowing that Woese “was a little bit depressed by all this resistance” in the United States, Kandler arranged for a church brass choir to “blow full power” in greeting as Woese stepped onto the podium.

    But at home, Woese waited in vain for a response from the two microbial biologists he expected would recognize his work's importance: van Niel and Stanier (both now deceased). He typed their important papers into his computer and today can quickly summon quotations to show that his work solved the problem they had pondered throughout their careers. Worse, a few years later, a colleague used Woese's method to resolve another question about microbial relationships and received a warm letter from Stanier. “I felt hurt and jealous,” Woese recalls, “because Stanier had never written to me praising what I was doing.”

    At times the silence left Woese fuming, says Pace: “Even now, he sometimes lashes out at people on whose shoulders he stood and who, he thinks, failed him because they didn't recognize the Archaea or the universal tree.” However, says Weiner, Woese may have been expecting too much. “He was making a claim of extraordinary scope. He was saying that we had missed one-third of all living things. People don't like to be told that they're missing it big—and that what's missing is big—particularly when it didn't seem to make a difference” to the rest of the work in microbiology at the time.

    More than a decade has passed since Woese was badly treated, and many of his friends and colleagues say that they wish he could put that time behind him and accept the accolades now streaming in. “We all know how important he is,” says Weiner. But “it's as if he doesn't hear it. When someone suffers a rejection like that, how long does it take to undo all the damage, all the years when the field blasted him? We can say, ‘Oh, that's just the way science works.’ But it was a personal experience for him. He's the one who had to live through it.”

    Woese only shrugs and smiles wryly when asked about the snubs of the past. He says he drew strength from those such as Wolfe, Pace, and Kandler, who believed in him. “That's what happens when you break a paradigm; people scoff, they don't treat you seriously,” he says. “But I'd read Thomas Kuhn [The Structure of Scientific Revolutions], so I knew exactly what was going on.”

    Yet even in the United States, the tide had begun to turn in Woese's favor by 1980. The previous year, he and Wolfe had published a review article on the methanogens, clearly establishing their differences from the bacteria. “That went a long way toward stopping the doubters,” says Wolfe. “It made people stop and think.” Other microbiologists began finding more ways in which the Archaea are unique and discovering other unusual organisms that fell into the group, such as the salt-loving halophiles and the sulfur-metabolizing “thermoacidophiles.” And although grad students still didn't beat a path to Woese's door, his own output of landmark papers increased, with major studies on aspects of the three domains appearing in Science and PNAS. Invitations to give lectures and attend conferences poured in; true to form, Woese declined most—although he agreed to give Berkeley's Stanier lecture.

    The new phylogeny is the basis for a range of new explorations in microbial biology, from the origin of life to its diversity. “It's the model to reckon with,” says John Baross, an evolutionary microbiologist at the University of Washington, Seattle. “Even if you don't like its conclusions, [the Woesian tree] is the best we've ever had and have; it's here to stay.” The final genetic seal of approval came with last summer's sequencing work. The complete sequence of Methanococcus jannaschii revealed a raft of genes unlike any others known, confirming that the Archaea are indeed a unique domain and showing that they have closer ties to eukaryotes than to bacteria (Science, 23 August 1996, pp. 1043 and 1058). Today, every major microbiology textbook carries the Woesian tree of life, although general biology texts are still catching up.

    For his part, Woese continues to peruse the genetic differences among microbes—only now he summons sequences onto his finger-smudged computer screen, where a program scans for similarities. “Microbial diversity, like microbial evolution, has become a real science,” says Woese, ever the physicist at heart. “The same [is true] with microbial ecology. And it's all because we can now interpret these things within the framework of the tree.”

  20. Microbial Biology: Tracing the Mother of All Cells

    1. Virginia Morrell

    When evolutionist Carl Woese unveiled the once-hidden land of the Archaea in 1977, he not only revolutionized microbial classification but also ushered in a new era in one of biology's grandest, if most problematic, pursuits: understanding the origins of life. Researchers had been exploring this question with test-tube experiments on organic molecules for years, but those studying living microbes had little to add to the debate. By sketching the first complete family tree of the microbes, however, Woese allowed researchers to search for organisms near the tree's base—close to the ancestral one-celled form. The features of those organisms may open a window on the era that preceded cells, when free-floating molecules first developed the ability to replicate and evolve.

    “Before Woese, microbiologists simply wouldn't touch the question of what the last common ancestor looked like,” says Alan Weiner, a molecular biologist at Yale University. Now hardly a month goes by without a fresh insight or meeting on the subject, from an April Nobel forum in Stockholm to an American Chemical Society symposium in June. The results so far suggest that the first organisms were probably not bacteria but archaea. And instead of evolving in a mild soup of organic molecules—as was first suggested by Charles Darwin—these organisms may have been born in what humans would consider marginal environments, such as boiling, sulfurous pools or hot, mineral-laden, deep-sea volcanic vents. “For the first time, we can go into the deep, totally unobservable past to test our predictions about the early evolution of life,” says Günter Wächtershäuser, an evolutionary biologist at Germany's University of Regensburg.

    The leading theory has long held that life began when lightning charged a warm soup of organic molecules with energy and caused them to begin replicating; variations include the possibility of a cold ocean as the cradle of life, an idea now advocated by biochemist and longtime origin-of-life expert Stanley Miller of the University of California, San Diego.

    But to many researchers, Woese's tree points to a different conclusion. Originally based on RNA and now confirmed by other genetic sequences (see main text), the tree separates all living organisms into three domains: Archaea (microbes that often inhabit extreme environments), Bacteria, and Eukarya (all multicellular animals and plants). By comparing sequences from the three domains, Woese could trace where various groups of organisms had branched off. And by following the branches back down the tree, Woese, and later other scientists, could identify organisms close to the shared common ancestor for the archaea and bacteria (see diagram in “Microbiology's Scarred Revolutionary”). Most of these turn out to be thermophiles or hyperthermophiles—Archaea and Bacteria that thrive at 80°C or higher.
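
    The logic of reading a branching order out of sequence comparisons can be shown with a toy calculation (and only that; the analyses behind the Woese tree are far more involved). In the sketch below, the pairwise distances are invented numbers chosen merely to echo the relationships described here, and the closest pair of groups is merged repeatedly, so that the order of merging traces the branches from the tips back toward the root.

        # Toy example: infer a branching order by repeatedly merging the closest
        # clusters (average-linkage). Distances are hypothetical, picked only to
        # mirror the relationships described in the text.
        from itertools import combinations

        dist = {
            frozenset({"archaeon A", "archaeon B"}): 0.2,
            frozenset({"archaeon A", "eukaryote"}): 0.5,
            frozenset({"archaeon B", "eukaryote"}): 0.5,
            frozenset({"archaeon A", "bacterium"}): 0.8,
            frozenset({"archaeon B", "bacterium"}): 0.8,
            frozenset({"eukaryote", "bacterium"}): 0.8,
        }
        clusters = [{"archaeon A"}, {"archaeon B"}, {"eukaryote"}, {"bacterium"}]

        def cluster_distance(c1, c2):
            # Average distance between all cross-cluster pairs (UPGMA-style).
            pairs = [(a, b) for a in c1 for b in c2]
            return sum(dist[frozenset({a, b})] for a, b in pairs) / len(pairs)

        while len(clusters) > 1:
            i, j = min(combinations(range(len(clusters)), 2),
                       key=lambda ij: cluster_distance(clusters[ij[0]], clusters[ij[1]]))
            print(f"join {sorted(clusters[i])} + {sorted(clusters[j])}")
            clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [clusters[i] | clusters[j]]

    In this toy run the two archaea pair first, the eukaryote joins them next, and the bacterium joins last, echoing the archaea-eukaryote affinity described in the main text.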

    That suggests that the last common ancestor was a hyperthermophile, says John Baross, an evolutionary microbiologist at the University of Washington, Seattle. Extrapolating back from the first fossil evidence of microbes at 3.8 billion years ago, he and others estimate that this organism lived about 4.3 billion years ago. Going even further back, researchers such as Everett Schock of Washington University in St. Louis conjecture that life began in an anaerobic, “nasty and hot” environment much like Yellowstone's sulfurous hot springs or a deep-sea vent. In addition, many of the most ancient organisms on the Woese tree are autotrophs, with metabolism based not on organic compounds but on inorganic material, such as carbon dioxide or hydrogen sulfide. Some scientists argue that the first cell was an autotroph, too.

    That scenario matches many scientists' view of the early Earth. For example, Stanford University geochemist Norman Sleep has proposed that the ancient atmosphere contained high, greenhouse levels of carbon dioxide and was constantly being bombarded by meteorites and asteroids; some of these were as “big as Mount Everest” and sparked enough heat to boil off a 3-kilometer-deep ocean, says Baross. “That would have caused the death of any organism,” he adds, “so where are the safe places on early Earth? In deep-sea volcanic vents, or subsurface sea-floor cracks”—just the places the primitive hyperthermophiles love today.

    Trying to divine the branching pattern of the tree—and hence which organisms lie near the base of it—from the genetic sequences of living microbes has pitfalls, however. Chief among them is that early in history, “life was not chaste,” as Antonio Lazcano, an evolutionary biologist at the National University of Mexico in Mexico City, puts it. Clues from living bacteria suggest that, like characters in a John Updike novel, different species freely swapped genetic material or simply picked up exogenous free-floating DNA, leaving a confusing trail.

    Still, Wächtershäuser has recently argued that some traits are not easily swapped—notably those related to temperature. He theorizes that it is impossible for “a hot organism to exchange genes with a cold organism, and vice versa,” because the proteins regulating cellular functions are tuned to work at a specific temperature. The implication is that the first cell did indeed like it hot. What's more, because it is easier to activate proteins at warm-to-hot temperatures, it makes sense that life evolved in a heated soup, then adjusted to lower temperatures as Earth cooled, he says.

    But even if the first cells inhabited hot springs, the molecules that preceded them could have originated in a milder, “Club Med” environment, say Miller and others. They point out that even more than 4 billion years ago, the ancestral cell was already sophisticated, with a genome and the ability to replicate. “The hyperthermophiles may be ancestral to later life, but they are hardly primitive,” says Miller. “They are as complicated as we are.” He and Lazcano argue that life could just as easily have started in warm, balmy seas—or even cold ones—and later given rise to both hyperthermophiles and lower-temperature organisms. A megameteor strike might then have killed off all but the heat-loving organisms, leaving them to give rise to all later ones, says Miller.

    Those who favor an archaeal origin of life admit that they need to go back further into the past. Baross argues that hydrothermal vents themselves are windows into the nature of the early Earth—and that the genes of some vent organisms may therefore be living relics. Indeed, Wächtershäuser and Claudia Huber, a chemist at the Technical University of Munich in Germany, took their scenario further back into the past by showing that chemical reactions at vents could lead to the synthesis of key biological molecules (Science, 11 April, p. 245).

    Then again, none of these scenarios may be right. Perhaps life evolved elsewhere in the solar system and came blasting into Earth on a bolide. Not even that would knock down the new tree of life, says Norman Pace, an evolutionary microbial biologist at the University of California, Berkeley. “I wouldn't be surprised that if we dredge up some martian creature, we'll find it rooted in Woese's big tree”—making that tree truly universal.

  21. Geomicrobiology: Life Goes to Extremes in the Deep Earth--and Elsewhere?

    1. Richard A. Kerr

    Life underground has always been abundant in literature and folklore—dragons guarding their caves, fairy folk under the hill, dwarves mining precious metals in the depths. But when scientists have considered life's role in the subsurface, they have been a good deal less imaginative. Until recently, the conventional wisdom was that the rich life on the surface petered out to sterility not far beneath the soil, and occasional claims that some deep, ancient rock had yielded living microbes were greeted with extreme skepticism. No longer.

    Hell's bacterium. Bacillus infernus dwells 2.7 kilometers down, at over 60°C, without oxygen. HENRY ALDRICH/UNIVERSITY OF FLORIDA

    In recent years, microbiologists and geologists have teamed up to probe the deepest limits of life, and they have discovered just how lacking their imagination has been. They have found organisms trapped almost 3 kilometers beneath Virginia for millions of years, microbes living off bare rock and water a kilometer down in the Columbia Plateau, and signs of life subsisting on glass and mineral-rich water beneath the midocean ridges. It seems that wherever a source of energy exists, life is present. And although there are obvious limits to what life can endure, they are turning out to be far less restrictive than once assumed. “The only true limit on the depth of life is temperature,” says hydrogeologist Tullis Onstott of Princeton University. At the moment, the most thermophilic, or heat-loving, organism known lives at temperatures up to 113 degrees Celsius—which would allow life down to perhaps 5 kilometers beneath the surface.

    Life in the slow lane. All cells in this 180-meter-deep sample stain blue (top), but only active ones stain red (bottom), showing that many deep microbes are inactive. S. A. NIERZWICKI-BAUER/RPI

    Isolated in the depths for millions of years, microbes have adapted with exotic metabolisms and very slow rates of reproduction. And they are shaping the deep Earth and its waters to a degree that researchers are only beginning to determine. “We have convinced people that there are microorganisms in the subsurface and that they are responsible for geochemical changes there,” says microbiologist Thomas Kieft of the New Mexico Institute of Mining and Technology. Now, he says, the focus will be on gauging the extent of those changes—and judging whether life may await discovery below the inhospitable surfaces of other planets and moons.

    Visitors from the past. Perhaps the most dramatic claim for deep life on Earth has come from a hole drilled in northeastern Virginia under the aegis of the Department of Energy's (DOE's) recently terminated Deep Subsurface Program. In 1992, Texaco and the Eastern Exploration Co. tapped into a pocket of life 2.8 kilometers down that may have persisted—out of touch with surface life—since the rise of the dinosaurs. The drillers were looking for oil in a 4-kilometer-deep rift called the Taylorsville Basin; the rock now filling this basin was laid down as lake and stream sediments starting 230 million years ago, and has since been buried by 500 meters more of sediment—heated, uplifted, cooled, and partially eroded.

    After all that time and turmoil, cores retrieved from depths of about 2.75 kilometers—where temperatures ran up to 75°C—contained microbes that flourished in lab cultures, as the DOE team has documented in a string of reports. Some of the microbes apparently extract energy from ancient organic matter in nearby rock, using minerals such as iron or manganese (instead of oxygen) to oxidize the carbon. In 1994, John Parkes, of Bristol University in the United Kingdom, and his colleagues made the same claim for microbes retrieved from beneath more than 500 meters of Pacific Ocean sediment. And in 1995, Todd Stevens and James McKinley of Pacific Northwest National Laboratory (PNNL) in Richland, Washington, found microbes living 1.5 kilometers beneath the Columbia Plateau in bare basalt rock. Finally, in March, Lee Krumholz of the University of Oklahoma and his colleagues reported finding viable bacteria hundreds of meters down in New Mexico. They were living off 100-million-year-old organic matter; the water that might have carried these microbes to the depths left the surface 30,000 years ago.

    All these spectacular reports had to overcome one key doubt: that the microbes really came from the depths and not from contamination, a problem that sank many previous claims of deep life. Ensuring cleanliness is no small task when working a kilometer or two down a 20-centimeter-wide hole flooded with the fluid “mud” needed for drilling. But potential contamination can now be spotted by looking for tracers—highly fluorescent dyes or other chemicals that are injected into the hole before coring. The ultimate tracers, however, are bacteria themselves. If the retrieved microbes bear little resemblance to those found in the likely sources of contamination, they are probably the “real McCoy.” Such safeguards have convinced researchers that, in the recent studies, contamination is not masquerading as deep life. “It's really clear now that there are microbes in all these deep subsurface environments,” says ground-water microbiologist Derek Lovley of the University of Massachusetts.

    Thousands of leagues under the sea. The rocky depths beneath the continents aren't the only deep environments for life. Other researchers are finding evidence of microbial activity below the 60,000-kilometer-long system of volcanic midocean ridges that girdles the planet. Bacteria are an important part of the bizarre ecosystems found around midocean ridge hot springs, where microbes survive on sulfides leached from the ocean crust by deep-circulating seawater. But in recent years, the first oceanographers on the scene of volcanic disturbances along ridge crests have discovered great white billows of material, including much bacterial debris, spewing from the vents. This discovery immediately set researchers debating whether these microbes are living throughout the hundreds of meters of deep circulating fluids or just within the upper couple of meters of the crust. Now, two independent groups—Martin Fisk and Stephen Giovannoni of Oregon State University, and Terje Torsvik of the University of Bergen in Norway and his colleagues—have evidence that some bacteria, at least, come from deep down.

    About 5% of ocean crustal rock is glass, and “glass will weather on its own because it is unstable,” notes Fisk, who presented his group's findings at a March workshop in Washington, D.C. But microbes are apparently accelerating the process to their own benefit. Both groups have seen signs that glass in rock drilled from deep within ocean crust has suffered such microbially enhanced weathering, including glass with deep pits and glass etched in feathery patterns suggestive of microbial activity. Fluorescent chemical tags that attach only to DNA light up the pits, presumably marking the responsible bacteria. Given that and the volume of vented bacterial debris, Fisk concludes that “you have a subsurface aquifer that is just humming with life.”

    A hidden biosphere? Just how much life is down there? It's hard to be sure; Fisk's notion of a richly populated subsurface needs to be substantiated by drilling. And Onstott calculates that deep life accounts for just 0.1% of the world's total biosphere. But he's being conservative: He is only counting bacteria that can be cultured in the laboratory. Deep samples often yield 100 or 1000 times more cells that appear to be intact but don't grow in culture. If they were truly alive, and preferred some environment not offered in the lab, the estimate of deep biomass could soar toward that at the surface.

    Certainly, the amount of life drops off with depth. Agricultural fields might host a billion or more culturable cells per gram of soil, notes Onstott, but abundance falls exponentially below the land surface, according to the best available data. In the deep Taylorsville Basin, the figure drops a millionfold, to between one and perhaps 10,000 culturable cells per gram.
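
    Those two figures are enough for a rough back-of-the-envelope model. The sketch below is purely illustrative: it fits a simple exponential decline to roughly a billion culturable cells per gram at the surface and a millionfold drop by the roughly 2.7-kilometer depth of the Taylorsville cores, then interpolates a few intermediate depths. The decay constant is inferred only from the numbers quoted in this article.

        # Back-of-the-envelope only: exponential decline in culturable cells per gram,
        # anchored on the figures quoted above (~1e9 near the surface, a millionfold
        # drop by ~2.7 km). Real depth profiles are far noisier than this.
        import math

        surface_cells = 1e9                # culturable cells per gram of topsoil
        deep_cells = surface_cells / 1e6   # "drops a millionfold"
        deep_depth_km = 2.7                # approximate depth of the Taylorsville cores

        # N(z) = N0 * exp(-k * z)  =>  k = ln(N0 / N(z)) / z
        k = math.log(surface_cells / deep_cells) / deep_depth_km

        for z in (0.0, 0.5, 1.0, 2.0, 2.7):
            cells = surface_cells * math.exp(-k * z)
            print(f"{z:4.1f} km: ~{cells:,.0f} culturable cells per gram")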

    And researchers are being reminded time and again how brutal life can be at its extremes. The biggest problem isn't temperature, which increases with depth. The great stress in the first few kilometers of continental rock seems to be hunger: The deep subsurface suffers from the greatest scarcity of nutritional resources on the planet. Microbes rapidly remove oxygen, nutrients, and consumable organic matter from ground water as it descends, leaving meager pickings for their deeper brethren. Onstott suspects that in the deep Taylorsville Basin, microbes have been eking out an existence without resupply from descending ground water for more than 180 million years.

    In analytical and modeling work done with Hsin-Yi Tseng of Princeton and presented last December at the meeting of the American Geophysical Union (AGU), Onstott argued that geologic events conspired first to sterilize the 230-million-year-old rock with extreme heat. Then, uplift of the strata gave surface water the high ground; from there it could sink into deep rock, ferrying microbes down with it. Finally, erosion removed the highlands from which the ground water was sinking and so cut the deep rock off from the surface. Today, the researchers calculate, topographic relief is so low, and the rock so nearly impermeable, that water would take at least 50 million years to seep to the sampled depth—perhaps as long as 180 million years. “It becomes geologically and hydrologically unreasonable to get [the microbes] there recently,” Onstott concludes. Most other researchers are not yet prepared to agree that such depths are completely cut off, however, if only because such interminable isolation is “just so difficult to prove,” notes Lovley.

    Novel life forms. If Onstott is correct, the Taylorsville microbes, although not quite a slice of life preserved from the age of the dinosaurs, would be direct descendants of surface life from that time. Most are turning out to be new species, says David Balkwill of Florida State University, who is using ribonucleic acids to identify the Taylorsville microbes. He and David Boone of the Oregon Graduate Institute have already found the first member of the bacterial genus Bacillus, B. infernus, that cannot tolerate oxygen.

    Thanks to the isolation and the scarcity of nutrients, life at great depths must be extremely slow paced as well as sparse. Using the amount of carbon dioxide produced by the oxidation of organic matter as a guide, Onstott estimates that microbial activity in the deep Taylorsville Basin occurs at a rate 10⁻¹⁵ that in soil. A deep bacterium “doesn't do much,” says microbiologist David White of the University of Tennessee, Oak Ridge. “It's very good at doing nothing,” or as close to doing nothing as life gets. Onstott guesses that cells may be dividing once a year or once a century, rather than every few minutes, as in an infection, or every few hours, as in soil.

    Other researchers can't imagine even that level of activity, noting that the microbes may not be “alive” all the time. “Bacteria don't have to grow,” notes marine microbiologist Holger Jannasch of Woods Hole Oceanographic Institution in Massachusetts. “They will sit there until someday they have a chance to grow, [but] we really haven't understood yet how microorganisms, especially their spores, survive that long.” A few of the Taylorsville Basin microbes do produce inactive spores of the sort capable of prolonged “suspended animation”; the others might be “salted out” most of the time, says microbiologist Tommy J. Phelps of Oak Ridge National Laboratory in Tennessee. He imagines a microbe caught in a microscopic pocket of highly concentrated brine, with the salt crystallizing around the cell and dehydrating it. When conditions improve, it might rehydrate, revive, and grow.

    Alternatively, deep microbes may be making a steady, although not spectacular, living by feeding on the rock itself. In 1995, Stevens and McKinley of PNNL reported that water pumped from as deep as 1500 meters in the volcanic rock of southeastern Washington state contained a surprising abundance of microbes, hydrogen, and methane (Science, 20 October 1995, p. 377). To them, that mix looked like a self-contained community. Weathering of the rock by the 30,000-year-old water would produce hydrogen, which the microbes could use as an energy source to convert dissolved carbon dioxide to biomass, producing methane as a byproduct.

    Now, Stevens, McKinley, and James Fredrickson, also of PNNL, have created just such an ecosystem in the laboratory. They have coaxed a variety of microbes to grow on a diet of crushed basaltic rock that reacts with oxygen-free water to produce hydrogen—the only energy source in their ecosystem. At the AGU meeting, the PNNL group reported that the weathering of a number of common silicate- and iron-containing minerals produces hydrogen. They concluded that microbial ecosystems “based on rock weathering may be widespread in nature.”

    And not just on Earth. Researchers see the deep biosphere as just the place to consider what life on Mars, or even on Jupiter's moon Europa, might be like today. If a body like Mars ever hosted life, it must have permeated the subsurface, the reasoning goes, and the conditions there today are thought to be every bit as hospitable to life as is Earth's subsurface: mild temperatures, liquid water, dissolved minerals, and plenty of rock surface.

    Even if Earth's subsurface life equals just 0.1% of the total terrestrial biosphere, “that's a lot if you look at how prolific life is on the surface,” says Onstott. “You look at other planets where there's no sign of life on the surface at all, but conceivably at great depth a biomass comparable to that on Earth, and you begin to realize how important the subsurface biosphere is.” Studying Earth's deep biosphere might thus inform the study of putative fossil microbes in meteorites from Mars or observations and samples taken from the Red Planet. So the reality of life underground may turn out to be even more fantastic than the dragons, fairy folk, and dwarves of our imagination.

  22. Biotechnology: In Industry, Extremophiles Begin to Make Their Mark

    1. Elizabeth Pennisi

    This month, a Rochester, New York-based company called Genencor International introduced a new detergent additive that it promises will make cotton clothes look like new through hundreds of washings. That may sound like a routine claim in the detergent business, but it is based on a protein that is far from ordinary. The additive, an enzyme called cellulase 103, was taken from an extremophile—a microbe that thrives where few humans dare to go—and its mass-market debut represents a milestone for the fledgling industry based on these microbes. This particular bug was discovered in an alkaline lake, but other potentially useful extremophiles have recently turned up everywhere from the frigid Arctic Ocean to boiling thermal pools.

    Going to extremes. Researchers have uncovered extremophiles in Arctic ice (left, and brown layer, right). J. W. DEMMING/UNIV. OF WASHINGTON

    This isn't the first extremophile product to go commercial. Many biologists already use an extremophile enzyme, or extremozyme, every day—the Taq polymerase. Isolated in 1965 from a microbe that thrives in a bubbling thermal pool in Yellowstone National Park, Taq is the heart of the polymerase chain reaction, and companies are still fighting over its patent and annual sales of $80 million. But cellulase 103 is aimed at a much broader—and bigger—market: It is “the first large-scale industrial application of a product from an extremophile,” boasts Scott Power, a protein chemist with Genencor International's labs in Palo Alto, California.

    Hot stuff. Useful heat-loving microbes have turned up in smoking deep-sea vents. BRIAN E. JONES

    Its introduction is one sign of growing industrial interest in these enzymes, which can work at more than 100 degrees Celsius or at a pH of 10. New research initiatives, new companies, even a new journal (Extremophiles, launched in February) are melding basic biology with commercial development. Research agencies are surveying the natural world with renewed vigor and finding rapid new ways to identify and produce extremozymes. Such proteins could help process chemicals or food, or even lead to new classes of antibiotics. “We have yet to fully realize the benefits of these organisms,” notes Mark Madden, a biochemist at Stratagene in La Jolla, California.

    Even if the microbes' own proteins don't prove to be useful, chemists hope to learn from them how to redesign conventional enzymes to perform in harsh conditions. “[Extremophiles] are totally expanding our view of what enzymology is,” says molecular biologist Francine Perler of New England Biolabs Inc. in Beverly, Massachusetts.

    The industrial launch of extremozymes has been a long time in coming, however. Although the success of Taq polymerase seemed a hot start for extremophile biotech, few firms rushed to follow it up with other applications. And those that did often found it difficult to produce or purify enough enzyme to study—let alone to scale up for industrial use. Indeed, although enterprising microbiologists began collecting extremophiles from bizarre corners of the world more than 30 years ago, until recently they have gotten little return for their efforts.

    Given that history, some experts urge caution. “There are people trying to make a whole big story about [extremophiles], but they are not the Holy Grail,” says molecular biologist Glenn Nedwin of Novo Nordisk Biotech Inc. in Davis, California. Still, several companies and research agencies are now betting that risky investments in these microbes will yield big payoffs.

    From soda lakes to suds. Genencor, for example, is eyeing what Power says is a potential $600 million market for detergent enzyme additives. The company is counting on cellulase 103's ability to break down the microscopic fuzz of cellulose fibers that traps dirt on surfaces of cotton textiles, without harming the cotton fibers as much as other additives do. And because it comes from bacteria in alkaline soda lakes of varying temperatures, this cellulase performs under a broader range of conditions than other additives. It will work at the pH of soapy wash water—hot or cold, says Power.

    The bacteria behind the enzyme were collected from soda lakes on several continents—although Genencor won't reveal exactly where. Scientists worldwide are now surveying a host of extreme environments, from cold to acid to high salt, for organisms that might yield similarly useful enzymes. They are finding plenty of microbes to work with, says John Baross, an evolutionary microbiologist at the University of Washington, Seattle. Publications on alkaliphiles alone, for example, surged from a half dozen to more than 100 annually in the past few years.

    Two of Baross's University of Washington colleagues, marine microbiologist Jody Deming and chemical engineer Barbara Krieger-Brockett, have focused on cold-adapted organisms, or psychrophiles (from the Greek for cold-loving). They isolated organisms from sediments on the Arctic Ocean floor and have begun to purify their enzymes, which could perform such jobs as removing fat effectively in cold water—a handy trait for soaps.

    For some researchers, the search for extremophiles has been a career-long occupation. Take Japanese microbiologist Koki Horikoshi of the Japan Marine Science and Technology Center in Yokosuka. Horikoshi has been hunting extremophiles for decades in deep-sea sediments, and he filed several hundred patents on microbial enzymes—including one used in Japan as a detergent additive—long before the name extremophile was even coined. Only now, after his patents have run out, have some of the enzymes become widely used (Science, 3 January 1992, p. 28). But he's not discouraged—far from it.

    His latest effort, part of DEEPSTAR, a $3-million-a-year program, has so far collected 1000 different organisms from deep-sea mud and 2500 strains from the Marianas Trench—the deepest place on Earth, almost 11,000 meters down. The team has found psychrophiles as well as “barophiles”—microbes that love high pressures. The most useful of Horikoshi's new crop of organisms, however, may be those that flourish in organic solvents toxic to most other life.

    He and DEEPSTAR colleagues Chiaki Kato and Akira Inoue found microbes that, for unknown reasons, manage to thrive in toluene, benzene, cyclohexane, and kerosene, sometimes at solvent concentrations of up to 50%. The microbes turned up in both soil and deep-sea muds, but were 100 times more common—and therefore much easier to isolate—in the muds. Although their precise utility is unclear, these microbes can degrade crude oil and polyaromatic hydrocarbons, and also show potential for processing substances that don't readily dissolve in water and therefore require huge volumes of water for their production, says Horikoshi.

    Extreme potential. Once researchers have extremophiles in their petri dishes, the next step is to see just what the organisms can do. In Europe, a consortium of 39 teams, funded by the BIOTECH-Programme of the European Union, just finished a 3-year, $8 million survey of organisms from high-salt, high- and low-pH, and high-temperature environments. In January, 61 laboratories embarked on the next phase: a 3-year, $12 million project “to exploit these microbes for various industries,” says project leader Garabed Antranikian, a microbiologist at the Technical University of Hamburg-Harburg, Germany. “We want to modify existing chemical processes and make them more biological.”

    One focus of this project is the potential use of biodegradable extremozymes to reduce toxic waste. For example, several collaborators aim to use extremozymes instead of chlorine in paper production; they are working on a quick screening method to evaluate xylanases and hemicellulases, enzymes that can remove the glue and break down pulp fibers. Other researchers are evaluating heat-resistant enzymes that could be used to label newly made DNA, thereby reducing the need for radioactive tags in diagnostic tests in molecular biology and medicine.

    In the United States, too, the extremophile community is poised to put the surveying efforts of researchers like Baross to good use. An expanding cadre of biochemists and molecular biologists is sifting through enzymes extracted from these collections. In addition, the U.S. Department of Energy (DOE) is making basic information on extremophiles widely available and easy to use. Some of that information is as basic as you can get: the genetic blueprints of these strange organisms. DOE is supporting the sequencing of the complete genomes of several extremophiles. One is already completed (Methanococcus jannaschii), and DOE expects several more to be finished by the end of June. This makes “these genes from these weird environments” accessible to any researcher who can call up the sequence from a database, says New England Biolabs' Perler.

    And one enterprising U.S. company—based in Sharon Hill, Pennsylvania, and La Jolla, California—has adopted a scattershot technique that produces extremozymes rapidly while avoiding the time-consuming process of isolating and purifying individual genes. Founded in 1994, Recombinant BioCatalysis Inc. (RBI) recovers DNA directly from samples collected from all over the world, randomly cuts it into fragments, and inserts the pieces into bacteria, usually Escherichia coli. The bacteria then churn out proteins that are screened for various types of enzymatic activity under different conditions.

    The company has discovered 175 new extremozymes using this approach and now provides a menu of them in kits sold through biological- and enzyme-supply companies. So, while previously researchers had to invest time and money in culturing or purifying enzymes just to see whether they might be useful, “Now there are enzymes out there to try,” says chemical engineer Robert Kelly of North Carolina State University in Raleigh. Each enzyme is different, explains Eric Mathur, who directs the company's La Jolla laboratory, and scientists can see which one works best for their application.

    Scaling up. RBI's rough-cut approach sidesteps another difficult problem in working with extremophiles: getting enough enzyme to work with. RBI makes enzymes the traditional way, inserting the gene into E. coli and using these bacteria as enzymatic factories. So it's poised to scale up production if a client requests it, although no one has yet, says Mathur. But this method doesn't work with all extremozymes: The E. coli sometimes fail to fold the protein into an active form, or the proteins are toxic to these bacteria. And, ultimately, E. coli are too inefficient for commercial production, which can require tons of enzymes.

    Genencor solved this potential problem easily because cellulase 103 comes from a newly discovered species of Bacillus, a genus of bacteria already used as an enzyme factory in commercial operations. So the company was able to put the cellulase gene into a different Bacillus strain and produce the 50 to 250 metric tons needed per year.

    For those enzymes for which Bacillus might not work, some researchers are trying to use an extremophile itself for mass production. They must first overcome a big hurdle, however: These organisms seem to be infected by few viruses capable of ferrying foreign genes into them. But some early progress has been made. Patrick Forterre of the University of Paris, South, in Orsay, France, for example, is working with a thermophile microbe called Pyrococcus abyssi that has plasmids—small, circular pieces of DNA that readily serve as vectors for new genes. And another European laboratory has found a virus that might transfer genes into the acid-loving extremophile, Sulfolobus.

    Even if companies do succeed in producing large quantities of extremozymes, some experts caution that they may not find a ready market. Novo Nordisk's Nedwin says that his company—which sells enzymes to companies that process everything from foods to paper—often finds that extremozymes aren't needed because enzymes from more conventional bacteria work just fine. In his view, extremozymes' chief benefit may be to help chemists learn to tailor enzymes to be stable under harsh conditions. “What extremophiles may do, and that's a big if, is maybe give you a different starting point” for altering traditional enzymes, Nedwin says.

    Market economics and tradition may also prevent the use of even the most promising extremozymes. For example, Kelly found an enzyme that breaks up a viscous natural polymer called guar gum, which is used to open crevices in bedrock to enhance the flow of oil and gas. This enzyme, from heat-loving Thermotoga neapolitana, can make the guar-gum solution flow more readily than current enzymes do at the high temperatures—100°C—found in deep wells. Yet coaxing traditional oil- and gas-industry companies to risk the switch has been tough. “Most of the major companies have been downsizing,” Kelly explains. “It's not a good time to champion a new technology,” especially because the new enzyme may not cut costs initially.

    Still, Kelly and others are optimistic about the future of extremophiles in industry. “What we have today is the possibility of getting enzymes that are considerably more active at the higher temperatures and more stable,” Baross argues. Hamburg's Antranikian insists that his consortium is more than ready to take on these challenges. “The time has come,” he says, “to show that these microbes are really interesting for industry.”
