News this Week

Science  14 Feb 2003:
Vol. 299, Issue 5609, pp. 990
  1. SCIENCE PUBLISHING

    The UPSIDE of Good Behavior: Make Your Data Freely Available

    Eliot Marshall

    Prompted by what one biologist calls “an erosion of traditional standards”—and, in particular, the public dispute over access to data on the human genome—leaders in the life sciences have prescribed rules for the sharing of data and research materials. Their report, issued last week by the National Academy of Sciences (NAS),* decrees that the authors of published papers must support their claims by quickly and freely releasing data such as DNA sequences. The NAS group also came up with a snappy acronym for this ethic, UPSIDE (Universal Principle of Sharing Integral Data Expeditiously). Their goal, they say, is to get “universal adherence, without exception,” to the idea that any scientist should have ready access to the data and materials needed to “verify or replicate” a published claim.

    Thomas Cech, president of the Howard Hughes Medical Institute in Chevy Chase, Maryland, and chair of the NAS panel, calls the UPSIDE rules “a stake in the ground” for good behavior. He hopes that everyone in the community will feel “attached by various tethers, … and if they wander away, they will feel a tug” of public opinion pulling them back.

    The report lays out detailed procedures as well as broad principles. For example, it says that any group that claims to have sequenced a genome should put the entire data set in the public repository, GenBank, “by the time of publication” to guarantee free access. More generally, it says, any time a repository is widely used in a research community, authors should deposit relevant results in it. They must share materials pertinent to their findings, including source code for software, and they must explain how materials—even those under patent—can be obtained. Authors must be willing to share all this with investigators “on similar, if not identical, terms.” This means, according to UPSIDE, that company scientists should get the same access as academics do. Finally, the report says that journals and institutions should help enforce the rules.

    Anchor.

    Tom Cech wants people to feel “tethered” by the rules.

    CREDIT: KAY CHERNUSH FOR HHMI

    Cech notes that the academy asked the panel to undertake this review partly because of concerns about increasing commercial ties in academic life. But the main impetus, he and others say, was Science's decision 2 years ago to publish a report on the sequencing of a draft human genome by the company Celera Genomics of Rockville, Maryland. Celera made the data available but didn't deposit them in GenBank. Instead, the company released information on its own Web site, requiring users to sign agreements promising not to distribute or commercialize the data. Celera also set different terms of access for academic and commercial users.

    Leaders of the genome community objected, including Francis Collins, director of the U.S. National Human Genome Research Institute in Bethesda, Maryland, which co-funded a competing public genome project. Collins was concerned, he says, that “standards were slipping” and that it “could be damaging to the progress of science.”

    When the Celera controversy erupted, Science had no fixed rules on handling DNA-sequencing submissions, says Donald Kennedy, Science's editor-in-chief. As editor, he made special allowances twice to obtain proprietary data from commercial groups that he says wouldn't otherwise have seen daylight: the Celera sequence in 2001 and a draft sequence of the rice genome from the Swiss agribusiness firm Syngenta last year (Science, 5 April 2002, p. 92). In both cases, the authors released data from private Web sites. But now that the NAS panel has established community rules, Kennedy says, “we're going to ask [that all DNA sequences] go to GenBank” or a sister site.

    “It is very valuable to set out these understandings,” says Kennedy. “It's going to let everyone know what [the standards] are.” He adds that there are still some “interesting questions” about how to enforce the rules—such as whether a funding agency, an employer, or an editor might have the best leverage in a specific case.

    Philip Campbell, editor of Nature, commented in an e-mail that he agrees with the UPSIDE rules and with the idea that editors should enforce them. “We do,” Campbell says. “A phone call is often enough, but we can contemplate banning consideration of future submissions, reporting people to their employers or funding agencies, or highlighting [misbehavior] in our pages.”

  2. COSMOLOGY

    MAP Glimpses Universe's Rambunctious Childhood

    Charles Seife

    WASHINGTON, D.C.—The clearest baby pictures of the universe have caught the moment at which the first stars blazed into life. The images, taken by a satellite and unveiled at NASA headquarters this week, surprised and delighted cosmologists by implying that the cosmos was swarming with activity much earlier than was previously thought.

    “This has profound consequences, I think,” says Max Tegmark, a cosmologist at the University of Pennsylvania in Philadelphia. “On one hand, it's sort of shocking. On the other hand, it's like a piece of the puzzle has fallen into place and you say, ‘Aha!’”

    The images come from the Microwave Anisotropy Probe (MAP), a $145 million satellite launched aboard a Delta II rocket in June 2001. Perched in orbit around a point 1.5 million kilometers from Earth, MAP uses twin antennae to make exquisite measurements of the cosmic microwave background (CMB): radiation left over from the hot, dense universe 400,000 years after the big bang.

    For the past few years, balloon- and ground-based instruments have been making very fine measurements of the CMB across small slices of the sky, and physicists have used those data to estimate the amount and type of matter and energy in the universe (Science, 22 June 2001, p. 2236). MAP took a giant step forward by producing a much more detailed chart of the whole sky. The new results confirm cosmologists' increasingly troubling picture of a universe more than two-thirds of which consists of “dark energy” that scientists have yet to fathom. “[The plot of] the model goes through all the data,” says Charles Bennett, the principal investigator of the MAP mission. “It's really pretty.”

    Islands of light.

    Microwave satellite charted the big bang's afterglow in unprecedented detail.

    CREDIT: NASA/MAP SCIENCE TEAM

    The MAP map also provides the tightest bounds yet on the age of the universe (13.7 billion years, give or take 1%), pins down the rate at which the universe is expanding (smack dab on the value the Hubble Space Telescope yielded a few years ago), and “puts quite a squeeze” on a theory of dark energy known as quintessence, says Bennett. But the real excitement centers on the timing of events that took place a few hundred million years after the big bang, when large structures such as stars and galaxies came into being.

    The clues come from MAP's measurements of the polarization of the CMB. A ground-based instrument called DASI detected the polarization late last year (Science, 27 September 2002, p. 2184), but MAP's view of the entire sky enabled it to pick out much larger features. The polarization of the CMB at large scales reveals patches of a characteristic size—a “peak” in a graph—that show when the light from newborn stars wrenched the electrons from free-floating atoms of neutral hydrogen. This “reionization” burned away the hydrogen fog that suffused the universe and marked the beginning of the modern era of stars and galaxies.

    MAP shows that the patches are smaller than expected, indicating that reionization took place about 200 million years after the big bang, rather than roughly 800 million years after it, as astrophysicists had estimated by studying light from quasars. This apparent contradiction may help explain a lot of data that didn't seem to fit, such as the excess “power” on very small angular scales that the ground-based Cosmic Background Imager saw last spring (Science, 31 May 2002, p. 1588). The data could indicate that an early population of massive stars sprang to life before quasars and galaxies began to form in earnest. If so, that would all but eliminate the once-popular theory that fast-moving, “hot” dark matter played a role in the formation of structure in the universe.

    Whatever the reason for the early reionization, MAP has charted exciting territory ahead. Says Tegmark: “All cosmology over the next 5 years will be based upon this.”

  3. GENE THERAPY

    RAC Hears a Plea for Resuming Trials, Despite Cancer Risk

    Jocelyn Kaiser

    After learning more about the leukemias in two patients in a French gene-therapy trial, a U.S. review panel this week suggested that although it makes sense to keep this trial on hold, others could be allowed to resume. The experts believed that in some other protocols, the benefits might outweigh the risks. The panel also heard a worrisome report: A third patient in the French trial has a gene insertion like the one in the two who developed leukemia—although this child does not have leukemia.

    The discovery that two of 11 boys developed T cell leukemias after being successfully treated for X-linked severe combined immunodeficiency disease (X-SCID) in a trial at the Necker Hospital in Paris dealt a blow to the field of gene therapy. Last month, researchers in the United Kingdom halted an X-SCID trial, and the U.S. Food and Drug Administration (FDA) stopped 27 trials using the same kind of vector, a retrovirus, to treat blood stem cells (Science, 24 January, p. 495). A SCID trial in Italy and three U.S. SCID trials were already on hold. In addition, the U.S. National Institutes of Health's Recombinant DNA Advisory Committee (RAC) urged about 90 other retroviral trials targeting blood cells to consider a pause.

    An emergency meeting of RAC this week brought more concern: Alain Fischer, lead investigator of the French trial (speaking by phone), and collaborator Christof von Kalle of the Cincinnati Children's Hospital Medical Center in Ohio reported that a third patient has T cells with the same gene insertion near LMO2, a gene linked to leukemia. More patients could have this insertion, they noted. The leukemia was “not a random event” and “constitutes a serious inherent risk,” RAC concluded.

    But RAC also heard impassioned pleas from two researchers who argued that other gene-therapy trials should go forward. For example, some involve adult cancer patients who would be threatened more by existing tumors than by the leukemia risk. The panel concluded that resuming other trials “may be justified” if changes are made to informed-consent forms and monitoring plans. Moreover, RAC suggested that X-SCID trials might continue, but only for patients who have failed standard transplant therapy.

    Philip Noguchi, FDA's gene-therapy expert, said that the meeting was “very productive” but that risk information “is evolving.” An FDA advisory panel will review the French data on 28 February.

  4. ASTRONOMY

    Institute Faulted on Attitudes Toward Women

    Andrew Lawler

    With its control over the world's premier observatory, the Space Telescope Science Institute (STScI) in Baltimore exerts a powerful gravitational pull within the astronomy community. But it appears to have the opposite effect on women researchers on its staff.

    An independent panel set up in the wake of a recent exodus of women scientists from the institute has concluded that its management style is dismissive and inhospitable to women. The four-woman, two-man panel, chaired by Smithsonian Astrophysical Observatory astronomer Andrea Dupree, found no disparity in salary or lab space between men and women. But its report characterizes the institute, which is managed by the Washington-based Association of Universities for Research in Astronomy (AURA), as a “dysfunctional family.” It also notes that “an extremely competitive, aggressive environment” is undermining morale among men and women and could hamper the institute's performance.

    In response to the report, which is just beginning to attract attention after being posted on the association's Web site in November, AURA and institute officials have promised to hire more women and pay closer attention to gender issues. And AURA's chief, William Smith, says his organization is planning to examine the status of women at other institutions it manages, including the Arizona-based National Optical Astronomy Observatory and the National Solar Observatory.

    The space institute was created in 1981 to manage Hubble Space Telescope science, and it was once seen as a beacon of opportunity for women. “The institute is very important for the community, and it always had a leadership role [in] the advancement of men and women in the field worldwide,” says Dupree. Since 1992, however, the share of female scientific staff has dropped from 15% to 10%, and all 20 senior astronomers are men. In addition, five of the seven women scientists hired in the past 5 years have left, while only one of the 19 men hired during that period has departed.

    Unpleasant setting.

    Although they may enjoy the work, women astronomers at the space institute were much more likely than men to perceive a hostile environment.

    SOURCE: AURA

    Smith says that declining numbers of women researchers prompted the review, which spans the tenures of Robert Williams, who headed STScI for 5 years in the 1990s, and Steven Beckwith, who succeeded him in 1998. Dupree and her panel of outside astronomers agreed to conduct the study after winning assurances that the results would be made public (aura-astronomy.org) and that they could conduct a follow-up study in 1 or 2 years.

    Although there was no apparent difference in salary or lab space between the sexes, the report cites “a fundamental ignorance of accomplishments of women and the presence of gender bias” among senior managers. For example, Dupree and others cite an incident—not spelled out in the report—in which a male manager put his hand in the face of a female colleague to stop her from talking during a meeting. No disciplinary action was taken. In another case, a man introducing a woman for a talk made derogatory comments about her abilities. Such “lack of sensitivity towards the climate and behavior … is tolerated by senior management,” states the report.

    One woman who recently left the institute for a university post says that “managers would dominate” using “awful behavior” and creating “a competitive and combative atmosphere.” Although she adds that this behavior was not aimed exclusively at women, “women were more vulnerable because they didn't have the support system men had.”

    Beckwith says he knows about “a lot of interpersonal friction” among employees but that he hasn't pinpointed its causes. He also believes that his long-standing awareness of and empathy for the plight of women researchers will help the situation. Last fall, AURA extended Beckwith's contract for 3 years rather than a possible 5 years. Although Smith and Beckwith declined to say if the problems facing women influenced that decision, several women say they believe that the panel's negative findings played a key role in the contract negotiations.

    In response to the Dupree report, AURA has pledged to boost the share of women on the science staff from its current 13%—9 of 71—to 20% by 2010 and the proportion of tenured female Ph.D.s from 12% (4 out of 34) to 20%. It hopes to see the number of female full astronomers jump from 5% (1 of 21, although on part-time status) to 15%. The institute hired three women researchers last year and has a higher percentage of female scientists than other astronomy organizations do, Beckwith notes.

    But both the panel and AURA agree that hiring more women is only a piece of the puzzle. “The harder thing is to make a change in the professional climate,” says Smith.

  5. RUSSIAN SCIENCE

    Academy Plucks Best Biophysicists From a Sea of Mediocrity

    Andrei Allakhverdov
    Vladimir Pokrovsky*
    *Andrei Allakhverdov and Vladimir Pokrovsky are writers in Moscow.

    MOSCOW—Russia's main science body has launched a long-overdue experiment in survival of the fittest. Last month, the Russian Academy of Sciences (RAS) announced the winners of its first-ever competition for funds: 49 major grants to what it deemed to be the cream of Russia's biophysics labs. But the awards—each totaling $625,000 over 5 years—have drawn allegations of conflicts of interest, as more than half the grants went to labs in institutes whose chiefs ran the competition.

    Since the dissolution of the Soviet Union 12 years ago, RAS has become a welfare agency of sorts, providing poverty-level salaries to hundreds of thousands of scientists at its 325-odd institutes. That approach, however, is at odds with President Vladimir Putin's insistence that government-supported science should serve industry better. The academy has also found itself in a struggle with an increasingly powerful Ministry of Industry, Science, and Technology for preeminence in Russian science.

    Now it appears that at least one division of RAS is getting with the Putin program. In August 2002, RAS asked lab chiefs in the 35 institutes of its Division of Physico-Chemical Biology to submit proposals for the major grants. The roughly 70 submissions were judged on criteria ranging from the citation-impact factor of papers published by the applicants to the number of young researchers in a lab.

    Big winner.

    The Shemyakin and Ovchinnikov Institute of Bioorganic Chemistry won several $625,000 grants.

    CREDIT: TASS/SOVFOTO

    The awards, announced last month, dwarf those handed out by the Russian Foundation for Basic Research (RFBR), the country's main agency for supporting peer-reviewed research. On average, RFBR biophysics grants in 2003 pay $3800, or 1/33 of the new RAS grants. The 49 academy grants will ensure adequate funding for about two-thirds of the active labs in the RAS division, says Georgy Georgiev, director of the Institute of Gene Biology in Moscow and chair of a seven-member panel that awarded the grants. “For the first time, we were able to select the best laboratories and to provide them with good working conditions for a long time and with well-deserved salaries,” he says.

    But with that sort of money at stake, Russia's scientific community was ready to pounce on any whiff of irregularity. In total, 30 of the 49 grants went to five Moscow-based institutes whose directors sat on the review panel: Georgiev's institute and the Institute of Protein Research, the Shemyakin and Ovchinnikov Institute of Bioorganic Chemistry, the Institute of Molecular Genetics, and the Institute of Cytology and Genetics. Several scientists who asked to remain anonymous blasted the results. “They just got together and distributed money amongst themselves,” grouses one critic.

    The competition's overseers insist that the selection was unbiased. “Conflict of interest is out of the question,” says Alexander Spirin, director of the Institute of Protein Research and a selection committee member. “The committee consisted of the best experts in this area, and it would be strange if they were excluded from the competition.”

    Others see merit in richly rewarding Russia's best scientists—but prefer truth in labeling. “I would not call it a competition,” says Boris Saltykov, a former science minister who helped create RFBR during his struggles to introduce peer review in Russia in the early 1990s. The RAS system may well work, he says, but it raises the question of who will decide which scientists are topflight and should control the purse strings. Georgiev bristles at that evaluation. “We had a real competition,” he says. Still, to eliminate any appearance of conflict of interest in future competitions, Georgiev says that reviewers will be drawn from previous grantees and will not be permitted to submit proposals.

  6. RNA INTERFERENCE

    Mini RNA Molecules Shield Mouse Liver From Hepatitis

    Jennifer Couzin

    Shooting millions of tiny RNA molecules into a mouse's bloodstream can protect its liver from the ravages of hepatitis, a new study shows. It offers further hope that a novel approach to silencing troublesome genes will become a valuable disease-fighting tool. But the authors and others caution that the therapy must leap many hurdles before it can be safely applied to humans.

    The technique, known as RNA interference (RNAi), utilizes short bits of RNA to silence specific genes. In this case, they blunt the liver's self-destructive inflammatory response, which can be triggered by agents such as the hepatitis B or C viruses.

    Full-size RNA molecules convert genetic information into proteins. But in the late 1990s, researchers found that truncated RNAs could be coaxed to turn off the very genes that had helped generate them. Biologists exploring RNA's newfound versatility quickly recognized RNAi's potential as a powerful medical treatment (Science, 20 December 2002, p. 2296). But until now, that potential has been largely theoretical.

    Harvard University immunologists Judy Lieberman and Premlata Shankar are apparently the first to unveil the therapeutic power of RNAi against disease in an animal. Infusing a solution of small interfering RNAs (siRNAs) into a mouse's tail—in a massive amount, equivalent to half the animal's blood volume—protected it against hepatitis. And in animals that were already ailing, RNAi shut down the inflammation enough to allow the liver to recover. Despite the traumatic delivery method, the mice didn't appear to suffer side effects.

    “This is the first example that I know of where there's been a biological challenge and an animal [was] protected” by RNAi, says Phillip Sharp of the Massachusetts Institute of Technology in Cambridge, who recently co-founded a company called Alnylam to design RNAi-based therapies.

    Interference.

    Mice given a placebo treatment for hepatitis suffered serious liver scarring (left, lines) from inflammation, whereas the livers of those receiving targeted siRNAs were protected (right).

    CREDIT: E. SONG ET AL. NATURE MEDICINE

    In humans, liver inflammation is normally triggered by a virus or an autoimmune disorder. Although no viral analog exists in animals, scientists can induce autoimmune hepatitis. The disease has become popular in RNAi studies, largely because the liver readily absorbs siRNAs into its cells.

    In a series of experiments published online this week by Nature Medicine, Lieberman's team gave mice injections of siRNAs designed to shut down a gene called Fas. When overactivated during an inflammatory response, it induces liver cells to self-destruct. The next day, the animals were given an antibody that sends Fas into hyperdrive. Control mice died of acute liver failure within a few days, but 82% of the siRNA-treated mice remained free of serious disease and survived. Between 80% and 90% of their liver cells had incorporated the siRNAs. Furthermore, the RNA molecules functioned for 10 days before fading completely after 3 weeks, lasting roughly three times longer than in previous studies. (Fas is rarely expressed at high levels outside the liver, so briefly disabling it has little effect on other organs.)

    Another set of animals faced a different challenge: injections of concanavalin A (ConA), a plant protein that compels the immune system to attack the liver, producing the scarring seen in viral hepatitis. Animals infused with siRNA developed no liver damage.

    “It's amazing how well it worked,” marvels Charles Rice of Rockefeller University in New York City. Still, he adds, the delivery method is clearly problematic. In humans, rapidly injecting a huge volume of solution is “not the way to go,” Rice says. Researchers have yet to determine whether a gentler approach might work. In addition, biologists agree that the best strategy would be to aim siRNAs directly at hepatitis B or C viruses, but that would require a different siRNA than the one used by Lieberman's team. Evidence from several labs suggests that, in petri dishes, siRNAs can stop hepatitis C from replicating.

    Lieberman, Shankar, and others are also testing RNAi's ability to battle HIV. At the Conference on Retroviruses and Opportunistic Infections, held in Boston this week, Shankar presented evidence that siRNAs can target a protein called CCR5 that helps shunt HIV into immune cells. The result: For 3 weeks, the RNAs barred HIV's entry. And in cells already infected, they put the brakes on viral replication altogether.

  7. EUROPEAN UNION RESEARCH

    Ethics Group Gives Qualified Nod to Placebos

    Gretchen Vogel

    BERLIN—A European ethics body has said that placebo-controlled trials can sometimes be justified in developing countries, even when proven treatments are available in wealthy countries. The statement departs from more stringent standards on placebos laid out in the Declaration of Helsinki, a document that sets worldwide standards for clinical research ethics.

    The opinion from the European Group on Ethics in Science and New Technologies does not have the force of law, but it is expected to serve as a guideline for European Union-funded research, and especially for a $640 million European Commission project gearing up to test drugs against AIDS, malaria, and tuberculosis in the developing world. The opinion, released 4 February, states that placebo-controlled trials may be warranted when, for example, “the primary goal of the clinical trial is to try to simplify or decrease the costs of treatment for countries where the standard treatment is not available for logistic reasons or inaccessible because of the cost.” Two members of the group dissented from that conclusion, stating that it establishes an unethical double standard for research in wealthy and poorer countries.

    The issue has been hotly debated since the late 1990s, when critics denounced several large HIV/AIDS trials in Thailand and Africa. In those studies, researchers compared a short course of treatment to a placebo, even though the effectiveness of a longer, more expensive course was well known (Science, 12 February 1999, p. 916). Patient advocacy groups, including the U.S.-based Public Citizen, argued that researchers had a responsibility to provide the best proven treatment to study subjects, even when that treatment would not normally be available in the local community.

    The World Medical Association (WMA) agreed, amending its Declaration of Helsinki in 2000 to tighten the rules for placebo-controlled trials worldwide. Unless the condition is minor, said WMA, control-group participants must receive the “best current treatment.” At the time, some scientists complained that the revision ruled out research that could save lives in poor countries by efficiently testing less expensive or simpler treatment options (Science, 20 October 2000, p. 418).

    In its new statement, the European ethics group accepts that argument. Although in principle the best current treatment should be provided, exceptions can be made when they are approved by an ethics committee that includes representatives of the local community. “We open up a possibility for somewhat different exceptions [to the standard] than the Declaration of Helsinki [has],” says Göran Hermerén, chair of the group.

    That approach puts the interests of society ahead of the interests of the individuals involved in clinical trials, a premise the Declaration of Helsinki rejects, says Delon Human, secretary-general of WMA. “We do not believe that economic arguments should justify a lowered standard of research practice” for placebos or any type of research, he says.

    Peter Lurie of Public Citizen says the new document is a missed opportunity. In calling for stringent ethical guidelines, “the object is to force researchers to think more creatively about alternative trial designs,” he says. “This document provides more than adequate justification for business as usual.”

  8. AGRICULTURE

    Study Shows Richer Harvests Owe Much to Climate

    Erik Stokstad

    Since the 1940s, harvests across the United States have become ever more bountiful as farmers have planted better varieties of crops, generously fertilized them, and gained the upper hand against pests and weeds. But over the past 2 decades, they have had a little help: A new study shows that a surprisingly high percentage of the improvement in yield was due not to farm management but to climate change.

    The finding suggests that food production in the United States may be more vulnerable to shifts in climate than was previously suspected, a fact that could affect global food security. “It's an eye opener,” says agricultural meteorologist Gene Takle of Iowa State University, Ames.

    On page 1032, graduate student David Lobell and Gregory Asner, both of the Carnegie Institution of Washington and Stanford University, report an analysis of the role of climate and other factors in U.S. agriculture. They investigated the interplay among temperature, rainfall, amount of sunshine, and bushels of corn and soybeans per acre from 1982 to 1998. During this time, summers in a large swath of the Midwest became slightly cooler. Lower temperatures in the region are known to boost the yields of corn and soybeans, which rose about 30% over the study period. The United States leads the world in production of the two crops.

    Lobell and Asner wanted to tease out the impact of those gradual climate shifts relative to other influences on yield, such as farming practices. To spot correlations amid the statistical noise, they picked counties throughout the United States where yields had responded to climate in the same way, either rising in cooler summers or falling in warmer ones. That was the case in about half of the counties where corn or soybeans were being grown—618 for corn, 444 for soybeans. (The amount of sunshine or rainfall was unrelated to changes in yield.)

    Using a statistical model to compare these climate variations among counties with changes in yield, the researchers found that the cooling climate was responsible for about 20% of the gains over the 17 years. The remainder they credit to management and other factors, such as increased carbon dioxide in the atmosphere. “Gains from management have not been as high as we thought,” Lobell says. “This has important implications for projections of future food production.”
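
    The arithmetic behind that attribution can be sketched quickly. The snippet below is not the authors' analysis; it simulates per-county temperature and yield trends (borrowing the study's 618 corn counties and its reported sensitivity of roughly a 17% yield change per degree, with every other number invented) and shows how an ordinary least-squares fit across counties can split average yield gains into a climate share and a management share.

    ```python
    # Illustrative only: the per-county trends are simulated, not the study's data.
    import numpy as np

    rng = np.random.default_rng(0)
    n_counties = 618                                  # corn counties in the study
    temp_trend = rng.normal(-0.02, 0.01, n_counties)  # deg C per year (slight cooling)

    # Assume yields respond at about -17% per deg C (the paper's sensitivity),
    # on top of a steady management-driven gain, plus noise.
    management_gain = 0.014                           # fractional yield gain per year
    yield_trend = (management_gain
                   - 0.17 * temp_trend
                   + rng.normal(0, 0.005, n_counties))

    # Ordinary least squares across counties: yield_trend = a + b * temp_trend
    b, a = np.polyfit(temp_trend, yield_trend, 1)

    climate_share = (b * temp_trend.mean()) / yield_trend.mean()
    print(f"fitted sensitivity: {b:.2f} per deg C")            # about -0.17
    print(f"climate share of yield gains: {climate_share:.0%}")  # about 20%
    ```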

    Nature's co-pilot?

    Farming practices such as crop dusting may have less effect on crop yields than previously thought.

    CREDIT: JOEL SARTORE/CORBIS

    Others agree. “This points out that our food production may be more vulnerable to shifts in climate than we thought,” says Jonathan Foley, a climatologist at the University of Wisconsin, Madison. “It is a little scary.” Lobell and Asner's analysis indicates that yields would drop by 17% for each degree that the growing season warms. That's three times as much as other studies have suggested. Most climate models predict that the Corn Belt of the U.S. Midwest will warm over the next few decades.

    “These yield trends in the U.S. have global implications,” says Kenneth Cassman, an agronomist at the University of Nebraska, Lincoln. For example, Cassman explains, a drop in U.S. production might stimulate more planting of soybeans in environmentally sensitive areas such as Amazonian watersheds in Brazil.

    If climate in the United States should take an unfavorable turn, farmers might be hard pressed to compensate, says agronomist Cynthia Rosenzweig of Columbia University and NASA's Goddard Institute for Space Studies in New York City. That's because U.S. agriculture may be nearing its maximum efficiency, she says. But Don Duvick, a retired plant breeder from Pioneer Hi-Bred International, is sanguine. “If there were a trend to higher temperature, the breeders would churn out hybrids that can take it,” he says. More vulnerable to climate change are developing countries, where temperatures are already high, soils are often poor, and management has a long way to go. “Climate change will put them even further behind,” Rosenzweig says.

    The study is not the last word, Rosenzweig and others emphasize. Lobell and Asner looked at only one aspect of climate; its variability and the number of extreme events, such as floods, also strongly affect yields. In addition, temperature shifts might influence harvests in other ways, such as by bringing more or less land into production, says Mark Rosegrant, director of the International Food Policy Research Institute in Washington, D.C. It may also be dicey to generalize from these counties to the nation as a whole. “It's so simplified, it's hard to say that this is truly reality,” Rosenzweig says.

    No one, however, disputes the bottom line: Small, gradual shifts in climate can play an important role in yield trends and therefore food supply. What needs to be done next is to expand the study to other regions of the world—something that Lobell is already working on—and to other crops.

  9. SPACE SHUTTLE

    After Columbia, a New NASA?

    Andrew Lawler

    Strong political backing and the promise of more money may help NASA Administrator Sean O'Keefe fulfill a bold new vision—if the agency can survive the current investigation

    From disaster comes good fortune, according to a Chinese proverb. If that holds true, the tragic loss of Columbia and its crew may breathe life into NASA Administrator Sean O'Keefe's plan to build a complement to the space shuttle, revolutionize space science missions, and create the steppingstones for human missions beyond Earth orbit.

    That potential good fortune is the result of an unusual constellation of events that strengthen O'Keefe's hand as he begins discussions this week on Capitol Hill about the civil space program. “It's a singular opportunity,” says Charles Kennel, chair of NASA's Advisory Council and head of Scripps Institution of Oceanography in San Diego, California. “The course of the space program over the next 20 years will be set by the debate we're about to begin.” Kennel and others predict that the shuttle and space station programs will emerge in a stronger position, and that the political spotlight will help NASA resolve nagging questions about future launch systems and human missions beyond Earth orbit.

    The short-term challenges for NASA, however, remain daunting. Hundreds of investigators continue to comb through Texas fields, Louisiana swamps, NASA computers, and Air Force photographs in search of clues to the shuttle's sudden disintegration. The longer the mystery remains unsolved, the longer the shuttle fleet is likely to remain grounded. Meanwhile, agency managers are struggling to figure out the impact of the disaster on everything from the 2004 planned servicing of the Hubble Space Telescope to the cost of storing space station equipment shelved due to the grounding of the shuttle fleet. Outside the shuttle, there is no immediate crisis: The space station is well supplied and accessible with Russian vehicles, Hubble doesn't need maintenance for the foreseeable future, and money slated for flying the shuttle can help cover the unexpected costs.

    Nevertheless, shaken lawmakers promise sweeping reviews of the agency. “Everything should be on the table … including budget,” said Senate Minority Leader Tom Daschle (D-SD) after meeting with O'Keefe 3 February. Senator John McCain (R-AZ) and Representative Sherwood Boehlert (R-NY), who chair NASA oversight panels, were to hold a joint hearing this week to start that review, and some panel members have already suggested that the space station's future could be in jeopardy.

    Despite the tough talk, many scientists, congressional aides, and NASA officials dismiss the notion that the shuttle and station programs are doomed, that there will be a radical reassessment of the role of humans in space, or that NASA's budget could suffer as a result of the failure. O'Keefe, they note, has powerful advocates on his side. Three of his former bosses are now in influential positions: Vice President Dick Cheney, White House Office of Management and Budget Director Mitch Daniels, and Senator Ted Stevens (R-AK), who chairs the powerful Appropriations Committee. Stevens told O'Keefe 3 February that he will do everything he can to help NASA overcome any fiscal difficulties. Cheney, speaking of the astronauts at a 6 February memorial service at Washington's National Cathedral, said “their greatest memorial will be a vibrant space program with new missions carried out by a new generation of brave explorers.” And Daniels says that there is room for NASA's budget to grow. “The president wants to emphasize it more,” he says.

    Briefing the boss.

    NASA's Sean O'Keefe met 2 days after the Columbia disaster with President Bush, who has proposed boosting space science spending by 60% over 5 years.

    CREDITS: (TOP TO BOTTOM) J. SCOTT APPLEWHITE/AP; SOURCE: NASA

    Science soars

    Past administrators confronted with the loss of astronauts went through the political wringer and struggled to keep their human space-flight programs—Apollo and then the space shuttle—afloat. But O'Keefe not only has strong political connections, he's also got a specific plan that bears the White House seal of approval. The plan, laid out in the 2004 budget request released 3 February, would end a decade of stagnant NASA budgets in favor of modest but steady increases. The bulk of the increases would go to a small space plane that could complement the shuttle and to a raft of space science efforts that draw on advanced propulsion, power, and communication systems. Those systems, in turn, could lay the foundation for human outposts beyond the space station.

    In the president's request, NASA's budget would rise from $15 billion-plus for 2003 (the exact number was still being negotiated as Science went to press) to $17.8 billion in 2008. Although that may seem puny to biomedical researchers who have watched a 5-year doubling of the budget for the National Institutes of Health, it's very good news after the past decade. The biggest winner would be the space science program, which would jump 60%, from the $3.5 billion requested in the 2003 budget to $5.6 billion by 2008. “The Administration chose to really back space science,” says Bruce Murray, a planetary geologist at the California Institute of Technology in Pasadena. “And it doesn't seem to be just posturing.” Says NASA space science chief Ed Weiler: “Without [O'Keefe's] support, these increases never would have gotten through.”

    Somber evidence.

    Columbia's nose cone being recovered after it landed in eastern Texas.

    CREDIT: ERIC GAY/AP

    Most of that increase, an additional $3 billion over 5 years, would go to Project Prometheus, an effort to slash the time it takes to reach distant planets by using nuclear propulsion. The centerpiece of this initiative, designed to make chemical propellants obsolete, would be the Jupiter Icy Moon Orbiter. The mission would search for hidden oceans on Europa, Ganymede, and Callisto, oceans that astrobiologists believe could harbor organic material. “Our launch target is no earlier than 2011,” one NASA official says.

    NASA also wants to spend $233 million over 5 years to improve high-bandwidth communications so that spacecraft can send back more data more rapidly. Using current radio transmission techniques, it would take 21 months to map one-fifth of the martian surface. The greater bandwidth would allow the Mars Reconnaissance Orbiter, slated for 2009, to map the planet's entire surface in 4 months.

    Some $765 million through 2008 would be set aside to continue work on the Laser Interferometer Space Antenna, a flotilla of three spacecraft that would gather data on gravitational waves, and on a powerful x-ray telescope called Constellation X. The money would also cover work on a new series of low-budget physics missions, called Einstein probes, modeled on the Discovery and Explorer series (Science, 15 November 2002, p. 1320). Funding for investigations of the sun also would nearly double, from $674 million in 2003 to $1.2 billion, for a series of spacecraft like the Solar Dynamics Observatory that will examine the sun's magnetic field and space weather.

    Funding for other areas of science, such as earth, biological, and physical sciences, would grow by smaller percentages. But a human research initiative designed to study crew safety during deep-space missions would begin at $39 million this year and grow to $347 million by 2008.

    The propulsion, communication, and human-health efforts could provide the basis for a decision to move beyond the space station, NASA officials say, along with healing the old breach between advocates of robotic versus human missions. Agency planners envision astronauts assembling large telescopes or complex planetary probes at a depot outside Earth orbit, eventually following those probes when it makes good research sense. “We don't want to throw humans at every problem,” says Harley Thronson, chief technologist in NASA's space science office. “But we want to find ecological niches” that would benefit science missions.

    What to do with humans once the space station is complete is sure to come up in the congressional debate, predicts Kennel. And NASA had better have some options in place. “The one intolerable thing is if there is no definition of a next step,” he says. “That would be politically bad.”

    No free launch

    No launchers, no exploration. A decade of pouring money into advanced launch technology has produced little, and a frustrated O'Keefe decided last year that NASA should continue upgrading and modernizing the shuttle to keep it flying as far into the future as 2020. The loss of Columbia will force NASA and Congress to reconsider that choice, which likely will prove the most controversial decision in the new plan.

    But O'Keefe also proposed in November that NASA consider building an Orbital Space Plane. A more modest winged vehicle than the shuttle, it would sit on top of an existing expendable rocket and travel to and from the station, with or without crew. It could also serve as a lifeboat for space station personnel.

    But the idea has won little support from many aerospace contractors, who fear it could replace the shuttles—and their lucrative contracts—or from legislators, who question its feasibility and its price tag. The shuttle grounding could prompt O'Keefe to put the program on a faster track in order to ensure an alternative to the shuttle. But gaining political support for this may be tough. “You won't get a multibillion-dollar appropriation for this,” a House aide says. “It's not going to happen.”

    NASA declines to estimate the plane's cost, but one industry official says that development costs could exceed $35 billion—or more if NASA wants it to fly before the current target of 2012. NASA seeks to spend $550 million on the program in 2004 and an additional $515 million for next-generation launch technology.

    The European Space Agency approached NASA about jointly building the space plane but was rebuffed, according to NASA officials. O'Keefe, a former Navy secretary and Defense Department comptroller, prefers to work with NASA's richer cousin, the military. “It's the difference between holding a bake sale and winning the lotto,” explains one aerospace company manager. The Defense Department hopes the plane would be able to move around its space assets—such as spy satellites—already in orbit.

    But even a 2012 launch will leave NASA without its own space station rescue vehicle for six long years after Russia's obligation to make Soyuz available expires in 2006. If the station partners hope to expand the station to a crew of six, they will need to have either two Soyuz vessels on hand or a new vehicle altogether. Political constraints make the former difficult, and the huge cost of building a new rescue vehicle by 2006 makes the latter seem prohibitive.

    Despite these tough issues, O'Keefe is in a good position to make his case. “There will be a reexamination of the program, and it will come out stronger, just as it did after Apollo 1 and Challenger,” predicts John Bahcall, an astronomer at Princeton University in New Jersey.

    So far, the agency is earning high marks for its handling of the investigation. “They'll shoot straight with us, because they want to know” the cause of failure, former President Bill Clinton said 6 February. But it won't all be smooth sailing. That same day, responding to congressional criticism of the accident-investigation team's makeup, O'Keefe added more non-NASA and military officials to “eliminate any ambiguity about the independence of this group.”

    In the months ahead, a protracted war with Iraq could undermine public and political interest in the space program. O'Keefe's push to tie NASA more closely to the military could also backfire, and his desire to expand use of nuclear systems also might provoke a backlash. Given those potential pitfalls, more midcourse corrections seem likely as NASA's administrator navigates his ambitious plans through the treacherous waters created by the tragedy over Texas.

  10. SPACE SHUTTLE

    Disaster Sets Off Science Scramble

    Andrew Lawler

    The Columbia tragedy has grounded the fleet of U.S. space shuttles and halted construction of the international space station. But that leaves the station's crew temporarily available to perform other tasks, including more experiments, if space officials can figure out a way to get them what they need. “We have additional samples and supplies” aboard the station to extend the current research program, says John Uri, a member of the newly organized station utilization recovery team at Johnson Space Center in Houston, Texas. “We're looking at how the crew can do more science.”

    The current crew of three arrived on the station in November. It was scheduled to leave next month after conducting almost two dozen experiments, ranging from the effects of long-duration space flight on skeletal tissue to the impact of the station's own vibrations on experimental measurements. Unfortunately, a shipment of new parts that arrived last week on a Russian Progress cargo ship didn't help the astronauts fix an apparent electrical short that has knocked out a glove box critical for a host of materials experiments. Uri says the crew will continue to try to repair the equipment.

    By juggling the resupply schedule and depending on Russian vehicles, at least in the near term, NASA planners hope to keep the astronauts busy doing science. Russian officials say they will do what they can to accommodate the Americans in their hour of need. The next Progress vehicle, for example, may be able to ferry additional research material. The Russian capsule, which normally brings food, water, and other supplies, could be readied earlier than its scheduled June launch. But bringing down what goes up is the real problem. “That's the choke point,” explains Uri. Progress vehicles don't return to Earth—they are allowed to burn up in reentry—and the Soyuz capsules, which hold a crew of three, have limited storage space.

    Backup plan.

    Russia's Soyuz vehicle, preparing to replace one already docked, may be the best short-term hope for science aboard the international space station.

    CREDIT: NASA

    What's worse, neither one can carry the refrigerator-sized racks that hold the bulk of U.S. experiments. The European Space Agency (ESA) has offered to speed up work on a robotic spacecraft, the Automated Transfer Vehicle, that could carry the racks into orbit. But it likely won't be ready until the summer of 2004, and its ride into space, Ariane 5, was itself grounded after a recent failure.

    Seven racks are now on the station, and a shuttle was slated to haul three more into orbit next month when it brought up a new crew. Meanwhile, in April, a Soyuz is scheduled to replace one now docked to the station. That vehicle serves as a lifeboat in case of emergency, and it needs to be replaced every 6 months in order to assure reliability.

    The Russians had planned to fill the Soyuz on its weeklong stay with an ESA astronaut, a cosmonaut, and possibly a paying tourist. Both Russian and ESA officials say they are willing to alter that plan if NASA instead wants to bring back the two Americans and one Russian now in orbit. “Manned flights will have to be adjusted,” Energia Corp. president Yuri Semenov told reporters last week. Those issues are currently taking a back seat to the investigation, says Michael Kostelnik, NASA shuttle and station chief, but he expects them to come to the fore in the next few weeks.

  11. SPACE SHUTTLE

    Columbia Disaster Underscores the Risky Nature of Risk Analysis

    Charles Seife

    NASA has been using cutting-edge tools to analyze the probability that a shuttle and its crew will be lost, yet interpreting the results is a dicey proposition

    When the Challenger exploded in 1986, NASA's calculations of the probability that a space shuttle would fail ranged from 1 in 100 to 1 in 100,000. Nearly 2 decades later, NASA's ability to estimate risk has vastly improved. Yet this month's breakup of Columbia, the second such catastrophe in 113 shuttle flights, suggests that its most recent official estimate of 1 in 250 is still way off. Why can't NASA get it right? The answer lies in a field known as probabilistic risk assessment (PRA).
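
    The scale of the mismatch is easy to check with back-of-the-envelope arithmetic. The snippet below is not a NASA calculation; it simply compares the observed failure rate with the official estimate and asks how surprising two losses in 113 flights would be if the per-flight risk really were 1 in 250.

    ```python
    # Rough check, not NASA's method: binomial odds of the observed record.
    from math import comb

    p = 1 / 250      # official per-flight risk estimate
    n = 113          # shuttle flights flown, with 2 losses

    print(f"observed failure rate: {2 / n:.4f} (about 1 in {n / 2:.1f})")

    # Probability of seeing 2 or more failures in n flights at risk p
    p0 = (1 - p) ** n                           # zero failures
    p1 = comb(n, 1) * p * (1 - p) ** (n - 1)    # exactly one failure
    print(f"P(>=2 failures if p were 1/250): {1 - p0 - p1:.2f}")   # ~0.08
    ```

    On these numbers, two or more losses would occur only about 8% of the time if the official figure were right, which is one reason the analysts quoted below warn against leaning on any single number.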

    PRA is a way for engineers to get a handle on the likely failure rate of a complex system such as a space shuttle and pinpoint the elements most likely to contribute to such a disaster. Mathematicians and risk experts feel that it is the best way to get a quantitative grip on what will cause a machine to fail, yet engineers, including those at NASA, have long resisted the idea of using probabilities to analyze risks. “They think that displaying uncertainty is displaying a lack of understanding,” says Elisabeth Paté-Cornell, a risk expert at Stanford University.

    NASA soured on the whole idea of quantitative risk analysis when its first crude attempt predicted that the Apollo moon missions would almost certainly fail. But the space agency has worked hard to overcome its engineers' natural resistance to PRAs and is even pioneering work in adapting probabilistic assessments to constantly changing conditions during a mission. “It's a day-and-night difference,” says Ali Mosleh, a risk-analysis expert at the University of Maryland, College Park.

    Risk assessment begins with the construction of a fault tree, a list of events that, at various levels of detail, can cause a system to fail. For example, the shuttle's orbital maneuvering system, which allows the shuttle to change its attitude in space, can fail because of a leak in its helium tank or nitrogen tank, a stuck valve, or a myriad of other reasons. By putting all of these events onto one chart, an engineer can quickly figure out which ones are most important (see chart).

    Tree of knowledge.

    The first step in a probabilistic risk assessment is generating a fault tree, such as this one for analyzing the shuttle's orbital thrusters.

    SOURCE: NASA

    In 1975, nuclear engineers took the next step by augmenting a fault tree for nuclear power plants with the probabilities of those faults occurring and a measure of the severity of the consequences of each failure. By adding and multiplying the probabilities, they came up with an overall probability of failure with a given level of severity. Better yet, the result, in a study known as WASH-1400, suggested which subsystems in the nuclear power plants presented the greatest risk of failure and, thus, the most pressing need to be redesigned. Thus was born the PRA.
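
    The mechanics are simple enough to sketch. In the toy fault tree below (the component probabilities are invented; real shuttle trees contain thousands of basic events), OR gates combine alternative failure paths, AND gates model redundant components, and independence is assumed throughout, an assumption the article notes later can badly understate risk.

    ```python
    # Toy fault-tree calculation in the WASH-1400 spirit. All numbers invented.

    def or_gate(*probs):
        """P(at least one input event occurs), assuming independence."""
        q = 1.0
        for p in probs:
            q *= 1.0 - p
        return 1.0 - q

    def and_gate(*probs):
        """P(all input events occur), assuming independence."""
        result = 1.0
        for p in probs:
            result *= p
        return result

    # Hypothetical branch for the article's orbital-maneuvering-system example:
    helium_leak   = 1e-3                    # invented probability
    nitrogen_leak = 1e-3                    # invented probability
    valve_stuck   = and_gate(5e-2, 5e-2)    # two redundant valves both stick

    oms_failure = or_gate(helium_leak, nitrogen_leak, valve_stuck)
    print(f"P(OMS branch failure) ~ {oms_failure:.1e}")   # ~4.5e-3 with these inputs
    ```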

    The engineering community was immediately dismissive. “There's a general reluctance by engineers and the engineering communities [to use] probabilities,” Mosleh says. “This reluctance is not unique to NASA.”

    Engineers try to assure safety rather than analyze risk, and they usually do this by building in “safety margins” to the systems they design. “If you have a girder bearing a load, you might make it twice as thick as it needs to be,” says Paté-Cornell. “This is the way engineers have operated for, well, forever. The Romans were not computing probabilities when constructing aqueducts.” In this spirit, NASA generated a list of individual shuttle components that could cause a catastrophic failure and tried to make those components as redundant and robust as possible, giving sufficient safety margins for each critical element.

    But this approach didn't provide an estimate of how likely the shuttle was to fail, leading to wildly unrealistic expectations of its reliability. It also didn't give engineers a feeling for how a change to one part—such as improving the durability of a shuttle tile—contributes to the overall safety of the shuttle. And picking out which items were critical was a black art. Just before the Challenger disaster, NASA said there were about 2500 critical items on the shuttle. That number swelled to 4500 after the tragedy.

    In the Challenger aftermath, expert panels urged NASA to consider other methods of assessing risk. “The earliest experiment with PRA was in 1988, a PRA of the [shuttle's] main propulsion system,” says Pete Rutledge, NASA safety expert and division head. “We look back to it as a proof-of-concept study.” Studies in 1989 and 1993 gave the shockingly high values of 1 in 78 and 1 in 90 for the shuttle as a whole, implying that it was less reliable than the Russian Soyuz.

    According to Rutledge, the real culture change at NASA came in the mid-1990s when then-Administrator Dan Goldin recognized that PRAs could be used to set priorities in figuring out which parts of the shuttle were most in need of upgrades. Before Goldin pushed PRAs, says Michael Stamatelatos, the manager of risk assessment at NASA headquarters in Washington, D.C., “it was a bit of a Catch-22. [Engineers] would not believe in it until they did it, and they wouldn't do it until they believed in it.”

    NASA has now taken PRA methods to heart and regularly performs PRAs on the shuttle's systems and the shuttle as a whole. In 1995, the overall calculated value for catastrophic shuttle failure was 1 in 145; in 1998, that value dropped to 1 in 245, due, in part, to design improvements of the main engines. Another PRA, started before the accident, is under way.

    But as the Columbia disaster showed, the latest number appears to be wrong. According to Rutledge, these overall risks are inherently deceptive. “There's all sorts of complexity that we may not be able to capture,” he says. Overlooking a possible source of failure often leads to an underestimate of risks, as does ignoring the possible dependence of failures of different systems, which can boost the risks immensely. Furthermore, these analyses tend to ignore human fallibility. “People under time constraints are going to cut corners,” says Paté-Cornell. But, notes Rutledge, there is pressure to generate a single number. “[The public] likes to know one number, and people on Capitol Hill like to know one number, so we have it,” he says. But in-house analysts don't put much stock in such a figure.

    So what are PRAs good for? “The relative risks are what's really important,” says Stamatelatos. By assessing which risks are more likely to cause failure than others, NASA can direct its limited resources to the problem areas. “These efforts help us make upgrades and project the safety of the shuttle into the future,” Stamatelatos adds. According to Mosleh, NASA is working hard to make PRAs more accurate by taking into account the changing conditions as the shuttle flies; in the meantime, the risk assessments already performed are helping investigators generate a fault tree to study what might have gone wrong with Columbia.

    But even the most sophisticated PRA is likely to be wrong when it comes to calculating the absolute odds of a rare event such as a shuttle failure. “The only way you can calculate this is by modeling,” says Stamatelatos. Adds Rutledge, “The exact number is unknowable.”

  12. MEDICINE

    Tracing the Steps of Metastasis, Cancer's Menacing Ballet

    1. Jennifer Couzin

    New studies are beginning to deconstruct this mysterious process, which is overwhelmingly the cause of cancer deaths

    “We put the cancer cells in last Tuesday,” Weili Fu explains, as she slits open a mouse's chest. Speaking over the rapid hiss-click of a miniature ventilator, she threads a tiny catheter into the pulmonary artery and infuses a bolus of dye. Fu and her mentor at the University of Pennsylvania in Philadelphia, Ruth Muschel, keep the lung pressure high until the chemical runs through, highlighting the lungs' blood vessels to help trace the path of human cancer cells injected 8 days ago into the animal's tail. Under a microscope, the tumor cells show up in vivid shades of yellow within blood vessels dyed red—exposing a dynamic view of metastasis, the process by which cancer cells migrate from a primary tumor to other sites in the body.

    When cancer cells metastasize, the fallout is devastating; indeed, it leads to death for most of the 282,000 people in the United States who succumb to four common cancers—of the breast, lung, prostate, and colon. But until very recently, many biologists were put off by the challenge of studying this process in the lab. The roadblocks are daunting. Because metastasis sweeps through various parts of the body, it must be studied mainly in whole animals rather than in cell and tissue cultures. Obtaining data from mice, which rarely have cancers that spread, remains difficult. Human tissue samples are scarce, too, because few cancer patients have secondary tumors surgically removed.

    Although these challenges persist, the study of metastasis is undergoing a renaissance. Once the province of a small band, it is now drawing many scientists as one of the last great frontiers of cancer biology. Recent studies have found, unexpectedly, how little in metastasis occurs by chance. Instead, a constellation of molecular signals in cancer cells and the patient's own body steer tumor cells, bit by bit, from the primary site to a new, ideal home. Large-scale gene studies are suggesting fundamental differences between metastatic cells and other cancer cells. Researchers are also taking a deeper look at electrifying similarities between embryonic and metastatic cells.

    Illuminating an enemy.

    Cancer cells glow yellow inside the blood vessels of a mouse's lung.

    CREDIT: WEILI FU AND RUTH MUSCHEL

    Deciphering metastasis is a painstaking task. Like Muschel and Fu, growing numbers of researchers are tackling complex experiments and obtaining images of disease that could open new opportunities for treatment. The field's expansion has also brought growing pains. New entrants are challenging long-accepted theories, leading, many say, to sniping and quarrels, as each camp seeks to advance its worldview (see sidebar, p. 1005).

    Although contentious at times, this work is providing insight into some fundamental puzzles. Among them: Are certain tumors predestined to be metastatic from their very beginning? What draws cancer cells to specific organs? Ultimately, for a cancer to progress, “something has to happen” to give tumor cells the capacity to thrive in new environments, says Bruce Zetter of Children's Hospital in Boston, and researchers are still trying to learn what that is.

    A cunning enemy

    Viewed in retrospect, the spread of cancer looks like an intricately choreographed ballet, linking dozens of steps in what's known as the metastatic cascade. Before they form a new tumor, cells must escape from the primary site, tumble into the bloodstream, and survive typhoonlike blood flow powerful enough to kill them. They must lodge in a spot conducive to growth (most spots are not) and colonize this outpost, recruiting blood vessels critical for nourishment.

    This pattern—well established and orderly though it may seem—belies a number of peculiarities familiar to oncologists. Tumors that appear identical under the microscope display utterly divergent behaviors in the body, some spreading aggressively while others stay put. Cancers that appear cured may never stir again; or they may resurface as a metastasis 10 or 20 years later. This unpredictability frustrates physicians and patients.

    One of the enduring goals of the field is to find a way to read a cancer's tea leaves. Yet even as sophisticated diagnostics push back the date when cancer first becomes visible, no universal signs of cell destiny have emerged. “When you try to ask the question of whether early detection by mammography predicts outcome,” says Muschel, “it's not very good. That could be because there are two categories of tumors, one of which has the metastasis phenotype.” If those phenotypes exist, identifying them could potentially guide treatment decisions.

    Scientists have not unearthed a global signature that reads “metastasis” from the day of diagnosis, but they are picking up messages from all sorts of cancer cells that hint at their destiny. From one overexpressed enzyme marking metastatic colon cancer cells to a dizzying expression pattern of 70 genes in breast cancer linked to a poor prognosis, researchers are discerning individual, and sometimes divergent, signatures that are gradually pointing the way toward metastasis predictors.

    What makes a metastasis?

    In the early days of metastasis research in the 1970s, the focus was on dissecting the first step in the metastatic cascade, a cancer cell's escape from the primary tumor. At that time, scientists believed that if cancer cells had peeled off and infiltrated the bloodstream, “the horse is out of the barn and there's nothing you can do about it,” says cancer biologist Carrie Rinker-Schaeffer of the University of Chicago. Isaiah Fidler, a veterinary surgeon-turned-pathologist, proposed 3 decades ago that a cryptic minority of cancer cells harbor an inherent ability to spread. The rest are ill equipped to travel, said Fidler, now at the University of Texas M. D. Anderson Cancer Center in Houston. The proportion wasn't exact—guesses ranged from 1 in 10,000 to 1 in 1 million cells in a tumor—but regardless of their number, Fidler believed, something about these cells propelled them to launch a metastasis.

    Tumbling down the metastatic cascade.

    After breaking off from the primary tumor (far left), cancer cells travel through the blood vessels. Those that reach a secondary site such as the lung (right) may colonize it and form a metastasis.

    ILLUSTRATION: C. SLAYDEN/SCIENCE

    Fidler, widely considered the grandfather of metastasis research, inspired a core group of fewer than a dozen scientists to roll up their sleeves and try to identify what makes these cells different from others in the primary tumor. They focused on genes that are turned on or off only in cancer cells that have metastasized. Patricia Steeg of the National Cancer Institute in Bethesda, Maryland, and colleagues found the first gene, nm23, in 1988; at least seven more have been identified since then. When turned on, these genes appear to inhibit cancer's spread; when shut down, they are often associated with metastases, but apparently they do not affect primary tumor growth. Researchers dubbed them metastasis-suppressor genes, although their precise functions remain fuzzy. At a meeting in the late 1990s, Steeg and her colleagues, pooling their information, discovered that metastasis-suppressor genes are less relevant to the first stage of the metastatic cascade than to one of the last, a cancer cell's ability to colonize a new site.

    This realization came as other researchers, eager to find out what enables a small fraction of malignant cells to thrive, began to focus on the later steps of the cascade. To their surprise, scientists are discovering that metastasis faces remarkably long odds. Although some tumors shed millions of cells into the blood, few reach an organ, and even fewer grow into full-blown secondary tumors. Recent imaging work in Ann Chambers's lab at the University of Western Ontario in London, Canada, has shown that only 1 in 40 skin cancer, or melanoma, cells that hit the liver will form what are called micrometastases: clumps of cells that remain small unless they develop blood vessels. Just 1% of those will acquire a blood-supply network and metamorphose into true metastases. “We ended up … seeing a totally different picture [from] what I assumed to be true,” says Chambers.
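    The compounded odds in Chambers's figures are worth spelling out. Here is a minimal sketch using only the two numbers quoted above:

```python
# Combining the odds quoted above from the Chambers lab imaging work.
p_micromet = 1 / 40  # cells reaching the liver that form micrometastases
p_vascular = 0.01    # micrometastases that go on to acquire a blood supply

p_true_met = p_micromet * p_vascular
print(f"per-cell odds of a true metastasis: 1 in {round(1 / p_true_met):,}")
# -> 1 in 4,000 of the cells that reach the liver
```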

    What enables a voyaging cancer cell to beat the odds and thrive? Metastasis researchers have more than one explanation. Some, such as Steeg and Rinker-Schaeffer, believe that metastasis-suppressor genes, which seem to come into play later in the cascade, are critical. They agree with Fidler's original theory that only a tiny portion of a primary tumor contains cells with metastatic potential. This suggests that the larger the tumor, the likelier it is to harbor metastatic cells and hence to spread.

    Although Fidler's theories have been challenged in the past, it's only now, with the advent of gene microarrays, that they are being rigorously put to the test. Todd Golub, head of cancer genomics at the Whitehead Institute in Cambridge, Massachusetts, runs one of many cancer labs that are developing microarray technologies to monitor thousands of genes simultaneously. Studies analyzing tumor samples and correlating patient outcomes with gene activity are identifying literally dozens of genes whose expression appears to vary in tune with cancer's spread.

    Researchers such as Golub and René Bernards of the Netherlands Cancer Institute in Amsterdam argue that certain primary tumors are, early in development, composed largely of cells with a genetic makeup that compels them to metastasize. Furthermore, they believe it may be possible to identify these deadly tumors early on by analyzing their gene-expression patterns.

    Seeds and soils

    As scientists skirmish over gene-expression studies, separate molecular biology work is spawning at least one principle on which almost everyone agrees: Metastasis appears to be partly controlled by messages embedded in the organs to which cancer spreads. Elements of the signals stimulating metastasis may come “not from the tumor cells themselves but from the microenvironment,” suggests Golub. The new tumor locale seems to include a key that cancer cells use to unlock the site and thrive.

    Researchers have long puzzled over the ties that bind metastasized cancer to certain organs, knowing that specific cancers indisputably show a taste for specific sites. Whereas breast cancers favor the bone and lungs, for example, colon cancers prefer the liver. In 1889, a British surgeon named Stephen Paget spelled out in The Lancet his “seed and soil” theory, which argues that metastasis depends on matching certain types of “seeds,” or cancer cells, with “soils” in which they are likely to grow. Researchers now agree that although many ties between primary tumors and metastases are statistically predictable based on blood-flow patterns, about a third defy logic, among them breast and prostate cancers' frequent and deadly spread to bone.

    As they decipher these affinities, biologists are finding that the choice of where to relocate isn't solely a cancer cell's to make: Distant organs also beckon. The ensuing dialogue tugs the cells closer, or creates a welcoming second home in which they can freely multiply.

    Albert Zlotnik, director of genomic medicine at Eos Biotechnology in South San Francisco, California, saw this pattern emerge 2 years ago while examining well-known proteins called chemokines. Chemokines, which recruit white blood cells to damaged tissue, landed in the spotlight in 1996 when HIV was found to use them to enter cells. Zlotnik eavesdropped on chemokine signals between metastatic breast cancer cells and two locales to which breast cancer spreads, lymph nodes and lungs. He was surprised to find that chemokines helped explain breast cancer's affinity for these organs: The cancer cells expressed specific chemokine receptors, and lymph nodes and lungs expressed the molecules that bind to them. In mice, he found, blocking this back-and-forth Morse code helped inhibit cancer's proliferation.

    Joan Massagué, a new entrant to the metastasis field, is exploring what drives breast cancer to distant targets, particularly bone tissue. The question is especially intriguing because breast cancer cells strike bone more readily than anatomy would predict. In his lab at Memorial Sloan-Kettering Cancer Center in New York City, Massagué is studying how the cancer and target cells signal to one another; already, he's identified proteins that enable cancer cells to adhere tightly to bone and attack bone tissue.

    Massagué is also finding that breast cancer cells corrupt a much-studied signaling pathway called transforming growth factor-β (TGF-β). Other researchers have found that TGF-β can function as a tumor suppressor, slowing the growth of some primary tumors. But in unpublished research, Massagué has observed that metastatic breast cancer cells appear to turn the tables on TGF-β; for them, it helps spur invasion and metastasis. Cancer cells must “acquire a number of abilities … to nest and thrive at appropriate sites,” says Massagué, and he thinks that subverting TGF-β is one of their successful dodges.

    Some cutting-edge work also suggests that cancer cells master these tricks by turning back the clock. Last month, Denise Montell, a developmental biologist at the Johns Hopkins School of Medicine in Baltimore, Maryland, published an article in Nature Reviews Molecular Cell Biology pointing out a possible connection between embryo development and metastasis that she stumbled on by accident. Montell was analyzing how cells in an adult fruit fly ovary migrate from one place to another—similar to the cell migration that occurs during embryo development and during metastasis, when cells move from one organ to another.

    Montell found that two critical cell-signaling pathways, known to help cells proliferate in embryos and, in some cases, in cancers, also confer mobility. One of these, governed by steroid hormones and a fruit fly gene called taiman, controls the timing of cell migration in certain ovarian and embryonic cells. A closely related mammalian protein is highly expressed in metastatic breast cancer. Although scientists have long linked hormonal effects with cancer, they have not coupled hormones with migratory abilities. Startled by these associations, Montell 2 years ago shifted some of her 10-person lab into metastasis studies and collaborations with cancer biologists.

    Others are seeing tantalizing parallels between early development and cancer cells that have completed the metastatic cascade. “If you look at the molecular profile of these cells” in gene-expression studies, “they look like stem cells,” says Mary Hendrix of the University of Iowa in Iowa City. Stem cells carry built-in blueprints that enable them to morph into various tissue types. This, she explains, would clarify how breast cancer cells can live perfectly comfortably in strange environments.

    All these advances, though, are years from being translated into therapies. Adding to the uncertainty is the erratic performance of one treatment that targets metastases as well as primary tumors: antiangiogenic drugs, which inhibit new blood vessel growth (Science, 22 March 2002, p. 2198).

    But if old treatment approaches are struggling, new ones are emerging. Researchers are increasingly interested in designing drugs to focus on the final step in the metastatic cascade. Tumors have often spread insidiously through the body by the time they are diagnosed, but as surgeon Judah Folkman of Children's Hospital in Boston has shown, micrometastases may linger for anywhere from a few months to more than a decade before suddenly becoming metastases proper. This suspended state has recently captured scientists' attention; many believe that it's a pause that offers hope, and prolonging it may be the best short-term strategy for halting metastasis.

    Delaying disease may defeat it. “That, I think, is a newly appreciated goal of cancer treatment,” says Zetter of Children's Hospital. He hopes that someday metastasis, if not curable, will be treatable as a chronic disease—and that more than just a lucky few will be able to live with it.

  13. MEDICINE

    A Clash Over Genes That Foretell Metastasis

    1. Jennifer Couzin

    Biologists who sift DNA for evidence of what causes cancer cells to become metastatic are torn between two camps. One holds that a fatal problem occurs relatively late, when so-called metastasis suppressor genes fail, allowing an existing tumor to spread. About eight candidate genes fitting this scenario have been identified (see main text).

    But a second group remains unconvinced. These researchers hold that the same genes that drive primary tumors—oncogenes and tumor-suppressor genes—also launch metastatic cancer, and that even small primary tumors can contain mostly metastatic cells. Metastasis, they argue, is not genetically distinct; it is just the final step of well-studied processes of cell dysregulation.

    This debate erupted in a heated exchange of comments last summer after Robert Weinberg, a lion of molecular oncology studies, staked out a position against a unique role for metastasis-suppressor genes. Weinberg, a member of the Whitehead Institute and a professor at the Massachusetts Institute of Technology in Cambridge, and his former postdoc René Bernards, now at the Netherlands Cancer Institute in Amsterdam, argued in the 22 August 2002 issue of Nature that some of the initial mutations that transform a normal cell into a cancerous one can also cause metastasis. They specifically pointed to the oncogenes ras and myc, which Weinberg has studied for much of his career.

    Doubter.

    The Whitehead Institute's Robert Weinberg raises questions about the genes behind metastasis.

    CREDIT: JUSTIN KNIGHT/WHITEHEAD INSTITUTE FOR BIOMEDICAL RESEARCH

    The article struck a nerve. “We got both love and hate letters,” about 60 in all, says Bernards. Weinberg has been here before: In the 1980s, his lab discovered what it thought was a metastasis gene before finding that it didn't cause metastasis after all.

    Advocates of the metastasis-suppressor theory say that Weinberg and Bernards have slighted their work. “The metastasis people who've toiled in the trenches … have long felt that oncogenes and tumor suppressors get all the limelight,” explains one researcher who straddles both camps, Bruce Zetter of Children's Hospital in Boston. Others are less diplomatic. The paper “completely ignores a body of literature” on the role of metastasis-suppressor genes, fumes cancer biologist Carrie Rinker-Schaeffer of the University of Chicago.

    Weinberg's greatest allies may be those who have identified genetic signatures in primary tumors that appear to anticipate metastasis. Todd Golub, a Whitehead colleague of Weinberg's who published a paper in Nature Genetics in December postulating a 17-gene signature in primary tumors that could predict metastasis, agrees with Weinberg's theory. His own work, he says, supports the view that a tumor's destiny is carved out early on, and that in these cases, metastatic cells come to dominate a primary tumor. Another supporter is Robert Kerbel of the Sunnybrook and Women's College Health Sciences Centre in Toronto, who predicted something similar in the 1980s.

    For his part, Weinberg protests that his article was misunderstood, and that he wasn't denying the existence of metastasis-suppressor genes.

    The ruckus over the article has some experts calling for calm. “I sound like a U.N. diplomat, … [but] there's room for everyone,” says Kerbel. Indeed, even the combatants concede that the two camps may be talking about overlapping sets of genes.

  14. GENOMICS

    Tinker, Tailor: Can Venter Stitch Together a Genome From Scratch?

    1. Carl Zimmer*
    1. Carl Zimmer is the author of Evolution: The Triumph of an Idea.

    J. Craig Venter plans to create microbes to cure the world's environmental woes. Whether he can even partially succeed is an open question

    Craig Venter can't stand to be bored. No sooner had he and his team at Celera Genomics finished sequencing the human genome than Venter set another modest challenge for himself: He would tackle the world's environmental woes. His self-proclaimed goal (which landed him in newspapers and magazines around the world a few months ago) is to create microbes from scratch that can produce clean energy or curb global warming. To make this a reality, he set up a new organization, the Institute for Biological Energy Alternatives (IBEA) in Rockville, Maryland, right next to The Institute for Genomic Research that he founded in 1992. He got a small vote of confidence last November, when the Department of Energy awarded IBEA $3 million to take the first few steps toward that goal.

    Venter predicts he will pull off the first step—creating a synthetic genome that, when inserted into a cell, can live and replicate—within 3 years. But experts on microbes and genomics are not so sure. No one has ever synthesized a string of DNA hundreds of thousands of base pairs long, much less “brought it to life.” Obstacles range from determining which genes are essential to how to switch on a new genome. As for going the next step and creating a new pollution-fighting bug, many dismiss the scheme as science fiction or, at best, decades away. But Venter's critics and champions are closely watching what happens to the synthetic genome. “If he does make it, it will be a momentous achievement,” promising insights into the fundamental workings of all living things, says Eugene Koonin, an evolutionary biologist at the National Center for Biotechnology Information in Bethesda, Maryland.

    Step one

    Venter's project has its roots in the mid-1990s, when he and his colleagues sequenced the peculiar genome of Mycoplasma genitalium, a species of bacteria that lives in the human urinary tract. They discovered that M. genitalium has only 517 genes, giving it one of the smallest genomes known (Science, 20 October 1995, p. 397). Its tiny size raised some fundamental questions, says Venter: “Is there a smaller set [of genes] we don't know about? Is there a way we can define life at a molecular level?”

    To find out, Venter joined up with Clyde Hutchison of the University of North Carolina, Chapel Hill, and other researchers to test whether individual genes were essential for M. genitalium's survival in the lab. They would knock out a gene, watch the microbe thrive or wither, and then repeat the experiment with a different gene. A surprising number of genes turned out to be superfluous, and Venter's group estimated that M. genitalium might be able to survive on only 265 to 350 genes (Science, 10 December 1999, p. 2165).

    But testing one gene at a time could not tell the researchers exactly what would constitute such a minimal genome. It's possible, for example, that any one of five genes can do some vital task. Tested one at a time, all five may seem unnecessary, but if all of them are removed, the microbe dies. “We didn't have a way of doing cumulative knockouts to get down to just having a chromosome with just 300 genes in it,” Venter explains. “So we don't know which are the right 300 genes, or even if 300 is the right number. That study drove my thinking that we needed a synthetic genome.”
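    Venter's worry can be made concrete with a hypothetical example; the gene names and the survival rule below are invented purely for illustration. Five genes back each other up on one vital task, so every single-gene knockout survives and each gene looks dispensable, yet deleting the whole group is lethal.

```python
# Hypothetical illustration (invented genes): why one-at-a-time knockouts
# can miss genes that are collectively, but not individually, essential.

GENOME = {"geneA", "geneB", "geneC", "geneD", "geneE", "geneX"}
REDUNDANT_GROUP = {"geneA", "geneB", "geneC", "geneD", "geneE"}

def survives(genes):
    # The cell lives if geneX is present and at least one member of the
    # redundant group still covers the vital task.
    return "geneX" in genes and bool(genes & REDUNDANT_GROUP)

# Single knockouts: every gene in the group appears nonessential.
for gene in sorted(REDUNDANT_GROUP):
    alive = survives(GENOME - {gene})
    print(gene, "knocked out alone, survives?", alive)   # True each time

# Cumulative knockout of the whole group: the cell dies.
print("all five knocked out, survives?", survives(GENOME - REDUNDANT_GROUP))
```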

    At IBEA, a team of a dozen scientists has been at work on the project for several months, led by Hamilton Smith, the 1978 Nobel Prize winner who has collaborated with Venter on the sequencing of many genomes. Their first challenge is daunting enough: to stitch together a string of nucleotides hundreds of thousands of base pairs long. The biggest genome synthesized to date is that of the 7500-nucleotide poliovirus (Science, 9 August 2002, p. 1016). “It's not trivial to do these syntheses,” says Hutchison.

    Synthesizing DNA carries a small risk of introducing a typographical mistake into the code at each step, and those small risks compound. “If you put thousands of these things together, then the chances that you can assemble something that doesn't contain any errors are very small,” says Hutchison.
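    Hutchison's warning is simple probability: per-base error rates that look negligible compound over hundreds of thousands of bases. A minimal sketch, using an assumed error rate for illustration rather than any figure from the IBEA project:

```python
# Chance of an error-free assembly if each base is wrong independently
# with probability p. The rate p is illustrative, not a measured figure.

p = 1e-4  # assumed per-base error rate

for length in (7_500, 300_000):  # poliovirus scale vs. minimal-genome scale
    p_clean = (1 - p) ** length
    print(f"{length:>7,} bases: P(no errors) = {p_clean:.2e}")
```

    At poliovirus scale, the odds of a clean copy are close to even; at the scale Smith's team faces, they are vanishingly small.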

    Aniko Paul, a member of the team at the State University of New York, Stony Brook, that assembled the poliovirus last year, agrees: “Our project was technically much more difficult than we expected.” She and her colleagues had to study each segment of the sequence carefully to make sure that it was error free.

    Nor is it obvious which genes should be included in a synthetic genome. Microbiologists are familiar with the inner workings of only a few well-studied microbes, such as Escherichia coli. Other microbes, such as M. genitalium, are still a puzzle. For instance, the function of a third of the genes that Venter's group has identified as potentially essential is unknown.

    Playing God?

    Craig Venter (right) and Hamilton Smith predict that they will have a synthetic genome up and running in 3 years.

    CREDIT: MARTY KATZ

    The researchers can't hope to construct every possible combination of M. genitalium genes and see which survives and which doesn't. There are simply too many possibilities. Venter believes that such a brute-force search won't be necessary, citing intriguing evidence that bacterial genomes may be organized into functional groups (“cassettes”) that all work together to do some particular job, such as metabolizing a specific molecule. Instead of testing every gene, Venter is banking on limiting the search to a much smaller set of cassettes.
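    The arithmetic behind “too many possibilities” is stark. A back-of-the-envelope sketch, taking the roughly 300 candidate genes from the knockout work as the pool:

```python
# Why brute force fails: the number of possible gene subsets is 2**n.
n_genes = 300           # pool size from the ~300-gene estimate above
subsets = 2 ** n_genes
print(f"{subsets:.2e} possible combinations")  # about 2.04e+90
# That dwarfs the roughly 1e80 atoms estimated in the observable universe.
```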

    Once Venter's team members have a full genome ready for a test drive, they will have to give it a home. They envision destroying the natural DNA of another M. genitalium and then inserting the synthetic genome into the microbe. What happens next is a mystery. “How do you boot up a new genome?” asks Bernhard Palsson of the University of California, San Diego, who builds computer models of the metabolism of microbes. “Now that you've made it, how do you get it going?” Paul and her colleagues didn't face this question with the poliovirus genome because a virus is not exactly alive. In essence, it's a genome that can hijack the cellular machinery of a living host. Bacteria, on the other hand, are truly alive, and their survival depends on the complex interactions of their genes and proteins. It's possible, Palsson suggests, that a synthetic genome might just lie inert in its new home, unable to spring to life.

    “How do we know?” Venter responds. He points out that biochemists routinely insert small sets of genes into bacteria, which happily start using them to make new proteins. If a synthetic genome is properly designed, he argues, the machinery of a host microbe might automatically recognize it, and the microbe would start replicating on its own. “It wouldn't surprise me if it just happens,” says Venter.

    Biologists such as Palsson and Koonin would be delighted if Venter could get this far. “I've been talking for years with my colleagues about how we have to do this,” says Palsson. For one, success would offer proof that there is such a thing as a minimal genome. “Ultimately, proof by synthesis is the most convincing way to demonstrate that we understand the basic principles of any process,” says James Shapiro, who studies microbe evolution at the University of Chicago. A microbe with a synthetic genome could offer more than a proof of concept, however. It could serve as a sort of microbial fruit fly that biologists could use to investigate basic aspects of life itself.

    Step two

    But Venter has grand plans for his “minimal microbe.” He hopes to use it as a foundation for building a much larger genome, endowed with new genes that would enable it to produce hydrogen fuel on an industrial scale or efficiently suck in the carbon dioxide released by a power plant. “If he can do it, more power to him—but I think it's going to be really hard,” says Suellen van Ooteghem of the National Energy Technology Laboratory in Morgantown, West Virginia. Others question the need for a new microbe, should it be possible to create one. But Venter, with characteristic brashness, believes he can improve on evolution.

    Even if the team successfully builds a genomic skeleton, it would have to add a lot of muscle—perhaps 1000 additional genes—to enable it to fight pollution. Venter and colleagues plan to scour nature for the most powerful combination of genes. But figuring out how to add the regulatory functions to ensure that all these genes turn on and off at the right time will be difficult, if not impossible. “Our understanding of regulatory circuits, even in bacteria, is actually quite poor,” says Koonin.

    Venter recognizes the profound difficulty of the challenge he has set for himself and his co-workers. “There are so many unknowns, and no one's been there before,” he says. “All this is pure basic science, and we're learning as we go.” But that hasn't stopped Venter before, and it doesn't seem to be stopping him now.
